Sample records for sensorless tool failure

  1. A sensorless method for measuring the point mobility of mechanical structures

    NASA Astrophysics Data System (ADS)

    Boulandet, R.; Michau, M.; Herzog, P.; Micheau, P.; Berry, A.

    2016-09-01

This paper presents a convenient and cost-effective experimental tool for measuring the mobility characteristics of a mechanical structure. The objective is to demonstrate that point mobility measurements can be performed using only an electrodynamic inertial exciter. Unlike previous work based on voice coil actuators, no load cell or accelerometer is needed. Instead, it is shown theoretically that the mobility characteristics of the structure can be estimated from variations in the electrical input impedance of the actuator fixed onto it, provided that the electromechanical parameters of the actuator are known. The proof of concept is demonstrated experimentally using an inexpensive, commercially available actuator on a simply supported plate, yielding a good dynamic range from 100 Hz to 1 kHz. The methodology for assessing the basic parameters of the actuator is also given. Measured data are compared to standard shaker testing, and the strengths and weaknesses of the sensorless mobility measuring device are discussed. It is believed that this device can be a convenient experimental tool for determining the dynamic characteristics of a wide range of mechanical structures.
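
    The record does not reproduce the authors' equations, so the following Python sketch only illustrates the general idea under a simple lumped-parameter assumption: subtract the blocked electrical impedance, convert the motional impedance to a mechanical impedance through the force factor Bl, and remove the actuator's own mass-spring-damper branch. All parameter names are illustrative.

    ```python
    import numpy as np

    def estimate_point_mobility(Z_e, omega, Re, Le, Bl, m_a, c_a, k_a):
        """Estimate structure point mobility Y = v/F from the measured
        electrical input impedance Z_e of an inertial exciter.
        All parameters are illustrative (hypothetical lumped model)."""
        Z_blocked = Re + 1j * omega * Le                 # blocked electrical impedance
        Z_mot = Z_e - Z_blocked                          # motional impedance
        Z_mech = Bl**2 / Z_mot                           # mechanical impedance seen by the coil
        Z_act = c_a + 1j * (omega * m_a - k_a / omega)   # actuator suspension branch
        return 1.0 / (Z_mech - Z_act)                    # structure point mobility
    ```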

  2. Sensorless Modeling of Varying Pulse Width Modulator Resolutions in Three-Phase Induction Motors

    PubMed Central

    Marko, Matthew David; Shevach, Glenn

    2017-01-01

    A sensorless algorithm was developed to predict rotor speeds in an electric three-phase induction motor. This sensorless model requires a measurement of the stator currents and voltages, and the rotor speed is predicted accurately without any mechanical measurement of the rotor speed. A model of an electric vehicle undergoing acceleration was built, and the sensorless prediction of the simulation rotor speed was determined to be robust even in the presence of fluctuating motor parameters and significant sensor errors. Studies were conducted for varying pulse width modulator resolutions, and the sensorless model was accurate for all resolutions of sinusoidal voltage functions. PMID:28076418
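
    As a hedged illustration of the kind of estimator the abstract describes (the paper's own model is not given here), the sketch below combines the classical voltage-model rotor-flux estimate with a slip-frequency correction to recover rotor speed from stator voltages and currents in the stationary frame.

    ```python
    import numpy as np

    def rotor_speed_estimate(v_ab, i_ab, psi_s_prev, theta_prev,
                             Rs, Ls, Lr, Lm, Rr, dt):
        """One-step open-loop rotor-speed estimate for an induction motor from
        stator voltage/current 2-vectors in the stationary alpha-beta frame
        (classical voltage model plus slip compensation; a generic sketch,
        not this paper's model)."""
        sigma = 1.0 - Lm**2 / (Ls * Lr)
        # Stator flux from the voltage model: psi_s = integral(v_s - Rs*i_s)
        psi_s = psi_s_prev + (v_ab - Rs * i_ab) * dt
        # Rotor flux referred to the stator
        psi_r = (Lr / Lm) * (psi_s - sigma * Ls * i_ab)
        # Synchronous speed from the rotor-flux angle derivative (wrapped)
        theta = np.arctan2(psi_r[1], psi_r[0])
        omega_e = np.angle(np.exp(1j * (theta - theta_prev))) / dt
        # Slip speed from the torque-producing current component
        omega_slip = (Lm * Rr / Lr) * (psi_r[0] * i_ab[1] - psi_r[1] * i_ab[0]) \
                     / max(psi_r[0]**2 + psi_r[1]**2, 1e-9)
        return omega_e - omega_slip, psi_s, theta
    ```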

  3. Sensorless Modeling of Varying Pulse Width Modulator Resolutions in Three-Phase Induction Motors.

    PubMed

    Marko, Matthew David; Shevach, Glenn

    2017-01-01

    A sensorless algorithm was developed to predict rotor speeds in an electric three-phase induction motor. This sensorless model requires a measurement of the stator currents and voltages, and the rotor speed is predicted accurately without any mechanical measurement of the rotor speed. A model of an electric vehicle undergoing acceleration was built, and the sensorless prediction of the simulation rotor speed was determined to be robust even in the presence of fluctuating motor parameters and significant sensor errors. Studies were conducted for varying pulse width modulator resolutions, and the sensorless model was accurate for all resolutions of sinusoidal voltage functions.

  4. Sensorless Load Torque Estimation and Passivity Based Control of Buck Converter Fed DC Motor

    PubMed Central

    Kumar, S. Ganesh; Thilagar, S. Hosimin

    2015-01-01

Passivity based control of a DC motor in a sensorless configuration is proposed in this paper. Exact tracking error dynamics passive output feedback control is used to stabilize the speed of a Buck converter fed DC motor under various load torques, such as constant, fan-type, propeller-type, and unknown load torques. Under load conditions, a sensorless online algebraic approach is proposed and compared with a sensorless reduced-order observer approach; the former produces a better response in estimating the load torque. Sensitivity analysis is also performed to select the appropriate control variables. Simulation and experimental results fully confirm the superiority of the proposed approach. PMID:25893208

  5. Universal Parameter Measurement and Sensorless Vector Control of Induction and Permanent Magnet Synchronous Motors

    NASA Astrophysics Data System (ADS)

    Yamamoto, Shu; Ara, Takahiro

    Recently, induction motors (IMs) and permanent-magnet synchronous motors (PMSMs) have been used in various industrial drive systems. The features of the hardware device used for controlling the adjustable-speed drive in these motors are almost identical. Despite this, different techniques are generally used for parameter measurement and speed-sensorless control of these motors. If the same technique can be used for parameter measurement and sensorless control, a highly versatile adjustable-speed-drive system can be realized. In this paper, the authors describe a new universal sensorless control technique for both IMs and PMSMs (including salient pole and nonsalient pole machines). A mathematical model applicable for IMs and PMSMs is discussed. Using this model, the authors derive the proposed universal sensorless vector control algorithm on the basis of estimation of the stator flux linkage vector. All the electrical motor parameters are determined by a unified test procedure. The proposed method is implemented on three test machines. The actual driving test results demonstrate the validity of the proposed method.

  6. Fast Fourier and discrete wavelet transforms applied to sensorless vector control induction motor for rotor bar faults diagnosis.

    PubMed

    Talhaoui, Hicham; Menacer, Arezki; Kessal, Abdelhalim; Kechida, Ridha

    2014-09-01

This paper presents new techniques to evaluate faults in the case of broken rotor bars in induction motors. The procedures are applied under closed-loop control. Electrical and mechanical variables are analyzed using the fast Fourier transform (FFT) and the discrete wavelet transform (DWT) at start-up and in steady state. The wavelet transform has proven to be an excellent mathematical tool for fault detection, particularly for broken rotor bars, because the DWT provides a local representation of the non-stationary current signals for both the healthy and the faulty machine. For sensorless control, a Luenberger observer is applied; the estimated rotor speed is analyzed, the effect of the faults on the speed pulsation is compensated, and a quadrature current component that appears under fault conditions is used for fault detection. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
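
    For broken rotor bars, the classic FFT signature is a pair of sidebands at (1 ± 2s)·f_supply around the supply frequency, where s is the slip. A minimal numpy sketch of this spectral check (not the paper's full FFT/DWT processing chain) follows.

    ```python
    import numpy as np

    def broken_bar_sideband_power(i_stator, fs, f_supply, slip):
        """Inspect the stator-current spectrum around the classic
        broken-rotor-bar sideband frequencies (1 +/- 2s)*f_supply.
        Generic sketch; sampling parameters are assumptions."""
        n = len(i_stator)
        spec = np.abs(np.fft.rfft(i_stator * np.hanning(n))) / n
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)

        def band_peak(f0, half_width=0.5):
            mask = (freqs > f0 - half_width) & (freqs < f0 + half_width)
            return spec[mask].max() if mask.any() else 0.0

        lower = band_peak((1 - 2 * slip) * f_supply)
        upper = band_peak((1 + 2 * slip) * f_supply)
        return lower, upper   # compare with the fundamental to flag a fault
    ```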

  7. Electric propulsion using the permanent magnet synchronous motor without rotor position transducers

    NASA Astrophysics Data System (ADS)

    Batzel, Todd Douglas

The permanent magnet synchronous motor (PMSM) plays an increasingly important role in electric propulsion systems due to its many advantages over competing technologies. For successful operation of the PMSM, rotor position and speed information is required, usually provided by a resolver or encoder attached to the shaft of the machine. Many applications, however, cannot tolerate a position sensor because of space and weight limitations, reliability concerns, or packaging issues. Thus, there has been intense interest in the development of so-called position sensorless drives, where the PMSM stator itself is used as the rotor position sensor. In this work, a sensorless electric drive is developed for various undersea propulsion applications, where a rotor position sensor is often undesirable due to the harsh operating environment as well as space and weight limitations. An observer is developed that enables sensorless operation of the PMSM over a wide speed range, and a method is presented for estimating the standstill rotor angle, an operating condition at which rotor position observers are typically ill conditioned. Two design methodologies are applied to the sensorless electric drive: a model-based approach and a neural network-based approach. Implementation issues for the sensorless electric drive are discussed, and experimental results are presented to demonstrate the effectiveness of the proposed techniques for the sensorless PMSM.

  8. Speed Sensorless Induction Motor Drives for Electrical Actuators: Schemes, Trends and Tradeoffs

    NASA Technical Reports Server (NTRS)

    Elbuluk, Malik E.; Kankam, M. David

    1997-01-01

For a decade, induction motor drive-based electrical actuators have been under investigation as potential replacements for conventional hydraulic and pneumatic actuators in aircraft. Advantages of electric actuators include lower weight and size, reduced maintenance and operating costs, improved safety due to the elimination of hazardous fluids and high-pressure hydraulic and pneumatic lines, and increased efficiency. Recently, the emphasis of research on induction motor drives has been on sensorless vector control, which eliminates flux and speed sensors mounted on the motor. The development of effective speed and flux estimators has allowed good rotor flux-oriented (RFO) performance at all speeds except those close to zero. Sensorless control has improved motor performance compared with Volts/Hertz (constant-flux) control. This report evaluates documented schemes for speed sensorless drives and discusses the trends and tradeoffs involved in selecting a particular scheme. These schemes combine the attributes of direct and indirect field-oriented control (FOC) or use model reference adaptive systems (MRAS) with a speed-dependent current model for flux estimation that tracks the voltage model-based flux estimator. Many factors are important in comparing the effectiveness of a speed sensorless scheme, among them wide speed range capability, motor parameter insensitivity, and noise reduction. Although a number of schemes have been proposed for speed estimation, zero-speed FOC with robustness against parameter variations remains an open area of research for speed sensorless control.
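
    One widely cited scheme of the MRAS family mentioned above uses the voltage-model rotor flux as the reference model and the speed-dependent current model as the adjustable model, with a PI adaptation law acting on the cross product of the two flux vectors. The sketch below is a generic textbook version (sign conventions vary between formulations), not any specific scheme surveyed in the report.

    ```python
    import numpy as np

    class RotorFluxMRAS:
        """Classical rotor-flux MRAS speed estimator (generic sketch with
        illustrative parameter names; inputs are alpha-beta 2-vectors)."""

        def __init__(self, Rs, Rr, Ls, Lr, Lm, kp, ki, dt):
            self.Rs, self.Rr, self.Ls, self.Lr, self.Lm = Rs, Rr, Ls, Lr, Lm
            self.kp, self.ki, self.dt = kp, ki, dt
            self.sigma = 1.0 - Lm**2 / (Ls * Lr)
            self.psi_s = np.zeros(2)     # stator flux (voltage model)
            self.psi_r_i = np.zeros(2)   # rotor flux (current model)
            self.integral = 0.0
            self.omega_hat = 0.0

        def step(self, v_ab, i_ab):
            # Reference model: rotor flux from the voltage model
            self.psi_s += (v_ab - self.Rs * i_ab) * self.dt
            psi_r_v = (self.Lr / self.Lm) * (self.psi_s - self.sigma * self.Ls * i_ab)
            # Adjustable model: rotor flux from the current model (uses omega_hat)
            Tr = self.Lr / self.Rr
            p = self.psi_r_i
            dp = (-p / Tr + (self.Lm / Tr) * i_ab
                  + self.omega_hat * np.array([-p[1], p[0]]))
            self.psi_r_i = p + dp * self.dt
            # Adaptation: PI on the cross product of the two flux vectors
            eps = psi_r_v[1] * self.psi_r_i[0] - psi_r_v[0] * self.psi_r_i[1]
            self.integral += eps * self.dt
            self.omega_hat = self.kp * eps + self.ki * self.integral
            return self.omega_hat
    ```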

  9. Modified artificial fish school algorithm for free space optical communication with sensor-less adaptive optics system

    NASA Astrophysics Data System (ADS)

    Cao, Jingtai; Zhao, Xiaohui; Li, Zhaokun; Liu, Wei; Gu, Haijun

    2017-11-01

The performance of free space optical (FSO) communication systems is severely limited by atmospheric turbulence. Adaptive optics (AO) is an important method for overcoming atmospheric disturbance; in particular, under strong scintillation, the sensor-less AO system plays a major role in compensation. In this paper, a modified artificial fish school (MAFS) algorithm is proposed to compensate the aberrations in a sensor-less AO system. Both static and dynamic aberration compensation are analyzed, and the performance of FSO communication before and after aberration compensation is compared. In addition, the MAFS algorithm is compared with the artificial fish school (AFS) algorithm, the stochastic parallel gradient descent (SPGD) algorithm and the simulated annealing (SA) algorithm. It is shown that the MAFS algorithm has a higher convergence speed than the SPGD and SA algorithms, and reaches a better convergence value than the AFS, SPGD and SA algorithms. The sensor-less AO system with the MAFS algorithm effectively increases the coupling efficiency at the receiving terminal with fewer iterations. In conclusion, the MAFS algorithm has great significance for sensor-less AO systems compensating atmospheric turbulence in FSO communication systems.

  10. Wavefront sensorless adaptive optics ophthalmoscopy in the human eye

    PubMed Central

    Hofer, Heidi; Sredar, Nripun; Queener, Hope; Li, Chaohong; Porter, Jason

    2011-01-01

    Wavefront sensor noise and fidelity place a fundamental limit on achievable image quality in current adaptive optics ophthalmoscopes. Additionally, the wavefront sensor ‘beacon’ can interfere with visual experiments. We demonstrate real-time (25 Hz), wavefront sensorless adaptive optics imaging in the living human eye with image quality rivaling that of wavefront sensor based control in the same system. A stochastic parallel gradient descent algorithm directly optimized the mean intensity in retinal image frames acquired with a confocal adaptive optics scanning laser ophthalmoscope (AOSLO). When imaging through natural, undilated pupils, both control methods resulted in comparable mean image intensities. However, when imaging through dilated pupils, image intensity was generally higher following wavefront sensor-based control. Despite the typically reduced intensity, image contrast was higher, on average, with sensorless control. Wavefront sensorless control is a viable option for imaging the living human eye and future refinements of this technique may result in even greater optical gains. PMID:21934779
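
    The SPGD update named in the abstract is compact enough to sketch: perturb all corrector channels simultaneously with small bipolar random steps and move along the measured metric change. The `apply_and_measure` callback and all tuning values below are assumptions.

    ```python
    import numpy as np

    def spgd_optimize(apply_and_measure, n_act, iters=200, gain=0.3, amp=0.05, seed=0):
        """Stochastic parallel gradient descent for sensorless AO.
        `apply_and_measure(u)` applies the actuator command vector `u` and
        returns a scalar image metric (e.g. mean frame intensity); both the
        callback and the tuning constants are illustrative."""
        rng = np.random.default_rng(seed)
        u = np.zeros(n_act)
        for _ in range(iters):
            du = amp * rng.choice([-1.0, 1.0], size=n_act)   # bipolar perturbation
            dJ = apply_and_measure(u + du) - apply_and_measure(u - du)
            u += gain * dJ * du                              # gradient-ascent update
        return u
    ```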

  11. Position and speed control of brushless DC motors using sensorless techniques and application trends.

    PubMed

    Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime

    2010-01-01

This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including background analysis using sensors, limitations and advances. The performance and reliability of BLDC motor drives have improved as conventional control and sensing techniques have been supplemented by sensorless technology. Sensorless advances are reviewed and recent developments in this area are introduced with their inherent advantages and drawbacks, including an analysis of practical implementation issues and applications. The study includes a deep overview of state-of-the-art back-EMF sensing methods, including Terminal Voltage Sensing, Third Harmonic Voltage Integration, Terminal Current Sensing, Back-EMF Integration and PWM strategies. The most relevant techniques based on estimation and models are also briefly analysed, such as the Sliding-mode Observer, Extended Kalman Filter, Model Reference Adaptive System, Adaptive observers (Full-order and Pseudoreduced-order) and Artificial Neural Networks.

  12. Position and Speed Control of Brushless DC Motors Using Sensorless Techniques and Application Trends

    PubMed Central

    Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime

    2010-01-01

This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including background analysis using sensors, limitations and advances. The performance and reliability of BLDC motor drives have improved as conventional control and sensing techniques have been supplemented by sensorless technology. Sensorless advances are reviewed and recent developments in this area are introduced with their inherent advantages and drawbacks, including an analysis of practical implementation issues and applications. The study includes a deep overview of state-of-the-art back-EMF sensing methods, including Terminal Voltage Sensing, Third Harmonic Voltage Integration, Terminal Current Sensing, Back-EMF Integration and PWM strategies. The most relevant techniques based on estimation and models are also briefly analysed, such as the Sliding-mode Observer, Extended Kalman Filter, Model Reference Adaptive System, Adaptive observers (Full-order and Pseudoreduced-order) and Artificial Neural Networks. PMID:22163582

  13. Implementation of a sliding-mode-based position sensorless drive for high-speed micro permanent-magnet synchronous motors.

    PubMed

    Chi, Wen-Chun; Cheng, Ming-Yang

    2014-03-01

Due to issues such as limited space, it is difficult, if not impossible, to employ a position sensor in the drive control of high-speed micro PMSMs. To alleviate this problem, this paper analyzes and implements a simple and robust position sensorless field-oriented control method for high-speed micro PMSMs based on a sliding-mode observer. In particular, the angular position and velocity of the rotor are estimated using the sliding-mode observer, which accurately estimates the rotor position in the low-speed region and guarantees fast convergence in the high-speed region. The proposed position sensorless control method is suitable for electric dental handpiece motor drives, where wide speed range operation is essential. The proposed sensorless FOC method is implemented using a cost-effective 16-bit microcontroller and tested on a prototype electric dental handpiece motor. Several experiments are performed to verify the effectiveness of the proposed method. © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  14. Method and apparatus for sensorless operation of brushless permanent magnet motors

    DOEpatents

    Sriram, Tillasthanam V.

    1998-01-01

    A sensorless method and apparatus for providing commutation timing signals for a brushless permanent magnet motor extracts the third harmonic back-emf of a three-phase stator winding and independently cyclically integrates the positive and negative half-cycles thereof and compares the results to a reference level associated with a desired commutation angle.
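
    A rough software analogue of the patented scheme (hedged: the patent describes analogue circuitry, and all names here are illustrative): for a wye-connected machine the phase voltages sum to the triplen component, so summing them isolates the third-harmonic back-EMF, whose half-cycle integral is compared against a reference level tied to the desired commutation angle.

    ```python
    class ThirdHarmonicCommutator:
        """Toy per-sample sketch of third-harmonic back-EMF commutation:
        extract the third harmonic as the phase-voltage sum, integrate each
        half-cycle, and emit a commutation event when the running integral
        reaches a reference level (illustrative, not the patented circuit)."""

        def __init__(self, reference, dt):
            self.reference = reference   # level set by the desired commutation angle
            self.dt = dt
            self.integral = 0.0
            self.sign = 0

        def step(self, va, vb, vc):
            e3 = (va + vb + vc) / 3.0            # third-harmonic back-EMF component
            s = 1 if e3 >= 0.0 else -1
            if s != self.sign:                   # new half-cycle: restart integration
                self.sign, self.integral = s, 0.0
            self.integral += abs(e3) * self.dt
            if self.integral >= self.reference:
                self.integral = 0.0
                return True                      # commutation timing signal
            return False
    ```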

  15. Method and apparatus for sensorless operation of brushless permanent magnet motors

    DOEpatents

    Sriram, T.V.

    1998-04-14

    A sensorless method and apparatus for providing commutation timing signals for a brushless permanent magnet motor extracts the third harmonic back-emf of a three-phase stator winding and independently cyclically integrates the positive and negative half-cycles thereof and compares the results to a reference level associated with a desired commutation angle. 23 figs.

  16. Fuel sensor-less control of a liquid feed fuel cell system under steady load for portable applications

    NASA Astrophysics Data System (ADS)

    Chang, C. L.; Chen, C. Y.; Sung, C. C.; Liou, D. H.

This study presents a novel fuel sensor-less control scheme for a liquid feed fuel cell system that does not rely on a fuel concentration sensor. The proposed approach simplifies the design and reduces the cost and complexity of a liquid feed fuel cell system, and is especially suited to portable power sources, for which volume and weight are important. During fuel cell operation, the cell's operating characteristics, such as potential, current and power, are measured to control the supply of fuel and regulate its concentration to optimize performance. Experiments were conducted to verify that the fuel sensor-less control algorithm is effective in the liquid feed fuel cell system.

  17. Sensor-less pseudo-sinusoidal drive for a permanent-magnet brushless ac motor

    NASA Astrophysics Data System (ADS)

    Liu, Li-Hsiang; Chern, Tzuen-Lih; Pan, Ping-Lung; Huang, Tsung-Mou; Tsay, Der-Min; Kuang, Jao-Hwa

    2012-04-01

Precise rotor-position information is required for a permanent-magnet brushless ac motor (BLACM) drive. In the conventional sinusoidal drive method, either an encoder or a resolver is usually employed. For position sensor-less vector control schemes, the rotor flux estimate and torque components are obtained by complicated coordinate transformations. These computationally intensive methods are susceptible to current distortions and parameter variations. To reduce this complexity, this work presents a sensor-less pseudo-sinusoidal drive scheme with speed control for a three-phase BLACM. Based on the sinusoidal drive scheme, a floating period of each phase current is inserted for back electromotive force detection. The zero-crossing point is determined directly by the proposed scheme, and the rotor magnetic position and rotor speed can be estimated simultaneously. Several experiments for various active angle periods are undertaken. Furthermore, current feedback control is included to minimize and compensate for the torque fluctuation. The experimental results show that the proposed method is competitive with conventional drive methods for BLACMs. The proposed scheme is straightforward, bringing the benefits of sensor-less drive and negating the need for coordinate transformations in the operating process.

  18. Field-programmable analogue arrays for the sensorless control of DC motors

    NASA Astrophysics Data System (ADS)

    Rivera, J.; Dueñas, I.; Ortega, S.; Del Valle, J. L.

    2018-02-01

This work presents the analogue implementation of a sensorless controller for direct current motors based on the super-twisting (ST) sliding mode technique, by means of field programmable analogue arrays (FPAA). The novelty of this work is twofold: first, the use of the ST algorithm in a sensorless scheme for DC motors; and second, the implementation method for this type of sliding mode controller in FPAAs. The ST algorithm reduces the chattering problem produced by the deliberate use of the sign function in classical sliding mode approaches. The advantages of this implementation over a digital one are that the controller is not digitally approximated, the controller gains do not require fine tuning, and the implementation does not require analogue-to-digital and digital-to-analogue converter circuits. In addition, the FPAA is a reconfigurable technology with lower cost and power consumption. Simulation and experimental results were recorded, showing a more accurate transient response and lower power consumption for the proposed implementation compared with a digital one. A more accurate DC motor performance is also obtained with the proposed sensorless ST technique compared with a classical sliding mode approach.
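
    The super-twisting algorithm itself is standard and easy to state. A minimal discrete-time sketch follows; the gains and the sliding variable s are application-specific, and this is of course a digital illustration of what the paper realizes in analogue hardware.

    ```python
    import math

    def super_twisting_step(s, v, k1, k2, dt):
        """One integration step of the super-twisting algorithm:
            u = -k1 * sqrt(|s|) * sign(s) + v,   dv/dt = -k2 * sign(s).
        Gains k1, k2 must satisfy the usual ST stability conditions; this is
        a generic sketch, not the paper's FPAA realization."""
        sign_s = (s > 0) - (s < 0)
        u = -k1 * math.sqrt(abs(s)) * sign_s + v
        v = v - k2 * sign_s * dt
        return u, v
    ```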

  19. Portable DMFC system with methanol sensor-less control

    NASA Astrophysics Data System (ADS)

    Chen, C. Y.; Liu, D. H.; Huang, C. L.; Chang, C. L.

This work develops a prototype 20 W portable DMFC through system integration of the stack, condenser, methanol sensor-less control and start-up characteristics. The effects of these key components and control schemes on performance are also discussed. To expedite the use of portable DMFCs in electronic applications, the system utilizes a novel methanol sensor-less control method, providing improved fuel efficiency, durability, miniaturization and cost reduction. The operating characteristics of the DMFC stack are used to control the fuel ejection time and period, enabling the system to continue operating even when the MEAs of the stack have deteriorated. The portable system is also designed with several features including water balance and quick start-up (in 5 min). Notably, the proposed system using methanol sensor-less control with injection of pure methanol can power a DVD player and a notebook PC. The system specific energy and energy density following three days of operation are 362 Wh kg⁻¹ and 335 Wh L⁻¹, respectively, which are better than those of lithium batteries (∼150 Wh kg⁻¹ and ∼250 Wh L⁻¹). This good energy storage capability demonstrates that the portable DMFC is likely to be valuable in computer, communication and consumer electronic (3C) markets.

  20. BP artificial neural network based wave front correction for sensor-less free space optics communication

    NASA Astrophysics Data System (ADS)

    Li, Zhaokun; Zhao, Xiaohui

    2017-02-01

Sensor-less adaptive optics (AO) is one of the most promising methods for compensating strong wavefront disturbance in free space optics communication (FSO). In this study, a back propagation (BP) artificial neural network is applied to the sensor-less AO system to design a distortion correction scheme. This method needs only one or a few online measurements to correct the wavefront distortion, compared with other model-based approaches, which enhances the real-time capability of the system and largely improves the Strehl Ratio (SR). Comparisons in numerical simulation with other model-based and model-free correction methods proposed in Refs. [6,8,9,10] are given to show the validity and advantage of the proposed method.

  1. Sensorless H∞ speed-tracking synthesis for surface-mount permanent magnet synchronous motor.

    PubMed

    Ramírez-Villalobos, Ramón; Aguilar, Luis T; Coria, Luis N

    2017-03-01

In this paper, sensorless speed tracking control is proposed for a surface-mount permanent magnet synchronous motor by using a nonlinear H ∞ -controller with stator current measurements for feedback. An output feedback nonlinear H ∞ -controller was designed such that the undisturbed system is uniformly asymptotically stable around the desired speed reference, while the effects of external vanishing and non-vanishing disturbances, noise, and input backlash are attenuated locally. The rotor position was calculated from the causal dynamic output feedback compensator and from the desired speed reference. The existence of proper solutions of the perturbed differential Riccati equations ensures stabilizability and detectability of the control system. The efficiency of the proposed sensorless controller was supported by numerical simulations. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  2. Sensorless Estimation and Nonlinear Control of a Rotational Energy Harvester

    NASA Astrophysics Data System (ADS)

    Nunna, Kameswarie; Toh, Tzern T.; Mitcheson, Paul D.; Astolfi, Alessandro

    2013-12-01

It is important to perform sensorless monitoring of parameters in energy harvesting devices in order to determine the operating states of the system. However, physical measurement of these parameters is often a challenging task due to the unavailability of access points. This paper presents, as an example application, the design of a nonlinear observer and a nonlinear feedback controller for a rotational energy harvester. A dynamic model of a rotational energy harvester with its power electronic interface is derived and validated. This model is then used to design a nonlinear observer and a nonlinear feedback controller that yield a sensorless closed-loop system. The observer estimates the mechanical quantities from the measured electrical quantities, while the control law sustains power generation across a range of source rotation speeds. The proposed scheme is assessed through simulations and experiments.

  3. Sensorless optimal sinusoidal brushless direct current for hard disk drives

    NASA Astrophysics Data System (ADS)

    Soh, C. S.; Bi, C.

    2009-04-01

Driven by the availability of digital signal processors and the emergence of new applications, market demand for permanent magnet synchronous motors has been surging. As the back-emf is sinusoidal, the drive current should also be sinusoidal to reduce torque ripple. However, in applications like hard disk drives, brushless direct current (BLDC) drive is adopted instead of sinusoidal drive for simplicity. The adoption, however, comes at the expense of increased harmonics, losses, torque pulsations, and acoustic noise. In this paper, we propose a sensorless optimal sinusoidal BLDC drive. First, the derivation of an optimal sinusoidal drive is presented, and a power angle control scheme is proposed to achieve optimal sinusoidal BLDC operation. The scheme maintains a linear relationship between the motor speed and drive voltage. To execute the sensorless drive, an innovative power angle measurement scheme is devised, which takes advantage of the freewheeling diodes and measures the power angle through the detection of diode voltage drops. These objectives are presented and discussed in this paper, supported by derivations, simulations, and experimental results. The proposed scheme is straightforward, brings the benefits of sensorless sinusoidal drive, negates the need for current sensors by utilizing the freewheeling diodes, and does not incur additional cost.

  4. A new technique to control brushless motor for blood pump application.

    PubMed

    Fonseca, Jeison; Andrade, Aron; Nicolosi, Denys E C; Biscegli, José F; Legendre, Daniel; Bock, Eduardo; Lucchi, Júlio César

    2008-04-01

This article presents a back-electromotive force (BEMF) based detection technique for sensorless brushless direct current motor (BLDCM) drivers. The BLDCM has been chosen as the energy converter in rotary or pulsatile blood pumps that use electrical motors for pumping. However, in order to operate properly, the BLDCM driver needs to know the shaft position. Usually, that information is obtained through a set of Hall sensors assembled close to the rotor and connected to the electronic controller by wires. Sometimes, a large distance between the motor and controller makes the system susceptible to interference on the sensor signal because of winding current switching. The goal of the sensorless technique presented in this study is to avoid this problem. First, the operation of the BLDCM was evaluated with the electronic simulator PSpice. Then, a BEMF detector circuit was assembled in our laboratories. For the tests, a sensor-dependent system was assembled and a direct comparison between the Hall sensor signals and the detected signals was performed. The results showed that the output sensorless detector signals are very similar to the Hall signals at speeds above 2500 rpm. Therefore, the sensorless technique is recommended as a primary or redundant system for rotary blood pumps.

  5. Real time optimization algorithm for wavefront sensorless adaptive optics OCT (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Verstraete, Hans R. G. W.; Heisler, Morgan; Ju, Myeong Jin; Wahl, Daniel J.; Bliek, Laurens; Kalkman, Jeroen; Bonora, Stefano; Sarunic, Marinko V.; Verhaegen, Michel; Jian, Yifan

    2017-02-01

Optical Coherence Tomography (OCT) has revolutionized modern ophthalmology, providing depth-resolved images of the retinal layers in a system that is suited to a clinical environment. A limitation of the performance and utilization of OCT systems has been the lateral resolution. Through the combination of wavefront sensorless adaptive optics (WSAO) with dual variable optical elements, we present a compact lens-based OCT system that is capable of imaging the photoreceptor mosaic. We utilized a commercially available variable focal length lens to correct for the wide range of defocus commonly found in patient eyes, and a multi-actuator adaptive lens, after linearization of the hysteresis in its piezoelectric actuators, for aberration correction to obtain near diffraction-limited imaging at the retina. A parallel processing computational platform permitted real-time image acquisition and display. The Data-based Online Nonlinear Extremum seeker (DONE) algorithm was used for real-time optimization of the wavefront sensorless adaptive optics OCT, and its performance was compared with a coordinate search algorithm. Cross-sectional images of the retinal layers and en face images of the cone photoreceptor mosaic acquired in vivo from research volunteers before and after WSAO optimization are presented. Applying the DONE algorithm in vivo for wavefront sensorless AO-OCT demonstrates that it succeeds in drastically improving the signal while achieving a computational time of 1 ms per iteration, making it applicable to high-speed real-time applications.

  6. Rotor Position Sensorless Control and Its Parameter Sensitivity of Permanent Magnet Motor Based on Model Reference Adaptive System

    NASA Astrophysics Data System (ADS)

    Ohara, Masaki; Noguchi, Toshihiko

This paper describes a new method for rotor position sensorless control of a surface permanent magnet synchronous motor based on a model reference adaptive system (MRAS). The method features an MRAS in the current control loop to estimate the rotor speed and position using only current sensors. This method, like almost all conventional methods, incorporates a mathematical model of the motor, which consists of parameters such as winding resistances, inductances, and an induced voltage constant. Hence, it is important to investigate how deviations in these parameters affect the estimated rotor position. First, this paper proposes a structure for the sensorless control applied in the current control loop. Next, it proves the stability of the proposed method when motor parameters deviate from their nominal values, and derives the relationship between the estimated position and the parameter deviations in steady state. Finally, some experimental results are presented to show the performance and effectiveness of the proposed method.

  7. Novel Observer Scheme of Fuzzy-MRAS Sensorless Speed Control of Induction Motor Drive

    NASA Astrophysics Data System (ADS)

    Chekroun, S.; Zerikat, M.; Mechernene, A.; Benharir, N.

    2017-01-01

This paper presents a novel Fuzzy-MRAS approach for robust, accurate tracking of an induction motor drive operating in a high-performance drive environment. Among the different methods for sensorless control of induction motor drives, the model reference adaptive system (MRAS) attracts considerable attention due to its good performance. The analysis of the sensorless vector control system using MRAS is presented, and a speed observer robust to resistance parameter variations, using a new fuzzy self-tuning adaptive IP controller, is proposed. Fuzzy logic is reminiscent of human thinking processes and natural language, enabling decisions to be made based on vague information. The present approach helps to achieve a good dynamic response, disturbance rejection, and low sensitivity to plant parameter variations of the induction motor. In order to verify the performance of the proposed observer and control algorithms and to test the behaviour of the controlled system, numerical simulation is carried out. Simulation results are presented and discussed to show the validity and performance of the proposed observer.

  8. Proposition for sensorless self-excitation by a piezoelectric device

    NASA Astrophysics Data System (ADS)

    Tanaka, Y.; Kokubun, Y.; Yabuno, H.

    2018-04-01

    In this paper, we propose a method to realize self-excitation in an oscillator actuated by a piezoelectric device without a sensor. In general, the positive feedback associated with the oscillator velocity causes the self-excitation. Instead of measuring the velocity with a sensor, we utilize the electro-mechanical coupling effect in the oscillator and piezoelectric device. We drive the piezoelectric device with a current proportional to the linear combination of the voltage across the terminals of the piezoelectric device and its differential voltage signal. Then, the oscillator with the piezoelectric device behaves like a third-order system, which has three eigenvalues. The self-excitation can be realized because appropriate feedback gains can set two of the eigenvalues to be conjugate complex roots with a positive real part and the other eigenvalue to be a negative real root. To confirm the validity of the proposed method, we experimentally demonstrated the sensorless self-excitation and, as an application example, carried out mass sensing in a sensorless self-excited macrocantilever.

  9. Type-2 fuzzy logic control based MRAS speed estimator for speed sensorless direct torque and flux control of an induction motor drive.

    PubMed

    Ramesh, Tejavathu; Kumar Panda, Anup; Shiva Kumar, S

    2015-07-01

In this research study, a model reference adaptive system (MRAS) speed estimator for speed sensorless direct torque and flux control (DTFC) of an induction motor drive (IMD), using two adaptation mechanism schemes to replace the conventional proportional integral controller (PIC), is proposed. The first adaptation mechanism scheme is based on a Type-1 fuzzy logic controller (T1FLC), which is used to achieve a high-performance sensorless drive in both transient and steady-state conditions. However, Type-1 fuzzy sets are certain and unable to work effectively when a higher degree of uncertainty is present in the system, as can be caused by sudden changes in speed, load disturbances, process noise, etc. Therefore, a new Type-2 fuzzy logic controller (T2FLC) based adaptation mechanism scheme is proposed to better handle the higher degree of uncertainty; it improves performance and is robust to various load torques and sudden changes in speed. The detailed performance of the various adaptation mechanism schemes is evaluated in a MATLAB/Simulink environment, in speed-sensor and speed-sensorless modes of operation, with the IMD operating under different conditions, such as no load, load, and sudden changes in speed. To validate the different control approaches, the system was also implemented in real time, and adequate results are reported for its validation. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  10. MRAS state estimator for speed sensorless ISFOC induction motor drives with Luenberger load torque estimation.

    PubMed

    Zorgani, Youssef Agrebi; Koubaa, Yassine; Boussak, Mohamed

    2016-03-01

This paper presents a novel method for estimating the load torque of a sensorless indirect stator flux oriented controlled (ISFOC) induction motor drive based on the model reference adaptive system (MRAS) scheme. The method inter-connects a speed estimator with a load torque observer: a MRAS is applied to estimate the rotor speed with tuned load torque in order to obtain a high-performance ISFOC induction motor drive. The reference and adjustable models, developed in the stationary stator reference frame, are used in the MRAS scheme to estimate the speed from the measured terminal voltages and currents. The load torque is estimated by means of a Luenberger observer defined through the mechanical equation. The observer state matrix depends on the mechanical characteristics of the machine, taking into account the viscous friction coefficient and the moment of inertia. Simulation results are presented to validate the proposed method and to highlight the influence of variations of the moment of inertia and the friction coefficient on the speed and the estimated load torque. Experimental results concerning sensorless speed operation with load torque estimation are given to validate the effectiveness of the proposed method. The complete sensorless ISFOC drive with load torque estimation is successfully implemented in real time using a dSPACE DS1104 digital signal processor board for a laboratory 3 kW induction motor. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
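
    The Luenberger load-torque observer described here rests on the mechanical equation J·dω/dt = Te − TL − B·ω with TL treated as slowly varying (dTL/dt ≈ 0). A one-step Euler sketch follows; the gains are illustrative, not the paper's tuning, and in a sensorless drive the speed input can itself be the MRAS estimate.

    ```python
    def load_torque_observer_step(omega_meas, Te, omega_hat, TL_hat,
                                  J, B, l1, l2, dt):
        """One Euler step of a Luenberger load-torque observer on the
        mechanical equation J*domega/dt = Te - TL - B*omega, TL ~ constant.
        Gains l1, l2 > 0 are illustrative and must place the observer poles
        in the left half-plane."""
        e = omega_meas - omega_hat
        domega = (Te - TL_hat - B * omega_hat) / J + l1 * e
        dTL = -l2 * e     # actual TL > estimate => omega sags => estimate rises
        return omega_hat + domega * dt, TL_hat + dTL * dt
    ```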

  11. In vivo imaging of human photoreceptor mosaic with wavefront sensorless adaptive optics optical coherence tomography.

    PubMed

    Wong, Kevin S K; Jian, Yifan; Cua, Michelle; Bonora, Stefano; Zawadzki, Robert J; Sarunic, Marinko V

    2015-02-01

    Wavefront sensorless adaptive optics optical coherence tomography (WSAO-OCT) is a novel imaging technique for in vivo high-resolution depth-resolved imaging that mitigates some of the challenges encountered with the use of sensor-based adaptive optics designs. This technique replaces the Hartmann Shack wavefront sensor used to measure aberrations with a depth-resolved image-driven optimization algorithm, with the metric based on the OCT volumes acquired in real-time. The custom-built ultrahigh-speed GPU processing platform and fast modal optimization algorithm presented in this paper was essential in enabling real-time, in vivo imaging of human retinas with wavefront sensorless AO correction. WSAO-OCT is especially advantageous for developing a clinical high-resolution retinal imaging system as it enables the use of a compact, low-cost and robust lens-based adaptive optics design. In this report, we describe our WSAO-OCT system for imaging the human photoreceptor mosaic in vivo. We validated our system performance by imaging the retina at several eccentricities, and demonstrated the improvement in photoreceptor visibility with WSAO compensation.

  12. An Adaptive Supervisory Sliding Fuzzy Cerebellar Model Articulation Controller for Sensorless Vector-Controlled Induction Motor Drive Systems

    PubMed Central

    Wang, Shun-Yuan; Tseng, Chwan-Lu; Lin, Shou-Chuang; Chiu, Chun-Jung; Chou, Jen-Hsiang

    2015-01-01

This paper presents the implementation of an adaptive supervisory sliding fuzzy cerebellar model articulation controller (FCMAC) in the speed sensorless vector control of an induction motor (IM) drive system. The proposed adaptive supervisory sliding FCMAC comprised a supervisory controller, integral sliding surface, and an adaptive FCMAC. The integral sliding surface was employed to eliminate steady-state errors and enhance the responsiveness of the system. The adaptive FCMAC incorporated an FCMAC with a compensating controller to perform a desired control action. The proposed controller was derived using the Lyapunov approach, which guarantees learning-error convergence. The implementations of three intelligent control schemes—the adaptive supervisory sliding FCMAC, adaptive sliding FCMAC, and adaptive sliding CMAC—were experimentally investigated under various conditions in a realistic sensorless vector-controlled IM drive system. The root mean square error (RMSE) was used as a performance index to evaluate the experimental results of each control scheme. The analysis results indicated that the proposed adaptive supervisory sliding FCMAC substantially improved the system performance compared with the other control schemes. PMID:25815450

  13. Constant Switching Frequency DTC for Matrix Converter Fed Speed Sensorless Induction Motor Drive

    NASA Astrophysics Data System (ADS)

    Mir, Tabish Nazir; Singh, Bhim; Bhat, Abdul Hamid

    2018-05-01

The paper presents a constant switching frequency scheme for speed sensorless Direct Torque Control (DTC) of a matrix converter fed induction motor drive. The use of a matrix converter facilitates improved power quality on the input as well as the motor side, along with input power factor control, besides eliminating the need for heavy passive elements. Moreover, DTC through Space Vector Modulation helps in achieving fast control over the torque and flux of the motor, with the added benefit of constant switching frequency. A constant switching frequency aids in maintaining the desired power quality of the AC mains current even at low motor speeds, and simplifies the input filter design of the matrix converter, as compared to conventional hysteresis-based DTC. Further, the stator voltage is estimated from the sensed input voltage, and the stator (and rotor) flux is subsequently estimated. For speed sensorless operation, a Model Reference Adaptive System is used, which emulates the speed-dependent rotor flux equations of the induction motor. The error between the conventionally estimated rotor flux (reference model) and the rotor flux estimated through the adaptive observer is processed through a PI controller to generate the rotor speed estimate.

  14. An adaptive supervisory sliding fuzzy cerebellar model articulation controller for sensorless vector-controlled induction motor drive systems.

    PubMed

    Wang, Shun-Yuan; Tseng, Chwan-Lu; Lin, Shou-Chuang; Chiu, Chun-Jung; Chou, Jen-Hsiang

    2015-03-25

This paper presents the implementation of an adaptive supervisory sliding fuzzy cerebellar model articulation controller (FCMAC) in the speed sensorless vector control of an induction motor (IM) drive system. The proposed adaptive supervisory sliding FCMAC comprised a supervisory controller, integral sliding surface, and an adaptive FCMAC. The integral sliding surface was employed to eliminate steady-state errors and enhance the responsiveness of the system. The adaptive FCMAC incorporated an FCMAC with a compensating controller to perform a desired control action. The proposed controller was derived using the Lyapunov approach, which guarantees learning-error convergence. The implementations of three intelligent control schemes--the adaptive supervisory sliding FCMAC, adaptive sliding FCMAC, and adaptive sliding CMAC--were experimentally investigated under various conditions in a realistic sensorless vector-controlled IM drive system. The root mean square error (RMSE) was used as a performance index to evaluate the experimental results of each control scheme. The analysis results indicated that the proposed adaptive supervisory sliding FCMAC substantially improved the system performance compared with the other control schemes.

  15. Advanced simulation model for IPM motor drive with considering phase voltage and stator inductance

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Myung; Park, Hyun-Jong; Lee, Ju

    2016-10-01

This paper proposes an advanced simulation model of the driving system for Interior Permanent Magnet (IPM) BrushLess Direct Current (BLDC) motors driven by the 120-degree conduction method (two-phase conduction method, TPCM), which is widely used for sensorless control of BLDC motors. BLDC motors can be classified as SPM (Surface mounted Permanent Magnet) and IPM motors. The simulation model of a driving system with SPM motors is simple because the stator inductance is constant regardless of the rotor position, and such models have been proposed in many studies. On the other hand, simulation models for IPM driving systems built with graphic-based simulation tools such as Matlab/Simulink have not been proposed. Simulating the driving system of IPMs with TPCM is complex because the stator inductances of an IPM vary with the rotor position, as the permanent magnets are embedded in the rotor. To develop sensorless schemes or improve control performance, developing the control algorithm through simulation is essential, and a simulation model that accurately reflects the characteristics of the IPM is required. Therefore, this paper presents an advanced simulation model of the IPM driving system, which takes into account the unique position-dependent inductances of the IPM. The validity of the proposed simulation model is confirmed by comparison with experimental and simulation results using an IPM with the TPCM control scheme.

  16. Optimal model-based sensorless adaptive optics for epifluorescence microscopy.

    PubMed

    Pozzi, Paolo; Soloviev, Oleg; Wilding, Dean; Vdovin, Gleb; Verhaegen, Michel

    2018-01-01

    We report on a universal sample-independent sensorless adaptive optics method, based on modal optimization of the second moment of the fluorescence emission from a point-like excitation. Our method employs a sample-independent precalibration, performed only once for the particular system, to establish the direct relation between the image quality and the aberration. The method is potentially applicable to any form of microscopy with epifluorescence detection, including the practically important case of incoherent fluorescence emission from a three dimensional object, through minor hardware modifications. We have applied the technique successfully to a widefield epifluorescence microscope and to a multiaperture confocal microscope.

  17. Sensorless Control of Permanent Magnet Machine for NASA Flywheel Technology Development

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.; Kascak, Peter E.

    2002-01-01

    This paper describes the position sensorless algorithms presently used in the motor control for the NASA "in-house" development work of the flywheel energy storage system. At zero and low speeds a signal injection technique, the self-sensing method, is used to determine rotor position. At higher speeds, an open loop estimate of the back EMF of the machine is made to determine the rotor position. At start up, the rotor is set to a known position by commanding dc into one of the phase windings. Experimental results up to 52,000 rpm are presented.

  18. Ensemble machine learning and forecasting can achieve 99% uptime for rural handpumps

    PubMed Central

    Thomas, Evan A.

    2017-01-01

    Broken water pumps continue to impede efforts to deliver clean and economically-viable water to the global poor. The literature has demonstrated that customers’ health benefits and willingness to pay for clean water are best realized when clean water infrastructure performs extremely well (>99% uptime). In this paper, we used sensor data from 42 Afridev-brand handpumps observed for 14 months in western Kenya to demonstrate how sensors and supervised ensemble machine learning could be used to increase total fleet uptime from a best-practices baseline of about 70% to >99%. We accomplish this increase in uptime by forecasting pump failures and identifying existing failures very quickly. Comparing the costs of operating the pump per functional year over a lifetime of 10 years, we estimate that implementing this algorithm would save 7% on the levelized cost of water relative to a sensor-less scheduled maintenance program. Combined with a rigorous system for dispatching maintenance personnel, implementing this algorithm in a real-world program could significantly improve health outcomes and customers’ willingness to pay for water services. PMID:29182673
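
    The record does not list the model family or the engineered features, so the sketch below only illustrates a generic supervised-ensemble setup with scikit-learn on synthetic placeholder data; every feature, label, and tuning value here is hypothetical.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Hypothetical feature matrix: one row per pump-day, e.g. daily stroke
    # counts, vibration statistics, days since last maintenance (the paper's
    # actual features are not listed in this record).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 6))
    y = rng.integers(0, 2, size=1000)        # 1 = failed/failing, 0 = functional

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```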

  19. Sliding mode based fault detection, reconstruction and fault tolerant control scheme for motor systems.

    PubMed

    Mekki, Hemza; Benzineb, Omar; Boukhetala, Djamel; Tadjine, Mohamed; Benbouzid, Mohamed

    2015-07-01

The fault-tolerant control problem belongs to the domain of complex control systems, in which inter-disciplinary control information and expertise are required. This paper proposes an improved fault detection, reconstruction, and fault-tolerant control (FTC) scheme for motor systems (MS) with typical faults. For this purpose, a sliding mode controller (SMC) with an integral sliding surface is adopted. This controller makes the system output track the desired position reference signal in finite time and obtains a better dynamic response and anti-disturbance performance, but it cannot deal directly with total system failures. Therefore, an appropriate combination of the adopted SMC and a sliding mode observer (SMO) is designed to detect and reconstruct the faults online and to provide a sensorless control strategy that achieves tolerance to a wide class of total additive failures. The closed-loop stability is proved using Lyapunov stability theory. Simulation results in healthy and faulty conditions confirm the reliability of the suggested framework. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  20. Sensorless sliding mode observer for a five-phase permanent magnet synchronous motor drive.

    PubMed

    Hosseyni, Anissa; Trabelsi, Ramzi; Mimouni, Med Faouzi; Iqbal, Atif; Alammari, Rashid

    2015-09-01

This paper deals with a sensorless vector controlled five-phase permanent magnet synchronous motor (PMSM) drive based on a sliding mode observer (SMO). The observer is designed considering the back electromotive force (EMF) of the five-phase PMSM. The SMO structure and design are illustrated, and the stability of the proposed observer is demonstrated using Lyapunov criteria; the proposed strategy is asymptotically stable in the context of Lyapunov theory. Simulation results on a five-phase PMSM drive are presented to validate the feasibility and effectiveness of the proposed control strategy. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  1. An Approach to Sensorless Detection of Human Input Torque and Its Application to Power Assist Motion in Electric Wheelchair

    NASA Astrophysics Data System (ADS)

    Kaida, Yukiko; Murakami, Toshiyuki

A wheelchair is an important mobility apparatus for people with disabilities, and power-assist motion in an electric wheelchair expands the operator's range of activities. This paper describes force sensorless detection of the human input torque. A reaction torque estimation observer first calculates the total disturbance torque; the human input torque is then extracted from the estimated disturbance. In power-assist motion, the assist torque is synthesized as the product of the assist gain and the average of the right and left input torques. Finally, the proposed method is verified through power-assist motion experiments.
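
    The disturbance-observer structure underlying reaction torque estimation can be sketched compactly: low-pass filter the sum of the motor torque command and a scaled speed term, then subtract the scaled speed term. The human input torque is then obtained by removing modeled friction and other known disturbances from the estimate. All parameters below are illustrative, not the authors' values.

    ```python
    class ReactionTorqueObserver:
        """Disturbance-observer-style reaction torque estimator (generic
        sketch of the technique named in the abstract; g is the observer
        cut-off frequency, Kt the torque constant, J the wheel inertia)."""

        def __init__(self, Kt, J, g, dt):
            self.Kt, self.J, self.g, self.dt = Kt, J, g, dt
            self.z = 0.0   # low-pass filter state

        def step(self, i_ref, omega):
            # tau_dis = LPF[ Kt*i_ref + g*J*omega ] - g*J*omega
            x = self.Kt * i_ref + self.g * self.J * omega
            self.z += self.g * (x - self.z) * self.dt
            return self.z - self.g * self.J * omega
            # subtract modeled friction etc. from this to get human torque

    def assist_torque(tau_human_r, tau_human_l, assist_gain):
        """Assist torque = gain times the average of right/left human torques."""
        return assist_gain * 0.5 * (tau_human_r + tau_human_l)
    ```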

  2. Use of digital micromirror devices as dynamic pinhole arrays for adaptive confocal fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Pozzi, Paolo; Wilding, Dean; Soloviev, Oleg; Vdovin, Gleb; Verhaegen, Michel

    2018-02-01

    In this work, we present a new confocal laser scanning microscope capable to perform sensorless wavefront optimization in real time. The device is a parallelized laser scanning microscope in which the excitation light is structured in a lattice of spots by a spatial light modulator, while a deformable mirror provides aberration correction and scanning. A binary DMD is positioned in an image plane of the detection optical path, acting as a dynamic array of reflective confocal pinholes, images by a high performance cmos camera. A second camera detects images of the light rejected by the pinholes for sensorless aberration correction.

  3. Estimating Tool-Tissue Forces Using a 3-Degree-of-Freedom Robotic Surgical Tool.

    PubMed

    Zhao, Baoliang; Nelson, Carl A

    2016-10-01

Robot-assisted minimally invasive surgery (MIS) has gained popularity due to its high dexterity and reduced invasiveness to the patient; however, due to the loss of direct touch at the surgical site, surgeons may be prone to exerting larger forces and causing tissue damage. To quantify tool-tissue interaction forces, researchers have tried to attach various kinds of sensors to surgical tools. Such sensor attachment generally makes the tools bulky and/or unduly expensive and may hinder their normal function; it is also unlikely that these sensors can survive harsh sterilization processes. This paper investigates an alternative method that estimates tool-tissue interaction forces from the driving motors' currents, and validates this sensorless force estimation method on a 3-degree-of-freedom (DOF) robotic surgical grasper prototype. The results show that the performance of this method is acceptable with regard to latency and accuracy. With this tool-tissue interaction force estimation method, it is possible to implement force feedback on existing robotic surgical systems without any sensors. This may allow a haptic surgical robot that is compatible with existing sterilization methods and surgical procedures, so that the surgeon can obtain tool-tissue interaction forces in real time, thereby increasing surgical efficiency and safety.
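
    A hedged sketch of the current-based estimation idea (the paper's calibrated dynamic model and parameter values are not reproduced; every name below is illustrative): infer motor torque from current, subtract inertia and friction terms, and map the remainder through an assumed transmission radius.

    ```python
    def estimate_grasp_force(i_motor, omega, domega, Kt, J, b, tau_c, r_eff):
        """Rough tool-tissue force estimate from motor current. Illustrative
        parameters: Kt torque constant, J reflected inertia, b viscous
        friction, tau_c Coulomb friction, r_eff effective transmission
        radius of the grasper jaw."""
        tau_motor = Kt * i_motor
        sign_w = (omega > 0) - (omega < 0)
        tau_load = tau_motor - J * domega - b * omega - tau_c * sign_w
        return tau_load / r_eff
    ```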

  4. Sensorless position estimation and control of permanent-magnet synchronous motors using a saturation model

    NASA Astrophysics Data System (ADS)

    Kassem Jebai, Al; Malrait, François; Martin, Philippe; Rouchon, Pierre

    2016-03-01

    Sensorless control of permanent-magnet synchronous motors at low velocity remains a challenging task. A now well-established method consists of injecting a high-frequency signal and using the rotor saliency, both geometric and magnetic-saturation induced. This paper proposes a clear and original analysis based on second-order averaging of how to recover the position information from signal injection; this analysis blends well with a general model of magnetic saturation. It also proposes a simple parametric model of the saturated motor, based on an energy function which simply encompasses saturation and cross-saturation effects. Experimental results on a surface-mounted motor and an interior magnet motor illustrate the relevance of the approach.
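
    A generic textbook sketch of high-frequency signal injection (not this paper's saturation-model-based analysis): inject a pulsating voltage on the estimated d-axis, synchronously demodulate the resulting q-axis high-frequency current, and close a PI tracking loop that drives the demodulated error, which vanishes when the estimated angle matches the true one, to zero.

    ```python
    import math

    class HFInjectionTracker:
        """Sketch of pulsating HF injection position tracking: inject
        v_d = Vh*cos(w_h*t) on the estimated d-axis, demodulate the band-pass
        filtered estimated-q-axis current iq_hf, and run a PI tracking loop.
        Generic scheme with illustrative gains; the demodulation phase
        depends on the HF current lag in the actual machine."""

        def __init__(self, Vh, wh, kp, ki, dt):
            self.Vh, self.wh, self.kp, self.ki, self.dt = Vh, wh, kp, ki, dt
            self.t = 0.0
            self.lpf = 0.0       # demodulated, low-pass filtered error
            self.integ = 0.0
            self.theta_hat = 0.0
            self.omega_hat = 0.0

        def injection_voltage(self):
            return self.Vh * math.cos(self.wh * self.t)

        def step(self, iq_hf, lpf_w=200.0):
            # Synchronous demodulation: error ~ LPF[iq_hf * sin(w_h t)]
            e = iq_hf * math.sin(self.wh * self.t)
            self.lpf += lpf_w * (e - self.lpf) * self.dt
            # PI tracking loop turns the error into speed and angle estimates
            self.integ += self.ki * self.lpf * self.dt
            self.omega_hat = self.kp * self.lpf + self.integ
            self.theta_hat += self.omega_hat * self.dt
            self.t += self.dt
            return self.theta_hat, self.omega_hat
    ```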

  5. Model-based sensor-less wavefront aberration correction in optical coherence tomography.

    PubMed

    Verstraete, Hans R G W; Wahls, Sander; Kalkman, Jeroen; Verhaegen, Michel

    2015-12-15

Several sensor-less wavefront aberration correction methods that correct nonlinear wavefront aberrations by maximizing the optical coherence tomography (OCT) signal are tested on an OCT setup. A conventional coordinate search method is compared to two model-based optimization methods. The first model-based method takes advantage of the well-known NEWUOA optimization algorithm and utilizes a quadratic model. The second model-based method (DONE) is new and utilizes a random multidimensional Fourier-basis expansion. The model-based algorithms achieve lower wavefront errors with up to ten times fewer measurements. Furthermore, the newly proposed DONE method significantly outperforms the NEWUOA method. The DONE algorithm is tested on OCT images and shows a significantly improved image quality.

  6. Contrast-based sensorless adaptive optics for retinal imaging.

    PubMed

    Zhou, Xiaolin; Bedggood, Phillip; Bui, Bang; Nguyen, Christine T O; He, Zheng; Metha, Andrew

    2015-09-01

    Conventional adaptive optics ophthalmoscopes use wavefront sensing methods to characterize ocular aberrations for real-time correction. However, there are important situations in which the wavefront sensing step is susceptible to difficulties that affect the accuracy of the correction. To circumvent these, wavefront sensorless adaptive optics (or non-wavefront sensing AO; NS-AO) imaging has recently been developed and has been applied to point-scanning based retinal imaging modalities. In this study we show, for the first time, contrast-based NS-AO ophthalmoscopy for full-frame in vivo imaging of human and animal eyes. We suggest a robust image quality metric that could be used for any imaging modality, and test its performance against other metrics using (physical) model eyes.

  7. Sensorless speed detection of squirrel-cage induction machines using stator neutral point voltage harmonics

    NASA Astrophysics Data System (ADS)

    Petrovic, Goran; Kilic, Tomislav; Terzic, Bozo

    2009-04-01

    In this paper, a sensorless speed detection method for squirrel-cage induction machines is presented. The method is based on determining the frequency of the primary slot harmonic of the stator neutral point voltage, which depends on the rotor speed. In order to validate the method under steady-state and dynamic conditions, simulation and experimental studies were carried out. For the theoretical investigation, a mathematical model of squirrel-cage induction machines that takes into consideration the actual geometry and winding layout is used. Speed-related harmonics that arise from rotor slotting are analyzed using digital signal processing and a DFT algorithm with a Hanning window. The performance of the method is demonstrated over a wide range of load conditions.
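
    The frequency-determination step can be sketched directly: window the neutral-point voltage, take the DFT, and locate the slot-harmonic peak. The machine data and the +f1 sideband choice below are illustrative assumptions, not the authors' test machine.

      import numpy as np

      FS  = 10_000.0   # sampling rate [Hz]
      N_R = 28         # rotor slots (assumption)
      P   = 2          # pole pairs
      F1  = 50.0       # supply frequency [Hz]

      def slot_harmonic_speed(v_neutral, fs=FS):
          """Rotor speed [rpm] from the principal slot harmonic f_sh = N_r*f_r + f1."""
          w = np.hanning(len(v_neutral))
          spec = np.abs(np.fft.rfft(v_neutral * w))
          freqs = np.fft.rfftfreq(len(v_neutral), 1.0 / fs)
          # Search a band around the harmonic expected near synchronous speed
          band = (freqs > 0.5 * N_R * F1 / P) & (freqs < 1.5 * N_R * F1 / P)
          f_sh = freqs[band][np.argmax(spec[band])]
          f_r = (f_sh - F1) / N_R      # rotor mechanical frequency [Hz]
          return 60.0 * f_r

      # Synthetic test: 1425 rpm -> f_r = 23.75 Hz, f_sh = 28*23.75 + 50 = 715 Hz
      t = np.arange(0, 1.0, 1.0 / FS)
      v = np.sin(2*np.pi*F1*t) + 0.05*np.sin(2*np.pi*715.0*t)
      print(slot_harmonic_speed(v))    # ~1425 rpm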

  8. The Technique of Changing the Drive Method of Micro Step Drive and Sensorless Drive for Hybrid Stepping Motor

    NASA Astrophysics Data System (ADS)

    Yoneda, Makoto; Dohmeki, Hideo

    A position control system with the advantages of large torque, low vibration, and high resolution can be obtained with constant-current micro-step drive applied to a hybrid stepping motor. However, losses are large, because the current is controlled uniformly regardless of load torque. Sensorless control, as used for permanent magnet motors, is an effective technique for realizing a high-efficiency position control system; however, the control methods proposed so far are aimed at speed control. This paper therefore proposes switching between the micro-step drive and sensorless drive methods. The switchover was verified by simulation and experiment. At no load, it was confirmed that no large speed change occurs at the switchover when the electrical angle is set and the integrator is reset to zero. Under load, a large speed change was observed. The proposed system can switch drive methods without producing a speed change by initializing the integrator with the estimated value. With this technique, a low-loss position control system that exploits the advantages of the hybrid stepping motor has been built.

  9. New Technique of High-Performance Torque Control Developed for Induction Machines

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.

    2003-01-01

    Two forms of high-performance torque control for motor drives have been described in the literature: field orientation control and direct torque control. Field orientation control has been the method of choice for previous NASA electromechanical actuator research efforts with induction motors. Direct torque control has the potential to offer some advantages over field orientation, including ease of implementation and faster response. However, the most common form of direct torque control is not suitable for the high-speed, low-stator-flux-linkage induction machines designed for electromechanical actuators with the presently available sample rates of digital control systems (higher sample rates are required). In addition, this form of direct torque control is not suitable for the addition of a high-frequency carrier signal necessary for the "self-sensing" (sensorless) position estimation technique. This technique enables low- and zero-speed position sensorless operation of the machine. Sensorless operation is desirable to reduce the number of necessary feedback signals and transducers, thus improving the reliability and reducing the mass and volume of the system. This research was directed at developing an alternative form of direct torque control known as a "deadbeat," or inverse model, solution. This form uses pulse-width modulation of the voltage applied to the machine, thus reducing the necessary sample and switching frequency for the high-speed NASA motor. In addition, the structure of the deadbeat form allows the addition of the high-frequency carrier signal so that low- and zero-speed sensorless operation is possible. The new deadbeat solution is based on using the stator and rotor flux as state variables. This choice of state variables leads to a simple graphical representation of the solution as the intersection of a constant torque line with a constant stator flux circle. Previous solutions have been expressed only in complex mathematical terms without a method to clearly visualize the solution. The graphical technique allows a more insightful understanding of the operation of the machine under various conditions.
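
    The intersection described above reduces to simple algebra in a frame aligned with the rotor flux: the constant-torque locus fixes the flux component perpendicular to the rotor flux, and the constant-flux circle then fixes the parallel component. The sketch below illustrates this under simplifying assumptions (rotor flux frozen over one sample, illustrative machine constants); it is not the NASA implementation.

      import numpy as np

      P_POLE = 2                        # pole pairs (assumption)
      LS, LR, M = 0.11, 0.11, 0.105     # inductances [H] (assumptions)
      SIGMA = 1.0 - M**2 / (LS * LR)    # leakage coefficient
      K_T = 1.5 * P_POLE * M / (SIGMA * LS * LR)
      RS, TS = 0.5, 1e-4                # stator resistance [ohm], sample period [s]

      def deadbeat_voltage(psi_s, psi_r, i_s, torque_ref, flux_ref):
          """Stator voltage (complex, alpha-beta) reaching the commanded
          torque and stator flux magnitude in one sample.

          Torque = K_T * Im(conj(psi_r) * psi_s): in rotor-flux coordinates the
          constant-torque locus is a line, the constant-flux locus a circle."""
          u_r = psi_r / abs(psi_r)                  # rotor-flux direction
          perp = torque_ref / (K_T * abs(psi_r))    # torque line: fixed perpendicular part
          par = np.sqrt(max(flux_ref**2 - perp**2, 0.0))   # circle intersection
          psi_s_next = (par + 1j * perp) * u_r      # back to stator coordinates
          return (psi_s_next - psi_s) / TS + RS * i_s

      print(deadbeat_voltage(0.9 + 0.05j, 0.85 + 0.0j, 4.0 - 1.0j, 20.0, 0.95))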

  10. Wavefront sensorless adaptive optics optical coherence tomography for in vivo retinal imaging in mice

    PubMed Central

    Jian, Yifan; Xu, Jing; Gradowski, Martin A.; Bonora, Stefano; Zawadzki, Robert J.; Sarunic, Marinko V.

    2014-01-01

    We present wavefront sensorless adaptive optics (WSAO) Fourier domain optical coherence tomography (FD-OCT) for in vivo small animal retinal imaging. WSAO is attractive especially for mouse retinal imaging because it simplifies optical design and eliminates the need for wavefront sensing, which is difficult in the small animal eye. GPU accelerated processing of the OCT data permitted real-time extraction of image quality metrics (intensity) for arbitrarily selected retinal layers to be optimized. Modal control of a commercially available segmented deformable mirror (IrisAO Inc.) provided rapid convergence using a sequential search algorithm. Image quality improvements with WSAO OCT are presented for both pigmented and albino mouse retinal data, acquired in vivo. PMID:24575347

  11. Stator and Rotor Flux Based Deadbeat Direct Torque Control of Induction Machines

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.; Lorenz, Robert D.

    2001-01-01

    A new, deadbeat type of direct torque control is proposed, analyzed, and experimentally verified in this paper. The control is based on stator and rotor flux as state variables. This choice of state variables allows a graphical representation which is transparent and insightful. The graphical solution shows the effects of realistic considerations such as voltage and current limits. A position and speed sensorless implementation of the control, based on the self-sensing signal injection technique, is also demonstrated experimentally for low speed operation. The paper first develops the new, deadbeat DTC methodology and graphical representation of the new algorithm. It then evaluates feasibility via simulation and experimentally demonstrates performance of the new method with a laboratory prototype including the sensorless methods.

  12. Contrast-based sensorless adaptive optics for retinal imaging

    PubMed Central

    Zhou, Xiaolin; Bedggood, Phillip; Bui, Bang; Nguyen, Christine T.O.; He, Zheng; Metha, Andrew

    2015-01-01

    Conventional adaptive optics ophthalmoscopes use wavefront sensing methods to characterize ocular aberrations for real-time correction. However, there are important situations in which the wavefront sensing step is susceptible to difficulties that affect the accuracy of the correction. To circumvent these, wavefront sensorless adaptive optics (or non-wavefront sensing AO; NS-AO) imaging has recently been developed and has been applied to point-scanning based retinal imaging modalities. In this study we show, for the first time, contrast-based NS-AO ophthalmoscopy for full-frame in vivo imaging of human and animal eyes. We suggest a robust image quality metric that could be used for any imaging modality, and test its performance against other metrics using (physical) model eyes. PMID:26417525

  13. Sensorless position estimator applied to nonlinear IPMC model

    NASA Astrophysics Data System (ADS)

    Bernat, Jakub; Kolota, Jakub

    2016-11-01

    This paper addresses the issue of estimating position for an ionic polymer metal composite (IPMC), a type of electroactive polymer (EAP). The key step is the construction of a sensorless model that relies only on current feedback. This work takes into account nonlinearities caused by electrochemical effects in the material. Using recent observer design techniques, the authors obtained both a Lyapunov-function-based estimation law and a sliding-mode observer. To accomplish the observer design, the IPMC model was identified through a series of experiments. The research comprises time domain measurements. The identification process was completed by means of geometric scaling of three test samples. In the proposed design, the estimated position accurately tracks the polymer position, as illustrated by the experiments.

  14. Stator and Rotor Flux Based Deadbeat Direct Torque Control of Induction Machines

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.; Lorenz, Robert D.

    2003-01-01

    A new, deadbeat type of direct torque control is proposed, analyzed and experimentally verified in this paper. The control is based on stator and rotor flux as state variables. This choice of state variables allows a graphical representation which is transparent and insightful. The graphical solution shows the effects of realistic considerations such as voltage and current limits. A position and speed sensorless implementation of the control, based on the self-sensing signal injection technique, is also demonstrated experimentally for low speed operation. The paper first develops the new, deadbeat DTC methodology and graphical representation of the new algorithm. It then evaluates feasibility via simulation and experimentally demonstrates performance of the new method with a laboratory prototype including the sensorless methods.

  15. Stator and Rotor Flux Based Deadbeat Direct Torque Control of Induction Machines. Revision 1

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.; Lorenz, Robert D.

    2002-01-01

    A new, deadbeat type of direct torque control is proposed, analyzed, and experimentally verified in this paper. The control is based on stator and rotor flux as state variables. This choice of state variables allows a graphical representation which is transparent and insightful. The graphical solution shows the effects of realistic considerations such as voltage and current limits. A position and speed sensorless implementation of the control, based on the self-sensing signal injection technique, is also demonstrated experimentally for low speed operation. The paper first develops the new, deadbeat DTC methodology and graphical representation of the new algorithm. It then evaluates feasibility via simulation and experimentally demonstrates performance of the new method with a laboratory prototype including the sensorless methods.

  16. In Vivo Evaluation of Active and Passive Physiological Control Systems for Rotary Left and Right Ventricular Assist Devices.

    PubMed

    Gregory, Shaun D; Stevens, Michael C; Pauls, Jo P; Schummy, Emma; Diab, Sara; Thomson, Bruce; Anderson, Ben; Tansley, Geoff; Salamonsen, Robert; Fraser, John F; Timms, Daniel

    2016-09-01

    Preventing ventricular suction and venous congestion through balancing flow rates and circulatory volumes with dual rotary ventricular assist devices (VADs) configured for biventricular support is clinically challenging due to their low preload and high afterload sensitivities relative to the natural heart. This study presents the in vivo evaluation of several physiological control systems, which aim to prevent ventricular suction and venous congestion. The control systems included a sensor-based, master/slave (MS) controller that altered left and right VAD speed based on pressure and flow; a sensor-less compliant inflow cannula (IC), which altered inlet resistance and, therefore, pump flow based on preload; a sensor-less compliant outflow cannula (OC) on the right VAD, which altered outlet resistance and thus pump flow based on afterload; and a combined controller, which incorporated the MS controller, compliant IC, and compliant OC. Each control system was evaluated in vivo under step increases in systemic (SVR ∼1400-2400 dyne·s/cm^5) and pulmonary (PVR ∼200-1000 dyne·s/cm^5) vascular resistances in four sheep supported by dual rotary VADs in a biventricular assist configuration. Constant speed support was also evaluated for comparison and resulted in suction events during all resistance increases and pulmonary congestion during SVR increases. The MS controller reduced suction events and prevented congestion through an initial sharp reduction in pump flow followed by a gradual return to baseline (5.0 L/min). The compliant IC prevented suction events; however, reduced pump flows and pulmonary congestion were noted during the SVR increase. The compliant OC maintained pump flow close to baseline (5.0 L/min) and prevented suction and congestion during PVR increases. The combined controller responded similarly to the MS controller to prevent suction and congestion events in all cases while providing a backup system in the event of single controller failure. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  17. Estimating Tool–Tissue Forces Using a 3-Degree-of-Freedom Robotic Surgical Tool

    PubMed Central

    Zhao, Baoliang; Nelson, Carl A.

    2016-01-01

    Robot-assisted minimally invasive surgery (MIS) has gained popularity due to its high dexterity and reduced invasiveness to the patient; however, due to the loss of direct touch of the surgical site, surgeons may be prone to exert larger forces and cause tissue damage. To quantify tool–tissue interaction forces, researchers have tried to attach different kinds of sensors on the surgical tools. This sensor attachment generally makes the tools bulky and/or unduly expensive and may hinder the normal function of the tools; it is also unlikely that these sensors can survive harsh sterilization processes. This paper investigates an alternative method by estimating tool–tissue interaction forces using driving motors' current, and validates this sensorless force estimation method on a 3-degree-of-freedom (DOF) robotic surgical grasper prototype. The results show that the performance of this method is acceptable with regard to latency and accuracy. With this tool–tissue interaction force estimation method, it is possible to implement force feedback on existing robotic surgical systems without any sensors. This may allow a haptic surgical robot which is compatible with existing sterilization methods and surgical procedures, so that the surgeon can obtain tool–tissue interaction forces in real time, thereby increasing surgical efficiency and safety. PMID:27303591

  18. A high speed model-based approach for wavefront sensorless adaptive optics systems

    NASA Astrophysics Data System (ADS)

    Lianghua, Wen; Yang, Ping; Shuai, Wang; Wenjing, Liu; Shanqiu, Chen; Xu, Bing

    2018-02-01

    To improve the temporal-frequency properties of wavefront sensorless adaptive optics (AO) systems, a fast general model-based aberration correction algorithm is presented. The approach is based on the approximately linear relation between the mean square of the aberration gradients and the second moment of the far-field intensity distribution. The presented method can effectively correct a modal aberration by applying only one disturbance to the deformable mirror (one correction per disturbance); the modal basis is reconstructed by singular value decomposition of the correlation matrix of the Zernike functions' gradients. Numerical simulations of AO correction under various random and dynamic aberrations are implemented. The simulation results indicate that the equivalent control bandwidth is 2-3 times that of the previous method, which obtains one aberration correction after applying N disturbances to the deformable mirror (one correction per N disturbances).

  19. Wavefront sensorless adaptive optics OCT with the DONE algorithm for in vivo human retinal imaging [Invited]

    PubMed Central

    Verstraete, Hans R. G. W.; Heisler, Morgan; Ju, Myeong Jin; Wahl, Daniel; Bliek, Laurens; Kalkman, Jeroen; Bonora, Stefano; Jian, Yifan; Verhaegen, Michel; Sarunic, Marinko V.

    2017-01-01

    In this report, which is an international collaboration of OCT, adaptive optics, and control research, we demonstrate the Data-based Online Nonlinear Extremum-seeker (DONE) algorithm to guide the image based optimization for wavefront sensorless adaptive optics (WFSL-AO) OCT for in vivo human retinal imaging. The ocular aberrations were corrected using a multi-actuator adaptive lens after linearization of the hysteresis in the piezoelectric actuators. The DONE algorithm succeeded in drastically improving image quality and the OCT signal intensity, up to a factor seven, while achieving a computational time of 1 ms per iteration, making it applicable for many high speed applications. We demonstrate the correction of five aberrations using 70 iterations of the DONE algorithm performed over 2.8 s of continuous volumetric OCT acquisition. Data acquired from an imaging phantom and in vivo from human research volunteers are presented. PMID:28736670

  20. A New Unified Analysis of Estimate Errors by Model-Matching Phase-Estimation Methods for Sensorless Drive of Permanent-Magnet Synchronous Motors and New Trajectory-Oriented Vector Control, Part II

    NASA Astrophysics Data System (ADS)

    Shinnaka, Shinji

    This paper presents a new unified analysis of estimation errors in model-matching extended-back-EMF estimation methods for sensorless drives of permanent-magnet synchronous motors. Analytical solutions for the estimation errors, whose validity is confirmed by numerical experiments, are highly universal and applicable. As an example of this universality and applicability, a new trajectory-oriented vector control method is proposed, which can directly realize a quasi-optimal strategy minimizing total losses with no additional computational load, simply by orienting one of the vector-control coordinates to the associated quasi-optimal trajectory. The coordinate orientation rule, which is analytically derived, is surprisingly simple. Consequently, the trajectory-oriented vector control method can be applied to a number of conventional vector control systems using model-matching extended-back-EMF estimation methods.

  1. Study on Stability of High Speed Traction Drive CVT for Aircraft Generator

    NASA Astrophysics Data System (ADS)

    Goi, Tatsuhiko; Tanaka, Hirohisa; Nakashima, Kenichi; Watanabe, Koji

    A half-toroidal traction drive CVT has the feature of small spin at the traction pitch over the whole speed-ratio range of 1:4, which suits it to transmitting high rotational speed with minimum temperature increase of the traction surface. Research activity on the traction drive CVT commenced in 1996, with the aim of applying it to an aircraft 24,000 rpm constant-speed generator in place of a hydrostatic transmission. This paper presents the fundamental design of a 90 kW traction drive integrated drive generator, "T-IDG", and a stability analysis of a sensorless electro-hydraulic speed-control servomechanism using bond graphs. Performance tests of the T-IDG mounted on a test bench and on an actual jet engine proved that the control system using the sensorless servomechanism can keep the generator speed within the MIL-STD-704E allowable limits against steep changes of speed and load.

  2. Wavefront sensorless adaptive optics OCT with the DONE algorithm for in vivo human retinal imaging [Invited].

    PubMed

    Verstraete, Hans R G W; Heisler, Morgan; Ju, Myeong Jin; Wahl, Daniel; Bliek, Laurens; Kalkman, Jeroen; Bonora, Stefano; Jian, Yifan; Verhaegen, Michel; Sarunic, Marinko V

    2017-04-01

    In this report, which is an international collaboration of OCT, adaptive optics, and control research, we demonstrate the Data-based Online Nonlinear Extremum-seeker (DONE) algorithm to guide the image based optimization for wavefront sensorless adaptive optics (WFSL-AO) OCT for in vivo human retinal imaging. The ocular aberrations were corrected using a multi-actuator adaptive lens after linearization of the hysteresis in the piezoelectric actuators. The DONE algorithm succeeded in drastically improving image quality and the OCT signal intensity, up to a factor seven, while achieving a computational time of 1 ms per iteration, making it applicable for many high speed applications. We demonstrate the correction of five aberrations using 70 iterations of the DONE algorithm performed over 2.8 s of continuous volumetric OCT acquisition. Data acquired from an imaging phantom and in vivo from human research volunteers are presented.

  3. Sensorless control of ship propulsion interior permanent magnet synchronous motor based on a new sliding mode observer.

    PubMed

    Ren, Jun-Jie; Liu, Yan-Cheng; Wang, Ning; Liu, Si-Yuan

    2015-01-01

    This paper proposes a sensorless speed control strategy for a ship propulsion interior permanent magnet synchronous motor (IPMSM) based on a new sliding-mode observer (SMO). In the SMO, neither a low-pass filter, nor the arc-tangent calculation of the extended electromotive force (EMF), nor a phase-locked loop (PLL) technique is used. The calculation of the rotor speed is deduced from a Lyapunov function stability analysis. In order to reduce system chattering, sigmoid functions with switching gains that are adaptively updated by fuzzy logic systems are innovatively incorporated into the SMO. Finally, simulation results for a 4.088 MW ship propulsion IPMSM and experimental results from a 7.5 kW IPMSM drive are provided to verify the effectiveness of the proposed SMO method. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
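
    The structure of such an observer is compact even though the gain adaptation is the hard part. Below is a minimal back-EMF sliding-mode observer sketch in the stationary frame with a fixed sigmoid gain; the fuzzy adaptation of the switching gains and the Lyapunov-based speed calculation from the paper are omitted, and all machine data are assumptions.

      import numpy as np

      R, L, DT = 0.05, 1e-3, 1e-5   # stator resistance [ohm], inductance [H], step [s]
      K_SMO, A_SIG = 80.0, 500.0    # switching gain, sigmoid steepness

      def sigmoid(x):
          # Equivalent to 2/(1+exp(-a*x)) - 1, written as tanh to avoid overflow
          return np.tanh(0.5 * A_SIG * x)

      def smo_step(i_hat, i_meas, v):
          """One observer step on alpha-beta vectors. The switching term z
          tracks the back EMF, from which the rotor angle follows."""
          z = K_SMO * sigmoid(i_hat - i_meas)
          i_hat = i_hat + (v - R * i_hat - z) * DT / L
          return i_hat, z

      i_hat, z = smo_step(np.zeros(2), np.array([1.0, 0.2]), np.array([10.0, 2.0]))
      # Back EMF: e_alpha = -w*psi*sin(theta), e_beta = w*psi*cos(theta)
      print(np.arctan2(-z[0], z[1]))   # rotor angle estimate [rad]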

  4. Wavefront sensorless adaptive optics temporal focusing-based multiphoton microscopy

    PubMed Central

    Chang, Chia-Yuan; Cheng, Li-Chung; Su, Hung-Wei; Hu, Yvonne Yuling; Cho, Keng-Chi; Yen, Wei-Chung; Xu, Chris; Dong, Chen Yuan; Chen, Shean-Jen

    2014-01-01

    Temporal profile distortions reduce the excitation efficiency and image quality in temporal focusing-based multiphoton microscopy. In order to compensate for the distortions, a wavefront sensorless adaptive optics system (AOS) was integrated into the microscope. The feedback control signal of the AOS was acquired from local image intensity maximization via a hill-climbing algorithm. The control signal was then utilized to drive a deformable mirror in such a way as to eliminate the distortions. With the AOS correction, not only is the axial excitation symmetrically refocused, but the axial resolution with full two-photon excited fluorescence (TPEF) intensity is also maintained. Hence, the contrast of the TPEF image of an R6G-doped PMMA thin film is enhanced, along with a 3.7-fold increase in intensity. Furthermore, the TPEF image quality of 1-μm fluorescent beads sealed in agarose gel at different depths is improved. PMID:24940539
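
    Hill climbing over mirror modes is the simplest of these metric-maximization loops. The sketch below shows the bare algorithm with a hypothetical acquire_metric() standing in for grabbing a TPEF image and computing the local intensity; the quadratic "system" inside it is purely an illustrative assumption.

      import numpy as np

      rng = np.random.default_rng(0)
      HIDDEN = np.array([0.3, -0.2, 0.1, 0.0, -0.4])   # unknown aberration (illustrative)

      def acquire_metric(coeffs):
          # Hypothetical stand-in for "apply DM shape, grab image, measure intensity"
          return -np.sum((coeffs - HIDDEN) ** 2)

      def hill_climb(n_modes=5, step=0.05, n_iter=200):
          coeffs = np.zeros(n_modes)
          best = acquire_metric(coeffs)
          for _ in range(n_iter):
              mode = rng.integers(n_modes)
              for delta in (+step, -step):
                  trial = coeffs.copy()
                  trial[mode] += delta
                  m = acquire_metric(trial)
                  if m > best:              # keep a perturbation only if it helps
                      coeffs, best = trial, m
                      break
          return coeffs, best

      print(hill_climb())   # coefficients approach the hidden aberration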

  5. Implementation of a MFAC based position sensorless drive for high speed BLDC motors with nonideal back EMF.

    PubMed

    Li, Haitao; Ning, Xin; Li, Wenzhuo

    2017-03-01

    In order to improve the reliability and reduce the power consumption of high speed BLDC motor systems, this paper presents a model-free adaptive control (MFAC) based position sensorless drive with only a dc-link current sensor. The initial commutation points are obtained by detecting the phase of the back-EMF zero-crossing point and then delaying 30 electrical degrees. To address the commutation error caused by the low-pass filter (LPF) and other factors, the relationship between the commutation error angle and the dc-link current is analyzed, a corresponding MFAC-based control method is proposed, and the commutation error is corrected by the controller in real time. Both the simulation and experimental results show that the proposed correction method can achieve the ideal commutation effect over the entire operating speed range. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
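
    The "delay 30 electrical degrees" step is a small speed-dependent time calculation, sketched below with an assumed pole-pair count rather than the motor from the paper.

      def commutation_delay_s(speed_rpm, pole_pairs=4):
          """Time [s] corresponding to 30 electrical degrees at a given speed."""
          f_e = speed_rpm / 60.0 * pole_pairs    # electrical frequency [Hz]
          return (30.0 / 360.0) / f_e            # 30 deg as a fraction of one period

      print(commutation_delay_s(30000.0))   # ~41.7 us at 30,000 rpm, 4 pole pairs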

  6. Coherence-Gated Sensorless Adaptive Optics Multiphoton Retinal Imaging

    PubMed Central

    Cua, Michelle; Wahl, Daniel J.; Zhao, Yuan; Lee, Sujin; Bonora, Stefano; Zawadzki, Robert J.; Jian, Yifan; Sarunic, Marinko V.

    2016-01-01

    Multiphoton microscopy enables imaging deep into scattering tissues. The efficient generation of non-linear optical effects is related to both the pulse duration (typically on the order of femtoseconds) and the size of the focused spot. Aberrations introduced by refractive index inhomogeneity in the sample distort the wavefront and enlarge the focal spot, which reduces the multiphoton signal. Traditional approaches to adaptive optics wavefront correction are not effective in thick or multi-layered scattering media. In this report, we present sensorless adaptive optics (SAO) using low-coherence interferometric detection of the excitation light for depth-resolved aberration correction of two-photon excited fluorescence (TPEF) in biological tissue. We demonstrate coherence-gated SAO TPEF using a transmissive multi-actuator adaptive lens for in vivo imaging in a mouse retina. This configuration has significant potential for reducing the laser power required for adaptive optics multiphoton imaging, and for facilitating integration with existing systems. PMID:27599635

  7. Analysis of field-oriented controlled induction motor drives under sensor faults and an overview of sensorless schemes.

    PubMed

    Arun Dominic, D; Chelliah, Thanga Raj

    2014-09-01

    To obtain high dynamic performance from induction motor drives (IMD), variable-voltage, variable-frequency operation has to be performed by measuring the rotational speed and stator currents through sensors and feeding them back to the controllers. When a sensor undergoes a fault, the stability of the control system, which may be designed for an industrial process, is disturbed. This paper studies the negative effects on a 12.5 hp induction motor drive when the field-oriented control system is subjected to sensor faults. To illustrate the importance of this study, a mine-hoist load diagram is considered as the shaft load of the tested machine. Methods to recover the system from sensor faults are discussed. In addition, various speed-sensorless schemes are reviewed comprehensively. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  8. Design and realization of adaptive optical principle system without wavefront sensing

    NASA Astrophysics Data System (ADS)

    Wang, Xiaobin; Niu, Chaojun; Guo, Yaxing; Han, Xiang'e.

    2018-02-01

    In this paper, we focus on improving the performance of free space optical communication systems and carry out research on wavefront-sensorless adaptive optics. We use a phase-only liquid crystal spatial light modulator (SLM) as the wavefront corrector. The optical intensity distribution of the distorted wavefront is detected by a CCD. We developed a wavefront controller based on ARM and software based on the Linux operating system. The wavefront controller controls the CCD camera and the wavefront corrector. There are two SLMs in the experimental system: one simulates atmospheric turbulence and the other compensates the wavefront distortion. The experimental results show that the performance quality metric (the total gray value of 25 pixels) increases from 3037 to 4863 after 200 iterations. Furthermore, it is demonstrated that our wavefront-sensorless adaptive optics system based on the SPGD algorithm performs well in compensating wavefront distortion.
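
    SPGD itself is only a few lines: perturb all control channels at once, measure the metric change, and step along the perturbation. The sketch below uses a quadratic stand-in for the measured 25-pixel gray-value metric; gains, dimensions, and the hidden aberration are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(1)
      hidden = 0.3 * rng.normal(size=32)    # unknown aberration (illustrative)

      def metric(u):
          # Stand-in for the measured performance metric (total gray value)
          return 5000.0 - 200.0 * np.sum((u - hidden) ** 2)

      def spgd(n=32, gain=0.2, sigma=0.05, n_iter=300):
          u = np.zeros(n)
          for _ in range(n_iter):
              du = sigma * rng.choice([-1.0, 1.0], size=n)  # Bernoulli perturbation
              dj = metric(u + du) - metric(u - du)          # two-sided metric difference
              u += gain * dj * du                           # all channels updated in parallel
          return u

      print(metric(spgd()))   # climbs toward the unaberrated value of 5000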

  9. Coherence-Gated Sensorless Adaptive Optics Multiphoton Retinal Imaging.

    PubMed

    Cua, Michelle; Wahl, Daniel J; Zhao, Yuan; Lee, Sujin; Bonora, Stefano; Zawadzki, Robert J; Jian, Yifan; Sarunic, Marinko V

    2016-09-07

    Multiphoton microscopy enables imaging deep into scattering tissues. The efficient generation of non-linear optical effects is related to both the pulse duration (typically on the order of femtoseconds) and the size of the focused spot. Aberrations introduced by refractive index inhomogeneity in the sample distort the wavefront and enlarge the focal spot, which reduces the multiphoton signal. Traditional approaches to adaptive optics wavefront correction are not effective in thick or multi-layered scattering media. In this report, we present sensorless adaptive optics (SAO) using low-coherence interferometric detection of the excitation light for depth-resolved aberration correction of two-photon excited fluorescence (TPEF) in biological tissue. We demonstrate coherence-gated SAO TPEF using a transmissive multi-actuator adaptive lens for in vivo imaging in a mouse retina. This configuration has significant potential for reducing the laser power required for adaptive optics multiphoton imaging, and for facilitating integration with existing systems.

  10. EFFICIENCY OPTIMIZATION CONTROL OF AC INDUCTION MOTORS: INITIAL LABORATORY RESULTS

    EPA Science Inventory

    The report discusses the development of a fuzzy logic, energy-optimizing controller to improve the efficiency of motor/drive combinations that operate at varying loads and speeds. This energy optimizer is complemented by a sensorless speed controller that maintains motor shaft re...

  11. Sensorless Sinusoidal Drives for Fan and Pump Motors by V/f Control

    NASA Astrophysics Data System (ADS)

    Kiuchi, Mitsuyuki; Ohnishi, Tokuo

    This paper proposes sensorless sinusoidal driving methods for permanent magnet synchronous motors in fans and pumps using V/f control. The proposed methods are simple methods that keep the motor peak current constant through voltage or frequency control, and are characterized by DC link current detection using a single shunt resistor at the bottom timing of the carrier wave signal. Owing to the damping effect of the square-law torque load characteristics of fan and pump motors, stable starting and stable steady-state operation can be achieved with V/f control. In general, the pressure losses in the fluid path of fan and pump systems are nearly constant; therefore, the flow rate and motor torque are determined by the speed. Accordingly, high efficiency driving is possible by setting the corresponding currents as q-axis currents (torque currents) at the target speed. Because of the simple current detection and motor control methods, the proposed methods are optimal for the fan and pump motor driving systems of home appliances.
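
    The control idea reduces to a frequency ramp, a fixed V/f ratio, and a slow voltage trim that holds the measured peak current at its target. The sketch below is one plausible reading of that scheme; the ratio, ramp rate, and gains are assumptions, not values from the paper.

      import numpy as np

      V_PER_HZ, F_RAMP, DT = 4.6, 20.0, 1e-3   # V/f ratio [V/Hz], ramp [Hz/s], step [s]
      K_TRIM, I_REF = 0.5, 1.5                 # trim gain, peak-current target [A]

      def vf_step(f_cmd, v_trim, f_target, i_peak_meas):
          """One control step: returns (new f_cmd, voltage command, new v_trim)."""
          # Slew-rate-limited frequency reference
          f_cmd += np.clip(f_target - f_cmd, -F_RAMP * DT, F_RAMP * DT)
          # Slow voltage trim keeps the measured peak current at I_REF
          v_trim += K_TRIM * (I_REF - i_peak_meas) * DT
          return f_cmd, V_PER_HZ * f_cmd + v_trim, v_trim

      print(vf_step(10.0, 0.0, 50.0, 1.4))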

  12. Fast correction approach for wavefront sensorless adaptive optics based on a linear phase diversity technique.

    PubMed

    Yue, Dan; Nie, Haitao; Li, Ye; Ying, Changsheng

    2018-03-01

    Wavefront sensorless (WFSless) adaptive optics (AO) systems have been widely studied in recent years. To reach optimum results, such systems require an efficient correction method. This paper presents a fast wavefront correction approach for a WFSless AO system based mainly on the linear phase diversity (PD) technique. The fast closed-loop control algorithm is set up based on the linear relationship between the drive voltage of the deformable mirror (DM) and the far-field images of the system, which is obtained through the linear PD algorithm combined with the influence function of the DM. A large number of phase screens under different turbulence strengths are simulated to test the performance of the proposed method. The numerical simulation results show that the method has a fast convergence rate and strong correction capability: a few correction iterations achieve good results and effectively improve the imaging quality of the system while requiring fewer CCD measurements.

  13. A New Unified Analysis of Estimate Errors by Model-Matching Phase-Estimation Methods for Sensorless Drive of Permanent-Magnet Synchronous Motors and New Trajectory-Oriented Vector Control, Part I

    NASA Astrophysics Data System (ADS)

    Shinnaka, Shinji; Sano, Kousuke

    This paper presents a new unified analysis of estimation errors in model-matching phase-estimation methods, such as rotor-flux state observers, back-EMF state observers, and back-EMF disturbance observers, for sensorless drives of permanent-magnet synchronous motors. Analytical solutions for the estimation errors, whose validity is confirmed by numerical experiments, are highly universal and applicable. As an example of this universality and applicability, a new trajectory-oriented vector control method is proposed, which can directly realize a quasi-optimal strategy minimizing total losses with no additional computational load, simply by orienting one of the vector-control coordinates to the associated quasi-optimal trajectory. The coordinate orientation rule, which is analytically derived, is surprisingly simple. Consequently, the trajectory-oriented vector control method can be applied to a number of conventional vector control systems using one of the model-matching phase-estimation methods.

  14. Wind Velocity and Position Sensor-less Operation for PMSG Wind Generator

    NASA Astrophysics Data System (ADS)

    Senjyu, Tomonobu; Tamaki, Satoshi; Urasaki, Naomitsu; Uezato, Katsumi; Funabashi, Toshihisa; Fujita, Hideki

    Electric power generation using non-conventional sources is receiving considerable attention throughout the world. Wind energy is one of the available non-conventional energy sources. Electrical power generation using wind energy is possible in two ways, viz. constant speed operation and variable speed operation using power electronic converters. Variable speed power generation is attractive because maximum electric power can be generated at all wind velocities. However, this system requires a rotor speed sensor for vector control purposes, which increases the cost of the system. To alleviate the need for a rotor speed sensor in vector control, we propose a new sensorless control of a PMSG (Permanent Magnet Synchronous Generator) based on the flux linkage. The rotor position is estimated from the flux linkage, which is obtained using a first-order lag compensator. Furthermore, the wind velocity and rotation speed are estimated using an observer. The effectiveness of the proposed method is demonstrated through simulation results.
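
    The flux-linkage estimator amounts to a "leaky" integration of the back EMF followed by an arc tangent. The sketch below shows that step for a nonsalient machine; the resistance, inductance, and lag cutoff are illustrative assumptions.

      import numpy as np

      R_S, L_S = 0.8, 0.004    # stator resistance [ohm], inductance [H] (assumptions)
      W_C, DT = 5.0, 1e-4      # first-order-lag cutoff [rad/s], sample period [s]

      def flux_observer_step(psi, v, i):
          """psi, v, i are 2-vectors in the stationary (alpha, beta) frame."""
          # First-order lag instead of a pure integrator, which would drift
          # on measurement offsets
          psi = psi + ((v - R_S * i) - W_C * psi) * DT
          psi_pm = psi - L_S * i                    # magnet flux (nonsalient machine)
          theta_hat = np.arctan2(psi_pm[1], psi_pm[0])
          return psi, theta_hat

      psi, theta = flux_observer_step(np.zeros(2), np.array([100.0, 0.0]), np.zeros(2))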

  15. Model-based wavefront sensorless adaptive optics system for large aberrations and extended objects.

    PubMed

    Yang, Huizhen; Soloviev, Oleg; Verhaegen, Michel

    2015-09-21

    A model-based wavefront sensorless (WFSless) adaptive optics (AO) system with a 61-element deformable mirror is simulated to correct the imaging of a turbulence-degraded extended object. A fast closed-loop control algorithm, based on the linear relation between the mean square of the aberration gradients and the second moment of the image intensity distribution, is used to generate the control signals for the actuators of the deformable mirror (DM). The restoration capability and the convergence rate of the AO system are investigated for wavefront aberrations of different turbulence strengths. Simulation results show that the model-based WFSless AO system can successfully restore images degraded by different turbulence strengths and obtain a correction very close to the achievable capability of the given DM. Compared with the ideal correction of the 61-element DM, the averaged relative error of the RMS value is 6%. The convergence rate of the AO system is independent of the turbulence strength and depends only on the number of actuators of the DM.

  16. Enhanced visualization of peripheral retinal vasculature with wavefront sensorless adaptive optics OCT angiography in diabetic patients

    PubMed Central

    Polans, James; Cunefare, David; Cole, Eli; Keller, Brenton; Mettu, Priyatham S.; Cousins, Scott W.; Allingham, Michael J.; Izatt, Joseph A.; Farsiu, Sina

    2017-01-01

    Optical coherence tomography angiography (OCTA) is a promising technique for non-invasive visualization of vessel networks in the human eye. We debut a system capable of acquiring wide field-of-view (>70°) OCT angiograms without mosaicking. Additionally, we report on enhancing the visualization of peripheral microvasculature using wavefront sensorless adaptive optics (WSAO). We employed a fast WSAO algorithm that enabled wavefront correction in <2 seconds by iterating the mirror shape at the speed of OCT B-scans rather than volumes. Also, we contrasted ~7° field-of-view OCTA angiograms acquired in the periphery with and without WSAO correction. On average, WSAO improved the sharpness of microvasculature by 65% in healthy and 38% in diseased eyes. Preliminary observations demonstrated that the location of 7° images could be identified directly from the wide field-of-view angiogram. A pilot study on a normal subject and patients with diabetic retinopathy showed the impact of utilizing WSAO for OCTA when visualizing peripheral vasculature pathologies. PMID:28059209

  17. A sensor-less LED dimming system based on daylight harvesting with BIPV systems.

    PubMed

    Yoo, Seunghwan; Kim, Jonghun; Jang, Cheol-Yong; Jeong, Hakgeun

    2014-01-13

    Artificial lighting in office buildings typically accounts for 30% of the total energy consumption of the building, providing a substantial opportunity for energy savings. To reduce the energy consumed by indoor lighting, we propose a sensor-less light-emitting diode (LED) dimming system using daylight harvesting. In this study, we used light simulation software to quantify and visualize daylight, and analyzed the correlation between photovoltaic (PV) power generation and indoor illumination in an office with an integrated PV system. In addition, we calculated the distribution of daylight illumination in the office and the dimming ratios for the individual control of LED lights. We were also able to use the electric power generated by the PV system directly. As a result, power consumption for electric lighting was reduced by 40-70%, depending on the season and the weather conditions. Thus, the dimming system proposed in this study can be used to control electric lighting to reduce energy use cost-effectively and simply.

  18. Fuzzy crane control with sensorless payload deflection feedback for vibration reduction

    NASA Astrophysics Data System (ADS)

    Smoczek, Jaroslaw

    2014-05-01

    Different types of cranes are widely used for shifting cargoes in building sites, shipping yards, container terminals, and many manufacturing segments, where the problem of fast and precise transfer of a payload suspended on ropes with reduced oscillation is frequently important for enhancing productivity, efficiency, and safety. The paper presents a fuzzy-logic-based robust feedback anti-sway control system that is applicable either with or without a sensor for the sway angle of the payload. The discrete-time control approach is based on fuzzy interpolation of the controllers and of the crane dynamic model's parameters with respect to the varying rope length and payload mass. An iterative procedure combining a pole placement method and interval analysis of the closed-loop characteristic polynomial coefficients is proposed to design the robust control scheme. The sensorless anti-sway control application, developed using a PAC system with an RX3i controller, was verified on a laboratory-scale overhead crane.

  19. Full-order Luenberger observer based on fuzzy-logic control for sensorless field-oriented control of a single-sided linear induction motor.

    PubMed

    Holakooie, Mohammad Hosein; Ojaghi, Mansour; Taheri, Asghar

    2016-01-01

    This paper investigates sensorless indirect field-oriented control (IFOC) of a single-sided linear induction motor (SLIM) with a full-order Luenberger observer. The dynamic equations of the SLIM are first elaborated to derive the full-order Luenberger observer under some simplifying assumptions. The observer gain matrix is derived by a conventional procedure so that the observer poles are proportional to the SLIM poles, ensuring the stability of the system over a wide range of linear speeds. The operation of the observer depends significantly on the adaptive scheme. A fuzzy logic controller (FLC) is proposed as the adaptive scheme to estimate the linear speed from a speed tuning signal. The parameters of the FLC are tuned offline by a chaotic optimization algorithm (COA). The performance of the proposed observer is verified by both numerical simulation and real-time hardware-in-the-loop (HIL) implementation. Moreover, a detailed comparative study between the proposed observer and other speed observers is presented under different operating conditions. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  20. Analysis of sensorless control of brushless DC motor using unknown input observer with different gains

    NASA Astrophysics Data System (ADS)

    Astik, Mitesh B.; Bhatt, Praghnesh; Bhalja, Bhavesh R.

    2017-03-01

    A sensorless control scheme based on an unknown input observer is presented in this paper, in which the back EMF of the brushless DC motor (BLDC) is continuously estimated from available line voltages and currents. During negative rotation of the motor, the actual and estimated speeds fail to track the reference speed, and if corrective action is not taken by the observer, the motor goes into saturation. To overcome this problem, a speed estimation algorithm has been implemented in this paper to control the dynamic behavior of the motor during negative rotation. Ackermann's method was used to calculate the gains of the unknown input observer, based on an appropriate choice of the eigenvalues in advance. The criterion for choosing the eigenvalues is to obtain a balance between a faster convergence rate and the lowest noise level. Simulations have been carried out for different disturbances, such as step changes in motor reference speed and load torque. The comparative simulation results clearly show that the disturbance effects in the actual and estimated responses diminish as the observer gain setting increases.
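
    Ackermann's formula makes the gain computation explicit for a single-output observer: choose the eigenvalues of A - L*C, then solve for L. The 2x2 system below is illustrative, not the BLDC model from the paper.

      import numpy as np

      def ackermann_observer_gain(A, C, poles):
          """Return L such that eig(A - L @ C) = poles (single-output case)."""
          n = A.shape[0]
          O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])
          coeffs = np.poly(poles)            # desired characteristic polynomial
          phiA = sum(c * np.linalg.matrix_power(A, n - k)
                     for k, c in enumerate(coeffs))
          e_n = np.zeros((n, 1)); e_n[-1, 0] = 1.0
          return phiA @ np.linalg.solve(O, e_n)

      A = np.array([[0.0, 1.0], [-2.0, -3.0]])
      C = np.array([[1.0, 0.0]])
      L = ackermann_observer_gain(A, C, poles=[-8.0, -9.0])
      print(np.linalg.eigvals(A - L @ C))   # -> approximately [-8, -9]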

  1. Fuel sensor-less control of a liquid feed fuel cell under dynamic loading conditions for portable power sources (II)

    NASA Astrophysics Data System (ADS)

    Chang, C. L.; Chen, C. Y.; Sung, C. C.; Liou, D. H.; Chang, C. Y.; Cha, H. C.

    This work presents a new fuel sensor-less control scheme for liquid feed fuel cells that is able to control the fuel supply to a fuel cell system operating under dynamic loading conditions. The control scheme uses cell-operating characteristics, such as potential, current, and power, to regulate the fuel concentration of a liquid feed fuel cell without the need for a fuel concentration sensor. A current integral technique has been developed to calculate the quantity of fuel required at each monitoring cycle, which can be combined with the concentration regulating process to control the fuel supply for stable operation. As verified by systematic experiments, this scheme can effectively control the fuel supply of a liquid feed fuel cell with reduced response time, even under conditions where the membrane electrode assembly (MEA) deteriorates gradually. This advance will aid the commercialization of liquid feed fuel cells and make them more adaptable for use in portable and automotive power units such as laptops, e-bikes, and vehicles for the handicapped.

  2. New sensorless, efficient optimized and stabilized v/f control for pmsm machines

    NASA Astrophysics Data System (ADS)

    Jafari, Seyed Hesam

    With the rapid advances in power electronics and motor drive technologies in recent decades, permanent magnet synchronous machines (PMSM) have found extensive applications in a variety of industrial systems due to many desirable features such as high power density, high efficiency, high torque-to-current ratio, low noise, and robustness. In low-dynamic applications like pumps, fans, and compressors, where the motor speed is nearly constant, a simple control algorithm that can be implemented with a minimum of costly external hardware is highly desirable for industry. In recent published works, a new sensorless volts-per-hertz (V/f) control method has been proposed for low-power PMSMs in drive applications where the motor speed is constant. Moreover, to minimize the cost of motor implementation, the expensive rotor damper winding was eliminated. By removing the damper winding, however, instability problems normally occur inside the motor, which in some cases can be harmful to a PMSM drive. As a result, to address the instability issue, a stabilizing loop was developed and added to the conventional V/f control. Further study of the proposed sensorless stabilized V/f control, including power loss calculations, showed that overall motor efficiency still needed to be improved and optimized. This thesis suggests a new V/f control method for PMSMs in which both the efficiency and stability problems are addressed. Although in nearly all recent related research the methods have been applied to low-power PMSMs, in this thesis the suggested method is, for the first time, implemented on a medium-power 15 kW PMSM. A C2000 F2833x Digital Signal Processor (DSP) is used as the controller for the student-built PMSM drive, but instead of programming the DSP in Assembly or C, the main control algorithm was developed in a rapid prototyping software environment, here the MATLAB Simulink embedded code library.

  3. Model for Sucker-Rod Pumping Unit Operating Modes Analysis Based on SimMechanics Library

    NASA Astrophysics Data System (ADS)

    Zyuzev, A. M.; Bubnov, M. V.

    2018-01-01

    The article provides basic information about developing a sucker-rod pumping unit (SRPU) model by means of the SimMechanics library in the MATLAB Simulink environment. The model is designed for developing optimal pump-productivity management algorithms; sensorless diagnostics of the plunger pump and pumpjack; acquisition of the dynamometer card and determination of the dynamic fluid level in the well; normalization of faulty unit operation before troubleshooting is performed by staff; and determination of the equilibrium ratio from energy indicators, with output of manual balancing recommendations to achieve optimal power-consumption efficiency. Particular attention is given to the application of various blocks from the SimMechanics library to account for the principal characteristics of the pumpjack construction and to obtain an adequate model. The article explains in depth the features of the developed tools for collecting and analyzing simulated mechanism data. Conclusions are drawn about the practical applicability of the SRPU modelling results and areas for further investigation.

  4. Adjustable Speed Drive Project for Teaching a Servo Systems Course Laboratory

    ERIC Educational Resources Information Center

    Rodriguez-Resendiz, J.; Herrera-Ruiz, G.; Rivas-Araiza, E. A.

    2011-01-01

    This paper describes an adjustable speed drive for a three-phase motor, which has been implemented as a design for a servo system laboratory course in an engineering curriculum. The platform is controlled and analyzed in a LabVIEW environment and run on a PC. Theory is introduced in order to show the sensorless algorithms. These are computed by…

  5. Sensorless battery temperature measurements based on electrochemical impedance spectroscopy

    NASA Astrophysics Data System (ADS)

    Raijmakers, L. H. J.; Danilov, D. L.; van Lammeren, J. P. M.; Lammers, M. J. G.; Notten, P. H. L.

    2014-02-01

    A new method is proposed to measure the internal temperature of (Li-ion) batteries. Based on electrochemical impedance spectroscopy measurements, an intercept frequency (f0) can be determined which is exclusively related to the internal battery temperature. The intercept frequency is defined as the frequency at which the imaginary part of the impedance is zero (Zim = 0), i.e. where the phase shift between the battery current and voltage is absent. The advantage of the proposed method is twofold: (i) no hardware temperature sensors are required anymore to monitor the battery temperature and (ii) the method does not suffer from heat transfer delays. Mathematical analysis of the equivalent electrical-circuit, representing the battery performance, confirms that the intercept frequency decreases with rising temperatures. Impedance measurements on rechargeable Li-ion cells of various chemistries were conducted to verify the proposed method. These experiments reveal that the intercept frequency is clearly dependent on the temperature and does not depend on State-of-Charge (SoC) and aging. These impedance-based sensorless temperature measurements are therefore simple and convenient for application in a wide range of stationary, mobile and high-power devices, such as hybrid- and full electric vehicles.
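
    Extracting f0 from a measured spectrum is a one-line zero-crossing interpolation on Im(Z). The synthetic series R-L plus parallel R-C spectrum below is an illustrative assumption, not measured cell data; in practice f0 would then be mapped to temperature through a calibration curve.

      import numpy as np

      def intercept_frequency(freqs, z):
          """Linearly interpolate the frequency where Im(Z) crosses zero."""
          im = z.imag
          k = np.flatnonzero(np.signbit(im[:-1]) != np.signbit(im[1:]))[0]
          f1, f2, y1, y2 = freqs[k], freqs[k + 1], im[k], im[k + 1]
          return f1 - y1 * (f2 - f1) / (y2 - y1)

      # Synthetic cell: series resistance and inductance plus one parallel
      # R-C branch (all values illustrative)
      f = np.logspace(0, 4, 400)
      w = 2 * np.pi * f
      z = 0.05 + 1j * w * 4e-7 + 0.03 / (1 + 1j * w * 0.03 * 0.2)
      print(intercept_frequency(f, z))   # f0 where the imaginary part vanishes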

  6. Lens-based wavefront sensorless adaptive optics swept source OCT

    NASA Astrophysics Data System (ADS)

    Jian, Yifan; Lee, Sujin; Ju, Myeong Jin; Heisler, Morgan; Ding, Weiguang; Zawadzki, Robert J.; Bonora, Stefano; Sarunic, Marinko V.

    2016-06-01

    Optical coherence tomography (OCT) has revolutionized modern ophthalmology, providing depth-resolved images of the retinal layers in a system that is suited to a clinical environment. Although the axial resolution of an OCT system, which is a function of the light source bandwidth, is sufficient to resolve retinal features at a micrometer scale, the lateral resolution depends on the delivery optics and is limited by ocular aberrations. Through the combination of wavefront sensorless adaptive optics and the use of dual deformable transmissive optical elements, we present a compact lens-based OCT system at an imaging wavelength of 1060 nm for high resolution retinal imaging. We utilized a commercially available variable focal length lens to correct for the wide range of defocus commonly found in patients' eyes, and a novel multi-actuator adaptive lens for aberration correction to achieve near diffraction limited imaging performance at the retina. With a parallel processing computational platform, high resolution cross-sectional and en face retinal image acquisition and display were performed in real time. In order to demonstrate the system functionality and clinical utility, we present images of the photoreceptor cone mosaic and other retinal layers acquired in vivo from research subjects.

  7. Control algorithms and applications of the wavefront sensorless adaptive optics

    NASA Astrophysics Data System (ADS)

    Ma, Liang; Wang, Bin; Zhou, Yuanshen; Yang, Huizhen

    2017-10-01

    Compared with the conventional adaptive optics (AO) system, the wavefront sensorless (WFSless) AO system need not measure and reconstruct the wavefront. It is simpler than conventional AO in system architecture and can be applied under complex conditions. Based on an analysis of the principle and system model of the WFSless AO system, wavefront correction methods for WFSless AO systems are divided into two categories: model-free and model-based control algorithms. WFSless AO systems based on model-free control algorithms commonly treat the performance metric as a function of the control parameters and then use a certain control algorithm to improve the performance metric. The model-based control algorithms include modal control algorithms, nonlinear control algorithms, and control algorithms based on geometrical optics. After brief descriptions of the above typical control algorithms, hybrid methods combining a model-free control algorithm with a model-based control algorithm are summarized. Additionally, the characteristics of the various control algorithms are compared and analyzed. We also discuss the extensive applications of WFSless AO systems in free space optical communication (FSO), retinal imaging in the human eye, confocal microscopy, coherent beam combination (CBC) techniques, and extended objects.

  8. Fault tolerant operation of switched reluctance machine

    NASA Astrophysics Data System (ADS)

    Wang, Wei

    The energy crisis and environmental challenges have driven industry towards more energy-efficient solutions. With nearly 60% of electricity consumed by various electric machines in the industry sector, advancement in the efficiency of the electric drive system is of vital importance. Adjustable speed drive systems (ASDS) provide excellent speed regulation and dynamic performance as well as dramatically improved system efficiency compared with conventional motors without electronic drives. Industry has witnessed tremendous growth in ASDS applications, not only as a driving force but also as an electric auxiliary system replacing bulky and low-efficiency auxiliary hydraulic and mechanical systems. With the vast penetration of ASDS, fault tolerant operation capability is more widely recognized as an important feature of drive performance, especially for aerospace and automotive applications and other industrial drive applications demanding high reliability. The Switched Reluctance Machine (SRM), a low-cost, highly reliable electric machine with fault tolerant operation capability, has drawn substantial attention in the past three decades. Nevertheless, SRM is not free of faults. Certain faults such as converter faults, winding shorts, eccentricity, and position sensor faults are commonly shared among all ASDS. In this dissertation, a thorough understanding of various faults and their influence on the transient and steady-state performance of SRM is developed via simulation and experimental study, providing the necessary knowledge for fault detection and post-fault management. Lumped parameter models are established for fast real-time simulation and drive control. Based on the behavior of the faults, a fault detection scheme is developed for fast and reliable fault diagnosis. In order to improve the SRM power and torque capacity under faults, maximum-torque-per-ampere excitations are conceptualized and validated through theoretical analysis and experiments. With the proposed optimal waveform, torque production is greatly improved under the same root mean square (RMS) current constraint. Additionally, position sensorless operation methods under phase faults are investigated to account for combined physical position sensor and phase winding faults. A comprehensive solution for position sensorless operation under single and multiple phase faults is proposed and validated through experiments. Continuous position sensorless operation with seamless transition between various numbers of faulted phases is achieved.

  9. A self-sensing active magnetic bearing based on a direct current measurement approach.

    PubMed

    Niemann, Andries C; van Schoor, George; du Rand, Carel P

    2013-09-11

    Active magnetic bearings (AMBs) have become a key technology in various industrial applications. Self-sensing AMBs provide an integrated sensorless solution for position estimation, consolidating the sensing and actuating functions into a single electromagnetic transducer. The approach aims to reduce possible hardware failure points, production costs, and system complexity. Despite these advantages, self-sensing methods must address various technical challenges to maximize the performance thereof. This paper presents the direct current measurement (DCM) approach for self-sensing AMBs, denoting the direct measurement of the current ripple component. In AMB systems, switching power amplifiers (PAs) modulate the rotor position information onto the current waveform. Demodulation self-sensing techniques then use bandpass and lowpass filters to estimate the rotor position from the voltage and current signals. However, the additional phase-shift introduced by these filters results in lower stability margins. The DCM approach utilizes a novel PA switching method that directly measures the current ripple to obtain duty-cycle invariant position estimates. Demodulation filters are largely excluded to minimize additional phase-shift in the position estimates. Basic functionality and performance of the proposed self-sensing approach are demonstrated via a transient simulation model as well as a high current (10 A) experimental system. A digital implementation of amplitude modulation self-sensing serves as a comparative estimator.

  10. Optical coherence tomography with a 2.8-mm beam diameter and sensorless defocus and astigmatism correction

    NASA Astrophysics Data System (ADS)

    Reddikumar, Maddipatla; Tanabe, Ayano; Hashimoto, Nobuyuki; Cense, Barry

    2017-02-01

    An optical coherence tomography (OCT) system with a 2.8-mm beam diameter is presented. Sensorless defocus correction can be performed with a Badal optometer and astigmatism correction with a liquid crystal device. OCT B-scans were used in an image-based optimization algorithm for aberration correction. Defocus can be corrected from -4.3 D to +4.3 D and vertical and oblique astigmatism from -2.5 D to +2.5 D. A contrast gain of 6.9 times was measured after aberration correction. In comparison with a 1.3-mm beam diameter OCT system, this concept achieved a 3.7-dB gain in dynamic range on a model retina. Both systems were used to image the retina of a human subject. As the correction of the liquid crystal device can take more than 60 s, the subject's spectacle prescription was adopted instead. This resulted in a 2.5 times smaller speckle size compared with the standard OCT system. The liquid crystal device for astigmatism correction does not need a high-voltage amplifier and can be operated at 5 V. The correction device is small (9 mm × 30 mm × 38 mm) and can easily be implemented in existing designs for OCT.
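
    The abstract does not give the optimization routine itself; a minimal sketch of image-based defocus correction, assuming hypothetical acquire_bscan() and set_defocus() callables and a simple intensity-squared sharpness metric, might look like this:

```python
import numpy as np

def correct_defocus(acquire_bscan, set_defocus, d_range=(-4.3, 4.3), steps=25):
    """Grid-search the defocus setting that maximizes B-scan sharpness.

    acquire_bscan() returns a 2-D intensity array; set_defocus(d) drives the
    Badal optometer to d dioptres. Both callables, the step count, and the
    intensity-squared sharpness metric are assumptions for illustration.
    """
    def sharpness(img):
        return float(np.sum(np.asarray(img, dtype=float) ** 2))
    candidates = np.linspace(d_range[0], d_range[1], steps)
    scores = []
    for d in candidates:
        set_defocus(d)
        scores.append(sharpness(acquire_bscan()))
    best = candidates[int(np.argmax(scores))]
    set_defocus(best)                        # leave the system at the optimum
    return best
```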

  11. Multiscale sensorless adaptive optics OCT angiography system for in vivo human retinal imaging.

    PubMed

    Ju, Myeong Jin; Heisler, Morgan; Wahl, Daniel; Jian, Yifan; Sarunic, Marinko V

    2017-11-01

    We present a multiscale sensorless adaptive optics (SAO) OCT system capable of imaging retinal structure and vasculature with various fields-of-view (FOV) and resolutions. Using a single deformable mirror and exploiting the polarization properties of light, the SAO-OCT-A was implemented in a compact and easy-to-operate system. With the ability to adjust the beam diameter at the pupil, retinal imaging was demonstrated at two different numerical apertures with the same system. The general morphological structure and retinal vasculature could be observed at a lateral resolution of a few tens of micrometers with conventional OCT and OCT-A scanning protocols, using a 1.7-mm-diameter beam incident at the pupil and a large FOV (15 deg × 15 deg). Changing the system to a higher numerical aperture, with a 5.0-mm-diameter beam incident at the pupil and SAO aberration correction, the FOV was reduced to 3 deg × 3 deg for fine detailed imaging of morphological structure and microvasculature such as the photoreceptor mosaic and capillaries. Multiscale functional SAO-OCT imaging was performed on four healthy subjects, demonstrating its functionality and potential for clinical utility. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  12. Alert management for home healthcare based on home automation analysis.

    PubMed

    Truong, T T; de Lamotte, F; Diguet, J-Ph; Said-Hocine, F

    2010-01-01

    Rising healthcare costs for elderly and disabled people can be contained by offering people autonomy at home by means of information technology. In this paper, we present an original and sensorless alert management solution which performs multimedia and home automation service discrimination and extracts highly regular home activities for use as virtual sensors in alert management. Results on simulation data, based on a real context, allow us to evaluate our approach before application to real data.

  13. Automatically Detecting Failures in Natural Language Processing Tools for Online Community Text.

    PubMed

    Park, Albert; Hartzler, Andrea L; Huh, Jina; McDonald, David W; Pratt, Wanda

    2015-08-31

    The prevalence and value of patient-generated health text are increasing, but processing such text remains problematic. Although existing biomedical natural language processing (NLP) tools are appealing, most were developed to process clinician- or researcher-generated text, such as clinical notes or journal articles. In addition to being constructed for different types of text, other challenges of using existing NLP tools include constantly changing technologies, source vocabularies, and characteristics of text. These continuously evolving challenges warrant the need for low-cost systematic assessment. However, the primarily accepted evaluation method in NLP, manual annotation, requires tremendous effort and time. The primary objective of this study is to explore an alternative approach: using low-cost, automated methods to detect failures (eg, incorrect boundaries, missed terms, mismapped concepts) when processing patient-generated text with existing biomedical NLP tools. We first characterize common failures that NLP tools can make in processing online community text. We then demonstrate the feasibility of our automated approach in detecting these common failures using one of the most popular biomedical NLP tools, MetaMap. Using 9657 posts from an online cancer community, we explored our automated failure detection approach in two steps: (1) to characterize the failure types, we first manually reviewed MetaMap's commonly occurring failures, grouped the inaccurate mappings into failure types, and then identified causes of the failures through iterative rounds of manual review using open coding, and (2) to automatically detect these failure types, we then explored combinations of existing NLP techniques and dictionary-based matching for each failure cause. Finally, we manually evaluated the automatically detected failures. From our manual review, we characterized three types of failure: (1) boundary failures, (2) missed term failures, and (3) word ambiguity failures. Within these three failure types, we discovered 12 causes of inaccurate concept mappings. Our automated methods flagged almost half of MetaMap's 383,572 mappings as problematic. Word sense ambiguity failure was the most widely occurring, comprising 82.22% of failures. Boundary failure was the second most frequent, amounting to 15.90% of failures, while missed term failures were the least common, making up 1.88% of failures. The automated failure detection achieved precision, recall, accuracy, and F1 score of 83.00%, 92.57%, 88.17%, and 87.52%, respectively. We illustrate the challenges of processing patient-generated online health community text and characterize failures of NLP tools on this patient-generated health text, demonstrating the feasibility of our low-cost approach to automatically detect those failures. Our approach shows the potential for scalable and effective solutions to automatically assess the constantly evolving NLP tools and source vocabularies used to process patient-generated text.
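
    As a toy illustration of the dictionary-based matching step (not the paper's actual implementation), the following sketch flags candidate boundary failures: mapped phrases that sit inside a longer term known to a domain lexicon. The data structures are hypothetical simplifications.

```python
def detect_boundary_failures(post_text, mappings, lexicon):
    """Toy detector for one failure type from the paper: boundary failures.

    If a phrase the NLP tool mapped sits inside a longer phrase that a
    domain lexicon recognizes, the tool likely truncated a multi-word term.
    mappings: list of (phrase, concept) pairs returned by the NLP tool.
    lexicon: set of known lowercase terms. Both structures are hypothetical.
    """
    failures = []
    words = post_text.lower().split()
    for phrase, concept in mappings:
        toks = phrase.lower().split()
        for i in range(len(words) - len(toks) + 1):
            if words[i:i + len(toks)] != toks:
                continue
            # widen the match by one word on each side, then check lexicon
            wider = " ".join(words[max(0, i - 1):i + len(toks) + 1])
            if wider != phrase.lower() and wider in lexicon:
                failures.append((phrase, wider, concept))
    return failures
```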

  14. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics.

    PubMed

    Dong, Bing; Li, Yan; Han, Xin-Li; Hu, Bin

    2016-09-02

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberration varying with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz modes, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by a few low-order Lukosz modes. To optimize the dynamic correction, only the dominant Lukosz modes need be corrected, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of the conformal window at a scanning rate of 10 degrees per second. A 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10^-5 with optimized correction and 1.427 × 10^-5 with un-optimized correction. We also demonstrated that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method.
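
    The core of model-based WSLAO can be conveyed with a short sketch: for each mode, probe the metric at plus and minus a bias, fit a parabola through the three measurements, and jump to its vertex, so N modes cost about 2N+1 metric evaluations. The Python below abstracts the paper's spectral-density metric and DM interface into hypothetical metric() and apply_modes() callables.

```python
import numpy as np

def model_based_correction(apply_modes, metric, n_modes, bias=0.5):
    """Model-based wavefront-sensorless correction, one pass over N modes.

    For each mode the metric is probed at -bias, 0, +bias, a parabola is
    fitted, and the coefficient jumps to the vertex. apply_modes(c) applies
    coefficient vector c to the corrector; metric() returns the scalar
    image-quality value. Both callables and the bias amplitude are
    assumptions.
    """
    c = np.zeros(n_modes)
    apply_modes(c)
    m0 = metric()
    for k in range(n_modes):
        probe = np.zeros(n_modes)
        probe[k] = bias
        apply_modes(c + probe)
        m_plus = metric()
        apply_modes(c - probe)
        m_minus = metric()
        denom = 2.0 * m0 - m_plus - m_minus
        if abs(denom) > 1e-12:               # parabola vertex update
            c[k] += bias * (m_plus - m_minus) / (2.0 * denom)
        apply_modes(c)
        m0 = metric()                        # re-measure at new operating point
    return c
```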

  15. Sensorless adaptive optics for isoSTED nanoscopy

    NASA Astrophysics Data System (ADS)

    Antonello, Jacopo; Hao, Xiang; Allgeyer, Edward S.; Bewersdorf, Joerg; Rittscher, Jens; Booth, Martin J.

    2018-02-01

    The presence of aberrations is a major concern when using fluorescence microscopy to image deep inside tissue. Aberrations due to refractive index mismatch and heterogeneity of the specimen under investigation cause severe reduction in the amount of fluorescence emission that is collected by the microscope. Furthermore, aberrations adversely affect the resolution, leading to loss of fine detail in the acquired images. These phenomena are particularly troublesome for super-resolution microscopy techniques such as isotropic stimulated-emission-depletion microscopy (isoSTED), which relies on accurate control of the shape and co-alignment of multiple excitation and depletion foci to operate as expected and to achieve the super-resolution effect. Aberrations can be suppressed by implementing sensorless adaptive optics techniques, whereby aberration correction is achieved by maximising a certain image quality metric. In confocal microscopy for example, one can employ the total image brightness as an image quality metric. Aberration correction is subsequently achieved by iteratively changing the settings of a wavefront corrector device until the metric is maximised. This simplistic approach has limited applicability to isoSTED microscopy where, due to the complex interplay between the excitation and depletion foci, maximising the total image brightness can lead to introducing aberrations in the depletion foci. In this work we first consider the effects that different aberration modes have on isoSTED microscopes. We then propose an iterative, wavelet-based aberration correction algorithm and evaluate its benefits.

  16. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, and the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies of sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  17. An FMS Dynamic Production Scheduling Algorithm Considering Cutting Tool Failure and Cutting Tool Life

    NASA Astrophysics Data System (ADS)

    Setiawan, A.; Wangsaputra, R.; Martawirya, Y. Y.; Halim, A. H.

    2016-02-01

    This paper deals with Flexible Manufacturing System (FMS) production rescheduling due to the unavailability of cutting tools, caused either by cutting tool failure or by reaching the cutting tool life limit. The FMS consists of parallel identical machines integrated with an automatic material handling system, and it runs fully automatically. Each machine has the same cutting tool configuration, consisting of different geometrical cutting tool types in each tool magazine. Each job usually takes two stages. Each stage has sequential operations allocated to machines considering the cutting tool life. In a real situation, a cutting tool can fail before its life limit is reached. The objective of this paper is to develop a dynamic scheduling algorithm for when a cutting tool breaks during unmanned operation and rescheduling is needed. The algorithm consists of four steps. The first step is generating the initial schedule, the second step is determining the cutting tool failure time, the third step is determining the system status at the cutting tool failure time, and the fourth step is rescheduling the unfinished jobs. The approaches used to solve the problem are complete-reactive scheduling and robust-proactive scheduling. The new schedules differ from the initial schedule in the starting and completion times of the operations.

  18. Bearingless Flywheel Systems, Winding and Control Schemes, and Sensorless Control

    NASA Technical Reports Server (NTRS)

    Kascak, Peter E (Inventor); Jansen, Ralph H (Inventor); Trase, Larry M (Inventor); Dever, Timothy P (Inventor); Kraft, Thomas G (Inventor)

    2016-01-01

    Flywheel systems are disclosed that provide increased energy density and operational effectiveness. A first bearingless motor and a second bearingless motor may be configured to simultaneously suspend the central rotor in a radial direction and to rotate the central rotor. However, certain implementations may have one motor or more than two motors, depending on the design. A plurality of the flywheel systems may be collectively controlled to perform community energy storage with higher storage capacities than individual flywheel systems.

  19. A Flywheel Energy Storage System Demonstration for Space Applications

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.; Kascak, Peter E.; Jansen, Ralph; Dever, Timothy

    2003-01-01

    A novel control algorithm for the charge and discharge modes of operation of a flywheel energy storage system for space applications is presented. The motor control portion of the algorithm uses sensorless field oriented control with position and speed estimates determined from a signal injection technique at low speeds and a back EMF technique at higher speeds. The charge and discharge portions of the algorithm use command feed-forward and disturbance decoupling, respectively, to achieve fast response with low gains. Simulation and experimental results are presented.

  20. U.S. Governmental Information Operations and Strategic Communications: A Discredited Tool or User Failure? Implications for Future Conflict

    DTIC Science & Technology

    2013-12-01

    U.S. Governmental Information Operations and Strategic Communications: A Discredited Tool or User Failure? Implications for Future Conflict. Author: Steve Tatham. Dates covered: 2013.

  1. Accurate Prediction of Motor Failures by Application of Multi CBM Tools: A Case Study

    NASA Astrophysics Data System (ADS)

    Dutta, Rana; Singh, Veerendra Pratap; Dwivedi, Jai Prakash

    2018-02-01

    Motor failures are very difficult to predict accurately with a single condition-monitoring tool, as the electrical and the mechanical systems are closely related. Electrical problems, such as phase unbalance and stator winding insulation failures, can at times lead to vibration problems, while mechanical failures, such as bearing failure, lead to rotor eccentricity. In this case study of a 550 kW blower motor, it is shown that a rotor bar crack was detected by current signature analysis, and vibration monitoring confirmed the same. In later months, in a similar motor, vibration monitoring predicted a bearing failure, and current signature analysis confirmed it. In both cases, after dismantling the motor, the predictions were found to be accurate. In this paper we discuss the accurate prediction of motor failures through the use of multiple condition monitoring tools, with two case studies.

  2. Feasibility Study of Jupiter Icy Moons Orbiter Permanent Magnet Alternator Start Sequence

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.; Tokars, Roger P.

    2006-01-01

    The Jupiter Icy Moons Orbiter (JIMO) mission was a proposed (since cancelled) long-duration science mission to study three moons of Jupiter: Callisto, Ganymede, and Europa. One design of the JIMO spacecraft used a nuclear heat source in conjunction with a Brayton rotating machine to generate electrical power for the electric thrusters and the spacecraft bus. The basic operation of the closed-cycle Brayton system was as follows. The working fluid, a helium-xenon gas mixture, first entered a compressor, then went through a recuperator and the hot-side heat exchanger, then expanded across a turbine that drove an alternator, then entered the cold side of the recuperator and heat exchanger, and finally returned to the compressor. The spacecraft was to be launched with the Brayton system off-line and the nuclear reactor shut down. Once the system was started, the helium-xenon gas would be circulated into the heat exchangers as the nuclear reactors were activated. Initially, the alternator unit would operate as a motor so as to drive the turbine and compressor to get the cycle started. This report investigated the feasibility of the start-up sequence of a permanent magnet (PM) machine, similar in operation to the alternator unit, without any position or speed feedback sensors ("sensorless") and with a variable load torque. It is found that the permanent magnet machine can start with sensorless control and a load torque of up to 30 percent of the rated value.

  3. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics

    PubMed Central

    Dong, Bing; Li, Yan; Han, Xin-li; Hu, Bin

    2016-01-01

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberration varying with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz modes, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by a few low-order Lukosz modes. To optimize the dynamic correction, only the dominant Lukosz modes need be corrected, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of the conformal window at a scanning rate of 10 degrees per second. A 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10^-5 with optimized correction and 1.427 × 10^-5 with un-optimized correction. We also demonstrated that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method. PMID:27598161

  4. A Sensorless Predictive Current Controlled Boost Converter by Using an EKF with Load Variation Effect Elimination Function

    PubMed Central

    Tong, Qiaoling; Chen, Chen; Zhang, Qiao; Zou, Xuecheng

    2015-01-01

    To realize accurate current control for a boost converter, a precise measurement of the inductor current is required to achieve high-resolution current regulation. Current sensors are widely used to measure the inductor current. However, the current sensors and their processing circuits add significant hardware cost, delay, and noise to the system. They can also harm the system reliability. Therefore, current sensorless control techniques can bring cost-effective and reliable solutions for various boost converter applications. According to the derived accurate model, which contains a number of parasitics, the boost converter is a nonlinear system. An Extended Kalman Filter (EKF) is proposed for inductor current estimation and output voltage filtering. With this approach, the system can have the same advantages as sensored current control mode. To implement the EKF, the load value is necessary. However, the load may vary from time to time, which can lead to errors in the estimated current and the filtered output voltage. To solve this issue, a load variation effect elimination (LVEE) module is added. In addition, a predictive average current controller is used to regulate the current. Compared with a conventional voltage controlled system, the transient response is greatly improved, since it takes only two switching cycles for the current to reach its reference. Finally, experimental results are presented to verify the stable operation and output tracking capability for large-signal transients of the proposed algorithm. PMID:25928061
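
    A minimal EKF step for an idealized boost converter (averaged model without the parasitics or the LVEE correction described in the paper) can be sketched as follows; the component values and noise covariances are placeholders.

```python
import numpy as np

def ekf_step(x, P, d, v_in, v_out_meas, dt,
             L=100e-6, C=470e-6, R=10.0, q=1e-4, r=1e-3):
    """One EKF step for an idealized averaged boost converter model.

    State x = [iL, vo]; only the output voltage vo is measured, so the
    inductor current iL is estimated. d is the duty cycle. Component values
    and the noise covariances q, r are placeholders; the paper's model also
    includes parasitics and an LVEE load correction omitted here.
    """
    iL, vo = x
    # forward-Euler discretization of the averaged model
    f = np.array([iL + dt * (v_in - (1 - d) * vo) / L,
                  vo + dt * ((1 - d) * iL - vo / R) / C])
    F = np.array([[1.0, -dt * (1 - d) / L],
                  [dt * (1 - d) / C, 1.0 - dt / (R * C)]])  # Jacobian of f
    H = np.array([[0.0, 1.0]])               # measurement picks out vo
    P = F @ P @ F.T + q * np.eye(2)          # covariance prediction
    S = float(H @ P @ H.T) + r               # innovation variance
    K = (P @ H.T) / S                        # Kalman gain, shape (2, 1)
    x_new = f + (K * (v_out_meas - f[1])).ravel()
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new
```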

  5. Progressive Damage and Failure Analysis of Composite Laminates

    NASA Astrophysics Data System (ADS)

    Joseph, Ashith P. K.

    Composite materials are widely used in various industries for making structural parts due to their higher strength-to-weight ratio, better fatigue life, corrosion resistance, and material property tailorability. To fully exploit the capability of composites, it is necessary to know the load carrying capacity of the parts made from them. Unlike metals, composites are orthotropic in nature and fail in a complex manner under various loading conditions, which makes them hard to analyze. The lack of reliable and efficient failure analysis tools for composites has led industries to rely more on coupon and component level testing to estimate the design space. Due to the complex failure mechanisms, composite materials require a very large number of coupon level tests to fully characterize the behavior. This makes the entire testing process very time consuming and costly. The alternative is to use virtual testing tools which can predict the complex failure mechanisms accurately, reducing the cost to the associated computational expense and making significant savings. Some of the most desired features in a virtual testing tool are: (1) accurate representation of failure mechanisms: the failure progression predicted by the virtual tool must match that observed in experiments, and a tool has to be assessed based on the mechanisms it can capture; (2) computational efficiency: the greatest advantages of a virtual tool are the savings in time and money, so computational efficiency is one of the most needed features; (3) applicability to a wide range of problems: structural parts are subjected to a variety of loading conditions, including static, dynamic, and fatigue conditions, and a good virtual testing tool should make good predictions for all of them. The aim of this PhD thesis is to develop a computational tool which can model the progressive failure of composite laminates under different quasi-static loading conditions. The analysis tool is validated by comparing simulations against experiments for a selected number of quasi-static loading cases.

  6. Failure environment analysis tool applications

    NASA Astrophysics Data System (ADS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-02-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT is also useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  7. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT is also useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  8. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1994-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT is also useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  9. A Selection of Composites Simulation Practices at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Ratcliffe, James G.

    2007-01-01

    One of the major areas of study at NASA Langley Research Center is the development of technologies that support the use of advanced composite materials in aerospace applications. Amongst the supporting technologies are analysis tools used to simulate the behavior of these materials. This presentation will discuss a number of examples of analysis tools and simulation practices conducted at NASA Langley. The presentation will include examples of damage tolerance analyses for both interlaminar and intralaminar failure modes. Tools for modeling interlaminar failure modes include fracture mechanics and cohesive methods, whilst tools for modeling intralaminar failure involve the development of various progressive failure analyses. Other examples of analyses developed at NASA Langley include a thermo-mechanical model of an orthotropic material and the simulation of delamination growth in z-pin reinforced laminates.

  10. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Technical Reports Server (NTRS)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  11. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Astrophysics Data System (ADS)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  12. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    PubMed

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Position Estimation for Switched Reluctance Motor Based on the Single Threshold Angle

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Li, Pang; Yu, Yue

    2017-05-01

    This paper presents a position estimation model for a switched reluctance motor based on a single threshold angle. In view of the relationship between inductance and rotor position, the position is estimated by comparing the real-time dynamic flux linkage with the flux linkage at the threshold angle position (a 7.5° threshold angle for a 12/8 SRM). The sensorless model is built in Matlab/Simulink; simulations are implemented under both steady-state and transient conditions, verifying the validity and feasibility of the method.
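
    The comparison the paper describes reduces to integrating the phase flux linkage and watching for it to cross the stored threshold-angle value. A hypothetical sketch, with the magnetization lookup psi_threshold_of_i assumed rather than taken from the paper:

```python
def threshold_angle_crossing(v, i, dt, R, psi_threshold_of_i):
    """Detect the sample where the rotor passes the threshold angle.

    Integrates the phase flux linkage psi = integral of (v - R*i) dt over a
    conduction interval and returns the index where psi first exceeds the
    stored flux linkage at the threshold angle (7.5 degrees for a 12/8 SRM)
    for the present current. psi_threshold_of_i is a lookup built from
    magnetization data; its form here is an assumption.
    """
    psi = 0.0
    for k, (vk, ik) in enumerate(zip(v, i)):
        psi += (vk - R * ik) * dt            # volt-second integration
        if psi >= psi_threshold_of_i(ik):    # rotor reached threshold angle
            return k
    return None                              # no crossing in this interval
```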

  14. Automating the Transition Between Sensorless Motor Control Methods for the NASA Glenn Research Center Flywheel Energy Storage System

    NASA Technical Reports Server (NTRS)

    Fehrmann, Elizabeth A.; Kenny, Barbara H.

    2004-01-01

    The NASA Glenn Research Center (GRC) has been working to advance the technology necessary for a flywheel energy storage system for the past several years. Flywheels offer high efficiency, durability, and near-complete discharge capabilities not provided by typical chemical batteries. These characteristics make flywheels an attractive alternative to more typical energy storage solutions. Flywheels also offer the possibility of combining what are now two separate systems in space applications into one: energy storage, which is currently provided by batteries, and attitude control, which is currently provided by control moment gyroscopes (CMGs) or reaction wheels. To date, the NASA Glenn research effort has produced the control algorithms necessary to demonstrate flywheel operation up to a rated speed of 60,000 RPM and the combined operation of two flywheel machines to simultaneously provide energy storage and single-axis attitude control. Two position-sensorless algorithms are used to control the motor/generator, one for low speeds (0 to 1200 RPM) and one for high speeds. The algorithm allows the transition from the low-speed method to the high-speed method, but the transition from the high- to low-speed method was not originally included. This leads to a limitation in the existing motor/generator control code that does not allow the flywheels to be commanded to zero speed (and back in the negative speed direction) after the initial startup. In a multi-flywheel system providing both energy storage and attitude control to a spacecraft, speed reversal may be necessary.
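
    One common way to hand off between two sensorless estimates, shown here purely as an illustrative sketch rather than the GRC implementation, is a linear crossfade over a speed band:

```python
def blended_speed_estimate(speed_inj, speed_bemf, speed_prev,
                           lo=1000.0, hi=1400.0):
    """Crossfade between two sensorless speed estimates over a hand-off band.

    Below lo RPM the signal-injection estimate is used, above hi RPM the
    back-EMF estimate, with a linear blend in between based on the previous
    speed estimate. The band limits are hypothetical, not GRC's values.
    """
    if speed_prev <= lo:
        return speed_inj
    if speed_prev >= hi:
        return speed_bemf
    w = (speed_prev - lo) / (hi - lo)
    return (1.0 - w) * speed_inj + w * speed_bemf
```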

  15. Memory Circuit Fault Simulator

    NASA Technical Reports Server (NTRS)

    Sheldon, Douglas J.; McClure, Tucker

    2013-01-01

    Spacecraft are known to experience significant memory part-related failures and problems, both pre- and post-launch. These memory parts include both static and dynamic memories (SRAM and DRAM). These failures manifest themselves in a variety of ways, such as pattern-sensitive failures, timing-sensitive failures, etc. Because of the mission-critical role memory devices play in spacecraft architecture and operation, understanding their failure modes is vital to successful mission operation. To support this need, a generic simulation tool that can model different data patterns in conjunction with variable write and read conditions was developed. This tool is a mathematical and graphical way to embed pattern, electrical, and physical information to perform what-if analysis as part of a root cause failure analysis effort.
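
    Although the abstract does not describe the simulator's interface, pattern-based memory testing of the kind such a tool supports can be illustrated with a classical March C- sequence; the read/write callables below are a hypothetical stand-in for the tool's variable read and write conditions.

```python
def march_c_minus(read, write, n):
    """Classical March C- memory test over n one-bit cells.

    read(addr) -> bit and write(addr, bit) are hypothetical stand-ins for
    the simulator's variable read and write conditions. Returns the
    addresses whose readback disagreed with the expected pattern.
    """
    errors = []
    up = range(n)
    down = range(n - 1, -1, -1)
    for a in up:                             # M0: ascending write 0
        write(a, 0)
    elements = [(up, 0, 1), (up, 1, 0),      # M1, M2
                (down, 0, 1), (down, 1, 0)]  # M3, M4
    for order, expected, to_write in elements:
        for a in order:
            if read(a) != expected:
                errors.append(a)
            write(a, to_write)
    for a in down:                           # M5: descending read 0
        if read(a) != 0:
            errors.append(a)
    return errors
```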

  16. Introduction of the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Costing Tool: a user-friendly spreadsheet program to estimate costs of providing patient-centered interventions.

    PubMed

    Reed, Shelby D; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L; Bowers, Margaret T; Samsa, Gregory P; Paul, Sara; Schulman, Kevin A; Whellan, David J; Riegel, Barbara J

    2012-01-01

    Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers and health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions.

  17. Motor Control and Regulation for a Flywheel Energy Storage System

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara; Lyons, Valerie

    2003-01-01

    This talk will focus on the motor control algorithms used to regulate the flywheel system at the NASA Glenn Research Center. First a discussion of the inner loop torque control technique will be given. It is based on the principle of field orientation and is implemented without a position or speed sensor (sensorless control). Then the outer loop charge and discharge algorithm will be presented. This algorithm controls the acceleration of the flywheel during charging and the deceleration while discharging. The algorithm also allows the flywheel system to regulate the DC bus voltage during the discharge cycle.

  18. Design of BLDCM emulator for transmission control units

    NASA Astrophysics Data System (ADS)

    Liu, Chang; He, Yongyi; Zhang, Bodong

    2018-04-01

    To meet the testing requirements of a transmission control unit, a brushless DC motor emulation system is designed based on motor simulation and power hardware-in-the-loop. A discrete motor model is established, and a real-time numerical method is designed to solve the motor states. The motor emulator directly interacts with the power stage of the transmission control unit using a power-efficient circuit topology and is compatible with sensorless control. Experiments on a laboratory prototype verify that the system can emulate the real motor currents and voltages whether the motor is starting up or suddenly loaded.
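
    A discrete real-time motor model of the kind the emulator solves each control period can be sketched, under strong simplifications, as a forward-Euler step of a DC-equivalent brushless-motor model; all parameter values below are invented.

```python
def bldc_euler_step(i, omega, v, dt, R=0.5, L=1e-3, Ke=0.05, Kt=0.05,
                    J=1e-4, B=1e-5, T_load=0.0):
    """One forward-Euler step of a DC-equivalent brushless-motor model.

    This is the kind of discrete model a power hardware-in-the-loop
    emulator solves once per control period. All parameter values are
    invented; a faithful BLDC model would use per-phase equations with a
    trapezoidal back-EMF, omitted here for brevity.
    """
    di = (v - R * i - Ke * omega) / L             # electrical dynamics
    domega = (Kt * i - B * omega - T_load) / J    # mechanical dynamics
    return i + dt * di, omega + dt * domega
```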

  19. Construct validity of the Heart Failure Screening Tool (Heart-FaST) to identify heart failure patients at risk of poor self-care: Rasch analysis.

    PubMed

    Reynolds, Nicholas A; Ski, Chantal F; McEvedy, Samantha M; Thompson, David R; Cameron, Jan

    2018-02-14

    The aim of this study was to psychometrically evaluate the Heart Failure Screening Tool (Heart-FaST) via: (1) examination of internal construct validity; (2) testing of scale function in accordance with design; and (3) recommendation of changes, if items are not well adjusted, to improve its psychometric credentials. Self-care is vital to the management of heart failure. The Heart-FaST may provide a prospective assessment of risk, regarding the likelihood that patients with heart failure will engage in self-care. Psychometric validation of the Heart-FaST using Rasch analysis. The Heart-FaST was administered to 135 patients (median age = 68, IQR = 59-78 years; 105 males) enrolled in a multidisciplinary heart failure management program. The Heart-FaST is a nurse-administered tool for screening patients with heart failure at risk of poor self-care. A Rasch analysis of responses was conducted, testing the data against Rasch model expectations, including whether items serve as unbiased, non-redundant indicators of risk, whether they measure a single construct, and whether the rating scales operate as intended. The results showed that the data met Rasch model expectations after rescoring or deleting items due to poor discrimination, disordered thresholds, differential item functioning, or response dependence. There was no evidence of multidimensionality, which supports the use of total scores from the Heart-FaST as indicators of risk. Aggregate scores from this modified screening tool rank heart failure patients according to their "risk of poor self-care", demonstrating that the Heart-FaST items constitute a meaningful scale to identify heart failure patients at risk of poor engagement in heart failure self-care. © 2018 John Wiley & Sons Ltd.

  20. Comprehensive in-hospital monitoring in acute heart failure: applications for clinical practice and future directions for research. A statement from the Acute Heart Failure Committee of the Heart Failure Association (HFA) of the European Society of Cardiology (ESC).

    PubMed

    Harjola, Veli-Pekka; Parissis, John; Brunner-La Rocca, Hans-Peter; Čelutkienė, Jelena; Chioncel, Ovidiu; Collins, Sean P; De Backer, Daniel; Filippatos, Gerasimos S; Gayat, Etienne; Hill, Loreena; Lainscak, Mitja; Lassus, Johan; Masip, Josep; Mebazaa, Alexandre; Miró, Òscar; Mortara, Andrea; Mueller, Christian; Mullens, Wilfried; Nieminen, Markku S; Rudiger, Alain; Ruschitzka, Frank; Seferovic, Petar M; Sionis, Alessandro; Vieillard-Baron, Antoine; Weinstein, Jean Marc; de Boer, Rudolf A; Crespo Leiro, Maria G; Piepoli, Massimo; Riley, Jillian P

    2018-04-30

    This paper provides a practical clinical application of guideline recommendations relating to the inpatient monitoring of patients with acute heart failure, through the evaluation of various clinical, biomarker, imaging, invasive and non-invasive approaches. Comprehensive inpatient monitoring is crucial to the optimal management of acute heart failure patients. The European Society of Cardiology heart failure guidelines provide recommendations for the inpatient monitoring of acute heart failure, but the level of evidence underpinning most recommendations is limited. Many tools are available for the in-hospital monitoring of patients with acute heart failure, and each plays a role at various points throughout the patient's treatment course, including the emergency department, intensive care or coronary care unit, and the general ward. Clinical judgment is the preeminent factor guiding application of inpatient monitoring tools, as the various techniques have different patient population targets. When applied appropriately, these techniques enable decision making. However, there is limited evidence demonstrating that implementation of these tools improves patient outcome. Research priorities are identified to address these gaps in evidence. Future research initiatives should aim to identify the optimal in-hospital monitoring strategies that decrease morbidity and prolong survival in patients with acute heart failure. © 2018 The Authors. European Journal of Heart Failure © 2018 European Society of Cardiology.

  1. Wide-field retinal optical coherence tomography with wavefront sensorless adaptive optics for enhanced imaging of targeted regions.

    PubMed

    Polans, James; Keller, Brenton; Carrasco-Zevallos, Oscar M; LaRocca, Francesco; Cole, Elijah; Whitson, Heather E; Lad, Eleonora M; Farsiu, Sina; Izatt, Joseph A

    2017-01-01

    The peripheral retina of the human eye offers a unique opportunity for assessment and monitoring of ocular diseases. We have developed a novel wide-field (>70°) optical coherence tomography system (WF-OCT) equipped with wavefront sensorless adaptive optics (WSAO) for enhancing the visualization of smaller (<25°) targeted regions in the peripheral retina. We iterated the WSAO algorithm at the speed of individual OCT B-scans (~20 ms) by using raw spectral interferograms to calculate the optimization metric. Our WSAO approach with a 3 mm beam diameter permitted primarily low- but also high- order peripheral wavefront correction in less than 10 seconds. In preliminary imaging studies in five normal human subjects, we quantified statistically significant changes with WSAO correction, corresponding to a 10.4% improvement in average pixel brightness (signal) and 7.0% improvement in high frequency content (resolution) when visualizing 1 mm (~3.5°) B-scans of the peripheral (>23°) retina. We demonstrated the ability of our WF-OCT system to acquire non wavefront-corrected wide-field images rapidly, which could then be used to locate regions of interest, zoom into targeted features, and visualize the same region at different time points. A pilot clinical study was conducted on seven healthy volunteers and two subjects with prodromal Alzheimer's disease which illustrated the capability to image Drusen-like pathologies as far as 32.5° from the fovea in un-averaged volume scans. This work suggests that the proposed combination of WF-OCT and WSAO may find applications in the diagnosis and treatment of ocular, and potentially neurodegenerative, diseases of the peripheral retina, including diabetes and Alzheimer's disease.

  2. Wide-field retinal optical coherence tomography with wavefront sensorless adaptive optics for enhanced imaging of targeted regions

    PubMed Central

    Polans, James; Keller, Brenton; Carrasco-Zevallos, Oscar M.; LaRocca, Francesco; Cole, Elijah; Whitson, Heather E.; Lad, Eleonora M.; Farsiu, Sina; Izatt, Joseph A.

    2016-01-01

    The peripheral retina of the human eye offers a unique opportunity for assessment and monitoring of ocular diseases. We have developed a novel wide-field (>70°) optical coherence tomography system (WF-OCT) equipped with wavefront sensorless adaptive optics (WSAO) for enhancing the visualization of smaller (<25°) targeted regions in the peripheral retina. We iterated the WSAO algorithm at the speed of individual OCT B-scans (~20 ms) by using raw spectral interferograms to calculate the optimization metric. Our WSAO approach with a 3 mm beam diameter permitted primarily low- but also high- order peripheral wavefront correction in less than 10 seconds. In preliminary imaging studies in five normal human subjects, we quantified statistically significant changes with WSAO correction, corresponding to a 10.4% improvement in average pixel brightness (signal) and 7.0% improvement in high frequency content (resolution) when visualizing 1 mm (~3.5°) B-scans of the peripheral (>23°) retina. We demonstrated the ability of our WF-OCT system to acquire non wavefront-corrected wide-field images rapidly, which could then be used to locate regions of interest, zoom into targeted features, and visualize the same region at different time points. A pilot clinical study was conducted on seven healthy volunteers and two subjects with prodromal Alzheimer’s disease which illustrated the capability to image Drusen-like pathologies as far as 32.5° from the fovea in un-averaged volume scans. This work suggests that the proposed combination of WF-OCT and WSAO may find applications in the diagnosis and treatment of ocular, and potentially neurodegenerative, diseases of the peripheral retina, including diabetes and Alzheimer’s disease. PMID:28101398

  3. Wavefront sensorless adaptive optics versus sensor-based adaptive optics for in vivo fluorescence retinal imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wahl, Daniel J.; Zhang, Pengfei; Jian, Yifan; Bonora, Stefano; Sarunic, Marinko V.; Zawadzki, Robert J.

    2017-02-01

    Adaptive optics (AO) is essential for achieving diffraction limited resolution in large numerical aperture (NA) in-vivo retinal imaging in small animals. Cellular-resolution in-vivo imaging of fluorescently labeled cells is highly desirable for studying pathophysiology in animal models of retina diseases in pre-clinical vision research. Currently, wavefront sensor-based (WFS-based) AO is widely used for retinal imaging and has demonstrated great success. However, the performance can be limited by several factors including common path errors, wavefront reconstruction errors and an ill-defined reference plane on the retina. Wavefront sensorless (WFS-less) AO has the advantage of avoiding these issues at the cost of algorithmic execution time. We have investigated WFS-less AO on a fluorescence scanning laser ophthalmoscopy (fSLO) system that was originally designed for WFS-based AO. The WFS-based AO uses a Shack-Hartmann WFS and a continuous surface deformable mirror in a closed-loop control system to measure and correct for aberrations induced by the mouse eye. The WFS-less AO performs an open-loop modal optimization with an image quality metric. After WFS-less AO aberration correction, the WFS was used as a control of the closed-loop WFS-less AO operation. We can easily switch between WFS-based and WFS-less control of the deformable mirror multiple times within an imaging session for the same mouse. This allows for a direct comparison between these two types of AO correction for fSLO. Our results demonstrate volumetric AO-fSLO imaging of mouse retinal cells labeled with GFP. Most significantly, we have analyzed and compared the aberration correction results for WFS-based and WFS-less AO imaging.

  4. Early detection of nonneurologic organ failure in patients with severe traumatic brain injury: Multiple organ dysfunction score or sequential organ failure assessment?

    PubMed

    Ramtinfar, Sara; Chabok, Shahrokh Yousefzadeh; Chari, Aliakbar Jafari; Reihanian, Zoheir; Leili, Ehsan Kazemnezhad; Alizadeh, Arsalan

    2016-10-01

    The aim of this study is to compare the discriminant function of the multiple organ dysfunction score (MODS) and the sequential organ failure assessment (SOFA) components in predicting Intensive Care Unit (ICU) mortality and neurologic outcome. A descriptive-analytic study was conducted at a level I trauma center. Data were collected from patients with severe traumatic brain injury admitted to the neurosurgical ICU. Basic demographic data and SOFA and MOD scores were recorded daily for all patients. Odds ratios (ORs) were calculated to determine the relationship of each component score to mortality, and the area under the receiver operating characteristic (AUROC) curve was used to compare the discriminative ability of the two tools with respect to ICU mortality. The most common organ failure observed was respiratory failure, detected by SOFA in 26% and by MODS in 13% of patients; the second most common was cardiovascular failure, detected by SOFA in 18% and by MODS in 13%. No hepatic or renal failure occurred, and coagulation failure was reported as 2.5% by both SOFA and MODS. Cardiovascular failure defined by both tools had a correlation to ICU mortality, and it was more significant for SOFA (OR = 6.9, CI = 3.6-13.3, P < 0.05 for SOFA; OR = 5, CI = 3-8.3, P < 0.05 for MODS; AUROC = 0.82 for SOFA; AUROC = 0.73 for MODS). The relationship of cardiovascular failure to dichotomized neurologic outcome was not statistically significant. ICU mortality was not associated with respiratory or coagulation failure. Cardiovascular failure defined by either tool was significantly related to ICU mortality; compared to MODS, SOFA-defined cardiovascular failure was a stronger predictor of death.

  5. Enhanced Schapery Theory Software Development for Modeling Failure of Fiber-Reinforced Laminates

    NASA Technical Reports Server (NTRS)

    Pineda, Evan J.; Waas, Anthony M.

    2013-01-01

    Progressive damage and failure analysis (PDFA) tools are needed to predict the nonlinear response of advanced fiber-reinforced composite structures. Predictive tools should incorporate the underlying physics of the damage and failure mechanisms observed in the composite, and should utilize as few input parameters as possible. The purpose of the Enhanced Schapery Theory (EST) was to create a PDFA tool that operates in conjunction with a commercially available finite element (FE) code (Abaqus). The tool captures the physics of the damage and failure mechanisms that result in the nonlinear behavior of the material, and the failure methodology employed yields numerical results that are relatively insensitive to changes in the FE mesh. The EST code is written in Fortran and compiled into a static library that is linked to Abaqus. A Fortran Abaqus UMAT material subroutine is used to facilitate the communication between Abaqus and EST. A clear distinction between damage and failure is imposed. Damage mechanisms result in pre-peak nonlinearity in the stress-strain curve. Four internal state variables (ISVs) are utilized to control the damage and failure degradation. All damage is said to result from matrix microdamage, and a single ISV marks the microdamage evolution as it is used to degrade the transverse and shear moduli of the lamina using a set of experimentally obtainable matrix microdamage functions. Three separate failure ISVs are used to incorporate failure due to fiber breakage, mode I matrix cracking, and mode II matrix cracking. Failure initiation is determined using a failure criterion, and the evolution of these ISVs is controlled by a set of traction-separation laws. The traction-separation laws are postulated such that the area under the curves is equal to the fracture toughness of the material associated with the corresponding failure mechanism. A characteristic finite element length is used to transform the traction-separation laws into stress-strain laws. The ISV evolution equations are derived in a thermodynamically consistent manner by invoking the stationarity principle on the total work of the system with respect to each ISV. A novel feature is the inclusion of both pre-peak damage and appropriately scaled, post-peak strain-softening failure. Also, the characteristic elements used in the failure degradation scheme are calculated using the element nodal coordinates, rather than simply the square root of the area of the element.
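
    The characteristic-length scaling mentioned above is the standard crack-band device: the post-peak branch is stretched so that the energy dissipated per unit volume equals the fracture toughness divided by the element length. A small sketch for a linear softening law (a simplification; EST's traction-separation laws need not be linear):

```python
def softening_failure_strain(sigma_max, G_c, L_e, E):
    """Crack-band scaling of a linear softening law to a stress-strain law.

    The failure strain is set so the energy dissipated per unit volume,
    the area under the triangular stress-strain curve, equals G_c / L_e,
    which keeps the dissipated energy mesh-independent. Linear softening is
    a simplification; EST's traction-separation laws need not be linear.
    """
    eps_f = 2.0 * G_c / (sigma_max * L_e)    # area: 0.5*sigma_max*eps_f = G_c/L_e
    eps_0 = sigma_max / E                    # strain at damage initiation
    if eps_f <= eps_0:
        raise ValueError("element too large for this G_c: refine the mesh")
    return eps_f
```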

  6. Health information systems: failure, success and improvisation.

    PubMed

    Heeks, Richard

    2006-02-01

    The generalised assumption of health information systems (HIS) success is questioned by a few commentators in the medical informatics field, who point to widespread HIS failure. The purpose of this paper was therefore to develop a better conceptual foundation for, and practical guidance on, health information systems failure (and success). The methods comprised literature and case analysis plus pilot testing of the developed model. Defining HIS failure and success is complex, and the current evidence base on HIS success and failure rates was found to be weak. Nonetheless, the best current estimate is that HIS failure is an important problem. The paper therefore derives and explains the "design-reality gap" conceptual model. This is shown to be robust in explaining multiple cases of HIS success and failure, yet provides a contingency that encompasses the differences existing in different HIS contexts. The design-reality gap model is piloted to demonstrate its value as a tool for risk assessment and mitigation on HIS projects. It also throws into question traditional, structured development methodologies, highlighting the importance of emergent change and improvisation in HIS. The design-reality gap model can be used to address the problem of HIS failure, both as a post hoc evaluative tool and as a pre hoc risk assessment and mitigation tool. It also validates a set of methods, techniques, roles and competencies needed to support the dynamic improvisations that are found to underpin cases of HIS success.

  7. Fault management for the Space Station Freedom control center

    NASA Technical Reports Server (NTRS)

    Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet

    1992-01-01

    This paper describes model based reasoning fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and susceptibility to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool and an automated process working with that tool can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.
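
    A small, hypothetical sketch of the digraph idea (not the NASA-developed tool itself): failures propagate along directed edges, so the candidate root causes of a fault indication are the nodes from which it is reachable, and multiple indications narrow the candidates by intersection. Node names are invented.

      # Hypothetical failure digraph: edge (a, b) means "failure of a can cause b".
      edges = [("pump", "coolant_loss"), ("valve", "coolant_loss"),
               ("coolant_loss", "overtemp"), ("sensor_psu", "overtemp"),
               ("overtemp", "cpu_throttle")]

      parents = {}
      for src, dst in edges:
          parents.setdefault(dst, set()).add(src)

      def ancestors(node):
          """All upstream nodes whose failure could explain `node`."""
          seen, stack = set(), [node]
          while stack:
              for parent in parents.get(stack.pop(), ()):
                  if parent not in seen:
                      seen.add(parent)
                      stack.append(parent)
          return seen

      # Two simultaneous fault indications: intersect their candidate causes.
      indications = ["overtemp", "cpu_throttle"]
      candidates = set.intersection(*(ancestors(i) | {i} for i in indications))
      print(candidates)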

  8. Fault Injection Techniques and Tools

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen; Tsai, Timothy K.; Iyer, Ravishankar K.

    1997-01-01

    Dependability evaluation involves the study of failures and errors. The destructive nature of a crash and long error latency make it difficult to identify the causes of failures in the operational environment. It is particularly hard to recreate a failure scenario for a large, complex system. To identify and understand potential failures, we use an experiment-based approach for studying the dependability of a system. Such an approach is applied not only during the conception and design phases, but also during the prototype and operational phases. To take an experiment-based approach, we must first understand a system's architecture, structure, and behavior. Specifically, we need to know its tolerance for faults and failures, including its built-in detection and recovery mechanisms, and we need specific instruments and tools to inject faults, create failures or errors, and monitor their effects.
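
    As a toy illustration of software-implemented fault injection in the spirit of the experiment-based approach above (not one of the surveyed tools), the sketch below flips a random bit in a data word and checks whether a simple additive checksum, standing in for a built-in detection mechanism, catches the error.

      import random

      def checksum(words):
          # Simple additive checksum acting as the detection mechanism.
          return sum(words) & 0xFFFFFFFF

      data = [0x12345678, 0xDEADBEEF, 0x0BADF00D]
      reference = checksum(data)

      def inject_bit_flip(words):
          """Flip one random bit in one random 32-bit word (the injected fault)."""
          i = random.randrange(len(words))
          words[i] ^= 1 << random.randrange(32)

      trials, detected = 1000, 0
      for _ in range(trials):
          corrupted = list(data)
          inject_bit_flip(corrupted)
          if checksum(corrupted) != reference:
              detected += 1
      print(f"detected {detected}/{trials} injected faults")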

  9. Passive control of a biventricular assist device with compliant inflow cannulae.

    PubMed

    Gregory, Shaun David; Pearcy, Mark John; Timms, Daniel

    2012-08-01

    Rotary ventricular assist device (VAD) support of the cardiovascular system is susceptible to suction events due to the limited preload sensitivity of these devices. This may be of particular concern with rotary biventricular support (BiVAD), where the native, flow-balancing Starling response is diminished in both ventricles. The reliability of sensor-based and sensorless control systems that aim to control VAD flow based on preload has limitations, and thus an alternative solution is desired. This study introduces a compliant inflow cannula (CIC) which could improve the preload sensitivity of a rotary VAD by passively altering VAD flow depending on preload. To evaluate the design, both the CIC and a standard rigid inflow cannula were inserted into a mock circulation loop to enable biventricular heart failure support using configurations of atrial and ventricular inflow, and arterial outflow cannulation. A range of left (LVAD) and right VAD (RVAD) rotational speeds were tested, as well as step changes in systemic/pulmonary vascular resistance to alter relative preloads, with resulting flow rates recorded. Simulated suction events were observed, particularly at higher VAD speeds, during support with the rigid inflow cannula, while the CIC prevented suction events under all circumstances. The compliant section passively restricted its internal diameter as preload was reduced, which increased the VAD circuit resistance and thus reduced VAD flow. Therefore, a CIC could potentially be used as a passive control system to prevent suction events in rotary left, right, and biventricular support. © 2012, Copyright the Authors. Artificial Organs © 2012, International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  10. Graphical Displays Assist In Analysis Of Failures

    NASA Technical Reports Server (NTRS)

    Pack, Ginger; Wadsworth, David; Razavipour, Reza

    1995-01-01

    Failure Environment Analysis Tool (FEAT) computer program enables people to see and better understand effects of failures in system. Uses digraph models to determine what will happen to system if set of failure events occurs and to identify possible causes of selected set of failures. Digraphs or engineering schematics used. Also used in operations to help identify causes of failures after they occur. Written in C language.

  11. 40 CFR 1065.410 - Maintenance limits for stabilized test engines.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... engineering grade tools to identify bad engine components. Any equipment, instruments, or tools used for... no longer use it as an emission-data engine. Also, if your test engine has a major mechanical failure... your test engine has a major mechanical failure that requires you to take it apart, you may no longer...

  12. 40 CFR 1065.410 - Maintenance limits for stabilized test engines.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... engineering grade tools to identify bad engine components. Any equipment, instruments, or tools used for... no longer use it as an emission-data engine. Also, if your test engine has a major mechanical failure... your test engine has a major mechanical failure that requires you to take it apart, you may no longer...

  13. Control of a High Speed Flywheel System for Energy Storage in Space Applications

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.; Kascak, Peter E.; Jansen, Ralph; Dever, Timothy; Santiago, Walter

    2004-01-01

    A novel control algorithm for the charge and discharge modes of operation of a flywheel energy storage system for space applications is presented. The motor control portion of the algorithm uses sensorless field oriented control with position and speed estimates determined from a signal injection technique at low speeds and a back EMF technique at higher speeds. The charge and discharge portions of the algorithm use command feed-forward and disturbance decoupling, respectively, to achieve fast response with low gains. Simulation and experimental results are presented demonstrating the successful operation of the flywheel control up to the rated speed of 60,000 rpm.
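
    A hedged sketch of the hand-off implied above: a low-speed estimate from signal injection is crossfaded into a high-speed back-EMF estimate over a transition band. The band edges and the two estimator inputs are assumptions for illustration, not values from the flywheel controller.

      def blended_speed(w_inj, w_bemf, w):
          """Crossfade between two rotor-speed estimates (rad/s).

          w_inj  : estimate from signal injection (trusted at low speed)
          w_bemf : estimate from back-EMF (trusted at high speed)
          w      : current blended estimate used to schedule the crossfade
          """
          LOW, HIGH = 500.0, 800.0          # assumed transition band, rad/s
          if w <= LOW:
              return w_inj
          if w >= HIGH:
              return w_bemf
          alpha = (w - LOW) / (HIGH - LOW)  # 0 -> injection, 1 -> back-EMF
          return (1.0 - alpha) * w_inj + alpha * w_bemf

      print(blended_speed(640.0, 660.0, 650.0))  # mid-band: average of the two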

  14. Adaptive optics plug-and-play setup for high-resolution microscopes with multi-actuator adaptive lens

    NASA Astrophysics Data System (ADS)

    Quintavalla, M.; Pozzi, P.; Verhaegen, Michelle; Bijlsma, Hielke; Verstraete, Hans; Bonora, S.

    2018-02-01

    Adaptive Optics (AO) has emerged as a very promising technique for high-resolution microscopy, where the presence of optical aberrations can easily compromise the image quality. However, typical AO systems are almost impossible to implement on commercial microscopes. We propose a simple approach using a Multi-actuator Adaptive Lens (MAL) that can be inserted right after the objective and works in conjunction with image optimization software, allowing for wavefront sensorless correction. We present the results obtained on several commercial microscopes, including a confocal microscope, a fluorescence microscope, a light sheet microscope and a multiphoton microscope.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lherbier, Louis W.; Novotnak, David J.; Herling, Darrell R.

    Hot forming processes such as forging, die casting and glass forming require tooling that is subjected to high temperatures during the manufacturing of components. Current tooling is adversely affected by prolonged exposure at high temperatures. Initial studies were conducted to determine the root cause of tool failures in a number of applications. Results show that tool failures vary and depend on the operating environment under which the tools are used. Major root-cause failures include (1) thermal softening, (2) fatigue and (3) tool erosion, all of which are affected by process boundary conditions such as lubrication, cooling, process speed, etc. While thermal management is a key to addressing tooling failures, it was clear that new tooling materials with superior high-temperature strength could provide improved manufacturing efficiencies. These efficiencies are based on the use of functionally graded materials (FGM), a new subset of hybrid tools with customizable properties that can be fabricated using advanced powder metallurgy manufacturing technologies. Modeling studies of the various hot forming processes helped identify the effect of key variables such as stress, temperature and cooling rate, and aided in the selection of tooling materials for specific applications. To address the problem of high-temperature strength, several advanced powder metallurgy nickel- and cobalt-based alloys were selected for evaluation. These materials were manufactured into tooling using two relatively new consolidation processes: laser powder deposition (LPD) and solid state dynamic powder consolidation (SSDPC). These processes made possible functionally graded materials that resulted in shaped tooling that was monolithic, bi-metallic or substrate coated. Manufacturing of tooling with these processes was determined to be robust and consistent for a variety of materials. Prototype and production testing of FGM tooling showed the benefits of the nickel- and cobalt-based powder metallurgy alloys in a number of the applications evaluated. Improvements in tool life ranged from three to twenty or more times that of currently used tooling. Improvements were most dramatic where tool softening and deformation were the major cause of tool failures in hot/warm forging applications. Significant improvement was also noted in erosion of aluminum die casting tooling. Cost and energy savings can be realized as a result of increased tooling life, increased productivity and a reduction in scrap because of improved dimensional controls. Although LPD and SSDPC tooling usually have higher acquisition costs, net tooling cost per component produced drops dramatically with superior tool performance. Less energy is used to manufacture the tooling because fewer tools are required and less recycling of used tools is needed for the hot forming process. Energy is saved during the component manufacturing cycle because more parts can be produced in shorter periods of time. Energy is also saved by minimizing heating furnace idling time because of less downtime for tooling changes.

  16. Anthology of the Development of Radiation Transport Tools as Applied to Single Event Effects

    NASA Astrophysics Data System (ADS)

    Reed, R. A.; Weller, R. A.; Akkerman, A.; Barak, J.; Culpepper, W.; Duzellier, S.; Foster, C.; Gaillardin, M.; Hubert, G.; Jordan, T.; Jun, I.; Koontz, S.; Lei, F.; McNulty, P.; Mendenhall, M. H.; Murat, M.; Nieminen, P.; O'Neill, P.; Raine, M.; Reddell, B.; Saigné, F.; Santin, G.; Sihver, L.; Tang, H. H. K.; Truscott, P. R.; Wrobel, F.

    2013-06-01

    This anthology contains contributions from eleven different groups, each developing and/or applying Monte Carlo-based radiation transport tools to simulate a variety of effects that result from energy transferred to a semiconductor material by a single particle event. The topics span from basic mechanisms for single-particle induced failures to applied tasks like developing websites to predict on-orbit single event failure rates using Monte Carlo radiation transport tools.

  17. Failure mode and effect analysis in blood transfusion: a proactive tool to reduce risks.

    PubMed

    Lu, Yao; Teng, Fang; Zhou, Jie; Wen, Aiqing; Bi, Yutian

    2013-12-01

    The aim of blood transfusion risk management is to improve the quality of blood products and to assure patient safety. We utilized failure mode and effect analysis (FMEA), a tool for evaluating risks and identifying preventive measures, to reduce the risks in blood transfusion. The failure modes and effects occurring throughout the whole process of blood transfusion were studied. Each failure mode was evaluated using three scores: severity of effect (S), likelihood of occurrence (O), and probability of detection (D). Risk priority numbers (RPNs) were calculated by multiplying the S, O, and D scores. The plan-do-check-act cycle was also used for continuous improvement. The analysis showed that the failure modes with the highest RPNs, and therefore the greatest risk, were insufficient preoperative assessment of the blood product requirement (RPN, 245), preparation time before infusion of more than 30 minutes (RPN, 240), blood transfusion reaction occurring during the transfusion process (RPN, 224), blood plasma abuse (RPN, 180), and insufficient and/or incorrect clinical information on the request form (RPN, 126). After implementation of preventive measures and reassessment, a reduction in RPN was detected for each risk. The failure mode with the second highest RPN, namely preparation time before infusion of more than 30 minutes, is described in detail to demonstrate the efficiency of this tool. The FMEA evaluation model is a useful tool for proactively analyzing and reducing the risks associated with the blood transfusion procedure. © 2013 American Association of Blood Banks.
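
    A minimal sketch of the RPN arithmetic described above: severity, occurrence, and detection scores are multiplied and the failure modes ranked. The S/O/D splits shown are assumed; only their products match the RPNs reported in the abstract.

      # (failure mode, S, O, D): S/O/D splits are assumed; products match
      # the RPNs reported in the abstract.
      modes = [
          ("insufficient preoperative assessment of blood need", 7, 7, 5),
          ("preparation time before infusion > 30 minutes",      8, 6, 5),
          ("transfusion reaction during transfusion",            7, 8, 4),
          ("blood plasma abuse",                                 6, 6, 5),
          ("insufficient/incorrect info on request form",        7, 6, 3),
      ]

      ranked = sorted(((s * o * d, name) for name, s, o, d in modes), reverse=True)
      for rpn, name in ranked:
          print(f"RPN {rpn:3d}  {name}")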

  18. Instrument Failures for the da Vinci Surgical System: a Food and Drug Administration MAUDE Database Study.

    PubMed

    Friedman, Diana C W; Lendvay, Thomas S; Hannaford, Blake

    2013-05-01

    Our goal was to analyze reported instances of the da Vinci robotic surgical system instrument failures using the FDA's MAUDE (Manufacturer and User Facility Device Experience) database. From these data we identified some root causes of failures as well as trends that may assist surgeons and users of the robotic technology. We conducted a survey of the MAUDE database and tallied robotic instrument failures that occurred between January 2009 and December 2010. We categorized failures into five main groups (cautery, shaft, wrist or tool tip, cable, and control housing) based on technical differences in instrument design and function. A total of 565 instrument failures were documented through 528 reports. The majority of failures (285) were of the instrument's wrist or tool tip. Cautery problems comprised 174 failures, 76 were shaft failures, 29 were cable failures, and 7 were control housing failures. Of the reports, 10 had no discernible failure mode and 49 exhibited multiple failures. The data show that a number of robotic instrument failures occurred in a short period of time. In reality, many instrument failures may go unreported, thus a true failure rate cannot be determined from these data. However, education of hospital administrators, operating room staff, surgeons, and patients should be incorporated into discussions regarding the introduction and utilization of robotic technology. We recommend institutions incorporate standard failure reporting policies so that the community of robotic surgery companies and surgeons can improve on existing technologies for optimal patient safety and outcomes.

  19. A study of unstable rock failures using finite difference and discrete element methods

    NASA Astrophysics Data System (ADS)

    Garvey, Ryan J.

    Case histories in mining have long described pillars or faces of rock failing violently with an accompanying rapid ejection of debris and broken material into the working areas of the mine. These unstable failures have resulted in large losses of life and collapses of entire mine panels. Modern mining operations take significant steps to reduce the likelihood of unstable failure; however, eliminating their occurrence is difficult in practice. Researchers over several decades have supplemented studies of unstable failures through the application of various numerical methods. The direction of the current research is to extend these methods and to develop improved numerical tools with which to study unstable failures in underground mining layouts. An extensive study is first conducted on the expression of unstable failure in discrete element and finite difference methods. Simulated uniaxial compressive strength tests are run on brittle rock specimens. Stable or unstable loading conditions are applied to the brittle specimens by a pair of elastic platens with ranging stiffnesses. Determinations of instability are established through stress and strain histories taken for the specimen and the system. Additional numerical tools are then developed for the finite difference method to analyze unstable failure in larger mine models. Instability identifiers are established for assessing the locations and relative magnitudes of unstable failure through measures of rapid dynamic motion. An energy balance is developed which calculates the excess energy released as a result of unstable equilibria in rock systems. These tools are validated through uniaxial and triaxial compressive strength tests and are extended to models of coal pillars and a simplified mining layout. The results of the finite difference simulations reveal that the instability identifiers and excess energy calculations provide a generalized methodology for assessing unstable failures within potentially complex mine models. These combined numerical tools may be applied in future studies to design primary and secondary supports in bump-prone conditions, evaluate retreat mining cut sequences, assess pillar de-stressing techniques, or perform back-analyses of unstable failures in select mining layouts.

  20. Public Choice, Market Failure, and Government Failure in Principles Textbooks

    ERIC Educational Resources Information Center

    Fike, Rosemarie; Gwartney, James

    2015-01-01

    Public choice uses the tools of economics to analyze how the political process allocates resources and impacts economic activity. In this study, the authors examine twenty-three principles texts regarding coverage of public choice, market failure, and government failure. Approximately half the texts provide coverage of public choice and recognize…

  1. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach

    PubMed Central

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-01-01

    A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance for different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools for various sizes but similar qualities. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design. PMID:28773800

  2. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach.

    PubMed

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-08-09

    A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance for different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools for various sizes but similar qualities. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design.

  3. Introduction of the TEAM-HF Costing Tool: A User-Friendly Spreadsheet Program to Estimate Costs of Providing Patient-Centered Interventions

    PubMed Central

    Reed, Shelby D.; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L.; Bowers, Margaret T.; Samsa, Gregory P.; Paul, Sara; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara J.

    2011-01-01

    Background Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Methods and Results Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers or health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. Conclusions The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions. PMID:22147884
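
    A toy version of the cost roll-up such a tool performs, summing category costs and deriving cost per patient and per week; the categories follow the abstract, but all figures are invented rather than taken from the TEAM-HF tool.

      # Illustrative cost categories (USD) for a 12-week program, 50 patients.
      costs = {
          "personnel": 48_000, "facilities": 6_000, "equipment": 9_000,
          "supplies": 2_500, "patient_incentives": 5_000,
          "miscellaneous": 1_500, "start_up": 8_000,
      }
      n_patients, n_weeks = 50, 12

      total = sum(costs.values())
      print(f"total program cost : ${total:,}")
      print(f"cost per patient   : ${total / n_patients:,.2f}")
      print(f"cost per week      : ${total / n_weeks:,.2f}")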

  4. A tool for assessment of heart failure prescribing quality: A systematic review and meta-analysis.

    PubMed

    El Hadidi, Seif; Darweesh, Ebtissam; Byrne, Stephen; Bermingham, Margaret

    2018-04-16

    Heart failure (HF) guidelines aim to standardise patient care. Internationally, prescribing practice in HF may deviate from guidelines and so a standardised tool is required to assess prescribing quality. A systematic review and meta-analysis were performed to identify a quantitative tool for measuring adherence to HF guidelines and its clinical implications. Eleven electronic databases were searched to include studies reporting a comprehensive tool for measuring adherence to prescribing guidelines in HF patients aged ≥18 years. Qualitative studies or studies measuring prescription rates alone were excluded. Study quality was assessed using the Good ReseArch for Comparative Effectiveness Checklist. In total, 2455 studies were identified. Sixteen eligible full-text articles were included (n = 14 354 patients, mean age 69 ± 8 y). The Guideline Adherence Index (GAI), and its modified versions, was the most frequently cited tool (n = 13). Other tools identified were the Individualised Reconciled Evidence Recommendations, the Composite Heart Failure Performance, and the Heart Failure Scale. The meta-analysis included the GAI studies of good to high quality. The average GAI-3 was 62%. Compared to low GAI, high GAI patients had lower mortality rate (7.6% vs 33.9%) and lower rehospitalisation rates (23.5% vs 24.5%); both P ≤ .05. High GAI was associated with reduced risk of mortality (hazard ratio = 0.29, 95% confidence interval, 0.06-0.51) and rehospitalisation (hazard ratio = 0.64, 95% confidence interval, 0.41-1.00). No tool was used to improve prescribing quality. The GAI is the most frequently used tool to assess guideline adherence in HF. High GAI is associated with improved HF outcomes. Copyright © 2018 John Wiley & Sons, Ltd.
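
    Since the abstract does not give the GAI formula, the following hedged sketch encodes one common reading of it: the fraction of indicated guideline-recommended drug classes actually prescribed, with GAI-3 restricted to three key classes. The class list is an assumed convention, not confirmed by the source.

      # Assumed key classes for GAI-3 (a common convention, not confirmed here).
      GAI3_CLASSES = {"ACE_inhibitor_or_ARB", "beta_blocker", "MRA"}

      def gai(indicated, prescribed, classes=GAI3_CLASSES):
          """Fraction of indicated guideline classes that were prescribed."""
          eligible = indicated & classes
          if not eligible:
              return None  # index undefined if no class is indicated
          return len(eligible & prescribed) / len(eligible)

      indicated  = {"ACE_inhibitor_or_ARB", "beta_blocker", "MRA"}
      prescribed = {"ACE_inhibitor_or_ARB", "beta_blocker"}
      print(f"GAI-3 = {gai(indicated, prescribed):.0%}")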

  5. High satisfaction and low decisional conflict with advance care planning among chronically ill patients with advanced chronic obstructive pulmonary disease or heart failure using an online decision aid: A pilot study.

    PubMed

    Van Scoy, Lauren J; Green, Michael J; Dimmock, Anne Ef; Bascom, Rebecca; Boehmer, John P; Hensel, Jessica K; Hozella, Joshua B; Lehman, Erik B; Schubart, Jane R; Farace, Elana; Stewart, Renee R; Levi, Benjamin H

    2016-09-01

    Many patients with chronic illnesses report a desire for increased involvement in medical decision-making. This pilot study aimed to explore how patients with exacerbation-prone disease trajectories such as advanced heart failure or chronic obstructive pulmonary disease experience advance care planning using an online decision aid and to compare whether patients with different types of exacerbation-prone illnesses had varied experiences using the tool. Pre-intervention questionnaires measured advance care planning knowledge. Post-intervention questionnaires measured: (1) advance care planning knowledge; (2) satisfaction with tool; (3) decisional conflict; and (4) accuracy of the resultant advance directive. Comparisons were made between patients with heart failure and chronic obstructive pulmonary disease. Over 90% of the patients with heart failure (n = 24) or chronic obstructive pulmonary disease (n = 25) reported being "satisfied" or "highly satisfied" with the tool across all satisfaction domains; over 90% of participants rated the resultant advance directive as "very accurate." Participants reported low decisional conflict. Advance care planning knowledge scores rose by 18% (p < 0.001) post-intervention. There were no significant differences between participants with heart failure and chronic obstructive pulmonary disease. Patients with advanced heart failure and chronic obstructive pulmonary disease were highly satisfied after using an online advance care planning decision aid and had increased knowledge of advance care planning. This tool can be a useful resource for time-constrained clinicians whose patients wish to engage in advance care planning. © The Author(s) 2016.

  6. Application of Quality Management Tools for Evaluating the Failure Frequency of Cutter-Loader and Plough Mining Systems

    NASA Astrophysics Data System (ADS)

    Biały, Witold

    2017-06-01

    Failure frequency in the mining process, with a focus on the mining machine, is presented and illustrated using the example of two coal mines. Two mining systems have been subjected to analysis: a cutter-loader and a plough system. In order to reduce the costs generated by failures, maintenance teams should regularly make sure that the machines are used and operated in a rational and effective way. Such activities allow downtimes to be reduced and, in consequence, increase the effectiveness of a mining plant. The evaluation of mining machines' failure frequency contained in this study is based on one of the traditional quality management tools: the Pareto chart.
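
    A minimal Pareto computation in the spirit of the analysis above: failure causes are sorted by frequency and their cumulative share accumulated, flagging the "vital few" that account for roughly 80% of failures. Cause names and counts are invented.

      # Hypothetical failure counts for a cutter-loader (illustrative only).
      failures = {"haulage_chain": 42, "cutting_drum": 31, "hydraulics": 17,
                  "electrics": 8, "controls": 4, "other": 3}

      total = sum(failures.values())
      cumulative = 0.0
      for cause, count in sorted(failures.items(), key=lambda kv: -kv[1]):
          cumulative += 100.0 * count / total
          marker = " <- vital few" if cumulative <= 80.0 else ""
          print(f"{cause:14s} {count:3d}  cum {cumulative:5.1f}%{marker}")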

  7. Modified Sainsbury tool: an initial risk assessment tool for primary care mental health and learning disability services.

    PubMed

    Stein, W

    2005-10-01

    Risk assessments by health and social care professionals must encompass risk of suicide, of harm to others, and of neglect. The UK's National Confidential Inquiry into Homicide and Suicide paints a picture of failure to predict suicides and homicides, failure to identify opportunities for prevention and a failure to manage these opportunities. Assessing risk at 'first contact' with the mental health service assumes a special place in this regard. The initial opportunity to be alerted to, and thus to influence, risk, usually falls to the general psychiatric service (as opposed to forensic specialists) or to a joint health and local authority community mental health team. The Mental Health and Learning Disabilities Directorate of Renfrewshire & Inverclyde Primary Care NHS Trust, Scotland, determined to standardize their approach to risk assessment and selected a modified version of the Sainsbury Risk Assessment Tool. A year-long pilot revealed general support for its service-wide introduction but also some misgivings to address, including: (i) rejection of the tool by some medical staff; (ii) concerns about limited training; and (iii) a perceived failure on the part of the management to properly resource its use. The tool has the potential to fit well with the computer-networked needs assessment system used in joint-working with partner local authorities to allocate care resources.

  8. Learning from Failures: Archiving and Designing with Failure and Risk

    NASA Technical Reports Server (NTRS)

    VanWie, Michael; Bohm, Matt; Barrientos, Francesca; Turner, Irem; Stone, Robert

    2005-01-01

    Identifying and mitigating risks during conceptual design remains an ongoing challenge. This work presents the results of collaborative efforts between The University of Missouri-Rolla and NASA Ames Research Center to examine how an early-stage mission design team at NASA addresses risk, and how a computational support tool can assist these designers in their tasks. Results of our observations are given, in addition to a brief example of our implementation of a repository-based computational tool that allows users to browse and search through archived failure and risk data as related to either physical artifacts or functionality.

  9. A new version of Stochastic-parallel-gradient-descent algorithm (SPGD) for phase correction of a distorted orbital angular momentum (OAM) beam

    NASA Astrophysics Data System (ADS)

    Jiao-Ling, Lin; Xiaoli, Yin; Huan, Chang; Xiaozhou, Cui; Yi-Lin, Guo; Huan-Yu, Liao; Chun-Yu, Gao; Guohua, Wu; Guang-Yao, Liu; Jin-Kun, Jiang; Qing-Hua, Tian

    2018-02-01

    Atmospheric turbulence limits the performance of orbital angular momentum-based free-space optical communication (FSO-OAM) systems. In order to compensate phase distortion induced by atmospheric turbulence, wavefront sensorless adaptive optics (WSAO) has been proposed and studied in recent years. In this paper a new version of SPGD called MZ-SPGD is proposed, combining Z-SPGD, based on the deformable mirror influence function, with M-SPGD, based on Zernike polynomials. Numerical simulations show that the hybrid method converges markedly faster while achieving the same compensation effect as Z-SPGD and M-SPGD.
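
    A minimal sketch of the basic SPGD update that the MZ variant builds on, assuming a scalar image-quality metric J(u) to be maximized over actuator commands u; the quadratic toy metric and the gains are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)
      u_true = rng.normal(size=8)              # unknown optimal actuator vector

      def metric(u):
          """Toy image-quality metric, maximal when u matches u_true."""
          return -np.sum((u - u_true) ** 2)

      u = np.zeros(8)
      gain, perturb = 2.0, 0.05
      for _ in range(1000):
          du = perturb * rng.choice([-1.0, 1.0], size=u.shape)  # Bernoulli dither
          dJ = metric(u + du) - metric(u - du)                  # two-sided probe
          u += gain * dJ * du                                   # SPGD update
      print(f"residual error = {np.linalg.norm(u - u_true):.3e}")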

  10. Simulink-aided Design and Implementation of Sensorless BLDC Motor Digital Control System

    NASA Astrophysics Data System (ADS)

    Zhilenkov, A. A.; Tsvetkov, Y. N.; Chistov, V. B.; Nyrkov, A. P.; Sokolov, S. S.

    2017-07-01

    The paper describes the process of creating a digital control system for a brushless direct current motor. The target motor has no speed sensor, so the back-EMF method is used for commutation control. The authors show how to model the control system in MatLab/Simulink and test it on board an STM32F4 microcontroller. This approach yields a flexible system that can be monitored and controlled from a personal computer over communication lines: signals in the actuator circuit can be examined without any external measuring instruments such as testers or oscilloscopes, with waveforms and measured signal values output directly on the host PC.
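
    A hedged sketch of the back-EMF commutation idea: in a simulated floating phase, commutation is triggered when the back-EMF crosses half the DC bus voltage. The waveform and values are illustrative, not taken from the paper's Simulink model.

      import numpy as np

      V_BUS = 12.0                      # assumed DC bus voltage, V
      t = np.linspace(0.0, 0.02, 2000)  # 20 ms of simulated data
      # Idealized back-EMF of the floating phase, offset around V_BUS/2.
      bemf = V_BUS / 2 + 4.0 * np.sin(2 * np.pi * 300 * t)

      half_bus = V_BUS / 2
      rising = np.where(np.diff(np.sign(bemf - half_bus)) > 0)[0]
      commutation_times = t[rising]     # rising zero-crossings -> commutate here
      print(f"{len(commutation_times)} commutation events, first at "
            f"{commutation_times[0]*1e3:.2f} ms")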

  11. Smart sensorless prediction diagnosis of electric drives

    NASA Astrophysics Data System (ADS)

    Kruglova, TN; Glebov, NA; Shoshiashvili, ME

    2017-10-01

    In this paper, a method for diagnosis and prediction of the technical condition of an electric motor using artificial intelligence, based on a combination of fuzzy logic and neural networks, is discussed. The fuzzy sub-model determines the degree of development of each fault. The neural network determines the state of the object as a whole and the number of remaining serviceable work periods for the motor's actuator. The combination of these techniques reduces the learning time and increases the forecasting accuracy. The experimental implementation of the method for electric drive diagnosis and associated equipment was carried out at different speeds. As a result, it was found that this method allows troubleshooting of the drive at any given speed.

  12. Incremental Adaptive Fuzzy Control for Sensorless Stroke Control of A Halbach-type Linear Oscillatory Motor

    NASA Astrophysics Data System (ADS)

    Lei, Meizhen; Wang, Liqiang

    2018-01-01

    The Halbach-type linear oscillatory motor (HT-LOM) is multi-variable, highly coupled, nonlinear, and uncertain, so it is difficult to obtain satisfactory results with conventional PID control. An incremental adaptive fuzzy controller (IAFC) for stroke tracking is presented, combining the merits of PID control, a fuzzy inference mechanism, and an adaptive algorithm. An integral operation is added to the conventional fuzzy control algorithm, and the fuzzy scale factor is tuned online according to the load force and the stroke command. The simulation results indicate that the proposed control scheme achieves satisfactory stroke tracking performance and is robust with respect to parameter variations and external disturbance.
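
    A compact, hypothetical sketch of an incremental (velocity-form) PID core with an online-scaled output gain, in the spirit of the controller described above; the scaling law and gains are assumptions, since the paper's fuzzy rule base is not reproduced here.

      class IncrementalPID:
          """Velocity-form PID: the output changes by du each step, with an
          externally tuned scale factor standing in for the fuzzy factor."""
          def __init__(self, kp, ki, kd):
              self.kp, self.ki, self.kd = kp, ki, kd
              self.e1 = self.e2 = 0.0  # previous two errors
              self.u = 0.0

          def step(self, e, scale=1.0):
              du = (self.kp * (e - self.e1) + self.ki * e
                    + self.kd * (e - 2 * self.e1 + self.e2))
              self.u += scale * du     # scaled incremental update
              self.e2, self.e1 = self.e1, e
              return self.u

      pid = IncrementalPID(kp=0.8, ki=0.3, kd=0.05)
      stroke_cmd, y = 10.0, 0.0
      for _ in range(50):
          # Assumed scaling law: larger commands get a proportionally larger gain.
          scale = 0.5 + 0.05 * abs(stroke_cmd)
          u = pid.step(stroke_cmd - y, scale)
          y += 0.2 * (u - y)           # toy first-order plant
      print(f"stroke after 50 steps: {y:.2f} (command {stroke_cmd})")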

  13. TU-AB-BRD-02: Failure Modes and Effects Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M.

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: Learn how to design a process map for a radiotherapy process; learn how to perform failure modes and effects analysis for a given process; learn what fault trees are all about; learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.

  14. Wear and Adhesive Failure of Al2O3 Powder Coating Sprayed onto AISI H13 Tool Steel Substrate

    NASA Astrophysics Data System (ADS)

    Amanov, Auezhan; Pyun, Young-Sik

    2016-07-01

    In this study, an alumina (Al2O3) ceramic powder was sprayed onto an AISI H13 hot-work tool steel substrate that was subjected to sanding and ultrasonic nanocrystalline surface modification (UNSM) treatment processes. The significance of the UNSM technique for the adhesive failure of the Al2O3 coating and for the hardness of the substrate was investigated. The adhesive failure of the coating sprayed onto sanded and UNSM-treated substrates was investigated by a micro-scratch tester at an incremental load. Based on the obtained results, the coating sprayed onto the UNSM-treated substrate exhibited a better resistance to adhesive failure than the coating sprayed onto the sanded substrate. The dry friction and wear properties of the coatings sprayed onto the sanded and UNSM-treated substrates were assessed by means of a ball-on-disk tribometer against an AISI 52100 steel ball. It was demonstrated that the UNSM technique controllably improved the adhesive failure resistance of the Al2O3 coating, increasing the critical load by about 31%. Thus, it is expected that the application of the UNSM technique to an AISI H13 tool steel substrate prior to coating may delay adhesive failure and improve adhesion between the coating and the substrate thanks to the modified and hardened surface.

  15. Process tool monitoring and matching using interferometry technique

    NASA Astrophysics Data System (ADS)

    Anberg, Doug; Owen, David M.; Mileham, Jeffrey; Lee, Byoung-Ho; Bouche, Eric

    2016-03-01

    The semiconductor industry makes dramatic device technology changes over short time periods. As the industry advances towards the 10 nm device node, more precise management and control of processing tools has become a significant manufacturing challenge. Some processes require multiple tool sets and some tools have multiple chambers for mass production. Tool and chamber matching has become a critical consideration for meeting today's manufacturing requirements. Additionally, process tool and chamber conditions have to be monitored to ensure uniform process performance across the tool and chamber fleet. There are many parameters for managing and monitoring tools and chambers. Particle defect monitoring is a well-known and established example where defect inspection tools can directly detect particles on the wafer surface. However, leading-edge processes are driving the need to also monitor invisible defects (e.g., stress and contamination), because some device failures cannot be directly correlated with traditional visualized defect maps or other known sources. Some failure maps show the same signatures as stress or contamination maps, which implies a correlation to device performance or yield. In this paper we present process tool monitoring and matching using an interferometry technique. There are many types of interferometry techniques used for various process monitoring applications. We use a Coherent Gradient Sensing (CGS) interferometer, which is self-referencing and enables high-throughput measurements. Using this technique, we can quickly measure the topography of an entire wafer surface and obtain stress and displacement data from the topography measurement. For improved tool and chamber matching and reduced device failure, wafer stress measurements can be implemented as a regular tool or chamber monitoring test, for either unpatterned or patterned wafers, as a good criterion for improved process stability.
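
    A hedged sketch of deriving stress from wafer topography using Stoney's formula, which relates film stress to substrate curvature; the paper does not spell out its stress computation, so this relation and all values are assumptions for illustration.

      # Stoney's formula: film stress from substrate curvature (illustrative).
      E_s  = 130e9    # silicon substrate Young's modulus, Pa (assumed)
      nu_s = 0.28     # substrate Poisson ratio (assumed)
      h_s  = 775e-6   # substrate thickness, m (300 mm wafer)
      h_f  = 1.0e-6   # film thickness, m
      R    = 120.0    # measured radius of curvature, m (from topography)

      sigma_f = E_s * h_s**2 / (6.0 * (1.0 - nu_s) * h_f * R)
      print(f"film stress = {sigma_f / 1e6:.1f} MPa")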

  16. Hyper-X Stage Separation: Simulation Development and Results

    NASA Technical Reports Server (NTRS)

    Reubush, David E.; Martin, John G.; Robinson, Jeffrey S.; Bose, David M.; Strovers, Brian K.

    2001-01-01

    This paper provides an overview of stage separation simulation development and results for NASA's Hyper-X program, a focused hypersonic technology effort designed to move hypersonic, airbreathing vehicle technology from the laboratory environment to the flight environment. This paper presents an account of the development of the current 14 degree-of-freedom stage separation simulation tool (SepSim) and results from use of the tool in a Monte Carlo analysis to evaluate the risk of failure for the separation event. Results from use of the tool show that there is only a very small risk of failure in the separation event.
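
    A toy Monte Carlo sketch in the spirit of the risk analysis described above (not SepSim itself): separation parameters are drawn from assumed dispersions and runs violating a hypothetical clearance criterion are counted to estimate a failure probability.

      import numpy as np

      rng = np.random.default_rng(42)
      N = 100_000

      # Assumed dispersions on two illustrative separation parameters.
      relative_accel = rng.normal(3.0, 0.4, N)   # m/s^2, booster-vehicle
      pitch_rate_err = rng.normal(0.0, 1.5, N)   # deg/s at release

      # Hypothetical failure criterion: too little separation acceleration
      # combined with a large pitch-rate error -> re-contact risk.
      fail = (relative_accel < 2.0) & (np.abs(pitch_rate_err) > 3.0)
      print(f"estimated failure probability: {fail.mean():.5f}")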

  17. Microstructure and Mechanical Performance of Friction Stir Spot-Welded Aluminum-5754 Sheets

    NASA Astrophysics Data System (ADS)

    Pathak, N.; Bandyopadhyay, K.; Sarangi, M.; Panda, Sushanta Kumar

    2013-01-01

    Friction stir spot welding (FSSW) is a recent trend for joining lightweight sheet metals in the fabrication of automotive and aerospace body components. For the successful application of this solid-state welding process, it is imperative to have a thorough understanding of the weld microstructure, mechanical performance, and failure mechanism. In the present study, FSSW of aluminum-5754 sheet metal was performed using tools with circular and tapered pins, considering different tool rotational speeds, plunge depths, and dwell times. The effects of tool design and process parameters on temperature distribution near the sheet-tool interface, weld microstructure, weld strength, and failure modes were studied. It was found that the peak temperature was higher while welding with the tool having a circular pin than with the tapered pin, leading to a larger dynamically recrystallized stir zone (SZ) with a hook tip bending towards the upper sheet and away from the keyhole. Hence, a higher lap-shear separation load was observed in the welds made with the circular pin than in those made with the tapered pin. Due to the influence of the size and hardness of the SZ on crack propagation, three different failure modes of the weld nugget were observed through optical cross-sectional micrographs and SEM fractographs.

  18. Failure mode and effects analysis drastically reduced potential risks in clinical trial conduct.

    PubMed

    Lee, Howard; Lee, Heechan; Baik, Jungmi; Kim, Hyunjung; Kim, Rachel

    2017-01-01

    Failure mode and effects analysis (FMEA) is a risk management tool to proactively identify and assess the causes and effects of potential failures in a system, thereby preventing them from happening. The objective of this study was to evaluate the effectiveness of FMEA applied to an academic clinical trial center in a tertiary care setting. A multidisciplinary FMEA focus group at the Seoul National University Hospital Clinical Trials Center selected 6 core clinical trial processes, for which potential failure modes were identified and their risk priority number (RPN) was assessed. Remedial action plans for high-risk failure modes (RPN >160) were devised, and a follow-up RPN scoring was conducted a year later. A total of 114 failure modes were identified, with RPN scores ranging from 3 to 378, driven mainly by the severity score. Fourteen failure modes were of high risk, 11 of which were addressed by remedial actions. Rescoring showed a dramatic improvement, attributed to reductions in the occurrence and detection scores of >3 and >2 points, respectively. FMEA is a powerful tool to improve quality in clinical trials. The Seoul National University Hospital Clinical Trials Center is expanding its FMEA capability to other core clinical trial processes.

  19. Stimulating Creativity and Innovation through Intelligent Fast Failure

    ERIC Educational Resources Information Center

    Tahirsylaj, Armend S.

    2012-01-01

    Literature on creativity and innovation has discussed the issue of failure in the light of its benefits and limitations for enhancing human potential in all domains of life, but in business, science, engineering, and industry more specifically. In this paper, the Intelligent Fast Failure (IFF) as a useful tool of creativity and innovation for…

  20. Prediction of morbidity and mortality in patients with type 2 diabetes.

    PubMed

    Wells, Brian J; Roth, Rachel; Nowacki, Amy S; Arrigain, Susana; Yu, Changhong; Rosenkrans, Wayne A; Kattan, Michael W

    2013-01-01

    Introduction. The objective of this study was to create a tool that accurately predicts the risk of morbidity and mortality in patients with type 2 diabetes according to an oral hypoglycemic agent. Materials and Methods. The model was based on a cohort of 33,067 patients with type 2 diabetes who were prescribed a single oral hypoglycemic agent at the Cleveland Clinic between 1998 and 2006. Competing risk regression models were created for coronary heart disease (CHD), heart failure, and stroke, while a Cox regression model was created for mortality. Propensity scores were used to account for possible treatment bias. A prediction tool was created and internally validated using tenfold cross-validation. The results were compared to a Framingham model and a model based on the United Kingdom Prospective Diabetes Study (UKPDS) for CHD and stroke, respectively. Results and Discussion. Median follow-up for the mortality outcome was 769 days. The numbers of patients experiencing events were as follows: CHD (3062), heart failure (1408), stroke (1451), and mortality (3661). The prediction tools demonstrated the following concordance indices (c-statistics) for the specific outcomes: CHD (0.730), heart failure (0.753), stroke (0.688), and mortality (0.719). The prediction tool was superior to the Framingham model at predicting CHD and was at least as accurate as the UKPDS model at predicting stroke. Conclusions. We created an accurate tool for predicting the risk of stroke, coronary heart disease, heart failure, and death in patients with type 2 diabetes. The calculator is available online at http://rcalc.ccf.org under the heading "Type 2 Diabetes" and entitled, "Predicting 5-Year Morbidity and Mortality." This may be a valuable tool to aid the clinician's choice of an oral hypoglycemic, to better inform patients, and to motivate dialogue between physician and patient.
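
    As a hedged illustration of the c-statistic used to validate the tool above, the sketch below computes a concordance index by brute force: among patient pairs with different outcomes, it counts how often the patient who experienced the event had the higher predicted risk. All data are invented.

      from itertools import combinations

      # (predicted risk, event occurred) pairs: invented data for illustration.
      patients = [(0.82, 1), (0.64, 1), (0.55, 0), (0.47, 1),
                  (0.40, 0), (0.33, 0), (0.21, 0), (0.18, 0)]

      concordant = ties = usable = 0
      for (r1, d1), (r2, d2) in combinations(patients, 2):
          if d1 == d2:
              continue             # only pairs with different outcomes count
          usable += 1
          if r1 == r2:
              ties += 1
          elif (r1 > r2 and d1) or (r2 > r1 and d2):
              concordant += 1      # higher-risk patient had the event

      c_stat = (concordant + 0.5 * ties) / usable
      print(f"c-statistic = {c_stat:.3f}")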

  1. A novel methodology for in-process monitoring of flow forming

    NASA Astrophysics Data System (ADS)

    Appleby, Andrew; Conway, Alastair; Ion, William

    2017-10-01

    Flow forming (FF) is an incremental cold working process with near-net-shape forming capability. Failures by fracture due to high deformation can be unexpected and sometimes catastrophic, causing tool damage. If process failures can be identified in real time, an automatic cut-out could prevent costly tool damage. Sound and vibration monitoring is well established and commercially viable in the machining sector to detect current and incipient process failures, but not for FF. A broad-frequency microphone was used to record the sound signature of the manufacturing cycle for a series of FF parts. Parts were flow formed using single and multiple passes, and flaws were introduced into some of the parts to simulate the presence of spontaneously initiated cracks. The results show that this methodology is capable of identifying both introduced defects and spontaneous failures during flow forming. Further investigation is needed to categorise and identify different modes of failure and identify further potential applications in rotary forming.
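
    A minimal sketch of the kind of signature check this methodology implies (the paper's actual detection criteria are not reproduced): band-limited energy of the microphone signal is compared against a baseline frame, and a sudden excess is flagged as a possible fracture event. The signal and threshold are synthetic.

      import numpy as np

      fs = 48_000                            # sample rate, Hz
      t = np.arange(0, 1.0, 1 / fs)
      # Synthetic forming noise plus a short broadband burst at t = 0.6 s.
      rng = np.random.default_rng(1)
      signal = 0.1 * np.sin(2 * np.pi * 180 * t) + 0.01 * rng.normal(size=t.size)
      burst = (t > 0.600) & (t < 0.605)
      signal[burst] += 0.5 * rng.normal(size=burst.sum())

      frame = 1024
      threshold = 5.0                        # assumed multiple of baseline energy
      baseline = None
      for i in range(0, t.size - frame, frame):
          spec = np.abs(np.fft.rfft(signal[i:i + frame])) ** 2
          hf_energy = spec[200:].sum()       # energy above roughly 9.4 kHz
          if baseline is None:
              baseline = hf_energy           # first frame taken as baseline
          elif hf_energy > threshold * baseline:
              print(f"possible failure event at t = {i / fs:.3f} s")
              break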

  2. TU-AB-BRD-00: Task Group 100

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: Learn how to design a process map for a radiotherapy process; learn how to perform failure modes and effects analysis for a given process; learn what fault trees are all about; learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.

  3. TU-AB-BRD-03: Fault Tree Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunscombe, P.

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: Learn how to design a process map for a radiotherapy process; learn how to perform failure modes and effects analysis for a given process; learn what fault trees are all about; learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.

  4. TU-AB-BRD-01: Process Mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palta, J.

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: Learn how to design a process map for a radiotherapy process; learn how to perform failure modes and effects analysis for a given process; learn what fault trees are all about; learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.

  5. TU-AB-BRD-04: Development of Quality Management Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomadsen, B.

    2015-06-15

    (This talk shares the session abstract reproduced in full above.)

  6. Are Your Students Ready for Anatomy and Physiology? Developing Tools to Identify Students at Risk for Failure

    ERIC Educational Resources Information Center

    Gultice, Amy; Witham, Ann; Kallmeyer, Robert

    2015-01-01

    High failure rates in introductory college science courses, including anatomy and physiology, are common at institutions across the country, and determining the specific factors that contribute to this problem is challenging. To identify students at risk for failure in introductory physiology courses at our open-enrollment institution, an online…

  7. Robust detection, isolation and accommodation for sensor failures

    NASA Technical Reports Server (NTRS)

    Emami-Naeini, A.; Akhter, M. M.; Rock, S. M.

    1986-01-01

    The objective is to extend recent advances in robust control system design of multivariable systems to sensor failure detection, isolation, and accommodation (DIA), and to estimator design. This effort provides analysis tools to quantify the trade-off between performance robustness and DIA sensitivity, which are used to achieve higher levels of performance robustness for given levels of DIA sensitivity. An innovations-based DIA scheme is used. Estimators, which depend upon a model of the process and on process inputs and outputs, are used to generate these innovations. Thresholds used for failure detection are computed from bounds on modeling errors, noise properties, and the class of failures. The applicability of the newly developed tools is demonstrated on a multivariable aircraft turbojet engine example. A new concept called the threshold selector was developed; it represents a significant and innovative tool for the analysis and synthesis of DIA algorithms. The estimators were made robust by the introduction of an internal model and by frequency shaping. The internal model provides asymptotically unbiased filter estimates. The incorporation of frequency shaping of the linear quadratic Gaussian cost functional modifies the estimator design to make it suitable for sensor failure DIA. The results are compared with previous studies, which used thresholds that were selected empirically. Comparison of the two techniques on a nonlinear dynamic engine simulation shows improved performance of the new method over previous techniques.
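
    As a toy illustration of the innovations-based idea (not the paper's turbojet example), the sketch below flags a sensor failure when the estimator residual exceeds a fixed threshold; the plant model, noise level, threshold, and injected bias are all hypothetical.

        import random

        # Toy innovations-based detection: flag a sensor failure when the
        # residual (measurement minus model prediction) exceeds a threshold
        # chosen from assumed noise and modeling-error bounds.
        random.seed(1)
        x = 0.0                 # state of a simple, perfectly known plant
        threshold = 0.5         # bound from assumed noise + model error

        for k in range(100):
            x = 0.95 * x + 0.5                       # plant dynamics (also the model)
            y = x + random.gauss(0.0, 0.05)          # measurement with sensor noise
            if k >= 60:
                y += 1.0                             # sensor bias fault injected at k = 60
            residual = y - x                         # innovation (model prediction is exact here)
            if abs(residual) > threshold:
                print(f"sensor failure flagged at step {k}, residual = {residual:.2f}")
                break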

  8. The Utility of Failure Modes and Effects Analysis of Consultations in a Tertiary, Academic, Medical Center.

    PubMed

    Niv, Yaron; Itskoviz, David; Cohen, Michal; Hendel, Hagit; Bar-Giora, Yonit; Berkov, Evgeny; Weisbord, Irit; Leviron, Yifat; Isasschar, Assaf; Ganor, Arian

    Failure modes and effects analysis (FMEA) is a tool used to identify potential risks in health care processes. We used the FMEA tool to improve the consultation process in an academic medical center. A team of 10 staff members (5 physicians, 2 quality experts, 2 organizational consultants, and 1 nurse) was established. The steps of the consultation process, from ordering to delivery, were mapped. Failure modes were assessed for likelihood of occurrence, detection, and severity, and a risk priority number (RPN) was calculated. An intervention plan was designed according to the highest RPNs. Thereafter, we compared the percentage of completed computer-based documented consultations before and after the intervention. The team identified 3 main categories of failure modes that reached the highest RPNs: initiation of consultation by a junior staff physician without senior approval, failure to document the consultation in the computerized patient registry, and requesting consultation by telephone. The intervention plan included meetings to update knowledge of the consultation request process, stressing the importance of approval by a senior physician, training sessions on closing requests in the patient file, and reporting of telephone requests. The number of electronically documented consultation results and recommendations increased significantly (75%) after the intervention. FMEA is an important and efficient tool for improving the consultation process in an academic medical center.

  9. Construction and Validation of a Questionnaire about Heart Failure Patients' Knowledge of Their Disease

    PubMed Central

    Bonin, Christiani Decker Batista; dos Santos, Rafaella Zulianello; Ghisi, Gabriela Lima de Melo; Vieira, Ariany Marques; Amboni, Ricardo; Benetti, Magnus

    2014-01-01

    Background The lack of tools to measure heart failure patients' knowledge about their syndrome when participating in rehabilitation programs demonstrates the need for specific recommendations regarding the amount or content of information required. Objectives To develop and validate a questionnaire to assess heart failure patients' knowledge about their syndrome when participating in cardiac rehabilitation programs. Methods The tool was developed based on the Coronary Artery Disease Education Questionnaire and applied to 96 patients with heart failure, with a mean age of 60.22 ± 11.6 years, 64% being men. Reproducibility was obtained via the intraclass correlation coefficient, using the test-retest method. Internal consistency was assessed by use of Cronbach's alpha, and construct validity by use of exploratory factor analysis. Results The final version of the tool had 19 questions arranged in ten areas of importance for patient education. The proposed questionnaire had a clarity index of 8.94 ± 0.83. The intraclass correlation coefficient was 0.856, and Cronbach's alpha 0.749. Factor analysis revealed five factors associated with the knowledge areas. Comparing the final scores with the characteristics of the population showed that low educational level and low income are significantly associated with low levels of knowledge. Conclusion The instrument has satisfactory clarity and validity indices, and can be used to assess heart failure patients' knowledge about their syndrome when participating in cardiac rehabilitation programs. PMID:24652054
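
    Cronbach's alpha, used above for internal consistency, follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch with made-up Likert responses (not data from the study):

        import numpy as np

        def cronbach_alpha(items):
            # items: array of shape (n_respondents, k_items).
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        # Hypothetical 1-5 Likert responses: 6 respondents x 4 items.
        data = [[4, 5, 4, 5],
                [3, 3, 4, 3],
                [5, 5, 5, 4],
                [2, 2, 3, 2],
                [4, 4, 4, 5],
                [1, 2, 1, 2]]
        print(f"alpha = {cronbach_alpha(data):.3f}")   # ~0.96 for these correlated items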

  10. Tapered Roller Bearing Damage Detection Using Decision Fusion Analysis

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Kreider, Gary; Fichter, Thomas

    2006-01-01

    A diagnostic tool was developed for detecting fatigue damage of tapered roller bearings, which are used in helicopter transmissions and have potential for use in high-bypass advanced gas turbine aircraft engines. The tool was evaluated experimentally by collecting oil debris data from failure progression tests conducted using health monitoring hardware. Failure progression tests were performed with tapered roller bearings under simulated engine load conditions, on one healthy bearing and three pre-damaged bearings. During each test, data from an on-line, in-line, inductance-type oil debris sensor and three accelerometers were monitored and recorded for the occurrence of bearing failure, and the bearing was removed and inspected periodically for damage progression. Using data fusion techniques, two different monitoring technologies, oil debris analysis and vibration, were integrated into a health monitoring system for detecting bearing surface fatigue pitting damage. The data fusion diagnostic tool was evaluated during bearing failure progression tests under simulated engine load conditions. This integrated system showed improved detection of fatigue damage and health assessment of the tapered roller bearings compared with either health monitoring technology used individually.
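
    The abstract does not spell out the fusion rule, so the sketch below illustrates decision-level fusion in its simplest form: each indicator is normalized against assumed healthy/failed calibration points and combined with hypothetical weights (the 50 mg, 5 g, and 0.6/0.4 values are invented for illustration, not from the study).

        def normalize(value, healthy, failed):
            # Map a raw indicator onto [0, 1] between assumed healthy/failed levels.
            return min(max((value - healthy) / (failed - healthy), 0.0), 1.0)

        def fused_damage_index(debris_mg, vib_rms_g, w_debris=0.6, w_vib=0.4):
            # Hypothetical calibration points and weights.
            d = normalize(debris_mg, healthy=0.0, failed=50.0)   # oil debris mass, mg
            v = normalize(vib_rms_g, healthy=0.5, failed=5.0)    # vibration RMS, g
            return w_debris * d + w_vib * v

        print(fused_damage_index(debris_mg=32.0, vib_rms_g=2.1))  # ~0.53: damaged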

  11. Value of Telemonitoring and Telemedicine in Heart Failure Management

    PubMed Central

    Alderighi, Camilla; Rasoini, Raffaele; Mazzanti, Marco; Casolo, Giancarlo

    2017-01-01

    The use of telemonitoring and telemedicine is a relatively new but quickly developing area in medicine. As new digital tools and applications are being created and used to manage medical conditions such as heart failure, many implications require close consideration and further study, including the effectiveness and safety of these telemonitoring tools in diagnosing, treating and managing heart failure compared to traditional face-to-face doctor–patient interaction. When compared to multidisciplinary intervention programs which are frequently hindered by economic, geographic and bureaucratic barriers, non-invasive remote monitoring could be a solution to support and promote the care of patients over time. Therefore it is crucial to identify the most relevant biological parameters to monitor, which heart failure sub-populations may gain real benefits from telehealth interventions and in which specific healthcare subsets these interventions should be implemented in order to maximise value. PMID:29387464

  12. Value of Telemonitoring and Telemedicine in Heart Failure Management.

    PubMed

    Gensini, Gian Franco; Alderighi, Camilla; Rasoini, Raffaele; Mazzanti, Marco; Casolo, Giancarlo

    2017-11-01

    The use of telemonitoring and telemedicine is a relatively new but quickly developing area in medicine. As new digital tools and applications are being created and used to manage medical conditions such as heart failure, many implications require close consideration and further study, including the effectiveness and safety of these telemonitoring tools in diagnosing, treating and managing heart failure compared to traditional face-to-face doctor-patient interaction. When compared to multidisciplinary intervention programs which are frequently hindered by economic, geographic and bureaucratic barriers, non-invasive remote monitoring could be a solution to support and promote the care of patients over time. Therefore it is crucial to identify the most relevant biological parameters to monitor, which heart failure sub-populations may gain real benefits from telehealth interventions and in which specific healthcare subsets these interventions should be implemented in order to maximise value.

  13. Failure mode and effects analysis drastically reduced potential risks in clinical trial conduct

    PubMed Central

    Baik, Jungmi; Kim, Hyunjung; Kim, Rachel

    2017-01-01

    Background Failure mode and effects analysis (FMEA) is a risk management tool used to proactively identify and assess the causes and effects of potential failures in a system, thereby preventing them from happening. The objective of this study was to evaluate the effectiveness of FMEA applied to an academic clinical trial center in a tertiary care setting. Methods A multidisciplinary FMEA focus group at the Seoul National University Hospital Clinical Trials Center selected 6 core clinical trial processes, for which potential failure modes were identified and their risk priority number (RPN) assessed. Remedial action plans for high-risk failure modes (RPN >160) were devised, and a follow-up RPN scoring was conducted a year later. Results A total of 114 failure modes were identified, with RPN scores ranging from 3 to 378, driven mainly by the severity score. Fourteen failure modes were high risk, 11 of which were addressed by remedial actions. Rescoring showed a dramatic improvement, attributable to reductions in the occurrence and detection scores of >3 and >2 points, respectively. Conclusions FMEA is a powerful tool to improve quality in clinical trials. The Seoul National University Hospital Clinical Trials Center is expanding its FMEA capability to other core clinical trial processes. PMID:29089745

  14. Failure mode analysis to predict product reliability.

    NASA Technical Reports Server (NTRS)

    Zemanick, P. P.

    1972-01-01

    The failure mode analysis (FMA) is described as a design tool to predict and improve product reliability. The objectives of the failure mode analysis are presented as they influence component design, configuration selection, the product test program, the quality assurance plan, and engineering analysis priorities. The detailed mechanics of performing a failure mode analysis are discussed, including one suggested format. Some practical difficulties of implementation are indicated, drawn from experience with preparing FMAs on the nuclear rocket engine program.

  15. Fracture - An Unforgiving Failure Mode

    NASA Technical Reports Server (NTRS)

    Goodin, James Ronald

    2006-01-01

    During the 2005 Conference for the Advancement of Space Safety, after a typical presentation of safety tools, a Russian in the audience simply asked, "How does that affect the hardware?" Having participated in several International System Safety Conferences, I recalled that most attention is dedicated to safety tools and little, if any, to hardware. This paper on the hazard of fracture and the failure modes associated with fracture is my attempt to draw attention to the grass roots of system safety: improving hardware robustness and resilience.

  16. TEAM-HF Cost-Effectiveness Model: A Web-Based Program Designed to Evaluate the Cost-Effectiveness of Disease Management Programs in Heart Failure

    PubMed Central

    Reed, Shelby D.; Neilson, Matthew P.; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H.; Polsky, Daniel E.; Graham, Felicia L.; Bowers, Margaret T.; Paul, Sara C.; Granger, Bradi B.; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara; Levy, Wayne C.

    2015-01-01

    Background Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. Methods We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics, use of evidence-based medications, and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model (SHFM). Projections of resource use and quality of life are modeled using relationships with time-varying SHFM scores. The model can be used to evaluate parallel-group and single-cohort designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. Results The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. Conclusion The TEAM-HF Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. PMID:26542504
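
    The incremental cost-effectiveness ratios such a model reports come down to a simple quotient of cost and effectiveness differences; a worked example with hypothetical numbers (not TEAM-HF outputs):

        # ICER = (cost_new - cost_comparator) / (effect_new - effect_comparator).
        # The numbers below are hypothetical, not outputs of the TEAM-HF model.
        cost_dm, cost_usual = 48_000.0, 42_000.0   # discounted lifetime costs, $
        qaly_dm, qaly_usual = 4.10, 3.95           # quality-adjusted life-years

        icer = (cost_dm - cost_usual) / (qaly_dm - qaly_usual)
        print(f"ICER = ${icer:,.0f} per QALY gained")   # $40,000 per QALY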

  17. An Acuity Tool for Heart Failure Case Management: Quantifying Workload, Service Utilization, and Disease Severity.

    PubMed

    Kilgore, Matthew D

    The cardiology service line director at a health maintenance organization (HMO) in Washington State required a valid, reliable, and practical means of measuring workloads and other productivity factors for six heart failure (HF) registered nurse case managers located across three geographical regions. The Kilgore Heart Failure Case Management (KHFCM) Acuity Tool was systematically designed, developed, and validated to measure workload as a dependent function of the number of heart failure case management (HFCM) services rendered and the duration of time spent on various care duties. Research and development occurred at various HMO-affiliated internal medicine and cardiology offices throughout Western Washington. The concepts, methods, and principles used to develop the KHFCM Acuity Tool are applicable for any type of health care professional aiming to quantify workload using a high-quality objective tool; the content matter, scaling, and language of the tool are specific to HFCM settings. The content matter and numeric scales were developed and validated using a mixed-method participant action research design applied to a group of six outpatient HF case managers and their respective caseloads. This method was selected because it requires research participants to become directly involved in diagnosing research problems, planning and executing actions to address those problems, and implementing progressive strategies throughout the course of the study, as necessary, to produce the most credible and practical practice improvements. Heart failure case managers served clients with New York Heart Association Functional Class III-IV HF, and encounters were conducted primarily by telephone or in-office consultation. A mix of qualitative and quantitative results demonstrated a variety of quality improvement outcomes achieved by the design and practice application of the KHFCM Acuity Tool, including a more valid reflection of encounter times and demonstration of the tool as a reliable, practical, credible, and satisfying means of reflecting HF case manager workloads and HF disease severity. The KHFCM Acuity Tool defines workload simply as a function of the number of HFCM services performed and the duration of time spent on a client encounter. The design of the tool facilitates measuring workload, service utilization, and HF disease characteristics independently from the overall measure of acuity, so that differences in individual case manager practice, as well as client characteristics within sites, across sites, and potentially across annual seasons, can be demonstrated. Data produced from long-term applications of the tool, across all regions, could serve as a driver for establishing systemwide HFCM productivity benchmarks or standards of practice for HF case managers; data from localized applications could serve as a reference for coordinating staffing resources or developing productivity benchmarks within individual regions or sites.

  18. German disease management guidelines: surgical therapies for chronic heart failure.

    PubMed

    Sindermann, J R; Klotz, S; Rahbar, K; Hoffmeier, A; Drees, G

    2010-02-01

    The German Disease Management Guideline "Chronic Heart Failure" is intended to guide physicians in the diagnosis and treatment of heart failure, providing a tool grounded in evidence-based medicine. This short review gives insight into the role of several surgical treatment options for heart failure, such as revascularization, ventricular reconstruction and aneurysmectomy, mitral valve reconstruction, ventricular assist devices, and heart transplantation.

  19. Factors Influencing Progressive Failure Analysis Predictions for Laminated Composite Structure

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    2008-01-01

    Progressive failure material modeling methods used for structural analysis, including failure initiation and material degradation, are presented. Different failure initiation criteria and material degradation models that define progressive failure formulations are described. These progressive failure formulations are implemented in a user-defined material model for use with a nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach, in which the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details are described in the present paper. Parametric studies of laminated composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented and to demonstrate their influence on progressive failure analysis predictions.
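
    As a minimal sketch of one of the criteria named above, the code below applies a per-component maximum stress check and a ply-discounting knockdown; the allowables and the 0.1 degradation factor are hypothetical, not values from the paper.

        # Maximum-stress failure check with ply discounting: when a ply stress
        # exceeds its allowable, the corresponding modulus is knocked down.
        ALLOWABLES = {"s11": 1500.0, "s22": 40.0, "s12": 70.0}   # MPa, hypothetical
        DEGRADE = 0.1                                            # hypothetical knockdown

        def check_ply(stresses, moduli):
            # stresses/moduli keyed by component; degrade moduli of failed modes.
            failed = [c for c, s in stresses.items() if abs(s) > ALLOWABLES[c]]
            for c in failed:
                moduli[c] *= DEGRADE          # ply-discounting: local stiffness loss
            return failed, moduli

        stresses = {"s11": 900.0, "s22": 55.0, "s12": 30.0}      # MPa
        moduli = {"s11": 140e3, "s22": 10e3, "s12": 5e3}         # MPa
        print(check_ply(stresses, moduli))    # the transverse (s22) mode fails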

  20. A Automated Tool for Supporting FMEAs of Digital Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue,M.; Chu, T.-L.; Martinez-Guridi, G.

    2008-09-07

    Although designs of digital systems can be very different from each other, they typically use many of the same types of generic digital components. Determining the impacts of the failure modes of these generic components on a digital system can support development of a reliability model of the system. A novel approach was previously proposed for this purpose: decomposing the system to the level of the generic digital components and propagating failure modes up to the system level, which is generally time-consuming and difficult to implement. To overcome the issues associated with implementing the proposed FMEA approach, an automated tool for a digital feedwater control system (DFWCS) has been developed in this study. The automated FMEA tool is essentially a simulation platform built by using or recreating the original source code of the different software modules, interfaced through input and output variables that represent the physical signals exchanged between modules, the system, and the controlled process. For any given failure mode, its impacts on the associated signals are determined first, and the variables that correspond to these signals are modified accordingly by the simulation. Criteria were also developed, as part of the simulation platform, to determine whether the system has lost its automatic control function, which is defined as a system failure in this study. The conceptual development of the automated FMEA support tool can be generalized and applied to support FMEAs for reliability assessment of complex digital systems.
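
    The general idea, simulated fault injection through module I/O variables plus a system-failure criterion, can be sketched on a toy two-module control loop. The modules, signals, and the 40-60 level band below are hypothetical, not the DFWCS software.

        # Toy automated-FMEA idea: inject a component failure mode into one
        # module's output and let a criterion decide whether the system-level
        # function is lost.
        def sensor_module(level, fault=None):
            return level if fault is None else fault       # e.g. stuck-at faults

        def controller_module(measured, setpoint=50.0):
            return max(min(0.5 * (setpoint - measured), 10.0), -10.0)  # clamped demand

        def run_case(fault):
            level, history = 50.0, []
            for _ in range(200):
                demand = controller_module(sensor_module(level, fault))
                level += 0.1 * demand                      # simple plant response
                history.append(level)
            # System-failure criterion: the level must settle inside the band.
            return all(40.0 <= h <= 60.0 for h in history[-50:])

        for fault in (None, 0.0, 100.0):                   # healthy, stuck-low, stuck-high
            print(f"fault={fault}: {'OK' if run_case(fault) else 'SYSTEM FAILURE'}")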

  1. Pressure-induced critical influences on workpiece-tool thermal interaction in high speed dry machining of titanium

    NASA Astrophysics Data System (ADS)

    Abdel-Aal, H. A.; Mansori, M. El

    2012-12-01

    Cutting tools are subject to extreme thermal and mechanical loads during operation. The loading is intensified in a dry cutting environment, especially when cutting so-called hard-to-cut materials. Although the effect of mechanical loads on tool failure has been extensively studied, detailed studies on the effect of thermal dissipation on the deterioration of the cutting tool are rather scarce. In this paper we study the failure of coated carbide tools due to thermal loading. The study emphasizes the role played by the thermo-physical properties of the tool material in enhancing or preventing mass attrition of the cutting elements within the tool. It is shown that, within a comprehensive view of the nature of conduction in the tool zone, thermal conduction is not solely affected by temperature; rather, it is a function of the so-called thermodynamic forces: the stress, the strain, the strain rate, the rate of temperature rise, and the temperature gradient. Although this description of thermal conduction is non-linear, it is beneficial to employ such a form because it facilitates a full mechanistic understanding of the thermal activation of tool wear.

  2. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    PubMed

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.

  3. Pilot study of an Internet patient-physician communication tool for heart failure disease management.

    PubMed

    Wu, Robert C; Delgado, Diego; Costigan, Jeannine; Ross, Heather; MacIver, Jane

    2006-01-01

    Internet disease management has the promise of improving care in patients with heart failure but evidence supporting its use is limited. We have designed a Heart Failure Internet Communication Tool (HFICT), allowing patients to enter messages for clinicians, as well as their daily symptoms, weight, blood pressure and heart rate. Clinicians review the information on the same day and provide feedback. This pilot study evaluated the feasibility and patients' acceptability of using the Internet to communicate with patients with symptomatic heart failure. Patients with symptomatic heart failure were instructed how to use the Internet communication tool. The primary outcome measure was the proportion of patients who used the system regularly by entering information on average at least once per week for at least 3 months. Secondary outcomes measures included safety and maintainability of the tool. We also conducted a content analysis of a subset of the patient and clinician messages entered into the comments field. Between 3 May 1999 and 1 November 2002, 62 patients (mean age 48.7 years) were enrolled. At 3 months 58 patients were alive and without a heart transplant. Of those, 26 patients (45%; 95% Confidence Interval, 0.33-0.58) continued using the system at 3 months. In 97% of all entries by participants weight was included; 68% of entries included blood pressure; and 71% of entries included heart rate. In 3,386 entries out of all 5,098 patient entries (66%), comments were entered. Functions that were not used included the tracking of diuretics, medications and treatment goals. The tool appeared to be safe and maintainable. Workload estimates for clinicians for entering a response to each patient's entry ranged from less than a minute to 5 minutes or longer for a detailed response. Patients sent 3,386 comments to the Heart Function Clinic. Based on the content analysis of 100 patient entries, the following major categories of communication were identified: patient information; patient symptoms; patient questions regarding their condition; patient coordinating own care; social responses. The number of comments decreased over time for both patients and clinicians. While the majority of patients discontinued use, 45% of the patients used the system and continued to use it on average for 1.5 years. An Internet tool is a feasible method of communication in a substantial proportion of patients with heart failure. Further study is required to determine whether clinical outcomes, such as quality of life or frequency of hospitalization, are improved.

  4. A Statistical Project Control Tool for Engineering Managers

    NASA Technical Reports Server (NTRS)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, all of which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failure is increasing; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method for three successful projects and three failed projects are reviewed, with success and failure defined by the owner.

  5. A Drive Method of Permanent Magnet Synchronous Motor Using Torque Angle Estimation without Position Sensor

    NASA Astrophysics Data System (ADS)

    Tanaka, Takuro; Takahashi, Hisashi

    In some motor applications it is very difficult to attach a position sensor to the motor housing; one example is the dental handpiece motor. In such designs, the motor must be driven with high efficiency at low speed and under variable load without a position sensor. We developed a method to control a motor efficiently and smoothly at low speed without a position sensor. In this paper, a method in which a permanent magnet synchronous motor is controlled smoothly and efficiently by using torque angle control in synchronized operation is presented. Its usefulness is confirmed by experimental results. In conclusion, the proposed sensorless control method achieves very efficient and smooth operation.
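
    The paper's torque-angle method is not reproduced here; the sketch below only illustrates the basic sensorless principle that speed can be inferred electrically, using the steady-state relation v = R i + Ke omega with hypothetical motor parameters.

        # Simplest sensorless principle: in steady state v = R*i + Ke*omega,
        # so rotor speed can be estimated electrically as omega = (v - R*i)/Ke.
        # R and KE are hypothetical motor parameters, not the paper's values.
        R, KE = 0.8, 0.02       # winding resistance (ohm), back-EMF constant (V*s/rad)

        def estimated_speed(v, i):
            return (v - R * i) / KE                        # rad/s

        print(estimated_speed(v=6.0, i=2.5))               # (6.0 - 2.0) / 0.02 = 200 rad/s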

  6. Development of High Temperature Electro-Magnetic Actuators (HTEMA) for Aircraft Propulsion Systems (Preprint)

    DTIC Science & Technology

    2013-05-01

    … an 18-inch gap diameter has roughly a 2-foot outer diameter … machines that do not require PMs include wound-rotor DC (brush and brushless), variable or switched reluctance (VR or SR) machines, and squirrel-cage induction motors … Trades have identified brushless DC PM and SR machines as being of primary interest. Both motors can use sensorless commutation methods. A VR resolver can …

  7. Sensor-less force-reflecting macro-micro telemanipulation systems by piezoelectric actuators.

    PubMed

    Amini, H; Farzaneh, B; Azimifar, F; Sarhan, A A D

    2016-09-01

    This paper establishes a novel control strategy for a nonlinear bilateral macro-micro teleoperation system with time delay. Besides position and velocity signals, force signals are additionally utilized in the control scheme. This modification significantly improves the poor transparency during contact with the environment. To eliminate external force measurement, a force estimation algorithm is proposed for the master and slave robots. The closed-loop stability of the nonlinear macro-micro teleoperation system with the proposed control scheme is investigated employing Lyapunov theory. The experimental results verify the efficiency of the new control scheme in free motion and during collision between the slave robot and the environment, as well as the efficiency of the force estimation algorithm.

  8. Adaptive optics compensation of orbital angular momentum beams with a modified Gerchberg-Saxton-based phase retrieval algorithm

    NASA Astrophysics Data System (ADS)

    Chang, Huan; Yin, Xiao-li; Cui, Xiao-zhou; Zhang, Zhi-chao; Ma, Jian-xin; Wu, Guo-hua; Zhang, Li-jia; Xin, Xiang-jun

    2017-12-01

    Practical orbital angular momentum (OAM)-based free-space optical (FSO) communications commonly experience serious performance degradation and crosstalk due to atmospheric turbulence. In this paper, we propose a wave-front sensorless adaptive optics (WSAO) system with a modified Gerchberg-Saxton (GS)-based phase retrieval algorithm to correct distorted OAM beams. We use the spatial phase perturbation (SPP) GS algorithm with a distorted probe Gaussian beam as the only input. The principle and parameter selections of the algorithm are analyzed, and the performance of the algorithm is discussed. The simulation results show that the proposed adaptive optics (AO) system can significantly compensate for distorted OAM beams in single-channel or multiplexed OAM systems, which provides new insights into adaptive correction systems using OAM beams.
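
    The spatial-phase-perturbation GS variant used in the paper is not reproduced here; as a baseline, the sketch below implements the classic two-plane Gerchberg-Saxton iteration with NumPy on a synthetic tilted Gaussian beam (the grid size, beam width, and tilt are arbitrary choices).

        import numpy as np

        def gerchberg_saxton(src_amp, far_amp, iters=200):
            # Classic GS loop: iterate between source plane and far field,
            # imposing the measured amplitude in each plane and keeping the
            # computed phase.
            phase = np.zeros_like(src_amp)
            for _ in range(iters):
                far = np.fft.fft2(src_amp * np.exp(1j * phase))
                far = far_amp * np.exp(1j * np.angle(far))  # far-field amplitude constraint
                near = np.fft.ifft2(far)
                phase = np.angle(near)                      # source amplitude re-imposed next pass
            return phase

        # Synthetic test: a Gaussian beam carrying a known tilt aberration.
        n = 64
        y, x = np.mgrid[-1:1:complex(0, n), -1:1:complex(0, n)]
        src_amp = np.exp(-(x**2 + y**2) / 0.2)
        true_phase = 3.0 * x                                # tilt, radians
        far_amp = np.abs(np.fft.fft2(src_amp * np.exp(1j * true_phase)))
        est = gerchberg_saxton(src_amp, far_amp)
        # 'est' approximates true_phase (up to a piston offset and the usual
        # twin-image ambiguity) wherever the beam carries energy.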

  9. Clinical risk analysis with failure mode and effect analysis (FMEA) model in a dialysis unit.

    PubMed

    Bonfant, Giovanna; Belfanti, Pietro; Paternoster, Giuseppe; Gabrielli, Danila; Gaiter, Alberto M; Manes, Massimo; Molino, Andrea; Pellu, Valentina; Ponzetti, Clemente; Farina, Massimo; Nebiolo, Pier E

    2010-01-01

    The aim of clinical risk management is to improve the quality of care provided by health care organizations and to assure patients' safety. Failure mode and effect analysis (FMEA) is a tool employed for clinical risk reduction. We applied FMEA to chronic hemodialysis outpatients. The FMEA steps were: (i) process study: we recorded phases and activities. (ii) Hazard analysis: we listed activity-related failure modes and their effects, described control measures, assigned severity, occurrence and detection scores for each failure mode, and calculated risk priority numbers (RPNs) by multiplying the 3 scores; the total RPN is calculated by adding the individual failure mode RPNs. (iii) Planning: we prioritized the RPNs on a priority matrix taking into account the 3 scores, analyzed the causes of the failure modes, made recommendations and planned new control measures. (iv) Monitoring: after failure mode elimination or reduction, we compared the resulting RPN with the previous one. Our failure modes with the highest RPNs came from communication and organization problems. Two tools were created to improve information flow: "dialysis agenda" software and nursing datasheets. We scheduled nephrological examinations, and we changed both the medical and the nursing organization. The total RPN value decreased from 892 to 815 (8.6%) after reorganization. Employing FMEA, we worked on a few critical activities and reduced patients' clinical risk. A priority matrix also takes into account the weight of the control measures: we believe this evaluation is quick, because priority selection is simple, and that it shortens action times.

  10. A Framework for Creating a Function-based Design Tool for Failure Mode Identification

    NASA Technical Reports Server (NTRS)

    Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Knowledge of potential failure modes during design is critical for the prevention of failures. Currently, industries use procedures such as Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis, or Failure Modes, Effects and Criticality Analysis (FMECA), as well as knowledge and experience, to determine potential failure modes. When new products are being developed, there is often a lack of sufficient knowledge of potential failure modes and/or a lack of sufficient experience to identify all failure modes. This gives rise to a situation in which engineers are unable to extract the maximum benefit from the above procedures. This work describes a function-based failure identification methodology that acts as a storehouse of information and experience, providing useful information about the potential failure modes for the design under consideration and enhancing the usefulness of procedures like FMEA. As an example, the method is applied to fifteen products and the benefits are illustrated.

  11. Integrative Assessment of Congestion in Heart Failure Throughout the Patient Journey.

    PubMed

    Girerd, Nicolas; Seronde, Marie-France; Coiro, Stefano; Chouihed, Tahar; Bilbault, Pascal; Braun, François; Kenizou, David; Maillier, Bruno; Nazeyrollas, Pierre; Roul, Gérard; Fillieux, Ludivine; Abraham, William T; Januzzi, James; Sebbag, Laurent; Zannad, Faiez; Mebazaa, Alexandre; Rossignol, Patrick

    2018-04-01

    Congestion is one of the main predictors of poor outcome in patients with heart failure. However, congestion is difficult to assess, especially when symptoms are mild. Although numerous clinical scores, imaging tools, and biological tests are available to assist physicians in ascertaining and quantifying congestion, not all are appropriate for use at all stages of patient management. In recent years, multidisciplinary management in the community has become increasingly important to prevent heart failure hospitalizations. Electronic alert systems and communication platforms are emerging that could facilitate patient home monitoring and identify congestion from heart failure decompensation at an earlier stage. The multidisciplinary working group, which consisted of cardiologists, emergency physicians, and a nephrologist with both clinical and research backgrounds, reviewed the current literature regarding the various scores, tools, and tests to detect and quantify congestion. This paper describes the role of each detection method at key stages of patient care, namely pre-admission, admission to the emergency department, in-hospital management, and, lastly, discharge and continued monitoring in the community; it also discusses the advantages of telemedicine as a means of providing truly integrated patient care.

  12. Failure mode and effect analysis: improving intensive care unit risk management processes.

    PubMed

    Askari, Roohollah; Shafii, Milad; Rafiei, Sima; Abolhassani, Mohammad Sadegh; Salarikhah, Elaheh

    2017-04-18

    Purpose Failure modes and effects analysis (FMEA) is a practical tool for evaluating risks, discovering failures in a proactive manner and proposing corrective actions to reduce or eliminate potential risks. The purpose of this paper is to apply the FMEA technique to examine the hazards associated with the process of service delivery in the intensive care unit (ICU) of a tertiary hospital in Yazd, Iran. Design/methodology/approach This was a before-after study conducted between March 2013 and December 2014. By forming an FMEA team, all potential hazards associated with ICU services, together with their frequency and severity, were identified. A risk priority number was then calculated for each activity as an indicator of high-priority areas needing special attention and resource allocation. Findings Eight failure modes with the highest priority scores, including endotracheal tube defects, wrong placement of the endotracheal tube, EVD interface failure, aspiration failure during suctioning, chest tube failure, tissue injury and deep vein thrombosis, were selected for improvement. The findings affirmed that the improvement strategies were generally satisfactory and significantly decreased total failures. Practical implications Application of FMEA in ICUs proved effective in proactively decreasing the risk of failures and brought the control measures up to acceptable levels in all eight areas of function. Originality/value Using a prospective risk assessment approach such as FMEA can be beneficial in dealing with potential failures by proposing preventive actions in a proactive manner. The method can be used as a tool for continuous quality improvement in health care, since it identifies both systemic and human errors and offers practical advice for dealing effectively with them.

  13. A dual-mode generalized likelihood ratio approach to self-reorganizing digital flight control system design

    NASA Technical Reports Server (NTRS)

    Bueno, R.; Chow, E.; Gershwin, S. B.; Willsky, A. S.

    1975-01-01

    Research is reported on the problems of failure detection and reliable system design for digital aircraft control systems. Failure modes, cross-detection probability, wrong-time detection, application of performance tools, and the GLR computer package are discussed.

  14. Acoustic emission and nondestructive evaluation of biomaterials and tissues.

    PubMed

    Kohn, D H

    1995-01-01

    Acoustic emission (AE) is an acoustic wave generated by the release of energy from localized sources in a material subjected to an externally applied stimulus. This technique may be used nondestructively to analyze tissues, materials, and biomaterial/tissue interfaces. Applications of AE include use as an early warning tool for detecting tissue and material defects and incipient failure, monitoring damage progression, predicting failure, characterizing failure mechanisms, and serving as a tool to aid in understanding material properties and structure-function relations. All these applications may be performed in real time. This review discusses general principles of AE monitoring and the use of the technique in 3 areas of importance to biomedical engineering: (1) analysis of biomaterials, (2) analysis of tissues, and (3) analysis of tissue/biomaterial interfaces. Focus in these areas is on detection sensitivity, methods of signal analysis in both the time and frequency domains, the relationship between acoustic signals and microstructural phenomena, and the uses of the technique in establishing a relationship between signals and failure mechanisms.

  15. Investigation of Tapered Roller Bearing Damage Detection Using Oil Debris Analysis

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Kreider, Gary; Fichter, Thomas

    2006-01-01

    A diagnostic tool was developed for detecting fatigue damage to tapered roller bearings. Tapered roller bearings are used in helicopter transmissions and have potential for use in high bypass advanced gas turbine aircraft engines. This diagnostic tool was developed and evaluated experimentally by collecting oil debris data from failure progression tests performed by The Timken Company in their Tapered Roller Bearing Health Monitoring Test Rig. Failure progression tests were performed under simulated engine load conditions. Tests were performed on one healthy bearing and three predamaged bearings. During each test, data from an on-line, in-line, inductance type oil debris sensor was monitored and recorded for the occurrence of debris generated during failure of the bearing. The bearing was removed periodically for inspection throughout the failure progression tests. Results indicate the accumulated oil debris mass is a good predictor of damage on tapered roller bearings. The use of a fuzzy logic model to enable an easily interpreted diagnostic metric was proposed and demonstrated.

  16. Investigation of improving MEMS-type VOA reliability

    NASA Astrophysics Data System (ADS)

    Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.

    2003-12-01

    MEMS technologies have been applied in many areas, such as optical communications, gyroscopes, and bio-medical components. In the optical communication field, MEMS technologies are essential, especially in multi-dimensional optical switches and variable optical attenuators (VOAs). This paper describes the process for the development of MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs are fabricated by a silicon micro-machining process, precise fibre alignment, and a sophisticated packaging process. Because such a device is composed of many structures with various materials, it is difficult to make it reliable. We developed MEMS-type VOAs with many failure mode considerations (FMEA: failure mode effect analysis) in the initial design step, predicted critical failure factors, revised the design, and confirmed the reliability by preliminary tests. The predicted failure factors were moisture, the bonding strength of the wire between the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t-test, and so on) were used to control these potential failure factors and produce optimum manufacturing conditions. In summary, we successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors and using statistical quality control tools. The developed VOAs passed international reliability standards (Telcordia GR-1221-CORE).

  17. Investigation of improving MEMS-type VOA reliability

    NASA Astrophysics Data System (ADS)

    Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.

    2004-01-01

    MEMS technologies have been applied in many areas, such as optical communications, gyroscopes, and bio-medical components. In the optical communication field, MEMS technologies are essential, especially in multi-dimensional optical switches and variable optical attenuators (VOAs). This paper describes the process for the development of MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs are fabricated by a silicon micro-machining process, precise fibre alignment, and a sophisticated packaging process. Because such a device is composed of many structures with various materials, it is difficult to make it reliable. We developed MEMS-type VOAs with many failure mode considerations (FMEA: failure mode effect analysis) in the initial design step, predicted critical failure factors, revised the design, and confirmed the reliability by preliminary tests. The predicted failure factors were moisture, the bonding strength of the wire between the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t-test, and so on) were used to control these potential failure factors and produce optimum manufacturing conditions. In summary, we successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors and using statistical quality control tools. The developed VOAs passed international reliability standards (Telcordia GR-1221-CORE).

  18. The integration of FMEA with other problem solving tools: A review of enhancement opportunities

    NASA Astrophysics Data System (ADS)

    Ng, W. C.; Teh, S. Y.; Low, H. C.; Teoh, P. C.

    2017-09-01

    Failure Mode Effect Analysis (FMEA) is one of the most effective and widely accepted problem solving (PS) tools in companies around the world. Since FMEA was first introduced in 1949, practitioners have implemented it in various industries for their quality improvement initiatives. However, studies have shown that there are drawbacks that hinder the effectiveness of FMEA for continuous quality improvement from product design to manufacturing. FMEA has therefore been integrated with other PS tools, such as the inventive problem solving methodology (TRIZ), Quality Function Deployment (QFD), Root Cause Analysis (RCA) and the seven basic tools of quality, to address these drawbacks. This study begins by identifying the drawbacks of FMEA. A comprehensive literature review on the integration of FMEA with other tools is then carried out to categorise the integrations based on the drawbacks identified. The three categories are inefficiency of failure analysis, psychological inertia and neglect of the customers' perspective. The study concludes by discussing the gaps and opportunities in these integrations for future research.

  19. Lifetime evaluation of large format CMOS mixed signal infrared devices

    NASA Astrophysics Data System (ADS)

    Linder, A.; Glines, Eddie

    2015-09-01

    New large-scale foundry processes continue to produce reliable products, and these large-format devices continue to use industry best practices to screen for failure mechanisms and validate their long lifetimes. Failure-in-time analysis, in conjunction with foundry qualification information, can be used to evaluate large-format device lifetimes. This analysis is a helpful tool when zero-failure life tests are typical. The reliability of the device is estimated by applying the failure rate to the use conditions. JEDEC publications continue to provide the industry-accepted methods.

  20. Failure Modes and Effects Analysis (FMEA): A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, which helps managers understand vulnerabilities of systems, as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index of the major subject terms.

  1. Program Helps In Analysis Of Failures

    NASA Technical Reports Server (NTRS)

    Stevenson, R. W.; Austin, M. E.; Miller, J. G.

    1993-01-01

    Failure Environment Analysis Tool (FEAT) computer program developed to enable people to see and better understand effects of failures in system. User selects failures from either engineering schematic diagrams or digraph-model graphics, and effects or potential causes of failures highlighted in color on same schematic-diagram or digraph representation. Uses digraph models to answer two questions: What will happen to system if set of failure events occurs? and What are possible causes of set of selected failures? Helps design reviewers understand exactly what redundancies built into system and where there is need to protect weak parts of system or remove them by redesign. Program also useful in operations, where it helps identify causes of failure after they occur. FEAT reduces costs of evaluation of designs, training, and learning how failures propagate through system. Written using Macintosh Programmers Workshop C v3.1. Can be linked with CLIPS 5.0 (MSC-21927, available from COSMIC).
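
    FEAT's two questions map naturally onto reachability in a directed graph: forward reachability from a set of failure events gives their effects, and reachability in the reversed graph gives candidate causes. A minimal sketch on a hypothetical digraph (not a FEAT model):

        from collections import deque

        # Hypothetical failure-propagation digraph: an edge A -> B means
        # "failure of A can cause failure of B".
        graph = {
            "pump": ["coolant_loss"],
            "coolant_loss": ["overheat"],
            "overheat": ["cpu_fault", "sensor_drift"],
            "cpu_fault": ["control_loss"],
            "sensor_drift": [],
            "control_loss": [],
        }

        def reachable(g, start):
            # Breadth-first reachability from a set of failure events.
            seen, queue = set(start), deque(start)
            while queue:
                for nxt in g.get(queue.popleft(), []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return seen - set(start)

        reverse = {node: [] for node in graph}
        for src, dsts in graph.items():
            for dst in dsts:
                reverse[dst].append(src)

        print(reachable(graph, {"pump"}))         # effects: what happens if the pump fails
        print(reachable(reverse, {"cpu_fault"}))  # causes: what could have failed the CPU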

  2. APPLYING INSIGHTS FROM BEHAVIORAL ECONOMICS TO POLICY DESIGN

    PubMed Central

    Madrian, Brigitte C.

    2014-01-01

    The premise of this article is that an understanding of psychology and other social science disciplines can inform the effectiveness of the economic tools traditionally deployed in carrying out the functions of government, which include remedying market failures, redistributing income, and collecting tax revenue. An understanding of psychology can also lead to the development of different policy tools that better motivate desired behavior change or that are more cost-effective than traditional policy tools. The article outlines a framework for thinking about the psychology of behavior change in the context of market failures. It then describes the research on the effects of a variety of interventions rooted in an understanding of psychology that have policy-relevant applications. The article concludes by discussing how an understanding of psychology can also inform the use and design of traditional policy tools for behavior change, such as financial incentives. PMID:25520759

  3. A microfluidic device for simultaneous measurement of viscosity and flow rate of blood in a complex fluidic network

    PubMed Central

    Jun Kang, Yang; Yeom, Eunseop; Lee, Sang-Joon

    2013-01-01

    Blood viscosity has been considered one of the important biophysical parameters for effectively monitoring variations in the physiological and pathological conditions of circulatory disorders. Previous standard methods make it difficult to evaluate variations of blood viscosity under cardiopulmonary bypass procedures or hemodialysis. In this study, we proposed a unique microfluidic device for simultaneously measuring the viscosity and flow rate of whole blood circulating in a complex fluidic network including a rat, a reservoir, a pinch valve, and a peristaltic pump. To demonstrate the proposed method, a twin-shaped microfluidic device, composed of two half-circular chambers, two side channels with multiple indicating channels, and one bridge channel, was carefully designed. Based on the microfluidic device, three sequential flow controls were applied to identify the viscosity and flow rate of blood, with label-free and sensorless detection. The half-circular chamber was employed to achieve mechanical membrane compliance for flow stabilization in the microfluidic device. To quantify the effect of flow stabilization on flow fluctuations, a formula for the pulsation index (PI) was analytically derived using a discrete fluidic circuit model. Using the PI formula, the time constant contributed by the half-circular chamber is estimated to be 8 s. Furthermore, flow fluctuations resulting from the peristaltic pump are effectively removed under periodic flow conditions with short periods (T < 10 s). For performance demonstration, the proposed method was applied to evaluate blood viscosity under varying flow rate conditions: (a) known blood flow rate via a syringe pump, and (b) unknown blood flow rate via a peristaltic pump. As a result, the flow rate and viscosity of blood can be simultaneously measured with satisfactory accuracy. In addition, the proposed method was successfully applied to identify the viscosity of rat blood circulating in a complex fluidic network. These observations confirm that the proposed method can be used for simultaneous measurement of the viscosity and flow rate of whole blood circulating in a complex fluidic network, with sensorless and label-free detection. Furthermore, the proposed method could be used to evaluate variations in the viscosity of human blood during cardiopulmonary bypass procedures or hemodialysis. PMID:24404074
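
    The paper's analytically derived PI formula is not reproduced here; the sketch below uses the common definition PI = (Qmax - Qmin)/Qmean and the first-order attenuation 1/sqrt(1 + (omega*tau)^2) of a compliance chamber, with the 8 s time constant quoted above and an otherwise synthetic flow trace.

        import math

        def pulsation_index(flow):
            # Common definition: PI = (Qmax - Qmin) / Qmean.
            return (max(flow) - min(flow)) / (sum(flow) / len(flow))

        tau, period = 8.0, 2.0              # chamber time constant (s), pump period (s)
        w = 2 * math.pi / period
        atten = 1 / math.sqrt(1 + (w * tau) ** 2)   # first-order low-pass attenuation

        t = [0.01 * k for k in range(2000)]         # 20 s of samples at 10 ms steps
        raw = [1 + 0.5 * math.sin(w * tk) for tk in t]             # pump-side ripple
        damped = [1 + 0.5 * atten * math.sin(w * tk) for tk in t]  # after the chamber
        print(pulsation_index(raw), pulsation_index(damped))       # ~1.0 vs ~0.08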

  4. A microfluidic device for simultaneous measurement of viscosity and flow rate of blood in a complex fluidic network.

    PubMed

    Jun Kang, Yang; Yeom, Eunseop; Lee, Sang-Joon

    2013-01-01

    Blood viscosity is considered one of the important biophysical parameters for effectively monitoring variations in the physiological and pathological conditions of circulatory disorders. Previous standard methods make it difficult to evaluate variations in blood viscosity under cardiopulmonary bypass procedures or hemodialysis. In this study, we proposed a unique microfluidic device for simultaneously measuring the viscosity and flow rate of whole blood circulating in a complex fluidic network including a rat, a reservoir, a pinch valve, and a peristaltic pump. To demonstrate the proposed method, a twin-shaped microfluidic device, composed of two half-circular chambers, two side channels with multiple indicating channels, and one bridge channel, was carefully designed. Based on the microfluidic device, three sequential flow controls were applied to identify the viscosity and flow rate of blood, with label-free and sensorless detection. The half-circular chamber was employed to achieve mechanical membrane compliance for flow stabilization in the microfluidic device. To quantify the effect of flow stabilization on flow fluctuations, a formula for the pulsation index (PI) was analytically derived using a discrete fluidic circuit model. Using the PI formula, the time constant contributed by the half-circular chamber is estimated to be 8 s. Furthermore, flow fluctuations resulting from the peristaltic pumps are completely removed, especially under periodic flow conditions with short periods (T < 10 s). For performance demonstrations, the proposed method was applied to evaluate blood viscosity under varying flow-rate conditions [(a) known blood flow rate via a syringe pump, (b) unknown blood flow rate via a peristaltic pump]. As a result, the flow rate and viscosity of blood can be simultaneously measured with satisfactory accuracy. In addition, the proposed method was successfully applied to identify the viscosity of rat blood circulating in a complex fluidic network. These observations confirm that the proposed method can be used for simultaneous measurement of the viscosity and flow rate of whole blood circulating in a complex fluidic network, with sensorless and label-free detection. Furthermore, the proposed method will be useful for evaluating variations in the viscosity of human blood during cardiopulmonary bypass procedures or hemodialysis.
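
    The record names the analytically derived pulsation index (PI) but does not reproduce its formula. As a rough illustration under assumptions, the sketch below uses a common peak-to-peak definition, PI = (Qmax - Qmin)/Qmean, and passes a simulated peristaltic ripple through a first-order compliance element with the 8 s time constant reported above; the PI definition, the signal, and all parameter values are placeholders, not the authors' discrete fluidic circuit model.

        import numpy as np

        def pulsation_index(q):
            """Assumed PI definition: peak-to-peak fluctuation over mean flow."""
            return (np.max(q) - np.min(q)) / np.mean(q)

        # Simulated peristaltic-pump flow: mean flow plus a periodic ripple (T = 5 s).
        t = np.linspace(0.0, 60.0, 6001)                    # 60 s at 10 ms steps
        dt = t[1] - t[0]
        q_pump = 1.0 + 0.5 * np.sin(2 * np.pi * t / 5.0)    # normalized flow rate

        # First-order compliance smoothing with the 8 s time constant reported
        # for the half-circular chamber (placeholder model of the real chamber).
        tau = 8.0
        q_out = np.empty_like(q_pump)
        q_out[0] = q_pump[0]
        for i in range(1, len(t)):
            q_out[i] = q_out[i - 1] + (dt / tau) * (q_pump[i] - q_out[i - 1])

        print(f"PI before stabilization: {pulsation_index(q_pump):.3f}")
        print(f"PI after stabilization:  {pulsation_index(q_out):.3f}")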

  5. Evaluation of the probability of arrester failure in a high-voltage transmission line using a Q learning artificial neural network model

    NASA Astrophysics Data System (ADS)

    Ekonomou, L.; Karampelas, P.; Vita, V.; Chatzarakis, G. E.

    2011-04-01

    One of the most popular methods of protecting high-voltage transmission lines against lightning strikes and internal overvoltages is the use of arresters. The installation of arresters in high-voltage transmission lines can prevent or at least reduce the lines' failure rate. Several studies based on simulation tools have been presented in order to estimate the critical currents that exceed the arresters' rated energy stress and to specify the arresters' installation interval. In this work, artificial intelligence, and more specifically a Q-learning artificial neural network (ANN) model, is applied to evaluate the arresters' failure probability. The aims of the paper are to describe in detail the developed Q-learning ANN model and to compare the results obtained by its application to operating 150 kV Greek transmission lines with those produced using a simulation tool. The satisfactory and accurate results of the proposed ANN model can make it a valuable tool for designers of electrical power systems seeking more effective lightning protection, reduced operational costs, and better continuity of service.
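
    The record does not describe how the Q-learning component is constructed or coupled to the ANN, so the following is only a reminder of the underlying learning rule: a minimal tabular Q-learning sketch on a hypothetical 5-state chain, applying the update Q(s,a) <- Q(s,a) + alpha*(r + gamma*max_a' Q(s',a') - Q(s,a)). It is a generic illustration, not the authors' arrester-failure model.

        import numpy as np

        rng = np.random.default_rng(0)
        n_states, n_actions = 5, 2            # toy chain; actions: 0 = left, 1 = right
        Q = np.zeros((n_states, n_actions))
        alpha, gamma = 0.1, 0.9

        for episode in range(500):
            s = 0
            while s != n_states - 1:                      # rightmost state is terminal
                a = int(rng.integers(n_actions))          # random exploratory behavior
                s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
                r = 1.0 if s_next == n_states - 1 else 0.0
                Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])
                s = s_next

        print("greedy policy (1 = right):", np.argmax(Q, axis=1))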

  6. A Genuine TEAM Player

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Qualtech Systems, Inc. developed a complete software system with capabilities of multisignal modeling, diagnostic analysis, run-time diagnostic operations, and intelligent interactive reasoners. Commercially available as the TEAMS (Testability Engineering and Maintenance System) tool set, the software can be used to reveal unanticipated system failures. The TEAMS software package is broken down into four companion tools: TEAMS-RT, TEAMATE, TEAMS-KB, and TEAMS-RDS. TEAMS-RT identifies good, bad, and suspect components in the system in real-time. It reports system health results from onboard tests, and detects and isolates failures within the system, allowing for rapid fault isolation. TEAMATE takes over from where TEAMS-RT left off by intelligently guiding the maintenance technician through the troubleshooting procedure, repair actions, and operational checkout. TEAMS-KB serves as a model management and collection tool. TEAMS-RDS (TEAMS-Remote Diagnostic Server) has the ability to continuously assess a system and isolate any failure in that system or its components, in real time. RDS incorporates TEAMS-RT, TEAMATE, and TEAMS-KB in a large-scale server architecture capable of providing advanced diagnostic and maintenance functions over a network, such as the Internet, with a web browser user interface.

  7. WE-H-BRC-01: Failure Mode and Effects Analysis of Skin Electronic Brachytherapy Using Esteya Unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibanez-Rosello, B; Bautista-Ballesteros, J; Bonaque, J

    Purpose: A failure mode and effects analysis (FMEA) of the skin lesion treatment process using the Esteya™ device (Elekta Brachytherapy, Veenendaal, The Netherlands) was performed, with the aim of increasing the quality of the treatment and reducing the likelihood of unwanted events. Methods: A multidisciplinary team with experience in the treatment process met to establish the process map, which outlines the flow of the various stages for patients undergoing skin treatment. Potential failure modes (FM) were identified, and the values of severity (S), frequency of occurrence (O), and lack of detectability (D) for the proposed FM were scored individually, each on a scale of 1 to 10, following the TG-100 guidelines of the AAPM. These failure modes were ranked according to their risk priority number (RPN) and S scores. The efficiency of existing quality management tools was analyzed through a reassessment of O and D made by consensus. Results: 149 FM were identified, 43 of which had RPN ≥ 100 and 30 of which had S ≥ 7. After introduction of the quality management tools, only 3 FM had RPN ≥ 100 and 22 FM had RPN ≥ 50. These 22 FM were thoroughly analyzed and new quality management tools were proposed. The most common causes of the highest-RPN FM were the heavy patient workload and the need to maintain continuous and accurate applicator-skin contact during treatment. To address this second item, a regular quality control and a setup review by a second individual before each treatment session were proposed. Conclusion: The FMEA revealed some potential FM that were not predicted during the initial implementation of the quality management tools. This exercise was useful in identifying the need for periodic updates of the FMEA process, as new potential failures can be identified.
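
    The scoring and ranking logic described above is simple to mechanize: each failure mode gets severity (S), occurrence (O), and detectability (D) scores from 1 to 10, the risk priority number is RPN = S x O x D, and the study's thresholds (RPN ≥ 100 or S ≥ 7) flag the high-priority modes. The sketch below applies exactly that arithmetic; the failure modes and scores are invented for illustration.

        # Minimal FMEA ranking sketch: RPN = S * O * D per TG-100 convention.
        # Failure modes and scores below are invented for illustration.
        failure_modes = [
            {"name": "applicator-skin gap during treatment", "S": 8, "O": 5, "D": 6},
            {"name": "wrong prescription dose entered",      "S": 9, "O": 2, "D": 3},
            {"name": "session interrupted by power loss",    "S": 4, "O": 2, "D": 2},
        ]

        for fm in failure_modes:
            fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

        # Rank by RPN and flag high-priority modes (RPN >= 100 or S >= 7).
        for fm in sorted(failure_modes, key=lambda m: m["RPN"], reverse=True):
            flag = "HIGH" if fm["RPN"] >= 100 or fm["S"] >= 7 else "ok"
            print(f"{fm['RPN']:4d}  S={fm['S']} O={fm['O']} D={fm['D']}  [{flag}]  {fm['name']}")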

  8. CONFIG: Qualitative simulation tool for analyzing behavior of engineering devices

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.; Harris, Richard A.

    1987-01-01

    To design failure management expert systems, engineers mentally analyze the effects of failures and procedures as they propagate through device configurations. CONFIG is a generic device modeling tool for use in discrete event simulation, to support such analyses. CONFIG permits graphical modeling of device configurations and qualitative specification of local operating modes of device components. Computation requirements are reduced by focussing the level of component description on operating modes and failure modes, and specifying qualitative ranges of variables relative to mode transition boundaries. Simulation processing occurs only when modes change or variables cross qualitative boundaries. Device models are built graphically, using components from libraries. Components are connected at ports by graphical relations that define data flow. The core of a component model is its state transition diagram, which specifies modes of operation and transitions among them.
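
    As a rough textual analogue of the component models described above (operating and failure modes with transitions among them, and processing only when modes change), the sketch below encodes a hypothetical valve component as a state-transition table. All names are invented; CONFIG itself is a graphical tool, and this is not its actual interface.

        # Hypothetical CONFIG-style component: operating/failure modes and a
        # state-transition table; events are processed only when modes change.
        class Component:
            def __init__(self, name, initial_mode, transitions):
                self.name = name
                self.mode = initial_mode
                self.transitions = transitions      # {(mode, event): next_mode}

            def handle(self, event):
                nxt = self.transitions.get((self.mode, event))
                if nxt is not None and nxt != self.mode:
                    print(f"{self.name}: {self.mode} --{event}--> {nxt}")
                    self.mode = nxt

        valve = Component("valve", "closed", {
            ("closed", "open_cmd"): "open",
            ("open", "close_cmd"): "closed",
            ("open", "stuck"): "failed_open",       # failure mode
            ("closed", "stuck"): "failed_closed",   # failure mode
        })

        for ev in ["open_cmd", "stuck", "close_cmd"]:
            valve.handle(ev)    # close_cmd is ignored once the valve has failed open
        print("final mode:", valve.mode)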

  9. Failure analysis in the identification of synergies between cleaning monitoring methods.

    PubMed

    Whiteley, Greg S; Derry, Chris; Glasbey, Trevor

    2015-02-01

    The 4 monitoring methods used to manage the quality assurance of cleaning outcomes within health care settings are visual inspection, microbial recovery, fluorescent marker assessment, and rapid ATP bioluminometry. These methods each generate different types of information, presenting a challenge to the successful integration of monitoring results. A systematic approach to safety and quality control can be used to interrogate the known qualities of cleaning monitoring methods and provide a prospective management tool for infection control professionals. We investigated the use of failure mode and effects analysis (FMEA) for measuring failure risk arising through each cleaning monitoring method. FMEA uses existing data in a structured risk assessment tool that identifies weaknesses in products or processes. Our FMEA approach used the literature and a small experienced team to construct a series of analyses to investigate the cleaning monitoring methods in a way that minimized identified failure risks. FMEA applied to each of the cleaning monitoring methods revealed failure modes for each. The combined use of cleaning monitoring methods in sequence is preferable to their use in isolation. When these 4 cleaning monitoring methods are used in combination in a logical sequence, the failure modes noted for any 1 can be complemented by the strengths of the alternatives, thereby circumventing the risk of failure of any individual cleaning monitoring method. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  10. Nondestructive evaluation tools and experimental studies for monitoring the health of space propulsion systems

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    1991-01-01

    An overview is given of background and information on space propulsion systems on both the programmatic and technical levels. Feasibility experimental studies indicate that nondestructive evaluation tools such as ultrasonic, eddy current and x-ray may be successfully used to monitor the life limiting failure mechanisms of space propulsion systems. Encouraging results were obtained for monitoring the life limiting failure mechanisms for three space propulsion systems; the degradation of tungsten arcjet and magnetoplasmadynamic electrodes; presence and thickness of spallable electrically conducting molybdenum films in ion thrusters; and the degradation of the catalyst in hydrazine thrusters.

  11. User-Defined Material Model for Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)

    2006-01-01

    An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis, including failure initiation and material degradation, are presented. Different failure initiation criteria and material degradation models that define progressive failure formulations are described. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criterion, the maximum strain criterion, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach, in which the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies of composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.
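
    Of the initiation criteria listed, the maximum stress criterion is the simplest to state: a ply fails when any stress component exceeds its corresponding allowable. The sketch below pairs that check with ply discounting, degrading the local stiffnesses once failure initiates. It is a generic illustration in Python with an assumed knockdown factor, not the UMAT subroutine described in the paper.

        # Generic max-stress initiation check with ply discounting; illustrative
        # only (the paper's UMAT is a user subroutine for ABAQUS/Standard).
        def max_stress_failed(s11, s22, s12, Xt, Xc, Yt, Yc, S):
            """True if any ply stress exceeds its allowable (tension/compression/shear)."""
            return (s11 > Xt or -s11 > Xc or
                    s22 > Yt or -s22 > Yc or
                    abs(s12) > S)

        def discount_ply(E1, E2, G12, factor=1e-3):
            """Ply discounting: degrade the ply stiffnesses after failure (assumed factor)."""
            return E1 * factor, E2 * factor, G12 * factor

        # Invented ply state: stresses/allowables in MPa, moduli in GPa.
        E1, E2, G12 = 140.0, 10.0, 5.0
        if max_stress_failed(s11=1600.0, s22=30.0, s12=40.0,
                             Xt=1500.0, Xc=1200.0, Yt=50.0, Yc=200.0, S=70.0):
            E1, E2, G12 = discount_ply(E1, E2, G12)
        print(f"moduli after check: E1={E1:.3f}, E2={E2:.3f}, G12={G12:.3f} GPa")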

  12. New diagnostic and therapeutic possibilities for diastolic heart failure.

    PubMed

    Jeong, Euy-Myoung; Dudley, Samuel C

    2014-02-03

    Despite the fact that up to half of all heart failure occurs in patients without evidence of systolic cardiac dysfunction, there are no universally accepted diagnostic markers and no approved therapies for heart failure with preserved ejection fraction (HFpEF). HFpEF, otherwise known as diastolic heart failure, has nearly the same grim prognosis as systolic heart failure, and diastolic heart failure is increasing in incidence and prevalence. Major trials have shown that many of the treatments that are salutary in systolic heart failure have no beneficial effects in diastolic heart failure, suggesting different underlying mechanisms for these two disorders. Even criteria for diagnosis of HFpEF are still debated, and there is still no gold standard marker to detect diastolic dysfunction. Here, we will review some promising new insights into the pathogenesis of diastolic dysfunction that may lead to new diagnostic and therapeutic tools.

  13. A Novelty Design Of Minimization Of Electrical Losses In A Vector Controlled Induction Machine Drive

    NASA Astrophysics Data System (ADS)

    Aryza, Solly; Irwanto, M.; Lubis, Zulkarnain; Putera Utama Siahaan, Andysah; Rahim, Robbi; Furqan, Mhd.

    2018-01-01

    Induction motors are widely used in industry, and much attention has been given to the design and development of induction motor drives. Vector control methods can improve the efficiency of an induction motor over its entire speed range. This paper presents the design of a loss-minimization controller intended to improve drive efficiency. The research also describes the modeling of an induction motor with core loss included, the realization of vector control for an induction motor drive with the loss element included, and the derivation of the loss-minimization condition. The procedure successfully calculated the gains of a PI controller. Though the problem of obtaining a robust, sensorless induction motor drive is by no means completely solved, the results obtained as part of this work point in a promising direction.

  14. Brillouin micro-spectroscopy through aberrations via sensorless adaptive optics

    NASA Astrophysics Data System (ADS)

    Edrei, Eitan; Scarcelli, Giuliano

    2018-04-01

    Brillouin spectroscopy is a powerful optical technique for non-contact viscoelastic characterization, which has recently found applications in three-dimensional mapping of biological samples. Brillouin spectroscopy performance is rapidly degraded by optical aberrations, and the technique has therefore been limited to homogeneous, transparent samples. In this work, we developed an adaptive optics (AO) configuration designed for Brillouin scattering spectroscopy to engineer the incident wavefront and correct for aberrations. Our configuration does not require direct wavefront sensing or the injection of a "guide star"; hence, it can be implemented without the need for sample pre-treatment. We used our AO-Brillouin spectrometer on aberrated phantoms and biological samples and obtained improved precision and resolution of Brillouin spectral analysis; we demonstrated a 2.5-fold enhancement in Brillouin signal strength and a 1.4-fold improvement in axial resolution owing to the correction of optical aberrations.
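
    Wavefront-sensorless AO generally optimizes a metric of the measured signal itself over a basis of corrector modes, rather than reconstructing the wavefront. The record does not give the authors' optimization algorithm, so the sketch below shows only the generic mode-by-mode search such systems often use, maximizing a stand-in signal over a few modal amplitudes; the toy signal model and all values are placeholders.

        import numpy as np

        # Generic wavefront-sensorless AO loop: for each corrector mode, probe a
        # few amplitudes and keep the one maximizing the measured signal. The
        # signal model below is a stand-in for a real Brillouin measurement.
        true_aberration = np.array([0.4, -0.2, 0.1])    # hidden modal coefficients

        def measure_signal(correction):
            residual = true_aberration + correction
            return np.exp(-np.sum(residual ** 2))       # peaks when fully corrected

        correction = np.zeros(3)
        probe_amplitudes = np.linspace(-0.5, 0.5, 11)   # 0.1-step search grid
        for mode in range(correction.size):
            best_amp = correction[mode]
            best_sig = measure_signal(correction)
            for amp in probe_amplitudes:
                trial = correction.copy()
                trial[mode] = amp
                sig = measure_signal(trial)
                if sig > best_sig:
                    best_amp, best_sig = amp, sig
            correction[mode] = best_amp

        print("applied correction:", correction)
        print("final signal:", round(measure_signal(correction), 4))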

  15. Failure analysis of a tool steel torque shaft

    NASA Technical Reports Server (NTRS)

    Reagan, J. R.

    1981-01-01

    A drive shaft with a low design load, used to deliver power from an experimental exhaust heat recovery system to the crankshaft of an experimental diesel truck engine, failed during highway testing. An independent testing laboratory analyzed the failure by routine metallography and attributed it to fatigue induced by a banded microstructure. Visual examination of the failed shaft by NASA, together with knowledge of the torsional load that it carried, pointed to a 100 percent ductile failure with no evidence of fatigue; scanning electron microscopy confirmed this. Torsional test specimens were produced from pieces of the failed shaft, and torsional overload testing produced failures identical to that which had occurred in the truck engine. This pointed to a failure caused by a high overload: although the microstructure was defective, it was not the cause of the failure.

  16. Nurses' strategies to address self-care aspects related to medication adherence and symptom recognition in heart failure patients: an in-depth look.

    PubMed

    Jaarsma, Tiny; Nikolova-Simons, Mariana; van der Wal, Martje H L

    2012-01-01

    Despite an increasing body of knowledge on self-care in heart failure patients, the need for effective interventions remains. We sought to deepen the understanding of interventions that heart failure nurses use in clinical practice to improve patient adherence to medication and symptom monitoring. A qualitative study with a directed content analysis was performed, using data from a selected sample of Dutch-speaking heart failure nurses who completed booklets with two vignettes involving medication adherence and symptom recognition. Nurses regularly assess and reassess patients before they decide on an intervention. They evaluate basic/factual information and barriers in a patient's behavior, and try to find room for improvement in a patient's behavior. Interventions that heart failure nurses use to improve adherence to medication and symptom monitoring were grouped into the themes of increasing knowledge, increasing motivation, and providing patients with practical tools. Nurses also described using technology-based tools, increased social support, alternative communication, partnership approaches, and coordination of care to improve adherence to medications and symptom monitoring. Despite a strong focus on educational strategies, nurses also reported other strategies to increase patient adherence. Nurses use several strategies to improve patient adherence that are not incorporated into guidelines. These interventions need to be evaluated for further applications in improving heart failure management. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Diagnosis and management of heart failure in the fetus

    PubMed Central

    DAVEY, B.; SZWAST, A.; RYCHIK, J.

    2015-01-01

    Heart failure can be defined as the inability of the heart to sufficiently support the circulation. In the fetus, heart failure can be caused by a myriad of factors that include fetal shunting abnormalities, genetic cardiomyopathies, extracardiac malformations, arrhythmias and structural congenital heart disease. With advances in ultrasound has come the ability to characterize many complex conditions, previously poorly understood. Fetal echocardiography provides the tools necessary to evaluate and understand the various physiologies that contribute to heart failure in the fetus. In this review, we will explore the different mechanisms of heart failure in this unique patient population and highlight the role of fetal echocardiography in the current management of these conditions. PMID:22992530

  18. System overview of the fully implantable destination therapy--ReinHeart-total artificial heart.

    PubMed

    Pelletier, Benedikt; Spiliopoulos, Sotirios; Finocchiaro, Thomas; Graef, Felix; Kuipers, Kristin; Laumen, Marco; Guersoy, Dilek; Steinseifer, Ulrich; Koerfer, Reiner; Tenderich, Gero

    2015-01-01

    Owing to the lack of suitable allografts, the demand for long-term mechanical circulatory support in patients with biventricular end-stage heart failure is rising. Currently available Total Artificial Heart (TAH) systems consist of pump units with only limited durability, percutaneous tubes and bulky external equipment that limit the quality of life. Therefore we are focusing on the development of a fully implantable, highly durable destination therapy total artificial heart. The ReinHeart-TAH system consists of a passively filling pump unit driven by a low-wear linear drive between two artificial ventricles, an implantable control unit and a compliance chamber. The TAH is powered by a transcutaneous energy transmission system. The flow distribution inside the ventricles was analysed by fluid structure interaction simulation and particle image velocimetry measurements. Along with durability tests, the hydrodynamic performance and flow balance capability were evaluated in a mock circulation loop. Animal trials are ongoing. Based on fluid structure interaction simulation and particle image velocimetry, blood stagnation areas have been significantly reduced. In the mock circulation loop the ReinHeart-TAH generated a cardiac output of 5 l/min at an operating frequency of 120 bpm and an aortic pressure of 120/80 mmHg. The highly effective preload sensitivity of the passively filling ventricles allowed the sensorless integration of the Frank Starling mechanism. The ReinHeart-TAH effectively replaced the native heart's function in animals for up to 2 days. In vitro and in vivo testing showed a safe and effective function of the ReinHeart-TAH system. This has the potential to become an alternative to transplantation. However, before a first-in-man implant, chronic animal trials still have to be completed. © The Author 2014. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  19. WE-G-BRC-02: Risk Assessment for HDR Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayadev, J.

    2016-06-15

    Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one's own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: Understand the general concept of failure mode and effects analysis; learn how to characterize new equipment for safety; be able to identify potential failure modes for specific procedures and learn mitigation techniques; be able to customize FMEA examples and templates for use in any clinic.

  20. WE-G-BRC-01: Risk Assessment for Radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, G.

    2016-06-15

    Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one's own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: Understand the general concept of failure mode and effects analysis; learn how to characterize new equipment for safety; be able to identify potential failure modes for specific procedures and learn mitigation techniques; be able to customize FMEA examples and templates for use in any clinic.

  1. WE-G-BRC-03: Risk Assessment for Physics Plan Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, S.

    2016-06-15

    Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one's own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: Understand the general concept of failure mode and effects analysis; learn how to characterize new equipment for safety; be able to identify potential failure modes for specific procedures and learn mitigation techniques; be able to customize FMEA examples and templates for use in any clinic.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one's own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: Understand the general concept of failure mode and effects analysis; learn how to characterize new equipment for safety; be able to identify potential failure modes for specific procedures and learn mitigation techniques; be able to customize FMEA examples and templates for use in any clinic.

  3. Impact of electronic clinical decision support on adherence to guideline-recommended treatment for hyperlipidaemia, atrial fibrillation and heart failure: protocol for a cluster randomised trial

    PubMed Central

    Kessler, Maya Elizabeth; Cook, David A; Kor, Daryl Jon; McKie, Paul M; Pencille, Laurie J; Scheitel, Marianne R; Chaudhry, Rajeev

    2017-01-01

    Introduction Clinical practice guidelines facilitate optimal clinical practice. Point of care access, interpretation and application of such guidelines, however, is inconsistent. Informatics-based tools may help clinicians apply guidelines more consistently. We have developed a novel clinical decision support tool that presents guideline-relevant information and actionable items to clinicians at the point of care. We aim to test whether this tool improves the management of hyperlipidaemia, atrial fibrillation and heart failure by primary care clinicians. Methods/analysis Clinician care teams were cluster randomised to receive access to the clinical decision support tool or passive access to institutional guidelines on 16 May 2016. The trial began on 1 June 2016 when access to the tool was granted to the intervention clinicians. The trial will be run for 6 months to ensure a sufficient number of patient encounters to achieve 80% power to detect a twofold increase in the primary outcome at the 0.05 level of significance. The primary outcome measure will be the percentage of guideline-based recommendations acted on by clinicians for hyperlipidaemia, atrial fibrillation and heart failure. We hypothesise care teams with access to the clinical decision support tool will act on recommendations at a higher rate than care teams in the standard of care arm. Ethics and dissemination The Mayo Clinic Institutional Review Board approved all study procedures. Informed consent was obtained from clinicians. A waiver of informed consent and of Health Insurance Portability and Accountability Act (HIPAA) authorisation for patients managed by clinicians in the study was granted. In addition to publication, results will be disseminated via meetings and newsletters. Trial registration number NCT02742545. PMID:29208620

  4. How to apply clinical cases and medical literature in the framework of a modified "failure mode and effects analysis" as a clinical reasoning tool--an illustration using the human biliary system.

    PubMed

    Wong, Kam Cheong

    2016-04-06

    Clinicians use various clinical reasoning tools, such as the Ishikawa diagram, to enhance their clinical experience and reasoning skills. Failure mode and effects analysis, which is an engineering methodology in origin, can be modified and applied to provide inputs into an Ishikawa diagram. The human biliary system is used to illustrate a modified failure mode and effects analysis. The anatomical and physiological processes of the biliary system are reviewed. Failure is defined as an abnormality caused by infective, inflammatory, obstructive, malignant, autoimmune and other pathological processes. The potential failures, their effect(s), the main clinical features, and the investigations that can help a clinician to diagnose them at each anatomical part and physiological process are reviewed and documented in a modified failure mode and effects analysis table. Relevant medical and surgical cases are retrieved from the medical literature and woven into the table. A total of 80 clinical cases relevant to the modified failure mode and effects analysis for the human biliary system have been reviewed and woven into a designated table. The table is the backbone and framework for further expansion; reviewing and updating the table is an iterative and continual process. The relevant clinical features in the modified failure mode and effects analysis are then extracted and included in the relevant Ishikawa diagram. This article illustrates an application of engineering methodology in medicine, and it sows the seeds of potential cross-pollination between engineering and medicine. Establishing a modified failure mode and effects analysis can be a teamwork project, a self-directed learning process, or a mix of both. A modified failure mode and effects analysis can be deployed to obtain inputs for an Ishikawa diagram, which in turn can be used to enhance the clinical experience and clinical reasoning skills of clinicians, medical educators, and students.

  5. Use of FMEA analysis to reduce risk of errors in prescribing and administering drugs in paediatric wards: a quality improvement report

    PubMed Central

    Lago, Paola; Bizzarri, Giancarlo; Scalzotto, Francesca; Parpaiola, Antonella; Amigoni, Angela; Putoto, Giovanni; Perilongo, Giorgio

    2012-01-01

    Objective Administering medication to hospitalised infants and children is a complex process at high risk of error. Failure mode and effect analysis (FMEA) is a proactive tool used to analyse risks, identify failures before they happen and prioritise remedial measures. To examine the hazards associated with the process of drug delivery to children, we performed a proactive risk-assessment analysis. Design and setting Five multidisciplinary teams, representing different divisions of the paediatric department at Padua University Hospital, were trained to analyse the drug-delivery process, to identify possible causes of failures and their potential effects, to calculate a risk priority number (RPN) for each failure, and to plan changes in practices. Primary outcome To identify higher-priority potential failure modes, as defined by RPNs, and to plan changes in clinical practice to reduce the risk of patient harm and improve safety in the process of medication use in children. Results In all, 37 higher-priority potential failure modes and 71 associated causes and effects were identified. The highest RPNs (>48) related mainly to errors in calculating drug doses and concentrations. Many of these failure modes were found in all five units, suggesting the presence of common targets for improvement, particularly in enhancing the safety of prescription and preparation of intravenous drugs. The introduction of new activities in the revised process of administering drugs reduced the high-risk failure modes by 60%. Conclusions FMEA is an effective proactive risk-assessment tool, useful for helping multidisciplinary groups understand a process of care and identify errors that may occur, prioritising remedial interventions and possibly enhancing the safety of drug delivery in children. PMID:23253870

  6. Use of FMEA analysis to reduce risk of errors in prescribing and administering drugs in paediatric wards: a quality improvement report.

    PubMed

    Lago, Paola; Bizzarri, Giancarlo; Scalzotto, Francesca; Parpaiola, Antonella; Amigoni, Angela; Putoto, Giovanni; Perilongo, Giorgio

    2012-01-01

    Administering medication to hospitalised infants and children is a complex process at high risk of error. Failure mode and effect analysis (FMEA) is a proactive tool used to analyse risks, identify failures before they happen and prioritise remedial measures. To examine the hazards associated with the process of drug delivery to children, we performed a proactive risk-assessment analysis. Five multidisciplinary teams, representing different divisions of the paediatric department at Padua University Hospital, were trained to analyse the drug-delivery process, to identify possible causes of failures and their potential effects, to calculate a risk priority number (RPN) for each failure, and to plan changes in practices. The aim was to identify higher-priority potential failure modes, as defined by RPNs, and to plan changes in clinical practice to reduce the risk of patient harm and improve safety in the process of medication use in children. In all, 37 higher-priority potential failure modes and 71 associated causes and effects were identified. The highest RPNs (>48) related mainly to errors in calculating drug doses and concentrations. Many of these failure modes were found in all five units, suggesting the presence of common targets for improvement, particularly in enhancing the safety of prescription and preparation of intravenous drugs. The introduction of new activities in the revised process of administering drugs reduced the high-risk failure modes by 60%. FMEA is an effective proactive risk-assessment tool, useful for helping multidisciplinary groups understand a process of care and identify errors that may occur, prioritising remedial interventions and possibly enhancing the safety of drug delivery in children.

  7. Numerical simulation of deformation and failure processes of a complex technical object under impact loading

    NASA Astrophysics Data System (ADS)

    Kraus, E. I.; Shabalin, I. I.; Shabalin, T. I.

    2018-04-01

    The main points of development of numerical tools for simulation of deformation and failure of complex technical objects under nonstationary conditions of extreme loading are presented. The possibility of extending the dynamic method for construction of difference grids to the 3D case is shown. A 3D realization of discrete-continuum approach to the deformation and failure of complex technical objects is carried out. The efficiency of the existing software package for 3D modelling is shown.

  8. [Failure mode and effects analysis to improve quality in clinical trials].

    PubMed

    Mañes-Sevilla, M; Marzal-Alfaro, M B; Romero Jiménez, R; Herranz-Alonso, A; Sanchez Fresneda, M N; Benedi Gonzalez, J; Sanjurjo-Sáez, M

    The failure mode and effects analysis (FMEA) has been used as a tool for risk management and quality improvement. The objective of this study was to identify the weaknesses in the processes of the clinical trials area of a Pharmacy Department (PD) with substantial research activity, in order to improve the safety of the usual procedures. A multidisciplinary team was created to analyse each of the critical points, identified as possible failure modes, in the conduct of clinical trials in the PD. For each failure mode, the possible cause and effect were identified, criticality was calculated using the risk priority number (RPN), and possible corrective actions were discussed. Six sub-processes were defined in the conduct of clinical trials in the PD. The FMEA identified 67 failure modes, with the dispensing and prescription/validation sub-processes being the most likely to generate errors. All the improvement actions established in the FMEA were implemented in the clinical trials area. The FMEA is a useful tool in proactive risk management because it allows us to identify where mistakes are being made, analyse the causes that originate them, prioritise, and adopt solutions for risk reduction. The FMEA improves process safety and quality in the PD. Copyright © 2018 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  9. In situ transmission electron microscopy of transistor operation and failure.

    PubMed

    Wang, Baoming; Islam, Zahabul; Haque, Aman; Chabak, Kelson; Snure, Michael; Heller, Eric; Glavin, Nicholas

    2018-08-03

    Microscopy is typically used as a post-mortem analytical tool in performance and reliability studies on nanoscale materials and devices. In this study, we demonstrate real time microscopy of the operation and failure of AlGaN/GaN high electron mobility transistors inside the transmission electron microscope. Loading until failure was performed on the electron transparent transistors to visualize the failure mechanisms caused by self-heating. At lower drain voltages, thermo-mechanical stresses induce irreversible microstructural deformation, mostly along the AlGaN/GaN interface, to initiate the damage process. At higher biasing, the self-heating deteriorates the gate and catastrophic failure takes place through metal/semiconductor inter-diffusion and/or buffer layer breakdown. This study indicates that the current trend of recreating the events, from damage nucleation to catastrophic failure, can be replaced by in situ microscopy for a quick and accurate account of the failure mechanisms.

  10. Validation of self assessment patient knowledge questionnaire for heart failure patients.

    PubMed

    Lainscak, Mitja; Keber, Irena

    2005-12-01

    Several studies have shown insufficient knowledge of, and poor compliance with, non-pharmacological management in heart failure patients. Only a limited number of validated tools are available to assess their knowledge. The aim of the study was to test our 10-item Patient knowledge questionnaire. The Patient knowledge questionnaire was administered to 42 heart failure patients from a Heart failure clinic and to 40 heart failure patients receiving usual care. Construct validity (Pearson correlation coefficient), internal consistency (Cronbach alpha), reproducibility (Wilcoxon signed rank test), and reliability (chi-square test and Student's t-test for independent samples) were assessed. The overall score of the Patient knowledge questionnaire had the strongest correlation with the question about regular weighing (r=0.69) and the weakest with the question about the presence of heart disease (r=0.33). There was a strong correlation between the question about fluid retention and the questions assessing regular weighing (r=0.86), the weight of one litre of water (r=0.86), and salt restriction (r=0.57). The Cronbach alpha was 0.74 and could be improved by exclusion of the questions about clear explanation (Cronbach alpha 0.75), the importance of fruit, soup, and vegetables (Cronbach alpha 0.75), and self-adjustment of diuretic (Cronbach alpha 0.81). During reproducibility testing, 91% to 98% of questions were answered identically. Patients from the Heart failure clinic scored significantly better than patients receiving usual care (7.9 (1.3) vs. 5.7 (2.2), p<0.001). The Patient knowledge questionnaire is a valid and reliable tool to measure the knowledge of heart failure patients.

  11. Innovative fabrication processing of advanced composite materials concepts for primary aircraft structures

    NASA Technical Reports Server (NTRS)

    Kassapoglou, Christos; Dinicola, Al J.; Chou, Jack C.

    1992-01-01

    The autoclave-based THERM-X® process was evaluated by cocuring complex curved panels with frames and stiffeners. The process was shown to result in composite parts of high quality, with good compaction at sharp radius regions and at corners of intersecting parts. The structural properties of the postbuckled panels fabricated were found to be equivalent to those of conventionally tooled, hand laid-up parts. Significant savings in bagging time over conventional tooling were documented. Structural details such as cocured shear ties and stiffener flanges embedded in the skin were found to suppress failure modes such as failure at corners of intersecting members and skin-stiffener separation.

  12. Physics-based Entry, Descent and Landing Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system (TPS) failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool, and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris (MMOD) impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effects of material property uncertainty and MMOD damage on the risk of failure were analyzed, and a comparison of the direct computation and response surface approaches was undertaken.
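
    The risk computation described reduces to a standard Monte Carlo failure-probability estimate: sample the uncertain inputs, propagate each sample to a peak bondline temperature, and count exceedances of the temperature limit. The sketch below uses a deliberately toy algebraic surrogate in place of the trajectory, aerothermal, and thermal-response tools; the distributions, surrogate, and limit value are all invented.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000

        # Toy surrogate for peak bondline temperature as a function of sampled
        # uncertainties; a real analysis would run the trajectory, aerothermal,
        # and 1-D thermal-response tools for each sample. All values invented.
        heat_load  = rng.normal(1.0, 0.08, N)   # normalized aerothermal heat load
        k_material = rng.normal(1.0, 0.05, N)   # normalized TPS conductivity
        thickness  = rng.normal(1.0, 0.03, N)   # normalized TPS thickness (MMOD loss)

        T_bond = 450.0 + 100.0 * heat_load * k_material / thickness   # kelvin, toy

        T_limit = 580.0                          # assumed bondline temperature limit
        p_fail = np.mean(T_bond > T_limit)
        print(f"estimated probability of bondline over-temperature: {p_fail:.4f}")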

  13. Use of failure mode effect analysis (FMEA) to improve medication management process.

    PubMed

    Jain, Khushboo

    2017-03-13

    Purpose Medication management is a complex process, at high risk of error with life threatening consequences. The focus should be on devising strategies to avoid errors and make the process self-reliable by ensuring prevention of errors and/or error detection at subsequent stages. The purpose of this paper is to use failure mode effect analysis (FMEA), a systematic proactive tool, to identify the likelihood and the causes for the process to fail at various steps and prioritise them to devise risk reduction strategies to improve patient safety. Design/methodology/approach The study was designed as an observational analytical study of medication management process in the inpatient area of a multi-speciality hospital in Gurgaon, Haryana, India. A team was made to study the complex process of medication management in the hospital. FMEA tool was used. Corrective actions were developed based on the prioritised failure modes which were implemented and monitored. Findings The percentage distribution of medication errors as per the observation made by the team was found to be maximum of transcription errors (37 per cent) followed by administration errors (29 per cent) indicating the need to identify the causes and effects of their occurrence. In all, 11 failure modes were identified out of which major five were prioritised based on the risk priority number (RPN). The process was repeated after corrective actions were taken which resulted in about 40 per cent (average) and around 60 per cent reduction in the RPN of prioritised failure modes. Research limitations/implications FMEA is a time consuming process and requires a multidisciplinary team which has good understanding of the process being analysed. FMEA only helps in identifying the possibilities of a process to fail, it does not eliminate them, additional efforts are required to develop action plans and implement them. Frank discussion and agreement among the team members is required not only for successfully conducing FMEA but also for implementing the corrective actions. Practical implications FMEA is an effective proactive risk-assessment tool and is a continuous process which can be continued in phases. The corrective actions taken resulted in reduction in RPN, subjected to further evaluation and usage by others depending on the facility type. Originality/value The application of the tool helped the hospital in identifying failures in medication management process, thereby prioritising and correcting them leading to improvement.

  14. Vulnerability Management for an Enterprise Resource Planning System

    NASA Astrophysics Data System (ADS)

    Goel, Shivani; Kiran, Ravi; Garg, Deepak

    2012-09-01

    Enterprise resource planning (ERP) systems are commonly used in technical educational institutions (TEIs). ERP systems should continue providing services to their users irrespective of the level of failure. There can be many types of failures in ERP systems, and different types of measures or characteristics can be defined for ERP systems to handle each level of failure. In this paper, the various failure levels are identified along with the characteristics associated with those failures, and the relations among them are summarized. The disruptions causing vulnerabilities in TEIs are identified. A vulnerability management cycle is suggested, along with several commercial and open source vulnerability management tools. The paper also highlights the importance of resiliency in ERP systems in TEIs.

  15. Virtual-Instrument-Based Online Monitoring System for Hands-on Laboratory Experiment of Partial Discharges

    ERIC Educational Resources Information Center

    Karmakar, Subrata

    2017-01-01

    Online monitoring of high-voltage (HV) equipment is a vital tool for early detection of insulation failure. Most insulation failures are caused by partial discharges (PDs) inside the HV equipment. Because of the very high cost of establishing HV equipment facility and the limitations of electromagnetic interference-screened laboratories, only a…

  16. [Rare cause of heart failure in an elderly woman in Djibouti: left ventricular non compaction].

    PubMed

    Massoure, P L; Lamblin, G; Bertani, A; Eve, O; Kaiser, E

    2011-10-01

    The purpose of this report is to describe the first case of left ventricular non compaction diagnosed in Djibouti. The patient was a 74-year-old Djiboutian woman with symptomatic heart failure. Echocardiography is the key tool for assessment of left ventricular non compaction. This rare cardiomyopathy is probably underdiagnosed in Africa.

  17. Assessing the Causes of Encapsulant Delamination in PV Modules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wohlgemuth, John H.; Hacke, Peter; Bosco, Nick

    Delamination of the encapsulant is one of the most prevalent PV module field failures. This paper will present examples of various types of delaminations that have been observed in the field. It will then discuss the development of accelerated stress tests designed to duplicate those field failures and thus provide tools for avoiding them in the future.

  18. Motor Control of Two Flywheels Enabling Combined Attitude Control and Bus Regulation

    NASA Technical Reports Server (NTRS)

    Kenny, Barbara H.

    2004-01-01

    This presentation discussed the flywheel technology development work that is ongoing at NASA GRC, with a particular emphasis on flywheel system control. The "field orientation" motor/generator control algorithm was discussed and explained. The position-sensorless angle and speed estimation algorithm was presented. The motor current response to a step change in command at low (10 kRPM) and high (60 kRPM) speeds was discussed. The flywheel DC bus regulation control was explained and experimental results presented. Finally, the combined attitude control and energy storage algorithm that controls two flywheels simultaneously was presented, with experimental results that verified its operational capability: high-speed flywheel energy storage (60,000 RPM) and the successful implementation of an algorithm to simultaneously control both energy storage and a single axis of attitude with two flywheels. Overall, the presentation demonstrated that GRC has an operational flywheel system development facility.

  19. Power Maximization Control of Variable Speed Wind Generation System Using Permanent Magnet Synchronous Generator

    NASA Astrophysics Data System (ADS)

    Morimoto, Shigeo; Nakamura, Tomohiko; Takeda, Yoji

    This paper proposes sensorless output power maximization control of a wind generation system. A permanent magnet synchronous generator (PMSG) is used as a variable speed generator in the proposed system. The generator torque is suitably controlled according to the generator speed, and thus the power from the wind turbine settles at the maximum power point under the proposed MPPT control method, where information on the wind velocity is not required. Moreover, the maximum available generated power is obtained by optimum current vector control. The current vector of the PMSG is optimally controlled according to the generator speed and the required torque in order to minimize the losses of the PMSG, considering the voltage and current constraints. The proposed wind power generation system can be realized without mechanical sensors such as a wind velocity detector or a position sensor. Several experimental results show the effectiveness of the proposed control method.
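
    A common realization of the speed-scheduled torque control described above is the optimum-torque MPPT law T* = k_opt * w^2, where k_opt = 0.5 * rho * pi * Cp_max * R^5 / lambda_opt^3 follows from the turbine power curve at the optimal tip-speed ratio. The sketch below implements that textbook law with invented turbine constants; the record does not confirm that this is the authors' exact formulation.

        import math

        # Optimum-torque MPPT sketch: T* = k_opt * w^2 drives the turbine toward
        # its maximum power point without a wind speed measurement. Constants
        # below are invented, not the authors' machine.
        rho     = 1.225    # air density, kg/m^3
        R       = 1.5      # rotor radius, m
        Cp_max  = 0.45     # maximum power coefficient
        lam_opt = 7.0      # optimal tip-speed ratio

        k_opt = 0.5 * rho * math.pi * Cp_max * R**5 / lam_opt**3

        def torque_ref(omega):
            """Generator torque command (N*m) from measured rotor speed (rad/s)."""
            return k_opt * omega**2

        for omega in (10.0, 20.0, 40.0):
            print(f"omega = {omega:5.1f} rad/s -> T* = {torque_ref(omega):8.2f} N*m")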

  20. Sensorless FOC Performance Improved with On-Line Speed and Rotor Resistance Estimator Based on an Artificial Neural Network for an Induction Motor Drive

    PubMed Central

    Gutierrez-Villalobos, Jose M.; Rodriguez-Resendiz, Juvenal; Rivas-Araiza, Edgar A.; Martínez-Hernández, Moisés A.

    2015-01-01

    Three-phase induction motor drives require high accuracy in high-performance industrial processes. Field oriented control (FOC), one of the most widely employed control schemes for induction motors, bases its operation on the estimation of the motor's electrical parameters. These parameters can make an electrical machine drive work improperly, since their values change at low speeds, with temperature, and especially with load and duty changes. The focus of this paper is the real-time, on-line estimation of electrical parameters with a CMAC-ADALINE block added to the standard FOC scheme, to improve induction motor (IM) drive performance and extend the lifetime of the drive and the motor. Two kinds of neural network structures are used: one to estimate the rotor speed and the other to estimate the rotor resistance of the induction motor. PMID:26131677

  1. Sensorless FOC Performance Improved with On-Line Speed and Rotor Resistance Estimator Based on an Artificial Neural Network for an Induction Motor Drive.

    PubMed

    Gutierrez-Villalobos, Jose M; Rodriguez-Resendiz, Juvenal; Rivas-Araiza, Edgar A; Martínez-Hernández, Moisés A

    2015-06-29

    Three-phase induction motor drives require high accuracy in high-performance industrial processes. Field oriented control (FOC), one of the most widely employed control schemes for induction motors, bases its operation on the estimation of the motor's electrical parameters. These parameters can make an electrical machine drive work improperly, since their values change at low speeds, with temperature, and especially with load and duty changes. The focus of this paper is the real-time, on-line estimation of electrical parameters with a CMAC-ADALINE block added to the standard FOC scheme, to improve induction motor (IM) drive performance and extend the lifetime of the drive and the motor. Two kinds of neural network structures are used: one to estimate the rotor speed and the other to estimate the rotor resistance of the induction motor.
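
    An ADALINE is a single linear neuron trained with the least-mean-squares (LMS) rule, which suits on-line tracking because each update costs one multiply-accumulate per weight. The sketch below shows the generic LMS update tracking a slowly drifting parameter from streaming samples, in the spirit of on-line rotor-resistance tracking; it is not the paper's CMAC-ADALINE scheme, and all signals and gains are invented.

        import numpy as np

        # Generic ADALINE/LMS sketch: track a drifting parameter theta in
        # y = theta * x from streaming samples, in the spirit of on-line
        # rotor-resistance tracking. All signals and gains are invented.
        rng = np.random.default_rng(1)
        theta_hat, mu = 0.0, 0.05            # estimate and LMS step size

        for k in range(2000):
            theta_true = 2.0 + 0.5 * (k / 2000.0)       # drifts (e.g., heating)
            x = rng.uniform(0.5, 1.5)                   # regressor sample
            y = theta_true * x + rng.normal(0.0, 0.01)  # noisy measurement
            e = y - theta_hat * x                       # prediction error
            theta_hat += mu * e * x                     # LMS weight update

        print(f"final estimate: {theta_hat:.3f} (true value ends near 2.5)")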

  2. Indirect rotor position sensing in real time for brushless permanent magnet motor drives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ertugrul, N.; Acarnley, P.P.

    1998-07-01

    This paper describes a modern solution to real-time rotor position estimation of brushless permanent magnet (PM) motor drives. The position estimation scheme, based on flux linkage and line-current estimation, is implemented in real time by using the abc reference frame, and it is tested dynamically. The position estimation model of the test motor, development of hardware, and basic operation of the digital signal processor (DSP) are discussed. The overall position estimation strategy is accomplished with a fast DSP (TMS320C30). The method is a shaft-position sensorless method that is applicable to a wide range of excitation types in brushless PM motors without any restriction on the motor model and the current excitation. Both rectangular and sinewave-excited brushless PM motor drives are examined, and the results are given to demonstrate the effectiveness of the method with dynamic loads in a closed estimated-position loop.
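
    As a rough illustration of the flux-linkage approach, the sketch below integrates the stator voltage equation and takes the angle of the magnet flux as the rotor position. It is a simplified stationary-frame version, not the paper's abc-frame formulation; R_S and L_S are assumed machine constants.

```python
import math

# Flux-linkage-based rotor position estimate for a PM machine (sketch).
R_S = 0.5   # stator resistance [ohm] (assumed)
L_S = 2e-3  # stator inductance [H] (assumed)

def estimate_position(v_ab, i_ab, psi_prev, dt):
    """One integration step: returns (new flux linkage, rotor angle)."""
    # Stator flux from the voltage model: psi = integral of (v - R*i)
    psi = (psi_prev[0] + (v_ab[0] - R_S * i_ab[0]) * dt,
           psi_prev[1] + (v_ab[1] - R_S * i_ab[1]) * dt)
    # Magnet flux = stator flux minus the armature-reaction term L*i;
    # its angle is the rotor electrical position.
    theta = math.atan2(psi[1] - L_S * i_ab[1], psi[0] - L_S * i_ab[0])
    return psi, theta
```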

  3. Signal injection as a fault detection technique.

    PubMed

    Cusidó, Jordi; Romeral, Luis; Ortega, Juan Antonio; Garcia, Antoni; Riba, Jordi

    2011-01-01

    Double-frequency tests are used for evaluating stator windings and analyzing their temperature. Likewise, signal injection on induction machines is used in the sensorless motor control field to find the rotor position. Motor Current Signature Analysis (MCSA), which focuses on the spectral analysis of stator current, is the most widely used method for identifying faults in induction motors. Motor faults such as broken rotor bars, bearing damage and eccentricity of the rotor axis can be detected. However, the method presents some problems at low speed and low torque, mainly due to the proximity between the frequencies to be detected and the small amplitude of the resulting harmonics. This paper proposes injecting an additional voltage into the machine under test at a frequency different from the fundamental one, and then studying the resulting harmonics around the new frequencies that appear due to the mixing of the injected and main frequencies.
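
    The spectral step underlying both MCSA and the proposed injection variant can be sketched simply: window the stator current, take its spectrum, and read the amplitudes at candidate fault sidebands. The sampling rate, center frequency and offsets below are illustrative assumptions; with injection, f_center would be set near the injected frequency instead of the fundamental.

```python
import numpy as np

def sideband_amplitudes(i_stator, fs, f_center, offsets):
    """Return spectral amplitude at f_center + each offset (Hz)."""
    n = len(i_stator)
    spec = np.abs(np.fft.rfft(i_stator * np.hanning(n))) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    out = {}
    for df in offsets:
        k = np.argmin(np.abs(freqs - (f_center + df)))  # nearest FFT bin
        out[f_center + df] = spec[k]
    return out

# Example: broken-bar sidebands at (1 +/- 2s)*f for slip s = 0.03, f = 50 Hz
# amps = sideband_amplitudes(i, fs=10_000, f_center=50.0, offsets=(-3.0, 3.0))
```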

  4. A Novel Design of an Automatic Lighting Control System for a Wireless Sensor Network with Increased Sensor Lifetime and Reduced Sensor Numbers

    PubMed Central

    Mohamaddoust, Reza; Haghighat, Abolfazl Toroghi; Sharif, Mohamad Javad Motahari; Capanni, Niccolo

    2011-01-01

    Wireless sensor networks (WSN) are currently being applied to energy conservation applications such as light control. We propose a design for such a system called a Lighting Automatic Control System (LACS). The LACS system contains a centralized or distributed architecture determined by application requirements and space usage. The system optimizes the calculations and communications for lighting intensity, incorporates user illumination requirements according to their activities and performs adjustments based on external lighting effects in external sensor and external sensor-less architectures. Methods are proposed for reducing the number of sensors required and increasing the lifetime of those used, for considerably reduced energy consumption. Additionally we suggest methods for improving uniformity of illuminance distribution on a workplane’s surface, which improves user satisfaction. Finally, simulation results are presented to verify the effectiveness of our design. PMID:22164114

  5. DSP-based adaptive backstepping using the tracking errors for high-performance sensorless speed control of induction motor drive.

    PubMed

    Zaafouri, Abderrahmen; Regaya, Chiheb Ben; Azza, Hechmi Ben; Châari, Abdelkader

    2016-01-01

    This paper presents a modified structure for backstepping nonlinear control of the induction motor (IM), fitted with an adaptive backstepping speed observer. The control design is based on the backstepping technique, complemented by the introduction of integral tracking error action to improve its robustness. Unlike other research on backstepping control with integral action, the control law developed in this paper does not increase the number of system states, so as not to increase the complexity of solving the differential equations. The digital simulation and experimental results show the effectiveness of the proposed control compared to conventional PI control. The analysis of the results shows the characteristic robustness of the adaptive control to load disturbances, speed variation, and low-speed operation. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
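
    To make the integral-action idea concrete, here is a minimal, hedged sketch for a first-order mechanical model J*domega/dt = T - B*omega - T_load: the tracking error and its running integral both enter the torque law without adding a new plant state. Gains and model constants are invented; the paper's full induction-motor design (fluxes, currents, adaptation law) is far more elaborate.

```python
J, B = 0.01, 0.001     # inertia [kg*m^2], viscous friction (assumed)
K1, KI = 50.0, 200.0   # error and integral gains (assumed)

class SpeedController:
    def __init__(self):
        self.int_e = 0.0   # running integral of the tracking error

    def torque_cmd(self, omega_ref, d_omega_ref, omega, dt):
        """Torque command driving omega toward omega_ref."""
        e = omega_ref - omega
        self.int_e += e * dt   # integral action adds robustness
        # Lyapunov-motivated law: feed forward the reference dynamics,
        # cancel friction, stabilize with e and its integral.
        return J * (d_omega_ref + K1 * e + KI * self.int_e) + B * omega
```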

  6. Signal Injection as a Fault Detection Technique

    PubMed Central

    Cusidó, Jordi; Romeral, Luis; Ortega, Juan Antonio; Garcia, Antoni; Riba, Jordi

    2011-01-01

    Double-frequency tests are used for evaluating stator windings and analyzing their temperature. Likewise, signal injection on induction machines is used in the sensorless motor control field to find the rotor position. Motor Current Signature Analysis (MCSA), which focuses on the spectral analysis of stator current, is the most widely used method for identifying faults in induction motors. Motor faults such as broken rotor bars, bearing damage and eccentricity of the rotor axis can be detected. However, the method presents some problems at low speed and low torque, mainly due to the proximity between the frequencies to be detected and the small amplitude of the resulting harmonics. This paper proposes injecting an additional voltage into the machine under test at a frequency different from the fundamental one, and then studying the resulting harmonics around the new frequencies that appear due to the mixing of the injected and main frequencies. PMID:22163801

  7. A novel design of an automatic lighting control system for a wireless sensor network with increased sensor lifetime and reduced sensor numbers.

    PubMed

    Mohamaddoust, Reza; Haghighat, Abolfazl Toroghi; Sharif, Mohamad Javad Motahari; Capanni, Niccolo

    2011-01-01

    Wireless sensor networks (WSN) are currently being applied to energy conservation applications such as light control. We propose a design for such a system called a lighting automatic control system (LACS). The LACS system contains a centralized or distributed architecture determined by application requirements and space usage. The system optimizes the calculations and communications for lighting intensity, incorporates user illumination requirements according to their activities and performs adjustments based on external lighting effects in external sensor and external sensor-less architectures. Methods are proposed for reducing the number of sensors required and increasing the lifetime of those used, for considerably reduced energy consumption. Additionally we suggest methods for improving uniformity of illuminance distribution on a workplane's surface, which improves user satisfaction. Finally, simulation results are presented to verify the effectiveness of our design.

  8. A cascading failure analysis tool for post processing TRANSCARE simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This is a MATLAB-based tool to post-process simulation results from the EPRI software TRANSCARE, for massive cascading failure analysis following severe disturbances. There are a few key modules available in this tool, including: 1. automatically creating a contingency list for TRANSCARE simulations, including substation outages above a certain kV threshold and N-k (1, 2 or 3) generator and branch outages; 2. reading in and analyzing a CKO file of PCG definition, an initiating event list, and a CDN file; 3. post-processing all the simulation results saved in a CDN file and performing critical event corridor analysis; 4. providing a summary of TRANSCARE simulations; 5. identifying the most frequently occurring event corridors in the system; and 6. ranking the contingencies using a user-defined security index to quantify consequences in terms of total load loss, total number of cascades, etc.

  9. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
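
    Task 3 in the list above is essentially a reachability search over a directed model of system-software interactions. The following hedged sketch shows the idea on a made-up miniature graph; the node names and edges are invented for illustration only.

```python
from collections import deque

# Toy directed model: edges map each node to the nodes it can affect.
edges = {
    "valve_stuck":  ["press_sensor"],
    "press_sensor": ["abort_logic"],
    "abort_logic":  ["engine_cutoff"],
    "emi_burst":    ["press_sensor", "comm_bus"],
    "comm_bus":     ["abort_logic"],
}

def hazard_paths(src: str, target: str):
    """BFS enumeration of simple paths from a hazard source to a target."""
    out, queue = [], deque([[src]])
    while queue:
        path = queue.popleft()
        for nxt in edges.get(path[-1], []):
            if nxt in path:
                continue          # skip cycles
            if nxt == target:
                out.append(path + [nxt])
            else:
                queue.append(path + [nxt])
    return out

print(hazard_paths("emi_burst", "engine_cutoff"))  # two candidate paths
```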

  10. Constitutive behavior and progressive mechanical failure of electrodes in lithium-ion batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Chao; Xu, Jun; Cao, Lei

    The electrodes of lithium-ion batteries (LIB) are known to be brittle and to fail earlier than the separators during an external crush event. Thus, the understanding of mechanical failure mechanism for LIB electrodes (anode and cathode) is critical for the safety design of LIB cells. In this paper, we present experimental and numerical studies on the constitutive behavior and progression of failure in LIB electrodes. Mechanical tests were designed and conducted to evaluate the constitutive properties of porous electrodes. Constitutive models were developed to describe the stress-strain response of electrodes under uniaxial tensile and compressive loads. The failure criterion and a damage model were introduced to model their unique tensile and compressive failure behavior. The failure mechanism of LIB electrodes was studied using the blunt rod test on dry electrodes, and numerical models were built to simulate progressive failure. The different failure processes were examined and analyzed in detail numerically, and correlated with experimentally observed failure phenomena. Finally, the test results and models improve our understanding of failure behavior in LIB electrodes, and provide constructive insights on future development of physics-based safety design tools for battery structures under mechanical abuse.

  11. Failures of Sustained Attention in Life, Lab, and Brain: Ecological Validity of the SART

    ERIC Educational Resources Information Center

    Smilek, Daniel; Carriere, Jonathan S. A.; Cheyne, J. Allan

    2010-01-01

    The Sustained Attention to Response Task (SART) is a widely used tool in cognitive neuroscience increasingly employed to identify brain regions associated with failures of sustained attention. An important claim of the SART is that it is significantly related to real-world problems of sustained attention such as those experienced by TBI and ADHD…

  12. Why Won't You Do What I Want? The Informative Failures of Children and Models

    ERIC Educational Resources Information Center

    Chatham, Christopher H.; Yerys, Benjamin E.; Munakata, Yuko

    2012-01-01

    Computational models are powerful tools--too powerful, according to some. We argue that the idea that models can "do anything" is wrong, and we describe how their failures have been informative. We present new work showing surprising diversity in the effects of feedback on children's task-switching, such that some children perseverate despite this…

  13. Fault Tree Analysis: An Operations Research Tool for Identifying and Reducing Undesired Events in Training.

    ERIC Educational Resources Information Center

    Barker, Bruce O.; Petersen, Paul D.

    This paper explores the fault-tree analysis approach to isolating failure modes within a system. Fault-tree analysis investigates potentially undesirable events and then looks for sequences of failures that would lead to their occurrence. Relationships among these events are symbolized by AND or OR logic gates, AND used when single events must coexist to…
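
    For independent basic events, the AND/OR gate logic the paper describes reduces to simple probability combinators. A toy sketch follows; the tree shape and probabilities are invented for the example.

```python
from math import prod

def gate_or(ps):   # at least one input event occurs
    return 1.0 - prod(1.0 - p for p in ps)

def gate_and(ps):  # all input events must coexist
    return prod(ps)

# Top event: (A AND B) OR C, with independent basic-event probabilities
p_top = gate_or([gate_and([0.01, 0.2]), 0.005])
print(f"P(top event) = {p_top:.4f}")   # 0.0070
```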

  14. Closed-form solution of decomposable stochastic models

    NASA Technical Reports Server (NTRS)

    Sjogren, Jon A.

    1990-01-01

    Markov and semi-Markov processes are increasingly being used in the modeling of complex reconfigurable systems (fault tolerant computers). The estimation of the reliability (or some measure of performance) of the system reduces to solving the process for its state probabilities. Such a model may exhibit numerous states and complicated transition distributions, contributing to an expensive and numerically delicate solution procedure. Thus, when a system exhibits a decomposition property, either structurally (autonomous subsystems), or behaviorally (component failure versus reconfiguration), it is desirable to exploit this decomposition in the reliability calculation. In interesting cases there can be failure states which arise from non-failure states of the subsystems. Equations are presented which allow the computation of failure probabilities of the total (combined) model without requiring a complete solution of the combined model. This material is presented within the context of closed-form functional representation of probabilities as utilized in the Symbolic Hierarchical Automated Reliability and Performance Evaluator (SHARPE) tool. The techniques adopted enable one to compute such probability functions for a much wider class of systems at a reduced computational cost. Several examples show how the method is used, especially in enhancing the versatility of the SHARPE tool.
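
    A toy illustration of the decomposition idea, under the strong assumption that the subsystems are statistically independent: solve each subsystem separately for its state probabilities, then score the combined states, including system failures that arise from non-failed subsystem states (here, both subsystems merely degraded). All numbers and state names are invented.

```python
import itertools

# P(state) for each independently solved subsystem at mission time t
sub_a = {"up": 0.90, "degraded": 0.08, "failed": 0.02}
sub_b = {"up": 0.85, "degraded": 0.12, "failed": 0.03}

def system_failed(sa: str, sb: str) -> bool:
    # Either subsystem failed outright, or both degraded at once
    return "failed" in (sa, sb) or (sa == sb == "degraded")

p_fail = sum(pa * pb
             for (sa, pa), (sb, pb) in itertools.product(sub_a.items(),
                                                         sub_b.items())
             if system_failed(sa, sb))
print(f"P(system failure) = {p_fail:.4f}")
```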

  15. Estimation of probability of failure for damage-tolerant aerospace structures

    NASA Astrophysics Data System (ADS)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem are collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools are lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This dissertation describes and develops new PDTA methodologies that directly address the deficiencies of the currently used tools. The new methods are implemented as a free, publicly licensed and open source R software package that can be downloaded from the Comprehensive R Archive Network. The tools consist of two main components. First, an explicit (and expensive) Monte Carlo approach is presented which simulates the life of an aircraft structural component flight-by-flight. This straightforward MC routine can be used to provide defensible estimates of the failure probabilities for future flights and repair probabilities for future inspections under a variety of failure and maintenance scenarios. This routine is intended to provide baseline estimates against which to compare the results of other, more efficient approaches. Second, an original approach is described which models the fatigue process and future scheduled inspections as a hidden Markov model. This model is solved using a particle-based approximation and the sequential importance sampling algorithm, which provides an efficient solution to the PDTA problem. Sequential importance sampling is an extension of importance sampling to a Markov process, allowing for efficient Bayesian updating of model parameters. This model updating capability, the benefit of which is demonstrated, is lacking in other PDTA approaches. The results of this approach are shown to agree with the results of the explicit Monte Carlo routine for a number of PDTA problems. 
Extensions to the typical PDTA problem, which cannot be solved using currently available tools, are presented and solved in this work. These extensions include incorporating observed evidence (such as non-destructive inspection results), more realistic treatment of possible future repairs, and the modeling of failure involving more than one crack (the so-called continuing damage problem). The described hidden Markov model / sequential importance sampling approach to PDTA has the potential to improve aerospace structural safety and reduce maintenance costs by providing a more accurate assessment of the risk of failure and the likelihood of repairs throughout the life of an aircraft.
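
    The flavor of the explicit Monte Carlo component can be sketched in a few lines: sample an initial crack size, grow it flight-by-flight with a Paris-type law, and count lives that reach a critical size. Every constant, distribution and spectrum below is an invented placeholder, not the dissertation's model.

```python
import math
import random

C, M = 4e-11, 3.0      # Paris constants (assumed; a in mm, ds in MPa)
A_CRIT = 25.0          # critical crack length [mm] (assumed)
FLIGHTS = 15_000       # design life in flights (assumed)

def one_life(rng: random.Random) -> bool:
    """Simulate one component life; True if it fails within FLIGHTS."""
    a = rng.lognormvariate(math.log(0.5), 0.3)   # initial crack [mm]
    for _ in range(FLIGHTS):
        ds = max(rng.gauss(90.0, 10.0), 0.0)     # per-flight stress range
        a += C * (ds * math.sqrt(math.pi * a)) ** M   # Paris growth step
        if a >= A_CRIT:
            return True
    return False

rng = random.Random(1)
trials = 1_000
p_fail = sum(one_life(rng) for _ in range(trials)) / trials
print(f"Estimated P(failure within {FLIGHTS} flights) = {p_fail:.3f}")
```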

  16. Equipment failures and their contribution to industrial incidents and accidents in the manufacturing industry.

    PubMed

    Bourassa, Dominic; Gauthier, François; Abdul-Nour, Georges

    2016-01-01

    Accidental events in manufacturing industries can be caused by many factors, including work methods, lack of training, equipment design, maintenance and reliability. This study aims at determining the contribution of failures of commonly used industrial equipment, such as machines, tools and material handling equipment, to the chain of causality of industrial accidents and incidents. Based on a case study analyzing an existing pulp and paper company's accident database, this paper examines the number, type and gravity of the failures involved in these events and their causes. Results from this study show that equipment failures had a major effect on the number and severity of accidents accounted for in the database: 272 out of 773 accidental events were related to equipment failure, of which 13 had direct human consequences. Failures that contributed directly or indirectly to these events are analyzed.

  17. Tensile failure criteria for fiber composite materials

    NASA Technical Reports Server (NTRS)

    Rosen, B. W.; Zweben, C. H.

    1972-01-01

    The analysis provides insight into the failure mechanics of these materials and defines criteria which serve as tools for material selection in preliminary design and for material reliability assessment. The model incorporates both dispersed and propagation-type failures and includes the influence of material heterogeneity. The important effects of localized matrix damage and post-failure matrix shear stress transfer are included in the treatment. The model is used to evaluate the influence of key parameters on the failure of several commonly used fiber-matrix systems. Analyses of three possible failure modes were developed. These modes are the fiber break propagation mode, the cumulative group fracture mode, and the weakest link mode. Application of the new model to composite material systems has indicated several results which require attention in the development of reliable structural composites. Prominent among these are the size effect and the influence of fiber strength variability.
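
    Of the three modes listed, the weakest-link mode has the simplest probabilistic form and directly exhibits the size effect the paper highlights: a chain of N independent fiber segments fails as soon as its weakest member does. A hedged sketch with invented Weibull parameters:

```python
import math

SIGMA_0 = 3000.0   # Weibull scale [MPa] (assumed)
M_SHAPE = 8.0      # Weibull shape (assumed; higher = less scatter)

def p_fail_weakest_link(stress: float, n_links: int) -> float:
    """P(failure) of N links in series under a common stress."""
    p_single = 1.0 - math.exp(-((stress / SIGMA_0) ** M_SHAPE))
    return 1.0 - (1.0 - p_single) ** n_links

# Size effect: the same stress is far more dangerous for a larger volume
for n in (1, 100, 10_000):
    print(n, round(p_fail_weakest_link(1800.0, n), 4))
```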

  18. Comparison of Damage Path Predictions for Composite Laminates by Explicit and Standard Finite Element Analysis Tools

    NASA Technical Reports Server (NTRS)

    Bogert, Philip B.; Satyanarayana, Arunkumar; Chunchu, Prasad B.

    2006-01-01

    Splitting, ultimate failure load and the damage path in center notched composite specimens subjected to in-plane tension loading are predicted using progressive failure analysis methodology. A 2-D Hashin-Rotem failure criterion is used in determining intra-laminar fiber and matrix failures. This progressive failure methodology has been implemented in the Abaqus/Explicit and Abaqus/Standard finite element codes through user written subroutines "VUMAT" and "USDFLD" respectively. A 2-D finite element model is used for predicting the intra-laminar damages. Analysis results obtained from the Abaqus/Explicit and Abaqus/Standard code show good agreement with experimental results. The importance of modeling delamination in progressive failure analysis methodology is recognized for future studies. The use of an explicit integration dynamics code for simple specimen geometry and static loading establishes a foundation for future analyses where complex loading and nonlinear dynamic interactions of damage and structure will necessitate it.
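
    The 2-D Hashin-Rotem check used above separates a fiber mode, driven by the longitudinal stress, from a matrix mode combining transverse and shear stresses. The sketch below shows the usual form of these indices; the strength values are placeholders, not the paper's material data.

```python
XT, XC = 1500.0, 1200.0   # fiber-direction tensile/compressive strengths [MPa]
YT, YC = 40.0, 150.0      # transverse tensile/compressive strengths [MPa]
S12 = 70.0                # in-plane shear strength [MPa]

def hashin_rotem(s11: float, s22: float, t12: float) -> dict:
    """Return failure indices; a value >= 1 flags failure in that mode."""
    fiber = s11 / XT if s11 >= 0 else -s11 / XC
    y = YT if s22 >= 0 else YC
    matrix = (s22 / y) ** 2 + (t12 / S12) ** 2
    return {"fiber": fiber, "matrix": matrix}

print(hashin_rotem(s11=800.0, s22=25.0, t12=45.0))
# approx {'fiber': 0.53, 'matrix': 0.80}: both below 1, so no failure yet
```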

  19. A Focus Group Exploration of Automated Case-Finders to Identify High-Risk Heart Failure Patients Within an Urban Safety Net Hospital.

    PubMed

    Patterson, Mark E; Miranda, Derick; Schuman, Greg; Eaton, Christopher; Smith, Andrew; Silver, Brad

    2016-01-01

    Leveraging "big data" as a means of informing cost-effective care holds potential in triaging high-risk heart failure (HF) patients for interventions within hospitals seeking to reduce 30-day readmissions. Explore provider's beliefs and perceptions about using an electronic health record (EHR)-based tool that uses unstructured clinical notes to risk-stratify high-risk heart failure patients. Six providers from an inpatient HF clinic within an urban safety net hospital were recruited to participate in a semistructured focus group. A facilitator led a discussion on the feasibility and value of using an EHR tool driven by unstructured clinical notes to help identify high-risk patients. Data collected from transcripts were analyzed using a thematic analysis that facilitated drawing conclusions clustered around categories and themes. From six categories emerged two themes: (1) challenges of finding valid and accurate results, and (2) strategies used to overcome these challenges. Although employing a tool that uses electronic medical record (EMR) unstructured text as the benchmark by which to identify high-risk patients is efficient, choosing appropriate benchmark groups could be challenging given the multiple causes of readmission. Strategies to mitigate these challenges include establishing clear selection criteria to guide benchmark group composition, and quality outcome goals for the hospital. Prior to implementing into practice an innovative EMR-based case-finder driven by unstructured clinical notes, providers are advised to do the following: (1) define patient quality outcome goals, (2) establish criteria by which to guide benchmark selection, and (3) verify the tool's validity and reliability. Achieving consensus on these issues would be necessary for this innovative EHR-based tool to effectively improve clinical decision-making and in turn, decrease readmissions for high-risk patients.

  20. [Failure mode and effects analysis (FMEA) of insulin in a mother-child university-affiliated health center].

    PubMed

    Berruyer, M; Atkinson, S; Lebel, D; Bussières, J-F

    2016-01-01

    Insulin is a high-alert drug. The main objective of this descriptive cross-sectional study was to evaluate the risks associated with insulin use in healthcare centers. The secondary objective was to propose corrective measures to reduce the main risks associated with the most critical failure modes in the analysis. We conducted a failure mode and effects analysis (FMEA) in obstetrics-gynecology, neonatology and pediatrics. Five multidisciplinary meetings occurred in August 2013. A total of 44 out of 49 failure modes were analyzed. Nine out of 44 (20%) failure modes were deemed critical, with a criticality score ranging from 540 to 720. Following the multidisciplinary meetings, everybody agreed that an FMEA was a useful tool to identify failure modes and their relative importance. This approach identified many corrective measures. This shared experience increased awareness of safety issues with insulin in our mother-child center. This study identified the main failure modes and associated corrective measures. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  1. Constitutive behavior and progressive mechanical failure of electrodes in lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Zhang, Chao; Xu, Jun; Cao, Lei; Wu, Zenan; Santhanagopalan, Shriram

    2017-07-01

    The electrodes of lithium-ion batteries (LIB) are known to be brittle and to fail earlier than the separators during an external crush event. Thus, the understanding of mechanical failure mechanism for LIB electrodes (anode and cathode) is critical for the safety design of LIB cells. In this paper, we present experimental and numerical studies on the constitutive behavior and progression of failure in LIB electrodes. Mechanical tests were designed and conducted to evaluate the constitutive properties of porous electrodes. Constitutive models were developed to describe the stress-strain response of electrodes under uniaxial tensile and compressive loads. The failure criterion and a damage model were introduced to model their unique tensile and compressive failure behavior. The failure mechanism of LIB electrodes was studied using the blunt rod test on dry electrodes, and numerical models were built to simulate progressive failure. The different failure processes were examined and analyzed in detail numerically, and correlated with experimentally observed failure phenomena. The test results and models improve our understanding of failure behavior in LIB electrodes, and provide constructive insights on future development of physics-based safety design tools for battery structures under mechanical abuse.
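
    The "failure criterion plus damage model" ingredient can be illustrated in one dimension: linear elasticity up to a damage-onset strain, then a scalar damage variable that degrades the stiffness to zero. All constants below are invented, not the paper's calibrated values.

```python
import numpy as np

E0 = 500.0        # initial modulus [MPa] (assumed for a porous coating)
EPS_F = 0.01      # strain at damage onset (assumed)
EPS_U = 0.03      # strain at complete failure (assumed)

def stress(eps: float) -> float:
    """1-D stress with linear damage growth between EPS_F and EPS_U."""
    if eps <= EPS_F:
        d = 0.0                                    # undamaged
    elif eps < EPS_U:
        d = (eps - EPS_F) / (EPS_U - EPS_F)        # linear damage growth
    else:
        d = 1.0                                    # fully failed
    return (1.0 - d) * E0 * eps                    # degraded stiffness

for e in np.linspace(0.0, 0.035, 8):
    print(f"eps={e:.3f}  sigma={stress(e):6.2f} MPa")
```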

  2. Constitutive behavior and progressive mechanical failure of electrodes in lithium-ion batteries

    DOE PAGES

    Zhang, Chao; Xu, Jun; Cao, Lei; ...

    2017-05-05

    The electrodes of lithium-ion batteries (LIB) are known to be brittle and to fail earlier than the separators during an external crush event. Thus, the understanding of mechanical failure mechanism for LIB electrodes (anode and cathode) is critical for the safety design of LIB cells. In this paper, we present experimental and numerical studies on the constitutive behavior and progression of failure in LIB electrodes. Mechanical tests were designed and conducted to evaluate the constitutive properties of porous electrodes. Constitutive models were developed to describe the stress-strain response of electrodes under uniaxial tensile and compressive loads. The failure criterion and a damage model were introduced to model their unique tensile and compressive failure behavior. The failure mechanism of LIB electrodes was studied using the blunt rod test on dry electrodes, and numerical models were built to simulate progressive failure. The different failure processes were examined and analyzed in detail numerically, and correlated with experimentally observed failure phenomena. Finally, the test results and models improve our understanding of failure behavior in LIB electrodes, and provide constructive insights on future development of physics-based safety design tools for battery structures under mechanical abuse.

  3. Modelling the failure behaviour of wind turbines

    NASA Astrophysics Data System (ADS)

    Faulstich, S.; Berkhout, V.; Mayer, J.; Siebenlist, D.

    2016-09-01

    Modelling the failure behaviour of wind turbines is an essential part of offshore wind farm simulation software, as it leads to optimized decision making when specifying the necessary resources for the operation and maintenance of wind farms. In order to optimize O&M strategies, a thorough understanding of a wind turbine's failure behaviour is vital and is therefore being developed at Fraunhofer IWES. Within this article, first the failure models of existing offshore O&M tools are presented to show the state of the art, and the strengths and weaknesses of the respective models are briefly discussed. Then a conceptual framework for modelling different failure mechanisms of wind turbines is presented. This framework takes into account the different wind turbine subsystems and structures as well as the failure modes of a component, by applying several influencing factors representing wear and break failure mechanisms. A failure function is set up for the rotor blade as an exemplary component, and simulation results have been compared to a constant failure rate and to empirical wind turbine fleet data as a reference. The comparison and the breakdown of specific failure categories demonstrate the overall plausibility of the model.
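
    A toy version of such a failure function might combine a constant random-break rate with an age-dependent wear term scaled by site-specific influencing factors. The structure below is a guess at the general shape only; every number is invented.

```python
LAMBDA_BREAK = 0.05   # random failures per turbine-year (assumed)

def rotor_blade_failure_rate(age_years: float,
                             turbulence_factor: float = 1.0,
                             load_factor: float = 1.0) -> float:
    """Failures/year for one blade set as a function of age and site."""
    # Weibull-type wear term: hazard increasing with age (shape beta > 1)
    beta, eta = 2.5, 25.0   # shape, characteristic life in years (assumed)
    wear = (beta / eta) * (age_years / eta) ** (beta - 1.0)
    return LAMBDA_BREAK + turbulence_factor * load_factor * wear

for age in (1, 5, 15, 25):
    print(age, round(rotor_blade_failure_rate(age, 1.2, 1.1), 4))
```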

  4. Failure mode and effects analysis based risk profile assessment for stereotactic radiosurgery programs at three cancer centers in Brazil.

    PubMed

    Teixeira, Flavia C; de Almeida, Carlos E; Saiful Huq, M

    2016-01-01

    The goal of this study was to evaluate the safety and quality management program for stereotactic radiosurgery (SRS) treatment processes at three radiotherapy centers in Brazil by using three industrial engineering tools: (1) process mapping, (2) failure modes and effects analysis (FMEA), and (3) fault tree analysis. The recommendations of Task Group 100 of the American Association of Physicists in Medicine were followed to create a process tree for the SRS procedure at each radiotherapy center, and FMEA was then performed. Failure modes were identified for all process steps, and risk priority number (RPN) values were calculated from the O, S, and D scores (RPN = O × S × D) assigned by the professional team responsible for patient care. The treatment planning subprocess presented the highest number of failure modes at all centers. The total numbers of failure modes were 135, 104, and 131 for centers I, II, and III, respectively. The highest RPN value for each center was as follows: center I (204), center II (372), and center III (370). Failure modes with RPN ≥ 100: center I (22), center II (115), and center III (110). Failure modes characterized by S ≥ 7 represented 68% of the failure modes for center III, 62% for center II, and 45% for center I. Failure modes with RPN values ≥ 100 and with S ≥ 7, D ≥ 5, and O ≥ 5 were considered high priority in this study. The results of the present study show that the safety risk profiles for the same stereotactic radiotherapy process are different at the three radiotherapy centers in Brazil. Although the treatment process is the same, this study showed that the risk priorities differ, which will lead to the implementation of different safety interventions among the centers. Therefore, the current practice of applying universal device-centric QA is not adequate to address all possible failures in clinical processes at different radiotherapy centers. Integrated approaches combining device-centric and process-specific quality management, tailored to each radiotherapy center, are the key to a safe quality management program.
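
    The RPN bookkeeping used above is easy to make concrete: score each failure mode for occurrence, severity and detectability, multiply, and rank. The failure modes and scores below are invented examples, not the study's data.

```python
failure_modes = [
    # (description,                        O,  S,  D)
    ("wrong CT dataset imported",           3,  9,  6),
    ("target contour on wrong image set",   2, 10,  5),
    ("couch index mis-set at treatment",    5,  7,  4),
]

ranked = sorted(((desc, o * s * d, o, s, d)
                 for desc, o, s, d in failure_modes),
                key=lambda r: r[1], reverse=True)

for desc, rpn, o, s, d in ranked:
    flag = "HIGH PRIORITY" if rpn >= 100 or s >= 7 else ""
    print(f"RPN={rpn:4d} (O={o} S={s} D={d})  {desc}  {flag}")
```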

  5. A Study of the Impact of Peak Demand on Increasing Vulnerability of Cascading Failures to Extreme Contingency Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vyakaranam, Bharat GNVSR; Vallem, Mallikarjuna R.; Nguyen, Tony B.

    The vulnerability of large power systems to cascading failures and major blackouts has become evident since the Northeast blackout in 1965. Based on analyses of the series of cascading blackouts in the past decade, the research community realized the urgent need to develop better methods, tools, and practices for performing cascading-outage analysis and for evaluating mitigations that are easily accessible by utility planning engineers. PNNL has developed the Dynamic Contingency Analysis Tool (DCAT) as an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. DCAT analysis will help identify potential vulnerabilities and allow study of mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. Using the DCAT capability, we examined the impacts of various load conditions to identify situations in which the power grid may encounter cascading outages that could lead to potential blackouts. This paper describes the usefulness of the DCAT tool and how it helps to understand potential impacts of load demand on cascading failures on the power system.

  6. Design and Evaluation of a Web-Based Symptom Monitoring Tool for Heart Failure.

    PubMed

    Wakefield, Bonnie J; Alexander, Gregory; Dohrmann, Mary; Richardson, James

    2017-05-01

    Heart failure is a chronic condition where symptom recognition and between-visit communication with providers are critical. Patients are encouraged to track disease-specific data, such as weight and shortness of breath. Use of a Web-based tool that facilitates data display in graph form may help patients recognize exacerbations and more easily communicate out-of-range data to clinicians. The purposes of this study were to (1) design a Web-based tool to facilitate symptom monitoring and symptom recognition in patients with chronic heart failure and (2) conduct a usability evaluation of the Web site. Patient participants generally had a positive view of the Web site and indicated it would support recording their health status and communicating with their doctors. Clinician participants generally had a positive view of the Web site and indicated it would be a potentially useful adjunct to electronic health delivery systems. Participants expressed a need to incorporate decision support within the site and wanted to add other data, for example, blood pressure, and have the ability to adjust font size. A few expressed concerns about data privacy and security. Technologies require careful design and testing to ensure they are useful, usable, and safe for patients and do not add to the burden of busy providers.

  7. Energy Saving Melting and Revert Reduction Technology (E-SMARRT): Development of Surface Engineered Coating Systems for Aluminum Pressure Die Casting Dies: Towards a 'Smart' Die Coating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. John J. Moore; Dr. Jianliang Lin

    2012-07-31

    The main objective of this research program was to design and develop an optimal coating system that extends die life by minimizing premature die failure. In high-pressure aluminum die-casting, the die, core pins and inserts must withstand severe processing conditions. Many of the dies and tools in the industry are being coated to improve wear-resistance and decrease down-time for maintenance. However, thermal fatigue in the metal itself can still be a major problem, especially since it often leads to catastrophic failure (i.e. die breakage) as opposed to a wear-based failure (parts begin to go out of tolerance). Tooling costs remain the largest portion of production costs for many of these parts, so the ability to prevent catastrophic failures would be transformative for the manufacturing industry. The technology offers energy savings in the die casting process through several factors, including increased life of the tools and dies, reuse of the dies and die components, reduction/elimination of lubricants, reduced machine down time, and reduction of Al solder sticking on the die. The use of the optimized die coating system will also reduce environmental wastes and scrap parts. The current (2012) annual energy saving estimate, based on initial dissemination to the casting industry in 2010 and market penetration of 80% by 2020, is 3.1 trillion BTU/year. The average annual estimate of CO2 reduction per year through 2020 is 0.63 Million Metric Tons of Carbon Equivalent (MM TCE).

  8. Morbidity and Mortality conference as part of PDCA cycle to decrease anastomotic failure in colorectal surgery.

    PubMed

    Vogel, Peter; Vassilev, Georgi; Kruse, Bernd; Cankaya, Yesim

    2011-10-01

    Morbidity and Mortality meetings are an accepted tool for quality management in many hospitals. However, it is not proven whether these meetings increase quality. It was the aim of this study to investigate whether Morbidity and Mortality meetings as part of a PDCA cycle (Plan, Do, Check, Act) can improve the rate of anastomotic failure in colorectal surgery. From January 1, 2004, to December 31, 2009, data for all anastomotic failures in patients operated on for colorectal diseases in the Department of Surgery (Klinikum Friedrichshafen, Germany) were prospectively collected. The events were discussed in Morbidity and Mortality meetings. On the basis of these discussions, a strategy to prevent anastomotic leaks and a new target were defined (i.e. 'Plan'). This strategy was implemented in the following period (i.e. 'Do') and results were prospectively analysed. A new strategy was established when the results differed from the target, and a new standard was defined when the target was achieved (i.e. 'Check, Act'). The year 2004 was set as the base year. In 2005 and 2006, new strategies were established. Comparing this period with the period of strategy conversion (2007-2009), we found a significant decrease in the anastomotic failure rate in colorectal surgery patients (5.7% vs 2.8%; p = 0.05), whereas the risk factors for anastomotic failure were unchanged or unfavourable. If Morbidity and Mortality meetings are integrated in a PDCA cycle, they can decrease anastomotic failure rates and improve quality of care in colorectal surgery. Therefore, the management tool 'PDCA cycle' should be considered also for medical issues.

  9. 32 CFR 507.18 - Processing complaints of alleged breach of policies.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... (4) Manufacturers may be suspended for failure to return a loaned tool without referral to a hearing... tools are overdue and suspension will take effect if not returned within the specified time. (d..., time, or place of the hearing for purposes of having reasonable time in which to prepare the case. (iv...

  10. 29 CFR 1926.302 - Power-operated hand tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the source of supply or branch line to reduce pressure in case of hose failure. (8) Airless spray guns... the open barrel end. (6) Loaded tools shall not be left unattended. (7) Fasteners shall not be driven...-hardened steel, glass block, live rock, face brick, or hollow tile. (8) Driving into materials easily...

  11. 29 CFR 1926.302 - Power-operated hand tools.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the source of supply or branch line to reduce pressure in case of hose failure. (8) Airless spray guns... the open barrel end. (6) Loaded tools shall not be left unattended. (7) Fasteners shall not be driven...-hardened steel, glass block, live rock, face brick, or hollow tile. (8) Driving into materials easily...

  12. 29 CFR 1926.302 - Power-operated hand tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the source of supply or branch line to reduce pressure in case of hose failure. (8) Airless spray guns... the open barrel end. (6) Loaded tools shall not be left unattended. (7) Fasteners shall not be driven...-hardened steel, glass block, live rock, face brick, or hollow tile. (8) Driving into materials easily...

  13. 29 CFR 1926.302 - Power-operated hand tools.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... the source of supply or branch line to reduce pressure in case of hose failure. (8) Airless spray guns... the open barrel end. (6) Loaded tools shall not be left unattended. (7) Fasteners shall not be driven...-hardened steel, glass block, live rock, face brick, or hollow tile. (8) Driving into materials easily...

  14. 29 CFR 1926.302 - Power-operated hand tools.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... the source of supply or branch line to reduce pressure in case of hose failure. (8) Airless spray guns... the open barrel end. (6) Loaded tools shall not be left unattended. (7) Fasteners shall not be driven...-hardened steel, glass block, live rock, face brick, or hollow tile. (8) Driving into materials easily...

  15. Failure mode and effects analysis of skin electronic brachytherapy using Esteya® unit

    PubMed Central

    Bautista-Ballesteros, Juan Antonio; Bonaque, Jorge; Celada, Francisco; Lliso, Françoise; Carmona, Vicente; Gimeno-Olmos, Jose; Ouhib, Zoubir; Rosello, Joan; Perez-Calatayud, Jose

    2016-01-01

    Purpose: Esteya® (Nucletron, an Elekta company, Elekta AB, Stockholm, Sweden) is an electronic brachytherapy device used for skin cancer lesion treatment. In order to establish an adequate level of treatment quality, a risk analysis of the Esteya treatment process was performed, following the methodology proposed by the TG-100 guidelines of the American Association of Physicists in Medicine (AAPM). Material and methods: A multidisciplinary team familiar with the treatment process was formed. This team developed a process map (PM) outlining the stages through which a patient passes when undergoing Esteya treatment. The team identified potential failure modes (FM), and each individual FM was assessed for severity (S), frequency of occurrence (O), and lack of detection (D). A list of existing quality management tools was developed and the FMs were consensually reevaluated. Finally, the FMs were ranked according to their risk priority number (RPN) and their S. Results: 146 FMs were identified, of which 106 had RPN ≥ 50 and 30 had S ≥ 7. After introducing the quality management tools, only 21 FMs had RPN ≥ 50. The importance of ensuring contact between the applicator and the surface of the patient’s skin was emphasized, so the setup is reviewed by a second individual before each treatment session, with periodic quality control to ensure stability of the applicator pressure. Some of the essential quality management tools already implemented in the installation are simple templates for reproducible positioning of skin applicators, which help mark the treatment area and position the X-ray tube. Conclusions: New quality management tools have been established as a result of applying failure mode and effects analysis (FMEA) to the treatment process. However, periodic updating of the FMEA is necessary, since clinical experience has suggested that further potential failure modes may occur. PMID:28115958

  16. A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools

    DTIC Science & Technology

    1991-04-01

    designs in terms of their contributions toward forced mission termination and vehicle or function loss. Includes the ability to treat failure modes of... ABSTRACT: Inputs: MTBFs, MTTRs, support equipment costs, equipment weights and costs, available targets, military occupational specialty skill level and... US Army CECOM NAME: SPARECOST ABSTRACT: Calculates expected number of failures and performs spares holding optimization based on cost, weight, or...

  17. Medication regimen complexity in ambulatory older adults with heart failure.

    PubMed

    Cobretti, Michael R; Page, Robert L; Linnebur, Sunny A; Deininger, Kimberly M; Ambardekar, Amrut V; Lindenfeld, JoAnn; Aquilante, Christina L

    2017-01-01

    Heart failure prevalence is increasing in older adults, and polypharmacy is a major problem in this population. We compared medication regimen complexity using the validated patient-level Medication Regimen Complexity Index (pMRCI) tool in "young-old" (60-74 years) versus "old-old" (75-89 years) patients with heart failure. We also compared pMRCI between patients with ischemic cardiomyopathy (ISCM) versus nonischemic cardiomyopathy (NISCM). Medication lists were retrospectively abstracted from the electronic medical records of ambulatory patients aged 60-89 years with heart failure. Medications were categorized into three types - heart failure prescription medications, other prescription medications, and over-the-counter (OTC) medications - and scored using the pMRCI tool. The study evaluated 145 patients (n=80 young-old, n=65 old-old, n=85 ISCM, n=60 NISCM, mean age 73±7 years, 64% men, 81% Caucasian). Mean total pMRCI scores (32.1±14.4, range 3-84) and total medication counts (13.3±4.8, range 2-30) were high for the entire cohort, 72% of whom were taking eleven or more total medications. Total and subtype pMRCI scores and medication counts did not differ significantly between the young-old and old-old groups, with the exception of the OTC medication pMRCI score (6.2±4 young-old versus 7.8±5.8 old-old, P = 0.04). With regard to heart failure etiology, total pMRCI scores and medication counts were significantly higher in patients with ISCM versus NISCM (pMRCI score 34.5±15.2 versus 28.8±12.7, P = 0.009; medication count 14.1±4.9 versus 12.2±4.5, P = 0.008), which was largely driven by other prescription medications. Medication regimen complexity is high in older adults with heart failure, and differs based on heart failure etiology. Additional work is needed to address polypharmacy and to determine if medication regimen complexity influences adherence and clinical outcomes in this population.

  18. Failure mode and effects analysis outputs: are they valid?

    PubMed

    Shebl, Nada Atef; Franklin, Bryony Dean; Barber, Nick

    2012-06-10

    Failure Mode and Effects Analysis (FMEA) is a prospective risk assessment tool that has been widely used within the aerospace and automotive industries and has been utilised within healthcare since the early 1990s. The aim of this study was to explore the validity of FMEA outputs within a hospital setting in the United Kingdom. Two multidisciplinary teams each conducted an FMEA for the use of vancomycin and gentamicin. Four different validity tests were conducted: Face validity: by comparing the FMEA participants' mapped processes with observational work. Content validity: by presenting the FMEA findings to other healthcare professionals. Criterion validity: by comparing the FMEA findings with data reported on the trust's incident report database. Construct validity: by exploring the relevant mathematical theories involved in calculating the FMEA risk priority number. Face validity was positive as the researcher documented the same processes of care as mapped by the FMEA participants. However, other healthcare professionals identified potential failures missed by the FMEA teams. Furthermore, the FMEA groups failed to include failures related to omitted doses; yet these were the failures most commonly reported in the trust's incident database. Calculating the RPN by multiplying severity, probability and detectability scores was deemed invalid because it is based on calculations that breach the mathematical properties of the scales used. There are significant methodological challenges in validating FMEA. It is a useful tool to aid multidisciplinary groups in mapping and understanding a process of care; however, the results of our study cast doubt on its validity. FMEA teams are likely to need different sources of information, besides their personal experience and knowledge, to identify potential failures. As for FMEA's methodology for scoring failures, there were discrepancies between the teams' estimates and similar incidents reported on the trust's incident database. Furthermore, the concept of multiplying ordinal scales to prioritise failures is mathematically flawed. Until FMEA's validity is further explored, healthcare organisations should not solely depend on their FMEA results to prioritise patient safety issues.

  19. An Educational Intervention to Evaluate Nurses' Knowledge of Heart Failure.

    PubMed

    Sundel, Siobhan; Ea, Emerson E

    2018-07-01

    Nurses are the main providers of patient education in inpatient and outpatient settings. Unfortunately, nurses may lack knowledge of chronic medical conditions, such as heart failure. The purpose of this one-group pretest-posttest study was to determine the effectiveness of a teaching intervention on nurses' knowledge of heart failure self-care principles in an ambulatory care setting. The sample consisted of 40 staff nurses in ambulatory care. Nurse participants received a focused education intervention based on knowledge deficits revealed in the pretest and were then resurveyed within 30 days. Nurses were evaluated using the valid and reliable 20-item Nurses Knowledge of Heart Failure Education Principles Survey tool. The results of this project demonstrated that an education intervention on heart failure self-care principles produced a statistically significant improvement in nurses' knowledge of heart failure in an ambulatory care setting (p < .05). Results suggest that a teaching intervention could improve knowledge of heart failure, which could lead to better patient education and could reduce patient readmission for heart failure. J Contin Educ Nurs. 2018;49(7):315-321. Copyright 2018, SLACK Incorporated.

  20. "If at first you don't succeed": using failure to improve teaching.

    PubMed

    Pinsky, L E; Irby, D M

    1997-11-01

    The authors surveyed a group of distinguished clinical teachers regarding episodes of failure that had subsequently led to improvements in their teaching. Specifically, they examined how these teachers had used reflection on failed approaches as a tool for experiential learning. The respondents believed that failures were as important as successes in learning to be a good teacher. Using qualitative content analysis of the respondents' comments, the authors identified eight common types of failure associated with each of the three phases of teaching: planning, teaching, and reflection. Common failures associated with the planning stage were misjudging learners, lack of preparation, presenting too much content, lack of purpose, and difficulties with audiovisuals. The primary failure associated with actual teaching was inflexibly using a single teaching method. In the reflection phase, respondents said they most often realized that they had made one of two common errors: selecting the wrong teaching strategy or incorrectly implementing a sound strategy. For each identified failure, the respondents made recommendations for improvement. The deliberative process that had guided planning, teaching, and reflecting had helped all of them transform past failures into successes.

  1. Yield and failure criteria for composite materials under static and dynamic loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daniel, Isaac M.

    To facilitate and accelerate the process of introducing, evaluating and adopting new material systems, it is important to develop/establish comprehensive and effective procedures of characterization, modeling and failure prediction of structural laminates based on the properties of the constituent materials, e.g., fibers, matrix, and the single ply or lamina. A new failure theory, the Northwestern (NU-Daniel) theory, has been proposed for predicting lamina yielding and failure under multi-axial states of stress including strain rate effects. It is primarily applicable to matrix-dominated interfiber/interlaminar failures. It is based on micromechanical failure mechanisms but is expressed in terms of easily measured macroscopic lamina stiffness and strength properties. It is presented in the form of a master failure envelope incorporating strain rate effects. The theory was further adapted and extended to the prediction of in situ first ply yielding and failure (FPY and FPF) and progressive failure of multi-directional laminates under static and dynamic loadings. The significance of this theory is that it allows for rapid screening of new composite materials without very extensive testing and offers easily implemented design tools.

  2. Characterization of delamination and transverse cracking in graphite/epoxy laminates by acoustic emission

    NASA Technical Reports Server (NTRS)

    Garg, A.; Ishaei, O.

    1983-01-01

    Efforts to characterize and differentiate between two major failure processes in graphite/epoxy composites - transverse cracking and Mode I delamination - are described. Representative laminates were tested in uniaxial tension and flexure. The failure processes were monitored and identified by acoustic emission (AE). The effect of moisture on AE was also investigated. Each damage process was found to have a distinctive AE output that is significantly affected by moisture conditions. It is concluded that AE can serve as a useful tool for detecting and identifying failure modes in composite structures, both in the laboratory and in service environments.

  3. Yield and failure criteria for composite materials under static and dynamic loading

    DOE PAGES

    Daniel, Isaac M.

    2015-12-23

    To facilitate and accelerate the process of introducing, evaluating and adopting new material systems, it is important to develop/establish comprehensive and effective procedures of characterization, modeling and failure prediction of structural laminates based on the properties of the constituent materials, e.g., fibers, matrix, and the single ply or lamina. A new failure theory, the Northwestern (NU-Daniel) theory, has been proposed for predicting lamina yielding and failure under multi-axial states of stress including strain rate effects. It is primarily applicable to matrix-dominated interfiber/interlaminar failures. It is based on micromechanical failure mechanisms but is expressed in terms of easily measured macroscopic lamina stiffness and strength properties. It is presented in the form of a master failure envelope incorporating strain rate effects. The theory was further adapted and extended to the prediction of in situ first ply yielding and failure (FPY and FPF) and progressive failure of multi-directional laminates under static and dynamic loadings. The significance of this theory is that it allows for rapid screening of new composite materials without very extensive testing and offers easily implemented design tools.

  4. Evaluating Micrometeoroid and Orbital Debris Risk Assessments Using Anomaly Data

    NASA Technical Reports Server (NTRS)

    Squire, Michael

    2017-01-01

    The accuracy of micrometeoroid and orbital debris (MMOD) risk assessments can be difficult to evaluate. A team from the National Aeronautics and Space Administration (NASA) Engineering and Safety Center (NESC) has completed a study that compared MMOD-related failures on operational satellites to predictions of how many of those failures should occur according to NASA's MMOD risk assessment methodology and tools. The study team used the Poisson probability to quantify the degree of inconsistency between the predicted and reported numbers of failures. Many elements go into a risk assessment, and each of those elements represents a possible source of uncertainty or bias that will influence the end result. There are also challenges in obtaining accurate and useful data on MMOD-related failures.
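
    The Poisson consistency check described above amounts to asking how surprising the reported failure count is under the predicted mean. A minimal sketch, with example numbers that are invented rather than taken from the NESC study:

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    return math.exp(-lam) * lam ** k / math.factorial(k)

def p_at_least(k: int, lam: float) -> float:
    """P(X >= k) for X ~ Poisson(lam); small values flag inconsistency."""
    return 1.0 - sum(poisson_pmf(i, lam) for i in range(k))

predicted, reported = 2.1, 7   # assumed example values
print(f"P(>= {reported} failures | mean {predicted}) = "
      f"{p_at_least(reported, predicted):.4f}")
```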

  5. Development of a practice tool for community-based nurses: the Heart Failure Palliative Approach to Care (HeFPAC).

    PubMed

    Strachan, Patricia H; Joy, Cathy; Costigan, Jeannine; Carter, Nancy

    2014-04-01

    Patients living with advanced heart failure (HF) require a palliative approach to reduce suffering. Nurses have described significant knowledge gaps about the disease-specific palliative care (PC) needs of these patients. An intervention is required to facilitate appropriate end-of-life care for HF patients. The purpose of this study was to develop a user-friendly, evidence-informed, HF-specific practice tool for community-based nurses to facilitate care and communication regarding a palliative approach to HF care. Guided by the Knowledge to Action framework, we identified key HF-specific issues related to advanced HF care provision within the context of a palliative approach to care. Informed by current evidence and subsequent iterative consultation with community-based and specialist PC and HF nurses, a pocket guide tool for community-based nurses was created. We developed the Heart Failure Palliative Approach to Care (HeFPAC) pocket guide to promote communication and a palliative approach to care for HF patients. The HeFPAC has the potential to improve the quality of care and experiences for patients with advanced HF. It will be piloted in community-based practice and in a continuing education program for nurses. The HeFPAC pocket guide offers PC nurses a concise, evidence-informed, and practical point-of-care tool to communicate with other clinicians and patients about key HF issues that are associated with improving disease-specific HF palliative care and the quality of life of patients and their families. Pilot testing will offer insight as to its utility and potential for modification for national and international use.

  6. Solving Component Structural Dynamic Failures Due to Extremely High Frequency Structural Response on the Space Shuttle Program

    NASA Technical Reports Server (NTRS)

    Frady, Greg; Nesman, Thomas; Zoladz, Thomas; Szabo, Roland

    2010-01-01

    For many years, the ability to determine the root cause of component failures has been limited by the available analytical tools and the state of the art in data acquisition systems. With this limited capability, many anomalies have been resolved by adding material to the design to increase robustness, without the ability to determine whether the design solution was satisfactory until after a series of expensive test programs was complete. The risk of failure and of multiple design, test, and redesign cycles was high. During the Space Shuttle Program, many crack investigations in high energy density turbomachines, like the SSME turbopumps, and in high energy flows in the main propulsion system have traced numerous root-cause failures and anomalies to the coexistence of acoustic forcing functions, structural natural modes, and a high energy excitation, such as an edge tone or shedding flow, leading the technical community to understand many of the primary contributors to extremely high frequency, high cycle fatigue fluid-structure interaction anomalies. These contributors have been identified using advanced analysis tools and verified during component ground tests, systems tests, and flight. The structural dynamics and fluid dynamics communities have developed a special sensitivity to fluid-structure interaction problems and have been able to adjust and solve these problems in a time-effective manner to meet the budget and schedule deadlines of operational vehicle programs such as the Space Shuttle Program.

  7. Hierarchically-driven Approach for Quantifying Materials Uncertainty in Creep Deformation and Failure of Aerospace Materials

    DTIC Science & Technology

    2016-07-01

    characteristics and to examine the sensitivity of using such techniques for evaluating microstructure. In addition to the GUI tool, a manual describing its use has... Evaluating Local Primary Dendrite Arm Spacing Characterization Techniques Using Synthetic Directionally Solidified Dendritic Microstructures, Metallurgical and...driven approach for quantifying materials uncertainty in creep deformation and failure of aerospace materials, Multi-scale Structural Mechanics and

  8. Functionality, Complexity, and Approaches to Assessment of Resilience Under Constrained Energy and Information

    DTIC Science & Technology

    2015-03-26

    albeit powerful, method available for exploring CAS. As discussed above, there are many useful mathematical tools appropriate for CAS modeling. Agent-based...cells, telephone calls, and sexual contacts approach power-law distributions. [48] Networks in general are robust against random failures, but...targeted failures can have powerful effects – provided the targeter has a good understanding of the network structure. Some argue (convincingly) that all

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teixeira, Flavia C., E-mail: flavitiz@gmail.com; Almeida, Carlos E. de; Saiful Huq, M.

    Purpose: The goal of this study was to evaluate the safety and quality management program for stereotactic radiosurgery (SRS) treatment processes at three radiotherapy centers in Brazil by using three industrial engineering tools: (1) process mapping, (2) failure modes and effects analysis (FMEA), and (3) fault tree analysis. Methods: The recommendations of Task Group 100 of the American Association of Physicists in Medicine were followed to apply the three tools described above to create a process tree for the SRS procedure at each radiotherapy center, and then FMEA was performed. Failure modes were identified for all process steps, and values of the risk priority number (RPN) were calculated from the O, S, and D scores (RPN = O × S × D) assigned by a professional team responsible for patient care. Results: The subprocess treatment planning presented the highest number of failure modes for all centers. The total numbers of failure modes were 135, 104, and 131 for centers I, II, and III, respectively. The highest RPN values were as follows: center I (204), center II (372), and center III (370). Failure modes with RPN ≥ 100: center I (22), center II (115), and center III (110). Failure modes characterized by S ≥ 7 represented 68% of the failure modes for center III, 62% for center II, and 45% for center I. Failure modes with RPN values ≥ 100 and S ≥ 7, D ≥ 5, and O ≥ 5 were considered high priority in this study. Conclusions: The results of the present study show that the safety risk profiles for the same stereotactic radiotherapy process are different at the three radiotherapy centers in Brazil. Although this is the same treatment process, the present study showed that the risk priorities differ and will lead to the implementation of different safety interventions among the centers. Therefore, the current practice of applying universal device-centric QA is not adequate to address all possible failures in clinical processes at different radiotherapy centers. Integrated approaches combining device-centric QA and a process-specific quality management program tailored to each radiotherapy center are the key to a safe quality management program.
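
    As a minimal sketch of the RPN bookkeeping used in TG-100-style FMEA, the Python snippet below scores invented failure modes, computes RPN = O × S × D, and applies the high-priority screen quoted above (RPN ≥ 100 with S ≥ 7, D ≥ 5, O ≥ 5); all failure-mode entries and scores are placeholders.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    description: str
    occurrence: int    # O, typically scored 1-10
    severity: int      # S, typically scored 1-10
    detectability: int  # D, typically scored 1-10

    @property
    def rpn(self) -> int:
        # RPN = O x S x D
        return self.occurrence * self.severity * self.detectability

# Invented example entries for illustration only.
modes = [
    FailureMode("treatment planning", "wrong CT dataset loaded", 4, 9, 5),
    FailureMode("treatment planning", "target contour incomplete", 5, 8, 5),
    FailureMode("delivery", "couch shift entered with wrong sign", 3, 9, 3),
]

# High-priority screen used in the study: RPN >= 100 and S >= 7, D >= 5, O >= 5.
high_priority = [m for m in modes
                 if m.rpn >= 100 and m.severity >= 7
                 and m.detectability >= 5 and m.occurrence >= 5]
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.step}: {m.description} -> RPN={m.rpn}")
print(f"high-priority modes: {len(high_priority)}")
```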

  10. Failure of engineering artifacts: a life cycle approach.

    PubMed

    Del Frate, Luca

    2013-09-01

    Failure is a central notion both in ethics of engineering and in engineering practice. Engineers devote considerable resources to assure their products will not fail and considerable progress has been made in the development of tools and methods for understanding and avoiding failure. Engineering ethics, on the other hand, is concerned with the moral and social aspects related to the causes and consequences of technological failures. But what is meant by failure, and what does it mean that a failure has occurred? The subject of this paper is how engineers use and define this notion. Although a traditional definition of failure can be identified that is shared by a large part of the engineering community, the literature shows that engineers are willing to consider as failures also events and circumstance that are at odds with this traditional definition. These cases violate one or more of three assumptions made by the traditional approach to failure. An alternative approach, inspired by the notion of product life cycle, is proposed which dispenses with these assumptions. Besides being able to address the traditional cases of failure, it can deal successfully with the problematic cases. The adoption of a life cycle perspective allows the introduction of a clearer notion of failure and allows a classification of failure phenomena that takes into account the roles of stakeholders involved in the various stages of a product life cycle.

  11. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    PubMed

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  12. Review of nutritional screening and assessment tools and clinical outcomes in heart failure.

    PubMed

    Lin, Hong; Zhang, Haifeng; Lin, Zheng; Li, Xinli; Kong, Xiangqin; Sun, Gouzhen

    2016-09-01

    Recent studies have suggested that undernutrition as defined using multidimensional nutritional evaluation tools may affect clinical outcomes in heart failure (HF). The evidence supporting this correlation is unclear. Therefore, we conducted this systematic review to critically appraise the use of multidimensional evaluation tools in the prediction of clinical outcomes in HF. We performed descriptive analyses of all identified articles involving qualitative analyses. We used STATA to conduct meta-analyses when at least three studies that tested the same type of nutritional assessment or screening tools and used the same outcome were identified. Sensitivity analyses were conducted to validate our positive results. We identified 17 articles with qualitative analyses and 11 with quantitative analysis after comprehensive literature searching and screening. We determined that the prevalence of malnutrition is high in HF (range 16-90 %), particularly in advanced and acute decompensated HF (approximate range 75-90 %). Undernutrition as identified by multidimensional evaluation tools may be significantly associated with hospitalization, length of stay and complications and is particularly strongly associated with high mortality. The meta-analysis revealed that compared with other tools, Mini Nutritional Assessment (MNA) scores were the strongest predictors of mortality in HF [HR (4.32, 95 % CI 2.30-8.11)]. Our results remained reliable after conducting sensitivity analyses. The prevalence of malnutrition is high in HF, particularly in advanced and acute decompensated HF. Moreover, undernutrition as identified by multidimensional evaluation tools is significantly associated with unfavourable prognoses and high mortality in HF.

  13. Novel prognostic tissue markers in congestive heart failure.

    PubMed

    Stone, James R

    2015-01-01

    Heart failure is a relatively common disorder associated with high morbidity, mortality, and economic burden. Better tools to predict outcomes for patients with heart failure could allow for better decision making concerning patient treatment and management and better utilization of health care resources. Endomyocardial biopsy offers a mechanism to pathologically diagnose specific diseases in patients with heart failure, but such biopsies can often be negative, with no specific diagnostic information. Novel tissue markers in endomyocardial biopsies have been identified that may be useful in assessing prognosis in heart failure patients. Such tissue markers include ubiquitin, Gremlin-1, cyclophilin A, and heterogeneous nuclear ribonucleoprotein C. In some cases, tissue markers have been found to be independent of and even superior to clinical indices and serum markers in predicting prognosis for heart failure patients. In some cases, these novel tissue markers appear to offer prognostic information even in the setting of an otherwise negative endomyocardial biopsy. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Study on residual stresses in ultrasonic torsional vibration assisted micro-milling

    NASA Astrophysics Data System (ADS)

    Lu, Zesheng; Hu, Haijun; Sun, Yazhou; Sun, Qing

    2010-10-01

    It is well known that machining-induced residual stresses can seriously affect dimensional accuracy, corrosion and wear resistance, etc., and thereby influence the longevity and reliability of Micro-Optical Components (MOC). In Ultrasonic Torsional Vibration Assisted Micro-milling (UTVAM), cutting parameters, vibration parameters, mill cutter parameters, and the wear length of the tool flank are the main factors affecting residual stresses. A 2D model of UTVAM was established with the FE analysis software ABAQUS. Johnson-Cook's flow stress model and shear failure criterion are used as the workpiece material model and failure criterion, while friction between tool and workpiece follows a modified Coulomb law in which a sliding friction region is combined with sticking friction. By means of FEA, the influence of cutting parameters, vibration parameters, mill cutter parameters, and the wear length of the tool flank on residual stresses is obtained, which provides a basis for choosing optimal process parameters and improving the longevity and reliability of MOC.
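
    The Johnson-Cook flow stress model mentioned above has a standard closed form, sigma = (A + B·eps^n)(1 + C·ln(rate/ref_rate))(1 − T*^m). The Python sketch below evaluates it; the parameter values are placeholders loosely in the range reported for steels, not calibrated values from this study.

```python
import math

def johnson_cook_stress(strain, strain_rate, temp,
                        A, B, n, C, m,
                        ref_rate=1.0, t_room=293.0, t_melt=1793.0):
    """Johnson-Cook flow stress (MPa):
    sigma = (A + B*eps^n) * (1 + C*ln(rate/ref_rate)) * (1 - T*^m),
    with homologous temperature T* = (T - Troom)/(Tmelt - Troom)."""
    t_star = (temp - t_room) / (t_melt - t_room)
    return ((A + B * strain ** n)
            * (1.0 + C * math.log(strain_rate / ref_rate))
            * (1.0 - t_star ** m))

# Placeholder parameters only, loosely in the range used for steels.
print(johnson_cook_stress(strain=0.2, strain_rate=1e4, temp=500.0,
                          A=553.0, B=601.0, n=0.234, C=0.013, m=1.0))
```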

  15. Failure Mode and Effect Analysis (FMEA) may enhance implementation of clinical practice guidelines: An experience from the Middle East.

    PubMed

    Babiker, Amir; Amer, Yasser S; Osman, Mohamed E; Al-Eyadhy, Ayman; Fatani, Solafa; Mohamed, Sarar; Alnemri, Abdulrahman; Titi, Maher A; Shaikh, Farheen; Alswat, Khalid A; Wahabi, Hayfaa A; Al-Ansary, Lubna A

    2018-02-01

    Implementation of clinical practice guidelines (CPGs) has been shown to reduce variation in practice and improve health care quality and patient safety. There is limited experience of CPG implementation (CPGI) in the Middle East. The CPG program in our institution was launched in 2009. The Quality Management department conducted a Failure Mode and Effect Analysis (FMEA) to further improve CPGI. This is a prospective study of qualitative/quantitative design. Our FMEA included (1) process review: recording the steps and activities of CPGI; (2) hazard analysis: recording activity-related failure modes and their effects, identifying the actions required, assigning severity, occurrence, and detection scores to each failure mode, and calculating the risk priority number (RPN) using an online interactive FMEA tool; (3) planning: RPNs were prioritized, and recommendations and further planning for new interventions were identified; and (4) monitoring: after reduction or elimination of the failure modes, the calculated RPNs will be compared with those from a subsequent analysis in the post-implementation phase. The data were drawn from the feedback of quality team members using an FMEA framework to enhance the implementation of 29 adapted CPGs. The identified potential common failure modes with the highest RPN (≥ 80) included awareness/training activities, accessibility of CPGs, limited advocacy from clinical champions, and CPG auditing. Actions included (1) organizing regular awareness activities, (2) making printed and electronic copies of CPGs accessible, (3) encouraging senior practitioners to get involved in CPGI, and (4) enhancing CPG auditing as part of the quality sustainability plan. In our experience, FMEA can be a useful tool to enhance CPGI. It helped us to identify potential barriers and prepare relevant solutions. © 2017 John Wiley & Sons, Ltd.

  16. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
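
    The core FFM idea, propagating failure effects along directed paths from a failure-mode origin to points capable of observing them, can be illustrated with a toy graph search; the node names and topology below are invented, and real FFM tools are far richer than this sketch.

```python
from collections import deque

# Directed failure-propagation graph: node -> downstream nodes.
# Invented example topology for illustration.
edges = {
    "valve_stuck_closed": ["feedline_pressure_low"],
    "feedline_pressure_low": ["pump_cavitation", "sensor_P101"],
    "pump_cavitation": ["thrust_low", "sensor_V205"],
}

def observable_effects(failure_mode, edges, observers):
    """Breadth-first propagation from a failure mode to the set of
    observation points (sensors) that would see its effects."""
    seen, queue, hits = {failure_mode}, deque([failure_mode]), []
    while queue:
        node = queue.popleft()
        if node in observers:
            hits.append(node)
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return hits

print(observable_effects("valve_stuck_closed", edges,
                         observers={"sensor_P101", "sensor_V205"}))
```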

  17. Simplified spacecraft vulnerability assessments at component level in early design phase at the European Space Agency's Concurrent Design Facility

    NASA Astrophysics Data System (ADS)

    Kempf, Scott; Schäfer, Frank K.; Cardone, Tiziana; Ferreira, Ivo; Gerené, Sam; Destefanis, Roberto; Grassi, Lilith

    2016-12-01

    During recent years, the state-of-the-art risk assessment of the threat posed to spacecraft by micrometeoroids and space debris has been expanded to the analysis of failure modes of internal spacecraft components. This method can now be used to perform risk analyses for satellites to assess various failure levels - from failure of specific sub-systems to catastrophic break-up. This new assessment methodology is based on triple-wall ballistic limit equations (BLEs), specifically the Schäfer-Ryan-Lambert (SRL) BLE, which is applicable for describing failure threshold levels for satellite components following a hypervelocity impact. The methodology is implemented in the form of the software tool Particle Impact Risk and vulnerability Analysis Tool (PIRAT). During a recent European Space Agency (ESA) funded study, the PIRAT functionality was expanded in order to provide an interface to ESA's Concurrent Design Facility (CDF). The additions include a geometry importer and an OCDT (Open Concurrent Design Tool) interface. The new interface provides both the expanded geometrical flexibility, which is provided by external computer aided design (CAD) modelling, and an ease of import of existing data without the need for extensive preparation of the model. The reduced effort required to perform vulnerability analyses makes it feasible for application during early design phase, at which point modifications to satellite design can be undertaken with relatively little extra effort. The integration of PIRAT in the CDF represents the first time that vulnerability analyses can be performed in-session in ESA's CDF and the first time that comprehensive vulnerability studies can be applied cost-effectively in early design phase in general.

  18. Time series analysis of tool wear in sheet metal stamping using acoustic emission

    NASA Astrophysics Data System (ADS)

    Vignesh Shanbhag, V.; Pereira, Michael P.; Rolfe, Bernard F.; Arunachalam, N.

    2017-09-01

    Galling is an adhesive wear mode that often limits the lifespan of stamping tools. Since stamping tools represent a significant economic cost, even a slight improvement in maintenance cost is of high importance for the stamping industry. In other manufacturing industries, online tool condition monitoring has been used to prevent tool wear-related failure. However, monitoring the acoustic emission signal from a stamping process is a non-trivial task, since the acoustic emission signal is non-stationary and non-transient. There have been numerous studies examining acoustic emissions in sheet metal stamping, but very few have focused in detail on how the signals change as wear on the tool surface progresses prior to failure. In this study, time domain analysis was applied to the acoustic emission signals to extract features related to tool wear. To understand the wear progression, accelerated stamping tests were performed using a semi-industrial stamping setup that can perform clamping, piercing, and stamping in a single cycle. The time domain features were computed from the acoustic emission signal of each part. The sidewalls of the stamped parts were scanned using an optical profilometer to obtain profiles of the worn parts, which were qualitatively correlated with the acoustic emission signal. Based on the wear behaviour, the wear data can be divided into three stages: in the first stage, no wear is observed; in the second stage, adhesive wear is likely to occur; and in the third stage, severe abrasive plus adhesive wear is likely to occur. Scanning electron microscopy showed the formation of lumps on the stamping tool, which represents galling behaviour. The correlation between the time domain features of the acoustic emission signal and the wear progression identified in this study lays the basis for tool diagnostics in the stamping industry.
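
    As an illustration of the kind of time-domain feature extraction described above, the Python sketch below computes common AE features (RMS, peak, crest factor, kurtosis, mean power) on a synthetic signal; it is not the authors' exact feature set.

```python
import numpy as np
from scipy.stats import kurtosis

def time_domain_features(signal: np.ndarray) -> dict:
    """Common time-domain features used for AE-based wear monitoring."""
    rms = np.sqrt(np.mean(signal ** 2))
    peak = np.max(np.abs(signal))
    return {
        "rms": rms,
        "peak": peak,
        "crest_factor": peak / rms,
        "kurtosis": kurtosis(signal),   # impulsiveness indicator
        "mean_power": np.mean(signal ** 2),
    }

# Synthetic stand-in for one stamping cycle's AE record: noise floor
# plus a few sparse bursts.
rng = np.random.default_rng(0)
ae = rng.normal(0.0, 1.0, 50_000) + 5.0 * (rng.random(50_000) < 1e-4)
print(time_domain_features(ae))
```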

  19. Simulation Assisted Risk Assessment: Blast Overpressure Modeling

    NASA Technical Reports Server (NTRS)

    Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael

    2006-01-01

    A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or on Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwind effects.

  20. Readability Assessment of Online Patient Education Material on Congestive Heart Failure.

    PubMed

    Kher, Akhil; Johnson, Sandra; Griffith, Robert

    2017-01-01

    Online health information is being used more ubiquitously by the general population. However, this information is typically written at a level that suits only a small percentage of readers, which can result in suboptimal medical outcomes for patients. The readability of online patient education materials on the topic of congestive heart failure was assessed with six readability assessment tools. The search phrase "congestive heart failure" was entered into the Google search engine. Of the first 100 websites, only 70 met the selection and exclusion criteria and were included. These were then assessed with six readability assessment tools. Only 5 of the 70 websites were within the limits of the recommended sixth-grade readability level. The mean readability scores were as follows: Flesch-Kincaid Grade Level (9.79), Gunning-Fog Score (11.95), Coleman-Liau Index (15.17), Simple Measure of Gobbledygook (SMOG) index (11.39), and Flesch Reading Ease (48.87). Most of the analyzed websites were found to be above the sixth-grade readability level recommendations. Efforts need to be made to better tailor online patient education materials to the general population.
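
    Of the tools listed, the Flesch-Kincaid Grade Level has a simple published formula, 0.39·(words/sentences) + 11.8·(syllables/words) − 15.59. The sketch below implements it with a rough heuristic syllable counter; production readability tools use more careful tokenization.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count vowel groups, dropping a silent final 'e'."""
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_kincaid_grade(text: str) -> float:
    """FK grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59

sample = ("Heart failure occurs when the heart cannot pump enough blood. "
          "Symptoms include shortness of breath and swelling of the legs.")
print(f"FK grade level: {flesch_kincaid_grade(sample):.1f}")
```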

  1. How to Recognize Success and Failure: Practical Assessment of an Evolving, First-Semester Laboratory Program Using Simple, Outcome-Based Tools

    ERIC Educational Resources Information Center

    Gron, Liz U.; Bradley, Shelly B.; McKenzie, Jennifer R.; Shinn, Sara E.; Teague, M. Warfield

    2013-01-01

    This paper presents the use of simple, outcome-based assessment tools to design and evaluate the first semester of a new introductory laboratory program created to teach green analytical chemistry using environmental samples. This general chemistry laboratory program, like many introductory courses, has a wide array of stakeholders within and…

  2. Fault Tree Analysis: A Research Tool for Educational Planning. Technical Report No. 1.

    ERIC Educational Resources Information Center

    Alameda County School Dept., Hayward, CA. PACE Center.

    This ESEA Title III report describes fault tree analysis and assesses its applicability to education. Fault tree analysis is an operations research tool which is designed to increase the probability of success in any system by analyzing the most likely modes of failure that could occur. A graphic portrayal, which has the form of a tree, is…
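
    The probabilistic core of fault tree analysis reduces to evaluating AND/OR gates over basic-event probabilities. The sketch below, assuming independent basic events and an invented example tree, shows the computation.

```python
def evaluate(node, probs):
    """Evaluate a fault tree of nested ('AND'|'OR', children) tuples,
    assuming independent basic events with probabilities in `probs`."""
    if isinstance(node, str):            # basic event
        return probs[node]
    gate, children = node
    p = [evaluate(c, probs) for c in children]
    if gate == "AND":                    # fails only if all children fail
        out = 1.0
        for x in p:
            out *= x
        return out
    # OR gate: fails if any child fails
    out = 1.0
    for x in p:
        out *= (1.0 - x)
    return 1.0 - out

# Invented tree: the program fails if planning fails, OR if both
# staffing and funding fail.
tree = ("OR", ["planning", ("AND", ["staffing", "funding"])])
print(evaluate(tree, {"planning": 0.05, "staffing": 0.2, "funding": 0.1}))
```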

  3. Fishing and casing repairs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Short, J.A.

    1982-01-01

    Up to 1/4 of a total drilling budget can be spent on fishing failures and downhole remedial operations. The book presented is aimed at cutting these costs. Specific examples of operations are included throughout the book to illustrate conditions in field situations. The author also discusses background conditions causing the problems and possible solutions, along with preventive measures. Also included are chapters on types of fishing tools, fishing procedures and operations, casing failures and repairs. (JMT)

  4. The TrialsTracker: Automated ongoing monitoring of failure to share clinical trial results by all major companies and research institutions.

    PubMed

    Powell-Smith, Anna; Goldacre, Ben

    2016-01-01

    Background: Failure to publish trial results is a prevalent ethical breach with a negative impact on patient care. Audit is an important tool for quality improvement. We set out to produce an online resource that automatically identifies the sponsors with the best and worst records for failing to share trial results. Methods: A tool was produced that identifies all completed trials from clinicaltrials.gov, searches for results in the clinicaltrials.gov registry and on PubMed, and presents summary statistics for each sponsor online. Results: The TrialsTracker tool is now available. Results are consistent with previous publication bias cohort studies using manual searches. The prevalence of missing studies is presented for various classes of sponsor. All code and data are shared. Discussion: We have designed, built, and launched an easily accessible online service, the TrialsTracker, which identifies sponsors who have failed in their duty to make the results of clinical trials available, and which can be maintained at low cost. Sponsors who wish to improve their performance metrics in this tool can do so by publishing the results of their trials.
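
    The per-sponsor summary step can be sketched as plain data shaping; the records below are invented stand-ins for completed trials already flagged with a results-found bit, and no clinicaltrials.gov or PubMed querying is shown.

```python
from collections import defaultdict

# Invented records standing in for completed trials, each flagged with
# whether results were found in the registry or on PubMed.
trials = [
    {"sponsor": "Sponsor A", "results_found": True},
    {"sponsor": "Sponsor A", "results_found": False},
    {"sponsor": "Sponsor B", "results_found": False},
    {"sponsor": "Sponsor B", "results_found": False},
    {"sponsor": "Sponsor B", "results_found": True},
]

totals = defaultdict(lambda: {"completed": 0, "missing": 0})
for t in trials:
    s = totals[t["sponsor"]]
    s["completed"] += 1
    s["missing"] += not t["results_found"]   # bool adds as 0/1

# Rank sponsors by their share of trials with missing results.
for sponsor, s in sorted(totals.items(),
                         key=lambda kv: kv[1]["missing"] / kv[1]["completed"],
                         reverse=True):
    pct = 100 * s["missing"] / s["completed"]
    print(f"{sponsor}: {s['missing']}/{s['completed']} missing ({pct:.0f}%)")
```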

  5. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
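
    Markov reliability models of the kind these tools solve can be illustrated on a tiny example: a duplex system with per-unit failure rate lambda and no repair, solved for its transient state probabilities. This is a toy, not the SURE/ASSIST semi-Markov machinery.

```python
import numpy as np
from scipy.linalg import expm

# Three states: 0 = both units up, 1 = one unit up, 2 = system failed.
# lam = per-unit failure rate (1/h); duplex system with no repair.
lam = 1e-4
Q = np.array([[-2 * lam, 2 * lam, 0.0],
              [0.0,      -lam,    lam],
              [0.0,       0.0,    0.0]])   # generator matrix

p0 = np.array([1.0, 0.0, 0.0])             # start with both units up
t = 10.0                                    # mission time in hours
p_t = p0 @ expm(Q * t)                      # transient state probabilities
print(f"P(system failed by t={t} h) = {p_t[2]:.3e}")
```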

  6. EEMD-based wind turbine bearing failure detection using the generator stator current homopolar component

    NASA Astrophysics Data System (ADS)

    Amirat, Yassine; Choqueuse, Vincent; Benbouzid, Mohamed

    2013-12-01

    Failure detection has always been a demanding task in the electrical machines community; it has become more challenging in wind energy conversion systems because the sustainability and viability of wind farms are highly dependent on the reduction of operational and maintenance costs. Indeed, the most efficient way of reducing these costs is to continuously monitor the condition of these systems. This allows for early detection of generator health degeneration, facilitating a proactive response, minimizing downtime, and maximizing productivity. This paper therefore provides an assessment of a failure detection technique based on the homopolar component of the generator stator current and highlights the use of ensemble empirical mode decomposition as a tool for failure detection in wind turbine generators for stationary and non-stationary cases.
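
    The homopolar (zero-sequence) component used as the failure signature here is simply the mean of the three phase currents, i0 = (ia + ib + ic)/3. The sketch below extracts it from synthetic currents carrying an invented low-frequency fault term; the EEMD stage itself is omitted.

```python
import numpy as np

def homopolar_component(ia, ib, ic):
    """Zero-sequence (homopolar) component of three phase currents:
    i0 = (ia + ib + ic) / 3. For a healthy, balanced machine it is
    close to zero; faults can introduce structure into it."""
    return (np.asarray(ia) + np.asarray(ib) + np.asarray(ic)) / 3.0

# Synthetic balanced three-phase currents plus a small invented
# 7 Hz fault signature on one phase.
t = np.linspace(0.0, 1.0, 5000)
ia = np.sin(2 * np.pi * 50 * t)
ib = np.sin(2 * np.pi * 50 * t - 2 * np.pi / 3)
ic = np.sin(2 * np.pi * 50 * t + 2 * np.pi / 3) + 0.05 * np.sin(2 * np.pi * 7 * t)

i0 = homopolar_component(ia, ib, ic)   # balanced terms cancel; fault term / 3 remains
print(f"peak |i0| = {np.max(np.abs(i0)):.3f}")
```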

  7. Review on failure prediction techniques of composite single lap joint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ab Ghani, A.F., E-mail: ahmadfuad@utem.edu.my; Rivai, Ahmad, E-mail: ahmadrivai@utem.edu.my

    2016-03-29

    Adhesive bonding is the most appropriate joining method in the construction of composite structures. The use of reliable design and prediction techniques will produce better performance of bonded joints. Several recent papers and journal articles have been reviewed and synthesized to understand the current state of the art in this area. This is done by studying the most relevant analytical solutions for composite adherends, starting with a review of the most fundamental ones involving beam/plate theory. The review is then extended to single lap joint nonlinearity and failure prediction, and finally to failure prediction for the composite single lap joint. The review also encompasses finite element modelling as a tool to predict the elastic response of the composite single lap joint and to predict failure numerically.

  8. High Reliability Organizations--Medication Safety.

    PubMed

    Yip, Luke; Farmer, Brenna

    2015-06-01

    High reliability organizations (HROs), such as the aviation industry, successfully engage in high-risk endeavors and have a low incidence of adverse events. HROs have a preoccupation with failure and errors. They analyze each event to effect system-wide change in an attempt to mitigate the occurrence of similar errors. The healthcare industry can adapt HRO practices, specifically with regard to teamwork and communication. Crew resource management concepts can be adapted to healthcare with the use of certain tools, such as checklists and the sterile cockpit, to reduce medication errors. HROs also use the Swiss Cheese Model to evaluate risk and look for vulnerabilities in multiple protective barriers, instead of focusing on one failure. This model can be used in medication safety to evaluate medication management in addition to using the teamwork and communication tools of HROs.

  9. Electrical failure debug using interlayer profiling method

    NASA Astrophysics Data System (ADS)

    Yang, Thomas; Shen, Yang; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh

    2017-03-01

    It is very well known that as technology nodes shrink, the number of design rules increases, design structures become more regular, and the number of process manufacturing steps increases as well. Normal inspection tools can only monitor hard failures on a single layer. Electrical failures that arise from inter-layer misalignments can only be detected through testing. This paper presents a working flow that uses pattern analysis interlayer profiling techniques to turn multi-layer physical information into grouped, linked parameter values. Using this data analysis flow combined with an electrical model allows us to find critical regions on a layout for yield learning.

  10. Processes in construction of failure management expert systems from device design information

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Lance, Nick

    1987-01-01

    This paper analyzes the tasks and problem-solving methods used by an engineer in constructing a failure management expert system from design information about the device to be diagnosed. An expert test engineer developed a troubleshooting expert system based on device design information and experience with similar devices, rather than on specific expert knowledge gained from operating the device or troubleshooting its failures. The construction of the expert system was intensively observed and analyzed. This paper characterizes the knowledge, tasks, methods, and design decisions involved in constructing this type of expert system, and makes recommendations concerning tools for aiding and automating the construction of such systems.

  11. Measuring impedance in congestive heart failure: Current options and clinical applications

    PubMed Central

    Tang, W. H. Wilson; Tong, Wilson

    2011-01-01

    Measurement of impedance is becoming increasingly available in the clinical setting as a tool for assessing hemodynamics and volume status in patients with heart failure. The 2 major categories of impedance assessment are the band electrode method and the implanted device lead method. The exact sources of the impedance signal are complex and can be influenced by physiologic effects such as blood volume, fluid, and positioning. This article provides a critical review of our current understanding of impedance measurements and their promise, the techniques that have evolved, and the evidence and limitations regarding their clinical applications in the setting of heart failure management. PMID:19249408

  12. e-Health readiness assessment factors and measuring tools: A systematic review.

    PubMed

    Yusif, Salifu; Hafeez-Baig, Abdul; Soar, Jeffrey

    2017-11-01

    The evolving nature, growing adoption, and high failure rate of health information technology (HIT)/IS/T systems require effective readiness assessment to avert increasing failures while increasing system benefits. However, the literature on HIT readiness assessment is myriad and fragmented. This review maps the contours of the available literature, concluding with a set of manageable and usable recommendations for policymakers, researchers, individuals, and organizations intending to assess readiness for any HIT implementation. The objectives were to identify studies, analyze readiness factors, and offer recommendations. Published articles from 1995-2016 were searched using Medline/PubMed, Cinahl, Web of Science, PsychInfo, and ProQuest. Studies were included if they assessed IS/T/mHealth readiness in the context of HIT. Articles not written in English were excluded. Themes that emerged in the process of data synthesis were thematically analysed and interpreted. The analyzed themes were found across 63 articles. In order of prevalence of use, they included, but were not limited to, "Technological readiness", 30 (46%); "Core/Need/Motivational readiness", 23 (37%); "Acceptance and use readiness", 19 (29%); "Organizational readiness", 20 (21%); "IT skills/Training/Learning readiness" (18%); "Engagement readiness", 16 (24%); and "Societal readiness" (14%). Despite their prevalence in use, "Technological readiness", "Motivational readiness", and "Engagement readiness" all had myriad and unreliable measuring tools. Core readiness had relatively reliable measuring tools, which have repeatedly been used in various readiness assessment studies. CONCLUSION: There is a need for reliable measuring tools for even the most commonly used readiness assessment factors/constructs: Core readiness, Engagement and buy-in readiness, Technological readiness, and IT skills readiness. This could serve as an immediate step in conducting effective and reliable e-Health readiness assessments, which could lead to reduced HIT implementation failures. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Introducing a change in hospital policy using FMEA methodology as a tool to reduce patient hazards.

    PubMed

    Ofek, Fanny; Magnezi, Racheli; Kurzweil, Yaffa; Gazit, Inbal; Berkovitch, Sofia; Tal, Orna

    2016-01-01

    Intravenous potassium chloride (IV KCl) solutions are widely used in hospitals for the treatment of hypokalemia. As ampoules of concentrated KCl must be diluted before use, critical incidents have been associated with its preparation and administration. We therefore introduced ready-to-use diluted KCl infusion solutions to minimize the use of high-alert concentrated KCl. Since this change may itself be associated with considerable risks, we embraced proactive hazard analysis as a tool for implementing a change in high-alert drug usage in a hospital setting. Failure mode and effect analysis (FMEA) is a systematic tool for analyzing and identifying risks in system operations. We used FMEA to examine the hazards associated with the implementation of the ready-to-use solutions. A multidisciplinary team analyzed the risks by identifying failure modes, conducting a hazard analysis, and calculating the criticality index (CI) for each failure mode. A 1-day survey was performed as an evaluation step after a trial run period of approximately 4 months. Six major potential risks were identified. The most severe risks were prioritized and specific recommendations were formulated. Of 28 patients receiving IV KCl on the day of the survey, 22 received the ready-to-use solutions and 6 received the concentrated solutions as instructed. Only 1 patient received inappropriate ready-to-use KCl. Using the FMEA tool in our study has proven once again that by creating a gradient of severity of potentially vulnerable elements, we are able to proactively promote safer and more efficient processes in health care systems. This article presents a utilization of this method for implementing a change in hospital policy regarding the routine use of IV KCl.

  14. Failure mode and effects analysis outputs: are they valid?

    PubMed Central

    2012-01-01

    Background Failure Mode and Effects Analysis (FMEA) is a prospective risk assessment tool that has been widely used within the aerospace and automotive industries and has been utilised within healthcare since the early 1990s. The aim of this study was to explore the validity of FMEA outputs within a hospital setting in the United Kingdom. Methods Two multidisciplinary teams each conducted an FMEA for the use of vancomycin and gentamicin. Four different validity tests were conducted: · Face validity: by comparing the FMEA participants’ mapped processes with observational work. · Content validity: by presenting the FMEA findings to other healthcare professionals. · Criterion validity: by comparing the FMEA findings with data reported on the trust’s incident report database. · Construct validity: by exploring the relevant mathematical theories involved in calculating the FMEA risk priority number. Results Face validity was positive as the researcher documented the same processes of care as mapped by the FMEA participants. However, other healthcare professionals identified potential failures missed by the FMEA teams. Furthermore, the FMEA groups failed to include failures related to omitted doses; yet these were the failures most commonly reported in the trust’s incident database. Calculating the RPN by multiplying severity, probability and detectability scores was deemed invalid because it is based on calculations that breach the mathematical properties of the scales used. Conclusion There are significant methodological challenges in validating FMEA. It is a useful tool to aid multidisciplinary groups in mapping and understanding a process of care; however, the results of our study cast doubt on its validity. FMEA teams are likely to need different sources of information, besides their personal experience and knowledge, to identify potential failures. As for FMEA’s methodology for scoring failures, there were discrepancies between the teams’ estimates and similar incidents reported on the trust’s incident database. Furthermore, the concept of multiplying ordinal scales to prioritise failures is mathematically flawed. Until FMEA’s validity is further explored, healthcare organisations should not solely depend on their FMEA results to prioritise patient safety issues. PMID:22682433
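
    The mathematical objection raised above can be made concrete: ordinal scores carry only order information, yet RPN rankings change under order-preserving relabelings of a scale. The invented two-mode example below shows a ranking flip.

```python
# Two failure modes scored on ordinal 1-10 scales as (O, S, D).
a = (2, 9, 3)   # rare but very severe
b = (4, 4, 4)   # moderate across the board

rpn = lambda o, s, d: o * s * d
print(rpn(*a), rpn(*b))          # 54 vs 64: B outranks A

# Apply an order-preserving relabeling of the severity scale
# (stretching the top of the scale); the ordinal information is
# unchanged, but the RPN ranking flips.
stretch = {9: 20, 4: 4}
a2 = (a[0], stretch[a[1]], a[2])
b2 = (b[0], stretch[b[1]], b[2])
print(rpn(*a2), rpn(*b2))        # 120 vs 64: A now outranks B
```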

  15. Challenges on non-invasive ventilation to treat acute respiratory failure in the elderly.

    PubMed

    Scala, Raffaele

    2016-11-15

    Acute respiratory failure is a frequent complication in elderly patients, especially those suffering from chronic cardio-pulmonary diseases. Non-invasive mechanical ventilation constitutes a successful therapeutic tool in the elderly since, as in younger patients, it is able to prevent endotracheal intubation in a wide range of acute conditions; moreover, this ventilatory technique is widely applied in elderly patients in whom invasive mechanical ventilation is considered inappropriate. Furthermore, the integration of new technological devices, ethical issues, and the environment of treatment are still largely debated in the treatment of acute respiratory failure in the elderly. This review aims at reporting and critically analyzing the peculiarities of managing acute respiratory failure in elderly people, the role of non-invasive mechanical ventilation, the potential advantages of applying alternative or integrated therapeutic tools (i.e., high-flow nasal cannula oxygen therapy, non-invasive and invasive cough assist devices, and low-flow carbon-dioxide extracorporeal systems), drawbacks in physicians' communication, and "end of life" decisions. As several areas of this topic are not supported by evidence-based data, this report also takes into account "real-life" data as well as the author's experience. The choice of the setting and of the timing of non-invasive mechanical ventilation in elderly people with advanced cardiopulmonary disease should be carefully evaluated, together with the chance of using integrated or alternative supportive devices. Last but not least, economic and ethical issues may often challenge the behavior of physicians towards elderly people who are hospitalized for acute respiratory failure at the end stage of their cardiopulmonary and neoplastic diseases.

  16. Reliable Broadcast under Cascading Failures in Interdependent Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Sisi; Lee, Sangkeun; Chinthavali, Supriya

    Reliable broadcast is an essential tool to disseminate information among a set of nodes in the presence of failures. We present a novel study of reliable broadcast in interdependent networks, in which failures in one network may cascade to another network. In particular, we focus on the interdependency between the communication network and the power grid, where the power grid depends on signals from the communication network for control and the communication network depends on the grid for power. In this paper, we build a resilient solution to handle crash failures in the communication network that may cause cascading failures and may even partition the network. In order to guarantee that all correct nodes deliver the messages, we use soft links, which are inactive backup links to non-neighboring nodes that become active only when failures occur. At the core of our work is a fully distributed algorithm for the nodes to predict and collect information about cascading failures so that soft links can be maintained to correct nodes prior to the failures. In the presence of failures, soft links are activated to guarantee message delivery, and new soft links are built accordingly for long-term robustness. Our evaluation results show that the algorithm achieves a low packet drop rate and handles cascading failures with little overhead.
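
    The soft-link mechanism can be caricatured in a few lines: flood a message over active links and switch in pre-planned backups when a neighbor has crashed. The topology, failure set, and activation rule below are invented simplifications of the paper's distributed algorithm.

```python
# Toy sketch of the soft-link idea: each node floods a message to its
# active neighbors; when a neighbor has crashed, pre-planned inactive
# "soft links" are activated so delivery still reaches all correct nodes.
links = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
soft_links = {"b": {"d"}, "d": {"b"}}   # backups bridging node c
crashed = {"c"}

def broadcast(source):
    delivered, stack = set(), [source]
    while stack:
        node = stack.pop()
        if node in crashed or node in delivered:
            continue
        delivered.add(node)
        neighbors = set(links[node])
        # Activate soft links when some active neighbor has crashed.
        if neighbors & crashed:
            neighbors |= soft_links.get(node, set())
        stack.extend(neighbors - delivered)
    return delivered

print(broadcast("a"))   # {'a', 'b', 'd'}: all correct nodes reached
```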

  17. Prosthetic design directives: Low-cost hands within reach.

    PubMed

    Jones, G K; Rosendo, A; Stopforth, R

    2017-07-01

    Although three million people around the world suffer from the loss of one or both upper limbs, 80% of them are located in developing countries. While prosthetic prices soar with technology, 3D printing and low-cost electronics present a sensible solution for those who cannot afford expensive prosthetics. The electronic and control design of a low-cost prosthetic hand, the Touch Hand II, is discussed. This paper shows that sensorless techniques can be used to reduce design complexity and costs and to provide easier access to the electronics. A closing and opening finite state machine (COFSM) was developed to handle the actuated digit joint control state, together with a supervisory switching control scheme used for speed and grip strength control. Three torque and speed settings were created as presets for specific grasps. The hand was able to replicate ten frequently used grasps and grip some common objects. Future work is necessary to enable a user to control it with myoelectric signals (MESs) and to solve operational problems related to electromagnetic interference (EMI).
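
    A COFSM of the kind described can be sketched as a small state machine in which sensorless grip detection is approximated by a motor-current threshold; the states, commands, and threshold below are invented for illustration and are not the Touch Hand II firmware.

```python
from enum import Enum, auto

class DigitState(Enum):
    IDLE = auto()
    CLOSING = auto()
    GRIPPING = auto()
    OPENING = auto()

class COFSM:
    """Toy closing-and-opening FSM for one actuated digit. Sensorless
    grip detection is approximated here by a motor-current threshold."""
    def __init__(self, stall_current=0.8):
        self.state = DigitState.IDLE
        self.stall_current = stall_current

    def step(self, command, motor_current):
        if command == "close" and self.state in (DigitState.IDLE,
                                                 DigitState.OPENING):
            self.state = DigitState.CLOSING
        elif command == "open" and self.state != DigitState.OPENING:
            self.state = DigitState.OPENING
        # Rising motor current while closing implies contact: hold grip.
        if self.state is DigitState.CLOSING and motor_current >= self.stall_current:
            self.state = DigitState.GRIPPING
        return self.state

fsm = COFSM()
for cmd, amps in [("close", 0.2), (None, 0.5), (None, 0.9), ("open", 0.1)]:
    print(cmd, fsm.step(cmd, amps))
```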

  18. 500 Gb/s free-space optical transmission over strong atmospheric turbulence channels.

    PubMed

    Qu, Zhen; Djordjevic, Ivan B

    2016-07-15

    We experimentally demonstrate a high-spectral-efficiency, large-capacity free-space optical (FSO) transmission system using low-density parity-check (LDPC) coded quadrature phase shift keying (QPSK) combined with orbital angular momentum (OAM) multiplexing. The strong atmospheric turbulence channel is emulated by two spatial light modulators on which four randomly generated azimuthal phase patterns yielding the Andrews spectrum are recorded. The validity of this approach is verified by reproducing the intensity distribution and irradiance correlation function (ICF) from the full-scale simulator. Excellent agreement of experimental, numerical, and analytical results is found. To reduce the phase distortion induced by the turbulence emulator, inexpensive wavefront-sensorless adaptive optics (AO) is used. To deal with the remaining channel impairments, a large-girth LDPC code is used. To further improve the aggregate data rate, OAM multiplexing is combined with WDM, and 500 Gb/s optical transmission over strong atmospheric turbulence channels is demonstrated.

  19. High speed wavefront sensorless aberration correction in digital micromirror based confocal microscopy.

    PubMed

    Pozzi, P; Wilding, D; Soloviev, O; Verstraete, H; Bliek, L; Vdovin, G; Verhaegen, M

    2017-01-23

    The quality of fluorescence microscopy images is often impaired by the presence of sample-induced optical aberrations. Adaptive optical elements such as deformable mirrors or spatial light modulators can be used to correct aberrations. However, previously reported techniques either require special sample preparation or time-consuming optimization procedures for the correction of static aberrations. This paper reports a technique for optical sectioning fluorescence microscopy capable of correcting dynamic aberrations in any fluorescent sample during the acquisition. This is achieved by implementing adaptive optics in a non-conventional confocal microscopy setup with multiple programmable confocal apertures, in which out-of-focus light can be separately detected and used to optimize the correction performance with a sampling frequency an order of magnitude faster than the imaging rate of the system. The paper reports results comparing the correction performance to traditional image optimization algorithms, and demonstrates how the system can compensate for dynamic changes in the aberrations, such as those introduced during a focal stack acquisition through a thick sample.
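
    A widely used model-free routine for wavefront-sensorless correction is stochastic parallel gradient descent (SPGD), sketched below against a stand-in quadratic image-quality metric; the authors' out-of-focus-light optimization is a different, faster scheme, so this is only a generic illustration.

```python
import numpy as np

def spgd(metric, u0, gain=2.0, perturb=0.05, iters=1000, seed=0):
    """Stochastic parallel gradient descent: a model-free optimizer
    often used in wavefront-sensorless adaptive optics. `u` holds the
    corrector's mode coefficients; `metric` is image quality (higher
    is better)."""
    rng = np.random.default_rng(seed)
    u = np.array(u0, dtype=float)
    for _ in range(iters):
        du = perturb * rng.choice([-1.0, 1.0], size=u.shape)
        dj = metric(u + du) - metric(u - du)
        u += gain * dj * du          # gradient estimate from two probes
    return u

# Stand-in metric: quality peaks when the correction cancels a fixed
# (hidden) aberration vector.
aberration = np.array([0.3, -0.2, 0.1])
metric = lambda u: -np.sum((u + aberration) ** 2)
print(spgd(metric, np.zeros(3)))     # approaches -aberration
```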

  20. Sensorless control for permanent magnet synchronous motor using a neural network based adaptive estimator

    NASA Astrophysics Data System (ADS)

    Kwon, Chung-Jin; Kim, Sung-Joong; Han, Woo-Young; Min, Won-Kyoung

    2005-12-01

    The rotor position and speed estimation of a permanent-magnet synchronous motor (PMSM) is dealt with. By measuring the phase voltages and currents of the PMSM drive, two diagonally recurrent neural network (DRNN) based observers, a neural current observer and a neural velocity observer, were developed. The DRNN, which has self-feedback of the hidden neurons, ensures that its outputs contain the whole past information of the system even if its inputs are only the present states and inputs of the system. Thus the structure of a DRNN may be simpler than that of feedforward and fully recurrent neural networks. If the backpropagation method is used for training the DRNN, the problem of slow convergence arises. In order to reduce this problem, a recursive prediction error (RPE) based learning method for the DRNN is presented. The simulation results show that the proposed approach gives a good estimation of rotor speed and position, and RPE-based training requires a shorter computation time than backpropagation-based training.
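
    The defining feature of a DRNN, a recurrent matrix restricted to its diagonal so each hidden neuron feeds back only to itself, is easy to show in a forward pass; the layer sizes, initialization, and inputs below are illustrative, and no RPE training is included.

```python
import numpy as np

class DiagonalRNN:
    """Minimal diagonally recurrent layer: each hidden neuron has a
    single self-feedback weight (a diagonal recurrent matrix), so past
    information is retained with far fewer weights than in a fully
    recurrent network."""
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.w_in = rng.normal(0, 0.3, (n_hidden, n_in))
        self.w_self = rng.uniform(0.1, 0.9, n_hidden)  # diagonal feedback
        self.w_out = rng.normal(0, 0.3, (n_out, n_hidden))
        self.h = np.zeros(n_hidden)

    def step(self, x):
        # h[k] = tanh(W_in x[k] + diag(w_self) h[k-1])
        self.h = np.tanh(self.w_in @ x + self.w_self * self.h)
        return self.w_out @ self.h

net = DiagonalRNN(n_in=4, n_hidden=8, n_out=1)
for k in range(3):
    x = np.array([np.sin(k), np.cos(k), 0.1 * k, 1.0])  # stand-in measurements
    print(net.step(x))
```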

  1. Improving Attachments of Non-Invasive (Type III) Electronic Data Loggers to Cetaceans

    DTIC Science & Technology

    2015-09-30

    animals in human care will be performed to test and validate this approach. The cadaver trials will enable controlled testing to failure or with both...quantitative metrics and analysis tools to assess the impact of a tag on the animal. Here we will present: 1) the characterization of the mechanical...fine scale motion analysis for swimming animals. 2 APPROACH Our approach is divided into four subtasks: Task 1: Forces and failure modes

  2. [A health passport for patients with heart failure].

    PubMed

    Turpeau, Stéphanie

    2017-11-01

    The health passport gives concrete form to the patient's care project. It constitutes a tool, centred on and kept by the patient, for sharing information between the community and the hospital. It favours coordination between the different professionals involved in the care of patients with heart failure. Given to all patients from the beginning of their treatment, this personalised care programme enables the proposed treatment to be formalised. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  3. An analysis of potential stream fish and fish habitat monitoring procedures for the Inland Northwest: Annual Report 1999

    Treesearch

    James T. Peterson; Sherry P. Wollrab

    1999-01-01

    Natural resource managers in the Inland Northwest need tools for assessing the success or failure of conservation policies and the impacts of management actions on fish and fish habitats. Effectiveness monitoring is one such potential tool, but there are currently no established monitoring protocols. Since 1991, U.S. Forest Service biologists have used the standardized...

  4. Acoustic Emission Measurements for Tool Wear Evaluation in Drilling

    NASA Astrophysics Data System (ADS)

    Gómez, Martín P.; Migliori, Julio; Ruzzante, José E.; D'Attellis, Carlos E.

    2009-03-01

    In this work, the tool condition in a drilling process of SAE 1040 steel samples was studied by means of acoustic emission. The studied drill bits were modified with artificial and real failures, such as different degrees of wear on the cutting edge and the outer corner. Some correlation between the mean power of the acoustic emission signal and the drill bit wear condition was found.

  5. Centerline latch tool for contingency orbiter door closure

    NASA Technical Reports Server (NTRS)

    Trevino, R. C.

    1982-01-01

    The centerline latch tool was designed and developed as a manual EVA backup device for latching the Space Shuttle Orbiter's payload bay doors for reentry in case the existing centerline latches fail to operate properly. The tool was designed to satisfy a wide variety of structural, mechanical, and EVA requirements. It provides a load path for forces on the payload bay doors during reentry. Since the tool would be used by an EVA crewmember, its controls, handgrips, operating forces, and procedures must be within the capabilities of a partially restrained, suited crewmember in a zero-gravity environment. The centerline latch tool described was designed, developed, and tested to meet these requirements.

  6. Failure analysis of single-bolted joint for lightweight composite laminates and metal plate

    NASA Astrophysics Data System (ADS)

    Li, Linjie; Qu, Junli; Liu, Xiangdong

    2018-01-01

    A three-dimensional progressive damage model was developed in ANSYS to predict the damage accumulation of a single bolted joint in composite laminates under in-plane tensile loading. First, we describe the formulation and algorithm of this model. Second, we calculate the failure loads of the joint in fibre-reinforced epoxy laminated composite plates and compare them with the experimental results, which validates that our model can appropriately simulate the ultimate tensile strength of the joints and the whole failure process of the structure. Finally, this model is applied to study the failure process of the lightweight composite material (USN125). The study also has great potential to provide a strong basis for bolted joint design in composite laminates as well as a simple tool for comparing different laminate geometries and bolt arrangements.

  7. Nuclear medicine in the management of patients with heart failure: guidance from an expert panel of the International Atomic Energy Agency (IAEA).

    PubMed

    Peix, Amalia; Mesquita, Claudio Tinoco; Paez, Diana; Pereira, Carlos Cunha; Felix, Renata; Gutierrez, Claudia; Jaimovich, Rodrigo; Ianni, Barbara Maria; Soares, Jose; Olaya, Pastor; Rodriguez, Ma Victoria; Flotats, Albert; Giubbini, Raffaele; Travin, Mark; Garcia, Ernest V

    2014-08-01

    Heart failure is increasing worldwide at epidemic proportions, resulting in considerable disability, mortality, and increase in healthcare costs. Gated myocardial perfusion single photon emission computed tomography or PET imaging is the most prominent imaging modality capable of providing information on global and regional ventricular function, the presence of intraventricular synchronism, myocardial perfusion, and viability on the same test. In addition, 123I-mIBG scintigraphy is the only imaging technique approved by various regulatory agencies able to provide information regarding the adrenergic function of the heart. Therefore, both myocardial perfusion and adrenergic imaging are useful tools in the workup and management of heart failure patients. This guide is intended to reinforce the information on the use of nuclear cardiology techniques for the assessment of heart failure and associated myocardial disease.

  8. Effect of Thread and Rotating Speed on Material Flow Behavior and Mechanical Properties of Friction Stir Lap Welding Joints

    NASA Astrophysics Data System (ADS)

    Ji, Shude; Li, Zhengwei; Zhou, Zhenlu; Wu, Baosheng

    2017-10-01

    This study focused on the effects of thread on hook and cold lap formation, lap shear properties, and impact toughness of alclad 2024-T4 friction stir lap welding (FSLW) joints. In addition to the traditional threaded pin tool (TR-tool), three new tools with different thread locations and orientations were designed. Results showed that the thread significantly affected hook and cold lap morphologies and lap shear properties. The tool with a tip-threaded pin (T-tool) produced a joint with a flat hook and cold lap, which resulted in a shear fracture mode. The tool with a bottom-threaded pin (B-tool) eliminated the hook, and the tool with a reverse-threaded pin (R-tool) widened the stir zone. When using configuration A, the joints fabricated by the three new tools showed higher failure loads than the joint fabricated by the TR-tool, and the joint made with the T-tool had the best impact toughness. This study demonstrated the significance of the thread during FSLW and provides a reference for optimizing tool geometry.

  9. High-sensitivity c-reactive protein (hs-CRP) value with 90 days mortality in patients with heart failure

    NASA Astrophysics Data System (ADS)

    Nursyamsiah; Hasan, R.

    2018-03-01

    Hospitalization in patients with chronic heart failure is associated with high rates of mortality and morbidity, both during treatment and post-treatment. Despite the various therapies available today, mortality and re-hospitalization rates within 60 to 90 days post-hospitalization remain high; this period is known as the vulnerable phase. Prognostic evaluation tools in patients with heart failure are expected to help identify high-risk individuals so that more rigorous monitoring and interventions can be undertaken. To determine whether hs-CRP has an impact on mortality within 90 days in hospitalized patients with heart failure, an observational cohort study was conducted in 39 patients hospitalized due to worsening chronic heart failure. Patients were followed for up to 90 days after initial evaluation, with death as the primary endpoint. Among patients with an hs-CRP value >4.25 mg/L, 70% died within 90 days, whereas among those with an hs-CRP value <4.25 mg/L only 6.9% died (p < 0.001). In conclusion, hs-CRP values differed between patients with heart failure who died and those who survived within 90 days.

  10. Lungs in Heart Failure

    PubMed Central

    Apostolo, Anna; Giusti, Giuliano; Gargiulo, Paola; Bussotti, Maurizio; Agostoni, Piergiuseppe

    2012-01-01

    Lung function abnormalities both at rest and during exercise are frequently observed in patients with chronic heart failure, also in the absence of respiratory disease. Alterations of respiratory mechanics and of gas exchange capacity are strictly related to heart failure. Severe heart failure patients often show a restrictive respiratory pattern, secondary to heart enlargement and increased lung fluids, and impairment of alveolar-capillary gas diffusion, mainly due to an increased resistance to molecular diffusion across the alveolar capillary membrane. Reduced gas diffusion contributes to exercise intolerance and to a worse prognosis. Cardiopulmonary exercise test is considered the “gold standard” when studying the cardiovascular, pulmonary, and metabolic adaptations to exercise in cardiac patients. During exercise, hyperventilation and consequent reduction of ventilation efficiency are often observed in heart failure patients, resulting in an increased slope of the ventilation/carbon dioxide (VE/VCO2) relationship. Ventilatory efficiency is a strong prognostic and stratification marker. This paper describes the pulmonary abnormalities at rest and during exercise in patients with heart failure, highlighting the principal diagnostic tools for evaluation of lung function, the possible pharmacological interventions, and the parameters that could be useful in prognostic assessment of heart failure patients. PMID:23365739

  11. Failure Progress of 3D Reinforced GFRP Laminate during Static Bending, Evaluated by Means of Acoustic Emission and Vibrations Analysis.

    PubMed

    Koziol, Mateusz; Figlus, Tomasz

    2015-12-14

    The work aimed to assess the failure progress in a glass fiber-reinforced polymer laminate with a 3D-woven and (as a comparison) plain-woven reinforcement during static bending, using acoustic emission signals. An innovative method was applied to separate the signal coming from fiber fracture from that coming from matrix fracture, using the energy of the acoustic event as the criterion. The failure progress during static bending was alternatively analyzed by evaluation of the vibration signal, which made it possible to validate the results of the acoustic emission. Both acoustic emission and vibration signal analysis proved to be effective tools for registering failure effects in composite laminates; vibration analysis is more complicated methodologically, yet more precise. The failure progress of the 3D laminate is "safer" and more beneficial than that of the plain-woven laminate: it exhibits less rapid load-capacity drops and a higher fiber effort contribution at the moment of the main laminate failure.

  12. A new yield and failure theory for composite materials under static and dynamic loading

    DOE PAGES

    Daniel, Isaac M.; Daniel, Sam M.; Fenner, Joel S.

    2017-09-12

    In order to facilitate and accelerate the process of introducing, evaluating and adopting new material systems, it is important to develop/establish comprehensive and effective procedures of characterization, modeling and failure prediction of composite structures based on the properties of the constituent materials, e.g., fibers, matrix, and the single ply or lamina. A new yield/failure theory is proposed for predicting lamina yielding and failure under multi-axial states of stress including strain rate effects. It is based on the equivalent stress concept derived from energy principles and is expressed in terms of a single criterion. It is presented in the form of master yield and failure envelopes incorporating strain rate effects. The theory can be further adapted and extended to the prediction of in situ first ply yielding and failure (FPY and FPF) and progressive damage of multi-directional laminates under static and dynamic loadings. The significance of this theory is that it allows for rapid screening of new composite materials without extensive testing and offers easily implemented design tools.

  13. Design study of the geometry of the blanking tool to predict the burr formation of Zircaloy-4 sheet

    NASA Astrophysics Data System (ADS)

    Ha, Jisun; Lee, Hyungyil; Kim, Dongchul; Kim, Naksoo

    2013-12-01

    In this work, we investigated factors that influence burr formation in zircaloy-4 sheet used for the spacer grids of nuclear fuel rods. We varied clearance and punch velocity to study the failure parameters of the GTN model, and we varied the shearing angle and corner radius of an L-shaped punch to study the geometric factors of the punch. First, we carried out blanking tests with the failure parameters of the GTN model using the L-shaped punch and, by analyzing the sheared edges, investigated how the failure parameters and geometric factors affect burr formation; the influence of the geometric factors proved to be at least as strong as that of the failure parameters. The sheared edges and burr formation were then investigated for the same variables using an FE analysis model, which confirmed that the geometric factors affect burr formation more than the failure parameters. To check the reliability of the FE model, the blanking force and the sheared edges obtained from experiments were compared with computations that account for heat transfer.
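
    For reference, the GTN (Gurson-Tvergaard-Needleman) damage model invoked above is commonly written with the yield function below; this is the standard textbook form in generic notation, not an equation reproduced from the paper:

    \[
    \Phi = \left(\frac{\sigma_{eq}}{\sigma_y}\right)^{2}
         + 2 q_1 f^{*} \cosh\!\left(\frac{3 q_2 \sigma_m}{2 \sigma_y}\right)
         - \left(1 + q_3 f^{*2}\right) = 0
    \]

    where \sigma_{eq} is the von Mises equivalent stress, \sigma_m the mean (hydrostatic) stress, \sigma_y the matrix yield stress, f^{*} the effective void volume fraction, and q_1, q_2, q_3 the Tvergaard fitting parameters. The "failure parameters" varied in a study of this kind are typically those governing the nucleation, growth, and coalescence terms in the evolution of f^{*}.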

  14. CRISPR/Cas9 Technology Targeting Fas Gene Protects Mice From Concanavalin-A Induced Fulminant Hepatic Failure.

    PubMed

    Liang, Wei-Cheng; Liang, Pu-Ping; Wong, Cheuk-Wa; Ng, Tzi-Bun; Huang, Jun-Jiu; Zhang, Jin-Fang; Waye, Mary Miu-Yee; Fu, Wei-Ming

    2017-03-01

    Fulminant hepatic failure is a life-threatening disease which occurs in patients without preexisting liver disease. Nowadays, there is no ideal therapeutic tool in the treatment of fulminant hepatic failure. Recent studies suggested that a novel technology termed CRISPR/Cas9 may be a promising approach for the treatment of fulminant hepatic failure. In this project, we have designed single chimeric guide RNAs specifically targeting the genomic regions of mouse Fas gene. The in vitro and in vivo effects of sgRNAs on the production of Fas protein were examined in cultured mouse cells and in a hydrodynamic injection-based mouse model, respectively. The in vivo delivery of CRISPR/Cas9 could maintain liver homeostasis and protect hepatocytes from Fas-mediated cell apoptosis in the fulminant hepatic failure model. Our study indicates the clinical potential of developing the CRISPR/Cas9 system as a novel therapeutic strategy to rescue Concanavalin-A-induced fulminant hepatic failure in the mouse model. J. Cell. Biochem. 118: 530-536, 2017. © 2016 Wiley Periodicals, Inc.

  15. A new yield and failure theory for composite materials under static and dynamic loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daniel, Isaac M.; Daniel, Sam M.; Fenner, Joel S.

    In order to facilitate and accelerate the process of introducing, evaluating and adopting new material systems, it is important to develop/establish comprehensive and effective procedures of characterization, modeling and failure prediction of composite structures based on the properties of the constituent materials, e.g., fibers, matrix, and the single ply or lamina. A new yield/failure theory is proposed for predicting lamina yielding and failure under multi-axial states of stress including strain rate effects. It is based on the equivalent stress concept derived from energy principles and is expressed in terms of a single criterion. It is presented in the form of master yield and failure envelopes incorporating strain rate effects. The theory can be further adapted and extended to the prediction of in situ first ply yielding and failure (FPY and FPF) and progressive damage of multi-directional laminates under static and dynamic loadings. The significance of this theory is that it allows for rapid screening of new composite materials without extensive testing and offers easily implemented design tools.

  16. Analysis and Test Correlation of Proof of Concept Box for Blended Wing Body-Low Speed Vehicle

    NASA Technical Reports Server (NTRS)

    Spellman, Regina L.

    2003-01-01

    The Low Speed Vehicle (LSV) is a 14.2% scale remotely piloted vehicle of the revolutionary Blended Wing Body concept. The design of the LSV includes an all-composite airframe. Due to internal manufacturing capability restrictions, room-temperature layups were necessary. An extensive materials testing and manufacturing process development effort was undertaken to establish a process that would achieve the high-modulus/low-weight properties required to meet the design requirements. The analysis process involved a loads development effort that incorporated aero loads to determine internal forces that could be applied to a traditional FEM of the vehicle and to conduct detailed component analyses. A new tool, Hypersizer, was added to the design process to address various composite failure modes and to optimize the skin panel thickness of the upper and lower skins of the vehicle. The analysis required an iterative approach as material properties were continually changing. As part of the material characterization effort, test articles, including a proof-of-concept wing box and a full-scale wing, were fabricated. The proof-of-concept box was fabricated based on very preliminary material studies, tested in bending, torsion, and shear, and then tested to failure under shear. The box was also analyzed using Nastran and Hypersizer, and the results of both analyses were scaled to determine the predicted failure load. The actual failure occurred at 899 lbs; the Nastran analysis predicted failure at 1167 lbs, and the Hypersizer analysis predicted a lower failure load of 960 lbs. The Nastran analysis alone was not sufficient to predict the failure load because it does not identify local composite failure modes, which have traditionally been assessed using closed-form solutions. Although Hypersizer is typically used as an optimizer in the design process, the failure prediction was used to help gain acceptance of and confidence in this new tool. The correlated models and process were to be used to analyze the full BWB-LSV airframe design. The analysis and correlation with test results of the proof-of-concept box are presented here, including a comparison of the Nastran and Hypersizer results.

  17. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.
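
    As a minimal sketch of the Monte Carlo step described above (not the actual NASA PDA code; the response function, parameter names, and distributions are invented for illustration), the failure probability for one scenario could be estimated as follows:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 1_000_000  # number of Monte Carlo samples

# Each driving parameter is treated as a random variable with an assumed
# distribution (hypothetical values, for illustration only).
burn_rate  = rng.normal(1.00, 0.03, n)    # normalized propellant burn rate
nozzle_eff = rng.normal(0.98, 0.01, n)    # nozzle efficiency
aero_load  = rng.lognormal(0.0, 0.25, n)  # normalized aerodynamic load

# Physics-based surrogate: positive margin means the vehicle tolerates the
# failure scenario; margin <= 0 is counted as a failure.
margin = 1.15 * burn_rate * nozzle_eff - 0.45 * aero_load

p_fail = np.mean(margin <= 0.0)
print(f"Estimated failure probability: {p_fail:.2e}")
```

    Sensitivity to each input can then be gauged by shifting one distribution at a time and observing the change in the estimated probability.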

  18. Enlarging the Societal Pie Through Wise Legislation: A Psychological Perspective.

    PubMed

    Baron, Jonathan; Bazerman, Max H; Shonk, Katherine

    2006-06-01

    We offer a psychological perspective to explain the failure of governments to create near-Pareto improvements. Our tools for analyzing these failures reflect the difficulties people have trading small losses for large gains: the fixed-pie approach to negotiations, the omission bias and status quo bias, parochialism and dysfunctional competition, and the neglect of secondary effects. We examine the role of human judgment in the failure to find wise trade-offs by discussing diverse applications of citizen and government decision making, including AIDS treatment, organ-donation systems, endangered-species protection, subsidies, and free trade. Our overall goal is to offer a psychological approach for understanding suboptimality in government decision making. © 2006 Association for Psychological Science.

  19. Analysis of Full-Test tools and their limitations as applied to terminal junction blocks

    NASA Technical Reports Server (NTRS)

    Smith, J. L.

    1983-01-01

    Discovery of unlocked contacts in Deutsch Block terminal junctions in Solid Rocket Booster flight hardware prompted an investigation into pull test techniques to help insure against possible failures. Internal frictional forces between socket and pin and between wire and grommet were examined. Pull test force must be greater than internal friction yet less than the crimp strength of the pin or socket. For this reason, a 100 percent accurate test is impossible. Test tools were evaluated. Available tools are adequate for pull testing.

  20. Monitoring the response to pharmacologic therapy in patients with stable chronic heart failure: is BNP or NT-proBNP a useful assessment tool?

    PubMed

    Balion, Cynthia M; McKelvie, Robert S; Reichert, Sonja; Santaguida, Pasqualina; Booker, Lynda; Worster, Andrew; Raina, Parminder; McQueen, Matthew J; Hill, Stephen

    2008-03-01

    B-type natriuretic peptides are biomarkers of heart failure (HF) that can decrease following treatment. We sought to determine whether B-type natriuretic peptide (BNP) or N-terminal proBNP (NT-proBNP) concentration changes occurred in parallel with changes in other measures of heart failure following treatment. We conducted a systematic review of the literature for studies that assessed B-type natriuretic peptide measurements in treatment monitoring of patients with stable chronic heart failure. Selected studies had to include at least three consecutive measurements of BNP or NT-proBNP. Of 4338 citations screened, only 12 met all of the selection criteria. The selected studies included populations with a wide range of heart failure severity and therapy. BNP and NT-proBNP decreased following treatment in nine studies, and the decreases were associated with improvement in clinical measures of HF. There were limited data to support using BNP or NT-proBNP to monitor therapy in patients with HF.

  1. Cognitive Dysfunction in Patients with Renal Failure Requiring Hemodialysis

    PubMed Central

    Thimmaiah, Rohini; Murthy, K. Krishna; Pinto, Denzil

    2012-01-01

    Background and Objectives: Renal failure patients show significant impairment on measures of attention and memory, and consistently perform significantly better on these neuropsychological measures approximately 24 hours after hemodialysis treatment. The objective was to determine the extent of cognitive dysfunction in patients with renal failure requiring hemodialysis. Materials and Methods: A total of 60 subjects, comprising 30 renal failure patients and 30 controls, were recruited. The sample was matched for age, sex, and socioeconomic status. The tools used were the Standardized Mini-Mental State Examination and the Brief Cognitive Rating Scale. Results: Cognitive dysfunction was highest in the pre-dialysis group across all five dimensions (concentration, recent memory, past memory, orientation and functioning, and self-care), and lowest in the 24-hour post-dialysis group. This difference was statistically significant (P=0.001). Conclusion: Patients with renal failure exhibited pronounced cognitive impairment, and these functions improved significantly after the introduction of hemodialysis. PMID:23439613

  2. Utility of Failure Mode and Effect Analysis to Improve Safety in Suctioning by Orotracheal Tube.

    PubMed

    Vázquez-Valencia, Agustín; Santiago-Sáez, Andrés; Perea-Pérez, Bernardo; Labajo-González, Elena; Albarrán-Juan, Maria Elena

    2017-02-01

    The objective of the study was to use the Failure Mode and Effect Analysis (FMEA) tool to analyze the technique of secretion suctioning in patients with an endotracheal tube who were admitted to an intensive care unit. Brainstorming was carried out within the service to determine the most frequent potential errors in the process. The FMEA was then applied through all of its stages, prioritizing risk according to the risk priority number (RPN) and selecting improvement actions for failure modes with an RPN above 300. We obtained 32 failure modes, of which 13 exceeded an RPN of 300, and 21 improvement actions were proposed for them. FMEA allows possible failures to be identified so that improvement actions can be proposed for those carrying the highest risk. Copyright © 2016 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.
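
    The RPN arithmetic used in this kind of FMEA is easy to reproduce. The sketch below (with invented failure modes and scores, not the study's data) computes the RPN as the product of occurrence, severity, and detectability and applies the 300 threshold used above:

```python
# Minimal FMEA sketch: RPN = O * S * D, each scored from 1 to 10.
# The failure modes and scores below are invented for illustration.
failure_modes = {
    "suction pressure set too high": (4, 8, 5),
    "catheter inserted too deep":    (3, 9, 6),
    "break in sterile technique":    (6, 7, 8),
}

for name, (occurrence, severity, detectability) in failure_modes.items():
    rpn = occurrence * severity * detectability
    flag = "improvement action required" if rpn > 300 else "acceptable"
    print(f"{name}: RPN = {rpn} ({flag})")
```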

  3. Design, development, and fabrication of extravehicular activity tools for support of the transfer orbit stage

    NASA Technical Reports Server (NTRS)

    Albritton, L. M.; Redmon, J. W.; Tyler, T. R.

    1993-01-01

    Seven extravehicular activity (EVA) tools and a tool carrier have been designed and developed by MSFC in order to provide a two-fault-tolerant system for the transfer orbit stage (TOS) shuttle mission. The TOS is an upper-stage booster for delivering payloads to orbits higher than the shuttle can achieve. Payloads are required not to endanger the shuttle even after two failures have occurred, and the Airborne Support Equipment (ASE) used in restraining and deploying TOS does not meet this criterion. The seven EVA tools designed provide the required redundancy with no impact to the TOS hardware.

  4. The evolution of diagnosis-related groups (DRGs): from its beginnings in case-mix and resource use theory, to its implementation for payment and now for its current utilization for quality within and outside the hospital.

    PubMed

    Goldfield, Norbert

    2010-01-01

    Policymakers are searching for ways to control health care costs and improve quality. Diagnosis-related groups (DRGs) are by far the most important cost control and quality improvement tool that governments and private payers have implemented. This article reviews why DRGs have had this singular success both in the hospital sector and, over the past 10 years, in ambulatory and managed care settings. Last, the author reviews current trends in the development and implementation of tools that have the key ingredients of DRG success: categorical clinical model, separation of the clinical model from payment weights, separate payment adjustments for nonclinical factors, and outlier payments. Virtually all current tools used to manage health care costs and improve quality do not have these characteristics. This failure explains a key reason for the failure, for example, of the Medicare Advantage program to control health care costs. This article concludes with a discussion of future developments for DRG-type models outside the hospital sector.

  5. 'Hearts and minds': association, causation and implication of cognitive impairment in heart failure.

    PubMed

    Cannon, Jane A; McMurray, John Jv; Quinn, Terry J

    2015-01-01

    The clinical syndrome of heart failure is one of the leading causes of hospitalisation and mortality in older adults. An association between cognitive impairment and heart failure is well described but our understanding of the relationship between the two conditions remains limited. In this review we provide a synthesis of available evidence, focussing on epidemiology, the potential pathogenesis, and treatment implications of cognitive decline in heart failure. Most evidence available relates to heart failure with reduced ejection fraction and the syndromes of chronic cognitive decline or dementia. These conditions are only part of a complex heart failure-cognition paradigm. Associations between cognition and heart failure with preserved ejection fraction and between acute delirium and heart failure also seem evident and where data are available we will discuss these syndromes. Many questions remain unanswered regarding heart failure and cognition. Much of the observational evidence on the association is confounded by study design, comorbidity and insensitive cognitive assessment tools. If a causal link exists, there are several potential pathophysiological explanations. Plausible underlying mechanisms relating to cerebral hypoperfusion or occult cerebrovascular disease have been described and it seems likely that these may coexist and exert synergistic effects. Despite the prevalence of the two conditions, when cognitive impairment coexists with heart failure there is no specific guidance on treatment. Institution of evidence-based heart failure therapies that reduce mortality and hospitalisations seems intuitive and there is no signal that these interventions have an adverse effect on cognition. However, cognitive impairment will present a further barrier to the often complex medication self-management that is required in contemporary heart failure treatment.

  6. Understanding and Resolving Failures in Human-Robot Interaction: Literature Review and Model Development

    PubMed Central

    Honig, Shanee; Oron-Gilad, Tal

    2018-01-01

    While substantial effort has been invested in making robots more reliable, experience demonstrates that robots operating in unstructured environments are often challenged by frequent failures. Despite this, robots have not yet reached a level of design that allows effective management of faulty or unexpected behavior by untrained users. To understand why this may be the case, an in-depth literature review was done to explore when people perceive and resolve robot failures, how robots communicate failure, how failures influence people's perceptions and feelings toward robots, and how these effects can be mitigated. Fifty-two studies were identified relating to communicating failures and their causes, the influence of failures on human-robot interaction (HRI), and mitigating failures. Since little research has been done on these topics within the HRI community, insights from the fields of human computer interaction (HCI), human factors engineering, cognitive engineering and experimental psychology are presented and discussed. Based on the literature, we developed a model of information processing for robotic failures (Robot Failure Human Information Processing, RF-HIP), that guides the discussion of our findings. The model describes the way people perceive, process, and act on failures in human robot interaction. The model includes three main parts: (1) communicating failures, (2) perception and comprehension of failures, and (3) solving failures. Each part contains several stages, all influenced by contextual considerations and mitigation strategies. Several gaps in the literature have become evident as a result of this evaluation. More focus has been given to technical failures than interaction failures. Few studies focused on human errors, on communicating failures, or the cognitive, psychological, and social determinants that impact the design of mitigation strategies. By providing the stages of human information processing, RF-HIP can be used as a tool to promote the development of user-centered failure-handling strategies for HRIs.

  7. Failure modes and effects analysis (FMEA) for Gamma Knife radiosurgery.

    PubMed

    Xu, Andy Yuanguang; Bhatnagar, Jagdish; Bednarz, Greg; Flickinger, John; Arai, Yoshio; Vacsulka, Jonet; Feng, Wenzheng; Monaco, Edward; Niranjan, Ajay; Lunsford, L Dade; Huq, M Saiful

    2017-11-01

    Gamma Knife radiosurgery is a highly precise and accurate treatment technique for treating brain diseases with low risk of serious error that nevertheless could potentially be reduced. We applied the AAPM Task Group 100 recommended failure modes and effects analysis (FMEA) tool to develop a risk-based quality management program for Gamma Knife radiosurgery. A team consisting of medical physicists, radiation oncologists, neurosurgeons, radiation safety officers, nurses, operating room technologists, and schedulers at our institution and an external physicist expert on Gamma Knife was formed for the FMEA study. A process tree and a failure mode table were created for the Gamma Knife radiosurgery procedures using the Leksell Gamma Knife Perfexion and 4C units. Three scores for the probability of occurrence (O), the severity (S), and the probability of no detection for failure mode (D) were assigned to each failure mode by 8 professionals on a scale from 1 to 10. An overall risk priority number (RPN) for each failure mode was then calculated from the averaged O, S, and D scores. The coefficient of variation for each O, S, or D score was also calculated. The failure modes identified were prioritized in terms of both the RPN scores and the severity scores. The established process tree for Gamma Knife radiosurgery consists of 10 subprocesses and 53 steps, including a subprocess for frame placement and 11 steps that are directly related to the frame-based nature of the Gamma Knife radiosurgery. Out of the 86 failure modes identified, 40 Gamma Knife specific failure modes were caused by the potential for inappropriate use of the radiosurgery head frame, the imaging fiducial boxes, the Gamma Knife helmets and plugs, the skull definition tools as well as other features of the GammaPlan treatment planning system. The other 46 failure modes are associated with the registration, imaging, image transfer, contouring processes that are common for all external beam radiation therapy techniques. The failure modes with the highest hazard scores are related to imperfect frame adaptor attachment, bad fiducial box assembly, unsecured plugs/inserts, overlooked target areas, and undetected machine mechanical failure during the morning QA process. The implementation of the FMEA approach for Gamma Knife radiosurgery enabled deeper understanding of the overall process among all professionals involved in the care of the patient and helped identify potential weaknesses in the overall process. The results of the present study give us a basis for the development of a risk based quality management program for Gamma Knife radiosurgery. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
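
    The scoring procedure described above (eight professionals, scales of 1 to 10, averaged scores, and a coefficient of variation per score) can be sketched as follows; the rater scores are invented for illustration:

```python
import numpy as np

# Scores from 8 raters for one failure mode (invented values).
scores = {
    "O": np.array([3, 4, 2, 3, 5, 3, 4, 3]),  # probability of occurrence
    "S": np.array([8, 9, 7, 8, 8, 9, 7, 8]),  # severity
    "D": np.array([6, 5, 7, 6, 5, 6, 6, 7]),  # probability of no detection
}

means = {k: v.mean() for k, v in scores.items()}
# The coefficient of variation flags scores on which raters disagreed most.
cvs = {k: v.std(ddof=1) / v.mean() for k, v in scores.items()}

rpn = means["O"] * means["S"] * means["D"]
print(f"RPN = {rpn:.1f}")
print(", ".join(f"CV({k}) = {cv:.2f}" for k, cv in cvs.items()))
```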

  8. RESTRICTIVE CARDIOMYOPATHY AND SECONDARY CONGESTIVE HEART FAILURE IN A MCDOWELL'S CARPET PYTHON (MORELIA SPILOTA MCDOWELLI).

    PubMed

    Schilliger, Lionel; Chetboul, Valérie; Damoiseaux, Cécile; Nicolier, Alexandra

    2016-12-01

    Echocardiography is an established and noninvasive diagnostic tool used in herpetologic cardiology. Various cardiac lesions have been previously described in reptiles with the exception of restrictive cardiomyopathy. In this case report, restrictive cardiomyopathy and congestive heart failure associated with left atrial and sinus venosus dilation were diagnosed in a 2-yr-old captive lethargic McDowell's carpet python ( Morelia spilota mcdowelli), based on echocardiographic, Doppler, and histopathologic examinations. This cardiomyopathy was also associated with thrombosis within the sinus venosus.

  9. User-defined Material Model for Thermo-mechanical Progressive Failure Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    2008-01-01

    Previously a user-defined material model for orthotropic bimodulus materials was developed for linear and nonlinear stress analysis of composite structures using either shell or solid finite elements within a nonlinear finite element analysis tool. Extensions of this user-defined material model to thermo-mechanical progressive failure analysis are described, and the required input data are documented. The extensions include providing for temperature-dependent material properties, archival of the elastic strains, and a thermal strain calculation for materials exhibiting a stress-free temperature.

  10. The NOL Gap Test: Past, present, and future

    NASA Technical Reports Server (NTRS)

    Jacobs, S. J.; Price, D.

    1980-01-01

    A brief history of the development of the gap test is presented, with emphasis on accumulated knowledge of how best to use it as a tool to assess the relative shock sensitivity of explosives and propellants. Present information about the detonation process in build-up, steady state, and failure is used to show that failure diameter cannot give the desired shock sensitivity assessment. Suggestions are made for improving the test so that it rises to the level of a true experiment. The use of spherical donors, water as the attenuator, and camera instrumentation are briefly discussed.

  11. Casualty Risk Assessment Controlled Re-Entry of EPS - Ariane 5ES - ATV Mission

    NASA Astrophysics Data System (ADS)

    Arnal, M.-H.; Laine, N.; Aussilhou, C.

    2012-01-01

    To fulfil its mission of compliance checking under the French Space Operations Act, CNES has developed the ELECTRA© tool to estimate the casualty risk induced by a space activity (such as a rocket launch or the controlled or uncontrolled re-entry of a space object on Earth). This article describes the application of the tool to the EPS controlled re-entry during the second Ariane 5ES flight (the Johannes Kepler mission, launched in February 2011). EPS is the Ariane 5ES upper composite, which is de-orbited from a 260 km circular orbit after its main mission (release of the Automated Transfer Vehicle, ATV). After a brief description of the launcher, the ATV mission, and all the failure cases taken into account in the mission design (which lead to "back-up scenarios" in the flight software), the article describes the steps leading to the casualty risk assessment (in case of failure) with ELECTRA©. In particular, the presence on board of two propulsive means of de-orbiting (the main engine of EPS, and four ACS longitudinal nozzles in case of main engine failure or propellant exhaustion) leads to a low residual casualty risk.
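
    Casualty-risk tools of this kind typically evaluate a casualty expectation of the general form below; this is the standard textbook expression for re-entry debris risk, not a documented internal equation of ELECTRA©:

    \[
    E_c = \sum_i P_i \, \rho_i \, A_{c,i}
    \]

    where P_i is the probability that fragment i reaches the ground in a given region, \rho_i the population density of that region, and A_{c,i} the casualty area of the fragment; the risk is deemed acceptable when E_c falls below the regulatory threshold.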

  12. Bridge Scour Technology Transfer

    DOT National Transportation Integrated Search

    2018-01-24

    Scour and flooding are the leading causes of bridge failures in the United States and therefore should be monitored. New applications of tools and technologies are being developed, tested, and implemented to reduce bridge scour risk. The National Coo...

  13. Characterization of emission microscopy and liquid crystal thermography in IC fault localization

    NASA Astrophysics Data System (ADS)

    Lau, C. K.; Sim, K. S.

    2013-05-01

    This paper characterizes two fault localization techniques - Emission Microscopy (EMMI) and Liquid Crystal Thermography (LCT) - by using integrated circuit (IC) leakage failures. The majority of today's semiconductor failures do not reveal a clear visual defect on the die surface and therefore require fault localization tools to identify the fault location. Among the various fault localization tools, liquid crystal thermography and frontside emission microscopy are commonly used in most semiconductor failure analysis laboratories. Many analysts mistakenly believe that the two techniques are the same and that both detect hot spots in chips failing with shorts or leakage. As a result, analysts tend to use only LCT, since this technique involves a very simple test setup compared to EMMI. The omission of EMMI as the alternative technique in fault localization often leads to incomplete analysis when LCT fails to localize any hot spot on a failing chip. This research was therefore established to characterize and compare both techniques in terms of their sensitivity in detecting the fault location in common semiconductor failures. A new method was also proposed as an alternative technique, i.e. the backside LCT technique. Both techniques successfully detected the defect locations resulting from the leakage failures. LCT was observed to be more sensitive than EMMI in the frontside analysis approach; on the other hand, EMMI performed better in the backside analysis approach. LCT was more sensitive in localizing ESD defect locations, and EMMI was more sensitive in detecting non-ESD defect locations. Backside LCT was proven to work as effectively as frontside LCT and is ready to serve as an alternative to backside EMMI. The research confirmed that LCT detects heat generation whereas EMMI detects photon emission (recombination radiation). The results also suggest that the two techniques complement each other in IC fault localization, and it is necessary for a failure analyst to use both when one of them produces no result.

  14. The 2010 Canadian Cardiovascular Society guidelines for the diagnosis and management of heart failure update: Heart failure in ethnic minority populations, heart failure and pregnancy, disease management, and quality improvement/assurance programs

    PubMed Central

    Howlett, Jonathan G; McKelvie, Robert S; Costigan, Jeannine; Ducharme, Anique; Estrella-Holder, Estrellita; Ezekowitz, Justin A; Giannetti, Nadia; Haddad, Haissam; Heckman, George A; Herd, Anthony M; Isaac, Debra; Kouz, Simon; Leblanc, Kori; Liu, Peter; Mann, Elizabeth; Moe, Gordon W; O’Meara, Eileen; Rajda, Miroslav; Siu, Samuel; Stolee, Paul; Swiggum, Elizabeth; Zeiroth, Shelley

    2010-01-01

    Since 2006, the Canadian Cardiovascular Society heart failure (HF) guidelines have published annual focused updates for cardiovascular care providers. The 2010 Canadian Cardiovascular Society HF guidelines update focuses on an increasing issue in the western world – HF in ethnic minorities – and in an uncommon but important setting – the pregnant patient. Additionally, due to increasing attention recently given to the assessment of how care is delivered and measured, two critically important topics – disease management programs in HF and quality assurance – have been included. Both of these topics were written from a clinical perspective. It is hoped that the present update will become a useful tool for health care providers and planners in the ongoing evolution of care for HF patients in Canada. PMID:20386768

  15. Static and fatigue testing of full-scale fuselage panels fabricated using a Therm-X(R) process

    NASA Technical Reports Server (NTRS)

    Dinicola, Albert J.; Kassapoglou, Christos; Chou, Jack C.

    1992-01-01

    Large, curved, integrally stiffened composite panels representative of an aircraft fuselage structure were fabricated using a Therm-X process, an alternative concept to conventional two-sided hard tooling and contour vacuum bagging. Panels subsequently were tested under pure shear loading in both static and fatigue regimes to assess the adequacy of the manufacturing process, the effectiveness of damage tolerant design features co-cured with the structure, and the accuracy of finite element and closed-form predictions of postbuckling capability and failure load. Test results indicated the process yielded panels of high quality and increased damage tolerance through suppression of common failure modes such as skin-stiffener separation and frame-stiffener corner failure. Finite element analyses generally produced good predictions of postbuckled shape, and a global-local modelling technique yielded failure load predictions that were within 7% of the experimental mean.

  16. Nuclear medicine in the management of patients with heart failure: guidance from an expert panel of the International Atomic Energy Agency (IAEA)

    PubMed Central

    Peix, Amalia; Mesquita, Claudio Tinoco; Paez, Diana; Pereira, Carlos Cunha; Felix, Renata; Gutierrez, Claudia; Jaimovich, Rodrigo; Ianni, Barbara Maria; Soares, Jose; Olaya, Pastor; Rodriguez, Ma. Victoria; Flotats, Albert; Giubbini, Raffaele; Travin, Mark

    2014-01-01

    Heart failure is increasing worldwide at epidemic proportions, resulting in considerable disability, mortality, and increase in healthcare costs. Gated myocardial perfusion single photon emission computed tomography or PET imaging is the most prominent imaging modality capable of providing information on global and regional ventricular function, the presence of intraventricular synchronism, myocardial perfusion, and viability on the same test. In addition, 123I-mIBG scintigraphy is the only imaging technique approved by various regulatory agencies able to provide information regarding the adrenergic function of the heart. Therefore, both myocardial perfusion and adrenergic imaging are useful tools in the workup and management of heart failure patients. This guide is intended to reinforce the information on the use of nuclear cardiology techniques for the assessment of heart failure and associated myocardial disease. PMID:24781009

  17. Experimental test of theory for the stability of partially saturated vertical cut slopes

    USGS Publications Warehouse

    Morse, Michael M.; Lu, N.; Wayllace, Alexandra; Godt, Jonathan W.; Take, W.A.

    2014-01-01

    This paper extends Culmann's vertical-cut analysis to unsaturated soils. To test the extended theory, unsaturated sand was compacted to a uniform porosity and moisture content in a laboratory apparatus. A sliding door that extended the height of the free face of the slope was lowered until the vertical cut failed. Digital images of the slope cross section and upper surface were acquired concurrently. A recently developed particle image velocimetry (PIV) tool was used to quantify soil displacement. The PIV analysis showed strain localization at varying distances from the sliding door prior to failure. The areas of localized strain were coincident with the location of the slope crest after failure. Shear-strength and soil-water-characteristic parameters of the sand were independently tested for use in extended analyses of the vertical-cut stability and of the failure plane angle. Experimental failure heights were within 22.3% of the heights predicted using the extended theory.
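
    For context, the classical Culmann result that the paper extends gives the critical height of a vertical cut in a dry soil as (standard soil-mechanics notation; the unsaturated extension in the paper adds a suction-dependent apparent cohesion):

    \[
    H_c = \frac{4c}{\gamma} \tan\!\left(45^\circ + \frac{\phi}{2}\right)
    \]

    where c is the cohesion, \gamma the unit weight, and \phi the friction angle of the soil.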

  18. Does an inter-flaw length control the accuracy of rupture forecasting in geological materials?

    NASA Astrophysics Data System (ADS)

    Vasseur, Jérémie; Wadsworth, Fabian B.; Heap, Michael J.; Main, Ian G.; Lavallée, Yan; Dingwell, Donald B.

    2017-10-01

    Multi-scale failure of porous materials is an important phenomenon in nature and in material physics - from controlled laboratory tests to rockbursts, landslides, volcanic eruptions and earthquakes. A key unsolved research question is how to accurately forecast the time of system-sized catastrophic failure, based on observations of precursory events such as acoustic emissions (AE) in laboratory samples, or, on a larger scale, small earthquakes. Until now, the length scale associated with precursory events has not been well quantified, resulting in forecasting tools that are often unreliable. Here we test the hypothesis that the accuracy of the forecast failure time depends on the inter-flaw distance in the starting material. We use new experimental datasets for the deformation of porous materials to infer the critical crack length at failure from a static damage mechanics model. The style of acceleration of AE rate prior to failure, and the accuracy of forecast failure time, both depend on whether the cracks can span the inter-flaw length or not. A smooth inverse power-law acceleration of AE rate to failure, and an accurate forecast, occurs when the cracks are sufficiently long to bridge pore spaces. When this is not the case, the predicted failure time is much less accurate and failure is preceded by an exponential AE rate trend. Finally, we provide a quantitative and pragmatic correction for the systematic error in the forecast failure time, valid for structurally isotropic porous materials, which could be tested against larger-scale natural failure events, with suitable scaling for the relevant inter-flaw distances.
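
    The "smooth inverse power-law acceleration" referred to above is usually written in the form of the standard failure forecast method (generic notation, not the paper's own equation):

    \[
    \dot{N}(t) = k \,(t_f - t)^{-\alpha}
    \]

    where \dot{N} is the AE (or seismicity) rate and t_f the failure time; for \alpha \approx 1 the inverse rate 1/\dot{N} decreases roughly linearly in time, so extrapolating it to zero yields the forecast of t_f.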

  19. Coexisting Frailty, Cognitive Impairment, and Heart Failure: Implications for Clinical Care

    PubMed Central

    Butts, Brittany; Gary, Rebecca

    2015-01-01

    Objective To review some of the proposed pathways that increase frailty risk in older persons with heart failure and to discuss tools that may be used to assess changes in physical and cognitive functioning in this population, in order to assist with appropriate and timely intervention. Methods Review of the literature. Results Heart failure is the only cardiovascular disease that is increasing at epidemic proportions, largely due to an aging society and therapeutic advances in disease management. Because heart failure is largely a cardiogeriatric syndrome, age-related syndromes such as frailty and cognitive impairment are common in heart failure patients. Compared with age-matched counterparts, older adults with heart failure are 4 to 6 times more likely to be frail or cognitively impaired. The reason for the high prevalence of frailty and cognitive impairment in this population is not well understood but likely reflects the synergistic effects of heart failure and aging, which may heighten vulnerability to stressors and accelerate loss of physiologic reserve. Despite the high prevalence of frailty and cognitive impairment in the heart failure population, these conditions are not routinely screened for in clinical practice settings, and guidelines on optimal assessment strategies are lacking. Conclusion Persons with heart failure are at increased risk for frailty, which may worsen symptoms, impair self-management, and lead to worse heart failure outcomes. Early detection of frailty and cognitive impairment may provide an opportunity for intervention and a key strategy for improving clinical outcomes in older adults with heart failure. PMID:26594103

  20. Application of Vibration and Oil Analysis for Reliability Information on Helicopter Main Rotor Gearbox

    NASA Astrophysics Data System (ADS)

    Murrad, Muhamad; Leong, M. Salman

    Based on the experiences of the Malaysian Armed Forces (MAF), failure of the main rotor gearbox (MRGB) was one of the major contributing factors to helicopter breakdowns. Even though vibration and oil analysis are effective techniques for monitoring the health of helicopter components, these two techniques were rarely combined to form an effective assessment tool in the MAF. Results of the oil analysis were often used only for the oil-changing schedule, while assessments of MRGB condition were based mainly on overall vibration readings. A study group was formed and given a mandate to improve the maintenance strategy of the S61-A4 helicopter fleet in the MAF. The improvement consisted of a structured approach to reassessing and redefining the maintenance actions that should be taken for the MRGB. Basic and enhanced condition monitoring (CM) tools were investigated to address the predominant failures of the MRGB. Quantitative accelerated life testing (QALT) was considered in this work with the intent of obtaining the required reliability information in a shorter time, with tests under normal stress conditions. When performed correctly, these tests can provide valuable information about MRGB performance under normal operating conditions, enabling maintenance personnel to make decisions more quickly, accurately, and economically. The time-to-failure and probability-of-failure information for the MRGB were generated by applying QALT analysis principles. This study is anticipated to change the MAF's approach to CM dramatically, bringing significant savings and various other benefits.
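
    Time-to-failure and probability-of-failure figures of the kind generated above are commonly read off a fitted Weibull life distribution; the sketch below uses invented shape and scale parameters, not MAF gearbox data:

```python
import math

# Hypothetical Weibull parameters for MRGB time-to-failure (flight hours).
beta = 2.1    # shape parameter: > 1 indicates wear-out behaviour
eta = 9000.0  # scale parameter: characteristic life

def reliability(t: float) -> float:
    """R(t) = exp(-(t/eta)**beta): probability of surviving beyond t."""
    return math.exp(-((t / eta) ** beta))

for hours in (2000, 5000, 8000):
    r = reliability(hours)
    print(f"R({hours} h) = {r:.3f}, P(failure by {hours} h) = {1 - r:.3f}")
```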

  1. A simple, inexpensive video camera setup for the study of avian nest activity

    USGS Publications Warehouse

    Sabine, J.B.; Meyers, J.M.; Schweitzer, Sara H.

    2005-01-01

    Time-lapse video photography has become a valuable tool for collecting data on avian nest activity and depredation; however, commercially available systems are expensive (>USA $4000/unit). We designed an inexpensive system to identify causes of nest failure of American Oystercatchers (Haematopus palliatus) and assessed its utility at Cumberland Island National Seashore, Georgia. We successfully identified raccoon (Procyon lotor), bobcat (Lynx rufus), American Crow (Corvus brachyrhynchos), and ghost crab (Ocypode quadrata) predation on oystercatcher nests. Other detected causes of nest failure included tidal overwash, horse trampling, abandonment, and human destruction. System failure rates were comparable with commercially available units. Our system's efficacy and low cost (<$800) provided useful data for the management and conservation of the American Oystercatcher.

  2. Application of failure mode and effect analysis in a radiology department.

    PubMed

    Thornton, Eavan; Brook, Olga R; Mendiratta-Lala, Mishal; Hallett, Donna T; Kruskal, Jonathan B

    2011-01-01

    With increasing deployment, complexity, and sophistication of equipment and related processes within the clinical imaging environment, system failures are more likely to occur. These failures may have varying effects on the patient, ranging from no harm to devastating harm. Failure mode and effect analysis (FMEA) is a tool that permits the proactive identification of possible failures in complex processes and provides a basis for continuous improvement. This overview of the basic principles and methodology of FMEA provides an explanation of how FMEA can be applied to clinical operations in a radiology department to reduce, predict, or prevent errors. The six sequential steps in the FMEA process are explained, and clinical magnetic resonance imaging services are used as an example for which FMEA is particularly applicable. A modified version of traditional FMEA called Healthcare Failure Mode and Effect Analysis, which was introduced by the U.S. Department of Veterans Affairs National Center for Patient Safety, is briefly reviewed. In conclusion, FMEA is an effective and reliable method to proactively examine complex processes in the radiology department. FMEA can be used to highlight the high-risk subprocesses and allows these to be targeted to minimize the future occurrence of failures, thus improving patient safety and streamlining the efficiency of the radiology department. RSNA, 2010

  3. Remote monitoring of heart failure: benefits for therapeutic decision making.

    PubMed

    Martirosyan, Mihran; Caliskan, Kadir; Theuns, Dominic A M J; Szili-Torok, Tamas

    2017-07-01

    Chronic heart failure is a cardiovascular disorder with high prevalence and incidence worldwide. The course of heart failure is characterized by periods of stability and instability. Decompensation of heart failure is associated with frequent and prolonged hospitalizations and it worsens the prognosis for the disease and increases cardiovascular mortality among affected patients. It is therefore important to monitor these patients carefully to reveal changes in their condition. Remote monitoring has been designed to facilitate an early detection of adverse events and to minimize regular follow-up visits for heart failure patients. Several new devices have been developed and introduced to the daily practice of cardiology departments worldwide. Areas covered: Currently, special tools and techniques are available to perform remote monitoring. Concurrently there are a number of modern cardiac implantable electronic devices that incorporate a remote monitoring function. All the techniques that have a remote monitoring function are discussed in this paper in detail. All the major studies on this subject have been selected for review of the recent data on remote monitoring of HF patients and demonstrate the role of remote monitoring in the therapeutic decision making for heart failure patients. Expert commentary: Remote monitoring represents a novel intensified follow-up strategy of heart failure management. Overall, theoretically, remote monitoring may play a crucial role in the early detection of heart failure progression and may improve the outcome of patients.

  4. [Review of the knowledge on acute kidney failure in the critical patient].

    PubMed

    Romero García, M; Delgado Hito, P; de la Cueva Ariza, L

    2013-01-01

    Acute renal failure affects from 1% to 25% of patients admitted to intensive care units, with figures varying depending on the population studied and the criteria used. The complications of acute renal failure (fluid overload, metabolic acidosis, hyperkalemia, bleeding) are treated. However, mortality remains high, ranging from 30% to 90%, despite the technological advances of recent years, because acute renal failure is usually associated with sepsis, respiratory failure, serious injury, surgical complications or consumption coagulopathy. Although there is no universally accepted definition, the RIFLE classification gives us an operational tool to define the degree of acute renal failure, to standardize the initiation of renal replacement techniques, and to evaluate the results. Nurses working within the intensive care unit must therefore be familiar with this disease, with its treatment (pharmacological or otherwise) and with the prevention of possible complications. Equally, they must be capable of detecting the manifestations of dependency in each of the basic needs and of identifying collaboration problems in order to achieve an individualized care plan. Copyright © 2012 Elsevier España, S.L. y SEEIUC. All rights reserved.
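
    As a rough sketch of how the RIFLE severity classes mentioned above are assigned (paraphrasing the published serum-creatinine criteria; the parallel urine-output criteria, and the Loss and End-stage categories defined by duration, are omitted here for brevity):

```python
def rifle_class(creatinine_ratio: float, acute_cr_mg_dl: float = 0.0) -> str:
    """Classify acute renal failure severity from the rise in serum
    creatinine relative to baseline (creatinine criteria only)."""
    if creatinine_ratio >= 3.0 or acute_cr_mg_dl >= 4.0:
        return "Failure"
    if creatinine_ratio >= 2.0:
        return "Injury"
    if creatinine_ratio >= 1.5:
        return "Risk"
    return "below RIFLE threshold"

print(rifle_class(1.6))  # Risk
print(rifle_class(3.2))  # Failure
```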

  5. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces a new trade analysis software tool called the Space Mission Architecture and Risk Analysis Tool (SMART). The tool supports high-level system trade studies on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increasing the probability of success is to add redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
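
    The point about correlated failures of parallel subsystems can be made concrete with a small calculation (illustrative numbers only; this is not the SMART tool itself):

```python
# Two redundant subsystems, each failing with marginal probability p.
# A shared (common-cause) component makes their failures correlated, so
# redundancy buys less than the independent model suggests.
p = 0.10         # marginal failure probability of each subsystem
p_common = 0.03  # probability that a shared root cause fails both

p_both_independent = p * p

# Split each subsystem's failure into the shared cause plus an independent
# part, chosen so that the marginal probability stays equal to p.
p_ind = (p - p_common) / (1 - p_common)
p_both_correlated = p_common + (1 - p_common) * p_ind ** 2

print(f"P(both fail), independent model : {p_both_independent:.4f}")
print(f"P(both fail), common-cause model: {p_both_correlated:.4f}")
```

    With these numbers the correlated estimate is more than three times the independent one, which is why a quantitative treatment of correlation matters when crediting backups.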

  6. A fundamental study on the structural integrity of magnesium alloys joined by friction stir welding

    NASA Astrophysics Data System (ADS)

    Rao, Harish Mangebettu

    The goal of this research is to study the factors that influence the physical and mechanical properties of lap-shear joints produced using friction stir welding. The study focuses on understanding the effect of tool geometry and weld process parameters, including tool rotation rate, tool plunge depth, and dwell time, on the mechanical performance of similar magnesium-to-magnesium and dissimilar magnesium-to-aluminum weld joints. A variety of experimental activities were conducted, including tensile and fatigue testing, fracture surface and failure analysis, microstructure characterization, hardness measurements, and chemical composition analysis. In friction stir spot welding of magnesium to magnesium, welds produced with a large effective sheet thickness and a small interfacial hook height exhibited superior weld strength. Furthermore, in fatigue testing of these joints, lap-shear welds produced using a triangular tool pin profile exhibited better fatigue life than lap-shear welds produced using a cylindrical tool pin profile. In friction stir spot welding of dissimilar magnesium to aluminum, the formation of intermetallic compounds (IMCs) in the stir zone had a dominant effect on weld strength: lap-shear dissimilar welds with good material mixing and discontinuous intermetallic compounds in the stir zone exhibited superior weld strength compared to welds with a continuous intermetallic layer in the stir zone. The weld structural geometry, such as the interfacial hook, hook orientation, and bond width, also played a major role in the strength of dissimilar lap-shear friction stir spot welds. A wide scatter in fatigue test results was observed in friction stir linear welds of aluminum to magnesium alloys, with failure modes under fatigue loading including crack propagation into the top sheet, into the bottom sheet, and interfacial separation. Investigation of the tested welds revealed that voids in the weld nugget reduced weld strength, resulting in lower fatigue life, and that a thin layer of IMCs formed along the faying surface accelerated fatigue failure.

  7. MO-E-9A-01: Risk Based Quality Management: TG100 In Action

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M; Palta, J; Dunscombe, P

    2014-06-15

    One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish this goal, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations provided in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are more often caused by flaws in the overall therapy process, from initial consult through final treatment, than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and the establishment of a quality management program that best avoids the faults and risks that have been identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness in providing efficient ways to enhance the safety and quality of treatment processes. Task Group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform an FMEA analysis for a given process. Learn what fault tree analysis is about. Learn how to design a quality management program based upon the information obtained from process mapping, FMEA and FTA.
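
    For readers unfamiliar with the FMEA arithmetic behind such a program, the risk priority number of a failure mode is the product of its occurrence, severity, and detectability scores, and modes are ranked by RPN. A toy Python sketch (the failure modes and 1-to-10 scores are invented for illustration):

        # Failure modes as (name, occurrence, severity, detectability), 1-10 scales.
        modes = [
            ("wrong patient selected",          2, 10, 4),
            ("incorrect MLC calibration",       4,  7, 6),
            ("plan exported to wrong machine",  3,  8, 3),
        ]

        ranked = sorted(modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
        for name, o, s, d in ranked:
            print(f"RPN={o * s * d:4d}  {name}")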

  8. Demographic processes limiting seedling recruitment in arid grassland restoration

    Treesearch

    Jeremy J. James; Tony J. Svejcar; Matthew J. Rinella

    2011-01-01

    Seeding is an important management tool in aridland restoration, but seeded species often fail to establish. Previous research has largely focused on the technical aspects of seeding with little effort directed at identifying demographic processes driving recruitment failures.

  9. Nondestructive SEM for surface and subsurface wafer imaging

    NASA Technical Reports Server (NTRS)

    Propst, Roy H.; Bagnell, C. Robert; Cole, Edward I., Jr.; Davies, Brian G.; Dibianca, Frank A.; Johnson, Darryl G.; Oxford, William V.; Smith, Craig A.

    1987-01-01

    The scanning electron microscope (SEM) is considered as a tool for both failure analysis as well as device characterization. A survey is made of various operational SEM modes and their applicability to image processing methods on semiconductor devices.

  10. Advanced Self-Calibrating, Self-Repairing Data Acquisition System

    NASA Technical Reports Server (NTRS)

    Medelius, Pedro J. (Inventor); Eckhoff, Anthony J. (Inventor); Angel, Lucena R. (Inventor); Perotti, Jose M. (Inventor)

    2002-01-01

    An improved self-calibrating and self-repairing Data Acquisition System (DAS) for use in inaccessible areas, such as onboard spacecraft, capable of autonomously performing required system health checks and failure detection. When required, self-repair is implemented utilizing a "spare parts/tool box" system. The available number of spare components depends primarily upon each component's predicted reliability, which may be determined using Mean Time Between Failures (MTBF) analysis. Failing or degrading components are electronically removed and disabled to reduce power consumption before being electronically replaced with spare components.

  11. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  12. Fault tree analysis of most common rolling bearing tribological failures

    NASA Astrophysics Data System (ADS)

    Vencl, Aleksandar; Gašić, Vlada; Stojanović, Blaža

    2017-02-01

    Wear as a tribological process has a major influence on the reliability and life of rolling bearings. Field examinations of bearing failures due to wear indicate possible causes and point to the measures necessary for wear reduction or elimination. Wear itself is a very complex process initiated by the action of different mechanisms, and it can be manifested as different wear types which are often related; however, the dominant type of wear can usually be approximately determined. The paper presents a classification of the most common bearing damages according to the dominant wear type, i.e. abrasive wear, adhesive wear, surface fatigue wear, erosive wear, fretting wear and corrosive wear. The wear types are correlated with the terms used in the ISO 15243 standard. Each wear type is illustrated with an appropriate photograph, and for each wear type an appropriate description of causes and manifestations is presented. The possible causes of rolling bearing failure are used for fault tree analysis (FTA), which was performed to determine the root causes of bearing failures. The constructed fault tree diagram for rolling bearing failure can be a useful tool for maintenance engineers.
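
    For independent basic events, a fault tree of this kind reduces to products and complements of probabilities at its AND and OR gates. The sketch below shows the computation for a hypothetical top event of bearing failure; the gate structure and numbers are invented for illustration and are not taken from the paper:

        from functools import reduce

        def gate_or(ps):   # P(at least one event occurs), independent events
            return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), ps, 1.0)

        def gate_and(ps):  # P(all events occur), independent events
            return reduce(lambda acc, p: acc * p, ps, 1.0)

        # Hypothetical annual probabilities of wear-related basic events.
        abrasive  = gate_and([0.10, 0.30])  # contamination AND inadequate filtration
        adhesive  = gate_and([0.05, 0.20])  # lubricant film breakdown AND overload
        corrosive = 0.02                    # moisture ingress

        print(f"P(bearing failure) = {gate_or([abrasive, adhesive, corrosive]):.4f}")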

  13. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Technical Reports Server (NTRS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions, a capability that provides a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, estimates the probability of failure of components under varying loading and environmental conditions; it performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or similar codes, together with a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS and the interface, and illustrates the stepwise process the interface uses with an example.
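
    The core computation that such codes wrap can be illustrated by a stress-strength Monte Carlo estimate: sample the uncertain load and resistance, count the fraction of samples in which load exceeds resistance, and report the empirical cumulative distribution function of the margin. A minimal sketch with hypothetical distributions, not NESTEM's actual models:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        strength = rng.normal(500.0, 40.0, n)  # MPa, hypothetical component
        stress   = rng.normal(380.0, 55.0, n)  # MPa, hypothetical loading

        margin = strength - stress
        print(f"Estimated probability of failure: {np.mean(margin < 0.0):.4f}")

        # Empirical CDF of the safety margin, the kind of curve such codes report.
        x = np.sort(margin)
        cdf = np.arange(1, n + 1) / n
        print(f"P(margin < 50 MPa) = {cdf[np.searchsorted(x, 50.0)]:.3f}")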

  14. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Astrophysics Data System (ADS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions, a capability that provides a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, estimates the probability of failure of components under varying loading and environmental conditions; it performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or similar codes, together with a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS and the interface, and illustrates the stepwise process the interface uses with an example.

  15. Fuzzy Risk Evaluation in Failure Mode and Effects Analysis Using a D Numbers Based Multi-Sensor Information Fusion Method.

    PubMed

    Deng, Xinyang; Jiang, Wen

    2017-09-12

    Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from the perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing methods to show the effectiveness of the proposed model.

  16. Fuzzy Risk Evaluation in Failure Mode and Effects Analysis Using a D Numbers Based Multi-Sensor Information Fusion Method

    PubMed Central

    Deng, Xinyang

    2017-01-01

    Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from the perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing methods to show the effectiveness of the proposed model. PMID:28895905

  17. Fundamental Research on Percussion Drilling: Improved rock mechanics analysis, advanced simulation technology, and full-scale laboratory investigations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael S. Bruno

    This report summarizes the research efforts on the DOE-supported research project Percussion Drilling (DE-FC26-03NT41999), which aims to significantly advance the fundamental understanding of the physical mechanisms involved in combined percussion and rotary drilling, and thereby facilitate more efficient and lower-cost drilling and exploration of hard-rock reservoirs. The project has been divided into multiple tasks: literature reviews, analytical and numerical modeling, full-scale laboratory testing and model validation, and final report delivery. The literature reviews document the history, pros and cons, and rock failure physics of percussion drilling in the oil and gas industries. Based on the current understanding, a conceptual drilling model is proposed for the modeling efforts. Both analytical and numerical approaches are deployed to investigate drilling processes such as drillbit penetration with compression, rotation and percussion; rock response with stress propagation, damage accumulation and failure; and debris transportation inside the annulus after being disintegrated from the rock. For rock mechanics modeling, a dynamic numerical tool has been developed to describe rock damage and failure, including rock crushing by compressive bit load, rock fracturing by both shearing and tensile forces, and rock weakening by repetitive compression-tension loading. Besides multiple failure criteria, the tool also includes a damping algorithm to dissipate oscillation energy and a fatigue/damage algorithm to update rock properties during each impact. From the model, rate of penetration (ROP) and rock failure history can be estimated. For cuttings transport in the annulus, a 3D numerical particle flow model has been developed with the aid of analytical approaches. The tool can simulate cuttings movement at the particle scale under laminar or turbulent fluid flow conditions and evaluate the efficiency of cuttings removal. To calibrate the modeling efforts, a series of full-scale fluid hammer drilling tests, as well as single impact tests, have been designed and executed. Both Berea sandstone and Mancos shale samples are used. In the single impact tests, three impacts are sequentially loaded at the same rock location to investigate the rock's response to repetitive loading. The crater depth and width are measured, as well as the displacement and force in the rod and the force in the rock. Various pressure differences across the rock-indentor interface (i.e., bore pressure minus pore pressure) are used to investigate the pressure effect on rock penetration. For the hammer drilling tests, an industrial fluid hammer is used to drill under both underbalanced and overbalanced conditions. Besides calibrating the modeling tool, the data and cuttings collected from the tests suggest several other important applications. For example, different rock penetrations during single impact tests may reveal why a fluid hammer behaves differently with diverse rock types and under various pressure conditions at the hole bottom. On the other hand, the shape of the cuttings from the fluid hammer tests, compared to those from traditional rotary drilling methods, may help to identify the dominant failure mechanism that percussion drilling relies on; if so, encouraging such a failure mechanism may improve hammer performance. The project is summarized in this report. Instead of compiling the information contained in the previous quarterly and other technical reports, this report focuses on descriptions of tasks, findings, and conclusions, as well as efforts to promote percussion drilling technologies to industry, including site visits, presentations, and publications. As part of the final deliverables, the 3D numerical model for rock mechanics is also attached.
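
    One building block mentioned above, the fatigue/damage algorithm that updates rock properties on each impact, can be caricatured in a few lines: the rock's strength degrades by a fixed fraction per sub-threshold blow until the impact stress exceeds it. All numbers and the degradation law are hypothetical, not the report's actual constitutive model:

        strength = 120.0        # MPa, intact rock (assumed)
        impact_stress = 95.0    # MPa delivered per percussion blow (assumed)
        damage_per_hit = 0.04   # fraction of strength lost per sub-threshold blow

        blows = 0
        while impact_stress < strength:
            strength *= 1.0 - damage_per_hit  # fatigue weakening of the rock
            blows += 1
        print(f"rock crushes on blow {blows + 1} (strength down to {strength:.1f} MPa)")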

  18. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although single failure mode issues can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis; in fact, due to lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS; compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates this source of error in the reliability analysis. Furthermore, for quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN and results in a more accurate estimation than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.
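
    To make the FWGM RPN construction concrete, the sketch below takes triangular fuzzy O, S, D ratings, forms their component-wise weighted geometric mean, and defuzzifies by graded-mean integration. The ratings, equal weights, and defuzzification rule are illustrative assumptions, not the paper's exact formulation:

        import numpy as np

        def fwgm(ratings, weights):
            """Fuzzy weighted geometric mean of triangular fuzzy numbers (l, m, u);
            the component-wise product of powers is a standard approximation for
            triangular fuzzy arithmetic."""
            f = np.array(ratings, dtype=float)   # shape: (n_factors, 3)
            w = np.array(weights, dtype=float)
            return np.prod(f ** w[:, None], axis=0)

        def defuzzify(tfn):
            """Graded-mean integration: (l + 4m + u) / 6."""
            l, m, u = tfn
            return (l + 4.0 * m + u) / 6.0

        # Hypothetical fuzzy O, S, D ratings for one blade failure mode.
        O, S, D = (3, 4, 5), (6, 7, 8), (4, 5, 6)
        frpn = fwgm([O, S, D], weights=[1 / 3, 1 / 3, 1 / 3])
        print(f"fuzzy RPN = {frpn}, crisp value = {defuzzify(frpn):.2f}")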

  19. The Use of Probabilistic Methods to Evaluate the Systems Impact of Component Design Improvements on Large Turbofan Engines

    NASA Technical Reports Server (NTRS)

    Packard, Michael H.

    2002-01-01

    Probabilistic Structural Analysis (PSA) is now commonly used for predicting the distribution of time/cycles to failure of turbine blades and other engine components. These distributions are typically based on the fatigue/fracture and creep failure modes of these components. Additionally, reliability analysis is used to take test data related to particular failure modes and calculate failure rate distributions of electronic and electromechanical components. How can these individual failure time distributions of structural, electronic and electromechanical component failure modes be effectively combined into a top-level model for overall system evaluation of component upgrades, changes in maintenance intervals, or line replaceable unit (LRU) redesign? This paper shows an example of how various probabilistic failure predictions for turbine engine components can be evaluated and combined to show their effect on overall engine performance. A generic model of a turbofan engine was modeled using various Probabilistic Risk Assessment (PRA) tools (Quantitative Risk Assessment Software (QRAS), etc.). Hypothetical PSA results for a number of structural components, along with mitigation factors that would restrict a failure mode from propagating to a Loss of Mission (LOM) failure, were used in the models. The output of this program includes an overall failure distribution for LOM of the system. The rank and contribution to overall Mission Success (MS) are also given for each failure mode and each subsystem. This application methodology demonstrates the effectiveness of PRA for assessing the performance of large turbine engines. Additionally, the effects of system changes and upgrades, the application of different maintenance intervals, the inclusion of new sensor detection of faults, and other upgrades were evaluated in determining overall turbine engine reliability.
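
    The combination step described here is straightforward to emulate by Monte Carlo: sample a life for each component failure mode, apply a mitigation probability that the failure propagates to loss of mission, and tally mission outcomes. A toy sketch with hypothetical lifetimes and mitigation factors, not the paper's data:

        import numpy as np

        rng = np.random.default_rng(1)
        n, mission_hours = 200_000, 5_000.0

        # Hypothetical component lives: Weibull for a fatigue-driven blade mode,
        # exponential for an electronic controller.
        blade_life = 8_000.0 * rng.weibull(2.5, n)
        ctrl_life  = rng.exponential(60_000.0, n)

        # Mitigation: only a fraction of each failure propagates to loss of mission.
        blade_lom = (blade_life < mission_hours) & (rng.random(n) < 0.30)
        ctrl_lom  = (ctrl_life  < mission_hours) & (rng.random(n) < 0.10)

        print(f"P(loss of mission) ~ {np.mean(blade_lom | ctrl_lom):.4f}")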

  20. Proposal on How To Conduct a Biopharmaceutical Process Failure Mode and Effect Analysis (FMEA) as a Risk Assessment Tool.

    PubMed

    Zimmermann, Hartmut F; Hentschel, Norbert

    2011-01-01

    With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.

  1. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study

    PubMed Central

    Brawanski, Alexander

    2017-01-01

    Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data. PMID:28255331
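
    For comparison, the PRx benchmark used in this validation is conventionally computed as a moving Pearson correlation between slow waves of arterial blood pressure and intracranial pressure, commonly over about 30 samples of 10-second averages (roughly 5-minute windows); the exact windowing varies between centers, so the parameters below are assumptions. A minimal sketch:

        import numpy as np

        def prx(abp: np.ndarray, icp: np.ndarray, window: int = 30) -> np.ndarray:
            """Moving Pearson correlation between arterial blood pressure (ABP)
            and intracranial pressure (ICP) series of 10-second averages.
            Persistently positive values suggest impaired autoregulation."""
            out = np.full(len(abp), np.nan)
            for i in range(window, len(abp) + 1):
                a, b = abp[i - window:i], icp[i - window:i]
                out[i - 1] = np.corrcoef(a, b)[0, 1]
            return out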

  2. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study.

    PubMed

    Proescholdt, Martin A; Faltermeier, Rupert; Bele, Sylvia; Brawanski, Alexander

    2017-01-01

    Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.

  3. A retrospective study on the incidences of adverse drug events and analysis of the contributing trigger factors

    PubMed Central

    Sam, Aaseer Thamby; Lian Jessica, Looi Li; Parasuraman, Subramani

    2015-01-01

    Objectives: To retrospectively determine the extent and types of adverse drug events (ADEs) from patient case sheets and identify the contributing factors of medication errors, and to assess causality and severity using the World Health Organization (WHO) probability scale and Hartwig's scale, respectively. Methods: One hundred patient case sheets were randomly selected, and a modified version of the Institute for Healthcare Improvement (IHI) Global Trigger Tool was utilized to identify ADEs; causality and severity were assessed using the WHO probability scale and Hartwig's severity assessment scale, respectively. Results: In total, 153 adverse events (AEs) were identified using the IHI Global Trigger Tool. The majority of the AEs were due to medication errors (46.41%), followed by 60 adverse drug reactions (ADRs), 15 therapeutic failure incidents, and 7 overdose cases. Of the 153 AEs, 60 were due to ADRs such as rashes, nausea, and vomiting. Therapeutic failure contributed 9.80%, and overdose 4.58%, of the total 153 AEs. Using the trigger tool, we were able to detect 45 positive triggers in 36 patient records; among these, 19 AEs were identified in 15 patient records. The rate of AEs per 100 patients was 17%. The average ADE rate per 1,000 doses was 2.03% (calculated). Conclusion: The IHI Global Trigger Tool is an effective method to help provisionally registered pharmacists identify ADEs more quickly. PMID:25767366

  4. 48 CFR 52.232-32 - Performance-Based Payments.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... is endangered by the Contractor's (i) failure to make progress, or (ii) unsatisfactory financial... title; (iii) Nondurable (i.e., noncapital) tools, jigs, dies, fixtures, molds, patterns, taps, gauges... Government access. The Contractor shall promptly furnish reports, certificates, financial statements, and...

  5. QSAR Modeling: Where Have You Been? Where Are You Going To?.

    EPA Science Inventory

    Quantitative structure–activity relationship modeling is one of the major computational tools employed in medicinal chemistry. However, throughout its entire history it has drawn both praise and criticism concerning its reliability, limitations, successes, and failures. In this...

  6. Reliability culture at La Silla Paranal Observatory

    NASA Astrophysics Data System (ADS)

    Gonzalez, Sergio

    2010-07-01

    The Maintenance Department at the La Silla-Paranal Observatory has been an important foundation for keeping observatory operations at a good level of reliability and availability. Several strategies have been implemented and improved in order to meet these requirements and keep systems and equipment working properly when required. One of the latest improvements has been the introduction of the concept of reliability, which involves much more than simply speaking about reliability concepts: it involves the use of technologies, data collection, data analysis, decision making, committees concentrating on analysis of failure modes and how they can be eliminated, aligning the results with the requirements of our internal partners, and establishing steps to achieve success. Some of these steps have already been implemented: data collection, use of technologies, analysis of data, development of priority tools, committees dedicated to analyzing data, and people dedicated to reliability analysis. This has permitted us to optimize our processes, analyze where we can improve, avoid functional failures, and reduce failure rates in several systems and subsystems; all this has had a positive impact on results for our Observatory. All these tools are part of the reliability culture that allows our system to operate with a high level of reliability and availability.

  7. Tools for outcome prediction in patients with community acquired pneumonia.

    PubMed

    Khan, Faheem; Owens, Mark B; Restrepo, Marcos; Povoa, Pedro; Martin-Loeches, Ignacio

    2017-02-01

    Community-acquired pneumonia (CAP) is one of the most common causes of mortality world-wide. The mortality rate of patients with CAP is influenced by the severity of the disease, treatment failure and the requirement for hospitalization and/or intensive care unit (ICU) management, all of which may be predicted by biomarkers and clinical scoring systems. Areas covered: We review the recent literature examining the efficacy of established and newly-developed clinical scores, biological and inflammatory markers such as C-Reactive protein (CRP), procalcitonin (PCT) and Interleukin-6 (IL-6), whether used alone or in conjunction with clinical severity scores to assess the severity of CAP, predict treatment failure, guide acute in-hospital or ICU admission and predict mortality. Expert commentary: The early prediction of treatment failure using clinical scores and biomarkers plays a developing role in improving survival of patients with CAP by identifying high-risk patients requiring hospitalization or ICU admission; and may enable more efficient allocation of resources. However, it is likely that combinations of scoring systems and biomarkers will be of greater use than individual markers. Further larger studies are needed to corroborate the additive value of these markers to clinical prediction scores to provide a safer and more effective assessment tool for clinicians.
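
    As one concrete example of the clinical severity scores discussed (a widely used CAP score, though not necessarily one evaluated in this review), CURB-65 assigns one point each for confusion, urea above 7 mmol/L, respiratory rate of 30 or more, low blood pressure, and age 65 or over; 0-1 suggests outpatient care and 3 or more suggests severe CAP. A minimal sketch:

        def curb65(confusion: bool, urea_mmol_l: float, resp_rate: int,
                   sbp: int, dbp: int, age: int) -> int:
            """CURB-65 severity score for community-acquired pneumonia."""
            return (int(confusion)
                    + int(urea_mmol_l > 7.0)
                    + int(resp_rate >= 30)
                    + int(sbp < 90 or dbp <= 60)
                    + int(age >= 65))

        print(curb65(False, 8.2, 32, 85, 55, 71))  # -> 4, severe CAP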

  8. Making intelligent systems team players. A guide to developing intelligent monitoring systems

    NASA Technical Reports Server (NTRS)

    Land, Sherry A.; Malin, Jane T.; Thronesberry, Carroll; Schreckenghost, Debra L.

    1995-01-01

    This reference guide for developers of intelligent monitoring systems is based on lessons learned by developers of the DEcision Support SYstem (DESSY), an expert system that monitors Space Shuttle telemetry data in real time. DESSY makes inferences about commands, state transitions, and simple failures. It performs failure detection rather than in-depth failure diagnostics. A listing of rules from DESSY and cue cards from DESSY subsystems are included to give the development community a better understanding of the selected model system. The G-2 programming tool used in developing DESSY provides an object-oriented, rule-based environment, but many of the principles in use here can be applied to any type of monitoring intelligent system. The step-by-step instructions and examples given for each stage of development are in G-2, but can be used with other development tools. This guide first defines the authors' concept of real-time monitoring systems, then tells prospective developers how to determine system requirements, how to build the system through a combined design/development process, and how to solve problems involved in working with real-time data. It explains the relationships among operational prototyping, software evolution, and the user interface. It also explains methods of testing, verification, and validation. It includes suggestions for preparing reference documentation and training users.

  9. Altered sarco(endo)plasmic reticulum calcium adenosine triphosphatase 2a content: Targets for heart failure therapy.

    PubMed

    Liu, Gang; Li, Si Qi; Hu, Ping Ping; Tong, Xiao Yong

    2018-05-01

    Sarco(endo)plasmic reticulum calcium adenosine triphosphatase is responsible for transporting cytosolic calcium into the sarcoplasmic reticulum and endoplasmic reticulum to maintain calcium homeostasis. Sarco(endo)plasmic reticulum calcium adenosine triphosphatase 2a is the dominant isoform expressed in cardiac tissue, and it is regulated by endogenous protein inhibitors, post-translational modifications, hormones, and microRNAs. Dysfunction of sarco(endo)plasmic reticulum calcium adenosine triphosphatase is associated with heart failure, which makes sarco(endo)plasmic reticulum calcium adenosine triphosphatase a promising target for heart failure therapy. This review summarizes current approaches to ameliorate sarco(endo)plasmic reticulum calcium adenosine triphosphatase function and focuses on phospholamban, an endogenous inhibitor of sarco(endo)plasmic reticulum calcium adenosine triphosphatase, pharmacological tools, and gene therapies.

  10. Integrating FMEA in a Model-Driven Methodology

    NASA Astrophysics Data System (ADS)

    Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno

    2016-08-01

    Failure Mode and Effects Analysis (FMEA) is a well-known technique for evaluating the effects of potential failures of components of a system. FMEA demands engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system by integrating it into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation and effects from SysML design models by means of custom diagrams, which we name FMEA Diagrams; they offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate the FMEA worksheet and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.

  11. Fault Tree Analysis Application for Safety and Reliability

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in these tools identifies root causes in system components, but when software is identified as a root cause, it does not build trees into the software component. No commercial software tools have been built specifically for the development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.

  12. SU-F-T-245: The Investigation of Failure Mode and Effects Analysis and PDCA for the Radiotherapy Risk Reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, J; Wang, J; P, J

    2016-06-15

    Purpose: To optimize the clinical processes of radiotherapy and to reduce radiotherapy risks by implementing the powerful risk management tools of failure mode and effects analysis (FMEA) and PDCA (plan-do-check-act). Methods: A multidisciplinary QA (Quality Assurance) team from our department, consisting of oncologists, physicists, dosimetrists, therapists and an administrator, was established, and entire-workflow QA process management using FMEA and PDCA tools was implemented for the whole treatment process. After the primary process tree was created, the failure modes and risk priority numbers (RPNs) were determined by each member, and the RPNs were then averaged after team discussion. Results: 3 of 9 failure modes with RPNs above 100 were identified in practice in the first PDCA cycle and were further analyzed: patient registration errors, prescription errors and treating the wrong patient. New process controls reduced the occurrence or detectability scores of the top 3 failure modes. Two important corrective actions reduced the highest RPNs from 300 to 50, and the error rate of radiotherapy decreased remarkably. Conclusion: FMEA and PDCA are helpful in identifying potential problems in the radiotherapy process, and they were shown to improve the safety, quality and efficiency of radiation therapy in our department. The implementation of the FMEA approach may improve the understanding of the overall process of radiotherapy while identifying potential flaws in the whole process. Furthermore, repeating the PDCA cycle can bring us closer to the goal: safer and more accurate radiotherapy.

  13. Advantage of the modified Lunn-McNeil technique over Kalbfleisch-Prentice technique in competing risks

    NASA Astrophysics Data System (ADS)

    Lukman, Iing; Ibrahim, Noor A.; Daud, Isa B.; Maarof, Fauziah; Hassan, Mohd N.

    2002-03-01

    Survival analysis algorithms are often applied in the data mining process. Cox regression is one of the survival analysis tools that has been used in many areas, and it can be used to analyze the failure times of aircraft crashes. Another survival analysis tool is competing risks, where more than one cause of failure acts simultaneously. Lunn and McNeil analyzed competing risks in the survival model using Cox regression with censored data; the modified Lunn-McNeil technique is a simplification of the Lunn-McNeil technique. The Kalbfleisch-Prentice technique involves fitting models separately for each type of failure, treating other failure types as censored. To compare the two techniques (the modified Lunn-McNeil and the Kalbfleisch-Prentice), a simulation study was performed. Samples with various sizes and censoring percentages were generated and fitted using both techniques. The study compared the inference of the models using root mean square error (RMSE), power tests, and Schoenfeld residual analysis. The power tests in this study were the likelihood ratio test, the Rao score test, and the Wald statistic. The Schoenfeld residual analysis was conducted to check the proportionality of the model through its covariates. The estimated parameters were computed for the cause-specific hazard situation. Results showed that the modified Lunn-McNeil technique was better than the Kalbfleisch-Prentice technique based on the RMSE measurement and the Schoenfeld residual analysis; however, the Kalbfleisch-Prentice technique was better based on the power test measurements.
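
    The Lunn-McNeil device itself is a data augmentation: each subject contributes one row per failure type, the event indicator fires only on the row matching the observed cause, and a single Cox fit then yields cause-specific effects. A toy sketch using pandas and lifelines on synthetic data (the stratified variant of the method is omitted):

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(4)
        n = 200
        x = rng.normal(size=n)
        # Two competing latent failure times whose hazards depend differently on x.
        t1 = rng.exponential(np.exp(-0.8 * x))  # cause 1: hazard rises with x
        t2 = rng.exponential(np.exp(0.4 * x))   # cause 2: hazard falls with x
        censor = rng.exponential(2.0, size=n)
        time = np.minimum.reduce([t1, t2, censor])
        cause = np.select([t1 == time, t2 == time], [1, 2], default=0)

        # Augmentation: one row per subject per failure type; the event indicator
        # fires only on the row that matches the observed cause.
        aug = pd.DataFrame({
            "time": np.repeat(time, 2),
            "risk2": np.tile([0, 1], n),  # dummy for failure type 2
            "x": np.repeat(x, 2),
        })
        aug["event"] = ((aug["risk2"] + 1) == np.repeat(cause, 2)).astype(int)
        aug["x_risk2"] = aug["x"] * aug["risk2"]  # cause-specific effect of x

        cph = CoxPHFitter().fit(aug, duration_col="time", event_col="event")
        cph.print_summary()  # x ~ cause-1 effect; x + x_risk2 ~ cause-2 effect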

  14. Shape memory alloy wire for self-sensing servo actuation

    NASA Astrophysics Data System (ADS)

    Josephine Selvarani Ruth, D.; Dhanalakshmi, K.

    2017-01-01

    This paper reports on the development of a straightforward approach to realise self-sensing shape memory alloy (SMA) wire actuated control. A differential electrical resistance measurement circuit (the sensorless signal conditioning (SSC) circuit) is designed, and this sensing signal is used directly as the feedback for control. Antagonistic SMA wire actuators designed for servo actuation are realized in self-sensing actuation (SSA) mode for direct control with differential electrical resistance feedback. The self-sensing scheme is established on a 1-DOF manipulator with discrete-time sliding mode control, and it demonstrates good control performance whatever the disturbance and loading conditions. The uniqueness of this work is the design of a generic electronic SSC circuit for an SMA actuated system, for measurement and control. With regard to the implementation of the self-sensing technique in SMA, this scheme retains a systematic control architecture by using the sensed signal (the electrical resistance corresponding to the system position) directly for feedback, without requiring the additional processing of the methods previously reported for SSA techniques of SMA.
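
    The essence of the scheme, inferring position from wire resistance and closing the loop on that signal, can be caricatured with a linear resistance-strain map and a simple proportional loop. The real resistance-strain relation is hysteretic and the controller in the paper is sliding-mode, so every number and model choice below is an illustrative assumption:

        R_MAX, R_MIN = 10.0, 8.0  # ohms at zero and full contraction (assumed)

        def resistance_to_position(r: float) -> float:
            """Infer normalized actuator position (0..1) from measured resistance,
            assuming resistance falls linearly as the SMA wire contracts."""
            return (R_MAX - r) / (R_MAX - R_MIN)

        pos, setpoint = 0.0, 0.6
        for _ in range(40):
            r_meas = R_MAX - pos * (R_MAX - R_MIN)  # the 'sensorless' measurement
            error = setpoint - resistance_to_position(r_meas)
            pos += 0.3 * error                      # proportional heating correction
        print(f"final position ~ {pos:.3f}")        # settles near the 0.6 set point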

  15. Improved artificial bee colony algorithm for wavefront sensor-less system in free space optical communication

    NASA Astrophysics Data System (ADS)

    Niu, Chaojun; Han, Xiang'e.

    2015-10-01

    Adaptive optics (AO) technology is an effective way to alleviate the effect of turbulence on free space optical communication (FSO). A new adaptive compensation method can be used without a wavefront sensor. The artificial bee colony (ABC) algorithm is a population-based heuristic evolutionary algorithm inspired by the intelligent foraging behaviour of the honeybee swarm, with the advantages of simplicity, a good convergence rate, robustness, and few parameters to set. In this paper, we simulate the application of an improved ABC algorithm to correct the distorted wavefront and demonstrate its effectiveness. We then simulate the application of the ABC algorithm, the differential evolution (DE) algorithm and the stochastic parallel gradient descent (SPGD) algorithm to the FSO system and analyze their wavefront correction capabilities by comparing the coupling efficiency, the error rate and the intensity fluctuation under different turbulence conditions before and after correction. The results show that the ABC algorithm corrects much faster than the DE algorithm and handles strong turbulence better than the SPGD algorithm. Intensity fluctuation can be effectively reduced in strong turbulence, but less so in weak turbulence.
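
    For readers unfamiliar with ABC, the sketch below runs the textbook employed/onlooker/scout cycle against a stand-in coupling-efficiency metric (a Gaussian peak over hidden aberration coefficients). It illustrates the plain algorithm only, not the paper's improved variant or a real optical simulation:

        import numpy as np

        rng = np.random.default_rng(2)

        def coupling_metric(c):
            """Stand-in for measured coupling efficiency: peaks when the corrector
            coefficients `c` cancel the hidden aberration."""
            aberration = np.array([0.8, -0.5, 0.3, -0.2])
            return np.exp(-np.sum((c + aberration) ** 2))

        dim, n_food, limit, iters = 4, 10, 15, 200
        foods = rng.uniform(-1, 1, (n_food, dim))  # candidate corrector settings
        fit = np.array([coupling_metric(f) for f in foods])
        trials = np.zeros(n_food, dtype=int)

        for _ in range(iters):
            for phase in ("employed", "onlooker"):
                if phase == "employed":
                    idx = np.arange(n_food)
                else:  # onlookers revisit food sources in proportion to fitness
                    idx = rng.choice(n_food, n_food, p=fit / fit.sum())
                for i in idx:
                    k = rng.integers(n_food - 1)
                    k += k >= i                  # random partner, k != i
                    j = rng.integers(dim)
                    cand = foods[i].copy()
                    cand[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
                    f = coupling_metric(cand)
                    if f > fit[i]:
                        foods[i], fit[i], trials[i] = cand, f, 0
                    else:
                        trials[i] += 1
            for i in np.where(trials > limit)[0]:  # scouts abandon stale sources
                foods[i] = rng.uniform(-1, 1, dim)
                fit[i], trials[i] = coupling_metric(foods[i]), 0

        print("best metric:", fit.max().round(4), "at", foods[np.argmax(fit)].round(2))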

  16. Multiphoton imaging microscopy at deeper layers with adaptive optics control of spherical aberration.

    PubMed

    Bueno, Juan M; Skorsetz, Martin; Palacios, Raquel; Gualda, Emilio J; Artal, Pablo

    2014-01-01

    Despite the inherent confocality and optical sectioning capabilities of multiphoton microscopy, three-dimensional (3-D) imaging of thick samples is limited by the specimen-induced aberrations. The combination of immersion objectives and sensorless adaptive optics (AO) techniques has been suggested to overcome this difficulty. However, a complex plane-by-plane correction of aberrations is required, and its performance depends on a set of image-based merit functions. We propose here an alternative approach to increase penetration depth in 3-D multiphoton microscopy imaging. It is based on the manipulation of the spherical aberration (SA) of the incident beam with an AO device while performing fast tomographic multiphoton imaging. When inducing SA, the image quality at best focus is reduced; however, better quality images are obtained from deeper planes within the sample. This is a compromise that enables registration of improved 3-D multiphoton images using nonimmersion objectives. Examples on ocular tissues and nonbiological samples providing different types of nonlinear signal are presented. The implementation of this technique in a future clinical instrument might provide a better visualization of corneal structures in living eyes.

  17. Poster - 30: Use of a Hazard-Risk Analysis for development of a new eye immobilization tool for treatment of choroidal melanoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prooijen, Monique van; Breen, Stephen

    Purpose: Our treatment for choroidal melanoma utilizes the GTC frame. The patient looks at a small LED to stabilize target position. The LED is attached to a metal arm attached to the GTC frame, and a camera on the arm allows therapists to monitor patient compliance. To move to mask-based immobilization we need a new LED/camera attachment mechanism, and we used a Hazard-Risk Analysis (HRA) to guide the design of the new tool. Method: A pre-clinical model was built with input from therapy and machine shop personnel. It consisted of an aluminum frame placed in aluminum guide posts attached to the couch top. Further development was guided by the Department of Defense Standard Practice - System Safety hazard risk analysis technique. Results: An Orfit mask was selected because it allowed access to indexes on the couch top which assist with setup reproducibility. The first HRA table was created considering mechanical failure modes of the device. Discussions with operators and manufacturers identified other failure modes and solutions. HRA directed the design towards a safe clinical device. Conclusion: A new immobilization tool has been designed using hazard-risk analysis, resulting in an easier-to-use and safer tool compared to the initial design. The remaining risks are all low-probability events, not dissimilar from those currently faced with the GTC setup. Given the gains in ease of use for therapists and patients, as well as the lower costs for the hospital, we will implement this new tool.

  18. Neurodevelopmental and Cognitive Outcomes in Children With Intestinal Failure.

    PubMed

    Chesley, Patrick M; Sanchez, Sabrina E; Melzer, Lilah; Oron, Assaf P; Horslen, Simon P; Bennett, F Curt; Javid, Patrick J

    2016-07-01

    Recent advances in medical and surgical management have led to improved long-term survival in children with intestinal failure. Yet, limited data exist on their neurodevelopmental and cognitive outcomes. The aim of the present study was to measure neurodevelopmental outcomes in children with intestinal failure. Children enrolled in a regional intestinal failure program underwent prospective neurodevelopmental and psychometric evaluation using a validated scoring tool. Cognitive impairment was defined as a mental developmental index <70. Neurodevelopmental impairment was defined as cerebral palsy, visual or hearing impairment, or cognitive impairment. Univariate analyses were performed using the Wilcoxon rank-sum test. Data are presented as median (range). Fifteen children with a remnant bowel length of 18 (5-85) cm were studied at age 17 (12-67) months. Thirteen patients remained dependent on parenteral nutrition. Twelve (80%) subjects scored within the normal range on cognitive testing. Each child with cognitive impairment was noted to have additional risk factors independent of intestinal failure including cardiac arrest and extreme prematurity. On univariate analysis, cognitive impairment was associated with longer inpatient hospital stays, increased number of surgical procedures, and prematurity (P < 0.02). In total, 4 (27%) children demonstrated findings consistent with neurodevelopmental impairment. A majority of children with intestinal failure demonstrated normal neurodevelopmental and cognitive outcomes on psychometric testing. These data suggest that children with intestinal failure without significant comorbidity may be at low risk for long-term neurodevelopmental impairment.

  19. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Daniel, Jennifer C., E-mail: jennifer.odaniel@duke.edu; Yin, Fang-Fang

    Purpose: To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. Methods and Materials: A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of the frequency of test performance. Results: Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and laser tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Conclusions: Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number.

  20. No Child Left Behind.

    ERIC Educational Resources Information Center

    Bush, George W.

    This blueprint describes President Bush's education plan. Federal block grants are provided to states for schools that establish annual assessments, demand progress, improve poorly-performing schools, create consequences for failure, and protect home and private schools. The "Reading First" initiative gives funds and tools to promote…

  1. Columbus safety and reliability

    NASA Astrophysics Data System (ADS)

    Longhurst, F.; Wessels, H.

    1988-10-01

    Analyses carried out to ensure Columbus reliability, availability, and maintainability, and operational and design safety are summarized. Failure modes/effects/criticality is the main qualitative tool used. The main aspects studied are fault tolerance, hazard consequence control, risk minimization, human error effects, restorability, and safe-life design.

  2. Logic Design Pathology and Space Flight Electronics

    NASA Technical Reports Server (NTRS)

    Katz, Richard B.; Barto, Rod L.; Erickson, Ken

    1999-01-01

    This paper presents a look at logic design from early in the US Space Program and examines faults in recent logic designs. Most examples are based on flight hardware failures and analysis of new tools and techniques. The paper is presented in viewgraph form.

  3. Fault Tree Analysis as a Planning and Management Tool: A Case Study

    ERIC Educational Resources Information Center

    Witkin, Belle Ruth

    1977-01-01

    Fault Tree Analysis is an operations research technique used to analyse the most probable modes of failure in a system, so that the system can be redesigned or monitored more closely to increase its likelihood of success. (Author)

  4. p53 protects against genome instability following centriole duplication failure

    PubMed Central

    Lambrus, Bramwell G.; Uetake, Yumi; Clutario, Kevin M.; Daggubati, Vikas; Snyder, Michael; Sluder, Greenfield

    2015-01-01

    Centriole function has been difficult to study because of a lack of specific tools that allow persistent and reversible centriole depletion. Here we combined gene targeting with an auxin-inducible degradation system to achieve rapid, titratable, and reversible control of Polo-like kinase 4 (Plk4), a master regulator of centriole biogenesis. Depletion of Plk4 led to a failure of centriole duplication that produced an irreversible cell cycle arrest within a few divisions. This arrest was not a result of a prolonged mitosis, chromosome segregation errors, or cytokinesis failure. Depleting p53 allowed cells that fail centriole duplication to proliferate indefinitely. Washout of auxin and restoration of endogenous Plk4 levels in cells that lack centrioles led to the penetrant formation of de novo centrioles that gained the ability to organize microtubules and duplicate. In summary, we uncover a p53-dependent surveillance mechanism that protects against genome instability by preventing cell growth after centriole duplication failure. PMID:26150389

  5. Methodological pitfalls in the analysis of contraceptive failure.

    PubMed

    Trussell, J

    1991-02-01

    Although the literature on contraceptive failure is vast and is expanding rapidly, our understanding of the relative efficacy of methods is quite limited because of defects in the research design and in the analytical tools used by investigators. Errors in the literature range from simple arithmetical mistakes to outright fraud. In many studies the proportion of the original sample lost to follow-up is so large that the published results have little meaning. Investigators do not routinely use life table techniques to control for duration of exposure; many employ the Pearl index, which suffers from the same problem as does the crude death rate as a measure of mortality. Investigators routinely calculate 'method' failure rates by eliminating 'user' failures from the numerator (pregnancies) but fail to eliminate 'imperfect' use from the denominator (exposure); as a consequence, these 'method' rates are biased downward. This paper explores these and other common biases that snare investigators and establishes methodological guidelines for future research.
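
    The contrast drawn here is easy to state in code: the Pearl index divides pregnancies by total woman-months of exposure (reported per 100 woman-years), so it depends on the duration mix of the sample, whereas a life-table estimate multiplies per-month continuation probabilities and so controls for duration. A minimal sketch with invented counts:

        def pearl_index(pregnancies: int, woman_months: float) -> float:
            """Pregnancies per 100 woman-years of exposure."""
            return pregnancies * 1200.0 / woman_months

        def life_table_failure(monthly_failures, monthly_at_risk):
            """Cumulative probability of contraceptive failure after n months."""
            surviving = 1.0
            for fails, at_risk in zip(monthly_failures, monthly_at_risk):
                surviving *= 1.0 - fails / at_risk
            return 1.0 - surviving

        print(pearl_index(12, 2400))                           # -> 6.0 per 100 woman-years
        print(life_table_failure([3, 2, 2], [400, 380, 360]))  # ~ 0.018 after 3 months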

  6. Failure Study of Composite Materials by the Yeh-Stratton Criterion

    NASA Technical Reports Server (NTRS)

    Yeh, Hsien-Yang; Richards, W. Lance

    1997-01-01

    The newly developed Yeh-Stratton (Y-S) Strength Criterion was used to study the failure of composite materials with central holes and normal cracks. Evaluating the interaction parameters for the Y-S failure theory requires several biaxial loading tests; however, the inhomogeneous and anisotropic nature of composite materials undeniably complicates the biaxial testing problem. To avoid the difficulties of performing many biaxial tests while still accounting for the effects of the interaction term in the Y-S Criterion, a simple modification of the Y-S Criterion was developed. The preliminary predictions of the modified Y-S Criterion were relatively conservative compared to the test data; thus, the modified Y-S Criterion could be used as a design tool. To further understand the composite failure problem, an investigation of the damage zone in front of the crack tip, coupled with the Y-S Criterion, is imperative.

  7. Challenges in Resolution for IC Failure Analysis

    NASA Astrophysics Data System (ADS)

    Martinez, Nick

    1999-10-01

    Resolution is becoming more and more of a challenge in the world of Failure Analysis in integrated circuits. This is a result of the ongoing size reduction in microelectronics. Determining the cause of a failure depends upon being able to find the responsible defect. The time it takes to locate a given defect is extremely important so that proper corrective actions can be taken. The limits of current microscopy tools are being pushed. With sub-micron feature sizes and even smaller killing defects, optical microscopes are becoming obsolete. With scanning electron microscopy (SEM), the resolution is high but the voltage involved can make these small defects transparent due to the large mean-free path of incident electrons. In this presentation, I will give an overview of the use of inspection methods in Failure Analysis and show example studies of my work as an Intern student at Texas Instruments. 1. Work at Texas Instruments, Stafford, TX, was supported by TI. 2. Work at Texas Tech University, was supported by NSF Grant DMR9705498.

  8. Derivation of Failure Rates and Probability of Failures for the International Space Station Probabilistic Risk Assessment Study

    NASA Technical Reports Server (NTRS)

    Vitali, Roberto; Lutomski, Michael G.

    2004-01-01

    National Aeronautics and Space Administration's (NASA) International Space Station (ISS) Program uses Probabilistic Risk Assessment (PRA) as part of its Continuous Risk Management Process. It is used as a decision and management support tool not only to quantify risk for specific conditions but, more importantly, to compare different operational and management options, determine the lowest-risk option, and provide rationale for management decisions. This paper presents the derivation of the probability distributions used to quantify the failure rates and the probability of failures of the basic events employed in the PRA model of the ISS. The paper will show how a Bayesian approach was used with different sources of data, including the actual ISS on-orbit failures, to enhance the confidence in the results of the PRA. As time progresses and more meaningful data is gathered from on-orbit failures, an increasingly accurate failure rate probability distribution for the basic events of the ISS PRA model can be obtained. The ISS PRA has been developed by mapping the ISS critical systems, such as propulsion, thermal control, or power generation, into event sequence diagrams and fault trees. The lowest level of indenture of the fault trees was the orbital replacement units (ORU). The ORU level was chosen consistent with the level of statistically meaningful data that could be obtained from the aerospace industry and from the experts in the field. For example, data was gathered for the solenoid valves present in the propulsion system of the ISS. However, valves themselves are composed of parts, and the individual failure of these parts was not accounted for in the PRA model. In other words, the failure of a spring within a valve was considered a failure of the valve itself.
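
    The Bayesian updating sketched above has a standard conjugate form; the toy example below assumes a Gamma prior on an ORU failure rate and Poisson-distributed on-orbit failure counts. The prior parameters, failure count, and exposure time are invented for illustration, not ISS data.

```python
# Minimal sketch of a conjugate Gamma-Poisson update for a failure rate.
# Prior: rate ~ Gamma(alpha, beta) with beta in hours, so the prior mean
# is alpha/beta (failures per hour). Observing k failures in T hours of
# exposure gives the posterior Gamma(alpha + k, beta + T).

def update_failure_rate(alpha, beta, failures, exposure_hours):
    post_alpha = alpha + failures
    post_beta = beta + exposure_hours
    return post_alpha, post_beta, post_alpha / post_beta  # posterior mean

# Invented numbers: generic-industry prior mean of 1e-5/h, then two
# on-orbit failures over 50,000 hours pull the estimate upward.
a, b, mean_rate = update_failure_rate(0.5, 5.0e4, failures=2, exposure_hours=5.0e4)
print(f"posterior mean failure rate: {mean_rate:.2e} per hour")  # 2.50e-05
```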

  9. A new casemix adjustment index for hospital mortality among patients with congestive heart failure.

    PubMed

    Polanczyk, C A; Rohde, L E; Philbin, E A; Di Salvo, T G

    1998-10-01

    Comparative analysis of hospital outcomes requires reliable adjustment for casemix. Although congestive heart failure is one of the most common indications for hospitalization, congestive heart failure casemix adjustment has not been widely studied. The purposes of this study were (1) to describe and validate a new congestive heart failure-specific casemix adjustment index to predict in-hospital mortality and (2) to compare its performance to the Charlson comorbidity index. Data from all 4,608 admissions to the Massachusetts General Hospital from January 1990 to July 1996 with a principal ICD-9-CM discharge diagnosis of congestive heart failure were evaluated. Massachusetts General Hospital patients were randomly divided into a derivation set and a validation set. By logistic regression, odds ratios for in-hospital death were computed and weights were assigned to construct a new predictive index in the derivation set. The performance of the index was tested in an internal Massachusetts General Hospital validation set and in a non-Massachusetts General Hospital external validation set incorporating data from all 1995 New York state hospital discharges with a primary discharge diagnosis of congestive heart failure. Overall in-hospital mortality was 6.4%. Based on the new index, patients were assigned to six categories with incrementally increasing hospital mortality rates ranging from 0.5% to 31%. By logistic regression, "c" statistics of the congestive heart failure-specific index (0.83 and 0.78 in the derivation and validation sets, respectively) were significantly superior to the Charlson index (0.66). Similar incrementally increasing hospital mortality rates were observed in the New York database with the congestive heart failure-specific index ("c" statistic 0.75). In an administrative database, this congestive heart failure-specific index may be a more adequate casemix adjustment tool to predict hospital mortality in patients hospitalized for congestive heart failure.

  10. Modelling river bank erosion processes and mass failure mechanisms using 2-D depth averaged numerical model

    NASA Astrophysics Data System (ADS)

    Die Moran, Andres; El kadi Abderrezzak, Kamal; Tassi, Pablo; Herouvet, Jean-Michel

    2014-05-01

    Bank erosion is a key process that may cause a large number of economic and environmental problems (e.g. land loss, damage to structures and aquatic habitat). Stream bank erosion (toe erosion and mass failure) represents an important form of channel morphology change and a significant source of sediment. With the advances made in computational techniques, two-dimensional (2-D) numerical models have become valuable tools for investigating flow and sediment transport in open channels at large temporal and spatial scales. However, the implementation of the mass failure process in 2-D numerical models is still a challenging task. In this paper, a simple, innovative algorithm is implemented in the Telemac-Mascaret modeling platform to handle bank failure: failure occurs when the actual slope of a given bed element exceeds the internal friction angle. The unstable bed elements are rotated around an appropriate axis, ensuring mass conservation. Mass failure of a bank due to slope instability is applied at the end of each sediment transport evolution iteration, once the bed evolution due to bed load (and/or suspended load) has been computed, but before the global sediment mass balance is verified. This bank failure algorithm is successfully tested using two laboratory experimental cases. Then, bank failure in a 1:40 scale physical model of the Rhine River composed of non-uniform material is simulated. The main features of the bank erosion and failure are correctly reproduced in the numerical simulations, namely the mass wasting at the bank toe, followed by failure at the bank head, and subsequent transport of the mobilised material in an aggradation front. The volumes of eroded material obtained are of the same order of magnitude as the volumes measured during the laboratory tests.
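
    The slope-threshold rule described above can be sketched in a few lines. The fragment below is a hypothetical 1-D, two-node simplification of the 2-D Telemac-Mascaret scheme: a single sweep over a cross-section that cuts unstable elements back to the friction angle while conserving sediment volume between node pairs (the real algorithm works on 2-D elements and iterates until the section is stable).

```python
import math

def apply_bank_failure(z, dx, phi_deg):
    """Single sweep of a slope-threshold bank-failure rule.

    z: bed elevations along a cross-section (m); dx: node spacing (m);
    phi_deg: internal friction angle. Where a segment's slope exceeds
    tan(phi), material is moved from the high node to the low node so
    the new slope equals tan(phi); the equal cut and fill conserve volume.
    """
    tan_phi = math.tan(math.radians(phi_deg))
    for i in range(len(z) - 1):
        slope = (z[i] - z[i + 1]) / dx
        if abs(slope) > tan_phi:                      # unstable element
            excess = (abs(slope) - tan_phi) * dx / 2.0
            hi, lo = (i, i + 1) if slope > 0 else (i + 1, i)
            z[hi] -= excess                           # cut the high node
            z[lo] += excess                           # fill the low node
    return z

# A steep bank toe is relaxed toward the 30-degree friction angle.
print(apply_bank_failure([2.0, 1.8, 0.5, 0.4], dx=0.5, phi_deg=30.0))
```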

  11. Use of Failure Mode and Effects Analysis to Improve Emergency Department Handoff Processes.

    PubMed

    Sorrentino, Patricia

    2016-01-01

    The purpose of this article is to describe a quality improvement process using failure mode and effects analysis (FMEA) to evaluate systems handoff communication processes, improve emergency department (ED) throughput and reduce crowding through development of a standardized handoff, and, ultimately, improve patient safety. Risk of patient harm through ineffective communication during handoff transitions is a major reason for breakdown of systems. Complexities of ED processes put patient safety at risk. An increased incidence of submitted patient safety event reports for handoff communication failures between the ED and inpatient units solidified a decision to implement the use of FMEA to identify handoff failures and mitigate patient harm through redesign. The clinical nurse specialist implemented an FMEA. Handoff failure themes were created from deidentified retrospective reviews. Weekly meetings were held over a 3-month period to identify failure modes and determine cause and effect on the process. A functional block diagram process map tool was used to illustrate handoff processes. An FMEA grid was used to list failure modes and assign a risk priority number to quantify results. Multiple areas with actionable failures were identified. A majority of causes for high-priority failure modes were specific to communications. Findings demonstrate the complexity of transition and handoff processes. The FMEA served to identify and evaluate risk of handoff failures and provide a framework for process improvement. A focus on mentoring nurses in quality handoff processes so that they become habitual practice is crucial to safe patient transitions. Standardizing content and hardwiring within the system are best practice. The clinical nurse specialist is prepared to provide strong leadership to drive and implement system-wide quality projects.

  12. The consistency service of the ATLAS Distributed Data Management system

    NASA Astrophysics Data System (ADS)

    Serfon, Cédric; Garonne, Vincent; ATLAS Collaboration

    2011-12-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data loss due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e., the analysis tools, production tools, and DQ2 site services, or by site administrators who report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  13. Reducing maintenance costs in agreement with CNC machine tools reliability

    NASA Astrophysics Data System (ADS)

    Ungureanu, A. L.; Stan, G.; Butunoi, P. A.

    2016-08-01

    Aligning maintenance strategy with reliability is a challenge due to the need to find an optimal balance between them. Because the various methods described in the relevant literature involve laborious calculations or use of software that can be costly, this paper proposes a method that is easier to implement on CNC machine tools. The new method, called the Consequence of Failure Analysis (CFA) is based on technical and economic optimization, aimed at obtaining a level of required performance with minimum investment and maintenance costs.

  14. The Influence of High Pressure Thermal Behavior on Friction-induced material transfer During Dry Machining of Titanium

    NASA Astrophysics Data System (ADS)

    Abdel-Aal, H. A.; El Mansori, M.

    2011-05-01

    In this paper we study the failure of coated carbide tools due to thermal loading. The study emphasizes the role assumed by the thermo-physical properties of the tool material in enhancing or preventing mass attrition of the cutting elements within the tool. It is shown that, within a comprehensive view of the nature of conduction in the tool zone, thermal conduction is not solely affected by temperature. Rather, it is a function of the so-called thermodynamic forces: the stress, the strain, the strain rate, the rate of temperature rise, and the temperature gradient. Although the description of thermal conduction is non-linear within such a consideration, it is beneficial to employ this form because it facilitates a full mechanistic understanding of the thermal activation of tool wear.

  15. Microstructures, Forming Limit and Failure Analyses of Inconel 718 Sheets for Fabrication of Aerospace Components

    NASA Astrophysics Data System (ADS)

    Sajun Prasad, K.; Panda, Sushanta Kumar; Kar, Sujoy Kumar; Sen, Mainak; Murty, S. V. S. Naryana; Sharma, Sharad Chandra

    2017-04-01

    Recently, aerospace industries have shown increasing interest in the forming limits of Inconel 718 sheet metals, which can be utilised in designing tools and selecting process parameters for successful fabrication of components. In the present work, the stress-strain response with failure strains was evaluated by uniaxial tensile tests in different orientations, and two-stage work-hardening behavior was observed. In spite of a highly preferred texture, tensile properties showed minor variations in different orientations due to the random distribution of nanoprecipitates. The forming limit strains were evaluated by deforming specimens in seven different strain paths using a limiting dome height (LDH) test facility. Mostly, the specimens failed without prior indication of localized necking. Thus, the fracture forming limit diagram (FFLD) was evaluated, and a bending correction was imposed due to the use of a sub-size hemispherical punch. The failure strains of the FFLD were converted into major-minor stress space (σ-FFLD) and effective plastic strain-stress triaxiality space (ηEPS-FFLD) as failure criteria to avoid strain path dependence. Moreover, an FE model was developed, and the LDH, strain distribution and failure location were predicted successfully using the above-mentioned failure criteria with two stages of work hardening. Fractographs were correlated with the fracture behavior and formability of the sheet metal.

  16. Diverse Redundant Systems for Reliable Space Life Support

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    Reliable life support systems are required for deep space missions. The probability of a fatal life support failure should be less than one in a thousand in a multi-year mission. It is far too expensive to develop a single system with such high reliability. Using three redundant units would require only that each have a failure probability of one in ten over the mission. Since the system development cost is inversely proportional to the failure probability, this would cut cost by a factor of one hundred. Using replaceable subsystems instead of full systems would further cut cost. Using full sets of replaceable components improves reliability more than using complete systems as spares, since a set of components could repair many different failures instead of just one. Replaceable components would require more tools, space, and planning than full systems or replaceable subsystems. However, identical system redundancy cannot be relied on in practice. Common cause failures can disable all the identical redundant systems. Typical levels of common cause failures will defeat redundancy greater than two. Diverse redundant systems are required for reliable space life support. Three, four, or five diverse redundant systems could be needed for sufficient reliability. One system with lower level repair could be substituted for two diverse systems to save cost.
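
    The arithmetic above, and the warning about common cause failures, can be made concrete with a beta-factor model; the sketch below uses invented numbers and is only an illustration of the reasoning, not a NASA reliability model.

```python
# Independent redundancy vs. a simple beta-factor common-cause model.

def independent_redundancy(p_unit, n):
    """System fails only if all n identical, independent units fail."""
    return p_unit ** n

def with_common_cause(p_unit, n, beta):
    """Beta-factor model: a fraction beta of unit failures hits all units."""
    p_independent = (1.0 - beta) * p_unit
    return p_independent ** n + beta * p_unit

p = 0.1  # one-in-ten failure probability per unit over the mission
print(independent_redundancy(p, 3))        # 1e-3: the stated target
print(with_common_cause(p, 3, beta=0.05))  # ~5.9e-3: common cause dominates
```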

  17. Is it possible to predict office hysteroscopy failure?

    PubMed

    Cobellis, Luigi; Castaldi, Maria Antonietta; Giordano, Valentino; De Franciscis, Pasquale; Signoriello, Giuseppe; Colacurci, Nicola

    2014-10-01

    The purpose of this study was to develop a clinical tool, the HFI (Hysteroscopy Failure Index), which gives criteria to predict hysteroscopic examination failure. This was a retrospective diagnostic test study, aimed to validate the HFI, set at the Department of Gynaecology, Obstetric and Reproductive Science of the Second University of Naples, Italy. The HFI was applied to our database of 995 consecutive women who underwent office hysteroscopy to assess abnormal uterine bleeding (AUB), infertility, cervical polyps, and abnormal sonographic patterns (postmenopausal endometrial thickness of more than 5 mm, endometrial hyperechogenic spots, irregular endometrial line, suspected uterine septa). Demographic characteristics, previous surgery, recurrent infections, sonographic data, estro-progestin use, IUD use and menopausal status were collected. Receiver operating characteristic (ROC) curve analysis was used to assess the model's sensitivity: the number of correctly identified failures (true positives) divided by the total number of failed hysteroscopies (true positives + false negatives). Positive and negative likelihood ratios with 95% CIs were calculated. The HFI score is able to predict office hysteroscopy failure in 76% of cases. Moreover, the positive likelihood ratio was 11.37 (95% CI: 8.49-15.21), and the negative likelihood ratio was 0.33 (95% CI: 0.27-0.41). The HFI was able to retrospectively predict office hysteroscopy failure. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
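
    To make the reported metrics concrete, the sketch below recomputes sensitivity and likelihood ratios from a 2x2 table. The cell counts are invented to land near the published figures; only the formulas (LR+ = sensitivity/(1 - specificity), LR- = (1 - sensitivity)/specificity) are standard.

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity, and likelihood ratios from a 2x2 table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return sensitivity, specificity, lr_pos, lr_neg

# Hypothetical counts: 100 failed and 895 successful hysteroscopies.
sens, spec, lr_pos, lr_neg = diagnostic_metrics(tp=76, fn=24, fp=60, tn=835)
print(f"sensitivity={sens:.2f}  LR+={lr_pos:.2f}  LR-={lr_neg:.2f}")
```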

  18. Current indications for transplantation: stratification of severe heart failure and shared decision-making

    PubMed Central

    Vucicevic, Darko; Honoris, Lily; Raia, Federica

    2018-01-01

    Heart failure (HF) is a complex clinical syndrome that results from structural or functional cardiovascular disorders causing a mismatch between demand and supply of oxygenated blood and consecutive failure of the body’s organs. For those patients with stage D HF, advanced therapies, such as mechanical circulatory support (MCS) or heart transplantation (HTx), are potentially life-saving options. The role of risk stratification of patients with stage D HF in a value-based healthcare framework is to predict which subset might benefit from advanced HF (AdHF) therapies, to improve outcomes related to the individual patient including mortality, morbidity and patient experience, as well as to optimize health care delivery system outcomes such as cost-effectiveness. Risk stratification and subsequent outcome prediction, as well as therapeutic recommendation-making, need to be based on the comparative survival benefit rationale. A robust model needs to (I) have the power to discriminate (i.e., to correctly risk stratify patients); (II) calibrate (i.e., to show agreement between the predicted and observed risk); (III) be applicable to the general population; and (IV) provide good external validation. The Seattle Heart Failure Model (SHFM) and the Heart Failure Survival Score (HFSS) are two of the most widely utilized scores. However, outcomes for patients with HF are highly variable, which makes clinical prediction challenging. Despite our clinical expertise and current prediction tools, the best short- and long-term survival for the individual patient, particularly the sickest patient, is not easy to identify because among the most severely ill, elderly and frail patients, most preoperative prediction tools have the tendency to be imprecise in estimating risk. They should be used as a guide in a clinical encounter grounded in a culture of shared decision-making, with the expert healthcare professional team as consultants and the patient as an empowered decision-maker in a trustful, safe therapeutic relationship. PMID:29492383

  19. Current indications for transplantation: stratification of severe heart failure and shared decision-making.

    PubMed

    Vucicevic, Darko; Honoris, Lily; Raia, Federica; Deng, Mario

    2018-01-01

    Heart failure (HF) is a complex clinical syndrome that results from structural or functional cardiovascular disorders causing a mismatch between demand and supply of oxygenated blood and consecutive failure of the body's organs. For those patients with stage D HF, advanced therapies, such as mechanical circulatory support (MCS) or heart transplantation (HTx), are potentially life-saving options. The role of risk stratification of patients with stage D HF in a value-based healthcare framework is to predict which subset might benefit from advanced HF (AdHF) therapies, to improve outcomes related to the individual patient including mortality, morbidity and patient experience, as well as to optimize health care delivery system outcomes such as cost-effectiveness. Risk stratification and subsequent outcome prediction, as well as therapeutic recommendation-making, need to be based on the comparative survival benefit rationale. A robust model needs to (I) have the power to discriminate (i.e., to correctly risk stratify patients); (II) calibrate (i.e., to show agreement between the predicted and observed risk); (III) be applicable to the general population; and (IV) provide good external validation. The Seattle Heart Failure Model (SHFM) and the Heart Failure Survival Score (HFSS) are two of the most widely utilized scores. However, outcomes for patients with HF are highly variable, which makes clinical prediction challenging. Despite our clinical expertise and current prediction tools, the best short- and long-term survival for the individual patient, particularly the sickest patient, is not easy to identify because among the most severely ill, elderly and frail patients, most preoperative prediction tools have the tendency to be imprecise in estimating risk. They should be used as a guide in a clinical encounter grounded in a culture of shared decision-making, with the expert healthcare professional team as consultants and the patient as an empowered decision-maker in a trustful, safe therapeutic relationship.

  20. Minimizing treatment planning errors in proton therapy using failure mode and effects analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Yuanshui, E-mail: yuanshui.zheng@okc.procure.com; Johnson, Randall; Larson, Gary

    Purpose: Failure mode and effects analysis (FMEA) is a widely used tool to evaluate safety or reliability in conventional photon radiation therapy. However, reports about FMEA application in proton therapy are scarce. The purpose of this study is to apply FMEA in safety improvement of proton treatment planning at their center. Methods: The authors performed an FMEA analysis of their proton therapy treatment planning process using uniform scanning proton beams. The authors identified possible failure modes in various planning processes, including image fusion, contouring, beam arrangement, dose calculation, plan export, documents, billing, and so on. For each error, the authors estimated the frequency of occurrence, the likelihood of being undetected, and the severity of the error if it went undetected and calculated the risk priority number (RPN). The FMEA results were used to design their quality management program. In addition, the authors created a database to track the identified dosimetric errors. Periodically, the authors reevaluated the risk of errors by reviewing the internal error database and improved their quality assurance program as needed. Results: In total, the authors identified over 36 possible treatment planning related failure modes and estimated the associated occurrence, detectability, and severity to calculate the overall risk priority number. Based on the FMEA, the authors implemented various safety improvement procedures into their practice, such as education, peer review, and automatic check tools. The ongoing error tracking database provided realistic data on the frequency of occurrence with which to reevaluate the RPNs for various failure modes. Conclusions: The FMEA technique provides a systematic method for identifying and evaluating potential errors in proton treatment planning before they result in an error in patient dose delivery. The application of FMEA framework and the implementation of an ongoing error tracking system at their clinic have proven to be useful in error reduction in proton treatment planning, thus improving the effectiveness and safety of proton therapy.
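
    The RPN bookkeeping at the heart of FMEA is simple to sketch. The failure modes and 1-10 scores below are invented examples, not the study's actual list; the point is only that RPN = occurrence x detectability x severity, and that the highest-RPN modes are addressed first.

```python
# Rank invented treatment-planning failure modes by risk priority number.
failure_modes = [
    # (description, occurrence, detectability, severity), each scored 1-10
    ("wrong CT series used in image fusion",  3, 6, 9),
    ("target contoured on wrong image set",   2, 5, 9),
    ("beam arrangement entry typo",           4, 3, 7),
    ("plan exported to wrong patient record", 2, 7, 8),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, o, d, s in ranked:
    print(f"RPN {o * d * s:4d}  {name}")
```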

  1. Randomized, Controlled Trial of an Advance Care Planning Video Decision Support Tool for Patients With Advanced Heart Failure.

    PubMed

    El-Jawahri, Areej; Paasche-Orlow, Michael K; Matlock, Dan; Stevenson, Lynne Warner; Lewis, Eldrin F; Stewart, Garrick; Semigran, Marc; Chang, Yuchiao; Parks, Kimberly; Walker-Corkery, Elizabeth S; Temel, Jennifer S; Bohossian, Hacho; Ooi, Henry; Mann, Eileen; Volandes, Angelo E

    2016-07-05

    Conversations about goals of care and cardiopulmonary resuscitation (CPR)/intubation for patients with advanced heart failure can be difficult. This study examined the impact of a video decision support tool and patient checklist on advance care planning for patients with heart failure. This was a multisite, randomized, controlled trial of a video-assisted intervention and advance care planning checklist versus a verbal description in 246 patients ≥64 years of age with heart failure and an estimated likelihood of death of >50% within 2 years. Intervention participants received a verbal description for goals of care (life-prolonging care, limited care, and comfort care) and CPR/intubation plus a 6-minute video depicting the 3 levels of care, CPR/intubation, and an advance care planning checklist. Control subjects received only the verbal description. The primary analysis compared the proportion of patients preferring comfort care between study arms immediately after the intervention. Secondary outcomes were CPR/intubation preferences and knowledge (6-item test; range, 0-6) after intervention. In the intervention group, 27 (22%) chose life-prolonging care, 31 (25%) chose limited care, 63 (51%) selected comfort care, and 2 (2%) were uncertain. In the control group, 50 (41%) chose life-prolonging care, 27 (22%) selected limited care, 37 (30%) chose comfort care, and 8 (7%) were uncertain (P<0.001). Intervention participants (compared with control subjects) were more likely to forgo CPR (68% versus 35%; P<0.001) and intubation (77% versus 48%; P<0.001) and had higher mean knowledge scores (4.1 versus 3.0; P<0.001). Patients with heart failure who viewed a video were more informed, more likely to select a focus on comfort, and less likely to desire CPR/intubation compared with patients receiving verbal information only. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01589120. © 2016 American Heart Association, Inc.

  2. Minimizing treatment planning errors in proton therapy using failure mode and effects analysis.

    PubMed

    Zheng, Yuanshui; Johnson, Randall; Larson, Gary

    2016-06-01

    Failure mode and effects analysis (FMEA) is a widely used tool to evaluate safety or reliability in conventional photon radiation therapy. However, reports about FMEA application in proton therapy are scarce. The purpose of this study is to apply FMEA in safety improvement of proton treatment planning at their center. The authors performed an FMEA analysis of their proton therapy treatment planning process using uniform scanning proton beams. The authors identified possible failure modes in various planning processes, including image fusion, contouring, beam arrangement, dose calculation, plan export, documents, billing, and so on. For each error, the authors estimated the frequency of occurrence, the likelihood of being undetected, and the severity of the error if it went undetected and calculated the risk priority number (RPN). The FMEA results were used to design their quality management program. In addition, the authors created a database to track the identified dosimetric errors. Periodically, the authors reevaluated the risk of errors by reviewing the internal error database and improved their quality assurance program as needed. In total, the authors identified over 36 possible treatment planning related failure modes and estimated the associated occurrence, detectability, and severity to calculate the overall risk priority number. Based on the FMEA, the authors implemented various safety improvement procedures into their practice, such as education, peer review, and automatic check tools. The ongoing error tracking database provided realistic data on the frequency of occurrence with which to reevaluate the RPNs for various failure modes. The FMEA technique provides a systematic method for identifying and evaluating potential errors in proton treatment planning before they result in an error in patient dose delivery. The application of FMEA framework and the implementation of an ongoing error tracking system at their clinic have proven to be useful in error reduction in proton treatment planning, thus improving the effectiveness and safety of proton therapy.

  3. Failure Assessment of Brazed Structures

    NASA Technical Reports Server (NTRS)

    Flom, Yuri

    2012-01-01

    Despite the great advances in analytical methods available to structural engineers, designers of brazed structures have great difficulty in addressing fundamental questions related to the load-carrying capabilities of brazed assemblies. In this chapter we will review why such common engineering tools as Finite Element Analysis (FEA), as well as many well-established theories (Tresca, von Mises, Highest Principal Stress, etc.), do not work well for brazed joints. This chapter will show how the classic approach of using interaction equations and the less-known Coulomb-Mohr failure criterion can be employed to estimate Margins of Safety (MS) in brazed joints.

  4. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1976-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program assembly control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools are described, as well as the program test plans and their implementation on the various simulators. Failure effects analysis and the creation of special failure generating software for testing purposes are described.

  5. Analysis of Minimum Quantity Lubrication (MQL) for Different Coating Tools during Turning of TC11 Titanium Alloy.

    PubMed

    Qin, Sheng; Li, Zhongquan; Guo, Guoqiang; An, Qinglong; Chen, Ming; Ming, Weiwei

    2016-09-28

    The tool coating and cooling strategy are two key factors when machining difficult-to-cut materials such as titanium alloy. In this paper, diamond coating was deposited on a commercial carbide insert as an attempt to increase the machinability of TC11 alloy during the turning process. An uncoated carbide insert and a commercial Al₂O₃/TiAlN-coated tool were also tested as a comparison. Furthermore, MQL was applied to improve the cutting condition. Cutting performance was analyzed by cutting force, cutting temperature and surface roughness measurements. Tool wear and tool life were evaluated to find a good matchup between tool coating and cooling strategy. According to the results, using MQL can slightly reduce the cutting force. By applying MQL, cutting temperature and tool wear were reduced by a great amount. Moreover, MQL can affect the tool wear mechanism and tool failure modes. The tool life of an Al₂O₃/TiAlN-coated tool can be prolonged by 88.4% under the MQL condition. Diamond-coated tools can obtain a good surface finish when cutting parameters and lubrication strategies are properly chosen.

  6. Analysis of Minimum Quantity Lubrication (MQL) for Different Coating Tools during Turning of TC11 Titanium Alloy

    PubMed Central

    Qin, Sheng; Li, Zhongquan; Guo, Guoqiang; An, Qinglong; Chen, Ming; Ming, Weiwei

    2016-01-01

    The tool coating and cooling strategy are two key factors when machining difficult-to-cut materials such as titanium alloy. In this paper, diamond coating was deposited on a commercial carbide insert as an attempt to increase the machinability of TC11 alloy during the turning process. An uncoated carbide insert and a commercial Al2O3/TiAlN-coated tool were also tested as a comparison. Furthermore, MQL was applied to improve the cutting condition. Cutting performance was analyzed by cutting force, cutting temperature and surface roughness measurements. Tool wear and tool life were evaluated to find a good matchup between tool coating and cooling strategy. According to the results, using MQL can slightly reduce the cutting force. By applying MQL, cutting temperature and tool wear were reduced by a great amount. Moreover, MQL can affect the tool wear mechanism and tool failure modes. The tool life of an Al2O3/TiAlN-coated tool can be prolonged by 88.4% under the MQL condition. Diamond-coated tools can obtain a good surface finish when cutting parameters and lubrication strategies are properly chosen. PMID:28773926

  7. Multiple IMU system development, volume 1

    NASA Technical Reports Server (NTRS)

    Landey, M.; Mckern, R.

    1974-01-01

    A redundant gimballed inertial system is described. System requirements and mechanization methods are defined and hardware and software development is described. Failure detection and isolation algorithms are presented and technology achievements described. Application of the system as a test tool for shuttle avionics concepts is outlined.

  8. Towards a Personalized Prescription Tool for Diabetic Treatment

    DTIC Science & Technology

    2015-05-18

    Approved for public release and sale; its distribution is limited. U.S.N.A. Trident Scholar project report; no. 432 (2015). TOWARDS A... stroke, high blood pressure, blindness, kidney failure, and nervous system damage. Both type I and type II diabetes are currently treated with insulin

  9. Fault Tree Analysis: An Emerging Methodology for Instructional Science.

    ERIC Educational Resources Information Center

    Wood, R. Kent; And Others

    1979-01-01

    Describes Fault Tree Analysis, a tool for systems analysis which attempts to identify possible modes of failure in systems to increase the probability of success. The article defines the technique and presents the steps of FTA construction, focusing on its application to education. (RAO)

  10. Exploring tool innovation: a comparison of Western and Bushman children.

    PubMed

    Nielsen, Mark; Tomaselli, Keyan; Mushin, Ilana; Whiten, Andrew

    2014-10-01

    A capacity for constructing new tools, or using old tools in new ways, to solve novel problems is a core feature of what it means to be human. Yet current evidence suggests that young children are surprisingly poor at innovating tools. However, all studies of tool innovation to date have been conducted with children from comparatively privileged Western backgrounds. This raises questions as to whether or not previously documented tool innovation failure is culturally and economically specific. Thus, in the current study, we explored the innovation capacities of children from Westernized urban backgrounds and from remote communities of South African Bushmen. Consistent with past research, we found tool innovation to occur at extremely low rates and that cultural background had no bearing on this. The current study is the first to empirically test tool innovation in children from non-Western backgrounds, with our data being consistent with the view that despite its key role in human evolution, a capacity for innovation in tool making remains remarkably undeveloped during early childhood. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Comparing Quick Sequential Organ Failure Assessment Scores to End-tidal Carbon Dioxide as Mortality Predictors in Prehospital Patients with Suspected Sepsis.

    PubMed

    Hunter, Christopher L; Silvestri, Salvatore; Ralls, George; Stone, Amanda; Walker, Ayanna; Mangalat, Neal; Papa, Linda

    2018-05-01

    Early identification of sepsis significantly improves outcomes, suggesting a role for prehospital screening. An end-tidal carbon dioxide (ETCO2) value ≤ 25 mmHg predicts mortality and severe sepsis when used as part of a prehospital screening tool. Recently, the Quick Sequential Organ Failure Assessment (qSOFA) score was also derived as a tool for predicting poor outcomes in potentially septic patients. We conducted a retrospective cohort study among patients transported by emergency medical services to compare the use of ETCO2 ≤ 25 mmHg with a qSOFA score of ≥ 2 as a predictor of mortality or diagnosis of severe sepsis in prehospital patients with suspected sepsis. By comparison of receiver operating characteristic curves, ETCO2 had a higher discriminatory power to predict mortality, sepsis, and severe sepsis than qSOFA. Both non-invasive measures were easily obtainable by prehospital personnel, with ETCO2 performing slightly better as an outcome predictor.
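
    Both screens reduce to threshold checks, which is why they suit prehospital use; the sketch below encodes them. The qSOFA thresholds (respiratory rate ≥ 22/min, systolic BP ≤ 100 mmHg, altered mentation) are the published criteria, the ETCO2 cut-off is the one studied here, and the function names are hypothetical.

```python
def qsofa_score(resp_rate, systolic_bp, altered_mentation):
    """One point per criterion; a score >= 2 flags risk."""
    return (resp_rate >= 22) + (systolic_bp <= 100) + bool(altered_mentation)

def prehospital_flags(resp_rate, systolic_bp, altered_mentation, etco2_mmhg):
    return {
        "qSOFA_positive": qsofa_score(resp_rate, systolic_bp, altered_mentation) >= 2,
        "ETCO2_positive": etco2_mmhg <= 25,
    }

# Example vitals: tachypneic, hypotensive patient with low ETCO2.
print(prehospital_flags(28, 95, altered_mentation=False, etco2_mmhg=22))
```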

  12. Supporting secure programming in web applications through interactive static analysis.

    PubMed

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2014-07-01

    Many security incidents are caused by software developers' failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis: integrating static analysis into the Integrated Development Environment (IDE) to provide in-situ secure programming support that helps developers prevent vulnerabilities during code construction. No additional training is required, nor are there any assumptions about how programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.

  13. The role of failure modes and effects analysis in showing the benefits of automation in the blood bank.

    PubMed

    Han, Tae Hee; Kim, Moon Jung; Kim, Shinyoung; Kim, Hyun Ok; Lee, Mi Ae; Choi, Ji Seon; Hur, Mina; St John, Andrew

    2013-05-01

    Failure modes and effects analysis (FMEA) is a risk management tool used by the manufacturing industry but now being applied in laboratories. Teams from six South Korean blood banks used this tool to map their manual and automated blood grouping processes and determine the risk priority numbers (RPNs) as a total measure of error risk. The RPNs determined by each of the teams consistently showed that the use of automation dramatically reduced the RPN compared to manual processes. In addition, FMEA showed where the major risks occur in each of the manual processes and where attention should be prioritized to improve the process. Despite no previous experience with FMEA, the teams found the technique relatively easy to use and the subjectivity associated with assigning risk numbers did not affect the validity of the data. FMEA should become a routine technique for improving processes in laboratories. © 2012 American Association of Blood Banks.

  14. Satellite Vulnerability to Space Debris- An Improved 3D Risk Assessment Methodology

    NASA Astrophysics Data System (ADS)

    Grassi, Lilith; Destefanis, Roberto; Tiboldo, Francesca; Donath, Therese; Winterboer, Arne; Evans, Leanne; Janovsky, Rolf; Kempf, Scott; Rudolph, Martin; Schafer, Frank; Gelhaus, Johannes

    2013-08-01

    The work described in the present paper, performed as a part of the P²-ROTECT project, presents an enhanced method to evaluate satellite vulnerability to micrometeoroids and orbital debris (MMOD), using the ESABASE2/Debris tool (developed under ESA contract). Starting from the estimation of induced failures on spacecraft (S/C) components and from the computation of lethal impacts (with an energy leading to the loss of the satellite), and considering the equipment redundancies and interactions between components, the debris-induced S/C functional impairment is assessed. The developed methodology, illustrated through its application to a case study satellite, includes the capability to estimate the number of failures on internal components, overcoming the limitations of current tools which do not allow propagating the debris cloud inside the S/C. The ballistic limit of internal equipment behind a sandwich panel structure is evaluated through the implementation of the Schäfer-Ryan-Lambert (SRL) Ballistic Limit Equation (BLE).

  15. Structural Health Monitoring with Fiber Bragg Grating and Piezo Arrays

    NASA Technical Reports Server (NTRS)

    Black, Richard J.; Faridian, Ferey; Moslehi, Behzad; Sotoudeh, Vahid

    2012-01-01

    Structural health monitoring (SHM) is one of the most important tools available for the maintenance, safety, and integrity of aerospace structural systems. Lightweight, electromagnetic-interference-immune, fiber-optic sensor-based SHM will play an increasing role in more secure air transportation systems. Manufacturers and maintenance personnel have pressing needs for significantly improving safety and reliability while providing for lower inspection and maintenance costs. Undetected or untreated damage may grow and lead to catastrophic structural failure. Damage can originate from the strain/stress history of the material, imperfections of domain boundaries in metals, delamination in multi-layer materials, or the impact of machine tools in the manufacturing process. Damage can likewise develop during service life from wear and tear, or under extraordinary circumstances such as with unusual forces, temperature cycling, or impact of flying objects. Monitoring and early detection are key to preventing a catastrophic failure of structures, especially when these are expected to perform near their limit conditions.

  16. Supporting secure programming in web applications through interactive static analysis

    PubMed Central

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2013-01-01

    Many security incidents are caused by software developers’ failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis: integrating static analysis into the Integrated Development Environment (IDE) to provide in-situ secure programming support that helps developers prevent vulnerabilities during code construction. No additional training is required, nor are there any assumptions about how programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases. PMID:25685513

  17. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
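
    A worked instance of the basic quantity named above, the probability of failure-free operation, helps fix ideas: under the simplest constant-failure-rate assumption, reliability over a mission time follows the exponential model. The rate and mission length below are invented, not values from the handbook.

```python
import math

def reliability(failure_rate_per_hour, mission_hours):
    """R(t) = exp(-lambda * t): probability of failure-free operation
    over the mission, assuming a constant failure rate lambda."""
    return math.exp(-failure_rate_per_hour * mission_hours)

print(reliability(2.0e-4, 1000.0))  # ~0.82 for a 1000-hour mission
```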

  18. Probabilistic risk analysis of building contamination.

    PubMed

    Bolster, D T; Tartakovsky, D M

    2008-10-01

    We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
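
    The fault-tree combination step described above is just probability algebra over AND/OR gates when the basic events are independent. The toy tree below (a contaminant source AND failure of either of two defenses) uses invented event probabilities.

```python
def p_or(*ps):
    """Probability that at least one independent input event occurs."""
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

def p_and(*ps):
    """Probability that all independent input events occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

p_source = 0.02   # contaminant source present
p_hvac = 0.05     # ventilation fails to dilute
p_filter = 0.10   # filtration fails
p_contamination = p_and(p_source, p_or(p_hvac, p_filter))
print(f"{p_contamination:.4f}")  # 0.0029
```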

  19. Risk and Vulnerability Analysis of Satellites Due to MM/SD with PIRAT

    NASA Astrophysics Data System (ADS)

    Kempf, Scott; Schafer, Frank; Rudolph, Martin; Welty, Nathan; Donath, Therese; Destefanis, Roberto; Grassi, Lilith; Janovsky, Rolf; Evans, Leanne; Winterboer, Arne

    2013-08-01

    Until recently, the state-of-the-art assessment of the threat posed to spacecraft by micrometeoroids and space debris was limited to the application of ballistic limit equations to the outer hull of a spacecraft. The probability of no penetration (PNP) is acceptable for assessing the risk and vulnerability of manned space missions; however, for unmanned missions, where penetrations of the spacecraft exterior do not necessarily constitute satellite or mission failure, these values are overly conservative. The software tool PIRAT (Particle Impact Risk and Vulnerability Analysis Tool) has been developed based on the Schäfer-Ryan-Lambert (SRL) triple-wall ballistic limit equation (BLE), applicable to various satellite components. As a result, it has become possible to assess the individual failure rates of satellite components. This paper demonstrates the modeling of an example satellite, the performance of a PIRAT analysis, and the potential for subsequent design optimization with respect to micrometeoroid and space debris (MM/SD) impact risk.

  20. Investigation of fatigue crack growth in acrylic bone cement using the acoustic emission technique.

    PubMed

    Roques, A; Browne, M; Thompson, J; Rowland, C; Taylor, A

    2004-02-01

    Failure of the bone cement mantle has been implicated in the loosening process of cemented hip stems. Current methods of investigating degradation of the cement mantle in vitro often require sectioning of the sample to confirm failure paths. The present research investigates acoustic emission (AE) as a passive experimental method for the assessment of bone cement failure. Damage in bone cement was monitored during four-point bending fatigue tests through an analysis of the peak amplitude, duration, rise time (RT) and energy of the events emitted from the damage sections. A difference in AE trends was observed during failure for specimens aged and tested in (i) air and (ii) Ringer's solution at 37 degrees C. It was noted that the acoustic behaviour varied according to applied load level; events of higher duration and RT were emitted during fatigue at lower stresses. A good correlation was observed between crack location and source of acoustic emission, and the acoustic parameters most suited to bone cement failure characterisation were identified. The methodology employed in this study could potentially be used as a pre-clinical assessment tool for the integrity of cemented load-bearing implants.

  1. [Use of lung ultrasound as a prognostic tool in outpatients with heart failure].

    PubMed

    Tojo Villanueva, María Del Carmen; Fernández López, María; Canora Lebrato, Jesús; Satué Bartolomé, José Ángel; San Martín Prado, Alberto; Zapatero Gaviria, Antonio

    2016-07-01

    To assess the prognostic value of lung ultrasound for patients with chronic heart failure. Prospective observational cohort study, in which a lung ultrasound was performed on 54 patients at a heart failure outpatient consultation. Ultrasonography was classified as positive or negative for ultrasound interstitial syndrome depending on the number of B lines observed. Patients were followed up for six months; considering emergency visits, readmissions and deaths due to heart failure as markers of poor prognosis. 53.7% (29) of the patients had ultrasound interstitial syndrome. Among them, 48.3% (14) were readmitted, compared to 16% (4) of those without the syndrome (P=.012). Considering any of the events previously described as end points (readmissions, emergencies and deaths), we found that in the group of patients with ultrasound interstitial syndrome, 55.2% (16) had at least one of these complications, compared to 20% (5) of participants without the syndrome (P=.008). Lung ultrasound in the outpatient setting is useful in predicting which patients are at increased risk of heart failure decompensation in the mid-term. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.

  2. Evolving Role of Natriuretic Peptides from Diagnostic Tool to Therapeutic Modality.

    PubMed

    Pagel-Langenickel, Ines

    2018-01-01

    Natriuretic peptides (NPs) are widely recognized as key regulators of blood pressure, water and salt homeostasis. In addition, they play a critical role in physiological cardiac growth and mediate a variety of biological effects, including antiproliferative and anti-inflammatory effects, in other organs and tissues. The cardiac release of the NPs ANP and BNP represents an important compensatory mechanism during acute and chronic cardiac overload and during the pathogenesis of heart failure, where their actions counteract the sustained activation of renin-angiotensin-aldosterone and other neurohormonal systems. Elevated circulating plasma NP levels correlate with the severity of heart failure, and BNP in particular and the pro-peptide NT-proBNP have been established as biomarkers for the diagnosis of heart failure as well as prognostic markers for cardiovascular risk. Despite activation of the NP system in heart failure, it is inadequate to prevent progressive fluid and sodium retention and cardiac remodeling. Therapeutic approaches have included administration of synthetic peptide analogs and the inhibition of the NP-degrading enzyme neutral endopeptidase (NEP). Of all strategies, only combined NEP/ARB inhibition with sacubitril/valsartan has shown clinical success in reducing cardiovascular mortality and morbidity in patients with heart failure.

  3. Identifying the latent failures underpinning medication administration errors: an exploratory study.

    PubMed

    Lawton, Rebecca; Carruthers, Sam; Gardner, Peter; Wright, John; McEachan, Rosie R C

    2012-08-01

    The primary aim of this article was to identify the latent failures that are perceived to underpin medication errors. The study was conducted within three medical wards in a hospital in the United Kingdom. The study employed a cross-sectional qualitative design. Interviews were conducted with 12 nurses and eight managers. Interviews were transcribed and subject to thematic content analysis. A two-step inter-rater comparison tested the reliability of the themes. Ten latent failures were identified based on the analysis of the interviews. These were ward climate, local working environment, workload, human resources, team communication, routine procedures, bed management, written policies and procedures, supervision and leadership, and training. The discussion focuses on ward climate, the most prevalent theme, which is conceptualized here as interacting with failures in the nine other organizational structures and processes. This study is the first of its kind to identify the latent failures perceived to underpin medication errors in a systematic way. The findings can be used as a platform for researchers to test the impact of organization-level patient safety interventions and to design proactive error management tools and incident reporting systems in hospitals. © Health Research and Educational Trust.

  4. Finite element based damage assessment of composite tidal turbine blades

    NASA Astrophysics Data System (ADS)

    Fagan, Edward M.; Leen, Sean B.; Kennedy, Ciaran R.; Goggins, Jamie

    2015-07-01

    With significant interest growing in the ocean renewables sector, horizontal axis tidal current turbines are in a position to dominate the marketplace. The test devices that have been placed in operation so far have suffered from premature failures, caused by difficulties with structural strength prediction. The goal of this work is to develop methods of predicting the damage level in tidal turbines under their maximum operating tidal velocity. The analysis was conducted using the finite element software package Abaqus; shell models of three representative tidal turbine blades were produced. Different construction methods will affect the damage level in the blade, and for this study models were developed with varying hydrofoil profiles. In order to determine the risk of failure, a user material subroutine (UMAT) was created. The UMAT uses the failure criteria designed by Alfred Puck to calculate the risk of fibre and inter-fibre failure in the blades. The results show that degradation of the stiffness is predicted for the operating conditions, having an effect on the overall tip deflection. The failure criteria applied via the UMAT form a useful tool for analysis of high-risk regions within the blade designs investigated.
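
    As a flavor of what such a UMAT evaluates at each integration point, the sketch below computes only the fibre-failure exposure, which in Puck's framework is essentially a max-stress ratio in the fibre direction; the inter-fibre modes (A/B/C) need the full in-plane stress state and inclination parameters and are omitted here. The strength values and function name are invented.

```python
def puck_fibre_exposure(sigma_1_mpa, tensile_strength=2000.0,
                        compressive_strength=1200.0):
    """Fibre-failure exposure: failure is predicted when this reaches 1.0."""
    if sigma_1_mpa >= 0.0:
        return sigma_1_mpa / tensile_strength
    return -sigma_1_mpa / compressive_strength

print(puck_fibre_exposure(1500.0))   # 0.75: intact, 25% reserve
print(puck_fibre_exposure(-1300.0))  # ~1.08: compressive fibre failure
```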

  5. Subcritical crack growth in SiNx thin-film barriers studied by electro-mechanical two-point bending

    NASA Astrophysics Data System (ADS)

    Guan, Qingling; Laven, Jozua; Bouten, Piet C. P.; de With, Gijsbertus

    2013-06-01

    Mechanical failure resulting from subcritical crack growth in the SiNx inorganic barrier layer applied on a flexible multilayer structure was studied by an electro-mechanical two-point bending method. A 10 nm conducting tin-doped indium oxide layer was sputtered as an electrical probe to monitor the subcritical crack growth in the 150 nm dielectric SiNx layer carried by a polyethylene naphthalate substrate. In the electro-mechanical two-point bending test, dynamic and static loads were applied to investigate the crack propagation in the barrier layer. As a consequence of using two loading modes, the characteristic failure strain and failure time could be determined. The failure probability distribution of strain and lifetime under each loading condition was described by Weibull statistics. In this study, results from the tests in dynamic and static loading modes were linked by a power law description to determine the critical failure over a range of conditions. The fatigue parameter n from the power law reduces greatly from 70 to 31 upon correcting for internal strain. The testing method and analysis tool described in this paper can be used to understand the limits of thin-film barriers in terms of their mechanical properties.
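
    The two ingredients named above, a Weibull distribution for the failure statistics and a power law linking lifetime to applied strain, can be written down directly. In the sketch below the scale/shape parameters and the reference point are invented; only the exponent n = 31 echoes the corrected value reported in the abstract.

```python
import math

def weibull_failure_prob(x, eta, beta):
    """F(x) = 1 - exp(-(x/eta)**beta): probability of failure by strain x."""
    return 1.0 - math.exp(-((x / eta) ** beta))

def static_lifetime(strain, t_ref=1.0, strain_ref=0.01, n=31):
    """Static-fatigue power law: t_f = t_ref * (strain_ref/strain)**n."""
    return t_ref * (strain_ref / strain) ** n

print(weibull_failure_prob(0.010, eta=0.012, beta=8.0))  # ~0.21
print(static_lifetime(0.008))  # ~1.0e3: lower strain, much longer life
```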

  6. Formal Specification and Validation of a Hybrid Connectivity Restoration Algorithm for Wireless Sensor and Actor Networks †

    PubMed Central

    Imran, Muhammad; Zafar, Nazir Ahmad

    2012-01-01

    Maintaining inter-actor connectivity is extremely crucial in mission-critical applications of Wireless Sensor and Actor Networks (WSANs), as actors have to quickly plan optimal coordinated responses to detected events. Failure of a critical actor partitions the inter-actor network into disjoint segments, besides leaving a coverage hole, and thus hinders the network operation. This paper presents a Partitioning detection and Connectivity Restoration (PCR) algorithm to tolerate critical actor failure. As part of pre-failure planning, PCR determines critical/non-critical actors based on localized information and designates each critical node with an appropriate backup (preferably non-critical). The pre-designated backup detects the failure of its primary actor and initiates a post-failure recovery process that may involve coordinated multi-actor relocation. To prove the correctness, we construct a formal specification of PCR using Z notation. We model the WSAN topology as a dynamic graph and transform PCR to the corresponding formal specification using Z notation. The formal specification is analyzed and validated using the Z Eves tool. Moreover, we simulate the specification to quantitatively analyze the efficiency of PCR. Simulation results confirm the effectiveness of PCR and show that it outperforms contemporary schemes found in the literature.
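
    The notion of a "critical actor", one whose failure partitions the network, is exactly a cut vertex (articulation point) of the inter-actor graph. PCR decides this from localized information; the global depth-first-search check below is a hypothetical illustration of the same property, not the PCR procedure itself.

```python
def articulation_points(adj):
    """Return the cut vertices of an undirected graph given as
    {node: set_of_neighbours}; removing one partitions the graph."""
    disc, low, cuts = {}, {}, set()
    timer = [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in adj[u]:
            if v == parent:
                continue
            if v in disc:                      # back edge
                low[u] = min(low[u], disc[v])
            else:                              # tree edge
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if parent is not None and low[v] >= disc[u]:
                    cuts.add(u)
        if parent is None and children > 1:    # root rule
            cuts.add(u)

    for node in adj:
        if node not in disc:
            dfs(node, None)
    return cuts

# Actors 2 and 4 are critical: their failure splits the network.
net = {1: {2}, 2: {1, 3, 4}, 3: {2, 4}, 4: {2, 3, 5}, 5: {4}}
print(articulation_points(net))  # {2, 4}
```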

  7. SILHIL Replication of Electric Aircraft Powertrain Dynamics and Inner-Loop Control for V&V of System Health Management Routines

    NASA Technical Reports Server (NTRS)

    Bole, Brian; Teubert, Christopher Allen; Cuong Chi, Quach; Hogge, Edward; Vazquez, Sixto; Goebel, Kai; George, Vachtsevanos

    2013-01-01

    Software-in-the-loop and Hardware-in-the-loop testing of failure prognostics and decision making tools for aircraft systems will facilitate more comprehensive and cost-effective testing than what is practical to conduct with flight tests. A framework is described for the offline recreation of dynamic loads on simulated or physical aircraft powertrain components based on a real-time simulation of airframe dynamics running on a flight simulator, an inner-loop flight control policy executed by either an autopilot routine or a human pilot, and a supervisory fault management control policy. The creation of an offline framework for verifying and validating supervisory failure prognostics and decision making routines is described for the example of battery charge depletion failure scenarios onboard a prototype electric unmanned aerial vehicle.

  8. Chimpanzee ‘folk physics’: bringing failures into focus

    PubMed Central

    Seed, Amanda; Seddon, Eleanor; Greene, Bláthnaid; Call, Josep

    2012-01-01

    Differences between individuals are the raw material from which theories of the evolution and ontogeny of cognition are built. For example, when 4-year-old children pass a test requiring them to communicate the content of another's falsely held belief, while 3-year-olds fail, we know that something must change over the course of the third year of life. In the search for what develops or evolves, the typical route is to probe the extents and limits of successful individuals' ability. Another is to focus on those that failed, and find out what difference or lack prevented them from passing the task. Recent research in developmental psychology has harnessed individual differences to illuminate the cognitive mechanisms that emerge to enable success. We apply this approach to explaining some of the failures made by chimpanzees when using tools to solve problems. Twelve of 16 chimpanzees failed to discriminate between a complete and a broken tool when, after being set down, the ends of the broken one were aligned in front of them. There was a correlation between performance on this aligned task and another in which after being set down, the centre of both tools was covered, suggesting that the limiting factor was not the representation of connection, but memory or attention. Some chimpanzees that passed the aligned task passed a task in which the location of the broken tool was never visible but had to be inferred. PMID:22927573

  9. Reliability analysis of C-130 turboprop engine components using artificial neural network

    NASA Astrophysics Data System (ADS)

    Qattan, Nizar A.

    In this study, we predict the failure rate of the Lockheed C-130 engine turbine. More than thirty years of local operational field data were used for failure rate prediction and validation. The Weibull regression model and three artificial neural network models (feed-forward back-propagation, radial basis function, and multilayer perceptron) were utilized, and the thesis is divided into five major parts. The first part uses the Weibull regression model to predict the turbine's general failure rate and the rate of failures that require overhaul maintenance. The second part covers the artificial neural network (ANN) model utilizing the feed-forward back-propagation algorithm as a learning rule; MATLAB was used to build and simulate the network, with the independent variables as inputs and the general failure rate and overhaul-maintenance failures as outputs. In the third part, the same quantities are predicted using a radial basis neural network model from the MATLAB toolbox. The fourth part compares the predictions of the feed-forward back-propagation model with those of the Weibull regression and radial basis models. The results show that the failure rates predicted by the feed-forward back-propagation and radial basis network models agree more closely with the actual field data than does the failure rate predicted by the Weibull model. Finally, we forecast the general failure rate, the failures requiring overhaul maintenance, and six categorical failures using a multilayer perceptron (MLP) model in the DTREG commercial software. The results also give insight into the reliability of the engine turbine under actual operating conditions, which aircraft operators can use to assess system and component failures and customize the maintenance programs recommended by the manufacturer.
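
    Since the study's first part fits a Weibull model to the failure history, a minimal sketch of that step is shown below; the failure times are invented, and the two-parameter maximum-likelihood fit uses SciPy rather than the thesis's tooling.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical turbine times-to-failure (operating hours)
    t = np.array([1200., 1850., 2300., 2900., 3400., 4100., 5200., 6100.])

    # Two-parameter Weibull fit by maximum likelihood (location fixed at 0)
    beta, _, eta = stats.weibull_min.fit(t, floc=0)
    print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} h")

    # Hazard (instantaneous failure rate): h(t) = (beta/eta) * (t/eta)**(beta-1)
    tt = np.array([1000., 3000., 5000.])
    hazard = (beta / eta) * (tt / eta) ** (beta - 1.0)
    for time, h in zip(tt, hazard):
        print(f"h({time:.0f} h) = {h:.2e} per hour")
    ```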

  10. Risk Factors for Noninvasive Ventilation Failure in Critically Ill Subjects With Confirmed Influenza Infection.

    PubMed

    Rodríguez, Alejandro; Ferri, Cristina; Martin-Loeches, Ignacio; Díaz, Emili; Masclans, Joan R; Gordo, Federico; Sole-Violán, Jordi; Bodí, María; Avilés-Jurado, Francesc X; Trefler, Sandra; Magret, Monica; Moreno, Gerard; Reyes, Luis F; Marin-Corral, Judith; Yebenes, Juan C; Esteban, Andres; Anzueto, Antonio; Aliberti, Stefano; Restrepo, Marcos I

    2017-10-01

    Despite wide use of noninvasive ventilation (NIV) in several clinical settings, the beneficial effects of NIV in patients with hypoxemic acute respiratory failure (ARF) due to influenza infection remain controversial. The aim of this study was to identify the profile of patients with risk factors for NIV failure using chi-square automatic interaction detection (CHAID) analysis and to determine whether NIV failure is associated with ICU mortality. This work was a secondary analysis of a prospective, observational multi-center study of critically ill subjects admitted to the ICU with ARF due to influenza infection requiring mechanical ventilation. Three groups of subjects were compared: (1) subjects who received NIV immediately after ICU admission for ARF and then failed (NIV failure group); (2) subjects who received NIV immediately after ICU admission for ARF and then succeeded (NIV success group); and (3) subjects who received invasive mechanical ventilation immediately after ICU admission for ARF (invasive mechanical ventilation group). Profiles of subjects with risk factors for NIV failure were obtained using CHAID analysis. Of 1,898 subjects, 806 underwent NIV, and 56.8% of them failed. Acute Physiology and Chronic Health Evaluation II (APACHE II) score, Sequential Organ Failure Assessment (SOFA) score, infiltrates on chest radiograph, and ICU mortality (38.4% vs 6.3%) were higher (P < .001) in the NIV failure than in the NIV success group. SOFA score was the variable most associated with NIV failure, and 2 cutoffs were determined. Subjects with SOFA ≥ 5 had a higher risk of NIV failure (odds ratio = 3.3, 95% CI 2.4-4.5). ICU mortality was higher in subjects with NIV failure (38.4%) than in invasive mechanical ventilation subjects (31.3%, P = .018), and NIV failure was associated with increased ICU mortality (odds ratio = 11.4, 95% CI 6.5-20.1). An automatic and non-subjective algorithm based on CHAID decision-tree analysis can help to define the profile of patients with different risks of NIV failure, which might be a promising tool to assist clinical decision making and avoid the complications associated with NIV failure. Copyright © 2017 by Daedalus Enterprises.
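
    CHAID itself is not available in common Python libraries, so the sketch below uses a shallow CART tree from scikit-learn as a stand-in to show how a decision-tree analysis can surface a SOFA-style cutoff. The data are synthetic and the logistic risk relationship is invented, not the study's cohort.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(0)
    n = 500
    # Hypothetical predictors: SOFA, APACHE II, infiltrated quadrants on radiograph
    sofa = rng.integers(0, 15, n)
    apache = rng.integers(5, 35, n)
    quadrants = rng.integers(0, 5, n)

    # Synthetic outcome loosely mimicking a SOFA-driven risk of NIV failure
    p_fail = 1.0 / (1.0 + np.exp(-(0.45 * (sofa - 5) + 0.05 * (apache - 15))))
    failed = rng.random(n) < p_fail

    X = np.column_stack([sofa, apache, quadrants])
    tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=50).fit(X, failed)
    print(export_text(tree, feature_names=["SOFA", "APACHE II", "quadrants"]))
    ```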

  11. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  12. Increase in hospital admission rates for heart failure in The Netherlands, 1980-1993.

    PubMed Central

    Reitsma, J. B.; Mosterd, A.; de Craen, A. J.; Koster, R. W.; van Capelle, F. J.; Grobbee, D. E.; Tijssen, J. G.

    1996-01-01

    OBJECTIVE: To study the trend in hospital admission rates for heart failure in the Netherlands from 1980 to 1993. DESIGN: All hospital admissions in the Netherlands with a principal discharge diagnosis of heart failure were analysed. In addition, individual records of heart failure patients from a subset of 7 hospitals were analysed to estimate the frequency and timing of readmissions. RESULTS: The total number of discharges for men increased from 7377 in 1980 to 13 022 in 1993, and for women from 7064 to 12 944. From 1980 through 1993 age adjusted discharge rates rose 48% for men and 40% for women. Age adjusted in-hospital mortality for heart failure decreased from 19% in 1980 to 15% in 1993. For all age groups in-hospital mortality for men was higher than for women. The mean length of hospital admissions in 1993 was 14.0 days for men and 16.4 days for women. A review of individual patient records from a 6.3% sample of all hospital admissions in the Netherlands indicated that within a 2 year period 18% of the heart failure patients were admitted more than once and 5% more than twice. CONCLUSIONS: For both men and women a pronounced increase in age adjusted discharge rates for heart failure was observed in the Netherlands from 1980 to 1993. Readmissions were a prominent feature among heart failure patients. Higher survival rates after acute myocardial infarction and the longer survival of patients with heart disease, including heart failure may have contributed to the observed increase. The importance of advances in diagnostic tools and of possible changes in admission policy remain uncertain. PMID:8944582

  13. Risk Analysis and Prediction of Floor Failure Mechanisms at Longwall Face in Parvadeh-I Coal Mine using Rock Engineering System (RES)

    NASA Astrophysics Data System (ADS)

    Aghababaei, Sajjad; Saeedi, Gholamreza; Jalalifar, Hossein

    2016-05-01

    The floor failure at the longwall face decreases productivity and safety, increases operating costs, and causes other serious problems. In the Parvadeh-I coal mine, timber is used to prevent the puncture of the powered support base into the floor. In this paper, a rock engineering system (RES)-based model is presented to evaluate the risk of floor failure mechanisms at the longwall face of the E2 and W1 panels. The model is used to determine the most probable floor failure mechanism, effective factors, damaged regions and remedial actions. The analysis shows that soft floor failure is the dominant floor failure mechanism at the Parvadeh-I coal mine. The average vulnerability index (VI) for the soft, buckling and compressive floor failure mechanisms was estimated at 52, 43 and 30, respectively, across both panels. With the critical VI for the soft floor failure mechanism set at 54, the percentage of regions with VIs beyond the critical value in the E2 and W1 panels is 65.5% and 30%, respectively. The extent of the damaged regions indicates that the excess timber used to prevent puncture of the weak floor below the powered support base amounts to 4,180,739 kg. The RES outputs show that the setting and yielding load of the powered supports, the length of the face, water present at the face, the geometry of the powered supports, changes to the cutting pattern at the longwall face, and limiting the panels to damaged regions with supercritical VIs could be considered to control soft floor failure in this mine. The results of this research can be used as a tool to identify damaged regions prior to mining at longwall panels under similar conditions.
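
    An RES-style vulnerability index can be pictured as a weighted sum of parameter ratings scaled to 0-100, as in the sketch below. The weights and ratings are invented, not the paper's interaction-matrix values.

    ```python
    import numpy as np

    # Invented weights (from an assumed RES interaction matrix, summing to 1)
    # and 0-4 parameter ratings for one region of a longwall panel
    weights = np.array([0.18, 0.15, 0.12, 0.20, 0.10, 0.25])
    ratings = np.array([3, 2, 4, 3, 1, 2])       # 4 = worst case

    VI = 100.0 * np.dot(weights, ratings) / 4.0  # scale to 0-100
    print(f"vulnerability index = {VI:.0f} (critical VI for soft floor failure: 54)")
    ```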

  14. Clinical classification of adult patients with chronic intestinal failure due to benign disease: An international multicenter cross-sectional survey.

    PubMed

    Pironi, Loris; Konrad, Denise; Brandt, Chrisoffer; Joly, Francisca; Wanten, Geert; Agostini, Federica; Chambrier, Cecile; Aimasso, Umberto; Zeraschi, Sarah; Kelly, Darlene; Szczepanek, Kinga; Jukes, Amelia; Di Caro, Simona; Theilla, Miriam; Kunecki, Marek; Daniels, Joanne; Serlie, Mireille; Poullenot, Florian; Wu, Jian; Cooper, Sheldon C; Rasmussen, Henrik H; Compher, Charlene; Seguy, David; Crivelli, Adriana; Pagano, Maria C; Hughes, Sarah-Jane; Guglielmi, Francesco W; Kozjek, Nada Rotovnik; Schneider, Stéphane M; Gillanders, Lyn; Ellegard, Lars; Thibault, Ronan; Matras, Przemysław; Zmarzly, Anna; Matysiak, Konrad; Van Gossum, Andrè; Forbes, Alastair; Wyer, Nicola; Taus, Marina; Virgili, Nuria M; O'Callaghan, Margie; Chapman, Brooke; Osland, Emma; Cuerda, Cristina; Sahin, Peter; Jones, Lynn; Lee, Andre D W; Bertasi, Valentino; Orlandoni, Paolo; Izbéki, Ferenc; Spaggiari, Corrado; Díez, Marta Bueno; Doitchinova-Simeonova, Maryana; Garde, Carmen; Serralde-Zúñiga, Aurora E; Olveira, Gabriel; Krznaric, Zeljko; Czako, Laszlo; Kekstas, Gintautas; Sanz-Paris, Alejandro; Jáuregui, Estrella Petrina; Murillo, Ana Zugasti; Schafer, Eszter; Arends, Jann; Suárez-Llanos, José P; Shaffer, Jon; Lal, Simon

    2018-04-01

    The aim of the study was to evaluate the applicability of the ESPEN 16-category clinical classification of chronic intestinal failure, based on patients' intravenous supplementation (IVS) requirements for energy and fluids, and to evaluate factors associated with those requirements. ESPEN members were invited to participate through ESPEN Council representatives. Participating centers enrolled adult patients requiring home parenteral nutrition for chronic intestinal failure on March 1st 2015. The following patient data were recorded though a structured database: sex, age, body weight and height, intestinal failure mechanism, underlying disease, IVS volume and energy need. Sixty-five centers from 22 countries enrolled 2919 patients with benign disease. One half of the patients were distributed in 3 categories of the ESPEN clinical classification. 9% of patients required only fluid and electrolyte supplementation. IVS requirement varied considerably according to the pathophysiological mechanism of intestinal failure. Notably, IVS volume requirement represented loss of intestinal function better than IVS energy requirement. A simplified 8 category classification of chronic intestinal failure was devised, based on two types of IVS (either fluid and electrolyte alone or parenteral nutrition admixture containing energy) and four categories of volume. Patients' IVS requirements varied widely, supporting the need for a tool to homogenize patient categorization. This study has devised a novel, simplified eight category IVS classification for chronic intestinal failure that will prove useful in both the clinical and research setting when applied together with the underlying pathophysiological mechanism of the patient's intestinal failure. Copyright © 2017 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  15. SU-E-T-627: Failure Modes and Effect Analysis for Monthly Quality Assurance of Linear Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, J; Xiao, Y; Wang, J

    2014-06-15

    Purpose: To develop and implement a failure mode and effects analysis (FMEA) of the routine monthly Quality Assurance (QA) tests (physical tests part) of a linear accelerator. Methods: A systematic failure mode and effects analysis was performed for the monthly QA procedures. A detailed process tree of monthly QA was created and potential failure modes were defined. Each failure mode may have many influencing factors. For each factor, a risk probability number (RPN) was calculated as the product of the probability of occurrence (O), the severity of effect (S), and the detectability of the failure (D). RPN scores range from 1 to 1000, with higher scores indicating stronger correlation to a given influencing factor of a failure mode. Five medical physicists in our institution were responsible for discussing and defining the O, S and D values. Results: Fifteen possible failure modes were identified; the RPN scores of their influencing factors ranged from 8 to 150, and a checklist of the FMEA for monthly QA was drawn up. The system showed consistent and accurate response to erroneous conditions. Conclusion: Influencing factors with an RPN greater than 50 were considered highly correlated factors of a given out-of-tolerance monthly QA test. FMEA is a fast and flexible tool for developing and implementing a quality management (QM) framework for monthly QA, which improved the efficiency of our QA team. The FMEA work may incorporate more quantification and monitoring functions in future.
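
    For concreteness, the RPN bookkeeping amounts to the product O × S × D per influencing factor, flagged against the cutoff of 50 reported above. The failure modes and scores in the sketch below are invented, not the paper's data.

    ```python
    # Minimal sketch of RPN scoring for monthly-QA failure modes (illustrative)
    failure_modes = {
        "output constancy drift unnoticed": (4, 7, 3),   # (O, S, D)
        "laser/ODI misalignment":           (3, 5, 2),
        "MLC positional error":             (2, 8, 4),
    }

    for name, (O, S, D) in failure_modes.items():
        rpn = O * S * D                         # range 1..1000
        flag = "review" if rpn > 50 else "ok"   # cutoff of 50 from the paper
        print(f"{name:<34} RPN={rpn:<4} {flag}")
    ```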

  16. Practical, transparent prospective risk analysis for the clinical laboratory.

    PubMed

    Janssens, Pim Mw

    2014-11-01

    Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effects Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of applying PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type across the process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis' and the failure type most frequently rated as suboptimal was 'identification error'. The PRA method designed here is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables make it easy to perform, practical and transparent. © The Author(s) 2014.

  17. 30 CFR 250.1402 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Definitions. Terms used in this subpart have the following meaning: Case file means an MMS document file... fine. It is an MMS regulatory enforcement tool used in addition to Notices of Incidents of... employee assigned to review case files and assess civil penalties. Violation means failure to comply with...

  18. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
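
    As a flavor of the approach, the sketch below propagates uncertain basic-event probabilities through a toy two-gate fault tree by Monte Carlo simulation. The tree structure and beta distributions are invented and far simpler than the Gothenburg-scale system analyzed in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000   # Monte Carlo samples

    # Invented basic events with uncertain probabilities (beta distributions):
    p_source    = rng.beta(2, 200, N)   # raw-water contamination
    p_treatment = rng.beta(2, 500, N)   # treatment barrier failure
    p_pipes     = rng.beta(3, 300, N)   # distribution failure (no delivery)

    # AND gate: quality failure needs both barriers to fail (independence assumed);
    # OR gate: top event is quality failure or quantity failure.
    p_quality = p_source * p_treatment
    p_top     = 1.0 - (1.0 - p_quality) * (1.0 - p_pipes)

    print(f"median top-event probability: {np.median(p_top):.4f}")
    print(f"90% interval: {np.percentile(p_top, [5, 95]).round(4)}")
    ```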

  19. Selection of reference genes for gene expression studies in heart failure for left and right ventricles.

    PubMed

    Li, Mengmeng; Rao, Man; Chen, Kai; Zhou, Jianye; Song, Jiangping

    2017-07-15

    Real-time quantitative reverse transcriptase-PCR (qRT-PCR) is a feasible tool for determining gene expression profiles, but the accuracy and reliability of the results depend on the stable expression of the selected housekeeping genes across samples. To date, research on stable housekeeping genes in human heart failure samples is rare. Moreover, the effect of heart failure on the expression of housekeeping genes in the right and left ventricles is yet to be studied. We therefore aim to provide stable housekeeping genes for both ventricles in heart failure and normal heart samples. In this study, we selected seven commonly used housekeeping genes as candidates. Using qRT-PCR, the expression levels of ACTB, RAB7A, GAPDH, REEP5, RPL5, PSMB4 and VCP in eight heart failure and four normal heart samples were assessed. The stability of the candidate housekeeping genes was evaluated with the geNorm and NormFinder software. GAPDH showed the least variation in all heart samples. The results also indicated differences in gene expression between the left and right ventricles in heart failure. GAPDH had the highest expression stability in both heart failure and normal heart samples. We also propose using different sets of housekeeping genes for the left and right ventricles: the combination of RPL5, GAPDH and PSMB4 is suitable for the right ventricle, and the combination of GAPDH, REEP5 and RAB7A for the left ventricle. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. A comparison of stereology, structural rigidity and a novel 3D failure surface analysis method in the assessment of torsional strength and stiffness in a mouse tibia fracture model.

    PubMed

    Wright, David A; Nam, Diane; Whyne, Cari M

    2012-08-31

    In attempting to develop non-invasive, image-based measures of the biomechanical integrity of healing fractures, traditional μCT-based measurements have been limited. This study presents the development and evaluation of a tool for assessing fracture callus mechanical properties through the geometric characteristics of the fracture callus, specifically along the surface of failure identified during destructive mechanical testing. Fractures were created in the tibias of ten male mice and subjected to μCT imaging and biomechanical torsion testing. Failure surface analysis, along with previously described image-based measures, was calculated from the μCT image data and correlated with mechanical strength and stiffness. Three-dimensional measures along the surface of failure, specifically the surface area and torsional rigidity of bone, were shown to correlate significantly with mechanical strength and stiffness. The surface area of bone along the failure surface also exhibited stronger correlations with both strength and stiffness than measures of the average and minimum torsional rigidity of the entire callus. Failure surfaces observed in this study were generally oriented at 45° to the long axis of the bone, and were not contained exclusively within the callus. This work represents a proof-of-concept study and shows the potential utility of failure surface analysis in the assessment of fracture callus stability. Copyright © 2012 Elsevier Ltd. All rights reserved.
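
    A toy version of the torsional-rigidity measure can be computed directly from a segmented μCT slice as the polar second moment of the bone voxels, J = Σ r² dA. The pixel size and geometry below are invented, and the study's actual structural-rigidity computation is more elaborate.

    ```python
    import numpy as np

    def polar_moment(mask, pixel_mm=0.02):
        """Polar second moment of area of one segmented uCT slice:
        J = sum(r^2 dA) over bone voxels, about the section centroid (mm^4)."""
        ys, xs = np.nonzero(mask)
        cx, cy = xs.mean(), ys.mean()
        r2 = ((xs - cx) ** 2 + (ys - cy) ** 2) * pixel_mm ** 2
        return np.sum(r2) * pixel_mm ** 2   # each voxel contributes dA = pixel^2

    # Toy slice: an annulus of "bone" voxels standing in for a callus section
    yy, xx = np.mgrid[:200, :200]
    r = np.hypot(xx - 100, yy - 100)
    annulus = (r > 40) & (r < 55)
    print(f"J = {polar_moment(annulus):.3f} mm^4")  # analytic: pi/2*(1.1^4-0.8^4) = 1.66
    ```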

  1. EVALUATION OF SAFETY IN A RADIATION ONCOLOGY SETTING USING FAILURE MODE AND EFFECTS ANALYSIS

    PubMed Central

    Ford, Eric C.; Gaudette, Ray; Myers, Lee; Vanderver, Bruce; Engineer, Lilly; Zellars, Richard; Song, Danny Y.; Wong, John; DeWeese, Theodore L.

    2013-01-01

    Purpose Failure mode and effects analysis (FMEA) is a widely used tool for prospectively evaluating safety and reliability. We report our experiences in applying FMEA in the setting of radiation oncology. Methods and Materials We performed an FMEA analysis for our external beam radiation therapy service, which consisted of the following tasks: (1) create a visual map of the process; (2) identify possible failure modes and assign a risk probability number (RPN) to each failure mode based on tabulated scores for the severity, frequency of occurrence, and detectability, each on a scale of 1 to 10; and (3) identify improvements that are both feasible and effective. RPN scores can span a range of 1 to 1000, with higher scores indicating the relative importance of a given failure mode. Results Our process map consisted of 269 different nodes. We identified 127 possible failure modes with RPN scores ranging from 2 to 160. Fifteen of the top-ranked failure modes, representing RPN scores of 75 or more, were considered for process improvements. These specific improvement suggestions were incorporated into our practice, with review and implementation by each department team responsible for the process. Conclusions The FMEA technique provides a systematic method for finding vulnerabilities in a process before they result in an error. The FMEA framework can naturally incorporate further quantification and monitoring. A general-use system for incident and near-miss reporting would be useful in this regard. PMID:19409731

  2. Comparison of a computer assisted learning program to standard education tools in hospitalized heart failure patients.

    PubMed

    Dilles, Ann; Heymans, Valerie; Martin, Sandra; Droogné, Walter; Denhaerynck, Kris; De Geest, Sabina

    2011-09-01

    Education, coaching and guidance of patients are important components of heart failure management. The aim of this study was to compare a computer assisted learning (CAL) program with standard education (brochures and oral information from nurses) on knowledge and self-care in hospitalized heart failure patients. Satisfaction with the CAL program was also assessed in the intervention group. A quasi-experimental design was used, with a convenience sample of in-hospital heart failure patients. Knowledge and self-care were measured using the Dutch Heart Failure Knowledge Scale and the European Heart Failure Self-care Behaviour Scale at hospital admission, at discharge and after a 3-month follow-up. Satisfaction with the CAL program was assessed at hospital discharge using a satisfaction questionnaire. Changes in knowledge and self-care over time, within and between groups, were tested using a mixed regression model. Of 65 heart failure patients screened, 37 were included in the study: 21 in the CAL group and 16 in the usual care group. No significant differences in knowledge (p=0.65) or self-care (p=0.40) were found between groups. However, both variables improved significantly over time in each study group (p<0.0001). Both educational strategies increased knowledge and improved self-care. The design did not allow the effects of standard education (usual care) to be isolated from those of CAL. Economic and clinical outcomes of both methods should be evaluated in further research. Copyright © 2010. Published by Elsevier B.V.

  3. A risk-based approach to sanitary sewer pipe asset management.

    PubMed

    Baah, Kelly; Dubey, Brajesh; Harvey, Richard; McBean, Edward

    2015-02-01

    Wastewater collection systems are an important component of proper management of wastewater to prevent environmental and human health implications from mismanagement of anthropogenic waste. Due to aging and inadequate asset management practices, the wastewater collection assets of many cities around the globe are in a state of rapid decline and in need of urgent attention. Risk management is a tool which can help prioritize resources to better manage and rehabilitate wastewater collection systems. In this study, a risk matrix and a weighted sum multi-criteria decision-matrix are used to assess the consequence and risk of sewer pipe failure for a mid-sized city, using ArcGIS. The methodology shows that six percent of the uninspected sewer pipe assets of the case study have a high consequence of failure while four percent of the assets have a high risk of failure and hence provide priorities for inspection. A map incorporating risk of sewer pipe failure and consequence is developed to facilitate future planning, rehabilitation and maintenance programs. The consequence of failure assessment also includes a novel failure impact factor which captures the effect of structurally defective stormwater pipes on the failure assessment. The methodology recommended in this study can serve as a basis for future planning and decision making and has the potential to be universally applied by municipal sewer pipe asset managers globally to effectively manage the sanitary sewer pipe infrastructure within their jurisdiction. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Using Performance Tools to Support Experiments in HPC Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naughton, III, Thomas J; Boehm, Swen; Engelmann, Christian

    2014-01-01

    The high performance computing (HPC) community is working to address fault tolerance and resilience concerns for current and future large scale computing platforms. This is driving enhancements in programming environments, specifically research on enhancing message passing libraries to support fault tolerant computing capabilities. The community has also recognized that tools for resilience experimentation are greatly lacking. However, we argue that there are several parallels between performance tools and resilience tools. As such, we believe the rich set of HPC performance-focused tools can be extended (repurposed) to benefit the resilience community. In this paper, we describe the initial motivation to leverage standard HPC performance analysis techniques to aid in developing diagnostic tools to assist fault tolerance experiments for HPC applications. These diagnosis procedures help to provide context for the system when errors (failures) occur. We describe our initial work in leveraging an MPI performance trace tool to assist in providing global context during fault injection experiments. Such tools will assist the HPC resilience community as they extend existing and new application codes to support fault tolerance.

  5. Microstructure and Mechanical Properties of Laser Clad and Post-cladding Tempered AISI H13 Tool Steel

    NASA Astrophysics Data System (ADS)

    Telasang, Gururaj; Dutta Majumdar, Jyotsna; Wasekar, Nitin; Padmanabham, G.; Manna, Indranil

    2015-05-01

    This study reports a detailed investigation of the microstructure and mechanical properties (wear resistance and tensile strength) of hardened and tempered AISI H13 tool steel substrate following laser cladding with AISI H13 tool steel powder, in the as-clad state and after post-cladding conventional bulk isothermal tempering [at 823 K (550 °C) for 2 hours]. Laser cladding was carried out on the AISI H13 tool steel substrate using a 6 kW continuous wave diode laser coupled with a fiber delivering an energy density of 133 J/mm2, equipped with a co-axial nozzle feeding powder at a rate of 13.3 × 10-3 g/mm2. The laser clad zone comprises martensite, retained austenite, and carbides, with an average hardness of 600 to 650 VHN. Subsequent isothermal tempering converted the microstructure into tempered martensite with a uniform dispersion of carbides and a hardness of 550 to 650 VHN. Interestingly, laser cladding introduced a residual compressive stress of 670 ± 15 MPa, which reduced to 580 ± 20 MPa following isothermal tempering. Micro-tensile testing of specimens machined from the clad zone transverse to the cladding direction showed high strength but failure in a brittle mode. On the other hand, specimens sectioned from the clad zone parallel (longitudinal) to the cladding direction, tested before and after post-cladding tempering, recorded lower strength but ductile failure with 4.7 and 8 pct elongation, respectively. The wear resistance of the laser clad and post-cladding tempered samples (evaluated by fretting wear testing) was superior to that of conventionally hardened and tempered AISI H13 tool steel.

  6. A new statistical methodology predicting chip failure probability considering electromigration

    NASA Astrophysics Data System (ADS)

    Sun, Ted

    In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM); the fundamental causes of EM and its manifestation in different materials are also presented. The new approach exploits the statistical nature of EM failure in order to assess overall EM risk, and includes within-die temperature variations taken from the chip's temperature map, extracted by an Electronic Design Automation (EDA) tool, to estimate the failure probability of a design. Both the power estimation and the thermal analysis are performed in the EDA flow. We first applied the traditional EM approach, with a single temperature across the entire chip, to a design involving 6 metal and 5 via layers; we then repeated the traditional analysis with a realistic temperature map. A comparison of the two results confirms that using a temperature map yields a less pessimistic estimate of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model incorporates scaling through the traditional Black equation and four major use conditions, and the results are within our expectations: the chip-level failure probability is higher (i) at higher use-condition frequencies for all use-condition voltages, and (ii) when a single temperature is considered instead of a temperature map across the chip. The thesis opens with an overall review of current design types, common flows, and the verification and reliability-checking steps used in the IC design industry. The scripting automation used to integrate the diverse EDA tools in this work is described in detail with several examples, and the complete code is included in the appendix. The thesis thus takes the reader from the automation of EDA tools to statistical data generation, from the nature of EM to the construction of the statistical model, and through the comparisons between the traditional and statistical EM analysis approaches.
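
    The statistical step can be pictured as Black's equation plus a weakest-link combination over a temperature map. The sketch below is a toy version under invented constants (A, n, Ea, the lognormal shape, and the segment temperatures are all placeholders), not the thesis's calibrated model.

    ```python
    import numpy as np
    from scipy import stats

    K_B = 8.617e-5  # Boltzmann constant, eV/K

    def mttf_black(j, T, A=1.3e-5, n=2.0, Ea=0.8):
        """Black's equation MTTF = A * j**(-n) * exp(Ea/(k*T));
        j in MA/cm^2, T in kelvin. A, n, Ea are invented fit constants."""
        return A * j ** (-n) * np.exp(Ea / (K_B * T))

    t = 10 * 8760.0                                        # 10-year mission, hours
    j = 1.0                                                # MA/cm^2
    temps = np.array([358.0, 363.0, 366.0, 371.0, 377.0])  # per-segment map (invented)
    sigma = 0.4                                            # lognormal TTF shape

    # Weakest link: the chip survives only if every segment survives (independence assumed)
    p_seg = stats.lognorm.cdf(t, sigma, scale=mttf_black(j, temps))
    p_map = 1.0 - np.prod(1.0 - p_seg)
    p_flat = 1.0 - (1.0 - stats.lognorm.cdf(t, sigma,
                    scale=mttf_black(j, temps.max()))) ** temps.size
    print(f"P(fail, temperature map)     = {p_map:.2e}")
    print(f"P(fail, single worst-case T) = {p_flat:.2e}  # more pessimistic")
    ```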

  7. Failure mode and effects analysis: a comparison of two common risk prioritisation methods.

    PubMed

    McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L

    2016-05-01

    Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based method, and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified methods (high risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low' risk, 30 as 'medium' risk and 22 as 'high' risk. The traditional method yielded 24 failures with an RPN ≥300, of which 22 were identified as high risk by the simplified method (92% agreement). The top 20% of CI (≥60) included 12 failures, of which six were designated as high risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not result in the same degree of discrimination in the ranking of failures offered by the traditional method. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  8. Systems Thinking, Lean Production and Action Learning

    ERIC Educational Resources Information Center

    Seddon, John; Caulkin, Simon

    2007-01-01

    Systems thinking underpins "lean" management and is best understood through action-learning as the ideas are counter-intuitive. The Toyota Production System is just that--a system; the failure to appreciate that starting-place and the advocacy of "tools" leads many to fail to grasp what is, without doubt, a significant…

  9. Finite element simulation of structural performance on flexible pavements with stabilized base/treated sub-base materials under accelerated loading : research project capsule.

    DOT National Transportation Integrated Search

    2008-12-01

    PROBLEM: The full-scale accelerated pavement testing (APT) provides a unique tool for pavement : engineers to directly collect pavement performance and failure data under heavy : wheel loading. However, running a full-scale APT experiment is very exp...

  10. "Heading up the Street": Localised Opportunities for Shared Constructions of Knowledge

    ERIC Educational Resources Information Center

    Lee, Carol D.; Majors, Yolanda J.

    2003-01-01

    Success and failure in school is contingent upon one's ability to regulate and situate identities, utilise culturally-developed semiotic tools and negotiate models of meaning in shared social activity. However, many language minority students lack such success, struggling with conflicts between their primary-and community-based identities, and…

  11. Educational Reform from the Perspective of the Student

    ERIC Educational Resources Information Center

    Vasquez-Martinez, Claudio-Rafael; Gonzalez-Gonzalez, Felipe; Cardona-Toro, Jose-Gerardo; Díaz-Renteria, María-Guadalupe; Alvarez, Maria-Ines; Rendon, Hector; Valero, Isabel; Morfin, Maria; Alvarez, Miguel

    2016-01-01

    Educational policies are tools that the state prepares to generate conditions that allow access to and retention in schools, with the consequent reduction in school failure, increasing the external yield and fulfilling the expectations of the internal agents (teachers, students, school managers), external users (families, society, employers,…

  12. An improved method for risk evaluation in failure modes and effects analysis of CNC lathe

    NASA Astrophysics Data System (ADS)

    Rachieru, N.; Belu, N.; Anghel, D. C.

    2015-11-01

    Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by multiplying the crisp values of the risk factors, such as the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O and D. A new risk assessment system based on fuzzy set theory and fuzzy rule base theory is applied to assess and rank the risks associated with failure modes that could arise in the operation of a Turn 55 CNC lathe. Two case studies demonstrate the methodology, drawing a parallel between the RPNs obtained by the traditional method and by fuzzy logic. The results show that the proposed approach can reduce duplicated RPN numbers and give a more accurate, reasonable risk assessment. As a result, the stability of the product and process can be assured.
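
    A much-reduced sketch of the fuzzy-RPN idea: ratings become triangular fuzzy numbers, the RPN becomes their (approximate) fuzzy product, and a centroid defuzzification gives a crisp rank. The paper's full system uses a fuzzy rule base and several membership shapes; the numbers here are illustrative only.

    ```python
    import numpy as np

    def tfn_mul(x, y):
        """Approximate product of two positive triangular fuzzy numbers (a, b, c)."""
        return tuple(np.multiply(x, y))

    def centroid(t):
        a, b, c = t
        return (a + b + c) / 3.0

    # Linguistic ratings as triangular fuzzy numbers on a 1-10 scale (illustrative)
    LOW, MODERATE, HIGH = (1, 2, 4), (3, 5, 7), (6, 8, 10)

    # One lathe failure mode rated by experts: S = HIGH, O = MODERATE, D = LOW
    fuzzy_rpn = tfn_mul(tfn_mul(HIGH, MODERATE), LOW)
    print(f"fuzzy RPN support = {fuzzy_rpn}, defuzzified rank = {centroid(fuzzy_rpn):.0f}")
    ```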

  13. Scintillation Breakdowns in Chip Tantalum Capacitors

    NASA Technical Reports Server (NTRS)

    Teverovsky, Alexander

    2008-01-01

    Scintillations in solid tantalum capacitors are momentary local breakdowns terminated by self-healing, or by conversion of the manganese oxide cathode to a high-resistive state. This conversion effectively caps the defective area of the tantalum pentoxide dielectric and prevents short-circuit failures. Typically, this type of breakdown has no immediate catastrophic consequences and is often considered a nuisance rather than a failure. Scintillation breakdowns likely do not affect failures of parts under surge current conditions, and so-called "proofing" of tantalum chip capacitors, a controlled exposure of the part after soldering to voltages slightly above the operating voltage to verify that possible scintillations are self-healed, has been shown to improve the quality of the parts. However, no in-depth studies of the effect of scintillations on the reliability of tantalum capacitors have been performed so far. KEMET uses scintillation breakdown testing as a tool for assessing process improvements and for comparing the quality of different manufacturing lots. Nevertheless, the relationship between failures and scintillation breakdowns is not clear, and this test is not considered suitable for lot acceptance testing. In this work, scintillation breakdowns in different military-grade and commercial tantalum capacitors were characterized and related to the rated voltages and to life test failures. A model for assessing times to failure, based on distributions of breakdown voltages, and the accelerating factors of life testing are discussed.

  14. ADM guidance-Ceramics: guidance to the use of fractography in failure analysis of brittle materials.

    PubMed

    Scherrer, Susanne S; Lohbauer, Ulrich; Della Bona, Alvaro; Vichi, Alessandro; Tholey, Michael J; Kelly, J Robert; van Noort, Richard; Cesar, Paulo Francisco

    2017-06-01

    To provide background information and guidance on how to use fractography accurately, a powerful tool for failure analysis of dental ceramic structures. An extended palette of qualitative and quantitative fractography is provided, both for in vivo and in vitro fracture surface analyses. As visual support, this guidance document provides micrographs of typical critical ceramic processing flaws, differentiating between pre- versus post-sintering cracks, grinding-damage-related failures, occlusal contact wear origins, and failures due to surface degradation. The documentation emphasizes good labeling of crack features, precise indication of the direction of crack propagation (dcp), identification of the fracture origin, and the use of fractographic photomontages of critical flaws or flaw labeling on strength data graphics. A compilation of recommendations for specific applications of fractography in Dentistry is also provided. This guidance document will contribute to a more accurate use of fractography and help researchers better identify, describe and understand the causes of failure, in both clinical and laboratory-scale situations. If adequately performed at a large scale, fractography will assist in optimizing the methods of processing and designing restorative materials and components. Clinical failures may be better understood and consequently reduced by sending out the correct message regarding the fracture origin in clinical trials. Copyright © 2017 The Academy of Dental Materials. All rights reserved.

  15. The influence of microstructure on the probability of early failure in aluminum-based interconnects

    NASA Astrophysics Data System (ADS)

    Dwyer, V. M.

    2004-09-01

    For electromigration in short aluminum interconnects terminated by tungsten vias, the well known "short-line" effect applies. In a similar manner, for longer lines, early failure is determined by a critical value Lcrit for the length of polygranular clusters. Any cluster shorter than Lcrit is "immortal" on the time scale of early failure where the figure of merit is not the standard t50 value (the time to 50% failures), but rather the total probability of early failure, Pcf. Pcf is a complex function of current density, linewidth, line length, and material properties (the median grain size d50 and grain size shape factor σd). It is calculated here using a model based around the theory of runs, which has proved itself to be a useful tool for assessing the probability of extreme events. Our analysis shows that Pcf is strongly dependent on σd, and a change in σd from 0.27 to 0.5 can cause an order of magnitude increase in Pcf under typical test conditions. This has implications for the web-based two-dimensional grain-growth simulator MIT/EmSim, which generates grain patterns with σd=0.27, while typical as-patterned structures are better represented by a σd in the range 0.4 - 0.6. The simulator will consequently overestimate interconnect reliability due to this particular electromigration failure mode.
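
    The role of σd can be illustrated with a crude Monte Carlo stand-in for the runs-theory calculation: draw lognormal grain sizes, treat any run of sub-linewidth grains as a polygranular cluster, and count lines containing a cluster longer than Lcrit. All parameter values below are invented and the spanning criterion is deliberately simplistic; this is not Dwyer's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def p_early_failure(L=200.0, w=0.5, d50=0.6, sigma_d=0.27,
                        L_crit=2.0, trials=5000):
        """Monte Carlo stand-in: probability that a line of length L (um) contains
        a run of sub-linewidth grains (a 'polygranular cluster') longer than L_crit.
        Grain diameters are lognormal with median d50 and shape sigma_d."""
        fails = 0
        for _ in range(trials):
            sizes = rng.lognormal(np.log(d50), sigma_d, int(3 * L / d50))
            sizes = sizes[np.cumsum(sizes) <= L]    # fill the line
            run = longest = 0.0
            for s in sizes:
                run = run + s if s < w else 0.0     # s >= w: grain spans the line
                longest = max(longest, run)
            fails += longest > L_crit
        return fails / trials

    for sd in (0.27, 0.5):
        print(f"sigma_d = {sd}: Pcf = {p_early_failure(sigma_d=sd):.3f}")
    ```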

  16. Static Properties of Fibre Metal Laminates

    NASA Astrophysics Data System (ADS)

    Hagenbeek, M.; van Hengel, C.; Bosker, O. J.; Vermeeren, C. A. J. R.

    2003-07-01

    In this article a brief overview of the static properties of Fibre Metal Laminates is given. Starting with the stress-strain relation, an effective calculation tool for uniaxial stress-strain curves is given; the method is valid for all Glare types. The Norris failure model is described in combination with a Metal Volume Fraction approach, leading to a useful tool for predicting allowable blunt notch strength. The Volume Fraction approach is also useful for the shear yield strength of Fibre Metal Laminates; shear yield properties are measured using the Iosipescu test.
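
    The Metal Volume Fraction idea reduces to a rule of mixtures, as in the sketch below. The layer thicknesses and strength contributions are invented placeholders, not Glare design allowables.

    ```python
    # Metal Volume Fraction (MVF) rule-of-mixtures sketch for a Glare-type laminate
    t_al, n_al = 0.3, 3     # aluminium layer thickness (mm) and layer count, assumed
    t_pp, n_pp = 0.25, 2    # glass-fibre prepreg layer thickness and count, assumed

    mvf = (n_al * t_al) / (n_al * t_al + n_pp * t_pp)

    sigma_al  = 440.0       # blunt-notch strength contribution of the metal (MPa), assumed
    sigma_fib = 300.0       # contribution of the fibre layers (MPa), assumed
    sigma_bn = mvf * sigma_al + (1 - mvf) * sigma_fib
    print(f"MVF = {mvf:.2f}, predicted blunt-notch strength = {sigma_bn:.0f} MPa")
    ```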

  17. The cost-effectiveness of monitoring strategies for antiretroviral therapy of HIV infected patients in resource-limited settings: software tool.

    PubMed

    Estill, Janne; Salazar-Vizcaya, Luisa; Blaser, Nello; Egger, Matthias; Keiser, Olivia

    2015-01-01

    The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICER). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex-and age-distributions and unit costs.

  18. The Cost-Effectiveness of Monitoring Strategies for Antiretroviral Therapy of HIV Infected Patients in Resource-Limited Settings: Software Tool

    PubMed Central

    Estill, Janne; Salazar-Vizcaya, Luisa; Blaser, Nello; Egger, Matthias; Keiser, Olivia

    2015-01-01

    Background The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. Methods We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICER). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. Results Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. Conclusion Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex-and age-distributions and unit costs. PMID:25793531
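
    The ICER arithmetic underlying these comparisons is simply the incremental cost divided by the incremental DALYs averted between successively more effective strategies. The sketch below uses invented costs and effects and omits the pruning of dominated strategies from the frontier.

    ```python
    # Invented lifetime costs (US$) and DALYs averted per strategy; dominance
    # pruning is omitted, so this assumes the list is already a frontier.
    strategies = {
        "no 2nd-line":       (4000, 0.0),
        "clinical":          (7000, 1.8),
        "CD4":               (9500, 2.9),
        "POC-VL 12-monthly": (11000, 4.2),
    }

    ordered = sorted(strategies.items(), key=lambda kv: kv[1][1])  # by effect
    prev_c, prev_e = ordered[0][1]
    for name, (c, e) in ordered[1:]:
        icer = (c - prev_c) / (e - prev_e)   # incremental $ per DALY averted
        print(f"{name:<18} ICER = ${icer:,.0f}/DALY averted")
        prev_c, prev_e = c, e
    ```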

  19. Acute kidney injury in the ICU: from injury to recovery: reports from the 5th Paris International Conference.

    PubMed

    Bellomo, Rinaldo; Ronco, Claudio; Mehta, Ravindra L; Asfar, Pierre; Boisramé-Helms, Julie; Darmon, Michael; Diehl, Jean-Luc; Duranteau, Jacques; Hoste, Eric A J; Olivier, Joannes-Boyau; Legrand, Matthieu; Lerolle, Nicolas; Malbrain, Manu L N G; Mårtensson, Johan; Oudemans-van Straaten, Heleen M; Parienti, Jean-Jacques; Payen, Didier; Perinel, Sophie; Peters, Esther; Pickkers, Peter; Rondeau, Eric; Schetz, Miet; Vinsonneau, Christophe; Wendon, Julia; Zhang, Ling; Laterre, Pierre-François

    2017-12-01

    The French Intensive Care Society organized its yearly Paris International Conference in intensive care on June 18-19, 2015. The main purpose of this meeting is to gather the best experts in the field in order to provide the highest quality update on a chosen topic. In 2015, the selected theme was: "Acute Renal Failure in the ICU: from injury to recovery." The conference program covered multiple aspects of renal failure, including epidemiology, diagnosis, treatment and kidney support systems, prognosis and recovery, together with acute renal failure in specific settings. The present report provides a summary of every presentation, including the key message and references, and is structured in eight sections: (a) diagnosis and evaluation, (b) old and new diagnostic tools, (c) old and new treatments, (d) renal replacement therapy and management, (e) acute renal failure as a witness of other conditions, (f) prognosis and recovery, (g) extracorporeal purification beyond the kidney, (h) the use of biomarkers in clinical practice. http://www.srlf.org/5th-paris-international-conference-jeudi-18-et-vendredi-19-juin-2015/

  20. 3D Printing of Materials with Tunable Failure via Bioinspired Mechanical Gradients.

    PubMed

    Kokkinis, Dimitri; Bouville, Florian; Studart, André R

    2018-05-01

    Mechanical gradients are useful to reduce strain mismatches in heterogeneous materials and thus prevent premature failure of devices in a wide range of applications. While complex graded designs are a hallmark of biological materials, gradients in manmade materials are often limited to 1D profiles due to the lack of adequate fabrication tools. Here, a multimaterial 3D-printing platform is developed to fabricate elastomer gradients spanning three orders of magnitude in elastic modulus and used to investigate the role of various bioinspired gradient designs on the local and global mechanical behavior of synthetic materials. The digital image correlation data and finite element modeling indicate that gradients can be effectively used to manipulate the stress state and thus circumvent the weakening effect of defect-rich interfaces or program the failure behavior of heterogeneous materials. Implementing this concept in materials with bioinspired designs can potentially lead to defect-tolerant structures and to materials whose tunable failure facilitates repair of biomedical implants, stretchable electronics, or soft robotics. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Hybrid neural intelligent system to predict business failure in small-to-medium-size enterprises.

    PubMed

    Borrajo, M Lourdes; Baruque, Bruno; Corchado, Emilio; Bajo, Javier; Corchado, Juan M

    2011-08-01

    In recent years there has been a growing need for innovative tools that can help small-to-medium-sized enterprises predict business failure and financial crises. In this study we present a novel hybrid intelligent system aimed at monitoring the modus operandi of companies and predicting possible failures. The system is implemented as a neural-based multi-agent system that models the different actors of the companies as agents. Its core is a type of agent that incorporates a case-based reasoning system and automates the business control process and failure prediction. The stages of the case-based reasoning cycle are implemented by means of web services: the retrieval stage uses an innovative weighted-voting summarization of self-organizing-map ensembles, and the reuse stage is implemented by means of a radial basis function neural network, as sketched below. An initial prototype was developed, and results obtained for small and medium enterprises in a real scenario are presented.
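
    The retrieve/reuse pipeline can be pictured with a toy sketch. Here the SOM-ensemble retrieval stage is stood in for by plain nearest-neighbour search and the reuse stage by a hand-rolled Gaussian RBF blend, with fabricated financial data; this is not the authors' implementation:

        # Toy retrieve/reuse cycle of a case-based reasoning system. The
        # SOM-ensemble retrieval is replaced by nearest-neighbour search and
        # the reuse stage by a small Gaussian RBF blend; all data are fake.
        import numpy as np

        def retrieve(case_base, query, k=3):
            """Return the k stored cases closest to the query vector."""
            d = np.linalg.norm(case_base["X"] - query, axis=1)
            idx = np.argsort(d)[:k]
            return case_base["X"][idx], case_base["y"][idx]

        def reuse_rbf(X_k, y_k, query, gamma=1.0):
            """Blend retrieved outcomes with Gaussian RBF weights at the query."""
            w = np.exp(-gamma * np.linalg.norm(X_k - query, axis=1) ** 2)
            return float(w @ y_k / w.sum())  # weighted failure-risk estimate

        # Toy case base: financial-ratio vectors and observed failure labels.
        rng = np.random.default_rng(0)
        case_base = {"X": rng.normal(size=(100, 5)), "y": rng.integers(0, 2, 100)}
        query = rng.normal(size=5)
        X_k, y_k = retrieve(case_base, query)
        print("estimated failure risk:", reuse_rbf(X_k, y_k, query))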

  2. Distributed phased array architecture study

    NASA Technical Reports Server (NTRS)

    Bourgeois, Brian

    1987-01-01

    Variations in amplifiers and phase shifters can degrade antenna performance, depending also on the environmental conditions and the antenna array architecture. The implementation of distributed phased array hardware was studied with the aid of the DISTAR computer program as a simulation tool, which provides guidance for hardware implementation. Both hard and soft failures of the amplifiers in the T/R modules are modeled. Hard failures are catastrophic: no power is transmitted to the antenna elements. Noncatastrophic or soft failures are modeled with a modified Gaussian distribution; the resulting amplitude characteristics then determine the array excitation coefficients, while the phase characteristics take on a uniform distribution. Pattern characteristics such as antenna gain, half-power beamwidth, mainbeam phase errors, sidelobe levels, and beam-pointing errors were studied as functions of amplifier and phase shifter variations. General specifications for amplifier and phase shifter tolerances in various architecture configurations for C band and S band were determined.
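
    The failure model lends itself to a quick Monte Carlo illustration. In the sketch below, the element count, failure rates, and soft-failure spread are assumed values: hard failures zero an element's amplitude, soft failures draw degraded amplitudes from a clipped (modified) Gaussian, and phase errors are uniform, loosely following the description above:

        # Illustrative Monte Carlo of T/R-module failures in a uniform linear
        # array; all parameters (element count, failure rates, spreads) are
        # assumptions for the demo, not DISTAR's values.
        import numpy as np

        N, d = 32, 0.5  # elements, spacing in wavelengths
        theta = np.radians(np.linspace(-90, 90, 721))
        k = 2 * np.pi
        rng = np.random.default_rng(1)

        amp = np.ones(N)
        hard = rng.random(N) < 0.05              # hard failures: no output power
        amp[hard] = 0.0
        soft = (~hard) & (rng.random(N) < 0.10)  # soft failures: degraded gain
        amp[soft] *= np.clip(rng.normal(0.7, 0.1, soft.sum()), 0, 1)
        phase_err = rng.uniform(-np.pi / 16, np.pi / 16, N)  # uniform phase errors

        n = np.arange(N)
        af = (amp * np.exp(1j * phase_err)) @ np.exp(1j * k * d * np.outer(n, np.sin(theta)))
        gain_db = 20 * np.log10(np.abs(af) / N + 1e-12)  # relative to ideal array
        print("peak gain loss (dB):", -gain_db.max())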

  3. Riding the Right Wavelet: Quantifying Scale Transitions in Fractured Rocks

    NASA Astrophysics Data System (ADS)

    Rizzo, Roberto E.; Healy, David; Farrell, Natalie J.; Heap, Michael J.

    2017-12-01

    The mechanics of brittle failure is a well-described multiscale process that involves a rapid transition from distributed microcracks to localization along a single macroscopic rupture plane. However, considerable uncertainty exists regarding both the length scale at which this transition occurs and the underlying causes that prompt the shift from a distributed to a localized assemblage of cracks or fractures. For the first time, we used an image analysis tool, based on a two-dimensional continuous wavelet analysis, developed to investigate orientation changes at different scales in images of fracture patterns in faulted materials. We detected the abrupt change in the fracture pattern from distributed tensile microcracks to localized shear failure in a fracture network produced by triaxial deformation of a sandstone core plug. The presented method will contribute to our ability to unravel the physical processes underlying catastrophic rock failure, including the nucleation of earthquakes, landslides, and volcanic eruptions.
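
    The core idea, sweeping a directional wavelet over orientations (and scales) and tracking where the response energy peaks, can be sketched briefly. The anisotropic Morlet filter, single scale, and synthetic image below are illustrative assumptions, not the authors' tool:

        # Sketch of a 2D continuous wavelet orientation sweep: an anisotropic
        # Morlet wavelet at one scale is rotated to find the dominant fracture
        # direction. Image and wavelet parameters are illustrative only.
        import numpy as np

        def morlet2d(shape, scale, angle, k0=6.0):
            """Fourier-domain 2D Morlet wavelet at a given scale and orientation."""
            ny, nx = shape
            kx = np.fft.fftfreq(nx) * 2 * np.pi
            ky = np.fft.fftfreq(ny) * 2 * np.pi
            KX, KY = np.meshgrid(kx, ky)
            kxr = KX * np.cos(angle) + KY * np.sin(angle)   # rotated wave vector
            kyr = -KX * np.sin(angle) + KY * np.cos(angle)
            return np.exp(-0.5 * ((scale * kxr - k0) ** 2 + (scale * kyr) ** 2))

        rng = np.random.default_rng(2)
        img = rng.normal(size=(256, 256))  # stand-in for a fracture-pattern image

        F = np.fft.fft2(img)
        angles = np.radians(np.arange(0, 180, 5))
        energy = [np.sum(np.abs(np.fft.ifft2(F * morlet2d(img.shape, 8.0, a))) ** 2)
                  for a in angles]
        print("dominant orientation (deg):", np.degrees(angles[int(np.argmax(energy))]))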

  4. An Example of Concurrent Engineering

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney; Whitten, David; Cloyd, Richard; Coppens, Chris; Rodriguez, Pedro

    1998-01-01

    The Collaborative Engineering Design and Analysis Room (CEDAR) facility allows on-the-spot design review for any project during all phases of development. The required disciplines assemble in this facility to work on any problems (analysis, manufacturing, inspection, etc.) associated with a particular design. A small, highly focused team of specialists can meet in this room to expedite the development of a solution to an engineering task within the framework of the constraints that are unique to each discipline. The facility provides the engineering tools and translators to develop a concept within the confines of the room or with remote team members who can access the team's data from other locations. The CEDAR area is also envisioned as an excellent venue for failure investigation meetings, where the computer capabilities can be used in conjunction with the Smart Board display to develop failure trees, brainstorm failure modes, and evaluate possible solutions.

  5. Shear failure of granular materials

    NASA Astrophysics Data System (ADS)

    Degiuli, Eric; Balmforth, Neil; McElwaine, Jim; Schoof, Christian; Hewitt, Ian

    2012-02-01

    Connecting the macroscopic behavior of granular materials with the microstructure remains a great challenge. Recent work connects these scales with a discrete calculus [1]. In this work we generalize this formalism from monodisperse packings of disks to 2D assemblies of arbitrarily shaped grains. In particular, we derive Airy's expression for a symmetric, divergence-free stress tensor. Using these tools, we derive, from first principles and in a mean-field approximation, the entropy of frictional force configurations in the Force Network Ensemble. As a macroscopic consequence of the Coulomb friction condition at contacts, we predict shear failure at a critical shear stress, in accordance with the Mohr-Coulomb failure condition well known in engineering. Results are compared with numerical simulations, and the dependence on the microscopic geometric configuration is discussed. [1] E. DeGiuli & J. McElwaine, PRE 2011. doi: 10.1103/PhysRevE.84.041310
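
    For readers unfamiliar with the Mohr-Coulomb condition: failure occurs once the shear stress on a plane exceeds the cohesion plus the normal stress times the tangent of the friction angle. A small numeric check with invented material parameters:

        # Numeric illustration of the Mohr-Coulomb failure condition:
        # failure when |tau| > c + sigma_n * tan(phi). Cohesion, friction
        # angle, and stresses below are made-up values.
        import numpy as np

        def mohr_coulomb_fails(sigma_n, tau, cohesion=0.0, phi_deg=30.0):
            """True if the stress state lies outside the Mohr-Coulomb envelope."""
            return abs(tau) > cohesion + sigma_n * np.tan(np.radians(phi_deg))

        sigma_n = 1.0e5  # normal stress on the plane, Pa
        for tau in (4.0e4, 7.0e4):  # threshold here is ~5.8e4 Pa
            print(f"tau={tau:.0e} Pa -> fails: {mohr_coulomb_fails(sigma_n, tau)}")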

  6. Profitable failure: antidepressant drugs and the triumph of flawed experiments.

    PubMed

    McGoey, Linsey

    2010-01-01

    Drawing on an analysis of Irving Kirsch and colleagues' controversial 2008 article in "PLoS [Public Library of Science] Magazine" on the efficacy of SSRI antidepressant drugs such as Prozac, I examine flaws within the methodologies of randomized controlled trials (RCTs) that have made it difficult for regulators, clinicians and patients to determine the therapeutic value of this class of drug. I then argue, drawing analogies to work by Pierre Bourdieu and Michael Power, that it is the very limitations of RCTs -- their inadequacies in producing reliable evidence of clinical effects -- that help to strengthen assumptions of their superiority as methodological tools. Finally, I suggest that the case of RCTs helps to explore the question of why failure is often useful in consolidating the authority of those who have presided over that failure, and why systems widely recognized to be ineffective tend to assume greater authority at the very moment when people speak of their malfunction.

  7. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1975-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program Assembly Control, simulator configuration control, erasable-memory load generation, change procedures, and anomaly reporting are discussed. The primary verification tools -- the all-digital simulator, the hybrid simulator, and the Iron Bird simulator -- are described, as well as the program test plans and their implementation on the various simulators. Failure-effects analysis and the creation of special failure-generating software for testing purposes are described. The quality of the end product is evidenced by the F-8 DFBW flight test program, in which 42 flights totaling 58 hours of flight time were successfully made without any DFCS in-flight software or hardware failures.

  8. Current problems in the management of marine fisheries.

    PubMed

    Beddington, J R; Agnew, D J; Clark, C W

    2007-06-22

    The public perception of fisheries is that they are in crisis, and have been for some time. Numerous scientific and popular articles have pointed to the failures of fisheries management that have caused this crisis. These are widely accepted to be overcapacity in fishing fleets, a failure to take the ecosystem effects of fishing into account, and a failure to enforce unpalatable but necessary reductions in fishing effort on fishing fleets and communities. However, the claim of some analysts that fisheries are in inevitable decline is, we believe, incorrect. There have been successes in fisheries management, and we argue that the tools for appropriate management exist. Unfortunately, they have not been implemented widely. Our analysis suggests that management authorities need to develop legally enforceable and tested harvest strategies, coupled with appropriate rights-based incentives for the fishing community, if the future of fisheries is to be better than their past.

  9. Utility of the Instability Severity Index Score in Predicting Failure After Arthroscopic Anterior Stabilization of the Shoulder.

    PubMed

    Phadnis, Joideep; Arnold, Christine; Elmorsy, Ahmed; Flannery, Mark

    2015-08-01

    The redislocation rate after arthroscopic stabilization for anterior glenohumeral instability is up to 30%. The Instability Severity Index Score (ISIS) was developed to preoperatively rationalize the risk of failure, but it has not yet been validated by an independent group. To assess the utility of the ISIS in predicting failure of arthroscopic anterior shoulder stabilization and to identify other preoperative risk factors for failure. Case-control study; Level of evidence, 3. A case-control study was performed on 141 consecutive patients, comparing those who suffered failure of arthroscopic stabilization with those whose stabilization was successful. The mean follow-up time was 47 months (range, 24-132 months). The ISIS was applied retrospectively, and an analysis was performed to establish independent risk factors for failure. A receiver operating characteristic (ROC) curve was constructed to set a threshold ISIS for considering alternative surgery. Of 141 patients, 19 (13.5%) suffered recurrent instability. The mean ISIS of the failed stabilization group was higher than that of the successful stabilization group (5.1 vs 1.7; P < .001). Independent risk factors for failure were Hill-Sachs lesion (P < .001), glenoid bone loss (P < .001), age <21 years at the time of surgery (P < .001), age at first dislocation (P = .01), competitive-level participation in sports (P < .001), and participation in contact or overhead sports (P = .03). The presence of glenoid bone loss carried the highest risk of failure (70%). There was a 70% risk of failure if the ISIS was ≥4, as opposed to a 4% risk of failure if the ISIS was <4. This is the first completely independent study to confirm that the ISIS is a useful preoperative tool. It is recommended that surgeons consider alternative forms of stabilization if the ISIS is ≥4. © 2015 The Author(s).
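
    The threshold logic behind such a ROC analysis can be illustrated with fabricated scores; the sweep below reproduces the kind of sensitivity/specificity trade-off from which a cut-off like ISIS ≥4 would be chosen:

        # Toy ROC-style threshold sweep over an ISIS-like score. Scores and
        # outcomes are fabricated; the paper's cut-off of >=4 came from its
        # own receiver operating characteristic analysis.
        import numpy as np

        rng = np.random.default_rng(3)
        isis = np.concatenate([rng.integers(0, 5, 120),    # mostly low, stable
                               rng.integers(3, 11, 20)])   # higher, failures
        failed = np.concatenate([np.zeros(120, bool), np.ones(20, bool)])

        for thr in range(1, 8):
            pred = isis >= thr
            sens = (pred & failed).sum() / failed.sum()
            spec = (~pred & ~failed).sum() / (~failed).sum()
            print(f"ISIS >= {thr}: sensitivity {sens:.2f}, specificity {spec:.2f}")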

  10. [Pathophysiology of respiratory muscle weakness].

    PubMed

    Windisch, W

    2008-03-01

    The respiratory system consists of two parts that can be impaired independently of each other: the lungs and the respiratory pump. The latter is a complex system comprising different anatomic structures: the breathing centre, the peripheral nervous system, the respiratory muscles, and the thorax. Owing to this complexity, several underlying conditions can cause insufficiency of the respiratory pump, i.e. ventilatory failure. Disturbances of the breathing centre, various neuromuscular disorders, mechanical impairments such as thoracic deformities or hyperinflation, and airway obstruction are examples of conditions responsible for ventilatory failure. The main characteristic of ventilatory failure is hypercapnia; this is in contrast to pulmonary failure, where diffusion disturbances typically do not cause hypercapnia. Ventilatory failure with hypercapnia can be either acute or chronic. In acute ventilatory failure, respiratory acidosis develops, whereas in chronic ventilatory failure the pH is normalized by metabolic retention of bicarbonate. However, acute-on-chronic ventilatory failure can present with a combined picture, i.e. elevated bicarbonate levels, acidosis, and often severe hypercapnia. Clinical signs such as tachypnea, features of the underlying disease, and signs of hypercapnia are important diagnostic tools, in addition to the measurement of the pressures generated by the respiratory muscles. Non-invasive and widely available techniques, such as the assessment of the maximal inspiratory and expiratory mouth pressures (PImax, PEmax), should be used as screening instruments, although the reliability of these measurements is reduced by the volitional character of the tests and the difficulty of defining normal values. Inspiratory pressures can be assessed more accurately and independently of patient effort, with or without the insertion of oesophageal and gastric balloon catheters; however, this technique is more invasive and very complex, and is therefore restricted to centres with scientific aims.

  11. Application of failure mode and effects analysis to intracranial stereotactic radiation surgery by linear accelerator.

    PubMed

    Masini, Laura; Donis, Laura; Loi, Gianfranco; Mones, Eleonora; Molina, Elisa; Bolchini, Cesare; Krengli, Marco

    2014-01-01

    The aim of this study was to analyze the application of failure mode and effects analysis (FMEA) to intracranial stereotactic radiation surgery (SRS) by linear accelerator, in order to identify the potential failure modes in the process tree and adopt appropriate safety measures to prevent adverse events (AEs) and near-misses, thus improving process quality. A working group was set up to perform FMEA for intracranial SRS in the framework of a quality assurance program. FMEA was performed in 4 consecutive tasks: (1) creation of a visual map of the process; (2) identification of possible failure modes; (3) assignment of a risk priority number (RPN) to each failure mode, based on tabulated scores of severity, frequency of occurrence, and detectability; and (4) identification of preventive measures to minimize the risk of occurrence. The whole SRS procedure was subdivided into 73 single steps; 116 possible failure modes were identified and a score of severity, occurrence, and detectability was assigned to each. Based on these scores, an RPN was calculated for each failure mode, yielding values from 1 to 180. In our analysis, 112/116 (96.6%) RPN values were <60, 2 (1.7%) were between 60 and 125 (63, 70), and 2 (1.7%) were >125 (135, 180). The 2 highest RPN scores were assigned to the risk of using the wrong collimator size and to incorrect coordinates on the laser target localizer frame. Failure mode and effects analysis is a simple and practical proactive tool for the systematic analysis of risks in radiation therapy. In our experience of SRS, FMEA led to the adoption of major changes in various steps of the SRS procedure.
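
    The scoring step is easy to make concrete. In the sketch below, the failure-mode names and the severity/occurrence/detectability scores are invented (chosen so the two top products echo the 180 and 135 reported above), and the RPN is simply their product:

        # Small sketch of the FMEA scoring step: each failure mode gets
        # severity (S), occurrence (O), and detectability (D) scores, and
        # RPN = S * O * D. Modes and scores are invented for illustration.
        failure_modes = [
            ("wrong collimator size selected",      9, 2, 10),
            ("incorrect frame coordinates entered", 9, 3,  5),
            ("imaging/plan isocenter mismatch",     7, 2,  4),
        ]

        ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
        for name, s, o, d in ranked:
            print(f"RPN {s*o*d:4d}  (S={s}, O={o}, D={d})  {name}")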

  12. Simulation for Prediction of Entry Article Demise (SPEAD): an Analysis Tool for Spacecraft Safety Analysis and Ascent/Reentry Risk Assessment

    NASA Technical Reports Server (NTRS)

    Ling, Lisa

    2014-01-01

    For the purpose of performing safety analysis and risk assessment for a probable off-nominal suborbital/orbital atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. This report discusses the capabilities, modeling, and validation of the SPEAD analysis tool. SPEAD is applicable for Earth or Mars, with the option for 3 or 6 degrees-of-freedom (DOF) trajectory propagation. The atmosphere and aerodynamics data are supplied in tables, for linear interpolation of up to 4 independent variables. The gravitation model can include up to 20 zonal harmonic coefficients. The modeling of a single motor is available and can be adapted to multiple motors. For thermal analysis, the aerodynamic radiative and free-molecular/continuum convective heating, black-body radiative cooling, conductive heat transfer between adjacent nodes, and node ablation are modeled. In a 6-DOF simulation, the local convective heating on a node is a function of Mach, angle-of-attack, and sideslip angle, and is dependent on 1) the location of the node in the spacecraft and its orientation to the flow, modeled by an exposure factor, and 2) the geometries of the spacecraft and the node, modeled by a heating factor and convective area. Node failure is evaluated using criteria based on melting temperature, reference heat load, g-load, or a combination of the above. The failure of a liquid propellant tank is evaluated based on burnout flux from nucleate boiling or excess internal pressure. Following a component failure, updates are made as needed to the spacecraft mass and aerodynamic properties, nodal exposure and heating factors, and nodal convective and conductive areas. This allows the trajectory to be propagated seamlessly in a single run, inclusive of the trajectories of components that have separated from the spacecraft. The node ablation simulates the decreasing mass and convective/reference areas, and variable heating factor. A built-in database provides the thermo-mechanical properties of
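
    The node-failure test the abstract describes is essentially a logical OR over several criteria. A hedged sketch, with thresholds and node states invented for illustration rather than taken from SPEAD:

        # Sketch of a node-failure check of the kind described above: a node
        # is flagged failed when it crosses a melting temperature, a reference
        # heat load, or a g-load limit. All values are illustrative.
        from dataclasses import dataclass

        @dataclass
        class Node:
            name: str
            temperature_K: float
            heat_load_J_per_m2: float
            g_load: float

        def node_failed(n: Node, t_melt=933.0, q_ref=2.0e7, g_max=15.0) -> bool:
            """Combine the three failure criteria with a logical OR."""
            return (n.temperature_K >= t_melt or
                    n.heat_load_J_per_m2 >= q_ref or
                    n.g_load >= g_max)

        nodes = [Node("fuel tank", 600.0, 2.5e7, 4.0), Node("strut", 500.0, 1.0e6, 3.0)]
        for n in nodes:
            print(n.name, "failed" if node_failed(n) else "intact")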

  13. Investigating Brittle Rock Failure and Associated Seismicity Using Laboratory Experiments and Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Zhao, Qi

    The rock failure process is a complex phenomenon that involves elastic and plastic deformation, microscopic cracking, macroscopic fracturing, and frictional slipping of fractures. Understanding this complex behaviour has been the focus of a significant amount of research. In this work, the combined finite-discrete element method (FDEM) was first employed to study (1) the influence of rock discontinuities on hydraulic fracturing and associated seismicity and (2) the influence of in-situ stress on seismic behaviour. Simulated seismic events were analyzed using post-processing tools including the frequency-magnitude distribution (b-value), the spatial fractal dimension (D-value), the seismic rate, and fracture clustering. These simulations demonstrated that at the local scale, fractures tended to propagate following the rock mass discontinuities, while at the reservoir scale they developed in the direction parallel to the maximum in-situ stress. Moreover, the seismic signature (i.e., b-value, D-value, and seismic rate) can help distinguish different phases of the failure process. The FDEM modelling technique and the developed analysis tools were then coupled with laboratory experiments to further investigate the different phases of the progressive rock failure process. Firstly, a uniaxial compression experiment, monitored using a time-lapse ultrasonic tomography method, was carried out and reproduced by the numerical model. Using this combination of technologies, the entire deformation and failure process was studied at macroscopic and microscopic scales. The results not only illustrated the rock failure and seismic behaviours at different stress levels, but also suggested several precursory behaviours indicating the catastrophic failure of the rock. Secondly, rotary shear experiments were conducted using a newly developed rock physics experimental apparatus (ERDμ-T) paired with X-ray micro-computed tomography (μCT). This combination of technologies has significant advantages over conventional rotary shear experiments, since it allows the direct observation of how two rough surfaces interact and deform without perturbing the experimental conditions. Some intriguing observations were made pertaining to key areas of the study of fault evolution, making possible a more comprehensive interpretation of the frictional sliding behaviour. Lastly, a carefully calibrated FDEM model built on the rotary shear experiment was used to investigate facets that the experiment could not resolve, for example the time-continuous stress condition and the seismic activity on the shear surface. The model reproduced the mechanical behaviour observed in the laboratory experiment, shedding light on the understanding of fault evolution.
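
    One of the post-processing statistics mentioned, the Gutenberg-Richter b-value, is commonly estimated with Aki's maximum-likelihood formula b = log10(e) / (mean(M) - Mc). A short sketch on synthetic magnitudes:

        # Maximum-likelihood b-value estimate (Aki, 1965) for the
        # Gutenberg-Richter frequency-magnitude distribution. The magnitude
        # sample is synthetic, generated so that b = 1 by construction.
        import numpy as np

        def b_value(mags, m_c):
            """Aki maximum-likelihood b-value for magnitudes above cutoff m_c."""
            m = np.asarray(mags)
            m = m[m >= m_c]
            return np.log10(np.e) / (m.mean() - m_c)

        rng = np.random.default_rng(4)
        mags = rng.exponential(scale=1.0 / np.log(10), size=2000)
        print(f"estimated b-value: {b_value(mags, 0.0):.2f}")  # ~1.00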

  14. Parametric Testing of Launch Vehicle FDDR Models

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Bajwa, Anupa; Berg, Peter; Thirumalainambi, Rajkumar

    2011-01-01

    For the safe operation of a complex system like a (manned) launch vehicle, real-time information about the state of the system and potential faults is extremely important. The on-board FDDR (Failure Detection, Diagnostics, and Response) system is a software system to detect and identify failures, provide real-time diagnostics, and initiate fault recovery and mitigation. The ERIS (Evaluation of Rocket Integrated Subsystems) failure simulation is a unified Matlab/Simulink model of the Ares I Launch Vehicle with modular, hierarchical subsystems and components. With this model, the nominal flight performance characteristics can be studied. Additionally, failures can be injected to see their effects on vehicle state and behavior. A comprehensive test and analysis of such a complicated model is virtually impossible. In this paper, we describe how parametric testing (PT) can be used to support testing and analysis of the ERIS failure simulation. PT uses a combination of Monte Carlo techniques with n-factor combinatorial exploration to generate a small, yet comprehensive set of parameters for the test runs. For the analysis of the high-dimensional simulation data, we use multivariate clustering to automatically find structure in this high-dimensional data space. Our tools can generate detailed HTML reports that facilitate the analysis.
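
    The parameter-generation idea, random sampling pruned to guarantee n-factor coverage, can be sketched for the pairwise (n = 2) case. The parameter names and levels below are invented, and this greedy loop is only one simple way to realize such coverage, not the paper's algorithm:

        # Rough sketch of Monte Carlo sampling combined with 2-factor
        # (pairwise) coverage: random candidate test vectors are kept only
        # while they still cover new parameter-value pairs.
        import itertools, random

        params = {
            "engine_fail_time": [10.0, 60.0, 120.0],
            "sensor_bias":      [0.0, 0.5],
            "wind_gust":        ["none", "moderate", "severe"],
        }
        names = list(params)
        uncovered = {((a, va), (b, vb))
                     for a, b in itertools.combinations(names, 2)
                     for va in params[a] for vb in params[b]}

        random.seed(5)
        tests = []
        while uncovered:
            cand = {n: random.choice(params[n]) for n in names}
            pairs = {((a, cand[a]), (b, cand[b]))
                     for a, b in itertools.combinations(names, 2)}
            if pairs & uncovered:  # keep only candidates that add coverage
                tests.append(cand)
                uncovered -= pairs
        print(len(tests), "test vectors cover all parameter pairs")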

  15. Safety analysis of occupational exposure of healthcare workers to residual contaminations of cytotoxic drugs using FMECA security approach.

    PubMed

    Le, Laetitia Minh Mai; Reitter, Delphine; He, Sophie; Bonle, Franck Té; Launois, Amélie; Martinez, Diane; Prognon, Patrice; Caudron, Eric

    2017-12-01

    Handling cytotoxic drugs is associated with chemical contamination of workplace surfaces. The potential mutagenic, teratogenic and oncogenic properties of those drugs create a risk of occupational exposure for healthcare workers, from the reception of starting materials to the preparation and administration of cytotoxic therapies. The Security Failure Mode Effects and Criticality Analysis (FMECA) was used as a proactive method to assess the risks involved in the chemotherapy compounding process. FMECA was carried out by a multidisciplinary team from 2011 to 2016. Potential failure modes of the process were identified and ranked by Risk Priority Number (RPN) to prioritize corrective actions. Twenty-five potential failure modes were identified. Based on the RPN results, the corrective actions plan was revised annually to reduce the risk of exposure and improve practices. Since 2011, 16 specific measures were implemented successively. In six years, a cumulative RPN reduction of 626 was observed, with a decrease from 912 to 286 (-69%), despite an increase in cytotoxic compounding activity of around 23.2%. In order to anticipate and prevent occupational exposure, FMECA is a valuable tool to identify, prioritize and eliminate potential failure modes for operators involved in the cytotoxic drug preparation process before the failures occur. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Dissolution Failure of Solid Oral Drug Products in Field Alert Reports.

    PubMed

    Sun, Dajun; Hu, Meng; Browning, Mark; Friedman, Rick L; Jiang, Wenlei; Zhao, Liang; Wen, Hong

    2017-05-01

    From 2005 to 2014, 370 data entries on dissolution failures of solid oral drug products were assessed with respect to the solubility of the drug substances, the dosage form [immediate release (IR) vs. modified release (MR)], and the manufacturer (brand name vs. generic). The study results show that the solubility of drug substances does not play a significant role in dissolution failures; however, MR drug products fail dissolution tests more frequently than IR drug products. When multiple variables were analyzed simultaneously, poorly water-soluble IR drug products failed the most dissolution tests, followed by poorly soluble MR drug products and very soluble MR drug products. Interestingly, generic drug products fail dissolution tests at an earlier time point during stability studies than brand name drug products. Whether the dissolution failure of these solid oral drug products has any in vivo implications will require further pharmacokinetic, pharmacodynamic, clinical, and drug safety evaluation. The Food and Drug Administration is currently conducting risk-based assessments using in-house dissolution testing, physiologically based pharmacokinetic modeling and simulation, and post-market surveillance tools. In the meantime, this interim report outlines a general scheme for monitoring dissolution failures of solid oral dosage forms as a pharmaceutical quality indicator. Published by Elsevier Inc.

  17. Gaussian fitting for carotid and radial artery pressure waveforms: comparison between normal subjects and heart failure patients.

    PubMed

    Liu, Chengyu; Zheng, Dingchang; Zhao, Lina; Liu, Changchun

    2014-01-01

    It has been reported that Gaussian functions could accurately and reliably model both carotid and radial artery pressure waveforms (CAPW and RAPW). However, the physiological relevance of the characteristic features from the modeled Gaussian functions has been little investigated. This study thus aimed to determine characteristic features from the Gaussian functions and to make comparisons of them between normal subjects and heart failure patients. Fifty-six normal subjects and 51 patients with heart failure were studied with the CAPW and RAPW signals recorded simultaneously. The two signals were normalized first and then modeled by three positive Gaussian functions, with their peak amplitude, peak time, and half-width determined. Comparisons of these features were finally made between the two groups. Results indicated that the peak amplitude of the first Gaussian curve was significantly decreased in heart failure patients compared with normal subjects (P<0.001). Significantly increased peak amplitude of the second Gaussian curves (P<0.001) and significantly shortened peak times of the second and third Gaussian curves (both P<0.001) were also presented in heart failure patients. These results were true for both CAPW and RAPW signals, indicating the clinical significance of the Gaussian modeling, which should provide essential tools for further understanding the underlying physiological mechanisms of the artery pressure waveform.
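
    The decomposition itself is a standard nonlinear least-squares fit. A sketch with scipy on a synthetic normalized pulse; the waveform shape and initial guesses are illustrative, not the study's data:

        # Sketch of the three-Gaussian waveform decomposition described
        # above, using scipy's curve_fit on a synthetic normalized pulse.
        import numpy as np
        from scipy.optimize import curve_fit

        def three_gaussians(t, a1, m1, s1, a2, m2, s2, a3, m3, s3):
            """Sum of three positive Gaussians: amplitude a, peak time m, width s."""
            g = lambda a, m, s: a * np.exp(-((t - m) ** 2) / (2 * s ** 2))
            return g(a1, m1, s1) + g(a2, m2, s2) + g(a3, m3, s3)

        t = np.linspace(0, 1, 200)  # one normalized cardiac cycle
        true = three_gaussians(t, 0.9, 0.15, 0.05, 0.5, 0.35, 0.08, 0.3, 0.65, 0.10)
        wave = true + np.random.default_rng(6).normal(0, 0.01, t.size)

        p0 = [1, 0.1, 0.05, 0.5, 0.3, 0.1, 0.3, 0.6, 0.1]  # rough initial guesses
        popt, _ = curve_fit(three_gaussians, t, wave, p0=p0)
        print("fitted peak amplitudes:", popt[0], popt[3], popt[6])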

  18. Poster - Thur Eve - 38: Review of couch parameters using an FMEA.

    PubMed

    Larouche, R; Doucet, R; Rémy, E; Filion, A; Poirier, L

    2012-07-01

    To improve patient safety during positioning, we undertook a systematic review of the processes used by our center to obtain couch positions. We used a Failure Mode and Effects Analysis (FMEA) framework, and fifteen different possible failures were identified and rated. The three major failures were: 1) loss of the planned couch position and bias from the previous day's couch position; 2) a DICOM origin or isocenter that differs between two plans (imaging or treatment); and 3) a patient shift in the opposite direction than intended. The main effect of these failures was to cause an override of couch parameters. Based on these results, we modified our processes, introduced new QA and software checks, and developed new tolerance tables so as to improve system robustness and increase our success rate at catching failures before they can affect the patient. It has been a year since we made these modifications. Based on our results, we have reduced the number of overrides at our center from a maximum of 20.5% to a maximum of 6.3%, with an average of 4% of daily treatments. Our results suggest that FMEA is an effective tool for improving treatment quality that could be used in other centers. © 2012 American Association of Physicists in Medicine.

  19. Reliability Assessment for Low-cost Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Freeman, Paul Michael

    Existing low-cost unmanned aerospace systems are unreliable, and engineers must blend reliability analysis with fault-tolerant control in novel ways. This dissertation introduces the University of Minnesota unmanned aerial vehicle flight research platform, a comprehensive simulation and flight test facility for reliability and fault-tolerance research. An industry-standard reliability assessment technique, the failure modes and effects analysis, is performed for an unmanned aircraft. Particular attention is afforded to the control surface and servo-actuation subsystem. Maintaining effector health is essential for safe flight; failures may lead to loss of control incidents. Failure likelihood, severity, and risk are qualitatively assessed for several effector failure modes. Design changes are recommended to improve aircraft reliability based on this analysis. Most notably, the control surfaces are split, providing independent actuation and dual-redundancy. The simulation models for control surface aerodynamic effects are updated to reflect the split surfaces using a first-principles geometric analysis. The failure modes and effects analysis is extended by using a high-fidelity nonlinear aircraft simulation. A trim state discovery is performed to identify the achievable steady, wings-level flight envelope of the healthy and damaged vehicle. Tolerance of elevator actuator failures is studied using familiar tools from linear systems analysis. This analysis reveals significant inherent performance limitations for candidate adaptive/reconfigurable control algorithms used for the vehicle. Moreover, it demonstrates how these tools can be applied in a design feedback loop to make safety-critical unmanned systems more reliable. Control surface impairments that do occur must be quickly and accurately detected. This dissertation also considers fault detection and identification for an unmanned aerial vehicle using model-based and model-free approaches and applies those algorithms to experimental faulted and unfaulted flight test data. Flight tests are conducted with actuator faults that affect the plant input and sensor faults that affect the vehicle state measurements. A model-based detection strategy is designed and uses robust linear filtering methods to reject exogenous disturbances, e.g. wind, while providing robustness to model variation. A data-driven algorithm is developed to operate exclusively on raw flight test data without physical model knowledge. The fault detection and identification performance of these complementary but different methods is compared. Together, enhanced reliability assessment and multi-pronged fault detection and identification techniques can help to bring about the next generation of reliable low-cost unmanned aircraft.
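
    The model-based detection strategy can be caricatured in a few lines: compare measurements against a nominal model, low-pass the residual, and flag a fault when it crosses a threshold. The dynamics, noise levels, fault time, and threshold below are all invented for the demo:

        # Toy residual-based fault detection in the spirit of the model-based
        # approach described above; the actuator model, fault, and threshold
        # are assumptions, not the dissertation's design.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 300
        u = np.sin(np.linspace(0, 10, n))          # commanded elevator input
        y_model = 0.8 * u                          # nominal actuator model
        y_meas = 0.8 * u + rng.normal(0, 0.02, n)
        y_meas[200:] = 0.3 * u[200:] + rng.normal(0, 0.02, 100)  # fault at k=200

        resid = np.abs(y_meas - y_model)
        alpha, thr, filt = 0.1, 0.1, 0.0
        for k, r in enumerate(resid):
            filt = (1 - alpha) * filt + alpha * r  # low-pass the |residual|
            if filt > thr:
                print("fault detected at sample", k)
                break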

  20. Can the Seattle heart failure model be used to risk-stratify heart failure patients for potential left ventricular assist device therapy?

    PubMed

    Levy, Wayne C; Mozaffarian, Dariush; Linker, David T; Farrar, David J; Miller, Leslie W

    2009-03-01

    According to results of the REMATCH trial, left ventricular assist device therapy in patients with severe heart failure has resulted in a 48% reduction in mortality. A decision tool will be necessary to aid in the selection of patients for destination left ventricular assist devices (LVADs) as the technology progresses for implantation in ambulatory Stage D heart failure patients. The purpose of this analysis was to determine whether the Seattle Heart Failure Model (SHFM) can be used to risk-stratify heart failure patients for potential LVAD therapy. The SHFM was applied to REMATCH patients with the prospective addition of inotropic agents and intra-aortic balloon pump (IABP) +/- ventilator. The SHFM was highly predictive of survival (p = 0.0004). One-year SHFM-predicted survival was similar to actual survival for both the REMATCH medical (30% vs 28%) and LVAD (49% vs 52%) groups. The estimated 1-year survival with medical therapy for patients in REMATCH was 30 +/- 21%, but with a range of 0% to 74%. The 1- and 2-year estimated survival was
