Soft error evaluation and vulnerability analysis in Xilinx Zynq-7010 system-on-chip
NASA Astrophysics Data System (ADS)
Du, Xuecheng; He, Chaohui; Liu, Shuhuan; Zhang, Yao; Li, Yonghong; Xiong, Ceng; Tan, Pengkang
2016-09-01
Radiation-induced soft errors are an increasingly important threat to the reliability of modern electronic systems. In order to evaluate a system-on-chip's reliability and soft-error response, the fault tree analysis method was used in this work. The system fault tree was constructed for the Xilinx Zynq-7010 All Programmable SoC. Moreover, the soft error rates of different components in the Zynq-7010 SoC were measured using an americium-241 alpha radiation source. Furthermore, parameters used to evaluate the system's reliability and safety, such as failure rate, unavailability, and mean time to failure (MTTF), were calculated using Isograph Reliability Workbench 11.0. Based on the fault tree analysis of the system-on-chip, the critical blocks and overall system reliability were evaluated through qualitative and quantitative analysis.
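For readers unfamiliar with the quantities named above, the sketch below shows the arithmetic a tool like Isograph performs for a simple OR-gate fault tree under exponential failure models. All component names, rates, and the repair time are hypothetical, not values from the Zynq-7010 study.

```python
# Minimal sketch: series-system (OR-gate) fault-tree arithmetic under
# exponential failure models. Names and rates are illustrative only.

failure_rates = {            # lambda_i in failures per hour (hypothetical)
    "APU_core": 2.0e-7,
    "on_chip_RAM": 5.0e-7,
    "PL_config_memory": 1.2e-6,
}

# Top event (system failure) occurs if ANY component fails: an OR gate,
# so the constant failure rates simply add.
lambda_system = sum(failure_rates.values())
mttf_hours = 1.0 / lambda_system          # MTTF = 1/lambda for exponential

mttr_hours = 8.0                          # hypothetical mean time to repair
# Steady-state unavailability of a repairable system: MTTR / (MTTF + MTTR)
unavailability = lambda_system * mttr_hours / (1.0 + lambda_system * mttr_hours)

print(f"system failure rate: {lambda_system:.2e} /h")
print(f"MTTF: {mttf_hours:.3e} h")
print(f"steady-state unavailability: {unavailability:.2e}")
```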
A Case for Soft Error Detection and Correction in Computational Chemistry.
van Dam, Hubertus J J; Vishnu, Abhinav; de Jong, Wibe A
2013-09-10
High performance computing platforms are expected to deliver 10^18 floating-point operations per second by the year 2022 through the deployment of millions of cores. Even if every core is highly reliable, the sheer number of them will mean that the mean time between failures becomes so short that most application runs will suffer at least one fault. In particular, soft errors caused by intermittent incorrect behavior of the hardware are a concern, as they lead to silent data corruption. In this paper we investigate the impact of soft errors on optimization algorithms using Hartree-Fock as a particular example. Optimization algorithms iteratively reduce the error in an initial guess to reach the intended solution. Therefore they may intuitively appear to be resilient to soft errors. Our results show that this is true for soft errors of small magnitudes but not for large errors. We suggest error detection and correction mechanisms for different classes of data structures. The results obtained with these mechanisms indicate that we can correct more than 95% of the soft errors at a moderate increase in computational cost.
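The detection idea the abstract points at, that large corruptions are recognizable while small ones are absorbed by the iteration, can be illustrated with a minimal range-check sketch. The bounds, names, and repair-from-protected-copy strategy below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of bound-based soft error detection for an iterative solver:
# entries far outside physically plausible limits are flagged as
# corrupted and restored from a protected copy. Bounds are invented.
import numpy as np

def checked_update(m, protected_copy, lower=-10.0, upper=10.0):
    """Return m with out-of-bounds (presumed corrupted) entries restored."""
    bad = (m < lower) | (m > upper)
    if bad.any():
        m = np.where(bad, protected_copy, m)   # repair from protected copy
    return m

# Usage: a single flipped entry is caught, small noise passes through.
fock = np.array([[1.0, 0.2], [0.2, -0.5]])
corrupted = fock.copy()
corrupted[0, 1] = 1e12                         # simulated large bit-flip
print(checked_update(corrupted, fock))         # restored to original
```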
Scaled CMOS Technology Reliability Users Guide
NASA Technical Reports Server (NTRS)
White, Mark
2010-01-01
The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology for accomplishing this and techniques for deriving the expected product-level reliability of commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope β = 1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm² for several scaled SDRAM generations is presented, revealing a power relationship. General models describing the soft error rates across scaled product generations are presented. The analysis methodology may be applied to other scaled microelectronic products and their key parameters.
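A quick aside on the "Weibull slope β = 1" observation: a unit Weibull slope is exactly the constant-failure-rate (exponential) case, which is why it signals randomly distributed weak bits rather than wear-out. The short numeric check below demonstrates the identity; the characteristic life η is hypothetical.

```python
# With beta == 1 the Weibull CDF F(t) = 1 - exp(-(t/eta)**beta)
# reduces to the exponential CDF 1 - exp(-t/eta). Quick check.
import math

eta = 1.0e6   # characteristic life in hours (illustrative value)
for t in (1e4, 1e5, 1e6):
    weibull = 1 - math.exp(-(t / eta) ** 1.0)
    expon = 1 - math.exp(-t / eta)
    assert abs(weibull - expon) < 1e-12       # identical for beta = 1
    print(f"t = {t:.0e} h  F(t) = {weibull:.4f}")
```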
Distributed phased array architecture study
NASA Technical Reports Server (NTRS)
Bourgeois, Brian
1987-01-01
Variations in amplifiers and phase shifters can cause degraded antenna performance, depending also on the environmental conditions and antenna array architecture. The implementation of distributed phased array hardware was studied with the aid of the DISTAR computer program as a simulation tool; the simulation provides guidance for hardware implementation. Both hard and soft failures of the amplifiers in the T/R modules are modeled. Hard failures are catastrophic: no power is transmitted to the antenna elements. Noncatastrophic, or soft, failures are modeled with a modified Gaussian distribution. The resulting amplitude characteristics then determine the array excitation coefficients. The phase characteristics take on a uniform distribution. Pattern characteristics such as antenna gain, half-power beamwidth, mainbeam phase errors, sidelobe levels, and beam pointing errors were studied as functions of amplifier and phase shifter variations. General specifications for amplifier and phase shifter tolerances in various architecture configurations for C band and S band were determined.
NASA Astrophysics Data System (ADS)
Nebashi, Ryusuke; Sakimura, Noboru; Sugibayashi, Tadahiko
2017-08-01
We evaluated the soft-error tolerance and energy consumption of an embedded computer with magnetic random access memory (MRAM) using two computer simulators. One is a central processing unit (CPU) simulator of a typical embedded computer system. We simulated the radiation-induced single-event-upset (SEU) probability in a spin-transfer-torque MRAM cell and also the failure rate of a typical embedded computer due to its main memory SEU error. The other is a delay tolerant network (DTN) system simulator. It simulates the power dissipation of wireless sensor network nodes of the system using a revised CPU simulator and a network simulator. We demonstrated that the SEU effect on the embedded computer with 1 Gbit MRAM-based working memory is less than 1 failure in time (FIT). We also demonstrated that the energy consumption of the DTN sensor node with MRAM-based working memory can be reduced to 1/11. These results indicate that MRAM-based working memory enhances the disaster tolerance of embedded computers.
NASA Astrophysics Data System (ADS)
Tambara, Lucas Antunes; Tonfat, Jorge; Santos, André; Kastensmidt, Fernanda Lima; Medina, Nilberto H.; Added, Nemitala; Aguiar, Vitor A. P.; Aguirre, Fernando; Silveira, Marcilei A. G.
2017-02-01
The increasing system complexity of FPGA-based hardware designs and the shortening of time-to-market have motivated the adoption of new design methodologies focused on addressing the current need for high-performance circuits. High-Level Synthesis (HLS) tools can generate Register Transfer Level (RTL) designs from high-level software programming languages. These tools have evolved significantly in recent years, providing optimized RTL designs that can serve the needs of safety-critical applications requiring both high performance and high reliability. However, a reliability evaluation of HLS-based designs under soft errors has not yet been presented. In this work, the trade-offs of different HLS-based designs in terms of reliability, resource utilization, and performance are investigated by analyzing their behavior under soft errors and comparing them to a standard processor-based implementation in an SRAM-based FPGA. Results obtained from fault injection campaigns and radiation experiments show that it is possible to increase the performance of a processor-based system up to 5,000 times by changing its architecture, with a small impact on the cross section (an increase of up to 8 times), while still increasing the Mean Workload Between Failures (MWBF) of the system.
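A back-of-envelope check of the reported trade-off, assuming MWBF scales with execution speed divided by cross section (a common first-order model; the abstract does not state this explicitly):

```python
# If a design completes workloads 5000x faster while its cross section
# (failures per unit fluence, hence per unit time at fixed flux) grows
# 8x, the mean workload between failures still improves substantially.
speedup = 5000.0
cross_section_growth = 8.0
mwbf_gain = speedup / cross_section_growth
print(f"MWBF improves by ~{mwbf_gain:.0f}x")   # ~625x under this model
```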
Design Techniques for Power-Aware Combinational Logic SER Mitigation
NASA Astrophysics Data System (ADS)
Mahatme, Nihaar N.
The history of modern semiconductor devices and circuits suggests that technologists have been able to maintain scaling at the rate predicted by Moore's Law [Moor-65]. Along with improved performance, speed, and lower area, technology scaling has also exacerbated reliability issues such as soft errors. Soft errors are transient errors that occur in microelectronic circuits due to ionizing radiation particle strikes on reverse-biased semiconductor junctions. At the terrestrial level, these radiation-induced errors are caused by (1) alpha particles emitted as decay products of packaging material, (2) cosmic rays that produce energetic protons and neutrons, and (3) thermal neutrons [Dodd-03], [Srou-88], and more recently muons and electrons [Ma-79] [Nara-08] [Siew-10] [King-10]. In the space environment, radiation-induced errors are a much bigger threat and are mainly caused by cosmic heavy ions, protons, etc. The effects of radiation exposure on circuits and measures to protect against them have been studied extensively for the past 40 years, especially for parts operating in space. Radiation particle strikes can affect memory as well as combinational logic. Typically, when these particles strike semiconductor junctions of transistors that are part of feedback structures, such as SRAM memory cells or flip-flops, they can cause an inversion of the cell content. Such a failure is formally called a bit-flip or single-event upset (SEU). When such particles strike sensitive junctions in combinational logic gates, they produce transient voltage spikes or glitches called single-event transients (SETs) that can be latched by receiving flip-flops. As circuits are clocked faster, there are more clocking edges, which increases the likelihood of latching these transients. In older technology generations, the probability of errors in flip-flops due to latched SETs was much lower than that due to direct strikes on flip-flops or SRAMs leading to SEUs, mainly because operating frequencies were much lower. The Intel Pentium II, for example, was fabricated in a 0.35 μm technology and operated between 200 and 330 MHz. With technology scaling, however, operating frequencies have increased tremendously, and latched SETs from combinational logic could account for a significant proportion of the chip-level soft error rate [Sief-12][Maha-11][Shiv02][Bu97]. Therefore there is a need to systematically characterize the problem of combinational logic single-event effects (SEE) and understand the various factors that affect the combinational logic single-event error rate. Just as scaling has led to soft errors emerging as a reliability-limiting failure mode for modern digital ICs, the problem of increasing power consumption has arguably been a bigger bane of scaling. While Moore's Law loftily states the blessing of technology scaling to be smaller and faster transistors, it fails to highlight that power density increases exponentially with every technology generation. The power density problem was partially solved in the 1970s and 1980s by moving from bipolar and GaAs technologies to full-scale silicon CMOS technologies. Since then, however, the technology miniaturization that enabled high-speed, multicore, and parallel computing has steadily worsened the power density and power consumption problem.
Today, minimizing power consumption is as critical for power-hungry server farms as it is for portable devices, pervasive sensor networks, and future eco-bio-sensors. Low power consumption is now regularly part of the design philosophy for digital products with diverse applications, from computing to communication to healthcare. Thus, designers today are left grappling with both a "power wall" and a "reliability wall". Unfortunately, when it comes to improving reliability through soft error mitigation, most approaches are invariably saddled with overheads in terms of area, speed, and, more importantly, power. Thus, the cost of protecting combinational logic with power-hungry mitigation approaches can disrupt the power budget significantly. Therefore, there is a strong need for techniques that provide both power minimization and combinational logic soft error mitigation. This dissertation advances hitherto untapped opportunities to jointly reduce power consumption and deliver soft-error-resilient designs. Circuit as well as architectural approaches are employed to achieve this objective, and the advantages of cross-layer optimization for power and soft error reliability are emphasized.
Clover: Compiler directed lightweight soft error resilience
Liu, Qingrui; Lee, Dongyoon; Jung, Changhee; ...
2015-05-01
This paper presents Clover, a compiler-directed soft error detection and recovery scheme for lightweight soft error resilience. The compiler carefully generates soft error tolerant code based on idempotent processing without explicit checkpointing. During program execution, Clover relies on a small number of acoustic wave detectors deployed in the processor to identify soft errors by sensing the wave made by a particle strike. To cope with DUEs (detected unrecoverable errors) caused by the sensing latency of error detection, Clover leverages a novel selective instruction duplication technique called tail-DMR (dual modular redundancy). Once a soft error is detected by either the sensor or the tail-DMR, Clover takes care of the error as in the case of exception handling. To recover from the error, Clover simply redirects program control to the beginning of the code region where the error is detected. Lastly, the experimental results demonstrate that the average runtime overhead is only 26%, which is a 75% reduction compared to that of the state-of-the-art soft error resilience technique.
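A conceptual sketch of the recovery step, not the authors' compiler: because an idempotent region never overwrites its live-in values, control can simply be redirected to the region's start when a detector flags an error, with no checkpoint to restore.

```python
# Conceptual sketch of idempotence-based recovery. The detector hook
# and region function are stand-ins for hardware; nothing here is
# Clover's actual implementation.
def run_region_with_recovery(region_fn, live_ins, error_pending):
    """region_fn must not mutate live_ins (that is the idempotence contract)."""
    while True:
        result = region_fn(*live_ins)   # execute the region
        if not error_pending():         # detector latency has elapsed, no alarm
            return result               # safe to commit the result
        # an error was sensed inside the region: simply re-execute it
        # from its unchanged inputs, mirroring the control redirection
```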
Error-Rate Estimation Based on Multi-Signal Flow Graph Model and Accelerated Radiation Tests.
He, Wei; Wang, Yueke; Xing, Kefei; Deng, Wei; Zhang, Zelong
2016-01-01
A method of evaluating the single-event effect soft-error vulnerability of space instruments before launch has been an active research topic in recent years. In this paper, a multi-signal flow graph model is introduced to analyze the fault diagnosis and mean time to failure (MTTF) of space instruments. A model for the system functional error rate (SFER) is proposed. In addition, an experimental method and an accelerated radiation testing system for a signal processing platform based on a field programmable gate array (FPGA) are presented. Based on experimental results for different ions (O, Si, Cl, Ti) at the HI-13 Tandem Accelerator, the SFER of the signal processing platform is approximately 10^-3 (error/particle/cm^2), while the MTTF is approximately 110.7 h.
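The two reported figures can be related under the simple first-order model failure rate = SFER × flux. The abstract does not state the flux, so the sketch below only illustrates the algebra, not the paper's environment assumptions.

```python
# Illustrative relation between the reported SFER and MTTF, assuming
# failure_rate = SFER * flux. The flux is NOT given in the abstract;
# this only shows what flux the simple model would imply.
sfer = 1e-3      # errors per (particle/cm^2)
mttf_h = 110.7   # hours

implied_flux = 1.0 / (sfer * mttf_h)   # particles/(cm^2 h) under this model
print(f"implied flux: {implied_flux:.2f} particles/cm^2/h")
```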
A Framework for Debugging Geoscience Projects in a High Performance Computing Environment
NASA Astrophysics Data System (ADS)
Baxter, C.; Matott, L.
2012-12-01
High performance computing (HPC) infrastructure has become ubiquitous in today's world with the emergence of commercial cloud computing and academic supercomputing centers. Teams of geoscientists, hydrologists and engineers can take advantage of this infrastructure to undertake large research projects - for example, linking one or more site-specific environmental models with soft computing algorithms, such as heuristic global search procedures, to perform parameter estimation and predictive uncertainty analysis, and/or design least-cost remediation systems. However, the size, complexity and distributed nature of these projects can make identifying failures in the associated numerical experiments using conventional ad-hoc approaches both time-consuming and ineffective. To address these problems a multi-tiered debugging framework has been developed. The framework allows for quickly isolating and remedying a number of potential experimental failures, including: failures in the HPC scheduler; bugs in the soft computing code; bugs in the modeling code; and permissions and access control errors. The utility of the framework is demonstrated via application to a series of over 200,000 numerical experiments involving a suite of 5 heuristic global search algorithms and 15 mathematical test functions serving as cheap analogues for the simulation-based optimization of pump-and-treat subsurface remediation systems.
An investigation into soft error detection efficiency at operating system level.
Asghari, Seyyed Amir; Kaynak, Okyay; Taheri, Hassan
2014-01-01
Electronic equipment operating in harsh environments such as space is subjected to a range of threats. The most important of these is radiation, which gives rise to permanent and transient errors in microelectronic components. The occurrence rate of transient errors is significantly higher than that of permanent errors. Transient errors, or soft errors, emerge in two forms: control flow errors (CFEs) and data errors. Valuable research results have already appeared in the literature at the hardware and software levels for their alleviation. However, these works share the basic assumption that the operating system is reliable, and they focus on other system levels. In this paper, we investigate the effects of soft errors on operating system components and compare their vulnerability with that of application-level components. Results show that soft errors in operating system components affect both operating system and application-level components. Therefore, by hardening operating system level components against soft errors, both operating system and application-level components gain tolerance.
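A minimal illustration of the signature-style control-flow-error checking this CFE literature builds on; the block names and allowed transitions below are invented for the example, not taken from the paper.

```python
# Signature-monitoring sketch: a CFE is flagged when execution takes a
# basic-block transition that is not in the statically known edge set.
EXPECTED_EDGES = {("A", "B"), ("B", "C"), ("C", "A")}   # hypothetical CFG edges

def check_transition(prev_block: str, next_block: str) -> None:
    if (prev_block, next_block) not in EXPECTED_EDGES:
        raise RuntimeError(f"control flow error: {prev_block} -> {next_block}")

# Usage: a radiation-corrupted branch target "A" -> "C" raises here.
check_transition("A", "B")   # legal edge, passes silently
```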
Evaluation and Management of Failed Shoulder Instability Surgery.
Cartucho, António; Moura, Nuno; Sarmento, Marco
2017-01-01
Failed shoulder instability surgery is mostly considered to be the recurrence of shoulder dislocation, but subluxation and a painful or unreliable shoulder are also reasons for patient dissatisfaction and should be included in the notion. The authors performed a review of the literature and online content on the evaluation and management of failed shoulder instability surgery. The main reasons for failure of shoulder instability surgery are poor patient selection, technical error, and an additional traumatic event. More than 80% of surgical failures for shoulder instability are associated with bone loss. Quantification of glenoid bone loss and investigation of an engaging Hill-Sachs lesion are determining factors. Adequate imaging studies are essential to assess labral and capsular lesions and to rule out associated pathology such as rotator cuff tears. CT scanning is the method of choice to diagnose and quantify bone loss. Arthroscopic soft tissue procedures are indicated in patients with minimal bone loss who do not play contact sports. Open soft tissue procedures should be performed in patients with small bone defects, with hyperlaxity, or who practice contact sports. Soft tissue techniques, such as postero-inferior capsular plication and remplissage, may be used in patients with less than 25% glenoid bone loss and Hill-Sachs lesions. Bone block procedures should be used for larger glenoid bone defects, in the presence of an engaging Hill-Sachs lesion, or in the presence of poor soft tissue quality. A tricortical iliac crest graft may be used as a primary procedure or as a salvage procedure after failure of a Bristow or Latarjet procedure. Less frequently, the surgeon has to address the Hill-Sachs lesion; when a 30% loss of humeral head circumference is present, a filling graft should be used. Reasons for failure are multifactorial. In order to address this entity, surgeons must correctly identify the causes and tailor the right solution.
NASA Astrophysics Data System (ADS)
Watanabe, Y.; Abe, S.
2014-06-01
Terrestrial neutron-induced soft errors in MOSFETs from a 65 nm down to a 25 nm design rule are analyzed by means of multi-scale Monte Carlo simulation using the PHITS-HyENEXSS code system. Nuclear reaction models implemented in the PHITS code are validated by comparisons with experimental data. From the analysis of calculated soft error rates, it is clarified that secondary He and H ions have a major impact on soft errors as the critical charge decreases. It is also found that the high-energy component of secondary cosmic-ray neutrons, from 10 MeV up to several hundreds of MeV, is the most significant source of soft errors regardless of design rule.
A Quatro-Based 65-nm Flip-Flop Circuit for Soft-Error Resilience
NASA Astrophysics Data System (ADS)
Li, Y.-Q.; Wang, H.-B.; Liu, R.; Chen, L.; Nofal, I.; Shi, S.-T.; He, A.-L.; Guo, G.; Baeg, S. H.; Wen, S.-J.; Wong, R.; Chen, M.; Wu, Q.
2017-06-01
A flip-flop circuit hardened against soft errors is presented in this paper. This design is an improved version of Quatro with further enhanced soft-error resilience obtained by integrating the guard-gate technique. The proposed design, as well as a reference Quatro and a regular flip-flop, was implemented and manufactured in a 65-nm bulk CMOS technology. Experimental characterization of their alpha-particle and heavy-ion soft-error rates verified the superior hardening performance of the proposed design over the other two circuits.
Proton upsets in LSI memories in space
NASA Technical Reports Server (NTRS)
Mcnulty, P. J.; Wyatt, R. C.; Filz, R. C.; Rothwell, P. L.; Farrell, G. E.
1980-01-01
Two types of large scale integrated dynamic random access memory devices were tested and found to be subject to soft errors when exposed to protons incident at energies between 18 and 130 MeV. These errors are shown to differ significantly from those induced in the same devices by alphas from an Am-241 source. There is considerable variation among devices in their sensitivity to proton-induced soft errors, even among devices of the same type. For protons incident at 130 MeV, the soft error cross sections measured in these experiments varied from 10^-8 to 10^-6 cm^2/proton. For individual devices, however, the soft error cross section consistently increased with beam energy from 18 to 130 MeV. Analysis indicates that the soft errors induced by energetic protons result from spallation interactions between the incident protons and the nuclei of the atoms comprising the device. Because energetic protons are the most numerous of both the galactic and solar cosmic rays and form the inner radiation belt, proton-induced soft errors have potentially serious implications for many electronic systems flown in space.
Failure detection and isolation analysis of a redundant strapdown inertial measurement unit
NASA Technical Reports Server (NTRS)
Motyka, P.; Landey, M.; Mckern, R.
1981-01-01
In this study, techniques for failure detection and isolation (FDI) algorithms for a dual fail-operational redundant strapdown inertial navigation system are defined and developed. The FDI techniques chosen include provisions for hard and soft failure detection in the context of flight control and navigation. Analyses were done to determine error detection and switching levels for the inertial navigation system, which is intended for a conventional takeoff or landing (CTOL) operating environment. In addition, false alarms and missed alarms were investigated for the FDI techniques developed, along with analyses of filters to be used in conjunction with FDI processing. Two specific FDI algorithms were compared: the generalized likelihood test and the edge vector test. A deterministic digital computer simulation was used to compare and evaluate the algorithms and FDI systems.
NASA Astrophysics Data System (ADS)
Zhang, Kuiyuan; Umehara, Shigehiro; Yamaguchi, Junki; Furuta, Jun; Kobayashi, Kazutoshi
2016-08-01
This paper analyzes how body bias and BOX region thickness affect soft error rates in 65-nm SOTB (Silicon on Thin BOX) and 28-nm UTBB (Ultra Thin Body and BOX) FD-SOI processes. Soft errors are induced by alpha-particle and neutron irradiation, and the results are then analyzed by Monte Carlo based simulation using PHITS-TCAD. The alpha-particle-induced single event upset (SEU) cross-section and neutron-induced soft error rate (SER) obtained by simulation are consistent with measurement results. We clarify that SERs decrease with increasing BOX thickness in SOTB, while SERs in UTBB are independent of BOX thickness. We also find that SOTB develops a higher tolerance to soft errors when reverse body bias is applied, while UTBB becomes more susceptible.
Multi-bits error detection and fast recovery in RISC cores
NASA Astrophysics Data System (ADS)
Jing, Wang; Xing, Yang; Yuanfu, Zhao; Weigong, Zhang; Jiao, Shen; Keni, Qiu
2015-11-01
Particle-induced soft errors are a major threat to the reliability of microprocessors. Even worse, multi-bit upsets (MBUs) are ever-increasing due to the rapidly shrinking feature size of ICs. Several architecture-level mechanisms have been proposed to protect microprocessors from soft errors, such as dual and triple modular redundancy (DMR and TMR). However, most of them are inefficient against the growing number of multi-bit errors or cannot balance critical-path delay, area, and power penalties well. This paper proposes a novel architecture, self-recovery dual-pipeline (SRDP), to effectively provide soft error detection and recovery at low cost for general RISC structures. We focus on the following three aspects. First, an advanced DMR pipeline is devised to detect soft errors, especially MBUs. Second, SEU/MBU errors can be located by adding self-checking logic to the pipeline stage registers. Third, a recovery scheme is proposed with a recovery cost of 1 or 5 clock cycles. Our evaluation of a prototype implementation shows that SRDP can detect particle-induced soft errors with up to 100% coverage and recover from nearly 95% of them; the remaining 5% enter a specific trap.
Patient-specific polyetheretherketone facial implants in a computer-aided planning workflow.
Guevara-Rojas, Godoberto; Figl, Michael; Schicho, Kurt; Seemann, Rudolf; Traxler, Hannes; Vacariu, Apostolos; Carbon, Claus-Christian; Ewers, Rolf; Watzinger, Franz
2014-09-01
In the present study, we report an innovative workflow using polyetheretherketone (PEEK) patient-specific implants for esthetic corrections in the facial region through onlay grafting. The planning includes implant design according to virtual osteotomy and generation of a subtraction volume. The implant design was refined by stepwise changing the implant geometry according to soft tissue simulations. One patient was scanned using computed tomography. PEEK implants were interactively designed and manufactured using rapid prototyping techniques. Positioning intraoperatively was assisted by computer-aided navigation. Two months after surgery, a 3-dimensional surface model of the patient's face was generated using photogrammetry. Finally, the Hausdorff distance calculation was used to quantify the overall error, encompassing the failures in soft tissue simulation and implantation. The implant positioning process during surgery was satisfactory. The simulated soft tissue surface and the photogrammetry scan of the patient showed a high correspondence, especially where the skin covered the implants. The mean total error (Hausdorff distance) was 0.81 ± 1.00 mm (median 0.48, interquartile range 1.11). The spatial deviation remained less than 0.7 mm for the vast majority of points. The proposed workflow provides a complete computer-aided design, computer-aided manufacturing, and computer-aided surgery chain for implant design, allowing for soft tissue simulation, fabrication of patient-specific implants, and image-guided surgery to position the implants. Much of the surgical complexity resulting from osteotomies of the zygoma, chin, or mandibular angle might be transferred into the planning phase of patient-specific implants.
Neutron beam irradiation study of workload dependence of SER in a microprocessor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michalak, Sarah E; Graves, Todd L; Hong, Ted
It is known that workloads are an important factor in soft error rates (SER), but it is proving difficult to find differentiating workloads for microprocessors. We have performed neutron beam irradiation studies of a commercial microprocessor under a wide variety of workload conditions, from idle (performing no operations) to very busy workloads resembling real HPC, graphics, and business applications. There is evidence that the mean times to first indication of failure (MTFIF, defined in Section II) may differ for some of the applications.
Asymmetric Memory Circuit Would Resist Soft Errors
NASA Technical Reports Server (NTRS)
Buehler, Martin G.; Perlman, Marvin
1990-01-01
Some nonlinear error-correcting codes are more efficient in the presence of asymmetry. A combination of circuit-design and coding concepts is expected to make integrated-circuit random-access memories more resistant to "soft" errors (temporary bit errors, also called "single-event upsets", caused by ionizing radiation). An integrated circuit of the new type is made deliberately more susceptible to one kind of bit error than to the other, and the associated error-correcting code is adapted to exploit this asymmetry in error probabilities.
Monte Carlo simulation of particle-induced bit upsets
NASA Astrophysics Data System (ADS)
Wrobel, Frédéric; Touboul, Antoine; Vaillé, Jean-Roch; Boch, Jérôme; Saigné, Frédéric
2017-09-01
We investigate the issue of radiation-induced failures in electronic devices by developing a Monte Carlo tool called MC-Oracle. It is able to transport particles in the device, calculate the energy deposited in the sensitive region of the device, and calculate the transient current induced by the primary particle and the secondary particles produced during nuclear reactions. We compare our simulation results with experiments on SRAMs irradiated with neutrons, protons, and ions. The agreement is very good and shows that it is possible to predict the soft error rate (SER) for a given device in a given environment.
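A toy Monte Carlo in the spirit of such tools: sample a deposited-charge distribution and count events exceeding a critical charge. The distribution and constants below are invented for illustration and are far simpler than MC-Oracle's particle transport and nuclear-reaction modeling.

```python
# Toy SER Monte Carlo: an event upsets a cell when its deposited charge
# exceeds the critical charge Qcrit. The exponential charge spectrum
# and all constants are hypothetical placeholders.
import random

def toy_upset_probability(n_events=100_000, q_crit_fC=1.5, mean_q_fC=0.4):
    upsets = 0
    for _ in range(n_events):
        q = random.expovariate(1.0 / mean_q_fC)   # sampled deposited charge, fC
        if q >= q_crit_fC:
            upsets += 1
    return upsets / n_events

# Analytically this is exp(-q_crit/mean_q) ~ 0.024 for the values above.
print(f"upset probability per event: {toy_upset_probability():.4f}")
```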
Deployment Testing of Flexible Composite Hinges in Bi-Material Beams
NASA Technical Reports Server (NTRS)
Sauder, Jonathan F.; Trease, Brian
2016-01-01
Composites have excellent properties for strength, thermal stability, and weight. However, they are traditionally highly rigid, and when used in deployable structures they require hinges bonded to the composite material, which increases complexity and opportunities for failure. Recent research in composites has found that by adding an elastomeric soft matrix, often silicone instead of an epoxy, the composite becomes flexible. This work explores the deployment repeatability of silicone matrix composite hinges which join rigid composite beams. The hinges were found to have sub-millimeter linear deployment repeatability and sub-degree angular deployment repeatability. Also, an interesting relaxation effect was discovered: a hinge's deployment error decreases with time.
Read disturb errors in a CMOS static RAM chip. [radiation hardened for spacecraft]
NASA Technical Reports Server (NTRS)
Wood, Steven H.; Marr, James C., IV; Nguyen, Tien T.; Padgett, Dwayne J.; Tran, Joe C.; Griswold, Thomas W.; Lebowitz, Daniel C.
1989-01-01
Results are reported from an extensive investigation into pattern-sensitive soft errors (read disturb errors) in the TCC244 CMOS static RAM chip. The TCC244, also known as the SA2838, is a radiation-hard, single-event-upset-resistant 4 x 256 memory chip. This device is being used by the Jet Propulsion Laboratory in the Galileo and Magellan spacecraft, which will have encounters with Jupiter and Venus, respectively. Two aspects of the part's design are shown to result in the occurrence of read disturb errors: the transparency of the signal path from the address pins to the array of cells, and the large resistance in the Vdd and Vss lines of the cells in the center of the array. Probe measurements taken during a read disturb failure illustrate how address skews and the data pattern in the chip combine to produce a bit flip. A capacitive charge pump, formed by the individual cell capacitances and the resistance in the supply lines, pumps down both the internal cell voltage and the local supply voltage until a bit flip occurs.
Puncture mechanics of soft elastomeric membrane with large deformation by rigid cylindrical indenter
NASA Astrophysics Data System (ADS)
Liu, Junjie; Chen, Zhe; Liang, Xueya; Huang, Xiaoqiang; Mao, Guoyong; Hong, Wei; Yu, Honghui; Qu, Shaoxing
2018-03-01
Soft elastomeric membrane structures are widely used and commonly found in engineering and biological applications. Puncture is one of the primary failure modes of soft elastomeric membranes at large deformation when indented by rigid objects. In order to investigate the puncture failure mechanism of soft elastomeric membranes with large deformation, we study the deformation and puncture failure of a silicone rubber membrane under continuous axisymmetric indentation by cylindrical steel indenters, both experimentally and analytically. In the experiments, the effects of indenter size and of the friction between the indenter and the membrane on the deformation and puncture failure of the membrane are investigated. In the analytical study, a model within the framework of nonlinear field theory is developed to describe the large local deformation around the punctured area and to predict the puncture failure of the membrane. The deformed membrane is divided into three parts, and the frictional contact between the membrane and the indenter is modeled by the Coulomb friction law. The first invariant of the right Cauchy-Green deformation tensor, I1, is adopted to predict the puncture failure of the membrane. The experimental and analytical results agree well. This work provides a guideline for designing reliable soft devices featuring membrane structures, which are present in a wide variety of applications.
Modeling Soft Tissue Damage and Failure Using a Combined Particle/Continuum Approach.
Rausch, M K; Karniadakis, G E; Humphrey, J D
2017-02-01
Biological soft tissues experience damage and failure as a result of injury, disease, or simply age; examples include torn ligaments and arterial dissections. Given the complexity of tissue geometry and material behavior, computational models are often essential for studying both damage and failure. Yet, because of the need to account for discontinuous phenomena such as crazing, tearing, and rupturing, continuum methods are limited. Therefore, we model soft tissue damage and failure using a particle/continuum approach. Specifically, we combine continuum damage theory with Smoothed Particle Hydrodynamics (SPH). Because SPH is a meshless particle method, and particle connectivity is determined solely through a neighbor list, discontinuities can be readily modeled by modifying this list. We show, for the first time, that an anisotropic hyperelastic constitutive model commonly employed for modeling soft tissue can be conveniently implemented within a SPH framework and that SPH results show excellent agreement with analytical solutions for uniaxial and biaxial extension as well as finite element solutions for clamped uniaxial extension in 2D and 3D. We further develop a simple algorithm that automatically detects damaged particles and disconnects the spatial domain along rupture lines in 2D and rupture surfaces in 3D. We demonstrate the utility of this approach by simulating damage and failure under clamped uniaxial extension and in a peeling experiment of virtual soft tissue samples. In conclusion, SPH in combination with continuum damage theory may provide an accurate and efficient framework for modeling damage and failure in soft tissues.
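A sketch of the neighbor-list idea described above: in SPH, removing a pair from the neighbor list is sufficient to introduce a crack, since particle interactions exist only through that list. The damage criterion and threshold below are placeholders, not the paper's constitutive model.

```python
# SPH-style rupture sketch: disconnect particle pairs whose accumulated
# damage exceeds a threshold. neighbors maps each particle id to the set
# of ids it interacts with (kept symmetric); damage maps unordered pairs
# to a scalar damage value in [0, inf). All values are illustrative.
def disconnect_damaged(neighbors, damage, threshold=1.0):
    for i, js in neighbors.items():
        for j in list(js):
            if damage.get(frozenset((i, j)), 0.0) >= threshold:
                js.discard(j)
                neighbors[j].discard(i)   # keep the neighbor list symmetric
    return neighbors

# Usage: particles 0 and 1 separate, introducing a discontinuity.
nbrs = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
print(disconnect_damaged(nbrs, {frozenset((0, 1)): 1.3}))
```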
Reliable Broadcast under Cascading Failures in Interdependent Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, Sisi; Lee, Sangkeun; Chinthavali, Supriya
Reliable broadcast is an essential tool to disseminate information among a set of nodes in the presence of failures. We present a novel study of reliable broadcast in interdependent networks, in which the failures in one network may cascade to another network. In particular, we focus on the interdependency between the communication network and the power grid network, where the power grid depends on signals from the communication network for control and the communication network depends on the grid for power. In this paper, we build a resilient solution to handle crash failures in the communication network that may cause cascading failures and may even partition the network. In order to guarantee that all the correct nodes deliver the messages, we use soft links, which are inactive backup links to non-neighboring nodes that are only active when failures occur. At the core of our work is a fully distributed algorithm for the nodes to predict and collect the information of cascading failures so that soft links can be maintained to correct nodes prior to the failures. In the presence of failures, soft links are activated to guarantee message delivery and new soft links are built accordingly for long-term robustness. Our evaluation results show that the algorithm achieves a low packet drop rate and handles cascading failures with little overhead.
Lochmüller, E M; Miller, P; Bürklein, D; Wehr, U; Rambeck, W; Eckstein, F
2000-01-01
The objective of this study was to directly compare in situ femoral dual-energy X-ray absorptiometry (DXA) and in vitro chemical analysis (ash weight and calcium) with mechanical failure loads of the proximal femur, and to determine the influence of bone size (volume) and density on mechanical failure and DXA-derived areal bone mineral density (BMD, in g/cm²). We performed femoral DXA in 52 fixed cadavers (age 82.1 ± 9.7 years; 30 male, 22 female) with intact skin and soft tissues. The femora were then excised, mechanically loaded to failure in a stance phase configuration, their volume measured with a water displacement method (proximal neck to lesser trochanter), and the ash weight and calcium content of this region determined by chemical analysis. The correlation coefficient between the bone mineral content (measured in situ with DXA) and the ash weight was r = 0.87 (standard error of the estimate = 16%), the ash weight allowing for a better prediction of femoral failure loads (r = 0.78; p < 0.01) than DXA (r = 0.67; p < 0.01). The femoral volume (r = 0.61; p < 0.01), but not the volumetric bone density (r = 0.26), was significantly associated with the failure load. The femoral bone volume had a significant impact (r = 0.35; p < 0.01) on the areal BMD (DXA), and only 63% of the variability of bone volume could be predicted on the basis of body height, weight, and femoral projectional bone area. The results suggest that accuracy errors of femoral DXA limit the prediction of mechanical failure loads, and that the influence of bone size on areal BMD cannot be fully corrected by accounting for body height, weight, and projected femoral area.
Brown, Christopher A; Hurwit, Daniel; Behn, Anthony; Hunt, Kenneth J
2014-02-01
Anatomic repair is indicated for patients who have recurrent lateral ankle instability despite nonoperative measures. There is no difference in repair stiffness, failure torque, or failure angle between specimens repaired with all-soft suture anchors versus the modified Broström-Gould technique with sutures only. Controlled laboratory study. In 10 matched pairs of human cadaveric ankles, the anterior talofibular ligament (ATFL) was incised from its origin on the fibula. After randomization, 1 ankle was repaired to its anatomic insertion using two 1.4-mm JuggerKnot all-soft suture anchors; the other ankle was repaired with a modified Broström-Gould technique using 2-0 FiberWire. All were augmented using the inferior extensor retinaculum. All ankles were mounted to the testing machine in 20° of plantar flexion and 15° of internal rotation and loaded to failure after the repair. Stiffness, failure torque, and failure angle were recorded and compared using a paired Student t test with a significance level set at P < .05. There was no significant difference in failure torque, failure angle, or stiffness. No anchors pulled out of bone. The primary mode of failure was pulling through the ATFL tissue. There was no statistical difference in strength or stiffness between a 1.4-mm all-soft suture anchor and a modified Broström-Gould repair with 2-0 FiberWire. The primary mode of failure was at the tissue level rather than knot failure or anchor pullout. The particular implant choice (suture only, tunnel, anchor) in repairing the lateral ligament complex may not be as important as the time to biological healing. The suture-only construct as described in the Broström-Gould repair was as strong as all-soft suture anchors, and the majority of the ankles failed at the tissue level. For those surgeons whose preference is to use anchor repair, this novel all-soft suture anchor may be an alternative to other larger anchors, as none failed by pullout.
On-orbit observations of single event upset in Harris HM-6508 1K RAMs, reissue A
NASA Astrophysics Data System (ADS)
Blake, J. B.; Mandel, R.
1987-02-01
The Harris HM-6508 1K x 1 RAMs are part of a subsystem of a satellite in a low, polar orbit. The memory module, used in the subsystem containing the RAMs, consists of three printed circuit cards, with each card containing eight 2K byte memory hybrids, for a total of 48K bytes. Each memory hybrid contains 16 HM-6508 RAM chips. On a regular basis all but 256 bytes of the 48K bytes are examined for bit errors. Two different techniques were used for detecting bit errors. The first technique, a memory check sum, was capable of automatically detecting all single bit and some double bit errors which occurred within a page of memory. A memory page consists of 256 bytes. Memory check sum tests are performed approximately every 90 minutes. To detect a multiple error or to determine the exact location of the bit error within the page the entire contents of the memory is dumped and compared to the load file. Memory dumps are normally performed once a month, or immediately after the check sum routine detects an error. Once the exact location of the error is found, the correct value is reloaded into memory. After the memory is reloaded, the contents of the memory location in question is verified in order to determine if the error was a soft error generated by an SEU or a hard error generated by a part failure or cosmic-ray induced latchup.
Human Activity Recognition by Combining a Small Number of Classifiers.
Nazabal, Alfredo; Garcia-Moreno, Pablo; Artes-Rodriguez, Antonio; Ghahramani, Zoubin
2016-09-01
We consider the problem of daily human activity recognition (HAR) using multiple wireless inertial sensors, and specifically, HAR systems with a very low number of sensors, each one providing an estimation of the performed activities. We propose new Bayesian models to combine the output of the sensors. The models are based on a soft-output combination of individual classifiers to deal with the small number of sensors. We also incorporate the dynamic nature of human activities as a first-order homogeneous Markov chain. We develop both inductive and transductive inference methods for each model to be employed in supervised and semisupervised situations, respectively. Using different real HAR databases, we compare our classifier-combination models against a single classifier that employs all the signals from the sensors. Our models consistently exhibit a reduction of the error rate and an increase in robustness against sensor failures. Our models also outperform other classifier-combination models that do not consider soft outputs and a Markovian structure of the human activities.
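A minimal sketch of soft-output fusion with a first-order Markov prior, under a naive independence assumption between sensors; this illustrates the general idea, not the authors' Bayesian inference scheme.

```python
# One filtering step: fuse per-sensor class-probability vectors with a
# Markov prediction from the previous posterior. Independence between
# sensors is assumed for the fusion; all arrays are hypothetical.
import numpy as np

def fuse_step(sensor_probs, prev_posterior, transition):
    """sensor_probs: list of length-C probability vectors, one per sensor.
    transition[i, j] = P(activity_t = j | activity_{t-1} = i)."""
    likelihood = np.prod(np.vstack(sensor_probs), axis=0)  # naive-Bayes fusion
    prior = transition.T @ prev_posterior                  # Markov prediction
    post = likelihood * prior
    return post / post.sum()                               # normalize

# Usage with 2 sensors and 3 activities (e.g. walk, sit, stand):
T = np.array([[0.8, 0.1, 0.1], [0.1, 0.8, 0.1], [0.1, 0.1, 0.8]])
print(fuse_step([np.array([0.7, 0.2, 0.1]),
                 np.array([0.6, 0.3, 0.1])], np.array([0.5, 0.3, 0.2]), T))
```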
Zhang, Xiaowei; Sahraei, Elham; Wang, Kai
2016-01-01
Separator integrity is an important factor in preventing internal short circuit in lithium-ion batteries. Local penetration tests (nail or conical punch) often produce presumably sporadic results, where in exactly similar cell and test set-ups one cell goes to thermal runaway while the other shows minimal reactions. We conducted an experimental study of the separators under mechanical loading, and discovered two distinct deformation and failure mechanisms, which could explain the difference in short circuit characteristics of otherwise similar tests. Additionally, by investigation of failure modes, we provided a hypothesis about the process of formation of local “soft short circuits” in cells with undetectable failure. Finally, we proposed a criterion for predicting onset of soft short from experimental data.
Register file soft error recovery
Fleischer, Bruce M.; Fox, Thomas W.; Wait, Charles D.; Muff, Adam J.; Watson, III, Alfred T.
2013-10-15
Register file soft error recovery including a system that includes a first register file and a second register file that mirrors the first register file. The system also includes an arithmetic pipeline for receiving data read from the first register file, and error detection circuitry to detect whether the data read from the first register file includes corrupted data. The system further includes error recovery circuitry to insert an error recovery instruction into the arithmetic pipeline in response to detecting the corrupted data. The inserted error recovery instruction replaces the corrupted data in the first register file with a copy of the data from the second register file.
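A toy software model of the mirrored-register-file recovery flow described in this patent abstract; the detection hook stands in for the error detection circuitry, and none of this represents the patented circuit itself.

```python
# Toy model: a shadow register file written in lockstep with the primary
# supplies the recovery copy when a read detects corruption.
class MirroredRegFile:
    def __init__(self, n, detect_error):
        self.primary = [0] * n
        self.shadow = [0] * n            # mirror, written in lockstep
        self.detect_error = detect_error # stand-in for parity/ECC checking

    def write(self, i, value):
        self.primary[i] = value
        self.shadow[i] = value           # both files updated on every write

    def read(self, i):
        if self.detect_error(self.primary[i]):    # corrupted data detected?
            self.primary[i] = self.shadow[i]      # recover from the mirror
        return self.primary[i]
```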
Preisig, James C
2005-07-01
Equations are derived for analyzing the performance of channel estimate based equalizers. The performance is characterized in terms of the mean squared soft decision error (σ_s²) of each equalizer. This error is decomposed into two components: the minimum achievable error (σ_0²) and the excess error (σ_e²). The former is the soft decision error that would be realized by the equalizer if the filter coefficient calculation were based upon perfect knowledge of the channel impulse response and the statistics of the interfering noise field. The latter is the additional soft decision error that is realized due to errors in the estimates of these channel parameters. These expressions accurately predict the equalizer errors observed in the processing of experimental data by a channel estimate based decision feedback equalizer (DFE) and a passive time-reversal equalizer. Further expressions are presented that allow equalizer performance to be predicted given the scattering function of the acoustic channel. The analysis using these expressions yields insights into the features of surface scattering that most significantly impact equalizer performance in shallow water environments and motivates the implementation of a DFE that is robust with respect to channel estimation errors.
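In display form, the decomposition stated above is (assuming the additive split the two-component description implies):

```latex
% Mean squared soft decision error as the sum of the minimum achievable
% error and the excess error due to channel-estimate imperfection:
\sigma_s^2 = \sigma_0^2 + \sigma_e^2
```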
Prien-Larsen, Jens Christian; Prien-Larsen, Thomas; Cieslak, Lars; Dessau, Ram B
2016-07-01
Although there is clear consensus on the use of monofilament polypropylene tapes for treating stress urinary incontinence (SUI), tapes differ in weight, stiffness, and elasticity. In this study, we compared outcomes of two tape types: the high-stiffness Intramesh SOFT L.I.F.T. versus the low-stiffness Intramesh L.I.F.T. tape. Our null hypothesis was that, in terms of performance, SOFT tape equaled L.I.F.T. tape. Six hundred and sixty women underwent prospective transvaginal tape (TVT) surgery for SUI: 210 had the SOFT tape placed and 450 the L.I.F.T. tape. Follow-ups were scheduled at 3 and 12 months. Objective cure at 3 months' follow-up was 87% in the SOFT group vs 94% in the L.I.F.T. group (p = 0.003) and at 12 months 86 vs 96% (p = 0.0004), respectively. Subjective outcomes were equal. For SOFT tape, the objective failure rate at 3 months was especially pronounced in women older than 70 years: 31 vs 10% (p = 0.008), and subjective failure was 24 vs 7% (p = 0.01). At 12 months, objective failure for the SOFT tape was significantly higher in both age groups compared with L.I.F.T. [odds ratio (OR) 2.17]. Multivariate analysis showed that body mass index (BMI) ≥ 30 (OR 2.41), mixed incontinence (MUI) (OR 2.24), use of SOFT tape (OR 2.17), and age ≥ 70 years are significant independent risk factors for surgical failure. Outcomes with SOFT tape are significantly inferior to those with L.I.F.T. tape, especially among elderly women. Therefore, the two variants of monofilament polypropylene tape are not interchangeable.
Detection and Correction of Silent Data Corruption for Large-Scale High-Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fiala, David J; Mueller, Frank; Engelmann, Christian
Faults have become the norm rather than the exception for high-end computing on clusters with tens or hundreds of thousands of cores. Exacerbating this situation, some of these faults remain undetected, manifesting themselves as silent errors that corrupt memory while applications continue to operate and report incorrect results. This paper studies the potential for redundancy to both detect and correct soft errors in MPI message-passing applications. Our study investigates the challenges inherent to detecting soft errors within MPI applications while providing transparent MPI redundancy. By assuming a model wherein corruption in application data manifests itself by producing differing MPI message data between replicas, we study the protocols best suited for detecting and correcting MPI data corruption. To experimentally validate our proposed detection and correction protocols, we introduce RedMPI, an MPI library which resides in the MPI profiling layer. RedMPI is capable of both online detection and correction of soft errors that occur in MPI applications, without requiring any modifications to the application source, by utilizing either double or triple redundancy. Our results indicate that our most efficient consistency protocol can successfully protect applications experiencing even high rates of silent data corruption, with runtime overheads between 0% and 30% compared to unprotected applications without redundancy. Using our fault injector within RedMPI, we observe that even a single soft error can have profound effects on running applications, causing a cascading pattern of corruption that in most cases spreads to all other processes. RedMPI's protection has been shown to successfully mitigate the effects of soft errors while allowing applications to complete with correct results even in the face of errors.
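A sketch of the replica-comparison principle described above: corruption manifests as diverging outgoing message payloads between replicas, so comparing digests of the messages detects it. The helper below is a hypothetical illustration, not RedMPI's API; with triple redundancy, a majority vote over three digests would additionally identify the correct value.

```python
# Detect silent data corruption by comparing digests of the "same"
# outgoing MPI message produced by two replicas. Hypothetical helper.
import hashlib

def detect_divergence(msg_a: bytes, msg_b: bytes) -> bool:
    """True when replica payloads differ, indicating corruption in one."""
    return hashlib.sha256(msg_a).digest() != hashlib.sha256(msg_b).digest()

# Usage: a single flipped byte in one replica's payload is flagged.
print(detect_divergence(b"\x01\x02\x03", b"\x01\x02\x03"))  # False
print(detect_divergence(b"\x01\x02\x03", b"\x01\x0a\x03"))  # True
```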
Practicality of Evaluating Soft Errors in Commercial sub-90 nm CMOS for Space Applications
NASA Technical Reports Server (NTRS)
Pellish, Jonathan A.; LaBel, Kenneth A.
2010-01-01
The purpose of this presentation is to highlight the evolution of space memory evaluation, review recent developments regarding low-energy proton direct ionization soft errors, assess current space memory evaluation challenges (including the increase in non-volatile technology choices), and discuss related testing and evaluation complexities.
NASA Astrophysics Data System (ADS)
Aghababaei, Sajjad; Saeedi, Gholamreza; Jalalifar, Hossein
2016-05-01
The floor failure at the longwall face decreases productivity and safety, increases operating costs, and causes other serious problems. In the Parvadeh-I coal mine, timber is used to prevent the puncture of the powered-support base into the floor. In this paper, a rock engineering system (RES)-based model is presented to evaluate the risk of floor failure mechanisms at the longwall face of the E2 and W1 panels. The presented model is used to determine the most probable floor failure mechanism, effective factors, damaged regions, and remedial actions. From the analyzed results, it is found that soft floor failure is the dominant floor failure mechanism at the Parvadeh-I coal mine. The average vulnerability index (VI) for the soft, buckling, and compressive floor failure mechanisms was estimated at 52, 43, and 30, respectively, for both panels. With the critical VI for the soft floor failure mechanism set at 54, the percentage of regions with VIs beyond the critical VI is 65.5 in the E2 panel and 30 in the W1 panel. The percentage of damaged regions showed that the excess amount of timber used to prevent the puncture of the weak floor below the powered-support base equals 4,180,739 kg. RES outputs and the analyzed results showed that the setting and yielding load of the powered supports, the length of the face, water present at the face, the geometry of the powered supports, changing the cutting pattern at the longwall face, and limiting the panels to damaged regions with supercritical VIs could be considered to control soft floor failure in this mine. The results of this research could serve as a useful tool to identify damaged regions prior to mining operations at longwall panels under similar conditions.
Morbi, Abigail H M; Hamady, Mohamad S; Riga, Celia V; Kashef, Elika; Pearch, Ben J; Vincent, Charles; Moorthy, Krishna; Vats, Amit; Cheshire, Nicholas J W; Bicknell, Colin D
2012-08-01
To determine the type and frequency of errors during vascular interventional radiology (VIR) and to design and implement an intervention to reduce error and improve efficiency in this setting. Ethical guidance was sought from the Research Services Department at Imperial College London. Informed consent was not obtained. Field notes were recorded during 55 VIR procedures by a single observer. Two blinded assessors identified failures from the field notes and categorized them into one or more errors by using a 22-part classification system. The potential to cause harm, disruption to procedural flow, and preventability of each failure were determined. A preprocedural team rehearsal (PPTR) was then designed and implemented to target frequent preventable potential failures. Thirty-three procedures were observed subsequently to determine the efficacy of the PPTR. Nonparametric statistical analysis was used to determine the effect of the intervention on potential failure rates, potential to cause harm and procedural flow disruption scores (Mann-Whitney U test), and number of preventable failures (Fisher exact test). Before intervention, 1197 potential failures were recorded, of which 54.6% were preventable. A total of 2040 errors were deemed to have occurred to produce these failures. Planning error (19.7%), staff absence (16.2%), equipment unavailability (12.2%), communication error (11.2%), and lack of safety consciousness (6.1%) were the most frequent errors, accounting for 65.4% of the total. After intervention, 352 potential failures were recorded. Classification resulted in 477 errors. Preventable failures decreased from 54.6% to 27.3% (P < .001) with implementation of the PPTR. Potential failure rates per hour decreased from 18.8 to 9.2 (P < .001), with no increase in potential to cause harm or procedural flow disruption per failure. Failures during VIR procedures are largely due to ineffective planning, communication error, and equipment difficulties, rather than technical or patient-related issues. Many of these potential failures are preventable. A PPTR is an effective means of targeting frequent preventable failures, reducing procedural delays and improving patient safety.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimura, K.; Ohmi, K.; Tottori University Electronic Display Research Center, 101 Minami4-chome, Koyama-cho, Tottori-shi, Tottori 680-8551
With increasing density of memory devices, the issue of soft errors generated by cosmic rays is becoming more and more serious. Therefore, the irradiation resistance of resistance random access memory (ReRAM) to cosmic radiation has to be elucidated for practical use. In this paper, we investigated the data retention characteristics of ReRAM with a Pt/NiO/ITO structure against ultraviolet irradiation. Soft errors were confirmed to be caused by ultraviolet irradiation in both low- and high-resistance states. An analysis of the wavelength dependence of light irradiation on data retention characteristics suggested that the errors were caused by electronic excitation from the valence band to the conduction band and to the energy level generated by the introduction of oxygen vacancies. Based on statistically estimated soft error rates, the errors were suggested to be caused by the cohesion and dispersion of oxygen vacancies owing to the generation of electron-hole pairs and valence changes by the ultraviolet irradiation.
Failure analysis and modeling of a VAXcluster system
NASA Technical Reports Server (NTRS)
Tang, Dong; Iyer, Ravishankar K.; Subramani, Sujatha S.
1990-01-01
This paper discusses the results of a measurement-based analysis of real error data collected from a DEC VAXcluster multicomputer system. In addition to evaluating basic system dependability characteristics such as error and failure distributions and hazard rates for both individual machines and for the VAXcluster, reward models were developed to analyze the impact of failures on the system as a whole. The results show that more than 46 percent of all failures were due to errors in shared resources, despite the fact that these errors have a recovery probability greater than 0.99. The hazard rate calculations show that not only errors but also failures occur in bursts. Approximately 40 percent of all failures occur in bursts and involve multiple machines, indicating that correlated failures are significant. Analysis of rewards shows that software errors have the lowest reward (0.05 vs 0.74 for disk errors). The expected reward rate (reliability measure) of the VAXcluster drops to 0.5 in 18 hours for the 7-out-of-7 model and in 80 days for the 3-out-of-7 model.
Random Weighting, Strong Tracking, and Unscented Kalman Filter for Soft Tissue Characterization.
Shin, Jaehyun; Zhong, Yongmin; Oetomo, Denny; Gu, Chengfan
2018-05-21
This paper presents a new nonlinear filtering method based on the Hunt-Crossley model for online nonlinear soft tissue characterization. This method overcomes the performance degradation of the unscented Kalman filter due to contact model error. It adopts the concept of Mahalanobis distance to identify contact model error, and further incorporates a scaling factor in the predicted state covariance to compensate for the identified model error. This scaling factor is determined according to the principle of innovation orthogonality to avoid the cumbersome computation of the Jacobian matrix, and the random weighting concept is adopted to improve the estimation accuracy of the innovation covariance. A master-slave robotic indentation system is developed to validate the performance of the proposed method. Simulation and experimental results, as well as comparison analyses, demonstrate the efficacy of the proposed method for online characterization of soft tissue parameters in the presence of contact model error.
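The abstract describes the mechanism only at a high level; the sketch below illustrates the general pattern of Mahalanobis-distance model-error detection with covariance inflation. The chi-square threshold and the inflation rule are our illustrative stand-ins for the paper's random-weighting, innovation-orthogonality derivation.

```python
import numpy as np
from scipy.stats import chi2

def adaptive_update(y_pred, y_meas, S_pred, P_pred, alpha=0.05):
    """One illustrative step of contact-model-error compensation.

    A Mahalanobis test on the innovation flags model error; if flagged,
    the predicted covariances are inflated by a scalar factor so the
    filter re-weights the measurement. The paper derives its factor from
    innovation orthogonality with random weighting; the chi-square rule
    here is a stand-in for illustration.
    """
    innovation = y_meas - y_pred
    d2 = float(innovation @ np.linalg.solve(S_pred, innovation))
    threshold = chi2.ppf(1.0 - alpha, df=innovation.size)
    if d2 > threshold:                    # contact model error detected
        lam = d2 / threshold              # inflation factor > 1
        P_pred, S_pred = lam * P_pred, lam * S_pred
    return innovation, P_pred, S_pred

# Example with a 2-D measurement and a 4-D state:
inn, P, S = adaptive_update(np.array([0.1, 0.0]), np.array([0.9, 0.4]),
                            np.eye(2) * 0.05, np.eye(4) * 0.01)
```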
Evaluation of the effect of energetic particles in solar flares on satellite lifetime
NASA Astrophysics Data System (ADS)
Bagheri, Z.; Davoudifar, P.
2016-09-01
As satellites play multiple roles in human life, damage to them and the resulting failures of their segments cause problems and considerable expense. Evaluating the different types of failures in their segments therefore plays a crucial role. Solar particles are one of the most important causes of segment damage (hard and soft), both during solar events and in quiet times. During a solar event, these particles may cause extensive damage that can even be permanent (hard errors). To avoid these effects and to design shielding, one needs to know the SEP (solar energetic particle) flux and the MTTF (mean time to failure) of the segments. In the present work, we calculated the SEP flux striking the satellite in quiet times at different altitudes. OMERE software was used to determine the coordinates and specifications of a satellite that, in the simulations, had been launched into space. We then considered a common electronic computer part and calculated its MTTF. In the same way, SEP fluxes were calculated during different solar flares of different solar cycles, and MTTFs were evaluated for periods when solar flares occur. A relation between solar flare energy and the lifetime of the satellite electronic part (in hours) was thus obtained.
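The abstract does not give the MTTF formula used; a common first-order estimate multiplies particle flux by a per-bit upset cross-section, as in this sketch (all numbers are placeholders, not values from the paper):

```python
def mttf_hours(flux_cm2_s: float, sigma_cm2_per_bit: float, n_bits: int) -> float:
    """First-order MTTF estimate for an electronic part.

    Upset rate = particle flux * per-bit upset cross-section * bit count;
    the MTTF is its reciprocal. All values below are illustrative
    placeholders, not numbers from the paper.
    """
    rate_per_s = flux_cm2_s * sigma_cm2_per_bit * n_bits
    return 1.0 / rate_per_s / 3600.0

# Quiet-time SEP background vs. a large flare (made-up fluxes):
print(mttf_hours(1e-4, 1e-9, 4 * 2**20))   # quiet: ~660 h
print(mttf_hours(1e1,  1e-9, 4 * 2**20))   # flare: tens of seconds
```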
Joyce, Christopher D; Randall, Kyle L; Mariscalco, Michael W; Magnussen, Robert A; Flanigan, David C
2016-02-01
To describe the outcomes of bone-patellar tendon-bone (BPTB) and soft-tissue allografts in anterior cruciate ligament (ACL) reconstruction with respect to graft failure risk, physical examination findings, instrumented laxity, and patient-reported outcomes. A search of the PubMed, Scopus, CINAHL (Cumulative Index to Nursing and Allied Health Literature) Complete, Cochrane Collaboration, and SPORTDiscus databases was performed. English-language studies with outcome data on primary ACL reconstruction with nonirradiated BPTB and soft-tissue allografts were identified. Outcome data included failure risk, physical examination findings, instrumented laxity measurements, and patient-reported outcome scores. Seventeen studies met the inclusion criteria. Of these studies, 11 reported on BPTB allografts exclusively, 5 reported on soft-tissue allografts exclusively, and 1 compared both types. The comparative study showed no difference in failure risk, Lachman grade, pivot-shift grade, instrumented laxity, or overall International Knee Documentation Committee score between the 2 allograft types. Data from all studies yielded a failure risk of 10.3% (95% confidence interval [CI], 4.5% to 18.1%) in the soft-tissue group and 15.2% (95% CI, 11.3% to 19.6%) in the BPTB group. The risk of a Lachman grade greater than 5 mm was 6.4% (95% CI, 1.7% to 13.7%) in the soft-tissue group and 8.6% (95% CI, 6.3% to 11.2%) in the BPTB group. The risk of a grade 2 or 3 pivot shift was 1.4% (95% CI, 0.3% to 3.3%) in the soft-tissue group and 4.1% (95% CI, 1.9% to 7.2%) in the BPTB group. One comparative study showed no difference in results after ACL reconstruction with nonirradiated BPTB and soft-tissue allografts. Inclusion of case series in the analysis showed qualitatively similar outcomes with the 2 graft types. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
CLEAR: Cross-Layer Exploration for Architecting Resilience
2017-03-01
benchmark analysis, also provides cost-effective solutions (~1% additional energy cost for the same 50× improvement). This paper addresses the...core (OoO-core) [Wang 04], across 18 benchmarks. Such extensive exploration enables us to conclusively answer the above cross-layer resilience...analysis of the effects of soft errors on application benchmarks, provides a highly effective soft error resilience approach. 3. The above
Alpha particle-induced soft errors in microelectronic devices. I
NASA Astrophysics Data System (ADS)
Redman, D. J.; Sega, R. M.; Joseph, R.
1980-03-01
The article provides a tutorial review and trend assessment of the problem of alpha particle-induced soft errors in VLSI memories. Attention is given to an analysis of the design evolution of modern ICs, and the characteristics of alpha particles and their origin in IC packaging are reviewed. Finally, the process of an alpha particle penetrating an IC is examined.
Chai, Chen; Wong, Yiik Diew; Wang, Xuesong
2017-07-01
This paper proposes a simulation-based approach to estimating the safety impact of driver cognitive failures and driving errors. Fuzzy Logic, which handles linguistic terms and uncertainty, is incorporated with a Cellular Automata model to simulate the decision-making process of right-turn filtering movement at signalized intersections. Simulation experiments are conducted to estimate the relationships of cognitive failures and driving errors with safety performance. Simulation results show that different types of cognitive failures have varied relationships with driving errors and safety performance. For right-turn filtering movement, cognitive failures are more likely to result in driving errors with a denser conflicting traffic stream. Moreover, different driving errors are found to have different safety impacts. The study provides a novel approach to linguistically assess cognitions and replicate the decision-making procedures of the individual driver. Compared to crash analysis, the proposed FCA model allows quantitative estimation of particular cognitive failures, and of the impact of cognitions on driving errors and safety performance. Copyright © 2017 Elsevier Ltd. All rights reserved.
Low delay and area efficient soft error correction in arbitration logic
Sugawara, Yutaka
2013-09-10
There is provided an arbitration logic device for controlling access to a shared resource. The arbitration logic device comprises at least one storage element, a winner selection logic device, and an error detection logic device. The storage element stores a plurality of requestors' information. The winner selection logic device selects a winner requestor among the requestors based on the requestors' information received from the plurality of requestors. The winner selection logic device selects the winner requestor without checking whether there is a soft error in the winner requestor's information.
NASA Astrophysics Data System (ADS)
Celik, Cihangir
Advances in microelectronics result in sub-micrometer electronic technologies as predicted by Moore's Law (1965), which states that the number of transistors in a given space will double every two years. Most memory architectures available today have sub-micrometer transistor dimensions. The International Technology Roadmap for Semiconductors (ITRS), a continuation of Moore's Law, predicts that Dynamic Random Access Memory (DRAM) will have an average half-pitch size of 50 nm and Microprocessor Units (MPU) will have an average gate length of 30 nm over the period 2008-2012. Decreases in these dimensions satisfy the producer and consumer requirements of low power consumption, more data storage in a given space, faster clock speed, and portability of integrated circuits (IC), particularly memories. On the other hand, these properties also lead to a higher susceptibility of IC designs to temperature, magnetic interference, power supply and environmental noise, and radiation. Radiation can directly or indirectly affect device operation. When a single energetic particle strikes a sensitive node in a micro-electronic device, it can cause a permanent or transient malfunction in the device. This behavior is called a Single Event Effect (SEE). SEEs are mostly transient errors that generate an electric pulse which alters the state of a logic node in the memory device without having a permanent effect on the functionality of the device. This is called a Single Event Upset (SEU) or Soft Error. In contrast to SEUs, Single Event Latchup (SEL), Single Event Gate Rupture (SEGR), and Single Event Burnout (SEB) have permanent effects on device operation, and a system reset or recovery is needed to return to proper operation. The rate at which a device or system encounters soft errors is defined as the Soft Error Rate (SER). The semiconductor industry has been struggling with SEEs and is taking the necessary measures to continue to improve system designs in nano-scale technologies. Prevention of SEEs has been studied and applied in the semiconductor industry by including radiation protection precautions in the system architecture or by using corrective algorithms in the system operation. Decreasing the 10B content (20% of natural boron) in the natural boron of the borophosphosilicate glass (BPSG) layers conventionally used in the fabrication of semiconductor devices was one of the major radiation protection approaches for the system architecture. Neutron interaction in the BPSG layer was the origin of the SEEs because of the 10B(n,alpha)7Li reaction products. Both of the particles produced are capable of ionization in the silicon substrate region, whose thickness is comparable to the ranges of these particles. Using the soft error phenomenon in exactly the opposite manner from the semiconductor industry can provide a new neutron detection system based on the SERs in semiconductor memories. By investigating the soft error mechanisms in available semiconductor memories and enhancing the soft error occurrences in these devices, one can convert all memory-using intelligent systems into portable, power-efficient, direction-dependent neutron detectors. The Neutron Intercepting Silicon Chip (NISC) project aims to achieve this goal by introducing 10B-enriched BPSG layers into semiconductor memory architectures.
This research addresses the development of a simulation tool, the NISC Soft Error Analysis Tool (NISCSAT), for soft error modeling and analysis in semiconductor memories, to provide basic design considerations for the NISC. NISCSAT performs particle transport and calculates the soft error probabilities, or SER, depending on the energy depositions of the particles in a given memory node model of the NISC. Soft error measurements were performed with commercially available, off-the-shelf semiconductor memories and microprocessors to observe soft error variations with the neutron flux and the memory supply voltage. Measurement results show that soft errors in the memories increase proportionally with the neutron flux, whereas they decrease with increasing supply voltage. NISC design considerations in this dissertation include the effects of device scaling, 10B content in the BPSG layer, incoming neutron energy, and the critical charge of the node. NISCSAT simulations were performed with various memory node models to account for these effects. Device scaling simulations showed that any increase in the thickness of the BPSG layer beyond 2 µm causes self-shielding of the incoming neutrons by the BPSG layer and results in lower detection efficiencies. Moreover, if the BPSG layer is located more than 4 µm from the depletion region in the node, there are no soft errors in the node, because both reaction products have lower ranges in silicon or any possible node layers. Calculation results regarding the critical charge indicated that the mean charge deposition of the reaction products in the sensitive volume of the node is about 15 fC. It is evident that the NISC design should have a memory architecture with a critical charge of 15 fC or less to obtain higher detection efficiencies. Moreover, the sensitive volume should be placed in close proximity to the BPSG layers so that its location lies within the range of the alpha and 7Li particles. Results showed that the distance between the BPSG layer and the sensitive volume should be less than 2 µm to increase the detection efficiency of the NISC. Incoming neutron energy was also investigated by simulation, and the results showed that the NISC neutron detection efficiency is related to the neutron cross-sections of the 10B(n,alpha)7Li reaction; e.g., the ratio of thermal (0.0253 eV) to fast (2 MeV) neutron detection efficiency is approximately 8000:1. Environmental conditions and their effects on NISC performance were also studied in this research. Cosmic rays were modeled and simulated via NISCSAT to investigate the detection reliability of the NISC. Simulation results show that cosmic rays account for less than 2% of the soft errors for thermal neutron detection. On the other hand, fast neutron detection by the NISC, which already has poor efficiency due to the low neutron cross-sections, becomes almost impossible at higher altitudes, where the cosmic ray fluxes and their energies are higher. NISCSAT simulations regarding the soft error dependency of the NISC on temperature and electromagnetic fields show no significant effects on NISC detection efficiency. Furthermore, the detection efficiency of the NISC decreases with both air humidity and the use of moderators, since the incoming neutrons scatter away before reaching the memory surface.
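As a rough illustration of the detection criterion described above, one can convert an energy deposition into liberated charge (using the standard 3.6 eV per electron-hole pair in silicon) and compare it against the 15 fC critical charge reported in the dissertation; everything else in this sketch is our simplification, not NISCSAT's transport model:

```python
E_PAIR_SI_EV = 3.6      # mean energy per electron-hole pair in silicon (eV)
Q_E = 1.602e-19         # elementary charge (C)

def deposited_charge_fc(energy_kev: float) -> float:
    """Charge liberated in silicon by a given energy deposition, in fC."""
    pairs = energy_kev * 1e3 / E_PAIR_SI_EV
    return pairs * Q_E * 1e15

def count_upsets(depositions_kev, q_crit_fc=15.0):
    """Count events whose deposited charge exceeds the critical charge.

    Treating the 15 fC mean deposition reported above directly as the
    upset threshold is a simplification for illustration.
    """
    return sum(deposited_charge_fc(e) >= q_crit_fc for e in depositions_kev)

# The 10B(n,alpha)7Li products carry ~1.47 MeV (alpha) and ~0.84 MeV (7Li);
# only the fraction deposited in the sensitive volume counts.
print(deposited_charge_fc(337.0))   # ~15 fC, the reported mean deposition
```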
Oliveira-Santos, Thiago; Klaeser, Bernd; Weitzel, Thilo; Krause, Thomas; Nolte, Lutz-Peter; Peterhans, Matthias; Weber, Stefan
2011-01-01
Percutaneous needle intervention based on PET/CT images is effective, but exposes the patient to unnecessary radiation due to the increased number of CT scans required. Computer assisted intervention can reduce the number of scans, but requires handling, matching and visualization of two different datasets. While one dataset is used for target definition according to metabolism, the other is used for instrument guidance according to anatomical structures. No navigation systems capable of handling such data and performing PET/CT image-based procedures while following clinically approved protocols for oncologic percutaneous interventions are available. The need for such systems is emphasized in scenarios where the target can be located in different types of tissue such as bone and soft tissue. These two tissues require different clinical protocols for puncturing and may therefore give rise to different problems during the navigated intervention. Studies comparing the performance of navigated needle interventions targeting lesions located in these two types of tissue are not often found in the literature. Hence, this paper presents an optical navigation system for percutaneous needle interventions based on PET/CT images. The system provides viewers for guiding the physician to the target with real-time visualization of PET/CT datasets, and is able to handle targets located in both bone and soft tissue. The navigation system and the required clinical workflow were designed taking into consideration clinical protocols and requirements, and the system is thus operable by a single person, even during transition to the sterile phase. Both the system and the workflow were evaluated in an initial set of experiments simulating 41 lesions (23 located in bone tissue and 18 in soft tissue) in swine cadavers. We also measured and decomposed the overall system error into distinct error sources, which allowed for the identification of particularities involved in the process as well as highlighting the differences between bone and soft tissue punctures. An overall average error of 4.23 mm and 3.07 mm for bone and soft tissue punctures, respectively, demonstrated the feasibility of using this system for such interventions. The proposed system workflow was shown to be effective in separating the preparation from the sterile phase, as well as in keeping the system manageable by a single operator. Among the distinct sources of error, the user error based on the system accuracy (defined as the distance from the planned target to the actual needle tip) appeared to be the most significant. Bone punctures showed higher user error, whereas soft tissue punctures showed higher tissue deformation error.
Multi-Spectral Solar Telescope Array. II - Soft X-ray/EUV reflectivity of the multilayer mirrors
NASA Technical Reports Server (NTRS)
Barbee, Troy W., Jr.; Weed, J. W.; Hoover, Richard B.; Allen, Maxwell J.; Lindblom, Joakim F.; O'Neal, Ray H.; Kankelborg, Charles C.; Deforest, Craig E.; Paris, Elizabeth S.; Walker, Arthur B. C., Jr.
1991-01-01
The Multispectral Solar Telescope Array is a rocket-borne observatory comprising seven compact multilayer-coated soft X-ray/EUV telescopes and two compact interference-film-coated far-UV telescopes of Cassegrain and Ritchey-Chretien design. Extensive measurements are presented on the efficiency and spectral bandpass of the X-ray/EUV telescopes. Attention is given to systematic errors and measurement errors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Batista, Antonio J. N.; Santos, Bruno; Fernandes, Ana
The data acquisition and control instrumentation cubicle room of the ITER tokamak will be irradiated with neutrons during fusion reactor operation. A Virtex-6 FPGA from Xilinx (XC6VLX365T-1FFG1156C) is used on the ATCA-IO-PROCESSOR board, included in the ITER Catalog of I&C products - Fast Controllers. The Virtex-6 is a re-programmable logic device where the configuration is stored in Static RAM (SRAM), functional data are stored in dedicated Block RAM (BRAM), and functional state logic resides in flip-flops. Single Event Upsets (SEU) due to the ionizing radiation of neutrons cause soft errors, unintended changes (bit-flips) to the values stored in state elements of the FPGA. SEU monitoring and soft error repair, where possible, were explored in this work. An FPGA built-in Soft Error Mitigation (SEM) controller detects and corrects soft errors in the FPGA configuration memory. Novel SEU sensors with Error Correction Code (ECC) detect and repair the BRAM memories. Proper management of SEU can increase the reliability and availability of control instrumentation hardware for nuclear applications. The results of the tests performed using the SEM controller and the BRAM SEU sensors are presented for a Virtex-6 FPGA (XC6VLX240T-1FFG1156C) irradiated with neutrons from the Portuguese Research Reactor (RPI), a 1 MW nuclear fission reactor operated by IST in the neighborhood of Lisbon. Results show that the proposed SEU mitigation technique is able to repair the majority of the detected SEU errors in the configuration and BRAM memories.
Utilization of robotic-arm assisted total knee arthroplasty for soft tissue protection.
Sultan, Assem A; Piuzzi, Nicolas; Khlopas, Anton; Chughtai, Morad; Sodhi, Nipun; Mont, Michael A
2017-12-01
Despite the well-established success of total knee arthroplasty (TKA), iatrogenic ligamentous and soft tissue injuries are infrequent but potentially devastating complications for clinical outcomes. These injuries are often related to technical errors and excessive soft tissue manipulation, particularly during bony resections. Recently, robotic-arm assisted TKA was introduced and has demonstrated promising results, with potential technical advantages over manual surgery in implant positioning and mechanical accuracy. Furthermore, soft tissue protection is an additional potential advantage of these systems, which can reduce the inadvertent human technical errors encountered during standard manual resections. Therefore, given the relative paucity of literature, we attempted to answer the following questions: 1) Does robotic-arm assisted TKA offer a technical advantage that allows enhanced soft tissue protection? 2) What is the available evidence on soft tissue protection? Recently introduced robotic-arm assisted TKA systems with advanced technology have shown promising clinical outcomes and soft tissue protection at short- and mid-term follow-up, with results comparable or superior to manual TKA. In this review, we explore this dimension of robotics in TKA and investigate the soft tissue related complications currently reported in the literature.
Statistical Performance Evaluation Of Soft Seat Pressure Relief Valves
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, Stephen P.; Gross, Robert E.
2013-03-26
Risk-based inspection methods enable estimation of the probability of failure on demand for spring-operated pressure relief valves at the United States Department of Energy's Savannah River Site in Aiken, South Carolina. This paper presents a statistical performance evaluation of soft seat spring-operated pressure relief valves. These pressure relief valves are typically smaller and of lower cost than hard seat (metal-to-metal) pressure relief valves and can provide substantial cost savings in fluid service applications (air, gas, liquid, and steam), provided that the probability of failure on demand (the probability that the pressure relief valve fails to perform its intended safety function during a potentially dangerous over-pressurization) is at least as good as that for hard seat valves. The research in this paper shows that the proportion of soft seat spring-operated pressure relief valves failing is the same as or less than that of hard seat valves, and that for failed valves, soft seat valves typically have failure ratios of proof test pressure to set pressure lower than those of hard seat valves.
Quantum error-correction failure distributions: Comparison of coherent and stochastic error models
NASA Astrophysics Data System (ADS)
Barnes, Jeff P.; Trout, Colin J.; Lucarelli, Dennis; Clader, B. D.
2017-06-01
We compare failure distributions of quantum error correction circuits for stochastic errors and coherent errors. We utilize a fully coherent simulation of a fault-tolerant quantum error correcting circuit for d = 3 Steane and surface codes. We find that the output distributions are markedly different for the two error models, showing that no simple mapping between the two error models exists. Coherent errors create very broad and heavy-tailed failure distributions. This suggests that they are susceptible to outlier events and that mean statistics, such as pseudothreshold estimates, may not provide the key figure of merit. This provides further statistical insight into why coherent errors can be so harmful for quantum error correction. These output probability distributions may also provide a useful metric that can be utilized when optimizing quantum error correcting codes and decoding procedures for purely coherent errors.
Impact of Spacecraft Shielding on Direct Ionization Soft Error Rates for Sub-130 nm Technologies
NASA Technical Reports Server (NTRS)
Pellish, Jonathan A.; Xapsos, Michael A.; Stauffer, Craig A.; Jordan, Thomas M.; Sanders, Anthony B.; Ladbury, Raymond L.; Oldham, Timothy R.; Marshall, Paul W.; Heidel, David F.; Rodbell, Kenneth P.
2010-01-01
We use ray tracing software to model various levels of spacecraft shielding complexity and energy deposition pulse height analysis to study how it affects the direct ionization soft error rate of microelectronic components in space. The analysis incorporates the galactic cosmic ray background, trapped proton, and solar heavy ion environments as well as the October 1989 and July 2000 solar particle events.
Underlying Cause(s) of Letter Perseveration Errors
Fischer-Baum, Simon; Rapp, Brenda
2011-01-01
Perseverations, the inappropriate intrusion of elements from a previous response into a current response, are commonly observed in individuals with acquired deficits. This study specifically investigates the contribution of failure-to-activate and failure-to-inhibit deficits in the generation of letter perseveration errors in acquired dysgraphia. We provide evidence from the performance of 12 dysgraphic individuals indicating that a failure to activate the graphemes of a target word gives rise to letter perseveration errors. In addition, we provide evidence that, in some individuals, a failure-to-inhibit deficit may also contribute to the production of perseveration errors. PMID:22178232
Exception handling for sensor fusion
NASA Astrophysics Data System (ADS)
Chavez, G. T.; Murphy, Robin R.
1993-08-01
This paper presents a control scheme for handling sensing failures (sensor malfunctions, significant degradations in performance due to changes in the environment, and errant expectations) in sensor fusion for autonomous mobile robots. The advantages of the exception handling mechanism are that it emphasizes a fast response to sensing failures, is able to use only a partial causal model of sensing failure, and leads to a graceful degradation of sensing if the sensing failure cannot be compensated for. The exception handling mechanism consists of two modules: error classification and error recovery. The error classification module in the exception handler attempts to classify the type and source(s) of the error using a modified generate-and-test procedure. If the source of the error is isolated, the error recovery module examines its cache of recovery schemes, which either repair or replace the current sensing configuration. If the failure is due to an error in expectation or cannot be identified, the planner is alerted. Experiments using actual sensor data collected by the CSM Mobile Robotics/Machine Perception Laboratory's Denning mobile robot demonstrate the operation of the exception handling mechanism.
Hard sphere perturbation theory for thermodynamics of soft-sphere model liquid
NASA Astrophysics Data System (ADS)
Mon, K. K.
2001-09-01
It is a long-standing consensus in the literature that hard sphere perturbation theory (HSPT) is not accurate for dense soft-sphere model liquids interacting with repulsive r^(-n) pair potentials for small n. In this paper, we show that if the intrinsic error of HSPT for soft-sphere model liquids is accounted for, this is not completely true. We present results for n = 4, 6, 9, and 12 which indicate that even first-order variational HSPT can provide free energy upper bounds to within a few percent at densities near freezing when corrected for the intrinsic error of the HSPT.
Full temperature single event upset characterization of two microprocessor technologies
NASA Technical Reports Server (NTRS)
Nichols, Donald K.; Coss, James R.; Smith, L. S.; Rax, Bernard; Huebner, Mark
1988-01-01
Data for the 9450 I3L bipolar microprocessor and the 80C86 CMOS/epi (vintage 1985) microprocessor are presented, showing single-event soft errors over the full MIL-SPEC temperature range of -55 to +125 °C. These data show for the first time that the soft-error cross sections continue to decrease with decreasing temperature at subzero temperatures. The temperature dependence of the two parts, however, is very different.
Failure analysis and modeling of a multicomputer system. M.S. Thesis
NASA Technical Reports Server (NTRS)
Subramani, Sujatha Srinivasan
1990-01-01
This thesis describes the results of an extensive measurement-based analysis of real error data collected from a 7-machine DEC VAXcluster multicomputer system. In addition to evaluating basic system error and failure characteristics, we develop reward models to analyze the impact of failures and errors on the system. The results show that, although 98 percent of errors in the shared resources recover, they result in 48 percent of all system failures. The analysis of rewards shows that the expected reward rate for the VAXcluster decreases to 0.5 in 100 days for a 3-out-of-7 model, which is well over 100 times that for a 7-out-of-7 model. A comparison of the reward rates for a range of k-out-of-n models indicates that the maximum increase in reward rate (0.25) occurs in going from the 6-out-of-7 model to the 5-out-of-7 model. The analysis also shows that software errors have the lowest reward (0.2 vs. 0.91 for network errors). The large loss in reward rate for software errors is due to the fact that a large proportion (94 percent) of software errors lead to failure. In comparison, the high reward rate for network errors is due to fast recovery from a majority of these errors (the median recovery duration is 0 seconds).
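The thesis's reward models are measurement-based, but the qualitative gap between the 7-out-of-7 and 3-out-of-7 models can be illustrated with a textbook k-out-of-n binomial calculation, assuming independent machines with a constant hazard rate (an assumption the correlated-failure findings above actually caution against):

```python
from math import comb, exp

def k_of_n_reliability(p_up: float, k: int, n: int) -> float:
    """Probability that at least k of n independent machines are up."""
    return sum(comb(n, i) * p_up**i * (1.0 - p_up)**(n - i)
               for i in range(k, n + 1))

def p_up(t_hours: float, mttf_hours: float) -> float:
    """Exponential (constant-hazard) survival of a single machine."""
    return exp(-t_hours / mttf_hours)

# Illustrative numbers only -- not the VAXcluster measurements.
p = p_up(t_hours=240.0, mttf_hours=2000.0)
print(k_of_n_reliability(p, 7, 7))   # all seven must survive: ~0.43
print(k_of_n_reliability(p, 3, 7))   # any three suffice: ~1.00
```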
New-Sum: A Novel Online ABFT Scheme For General Iterative Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tao, Dingwen; Song, Shuaiwen; Krishnamoorthy, Sriram
Emerging high-performance computing platforms, with large component counts and lower power margins, are anticipated to be more susceptible to soft errors in both logic circuits and memory subsystems. We present an online algorithm-based fault tolerance (ABFT) approach to efficiently detect and recover from soft errors in general iterative methods. We design a novel checksum-based encoding scheme for matrix-vector multiplication that is resilient to both arithmetic and memory errors. Our design decouples the checksum updating process from the actual computation and allows adaptive checksum overhead control. Building on this new encoding mechanism, we propose two online ABFT designs that can effectively recover from errors when combined with a checkpoint/rollback scheme.
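The New-Sum encoding itself is not specified in the abstract; the sketch below shows the classic checksum-encoded matrix-vector product that such schemes build on, where a checksum row is computed once and every product is verified against it:

```python
import numpy as np

def encode(A: np.ndarray) -> np.ndarray:
    """Checksum row c = 1^T A, computed once and reused every iteration."""
    return A.sum(axis=0)

def check(c: np.ndarray, x: np.ndarray, y: np.ndarray, tol: float = 1e-8) -> bool:
    """Verify y = A @ x against its encoding: c @ x must equal sum(y)."""
    return abs(c @ x - y.sum()) <= tol * (1.0 + np.abs(y).sum())

rng = np.random.default_rng(0)
A, x = rng.random((64, 64)), rng.random(64)
c = encode(A)

y = A @ x
assert check(c, x, y)                # fault-free product passes

A[3, 7] += 1e-3                      # simulated memory bit-flip in A
assert not check(c, x, A @ x)        # corruption is flagged
```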
Errors in Bibliographic Citations: A Continuing Problem.
ERIC Educational Resources Information Center
Sweetland, James H.
1989-01-01
Summarizes studies examining citation errors and illustrates errors resulting from a lack of standardization, misunderstanding of foreign languages, failure to examine the document cited, and general lack of training in citation norms. It is argued that the failure to detect and correct citation errors is due to diffusion of responsibility in the…
Closed-Loop Analysis of Soft Decisions for Serial Links
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Steele, Glen F.; Zucha, Joan P.; Schlesinger, Adam M.
2013-01-01
We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
NASA Astrophysics Data System (ADS)
Yoshizawa, Masasumi; Nakamura, Yuuta; Ishiguro, Masataka; Moriya, Tadashi
2007-07-01
In this paper, we describe a method of compensating for the attenuation of ultrasound caused by soft tissue in the transducer vibration method for measuring the acoustic impedance of in vivo bone. In the in vivo measurement, the acoustic impedance of bone is measured through soft tissue; therefore, the amplitude of the ultrasound reflected from the bone is attenuated. This attenuation causes an error of the order of -20 to -30% when the acoustic impedance is determined from the measured signals. To compensate for the attenuation, the attenuation coefficient and the length of the soft tissue are measured by the transducer vibration method. In the experiment using a phantom, this method allows measurement of the acoustic impedance with an error typically as small as -8 to +10%.
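The abstract does not spell out the correction formula; the standard round-trip exponential model, with the attenuation coefficient and path length obtained as described above, would look like this (the numbers are placeholders):

```python
import math

def compensate_reflection(a_measured: float, alpha_np_per_cm: float,
                          path_cm: float) -> float:
    """Undo round-trip soft-tissue attenuation of a reflected echo.

    Standard exponential model: the echo traverses the tissue twice, so
    A0 = A_measured * exp(2 * alpha * L). The attenuation coefficient and
    path length come from the transducer vibration method; the numbers
    below are placeholders.
    """
    return a_measured * math.exp(2.0 * alpha_np_per_cm * path_cm)

# e.g. 3 cm of soft tissue with alpha = 0.12 Np/cm at the working frequency
print(compensate_reflection(a_measured=0.35, alpha_np_per_cm=0.12, path_cm=3.0))
```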
Hu, Jingwen; Klinich, Kathleen D; Miller, Carl S; Nazmi, Giseli; Pearlman, Mark D; Schneider, Lawrence W; Rupp, Jonathan D
2009-11-13
Motor-vehicle crashes are the leading cause of fetal deaths resulting from maternal trauma in the United States, and placental abruption is the most common cause of these deaths. To minimize this injury, new assessment tools, such as crash-test dummies and computational models of pregnant women, are needed to evaluate vehicle restraint systems with respect to reducing the risk of placental abruption. Developing these models requires accurate material properties for tissues in the pregnant abdomen under dynamic loading conditions that can occur in crashes. A method has been developed for determining dynamic material properties of human soft tissues that combines results from uniaxial tensile tests, specimen-specific finite-element models based on laser scans that accurately capture non-uniform tissue-specimen geometry, and optimization techniques. The current study applies this method to characterizing material properties of placental tissue. For 21 placenta specimens tested at a strain rate of 12/s, the mean failure strain is 0.472 ± 0.097 and the mean failure stress is 34.80 ± 12.62 kPa. A first-order Ogden material model with ground-state shear modulus (μ) of 23.97 ± 5.52 kPa and exponent (α1) of 3.66 ± 1.90 best fits the test results. The new method provides a nearly 40% error reduction (p < 0.001) compared to traditional curve-fitting methods by considering detailed specimen geometry, loading conditions, and dynamic effects from high-speed loading. The proposed method can be applied to determine mechanical properties of other soft biological tissues.
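For readers wanting to use the reported fit, the uniaxial nominal stress of an incompressible first-order Ogden solid has a closed form; the sketch below evaluates it at the study's mean parameters (the authors' actual pipeline fits specimen-specific finite-element models, which this does not reproduce):

```python
import numpy as np

def ogden1_nominal_stress(stretch, mu_kpa, alpha):
    """Uniaxial nominal stress of an incompressible first-order Ogden solid:
    P = (2*mu/alpha) * (lam**(alpha - 1) - lam**(-alpha/2 - 1)).
    """
    lam = np.asarray(stretch, dtype=float)
    return (2.0 * mu_kpa / alpha) * (lam**(alpha - 1.0) - lam**(-alpha / 2.0 - 1.0))

# Mean fitted parameters from the study: mu = 23.97 kPa, alpha = 3.66.
# At the mean failure strain (0.472, i.e. lam = 1.472) the model stress
# is ~32 kPa, the same order as the reported mean failure stress.
print(ogden1_nominal_stress(1.472, mu_kpa=23.97, alpha=3.66))
```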
Lee, Grace C; Hall, Ronald G; Boyd, Natalie K; Dallas, Steven D; Du, Liem C; Treviño, Lucina B; Treviño, Sylvia B; Retzloff, Chad; Lawson, Kenneth A; Wilson, James; Olsen, Randall J; Wang, Yufeng; Frei, Christopher R
2016-11-22
The incidence of outpatient visits for skin and soft tissue infections (SSTIs) has substantially increased over the last decade. The emergence of community-associated methicillin-resistant Staphylococcus aureus (CA-MRSA) has made the management of S. aureus SSTIs complex and challenging. The objective of this study was to identify risk factors contributing to treatment failures associated with community-associated S. aureus SSTIs. This was a prospective, observational study among 14 primary care clinics within the South Texas Ambulatory Research Network. The primary outcome was treatment failure within 90 days of the initial visit. Univariate associations between the explanatory variables and treatment failure were examined. A generalized linear mixed-effect model was developed to identify independent risk factors associated with treatment failure. Overall, 21% (22/106) of patients with S. aureus SSTIs experienced treatment failure. The occurrence of treatment failure was similar among patients with methicillin-resistant S. aureus and those with methicillin-susceptible S. aureus SSTIs (19 vs. 24%; p = 0.70). Independent predictors of treatment failure among cases with S. aureus SSTIs were a duration of infection of ≥7 days prior to the initial visit [aOR, 6.02 (95% CI 1.74-19.61)] and a lesion diameter ≥5 cm [5.25 (1.58-17.20)]. Predictors of treatment failure thus included a duration of infection of ≥7 days prior to the initial visit and a wound diameter of ≥5 cm. A heightened awareness of these risk factors could help direct targeted interventions in high-risk populations.
Soft-decision decoding techniques for linear block codes and their error performance analysis
NASA Technical Reports Server (NTRS)
Lin, Shu
1996-01-01
The first paper presents a new minimum-weight trellis-based soft-decision iterative decoding algorithm for binary linear block codes. The second paper derives an upper bound on the probability of block error for multilevel concatenated codes (MLCC); the bound evaluates the difference in performance for different decompositions of some codes. The third paper investigates the bit error probability of maximum likelihood decoding for binary linear codes. The fourth and final paper included in this report concerns the construction of multilevel concatenated block modulation codes, using a multilevel concatenation scheme, for the frequency non-selective Rayleigh fading channel.
Yang, Yuan; Quan, Nannan; Bu, Jingjing; Li, Xueping; Yu, Ningmei
2016-09-26
High-order modulation and demodulation technology can reconcile the competing frequency requirements of wireless energy transmission and data communication. To achieve reliable wireless data communication based on high-order modulation for visual prostheses, this work proposes a Reed-Solomon (RS) error-correcting code (ECC) circuit built on differential amplitude and phase shift keying (DAPSK) soft demodulation. First, recognizing that the traditional division-based DAPSK soft demodulation algorithm is complex to implement in hardware, an improved phase soft demodulation algorithm is put forward to reduce hardware complexity. Based on this new algorithm, an improved RS soft decoding method is then proposed, in which the Chase algorithm is combined with hard decoding to achieve soft decoding. To meet the requirements of an implantable visual prosthesis, a method for calculating symbol-level reliability as the product of bit reliabilities is derived, which reduces the number of test vectors in the Chase algorithm. The proposed algorithms are verified by MATLAB simulation and FPGA experiments. In the MATLAB simulation, a model of the attenuation properties of the biological channel is added to the ECC circuit. The data rate is 8 Mbps in both the MATLAB simulation and the FPGA experiments. MATLAB simulation results show that the improved phase soft demodulation algorithm proposed in this paper saves hardware resources without losing bit error rate (BER) performance. Compared with the traditional demodulation circuit, the coding gain of the ECC circuit is improved by about 3 dB at the same BER of [Formula: see text]. The FPGA experimental results show that when demodulation errors occur with the wireless coils 3 cm apart, the system can correct them; the greater the distance, the higher the BER. A bit error rate analyzer was then used to measure the BER of the demodulation circuit and of the RS ECC circuit at different coil distances, and the results show that the RS ECC circuit has about an order of magnitude lower BER than the demodulation circuit at the same coil distance. The RS ECC circuit therefore provides higher communication reliability in the system. The improved phase soft demodulation and soft decoding algorithms proposed in this paper enable data communication that is more reliable than other demodulation systems, and they provide a significant reference for further study of visual prosthesis systems.
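The abstract's key complexity reduction is computing symbol-level reliability as a product of bit reliabilities, so the Chase decoder perturbs only the least-reliable symbols. The following sketch illustrates that idea; the exact bit-reliability mapping and the parameters are our assumptions, not the paper's circuit:

```python
import numpy as np

def symbol_reliability(bit_llrs: np.ndarray, bits_per_symbol: int) -> np.ndarray:
    """Symbol-level reliability from bit-level soft values.

    Per the idea in the abstract, a symbol's reliability is the product of
    its bits' reliabilities (here |LLR| values squashed into (0, 1]); the
    exact mapping used in the paper may differ.
    """
    r_bits = 1.0 - np.exp(-np.abs(bit_llrs))
    return r_bits.reshape(-1, bits_per_symbol).prod(axis=1)

def chase_test_positions(bit_llrs: np.ndarray, bits_per_symbol: int, t: int):
    """Indices of the t least-reliable RS symbols -- the only positions the
    Chase decoder perturbs, shrinking the test-vector set to 2**t."""
    return np.argsort(symbol_reliability(bit_llrs, bits_per_symbol))[:t]

llrs = np.random.default_rng(1).normal(0.0, 2.0, size=8 * 16)  # 16 symbols
print(chase_test_positions(llrs, bits_per_symbol=8, t=3))
```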
PRESAGE: Protecting Structured Address Generation against Soft Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram
Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently to soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false-positive rates, and low computational overhead. Unfortunately, efficient detectors for faults during address generation (to index large arrays) have not been widely researched. We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that propagates an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Ensuring the propagation of errors allows one to situate detectors at loop exit points and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.
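A toy model of the stated insight: carry the address as a running value so that an incurred bit-flip keeps flowing to the final address, where a single closed-form check at the loop exit exposes it. This illustrates the principle only, not the PRESAGE compiler pass itself:

```python
def strided_sum(data, base, stride, n, flip_at=None, flip_bit=0):
    """Toy model of the PRESAGE insight (not the actual compiler pass).

    The address is carried as a running value, so an injected bit-flip
    propagates to the final address, where one cheap loop-exit check
    against the closed-form value exposes the corruption.
    """
    addr, total = base, 0
    for i in range(n):
        if i == flip_at:
            addr ^= 1 << flip_bit      # injected soft error in the address
        total += data[addr]
        addr += stride                 # the error, once incurred, keeps flowing
    if addr != base + n * stride:      # detector at the loop exit
        raise RuntimeError("address corruption detected")
    return total

data = list(range(100))
print(strided_sum(data, base=0, stride=2, n=20))               # clean run
try:
    strided_sum(data, base=0, stride=2, n=20, flip_at=5, flip_bit=3)
except RuntimeError as e:
    print(e)
```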
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Silva, T; Ketcha, M; Siewerdsen, J H
Purpose: In image-guided spine surgery, mapping 3D preoperative images to 2D intraoperative images via 3D-2D registration can provide valuable assistance in target localization. However, the presence of surgical instrumentation, hardware implants, and soft-tissue resection/displacement causes mismatches in image content, confounding existing registration methods. Manual/semi-automatic methods to mask such extraneous content are time consuming, user-dependent, error prone, and disruptive to clinical workflow. We developed and evaluated 2 novel similarity metrics within a robust registration framework to overcome such challenges in target localization. Methods: An IRB-approved retrospective study in 19 spine surgery patients included 19 preoperative 3D CT images and 50 intraoperative mobile radiographs in cervical, thoracic, and lumbar spine regions. A neuroradiologist provided the truth definition of vertebral positions in CT and radiography. 3D-2D registration was performed using the CMA-ES optimizer with 4 gradient-based image similarity metrics: (1) gradient information (GI); (2) gradient correlation (GC); (3) a novel variant referred to as gradient orientation (GO); and (4) a second variant referred to as truncated gradient correlation (TGC). Registration accuracy was evaluated in terms of the projection distance error (PDE) of the vertebral levels. Results: Conventional similarity metrics were susceptible to gross registration error and failure modes associated with the presence of surgical instrumentation: the median PDE and interquartile range were 33.0 ± 43.6 mm for GI and 23.0 ± 92.6 mm for GC. The robust metrics GO and TGC, on the other hand, demonstrated major improvement in PDE (7.6 ± 9.4 mm and 8.1 ± 18.1 mm, respectively) and elimination of gross failure modes. Conclusion: The proposed GO and TGC similarity measures improve registration accuracy and robustness to gross failure in the presence of strong image content mismatch. Such registration capability could offer valuable assistance in target localization without disruption of clinical workflow. G. Kleinszig and S. Vogt are employees of Siemens Healthcare.
Comparisons of single event vulnerability of GaAs SRAMS
NASA Astrophysics Data System (ADS)
Weatherford, T. R.; Hauser, J. R.; Diehl, S. E.
1986-12-01
A GaAs MESFET/JFET model incorporated into SPICE has been used to accurately describe C-EJFET, E/D MESFET and D MESFET/resistor GaAs memory technologies. These cells have been evaluated for critical charges due to gate-to-drain and drain-to-source charge collection. Low gate-to-drain critical charges limit conventional GaAs SRAM soft error rates to approximately 1E-6 errors/bit-day. SEU hardening approaches including decoupling resistors, diodes, and FETs have been investigated. Results predict GaAs RAM cell critical charges can be increased to over 0.1 pC. Soft error rates in such hardened memories may approach 1E-7 errors/bit-day without significantly reducing memory speed. Tradeoffs between hardening level, performance and fabrication complexity are discussed.
Spacecraft and propulsion technician error
NASA Astrophysics Data System (ADS)
Schultz, Daniel Clyde
Commercial aviation and commercial space similarly launch, fly, and land passenger vehicles. Unlike aviation, the U.S. government has not established maintenance policies for commercial space. This study conducted a mixed-methods review of 610 U.S. space launches from 1984 through 2011, which included 31 failures. An analysis of the failure causal factors showed that human error accounted for 76% of those failures, with workmanship error accounting for 29%. With the imminent future of commercial space travel, the increased potential for the loss of human life demands that changes be made to standardized procedures, training, and certification to reduce human error and failure rates. This study makes several recommendations to the FAA's Office of Commercial Space Transportation, space launch vehicle operators, and maintenance technician schools in an effort to increase the safety of space transportation passengers.
Vu, Lien T; Chen, Chao-Chang A; Lee, Chia-Cheng; Yu, Chia-Wei
2018-04-20
This study aims to develop a compensation method to minimize the shrinkage error of the shell mold (SM) in the injection molding (IM) process, in order to obtain uniform optical power in the central optical zone of soft, axially symmetric multifocal contact lenses (CL). The Z-shrinkage error along the Z (axial) axis of the anterior SM, corresponding to the anterior surface of a dry contact lens in the IM process, can be minimized by optimizing IM process parameters and then compensating for additional (Add) powers in the central zone of the original lens design. First, the shrinkage error is minimized by optimizing three levels of four IM parameters (mold temperature, injection velocity, packing pressure, and cooling time) in 18 IM simulations based on an orthogonal array L18(2^1 × 3^4). Then, based on the Z-shrinkage error from the IM simulation, three new contact lens designs are obtained by increasing the Add power in the central zone of the original multifocal CL design to compensate for the optical power errors. Results obtained from the IM process simulations and the optical simulations show that the new CL design with a 0.1 D increase in Add power has the closest shrinkage profile to the original anterior SM profile, with a 55% reduction in absolute Z-shrinkage error, and more uniform power in the central zone than in the other two cases. Moreover, actual IM experiments on SMs for casting soft multifocal CLs have been performed, and final wet CLs have been produced for both the original design and the new design. Results of the optical performance have verified the improvement achieved by the compensated CL design. The feasibility of this compensation method has been proven by the measurement results of the produced soft multifocal CLs of the new design. The results of this study can be further applied to predict or compensate for the total optical power errors of soft multifocal CLs.
A cascaded coding scheme for error control
NASA Technical Reports Server (NTRS)
Shu, L.; Kasami, T.
1985-01-01
A cascade coding scheme for error control is investigated. The scheme employs a combination of hard and soft decisions in decoding. Error performance is analyzed. If the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit-error-rate. Some example schemes are evaluated. They seem to be quite suitable for satellite down-link error control.
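The reliability claim rests on standard concatenated-coding arithmetic: if the inner decoder leaves approximately independent symbol errors at rate p, an outer code correcting t of n symbols fails with the binomial tail probability. A sketch under that idealization (interleaving is what justifies the independence assumption):
```python
from math import comb

def outer_block_error_prob(n, t, p_symbol):
    """Probability that an outer code correcting up to t of n symbols
    fails, assuming the inner decoder leaves independent symbol errors
    with probability p_symbol. Parameters below are illustrative."""
    return sum(
        comb(n, i) * p_symbol**i * (1.0 - p_symbol) ** (n - i)
        for i in range(t + 1, n + 1)
    )

# Even with a raw symbol error rate of 1e-2 out of the inner decoder,
# an (n=255, t=16) outer code yields a tiny block error probability --
# the "extremely high reliability" effect the abstract describes.
print(outer_block_error_prob(255, 16, 1e-2))
```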
A cascaded coding scheme for error control
NASA Technical Reports Server (NTRS)
Kasami, T.; Lin, S.
1985-01-01
A cascaded coding scheme for error control was investigated. The scheme employs a combination of hard and soft decisions in decoding. Error performance is analyzed. If the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit-error-rate. Some example schemes are studied which seem to be quite suitable for satellite down-link error control.
Testing a Novel 3D Printed Radiographic Imaging Device for Use in Forensic Odontology.
Newcomb, Tara L; Bruhn, Ann M; Giles, Bridget; Garcia, Hector M; Diawara, Norou
2017-01-01
There are specific challenges related to forensic dental radiology and difficulties in aligning X-ray equipment to the teeth of interest. Researchers used 3D printing to create a new device, the combined holding and aiming device (CHAD), to address the positioning limitations of current dental X-ray devices. Participants (N = 24) used the CHAD, soft dental wax, and a modified external aiming device (MEAD) to determine device preference, radiographer's efficiency, and technique errors. Each participant exposed six X-rays per device, for a total of 432 X-rays scored. A significant difference was found at the 0.05 level between the three devices (p = 0.0015), with the MEAD producing the fewest total errors and soft dental wax taking the least time. Total errors were highest when participants used soft dental wax; both the MEAD and the CHAD performed best overall. Further research in forensic dental radiology and the use of holding devices is needed. © 2016 American Academy of Forensic Sciences.
Real-time soft error rate measurements on bulk 40 nm SRAM memories: a five-year dual-site experiment
NASA Astrophysics Data System (ADS)
Autran, J. L.; Munteanu, D.; Moindjie, S.; Saad Saoud, T.; Gasiot, G.; Roche, P.
2016-11-01
This paper reports five years of real-time soft error rate experimentation conducted with the same setup, at mountain altitude for three years and then at sea level for two years. More than 7 Gbit of SRAM memories manufactured in CMOS bulk 40 nm technology have been subjected to the natural radiation background. The intensity of the atmospheric neutron flux has been continuously measured on site during these experiments using dedicated neutron monitors. As a result, the neutron and alpha components of the soft error rate (SER) have been very accurately extracted from these measurements, refining the first SER estimations performed in 2012 for this SRAM technology. Data obtained at sea level evidence, for the first time, a possible correlation between the neutron flux changes induced by daily atmospheric pressure variations and the measured SER. Finally, all of the experimental data are compared with results obtained from accelerated tests and numerical simulation.
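For context, the conversion from raw upset counts in such a life test to a normalized rate is straightforward; a sketch with invented counts (FIT = failures per 10^9 device-hours):
```python
def ser_fit_per_mbit(upsets, bits_tested, hours):
    """Soft error rate in FIT/Mbit from a real-time test campaign:
    FIT is failures per 1e9 device-hours, normalized here per megabit."""
    fit_total = upsets / hours * 1e9           # FIT for the whole array
    return fit_total / (bits_tested / 2**20)   # normalize to 1 Mbit

# Hypothetical numbers, not the paper's data: 7 Gbit monitored for
# three years (~8766 h/year) with 60 observed upsets.
print(ser_fit_per_mbit(upsets=60, bits_tested=7 * 2**30, hours=3 * 8766))
```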
Gutiérrez, J. J.; Russell, James K.
2016-01-01
Background. Cardiopulmonary resuscitation (CPR) feedback devices are being increasingly used. However, current accelerometer-based devices overestimate chest displacement when CPR is performed on soft surfaces, which may lead to insufficient compression depth. Aim. To assess the performance of a new algorithm for measuring compression depth and rate based on two accelerometers in a simulated resuscitation scenario. Materials and Methods. Compressions were provided to a manikin on two mattresses, foam and sprung, with and without a backboard. One accelerometer was placed on the chest and the second at the manikin's back. Chest displacement and mattress displacement were calculated from the spectral analysis of the corresponding acceleration every 2 seconds and subtracted to compute the actual sternal-spinal displacement. Compression rate was obtained from the chest acceleration. Results. The median unsigned error in depth was 2.1 mm (4.4%). The error was 2.4 mm on the foam and 1.7 mm on the sprung mattress (p < 0.001). The error was 3.1/2.0 mm and 1.8/1.6 mm with/without a backboard for foam and sprung, respectively (p < 0.001). The median error in rate was 0.9 cpm (1.0%), with no significant differences between test conditions. Conclusion. The system provided accurate feedback on chest compression depth and rate on soft surfaces. Our solution compensated for mattress displacement, avoiding overestimation of compression depth when CPR is performed on soft surfaces. PMID:27999808
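One way to recover a displacement amplitude from an acceleration window spectrally, as the abstract's method does in 2-second windows, is to divide each Fourier component by (2πf)². This sketch illustrates only that underlying identity on synthetic data; it is not the authors' algorithm:
```python
import numpy as np

def displacement_spectrum(accel, fs):
    """Estimate per-frequency displacement amplitudes from an
    acceleration window by dividing single-sided Fourier amplitudes
    by (2*pi*f)^2; the DC term is discarded. Valid for periodic motion."""
    n = len(accel)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    amps = np.abs(np.fft.rfft(accel)) / n * 2.0   # single-sided amplitudes
    disp = np.zeros_like(freqs)
    disp[1:] = amps[1:] / (2 * np.pi * freqs[1:]) ** 2
    return freqs, disp

# Synthetic 2 s window: 120 cpm compressions, 50 mm peak displacement
# (an exact number of periods, so there is no spectral leakage).
fs, f0, amp = 250.0, 2.0, 0.050
t = np.arange(0, 2.0, 1.0 / fs)
accel = -amp * (2 * np.pi * f0) ** 2 * np.sin(2 * np.pi * f0 * t)
freqs, disp = displacement_spectrum(accel, fs)
print(disp[np.argmax(disp)])   # ~0.05 m at the compression frequency
```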
NASA Technical Reports Server (NTRS)
Landon, Lauren Blackwell; Vessey, William B.; Barrett, Jamie D.
2015-01-01
A team is defined as: "two or more individuals who interact socially and adaptively, have shared or common goals, and hold meaningful task interdependences; it is hierarchically structured and has a limited life span; in it expertise and roles are distributed; and it is embedded within an organization/environmental context that influences and is influenced by ongoing processes and performance outcomes" (Salas, Stagl, Burke, & Goodwin, 2007, p. 189). From the NASA perspective, a team is commonly understood to be a collection of individuals that is assigned to support and achieve a particular mission. Thus, depending on context, this definition can encompass both the spaceflight crew and the individuals and teams in the larger multi-team system who are assigned to support that crew during a mission. The Team Risk outcomes of interest are predominantly performance related, with a secondary emphasis on long-term health; this is somewhat unique in the NASA HRP in that most Risk areas are medically related and primarily focused on long-term health consequences. In many operational environments (e.g., aviation), performance is assessed as the avoidance of errors. However, the research on performance errors is ambiguous. It implies that actions may be dichotomized into "correct" or "incorrect" responses, where incorrect responses or errors are always undesirable. Researchers have argued that this dichotomy is a harmful oversimplification, and it would be more productive to focus on the variability of human performance and how organizations can manage that variability (Hollnagel, Woods, & Leveson, 2006) (Category III). Two problems occur when focusing on performance errors: 1) the errors are infrequent and, therefore, difficult to observe and record; and 2) the errors do not directly correspond to failure. Research reveals that humans are fairly adept at correcting or compensating for performance errors before such errors result in recognizable or recordable failures. Astronauts are notably adept high performers. Most failures are recorded only when multiple, small errors occur and humans are unable to recognize and correct or compensate for these errors in time to prevent a failure (Dismukes, Berman, & Loukopoulos, 2007) (Category III). More commonly, observers record variability in levels of performance. Some teams commit no observable errors but fail to achieve performance objectives or perform only adequately, while other teams commit some errors but perform spectacularly. Successful performance, therefore, cannot be viewed as simply the absence of errors or the avoidance of failure (Johnson Space Center (JSC) Joint Leadership Team, 2008). While failure is commonly attributed to making a major error, focusing solely on the elimination of error(s) does not significantly reduce the risk of failure. Failure may also occur when performance is simply insufficient or an effort is incapable of adjusting sufficiently to a contextual change (e.g., changing levels of autonomy).
Resnick, C M; Dang, R R; Glick, S J; Padwa, B L
2017-03-01
Three-dimensional (3D) soft tissue prediction is replacing two-dimensional analysis in planning for orthognathic surgery. The accuracy of different computational models to predict soft tissue changes in 3D, however, is unclear. A retrospective pilot study was implemented to assess the accuracy of Dolphin 3D software in making these predictions. Seven patients who had a single-segment Le Fort I osteotomy and had preoperative (T0) and >6-month postoperative (T1) cone beam computed tomography (CBCT) scans and 3D photographs were included. The actual skeletal change was determined by subtracting the T0 from the T1 CBCT. 3D photographs were overlaid onto the T0 CBCT and virtual skeletal movements equivalent to the achieved repositioning were applied using Dolphin 3D planner. A 3D soft tissue prediction (TP) was generated and differences between the TP and T1 images (error) were measured at 14 points and at the nasolabial angle. A mean linear prediction error of 2.91±2.16 mm was found. The mean error at the nasolabial angle was 8.1±5.6°. In conclusion, the ability to accurately predict 3D soft tissue changes after Le Fort I osteotomy using Dolphin 3D software is limited. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Using failure mode and effects analysis to improve the safety of neonatal parenteral nutrition.
Arenas Villafranca, Jose Javier; Gómez Sánchez, Araceli; Nieto Guindo, Miriam; Faus Felipe, Vicente
2014-07-15
Failure mode and effects analysis (FMEA) was used to identify potential errors and to enable the implementation of measures to improve the safety of neonatal parenteral nutrition (PN). FMEA was used to analyze the preparation and dispensing of neonatal PN from the perspective of the pharmacy service in a general hospital. A process diagram was drafted, illustrating the different phases of the neonatal PN process. Next, the failures that could occur in each of these phases were compiled and cataloged, and a questionnaire was developed in which respondents were asked to rate the following aspects of each error: incidence, detectability, and severity. The highest scoring failures were considered high risk and identified as priority areas for improvements to be made. The evaluation process detected a total of 82 possible failures. Among the phases with the highest number of possible errors were transcription of the medical order, formulation of the PN, and preparation of material for the formulation. After the classification of these 82 possible failures and of their relative importance, a checklist was developed to achieve greater control in the error-detection process. FMEA demonstrated that use of the checklist reduced the level of risk and improved the detectability of errors. FMEA was useful for detecting medication errors in the PN preparation process and enabling corrective measures to be taken. A checklist was developed to reduce errors in the most critical aspects of the process. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
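The RPN bookkeeping used to rank failure modes is compact enough to show directly; the failure modes and scores below are invented examples (typical FMEA scales run 1-10 for occurrence, detectability, and severity):
```python
# Minimal FMEA ranking sketch: each failure mode gets occurrence (O),
# detectability (D), and severity (S) scores, and RPN = O * D * S is
# used to prioritize improvements. Entries are illustrative only.
failure_modes = [
    ("wrong patient data transcribed", 4, 6, 9),
    ("lipid precipitate in admixture", 2, 5, 8),
    ("label/bag mismatch at dispensing", 3, 3, 7),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, o, d, s in ranked:
    print(f"RPN={o * d * s:3d}  {name}")
```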
Fault Injection Techniques and Tools
NASA Technical Reports Server (NTRS)
Hsueh, Mei-Chen; Tsai, Timothy K.; Iyer, Ravishankar K.
1997-01-01
Dependability evaluation involves the study of failures and errors. The destructive nature of a crash and long error latency make it difficult to identify the causes of failures in the operational environment. It is particularly hard to recreate a failure scenario for a large, complex system. To identify and understand potential failures, we use an experiment-based approach for studying the dependability of a system. Such an approach is applied not only during the conception and design phases, but also during the prototype and operational phases. To take an experiment-based approach, we must first understand a system's architecture, structure, and behavior. Specifically, we need to know its tolerance for faults and failures, including its built-in detection and recovery mechanisms, and we need specific instruments and tools to inject faults, create failures or errors, and monitor their effects.
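The core primitive of software-implemented fault injection is a single bit flip at a random location; a minimal sketch:
```python
import random

def inject_bit_flip(word, width=32):
    """Flip one randomly chosen bit of an integer-valued memory word --
    the classic software-implemented fault injection primitive for
    emulating a soft error. A real tool would also randomize the
    injection time and the target location within the system."""
    return word ^ (1 << random.randrange(width))

random.seed(1)
original = 0x0000FF00
faulty = inject_bit_flip(original)
print(f"{original:#010x} -> {faulty:#010x}")
```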
Mars Exploration Rover Potentiometer Problems, Failures and Lessons Learned
NASA Technical Reports Server (NTRS)
Balzer, Mark
2006-01-01
During qualification testing of three types of non-wire-wound precision potentiometers for the Mars Exploration Rover, a variety of problems and failures were encountered. This paper will describe some of the more interesting problems, detail their investigations and present their final solutions. The failures were found to be caused by design errors, manufacturing errors, improper handling, test errors, and carelessness. A trend of decreasing total resistance was noted, and a resistance histogram was used to identify an outlier. A gang fixture is described for simultaneously testing multiple pots, and real time X-ray imaging was used extensively to assist in the failure analyses. Lessons learned are provided.
Analysis of complications following augmentation with cancellous block allografts.
Chaushu, Gavriel; Mardinger, Ofer; Peleg, Michael; Ghelfan, Oded; Nissan, Joseph
2010-12-01
Bone grafting may be associated with soft and hard tissue complications. Recipient site complications encountered using cancellous block allografts for ridge augmentation are analyzed. A total of 101 consecutive patients (62 females and 39 males; mean age 44 ± 17 years) were treated with implant-supported restoration of 137 severe atrophic alveolar ridges augmented with cancellous bone-block allografts. Alveolar ridge deficiency locations were classified as anterior maxilla (n = 58); posterior maxilla (n = 32 sinuses); posterior mandible (n = 32); and anterior mandible (n = 15). A total of 271 rough-surface implants were placed. Recipient site complications associated with block grafting (infection, membrane exposure, incision line opening, perforation of mucosa over the grafted bone, partial graft failure, total graft failure, and implant failure) were recorded. Partial and total bone-block graft failure occurred in 10 (7%) and 11 (8%) of 137 augmented sites, respectively. Implant failure rate was 12 (4.4%) of 271. Soft tissue complications included membrane exposure (42 [30.7%] of 137); incision line opening (41 [30%] of 137); and perforation of the mucosa over the grafted bone (19 [14%] of 137). Infection of the grafted site occurred in 18 (13%) of 137 bone blocks. Alveolar ridge deficiency location had a statistically significant effect on the outcome of recipient site complications. More complications were noted in the mandible compared to the maxilla. Age and gender had no statistically significant effect. Failures caused by complications were rarely noted in association with cancellous block grafting. The incidence of complications in the mandible was significantly higher. Soft tissue complications do not necessarily result in total loss of cancellous block allograft.
Syndromic surveillance for health information system failures: a feasibility study.
Ong, Mei-Sing; Magrabi, Farah; Coiera, Enrico
2013-05-01
To explore the applicability of a syndromic surveillance method to the early detection of health information technology (HIT) system failures. A syndromic surveillance system was developed to monitor a laboratory information system at a tertiary hospital. Four indices were monitored: (1) total laboratory records being created; (2) total records with missing results; (3) average serum potassium results; and (4) total duplicated tests on a patient. The goal was to detect HIT system failures causing: data loss at the record level; data loss at the field level; erroneous data; and unintended duplication of data. Time-series models of the indices were constructed, and statistical process control charts were used to detect unexpected behaviors. The ability of the models to detect HIT system failures was evaluated using simulated failures, each lasting for 24 h, with error rates ranging from 1% to 35%. In detecting data loss at the record level, the model achieved a sensitivity of 0.26 when the simulated error rate was 1%, while maintaining a specificity of 0.98. Detection performance improved with increasing error rates, achieving a perfect sensitivity when the error rate was 35%. In the detection of missing results, erroneous serum potassium results and unintended repetition of tests, perfect sensitivity was attained when the error rate was as small as 5%. Decreasing the error rate to 1% resulted in a drop in sensitivity to 0.65-0.85. Syndromic surveillance methods can potentially be applied to monitor HIT systems, to facilitate the early detection of failures.
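A simplified stand-in for the monitoring scheme: a plain 3-sigma control chart on daily record counts, with no seasonality model (which the study's time-series models would add). Data and threshold are illustrative:
```python
import numpy as np

def shewhart_alarms(counts, baseline_days=30, k=3.0):
    """Flag days whose record volume deviates more than k sigma from a
    trailing baseline -- a simplified sketch of control-chart-based
    surveillance of an information system's output."""
    counts = np.asarray(counts, dtype=float)
    alarms = []
    for day in range(baseline_days, len(counts)):
        window = counts[day - baseline_days:day]
        mu, sigma = window.mean(), window.std(ddof=1)
        if sigma > 0 and abs(counts[day] - mu) > k * sigma:
            alarms.append(day)
    return alarms

rng = np.random.default_rng(42)
daily = rng.poisson(1000, size=60)
daily[45] = int(1000 * 0.65)       # simulate 35% record loss on day 45
print(shewhart_alarms(daily))      # expect day 45 to be flagged
```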
Fine figure correction and other applications using novel MRF fluid designed for ultra-low roughness
NASA Astrophysics Data System (ADS)
Maloney, Chris; Oswald, Eric S.; Dumas, Paul
2015-10-01
An increasing number of technologies require ultra-low roughness (ULR) surfaces. Magnetorheological Finishing (MRF) is one of the options for meeting the roughness specifications for high-energy laser, EUV and X-ray applications. A novel MRF fluid, called C30, has been developed to finish surfaces to ULR. This novel MRF fluid is able to achieve <1.5Å RMS roughness on fused silica and other materials, but has a lower material removal rate with respect to other MRF fluids. As a result of these properties, C30 can also be used for applications in addition to finishing ULR surfaces. These applications include fine figure correction, figure correcting extremely soft materials and removing cosmetic defects. The effectiveness of these new applications is explored through experimental data. The low removal rate of C30 gives MRF the capability to fine figure correct low amplitude errors that are usually difficult to correct with higher removal rate fluids. The ability to figure correct extremely soft materials opens up MRF to a new realm of materials that are difficult to polish. C30 also offers the ability to remove cosmetic defects that often lead to failure during visual quality inspections. These new applications for C30 expand the niche in which MRF is typically used.
Zhao, Zenghui; Lv, Xianzhou; Wang, Weiming; Tan, Yunliang
2016-01-01
Considering the structure effect of tunnel stability in western mining of China, three typical numerical models were built, based on the strain-softening constitutive model and the linear elastic-perfectly plastic model for soft rock and interface: R-M, R-C(s)-M and R-C(w)-M. Calculation results revealed that the stress-strain relations and failure characteristics of the three models differ from one another. The combination model without an interface or with a strong interface presented continuous failure, while the weak interface exhibited a 'cut-off' effect. Thus, conceptual models of a bi-material body and a bi-body were established. Then numerical experiments of tri-axial compression were carried out for the two models. The relationships between stress evolution, failure zone and deformation rate fluctuations, as well as the displacement of the interface, were analyzed in detail. Results show that the two breakaway points of the deformation rate mark the initiation and penetration of the main rupture, respectively, and are distinguishable due to their large fluctuation. The bi-material model shows generally continuous failure, while the bi-body model shows a 'V'-type shear zone in the weak body and failure in the strong body near the interface due to the interface effect. With increasing confining pressure, the 'cut-off' effect of the weak interface becomes less obvious. These conclusions lay the theoretical foundation for further development of constitutive models for the soft rock-coal combination body.
NASA Technical Reports Server (NTRS)
Marshall, Cheryl J.; Marshall, Paul W.
1999-01-01
This portion of the Short Course is divided into two segments to separately address the two major proton-related effects confronting satellite designers: ionization effects and displacement damage effects. While both of these topics are deeply rooted in "traditional" descriptions of space radiation effects, there are several factors at play to cause renewed concern for satellite systems being designed today. For example, emphasis on Commercial Off-The-Shelf (COTS) technologies in both commercial and government systems increases both Total Ionizing Dose (TID) and Single Event Effect (SEE) concerns. Scaling trends exacerbate the problems, especially with regard to SEEs where protons can dominate soft error rates and even cause destructive failure. In addition, proton-induced displacement damage at fluences encountered in natural space environments can cause degradation in modern bipolar circuitry as well as in many emerging electronic and opto-electronic technologies.
Minimizing treatment planning errors in proton therapy using failure mode and effects analysis.
Zheng, Yuanshui; Johnson, Randall; Larson, Gary
2016-06-01
Failure mode and effects analysis (FMEA) is a widely used tool to evaluate safety or reliability in conventional photon radiation therapy. However, reports about FMEA application in proton therapy are scarce. The purpose of this study is to apply FMEA in safety improvement of proton treatment planning at their center. The authors performed an FMEA analysis of their proton therapy treatment planning process using uniform scanning proton beams. The authors identified possible failure modes in various planning processes, including image fusion, contouring, beam arrangement, dose calculation, plan export, documents, billing, and so on. For each error, the authors estimated the frequency of occurrence, the likelihood of being undetected, and the severity of the error if it went undetected and calculated the risk priority number (RPN). The FMEA results were used to design their quality management program. In addition, the authors created a database to track the identified dosimetric errors. Periodically, the authors reevaluated the risk of errors by reviewing the internal error database and improved their quality assurance program as needed. In total, the authors identified over 36 possible treatment planning related failure modes and estimated the associated occurrence, detectability, and severity to calculate the overall risk priority number. Based on the FMEA, the authors implemented various safety improvement procedures into their practice, such as education, peer review, and automatic check tools. The ongoing error tracking database provided realistic data on the frequency of occurrence with which to reevaluate the RPNs for various failure modes. The FMEA technique provides a systematic method for identifying and evaluating potential errors in proton treatment planning before they result in an error in patient dose delivery. The application of FMEA framework and the implementation of an ongoing error tracking system at their clinic have proven to be useful in error reduction in proton treatment planning, thus improving the effectiveness and safety of proton therapy.
The effect of thermocycling on tensile bond strength of two soft liners.
Geramipanah, Farideh; Ghandari, Masoumeh; Zeighami, Somayeh
2013-09-01
Failure of soft liners depends mostly on separation from the denture base resin; therefore, measurement of the bond strength is very important. The purpose of this study was to compare the tensile bond strength of two soft liners (Acropars, Molloplast-B) to denture base resin before and after thermocycling. Twenty specimens from each of the two different soft liners were processed according to the manufacturer's instructions between two polymethyl methacrylate (PMMA) sheets. Ten specimens in each group were maintained in 37°C water for 24 hours and 10 were thermocycled (5000 cycles) between baths at 5°C and 55°C. The tensile bond strength was measured using a universal testing machine at a crosshead speed of 5 mm/min. Mode of failure was determined with SEM (magnification ×30). Two-way ANOVA was used to analyze the data. The mean and standard deviation of tensile bond strength of Acropars and Molloplast-B were 6.59±1.85 and 1.51±0.22 MPa, respectively, before thermocycling, and 5.89±1.52 and 1.37±0.18 MPa, respectively, after thermocycling. There was no significant difference before and after thermocycling. Mode of failure in Acropars and Molloplast-B was adhesive and cohesive, respectively. The bond strength of Acropars was significantly higher than that of Molloplast-B (P<0.05).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hashii, Haruko, E-mail: haruko@pmrc.tsukuba.ac.jp; Hashimoto, Takayuki; Okawa, Ayako
2013-03-01
Purpose: Radiation therapy for cancer may be required for patients with implantable cardiac devices. However, the influence of secondary neutrons or scattered irradiation from high-energy photons (≥10 MV) on implantable cardioverter-defibrillators (ICDs) is unclear. This study was performed to examine this issue in 2 ICD models. Methods and Materials: ICDs were positioned around a water phantom under conditions simulating clinical radiation therapy. The ICDs were not irradiated directly. A control ICD was positioned 140 cm from the irradiation isocenter. Fractional irradiation was performed with 18-MV and 10-MV photon beams to give cumulative in-field doses of 600 Gy and 1600 Gy, respectively. Errors were checked after each fraction. Soft errors were defined as severe (change to safety back-up mode), moderate (memory interference, no changes in device parameters), and minor (slight memory change, undetectable by computer). Results: Hard errors were not observed. For the older ICD model, the incidences of severe, moderate, and minor soft errors at 18 MV were 0.75, 0.5, and 0.83/50 Gy at the isocenter. The corresponding data for 10 MV were 0.094, 0.063, and 0/50 Gy. For the newer ICD model at 18 MV, these data were 0.083, 2.3, and 5.8/50 Gy. Moderate and minor errors occurred at 18 MV in control ICDs placed 140 cm from the isocenter. The error incidences were 0, 1, and 0/600 Gy at the isocenter for the newer model, and 0, 1, and 6/600 Gy for the older model. At 10 MV, no errors occurred in control ICDs. Conclusions: ICD errors occurred more frequently at 18 MV irradiation, which suggests that the errors were mainly caused by secondary neutrons. Soft errors of ICDs were observed with high-energy photon beams, but most were not critical in the newer model. These errors may occur even when the device is far from the irradiation field.
Method and apparatus for faulty memory utilization
Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.
2016-04-19
A method for faulty memory utilization in a memory system includes: obtaining information regarding memory health status of at least one memory page in the memory system; determining an error tolerance of the memory page when the information regarding memory health status indicates that a failure is predicted to occur in an area of the memory system affecting the memory page; initiating a migration of data stored in the memory page when it is determined that the data stored in the memory page is non-error-tolerant; notifying at least one application regarding a predicted operating system failure and/or a predicted application failure when it is determined that data stored in the memory page is non-error-tolerant and cannot be migrated; and notifying at least one application regarding the memory failure predicted to occur when it is determined that data stored in the memory page is error-tolerant.
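A sketch of the decision flow the claim describes, with invented types and callbacks rather than the actual patented implementation:
```python
from dataclasses import dataclass

@dataclass
class Page:
    addr: int
    error_tolerant: bool

def handle_health_report(page, failure_predicted, migrate, notify_app):
    """Illustrative decision flow for the method above; the Page type,
    callbacks, and policy details are invented for this sketch."""
    if not failure_predicted:
        return
    if page.error_tolerant:
        # Error-tolerant data may stay in place; applications are only
        # notified about the memory failure predicted to occur.
        notify_app(f"page {page.addr:#x}: failure predicted, data tolerant")
    elif not migrate(page):
        # Non-error-tolerant data that cannot be migrated means a
        # predicted OS and/or application failure must be signalled.
        notify_app(f"page {page.addr:#x}: migration failed, failure imminent")

handle_health_report(Page(0x7F3000, False), True,
                     migrate=lambda p: False, notify_app=print)
```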
32-Bit-Wide Memory Tolerates Failures
NASA Technical Reports Server (NTRS)
Buskirk, Glenn A.
1990-01-01
Electronic memory system of 32-bit words corrects bit errors caused by some common types of failures - even failure of an entire 4-bit-wide random-access-memory (RAM) chip. Detects failure of two such chips, so the user is warned that the output of the memory may contain errors. Includes eight 4-bit-wide DRAMs configured so that each bit of each DRAM is assigned to a different one of four parallel 8-bit words. Each DRAM contributes only 1 bit to each 8-bit word.
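The interleaving idea can be made concrete: with each of the eight chips feeding one bit lane of each word, a whole-chip failure costs every word exactly one bit, which a per-word single-error-correcting code can repair. The mapping below is inferred from the abstract, not taken from the actual design:
```python
# Eight 4-bit-wide DRAMs, four parallel 8-bit words: DRAM bit lane b of
# chip c lands in word b at bit position c, so losing an entire chip
# corrupts only one bit per word. (Assumed mapping, for illustration.)
N_CHIPS, BITS_PER_CHIP = 8, 4

def word_and_position(chip, chip_bit):
    """Map DRAM bit (chip, chip_bit) -> (word index, bit position)."""
    return chip_bit, chip

# Verify: killing chip 5 touches each of the four words exactly once.
hits = [[] for _ in range(BITS_PER_CHIP)]
for b in range(BITS_PER_CHIP):
    w, pos = word_and_position(5, b)
    hits[w].append(pos)
print(hits)   # [[5], [5], [5], [5]] -> one bit per word
```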
Radiation Tests on 2Gb NAND Flash Memories
NASA Technical Reports Server (NTRS)
Nguyen, Duc N.; Guertin, Steven M.; Patterson, J. D.
2006-01-01
We report on SEE and TID tests of highly scaled Samsung 2-Gbit flash memories. Both in-situ and biased interval irradiations were used to characterize the total accumulated dose failure response. The radiation-induced failures can be categorized as follows: single event upset (SEU) read errors in biased and unbiased modes, write errors, and single-event functional interrupt (SEFI) failures.
Syndromic surveillance for health information system failures: a feasibility study
Ong, Mei-Sing; Magrabi, Farah; Coiera, Enrico
2013-01-01
Objective To explore the applicability of a syndromic surveillance method to the early detection of health information technology (HIT) system failures. Methods A syndromic surveillance system was developed to monitor a laboratory information system at a tertiary hospital. Four indices were monitored: (1) total laboratory records being created; (2) total records with missing results; (3) average serum potassium results; and (4) total duplicated tests on a patient. The goal was to detect HIT system failures causing: data loss at the record level; data loss at the field level; erroneous data; and unintended duplication of data. Time-series models of the indices were constructed, and statistical process control charts were used to detect unexpected behaviors. The ability of the models to detect HIT system failures was evaluated using simulated failures, each lasting for 24 h, with error rates ranging from 1% to 35%. Results In detecting data loss at the record level, the model achieved a sensitivity of 0.26 when the simulated error rate was 1%, while maintaining a specificity of 0.98. Detection performance improved with increasing error rates, achieving a perfect sensitivity when the error rate was 35%. In the detection of missing results, erroneous serum potassium results and unintended repetition of tests, perfect sensitivity was attained when the error rate was as small as 5%. Decreasing the error rate to 1% resulted in a drop in sensitivity to 0.65–0.85. Conclusions Syndromic surveillance methods can potentially be applied to monitor HIT systems, to facilitate the early detection of failures. PMID:23184193
Ghorbani, Mahdi; Salahshour, Fateme; Haghparast, Abbas; Knaup, Courtney
2014-01-01
Purpose: The aim of this study is to compare the dose in various soft tissues in brachytherapy with photon-emitting sources. Material and methods: 103Pd, 125I, 169Yb, and 192Ir brachytherapy sources were simulated with the MCNPX Monte Carlo code, and their dose rate constants and radial dose functions were compared with the published data. A spherical phantom with a 50 cm radius was simulated and the dose at various radial distances in adipose tissue, breast tissue, 4-component soft tissue, brain (grey/white matter), muscle (skeletal), lung tissue, blood (whole), 9-component soft tissue, and water was calculated. The absolute dose and the relative dose difference with respect to 9-component soft tissue were obtained for various materials, sources, and distances. Results: There was good agreement between the dosimetric parameters of the sources and the published data. Adipose tissue, breast tissue, 4-component soft tissue, and water showed the greatest difference in dose relative to the dose to the 9-component soft tissue. The other soft tissues showed smaller dose differences. The dose difference was also higher for the 103Pd source than for the 125I, 169Yb, and 192Ir sources. Furthermore, greater distances from the source had higher relative dose differences; this effect can be explained by the change in photon spectrum (softening or hardening) as photons traverse the phantom material. Conclusions: Ignoring soft tissue characteristics (density, composition, etc.) in treatment planning systems introduces a significant error in dose delivery to the patient in brachytherapy with photon sources. The error depends on the type of soft tissue, the brachytherapy source, as well as the distance from the source. PMID:24790623
Accuracy of three-dimensional facial soft tissue simulation in post-traumatic zygoma reconstruction.
Li, P; Zhou, Z W; Ren, J Y; Zhang, Y; Tian, W D; Tang, W
2016-12-01
The aim of this study was to evaluate the accuracy of novel software, CMF-preCADS, for the prediction of soft tissue changes following repositioning surgery for zygomatic fractures. Twenty patients who had sustained an isolated zygomatic fracture accompanied by facial deformity and who were treated with repositioning surgery participated in this study. Cone beam computed tomography (CBCT) scans and three-dimensional (3D) stereophotographs were acquired preoperatively and postoperatively. The 3D skeletal model from the preoperative CBCT data was matched with the postoperative one, and the fractured zygomatic fragments were segmented and aligned to the postoperative position for prediction. Then, the predicted model was matched with the postoperative 3D stereophotograph for quantification of the simulation error. The mean absolute error in the zygomatic soft tissue region between the predicted model and the real one was 1.42±1.56 mm for all cases. The accuracy of the prediction (mean absolute error ≤2 mm) was 87%. In the subjective assessment it was found that the majority of evaluators considered the predicted model and the postoperative model to be 'very similar'. CMF-preCADS software can provide a realistic, accurate prediction of the facial soft tissue appearance after repositioning surgery for zygomatic fractures. The reliability of this software for other types of repositioning surgery for maxillofacial fractures should be validated in the future. Copyright © 2016. Published by Elsevier Ltd.
Evaluating Application Resilience with XRay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Sui; Bronevetsky, Greg; Li, Bin
2015-05-07
The rising count and shrinking feature size of transistors within modern computers is making them increasingly vulnerable to various types of soft faults. This problem is especially acute in high-performance computing (HPC) systems used for scientific computing, because these systems include many thousands of compute cores and nodes, all of which may be utilized in a single large-scale run. The increasing vulnerability of HPC applications to errors induced by soft faults is motivating extensive work on techniques to make these applications more resilient to such faults, ranging from generic techniques such as replication or checkpoint/restart to algorithm-specific error detection and tolerance techniques. Effective use of such techniques requires a detailed understanding of how a given application is affected by soft faults to ensure that (i) efforts to improve application resilience are spent in the code regions most vulnerable to faults and (ii) the appropriate resilience technique is applied to each code region. This paper presents XRay, a tool to view the application vulnerability to soft errors, and illustrates how XRay can be used in the context of a representative application. In addition to providing actionable insights into application behavior, XRay automatically selects the number of fault injection experiments required to provide an informative view of application behavior, ensuring that the information is statistically well-grounded without performing unnecessary experiments.
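The abstract does not publish XRay's stopping rule, but the textbook calculation such a tool could build on, the number of independent injection trials needed to bound a failure-probability estimate, looks like this:
```python
from math import ceil

def injections_needed(p_est, margin, z=1.96):
    """Number of independent fault-injection trials needed to estimate
    a failure probability near p_est to within +/- margin at ~95%
    confidence, via the normal approximation to the binomial. This is
    the standard sample-size formula, not XRay's actual algorithm."""
    return ceil(z * z * p_est * (1.0 - p_est) / (margin * margin))

# e.g., resolving a ~10% silent-data-corruption rate to +/- 2%:
print(injections_needed(0.10, 0.02))   # ~865 trials
```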
Preventing medical errors by designing benign failures.
Grout, John R
2003-07-01
One way to successfully reduce medical errors is to design health care systems that are more resistant to the tendencies of human beings to err. One interdisciplinary approach entails creating design changes, mitigating human errors, and making human error irrelevant to outcomes. This approach is intended to facilitate the creation of benign failures, which have been called mistake-proofing devices and forcing functions elsewhere. USING FAULT TREES TO DESIGN FORCING FUNCTIONS: A fault tree is a graphical tool used to understand the relationships that either directly cause or contribute to the cause of a particular failure. A careful analysis of a fault tree enables the analyst to anticipate how the process will behave after the change. EXAMPLE OF AN APPLICATION: A scenario in which a patient is scalded while bathing can serve as an example of how multiple fault trees can be used to design forcing functions. The first fault tree shows the undesirable event--patient scalded while bathing. The second fault tree has a benign event--no water. Adding a scald valve changes the outcome from the undesirable event ("patient scalded while bathing") to the benign event ("no water"). Analysis of fault trees does not ensure or guarantee that changes necessary to eliminate error actually occur. Most mistake-proofing is used to prevent simple errors and to create well-defended processes, but complex errors can also result. The utilization of mistake-proofing or forcing functions can be thought of as changing the logic of a process. Errors that formerly caused undesirable failures can be converted into the causes of benign failures. The use of fault trees can provide a variety of insights into the design of forcing functions that will improve patient safety.
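The fault tree algebra behind the example is small enough to sketch: OR gates combine independent basic-event probabilities, AND gates multiply them. The probabilities below are invented for illustration:
```python
def p_or(*ps):
    """P(at least one of several independent basic events) -- an OR gate."""
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

def p_and(*ps):
    """P(all independent input events occur) -- an AND gate."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# OR gate: water can be dangerously hot via either basic event.
p_water_too_hot = p_or(1e-3, 5e-4)    # thermostat fault, mixing valve stuck
p_person_exposed = 0.9

print("scald, no guard:  ", p_and(p_water_too_hot, p_person_exposed))
# A scald valve converts the top event into the benign "no water"
# unless the valve itself also fails on demand:
print("scald, with valve:", p_and(p_water_too_hot, p_person_exposed, 1e-2))
```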
Schreckengaust, Richard; Littlejohn, Lanny; Zarow, Gregory J
2014-02-01
The lower extremity tourniquet failure rate remains significantly higher in combat than in preclinical testing, so we hypothesized that tourniquet placement accuracy, speed, and effectiveness would improve during training and decline during simulated combat. Navy Hospital Corpsmen (N = 89), enrolled in a Tactical Combat Casualty Care training course in preparation for deployment, applied the Combat Application Tourniquet (CAT) and the Special Operations Forces Tactical Tourniquet (SOFT-T) on day 1 and day 4 of classroom training, then under simulated combat, wherein participants ran an obstacle course to apply a tourniquet while wearing full body armor and avoiding simulated small arms fire (paint balls). Application time and pulse elimination effectiveness improved from day 1 to day 4 (p < 0.005). Under simulated combat, application time slowed significantly (p < 0.001), whereas accuracy and effectiveness declined slightly. Pulse elimination was poor for the CAT (25% failure) and the SOFT-T (60% failure) even in classroom conditions following training. The CAT was more quickly applied (p < 0.005) and more effective (p < 0.002) than the SOFT-T. Training fostered fast and effective application of leg tourniquets, while performance declined under simulated combat. The inherent efficacy of tourniquet products contributes to high failure rates under combat conditions, pointing to the need for superior tourniquets and for rigorous deployment preparation training in simulated combat scenarios. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.
A device for characterising the mechanical properties of the plantar soft tissue of the foot.
Parker, D; Cooper, G; Pearson, S; Crofts, G; Howard, D; Busby, P; Nester, C
2015-11-01
The plantar soft tissue is a highly functional viscoelastic structure involved in transferring load to the human body during walking. A Soft Tissue Response Imaging Device was developed to apply a vertical compression to the plantar soft tissue whilst measuring the mechanical response via a combined load cell and ultrasound imaging arrangement. The device was evaluated for: accuracy of motion compared to input profiles; validation of the response measured for standard materials in compression; variability of force and displacement measures for consecutive compressive cycles; and implementation in vivo with five healthy participants. Static displacement displayed an average error of 0.04 mm (range of 15 mm), and static load displayed an average error of 0.15 N (range of 250 N). Validation tests showed acceptable agreement compared to a Houndsfield tensometer for both displacement (CMC > 0.99, RMSE < 0.18 mm) and load (CMC > 0.95, RMSE < 4.86 N). Device motion was highly repeatable for bench-top tests (ICC = 0.99) and participant trials (CMC = 1.00). Soft tissue response was found repeatable for intra-trials (CMC > 0.98) and inter-trials (CMC > 0.70). The device has been shown to be capable of implementing complex loading patterns similar to gait, and of capturing the compressive response of the plantar soft tissue for a range of loading conditions in vivo. Copyright © 2015. Published by Elsevier Ltd.
Testolin, C G; Gore, R; Rivkin, T; Horlick, M; Arbo, J; Wang, Z; Chiumello, G; Heymsfield, S B
2000-12-01
Dual-energy X-ray absorptiometry (DXA) percent (%) fat estimates may be inaccurate in young children, who typically have high tissue hydration levels. This study was designed to provide a comprehensive analysis of pediatric tissue hydration effects on DXA %fat estimates. Phase 1 was experimental and included three in vitro studies to establish the physical basis of DXA %fat-estimation models. Phase 2 extended the phase 1 models and consisted of theoretical calculations to estimate the %fat errors emanating from previously reported pediatric hydration effects. Phase 1 experiments supported the two-compartment DXA soft tissue model and established that the pixel ratio of low to high energy (the R value) is a predictable function of tissue elemental content. In phase 2, modeling of reference body composition values from birth to age 120 mo revealed that %fat errors will arise if a "constant" adult lean soft tissue R value is applied to the pediatric population; the maximum %fat error, approximately 0.8%, would be present at birth. High tissue hydration, as observed in infants and young children, leads to errors in DXA %fat estimates. The magnitude of these errors based on theoretical calculations is small and may not be of clinical or research significance.
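The two-compartment model can be stated in a few lines: each pixel's R value is treated as interpolating between pure-fat and pure-lean calibration values, so a water-shifted lean R value biases the %fat estimate. The calibration numbers here are illustrative, not the instrument's, and the shift shown is exaggerated relative to the ~0.8% maximum error reported above:
```python
def percent_fat(r_obs, r_fat=1.21, r_lean=1.40):
    """Two-compartment DXA sketch: linear interpolation of the measured
    R value between pure-fat and pure-lean calibration points.
    Calibration values are invented for illustration."""
    return 100.0 * (r_lean - r_obs) / (r_lean - r_fat)

# The hydration effect in miniature: extra water shifts the true lean
# R value, so applying an adult r_lean to a young child misreads %fat.
print(percent_fat(1.33))               # adult lean calibration
print(percent_fat(1.33, r_lean=1.39))  # slightly water-shifted lean R
```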
A Strategy to Use Soft Data Effectively in Randomized Controlled Clinical Trials.
ERIC Educational Resources Information Center
Kraemer, Helena Chmura; Thiemann, Sue
1989-01-01
Sees soft data, measures having substantial intrasubject variability due to errors of measurement or response inconsistency, as important measures of response in randomized clinical trials. Shows that using intensive design and slope of response on time as outcome measure maximizes sample retention and decreases within-group variability, thus…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Zhenhua; Falzone, Gabriel; Das, Sumanta
The addition of phase change materials (PCMs) has been proposed as a way to mitigate thermal cracking in cementitious materials. However, the addition of PCMs, i.e., soft inclusions, degrades the compressive strength of cementitious composites. From a strength-of-materials viewpoint, such reductions in strength are suspected to increase the tendency of cementitious materials containing PCMs to crack under load (e.g., volume instability-induced stresses resulting from thermal and/or hygral deformations). Based on detailed assessments of free and restrained shrinkage, elastic modulus, and tensile strength, this study shows that the addition of PCMs does not alter the cracking sensitivity of the material. In fact, the addition of PCMs (or other soft inclusions) enhances the cracking resistance as compared to a plain cement paste or composites containing equivalent dosages of (stiff) quartz inclusions. This is because composites containing soft inclusions demonstrate benefits resulting from crack blunting and deflection, and improved stress relaxation. As a result, although the tensile stress at failure remains similar, the time to failure (i.e., macroscopic cracking) of PCM-containing composites is considerably extended. More generally, the outcomes indicate that dosages of soft(er) inclusions, and the resulting decrease in compressive strength, do not amplify the cracking risk of cementitious composites.
Giardina, M; Castiglia, F; Tomarchio, E
2014-12-01
Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events.
Colen, David L; Carney, Martin J; Shubinets, Valeriy; Lanni, Michael A; Liu, Tiffany; Levin, L Scott; Lee, Gwo-Chin; Kovach, Stephen J
2018-04-01
Total knee arthroplasty is a common orthopedic procedure in the United States and complications can be devastating. Soft-tissue compromise or joint infection may cause failure of prosthesis requiring knee fusion or amputation. The role of a plastic surgeon in total knee arthroplasty is critical for cases requiring optimization of the soft-tissue envelope. The purpose of this study was to elucidate factors associated with total knee arthroplasty salvage following complications and clarify principles of reconstruction to optimize outcomes. A retrospective review of patients requiring soft-tissue reconstruction performed by the senior author after total knee arthroplasty over 8 years was completed. Logistic regression and Fisher's exact tests determined factors associated with the primary outcome, prosthesis salvage versus knee fusion or amputation. Seventy-three knees in 71 patients required soft-tissue reconstruction (mean follow-up, 1.8 years), with a salvage rate of 61.1 percent, mostly using medial gastrocnemius flaps. Patients referred to our institution with complicated periprosthetic wounds were significantly more likely to lose their knee prosthesis than patients treated only within our system. Patients with multiple prior knee operations before definitive soft-tissue reconstruction had significantly decreased rates of prosthesis salvage and an increased risk of amputation. Knee salvage significantly decreased with positive joint cultures (Gram-negative greater than Gram-positive organisms) and particularly at the time of definitive reconstruction, which also trended toward an increased risk of amputation. In revision total knee arthroplasty, prompt soft-tissue reconstruction improves the likelihood of success, and protracted surgical courses and contamination increase failure and amputations. The authors show a benefit to involving plastic surgeons early in the course of total knee arthroplasty complications to optimize genicular soft tissues. Therapeutic, III.
Human Reliability and the Cost of Doing Business
NASA Technical Reports Server (NTRS)
DeMott, Diana
2014-01-01
Most businesses recognize that people will make mistakes and assume errors are just part of the cost of doing business, but do they need to be? Companies with high risk, or major consequences, should consider the effect of human error. In a variety of industries, human errors have caused costly failures and workplace injuries: airline mishaps, medical malpractice, errors in the administration of medication, and major oil spills have all been blamed on human error. A technique to mitigate or even eliminate some of these costly human errors is the use of Human Reliability Analysis (HRA). Various methodologies are available to perform Human Reliability Assessments, ranging from identifying the most likely areas for concern to detailed assessments with human error failure probabilities calculated. Which methodology to use would be based on a variety of factors, including: 1) how people react and act in different industries, and differing expectations based on industry standards; 2) factors that influence how human errors could occur, such as tasks, tools, environment, workplace, support, training, and procedures; 3) the type and availability of data; and 4) how the industry views risk and reliability influences (types of emergencies, contingencies, and routine tasks versus cost-based concerns). Human Reliability Assessments should be the first step to reduce, mitigate, or eliminate costly mistakes or catastrophic failures. Using Human Reliability techniques to identify and classify human error risks allows a company more opportunities to mitigate or eliminate these risks and prevent costly failures.
Research on On-Line Modeling of Fed-Batch Fermentation Process Based on v-SVR
NASA Astrophysics Data System (ADS)
Ma, Yongjun
The fermentation process is very complex and non-linear, and many parameters are not easy to measure directly on line, so soft sensor modeling is a good solution. This paper introduces v-support vector regression (v-SVR) for soft sensor modeling of the fed-batch fermentation process. v-SVR is a novel type of learning machine that can control the fitting accuracy and prediction error by adjusting the parameter v. An on-line training algorithm is discussed in detail to reduce the training complexity of v-SVR. The experimental results show that v-SVR has a low error rate and better generalization with an appropriate v.
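A sketch of a v-SVR soft sensor using scikit-learn's NuSVR, which exposes the nu parameter described above; the fermentation data are simulated, and a true on-line variant would retrain or update incrementally as the batch progresses:
```python
import numpy as np
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)

# Simulated process data standing in for on-line measurements
# (e.g., temperature, pH, dissolved oxygen) and a hard-to-measure
# target such as biomass concentration.
X = rng.uniform(size=(200, 3))
y = 5 * X[:, 0] - 2 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)

# nu trades off support-vector fraction against fitting tolerance.
model = NuSVR(nu=0.5, C=10.0, kernel="rbf", gamma="scale").fit(X[:150], y[:150])
pred = model.predict(X[150:])
print("RMSE:", float(np.sqrt(np.mean((pred - y[150:]) ** 2))))
```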
Formal Validation of Aerospace Software
NASA Astrophysics Data System (ADS)
Lesens, David; Moy, Yannick; Kanig, Johannes
2013-08-01
Any single error in critical software can have catastrophic consequences. Even though failures are usually not advertised, some software bugs have become famous, such as the error in the MIM-104 Patriot. For space systems, experience shows that software errors are a serious concern: more than half of all satellite failures from 2000 to 2003 involved software. To address this concern, this paper addresses the use of formal verification of software developed in Ada.
The Communication Link and Error ANalysis (CLEAN) simulator
NASA Technical Reports Server (NTRS)
Ebel, William J.; Ingels, Frank M.; Crowe, Shane
1993-01-01
During the period July 1, 1993 through December 30, 1993, significant developments to the Communication Link and Error ANalysis (CLEAN) simulator were completed and include: (1) soft-decision Viterbi decoding; (2) node synchronization for the soft-decision Viterbi decoder; (3) insertion/deletion error programs; (4) convolutional encoder; (5) programs to investigate new convolutional codes; (6) pseudo-noise sequence generator; (7) soft-decision data generator; (8) RICE compression/decompression (integration of RICE code generated by Pen-Shu Yeh at Goddard Space Flight Center); (9) Markov chain channel modeling; (10) percent complete indicator when a program is executed; (11) header documentation; and (12) help utility. The CLEAN simulation tool is now capable of simulating a very wide variety of satellite communication links, including the TDRSS downlink with RFI. The RICE compression/decompression schemes allow studies to be performed on error effects on RICE-decompressed data. The Markov chain modeling programs allow channels with memory to be simulated. Memory results from filtering, forward error correction encoding/decoding, differential encoding/decoding, channel RFI, nonlinear transponders, and many other satellite system processes. Besides the development of the simulation, a study was performed to determine whether the PCI provides a performance improvement for the TDRSS downlink. The TDRSS downlink is subject to RFI with several duty cycles. We conclude that the PCI does not improve performance for any of these interferers except possibly one that occurs for the TDRS East. Therefore, the usefulness of the PCI is a function of the time spent transmitting data to the WSGT through the TDRS East transponder.
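Of the listed components, the Markov chain channel model is the easiest to illustrate; a two-state Gilbert-Elliott sketch (all parameters invented) shows how channel memory produces bursty errors:
```python
import random

def gilbert_elliott(n_bits, p_gb=0.01, p_bg=0.2, e_good=1e-5, e_bad=0.1):
    """Two-state Markov chain channel (good/bad) of the kind used to
    model bursty links with memory. Transition and error probabilities
    here are illustrative, not drawn from the CLEAN simulator."""
    state_bad, errors = False, []
    for _ in range(n_bits):
        if state_bad:
            state_bad = random.random() >= p_bg   # stay bad unless recovery
        else:
            state_bad = random.random() < p_gb    # occasionally go bad
        err_p = e_bad if state_bad else e_good
        errors.append(random.random() < err_p)
    return errors

random.seed(7)
errs = gilbert_elliott(100_000)
print("overall BER:", sum(errs) / len(errs))
```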
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattarivand, Mike; Summers, Clare; Robar, James
Purpose: To evaluate the validity of using the spine as a surrogate for tumor positioning with ExacTrac stereoscopic imaging in lung stereotactic body radiation therapy (SBRT). Methods: Using the Novalis ExacTrac x-ray system, 39 lung SBRT patients (182 treatments) were aligned before treatment with a 6-degree-of-freedom (6D) couch (3 translations, 3 rotations) based on spine matching on stereoscopic images. The couch was shifted to the treatment isocenter and pre-treatment CBCT was performed based on a soft tissue match around the tumor volume. The CBCT data were used to measure residual errors following ExacTrac alignment. The thresholds for re-aligning the patients based on CBCT were a 3 mm shift or a 3° rotation (in any of the 6D). In order to evaluate the effect of tumor location on residual errors, correlations between tumor distance from the spine and individual residual errors were calculated. Results: Residual errors were up to 0.5±2.4 mm. Using the 3 mm/3° thresholds, 80/182 (44%) of the treatments required re-alignment based on CBCT soft tissue matching following ExacTrac spine alignment. Most mismatches were in the sup-inf, ant-post, and roll directions, which had larger standard deviations. No correlation was found between tumor distance from the spine and individual residual errors. Conclusion: ExacTrac stereoscopic imaging offers quick pre-treatment patient alignment. However, bone matching based on the spine is not reliable for aligning lung SBRT patients, who require soft tissue image registration from CBCT. The spine can be a poor surrogate for lung SBRT patient alignment, even for proximal tumor volumes.
Smart Braid Feedback for the Closed-Loop Control of Soft Robotic Systems.
Felt, Wyatt; Chin, Khai Yi; Remy, C David
2017-09-01
This article experimentally investigates the potential of using flexible, inductance-based contraction sensors in the closed-loop motion control of soft robots. Accurate motion control remains a highly challenging task for soft robotic systems. Precise models of the actuation dynamics and environmental interactions are often unavailable. This renders open-loop control impossible, while closed-loop control suffers from a lack of suitable feedback. Conventional motion sensors, such as linear or rotary encoders, are difficult to adapt to robots that lack discrete mechanical joints. The rigid nature of these sensors runs contrary to the aspirational benefits of soft systems. As truly soft sensor solutions are still in their infancy, motion control of soft robots has so far relied on laboratory-based sensing systems such as motion capture, electromagnetic (EM) tracking, or Fiber Bragg Gratings. In this article, we used embedded flexible sensors known as Smart Braids to sense the contraction of McKibben muscles through changes in inductance. We evaluated closed-loop control on two systems: a revolute joint and a planar, one degree of freedom continuum manipulator. In the revolute joint, our proposed controller compensated for elasticity in the actuator connections. The Smart Braid feedback allowed motion control with a steady-state root-mean-square (RMS) error of 1.5°. In the continuum manipulator, Smart Braid feedback enabled tracking of the desired tip angle with a steady-state RMS error of 1.25°. This work demonstrates that Smart Braid sensors can provide accurate position feedback in closed-loop motion control suitable for field applications of soft robotic systems.
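As a rough illustration of the closed-loop idea, the sketch below feeds an inductance-derived contraction estimate back into a PI controller driving a toy actuator model. Every constant, the calibration function, and the plant dynamics are hypothetical stand-ins, not the article's Smart Braid hardware or controller.

```python
# Minimal sketch of closed-loop motion control from an inductance-based
# contraction sensor; every constant and model here is hypothetical.
def angle_from_inductance(L_uH, L0_uH=12.0, gain_deg=100.0):
    # Assumed monotonic calibration: inductance drops as the braid contracts.
    return gain_deg * (L0_uH - L_uH) / L0_uH

state = {"integral": 0.0}
true_angle = 0.0
for _ in range(2000):
    L_meas = 12.0 * (1.0 - true_angle / 100.0)    # fake sensor model
    est = angle_from_inductance(L_meas)           # feedback signal
    err = 30.0 - est                              # 30-degree setpoint
    state["integral"] += err * 0.01
    u = 2.0 * err + 0.5 * state["integral"]       # PI valve command
    true_angle += 0.01 * (u - 0.1 * true_angle)   # toy actuator dynamics
print("tracking error (deg):", round(30.0 - true_angle, 3))
```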
Modeling of a bubble-memory organization with self-checking translators to achieve high reliability.
NASA Technical Reports Server (NTRS)
Bouricius, W. G.; Carter, W. C.; Hsieh, E. P.; Wadia, A. B.; Jessep, D. C., Jr.
1973-01-01
Study of the design and modeling of a highly reliable bubble-memory system that has the capabilities of: (1) correcting a single 16-adjacent-bit group error resulting from failures in a single basic storage module (BSM), and (2) detecting with a probability greater than 0.99 any double errors resulting from failures in BSMs. The results of the study justify the adopted design philosophy of employing memory data encoding and a translator to correct single group errors and detect double group errors to enhance the overall system reliability.
Adaptive and Resilient Soft Tensegrity Robots.
Rieffel, John; Mouret, Jean-Baptiste
2018-04-17
Living organisms intertwine soft (e.g., muscle) and hard (e.g., bones) materials, giving them an intrinsic flexibility and resiliency often lacking in conventional rigid robots. The emerging field of soft robotics seeks to harness these same properties to create resilient machines. The nature of soft materials, however, presents considerable challenges to aspects of design, construction, and control-and up until now, the vast majority of gaits for soft robots have been hand-designed through empirical trial-and-error. This article describes an easy-to-assemble tensegrity-based soft robot capable of highly dynamic locomotive gaits and demonstrating structural and behavioral resilience in the face of physical damage. Enabling this is the use of a machine learning algorithm able to discover effective gaits with a minimal number of physical trials. These results lend further credence to soft-robotic approaches that seek to harness the interaction of complex material dynamics to generate a wealth of dynamical behaviors.
Underlying Cause(s) of Letter Perseveration Errors
ERIC Educational Resources Information Center
Fischer-Baum, Simon; Rapp, Brenda
2012-01-01
Perseverations, the inappropriate intrusion of elements from a previous response into a current response, are commonly observed in individuals with acquired deficits. This study specifically investigates the contribution of failure-to-activate and failure-to-inhibit deficit(s) in the generation of letter perseveration errors in acquired…
Identifying the latent failures underpinning medication administration errors: an exploratory study.
Lawton, Rebecca; Carruthers, Sam; Gardner, Peter; Wright, John; McEachan, Rosie R C
2012-08-01
The primary aim of this article was to identify the latent failures that are perceived to underpin medication errors. The study was conducted within three medical wards in a hospital in the United Kingdom. The study employed a cross-sectional qualitative design. Interviews were conducted with 12 nurses and eight managers. Interviews were transcribed and subject to thematic content analysis. A two-step inter-rater comparison tested the reliability of the themes. Ten latent failures were identified based on the analysis of the interviews. These were ward climate, local working environment, workload, human resources, team communication, routine procedures, bed management, written policies and procedures, supervision and leadership, and training. The discussion focuses on ward climate, the most prevalent theme, which is conceptualized here as interacting with failures in the nine other organizational structures and processes. This study is the first of its kind to identify the latent failures perceived to underpin medication errors in a systematic way. The findings can be used as a platform for researchers to test the impact of organization-level patient safety interventions and to design proactive error management tools and incident reporting systems in hospitals. © Health Research and Educational Trust.
Accuracy Study of a Robotic System for MRI-guided Prostate Needle Placement
Seifabadi, Reza; Cho, Nathan BJ.; Song, Sang-Eun; Tokuda, Junichi; Hata, Nobuhiko; Tempany, Clare M.; Fichtinger, Gabor; Iordachita, Iulian
2013-01-01
Background Accurate needle placement is the first concern in percutaneous MRI-guided prostate interventions. In this phantom study, different sources contributing to the overall needle placement error of an MRI-guided robot for prostate biopsy have been identified, quantified, and minimized to the possible extent. Methods and Materials The overall needle placement error of the system was evaluated in a prostate phantom. This error was broken into two parts: the error associated with the robotic system (called before-insertion error) and the error associated with needle-tissue interaction (called due-to-insertion error). The before-insertion error was measured directly in a soft phantom and the different sources contributing to this part were identified and quantified. A calibration methodology was developed to minimize the 4-DOF manipulator's error. The due-to-insertion error was indirectly approximated by comparing the overall error and the before-insertion error. The effect of sterilization on the manipulator's accuracy and repeatability was also studied. Results The average overall system error in the phantom study was 2.5 mm (STD=1.1 mm). The average robotic system error in the super soft phantom was 1.3 mm (STD=0.7 mm). Assuming orthogonal error components, the needle-tissue interaction error was approximated to be 2.13 mm, thus making a larger contribution to the overall error. The average susceptibility artifact shift was 0.2 mm. The manipulator's targeting accuracy was 0.71 mm (STD=0.21 mm) after robot calibration. The robot's repeatability was 0.13 mm. Sterilization had no noticeable influence on the robot's accuracy and repeatability. Conclusions The experimental methodology presented in this paper may help researchers to identify, quantify, and minimize different sources contributing to the overall needle placement error of an MRI-guided robotic system for prostate needle placement. In the robotic system analyzed here, the overall error of the studied system remained within the acceptable range. PMID:22678990
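The reported due-to-insertion figure follows from the stated orthogonality assumption; as a worked check of the abstract's numbers:

```latex
e_{\text{due}} = \sqrt{e_{\text{overall}}^{2} - e_{\text{before}}^{2}}
             = \sqrt{2.5^{2} - 1.3^{2}}
             = \sqrt{4.56}
             \approx 2.1\ \text{mm}
```

which matches the reported 2.13 mm up to rounding of the input values.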
Report of the Odyssey FPGA Independent Assessment Team
NASA Technical Reports Server (NTRS)
Mayer, Donald C.; Katz, Richard B.; Osborn, Jon V.; Soden, Jerry M.; Barto, R.; Day, John H. (Technical Monitor)
2001-01-01
An independent assessment team (IAT) was formed and met on April 2, 2001, at Lockheed Martin in Denver, Colorado, to aid in understanding a technical issue for the Mars Odyssey spacecraft scheduled for launch on April 7, 2001. An RP1280A field-programmable gate array (FPGA) from a lot of parts common to the SIRTF, Odyssey, and Genesis missions had failed on a SIRTF printed circuit board. A second FPGA from an earlier Odyssey circuit board was also known to have failed and was also included in the analysis by the IAT. Observations indicated an abnormally high failure rate for flight RP1280A devices (the first flight lot produced using this flow) at Lockheed Martin, and the causes of these failures were not determined. Standard failure analysis techniques were applied to these parts; however, additional diagnostic techniques unique to devices of this class were not used, and the parts were prematurely submitted to a destructive physical analysis, making a determination of the root cause of failure difficult. Any of several potential failure scenarios may have caused these failures, including electrostatic discharge, electrical overstress, manufacturing defects, board design errors, board manufacturing errors, FPGA design errors, or programmer errors. Several of these mechanisms would have relatively benign consequences for disposition of the parts currently installed on boards in the Odyssey spacecraft if established as the root cause of failure. However, other potential failure mechanisms could have more dire consequences. As there is no simple way to determine the likely failure mechanisms with reasonable confidence before the Odyssey launch, it is not possible for the IAT to recommend a disposition for the other parts on boards in the Odyssey spacecraft based on sound engineering principles.
Management of failure after surgery for gastro-esophageal reflux disease.
Gronnier, C; Degrandi, O; Collet, D
2018-04-01
Surgical treatment of gastro-esophageal reflux disease (ST-GERD) is well-codified and offers an alternative to long-term medical treatment, with better efficacy for short- and long-term outcomes. However, failure of ST-GERD is observed in 2-20% of patients; management is challenging and not standardized. The aim of this study is to analyze the causes of failure and to provide a treatment algorithm. The clinical presentation of ST-GERD failure is variable, including persistent reflux, dysphagia or permanent discomfort leading to an important degradation of the quality of life. A morphological and functional pre-therapeutic evaluation is necessary to: (i) determine whether the symptoms are due to recurrence of reflux or to an error in the initial indication and (ii) understand the cause of the failure. The most frequent causes of failure of ST-GERD include errors in the initial indication, which often require only medical treatment, and surgical technical errors, for which redo surgery can be difficult. Multidisciplinary management is necessary in order to offer the best-adapted treatment. Copyright © 2018. Published by Elsevier Masson SAS.
Identification of risk factors for failure in patients with skin and soft tissue infections.
Cieri, Brittany; Conway, Erin L; Sellick, John A; Mergenhagen, Kari A
2018-04-23
The purpose was to determine significant predictors of treatment failure of skin and soft tissue infections (SSTI) in the inpatient and outpatient setting. We performed a retrospective chart review of patients treated between January 1, 2005 and July 1, 2016 with an ICD-9 or ICD-10 code of cellulitis or abscess. The primary outcome was failure, defined as an additional prescription or subsequent hospital admission within 30 days of treatment. Risk factors for failure were identified through multivariate logistic regression. A total of 541 patients were included. Seventeen percent failed treatment. In the outpatient group, 24% failed treatment compared to 9% for inpatients. Overweight/obesity (body mass index (BMI) >25 kg/m2) was identified in 80%, with 15% having a BMI >40 kg/m2. BMI, heart failure, and outpatient treatment were determined to be significant predictors of failure. The unit odds ratio for failure with BMI was 1.04 (95% [CI] = 1.01 to 1.1, p = 0.0042). Heart failure increased odds by 2.48 (95% [CI] = 1.3 to 4.7, p = 0.0056). Outpatients were more likely to fail, with an odds ratio of 3.36. Patients with an elevated BMI and heart failure were found to have increased odds of failure with treatment for SSTIs. However, inpatients had considerably less risk of failure than outpatients. These risk factors are important to note when making the decision whether to admit a patient who presents with SSTI in the emergency department. Thoughtful strategies are needed with this at-risk population to prevent subsequent admission. Copyright © 2018. Published by Elsevier Inc.
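For readers unfamiliar with how such unit odds ratios are obtained, the sketch below fits a multivariate logistic regression on synthetic data and exponentiates the coefficients; the variable names and effect sizes are illustrative, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of the multivariate logistic regression behind the reported odds
# ratios (synthetic data; predictors mirror the abstract's three factors).
rng = np.random.default_rng(0)
n = 541
bmi = rng.normal(31, 6, n)                      # kg/m^2
heart_failure = rng.binomial(1, 0.15, n)
outpatient = rng.binomial(1, 0.5, n)
logit = -4.0 + 0.04 * bmi + 0.9 * heart_failure + 1.2 * outpatient
failure = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([bmi, heart_failure, outpatient]))
fit = sm.Logit(failure, X).fit(disp=0)
print("odds ratios:", np.exp(fit.params[1:]))   # e.g. exp(0.04) ~ 1.04 per BMI unit
```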
Schulz, Christian M; Burden, Amanda; Posner, Karen L; Mincer, Shawn L; Steadman, Randolph; Wagner, Klaus J; Domino, Karen B
2017-08-01
Situational awareness errors may play an important role in the genesis of patient harm. The authors examined closed anesthesia malpractice claims for death or brain damage to determine the frequency and type of situational awareness errors. Surgical and procedural anesthesia death and brain damage claims in the Anesthesia Closed Claims Project database were analyzed. Situational awareness error was defined as failure to perceive relevant clinical information, failure to comprehend the meaning of available information, or failure to project, anticipate, or plan. Patient and case characteristics, primary damaging events, and anesthesia payments in claims with situational awareness errors were compared to other death and brain damage claims from 2002 to 2013. Anesthesiologist situational awareness errors contributed to death or brain damage in 198 of 266 claims (74%). Respiratory system damaging events were more common in claims with situational awareness errors (56%) than other claims (21%, P < 0.001). The most common specific respiratory events in error claims were inadequate oxygenation or ventilation (24%), difficult intubation (11%), and aspiration (10%). Payments were made in 85% of situational awareness error claims compared to 46% in other claims (P = 0.001), with no significant difference in payment size. Among 198 claims with anesthesia situational awareness error, perception errors were most common (42%), whereas comprehension errors (29%) and projection errors (29%) were relatively less common. Situational awareness error definitions were operationalized for reliable application to real-world anesthesia cases. Situational awareness errors may have contributed to catastrophic outcomes in three quarters of recent anesthesia malpractice claims. Situational awareness errors resulting in death or brain damage remain prevalent causes of malpractice claims in the 21st century.
Detection of Failure in Asynchronous Motor Using Soft Computing Method
NASA Astrophysics Data System (ADS)
Vinoth Kumar, K.; Sony, Kevin; Achenkunju John, Alan; Kuriakose, Anto; John, Ano P.
2018-04-01
This paper investigates stator short-winding failures of an asynchronous motor and their effects on the motor current spectrum. A fuzzy logic approach, i.e., a model-based technique, may help to detect asynchronous motor failures. Fuzzy logic resembles human reasoning in that it enables inferences from vague data through linguistic rules. A dynamic model of the asynchronous motor was developed with a fuzzy logic classifier to investigate stator inter-turn failure and open-phase failure. A hardware implementation was carried out with LabVIEW for online monitoring of faults.
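A minimal sketch of the fuzzy classification idea follows, using triangular membership functions over a single current feature; the feature choice, breakpoints, and rule base are invented for illustration and are not the paper's model.

```python
# Minimal sketch of a fuzzy-logic fault classifier for a stator-current
# feature (membership breakpoints and rule base are illustrative assumptions).
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify(neg_seq_current):
    # Fuzzify the negative-sequence current (in % of rated current).
    healthy = tri(neg_seq_current, -1.0, 0.0, 2.0)
    inter_turn = tri(neg_seq_current, 1.0, 4.0, 8.0)
    open_phase = tri(neg_seq_current, 6.0, 15.0, 30.0)
    label, _ = max([("healthy", healthy), ("inter-turn fault", inter_turn),
                    ("open-phase fault", open_phase)], key=lambda kv: kv[1])
    return label

print(classify(0.5), "|", classify(4.2), "|", classify(14.0))
```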
Ballistic Impact of Braided Composites with a Soft Projectile
NASA Technical Reports Server (NTRS)
Roberts, Gary D.; Pereira, J. Michael; Revilock, Duane M., Jr.; Binienda, Wieslaw K.; Xie, Ming; Braley, Mike
2002-01-01
Impact tests using a soft gelatin projectile were performed to identify failure modes that occur at high strain energy density during impact loading. Failure modes were identified for aluminum plates and for composite plates and half-rings made from triaxial carbon fiber braid having a 0/+/-60deg architecture. For aluminum plates, a large hole formed as a result of crack propagation from the initiation site at the center of the plate. For composite plates, fiber tensile failure occurred in the back ply at the center of the plate. Cracks then propagated from this site along the +/-60deg fiber directions until triangular flaps opened to form a hole. For composite half-rings fabricated with 0deg fibers aligned circumferentially, fiber tensile failure also occurred in the back ply. Cracks first propagated from this site perpendicular to the 0deg fibers. The cracks then turned to follow the +/-60deg fibers and 0deg fibers until rectangular flaps opened to form a hole. Damage in the composites was localized near the impact site, while cracks in the aluminum extended to the boundaries.
Media multitasking and failures of attention in everyday life.
Ralph, Brandon C W; Thomson, David R; Cheyne, James Allan; Smilek, Daniel
2014-09-01
Using a series of online self-report measures, we examine media multitasking, a particularly pervasive form of multitasking, and its relations to three aspects of everyday attention: (1) failures of attention and cognitive errors, (2) mind wandering, and (3) attentional control, with an emphasis on attentional switching and distractibility. We observed a positive correlation between levels of media multitasking and self-reports of attentional failures, as well as with reports of both spontaneous and deliberate mind wandering. No correlation was observed between media multitasking and self-reported memory failures, lending credence to the hypothesis that media multitasking may be specifically related to problems of inattention, rather than cognitive errors in general. Furthermore, media multitasking was not related to self-reports of difficulties in attention switching or distractibility. We offer a plausible causal structural model assessing both direct and indirect effects among media multitasking, attentional failures, mind wandering, and cognitive errors, with the heuristic goal of constraining and motivating theories of the effects of media multitasking on inattention.
NASA Astrophysics Data System (ADS)
Lohrmann, Carol A.
1990-03-01
Interoperability of commercial Land Mobile Radios (LMR) and the military's tactical LMR is highly desirable if the U.S. government is to respond effectively in a national emergency or in a joint military operation. This ability to talk securely and immediately across agency and military service boundaries is often overlooked. One way to ensure interoperability is to develop and promote Federal communication standards (FS). This thesis surveys one area of the proposed FS 1024 for LMRs; namely, the error detection and correction (EDAC) of the message indicator (MI) bits used for cryptographic synchronization. Several EDAC codes are examined (Hamming, Quadratic Residue, hard decision Golay and soft decision Golay), tested on three FORTRAN programmed channel simulations (INMARSAT, Gaussian and constant burst width), compared and analyzed (based on bit error rates and percent of error-free super-frame runs) so that a best code can be recommended. Out of the four codes under study, the soft decision Golay code (24,12) is evaluated to be the best. This finding is based on the code's ability to detect and correct errors as well as the relative ease of implementation of the algorithm.
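A full soft-decision Golay (24,12) decoder is lengthy, so the sketch below illustrates the simpler end of the codes compared in the thesis with a hard-decision Hamming(7,4) syndrome decoder; it is a generic textbook construction, not the FS 1024 MI encoding.

```python
import numpy as np

# Minimal hard-decision Hamming(7,4) encode/decode sketch, illustrating the
# kind of EDAC coding compared in the thesis (not the FS 1024 parameters).
G = np.array([[1,0,0,0,1,1,0],
              [0,1,0,0,1,0,1],
              [0,0,1,0,0,1,1],
              [0,0,0,1,1,1,1]])
H = np.array([[1,1,0,1,1,0,0],
              [1,0,1,1,0,1,0],
              [0,1,1,1,0,0,1]])

def encode(msg):
    return (msg @ G) % 2

def decode(rx):
    syndrome = (H @ rx) % 2
    if syndrome.any():                       # nonzero syndrome: locate the error
        for i in range(7):
            if np.array_equal(H[:, i], syndrome):
                rx = rx.copy()
                rx[i] ^= 1                   # correct the single-bit error
                break
    return rx[:4]                            # systematic code: data bits first

cw = encode(np.array([1, 0, 1, 1]))
cw[2] ^= 1                                   # inject one bit error
print(decode(cw))                            # -> [1 0 1 1]
```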
[Failure modes and effects analysis in the prescription, validation and dispensing process].
Delgado Silveira, E; Alvarez Díaz, A; Pérez Menéndez-Conde, C; Serna Pérez, J; Rodríguez Sagrado, M A; Bermejo Vicedo, T
2012-01-01
To apply a failure modes and effects analysis to the prescription, validation and dispensing process for hospitalised patients. A work group analysed all of the stages included in the process from prescription to dispensing, identifying the most critical errors and establishing potential failure modes which could produce a mistake. The possible causes, their potential effects, and the existing control systems were analysed to try to stop them from developing. The Hazard Score was calculated, choosing those failure modes that scored ≥ 8, and those with a Severity Index = 4 were selected independently of the Hazard Score value. Corrective measures and an implementation plan were proposed. A flow diagram that describes the whole process was obtained. A risk analysis was conducted of the chosen critical points, indicating: failure mode, cause, effect, severity, probability, Hazard Score, suggested preventative measure and the strategy to achieve it. The failure modes chosen were: prescription on the nurse's form, progress or treatment order (paper); prescription to the incorrect patient; transcription error by nursing staff and pharmacist; and error preparing the trolley. By applying a failure modes and effects analysis to the prescription, validation and dispensing process, we have been able to identify critical aspects, the stages in which errors may occur, and the causes. It has allowed us to analyse the effects on the safety of the process, and establish measures to prevent or reduce them. Copyright © 2010 SEFH. Published by Elsevier Espana. All rights reserved.
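The selection rule described above is easy to express directly; a minimal sketch, with invented failure-mode entries:

```python
# Sketch of the selection rule described above: keep failure modes with
# Hazard Score (severity x probability) >= 8, or severity = 4 regardless
# of score (the entries below are illustrative, not the study's data).
failure_modes = [
    {"mode": "prescription on nurse's form", "severity": 3, "probability": 3},
    {"mode": "prescription to incorrect patient", "severity": 4, "probability": 1},
    {"mode": "transcription error", "severity": 3, "probability": 2},
]
for fm in failure_modes:
    hs = fm["severity"] * fm["probability"]
    if hs >= 8 or fm["severity"] == 4:
        print(f"{fm['mode']}: Hazard Score {hs} -> corrective action required")
```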
1980-03-14
[Garbled program-listing fragment: the recoverable text describes input parameters for an array-element failure analysis, including the probability of element failure (Sigmar), the standard deviation of the relative error of the element weights (Sigmap), the standard deviation of the phase error, and the weight structures in the x and y coordinates (Q); the remainder is unrecoverable OCR residue.]
NASA Astrophysics Data System (ADS)
Apriyono, Arwan; Sumiyanto, Gusmawan, Dadan Deri
2017-03-01
This study presents the application of woven waste tires as soft clay subgrade reinforcement for preventing highway structural failure, reducing construction cost, and minimizing the environmental hazards associated with the increasingly large amount of waste tires in Indonesia. To this end, we performed experiments using five strip-distance variations of the woven tires, i.e., 2, 2.5, 3, 3.5 and 4 cm. Five soft clay samples were made and each was reinforced with one of these woven tires. The California Bearing Ratio (CBR) test was conducted on each soft clay sample and the CBR value was determined from the stress at displacements of 0.10 and 0.20 inch. The correlation between CBR value and strip distance was used to infer the optimum woven-tire strip distance, indicated by the largest CBR value. The result suggests that a strip distance of 3 cm is optimum, with a corresponding CBR value of ~20%, a 115% increase compared to soft clay without reinforcement.
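The CBR values quoted above follow the standard definition: the ratio of the measured penetration stress to that of a standard crushed-rock material (1000 psi at 0.10 inch, 1500 psi at 0.20 inch). A worked sketch with made-up test stresses:

```python
# Worked CBR arithmetic following the standard definition (the test stresses
# below are made-up illustrations, not the paper's data).
STD_PSI = {0.10: 1000.0, 0.20: 1500.0}   # standard stresses for crushed rock

def cbr(stress_psi, penetration_in):
    return 100.0 * stress_psi / STD_PSI[penetration_in]

print(cbr(200.0, 0.10))   # 20.0 -> CBR of ~20%, as reported for 3 cm spacing
print(cbr(280.0, 0.20))   # ~18.7; the 0.1-inch value normally governs
```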
Khurana, Harpreet Kaur; Cho, Il Kyu; Shim, Jae Yong; Li, Qing X; Jun, Soojin
2008-02-13
Aspartame is a low-calorie sweetener commonly used in soft drinks; however, the maximum usage dose is limited by the U.S. Food and Drug Administration. Fourier transform infrared (FTIR) spectroscopy with an attenuated total reflectance sampling accessory and partial least-squares regression (PLS) was used for rapid determination of aspartame in soft drinks. On the basis of spectral characterization, the highest R2 value, and the lowest PRESS value, the spectral region between 1600 and 1900 cm^-1 was selected for quantitative estimation of aspartame. The potential of FTIR spectroscopy for aspartame quantification was examined and validated by the conventional HPLC method. Using the FTIR method, aspartame contents in four selected carbonated diet soft drinks averaged from 0.43 to 0.50 mg/mL, with prediction errors ranging from 2.4 to 5.7% when compared with HPLC measurements. The developed method also showed a high degree of accuracy because real samples were used for calibration, thus minimizing potential interference errors. The FTIR method developed can be suitably used for routine quality control analysis of aspartame in the beverage-manufacturing sector.
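A minimal sketch of the PLS calibration step on a 1600-1900 cm^-1 window follows; the spectra, band position, and noise level are synthetic assumptions, not the paper's measurements.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Sketch of PLS calibration on a spectral window, as in the FTIR method
# (synthetic spectra; only the 1600-1900 cm^-1 window follows the abstract).
rng = np.random.default_rng(1)
wavenumbers = np.linspace(1600, 1900, 300)
conc = rng.uniform(0.2, 0.6, 40)                        # aspartame, mg/mL
peak = np.exp(-((wavenumbers - 1737) / 15.0) ** 2)      # hypothetical band
X = conc[:, None] * peak + rng.normal(0, 0.01, (40, 300))

pls = PLSRegression(n_components=3).fit(X, conc)
pred = pls.predict(X[:5]).ravel()
print(np.round(pred, 3), np.round(conc[:5], 3))         # predictions vs truth
```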
45 Gb/s low complexity optical front-end for soft-decision LDPC decoders.
Sakib, Meer Nazmus; Moayedi, Monireh; Gross, Warren J; Liboiron-Ladouceur, Odile
2012-07-30
In this paper a low-complexity and energy-efficient 45 Gb/s soft-decision optical front-end to be used with soft-decision low-density parity-check (LDPC) decoders is demonstrated. The results show that the optical front-end exhibits net coding gains of 7.06 and 9.62 dB at post-forward-error-correction bit error rates of 10^-7 and 10^-12 for the long-block-length LDPC(32768,26803) code. The gain over a hard-decision front-end is 1.9 dB for this code. It is shown that the soft-decision circuit can also be used as a 2-bit flash-type analog-to-digital converter (ADC), in conjunction with equalization schemes. At a bit rate of 15 Gb/s, using RS(255,239), LDPC(672,336), (672,504), (672,588), and (1440,1344) codes with a 6-tap finite impulse response (FIR) equalizer results in optical power savings of 3, 5, 7, 9.5 and 10.5 dB, respectively. The 2-bit flash ADC consumes only 2.71 W at 32 GSamples/s. At 45 GSamples/s the power consumption is estimated to be 4.95 W.
Probabilistic failure assessment with application to solid rocket motors
NASA Technical Reports Server (NTRS)
Jan, Darrell L.; Davidson, Barry D.; Moore, Nicholas R.
1990-01-01
A quantitative methodology is being developed for assessment of risk of failure of solid rocket motors. This probabilistic methodology employs best available engineering models and available information in a stochastic framework. The framework accounts for incomplete knowledge of governing parameters, intrinsic variability, and failure model specification error. Earlier case studies have been conducted on several failure modes of the Space Shuttle Main Engine. Work in progress on application of this probabilistic approach to large solid rocket boosters such as the Advanced Solid Rocket Motor for the Space Shuttle is described. Failure due to debonding has been selected as the first case study for large solid rocket motors (SRMs) since it accounts for a significant number of historical SRM failures. Impact of incomplete knowledge of governing parameters and failure model specification errors is expected to be important.
Estimating patient-specific soft-tissue properties in a TKA knee.
Ewing, Joseph A; Kaufman, Michelle K; Hutter, Erin E; Granger, Jeffrey F; Beal, Matthew D; Piazza, Stephen J; Siston, Robert A
2016-03-01
Surgical technique is one factor that has been identified as critical to success of total knee arthroplasty. Researchers have shown that computer simulations can aid in determining how decisions in the operating room generally affect post-operative outcomes. However, to use simulations to make clinically relevant predictions about knee forces and motions for a specific total knee patient, patient-specific models are needed. This study introduces a methodology for estimating knee soft-tissue properties of an individual total knee patient. A custom surgical navigation system and stability device were used to measure the force-displacement relationship of the knee. Soft-tissue properties were estimated using a parameter optimization that matched simulated tibiofemoral kinematics with experimental tibiofemoral kinematics. Simulations using optimized ligament properties had an average root mean square error of 3.5° across all tests while simulations using generic ligament properties taken from literature had an average root mean square error of 8.4°. Specimens showed large variability among ligament properties regardless of similarities in prosthetic component alignment and measured knee laxity. These results demonstrate the importance of soft-tissue properties in determining knee stability, and suggest that to make clinically relevant predictions of post-operative knee motions and forces using computer simulations, patient-specific soft-tissue properties are needed. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.
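The parameter optimization described above can be framed as nonlinear least squares over simulated-versus-measured kinematics; the sketch below uses a one-parameter placeholder knee model, since the study's actual simulation is far more detailed.

```python
import numpy as np
from scipy.optimize import least_squares

# Sketch of the parameter-identification idea: tune a ligament stiffness so
# simulated kinematics match measured ones (the model is a crude stand-in).
measured_angles = np.array([2.0, 4.5, 7.8, 10.1])        # deg, hypothetical

def simulate_kinematics(stiffness, loads=np.array([5., 10., 15., 20.])):
    # Placeholder knee model: laxity grows with load, shrinks with stiffness.
    return loads / stiffness

def residuals(params):
    return simulate_kinematics(params[0]) - measured_angles

fit = least_squares(residuals, x0=[1.0], bounds=(0.1, 100.0))
print("estimated stiffness:", fit.x[0])
```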
NASA Astrophysics Data System (ADS)
Wang, Biao; Yu, Xiaofen; Li, Qinzhao; Zheng, Yu
2008-10-01
Addressing the influence of round-grating dividing error, rolling-wheel eccentricity, and surface shape errors, this paper provides a correction method based on the rolling wheel that builds a composite error model including all of the above influence factors and then corrects the non-circular angle measurement error of the rolling wheel. Software simulation and experiments were performed; the results indicate that the composite error correction method can improve the diameter measurement accuracy of the rolling-wheel approach. It has wide application prospects for measurement accuracy better than 5 μm/m.
Addressing the Hard Factors for Command File Errors by Probabilistic Reasoning
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Bryant, Larry
2014-01-01
Command File Errors (CFE) are managed using standard risk management approaches at the Jet Propulsion Laboratory. Over the last few years, more emphasis has been placed on the collection, organization, and analysis of these errors for the purpose of reducing CFE rates. More recently, probabilistic modeling techniques have been used for a more in-depth analysis of the perceived error rates of the DAWN mission and for managing the soft factors in the upcoming phases of the mission. We broadly classify the factors that can lead to CFEs as soft factors, which relate to the cognition of the operators, and hard factors, which relate to the Mission System, which is composed of the hardware, software and procedures used for the generation, verification and validation, and execution of commands. The focus of this paper is to use probabilistic models that represent multiple missions at JPL to determine the root cause and sensitivities of the various components of the mission system and to develop recommendations and techniques for addressing them. The customization of these multi-mission models to a sample interplanetary spacecraft is done for this purpose.
The failures of root canal preparation with hand ProTaper.
Bătăiosu, Marilena; Diaconu, Oana; Moraru, Iren; Dăguci, C; Tuculină, Mihaela; Dăguci, Luminiţa; Gheorghiţă, Lelia
2012-07-01
The failures of root canal preparation are due to anatomical deviations ("C"- or "S"-shaped canals) and technique errors. Technique errors usually arise in the root canal cleaning and shaping stage and result from deviation from the objectives of endodontic treatment. Our study examined technique errors while preparing root canals with hand ProTaper. The study was performed in vitro on 84 extracted teeth (molars, premolars, incisors and canines). The root canals of these teeth were cleaned and shaped with hand ProTaper using the crown-down technique and canal irrigation with NaOCl (2.5%). The preparation was checked by X-ray. During root canal preparation, failures such as overinstrumentation, zipping and stripping phenomena, and discarded and/or fractured instruments were observed. Hand ProTaper represents revolutionary progress in endodontic treatment, but deviation from the accepted rules of root canal instrumentation can lead to failures of endodontic treatment.
NASA Technical Reports Server (NTRS)
Platt, M. E.; Lewis, E. E.; Boehm, F.
1991-01-01
A Monte Carlo Fortran computer program was developed that uses two variance reduction techniques for computing system reliability, applicable to solving very large, highly reliable fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code, which employs behavioral decomposition and complex fault-error handling models. This new capability, called MC-HARP, efficiently solves reliability models with non-constant failure rates (Weibull). Common-mode failure modeling is also a specialty.
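For contrast with MC-HARP's variance-reduced approach, a plain Monte Carlo estimate of system unreliability with Weibull (non-constant-rate) component failures looks like the sketch below; the system structure and parameters are illustrative, not MC-HARP's models.

```python
import numpy as np

# Plain Monte Carlo sketch of system unreliability with Weibull component
# failures (no variance reduction; MC-HARP's techniques are not reproduced).
rng = np.random.default_rng(2)
shape, scale, mission_t = 1.5, 5.0e4, 1.0e4   # hypothetical Weibull parameters
n_trials, n_components = 100000, 3            # 3-component series system

t_fail = scale * rng.weibull(shape, (n_trials, n_components))
system_fail = t_fail.min(axis=1) < mission_t  # series: first failure kills it
print("estimated unreliability:", system_fail.mean())
```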
Use of limited data to construct Bayesian networks for probabilistic risk assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groth, Katrina M.; Swiler, Laura Painton
2013-03-01
Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft-causal factors, such as digital instrumentation & control, passive components, aging, common cause failure, and human errors. Bayesian Networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early career LDRD project titled "Use of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment". The goal of the work was to establish the capability to develop Bayesian Networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian Network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.
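One common way to build network parameters from sparse data is to estimate conditional probability tables with smoothing; the sketch below shows the idea on invented variables and is not the report's HRA model.

```python
from collections import Counter

# Sketch of estimating a Bayesian-network conditional probability table from
# sparse data using Laplace (add-one) smoothing; variables are invented.
observations = [("high_stress", "error"), ("high_stress", "ok"),
                ("low_stress", "ok"), ("low_stress", "ok"),
                ("high_stress", "error")]
counts = Counter(observations)
for parent in ("high_stress", "low_stress"):
    total = sum(counts[(parent, c)] for c in ("error", "ok"))
    for child in ("error", "ok"):
        p = (counts[(parent, child)] + 1) / (total + 2)   # add-one smoothing
        print(f"P({child} | {parent}) = {p:.2f}")
```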
A systematic review of cognitive failures in daily life: Healthy populations.
Carrigan, Nicole; Barkus, Emma
2016-04-01
Cognitive failures are minor errors in thinking reported by clinical and non-clinical individuals during everyday life. It is not yet clear how subjectively-reported cognitive failures relate to objective neuropsychological ability. We aimed to consolidate the definition of cognitive failures, outline evidence for the relationship with objective cognition, and develop a unified model of factors that increase cognitive failures. We conducted a systematic review of cognitive failures, identifying 45 articles according to the PRISMA statement. Failures were defined as reflecting proneness to errors in 'real world' planned thought and action. Vulnerability to failures was not consistently associated with objective cognitive performance. A range of stable and variable factors were linked to increased risk of cognitive failures. We conclude that cognitive failures measure real world cognitive capacity rather than pure 'unchallenged' ability. Momentary state may interact with predisposing trait factors to increase the likelihood of failures occurring. Inclusion of self-reported cognitive failures in objective cognitive research will increase the translational relevance of ability into more ecologically valid aspects of real world functioning. Copyright © 2016 Elsevier Ltd. All rights reserved.
Cutti, Andrea Giovanni; Cappello, Angelo; Davalli, Angelo
2006-01-01
Soft tissue artefact is the dominant error source for upper extremity motion analyses that use skin-mounted markers, especially in humeral axial rotation. A new in vivo technique is presented that is based on the definition of a humerus bone-embedded frame that is almost "artefact free" but influenced by the elbow orientation in the measurement of the humeral axial rotation, and on an algorithm designed to solve this kinematic coupling. The technique was validated in vivo in a study of six healthy subjects who performed five arm-movement tasks. For each task the similarity between a gold standard pattern and the axial rotation pattern before and after the application of the compensation algorithm was evaluated in terms of explained variance, gain, phase and offset. In addition, the root mean square error between the patterns was used as a global similarity estimator. After the application, for four out of five tasks, patterns were highly correlated, in phase, with almost equal gain and limited offset; the root mean square error decreased from the original 9 degrees to 3 degrees. The proposed technique appears to help compensate for the soft tissue artefact affecting axial rotation. A further development is also proposed to make the technique effective for the pure prono-supination task as well.
Lago, Paola; Bizzarri, Giancarlo; Scalzotto, Francesca; Parpaiola, Antonella; Amigoni, Angela; Putoto, Giovanni; Perilongo, Giorgio
2012-01-01
Objective Administering medication to hospitalised infants and children is a complex process at high risk of error. Failure mode and effect analysis (FMEA) is a proactive tool used to analyse risks, identify failures before they happen and prioritise remedial measures. To examine the hazards associated with the process of drug delivery to children, we performed a proactive risk-assessment analysis. Design and setting Five multidisciplinary teams, representing different divisions of the paediatric department at Padua University Hospital, were trained to analyse the drug-delivery process, to identify possible causes of failures and their potential effects, to calculate a risk priority number (RPN) for each failure and to plan changes in practices. Primary outcome To identify higher-priority potential failure modes as defined by RPNs and to plan changes in clinical practice to reduce the risk of patient harm and improve safety in the process of medication use in children. Results In all, 37 higher-priority potential failure modes and 71 associated causes and effects were identified. The highest RPNs (>48) related mainly to errors in calculating drug doses and concentrations. Many of these failure modes were found in all five units, suggesting the presence of common targets for improvement, particularly in enhancing the safety of prescription and preparation of endovenous drugs. The introduction of new activities in the revised process of administering drugs reduced the high-risk failure modes by 60%. Conclusions FMEA is an effective proactive risk-assessment tool useful for aiding multidisciplinary groups in understanding a care process and identifying errors that may occur, prioritising remedial interventions and possibly enhancing the safety of drug delivery in children. PMID:23253870
In-flight calibration of the Hitomi Soft X-ray Spectrometer. (2) Point spread function
NASA Astrophysics Data System (ADS)
Maeda, Yoshitomo; Sato, Toshiki; Hayashi, Takayuki; Iizuka, Ryo; Angelini, Lorella; Asai, Ryota; Furuzawa, Akihiro; Kelley, Richard; Koyama, Shu; Kurashima, Sho; Ishida, Manabu; Mori, Hideyuki; Nakaniwa, Nozomi; Okajima, Takashi; Serlemitsos, Peter J.; Tsujimoto, Masahiro; Yaqoob, Tahir
2018-03-01
We present results of in-flight calibration of the point spread function of the Soft X-ray Telescope that focuses X-rays onto the pixel array of the Soft X-ray Spectrometer system. We make a full array image of a point-like source by extracting a pulsed component of the Crab nebula emission. Within the limited statistics afforded by an exposure time of only 6.9 ks and limited knowledge of the systematic uncertainties, we find that the ray-tracing model of 1.2 arcmin half-power diameter is consistent with an image of the observed event distributions across pixels. The ratio between the Crab pulsar image and the ray-tracing shows scatter from pixel to pixel that is 40% or less in all except one pixel. The pixel-to-pixel ratio has a spread of 20%, on average, for the 15 edge pixels, with an averaged statistical error of 17% (1σ).
Eisner, Brian H; Kambadakone, Avinash; Monga, Manoj; Anderson, James K; Thoreson, Andrew A; Lee, Hang; Dretler, Stephen P; Sahani, Dushyant V
2009-04-01
We determined the most accurate method of measuring urinary stones on computerized tomography. For the in vitro portion of the study 24 calculi, including 12 calcium oxalate monohydrate and 12 uric acid stones, that had been previously collected at our clinic were measured manually with hand calipers as the gold standard measurement. The calculi were then embedded into human kidney-sized potatoes and scanned using 64-slice multidetector computerized tomography. Computerized tomography measurements were performed at 4 window settings, including standard soft tissue windows (window width-320 and window length-50), standard bone windows (window width-1120 and window length-300), 5.13x magnified soft tissue windows and 5.13x magnified bone windows. Maximum stone dimensions were recorded. For the in vivo portion of the study 41 patients with distal ureteral stones who underwent noncontrast computerized tomography and subsequently spontaneously passed the stones were analyzed. All analyzed stones were 100% calcium oxalate monohydrate or mixed, calcium based stones. Stones were prospectively collected at the clinic and the largest diameter was measured with digital calipers as the gold standard. This was compared to computerized tomography measurements using 4.0x magnified soft tissue windows and 4.0x magnified bone windows. Statistical comparisons were performed using Pearson's correlation and paired t test. In the in vitro portion of the study the most accurate measurements were obtained using 5.13x magnified bone windows with a mean 0.13 mm difference from caliper measurement (p = 0.6). Measurements performed in the soft tissue window with and without magnification, and in the bone window without magnification were significantly different from hand caliper measurements (mean difference 1.2, 1.9 and 1.4 mm, p = 0.003, <0.001 and 0.0002, respectively). When comparing measurement errors between stones of different composition in vitro, the error for calcium oxalate calculi was significantly different from the gold standard for all methods except bone window settings with magnification. For uric acid calculi the measurement error was observed only in standard soft tissue window settings. In vivo 4.0x magnified bone windows was superior to 4.0x magnified soft tissue windows in measurement accuracy. Magnified bone window measurements were not statistically different from digital caliper measurements (mean underestimation vs digital caliper 0.3 mm, p = 0.4), while magnified soft tissue windows were statistically distinct (mean underestimation 1.4 mm, p = 0.001). In this study magnified bone windows were the most accurate method of stone measurements in vitro and in vivo. Therefore, we recommend the routine use of magnified bone windows for computerized tomography measurement of stones. In vitro the measurement error in calcium oxalate stones was greater than that in uric acid stones, suggesting that stone composition may be responsible for measurement inaccuracies.
Using failure mode and effects analysis to plan implementation of smart i.v. pump technology.
Wetterneck, Tosha B; Skibinski, Kathleen A; Roberts, Tanita L; Kleppin, Susan M; Schroeder, Mark E; Enloe, Myra; Rough, Steven S; Hundt, Ann Schoofs; Carayon, Pascale
2006-08-15
Failure mode and effects analysis (FMEA) was used to evaluate a smart i.v. pump as it was implemented into a redesigned medication-use process. A multidisciplinary team conducted a FMEA to guide the implementation of a smart i.v. pump that was designed to prevent pump programming errors. The smart i.v. pump was equipped with a dose-error reduction system that included a pre-defined drug library in which dosage limits were set for each medication. Monitoring for potential failures and errors occurred for three months postimplementation of FMEA. Specific measures were used to determine the success of the actions that were implemented as a result of the FMEA. The FMEA process at the hospital identified key failure modes in the medication process with the use of the old and new pumps, and actions were taken to avoid errors and adverse events. I.V. pump software and hardware design changes were also recommended. Thirteen of the 18 failure modes reported in practice after pump implementation had been identified by the team. A beneficial outcome of FMEA was the development of a multidisciplinary team that provided the infrastructure for safe technology implementation and effective event investigation after implementation. With the continual updating of i.v. pump software and hardware after implementation, FMEA can be an important starting place for safe technology choice and implementation and can produce site experts to follow technology and process changes over time. FMEA was useful in identifying potential problems in the medication-use process with the implementation of new smart i.v. pumps. Monitoring for system failures and errors after implementation remains necessary.
Impact of Extended-Duration Shifts on Medical Errors, Adverse Events, and Attentional Failures
Barger, Laura K; Ayas, Najib T; Cade, Brian E; Cronin, John W; Rosner, Bernard; Speizer, Frank E; Czeisler, Charles A
2006-01-01
Background A recent randomized controlled trial in critical-care units revealed that the elimination of extended-duration work shifts (≥24 h) reduces the rates of significant medical errors and polysomnographically recorded attentional failures. This raised the concern that the extended-duration shifts commonly worked by interns may contribute to the risk of medical errors being made, and perhaps to the risk of adverse events more generally. Our current study assessed whether extended-duration shifts worked by interns are associated with significant medical errors, adverse events, and attentional failures in a diverse population of interns across the United States. Methods and Findings We conducted a Web-based survey, across the United States, in which 2,737 residents in their first postgraduate year (interns) completed 17,003 monthly reports. The association between the number of extended-duration shifts worked in the month and the reporting of significant medical errors, preventable adverse events, and attentional failures was assessed using a case-crossover analysis in which each intern acted as his/her own control. Compared to months in which no extended-duration shifts were worked, during months in which between one and four extended-duration shifts and five or more extended-duration shifts were worked, the odds ratios of reporting at least one fatigue-related significant medical error were 3.5 (95% confidence interval [CI], 3.3–3.7) and 7.5 (95% CI, 7.2–7.8), respectively. The respective odds ratios for fatigue-related preventable adverse events, 8.7 (95% CI, 3.4–22) and 7.0 (95% CI, 4.3–11), were also increased. Interns working five or more extended-duration shifts per month reported more attentional failures during lectures, rounds, and clinical activities, including surgery and reported 300% more fatigue-related preventable adverse events resulting in a fatality. Conclusions In our survey, extended-duration work shifts were associated with an increased risk of significant medical errors, adverse events, and attentional failures in interns across the United States. These results have important public policy implications for postgraduate medical education. PMID:17194188
A detailed description of the sequential probability ratio test for 2-IMU FDI
NASA Technical Reports Server (NTRS)
Rich, T. M.
1976-01-01
The sequential probability ratio test (SPRT) for 2-IMU FDI (inertial measuring unit failure detection/isolation) is described. The SPRT is a statistical technique for detecting and isolating soft IMU failures originally developed for the strapdown inertial reference unit. The flowchart of a subroutine incorporating the 2-IMU SPRT is included.
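The SPRT accumulates a log-likelihood ratio sample by sample and declares a decision when Wald's thresholds are crossed; the sketch below applies it to a Gaussian mean shift (a stand-in for a soft IMU bias failure), with illustrative parameters rather than the 2-IMU FDI values.

```python
import math
import random

# Minimal SPRT sketch for detecting a shift in the mean of Gaussian
# measurements; thresholds follow the classic Wald bounds.
alpha, beta = 0.01, 0.01                   # false-alarm / missed-detection rates
A = math.log((1 - beta) / alpha)           # declare-failure threshold
B = math.log(beta / (1 - alpha))           # declare-healthy threshold
mu0, mu1, sigma = 0.0, 1.0, 1.0            # healthy vs failed mean, noise std

random.seed(3)
llr = 0.0
for k in range(1000):
    x = random.gauss(mu1, sigma)           # simulate a soft (small-bias) failure
    llr += (x * (mu1 - mu0) - 0.5 * (mu1**2 - mu0**2)) / sigma**2
    if llr >= A:
        print(f"failure declared after {k + 1} samples")
        break
    if llr <= B:
        print("healthy declared")
        break
```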
Implications of scaling on static RAM bit cell stability and reliability
NASA Astrophysics Data System (ADS)
Coones, Mary Ann; Herr, Norm; Bormann, Al; Erington, Kent; Soorholtz, Vince; Sweeney, John; Phillips, Michael
1993-01-01
In order to lower manufacturing costs and increase performance, static random access memory (SRAM) bit cells are scaled progressively toward submicron geometries. The reliability of an SRAM is highly dependent on the bit cell stability. Smaller memory cells with less capacitance and restoring current make the array more susceptible to failures from defectivity, alpha hits, and other instabilities and leakage mechanisms. Improving long-term reliability while migrating to higher density devices makes the task of building in and improving reliability increasingly difficult. Reliability requirements for high density SRAMs are very demanding, with failure rates of less than 100 failures per billion device hours (100 FITs) being a common criterion. Design techniques for increasing bit cell stability and manufacturability must be implemented in order to build in this level of reliability. Several types of analyses are performed to benchmark the performance of the SRAM device. Examples of these analysis techniques which are presented here include DC parametric measurements of test structures, functional bit mapping of the circuit used to characterize the entire distribution of bits, electrical microprobing of weak and/or failing bits, and system and accelerated soft error rate measurements. These tests allow process and design improvements to be evaluated prior to implementation on the final product. These results are used to provide comprehensive bit cell characterization which can then be compared to device models and adjusted accordingly to provide optimized cell stability versus cell size for a particular technology. The result is designed-in reliability which can be accomplished during the early stages of product development.
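Assuming a constant failure rate, the 100-FIT criterion quoted above converts to mean time to failure as follows (a worked unit conversion, not a figure from the paper):

```latex
\lambda = 100\ \text{FIT} = \frac{100\ \text{failures}}{10^{9}\ \text{device-hours}}
\quad\Rightarrow\quad
\text{MTTF} = \frac{1}{\lambda} = 10^{7}\ \text{device-hours} \approx 1{,}100\ \text{years per device}
```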
Co-operation of digital nonlinear equalizers and soft-decision LDPC FEC in nonlinear transmission.
Tanimura, Takahito; Oda, Shoichiro; Hoshida, Takeshi; Aoki, Yasuhiko; Tao, Zhenning; Rasmussen, Jens C
2013-12-30
We experimentally and numerically investigated the characteristics of 128 Gb/s dual polarization - quadrature phase shift keying signals received with two types of nonlinear equalizers (NLEs) followed by soft-decision (SD) low-density parity-check (LDPC) forward error correction (FEC). Successful co-operation between SD-FEC and NLEs over various nonlinear transmissions was demonstrated by optimizing the NLE parameters.
Bouxsein, Mary L; Szulc, Pawel; Munoz, Fracoise; Thrall, Erica; Sornay-Rendu, Elizabeth; Delmas, Pierre D
2007-06-01
We compared trochanteric soft tissue thickness, femoral aBMD, and the ratio of fall force to femoral strength (i.e., factor of risk) in 21 postmenopausal women with incident hip fracture and 42 age-matched controls. Reduced trochanteric soft tissue thickness, low femoral aBMD, and increased ratio of fall force to femoral strength (i.e., factor of risk) were associated with increased risk of hip fracture. The contribution of trochanteric soft tissue thickness to hip fracture risk is incompletely understood. A biomechanical approach to assessing hip fracture risk that compares forces applied to the hip during a sideways fall to femoral strength may be improved by incorporating the force-attenuating effects of trochanteric soft tissues. We determined the relationship between femoral areal BMD (aBMD) and femoral failure load in 49 human cadaveric specimens, 53-99 yr of age. We compared femoral aBMD, trochanteric soft tissue thickness, and the ratio of fall forces to bone strength (i.e., the factor of risk for hip fracture, phi), before and after accounting for the force-attenuating properties of trochanteric soft tissue in 21 postmenopausal women with incident hip fracture and 42 age-matched controls. Femoral aBMD correlated strongly with femoral failure load (r2 = 0.73-0.83). Age, height, and weight did not differ; however, women with hip fracture had lower total femur aBMD (OR = 2.06; 95% CI, 1.19-3.56) and trochanteric soft tissue thickness (OR = 1.82; 95% CI, 1.01, 3.31). Incorporation of trochanteric soft tissue thickness measurements reduced the estimates of fall forces by approximately 50%. After accounting for force-attenuating properties of trochanteric soft tissue, the ratio of fall forces to femoral strength was 50% higher in cases than controls (0.92 +/- 0.44 versus 0.65 +/- 0.50, respectively; p = 0.04). It is possible to compute a biomechanically based estimate of hip fracture risk by combining estimates of femoral strength based on an empirical relationship between femoral aBMD and bone strength in cadaveric femora, along with estimates of loads applied to the hip during a sideways fall that account for thickness of trochanteric soft tissues. Our findings suggest that trochanteric soft tissue thickness may influence hip fracture risk by attenuating forces applied to the femur during a sideways fall and provide rationale for developing improved measurements of trochanteric soft tissue and for studying a larger cohort to determine whether trochanteric soft tissue thickness contributes to hip fracture risk independently of aBMD.
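In symbols, the factor of risk and its soft-tissue-adjusted form are as follows, with a the fraction of fall force attenuated by trochanteric soft tissue (roughly 0.5 per the abstract's approximately 50% reduction):

```latex
\Phi = \frac{F_{\text{fall}}}{F_{\text{failure}}}, \qquad
\Phi_{\text{adj}} = \frac{(1-a)\,F_{\text{fall}}}{F_{\text{failure}}}
```

A larger Φ means the estimated fall force approaches or exceeds femoral strength, consistent with the higher ratios observed in the fracture cases (0.92 versus 0.65).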
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ogunmolu, O; Gans, N; Jiang, S
Purpose: We propose a surface-image-guided soft robotic patient positioning system for maskless head-and-neck radiotherapy. The ultimate goal of this project is to utilize a soft robot to realize non-rigid patient positioning and real-time motion compensation. In this proof-of-concept study, we design a position-based visual servoing control system for an air-bladder-based soft robot and investigate its performance in controlling the flexion/extension cranial motion on a mannequin head phantom. Methods: The current system consists of a Microsoft Kinect depth camera, an inflatable air bladder (IAB), a pressured air source, pneumatic valve actuators, custom-built current regulators, and a National Instruments myRIO microcontroller. The performance of the designed system was evaluated on a mannequin head, with a ball joint fixed below its neck to simulate torso-induced head motion along the flexion/extension direction. The IAB is placed beneath the mannequin head. The Kinect camera captures images of the mannequin head, extracts the face, and measures the position of the head relative to the camera. This distance is sent to the myRIO, which runs control algorithms and sends actuation commands to the valves, inflating and deflating the IAB to induce head motion. Results: For a step input, i.e. regulation of the head to a constant displacement, the maximum error was a 6% overshoot, which the system then reduced to 0% steady-state error. In this initial investigation, the settling time to reach the regulated position was approximately 8 seconds, with 2 seconds of delay between the command and the start of motion due to the capacitance of the pneumatics, for a total of 10 seconds to regulate the error. Conclusion: The surface-image-guided soft robotic patient positioning system can achieve accurate mannequin head flexion/extension motion. Given this promising initial result, the extension of the current one-dimensional soft robot control to multiple IABs for non-rigid positioning control will be pursued.
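The control loop described above is, at its core, an error-driven valve command. The toy simulation below captures that structure with a proportional gain, a saturated valve command, and a first-order bladder response; all gains and the plant model are invented stand-ins, not the authors' myRIO implementation.

def simulate_servo(target_mm: float = 20.0, kp: float = 0.15,
                   dt: float = 0.1, steps: int = 200):
    """Position-based visual servoing sketch: camera reads head displacement,
    controller commands the inflatable air bladder (IAB)."""
    position = 0.0                                 # head displacement (mm)
    for _ in range(steps):
        error = target_mm - position               # Kinect depth error would go here
        duty = max(-1.0, min(1.0, kp * error))     # saturated valve command
        position += 5.0 * duty * dt                # toy first-order IAB response
    return position, target_mm - position

pos, err = simulate_servo()
print(f"settled at {pos:.2f} mm, residual error {err:.3f} mm")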
Wang, Yue; Gregory, Cherry; Minor, Mark A
2018-06-01
Molded silicone rubbers are common in manufacturing of soft robotic parts, but they are often prone to tears, punctures, and tensile failures when strained. In this article, we present a fabric compositing method for improving the mechanical properties of soft robotic parts by creating a fabric/rubber composite that increases the strength and durability of the molded rubber. Comprehensive ASTM material tests evaluating the strength, tear resistance, and puncture resistance are conducted on multiple composites embedded with different fabrics, including polyester, nylon, silk, cotton, rayon, and several blended fabrics. Results show that strong fabrics increase the strength and durability of the composite, valuable in pneumatic soft robotic applications, while elastic fabrics maintain elasticity and enhance tear strength, suitable for robotic skins or soft strain sensors. Two case studies then validate the proposed benefits of the fabric compositing for soft robotic pressure vessel applications and soft strain sensor applications. Evaluations of the fabric/rubber composite samples and devices indicate that such methods are effective for improving mechanical properties of soft robotic parts, resulting in parts that can have customized stiffness, strength, and vastly improved durability.
Wiegmann, D A; Shappell, S A
1999-12-01
The present study examined the role of human error and crew-resource management (CRM) failures in U.S. Naval aviation mishaps. All tactical jet (TACAIR) and rotary wing Class A flight mishaps between fiscal years 1990-1996 were reviewed. Results indicated that over 75% of both TACAIR and rotary wing mishaps were attributable, at least in part, to some form of human error, of which 70% were associated with aircrew human factors. Of these aircrew-related mishaps, approximately 56% involved at least one CRM failure. These percentages are very similar to those observed prior to the implementation of aircrew coordination training (ACT) in the fleet, suggesting that the initial benefits of the program have not persisted and that CRM failures continue to plague Naval aviation. Closer examination of these CRM-related mishaps suggests that the type of flight operation (preflight, routine, emergency) plays a role in the etiology of CRM failures. A larger percentage of CRM failures occurred during non-routine or in extremis flight situations when TACAIR mishaps were considered. In contrast, a larger percentage of rotary wing CRM mishaps involved failures that occurred during routine flight operations. These findings illustrate the complex etiology of CRM failures within Naval aviation and support the need for ACT programs tailored to the unique problems faced by specific communities in the fleet.
Errors made by animals in memory paradigms are not always due to failure of memory.
Wilkie, D M; Willson, R J; Carr, J A
1999-01-01
It is commonly assumed that errors in animal memory paradigms such as delayed matching to sample, radial mazes, and food-cache recovery are due to failures in memory for information necessary to perform the task successfully. A body of research, reviewed here, suggests that this is not always the case: animals sometimes make errors despite apparently being able to remember the appropriate information. In this paper a case study of this phenomenon is described, along with a demonstration of a simple procedural modification that successfully reduced these non-memory errors, thereby producing a better measure of memory.
Outcomes of a Failure Mode and Effects Analysis for medication errors in pediatric anesthesia.
Martin, Lizabeth D; Grigg, Eliot B; Verma, Shilpa; Latham, Gregory J; Rampersad, Sally E; Martin, Lynn D
2017-06-01
The Institute of Medicine has called for development of strategies to prevent medication errors, which are one important cause of preventable harm. Although the field of anesthesiology is considered a leader in patient safety, recent data suggest high medication error rates in anesthesia practice. Unfortunately, few error prevention strategies for anesthesia providers have been implemented. Using Toyota Production System quality improvement methodology, a multidisciplinary team observed 133 h of medication practice in the operating room at a tertiary care freestanding children's hospital. A failure mode and effects analysis was conducted to systematically deconstruct and evaluate each medication handling process step and score possible failure modes to quantify areas of risk. A bundle of five targeted countermeasures was identified and implemented over 12 months. Improvements in syringe labeling (73 to 96%), standardization of medication organization in the anesthesia workspace (0 to 100%), and two-provider infusion checks (23 to 59%) were observed. Medication error reporting improved during the project and was subsequently maintained. After intervention, the median medication error rate decreased from 1.56 to 0.95 per 1000 anesthetics. The frequency of medication error harm events reaching the patient also decreased. Systematic evaluation and standardization of medication handling processes by anesthesia providers in the operating room can decrease medication errors and improve patient safety. © 2017 John Wiley & Sons Ltd.
Dynamic soft tissue deformation estimation based on energy analysis
NASA Astrophysics Data System (ADS)
Gao, Dedong; Lei, Yong; Yao, Bin
2016-10-01
The needle placement accuracy of millimeters is required in many needle-based surgeries. Tissue deformation, especially that occurring on the surface of organ tissue, affects the needle-targeting accuracy of both manual and robotic needle insertions. It is therefore necessary to understand the mechanism of tissue deformation during needle insertion into soft tissue. In this paper, soft tissue surface deformation is investigated on the basis of continuum mechanics, and a geometry model is presented to quantitatively approximate the volume of tissue deformation. An energy-based method is applied to the dynamic process of needle insertion into soft tissue, and the volume of a cone is used to quantitatively approximate the deformation on the surface of soft tissue. The external work is converted into potential, kinetic, dissipated, and strain energies during the dynamic rigid needle-tissue interactive process. A needle insertion experimental setup, consisting of a linear actuator, force sensor, needle, tissue container, and a light, is constructed, and an image-based method for measuring the depth and radius of the soft tissue surface deformations is introduced to obtain the experimental data. The relationship between the changed volume of tissue deformation and the insertion parameters is established based on the law of conservation of energy, with the volume of tissue deformation obtained using image-based measurements. The experiments are performed on phantom specimens, and an energy-based analytical fitted model is presented to estimate the volume of tissue deformation. The experimental results show that the energy-based analytical fitted model can predict the volume of soft tissue deformation, with root mean square errors between the fitted model and experimental data of 0.61 and 0.25 at velocities of 2.50 mm/s and 5.00 mm/s, respectively. The estimated parameters of the soft tissue surface deformations are shown to be useful for compensating the needle-targeting error in the rigid needle insertion procedure, especially for percutaneous needle insertion into organs.
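A minimal version of the bookkeeping above compares the external work done over an insertion against the cone-volume approximation of the surface dent. The force, speed, duration, and dent geometry below are made-up numbers, not the phantom measurements.

import math

def cone_volume_mm3(radius_mm: float, depth_mm: float) -> float:
    """Volume of the cone used to approximate the surface deformation."""
    return math.pi * radius_mm ** 2 * depth_mm / 3.0

force_n, speed_mm_s, duration_s = 1.2, 2.50, 4.0
external_work_mj = force_n * speed_mm_s * duration_s      # N*mm = mJ
print(f"external work ~ {external_work_mj:.1f} mJ")
print(f"dent volume  ~ {cone_volume_mm3(6.0, 2.2):.1f} mm^3")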
Microcircuit radiation effects databank
NASA Technical Reports Server (NTRS)
1983-01-01
Radiation test data submitted by many testers is collated to serve as a reference for engineers who are concerned with and have some knowledge of the effects of the natural radiation environment on microcircuits. Total dose damage information and single event upset cross sections, i.e., the probability of a soft error (bit flip) or of a hard error (latchup) are presented.
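For readers using such a databank, the basic reduction behind a reported upset cross section is simply upsets per unit particle fluence; a hedged sketch with invented test numbers:

def seu_cross_section_cm2(upsets: int, fluence_per_cm2: float) -> float:
    """Single-event-upset cross section: observed upsets per unit particle fluence."""
    return upsets / fluence_per_cm2

sigma = seu_cross_section_cm2(upsets=120, fluence_per_cm2=1e7)   # made-up test data
print(f"device cross section {sigma:.2e} cm^2; per bit (4 kb device) {sigma / 4096:.2e} cm^2")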
The coolest DA white dwarfs detected at soft X-ray wavelengths
NASA Technical Reports Server (NTRS)
Kidder, K. M.; Holberg, J. B.; Barstow, M. A.; Tweedy, R. W.; Wesemael, F.
1992-01-01
New soft X-ray/EUV photometric observations of the DA white dwarfs KPD 0631 + 1043 = WD 0631 + 107 and PG 1113 + 413 = WD 1113 + 413 are analyzed. Previously reported soft X-ray detections of three other DAs and the failure to detect a fourth DA in deep Exosat observations are investigated. New ground-based spectra are presented for all of the objects, with IUE Ly-alpha spectra for some. These data are used to constrain the effective temperatures and surface gravities. The improved estimates of these parameters are employed to infer a photospheric He abundance for the hotter objects and to elucidate an effective observational low-temperature threshold for the detection of pure hydrogen DA white dwarfs at soft X-ray wavelengths.
Soft x-ray spectroscopy of high pressure liquid.
Qiao, Ruimin; Xia, Yujian; Feng, Xuefei; Macdougall, James; Pepper, John; Armitage, Kevin; Borsos, Jason; Knauss, Kevin G; Lee, Namhey; Allézy, Arnaud; Gilbert, Benjamin; MacDowell, Alastair A; Liu, Yi-Sheng; Glans, Per-Anders; Sun, Xuhui; Chao, Weilun; Guo, Jinghua
2018-01-01
We describe a new experimental technique that allows for soft x-ray spectroscopy studies (∼100-1000 eV) of high pressure liquid (∼100 bars). We achieve this through a liquid cell with a 100 nm-thick Si3N4 membrane window, which is sandwiched by two identical O-rings for vacuum sealing. The thin Si3N4 membrane allows soft x-rays to penetrate, while separating the high-pressure liquid under investigation from the vacuum required for soft x-ray transmission and detection. The burst pressure of the Si3N4 membrane increases with decreasing size and more specifically is inversely proportional to the side length of the square window. It also increases proportionally with the membrane thickness. Pressures > 60 bars could be achieved for 100 nm-thick square Si3N4 windows that are smaller than 65 μm. However, above a certain pressure, the failure of the Si wafer becomes the limiting factor. The failure pressure of the Si wafer is sensitive to the wafer thickness. Moreover, the deformation of the Si3N4 membrane is quantified using vertical scanning interferometry. As an example of the performance of the high-pressure liquid cell optimized for total-fluorescence detected soft x-ray absorption spectroscopy (sXAS), the sXAS spectra at the Ca L edge (∼350 eV) of a CaCl2 aqueous solution are collected under different pressures up to 41 bars.
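The two scalings reported for the membrane can be folded into one hedged formula, anchored (as an assumption, since the abstract only quotes >60 bars at that geometry) to 60 bars for a 100 nm-thick, 65 μm window:

def burst_pressure_bar(thickness_nm: float, side_um: float) -> float:
    """Burst pressure proportional to membrane thickness and inversely
    proportional to window side length; the prefactor is an assumed anchor."""
    return 60.0 * (thickness_nm / 100.0) * (65.0 / side_um)

print(burst_pressure_bar(100.0, 65.0))   # 60.0 bar at the reference geometry
print(burst_pressure_bar(100.0, 30.0))   # smaller window -> higher burst pressure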
NASA Astrophysics Data System (ADS)
Nooruddin, Hasan A.; Anifowose, Fatai; Abdulraheem, Abdulazeez
2014-03-01
Soft computing techniques are becoming very popular in the oil industry. A number of computational intelligence-based predictive methods have been widely applied in the industry with high prediction capabilities. Some of the popular methods include feed-forward neural networks, radial basis function networks, generalized regression neural networks, functional networks, support vector regression, and adaptive network fuzzy inference systems. A comparative study among the most popular soft computing techniques is presented using a large dataset published in the literature describing multimodal pore systems in the Arab D formation. The inputs to the models are air porosity, grain density, and Thomeer parameters obtained from mercury injection capillary pressure profiles. Corrected air permeability is the target variable. Applying the developed permeability models in a recent reservoir characterization workflow ensures consistency between micro- and macro-scale information, represented mainly by the Thomeer parameters and absolute permeability. The dataset was divided into two parts, with 80% of the data used for training and 20% for testing. The target permeability variable was transformed to the logarithmic scale as a pre-processing step and to show better correlations with the input variables. Statistical and graphical analyses of the results, including permeability cross-plots and detailed error measures, were produced. In general, the comparative study showed very close results among the developed models. The feed-forward neural network permeability model showed the lowest average relative error, average absolute relative error, standard deviation of error, and root mean square error, making it the best model for such problems. The adaptive network fuzzy inference system also showed very good results.
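A compact sketch of the workflow described (log-transform the target, 80/20 split, fit a feed-forward network, score the error); the synthetic data and model settings are illustrative, not the Arab D dataset or the study's architectures.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.random((500, 3))        # stand-ins for porosity, grain density, a Thomeer parameter
log_k = 2.0 * X[:, 0] - X[:, 2] + 0.1 * rng.standard_normal(500)  # synthetic log10(permeability)

X_tr, X_te, y_tr, y_te = train_test_split(X, log_k, test_size=0.2, random_state=0)  # 80/20
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
print(f"test RMSE in log10(k): {rmse:.3f}")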
Evidence for explosive chromospheric evaporation in a solar flare observed with SMM
NASA Technical Reports Server (NTRS)
Zarro, D. M.; Saba, J. L. R.; Strong, K. T.; Canfield, R. C.; Metcalf, T.
1986-01-01
SMM soft X-ray data and Sacramento Peak Observatory H-alpha observations are combined in a study of the impulsive phase of a solar flare. A blue asymmetry, indicative of upflow motions, was observed in the coronal Ca XIX line during the soft X-ray rise phase. H-alpha redshifts, indicative of downward motions, were observed simultaneously in bright flare kernels during the period of hard X-ray emission. It is shown that, to within observational errors, the impulsive phase momentum transported by the upflowing soft X-ray plasma is equivalent to that of the downward moving chromospheric material.
Detecting Silent Data Corruption for Extreme-Scale Applications through Data Mining
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bautista-Gomez, Leonardo; Cappello, Franck
Supercomputers allow scientists to study natural phenomena by means of computer simulations. Next-generation machines are expected to have more components and, at the same time, consume several times less energy per operation. These trends are pushing supercomputer construction to the limits of miniaturization and energy-saving strategies. Consequently, the number of soft errors is expected to increase dramatically in the coming years. While mechanisms are in place to correct or at least detect some soft errors, a significant percentage of those errors pass unnoticed by the hardware. Such silent errors are extremely damaging because they can make applications silently produce wrong results. In this work we propose a technique that leverages certain properties of high-performance computing applications in order to detect silent errors at the application level. Our technique detects corruption solely based on the behavior of the application datasets and is completely application-agnostic. We propose multiple corruption detectors, and we couple them to work together in a fashion transparent to the user. We demonstrate that this strategy can detect the majority of the corruptions, while incurring negligible overhead. We show that with the help of these detectors, applications can have up to 80% of coverage against data corruption.
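The core idea, detecting corruption purely from the behavior of the dataset, can be illustrated with a short history-based detector: predict each value from its recent neighborhood and flag large deviations. The window size and threshold below are arbitrary choices, not the detectors of the paper.

import numpy as np

def detect_sdc(series: np.ndarray, window: int = 8, nsigma: float = 5.0) -> list:
    """Flag indices whose value deviates from the local history by > nsigma spreads."""
    flags = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        pred, spread = hist.mean(), hist.std() + 1e-12
        if abs(series[i] - pred) > nsigma * spread:
            flags.append(i)
    return flags

t = np.linspace(0.0, 4.0 * np.pi, 400)
data = np.sin(t)                      # smooth 'application dataset'
data[250] += 0.8                      # inject a bit-flip-like silent corruption
print("flagged indices:", detect_sdc(data))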
NASA Technical Reports Server (NTRS)
Hruby, R. J.; Bjorkman, W. S.; Schmidt, S. F.; Carestia, R. A.
1979-01-01
Algorithms were developed that attempt to identify which sensor in a tetrad configuration has experienced a step failure. An algorithm is also described that provides a measure of the confidence with which the correct identification was made. Experimental results are presented from real-time tests conducted on a three-axis motion facility utilizing an ortho-skew tetrad strapdown inertial sensor package. The effects of prediction errors and of quantization on correct failure identification are discussed as well as an algorithm for detecting second failures through prediction.
On 'large-scale' stable fiber displacement during interfacial failure in metal matrix composites
NASA Technical Reports Server (NTRS)
Petrich, R. R.; Koss, D. A.; Hellmann, J. R.; Kallas, M. N.
1993-01-01
Experimental results are presented to show that interfacial failure in sapphire-reinforced niobium is characterized by 'large-scale' (5-15 microns) plasticity-controlled fiber displacements occurring under increasing loads. The results are based on the responses during thin-slice fiber pushout tests wherein the fiber is supported over a hole twice the fiber diameter. The results describe an interfacial failure process that should also occur near fiber ends during pullout when a fiber is well-bonded to a soft, ductile matrix, such that eventual failure occurs by shear within the matrix near the interface.
The effect of sterilization on mechanical properties of soft tissue allografts.
Conrad, Bryan P; Rappé, Matthew; Horodyski, MaryBeth; Farmer, Kevin W; Indelicato, Peter A
2013-09-01
One major concern regarding soft tissue allograft use in surgical procedures is the risk of disease transmission. Current techniques of tissue sterilization, such as irradiation, have been shown to adversely affect the mechanical properties of soft tissues. The hypothesis was that grafts processed using Biocleanse (a proprietary technique developed by Regeneration Technologies to sterilize human tissues) would have better biomechanical characteristics than tissues that have been irradiated. Fifteen pairs of cadaveric Achilles tendon allografts were obtained and separated into three groups of 10 each. The three treatment groups were: Biocleanse, Irradiated, and Control (untreated). Each specimen was tested to determine the biomechanical properties of the tissue. Specimens were cyclically preloaded and then loaded to failure in tension. During testing, load, displacement, and optical strain data were captured. Following testing, the cross-sectional area of the tendons was determined. Tendons in the control group were found to have a higher extrinsic stiffness (slope of the load-deformation curve, p = .005), a higher ultimate stress (force/cross-sectional area, p = .006), and a higher ultimate failure load (p = .003) than irradiated grafts. Biocleanse grafts were also found to be stiffer than irradiated grafts (p = .014), yet were not statistically different from either irradiated or non-irradiated grafts in terms of load to failure. Biocleanse processing appears to be a viable alternative to irradiation for Achilles tendon allograft sterilization in terms of biomechanical properties.
Murmur intensity in small-breed dogs with myxomatous mitral valve disease reflects disease severity.
Ljungvall, I; Rishniw, M; Porciello, F; Ferasin, L; Ohad, D G
2014-11-01
To determine whether murmur intensity in small-breed dogs with myxomatous mitral valve disease reflects clinical and echocardiographic disease severity. Retrospective multi-investigator study. Records of adult dogs ≤20 kg with myxomatous mitral valve disease were examined. Murmur intensity and location were recorded and compared with echocardiographic variables and functional disease status. Murmur intensities in consecutive categories were compared for prevalences of congestive heart failure, pulmonary hypertension and cardiac remodelling. 578 dogs [107 with "soft" (30 Grade I/VI and 77 II/VI), 161 with "moderate" (Grade III/VI), 160 with "loud" (Grade IV/VI) and 150 with "thrilling" (Grade V/VI or VI/VI) murmurs] were studied. No dogs with soft murmurs had congestive heart failure, and 90% had no remodelling. However, 56% of dogs with "moderate", 29% of dogs with "loud" and 8% of dogs with "thrilling" murmurs and subclinical myxomatous mitral valve disease also had no remodelling. Probability of a dog having congestive heart failure or pulmonary hypertension increased with increasing murmur intensity. A 4-level murmur grading scheme separated clinically meaningful outcomes in small-breed dogs with myxomatous mitral valve disease. Soft murmurs in small-breed dogs are strongly indicative of subclinical heart disease. Thrilling murmurs are associated with more severe disease. Other murmurs are less informative on an individual basis. © 2014 British Small Animal Veterinary Association.
Multiscale modeling of ductile failure in metallic alloys
NASA Astrophysics Data System (ADS)
Pardoen, Thomas; Scheyvaerts, Florence; Simar, Aude; Tekoğlu, Cihan; Onck, Patrick R.
2010-04-01
Micromechanical models for ductile failure were developed in the 1970s and 1980s essentially to address cracking in structural applications and to complement the fracture mechanics approach. Later, this approach became attractive to physical metallurgists interested in the prediction of failure during forming operations and as a guide for the design of more ductile and/or high-toughness microstructures. Nowadays, a realistic treatment of damage evolution in complex metallic microstructures is becoming feasible when sufficiently sophisticated constitutive laws are used within the context of a multilevel modelling strategy. The current understanding and the state-of-the-art models for the nucleation, growth and coalescence of voids are reviewed with a focus on the underlying physics. Considerations are made about the introduction of the different length scales associated with the microstructure and the damage process. Two applications of the methodology are then described to illustrate the potential of the current models. The first application concerns the competition between intergranular and transgranular ductile fracture in aluminum alloys involving soft precipitate-free zones along the grain boundaries. The second application concerns the modeling of ductile failure in friction stir welded joints, a problem which also involves soft and hard zones, albeit at a larger scale.
Asymmetric soft-error resistant memory
NASA Technical Reports Server (NTRS)
Buehler, Martin G. (Inventor); Perlman, Marvin (Inventor)
1991-01-01
A memory system is provided, of the type that includes an error-correcting circuit that detects and corrects errors, that more efficiently utilizes the capacity of a memory formed of groups of binary cells whose states can be inadvertently switched by ionizing radiation. Each memory cell has an asymmetric geometry, so that ionizing radiation causes a significantly greater probability of errors in one state than in the opposite state (e.g., an erroneous switch from '1' to '0' is far more likely than a switch from '0' to '1'). An asymmetric error-correcting coding circuit can be used with the asymmetric memory cells, which requires fewer bits than an efficient symmetric error-correcting code.
Error Control Coding Techniques for Space and Satellite Communications
NASA Technical Reports Server (NTRS)
Lin, Shu
2000-01-01
This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit error and frame error performance. The outer code decoder helps the inner turbo code decoder to terminate its decoding iterations, while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between the outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.
Management Aspects of Software Maintenance.
1984-09-01
educated in the complex nature of software maintenance to be able to properly evaluate and manage the software maintenance effort. In this ... maintenance and improvement may be called "software evolution". The software manager must be educated in the complex nature of software maintenance to be ... complaint of error or request for modification is also studied in order to determine what action needs to be taken. 2. Define Objective and Approach:
Stephan, Carl N; Simpson, Ellie K
2008-11-01
With the ever increasing production of average soft tissue depth studies, data are becoming increasingly complex, less standardized, and more unwieldy. So far, no overarching review has been attempted to determine: the validity of continued data collection; the usefulness of the existing data subcategorizations; or if a synthesis is possible to produce a manageable soft tissue depth library. While a principal components analysis would provide the best foundation for such an assessment, this type of investigation is not currently possible because of a lack of easily accessible raw data (first, many studies are narrow; second, raw data are infrequently published and/or stored and are not always shared by some authors). This paper provides an alternate means of investigation using an hierarchical approach to review and compare the effects of single variables on published mean values for adults whilst acknowledging measurement errors and within-group variation. The results revealed: (i) no clear secular trends at frequently investigated landmarks; (ii) wide variation in soft tissue depth measures between different measurement techniques irrespective of whether living persons or cadavers were considered; (iii) no clear clustering of non-Caucasoid data far from the Caucasoid means; and (iv) minor differences between males and females. Consequently, the data were pooled across studies using weighted means and standard deviations to cancel out random and opposing study-specific errors, and to produce a single soft tissue depth table with increased sample sizes (e.g., 6786 individuals at pogonion).
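Pooling across studies with weighted means and standard deviations, as described, amounts to the standard grouped-data formulas; a sketch with invented study summaries rather than the paper's data:

def pooled_mean_sd(ns, means, sds):
    """Sample-size-weighted grand mean and pooled SD across studies
    (within-study plus between-study variance components)."""
    n_tot = sum(ns)
    grand = sum(n * m for n, m in zip(ns, means)) / n_tot
    var = sum(n * (s ** 2 + (m - grand) ** 2)
              for n, m, s in zip(ns, means, sds)) / n_tot
    return grand, var ** 0.5

# Hypothetical pogonion depths (mm) from three studies:
print(pooled_mean_sd(ns=[120, 300, 80], means=[11.2, 10.6, 12.1], sds=[2.0, 2.4, 1.8]))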
Proximal antecedents and correlates of adopted error approach: a self-regulatory perspective.
Van Dyck, Cathy; Van Hooft, Edwin; De Gilder, Dick; Liesveld, Lillian
2010-01-01
The current study aims to further investigate earlier established advantages of an error mastery approach over an error aversion approach. The two main purposes of the study relate to (1) self-regulatory traits (i.e., goal orientation and action-state orientation) that may predict which error approach (mastery or aversion) is adopted, and (2) proximal, psychological processes (i.e., self-focused attention and failure attribution) that relate to adopted error approach. In the current study participants' goal orientation and action-state orientation were assessed, after which they worked on an error-prone task. Results show that learning goal orientation related to error mastery, while state orientation related to error aversion. Under a mastery approach, error occurrence did not result in cognitive resources "wasted" on self-consciousness. Rather, attention went to internal-unstable, thus controllable, improvement oriented causes of error. Participants that had adopted an aversion approach, in contrast, experienced heightened self-consciousness and attributed failure to internal-stable or external causes. These results imply that when working on an error-prone task, people should be stimulated to take on a mastery rather than an aversion approach towards errors.
Wu, Jian; Murphy, Martin J
2010-06-01
To assess the precision and robustness of patient setup corrections computed from 3D/3D rigid registration methods using image intensity, when no ground truth validation is possible. Fifteen pairs of male pelvic CTs were rigidly registered using four different in-house registration methods. Registration results were compared for different resolutions and image content by varying the image down-sampling ratio and by thresholding out soft tissue to isolate bony landmarks. Intrinsic registration precision was investigated by comparing the different methods and by reversing the source and target roles of the two images being registered. The translational reversibility errors for successful registrations ranged from 0.0 to 1.69 mm. Rotations were less than 1 degree. Mutual information failed in most registrations that used only bony landmarks. The magnitude of the reversibility error was strongly correlated with the success or failure of each algorithm in finding the global minimum. Rigid image registrations have an intrinsic uncertainty and robustness that depend on the imaging modality, the registration algorithm, the image resolution, and the image content. In the absence of an absolute ground truth, the variation in the shifts calculated by several different methods provides a useful estimate of that uncertainty. The difference observed by reversing the source and target images can be used as an indication of robust convergence.
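The reversibility check used above composes the forward (A to B) and reverse (B to A) rigid transforms and measures how far the product is from identity; a sketch with 4x4 homogeneous matrices, with the registrations themselves stubbed by an exact inverse:

import numpy as np

def reversibility_error(T_ab: np.ndarray, T_ba: np.ndarray):
    """Translation (mm) and rotation (deg) by which T_ba @ T_ab misses identity."""
    C = T_ba @ T_ab
    trans = float(np.linalg.norm(C[:3, 3]))
    cos_theta = np.clip((np.trace(C[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    return trans, float(np.degrees(np.arccos(cos_theta)))

T = np.eye(4)
T[:3, 3] = [1.0, -0.5, 0.2]                      # a pure translation for the demo
print(reversibility_error(T, np.linalg.inv(T)))  # perfectly inverse pair -> ~(0, 0)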
Markvicka, Eric J; Bartlett, Michael D; Huang, Xiaonan; Majidi, Carmel
2018-07-01
Large-area stretchable electronics are critical for progress in wearable computing, soft robotics and inflatable structures. Recent efforts have focused on engineering electronics from soft materials-elastomers, polyelectrolyte gels and liquid metal. While these materials enable elastic compliance and deformability, they are vulnerable to tearing, puncture and other mechanical damage modes that cause electrical failure. Here, we introduce a material architecture for soft and highly deformable circuit interconnects that are electromechanically stable under typical loading conditions, while exhibiting uncompromising resilience to mechanical damage. The material is composed of liquid metal droplets suspended in a soft elastomer; when damaged, the droplets rupture to form new connections with neighbours and re-route electrical signals without interruption. Since self-healing occurs spontaneously, these materials do not require manual repair or external heat. We demonstrate this unprecedented electronic robustness in a self-repairing digital counter and self-healing soft robotic quadruped that continue to function after significant damage.
Chang, Shih-Tsun; Liu, Yen-Hsiu; Lee, Jiahn-Shing; See, Lai-Chu
2015-09-01
The effect of correcting static vision on sports vision is still not clear. To examine whether sports vision measures (depth perception [DP], dynamic visual acuity [DVA], eye movement [EM], peripheral vision [PV], and momentary vision [MV]) differed among soft tennis adolescent athletes with normal vision (Group A), with refractive error corrected with eyeglasses (Group B), and with uncorrected refractive error (Group C). A cross-sectional study was conducted. Soft tennis athletes aged 10-13 who had played soft tennis for 2-5 years, and who were without any ocular diseases and without visual training for the past 3 months, were recruited. DP was measured as the absolute deviation (mm) between a moving rod and a fixed rod (approaching at 25 mm/s, receding at 25 mm/s, approaching at 50 mm/s, receding at 50 mm/s) using an electric DP tester. A smaller deviation represented better DP. DVA, EM, PV, and MV were measured on a scale from 1 (worst) to 10 (best) using ATHLEVISION software. The chi-square test and Kruskal-Wallis test were used to compare the data among the three study groups. A total of 73 athletes (37 in Group A, 8 in Group B, 28 in Group C) were enrolled in this study. All four items of DP showed significant differences among the three study groups (P = 0.0051, 0.0004, 0.0095, 0.0021). PV also displayed a significant difference among the three study groups (P = 0.0044). There was no significant difference in DVA, EM, or MV among the three study groups. Significantly better DP and PV were seen among soft tennis adolescent athletes with normal vision than among those with refractive error, regardless of whether the error was corrected with eyeglasses. On the other hand, DVA, EM, and MV were similar among the three study groups.
Intravenous Chemotherapy Compounding Errors in a Follow-Up Pan-Canadian Observational Study.
Gilbert, Rachel E; Kozak, Melissa C; Dobish, Roxanne B; Bourrier, Venetia C; Koke, Paul M; Kukreti, Vishal; Logan, Heather A; Easty, Anthony C; Trbovich, Patricia L
2018-05-01
Intravenous (IV) compounding safety has garnered recent attention as a result of high-profile incidents, awareness efforts from the safety community, and increasingly stringent practice standards. New research with more-sensitive error detection techniques continues to reinforce that error rates with manual IV compounding are unacceptably high. In 2014, our team published an observational study that described three types of previously unrecognized and potentially catastrophic latent chemotherapy preparation errors in Canadian oncology pharmacies that would otherwise be undetectable. We expand on this research and explore whether additional potential human failures are yet to be addressed by practice standards. Field observations were conducted in four cancer center pharmacies in four Canadian provinces from January 2013 to February 2015. Human factors specialists observed and interviewed pharmacy managers, oncology pharmacists, pharmacy technicians, and pharmacy assistants as they carried out their work. Emphasis was on latent errors (potential human failures) that could lead to outcomes such as wrong drug, dose, or diluent. Given the relatively short observational period, no active failures or actual errors were observed. However, 11 latent errors in chemotherapy compounding were identified. In terms of severity, all 11 errors create the potential for a patient to receive the wrong drug or dose, which in the context of cancer care, could lead to death or permanent loss of function. Three of the 11 practices were observed in our previous study, but eight were new. Applicable Canadian and international standards and guidelines do not explicitly address many of the potentially error-prone practices observed. We observed a significant degree of risk for error in manual mixing practice. These latent errors may exist in other regions where manual compounding of IV chemotherapy takes place. Continued efforts to advance standards, guidelines, technological innovation, and chemical quality testing are needed.
The failures of root canal preparation with hand ProTaper
Bătăiosu, Marilena; Diaconu, Oana; Moraru, Iren; Dăguci, C.; Ţuculină, Mihaela; Dăguci, Luminiţa; Gheorghiţă, Lelia
2012-01-01
The failures of root canal preparation are due to anatomical deviations (canals in "C" or "S" shapes) and to technique errors. Technique errors usually arise during the root canal cleansing and shaping stage and result from deviation from the objectives of endodontic treatment. Objectives: Our study examined technique errors made while preparing root canals with hand ProTaper. Methodology: Our study was made "in vitro" on 84 extracted teeth (molars, premolars, incisors and canines). The root canals of these teeth were cleansed and shaped with hand ProTaper using the crown-down technique and canal irrigation with NaOCl (2.5%). The dental preparation was controlled by X-ray. Results: During root canal preparation several failures were observed, such as canal overinstrumentation, zipping and stripping phenomena, and discarded and/or fractured instruments. Conclusions: Hand ProTaper represents revolutionary progress in endodontic treatment, but deviation from the accepted rules of root canal instrumentation can lead to failures of endodontic treatment. PMID:24778848
NASA Technical Reports Server (NTRS)
Banks, Bruce A.; deGroh, Kim K.; Stueber, Thomas J.; Sechkar, Edward A.; Hall, Rachelle L.
1998-01-01
Metallized Teflon fluorinated ethylene propylene (FEP) thermal control insulation is mechanically degraded if exposed to a sufficient fluence of soft x-ray radiation. Soft x-ray photons (4-8 A in wavelength or 1.55 - 3.2 keV) emitted during solar flares have been proposed as a cause of mechanical properties degradation of aluminized Teflon FEP thermal control insulation on the Hubble Space Telescope (HST). Such degradation can be characterized by a reduction in elongation-to-failure of the Teflon FEP. Ground laboratory soft x-ray exposure tests of aluminized Teflon FEP were conducted to assess the degree of elongation degradation which would occur as a result of exposure to soft x-rays in the range of 3-10 keV. Test results indicate that soft x-ray exposure in the 3-10 keV range, at mission fluence levels, does not alone cause the observed reduction in elongation of flight-retrieved samples. The soft x-ray exposure facility design, mechanical properties degradation results and implications will be presented.
Stetson, Peter D.; McKnight, Lawrence K.; Bakken, Suzanne; Curran, Christine; Kubose, Tate T.; Cimino, James J.
2002-01-01
Medical errors are common, costly and often preventable. Work in understanding the proximal causes of medical errors demonstrates that systems failures predispose to adverse clinical events. Most of these systems failures are due to lack of appropriate information at the appropriate time during the course of clinical care. Problems with clinical communication are common proximal causes of medical errors. We have begun a project designed to measure the impact of wireless computing on medical errors. We report here on our efforts to develop an ontology representing the intersection of medical errors, information needs and the communication space. We will use this ontology to support the collection, storage and interpretation of project data. The ontology’s formal representation of the concepts in this novel domain will help guide the rational deployment of our informatics interventions. A real-life scenario is evaluated using the ontology in order to demonstrate its utility.
Balasuriya, Lilanthi; Vyles, David; Bakerman, Paul; Holton, Vanessa; Vaidya, Vinay; Garcia-Filion, Pamela; Westdorp, Joan; Sanchez, Christine; Kurz, Rhonda
2017-09-01
An enhanced dose range checking (DRC) system was developed to evaluate prescription error rates in the pediatric intensive care unit and the pediatric cardiovascular intensive care unit. An enhanced DRC system incorporating "soft" and "hard" alerts was designed and implemented. Practitioner responses to alerts for patients admitted to the pediatric intensive care unit and the pediatric cardiovascular intensive care unit were retrospectively reviewed. Alert rates increased from 0.3% to 3.4% after "go-live" (P < 0.001). Before go-live, all alerts were soft alerts. In the period after go-live, 68% of alerts were soft alerts and 32% were hard alerts. Before go-live, providers reduced doses only 1 time for every 10 dose alerts. After implementation of the enhanced computerized physician order entry system, the practitioners responded to soft alerts by reducing doses to more appropriate levels in 24.7% of orders (70/283), compared with 10% (3/30) before go-live (P = 0.0701). The practitioners deleted orders in 9.5% of cases (27/283) after implementation of the enhanced DRC system, as compared with no cancelled orders before go-live (P = 0.0774). Medication orders that triggered a soft alert were submitted unmodified in 65.7% (186/283) as compared with 90% (27/30) of orders before go-live (P = 0.0067). After go-live, 28.7% of hard alerts resulted in a reduced dose, 64% resulted in a cancelled order, and 7.4% were submitted as written. Before go-live, alerts were often clinically irrelevant. After go-live, there was a statistically significant decrease in orders that were submitted unmodified and an increase in the number of orders that were reduced or cancelled.
Applying lessons learned to enhance human performance and reduce human error for ISS operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, W.R.
1999-01-01
A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) is developing a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper will describe previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS. © 1999 American Institute of Physics.
Oliven, A; Zalman, D; Shilankov, Y; Yeshurun, D; Odeh, M
2002-01-01
Computerized prescription of drugs is expected to reduce the number of preventable drug ordering errors. In the present study we evaluated the usefulness of a computerized drug order entry (CDOE) system in reducing prescription errors. A department of internal medicine using a comprehensive CDOE system, which also included patient-related drug-laboratory, drug-disease and drug-allergy on-line surveillance, was compared to a similar department in which drug orders were handwritten. CDOE reduced prescription errors to 25-35%. The causes of errors remained similar, and most errors, in both departments, were associated with abnormal renal function and electrolyte balance. Residual errors remaining in the CDOE-using department were due to handwriting on the typed order, failure to enter patients' diseases into the system, and system failures. The use of CDOE was associated with a significant reduction in mean hospital stay and in the number of changes made to prescriptions. The findings of this study both quantify the impact of comprehensive CDOE on prescription errors and delineate the causes of the remaining errors.
Use of failure mode effect analysis (FMEA) to improve medication management process.
Jain, Khushboo
2017-03-13
Purpose Medication management is a complex process, at high risk of error, with life-threatening consequences. The focus should be on devising strategies to avoid errors and make the process self-reliable by ensuring prevention of errors and/or error detection at subsequent stages. The purpose of this paper is to use failure mode effect analysis (FMEA), a systematic proactive tool, to identify the likelihood and the causes of the process failing at various steps, and to prioritise them to devise risk-reduction strategies to improve patient safety. Design/methodology/approach The study was designed as an observational analytical study of the medication management process in the inpatient area of a multi-speciality hospital in Gurgaon, Haryana, India. A team was formed to study the complex process of medication management in the hospital. The FMEA tool was used. Corrective actions were developed based on the prioritised failure modes, then implemented and monitored. Findings The percentage distribution of medication errors observed by the team showed a maximum of transcription errors (37 per cent) followed by administration errors (29 per cent), indicating the need to identify the causes and effects of their occurrence. In all, 11 failure modes were identified, of which the five most significant were prioritised based on the risk priority number (RPN). The process was repeated after corrective actions were taken, which resulted in about 40 per cent (average) and around 60 per cent reduction in the RPN of the prioritised failure modes. Research limitations/implications FMEA is a time-consuming process and requires a multidisciplinary team with a good understanding of the process being analysed. FMEA only helps in identifying the possibilities of a process failing; it does not eliminate them, and additional efforts are required to develop action plans and implement them. Frank discussion and agreement among the team members are required not only for successfully conducting FMEA but also for implementing the corrective actions. Practical implications FMEA is an effective proactive risk-assessment tool and a continuous process which can be conducted in phases. The corrective actions taken resulted in a reduction in RPN, subject to further evaluation and use by others depending on the facility type. Originality/value The application of the tool helped the hospital identify failures in the medication management process, thereby prioritising and correcting them, leading to improvement.
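The RPN arithmetic at the heart of an FMEA is small enough to show directly: severity x occurrence x detectability, sorted to prioritise. The failure modes and 1-10 scores below are invented placeholders, not the study's eleven modes.

failure_modes = [
    # (description, severity, occurrence, detectability), each scored 1-10
    ("transcription error on drug chart", 9, 6, 5),
    ("wrong dose administered",          10, 4, 6),
    ("look-alike vial selected",          8, 3, 4),
]

for desc, s, o, d in sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True):
    print(f"RPN = {s * o * d:3d}  {desc}")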
Online Soft Sensor of Humidity in PEM Fuel Cell Based on Dynamic Partial Least Squares
Long, Rong; Chen, Qihong; Zhang, Liyan; Ma, Longhua; Quan, Shuhai
2013-01-01
Online monitoring of humidity in a proton exchange membrane (PEM) fuel cell is an important issue in maintaining proper membrane humidity. The cost and size of existing sensors for monitoring humidity are prohibitive for online measurements. Online prediction of humidity using readily available measured data would be beneficial to water management. In this paper, a novel soft sensor method based on dynamic partial least squares (DPLS) regression is proposed and applied to humidity prediction in a PEM fuel cell. In order to obtain humidity data and test the feasibility of the proposed DPLS-based soft sensor, a hardware-in-the-loop (HIL) test system is constructed. The time lag of the DPLS-based soft sensor is selected as 30 by comparing the root-mean-square error at different time lags. The performance of the proposed DPLS-based soft sensor is demonstrated by experimental results. PMID:24453923
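One way to realize a DPLS soft sensor of this kind is to augment the regressor matrix with lagged copies of the inputs (a lag of 30 samples, matching the choice above) and fit an ordinary PLS regression; everything below, including the synthetic humidity proxy, is illustrative rather than the paper's HIL data.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

def lagged_matrix(X: np.ndarray, lag: int) -> np.ndarray:
    """Stack X[t-lag..t] into one wide row per time step (the 'dynamic' part of DPLS)."""
    return np.asarray([X[t - lag:t + 1].ravel() for t in range(lag, len(X))])

rng = np.random.default_rng(1)
X = rng.standard_normal((600, 4))                          # readily available measurements
y = np.convolve(X[:, 0], np.ones(10) / 10.0, mode="same")  # slow, synthetic humidity proxy

lag = 30
pls = PLSRegression(n_components=5).fit(lagged_matrix(X, lag), y[lag:])
pred = pls.predict(lagged_matrix(X, lag)).ravel()
print(f"training RMSE: {np.sqrt(np.mean((pred - y[lag:]) ** 2)):.3f}")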
Workflow interruptions, cognitive failure and near-accidents in health care.
Elfering, Achim; Grebner, Simone; Ebener, Corinne
2015-01-01
Errors are frequent in health care. A specific model was tested that affirms failure in cognitive action regulation to mediate the influence of nurses' workflow interruptions and safety conscientiousness on near-accidents in health care. One hundred and sixty-five nurses from seven Swiss hospitals participated in a questionnaire survey. Structural equation modelling confirmed the hypothesised mediation model. Cognitive failure in action regulation significantly mediated the influence of workflow interruptions on near-accidents (p < .05). An indirect path from conscientiousness to near-accidents via cognitive failure in action regulation was also significant (p < .05). Compliance with safety regulations was significantly related to cognitive failure and near-accidents; moreover, cognitive failure mediated the association between compliance and near-accidents (p < .05). Contrary to expectations, compliance with safety regulations was not related to workflow interruptions. Workflow interruptions caused by colleagues, patients and organisational constraints are likely to trigger errors in nursing. Work redesign is recommended to reduce cognitive failure and improve safety of nurses and patients.
The effect of aspect ratio on adhesion and stiffness for soft elastic fibres
Aksak, Burak; Hui, Chung-Yuen; Sitti, Metin
2011-01-01
The effect of aspect ratio on the pull-off stress and stiffness of soft elastic fibres is studied using elasticity and numerical analysis. The adhesive interface between a soft fibre and a smooth rigid surface is modelled using the Dugdale–Barenblatt model. Numerical simulations show that, while pull-off stress increases with decreasing aspect ratio, fibres get stiffer. Also, for sufficiently low aspect ratio fibres, failure occurs via the growth of internal cracks and pull-off stress approaches the intrinsic adhesive strength. Experiments carried out with various aspect ratio polyurethane elastomer fibres are consistent with the numerical simulations. PMID:21227962
Strand Plasticity Governs Fatigue in Colloidal Gels
NASA Astrophysics Data System (ADS)
van Doorn, Jan Maarten; Verweij, Joanne E.; Sprakel, Joris; van der Gucht, Jasper
2018-05-01
The repeated loading of a solid leads to microstructural damage that ultimately results in catastrophic material failure. While posing a major threat to the stability of virtually all materials, the microscopic origins of fatigue, especially for soft solids, remain elusive. Here we explore fatigue in colloidal gels as prototypical inhomogeneous soft solids by combining experiments and computer simulations. Our results reveal how mechanical loading leads to irreversible strand stretching, which builds slack into the network that softens the solid at small strains and causes strain hardening at larger deformations. We thus find that microscopic plasticity governs fatigue at much larger scales. This gives rise to a new picture of fatigue in soft thermal solids and calls for new theoretical descriptions of soft gel mechanics in which local plasticity is taken into account.
Auxiliary variables for numerically solving nonlinear equations with softly broken symmetries.
Olum, Ken D; Masoumi, Ali
2017-06-01
General methods for solving simultaneous nonlinear equations work by generating a sequence of approximate solutions that successively improve a measure of the total error. However, if the total error function has a narrow curved valley, the available techniques tend to find the solution after a very large number of steps, if ever. The solver first converges rapidly to the valley, but once there it converges extremely slowly to the solution. In this paper we show that in the specific physically important case where these valleys are the result of a softly broken symmetry, the solution can often be found much more quickly by adding the generators of the softly broken symmetry as auxiliary variables. This makes the number of variables exceed the number of equations, so there is a family of solutions, any one of which is acceptable. We present a procedure for finding solutions in this case and apply it to several simple examples and an important problem in the physics of false vacuum decay. We also provide a Mathematica package that implements Powell's hybrid method with the generalization to allow more variables than equations.
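As a toy illustration of the idea (not the paper's Mathematica implementation), consider two equations in (x, y) whose error surface has a narrow valley along the unit circle because a rotational symmetry is broken only at order EPS. Adding the rotation angle phi as an auxiliary variable gives three unknowns for two equations; SciPy's trust-region-reflective least-squares solver accepts such underdetermined systems. The specific system and all names here are illustrative assumptions.

```python
# Softly broken rotational symmetry: add the generator parameter phi as an
# auxiliary variable so the solver can slide along the valley in one step.
import numpy as np
from scipy.optimize import least_squares

EPS = 1e-4  # strength of the symmetry breaking

def residuals_plain(v):
    x, y = v
    return [x**2 + y**2 - 1.0,   # exactly rotation-symmetric equation
            EPS * (y - 0.3)]     # softly breaks the symmetry -> narrow curved valley

def residuals_aux(v):
    x, y, phi = v                # phi: auxiliary variable along the symmetry orbit
    xr = np.cos(phi) * x - np.sin(phi) * y
    yr = np.sin(phi) * x + np.cos(phi) * y
    return residuals_plain([xr, yr])

plain = least_squares(residuals_plain, x0=[2.0, -2.0], method="trf")
aux = least_squares(residuals_aux, x0=[2.0, -2.0, 0.0], method="trf")
print("plain: cost", plain.cost, "evals", plain.nfev)
print("aux:   cost", aux.cost, "evals", aux.nfev)  # any member of the family is fine
```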
In-flight performance of pulse-processing system of the ASTRO-H/Hitomi soft x-ray spectrometer
NASA Astrophysics Data System (ADS)
Ishisaki, Yoshitaka; Yamada, Shinya; Seta, Hiromi; Tashiro, Makoto S.; Takeda, Sawako; Terada, Yukikatsu; Kato, Yuka; Tsujimoto, Masahiro; Koyama, Shu; Mitsuda, Kazuhisa; Sawada, Makoto; Boyce, Kevin R.; Chiao, Meng P.; Watanabe, Tomomi; Leutenegger, Maurice A.; Eckart, Megan E.; Porter, Frederick Scott; Kilbourne, Caroline Anne
2018-01-01
We summarize results of the initial in-orbit performance of the pulse shape processor (PSP) of the soft x-ray spectrometer instrument onboard ASTRO-H (Hitomi). Event formats, kinds of telemetry, and the pulse-processing parameters are described, and the parameter settings in orbit are listed. The PSP was powered on 2 days after launch, and the event threshold was lowered in orbit. The PSP operated normally in orbit, with neither memory errors nor SpaceWire communication errors until the break-up of the spacecraft. Time assignment, electrical crosstalk, and the event screening criteria are studied. It is confirmed that the event processing rate at 100% central processing unit load is ~200 c/s/array, compliant with the requirement on the PSP.
Reynolds, James D.
2007-01-01
Purpose A review of retinopathy of prematurity (ROP) malpractice cases will identify specific, repetitive problems in the provision of care and the reasons underlying these problems. Opportunities to improve the quality of care provided to premature infants with ROP will result. Methods A retrospective review of a series of 13 ROP malpractice cases in which the author served as a paid consultant, as well as a review of the literature for additional cases, was conducted. The series of 13 involved a review of the entire medical record as well as testimony and depositions. The characteristics of each case are tabulated, including state, date, allegations, defendants, disposition, award, the medical facts and care issues involved, and the judgment of medical error. In addition, a merit review was performed on the care in each case, and an error assessment was performed. Results The quality of care issues included neonatology failure to refer or follow up in 8 of 13, failure to adequately supervise resident care in 2 of 13, ophthalmologic failure to follow up in 6 of 13, and failure to properly diagnose and manage in 9 of 13. The latter included 4 of 13 that hinged on zone III issues and the presence or absence of full nasal vascularization with or without previous zone II disease. Merit review found negligent error by at least one party in 12 of 13. Ophthalmology error was found in 6 of 13. Malpractice, ie, negligent error causing negligent harm, was judged to be present in 9 of 13. Conclusions Negligent errors are common in malpractice cases that proceed to disposition. There are a limited number of repetitive errors that produce malpractice. An explanation of how these errors occur, coupled with the pertinent pathophysiology, affords an excellent opportunity to improve patient care. PMID:18427626
Bronson, N R
1984-05-01
A new A-mode biometry system for determining axial length measurements of the eye has been developed that incorporates a soft-membrane transducer. The soft transducer decreases the risk of indenting the cornea with the probe, which would result in inaccurate measurements. A microprocessor evaluates echo patterns and determines whether or not axial alignment has been obtained, eliminating possible user error. The new A-scan requires minimal user skill and can be used successfully by both physicians and technicians.
Flight test results of the strapdown ring laser gyro tetrad inertial navigation system
NASA Technical Reports Server (NTRS)
Carestia, R. A.; Hruby, R. J.; Bjorkman, W. S.
1983-01-01
A helicopter flight test program undertaken to evaluate the performance of Tetrad (a strapdown, ring laser gyro inertial navigation system) is described. The results of 34 flights show a mean final navigational velocity error of 5.06 knots, with a standard deviation of 3.84 knots; a corresponding mean final position error of 2.66 n. mi., with a standard deviation of 1.48 n. mi.; and a modeled mean position error growth rate for the 34 tests of 1.96 knots, with a standard deviation of 1.09 knots. No laser gyro or accelerometer failures were detected during the flight tests. Off-line parity residual studies used simulated failures with the prerecorded flight test and laboratory test data. The airborne Tetrad system's failure-detection logic, exercised during the tests, successfully demonstrated the detection of simulated "hard" failures and the system's ability to continue navigating by removing the simulated faulty sensor from the computations. Tetrad's four ring laser gyros provided reliable and accurate angular rate sensing during the 4 years of the test program, and no sensor failures were detected during the evaluation of free inertial navigation performance.
Closed-Loop Analysis of Soft Decisions for Serial Links
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Steele, Glen F.; Zucha, Joan P.; Schlensinger, Adam M.
2012-01-01
Modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more overhead through noisier channels, and software-defined radios use error-correction techniques that approach Shannon's theoretical limit of performance. The authors describe the benefit of closed-loop measurements for a receiver when paired with a counterpart transmitter and representative channel conditions. We also describe a real-time Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in real time during the development of software-defined radios.
De Rosario, Helios; Page, Álvaro; Besa, Antonio
2017-09-06
The accurate location of the main axes of rotation (AoR) is a crucial step in many applications of human movement analysis. There are different formal methods to determine the direction and position of the AoR, whose performance varies across studies, depending on the pose and the source of errors. Most methods are based on minimizing squared differences between observed and modelled marker positions or rigid motion parameters, implicitly assuming independent and uncorrelated errors, but the largest error usually results from soft tissue artefacts (STA), which do not have such statistical properties and are not effectively cancelled out by such methods. However, with adequate methods it is possible to assume that STA only account for a small fraction of the observed motion and to obtain explicit formulas through differential analysis that relate STA components to the resulting errors in AoR parameters. In this paper such formulas are derived for three different functional calibration techniques (Geometric Fitting, mean Finite Helical Axis, and SARA), to explain why each technique behaves differently from the others, and to propose strategies to compensate for those errors. These techniques were tested with published data from a sit-to-stand activity, where the true axis was defined using bi-planar fluoroscopy. All the methods were able to estimate the direction of the AoR with an error of less than 5°, whereas there were errors in the location of the axis of 30-40 mm. Such location errors could be reduced to less than 17 mm by the methods based on equations that use rigid motion parameters (mean Finite Helical Axis, SARA) when the translation component was calculated using the three markers nearest to the axis. Copyright © 2017 Elsevier Ltd. All rights reserved.
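Of the three techniques, SARA reduces to a plain linear least-squares problem, which makes it easy to sketch. The rough numpy illustration below is an assumption-level reconstruction of the published SARA formulation, not code from this paper: points c1, c2 fixed on the common axis in the two segments' local frames satisfy R1_i c1 + t1_i = R2_i c2 + t2_i in every frame i, and the axis direction appears as the near-null singular direction of the stacked system.

```python
# SARA-style axis estimation from per-frame rigid motions of two segments.
import numpy as np

def sara_axis(R1, t1, R2, t2):
    """R1, R2: (n,3,3) rotations; t1, t2: (n,3) translations.
    Returns local axis points c1, c2 (minimum-norm) and the axis direction
    expressed in segment 1's frame."""
    n = R1.shape[0]
    A = np.zeros((3 * n, 6))
    b = np.zeros(3 * n)
    for i in range(n):
        A[3*i:3*i+3, :3] = R1[i]
        A[3*i:3*i+3, 3:] = -R2[i]
        b[3*i:3*i+3] = t2[i] - t1[i]
    c, *_ = np.linalg.lstsq(A, b, rcond=None)   # min-norm picks one axis point pair
    _, _, Vt = np.linalg.svd(A)
    u = Vt[-1][:3]                              # near-null direction: sliding along axis
    return c[:3], c[3:], u / np.linalg.norm(u)
```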
NASA Technical Reports Server (NTRS)
Ni, Jianjun David
2011-01-01
This presentation briefly discusses a research effort on mitigation techniques for pulsed radio frequency interference (RFI) on a Low-Density Parity-Check (LDPC) code. This problem is of considerable interest in the context of providing reliable communications to a space vehicle that might suffer severe degradation due to pulsed RFI sources such as large radars. The LDPC code is one of the modern forward-error-correction (FEC) codes whose decoding performance approaches the Shannon limit. The LDPC code studied here is the AR4JA (2048, 1024) code recommended by the Consultative Committee for Space Data Systems (CCSDS), and it has been chosen for some spacecraft designs. Even though this code is designed as a powerful FEC code in the additive white Gaussian noise channel, simulation data and test results show that the performance of this LDPC decoder is severely degraded when exposed to the pulsed RFI specified in the spacecraft's transponder specifications. An analysis (through modeling and simulation) has been conducted to evaluate the impact of the pulsed RFI, and a few implementation techniques have been investigated to mitigate the pulsed RFI impact by reshuffling the soft-decision data available at the input of the LDPC decoder. The simulation results show that the LDPC decoding performance in codeword error rate (CWER) under pulsed RFI can be improved by up to four orders of magnitude through a simple soft-decision data reshuffle scheme. This study reveals that an error floor in LDPC decoding performance appears around CWER=1E-4 when the proposed technique is applied to mitigate the pulsed RFI impact. The mechanism causing this error floor remains unknown; further investigation is necessary.
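The abstract does not spell out the reshuffle scheme itself, so the sketch below shows only the generic pre-decoder step such techniques rely on: flag symbols received during a detected radar pulse and neutralize their log-likelihood ratios (LLRs) so the LDPC decoder treats them as erasures rather than as confidently wrong soft decisions. All names and values are illustrative assumptions.

```python
# Neutralize pulse-affected soft decisions before LDPC decoding.
import numpy as np

def mask_pulsed_rfi(llrs, pulse_mask, attenuation=0.0):
    """llrs: soft decisions for one codeword; pulse_mask: True where a radar
    pulse was detected (e.g. by a power detector). Scaling affected LLRs toward
    zero tells the decoder 'no information' for those symbols."""
    out = llrs.copy()
    out[pulse_mask] *= attenuation
    return out

rng = np.random.default_rng(1)
llrs = rng.normal(loc=4.0, scale=1.0, size=2048)   # mostly-confident LLRs
pulse = np.zeros(2048, dtype=bool)
pulse[500:600] = True                               # 100-symbol radar pulse
llrs[pulse] = -8.0                                  # pulse saturates/flips soft data
cleaned = mask_pulsed_rfi(llrs, pulse)              # hand `cleaned` to the decoder
```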
Reliability issues in active control of large flexible space structures
NASA Technical Reports Server (NTRS)
Vandervelde, W. E.
1986-01-01
Efforts in this reporting period were centered on four research tasks: design of failure detection filters for robust performance in the presence of modeling errors; design of generalized parity relations for robust performance in the presence of modeling errors; design of failure-sensitive observers using the geometric system theory of Wonham; and computational techniques for evaluation of the performance of control systems with fault tolerance and redundancy management.
Hard decoding algorithm for optimizing thresholds under general Markovian noise
NASA Astrophysics Data System (ADS)
Chamberland, Christopher; Wallman, Joel; Beale, Stefanie; Laflamme, Raymond
2017-04-01
Quantum error correction is instrumental in protecting quantum systems from noise in quantum computing and communication settings. Pauli channels can be efficiently simulated and threshold values for Pauli error rates under a variety of error-correcting codes have been obtained. However, realistic quantum systems can undergo noise processes that differ significantly from Pauli noise. In this paper, we present an efficient hard decoding algorithm for optimizing thresholds and lowering failure rates of an error-correcting code under general completely positive and trace-preserving (i.e., Markovian) noise. We use our hard decoding algorithm to study the performance of several error-correcting codes under various non-Pauli noise models by computing threshold values and failure rates for these codes. We compare the performance of our hard decoding algorithm to decoders optimized for depolarizing noise and show improvements in thresholds and reductions in failure rates by several orders of magnitude. Our hard decoding algorithm can also be adapted to take advantage of a code's non-Pauli transversal gates to further suppress noise. For example, we show that using the transversal gates of the 5-qubit code allows arbitrary rotations around certain axes to be perfectly corrected. Furthermore, we show that Pauli twirling can increase or decrease the threshold depending upon the code properties. Lastly, we show that even if the physical noise model differs slightly from the hypothesized noise model used to determine an optimized decoder, failure rates can still be reduced by applying our hard decoding algorithm.
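To make the twirling remark concrete, here is a small, hedged numpy illustration of the standard single-qubit Pauli twirl (textbook background, not this paper's decoding algorithm): a channel with Kraus operators {K_j} twirls to the Pauli channel with probabilities p_i = sum_j |Tr(P_i K_j)|^2 / d^2, shown here for amplitude damping, a non-Pauli noise model of the kind studied above.

```python
# Pauli twirl of a single-qubit channel given its Kraus operators.
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def pauli_twirl_probs(kraus_ops):
    """Probabilities (p_I, p_X, p_Y, p_Z) of the twirled Pauli channel."""
    return [sum(abs(np.trace(P.conj().T @ K))**2 for K in kraus_ops) / 4.0
            for P in (I, X, Y, Z)]

gamma = 0.1  # amplitude-damping strength
K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)

pI, pX, pY, pZ = pauli_twirl_probs([K0, K1])
print(f"p_I={pI:.4f} p_X={pX:.4f} p_Y={pY:.4f} p_Z={pZ:.4f}")  # sums to 1
```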
Design of analytical failure detection using secondary observers
NASA Technical Reports Server (NTRS)
Sisar, M.
1982-01-01
The problem of designing analytical failure-detection systems (FDS) for sensors and actuators, using observers, is addressed. The use of observers in FDS is related to the examination of the n-dimensional observer error vector, which carries the necessary information on possible failures. The problem is that in practical systems, in which only some of the components of the state vector are measured, one has access only to the m-dimensional observer-output error vector, with m ≤ n. In order to cope with these cases, a secondary observer is synthesized to reconstruct the entire observer-error vector from the observer-output error vector. This approach leads toward the design of highly sensitive and reliable FDS, with the possibility of obtaining a unique fingerprint for every possible failure. In order to keep the observer's (or Kalman filter's) false-alarm rate under a certain specified value, it is necessary to have acceptable matching between the observer (or Kalman filter) models and the system parameters. A previously developed adaptive observer algorithm is used to maintain the desired system-observer model matching, despite initial mismatching or system parameter variations. Conditions for convergence of the adaptive process are obtained, leading to a simple adaptive law (algorithm) with the possibility of an a priori choice of fixed adaptive gains. Simulation results show good tracking performance with small observer output errors, while accurate and fast parameter identification, in both deterministic and stochastic cases, is obtained.
Justification of Estimates for Fiscal Year 1983 Submitted to Congress.
1982-02-01
hierarchies to aid software production; completion of the components of an adaptive suspension vehicle including a storage energy unit, hydraulics, laser...and corrosion (long storage times), and radiation-induced breakdown. Solid-lubricated main engine bearings for cruise missile engines would offer...environments will cause "soft errors" (computational and memory storage errors) in advanced microelectronic circuits. Research on high-speed, low-power
Deep Learning MR Imaging-based Attenuation Correction for PET/MR Imaging.
Liu, Fang; Jang, Hyungseok; Kijowski, Richard; Bradshaw, Tyler; McMillan, Alan B
2018-02-01
Purpose To develop and evaluate the feasibility of deep learning approaches for magnetic resonance (MR) imaging-based attenuation correction (AC) (termed deep MRAC) in brain positron emission tomography (PET)/MR imaging. Materials and Methods A PET/MR imaging AC pipeline was built by using a deep learning approach to generate pseudo computed tomographic (CT) scans from MR images. A deep convolutional auto-encoder network was trained to identify air, bone, and soft tissue in volumetric head MR images coregistered to CT data for training. A set of 30 retrospective three-dimensional T1-weighted head images was used to train the model, which was then evaluated in 10 patients by comparing the generated pseudo CT scan to an acquired CT scan. A prospective study was carried out utilizing simultaneous PET/MR imaging in five subjects with the proposed approach. Analysis of covariance and paired-sample t tests were used for statistical analysis to compare PET reconstruction error with deep MRAC and two existing MR imaging-based AC approaches with CT-based AC. Results Deep MRAC provides an accurate pseudo CT scan with a mean Dice coefficient of 0.971 ± 0.005 for air, 0.936 ± 0.011 for soft tissue, and 0.803 ± 0.021 for bone. Furthermore, deep MRAC provides good PET results, with average errors of less than 1% in most brain regions. Significantly lower PET reconstruction errors were realized with deep MRAC (-0.7% ± 1.1) compared with Dixon-based soft-tissue and air segmentation (-5.8% ± 3.1) and anatomic CT-based template registration (-4.8% ± 2.2). Conclusion The authors developed an automated approach that allows generation of discrete-valued pseudo CT scans (soft tissue, bone, and air) from a single high-spatial-resolution diagnostic-quality three-dimensional MR image and evaluated it in brain PET/MR imaging. This deep learning approach for MR imaging-based AC provided reduced PET reconstruction error relative to a CT-based standard within the brain compared with current MR imaging-based AC approaches. © RSNA, 2017. Online supplemental material is available for this article.
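Since the evaluation above leans on the per-class Dice coefficient, a minimal illustration of that metric may help; the label volumes below are synthetic stand-ins, not the study's data.

```python
# Dice coefficient per tissue class between pseudo-CT and acquired-CT labels.
import numpy as np

def dice(pred_labels, true_labels, cls):
    """Dice = 2|A intersect B| / (|A| + |B|) for voxels assigned to class cls."""
    a = pred_labels == cls
    b = true_labels == cls
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

rng = np.random.default_rng(2)
truth = rng.integers(0, 3, size=(32, 32, 32))     # classes: 0 air, 1 soft, 2 bone
pred = truth.copy()
flip = rng.random(truth.shape) < 0.05             # corrupt 5% of voxels
pred[flip] = rng.integers(0, 3, size=flip.sum())
for cls, name in enumerate(("air", "soft tissue", "bone")):
    print(name, round(dice(pred, truth, cls), 3))
```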
Joint Source-Channel Coding by Means of an Oversampled Filter Bank Code
NASA Astrophysics Data System (ADS)
Marinkovic, Slavica; Guillemot, Christine
2006-12-01
Quantized frame expansions based on block transforms and oversampled filter banks (OFBs) have been considered recently as joint source-channel codes (JSCCs) for erasure and error-resilient signal transmission over noisy channels. In this paper, we consider a coding chain involving an OFB-based signal decomposition followed by scalar quantization and a variable-length code (VLC) or a fixed-length code (FLC). This paper first examines the problem of channel error localization and correction in quantized OFB signal expansions. The error localization problem is treated as an M-ary hypothesis testing problem. The likelihood values are derived from the joint pdf of the syndrome vectors under various hypotheses of impulse noise positions, and in a number of consecutive windows of the received samples. The error amplitudes are then estimated by solving the syndrome equations in the least-square sense. The message signal is reconstructed from the corrected received signal by a pseudoinverse receiver. We then improve the error localization procedure by introducing per-symbol reliability information in the hypothesis testing procedure of the OFB syndrome decoder. The per-symbol reliability information is produced by the soft-input soft-output (SISO) VLC/FLC decoders. This leads to the design of an iterative algorithm for joint decoding of an FLC and an OFB code. The performance of the algorithms developed is evaluated in a wavelet-based image coding system.
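A toy numpy rendering of the syndrome mechanics described above may clarify the pipeline. Everything here is an illustrative assumption, not the paper's OFB construction: a random tall matrix stands in for the oversampled expansion, a single impulse error is localized by testing positions, and its amplitude is recovered by solving the syndrome equations in the least-squares sense.

```python
# Syndrome-based localization and amplitude estimation of one impulse error.
import numpy as np

rng = np.random.default_rng(3)
n, k = 12, 8                           # oversampled: n coefficients for k samples
F = rng.normal(size=(n, k))            # frame synthesis operator (stand-in for OFB)
_, _, Vt = np.linalg.svd(F.T)
H = Vt[k:]                             # parity check: rows span left null space, H @ F == 0

x = rng.normal(size=k)
e = np.zeros(n); e[5] = 2.5            # one impulse error at position 5
y = F @ x + e
s = H @ y                              # syndrome depends only on the error

# hypothesis test over single-error positions: pick the column of H that best
# explains the syndrome, estimating the amplitude by least squares
scores = []
for p in range(n):
    amp, res, *_ = np.linalg.lstsq(H[:, [p]], s, rcond=None)
    scores.append((res[0], p, amp[0]))
residual, pos, amp = min(scores)
print(f"estimated position {pos}, amplitude {amp:.2f}")   # -> 5, 2.50
```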
Parvin, Darius E; McDougle, Samuel D; Taylor, Jordan A; Ivry, Richard B
2018-05-09
Failures to obtain reward can occur from errors in action selection or action execution. Recently, we observed marked differences in choice behavior when the failure to obtain a reward was attributed to errors in action execution compared with errors in action selection (McDougle et al., 2016). Specifically, participants appeared to solve this credit assignment problem by discounting outcomes in which the absence of reward was attributed to errors in action execution. Building on recent evidence indicating relatively direct communication between the cerebellum and basal ganglia, we hypothesized that cerebellar-dependent sensory prediction errors (SPEs), a signal indicating execution failure, could attenuate value updating within a basal ganglia-dependent reinforcement learning system. Here we compared the SPE hypothesis to an alternative, "top-down" hypothesis in which changes in choice behavior reflect participants' sense of agency. In two experiments with male and female human participants, we manipulated the strength of SPEs, along with the participants' sense of agency in the second experiment. The results showed that, whereas the strength of SPE had no effect on choice behavior, participants were much more likely to discount the absence of rewards under conditions in which they believed the reward outcome depended on their ability to produce accurate movements. These results provide strong evidence that SPEs do not directly influence reinforcement learning. Instead, a participant's sense of agency appears to play a significant role in modulating choice behavior when unexpected outcomes can arise from errors in action execution. SIGNIFICANCE STATEMENT When learning from the outcome of actions, the brain faces a credit assignment problem: Failures of reward can be attributed to poor choice selection or poor action execution. Here, we test a specific hypothesis that execution errors are implicitly signaled by cerebellar-based sensory prediction errors. We evaluate this hypothesis and compare it with a more "top-down" hypothesis in which the modulation of choice behavior from execution errors reflects participants' sense of agency. We find that sensory prediction errors have no significant effect on reinforcement learning. Instead, instructions influencing participants' belief of causal outcomes appear to be the main factor influencing their choice behavior. Copyright © 2018 the authors.
Lenderink, Albert W.; Widdershoven, Jos W. M. G.; van den Bemt, Patricia M. L. A.
2010-01-01
Objective Heart failure patients are regularly admitted to hospital and frequently use multiple medication. Besides intentional changes in pharmacotherapy, unintentional changes may occur during hospitalisation. The aim of this study was to investigate the effect of a clinical pharmacist discharge service on medication discrepancies and prescription errors in patients with heart failure. Setting A general teaching hospital in Tilburg, the Netherlands. Method An open randomized intervention study was performed comparing an intervention group, with a control group receiving regular care by doctors and nurses. The clinical pharmacist discharge service consisted of review of discharge medication, communicating prescribing errors with the cardiologist, giving patients information, preparation of a written overview of the discharge medication and communication to both the community pharmacist and the general practitioner about this medication. Within 6 weeks after discharge all patients were routinely scheduled to visit the outpatient clinic and medication discrepancies were measured. Main outcome measure The primary endpoint was the frequency of prescription errors in the discharge medication and medication discrepancies after discharge combined. Results Forty-four patients were included in the control group and 41 in the intervention group. Sixty-eight percent of patients in the control group had at least one discrepancy or prescription error against 39% in the intervention group (RR 0.57 (95% CI 0.37–0.88)). The percentage of medications with a discrepancy or prescription error in the control group was 14.6% and in the intervention group it was 6.1% (RR 0.42 (95% CI 0.27–0.66)). Conclusion This clinical pharmacist discharge service significantly reduces the risk of discrepancies and prescription errors in medication of patients with heart failure in the 1st month after discharge. PMID:20809276
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carson, M; Molineu, A; Taylor, P
Purpose: To analyze the most recent results of IROC Houston’s anthropomorphic H&N phantom to determine the nature of failing irradiations and the feasibility of altering pass/fail credentialing criteria. Methods: IROC Houston’s H&N phantom, used for IMRT credentialing for NCI-sponsored clinical trials, requires that an institution’s treatment plan must agree with measurement within 7% (TLD doses) and ≥85% pixels must pass 7%/4 mm gamma analysis. 156 phantom irradiations (November 2014 – October 2015) were re-evaluated using tighter criteria: 1) 5% TLD and 5%/4 mm, 2) 5% TLD and 5%/3 mm, 3) 4% TLD and 4%/4 mm, and 4) 3% TLD and 3%/3 mm. Failure/poor performance rates were evaluated with respect to individual film and TLD performance by location in the phantom. Overall poor phantom results were characterized qualitatively as systematic (dosimetric) errors, setup errors/positional shifts, global but non-systematic errors, and errors affecting only a local region. Results: The pass rate for these phantoms using current criteria is 90%. Substituting criteria 1-4 reduces the overall pass rate to 77%, 70%, 63%, and 37%, respectively. Statistical analyses indicated the probability of noise-induced TLD failure at the 5% criterion was <0.5%. Using criterion 1, TLD results were most often the cause of failure (86% failed TLD while 61% failed film), with most failures identified in the primary PTV (77% of cases). Other criteria posed similar results. Irradiations that failed from film only were overwhelmingly associated with phantom shifts/setup errors (≥80% of cases). Results failing criterion 1 were primarily diagnosed as systematic: 58% of cases. 11% were setup/positioning errors, 8% were global non-systematic errors, and 22% were local errors. Conclusion: This study demonstrates that 5% TLD and 5%/4 mm gamma criteria may be both practically and theoretically achievable. Further work is necessary to diagnose and resolve dosimetric inaccuracy in these trials, particularly for systematic dose errors. This work is funded by NCI Grant CA180803.
Semi-Autonomous Control with Cyber-Pain for Artificial Muscles and Smart Structures
2010-09-15
...local controller) to avoid some key failure modes. Our approach has built on our developments in dynamic self-sensing and realistic simulation of DEA electromechanics...strains [4]. In its natural state long polymer backbones are entangled with intermittent cross-links tying neighbouring backbones together. The soft
Medication errors: definitions and classification
Aronson, Jeffrey K
2009-01-01
To understand medication errors and to identify preventive strategies, we need to classify them and define the terms that describe them. The four main approaches to defining technical terms consider etymology, usage, previous definitions, and the Ramsey–Lewis method (based on an understanding of theory and practice). A medication error is ‘a failure in the treatment process that leads to, or has the potential to lead to, harm to the patient’. Prescribing faults, a subset of medication errors, should be distinguished from prescription errors. A prescribing fault is ‘a failure in the prescribing [decision-making] process that leads to, or has the potential to lead to, harm to the patient’. The converse of this, ‘balanced prescribing’ is ‘the use of a medicine that is appropriate to the patient's condition and, within the limits created by the uncertainty that attends therapeutic decisions, in a dosage regimen that optimizes the balance of benefit to harm’. This excludes all forms of prescribing faults, such as irrational, inappropriate, and ineffective prescribing, underprescribing and overprescribing. A prescription error is ‘a failure in the prescription writing process that results in a wrong instruction about one or more of the normal features of a prescription’. The ‘normal features’ include the identity of the recipient, the identity of the drug, the formulation, dose, route, timing, frequency, and duration of administration. Medication errors can be classified, invoking psychological theory, as knowledge-based mistakes, rule-based mistakes, action-based slips, and memory-based lapses. This classification informs preventive strategies. PMID:19594526
Error analysis and prevention of cosmic ion-induced soft errors in static CMOS RAMs
NASA Astrophysics Data System (ADS)
Diehl, S. E.; Ochoa, A., Jr.; Dressendorfer, P. V.; Koga, P.; Kolasinski, W. A.
1982-12-01
Cosmic ray interactions with memory cells are known to cause temporary, random bit errors in some designs. The sensitivity of polysilicon gate CMOS static RAM designs to logic upset by impinging ions has been studied using computer simulations and experimental heavy ion bombardment. Results of the simulations are confirmed by experimental upset cross-section data. Analytical models have been extended to determine and evaluate design modifications which reduce memory cell sensitivity to cosmic ions. A simple design modification, the addition of decoupling resistance in the feedback path, is shown to produce static RAMs immune to cosmic ray-induced bit errors.
Probabilistic confidence for decisions based on uncertain reliability estimates
NASA Astrophysics Data System (ADS)
Reid, Stuart G.
2013-05-01
Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.
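As a hedged numerical illustration of the idea (my construction, not Reid's formulation): if the estimation error of a calculated failure probability is assumed lognormal, the probabilistic confidence that the true failure probability meets an acceptance target can be read off directly by Monte Carlo.

```python
# Monte Carlo estimate of P(true failure probability <= target) under an
# assumed lognormal estimation-error model.
import numpy as np

rng = np.random.default_rng(4)
pf_point = 1e-5          # calculated (point) probability of failure
sigma_log10 = 0.5        # assumed std. dev. of the estimation error, in decades
pf_target = 1e-4         # acceptance criterion on the true failure probability

true_pf = 10 ** (np.log10(pf_point) + sigma_log10 * rng.normal(size=1_000_000))
confidence = np.mean(true_pf <= pf_target)
print(f"P(true pf <= {pf_target:g}) ~= {confidence:.3f}")
```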
ERIC Educational Resources Information Center
Kalahar, Kory G.
2011-01-01
Student failure is a prominent issue in many comprehensive secondary schools nationwide. Researchers studying error, reliability, and performance in organizations have developed and employed a method known as critical incident technique (CIT) for investigating failure. Adopting an action research model, this study involved gathering and analyzing…
[The error, source of learning].
Joyeux, Stéphanie; Bohic, Valérie
2016-05-01
The error itself is not recognised as a fault. It is the intentionality which differentiates between an error and a fault. An error is unintentional while a fault is a failure to respect known rules. The risk of error is omnipresent in health institutions. Public authorities have therefore set out a series of measures to reduce this risk. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Huda, Nizlel; Sutawidjaja, Akbar; Subanji; Rahardjo, Swasono
2018-04-01
Metacognitive activity is very important in mathematical problem solving. Metacognitive activity consists of metacognitive awareness, metacognitive evaluation and metacognitive regulation. This study aimed to reveal errors of metacognitive evaluation in students' metacognitive failure in solving mathematical problems. Twenty students taken as research subjects were grouped into three groups: the first group was students who experienced one metacognitive failure, the second group was students who experienced two metacognitive failures and the third group was students who experienced three metacognitive failures. One person was taken from each group as a research subject. The research data were collected from worksheets completed using a think-aloud protocol, followed by interviews with the research subjects based on their written work. The findings show that students who experienced metacognitive failure in solving mathematical problems tended to err in metacognitive evaluation when judging the effectiveness and limitations of their thinking and the effectiveness of their chosen solution strategy.
NASA Astrophysics Data System (ADS)
Bell, Andrew F.; Naylor, Mark; Heap, Michael J.; Main, Ian G.
2011-08-01
Power-law accelerations in the mean rate of strain, earthquakes and other precursors have been widely reported prior to material failure phenomena, including volcanic eruptions, landslides and laboratory deformation experiments, as predicted by several theoretical models. The Failure Forecast Method (FFM), which linearizes the power-law trend, has been routinely used to forecast the failure time in retrospective analyses; however, its performance has never been formally evaluated. Here we use synthetic and real data, recorded in laboratory brittle creep experiments and at volcanoes, to show that the assumptions of the FFM are inconsistent with the error structure of the data, leading to biased and imprecise forecasts. We show that a Generalized Linear Model method provides higher-quality forecasts that converge more accurately to the eventual failure time, accounting for the appropriate error distributions. This approach should be employed in place of the FFM to provide reliable quantitative forecasts and estimate their associated uncertainties.
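A synthetic sketch in the spirit of that comparison (illustrative only; the paper's Generalized Linear Model machinery is stood in for by a direct Poisson maximum-likelihood fit): events accelerate with rate lambda(t) = k/(t_f - t), the classical FFM extrapolates a straight-line fit of the inverse rate to zero (implicitly assuming Gaussian errors), and the likelihood-based fit respects the Poisson error structure of the counts.

```python
# FFM linear extrapolation vs. Poisson maximum likelihood on synthetic counts.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
k_true, tf_true, dt = 50.0, 100.0, 1.0
t = np.arange(0.0, 90.0, dt)                        # data stop before failure at t=100
counts = rng.poisson(k_true * dt / (tf_true - t))   # observed events per bin

# FFM: inverse rate is (tf - t)/k, so the straight-line fit crosses zero at t = tf
rate = np.maximum(counts, 0.5) / dt                 # guard empty bins before inverting
slope, intercept = np.polyfit(t, 1.0 / rate, 1)
tf_ffm = -intercept / slope

# Poisson maximum likelihood for (k, tf)
def nll(params):
    k, tf = params
    if k <= 0 or tf <= t[-1] + dt:
        return np.inf
    mu = k * dt / (tf - t)
    return np.sum(mu - counts * np.log(mu))

tf_mle = minimize(nll, x0=[10.0, 95.0], method="Nelder-Mead").x[1]
print(f"true t_f = {tf_true}, FFM -> {tf_ffm:.1f}, Poisson MLE -> {tf_mle:.1f}")
```

The guard on empty bins is exactly where the FFM's Gaussian assumption bites: low early counts distort the inverted rate, which is the bias the study attributes to mismatched error structure.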
Addressee Errors in ATC Communications: The Call Sign Problem
NASA Technical Reports Server (NTRS)
Monan, W. P.
1983-01-01
Communication errors involving aircraft call signs were portrayed in reports of 462 hazardous incidents voluntarily submitted to the ASRS during an approximately four-year period. These errors resulted in confusion, disorder, and uncoordinated traffic conditions and produced the following types of operational anomalies: altitude deviations, wrong-way headings, aborted takeoffs, go-arounds, runway incursions, missed crossing altitude restrictions, descents toward high terrain, and traffic conflicts in flight and on the ground. Analysis of the report set resulted in identification of five categories of errors involving call signs: (1) faulty radio usage techniques, (2) call sign loss or smearing due to frequency congestion, (3) confusion resulting from similar-sounding call signs, (4) airmen misses of call signs leading to failures to acknowledge or read back, and (5) controller failures regarding confirmation of acknowledgements or readbacks. These error categories are described in detail and several associated hazard-mitigating measures that might be taken are considered.
The NASA F-15 Intelligent Flight Control Systems: Generation II
NASA Technical Reports Server (NTRS)
Buschbacher, Mark; Bosworth, John
2006-01-01
The Second Generation (Gen II) control system for the F-15 Intelligent Flight Control System (IFCS) program implements direct adaptive neural networks to demonstrate robust tolerance to faults and failures. The direct adaptive tracking controller integrates learning neural networks (NNs) with a dynamic inversion control law. The term direct adaptive is used because the error between the reference model and the aircraft response is being compensated or directly adapted to minimize error without regard to knowing the cause of the error. No parameter estimation is needed for this direct adaptive control system. In the Gen II design, the feedback errors are regulated with a proportional-plus-integral (PI) compensator. This basic compensator is augmented with an online NN that changes the system gains via an error-based adaptation law to improve aircraft performance at all times, including normal flight, system failures, mispredicted behavior, or changes in behavior resulting from damage.
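A schematic single-axis analogue may make the architecture concrete. This toy (not the flight code; every numerical value is an assumption) regulates the reference-model error with a PI law and augments it with one online-adapted weight updated directly from the tracking error, without estimating what changed in the plant.

```python
# Direct adaptive augmentation of a PI tracking law on a scalar plant.
import numpy as np

dt, T = 0.01, 20.0
a_true, b_true = -1.0, 2.0          # plant xdot = a*x + b*u; unknown to the law
am = 4.0                            # reference model xmdot = -am*xm + am*r
kp, ki, gamma = 3.0, 1.5, 8.0       # PI gains and adaptation rate

x = xm = integ = w = 0.0            # w: adaptive augmentation weight
for step in range(int(T / dt)):
    r = 1.0 if (step * dt) % 10 < 5 else -1.0     # square-wave command
    e = xm - x                                    # reference-model error
    integ += e * dt
    u = kp * e + ki * integ + w * x               # PI plus adaptive term
    w += gamma * e * x * dt                       # error-based adaptation law
    if step == int(8.0 / dt):
        a_true = 1.5                              # simulated failure/damage at t = 8 s
    x += (a_true * x + b_true * u) * dt
    xm += (-am * xm + am * r) * dt
print(f"final tracking error {xm - x:+.4f}, adapted weight {w:+.3f}")
```

The point mirrors the abstract: the adaptation law sees only the error between model and aircraft, so it compensates the simulated failure without ever identifying its cause.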
Real-time optimal guidance for orbital maneuvering.
NASA Technical Reports Server (NTRS)
Cohen, A. O.; Brown, K. R.
1973-01-01
A new formulation for soft-constraint trajectory optimization is presented as a real-time optimal feedback guidance method for multiburn orbital maneuvers. Control is always chosen to minimize burn time plus a quadratic penalty for end-condition errors, weighted so that early in the mission (when controllability is greatest) terminal errors are held negligible. Eventually, as controllability diminishes, the method partially relaxes but effectively still compensates for perturbations in whatever subspace remains controllable. Although the soft-constraint concept is well known in optimal control, the present formulation is novel in addressing the loss of controllability inherent in multiple-burn orbital maneuvers. Moreover, the necessary conditions usually obtained from a Bolza formulation are modified in this case so that the fully hard-constraint formulation is a numerically well-behaved subcase. As a result, convergence properties have been greatly improved.
Early Detection Of Failure Mechanisms In Resilient Biostructures: A Network Flow Study
2017-10-01
of flat blades of solid cartilage (sawfishes and some sharks) or simple tubes of bone (swordfish, marlin, etc.) and do not vary appreciably in size...The hard cartilage is formed by two flat sections that are almost parallel to each other and run along the longitudinal axis of the rostrum...[figure: rostrum subjected to a uniform pressure: soft cartilage] The soft cartilage is located at the center of the rostrum and runs in the longitudinal Z
2014-05-14
2002). "Healing after standardized clinical probing of perlimlant soft tissue seal; a histomorphometric study in dogs." Clinical oral Implants...implant and periodontal tissues . A study in the beagle dog." Clin Oral Implants Res3(1): 9-16. Luterbacher, S., Mayfield, L. (2000). "Diagnostic...encapsulation around the implant and soft tissue . This type of healing and encapsulation led to inflammation, infection, mobility and failure of the
Bottoms, Hayden C; Eslick, Andrea N; Marsh, Elizabeth J
2010-08-01
Although contradictions with stored knowledge are common in daily life, people often fail to notice them. For example, in the Moses illusion, participants fail to notice errors in questions such as "How many animals of each kind did Moses take on the Ark?" despite later showing knowledge that the Biblical reference is to Noah, not Moses. We examined whether error prevalence affected participants' ability to detect distortions in questions, and whether this in turn had memorial consequences. Many of the errors were overlooked, but participants were better able to catch them when they were more common. More generally, the failure to detect errors had negative memorial consequences, increasing the likelihood that the errors were used to answer later general knowledge questions. Methodological implications of this finding are discussed, as it suggests that typical analyses likely underestimate the size of the Moses illusion. Overall, answering distorted questions can yield errors in the knowledge base; most importantly, prior knowledge does not protect against these negative memorial consequences.
Localizing softness and stress along loops in 3D topological metamaterials
NASA Astrophysics Data System (ADS)
Baardink, Guido; Souslov, Anton; Paulose, Jayson; Vitelli, Vincenzo
2018-01-01
Topological states can be used to control the mechanical properties of a material along an edge or around a localized defect. The rigidity of elastic networks is characterized by a topological invariant called the polarization; materials with a well-defined uniform polarization display a dramatic range of edge softness depending on the orientation of the polarization relative to the terminating surface. However, in all 3D mechanical metamaterials proposed to date, the topological modes are mixed with bulk soft modes, which organize themselves in Weyl loops. Here, we report the design of a 3D topological metamaterial without Weyl lines and with a uniform polarization that leads to an asymmetry between the number of soft modes on opposing surfaces. We then use this construction to localize topological soft modes in interior regions of the material by including defect lines—dislocation loops—that are unique to three dimensions. We derive a general formula that relates the difference in the number of soft modes and states of self-stress localized along the dislocation loop to the handedness of the vector triad formed by the lattice polarization, Burgers vector, and dislocation-line direction. Our findings suggest a strategy for preprogramming failure and softness localized along lines in 3D, while avoiding extended soft Weyl modes.
NASA Technical Reports Server (NTRS)
Huynh, Loc C.; Duval, R. W.
1986-01-01
The use of a Redundant Asynchronous Multiprocessor System to achieve ultrareliable Fault Tolerant Control Systems shows great promise. The development has been hampered by the inability to determine whether differences in the outputs of redundant CPUs are due to failures or to accrued error built up by slight differences in CPU clock intervals. This study derives an analytical dynamic model of the difference between redundant CPUs due to differences in their clock intervals and uses this model with on-line parameter identification to identify the differences in the clock intervals. The ability of this methodology to accurately track errors due to asynchronicity allows generation of an error signal with the effect of asynchronicity removed, and this signal may be used to detect and isolate actual system failures.
TID and SEE Response of an Advanced Samsung 4G NAND Flash Memory
NASA Technical Reports Server (NTRS)
Oldham, Timothy R.; Friendlich, M.; Howard, J. W.; Berg, M. D.; Kim, H. S.; Irwin, T. L.; LaBel, K. A.
2007-01-01
Initial total ionizing dose (TID) and single event heavy ion test results are presented for an unhardened commercial flash memory, fabricated with 63 nm technology. The parts survive to a TID of nearly 200 krad (SiO2), with a tractable soft error rate of about 10^-12 errors/bit-day for the Adams Ten Percent Worst Case Environment.
NASA Astrophysics Data System (ADS)
Zheng, W.; Gao, J. M.; Wang, R. X.; Chen, K.; Jiang, Y.
2017-12-01
This paper puts forward a new method of technical characteristics deployment based on Reliability Function Deployment (RFD), developed by analysing the advantages and shortcomings of related research on mechanical reliability design. The matrix decomposition structure of RFD was used to describe the correlative relation between failure mechanisms, soft failures and hard failures. By considering the correlation of multiple failure modes, the reliability loss of one failure mode to the whole part was defined, and a calculation and analysis model for reliability loss was presented. According to the reliability loss, the reliability index value of the whole part was allocated to each failure mode. On the basis of the deployment of the reliability index value, the inverse reliability method was employed to acquire the values of the technical characteristics. The feasibility and validity of the proposed method were illustrated by a development case of a machining centre's transmission system.
Ironic Effects of Drawing Attention to Story Errors
Eslick, Andrea N.; Fazio, Lisa K.; Marsh, Elizabeth J.
2014-01-01
Readers learn errors embedded in fictional stories and use them to answer later general knowledge questions (Marsh, Meade, & Roediger, 2003). Suggestibility is robust and occurs even when story errors contradict well-known facts. The current study evaluated whether suggestibility is linked to participants’ inability to judge story content as correct versus incorrect. Specifically, participants read stories containing correct and misleading information about the world; some information was familiar (making error discovery possible), while some was more obscure. To improve participants’ monitoring ability, we highlighted (in red font) a subset of story phrases requiring evaluation; readers no longer needed to find factual information. Rather, they simply needed to evaluate its correctness. Readers were more likely to answer questions with story errors if they were highlighted in red font, even if they contradicted well-known facts. Though highlighting to-be-evaluated information freed cognitive resources for monitoring, an ironic effect occurred: Drawing attention to specific errors increased rather than decreased later suggestibility. Failure to monitor for errors, not failure to identify the information requiring evaluation, leads to suggestibility. PMID:21294039
An alternative data filling approach for prediction of missing data in soft sets (ADFIS).
Sadiq Khan, Muhammad; Al-Garadi, Mohammed Ali; Wahab, Ainuddin Wahid Abdul; Herawan, Tutut
2016-01-01
Soft set theory is a mathematical approach that provides a solution for dealing with uncertain data. As a standard soft set, it can be represented as a Boolean-valued information system, and hence it has been used in hundreds of useful applications. However, these applications become worthless if the Boolean information system contains missing data due to error, security or mishandling. Few studies have focused on handling partially incomplete soft sets, and none of them achieves a high accuracy rate in predicting missing data. It has been shown that the data filling approach for incomplete soft sets (DFIS) has the best performance among all previous approaches; however, accuracy remains its main problem. In this paper, we propose an alternative data filling approach for prediction of missing data in soft sets, namely ADFIS. The novelty of ADFIS is that, unlike the previous approach that used probability, we focus more on the reliability of association among parameters in the soft set. Experimental results on a small dataset, four UCI benchmark datasets, and the causality workbench lung cancer (LUCAP2) dataset show that ADFIS performs better in accuracy as compared to DFIS.
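The sketch below is a simplified stand-in for the association idea, not the published ADFIS algorithm: a missing Boolean entry is predicted from the parameter whose known values agree (or disagree) most consistently with the incomplete parameter, with |2a - 1| serving as the reliability of the association at agreement rate a.

```python
# Association-based filling of missing entries in a Boolean soft-set table.
import numpy as np

def fill_missing(table):
    """table: 2D float array of 0/1 values with np.nan marking missing entries."""
    filled = table.copy()
    for r, j in zip(*np.where(np.isnan(table))):
        best = (0.0, None, None)
        for k in range(table.shape[1]):
            if k == j or np.isnan(table[r, k]):
                continue
            rows = ~np.isnan(table[:, j]) & ~np.isnan(table[:, k])
            if rows.sum() == 0:
                continue
            agree = np.mean(table[rows, j] == table[rows, k])
            strength = abs(2.0 * agree - 1.0)        # reliability of association
            if strength > best[0]:
                best = (strength, k, agree)
        _, k, agree = best
        if k is not None:
            filled[r, j] = table[r, k] if agree >= 0.5 else 1.0 - table[r, k]
    return filled

U = np.array([[1, 1, 0], [0, 0, 1], [1, 1, 0], [0, np.nan, 1]], dtype=float)
print(fill_missing(U))   # predicts the missing entry as 0 (columns 0 and 1 agree)
```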
NASA Technical Reports Server (NTRS)
Davis, Robert N.; Polites, Michael E.; Trevino, Luis C.
2004-01-01
This paper details a novel scheme for autonomous component health management (ACHM) with failed actuator detection and failed sensor detection, identification, and avoidance. This new scheme has features that far exceed the performance of systems with triple-redundant sensing and voting, yet requires fewer sensors and could be applied to any system with redundant sensing. Relevant background to the ACHM scheme is provided, and the simulation results for the application of that scheme to a single-axis spacecraft attitude control system with a 3rd order plant and dual-redundant measurement of system states are presented. ACHM fulfills key functions needed by an integrated vehicle health monitoring (IVHM) system. It is: autonomous; adaptive; works in real time; provides optimal state estimation; identifies failed components; avoids failed components; reconfigures for multiple failures; reconfigures for intermittent failures; works for hard-over, soft, and zero-output failures; and works for both open- and closed-loop systems. The ACHM scheme combines a prefilter that generates preliminary state estimates, detects and identifies failed sensors and actuators, and avoids the use of failed sensors in state estimation with a fixed-gain Kalman filter that generates optimal state estimates and provides model-based state estimates that comprise an integral part of the failure detection logic. The results show that ACHM successfully isolates multiple persistent and intermittent hard-over, soft, and zero-output failures. It is now ready to be tested on a computer model of an actual system.
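A compressed illustration of the avoidance idea with dual-redundant sensing (my toy construction, not the ACHM prefilter): each sensor's residual against a model-based prediction is gated, and a sensor that fails the gate is latched out of the state update instead of being outvoted.

```python
# Failed-sensor avoidance with dual redundancy and a model-based residual gate.
import numpy as np

rng = np.random.default_rng(6)
a, u, dt = -0.5, 1.0, 0.1        # plant xdot = a*x + u (steady state x = 2)
q, rvar, gate = 1e-3, 0.04, 5.0  # process/sensor noise, residual gate (in sigmas)
x_true, x_hat = 2.0, 2.0
healthy = np.array([True, True])

for step in range(200):
    x_true += (a * x_true + u) * dt + np.sqrt(q * dt) * rng.normal()
    z = x_true + np.sqrt(rvar) * rng.normal(size=2)   # dual-redundant sensors
    if step > 100:
        z[1] = 0.0                                    # sensor 2: zero-output failure
    x_pred = x_hat + (a * x_hat + u) * dt             # model-based prediction
    resid = np.abs(z - x_pred) / np.sqrt(rvar)        # normalized residuals
    healthy &= resid < gate                           # latch out failed sensors
    meas = z[healthy].mean() if healthy.any() else x_pred
    x_hat = x_pred + 0.5 * (meas - x_pred)            # simple fixed-gain update
print("healthy sensors:", healthy)                     # expect [ True False ]
```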
Multisite Parent-Centered Risk Assessment to Reduce Pediatric Oral Chemotherapy Errors
Walsh, Kathleen E.; Mazor, Kathleen M.; Roblin, Douglas; Biggins, Colleen; Wagner, Joann L.; Houlahan, Kathleen; Li, Justin W.; Keuker, Christopher; Wasilewski-Masker, Karen; Donovan, Jennifer; Kanaan, Abir; Weingart, Saul N.
2013-01-01
Purpose: Observational studies describe high rates of errors in home oral chemotherapy use in children. In hospitals, proactive risk assessment methods help front-line health care workers develop error prevention strategies. Our objective was to engage parents of children with cancer in a multisite study using proactive risk assessment methods to identify how errors occur at home and propose risk reduction strategies. Methods: We recruited parents from three outpatient pediatric oncology clinics in the northeast and southeast United States to participate in failure mode and effects analyses (FMEA). An FMEA is a systematic, team-based, proactive risk assessment approach to understanding the ways a process can fail and developing prevention strategies. Steps included diagramming the process, brainstorming and prioritizing failure modes (places where things go wrong), and proposing risk reduction strategies. We focused on home oral chemotherapy administration after a change in dose because prior studies identified this area as high risk. Results: Parent teams consisted of four parents at two of the sites and 10 at the third. Parents developed a 13-step process map, with two to 19 failure modes per step. The highest priority failure modes included miscommunication when receiving instructions from the clinician (caused by conflicting instructions or parent lapses) and unsafe chemotherapy handling at home. Recommended risk reduction strategies included novel uses of technology to improve parent access to information, clinicians, and other parents while at home. Conclusion: Parents of pediatric oncology patients readily participated in a proactive risk assessment method, identifying processes that pose a risk for medication errors involving home oral chemotherapy. PMID:23633976
An Empirical Approach to Analysis of Similarities between Software Failure Regions
1991-09-01
cycle costs after the software has been marketed (Alberts, 1976). Unfortunately, extensive software testing is frequently necessary in spite of...incidence is primarily syntactic. This mixing of semantic and syntactic forms in the same analysis could lead to some distortion, especially since the...of formulae to improve readability or to indicate precedence of operations. * All definitions within 'Condition I' of a failure region are assumed to
Improved Rubin-Bodner Model for the Prediction of Soft Tissue Deformations
Zhang, Guangming; Xia, James J.; Liebschner, Michael; Zhang, Xiaoyan; Kim, Daeseung; Zhou, Xiaobo
2016-01-01
In craniomaxillofacial (CMF) surgery, a reliable way of simulating the soft tissue deformation resulting from skeletal reconstruction is vitally important for preventing the risk of facial distortion postoperatively. However, it is difficult to simulate the soft tissue behaviors induced by different types of CMF surgery. This study presents an integrated biomechanical and statistical learning model to improve the accuracy and reliability of predictions of soft facial tissue behavior. The Rubin-Bodner (RB) model is initially used to describe the biomechanical behavior of the soft facial tissue. Subsequently, a finite element model (FEM) computes the stress at each node of the soft facial tissue mesh resulting from bone displacement. Next, the Generalized Regression Neural Network (GRNN) method is implemented to obtain the relationship between the facial soft tissue deformation and the stress distribution corresponding to different CMF surgical types, and to improve the evaluation of the elastic parameters included in the RB model. Therefore, the soft facial tissue deformation can be predicted from the combined biomechanical and statistical model. Leave-one-out cross-validation is used on eleven patients. As a result, the average prediction error of our model (0.7035 mm) is lower than those resulting from other approaches. It also demonstrates that the more accurate biomechanical information the model has, the better prediction performance it can achieve. PMID:27717593
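The GRNN step above is, in essence, Gaussian-kernel (Nadaraya-Watson) regression. A minimal hedged sketch follows, with synthetic placeholder pairs standing in for FEM stress features and tissue displacements.

```python
# Minimal GRNN: prediction = Gaussian-kernel weighted mean of training targets.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma**2))        # kernel weights per training sample
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(7)
X = rng.uniform(-1, 1, size=(200, 3))         # e.g. nodal stress components
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.05 * rng.normal(size=200)   # displacement
Xq = rng.uniform(-1, 1, size=(5, 3))
print(grnn_predict(X, y, Xq))
```

The single smoothing parameter sigma is what makes the GRNN attractive for small clinical cohorts such as the eleven patients here: there are no weights to train, only sigma to cross-validate.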
Results from the First Two Flights of the Static Computer Memory Integrity Testing Experiment
NASA Technical Reports Server (NTRS)
Hancock, Thomas M., III
1999-01-01
This paper details the scientific objectives, experiment design, data collection method, and post-flight analysis following the first two flights of the Static Computer Memory Integrity Testing (SCMIT) experiment. SCMIT is designed to detect soft-event upsets in passive magnetic memory. A soft-event upset is a change in the logic state of active or passive forms of magnetic memory, commonly referred to as a "Bitflip". In its mildest form a soft-event upset can cause software exceptions or unexpected events, trigger spacecraft safing (ending data collection), or corrupt fault protection and error recovery capabilities. In its most severe form, loss of the mission or spacecraft can occur. Analysis after the first flight (in 1991 during STS-40) identified possible soft-event upsets in 25% of the experiment detectors. Post-flight analysis after the second flight (in 1997 on STS-87) failed to find any evidence of soft-event upsets. The SCMIT experiment is currently scheduled for a third flight in December 1999 on STS-101.
Zhang, Bin; He, Xin; Ouyang, Fusheng; Gu, Dongsheng; Dong, Yuhao; Zhang, Lu; Mo, Xiaokai; Huang, Wenhui; Tian, Jie; Zhang, Shuixing
2017-09-10
We aimed to identify optimal machine-learning methods for radiomics-based prediction of local failure and distant failure in advanced nasopharyngeal carcinoma (NPC). We enrolled 110 patients with advanced NPC. A total of 970 radiomic features were extracted from MRI images for each patient. Six feature selection methods and nine classification methods were evaluated in terms of their performance. We applied 10-fold cross-validation as the criterion for feature selection and classification. We repeated each combination 50 times to obtain the mean area under the curve (AUC) and test error. We observed that the combination of Random Forest (RF) + RF (AUC, 0.8464 ± 0.0069; test error, 0.3135 ± 0.0088) had the highest prognostic performance, followed by RF + Adaptive Boosting (AdaBoost) (AUC, 0.8204 ± 0.0095; test error, 0.3384 ± 0.0097), and Sure Independence Screening (SIS) + Linear Support Vector Machines (LSVM) (AUC, 0.7883 ± 0.0096; test error, 0.3985 ± 0.0100). Our radiomics study identified optimal machine-learning methods for the radiomics-based prediction of local failure and distant failure in advanced NPC, which could enhance the applications of radiomics in precision oncology and clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.
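The winning RF + RF combination is straightforward to reproduce in outline. The sketch below uses scikit-learn on a placeholder classification table with the study's dimensions (110 patients, 970 features); the selector size, tree counts, and repeat count are assumptions, not the paper's settings.

```python
# RF-importance feature selection + RF classifier, repeated 10-fold CV AUC.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import Pipeline

X, y = make_classification(n_samples=110, n_features=970, n_informative=20,
                           random_state=0)   # stand-in for the MRI radiomics table
pipe = Pipeline([
    ("select", SelectFromModel(RandomForestClassifier(n_estimators=200,
                                                      random_state=0),
                               threshold=-np.inf, max_features=30)),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
])
aucs = [cross_val_score(pipe, X, y, scoring="roc_auc",
                        cv=StratifiedKFold(10, shuffle=True, random_state=rep)).mean()
        for rep in range(5)]                  # 5 repeats here; the study used 50
print(f"AUC {np.mean(aucs):.3f} +/- {np.std(aucs):.4f}")
```

Selection runs inside the pipeline so it is refit within every fold, which is what keeps the repeated cross-validation estimate honest.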
Analysis of aggregate pier systems for stabilization of subgrade settlement.
DOT National Transportation Integrated Search
2014-12-01
Every year, ODOT undertakes numerous pavement patching/resurfacing projects to repair pavement distress and structural failure due to soft and/or organic soils constituting the subgrade. Other than the temporary solution of patching/resurfacing, ...
Catastrophic metallosis after tumoral knee prosthesis failure: A case report.
La Verde, Luca; Fenga, Domenico; Spinelli, Maria Silvia; Campo, Francesco Rosario; Florio, Michela; Rosa, Michele Attilio
2017-01-01
Metallosis is a condition characterized by an infiltration of periprosthetic soft tissues and bone by metallic debris resulting from wear or failure of joint arthroplasties. The authors describe the case of a 45-year-old man treated for an osteosarcoma of the distal femur with a modular prosthesis when he was 18 years old; he developed massive metallosis with skin dyspigmentation after 17 years. His medical/surgical history was remarkable for a left tumoral knee prosthesis implanted 21 years ago. Two years before revision, the patient had a car accident with a two-point prosthesis breakage and, despite the surgeon's advice, the patient refused surgery. In two years, prosthesis malfunction caused a progressive, catastrophic infiltration of metallic debris into the soft tissues. The authors suggest that if a prosthesis fracture is detected, revision surgery should be attempted as early as possible. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Gejji, Raghvendra, R.
1992-01-01
Network transmission errors such as collisions, CRC errors, misalignment, etc. are statistical in nature. Although errors can vary randomly, a high level of errors does indicate specific network problems, e.g. equipment failure. In this project, we have studied the random nature of collisions theoretically as well as by gathering statistics, and established a numerical threshold above which a network problem is indicated with high probability.
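A hedged numeric illustration of such a threshold (the Poisson model and numbers below are assumptions for illustration, not the project's measured statistics): if healthy-network collisions per monitoring window behave roughly as a Poisson count, alarming above a high percentile makes a flag indicate a real problem, such as failing equipment, with high probability.

```python
# Statistical alarm threshold on collision counts under a Poisson model.
from scipy.stats import poisson

mu = 12.0                              # assumed mean collisions per healthy window
threshold = poisson.ppf(0.9999, mu)    # alarm level: 99.99th percentile
print(f"alarm if > {threshold:.0f} collisions per window "
      f"(false-alarm rate {poisson.sf(threshold, mu):.2e} per window)")
```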
Sensor failure detection system. [for the F100 turbofan engine
NASA Technical Reports Server (NTRS)
Beattie, E. C.; Laprad, R. F.; Mcglone, M. E.; Rock, S. M.; Akhter, M. M.
1981-01-01
Advanced concepts for detecting, isolating, and accommodating sensor failures were studied to determine their applicability to the gas turbine control problem. Five concepts were formulated based upon techniques such as Kalman filters, and a screening process led to the selection of one advanced concept for further evaluation. The selected advanced concept uses a Kalman filter to generate residuals, a weighted sum-squared residuals technique to detect soft failures, likelihood ratio testing of a bank of Kalman filters for isolation, and reconfiguration of the normal-mode Kalman filter by eliminating the failed input to accommodate the failure. The advanced concept was compared to a baseline parameter synthesis technique and was shown to be viable for detecting, isolating, and accommodating sensor failures in gas turbine applications.
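As a rough illustration of the detection step, a weighted sum-squared residuals (WSSR) test accumulates normalized Kalman innovations over a sliding window and compares the sum with a chi-square threshold. The sketch below uses made-up residuals and covariance, not the report's F100 engine model:

```python
# Minimal WSSR soft-failure detector on Kalman innovations; the residual
# stream and covariance are synthetic stand-ins for the engine model.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
m = 3                                     # number of sensed outputs
S = np.diag([0.4, 0.9, 0.25])             # innovation covariance (assumed known)
window = 20
residuals = rng.multivariate_normal(np.zeros(m), S, size=200)
residuals[120:] += np.array([0.6, 0.0, 0.0])   # inject a soft sensor bias

S_inv = np.linalg.inv(S)
threshold = chi2.ppf(0.999, df=m * window)     # false-alarm rate ~0.1%
for k in range(window, len(residuals)):
    wssr = sum(r @ S_inv @ r for r in residuals[k - window:k])
    if wssr > threshold:
        print(f"soft failure declared at sample {k}, WSSR={wssr:.1f}")
        break
```

Under fault-free operation the windowed sum is chi-square distributed with m x window degrees of freedom, which is what makes the threshold choice principled.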
Stretchable electronics based on Ag-PDMS composites
Larmagnac, Alexandre; Eggenberger, Samuel; Janossy, Hanna; Vörös, Janos
2014-01-01
Patterned structures of flexible, stretchable, electrically conductive materials on soft substrates could lead to novel electronic devices with unique mechanical properties allowing them to bend, fold, stretch, or conform to their environment. Over the last decade, research on improving the stretchability of circuits on elastomeric substrates has made significant progress, but designing printed circuit assemblies on elastomers remains challenging. Here we present a simple, cost-effective, cleanroom-free process to produce large-scale soft electronic hardware in which standard surface-mounted electrical components are directly bonded onto all-elastomeric printed circuit boards, or soft PCBs. Ag-PDMS tracks were stencil printed onto a PDMS substrate, and soft PCBs were made by bonding the top and bottom layers together and filling punched holes with Ag-PDMS to create vias. Silver epoxy was used to bond commercial electrical components, and no mechanical failure was observed after hundreds of stretching cycles. We also demonstrate the fabrication of a stretchable clock generator. PMID:25434843
An undulator based soft x-ray source for microscopy on the Duke electron storage ring
NASA Astrophysics Data System (ADS)
Johnson, Lewis Elgin
1998-09-01
This dissertation describes the design, development, and installation of an undulator-based soft x-ray source on the Duke Free Electron Laser laboratory electron storage ring. Insertion device and soft x-ray beamline physics and technology are discussed in detail. The Duke/NIST undulator is a 3.64-m long hybrid design constructed by the Brobeck Division of Maxwell Laboratories. Originally built for an FEL project at the National Institute of Standards and Technology, the undulator was acquired by Duke in 1992 for use as a soft x-ray source for the FEL laboratory. Initial Hall probe measurements of the magnetic field distribution of the undulator revealed field errors of more than 0.80%. Initial phase errors for the device were more than 11 degrees. Through a series of in situ and off-line measurements and modifications, we have re-tuned the magnetic field structure of the device to produce strong spectral characteristics through the 5th harmonic. A low operating K has served to reduce the effects of magnetic field errors on the harmonic spectral content. Although rms field errors remained at 0.75%, we succeeded in reducing phase errors to less than 5 degrees. Using trajectory simulations from magnetic field data, we have computed the spectral output given the interaction of the Duke storage ring electron beam and the NIST undulator. Driven by a series of concerns and constraints over maximum utility, personnel safety, and funding, we have also constructed a unique front-end beamline for the undulator. The front end has been designed for maximum throughput of the 1st harmonic around 40 Å in its standard mode of operation, and has an alternative mode of operation which transmits the 3rd and 5th harmonics. This compact system also allows for the extraction of some of the bend-magnet-produced synchrotron and transition radiation from the storage ring. As with any well-designed front-end system, it also provides excellent protection to personnel and to the storage ring. A diagnostic beamline consisting of a transmission grating spectrometer and a scanning-wire beam profile monitor was constructed to measure the spatial and spectral characteristics of the undulator radiation. Tests of the system with a circulating electron beam have confirmed the magnetic and focusing properties of the undulator, and verified that it can be used without perturbing the orbit of the beam.
Chang, Shih-Tsun; Liu, Yen-Hsiu; Lee, Jiahn-Shing; See, Lai-Chu
2015-01-01
Background: The effect of correcting static vision on sports vision is still not clear. Aim: To examine whether sports vision measures (depth perception [DP], dynamic visual acuity [DVA], eye movement [EM], peripheral vision [PV], and momentary vision [MV]) differed among soft tennis adolescent athletes with normal vision (Group A), with refractive error corrected with eyeglasses (Group B), and with uncorrected refractive error (Group C). Setting and Design: A cross-sectional study was conducted. Soft tennis athletes aged 10–13 who had played soft tennis for 2–5 years, and who were without any ocular diseases and without visual training for the past 3 months, were recruited. Materials and Methods: DP was measured as the absolute deviation (mm) between a moving rod and a fixed rod (approaching at 25 mm/s, receding at 25 mm/s, approaching at 50 mm/s, receding at 50 mm/s) using an electric DP tester. A smaller deviation represented better DP. DVA, EM, PV, and MV were measured on a scale from 1 (worst) to 10 (best) using ATHLEVISION software. Statistical Analysis: The chi-square test and Kruskal–Wallis test were used to compare the data among the three study groups. Results: A total of 73 athletes (37 in Group A, 8 in Group B, 28 in Group C) were enrolled in this study. All four items of DP showed significant differences among the three study groups (P = 0.0051, 0.0004, 0.0095, 0.0021). PV also displayed a significant difference among the three study groups (P = 0.0044). There was no significant difference in DVA, EM, or MV among the three study groups. Conclusions: Significantly better DP and PV were seen among soft tennis adolescent athletes with normal vision than among those with refractive error, regardless of whether it was corrected with eyeglasses. On the other hand, DVA, EM, and MV were similar among the three study groups. PMID:26632127
Ballistic Impact of Braided Composites With a Soft Projectile
NASA Technical Reports Server (NTRS)
Roberts, Gary D.; Pereira, J. Michael; Revilock, Duane M., Jr.; Binienda, Wieslaw; Xie, Ming; Braley, Mike
2004-01-01
Impact tests using a soft gelatin projectile were performed to identify failure modes that occur at high strain energy density during impact loading. Use of a soft projectile allows a large amount of kinetic energy to be transferred into strain energy in the target before penetration occurs. Failure modes were identified for flat aluminum plates and for flat composite plates made from a triaxial braid having a quasi-isotropic fiber architecture with fibers in the 0 and +/- 60 deg. directions. For the aluminum plates, a large hole formed as a result of crack propagation from the initiation site at the center of the plate to the fixed boundaries. For the composite plates, fiber tensile failure occurred in the back ply at the center of the plate. Cracks then propagated from this site along the +/- 60 deg. fiber directions until triangular flaps opened to allow the projectile to pass through the plate. The damage size was only slightly larger than the initial impact area. It was difficult to avoid slipping of the fixed edges of the plates during impact, and slipping was shown to have a large effect on the penetration threshold. Failure modes were also identified for composite half-rings fabricated with the 0 deg. fibers aligned circumferentially. Slipping of the edges was not a problem in the half-ring tests. For the composite half-rings, fiber tensile failure also occurred in the back ply. However, cracks initially propagated from this site in a direction transverse to the 0 deg. fibers. The cracks then turned to follow the +/- 60 deg. fibers for a short distance before turning again to follow the 0 deg. fibers until two approximately rectangular flaps opened to allow the projectile to pass through the plate. The damage size in the composite half-rings was also only slightly larger than the initial impact area. Cracks did not propagate to the boundaries, and no delamination was observed. The damage tolerance demonstrated by the quasi-isotropic triaxial braid composites indicates that composites of this type can reasonably be considered as a lightweight alternative to metals for fan cases in commercial jet engines.
A measurement-based performability model for a multiprocessor system
NASA Technical Reports Server (NTRS)
Hsueh, M. C.; Iyer, Ravi K.; Trivedi, K. S.
1987-01-01
A measurement-based performability model based on real error data collected on a multiprocessor system is described. Model development from the raw error data to the estimation of cumulative reward is described. Both normal and failure behavior of the system are characterized. The measured data show that the holding times in key operational and failure states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different failure types and recovery procedures.
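As a toy illustration of such a model, the expected reward rate of a semi-Markov process can be computed from the embedded chain's stationary distribution, the mean state-holding times, and per-state reward rates. All numbers below are invented, not the paper's measured data:

```python
# Toy semi-Markov performability estimate: steady-state time fractions from
# the embedded transition matrix P and mean holding times h, then the
# expected reward rate. All values are illustrative, not measured data.
import numpy as np

P = np.array([[0.0, 0.9, 0.1],      # embedded transitions: normal, degraded, failed
              [0.7, 0.0, 0.3],
              [1.0, 0.0, 0.0]])
h = np.array([100.0, 5.0, 0.5])     # mean holding time in each state (hours)
reward = np.array([1.0, 0.4, 0.0])  # service rate delivered in each state

# Stationary distribution of the embedded chain: pi P = pi, sum(pi) = 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi /= pi.sum()

time_fraction = pi * h / (pi * h).sum()   # fraction of real time spent in each state
print("expected reward rate:", time_fraction @ reward)
```

Weighting the embedded-chain probabilities by holding times is exactly what distinguishes the semi-Markov treatment from a plain Markov chain with exponential holding times.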
Human factors process failure modes and effects analysis (HF PFMEA) software tool
NASA Technical Reports Server (NTRS)
Chandler, Faith T. (Inventor); Relvini, Kristine M. (Inventor); Shedd, Nathaneal P. (Inventor); Valentino, William D. (Inventor); Philippart, Monica F. (Inventor); Bessette, Colette I. (Inventor)
2011-01-01
Methods, computer-readable media, and systems for automatically performing Human Factors Process Failure Modes and Effects Analysis for a process are provided. At least one task involved in a process is identified, where the task includes at least one human activity. The human activity is described using at least one verb. A human error potentially resulting from the human activity is automatically identified, where the error is related to the verb used to describe the task. The likelihoods of occurrence, detection, and correction of the human error are identified, along with the severity of its effect. From the likelihood of occurrence and the severity, the risk of potential harm is identified and compared with a risk threshold to determine the appropriateness of corrective measures.
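A minimal sketch of that scoring logic, with invented rating scales and threshold (the patent's actual scales and comparison rules are not reproduced in the abstract):

```python
# Hypothetical HF PFMEA risk scoring: combine likelihoods and severity into a
# risk priority number and compare with a threshold. Scales are invented.
from dataclasses import dataclass

@dataclass
class HumanErrorMode:
    verb: str            # verb describing the human activity
    p_occurrence: int    # 1 (rare) .. 10 (frequent)
    p_escape: int        # 1 (always detected/corrected) .. 10 (never)
    severity: int        # 1 (negligible) .. 10 (catastrophic)

    def risk(self) -> int:
        return self.p_occurrence * self.p_escape * self.severity

RISK_THRESHOLD = 120     # illustrative cutoff for corrective action

mode = HumanErrorMode(verb="torque", p_occurrence=4, p_escape=6, severity=7)
if mode.risk() >= RISK_THRESHOLD:
    print(f"'{mode.verb}' error mode risk={mode.risk()}: corrective measures required")
```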
Laser as a Tool to Study Radiation Effects in CMOS
NASA Astrophysics Data System (ADS)
Ajdari, Bahar
Energetic particles from cosmic ray or terrestrial sources can strike sensitive areas of CMOS devices and cause soft errors. Understanding the effects of such interactions is crucial as device technology advances, and chip reliability has become more important than ever. Particle accelerator testing has been the standard method to characterize the sensitivity of chips to single event upsets (SEUs). However, because of its cost and limited availability, other techniques have been explored. The pulsed laser has been a successful tool for characterization of SEU behavior, but to this day laser testing has not been recognized as a method comparable to beam testing. In this thesis, I propose a methodology for correlating laser soft error rate (SER) with data gathered at particle beams. Additionally, results are presented showing a temperature dependence of SER and the "neighbor effect" phenomenon, in which, due to the close proximity of devices, a "weakening effect" in the ON state can be observed.
Kazaura, Kamugisha; Omae, Kazunori; Suzuki, Toshiji; Matsumoto, Mitsuji; Mutafungwa, Edward; Korhonen, Timo O; Murakami, Tadaaki; Takahashi, Koichi; Matsumoto, Hideki; Wakamori, Kazuhiko; Arimoto, Yoshinori
2006-06-12
The deterioration and deformation of a free-space optical beam wave-front as it propagates through the atmosphere can reduce link availability and may introduce burst errors, degrading the performance of the system. We investigate the suitability of soft-computing (SC) based tools for improving the performance of free-space optical (FSO) communications systems. The SC based tools are used for the prediction of key parameters of an FSO communications system. Measured data collected from an experimental FSO communication system are used as training and testing data for a proposed multi-layer neural network predictor (MNNP) used to predict future parameter values. The predicted parameters are essential for reducing transmission errors by improving the accuracy with which the antenna tracks data beams, particularly during periods of strong atmospheric turbulence. The parameter values predicted using the proposed tool show acceptable conformity with the original measurements.
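As an illustration of such a predictor, the sketch below trains a small multi-layer network on lagged samples of a signal to forecast its next value. The data, network topology, and input choice are assumptions for illustration, not the authors' configuration:

```python
# Toy multi-layer neural network predictor: forecast the next value of a
# received-power series from its previous `lags` samples. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
t = np.arange(2000)
signal = np.sin(2 * np.pi * t / 150) + 0.1 * rng.normal(size=t.size)  # fake FSO power

lags = 8
X = np.column_stack([signal[i:i - lags] for i in range(lags)])  # lagged inputs
y = signal[lags:]                                               # one-step-ahead target

split = 1500
mlp = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
mlp.fit(X[:split], y[:split])
rmse = np.sqrt(np.mean((mlp.predict(X[split:]) - y[split:]) ** 2))
print(f"one-step-ahead RMSE on held-out data: {rmse:.3f}")
```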
Effects of Stopping Ions and LET Fluctuations on Soft Error Rate Prediction.
Weeden-Wright, S. L.; King, Michael Patrick; Hooten, N. C.; ...
2015-02-01
Variability in energy deposition from stopping ions and LET fluctuations is quantified for specific radiation environments. When compared to predictions using average LET via CREME96, LET fluctuations lead to an order-of-magnitude difference in effective flux and a nearly 4x decrease in predicted soft error rate (SER) in an example calculation performed on a commercial 65 nm SRAM. The large LET fluctuations reported here will be even greater for the smaller sensitive volumes that are characteristic of highly scaled technologies. End-of-range effects of stopping ions do not lead to significant inaccuracies in radiation environments with low solar activity unless the sensitive-volume thickness is 100 μm or greater. In contrast, end-of-range effects for stopping ions lead to significant inaccuracies for sensitive-volume thicknesses less than 10 μm in radiation environments with high solar activity.
NASA Astrophysics Data System (ADS)
Yan, Hong; Song, Xiangzhong; Tian, Kuangda; Chen, Yilin; Xiong, Yanmei; Min, Shungeng
2018-02-01
A novel method based on mid-infrared (MIR) spectroscopy, which enables the determination of chlorantraniliprole in abamectin within minutes, is proposed. We further evaluate the prediction ability of four wavelength selection methods: the bootstrapping soft shrinkage approach (BOSS), Monte Carlo uninformative variable elimination (MCUVE), genetic algorithm partial least squares (GA-PLS), and competitive adaptive reweighted sampling (CARS). The results showed that the BOSS method obtained the lowest root mean squared error of cross-validation (RMSECV) (0.0245) and root mean squared error of prediction (RMSEP) (0.0271), as well as the highest coefficient of determination of cross-validation (Q2cv) (0.9998) and coefficient of determination of the test set (Q2test) (0.9989), demonstrating that mid-infrared spectroscopy can be used to detect chlorantraniliprole in abamectin conveniently. Meanwhile, a suitable wavelength selection method (BOSS) is essential for such component spectral analysis.
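The quoted figures of merit are standard chemometric quantities. Assuming a PLS calibration model in scikit-learn (the wavelength-selection step itself is omitted, and the spectra below are synthetic), they can be computed as follows:

```python
# Computing RMSECV, RMSEP, and the corresponding Q^2 values for a PLS model
# on synthetic spectra; the variable-selection step (e.g., BOSS) is omitted.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict, train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(120, 300))                # fake MIR spectra
y = X[:, 40] * 0.8 + X[:, 140] * 0.5 + 0.05 * rng.normal(size=120)

X_cal, X_test, y_cal, y_test = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5)

y_cv = cross_val_predict(pls, X_cal, y_cal, cv=10).ravel()
rmsecv = np.sqrt(np.mean((y_cv - y_cal) ** 2))
q2_cv = 1 - np.sum((y_cv - y_cal) ** 2) / np.sum((y_cal - y_cal.mean()) ** 2)

pls.fit(X_cal, y_cal)
y_pred = pls.predict(X_test).ravel()
rmsep = np.sqrt(np.mean((y_pred - y_test) ** 2))
q2_test = 1 - np.sum((y_pred - y_test) ** 2) / np.sum((y_test - y_test.mean()) ** 2)

print(f"RMSECV={rmsecv:.4f} Q2cv={q2_cv:.4f}  RMSEP={rmsep:.4f} Q2test={q2_test:.4f}")
```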
Analyzing the effectiveness of a frame-level redundancy scrubbing technique for SRAM-based FPGAs
Tonfat, Jorge; Lima Kastensmidt, Fernanda; Rech, Paolo; ...
2015-12-17
Radiation effects such as soft errors are the major threat to the reliability of SRAM-based FPGAs. This work analyzes the effectiveness in correcting soft errors of a novel scrubbing technique using internal frame redundancy called Frame-level Redundancy Scrubbing (FLR-scrubbing). This correction technique can be implemented in a coarse grain TMR design. The FLR-scrubbing technique was implemented on a mid-size Xilinx Virtex-5 FPGA device used as a case study. The FLR-scrubbing technique was tested under neutron radiation and fault injection. Implementation results demonstrated minimum area and energy consumption overhead when compared to other techniques. The time to repair the fault is also improved by using the Internal Configuration Access Port (ICAP). Lastly, neutron radiation test results demonstrated that the proposed technique is suitable for correcting accumulated SEUs and MBUs.
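The core idea of frame-level redundancy, bitwise majority voting across redundant copies of a configuration frame, can be sketched as follows. This is a simplified illustration of the voting principle only, not the authors' Virtex-5 ICAP implementation:

```python
# Simplified frame-level redundancy scrub: read three redundant copies of a
# configuration frame, majority-vote each bit, and rewrite any corrupted copy.
def majority_vote(frames: list[int], width: int) -> int:
    voted = 0
    for bit in range(width):
        ones = sum((f >> bit) & 1 for f in frames)
        voted |= (1 if ones >= 2 else 0) << bit
    return voted

WIDTH = 32
frames = [0b10110010, 0b10110010, 0b10100010]   # third copy has one upset bit
golden = majority_vote(frames, WIDTH)

for i, f in enumerate(frames):
    if f != golden:
        print(f"frame copy {i} corrected: {f:08b} -> {golden:08b}")
        frames[i] = golden
```

Because voting needs no stored golden bitstream, a scrubber built this way can also repair accumulated upsets, which matches the accumulated-SEU/MBU result reported above.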
Damage level prediction of non-reshaped berm breakwater using ANN, SVM and ANFIS models
NASA Astrophysics Data System (ADS)
Mandal, Sukomal; Rao, Subba; N., Harish; Lokesha
2012-06-01
The damage analysis of coastal structures is very important as it involves many design parameters to be considered for a better and safer design of the structure. In the present study, experimental data for a non-reshaped berm breakwater are collected from the Marine Structures Laboratory, Department of Applied Mechanics and Hydraulics, NITK, Surathkal, India. Soft computing techniques like Artificial Neural Network (ANN), Support Vector Machine (SVM), and Adaptive Neuro-Fuzzy Inference System (ANFIS) models are constructed using experimental data sets to predict the damage level of the non-reshaped berm breakwater. The experimental data are used to train the ANN, SVM, and ANFIS models, and results are evaluated in terms of statistical measures like mean square error, root mean square error, correlation coefficient, and scatter index. The results show that soft computing techniques, i.e., ANN, SVM, and ANFIS, can be efficient tools for predicting damage levels of non-reshaped berm breakwaters.
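The statistical measures listed are simple to compute from paired observed and predicted damage levels. In the helper below, the scatter index is taken as RMSE normalized by the observed mean, a common convention assumed here:

```python
# Goodness-of-fit measures for comparing predicted and observed damage levels.
# Scatter index (SI) is taken here as RMSE over the observed mean.
import numpy as np

def fit_metrics(observed, predicted):
    observed, predicted = np.asarray(observed), np.asarray(predicted)
    err = predicted - observed
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    cc = np.corrcoef(observed, predicted)[0, 1]   # correlation coefficient
    si = rmse / observed.mean()
    return {"MSE": mse, "RMSE": rmse, "CC": cc, "SI": si}

print(fit_metrics([0.8, 1.4, 2.1, 2.9], [0.9, 1.3, 2.3, 2.7]))
```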
Learning to Fail in Aphasia: An Investigation of Error Learning in Naming
Middleton, Erica L.; Schwartz, Myrna F.
2013-01-01
Purpose: To determine if the naming impairment in aphasia is influenced by error learning and if error learning is related to the type of retrieval strategy. Method: Nine participants with aphasia and ten neurologically intact controls named familiar proper noun concepts. When experiencing tip-of-the-tongue naming failure (TOT) in an initial TOT-elicitation phase, participants were instructed to adopt phonological or semantic self-cued retrieval strategies. In the error learning manipulation, items evoking TOT states during TOT-elicitation were randomly assigned to a short or long time condition, in which participants were encouraged to continue trying to retrieve the name for either 20 seconds (short interval) or 60 seconds (long). The incidence of TOT on the same items was measured on a post test after 48 hours. Error learning was defined as a higher rate of recurrent TOTs (TOT at both TOT-elicitation and post test) for items assigned to the long (versus short) time condition. Results: In the phonological condition, participants with aphasia showed error learning whereas controls showed a pattern opposite to error learning. There was no evidence of error learning in the semantic condition for either group. Conclusion: Error learning is operative in aphasia, but dependent on the type of strategy employed during naming failure. PMID:23816662
Styck, Kara M; Walsh, Shana M
2016-01-01
The purpose of the present investigation was to conduct a meta-analysis of the literature on examiner errors for the Wechsler scales of intelligence. Results indicate that a mean of 99.7% of protocols contained at least one examiner error when studies that counted a failure to record examinee responses as an error were combined, and a mean of 41.2% of protocols contained at least one examiner error when studies that ignored errors of omission were combined. Furthermore, graduate student examiners were significantly more likely to make at least one error on Wechsler intelligence test protocols than psychologists. However, psychologists made significantly more errors per protocol than graduate student examiners regardless of the inclusion or exclusion of failure to record examinee responses as an error. On average, 73.1% of Full-Scale IQ (FSIQ) scores changed as a result of examiner errors, whereas 15.8%-77.3% of scores on the Verbal Comprehension Index (VCI), Perceptual Reasoning Index (PRI), Working Memory Index (WMI), and Processing Speed Index changed as a result of examiner errors. In addition, results suggest that examiners tend to overestimate FSIQ scores and underestimate VCI scores. However, no strong pattern emerged for the PRI and WMI. It can be concluded that examiner errors occur frequently and impact index and FSIQ scores. Consequently, current estimates of the standard error of measurement of popular IQ tests may not adequately capture the variance due to the examiner. (c) 2016 APA, all rights reserved.
Giant voltage-induced deformation of a dielectric elastomer under a constant pressure
NASA Astrophysics Data System (ADS)
Godaba, Hareesh; Foo, Choon Chiang; Zhang, Zhi Qian; Khoo, Boo Cheong; Zhu, Jian
2014-09-01
Dielectric elastomer actuators coupled with liquid have recently been developed as soft pumps, soft lenses, Braille displays, etc. In this paper, we investigate the performance of a dielectric elastomer actuator which is coupled with water. The experiments demonstrate that the membrane of a dielectric elastomer can achieve a giant voltage-induced area strain of 1165% when subjected to a constant pressure. Both theory and experiment show that the pressure plays an important role in determining the electromechanical behaviour. The experiments also suggest that dielectric elastomer actuators, when coupled with liquid, may suffer mechanical instability and collapse after a large amount of liquid is enclosed by the membrane. This failure mode needs to be taken into account in designing soft actuators.
Bachmaier, Samuel; Smith, Patrick A; Bley, Jordan; Wijdicks, Coen A
2018-02-01
To compare the dynamic elongation, stiffness behavior, and ultimate failure load of standard and small diameter soft tissue grafts for anterior cruciate ligament (ACL) reconstruction with and without high-strength suture tape reinforcement. Both a tripled "small" diameter and a "standard" quadrupled tendon graft, with and without suture tape reinforcement, were tested using suspensory fixation (n = 8 per group). The suture tape was passed through the suspensory fixation button on the femur and tibia to ensure independent (safety belt) fixation from the graft in vitro. Testing of the constructs included position-controlled cyclic loading, force-controlled cyclic loading at 250 N and 400 N, as well as pull to failure (50 mm/min). Reinforcement of a small diameter graft significantly reduced dynamic elongation, by 38% (1.46 ± 0.28 mm vs 2.34 ± 0.44 mm, P < .001) and 50% (2.55 ± 0.44 mm vs 5.06 ± 0.67 mm, P < .001) after the 250 N and 400 N load protocols, respectively. Reinforcement of a standard diameter tendon graft decreased dynamic elongation by 15% (1.59 ± 0.34 mm vs 1.86 ± 0.17 mm, P = .066) and 26% (2.62 ± 0.44 mm vs 3.55 ± 0.44 mm, P < .001). No significant difference was found between the two reinforced models. The ultimate failure loads of small and standard diameter reinforced grafts were 1592 ± 105 N and 1585 ± 265 N, a 64% (P < .001) and 40% (P < .001) increase compared with their respective controls. Independent suture tape reinforcement of soft tissue grafts for ACL reconstruction leads to significantly reduced elongation and higher ultimate failure load, consistent with in vivo native ACL function data, without stress-shielding the soft tissue graft. If the in vitro results translate to human knees in vivo, the suture tape reinforcement technique for ACL reconstruction may decrease the risk of graft tears, particularly in the case of small diameter soft tissue grafts. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Jia, Rui; Monk, Paul; Murray, David; Noble, J Alison; Mellon, Stephen
2017-09-06
Optoelectronic motion capture systems are widely employed to measure the movement of human joints. However, there can be a significant discrepancy between the data obtained by a motion capture system (MCS) and the actual movement of the underlying bony structures, which is attributed to soft tissue artefact. In this paper, a computer-aided tracking and motion analysis with ultrasound (CAT & MAUS) system with an augmented globally optimal registration algorithm is presented to dynamically track the underlying bony structure during movement. The augmented registration part of CAT & MAUS was validated with a high system accuracy of 80%. The Euclidean distance between the marker-based bony landmark and the bony landmark tracked by CAT & MAUS was calculated to quantify the measurement error of an MCS caused by soft tissue artefact during movement. The average Euclidean distance between the target bony landmark measured by the CAT & MAUS system and that measured by the MCS alone varied from 8.32 mm to 16.87 mm in gait, indicating the discrepancy between the MCS-measured bony landmark and the actual underlying bony landmark. Moreover, Procrustes analysis was applied to demonstrate that CAT & MAUS reduces the deformation of the body segment shape modeled by markers during motion. The augmented CAT & MAUS system shows its potential to dynamically detect and locate actual underlying bony landmarks, reducing the MCS measurement error caused by soft tissue artefact during movement. Copyright © 2017 Elsevier Ltd. All rights reserved.
Color reproduction for advanced manufacture of soft tissue prostheses.
Xiao, Kaida; Zardawi, Faraedon; van Noort, Richard; Yates, Julian M
2013-11-01
The objectives of this study were to develop a color reproduction system in advanced manufacturing technology for accurate and automatic processing of soft tissue prostheses. A manufacturing protocol was defined to effectively and consistently produce soft tissue prostheses using a 3D printing system. Within this protocol, printer color profiles were developed using a number of mathematical models for the proposed 3D color printing system, based on 240 training colors. On this basis, the color reproduction system was established, and its system errors, including accuracy of color reproduction, color repeatability, and color gamut, were evaluated using 14 known human skin shades. The printer color profile developed using third-order polynomial regression based on least-squares fitting provided the best model performance. The results demonstrated that by using the proposed color reproduction system, 14 different skin colors could be reproduced with excellent color reproduction performance. Evaluation of the system's color repeatability revealed a demonstrable system error, highlighting the need for regular evaluation. The color gamut for the proposed 3D printing system was simulated, demonstrating that the vast majority of skin colors can be reproduced, with the exception of extremely dark or light skin shades. This study demonstrated that the proposed color reproduction system can be effectively used to reproduce a range of human skin colors for application in the advanced manufacture of soft tissue prostheses. Copyright © 2013 Elsevier Ltd. All rights reserved.
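A third-order polynomial characterization of the kind described can be fitted by ordinary least squares. The sketch below maps device RGB to measured CIELAB values using all monomials up to degree three; the training colors are random placeholders rather than the study's 240-color set, whose exact term set is not specified in the abstract:

```python
# Least-squares printer color profile: third-order polynomial mapping from
# device RGB to measured CIELAB. Training colors here are random placeholders.
import numpy as np
from itertools import combinations_with_replacement

def poly_terms(rgb):
    """All monomials of degree 0..3 in (r, g, b): 20 terms in total."""
    terms = [1.0]
    for degree in (1, 2, 3):
        for combo in combinations_with_replacement(range(3), degree):
            terms.append(np.prod([rgb[i] for i in combo]))
    return np.array(terms)

rng = np.random.default_rng(4)
rgb_train = rng.uniform(size=(240, 3))            # placeholder device values
lab_train = rng.uniform(size=(240, 3)) * 100      # placeholder measurements

A = np.array([poly_terms(rgb) for rgb in rgb_train])
coeffs, *_ = np.linalg.lstsq(A, lab_train, rcond=None)   # 20x3 coefficient matrix

lab_predicted = poly_terms(np.array([0.6, 0.4, 0.3])) @ coeffs
print("predicted Lab:", lab_predicted)
```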
Peterman, Robert J; Jiang, Shuying; Johe, Rene; Mukherjee, Padma M
2016-12-01
Dolphin® visual treatment objective (VTO) prediction software is routinely utilized by orthodontists during the treatment planning of orthognathic cases to help predict post-surgical soft tissue changes. Although surgical soft tissue prediction is considered a vital tool, its accuracy is not well understood in two-jaw surgical procedures. The objective of this study was to quantify the accuracy of Dolphin Imaging's VTO soft tissue prediction software on class III patients treated with maxillary advancement and mandibular setback, and to validate the efficacy of the software in such complex cases. This retrospective study analyzed the records of 14 patients treated with comprehensive orthodontics in conjunction with two-jaw orthognathic surgery. Pre- and post-treatment radiographs were traced and superimposed to determine the actual skeletal movements achieved in surgery. This information was then used to simulate surgery in the software and generate a final soft tissue profile prediction. Prediction images were then compared to the actual post-treatment profile photos to determine differences. Dolphin Imaging's software was determined to be accurate within an error range of +/- 2 mm in the X-axis at most landmarks; the lower lip predictions were the least accurate. Clinically, the observed error suggests that the VTO may be used for demonstration and communication with a patient or consulting practitioner. However, Dolphin should not be relied upon for precise treatment planning of surgical movements, and the program should be used with caution to prevent unrealistic patient expectations and dissatisfaction.
An Autonomous Self-Aware and Adaptive Fault Tolerant Routing Technique for Wireless Sensor Networks
Abba, Sani; Lee, Jeong-A
2015-01-01
We propose an autonomous self-aware and adaptive fault-tolerant routing technique (ASAART) for wireless sensor networks. We address the limitations of self-healing routing (SHR) and self-selective routing (SSR) techniques for routing sensor data. We also examine the integration of autonomic self-aware and adaptive fault detection and resiliency techniques for route formation and route repair to provide resilience to errors and failures. We achieved this by using a combined continuous and slotted prioritized transmission back-off delay to obtain local and global network state information, as well as multiple random functions for attaining faster routing convergence and reliable route repair despite transient and permanent node failures, and efficient adaptation to instantaneous network topology changes. Simulations comparing ASAART with the SHR and SSR protocols across five scenarios, in the presence of transient and permanent node failures, show greater resiliency to errors and failures and better routing performance in terms of the number of successfully delivered network packets, end-to-end delay, delivered MAC layer packets, and packet error rate, as well as efficient energy conservation in a highly congested, faulty, and scalable sensor network. PMID:26295236
Kim, Changhwa; Shin, DongHyun
2017-01-01
There are wireless networks in which communications are typically unsafe. Most terrestrial wireless sensor networks belong to this category. Another example of an unsafe communication network is an underwater acoustic sensor network (UWASN). In UWASNs in particular, communication failures occur frequently, and failure durations can range from seconds up to a few hours, days, or even weeks. These communication failures can cause data losses significant enough to seriously damage human life or property, depending on the application area. In this paper, we propose a framework to reduce sensor data loss during communication failures, and we present a formal approach to the Selection by Minimum Error and Pattern (SMEP) method that plays the most important role in reducing sensor data loss under the proposed framework. The SMEP method is compared with other methods to validate its effectiveness through experiments using real-field sensor data sets. Based on our experimental results and performance comparisons, the SMEP method has been shown to outperform others in terms of the average sensor data value error rate caused by sensor data loss. PMID:28498312
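The abstract does not reproduce the formal SMEP definition, but its underlying idea, filling a loss gap with the stored pattern that minimizes error against the samples received around the gap, can be sketched as follows. This is an interpretation for illustration, not the authors' exact algorithm:

```python
# Hedged sketch of pattern-based gap filling: choose, from a library of
# historical patterns, the one best matching the samples flanking a
# communication-failure gap, and use it to reconstruct the lost values.
import numpy as np

def fill_gap(series, gap_start, gap_len, patterns, context=4):
    best, best_err = None, np.inf
    left = series[gap_start - context:gap_start]
    right = series[gap_start + gap_len:gap_start + gap_len + context]
    observed = np.concatenate([left, right])
    for p in patterns:                    # each pattern spans context+gap+context
        candidate = np.concatenate([p[:context], p[context + gap_len:]])
        err = np.mean((candidate - observed) ** 2)
        if err < best_err:
            best, best_err = p, err
    filled = series.copy()
    filled[gap_start:gap_start + gap_len] = best[context:context + gap_len]
    return filled

series = np.array([20.1, 20.3, 20.6, 21.0, np.nan, np.nan, np.nan,
                   22.4, 22.5, 22.6, 22.7])
patterns = [np.linspace(20.0, 22.8, 11), np.full(11, 20.5)]  # historical shapes
print(fill_gap(series, gap_start=4, gap_len=3, patterns=patterns))
```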
Parameters design of the dielectric elastomer spring-roll bending actuator (Conference Presentation)
NASA Astrophysics Data System (ADS)
Li, Jinrong; Liu, Liwu; Liu, Yanju; Leng, Jinsong
2017-04-01
Dielectric elastomers are novel soft smart materials that can undergo large deformation when subjected to an external electric field, which makes them promising materials for actuators. In this paper, a spring-roll actuator that bends when a high voltage is applied was fabricated based on a dielectric elastomer. Using such actuators as active parts, flexible grippers and inchworm-inspired crawling robots were manufactured, demonstrating example applications in soft robotics. To guide the parameter design of dielectric elastomer based spring-roll bending actuators, a theoretical model of such actuators was established based on thermodynamic theory. The initial deformation and electrically induced bending angle of the actuators were formulated. The failure of actuators was also analyzed considering typical failure modes such as electromechanical instability, electrical breakdown, loss of tension, and maximum tolerable stretch; the allowable region of actuators was thus determined. Then the bending angle-voltage relations and failure voltages of actuators with different parameters, including the stretches of the dielectric elastomer film, the number of active layers, and the dimensions of the spring, were investigated. The influence of each parameter on actuator performance is discussed, providing meaningful guidance for the optimal design of spring-roll bending actuators.
Spilker, R L; de Almeida, E S; Donzelli, P S
1992-01-01
This chapter addresses computationally demanding numerical formulations in the biomechanics of soft tissues. The theory of mixtures can be used to represent soft hydrated tissues in the human musculoskeletal system as a two-phase continuum consisting of an incompressible solid phase (collagen and proteoglycan) and an incompressible fluid phase (interstitial water). We first consider the finite deformation of soft hydrated tissues in which the solid phase is represented as hyperelastic. A finite element formulation of the governing nonlinear biphasic equations is presented based on a mixed-penalty approach and derived using the weighted residual method. Fluid and solid phase deformation, velocity, and pressure are interpolated within each element, and the pressure variables within each element are eliminated at the element level. A system of nonlinear, first-order differential equations in the fluid and solid phase deformation and velocity is obtained. In order to solve these equations, the contributions of the hyperelastic solid phase are incrementally linearized, a finite difference rule is introduced for temporal discretization, and an iterative scheme is adopted to achieve equilibrium at the end of each time increment. We demonstrate the accuracy and adequacy of the procedure using a six-node, isoparametric axisymmetric element, and we present an example problem for which an independent numerical solution is available. Next, we present an automated, adaptive environment for the simulation of soft tissue continua in which the finite element analysis is coupled with automatic mesh generation, error indicators, and projection methods. Mesh generation and updating, including both refinement and coarsening, for the two-dimensional examples examined in this study are performed using the finite quadtree approach. The adaptive analysis is based on an error indicator which is the L2 norm of the difference between the finite element solution and a projected finite element solution. Total stress, calculated as the sum of the solid and fluid phase stresses, is used in the error indicator. To allow the finite difference algorithm to proceed in time using an updated mesh, solution values must be transferred to the new nodal locations. This rezoning is accomplished using a projected field for the primary variables. The accuracy and effectiveness of this adaptive finite element analysis are demonstrated using a linear, two-dimensional, axisymmetric problem corresponding to the indentation of a thin sheet of soft tissue. The method is shown to effectively capture the steep gradients and to produce solutions in good agreement with independent, converged numerical solutions.
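The error indicator described, the L2 norm of the difference between the raw finite element field and its smoothed nodal projection, can be illustrated in one dimension. This is a generic projection-based sketch, not the authors' axisymmetric biphasic implementation:

```python
# 1-D illustration of a projection-based error indicator: compare an
# element-wise constant stress field with its nodal (averaged) projection.
# Elements with a large L2 difference would be flagged for refinement.
import numpy as np

nodes = np.linspace(0.0, 1.0, 9)               # 8 elements
elem_stress = np.array([1.0, 1.1, 1.3, 2.6, 2.7, 1.4, 1.2, 1.1])  # raw FE field

# Project to nodes by averaging adjacent element values.
nodal = np.empty(nodes.size)
nodal[0], nodal[-1] = elem_stress[0], elem_stress[-1]
nodal[1:-1] = 0.5 * (elem_stress[:-1] + elem_stress[1:])

for e in range(elem_stress.size):
    h = nodes[e + 1] - nodes[e]
    # Difference between the linear projected field and the constant raw value:
    a, b = nodal[e] - elem_stress[e], nodal[e + 1] - elem_stress[e]
    err_sq = h * (a * a + a * b + b * b) / 3.0   # exact L2^2 of a linear function
    print(f"element {e}: error indicator {np.sqrt(err_sq):.3f}")
```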
Fail Better: Toward a Taxonomy of E-Learning Error
ERIC Educational Resources Information Center
Priem, Jason
2010-01-01
The study of student error, important across many fields of educational research, has begun to attract interest in the field of e-learning, particularly in relation to usability. However, it remains unclear when errors should be avoided (as usability failures) or embraced (as learning opportunities). Many domains have benefited from taxonomies of…
Managing Errors to Reduce Accidents in High Consequence Networked Information Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ganter, J.H.
1999-02-01
Computers have always helped to amplify and propagate errors made by people. The emergence of Networked Information Systems (NISs), which allow people and systems to quickly interact worldwide, has made understanding and minimizing human error more critical. This paper applies concepts from system safety to analyze how hazards (from hackers to power disruptions) penetrate NIS defenses (e.g., firewalls and operating systems) to cause accidents. Such events usually result from both active, easily identified failures and more subtle latent conditions that have resided in the system for long periods. Both active failures and latent conditions result from human errors. We classify these into several types (slips, lapses, mistakes, etc.) and provide NIS examples of how they occur. Next we examine error minimization throughout the NIS lifecycle, from design through operation to reengineering. At each stage, steps can be taken to minimize the occurrence and effects of human errors. These include defensive design philosophies, architectural patterns to guide developers, and collaborative design that incorporates operational experiences and surprises into design efforts. We conclude by looking at three aspects of NISs that will cause continuing challenges in error and accident management: immaturity of the industry, limited risk perception, and resource tradeoffs.
A preliminary taxonomy of medical errors in family practice
Dovey, S; Meyers, D; Phillips, R; Green, L; Fryer, G; Galliher, J; Kappus, J; Grob, P
2002-01-01
Objective: To develop a preliminary taxonomy of primary care medical errors. Design: Qualitative analysis to identify categories of error reported during a randomized controlled trial of computer and paper reporting methods. Setting: The National Network for Family Practice and Primary Care Research. Participants: Family physicians. Main outcome measures: Medical error category, context, and consequence. Results: Forty two physicians made 344 reports: 284 (82.6%) arose from healthcare systems dysfunction; 46 (13.4%) were errors due to gaps in knowledge or skills; and 14 (4.1%) were reports of adverse events, not errors. The main subcategories were: administrative failures (102; 30.9% of errors), investigation failures (82; 24.8%), treatment delivery lapses (76; 23.0%), miscommunication (19; 5.8%), payment systems problems (4; 1.2%), error in the execution of a clinical task (19; 5.8%), wrong treatment decision (14; 4.2%), and wrong diagnosis (13; 3.9%). Most reports were of errors that were recognized and occurred in reporters' practices. Affected patients ranged in age from 8 months to 100 years, were of both sexes, and represented all major US ethnic groups. Almost half the reports were of events which had adverse consequences. Ten errors resulted in patients being admitted to hospital and one patient died. Conclusions: This medical error taxonomy, developed from self-reports of errors observed by family physicians during their routine clinical practice, emphasizes problems in healthcare processes and acknowledges medical errors arising from shortfalls in clinical knowledge and skills. Patient safety strategies with most effect in primary care settings need to be broader than the current focus on medication errors. PMID:12486987
Propagation of measurement accuracy to biomass soft-sensor estimation and control quality.
Steinwandter, Valentin; Zahel, Thomas; Sagmeister, Patrick; Herwig, Christoph
2017-01-01
In biopharmaceutical process development and manufacturing, the online measurement of biomass and derived specific turnover rates is a central task for physiological monitoring and control of the process. However, hard-type sensors such as dielectric spectroscopy, broth fluorescence, or permittivity measurement harbor various disadvantages. Therefore, soft-sensors, which use measurements of the off-gas stream and substrate feed to reconcile turnover rates and provide an online estimate of biomass formation, are smart alternatives. The reconciliation procedure uses mass and energy balances together with accuracy estimates of the measured conversion rates, which have so far been chosen arbitrarily and kept static over the entire process. In this contribution, we present a novel strategy within the soft-sensor framework (named adaptive soft-sensor) to propagate uncertainties from measurements to conversion rates and demonstrate the benefits: under industrially relevant conditions, the errors of the resulting estimated biomass formation rate and specific substrate consumption rate could be decreased by 43% and 64%, respectively, compared to traditional soft-sensor approaches. Moreover, we present a generic workflow to determine the raw signal accuracy required to obtain predefined accuracies of soft-sensor estimates, so that appropriate measurement devices and maintenance intervals can be selected. Furthermore, using this workflow, we demonstrate that the estimation accuracy of the soft-sensor can be additionally and substantially increased.
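The propagation step itself is standard first-order Gaussian error propagation. In the sketch below, the rate expression is an invented placeholder chosen only to show the mechanics; the soft-sensor's actual balancing equations differ:

```python
# First-order (Gaussian) propagation of raw-signal variances to a derived
# rate. The expression r = feed / (volume * signal) is an invented placeholder.
import numpy as np

feed, volume, signal = 0.50, 10.0, 2.0            # measured inputs (arbitrary units)
rate = feed / (volume * signal)

# Partial derivatives of the rate with respect to (feed, volume, signal):
grad = np.array([1.0 / (volume * signal),
                 -feed / (volume ** 2 * signal),
                 -feed / (volume * signal ** 2)])
cov = np.diag([0.01 ** 2, 0.05 ** 2, 0.02 ** 2])  # assumed sensor variances

sigma = np.sqrt(grad @ cov @ grad)                # first-order variance propagation
print(f"rate = {rate:.5f} +/- {sigma:.5f}")
```

Running the same propagation backwards, from a target rate accuracy to the admissible input variances, is essentially the workflow the abstract describes for selecting measurement devices and maintenance intervals.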
Steering without navigation equipment: the lamentable state of Australian health policy reform
2009-01-01
Background: Commentary on health policy reform in Australia often commences with an unstated logical error: Australians' health is good, therefore the Australian health system is good. This possibly explains the disconnect between the options discussed, the areas needing reform, and the generally self-congratulatory tone of the discussion: a good system needs (relatively) minor improvement. Results: This paper comments on some issues of particular concern to Australian health policy makers and some areas needing urgent reform. The two sets of issues do not overlap. It is suggested that there are two fundamental reasons for this. The first is the failure to develop governance structures which promote the identification and resolution of problems according to their importance. The second, related failure is the failure to equip the health services industry with satisfactory navigation equipment - independent research capacity, independent reporting and evaluation - on a scale commensurate with the needs of the country's largest industry. Together, these two failures deprive the health system - as a system - of the chief driver of progress in every successful industry in the 20th century. Conclusion: Concluding comment is made on the National Health and Hospitals Reform Commission (NHHRC), which continued the tradition of largely evidence-free argument and decision making. It failed to identify and properly analyse major system failures, the reasons for them, and the form of governance which would maximise the likelihood of future error learning. The NHHRC itself failed to learn from past policy failures, a key lesson of which is that a major - and possibly the major - obstacle to reform is government itself. The Commission virtually ignored the issue of governance. The endorsement of a monopolised system driven by benevolent managers misses the major lesson of history, which is illustrated by Australia's own failures. PMID:19948044
Examining the Angular Resolution of the Astro-H's Soft X-Ray Telescopes
NASA Technical Reports Server (NTRS)
Sato, Toshiki; Iizuka, Ryo; Ishida, Manabu; Kikuchi, Naomichi; Maeda, Yoshitomo; Kurashima, Sho; Nakaniwa, Nozomi; Tomikawa, Kazuki; Hayashi, Takayuki; Mori, Hideyuki;
2016-01-01
The international x-ray observatory ASTRO-H was renamed Hitomi after launch. It covers a wide energy range from a few hundred eV to 600 keV. It is equipped with two soft x-ray telescopes (SXTs: SXT-I and SXT-S) for imaging the soft x-ray sky up to 12 keV, which focus images onto the respective focal-plane detectors: a CCD camera (SXI) and a calorimeter (SXS). The SXTs are fabricated in quadrant units. The angular resolution in half-power diameter (HPD) of each quadrant of the SXTs ranges between 1.1 and 1.4 arc min at 4.51 keV. It was also found that the HPD of one quadrant has an energy dependence. We examine the angular resolution with spot scan measurements. In order to understand the cause of imaging capability deterioration and to feed it back into future telescope development, we carried out spot scan measurements, in which we illuminated the entire aperture of each quadrant with a square beam 8 mm on a side. Based on the scan results, we made maps of image blurring and focus position. The former and the latter reflect the figure error and the positioning error, respectively, of the foils within the incident 8 mm x 8 mm beam. As a result, we estimated those errors in a quadrant to be approx. 0.9 to 1.0 and approx. 0.6 to 0.9 arc min, respectively. We found that the larger the positioning error in a quadrant, the larger its HPD. The HPD map, which manifests the local image blurring, is very similar from quadrant to quadrant, but the map of the focus position differs from location to location in each telescope. It is also found that this difference in local performance causes the energy dependence of the HPD.
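For reference, the half-power diameter is the diameter of the circle enclosing half of the focused flux. Given a radial encircled-energy profile, it can be found by interpolation, as in this generic sketch (the profile below is invented, not Hitomi data):

```python
# Half-power diameter from a radial encircled-energy (EE) curve: find the
# radius where EE crosses 0.5 and double it. Profile values are invented.
import numpy as np

radii = np.linspace(0.0, 3.0, 61)             # arcmin
ee = 1.0 - np.exp(-radii / 0.55)              # fake encircled-energy profile
ee /= ee[-1]                                  # normalize to total collected flux

r_half = np.interp(0.5, ee, radii)            # EE is monotonic in radius
print(f"HPD = {2 * r_half:.2f} arcmin")
```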
Neural Network and Regression Methods Demonstrated in the Design Optimization of a Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Hopkins, Dale A.; Lavelle, Thomas M.; Patnaik, Surya
2003-01-01
The neural network and regression methods of NASA Glenn Research Center's COMETBOARDS design optimization testbed were used to generate approximate analysis and design models for a subsonic aircraft operating at Mach 0.85 cruise speed. The analytical model is defined by nine design variables: wing aspect ratio, engine thrust, wing area, sweep angle, chord-thickness ratio, turbine temperature, pressure ratio, bypass ratio, and fan pressure; and eight response parameters: weight, landing velocity, takeoff and landing field lengths, approach thrust, overall efficiency, and compressor pressure and temperature. The variables were adjusted to optimally balance the engines to the airframe. The solution strategy included a sensitivity model and a soft analysis model. Researchers generated the sensitivity model by training the approximators to predict an optimum design. The trained neural network predicted all response variables within 5-percent error; this was reduced to 1 percent by the regression method. The soft analysis model was developed to replace aircraft analysis as the reanalyzer in design optimization. Soft models were generated for a neural network method, a regression method, and a hybrid method obtained by combining the approximators. The performance of the models is graphed for aircraft weight versus thrust as well as for wing area and turbine temperature. The regression method followed the analytical solution with little error. The neural network exhibited 5-percent maximum error over all parameters. Performance of the hybrid method was intermediate in comparison to the individual approximators. Error in the response variable is smaller than that shown in the figure because of a distortion scale factor. The overall performance of the approximators was considered satisfactory because aircraft analysis with NASA Langley Research Center's FLOPS (Flight Optimization System) code is a synthesis of diverse disciplines: weight estimation, aerodynamic analysis, engine cycle analysis, propulsion data interpolation, mission performance, airfield length for landing and takeoff, noise footprint, and others.
NASA Astrophysics Data System (ADS)
Ackermann, M.; Ajello, M.; Allafort, A.; Atwood, W. B.; Baldini, L.; Barbiellini, G.; Bastieri, D.; Bechtol, K.; Bellazzini, R.; Bhat, P. N.; Blandford, R. D.; Bonamente, E.; Borgland, A. W.; Bregeon, J.; Briggs, M. S.; Brigida, M.; Bruel, P.; Buehler, R.; Burgess, J. M.; Buson, S.; Caliandro, G. A.; Cameron, R. A.; Casandjian, J. M.; Cecchi, C.; Charles, E.; Chekhtman, A.; Chiang, J.; Ciprini, S.; Claus, R.; Cohen-Tanugi, J.; Connaughton, V.; Conrad, J.; Cutini, S.; Dennis, B. R.; de Palma, F.; Dermer, C. D.; Digel, S. W.; Silva, E. do Couto e.; Drell, P. S.; Drlica-Wagner, A.; Dubois, R.; Favuzzi, C.; Fegan, S. J.; Ferrara, E. C.; Fortin, P.; Fukazawa, Y.; Fusco, P.; Gargano, F.; Germani, S.; Giglietto, N.; Giordano, F.; Giroletti, M.; Glanzman, T.; Godfrey, G.; Grillo, L.; Grove, J. E.; Gruber, D.; Guiriec, S.; Hadasch, D.; Hayashida, M.; Hays, E.; Horan, D.; Iafrate, G.; Jóhannesson, G.; Johnson, A. S.; Johnson, W. N.; Kamae, T.; Kippen, R. M.; Knödlseder, J.; Kuss, M.; Lande, J.; Latronico, L.; Longo, F.; Loparco, F.; Lott, B.; Lovellette, M. N.; Lubrano, P.; Mazziotta, M. N.; McEnery, J. E.; Meegan, C.; Mehault, J.; Michelson, P. F.; Mitthumsiri, W.; Monte, C.; Monzani, M. E.; Morselli, A.; Moskalenko, I. V.; Murgia, S.; Murphy, R.; Naumann-Godo, M.; Nuss, E.; Nymark, T.; Ohno, M.; Ohsugi, T.; Okumura, A.; Omodei, N.; Orlando, E.; Paciesas, W. S.; Panetta, J. H.; Parent, D.; Pesce-Rollins, M.; Petrosian, V.; Pierbattista, M.; Piron, F.; Pivato, G.; Poon, H.; Porter, T. A.; Preece, R.; Rainò, S.; Rando, R.; Razzano, M.; Razzaque, S.; Reimer, A.; Reimer, O.; Ritz, S.; Sbarra, C.; Schwartz, R. A.; Sgrò, C.; Share, G. H.; Siskind, E. J.; Spinelli, P.; Takahashi, H.; Tanaka, T.; Tanaka, Y.; Thayer, J. B.; Tibaldo, L.; Tinivella, M.; Tolbert, A. K.; Tosti, G.; Troja, E.; Uchiyama, Y.; Usher, T. L.; Vandenbroucke, J.; Vasileiou, V.; Vianello, G.; Vitale, V.; von Kienlin, A.; Waite, A. P.; Wilson-Hodge, C.; Wood, D. L.; Wood, K. S.; Yang, Z.
2012-04-01
Due to an error at the publisher, the times given for the major tick marks in the X-axis in Figure 1 of the published article are incorrect. The correctly labeled times should be "00:52:00," "00:54:00," ... , and "01:04:00." The correct version of Figure 1 and its caption is shown below. IOP Publishing sincerely regrets this error.
Investigating the impact of spatial priors on the performance of model-based IVUS elastography
Richards, M S; Doyley, M M
2012-01-01
This paper describes methods that provide prerequisite information for computing circumferential stress in modulus elastograms recovered from vascular tissue—information that could help cardiologists detect life-threatening plaques and predict their propensity to rupture. The modulus recovery process is an ill-posed problem; therefore additional information is needed to produce useful elastograms. In this work, prior geometrical information was used to impose hard or soft constraints on the reconstruction process. We conducted simulation and phantom studies to evaluate and compare modulus elastograms computed with soft and hard constraints versus those computed without any prior information. The results revealed that (1) the contrast-to-noise ratio of modulus elastograms achieved using the soft prior and hard prior reconstruction methods exceeded that of those computed without any prior information; (2) the soft prior and hard prior reconstruction methods could tolerate up to 8% measurement noise; and (3) the performance of soft and hard prior modulus elastograms degraded when incomplete spatial priors were employed. This work demonstrates that including spatial priors in the reconstruction process should improve the performance of model-based elastography, and the soft prior approach should enhance the robustness of the reconstruction process to errors in the geometrical information. PMID:22037648
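The soft-prior idea, penalizing modulus variation within the geometrically segmented regions rather than enforcing homogeneity exactly, can be illustrated with a linearized least-squares sketch. The forward matrix, region labels, and weight below are hypothetical placeholders, not the authors' finite-element reconstruction.

```python
# Minimal sketch of soft-prior regularization for a linearized inverse problem:
# minimize ||A m - d||^2 + alpha * ||L m||^2, where L penalizes modulus
# differences inside each prior-segmented region. Hypothetical toy problem.
import numpy as np

rng = np.random.default_rng(1)
n = 30                                  # number of modulus unknowns (toy)
A = rng.normal(size=(60, n))            # linearized forward operator (placeholder)
m_true = np.where(np.arange(n) < 15, 1.0, 3.0)   # two-region modulus map
d = A @ m_true + 0.05 * rng.normal(size=60)      # noisy "measurements"

labels = np.where(np.arange(n) < 15, 0, 1)       # prior geometry (two regions)
rows = []
for r in (0, 1):                        # build L: intra-region difference pairs
    idx = np.flatnonzero(labels == r)
    for i, j in zip(idx[:-1], idx[1:]):
        row = np.zeros(n)
        row[i], row[j] = 1.0, -1.0
        rows.append(row)
L = np.array(rows)

alpha = 1.0                             # soft-prior weight (tunable)
m_hat = np.linalg.solve(A.T @ A + alpha * L.T @ L, A.T @ d)
print(np.round(m_hat[:5], 2), np.round(m_hat[-5:], 2))
```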
Giuliani, Manuel; Mirnig, Nicole; Stollnberger, Gerald; Stadler, Susanne; Buchner, Roland; Tscheligi, Manfred
2015-01-01
Human–robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human–robot interaction experiments. For that, we analyzed 201 videos of five human–robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction. Technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but they often smile, when in an error situation with the robot. Another result is that the participants sometimes stop moving at the beginning of error situations. We also found that the participants talked more in the case of social norm violations and less during technical failures. Finally, the participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking), when they are interacting with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human–robot interaction systems. The builders need to consider including modules for recognition and classification of head movements to the robot input channels. The evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies. PMID:26217266
Payne, Christopher J; Wamala, Isaac; Abah, Colette; Thalhofer, Thomas; Saeed, Mossab; Bautista-Salinas, Daniel; Horvath, Markus A; Vasilyev, Nikolay V; Roche, Ellen T; Pigula, Frank A; Walsh, Conor J
2017-09-01
Soft robotic devices have significant potential for medical device applications that warrant safe synergistic interaction with humans. This article describes the optimization of an implantable soft robotic system for heart failure whereby soft actuators wrapped around the ventricles are programmed to contract and relax in synchrony with the beating heart. Elastic elements integrated into the soft actuators provide recoiling function so as to aid refilling during the diastolic phase of the cardiac cycle. Improved synchronization with the biological system is achieved by incorporating the native ventricular pressure into the control system to trigger assistance and synchronize the device with the heart. A three-state electro-pneumatic valve configuration allows the actuators to contract at different rates to vary contraction patterns. An in vivo study was performed to test three hypotheses relating to mechanical coupling and temporal synchronization of the actuators and heart. First, that adhesion of the actuators to the ventricles improves cardiac output. Second, that there is a contraction-relaxation ratio of the actuators which generates optimal cardiac output. Third, that the rate of actuator contraction is a factor in cardiac output.
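The pressure-triggered synchronization described above can be caricatured as a threshold-driven state machine. The trigger level, state durations, and pressure waveform below are invented for illustration and are not the device's parameters.

```python
# Toy sketch of pressure-triggered assistance: a rising ventricular pressure
# crossing a threshold starts a contract/vent/hold valve cycle. The threshold,
# durations, and synthetic waveform are invented, not the device's parameters.
import math

TRIGGER_MMHG = 10.0                 # hypothetical trigger threshold
CONTRACT_S, VENT_S = 0.25, 0.35     # hypothetical state durations

def valve_state(t_since_trigger):
    """Three-state electro-pneumatic valve schedule after a trigger."""
    if t_since_trigger < CONTRACT_S:
        return "pressurize"         # actuators contract with the ventricle
    if t_since_trigger < CONTRACT_S + VENT_S:
        return "vent"               # elastic recoil aids diastolic refilling
    return "hold"

t_trigger, armed = None, True
for k in range(200):                # 2 s of simulated 100 Hz pressure samples
    t = k / 100.0
    pressure = 55 * max(0.0, math.sin(2 * math.pi * 1.2 * t))  # fake waveform
    if armed and pressure > TRIGGER_MMHG:
        t_trigger, armed = t, False      # rising edge: start an assist cycle
    if not armed:
        state = valve_state(t - t_trigger)
        if state == "hold" and pressure < TRIGGER_MMHG:
            armed = True                 # cycle finished; re-arm for next beat
```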
Regenbogen, Scott E; Greenberg, Caprice C; Studdert, David M; Lipsitz, Stuart R; Zinner, Michael J; Gawande, Atul A
2007-11-01
To identify the most prevalent patterns of technical errors in surgery, and evaluate commonly recommended interventions in light of these patterns. The majority of surgical adverse events involve technical errors, but little is known about the nature and causes of these events. We examined characteristics of technical errors and common contributing factors among closed surgical malpractice claims. Surgeon reviewers analyzed 444 randomly sampled surgical malpractice claims from four liability insurers. Among 258 claims in which injuries due to error were detected, 52% (n = 133) involved technical errors. These technical errors were further analyzed with a structured review instrument designed by qualitative content analysis. Forty-nine percent of the technical errors caused permanent disability; an additional 16% resulted in death. Two-thirds (65%) of the technical errors were linked to manual error, 9% to errors in judgment, and 26% to both manual and judgment error. A minority of technical errors involved advanced procedures requiring special training ("index operations"; 16%), surgeons inexperienced with the task (14%), or poorly supervised residents (9%). The majority involved experienced surgeons (73%), and occurred in routine, rather than index, operations (84%). Patient-related complexities, including emergencies, difficult or unexpected anatomy, and previous surgery, contributed to 61% of technical errors, and technology or systems failures contributed to 21%. Most technical errors occur in routine operations with experienced surgeons, under conditions of increased patient complexity or systems failure. Commonly recommended interventions, including restricting high-complexity operations to experienced surgeons, additional training for inexperienced surgeons, and stricter supervision of trainees, are likely to address only a minority of technical errors. Surgical safety research should instead focus on improving decision-making and performance in routine operations for complex patients and circumstances.
Malpractice claims related to musculoskeletal imaging. Incidence and anatomical location of lesions.
Fileni, Adriano; Fileni, Gaia; Mirk, Paoletta; Magnavita, Giulia; Nicoli, Marzia; Magnavita, Nicola
2013-12-01
Failure to detect lesions of the musculoskeletal system is a frequent cause of malpractice claims against radiologists. We examined all the malpractice claims related to alleged errors in musculoskeletal imaging filed against Italian radiologists over a period of 14 years (1993-2006). During the period considered, a total of 416 claims for alleged diagnostic errors relating to the musculoskeletal system were filed against radiologists; of these, 389 (93.5%) concerned failure to report fractures, and 15 (3.6%) failure to diagnose a tumour. Incorrect interpretation of bone pathology is among the most common causes of litigation against radiologists; alone, it accounts for 36.4% of all malpractice claims filed during the observation period. Awareness of this risk should encourage extreme caution and diligence.
Performance analysis of the word synchronization properties of the outer code in a TDRSS decoder
NASA Technical Reports Server (NTRS)
Costello, D. J., Jr.; Lin, S.
1984-01-01
A self-synchronizing coding scheme for NASA's TDRSS satellite system is a concatenation of a (2,1,7) inner convolutional code with a (255,223) Reed-Solomon outer code. Both symbol and word synchronization are achieved without requiring that any additional symbols be transmitted. An important parameter which determines the performance of the word sync procedure is the ratio of the decoding failure probability to the undetected error probability. Ideally, the former should be as small as possible compared to the latter when the error correcting capability of the code is exceeded. A computer simulation of a (255,223) Reed-Solomon code was carried out. Results for decoding failure probability and for undetected error probability are tabulated and compared.
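For intuition, the dominant effect can be approximated with a simple binomial model: a bounded-distance decoder for the (255,223) code corrects up to t = 16 symbol errors, so the probability that an error pattern exceeds this capability (leading to either decoding failure or, much more rarely, undetected error) is a binomial tail. A rough sketch, assuming independent symbol errors:

```python
# Rough binomial estimate of bounded-distance decoding behavior for the
# (255,223) Reed-Solomon code, which corrects t = 16 symbol errors.
# Assumes independent symbol errors; the split between decoding failure and
# the (rare) undetected-error events is not modeled here.
from math import comb

N, T = 255, 16

def p_beyond_t(ps):
    """P(more than T of N symbols in error) for symbol error rate ps."""
    return 1.0 - sum(comb(N, k) * ps**k * (1 - ps)**(N - k) for k in range(T + 1))

for ps in (0.01, 0.02, 0.05):
    print(f"ps={ps}: P(decoder cannot correct) ~ {p_beyond_t(ps):.3e}")
```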
NASA Technical Reports Server (NTRS)
Diorio, Kimberly A.; Voska, Ned (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: Describe mission; Define system; Identify human-machine; List human actions; Identify potential errors; Identify factors that affect error; Determine likelihood of error; Determine potential effects of errors; Evaluate risk; Generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.
The Forced Soft Spring Equation
ERIC Educational Resources Information Center
Fay, T. H.
2006-01-01
Through numerical investigations, this paper studies examples of the forced Duffing-type spring equation with [epsilon] negative. By performing trial-and-error numerical experiments, the existence of stability boundaries in the phase plane, indicating initial conditions that yield bounded solutions, is demonstrated. Subharmonic boundaries are…
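A trial-and-error numerical experiment of the kind described can be reproduced in a few lines; the damping, forcing, and [epsilon] values below are arbitrary illustrative choices.

```python
# Numerical experiment on the forced soft spring (Duffing-type) equation
#   x'' + c x' + x + eps x^3 = F cos(w t),   eps < 0  (soft spring).
# Parameter and initial-condition choices are arbitrary illustrations; scanning
# initial conditions in a loop is how the stability boundaries would be mapped.
import numpy as np
from scipy.integrate import solve_ivp

c, eps, F, w = 0.2, -0.2, 0.2, 1.3

def rhs(t, y):
    x, v = y
    return [v, -c * v - x - eps * x**3 + F * np.cos(w * t)]

sol = solve_ivp(rhs, (0, 200), [0.5, 0.0], max_step=0.05)
x = sol.y[0]
bounded = np.all(np.isfinite(x)) and np.max(np.abs(x)) < 10
print("bounded solution" if bounded else "unbounded solution")
```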
ERIC Educational Resources Information Center
Alamin, Abdulamir; Ahmed, Sawsan
2012-01-01
Analyzing errors committed by second language learners during their first year of study at the University of Taif can offer insights into and knowledge of the learners' difficulties in acquiring technical English communication. With reference to the errors analyzed, the researcher found that the learners' failure to understand basic English grammar…
New developments in ALFT's soft x-ray point sources
NASA Astrophysics Data System (ADS)
Cintron, Dario F.; Guo, Xiaoming; Xu, Meisheng; Ye, Rubin; Antoshko, Yuriy; Antoshko, Yuriy; Drew, Steve; Philippe, Albert; Panarella, Emilio
2002-07-01
The new developments in ALFT's soft X-ray point source VSX-400 consist mainly of an improved nozzle design to reduce the source size, the introduction of a novel trigger system capable of triggering the discharge hundreds of millions of times without failure, and a debris removal system. Continuous operation for 8 hours at 20 kHz allows us to achieve 400 mW of useful soft X-ray radiation around a 1 nm wavelength. In another regime of operation, with a high-energy machine, the VSX-Z, we have been able to consistently achieve 10 J of X-rays per pulse at a repetition rate that can reach 1 Hz, with an input electrical energy of approximately 3 kJ and an efficiency in excess of 10^-3.
Tully, Mary P; Ashcroft, Darren M; Dornan, Tim; Lewis, Penny J; Taylor, David; Wass, Val
2009-01-01
Prescribing errors are common; they result in adverse events and harm to patients, and it is unclear how best to prevent them because recommendations are more often based on surmised rather than empirically collected data. The aim of this systematic review was to identify all informative published evidence concerning the causes of and factors associated with prescribing errors in specialist and non-specialist hospitals, collate it, analyse it qualitatively and synthesize conclusions from it. Seven electronic databases were searched for articles published between 1985 and July 2008. The reference lists of all informative studies were searched for additional citations. To be included, a study had to be of handwritten prescriptions for adult or child inpatients that reported empirically collected data on the causes of or factors associated with errors. Publications in languages other than English and studies that evaluated errors for only one disease, one route of administration or one type of prescribing error were excluded. Seventeen papers reporting 16 studies, selected from 1268 papers identified by the search, were included in the review. Studies from the US and the UK in university-affiliated hospitals predominated (10/16 [62%]). The definition of a prescribing error varied widely and the included studies were highly heterogeneous. Causes were grouped according to Reason's model of accident causation into active failures, error-provoking conditions and latent conditions. The active failure most frequently cited was a mistake due to inadequate knowledge of the drug or the patient. Skills-based slips and memory lapses were also common. Where error-provoking conditions were reported, there was at least one per error. These included lack of training or experience, fatigue, stress, high workload for the prescriber and inadequate communication between healthcare professionals. Latent conditions included reluctance to question senior colleagues and inadequate provision of training. Prescribing errors are often multifactorial, with several active failures and error-provoking conditions often acting together to cause them. In the face of such complexity, solutions addressing a single cause, such as lack of knowledge, are likely to have only limited benefit. Further rigorous study, seeking potential ways of reducing error, needs to be conducted. Multifactorial interventions across many parts of the system are likely to be required.
Soft X-Ray Exposure Testing of FEP Teflon for the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
deGroh, Kim K.
1998-01-01
The FEP Teflon (DuPont) multilayer insulation (MLI) thermal-control blanket material on the Hubble Space Telescope is degrading in the space environment. During the first Hubble servicing mission in 1993, after 3.6 years in low Earth orbit, aluminized and silvered FEP Teflon MLI thermal-control blanket materials were retrieved. These materials have been jointly analyzed by the NASA Lewis Research Center and the NASA Goddard Space Flight Center for degradation induced in the space environment (ref. 1). Solar-facing blanket materials were found to be embrittled with through-the-thickness cracking in the 5-mil FEP. During the second Hubble servicing mission in 1997, astronauts noticed that several blankets had large areas with tears. The torn FEP was curled up in some areas, exposing the underlying materials to the space environment. This tearing problem, and the associated curling up of torn areas, could lead to over-heating of the telescope and to particulate contamination. A Hubble Space Telescope MLI Failure Review Board was assembled by Goddard to investigate and identify the degradation mechanism of the FEP, to identify and characterize replacement materials, and to estimate the extent of damage at the time of the third servicing mission in 1999. A small piece of FEP retrieved during the second servicing mission is being evaluated by this failure review board along with materials from the first servicing mission. Since the first servicing mission, and as part of the failure review board, Lewis has been exposing FEP to soft x-rays to help determine the damage mechanisms of FEP in the space environment. Soft x-rays, which can penetrate into the bulk of FEP, are generated during solar flares and appear to be contributing to the degradation of the Hubble MLI.
Development of Biological Acoustic Impedance Microscope and its Error Estimation
NASA Astrophysics Data System (ADS)
Hozumi, Naohiro; Nakano, Aiko; Terauchi, Satoshi; Nagao, Masayuki; Yoshida, Sachiko; Kobayashi, Kazuto; Yamamoto, Seiji; Saijo, Yoshifumi
This report deals with a scanning acoustic microscope for imaging the cross-sectional acoustic impedance of biological soft tissues. A focused acoustic beam was transmitted into the tissue object mounted on the "rear surface" of a plastic substrate. Rat cerebellum tissue and a reference material were observed at the same time under the same conditions. As the incidence is not vertical, not only a longitudinal wave but also a transverse wave is generated in the substrate. The error in the acoustic impedance obtained by assuming vertical incidence was estimated. It was shown that the error can be precisely compensated if the beam pattern and the acoustic parameters of the coupling medium and substrate are known.
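At vertical incidence the target impedance follows from the calibrated reflection coefficient at the substrate-tissue interface; the sketch below shows that baseline calculation (the paper's contribution, correcting for oblique incidence and mode conversion, is not reproduced here). All numerical values are placeholders.

```python
# Baseline (vertical-incidence) acoustic impedance estimate by comparison with
# a reference material, as in substrate-coupled impedance microscopy.
# R = (Z_target - Z_sub) / (Z_target + Z_sub)  =>  Z_target = Z_sub (1+R)/(1-R)
# Echo amplitudes and impedance values are placeholders.
Z_SUB = 3.2e6    # plastic substrate impedance [kg m^-2 s^-1] (assumed)
Z_REF = 1.5e6    # known reference (water-like) impedance (assumed)

def target_impedance(s_target, s_ref):
    """Scale the target echo by the reference echo to remove system response."""
    r_ref = (Z_REF - Z_SUB) / (Z_REF + Z_SUB)
    r = (s_target / s_ref) * r_ref          # calibrated reflection coefficient
    return Z_SUB * (1 + r) / (1 - r)

print(f"{target_impedance(0.8, 1.0):.3e}")  # tissue slightly stiffer than ref
```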
Microcircuit radiation effects databank
NASA Technical Reports Server (NTRS)
1983-01-01
This databank is the collation of radiation test data submitted by many testers and serves as a reference for engineers who are concerned with and have some knowledge of the effects of the natural radiation environment on microcircuits. It contains radiation sensitivity results from ground tests and is divided into two sections. Section A lists total dose damage information, and section B lists single event upset cross sections, i.e., the probability of a soft error (bit flip) or of a hard error (latchup).
Unusual course of infective endocarditis: acute renal failure progressing to chronic renal failure.
Sevinc, Alper; Davutoglu, Vedat; Barutcu, Irfan; Kocoglu, M Esra
2006-04-01
Infective endocarditis is an infection of the endocardium that usually involves the valves and adjacent structures. The classical fever-of-unknown-origin presentation represents a minority of infective endocarditis cases. The presented case was a 21-year-old woman presenting to the emergency room with acute renal failure and fever. Cardiac auscultation revealed a soft S1 and a 4/6 apical holosystolic murmur extending to the axilla. Echocardiography showed mobile fresh vegetation under the mitral posterior leaflet. She was diagnosed as having infective endocarditis. Hemodialysis was started along with antimicrobial therapy. However, because of the presence of severe mitral regurgitation with left ventricle dilatation and large mobile vegetation, mitral prosthetic mechanical valve replacement was performed. Although treated with antibiotics combined with surgery, renal function deteriorated and progressed to chronic renal failure.
Error, blame, and the law in health care--an antipodean perspective.
Runciman, William B; Merry, Alan F; Tito, Fiona
2003-06-17
Patients are frequently harmed by problems arising from the health care process itself. Addressing these problems requires understanding the role of errors, violations, and system failures in their genesis. Problem-solving is inhibited by a tendency to blame those involved, often inappropriately. This has been aggravated by the need to attribute blame before compensation can be obtained through tort and the human failing of attributing blame simply because there has been a serious outcome. Blaming and punishing for errors that are made by well-intentioned people working in the health care system drives the problem of iatrogenic harm underground and alienates people who are best placed to prevent such problems from recurring. On the other hand, failure to assign blame when it is due is also undesirable and erodes trust in the medical profession. Understanding the distinction between blameworthy behavior and inevitable human errors and appreciating the systemic factors that underlie most failures in complex systems are essential for the response to a harmed patient to be informed, fair, and effective in improving safety. It is important to meet society's needs to blame and exact retribution when appropriate. However, this should not be a prerequisite for compensation, which should be appropriately structured, fair, timely, and, ideally, properly funded as an intrinsic part of health care and social security systems.
Reconfigurable Control with Neural Network Augmentation for a Modified F-15 Aircraft
NASA Technical Reports Server (NTRS)
Burken, John J.; Williams-Hayes, Peggy; Kaneshige, John T.; Stachowiak, Susan J.
2006-01-01
Description of the performance of a simplified dynamic inversion controller with neural network augmentation follows. Simulation studies focus on the results with and without neural network adaptation through the use of an F-15 aircraft simulator that has been modified to include canards. Simulated control law performance with a surface failure, in addition to an aerodynamic failure, is presented. The aircraft, with adaptation, attempts to minimize the inertial cross-coupling effect of the failure (a control derivative anomaly associated with a jammed control surface). The dynamic inversion controller calculates necessary surface commands to achieve desired rates. The dynamic inversion controller uses approximate short period and roll axis dynamics. The yaw axis controller is a sideslip rate command system. Methods are described to reduce the cross-coupling effect and maintain adequate tracking errors for control surface failures. The aerodynamic failure destabilizes the pitching moment due to angle of attack. The results show that control of the aircraft with the neural networks is easier (more damped) than without the neural networks. Simulation results show neural network augmentation of the controller improves performance with aerodynamic and control surface failures in terms of tracking error and cross-coupling reduction.
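The core of a dynamic inversion rate controller is solving the moment equation for the surface command, with the adaptive term absorbing model error. A scalar pitch-axis caricature with invented coefficients, and a crude integrator standing in for the neural network, might look like this:

```python
# Scalar caricature of dynamic inversion with adaptive augmentation: solve the
# modeled pitch dynamics  qdot = M_alpha*alpha + M_q*q + M_delta*delta  for the
# elevator command delta, with an adaptive bias standing in for the neural
# network. All coefficients are invented.
M_ALPHA, M_Q, M_DELTA = -4.0, -1.0, -8.0    # toy pitch-moment derivatives

def inversion_command(qdot_des, alpha, q, adaptive_term=0.0):
    """Surface deflection that makes the model produce qdot_des."""
    qdot_model = M_ALPHA * alpha + M_Q * q
    return (qdot_des - qdot_model - adaptive_term) / M_DELTA

class AdaptiveBias:
    """Crude stand-in for the NN augmentation: integrate the tracking error."""
    def __init__(self, gain=0.5):
        self.gain, self.w = gain, 0.0

    def update(self, tracking_error, dt=0.02):
        self.w += self.gain * tracking_error * dt
        return self.w

nn = AdaptiveBias()
delta = inversion_command(qdot_des=0.1, alpha=0.05, q=0.02,
                          adaptive_term=nn.update(tracking_error=0.01))
print(f"elevator command: {delta:.4f} rad")
```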
NASA Technical Reports Server (NTRS)
Morrell, Frederick R.; Bailey, Melvin L.
1987-01-01
A vector-based failure detection and isolation technique for a skewed array of two-degree-of-freedom inertial sensors is developed. Failure detection is based on comparison of parity equations with a threshold, and isolation is based on comparison of logic variables which are keyed to pass/fail results of the parity test. A multi-level approach to failure detection is used to ensure adequate coverage for the flight control, display, and navigation avionics functions. Sensor error models are introduced to expose the susceptibility of the parity equations to sensor errors and physical separation effects. The algorithm is evaluated in a simulation of a commercial transport operating in a range of light to severe turbulence environments. A bias-jump failure level of 0.2 deg/hr was detected and isolated properly in the light and moderate turbulence environments, but not detected in the extreme turbulence environment. An accelerometer bias-jump failure level of 1.5 milli-g was detected over all turbulence environments. For both types of inertial sensor, hard-over and null-type failures were detected in all environments without incident. The algorithm functioned without false alarm or false isolation over all turbulence environments for the runs tested.
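The parity test underlying the detection step can be sketched for a generic redundant array: rows of a matrix V spanning the left null space of the sensor geometry H produce residuals that are insensitive to the true state and respond only to sensor errors. The geometry, noise level, and threshold below are arbitrary examples, not the paper's skewed two-degree-of-freedom configuration.

```python
# Parity-equation failure detection for a redundant sensor array.
# m = H x + b: with V chosen so that V H = 0, the parity vector p = V m
# depends only on the error b, not on the true state x.
import numpy as np

H = np.array([[1.0, 0.0, 0.0],     # measurement directions of 5 single-axis
              [0.0, 1.0, 0.0],     # sensors (arbitrary skewed example)
              [0.0, 0.0, 1.0],
              [0.577, 0.577, 0.577],
              [0.577, -0.577, 0.577]])

# Left null space of H via SVD: rows of V satisfy V @ H ~ 0.
U, s, Vt = np.linalg.svd(H)
V = U[:, 3:].T                     # 5 sensors - 3 states = 2 parity equations

rng = np.random.default_rng(2)
x_true = np.array([0.1, -0.2, 0.05])
m = H @ x_true + 1e-4 * rng.normal(size=5)
m[3] += 0.05                       # inject a bias-jump failure on sensor 4

p = V @ m
THRESH = 0.01                      # detection threshold (tuned to noise level)
print("failure detected" if np.linalg.norm(p) > THRESH else "no failure")
```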
Application of failure mode and effect analysis in an assisted reproduction technology laboratory.
Intra, Giulia; Alteri, Alessandra; Corti, Laura; Rabellotti, Elisa; Papaleo, Enrico; Restelli, Liliana; Biondo, Stefania; Garancini, Maria Paola; Candiani, Massimo; Viganò, Paola
2016-08-01
Assisted reproduction technology laboratories have a very high degree of complexity. Mismatches of gametes or embryos can occur, with catastrophic consequences for patients. To minimize the risk of error, a multi-institutional working group applied failure mode and effects analysis (FMEA) to each critical activity/step as a method of risk assessment. This analysis led to the identification of the potential failure modes, together with their causes and effects, using the risk priority number (RPN) scoring system. In total, 11 individual steps and 68 different potential failure modes were identified. The highest ranked failure modes, with an RPN score of 25, encompassed 17 failures and pertained to "patient mismatch" and "biological sample mismatch". The maximum reduction in risk, with RPN reduced from 25 to 5, was mostly related to the introduction of witnessing. The critical failure modes in sample processing were improved by 50% in the RPN by focusing on staff training. Three indicators of FMEA success, based on technical skill, competence and traceability, have been evaluated after FMEA implementation. Witnessing by a second human operator should be introduced in the laboratory to avoid sample mix-ups. These findings confirm that FMEA can effectively reduce errors in assisted reproduction technology laboratories. Copyright © 2016 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
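The RPN scoring used in FMEA is simply the product of severity, occurrence, and detection ratings. A minimal ranking sketch follows; the failure modes and ratings are made up, with 1-5 scales so that 25 is a plausible top score as in the study.

```python
# Minimal FMEA risk-priority-number ranking: RPN = S * O * D, each rated 1-5
# here (some FMEAs use 1-10). Failure modes and ratings are invented.
failure_modes = [
    ("patient mismatch at oocyte collection", 5, 1, 5),
    ("biological sample mismatch at ICSI",    5, 1, 5),
    ("mislabeled cryo-straw",                 4, 2, 3),
]
ranked = sorted(failure_modes, key=lambda f: f[1] * f[2] * f[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN={s * o * d:3d}  {name}")
```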
Creep and fracture of a model yoghurt
NASA Astrophysics Data System (ADS)
Manneville, Sebastien; Leocmach, Mathieu; Perge, Christophe; Divoux, Thibaut
2014-11-01
Biomaterials such as protein or polysaccharide gels are known to behave qualitatively as soft solids and to rupture under an external load. Combining optical and ultrasonic imaging with shear rheology, we show that the failure scenario of a model yoghurt, namely a casein gel, is reminiscent of brittle solids: after a primary creep regime characterized by a macroscopically homogeneous deformation and a power-law behavior whose exponent is fully accounted for by linear viscoelasticity, fractures nucleate and grow logarithmically, perpendicular to the shear direction, up to the sudden rupture of the gel. A single equation accounting for those two successive processes nicely captures the full rheological response. The failure time follows a decreasing power law of the applied shear stress, similar to the Basquin law of fatigue for solids. These results are in excellent agreement with recent fiber-bundle models that include damage accumulation on elastic fibers and exemplify protein gels as model, brittle-like soft solids. Work funded by the European Research Council under Grant Agreement No. 258803.
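The Basquin-like relation between failure time and applied stress, t_f proportional to sigma^(-beta), is a straight line in log-log coordinates and can be fitted directly; the data points below are synthetic.

```python
# Fit a Basquin-like power law t_f = A * sigma**(-beta) to creep-rupture data
# by linear regression in log-log coordinates. Synthetic data for illustration.
import numpy as np

sigma = np.array([200.0, 300.0, 450.0, 600.0])     # applied stress [Pa] (toy)
t_f = np.array([8000.0, 1500.0, 260.0, 70.0])      # failure times [s] (toy)

slope, log_A = np.polyfit(np.log(sigma), np.log(t_f), 1)
print(f"beta = {-slope:.2f}, A = {np.exp(log_A):.3g}")
```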
Clinical Presentation of Soft-tissue Infections and its Management: A Study of 100 Cases.
Singh, Baldev; Singh, Sukha; Khichy, Sudhir; Ghatge, Avinash
2017-01-01
Soft-tissue infections vary widely in their nature and severity. A clear approach to management must allow their rapid identification and treatment, as they can be life-threatening. We studied the clinical presentation of soft-tissue infections and their management. A prospective study based on 100 patients presenting with soft-tissue infections was done. All cases of soft-tissue infection were considered, irrespective of age, sex, etiological factors, or systemic disorders. The findings were evaluated regarding the pattern of soft-tissue infections in relation to age and sex, clinical presentation, complications, duration of hospital stay, management, and mortality. The most commonly involved age group was in the range of 41-60 years, with male predominance. Abscess formation (45%) was the most common clinical presentation. Type 2 diabetes mellitus was the most common associated comorbid condition. Staphylococcus aureus was the most common culture isolate obtained. The most common complication seen was renal failure. Patients with surgical site infections had the maximum duration of stay in the hospital. About 94% of the cases of soft-tissue infections were managed surgically. Mortality was mostly encountered in cases of complications of cellulitis. Skin and soft-tissue infections are among the most common infections encountered by emergency physicians. Ignorance, reluctance to seek treatment, economic constraints, and illiteracy delay early detection and the initiation of proper treatment. Adequate and timely surgical intervention in most cases is of utmost importance to prevent complications and reduce mortality.
NASA Astrophysics Data System (ADS)
Baró, Jordi; Dahmen, Karin A.; Davidsen, Jörn; Planes, Antoni; Castillo, Pedro O.; Nataf, Guillaume F.; Salje, Ekhard K. H.; Vives, Eduard
2018-06-01
The total energy of acoustic emission (AE) events in externally stressed materials diverges when approaching macroscopic failure. Numerical and conceptual models explain this accelerated seismic release (ASR) as the approach to a critical point that coincides with ultimate failure. Here, we report ASR during soft uniaxial compression of three silica-based (SiO2 ) nanoporous materials. Instead of a singular critical point, the distribution of AE energies is stationary, and variations in the activity rate are sufficient to explain the presence of multiple periods of ASR leading to distinct brittle failure events. We propose that critical failure is suppressed in the AE statistics by mechanisms of transient hardening. Some of the critical exponents estimated from the experiments are compatible with mean field models, while others are still open to interpretation in terms of the solution of frictional and fracture avalanche models.
Design of the Detector II: A CMOS Gate Array for the Study of Concurrent Error Detection Techniques.
1987-07-01
detection schemes and temporary failures. The circuit consists of six different adders with concurrent error detection schemes. The error detection schemes are simple duplication, duplication with functional dual implementation, duplication with different implementations, and two-rail encoding.
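Duplication with comparison, the simplest of the schemes listed, is easy to illustrate: two independently implemented adders run on the same operands and any mismatch flags a (possibly transient) fault. The fault injection below is artificial.

```python
# Concurrent error detection by simple duplication: two independent adder
# implementations compared on every operation; disagreement raises an error.
def adder_a(x, y):
    return x + y                       # "primary" implementation

def adder_b(x, y):
    return (x ^ y) + ((x & y) << 1)    # functionally different implementation

def checked_add(x, y, inject_fault=False):
    a, b = adder_a(x, y), adder_b(x, y)
    if inject_fault:
        a ^= 1                         # artificial transient bit flip
    if a != b:
        raise RuntimeError("concurrent error detected")
    return a

print(checked_add(12, 30))             # 42
try:
    checked_add(12, 30, inject_fault=True)
except RuntimeError as e:
    print(e)
```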
Error and attack tolerance of complex networks
NASA Astrophysics Data System (ADS)
Albert, Réka; Jeong, Hawoong; Barabási, Albert-László
2000-07-01
Many complex systems display a surprising degree of tolerance against errors. For example, relatively simple organisms grow, persist and reproduce despite drastic pharmaceutical or environmental interventions, an error tolerance attributed to the robustness of the underlying metabolic network. Complex communication networks display a surprising degree of robustness: although key components regularly malfunction, local failures rarely lead to the loss of the global information-carrying ability of the network. The stability of these and other complex systems is often attributed to the redundant wiring of the functional web defined by the systems' components. Here we demonstrate that error tolerance is not shared by all redundant systems: it is displayed only by a class of inhomogeneously wired networks, called scale-free networks, which include the World-Wide Web, the Internet, social networks and cells. We find that such networks display an unexpected degree of robustness, the ability of their nodes to communicate being unaffected even by unrealistically high failure rates. However, error tolerance comes at a high price in that these networks are extremely vulnerable to attacks (that is, to the selection and removal of a few nodes that play a vital role in maintaining the network's connectivity). Such error tolerance and attack vulnerability are generic properties of communication networks.
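The error-versus-attack asymmetry is straightforward to reproduce numerically: remove nodes from a scale-free graph either at random or in decreasing-degree order and compare the surviving giant component. A sketch assuming the networkx package is available:

```python
# Error vs. attack tolerance of a scale-free network: compare the giant
# component after random node removal vs. targeted highest-degree removal.
# Requires networkx (assumption).
import random
import networkx as nx

N = 2000
G = nx.barabasi_albert_graph(N, 2, seed=0)      # scale-free test network
k = int(0.05 * N)                               # remove 5% of nodes

def giant_fraction(graph):
    """Largest connected component, as a fraction of the original size."""
    return max(len(c) for c in nx.connected_components(graph)) / N

rand = G.copy()
rand.remove_nodes_from(random.Random(1).sample(list(rand.nodes), k))

attack = G.copy()
hubs = sorted(attack.degree, key=lambda nd: nd[1], reverse=True)[:k]
attack.remove_nodes_from([n for n, _ in hubs])

print(f"random failures: giant component {giant_fraction(rand):.2f}")
print(f"targeted attack: giant component {giant_fraction(attack):.2f}")
```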
Covariate measurement error correction methods in mediation analysis with failure time data.
Zhao, Shanshan; Prentice, Ross L
2014-12-01
Mediation analysis is important for understanding the mechanisms whereby one variable causes changes in another. Measurement error could obscure the ability of the potential mediator to explain such changes. This article focuses on developing correction methods for measurement error in the mediator with failure time outcomes. We consider a broad definition of measurement error, including technical error, and error associated with temporal variation. The underlying model with the "true" mediator is assumed to be of the Cox proportional hazards model form. The induced hazard ratio for the observed mediator no longer has a simple form independent of the baseline hazard function, due to the conditioning event. We propose a mean-variance regression calibration approach and a follow-up time regression calibration approach, to approximate the partial likelihood for the induced hazard function. Both methods demonstrate value in assessing mediation effects in simulation studies. These methods are generalized to multiple biomarkers and to both case-cohort and nested case-control sampling designs. We apply these correction methods to the Women's Health Initiative hormone therapy trials to understand the mediation effect of several serum sex hormone measures on the relationship between postmenopausal hormone therapy and breast cancer risk. © 2014, The International Biometric Society.
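The regression calibration idea, replacing the error-prone mediator by its expected value given the observed measurement before fitting the outcome model, can be sketched in its simplest classical-measurement-error form. This toy linear-model version ignores the Cox-model subtleties the paper addresses; the error variance is treated as known, whereas in practice it would be estimated from replicates.

```python
# Simplest form of regression calibration for a classical measurement error
# model W = M + U: replace W by E[M | W] before fitting the outcome model.
# Toy linear-model version; the paper handles the harder failure-time case.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
M = rng.normal(0, 1, n)              # true mediator
W = M + rng.normal(0, 0.8, n)        # observed, with measurement error

# Reliability ratio lambda = var(M) / var(W); the error variance 0.8**2 is
# taken as known here for illustration.
lam = (np.var(W) - 0.8**2) / np.var(W)
M_hat = np.mean(W) + lam * (W - np.mean(W))   # E[M | W] under normality

y = 1.5 * M + rng.normal(0, 1, n)    # outcome depends on the true mediator
naive = np.polyfit(W, y, 1)[0]       # attenuated slope
calib = np.polyfit(M_hat, y, 1)[0]   # approximately de-attenuated
print(f"naive slope {naive:.2f}, calibrated slope {calib:.2f} (true 1.5)")
```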
Reduction in pediatric identification band errors: a quality collaborative.
Phillips, Shannon Connor; Saysana, Michele; Worley, Sarah; Hain, Paul D
2012-06-01
Accurate and consistent placement of a patient identification (ID) band is used in health care to reduce errors associated with patient misidentification. Multiple safety organizations have devoted time and energy to improving patient ID, but no multicenter improvement collaboratives have shown scalability of previously successful interventions. We hoped to reduce by half the pediatric patient ID band error rate, defined as absent, illegible, or inaccurate ID band, across a quality improvement learning collaborative of hospitals in 1 year. On the basis of a previously successful single-site intervention, we conducted a self-selected 6-site collaborative to reduce ID band errors in heterogeneous pediatric hospital settings. The collaborative had 3 phases: preparatory work and employee survey of current practice and barriers, data collection (ID band failure rate), and intervention driven by data and collaborative learning to accelerate change. The collaborative audited 11377 patients for ID band errors between September 2009 and September 2010. The ID band failure rate decreased from 17% to 4.1% (77% relative reduction). Interventions including education of frontline staff regarding correct ID bands as a safety strategy; a change to softer ID bands, including "luggage tag" type ID bands for some patients; and partnering with families and patients through education were applied at all institutions. Over 13 months, a collaborative of pediatric institutions significantly reduced the ID band failure rate. This quality improvement learning collaborative demonstrates that safety improvements tested in a single institution can be disseminated to improve quality of care across large populations of children.
2012-01-01
Background: Although proton radiotherapy is a promising new approach for cancer patients, functional interference is a concern for patients with implantable cardioverter defibrillators (ICDs). The purpose of this study was to clarify the influence of secondary neutrons induced by proton radiotherapy on ICDs. Methods: The experimental set-up simulated proton radiotherapy for a patient with an ICD. Four new ICDs were placed 0.3 cm laterally and 3 cm distally outside the radiation field in order to evaluate the influence of secondary neutrons. The cumulative in-field radiation dose was 107 Gy over 10 sessions of irradiation with a dose rate of 2 Gy/min and a field size of 10 × 10 cm2. After each radiation fraction, interference with the ICD by the therapy was analyzed by an ICD programmer. The dose distributions of secondary neutrons were estimated by Monte-Carlo simulation. Results: The frequency of the power-on reset, the most serious soft error where the programmed pacing mode changes temporarily to a safety back-up mode, was 1 per approximately 50 Gy. The total number of soft errors logged in all devices was 29, which was a rate of 1 soft error per approximately 15 Gy. No permanent device malfunctions were detected. The calculated dose of secondary neutrons per 1 Gy proton dose in the phantom was approximately 1.3-8.9 mSv/Gy. Conclusions: With the present experimental settings, the probability was approximately 1 power-on reset per 50 Gy, which was below the dose level (60-80 Gy) generally used in proton radiotherapy. Further quantitative analysis in various settings is needed to establish guidelines regarding proton radiotherapy for cancer patients with ICDs. PMID:22284700
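A quick consistency check of the quoted rates, assuming each of the four devices was exposed over the full 107 Gy course:

```python
# Consistency check of the reported soft-error rate (assumes each of the
# 4 devices saw the full 107 Gy in-field course).
devices, dose_gy, soft_errors = 4, 107, 29
print(devices * dose_gy / soft_errors)   # ~14.8 Gy per soft error ("~15 Gy")
```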
NASA Astrophysics Data System (ADS)
Mehdizadeh, Saeid; Behmanesh, Javad; Khalili, Keivan
2017-11-01
Precipitation plays an important role in determining the climate of a region. Precise estimation of precipitation is required to manage and plan water resources, as well as for other related applications such as hydrology, climatology, meteorology and agriculture. Time series of hydrologic variables such as precipitation are composed of deterministic and stochastic parts. Despite this fact, the stochastic part of the precipitation data is not usually considered in modeling of the precipitation process. As an innovation, the present study introduces three new hybrid models by integrating soft computing methods, including multivariate adaptive regression splines (MARS), Bayesian networks (BN) and gene expression programming (GEP), with a time series model, namely generalized autoregressive conditional heteroscedasticity (GARCH), for modeling monthly precipitation. For this purpose, the deterministic (obtained by soft computing methods) and stochastic (obtained by the GARCH time series model) parts are combined with each other. To carry out this research, monthly precipitation data of the Babolsar, Bandar Anzali, Gorgan, Ramsar, Tehran and Urmia stations, with different climates in Iran, were used during the period of 1965-2014. Root mean square error (RMSE), relative root mean square error (RRMSE), mean absolute error (MAE) and the coefficient of determination (R2) were employed to evaluate the performance of conventional/single MARS, BN and GEP, as well as the proposed MARS-GARCH, BN-GARCH and GEP-GARCH hybrid models. It was found that the proposed novel models are more precise than single MARS, BN and GEP models. Overall, MARS-GARCH and BN-GARCH models yielded better accuracy than GEP-GARCH. The results of the present study confirmed the suitability of the proposed methodology for precise modeling of precipitation.
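The hybrid construction, a deterministic component from a soft-computing regressor plus a GARCH model of the stochastic residual, can be sketched with generic stand-ins: here a gradient-boosting regressor replaces MARS/BN/GEP, and the arch package (an assumption) supplies the GARCH(1,1) fit.

```python
# Hybrid sketch: deterministic component from a machine-learning regressor,
# stochastic component from a GARCH(1,1) fitted to its residuals.
# Stand-ins: GradientBoostingRegressor instead of MARS/BN/GEP; requires the
# `arch` package (assumption) for the GARCH fit.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from arch import arch_model

rng = np.random.default_rng(4)
t = np.arange(600)                                       # 50 years of months
X = np.column_stack([np.sin(2 * np.pi * t / 12), np.cos(2 * np.pi * t / 12)])
precip = 50 + 20 * X[:, 0] + rng.normal(0, 5, len(t))    # toy monthly series

det = GradientBoostingRegressor(random_state=0).fit(X, precip)
resid = precip - det.predict(X)                          # stochastic part

garch = arch_model(resid, vol="GARCH", p=1, q=1, mean="Zero")
res = garch.fit(disp="off")
print(res.params)                                        # omega, alpha[1], beta[1]
```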
Fatigue characteristics of carbon nanotube blocks under compression
NASA Astrophysics Data System (ADS)
Suhr, J.; Ci, L.; Victor, P.; Ajayan, P. M.
2008-03-01
In this paper we investigate the mechanical response from repeated high compressive strains on freestanding, long, vertically aligned multiwalled carbon nanotube membranes and show that the arrays of nanotubes under compression behave very similarly to soft tissue and exhibit viscoelastic behavior. Under compressive cyclic loading, the mechanical response of nanotube blocks shows initial preconditioning and hysteresis characteristic of viscoelastic materials. Furthermore, no fatigue failure is observed even at high strain amplitudes up to half a million cycles. The outstanding fatigue life and extraordinary soft-tissue-like mechanical behavior suggest that properly engineered carbon nanotube structures could mimic artificial muscles.
NASA Astrophysics Data System (ADS)
Huo, Ming-Xia; Li, Ying
2017-12-01
Quantum error correction is important to quantum information processing, as it allows us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from knowledge of the error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction. No adaptation of the quantum error correction code or its implementation circuit is required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g., the surface code. A Gaussian process algorithm is used to estimate and predict error rates based on error correction data from the past. We find that using these estimated error rates, the probability of error correction failures can be significantly reduced by a factor increasing with the code distance.
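The Gaussian-process part is standard regression on past error-correction data. A sketch with scikit-learn, using toy per-round error-rate estimates:

```python
# Gaussian-process regression of a drifting error rate from noisy estimates
# extracted from past error-correction rounds. Toy data for illustration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)
t = np.linspace(0, 10, 40)[:, None]            # time (arbitrary units)
true_rate = 1e-3 * (1 + 0.3 * np.sin(t.ravel()))
obs = true_rate + rng.normal(0, 5e-5, t.shape[0])   # noisy per-round estimates

kernel = RBF(length_scale=2.0) + WhiteKernel(noise_level=(5e-5) ** 2)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, obs)

t_next = np.array([[10.5]])                    # predict the next round's rate
mean, std = gp.predict(t_next, return_std=True)
print(f"predicted error rate {mean[0]:.2e} +/- {std[0]:.1e}")
```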
Pasha, Azam; Sindhu, D; Nayak, Rabindra S; Mamatha, J; Chaitra, K R; Vishwakarma, Swati
2015-01-01
This study was conducted to evaluate the effect of two soft drinks, Coca-Cola and Mirinda orange, on bracket bond strength and on the adhesive remnant on teeth after debonding the bracket, and to observe by means of scanning electron microscopy (SEM) the effect of these drinks on intact and sealed enamel. 120 non-carious maxillary premolar teeth, already extracted for orthodontic purposes, were taken and divided into three groups, i.e., a Coca-Cola group, a Mirinda orange group, and a control (artificial saliva) group. Brackets were bonded using conventional methods. Teeth were kept in the soft drinks for 15 days, for 15 min, 3 times a day, separated by intervals of 2 h. At other times, they were kept in artificial saliva. The samples thus obtained were evaluated for shear bond strength using a universal testing machine and subsequently assigned adhesive remnant index (ARI) scores. SEM study of all three groups was done to evaluate the enamel surface of the intact and sealed enamel. The lowest mean resistance to shearing forces was shown by the Mirinda orange group (5.30 ± 2.74 MPa), followed by the Coca-Cola group (6.24 ± 1.59 MPa), with the highest resistance to shearing forces in the control group (7.33 ± 1.72 MPa). The ARI scores revealed a cohesive failure in control samples and an adhesive failure in Mirinda and cola samples. SEM results showed areas of defect due to erosion caused by the acidic soft drinks on intact and sealed enamel surfaces. The Mirinda group showed the lowest resistance to shearing forces, followed by the Coca-Cola group, with the highest resistance to shearing forces in the control group. There were significant differences between the control group and the study groups. Areas of defects, caused by erosion related to acidic soft drinks, were seen on the enamel surface around the adhesive. Areas of defects caused by Coca-Cola were more extensive than those caused by the Mirinda orange drink.
High Reliability Organizations--Medication Safety.
Yip, Luke; Farmer, Brenna
2015-06-01
High reliability organizations (HROs), such as the aviation industry, successfully engage in high-risk endeavors and have low incidence of adverse events. HROs have a preoccupation with failure and errors. They analyze each event to effect system wide change in an attempt to mitigate the occurrence of similar errors. The healthcare industry can adapt HRO practices, specifically with regard to teamwork and communication. Crew resource management concepts can be adapted to healthcare with the use of certain tools such as checklists and the sterile cockpit to reduce medication errors. HROs also use The Swiss Cheese Model to evaluate risk and look for vulnerabilities in multiple protective barriers, instead of focusing on one failure. This model can be used in medication safety to evaluate medication management in addition to using the teamwork and communication tools of HROs.
Using Utility Functions to Control a Distributed Storage System
2008-05-01
Pinheiro et al. [2007] suggest this is not an accurate assumption. Nicola and Goyal [1990] examined correlated failures across multiversion software [Nicola, V. F. and Goyal, A. (1990). Modeling of correlated failures and community error recovery in multiversion software. IEEE Transactions on Software Engineering].
Haptic communication between humans is tuned by the hard or soft mechanics of interaction
Usai, Francesco; Ganesh, Gowrishankar; Sanguineti, Vittorio; Burdet, Etienne
2018-01-01
To move a hard table together, humans may coordinate by following the dominant partner’s motion [1–4], but this strategy is unsuitable for a soft mattress where the perceived forces are small. How do partners readily coordinate in such differing interaction dynamics? To address this, we investigated how pairs tracked a target using flexion-extension of their wrists, which were coupled by a hard, medium or soft virtual elastic band. Tracking performance monotonically increased with a stiffer band for the worse partner, who had higher tracking error, at the cost of the skilled partner’s muscular effort. This suggests that the worse partner followed the skilled one’s lead, but simulations show that the results are better explained by a model where partners share movement goals through the forces, whilst the coupling dynamics determine the capacity of communicable information. This model elucidates the versatile mechanism by which humans can coordinate during both hard and soft physical interactions to ensure maximum performance with minimal effort. PMID:29565966
Performance of concatenated Reed-Solomon/Viterbi channel coding
NASA Technical Reports Server (NTRS)
Divsalar, D.; Yuen, J. H.
1982-01-01
The concatenated Reed-Solomon (RS)/Viterbi coding system is reviewed. The performance of the system is analyzed and results are derived with a new simple approach. A functional model for the input RS symbol error probability is presented. Based on this new functional model, we compute the performance of a concatenated system in terms of RS word error probability, output RS symbol error probability, bit error probability due to decoding failure, and bit error probability due to decoding error. Finally we analyze the effects of the noisy carrier reference and the slow fading on the system performance.
Lu, Min-Hua; Mao, Rui; Lu, Yin; Liu, Zheng; Wang, Tian-Fu; Chen, Si-Ping
2012-01-01
Indentation testing is a widely used approach to quantitatively evaluate the mechanical characteristics of soft tissues. Young's modulus of soft tissue can be calculated from the force-deformation data with known tissue thickness and Poisson's ratio using Hayes' equation. Our group previously developed a noncontact indentation system using a water jet as a soft indenter as well as the coupling medium for the propagation of high-frequency ultrasound. The novel system has shown its ability to detect the early degeneration of articular cartilage. However, there is still a lack of a quantitative method to extract the intrinsic mechanical properties of soft tissue from water jet indentation. The purpose of this study is to investigate the relationship between the loading-unloading curves and the mechanical properties of soft tissues to provide an imaging technique for tissue mechanical properties. A 3D finite element model of water jet indentation was developed with consideration of the finite deformation effect. An improved Hayes' equation has been derived by introducing a new scaling factor which is dependent on the Poisson's ratio v, the aspect ratio a/h (the radius of the indenter/the thickness of the test tissue), and the deformation ratio d/h. With this model, the Young's modulus of soft tissue can be quantitatively evaluated and imaged with an error of no more than 2%. PMID:22927890
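Hayes' equation itself is a one-line computation once the scaling factor is known; the sketch below uses a placeholder kappa, whereas in practice (and in the improved equation described here) it would come from tabulated finite-element fits in v, a/h, and d/h. The loads and dimensions are toy values.

```python
# Young's modulus from indentation force-deformation data via Hayes' equation:
#   P = 2 * a * kappa * E * w / (1 - nu**2)  =>  E = P (1 - nu**2) / (2 a kappa w)
# kappa depends on nu, a/h and (in the improved form) the deformation ratio;
# the value used here is a placeholder, not a tabulated fit.
def youngs_modulus(P, w, a, nu, kappa):
    """P: load [N]; w: indentation depth [m]; a: indenter radius [m]."""
    return P * (1 - nu**2) / (2 * a * kappa * w)

P, w = 0.02, 0.3e-3          # example load and indentation (toy values)
a, h = 1.0e-3, 3.0e-3        # indenter radius and tissue thickness
nu, kappa = 0.45, 1.5        # near-incompressible tissue; placeholder kappa
print(f"E ~ {youngs_modulus(P, w, a, nu, kappa) / 1e3:.1f} kPa")
```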
Neurological soft signs in children with attention deficit hyperactivity disorder.
Patankar, V C; Sangle, J P; Shah, Henal R; Dave, M; Kamath, R M
2012-04-01
Attention deficit hyperactivity disorder (ADHD) is a common neurodevelopmental disorder with wide repercussions. Since it is etiologically related to delayed maturation, neurological soft signs (NSS) could be a tool to assess this. Further, the correlation of NSS with severity and type of ADHD and the presence of Specific Learning Disability (SLD) would give further insight into it. To study neurological soft signs and risk factors (type, mode of delivery, and milestones) in children with ADHD and to correlate NSS with type and severity of ADHD and with co-morbid Specific Learning Disability. The study was carried out in the child care services of a tertiary teaching urban hospital. It was a cross-sectional single-interview study. 52 consecutive children diagnosed as having ADHD were assessed for the presence of neurological soft signs using the Revised Physical and Neurological Examination for Soft Signs (PANESS) scale. The ADHD was rated by parents using the ADHD parent rating scale. The data were analyzed using the chi-squared test and Pearson's correlational analysis. Neurological soft signs are present in 84% of children. They are equally present in both the inattentive-hyperactive and impulsive-hyperactive types of ADHD. The presence of neurological soft signs in ADHD is independent of the presence of co-morbid SLD. Dysrhythmias and overflow with gait were typically seen for the impulsive-hyperactive type, and higher severity of ADHD is related to more errors.
Understanding adverse events: human factors.
Reason, J
1995-01-01
(1) Human rather than technical failures now represent the greatest threat to complex and potentially hazardous systems. This includes healthcare systems. (2) Managing the human risks will never be 100% effective. Human fallibility can be moderated, but it cannot be eliminated. (3) Different error types have different underlying mechanisms, occur in different parts of the organisation, and require different methods of risk management. The basic distinctions are between: slips, lapses, trips, and fumbles (execution failures) and mistakes (planning or problem solving failures), with mistakes divided into rule based mistakes and knowledge based mistakes; errors (information-handling problems) and violations (motivational problems); and active versus latent failures. Active failures are committed by those in direct contact with the patient; latent failures arise in organisational and managerial spheres and their adverse effects may take a long time to become evident. (4) Safety significant errors occur at all levels of the system, not just at the sharp end. Decisions made in the upper echelons of the organisation create the conditions in the workplace that subsequently promote individual errors and violations. Latent failures are present long before an accident and are hence prime candidates for principled risk management. (5) Measures that involve sanctions and exhortations (that is, moralistic measures directed to those at the sharp end) have only very limited effectiveness, especially so in the case of highly trained professionals. (6) Human factors problems are a product of a chain of causes in which the individual psychological factors (that is, momentary inattention, forgetting, etc.) are the last and least manageable links. Attentional "capture" (preoccupation or distraction) is a necessary condition for the commission of slips and lapses. Yet, its occurrence is almost impossible to predict or control effectively. The same is true of the factors associated with forgetting. States of mind contributing to error are thus extremely difficult to manage; they can happen to the best of people at any time. (7) People do not act in isolation. Their behaviour is shaped by circumstances. The same is true for errors and violations. The likelihood of an unsafe act being committed is heavily influenced by the nature of the task and by the local workplace conditions. These, in turn, are the product of "upstream" organisational factors. Great gains in safety can be achieved through relatively small modifications of equipment and workplaces. (8) Automation and increasingly advanced equipment do not cure human factors problems, they merely relocate them. In contrast, training people to work effectively in teams costs little, but has achieved significant enhancements of human performance in aviation. (9) Effective risk management depends critically on a confidential and preferably anonymous incident monitoring system that records the individual, task, situational, and organisational factors associated with incidents and near misses. (10) Effective risk management means the simultaneous and targeted deployment of limited remedial resources at different levels of the system: the individual or team, the task, the situation, and the organisation as a whole. PMID:10151618
NASA Astrophysics Data System (ADS)
Mathieson, Haley Aaron
This thesis investigates experimentally and analytically the structural performance of sandwich panels composed of glass fibre reinforced polymer (GFRP) skins and a soft polyurethane foam core, with or without thin GFRP ribs connecting the skins. The study includes three main components: (a) out-of-plane bending fatigue, (b) axial compression loading, and (c) in-plane bending of sandwich beams. Fatigue studies included 28 specimens and looked into establishing service life (S-N) curves of sandwich panels without ribs, governed by soft-core shear failure, and of ribbed panels, governed by failure at the rib-skin junction. Additionally, the study compared fatigue life curves of sandwich panels loaded under fully reversed bending conditions (R=-1) with panels cyclically loaded in one direction only (R=0) and established the stiffness degradation characteristics throughout their fatigue life. Mathematical models expressing fatigue life and stiffness degradation curves were calibrated, and expanded forms for various loading ratios were developed. Approximate fatigue thresholds of 37% and 23% were determined for non-ribbed panels loaded at R=0 and -1, respectively. Digital imaging techniques showed that shear contributed significantly (90%) to deflections when no ribs were used. Axial loading work included 51 specimens and examined the behavior of panels of various lengths (slenderness ratios) and skin thicknesses, as well as panels of similar length with various rib configurations. The governing failure modes observed were global buckling, skin wrinkling, and skin crushing. In-plane bending involved testing 18 sandwich beams of various shear span-to-depth ratios and skin thicknesses, which failed by skin wrinkling on the compression side. The analytical modeling components for axially loaded panels include: a simple design-oriented analytical failure model and a robust non-linear model capable of predicting the full load-displacement response of axially loaded slender sandwich panels, accounting for P-Delta effects, an inherent out-of-straightness profile of any shape at initial conditions, and the excessive shear deformation of the soft core and its effect on buckling capacity. Another model was developed to predict the load-deflection response and failure modes of in-plane loaded sandwich beams. After successful verification of the models using experimental results, comprehensive parametric studies were carried out using these models to cover parameters beyond the limitations of the experimental program.
Protocol Processing for 100 Gbit/s and Beyond - A Soft Real-Time Approach in Hardware and Software
NASA Astrophysics Data System (ADS)
Büchner, Steffen; Lopacinski, Lukasz; Kraemer, Rolf; Nolte, Jörg
2017-09-01
100 Gbit/s wireless communication protocol processing stresses all parts of a communication system to the utmost. The efficient use of upcoming 100 Gbit/s and beyond transmission technology requires rethinking the way protocols are processed by the communication endpoints. This paper summarizes the achievements of the project End2End100. We present a comprehensive soft real-time stream processing approach that allows the protocol designer to develop, analyze, and plan scalable protocols for ultra-high data rates of 100 Gbit/s and beyond. Furthermore, we present an ultra-low-power, adaptable, and massively parallelized FEC (forward error correction) scheme that detects and corrects bit errors at line rate with an energy consumption between 1 pJ/bit and 13 pJ/bit. The evaluation results discussed in this publication show that our comprehensive approach allows end-to-end communication with a very low protocol processing overhead.
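As a rough plausibility check (our arithmetic, not a figure from the paper), the quoted per-bit energies translate into total FEC power at line rate by simple multiplication:

```python
# Back-of-the-envelope FEC power draw at line rate.
# Assumption (ours): the quoted 1-13 pJ/bit covers the whole FEC pipeline.

LINE_RATE_BPS = 100e9                    # 100 Gbit/s
ENERGY_PER_BIT_J = (1e-12, 13e-12)       # quoted range: 1 pJ/bit to 13 pJ/bit

for e in ENERGY_PER_BIT_J:
    power_w = e * LINE_RATE_BPS          # (J/bit) * (bit/s) = W
    print(f"{e * 1e12:.0f} pJ/bit -> {power_w:.1f} W")
# 1 pJ/bit -> 0.1 W; 13 pJ/bit -> 1.3 W
```

Even at the upper end, the FEC budget stays near a watt, which is what makes line-rate error correction plausible for such links.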
Novel intelligent real-time position tracking system using FPGA and fuzzy logic.
Soares dos Santos, Marco P; Ferreira, J A F
2014-03-01
The main aim of this paper is to test if FPGAs are able to achieve better position tracking performance than software-based soft real-time platforms. For comparison purposes, the same controller design was implemented in these architectures. A Multi-state Fuzzy Logic controller (FLC) was implemented both in a Xilinx® Virtex-II FPGA (XC2V1000) and in a soft real-time platform, NI CompactRIO®-9002. The same sampling time was used. The comparative tests were conducted using a servo-pneumatic actuation system. Steady-state errors lower than 4 μm were reached for an arbitrary vertical positioning of a 6.2 kg mass when the controller was embedded into the FPGA platform. Performance gains up to 16 times in the steady-state error, up to 27 times in the overshoot and up to 19.5 times in the settling time were achieved by using the FPGA-based controller over the software-based FLC controller. © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
An Interactive Concatenated Turbo Coding System
NASA Technical Reports Server (NTRS)
Liu, Ye; Tang, Heng; Lin, Shu; Fossorier, Marc
1999-01-01
This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit error and frame error performance. The outer code decoder helps the inner turbo code decoder to terminate its decoding iterations, while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between the outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.
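The interaction described above is easy to summarize in pseudocode. The sketch below is our illustrative reconstruction, not the authors' implementation; `turbo_iteration`, `stop_check`, and `rs_decode` are hypothetical caller-supplied stand-ins for the inner SISO pass, the stopping criterion, and reliability-based Reed-Solomon soft-decision decoding:

```python
def interactive_decode(received, turbo_iteration, stop_check, rs_decode, max_iters=8):
    """Sketch of the outer/inner decoder interaction (hypothetical helpers)."""
    state = received
    for _ in range(max_iters):
        state = turbo_iteration(state)       # one inner turbo (SISO) iteration
        if stop_check(state):                # inner stopping criterion satisfied
            ok, codeword = rs_decode(state)  # outer RS decoding on soft outputs
            if ok:
                return codeword              # outer success ends the iteration early
            # outer failure: keep iterating the inner decoder
    return None                              # declare failure after the iteration budget
```

Terminating the inner iterations as soon as the outer decoder succeeds is what yields the reduced decoding delay claimed above.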
Mechanics of hard films on soft substrates
NASA Astrophysics Data System (ADS)
Lu, Nanshu
2009-12-01
Flexible electronics have been developed for various applications, including paper-like electronic readers, rollable solar cells, electronic skins etc., with the merits of light weight, low cost, large area, and ruggedness. The systems may be subject to one-time or repeated large deformation during manufacturing and application. Although organic materials can be highly deformable, currently they are not able to fulfill every electronic function. Therefore flexible electronic devices are usually made as organic/inorganic hybrids, with diverse materials, complex architecture, and micro features. While the polymer substrates can recover from large deformations, thin films of electronic materials such as metals, silicon, oxides, and nitrides fracture at small strains, usually less than a few percent. Mechanics of hard films on soft substrates hence holds the key to build high-performance and highly reliable flex circuits. This thesis investigates the deformability and failure mechanisms of thin films of metallic and ceramic materials supported by soft polymeric substrates through combined experimental, theoretical, and numerical methods. When subject to tension, micron-thick metal films with stable microstructure and strong interfacial adhesion to the substrate can be stretched beyond 50% without forming cracks. They eventually rupture by a ductile transgranular fracture which involves simultaneous necking and debonding. When metal films become nanometer-thick, intergranular fracture dominates the failure mode at elongations of only a few percent. Unannealed films show unstable microstructure at room temperature when subject to mechanical loading. In this case, films also rupture at small strains but by three concurrent mechanisms: deformation-induced grain growth, strain localization at large grains, and simultaneous debonding. In contrast to metal films, ceramic films rupture by brittle mechanisms. The only way to prevent rupture of ceramic films is to reduce the strain they are subject to. Instead of using blanket films that fail at strains less than 1%, we have patterned ceramic films into a lattice of periodic, isolated islands. Failure modes such as channel cracking, debonding, and wrinkling have been identified. Island behaviors are controlled by factors such as island size, thickness, and elastic mismatch with the substrate. A very soft interlayer between the islands and the underlying polyimide substrate reduces strains in the islands by orders of magnitude. Using this approach, substrates with arrays of 200 × 200 μm² SiNx islands were stretched beyond 20% without cracking or debonding the islands. In summary, highly stretchable thin metal films and ceramic island arrays supported by polymer substrates have been achieved, along with mechanistic understandings of their deformation and failure mechanisms.
NASA Astrophysics Data System (ADS)
ÁLvarez, A.; Orfila, A.; Tintoré, J.
2004-03-01
Satellites are the only systems able to provide continuous information on the spatiotemporal variability of vast areas of the ocean. Relatively long-term time series of satellite data are nowadays available. These spatiotemporal time series of satellite observations can be employed to build empirical models, called satellite-based ocean forecasting (SOFT) systems, to forecast certain aspects of future ocean states. SOFT systems can predict satellite-observed fields at different timescales. The forecast skill of SOFT systems forecasting the sea surface temperature (SST) at monthly timescales has been extensively explored in previous works. In this work we study the performance of two SOFT systems forecasting, respectively, the SST and sea level anomaly (SLA) at weekly timescales, that is, providing forecasts of the weekly averaged SST and SLA fields 1 week in advance. The SOFT systems were implemented in the Ligurian Sea (Western Mediterranean Sea). Predictions from the SOFT systems are compared with observations and with the predictions obtained from persistence models. Results indicate that the SOFT system forecasting the SST field is always superior to persistence in terms of predictability. Minimum prediction errors in the SST are obtained during the winter and spring seasons. On the other hand, the biggest differences between the performance of the SOFT and persistence models are found during summer and autumn. These changes in predictability are explained on the basis of the particular variability of the SST field in the Ligurian Sea. Concerning the SLA field, no improvement with respect to persistence was found for the SOFT system forecasting the SLA field.
Latest trends in parts SEP susceptibility from heavy ions
NASA Technical Reports Server (NTRS)
Nichols, Donald K.; Smith, L. S.; Soli, George A.; Koga, R.; Kolasinski, W. A.
1989-01-01
JPL and Aerospace have collected a third set of heavy-ion single-event phenomena (SEP) test data since their last joint IEEE publications in December 1985 and December 1987. Trends in SEP susceptibility (e.g., soft errors and latchup) for state-of-the-art parts are presented. Results of the study indicate that hard technologies and unacceptably soft technologies can be flagged. In some instances, specific tested parts can be taken as candidates for key microprocessors or memories. As always with radiation test data, specific test data for qualified flight parts is recommended for critical applications.
NASA Technical Reports Server (NTRS)
Lin, Shu; Fossorier, Marc
1998-01-01
In a coded communication system with equiprobable signaling, MLD minimizes the word error probability and delivers the most likely codeword associated with the corresponding received sequence. This decoding has two drawbacks. First, minimization of the word error probability is not equivalent to minimization of the bit error probability. Therefore, MLD becomes suboptimum with respect to the bit error probability. Second, MLD delivers a hard-decision estimate of the received sequence, so that information is lost between the input and output of the ML decoder. This information is important in coded schemes where the decoded sequence is further processed, such as concatenated coding schemes and multi-stage and iterative decoding schemes. In this chapter, we first present a decoding algorithm which both minimizes the bit error probability and provides the corresponding soft information at the output of the decoder. This algorithm is referred to as the MAP (maximum a posteriori probability) decoding algorithm.
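For reference, the bit-wise rule the chapter builds on can be stated in standard textbook form (our paraphrase, not a quotation): for each information bit $u_i$ and received sequence $\mathbf{r}$,

$$\hat{u}_i = \arg\max_{u \in \{0,1\}} P(u_i = u \mid \mathbf{r}), \qquad L(u_i) = \log\frac{P(u_i = 1 \mid \mathbf{r})}{P(u_i = 0 \mid \mathbf{r})},$$

where the sign of the log-likelihood ratio $L(u_i)$ supplies the hard decision and its magnitude the soft reliability passed to later decoding stages.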
A Randomized Trial of Soft Multifocal Contact Lenses for Myopia Control: Baseline Data and Methods.
Walline, Jeffrey J; Gaume Giannoni, Amber; Sinnott, Loraine T; Chandler, Moriah A; Huang, Juan; Mutti, Donald O; Jones-Jordan, Lisa A; Berntsen, David A
2017-09-01
The Bifocal Lenses In Nearsighted Kids (BLINK) study is the first soft multifocal contact lens myopia control study to compare add powers and measure peripheral refractive error in the vertical meridian, so it will provide important information about the potential mechanism of myopia control. The BLINK study is a National Eye Institute-sponsored, double-masked, randomized clinical trial to investigate the effects of soft multifocal contact lenses on myopia progression. This article describes the subjects' baseline characteristics and study methods. Subjects were 7 to 11 years old, had a spherical component of -0.75 to -5.00 D and less than 1.00 diopter (D) of astigmatism, and had 20/25 or better logMAR distance visual acuity with manifest refraction in each eye and with +2.50-D add soft bifocal contact lenses on both eyes. Children were randomly assigned to wear Biofinity single-vision, Biofinity Multifocal "D" with a +1.50-D add power, or Biofinity Multifocal "D" with a +2.50-D add power contact lenses. We examined 443 subjects at the baseline visits, and 294 (66.4%) subjects were enrolled. Of the enrolled subjects, 177 (60.2%) were female, and 200 (68%) were white. The mean (± SD) age was 10.3 ± 1.2 years, and 117 (39.8%) of the eligible subjects were younger than 10 years. The mean spherical equivalent refractive error, measured by cycloplegic autorefraction, was -2.39 ± 1.00 D. The best-corrected binocular logMAR visual acuity with glasses was +0.01 ± 0.06 (20/21) at distance and -0.03 ± 0.08 (20/18) at near. The BLINK study subjects are similar to patients who would routinely be eligible for myopia control in practice, so the results will provide clinical information about soft bifocal contact lens myopia control as well as information about the mechanism of the treatment effect, if one occurs.
Modeling Security Aspects of Network
NASA Astrophysics Data System (ADS)
Schoch, Elmar
With more and more widespread usage of computer systems and networks, dependability becomes a paramount requirement. Dependability typically denotes tolerance of or protection against all kinds of failures, errors and faults. Sources of failures can basically be accidental, e.g., in the case of hardware errors or software bugs, or intentional due to some kind of malicious behavior. These intentional, malicious actions are the subject of security. A more complete overview of the relations between dependability and security can be found in [31]. In parallel to the increased use of technology, misuse has also grown significantly, requiring measures to deal with it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pullum, Laura L; Symons, Christopher T
2011-01-01
Machine learning is used in many applications, from machine vision to speech recognition to decision support systems, and is used to test applications. However, though much has been done to evaluate the performance of machine learning algorithms, little has been done to verify the algorithms or examine their failure modes. Moreover, complex learning frameworks often require stepping beyond black box evaluation to distinguish between errors based on natural limits on learning and errors that arise from mistakes in implementation. We present a conceptual architecture, failure model and taxonomy, and failure modes and effects analysis (FMEA) of a semi-supervised, multi-modal learning system, and provide specific examples from its use in a radiological analysis assistant system. The goal of the research described in this paper is to provide a foundation from which dependability analysis of systems using semi-supervised, multi-modal learning can be conducted. The methods presented provide a first step towards that overall goal.
Con Edison power failure of July 13 and 14, 1977. Final staff report
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1978-06-01
On July 13, 1977 the entire electric load of the Con Edison system was lost, plunging New York City and Westchester County into darkness. The collapse resulted from a combination of natural events, equipment malfunctions, questionable system-design features, and operating errors. An attempt is made in this report to answer the following: what were the specific causes of the failure; if equipment malfunctions and operator errors contributed, could they have been prevented; to what extent was Con Edison prepared to handle such an emergency; and did Con Edison plan prudently for reserve generation, for reserve transmission capability, for automatic equipment to protect its system, and for proper operator response to a critical situation. Following the introductory and summary section, additional sections include: the Consolidated Edison system; prevention of bulk power-supply interruptions; the sequence of failure and restoration; analysis of the July 1977 power failure; restoration sequence and equipment damage assessment; and other investigations of the blackout. (MCW)
Error begat error: design error analysis and prevention in social infrastructure projects.
Love, Peter E D; Lopez, Robert; Edwards, David J; Goh, Yang M
2012-09-01
Design errors contribute significantly to cost and schedule growth in social infrastructure projects and to engineering failures, which can result in accidents and loss of life. Despite considerable research that has addressed error causation in construction projects, design errors remain prevalent. This paper identifies the underlying conditions that contribute to design errors in social infrastructure projects (e.g. hospitals, education, law and order type buildings). A systemic model of error causation is proposed and subsequently used to develop a learning framework for design error prevention. The research suggests that a multitude of strategies should be adopted in combination to prevent design errors from occurring and so ensure that safety and project performance are improved. Copyright © 2011. Published by Elsevier Ltd.
Checklists and Monitoring in the Cockpit: Why Crucial Defenses Sometimes Fail
NASA Technical Reports Server (NTRS)
Dismukes, R. Key; Berman, Ben
2010-01-01
Checklists and monitoring are two essential defenses against equipment failures and pilot errors. Problems with checklist use and pilots' failure to monitor adequately have a long history in aviation accidents. This study was conducted to explore why checklists and monitoring sometimes fail to catch errors and equipment malfunctions as intended. Flight crew procedures were observed from the cockpit jumpseat during normal airline operations in order to: 1) collect data on monitoring and checklist use in cockpit operations in typical flight conditions; 2) provide a plausible cognitive account of why deviations from formal checklist and monitoring procedures sometimes occur; 3) lay a foundation for identifying ways to reduce vulnerability to inadvertent checklist and monitoring errors; 4) compare checklist and monitoring execution in normal flights with performance issues uncovered in accident investigations; and 5) suggest ways to improve the effectiveness of checklists and monitoring. Cognitive explanations for deviations from prescribed procedures are provided, along with suggestions for countermeasures against vulnerability to error.
Yang, Shu-Hui; Jerng, Jih-Shuin; Chen, Li-Chin; Li, Yu-Tsu; Huang, Hsiao-Fang; Wu, Chao-Ling; Chan, Jing-Yuan; Huang, Szu-Fen; Liang, Huey-Wen; Sun, Jui-Sheng
2017-11-03
Intra-hospital transportation (IHT) might compromise patient safety because of different care settings and higher demands on human operation. Reports regarding the incidence of IHT-related patient safety events and human failures remain limited. The objective was to perform a retrospective analysis of IHT-related events, human failures and unsafe acts, using the hospital-wide IHT process and the incident reporting database of a medical centre in Taiwan. All eligible IHT-related patient safety events between January 2010 and December 2015 were included. Outcome measures were the incidence rate of IHT-related patient safety events, human failure modes, and types of unsafe acts. There were 206 patient safety events in 2 009 013 IHT sessions (102.5 per 1 000 000 sessions). Most events (n=148, 71.8%) did not involve patient harm, and process events (n=146, 70.9%) were most common. Events at the location of arrival (n=101, 49.0%) were most frequent; this location accounted for 61.0% of events with patient harm and 44.2% of those without harm (p<0.001). Of the events with human failures (n=186), the most common related process step was the preparation of the transportation team (n=91, 48.9%). Contributing unsafe acts included perceptual errors (n=14, 7.5%), decision errors (n=56, 30.1%), skill-based errors (n=48, 25.8%), and non-compliance (n=68, 36.6%). Multivariate analysis showed that human failure found in the arrival and hand-off sub-process (OR 4.84, p<0.001) was associated with increased patient harm, whereas the presence of omission (OR 0.12, p<0.001) was associated with less patient harm. This study shows a need to reduce human failures to prevent patient harm during intra-hospital transportation. We suggest that the transportation team pay specific attention to the sub-process at the location of arrival and prevent errors other than omissions. Long-term monitoring of IHT-related events is also warranted. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Effect of single vision soft contact lenses on peripheral refraction.
Kang, Pauline; Fan, Yvonne; Oh, Kelly; Trac, Kevin; Zhang, Frank; Swarbrick, Helen
2012-07-01
To investigate changes in peripheral refraction with under-, full, and over-correction of central refraction with commercially available single vision soft contact lenses (SCLs) in young myopic adults. Thirty-four myopic adult subjects were fitted with Proclear Sphere SCLs to under-correct (+0.75 DS), fully correct, and over-correct (-0.75 DS) their manifest central refractive error. Central and peripheral refraction were measured with no lens wear and subsequently with the different levels of SCL central refractive error correction. The uncorrected refractive error was myopic at all locations along the horizontal meridian. Peripheral refraction was relatively hyperopic compared to center at 30 and 35° in the temporal visual field (VF) in low myopes, and at 30 and 35° in the temporal VF and 10, 30, and 35° in the nasal VF in moderate myopes. All levels of SCL correction caused a hyperopic shift in refraction at all locations in the horizontal VF. The smallest hyperopic shift was demonstrated with under-correction, followed by full correction and then by over-correction of central refractive error. An increase in relative peripheral hyperopia was measured with full-correction SCLs compared with no correction in both low and moderate myopes. However, no difference in relative peripheral refraction profiles was found between under-, full, and over-correction. Under-, full, and over-correction of central refractive error with single vision SCLs caused a hyperopic shift in both central and peripheral refraction at all positions in the horizontal meridian. All levels of SCL correction caused the peripheral retina, which initially experienced absolute myopic defocus at baseline with no correction, to experience absolute hyperopic defocus. This peripheral hyperopia may be a possible cause of the myopia progression reported with different types and levels of myopia correction.
NASA Astrophysics Data System (ADS)
Samboju, Vishal; Adams, Matthew; Salgaonkar, Vasant; Diederich, Chris J.; Cunha, J. Adam M.
2017-02-01
The speed of sound (SOS) assumed by ultrasound devices used for imaging soft tissue is typically calibrated to water (1540 m/s), despite in-vivo soft tissue SOS varying from 1450 to 1613 m/s. Images acquired with an assumed 1540 m/s and used in conjunction with stereotactic external coordinate systems can thus contain displacement errors of several millimeters. Ultrasound imaging systems are routinely used to guide interventional thermal ablation and cryoablation devices, or radiation sources for brachytherapy. Brachytherapy uses small radioactive pellets, inserted interstitially with needles under ultrasound guidance, to eradicate cancerous tissue. Since the radiation dose diminishes with distance from the pellet as 1/r², an imaging uncertainty of a few millimeters can result in significant erroneous dose delivery. Likewise, modeling of power deposition and thermal dose accumulation from ablative sources is also prone to errors due to placement offsets from SOS errors. This work presents a method of mitigating needle placement error due to SOS variances without the need for ionizing radiation. We demonstrate the effects of changes in dosimetry in a prostate brachytherapy environment due to patient-specific SOS variances and the ability to mitigate dose delivery uncertainty. Electromagnetic (EM) sensors embedded in the brachytherapy ultrasound system provide information regarding the 3D position and orientation of the ultrasound array. Algorithms using data from these two modalities are used to correct B-mode images to account for SOS errors. While ultrasound localization resulted in >3 mm displacements, EM resolution was verified to <1 mm precision using custom-built phantoms with various SOS, showing 1% accuracy in SOS measurement.
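For intuition (our sketch, not from the paper), the axial component of the placement error follows directly from the pulse-echo range equation: the scanner converts echo time to depth with its assumed speed, so the reconstructed depth scales by the ratio of true to assumed SOS:

```python
# Axial depth error when the scanner assumes 1540 m/s but tissue differs.
# Pulse-echo relation: depth = c * t / 2, so for a fixed echo time the true
# depth equals the displayed depth scaled by c_true / c_assumed.

C_ASSUMED = 1540.0  # m/s, water-calibrated default

def axial_error_mm(displayed_depth_mm: float, c_true: float) -> float:
    return displayed_depth_mm * (c_true / C_ASSUMED) - displayed_depth_mm

for c in (1450.0, 1613.0):  # in-vivo soft tissue extremes quoted above
    print(f"c_true = {c:.0f} m/s: error at 60 mm depth = {axial_error_mm(60.0, c):+.2f} mm")
# about -3.5 mm and +2.8 mm, i.e., several millimeters at typical imaging depths
```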
Lessons from aviation - the role of checklists in minimally invasive cardiac surgery.
Hussain, S; Adams, C; Cleland, A; Jones, P M; Walsh, G; Kiaii, B
2016-01-01
We describe an adverse event during minimally invasive cardiac surgery that resulted in a multi-disciplinary review of intra-operative errors and the creation of a procedural checklist. This checklist aims to prevent errors of omission and communication failures that result in increased morbidity and mortality. We discuss the application of the aviation-derived "threats and errors model" to medical practice and the role of checklists and other strategies aimed at reducing medical errors. © The Author(s) 2015.
Stochastic Models of Human Errors
NASA Technical Reports Server (NTRS)
Elshamy, Maged; Elliott, Dawn M. (Technical Monitor)
2002-01-01
Humans play an important role in the overall reliability of engineering systems. More often than not, accidents and system failures are traced to human errors. Therefore, in order to have meaningful system risk analysis, the reliability of the human element must be taken into consideration. Describing the human error process by mathematical models is key to analyzing contributing factors. Therefore, the objective of this research effort is to establish stochastic models, substantiated by a sound theoretical foundation, to address the occurrence of human errors in the processing of the Space Shuttle.
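The abstract does not name a model family, but the simplest instance of the stochastic approach it describes is a homogeneous Poisson model of error occurrence; the sketch below (ours, with illustrative rates) gives the probability of observing n errors in a task of duration t:

```python
# Minimal homogeneous Poisson model of human error occurrence.
# Assumption (ours, not the report's): errors arrive independently at rate lam.
import math

def prob_n_errors(n: int, lam: float, t: float) -> float:
    """P(N = n) for a Poisson process with rate lam (errors/hour) over t hours."""
    mu = lam * t
    return math.exp(-mu) * mu**n / math.factorial(n)

# e.g., 0.02 errors/hour over a 40-hour processing task:
print(prob_n_errors(0, 0.02, 40.0))  # ~0.449, the chance of an error-free task
```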
Tinkle, Christopher L; Fernandez-Pineda, Israel; Sykes, April; Lu, Zhaohua; Hua, Chia-Ho; Neel, Michael D; Bahrami, Armita; Shulkin, Barry L; Kaste, Sue C; Pappo, Alberto; Spunt, Sheri L; Krasin, Matthew J
2017-11-15
Indications for and delivery of adjuvant therapies for pediatric nonrhabdomyosarcoma soft tissue sarcoma (NRSTS) have been derived largely from adult studies; therefore, significant concern remains regarding radiation exposure to normal tissue. The authors report long-term treatment outcomes and toxicities for pediatric and young adult patients with high-grade NRSTS who were treated on a prospective trial using limited-margin radiotherapy. Sixty-two patients (ages 3-22 years) with predominantly high-grade NRSTS requiring radiation were treated on a phase 2 institutional study of conformal external-beam radiotherapy and/or brachytherapy using a 1.5-cm to 2-cm anatomically constrained margin. Gray's method estimated the cumulative incidence of local failure, the Kaplan-Meier method estimated survival, a competing-risk regression model determined predictors of disease outcome, and toxicity was reported according to CTCAE v2.0. At a median follow-up of 5.1 years (range, 0.2-10.9 years), 9 patients had experienced local failure. The 5-year overall cumulative incidence of local failure was 14.8% (95% confidence interval [CI], 7.2%-25%), and all but 1 local failure occurred outside the highest-dose irradiation volume. The 5-year Kaplan-Meier estimates for event-free and overall survival were 49.3% (95% CI, 36.3%-61.1%) and 67.9% (95% CI, 54.2%-78.3%), respectively. Multivariable analysis indicated that younger age was the only independent predictor of local recurrence (P = .004). The 5-year cumulative incidence of grade 3 or 4 late toxicity was 15% (95% CI, 7.2%-25.3%). The delivery of limited-margin radiotherapy using conformal external-beam radiotherapy or brachytherapy provides a high rate of local tumor control without an increase in marginal failures and with acceptable treatment-related morbidity. Cancer 2017;123:4419-29. © 2017 American Cancer Society.
Oh, Eric J; Shepherd, Bryan E; Lumley, Thomas; Shaw, Pamela A
2018-04-15
For time-to-event outcomes, a rich literature exists on the bias introduced by covariate measurement error in regression models, such as the Cox model, and methods of analysis to address this bias. By comparison, less attention has been given to understanding the impact or addressing errors in the failure time outcome. For many diseases, the timing of an event of interest (such as progression-free survival or time to AIDS progression) can be difficult to assess or reliant on self-report and therefore prone to measurement error. For linear models, it is well known that random errors in the outcome variable do not bias regression estimates. With nonlinear models, however, even random error or misclassification can introduce bias into estimated parameters. We compare the performance of 2 common regression models, the Cox and Weibull models, in the setting of measurement error in the failure time outcome. We introduce an extension of the SIMEX method to correct for bias in hazard ratio estimates from the Cox model and discuss other analysis options to address measurement error in the response. A formula to estimate the bias induced into the hazard ratio by classical measurement error in the event time for a log-linear survival model is presented. Detailed numerical studies are presented to examine the performance of the proposed SIMEX method under varying levels and parametric forms of the error in the outcome. We further illustrate the method with observational data on HIV outcomes from the Vanderbilt Comprehensive Care Clinic. Copyright © 2017 John Wiley & Sons, Ltd.
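As background (our sketch, not the authors' code), SIMEX deliberately adds extra simulated error at increasing multiples of the assumed error variance, refits the model at each level, and extrapolates the coefficient trend back to the zero-error case:

```python
# Schematic SIMEX for an outcome measured with additive error.
# Assumptions (ours): sigma2 is the known error variance, and fit_model is a
# hypothetical stand-in returning the coefficient of interest (e.g., from a Cox fit).
import numpy as np

def simex(y_obs, X, sigma2, fit_model, lambdas=(0.5, 1.0, 1.5, 2.0), B=100, seed=0):
    rng = np.random.default_rng(seed)
    grid, estimates = [0.0], [fit_model(y_obs, X)]   # lambda = 0: observed data
    for lam in lambdas:
        sims = [fit_model(y_obs + rng.normal(0.0, np.sqrt(lam * sigma2), len(y_obs)), X)
                for _ in range(B)]                   # average over simulated error draws
        grid.append(lam)
        estimates.append(np.mean(sims))
    coef = np.polyfit(grid, estimates, deg=2)        # quadratic extrapolant
    return np.polyval(coef, -1.0)                    # extrapolate to "no error"
```

For failure times the simulated noise must respect positivity (e.g., be applied on the log scale), a detail this sketch glosses over.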
Dehghan, Ashraf; Abumasoudi, Rouhollah Sheikh; Ehsanpour, Soheila
2016-01-01
Background: Infertility and errors in the process of its treatment have a negative impact on infertile couples. The present study aimed to identify and assess the common errors in the reception process by applying the approach of “failure modes and effects analysis” (FMEA). Materials and Methods: In this descriptive cross-sectional study, the admission process of the fertility and infertility center of Isfahan was selected for evaluation of its errors based on the team members’ decision. First, the admission process was charted through observations, interviews with employees, multiple panels, and the FMEA worksheet, which has been used in many studies worldwide, including in Iran. Its validity was evaluated through content and face validity, and its reliability through review and confirmation of the obtained information by the FMEA team. Possible errors and their causes were then determined along with three indicators—severity of effect, probability of occurrence, and probability of detection—and corrective actions were proposed. Data analysis used the risk priority number (RPN), calculated by multiplying the severity of effect, probability of occurrence, and probability of detection. Results: Twenty-five errors with RPN ≥ 125 were detected in the admission process, of which six had high priority in terms of severity and occurrence probability and were identified as high-risk errors. Conclusions: The team-oriented FMEA method can be useful for assessing errors and for reducing their probability of occurrence. PMID:28194208
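For reference, the RPN referred to above is simply the product of the three ratings; a minimal sketch (our illustration, with made-up failure modes and ratings on the usual 1–10 scales) looks like this:

```python
# FMEA risk priority number: severity x occurrence x detection.
# Ratings typically run 1-10 on each axis; RPN >= 125 is the screening
# threshold used in the study above. The failure modes here are invented.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    return severity * occurrence * detection

failure_modes = {
    "wrong patient record opened": (8, 4, 5),
    "incomplete referral paperwork": (5, 6, 3),
}
for name, ratings in failure_modes.items():
    score = rpn(*ratings)
    print(f"{name}: RPN = {score} ({'high risk' if score >= 125 else 'ok'})")
```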
NASA Astrophysics Data System (ADS)
Chertok, I. M.; Belov, A. V.
2018-03-01
Correction to: Solar Phys https://doi.org/10.1007/s11207-017-1169-1 We found an important error in the text of our article. On page 6, the second sentence of Section 3.2 "We studied the variations in soft X-ray flare characteristics in more detail by averaging them within the running windows of ± one Carrington rotation with a step of two rotations." should instead read "We studied the variations in soft X-ray flare characteristics in more detail by averaging them within the running windows of ± 2.5 Carrington rotations with a step of two rotations." We regret the inconvenience. The online version of the original article can be found at https://doi.org/10.1007/s11207-017-1169-1
Vélez-Díaz-Pallarés, Manuel; Delgado-Silveira, Eva; Carretero-Accame, María Emilia; Bermejo-Vicedo, Teresa
2013-01-01
To identify actions to reduce medication errors in the process of drug prescription, validation and dispensing, and to evaluate the impact of their implementation. A Health Care Failure Mode and Effect Analysis (HFMEA) was supported by a before-and-after medication error study to measure the actual impact on error rate after the implementation of corrective actions in the process of drug prescription, validation and dispensing in wards equipped with computerised physician order entry (CPOE) and a unit-dose distribution system (788 beds out of 1080) in a Spanish university hospital. The error study was carried out by two observers who reviewed medication orders on a daily basis to register prescription errors by physicians and validation errors by pharmacists. Drugs dispensed in the unit-dose trolleys were reviewed for dispensing errors. Error rates were expressed as the number of errors for each process divided by the total opportunities for error in that process, multiplied by 100. A reduction in prescription errors was achieved by providing training for prescribers on CPOE, updating prescription procedures, improving clinical decision support and automating the software connection to the hospital census (relative risk reduction (RRR), 22.0%; 95% CI 12.1% to 31.8%). Validation errors were reduced after optimising time spent in educating pharmacy residents on patient safety, developing standardised validation procedures and improving aspects of the software's database (RRR, 19.4%; 95% CI 2.3% to 36.5%). Two actions reduced dispensing errors: reorganising the process of filling trolleys and drawing up a protocol for drug pharmacy checking before delivery (RRR, 38.5%; 95% CI 14.1% to 62.9%). HFMEA facilitated the identification of actions aimed at reducing medication errors in a healthcare setting, as the implementation of several of these led to a reduction in errors in the process of drug prescription, validation and dispensing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopan, O; Kalet, A; Smith, W
2016-06-15
Purpose: A standard tool for ensuring the quality of radiation therapy treatments is the initial physics plan review. However, little is known about its performance in practice. The goal of this study is to measure the effectiveness of physics plan review by introducing simulated errors into “mock” treatment plans and measuring the performance of plan review by physicists. Methods: We generated six mock treatment plans containing multiple errors. These errors were based on incident learning system data both within the department and internationally (SAFRON). The errors were scored for severity and frequency, and those with the highest scores were included in the simulations (13 errors total). Observer bias was minimized using a multiple co-correlated distractor approach. Eight physicists reviewed these plans for errors, with each physicist reviewing, on average, 3 of 6 plans. The confidence interval for the proportion of errors detected was computed using the Wilson score interval. Results: Simulated errors were detected in 65% of reviews [51–75%] (95% confidence interval [CI] in brackets). The following error scenarios had the highest detection rates: incorrect isocenter in DRRs/CBCT (91% [73–98%]) and a planned dose different from the prescribed dose (100% [61–100%]). Errors with low detection rates involved incorrect field parameters in the record-and-verify system (38% [18–61%]) and incorrect isocenter localization in the planning system (29% [8–64%]). Though pre-treatment QA failure was reliably identified (100%), less than 20% of participants reported the error that caused the failure. Conclusion: This is one of the first quantitative studies of error detection. Although physics plan review is a key safety measure and can identify some errors with high fidelity, other errors are more challenging to detect. These data will guide future work on standardization and automation. Creating new checks or improving existing ones (i.e., via automation) will help in detecting those errors with low detection rates.
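The Wilson score interval used above has a simple closed form; a small sketch (standard formula, our implementation, with illustrative counts since per-scenario denominators are not given in the abstract):

```python
# Wilson score interval for a binomial proportion (z = 1.96 for a 95% CI).
import math

def wilson_interval(successes: int, n: int, z: float = 1.96):
    p = successes / n
    denom = 1.0 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# e.g., 13 detections in 20 reviews (illustrative numbers):
print(wilson_interval(13, 20))  # ~(0.43, 0.82)
```

Unlike the naive normal interval, the Wilson interval stays inside [0, 1] and behaves sensibly for the small per-scenario counts typical of such studies.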
Forecasting the brittle failure of heterogeneous, porous geomaterials
NASA Astrophysics Data System (ADS)
Vasseur, Jérémie; Wadsworth, Fabian; Heap, Michael; Main, Ian; Lavallée, Yan; Dingwell, Donald
2017-04-01
Heterogeneity develops in magmas during ascent and is dominated by the development of crystal and, importantly, bubble populations or pore-network clusters which grow, interact, localize, coalesce, outgas and resorb. Pore-scale heterogeneity is also ubiquitous in sedimentary basin fill during diagenesis. As a first step, we construct 3D numerical simulations in which randomly generated heterogeneous and polydisperse spheres are placed in volumes and permitted to overlap with one another, designed to represent the random growth and interaction of bubbles in a liquid volume. We use these simulated geometries to show that statistical predictions of the inter-bubble lengthscales and evolving bubble surface area or cluster densities can be made based on fundamental percolation theory. As a second step, we take a range of well constrained random heterogeneous rock samples, including sandstones, andesites, synthetic partially sintered glass bead samples, and intact glass samples, and subject them to a variety of stress loading conditions at a range of temperatures until failure. We record in real time the evolution of the number of acoustic events that precede failure and show that in all scenarios the acoustic event rate accelerates toward failure, consistent with previous findings. Applying tools designed to forecast the failure time based on these precursory signals, we constrain the absolute error on the forecast time. We find that for all sample types, the error associated with an accurate forecast of failure scales non-linearly with the lengthscale between the pore clusters in the material. Moreover, using a simple micromechanical model for the deformation of porous elastic bodies, we show that the ratio of the equilibrium sub-critical crack length emanating from the pore clusters to the inter-pore lengthscale provides a scaling for the error on forecast accuracy. Thus for the first time we provide a potential quantitative correction for forecasting the failure of the porous brittle solids that build the Earth's crust.
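The abstract does not name the forecasting tool, but the classic approach for accelerating precursors is the failure forecast method, in which the inverse event rate is extrapolated linearly to zero; here is a sketch under that assumption (ours, not necessarily the authors' tool):

```python
# Failure forecast method (FFM) sketch: many brittle systems show a precursor
# rate dN/dt ~ 1 / (t_f - t), so inverse rate vs. time is roughly linear and
# reaches zero at the failure time t_f.
import numpy as np

def forecast_failure_time(times, rates):
    """Fit a line to inverse rate vs. time; its zero crossing estimates t_f."""
    inv_rate = 1.0 / np.asarray(rates, dtype=float)
    slope, intercept = np.polyfit(times, inv_rate, deg=1)
    return -intercept / slope

# synthetic example with true t_f = 100 s:
t = np.linspace(10.0, 90.0, 9)
r = 1.0 / (100.0 - t)                 # idealized accelerating event rate
print(forecast_failure_time(t, r))    # ~100.0
```

The scatter of real acoustic-emission data about this idealized trend is one source of the forecast error the study quantifies.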
Identification of priorities for medication safety in neonatal intensive care.
Kunac, Desireé L; Reith, David M
2005-01-01
Although neonates are reported to be at greater risk of medication error than infants and older children, little is known about the causes and characteristics of error in this patient group. Failure mode and effects analysis (FMEA) is a technique used in industry to evaluate system safety and identify potential hazards in advance. The aim of this study was to identify and prioritize potential failures in the neonatal intensive care unit (NICU) medication use process through application of FMEA. Using the FMEA framework and a systems-based approach, an eight-member multidisciplinary panel worked as a team to create a flow diagram of the neonatal unit medication use process. Then by brainstorming, the panel identified all potential failures, their causes and their effects at each step in the process. Each panel member independently rated failures based on occurrence, severity and likelihood of detection to allow calculation of a risk priority score (RPS). The panel identified 72 failures, with 193 associated causes and effects. Vulnerabilities were found to be distributed across the entire process, but multiple failures and associated causes were possible when prescribing the medication and when preparing the drug for administration. The top ranking issue was a perceived lack of awareness of medication safety issues (RPS score 273), due to a lack of medication safety training. The next highest ranking issues were found to occur at the administration stage. Common potential failures related to errors in the dose, timing of administration, infusion pump settings and route of administration. Perceived causes were multiple, but were largely associated with unsafe systems for medication preparation and storage in the unit, variable staff skill level and lack of computerised technology. Interventions to decrease medication-related adverse events in the NICU should aim to increase staff awareness of medication safety issues and focus on medication administration processes.
Soft tissue deformation estimation by spatio-temporal Kalman filter finite element method.
Yarahmadian, Mehran; Zhong, Yongmin; Gu, Chengfan; Shin, Jaehyun
2018-01-01
Soft tissue modeling plays an important role in the development of surgical training simulators as well as in robot-assisted minimally invasive surgeries. While the traditional finite element method (FEM) promises accurate modeling of soft tissue deformation, it suffers from a slow computational process. This paper presents a Kalman filter finite element method (KF-FEM) to model soft tissue deformation in real time without sacrificing the traditional FEM accuracy. The proposed method employs the FEM equilibrium equation and formulates it as a filtering process to estimate soft tissue behavior using real-time measurement data. The model is temporally discretized using the Newmark method and further formulated as the system state equation. Simulation results demonstrate that the computational time of KF-FEM is approximately 10 times shorter than that of the traditional FEM while remaining as accurate. The normalized root-mean-square error of the proposed KF-FEM with reference to the traditional FEM is computed as 0.0116. It is concluded that the proposed method significantly improves the computational performance of the traditional FEM without sacrificing accuracy. The proposed method also filters noise involved in the system state and measurement data.
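A minimal sketch of the filtering idea (ours; the paper's state equation comes from the Newmark-discretized FEM system, which we abbreviate to generic linear dynamics here):

```python
# One linear Kalman filter step for estimating FEM nodal displacements.
# Assumptions (ours): x stacks node displacements/velocities, A is the
# Newmark-discretized state transition, H selects the measured surface nodes.
import numpy as np

def kalman_step(x, P, A, Q, H, R, z):
    # predict with the discretized FEM dynamics
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # correct with the real-time surface measurement z
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

The correction step is what lets the filter both track measurements in real time and suppress the noise mentioned above.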
Microscope self-calibration based on micro laser line imaging and soft computing algorithms
NASA Astrophysics Data System (ADS)
Apolinar Muñoz Rodríguez, J.
2018-06-01
A technique to perform microscope self-calibration via micro laser line and soft computing algorithms is presented. In this technique, the microscope vision parameters are computed by means of soft computing algorithms based on laser line projection. To implement the self-calibration, a microscope vision system is constructed by means of a CCD camera and a 38 μm laser line. From this arrangement, the microscope vision parameters are represented via Bezier approximation networks, which are accomplished through the laser line position. In this procedure, a genetic algorithm determines the microscope vision parameters by means of laser line imaging. Also, the approximation networks compute the three-dimensional vision by means of the laser line position. Additionally, the soft computing algorithms re-calibrate the vision parameters when the microscope vision system is modified during the vision task. The proposed self-calibration improves accuracy of the traditional microscope calibration, which is accomplished via external references to the microscope system. The capability of the self-calibration based on soft computing algorithms is determined by means of the calibration accuracy and the micro-scale measurement error. This contribution is corroborated by an evaluation based on the accuracy of the traditional microscope calibration.
NASA Astrophysics Data System (ADS)
Chang, Chun; Huang, Benxiong; Xu, Zhengguang; Li, Bin; Zhao, Nan
2018-02-01
Three soft-input soft-output (SISO) detection methods for dual-polarized quadrature duobinary (DP-QDB), including maximum-logarithm maximum a posteriori (Max-log-MAP)-based detection, soft-output Viterbi algorithm (SOVA)-based detection, and a proposed SISO detection, which can all be combined with SISO decoding, are presented. The three detection methods are investigated at 128 Gb/s in five-channel wavelength-division-multiplexing uncoded and low-density parity-check (LDPC) coded DP-QDB systems by simulations. Max-log-MAP-based detection needs the returning-to-initial-states (RTIS) process despite having the best performance. When an LDPC code with a code rate of 0.83 is used, the detecting-and-decoding scheme with the proposed SISO detection does not need RTIS and has better bit error rate (BER) performance than the scheme with SOVA-based detection. The former can reduce the optical signal-to-noise ratio (OSNR) requirement (at a BER of 10⁻⁵) by 2.56 dB relative to the latter. The application of the SISO iterative detection in LDPC-coded DP-QDB systems achieves a good trade-off among transmission efficiency, OSNR requirement, and transmission distance, compared with the other two SISO methods.
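For context (a textbook identity, not quoted from the paper), Max-log-MAP replaces the exact log-domain sum in MAP detection with a maximum, trading a small accuracy loss for much lower complexity:

$$\log\sum_i e^{x_i} \approx \max_i x_i, \qquad \max{}^*(a,b) \triangleq \log\left(e^a + e^b\right) = \max(a,b) + \log\left(1 + e^{-|a-b|}\right) \approx \max(a,b).$$

Dropping the correction term $\log\left(1 + e^{-|a-b|}\right)$ is exactly what distinguishes Max-log-MAP from full MAP detection.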
Levy, Scott; Ferreira, Kurt B.; Bridges, Patrick G.; ...
2014-12-09
Building the next generation of extreme-scale distributed systems will require overcoming several challenges related to system resilience. As the number of processors in these systems grows, the failure rate increases proportionally. One of the most common sources of failure in large-scale systems is memory. In this paper, we propose a novel runtime for transparently exploiting memory content similarity to improve system resilience by reducing the rate at which memory errors lead to node failure. We evaluate the viability of this approach by examining memory snapshots collected from eight high-performance computing (HPC) applications and two important HPC operating systems. Based on the characteristics of the similarity uncovered, we conclude that our proposed approach shows promise for addressing system resilience in large-scale systems.
Software reliability experiments data analysis and investigation
NASA Technical Reports Server (NTRS)
Walker, J. Leslie; Caglayan, Alper K.
1991-01-01
The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.
Hill, David P.
2012-01-01
Hill (2008) and Hill (2010) contain two technical errors: (1) a missing factor of 2 for computed Love‐wave amplitudes, and (2) a sign error in the off‐diagonal elements in the Euler rotation matrix.
A Computing Method to Determine the Performance of an Ionic Liquid Gel Soft Actuator
He, Bin; Zhang, Chenghong; Zhou, Yanmin; Wang, Zhipeng
2018-01-01
A new type of soft actuator material—an ionic liquid gel (ILG) that consists of BMIMBF4, HEMA, DEAP, and ZrO2—is polymerized into a gel state under ultraviolet (UV) light irradiation. In this paper, we first propose that the ILG conforms to the assumptions of hyperelastic theory and that the Mooney-Rivlin model can be used to study the properties of the ILG. Under the five-parameter and nine-parameter Mooney-Rivlin models, the formulas for the calculation of the uniaxial tensile stress, plane uniform tensile stress, and 3D directional stress are deduced. The five-parameter and nine-parameter Mooney-Rivlin models of the ILG with a ZrO2 content of 3 wt% were obtained by uniaxial tensile testing, and the parameters are denoted as c10, c01, c20, c11, and c02 and c10, c01, c20, c11, c02, c30, c21, c12, and c03, respectively. Through the analysis and comparison of the uniaxial tensile stress between the calculated and experimental data, the error between the stress data calculated from the five-parameter Mooney-Rivlin model and the experimental data is less than 0.51%, and the error between the stress data calculated from the nine-parameter Mooney-Rivlin model and the experimental data is no more than 8.87%. Hence, our work presents a feasible and credible formula for the calculation of the stress of the ILG. This work opens a new path to assess the performance of a soft actuator composed of an ILG and will contribute to the optimized design of soft robots. PMID:29853999
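For orientation (our sketch, using the standard two-term incompressible Mooney-Rivlin form rather than the paper's five- and nine-parameter fits, and with illustrative constants rather than the fitted ILG values), the uniaxial nominal stress as a function of stretch λ is:

```python
# Two-parameter incompressible Mooney-Rivlin uniaxial nominal stress:
#   sigma(lam) = 2 * (lam - lam**-2) * (c10 + c01 / lam)
# c10 and c01 below are illustrative values, not the fitted ILG parameters.

def mooney_rivlin_uniaxial(lam: float, c10: float, c01: float) -> float:
    return 2.0 * (lam - lam**-2) * (c10 + c01 / lam)

for lam in (1.1, 1.5, 2.0):
    sigma = mooney_rivlin_uniaxial(lam, c10=0.05, c01=0.01)  # MPa, illustrative
    print(f"stretch {lam:.1f}: nominal stress {sigma:.4f} MPa")
```

The five- and nine-parameter models used in the paper add higher-order terms in the strain invariants but follow the same fitting-and-evaluation pattern.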
Hamm, Jordan P.; Dyckman, Kara A.; McDowell, Jennifer E.; Clementz, Brett A.
2012-01-01
Cognitive control is required for correct performance on antisaccade tasks, including the ability to inhibit an externally driven ocular motor response (a saccade to a peripheral stimulus) in favor of an internally driven ocular motor goal (a saccade directed away from a peripheral stimulus). Healthy humans occasionally produce errors during antisaccade tasks, but the mechanisms associated with such failures of cognitive control are uncertain. Most research on cognitive control failures focuses on post-stimulus processing, although a growing body of literature highlights a role of intrinsic brain activity in perceptual and cognitive performance. The current investigation used dense-array electroencephalography and distributed source analyses to examine brain oscillations across a wide frequency bandwidth in the period prior to antisaccade cue onset. Results highlight four important aspects of ongoing and preparatory brain activations that differentiate error from correct antisaccade trials: (i) ongoing oscillatory beta (20–30 Hz) power in anterior cingulate prior to trial initiation (lower for error trials), (ii) instantaneous phase of ongoing alpha-theta (7 Hz) in frontal and occipital cortices immediately before trial initiation (opposite between trial types), (iii) gamma power (35–60 Hz) in posterior parietal cortex 100 ms prior to cue onset (greater for error trials), and (iv) phase locking of alpha (5–12 Hz) in parietal and occipital cortices immediately prior to cue onset (lower for error trials). These findings extend recently reported effects of pre-trial alpha phase on perception to cognitive control processes, and help identify the cortical generators of such phase effects. PMID:22593071
Programmable Numerical Function Generators: Architectures and Synthesis Method
2005-08-01
generates HDL (Hardware Description Language) code from the design specification described by Scilab [14], a MATLAB-like numerical calculation soft...cad.com/Error-NFG/. [14] Scilab 3.0, INRIA-ENPC, France, http://scilabsoft.inria.fr/ [15] M. J. Schulte and J. E. Stine, “Approximating elementary functions
Overview of Device SEE Susceptibility from Heavy Ions
NASA Technical Reports Server (NTRS)
Nichols, D. K.; Coss, J. R.; McCarthy, K. P.; Schwartz, H. R.; Smith, L. S.
1998-01-01
A fifth set of heavy ion single event effects (SEE) test data have been collected since the last IEEE publications (1,2,3,4) in December issues for 1985, 1987, 1989, and 1991. Trends in SEE susceptibility (including soft errors and latchup) for state-of-the-art parts are evaluated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watkins, W.T.; Siebers, J.V.; Bzdusek, K.
Purpose: To introduce methods to analyze Deformable Image Registration (DIR) and identify regions of potential DIR errors. Methods: DIR Deformable Vector Fields (DVFs) quantifying patient anatomic changes were evaluated using the Jacobian determinant and the magnitude of DVF curl as functions of tissue density and tissue type. These quantities represent local relative deformation and rotation, respectively. Large values in dense tissues can potentially identify non-physical DVF errors. For multiple DVFs per patient, histograms and visualization of DVF differences were also considered. To demonstrate the capabilities of these methods, we computed multiple DVFs for each of five head-and-neck (H&N) patients (P1–P5) via a Fast-symmetric Demons (FSD) algorithm and via a Diffeomorphic Demons (DFD) algorithm, and show the potential to identify DVF errors. Results: Quantitative comparisons of the FSD and DFD registrations revealed <0.3 cm DVF differences in >99% of all voxels for P1, >96% for P2, and >90% of voxels for P3. While the FSD and DFD registrations were very similar for these patients, the Jacobian determinant was >50% in 9–15% of soft tissue and in 3–17% of bony tissue in each of these cases. The volumes of large soft tissue deformation were consistent for all five patients using the FSD algorithm (mean 15%±4% volume), whereas DFD reduced regions of large deformation by 10% volume (785 cm³) for P4 and by 14% volume (1775 cm³) for P5. The DFD registrations resulted in fewer regions of large DVF-curl; >50% rotations in FSD registrations averaged 209±136 cm³ in soft tissue and 10±11 cm³ in bony tissue, but using DFD these values were reduced to 42±53 cm³ and 1.1±1.5 cm³, respectively. Conclusion: Analysis of the Jacobian determinant and curl as functions of tissue density can identify regions of potential DVF errors by identifying non-physical deformations and rotations. Collaboration with Phillips Healthcare, as indicated in authorship.
Error floor behavior study of LDPC codes for concatenated codes design
NASA Astrophysics Data System (ADS)
Chen, Weigang; Yin, Liuguo; Lu, Jianhua
2007-11-01
Error floor behavior of low-density parity-check (LDPC) codes using quantized decoding algorithms is statistically studied with experimental results on a hardware evaluation platform. The results present the distribution of the residual errors after decoding failure and reveal that the number of residual error bits in a codeword is usually very small using quantized sum-product (SP) algorithm. Therefore, LDPC code may serve as the inner code in a concatenated coding system with a high code rate outer code and thus an ultra low error floor can be achieved. This conclusion is also verified by the experimental results.
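The design implication can be illustrated with a toy calculation: if residual errors after an LDPC decoding failure are few, an outer code correcting up to t symbols per block suppresses the error floor dramatically. This is a sketch under assumed, illustrative numbers, with independent symbol errors (which understates the burstiness of real residuals):

```python
from math import comb

def outer_failure_prob(n, p, t):
    """Probability that more than t of n symbols are in error, assuming
    independent symbol errors with probability p (a toy model only)."""
    return 1.0 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t + 1))

# Illustrative numbers: a 255-symbol outer code correcting t symbol errors,
# with residual symbol error probability 1e-4 at the LDPC error floor.
for t in (2, 4, 8, 16):
    print(f"t={t:2d}: outer-code failure prob ~ {outer_failure_prob(255, 1e-4, t):.3e}")
```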
47 CFR 1.1112 - Form of payment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Commission. Failure to comply with the Commission's procedures will result in the return of the application... not receive final payment and such failure is not excused by bank error. (2) The Commission will... attached to the receipt copy a stamped self-addressed envelope of sufficient size to contain the date...
Accuracy analysis for triangulation and tracking based on time-multiplexed structured light.
Wagner, Benjamin; Stüber, Patrick; Wissel, Tobias; Bruder, Ralf; Schweikard, Achim; Ernst, Floris
2014-08-01
The authors' research group is currently developing a new optical head tracking system for intracranial radiosurgery. This tracking system utilizes infrared laser light to measure features of the soft tissue on the patient's forehead. These features are intended to offer highly accurate registration with respect to the rigid skull structure by means of compensating for the soft tissue. In this context, the system also has to be able to quickly generate accurate reconstructions of the skin surface. For this purpose, the authors have developed a laser scanning device which uses time-multiplexed structured light to triangulate surface points. The accuracy of the authors' laser scanning device is analyzed and compared for different triangulation methods. These methods are given by the Linear-Eigen method and a nonlinear least squares method. Since Microsoft's Kinect camera represents an alternative for fast surface reconstruction, the authors' results are also compared to the triangulation accuracy of the Kinect device. Moreover, the authors' laser scanning device was used for tracking of a rigid object to determine how this process is influenced by the remaining triangulation errors. For this experiment, the scanning device was mounted to the end-effector of a robot to be able to calculate a ground truth for the tracking. The analysis of the triangulation accuracy of the authors' laser scanning device revealed a root mean square (RMS) error of 0.16 mm. In comparison, the analysis of the triangulation accuracy of the Kinect device revealed an RMS error of 0.89 mm. It turned out that the remaining triangulation errors only cause small inaccuracies for the tracking of a rigid object. Here, the tracking accuracy was given by an RMS translational error of 0.33 mm and an RMS rotational error of 0.12°. This paper shows that time-multiplexed structured light can be used to generate highly accurate reconstructions of surfaces. Furthermore, the reconstructed point sets can be used for high-accuracy tracking of objects, meeting the strict requirements of intracranial radiosurgery.
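A generic linear least-squares triangulation, in the spirit of (but not identical to) the Linear-Eigen method evaluated above, can be sketched as follows; the ray geometry and noise level are invented for illustration.

```python
import numpy as np

def triangulate_rays(origins, directions):
    """Least-squares intersection of 3D rays o_i + s*d_i: minimize the sum
    of squared point-to-ray distances via the normal equations."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Toy example: three noisy rays that nearly meet at (0.1, 0.2, 1.0).
rng = np.random.default_rng(0)
target = np.array([0.1, 0.2, 1.0])
origins = [np.array([x, 0.0, 0.0]) for x in (-0.2, 0.0, 0.2)]
directions = [target - o + rng.normal(0, 1e-3, 3) for o in origins]
est = triangulate_rays(origins, directions)
print("estimate:", est, " RMS error:", np.sqrt(np.mean((est - target) ** 2)))
```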
A stochastic model for soft tissue failure using acoustic emission data.
Sánchez-Molina, D; Martínez-González, E; Velázquez-Ameijide, J; Llumà, J; Rebollo Soria, M C; Arregui-Dalmases, C
2015-11-01
The strength of soft tissues is due mainly to collagen fibers. In most collagenous tissues, the arrangement of the fibers is random, but has preferred directions. The random arrangement makes it difficult to make deterministic predictions about the onset of fiber breaking under tension. When subjected to tensile stress, the fibers are progressively straightened out and then begin to stretch. At the onset of fiber breaking, some of the fibers reach their maximum tensile strength and break while others remain unstressed (these fibers then bear greater stress until they eventually reach their own failure points). In this study, a sample of human esophagi was subjected to tensile loading that progressively broke fibers, up to the complete failure of the specimen. An experimental setup using Acoustic Emission to detect the released elastic energy was used during the test to locate the emissions and count the number of micro-failures per unit time. The data were statistically analyzed and compared to a stochastic model which relates the level of stress in the tissue to the probability of breaking given the number of previously broken fibers (i.e. the deterioration of the tissue). The probability of a fiber breaking as the stretch in the tissue increases can be represented by a non-homogeneous Markov process, which is the basis of the stochastic model proposed. This paper shows that a two-parameter model can account for the fiber breaking and that the expected distribution for ultimate stress is a Fréchet distribution. Copyright © 2015 Elsevier Ltd. All rights reserved.
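As a rough illustration of how progressive fiber breaking produces a random ultimate stress, the following toy equal-load-sharing fiber-bundle simulation (with an assumed Weibull fiber-strength law, not the paper's two-parameter Markov model) generates an ultimate-stress sample that could then be compared against extreme-value fits such as the Fréchet distribution:

```python
import numpy as np

rng = np.random.default_rng(1)

def bundle_strength(n_fibers):
    """Ultimate stress of an equal-load-sharing fiber bundle with random
    fiber strengths: as weaker fibers break, survivors carry more load."""
    s = np.sort(rng.weibull(2.0, n_fibers))   # assumed fiber-strength law
    surviving = np.arange(n_fibers, 0, -1)    # fibers intact when s[k] is reached
    return np.max(s * surviving) / n_fibers   # peak stress the bundle sustains

samples = np.array([bundle_strength(500) for _ in range(2000)])
print(f"mean ultimate stress {samples.mean():.3f}, std {samples.std():.3f}")
```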
Sauer, Juergen; Chavaillaz, Alain; Wastell, David
2016-06-01
This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants experienced either a fully reliable automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has consequences for later operator performance. A greater potential for operator errors may be expected when an automatic system fails to diagnose a fault than when it fails to detect one.
Remote maintenance monitoring system
NASA Technical Reports Server (NTRS)
Simpkins, Lorenz G. (Inventor); Owens, Richard C. (Inventor); Rochette, Donn A. (Inventor)
1992-01-01
A remote maintenance monitoring system retrofits to a given hardware device with a sensor implant which gathers and captures failure data from the hardware device, without interfering with its operation. Failure data is continuously obtained from predetermined critical points within the hardware device, and is analyzed with a diagnostic expert system, which isolates failure origin to a particular component within the hardware device. For example, monitoring of a computer-based device may include monitoring of parity error data therefrom, as well as monitoring power supply fluctuations therein, so that parity error and power supply anomaly data may be used to trace the failure origin to a particular plane or power supply within the computer-based device. A plurality of sensor implants may be retrofit to corresponding plural devices comprising a distributed large-scale system. Transparent interface of the sensors to the devices precludes operative interference with the distributed network. Retrofit capability of the sensors permits monitoring of even older devices having no built-in testing technology. Continuous real time monitoring of a distributed network of such devices, coupled with diagnostic expert system analysis thereof, permits capture and analysis of even intermittent failures, thereby facilitating maintenance of the monitored large-scale system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dvorak, G.J.
1974-10-01
The research effort was concentrated on metal matrix composites, such as the Al-B, Al-Be, Cu-W, and similar systems. It was found that in as-fabricated composites with soft matrices fatigue failure can be prevented if the composite shakes down during cyclic loading. The fatigue strength of heat-treated composites is affected by residual microstresses, but failure can be prevented if the total microstresses are kept within the respective fatigue limits (at 10^7 cycles) of the constituents. These criteria for prevention of fatigue failure in metal matrix composite systems were verified by extensive comparisons of theoretical predictions with available experimental results. (GRA)
Natural Selection as an Emergent Process: Instructional Implications
ERIC Educational Resources Information Center
Cooper, Robert A.
2017-01-01
Student reasoning about cases of natural selection is often plagued by errors that stem from miscategorising selection as a direct, causal process, misunderstanding the role of randomness, and from the intuitive ideas of intentionality, teleology and essentialism. The common thread throughout many of these reasoning errors is a failure to apply…
Necrotizing Fasciitis of the Breast Requiring Emergent Radical Mastectomy.
Ward, Nicholas D; Harris, Jennifer W; Sloan, David A
2017-01-01
Necrotizing fasciitis is a rare, aggressive, soft-tissue infection that results in necrosis of skin, subcutaneous tissue, and fascia. It spreads rapidly and may progress to sepsis, multi-organ failure, and death. Predisposing conditions include diabetes, chronic alcoholism, advanced age, vascular disease, and immunosuppression, and many cases are preceded by an injury or invasive procedure. Necrotizing soft-tissue infection of the breast is uncommon, with only a few reported cases in the literature. We report the case of a 53-year-old diabetic woman who presented to the emergency room with several weeks of worsening breast and shoulder pain, swelling, and erythema. Upon formal evaluation by the surgical service, a necrotizing soft-tissue infection was suspected, and the patient was scheduled for emergent surgical debridement. Because of the aggressive nature and high mortality of this disease, immediate surgical intervention, coupled with antibiotic therapy and physiologic support, is necessary to prevent complications and death. © 2016 Wiley Periodicals, Inc.
Severe soft tissue infections.
Napolitano, Lena M
2009-09-01
Severe skin and soft tissue infections (SSTIs) frequently require management in the ICU, in part related to associated septic shock or toxic shock syndrome or associated organ failure. Four fundamental management principles are key to a successful outcome in caring for patients who have severe SSTIs, including (1) early diagnosis and differentiation of necrotizing versus nonnecrotizing SSTI, (2) early initiation of appropriate empiric broad-spectrum antimicrobial therapy with consideration of risk factors for specific pathogens and mandatory coverage for methicillin-resistant Staphylococcus aureus (MRSA), (3) source control (ie, early aggressive surgical intervention for drainage of abscesses and debridement of necrotizing soft tissue infections), and (4) pathogen identification and appropriate de-escalation of antimicrobial therapy. MRSA has emerged as the most common identifiable cause of severe SSTIs; therefore, initiation of empiric anti-MRSA antimicrobials is warranted in all cases of severe SSTIs. In addition, appropriate critical care management, including fluid resuscitation, organ support, and nutritional support, is a necessary component in treating severe SSTIs.
Daverio, Marco; Fino, Giuliana; Luca, Brugnaro; Zaggia, Cristina; Pettenazzo, Andrea; Parpaiola, Antonella; Lago, Paola; Amigoni, Angela
2015-12-01
Errors in medicine are estimated to occur with an incidence of 3.7-16.6% in hospitalized patients. The application of systems for the detection of adverse events is becoming a widespread reality in healthcare. Incident reporting (IR) and failure mode and effects analysis (FMEA) are strategies widely used to detect errors, but no studies have combined them in the setting of a pediatric intensive care unit (PICU). The aim of our study was to describe the trend of IR in a PICU and evaluate the effect of FMEA application on the number and severity of the errors detected. With this prospective observational study, we evaluated the frequency of IRs documented in standard IR forms completed from January 2009 to December 2012 in the PICU of the Woman's and Child's Health Department of Padova. On the basis of their severity, errors were classified as: without outcome (55%), with minor outcome (16%), with moderate outcome (10%), and with major outcome (3%); 16% of reported incidents were 'near misses'. We compared the data before and after the introduction of FMEA. Sixty-nine errors were registered, 59 (86%) concerning drug therapy (83% during prescription). Compared to 2009-2010, in 2011-2012 we noted an increase in reported errors (43 vs 26) with a reduction in their severity (21% vs 8% 'near misses' and 65% vs 38% errors with no outcome). With the introduction of FMEA, we obtained increased awareness in error reporting. Application of these systems will improve the quality of healthcare services. © 2015 John Wiley & Sons Ltd.
Interstitial and external radiotherapy in carcinoma of the soft palate and uvula
DOE Office of Scientific and Technical Information (OSTI.GOV)
Esche, B.A.; Haie, C.M.; Gerbaulet, A.P.
1988-09-01
Forty-three patients, all male, with limited epidermoid carcinoma of the soft palate and uvula were treated by interstitial implant, usually associated with external radiotherapy. Most patients received 50 Gy external irradiation to the oropharynx and neck followed by 20-35 Gy by interstitial iridium-192 wires using either guide gutters or a plastic tube technique. Twelve primary tumors and two recurrences after external irradiation alone had implant only, for 65-75 Gy. Total actuarial local control is 92% with no local failures in 34 T1 primary tumors. Only one serious complication was seen. Overall actuarial survival was 60% at 3 years and 37% at 5 years, but cause-specific survivals were 81% and 64%. The leading cause of death was other aerodigestive cancer, with an actuarial rate of occurrence of 10% per year after treatment of a soft palate cancer. Interstitial brachytherapy alone or combined with external irradiation is safe, effective management for early carcinoma of the soft palate and uvula, but second malignancy is a serious problem.
Stanton, Neville A; Harvey, Catherine
2017-02-01
Risk assessments in Sociotechnical Systems (STS) tend to be based on error taxonomies, yet the term 'human error' does not sit easily with STS theories and concepts. A new break-link approach was proposed as an alternative risk assessment paradigm to reveal the effect of information communication failures between agents and tasks on the entire STS. A case study of the training of a Royal Navy crew detecting a low flying Hawk (simulating a sea-skimming missile) is presented using EAST to model the Hawk-Frigate STS in terms of social, information and task networks. By breaking 19 social links and 12 task links, 137 potential risks were identified. Notable findings included the effect of risk moving around the system: reducing the risks to the Hawk increased the risks to the Frigate. Future research should examine the effects of compounded information communication failures on STS performance. Practitioner Summary: The paper presents a step-by-step walk-through of EAST to show how it can be used for risk assessment in sociotechnical systems. The 'broken-links' method takes a systemic, rather than taxonomic, approach to identify information communication failures in social and task networks.
A real-time diagnostic and performance monitor for UNIX. M.S. Thesis
NASA Technical Reports Server (NTRS)
Dong, Hongchao
1992-01-01
There are now over one million UNIX sites and the pace at which new installations are added is steadily increasing. Along with this increase comes a need to develop simple, efficient, effective, and adaptable ways of simultaneously collecting real-time diagnostic and performance data. This need exists because distributed systems can give rise to complex failure situations that are often unidentifiable with single-machine diagnostic software. The simultaneous collection of error and performance data is also important for research in failure prediction and error/performance studies. This paper introduces a portable method to concurrently collect real-time diagnostic and performance data on a distributed UNIX system. The combined diagnostic/performance data collection is implemented on a distributed multi-computer system using SUN4s as servers. The approach uses existing UNIX system facilities to gather system dependability information such as error and crash reports. In addition, performance data such as CPU utilization, disk usage, I/O transfer rate and network contention is also collected. In the future, the collected data will be used to identify dependability bottlenecks and to analyze the impact of failures on system performance.
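A minimal sketch of the combined error/performance collection idea, using portable Python standard-library calls rather than the SunOS-era facilities the thesis used; the log path and error keywords are assumptions:

```python
import os
import re
import shutil
import time

ERROR_PAT = re.compile(r"(parity|panic|error|fail)", re.IGNORECASE)

def collect_sample(logpath="/var/log/syslog"):
    """One combined diagnostic/performance sample per call."""
    du = shutil.disk_usage("/")
    sample = {
        "time": time.time(),
        "load_1min": os.getloadavg()[0],       # CPU load average (Unix-only)
        "disk_free_frac": du.free / du.total,  # disk usage
    }
    n_errors = 0
    try:
        with open(logpath, errors="replace") as f:
            n_errors = sum(1 for line in f if ERROR_PAT.search(line))
    except OSError:
        pass  # log not present or not readable on this host
    sample["error_lines"] = n_errors           # dependability indicator
    return sample

print(collect_sample())
```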
NASA Technical Reports Server (NTRS)
Hall, Steven R.; Walker, Bruce K.
1990-01-01
A new failure detection and isolation algorithm for linear dynamic systems is presented. This algorithm, the Orthogonal Series Generalized Likelihood Ratio (OSGLR) test, is based on the assumption that the failure modes of interest can be represented by truncated series expansions. This assumption leads to a failure detection algorithm with several desirable properties. Computer simulation results are presented for the detection of the failures of actuators and sensors of a C-130 aircraft. The results show that the OSGLR test generally performs as well as the GLR test in terms of time to detect a failure and is more robust to failure mode uncertainty. However, the OSGLR test is also somewhat more sensitive to modeling errors than the GLR test.
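The flavor of a GLR-style failure detector can be sketched for the simplest case, a mean shift of unknown onset and size in Gaussian residuals; the OSGLR test generalizes this by expanding the failure signature in a truncated orthogonal series, which is not reproduced here.

```python
import numpy as np

def glr_mean_shift(residuals, sigma=1.0):
    """Generic GLR statistic for a constant bias of unknown onset and size
    in white Gaussian residuals (illustrative only, not the OSGLR test)."""
    r = np.asarray(residuals, float)
    best = 0.0
    for k in range(len(r) - 1):            # hypothesized failure onset
        seg = r[k:]
        # maximizing over the unknown bias gives (sum seg)^2 / (2 sigma^2 m)
        best = max(best, seg.sum() ** 2 / (2.0 * sigma**2 * len(seg)))
    return best

rng = np.random.default_rng(2)
healthy = rng.normal(0, 1, 200)
faulty = np.concatenate([rng.normal(0, 1, 150), rng.normal(0.8, 1, 50)])
print("healthy GLR:", round(glr_mean_shift(healthy), 1),
      " faulty GLR:", round(glr_mean_shift(faulty), 1))
```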
Liu, Xiang; Effenberger, Frank; Chand, Naresh
2015-03-09
We demonstrate a flexible modulation and detection scheme for upstream transmission in passive optical networks using pulse position modulation at the optical network unit, facilitating burst-mode detection with automatic decision threshold tracking, and DSP-enabled soft-combining at the optical line terminal. Adaptive receiver sensitivities of -33.1 dBm, -36.6 dBm and -38.3 dBm at a bit error ratio of 10^-4 are respectively achieved for 2.5 Gb/s, 1.25 Gb/s and 625 Mb/s after transmission over a 20-km standard single-mode fiber without any optical amplification.
Multi-stage decoding of multi-level modulation codes
NASA Technical Reports Server (NTRS)
Lin, Shu; Kasami, Tadao; Costello, Daniel J., Jr.
1991-01-01
Various types of multi-stage decoding for multi-level modulation codes are investigated. It is shown that if the component codes of a multi-level modulation code and the types of decoding at the various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. In particular, it is shown that the difference in performance between the suboptimum multi-stage soft-decision maximum likelihood decoding of a modulation code and the single-stage optimum soft-decision decoding of the code is very small: only a fraction of a dB loss in signal-to-noise ratio at a bit error rate (BER) of 10^-6.
Dynamic plasticity and failure of high-purity alumina under shock loading.
Chen, M W; McCauley, J W; Dandekar, D P; Bourne, N K
2006-08-01
Most high-performance ceramics subjected to shock loading exhibit high failure strength and significant inelastic strain that cannot be achieved under conventional loading conditions. The transition point from elastic to inelastic response prior to failure during shock loading, known as the Hugoniot elastic limit (HEL), has been widely used as an important parameter in the characterization of the dynamic mechanical properties of ceramics. Nevertheless, the underlying micromechanisms that control the HEL have been debated for many years. Here we show high-resolution electron microscopy of high-purity alumina, soft-recovered from shock-loading experiments. A change of deformation behaviour from dislocation activity in the vicinity of grain boundaries to deformation twinning is observed as the impact pressure increases from below to above the HEL. The evolution of deformation modes leads to the conversion of material failure from an intergranular mode to transgranular cleavage, in which twinning interfaces serve as the preferred cleavage planes.
Bassetti, Matteo; McGovern, Paul C; Wenisch, Christoph; Meyer, R Daniel; Yan, Jean Li; Wible, Michele; Rottinghaus, Scott T; Quintana, Alvaro
2015-09-01
An imbalance in all-cause mortality was noted in tigecycline phase 3 and 4 comparative clinical trials across all studied indications. We investigated clinical failure and mortality in phase 3 and 4 complicated skin and soft-tissue infection (cSSTI) and complicated intra-abdominal infection (cIAI) tigecycline trials using descriptive analyses of a blinded adjudication of mortality and multivariate regression analyses. Attributable mortality analyses of cSSTI revealed death due to infection in 0.1% of each treatment group (P=1.000). In cIAI, there were no significant differences between tigecycline (1.2%) and comparator (0.7%) subjects who died due to infection (P=0.243). For cIAI clinical failure, treatment interaction with organ dysfunction was observed with no difference observed between clinical cure for tigecycline (85.4%) and comparator (76.7%) treatment groups (odds ratio=0.58, 95% confidence interval 0.28-1.19). Tigecycline-treated subjects had more adverse events of secondary pneumonias (2.1% vs. 1.2%) and more adverse events of secondary pneumonias with an outcome of death (0.5% vs. 0.1%). These analyses do not suggest that tigecycline is a factor either for failure (cSSTI and cIAI studies) or for death (cIAI studies). Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbee, D; McCarthy, A; Galavis, P
Purpose: Errors found during initial physics plan checks frequently require replanning and reprinting, resulting in decreased departmental efficiency. Additionally, errors may be missed during physics checks, resulting in potential treatment errors or interruption. This work presents a process control created using the Eclipse Scripting API (ESAPI) enabling dosimetrists and physicists to detect potential errors in the Eclipse treatment planning system prior to performing any plan approvals or printing. Methods: Potential failure modes for five categories were generated based on available ESAPI (v11) patient object properties: Images, Contours, Plans, Beams, and Dose. An Eclipse script plugin (PlanCheck) was written in C# to check errors most frequently observed clinically in each of the categories. The PlanCheck algorithms were devised to check technical aspects of plans, such as deliverability (e.g. minimum EDW MUs), in addition to ensuring that policies and procedures relating to planning were being followed. The effect on clinical workflow efficiency was measured by tracking the plan document error rate and plan revision/retirement rates in the Aria database over monthly intervals. Results: The PlanCheck script is currently capable of checking for the following numbers of potential failure modes: Images (6), Contours (7), Plans (8), Beams (17), and Dose (4). Prior to implementation of the PlanCheck plugin, the observed error rates in errored plan documents and revised/retired plans in the Aria database were 20% and 22%, respectively. Error rates were seen to decrease gradually over time as adoption of the script improved. Conclusion: A process control created using the Eclipse Scripting API enabled plan checks to occur within the planning system, resulting in reduced error rates and improved efficiency. Future work includes: initiating a full FMEA for the planning workflow, extending categories to include additional checks outside of ESAPI via Aria database queries, and eventual automated plan checks.
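The process-control pattern generalizes beyond ESAPI. The sketch below shows the rule-engine shape of such a pre-approval check in Python, with hypothetical field names and thresholds; it is not the authors' C# plugin and does not use the Eclipse API.

```python
from dataclasses import dataclass, field

@dataclass
class Plan:
    """Minimal stand-in for a treatment plan object; the field names here
    are hypothetical, not the Eclipse Scripting API's."""
    beams: list = field(default_factory=list)  # each beam: dict with 'mu', 'wedge'
    has_dose: bool = False
    ct_orientation: str = "HFS"

MIN_EDW_MU = 20  # assumed deliverability threshold, for illustration only

def check_plan(plan):
    """Run category checks and return human-readable failures, mirroring
    the pre-approval process-control idea described in the abstract."""
    failures = []
    if not plan.has_dose:
        failures.append("Dose: no dose calculated")
    for i, beam in enumerate(plan.beams):
        if beam.get("wedge") == "EDW" and beam.get("mu", 0) < MIN_EDW_MU:
            failures.append(f"Beams: beam {i} EDW MU below {MIN_EDW_MU}")
    if plan.ct_orientation not in ("HFS", "HFP", "FFS", "FFP"):
        failures.append("Images: unexpected CT orientation")
    return failures

plan = Plan(beams=[{"mu": 12, "wedge": "EDW"}], has_dose=True)
print(check_plan(plan))  # -> ['Beams: beam 0 EDW MU below 20']
```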
Restrictions on surgical resident shift length does not impact type of medical errors.
Anderson, Jamie E; Goodman, Laura F; Jensen, Guy W; Salcedo, Edgardo S; Galante, Joseph M
2017-05-15
In 2011, resident duty hours were restricted in an attempt to improve patient safety and resident education. With the goal of reducing fatigue, shorter shift lengths lead to more patient handoffs, raising concerns about adverse effects on patient safety. This study seeks to determine whether differences in duty-hour restrictions influence the types of errors made by residents. This is a nested retrospective cohort study at a surgery department in an academic medical center. During 2013-14, standard 2011 duty hours were in place for residents. In 2014-15, duty-hour restrictions at the study site were relaxed ("flexible") with no restrictions on shift length. We reviewed all morbidity and mortality submissions from July 1, 2013-June 30, 2015 and compared differences in types of errors between these periods. A total of 383 patients experienced adverse events, including 59 deaths (15.4%). Comparing the standard versus flexible periods, there was no difference in mortality (15.7% versus 12.6%, P = 0.479) or complication rates (2.6% versus 2.5%, P = 0.696). There was no difference in types of errors between periods (P = 0.050-0.808). The largest number of errors was due to cognitive failures (229, 59.6%), whereas the smallest was due to team failure (127, 33.2%). By subset, technical errors accounted for the highest number of errors (169, 44.1%). There were no differences between types of errors of cases that were nonelective, at night, or involving residents. Among adverse events reported in this departmental surgical morbidity and mortality review, there were no differences in types of errors when resident duty hours were less restrictive. Copyright © 2017 Elsevier Inc. All rights reserved.
Challenges in Construction Over Soft Soil - Case Studies in Malaysia
NASA Astrophysics Data System (ADS)
Mohamad, N. O.; Razali, C. E.; Hadi, A. A. A.; Som, P. P.; Eng, B. C.; Rusli, M. B.; Mohamad, F. R.
2016-07-01
Construction on soft ground is a great challenge in the field of geotechnical engineering. Many engineering problems in the form of slope instability, bearing capacity failure, or excessive settlement can occur either during or after the construction phase due to the low shear strength and high compressibility of this soil. As the main technical agency responsible for the implementation of development projects for the Government of Malaysia, the Public Works Department (PWD) has vast experience in dealing with this problematic soil over the years. This paper discusses and elaborates on the engineering problems encountered in construction projects that have been carried out by PWD, namely the Core Facilities Building of Polytechnic Kota Kinabalu in Sabah and the Hospital Tengku Ampuan Rahimah Integration Quarters in Klang, Selangor. Instability of the ground during construction works caused delay and cost overrun in the completion of the project in Selangor, whereas continuous post-construction settlement affected the integrity and serviceability of the building in Sabah. The causes of failure and the proposed rehabilitation works for both projects are also discussed in brief.
Glauber gluons and multiple parton interactions
NASA Astrophysics Data System (ADS)
Gaunt, Jonathan R.
2014-07-01
We show that for hadronic transverse energy E_T in hadron-hadron collisions, the classic Collins-Soper-Sterman (CSS) argument for the cancellation of Glauber gluons breaks down at the level of two Glauber gluons exchanged between the spectators. Through an argument that relates the diagrams with these Glauber gluons to events containing additional soft scatterings, we suggest that this failure of the CSS cancellation actually corresponds to a failure of the `standard' factorisation formula with hard, soft and collinear functions to describe E_T at leading power. This is because the observable receives a leading power contribution from multiple parton interaction (or spectator-spectator Glauber) processes. We also suggest that the same argument can be used to show that a whole class of observables, which we refer to as MPI sensitive observables, do not obey the standard factorisation at leading power. MPI sensitive observables are observables whose distributions in hadron-hadron collisions are disrupted strongly by the presence of multiple parton interactions (MPI) in the event. Examples of further MPI sensitive observables include the beam thrust B_{a,b}^+ and the transverse thrust.
Effect of the mandible on mouthguard measurements of head kinematics.
Kuo, Calvin; Wu, Lyndia C; Hammoor, Brad T; Luck, Jason F; Cutcliffe, Hattie C; Lynall, Robert C; Kait, Jason R; Campbell, Kody R; Mihalik, Jason P; Bass, Cameron R; Camarillo, David B
2016-06-14
Wearable sensors are becoming increasingly popular for measuring head motions and detecting head impacts. Many sensors are worn on the skin or in headgear and can suffer from motion artifacts introduced by the compliance of soft tissue or decoupling of headgear from the skull. The instrumented mouthguard is designed to couple directly to the upper dentition, which is made of hard enamel and anchored in a bony socket by stiff ligaments. This gives the mouthguard superior coupling to the skull compared with other systems. However, multiple validation studies have yielded conflicting results with respect to the mouthguard's head kinematics measurement accuracy. Here, we demonstrate that imposing different constraints on the mandible (lower jaw) can alter mouthguard kinematic accuracy in dummy headform testing. In addition, post mortem human surrogate tests utilizing the worst-case unconstrained mandible condition yield 40% and 80% normalized root mean square error in angular velocity and angular acceleration, respectively. These errors can be modeled using a simple spring-mass system in which the soft mouthguard material near the sensors acts as a spring and the mandible as a mass. However, the mouthguard can be designed to mitigate these disturbances by isolating sensors from mandible loads, improving accuracy to below 15% normalized root mean square error in all kinematic measures. Thus, while current mouthguards would suffer from measurement errors in the worst-case unconstrained mandible condition, future mouthguards should be designed to account for these disturbances and future validation testing should include unconstrained mandibles to ensure proper accuracy. Copyright © 2016 Elsevier Ltd. All rights reserved.
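The spring-mass explanation can be sketched numerically: a sensor mass coupled to the skull through a compliant spring-damper layer measures its own absolute acceleration, which deviates from the true head acceleration during an impact pulse. All parameter values below are invented for illustration and are not the paper's fitted model.

```python
import numpy as np

# Toy base-excited oscillator: soft mouthguard material as spring (k) and
# damper (c), a sensor/mandible mass m riding on it -- values assumed.
m, k, c = 0.06, 4.0e4, 8.0            # kg, N/m, N*s/m
dt, T = 1e-5, 0.03                     # time step, window (s)
t = np.arange(0.0, T, dt)
a_head = 980.0 * np.sin(np.pi * t / 0.01) * (t < 0.01)  # 100 g half-sine pulse

x = v = 0.0                            # displacement/velocity relative to skull
a_meas = np.empty_like(t)
for i, a in enumerate(a_head):
    a_abs = (-k * x - c * v) / m       # absolute acceleration the mass feels
    a_meas[i] = a_abs                  # what the spring-coupled sensor reads
    v += (a_abs - a) * dt              # integrate relative motion (semi-implicit)
    x += v * dt

err = np.sqrt(np.mean((a_meas - a_head) ** 2)) / a_head.max()
print(f"normalized RMS error from compliant coupling: {err:.1%}")
```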
Large poroelastic deformation of a soft material
NASA Astrophysics Data System (ADS)
MacMinn, Christopher W.; Dufresne, Eric R.; Wettlaufer, John S.
2014-11-01
Flow through a porous material will drive mechanical deformation when the fluid pressure becomes comparable to the stiffness of the solid skeleton. This has applications ranging from hydraulic fracture for recovery of shale gas, where fluid is injected at high pressure, to the mechanics of biological cells and tissues, where the solid skeleton is very soft. The traditional linear theory of poroelasticity captures this fluid-solid coupling by combining Darcy's law with linear elasticity. However, linear elasticity is only volume-conservative to first order in the strain, which can become problematic when damage, plasticity, or extreme softness lead to large deformations. Here, we compare the predictions of linear poroelasticity with those of a large-deformation framework in the context of two model problems. We show that errors in volume conservation are compounded and amplified by coupling with the fluid flow, and can become important even when the deformation is small. We also illustrate these results with a laboratory experiment.
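The volume-conservation point is easy to see in one line of algebra: linear elasticity approximates the volume change by tr(ε), while the exact change is det(F) − 1. A short sketch for an equitriaxial stretch shows the mismatch growing with strain, which, per the abstract, the coupling with fluid flow then compounds and amplifies.

```python
# Compare the linearized volume change tr(eps) = 3*strain with the exact
# det(F) - 1 = (1 + strain)^3 - 1 for an equitriaxial stretch.
for strain in (0.01, 0.05, 0.10, 0.20):
    linear = 3.0 * strain                  # tr(eps), first-order in strain
    exact = (1.0 + strain) ** 3 - 1.0      # det(F) - 1, exact
    print(f"strain {strain:.2f}: linear {linear:.4f}, exact {exact:.4f}, "
          f"relative error {(linear - exact) / exact:+.1%}")
```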
NASA Astrophysics Data System (ADS)
Elfgen, S.; Franck, D.; Hameyer, K.
2018-04-01
Magnetic measurements are indispensable for the characterization of soft magnetic materials used, e.g., in electrical machines. Characteristic values are used for quality control during production and for the parametrization of material models. Uncertainties and errors in the measurements are reflected directly in the parameters of the material models. This can result in over-dimensioning and inaccuracies in simulations for the design of electrical machines. Therefore, the existing influencing factors in the characterization of soft magnetic materials are named and their resulting uncertainty contributions are studied. The analysis of the resulting uncertainty contributions can serve the operator as an additional selection criterion for different measuring sensors. The investigation is performed for measurements within and outside the currently prescribed standard using a single sheet tester, and its impact on the identification of iron-loss parameters is studied.
NASA Astrophysics Data System (ADS)
Jiang, Cong; Yu, Zong-Wen; Wang, Xiang-Bin
2017-03-01
We show how to calculate the secure final key rate in the four-intensity decoy-state measurement-device-independent quantum key distribution protocol with both source errors and statistical fluctuations with a certain failure probability. Our results rely only on the range of a few parameters in the source state. All imperfections in this protocol have been taken into consideration without assuming any specific error patterns of the source.
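Statistical fluctuations with a stated failure probability are typically handled with concentration bounds. As a generic illustration (a plain Hoeffding bound, not the paper's tighter protocol-specific analysis; all numbers invented):

```python
from math import log, sqrt

def hoeffding_interval(n, p_hat, eps):
    """Two-sided Hoeffding bound: except with failure probability eps, the
    true mean of n i.i.d. [0,1] observations lies within delta of p_hat."""
    delta = sqrt(log(2.0 / eps) / (2.0 * n))
    return max(0.0, p_hat - delta), min(1.0, p_hat + delta)

# Illustrative numbers: 1e10 pulses, observed gain 3e-3, and a
# failure-probability budget of 1e-10 for this single estimate.
lo, hi = hoeffding_interval(10**10, 3e-3, 1e-10)
print(f"true gain in [{lo:.6f}, {hi:.6f}] except with probability 1e-10")
```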
Clarke, S G; Phillips, A T M; Bull, A M J; Cobb, J P
2012-06-01
The impact of anatomical variation and surgical error on excessive wear and loosening of the acetabular component of large diameter metal-on-metal hip arthroplasties was measured using a multi-factorial analysis through 112 different simulations. Each surgical scenario was subject to eight different daily loading activities using finite element analysis. Excessive wear appears to be predominantly dependent on cup orientation, with inclination error having a higher influence than version error, according to the study findings. Acetabular cup loosening, as inferred from initial implant stability, appears to depend predominantly on factors concerning the area of cup-bone contact, specifically the level of cup seating achieved and the individual patient's anatomy. The extent of press fit obtained at time of surgery did not appear to influence either mechanism of failure in this study. Copyright © 2012 Elsevier Ltd. All rights reserved.
How unrealistic optimism is maintained in the face of reality.
Sharot, Tali; Korn, Christoph W; Dolan, Raymond J
2011-10-09
Unrealistic optimism is a pervasive human trait that influences domains ranging from personal relationships to politics and finance. How people maintain unrealistic optimism, despite frequently encountering information that challenges those biased beliefs, is unknown. We examined this question and found a marked asymmetry in belief updating. Participants updated their beliefs more in response to information that was better than expected than to information that was worse. This selectivity was mediated by a relative failure to code for errors that should reduce optimism. Distinct regions of the prefrontal cortex tracked estimation errors when those called for positive update, both in individuals who scored high and low on trait optimism. However, highly optimistic individuals exhibited reduced tracking of estimation errors that called for negative update in right inferior prefrontal gyrus. These findings indicate that optimism is tied to a selective update failure and diminished neural coding of undesirable information regarding the future.
An experiment in software reliability
NASA Technical Reports Server (NTRS)
Dunham, J. R.; Pierce, J. L.
1986-01-01
The results of a software reliability experiment conducted in a controlled laboratory setting are reported. The experiment was undertaken to gather data on software failures and is one in a series of experiments being pursued by the Fault Tolerant Systems Branch of NASA Langley Research Center to find a means of credibly performing reliability evaluations of flight control software. The experiment tests a small sample of implementations of radar tracking software having ultra-reliability requirements and uses n-version programming for error detection, and repetitive run modeling for failure and fault rate estimation. The experiment results agree with those of Nagel and Skrivan in that the program error rates suggest an approximate log-linear pattern and the individual faults occurred with significantly different error rates. Additional analysis of the experimental data raises new questions concerning the phenomenon of interacting faults. This phenomenon may provide one explanation for software reliability decay.
FPGA-Based, Self-Checking, Fault-Tolerant Computers
NASA Technical Reports Server (NTRS)
Some, Raphael; Rennels, David
2004-01-01
A proposed computer architecture would exploit the capabilities of commercially available field-programmable gate arrays (FPGAs) to enable computers to detect and recover from bit errors. The main purpose of the proposed architecture is to enable fault-tolerant computing in the presence of single-event upsets (SEUs). [An SEU is a spurious bit flip (also called a soft error) caused by a single impact of ionizing radiation.] The architecture would also enable recovery from some soft errors caused by electrical transients and, to some extent, from intermittent and permanent (hard) errors caused by aging of electronic components. A typical FPGA of the current generation contains one or more complete processor cores, memories, and high-speed serial input/output (I/O) channels, making it possible to shrink a board-level processor node to a single integrated-circuit chip. Custom, highly efficient microcontrollers, general-purpose computers, custom I/O processors, and signal processors can be rapidly and efficiently implemented by use of FPGAs. Unfortunately, FPGAs are susceptible to SEUs. Prior efforts to mitigate the effects of SEUs have yielded solutions that degrade performance of the system and require support from external hardware and software. In comparison with other fault-tolerant-computing architectures (e.g., triple modular redundancy), the proposed architecture could be implemented with less circuitry and lower power demand. Moreover, the fault-tolerant computing functions would require only minimal support from circuitry outside the central processing units (CPUs) of computers, would not require any software support, and would be largely transparent to software and to other computer hardware. There would be two types of modules: a self-checking processor module and a memory system (see figure). The self-checking processor module would be implemented on a single FPGA and would be capable of detecting its own internal errors. It would contain two CPUs executing identical programs in lock step, with comparison of their outputs to detect errors. It would also contain various cache and local memory circuits, communication circuits, and configurable special-purpose processors that would use self-checking checkers. (The basic principle of the self-checking checker method is to utilize logic circuitry that generates error signals whenever there is an error in either the checker or the circuit being checked.) The memory system would comprise a main memory and a hardware-controlled check-pointing system (CPS) based on a buffer memory denoted the recovery cache. The main memory would contain random-access memory (RAM) chips and FPGAs that would, in addition to everything else, implement double-error-detecting and single-error-correcting memory functions to enable recovery from single-bit errors.
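The memory system's single-error-correcting function can be illustrated in miniature with a Hamming(7,4) code, which corrects any one flipped bit per word; this is a software sketch of the principle, not the FPGA implementation described above.

```python
# Hamming(7,4): 4 data bits protected by 3 parity bits at positions 1, 2, 4.
# Any single-bit upset yields a nonzero syndrome equal to the flipped position.

def encode(d):
    """Encode 4 data bits d into a 7-bit codeword (positions 1..7)."""
    p1 = d[0] ^ d[1] ^ d[3]   # covers positions 1, 3, 5, 7
    p2 = d[0] ^ d[2] ^ d[3]   # covers positions 2, 3, 6, 7
    p3 = d[1] ^ d[2] ^ d[3]   # covers positions 4, 5, 6, 7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def correct(c):
    """Correct at most one flipped bit in codeword c; return the data bits."""
    s = 0
    for pos, bit in enumerate(c, start=1):
        if bit:
            s ^= pos            # syndrome = XOR of positions of set bits
    if s:                       # nonzero syndrome points at the flipped bit
        c[s - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
cw = encode(word)
cw[4] ^= 1                      # inject a single-event upset
assert correct(cw) == word
print("single-bit upset corrected:", word)
```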
Diagnostic decision-making and strategies to improve diagnosis.
Thammasitboon, Satid; Cutrer, William B
2013-10-01
A significant portion of diagnostic errors arises through cognitive errors resulting from inadequate knowledge, faulty data gathering, and/or faulty verification. Experts estimate that 75% of diagnostic failures can be attributed to failures in clinician diagnostic thinking. The cognitive processes that underlie the diagnostic thinking of clinicians are complex and intriguing, and it is imperative that clinicians acquire an explicit appreciation and application of different cognitive approaches to improve their decisions. A dual-process model that unifies many theories of decision-making has emerged as a promising template for understanding how clinicians think and judge efficiently in a diagnostic reasoning process. The identification and implementation of strategies for decreasing or preventing such diagnostic errors has become a growing area of interest and research. Suggested strategies to decrease the incidence of diagnostic error include increasing clinicians' clinical expertise and avoiding inherent cognitive errors. Implementing interventions focused solely on avoiding errors may work effectively for patient safety issues such as medication errors. Addressing cognitive errors, however, requires equal effort on expanding the individual clinician's expertise. Providing cognitive support to clinicians for robust diagnostic decision-making serves as the final strategic target for decreasing diagnostic errors. Clinical guidelines and algorithms offer another method for streamlining decision-making and decreasing the likelihood of cognitive diagnostic errors. Addressing cognitive processing errors is undeniably the most challenging task in reducing diagnostic errors. While many suggested approaches exist, they are mostly based on theories and sciences in cognitive psychology, decision-making, and education. The proposed interventions are primarily suggestions and very few of them have been tested in actual practice settings. A collaborative research effort is required to effectively address cognitive processing errors. Researchers in various areas, including patient safety/quality improvement, decision-making, and problem solving, must work together to make medical diagnosis more reliable. © 2013 Mosby, Inc. All rights reserved.
Chest Wall Diseases: Respiratory Pathophysiology.
Tzelepis, George E
2018-06-01
The chest wall consists of various structures that function in an integrated fashion to ventilate the lungs. Disorders affecting the bony structures or soft tissues of the chest wall may impose elastic loads by stiffening the chest wall and decreasing respiratory system compliance. These alterations increase the work of breathing and lead to hypoventilation and hypercapnia. Respiratory failure may occur acutely or after a variable period of time. This review focuses on the pathophysiology of respiratory function in specific diseases and disorders of the chest wall, and highlights pathogenic mechanisms of respiratory failure. Copyright © 2018 Elsevier Inc. All rights reserved.
Design of Stretchable Electronics Against Impact.
Yuan, J H; Pharr, M; Feng, X; Rogers, John A; Huang, Yonggang
2016-10-01
Stretchable electronics offer soft, biocompatible mechanical properties; these same properties make them susceptible to device failure associated with physical impact. This paper studies designs for stretchable electronics that resist failure from impacts due to incorporation of a viscoelastic encapsulation layer. Results indicate that the impact resistance depends on the thickness and viscoelastic properties of the encapsulation layer, as well as the duration of impact. An analytic model for the critical thickness of the encapsulation layer is established. It is shown that a commercially available, low modulus silicone material offers viscous properties that make it a good candidate as the encapsulation layer for stretchable electronics.
Ozone Profile Retrievals from the OMPS on Suomi NPP
NASA Astrophysics Data System (ADS)
Bak, J.; Liu, X.; Kim, J. H.; Haffner, D. P.; Chance, K.; Yang, K.; Sun, K.; Gonzalez Abad, G.
2017-12-01
We verify and correct the Ozone Mapping and Profiler Suite (OMPS) Nadir Mapper (NM) L1B v2.0 data with the aim of producing accurate ozone profile retrievals using an optimal estimation based inversion method in the 302.5-340 nm fitting window. The evaluation of available slit functions demonstrates that preflight-measured slit functions represent OMPS measurements well compared to derived Gaussian slit functions. Our OMPS fitting residuals contain significant wavelength and cross-track dependent biases, and thereby serious cross-track striping errors are found in preliminary retrievals, especially in the troposphere. To eliminate the systematic component of the fitting residuals, we apply "soft calibration" to OMPS radiances. With the soft calibration the amplitude of fitting residuals decreases from 1% to 0.2% over low/mid latitudes, and thereby the consistency of tropospheric ozone retrievals between OMPS and the Ozone Monitoring Instrument (OMI) is substantially improved. A common mode correction is implemented for additional radiometric calibration, which improves retrievals especially at high latitudes where the amplitude of fitting residuals decreases by a factor of 2. We estimate the floor noise error of OMPS measurements from standard deviations of the fitting residuals. The derived error in the Huggins band (~0.1%) is 2 times smaller than the OMI floor noise error and 2 times larger than the OMPS L1B measurement error. The OMPS floor noise errors better constrain our retrievals, maximizing measurement information and stabilizing our fitting residuals. The final precision of the fitting residuals is less than 0.1% in the low/mid latitudes, with ~1 degree of freedom for signal for tropospheric ozone, so that we meet the general requirements for successful tropospheric ozone retrievals. To assess whether the quality of OMPS ozone retrievals is acceptable for scientific use, we will characterize OMPS ozone profile retrievals, present an error analysis, and validate retrievals using a reference dataset. The useful information on the vertical distribution of ozone is limited to below 40 km from OMPS NM measurements alone due to the absence of Hartley-band ozone wavelengths. This shortcoming will be improved with a joint ozone profile retrieval using Nadir Profiler (NP) measurements covering the 250 to 310 nm range.
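The "soft calibration" step amounts to estimating the systematic, wavelength- and cross-track-dependent part of the fitting residuals from many scenes and subtracting it. A toy numpy sketch with invented array shapes and an invented hidden bias:

```python
import numpy as np

rng = np.random.default_rng(3)
n_scenes, n_xtrack, n_wave = 500, 36, 120

# Invented systematic bias varying with wavelength and cross-track position,
# plus random noise -- a stand-in for the real fitting residuals.
systematic = (0.005 * np.sin(np.linspace(0, 6, n_wave))[None, :]
              * np.linspace(0.5, 1.5, n_xtrack)[:, None])
residuals = systematic[None, :, :] + rng.normal(0, 0.002, (n_scenes, n_xtrack, n_wave))

correction = residuals.mean(axis=0)          # per (cross-track, wavelength) bias
cleaned = residuals - correction[None, :, :]  # apply the soft calibration
print(f"residual amplitude before: {residuals.std():.4f}, after: {cleaned.std():.4f}")
```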
Effect of system compliance on crack nucleation in soft materials
NASA Astrophysics Data System (ADS)
Rattan, Shruti; Crosby, Alfred
Puncture mechanics in soft materials is critical for the development of new surgical instruments, robot-assisted surgery, as well as new materials used in personal protective equipment. However, analytical techniques to study this important deformation process are limited. We have previously described a simple experimental method to study the resistive forces and failure of a soft gel being indented with a small-tip needle. We showed that puncture stresses can reach two orders of magnitude greater than the material modulus and that the force response is insensitive to the geometry of the indenter at large indentation depths. Currently, we are examining the influence of system compliance on crack nucleation (e.g. puncture) in soft gels. It is well known that system compliance influences the peak force in adhesion and traditional fracture experiments; however, its influence on crack nucleation is unresolved. We find that as the system becomes more compliant, lower peak forces are required to puncture a gel of a given stiffness with the same indenter. We are developing scaling relationships to relate the peak puncture force and system compliance. Our findings introduce new questions with regard to the possibility of intrinsic material properties related to the critical stress and energy for crack nucleation in soft materials.
ERIC Educational Resources Information Center
Jones, Katherine J.; Cochran, Gary; Hicks, Rodney W.; Mueller, Keith J.
2004-01-01
Context:Low service volume, insufficient information technology, and limited human resources are barriers to learning about and correcting system failures in small rural hospitals. This paper describes the implementation of and initial findings from a voluntary medication error reporting program developed by the Nebraska Center for Rural Health…
Patient safety: honoring advanced directives.
Tice, Martha A
2007-02-01
Healthcare providers typically think of patient safety in the context of preventing iatrogenic injury. Prevention of falls and medication or treatment errors is the typical focus of adverse event analyses. If healthcare providers are committed to honoring the wishes of patients, then perhaps failures to honor advanced directives should be viewed as reportable medical errors.
25+ Years of the Hubble Space Telescope and a Simple Error That Cost Millions
ERIC Educational Resources Information Center
Shakerin, Said
2016-01-01
A simple mistake in properly setting up a measuring device caused millions of dollars to be spent in correcting the initial optical failure of the Hubble Space Telescope (HST). This short article is intended as a lesson for a physics laboratory and discussion of errors in measurement.
Understanding Teamwork in Trauma Resuscitation through Analysis of Team Errors
ERIC Educational Resources Information Center
Sarcevic, Aleksandra
2009-01-01
An analysis of human errors in complex work settings can lead to important insights into the workspace design. This type of analysis is particularly relevant to safety-critical, socio-technical systems that are highly dynamic, stressful and time-constrained, and where failures can result in catastrophic societal, economic or environmental…
49 CFR Appendix C to Part 236 - Safety Assurance Criteria and Processes
Code of Federal Regulations, 2010 CFR
2010-10-01
... system (all its elements including hardware and software) must be designed to assure safe operation with... unsafe errors in the software due to human error in the software specification, design, or coding phases... (hardware or software, or both) are used in combination to ensure safety. If a common mode failure exists...
Shifting and Sharing: Academic Physicians' Strategies for Navigating Underperformance and Failure.
LaDonna, Kori A; Ginsburg, Shiphra; Watling, Christopher
2018-05-22
Medical practice is uncertain and complex. Consequently, even outstanding performers will inevitably experience moments of underperformance and failure. Coping relies on insight and resilience. However, how physicians develop and use these skills to navigate struggle remains underexplored. A better understanding may reveal strategies to support both struggling learners and stressed practitioners. In 2015, 28 academic physicians were interviewed about their experiences with underperformance or failure. Constructivist grounded theory informed data collection and analysis. Participants' experiences with struggle ranged from patient errors and academic failures to frequent, smaller moments of interpersonal conflict and work-life imbalance. To buffer impact, participants sometimes shifted their focus to an aspect of their identity where they felt successful. Additionally, while participants perceived that insight develops by acknowledging and reflecting on error, they sometimes deflected blame for performance gaps. More often, participants seemed to accept personal responsibility while simultaneously sharing accountability for underperformance or failure with external forces. Paradoxically, participants perceived learners who used these strategies as lacking in insight. Participants demonstrated the protective and functional value of distributing responsibility for underperformance and failure. Shifting and sharing may be an element of reflection and resilience; recognizing external factors may provide a way to gain perspective and to preserve the self. However, this strategy challenges educators' assumptions that learners who deflect are avoiding personal responsibility. The authors' findings raise questions about what it means to be resilient, and how assumptions about learners' responses to failure may affect strategies to support underperforming learners.
Safety Strategies in an Academic Radiation Oncology Department and Recommendations for Action
Terezakis, Stephanie A.; Pronovost, Peter; Harris, Kendra; DeWeese, Theodore; Ford, Eric
2013-01-01
Background: Safety initiatives in the United States continue to work on providing guidance as to how the average practitioner might make patients safer in the face of the complex process by which radiation therapy (RT), an essential treatment used in the management of many patients with cancer, is prepared and delivered. Quality control measures can uncover certain specific errors such as machine dose mis-calibration or misalignments of the patient in the radiation treatment beam. However, they are less effective at uncovering less common errors that can occur anywhere along the treatment planning and delivery process, and even when the process is functioning as intended, errors still occur. Prioritizing Risks and Implementing Risk-Reduction Strategies: Activities undertaken at the radiation oncology department at the Johns Hopkins Hospital (Baltimore) include Failure Mode and Effects Analysis (FMEA), risk-reduction interventions, and voluntary error and near-miss reporting systems. A visual process map portrayed 269 RT steps occurring among four subprocesses: consult, simulation, treatment planning, and treatment delivery. Two FMEAs revealed 127 and 159 possible failure modes, respectively. Risk-reduction interventions for 15 "top-ranked" failure modes were implemented. Since the error and near-miss reporting system's implementation in the department in 2007, 253 events have been logged. However, the system may be insufficient for radiation oncology, for which a greater level of practice-specific information is required to fully understand each event. Conclusions: The "basic science" of radiation treatment has received considerable support and attention in developing novel therapies to benefit patients. The time has come to apply the same focus and resources to ensuring that patients safely receive the maximal benefits possible. PMID:21819027
Sayler, Elaine; Eldredge-Hindy, Harriet; Dinome, Jessie; Lockamy, Virginia; Harrison, Amy S
2015-01-01
The planning procedure for Valencia and Leipzig surface applicators (VLSAs) (Nucletron, Veenendaal, The Netherlands) differs substantially from CT-based planning; the unfamiliarity could lead to significant errors. This study applies failure modes and effects analysis (FMEA) to high-dose-rate (HDR) skin brachytherapy using VLSAs to ensure safety and quality. A multidisciplinary team created a protocol for HDR VLSA skin treatments and applied FMEA. Failure modes were identified and scored by severity, occurrence, and detectability. The clinical procedure was then revised to address high-scoring process nodes. Several key components were added to the protocol to minimize risk priority numbers (RPNs). (1) Diagnosis, prescription, applicator selection, and setup are reviewed at weekly quality assurance rounds. Peer review reduces the likelihood of an inappropriate treatment regimen. (2) A template for HDR skin treatments was established in the clinic's electronic medical record system to standardize treatment instructions. This reduces the chances of miscommunication between the physician and planner as well as increases the detectability of an error. (3) A screen check was implemented during the second check to increase detectability of an error. (4) To reduce error probability, the treatment plan worksheet was designed to display plan parameters in a format visually similar to the treatment console display, facilitating data entry and verification. (5) VLSAs are color coded and labeled to match the electronic medical record prescriptions, simplifying in-room selection and verification. Multidisciplinary planning and FMEA increased detectability and reduced error probability during VLSA HDR brachytherapy. This clinical model may be useful to institutions implementing similar procedures. Copyright © 2015 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
Reconfigurable Control with Neural Network Augmentation for a Modified F-15 Aircraft
NASA Technical Reports Server (NTRS)
Burken, John J.
2007-01-01
This paper describes the performance of a simplified dynamic inversion controller with neural network supplementation. This 6 DOF (Degree-of-Freedom) simulation study focuses on the results with and without adaptation of neural networks, using a simulation of the NASA modified F-15, which has canards. One area of interest is the performance during a simulated surface failure while attempting to minimize the inertial cross coupling effect of a [B] matrix failure (a control derivative anomaly associated with a jammed or missing control surface). Another area of interest, also presented, is simulated aerodynamic ([A] matrix) failures, such as a canard failure. The controller uses explicit models to produce desired angular rate commands. The dynamic inversion calculates the necessary surface commands to achieve the desired rates. The simplified dynamic inversion uses approximate short period and roll axis dynamics. Initial results indicated that, for a [B] matrix failure, using a Neural Network (NN) improved the control behavior when compared to not using a neural network for a given failure. However, after changes were made to the controller, further evaluation showed comparable behavior, with objectionable cross coupling effects. This paper describes the methods employed to reduce the cross coupling effect and maintain adequate tracking errors. The [A] matrix failure results show that control of the aircraft without adaptation is more difficult (less damped) than with active neural networks. Simulation results show that Neural Network augmentation of the controller improves performance in terms of tracking error and cross coupling reduction, and improves performance with aerodynamic-type failures.
Trends in Device SEE Susceptibility from Heavy Ions
NASA Technical Reports Server (NTRS)
Nichols, D. K.; Coss, J. R.; McCarty, K. P.; Schwartz, H. R.; Swift, G. M.; Watson, R. K.; Koga, R.; Crain, W. R.; Crawford, K. B.; Hansel, S. J.
1995-01-01
The sixth set of heavy ion single event effects (SEE) test data has been collected since the last IEEE publications in the December issues of the IEEE Transactions on Nuclear Science for 1985, 1987, 1989, 1991, and in the IEEE Workshop Record, 1993. Trends in SEE susceptibility (including soft errors and latchup) for state-of-the-art devices are evaluated.
Estimating soft tissue thickness from light-tissue interactions––a simulation study
Wissel, Tobias; Bruder, Ralf; Schweikard, Achim; Ernst, Floris
2013-01-01
Immobilization and marker-based motion tracking in radiation therapy often cause decreased patient comfort. However, the more comfortable alternative of optical surface tracking is highly inaccurate due to missing point-to-point correspondences between subsequent point clouds as well as elastic deformation of soft tissue. In this study, we present a proof of concept for measuring subcutaneous features with a laser scanner setup, focusing on skin thickness as an additional input for high-accuracy optical surface tracking. Using Monte-Carlo simulations for multi-layered tissue, we show that informative features can be extracted from the simulated tissue reflection by integrating intensities within concentric ROIs around the laser spot center. Training a regression model with a simulated data set identifies patterns that allow for predicting skin thickness with a root mean square error as low as 18 µm. Different approaches to compensate for varying observation angles were shown to yield errors still below 90 µm. Finally, this initial study provides a very promising proof of concept and encourages research towards a practical prototype. PMID:23847741
Khozani, Zohreh Sheikh; Bonakdari, Hossein; Zaji, Amir Hossein
2016-01-01
Two new soft computing models, namely genetic programming (GP) and a genetic artificial algorithm (GAA) neural network (a combination of modified genetic algorithm and artificial neural network methods), were developed in order to predict the percentage of shear force in a rectangular channel with non-homogeneous roughness. The ability of these methods to estimate the percentage of shear force was investigated. Moreover, the independent parameters' effectiveness in predicting the percentage of shear force was determined using sensitivity analysis. According to the results, the GP model demonstrated superior performance to the GAA model. A comparison was also made between the GP program, determined as the best model, and five equations obtained in prior research. The GP model, with the lowest error values (root mean square error (RMSE) of 0.0515), performed best among the equations presented for rough and smooth channels as well as smooth ducts. The equation proposed for rectangular channels with rough boundaries (RMSE of 0.0642) outperformed the prior equations for smooth boundaries.
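For reference, the ranking criterion used in this abstract is the root mean square error; a minimal sketch in Python (with made-up numbers, not the study's data):

```python
import numpy as np

def rmse(predicted, observed):
    """Root mean square error, the criterion used to rank the models."""
    predicted, observed = np.asarray(predicted), np.asarray(observed)
    return np.sqrt(np.mean((predicted - observed) ** 2))

# Illustrative values only: predicted vs. observed fractions of the
# shear force carried by the channel walls.
print(round(rmse([0.61, 0.55, 0.70, 0.48], [0.63, 0.50, 0.72, 0.47]), 4))
```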
NASA Astrophysics Data System (ADS)
Fulkerson, David E.
2010-02-01
This paper describes a new methodology for characterizing the electrical behavior and soft error rate (SER) of CMOS and SiGe HBT integrated circuits that are struck by ions. A typical engineering design problem is to calculate the SER of a critical path that commonly includes several circuits such as an input buffer, several logic gates, logic storage, clock tree circuitry, and an output buffer. Using multiple 3D TCAD simulations to solve this problem is too costly and time-consuming for general engineering use. The new and simple methodology handles the problem with ease by simple SPICE simulations. The methodology accurately predicts the measured threshold linear energy transfer (LET) of a bulk CMOS SRAM. It solves for circuit currents and voltage spikes that are close to those predicted by expensive 3D TCAD simulations. It accurately predicts the measured event cross-section vs. LET curve of an experimental SiGe HBT flip-flop. The experimental cross section vs. frequency behavior and other subtle effects are also accurately predicted.
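The paper's compact circuit models are not reproduced here, but the standard SPICE-level idiom for an ion strike is a double-exponential current source attached to the struck node. A minimal sketch of that general technique, assuming typical textbook time constants and the usual silicon charge-deposition constant (none of these values are taken from the paper):

```python
import numpy as np

def ion_strike_current(t, q_coll, tau_alpha=200e-12, tau_beta=50e-12):
    """Double-exponential current pulse commonly used to model an ion
    strike in circuit-level SEE simulation. q_coll is the collected
    charge in coulombs; the pulse integrates to exactly q_coll."""
    i0 = q_coll / (tau_alpha - tau_beta)
    return i0 * (np.exp(-t / tau_alpha) - np.exp(-t / tau_beta))

# Collected charge from LET in silicon: Q [pC] ~ 0.0103 * LET [MeV*cm2/mg]
# * collection depth [um] (a standard rule of thumb, not a paper value).
let_val, depth_um = 10.0, 2.0
q_pc = 0.0103 * let_val * depth_um
t = np.linspace(0, 2e-9, 500)
i = ion_strike_current(t, q_pc * 1e-12)
print(f"collected charge = {q_pc:.3f} pC, peak current = {i.max() * 1e3:.2f} mA")
```

In a SPICE deck this pulse would appear as an exponential or piecewise-linear current source between the struck drain node and the substrate.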
Hubert, G; Regis, D; Cheminet, A; Gatti, M; Lacoste, V
2014-10-01
Particles originating from primary cosmic radiation that hit the Earth's atmosphere give rise to a complex field of secondary particles, including neutrons, protons, muons, pions, etc. Since the 1980s it has been known that terrestrial cosmic rays can penetrate the natural shielding of buildings, equipment and circuit packages and induce soft errors in integrated circuits. Recently, research has shown that commercial static random access memories are now so small and sufficiently sensitive that single event upsets (SEUs) may be induced by the electronic stopping of a proton. With continued advancements in process size, this downward trend in sensitivity is expected to continue, and muon-induced soft errors have been predicted for nano-electronics. This paper describes specific cases of neutron-, proton- and muon-induced SEUs observed in complementary metal-oxide semiconductor devices. The results allow investigation of technology node sensitivity along the scaling trend. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Technical Reports Server (NTRS)
Nitta, Nariaki
1988-01-01
Hard X-ray spectra in solar flares obtained by the broadband spectrometers aboard Hinotori and SMM are compared. Within the uncertainty brought about by assuming the typical energy of the background X-rays, spectra from the Hinotori spectrometer are usually consistent with those from the SMM spectrometer for flares in 1981. In contrast, flares in 1982 persistently show 20-50-percent higher flux by Hinotori than by SMM. If this discrepancy is entirely attributable to errors in the calibration of energy ranges, the errors would be about 10 percent. Despite such a discrepancy in absolute flux, in the decay phase of one flare, spectra revealed a hard X-ray component (probably a 'superhot' component) that could be explained neither by emission from a plasma at about 2 x 10 to the 7th K nor by a nonthermal power-law component. Imaging observations during this period show hard X-ray emission nearly cospatial with soft X-ray emission, in contrast with earlier times at which hard and soft X-rays come from different places.
Bit Error Probability for Maximum Likelihood Decoding of Linear Block Codes
NASA Technical Reports Server (NTRS)
Lin, Shu; Fossorier, Marc P. C.; Rhee, Dojun
1996-01-01
In this paper, the bit error probability P(sub b) for maximum likelihood decoding of binary linear codes is investigated. The contribution of each information bit to P(sub b) is considered. For randomly generated codes, it is shown that the conventional high-SNR approximation, P(sub b) approximately equal to (d(sub H)/N)P(sub s), where P(sub s) represents the block error probability, holds for systematic encoding only. Also, systematic encoding provides the minimum P(sub b) when the inverse mapping corresponding to the generator matrix of the code is used to retrieve the information sequence. The bit error performances corresponding to other generator matrix forms are also evaluated. Although derived for codes with a randomly generated generator matrix, these results are shown to provide good approximations for codes used in practice. Finally, for decoding methods which require a generator matrix with a particular structure, such as trellis decoding or algebraic-based soft decision decoding, equivalent schemes that reduce the bit error probability are discussed.
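As a concrete reading of the approximation (the code parameters below are hypothetical, since the paper treats randomly generated codes in general):

```python
d_H, N = 11, 63        # hypothetical minimum distance and block length
P_s = 1.0e-5           # assumed block error probability at some operating SNR
P_b = (d_H / N) * P_s  # conventional high-SNR approximation (systematic encoding)
print(f"P_b ~ {P_b:.2e}")  # about 1.75e-06
```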
NASA Astrophysics Data System (ADS)
De Lorenzo, Danilo; De Momi, Elena; Beretta, Elisa; Cerveri, Pietro; Perona, Franco; Ferrigno, Giancarlo
2009-02-01
Computer Assisted Orthopaedic Surgery (CAOS) systems improve the results and the standardization of surgical interventions. Anatomical landmark and bone surface detection is needed both to register the surgical space with the pre-operative imaging space and to compute biomechanical parameters for prosthesis alignment. Surface point acquisition increases the invasiveness of the intervention and can be influenced by interposition of the soft tissue layer (7-15 mm localization errors). This study is aimed at evaluating the accuracy of a custom-made A-mode ultrasound (US) system for non-invasive detection of anatomical landmarks and surfaces. A-mode solutions eliminate the need for US image segmentation, offer real-time signal processing and require less invasive equipment. The system consists of a single-transducer US probe that is optically tracked, a pulser/receiver, an FPGA-based board, which is responsible for logic control command generation and for real-time signal processing, and three custom-made boards (signal acquisition, blanking and synchronization). We propose a new calibration method for the US system. The experimental validation was then performed by measuring the length of known-shape polymethylmethacrylate boxes filled with pure water and by acquiring bone surface points on a bovine bone phantom covered with soft-tissue-mimicking materials. Measurement errors were computed through MR and CT image acquisitions of the phantom. Point acquisition on the bone surface with the US system demonstrated lower errors (1.2 mm) than standard pointer acquisition (4.2 mm).
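The core conversion in any A-mode ranging system of this kind is pulse-echo time-of-flight to depth. A toy sketch of that step with a synthetic trace (the 1540 m/s speed of sound is a standard soft-tissue value and the sampling rate is assumed; neither is reported above):

```python
import numpy as np

fs = 100e6                     # sampling rate, Hz (assumed)
c = 1540.0                     # speed of sound in soft tissue, m/s (typical)
t = np.arange(0, 40e-6, 1 / fs)

# Synthetic A-mode trace: one echo from an interface at 15 mm depth
true_depth = 15e-3
echo_time = 2 * true_depth / c
rng = np.random.default_rng(0)
trace = np.exp(-((t - echo_time) / 0.2e-6) ** 2) + 0.05 * rng.standard_normal(t.size)

# Pulse-echo ranging: depth = c * time_of_flight / 2
tof = t[np.argmax(trace)]
print(f"estimated depth = {c * tof / 2 * 1e3:.2f} mm")
```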
Least Reliable Bits Coding (LRBC) for high data rate satellite communications
NASA Technical Reports Server (NTRS)
Vanderaar, Mark; Wagner, Paul; Budinger, James
1992-01-01
An analysis and discussion of a bandwidth-efficient multi-level/multi-stage block coded modulation technique called Least Reliable Bits Coding (LRBC) is presented. LRBC uses simple multi-level component codes that provide increased error protection on increasingly unreliable modulated bits in order to maintain an overall high code rate that increases spectral efficiency. Further, soft-decision multi-stage decoding is used to make decisions on unprotected bits through corrections made on more protected bits. Using analytical expressions and tight performance bounds, it is shown that LRBC can achieve increased spectral efficiency and maintain equivalent or better power efficiency compared to that of Binary Phase Shift Keying (BPSK). Bit error rates (BER) vs. channel bit energy with Additive White Gaussian Noise (AWGN) are given for a set of LRB Reed-Solomon (RS) encoded 8PSK modulation formats with an ensemble rate of 8/9. All formats exhibit a spectral efficiency of 2.67 = (log2(8))(8/9) information bps/Hz. Bit-by-bit coded and uncoded error probabilities with soft-decision information are determined. These are traded with code rate to determine parameters that achieve good performance. The relative simplicity of Galois field algebra vs. the Viterbi algorithm and the availability of high-speed commercial Very Large Scale Integration (VLSI) for block codes indicate that LRBC using block codes is a desirable method for high data rate implementations.
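The quoted ensemble spectral efficiency is simple arithmetic, which a one-liner confirms:

```python
import math

M, code_rate = 8, 8 / 9          # 8PSK with ensemble code rate 8/9
eta = math.log2(M) * code_rate   # information bits per second per Hz
print(f"spectral efficiency = {eta:.2f} bps/Hz")  # 2.67
```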
Erring and learning in clinical practice.
Hurwitz, Brian
2002-01-01
This paper discusses error types, their possible consequences, and the doctors who make them. There is no single, all-encompassing typology of medical errors. They are frequently multifactorial in origin and arise from the mental processes of individuals: from defects in perception, thinking, reasoning, planning and interpretation, and from failures of team-working, omissions and poorly executed actions. They also arise from inadequately designed and operated healthcare systems or procedures. The paper considers error-truth relatedness, the approach of UK courts to medical errors, the learning opportunities which flow from error recognition, and the need for personal and professional self-awareness of clinical fallibilities. PMID:12389767
Murthi, Anand M; Ramirez, Miguel A; Parks, Brent G; Carpenter, Shannon R
2017-12-01
The bicipital aponeurosis, or lacertus fibrosus, can potentially be used as a reconstruction graft in chronic distal biceps tendon tears. To evaluate construct stiffness, load to failure, and failure mechanism with lacertus fibrosus versus Achilles allograft for distal biceps tendon reconstruction. Controlled laboratory study. Ten fresh-frozen matched cadaveric pairs of elbows were used. Three centimeters of the distal biceps tendon was resected. Specimens were randomized to the lacertus fibrosus or Achilles tendon group. In one group, the lacertus fibrosus was released from its distal attachment and then tubularized and repaired intraosseously to the radius. In the other group, an Achilles tendon graft was sutured to the biceps muscle and repaired to the radius. The prepared radii were rigidly mounted at a 45° angle on a load frame. The proximal biceps muscle was secured in a custom-fabricated cryogenic grip. Displacement was measured using a differential variable reluctance transducer mounted at the radius-soft tissue junction and in the muscle- or muscle allograft-tissue junction proximal to the repair. Specimens were loaded at 20 mm/min until failure, defined as a 3-mm displacement at the radius-soft tissue junction. No significant difference was found in mean load to failure between the lacertus fibrosus and Achilles tendon group (mean ± SD, 20.2 ± 5.5 N vs 16.89 ± 4.54 N; P = .18). Stiffness also did not differ significantly between the lacertus fibrosus and Achilles tendon group (12.3 ± 7.1 kPa vs 10.5 ± 5.7 kPa; P = .34). The primary mode of failure in the lacertus fibrosus group was suture pullout from the tissue at the musculotendinous junction (7 of 10). In the Achilles group, failures were observed at the muscle-allograft interface (3) and the allograft-bone (radial tuberosity) interface (3), and 3 suture failures were observed. The button fixation did not fail in any specimens. The mean stiffness and load-to-failure values were not significantly different between a lacertus fibrosus construct and Achilles tendon allograft. Use of the lacertus fibrosus may be a potential alternative to Achilles tendon allograft reconstruction of chronic distal biceps tears when primary repair is not possible.
A circadian rhythm in skill-based errors in aviation maintenance.
Hobbs, Alan; Williamson, Ann; Van Dongen, Hans P A
2010-07-01
In workplaces where activity continues around the clock, human error has been observed to exhibit a circadian rhythm, with a characteristic peak in the early hours of the morning. Errors are commonly distinguished by the nature of the underlying cognitive failure, particularly the level of intentionality involved in the erroneous action. The Skill-Rule-Knowledge (SRK) framework of Rasmussen is used widely in the study of industrial errors and accidents. The SRK framework describes three fundamental types of error, according to whether behavior is under the control of practiced sensori-motor skill routines with minimal conscious awareness; is guided by implicit or explicit rules or expertise; or where the planning of actions requires the conscious application of domain knowledge. Up to now, examinations of circadian patterns of industrial errors have not distinguished between different types of error. Consequently, it is not clear whether all types of error exhibit the same circadian rhythm. A survey was distributed to aircraft maintenance personnel in Australia. Personnel were invited to anonymously report a safety incident and were prompted to describe, in detail, the human involvement (if any) that contributed to it. A total of 402 airline maintenance personnel reported an incident, providing 369 descriptions of human error in which the time of the incident was reported and sufficient detail was available to analyze the error. Errors were categorized using a modified version of the SRK framework, in which errors are categorized as skill-based, rule-based, or knowledge-based, or as procedure violations. An independent check confirmed that the SRK framework had been applied with sufficient consistency and reliability. Skill-based errors were the most common form of error, followed by procedure violations, rule-based errors, and knowledge-based errors. The frequency of errors was adjusted for the estimated proportion of workers present at work at each hour of the day, and the 24 h pattern of each error type was examined. Skill-based errors exhibited a significant circadian rhythm, being most prevalent in the early hours of the morning. Variation in the frequency of rule-based errors, knowledge-based errors, and procedure violations over the 24 h did not reach statistical significance. The results suggest that during the early hours of the morning, maintenance technicians are at heightened risk of "absent minded" errors involving failures to execute action plans as intended.
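The exposure adjustment described, dividing hourly error counts by the estimated proportion of staff present, can be sketched as follows (both the counts and the staffing curve are invented for illustration, not the survey's data):

```python
import numpy as np

hours = np.arange(24)
# Hypothetical counts of reported skill-based errors per hour of day
errors = np.array([9, 11, 14, 16, 15, 12, 8, 6, 5, 5, 4, 4,
                   5, 5, 4, 4, 5, 6, 6, 7, 7, 8, 8, 9])
# Assumed fraction of the workforce present at each hour (day-shift heavy)
staffing = 0.55 + 0.35 * np.cos(2 * np.pi * (hours - 12) / 24)
rate = errors / staffing            # exposure-adjusted error rate
print(f"adjusted error rate peaks at {hours[np.argmax(rate)]:02d}:00")
```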
1985-04-24
reliability/ downtime/ communication lines/ man-machine interface/ other: 2. A noticeable (to the user) failure happens about ... and that number has been improving/ steady/ getting worse. 3. The number of failures/errors for NOHIMS is acceptable/ somewhat acceptable/ somewhat unacceptable/ unacceptable. ... somewhat fast/ somewhat slow/ slow. 7. When a NOHIMS failure occurs, it affects the day-to-day provision of medical care because work procedures must
Yousefinezhadi, Taraneh; Jannesar Nobari, Farnaz Attar; Goodari, Faranak Behzadi; Arab, Mohammad
2016-01-01
Introduction: In any complex human system, human error is inevitable and cannot be eliminated by blaming wrongdoers. So, with the aim of improving Intensive Care Unit (ICU) reliability in hospitals, this research tries to identify and analyze ICU process failure modes from the standpoint of a systematic approach to errors. Methods: In this descriptive research, data were gathered qualitatively by observations, document reviews, and Focus Group Discussions (FGDs) with the process owners in two selected ICUs in Tehran in 2014. Data analysis, however, was quantitative, based on the failures' Risk Priority Numbers (RPNs) according to the Failure Modes and Effects Analysis (FMEA) method used. In addition, some causes of failures were analyzed with the qualitative Eindhoven Classification Model (ECM). Results: Through the FMEA methodology, 378 potential failure modes from 180 ICU activities in hospital A and 184 potential failures from 99 ICU activities in hospital B were identified and evaluated. Then, with 90% reliability (RPN≥100), a total of 18 failures in hospital A and 42 in hospital B were identified as non-acceptable risks, and their causes were analyzed by ECM. Conclusions: Applying modified PFMEA to improve the reliability of the two selected ICUs' processes in two different kinds of hospitals shows that this method empowers staff to identify, evaluate, prioritize and analyze all potential failure modes, and also makes them eager to identify the causes, recommend corrective actions, and even participate in improving the process without feeling blamed by top management. Moreover, by combining FMEA and ECM, team members can easily identify failure causes from a health care perspective. PMID:27157162
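A minimal sketch of the RPN arithmetic and the RPN≥100 screen applied in the study (the failure modes and 1-10 scores below are invented for illustration; the study's exact scales may differ):

```python
# FMEA scoring: RPN = severity x occurrence x detectability
failure_modes = [
    {"step": "drug dose transcription", "S": 8, "O": 4, "D": 5},
    {"step": "ventilator alarm setup",  "S": 9, "O": 2, "D": 6},
    {"step": "patient identification",  "S": 10, "O": 1, "D": 3},
]
for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

# Flag non-acceptable risks with the RPN >= 100 cutoff used above
for fm in sorted(failure_modes, key=lambda f: -f["RPN"]):
    flag = "analyze causes (ECM)" if fm["RPN"] >= 100 else "acceptable"
    print(f"{fm['step']:26s} RPN={fm['RPN']:3d}  {flag}")
```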
Complacency and Automation Bias in the Use of Imperfect Automation.
Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L
2015-08-01
We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.
NASA Astrophysics Data System (ADS)
Kostyukov, V. N.; Naumenko, A. P.
2017-08-01
The paper dwells upon urgent issues of evaluating the impact of actions by operators of complex technological systems on safe operation, considering the application of condition monitoring systems for elements and sub-systems of petrochemical production facilities. The main task of the research is to distinguish factors and criteria describing monitoring system properties that allow evaluation of the impact of personnel errors on the operation of real-time condition monitoring and diagnostic systems for machinery of petrochemical facilities, and to find objective criteria for a monitoring system class that accounts for the human factor. On the basis of the real-time condition monitoring concepts of sudden failure skipping risk and of static and dynamic error, one may evaluate the impact of personnel qualification on monitoring system operation, in terms of errors in the actions of personnel or operators while receiving information from monitoring systems and operating a technological system. The operator is considered a part of the technological system. Personnel behavior is usually a combination of the following parameters: input signal (information perceiving), reaction (decision making), and response (decision implementing). Based on several studies of the behavior of nuclear power station operators in the USA, Italy and other countries, as well as on research conducted by Russian scientists, data on operator reliability were selected for the analysis of operator behavior with diagnostics and monitoring systems at technological facilities. The calculations revealed that for the monitoring system selected as an example, the failure skipping risk for the set values of static (less than 0.01) and dynamic (less than 0.001) errors, considering all related factors of data on the reliability of information perception, decision-making, and reaction, is 0.037; when all facilities and the error probability are under control, it is not more than 0.027. When only pump and compressor units are under control, the failure skipping risk is not more than 0.022, with the probability of error in operator actions not more than 0.011. The results show that operator reliability can be assessed in almost any kind of production, but only with respect to technological capabilities, since operators' psychological and general training vary considerably across production industries. Using the latest technologies of engineering psychology and the design of data support, situation assessment, and decision-making and response systems, as well as achievements in condition monitoring in various production industries, one can evaluate the hazardous-condition skipping risk probability considering static and dynamic errors and the human factor.
Using Digital Image Correlation to Characterize Local Strains on Vascular Tissue Specimens.
Zhou, Boran; Ravindran, Suraj; Ferdous, Jahid; Kidane, Addis; Sutton, Michael A; Shazly, Tarek
2016-01-24
Characterization of the mechanical behavior of biological and engineered soft tissues is a central component of fundamental biomedical research and product development. Stress-strain relationships are typically obtained from mechanical testing data to enable comparative assessment among samples and in some cases identification of constitutive mechanical properties. However, errors may be introduced through the use of average strain measures, as significant heterogeneity in the strain field may result from geometrical non-uniformity of the sample and stress concentrations induced by mounting/gripping of soft tissues within the test system. When strain field heterogeneity is significant, accurate assessment of the sample mechanical response requires measurement of local strains. This study demonstrates a novel biomechanical testing protocol for calculating local surface strains using a mechanical testing device coupled with a high resolution camera and a digital image correlation technique. A series of sample surface images are acquired and then analyzed to quantify the local surface strain of a vascular tissue specimen subjected to ramped uniaxial loading. This approach can improve accuracy in experimental vascular biomechanics and has potential for broader use among other native soft tissues, engineered soft tissues, and soft hydrogel/polymeric materials. In the video, we demonstrate how to set up the system components and perform a complete experiment on native vascular tissue.
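Once the image correlation yields a displacement field, local strains follow from its spatial gradients. A small-strain sketch with a synthetic field (the grid layout is an assumption about the DIC output, not part of the published protocol):

```python
import numpy as np

# Synthetic DIC output: displacements u(x, y), v(x, y) on a regular grid,
# here a homogeneous 1% stretch along x with Poisson-like contraction in y.
x, y = np.meshgrid(np.linspace(0, 10e-3, 50), np.linspace(0, 5e-3, 25))
u = 0.01 * x
v = -0.003 * y

dudy, dudx = np.gradient(u, y[:, 0], x[0, :])
dvdy, dvdx = np.gradient(v, y[:, 0], x[0, :])

exx = dudx                       # small-strain components
eyy = dvdy
exy = 0.5 * (dudy + dvdx)
print(f"mean local strain exx = {exx.mean():.4f}")  # ~0.0100
```

In a real test the field is heterogeneous, which is exactly why the local values, rather than a single average strain, are of interest.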
Bunke, C M; Brier, M E; Golper, T A
1997-08-01
The use of the "peritonitis rate" in the management of patients undergoing peritoneal dialysis is assuming importance in comparing the prowess of facilities, care givers and new innovations. For this to be a meaningful outcome measure, the type of infection (causative pathogen) must have less clinical significance than the number of infections during a time interval. The natural history of Staphylococcus aureus, pseudomonas, and fungal peritonitis would not support that the outcome of an episode of peritonitis is independent of the causative pathogen. Could this concern be extended to other more frequently occurring pathogens? To address this, the Network 9 Peritonitis Study identified 530 episodes of single organism peritonitis caused by a gram positive organism and 136 episodes caused by a single non-pseudomonal gram negative (NPGN) pathogen. Coincidental soft tissue infections (exit site or tunnel) occurred equally in both groups. Outcomes of peritonitis were analyzed by organism classification and by presence or absence of a soft tissue infection. NPGN peritonitis was associated with significantly more frequent catheter loss, hospitalization, and technique failure and was less likely to resolve regardless of the presence or absence of a soft tissue infection. Hospitalization and death tended to occur more frequently with enterococcal peritonitis than with other gram positive peritonitis. The outcomes in the NPGN peritonitis group were significantly worse (resolution, catheter loss, hospitalization, technique failure) compared to coagulase negative staphylococcal or S. aureus peritonitis, regardless of the presence or absence of a coincidental soft tissue infection. Furthermore, for the first time, the poor outcomes of gram negative peritonitis are shown to be independent of pseudomonas or polymicrobial involvement or soft tissue infections. The gram negative organism appears to be the important factor. In addition, the outcome of peritonitis caused by S. aureus is worse than that of other staphylococci. Thus, it is clear that all peritonitis episodes cannot be considered equivalent in terms of outcome. The concept of peritonitis rate is only meaningful when specific organisms are considered.
Peters, Christopher L; Jimenez, Chris; Erickson, Jill; Anderson, Mike B; Pelt, Christopher E
2013-10-16
Soft-tissue releases are commonly necessary to achieve symmetrical flexion and extension gaps in primary total knee arthroplasty performed with a measured resection technique. We reviewed the frequency of required releases according to preoperative alignment and the clinical and radiographic results; associations with failure, reoperations, and complications are presented. We reviewed 1216 knees that underwent primary total knee arthroplasty from 2004 to 2009; 774 (64%) were in female patients and 442 (36%), in male patients. In the coronal plane, 855 knees had preoperative varus deformity, 123 were neutral, and 238 had valgus deformity. The mean age at the time of the index procedure was 62.7 years (range, twenty-three to ninety-four years), and the mean body mass index was 32.7 kg/m² (range, 17.4 to 87.9 kg/m²). Clinical outcomes included the Knee Society Score (KSS), implant failure, reoperation, and complications. Radiographs were analyzed for component alignment. The only difference in the total KSS was found at the time of final follow-up between valgus knees with zero releases (total KSS = 178) and those with one or two releases (KSS = 160, p = 0.026). Overall, 407 knees (33.5%) required zero releases, 686 (56.4%) required one or two releases, and 123 (10.1%) required three or more releases. Among varus knees, 37% required zero releases, 55% required one or two releases, and 7.5% required three or more releases. Among neutral knees, 39% required zero releases, 55% required one or two releases, and 5.7% required three or more releases. Only 17% of valgus knees required zero releases whereas 61% required one or two releases and 21.8% required three or more releases. Valgus knees required more releases than neutral or varus knees did (p < 0.001). Selective soft-tissue release for gap balancing in primary total knee arthroplasty is an effective technique that produced excellent clinical and radiographic results regardless of preoperative alignment. Consistent anatomic coronal-plane alignment and soft-tissue balance could be achieved without bone cut modification by using measured bone resection and selective soft-tissue release.
Pasha, Azam; Sindhu, D; Nayak, Rabindra S; Mamatha, J; Chaitra, K R; Vishwakarma, Swati
2015-01-01
Background and Objectives: This study was conducted to evaluate the effect of two soft drinks, Coca-Cola and Mirinda orange, on bracket bond strength and on the adhesive remnant on teeth after debonding the bracket, and to observe by means of scanning electron microscopy (SEM) the effect of these drinks on intact and sealed enamel. Methods: 120 non-carious maxillary premolar teeth, already extracted for orthodontic purposes, were divided into three groups, i.e., Coca-Cola, Mirinda orange, and control (artificial saliva). Brackets were bonded using conventional methods. Teeth were kept in soft drinks for 15 days, for 15 min, 3 times a day, separated by intervals of 2 h. At other times, they were kept in artificial saliva. The samples thus obtained were evaluated for shear bond strength using a universal testing machine and subsequently assigned adhesive remnant index (ARI) scores. SEM study of all three groups was done to evaluate the enamel surface of intact and sealed enamel. Results: The lowest mean resistance to shearing forces was shown by the Mirinda orange group (5.30 ± 2.74 MPa), followed by the Coca-Cola group (6.24 ± 1.59 MPa), with the highest resistance to shearing forces in the control group (7.33 ± 1.72 MPa). The ARI scores revealed a cohesive failure in control samples and an adhesive failure in Mirinda and Coca-Cola samples. SEM results showed areas of defect due to erosion caused by the acidic soft drinks on intact and sealed enamel surfaces. Conclusion: The Mirinda group showed the lowest resistance to shearing forces, followed by the Coca-Cola group, with the highest resistance to shearing forces in the control group. There were significant differences between the control group and the study groups. Areas of defect caused by erosion related to the acidic soft drinks were seen on the enamel surface around the adhesive. Areas of defect caused by Coca-Cola were more extensive than those caused by the Mirinda orange drink. PMID:26668477
Effects of alcohol on pilot performance in simulated flight
NASA Technical Reports Server (NTRS)
Billings, C. E.; Demosthenes, T.; White, T. R.; O'Hara, D. B.
1991-01-01
Ethyl alcohol's known ability to produce reliable decrements in pilot performance was used in a study designed to evaluate objective methods for assessing pilot performance. Four air carrier pilot volunteers were studied during eight simulated flights in a B727 simulator. Total errors increased linearly and significantly with increasing blood alcohol. Planning and performance errors, procedural errors and failures of vigilance each increased significantly in one or more pilots and in the group as a whole.
Hart, Heledd; Lim, Lena; Mehta, Mitul A.; Curtis, Charles; Xu, Xiaohui; Breen, Gerome; Simmons, Andrew; Mirza, Kah; Rubia, Katya
2018-01-01
Childhood maltreatment is associated with error hypersensitivity. We examined the effect of childhood abuse and abuse-by-gene (5-HTTLPR, MAOA) interaction on functional brain connectivity during error processing in medication/drug-free adolescents. Functional connectivity was compared, using generalized psychophysiological interaction (gPPI) analysis of functional magnetic resonance imaging (fMRI) data, between 22 age- and gender-matched medication-naïve and substance abuse-free adolescents exposed to severe childhood abuse and 27 healthy controls, while they performed an individually adjusted tracking stop-signal task, designed to elicit 50% inhibition failures. During inhibition failures, abused participants relative to healthy controls exhibited reduced connectivity between right and left putamen, bilateral caudate and anterior cingulate cortex (ACC), and between right supplementary motor area (SMA) and right inferior and dorsolateral prefrontal cortex. Abuse-related connectivity abnormalities were associated with longer abuse duration. No group differences in connectivity were observed for successful inhibition. The findings suggest that childhood abuse is associated with decreased functional connectivity in fronto-cingulo-striatal networks during error processing. Furthermore, the severity of connectivity abnormalities increases with abuse duration. Reduced connectivity of error detection networks in maltreated individuals may be linked to constant monitoring of errors in order to avoid mistakes which, in abusive contexts, are often associated with harsh punishment. PMID:29434543
Compact disk error measurements
NASA Technical Reports Server (NTRS)
Howe, D.; Harriman, K.; Tehranchi, B.
1993-01-01
The objectives of this project are as follows: provide hardware and software that will perform simple, real-time, high resolution (single-byte) measurement of the error burst and good data gap statistics seen by a photoCD player read channel when recorded CD write-once discs of variable quality (i.e., condition) are being read; extend the above system to enable measurement of the hard decision (i.e., 1-bit error flags) and soft decision (i.e., 2-bit error flags) decoding information that is produced/used by the Cross Interleaved - Reed - Solomon - Code (CIRC) block decoder employed in the photoCD player read channel; construct a model that uses data obtained via the systems described above to produce meaningful estimates of output error rates (due to both uncorrected ECC words and misdecoded ECC words) when a CD disc having specific (measured) error statistics is read (completion date to be determined); and check the hypothesis that current adaptive CIRC block decoders are optimized for pressed (DAD/ROM) CD discs. If warranted, do a conceptual design of an adaptive CIRC decoder that is optimized for write-once CD discs.
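The first objective amounts to run-length analysis of a per-byte error-flag stream. A sketch with a synthetic stream (the error probability is invented; real optical channels produce correlated bursts rather than the independent flags used here):

```python
import numpy as np
from itertools import groupby

rng = np.random.default_rng(0)
flags = (rng.random(100_000) < 0.001).astype(int)  # 1 = erroneous byte

bursts, gaps = [], []
for value, run in groupby(flags):
    n = sum(1 for _ in run)
    (bursts if value else gaps).append(n)

print(f"{len(bursts)} bursts, mean burst = {np.mean(bursts):.2f} bytes, "
      f"mean good-data gap = {np.mean(gaps):.1f} bytes")
```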
Neural Network-Based Sensor Validation for Turboshaft Engines
NASA Technical Reports Server (NTRS)
Moller, James C.; Litt, Jonathan S.; Guo, Ten-Huei
1998-01-01
Sensor failure detection, isolation, and accommodation using a neural network approach is described. An auto-associative neural network is configured to perform dimensionality reduction on the sensor measurement vector and provide estimated sensor values. The sensor validation scheme is applied in a simulation of the T700 turboshaft engine in closed loop operation. Performance is evaluated based on the ability to detect faults correctly and maintain stable and responsive engine operation. The set of sensor outputs used for engine control forms the network input vector. Analytical redundancy is verified by training networks of successively smaller bottleneck layer sizes. Training data generation and strategy are discussed. The engine maintained stable behavior in the presence of sensor hard failures. With proper selection of fault determination thresholds, stability was maintained in the presence of sensor soft failures.
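The residual-based detection logic can be illustrated with a linear stand-in for the trained auto-associative network: a PCA reconstruction through a rank-3 "bottleneck" plays the role of the autoencoder, and the data are synthetic rather than T700 measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
latent = rng.standard_normal((500, 3))      # 3 underlying engine states
mixing = rng.standard_normal((3, 8))
train = latent @ mixing                     # 8 correlated sensor channels

mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
basis = vt[:3]                              # "bottleneck" of dimension 3

def residuals(measurement):
    """Reconstruct the sensor vector from the bottleneck subspace and
    return per-sensor reconstruction errors plus the estimate itself."""
    estimate = mean + (measurement - mean) @ basis.T @ basis
    return np.abs(measurement - estimate), estimate

faulty = train[0].copy()
faulty[2] += 25.0                           # simulated hard failure, sensor 2
res, est = residuals(faulty)
print("suspect sensor:", int(np.argmax(res)),
      "| accommodated value:", round(float(est[2]), 2))
```

As the abstract notes, the detection thresholds on these residuals must be chosen so that soft failures are caught without tripping on normal transients.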
Automation bias in electronic prescribing.
Lyell, David; Magrabi, Farah; Raban, Magdalena Z; Pont, L G; Baysari, Melissa T; Day, Richard O; Coiera, Enrico
2017-03-16
Clinical decision support (CDS) in e-prescribing can improve safety by alerting potential errors, but introduces new sources of risk. Automation bias (AB) occurs when users over-rely on CDS, reducing vigilance in information seeking and processing. Evidence of AB has been found in other clinical tasks, but has not yet been tested with e-prescribing. This study tests for the presence of AB in e-prescribing and the impact of task complexity and interruptions on AB. One hundred and twenty students in the final two years of a medical degree prescribed medicines for nine clinical scenarios using a simulated e-prescribing system. Quality of CDS (correct, incorrect and no CDS) and task complexity (low, low + interruption and high) were varied between conditions. Omission errors (failure to detect prescribing errors) and commission errors (acceptance of false positive alerts) were measured. Compared to scenarios with no CDS, correct CDS reduced omission errors by 38.3% (p < .0001, n = 120), 46.6% (p < .0001, n = 70), and 39.2% (p < .0001, n = 120) for low, low + interrupt and high complexity scenarios respectively. Incorrect CDS increased omission errors by 33.3% (p < .0001, n = 120), 24.5% (p < .009, n = 82), and 26.7% (p < .0001, n = 120). Participants made commission errors, 65.8% (p < .0001, n = 120), 53.5% (p < .0001, n = 82), and 51.7% (p < .0001, n = 120). Task complexity and interruptions had no impact on AB. This study found evidence of AB omission and commission errors in e-prescribing. Verification of CDS alerts is key to avoiding AB errors. However, interventions focused on this have had limited success to date. Clinicians should remain vigilant to the risks of CDS failures and verify CDS.
Brigham, John C.; Aquino, Wilkins; Aguilo, Miguel A.; Diamessis, Peter J.
2010-01-01
An approach for efficient and accurate finite element analysis of harmonically excited soft solids using high-order spectral finite elements is presented and evaluated. The Helmholtz-type equations used to model such systems suffer from additional numerical error known as pollution when excitation frequency becomes high relative to stiffness (i.e. high wave number), which is the case, for example, for soft tissues subject to ultrasound excitations. The use of high-order polynomial elements allows for a reduction in this pollution error, but requires additional consideration to counteract Runge's phenomenon and/or poor linear system conditioning, which has led to the use of spectral element approaches. This work examines in detail the computational benefits and practical applicability of high-order spectral elements for such problems. The spectral elements examined are tensor product elements (i.e. quad or brick elements) of high-order Lagrangian polynomials with non-uniformly distributed Gauss-Lobatto-Legendre nodal points. A shear plane wave example is presented to show the dependence of the accuracy and computational expense of high-order elements on wave number. Then, a convergence study for a viscoelastic acoustic-structure interaction finite element model of an actual ultrasound driven vibroacoustic experiment is shown. The number of degrees of freedom required for a given accuracy level was found to consistently decrease with increasing element order. However, the computationally optimal element order was found to strongly depend on the wave number. PMID:21461402
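The distinguishing ingredient of such elements is the Gauss-Lobatto-Legendre (GLL) node distribution, which suppresses Runge's phenomenon at high polynomial order. The nodes are the interval endpoints plus the roots of the derivative of the Legendre polynomial; a short sketch of this standard construction (not code from the paper):

```python
import numpy as np
from numpy.polynomial import legendre

def gll_nodes(order):
    """Gauss-Lobatto-Legendre nodes on [-1, 1] for a spectral element of
    the given polynomial order: the endpoints plus the roots of P_N'."""
    interior = legendre.Legendre.basis(order).deriv().roots()
    return np.concatenate(([-1.0], np.sort(interior.real), [1.0]))

for p in (2, 4, 8):
    print(p, np.round(gll_nodes(p), 4))
```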
A device for high-throughput monitoring of degradation in soft tissue samples.
Tzeranis, D S; Panagiotopoulos, I; Gkouma, S; Kanakaris, G; Georgiou, N; Vaindirlis, N; Vasileiou, G; Neidlin, M; Gkousioudi, A; Spitas, V; Macheras, G A; Alexopoulos, L G
2018-06-06
This work describes the design and validation of a novel device, the High-Throughput Degradation Monitoring Device (HDD), for monitoring the degradation of 24 soft tissue samples over incubation periods of several days inside a cell culture incubator. The device quantifies sample degradation by monitoring its deformation induced by a static gravity load. Initial instrument design and experimental protocol development focused on quantifying cartilage degeneration. Characterization of measurement errors, caused mainly by thermal transients and by translating the instrument sensor, demonstrated that HDD can quantify sample degradation with <6 μm precision and <10 μm temperature-induced errors. HDD capabilities were evaluated in a pilot study that monitored the degradation of fresh ex vivo human cartilage samples by collagenase solutions over three days. HDD could robustly resolve the effects of collagenase concentrations as small as 0.5 mg/ml. Careful sample preparation resulted in measurements that did not suffer from donor-to-donor variation (coefficient of variance <70%). Due to its unique combination of sample throughput, measurement precision, temporal sampling and experimental versatility, HDD provides a novel biomechanics-based experimental platform for quantifying the effects of proteins (cytokines, growth factors, enzymes, antibodies) or small molecules on the degradation of soft tissues or tissue engineering constructs. Thereby, HDD can complement established tools and in vitro models in important applications including drug screening and biomaterial development. Copyright © 2018 Elsevier Ltd. All rights reserved.
Walden, Steven J; Evans, Sam L; Mulville, Jacqui
2017-01-01
The purpose of this study was to determine how the Vickers hardness (HV) of bone varies during soft tissue putrefaction. This has possible forensic applications, notably for determining the postmortem interval. Experimental porcine bone samples were decomposed in surface and burial deposition scenarios over a period of 6 months. Although the Vickers hardness varied widely, it was found that when transverse axial hardness was subtracted from longitudinal axial hardness, the difference showed correlations with three distinct phases of soft tissue putrefaction. The ratio of transverse axial hardness to longitudinal axial hardness showed a similar correlation. A difference of 10 or greater in HV, with soft tissue present and signs of minimal decomposition, was associated with a decomposition period of 250 cumulative cooling degree days or less. A difference of 10 (+/- standard error of mean at a 95% confidence interval) or greater in HV associated with marked decomposition indicated a decomposition period of 1450 cumulative cooling degree days or more. A difference of -7 to +8 (+/- standard error of mean at a 95% confidence interval) was thus associated with 250 to 1450 cumulative cooling degree days' decomposition. The ratio of transverse axial HV to longitudinal HV, ranging from 2.42 to 1.54, is a more reliable indicator in this context and is preferable to using negative integers. These differences may have potential as an indicator of postmortem interval and thus the time of body deposition in the forensic context. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.
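Read as a decision rule, the reported cut-offs band the postmortem interval as follows (a literal transcription of the numbers above; hv_diff is longitudinal minus transverse HV):

```python
def cooling_degree_days_band(hv_diff, decomposition_marked):
    """Band the cumulative cooling degree days (CDD) of decomposition
    from the Vickers hardness difference, per the cut-offs above."""
    if hv_diff >= 10:
        return ">= 1450 CDD" if decomposition_marked else "<= 250 CDD"
    if -7 <= hv_diff <= 8:
        return "250-1450 CDD"
    return "indeterminate"

print(cooling_degree_days_band(12, decomposition_marked=False))  # <= 250 CDD
```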
NASA Astrophysics Data System (ADS)
Li, Lei; Hu, Jianhao
2010-12-01
Notice of Violation of IEEE Publication Principles: "Joint Redundant Residue Number Systems and Module Isolation for Mitigating Single Event Multiple Bit Upsets in Datapath" by Lei Li and Jianhao Hu, in the IEEE Transactions on Nuclear Science, vol. 57, no. 6, Dec. 2010, pp. 3779-3786. After careful and considered review of the content and authorship of this paper by a duly constituted expert committee, this paper has been found to be in violation of IEEE's Publication Principles. This paper contains substantial duplication of original text from the papers cited below. The original text was copied without attribution (including appropriate references to the original author(s) and/or paper title) and without permission. Due to the nature of this violation, reasonable effort should be made to remove all past references to this paper, and future references should be made to the following articles: "Multiple Error Detection and Correction Based on Redundant Residue Number Systems" by Vik Tor Goh and M. U. Siddiqi, in the IEEE Transactions on Communications, vol. 56, no. 3, March 2008, pp. 325-330; "A Coding Theory Approach to Error Control in Redundant Residue Number Systems. I: Theory and Single Error Correction" by H. Krishna, K-Y. Lin, and J-D. Sun, in the IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, vol. 39, no. 1, Jan 1992, pp. 8-17.

In this paper, we propose a joint scheme which combines redundant residue number systems (RRNS) with module isolation (MI) for mitigating single event multiple bit upsets (SEMBUs) in datapath. The proposed hardening scheme employs redundant residues to improve the fault tolerance of the datapath and module spacings to guarantee that SEMBUs caused by charge sharing do not propagate among the operation channels of different moduli. The features of RRNS, such as independence, parallelism and error correction, are exploited to establish the radiation hardening architecture for the datapath in radiation environments. In the proposed scheme, all of the residues can be processed independently, and most of the soft errors in the datapath can be corrected through the redundant relationship among the residues at the correction module, which is allocated at the end of the datapath. In the back-end implementation, the module isolation technique is used to improve the soft error rate performance of RRNS by physically separating the operation channels of different moduli. The case studies show at least an order of magnitude decrease in the soft error rate (SER) as compared to non-RHBD designs, and demonstrate that RRNS+MI can reduce the SER from 10(-12) to 10(-17) when the datapath comprises 10(6) processing steps. The proposed scheme can even achieve lower area and latency overheads than the design without radiation hardening, since RRNS reduces the operational complexity of the datapath.
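For readers unfamiliar with RRNS, the redundancy mechanism can be illustrated in a few lines with toy moduli and a brute-force subset decoder (real designs use algebraic decoders such as those in the Goh/Siddiqi and Krishna et al. papers cited in the notice):

```python
from math import prod
from itertools import combinations

moduli = [7, 11, 13, 17, 19]   # 3 information + 2 redundant moduli (toy values)
K = 3                          # legitimate range is prod(moduli[:K]) = 1001

def crt(residues, mods):
    """Chinese remainder reconstruction for pairwise-coprime moduli."""
    M = prod(mods)
    return sum(r * (M // m) * pow(M // m, -1, m)
               for r, m in zip(residues, mods)) % M

def encode(x):
    return [x % m for m in moduli]

def decode(residues):
    """Try every K-subset of channels; keep the in-range value that
    agrees with the largest number of residues."""
    legit = prod(moduli[:K])
    best, votes = None, -1
    for idx in combinations(range(len(moduli)), K):
        x = crt([residues[i] for i in idx], [moduli[i] for i in idx])
        if x < legit:
            agree = sum(x % m == r for r, m in zip(residues, moduli))
            if agree > votes:
                best, votes = x, agree
    return best

word = encode(823)
word[1] ^= 5                   # soft error confined to one residue channel
print(decode(word))            # -> 823
```

The module-isolation half of the scheme is physical: because each residue channel is laid out with spacing, a single charge-sharing event corrupts at most one channel, which is exactly the failure the redundancy can absorb.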
Wiesinger, Florian; Bylund, Mikael; Yang, Jaewon; Kaushik, Sandeep; Shanbhag, Dattesh; Ahn, Sangtae; Jonsson, Joakim H; Lundman, Josef A; Hope, Thomas; Nyholm, Tufve; Larson, Peder; Cozzini, Cristina
2018-02-18
To describe a method for converting Zero TE (ZTE) MR images into X-ray attenuation information in the form of pseudo-CT images and demonstrate its performance for (1) attenuation correction (AC) in PET/MR and (2) dose planning in MR-guided radiation therapy planning (RTP). Proton density-weighted ZTE images were acquired as input for MR-based pseudo-CT conversion, providing (1) efficient capture of short-lived bone signals, (2) flat soft-tissue contrast, and (3) fast and robust 3D MR imaging. After bias correction and normalization, the images were segmented into bone, soft-tissue, and air by means of thresholding and morphological refinements. Fixed Hounsfield replacement values were assigned for air (-1000 HU) and soft-tissue (+42 HU), whereas continuous linear mapping was used for bone. The obtained ZTE-derived pseudo-CT images accurately resembled the true CT images (i.e., Dice coefficient for bone overlap of 0.73 ± 0.08 and mean absolute error of 123 ± 25 HU evaluated over the whole head, including errors from residual registration mismatches in the neck and mouth regions). The linear bone mapping accounted for bone density variations. Averaged across five patients, ZTE-based AC demonstrated a PET error of -0.04 ± 1.68% relative to CT-based AC. Similarly, for RTP assessed in eight patients, the absolute dose difference over the target volume was found to be 0.23 ± 0.42%. The described method enables MR to pseudo-CT image conversion for the head in an accurate, robust, and fast manner without relying on anatomical prior knowledge. Potential applications include PET/MR-AC, and MR-guided RTP. © 2018 International Society for Magnetic Resonance in Medicine.
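The segmentation-plus-mapping logic reads naturally as a few array operations. A toy sketch (the thresholds and the linear bone-mapping coefficients are illustrative assumptions, not the published values; only the fixed -1000/+42 HU assignments come from the abstract):

```python
import numpy as np

def pseudo_ct(zte):
    """Convert a bias-corrected ZTE volume (soft tissue normalized to ~1)
    into pseudo-CT Hounsfield units by threshold segmentation."""
    pct = np.full(zte.shape, 42.0, dtype=np.float32)   # soft tissue: +42 HU
    air = zte < 0.15                                   # assumed threshold
    bone = (zte >= 0.15) & (zte < 0.80)                # assumed threshold
    pct[air] = -1000.0                                 # fixed air value
    # Continuous linear mapping for bone: lower ZTE signal -> denser bone
    pct[bone] = 2000.0 - 2200.0 * zte[bone]            # assumed coefficients
    return pct

vol = np.clip(np.random.default_rng(2).normal(1.0, 0.3, (4, 4, 4)), 0, 1.2)
print(pseudo_ct(vol)[0, 0])
```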
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Perry B.; Geyer, Amy; Borrego, David
Purpose: To investigate the benefits and limitations of patient-phantom matching for determining organ dose during fluoroscopy guided interventions. Methods: In this study, 27 CT datasets representing patients of different sizes and genders were contoured and converted into patient-specific computational models. Each model was matched, based on height and weight, to computational phantoms selected from the UF hybrid patient-dependent series. In order to investigate the influence of phantom type on patient organ dose, Monte Carlo methods were used to simulate two cardiac projections (PA/left lateral) and two abdominal projections (RAO/LPO). Organ dose conversion coefficients were then calculated for each patient-specific and patient-dependent phantom and also for a reference stylized and reference hybrid phantom. The coefficients were subsequently analyzed for any correlation between patient-specificity and the accuracy of the dose estimate. Accuracy was quantified by calculating an absolute percent difference using the patient-specific dose conversion coefficients as the reference. Results: Patient-phantom matching was shown most beneficial for estimating the dose to heavy patients. In these cases, the improvement over using a reference stylized phantom ranged from approximately 50% to 120% for abdominal projections and for a reference hybrid phantom from 20% to 60% for all projections. For lighter individuals, patient-phantom matching was clearly superior to using a reference stylized phantom, but not significantly better than using a reference hybrid phantom for certain fields and projections. Conclusions: The results indicate two sources of error when patients are matched with phantoms: anatomical error, which is inherent due to differences in organ size and location, and error attributed to differences in the total soft tissue attenuation. For small patients, differences in soft tissue attenuation are minimal and are exceeded by inherent anatomical differences. For large patients, differences in soft tissue attenuation can be large. In these cases, patient-phantom matching proves most effective as differences in soft tissue attenuation are mitigated. With increasing obesity rates, overweight patients will continue to make up a growing fraction of all patients undergoing medical imaging. Thus, having phantoms that better represent this population represents a considerable improvement over previous methods. In response to this study, additional phantoms representing heavier weight percentiles will be added to the UFHADM and UFHADF patient-dependent series.
Quality assurance of the international computerised 24 h dietary recall method (EPIC-Soft).
Crispim, Sandra P; Nicolas, Genevieve; Casagrande, Corinne; Knaze, Viktoria; Illner, Anne-Kathrin; Huybrechts, Inge; Slimani, Nadia
2014-02-01
The interview-administered 24 h dietary recall (24-HDR) EPIC-Soft® has a series of controls to guarantee the quality of dietary data across countries. These comprise all steps that are part of fieldwork preparation, data collection and data management; however, a complete characterisation of these quality controls is still lacking. The present paper describes in detail the quality controls applied in EPIC-Soft, which are, to a large extent, built on the basis of the EPIC-Soft error model and are present in three phases: (1) before, (2) during and (3) after the 24-HDR interviews. Quality controls for consistency and harmonisation are implemented before the interviews while preparing the seventy databases constituting an EPIC-Soft version (e.g. pre-defined and coded foods and recipes). During the interviews, EPIC-Soft uses a cognitive approach by helping the respondent to recall the dietary intake information in a stepwise manner and includes controls for consistency (e.g. probing questions) as well as for completeness of the collected data (e.g. system calculation for some unknown amounts). After the interviews, a series of controls can be applied by dietitians and data managers to further guarantee data quality. For example, the interview-specific 'note files' that were created to track any problems or missing information during the interviews can be checked to clarify the information initially provided. Overall, the quality controls employed in the EPIC-Soft methodology are not always perceivable, but prove to be of assistance for its overall standardisation and possibly for the accuracy of the collected data.
Savannah River Site generic data base development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blanton, C.H.; Eide, S.A.
This report describes the results of a project to improve the generic component failure data base for the Savannah River Site (SRS). A representative list of components and failure modes for SRS risk models was generated by reviewing existing safety analyses and component failure data bases and from suggestions from SRS safety analysts. Then sources of data or failure rate estimates were identified and reviewed for applicability. A major source of information was the Nuclear Computerized Library for Assessing Reactor Reliability, or NUCLARR. This source includes an extensive collection of failure data and failure rate estimates for commercial nuclear power plants. A recent Idaho National Engineering Laboratory report on failure data from the Idaho Chemical Processing Plant was also reviewed. From these and other recent sources, failure data and failure rate estimates were collected for the components and failure modes of interest. This information was aggregated to obtain a recommended generic failure rate distribution (mean and error factor) for each component failure mode.
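In probabilistic risk assessment, a (mean, error factor) pair conventionally parameterizes a lognormal failure rate distribution, with the error factor defined as the ratio of the 95th percentile to the median. A sketch of recovering the percentiles from such a pair under that standard convention (the example rate is hypothetical):

```python
import math

def lognormal_from_mean_ef(mean, ef, z95=1.645):
    """Recover lognormal percentiles from a PRA-style (mean, error factor)
    pair, where EF = p95 / median = exp(z95 * sigma)."""
    sigma = math.log(ef) / z95
    mu = math.log(mean) - sigma**2 / 2.0  # so that exp(mu + sigma^2/2) = mean
    median = math.exp(mu)
    return {"median": median, "p05": median / ef, "p95": median * ef}

# Hypothetical generic failure rate: mean 1e-5 per hour, error factor 10.
print(lognormal_from_mean_ef(1e-5, 10.0))
```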
Steinberger, Dina M; Douglas, Stephen V; Kirschbaum, Mark S
2009-09-01
A multidisciplinary team from the University of Wisconsin Hospital and Clinics transplant program used failure mode and effects analysis to proactively examine opportunities for communication and handoff failures across the continuum of care from organ procurement to transplantation. The team performed a modified failure mode and effects analysis that isolated the multiple linked, serial, and complex information exchanges occurring during the transplantation of one solid organ. Failure mode and effects analysis proved effective for engaging a diverse group of invested stakeholders in the analysis and discussion of opportunities to improve the system's resilience for avoiding errors during a time-pressured and complex process.
Effect of system workload on operating system reliability - A study on IBM 3081
NASA Technical Reports Server (NTRS)
Iyer, R. K.; Rossetti, D. J.
1985-01-01
This paper presents an analysis of operating system failures on an IBM 3081 running VM/SP. Three broad categories of software failures are found: error handling, program control or logic, and hardware related; it is found that more than 25 percent of software failures occur in the hardware/software interface. Measurements show that results on software reliability cannot be considered representative unless the system workload is taken into account. The overall CPU execution rate, although measured to be close to 100 percent most of the time, is not found to correlate strongly with the occurrence of failures. Possible reasons for the observed workload failure dependency, based on detailed investigations of the failure data, are discussed.
NASA Astrophysics Data System (ADS)
Lollino, Piernicola; Andriani, Gioacchino Francesco; Fazio, Nunzio Luciano; Perrotti, Michele
2016-04-01
Strain-softening under low confinement stress, i.e. the drop of strength that occurs in the post-failure stage, represents a key factor of the stress-strain behavior of rocks. However, this feature of the rock behavior is generally underestimated or even neglected in the assessment of boundary value problems of intact soft rock masses. This is typically the case when the stability of intact rock masses is treated by means of limit equilibrium or finite element analyses, for which rigid-plastic or elastic perfectly-plastic constitutive models, generally implementing peak strength conditions of the rock, are respectively used. In fact, the aforementioned numerical techniques are characterized by intrinsic limitations that do not allow material brittleness to be accounted for, whether because of the method assumptions or due to numerical stability problems (as in the case of the finite element method), unless sophisticated regularization techniques are implemented. However, for those problems that concern the stability of intact soft rock masses at low stress levels, as for example the stability of shallow underground caves or that of rock slopes, the brittle stress-strain response of rock in the post-failure stage cannot be disregarded due to the risk of overestimation of the stability factor. This work is aimed at highlighting the role of post-peak brittleness of soft rocks in the analysis of specific ideal problems by means of a hybrid finite-discrete element technique (FDEM) that allows for a proper simulation of the rock's brittle stress-strain behavior. In particular, the stability of two ideal cases, represented by a shallow underground rectangular cave and a vertical cliff, has been analyzed by implementing a post-peak brittle behavior of the rock, and the comparison with a non-brittle response of the rock mass is also explored. To this purpose, the mechanical behavior of a soft calcarenite belonging to the Calcarenite di Gravina formation, extensively outcropping in Puglia (Southern Italy), and the corresponding features of the post-peak behavior as measured in the laboratory, have been used as a reference in this work; likewise, the typical geometrical features of underground cavities and rock cliffs observed in Southern Italy have been adopted for the simulations. The numerical results indicate the strong impact on the assessment of stability when rock post-peak brittleness is accounted for, compared with perfectly plastic assumptions, and highlight the need to adopt numerical techniques, such as the FDEM approach, that properly take this important aspect of rock behavior into account.
Mouroux, M.; Yvon-Groussin, A.; Peytavin, G.; Delaugerre, C.; Legrand, M.; Bossi, P.; Do, B.; Trylesinski, A.; Diquet, B.; Dohin, E.; Delfraissy, J. F.; Katlama, C.; Calvez, V.
2000-01-01
The MIKADO trial was designed to evaluate the efficacy of stavudine-zalcitabine-saquinavir (soft gel capsule) [d4T-ddC-SQV(SGC)] in 36 naive patients (−3.3 log10 units at week 24 [W24]). Among the 29 patients remaining on d4T-ddC-SQV(SGC) until W24, 10 harbored a virological failure (viral load of >200 copies/ml at W24) (group 1). To determine the reasons for therapeutic failure, genotypic and phenotypic resistance test results and SQV concentrations in plasma were analyzed and compared to those in successfully treated patients (viral load of <200 copies/ml at W24) (group 2). Reverse transcriptase and protease genotypic analyses in group 1 revealed the acquisition of only one SQV-associated mutation (L90M) in only two patients. There was no significant increase in the 50 or 90% inhibitory concentration of SQV in patients with or without the L90M mutation. However, the fact that two patients developed an L90M mutation only 4 weeks after relapse points to the need for genotypic resistance testing in the context of an initial failure of the antiretroviral regimen. At W24, the median SQV concentration in group 1 (71 ng/ml) was significantly lower than in group 2 (475 ng/ml), and the plasma SQV concentration was correlated with the viral load at W24 (r = −0.5; P < 0.05) and with the drop in viral load between day 0 and W24 (r = −0.5; P < 0.01). These results and the fact that the plasma SQV concentrations in the two groups prior to relapse (W12) were not significantly different strongly suggest that the early failure of this combination is not due to viral resistance but to a lack of compliance, pharmacological variability, and drug interactions or a combination of these factors. PMID:10878071
Szlavecz, Akos; Chiew, Yeong Shiong; Redmond, Daniel; Beatson, Alex; Glassenbury, Daniel; Corbett, Simon; Major, Vincent; Pretty, Christopher; Shaw, Geoffrey M; Benyo, Balazs; Desaive, Thomas; Chase, J Geoffrey
2014-09-30
Real-time patient respiratory mechanics estimation can be used to guide mechanical ventilation settings, particularly positive end-expiratory pressure (PEEP). This work presents a software application, Clinical Utilisation of Respiratory Elastance (CURE Soft), that uses a time-varying respiratory elastance model to offer this ability and aid in mechanical ventilation treatment. CURE Soft is a desktop application developed in JAVA. It has two modes of operation: 1) online real-time monitoring decision support and 2) offline use for user education, auditing, or reviewing patient care. CURE Soft has been tested in mechanically ventilated patients with respiratory failure. The clinical protocol, software testing and use of the data were approved by the New Zealand Southern Regional Ethics Committee. Using CURE Soft, patients' respiratory mechanics responses to treatment and the clinical protocol were monitored. Results showed that patients' respiratory elastance (stiffness) changed with the use of muscle relaxants, and responded differently to ventilator settings. This information can be used to guide mechanical ventilation therapy and titrate optimal ventilator PEEP. CURE Soft enables real-time calculation of model-based respiratory mechanics for mechanically ventilated patients. Results showed that the system is able to provide detailed, previously unavailable information on patient-specific respiratory mechanics and response to therapy in real-time. The additional insight available to clinicians provides the potential for improved decision-making, and thus improved patient care and outcomes.
Impact of Measurement Error on Synchrophasor Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yilu; Gracia, Jose R.; Ewing, Paul D.
2015-07-01
Phasor measurement units (PMUs), a type of synchrophasor, are powerful diagnostic tools that can help avert catastrophic failures in the power grid. Because of this, PMU measurement errors are particularly worrisome. This report examines the internal and external factors contributing to PMU phase angle and frequency measurement errors and gives a reasonable explanation for them. It also analyzes the impact of those measurement errors on several synchrophasor applications: event location detection, oscillation detection, islanding detection, and dynamic line rating. The primary finding is that dynamic line rating is more likely to be influenced by measurement error. Other findings include the possibility of reporting nonoscillatory activity as an oscillation as the result of error, failing to detect oscillations submerged by error, and the unlikely impact of error on event location and islanding detection.
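One way to see why some applications tolerate angle error better than others: the active power flowing across a lossless line varies with the sine of the phasor angle difference, so a fixed phase-angle measurement error maps directly into a power estimate error. A sketch with hypothetical per-unit line parameters:

```python
import math

def line_power_mw(v1_pu, v2_pu, x_pu, delta_deg, s_base_mva=100.0):
    """Active power across a lossless line from PMU phasors:
    P = V1 * V2 * sin(delta) / X (per unit), scaled to MW."""
    return v1_pu * v2_pu * math.sin(math.radians(delta_deg)) / x_pu * s_base_mva

true_p = line_power_mw(1.0, 1.0, 0.1, 10.0)        # true angle difference
meas_p = line_power_mw(1.0, 1.0, 0.1, 10.0 + 0.5)  # 0.5 degree angle error
print(true_p, meas_p, meas_p - true_p)             # ~173.6 vs ~182.2 MW
```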
ERIC Educational Resources Information Center
Dougherty, Michael R.; Sprenger, Amber
2006-01-01
This article introduces 2 new sources of bias in probability judgment, discrimination failure and inhibition failure, which are conceptualized as arising from an interaction between error-prone memory processes and a support-theory-like comparison process. Both sources of bias stem from the influence of irrelevant information on participants'…
Impact of time-of-flight PET on quantification errors in MR imaging-based attenuation correction.
Mehranian, Abolfazl; Zaidi, Habib
2015-04-01
Time-of-flight (TOF) PET/MR imaging is an emerging imaging technology with great capabilities offered by TOF to improve image quality and lesion detectability. We assessed, for the first time, the impact of TOF image reconstruction on PET quantification errors induced by MR imaging-based attenuation correction (MRAC) using simulation and clinical PET/CT studies. Standard 4-class attenuation maps were derived by segmentation of CT images of 27 patients undergoing PET/CT examinations into background air, lung, soft-tissue, and fat tissue classes, followed by the assignment of predefined attenuation coefficients to each class. For each patient, 4 PET images were reconstructed: non-TOF and TOF both corrected for attenuation using reference CT-based attenuation correction and the resulting 4-class MRAC maps. The relative errors between non-TOF and TOF MRAC reconstructions were compared with their reference CT-based attenuation correction reconstructions. The bias was locally and globally evaluated using volumes of interest (VOIs) defined on lesions and normal tissues and CT-derived tissue classes containing all voxels in a given tissue, respectively. The impact of TOF on reducing the errors induced by metal-susceptibility and respiratory-phase mismatch artifacts was also evaluated using clinical and simulation studies. Our results show that TOF PET can remarkably reduce attenuation correction artifacts and quantification errors in the lungs and bone tissues. Using classwise analysis, it was found that the non-TOF MRAC method results in an error of -3.4% ± 11.5% in the lungs and -21.8% ± 2.9% in bones, whereas its TOF counterpart reduced the errors to -2.9% ± 7.1% and -15.3% ± 2.3%, respectively. The VOI-based analysis revealed that the non-TOF and TOF methods resulted in an average overestimation of 7.5% and 3.9% in or near lung lesions (n = 23) and underestimation of less than 5% for soft tissue and in or near bone lesions (n = 91). Simulation results showed that as TOF resolution improves, artifacts and quantification errors are substantially reduced. TOF PET substantially reduces artifacts and improves significantly the quantitative accuracy of standard MRAC methods. Therefore, MRAC should be less of a concern on future TOF PET/MR scanners with improved timing resolution. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
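The classwise bias figures quoted above are voxelwise relative errors averaged within each CT-derived tissue class; a minimal NumPy sketch under that reading (the array contents and the mask are hypothetical):

```python
import numpy as np

def classwise_bias(pet_mrac, pet_ctac, class_mask):
    """Mean +/- SD of the voxelwise relative error (%) of an MRAC-based
    reconstruction against the CT-based reference within one tissue class."""
    ref = pet_ctac[class_mask]
    err = (pet_mrac[class_mask] - ref) / ref * 100.0
    return err.mean(), err.std()

# Hypothetical volumes: a reference, a slightly biased copy, a lung-class mask.
rng = np.random.default_rng(0)
ctac = rng.uniform(1.0, 5.0, size=(32, 32, 32))
mrac = ctac * rng.normal(0.97, 0.05, size=ctac.shape)  # roughly -3% bias
lung_mask = rng.random(ctac.shape) < 0.2
print(classwise_bias(mrac, ctac, lung_mask))
```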
A theoretical basis for the analysis of redundant software subject to coincident errors
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.; Lee, L. D.
1985-01-01
Fundamental to the development of redundant software techniques (fault-tolerant software) is an understanding of the impact of multiple joint occurrences of coincident errors. A theoretical basis for the study of redundant software is developed which provides a probabilistic framework for empirically evaluating the effectiveness of the general (N-Version) strategy when component versions are subject to coincident errors, and permits an analytical study of the effects of these errors. The basic assumptions of the model are: (1) independently designed software components are chosen in a random sample; and (2) in the user environment, the system is required to execute on a stationary input series. The intensity of coincident errors has a central role in the model. This function describes the propensity to introduce design faults in such a way that software components fail together when executing in the user environment. The model is used to give conditions under which an N-Version system is a better strategy for reducing system failure probability than relying on a single version of software. A condition which limits the effectiveness of a fault-tolerant strategy is studied, and it is posed whether system failure probability varies monotonically with increasing N or whether an optimal choice of N exists.
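Under the model's independence baseline, the failure probability of a majority-voted N-version system on a given input is a binomial tail; coincident errors amount to positively correlated version failures that erode these gains. A sketch of the independent case, with theta a hypothetical per-version failure probability:

```python
from math import comb

def majority_failure_prob(theta, n):
    """P(a majority of n independent versions fail on an input), i.e. the
    N-version system failure probability under independence."""
    k_min = n // 2 + 1
    return sum(comb(n, k) * theta**k * (1 - theta)**(n - k)
               for k in range(k_min, n + 1))

theta = 0.01
for n in (1, 3, 5):
    print(n, majority_failure_prob(theta, n))
# 1 -> 1.0e-2, 3 -> ~2.98e-4, 5 -> ~9.85e-6: the gains that coincident
# (correlated) errors can erode or reverse.
```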
NASA Astrophysics Data System (ADS)
Peterson, Tim J.; Western, Andrew W.; Cheng, Xiang
2018-03-01
Suspicious groundwater-level observations are common and can arise for many reasons ranging from an unforeseen biophysical process to bore failure and data management errors. Unforeseen observations may provide valuable insights that challenge existing expectations and can be deemed outliers, while monitoring and data handling failures can be deemed errors, and, if ignored, may compromise trend analysis and groundwater model calibration. Ideally, outliers and errors should be identified, but to date this has been a subjective process that is not reproducible and is inefficient. This paper presents an approach to objectively and efficiently identify multiple types of errors and outliers. The approach requires only the observed groundwater hydrograph, requires no particular consideration of the hydrogeology, the drivers (e.g. pumping) or the monitoring frequency, and is freely available in the HydroSight toolbox. Herein, the algorithms and time-series model are detailed and applied to four observation bores with varying dynamics. The detection of outliers was most reliable when the observation data were acquired quarterly or more frequently. Outlier detection where the groundwater-level variance is nonstationary or the absolute trend increases rapidly was more challenging, with the former likely to result in an underestimation of the number of outliers and the latter an overestimation.
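At its core, model-based detection of this kind flags observations whose residual against a calibrated time-series forecast is large relative to the forecast uncertainty. The following is a drastically simplified sketch of that idea, not the HydroSight algorithm itself; all values are hypothetical:

```python
import numpy as np

def flag_outliers(observed, predicted, sigma, z=3.0):
    """Flag observations whose standardized residual against a time-series
    model forecast exceeds z standard deviations (simplified idea only)."""
    residual = (observed - predicted) / sigma
    return np.abs(residual) > z

obs = np.array([10.2, 10.1, 9.9, 14.8, 10.0])    # hypothetical heads (m)
pred = np.array([10.1, 10.1, 10.0, 10.0, 10.0])  # model one-step forecasts
print(flag_outliers(obs, pred, sigma=0.3))       # only the 14.8 m value flags
```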
Alan K. Swanson; Solomon Z. Dobrowski; Andrew O. Finley; James H. Thorne; Michael K. Schwartz
2013-01-01
The uncertainty associated with species distribution model (SDM) projections is poorly characterized, despite its potential value to decision makers. Error estimates from most modelling techniques have been shown to be biased due to their failure to account for spatial autocorrelation (SAC) of residual error. Generalized linear mixed models (GLMM) have the ability to...
1984-10-26
[Scanned DTIC record; the abstract is largely garbled OCR. Recoverable fragments:] test for independence; … of the product life estimator; dependent risks. … the failure times associated with different failure modes when we really should use a bivariate (or multivariate) distribution, then what is the … dependencies may be present, then what is the magnitude of the estimation error? The third specific aim will attempt to obtain bounds on the …
Clarification of terminology in medication errors: definitions and classification.
Ferner, Robin E; Aronson, Jeffrey K
2006-01-01
We have previously described and analysed some terms that are used in drug safety and have proposed definitions. Here we discuss and define terms that are used in the field of medication errors, particularly terms that are sometimes misunderstood or misused. We also discuss the classification of medication errors. A medication error is a failure in the treatment process that leads to, or has the potential to lead to, harm to the patient. Errors can be classified according to whether they are mistakes, slips, or lapses. Mistakes are errors in the planning of an action. They can be knowledge based or rule based. Slips and lapses are errors in carrying out an action - a slip through an erroneous performance and a lapse through an erroneous memory. Classification of medication errors is important because the probabilities of errors of different classes are different, as are the potential remedies.
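The taxonomy maps naturally onto a small decision structure; a hypothetical sketch of how an incident-reporting system might encode the mistake/slip/lapse distinctions (all names below are illustrative, not from the paper):

```python
from enum import Enum

class MedicationErrorClass(Enum):
    KNOWLEDGE_BASED_MISTAKE = "mistake: error in planning (knowledge based)"
    RULE_BASED_MISTAKE = "mistake: error in planning (rule based)"
    SLIP = "slip: erroneous performance of an action"
    LAPSE = "lapse: erroneous memory while acting"

def classify(planning_error: bool, rule_based: bool,
             memory_failure: bool) -> MedicationErrorClass:
    """Toy classifier following the distinctions described above."""
    if planning_error:
        return (MedicationErrorClass.RULE_BASED_MISTAKE if rule_based
                else MedicationErrorClass.KNOWLEDGE_BASED_MISTAKE)
    return (MedicationErrorClass.LAPSE if memory_failure
            else MedicationErrorClass.SLIP)

print(classify(planning_error=False, rule_based=False, memory_failure=True))
```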
NASA Technical Reports Server (NTRS)
McCarty, John P.; Lyles, Garry M.
1997-01-01
Propulsion system quality is defined in this paper as having high reliability; that is, quality is a high probability of within-tolerance performance or operation. Since failures are out-of-tolerance performance, the probability of failures and their occurrence is the difference between high- and low-quality systems. Failures can be described at 3 levels: the system failure (which is the detectable end of a failure), the failure mode (which is the failure process), and the failure cause (which is the start). Failure causes can be evaluated and classified by type. The results of typing flight-history failures show that most failures are in unrecognized modes and result from human error or noise, i.e., failures are when engineers learn how things really work. Although the study is based on US launch vehicles, a sampling of failures from other countries indicates the finding has broad application. The parameters of the design of a propulsion system are not single valued, but have dispersions associated with the manufacturing of parts. Many tests are needed to find failures if the dispersions are large relative to tolerances, which could contribute to the large number of failures in unrecognized modes.
Rossi X-Ray Timing Explorer All-Sky Monitor Localization of SGR 1627-41
NASA Astrophysics Data System (ADS)
Smith, Donald A.; Bradt, Hale V.; Levine, Alan M.
1999-07-01
The fourth unambiguously identified soft gamma repeater (SGR), SGR 1627-41, was discovered with the BATSE instrument on 1998 June 15. Interplanetary Network (IPN) measurements and BATSE data constrained the location of this new SGR to a 6° segment of a narrow (19") annulus. We present two bursts from this source observed by the All-Sky Monitor (ASM) on the Rossi X-Ray Timing Explorer. We use the ASM data to further constrain the source location to a 5' long segment of the BATSE/IPN error box. The ASM/IPN error box lies within 0.3 arcmin of the supernova remnant G337.0-0.1. The probability that a supernova remnant would fall so close to the error box purely by chance is ~5%.
RXTE All-Sky Monitor Localization of SGR 1627-41
NASA Astrophysics Data System (ADS)
Smith, D. A.; Bradt, H. V.; Levine, A. M.
1999-09-01
The fourth unambiguously identified Soft Gamma Repeater (SGR), SGR 1627-41, was discovered with the BATSE instrument on 1998 June 15 (Kouveliotou et al. 1998). Interplanetary Network (IPN) measurements and BATSE data constrained the location of this new SGR to a 6° segment of a narrow (19″) annulus (Hurley et al. 1999; Woods et al. 1998). We report on two bursts from this source observed by the All-Sky Monitor (ASM) on RXTE. We use the ASM data to further constrain the source location to a 5′ long segment of the BATSE/IPN error box. The ASM/IPN error box lies within 0.3′ of the supernova remnant (SNR) G337.0-0.1. The probability that a SNR would fall so close to the error box purely by chance is ~5%.
Li, Wen-Chin; Harris, Don; Yu, Chung-San
2008-03-01
The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents; however, there is little empirical work formally describing the relationship between the components in the model. This research analyses 41 civil aviation accidents occurring to aircraft registered in the Republic of China (ROC) between 1999 and 2006 using the HFACS framework. The results show statistically significant relationships between errors at the operational level and organizational inadequacies at both the immediately adjacent level (preconditions for unsafe acts) and higher levels in the organization (unsafe supervision and organizational influences). The pattern of the 'routes to failure' observed in the data from this analysis of civil aircraft accidents shows great similarities to that observed in the analysis of military accidents. This research lends further support to Reason's model that suggests that active failures are promoted by latent conditions in the organization. Statistical relationships linking fallible decisions in upper management levels were found to directly affect supervisory practices, thereby creating the psychological preconditions for unsafe acts and hence indirectly impairing the performance of pilots, ultimately leading to accidents.
Reliability, Safety and Error Recovery for Advanced Control Software
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2003-01-01
For long-duration automated operation of regenerative life support systems in space environments, there is a need for advanced integration and control systems that are significantly more reliable and safe, and that support error recovery and minimization of operational failures. This presentation outlines some challenges of hazardous space environments and complex system interactions that can lead to system accidents. It discusses approaches to hazard analysis and error recovery for control software and challenges of supporting effective intervention by safety software and the crew.
Progressive retry for software error recovery in distributed systems
NASA Technical Reports Server (NTRS)
Wang, Yi-Min; Huang, Yennun; Fuchs, W. K.
1993-01-01
In this paper, we describe a method of execution retry for bypassing software errors based on checkpointing, rollback, message reordering and replaying. We demonstrate how rollback techniques, previously developed for transient hardware failure recovery, can also be used to recover from software faults by exploiting message reordering to bypass software errors. Our approach intentionally increases the degree of nondeterminism and the scope of rollback when a previous retry fails. Examples from our experience with telecommunications software systems illustrate the benefits of the scheme.
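The scheme can be pictured as a retry loop that escalates both the rollback scope and the degree of message-replay nondeterminism on each failed attempt. A hypothetical sketch (restore_checkpoint and reorder_messages are placeholder hooks, not the authors' API):

```python
def progressive_retry(execute, restore_checkpoint, reorder_messages,
                      max_steps=4):
    """Retry a failed computation, escalating rollback scope and message
    reordering at each step (a sketch of the idea, not the paper's code)."""
    for step in range(1, max_steps + 1):
        restore_checkpoint(scope=step)  # roll back further on each attempt
        reorder_messages(degree=step)   # inject more nondeterminism each time
        try:
            return execute()
        except RuntimeError:
            continue                    # error not bypassed; escalate, retry
    raise RuntimeError("all retry steps exhausted")

# Hypothetical stand-in: a task that succeeds once replay differs enough.
state = {"degree": 0}

def flaky_task():
    if state["degree"] < 2:
        raise RuntimeError("software error reproduced")
    return "completed"

print(progressive_retry(flaky_task,
                        restore_checkpoint=lambda scope: None,
                        reorder_messages=lambda degree: state.update(degree=degree)))
```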
Research on Spectroscopy, Opacity, and Atmospheres
NASA Technical Reports Server (NTRS)
Kurucz, Robert L.
1996-01-01
I discuss errors in theory and in interpreting observations that are produced by the failure to consider resolution in space, time, and energy. I discuss convection in stellar model atmospheres and in stars. Large errors in abundances are possible, such as the factor-of-ten error in the Li abundance for extreme Population II stars. Finally, I discuss the variation of microturbulent velocity with depth, effective temperature, gravity, and abundance. These variations must be dealt with in computing models and grids and in any type of photometric calibration.
A Conceptual Framework for Predicting Error in Complex Human-Machine Environments
NASA Technical Reports Server (NTRS)
Freed, Michael; Remington, Roger; Null, Cynthia H. (Technical Monitor)
1998-01-01
We present a Goals, Operators, Methods, and Selection Rules-Model Human Processor (GOMS-MHP) style model-based approach to the problem of predicting human habit capture errors. Habit captures occur when the model fails to allocate limited cognitive resources to retrieve task-relevant information from memory. Lacking the unretrieved information, decision mechanisms act in accordance with implicit default assumptions, resulting in error when relied upon assumptions prove incorrect. The model helps interface designers identify situations in which such failures are especially likely.
Single Event Effect Testing of the Micron MT46V128M8
NASA Technical Reports Server (NTRS)
Stansberry, Scott; Campola, Michael; Wilcox, Ted; Seidleck, Christina; Phan, Anthony
2017-01-01
The Micron MT46V128M8 was tested for single event effects (SEE) at the Texas A&M University Cyclotron Facility (TAMU) in June of 2017. Testing revealed a sensitivity to device hang-ups classified as single event functional interrupts (SEFI) and possible soft data errors classified as single event upsets (SEU).
Micromagnetic Study of Perpendicular Magnetic Recording Media
NASA Astrophysics Data System (ADS)
Dong, Yan
With increasing areal density in magnetic recording systems, perpendicular recording has successfully replaced longitudinal recording to mitigate the superparamagnetic limit. The extensive theoretical and experimental research associated with perpendicular magnetic recording media has contributed significantly to improving magnetic recording performance. Micromagnetic studies on perpendicular recording media, including aspects of the design of hybrid soft underlayers, media noise properties, inter-grain exchange characterization and ultra-high density bit patterned media recording, are presented in this dissertation. To improve the writability of recording media, one needs to reduce the head-to-keeper spacing while maintaining a good texture growth for the recording layer. A hybrid soft underlayer, consisting of a thin crystalline soft underlayer stacked above a non-magnetic seed layer and a conventional amorphous soft underlayer, provides an alternative approach for reducing the effective head-to-keeper spacing in perpendicular recording. Micromagnetic simulations indicate that the media using a hybrid soft underlayer helps enhance the effective field and the field gradient in comparison with conventional media that uses only an amorphous soft underlayer. The hybrid soft underlayer can support a thicker non-magnetic seed layer yet achieve an equivalent or better effective field and field gradient. A noise plateau for intermediate recording densities is observed for a recording layer of typical magnetization. Medium noise characteristics and transition jitter in perpendicular magnetic recording are explored using micromagnetic simulation. The plateau is replaced by a normal linear dependence of noise on recording density for a low magnetization recording layer. We show analytically that a source of the plateau is similar to that producing the Non-Linear-Transition-Shift of signal. In particular, magnetostatic effects are predicted to produce positive correlation of jitter and thus negative correlation of noise at the densities associated with the plateau. One focus for developing perpendicular recording media is on how to extract intergranular exchange coupling and intrinsic anisotropy field dispersion. A micromagnetic numerical technique is developed to effectively separate the effects of intergranular exchange coupling and anisotropy dispersion by finding their correlation to differentiated M-H curves with different initial magnetization states, even in the presence of thermal fluctuation. The validity of this method is investigated with a series of intergranular exchange couplings and anisotropy dispersions for different media thickness. This characterization method allows for an experimental measurement employing a vibrating sample magnetometer (VSM). Bit patterned media have been suggested to extend areal density beyond 1 Tbit/in². The feasibility of 4 Tbit/in² bit patterned recording is determined by aspects of write head design and media fabrication, and is estimated by the bit error rate. Micromagnetic specifications including 2.3:1 BAR bit patterned exchange coupled composite media, trailing shield, and side shields are proposed to meet the requirement of 3×10⁻⁴ bit error rate, 4 nm fly height, 5% switching field distribution, 5% timing and 5% jitter errors for 4 Tbit/in² bit-patterned recording. Demagnetizing field distribution is examined by studying the shielding effect of the side shields on the stray field from the neighboring dots.
For recording self-assembled bit-patterned media, the head design writes two staggered tracks in a single pass and has maximum perpendicular field gradients of 580 Oe/nm along the down-track direction and 476 Oe/nm along the cross-track direction. The geometry demanded by self-assembly reduces recording density to 2.9 Tbit/in².
Estimation of Fetal Weight during Labor: Still a Challenge.
Barros, Joana Goulão; Reis, Inês; Pereira, Isabel; Clode, Nuno; Graça, Luís M
2016-01-01
To evaluate the accuracy of fetal weight prediction by ultrasonography during labor employing a formula including the linear measurements of femur length (FL) and mid-thigh soft-tissue thickness (STT). We conducted a prospective study involving singleton uncomplicated term pregnancies within 48 hours of delivery. Only pregnancies with a cephalic fetus admitted in the labor ward for elective cesarean section, induction of labor or spontaneous labor were included. We excluded all non-Caucasian women, the ones previously diagnosed with gestational diabetes and the ones with evidence of ruptured membranes. Fetal weight estimates were calculated using a previously proposed formula [estimated fetal weight = 1687.47 + (54.1 x FL) + (76.68 x STT)]. The relationship between actual birth weight and estimated fetal weight was analyzed using Pearson's correlation. The formula's performance was assessed by calculating the signed and absolute errors. Mean weight difference and signed percentage error were calculated for birth weight divided into three subgroups: < 3000 g; 3000-4000 g; and > 4000 g. We included for analysis 145 cases and found a significant, yet low, linear relationship between birth weight and estimated fetal weight (p < 0.001; R2 = 0.197), with an absolute mean error of 10.6%. The lowest mean percentage error (0.3%) corresponded to the subgroup with birth weight between 3000 g and 4000 g. This study demonstrates a poor correlation between actual birth weight and the estimated fetal weight using a formula based on femur length and mid-thigh soft-tissue thickness, both linear parameters. Although avoidance of circumferential ultrasound measurements might prove to be beneficial, a fetal weight estimation formula that is both accurate and simple to perform has yet to be found.
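The formula and its error metrics are straightforward to reproduce. A sketch using the coefficients quoted above, with hypothetical measurements (the measurement units follow the original study and are assumed here):

```python
def efw_grams(fl, stt):
    """Estimated fetal weight from the quoted linear formula:
    EFW = 1687.47 + 54.1*FL + 76.68*STT (units as in the original study)."""
    return 1687.47 + 54.1 * fl + 76.68 * stt

def signed_percent_error(estimated, actual):
    return (estimated - actual) / actual * 100.0

# Hypothetical femur length, soft-tissue thickness, and birth weight.
est = efw_grams(fl=7.3, stt=11.0)
print(round(est), round(signed_percent_error(est, actual=3450.0), 1))
```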
The Extreme Mechanics of Soft Structures
NASA Astrophysics Data System (ADS)
Reis, Pedro
2015-03-01
I will present a series of experimental investigations on the rich behavior of soft mechanical structures, which, similarly to soft materials, can undergo large deformations under a variety of loading conditions. Soft structures typically comprise slender elements that can readily undergo mechanical instabilities to achieve extreme flexibility and reversible reconfigurations. This field has come to be warmly known as `Extreme Mechanics', where one of the fundamental challenges lies in rationalizing the geometric nonlinearities that arise in the post-buckling regime. I shall focus on problems involving thin elastic rods and shells, through examples ranging from the deployment of submarine cables onto the seabed, locomotion of uniflagellar bacteria, crystallography of curved wrinkling and its usage for active aerodynamic drag reduction. The main common feature underlying this series of studies is the prominence of geometry, and its interplay with mechanics, in dictating complex mechanical behavior that is relevant and applicable over a wide range of length scales. Moreover, our findings suggest that we rethink our relationship with mechanical instabilities which, rather than modes of failure, can be embraced as opportunities for functionality that are scalable, reversible, and robust. The author acknowledges financial support from the National Science Foundation, CMMI-1351449 (CAREER).
Ohnishi, Mutsuko; Nakatani, Teruyo; Lanske, Beate; Razzaque, M. Shawkat
2011-01-01
Changes in the expression of klotho, a β-glucuronidase, contribute to the development of features that resemble those of premature aging, as well as chronic renal failure. Klotho knockout mice have increased expression of the sodium/phosphate cotransporter (NaPi2a) and 1α-hydroxylase in their kidneys, along with increased serum levels of phosphate and 1,25-dihydroxyvitamin D. These changes are associated with widespread soft-tissue calcifications, generalized tissue atrophy, and a shorter lifespan in the knockout mice. To determine the role of the increased vitamin D activities in klotho knockout animals, we generated klotho and 1α-hydroxylase double-knockout mice. These double mutants regained body weight and developed hypophosphatemia with a complete elimination of the soft-tissue and vascular calcifications that were routinely found in klotho knockout mice. The markedly increased serum fibroblast growth factor 23 and the abnormally low serum parathyroid hormone levels, typical of klotho knockout mice, were significantly reversed in the double-knockout animals. These in vivo studies suggest that vitamin D has a pathologic role in regulating abnormal mineral ion metabolism and soft-tissue anomalies of klotho-deficient mice. PMID:19225558
Failure mode analysis in adrenal vein sampling: a single-center experience.
Trerotola, Scott O; Asmar, Melissa; Yan, Yan; Fraker, Douglas L; Cohen, Debbie L
2014-10-01
To analyze failure modes in a high-volume adrenal vein sampling (AVS) practice in an effort to identify preventable causes of nondiagnostic sampling. A retrospective database was constructed containing 343 AVS procedures performed over a 10-year period. Each nondiagnostic AVS procedure was reviewed for failure mode and correlated with results of any repeat AVS. Data collected included selectivity index, lateralization index, adrenalectomy outcomes if performed, and details of AVS procedure. All AVS procedures were performed after cosyntropin stimulation, using sequential technique. AVS was nondiagnostic in 12 of 343 (3.5%) primary procedures and 2 secondary procedures. Failure was right-sided in 8 (57%) procedures, left-sided in 4 (29%) procedures, bilateral in 1 procedure, and neither in 1 procedure (laboratory error). Failure modes included diluted sample from correctly identified vein (n = 7 [50%]; 3 right and 4 left), vessel misidentified as adrenal vein (n = 3 [21%]; all right), failure to locate an adrenal vein (n = 2 [14%]; both right), cosyntropin stimulation failure (n = 1 [7%]; diagnostic by nonstimulated criteria), and laboratory error (n = 1 [7%]; specimen loss). A second AVS procedure was diagnostic in three of five cases (60%), and a third AVS procedure was diagnostic in one of one case (100%). Among the eight patients in whom AVS ultimately was not diagnostic, four underwent adrenalectomy based on diluted AVS samples, and one underwent adrenalectomy based on imaging; all five experienced improvement in aldosteronism. A substantial percentage of AVS failures occur on the left, all related to dilution. Even when technically nondiagnostic per strict criteria, some "failed" AVS procedures may be sufficient to guide therapy. Repeat AVS has a good yield. Copyright © 2014 SIR. Published by Elsevier Inc. All rights reserved.
Ramasamy, P R
2017-01-01
Open fractures of the tibia have posed great difficulty in managing both the soft tissue and the skeletal components of the injured limb. Gustilo Anderson III B open tibial fractures are more difficult to manage than I, II, and III A fractures. Stable skeletal fixation with immediate soft tissue cover has been the key to a successful outcome in treating open tibial fractures, in particular the Gustilo Anderson III B type. The longer the open wound and the greater the exposed surface of the tibial fracture and shaft, the more difficult the management becomes. Thirty-six Gustilo Anderson III B open tibial fractures managed between June 2002 and December 2013 with the "fix and shift" technique were retrospectively reviewed. All 36 patients managed by this technique had open wounds measuring >5 cm (post debridement). Under the fix and shift technique, stable fixation involved primary external fixator application or primary intramedullary nailing of the tibial fracture, and immediate soft tissue cover involved septocutaneous shift, i.e., shifting of fasciocutaneous segments based on septocutaneous perforators. The primary fracture union rate was 50% and the reoperation rate (bone stimulating procedures) was 50%. The overall fracture union rate was 100%. The rate of malunion was 14% and of deep infection 16%. Failure of septocutaneous shift occurred in 2.7%. There was no incidence of amputation. Management of Gustilo Anderson III B open tibial fractures with the "fix and shift" technique has resulted in better outcomes in terms of skeletal factors (primary fracture union, overall union, time for union, and malunion) and soft tissue factors (wound healing, flap failure, access to secondary procedures, and esthetic appearance) when compared to standard methods adopted earlier. Hence, "fix and shift" could be recommended as one of the treatment modalities for open III B tibial fractures.
Ma, Zuwei; Hong, Yi; Nelson, Devin M; Pichamuthu, Joseph E; Leeson, Cory E; Wagner, William R
2011-09-12
Biodegradable polyurethane urea (PUU) elastomers are ideal candidates for fabricating tissue engineering scaffolds with mechanical properties akin to strong and resilient soft tissues. PUU with a crystalline poly(ε-caprolactone) (PCL) macrodiol soft segment (SS) showed good elasticity and resilience at small strains (<50%) but showed poor resilience under large strains because of stress-induced crystallization of the PCL segments, with a permanent set of 677 ± 30% after tensile failure. To obtain softer and more resilient PUUs, we used noncrystalline poly(trimethylene carbonate) (PTMC) or poly(δ-valerolactone-co-ε-caprolactone) (PVLCL) macrodiols of different molecular weights as SSs that were reacted with 1,4-diisocyanatobutane and chain extended with 1,4-diaminobutane. Mechanical properties of the PUUs were characterized by tensile testing with static or cyclic loading and dynamic mechanical analysis. All of the PUUs synthesized showed large elongations at break (800-1400%) and high tensile strength (30-60 MPa). PUUs with noncrystalline SSs all showed improved elasticity and resilience relative to the crystalline PCL-based PUU, especially for the PUUs with high molecular weight SSs (PTMC 5400 M(n) and PVLCL 6000 M(n)), of which the permanent deformation after tensile failure was only 12 ± 7 and 39 ± 4%, respectively. The SS molecular weight also influenced the tensile modulus in an inverse fashion. Accelerated degradation studies in PBS containing 100 U/mL lipase showed significantly greater mass loss for the two polyester-based PUUs versus the polycarbonate-based PUU and for PVLCL versus PCL polyester PUUs. Basic cytocompatibility was demonstrated with primary vascular smooth muscle cell culture. The synthesized families of PUUs showed variable elastomeric behavior that could be explained in terms of the underlying molecular design and crystalline behavior. Depending on the application target of interest, these materials may provide options or guidance for soft tissue scaffold development.
A Convex Approach to Fault Tolerant Control
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Cox, David E.; Bauer, Frank (Technical Monitor)
2002-01-01
The design of control laws for dynamic systems with the potential for actuator failures is considered in this work. The use of Linear Matrix Inequalities allows more freedom in controller design criteria than typically available with robust control. This work proposes an extension of fault-scheduled control design techniques that can find a fixed controller with provable performance over a set of plants. Through convexity of the objective function, performance bounds on this set of plants implies performance bounds on a range of systems defined by a convex hull. This is used to incorporate performance bounds for a variety of soft and hard failures into the control design problem.
ROSAT X-Ray Observation of the Second Error Box for SGR 1900+14
NASA Technical Reports Server (NTRS)
Li, P.; Hurley, K.; Vrba, F.; Kouveliotou, C.; Meegan, C. A.; Fishman, G. J.; Kulkarni, S.; Frail, D.
1997-01-01
The positions of the two error boxes for the soft gamma repeater (SGR) 1900+14 were determined by the "network synthesis" method, which employs observations by the Ulysses gamma-ray burst and CGRO BATSE instruments. The location of the first error box has been observed at optical, infrared, and X-ray wavelengths, resulting in the discovery of a ROSAT X-ray point source and a curious double infrared source. We have recently used the ROSAT HRI to observe the second error box to complete the counterpart search. A total of six X-ray sources were identified within the field of view. None of them falls within the network synthesis error box, and a 3 sigma upper limit to any X-ray counterpart was estimated to be 6.35 x 10(exp -14) ergs/sq cm/s. The closest source is approximately 3 arcmin away, and has an estimated unabsorbed flux of 1.5 x 10(exp -12) ergs/sq cm/s. Unlike the first error box, there is no supernova remnant near the second error box. The closest one, G43.9+1.6, lies approximately 2.6° away. For these reasons, we believe that the first error box is more likely to be the correct one.
Radiation Failures in Intel 14nm Microprocessors
NASA Technical Reports Server (NTRS)
Bossev, Dobrin P.; Duncan, Adam R.; Gadlage, Matthew J.; Roach, Austin H.; Kay, Matthew J.; Szabo, Carl; Berger, Tammy J.; York, Darin A.; Williams, Aaron; LaBel, K.;
2016-01-01
In this study the 14 nm Intel Broadwell 5th generation core series 5005U-i3 and 5200U-i5 processors were mounted on Dell Inspiron laptops, MSI Cubi and Gigabyte Brix barebones and tested with Windows 8 and CentOS 7 at idle. Heavy-ion-induced hard and catastrophic failures do not appear to be related to the Intel 14 nm Tri-Gate FinFET process. They originate from a small (9 μm × 140 μm) area on the 32 nm planar PCH die (not the CPU, as initially speculated). The hard failures seem to be due to an SEE, but the exact physical mechanism has yet to be identified. Some possibilities include latch-ups, charge ion trapping or implantation, ion channels, or a combination of those (in biased conditions). The mechanism of the catastrophic failures seems related to the presence of electric power (1.05 V core voltage). The 1064 nm laser mimics ionizing radiation and induces soft and hard failures as a direct result of electron-hole pair production, not heat. The 14 nm FinFET process continues to look promising for space radiation environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, J; Wang, J; P, J
2016-06-15
Purpose: To optimize the clinical processes of radiotherapy and to reduce radiotherapy risks by implementing the powerful risk management tools of failure mode and effects analysis (FMEA) and PDCA (plan-do-check-act). Methods: A multidisciplinary QA (Quality Assurance) team from our department, consisting of oncologists, physicists, dosimetrists, therapists and an administrator, was established, and an entire workflow QA process management using FMEA and PDCA tools was implemented for the whole treatment process. After the primary process tree was created, the failure modes and risk priority numbers (RPNs) were determined by each member, and the RPNs were then averaged after team discussion. Results: 3 of 9 failure modes with RPN above 100 in practice were identified in the first PDCA cycle and further analyzed, including patient registration error, prescription error and treating the wrong patient. New process controls reduced the occurrence or detectability scores of the top 3 failure modes. Two important corrective actions reduced the highest RPNs from 300 to 50, and the error rate of radiotherapy decreased remarkably. Conclusion: FMEA and PDCA are helpful in identifying potential problems in the radiotherapy process, and were proven to improve the safety, quality and efficiency of radiation therapy in our department. The implementation of the FMEA approach may improve the understanding of the overall process of radiotherapy while identifying potential flaws in the whole process. Furthermore, repeating the PDCA cycle can bring us closer to the goal: safer and more accurate radiotherapy.
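The RPN in an FMEA is the product of the three 1-10 ratings for severity, occurrence, and detectability. A minimal sketch of scoring and ranking failure modes (the ratings below are hypothetical, not the department's):

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int       # 1-10
    occurrence: int     # 1-10
    detectability: int  # 1-10 (higher = harder to detect)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detectability

# Hypothetical ratings for the modes named in the abstract.
modes = [FailureMode("patient registration error", 8, 4, 5),
         FailureMode("prescription error", 9, 3, 6),
         FailureMode("treating the wrong patient", 10, 2, 7)]

for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(m.name, m.rpn)  # rank for corrective action; rescore each PDCA cycle
```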
Tokuda, Yasuharu; Kishida, Naoki; Konishi, Ryota; Koizumi, Shunzo
2011-03-01
Cognitive errors in the course of clinical decision-making are prevalent in many cases of medical injury. We used information on verdicts' judgments from closed claims files to determine the important cognitive factors associated with cases of medical injury. Data were collected from claims closed between 2001 and 2005 at district courts in Tokyo and Osaka, Japan. In each case, we recorded all the contributory cognitive, systemic, and patient-related factors judged in the verdicts to be causally related to the medical injury. We also analyzed the association between cognitive factors and cases involving paid compensation using a multivariable logistic regression model. Among 274 cases (mean age 49 years; 45% women), there were 122 (45%) deaths and 67 (24%) major injuries (incomplete recovery within a year). In 103 cases (38%), the verdicts ordered hospitals to pay compensation (median: 8,000,000 Japanese yen). An error in judgment (199/274, 73%) and failure of vigilance (177/274, 65%) were the most prevalent causative cognitive factors, and error in judgment was also significantly associated with paid compensation (odds ratio, 1.9; 95% confidence interval [CI], 1.0-3.4). Systemic causative factors including poor teamwork (11/274, 4%) and technology failure (5/274, 2%) were less common. The closed claims analysis based on verdicts' judgments showed that cognitive errors were common in cases of medical injury, with an error in judgment being most prevalent and closely associated with compensation payment. Reduction of this type of error is required to produce safer healthcare. © 2010 Society of Hospital Medicine.
Roberts, Rachel M; Davis, Melissa C
2015-01-01
There is a need for an evidence-based approach to training professional psychologists in the administration and scoring of standardized tests such as the Wechsler Adult Intelligence Scale (WAIS), due to substantial evidence that these tasks are associated with numerous errors that have the potential to significantly impact clients' lives. Twenty-three post-graduate psychology students underwent training in using the WAIS-IV according to a best-practice teaching model that involved didactic teaching, independent study of the test manual, and in-class practice with teacher supervision and feedback. Video recordings and test protocols from a role-played test administration were analyzed for errors according to a comprehensive checklist, with self, peer, and faculty member reviews. 91.3% of students were rated as having demonstrated competency in administration and scoring. All students were found to make errors, with substantially more errors being detected by the faculty member than by self or peers. Across all subtests, the most frequent errors related to failure to deliver standardized instructions verbatim from the manual. The failure of peer and self-reviews to detect the majority of the errors suggests that novice feedback (self or peers) may be ineffective in eliminating errors, and the use of more senior peers may be preferable. It is suggested that involving senior trainees, recent graduates and/or experienced practitioners in the training of post-graduate students may have benefits for both parties, promoting a peer-learning and continuous professional development approach to the development and maintenance of skills in psychological assessment.
Yang, F; Cao, N; Young, L; Howard, J; Logan, W; Arbuckle, T; Sponseller, P; Korssjoen, T; Meyer, J; Ford, E
2015-06-01
Though failure mode and effects analysis (FMEA) is becoming more widely adopted for risk assessment in radiation therapy, to our knowledge its output has never been validated against data on errors that actually occur. The objective of this study was to perform FMEA of a stereotactic body radiation therapy (SBRT) treatment planning process and validate the results against data recorded within an incident learning system. FMEA on the SBRT treatment planning process was carried out by a multidisciplinary group including radiation oncologists, medical physicists, dosimetrists, and IT technologists. Potential failure modes were identified through a systematic review of the process map. Failure modes were rated for severity, occurrence, and detectability on a scale of one to ten, and a risk priority number (RPN) was computed. Failure modes were then compared with historical reports identified as relevant to SBRT planning within a departmental incident learning system that has been active for two and a half years. Differences between FMEA-anticipated failure modes and existing incidents were identified. FMEA identified 63 failure modes. RPN values for the top 25% of failure modes ranged from 60 to 336. Analysis of the incident learning database identified 33 reported near-miss events related to SBRT planning. Combining both methods yielded a total of 76 possible process failures, of which 13 (17%) were missed by FMEA while 43 (57%) were identified by FMEA only. When scored for RPN, the 13 events missed by FMEA ranked within the lower half of all failure modes and exhibited significantly lower severity relative to those identified by FMEA (p = 0.02). FMEA, though valuable, is subject to certain limitations. In this study, FMEA failed to identify 17% of actual failure modes, though these were of lower risk. Similarly, an incident learning system alone fails to identify a large number of potentially high-severity process errors. Using FMEA in combination with incident learning may render an improved overview of risks within a process.
NASA Technical Reports Server (NTRS)
Spector, E.; LeBlanc, A.; Shackelford, L.
1995-01-01
This study reports on the short-term in vivo precision and absolute measurements of three combinations of whole-body scan modes and analysis software using a Hologic QDR 2000 dual-energy X-ray densitometer. A group of 21 normal, healthy volunteers (11 male and 10 female) were scanned six times, receiving one pencil-beam and one array whole-body scan on three occasions approximately 1 week apart. The following combinations of scan modes and analysis software were used: pencil-beam scans analyzed with Hologic's standard whole-body software (PB scans); the same pencil-beam analyzed with Hologic's newer "enhanced" software (EPB scans); and array scans analyzed with the enhanced software (EA scans). Precision values (% coefficient of variation, %CV) were calculated for whole-body and regional bone mineral content (BMC), bone mineral density (BMD), fat mass, lean mass, %fat and total mass. In general, there was no significant difference among the three scan types with respect to short-term precision of BMD and only slight differences in the precision of BMC. Precision of BMC and BMD for all three scan types was excellent: < 1% CV for whole-body values, with most regional values in the 1%-2% range. Pencil-beam scans demonstrated significantly better soft tissue precision than did array scans. Precision errors for whole-body lean mass were: 0.9% (PB), 1.1% (EPB) and 1.9% (EA). Precision errors for whole-body fat mass were: 1.7% (PB), 2.4% (EPB) and 5.6% (EA). EPB precision errors were slightly higher than PB precision errors for lean, fat and %fat measurements of all regions except the head, although these differences were significant only for the fat and % fat of the arms and legs. In addition EPB precision values exhibited greater individual variability than PB precision values. Finally, absolute values of bone and soft tissue were compared among the three combinations of scan and analysis modes. BMC, BMD, fat mass, %fat and lean mass were significantly different between PB scans and either of the EPB or EA scans. Differences were as large as 20%-25% for certain regional fat and BMD measurements. Additional work may be needed to examine the relative accuracy of the scan mode/software combinations and to identify reasons for the differences in soft tissue precision with the array whole-body scan mode.
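Short-term precision expressed as %CV is the per-subject coefficient of variation across repeat scans, commonly pooled as a root mean square over subjects. A sketch under that common convention (both the pooling rule and the BMD values are assumptions for illustration):

```python
import numpy as np

def rms_percent_cv(repeats):
    """Pooled short-term precision: per-subject %CV across repeat scans,
    combined as a root mean square over subjects (a common convention)."""
    repeats = np.asarray(repeats, dtype=float)  # shape: (subjects, scans)
    cv = repeats.std(axis=1, ddof=1) / repeats.mean(axis=1) * 100.0
    return float(np.sqrt(np.mean(cv**2)))

# Hypothetical whole-body BMD (g/cm^2), 4 subjects x 3 repeat scans.
bmd = [[1.101, 1.095, 1.104],
       [0.982, 0.990, 0.985],
       [1.250, 1.243, 1.248],
       [1.033, 1.040, 1.036]]
print(rms_percent_cv(bmd))  # sub-1% CV, in line with the values reported
```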
A hard-soft microfluidic-based biosensor flow cell for SPR imaging application.
Liu, Changchun; Cui, Dafu; Li, Hui
2010-09-15
An ideal microfluidic-based biosensor flow cell should have not only a "soft" interface for high-strength sealing with biosensing chips, but also a "hard" macro-to-micro interface for tubing connection. Since these properties are mutually exclusive, no single material can provide the advantages of both. In this paper, we explore the application of a SiO(2) thin film, deposited by plasma-enhanced chemical vapor deposition (PECVD) technology, as an intermediate layer for irreversibly adhering polydimethylsiloxane (PDMS) to a plastic substrate, and develop a hard-soft, compact, robust microfluidic-based biosensor flow cell for the multi-array immunoassay application of surface plasmon resonance (SPR) imaging. This hard-soft biosensor flow cell consists of one rigid, computer numerically controlled (CNC)-machined poly(methyl methacrylate) (PMMA) base coated with a 200 nm thick SiO(2) thin film, and one soft PDMS microfluidic layer. This novel microfluidic-based biosensor flow cell not only keeps the original advantages of a conventional PDMS-based biosensor flow cell, such as the intrinsically soft interface, ease of fabrication, and low cost, but also has a rigid, robust, easy-to-use interface for tubing connection and can be operated up to 185 kPa in aqueous environments without failure. Its application was successfully demonstrated with two types of experiments by coupling with an SPR imaging biosensor: the real-time monitoring of immunoglobulin G (IgG) interaction, as well as the detection of sulfamethoxazole (SMOZ) and sulfamethazine (SMZ) with sensitivities of 3.5 and 0.6 ng/mL, respectively. This novel hard-soft microfluidic device is also useful for a variety of other biosensor flow cells. Copyright 2010 Elsevier B.V. All rights reserved.
Third-order polynomial model for analyzing stickup state laminated structure in flexible electronics
NASA Astrophysics Data System (ADS)
Meng, Xianhong; Wang, Zihao; Liu, Boya; Wang, Shuodao
2018-02-01
Laminated hard-soft integrated structures play a significant role in the fabrication and development of flexible electronic devices. Flexible electronics are soft and lightweight, and can be folded, twisted, flipped inside-out, or pasted onto other surfaces of arbitrary shape. In this paper, an analytical model is presented to study the mechanics of laminated hard-soft structures in flexible electronics under a stickup state. Third-order polynomials are used to describe the displacement field, and the principle of virtual work is adopted to derive the governing equations and boundary conditions. The normal strain and the shear stress along the thickness direction in the bi-material region are obtained analytically and agree well with results from finite element analysis. The analytical model can be used to analyze stickup-state laminated structures, and can serve as a valuable reference for the failure prediction and optimal design of flexible electronics in the future.
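A third-order polynomial model of this kind typically expands the in-plane displacement in powers of the thickness coordinate. A hedged sketch of such kinematics and the variational statement (a common form; the paper's exact field may differ):

```latex
% Assumed third-order expansion of the displacement field through the
% thickness coordinate z (a common choice; the paper's field may differ):
\begin{align}
  u(x,z) &= u_0(x) + z\,\phi_1(x) + z^2\,\phi_2(x) + z^3\,\phi_3(x), \\
  w(x,z) &= w_0(x),
\end{align}
% with u_0, \phi_1, \phi_2, \phi_3, w_0 determined from the principle of
% virtual work,
\begin{equation}
  \int_V \sigma_{ij}\,\delta\varepsilon_{ij}\,\mathrm{d}V
  = \delta W_{\mathrm{ext}},
\end{equation}
% which yields the governing equations and boundary conditions.
```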
New Platforms for Characterization of Biological Material Failure and Resilience Properties
NASA Astrophysics Data System (ADS)
Brown, Katherine; Butler, Benjamin J.; Nguyen, Thuy-Tien N.; Sorry, David; Williams, Alun; Proud, William G.
2017-06-01
Obtaining information about the material responses of viscoelastic soft matter, such as polymers and foams, has required adaptation of techniques traditionally used with hard condensed matter. More recently, it has been recognized that understanding the strain-rate behavior of natural and synthetic soft biological materials poses even greater challenges for materials research due to their heterogeneous composition and structural complexity. Expanding fundamental knowledge about how these classes of biomaterials function under different loading regimes is of considerable interest in both fundamental and applied research. A comparative overview of methods, developed in our laboratory and elsewhere, for determining material responses of cells and soft tissues over a wide range of strain rates (quasi-static to blast loading) will be presented. Examples will illustrate how data are obtained for studying material responses of cells and tissues. Strengths and weaknesses of current approaches will be discussed, with particular emphasis on challenges associated with the development of realistic experimental and computational models for trauma and other disease indications.
NASA Astrophysics Data System (ADS)
Gleick, P. H.
2010-12-01
The failure of traditional water management systems in the 20th century -- what I call the "hard path for water" -- is evident in several ways, including the persistent inability to meet basic human needs for safe water and adequate sanitation for vast populations, ongoing and accelerating aquatic ecosystem collapses, and growing political disputes over water allocation, management, and use, even in regions where substantial investment in water has been made. Progress in resolving these problems, especially in the face of unavoidable climate changes, growing populations, and constrained financial systems, will require bridging hydrologic and social sciences in new ways. Integrating social and cultural knowledge with new economic and technological tools and classical hydrologic and climatological sciences can produce a new "soft path for water" that offers the opportunity to move toward sustainable water systems. This talk will define the soft path for water and offer examples of innovative steps already being taken along that path in the western United States, South Africa, India, and elsewhere.
Crack blunting and the strength of soft elastic solids
NASA Astrophysics Data System (ADS)
Hui, C.-Y.; Jagota, A.; Bennison, S. J.; Londono, J. D.
2003-06-01
When a material is so soft that the cohesive strength (or adhesive strength, in the case of interfacial fracture) exceeds the elastic modulus of the material, we show that a crack will blunt instead of propagating. Large-deformation finite-element model (FEM) simulations of crack initiation, in which the debonding processes are quantified using a cohesive zone model, are used to support this hypothesis. An approximate analytic solution, which agrees well with the FEM simulation, gives additional insight into the blunting process. The consequence of this result on the strength of soft, rubbery materials is the main topic of this paper. We propose two mechanisms by which crack growth can occur in such blunted regions. We have also performed experiments on two different elastomers to demonstrate elastic blunting. In one system, we present some details on a void growth mechanism for ultimate failure, post-blunting. Finally, we demonstrate how crack blunting can shed light on some long-standing problems in the area of adhesion and fracture of elastomers.
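The blunting condition described above can be stated compactly; the dimensionless form below is a hedged restatement of the abstract's criterion, not a formula quoted from the paper:

```latex
% With cohesive (or adhesive, for interfacial fracture) strength \sigma_c
% and small-strain elastic modulus E:
\frac{\sigma_c}{E} \gtrsim 1
\quad\Longrightarrow\quad
\text{the crack tip blunts elastically instead of propagating.}
```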
Keers, Richard N; Williams, Steven D; Cooke, Jonathan; Ashcroft, Darren M
2015-01-01
Objectives: To investigate the underlying causes of intravenous medication administration errors (MAEs) in National Health Service (NHS) hospitals. Setting: Two NHS teaching hospitals in the North West of England. Participants: Twenty nurses working in a range of inpatient clinical environments were identified and recruited using purposive sampling at each study site. Primary outcome measures: Semistructured interviews were conducted with nurse participants using the critical incident technique, in which they were asked to discuss perceived causes of intravenous MAEs they had been directly involved with. Transcribed interviews were analysed using the Framework approach, and emerging themes were categorised according to Reason's model of accident causation. Results: In total, 21 intravenous MAEs were discussed, containing 23 individual active failures: slips and lapses (n=11), mistakes (n=8) and deliberate violations of policy (n=4). Each active failure was associated with a range of error- and violation-provoking conditions. The working environment was implicated when nurses lacked healthcare team support and/or were exposed to a perceived increased workload during ward rounds, shift changes or emergencies. Nurses frequently reported that the quality of intravenous dose-checking activities was compromised by high perceived workload and working relationships. Nurses described using approaches such as subconscious functioning and prioritising to manage their duties, which at times contributed to errors. Conclusions: Complex interactions between active and latent failures can lead to intravenous MAEs in hospitals. Future interventions may need to be multimodal in design to mitigate these risks and reduce the burden of intravenous MAEs. PMID:25770226
Study on Flexible Pavement Failures in Soft Soil Tropical Regions
NASA Astrophysics Data System (ADS)
Jayakumar, M.; Chee Soon, Lee
2015-04-01
Road networks have grown rapidly over time; in Malaysia, road development began under British colonization owing to its significant impact on transportation. Flexible pavement, the major road type in Malaysia, has been deteriorating through various types of distress that reduce the serviceability of the pavement structure. This paper discusses pavement condition assessments carried out in Sarawak and Sabah, Malaysia, to develop design solutions for flexible pavement failures. Field tests were conducted to examine the subgrade strength of existing roads in Sarawak at various failure locations and to assess the impact of subgrade strength on pavement failures. Outcomes from the field condition assessment and subgrade testing showed that the critical causes of pavement failure are inadequate design and maintenance of the drainage system and shoulder cross fall, along with inadequate pavement thickness, which may result from assuming a conservative value of soil strength at optimum moisture content, whereas the existing and expected subgrade strengths at equilibrium moisture content are far lower. Our further research shows that stabilizing existing recycled asphalt and base materials for use as a sub-base, together with a bitumen-stabilized open-graded base in the pavement composition, may be a viable solution to these pavement failures.
Dahlin, C; Simion, M; Hatano, N
2010-12-01
In the present prospective study, bone augmentation by guided bone regeneration (GBR) in combination with bovine hydroxyapatite (BHA) as filling material was evaluated with regard to soft and hard tissue stability over time. Implant survival, radiologic bone level (marginal bone level [MBL]), and clinical soft tissue parameters (marginal soft tissue level [MSTL]) were observed. Twenty patients received a total of 41 implants (Brånemark System, Nobel Biocare, Göteborg, Sweden) in conjunction with GBR treatment. The end point of the study was 5 years after implant placement. The cumulative implant survival rate was 97.5%, corresponding to one implant failure. Radiologic evaluation of the MBL demonstrated a crestal bone height above the level of the fixture head; the MBL changed from -3.51 to -2.38 mm (p < .001). The MSTL was -1.52 mm at baseline and -1.15 mm at the 5-year follow-up (p < .04), demonstrating a stable submucosal crown margin throughout the study period. GBR treatment in combination with a xenogeneic filling material (BHA) is a viable treatment option for maintaining stable hard and soft tissue levels in augmentative procedures related to oral implant treatment.
Mardinger, Ofer; Chaushu, Gavriel; Ghelfan, Oded; Nissan, Joseph
2009-06-01
The normal bone resorption after tooth extraction can be significantly aggravated in the case of pre-existing severe bone loss and chronic infection. Bone augmentation procedures have been proposed, but they require adequate closure of soft tissues. We propose the use of intrasocket reactive tissue to cover extraction sites augmented by bovine bone mineral graft to promote the success of the graft procedure. The study included 24 patients with severe bone loss and chronic pathology in 27 sites. The intrasocket reactive soft tissue was elevated from the bony walls in a subperiosteal plane. Porous bovine or allograft bone mineral was placed in the extraction site without membranes, and the intrasocket reactive soft tissue was sutured over the grafting material to seal the coronal portion of the socket. Twenty-seven implants were placed 6 months after bone augmentation. Healing progressed uneventfully. Postoperative morbidity was minimal. There was no leakage or infection of the grafting material. The mean time to implant placement was 7.8 months. Supplemental augmentation was not needed. There were no implant failures. Follow-up ranged from 6 to 36 months (mean, 15 months). All implants were rehabilitated with fixed prostheses. Intrasocket reactive soft tissue can be used predictably to obtain primary closure of augmented extraction sites with severe bone loss with minimal postoperative morbidity.
The effect of both a thoracic trauma and a soft-tissue trauma on fracture healing in a rat model
2011-01-01
Background and purpose There is some clinical evidence that fracture healing is impaired in multiply injured patients. Nothing is known, however, about the effects of various types of injuries and their contribution to a possible disturbance of the fracture-healing process. We investigated the effect of a thoracic trauma and an additional soft-tissue trauma on fracture healing in a rat tibia model. Methods 3 groups of rats were operated on: group A with a simple fracture of the tibia and fibula, group B with a fracture and an additional thoracic trauma, and group C with a fracture, thoracic trauma, and an additional soft-tissue trauma. The fracture and the soft-tissue injury were produced by a special guillotine-like device, and the thoracic trauma by a blast wave generator. After one day, the serum level of IL-6 was quantified, and at the end of the study (28 days) the mechanical properties and the callus volume of the healed tibia were determined. Results Increasing the severity of the injury caused IL-6 levels to more than double 1 day after injury, halved the load to failure in mechanical tests, and led to reduced callus volume after 28 days of healing. Interpretation Fracture healing is impaired when additional thoracic trauma and soft-tissue trauma occur. PMID:21463222
ERIC Educational Resources Information Center
Rast, Philippe; Zimprich, Daniel; Van Boxtel, Martin; Jolles, Jellemer
2009-01-01
The Cognitive Failures Questionnaire (CFQ) is designed to assess a person's proneness to committing cognitive slips and errors in the completion of everyday tasks. Although the CFQ is a widely used instrument, its factor structure remains an issue of scientific debate. The present study used data of a representative sample (N = 1,303, 24-83 years…
Applying airline safety practices to medication administration.
Pape, Theresa M
2003-04-01
Medication administration errors (MAEs) continue to be a major problem for health care institutions, nurses, and patients. MAEs are often the result of system failures that lead to patient injury, increased hospital costs, and blame. Costs include those related to increased hospital length of stay and legal expenses. Contributing factors include distractions, lack of focus, poor communication, and failure to follow standard protocols during medication administration.
Wang, Yao; Jing, Lei; Ke, Hong-Liang; Hao, Jian; Gao, Qun; Wang, Xiao-Xun; Sun, Qiang; Xu, Zhi-Jun
2016-09-20
Accelerated aging tests under electrical stress were conducted on one type of LED lamp, and the differences between online and offline tests of luminous flux degradation are studied in this paper. Switching between the two test modes is achieved with an adjustable AC voltage-stabilized power source. Experimental results show that an exponential fit of the luminous flux degradation in online tests gives a better fit for most lamps, and that the degradation rate of the luminous flux measured online is always lower than that measured offline. Bayes estimation and the Weibull distribution are used to calculate the failure probabilities under the accelerated voltages, and the reliability of the lamps at the rated voltage of 220 V is then estimated using the inverse power law model. Results show that the relative error of the lifetime estimate from offline tests increases as the failure probability decreases, and cannot be neglected when the failure probability is less than 1%. The relative errors of the lifetime estimates are 7.9%, 5.8%, 4.2%, and 3.5% at failure probabilities of 0.1%, 1%, 5%, and 10%, respectively.
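A minimal sketch of the lifetime-extrapolation step is given below: two-parameter Weibull fits at two accelerated voltages, followed by inverse-power-law extrapolation to the rated 220 V. All failure times and accelerated voltage levels are synthetic, and the simple maximum-likelihood fit stands in for the Bayes estimation used in the paper.

```python
import numpy as np
from scipy.stats import weibull_min

# Synthetic failure times (hours) at two assumed accelerated voltages
t_v1 = np.array([2100., 2500., 2800., 3300., 3900.])  # at V1 = 260 V
t_v2 = np.array([900., 1100., 1300., 1500., 1800.])   # at V2 = 300 V
V1, V2 = 260.0, 300.0

# Two-parameter Weibull fit (location fixed at 0): shape beta, scale eta
beta1, _, eta1 = weibull_min.fit(t_v1, floc=0)
beta2, _, eta2 = weibull_min.fit(t_v2, floc=0)

# Inverse power law eta(V) = C / V**n, solved from the two fitted scales
n = np.log(eta1 / eta2) / np.log(V2 / V1)
C = eta1 * V1 ** n
eta_rated = C / 220.0 ** n                 # characteristic life at 220 V

# Time to reach failure probability p: t_p = eta * (-ln(1-p))**(1/beta)
p = 0.01
t_p = eta_rated * (-np.log(1 - p)) ** (1 / beta1)
print(f"n = {n:.2f}, eta(220 V) = {eta_rated:.0f} h, B1 life = {t_p:.0f} h")
```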
Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.
Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter
2016-08-24
Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least squares approach is used to decouple and convert the magnetic field signal to force output, eliminating non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design. PMID:27563908
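The decoupling step can be illustrated with a small moving-least-squares sketch: each force prediction comes from a locally weighted linear fit of calibration data around the queried magnetic-field reading. The Gaussian weight, linear basis, and synthetic calibration data are assumptions for illustration; the paper's basis and weighting may differ.

```python
import numpy as np

def mls_predict(B_query, B_cal, F_cal, h=0.2):
    """Predict (Fx, Fy, Fz) from a 3-axis field reading via moving least squares."""
    d2 = np.sum((B_cal - B_query) ** 2, axis=1)
    w = np.sqrt(np.exp(-d2 / (2 * h ** 2)))            # Gaussian weights (sqrt for WLS)
    X = np.hstack([np.ones((len(B_cal), 1)), B_cal])   # linear basis [1, Bx, By, Bz]
    coef, *_ = np.linalg.lstsq(w[:, None] * X, w[:, None] * F_cal, rcond=None)
    return np.hstack([1.0, B_query]) @ coef

# Toy calibration set: linear field-force map with cross-talk plus noise
rng = np.random.default_rng(1)
B_cal = rng.uniform(-1, 1, (200, 3))
A = np.array([[2.0, 0.3, 0.0], [0.1, 1.8, 0.2], [0.0, 0.2, 2.5]])
F_cal = B_cal @ A.T + 0.01 * rng.standard_normal((200, 3))
print(mls_predict(np.array([0.2, -0.1, 0.4]), B_cal, F_cal))
```

Because the fit is re-weighted at every query point, the mapping can absorb the non-linearity and cross-talk mentioned above without committing to a single global model.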
Li, Lingyun; Zhang, Fuming; Hu, Min; Ren, Fuji; Chi, Lianli; Linhardt, Robert J.
2016-01-01
Low molecular weight heparins are complex polycomponent drugs that have recently become amenable to top-down analysis using liquid chromatography-mass spectrometry. Even using the open-source deconvolution software DeconTools and the automatic structural assignment software GlycReSoft, the comparison of two or more low molecular weight heparins is extremely time-consuming, taking about a week for an expert analyst, and provides no guarantee of accuracy. Efficient data processing tools are required to improve analysis. This study uses Microsoft Excel™ Visual Basic for Applications to extend Excel's standard functionality with macro functions and specific mathematical modules for mass spectrometric data processing. The program developed enables the comparison of top-down analytical glycomics data on two or more low molecular weight heparins. The current study describes this new program, GlycCompSoft, which has a low error rate and good time efficiency in the automatic processing of large data sets. Experimental results based on three lots of Lovenox®, Clexane® and three generic enoxaparin samples show that the run time of GlycCompSoft decreases from 11 to 2 seconds as the data processed decrease from 18000 to 1500 rows. PMID:27942011
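The core comparison task, matching deconvoluted component masses between samples within a tolerance, is easy to state in Python even though GlycCompSoft itself is an Excel VBA macro package. The greedy matcher and the masses below are hypothetical stand-ins, used only to illustrate the kind of bookkeeping the tool automates.

```python
def match_components(masses_a, masses_b, tol_ppm=20.0):
    """Greedily pair masses from two samples that agree within tol_ppm."""
    matched, used = [], set()
    for ma in sorted(masses_a):
        for j, mb in enumerate(sorted(masses_b)):
            if j not in used and abs(ma - mb) / ma * 1e6 <= tol_ppm:
                matched.append((ma, mb))
                used.add(j)
                break
    return matched

# Hypothetical deconvoluted component masses (Da) from two LMWH lots
sample1 = [1154.32, 1736.48, 2318.61]
sample2 = [1154.34, 1736.45, 2900.77]
print(match_components(sample1, sample2))  # -> [(1154.32, 1154.34), (1736.48, 1736.45)]
```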
Flight-deck automation - Promises and problems
NASA Technical Reports Server (NTRS)
Wiener, E. L.; Curry, R. E.
1980-01-01
The paper analyzes the role of human factors in flight-deck automation, identifies problem areas, and suggests design guidelines. Flight-deck automation using microprocessor technology and display systems improves performance and safety while leading to a decrease in size, cost, and power consumption. On the other hand, negative factors such as failure of automatic equipment, automation-induced error compounded by crew error, crew error in equipment set-up, failure to heed automatic alarms, and loss of proficiency must also be taken into account. Among the problem areas discussed are automation of control tasks, monitoring of complex systems, psychosocial aspects of automation, and alerting and warning systems. Guidelines are suggested for designing, utilising, and improving control and monitoring systems. Investigation into flight-deck automation systems is important because the knowledge gained can be applied to other systems, such as air traffic control and nuclear power generation, but the many problems encountered with automated systems need to be analyzed and overcome in future research.
NASA Technical Reports Server (NTRS)
Kwon, Jin H.; Lee, Ja H.
1989-01-01
The far-field beam pattern and the power-collection efficiency are calculated for a multistage laser-diode-array amplifier consisting of about 200,000 5-W laser diode arrays with random distributions of phase and orientation errors and random diode failures. The numerical calculation shows that the far-field beam pattern is little affected by random failures of up to 20 percent of the laser diodes, with a reference receiving efficiency of 80 percent in the center spot. The allowable random phase variation among laser diodes due to probable manufacturing errors is about 0.2 times the wavelength, and the maximum allowable orientation error is about 20 percent of the diffraction angle of a single laser diode aperture (about 1 cm). The preliminary results indicate that the amplifier could be used for space beam-power transmission with an efficiency of about 80 percent for a moderate-size (3-m-diameter) receiver placed at a distance of less than 50,000 km.
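The tolerance of such an array to random failures and phase errors can be sanity-checked with a toy on-axis phasor sum. The Monte Carlo below assumes Gaussian phase errors and compares against the coherent-combining estimate (fraction alive)^2 * exp(-sigma_phi^2); it illustrates the scaling only and will not reproduce the paper's full diffraction calculation or its specific 80 percent figure.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000                      # number of diode emitters
sigma_phi = 0.2 * 2 * np.pi     # assumed RMS phase error of 0.2 wavelengths
fail_frac = 0.20                 # 20% random diode failures

alive = rng.random(N) > fail_frac
phases = rng.normal(0.0, sigma_phi, N)
field = np.sum(alive * np.exp(1j * phases))     # coherent on-axis field
mc = np.abs(field) ** 2 / N ** 2                # vs. an ideal, fully phased array
analytic = (1 - fail_frac) ** 2 * np.exp(-sigma_phi ** 2)
print(f"Monte Carlo: {mc:.3f}, analytic estimate: {analytic:.3f}")
```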
Akbarzadeh, A; Ay, M R; Ahmadian, A; Alam, N Riahi; Zaidi, H
2013-02-01
Hybrid PET/MRI offers many advantages over PET/CT, including improved soft-tissue contrast, reduced radiation exposure, and truly simultaneous, multi-parametric imaging capabilities. However, the lack of a well-established methodology for MR-based attenuation correction is hampering further development and wider acceptance of this technology. We assess the impact of ignoring bone attenuation, and of using different numbers of tissue classes to generate the attenuation map, on the accuracy of attenuation correction of PET data. This work was performed using simulation studies based on the XCAT phantom and on clinical input data. For the latter, PET and CT images of patients were used as input to the analytic simulation model with realistic activity distributions, and CT-based attenuation correction served as the reference for comparison. For both phantom and clinical studies, the reference attenuation map was segmented into varying numbers of tissue classes to produce three-class (air, soft tissue and lung), four-class (air, lung, soft tissue and cortical bone) and five-class (air, lung, soft tissue, cortical bone and spongy bone) attenuation maps. The phantom studies demonstrated that ignoring bone increases the relative error by up to 6.8% in the body and up to 31.0% in bony regions. Likewise, the simulated clinical studies showed that the mean relative error reached 15% for lesions located in the body and 30.7% for lesions located in bone when bone is neglected. These results demonstrate an underestimation of about 30% in tracer uptake when bone is neglected, which in turn imposes a substantial loss of quantitative accuracy for PET images produced by hybrid PET/MRI systems. Including bone in the attenuation map will considerably improve the accuracy of MR-guided attenuation correction in hybrid PET/MR and enable quantitative PET imaging on these platforms.
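A sketch of the attenuation-map classification step is shown below: a continuous reference mu-map is segmented into a small number of tissue classes, each assigned one representative linear attenuation coefficient at 511 keV. The thresholds and coefficients are typical textbook values chosen for illustration, not those used in the study.

```python
import numpy as np

# Representative 511 keV linear attenuation coefficients (cm^-1), assumed
MU_511 = {"air": 0.0, "lung": 0.018, "soft": 0.096, "bone": 0.151}

def classify_mu_map(mu_ref, include_bone=True):
    """Collapse a continuous mu-map into 3 (no bone) or 4 tissue classes."""
    mu = np.zeros_like(mu_ref)                       # background stays air
    mu[(mu_ref > 0.005) & (mu_ref <= 0.05)] = MU_511["lung"]
    mu[(mu_ref > 0.05) & (mu_ref <= 0.12)] = MU_511["soft"]
    # In a 3-class map, bone voxels are (incorrectly) treated as soft tissue
    mu[mu_ref > 0.12] = MU_511["bone"] if include_bone else MU_511["soft"]
    return mu

mu_ref = np.array([0.0, 0.02, 0.096, 0.16])          # toy reference voxels
print(classify_mu_map(mu_ref, include_bone=False))   # 3-class: bone -> soft
print(classify_mu_map(mu_ref, include_bone=True))    # 4-class: bone kept
```

Underestimating mu in bone voxels under-corrects PET counts along every line of response that crosses bone, which is the mechanism behind the roughly 30% errors reported above.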