Sample records for significantly reduces errors

  1. Automated drug dispensing system reduces medication errors in an intensive care setting.

    PubMed

    Chapuis, Claire; Roustit, Matthieu; Bal, Gaëlle; Schwebel, Carole; Pansu, Pascal; David-Tchouda, Sandra; Foroni, Luc; Calop, Jean; Timsit, Jean-François; Allenet, Benoît; Bosson, Jean-Luc; Bedouch, Pierrick

    2010-12-01

    We aimed to assess the impact of an automated dispensing system on the incidence of medication errors related to picking, preparation, and administration of drugs in a medical intensive care unit. We also evaluated the clinical significance of such errors and user satisfaction. Preintervention and postintervention study involving a control and an intervention medical intensive care unit. Two medical intensive care units in the same department of a 2,000-bed university hospital. Adult medical intensive care patients. After a 2-month observation period, we implemented an automated dispensing system in one of the units (study unit) chosen randomly, with the other unit being the control. The overall error rate was expressed as a percentage of total opportunities for error. The severity of errors was classified according to National Coordinating Council for Medication Error Reporting and Prevention categories by an expert committee. User satisfaction was assessed through self-administered questionnaires completed by nurses. A total of 1,476 medications for 115 patients were observed. After automated dispensing system implementation, we observed a reduced percentage of total opportunities for error in the study compared to the control unit (13.5% and 18.6%, respectively; p<.05); however, no significant difference was observed before automated dispensing system implementation (20.4% and 19.3%, respectively; not significant). Before-and-after comparisons in the study unit also showed a significantly reduced percentage of total opportunities for error (20.4% and 13.5%; p<.01). An analysis of detailed opportunities for error showed a significant impact of the automated dispensing system in reducing preparation errors (p<.05). Most errors caused no harm (National Coordinating Council for Medication Error Reporting and Prevention category C). The automated dispensing system did not reduce errors causing harm. 
Finally, the mean for working conditions improved from 1.0±0.8 to 2.5±0.8 on the four-point Likert scale. The implementation of an automated dispensing system reduced overall medication errors related to picking, preparation, and administration of drugs in the intensive care unit. Furthermore, most nurses favored the new drug dispensation organization.

  2. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting.

    PubMed

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-11-01

    Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Interruptions can lead to medication verification and administration errors. 
Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
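
The pre/post comparisons in record 2 use Fisher's exact test on 2×2 tables. A minimal pure-Python sketch (not the authors' code) reproduces the reported p=0.038 for the syringe-volume task; note that the published value matches a one-sided hypergeometric tail, which is an inference on our part:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    P(cell (1,1) >= a) under the hypergeometric null with fixed margins."""
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2
    total = comb(n, col1)
    hi = min(row1, col1)
    return sum(comb(row1, k) * comb(row2, col1 - k)
               for k in range(a, hi + 1)) / total

# Syringe-volume verification: 16/18 nurses erred preintervention vs 11/19 post
p = fisher_exact_one_sided(16, 2, 11, 8)  # ~0.038, matching the reported value
```

The same function gives ~0.012 for the ambulatory-pump verification table (17/18 preintervention vs 11/19 postintervention), matching the second reported value.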

  3. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting

    PubMed Central

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-01-01

    Background Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. Objective The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. Methods The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Results Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Conclusions Interruptions can lead to medication verification and administration errors. 
Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required. PMID:24906806

  4. Reduced error signalling in medication-naive children with ADHD: associations with behavioural variability and post-error adaptations

    PubMed Central

    Plessen, Kerstin J.; Allen, Elena A.; Eichele, Heike; van Wageningen, Heidi; Høvik, Marie Farstad; Sørensen, Lin; Worren, Marius Kalsås; Hugdahl, Kenneth; Eichele, Tom

    2016-01-01

    Background We examined the blood-oxygen level–dependent (BOLD) activation in brain regions that signal errors and their association with intraindividual behavioural variability and adaptation to errors in children with attention-deficit/hyperactivity disorder (ADHD). Methods We acquired functional MRI data during a Flanker task in medication-naive children with ADHD and healthy controls aged 8–12 years and analyzed the data using independent component analysis. For components corresponding to performance monitoring networks, we compared activations across groups and conditions and correlated them with reaction times (RT). Additionally, we analyzed post-error adaptations in behaviour and motor component activations. Results We included 25 children with ADHD and 29 controls in our analysis. Children with ADHD displayed reduced activation to errors in cingulo-opercular regions and higher RT variability, but no differences of interference control. Larger BOLD amplitude to error trials significantly predicted reduced RT variability across all participants. Neither group showed evidence of post-error response slowing; however, post-error adaptation in motor networks was significantly reduced in children with ADHD. This adaptation was inversely related to activation of the right-lateralized ventral attention network (VAN) on error trials and to task-driven connectivity between the cingulo-opercular system and the VAN. Limitations Our study was limited by the modest sample size and imperfect matching across groups. Conclusion Our findings show a deficit in cingulo-opercular activation in children with ADHD that could relate to reduced signalling for errors. Moreover, the reduced orienting of the VAN signal may mediate deficient post-error motor adaptions. Pinpointing general performance monitoring problems to specific brain regions and operations in error processing may help to guide the targets of future treatments for ADHD. PMID:26441332

  5. Separate Medication Preparation Rooms Reduce Interruptions and Medication Errors in the Hospital Setting: A Prospective Observational Study.

    PubMed

    Huckels-Baumgart, Saskia; Baumgart, André; Buschmann, Ute; Schüpfer, Guido; Manser, Tanja

    2016-12-21

    Interruptions and errors during the medication process are common, but published literature shows no evidence supporting whether separate medication rooms are an effective single intervention in reducing interruptions and errors during medication preparation in hospitals. We tested the hypothesis that the rate of interruptions and reported medication errors would decrease as a result of the introduction of separate medication rooms. Our aim was to evaluate the effect of separate medication rooms on interruptions during medication preparation and on self-reported medication error rates. We performed a preintervention and postintervention study using direct structured observation of nurses during medication preparation and daily structured medication error self-reporting of nurses by questionnaires in 2 wards at a major teaching hospital in Switzerland. A volunteer sample of 42 nurses was observed preparing 1498 medications for 366 patients over 17 hours preintervention and postintervention on both wards. During 122 days, nurses completed 694 reporting sheets containing 208 medication errors. After the introduction of the separate medication room, the mean interruption rate decreased significantly from 51.8 to 30 interruptions per hour (P < 0.01), and the interruption-free preparation time increased significantly from 1.4 to 2.5 minutes (P < 0.05). Overall, the mean medication error rate per day was also significantly reduced after implementation of the separate medication room from 1.3 to 0.9 errors per day (P < 0.05). The present study showed the positive effect of a hospital-based intervention; after the introduction of the separate medication room, the interruption and medication error rates decreased significantly.

  6. COMPLEX VARIABLE BOUNDARY ELEMENT METHOD: APPLICATIONS.

    USGS Publications Warehouse

    Hromadka, T.V.; Yen, C.C.; Guymon, G.L.

    1985-01-01

    The complex variable boundary element method (CVBEM) is used to approximate several potential problems where analytical solutions are known. A modeling result produced from the CVBEM is a measure of relative error in matching the known boundary condition values of the problem. A CVBEM error-reduction algorithm is used to reduce the relative error of the approximation by adding nodal points in boundary regions where error is large. From the test problems, overall error is reduced significantly by utilizing the adaptive integration algorithm.
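
The error-reduction algorithm in record 6 adds nodal points where the boundary error is large. As a loose analogy only (plain 1D piecewise-linear interpolation, not the CVBEM itself), greedy refinement at the worst midpoint shows how adaptive node insertion drives down overall error:

```python
import math

def refine(f, nodes, n_rounds):
    """Adaptive refinement: repeatedly bisect the interval whose midpoint
    shows the largest interpolation error against the known solution f."""
    for _ in range(n_rounds):
        worst_i, worst_err = 0, -1.0
        for i in range(len(nodes) - 1):
            mid = 0.5 * (nodes[i] + nodes[i + 1])
            approx = 0.5 * (f(nodes[i]) + f(nodes[i + 1]))  # linear interpolant at mid
            err = abs(f(mid) - approx)
            if err > worst_err:
                worst_i, worst_err = i, err
        nodes.insert(worst_i + 1, 0.5 * (nodes[worst_i] + nodes[worst_i + 1]))
    return nodes

def max_midpoint_error(f, nodes):
    """Largest midpoint error of the piecewise-linear interpolant."""
    return max(abs(f(0.5 * (nodes[i] + nodes[i + 1]))
                   - 0.5 * (f(nodes[i]) + f(nodes[i + 1])))
               for i in range(len(nodes) - 1))

f = math.sin  # a stand-in "known analytical solution"
coarse = [0.0, math.pi / 2, math.pi]
before = max_midpoint_error(f, coarse)
after = max_midpoint_error(f, refine(f, coarse[:], 8))  # error shrinks markedly
```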

  7. Learning time-dependent noise to reduce logical errors: real time error rate estimation in quantum error correction

    NASA Astrophysics Data System (ADS)

    Huo, Ming-Xia; Li, Ying

    2017-12-01

    Quantum error correction is important to quantum information processing, as it allows us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from knowledge of the error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction. No adaptation of the quantum error correction code or its implementation circuit is required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g. the surface code. A Gaussian process algorithm is used to estimate and predict error rates based on past error correction data. We find that, using these estimated error rates, the probability of error correction failures can be significantly reduced by a factor that increases with the code distance.
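
As an illustration of the idea in record 7 (not the authors' protocol), a zero-mean Gaussian process with an RBF kernel can smooth noisy past error-rate estimates and extrapolate the rate for upcoming rounds; the data, kernel, and hyperparameters below are all hypothetical:

```python
import math

def rbf(x1, x2, length=5.0, var=1.0):
    """Squared-exponential (RBF) covariance between two inputs."""
    return var * math.exp(-0.5 * ((x1 - x2) / length) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= fac * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_star, noise=1e-4):
    """Posterior mean of a zero-mean GP at x_star given noisy observations."""
    K = [[rbf(xi, xj) + (noise if i == j else 0.0)
          for j, xj in enumerate(xs)] for i, xi in enumerate(xs)]
    alpha = solve(K, ys)
    return sum(rbf(x_star, xi) * a for xi, a in zip(xs, alpha))

# Hypothetical error rates estimated from past error-correction rounds
rounds = [0, 10, 20, 30, 40]
rates = [0.010, 0.012, 0.015, 0.014, 0.018]
predicted = gp_predict(rounds, rates, 45)  # extrapolated rate for round 45
```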

  8. Effect of Bar-code Technology on the Incidence of Medication Dispensing Errors and Potential Adverse Drug Events in a Hospital Pharmacy

    PubMed Central

    Poon, Eric G; Cina, Jennifer L; Churchill, William W; Mitton, Patricia; McCrea, Michelle L; Featherstone, Erica; Keohane, Carol A; Rothschild, Jeffrey M; Bates, David W; Gandhi, Tejal K

    2005-01-01

    We performed a direct observation pre-post study to evaluate the impact of barcode technology on medication dispensing errors and potential adverse drug events in the pharmacy of a tertiary-academic medical center. We found that barcode technology significantly reduced the rate of target dispensing errors leaving the pharmacy by 85%, from 0.37% to 0.06%. The rate of potential adverse drug events (ADEs) due to dispensing errors was also significantly reduced by 63%, from 0.19% to 0.069%. In a 735-bed hospital where 6 million doses of medications are dispensed per year, this technology is expected to prevent about 13,000 dispensing errors and 6,000 potential ADEs per year. PMID:16779372
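
The headline reductions in record 8 follow directly from the observed rates; a quick arithmetic check (the small gap to the published 85% reflects rounding in the reported rates):

```python
def relative_reduction(before, after):
    """Fractional reduction in an error rate, e.g. 0.85 for an 85% drop."""
    return 1.0 - after / before

dispensing = relative_reduction(0.0037, 0.0006)       # 0.37% -> 0.06%, reported as 85%
potential_ades = relative_reduction(0.0019, 0.00069)  # 0.19% -> 0.069%, reported as 63%
```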

  9. Interruption Practice Reduces Errors

    DTIC Science & Technology

    2014-01-01

    dangers of errors at the PCS. Electronic health record systems are used to reduce certain errors related to poor handwriting and dosage ... 10.16, MSE = .31, p < .05, η² = .18. A significant interaction between the number of interruptions and interrupted trials suggests that trials ... the variance when calculating whether a memory has a higher signal than interference. If something in addition to activation contributes to goal

  10. Applying human factors principles to alert design increases efficiency and reduces prescribing errors in a scenario-based simulation

    PubMed Central

    Russ, Alissa L; Zillich, Alan J; Melton, Brittany L; Russell, Scott A; Chen, Siying; Spina, Jeffrey R; Weiner, Michael; Johnson, Elizabette G; Daggy, Joanne K; McManus, M Sue; Hawsey, Jason M; Puleo, Anthony G; Doebbeling, Bradley N; Saleem, Jason J

    2014-01-01

    Objective To apply human factors engineering principles to improve alert interface design. We hypothesized that incorporating human factors principles into alerts would improve usability, reduce workload for prescribers, and reduce prescribing errors. Materials and methods We performed a scenario-based simulation study using a counterbalanced, crossover design with 20 Veterans Affairs prescribers to compare original versus redesigned alerts. We redesigned drug–allergy, drug–drug interaction, and drug–disease alerts based upon human factors principles. We assessed usability (learnability of redesign, efficiency, satisfaction, and usability errors), perceived workload, and prescribing errors. Results Although prescribers received no training on the design changes, prescribers were able to resolve redesigned alerts more efficiently (median (IQR): 56 (47) s) compared to the original alerts (85 (71) s; p=0.015). In addition, prescribers rated redesigned alerts significantly higher than original alerts across several dimensions of satisfaction. Redesigned alerts led to a modest but significant reduction in workload (p=0.042) and significantly reduced the number of prescribing errors per prescriber (median (range): 2 (1–5) compared to original alerts: 4 (1–7); p=0.024). Discussion Aspects of the redesigned alerts that likely contributed to better prescribing include design modifications that reduced usability-related errors, providing clinical data closer to the point of decision, and displaying alert text in a tabular format. Displaying alert text in a tabular format may help prescribers extract information quickly and thereby increase responsiveness to alerts. Conclusions This simulation study provides evidence that applying human factors design principles to medication alerts can improve usability and prescribing outcomes. PMID:24668841

  11. Applying human factors principles to alert design increases efficiency and reduces prescribing errors in a scenario-based simulation.

    PubMed

    Russ, Alissa L; Zillich, Alan J; Melton, Brittany L; Russell, Scott A; Chen, Siying; Spina, Jeffrey R; Weiner, Michael; Johnson, Elizabette G; Daggy, Joanne K; McManus, M Sue; Hawsey, Jason M; Puleo, Anthony G; Doebbeling, Bradley N; Saleem, Jason J

    2014-10-01

    To apply human factors engineering principles to improve alert interface design. We hypothesized that incorporating human factors principles into alerts would improve usability, reduce workload for prescribers, and reduce prescribing errors. We performed a scenario-based simulation study using a counterbalanced, crossover design with 20 Veterans Affairs prescribers to compare original versus redesigned alerts. We redesigned drug-allergy, drug-drug interaction, and drug-disease alerts based upon human factors principles. We assessed usability (learnability of redesign, efficiency, satisfaction, and usability errors), perceived workload, and prescribing errors. Although prescribers received no training on the design changes, prescribers were able to resolve redesigned alerts more efficiently (median (IQR): 56 (47) s) compared to the original alerts (85 (71) s; p=0.015). In addition, prescribers rated redesigned alerts significantly higher than original alerts across several dimensions of satisfaction. Redesigned alerts led to a modest but significant reduction in workload (p=0.042) and significantly reduced the number of prescribing errors per prescriber (median (range): 2 (1-5) compared to original alerts: 4 (1-7); p=0.024). Aspects of the redesigned alerts that likely contributed to better prescribing include design modifications that reduced usability-related errors, providing clinical data closer to the point of decision, and displaying alert text in a tabular format. Displaying alert text in a tabular format may help prescribers extract information quickly and thereby increase responsiveness to alerts. This simulation study provides evidence that applying human factors design principles to medication alerts can improve usability and prescribing outcomes. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  12. Interventions to reduce medication errors in neonatal care: a systematic review

    PubMed Central

    Nguyen, Minh-Nha Rhylie; Mosel, Cassandra

    2017-01-01

    Background: Medication errors represent a significant but often preventable cause of morbidity and mortality in neonates. The objective of this systematic review was to determine the effectiveness of interventions to reduce neonatal medication errors. Methods: A systematic review was undertaken of all comparative and noncomparative studies published in any language, identified from searches of PubMed and EMBASE and reference-list checking. Eligible studies were those investigating the impact of any medication safety interventions aimed at reducing medication errors in neonates in the hospital setting. Results: A total of 102 studies were identified that met the inclusion criteria, including 86 comparative and 16 noncomparative studies. Medication safety interventions were classified into six themes: technology (n = 38; e.g. electronic prescribing), organizational (n = 16; e.g. guidelines, policies, and procedures), personnel (n = 13; e.g. staff education), pharmacy (n = 9; e.g. clinical pharmacy service), hazard and risk analysis (n = 8; e.g. error detection tools), and multifactorial (n = 18; e.g. any combination of previous interventions). Significant variability was evident across all included studies, with differences in intervention strategies, trial methods, types of medication errors evaluated, and how medication errors were identified and evaluated. Most studies demonstrated an appreciable risk of bias. The vast majority of studies (>90%) demonstrated a reduction in medication errors. A similar median reduction of 50–70% in medication errors was evident across studies included within each of the identified themes, but findings varied considerably from a 16% increase in medication errors to a 100% reduction in medication errors. Conclusion: While neonatal medication errors can be reduced through multiple interventions aimed at improving the medication use process, no single intervention appeared clearly superior. 
Further research is required to evaluate the relative cost-effectiveness of the various medication safety interventions to facilitate decisions regarding uptake and implementation into clinical practice. PMID:29387337

  13. Prevention of prescription errors by computerized, on-line, individual patient related surveillance of drug order entry.

    PubMed

    Oliven, A; Zalman, D; Shilankov, Y; Yeshurun, D; Odeh, M

    2002-01-01

    Computerized prescription of drugs is expected to reduce the number of many preventable drug ordering errors. In the present study we evaluated the usefulness of a computerized drug order entry (CDOE) system in reducing prescription errors. A department of internal medicine using a comprehensive CDOE, which also included patient-related drug-laboratory, drug-disease and drug-allergy on-line surveillance, was compared to a similar department in which drug orders were handwritten. CDOE reduced prescription errors to 25-35%. The causes of errors remained similar, and most errors, in both departments, were associated with abnormal renal function and electrolyte balance. Residual errors remaining in the CDOE-using department were due to handwriting on the typed order, failure to enter patients' diseases, and system failures. The use of CDOE was associated with a significant reduction in mean hospital stay and in the number of changes performed in the prescription. The findings of this study both quantify the impact of comprehensive CDOE on prescription errors and delineate the causes of remaining errors.

  14. The preclinical pharmacological profile of WAY-132983, a potent M1 preferring agonist.

    PubMed

    Bartolomeo, A C; Morris, H; Buccafusco, J J; Kille, N; Rosenzweig-Lipson, S; Husbands, M G; Sabb, A L; Abou-Gharbia, M; Moyer, J A; Boast, C A

    2000-02-01

    Muscarinic M1 preferring agonists may improve cognitive deficits associated with Alzheimer's disease. Side effect assessment of the M1 preferring agonist WAY-132983 showed significant salivation (10 mg/kg i.p. or p.o.) and produced dose-dependent hypothermia after i.p. or p.o. administration. WAY-132983 significantly reduced scopolamine (0.3 mg/kg i.p.)-induced hyperswimming in mice. Cognitive assessment in rats used pretrained animals in a forced-choice, 1-h delayed nonmatch-to-sample radial arm maze task. WAY-132983 (0.3 mg/kg i.p.) significantly reduced scopolamine (0.3 mg/kg s.c.)-induced errors. Oral WAY-132983 attenuated scopolamine-induced errors; that is, errors produced after combining scopolamine and WAY-132983 (up to 3 mg/kg p.o.) were not significantly increased compared with those of vehicle-treated control animals, whereas errors after scopolamine alone were significantly higher than those of control animals. With the use of miniosmotic pumps, 0.03 mg/kg/day (s.c.) WAY-132983 significantly reduced AF64A (3 nmol/3 microliter/lateral ventricle)-induced errors. Verification of AF64A cholinotoxicity showed significantly lower choline acetyltransferase activity in the hippocampi of AF64A-treated animals, with no significant changes in the striatum or frontal cortex. Cognitive assessment in primates involved the use of pretrained aged animals in a visual delayed match-to-sample procedure. Oral WAY-132983 significantly increased the number of correct responses during short and long delay interval testing. These effects were also apparent 24 h after administration. WAY-132983 exhibited cognitive benefit at doses lower than those producing undesirable effects; therefore, WAY-132983 is a potential candidate for improving the cognitive status of patients with Alzheimer's disease.

  15. Twice cutting method reduces tibial cutting error in unicompartmental knee arthroplasty.

    PubMed

    Inui, Hiroshi; Taketomi, Shuji; Yamagami, Ryota; Sanada, Takaki; Tanaka, Sakae

    2016-01-01

    Bone cutting error can be one of the causes of malalignment in unicompartmental knee arthroplasty (UKA). The amount of cutting error in total knee arthroplasty has been reported; however, none have investigated cutting error in UKA. The purpose of this study was to reveal the amount of cutting error in UKA when an open cutting guide was used and to clarify whether cutting the tibia horizontally twice using the same cutting guide reduced the cutting errors in UKA. We measured the alignment of the tibial cutting guides, the first-cut cutting surfaces and the second-cut cutting surfaces using the navigation system in 50 UKAs. Cutting error was defined as the angular difference between the cutting guide and the cutting surface. The mean absolute first-cut cutting error was 1.9° (1.1° varus) in the coronal plane and 1.1° (0.6° anterior slope) in the sagittal plane, whereas the mean absolute second-cut cutting error was 1.1° (0.6° varus) in the coronal plane and 1.1° (0.4° anterior slope) in the sagittal plane. Cutting the tibia horizontally twice reduced the cutting errors in the coronal plane significantly (P<0.05). Our study demonstrated that in UKA, cutting the tibia horizontally twice using the same cutting guide reduced cutting error in the coronal plane. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Flight Evaluation of Center-TRACON Automation System Trajectory Prediction Process

    NASA Technical Reports Server (NTRS)

    Williams, David H.; Green, Steven M.

    1998-01-01

    Two flight experiments (Phase 1 in October 1992 and Phase 2 in September 1994) were conducted to evaluate the accuracy of the Center-TRACON Automation System (CTAS) trajectory prediction process. The Transport Systems Research Vehicle (TSRV) Boeing 737 based at Langley Research Center flew 57 arrival trajectories that included cruise and descent segments; at the same time, descent clearance advisories from CTAS were followed. Actual trajectories of the airplane were compared with the trajectories predicted by the CTAS trajectory synthesis algorithms and airplane Flight Management System (FMS). Trajectory prediction accuracy was evaluated over several levels of cockpit automation that ranged from a conventional cockpit to performance-based FMS vertical navigation (VNAV). Error sources and their magnitudes were identified and measured from the flight data. The major source of error during these tests was found to be the predicted winds aloft used by CTAS. The most significant effect related to flight guidance was the cross-track and turn-overshoot errors associated with conventional VOR guidance. FMS lateral navigation (LNAV) guidance significantly reduced both the cross-track and turn-overshoot error. Pilot procedures and VNAV guidance were found to significantly reduce the vertical profile errors associated with atmospheric and airplane performance model errors.

  17. Validating and calibrating the Nintendo Wii balance board to derive reliable center of pressure measures.

    PubMed

    Leach, Julia M; Mancini, Martina; Peterka, Robert J; Hayes, Tamara L; Horak, Fay B

    2014-09-29

    The Nintendo Wii balance board (WBB) has generated significant interest in its application as a postural control measurement device in both the clinical and (basic, clinical, and rehabilitation) research domains. Although the WBB has been proposed as an alternative to the "gold standard" laboratory-grade force plate, additional research is necessary before the WBB can be considered a valid and reliable center of pressure (CoP) measurement device. In this study, we used the WBB and a laboratory-grade AMTI force plate (AFP) to simultaneously measure the CoP displacement of a controlled dynamic load, which has not been done before. A one-dimensional inverted pendulum was displaced at several different displacement angles and load heights to simulate a variety of postural sway amplitudes and frequencies (<1 Hz). Twelve WBBs were tested to address the issue of inter-device variability. There was a significant effect of sway amplitude, frequency, and direction on the WBB's CoP measurement error, with an increase in error as both sway amplitude and frequency increased and a significantly greater error in the mediolateral (ML) (compared to the anteroposterior (AP)) sway direction. There was no difference in error across the 12 WBBs, supporting low inter-device variability. A linear calibration procedure was then implemented to correct the WBB's CoP signals and reduce measurement error. There was a significant effect of calibration on the WBB's CoP signal accuracy, with a significant reduction in CoP measurement error (quantified by root-mean-squared error) from 2-6 mm (before calibration) to 0.5-2 mm (after calibration). WBB-based CoP signal calibration also significantly reduced the percent error in derived (time-domain) CoP sway measures, from -10.5% (before calibration) to -0.05% (after calibration) (percent errors averaged across all sway measures and in both sway directions). 
In this study, we characterized the WBB's CoP measurement error under controlled, dynamic conditions and implemented a linear calibration procedure for WBB CoP signals that is recommended to reduce CoP measurement error and provide more reliable estimates of time-domain CoP measures. Despite our promising results, additional work is necessary to understand how our findings translate to the clinical and rehabilitation research domains. Once the WBB's CoP measurement error is fully characterized in human postural sway (which differs from our simulated postural sway in both amplitude and frequency content), it may be used to measure CoP displacement in situations where lower accuracy and precision are acceptable.
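
The linear calibration in records 17-18 maps WBB CoP readings onto force-plate references. A sketch with synthetic data (the gain, offset, and noise values are hypothetical, not the paper's) shows how a least-squares fit removes systematic gain/offset error and shrinks the RMSE:

```python
import math
import random

def fit_linear(x, y):
    """Least-squares fit y ~ a*x + b (the linear calibration map)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

def rmse(y_est, y_ref):
    """Root-mean-squared error between two equal-length signals."""
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(y_est, y_ref)) / len(y_ref))

random.seed(0)
# Hypothetical reference CoP trace (mm) and a WBB-like reading with gain/offset error
cop_ref = [10.0 * math.sin(0.1 * t) for t in range(200)]
cop_wbb = [1.15 * y + 2.0 + random.gauss(0.0, 0.2) for y in cop_ref]

a, b = fit_linear(cop_wbb, cop_ref)   # calibrate WBB against the force plate
cop_cal = [a * y + b for y in cop_wbb]

err_before = rmse(cop_wbb, cop_ref)   # dominated by gain and offset error
err_after = rmse(cop_cal, cop_ref)    # reduced to roughly the noise floor
```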

  18. Validating and Calibrating the Nintendo Wii Balance Board to Derive Reliable Center of Pressure Measures

    PubMed Central

    Leach, Julia M.; Mancini, Martina; Peterka, Robert J.; Hayes, Tamara L.; Horak, Fay B.

    2014-01-01

    The Nintendo Wii balance board (WBB) has generated significant interest in its application as a postural control measurement device in both the clinical and (basic, clinical, and rehabilitation) research domains. Although the WBB has been proposed as an alternative to the “gold standard” laboratory-grade force plate, additional research is necessary before the WBB can be considered a valid and reliable center of pressure (CoP) measurement device. In this study, we used the WBB and a laboratory-grade AMTI force plate (AFP) to simultaneously measure the CoP displacement of a controlled dynamic load, which has not been done before. A one-dimensional inverted pendulum was displaced at several different displacement angles and load heights to simulate a variety of postural sway amplitudes and frequencies (<1 Hz). Twelve WBBs were tested to address the issue of inter-device variability. There was a significant effect of sway amplitude, frequency, and direction on the WBB's CoP measurement error, with an increase in error as both sway amplitude and frequency increased and a significantly greater error in the mediolateral (ML) (compared to the anteroposterior (AP)) sway direction. There was no difference in error across the 12 WBBs, supporting low inter-device variability. A linear calibration procedure was then implemented to correct the WBB's CoP signals and reduce measurement error. There was a significant effect of calibration on the WBB's CoP signal accuracy, with a significant reduction in CoP measurement error (quantified by root-mean-squared error) from 2–6 mm (before calibration) to 0.5–2 mm (after calibration). WBB-based CoP signal calibration also significantly reduced the percent error in derived (time-domain) CoP sway measures, from −10.5% (before calibration) to −0.05% (after calibration) (percent errors averaged across all sway measures and in both sway directions). 
In this study, we characterized the WBB's CoP measurement error under controlled, dynamic conditions and implemented a linear calibration procedure for WBB CoP signals that is recommended to reduce CoP measurement error and provide more reliable estimates of time-domain CoP measures. Despite our promising results, additional work is necessary to understand how our findings translate to the clinical and rehabilitation research domains. Once the WBB's CoP measurement error is fully characterized in human postural sway (which differs from our simulated postural sway in both amplitude and frequency content), it may be used to measure CoP displacement in situations where lower accuracy and precision is acceptable. PMID:25268919
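The linear calibration step described above amounts to an ordinary least-squares fit of the WBB CoP signal against the simultaneously recorded force-plate signal. A minimal sketch of that idea, using synthetic data with a hypothetical gain and offset error rather than the paper's measurements:

```python
# Sketch of a linear CoP calibration: fit wbb ≈ a * ref + b on paired
# samples, then invert the fit to map raw WBB readings onto the
# force-plate scale. Synthetic data; not the paper's measurements.

def fit_line(x, y):
    """Ordinary least-squares slope/intercept for y ≈ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def rmse(x, y):
    return (sum((xi - yi) ** 2 for xi, yi in zip(x, y)) / len(x)) ** 0.5

# Reference (force plate) CoP displacement in mm, and a WBB signal with
# a hypothetical gain error (x1.08) and offset (+3 mm).
ref = [i * 0.5 - 25.0 for i in range(101)]
wbb = [1.08 * r + 3.0 for r in ref]

a, b = fit_line(ref, wbb)              # calibrate against the reference
wbb_cal = [(w - b) / a for w in wbb]   # corrected WBB CoP signal
# rmse(wbb_cal, ref) is far below rmse(wbb, ref) after calibration.
```

In practice the fit would be estimated per device and per direction (ML and AP separately), since the paper reports direction-dependent error.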

  19. Near field communications technology and the potential to reduce medication errors through multidisciplinary application

    PubMed Central

    Pegler, Joe; Lehane, Elaine; Livingstone, Vicki; McCarthy, Nora; Sahm, Laura J.; Tabirca, Sabin; O’Driscoll, Aoife; Corrigan, Mark

    2016-01-01

    Background Patient safety requires optimal management of medications. Electronic systems are encouraged to reduce medication errors. Near field communications (NFC) is an emerging technology that may be used to develop novel medication management systems. Methods An NFC-based system was designed to facilitate prescribing, administration and review of medications commonly used on surgical wards. Final year medical, nursing, and pharmacy students were recruited to test the electronic system in a cross-over observational setting on a simulated ward. Medication errors were compared against errors recorded using a paper-based system. Results A significant difference in the commission of medication errors was seen when NFC and paper-based medication systems were compared. Paper use resulted in a mean of 4.09 errors per prescribing round while NFC prescribing resulted in a mean of 0.22 errors per simulated prescribing round (P=0.000). Likewise, medication administration errors were reduced from a mean of 2.30 per drug round with a paper system to a mean of 0.80 errors per round using NFC (P<0.015). A mean satisfaction score of 2.30 was reported by users (rated on a seven-point scale, with 1 denoting total satisfaction with system use and 7 denoting total dissatisfaction). Conclusions An NFC-based medication system may be used to effectively reduce medication errors in a simulated ward environment. PMID:28293602

  20. Near field communications technology and the potential to reduce medication errors through multidisciplinary application.

    PubMed

    O'Connell, Emer; Pegler, Joe; Lehane, Elaine; Livingstone, Vicki; McCarthy, Nora; Sahm, Laura J; Tabirca, Sabin; O'Driscoll, Aoife; Corrigan, Mark

    2016-01-01

    Patient safety requires optimal management of medications. Electronic systems are encouraged to reduce medication errors. Near field communications (NFC) is an emerging technology that may be used to develop novel medication management systems. An NFC-based system was designed to facilitate prescribing, administration and review of medications commonly used on surgical wards. Final year medical, nursing, and pharmacy students were recruited to test the electronic system in a cross-over observational setting on a simulated ward. Medication errors were compared against errors recorded using a paper-based system. A significant difference in the commission of medication errors was seen when NFC and paper-based medication systems were compared. Paper use resulted in a mean of 4.09 errors per prescribing round while NFC prescribing resulted in a mean of 0.22 errors per simulated prescribing round (P=0.000). Likewise, medication administration errors were reduced from a mean of 2.30 per drug round with a paper system to a mean of 0.80 errors per round using NFC (P<0.015). A mean satisfaction score of 2.30 was reported by users (rated on a seven-point scale, with 1 denoting total satisfaction with system use and 7 denoting total dissatisfaction). An NFC-based medication system may be used to effectively reduce medication errors in a simulated ward environment.

  1. (How) do we learn from errors? A prospective study of the link between the ward's learning practices and medication administration errors.

    PubMed

    Drach-Zahavy, A; Somech, A; Admi, H; Peterfreund, I; Peker, H; Priente, O

    2014-03-01

    Attention in the ward should shift from preventing medication administration errors to managing them. Nevertheless, little is known about the practices nursing wards apply to learn from medication administration errors as a means of limiting them. To test the effectiveness of four types of learning practices, namely non-integrated, integrated, supervisory and patchy learning practices, in limiting medication administration errors. Data were collected from a convenience sample of 4 hospitals in Israel by multiple methods (observations and self-report questionnaires) at two time points. The sample included 76 wards (360 nurses). Medication administration error was defined as any deviation from prescribed medication processes and measured by a validated structured observation sheet. Wards' use of medication administration technologies, location of the medication station, and workload were observed; learning practices and demographics were measured by validated questionnaires. Results of the mixed linear model analysis indicated that the use of technology and quiet location of the medication cabinet were significantly associated with reduced medication administration errors (estimate=.03, p<.05 and estimate=-.17, p<.01, respectively), while workload was significantly linked to inflated medication administration errors (estimate=.04, p<.05). Of the learning practices, supervisory learning was the only practice significantly linked to reduced medication administration errors (estimate=-.04, p<.05). Integrated and patchy learning were significantly linked to higher levels of medication administration errors (estimate=-.03, p<.05 and estimate=-.04, p<.01, respectively). Non-integrated learning was not associated with medication administration errors (p>.05). How wards manage errors might have implications for medication administration errors beyond the effects of typical individual, organizational and technology risk factors. 
Head nurses can facilitate learning from errors by "management by walking around" and by monitoring nurses' medication administration behaviors. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Prescription errors before and after introduction of electronic medication alert system in a pediatric emergency department.

    PubMed

    Sethuraman, Usha; Kannikeswaran, Nirupama; Murray, Kyle P; Zidan, Marwan A; Chamberlain, James M

    2015-06-01

    Prescription errors occur frequently in pediatric emergency departments (PEDs).The effect of computerized physician order entry (CPOE) with electronic medication alert system (EMAS) on these is unknown. The objective was to compare prescription errors rates before and after introduction of CPOE with EMAS in a PED. The hypothesis was that CPOE with EMAS would significantly reduce the rate and severity of prescription errors in the PED. A prospective comparison of a sample of outpatient, medication prescriptions 5 months before and after CPOE with EMAS implementation (7,268 before and 7,292 after) was performed. Error types and rates, alert types and significance, and physician response were noted. Medication errors were deemed significant if there was a potential to cause life-threatening injury, failure of therapy, or an adverse drug effect. There was a significant reduction in the errors per 100 prescriptions (10.4 before vs. 7.3 after; absolute risk reduction = 3.1, 95% confidence interval [CI] = 2.2 to 4.0). Drug dosing error rates decreased from 8 to 5.4 per 100 (absolute risk reduction = 2.6, 95% CI = 1.8 to 3.4). Alerts were generated for 29.6% of prescriptions, with 45% involving drug dose range checking. The sensitivity of CPOE with EMAS in identifying errors in prescriptions was 45.1% (95% CI = 40.8% to 49.6%), and the specificity was 57% (95% CI = 55.6% to 58.5%). Prescribers modified 20% of the dosing alerts, resulting in the error not reaching the patient. Conversely, 11% of true dosing alerts for medication errors were overridden by the prescribers: 88 (11.3%) resulted in medication errors, and 684 (88.6%) were false-positive alerts. A CPOE with EMAS was associated with a decrease in overall prescription errors in our PED. Further system refinements are required to reduce the high false-positive alert rates. © 2015 by the Society for Academic Emergency Medicine.
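The reported absolute risk reduction and its confidence interval can be reproduced from the abstract's own figures with a standard Wald interval for a difference of two proportions:

```python
# Reproducing the abstract's absolute risk reduction (ARR) and Wald
# 95% CI from its own figures: 10.4 errors per 100 prescriptions
# before CPOE (n = 7,268) vs. 7.3 per 100 after (n = 7,292).
from math import sqrt

p1, n1 = 10.4 / 100, 7268   # pre-implementation error proportion
p2, n2 = 7.3 / 100, 7292    # post-implementation error proportion

arr = p1 - p2               # absolute risk reduction
se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
lo, hi = arr - 1.96 * se, arr + 1.96 * se   # Wald 95% CI

# Per 100 prescriptions this gives ARR 3.1 (95% CI 2.2 to 4.0),
# matching the values reported in the abstract.
```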

  3. The effectiveness of the error reporting promoting program on the nursing error incidence rate in Korean operating rooms.

    PubMed

    Kim, Myoung-Soo; Kim, Jung-Soon; Jung, In Sook; Kim, Young Hae; Kim, Ho Jung

    2007-03-01

    The purpose of this study was to develop and evaluate an error-reporting promoting program (ERPP) to systematically reduce the incidence rate of nursing errors in the operating room. A non-equivalent control group, non-synchronized design was used. Twenty-six operating room nurses from one university hospital in Busan participated in this study. They were stratified into four groups according to their operating room experience and were allocated to the experimental and control groups using a matching method. The Mann-Whitney U test was used to analyze differences in pre- and post-intervention incidence rates of nursing errors between the two groups. The incidence rate of nursing errors decreased significantly in the experimental group, from 28.4% to 15.7%. By domain, the incidence rate decreased significantly in three domains ("compliance with aseptic technique," "management of documents," and "environmental management") in the experimental group, while it also decreased in the control group, which used the ordinary error-reporting method. An error-reporting system makes it possible to share errors and learn from them. The ERPP was effective in reducing errors in recognition-related nursing activities. For more effective error prevention, this program should be applied together with risk management efforts across the whole health care system.
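The Mann-Whitney U statistic named in this record can be computed directly from its definition; a minimal sketch on toy error-rate data (not the study's measurements), counting ties as half:

```python
# Minimal Mann-Whitney U statistic: count, over all cross-group
# pairs, how often a value from x exceeds a value from y (ties
# contribute 0.5). Toy data for illustration only.

def mann_whitney_u(x, y):
    """U statistic for sample x vs. y (ties count 0.5 each)."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Toy post-intervention error rates (percent); not the study's data.
experimental = [15.7, 14.2, 16.1, 13.9]
control = [28.4, 26.5, 27.9, 29.1]

u_exp = mann_whitney_u(experimental, control)
u_ctl = mann_whitney_u(control, experimental)
# u_exp + u_ctl always equals len(experimental) * len(control).
```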

  4. Temporal Decomposition of a Distribution System Quasi-Static Time-Series Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mather, Barry A; Hunsberger, Randolph J

    This paper documents the first phase of an investigation into reducing runtimes of complex OpenDSS models through parallelization. As the method seems promising, future work will quantify - and further mitigate - errors arising from this process. In this initial report, we demonstrate how, through the use of temporal decomposition, the run times of a complex distribution-system-level quasi-static time series simulation can be reduced roughly in proportion to the level of parallelization. Using this method, the monolithic model runtime of 51 hours was reduced to a minimum of about 90 minutes. As expected, this comes at the expense of control- and voltage-errors at the time-slice boundaries. All evaluations were performed using a real distribution circuit model with the addition of 50 PV systems - representing a mock complex PV impact study. We are able to reduce induced transition errors through the addition of controls initialization, though small errors persist. The time savings with parallelization are so significant that we feel additional investigation to reduce control errors is warranted.
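The temporal decomposition described here can be sketched in miniature: split the simulation horizon into windows, run the windows in parallel, and stitch the results back together. `simulate` below is a stateless stand-in for a quasi-static power-flow step, not the OpenDSS API; with a stateful controller, the window boundaries are exactly where the transition errors the authors mention would appear.

```python
# Toy temporal decomposition of a time-series simulation: partition
# the time steps into windows, simulate each window in parallel, and
# concatenate. Because each window here is stateless, the stitched
# result matches the monolithic run exactly; a controller carrying
# state across steps would show errors at window boundaries.
from concurrent.futures import ThreadPoolExecutor

def simulate(window):
    """Run one window of time steps; stateless stand-in model."""
    return [0.95 + 0.0001 * (t % 24) for t in window]

steps = list(range(96))                       # e.g. 96 quarter-hours
windows = [steps[i:i + 24] for i in range(0, len(steps), 24)]

with ThreadPoolExecutor(max_workers=4) as pool:
    parts = pool.map(simulate, windows)       # one window per worker
decomposed = [v for part in parts for v in part]

monolithic = simulate(steps)                  # single serial run
```

The paper's "controls initialization" corresponds to warming up each window's controller state before its first time slice, shrinking the boundary errors that this stateless toy cannot exhibit.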

  5. A novel diagnosis method for a Hall plates-based rotary encoder with a magnetic concentrator.

    PubMed

    Meng, Bumin; Wang, Yaonan; Sun, Wei; Yuan, Xiaofang

    2014-07-31

    In the last few years, rotary encoders based on two-dimensional complementary metal oxide semiconductor (CMOS) Hall plates with a magnetic concentrator have been developed to measure contactless absolute angle. There are various error factors influencing the measuring accuracy, which are difficult to locate after the assembly of the encoder. In this paper, a model-based rapid diagnosis method is presented. Based on an analysis of the error mechanism, an error model is built to compare the minimum residual angle error and to quantify the error factors. Additionally, a modified particle swarm optimization (PSO) algorithm is used to reduce the computational cost. The simulation and experimental results show that this diagnosis method is feasible for quantifying the causes of the error and significantly reduces the number of iterations.
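The abstract does not specify the authors' PSO modification, but the general approach of minimizing a residual-error model can be illustrated with a minimal, standard PSO on a stand-in objective function:

```python
# Minimal standard particle swarm optimizer (not the paper's modified
# variant), minimizing a stand-in residual-error function.
import random

def pso(f, dim, n=20, iters=200, seed=1,
        w=0.7, c1=1.5, c2=1.5, lo=-1.0, hi=1.0):
    """Return the best position found for minimizing f."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]               # per-particle best
    gbest = min(pbest, key=f)[:]              # swarm best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

# Stand-in for a residual angle-error model: minimized at (0.1, -0.2).
residual = lambda p: (p[0] - 0.1) ** 2 + (p[1] + 0.2) ** 2
best = pso(residual, dim=2)
```

In the paper's setting, the objective would be the residual angle error of the encoder's error model, and the optimizer's output would quantify the contributing error factors.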

  6. Detection of Error Related Neuronal Responses Recorded by Electrocorticography in Humans during Continuous Movements

    PubMed Central

    Milekovic, Tomislav; Ball, Tonio; Schulze-Bonhage, Andreas; Aertsen, Ad; Mehring, Carsten

    2013-01-01

    Background Brain-machine interfaces (BMIs) can translate the neuronal activity underlying a user’s movement intention into movements of an artificial effector. In spite of continuous improvements, errors in movement decoding are still a major problem of current BMI systems. If the difference between the decoded and intended movements becomes noticeable, it may lead to an execution error. Outcome errors, where subjects fail to reach a certain movement goal, are also present during online BMI operation. Detecting such errors can be beneficial for BMI operation: (i) errors can be corrected online after being detected and (ii) an adaptive BMI decoding algorithm can be updated to make fewer errors in the future. Methodology/Principal Findings Here, we show that error events can be detected from human electrocorticography (ECoG) during a continuous task with high precision, given a temporal tolerance of 300–400 milliseconds. We quantified the error detection accuracy and showed that, using only a small subset of 2×2 ECoG electrodes, 82% of detection information for outcome errors and 74% of detection information for execution errors available from all ECoG electrodes could be retained. Conclusions/Significance The error detection method presented here could be used to correct errors made during BMI operation or to adapt a BMI algorithm to make fewer errors in the future. Furthermore, our results indicate that a smaller ECoG implant could be used for error detection. Reducing the size of an ECoG electrode implant used for BMI decoding and error detection could significantly reduce the medical risk of implantation. PMID:23383315
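Scoring detections under a temporal tolerance, as in the 300-400 ms window described above, amounts to matching each detection to the nearest true error event. A sketch with illustrative timestamps (not the study's data):

```python
# Precision of error-event detection under a temporal tolerance:
# a detection counts as correct if it falls within +/- tol seconds
# of a true error event. Timestamps below are illustrative only.

def precision_with_tolerance(detected, true_events, tol=0.35):
    """Fraction of detections within tol seconds of a true event."""
    if not detected:
        return 0.0
    hits = sum(
        1 for d in detected
        if any(abs(d - t) <= tol for t in true_events)
    )
    return hits / len(detected)

true_events = [2.0, 5.4, 9.1]          # seconds into the task
detected = [2.1, 5.0, 9.0, 12.5]       # detector output

p = precision_with_tolerance(detected, true_events)
# 2.1 and 9.0 match within 0.35 s; 5.0 and 12.5 do not -> p = 0.5
```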

  7. Impact of Stewardship Interventions on Antiretroviral Medication Errors in an Urban Medical Center: A 3-Year, Multiphase Study.

    PubMed

    Zucker, Jason; Mittal, Jaimie; Jen, Shin-Pung; Cheng, Lucy; Cennimo, David

    2016-03-01

    There is a high prevalence of HIV infection in Newark, New Jersey, with University Hospital admitting approximately 600 HIV-infected patients per year. Medication errors involving antiretroviral therapy (ART) could significantly affect treatment outcomes. The goal of this study was to evaluate the effectiveness of various stewardship interventions in reducing the prevalence of prescribing errors involving ART. This was a retrospective review of all inpatients receiving ART for HIV treatment during three distinct 6-month intervals over a 3-year period. During the first year, the baseline prevalence of medication errors was determined. During the second year, physician and pharmacist education was provided, and a computerized order entry system with drug information resources and prescribing recommendations was implemented. Prospective audit of ART orders with feedback was conducted in the third year. Analyses and comparisons were made across the three phases of this study. Of the 334 patients with HIV admitted in the first year, 45% had at least one antiretroviral medication error and 38% had uncorrected errors at the time of discharge. After education and computerized order entry, significant reductions in medication error rates were observed compared to baseline rates; 36% of 315 admissions had at least one error and 31% had uncorrected errors at discharge. While the prevalence of antiretroviral errors in year 3 was similar to that of year 2 (37% of 276 admissions), there was a significant decrease in the prevalence of uncorrected errors at discharge (12%) with the use of prospective review and intervention. Interventions, such as education and guideline development, can aid in reducing ART medication errors, but a committed stewardship program is necessary to elicit the greatest impact. © 2016 Pharmacotherapy Publications, Inc.

  8. Computer-assisted bar-coding system significantly reduces clinical laboratory specimen identification errors in a pediatric oncology hospital.

    PubMed

    Hayden, Randall T; Patterson, Donna J; Jay, Dennis W; Cross, Carl; Dotson, Pamela; Possel, Robert E; Srivastava, Deo Kumar; Mirro, Joseph; Shenep, Jerry L

    2008-02-01

    To assess the ability of a bar code-based electronic positive patient and specimen identification (EPPID) system to reduce identification errors in a pediatric hospital's clinical laboratory. An EPPID system was implemented at a pediatric oncology hospital to reduce errors in patient and laboratory specimen identification. The EPPID system included bar-code identifiers and handheld personal digital assistants supporting real-time order verification. System efficacy was measured in 3 consecutive 12-month time frames, corresponding to periods before, during, and immediately after full EPPID implementation. A significant reduction in the median percentage of mislabeled specimens was observed in the 3-year study period. A decline from 0.03% to 0.005% (P < .001) was observed in the 12 months after full system implementation. On the basis of the pre-intervention detected error rate, it was estimated that EPPID prevented at least 62 mislabeling events during its first year of operation. EPPID decreased the rate of misidentification of clinical laboratory samples. The diminution of errors observed in this study provides support for the development of national guidelines for the use of bar coding for laboratory specimens, paralleling recent recommendations for medication administration.
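The "at least 62 prevented mislabeling events" estimate is simple arithmetic on the rate reduction. The hospital's annual specimen volume is not stated in the abstract, so the volume below is a hypothetical figure chosen only to illustrate the calculation:

```python
# Back-of-envelope estimate of prevented mislabeling events:
# (baseline rate - post rate) x specimen volume.
baseline_rate = 0.03 / 100     # 0.03% mislabeled before EPPID
post_rate = 0.005 / 100        # 0.005% after full implementation
annual_specimens = 250_000     # hypothetical volume, for illustration

prevented = (baseline_rate - post_rate) * annual_specimens
# With this assumed volume, roughly 62-63 events per year, in line
# with the abstract's "at least 62" figure.
```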

  9. Theory of mind in schizophrenia: error types and associations with symptoms.

    PubMed

    Fretland, Ragnhild A; Andersson, Stein; Sundet, Kjetil; Andreassen, Ole A; Melle, Ingrid; Vaskinn, Anja

    2015-03-01

    Social cognition is an important determinant of functioning in schizophrenia. However, how social cognition relates to the clinical symptoms of schizophrenia is still unclear. The aim of this study was to explore the relationship between a social cognition domain, Theory of Mind (ToM), and the clinical symptoms of schizophrenia. Specifically, we investigated the associations between positive, negative and disorganized symptoms and three ToM error types: 1) "overmentalizing", 2) "reduced ToM" and 3) "no ToM". Fifty-two participants with a diagnosis of schizophrenia or schizoaffective disorder were assessed with the Movie for the Assessment of Social Cognition (MASC), a video-based ToM measure. An empirically validated five-factor model of the Positive and Negative Syndrome Scale (PANSS) was used to assess clinical symptoms. There was a significant, small-moderate association between overmentalizing and positive symptoms (rho=.28, p=.04). Disorganized symptoms correlated at a trend level with "reduced ToM" (rho=.27, p=.05). There were no other significant correlations between ToM impairments and symptom levels. Positive/disorganized symptoms did not contribute significantly to explaining total ToM performance, whereas IQ did (B=.37, p=.01). Within the undermentalizing domain, participants made more "reduced ToM" errors than "no ToM" errors. Overmentalizing was associated with positive symptoms. The undermentalizing error types were unrelated to symptoms, but "reduced ToM" was somewhat associated with disorganization. The higher number of "reduced ToM" responses suggests that schizophrenia is characterized by accuracy problems rather than a fundamental lack of mental state concept. The findings call for the use of more sensitive measures when investigating ToM in schizophrenia to avoid the "right/wrong ToM" dichotomy. Copyright © 2015 Elsevier B.V. All rights reserved.
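The rho values reported above are Spearman rank correlations; a minimal tie-free implementation on toy scores (not the study's data):

```python
# Minimal Spearman rank correlation for tie-free samples:
# rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)), where d is the
# difference between the ranks of paired observations.

def spearman_rho(x, y):
    """Spearman correlation, assuming no tied values."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Toy scores: overmentalizing errors vs. positive-symptom ratings.
overment = [1, 3, 2, 5, 4]
positive = [2, 4, 3, 6, 5]

rho = spearman_rho(overment, positive)   # perfectly monotone toy data
```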

  10. Private pilot judgment training in flight school settings.

    DOT National Transportation Integrated Search

    1987-05-01

    Pilot judgment errors have long been recognized as an important factor in aviation accidents. Previous studies have demonstrated that specialized training procedures can significantly reduce the number of decisional errors made by newly certified pri...

  11. Reduction of medication errors related to sliding scale insulin by the introduction of a standardized order sheet.

    PubMed

    Harada, Saki; Suzuki, Akio; Nishida, Shohei; Kobayashi, Ryo; Tamai, Sayuri; Kumada, Keisuke; Murakami, Nobuo; Itoh, Yoshinori

    2017-06-01

    Insulin is frequently used for glycemic control. Medication errors related to insulin are a common problem for medical institutions. Here, we prepared a standardized sliding scale insulin (SSI) order sheet and assessed the effect of its introduction. Observations before and after the introduction of the standardized SSI template were conducted at Gifu University Hospital. The incidence of medication errors, hyperglycemia, and hypoglycemia related to SSI were obtained from the electronic medical records. The introduction of the standardized SSI order sheet significantly reduced the incidence of medication errors related to SSI compared with that prior to its introduction (12/165 [7.3%] vs 4/159 [2.1%], P = .048). However, the incidence of hyperglycemia (≥250 mg/dL) and hypoglycemia (≤50 mg/dL) in patients who received SSI was not significantly different between the 2 groups. The introduction of the standardized SSI order sheet reduced the incidence of medication errors related to SSI. © 2016 John Wiley & Sons, Ltd.
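The reported P = .048 is consistent with a pooled two-proportion z-test on the stated counts (12/165 vs. 4/159); this is a standard test consistent with the figures, not necessarily the authors' exact method:

```python
# Two-sided pooled z-test for two proportions, applied to the
# medication-error counts reported in the abstract.
from math import sqrt, erf

def two_prop_z_test(k1, n1, k2, n2):
    """Return (z, two-sided p) for proportions k1/n1 vs. k2/n2."""
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)                     # pooled proportion
    z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))       # standard normal CDF
    return z, 2 * (1 - phi)

z, p_value = two_prop_z_test(12, 165, 4, 159)
# p_value rounds to 0.048, matching the reported P = .048.
```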

  12. Power Measurement Errors on a Utility Aircraft

    NASA Technical Reports Server (NTRS)

    Bousman, William G.

    2002-01-01

    Extensive flight test data obtained from two recent performance tests of a UH 60A aircraft are reviewed. A power difference is calculated from the power balance equation and is used to examine power measurement errors. It is shown that the baseline measurement errors are highly non-Gaussian in their frequency distribution and are therefore influenced by additional, unquantified variables. Linear regression is used to examine the influence of other variables and it is shown that a substantial portion of the variance depends upon measurements of atmospheric parameters. Correcting for temperature dependence, although reducing the variance in the measurement errors, still leaves unquantified effects. Examination of the power difference over individual test runs indicates significant errors from drift, although it is unclear how these may be corrected. In an idealized case, where the drift is correctable, it is shown that the power measurement errors are significantly reduced and the error distribution is Gaussian. A new flight test program is recommended that will quantify the thermal environment for all torque measurements on the UH 60. Subsequently, the torque measurement systems will be recalibrated based on the measured thermal environment and a new power measurement assessment performed.
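The regression step described above, fitting the power difference against an atmospheric variable and checking how much variance the correction removes, can be sketched on simulated data (not the UH-60A flight-test measurements):

```python
# Sketch of a temperature-dependence correction: fit the power
# "difference" against temperature by least squares and compare the
# residual variance to the raw variance. Simulated data only.
import random

def ols(x, y):
    """Ordinary least-squares slope/intercept for y ≈ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def variance(v):
    m = sum(v) / len(v)
    return sum((vi - m) ** 2 for vi in v) / len(v)

rng = random.Random(0)
temp = [rng.uniform(-10, 35) for _ in range(200)]        # deg C
# Simulated power difference: temperature-driven bias plus noise.
power_diff = [0.8 * t + rng.gauss(0, 5) for t in temp]

a, b = ols(temp, power_diff)
residual = [d - (a * t + b) for t, d in zip(temp, power_diff)]
# variance(residual) < variance(power_diff): the temperature term
# explains part of the measurement error, as the paper describes,
# while the leftover noise stands in for the unquantified effects.
```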

  13. Extracellular space preservation aids the connectomic analysis of neural circuits.

    PubMed

    Pallotto, Marta; Watkins, Paul V; Fubara, Boma; Singer, Joshua H; Briggman, Kevin L

    2015-12-09

    Dense connectomic mapping of neuronal circuits is limited by the time and effort required to analyze 3D electron microscopy (EM) datasets. Algorithms designed to automate image segmentation suffer from substantial error rates and require significant manual error correction. Any improvement in segmentation error rates would therefore directly reduce the time required to analyze 3D EM data. We explored preserving extracellular space (ECS) during chemical tissue fixation to improve the ability to segment neurites and to identify synaptic contacts. ECS preserved tissue is easier to segment using machine learning algorithms, leading to significantly reduced error rates. In addition, we observed that electrical synapses are readily identified in ECS preserved tissue. Finally, we determined that antibodies penetrate deep into ECS preserved tissue with only minimal permeabilization, thereby enabling correlated light microscopy (LM) and EM studies. We conclude that preservation of ECS benefits multiple aspects of the connectomic analysis of neural circuits.

  14. Virtual reality robotic surgery warm-up improves task performance in a dry laboratory environment: a prospective randomized controlled study.

    PubMed

    Lendvay, Thomas S; Brand, Timothy C; White, Lee; Kowalewski, Timothy; Jonnadula, Saikiran; Mercer, Laina D; Khorsand, Derek; Andros, Justin; Hannaford, Blake; Satava, Richard M

    2013-06-01

    Preoperative simulation warm-up has been shown to improve performance and reduce errors in novice and experienced surgeons, yet existing studies have only investigated conventional laparoscopy. We hypothesized that a brief virtual reality (VR) robotic warm-up would enhance robotic task performance and reduce errors. In a 2-center randomized trial, 51 residents and experienced minimally invasive surgery faculty in General Surgery, Urology, and Gynecology underwent a validated robotic surgery proficiency curriculum on a VR robotic simulator and on the da Vinci surgical robot (Intuitive Surgical Inc). Once they successfully achieved performance benchmarks, surgeons were randomized to either receive a 3- to 5-minute VR simulator warm-up or read a leisure book for 10 minutes before performing similar and dissimilar (intracorporeal suturing) robotic surgery tasks. The primary outcomes compared were task time, tool path length, economy of motion, technical, and cognitive errors. Task time (-29.29 seconds, p = 0.001; 95% CI, -47.03 to -11.56), path length (-79.87 mm; p = 0.014; 95% CI, -144.48 to -15.25), and cognitive errors were reduced in the warm-up group compared with the control group for similar tasks. Global technical errors in intracorporeal suturing (0.32; p = 0.020; 95% CI, 0.06-0.59) were reduced after the dissimilar VR task. When surgeons were stratified by earlier robotic and laparoscopic clinical experience, the more experienced surgeons (n = 17) demonstrated significant improvements from warm-up in task time (-53.5 seconds; p = 0.001; 95% CI, -83.9 to -23.0) and economy of motion (0.63 mm/s; p = 0.007; 95% CI, 0.18-1.09), and improvement in these metrics was not statistically significantly appreciated in the less-experienced cohort (n = 34). We observed significant performance improvement and error reduction rates among surgeons of varying experience after VR warm-up for basic robotic surgery tasks. 
In addition, the VR warm-up reduced errors on a more complex task (robotic suturing), suggesting the generalizability of the warm-up. Copyright © 2013 American College of Surgeons. All rights reserved.

  15. Virtual Reality Robotic Surgery Warm-Up Improves Task Performance in a Dry Lab Environment: A Prospective Randomized Controlled Study

    PubMed Central

    Lendvay, Thomas S.; Brand, Timothy C.; White, Lee; Kowalewski, Timothy; Jonnadula, Saikiran; Mercer, Laina; Khorsand, Derek; Andros, Justin; Hannaford, Blake; Satava, Richard M.

    2014-01-01

    Background Pre-operative simulation “warm-up” has been shown to improve performance and reduce errors in novice and experienced surgeons, yet existing studies have only investigated conventional laparoscopy. We hypothesized a brief virtual reality (VR) robotic warm-up would enhance robotic task performance and reduce errors. Study Design In a two-center randomized trial, fifty-one residents and experienced minimally invasive surgery faculty in General Surgery, Urology, and Gynecology underwent a validated robotic surgery proficiency curriculum on a VR robotic simulator and on the da Vinci surgical robot. Once successfully achieving performance benchmarks, surgeons were randomized to either receive a 3-5 minute VR simulator warm-up or read a leisure book for 10 minutes prior to performing similar and dissimilar (intracorporeal suturing) robotic surgery tasks. The primary outcomes compared were task time, tool path length, economy of motion, technical and cognitive errors. Results Task time (-29.29sec, p=0.001, 95%CI-47.03,-11.56), path length (-79.87mm, p=0.014, 95%CI -144.48,-15.25), and cognitive errors were reduced in the warm-up group compared to the control group for similar tasks. Global technical errors in intracorporeal suturing (0.32, p=0.020, 95%CI 0.06,0.59) were reduced after the dissimilar VR task. When surgeons were stratified by prior robotic and laparoscopic clinical experience, the more experienced surgeons(n=17) demonstrated significant improvements from warm-up in task time (-53.5sec, p=0.001, 95%CI -83.9,-23.0) and economy of motion (0.63mm/sec, p=0.007, 95%CI 0.18,1.09), whereas improvement in these metrics was not statistically significantly appreciated in the less experienced cohort(n=34). Conclusions We observed a significant performance improvement and error reduction rate among surgeons of varying experience after VR warm-up for basic robotic surgery tasks. 
In addition, the VR warm-up reduced errors on a more complex task (robotic suturing) suggesting the generalizability of the warm-up. PMID:23583618

  16. Reduced vision in highly myopic eyes without ocular pathology: the ZOC-BHVI high myopia study.

    PubMed

    Jong, Monica; Sankaridurg, Padmaja; Li, Wayne; Resnikoff, Serge; Naidoo, Kovin; He, Mingguang

    2018-01-01

    The aim was to investigate the relationship between the magnitude of myopia and visual acuity in highly myopic eyes without ocular pathology. Twelve hundred and ninety-two highly myopic eyes (-6.00 DS or worse in both eyes, no astigmatic cut-off) with no ocular pathology from the ZOC-BHVI high myopia study in China underwent cycloplegic refraction, followed by subjective refraction, visual acuity testing, and axial length measurement. Two logistic regression models were undertaken to test the association of age, gender, refractive error, axial length, and parental myopia with reduced vision. Mean group age was 19.0 ± 8.6 years; subjective spherical equivalent refractive error was -9.03 ± 2.73 D; objective spherical equivalent refractive error was -8.90 ± 2.60 D; and axial length was 27.0 ± 1.3 mm. By visual acuity, 82.4 per cent had normal vision, 16.0 per cent mildly reduced vision, 1.2 per cent moderately reduced vision, 0.3 per cent severely reduced vision, and no subjects were blind. The percentage with reduced vision increased with spherical equivalent (to 74.5 per cent for -15.00 to -39.99 D), with axial length (to 67.7 per cent of eyes from 30.01 to 32.00 mm), and with age (to 22.9 per cent of those 41 years and over). Spherical equivalent and axial length were significantly associated with reduced vision (p < 0.0001). Age and parental myopia were not significantly associated with reduced vision. Gender was significant in one model (p = 0.04). Mildly reduced vision is common in high myopia without ocular pathology and is strongly correlated with greater magnitudes of refractive error and axial length. Better understanding is required to minimise reduced vision in high myopes. © 2017 Optometry Australia.
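    The logistic-regression association between refractive error and reduced vision can be illustrated with a simpler unadjusted odds ratio from a 2x2 table. The counts below are hypothetical, not the study's data:

```python
def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without.
    """
    return (a * d) / (b * c)

# hypothetical counts: (reduced vision, normal vision)
high_se = (41, 14)    # spherical equivalent -15.00 D or worse
low_se = (166, 1071)  # spherical equivalent better than -15.00 D
or_high_myopia = odds_ratio(high_se[0], high_se[1], low_se[0], low_se[1])
```

An odds ratio well above 1 here mirrors the reported direction of effect: greater myopia is associated with higher odds of reduced vision; the full models additionally adjust for age, gender, and parental myopia.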

  17. Reducing representativeness and sampling errors in radio occultation-radiosonde comparisons

    NASA Astrophysics Data System (ADS)

    Gilpin, Shay; Rieckh, Therese; Anthes, Richard

    2018-05-01

    Radio occultation (RO) and radiosonde (RS) comparisons provide a means of analyzing errors associated with both observational systems. Since RO and RS observations are not taken at exactly the same time or location, temporal and spatial sampling errors resulting from atmospheric variability can be significant and inhibit error analysis of the observational systems. In addition, the vertical resolutions of RO and RS profiles differ, so vertical representativeness errors may also affect the comparison. In RO-RS comparisons, RO observations are co-located with RS profiles within a fixed time window and distance, i.e., within 3-6 h and within circles of radii between 100 and 500 km. In this study, we first show that vertical filtering of RO and RS profiles to a common vertical resolution reduces representativeness errors. We then test two methods of reducing horizontal sampling errors during RO-RS comparisons: restricting co-location pairs to within ellipses oriented along the direction of wind flow rather than circles, and applying a spatial-temporal sampling correction based on model data. Using data from 2011 to 2014, we compare RO and RS differences at four GCOS Reference Upper-Air Network (GRUAN) RS stations in different climatic locations, in which co-location pairs were constrained to a large circle (~666 km radius), a small circle (~300 km radius), and an ellipse parallel to the wind direction (~666 km semi-major axis, ~133 km semi-minor axis). We also apply a spatial-temporal sampling correction using European Centre for Medium-Range Weather Forecasts Interim Reanalysis (ERA-Interim) gridded data. Restricting co-locations to within the ellipse reduces root mean square (RMS) refractivity, temperature, and water vapor pressure differences relative to RMS differences within the large circle and produces differences that are comparable to or less than the RMS differences within circles of similar area.
Applying the sampling correction produces the largest reduction in RMS differences; once it is applied, RMS differences are nearly identical regardless of the geometric constraint. We conclude that implementing the spatial-temporal sampling correction using a reliable model will most effectively reduce sampling errors during RO-RS comparisons; however, if a reliable model is not available, restricting spatial comparisons to within an ellipse parallel to the wind flow will reduce sampling errors caused by horizontal atmospheric variability.
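    The ellipse constraint can be implemented by rotating each RO-RS offset into a wind-aligned frame and applying the standard ellipse inequality. A sketch using the ~666 km by ~133 km semi-axes above; the wind-direction convention is an assumption of this illustration:

```python
import math

def within_ellipse(dx_km, dy_km, wind_dir_deg, a_km=666.0, b_km=133.0):
    """Check whether an RO-RS offset lies inside an ellipse whose
    semi-major axis (a_km) is aligned with the wind direction.

    dx_km, dy_km: east and north offsets of the RO point from the RS station.
    wind_dir_deg: direction the wind blows toward, degrees east of north
                  (assumed convention for this sketch).
    """
    # rotate the offset into the wind-aligned frame
    theta = math.radians(90.0 - wind_dir_deg)  # convert to math-convention angle
    u = dx_km * math.cos(theta) + dy_km * math.sin(theta)   # along-wind component
    v = -dx_km * math.sin(theta) + dy_km * math.cos(theta)  # cross-wind component
    return (u / a_km) ** 2 + (v / b_km) ** 2 <= 1.0
```

With an eastward wind, an offset 500 km downwind passes the test while a 200 km cross-wind offset fails it, even though both lie well inside a 666-km circle.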

  18. Magneto-optical tracking of flexible laparoscopic ultrasound: model-based online detection and correction of magnetic tracking errors.

    PubMed

    Feuerstein, Marco; Reichl, Tobias; Vogel, Jakob; Traub, Joerg; Navab, Nassir

    2009-06-01

    Electromagnetic tracking is currently one of the most promising means of localizing flexible endoscopic instruments such as flexible laparoscopic ultrasound transducers. However, electromagnetic tracking is also susceptible to interference from ferromagnetic material, which distorts the magnetic field and leads to tracking errors. This paper presents new methods for real-time online detection and reduction of dynamic electromagnetic tracking errors when localizing a flexible laparoscopic ultrasound transducer. We use a hybrid tracking setup to combine optical tracking of the transducer shaft with electromagnetic tracking of the flexible transducer tip. A novel approach of modeling the poses of the transducer tip in relation to the transducer shaft allows us to reliably detect and significantly reduce electromagnetic tracking errors. For detecting errors of more than 5 mm, we achieved a sensitivity and specificity of 91% and 93%, respectively. An initial 3-D RMS error of 6.91 mm was reduced to 3.15 mm.
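    The reported sensitivity and specificity follow directly from the error detector's confusion counts. A sketch with hypothetical counts chosen to reproduce the 91%/93% figures:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity of an error detector.

    tp: distorted measurements correctly flagged (error > 5 mm detected)
    fn: distorted measurements missed
    tn: clean measurements correctly passed
    fp: clean measurements wrongly flagged
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# hypothetical confusion counts, not the paper's data
sens, spec = sens_spec(tp=91, fn=9, tn=93, fp=7)
```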

  19. Customization of user interfaces to reduce errors and enhance user acceptance.

    PubMed

    Burkolter, Dina; Weyers, Benjamin; Kluge, Annette; Luther, Wolfram

    2014-03-01

    Customization is assumed to reduce errors and increase user acceptance in human-machine interaction. Reconfiguration gives the operator the option to customize a user interface according to his or her own preferences. An experimental study with 72 computer science students using a simulated process control task was conducted. The reconfiguration group (RG) interactively reconfigured their user interfaces and used the reconfigured interface in the subsequent test, whereas the control group (CG) used a default user interface. Results showed significantly lower error rates and higher acceptance in the RG compared with the CG, while there were no significant differences between the groups regarding situation awareness and mental workload. Reconfiguration therefore seems promising and warrants further exploration. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  20. Can eye-tracking technology improve situational awareness in paramedic clinical education?

    PubMed

    Williams, Brett; Quested, Andrew; Cooper, Simon

    2013-01-01

    Human factors play a significant part in clinical error. Situational awareness (SA) means being aware of one's surroundings, comprehending the present situation, and being able to predict outcomes. It is a key human skill that, when properly applied, is associated with reduced medical error. Eye-tracking technology can provide an objective, quantitative measure of the initial perception component of SA. Feedback from eye-tracking technology can be used to improve the understanding and teaching of SA in clinical contexts and, consequently, has potential for reducing clinician error and the concomitant adverse events.

  1. Data entry errors and design for model-based tight glycemic control in critical care.

    PubMed

    Ward, Logan; Steel, James; Le Compte, Aaron; Evans, Alicia; Tan, Chia-Siong; Penning, Sophie; Shaw, Geoffrey M; Desaive, Thomas; Chase, J Geoffrey

    2012-01-01

    Tight glycemic control (TGC) has shown benefits but has been difficult to achieve consistently. Model-based methods and computerized protocols offer the opportunity to improve TGC quality but require human data entry, particularly of blood glucose (BG) values, which is prone to error. This study presents the design and optimization of data entry methods to minimize error for a computerized, model-based TGC method prior to pilot clinical trials. To minimize data entry error, two tests were carried out to optimize a method with error rates below the 5%-plus reported in other studies. Four initial methods were tested on 40 subjects in random order, and the best two were tested more rigorously on 34 subjects. The tests measured entry speed and accuracy. Errors were reported as corrected and uncorrected errors, with their sum comprising the total error rate. The first set of tests used randomly selected values, while the second set used the same values for all subjects to allow comparisons across users and direct assessment of error magnitudes. These research tests were approved by the University of Canterbury Ethics Committee. The final data entry method tested reduced errors to less than 1-2%, a 60-80% reduction from reported values. Error magnitudes were clinically significant, typically 10.0 mmol/liter or an order of magnitude, but occurred only for extreme values (BG < 2.0 mmol/liter or BG > 15.0-20.0 mmol/liter), which could easily be caught with automated checking of extreme values for safety. The data entry method selected significantly reduced data entry errors in the limited design tests presented and is in use in a clinical pilot TGC study. The overall approach and testing methods are easily performed and generalizable to other applications and protocols. © 2012 Diabetes Technology Society.
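    The error accounting used here (corrected plus uncorrected entries over total entries) is straightforward to compute. A sketch with hypothetical counts:

```python
def entry_error_rates(corrected, uncorrected, total_entries):
    """Corrected, uncorrected, and total data entry error rates.

    The total rate is the sum of the corrected and uncorrected rates,
    matching the accounting described in the study.
    """
    corrected_rate = corrected / total_entries
    uncorrected_rate = uncorrected / total_entries
    return corrected_rate, uncorrected_rate, corrected_rate + uncorrected_rate

# hypothetical session: 200 BG entries, 3 caught and fixed, 1 slipped through
c_rate, u_rate, total_rate = entry_error_rates(3, 1, 200)  # total 2%
```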

  2. A statistical study of radio-source structure effects on astrometric very long baseline interferometry observations

    NASA Technical Reports Server (NTRS)

    Ulvestad, J. S.

    1989-01-01

    Errors from a number of sources in astrometric very long baseline interferometry (VLBI) have been reduced in recent years through a variety of calibration and modeling methods. Such reductions have led to a situation in which the extended structure of the natural radio sources used in VLBI is a significant error source in the effort to improve the accuracy of the radio reference frame. In the past, work has been done on individual radio sources to establish the magnitude of the errors caused by their particular structures. The results of calculations on 26 radio sources are reported, in which an effort is made to determine the typical delay and delay-rate errors for a number of sources having different types of structure. It is found that for single observations of the types of radio sources present in astrometric catalogs, group-delay and phase-delay scatter in the 50 to 100 psec range due to source structure can be expected at 8.4 GHz on the intercontinental baselines available in the Deep Space Network (DSN). Delay-rate scatter of approx. 5 x 10^-15 sec/sec (or approx. 0.002 mm/sec) is also expected. If such errors mapped directly into source position errors, they would correspond to position uncertainties of approx. 2 to 5 nrad, similar to the best position determinations in the current JPL VLBI catalog. With the advent of wider bandwidth VLBI systems on the large DSN antennas, the system noise will be low enough that the structure-induced errors will be a significant part of the error budget. Several possibilities for reducing the structure errors are discussed briefly, although it is likely that considerable effort will have to be devoted to the structure problem in order to reduce the typical error by a factor of two or more.

  3. An interventional approach for patient and nurse safety: a fatigue countermeasures feasibility study.

    PubMed

    Scott, Linda D; Hofmeister, Nancee; Rogness, Neal; Rogers, Ann E

    2010-01-01

    Studies indicate that extended shifts worked by hospital staff nurses are associated with higher risk of errors. Long work hours coupled with insufficient sleep and fatigue are even riskier. Although other industries have developed programs to reduce fatigue-related errors and injuries, fatigue countermeasures programs for nurses (FCMPN) are lacking. The objective of this study was to evaluate the feasibility of an FCMPN for improving sleep duration and quality while reducing daytime sleepiness and patient care errors. Selected sleep variables, errors, and drowsy driving were evaluated among hospital staff nurses (n = 47) before and after FCMPN implementation. A one-group pretest-posttest repeated-measures approach was used. Participants provided data 2 weeks before the FCMPN, 4 weeks after receiving the intervention, and again 3 months after the intervention. Most of the nurses experienced poor sleep quality, severe daytime sleepiness, and decreased alertness at work and while operating a motor vehicle. After the FCMPN, significant improvements were noted in sleep duration, sleep quality, alertness, and error prevention. Although significant improvements were not found in daytime sleepiness scores, the severity of daytime sleepiness appeared to decrease. Despite improvements in fatigue management, nurses reported feelings of guilt when engaging in FCMPN activities, especially strategic naps and relieved breaks. Initial findings support the feasibility of using an FCMPN for mitigating fatigue, improving sleep, and reducing errors among hospital staff nurses. In future investigations, the acceptability, efficacy, and effectiveness of FCMPNs can be examined.

  4. Evaluation of robotic training forces that either enhance or reduce error in chronic hemiparetic stroke survivors.

    PubMed

    Patton, James L; Stoykov, Mary Ellen; Kovic, Mark; Mussa-Ivaldi, Ferdinando A

    2006-01-01

    This investigation is one in a series of studies that address the possibility of stroke rehabilitation using robotic devices to facilitate "adaptive training." Healthy subjects, after training in the presence of systematically applied forces, typically exhibit a predictable "after-effect." A critical question is whether this adaptive characteristic is preserved following stroke so that it might be exploited for restoring function. Another important question is whether subjects benefit more from training forces that enhance their errors than from forces that reduce their errors. We exposed hemiparetic stroke survivors and healthy age-matched controls to a pattern of disturbing forces that previous studies have found to induce a dramatic adaptation in healthy individuals. Eighteen stroke survivors made 834 movements in the presence of a robot-generated force field that pushed their hands with a force proportional to hand speed and perpendicular to the direction of motion, either clockwise or counterclockwise. We found that subjects could adapt, as evidenced by significant after-effects. After-effects were not correlated with the clinical scores that we used for measuring motor impairment. Further examination revealed that significant improvements occurred only when the training forces magnified the original errors, and not when the training forces reduced the errors or were zero. Within this constrained experimental task, we found error-enhancing therapy (as opposed to guiding the limb closer to the correct path) to be more effective than therapy that assisted the subject.

  5. Reduction in chemotherapy order errors with computerized physician order entry.

    PubMed

    Meisenberg, Barry R; Wright, Robert R; Brady-Copertino, Catherine J

    2014-01-01

    To measure the number and type of errors associated with chemotherapy order composition under three sequential methods of ordering: handwritten orders, preprinted orders, and computerized physician order entry (CPOE) embedded in the electronic health record. From 2008 to 2012, a sample of completed chemotherapy orders were reviewed by a pharmacist for the number and type of errors as part of routine performance improvement monitoring. Error frequencies for each of the three distinct methods of composing chemotherapy orders were compared using statistical methods. The rate of problematic order sets (those requiring significant rework for clarification) was reduced from 30.6% with handwritten orders to 12.6% with preprinted orders (preprinted v handwritten, P < .001) and to 2.2% with CPOE (preprinted v CPOE, P < .001). The incidence of errors capable of causing harm was reduced from 4.2% with handwritten orders to 1.5% with preprinted orders (preprinted v handwritten, P < .001) and to 0.1% with CPOE (CPOE v preprinted, P < .001). The number of problem- and error-containing chemotherapy orders was reduced sequentially, first by preprinted order sets and then by CPOE. CPOE is associated with low error rates, but it did not eliminate all errors, and the technology can introduce novel types of errors not seen with traditional handwritten or preprinted orders. Vigilance even with CPOE is still required to avoid patient harm.
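    Error-rate comparisons like 30.6% versus 12.6% are typically tested with a two-proportion z test. The study's exact method and sample sizes are not given here, so the counts below are hypothetical:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for comparing two error proportions (pooled variance)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled proportion under the null hypothesis
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# hypothetical sample sizes; rates mirror the reported 30.6% vs 12.6%
z = two_proportion_z(153, 500, 63, 500)
```

A |z| above 1.96 corresponds to P < .05 two-sided, consistent with the reported P < .001 for this comparison.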

  6. Extracellular space preservation aids the connectomic analysis of neural circuits

    PubMed Central

    Pallotto, Marta; Watkins, Paul V; Fubara, Boma; Singer, Joshua H; Briggman, Kevin L

    2015-01-01

    Dense connectomic mapping of neuronal circuits is limited by the time and effort required to analyze 3D electron microscopy (EM) datasets. Algorithms designed to automate image segmentation suffer from substantial error rates and require significant manual error correction. Any improvement in segmentation error rates would therefore directly reduce the time required to analyze 3D EM data. We explored preserving extracellular space (ECS) during chemical tissue fixation to improve the ability to segment neurites and to identify synaptic contacts. ECS preserved tissue is easier to segment using machine learning algorithms, leading to significantly reduced error rates. In addition, we observed that electrical synapses are readily identified in ECS preserved tissue. Finally, we determined that antibodies penetrate deep into ECS preserved tissue with only minimal permeabilization, thereby enabling correlated light microscopy (LM) and EM studies. We conclude that preservation of ECS benefits multiple aspects of the connectomic analysis of neural circuits. DOI: http://dx.doi.org/10.7554/eLife.08206.001 PMID:26650352

  7. Image-guided spatial localization of heterogeneous compartments for magnetic resonance

    PubMed Central

    An, Li; Shen, Jun

    2015-01-01

    Purpose: Image-guided SPectral Localization Achieved by Sensitivity Heterogeneity (SPLASH) allows rapid measurement of signals from irregularly shaped anatomical compartments without using phase encoding gradients. Here, the authors propose a novel method to address the issue of heterogeneous signal distribution within the localized compartments. Methods: Each compartment was subdivided into multiple subcompartments, and their spectra were solved for by Tikhonov regularization, enforcing smoothness within each compartment. The spectrum of a given compartment was generated by combining the spectra of its subcompartments. The proposed method was first tested using Monte Carlo simulations and then applied to reconstructing in vivo spectra from irregularly shaped ischemic stroke and normal tissue compartments. Results: Monte Carlo simulations demonstrate that the proposed regularized SPLASH method significantly reduces localization and metabolite quantification errors. In vivo results show that the intracompartment regularization yields an approximately 40% reduction of error in metabolite quantification. Conclusions: The proposed method significantly reduces localization errors and metabolite quantification errors caused by intracompartment heterogeneous signal distribution. PMID:26328977
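    Tikhonov regularization stabilizes the ill-conditioned inversion that arises when sensitivity profiles of neighboring subcompartments are nearly degenerate. A generic two-unknown ridge solve via the normal equations; this is a toy illustration of the technique, not the SPLASH implementation:

```python
def tikhonov_2col(A, b, lam):
    """Solve min ||Ax - b||^2 + lam*||x||^2 for a two-column matrix A
    via the normal equations (A^T A + lam*I) x = A^T b.
    """
    # build A^T A + lam*I and A^T b for the 2-unknown case
    m00 = sum(r[0] * r[0] for r in A) + lam
    m01 = sum(r[0] * r[1] for r in A)
    m11 = sum(r[1] * r[1] for r in A) + lam
    rhs0 = sum(r[0] * y for r, y in zip(A, b))
    rhs1 = sum(r[1] * y for r, y in zip(A, b))
    # invert the symmetric 2x2 system in closed form
    det = m00 * m11 - m01 * m01
    x0 = (m11 * rhs0 - m01 * rhs1) / det
    x1 = (m00 * rhs1 - m01 * rhs0) / det
    return x0, x1

# ill-conditioned toy sensitivity matrix: two nearly identical profiles
A = [[1.0, 1.0], [1.0, 1.001], [1.0, 0.999]]
b = [2.0, 2.0, 2.0]
x_reg = tikhonov_2col(A, b, lam=0.1)
```

With lam = 0 this toy system puts all weight on one column (x = (2, 0)); the penalty spreads it into a balanced, smoother solution near (0.98, 0.98), analogous to the smoothness enforced within a compartment.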

  8. Effects of sharing information on drug administration errors in pediatric wards: a pre–post intervention study

    PubMed Central

    Chua, Siew-Siang; Choo, Sim-Mei; Sulaiman, Che Zuraini; Omar, Asma; Thong, Meow-Keong

    2017-01-01

    Background and purpose Drug administration errors are more likely to reach the patient than other medication errors. The main aim of this study was to determine whether the sharing of information on drug administration errors among health care providers would reduce such problems. Patients and methods This study involved direct, undisguised observations of drug administrations in two pediatric wards of a major teaching hospital in Kuala Lumpur, Malaysia. This study consisted of two phases: Phase 1 (pre-intervention) and Phase 2 (post-intervention). Data were collected by two observers over a 40-day period in both Phase 1 and Phase 2 of the study. Both observers were pharmacy graduates: Observer 1 had just completed her undergraduate pharmacy degree, whereas Observer 2 was completing her one-year internship as a provisionally registered pharmacist in the hospital under study. A drug administration error was defined as a discrepancy between the drug regimen received by the patient and that intended by the prescriber, and also as drug administration procedures that did not follow standard hospital policies and procedures. Results from Phase 1 of the study were analyzed, presented and discussed with the ward staff before commencement of data collection in Phase 2. Results A total of 1,284 and 1,401 doses of drugs were administered in Phase 1 and Phase 2, respectively. The rate of drug administration errors reduced significantly from Phase 1 to Phase 2 (44.3% versus 28.6%, respectively; P<0.001). Logistic regression analysis showed that the adjusted odds of drug administration errors in Phase 1 of the study were almost three times those in Phase 2 (P<0.001). The most common types of errors were incorrect administration technique and incorrect drug preparation. Nasogastric and intravenous routes of drug administration contributed significantly to the rate of drug administration errors.
Conclusion This study showed that sharing of the types of errors that had occurred was significantly associated with a reduction in drug administration errors. PMID:28356748

  9. Improving patient safety using the sterile cockpit principle during medication administration: a collaborative, unit-based project.

    PubMed

    Fore, Amanda M; Sculli, Gary L; Albee, Doreen; Neily, Julia

    2013-01-01

    To implement the sterile cockpit principle to decrease interruptions and distractions during high-volume medication administration and reduce the number of medication errors. While some studies have described the importance of reducing interruptions as a tactic to reduce medication errors, work is needed to assess the impact on patient outcomes. Data regarding the type and frequency of distractions were collected during the first 11 weeks of implementation. Medication error rates were tracked for 1 year before and 1 year after implementation. Simple regression analysis showed a decrease in the mean number of distractions (β = -0.193, P = 0.02) over time. The medication error rate decreased by 42.78% (P = 0.04) after implementation of the sterile cockpit principle. The use of crew resource management techniques, including the sterile cockpit principle, applied to medication administration has a significant impact on patient safety. Applying the sterile cockpit principle to inpatient medical units is a feasible approach to reducing the number of distractions during medication administration and, thus, the likelihood of medication error. 'Do Not Disturb' signs and vests are inexpensive, simple interventions that can be used as reminders to decrease distractions. © 2012 Blackwell Publishing Ltd.
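    The regression slope reported for distractions (β = -0.193 over time) is an ordinary least-squares fit of counts against time. A sketch with hypothetical weekly counts over an 11-week implementation:

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)            # variance term
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # covariance term
    return sxy / sxx

# hypothetical weekly distraction counts, not the study's data
weeks = list(range(1, 12))
distractions = [20, 19, 18, 18, 16, 15, 15, 13, 12, 12, 10]
beta = slope(weeks, distractions)  # negative slope: distractions decline
```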

  10. DNA assembly with error correction on a droplet digital microfluidics platform.

    PubMed

    Khilko, Yuliya; Weyman, Philip D; Glass, John I; Adams, Mark D; McNeil, Melanie A; Griffin, Peter B

    2018-06-01

    Custom synthesized DNA is in high demand for synthetic biology applications. However, current technologies to produce these sequences by assembly from DNA oligonucleotides are costly and labor-intensive. The automation and reduced sample volumes afforded by microfluidic technologies could significantly decrease the materials and labor costs associated with DNA synthesis. The purpose of this study was to develop a gene assembly protocol utilizing a digital microfluidic device. Toward this goal, we adapted bench-scale oligonucleotide assembly methods followed by enzymatic error correction to the Mondrian™ digital microfluidic platform. We optimized Gibson assembly, polymerase chain reaction (PCR), and enzymatic error correction reactions in a single protocol to assemble 12 oligonucleotides into a 339-bp double-stranded DNA sequence encoding part of the human influenza virus hemagglutinin (HA) gene. The reactions were scaled down to 0.6-1.2 μL. Initial microfluidic assembly methods were successful and had an error frequency of approximately 4 errors/kb, with errors originating from the original oligonucleotide synthesis. Relative to conventional benchtop procedures, PCR optimization required additional MgCl2, Phusion polymerase, and PEG 8000 to achieve amplification of the assembly and error correction products. After one round of error correction, the error frequency was reduced to an average of 1.8 errors/kb. We demonstrated that DNA assembly from oligonucleotides and error correction can be completely automated on a digital microfluidic (DMF) platform. The results demonstrate that enzymatic reactions in droplets show a strong dependence on surface interactions, and successful on-chip implementation required supplementation with surfactants, molecular crowding agents, and an excess of enzyme.
Enzymatic error correction of assembled fragments improved sequence fidelity 2-fold, a significant improvement but somewhat lower than expected compared with bench-top assays, suggesting additional capacity for optimization.
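    Error frequencies such as 4 errors/kb and 1.8 errors/kb normalize sequencing errors by total assembled length. A sketch with hypothetical clone tallies for 339-bp assemblies:

```python
def errors_per_kb(n_errors, total_bp):
    """Error frequency normalized to errors per kilobase."""
    return n_errors / (total_bp / 1000.0)

# hypothetical sequencing tally: 10 clones of the 339-bp assembly
before = errors_per_kb(n_errors=14, total_bp=10 * 339)  # pre-correction, ~4.1/kb
after = errors_per_kb(n_errors=6, total_bp=10 * 339)    # post-correction, ~1.8/kb
fold_improvement = before / after                        # ~2.3-fold
```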

  11. [Relations between health information systems and patient safety].

    PubMed

    Nøhr, Christian

    2012-11-05

    Health information systems have the potential to reduce medical errors, and indeed many studies have shown a significant reduction. However, if the systems are not designed and implemented properly, there is evidence suggesting that new types of errors will arise--i.e., technology-induced errors. Health information systems will need to undergo more rigorous evaluation. Usability evaluation and simulation tests with humans in the loop can help detect and prevent technology-induced errors before systems are deployed in real health-care settings.

  12. The effects of a test-taking strategy intervention for high school students with test anxiety in advanced placement science courses

    NASA Astrophysics Data System (ADS)

    Markus, Doron J.

    Test anxiety is one of the most debilitating and disruptive factors associated with underachievement and failure in schools (Birenbaum, Menucha, Nasser, & Fadia, 1994; Tobias, 1985). Researchers have suggested that interventions combining multiple test-anxiety reduction techniques are most effective at reducing test anxiety levels (Ergene, 2003). For the current study, involving 62 public high school students enrolled in advanced placement science courses, the researcher designed a multimodal intervention to reduce test anxiety. Analyses were conducted to assess the relationships among test anxiety levels, unit examination scores, and irregular multiple-choice error patterns (error clumping), as well as changes in these measures after the intervention. Results indicate significant, positive relationships between some measures of test anxiety and error clumping, as well as significant, negative relationships between test anxiety levels and student achievement. In addition, results show significant decreases in holistic measures of test anxiety among students with low anxiety levels, as well as decreases in Emotionality subscores of test anxiety among students with high levels of test anxiety. There were no significant changes over time in the Worry subscores of test anxiety. Suggestions for further research include confirming the existence of error clumping and its causal relationship with test anxiety.

  13. The Use of Categorized Time-Trend Reporting of Radiation Oncology Incidents: A Proactive Analytical Approach to Improving Quality and Safety Over Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold, Anthony, E-mail: anthony.arnold@sesiahs.health.nsw.gov.a; Delaney, Geoff P.; Cassapi, Lynette

    Purpose: Radiotherapy is a common treatment for cancer patients. Although the incidence of error is low, errors can be severe or affect significant numbers of patients, and they will often not manifest until long after treatment. This study describes the development of an incident reporting tool that allows categorical analysis and time-trend reporting, covering the first 3 years of use. Methods and Materials: A radiotherapy-specific incident analysis system was established. Staff members were encouraged to report actual errors and near-miss events detected at the prescription, simulation, planning, or treatment phases of radiotherapy delivery. Trend reporting was reviewed monthly. Results: Reports were analyzed for the first 3 years of operation (May 2004-2007). A total of 688 reports were received during the study period. The actual error rate was 0.2% per treatment episode. During the study period, the actual error rate fell significantly from 1% per year to 0.3% per year (p < 0.001), as did the total event report rate (p < 0.0001). There were 3.5 times as many near misses reported as actual errors. Conclusions: This system has allowed real-time analysis of events within a radiation oncology department and has reduced the error rate through a focus on learning and prevention from near-miss reports. Plans are underway to extend this reporting tool to Australia and New Zealand.

  14. Managing human fallibility in critical aerospace situations

    NASA Astrophysics Data System (ADS)

    Tew, Larry

    2014-11-01

    Human fallibility is pervasive in the aerospace industry, with over 50% of failures attributed to human error. Consider the benefits to any organization if those errors were significantly reduced. Aerospace manufacturing involves high-value, high-profile systems with significant complexity and often repetitive build, assembly, and test operations. In spite of extensive analysis, planning, training, and detailed procedures, human factors can cause unexpected errors. Handling such errors involves extensive cause and corrective action analysis and, invariably, schedule slips and cost growth. We will discuss success stories, including those associated with electro-optical systems, where very significant reductions in human fallibility errors were achieved after adapted and specialized training. In the eyes of company and customer leadership, the steps used to achieve these results led to a major culture change in both the workforce and the supporting management organization. This approach has proven effective in other industries such as medicine, firefighting, law enforcement, and aviation. The roadmap to success and the steps to minimize human error are known. They can be used by any organization willing to accept human fallibility and take a proactive approach, incorporating the steps needed to manage and minimize error.

  15. Price and cost estimation

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.

    1979-01-01

    The Price and Cost Estimating Program (PACE II) was developed to prepare man-hour and material cost estimates. This versatile and flexible tool significantly reduces computation time and errors, and cuts the typing and reproduction time involved in preparing cost estimates.

  16. Syntactic and semantic errors in radiology reports associated with speech recognition software.

    PubMed

    Ringler, Michael D; Goss, Brian C; Bartholmai, Brian J

    2017-03-01

    Speech recognition software can increase the frequency of errors in radiology reports, which may affect patient care. We retrieved 213,977 speech recognition software-generated reports from 147 different radiologists and proofread them for errors. Errors were classified as "material" if they were believed to alter interpretation of the report. "Immaterial" errors were subclassified as intrusion/omission or spelling errors. The proportion of errors and error type were compared among individual radiologists, imaging subspecialties, and time periods. In all, 20,759 reports (9.7%) contained errors, of which 3992 (1.9%) were material errors. Among immaterial errors, spelling errors were more common than intrusion/omission errors (p < .001). The proportion of errors and the fraction of material errors varied significantly among radiologists and between imaging subspecialties (p < .001). Errors were more common in cross-sectional reports, reports reinterpreting results of outside examinations, and procedural studies (all p < .001). The error rate decreased over time (p < .001), which suggests that a quality control program with regular feedback may reduce errors.

  17. Elimination of Emergency Department Medication Errors Due To Estimated Weights.

    PubMed

    Greenwalt, Mary; Griffen, David; Wilkerson, Jim

    2017-01-01

    From 7/2014 through 6/2015, 10 emergency department (ED) medication dosing errors were reported through the electronic incident reporting system of an urban academic medical center. Analysis of these medication errors identified inaccurate estimated patient weights as the root cause. The goal of this project was to reduce weight-based dosing medication errors due to inaccurate estimated weights for patients presenting to the ED. Chart review revealed that 13.8% of estimated weights documented on admitted ED patients varied by more than 10% from the actual admission weights subsequently recorded. A random sample of 100 charts containing estimated weights revealed 2 previously unreported significant medication dosage errors (a significant-error rate of 2%). Key improvements included removing barriers to weighing ED patients, storytelling to engage staff and change culture, and removal of the estimated weight documentation field from the ED electronic health record (EHR) forms. With these improvements, estimated weights on ED patients, and the resulting medication errors, were eliminated.

  18. Rhythmic chaos: irregularities of computer ECG diagnosis.

    PubMed

    Wang, Yi-Ting Laureen; Seow, Swee-Chong; Singh, Devinder; Poh, Kian-Keong; Chai, Ping

    2017-09-01

    Diagnostic errors can occur when physicians rely solely on computer electrocardiogram interpretation. Cardiologists often receive referrals for computer misdiagnoses of atrial fibrillation. Patients may have been inappropriately anticoagulated for pseudo atrial fibrillation. Anticoagulation carries significant risks, and such errors may carry a high cost. Have we become overreliant on machines and technology? In this article, we illustrate three such cases and briefly discuss how we can reduce these errors. Copyright: © Singapore Medical Association.

  19. A modified technique to reduce tibial keel cutting errors during an Oxford unicompartmental knee arthroplasty.

    PubMed

    Inui, Hiroshi; Taketomi, Shuji; Tahara, Keitarou; Yamagami, Ryota; Sanada, Takaki; Tanaka, Sakae

    2017-03-01

    Bone cutting errors can cause malalignment of unicompartmental knee arthroplasties (UKA). Although the extent of tibial malalignment due to horizontal cutting errors has been well reported, there is a lack of studies evaluating malalignment as a consequence of keel cutting errors, particularly in the Oxford UKA. The purpose of this study was to examine keel cutting errors during Oxford UKA placement using a navigation system and to clarify whether two different tibial keel cutting techniques would have different error rates. The alignment of the tibial cut surface after a horizontal osteotomy and the surface of the tibial trial component was measured with a navigation system. Cutting error was defined as the angular difference between these measurements. The following two techniques were used: the standard "pushing" technique in 83 patients (group P) and a modified "dolphin" technique in 41 patients (group D). In all 123 patients studied, the mean absolute keel cutting error was 1.7° and 1.4° in the coronal and sagittal planes, respectively. In group P, there were 22 outlier patients (27%) in the coronal plane and 13 (16%) in the sagittal plane. Group D had three outlier patients (8%) in the coronal plane and none (0%) in the sagittal plane. Significant differences were observed in the outlier ratio of these techniques in both the sagittal (P = 0.014) and coronal (P = 0.008) planes. Our study demonstrated overall keel cutting errors of 1.7° in the coronal plane and 1.4° in the sagittal plane. The "dolphin" technique was found to significantly reduce keel cutting errors on the tibial side. This technique will be useful for accurate component positioning and therefore improve the longevity of Oxford UKAs. Retrospective comparative study, Level III.

  20. Continuous correction of differential path length factor in near-infrared spectroscopy

    PubMed Central

    Moore, Jason H.; Diamond, Solomon G.

    2013-01-01

    In continuous-wave near-infrared spectroscopy (CW-NIRS), changes in the concentration of oxyhemoglobin and deoxyhemoglobin can be calculated by solving a set of linear equations from the modified Beer-Lambert Law. Cross-talk error in the calculated hemodynamics can arise from inaccurate knowledge of the wavelength-dependent differential path length factor (DPF). We apply the extended Kalman filter (EKF) with a dynamical systems model to calculate relative concentration changes in oxy- and deoxyhemoglobin while simultaneously estimating relative changes in DPF. Results from simulated and experimental CW-NIRS data are compared with results from a weighted least squares (WLSQ) method. The EKF method was found to effectively correct for artificially introduced errors in DPF and to reduce the cross-talk error in simulation. With experimental CW-NIRS data, the hemodynamic estimates from EKF differ significantly from the WLSQ (p<0.001). The cross-correlations among residuals at different wavelengths were found to be significantly reduced by the EKF method compared to WLSQ in three physiologically relevant spectral bands 0.04 to 0.15 Hz, 0.15 to 0.4 Hz and 0.4 to 2.0 Hz (p<0.001). This observed reduction in residual cross-correlation is consistent with reduced cross-talk error in the hemodynamic estimates from the proposed EKF method. PMID:23640027

  1. Continuous correction of differential path length factor in near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Talukdar, Tanveer; Moore, Jason H.; Diamond, Solomon G.

    2013-05-01

    In continuous-wave near-infrared spectroscopy (CW-NIRS), changes in the concentration of oxyhemoglobin and deoxyhemoglobin can be calculated by solving a set of linear equations from the modified Beer-Lambert Law. Cross-talk error in the calculated hemodynamics can arise from inaccurate knowledge of the wavelength-dependent differential path length factor (DPF). We apply the extended Kalman filter (EKF) with a dynamical systems model to calculate relative concentration changes in oxy- and deoxyhemoglobin while simultaneously estimating relative changes in DPF. Results from simulated and experimental CW-NIRS data are compared with results from a weighted least squares (WLSQ) method. The EKF method was found to effectively correct for artificially introduced errors in DPF and to reduce the cross-talk error in simulation. With experimental CW-NIRS data, the hemodynamic estimates from EKF differ significantly from the WLSQ (p<0.001). The cross-correlations among residuals at different wavelengths were found to be significantly reduced by the EKF method compared to WLSQ in three physiologically relevant spectral bands 0.04 to 0.15 Hz, 0.15 to 0.4 Hz and 0.4 to 2.0 Hz (p<0.001). This observed reduction in residual cross-correlation is consistent with reduced cross-talk error in the hemodynamic estimates from the proposed EKF method.
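
    The modified Beer-Lambert inversion that both records describe reduces to a small linear solve: optical density changes at each wavelength are a path-weighted combination of the two chromophore concentration changes. A minimal sketch follows; the extinction coefficients, DPF values, and geometry are illustrative assumptions, not values from the study, and plain least squares stands in for the authors' WLSQ and EKF methods:

    ```python
    import numpy as np

    # Hypothetical extinction coefficients [1/(mM*cm)] for HbO2 and Hb at two
    # wavelengths (illustrative values, not tabulated constants).
    E = np.array([[1.2, 0.7],   # ~760 nm: [eps_HbO2, eps_Hb]
                  [0.9, 1.1]])  # ~850 nm
    dpf = np.array([6.0, 5.5])  # assumed wavelength-dependent DPFs
    L = 3.0                     # assumed source-detector separation [cm]

    def mbll_solve(delta_od):
        """Recover concentration changes from optical density changes via the
        modified Beer-Lambert Law: dOD(lam) = eps(lam) . dC * DPF(lam) * L."""
        A = E * (dpf * L)[:, None]   # path-weighted extinction matrix
        dC, *_ = np.linalg.lstsq(A, delta_od, rcond=None)
        return dC  # [d(HbO2), d(Hb)] in mM

    # Forward-model a known concentration change, then invert it.
    true_dc = np.array([0.01, -0.004])
    delta_od = (E * (dpf * L)[:, None]) @ true_dc
    print(mbll_solve(delta_od))  # recovers ~[0.01, -0.004]
    ```

    An error in an assumed DPF scales the corresponding row of the system, which is exactly the cross-talk mechanism the EKF in these papers is designed to track and correct.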

  2. Graphical user interface simplifies infusion pump programming and enhances the ability to detect pump-related faults.

    PubMed

    Syroid, Noah; Liu, David; Albert, Robert; Agutter, James; Egan, Talmage D; Pace, Nathan L; Johnson, Ken B; Dowdle, Michael R; Pulsipher, Daniel; Westenskow, Dwayne R

    2012-11-01

    Drug administration errors are frequent and are often associated with the misuse of IV infusion pumps. One source of these errors may be the infusion pump's user interface. We used failure mode and effects analyses to identify programming errors and to guide the design of a new syringe pump user interface. We designed the new user interface to clearly show the pump's operating state simultaneously in more than 1 monitoring location. We evaluated anesthesia residents in laboratory and simulated environments on programming accuracy and error detection between the new user interface and the user interface of a commercially available infusion pump. With the new user interface, the number of programming errors was reduced by 81%, the number of keystrokes per task was reduced from 9.2 ± 5.0 to 7.5 ± 5.5 (mean ± SD), the time required per task was reduced from 18.1 ± 14.1 seconds to 10.9 ± 9.5 seconds, and perceived workload was significantly lower. Residents detected 38 of 70 (54%) of the events with the new user interface and 37 of 70 (53%) with the existing user interface, despite no experience with the new user interface and extensive experience with the existing interface. The number of programming errors and workload were reduced partly because it took less time and fewer keystrokes to program the pump when using the new user interface. Despite minimal training, residents quickly identified preexisting infusion pump problems with the new user interface. Intuitive and easy-to-program infusion pump interfaces may reduce drug administration errors and infusion pump-related adverse events.

  3. Psychophysiological effects of self-regulation method: EEG frequency analysis and contingent negative variations.

    PubMed

    Ikemi, A

    1988-01-01

    Experiments were conducted to investigate the psychophysiological effects of self-regulation method (SRM), a newly developed method of self-control, using EEG frequency analysis and contingent negative variations (CNV). The results of the EEG frequency analysis showed that there is a significant increase in the percentage (power) of the theta-band and a significant decrease in the percentage (power) of the beta-band during SRM. Moreover, the results of an identical experiment conducted on subjects in a drowsy state showed that the changes in EEG frequencies during SRM can be differentiated from those of a drowsy state. Furthermore, experiments using CNV showed that there is a significant reduction of CNV amplitude during SRM. Despite the reduced amplitude during SRM, the number of errors in a task to evoke the CNV was reduced significantly without significant delay of reaction time. When an identical experiment was conducted in a drowsy state, CNV amplitude was reduced significantly, but reaction time and errors increased. From these experiments, the state of vigilance during SRM was discussed as a state of 'relaxed alertness'.

  4. Zero tolerance prescribing: a strategy to reduce prescribing errors on the paediatric intensive care unit.

    PubMed

    Booth, Rachelle; Sturgess, Emma; Taberner-Stokes, Alison; Peters, Mark

    2012-11-01

    To establish the baseline prescribing error rate in a tertiary paediatric intensive care unit (PICU) and to determine the impact of a zero tolerance prescribing (ZTP) policy incorporating a dedicated prescribing area and daily feedback of prescribing errors. A prospective, non-blinded, observational study was undertaken in a 12-bed tertiary PICU over a period of 134 weeks. Baseline prescribing error data were collected on weekdays for all patients for a period of 32 weeks, following which the ZTP policy was introduced. Daily error feedback was introduced after a further 12 months. Errors were sub-classified as 'clinical', 'non-clinical' and 'infusion prescription' errors and the effects of interventions considered separately. The baseline combined prescribing error rate was 892 (95% confidence interval (CI) 765-1,019) errors per 1,000 PICU occupied bed days (OBDs), comprising 25.6% clinical, 44% non-clinical and 30.4% infusion prescription errors. The combined interventions of ZTP plus daily error feedback were associated with a reduction in the combined prescribing error rate to 447 (95% CI 389-504) errors per 1,000 OBDs (p < 0.0001), an absolute risk reduction of 44.5% (95% CI 40.8-48.0%). Introduction of the ZTP policy was associated with a significant decrease in clinical and infusion prescription errors, while the introduction of daily error feedback was associated with a significant reduction in non-clinical prescribing errors. The combined interventions of ZTP and daily error feedback were associated with a significant reduction in prescribing errors in the PICU, in line with Department of Health requirements of a 40 % reduction within 5 years.
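
    The arithmetic behind the headline figures above is straightforward: each rate is errors per 1,000 occupied bed days, and the absolute risk reduction is the difference between the two rates expressed per bed day. A minimal sketch using the abstract's own numbers:

    ```python
    # Rates from the abstract: errors per 1,000 occupied bed days (OBDs).
    baseline = 892 / 1000  # combined rate before ZTP + daily feedback
    after = 447 / 1000     # combined rate after both interventions

    # Absolute risk reduction: the per-OBD difference between the rates,
    # expressed in percentage points.
    arr = baseline - after
    print(f"{arr:.1%}")  # 44.5%
    ```

    Note that the abstract's 44.5% is an absolute (not relative) reduction; the relative reduction, (892 - 447) / 892, would be about 50%.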

  5. Validity of mail survey data on bagged waterfowl

    USGS Publications Warehouse

    Atwood, E.L.

    1956-01-01

    Knowledge of the pattern of occurrence and characteristics of response errors obtained during an investigation of the validity of post-season surveys of hunters was used to advantage to devise a two-step method for removing the response-bias errors from the raw survey data. The method was tested on data with known errors and found to have a high efficiency in reducing the effect of response-bias errors. The development of this method for removing the effect of the response-bias errors, and its application to post-season hunter-take survey data, increased the reliability of the data from below the point of practical management significance up to the approximate reliability limits corresponding to the sampling errors.

  6. Cost effectiveness of a pharmacist-led information technology intervention for reducing rates of clinically important errors in medicines management in general practices (PINCER).

    PubMed

    Elliott, Rachel A; Putman, Koen D; Franklin, Matthew; Annemans, Lieven; Verhaeghe, Nick; Eden, Martin; Hayre, Jasdeep; Rodgers, Sarah; Sheikh, Aziz; Avery, Anthony J

    2014-06-01

    We recently showed that a pharmacist-led information technology-based intervention (PINCER) was significantly more effective in reducing medication errors in general practices than providing simple feedback on errors, with cost per error avoided at £79 (US$131). We aimed to estimate cost effectiveness of the PINCER intervention by combining effectiveness in error reduction and intervention costs with the effect of the individual errors on patient outcomes and healthcare costs, to estimate the effect on costs and QALYs. We developed Markov models for each of six medication errors targeted by PINCER. Clinical event probability, treatment pathway, resource use and costs were extracted from literature and costing tariffs. A composite probabilistic model combined patient-level error models with practice-level error rates and intervention costs from the trial. Cost per extra QALY and cost-effectiveness acceptability curves were generated from the perspective of NHS England, with a 5-year time horizon. The PINCER intervention generated £2,679 less cost and 0.81 more QALYs per practice [incremental cost-effectiveness ratio (ICER): -£3,037 per QALY] in the deterministic analysis. In the probabilistic analysis, PINCER generated 0.001 extra QALYs per practice compared with simple feedback, at £4.20 less per practice. Despite this extremely small set of differences in costs and outcomes, PINCER dominated simple feedback with a mean ICER of -£3,936 (standard error £2,970). At a ceiling 'willingness-to-pay' of £20,000/QALY, PINCER reaches 59% probability of being cost effective. PINCER produced marginal health gain at slightly reduced overall cost. Results are uncertain due to the poor quality of data to inform the effect of avoiding errors.
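
    The ICER reported above is the standard ratio of incremental cost to incremental health gain. A minimal sketch with hypothetical worked numbers (not the trial's figures):

    ```python
    def icer(delta_cost, delta_qaly):
        """Incremental cost-effectiveness ratio: extra cost per extra QALY
        versus the comparator. A negative ICER with a positive QALY gain
        indicates a dominant (cheaper and more effective) intervention."""
        return delta_cost / delta_qaly

    # Hypothetical example: an intervention saving £1,000 while adding
    # 0.5 QALYs per practice relative to simple feedback.
    print(icer(-1000, 0.5))  # -2000.0, i.e. dominant
    ```

    An intervention like PINCER, with negative incremental cost and positive incremental QALYs, dominates its comparator; the acceptability curve then expresses how often this holds across the probabilistic model's uncertainty.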

  7. Headaches associated with refractive errors: myth or reality?

    PubMed

    Gil-Gouveia, R; Martins, I P

    2002-04-01

    Headache and refractive errors are very common conditions in the general population, and those with headache often attribute their pain to a visual problem. The International Headache Society (IHS) criteria for the classification of headache includes an entity of headache associated with refractive errors (HARE), but indicates that its importance is widely overestimated. To compare overall headache frequency and HARE frequency in healthy subjects with uncorrected or miscorrected refractive errors and a control group. We interviewed 105 individuals with uncorrected refractive errors and a control group of 71 subjects (with properly corrected or without refractive errors) regarding their headache history. We compared the occurrence of headache and its diagnosis in both groups and assessed its relation to their habits of visual effort and type of refractive errors. Headache frequency was similar in both subjects and controls. Headache associated with refractive errors was the only headache type significantly more common in subjects with refractive errors than in controls (6.7% versus 0%). It was associated with hyperopia and was unrelated to visual effort or to the severity of visual error. With adequate correction, 72.5% of the subjects with headache and refractive error reported improvement in their headaches, and 38% had complete remission of headache. Regardless of the type of headache present, headache frequency was significantly reduced in these subjects (t = 2.34, P = .02). Headache associated with refractive errors was rarely identified in individuals with refractive errors. In those with chronic headache, proper correction of refractive errors significantly improved headache complaints and did so primarily by decreasing the frequency of headache episodes.

  8. Retinal Image Quality During Accommodation

    PubMed Central

    López-Gil, N.; Martin, J.; Liu, T.; Bradley, A.; Díaz-Muñoz, D.; Thibos, L.

    2013-01-01

    Purpose We asked if retinal image quality is maximum during accommodation, or sub-optimal due to accommodative error, when subjects perform an acuity task. Methods Subjects viewed a monochromatic (552nm), high-contrast letter target placed at various viewing distances. Wavefront aberrations of the accommodating eye were measured near the endpoint of an acuity staircase paradigm. Refractive state, defined as the optimum target vergence for maximising retinal image quality, was computed by through-focus wavefront analysis to find the power of the virtual correcting lens that maximizes visual Strehl ratio. Results Despite changes in ocular aberrations and pupil size during binocular viewing, retinal image quality and visual acuity typically remain high for all target vergences. When accommodative errors lead to sub-optimal retinal image quality, acuity and measured image quality both decline. However, the effects of accommodation errors on visual acuity are mitigated by pupillary constriction associated with accommodation and binocular convergence, and by binocular summation of dissimilar retinal image blur. Under monocular viewing conditions some subjects displayed significant accommodative lag that reduced visual performance, an effect that was exacerbated by pharmacological dilation of the pupil. Conclusions Spurious measurement of accommodative error can be avoided when the image quality metric used to determine refractive state is compatible with the focusing criteria used by the visual system to control accommodation. Real focusing errors of the accommodating eye do not necessarily produce a reliably measurable loss of image quality or clinically significant loss of visual performance, probably because of increased depth-of-focus due to pupil constriction. When retinal image quality is close to maximum achievable (given the eye’s higher-order aberrations), acuity is also near maximum. 
A combination of accommodative lag, reduced image quality, and reduced visual function may be a useful sign for diagnosing functionally-significant accommodative errors indicating the need for therapeutic intervention. PMID:23786386

  9. Retinal image quality during accommodation.

    PubMed

    López-Gil, Norberto; Martin, Jesson; Liu, Tao; Bradley, Arthur; Díaz-Muñoz, David; Thibos, Larry N

    2013-07-01

    We asked if retinal image quality is maximum during accommodation, or sub-optimal due to accommodative error, when subjects perform an acuity task. Subjects viewed a monochromatic (552 nm), high-contrast letter target placed at various viewing distances. Wavefront aberrations of the accommodating eye were measured near the endpoint of an acuity staircase paradigm. Refractive state, defined as the optimum target vergence for maximising retinal image quality, was computed by through-focus wavefront analysis to find the power of the virtual correcting lens that maximizes visual Strehl ratio. Despite changes in ocular aberrations and pupil size during binocular viewing, retinal image quality and visual acuity typically remain high for all target vergences. When accommodative errors lead to sub-optimal retinal image quality, acuity and measured image quality both decline. However, the effects of accommodation errors on visual acuity are mitigated by pupillary constriction associated with accommodation and binocular convergence, and by binocular summation of dissimilar retinal image blur. Under monocular viewing conditions some subjects displayed significant accommodative lag that reduced visual performance, an effect that was exacerbated by pharmacological dilation of the pupil. Spurious measurement of accommodative error can be avoided when the image quality metric used to determine refractive state is compatible with the focusing criteria used by the visual system to control accommodation. Real focusing errors of the accommodating eye do not necessarily produce a reliably measurable loss of image quality or clinically significant loss of visual performance, probably because of increased depth-of-focus due to pupil constriction. When retinal image quality is close to maximum achievable (given the eye's higher-order aberrations), acuity is also near maximum. 
A combination of accommodative lag, reduced image quality, and reduced visual function may be a useful sign for diagnosing functionally-significant accommodative errors indicating the need for therapeutic intervention. © 2013 The Authors Ophthalmic & Physiological Optics © 2013 The College of Optometrists.

  10. Missed lung cancer: when, where, and why?

    PubMed Central

    del Ciello, Annemilia; Franchi, Paola; Contegiacomo, Andrea; Cicchetti, Giuseppe; Bonomo, Lorenzo; Larici, Anna Rita

    2017-01-01

    Missed lung cancer is a source of concern among radiologists and an important medicolegal challenge. In 90% of cases, errors in the diagnosis of lung cancer occur on chest radiographs. It may be challenging for radiologists to distinguish a lung lesion from bones, pulmonary vessels, mediastinal structures, and other complex anatomical structures on chest radiographs. Nevertheless, lung cancer can also be overlooked on computed tomography (CT) scans, regardless of context and whether or not a clinical or radiologic suspicion exists. Awareness of the possible causes of overlooking a pulmonary lesion can give radiologists a chance to reduce the occurrence of this eventuality. Various factors contribute to a misdiagnosis of lung cancer on chest radiographs and on CT, often very similar in nature to each other. Observer error is the most significant one and comprises scanning error, recognition error, decision-making error, and satisfaction of search. Tumor characteristics such as lesion size, conspicuity, and location are also crucial in this context. Even technical aspects can contribute to the probability of missing lung cancer, including image quality and patient positioning and movement. Although it is hard to eliminate missed lung cancer completely, strategies to reduce observer error and methods to improve technique and automated detection may be valuable in reducing its likelihood. PMID:28206951

  11. An organizational approach to understanding patient safety and medical errors.

    PubMed

    Kaissi, Amer

    2006-01-01

    Progress in patient safety, or lack thereof, is a cause for great concern. In this article, we argue that the patient safety movement has failed to reach its goals of eradicating or, at least, significantly reducing errors because of an inappropriate focus on provider and patient-level factors with no real attention to the organizational factors that affect patient safety. We describe an organizational approach to patient safety using different organizational theory perspectives and make several propositions to push patient safety research and practice in a direction that is more likely to improve care processes and outcomes. From a Contingency Theory perspective, we suggest that health care organizations, in general, operate under a misfit between contingencies and structures. This misfit is mainly due to lack of flexibility, cost containment, and lack of regulations, thus explaining the high level of errors committed in these organizations. From an organizational culture perspective, we argue that health care organizations must change their assumptions, beliefs, values, and artifacts to change their culture from a culture of blame to a culture of safety and thus reduce medical errors. From an organizational learning perspective, we discuss how reporting, analyzing, and acting on error information can result in reduced errors in health care organizations.

  12. The influence of monetary punishment on cognitive control in abstinent cocaine-users*

    PubMed Central

    Hester, Robert; Bell, Ryan P.; Foxe, John J.; Garavan, Hugh

    2013-01-01

    Background Dependent drug users show a diminished neural response to punishment, in both limbic and cortical regions, though it remains unclear how such changes influence cognitive processes critical to addiction. To assess this relationship, we examined the influence of monetary punishment on inhibitory control and adaptive post-error behaviour in abstinent cocaine dependent (CD) participants. Methods 15 abstinent CD and 15 matched control participants performed a Go/No-go response inhibition task, which administered monetary fines for failed response inhibition, during collection of fMRI data. Results CD participants showed reduced inhibitory control and significantly less adaptive post-error slowing in response to punishment, when compared to controls. The diminished behavioural punishment sensitivity shown by CD participants was associated with significant hypoactive error-related BOLD responses in the dorsal anterior cingulate cortex (ACC), right insula and right prefrontal regions. Specifically, CD participants’ error-related response in these regions was not modulated by the presence of punishment, whereas control participants’ response showed a significant BOLD increase during punished errors. Conclusions CD participants showed a blunted response to failed control (errors) that was not modulated by punishment. Consistent with previous findings of reduced sensitivity to monetary loss in cocaine users, we further demonstrate that such insensitivity is associated with an inability to increase cognitive control in the face of negative consequences, a core symptom of addiction. The pattern of deficits in the CD group may have implications for interventions that attempt to improve cognitive control in drug dependent groups via positive/negative incentives. PMID:23791040

  13. The influence of monetary punishment on cognitive control in abstinent cocaine-users.

    PubMed

    Hester, Robert; Bell, Ryan P; Foxe, John J; Garavan, Hugh

    2013-11-01

    Dependent drug users show a diminished neural response to punishment, in both limbic and cortical regions, though it remains unclear how such changes influence cognitive processes critical to addiction. To assess this relationship, we examined the influence of monetary punishment on inhibitory control and adaptive post-error behavior in abstinent cocaine dependent (CD) participants. 15 abstinent CD and 15 matched control participants performed a Go/No-go response inhibition task, which administered monetary fines for failed response inhibition, during collection of fMRI data. CD participants showed reduced inhibitory control and significantly less adaptive post-error slowing in response to punishment, when compared to controls. The diminished behavioral punishment sensitivity shown by CD participants was associated with significant hypoactive error-related BOLD responses in the dorsal anterior cingulate cortex (ACC), right insula and right prefrontal regions. Specifically, CD participants' error-related response in these regions was not modulated by the presence of punishment, whereas control participants' response showed a significant BOLD increase during punished errors. CD participants showed a blunted response to failed control (errors) that was not modulated by punishment. Consistent with previous findings of reduced sensitivity to monetary loss in cocaine users, we further demonstrate that such insensitivity is associated with an inability to increase cognitive control in the face of negative consequences, a core symptom of addiction. The pattern of deficits in the CD group may have implications for interventions that attempt to improve cognitive control in drug dependent groups via positive/negative incentives. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.

  14. Simplified Approach Charts Improve Data Retrieval Performance

    PubMed Central

    Stewart, Michael; Laraway, Sean; Jordan, Kevin; Feary, Michael S.

    2016-01-01

    The effectiveness of different instrument approach charts to deliver minimum visibility and altitude information during airport equipment outages was investigated. Eighteen pilots flew simulated instrument approaches in three conditions: (a) normal operations using a standard approach chart (standard-normal), (b) equipment outage conditions using a standard approach chart (standard-outage), and (c) equipment outage conditions using a prototype decluttered approach chart (prototype-outage). Errors and retrieval times in identifying minimum altitudes and visibilities were measured. The standard-outage condition produced significantly more errors and longer retrieval times versus the standard-normal condition. The prototype-outage condition had significantly fewer errors and shorter retrieval times than did the standard-outage condition. The prototype-outage condition produced significantly fewer errors but similar retrieval times when compared with the standard-normal condition. Thus, changing the presentation of minima may reduce risk and increase safety in instrument approaches, specifically with airport equipment outages. PMID:28491009

  15. Dysfunctional error-related processing in female psychopathy

    PubMed Central

    Steele, Vaughn R.; Edwards, Bethany G.; Bernat, Edward M.; Calhoun, Vince D.; Kiehl, Kent A.

    2016-01-01

    Neurocognitive studies of psychopathy have predominantly focused on male samples. Studies have shown that female psychopaths exhibit similar affective deficits as their male counterparts, but results are less consistent across cognitive domains including response modulation. As such, there may be potential gender differences in error-related processing in psychopathic personality. Here we investigate response-locked event-related potential (ERP) components [the error-related negativity (ERN/Ne) related to early error-detection processes and the error-related positivity (Pe) involved in later post-error processing] in a sample of incarcerated adult female offenders (n = 121) who performed a response inhibition Go/NoGo task. Psychopathy was assessed using the Hare Psychopathy Checklist-Revised (PCL-R). The ERN/Ne and Pe were analyzed with classic windowed ERP components and principal component analysis (PCA). Consistent with previous research performed in psychopathic males, female psychopaths exhibited specific deficiencies in the neural correlates of post-error processing (as indexed by reduced Pe amplitude) but not in error monitoring (as indexed by intact ERN/Ne amplitude). Specifically, psychopathic traits reflecting interpersonal and affective dysfunction remained significant predictors of both time-domain and PCA measures reflecting reduced Pe mean amplitude. This is the first evidence to suggest that incarcerated female psychopaths exhibit similar dysfunctional post-error processing as male psychopaths. PMID:26060326

  16. The global burden of diagnostic errors in primary care

    PubMed Central

    Singh, Hardeep; Schiff, Gordon D; Graber, Mark L; Onakpoya, Igho; Thompson, Matthew J

    2017-01-01

    Diagnosis is one of the most important tasks performed by primary care physicians. The World Health Organization (WHO) recently prioritized patient safety areas in primary care, and included diagnostic errors as a high-priority problem. In addition, a recent report from the Institute of Medicine in the USA, ‘Improving Diagnosis in Health Care’, concluded that most people will likely experience a diagnostic error in their lifetime. In this narrative review, we discuss the global significance, burden and contributory factors related to diagnostic errors in primary care. We synthesize available literature to discuss the types of presenting symptoms and conditions most commonly affected. We then summarize interventions based on available data and suggest next steps to reduce the global burden of diagnostic errors. Research suggests that we are unlikely to find a ‘magic bullet’ and confirms the need for a multifaceted approach to understand and address the many systems and cognitive issues involved in diagnostic error. Because errors involve many common conditions and are prevalent across all countries, the WHO’s leadership at a global level will be instrumental to address the problem. Based on our review, we recommend that the WHO consider bringing together primary care leaders, practicing frontline clinicians, safety experts, policymakers, the health IT community, medical education and accreditation organizations, researchers from multiple disciplines, patient advocates, and funding bodies among others, to address the many common challenges and opportunities to reduce diagnostic error. This could lead to prioritization of practice changes needed to improve primary care as well as setting research priorities for intervention development to reduce diagnostic error. PMID:27530239

  17. Reduced discretization error in HZETRN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slaba, Tony C., E-mail: Tony.C.Slaba@nasa.gov; Blattnig, Steve R., E-mail: Steve.R.Blattnig@nasa.gov; Tweed, John, E-mail: jtweed@odu.edu

    2013-02-01

    The deterministic particle transport code HZETRN is an efficient analysis tool for studying the effects of space radiation on humans, electronics, and shielding materials. In a previous work, numerical methods in the code were reviewed, and new methods were developed that further improved efficiency and reduced overall discretization error. It was also shown that the remaining discretization error could be attributed to low energy light ions (A < 4) with residual ranges smaller than the physical step-size taken by the code. Accurately resolving the spectrum of low energy light particles is important in assessing risk associated with astronaut radiation exposure. In this work, modifications to the light particle transport formalism are presented that accurately resolve the spectrum of low energy light ion target fragments. The modified formalism is shown to significantly reduce overall discretization error and allows a physical approximation to be removed. For typical step-sizes and energy grids used in HZETRN, discretization errors for the revised light particle transport algorithms are shown to be less than 4% for aluminum and water shielding thicknesses as large as 100 g/cm² exposed to both solar particle event and galactic cosmic ray environments.

  18. High accuracy switched-current circuits using an improved dynamic mirror

    NASA Technical Reports Server (NTRS)

    Zweigle, G.; Fiez, T.

    1991-01-01

    The switched-current technique, a recently developed circuit approach to analog signal processing, has emerged as an alternative/complement to the well established switched-capacitor circuit technique. High speed switched-current circuits offer potential cost and power savings over slower switched-capacitor circuits. Accuracy improvements are a primary concern at this stage in the development of the switched-current technique. Use of the dynamic current mirror has produced circuits that are insensitive to transistor matching errors. The dynamic current mirror has been limited by other sources of error including clock-feedthrough and voltage transient errors. In this paper we present an improved switched-current building block using the dynamic current mirror. Utilizing current feedback, the errors due to current imbalance in the dynamic current mirror are reduced. Simulations indicate that this feedback can reduce total harmonic distortion by as much as 9 dB. Additionally, we have developed a clock-feedthrough reduction scheme for which simulations reveal a potential 10 dB total harmonic distortion improvement. The clock-feedthrough reduction scheme also significantly reduces offset errors and allows for cancellation with a constant current source. Experimental results confirm the simulated improvements.

  19. New Methods for Assessing and Reducing Uncertainty in Microgravity Studies

    NASA Astrophysics Data System (ADS)

    Giniaux, J. M.; Hooper, A. J.; Bagnardi, M.

    2017-12-01

    Microgravity surveying, also known as dynamic or 4D gravimetry, is a time-dependent geophysical method used to detect mass fluctuations within the shallow crust by analysing temporal changes in relative gravity measurements. We present here a detailed uncertainty analysis of temporal gravity measurements, considering for the first time all possible error sources, including tilt, error in drift estimations and timing errors. We find that some error sources that are typically ignored can have a significant impact on the total error budget, and it is therefore likely that some gravity signals may have been misinterpreted in previous studies. Our analysis leads to new methods for reducing some of the uncertainties associated with residual gravity estimation. In particular, we propose different approaches for drift estimation and free-air correction depending on the survey setup. We also provide formulae to recalculate uncertainties for past studies and lay out a framework for best practice in future studies. We demonstrate our new approach on volcanic case studies, which include Kilauea in Hawaii and Askja in Iceland.

  20. Altimeter error sources at the 10-cm performance level

    NASA Technical Reports Server (NTRS)

    Martin, C. F.

    1977-01-01

    Error sources affecting the calibration and operational use of a 10 cm altimeter are examined to determine the magnitudes of current errors and the investigations necessary to reduce them to acceptable bounds. Errors considered include those affecting operational data pre-processing and those affecting altitude bias determination, with error budgets developed for both. The most significant error sources affecting pre-processing are bias calibration, propagation corrections for the ionosphere, and measurement noise. No ionospheric models are currently validated at the required 10-25% accuracy level. The optimum smoothing to reduce the effects of measurement noise is investigated and found to be on the order of one second, based on the TASC model of geoid undulations. The 10 cm calibrations are found to be feasible only with altimeter passes at very high elevation over a tracking station that tracks very close to the time of the altimeter pass, such as a high-elevation pass across the island of Bermuda. By far the largest error source, based on the current state of the art, is the location of the island tracking station relative to mean sea level in the surrounding ocean areas.

  1. Conflict and performance monitoring throughout the lifespan: An event-related potential (ERP) and temporospatial component analysis.

    PubMed

    Clawson, Ann; Clayson, Peter E; Keith, Cierra M; Catron, Christina; Larson, Michael J

    2017-03-01

    Cognitive control includes higher-level cognitive processes used to evaluate environmental conflict. Given the importance of cognitive control in regulating behavior, understanding the developmental course of these processes may contribute to a greater understanding of normal and abnormal development. We examined behavioral (response times [RTs], error rates) and event-related potential data (N2, error-related negativity [ERN], correct-response negativity [CRN], error positivity [Pe]) during a flanker task in cross-sectional groups of 45 youth (ages 8-18), 52 younger adults (ages 20-28), and 58 older adults (ages 56-91). Younger adults displayed the most efficient processing, including significantly reduced CRN and N2 amplitude, increased Pe amplitude, and significantly better task performance than youth or older adults (e.g., faster RTs, fewer errors). Youth displayed larger CRN and N2, attenuated Pe, and significantly worse task performance than younger adults. Older adults fell either between youth and younger adults (e.g., CRN amplitudes, N2 amplitudes) or displayed neural and behavioral performance that was similar to youth (e.g., Pe amplitudes, error rates). These findings point to underdeveloped neural and cognitive processes early in life and reduced efficiency in older adulthood, contributing to poor implementation and modulation of cognitive control in response to conflict. Thus, cognitive control processing appears to reach peak performance and efficiency in younger adulthood, marked by improved task performance with less neural activation. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines

    PubMed Central

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J.; Raboso, Mariano

    2015-01-01

    Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation—based on a Gaussian Mixture Model (GMM) to separate the person from the background, masking—to reduce the dimensions of images—and binarization—to reduce the size of each image. An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements. PMID:26091392

  3. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines.

    PubMed

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J; Raboso, Mariano

    2015-06-17

    Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation (based on a Gaussian Mixture Model (GMM) to separate the person from the background), masking (to reduce the dimensions of the images) and binarization (to reduce the size of each image). An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements.
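    The preprocess-then-classify chain described above can be sketched end to end. The toy below is illustrative only: synthetic 8x8 "acoustic images" stand in for real sensor data, a fixed intensity threshold stands in for the paper's GMM-based segmentation, and a small hinge-loss SGD loop stands in for a production linear SVM; every name and parameter here is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_image(cls):
    """Synthetic stand-in for an 8x8 acoustic image: a bright blob whose
    position depends on the class, plus background noise."""
    img = rng.normal(0.0, 0.2, (8, 8))
    r, c = (1, 1) if cls == 0 else (6, 6)
    img[r-1:r+2, c-1:c+2] += 1.0
    return img

def preprocess(img):
    """Masking + binarization: keep pixels above a threshold (a simple
    stand-in for GMM-based segmentation), then flatten to a feature vector."""
    return (img > 0.5).astype(float).ravel()

X = np.array([preprocess(make_image(cls)) for cls in [0, 1] * 100])
y = np.array([1.0 if cls else -1.0 for cls in [0, 1] * 100])

# Linear SVM trained by subgradient descent on the hinge loss (Pegasos-style).
w, b, lam, lr = np.zeros(X.shape[1]), 0.0, 0.01, 0.01
for epoch in range(50):
    for i in rng.permutation(len(X)):
        w *= (1 - lr * lam)                  # L2 regularization shrinkage
        if y[i] * (X[i] @ w + b) < 1:        # hinge-loss subgradient step
            w += lr * y[i] * X[i]
            b += lr * y[i]

accuracy = np.mean(np.sign(X @ w + b) == y)
```

    With cleanly separable synthetic classes the linear SVM should classify essentially all images correctly; the point is only the shape of the pipeline (segment, binarize, flatten, classify), not the numbers.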

  4. Safety climate and attitude toward medication error reporting after hospital accreditation in South Korea.

    PubMed

    Lee, Eunjoo

    2016-09-01

    This study compared registered nurses' perceptions of safety climate and attitude toward medication error reporting before and after completing a hospital accreditation program. Medication errors are the most prevalent adverse events threatening patient safety; reducing underreporting of medication errors significantly improves patient safety. Safety climate in hospitals may affect medication error reporting. This study employed a longitudinal, descriptive design. Data were collected using questionnaires. A tertiary acute hospital in South Korea undergoing a hospital accreditation program. Nurses, pre- and post-accreditation (217 and 373); response rate: 58% and 87%, respectively. Hospital accreditation program. Perceived safety climate and attitude toward medication error reporting. The level of safety climate and attitude toward medication error reporting increased significantly following accreditation; however, measures of institutional leadership and management did not improve significantly. Participants' perception of safety climate was positively correlated with their attitude toward medication error reporting; this correlation strengthened following completion of the program. Improving hospitals' safety climate increased nurses' medication error reporting; interventions that help hospital administration and managers to provide more supportive leadership may facilitate safety climate improvement. Hospitals and their units should develop more friendly and intimate working environments that remove nurses' fear of penalties. Administration and managers should support nurses who report their own errors. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. Dynamically Hedging Oil and Currency Futures Using Receding Horizontal Control and Stochastic Programming

    NASA Astrophysics Data System (ADS)

    Cottrell, Paul Edward

    There is a lack of research in the area of hedging futures contracts, especially in illiquid or very volatile market conditions. It is important to understand the volatility of the oil and currency markets because reduced fluctuations in these markets could lead to better hedging performance. This study compared different hedging methods by using a hedging error metric, supplementing the Receding Horizontal Control and Stochastic Programming (RHCSP) method by utilizing the London Interbank Offered Rate with the Lévy process. The RHCSP hedging method was investigated to determine if improved hedging error was accomplished compared to the Black-Scholes, Leland, and Whalley and Wilmott methods when applied to simulated, oil, and currency futures markets. A modified RHCSP method was also investigated to determine if this method could significantly reduce hedging error under extreme market illiquidity conditions when applied to simulated, oil, and currency futures markets. This quantitative study used chaos theory and emergence for its theoretical foundation. An experimental research method was utilized for this study with a sample size of 506 hedging errors pertaining to historical and simulation data. The historical data were from January 1, 2005 through December 31, 2012. The modified RHCSP method was found to significantly reduce hedging error for the oil and currency market futures by the use of a 2-way ANOVA with a t test and post hoc Tukey test. This study promotes positive social change by identifying better risk controls for investment portfolios and illustrating how to benefit from high volatility in markets. Economists, professional investment managers, and independent investors could benefit from the findings of this study.
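    The hedging-error metric at the heart of such comparisons can be illustrated with a much simpler baseline than RHCSP: the terminal P&L of a discretely rebalanced Black-Scholes delta hedge on simulated geometric Brownian motion paths. This sketch is not the dissertation's method, and all market parameters below are invented; it only shows that hedging error shrinks as rebalancing becomes more frequent.

```python
import numpy as np
from math import erf, exp, sqrt

# Standard normal CDF built from math.erf, vectorized over numpy arrays.
Phi = np.vectorize(lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0))))

def call_delta(S, K, r, sigma, tau):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))
    return Phi(d1)

def hedging_error(n_rebalances, n_paths=4000, S0=100.0, K=100.0,
                  r=0.02, sigma=0.3, T=1.0, seed=1):
    """Terminal P&L per path of a discretely rebalanced delta hedge
    of one short European call: the per-path hedging error."""
    rng = np.random.default_rng(seed)
    dt = T / n_rebalances
    S = np.full(n_paths, S0)
    d = call_delta(S, K, r, sigma, T)
    # premium received for the call (Black-Scholes price)
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    price = float(S0 * Phi(d1) - K * exp(-r * T) * Phi(d1 - sigma * sqrt(T)))
    cash = price - d * S                     # finance the initial stock position
    for k in range(1, n_rebalances + 1):
        z = rng.standard_normal(n_paths)
        S = S * np.exp((r - 0.5 * sigma**2) * dt + sigma * sqrt(dt) * z)
        cash = cash * exp(r * dt)            # cash account accrues interest
        if k < n_rebalances:
            d_new = call_delta(S, K, r, sigma, T - k * dt)
            cash -= (d_new - d) * S          # rebalance at the current price
            d = d_new
    return d * S + cash - np.maximum(S - K, 0.0)

coarse = np.std(hedging_error(13))           # ~monthly rebalancing
fine = np.std(hedging_error(52))             # ~weekly rebalancing: smaller error
```

    The spread of the terminal P&L is the hedging-error metric; for this simple strategy it decays roughly as one over the square root of the number of rebalances.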

  6. The Effect of an Electronic Checklist on Critical Care Provider Workload, Errors, and Performance.

    PubMed

    Thongprayoon, Charat; Harrison, Andrew M; O'Horo, John C; Berrios, Ronaldo A Sevilla; Pickering, Brian W; Herasevich, Vitaly

    2016-03-01

    The strategy used to improve effective checklist use in intensive care unit (ICU) setting is essential for checklist success. This study aimed to test the hypothesis that an electronic checklist could reduce ICU provider workload, errors, and time to checklist completion, as compared to a paper checklist. This was a simulation-based study conducted at an academic tertiary hospital. All participants completed checklists for 6 ICU patients: 3 using an electronic checklist and 3 using an identical paper checklist. In both scenarios, participants had full access to the existing electronic medical record system. The outcomes measured were workload (defined using the National Aeronautics and Space Administration task load index [NASA-TLX]), the number of checklist errors, and time to checklist completion. Two independent clinician reviewers, blinded to participant results, served as the reference standard for checklist error calculation. Twenty-one ICU providers participated in this study. This resulted in the generation of 63 simulated electronic checklists and 63 simulated paper checklists. The median NASA-TLX score was 39 for the electronic checklist and 50 for the paper checklist (P = .005). The median number of checklist errors for the electronic checklist was 5, while the median number of checklist errors for the paper checklist was 8 (P = .003). The time to checklist completion was not significantly different between the 2 checklist formats (P = .76). The electronic checklist significantly reduced provider workload and errors without any measurable difference in the amount of time required for checklist completion. This demonstrates that electronic checklists are feasible and desirable in the ICU setting. © The Author(s) 2014.

  7. Information technology and medication safety: what is the benefit?

    PubMed Central

    Kaushal, R; Bates, D

    2002-01-01

    Medication errors occur frequently and have significant clinical and financial consequences. Several types of information technologies can be used to decrease rates of medication errors. Computerized physician order entry with decision support significantly reduces serious inpatient medication error rates in adults. Other available information technologies that may prove effective for inpatients include computerized medication administration records, robots, automated pharmacy systems, bar coding, "smart" intravenous devices, and computerized discharge prescriptions and instructions. In outpatients, computerization of prescribing and patient oriented approaches such as personalized web pages and delivery of web based information may be important. Public and private mandates for information technology interventions are growing, but further development, application, evaluation, and dissemination are required. PMID:12486992

  8. The global burden of diagnostic errors in primary care.

    PubMed

    Singh, Hardeep; Schiff, Gordon D; Graber, Mark L; Onakpoya, Igho; Thompson, Matthew J

    2017-06-01

    Diagnosis is one of the most important tasks performed by primary care physicians. The World Health Organization (WHO) recently prioritized patient safety areas in primary care, and included diagnostic errors as a high-priority problem. In addition, a recent report from the Institute of Medicine in the USA, 'Improving Diagnosis in Health Care ', concluded that most people will likely experience a diagnostic error in their lifetime. In this narrative review, we discuss the global significance, burden and contributory factors related to diagnostic errors in primary care. We synthesize available literature to discuss the types of presenting symptoms and conditions most commonly affected. We then summarize interventions based on available data and suggest next steps to reduce the global burden of diagnostic errors. Research suggests that we are unlikely to find a 'magic bullet' and confirms the need for a multifaceted approach to understand and address the many systems and cognitive issues involved in diagnostic error. Because errors involve many common conditions and are prevalent across all countries, the WHO's leadership at a global level will be instrumental to address the problem. Based on our review, we recommend that the WHO consider bringing together primary care leaders, practicing frontline clinicians, safety experts, policymakers, the health IT community, medical education and accreditation organizations, researchers from multiple disciplines, patient advocates, and funding bodies among others, to address the many common challenges and opportunities to reduce diagnostic error. This could lead to prioritization of practice changes needed to improve primary care as well as setting research priorities for intervention development to reduce diagnostic error. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  9. Charge renormalization at the large-D limit for N-electron atoms and weakly bound systems

    NASA Astrophysics Data System (ADS)

    Kais, S.; Bleil, R.

    1995-05-01

    We develop a systematic way to determine an effective nuclear charge ZRD such that the Hartree-Fock results will be significantly closer to the exact energies by utilizing the analytically known large-D limit energies. This method yields an expansion for the effective nuclear charge in powers of (1/D), which we have evaluated to the first order. This first order approximation to the desired effective nuclear charge has been applied to two-electron atoms with Z=2-20, and weakly bound systems such as H-. The errors for the two-electron atoms when compared with exact results were reduced from ~0.2% for Z=2 to ~0.002% for large Z. Although usual Hartree-Fock calculations for H- show this to be unstable, our results reduce the percent error of the Hartree-Fock energy from 7.6% to 1.86% and predicts the anion to be stable. For N-electron atoms (N=3-18, Z=3-28), using only the zeroth order approximation for the effective charge significantly reduces the error of Hartree-Fock calculations and recovers more than 80% of the correlation energy.

  10. The effects of error augmentation on learning to walk on a narrow balance beam.

    PubMed

    Domingo, Antoinette; Ferris, Daniel P

    2010-10-01

    Error augmentation during training has been proposed as a means to facilitate motor learning due to the human nervous system's reliance on performance errors to shape motor commands. We studied the effects of error augmentation on short-term learning of walking on a balance beam to determine whether it had beneficial effects on motor performance. Four groups of able-bodied subjects walked on a treadmill-mounted balance beam (2.5 cm wide) before and after 30 min of training. During training, two groups walked on the beam with a destabilization device that augmented error (Medium and High Destabilization groups). A third group walked on a narrower beam (1.27 cm wide) to augment error (Narrow). The fourth group practiced walking on the 2.5-cm balance beam (Wide). Subjects in the Wide group had significantly greater improvements after training than the error augmentation groups. The High Destabilization group had significantly smaller performance gains than the Narrow group in spite of similar failures per minute during training. In a follow-up experiment, a fifth group of subjects (Assisted) practiced with a device that greatly reduced catastrophic errors (i.e., stepping off the beam) but maintained similar pelvic movement variability. Performance gains were significantly greater in the Wide group than the Assisted group, indicating that catastrophic errors were important for short-term learning. We conclude that increasing errors during practice via destabilization and a narrower balance beam did not improve short-term learning of beam walking. In addition, the presence of qualitatively catastrophic errors seems to improve short-term learning of walking balance.

  11. Ultrasound transducer function: annual testing is not sufficient.

    PubMed

    Mårtensson, Mattias; Olsson, Mats; Brodin, Lars-Åke

    2010-10-01

    The objective was to follow up on the study 'High incidence of defective ultrasound transducers in use in routine clinical practice' and evaluate whether annual testing is sufficient to reduce the incidence of defective ultrasound transducers in routine clinical practice to an acceptable level. A total of 299 transducers were tested in 13 clinics at five hospitals in the Stockholm area. Approximately 7000-15,000 ultrasound examinations are carried out at these clinics every year. The transducers tested in the study had been tested and classified as fully operational 1 year earlier and had since been in normal use in routine clinical practice. The transducers were tested with the Sonora FirstCall Test System. There were 81 (27.1%) defective transducers found, giving a 95% confidence interval ranging from 22.1 to 32.1%. The most common transducer errors were 'delamination' of the ultrasound lens and 'break in the cable', which together constituted 82.7% of all transducer errors found. The highest error rate was found at the radiological clinics, with a mean error rate of 36.0%. There was a significant difference in error rate between two observed ways the clinics handled the transducers. There was no significant difference in the error rates of the transducer brands or the transducer models. Annual testing is not sufficient to reduce the incidence of defective ultrasound transducers in routine clinical practice to an acceptable level, and it is strongly advisable to create a user routine that minimizes the handling of the transducers.
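    The quoted interval follows from the standard normal (Wald) approximation for a binomial proportion; a quick check reproduces the reported 22.1-32.1% from 81 defective transducers out of 299:

```python
from math import sqrt

defective, n = 81, 299
p = defective / n                       # observed proportion of defective transducers
se = sqrt(p * (1 - p) / n)              # standard error of a binomial proportion
lo, hi = p - 1.96 * se, p + 1.96 * se   # normal-approximation 95% CI

print(f"{100*p:.1f}% ({100*lo:.1f}-{100*hi:.1f}%)")   # 27.1% (22.1-32.1%)
```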

  12. Realtime mitigation of GPS SA errors using Loran-C

    NASA Technical Reports Server (NTRS)

    Braasch, Soo Y.

    1994-01-01

    The hybrid use of Loran-C with the Global Positioning System (GPS) was shown capable of providing a sole means of en route air radionavigation. By allowing pilots to fly direct to their destinations, use of this system results in significant time savings and therefore fuel savings as well. However, a major error source limiting the accuracy of GPS is the intentional degradation of the GPS signal known as Selective Availability (SA). SA-induced position errors are highly correlated and far exceed all other error sources (horizontal position error: 100 meters, 95 percent). Realtime mitigation of SA errors from the position solution is highly desirable, and how that can be achieved is discussed. The stability of Loran-C signals is exploited to reduce SA errors. The theory behind this technique is discussed and results using bench and flight data are given.

  13. New architecture for dynamic frame-skipping transcoder.

    PubMed

    Fung, Kai-Tat; Chan, Yui-Lam; Siu, Wan-Chi

    2002-01-01

    Transcoding is a key technique for reducing the bit rate of a previously compressed video signal. A high transcoding ratio may result in an unacceptable picture quality when the full frame rate of the incoming video bitstream is used. Frame skipping is often used as an efficient scheme to allocate more bits to the representative frames, so that an acceptable quality for each frame can be maintained. However, the skipped frame must be decompressed completely, since it may serve as a reference frame for the reconstruction of nonskipped frames. The newly quantized discrete cosine transform (DCT) coefficients of the prediction errors need to be recomputed for the nonskipped frame with reference to the previous nonskipped frame; this can create undesirable complexity as well as introduce re-encoding errors. In this paper, we propose new algorithms and a novel architecture for frame-rate reduction to improve picture quality and to reduce complexity. The proposed architecture operates mainly in the DCT domain to achieve a transcoder with low complexity. With the direct addition of DCT coefficients and an error compensation feedback loop, re-encoding errors are reduced significantly. Furthermore, we propose a frame-rate control scheme which can dynamically adjust the number of skipped frames according to the incoming motion vectors and re-encoding errors due to transcoding, such that the decoded sequence can have smooth motion as well as better transcoded pictures. Experimental results show that, as compared to the conventional transcoder, the new architecture for the frame-skipping transcoder is more robust, produces fewer requantization errors, and has reduced computational complexity.
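    An error compensation feedback loop of this general kind is, at its core, error diffusion applied to requantized coefficients: the quantization error left over from each frame is carried forward and added in before the next frame's coefficient is requantized, so re-encoding error cannot accumulate across frames. The single-coefficient sketch below is an illustration of that principle with invented numbers, not the paper's architecture:

```python
import numpy as np

def requantize(coeffs, q, feedback=True):
    """Requantize one DCT coefficient per frame with step q. With feedback,
    each frame's quantization error is carried into the next frame, so the
    accumulated drift never exceeds q/2 (error diffusion)."""
    out, carry = [], 0.0
    for c in coeffs:
        target = c + (carry if feedback else 0.0)
        y = q * round(target / q)        # coarser requantization
        carry = target - y               # residual fed back into the next frame
        out.append(y)
    return np.array(out)

q = 1.0
coeffs = np.full(50, 0.4)                # every frame rounds to 0 without feedback
drift_open_loop = abs(coeffs.sum() - requantize(coeffs, q, feedback=False).sum())
drift_feedback = abs(coeffs.sum() - requantize(coeffs, q, feedback=True).sum())
# drift_open_loop grows with the number of frames (about 20 here), while
# drift_feedback stays bounded by q/2 regardless of sequence length.
```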

  14. Reduced vision and refractive errors, results from a school vision screening program in Kanchanpur District of far western Nepal.

    PubMed

    Awasthi, S; Pant, B P; Dhakal, H P

    2010-01-01

    At present, no data are available on reduced vision and refractive errors in school children of far western Nepal, so school screening records were used to obtain data useful for planning refractive services. Data are provided from school screening conducted by Geta Eye Hospital during February/March 2008. The cases with complete data sets on visual acuity, refractive error and age were included and analyzed using computer software. Of 1165 children (mean age 11.6 ± 2.5 years) examined, 98.8% (n = 1151) had uncorrected visual acuity of 6/9 or better in at least one eye, whereas 1.2% (n = 14) had acuity of 6/12 or worse in both eyes. Among them, either eye of 9 children improved to 6/9 or better with correction. However, visual acuity remained 6/12 or worse in both eyes of 5 children even after correction. There were 24 children with refractive errors (myopia, 1.54%; n = 18 and hypermetropia, 0.51%; n = 6) in at least one eye. The spherical equivalent refraction did not differ significantly with age or gender. The prevalence of reduced vision and refractive errors among school children of this semi-rural district was low.
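    The quoted percentages follow directly from the reported counts; a quick check (the tiny discrepancies at the second decimal suggest the authors truncated rather than rounded):

```python
examined = 1165
good_va, low_va = 1151, 14      # 6/9 or better in >= 1 eye; 6/12 or worse in both
myopia, hyperm = 18, 6          # children with each refractive error

pct = lambda k: 100.0 * k / examined
print(f"{pct(good_va):.1f}%")   # 98.8%
print(f"{pct(low_va):.1f}%")    # 1.2%
print(f"{pct(myopia):.2f}%")    # 1.55% (abstract reports 1.54%, apparently truncated)
print(f"{pct(hyperm):.2f}%")    # 0.52% (abstract reports 0.51%)
```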

  15. Error Reduction Methods for Integrated-path Differential-absorption Lidar Measurements

    NASA Technical Reports Server (NTRS)

    Chen, Jeffrey R.; Numata, Kenji; Wu, Stewart T.

    2012-01-01

    We report new modeling and error reduction methods for differential-absorption optical-depth (DAOD) measurements of atmospheric constituents using direct-detection integrated-path differential-absorption lidars. Errors from laser frequency noise are quantified in terms of the line center fluctuation and spectral line shape of the laser pulses, revealing relationships verified experimentally. A significant DAOD bias is removed by introducing a correction factor. Errors from surface height and reflectance variations can be reduced to tolerable levels by incorporating altimetry knowledge and "log after averaging", or by pointing the laser and receiver to a fixed surface spot during each wavelength cycle to shorten the time of "averaging before log".
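    The advantage of "log after averaging" can be seen in a toy simulation: with mean-one multiplicative noise on the per-shot energy ratios, averaging the per-shot logs biases the differential optical depth low by about sigma^2/2 (Jensen's inequality), while taking the log of the averaged ratio does not. All numbers below are invented for illustration and are not the paper's instrument parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

true_daod = 1.2                  # hypothetical differential optical depth
n_shots = 200_000
sigma = 0.3                      # multiplicative pulse-energy noise level

# Per-shot off-line/on-line energy ratio with mean-one lognormal noise
# (the -sigma^2/2 term makes E[noise] = 1 exactly).
noise = np.exp(sigma * rng.standard_normal(n_shots) - 0.5 * sigma**2)
ratio = np.exp(true_daod) * noise

avg_of_logs = np.mean(np.log(ratio))     # biased low by ~sigma^2/2
log_after_avg = np.log(np.mean(ratio))   # bias vanishes as shots accumulate
```

    With these settings the average of per-shot logs sits roughly 0.045 below the true value, while the log of the averaged ratio lands much closer; this is the motivation for averaging before taking the log.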

  16. A bundle with a preformatted medical order sheet and an introductory course to reduce prescription errors in neonates.

    PubMed

    Palmero, David; Di Paolo, Ermindo R; Beauport, Lydie; Pannatier, André; Tolsa, Jean-François

    2016-01-01

    The objective of this study was to assess whether the introduction of a new preformatted medical order sheet coupled with an introductory course affected prescription quality and the frequency of errors during the prescription stage in a neonatal intensive care unit (NICU). This was a two-phase observational study consisting of two consecutive 4-month phases, pre-intervention (phase 0) and post-intervention (phase I), conducted in an 11-bed NICU in a Swiss university hospital. Interventions consisted of the introduction of a new preformatted medical order sheet with explicit information supplied, coupled with a staff introductory course on appropriate prescription and medication errors. The main outcomes measured were formal aspects of prescription and the frequency and nature of prescription errors. Eighty-three and 81 patients were included in phase 0 and phase I, respectively. A total of 505 handwritten prescriptions in phase 0 and 525 in phase I were analysed. The rate of prescription errors decreased significantly, from 28.9% in phase 0 to 13.5% in phase I (p < 0.05). Compared with phase 0, dose errors, name confusion, and errors in frequency and rate of drug administration decreased in phase I, from 5.4 to 2.7% (p < 0.05), 5.9 to 0.2% (p < 0.05), 3.6 to 0.2% (p < 0.05), and 4.7 to 2.1% (p < 0.05), respectively. The rates of incomplete and ambiguous prescriptions decreased from 44.2 to 25.7% and from 8.5 to 3.2% (p < 0.05), respectively. Inexpensive and simple interventions can improve the intelligibility of prescriptions and reduce medication errors. Medication errors are frequent in NICUs, and prescription is one of the most critical steps. Computerized physician order entry (CPOE) reduces prescription errors, but it is not available everywhere. A preformatted medical order sheet coupled with an introductory course decreases medication errors in a NICU and is an inexpensive and readily implemented alternative to CPOE.

  17. Suppression of vapor cell temperature error for spin-exchange-relaxation-free magnetometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Jixi, E-mail: lujixi@buaa.edu.cn; Qian, Zheng; Fang, Jiancheng

    2015-08-15

    This paper presents a method to reduce the vapor cell temperature error of the spin-exchange-relaxation-free (SERF) magnetometer. Fluctuation of the cell temperature can induce variations of the optical rotation angle, resulting in a scale factor error of the SERF magnetometer. In order to suppress this error, we employ the variation of the probe beam absorption to offset the variation of the optical rotation angle. Our theoretical discussion indicates that the scale factor error introduced by fluctuation of the cell temperature can be suppressed by setting the optical depth close to one. In our experiment, we adjust the probe frequency to obtain various optical depths and then measure the variation of the scale factor with respect to the corresponding cell temperature changes. Our experimental results show good agreement with our theoretical analysis. Under our experimental conditions, the error is reduced significantly compared with the case in which the probe wavelength is adjusted to maximize the probe signal. The cost of this method is a reduction of the scale factor of the magnetometer; however, according to our analysis, this has only a minor effect on sensitivity under proper operating parameters.
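The operating point can be seen in a toy absorption model (an illustrative assumption, not the paper's full derivation): if the rotation signal grows linearly with atomic density n while probe absorption attenuates it as exp(-OD) with OD proportional to n, the net signal n·exp(-c·n) is stationary with respect to density (and hence temperature) fluctuations exactly where OD = 1.

```python
import numpy as np

n0 = 1.0   # nominal atomic density (arbitrary units)
dn = 0.01  # fractional density change caused by a temperature drift

def signal(n, c):
    """Toy probe signal: rotation ~ n, attenuated by optical depth OD = c * n."""
    return n * np.exp(-c * n)

# Fractional signal change for optical depths below, at, and above one.
for c in (0.2, 1.0, 3.0):
    sens = (signal(n0 + dn, c) - signal(n0, c)) / signal(n0, c)
    print(f"OD = {c * n0:.1f}: fractional signal change {sens:+.5f}")
```

At OD = 1 the first-order sensitivity to a density change vanishes, leaving only a much smaller second-order residual, which mirrors the paper's prescription of setting the optical depth close to one.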

  18. Analysis of using EMG and mechanical sensors to enhance intent recognition in powered lower limb prostheses

    NASA Astrophysics Data System (ADS)

    Young, A. J.; Kuiken, T. A.; Hargrove, L. J.

    2014-10-01

    Objective. The purpose of this study was to determine the contribution of electromyography (EMG) data, in combination with a diverse array of mechanical sensors, to locomotion mode intent recognition in transfemoral amputees using powered prostheses. Additionally, we determined the effect of adding time history information using a dynamic Bayesian network (DBN) for both the mechanical and EMG sensors. Approach. EMG signals from the residual limbs of amputees have been proposed to enhance pattern recognition-based intent recognition systems for powered lower limb prostheses, but mechanical sensors on the prosthesis—such as inertial measurement units, position and velocity sensors, and load cells—may be just as useful. EMG and mechanical sensor data were collected from 8 transfemoral amputees using a powered knee/ankle prosthesis over basic locomotion modes such as walking, slopes and stairs. An offline study was conducted to determine the benefit of different sensor sets for predicting intent. Main results. EMG information was not as accurate alone as mechanical sensor information (p < 0.05) for any classification strategy. However, EMG in combination with the mechanical sensor data did significantly reduce intent recognition errors (p < 0.05) both for transitions between locomotion modes and steady-state locomotion. The sensor time history (DBN) classifier significantly reduced error rates compared to a linear discriminant classifier for steady-state steps, without increasing the transitional error, for both EMG and mechanical sensors. Combining EMG and mechanical sensor data with sensor time history reduced the average transitional error from 18.4% to 12.2% and the average steady-state error from 3.8% to 1.0% when classifying level-ground walking, ramps, and stairs in eight transfemoral amputee subjects. Significance. 
These results suggest that a neural interface in combination with time history methods for locomotion mode classification can enhance intent recognition performance; this strategy should be considered for future real-time experiments.
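To make the fusion comparison concrete, here is a toy two-class linear discriminant on synthetic features (made-up Gaussians, not the study's data, classifier implementation, or DBN): appending a complementary EMG-like feature to mechanical-sensor features lowers the held-out error rate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-mode problem: mechanical features give partial separation;
# a hypothetical EMG feature adds a complementary discriminative axis.
n = 1000
mech = np.vstack([rng.normal(0.0, 1.0, (n, 2)), rng.normal(1.2, 1.0, (n, 2))])
emg = np.vstack([rng.normal(0.0, 1.0, (n, 1)), rng.normal(2.0, 1.0, (n, 1))])
y = np.array([0] * n + [1] * n)

def lda_error(X, y):
    """Fit a two-class LDA on half the data; return error on the other half."""
    idx = rng.permutation(len(y))
    tr, te = idx[: len(y) // 2], idx[len(y) // 2:]
    mu0 = X[tr][y[tr] == 0].mean(axis=0)
    mu1 = X[tr][y[tr] == 1].mean(axis=0)
    cov = np.cov(X[tr].T) + 1e-6 * np.eye(X.shape[1])
    w = np.linalg.solve(cov, mu1 - mu0)      # discriminant direction
    thresh = w @ (mu0 + mu1) / 2.0           # midpoint decision threshold
    pred = (X[te] @ w > thresh).astype(int)
    return float(np.mean(pred != y[te]))

err_mech = lda_error(mech, y)
err_both = lda_error(np.hstack([mech, emg]), y)
print(err_mech, err_both)
```

The fused feature set separates the classes along an extra axis, so the pooled-covariance discriminant achieves a lower misclassification rate, the same qualitative effect the study reports for EMG plus mechanical sensing.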

  19. The thinking doctor: clinical decision making in contemporary medicine.

    PubMed

    Trimble, Michael; Hamilton, Paul

    2016-08-01

    Diagnostic errors are responsible for a significant number of adverse events. Logical reasoning and good decision-making skills are key factors in reducing such errors, but little emphasis has traditionally been placed on how these thought processes occur, and how errors could be minimised. In this article, we explore key cognitive ideas that underpin clinical decision making and suggest that by employing some simple strategies, physicians might be better able to understand how they make decisions and how the process might be optimised. © 2016 Royal College of Physicians.

  20. Reduction in pediatric identification band errors: a quality collaborative.

    PubMed

    Phillips, Shannon Connor; Saysana, Michele; Worley, Sarah; Hain, Paul D

    2012-06-01

    Accurate and consistent placement of a patient identification (ID) band is used in health care to reduce errors associated with patient misidentification. Multiple safety organizations have devoted time and energy to improving patient ID, but no multicenter improvement collaboratives have shown scalability of previously successful interventions. We aimed to halve the pediatric patient ID band error rate, defined as an absent, illegible, or inaccurate ID band, across a quality improvement learning collaborative of hospitals in 1 year. On the basis of a previously successful single-site intervention, we conducted a self-selected 6-site collaborative to reduce ID band errors in heterogeneous pediatric hospital settings. The collaborative had 3 phases: preparatory work and an employee survey of current practice and barriers, data collection (ID band failure rate), and intervention driven by data and collaborative learning to accelerate change. The collaborative audited 11,377 patients for ID band errors between September 2009 and September 2010. The ID band failure rate decreased from 17% to 4.1% (a 77% relative reduction). Interventions applied at all institutions included education of frontline staff regarding correct ID bands as a safety strategy; a change to softer ID bands, including "luggage tag" type ID bands for some patients; and partnering with families and patients through education. Over 13 months, a collaborative of pediatric institutions significantly reduced the ID band failure rate. This quality improvement learning collaborative demonstrates that safety improvements tested in a single institution can be disseminated to improve quality of care across large populations of children.

  1. On the dosimetric effect and reduction of inverse consistency and transitivity errors in deformable image registration for dose accumulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, Edward T.; Hardcastle, Nicholas; Tome, Wolfgang A.

    2012-01-15

    Purpose: Deformable image registration (DIR) is necessary for accurate dose accumulation between multiple radiotherapy image sets. DIR algorithms can suffer from inverse and transitivity inconsistencies. When using deformation vector fields (DVFs) that exhibit inverse inconsistency and are nontransitive, dose accumulation on a given image set via different image pathways will lead to different accumulated doses. The purpose of this study was to investigate the dosimetric effect of inverse consistency and transitivity errors and to propose a postprocessing solution that reduces them. Methods: Four MVCT images and four phases of a lung 4DCT, each with an associated calculated dose, were selected for analysis. DVFs between all four images in each data set were created using the Fast Symmetric Demons algorithm. Dose was accumulated on the fourth image in each set using DIR via two different image pathways, and the two accumulated doses on the fourth image were compared. The inverse consistency and transitivity errors in the DVFs were then reduced, the dose accumulation was repeated using the processed DVFs, and the results were compared with the accumulated dose from the original DVFs. To evaluate the influence of the postprocessing technique on DVF accuracy, the original and processed DVF accuracy was evaluated on the lung 4DCT data, on which anatomical landmarks had been identified by an expert. Results: Dose accumulation to the same image via different image pathways resulted in two different accumulated dose distributions. After the inverse consistency errors were reduced, the difference between the accumulated doses diminished; the difference was further reduced after reducing the transitivity errors. The postprocessing technique had minimal effect on the accuracy of the DVF for the lung 4DCT images. 
Conclusions: This study shows that inverse consistency and transitivity errors in DIR have a significant dosimetric effect in dose accumulation: depending on the image pathway taken to accumulate the dose, different results may be obtained. A postprocessing technique that reduces inverse consistency and transitivity errors is presented, which allows for consistent dose accumulation regardless of the image pathway followed.
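In one dimension, the inverse-consistency error and its reduction can be sketched as follows (a toy analytic deformation, not the Fast Symmetric Demons DVFs used in the study): composing the forward map with an imperfect backward map leaves a residual displacement, and replacing the backward map with a fixed-point inverse of the forward map drives that residual toward zero.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 201)

def forward(p):
    """Toy forward deformation (A -> B)."""
    return p + 0.05 * np.sin(2.0 * np.pi * p)

def backward_naive(p):
    """Imperfect backward map (B -> A), as an independent DIR run might give."""
    return p - 0.05 * np.sin(2.0 * np.pi * p)

def backward_fixed_point(p, iters=50):
    """Numerically invert `forward` by fixed-point iteration."""
    q = p.copy()
    for _ in range(iters):
        q = p - 0.05 * np.sin(2.0 * np.pi * q)
    return q

# Inverse-consistency error: residual displacement after forward then backward.
ice_raw = np.max(np.abs(backward_naive(forward(x)) - x))
ice_fixed = np.max(np.abs(backward_fixed_point(forward(x)) - x))
print(ice_raw, ice_fixed)
```

The naive backward field leaves a nonzero round-trip residual, while the postprocessed (consistent) inverse reduces it to numerical noise, so accumulating a quantity along either pathway gives the same answer.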

  2. Regional alveolar partial pressure of oxygen measurement with parallel accelerated hyperpolarized gas MRI.

    PubMed

    Kadlecek, Stephen; Hamedani, Hooman; Xu, Yinan; Emami, Kiarash; Xin, Yi; Ishii, Masaru; Rizi, Rahim

    2013-10-01

    Alveolar oxygen tension (Pao2) is sensitive to the interplay between local ventilation, perfusion, and alveolar-capillary membrane permeability, and thus reflects physiologic heterogeneity of healthy and diseased lung function. Several hyperpolarized helium ((3)He) magnetic resonance imaging (MRI)-based Pao2 mapping techniques have been reported, and considerable effort has gone toward reducing Pao2 measurement error. We present a new Pao2 imaging scheme, using parallel accelerated MRI, which significantly reduces measurement error. The proposed Pao2 mapping scheme was computer-simulated and was tested on both phantoms and five human subjects. Where possible, correspondence between actual local oxygen concentration and derived values was assessed for both bias (deviation from the true mean) and imaging artifact (deviation from the true spatial distribution). Phantom experiments demonstrated a significantly reduced coefficient of variation using the accelerated scheme. Simulation results support this observation and predict that correspondence between the true spatial distribution and the derived map is always superior using the accelerated scheme, although the improvement becomes less significant as the signal-to-noise ratio increases. Paired measurements in the human subjects, comparing accelerated and fully sampled schemes, show a reduced Pao2 distribution width for 41 of 46 slices. In contrast to proton MRI, acceleration of hyperpolarized imaging has no signal-to-noise penalty; its use in Pao2 measurement is therefore always beneficial. Comparison of multiple schemes shows that the benefit arises from a longer time-base during which oxygen-induced depolarization modifies the signal strength. Demonstration of the accelerated technique in human studies shows the feasibility of the method and suggests that measurement error is reduced here as well, particularly at low signal-to-noise levels. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.

  3. Accounting for measurement error in log regression models with applications to accelerated testing.

    PubMed

    Richardson, Robert; Tolley, H Dennis; Evenson, William E; Lunt, Barry M

    2018-01-01

    In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.
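A generic version of the weighting step can be sketched as follows (made-up heteroscedastic data and simple inverse-variance weights; the paper's measurement-error adjustment and reduced Eyring model are more involved):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical accelerated-test-style data: response noise grows with the mean,
# so ordinary least squares uses the wrong weights.
stress = rng.uniform(1.0, 3.0, 200)
X = np.column_stack([np.ones_like(stress), stress])
beta_true = np.array([8.0, -1.5])
mean = X @ beta_true
y = mean + rng.standard_normal(200) * (0.1 * mean)  # std. dev. proportional to mean

def irls(X, y, iters=20):
    """Iteratively Re-weighted Least Squares with weights 1 / fitted^2."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]     # OLS starting point
    for _ in range(iters):
        w = 1.0 / np.maximum(X @ beta, 1e-8) ** 2   # inverse-variance weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta

beta_hat = irls(X, y)
print(beta_hat)
```

Each pass refits the weighted least-squares problem with weights recomputed from the current fitted values, which is the basic mechanism the weighted-regression approximation in the paper relies on.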

  4. The impact of computerized physician order entry on prescription orders: A quasi-experimental study in Iran

    PubMed Central

    Khammarnia, Mohammad; Sharifian, Roxana; Zand, Farid; Barati, Omid; Keshtkaran, Ali; Sabetian, Golnar; Shahrokh, Nasim; Setoodezadeh, Fatemeh

    2017-01-01

    Background: One way to reduce medical errors associated with physician orders is computerized physician order entry (CPOE) software. This study was conducted to compare prescription orders between 2 groups before and after CPOE implementation in a hospital. Methods: We conducted a before-after prospective study in 2 intensive care unit (ICU) wards (intervention and control wards) in the largest tertiary public hospital in the south of Iran during 2014 and 2016. All prescription orders were validated by a clinical pharmacist and an ICU physician. The rates of errors in medical orders were compared before (manual ordering) and after implementation of the CPOE. A standard checklist was used for data collection. For the data analysis, SPSS version 21, descriptive statistics, and analytical tests such as McNemar, chi-square, and logistic regression were used. Results: The CPOE significantly decreased 2 types of errors, illegible orders and failure to write the drug form, in the intervention ward compared to the control ward (p < 0.05); however, 2 error types increased because of defects in the CPOE (p < 0.001). The use of CPOE decreased prescription errors from 19% to 3% (p = 0.001), whereas no difference was observed in the control ward (p > 0.05). In addition, more errors occurred in the morning shift (p < 0.001). Conclusion: In general, the use of CPOE significantly reduced prescription errors. Nonetheless, more caution should be exercised in the use of this system, and its deficiencies should be resolved. Furthermore, it is recommended that CPOE be used to improve the quality of delivered services in hospitals. PMID:29445698

  5. The impact of computerized physician order entry on prescription orders: A quasi-experimental study in Iran.

    PubMed

    Khammarnia, Mohammad; Sharifian, Roxana; Zand, Farid; Barati, Omid; Keshtkaran, Ali; Sabetian, Golnar; Shahrokh, Nasim; Setoodezadeh, Fatemeh

    2017-01-01

    Background: One way to reduce medical errors associated with physician orders is computerized physician order entry (CPOE) software. This study was conducted to compare prescription orders between 2 groups before and after CPOE implementation in a hospital. Methods: We conducted a before-after prospective study in 2 intensive care unit (ICU) wards (intervention and control wards) in the largest tertiary public hospital in the south of Iran during 2014 and 2016. All prescription orders were validated by a clinical pharmacist and an ICU physician. The rates of errors in medical orders were compared before (manual ordering) and after implementation of the CPOE. A standard checklist was used for data collection. For the data analysis, SPSS version 21, descriptive statistics, and analytical tests such as McNemar, chi-square, and logistic regression were used. Results: The CPOE significantly decreased 2 types of errors, illegible orders and failure to write the drug form, in the intervention ward compared to the control ward (p < 0.05); however, 2 error types increased because of defects in the CPOE (p < 0.001). The use of CPOE decreased prescription errors from 19% to 3% (p = 0.001), whereas no difference was observed in the control ward (p > 0.05). In addition, more errors occurred in the morning shift (p < 0.001). Conclusion: In general, the use of CPOE significantly reduced prescription errors. Nonetheless, more caution should be exercised in the use of this system, and its deficiencies should be resolved. Furthermore, it is recommended that CPOE be used to improve the quality of delivered services in hospitals.

  6. Mindful Reading: Mindfulness Meditation Helps Keep Readers with Dyslexia and ADHD on the Lexical Track.

    PubMed

    Tarrasch, Ricardo; Berman, Zohar; Friedmann, Naama

    2016-01-01

    This study explored the effects of a Mindfulness-Based Stress Reduction (MBSR) intervention on reading, attention, and psychological well-being among people with developmental dyslexia and/or attention deficits. Various types of dyslexia exist, characterized by different error types. We examined a question that has not been tested so far: which types of errors (and dyslexias) are affected by MBSR training. To do so, we tested, using an extensive battery of reading tests, whether each participant had dyslexia and which error types he or she made, and then compared the rate of each error type before and after the MBSR workshop. We used a similar approach to attention disorders: we evaluated the participants' sustained, selective, executive, and orienting of attention to assess whether they had attention disorders and, if so, which functions were impaired. We then evaluated the effect of MBSR on each of the attention functions. Psychological measures including mindfulness, stress, reflection and rumination, life satisfaction, depression, anxiety, and sleep disturbances were also evaluated. Nineteen Hebrew readers completed a 2-month mindfulness workshop. The results showed that whereas reading errors of letter migrations within and between words and vowel-letter errors did not decrease following the workshop, most participants made fewer reading errors in general following the workshop, with a significant reduction of 19% from their original number of errors. This decrease mainly resulted from a decrease in errors that occur due to reading via the sublexical rather than the lexical route. It seems, therefore, that mindfulness helped reading by keeping the readers on the lexical route. 
This improvement in reading probably resulted from improved sustained attention: the reduction in sublexical reading was significant for the dyslexic participants who also had attention deficits, and there were significant correlations between reduced reading errors and decreases in impulsivity. Following the meditation workshop, the rate of commission errors decreased, indicating decreased impulsivity, and the variation in RTs in the CPT task decreased, indicating improved sustained attention. Significant improvements were obtained in participants' mindfulness, perceived stress, rumination, depression, state anxiety, and sleep disturbances. Correlations were also obtained between reading improvement and increased mindfulness following the workshop. Thus, whereas mindfulness training did not affect specific types of errors and did not improve the dyslexia itself, it did improve the reading of adults with developmental dyslexia and ADHD by helping them stay on the straight path of the lexical route while reading. The reading improvement induced by mindfulness thus sheds light on the intricate relation between attention and reading: mindfulness reduced impulsivity and improved sustained attention, and this, in turn, improved reading.

  7. Mindful Reading: Mindfulness Meditation Helps Keep Readers with Dyslexia and ADHD on the Lexical Track

    PubMed Central

    Tarrasch, Ricardo; Berman, Zohar; Friedmann, Naama

    2016-01-01

    This study explored the effects of a Mindfulness-Based Stress Reduction (MBSR) intervention on reading, attention, and psychological well-being among people with developmental dyslexia and/or attention deficits. Various types of dyslexia exist, characterized by different error types. We examined a question that has not been tested so far: which types of errors (and dyslexias) are affected by MBSR training. To do so, we tested, using an extensive battery of reading tests, whether each participant had dyslexia and which error types he or she made, and then compared the rate of each error type before and after the MBSR workshop. We used a similar approach to attention disorders: we evaluated the participants' sustained, selective, executive, and orienting of attention to assess whether they had attention disorders and, if so, which functions were impaired. We then evaluated the effect of MBSR on each of the attention functions. Psychological measures including mindfulness, stress, reflection and rumination, life satisfaction, depression, anxiety, and sleep disturbances were also evaluated. Nineteen Hebrew readers completed a 2-month mindfulness workshop. The results showed that whereas reading errors of letter migrations within and between words and vowel-letter errors did not decrease following the workshop, most participants made fewer reading errors in general following the workshop, with a significant reduction of 19% from their original number of errors. This decrease mainly resulted from a decrease in errors that occur due to reading via the sublexical rather than the lexical route. It seems, therefore, that mindfulness helped reading by keeping the readers on the lexical route. 
This improvement in reading probably resulted from improved sustained attention: the reduction in sublexical reading was significant for the dyslexic participants who also had attention deficits, and there were significant correlations between reduced reading errors and decreases in impulsivity. Following the meditation workshop, the rate of commission errors decreased, indicating decreased impulsivity, and the variation in RTs in the CPT task decreased, indicating improved sustained attention. Significant improvements were obtained in participants' mindfulness, perceived stress, rumination, depression, state anxiety, and sleep disturbances. Correlations were also obtained between reading improvement and increased mindfulness following the workshop. Thus, whereas mindfulness training did not affect specific types of errors and did not improve the dyslexia itself, it did improve the reading of adults with developmental dyslexia and ADHD by helping them stay on the straight path of the lexical route while reading. The reading improvement induced by mindfulness thus sheds light on the intricate relation between attention and reading: mindfulness reduced impulsivity and improved sustained attention, and this, in turn, improved reading. PMID:27242565

  8. Flight Test Results: CTAS Cruise/Descent Trajectory Prediction Accuracy for En route ATC Advisories

    NASA Technical Reports Server (NTRS)

    Green, S.; Grace, M.; Williams, D.

    1999-01-01

    The Center/TRACON Automation System (CTAS), under development at NASA Ames Research Center, is designed to assist controllers with the management and control of air traffic transitioning to/from congested airspace. This paper focuses on the transition from the en route environment to high-density terminal airspace under a time-based arrival-metering constraint. Two flight tests were conducted at the Denver Air Route Traffic Control Center (ARTCC) to study trajectory-prediction accuracy, the key to accurate Decision Support Tool advisories such as conflict detection/resolution and fuel-efficient metering conformance. In collaboration with NASA Langley Research Center, these tests were part of an overall effort to research systems and procedures for the integration of CTAS and flight management systems (FMS). The Langley Transport Systems Research Vehicle, a Boeing 737 airplane, flew a combined total of 58 cruise-arrival trajectory runs while following CTAS clearance advisories. Actual trajectories of the airplane were compared to CTAS and FMS predictions to measure trajectory-prediction accuracy and identify the primary sources of error for both. The research airplane was used to evaluate several levels of cockpit automation, ranging from conventional avionics to a performance-based vertical navigation (VNAV) FMS. Trajectory-prediction accuracy was analyzed with respect to both ARTCC radar tracking and GPS-based aircraft measurements. This paper presents detailed results describing the trajectory accuracy and error sources. Although differences were found in both accuracy and error sources, CTAS accuracy was comparable to the FMS in terms of both meter-fix arrival-time performance (in support of metering) and 4D-trajectory prediction (key to conflict prediction). Overall arrival time errors (mean plus standard deviation) were measured to be approximately 24 seconds during the first flight test (23 runs) and 15 seconds during the second flight test (25 runs). 
The major source of error during these tests was found to be the predicted winds aloft used by CTAS. Position and velocity estimates of the airplane provided to CTAS by the ATC Host radar tracker were found to be a relatively insignificant error source for the trajectory conditions evaluated. Airplane performance modeling errors within CTAS were found to not significantly affect arrival time errors when the constrained descent procedures were used. The most significant effect related to the flight guidance was observed to be the cross-track and turn-overshoot errors associated with conventional VOR guidance. Lateral navigation (LNAV) guidance significantly reduced both the cross-track and turn-overshoot error. Pilot procedures and VNAV guidance were found to significantly reduce the vertical profile errors associated with atmospheric and aircraft performance model errors.

  9. Pattern of eyelid motion predictive of decision errors during drowsiness: oculomotor indices of altered states.

    PubMed

    Lobb, M L; Stern, J A

    1986-08-01

    Sequential patterns of eye and eyelid motion were identified in seven subjects performing a modified serial probe recognition task under drowsy conditions. Using simultaneous EOG and video recordings, eyelid motion was divided into components above, within, and below the pupil, and the durations in sequence were recorded. The serial probe recognition task was modified to allow decision errors to be distinguished from attention errors. Decision errors were found to be more frequent following a downward shift in gaze angle in which the eyelid closing sequence was reduced from a five-element to a three-element sequence. The velocity of the eyelid moving over the pupil during decision errors was slow in the closing and fast in the reopening phase, while on decision-correct trials it was fast in closing and slower in reopening. Owing to the high variability of eyelid motion under drowsy conditions, these findings were only marginally significant. When a five-element blink occurred, the velocity of the lid-over-pupil motion component of these endogenous eye blinks was significantly faster on decision-correct than on decision-error trials. Furthermore, the highly variable, long-duration closings associated with the decision response produced slow eye movements in the horizontal plane (SEM), which were more frequent and significantly longer in duration on decision-error than on decision-correct responses.

  10. Use of Electronic Medication Administration Records to Reduce Perceived Stress and Risk of Medication Errors in Nursing Homes.

    PubMed

    Alenius, Malin; Graf, Peter

    2016-07-01

    Concerns have been raised about the effects of current medication administration processes on the safety of many aspects of medication administration. Keeping electronic medication administration records could mitigate many of these problems. Unfortunately, there has not been much research on this topic, especially in nursing homes. A prospective case-control survey was consequently performed at two nursing homes; the electronic record system was introduced in one, whereas the other continued to use paper records. The personnel were asked to fill in a questionnaire on their perceptions of stress and risk of medication errors at baseline (n = 66) and 20 weeks after the intervention group had started recording medication administration electronically (n = 59). There were statistically significant decreases in the perceived risk of omitting a medication, of medication errors occurring because of communication problems, and of medication errors occurring because of inaccurate medication administration records in the intervention group (all P < .01 vs the control group). Perceived overall daily stress levels were also reduced in the intervention group (P < .05). These results indicate that keeping electronic medication administration records can reduce many of the concerns regarding the medication administration process.

  11. Investigating the relationship between foveal morphology and refractive error in a population with infantile nystagmus syndrome.

    PubMed

    Healey, Natasha; McLoone, Eibhlin; Mahon, Gerald; Jackson, A Jonathan; Saunders, Kathryn J; McClelland, Julie F

    2013-04-26

    We explored associations between refractive error and foveal hypoplasia in infantile nystagmus syndrome (INS). We recruited 50 participants with INS (albinism n = 33, nonalbinism infantile nystagmus [NAIN] n = 17) aged 4 to 48 years. Cycloplegic refractive error and logMAR acuity were obtained. Spherical equivalent refraction (SER), most ametropic meridian (MAM) refractive error, and better eye acuity (VA) were used for analyses. High resolution spectral-domain optical coherence tomography (SD-OCT) was used to obtain foveal scans, which were graded using the Foveal Hypoplasia Grading Scale. Associations between severity of foveal hypoplasia and both refractive error and VA were explored. Participants with more severe foveal hypoplasia had significantly higher MAMs and SERs (Kruskal-Wallis H test, P = 0.005 and P = 0.008, respectively). There were no statistically significant associations between foveal hypoplasia and cylindrical refractive error (Kruskal-Wallis H test, P = 0.144). Analyses demonstrated significant differences between participants with albinism and those with NAIN in terms of SER and MAM (Mann-Whitney U test, P = 0.001), but no statistically significant differences in astigmatic errors. Controlling for the effects of albinism, results demonstrated no significant associations between SER or MAM and foveal hypoplasia (partial correlation, P > 0.05). Poorer visual acuity was statistically significantly associated with more severe foveal hypoplasia (Kruskal-Wallis H test, P = 0.001) and with a diagnosis of albinism (Mann-Whitney U test, P = 0.001). Increasing severity of foveal hypoplasia is associated with poorer VA, reflecting reduced cone density in INS. Individuals with INS also demonstrate a significant association between more severe foveal hypoplasia and increasing hyperopia. 
However, in the absence of albinism, there is no significant relationship between refractive outcome and degree of foveal hypoplasia, suggesting that foveal maldevelopment in isolation does not significantly impair the emmetropization process. It is likely that the impaired emmetropization evident in the albinism group is attributable to the whole-eye effects of albinism.
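    The Kruskal-Wallis H statistic used throughout these comparisons can be sketched in a few lines of pure Python; the data below are illustrative (not the study's), and the implementation omits the tie correction:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction; assumes unique values)."""
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}
    n_total = len(pooled)
    # H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1), R_i = rank sum of group i
    rank_sum_term = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n_total * (n_total + 1)) * rank_sum_term - 3 * (n_total + 1)

# Three groups with clearly separated values:
h = kruskal_wallis_h([1, 2, 3], [4, 5, 6], [7, 8, 9])  # H = 7.2
```

    The statistic is then compared against a chi-squared distribution with (number of groups - 1) degrees of freedom to obtain the P values reported above.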

  12. Operator Variability in Scan Positioning is a Major Component of HR-pQCT Precision Error and is Reduced by Standardized Training

    PubMed Central

    Bonaretti, Serena; Vilayphiou, Nicolas; Chan, Caroline Mai; Yu, Andrew; Nishiyama, Kyle; Liu, Danmei; Boutroy, Stephanie; Ghasem-Zadeh, Ali; Boyd, Steven K.; Chapurlat, Roland; McKay, Heather; Shane, Elizabeth; Bouxsein, Mary L.; Black, Dennis M.; Majumdar, Sharmila; Orwoll, Eric S.; Lang, Thomas F.; Khosla, Sundeep; Burghardt, Andrew J.

    2017-01-01

    Introduction HR-pQCT is increasingly used to assess bone quality, fracture risk and anti-fracture interventions. The contribution of the operator has not been adequately accounted for in measurement precision. Operators acquire a 2D projection (“scout view image”) and define the region to be scanned by positioning a “reference line” on a standard anatomical landmark. In this study, we (i) evaluated the contribution of positioning variability to in vivo measurement precision, (ii) measured intra- and inter-operator positioning variability, and (iii) tested whether custom training software led to superior reproducibility in new operators compared to experienced operators. Methods To evaluate the operator's contribution to in vivo measurement precision, we compared precision errors calculated in 64 co-registered and non-co-registered scan-rescan images. To quantify operator variability, we developed software that simulates the positioning process of the scanner’s software. Eight experienced operators positioned reference lines on scout view images designed to test intra- and inter-operator reproducibility. Finally, we developed modules for training and evaluation of reference line positioning. We enrolled 6 new operators to participate in a common training, followed by the same reproducibility experiments performed by the experienced group. Results In vivo precision errors were up to three-fold greater (Tt.BMD and Ct.Th) when variability in scan positioning was included. Inter-operator precision errors were significantly greater than short-term intra-operator precision (p<0.001). Newly trained operators achieved intra-operator reproducibility comparable to experienced operators, and lower inter-operator variability (p<0.001). Precision errors were significantly greater for the radius than for the tibia. Conclusion Operator reference line positioning contributes significantly to in vivo precision error, which is significantly greater for multi-operator datasets. 
Inter-operator variability can be significantly reduced using a systematic training platform, now available online (http://webapps.radiology.ucsf.edu/refline/). PMID:27475931
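    Short-term precision in HR-pQCT studies is commonly summarized as a root-mean-square coefficient of variation over scan-rescan pairs. A sketch under that convention (the measurement values are illustrative, not the study's):

```python
import math

def rms_cv(pairs):
    """RMS coefficient of variation (%) over scan-rescan measurement pairs."""
    cvs_sq = []
    for a, b in pairs:
        mean = (a + b) / 2.0
        sd = abs(a - b) / math.sqrt(2.0)  # sample SD of two measurements
        cvs_sq.append((sd / mean) ** 2)
    return 100.0 * math.sqrt(sum(cvs_sq) / len(cvs_sq))

# Two subjects: one with a small scan-rescan difference, one identical.
precision = rms_cv([(100.0, 102.0), (50.0, 50.0)])  # about 0.99 %
```

    Under this convention, any extra scan-rescan disagreement introduced by reference-line positioning inflates the per-subject SD and hence the reported precision error.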

  13. Contributions of the cerebellum and the motor cortex to acquisition and retention of motor memories

    PubMed Central

    Herzfeld, David J.; Pastor, Damien; Haith, Adrian M.; Rossetti, Yves; Shadmehr, Reza; O’Shea, Jacinta

    2014-01-01

    We investigated the contributions of the cerebellum and the motor cortex (M1) to acquisition and retention of human motor memories in a force field reaching task. We found that anodal transcranial direct current stimulation (tDCS) of the cerebellum, a technique that is thought to increase neuronal excitability, increased the ability to learn from error and form an internal model of the field, while cathodal cerebellar stimulation reduced this error-dependent learning. In addition, cathodal cerebellar stimulation disrupted the ability to respond to error within a reaching movement, reducing the gain of the sensory-motor feedback loop. By contrast, anodal M1 stimulation had no significant effects on these variables. During sham stimulation, early in training the acquired motor memory exhibited rapid decay in error-clamp trials. With further training the rate of decay decreased, suggesting that with training the motor memory was transformed from a labile to a more stable state. Surprisingly, neither cerebellar nor M1 stimulation altered these decay patterns. Participants returned 24 hours later and were re-tested in error-clamp trials without stimulation. The cerebellar group that had learned the task with cathodal stimulation exhibited significantly impaired retention, and retention was not improved by M1 anodal stimulation. In summary, non-invasive cerebellar stimulation resulted in polarity-dependent up- or down-regulation of error-dependent motor learning. In addition, cathodal cerebellar stimulation during acquisition impaired the ability to retain the motor memory overnight. Thus, in the force field task we found a critical role for the cerebellum in both formation of motor memory and its retention. PMID:24816533

  14. Simplification of the Kalman filter for meteorological data assimilation

    NASA Technical Reports Server (NTRS)

    Dee, Dick P.

    1991-01-01

    The paper proposes a new statistical method of data assimilation that is based on a simplification of the Kalman filter equations. The forecast error covariance evolution is approximated simply by advecting the mass-error covariance field, deriving the remaining covariances geostrophically, and accounting for external model-error forcing only at the end of each forecast cycle. This greatly reduces the cost of computation of the forecast error covariance. In simulations with a linear, one-dimensional shallow-water model and data generated artificially, the performance of the simplified filter is compared with that of the Kalman filter and the optimal interpolation (OI) method. The simplified filter produces analyses that are nearly optimal, and represents a significant improvement over OI.
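    The cost being avoided sits in the covariance propagation of the forecast step. For orientation, one full predict/update cycle of a scalar Kalman filter is sketched below (a generic textbook form, not the paper's shallow-water system):

```python
def kalman_step(x, p, z, a=1.0, q=0.0, r=1.0):
    """One predict/update cycle of a scalar Kalman filter.

    x, p : prior state estimate and its error variance
    z    : new observation (observation operator H = 1)
    a, q : state transition coefficient and model-error variance
    r    : observation-error variance
    """
    # Forecast step: the full filter propagates the error covariance here;
    # the simplified filter approximates exactly this part.
    x_f = a * x
    p_f = a * a * p + q
    # Analysis (update) step
    k = p_f / (p_f + r)          # Kalman gain
    x_a = x_f + k * (z - x_f)
    p_a = (1.0 - k) * p_f
    return x_a, p_a

x, p = kalman_step(0.0, 1.0, 2.0)  # estimate moves halfway toward z
```

    In the full multivariate filter the covariance propagation costs as much as running the model once per state variable; approximating it by advection plus geostrophic balance is what makes the simplified filter affordable.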

  15. Energy and Quality-Aware Multimedia Signal Processing

    NASA Astrophysics Data System (ADS)

    Emre, Yunus

    Today's mobile devices have to support computation-intensive multimedia applications with a limited energy budget. In this dissertation, we present architecture-level and algorithm-level techniques that reduce energy consumption of these devices with minimal impact on system quality. First, we present novel techniques to mitigate the effects of SRAM memory failures in JPEG2000 implementations operating at scaled voltages. We investigate error control coding schemes and propose an unequal error protection scheme tailored for JPEG2000 that reduces overhead without affecting the performance. Furthermore, we propose algorithm-specific techniques for error compensation that exploit the fact that in JPEG2000 the discrete wavelet transform outputs have larger values for low frequency subband coefficients and smaller values for high frequency subband coefficients. Next, we present the use of voltage overscaling to reduce the data-path power consumption of JPEG codecs. We propose an algorithm-specific technique which exploits the characteristics of the quantized coefficients after zig-zag scan to mitigate errors introduced by aggressive voltage scaling. Third, we investigate the effect of reducing dynamic range for datapath energy reduction. We analyze the effect of truncation error and propose a scheme that estimates the mean value of the truncation error during the pre-computation stage and compensates for this error. Such a scheme is very effective for reducing the noise power in applications that are dominated by additions and multiplications, such as FIR filters and transform computation. We also present a novel sum of absolute difference (SAD) scheme that is based on most significant bit truncation. The proposed scheme exploits the fact that most of the absolute difference (AD) calculations result in small values, and most of the large AD values do not contribute to the SAD values of the blocks that are selected. 
Such a scheme is highly effective in reducing the energy consumption of motion estimation and intra-prediction kernels in video codecs. Finally, we present several hybrid energy-saving techniques based on combination of voltage scaling, computation reduction and dynamic range reduction that further reduce the energy consumption while keeping the performance degradation very low. For instance, a combination of computation reduction and dynamic range reduction for Discrete Cosine Transform shows on average, 33% to 46% reduction in energy consumption while incurring only 0.5dB to 1.5dB loss in PSNR.
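    One way to read the MSB-truncation idea: because most absolute differences are small, their high-order bits are usually zero, so keeping only the low-order bits rarely changes the result. A toy sketch under that reading (our assumption of the scheme's arithmetic, not the dissertation's exact hardware design):

```python
def sad_exact(block_a, block_b):
    """Exact sum of absolute differences between two pixel blocks."""
    return sum(abs(a - b) for a, b in zip(block_a, block_b))

def sad_msb_truncated(block_a, block_b, keep_bits=4):
    """Approximate SAD that keeps only the `keep_bits` low-order bits of
    each absolute difference; small differences (the common case) are
    represented exactly, so the approximation error is usually zero."""
    mask = (1 << keep_bits) - 1
    return sum(abs(a - b) & mask for a, b in zip(block_a, block_b))

a = [10, 12, 9, 200]
b = [11, 11, 10, 198]
exact = sad_exact(a, b)           # 1 + 1 + 1 + 2 = 5
approx = sad_msb_truncated(a, b)  # identical here: every AD fits in 4 bits
```

    Dropping high-order bits shortens the adders in the SAD datapath, which is where the energy saving in motion estimation comes from.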

  16. Fast online generalized multiscale finite element method using constraint energy minimization

    NASA Astrophysics Data System (ADS)

    Chung, Eric T.; Efendiev, Yalchin; Leung, Wing Tat

    2018-02-01

    Local multiscale methods often construct multiscale basis functions in the offline stage without taking into account input parameters, such as source terms, boundary conditions, and so on. These basis functions are then used in the online stage with a specific input parameter to solve the global problem at a reduced computational cost. Recently, online approaches have been introduced, where multiscale basis functions are adaptively constructed in some regions to reduce the error significantly. In multiscale methods, it is desirable to need only 1-2 iterations to reduce the error to a desired threshold. Using the Generalized Multiscale Finite Element Framework [10], it was shown that by choosing a sufficient number of offline basis functions, the error reduction can be made independent of physical parameters, such as scales and contrast. In this paper, our goal is to improve on this. Using our recently proposed approach [4] and a special online basis construction in oversampled regions, we show that the error reduction can be made sufficiently large by appropriately selecting oversampling regions. Our numerical results show that one can achieve a three-order-of-magnitude error reduction, which is better than our previous methods. We also develop an adaptive algorithm and enrich the basis in selected regions with large residuals. In our adaptive method, we show that the convergence rate can be determined by a user-defined parameter, and we confirm this by numerical simulations. The analysis of the method is presented.
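    The adaptive step selects regions with large residuals for enrichment. A generic threshold-marking sketch (the specific marking rule and the role of θ here are illustrative assumptions, with θ standing in for a user-defined parameter):

```python
def mark_regions(residuals, theta=0.5):
    """Select region indices whose residual is at least theta * max residual.

    residuals : per-region residual norms from the current solve
    theta     : user-defined marking fraction in (0, 1]
    """
    cutoff = theta * max(residuals)
    return [i for i, r in enumerate(residuals) if r >= cutoff]

# Regions 1 and 3 dominate the residual and get new online basis functions:
marked = mark_regions([1.0, 5.0, 3.0, 9.0], theta=0.5)  # [1, 3]
```

    Each adaptive iteration then constructs online basis functions only in the marked (oversampled) regions and re-solves, so enrichment cost is concentrated where the residual is large.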

  17. Spatial compression impairs prism adaptation in healthy individuals.

    PubMed

    Scriven, Rachel J; Newport, Roger

    2013-01-01

    Neglect patients typically present with gross inattention to one side of space following damage to the contralateral hemisphere. While prism-adaptation (PA) is effective in ameliorating some neglect behaviors, the mechanisms involved and their relationship to neglect remain unclear. Recent studies have shown that conscious strategic control (SC) processes in PA may be impaired in neglect patients, who are also reported to show extraordinarily long aftereffects compared to healthy participants. Determining the underlying cause of these effects may be the key to understanding therapeutic benefits. Alternative accounts suggest that reduced SC might result from a failure to detect prism-induced reaching errors properly either because (a) the size of the error is underestimated in compressed visual space or (b) pathologically increased error-detection thresholds reduce the requirement for error correction. The purpose of this study was to model these two alternatives in healthy participants and to examine whether SC and subsequent aftereffects were abnormal compared to standard PA. Each participant completed three PA procedures within a MIRAGE mediated reality environment with direction errors recorded before, during and after adaptation. During PA, visual feedback of the reach could be compressed, perturbed by noise, or represented veridically. Compressed visual space significantly reduced SC and aftereffects compared to control and noise conditions. These results support recent observations in neglect patients, suggesting that a distortion of spatial representation may successfully model neglect and explain neglect performance while adapting to prisms.

  18. Impact of Data Assimilation on Cost-Accuracy Tradeoff in Multi-Fidelity Models at the Example of an Infiltration Problem

    NASA Astrophysics Data System (ADS)

    Sinsbeck, Michael; Tartakovsky, Daniel

    2015-04-01

    Infiltration into top soil can be described by alternative models with different degrees of fidelity: Richards equation and the Green-Ampt model. These models typically contain uncertain parameters and forcings, rendering predictions of the state variables uncertain as well. Within the probabilistic framework, solutions of these models are given in terms of their probability density functions (PDFs) that, in the presence of data, can be treated as prior distributions. The assimilation of soil moisture data into model predictions, e.g., via a Bayesian updating of solution PDFs, poses a question of model selection: Given a significant difference in computational cost, is a lower-fidelity model preferable to its higher-fidelity counter-part? We investigate this question in the context of heterogeneous porous media, whose hydraulic properties are uncertain. While low-fidelity (reduced-complexity) models introduce a model error, their moderate computational cost makes it possible to generate more realizations, which reduces the (e.g., Monte Carlo) sampling or stochastic error. The ratio between these two errors determines the model with the smallest total error. We found assimilation of measurements of a quantity of interest (the soil moisture content, in our example) to decrease the model error, increasing the probability that the predictive accuracy of a reduced-complexity model does not fall below that of its higher-fidelity counterpart.
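    The model-selection tradeoff can be made concrete: total error combines the squared model bias with a Monte Carlo sampling error that shrinks as more realizations fit the budget. A sketch with hypothetical bias, noise, and cost figures (not the study's values):

```python
import math

def total_rmse(bias, sigma, budget, cost_per_run):
    """Model error (bias) combined with Monte Carlo sampling error
    for a fixed computational budget: sqrt(bias^2 + sigma^2 / n)."""
    n = budget // cost_per_run  # realizations affordable within the budget
    return math.sqrt(bias ** 2 + sigma ** 2 / n)

# Hypothetical numbers: the low-fidelity model is biased but cheap,
# the high-fidelity model is unbiased but 25x more expensive.
low = total_rmse(bias=0.1, sigma=1.0, budget=100, cost_per_run=1)    # ~0.141
high = total_rmse(bias=0.0, sigma=1.0, budget=100, cost_per_run=25)  # 0.5
```

    With these numbers the cheap, biased model wins: its many realizations drive the sampling error below the high-fidelity model's, and assimilating data (which shrinks the bias term) tilts the balance further in its favor.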

  19. Medication errors: a prospective cohort study of hand-written and computerised physician order entry in the intensive care unit.

    PubMed

    Shulman, Rob; Singer, Mervyn; Goldstone, John; Bellingan, Geoff

    2005-10-05

    The study aimed to compare the impact of computerised physician order entry (CPOE) without decision support with hand-written prescribing (HWP) on the frequency, type and outcome of medication errors (MEs) in the intensive care unit. Details of MEs were collected before, and at several time points after, the change from HWP to CPOE. The study was conducted in a London teaching hospital's 22-bedded general ICU. The sampling periods were 28 weeks before and 2, 10, 25 and 37 weeks after introduction of CPOE. The unit pharmacist prospectively recorded details of MEs and the total number of drugs prescribed daily during the data collection periods, during the course of his normal chart review. The total proportion of MEs was significantly lower with CPOE (117 errors from 2429 prescriptions, 4.8%) than with HWP (69 errors from 1036 prescriptions, 6.7%) (p < 0.04). The proportion of errors reduced with time following the introduction of CPOE (p < 0.001). Two errors with CPOE led to patient harm requiring an increase in length of stay and, if administered, three prescriptions with CPOE could potentially have led to permanent harm or death. Differences in the types of error between systems were noted. There was a reduction in major/moderate patient outcomes with CPOE when non-intercepted and intercepted errors were combined (p = 0.01). The mean baseline APACHE II score did not differ significantly between the HWP and the CPOE periods (19.4 versus 20.0, respectively, p = 0.71). Introduction of CPOE was associated with a reduction in the proportion of MEs and an improvement in the overall patient outcome score (if intercepted errors were included). Moderate and major errors, however, remain a significant concern with CPOE.
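    The headline comparison (69 errors in 1,036 HWP prescriptions vs. 117 in 2,429 CPOE prescriptions) can be checked with a pooled two-proportion z-test, sketched in plain Python:

```python
import math

def two_proportion_z(k1, n1, k2, n2):
    """Pooled two-proportion z statistic for k1/n1 vs. k2/n2."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# CPOE: 117 errors in 2429 prescriptions; HWP: 69 in 1036
z = two_proportion_z(117, 2429, 69, 1036)
```

    The resulting z of about 2.2 exceeds the 1.96 two-sided critical value, consistent with the reported p < 0.04.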

  20. Residents' numeric inputting error in computerized physician order entry prescription.

    PubMed

    Wu, Xue; Wu, Changxu; Zhang, Kan; Wei, Dong

    2016-04-01

    Computerized physician order entry (CPOE) systems with embedded clinical decision support (CDS) can significantly reduce certain types of prescription error. However, prescription errors still occur. Various factors, such as the numeric inputting method in human-computer interaction (HCI), produce different error rates and types, but these have received relatively little attention. This study aimed to examine the effects of numeric inputting methods and urgency levels on numeric inputting errors in prescriptions, and to categorize the types of errors. Thirty residents participated in four prescribing tasks in which two factors were manipulated: numeric inputting method (numeric row in the main keyboard vs. numeric keypad) and urgency level (urgent situation vs. non-urgent situation). Multiple aspects of participants' prescribing behavior were measured in sober prescribing situations. The results revealed that in urgent situations, participants were prone to make mistakes when using the numeric row in the main keyboard. With control of performance in the sober prescribing situation, the effects of the input methods disappeared, and urgency was found to play a significant role in the generalized linear model. Most errors were either omission or substitution types, but the proportions of transposition and intrusion error types were significantly higher than in previous research. Among the numbers 3, 8, and 9, the digits used least often in prescriptions, the error rate was higher, posing a great risk to patient safety. Urgency played a more important role in CPOE numeric typing errors than typing skills and typing habits. Inputting with the numeric keypad was recommended, as it produced lower error rates in urgent situations. An alternative design could consider increasing the sensitivity of keys with a lower frequency of occurrence and of the decimal key. 
    To improve the usability of CPOE, numeric keyboard design and error detection could benefit from the spatial incidence of errors found in this study. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
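    The four error categories named above (omission, substitution, transposition, intrusion) can be operationalized as a simple single-edit classifier comparing the typed string against the intended one. This is a sketch of one plausible coding scheme, not the study's actual procedure:

```python
def classify_error(intended, typed):
    """Classify a numeric entry against the intended string.

    Categories follow the abstract: substitution, transposition,
    omission, intrusion (simplified single-edit classifier)."""
    if typed == intended:
        return "correct"
    if len(typed) == len(intended):
        diffs = [i for i in range(len(typed)) if typed[i] != intended[i]]
        if len(diffs) == 1:
            return "substitution"          # one wrong digit
        if (len(diffs) == 2 and diffs[1] == diffs[0] + 1
                and typed[diffs[0]] == intended[diffs[1]]
                and typed[diffs[1]] == intended[diffs[0]]):
            return "transposition"         # two adjacent digits swapped
    if len(typed) == len(intended) - 1:
        if any(intended[:i] + intended[i + 1:] == typed
               for i in range(len(intended))):
            return "omission"              # one digit dropped
    if len(typed) == len(intended) + 1:
        if any(typed[:i] + typed[i + 1:] == intended
               for i in range(len(typed))):
            return "intrusion"             # one extra digit inserted
    return "other"

kind = classify_error("123", "132")        # transposition
```

    An automated classifier like this could feed the error-detection layer suggested above, flagging entries whose edit pattern matches a known error type.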

  1. Development and Evaluation of Algorithms for Breath Alcohol Screening.

    PubMed

    Ljungblad, Jonas; Hök, Bertil; Ekström, Mikael

    2016-04-01

    Breath alcohol screening is important for traffic safety, access control and other areas of health promotion. A family of sensor devices useful for these purposes is being developed and evaluated. This paper focuses on algorithms for determining breath alcohol concentration in diluted breath samples, using carbon dioxide to compensate for the dilution. The examined algorithms make use of signal averaging, weighting and personalization to reduce estimation errors. Evaluation was performed using data from a previously conducted human study. It is concluded that these features in combination significantly reduce the random error compared to the signal averaging algorithm taken alone.
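    Signal averaging reduces independent random error by roughly a factor of sqrt(n). A toy simulation (hypothetical sensor noise levels and units, seeded for reproducibility) shows the effect these algorithms exploit:

```python
import random
import statistics

random.seed(42)

def noisy_reading(true_value=0.2, noise_sd=0.05):
    """One simulated diluted-breath sensor reading (hypothetical units)."""
    return true_value + random.gauss(0.0, noise_sd)

# Spread of single readings vs. spread of 25-sample averages:
singles = [noisy_reading() for _ in range(200)]
averaged = [statistics.fmean(noisy_reading() for _ in range(25))
            for _ in range(200)]

spread_single = statistics.stdev(singles)   # about the raw noise SD
spread_avg = statistics.stdev(averaged)     # roughly 5x smaller (sqrt(25))
```

    Weighting and personalization go beyond this baseline by downweighting unreliable samples and calibrating per subject, which is where the additional error reduction reported above comes from.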

  2. Intrinsic Raman spectroscopy for quantitative biological spectroscopy Part II

    PubMed Central

    Bechtel, Kate L.; Shih, Wei-Chuan; Feld, Michael S.

    2009-01-01

    We demonstrate the effectiveness of intrinsic Raman spectroscopy (IRS) at reducing errors caused by absorption and scattering. Physical tissue models, solutions of varying absorption and scattering coefficients with known concentrations of Raman scatterers, are studied. We show significant improvement in prediction error by implementing IRS to predict concentrations of Raman scatterers using both ordinary least squares regression (OLS) and partial least squares regression (PLS). In particular, we show that IRS provides a robust calibration model that does not increase in error when applied to samples with optical properties outside the range of calibration. PMID:18711512
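    Ordinary least squares calibration, the simpler of the two regression methods used, can be sketched for a single predictor; the intensity/concentration values are illustrative, not the study's data:

```python
def ols_fit(x, y):
    """Least-squares slope and intercept for one predictor."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, ybar - slope * xbar

# Raman intensity exactly proportional to analyte concentration
# (the ideal case that absorption and scattering distort in practice):
intensity = [1.0, 2.0, 3.0, 4.0]
conc = [2.0, 4.0, 6.0, 8.0]
slope, intercept = ols_fit(intensity, conc)  # slope 2.0, intercept 0.0
```

    The point of IRS is to correct the measured intensities for turbidity before this calibration step, so that a model fitted on one set of optical properties still predicts well on samples with different ones.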

  3. Benchmarking the pseudopotential and fixed-node approximations in diffusion Monte Carlo calculations of molecules and solids

    DOE PAGES

    Nazarov, Roman; Shulenburger, Luke; Morales, Miguel A.; ...

    2016-03-28

    We performed diffusion Monte Carlo (DMC) calculations of the spectroscopic properties of a large set of molecules, assessing the effect of different approximations. In systems containing elements with large atomic numbers, we show that the errors associated with the use of nonlocal mean-field-based pseudopotentials in DMC calculations can be significant and may surpass the fixed-node error. In conclusion, we suggest practical guidelines for reducing these pseudopotential errors, which allow us to obtain DMC-computed spectroscopic parameters of molecules and equation of state properties of solids in excellent agreement with experiment.

  4. Benchmarking the pseudopotential and fixed-node approximations in diffusion Monte Carlo calculations of molecules and solids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nazarov, Roman; Shulenburger, Luke; Morales, Miguel A.

    We performed diffusion Monte Carlo (DMC) calculations of the spectroscopic properties of a large set of molecules, assessing the effect of different approximations. In systems containing elements with large atomic numbers, we show that the errors associated with the use of nonlocal mean-field-based pseudopotentials in DMC calculations can be significant and may surpass the fixed-node error. In conclusion, we suggest practical guidelines for reducing these pseudopotential errors, which allow us to obtain DMC-computed spectroscopic parameters of molecules and equation of state properties of solids in excellent agreement with experiment.

  5. Applying lessons learned to enhance human performance and reduce human error for ISS operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, W.R.

    1999-01-01

    A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) is developing a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper will describe previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS. © 1999 American Institute of Physics.

  6. Applying lessons learned to enhance human performance and reduce human error for ISS operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, W.R.

    1998-09-01

    A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) is developing a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper describes previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS.

  7. EPs welcome new focus on reducing diagnostic errors.

    PubMed

    2015-12-01

    Emergency medicine leaders welcome a major new report from the Institute of Medicine (IOM) calling on providers, policy makers, and government agencies to institute changes to reduce the incidence of diagnostic errors. The 369-page report, "Improving Diagnosis in Health Care," states that the rate of diagnostic errors in this country is unacceptably high and offers a long list of recommendations aimed at addressing the problem. These include large, systemic changes that involve improvements in multiple areas, including health information technology (HIT), professional education, teamwork, and payment reform. Further, of particular interest to emergency physicians are recommended changes to the liability system. The authors of the IOM report state that while most people will likely experience a significant diagnostic error in their lifetime, the importance of this problem is under-appreciated. According to conservative estimates, the report says 5% of adults who seek outpatient care each year experience a diagnostic error. The report also notes that research over many decades shows diagnostic errors contribute to roughly 10% of all deaths. The report says more steps need to be taken to facilitate inter-professional and intra-professional teamwork throughout the diagnostic process. Experts concur with the report's finding that mechanisms need to be developed so that providers receive ongoing feedback on their diagnostic performance.

  8. Heuristics and Cognitive Error in Medical Imaging.

    PubMed

    Itri, Jason N; Patel, Sohil H

    2018-05-01

    The field of cognitive science has provided important insights into mental processes underlying the interpretation of imaging examinations. Despite these insights, diagnostic error remains a major obstacle in the goal to improve quality in radiology. In this article, we describe several types of cognitive bias that lead to diagnostic errors in imaging and discuss approaches to mitigate cognitive biases and diagnostic error. Radiologists rely on heuristic principles to reduce complex tasks of assessing probabilities and predicting values into simpler judgmental operations. These mental shortcuts allow rapid problem solving based on assumptions and past experiences. Heuristics used in the interpretation of imaging studies are generally helpful but can sometimes result in cognitive biases that lead to significant errors. An understanding of the causes of cognitive biases can lead to the development of educational content and systematic improvements that mitigate errors and improve the quality of care provided by radiologists.

  9. Color-Coded Prefilled Medication Syringes Decrease Time to Delivery and Dosing Error in Simulated Emergency Department Pediatric Resuscitations.

    PubMed

    Moreira, Maria E; Hernandez, Caleb; Stevens, Allen D; Jones, Seth; Sande, Margaret; Blumen, Jason R; Hopkins, Emily; Bakes, Katherine; Haukoos, Jason S

    2015-08-01

    The Institute of Medicine has called on the US health care system to identify and reduce medical errors. Unfortunately, medication dosing errors remain commonplace and may result in potentially life-threatening outcomes, particularly for pediatric patients when dosing requires weight-based calculations. Novel medication delivery systems that may reduce dosing errors resonate with national health care priorities. Our goal was to evaluate novel, prefilled medication syringes labeled with color-coded volumes corresponding to the weight-based dosing of the Broselow Tape, compared with conventional medication administration, in simulated pediatric emergency department (ED) resuscitation scenarios. We performed a prospective, block-randomized, crossover study in which 10 emergency physician and nurse teams managed 2 simulated pediatric arrest scenarios in situ, using either prefilled, color-coded syringes (intervention) or conventional drug administration methods (control). The ED resuscitation room and the intravenous medication port were video recorded during the simulations. Data were extracted from video review by blinded, independent reviewers. Median time to delivery of all doses for the conventional and color-coded delivery groups was 47 seconds (95% confidence interval [CI] 40 to 53 seconds) and 19 seconds (95% CI 18 to 20 seconds), respectively (difference=27 seconds; 95% CI 21 to 33 seconds). With the conventional method, 118 doses were administered, with 20 critical dosing errors (17%); with the color-coded method, 123 doses were administered, with 0 critical dosing errors (difference=17%; 95% CI 4% to 30%). A novel color-coded, prefilled syringe decreased time to medication administration and significantly reduced critical dosing errors by emergency physician and nurse teams during simulated pediatric ED resuscitations. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
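    The design principle here is that looking up a precomputed, color-coded volume replaces in-the-moment weight-based arithmetic, which is where dosing errors arise. A minimal sketch of such a lookup, with entirely hypothetical zone boundaries and syringe volumes (placeholders, not clinical values):

```python
# Illustrative only: the weight bounds and volumes below are hypothetical
# placeholders, not clinical values from the Broselow Tape.
COLOR_ZONES = [  # (upper weight bound in kg, color, prefilled volume in mL)
    (5.0, "grey", 0.4),
    (7.0, "pink", 0.6),
    (9.0, "red", 0.8),
    (11.0, "purple", 1.0),
]

def zone_for_weight(weight_kg):
    """Return (color, volume) for the first zone covering the weight."""
    for upper, color, volume in COLOR_ZONES:
        if weight_kg <= upper:
            return color, volume
    raise ValueError("weight outside color-coded range")

color, volume = zone_for_weight(6.2)  # falls in the second zone
```

    Because every calculation is done once, in advance, the clinician at the bedside only matches a color, which is the error-reduction mechanism the study evaluates.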

  10. System Related Interventions to Reduce Diagnostic Error: A Narrative Review

    PubMed Central

    Singh, Hardeep; Graber, Mark L.; Kissam, Stephanie M.; Sorensen, Asta V.; Lenfestey, Nancy F.; Tant, Elizabeth M.; Henriksen, Kerm; LaBresh, Kenneth A.

    2013-01-01

    Background Diagnostic errors (missed, delayed, or wrong diagnosis) have gained recent attention and are associated with significant preventable morbidity and mortality. We reviewed the recent literature to identify interventions that have been, or could be, implemented to address systems-related factors that contribute directly to diagnostic error. Methods We conducted a comprehensive search using multiple search strategies. We first identified candidate articles in English published between 2000 and 2009 from a PubMed search restricted to articles related to diagnostic error or delay. We then sought additional papers from references in the initial dataset, searches of additional databases, and subject matter experts. Articles were included if they formally evaluated an intervention to prevent or reduce diagnostic error; however, we also included papers in which interventions were suggested but not tested, in order to inform the state of the science on the topic. We categorized interventions according to the step in the diagnostic process they targeted: the patient-provider encounter; performance and interpretation of diagnostic tests; follow-up and tracking of diagnostic information; subspecialty and referral-related steps; and patient-specific factors. Results We identified 43 articles for full review, of which 6 reported tested interventions and 37 contained suggestions for possible interventions. Empirical studies, though somewhat positive, were non-experimental or quasi-experimental and included a small number of clinicians or health care sites. Outcome measures in general were underdeveloped and varied markedly between studies, depending on the setting or step in the diagnostic process involved. Conclusions Despite a number of suggested interventions in the literature, few empirical studies have tested interventions to reduce diagnostic error in the last decade. 
Advancing the science of diagnostic error prevention will require more robust study designs and rigorous definitions of diagnostic processes and outcomes to measure intervention effects. PMID:22129930

  11. Robust Linear Models for Cis-eQTL Analysis.

    PubMed

    Rantalainen, Mattias; Lindgren, Cecilia M; Holmes, Christopher C

    2015-01-01

    Expression Quantitative Trait Loci (eQTL) analysis enables characterisation of functional genetic variation influencing expression levels of individual genes. In outbred populations, including humans, eQTLs are commonly analysed using the conventional linear model, adjusting for relevant covariates, assuming an allelic dosage model and a Gaussian error term. However, gene expression data generally have noise that induces heavy-tailed errors relative to the Gaussian distribution and often include atypical observations, or outliers. Such departures from modelling assumptions can lead to an increased rate of type II errors (false negatives), and to some extent also type I errors (false positives). Careful model checking can reduce the risk of type I errors but often not type II errors, since it is generally too time-consuming to carefully check all models with a non-significant effect in large-scale and genome-wide studies. Here we propose the application of a robust linear model for eQTL analysis to reduce the adverse effects of deviations from the assumption of Gaussian residuals. We present results from a simulation study as well as from the analysis of real eQTL data sets. Our findings suggest that in many situations robust models have the potential to provide more reliable eQTL results than conventional linear models, particularly with respect to reducing type II errors due to non-Gaussian noise. Post-genomic data, such as those generated in genome-wide eQTL studies, are often noisy and frequently contain atypical observations. Robust statistical models have the potential to provide more reliable results and increased statistical power under non-Gaussian conditions. The results presented here suggest that robust models should be considered routinely alongside other commonly used methodologies for eQTL analysis.
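    As an illustration of the underlying idea (not the authors' code), the sketch below contrasts an ordinary least-squares slope with a robust Theil-Sen slope on hypothetical allelic-dosage/expression data containing a single outlier; the robust estimate stays near the true effect size while OLS is badly skewed.

```python
# Sketch: robust vs. ordinary slope estimation under one atypical observation.
from statistics import median

def ols_slope(x, y):
    # conventional least-squares slope (Gaussian-error assumption)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)

def theil_sen_slope(x, y):
    # median of pairwise slopes: resistant to outliers
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x)) for j in range(i + 1, len(x))
              if x[j] != x[i]]
    return median(slopes)

# Allelic dosages 0/1/2 with expression ~ 1.0 * dosage, plus one outlier.
x = [0, 0, 1, 1, 2, 2, 2]
y = [0.1, -0.1, 1.0, 0.9, 2.1, 1.9, 12.0]  # last point is the outlier
print(ols_slope(x, y), theil_sen_slope(x, y))
```

    With the outlier present, the OLS slope is inflated well above the true value of 1.0, while the robust slope remains close to it.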

  12. Phase Error Correction in Time-Averaged 3D Phase Contrast Magnetic Resonance Imaging of the Cerebral Vasculature

    PubMed Central

    MacDonald, M. Ethan; Forkert, Nils D.; Pike, G. Bruce; Frayne, Richard

    2016-01-01

    Purpose Volume flow rate (VFR) measurements based on phase contrast (PC)-magnetic resonance (MR) imaging datasets have spatially varying bias due to eddy current induced phase errors. The purpose of this study was to assess the impact of phase errors in time averaged PC-MR imaging of the cerebral vasculature and explore the effects of three common correction schemes (local bias correction (LBC), local polynomial correction (LPC), and whole brain polynomial correction (WBPC)). Methods Measurements of the eddy current induced phase error from a static phantom were first obtained. In thirty healthy human subjects, the methods were then assessed in background tissue to determine if local phase offsets could be removed. Finally, the techniques were used to correct VFR measurements in cerebral vessels and compared statistically. Results In the phantom, phase error was measured to be <2.1 ml/s per pixel and the bias was reduced with the correction schemes. In background tissue, the bias was significantly reduced, by 65.6% (LBC), 58.4% (LPC) and 47.7% (WBPC) (p < 0.001 across all schemes). Correction did not lead to significantly different VFR measurements in the vessels (p = 0.997). In the vessel measurements, the three correction schemes led to flow measurement differences of -0.04 ± 0.05 ml/s, 0.09 ± 0.16 ml/s, and -0.02 ± 0.06 ml/s. Although there was an improvement in background measurements with correction, there was no statistical difference between the three correction schemes (p = 0.242 in background and p = 0.738 in vessels). Conclusions While eddy current induced phase errors can vary between hardware and sequence configurations, our results showed that the impact is small in a typical brain PC-MR protocol and does not have a significant effect on VFR measurements in cerebral vessels. PMID:26910600
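    A minimal sketch (not the authors' implementation) of the polynomial-correction idea: fit a low-order polynomial to the phase measured in static tissue, then subtract the fitted offset at the vessel location. Here the fit is 1D and linear for clarity; the paper's LPC and WBPC schemes fit in 2D over local or whole-brain masks.

```python
# Sketch: remove an eddy-current phase ramp estimated from static tissue.
def fit_line(xs, ys):
    # least-squares line through static-tissue phase samples
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

static_x  = [0, 1, 2, 3, 5, 6, 7, 8]            # background (static) pixels
static_ph = [0.02 * x + 0.1 for x in static_x]  # synthetic phase ramp

a, b = fit_line(static_x, static_ph)
vessel_x, vessel_phase = 4, 0.95                # measured phase in a vessel pixel
corrected = vessel_phase - (a + b * vessel_x)   # subtract the background offset
print(round(corrected, 3))
```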

  13. A Visual Profile of Queensland Indigenous Children.

    PubMed

    Hopkins, Shelley; Sampson, Geoff P; Hendicott, Peter L; Wood, Joanne M

    2016-03-01

    Little is known about the prevalence of refractive error, binocular vision, and other visual conditions in Australian Indigenous children. This is important given the association of these visual conditions with reduced reading performance in the wider population, which may also contribute to the suboptimal reading performance reported in this population. The aim of this study was to develop a visual profile of Queensland Indigenous children. Vision testing was performed on 595 primary schoolchildren in Queensland, Australia. Vision parameters measured included visual acuity, refractive error, color vision, nearpoint of convergence, horizontal heterophoria, fusional vergence range, accommodative facility, AC/A ratio, visual motor integration, and rapid automatized naming. Near heterophoria, nearpoint of convergence, and near fusional vergence range were used to classify convergence insufficiency (CI). Although refractive error (Indigenous, 10%; non-Indigenous, 16%; p = 0.04) and strabismus (Indigenous, 0%; non-Indigenous, 3%; p = 0.03) were significantly less common in Indigenous children, CI was twice as prevalent (Indigenous, 10%; non-Indigenous, 5%; p = 0.04). Reduced visual information processing skills were more common in Indigenous children (reduced visual motor integration [Indigenous, 28%; non-Indigenous, 16%; p < 0.01] and slower rapid automatized naming [Indigenous, 67%; non-Indigenous, 59%; p = 0.04]). The prevalence of visual impairment (reduced visual acuity) and color vision deficiency was similar between groups. Indigenous children have less refractive error and strabismus than their non-Indigenous peers. However, CI and reduced visual information processing skills were more common in this group. Given that vision screenings primarily target visual acuity assessment and strabismus detection, this is an important finding as many Indigenous children with CI and reduced visual information processing may be missed. 
Emphasis should be placed on identifying children with CI and reduced visual information processing given the potential effect of these conditions on school performance.

  14. On the use of programmable hardware and reduced numerical precision in earth-system modeling.

    PubMed

    Düben, Peter D; Russell, Francis P; Niu, Xinyu; Luk, Wayne; Palmer, T N

    2015-09-01

    Programmable hardware, in particular Field Programmable Gate Arrays (FPGAs), promises a significant increase in computational performance for simulations in geophysical fluid dynamics compared with CPUs of similar power consumption. FPGAs allow adjusting the representation of floating-point numbers to specific application needs. We analyze the performance-precision trade-off on FPGA hardware for the two-scale Lorenz '95 model. We scale the size of this toy model to that of a high-performance computing application in order to make meaningful performance tests. We identify the minimal level of precision at which changes in model results are not significant compared with a maximal precision version of the model and find that this level is very similar for cases where the model is integrated for very short or long intervals. It is therefore a useful approach to investigate model errors due to rounding errors for very short simulations (e.g., 50 time steps) to obtain a range for the level of precision that can be used in expensive long-term simulations. We also show that an approach to reduce precision with increasing forecast time, when model errors are already accumulated, is very promising. We show that a speed-up of 1.9 times is possible in comparison to FPGA simulations in single precision if precision is reduced with no strong change in model error. The single-precision FPGA setup shows a speed-up of 2.8 times in comparison to our model implementation on two 6-core CPUs for large model setups.
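    The precision experiment's core idea can be sketched in a few lines: integrate a Lorenz '95 model in full double precision and in emulated reduced precision (here, rounding every state update through a 32-bit float), then compare the short-run trajectories. The paper uses the two-scale model on FPGA hardware with adjustable mantissa widths; this stdlib-only toy uses the one-scale model and only single precision, purely to illustrate how rounding error can be probed with short simulations.

```python
# Sketch: double- vs. emulated single-precision integration of Lorenz '96.
import struct

def to_f32(x):
    # round a Python float (binary64) to the nearest binary32 value
    return struct.unpack('f', struct.pack('f', x))[0]

def step(x, dt, F, reduce_precision=False):
    # one forward-Euler step of dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F
    n = len(x)
    dx = [(x[(i + 1) % n] - x[i - 2]) * x[i - 1] - x[i] + F for i in range(n)]
    new = [xi + dt * d for xi, d in zip(x, dx)]
    return [to_f32(v) for v in new] if reduce_precision else new

F, dt, nsteps = 8.0, 0.005, 50
x0 = [F] * 8
x0[0] += 0.01                       # small perturbation to start the dynamics
hi = lo = x0[:]
for _ in range(nsteps):
    hi = step(hi, dt, F)
    lo = step(lo, dt, F, reduce_precision=True)
err = max(abs(a - b) for a, b in zip(hi, lo))
print(err)  # rounding-induced divergence after a short run
```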

  15. The Error Structure of the SMAP Single and Dual Channel Soil Moisture Retrievals

    NASA Astrophysics Data System (ADS)

    Dong, Jianzhi; Crow, Wade T.; Bindlish, Rajat

    2018-01-01

    Knowledge of the temporal error structure for remotely sensed surface soil moisture retrievals can improve our ability to exploit them for hydrologic and climate studies. This study employs a triple collocation analysis to investigate both the total variance and temporal autocorrelation of errors in Soil Moisture Active and Passive (SMAP) products generated from two separate soil moisture retrieval algorithms, the vertically polarized brightness temperature-based single-channel algorithm (SCA-V, the current baseline SMAP algorithm) and the dual-channel algorithm (DCA). A key assumption made in SCA-V is that real-time vegetation opacity can be accurately captured using only a climatology for vegetation opacity. Results demonstrate that while SCA-V generally outperforms DCA, SCA-V can produce larger total errors when this assumption is significantly violated by interannual variability in vegetation health and biomass. Furthermore, larger autocorrelated errors in SCA-V retrievals are found in areas with relatively large vegetation opacity deviations from climatological expectations. This implies that a significant portion of the autocorrelated error in SCA-V is attributable to the violation of its vegetation opacity climatology assumption and suggests that utilizing a real (as opposed to climatological) vegetation opacity time series in the SCA-V algorithm would reduce the magnitude of autocorrelated soil moisture retrieval errors.
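    The covariance form of triple collocation that underlies this kind of analysis can be sketched on synthetic data: given three products sharing a common truth signal but carrying mutually independent errors, each product's error variance is recoverable from the pairwise covariances alone. The numbers below are illustrative, not SMAP data.

```python
# Sketch: covariance-based triple collocation on synthetic soil-moisture series.
import random

random.seed(42)
N = 20000
truth = [random.gauss(0.25, 0.08) for _ in range(N)]    # hidden truth signal
x = [t + random.gauss(0, 0.03) for t in truth]           # product 1
y = [t + random.gauss(0, 0.05) for t in truth]           # product 2
z = [t + random.gauss(0, 0.04) for t in truth]           # product 3

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / (len(a) - 1)

# TC estimate of product x's error variance (true value is 0.03**2 = 9e-4):
err_var_x = cov(x, x) - cov(x, y) * cov(x, z) / cov(y, z)
print(err_var_x)
```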

  16. Distortion of Digital Image Correlation (DIC) Displacements and Strains from Heat Waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, E. M. C.; Reu, P. L.

    2017-11-28

    “Heat waves” is a colloquial term used to describe convective currents in air formed when different objects in an area are at different temperatures. In the context of Digital Image Correlation (DIC) and other optical-based image processing techniques, imaging an object of interest through heat waves can significantly distort the apparent location and shape of the object. There are many potential heat sources in DIC experiments, including but not limited to lights, cameras, hot ovens, and sunlight, yet error caused by heat waves is often overlooked. This paper first briefly presents three practical situations in which heat waves contributed significant error to DIC measurements, to motivate the investigation of heat waves in more detail. Then the theoretical background of how light is refracted through heat waves is presented, and the effects of heat waves on displacements and strains computed from DIC are characterized in detail. Finally, different filtering methods are investigated to reduce the displacement and strain errors caused by imaging through heat waves. The overarching conclusions from this work are that errors caused by heat waves are significantly higher than typical noise floors for DIC measurements, and that the errors are difficult to filter because the temporal and spatial frequencies of the errors are in the same range as those of typical signals of interest. Eliminating or mitigating the effects of heat sources in a DIC experiment is therefore the best solution to minimizing errors caused by heat waves.

  18. Reduction in Hospital-Wide Clinical Laboratory Specimen Identification Errors following Process Interventions: A 10-Year Retrospective Observational Study

    PubMed Central

    Ning, Hsiao-Chen; Lin, Chia-Ni; Chiu, Daniel Tsun-Yee; Chang, Yung-Ta; Wen, Chiao-Ni; Peng, Shu-Yu; Chu, Tsung-Lan; Yu, Hsin-Ming; Wu, Tsu-Lan

    2016-01-01

    Background Accurate patient identification and specimen labeling at the time of collection are crucial steps in the prevention of medical errors, thereby improving patient safety. Methods All patient specimen identification errors that occurred in the outpatient department (OPD), emergency department (ED), and inpatient department (IPD) of a 3,800-bed academic medical center in Taiwan were documented and analyzed retrospectively from 2005 to 2014. To reduce such errors, the following series of strategies were implemented: a restrictive specimen acceptance policy for the ED and IPD in 2006; a computer-assisted barcode positive patient identification system for the ED and IPD in 2007 and 2010; and automated sample labeling combined with electronic identification systems introduced to the OPD in 2009. Results Of the 2,000,345 specimens collected in 2005, 1,023 (0.0511%) were identified as having patient identification errors, compared with 58 errors (0.0015%) among 3,761,238 specimens collected in 2014, after serial interventions; this represents a 97% relative reduction. The total numbers (rates) of institutional identification errors contributed by the ED, IPD, and OPD over the 10-year period were 423 (0.1058%), 556 (0.0587%), and 44 (0.0067%) errors before the interventions, and 3 (0.0007%), 52 (0.0045%), and 3 (0.0001%) after the interventions, representing relative reductions of 99%, 92%, and 98%, respectively. Conclusions Accurate patient identification is a challenge of patient safety in different health settings. The data collected in our study indicate that a restrictive specimen acceptance policy, computer-generated positive identification systems, and interdisciplinary cooperation can significantly reduce patient identification errors. PMID:27494020
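    The headline rates and the 97% relative reduction can be reproduced directly from the counts given in the abstract:

```python
# Checking the abstract's figures: error rate per specimen before and after
# the interventions, and the relative reduction.
before = 1023 / 2000345      # 2005
after  = 58 / 3761238        # 2014
relative_reduction = 1 - after / before
print(round(100 * before, 4),              # 0.0511 (%)
      round(100 * after, 4),               # 0.0015 (%)
      round(100 * relative_reduction))     # 97 (%)
```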

  19. Errors in clinical laboratories or errors in laboratory medicine?

    PubMed

    Plebani, Mario

    2006-01-01

    Laboratory testing is a highly complex process and, although laboratory services are relatively safe, they are not as safe as they could or should be. Clinical laboratories have long focused their attention on quality control methods and quality assessment programs dealing with analytical aspects of testing. However, a growing body of evidence accumulated in recent decades demonstrates that quality in clinical laboratories cannot be assured by merely focusing on purely analytical aspects. The more recent surveys on errors in laboratory medicine conclude that in the delivery of laboratory testing, mistakes occur more frequently before (pre-analytical) and after (post-analytical) the test has been performed. Most errors are due to pre-analytical factors (46-68.2% of total errors), while a high error rate (18.5-47% of total errors) has also been found in the post-analytical phase. Errors due to analytical problems have been significantly reduced over time, but there is evidence that, particularly for immunoassays, interference may have a serious impact on patients. A description of the most frequent and risky pre-, intra- and post-analytical errors and advice on practical steps for measuring and reducing the risk of errors is therefore given in the present paper. Many mistakes in the Total Testing Process are called "laboratory errors", although these may be due to poor communication, action taken by others involved in the testing process (e.g., physicians, nurses and phlebotomists), or poorly designed processes, all of which are beyond the laboratory's control. Likewise, there is evidence that laboratory information is only partially utilized. A recent document from the International Organization for Standardization (ISO) recommends a new, broader definition of the term "laboratory error" and a classification of errors according to different criteria. 
In a modern approach to total quality, centered on patients' needs and satisfaction, the risk of errors and mistakes in pre- and post-examination steps must be minimized to guarantee the total quality of laboratory services.

  20. pyAmpli: an amplicon-based variant filter pipeline for targeted resequencing data.

    PubMed

    Beyens, Matthias; Boeckx, Nele; Van Camp, Guy; Op de Beeck, Ken; Vandeweyer, Geert

    2017-12-14

    Haloplex targeted resequencing is a popular method to analyze both germline and somatic variants in gene panels. However, involved wet-lab procedures may introduce false positives that need to be considered in subsequent data-analysis. No variant filtering rationale addressing amplicon enrichment related systematic errors, in the form of an all-in-one package, exists to our knowledge. We present pyAmpli, a platform independent parallelized Python package that implements an amplicon-based germline and somatic variant filtering strategy for Haloplex data. pyAmpli can filter variants for systematic errors by user pre-defined criteria. We show that pyAmpli significantly increases specificity, without reducing sensitivity, essential for reporting true positive clinical relevant mutations in gene panel data. pyAmpli is an easy-to-use software tool which increases the true positive variant call rate in targeted resequencing data. It specifically reduces errors related to PCR-based enrichment of targeted regions.

  1. Computing in the presence of soft bit errors. [caused by single event upset on spacecraft

    NASA Technical Reports Server (NTRS)

    Rasmussen, R. D.

    1984-01-01

    It is shown that single-event upsets (SEUs) due to cosmic rays are a significant source of single-bit errors in spacecraft computers. The physical mechanism of SEU, electron-hole generation by means of Linear Energy Transfer (LET), is discussed with reference to the results of a study of the environmental effects on the computer systems of the Galileo spacecraft. Techniques for making software more tolerant of cosmic ray effects are considered, including: reducing the number of registers used by the software; continuity testing of variables; redundant execution of major procedures for error detection; and encoding state variables to detect single-bit changes. Attention is also given to design modifications which may reduce the cosmic ray exposure of on-board hardware. These modifications include: shielding components operating in LEO; removing low-power Schottky parts; and the use of CMOS diodes. The SEU parameters of different electronic components are listed in a table.
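    One of the software-hardening tactics listed above, encoding state variables to detect single-bit changes, can be sketched with a simple parity code (a toy illustration, not the Galileo flight software): each state value carries a parity bit, and any single flipped bit breaks the parity check.

```python
# Sketch: parity-encoded state variable that detects a single-bit upset.
def encode(value):
    # append an even-parity bit computed over the value's bits
    parity = bin(value).count("1") & 1
    return (value << 1) | parity

def check(word):
    # True if the stored parity is still consistent with the value
    value, parity = word >> 1, word & 1
    return (bin(value).count("1") & 1) == parity

state = encode(0b1011)        # protected state variable
upset = state ^ (1 << 2)      # simulate a cosmic-ray single-bit flip
print(check(state), check(upset))  # the upset fails the check
```

    Parity detects (but cannot locate or correct) a single-bit change; correcting codes such as Hamming codes extend the same idea.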

  2. Testing of a novel pin array guide for accurate three-dimensional glenoid component positioning.

    PubMed

    Lewis, Gregory S; Stevens, Nicole M; Armstrong, April D

    2015-12-01

    A substantial challenge in total shoulder replacement is accurate positioning and alignment of the glenoid component. This challenge arises from limited intraoperative exposure and complex arthritic-driven deformity. We describe a novel pin array guide and method for patient-specific guiding of the glenoid central drill hole. We also experimentally tested the hypothesis that this method would reduce errors in version and inclination compared with 2 traditional methods. Polymer models of glenoids were created from computed tomography scans from 9 arthritic patients. Each 3-dimensional (3D) printed scapula was shrouded to simulate the operative situation. Three different methods for central drill alignment were tested, all with the target orientation of 5° retroversion and 0° inclination: no assistance, assistance by preoperative 3D imaging, and assistance by the pin array guide. Version and inclination errors of the drill line were compared. Version errors using the pin array guide (3° ± 2°) were significantly lower than version errors associated with no assistance (9° ± 7°) and preoperative 3D imaging (8° ± 6°). Inclination errors were also significantly lower using the pin array guide compared with no assistance. The new pin array guide substantially reduced errors in orientation of the central drill line. The guide method is patient specific but does not require rapid prototyping and instead uses adjustments to an array of pins based on automated software calculations. This method may ultimately provide a cost-effective solution enabling surgeons to obtain accurate orientation of the glenoid. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  3. System of error detection in the manufacture of garments using artificial vision

    NASA Astrophysics Data System (ADS)

    Moreno, J. J.; Aguila, A.; Partida, E.; Martinez, C. L.; Morales, O.; Tejeida, R.

    2017-12-01

    A computer vision system is implemented to detect errors in the cutting stage of the garment manufacturing process in the textile industry. It provides a solution for errors that cannot easily be detected by employees and significantly increases the speed of quality review. In the textile industry, as in many others, quality control of manufactured products is required, and over the years it has been carried out manually by means of visual inspection by employees. The objective of this project is therefore to design a quality control system that uses computer vision to identify errors in the cutting stage of the garment manufacturing process, increasing the productivity of textile processes by reducing costs.

  4. Global adjustment for creating extended panoramic images in video-dermoscopy

    NASA Astrophysics Data System (ADS)

    Faraz, Khuram; Blondel, Walter; Daul, Christian

    2017-07-01

    This contribution presents a fast global adjustment scheme that exploits SURF descriptor locations for constructing large skin mosaics. The scheme preserves precision in pairwise image registration while significantly reducing the global mosaicing error.

  5. Effect of Pointing Error on the BER Performance of an Optical CDMA FSO Link with SIK Receiver

    NASA Astrophysics Data System (ADS)

    Nazrul Islam, A. K. M.; Majumder, S. P.

    2017-12-01

    An analytical approach is presented for an optical code division multiple access (OCDMA) system over a free space optical (FSO) channel, considering the effect of pointing error between the transmitter and the receiver. Analysis is carried out for an optical sequence inverse keying (SIK) correlator receiver with intensity modulation and direct detection (IM/DD) to find the bit error rate (BER) in the presence of pointing error. The results are evaluated numerically in terms of signal-to-noise plus multi-access interference (MAI) ratio, BER, and power penalty due to pointing error. It is found that the OCDMA FSO system is highly affected by pointing error, with significant power penalty at BERs of 10^-6 and 10^-9. For example, the penalty at a BER of 10^-9 is found to be 9 dB at a normalized pointing error of 1.4 for 16 users with a processing gain of 256, and is reduced to 6.9 dB when the processing gain is increased to 1,024.
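    A generic sketch of the power-penalty concept (not the paper's SIK/MAI analysis): with the Gaussian Q-function BER model BER = Q(sqrt(SNR)), find the SNR required for a target BER, and convert a hypothetical pointing-induced signal loss into a dB penalty. The 0.5 loss factor below is an assumed value for illustration.

```python
# Sketch: required SNR at a target BER and the dB penalty of a power loss.
import math

def q(x):
    # Gaussian tail probability Q(x)
    return 0.5 * math.erfc(x / math.sqrt(2))

def snr_for_ber(target, lo=0.0, hi=100.0):
    # bisection: BER = Q(sqrt(SNR)) decreases monotonically with SNR
    for _ in range(200):
        mid = (lo + hi) / 2
        if q(math.sqrt(mid)) > target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

snr_req = snr_for_ber(1e-9)          # SNR needed for BER = 10^-9
pointing_loss = 0.5                  # hypothetical pointing-induced power loss
penalty_db = 10 * math.log10(1 / pointing_loss)  # extra SNR to restore the BER
print(round(math.sqrt(snr_req), 2), round(penalty_db, 1))
```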

  6. An error-based micro-sensor capture system for real-time motion estimation

    NASA Astrophysics Data System (ADS)

    Yang, Lin; Ye, Shiwei; Wang, Zhibo; Huang, Zhipei; Wu, Jiankang; Kong, Yongmei; Zhang, Li

    2017-10-01

    A wearable micro-sensor motion capture system with 16 IMUs and an error-compensatory complementary filter algorithm for real-time motion estimation has been developed to acquire accurate 3D orientation and displacement in real-life activities. In the proposed filter algorithm, the gyroscope bias error, orientation error, and magnetic disturbance error are estimated and compensated, significantly reducing the orientation estimation error due to sensor noise and drift. Displacement estimation, especially for activities such as jumping, has been a challenge in micro-sensor motion capture. An adaptive gait-phase detection algorithm has been developed to support accurate displacement estimation across different types of activities. The performance of the system is benchmarked against the VICON optical capture system. The experimental results have demonstrated the effectiveness of the system in tracking daily activities, with estimation errors of 0.16 ± 0.06 m for normal walking and 0.13 ± 0.11 m for jumping motions. Research supported by the National Natural Science Foundation of China (Nos. 61431017, 81272166).
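    The complementary-filter principle the system builds on can be sketched in 1D (the paper's filter additionally estimates gyro bias and magnetic disturbance, which this toy omits): the drift-prone gyro integral is blended with the noisy but unbiased accelerometer tilt angle. All numbers below are illustrative.

```python
# Sketch: 1D complementary filter fusing a biased gyro with a noisy inclinometer.
import random

random.seed(0)
dt, alpha = 0.01, 0.98          # sample period (s), blend factor
true_angle, gyro_bias = 30.0, 0.5   # degrees, deg/s

est = 0.0
for _ in range(2000):           # 20 s of samples; true angular rate is zero
    gyro = gyro_bias + random.gauss(0, 0.2)          # drifting rate sensor
    accel_angle = true_angle + random.gauss(0, 2.0)  # noisy, unbiased tilt
    # trust the gyro at high frequency, the accelerometer at low frequency:
    est = alpha * (est + gyro * dt) + (1 - alpha) * accel_angle
print(round(est, 1))  # settles near the true 30 degrees despite the gyro bias
```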

  7. Understanding diagnostic errors in medicine: a lesson from aviation

    PubMed Central

    Singh, H; Petersen, L A; Thomas, E J

    2006-01-01

    The impact of diagnostic errors on patient safety in medicine is increasingly being recognized. Despite the current progress in patient safety research, the understanding of such errors and how to prevent them is inadequate. Preliminary research suggests that diagnostic errors have both cognitive and systems origins. Situational awareness is a model that is primarily used in aviation human factors research that can encompass both the cognitive and the systems roots of such errors. This conceptual model offers a unique perspective in the study of diagnostic errors. The applicability of this model is illustrated by the analysis of a patient whose diagnosis of spinal cord compression was substantially delayed. We suggest how the application of this framework could lead to potential areas of intervention and outline some areas of future research. It is possible that the use of such a model in medicine could help reduce errors in diagnosis and lead to significant improvements in patient care. Further research is needed, including the measurement of situational awareness and correlation with health outcomes. PMID:16751463

  8. Self-Interaction Error in Density Functional Theory: An Appraisal.

    PubMed

    Bao, Junwei Lucas; Gagliardi, Laura; Truhlar, Donald G

    2018-05-03

    Self-interaction error (SIE) is considered to be one of the major sources of error in most approximate exchange-correlation functionals for Kohn-Sham density-functional theory (KS-DFT), and it is large with all local exchange-correlation functionals and with some hybrid functionals. In this work, we consider systems conventionally considered to be dominated by SIE. For these systems, we demonstrate that by using multiconfiguration pair-density functional theory (MC-PDFT), the error of a translated local density-functional approximation is significantly reduced (by a factor of 3) when using an MCSCF density and on-top density, as compared to using KS-DFT with the parent functional; the error in MC-PDFT with local on-top functionals is even lower than the error in some popular KS-DFT hybrid functionals. Density-functional theory, either in MC-PDFT form with local on-top functionals or in KS-DFT form with some functionals having 50% or more nonlocal exchange, has smaller errors for SIE-prone systems than does CASSCF, which has no SIE.

  9. Increasing the statistical significance of entanglement detection in experiments.

    PubMed

    Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei

    2010-05-28

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequality for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.
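    The paper's central point can be made with toy numbers (hypothetical values, Gaussian error model): the statistical significance of a violation is the violation divided by its standard error, so a smaller violation with a much smaller error bar can be the stronger entanglement test.

```python
# Sketch: significance = violation / standard error for two hypothetical tests.
tests = {
    "inequality_A": {"violation": 1.90, "stderr": 0.40},  # larger violation
    "inequality_B": {"violation": 1.20, "stderr": 0.15},  # smaller error bar
}
sig = {name: t["violation"] / t["stderr"] for name, t in tests.items()}
print(sig)  # the smaller violation is the more significant one here
```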

  10. Impact of time-of-flight PET on quantification errors in MR imaging-based attenuation correction.

    PubMed

    Mehranian, Abolfazl; Zaidi, Habib

    2015-04-01

    Time-of-flight (TOF) PET/MR imaging is an emerging imaging technology with great capabilities offered by TOF to improve image quality and lesion detectability. We assessed, for the first time, the impact of TOF image reconstruction on PET quantification errors induced by MR imaging-based attenuation correction (MRAC) using simulation and clinical PET/CT studies. Standard 4-class attenuation maps were derived by segmentation of CT images of 27 patients undergoing PET/CT examinations into background air, lung, soft-tissue, and fat tissue classes, followed by the assignment of predefined attenuation coefficients to each class. For each patient, 4 PET images were reconstructed: non-TOF and TOF both corrected for attenuation using reference CT-based attenuation correction and the resulting 4-class MRAC maps. The relative errors between non-TOF and TOF MRAC reconstructions were compared with their reference CT-based attenuation correction reconstructions. The bias was locally and globally evaluated using volumes of interest (VOIs) defined on lesions and normal tissues and CT-derived tissue classes containing all voxels in a given tissue, respectively. The impact of TOF on reducing the errors induced by metal-susceptibility and respiratory-phase mismatch artifacts was also evaluated using clinical and simulation studies. Our results show that TOF PET can remarkably reduce attenuation correction artifacts and quantification errors in the lungs and bone tissues. Using classwise analysis, it was found that the non-TOF MRAC method results in an error of -3.4% ± 11.5% in the lungs and -21.8% ± 2.9% in bones, whereas its TOF counterpart reduced the errors to -2.9% ± 7.1% and -15.3% ± 2.3%, respectively. The VOI-based analysis revealed that the non-TOF and TOF methods resulted in an average overestimation of 7.5% and 3.9% in or near lung lesions (n = 23) and underestimation of less than 5% for soft tissue and in or near bone lesions (n = 91). 
    Simulation results showed that as TOF resolution improves, artifacts and quantification errors are substantially reduced. TOF PET substantially reduces artifacts and significantly improves the quantitative accuracy of standard MRAC methods. Therefore, MRAC should be less of a concern on future TOF PET/MR scanners with improved timing resolution. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
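    The construction of the 4-class attenuation map the study evaluates can be sketched as a simple lookup: each segmented tissue class is assigned a predefined linear attenuation coefficient at 511 keV. The coefficients below are illustrative values in the commonly used range, not necessarily the ones used in this study.

```python
# Sketch: build a class-wise attenuation map from a tissue segmentation.
MU_511KEV = {        # approximate linear attenuation coefficients, 1/cm
    "air":  0.0,
    "lung": 0.024,
    "fat":  0.086,
    "soft": 0.096,
}

segmentation = [["air", "lung", "soft"],   # toy 2x3 label image
                ["air", "fat",  "soft"]]

mu_map = [[MU_511KEV[label] for label in row] for row in segmentation]
print(mu_map)
```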

  11. Patient safety in the clinical laboratory: a longitudinal analysis of specimen identification errors.

    PubMed

    Wagar, Elizabeth A; Tamashiro, Lorraine; Yasin, Bushra; Hilborne, Lee; Bruckner, David A

    2006-11-01

    Patient safety is an increasingly visible and important mission for clinical laboratories. Accreditation and regulatory organizations are paying increasing attention to improving processes related to patient identification and specimen labeling, because errors in these areas jeopardize patient safety yet are common and avoidable through improvement in the total testing process. To assess patient identification and specimen labeling improvement after multiple implementation projects using longitudinal statistical tools. Specimen errors were categorized by a multidisciplinary health care team. Patient identification errors were grouped into 3 categories: (1) specimen/requisition mismatch, (2) unlabeled specimens, and (3) mislabeled specimens. Specimens with these types of identification errors were compared preimplementation and postimplementation for 3 patient safety projects: (1) reorganization of phlebotomy (4 months); (2) introduction of an electronic event reporting system (10 months); and (3) activation of an automated processing system (14 months) for a 24-month period, using trend analysis and Student t test statistics. Of 16,632 total specimen errors, mislabeled specimens, requisition mismatches, and unlabeled specimens represented 1.0%, 6.3%, and 4.6% of errors, respectively. The Student t test showed a significant decrease in the most serious error, mislabeled specimens (P < .001), compared with the period before implementation of the 3 patient safety projects. Trend analysis demonstrated decreases in all 3 error types for 26 months. Applying performance-improvement strategies that focus longitudinally on specimen labeling errors can significantly reduce errors, thereby improving patient safety. This is an important area in which laboratory professionals, working in interdisciplinary teams, can improve safety and outcomes of care.
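    The before/after comparison above rests on a two-sample Student t test. A minimal sketch with stdlib tools only; the monthly error rates below are invented for illustration, not the study's data:

```python
import math
from statistics import mean, stdev

def two_sample_t(a, b):
    """Pooled-variance (equal-variance) two-sample Student t statistic."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical monthly mislabeled-specimen rates (errors per 1,000 specimens)
pre  = [1.9, 2.1, 1.8, 2.0, 2.2, 1.7]   # before the three safety projects
post = [0.9, 1.1, 0.8, 1.0, 0.7, 0.9]   # after implementation
print(round(two_sample_t(pre, post), 2))
```

A statistics package would also supply the p-value; the statistic alone shows how the pre/post contrast is quantified.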

  12. Bullying among nursing staff: relationship with psychological/behavioral responses of nurses and medical errors.

    PubMed

    Wright, Whitney; Khatri, Naresh

    2015-01-01

    The aim of this article is to examine the relationship between three types of bullying (person-related, work-related, and physically intimidating) and two types of outcomes (psychological/behavioral responses of nurses and medical errors). In addition, it investigates whether the three types of bullying behaviors vary with age or gender of nurses and whether the extent of bullying varies across different facilities in an institution. Nurses play an integral role in achieving safe and effective health care. To ensure nurses are functioning at their optimal level, health care organizations need to reduce negative components that impact nurses' job performance and their mental and physical health. Mitigating bullying in the workplace may be necessary to create and maintain a high-performing, caring, and safe hospital culture. Using an internal e-mail system, an e-mail requesting the participants to complete the questionnaire on Survey Monkey was sent to a sample of 1,078 nurses employed across three facilities at a university hospital system in the Midwest. Two hundred forty-one completed questionnaires were received, a response rate of 23%. Bullying was measured utilizing the Negative Acts Questionnaire-Revised (NAQ-R). Outcomes (psychological/behavioral responses of nurses and medical errors) were measured using Rosenstein and O'Daniel's (2008) modified scales. Person-related bullying showed significant positive relationships with psychological/behavioral responses and medical errors. Work-related bullying showed a significant positive relationship with psychological/behavioral responses, but not with medical errors. Physically intimidating bullying did not show a significant relationship to either outcome. Whereas person-related bullying was found to be negatively associated with age of nurses, physically intimidating bullying was positively associated with age. Male nurses experienced higher work-related bullying than female nurses. 
Findings from this study suggest that bullying behaviors exist and affect psychological/behavioral responses of nurses such as stress and anxiety and medical errors. Health care organizations should identify bullying behaviors and implement bullying prevention strategies to reduce those behaviors and the adverse effects that they may have on psychological/behavioral responses of nurses and medical errors.

  13. Vibration Noise Modeling for Measurement While Drilling System Based on FOGs

    PubMed Central

    Zhang, Chunxi; Wang, Lu; Gao, Shuang; Lin, Tie; Li, Xianmu

    2017-01-01

    To improve the long-term survey accuracy of Measurement While Drilling (MWD) systems based on Fiber Optic Gyroscopes (FOGs), external aiding sources are fused into the inertial navigation solution by the Kalman filter (KF) method. The KF method requires a model of the inertial sensors’ noise as the system noise model. Conventionally, the system noise is modeled as white Gaussian noise. However, because of vibration while drilling, the gyro noise is no longer white Gaussian, and an incorrect noise model degrades the accuracy of the KF. This paper develops a new approach to noise modeling on the basis of the dynamic Allan variance (DAVAR). In contrast to conventional white noise models, the new noise model contains both white noise and colored noise. With this new noise model, the KF for the MWD system was designed. Finally, two vibration experiments were performed. Experimental results showed that the proposed vibration noise modeling approach significantly improved the estimated accuracies of the inertial sensor drifts. Comparing navigation results based on the different noise models, the DAVAR noise model reduces the position error and the toolface angle error by more than 90%, the velocity error by more than 65%, and the azimuth error by more than 50%. PMID:29039815

  14. Vibration Noise Modeling for Measurement While Drilling System Based on FOGs.

    PubMed

    Zhang, Chunxi; Wang, Lu; Gao, Shuang; Lin, Tie; Li, Xianmu

    2017-10-17

    To improve the long-term survey accuracy of Measurement While Drilling (MWD) systems based on Fiber Optic Gyroscopes (FOGs), external aiding sources are fused into the inertial navigation solution by the Kalman filter (KF) method. The KF method requires a model of the inertial sensors' noise as the system noise model. Conventionally, the system noise is modeled as white Gaussian noise. However, because of vibration while drilling, the gyro noise is no longer white Gaussian, and an incorrect noise model degrades the accuracy of the KF. This paper develops a new approach to noise modeling on the basis of the dynamic Allan variance (DAVAR). In contrast to conventional white noise models, the new noise model contains both white noise and colored noise. With this new noise model, the KF for the MWD system was designed. Finally, two vibration experiments were performed. Experimental results showed that the proposed vibration noise modeling approach significantly improved the estimated accuracies of the inertial sensor drifts. Comparing navigation results based on the different noise models, the DAVAR noise model reduces the position error and the toolface angle error by more than 90%, the velocity error by more than 65%, and the azimuth error by more than 50%.
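    The core of the DAVAR idea is an Allan variance recomputed over a sliding window, so the noise signature can change with time (for instance, when drilling vibration begins). A rough NumPy sketch under that interpretation; the window length and cluster sizes are illustrative, not the paper's values:

```python
import numpy as np

def allan_variance(y, m):
    """Overlapping Allan variance of rate samples y at cluster size m."""
    n = len(y) - 2 * m + 1                 # number of adjacent-cluster pairs
    csum = np.cumsum(np.insert(y, 0, 0.0))
    avg = (csum[m:] - csum[:-m]) / m       # all overlapping m-sample means
    d = avg[m:m + n] - avg[:n]             # differences of adjacent clusters
    return 0.5 * np.mean(d ** 2)

def dynamic_allan_variance(y, window, m_list):
    """DAVAR: Allan variance over a sliding window (50% overlap), giving a
    time-varying noise signature that a KF noise model could track."""
    rows = []
    for start in range(0, len(y) - window + 1, window // 2):
        seg = y[start:start + window]
        rows.append([allan_variance(seg, m) for m in m_list])
    return np.array(rows)
```

For pure white noise of variance sigma^2, the Allan variance falls off as sigma^2/m; departures from that slope in a window reveal the colored-noise component the paper folds into the KF model.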

  15. Effects of Correlated Errors on the Analysis of Space Geodetic Data

    NASA Technical Reports Server (NTRS)

    Romero-Wolf, Andres; Jacobs, C. S.

    2011-01-01

    As thermal errors are reduced, correlated instrumental and troposphere errors will become increasingly important. Work in progress shows that troposphere covariance error models improve data analysis results. We expect to see stronger effects with higher data rates. Temperature modeling of delay errors may further reduce temporal correlations in the data.

  16. Round-off errors in cutting plane algorithms based on the revised simplex procedure

    NASA Technical Reports Server (NTRS)

    Moore, J. E.

    1973-01-01

    This report statistically analyzes computational round-off errors associated with the cutting plane approach to solving linear integer programming problems. Cutting plane methods require that the inverses of a sequence of matrices be computed. The problem essentially reduces to one of minimizing round-off errors in the sequence of inverses. Two procedures for minimizing this problem are presented, and their influence on error accumulation is statistically analyzed. One procedure employs a very small tolerance factor to round computed values to zero. The other procedure is a numerical analysis technique for reinverting, or improving the approximate inverse of, a matrix. The results indicated that round-off accumulation can be effectively minimized by employing a tolerance factor which reflects the number of significant digits carried for each calculation and by applying the reinversion procedure once to each computed inverse. If 18 significant digits plus an exponent are carried for each variable during computations, then a tolerance value of 0.1 × 10⁻¹² is reasonable.
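    The report does not spell out its reinversion technique; a standard numerical-analysis choice for improving an approximate inverse is one step of Newton-Schulz iteration, which roughly squares the residual ||I - AX||. A sketch of both devices (tolerance rounding and reinversion) with NumPy, offered as an illustration rather than the report's actual procedure:

```python
import numpy as np

def round_to_zero(A, tol=0.1e-12):
    """Round entries below the tolerance to zero (the report's first device)."""
    A = A.copy()
    A[np.abs(A) < tol] = 0.0
    return A

def refine_inverse(A, X):
    """One reinversion step via Newton-Schulz: X <- X(2I - AX).

    Each application squares the residual matrix I - AX, so a single pass
    per computed inverse, as the report recommends, is usually enough.
    """
    I = np.eye(A.shape[0])
    return X @ (2 * I - A @ X)
```

Applying `round_to_zero` after each pivot and `refine_inverse` once per inverse mirrors the two error-control procedures analyzed in the report.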

  17. Simultaneous localization and calibration for electromagnetic tracking systems.

    PubMed

    Sadjadi, Hossein; Hashtrudi-Zaad, Keyvan; Fichtinger, Gabor

    2016-06-01

    In clinical environments, field distortion can cause significant electromagnetic (EM) tracking errors. Therefore, dynamic calibration of EM tracking systems is essential to compensate for measurement errors. It is proposed to integrate the motion model of the tracked instrument with redundant EM sensor observations and to apply a simultaneous localization and mapping algorithm in order to accurately estimate the pose of the instrument and create a map of the field distortion in real-time. Experiments were conducted in the presence of ferromagnetic and electrically-conductive field-distorting objects, and the results were compared with those of a conventional sensor fusion approach. The proposed method reduced the tracking error from 3.94±1.61 mm to 1.82±0.62 mm in the presence of steel, and from 0.31±0.22 mm to 0.11±0.14 mm in the presence of aluminum. With reduced tracking error and independence from external tracking devices or pre-operative calibrations, the approach is promising for reliable EM navigation in various clinical procedures. Copyright © 2015 John Wiley & Sons, Ltd.

  18. Computerized pharmaceutical intervention to reduce reconciliation errors at hospital discharge in Spain: an interrupted time-series study.

    PubMed

    García-Molina Sáez, C; Urbieta Sanz, E; Madrigal de Torres, M; Vicente Vera, T; Pérez Cárceles, M D

    2016-04-01

    It is well known that medication reconciliation at discharge is a key strategy to ensure proper drug prescription and the effectiveness and safety of any treatment. Different types of interventions to reduce reconciliation errors at discharge have been tested, many of which are based on the use of electronic tools, as these are useful to optimize the medication reconciliation process. However, not all countries are progressing at the same speed in this task and not all tools are equally effective. It is therefore important to collate updated country-specific data in order to identify possible strategies for improvement in each particular region. Our aim was therefore to analyse the effectiveness of a computerized pharmaceutical intervention to reduce reconciliation errors at discharge in Spain. A quasi-experimental interrupted time-series study was carried out in the cardio-pneumology unit of a general hospital from February to April 2013. The study consisted of three phases: pre-intervention, intervention and post-intervention, each involving 23 days of observations. During the intervention period, a pharmacist was included in the medical team and entered the patient's pre-admission medication in a computerized tool integrated into the electronic clinical history of the patient. The effectiveness was evaluated by the differences between the mean percentages of reconciliation errors in each period, using a Mann-Whitney U test with Bonferroni correction, after eliminating autocorrelation of the data by means of an ARIMA analysis. In addition, the types of error identified and their potential seriousness were analysed. A total of 321 patients (119, 105 and 97 in each phase, respectively) were included in the study. For the 3966 medications recorded, 1087 reconciliation errors were identified in 77·9% of the patients. 
The mean percentage of reconciliation errors per patient in the first period of the study was 42·18%, falling to 19·82% during the intervention period (P = 0·000). When the intervention was withdrawn, the mean percentage of reconciliation errors increased again to 27·72% (P = 0·008). The difference between the percentages of the pre- and post-intervention periods was statistically significant (P = 0·000). Most reconciliation errors were due to omission (46·7%) or incomplete prescription (43·8%), and 35·3% of them could have caused harm to the patient. A computerized pharmaceutical intervention is shown to reduce reconciliation errors in the context of a high incidence of such errors. © 2016 John Wiley & Sons Ltd.
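    The study's test statistic can be illustrated with a bare-bones Mann-Whitney U and a Bonferroni-adjusted significance level. This is a sketch, not the authors' code, and the daily error percentages are invented; a real analysis would use a statistics package that also supplies the p-value:

```python
def mann_whitney_u(a, b):
    """U statistic for sample a: count pairs (a_i, b_j) with a_i > b_j; ties 0.5."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0 for x in a for y in b)

def bonferroni_alpha(alpha, n_comparisons):
    """Adjusted per-comparison significance level."""
    return alpha / n_comparisons

# Hypothetical daily reconciliation-error percentages in two periods
pre = [40.1, 45.3, 41.8, 43.0]
during = [18.5, 21.0, 19.9, 20.2]
u = mann_whitney_u(pre, during)     # 16.0: every pre value exceeds every during value
alpha = bonferroni_alpha(0.05, 3)   # three pairwise period comparisons
```

A U equal to the product of the sample sizes, as here, is the most extreme separation the test can register.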

  19. Standardized sign-out reduces intern perception of medical errors on the general internal medicine ward.

    PubMed

    Salerno, Stephen M; Arnett, Michael V; Domanski, Jeremy P

    2009-01-01

    Prior research on reducing variation in housestaff handoff procedures has depended on proprietary checkout software. The use of low-technology standardization techniques has not been widely studied. We wished to determine whether standardizing the process of intern sign-out using low-technology sign-out tools could reduce the perception of errors and missing handoff data. We conducted a pre-post prospective study of a cohort of 34 interns on a general internal medicine ward. Night interns coming off duty and day interns reassuming care were surveyed on their perception of erroneous sign-out data, mistakes made by the night intern overnight, and occurrences unanticipated by sign-out. Trainee satisfaction with the sign-out process was assessed with a 5-point Likert survey. There were 399 intern surveys performed 8 weeks before and 6 weeks after the introduction of a standardized sign-out form. The response rate was 95% for the night interns and 70% for the interns reassuming care in the morning. After the standardized form was introduced, night interns were significantly (p < .003) less likely to detect missing sign-out data, including missing important diseases, contingency plans, or medications. Standardized sign-out did not significantly alter the frequency of dropped tasks or missed lab and X-ray data as perceived by the night intern. However, the day teams perceived significantly fewer errors on the part of the night intern (p = .001) after introduction of the standardized sign-out sheet. There was no difference in mean Likert scores of resident satisfaction with sign-out before and after the intervention. Standardized written sign-out sheets significantly improve the completeness and effectiveness of handoffs between night and day interns. Further research is needed to determine whether these process improvements are related to better patient outcomes.

  20. ERROR REDUCTION IN DUCT LEAKAGE TESTING THROUGH DATA CROSS-CHECKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ANDREWS, J.W.

    1998-12-31

    One way to reduce uncertainty in scientific measurement is to devise a protocol in which more quantities are measured than are absolutely required, so that the result is overconstrained. This report develops a method for so combining data from two different tests for air leakage in residential duct systems. An algorithm, which depends on the uncertainty estimates for the measured quantities, optimizes the use of the excess data. In many cases it can significantly reduce the error bar on at least one of the two measured duct leakage rates (supply or return), and it provides a rational method of reconciling any conflicting results from the two leakage tests.
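    The report's algorithm is specific to the duct-leakage test pair, but the underlying idea of exploiting overconstrained data can be illustrated by the simplest case: two measurements of the same quantity combined by inverse-variance weighting, which always yields an error bar no larger than the best single measurement. A generic sketch with invented numbers:

```python
def combine_measurements(values, sigmas):
    """Inverse-variance weighted estimate and its standard error."""
    weights = [1.0 / s ** 2 for s in sigmas]
    estimate = sum(v * w for v, w in zip(values, weights)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return estimate, sigma

# Two hypothetical supply-leakage results (cfm) from the two test methods
est, sig = combine_measurements([100.0, 120.0], [10.0, 20.0])
# est = 104.0 cfm, sig ~ 8.94 cfm: tighter than either input error bar
```

The weighted estimate also makes any conflict between the two tests explicit, since a residual much larger than the combined sigma signals inconsistent inputs.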

  1. Skills, rules and knowledge in aircraft maintenance: errors in context

    NASA Technical Reports Server (NTRS)

    Hobbs, Alan; Williamson, Ann

    2002-01-01

    Automatic or skill-based behaviour is generally considered to be less prone to error than behaviour directed by conscious control. However, researchers who have applied Rasmussen's skill-rule-knowledge human error framework to accidents and incidents have sometimes found that skill-based errors appear in significant numbers. It is proposed that this is largely a reflection of the opportunities for error which workplaces present and does not indicate that skill-based behaviour is intrinsically unreliable. In the current study, 99 errors reported by 72 aircraft mechanics were examined in the light of a task analysis based on observations of the work of 25 aircraft mechanics. The task analysis identified the opportunities for error presented at various stages of maintenance work packages and by the job as a whole. Once the frequency of each error type was normalized in terms of the opportunities for error, it became apparent that skill-based performance is more reliable than rule-based performance, which is in turn more reliable than knowledge-based performance. The results reinforce the belief that industrial safety interventions designed to reduce errors would best be directed at those aspects of jobs that involve rule- and knowledge-based performance.

  2. Medication Incidents Involving Antiepileptic Drugs in Canadian Hospitals: A Multi-Incident Analysis.

    PubMed

    Cheng, Roger; Yang, Yu Daisy; Chan, Matthew; Patel, Tejal

    2017-01-01

    Medication errors involving antiepileptic drugs (AEDs) are not well studied but have the potential to cause significant harm. We investigated the occurrence of medication incidents involving AEDs in Canadian hospitals, along with their severity and contributing factors, by analyzing data from two national databases. Our multi-incident analysis revealed that while medication errors were rarely fatal, errors do occur and some are serious. Medication incidents were most commonly caused by dose omissions, incorrect doses or dosing frequencies, and administration of the wrong AED. Our analysis could augment quality-improvement initiatives by medication safety administrators to reduce AED medication incidents in hospitals.

  3. The effect of the Earth's oblate spheroid shape on the accuracy of a time-of-arrival lightning ground strike locating system

    NASA Technical Reports Server (NTRS)

    Casper, Paul W.; Bent, Rodney B.

    1991-01-01

    The algorithm used in earlier time-of-arrival lightning mapping systems was based on the assumption that the earth is a perfect sphere. These systems yield highly accurate lightning locations, which is their major strength. However, extensive analysis of tower strike data has revealed occasionally significant (one to two kilometer) systematic offset errors which are not explained by the usual error sources. It was determined that these systematic errors are dramatically reduced (in some cases) when the oblate shape of the earth is taken into account. The oblate spheroid correction algorithm and a case example are presented.

  4. Enhanced orbit determination filter sensitivity analysis: Error budget development

    NASA Technical Reports Server (NTRS)

    Estefan, J. A.; Burkhart, P. D.

    1994-01-01

    An error budget analysis is presented which quantifies the effects of different error sources in the orbit determination process when the recently developed enhanced orbit determination filter is used to reduce radio metric data. The enhanced filter strategy differs from more traditional filtering methods in that nearly all of the principal ground system calibration errors affecting the data are represented as filter parameters. Error budget computations were performed for a Mars Observer interplanetary cruise scenario for cases in which only X-band (8.4-GHz) Doppler data were used to determine the spacecraft's orbit, X-band ranging data were used exclusively, and a combined set in which the ranging data were used in addition to the Doppler data. In all three cases, the filter model was assumed to be a correct representation of the physical world. Random nongravitational accelerations were found to be the largest source of error contributing to the individual error budgets. Other significant contributors, depending on the data strategy used, were solar-radiation pressure coefficient uncertainty, random earth-orientation calibration errors, and Deep Space Network (DSN) station location uncertainty.

  5. Aliasing errors in measurements of beam position and ellipticity

    NASA Astrophysics Data System (ADS)

    Ekdahl, Carl

    2005-09-01

    Beam position monitors (BPMs) are used in accelerators and ion experiments to measure currents, position, and azimuthal asymmetry. These usually consist of discrete arrays of electromagnetic field detectors, with detectors located at several equally spaced azimuthal positions at the beam tube wall. The discrete nature of these arrays introduces systematic errors into the data, independent of uncertainties resulting from signal noise, lack of recording dynamic range, etc. Computer simulations were used to understand and quantify these aliasing errors. If required, aliasing errors can be significantly reduced by employing more than the usual four detectors in the BPMs. These simulations show that the error in measurements of the centroid position of a large beam is indistinguishable from the error in the position of a filament. The simulations also show that aliasing errors in the measurement of beam ellipticity are very large unless the beam is accurately centered. The simulations were used to quantify the aliasing errors in beam parameter measurements during early experiments on the DARHT-II accelerator, demonstrating that they affected the measurements only slightly, if at all.
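    The aliasing effect described above is easy to reproduce numerically. The sketch below assumes a pencil beam in a round pipe, the standard multipole expansion of the wall image current, and a first-harmonic (dipole) position estimate; these are textbook idealizations, not the DARHT-II simulation itself. A 4-detector array shows a clear error for an off-center beam that a 16-detector array does not:

```python
import numpy as np

def wall_signals(r, phi, n_det, R=1.0, n_terms=30):
    """Image-current signal at n_det equally spaced wall detectors for a
    pencil beam at polar position (r, phi) inside a round pipe of radius R."""
    theta = 2 * np.pi * np.arange(n_det) / n_det
    s = np.ones(n_det)
    for n in range(1, n_terms + 1):
        s += 2 * (r / R) ** n * np.cos(n * (theta - phi))
    return theta, s

def estimated_position(r, phi, n_det, R=1.0):
    """First-harmonic position estimate from the discrete detector array;
    its deviation from the true position is the aliasing error."""
    theta, s = wall_signals(r, phi, n_det, R)
    x_est = R * np.sum(s * np.cos(theta)) / np.sum(s)
    y_est = R * np.sum(s * np.sin(theta)) / np.sum(s)
    return x_est, y_est
```

With 4 detectors the n = 3 multipole aliases onto the dipole term, so the error scales as (r/R)^3; adding detectors pushes the first aliased harmonic to higher order, which is why extra detectors suppress the systematic error.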

  6. Impact of Educational Activities in Reducing Pre-Analytical Laboratory Errors

    PubMed Central

    Al-Ghaithi, Hamed; Pathare, Anil; Al-Mamari, Sahimah; Villacrucis, Rodrigo; Fawaz, Naglaa; Alkindi, Salam

    2017-01-01

    Objectives: Pre-analytic errors during diagnostic laboratory investigations can lead to increased patient morbidity and mortality. This study aimed to ascertain the effect of educational nursing activities on the incidence of pre-analytical errors resulting in non-conforming blood samples. Methods: This study was conducted between January 2008 and December 2015. All specimens received at the Haematology Laboratory of the Sultan Qaboos University Hospital, Muscat, Oman, during this period were prospectively collected and analysed. Similar data from 2007 were collected retrospectively and used as a baseline for comparison. Non-conforming samples were defined as either clotted samples, haemolysed samples, use of the wrong anticoagulant, insufficient quantities of blood collected, incorrect/lack of labelling on a sample or lack of delivery of a sample in spite of a sample request. From 2008 onwards, multiple educational training activities directed at the hospital nursing staff and nursing students primarily responsible for blood collection were implemented on a regular basis. Results: After initiating corrective measures in 2008, a progressive reduction in the percentage of non-conforming samples was observed from 2009 onwards. Despite a 127.84% increase in the total number of specimens received, there was a significant reduction in non-conforming samples from 0.29% in 2007 to 0.07% in 2015, resulting in an improvement of 75.86% (P < 0.050). In particular, specimen identification errors decreased by 0.056%, with a 96.55% improvement. Conclusion: Targeted educational activities directed primarily towards hospital nursing staff had a positive impact on the quality of laboratory specimens by significantly reducing pre-analytical errors. PMID:29062553

  7. Lane Level Localization; Using Images and HD Maps to Mitigate the Lateral Error

    NASA Astrophysics Data System (ADS)

    Hosseinyalamdary, S.; Peter, M.

    2017-05-01

    In urban canyons, where GNSS signals are blocked by buildings, the accuracy of the measured position deteriorates significantly. GIS databases have frequently been utilized to improve the accuracy of the measured position using map matching approaches: the measured position is projected onto the road links (centerlines), and the lateral error of the measured position is thereby reduced. With advances in data acquisition, high-definition maps which contain extra information, such as road lanes, are now generated. These road lanes can be utilized to mitigate the positional error and improve the accuracy of the position. In this paper, the image content of a camera mounted on the platform is utilized to detect the road boundaries in the image. We apply color masks to detect the road marks, apply the Hough transform to fit lines to the left and right road boundaries, find the corresponding road segment in the GIS database, estimate the homography transformation between the global and image coordinates of the road boundaries, and estimate the camera pose with respect to the global coordinate system. The proposed approach is evaluated on a benchmark. The position is measured by a smartphone's GPS receiver, images are taken by the smartphone's camera, and the ground truth is provided by the Real-Time Kinematic (RTK) technique. Results show that the proposed approach significantly improves the accuracy of the measured GPS position: the error in the measured GPS position, with an average and standard deviation of 11.323 and 11.418 meters, is reduced to an error in the estimated position with an average and standard deviation of 6.725 and 5.899 meters.
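    The homography estimation step can be sketched with the classic Direct Linear Transform (DLT). This is a generic implementation, not the authors' code; in their pipeline the correspondences would come from the detected road-boundary lines and the HD map:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography H with dst ~ H * src (homogeneous
    coordinates), from at least 4 point correspondences, via SVD of the
    DLT system A h = 0."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1.0, 0.0, 0.0, 0.0, u * x, u * y, u])
        A.append([0.0, 0.0, 0.0, -x, -y, -1.0, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)      # null-space vector = homography entries
    return H / H[2, 2]

def project(H, pt):
    """Apply a homography to a 2D point."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

Once H between image and global road-boundary coordinates is known, the camera pose (and hence a laterally corrected position) can be recovered from it.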

  8. Impact of Educational Activities in Reducing Pre-Analytical Laboratory Errors: A quality initiative.

    PubMed

    Al-Ghaithi, Hamed; Pathare, Anil; Al-Mamari, Sahimah; Villacrucis, Rodrigo; Fawaz, Naglaa; Alkindi, Salam

    2017-08-01

    Pre-analytic errors during diagnostic laboratory investigations can lead to increased patient morbidity and mortality. This study aimed to ascertain the effect of educational nursing activities on the incidence of pre-analytical errors resulting in non-conforming blood samples. This study was conducted between January 2008 and December 2015. All specimens received at the Haematology Laboratory of the Sultan Qaboos University Hospital, Muscat, Oman, during this period were prospectively collected and analysed. Similar data from 2007 were collected retrospectively and used as a baseline for comparison. Non-conforming samples were defined as either clotted samples, haemolysed samples, use of the wrong anticoagulant, insufficient quantities of blood collected, incorrect/lack of labelling on a sample or lack of delivery of a sample in spite of a sample request. From 2008 onwards, multiple educational training activities directed at the hospital nursing staff and nursing students primarily responsible for blood collection were implemented on a regular basis. After initiating corrective measures in 2008, a progressive reduction in the percentage of non-conforming samples was observed from 2009 onwards. Despite a 127.84% increase in the total number of specimens received, there was a significant reduction in non-conforming samples from 0.29% in 2007 to 0.07% in 2015, resulting in an improvement of 75.86% (P < 0.050). In particular, specimen identification errors decreased by 0.056%, with a 96.55% improvement. Targeted educational activities directed primarily towards hospital nursing staff had a positive impact on the quality of laboratory specimens by significantly reducing pre-analytical errors.

  9. Clinical implementation and failure mode and effects analysis of HDR skin brachytherapy using Valencia and Leipzig surface applicators.

    PubMed

    Sayler, Elaine; Eldredge-Hindy, Harriet; Dinome, Jessie; Lockamy, Virginia; Harrison, Amy S

    2015-01-01

    The planning procedure for Valencia and Leipzig surface applicators (VLSAs) (Nucletron, Veenendaal, The Netherlands) differs substantially from CT-based planning; this unfamiliarity could lead to significant errors. This study applies failure mode and effects analysis (FMEA) to high-dose-rate (HDR) skin brachytherapy using VLSAs to ensure safety and quality. A multidisciplinary team created a protocol for HDR VLSA skin treatments and applied FMEA. Failure modes were identified and scored by severity, occurrence, and detectability. The clinical procedure was then revised to address high-scoring process nodes. Several key components were added to the protocol to minimize risk priority numbers (RPNs). (1) Diagnosis, prescription, applicator selection, and setup are reviewed at weekly quality assurance rounds. Peer review reduces the likelihood of an inappropriate treatment regimen. (2) A template for HDR skin treatments was established in the clinic's electronic medical record system to standardize treatment instructions. This reduces the chance of miscommunication between the physician and planner and increases the detectability of an error. (3) A screen check was implemented during the second check to increase the detectability of an error. (4) To reduce error probability, the treatment plan worksheet was designed to display plan parameters in a format visually similar to the treatment console display, facilitating data entry and verification. (5) VLSAs are color coded and labeled to match the electronic medical record prescriptions, simplifying in-room selection and verification. Multidisciplinary planning and FMEA increased detectability and reduced error probability during VLSA HDR brachytherapy. This clinical model may be useful to institutions implementing similar procedures. Copyright © 2015 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
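    In FMEA, each failure mode is scored for severity, occurrence, and detectability and ranked by the risk priority number RPN = S × O × D. A toy illustration of that ranking; the failure modes and scores below are invented for the example, not taken from the study:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int       # 1 (negligible) .. 10 (catastrophic)
    occurrence: int     # 1 (rare) .. 10 (frequent)
    detectability: int  # 1 (always caught) .. 10 (never caught)

    @property
    def rpn(self):
        """Risk priority number: the product S x O x D."""
        return self.severity * self.occurrence * self.detectability

modes = [
    FailureMode("wrong applicator selected", 8, 3, 4),
    FailureMode("prescription miscommunication", 7, 4, 5),
    FailureMode("data-entry error at console", 6, 5, 3),
]
# Address the highest-RPN modes first
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{fm.rpn:4d}  {fm.name}")
```

Interventions like the template and screen check above lower the occurrence or detectability scores, and re-scoring after each change shows whether the RPN actually dropped.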

  10. Disclosure of Medical Errors: What Factors Influence How Patients Respond?

    PubMed Central

    Mazor, Kathleen M; Reed, George W; Yood, Robert A; Fischer, Melissa A; Baril, Joann; Gurwitz, Jerry H

    2006-01-01

    BACKGROUND Disclosure of medical errors is encouraged, but research on how patients respond to specific practices is limited. OBJECTIVE This study sought to determine whether full disclosure, an existing positive physician-patient relationship, an offer to waive associated costs, and the severity of the clinical outcome influenced patients' responses to medical errors. PARTICIPANTS Four hundred and seven health plan members participated in a randomized experiment in which they viewed video depictions of medical error and disclosure. DESIGN Subjects were randomly assigned to experimental condition. Conditions varied in type of medication error, level of disclosure, reference to a prior positive physician-patient relationship, an offer to waive costs, and clinical outcome. MEASURES Self-reported likelihood of changing physicians and of seeking legal advice; satisfaction, trust, and emotional response. RESULTS Nondisclosure increased the likelihood of changing physicians, and reduced satisfaction and trust in both error conditions. Nondisclosure increased the likelihood of seeking legal advice and was associated with a more negative emotional response in the missed allergy error condition, but did not have a statistically significant impact on seeking legal advice or emotional response in the monitoring error condition. Neither the existence of a positive relationship nor an offer to waive costs had a statistically significant impact. CONCLUSIONS This study provides evidence that full disclosure is likely to have a positive effect or no effect on how patients respond to medical errors. The clinical outcome also influences patients' responses. The impact of an existing positive physician-patient relationship, or of waiving costs associated with the error remains uncertain. PMID:16808770

  11. Nature of nursing errors and their contributing factors in intensive care units.

    PubMed

    Eltaybani, Sameh; Mohamed, Nadia; Abdelwareth, Mona

    2018-04-27

    Errors tend to be multifactorial, so learning from nurses' experiences with them is a powerful tool for promoting patient safety. The aim was to identify the nature of nursing errors and their contributing factors in intensive care units (ICUs). Semi-structured interviews with 112 critical care nurses elicited reports of the errors they had encountered, followed by content analysis. A total of 300 errors were reported. Most of them (94.3%) were classified in more than one error category; 'lack of intervention', 'lack of attentiveness' and 'documentation errors' were the most frequently involved categories. Approximately 40% of reported errors contributed to significant harm or death of the involved patients, with system-related factors involved in 84.3% of them. More errors occurred during the evening shift than the night and morning shifts (42.7% versus 28.7% and 16.7%, respectively). There is a statistically significant relation (p ≤ 0.001) between disclosure of an error to a nursing supervisor and its impact on the patient. Nurses are more likely to report their errors when they feel safe and when the reporting system is not burdensome, although an internationally standardized language to define and analyse nursing errors is needed. Improving the health care system, particularly its managerial and environmental aspects, might reduce the incidence and seriousness of nursing errors in ICUs. Targeting error-liable times in the ICU, such as mid-evening and mid-night shifts, along with improved supervision and adequate staff reallocation, might tackle the incidence and seriousness of nursing errors. Development of individualized nursing interventions for patients with low health literacy and patients in isolation might create more meaningful dialogue for ICU health care safety. © 2018 British Association of Critical Care Nurses.

  12. Using warnings to reduce categorical false memories in younger and older adults.

    PubMed

    Carmichael, Anna M; Gutchess, Angela H

    2016-07-01

    Warnings about memory errors can reduce their incidence, although past work has largely focused on associative memory errors. The current study sought to explore whether warnings could be tailored to specifically reduce false recall of categorical information in both younger and older populations. Before encoding word pairs designed to induce categorical false memories, half of the younger and older participants were warned to avoid committing these types of memory errors. Older adults who received a warning committed fewer categorical memory errors, as well as other types of semantic memory errors, than those who did not receive a warning. In contrast, young adults' memory errors did not differ for the warning versus no-warning groups. Our findings provide evidence for the effectiveness of warnings at reducing categorical memory errors in older adults, perhaps by supporting source monitoring, reduction in reliance on gist traces, or through effective metacognitive strategies.

  13. A constrained-gradient method to control divergence errors in numerical MHD

    NASA Astrophysics Data System (ADS)

    Hopkins, Philip F.

    2016-10-01

    In numerical magnetohydrodynamics (MHD), a major challenge is maintaining ∇·B = 0. Constrained transport (CT) schemes achieve this but have been restricted to specific methods. For more general (meshless, moving-mesh, ALE) methods, 'divergence-cleaning' schemes reduce the ∇·B errors; however, these can still be significant and can lead to systematic errors which converge away slowly. We propose a new constrained gradient (CG) scheme which augments these with a projection step, and can be applied to any numerical scheme with a reconstruction. This iteratively approximates the least-squares minimizing, globally divergence-free reconstruction of the fluid. Unlike 'locally divergence-free' methods, this actually minimizes the numerically unstable ∇·B terms, without affecting the convergence order of the method. We implement this in the mesh-free code GIZMO and compare various test problems. Compared to cleaning schemes, our CG method reduces the maximum ∇·B errors by ~1-3 orders of magnitude (~2-5 dex below typical errors if no ∇·B cleaning is used). By preventing large ∇·B at discontinuities, this eliminates systematic errors at jumps. Our CG results are comparable to CT methods; for practical purposes, the ∇·B errors are eliminated. The cost is modest, ~30 per cent of the hydro algorithm, and the CG correction can be implemented in a range of numerical MHD methods. While for many problems we find Dedner-type cleaning schemes are sufficient for good results, we identify a range of problems where using only Powell or '8-wave' cleaning can produce order-of-magnitude errors.
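
    The ∇·B constraint above can be monitored numerically. A minimal sketch (illustrative only: simple central differences on a uniform grid, not GIZMO's meshless reconstruction) is:

```python
import numpy as np

def divergence(Bx, By, Bz, dx):
    """Central-difference estimate of div(B) on a uniform grid (interior cells)."""
    dBx = (Bx[2:, 1:-1, 1:-1] - Bx[:-2, 1:-1, 1:-1]) / (2 * dx)
    dBy = (By[1:-1, 2:, 1:-1] - By[1:-1, :-2, 1:-1]) / (2 * dx)
    dBz = (Bz[1:-1, 1:-1, 2:] - Bz[1:-1, 1:-1, :-2]) / (2 * dx)
    return dBx + dBy + dBz

# Analytically divergence-free test field: B = (sin z, sin x, sin y)
n = 32
dx = 2 * np.pi / n
coords = np.arange(n) * dx
x, y, z = np.meshgrid(coords, coords, coords, indexing="ij")
Bx, By, Bz = np.sin(z), np.sin(x), np.sin(y)
err = np.abs(divergence(Bx, By, Bz, dx)).max()  # should be ~machine precision
```

    For a cleaning or projection scheme, the same diagnostic applied to the evolved field shows how far the solution has drifted from the divergence-free manifold.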

  14. An intersecting chord method for minimum circumscribed sphere and maximum inscribed sphere evaluations of sphericity error

    NASA Astrophysics Data System (ADS)

    Liu, Fei; Xu, Guanghua; Zhang, Qing; Liang, Lin; Liu, Dan

    2015-11-01

    As one of the Geometrical Product Specifications that are widely applied in industrial manufacturing and measurement, sphericity error can synthetically scale a 3D structure and reflects the machining quality of a spherical workpiece. Following increasing demands on the motion performance of spherical parts, sphericity error is becoming an indispensable component in the evaluation of form error. However, the evaluation of sphericity error is still considered a complex mathematical issue, and related research on the development of available models is lacking. In this paper, an intersecting chord method is first proposed to solve the minimum circumscribed sphere and maximum inscribed sphere evaluations of sphericity error. This new modelling method leverages chord relationships to replace the characteristic points, thereby significantly reducing the computational complexity and improving the computational efficiency. Using the intersecting chords to generate a virtual centre, the reference sphere in two concentric spheres is simplified as a space intersecting structure. The position of the virtual centre on the space intersecting structure is determined by characteristic chords, which may reduce the deviation between the virtual centre and the centre of the reference sphere. In addition, two experiments are used to verify the effectiveness of the proposed method with real Cartesian coordinate datasets. The results indicate that the estimated errors agree with those of the published methods, while the computational efficiency is improved; for sphericity error evaluation, this gain in computing performance is a notable advance.
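
    Both the minimum circumscribed and maximum inscribed sphere criteria bound the measured points between two concentric spheres about a reference centre. A minimal sketch of the underlying quantity (radial band width about a linear least-squares centre, on synthetic data; this is not the authors' intersecting chord construction) is:

```python
import numpy as np

def sphericity_error(points):
    """Radial band width (R_max - R_min) about the linear least-squares centre.

    ||p - c||^2 = R^2 is linearised as 2 p.c + (R^2 - |c|^2) = |p|^2.
    """
    p = np.asarray(points, float)
    A = np.hstack([2 * p, np.ones((len(p), 1))])
    b = (p ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = sol[:3]
    r = np.linalg.norm(p - centre, axis=1)
    return r.max() - r.min(), centre

# Synthetic workpiece: unit sphere with +/-0.01 radial form error
rng = np.random.default_rng(0)
v = rng.normal(size=(500, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)
pts = v * (1 + 0.01 * rng.uniform(-1, 1, size=(500, 1)))
err, centre = sphericity_error(pts)   # err is close to the 0.02 band width
```

    The minimum-zone, circumscribed, and inscribed criteria each pick a different optimal centre; the least-squares centre here is only a convenient starting point.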

  15. Disruptive staff interactions: a serious source of inter-provider conflict and stress in health care settings.

    PubMed

    Stecker, Mona; Stecker, Mark M

    2014-07-01

    This study sought to explore the prevalence of workplace stress, gender differences, and the relationship of workplace incivility to the experience of stress. Effects of stress on performance have been explored for many years. Work stress has been at the root of many physical and psychological problems and has even been linked to medical errors and suboptimal patient outcomes. In this study, 617 respondents completed a Provider Conflict Questionnaire (PCQ) as well as a ten-item stress survey. Work was the main stressor according to 78.2% of respondents. The stress index was moderately high, ranging between 10 and 48 (mean = 25.5). Females demonstrated a higher stress index. Disruptive behavior showed a significant positive correlation with increased stress. This study concludes that employees of institutions with less disruptive behavior exhibited lower stress levels. This finding is important in improving employee satisfaction and reducing medical errors. It is difficult to retain experienced nurses, and stress is a significant contributor to job dissatisfaction. Moreover, workplace conflict and its correlation to increased stress levels must be managed as a strategy to reduce medical errors and increase job satisfaction.

  16. Initial clinical experience with a video-based patient positioning system.

    PubMed

    Johnson, L S; Milliken, B D; Hadley, S W; Pelizzari, C A; Haraf, D J; Chen, G T

    1999-08-01

    To report initial clinical experience with an interactive, video-based patient positioning system that is inexpensive, quick, accurate, and easy to use. System hardware includes two black-and-white CCD cameras, zoom lenses, and a PC equipped with a frame grabber. Custom software is used to acquire and archive video images, as well as to display real-time subtraction images revealing patient misalignment in multiple views. Two studies are described. In the first study, video is used to document the daily setup histories of 5 head and neck patients. Time-lapse cine loops are generated for each patient and used to diagnose and correct common setup errors. In the second study, 6 twice-daily (BID) head and neck patients are positioned according to the following protocol: at AM setups conventional treatment room lasers are used; at PM setups lasers are used initially and then video is used for 1-2 minutes to fine-tune the patient position. Lateral video images and lateral verification films are registered off-line to compare the distribution of setup errors per patient, with and without video assistance. In the first study, video images were used to determine the accuracy of our conventional head and neck setup technique, i.e., alignment of lightcast marks and surface anatomy to treatment room lasers and the light field. For this initial cohort of patients, errors ranged from sigma = 5 to 7 mm and were patient-specific. Time-lapse cine loops of the images revealed sources of the error, and as a result, our localization techniques and immobilization device were modified to improve setup accuracy. After the improvements, conventional setup errors were reduced to sigma = 3 to 5 mm. In the second study, when a stereo pair of live subtraction images were introduced to perform daily "on-line" setup correction, errors were reduced to sigma = 1 to 3 mm. Results depended on patient health and cooperation and the length of time spent fine-tuning the position. 
An interactive, video-based patient positioning system was shown to reduce setup errors to within 1 to 3 mm in head and neck patients, without a significant increase in overall treatment time or labor-intensive procedures. Unlike retrospective portal image analysis, use of two live-video images provides the therapists with immediate feedback and allows for true 3-D positioning and correction of out-of-plane rotation before radiation is delivered. With significant improvement in head and neck alignment and the elimination of setup errors greater than 3 to 5 mm, margins associated with treatment volumes potentially can be reduced, thereby decreasing normal tissue irradiation.
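
    The live subtraction display described above can be sketched as a simple frame difference. This is a hypothetical illustration with synthetic frames, not the authors' software:

```python
import numpy as np

def misalignment_map(reference, live, threshold=30):
    """Absolute-difference image; pixels above threshold flag misalignment."""
    diff = np.abs(reference.astype(int) - live.astype(int))
    return diff > threshold

# Simulated 8-bit frames: a bright square shifted 3 pixels in the live frame
ref = np.zeros((64, 64), np.uint8)
ref[20:40, 20:40] = 200
live = np.zeros((64, 64), np.uint8)
live[23:43, 20:40] = 200
mask = misalignment_map(ref, live)   # highlights the 3-pixel misalignment band
```

    In practice the reference frame is captured at simulation, and the therapist adjusts the patient until the subtraction image goes blank.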

  17. Ocular manifestations of sickle cell disease and genetic susceptibility for refractive errors

    PubMed Central

    Shukla, Palak; Verma, Henu; Patel, Santosh; Patra, P. K.; Bhaskar, L. V. K. S.

    2017-01-01

    PURPOSE: Sickle cell disease (SCD) is the most common and serious form of inherited blood disorder and leads to a higher risk of early mortality. SCD patients are at high risk for developing multiorgan acute and chronic complications linked with significant morbidity and mortality. Some of the ophthalmological complications of SCD include retinal changes, refractive errors, vitreous hemorrhage, and abnormalities of the cornea. MATERIALS AND METHODS: The present study includes 96 SCD patients. A dilated comprehensive eye examination was performed to assess the status of retinopathy. Refractive errors were measured in all patients. In patients older than 10 years of age, cycloplegia was not performed before autorefractometry. A subset of fifty patients was genotyped for NOS3 27-base pair (bp) variable number of tandem repeat (VNTR) and IL4 intron-3 VNTR polymorphisms using polymerase chain reaction-electrophoresis. The chi-square test was performed to test the association between the polymorphisms and refractive errors. RESULTS: The results revealed that 63.5% of patients have myopia, followed by 19.8% with hyperopia. NOS3 27-bp VNTR genotypes deviated significantly from Hardy–Weinberg equilibrium (P < 0.0001). Although the IL4 70-bp VNTR increased the risk of developing refractive errors, the association was not statistically significant. However, the NOS3 27-bp VNTR significantly reduced the risk of development of myopia. CONCLUSION: In summary, our study documents the prevalence of refractive errors along with some retinal changes in Indian SCD patients. Further, this study demonstrates that the NOS3 VNTR contributes to susceptibility to the development of myopia in SCD cases. PMID:29018763
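
    The Hardy-Weinberg deviation reported for the NOS3 genotypes can be checked with a one-degree-of-freedom chi-square statistic. A sketch with hypothetical genotype counts (not the study's data) is:

```python
def hwe_chi_square(n_AA, n_Aa, n_aa):
    """Chi-square statistic (1 df) for deviation from Hardy-Weinberg equilibrium."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)   # frequency of allele A
    q = 1 - p
    expected = (p * p * n, 2 * p * q * n, q * q * n)
    observed = (n_AA, n_Aa, n_aa)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical genotype counts for a biallelic VNTR (e.g. 4a/4a, 4a/4b, 4b/4b)
chi2 = hwe_chi_square(30, 5, 15)   # well above the 3.84 critical value at 1 df
```

    A heterozygote deficit like the one simulated here is a common signature of such deviation; the study's P < 0.0001 indicates a comparably strong departure.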

  18. Ocular manifestations of sickle cell disease and genetic susceptibility for refractive errors.

    PubMed

    Shukla, Palak; Verma, Henu; Patel, Santosh; Patra, P K; Bhaskar, L V K S

    2017-01-01

    Sickle cell disease (SCD) is the most common and serious form of inherited blood disorder and leads to a higher risk of early mortality. SCD patients are at high risk for developing multiorgan acute and chronic complications linked with significant morbidity and mortality. Some of the ophthalmological complications of SCD include retinal changes, refractive errors, vitreous hemorrhage, and abnormalities of the cornea. The present study includes 96 SCD patients. A dilated comprehensive eye examination was performed to assess the status of retinopathy. Refractive errors were measured in all patients. In patients older than 10 years of age, cycloplegia was not performed before autorefractometry. A subset of fifty patients was genotyped for NOS3 27-base pair (bp) variable number of tandem repeat (VNTR) and IL4 intron-3 VNTR polymorphisms using polymerase chain reaction-electrophoresis. The chi-square test was performed to test the association between the polymorphisms and refractive errors. The results revealed that 63.5% of patients have myopia, followed by 19.8% with hyperopia. NOS3 27-bp VNTR genotypes deviated significantly from Hardy-Weinberg equilibrium (P < 0.0001). Although the IL4 70-bp VNTR increased the risk of developing refractive errors, the association was not statistically significant. However, the NOS3 27-bp VNTR significantly reduced the risk of development of myopia. In summary, our study documents the prevalence of refractive errors along with some retinal changes in Indian SCD patients. Further, this study demonstrates that the NOS3 VNTR contributes to susceptibility to the development of myopia in SCD cases.

  19. Identification of patient information corruption in the intensive care unit: using a scoring tool to direct quality improvements in handover.

    PubMed

    Pickering, Brian W; Hurley, Killian; Marsh, Brian

    2009-11-01

    To use a handover assessment tool for identifying patient information corruption and objectively evaluating interventions designed to reduce handover errors and improve medical decision making. The continuous monitoring, intervention, and evaluation of the patient in modern intensive care unit practice generates large quantities of information, the platform on which medical decisions are made. Information corruption, defined as errors of distortion/omission compared with the medical record, may result in medical judgment errors. Identifying these errors may lead to quality improvements in intensive care unit care delivery and safety. Handover assessment instrument development study divided into two phases by the introduction of a handover intervention. Closed, 17-bed, university-affiliated mixed surgical/medical intensive care unit. Senior and junior medical members of the intensive care unit team. Electronic handover page. Study subjects were asked to recall clinical information commonly discussed at handover on individual patients. The handover score measured the percentage of information correctly retained for each individual doctor-patient interaction. The clinical intention score, a subjective measure of medical judgment, was graded (1-5) by three blinded intensive care unit experts. A total of 137 interactions were scored. Median (interquartile range) handover scores for phases 1 and 2 were 79.07% (67.44-84.50) and 83.72% (76.16-88.37), respectively. Score variance was reduced by the handover intervention (p < .05). Increasing median handover scores, 68.60 to 83.72, were associated with increases in clinical intention scores from 1 to 5 (chi-square = 23.59, df = 4, p < .0001). When asked to recall clinical information discussed at handover, medical members of the intensive care unit team provide data that are significantly corrupted compared with the medical record. 
Low subjective clinical judgment scores are significantly associated with low handover scores. The handover/clinical intention scores may, therefore, be useful screening tools for intensive care unit system vulnerability to medical error. Additionally, handover instruments can identify interventions that reduce system vulnerability to error and may be used to guide quality improvements in handover practice.

  20. Health plan auditing: 100-percent-of-claims vs. random-sample audits.

    PubMed

    Sillup, George P; Klimberg, Ronald K

    2011-01-01

    The objective of this study was to examine the relative efficacy of two different methodologies for auditing self-funded medical claim expenses: 100-percent-of-claims auditing versus random-sampling auditing. Multiple data sets of claim errors or 'exceptions' from two Fortune-100 corporations were analysed and compared to 100 simulated audits of 300- and 400-claim random samples. Random-sample simulations failed to identify a significant number and amount of the errors that ranged from $200,000 to $750,000. These results suggest that health plan expenses of corporations could be significantly reduced if they audited 100% of claims and embraced a zero-defect approach.
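
    The sampling shortfall can be illustrated with a small Monte Carlo sketch. The population size, error rate, and error amount below are assumptions for illustration, not the study's figures:

```python
import random

random.seed(1)

# Hypothetical self-funded plan: 100,000 claims, ~1% contain a $250 error
claims = [250.0 if random.random() < 0.01 else 0.0 for _ in range(100_000)]
total_error = sum(claims)          # what a 100%-of-claims audit recovers

# A single 300-claim random-sample audit only ever sees its own slice
sample = random.sample(claims, 300)
found = sum(sample)                # a tiny fraction of the total error dollars
```

    Sampling audits extrapolate from `found` to estimate `total_error`, but only a full audit identifies the specific erroneous claims for recovery, which is the study's zero-defect argument.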

  1. Effectiveness of Toyota process redesign in reducing thyroid gland fine-needle aspiration error.

    PubMed

    Raab, Stephen S; Grzybicki, Dana Marie; Sudilovsky, Daniel; Balassanian, Ronald; Janosky, Janine E; Vrbin, Colleen M

    2006-10-01

    Our objective was to determine whether the Toyota Production System process redesign resulted in diagnostic error reduction for patients who underwent cytologic evaluation of thyroid nodules. In this longitudinal, nonconcurrent cohort study, we compared the diagnostic error frequency of a thyroid aspiration service before and after implementation of error reduction initiatives consisting of adoption of a standardized diagnostic terminology scheme and an immediate interpretation service. A total of 2,424 patients underwent aspiration. Following terminology standardization, the false-negative rate decreased from 41.8% to 19.1% (P = .006), the specimen nondiagnostic rate increased from 5.8% to 19.8% (P < .001), and the sensitivity increased from 70.2% to 90.6% (P < .001). Cases with an immediate interpretation had a lower noninterpretable specimen rate than those without immediate interpretation (P < .001). Toyota process change led to significantly fewer diagnostic errors for patients who underwent thyroid fine-needle aspiration.
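
    Sensitivity here is the standard true-positive rate. A sketch with hypothetical counts chosen to reproduce the reported percentages (note the abstract's false-negative rate is evidently defined on a different denominator, since it is not 1 minus sensitivity) is:

```python
def sensitivity(tp, fn):
    """Sensitivity (true-positive rate) = TP / (TP + FN)."""
    return tp / (tp + fn)

# Hypothetical counts reproducing the reported sensitivities
before = sensitivity(66, 28)   # ~70.2% before terminology standardization
after = sensitivity(87, 9)     # ~90.6% after
```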

  2. Robust iterative learning contouring controller with disturbance observer for machine tool feed drives.

    PubMed

    Simba, Kenneth Renny; Bui, Ba Dinh; Msukwa, Mathew Renny; Uchiyama, Naoki

    2018-04-01

    In feed drive systems, particularly machine tools, the contour error is more significant than the individual axial tracking errors from the viewpoint of enhancing precision in manufacturing and production systems. The contour error must be within the permissible tolerance of given products. In machining complex or sharp-corner products, large contour errors occur mainly owing to discontinuous trajectories and the existence of nonlinear uncertainties. Therefore, it is indispensable to design robust controllers that can enhance the tracking ability of feed drive systems. In this study, an iterative learning contouring controller consisting of a classical proportional-derivative (PD) controller and a disturbance observer is proposed. The proposed controller was evaluated experimentally using a typical sharp-corner trajectory, and its performance was compared with that of conventional controllers. The results revealed that the maximum contour error can be reduced by about 37% on average. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  3. Reduction in specimen labeling errors after implementation of a positive patient identification system in phlebotomy.

    PubMed

    Morrison, Aileen P; Tanasijevic, Milenko J; Goonan, Ellen M; Lobo, Margaret M; Bates, Michael M; Lipsitz, Stuart R; Bates, David W; Melanson, Stacy E F

    2010-06-01

    Ensuring accurate patient identification is central to preventing medical errors, but it can be challenging. We implemented a bar code-based positive patient identification system for use in inpatient phlebotomy. A before-after design was used to evaluate the impact of the identification system on the frequency of mislabeled and unlabeled samples reported in our laboratory. Labeling errors fell from 5.45 in 10,000 before implementation to 3.2 in 10,000 afterward (P = .0013). An estimated 108 mislabeling events were prevented by the identification system in 1 year. Furthermore, a workflow step requiring manual preprinting of labels, which was accompanied by potential labeling errors in about one quarter of blood "draws," was removed as a result of the new system. After implementation, a higher percentage of patients reported having their wristband checked before phlebotomy. Bar code technology significantly reduced the rate of specimen identification errors.
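
    The reported rates imply the annual specimen volume. A back-of-envelope check (an inference from the abstract's numbers, not reported data) is:

```python
# Rates stated in the abstract
rate_before = 5.45 / 10_000     # labeling errors per sample, pre-implementation
rate_after = 3.2 / 10_000       # post-implementation
prevented = 108                 # mislabeling events prevented in 1 year

# Implied annual sample volume: prevented / rate reduction
implied_volume = prevented / (rate_before - rate_after)   # ~480,000 samples
```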

  4. Liquid Medication Dosing Errors in Children: Role of Provider Counseling Strategies

    PubMed Central

    Yin, H. Shonna; Dreyer, Benard P.; Moreira, Hannah A.; van Schaick, Linda; Rodriguez, Luis; Boettger, Susanne; Mendelsohn, Alan L.

    2014-01-01

    Objective To examine the degree to which recommended provider counseling strategies, including advanced communication techniques and dosing instrument provision, are associated with reductions in parent liquid medication dosing errors. Methods Cross-sectional analysis of baseline data on provider communication and dosing instrument provision from a study of a health literacy intervention to reduce medication errors. Parents whose children (<9 years) were seen in two urban public hospital pediatric emergency departments (EDs) and were prescribed daily dose liquid medications self-reported whether they received counseling about their child’s medication, including advanced strategies (teachback, drawings/pictures, demonstration, showback) and receipt of a dosing instrument. Primary dependent variable: observed dosing error (>20% deviation from prescribed). Multivariate logistic regression analyses performed, controlling for: parent age, language, country, ethnicity, socioeconomic status, education, health literacy (Short Test of Functional Health Literacy in Adults); child age, chronic disease status; site. Results Of 287 parents, 41.1% made dosing errors. Advanced counseling and instrument provision in the ED were reported by 33.1% and 19.2%, respectively; 15.0% reported both. Advanced counseling and instrument provision in the ED were associated with decreased errors (30.5 vs. 46.4%, p=0.01; 21.8 vs. 45.7%, p=0.001). In adjusted analyses, ED advanced counseling in combination with instrument provision was associated with a decreased odds of error compared to receiving neither (AOR 0.3; 95% CI 0.1–0.7); advanced counseling alone and instrument alone were not significantly associated with odds of error. Conclusion Provider use of advanced counseling strategies and dosing instrument provision may be especially effective in reducing errors when used together. PMID:24767779
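
    The dosing-error criterion above (>20% deviation from the prescribed dose) is straightforward to operationalize. A minimal sketch with hypothetical doses:

```python
def is_dosing_error(measured_ml, prescribed_ml, tolerance=0.20):
    """Flag a dose deviating more than 20% from the prescribed amount."""
    return abs(measured_ml - prescribed_ml) > tolerance * prescribed_ml

error_high = is_dosing_error(6.5, 5.0)   # 30% over a prescribed 5 mL dose
error_ok = is_dosing_error(5.5, 5.0)     # 10% over, within tolerance
```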

  5. A long-term follow-up evaluation of electronic health record prescribing safety

    PubMed Central

    Abramson, Erika L; Malhotra, Sameer; Osorio, S Nena; Edwards, Alison; Cheriff, Adam; Cole, Curtis; Kaushal, Rainu

    2013-01-01

    Objective To be eligible for incentives through the Electronic Health Record (EHR) Incentive Program, many providers using older or locally developed EHRs will be transitioning to new, commercial EHRs. We previously evaluated prescribing errors made by providers in the first year following transition from a locally developed EHR with minimal prescribing clinical decision support (CDS) to a commercial EHR with robust CDS. Following system refinements, we conducted this study to assess the rates and types of errors 2 years after transition and determine the evolution of errors. Materials and methods We conducted a mixed methods cross-sectional case study of 16 physicians at an academic-affiliated ambulatory clinic from April to June 2010. We utilized standardized prescription and chart review to identify errors. Fourteen providers also participated in interviews. Results We analyzed 1905 prescriptions. The overall prescribing error rate was 3.8 per 100 prescriptions (95% CI 2.8 to 5.1). Error rates were significantly lower 2 years after transition (p<0.001 compared to pre-implementation, 12 weeks and 1 year after transition). Rates of near misses remained unchanged. Providers positively appreciated most system refinements, particularly reduced alert firing. Discussion Our study suggests that over time and with system refinements, use of a commercial EHR with advanced CDS can lead to low prescribing error rates, although more serious errors may require targeted interventions to eliminate them. Reducing alert firing frequency appears particularly important. Our results provide support for federal efforts promoting meaningful use of EHRs. Conclusions Ongoing error monitoring can allow CDS to be optimally tailored and help achieve maximal safety benefits. Clinical Trials Registration ClinicalTrials.gov, Identifier: NCT00603070. PMID:23578816

  6. Liquid medication dosing errors in children: role of provider counseling strategies.

    PubMed

    Yin, H Shonna; Dreyer, Benard P; Moreira, Hannah A; van Schaick, Linda; Rodriguez, Luis; Boettger, Susanne; Mendelsohn, Alan L

    2014-01-01

    To examine the degree to which recommended provider counseling strategies, including advanced communication techniques and dosing instrument provision, are associated with reductions in parent liquid medication dosing errors. Cross-sectional analysis of baseline data on provider communication and dosing instrument provision from a study of a health literacy intervention to reduce medication errors. Parents whose children (<9 years) were seen in 2 urban public hospital pediatric emergency departments (EDs) and were prescribed daily dose liquid medications self-reported whether they received counseling about their child's medication, including advanced strategies (teachback, drawings/pictures, demonstration, showback) and receipt of a dosing instrument. The primary dependent variable was observed dosing error (>20% deviation from prescribed). Multivariate logistic regression analyses were performed, controlling for parent age, language, country, ethnicity, socioeconomic status, education, health literacy (Short Test of Functional Health Literacy in Adults); child age, chronic disease status; and site. Of 287 parents, 41.1% made dosing errors. Advanced counseling and instrument provision in the ED were reported by 33.1% and 19.2%, respectively; 15.0% reported both. Advanced counseling and instrument provision in the ED were associated with decreased errors (30.5 vs. 46.4%, P = .01; 21.8 vs. 45.7%, P = .001). In adjusted analyses, ED advanced counseling in combination with instrument provision was associated with a decreased odds of error compared to receiving neither (adjusted odds ratio 0.3; 95% confidence interval 0.1-0.7); advanced counseling alone and instrument alone were not significantly associated with odds of error. Provider use of advanced counseling strategies and dosing instrument provision may be especially effective in reducing errors when used together. Copyright © 2014 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  7. Goldmann tonometry tear film error and partial correction with a shaped applanation surface.

    PubMed

    McCafferty, Sean J; Enikov, Eniko T; Schwiegerling, Jim; Ashley, Sean M

    2018-01-01

    The aim of the study was to quantify the isolated tear film adhesion error in a Goldmann applanation tonometer (GAT) prism and in a correcting applanation tonometry surface (CATS) prism. The separation force of a tonometer prism adhered by a tear film to a simulated cornea was measured to quantify an isolated tear film adhesion force. Acrylic hemispheres (7.8 mm radius) used as corneas were lathed over the apical 3.06 mm diameter to simulate full applanation contact with the prism surface for both GAT and CATS prisms. Tear film separation measurements were completed with both artificial tear and fluorescein solutions as a fluid bridge. The applanation mire thicknesses were measured and correlated with the tear film separation measurements. Human cadaver eyes were used to validate simulated cornea tear film separation measurement differences between the GAT and CATS prisms. The CATS prism tear film adhesion error (2.74±0.21 mmHg) was significantly less than the GAT prism (4.57±0.18 mmHg, p < 0.001). Tear film adhesion error was independent of applanation mire thickness (R² = 0.09, p = 0.04). Fluorescein produces more tear film error than artificial tears (+0.51±0.04 mmHg; p < 0.001). Cadaver eye validation indicated the CATS prism's tear film adhesion error (1.40±0.51 mmHg) was significantly less than that of the GAT prism (3.30±0.38 mmHg; p = 0.002). Measured GAT tear film adhesion error is more than previously predicted. A CATS prism significantly reduced tear film adhesion error by ~41%. Fluorescein solution increases the tear film adhesion compared to artificial tears, while mire thickness has a negligible effect.

  8. Technological Advancements and Error Rates in Radiation Therapy Delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margalit, Danielle N., E-mail: dmargalit@partners.org; Harvard Cancer Consortium and Brigham and Women's Hospital/Dana Farber Cancer Institute, Boston, MA; Chen, Yu-Hui

    2011-11-15

    Purpose: Technological advances in radiation therapy (RT) delivery have the potential to reduce errors via increased automation and built-in quality assurance (QA) safeguards, yet may also introduce new types of errors. Intensity-modulated RT (IMRT) is an increasingly used technology that is more technically complex than three-dimensional (3D)-conformal RT and conventional RT. We determined the rate of reported errors in RT delivery among IMRT and 3D/conventional RT treatments and characterized the errors associated with the respective techniques to improve existing QA processes. Methods and Materials: All errors in external beam RT delivery were prospectively recorded via a nonpunitive error-reporting system at Brigham and Women's Hospital/Dana Farber Cancer Institute. Errors are defined as any unplanned deviation from the intended RT treatment and are reviewed during monthly departmental quality improvement meetings. We analyzed all reported errors since the routine use of IMRT in our department, from January 2004 to July 2009. Fisher's exact test was used to determine the association between treatment technique (IMRT vs. 3D/conventional) and specific error types. Effect estimates were computed using logistic regression. Results: There were 155 errors in RT delivery among 241,546 fractions (0.06%), and none were clinically significant. IMRT was commonly associated with errors in machine parameters (nine of 19 errors) and data entry and interpretation (six of 19 errors). IMRT was associated with a lower rate of reported errors compared with 3D/conventional RT (0.03% vs. 0.07%, p = 0.001) and specifically fewer accessory errors (odds ratio, 0.11; 95% confidence interval, 0.01-0.78) and setup errors (odds ratio, 0.24; 95% confidence interval, 0.08-0.79). Conclusions: The rate of errors in RT delivery is low. The types of errors differ significantly between IMRT and 3D/conventional RT, suggesting that QA processes must be uniquely adapted for each technique. There was a lower error rate with IMRT compared with 3D/conventional RT, highlighting the need for sustained vigilance against errors common to more traditional treatment techniques.

  9. [Risk Management: concepts and chances for public health].

    PubMed

    Palm, Stefan; Cardeneo, Margareta; Halber, Marco; Schrappe, Matthias

    2002-01-15

    Errors are a common problem in medicine and occur as a result of a complex process involving many contributing factors. Medical errors significantly reduce the safety margin for the patient and add costs to health care delivery. In most cases adverse events cannot be attributed to a single underlying cause. Therefore an effective risk management strategy must follow a system approach, which is based on the counting and analysis of near misses. The main focus should be the development of defenses against the undesired effects of errors, rather than the question "Who blundered?". Analysis of near misses (which in this context can be compared to indicators) offers several methodological advantages over the analysis of errors and adverse events. Risk management is an integral element of quality management.

  10. Comparisons of single event vulnerability of GaAs SRAMS

    NASA Astrophysics Data System (ADS)

    Weatherford, T. R.; Hauser, J. R.; Diehl, S. E.

    1986-12-01

    A GaAs MESFET/JFET model incorporated into SPICE has been used to accurately describe C-EJFET, E/D MESFET and D MESFET/resistor GaAs memory technologies. These cells have been evaluated for critical charges due to gate-to-drain and drain-to-source charge collection. Low gate-to-drain critical charges limit conventional GaAs SRAM soft error rates to approximately 1E-6 errors/bit-day. SEU hardening approaches including decoupling resistors, diodes, and FETs have been investigated. Results predict GaAs RAM cell critical charges can be increased to over 0.1 pC. Soft error rates in such hardened memories may approach 1E-7 errors/bit-day without significantly reducing memory speed. Tradeoffs between hardening level, performance and fabrication complexity are discussed.

  11. A strategy for reducing gross errors in the generalized Born models of implicit solvation

    PubMed Central

    Onufriev, Alexey V.; Sigalov, Grigori

    2011-01-01

    The “canonical” generalized Born (GB) formula [W. C. Still, A. Tempczyk, R. C. Hawley, and T. Hendrickson, J. Am. Chem. Soc. 112, 6127 (1990)] is known to provide accurate estimates for total electrostatic solvation energies ΔGel of biomolecules if the corresponding effective Born radii are accurate. Here we show that even if the effective Born radii are perfectly accurate, the canonical formula still exhibits a significant number of gross errors (errors larger than 2kBT relative to the numerical Poisson equation reference) in pairwise interactions between individual atomic charges. Analysis of exact analytical solutions of the Poisson equation (PE) for several idealized nonspherical geometries reveals two distinct spatial modes of the PE solution; these modes are also found in realistic biomolecular shapes. The canonical GB Green function misses one of the two modes seen in the exact PE solution, which explains the observed gross errors. To address the problem and reduce gross errors of the GB formalism, we have used exact PE solutions for idealized nonspherical geometries to suggest an alternative analytical Green function to replace the canonical GB formula. The proposed functional form is mathematically nearly as simple as the original, but depends not only on the effective Born radii but also on their gradients, which allows for better representation of details of nonspherical molecular shapes. In particular, the proposed functional form captures both modes of the PE solution seen in nonspherical geometries. Tests on realistic biomolecular structures ranging from small peptides to medium size proteins show that the proposed functional form reduces gross pairwise errors in all cases, with the amount of reduction varying from more than an order of magnitude for small structures to a factor of 2 for the largest ones. PMID:21528947
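
    For reference, the "canonical" pairwise Green function under discussion is the Still et al. form, reproduced here from the standard GB literature (q_i are the atomic charges, r_ij the interatomic distance, R_i and R_j the effective Born radii, and epsilon_in, epsilon_out the solute and solvent dielectric constants):

```latex
\Delta G_{el} \approx -\frac{1}{2}\left(\frac{1}{\epsilon_{in}} - \frac{1}{\epsilon_{out}}\right)
\sum_{i,j} \frac{q_i q_j}{f_{ij}^{GB}},
\qquad
f_{ij}^{GB} = \left[\, r_{ij}^{2} + R_i R_j \exp\!\left(-\frac{r_{ij}^{2}}{4 R_i R_j}\right) \right]^{1/2}
```

The alternative Green function proposed in the paper modifies this interaction term using the gradients of the effective radii; its exact form is given in the full text.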

  12. Mistake proofing: changing designs to reduce error

    PubMed Central

    Grout, J R

    2006-01-01

    Mistake proofing uses changes in the physical design of processes to reduce human error. It can be used to change designs in ways that prevent errors from occurring, to detect errors after they occur but before harm occurs, to allow processes to fail safely, or to alter the work environment to reduce the chance of errors. Effective mistake proofing design changes should initially be effective in reducing harm, be inexpensive, and easily implemented. Over time these design changes should make life easier and speed up the process. Ideally, the design changes should increase patients' and visitors' understanding of the process. These designs should themselves be mistake proofed and follow the good design practices of other disciplines. PMID:17142609

  13. Obligation towards medical errors disclosure at a tertiary care hospital in Dubai, UAE

    PubMed Central

    Zaghloul, Ashraf Ahmad; Rahman, Syed Azizur; Abou El-Enein, Nagwa Younes

    2016-01-01

    OBJECTIVE: The study aimed to identify healthcare providers’ obligation towards medical errors disclosure as well as to study the association between the severity of the medical error and the intention to disclose the error to the patients and their families. DESIGN: A cross-sectional study design was followed to identify the magnitude of disclosure among healthcare providers in different departments at a randomly selected tertiary care hospital in Dubai. SETTING AND PARTICIPANTS: The total sample size accounted for 106 respondents. Data were collected using a questionnaire composed of two sections: demographic variables of the respondents, and variables relevant to medical error disclosure. RESULTS: Statistical analysis yielded a significant association between the obligation to disclose medical errors and being a male healthcare provider (X2 = 5.1) or being a physician (X2 = 19.3). Obligation towards medical errors disclosure was also significantly associated with healthcare providers who had not committed any medical errors during the past year (X2 = 9.8), and with disclosure of any type of medical error, regardless of its cause or the extent of harm (X2 = 8.7). Variables included in the binary logistic regression model were: status (Exp β (Physician) = 0.39, 95% CI 0.16–0.97), gender (Exp β (Male) = 4.81, 95% CI 1.84–12.54), and medical errors during the last year (Exp β (None) = 2.11, 95% CI 0.6–2.3). CONCLUSION: Education and training of physicians about disclosure conversations needs to start as early as medical school. Like training in other competencies required of physicians, education in communicating about medical errors could help reduce physicians’ apprehension and make them more comfortable with disclosure conversations. PMID:27567766

  14. Comparative Study of Refractive Errors, Strabismus, Microsaccades, and Visual Perception Between Preterm and Full-Term Children With Infantile Cerebral Palsy.

    PubMed

    Kozeis, Nikolaos; Panos, Georgios D; Zafeiriou, Dimitrios I; de Gottrau, Philippe; Gatzioufas, Zisis

    2015-07-01

    The purpose of this study was to examine the refractive status, orthoptic status and visual perception in a group of preterm and another of full-term children with cerebral palsy, in order to investigate whether prematurity has an effect on the development of refractive errors and binocular disorders. A hundred school-aged children, 70 preterm and 30 full-term, with congenital cerebral palsy were examined. Differences for hypermetropia, myopia, and emmetropia were not statistically significant between the 2 groups. Astigmatism was significantly increased in the preterm group. The orthoptic status was similar for both groups. Visual perception was markedly reduced in both groups, but the differences were not significant. In conclusion, children with cerebral palsy have impaired visual skills, leading to reading difficulties. The presence of prematurity does not appear to represent an additional risk factor for the development of refractive errors and binocular disorders. © The Author(s) 2014.

  15. An investigation of motion base cueing and G-seat cueing on pilot performance in a simulator

    NASA Technical Reports Server (NTRS)

    Mckissick, B. T.; Ashworth, B. R.; Parrish, R. V.

    1983-01-01

    The effect of G-seat cueing (GSC) and motion-base cueing (MBC) on performance of a pursuit-tracking task is studied using the visual motion simulator (VMS) at Langley Research Center. The G-seat, the six-degree-of-freedom synergistic platform motion system, the visual display, the cockpit hardware, and the F-16 aircraft mathematical model are characterized. Each of 8 active F-15 pilots performed the 2-min-43-sec task 10 times for each experimental mode: no cue, GSC, MBC, and GSC + MBC; the results were analyzed statistically in terms of the RMS values of vertical and lateral tracking error. It is shown that lateral error is significantly reduced by either GSC or MBC, and that the combination of cues produces a further, significant decrease. Vertical error is significantly decreased by GSC with or without MBC, whereas MBC effects vary for different pilots. The pattern of these findings is roughly duplicated in measurements of stick force applied for roll and pitch correction.

  16. Sleep and errors in a group of Australian hospital nurses at work and during the commute.

    PubMed

    Dorrian, Jillian; Tolley, Carolyn; Lamond, Nicole; van den Heuvel, Cameron; Pincombe, Jan; Rogers, Ann E; Dawson, Drew

    2008-09-01

    There is a paucity of information regarding Australian nurses' sleep and fatigue levels, and whether they result in impairment. Forty-one Australian hospital nurses completed daily logbooks for one month, recording work hours, sleep, sleepiness, stress, errors, near errors and observed errors (made by others). Nurses reported exhaustion, stress and struggling to remain awake (STR) at work during one in three shifts. Sleep was significantly reduced on workdays in general, and on workdays when an error was reported, relative to days off. The primary predictor of error was STR, followed by stress. The primary predictor of extreme drowsiness during the commute was also STR, followed by exhaustion and consecutive shifts. In turn, STR was predicted by exhaustion, prior sleep and shift length. Findings highlight the need for further attention to these issues to optimise the safety of nurses and patients in our hospitals, and of the community at large on our roads.

  17. Detecting and overcoming systematic errors in genome-scale phylogenies.

    PubMed

    Rodríguez-Ezpeleta, Naiara; Brinkmann, Henner; Roure, Béatrice; Lartillot, Nicolas; Lang, B Franz; Philippe, Hervé

    2007-06-01

    Genome-scale data sets result in an enhanced resolution of the phylogenetic inference by reducing stochastic errors. However, there is also an increase of systematic errors due to model violations, which can lead to erroneous phylogenies. Here, we explore the impact of systematic errors on the resolution of the eukaryotic phylogeny using a data set of 143 nuclear-encoded proteins from 37 species. The initial observation was that, despite the impressive amount of data, some branches had no significant statistical support. To demonstrate that this lack of resolution is due to a mutual annihilation of phylogenetic and nonphylogenetic signals, we created a series of data sets with slightly different taxon sampling. As expected, these data sets yielded strongly supported but mutually exclusive trees, thus confirming the presence of conflicting phylogenetic and nonphylogenetic signals in the original data set. To decide on the correct tree, we applied several methods expected to reduce the impact of some kinds of systematic error. Briefly, we show that (i) removing fast-evolving positions, (ii) recoding amino acids into functional categories, and (iii) using a site-heterogeneous mixture model (CAT) are three effective means of increasing the ratio of phylogenetic to nonphylogenetic signal. Finally, our results allow us to formulate guidelines for detecting and overcoming phylogenetic artefacts in genome-scale phylogenetic analyses.

  18. Human error and human factors engineering in health care.

    PubMed

    Welch, D L

    1997-01-01

    Human error is inevitable. It happens in health care systems as it does in all other complex systems, and no measure of attention, training, dedication, or punishment is going to stop it. The discipline of human factors engineering (HFE) has been dealing with the causes and effects of human error since the 1940's. Originally applied to the design of increasingly complex military aircraft cockpits, HFE has since been effectively applied to the problem of human error in such diverse systems as nuclear power plants, NASA spacecraft, the process control industry, and computer software. Today the health care industry is becoming aware of the costs of human error and is turning to HFE for answers. Just as early experimental psychologists went beyond the label of "pilot error" to explain how the design of cockpits led to air crashes, today's HFE specialists are assisting the health care industry in identifying the causes of significant human errors in medicine and developing ways to eliminate or ameliorate them. This series of articles will explore the nature of human error and how HFE can be applied to reduce the likelihood of errors and mitigate their effects.

  19. Using Healthcare Failure Mode and Effect Analysis to reduce medication errors in the process of drug prescription, validation and dispensing in hospitalised patients.

    PubMed

    Vélez-Díaz-Pallarés, Manuel; Delgado-Silveira, Eva; Carretero-Accame, María Emilia; Bermejo-Vicedo, Teresa

    2013-01-01

    To identify actions to reduce medication errors in the process of drug prescription, validation and dispensing, and to evaluate the impact of their implementation. A Health Care Failure Mode and Effect Analysis (HFMEA) was supported by a before-and-after medication error study to measure the actual impact on error rate after the implementation of corrective actions in the process of drug prescription, validation and dispensing in wards equipped with computerised physician order entry (CPOE) and unit-dose distribution system (788 beds out of 1080) in a Spanish university hospital. The error study was carried out by two observers who reviewed medication orders on a daily basis to register prescription errors by physicians and validation errors by pharmacists. Drugs dispensed in the unit-dose trolleys were reviewed for dispensing errors. Error rates were expressed as the number of errors for each process divided by the total opportunities for error in that process times 100. A reduction in prescription errors was achieved by providing training for prescribers on CPOE, updating prescription procedures, improving clinical decision support and automating the software connection to the hospital census (relative risk reduction (RRR), 22.0%; 95% CI 12.1% to 31.8%). Validation errors were reduced after optimising time spent in educating pharmacy residents on patient safety, developing standardised validation procedures and improving aspects of the software's database (RRR, 19.4%; 95% CI 2.3% to 36.5%). Two actions reduced dispensing errors: reorganising the process of filling trolleys and drawing up a protocol for drug pharmacy checking before delivery (RRR, 38.5%; 95% CI 14.1% to 62.9%). HFMEA facilitated the identification of actions aimed at reducing medication errors in a healthcare setting, as the implementation of several of these led to a reduction in errors in the process of drug prescription, validation and dispensing.
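
    The "errors per 100 opportunities" rate and the relative risk reduction (RRR) figures quoted above follow simple arithmetic, sketched here with hypothetical counts (not taken from the study):

```python
def error_rate(errors, opportunities):
    """Errors per 100 opportunities for error."""
    return 100.0 * errors / opportunities

def relative_risk_reduction(rate_before, rate_after):
    """Percentage reduction in the error rate after an intervention."""
    return 100.0 * (rate_before - rate_after) / rate_before

# Hypothetical counts, for illustration only (not from the paper):
before = error_rate(200, 2000)   # 10.0 errors per 100 opportunities
after = error_rate(156, 2000)    # 7.8 errors per 100 opportunities
rrr = relative_risk_reduction(before, after)   # ~22% relative reduction
```

The confidence intervals the paper reports around each RRR would additionally require the raw numerator and denominator counts for both periods.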

  20. Determining relative error bounds for the CVBEM

    USGS Publications Warehouse

    Hromadka, T.V.

    1985-01-01

    The Complex Variable Boundary Element Method (CVBEM) provides a measure of relative error which can be utilized to subsequently reduce the error or to provide information for further modeling analysis. By maximizing the relative error norm on each boundary element, a bound on the total relative error for each boundary element can be evaluated. This bound can be utilized to test CVBEM convergence, to analyze the effect of additional boundary nodal points in reducing the modeling error, and to evaluate the sensitivity of the modeling error within one boundary element to the error produced in another boundary element, as a function of geometric distance. © 1985.

  1. Impact of Internally Developed Electronic Prescription on Prescribing Errors at Discharge from the Emergency Department

    PubMed Central

    Hitti, Eveline; Tamim, Hani; Bakhti, Rinad; Zebian, Dina; Mufarrij, Afif

    2017-01-01

    Introduction Medication errors are common, with studies reporting at least one error per patient encounter. At hospital discharge, medication errors vary from 15%–38%. However, studies assessing the effect of an internally developed electronic (E)-prescription system at discharge from an emergency department (ED) are comparatively minimal. Additionally, commercially available electronic solutions are cost-prohibitive in many resource-limited settings. We assessed the impact of introducing an internally developed, low-cost E-prescription system, with a list of commonly prescribed medications, on prescription error rates at discharge from the ED, compared to handwritten prescriptions. Methods We conducted a pre- and post-intervention study comparing error rates in a randomly selected sample of discharge prescriptions (handwritten versus electronic) five months pre and four months post the introduction of the E-prescription. The internally developed, E-prescription system included a list of 166 commonly prescribed medications with the generic name, strength, dose, frequency and duration. We included a total of 2,883 prescriptions in this study: 1,475 in the pre-intervention phase were handwritten (HW) and 1,408 in the post-intervention phase were electronic. We calculated rates of 14 different errors and compared them between the pre- and post-intervention period. Results Overall, E-prescriptions included fewer prescription errors as compared to HW-prescriptions. Specifically, E-prescriptions reduced missing dose (11.3% to 4.3%, p <0.0001), missing frequency (3.5% to 2.2%, p=0.04), missing strength errors (32.4% to 10.2%, p <0.0001) and legibility (0.7% to 0.2%, p=0.005). E-prescriptions, however, were associated with a significant increase in duplication errors, specifically with home medication (1.7% to 3%, p=0.02). 
Conclusion A basic, internally developed E-prescription system, featuring commonly used medications, effectively reduced medication errors in a low-resource setting where the costs of sophisticated commercial electronic solutions are prohibitive. PMID:28874948

  2. Impact of Internally Developed Electronic Prescription on Prescribing Errors at Discharge from the Emergency Department.

    PubMed

    Hitti, Eveline; Tamim, Hani; Bakhti, Rinad; Zebian, Dina; Mufarrij, Afif

    2017-08-01

    Medication errors are common, with studies reporting at least one error per patient encounter. At hospital discharge, medication errors vary from 15%-38%. However, studies assessing the effect of an internally developed electronic (E)-prescription system at discharge from an emergency department (ED) are comparatively minimal. Additionally, commercially available electronic solutions are cost-prohibitive in many resource-limited settings. We assessed the impact of introducing an internally developed, low-cost E-prescription system, with a list of commonly prescribed medications, on prescription error rates at discharge from the ED, compared to handwritten prescriptions. We conducted a pre- and post-intervention study comparing error rates in a randomly selected sample of discharge prescriptions (handwritten versus electronic) five months pre and four months post the introduction of the E-prescription. The internally developed, E-prescription system included a list of 166 commonly prescribed medications with the generic name, strength, dose, frequency and duration. We included a total of 2,883 prescriptions in this study: 1,475 in the pre-intervention phase were handwritten (HW) and 1,408 in the post-intervention phase were electronic. We calculated rates of 14 different errors and compared them between the pre- and post-intervention period. Overall, E-prescriptions included fewer prescription errors as compared to HW-prescriptions. Specifically, E-prescriptions reduced missing dose (11.3% to 4.3%, p <0.0001), missing frequency (3.5% to 2.2%, p=0.04), missing strength errors (32.4% to 10.2%, p <0.0001) and legibility (0.7% to 0.2%, p=0.005). E-prescriptions, however, were associated with a significant increase in duplication errors, specifically with home medication (1.7% to 3%, p=0.02). 
A basic, internally developed E-prescription system, featuring commonly used medications, effectively reduced medication errors in a low-resource setting where the costs of sophisticated commercial electronic solutions are prohibitive.

  3. Characterisation of false-positive observations in botanical surveys

    PubMed Central

    2017-01-01

    Errors in botanical surveying are a common problem. The presence of a species is easily overlooked, leading to false-absences; while misidentifications and other mistakes lead to false-positive observations. While it is common knowledge that these errors occur, there are few data that can be used to quantify and describe these errors. Here we characterise false-positive errors for a controlled set of surveys conducted as part of a field identification test of botanical skill. Surveys were conducted at sites with a verified list of vascular plant species. The candidates were asked to list all the species they could identify in a defined botanically rich area. They were told beforehand that their final score would be the sum of the correct species they listed, but false-positive errors counted against their overall grade. The number of errors varied considerably between people, some people create a high proportion of false-positive errors, but these are scattered across all skill levels. Therefore, a person’s ability to correctly identify a large number of species is not a safeguard against the generation of false-positive errors. There was no phylogenetic pattern to falsely observed species; however, rare species are more likely to be false-positive as are species from species rich genera. Raising the threshold for the acceptance of an observation reduced false-positive observations dramatically, but at the expense of more false negative errors. False-positive errors are higher in field surveying of plants than many people may appreciate. Greater stringency is required before accepting species as present at a site, particularly for rare species. Combining multiple surveys resolves the problem, but requires a considerable increase in effort to achieve the same sensitivity as a single survey. Therefore, other methods should be used to raise the threshold for the acceptance of a species. 
For example, digital data input systems that can verify, feedback and inform the user are likely to reduce false-positive errors significantly. PMID:28533972
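
    The trade-off described above, where a stricter acceptance threshold suppresses false positives at the cost of sensitivity and extra effort, can be illustrated with a simple independence model (the per-survey probabilities below are made-up assumptions, not estimates from the study):

```python
from math import comb

def p_accept(p_obs, n_surveys, threshold):
    """Probability that a species is recorded in at least `threshold`
    of `n_surveys` independent surveys, given per-survey recording
    probability `p_obs` (a binomial tail probability)."""
    return sum(comb(n_surveys, k) * p_obs**k * (1 - p_obs)**(n_surveys - k)
               for k in range(threshold, n_surveys + 1))

# Hypothetical per-survey probabilities (illustrative only):
p_false_positive = 0.05   # absent species erroneously listed
p_true_detection = 0.80   # present species correctly listed

# Requiring a species to appear in at least 2 of 3 surveys:
fp = p_accept(p_false_positive, 3, 2)   # false-positive rate drops sharply
tp = p_accept(p_true_detection, 3, 2)   # sensitivity is roughly preserved
```

Under these assumed numbers the false-positive rate falls from 5% to below 1%, but only by tripling survey effort, which mirrors the paper's point that combining multiple surveys resolves the problem at considerable cost.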

  4. The Effects of Combining Videogame Dancing and Pelvic Floor Training to Improve Dual-Task Gait and Cognition in Women with Mixed-Urinary Incontinence.

    PubMed

    Fraser, Sarah A; Elliott, Valerie; de Bruin, Eling D; Bherer, Louis; Dumoulin, Chantal

    2014-06-01

    Many women over 65 years of age suffer from mixed urinary incontinence (MUI) and executive function (EF) deficits. Both incontinence and EF declines increase fall risk. The current study assessed EF and dual-task gait after a multicomponent intervention that combined pelvic floor muscle (PFM) training and videogame dancing (VGD). Baseline (Pre1), pretraining (Pre2), and post-training (Post) neuropsychological and dual-task gait assessments were completed by 23 women (mean age, 70.4 years) with MUI. During the dual-task, participants walked and performed an auditory n-back task. From Pre2 to Post, all women completed 12 weeks of combined PFM and VGD training. After training (Pre2 to Post), the number of errors in the Inhibition/Switch Stroop condition decreased significantly, the Trail Making Test difference score improved marginally, and the number of n-back errors during dual-task gait significantly decreased. A subgroup analysis based on continence improvements (pad test) revealed that only those subjects who improved in the pad test had significantly reduced numbers of n-back errors during dual-task gait. The results of this study suggest that a multicomponent intervention can improve EFs and the dual-task gait of older women with MUI. Future research is needed to determine if the training-induced improvements in these factors reduce fall risk.

  5. Impact of exposure measurement error in air pollution epidemiology: effect of error type in time-series studies.

    PubMed

    Goldman, Gretchen T; Mulholland, James A; Russell, Armistead G; Strickland, Matthew J; Klein, Mitchel; Waller, Lance A; Tolbert, Paige E

    2011-06-22

    Two distinctly different types of measurement error are Berkson and classical. Impacts of measurement error in epidemiologic studies of ambient air pollution are expected to depend on error type. We characterize measurement error due to instrument imprecision and spatial variability as multiplicative (i.e. additive on the log scale) and model it over a range of error types to assess impacts on risk ratio estimates both on a per measurement unit basis and on a per interquartile range (IQR) basis in a time-series study in Atlanta. Daily measures of twelve ambient air pollutants were analyzed: NO2, NOx, O3, SO2, CO, PM10 mass, PM2.5 mass, and PM2.5 components sulfate, nitrate, ammonium, elemental carbon and organic carbon. Semivariogram analysis was applied to assess spatial variability. Error due to this spatial variability was added to a reference pollutant time-series on the log scale using Monte Carlo simulations. Each of these time-series was exponentiated and introduced to a Poisson generalized linear model of cardiovascular disease emergency department visits. Measurement error resulted in reduced statistical significance for the risk ratio estimates for all amounts (corresponding to different pollutants) and types of error. When modelled as classical-type error, risk ratios were attenuated, particularly for primary air pollutants, with average attenuation in risk ratios on a per unit of measurement basis ranging from 18% to 92% and on an IQR basis ranging from 18% to 86%. When modelled as Berkson-type error, risk ratios per unit of measurement were biased away from the null hypothesis by 2% to 31%, whereas risk ratios per IQR were attenuated (i.e. biased toward the null) by 5% to 34%. For the CO modelled error amount, a range of error types was simulated and the effects on risk ratio bias and significance were observed. For multiplicative error, both the amount and the type of measurement error impact health effect estimates in air pollution epidemiology. By modelling instrument imprecision and spatial variability as different error types, we estimate the direction and magnitude of the effects of error over a range of error types.
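
    The qualitative contrast between the two error types reported above can be reproduced in a deliberately simplified linear sketch (the study used multiplicative error and a Poisson model; all numbers here are illustrative assumptions): classical error attenuates the estimated slope toward the null, while Berkson error leaves it approximately unbiased.

```python
import random
import statistics

random.seed(0)
n = 100_000
beta = 0.5   # assumed true exposure-response slope

def ols_slope(xs, ys):
    """Ordinary least squares slope of y regressed on x."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Classical error: the measurement scatters around the true exposure.
x = [random.gauss(0, 1) for _ in range(n)]
y = [beta * xi + random.gauss(0, 0.1) for xi in x]
z_classical = [xi + random.gauss(0, 1) for xi in x]     # observed = truth + noise

# Berkson error: the true exposure scatters around the measurement.
z_berkson = [random.gauss(0, 1) for _ in range(n)]
x_true = [zi + random.gauss(0, 1) for zi in z_berkson]  # truth = observed + noise
y_berkson = [beta * xi + random.gauss(0, 0.1) for xi in x_true]

slope_classical = ols_slope(z_classical, y)       # attenuated (about beta/2 here)
slope_berkson = ols_slope(z_berkson, y_berkson)   # close to the true beta
```

With equal exposure and error variances the classical attenuation factor is var(x)/(var(x)+var(u)) = 0.5, which is why the first slope lands near half the true value.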

  6. Reduction of Orifice-Induced Pressure Errors

    NASA Technical Reports Server (NTRS)

    Plentovich, Elizabeth B.; Gloss, Blair B.; Eves, John W.; Stack, John P.

    1987-01-01

    Use of porous-plug orifice reduces or eliminates errors, induced by orifice itself, in measuring static pressure on airfoil surface in wind-tunnel experiments. Piece of sintered metal press-fitted into static-pressure orifice so it matches surface contour of model. Porous material reduces orifice-induced pressure error associated with conventional orifice of same or smaller diameter. Also reduces or eliminates additional errors in pressure measurement caused by orifice imperfections. Provides more accurate measurements in regions with very thin boundary layers.

  7. Reducing errors benefits the field-based learning of a fundamental movement skill in children.

    PubMed

    Capio, C M; Poolton, J M; Sit, C H P; Holmstrom, M; Masters, R S W

    2013-03-01

    Proficient fundamental movement skills (FMS) are believed to form the basis of more complex movement patterns in sports. This study examined the development of the FMS of overhand throwing in children through either an error-reduced (ER) or error-strewn (ES) training program. Students (n = 216), aged 8-12 years (M = 9.16, SD = 0.96), practiced overhand throwing in either a program that reduced errors during practice (ER) or one that was ES. The ER program reduced errors by incrementally raising task difficulty, while the ES program incrementally lowered task difficulty. Process-oriented assessment of throwing movement form (Test of Gross Motor Development-2) and product-oriented assessment of throwing accuracy (absolute error) were performed. Changes in performance were examined among children in the upper and lower quartiles of the pretest throwing accuracy scores. ER training participants showed greater gains in movement form and accuracy, and performed throwing more effectively with a concurrent secondary cognitive task. Movement form improved among girls, while throwing accuracy improved among children with low ability. A training program that reduced performance errors in FMS practice thus produced greater learning than one that did not restrict errors. The reduced cognitive processing costs (effective dual-task performance) associated with such an approach suggest its potential benefits for children with developmental conditions. © 2011 John Wiley & Sons A/S.

  8. Competency: an essential component of caring in nursing.

    PubMed

    Knapp, Bobbi

    2004-01-01

    Online e-learning for nurses significantly reduces medical errors by providing "just-in-time" reference and device training. Offering continuing education 24/7 assures continued competency in an ever-changing practice environment while fostering professional development and career mobility.

  9. Sustained Attention is Associated with Error Processing Impairment: Evidence from Mental Fatigue Study in Four-Choice Reaction Time Task

    PubMed Central

    Xiao, Yi; Ma, Feng; Lv, Yixuan; Cai, Gui; Teng, Peng; Xu, FengGang; Chen, Shanguang

    2015-01-01

    Attention is important in error processing. Few studies have examined the link between sustained attention and error processing. In this study, we examined how error-related negativity (ERN) of a four-choice reaction time task was reduced in the mental fatigue condition and investigated the role of sustained attention in error processing. Forty-one recruited participants were divided into two groups. In the fatigue experiment group, 20 subjects performed a fatigue experiment and an additional continuous psychomotor vigilance test (PVT) for 1 h. In the normal experiment group, 21 subjects only performed the normal experimental procedures without the PVT test. Fatigue and sustained attention states were assessed with a questionnaire. Event-related potential results showed that ERN (p < 0.005) and peak (p < 0.05) mean amplitudes decreased in the fatigue experiment. ERN amplitudes were significantly associated with the attention and fatigue states in electrodes Fz, FC1, Cz, and FC2. These findings indicated that sustained attention was related to error processing and that decreased attention is likely the cause of error processing impairment. PMID:25756780

  10. Error-Monitoring in Response to Social Stimuli in Individuals with Higher-Functioning Autism Spectrum Disorder

    PubMed Central

    McMahon, Camilla M.; Henderson, Heather A.

    2014-01-01

    Error-monitoring, or the ability to recognize one's mistakes and implement behavioral changes to prevent further mistakes, may be impaired in individuals with Autism Spectrum Disorder (ASD). Children and adolescents (ages 9-19) with ASD (n = 42) and typical development (n = 42) completed two face processing tasks that required discrimination of either the gender or affect of standardized face stimuli. Post-error slowing and the difference in Error-Related Negativity amplitude between correct and incorrect responses (ERNdiff) were used to index error-monitoring ability. Overall, ERNdiff increased with age. On the Gender Task, individuals with ASD had a smaller ERNdiff than individuals with typical development; however, on the Affect Task, there were no significant diagnostic group differences on ERNdiff. Individuals with ASD may have ERN amplitudes similar to those observed in individuals with typical development in more social contexts compared to less social contexts due to greater consequences for errors, more effortful processing, and/or reduced processing efficiency in these contexts. Across all participants, more post-error slowing on the Affect Task was associated with better social cognitive skills. PMID:25066088

  11. Information systems and human error in the lab.

    PubMed

    Bissell, Michael G

    2004-01-01

    Health system costs are incurred daily in clinical laboratories due to human error. Indeed, a major impetus for automating clinical laboratories has always been the opportunity to simultaneously reduce cost and improve quality of operations by decreasing human error. But merely automating these processes is not enough: to the extent that the introduction of these systems leaves operators with less practice in dealing with unexpected events, or deskilled in problem-solving, new kinds of error will likely appear. Clinical laboratories could benefit by integrating findings on human error from modern behavioral science into their operations. Fully understanding human error requires a deep understanding of human information processing and cognition; predicting and preventing negative consequences requires applying this understanding to laboratory operations. Although the occurrence of a particular error at a particular instant cannot be absolutely prevented, human error rates can be reduced. The following principles are key: understand the process of learning in relation to error; understand the origin of errors, since this knowledge can be used to reduce their occurrence; design systems that are forgiving to the operator by absorbing errors, at least for a time; apply what industrial psychologists know about writing operating procedures and instructions in ways that reduce the probability of error, expertise that is hardly ever put to use in the laboratory; and build a feedback mechanism into the system that enables the operator to recognize in real time that an error has occurred.

  12. Errors as a Means of Reducing Impulsive Food Choice.

    PubMed

    Sellitto, Manuela; di Pellegrino, Giuseppe

    2016-06-05

    Nowadays, the increasing incidence of eating disorders due to poor self-control has given rise to increased obesity and other chronic weight problems, and ultimately, to reduced life expectancy. The capacity to refrain from automatic responses is usually high in situations in which making errors is highly likely. The protocol described here aims at reducing imprudent preference in women during hypothetical intertemporal choices about appetitive food by associating it with errors. First, participants undergo an error task where two different edible stimuli are associated with two different error likelihoods (high and low). Second, they make intertemporal choices about the two edible stimuli, separately. As a result, this method decreases the discount rate for future amounts of the edible reward that cued higher error likelihood, selectively. This effect is under the influence of the self-reported hunger level. The present protocol demonstrates that errors, well known as motivationally salient events, can induce the recruitment of cognitive control, thus being ultimately useful in reducing impatient choices for edible commodities.

  13. Errors as a Means of Reducing Impulsive Food Choice

    PubMed Central

    Sellitto, Manuela; di Pellegrino, Giuseppe

    2016-01-01

    Nowadays, the increasing incidence of eating disorders due to poor self-control has given rise to increased obesity and other chronic weight problems, and ultimately, to reduced life expectancy. The capacity to refrain from automatic responses is usually high in situations in which making errors is highly likely. The protocol described here aims at reducing imprudent preference in women during hypothetical intertemporal choices about appetitive food by associating it with errors. First, participants undergo an error task where two different edible stimuli are associated with two different error likelihoods (high and low). Second, they make intertemporal choices about the two edible stimuli, separately. As a result, this method decreases the discount rate for future amounts of the edible reward that cued higher error likelihood, selectively. This effect is under the influence of the self-reported hunger level. The present protocol demonstrates that errors, well known as motivationally salient events, can induce the recruitment of cognitive control, thus being ultimately useful in reducing impatient choices for edible commodities. PMID:27341281

  14. Unforced errors and error reduction in tennis

    PubMed Central

    Brody, H

    2006-01-01

    Only at the highest level of tennis is the number of winners comparable to the number of unforced errors. As the average player loses many more points due to unforced errors than due to winners by an opponent, if the rate of unforced errors can be reduced, it should lead to an increase in points won. This article shows how players can improve their game by understanding and applying the laws of physics to reduce the number of unforced errors. PMID:16632568

  15. Load Sharing Behavior of Star Gearing Reducer for Geared Turbofan Engine

    NASA Astrophysics Data System (ADS)

    Mo, Shuai; Zhang, Yidu; Wu, Qiong; Wang, Feiming; Matsumura, Shigeki; Houjoh, Haruo

    2017-07-01

    Load sharing behavior is very important for power-split gearing systems; the star gearing reducer, a new and specialized transmission type, can be used in many industrial fields. However, there is little literature on the key multiple-split load sharing issue in the main gearbox of the new geared turbofan engine. This paper analyzes the mechanism of load sharing among the star gears of a star gearing reducer for a geared turbofan engine. Comprehensive meshing error analyses are conducted for the eccentricity error, gear thickness error, base pitch error, assembly error, and bearing error of the reducer. The floating meshing error resulting from meshing clearance variation, caused by the simultaneous floating of the sun gear and annular gear, is taken into account. A refined mathematical model for calculating the load sharing coefficient is established, accounting for the different meshing and supporting stiffnesses of the components. Curves of the load sharing coefficient are obtained under the interactions, single actions, and single variations of the various component errors, and the sensitivity of the load sharing coefficient to the different errors is quantified. The load sharing coefficient of the star gearing reducer is 1.033, and the maximum meshing force on a gear tooth is about 3010 N. This paper provides theoretical evidence for optimal parameter design and proper tolerance distribution in advanced development and manufacturing, so as to achieve optimal economic and technical results.
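    The load sharing coefficient reported in this record admits a simple illustration. As a hedged sketch (the paper's refined stiffness model is far more detailed; the equal-share definition and the branch forces below are assumptions for illustration only), the coefficient can be taken as the ratio of the most heavily loaded branch to the ideal equal share:

    ```python
    def load_sharing_coefficient(branch_forces):
        """Ratio of the most heavily loaded branch to the ideal equal share.

        Assumed convention for power-split trains: a coefficient of 1.0
        means perfectly even load sharing among the star-gear branches.
        """
        ideal = sum(branch_forces) / len(branch_forces)
        return max(branch_forces) / ideal

    # Hypothetical mesh forces (N) for a four-branch star reducer
    forces = [3010.0, 2950.0, 2890.0, 2920.0]
    k = load_sharing_coefficient(forces)
    ```

    Under this convention, component errors that shift load toward one branch raise the coefficient above 1.0, which is why tolerance distribution matters for keeping it near unity.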

  16. Quasi-static shape adjustment of a 15 meter diameter space antenna

    NASA Technical Reports Server (NTRS)

    Belvin, W. Keith; Herstrom, Catherine L.; Edighoffer, Harold H.

    1987-01-01

    A 15 meter diameter Hoop-Column antenna has been analyzed and tested to study shape adjustment of the reflector surface. The Hoop-Column antenna concept employs pretensioned cables and mesh to produce a paraboloidal reflector surface. Fabrication errors and thermal distortions may significantly reduce surface accuracy and consequently degrade electromagnetic performance. Thus, the ability to adjust the surface shape is desirable. The shape adjustment algorithm consisted of finite element and least squares error analyses to minimize the surface distortions. Experimental results verified the analysis. Application of the procedure resulted in a reduction of surface error by 38 percent. Quasi-static shape adjustment has the potential for on-orbit compensation for a variety of surface shape distortions.

  17. PACE 2: Pricing and Cost Estimating Handbook

    NASA Technical Reports Server (NTRS)

    Stewart, R. D.; Shepherd, T.

    1977-01-01

    An automatic data processing system for preparing industrial-engineering-type manhour and material cost estimates has been established. This computer system has evolved into a highly versatile and flexible tool that significantly reduces computation time, eliminates computational errors, and reduces typing and reproduction time for estimators and pricers, since all mathematical and clerical functions are automatic once the basic inputs are derived.

  18. Impact of monetary incentives on cognitive performance and error monitoring following sleep deprivation.

    PubMed

    Hsieh, Shulan; Li, Tzu-Hsien; Tsai, Ling-Ling

    2010-04-01

    To examine whether monetary incentives attenuate the negative effects of sleep deprivation on cognitive performance in a flanker task that requires higher-level cognitive-control processes, including error monitoring. Twenty-four healthy adults aged 18 to 23 years were randomly divided into two groups: one received monetary incentives for performance accuracy and the other did not. Both groups performed a flanker task and underwent electroencephalographic recordings for event-related brain potentials after normal sleep and after one night of total sleep deprivation in a within-subject, counterbalanced, repeated-measures design. Monetary incentives significantly enhanced response accuracy and reduced reaction time variability under both normal-sleep and sleep-deprived conditions, and they reduced the effects of sleep deprivation on the subjective effort level, the amplitude of the error-related negativity (an error-related event-related potential component), and the latency of the P300 (an event-related potential component related to attention processes). However, monetary incentives could not attenuate the effects of sleep deprivation on any measure of behavioral performance, such as response accuracy, reaction time variability, or post-error accuracy adjustments; nor could they reduce the effects of sleep deprivation on the amplitude of the Pe, another error-related event-related potential component. This study shows that monetary incentives selectively reduce the effects of total sleep deprivation on some brain activities, but they cannot attenuate the effects of sleep deprivation on performance decrements in tasks that require high-level cognitive-control processes. Thus, monetary incentives and sleep deprivation may act through both common and different mechanisms to affect cognitive performance.

  19. Meaningful Peer Review in Radiology: A Review of Current Practices and Potential Future Directions.

    PubMed

    Moriarity, Andrew K; Hawkins, C Matthew; Geis, J Raymond; Dreyer, Keith J; Kamer, Aaron P; Khandheria, Paras; Morey, Jose; Whitfill, James; Wiggins, Richard H; Itri, Jason N

    2016-12-01

    The current practice of peer review within radiology is well developed and widely implemented compared with other medical specialties. However, there are many factors that limit current peer review practices from reducing diagnostic errors and improving patient care. The development of "meaningful peer review" requires a transition away from compliance toward quality improvement, whereby the information and insights gained facilitate education and drive systematic improvements that reduce the frequency and impact of diagnostic error. The next generation of peer review requires significant improvements in IT functionality and integration, enabling features such as anonymization, adjudication by multiple specialists, categorization and analysis of errors, tracking, feedback, and easy export into teaching files and other media that require strong partnerships with vendors. In this article, the authors assess various peer review practices, with focused discussion on current limitations and future needs for meaningful peer review in radiology. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  20. Investigating the Causes of Medication Errors and Strategies to Prevention of Them from Nurses and Nursing Student Viewpoint

    PubMed Central

    Gorgich, Enam Alhagh Charkhat; Barfroshan, Sanam; Ghoreishi, Gholamreza; Yaghoobi, Maryam

    2016-01-01

    Introduction and Aim: Medication errors are a serious problem worldwide and among the most common medical errors; they threaten patient safety and may even lead to death. The purpose of this study was to investigate the causes of medication errors and strategies for preventing them from the viewpoint of nurses and nursing students. Materials & Methods: This cross-sectional descriptive study was conducted on 327 nursing staff of Khatam-al-Anbia hospital and 62 intern nursing students of the nursing and midwifery school of Zahedan, Iran, enrolled through availability sampling in 2015. Data were collected by a valid and reliable questionnaire. Descriptive statistics, the t-test, and ANOVA were applied using SPSS 16. Findings: The most common cause of medication errors among nurses was tiredness due to increased workload (97.8%), and among nursing students it was drug calculation (77.4%). In the opinion of both nurses and nursing students, the most important preventive measures were reducing work pressure by increasing personnel in proportion to the number and condition of patients, and creating a dedicated medication-calculation unit. There was also a significant relationship between ward type and the mean number of medication errors in both groups. Conclusion: Based on the results, it is recommended that nurse managers resolve the human resources problem and provide workshops and in-service education on preparing medications, drug side effects, and pharmacological knowledge. Using electronic medication cards is another measure that reduces medication errors. PMID:27045413

  1. Event-Related-Potential (ERP) Correlates of Performance Monitoring in Adults With Attention-Deficit Hyperactivity Disorder (ADHD)

    PubMed Central

    Marquardt, Lynn; Eichele, Heike; Lundervold, Astri J.; Haavik, Jan; Eichele, Tom

    2018-01-01

    Introduction: Attention-deficit hyperactivity disorder (ADHD) is one of the most frequent neurodevelopmental disorders in children and tends to persist into adulthood. Evidence from neuropsychological, neuroimaging, and electrophysiological studies indicates that alterations of error processing are core symptoms in children and adolescents with ADHD. To test whether adults with ADHD show persisting deficits and compensatory processes, we investigated performance monitoring during stimulus-evaluation and response-selection, with a focus on errors, as well as within-group correlations with symptom scores. Methods: Fifty-five participants (27 ADHD and 28 controls) aged 19–55 years performed a modified flanker task during EEG recording with 64 electrodes, and the ADHD and control groups were compared on measures of behavioral task performance, event-related potentials of performance monitoring (N2, P3), and error processing (ERN, Pe). Adult ADHD Self-Report Scale (ASRS) was used to assess ADHD symptom load. Results: Adults with ADHD showed higher error rates in incompatible trials, and these error rates correlated positively with the ASRS scores. Also, we observed lower P3 amplitudes in incompatible trials, which were inversely correlated with symptom load in the ADHD group. Adults with ADHD also displayed reduced error-related ERN and Pe amplitudes. There were no significant differences in reaction time (RT) and RT variability between the two groups. Conclusion: Our findings show deviations of electrophysiological measures, suggesting reduced effortful engagement of attentional and error-monitoring processes in adults with ADHD. Associations between ADHD symptom scores, event-related potential amplitudes, and poorer task performance in the ADHD group further support this notion. PMID:29706908

  2. Cone-Beam CT Assessment of Interfraction and Intrafraction Setup Error of Two Head-and-Neck Cancer Thermoplastic Masks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velec, Michael; Waldron, John N.; O'Sullivan, Brian

    2010-03-01

    Purpose: To prospectively compare setup error in standard thermoplastic masks and skin-sparing masks (SSMs) modified with low neck cutouts for head-and-neck intensity-modulated radiation therapy (IMRT) patients. Methods and Materials: Twenty head-and-neck IMRT patients were randomized to be treated in a standard mask (SM) or SSM. Cone-beam computed tomography (CBCT) scans, acquired daily after both initial setup and any repositioning, were used for initial and residual interfraction evaluation, respectively. Weekly, post-IMRT CBCT scans were acquired for intrafraction setup evaluation. The population random (sigma) and systematic (SIGMA) errors were compared for SMs and SSMs. Skin toxicity was recorded weekly by use of Radiation Therapy Oncology Group criteria. Results: We evaluated 762 CBCT scans in 11 patients randomized to the SM and 9 to the SSM. Initial interfraction sigma was 1.6 mm or less or 1.1 deg. or less for SM and 2.0 mm or less and 0.8 deg. for SSM. Initial interfraction SIGMA was 1.0 mm or less or 1.4 deg. or less for SM and 1.1 mm or less or 0.9 deg. or less for SSM. These errors were reduced before IMRT with CBCT image guidance, with no significant differences in residual interfraction or intrafraction uncertainties between SMs and SSMs. Intrafraction sigma and SIGMA were less than 1 mm and less than 1 deg. for both masks. Less severe skin reactions were observed in the cutout regions of the SSM compared with non-cutout regions. Conclusions: Interfraction and intrafraction setup error is not significantly different for SSMs and conventional masks in head-and-neck radiation therapy. Mask cutouts should be considered for these patients in an effort to reduce skin toxicity.
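    The population random (sigma) and systematic (SIGMA) errors in this record follow a standard decomposition in radiotherapy setup analysis (a hedged sketch of the common convention, not necessarily the authors' exact computation): SIGMA is the spread of per-patient mean setup errors, and sigma is the root-mean-square of per-patient standard deviations.

    ```python
    import math

    def population_setup_errors(per_patient_shifts):
        """per_patient_shifts: one inner list per patient of daily setup
        shifts (mm) along a single axis.
        Returns (systematic SIGMA, random sigma)."""
        means, sds = [], []
        for shifts in per_patient_shifts:
            n = len(shifts)
            m = sum(shifts) / n
            var = sum((s - m) ** 2 for s in shifts) / (n - 1)
            means.append(m)
            sds.append(math.sqrt(var))
        gm = sum(means) / len(means)
        # SIGMA: SD of per-patient means; sigma: RMS of per-patient SDs
        sigma_sys = math.sqrt(sum((m - gm) ** 2 for m in means) / (len(means) - 1))
        sigma_rand = math.sqrt(sum(sd ** 2 for sd in sds) / len(sds))
        return sigma_sys, sigma_rand

    # Hypothetical daily lateral shifts (mm) for three patients
    shifts = [[1.0, 1.5, 0.5], [-0.5, 0.0, 0.5], [2.0, 1.0, 1.5]]
    Sigma, sigma = population_setup_errors(shifts)
    ```

    The distinction matters because systematic errors shift the whole dose distribution while random errors blur it, so they enter margin recipes differently.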

  3. Do Work Condition Interventions Affect Quality and Errors in Primary Care? Results from the Healthy Work Place Study.

    PubMed

    Linzer, Mark; Poplau, Sara; Brown, Roger; Grossman, Ellie; Varkey, Anita; Yale, Steven; Williams, Eric S; Hicks, Lanis; Wallock, Jill; Kohnhorst, Diane; Barbouche, Michael

    2017-01-01

    While primary care work conditions are associated with adverse clinician outcomes, little is known about the effect of work condition interventions on quality or safety. A cluster randomized controlled trial of 34 clinics in the upper Midwest and New York City. Primary care clinicians and their diabetic and hypertensive patients. Quality improvement projects to improve communication between providers, workflow design, and chronic disease management. Intervention clinics received brief summaries of their clinician and patient outcome data at baseline. We measured work conditions and clinician and patient outcomes both at baseline and 6-12 months post-intervention. Multilevel regression analyses assessed the impact of work condition changes on outcomes. Subgroup analyses assessed impact by intervention category. There were no significant differences in error reduction (19 % vs. 11 %, OR of improvement 1.84, 95 % CI 0.70, 4.82, p = 0.21) or quality of care improvement (19 % improved vs. 44 %, OR 0.62, 95 % CI 0.58, 1.21, p = 0.42) between intervention and control clinics. The conceptual model linking work conditions, provider outcomes, and error reduction showed significant relationships between work conditions and provider outcomes (p ≤ 0.001) and a trend toward a reduced error rate in providers with lower burnout (OR 1.44, 95 % CI 0.94, 2.23, p = 0.09). Few quality metrics, short time span, fewer clinicians recruited than anticipated. Work-life interventions improving clinician satisfaction and well-being do not necessarily reduce errors or improve quality. Longer, more focused interventions may be needed to produce meaningful improvements in patient care. ClinicalTrials.gov # NCT02542995.

  4. An educational and audit tool to reduce prescribing error in intensive care.

    PubMed

    Thomas, A N; Boxall, E M; Laha, S K; Day, A J; Grundy, D

    2008-10-01

    To reduce prescribing errors in an intensive care unit by providing prescriber education in tutorials, ward-based teaching and feedback in 3-monthly cycles with each new group of trainee medical staff. Prescribing audits were conducted three times in each 3-month cycle, once pretraining, once post-training and a final audit after 6 weeks. The audit information was fed back to prescribers with their correct prescribing rates, rates for individual error types and total error rates together with anonymised information about other prescribers' error rates. The percentage of prescriptions with errors decreased over each 3-month cycle (pretraining 25%, 19%, (one missing data point), post-training 23%, 6%, 11%, final audit 7%, 3%, 5% (p<0.0005)). The total number of prescriptions and error rates varied widely between trainees (data collection one; cycle two: range of prescriptions written: 1-61, median 18; error rate: 0-100%; median: 15%). Prescriber education and feedback reduce manual prescribing errors in intensive care.

  5. Cognitive tests predict real-world errors: the relationship between drug name confusion rates in laboratory-based memory and perception tests and corresponding error rates in large pharmacy chains

    PubMed Central

    Schroeder, Scott R; Salomon, Meghan M; Galanter, William L; Schiff, Gordon D; Vaida, Allen J; Gaunt, Michael J; Bryson, Michelle L; Rash, Christine; Falck, Suzanne; Lambert, Bruce L

    2017-01-01

    Background Drug name confusion is a common type of medication error and a persistent threat to patient safety. In the USA, roughly one per thousand prescriptions results in the wrong drug being filled, and most of these errors involve drug names that look or sound alike. Prior to approval, drug names undergo a variety of tests to assess their potential for confusability, but none of these preapproval tests has been shown to predict real-world error rates. Objectives We conducted a study to assess the association between error rates in laboratory-based tests of drug name memory and perception and real-world drug name confusion error rates. Methods Eighty participants, comprising doctors, nurses, pharmacists, technicians and lay people, completed a battery of laboratory tests assessing visual perception, auditory perception and short-term memory of look-alike and sound-alike drug name pairs (eg, hydroxyzine/hydralazine). Results Laboratory test error rates (and other metrics) significantly predicted real-world error rates obtained from a large, outpatient pharmacy chain, with the best-fitting model accounting for 37% of the variance in real-world error rates. Cross-validation analyses confirmed these results, showing that the laboratory tests also predicted errors from a second pharmacy chain, with 45% of the variance being explained by the laboratory test data. Conclusions Across two distinct pharmacy chains, there is a strong and significant association between drug name confusion error rates observed in the real world and those observed in laboratory-based tests of memory and perception. Regulators and drug companies seeking a validated preapproval method for identifying confusing drug names ought to consider using these simple tests. 
By using a standard battery of memory and perception tests, it should be possible to reduce the number of confusing look-alike and sound-alike drug name pairs that reach the market, which will help protect patients from potentially harmful medication errors. PMID:27193033
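    The association this record reports is an ordinary regression question: how much of the variance in real-world error rates the laboratory scores explain. Below is a minimal stdlib sketch of a least-squares fit and its R²; the `lab`/`real` figures are hypothetical illustration data, not the study's.

    ```python
    def linreg_r2(x, y):
        """Ordinary least-squares fit y ~ a + b*x; returns (a, b, R^2)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        sxx = sum((xi - mx) ** 2 for xi in x)
        b = sxy / sxx
        a = my - b * mx
        ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
        ss_tot = sum((yi - my) ** 2 for yi in y)
        return a, b, 1.0 - ss_res / ss_tot

    # Hypothetical data: lab-test confusion rates (%) vs. real-world
    # wrong-drug error rates (per 1000 prescriptions)
    lab = [2.1, 3.4, 5.0, 6.2, 8.9, 10.5]
    real = [0.4, 0.6, 0.9, 1.0, 1.5, 1.7]
    a, b, r2 = linreg_r2(lab, real)
    ```

    In the study's terms, an R² of 0.37 or 0.45 corresponds to the laboratory battery explaining 37% or 45% of the variance in observed pharmacy error rates.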

  6. Diffraction-based overlay metrology for double patterning technologies

    NASA Astrophysics Data System (ADS)

    Dasari, Prasad; Korlahalli, Rahul; Li, Jie; Smith, Nigel; Kritsun, Oleg; Volkman, Cathy

    2009-03-01

    The extension of optical lithography to 32nm and beyond is made possible by double patterning techniques (DPT) at critical levels of the process flow. The ease of DPT implementation is hindered by the increased significance of critical dimension uniformity and overlay errors. Diffraction-based overlay (DBO) has been shown to be an effective metrology solution for accurate determination of the overlay errors associated with double patterning [1, 2] processes. In this paper we report its use in litho-freeze-litho-etch (LFLE) and spacer double patterning technology (SDPT), pitch-splitting solutions that reduce the significance of overlay errors. Since the control of overlay between various mask/level combinations is critical for fabrication, precise and accurate assessment of errors by advanced metrology techniques such as spectroscopic diffraction-based overlay (DBO) and traditional image-based overlay (IBO) using advanced target designs will be reported. A comparison between DBO, IBO, and CD-SEM measurements will be reported, along with a discussion of TMU requirements for 32nm technology and TMU performance data for LFLE and SDPT targets measured by different overlay approaches.

  7. Exploring Reactions to Pilot Reliability Certification and Changing Attitudes on the Reduction of Errors

    ERIC Educational Resources Information Center

    Boedigheimer, Dan

    2010-01-01

    Approximately 70% of aviation accidents are attributable to human error. The greatest opportunity for further improving aviation safety is found in reducing human errors in the cockpit. The purpose of this quasi-experimental, mixed-method research was to evaluate whether there was a difference in pilot attitudes toward reducing human error in the…

  8. Reducing diagnostic errors in medicine: what's the goal?

    PubMed

    Graber, Mark; Gordon, Ruthanna; Franklin, Nancy

    2002-10-01

    This review considers the feasibility of reducing or eliminating the three major categories of diagnostic errors in medicine: "No-fault errors" occur when the disease is silent, presents atypically, or mimics something more common. These errors will inevitably decline as medical science advances, new syndromes are identified, and diseases can be detected more accurately or at earlier stages. These errors can never be eradicated, unfortunately, because new diseases emerge, tests are never perfect, patients are sometimes noncompliant, and physicians will inevitably, at times, choose the most likely diagnosis over the correct one, illustrating the concept of necessary fallibility and the probabilistic nature of choosing a diagnosis. "System errors" play a role when diagnosis is delayed or missed because of latent imperfections in the health care system. These errors can be reduced by system improvements, but can never be eliminated because these improvements lag behind and degrade over time, and each new fix creates the opportunity for novel errors. Tradeoffs also guarantee system errors will persist, when resources are just shifted. "Cognitive errors" reflect misdiagnosis from faulty data collection or interpretation, flawed reasoning, or incomplete knowledge. The limitations of human processing and the inherent biases in using heuristics guarantee that these errors will persist. Opportunities exist, however, for improving the cognitive aspect of diagnosis by adopting system-level changes (e.g., second opinions, decision-support systems, enhanced access to specialists) and by training designed to improve cognition or cognitive awareness. Diagnostic error can be substantially reduced, but never eradicated.

  9. Experimental investigation of false positive errors in auditory species occurrence surveys

    USGS Publications Warehouse

    Miller, David A.W.; Weir, Linda A.; McClintock, Brett T.; Grant, Evan H. Campbell; Bailey, Larissa L.; Simons, Theodore R.

    2012-01-01

    False positive errors are a significant component of many ecological data sets, which in combination with false negative errors, can lead to severe biases in conclusions about ecological systems. We present results of a field experiment where observers recorded observations for known combinations of electronically broadcast calling anurans under conditions mimicking field surveys to determine species occurrence. Our objectives were to characterize false positive error probabilities for auditory methods based on a large number of observers, to determine if targeted instruction could be used to reduce false positive error rates, and to establish useful predictors of among-observer and among-species differences in error rates. We recruited 31 observers, ranging in abilities from novice to expert, that recorded detections for 12 species during 180 calling trials (66,960 total observations). All observers made multiple false positive errors and on average 8.1% of recorded detections in the experiment were false positive errors. Additional instruction had only minor effects on error rates. After instruction, false positive error probabilities decreased by 16% for treatment individuals compared to controls with broad confidence interval overlap of 0 (95% CI: -46 to 30%). This coincided with an increase in false negative errors due to the treatment (26%; -3 to 61%). Differences among observers in false positive and in false negative error rates were best predicted by scores from an online test and a self-assessment of observer ability completed prior to the field experiment. In contrast, years of experience conducting call surveys was a weak predictor of error rates. False positive errors were also more common for species that were played more frequently, but were not related to the dominant spectral frequency of the call. 
Our results corroborate other work that demonstrates false positives are a significant component of species occurrence data collected by auditory methods. Instructing observers to only report detections they are completely certain are correct is not sufficient to eliminate errors. As a result, analytical methods that account for false positive errors will be needed, and independent testing of observer ability is a useful predictor for among-observer variation in observation error rates.

  10. Optimizing α for better statistical decisions: a case study involving the pace-of-life syndrome hypothesis: optimal α levels set to minimize Type I and II errors frequently result in different conclusions from those using α = 0.05.

    PubMed

    Mudge, Joseph F; Penny, Faith M; Houlahan, Jeff E

    2012-12-01

    Setting optimal significance levels that minimize Type I and Type II errors allows for more transparent and well-considered statistical decision making compared to the traditional α = 0.05 significance level. We use the optimal α approach to re-assess conclusions reached by three recently published tests of the pace-of-life syndrome hypothesis, which attempts to unify occurrences of different physiological, behavioral, and life history characteristics under one theory, over different scales of biological organization. While some of the conclusions reached using optimal α were consistent to those previously reported using the traditional α = 0.05 threshold, opposing conclusions were also frequently reached. The optimal α approach reduced probabilities of Type I and Type II errors, and ensured statistical significance was associated with biological relevance. Biologists should seriously consider their choice of α when conducting null hypothesis significance tests, as there are serious disadvantages with consistent reliance on the traditional but arbitrary α = 0.05 significance level. Copyright © 2012 WILEY Periodicals, Inc.
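    The optimal-α idea in this record can be sketched concretely. Assuming a one-sided z-test with a known standardized effect size (the paper's method handles more general tests; `optimal_alpha` and its weighting scheme are illustrative assumptions, not the authors' code), one can grid-search the α that minimizes the weighted average of Type I and Type II error probabilities:

    ```python
    import math

    def phi(z):
        """Standard normal CDF."""
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    def phi_inv(p):
        """Inverse standard normal CDF by bisection."""
        lo, hi = -10.0, 10.0
        for _ in range(80):
            mid = (lo + hi) / 2.0
            if phi(mid) < p:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0

    def optimal_alpha(effect, n, w1=1.0, w2=1.0, grid=2000):
        """Grid-search the alpha minimizing the weighted mean of Type I
        (alpha) and Type II (beta) error rates for a one-sided z-test of
        a mean shift of `effect` standard deviations, sample size n."""
        best = None
        for i in range(1, grid):
            a = i / grid
            beta = phi(phi_inv(1.0 - a) - effect * math.sqrt(n))
            cost = (w1 * a + w2 * beta) / (w1 + w2)
            if best is None or cost < best[1]:
                best = (a, cost, beta)
        return best

    alpha_opt, cost, beta = optimal_alpha(effect=0.5, n=30)
    ```

    With equal weights the search balances the two error probabilities rather than fixing α = 0.05 regardless of power; unequal weights encode the relative costs of false positives and false negatives, which is the transparency the authors argue for.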

  11. Hardware-Efficient On-line Learning through Pipelined Truncated-Error Backpropagation in Binary-State Networks

    PubMed Central

    Mostafa, Hesham; Pedroni, Bruno; Sheik, Sadique; Cauwenberghs, Gert

    2017-01-01

    Artificial neural networks (ANNs) trained using backpropagation are powerful learning architectures that have achieved state-of-the-art performance in various benchmarks. Significant effort has been devoted to developing custom silicon devices to accelerate inference in ANNs. Accelerating the training phase, however, has attracted relatively little attention. In this paper, we describe a hardware-efficient on-line learning technique for feedforward multi-layer ANNs that is based on pipelined backpropagation. Learning is performed in parallel with inference in the forward pass, removing the need for an explicit backward pass and requiring no extra weight lookup. By using binary state variables in the feedforward network and ternary errors in truncated-error backpropagation, the need for any multiplications in the forward and backward passes is removed, and memory requirements for the pipelining are drastically reduced. Further reduction in addition operations owing to the sparsity in the forward neural and backpropagating error signal paths contributes to a highly efficient hardware implementation. For proof-of-concept validation, we demonstrate on-line learning of MNIST handwritten digit classification on a Spartan 6 FPGA interfacing with an external 1Gb DDR2 DRAM, which shows small degradation in test error performance compared to an equivalently sized binary ANN trained off-line using standard backpropagation and exact errors. Our results highlight an attractive synergy between pipelined backpropagation and binary-state networks in substantially reducing computation and memory requirements, making pipelined on-line learning practical in deep networks. PMID:28932180

  12. Hardware-Efficient On-line Learning through Pipelined Truncated-Error Backpropagation in Binary-State Networks.

    PubMed

    Mostafa, Hesham; Pedroni, Bruno; Sheik, Sadique; Cauwenberghs, Gert

    2017-01-01

    Artificial neural networks (ANNs) trained using backpropagation are powerful learning architectures that have achieved state-of-the-art performance in various benchmarks. Significant effort has been devoted to developing custom silicon devices to accelerate inference in ANNs. Accelerating the training phase, however, has attracted relatively little attention. In this paper, we describe a hardware-efficient on-line learning technique for feedforward multi-layer ANNs that is based on pipelined backpropagation. Learning is performed in parallel with inference in the forward pass, removing the need for an explicit backward pass and requiring no extra weight lookup. By using binary state variables in the feedforward network and ternary errors in truncated-error backpropagation, the need for any multiplications in the forward and backward passes is removed, and memory requirements for the pipelining are drastically reduced. Further reduction in addition operations owing to the sparsity in the forward neural and backpropagating error signal paths contributes to a highly efficient hardware implementation. For proof-of-concept validation, we demonstrate on-line learning of MNIST handwritten digit classification on a Spartan 6 FPGA interfacing with an external 1Gb DDR2 DRAM, which shows small degradation in test error performance compared to an equivalently sized binary ANN trained off-line using standard backpropagation and exact errors. Our results highlight an attractive synergy between pipelined backpropagation and binary-state networks in substantially reducing computation and memory requirements, making pipelined on-line learning practical in deep networks.
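The arithmetic simplification described above can be shown in a toy sketch: with binary activations in {0, 1} and errors truncated to {-1, 0, +1}, the weight update needs no multiplier, only sign selection and addition. The truncation threshold and layer sizes here are made up, and this is not the authors' pipelined FPGA design:

```python
import numpy as np

rng = np.random.default_rng(0)

def ternarize(err, theta=0.5):
    """Truncate a real-valued error vector to {-1, 0, +1}."""
    return np.where(err > theta, 1, np.where(err < -theta, -1, 0)).astype(np.int8)

# Toy layer: binary forward state, ternary backpropagated error.
x = (rng.random(8) > 0.5).astype(np.int8)   # binary activations in {0, 1}
err = ternarize(rng.normal(size=4))         # ternary error from next layer

# The weight "gradient" err[i] * x[j] needs no multiplier: both factors
# lie in {-1, 0, +1}, so each entry is just +-x[j] or 0 (select/negate).
grad = np.outer(err, x)

lr = 1                                      # integer learning-rate step
W = np.zeros((4, 8), dtype=np.int32)        # weights kept in higher precision
W -= lr * grad                              # pure additions in hardware
```

Sparsity helps further: wherever `err` or `x` is zero, the corresponding additions are skipped entirely.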

  13. Digital stereo photogrammetry for grain-scale monitoring of fluvial surfaces: Error evaluation and workflow optimisation

    NASA Astrophysics Data System (ADS)

    Bertin, Stephane; Friedrich, Heide; Delmas, Patrice; Chan, Edwin; Gimel'farb, Georgy

    2015-03-01

    Grain-scale monitoring of fluvial morphology is important for the evaluation of river system dynamics. Significant progress in remote sensing and computer performance allows rapid high-resolution data acquisition; however, applications in fluvial environments remain challenging. Even in a controlled environment, such as a laboratory, the extensive acquisition workflow is prone to the propagation of errors in digital elevation models (DEMs). This holds for both of the common surface recording techniques: digital stereo photogrammetry and terrestrial laser scanning (TLS). The optimisation of the acquisition process, an effective way to reduce the occurrence of errors, is generally limited by the use of commercial software. Therefore, the removal of evident blunders during post-processing is regarded as standard practice, although this may introduce new errors. This paper presents a detailed evaluation of a digital stereo-photogrammetric workflow developed for fluvial hydraulic applications. The introduced workflow is user-friendly and can be adapted to various close-range measurements: imagery is acquired with two Nikon D5100 cameras and processed using non-proprietary "on-the-job" calibration and dense scanline-based stereo matching algorithms. Novel ground truth evaluation studies were designed to identify the DEM errors, which resulted from a combination of calibration errors, inaccurate image rectifications and stereo-matching errors. To ensure optimum DEM quality, we show that systematic DEM errors must be minimised by ensuring a good distribution of control points throughout the image format during calibration. DEM quality is then largely dependent on the imagery utilised. We evaluated the open access multi-scale Retinex algorithm to facilitate the stereo matching, and quantified its influence on DEM quality. Occlusions, inherent to any roughness element, are still a major limiting factor to DEM accuracy. 
We show that a careful selection of the camera-to-object and baseline distance reduces errors in occluded areas and that realistic ground truths help to quantify those errors.

  14. Error analysis and new dual-cosine window for estimating the sensor frequency response function from the step response data

    NASA Astrophysics Data System (ADS)

    Yang, Shuang-Long; Liang, Li-Ping; Liu, Hou-De; Xu, Ke-Jun

    2018-03-01

    Aiming to reduce the estimation error of the sensor frequency response function (FRF) estimated by the commonly used window-based spectral estimation method, error models for the interpolation and transient errors are derived in non-parametric form. Analysis of window effects on these errors reveals that the commonly used Hanning window yields a smaller interpolation error, which can be further suppressed by cubic spline interpolation when estimating the FRF from step response data, and that a window with a smaller front-end value suppresses more of the transient error. Accordingly, a new dual-cosine window, whose non-zero discrete Fourier transform bins lie at -3, -1, 0, 1, and 3, is constructed for FRF estimation. Compared with the Hanning window, the new dual-cosine window has equivalent interpolation-error suppression capability and better transient-error suppression capability when estimating the FRF from the step response; specifically, it improves the asymptotic decay of the transient error from O(N^-2) for the Hanning window method to O(N^-4), while increasing the uncertainty only slightly (about 0.4 dB). One direction of a wind tunnel strain gauge balance, a high-order, lightly damped, non-minimum-phase system, is then employed as an example to verify the new dual-cosine window-based spectral estimation method. The model simulation results show that the new dual-cosine window method outperforms the Hanning window method for FRF estimation and, compared with the Gans and LPM methods, offers simple computation, less time consumption, and a short data requirement; calculation with actual balance FRF data is consistent with the simulation. The new dual-cosine window is thus effective and practical for FRF estimation.
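The defining property of the proposed window, non-zero DFT bins only at -3, -1, 0, 1, and 3, can be checked numerically: any window of the form a0 + a1·cos(2πn/N) + a3·cos(6πn/N) has exactly this support. The coefficient values below are hypothetical, since the abstract does not give them:

```python
import numpy as np

N = 256
n = np.arange(N)

# Hypothetical coefficients: the abstract fixes only the DFT support
# (bins -3, -1, 0, 1, 3), not the coefficient values themselves.
a0, a1, a3 = 0.5, 0.4, 0.1
w = a0 - a1 * np.cos(2 * np.pi * n / N) - a3 * np.cos(2 * np.pi * 3 * n / N)

# Each cosine term at harmonic k contributes exactly bins +k and -k,
# so the DFT support is {0, +-1, +-3} (negative bins wrap to N-1, N-3).
W = np.fft.fft(w) / N
support = {k % N for k in (-3, -1, 0, 1, 3)}
nonzero = set(np.flatnonzero(np.abs(W) > 1e-10).tolist())
```

The small front-end value w[0] = a0 - a1 - a3 is what drives the transient-error suppression discussed in the abstract.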

  15. The effectiveness of computerized order entry at reducing preventable adverse drug events and medication errors in hospital settings: a systematic review and meta-analysis

    PubMed Central

    2014-01-01

    Background The Health Information Technology for Economic and Clinical Health (HITECH) Act subsidizes implementation by hospitals of electronic health records with computerized provider order entry (CPOE), which may reduce patient injuries caused by medication errors (preventable adverse drug events, pADEs). Effects on pADEs have not been rigorously quantified, and effects on medication errors have been variable. The objectives of this analysis were to assess the effectiveness of CPOE at reducing pADEs in hospital-related settings, and examine reasons for heterogeneous effects on medication errors. Methods Articles were identified using MEDLINE, Cochrane Library, Econlit, web-based databases, and bibliographies of previous systematic reviews (September 2013). Eligible studies compared CPOE with paper-order entry in acute care hospitals, and examined diverse pADEs or medication errors. Studies on children or with limited event-detection methods were excluded. Two investigators extracted data on events and factors potentially associated with effectiveness. We used random effects models to pool data. Results Sixteen studies addressing medication errors met pooling criteria; six also addressed pADEs. Thirteen studies used pre-post designs. Compared with paper-order entry, CPOE was associated with half as many pADEs (pooled risk ratio (RR) = 0.47, 95% CI 0.31 to 0.71) and medication errors (RR = 0.46, 95% CI 0.35 to 0.60). Regarding reasons for heterogeneous effects on medication errors, five intervention factors and two contextual factors were sufficiently reported to support subgroup analyses or meta-regression. Differences between commercial versus homegrown systems, presence and sophistication of clinical decision support, hospital-wide versus limited implementation, and US versus non-US studies were not significant, nor was timing of publication. Higher baseline rates of medication errors predicted greater reductions (P < 0.001). 
Other context and implementation variables were seldom reported. Conclusions In hospital-related settings, implementing CPOE is associated with a greater than 50% decline in pADEs, although the studies used weak designs. Decreases in medication errors are similar and robust to variations in important aspects of intervention design and context. This suggests that CPOE implementation, as subsidized under the HITECH Act, may benefit public health. More detailed reporting of the context and process of implementation could shed light on factors associated with greater effectiveness. PMID:24894078
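Random-effects pooling of risk ratios, as used in the meta-analysis above, is commonly done with the DerSimonian-Laird estimator on the log scale. A minimal sketch with made-up study data, not the review's sixteen studies:

```python
import math

def pool_log_rr(log_rrs, variances):
    """DerSimonian-Laird random-effects pooling of log risk ratios."""
    w = [1 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)              # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se

# Illustrative (made-up) study risk ratios and log-scale variances.
logs = [math.log(r) for r in (0.4, 0.55, 0.35, 0.7, 0.5)]
vs = [0.05, 0.08, 0.04, 0.1, 0.06]
pooled, se = pool_log_rr(logs, vs)
rr = math.exp(pooled)
lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
```

Exponentiating the pooled log-RR and its interval endpoints gives results in the same form as the review's RR = 0.47 (95% CI 0.31 to 0.71).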

  16. Innovations in Medication Preparation Safety and Wastage Reduction: Use of a Workflow Management System in a Pediatric Hospital.

    PubMed

    Davis, Stephen Jerome; Hurtado, Josephine; Nguyen, Rosemary; Huynh, Tran; Lindon, Ivan; Hudnall, Cedric; Bork, Sara

    2017-01-01

    Background: USP <797> regulatory requirements have mandated that pharmacies improve aseptic techniques and cleanliness of the medication preparation areas. In addition, the Institute for Safe Medication Practices (ISMP) recommends that technology and automation be used as much as possible for preparing and verifying compounded sterile products. Objective: To determine the benefits associated with the implementation of the workflow management system, such as reducing medication preparation and delivery errors, reducing quantity and frequency of medication errors, avoiding costs, and enhancing the organization's decision to move toward positive patient identification (PPID). Methods: At Texas Children's Hospital, data were collected and analyzed from January 2014 through August 2014 in the pharmacy areas in which the workflow management system would be implemented. Data were excluded for September 2014 during the workflow management system oral liquid implementation phase. Data were collected and analyzed from October 2014 through June 2015 to determine whether the implementation of the workflow management system reduced the quantity and frequency of reported medication errors. Data collected and analyzed during the study period included the quantity of doses prepared, number of incorrect medication scans, number of doses discontinued from the workflow management system queue, and the number of doses rejected. Data were collected and analyzed to identify patterns of incorrect medication scans, to determine reasons for rejected medication doses, and to determine the reduction in wasted medications. Results: During the 17-month study period, the pharmacy department dispensed 1,506,220 oral liquid and injectable medication doses. From October 2014 through June 2015, the pharmacy department dispensed 826,220 medication doses that were prepared and checked via the workflow management system. 
Of those 826,220 medication doses, there were 16 reported incorrect volume errors. The error rate after the implementation of the workflow management system averaged 8.4%, which was a 1.6% reduction. After the implementation of the workflow management system, the average number of reported oral liquid medication and injectable medication errors decreased to 0.4 and 0.2 times per week, respectively. Conclusion: The organization was able to achieve its purpose and goal of improving the provision of quality pharmacy care through optimal medication use and safety by reducing medication preparation errors. Error rates decreased and the workflow processes were streamlined, which has led to seamless operations within the pharmacy department. There has been significant cost avoidance and waste reduction and enhanced interdepartmental satisfaction due to the reduction of reported medication errors.

  17. An adaptive modeling and simulation environment for combined-cycle data reconciliation and degradation estimation

    NASA Astrophysics Data System (ADS)

    Lin, Tsungpo

    Performance engineers face a major challenge in modeling and simulation of after-market power systems due to system degradation and measurement errors. Currently, most of the power generation industry uses deterministic data matching to calibrate models and cascade system degradation, which introduces significant calibration uncertainty and risk when providing performance guarantees. In this research work, maximum-likelihood-based simultaneous data reconciliation and model calibration (SDRMC) is used for power system modeling and simulation. Replacing the current deterministic data matching with SDRMC reduces the calibration uncertainty and mitigates error propagation into the performance simulation. A modeling and simulation environment for a complex power system with known degradation has been developed. In this environment, multiple data sets are imported when carrying out simultaneous data reconciliation and model calibration. Calibration uncertainties are estimated through error analysis and propagated to the performance simulation using the principle of error propagation. System degradation is then quantified by comparing the performance of the calibrated model with its expected new-and-clean status. To mitigate smearing effects caused by gross errors, gross error detection (GED) is carried out in two stages. The first is a screening stage, in which serious gross errors are eliminated in advance; the GED techniques used here are based on multivariate data analysis (MDA), including multivariate data visualization and principal component analysis (PCA). Subtle gross errors are treated in the second stage, which employs serial bias compensation or a robust M-estimator. To achieve better efficiency in the combined scheme of least-squares-based data reconciliation and hypothesis-testing-based GED, the Levenberg-Marquardt (LM) algorithm is utilized as the optimizer. 
To reduce the computation time and stabilize the problem solving for a complex power system such as a combined cycle power plant, meta-modeling using the response surface equation (RSE) and system/process decomposition are incorporated with the simultaneous scheme of SDRMC. The goal of this research work is to reduce the calibration uncertainties and, thus, the risks of providing performance guarantees arisen from uncertainties in performance simulation.
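The linear, noise-only special case of data reconciliation has a closed form that illustrates the idea behind SDRMC: adjust measurements as little as possible (in the inverse-covariance metric) so they satisfy physical balances exactly. The full method above is nonlinear, LM-optimized, and calibrates model parameters simultaneously; none of that is shown here:

```python
import numpy as np

def reconcile(y, Sigma, A):
    """Weighted least-squares data reconciliation: find x closest to the
    measurements y (in the Sigma^-1 metric) subject to linear balances
    A x = 0. Closed form: x = y - Sigma A^T (A Sigma A^T)^-1 A y."""
    ASA = A @ Sigma @ A.T
    return y - Sigma @ A.T @ np.linalg.solve(ASA, A @ y)

# Toy flow network: stream 1 splits into streams 2 and 3 (x1 - x2 - x3 = 0).
A = np.array([[1.0, -1.0, -1.0]])
y = np.array([10.3, 4.0, 6.1])        # noisy measurements violate the balance
Sigma = np.diag([0.04, 0.01, 0.02])   # measurement variances
x = reconcile(y, Sigma, A)
```

Note how the correction is distributed in proportion to each measurement's variance: the least-trusted measurement absorbs the largest adjustment.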

  18. Optimal subsystem approach to multi-qubit quantum state discrimination and experimental investigation

    NASA Astrophysics Data System (ADS)

    Xue, ShiChuan; Wu, JunJie; Xu, Ping; Yang, XueJun

    2018-02-01

    Quantum computing offers computing capability superior to classical computing because of superposition. Distinguishing the quantum states produced as outputs of a quantum algorithm is often a vital computational task. In most cases the states are non-orthogonal due to superposition, and quantum mechanics forbids discriminating non-orthogonal states perfectly with any measurement, forcing repeated measurement. Hence, it is important to determine the optimal measuring method, one requiring fewer repetitions and yielding a lower error rate. However, extending current measurement approaches, aimed mainly at quantum cryptography, to multi-qubit situations for quantum computing faces challenges such as the need for global operations, which carry considerable cost in the experimental realm. Therefore, in this study, we have proposed an optimal subsystem method to avoid these difficulties. We compare the reduced subsystem method with the global minimum-error method for two-qubit problems, and the conclusions have been verified experimentally. The results showed that the subsystem method could effectively discriminate non-orthogonal two-qubit states, such as separable states, entangled pure states, and mixed states; the cost of the experimental process was significantly reduced, in most circumstances with an acceptable error rate. We believe the optimal subsystem method is a valuable and promising approach for multi-qubit quantum computing applications.
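The global minimum-error benchmark against which the subsystem method is compared is, for two states, given by the Helstrom bound. A small numerical check for a pair of non-orthogonal qubit states (the states and priors below are illustrative, not the paper's two-qubit test cases):

```python
import numpy as np

def helstrom_error(rho0, rho1, p0=0.5):
    """Minimum achievable error probability for discriminating density
    matrices rho0, rho1 with priors p0 and 1 - p0 (Helstrom bound):
    P_err = (1 - ||p0*rho0 - (1-p0)*rho1||_1) / 2."""
    gamma = p0 * rho0 - (1 - p0) * rho1
    trace_norm = np.abs(np.linalg.eigvalsh(gamma)).sum()
    return 0.5 * (1 - trace_norm)

# Two equiprobable non-orthogonal pure qubit states |0> and |+>.
ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
rho0 = np.outer(ket0, ket0)
rho1 = np.outer(ketp, ketp)
p_err = helstrom_error(rho0, rho1)

# For equiprobable pure states: P_err = (1 - sqrt(1 - |<a|b>|^2)) / 2.
expected = 0.5 * (1 - np.sqrt(1 - 0.5))
```

A subsystem (local) measurement generally cannot reach this bound, which is the trade-off between error rate and experimental cost that the paper analyzes.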

  19. A New Font, Specifically Designed for Peripheral Vision, Improves Peripheral Letter and Word Recognition, but Not Eye-Mediated Reading Performance

    PubMed Central

    Bernard, Jean-Baptiste; Aguilar, Carlos; Castet, Eric

    2016-01-01

    Reading speed is dramatically reduced when readers cannot use their central vision. This is because low visual acuity and crowding negatively impact letter recognition in the periphery. In this study, we designed a new font (referred to as the Eido font) in order to reduce inter-letter similarity and consequently to increase peripheral letter recognition performance. We tested this font by running five experiments that compared the Eido font with the standard Courier font. Letter spacing and x-height were identical for the two monospaced fonts. Six normally-sighted subjects used exclusively their peripheral vision to run two aloud reading tasks (with eye movements), a letter recognition task (without eye movements), a word recognition task (without eye movements) and a lexical decision task. Results show that reading speed was not significantly different between the Eido and the Courier font when subjects had to read single sentences with a round simulated gaze-contingent central scotoma (10° diameter). In contrast, Eido significantly decreased perceptual errors in peripheral crowded letter recognition (-30% errors on average for letters briefly presented at 6° eccentricity) and in peripheral word recognition (-32% errors on average for words briefly presented at 6° eccentricity). PMID:27074013

  20. A New Font, Specifically Designed for Peripheral Vision, Improves Peripheral Letter and Word Recognition, but Not Eye-Mediated Reading Performance.

    PubMed

    Bernard, Jean-Baptiste; Aguilar, Carlos; Castet, Eric

    2016-01-01

    Reading speed is dramatically reduced when readers cannot use their central vision. This is because low visual acuity and crowding negatively impact letter recognition in the periphery. In this study, we designed a new font (referred to as the Eido font) in order to reduce inter-letter similarity and consequently to increase peripheral letter recognition performance. We tested this font by running five experiments that compared the Eido font with the standard Courier font. Letter spacing and x-height were identical for the two monospaced fonts. Six normally-sighted subjects used exclusively their peripheral vision to run two aloud reading tasks (with eye movements), a letter recognition task (without eye movements), a word recognition task (without eye movements) and a lexical decision task. Results show that reading speed was not significantly different between the Eido and the Courier font when subjects had to read single sentences with a round simulated gaze-contingent central scotoma (10° diameter). In contrast, Eido significantly decreased perceptual errors in peripheral crowded letter recognition (-30% errors on average for letters briefly presented at 6° eccentricity) and in peripheral word recognition (-32% errors on average for words briefly presented at 6° eccentricity).

  1. Does manipulating the speed of visual flow in virtual reality change distance estimation while walking in Parkinson's disease?

    PubMed

    Ehgoetz Martens, Kaylena A; Ellard, Colin G; Almeida, Quincy J

    2015-03-01

    Although dopaminergic replacement therapy is believed to improve sensory processing in PD, and delayed perceptual speed is thought to be caused by a predominantly cholinergic deficit, it is unclear whether sensory-perceptual deficits are a result of corrupt sensory processing or of a delay in updating perceived feedback during movement. The current study aimed to examine these two hypotheses by manipulating visual flow speed and dopaminergic medication to examine which influenced distance estimation in PD. Fourteen PD and sixteen HC participants were instructed to estimate the distance of a remembered target by walking to the position the target formerly occupied. This task was completed in virtual reality in order to manipulate the visual flow (VF) speed in real time. Three conditions were carried out: (1) BASELINE: VF speed was equal to participants' real-time movement speed; (2) SLOW: VF speed was reduced by 50%; (3) FAST: VF speed was increased by 30%. Individuals with PD performed the experiment in their ON and OFF state. PD demonstrated significantly greater judgement error during BASELINE and FAST conditions compared to HC, although PD did not improve their judgement error during the SLOW condition. Additionally, PD had greater variable error during baseline compared to HC; however, during the SLOW condition, PD had significantly less variable error compared to baseline and similar variable error to HC participants. Overall, dopaminergic medication did not significantly influence judgement error. Therefore, these results suggest that corrupt processing of sensory information, rather than delayed updating of sensory feedback, is the main contributor to sensory-perceptual deficits during movement in PD.

  2. An observational study of drug administration errors in a Malaysian hospital (study of drug administration errors).

    PubMed

    Chua, S S; Tea, M H; Rahman, M H A

    2009-04-01

    Drug administration errors were the second most frequent type of medication error, after prescribing errors, but the latter were often intercepted; hence, administration errors were more likely to reach the patients. Therefore, this study was conducted to determine the frequency and types of drug administration errors in a Malaysian hospital ward. This is a prospective study that involved direct, undisguised observations of drug administrations in a hospital ward. A researcher was stationed in the ward under study for 15 days to observe all drug administrations, which were recorded in a data collection form and then compared with the drugs prescribed for the patient. A total of 1118 opportunities for errors were observed and 127 administrations had errors. This gave an error rate of 11.4% [95% confidence interval (CI) 9.5-13.3]. If incorrect time errors were excluded, the error rate was reduced to 8.7% (95% CI 7.1-10.4). The most common type of drug administration error was incorrect time (25.2%), followed by incorrect technique of administration (16.3%) and unauthorized drug errors (14.1%). In terms of clinical significance, 10.4% of the administration errors were considered potentially life-threatening. Intravenous routes were more likely to be associated with an administration error than oral routes (21.3% vs. 7.9%, P < 0.001). The study indicates that the frequency of drug administration errors in developing countries such as Malaysia is similar to that in developed countries. Incorrect time errors were also the most common type of drug administration error. A non-punitive system of reporting medication errors should be established to encourage more information to be documented so that risk management protocols can be developed and implemented.
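The reported rate of 11.4% (95% CI 9.5-13.3) can be approximately reproduced with a normal-approximation (Wald) interval on 127 errors out of 1118 opportunities. The original authors' exact interval method is not stated, so small rounding differences at the upper limit remain:

```python
import math

def wald_ci(errors, opportunities, z=1.96):
    """Normal-approximation (Wald) confidence interval for an error rate."""
    p = errors / opportunities
    half = z * math.sqrt(p * (1 - p) / opportunities)
    return p, p - half, p + half

# Figures from the abstract: 127 errors in 1118 observed opportunities.
p, lo, hi = wald_ci(127, 1118)
```

This yields roughly 11.4% (9.5-13.2), matching the abstract's interval to within rounding.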

  3. SU-F-E-02: A Feasibility Study for Application of Metal Artifact Reduction Techniques in MR-Guided Brachytherapy Gynecological Cancer with Titanium Applicators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kadbi, M

    Purpose: Utilization of Titanium Tandem and Ring (T&R) applicators in MR-guided brachytherapy has become widespread for gynecological cancer treatment. However, Titanium causes magnetic field disturbance and susceptibility artifact, which complicate image interpretation. In this study, metal artifact reduction techniques were employed to improve the image quality and reduce the metal-related artifacts. Methods: Several techniques were employed to reduce the metal artifact caused by the titanium T&R applicator. These techniques include the Metal Artifact Reduction Sequence (MARS), View Angle Tilting (VAT) to correct in-plane distortion, and Slice Encoding for Metal Artifact Correction (SEMAC) for through-plane artifact correction. Moreover, MARS can be combined with VAT to further reduce the in-plane artifact by reapplying the selection gradients during the readout (MARS+VAT). SEMAC uses a slice-selective excitation but acquires additional z-encodings in order to resolve off-resonant signal and to reduce through-plane distortions. Results: Comparison between the clinical sequences revealed that increasing the bandwidth reduces the error in the measured diameter of the T&R. However, the error is larger than 4 mm for the best case with the highest bandwidth and spatial resolution. MARS+VAT with isotropic resolution of 1 mm reduced the error to 1.9 mm, the least among the examined 2D sequences. The measured diameter of the tandem from SEMAC+VAT has the closest value to the actual diameter of the tandem (3.2 mm), and the error was reduced to less than 1 mm. In addition, SEMAC+VAT significantly reduces the blooming artifact in the ring compared to clinical sequences. Conclusion: A higher bandwidth and spatial resolution sequence reduces the artifact and the measured diameter of the applicator with a slight compromise in SNR. Metal artifact reduction sequences decrease the distortion associated with the titanium applicator. 
SEMAC in combination with VAT revealed promising results for titanium imaging and can be utilized for MR-guided brachytherapy in gynecological cancer. The author is an employee of Philips Healthcare.

  4. Assessing Error Awareness as a Mediator of the Relationship between Subjective Concerns and Cognitive Performance in Older Adults

    PubMed Central

    Buckley, Rachel F.; Laming, Gemma; Chen, Li Peng Evelyn; Crole, Alice; Hester, Robert

    2016-01-01

    Objectives Subjective concerns of cognitive decline (SCD) often manifest in older adults who exhibit objectively normal cognitive functioning. This subjective-objective discrepancy is counter-intuitive when mounting evidence suggests that subjective concerns relate to future clinical progression to Alzheimer’s disease, and so possess the potential to be a sensitive early behavioural marker of disease. In the current study, we aimed to determine whether individual variability in conscious awareness of errors in daily life might mediate this subjective-objective relationship. Methods 67 cognitively-normal older adults underwent cognitive, SCD and mood tests, and an error awareness task. Results Poorer error awareness was not found to mediate a relationship between SCD and objective performance. Furthermore, non-clinical levels of depressive symptomatology were a primary driving factor of SCD and error awareness, and significantly mediated a relationship between the two. Discussion We were unable to show that poorer error awareness mediates SCD and cognitive performance in older adults. Our study does suggest, however, that underlying depressive symptoms influence both poorer error awareness and greater SCD severity. Error awareness is thus not recommended as a proxy for SCD, as reduced levels of error awareness do not seem to be reflected by greater SCD. PMID:27832173
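A mediation analysis of the kind reported above is often estimated as the product of the x→m and m→y (given x) regression coefficients. A minimal sketch on synthetic data with a genuine mediated path; the variable names, sample size, and effect sizes are made up, not the study's 67 participants:

```python
import numpy as np

rng = np.random.default_rng(1)

def indirect_effect(x, m, y):
    """Product-of-coefficients mediation estimate: a (x -> m) times
    b (m -> y, controlling for x)."""
    X1 = np.column_stack([np.ones_like(x), x])
    a = np.linalg.lstsq(X1, m, rcond=None)[0][1]
    X2 = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X2, y, rcond=None)[0][2]
    return a * b

# Synthetic data: x -> m -> y plus a small direct x -> y path.
n = 2000
x = rng.normal(size=n)
m = 0.6 * x + rng.normal(size=n)             # a = 0.6
y = 0.5 * m + 0.2 * x + rng.normal(size=n)   # b = 0.5, direct effect 0.2
ab = indirect_effect(x, m, y)                # true indirect effect = 0.3
```

In practice a bootstrap confidence interval on `ab` is used to test whether the mediated path is significant, as in the depressive-symptom mediation reported above.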

  5. Coordinate alignment of combined measurement systems using a modified common points method

    NASA Astrophysics Data System (ADS)

    Zhao, G.; Zhang, P.; Xiao, W.

    2018-03-01

    Coordinate metrology has been extensively researched for its outstanding advantages in measurement range and accuracy. The alignment of different measurement systems is usually achieved by registering local coordinates via common points before measurement. Alignment errors accumulate and significantly reduce global accuracy, and thus need to be minimized. In this work, a modified common points method (MCPM) is proposed to combine the traceable system errors of the cooperating machines and to optimize global accuracy by introducing mutual geometric constraints. The geometric constraints, obtained by measuring the common points in the individual local coordinate systems, make it possible to reduce the local measuring uncertainty and thereby enhance global measuring accuracy. A simulation system was developed in Matlab to analyze the behaviour of MCPM using the Monte Carlo method. An exemplary setup combining a laser tracker and an indoor iGPS system was constructed to verify the feasibility and efficiency of the proposed method. Experimental results show that MCPM can significantly improve alignment accuracy.
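The unmodified common-points baseline that MCPM builds on is the least-squares rigid registration of matched point sets, commonly solved with the Kabsch (SVD) algorithm. A minimal sketch; MCPM's mutual geometric constraints and error combination are not implemented here:

```python
import numpy as np

def common_points_align(P, Q):
    """Least-squares rigid transform (R, t) mapping local points P onto
    global points Q via matched common points (Kabsch algorithm)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Toy check: recover a known rotation and translation from 4 common points.
rng = np.random.default_rng(2)
P = rng.random((4, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
Q = P @ R_true.T + t_true
R, t = common_points_align(P, Q)
err = np.abs(Q - (P @ R.T + t)).max()
```

With noisy common points the same formula gives the least-squares fit, and the residuals at the common points are what an approach like MCPM seeks to constrain and reduce.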

  6. Evaluating a medical error taxonomy.

    PubMed

    Brixey, Juliana; Johnson, Todd R; Zhang, Jiajie

    2002-01-01

    Healthcare has been slow to apply human factors principles to reduce medical errors. The Center for Devices and Radiological Health (CDRH) recognizes that a lack of attention to human factors during product development may lead to errors that have the potential for patient injury, or even death. In response to the need to reduce medication errors, the National Coordinating Council for Medication Error Reporting and Prevention (NCC MERP) released a taxonomy that provides a standard language for reporting medication errors. This project maps the NCC MERP taxonomy of medication errors to MedWatch reports of medical errors involving infusion pumps. Of particular interest are human factors associated with medical device errors. The NCC MERP taxonomy is limited in mapping information from MedWatch because of its focus on the medical device and the format of reporting.

  7. Characteristics of the BDS Carrier Phase Multipath and Its Mitigation Methods in Relative Positioning

    PubMed Central

    Dai, Wujiao; Shi, Qiang; Cai, Changsheng

    2017-01-01

    The carrier phase multipath effect is one of the most significant error sources in the precise positioning of the BeiDou Navigation Satellite System (BDS). We analyzed the characteristics of BDS multipath, and found the multipath errors of geostationary earth orbit (GEO) satellite signals are systematic, whereas those of inclined geosynchronous orbit (IGSO) or medium earth orbit (MEO) satellites are both systematic and random. Modified multipath mitigation methods, including a sidereal filtering algorithm and a multipath hemispherical map (MHM) model, were used to improve BDS dynamic deformation monitoring. The results indicate that the sidereal filtering methods can reduce the root mean square (RMS) of positioning errors in the east, north and vertical coordinate directions by 15%, 37%, 25% and 18%, 51%, 27% in the coordinate and observation domains, respectively. By contrast, the MHM method can reduce the RMS by 22%, 52% and 27% on average. In addition, the BDS multipath errors in static baseline solutions are a few centimeters in multipath-rich environments, which is different from the behavior of Global Positioning System (GPS) multipath. Therefore, we add a parameter representing the GEO multipath error to the observation equation of the adjustment model to improve the precision of BDS static baseline solutions. The results show that the modified model can achieve an average precision improvement of 82%, 54% and 68% in the east, north and up coordinate directions, respectively. PMID:28387744

  8. Characteristics of the BDS Carrier Phase Multipath and Its Mitigation Methods in Relative Positioning.

    PubMed

    Dai, Wujiao; Shi, Qiang; Cai, Changsheng

    2017-04-07

    The carrier phase multipath effect is one of the most significant error sources in the precise positioning of the BeiDou Navigation Satellite System (BDS). We analyzed the characteristics of BDS multipath, and found the multipath errors of geostationary earth orbit (GEO) satellite signals are systematic, whereas those of inclined geosynchronous orbit (IGSO) or medium earth orbit (MEO) satellites are both systematic and random. Modified multipath mitigation methods, including a sidereal filtering algorithm and a multipath hemispherical map (MHM) model, were used to improve BDS dynamic deformation monitoring. The results indicate that the sidereal filtering methods can reduce the root mean square (RMS) of positioning errors in the east, north and vertical coordinate directions by 15%, 37%, 25% and 18%, 51%, 27% in the coordinate and observation domains, respectively. By contrast, the MHM method can reduce the RMS by 22%, 52% and 27% on average. In addition, the BDS multipath errors in static baseline solutions are a few centimeters in multipath-rich environments, which is different from the behavior of Global Positioning System (GPS) multipath. Therefore, we add a parameter representing the GEO multipath error to the observation equation of the adjustment model to improve the precision of BDS static baseline solutions. The results show that the modified model can achieve an average precision improvement of 82%, 54% and 68% in the east, north and up coordinate directions, respectively.
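    The coordinate-domain sidereal filtering idea described above can be sketched on synthetic data. The sampling rate, noise levels, smoothing window, and the day-aligned (zero-shift) repeat assumption below are illustrative, not the paper's processing chain:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic 1 Hz coordinate residuals: a repeating, geometry-driven
    # multipath signature (systematic) plus random measurement noise.
    n = 3600
    t = np.arange(n)
    multipath = 0.01 * np.sin(2 * np.pi * t / 300.0)   # meters, systematic
    day1 = multipath + rng.normal(0, 0.002, n)         # reference day
    day2 = multipath + rng.normal(0, 0.002, n)         # day to be corrected

    # Coordinate-domain sidereal filtering: smooth the previous day's residuals
    # to build a multipath template, then subtract it from the current day.
    # (For GEO/IGSO the repeat period is close to one day; with day-aligned
    # epochs the time shift is taken as zero here for simplicity.)
    kernel = np.ones(31) / 31.0
    template = np.convolve(day1, kernel, mode="same")  # smoothed multipath model
    corrected = day2 - template

    rms = lambda x: np.sqrt(np.mean(x**2))
    print(f"RMS before: {rms(day2) * 1000:.2f} mm, after: {rms(corrected) * 1000:.2f} mm")
    ```

    Because the systematic part repeats between days while the noise does not, the template removes mostly the multipath, which is why the RMS drops.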

  9. An error analysis perspective for patient alignment systems.

    PubMed

    Figl, Michael; Kaar, Marcus; Hoffman, Rainer; Kratochwil, Alfred; Hummel, Johann

    2013-09-01

    This paper analyses the effects of error sources which can be found in patient alignment systems. As an example, an ultrasound (US) repositioning system and its transformation chain are assessed. The findings of this concept can also be applied to any navigation system. In a first step, all error sources were identified and where applicable, corresponding target registration errors were computed. By applying error propagation calculations on these commonly used registration/calibration and tracking errors, we were able to analyse the components of the overall error. Furthermore, we defined a special situation where the whole registration chain reduces to the error caused by the tracking system. Additionally, we used a phantom to evaluate the errors arising from the image-to-image registration procedure, depending on the image metric used. We have also discussed how this analysis can be applied to other positioning systems such as Cone Beam CT-based systems or Brainlab's ExacTrac. The estimates found by our error propagation analysis are in good agreement with the numbers found in the phantom study but significantly smaller than results from patient evaluations. We probably underestimated human influences such as the US scan head positioning by the operator and tissue deformation. Rotational errors of the tracking system can multiply these errors, depending on the relative position of tracker and probe. We were able to analyse the components of the overall error of a typical patient positioning system. We consider this to be a contribution to the optimization of the positioning accuracy for computer guidance systems.
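    The quadrature combination of independent error components that underlies such an error-propagation analysis can be illustrated with an assumed error budget. The component names and magnitudes below are hypothetical, not values from the paper:

    ```python
    import math

    # Illustrative RMS error budget (mm) for a tracked-probe registration
    # chain; the terms are assumed independent, so they add in quadrature.
    budget = {
        "tracker_fiducial": 0.3,
        "probe_calibration": 0.5,
        "image_to_image_registration": 0.8,
        "patient_to_image_registration": 0.6,
    }

    # Combined RMS of the whole chain: sqrt of the sum of squared components.
    total_rms = math.sqrt(sum(e**2 for e in budget.values()))
    print(f"combined RMS error: {total_rms:.2f} mm")
    ```

    In the special situation the abstract mentions, where the registration chain reduces to the tracking system alone, all other terms drop out and the combined RMS collapses to that single component.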

  10. Selective Weighted Least Squares Method for Fourier Transform Infrared Quantitative Analysis.

    PubMed

    Wang, Xin; Li, Yan; Wei, Haoyun; Chen, Xia

    2017-06-01

    Classical least squares (CLS) regression is a popular multivariate statistical method used frequently for quantitative analysis with Fourier transform infrared (FT-IR) spectrometry. Classical least squares provides the best unbiased estimator for uncorrelated residual errors with zero mean and equal variance. However, the noise in FT-IR spectra, which accounts for a large portion of the residual errors, is heteroscedastic. Thus, if this zero-mean noise dominates the residual errors, the weighted least squares (WLS) regression method described in this paper is a better estimator than CLS. However, if bias errors, such as the residual baseline error, are significant, WLS may perform worse than CLS. In this paper, we compare the effects of noise and bias error when using CLS and WLS in quantitative analysis. Results indicated that for wavenumbers with low absorbance, the bias error significantly affected the total error, such that the performance of CLS is better than that of WLS. However, for wavenumbers with high absorbance, the noise significantly affected the total error, and WLS proves to be better than CLS. Thus, we propose a selective weighted least squares (SWLS) regression that processes data at different wavenumbers using either CLS or WLS based on a selection criterion, i.e., lower or higher than an absorbance threshold. The effects of various factors on the optimal threshold value (OTV) for SWLS have been studied through numerical simulations. These studies showed that: (1) the concentration and the analyte type had minimal effect on the OTV; and (2) the major factor influencing the OTV is the ratio between the bias error and the standard deviation of the noise. The last part of this paper is dedicated to quantitative analysis of methane gas spectra and methane/toluene gas mixture spectra measured using FT-IR spectrometry with CLS, WLS, and SWLS. The standard error of prediction (SEP), bias of prediction (bias), and residual sum of squares of the errors (RSS) from the three quantitative analyses were compared. In methane gas analysis, SWLS yielded the lowest SEP and RSS among the three methods. In methane/toluene mixture analysis, a modification of SWLS is presented to tackle the bias error from other components. SWLS without the modification gives the lowest SEP in all cases, but not the lowest bias and RSS. The modified SWLS reduced the bias and yielded a lower RSS than CLS, especially for small components.
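    The CLS/WLS selection rule at the heart of SWLS can be sketched on a synthetic single-component spectrum. The Beer-Lambert setup, noise model, bias magnitude, and threshold below are illustrative assumptions, not the paper's values; here the two channel groups are simply fitted separately to show the selection idea:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic single-component data, absorbance A = k * c, with the two
    # error types the abstract discusses: heteroscedastic zero-mean noise
    # and a constant residual baseline (bias).
    k = rng.uniform(0.01, 1.0, 200)            # pure-component spectrum (200 channels)
    c_true = 0.8
    noise_sd = 0.002 * (1.0 + 10.0 * k)        # noise grows with absorbance (assumed)
    y = k * c_true + 0.003 + rng.normal(0.0, noise_sd)  # 0.003 = baseline bias

    def ls_fit(k, y, w):
        """Weighted least-squares estimate of c in y ≈ k*c (w = 1 gives CLS)."""
        return float(np.sum(w * k * y) / np.sum(w * k * k))

    # Selection rule: channels below an absorbance threshold (bias-dominated)
    # are handled with CLS; channels above it (noise-dominated) with WLS.
    threshold = 0.1
    low = k * c_true < threshold
    c_cls_low = ls_fit(k[low], y[low], np.ones(low.sum()))
    c_wls_high = ls_fit(k[~low], y[~low], 1.0 / noise_sd[~low] ** 2)
    print(f"CLS (low-A channels): {c_cls_low:.3f}  WLS (high-A channels): {c_wls_high:.3f}")
    ```

    With inverse-variance weights the noisy high-absorbance channels no longer dominate the fit, which is the advantage WLS offers when noise, rather than bias, controls the residuals.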

  11. Long Term Mean Local Time of the Ascending Node Prediction

    NASA Technical Reports Server (NTRS)

    McKinley, David P.

    2007-01-01

    Significant error has been observed in the long term prediction of the Mean Local Time of the Ascending Node on the Aqua spacecraft. This error of approximately 90 seconds over a two year prediction is a complication in planning and timing of maneuvers for all members of the Earth Observing System Afternoon Constellation, which use Aqua's MLTAN as the reference for their inclination maneuvers. It was determined that the source of the prediction error was the lack of a solid Earth tide model in the operational force models. The Love Model of the solid Earth tide potential was used to derive analytic corrections to the inclination and right ascension of the ascending node of Aqua's Sun-synchronous orbit. Additionally, it was determined that the resonance between the Sun and orbit plane of the Sun-synchronous orbit is the primary driver of this error. The analytic corrections have been added to the operational force models for the Aqua spacecraft reducing the two-year 90-second error to less than 7 seconds.

  12. Impact of Tropospheric Aerosol Absorption on Ozone Retrieval from buv Measurements

    NASA Technical Reports Server (NTRS)

    Torres, O.; Bhartia, P. K.

    1998-01-01

    The impact of tropospheric aerosols on the retrieval of column ozone amounts using spaceborne measurements of backscattered ultraviolet radiation is examined. Using radiative transfer calculations, we show that UV-absorbing desert dust may introduce errors as large as 10% in ozone column amount, depending on the aerosol layer height and optical depth. Smaller errors are produced by carbonaceous aerosols that result from biomass burning. Though the error is produced by complex interactions between ozone absorption (both stratospheric and tropospheric), aerosol scattering, and aerosol absorption, a surprisingly simple correction procedure reduces the error to about 1%, for a variety of aerosols and for a wide range of aerosol loading. Comparison of the corrected TOMS data with operational data indicates that though the zonal mean total ozone derived from TOMS is not significantly affected by these errors, localized effects in the tropics can be large enough to seriously affect the tropospheric ozone studies currently under way using TOMS data.

  13. Scaling depth-induced wave-breaking in two-dimensional spectral wave models

    NASA Astrophysics Data System (ADS)

    Salmon, J. E.; Holthuijsen, L. H.; Zijlema, M.; van Vledder, G. Ph.; Pietrzak, J. D.

    2015-03-01

    Wave breaking in shallow water is still poorly understood and needs to be better parameterized in 2D spectral wave models. Significant wave heights over horizontal bathymetries are typically under-predicted in locally generated wave conditions and over-predicted in non-locally generated conditions. A joint scaling dependent on both local bottom slope and normalized wave number is presented and is shown to resolve these issues. Compared to the 12 wave breaking parameterizations considered in this study, this joint scaling demonstrates significant improvements, up to ∼50% error reduction, over 1D horizontal bathymetries for both locally and non-locally generated waves. In order to account for the inherent differences between uni-directional (1D) and directionally spread (2D) wave conditions, an extension of the wave breaking dissipation models is presented. By including the effects of wave directionality, rms-errors for the significant wave height are reduced for the best performing parameterizations in conditions with strong directional spreading. With this extension, our joint scaling improves modeling skill for significant wave heights over a verification data set of 11 different 1D laboratory bathymetries, 3 shallow lakes and 4 coastal sites. The corresponding averaged normalized rms-error for significant wave height in the 2D cases varied between 8% and 27%. In comparison, using the default setting with a constant scaling, as used in most presently operating 2D spectral wave models, gave equivalent errors between 15% and 38%.

  14. Method of grid generation

    DOEpatents

    Barnette, Daniel W.

    2002-01-01

    The present invention provides a method of grid generation that uses the geometry of the problem space and the governing relations to generate a grid. The method can generate a grid with minimized discretization errors, and with minimal user interaction. The method of the present invention comprises assigning grid cell locations so that, when the governing relations are discretized using the grid, at least some of the discretization errors are substantially zero. Conventional grid generation is driven by the problem space geometry; grid generation according to the present invention is driven by problem space geometry and by governing relations. The present invention accordingly can provide two significant benefits: more efficient and accurate modeling since discretization errors are minimized, and reduced cost grid generation since less human interaction is required.

  15. Optimized tomography of continuous variable systems using excitation counting

    NASA Astrophysics Data System (ADS)

    Shen, Chao; Heeres, Reinier W.; Reinhold, Philip; Jiang, Luyao; Liu, Yi-Kai; Schoelkopf, Robert J.; Jiang, Liang

    2016-11-01

    We propose a systematic procedure to optimize quantum state tomography protocols for continuous variable systems based on excitation counting preceded by a displacement operation. Compared with conventional tomography based on Husimi or Wigner function measurement, the excitation counting approach can significantly reduce the number of measurement settings. We investigate both informational completeness and robustness, and provide a bound of reconstruction error involving the condition number of the sensing map. We also identify the measurement settings that optimize this error bound, and demonstrate that the improved reconstruction robustness can lead to an order-of-magnitude reduction of estimation error with given resources. This optimization procedure is general and can incorporate prior information of the unknown state to further simplify the protocol.

  16. Post-error action control is neurobehaviorally modulated under conditions of constant speeded response.

    PubMed

    Soshi, Takahiro; Ando, Kumiko; Noda, Takamasa; Nakazawa, Kanako; Tsumura, Hideki; Okada, Takayuki

    2014-01-01

    Post-error slowing (PES) is an error recovery strategy that contributes to action control, and occurs after errors in order to prevent future behavioral flaws. Error recovery often malfunctions in clinical populations, but the relationship between behavioral traits and recovery from error is unclear in healthy populations. The present study investigated the relationship between impulsivity and error recovery by simulating a speeded response situation using a Go/No-go paradigm that forced the participants to constantly make accelerated responses prior to stimulus disappearance (stimulus duration: 250 ms). Neural correlates of post-error processing were examined using event-related potentials (ERPs). Impulsivity traits were measured with self-report questionnaires (BIS-11, BIS/BAS). Behavioral results demonstrated that the commission error for No-go trials was 15%, but PES did not take place immediately. Delayed PES was negatively correlated with error rates and impulsivity traits, showing that response slowing was associated with reduced error rates and varied with impulsivity. Response-locked error ERPs were clearly observed for the error trials. Contrary to previous studies, error ERPs were not significantly related to PES. Stimulus-locked N2 was negatively correlated with PES and positively correlated with impulsivity traits at the second post-error Go trial: larger N2 activity was associated with greater PES and less impulsivity. In summary, under constant speeded conditions, error monitoring was dissociated from post-error action control, and PES did not occur quickly. Furthermore, PES and its neural correlate (N2) were modulated by impulsivity traits. These findings suggest that there may be clinical and practical efficacy of maintaining cognitive control of actions during error recovery under common daily environments that frequently evoke impulsive behaviors.

  17. Post-error action control is neurobehaviorally modulated under conditions of constant speeded response

    PubMed Central

    Soshi, Takahiro; Ando, Kumiko; Noda, Takamasa; Nakazawa, Kanako; Tsumura, Hideki; Okada, Takayuki

    2015-01-01

    Post-error slowing (PES) is an error recovery strategy that contributes to action control, and occurs after errors in order to prevent future behavioral flaws. Error recovery often malfunctions in clinical populations, but the relationship between behavioral traits and recovery from error is unclear in healthy populations. The present study investigated the relationship between impulsivity and error recovery by simulating a speeded response situation using a Go/No-go paradigm that forced the participants to constantly make accelerated responses prior to stimulus disappearance (stimulus duration: 250 ms). Neural correlates of post-error processing were examined using event-related potentials (ERPs). Impulsivity traits were measured with self-report questionnaires (BIS-11, BIS/BAS). Behavioral results demonstrated that the commission error for No-go trials was 15%, but PES did not take place immediately. Delayed PES was negatively correlated with error rates and impulsivity traits, showing that response slowing was associated with reduced error rates and varied with impulsivity. Response-locked error ERPs were clearly observed for the error trials. Contrary to previous studies, error ERPs were not significantly related to PES. Stimulus-locked N2 was negatively correlated with PES and positively correlated with impulsivity traits at the second post-error Go trial: larger N2 activity was associated with greater PES and less impulsivity. In summary, under constant speeded conditions, error monitoring was dissociated from post-error action control, and PES did not occur quickly. Furthermore, PES and its neural correlate (N2) were modulated by impulsivity traits. These findings suggest that there may be clinical and practical efficacy of maintaining cognitive control of actions during error recovery under common daily environments that frequently evoke impulsive behaviors. PMID:25674058

  18. Impact of electronic chemotherapy order forms on prescribing errors at an urban medical center: results from an interrupted time-series analysis.

    PubMed

    Elsaid, K; Truong, T; Monckeberg, M; McCarthy, H; Butera, J; Collins, C

    2013-12-01

    To evaluate the impact of electronic standardized chemotherapy templates on the incidence and types of prescribing errors. A quasi-experimental interrupted time series with segmented regression. A 700-bed multidisciplinary tertiary care hospital with an ambulatory cancer center. A multidisciplinary team including oncology physicians, nurses, pharmacists and information technologists. Standardized, regimen-specific chemotherapy prescribing forms were developed and implemented over a 32-month period. Trend of monthly prevented prescribing errors per 1000 chemotherapy doses during the pre-implementation phase (30 months), immediate change in the error rate from pre-implementation to implementation and trend of errors during the implementation phase. Errors were analyzed according to their types: errors in communication or transcription, errors in dosing calculation and errors in regimen frequency or treatment duration. Relative risk (RR) of errors in the post-implementation phase (28 months) compared with the pre-implementation phase was computed with 95% confidence interval (CI). The baseline monthly error rate was stable at 16.7 prevented errors per 1000 chemotherapy doses. A 30% reduction in prescribing errors was observed upon initiation of the intervention. With implementation, a negative change in the slope of prescribing errors was observed (coefficient = -0.338; 95% CI: -0.612 to -0.064). The estimated RR of transcription errors was 0.74; 95% CI (0.59-0.92). The estimated RR of dosing calculation errors was 0.06; 95% CI (0.03-0.10). The estimated RR of chemotherapy frequency/duration errors was 0.51; 95% CI (0.42-0.62). Implementing standardized chemotherapy-prescribing templates significantly reduced all types of prescribing errors and improved chemotherapy safety.
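    The segmented regression used in such interrupted time-series analyses fits an intercept, a baseline trend, an immediate level change, and a slope change after the intervention. The sketch below uses synthetic monthly data; the effect sizes and noise level are illustrative assumptions (only the 16.7-per-1000 baseline and the 30/28-month phases follow the abstract):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic monthly error rates per 1000 doses: flat 30-month baseline,
    # then an assumed level drop and downward slope after implementation.
    n_pre, n_post = 30, 28
    t = np.arange(n_pre + n_post)
    post = (t >= n_pre).astype(float)                 # 1 after implementation
    t_post = np.where(post == 1, t - n_pre, 0.0)      # months since implementation
    true = 16.7 - 5.0 * post - 0.3 * t_post           # hypothetical true effects
    y = true + rng.normal(0, 0.8, t.size)

    # Segmented regression design: [intercept, baseline trend, level, slope change].
    X = np.column_stack([np.ones_like(t, dtype=float), t, post, t_post])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    b0, b1, level_change, slope_change = coef
    print(f"level change: {level_change:.2f}, slope change: {slope_change:.3f}")
    ```

    The fitted `slope_change` coefficient plays the same role as the reported -0.338 slope coefficient: a significantly negative value means errors kept declining through the implementation phase.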

  19. Merits of using color and shape differentiation to improve the speed and accuracy of drug strength identification on over-the-counter medicines by laypeople.

    PubMed

    Hellier, Elizabeth; Tucker, Mike; Kenny, Natalie; Rowntree, Anna; Edworthy, Judy

    2010-09-01

    This study aimed to examine the utility of using color and shape to differentiate drug strength information on over-the-counter medicine packages. Medication errors are an important threat to patient safety, and confusions between drug strengths are a significant source of medication error. A visual search paradigm required laypeople to search for medicine packages of a particular strength from among distracter packages of different strengths, and measures of reaction time and error were recorded. Using color to differentiate drug strength information conferred an advantage on search times and accuracy. Shape differentiation did not improve search times and had only a weak effect on search accuracy. Using color to differentiate drug strength information improves drug strength identification performance. Color differentiation of drug strength information may be a useful way of reducing medication errors and improving patient safety.

  20. New algorithm for toric intraocular lens power calculation considering the posterior corneal astigmatism.

    PubMed

    Canovas, Carmen; Alarcon, Aixa; Rosén, Robert; Kasthurirangan, Sanjeev; Ma, Joseph J K; Koch, Douglas D; Piers, Patricia

    2018-02-01

    To assess the accuracy of toric intraocular lens (IOL) power calculations of a new algorithm that incorporates the effect of posterior corneal astigmatism (PCA). Abbott Medical Optics, Inc., Groningen, the Netherlands. Retrospective case report. In eyes implanted with toric IOLs, the exact vergence formula of the Tecnis toric calculator was used to predict refractive astigmatism from preoperative biometry, surgeon-estimated surgically induced astigmatism (SIA), and implanted IOL power, with and without the new PCA algorithm. For each calculation method, the error in predicted refractive astigmatism was calculated as the vector difference between the prediction and the actual refraction. Calculations were also made using postoperative keratometry (K) values to eliminate the potential effect of incorrect SIA estimates. The study comprised 274 eyes. The PCA algorithm significantly reduced the centroid error in predicted refractive astigmatism (P < .001). With the PCA algorithm, the centroid error was reduced from 0.50 @ 1 to 0.19 @ 3 when using preoperative K values and from 0.30 @ 0 to 0.02 @ 84 when using postoperative K values. Patients who had anterior corneal against-the-rule, with-the-rule, and oblique astigmatism all showed improvement with the PCA algorithm. In addition, the PCA algorithm reduced the median absolute error in all groups (P < .001). The use of the new PCA algorithm decreased the error in the prediction of residual refractive astigmatism in eyes implanted with toric IOLs. Therefore, the new PCA algorithm, in combination with an exact vergence IOL power calculation formula, led to an increased predictability of toric IOL power. Copyright © 2018 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  1. Improving Global Net Surface Heat Flux with Ocean Reanalysis

    NASA Astrophysics Data System (ADS)

    Carton, J.; Chepurin, G. A.; Chen, L.; Grodsky, S.

    2017-12-01

    This project addresses the current level of uncertainty in surface heat flux estimates. Time mean surface heat flux estimates provided by atmospheric reanalyses differ by 10-30 W/m2. They are generally unbalanced globally, and have been shown by ocean simulation studies to be incompatible with ocean temperature and velocity measurements. Here a method is presented 1) to identify the spatial and temporal structure of the underlying errors and 2) to reduce them by exploiting hydrographic observations and the analysis increments produced by an ocean reanalysis using sequential data assimilation. The method is applied to fluxes computed from daily state variables obtained from three widely used reanalyses: MERRA2, ERA-Interim, and JRA-55, during an eight-year period (2007-2014). For each of these, seasonal heat flux errors/corrections are obtained. In a second set of experiments the heat fluxes are corrected and the ocean reanalysis experiments are repeated. This second round of experiments shows that the time mean error in the corrected fluxes is reduced to within ±5 W/m2 over the interior subtropical and midlatitude oceans, with the most significant changes occurring over the Southern Ocean. The global heat flux imbalance of each reanalysis is reduced to within a few W/m2 with this single correction. Encouragingly, the corrected forms of the three sets of fluxes are also shown to converge. In the final discussion we present experiments beginning with a modified form of the ERA-Int reanalysis, produced by the DAKKAR program, in which state variables have been individually corrected based on independent measurements. Finally, we discuss the separation of flux error from model error.

  2. Body composition in Nepalese children using isotope dilution: the production of ethnic-specific calibration equations and an exploration of methodological issues.

    PubMed

    Devakumar, Delan; Grijalva-Eternod, Carlos S; Roberts, Sebastian; Chaube, Shiva Shankar; Saville, Naomi M; Manandhar, Dharma S; Costello, Anthony; Osrin, David; Wells, Jonathan C K

    2015-01-01

    Background. Body composition is important as a marker of both current and future health. Bioelectrical impedance (BIA) is a simple and accurate method for estimating body composition, but requires population-specific calibration equations. Objectives. (1) To generate population-specific calibration equations to predict lean mass (LM) from BIA in Nepalese children aged 7-9 years. (2) To explore methodological changes that may extend the range and improve accuracy. Methods. BIA measurements were obtained from 102 Nepalese children (52 girls) using the Tanita BC-418. Isotope dilution with deuterium oxide was used to measure total body water and to estimate LM. Prediction equations for estimating LM from BIA data were developed using linear regression, and estimates were compared with those obtained from the Tanita system. We assessed the effects of flexing the arms of children to extend the range of coverage towards lower weights. We also estimated the potential error if the number of children included in the study was reduced. Findings. Prediction equations were generated, incorporating height, impedance index, weight and sex as predictors (R² = 93%). The Tanita system tended to under-estimate LM, with a mean error of 2.2%, but extending up to 25.8%. Flexing the arms to 90° increased the lower weight range, but produced a small error that was not significant when applied to children <16 kg (p = 0.42). Reducing the number of children increased the error at the tails of the weight distribution. Conclusions. Population-specific isotope calibration of BIA for Nepalese children has high accuracy. Arm position is important and can be used to extend the range of low weight covered. Smaller samples reduce resource requirements, but lead to large errors at the tails of the weight distribution.
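    A calibration equation of the kind described, predicting lean mass from height, impedance index (height²/impedance), weight, and sex, can be sketched with ordinary least squares on synthetic data. All ranges and coefficients below are invented for illustration; only the predictor list and sample size follow the abstract:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic calibration sample (assumed ranges, not the study's data).
    n = 102
    height = rng.uniform(110, 135, n)             # cm
    Z = rng.uniform(600, 900, n)                  # impedance, ohms
    weight = rng.uniform(14, 30, n)               # kg
    sex = rng.integers(0, 2, n).astype(float)     # 0 = girl, 1 = boy
    zindex = height**2 / Z                        # impedance index

    # Hypothetical "isotope-dilution" reference lean mass (kg) plus noise.
    lean_iso = (0.2 * zindex + 0.05 * height + 0.25 * weight + 0.8 * sex
                + rng.normal(0, 0.5, n))

    # Fit the prediction equation by ordinary least squares.
    X = np.column_stack([np.ones(n), height, zindex, weight, sex])
    coef, *_ = np.linalg.lstsq(X, lean_iso, rcond=None)
    pred = X @ coef
    ss_res = np.sum((lean_iso - pred) ** 2)
    ss_tot = np.sum((lean_iso - lean_iso.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    print(f"R^2 of calibration equation: {r2:.3f}")
    ```

    The fitted `coef` vector is the calibration equation; applying it to new BIA measurements from the same population yields the population-specific LM estimate.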

  3. Analysis and Compensation of Modulation Angular Rate Error Based on Missile-Borne Rotation Semi-Strapdown Inertial Navigation System.

    PubMed

    Zhang, Jiayu; Li, Jie; Zhang, Xi; Che, Xiaorui; Huang, Yugang; Feng, Kaiqiang

    2018-05-04

    The Semi-Strapdown Inertial Navigation System (SSINS) provides a new solution to attitude measurement of a high-speed rotating missile. However, micro-electro-mechanical-systems (MEMS) inertial measurement unit (MIMU) outputs are corrupted by significant sensor errors. In order to improve the navigation precision, a rotation modulation technique called the Rotation Semi-Strapdown Inertial Navigation System (RSSINS) is introduced into SINS. In practice, a stable modulation angular rate is difficult to achieve in a high-speed rotation environment, and the changing rotary angular rate has an impact on the inertial sensor error self-compensation. In this paper, the influence of modulation angular rate error, including the acceleration-deceleration process and the instability of the angular rate, on the navigation accuracy of RSSINS is deduced, and the error characteristics of the reciprocating rotation scheme are analyzed. A new compensation method is proposed to remove or reduce sensor errors so as to make it possible to maintain high-precision autonomous navigation performance by MIMU when there is no external aid. Experiments have been carried out to validate the performance of the method. In addition, the proposed method is applicable for modulation angular rate error compensation under various dynamic conditions.

  4. Increased instrument intelligence--can it reduce laboratory error?

    PubMed

    Jekelis, Albert W

    2005-01-01

    Recent literature has focused on the reduction of laboratory errors and the potential impact on patient management. This study assessed the intelligent, automated preanalytical process-control abilities of newer-generation analyzers as compared with older analyzers, and the impact on error reduction. Three generations of immunochemistry analyzers were challenged with pooled human serum samples over a 3-week period. One of the three analyzers had an intelligent process of fluidics checks, including bubble detection. Bubbles can cause erroneous results due to incomplete sample aspiration. This variable was chosen because it is the most easily controlled sample defect that can be introduced. Traditionally, lab technicians have had to visually inspect each sample for the presence of bubbles. This is time consuming and introduces the possibility of human error. Instruments with bubble detection may be able to eliminate the human factor and reduce errors associated with the presence of bubbles. Specific samples were vortexed daily to introduce a visible quantity of bubbles, then immediately placed in the daily run. Errors were defined as a reported result greater than three standard deviations below the mean and associated with incomplete sample aspiration of the analyte on the individual analyzer; three standard deviations represented the target limits of proficiency testing. The results of the assays were examined for accuracy and precision. Efficiency, measured as process throughput, was also assessed to associate a cost factor and the potential impact of error detection on the overall process. Analyzer performance stratified according to the level of internal process control. The older analyzers without bubble detection reported 23 erred results. The newest analyzer, with bubble detection, reported one specimen incorrectly. The precision and accuracy of the nonvortexed specimens were excellent and acceptable for all three analyzers. No errors were found in the nonvortexed specimens. There were no significant differences in overall process time for any of the analyzers when tests were arranged in an optimal configuration. The analyzer with advanced fluidic intelligence demonstrated the greatest ability to deal appropriately with an incomplete aspiration by not processing and reporting a result for the sample. This study suggests that preanalytical process-control capabilities could reduce errors. By association, it implies that similar intelligent process controls could favorably impact the error rate and, in the case of this instrument, do so without negatively impacting process throughput. Other improvements may be realized as a result of having an intelligent error-detection process, including further reduction in misreported results, fewer repeats, less operator intervention, and less reagent waste.

  5. Frozen section analysis of margins for head and neck tumor resections: reduction of sampling errors with a third histologic level.

    PubMed

    Olson, Stephen M; Hussaini, Mohammad; Lewis, James S

    2011-05-01

    Frozen section analysis is an essential tool for assessing margins intra-operatively to ensure complete resection. Many institutions evaluate surgical defect edge tissue provided by the surgeon after the main lesion has been removed. With the increasing use of transoral laser microsurgery, this method is becoming even more prevalent. We sought to evaluate error rates at our large academic institution and to determine whether sampling errors could be reduced by the simple method change of taking an additional third section on these specimens. All head and neck tumor resection cases from January 2005 through August 2008 with margins evaluated by frozen section were identified by database search. These cases were analyzed by cutting two levels during frozen section and a third permanent section later. All resection cases from August 2008 through July 2009 were identified as well. These were analyzed by cutting three levels during frozen section (the third a 'much deeper' level) and a fourth permanent section later. Error rates for both of these periods were determined. Errors were separated into sampling and interpretation types. There were 4976 total frozen section specimens from 848 patients. The overall error rate was 2.4% for all frozen sections where just two levels were evaluated and was 2.5% when three levels were evaluated (P=0.67). The sampling error rate was 1.6% for two-level sectioning and 1.2% for three-level sectioning (P=0.42). However, when considering only the frozen section cases where tumor was ultimately identified (either at the time of frozen section or on permanent sections), the sampling error rate for two-level sectioning was 15.3% versus 7.4% for three-level sectioning. This difference was statistically significant (P=0.006). Cutting a single additional 'deeper' level at the time of frozen section identifies more tumor-bearing specimens and may reduce the number of sampling errors.
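    The key comparison above (15.3% vs 7.4% sampling error, P=0.006) is a two-proportion test. A hand-rolled sketch using only the standard library; the counts below are hypothetical, chosen to match the reported rates roughly, since the record does not give the underlying denominators:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled proportion under the null hypothesis
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical counts: 30/196 (~15.3%) vs 16/216 (~7.4%)
z, p = two_proportion_z(30, 196, 16, 216)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With real denominators from the paper, the same calculation (or Fisher's exact test for small counts) would reproduce the reported significance level.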

  6. A randomised open-label cross-over study of inhaler errors, preference and time to achieve correct inhaler use in patients with COPD or asthma: comparison of ELLIPTA with other inhaler devices.

    PubMed

    van der Palen, Job; Thomas, Mike; Chrystyn, Henry; Sharma, Raj K; van der Valk, Paul Dlpm; Goosens, Martijn; Wilkinson, Tom; Stonham, Carol; Chauhan, Anoop J; Imber, Varsha; Zhu, Chang-Qing; Svedsater, Henrik; Barnes, Neil C

    2016-11-24

    Errors in the use of different inhalers were investigated in patients naive to the devices under investigation in a multicentre, single-visit, randomised, open-label, cross-over study. Patients with chronic obstructive pulmonary disease (COPD) or asthma were assigned to ELLIPTA vs DISKUS (Accuhaler), metered-dose inhaler (MDI) or Turbuhaler. Patients with COPD were also assigned to ELLIPTA vs Handihaler or Breezhaler. Patients demonstrated inhaler use after reading the patient information leaflet (PIL). A trained investigator assessed critical errors (i.e., those likely to result in the inhalation of significantly reduced, minimal or no medication). If the patient made errors, the investigator demonstrated the correct use of the inhaler, and the patient demonstrated inhaler use again. Fewer COPD patients made critical errors with ELLIPTA after reading the PIL vs: DISKUS, 9/171 (5%) vs 75/171 (44%); MDI, 10/80 (13%) vs 48/80 (60%); Turbuhaler, 8/100 (8%) vs 44/100 (44%); Handihaler, 17/118 (14%) vs 57/118 (48%); Breezhaler, 13/98 (13%) vs 45/98 (46%; all P<0.001). Most patients (57-70%) made no errors using ELLIPTA and did not require investigator instruction. Instruction was required for DISKUS (65%), MDI (85%), Turbuhaler (71%), Handihaler (62%) and Breezhaler (56%). Fewer asthma patients made critical errors with ELLIPTA after reading the PIL vs: DISKUS (3/70 (4%) vs 9/70 (13%), P=0.221); MDI (2/32 (6%) vs 8/32 (25%), P=0.074) and significantly fewer vs Turbuhaler (3/60 (5%) vs 20/60 (33%), P<0.001). More asthma and COPD patients preferred ELLIPTA over the other devices (all P⩽0.002). Significantly fewer COPD patients using ELLIPTA made critical errors after reading the PIL than with the other inhalers. More asthma and COPD patients preferred ELLIPTA over comparator inhalers.

  7. A randomised open-label cross-over study of inhaler errors, preference and time to achieve correct inhaler use in patients with COPD or asthma: comparison of ELLIPTA with other inhaler devices

    PubMed Central

    van der Palen, Job; Thomas, Mike; Chrystyn, Henry; Sharma, Raj K; van der Valk, Paul DLPM; Goosens, Martijn; Wilkinson, Tom; Stonham, Carol; Chauhan, Anoop J; Imber, Varsha; Zhu, Chang-Qing; Svedsater, Henrik; Barnes, Neil C

    2016-01-01

    Errors in the use of different inhalers were investigated in patients naive to the devices under investigation in a multicentre, single-visit, randomised, open-label, cross-over study. Patients with chronic obstructive pulmonary disease (COPD) or asthma were assigned to ELLIPTA vs DISKUS (Accuhaler), metered-dose inhaler (MDI) or Turbuhaler. Patients with COPD were also assigned to ELLIPTA vs Handihaler or Breezhaler. Patients demonstrated inhaler use after reading the patient information leaflet (PIL). A trained investigator assessed critical errors (i.e., those likely to result in the inhalation of significantly reduced, minimal or no medication). If the patient made errors, the investigator demonstrated the correct use of the inhaler, and the patient demonstrated inhaler use again. Fewer COPD patients made critical errors with ELLIPTA after reading the PIL vs: DISKUS, 9/171 (5%) vs 75/171 (44%); MDI, 10/80 (13%) vs 48/80 (60%); Turbuhaler, 8/100 (8%) vs 44/100 (44%); Handihaler, 17/118 (14%) vs 57/118 (48%); Breezhaler, 13/98 (13%) vs 45/98 (46%; all P<0.001). Most patients (57–70%) made no errors using ELLIPTA and did not require investigator instruction. Instruction was required for DISKUS (65%), MDI (85%), Turbuhaler (71%), Handihaler (62%) and Breezhaler (56%). Fewer asthma patients made critical errors with ELLIPTA after reading the PIL vs: DISKUS (3/70 (4%) vs 9/70 (13%), P=0.221); MDI (2/32 (6%) vs 8/32 (25%), P=0.074) and significantly fewer vs Turbuhaler (3/60 (5%) vs 20/60 (33%), P<0.001). More asthma and COPD patients preferred ELLIPTA over the other devices (all P⩽0.002). Significantly fewer COPD patients using ELLIPTA made critical errors after reading the PIL than with the other inhalers. More asthma and COPD patients preferred ELLIPTA over comparator inhalers. PMID:27883002

  8. Errors Affect Hypothetical Intertemporal Food Choice in Women

    PubMed Central

    Sellitto, Manuela; di Pellegrino, Giuseppe

    2014-01-01

    Growing evidence suggests that the ability to control behavior is enhanced in contexts in which errors are more frequent. Here we investigated whether pairing desirable food with errors could decrease impulsive choice during hypothetical temporal decisions about food. To this end, healthy women performed a Stop-signal task in which one food cue predicted high-error rate, and another food cue predicted low-error rate. Afterwards, we measured participants’ intertemporal preferences during decisions between smaller-immediate and larger-delayed amounts of food. We expected reduced sensitivity to smaller-immediate amounts of food associated with high-error rate. Moreover, taking into account that deprivational states affect sensitivity for food, we controlled for participants’ hunger. Results showed that pairing food with high-error likelihood decreased temporal discounting. This effect was modulated by hunger, indicating that, the lower the hunger level, the more participants showed reduced impulsive preference for the food previously associated with a high number of errors as compared with the other food. These findings reveal that errors, which are motivationally salient events that recruit cognitive control and drive avoidance learning against error-prone behavior, are effective in reducing impulsive choice for edible outcomes. PMID:25244534
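    The record above reports reduced temporal discounting but does not specify the discounting model. Intertemporal preferences of this kind are commonly parameterized with hyperbolic discounting, V = A/(1 + kD), where a smaller k corresponds to less impulsive choice; a sketch under that assumption:

```python
def discounted_value(amount, delay, k):
    """Hyperbolic discounting: subjective value of a delayed reward.
    Larger k means steeper discounting (more impulsive choice)."""
    return amount / (1.0 + k * delay)

def prefers_delayed(small_now, large_later, delay, k):
    """True if the larger-delayed amount is subjectively worth more
    than the smaller-immediate amount."""
    return discounted_value(large_later, delay, k) > small_now

# A lower k (as reported after pairing food with high error rates)
# shifts choices toward the larger-delayed option.
print(prefers_delayed(5, 10, delay=10, k=0.05))  # low k  -> True
print(prefers_delayed(5, 10, delay=10, k=0.50))  # high k -> False
```

In a study like this, k would be fitted per participant from choice data; the fixed values here only illustrate the direction of the reported effect.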

  9. Nurses' behaviors and visual scanning patterns may reduce patient identification errors.

    PubMed

    Marquard, Jenna L; Henneman, Philip L; He, Ze; Jo, Junghee; Fisher, Donald L; Henneman, Elizabeth A

    2011-09-01

    Patient identification (ID) errors occurring during the medication administration process can be fatal. The aim of this study is to determine whether differences in nurses' behaviors and visual scanning patterns during the medication administration process influence their capacities to identify patient ID errors. Nurse participants (n = 20) administered medications to 3 patients in a simulated clinical setting, with 1 patient having an embedded ID error. Error-identifying nurses tended to complete more process steps in a similar amount of time compared with non-error-identifying nurses, and tended to scan information across artifacts (e.g., ID band, patient chart, medication label) rather than fixating on several pieces of information on a single artifact before fixating on another artifact. Non-error-identifying nurses tended to increase their durations of off-topic conversations (a type of process interruption) over the course of the trials; the difference between groups was significant in the trial with the embedded ID error. Error-identifying nurses tended to have their most fixations in a row on the patient's chart, whereas non-error-identifying nurses did not tend to have a single artifact on which they consistently fixated. Finally, error-identifying nurses tended to have predictable eye fixation sequences across artifacts, whereas non-error-identifying nurses tended to have seemingly random eye fixation sequences. This finding has implications for nurse training and the design of tools and technologies that support nurses as they complete the medication administration process. (c) 2011 APA, all rights reserved.

  10. Making Residents Part of the Safety Culture: Improving Error Reporting and Reducing Harms.

    PubMed

    Fox, Michael D; Bump, Gregory M; Butler, Gabriella A; Chen, Ling-Wan; Buchert, Andrew R

    2017-01-30

    Reporting medical errors is a focus of the patient safety movement. As frontline physicians, residents are optimally positioned to recognize errors and flaws in systems of care. Previous work highlights the difficulty of engaging residents in identification and/or reduction of medical errors and in integrating these trainees into their institutions' cultures of safety. The authors describe the implementation of a longitudinal, discipline-based, multifaceted curriculum to enhance the reporting of errors by pediatric residents at Children's Hospital of Pittsburgh of University of Pittsburgh Medical Center. The key elements of this curriculum included providing the necessary education to identify medical errors with an emphasis on systems-based causes, modeling of error reporting by faculty, and integrating error reporting and discussion into the residents' daily activities. The authors tracked monthly error reporting rates by residents and other health care professionals, in addition to serious harm event rates at the institution. The interventions resulted in significant increases in error reports filed by residents, from 3.6 to 37.8 per month over 4 years (P < 0.0001). This increase in resident error reporting correlated with a decline in serious harm events, from 15.0 to 8.1 per month over 4 years (P = 0.01). Integrating patient safety into residents' everyday responsibilities encourages frequent reporting and discussion of medical errors and leads to improvements in patient care. Multiple simultaneous interventions are essential to making residents part of the safety culture of their training hospitals.

  11. Making electronic prescribing alerts more effective: scenario-based experimental study in junior doctors

    PubMed Central

    Shah, Priya; Wyatt, Jeremy C; Makubate, Boikanyo; Cross, Frank W

    2011-01-01

    Objective Expert authorities recommend clinical decision support systems to reduce prescribing error rates, yet large numbers of insignificant on-screen alerts presented in modal dialog boxes persistently interrupt clinicians, limiting the effectiveness of these systems. This study compared the impact of modal and non-modal electronic (e-) prescribing alerts on prescribing error rates, to help inform the design of clinical decision support systems. Design A randomized study of 24 junior doctors each performing 30 simulated prescribing tasks in random order with a prototype e-prescribing system. Using a within-participant design, doctors were randomized to be shown one of three types of e-prescribing alert (modal, non-modal, no alert) during each prescribing task. Measurements The main outcome measure was prescribing error rate. Structured interviews were performed to elicit participants' preferences for the prescribing alerts and their views on clinical decision support systems. Results Participants exposed to modal alerts were 11.6 times less likely to make a prescribing error than those not shown an alert (OR 11.56, 95% CI 6.00 to 22.26). Those shown a non-modal alert were 3.2 times less likely to make a prescribing error (OR 3.18, 95% CI 1.91 to 5.30) than those not shown an alert. The error rate with non-modal alerts was 3.6 times higher than with modal alerts (95% CI 1.88 to 7.04). Conclusions Both kinds of e-prescribing alerts significantly reduced prescribing error rates, but modal alerts were over three times more effective than non-modal alerts. This study provides new evidence about the relative effects of modal and non-modal alerts on prescribing outcomes. PMID:21836158
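    The odds ratios and confidence intervals reported above follow the standard 2x2-table calculation: the log odds ratio with a normal approximation for its standard error. A sketch with hypothetical counts, since the record reports only the resulting ORs and not the raw table:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and ~95% CI from a 2x2 table:
    a = errors without alert, b = no error without alert,
    c = errors with alert,    d = no error with alert."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration; the paper reports OR 11.56
# (95% CI 6.00 to 22.26) for no-alert vs modal-alert error odds.
print(odds_ratio_ci(60, 180, 8, 232))
```

The same calculation, with the counts from the 24-doctor trial, would reproduce the reported intervals for both modal and non-modal alerts.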

  12. An error reduction algorithm to improve lidar turbulence estimates for wind energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer F.; Clifton, Andrew

    Remote-sensing devices such as lidars are currently being investigated as alternatives to cup anemometers on meteorological towers for the measurement of wind speed and direction. Although lidars can measure mean wind speeds at heights spanning an entire turbine rotor disk and can be easily moved from one location to another, they measure different values of turbulence than an instrument on a tower. Current methods for improving lidar turbulence estimates include the use of analytical turbulence models and expensive scanning lidars. While these methods provide accurate results in a research setting, they cannot be easily applied to smaller, vertically profiling lidars in locations where high-resolution sonic anemometer data are not available. Thus, there is clearly a need for a turbulence error reduction model that is simpler and more easily applicable to lidars that are used in the wind energy industry. In this work, a new turbulence error reduction algorithm for lidars is described. The Lidar Turbulence Error Reduction Algorithm, L-TERRA, can be applied using only data from a stand-alone vertically profiling lidar and requires minimal training with meteorological tower data. The basis of L-TERRA is a series of physics-based corrections that are applied to the lidar data to mitigate errors from instrument noise, volume averaging, and variance contamination. These corrections are applied in conjunction with a trained machine-learning model to improve turbulence estimates from a vertically profiling WINDCUBE v2 lidar. The lessons learned from creating the L-TERRA model for a WINDCUBE v2 lidar can also be applied to other lidar devices. L-TERRA was tested on data from two sites in the Southern Plains region of the United States. The physics-based corrections in L-TERRA brought regression line slopes much closer to 1 at both sites and significantly reduced the sensitivity of lidar turbulence errors to atmospheric stability. The accuracy of machine-learning methods in L-TERRA was highly dependent on the input variables and training dataset used, suggesting that machine learning may not be the best technique for reducing lidar turbulence intensity (TI) error. Future work will include the use of a lidar simulator to better understand how different factors affect lidar turbulence error and to determine how these errors can be reduced using information from a stand-alone lidar.
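    One of the physics-based corrections named above, mitigating instrument noise, is commonly implemented by subtracting an estimated noise variance from the measured velocity variance before forming turbulence intensity (TI = sigma_u / U). The record does not give L-TERRA's exact formulation, so the sketch below is only a generic version of that idea, with made-up numbers:

```python
import math

def corrected_ti(var_measured, var_noise, mean_wind_speed):
    """Turbulence intensity after removing instrument-noise variance.
    The corrected variance is floored at zero so noisy estimates
    cannot produce a negative variance."""
    var_corrected = max(var_measured - var_noise, 0.0)
    return math.sqrt(var_corrected) / mean_wind_speed

u = 8.0       # m/s mean wind speed (hypothetical)
var_m = 1.00  # m^2/s^2 measured velocity variance, including noise
var_n = 0.19  # m^2/s^2 assumed instrument-noise variance
print(math.sqrt(var_m) / u)           # uncorrected TI = 0.125
print(corrected_ti(var_m, var_n, u))  # corrected TI   = 0.1125
```

The uncorrected estimate overstates TI whenever noise variance is nonzero, which is one reason lidar TI tends to disagree with tower-mounted sonic anemometers.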

  13. An error reduction algorithm to improve lidar turbulence estimates for wind energy

    DOE PAGES

    Newman, Jennifer F.; Clifton, Andrew

    2017-02-10

    Remote-sensing devices such as lidars are currently being investigated as alternatives to cup anemometers on meteorological towers for the measurement of wind speed and direction. Although lidars can measure mean wind speeds at heights spanning an entire turbine rotor disk and can be easily moved from one location to another, they measure different values of turbulence than an instrument on a tower. Current methods for improving lidar turbulence estimates include the use of analytical turbulence models and expensive scanning lidars. While these methods provide accurate results in a research setting, they cannot be easily applied to smaller, vertically profiling lidars in locations where high-resolution sonic anemometer data are not available. Thus, there is clearly a need for a turbulence error reduction model that is simpler and more easily applicable to lidars that are used in the wind energy industry. In this work, a new turbulence error reduction algorithm for lidars is described. The Lidar Turbulence Error Reduction Algorithm, L-TERRA, can be applied using only data from a stand-alone vertically profiling lidar and requires minimal training with meteorological tower data. The basis of L-TERRA is a series of physics-based corrections that are applied to the lidar data to mitigate errors from instrument noise, volume averaging, and variance contamination. These corrections are applied in conjunction with a trained machine-learning model to improve turbulence estimates from a vertically profiling WINDCUBE v2 lidar. The lessons learned from creating the L-TERRA model for a WINDCUBE v2 lidar can also be applied to other lidar devices. L-TERRA was tested on data from two sites in the Southern Plains region of the United States. The physics-based corrections in L-TERRA brought regression line slopes much closer to 1 at both sites and significantly reduced the sensitivity of lidar turbulence errors to atmospheric stability. The accuracy of machine-learning methods in L-TERRA was highly dependent on the input variables and training dataset used, suggesting that machine learning may not be the best technique for reducing lidar turbulence intensity (TI) error. Future work will include the use of a lidar simulator to better understand how different factors affect lidar turbulence error and to determine how these errors can be reduced using information from a stand-alone lidar.

  14. Improving end of life care: an information systems approach to reducing medical errors.

    PubMed

    Tamang, S; Kopec, D; Shagas, G; Levy, K

    2005-01-01

    Chronic and terminally ill patients are disproportionately affected by medical errors. In addition, the elderly suffer more preventable adverse events than younger patients. Targeting system-wide "error-reducing" reforms to vulnerable populations can significantly reduce the incidence and prevalence of human error in medical practice. Recent developments in health informatics, particularly the application of artificial intelligence (AI) techniques such as data mining, neural networks, and case-based reasoning (CBR), present tremendous opportunities for mitigating error in disease diagnosis and patient management. Additionally, the ubiquity of the Internet creates the possibility of an almost ideal network for the dissemination of medical information. We explore the capacity and limitations of web-based palliative information systems (IS) to transform the delivery of care, streamline processes, and improve the efficiency and appropriateness of medical treatment. As a result, medical errors that occur with patients dealing with severe, chronic illness and the frail elderly can be reduced. The palliative model grew out of the need for pain relief and comfort measures for patients diagnosed with cancer. Applied definitions of palliative care extend this convention, but there is no widely accepted definition. This research will discuss the development life cycle of two palliative information systems: the CONFER QOLP management information system (MIS), currently used by a community-based palliative care program in Brooklyn, New York, and the CAREN case-based reasoning prototype. CONFER is a web platform based on the idea of "eCare". CONFER uses XML (Extensible Markup Language), a W3C-endorsed standard markup language, to define system data. The second system, CAREN, is a CBR prototype designed for palliative care patients in the cancer trajectory. 
CBR is a technique that exploits the similarities between two situations, matching decision-making to the best-known precedent cases. The prototype uses the open-source CASPIAN shell developed by the University of Aberystwyth, Wales, which is available by anonymous FTP. We will discuss and analyze the preliminary results we have obtained using this CBR tool. Our research suggests that automated information systems can be used to improve the quality of care at the end of life and to disseminate expert-level 'know-how' to palliative care clinicians. We will present how our CBR prototype can be successfully deployed, capable of securely transferring information using the Secure File Transfer Protocol (SFTP) and using a Java CBR engine.
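    The precedent-matching step at the heart of CBR can be sketched as weighted nearest-neighbor retrieval over case features. The feature names, weights, and case base below are purely illustrative and not taken from the CAREN prototype:

```python
def retrieve_case(new_case, case_base, weights):
    """Retrieve the most similar precedent case by weighted feature match.
    Features are numeric, normalized to [0, 1]; similarity is
    1 minus the weighted mean absolute difference."""
    def similarity(case):
        total = sum(w * (1 - abs(new_case[f] - case["features"][f]))
                    for f, w in weights.items())
        return total / sum(weights.values())
    return max(case_base, key=similarity)

# Hypothetical, highly simplified case base
case_base = [
    {"features": {"pain": 0.9, "mobility": 0.2}, "plan": "plan A"},
    {"features": {"pain": 0.3, "mobility": 0.8}, "plan": "plan B"},
]
weights = {"pain": 2.0, "mobility": 1.0}
best = retrieve_case({"pain": 0.8, "mobility": 0.3}, case_base, weights)
print(best["plan"])  # -> plan A
```

Real CBR shells such as CASPIAN add the remaining steps of the cycle (reuse, revise, retain), but retrieval of the closest precedent is the core operation.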

  15. Antiretroviral medication prescribing errors are common with hospitalization of HIV-infected patients.

    PubMed

    Commers, Tessa; Swindells, Susan; Sayles, Harlan; Gross, Alan E; Devetten, Marcel; Sandkovsky, Uriel

    2014-01-01

    Errors in prescribing antiretroviral therapy (ART) often occur with the hospitalization of HIV-infected patients. The rapid identification and prevention of errors may reduce patient harm and healthcare-associated costs. A retrospective review of hospitalized HIV-infected patients was carried out between 1 January 2009 and 31 December 2011. Errors were documented as omission, underdose, overdose, duplicate therapy, incorrect scheduling and/or incorrect therapy. The time to error correction was recorded. Relative risks (RRs) were computed to evaluate patient characteristics and error rates. A total of 289 medication errors were identified in 146/416 admissions (35%). The most common was drug omission (69%). At an error rate of 31%, nucleoside reverse transcriptase inhibitors were associated with an increased risk of error when compared with protease inhibitors (RR 1.32; 95% CI 1.04-1.69) and co-formulated drugs (RR 1.59; 95% CI 1.19-2.09). Of the errors, 31% were corrected within the first 24 h, but over half (55%) were never remedied. Admissions with an omission error were 7.4 times more likely to have all errors corrected within 24 h than were admissions without an omission. Drug interactions with ART were detected on 51 occasions. For the study population (n = 177), an increased risk of admission error was observed for black (43%) compared with white (28%) individuals (RR 1.53; 95% CI 1.16-2.03) but no significant differences were observed between white patients and other minorities or between men and women. Errors in inpatient ART were common, and the majority were never detected. The most common errors involved omission of medication, and nucleoside reverse transcriptase inhibitors had the highest rate of prescribing error. Interventions to prevent and correct errors are urgently needed.

  16. Using virtual reality to assess theory of mind subprocesses and error types in early and chronic schizophrenia.

    PubMed

    Canty, Allana L; Neumann, David L; Shum, David H K

    2017-12-01

    Individuals with schizophrenia often demonstrate theory of mind (ToM) impairment relative to healthy adults. However, the exact nature of this impairment (first- vs. second-order ToM and cognitive vs. affective ToM) and the extent to which ToM abilities deteriorate with illness chronicity is unclear. Furthermore, little is known about the relationships between clinical symptoms and ToM error types (overmentalising, reduced mentalising and no ToM) in early and chronic schizophrenia. This study examined the nature and types of ToM impairment in individuals with early (n = 26) and chronic schizophrenia (n = 32) using a novel virtual reality task. Clinical participants and demographically-matched controls were administered the Virtual Assessment of Mentalising Ability, which provides indices of first- and second-order cognitive and affective ToM, and quantifies three different types of mentalising errors (viz., overmentalising, reduced mentalising, and no ToM). Individuals with early schizophrenia performed significantly poorer than healthy controls on first-order affective and second-order cognitive and affective ToM, but significantly higher than individuals with chronic schizophrenia on all ToM subscales. Whereas a lack of mental state concept was associated with negative symptoms, overmentalising was associated with positive symptoms. These findings suggest that ToM abilities selectively deteriorate with illness chronicity and error types are related to these individuals' presenting symptomology. An implication of the findings is that social-cognitive interventions for schizophrenia need to consider the nature, time course and symptomatology of the presenting patient.

  17. Endodontic complications of root canal therapy performed by dental students with stainless-steel K-files and nickel-titanium hand files.

    PubMed

    Pettiette, M T; Metzger, Z; Phillips, C; Trope, M

    1999-04-01

    Straightening of curved canals is one of the most common procedural errors in endodontic instrumentation. This problem is commonly encountered when dental students perform molar endodontics. The purpose of this study was to compare the effect of the type of instrument used by these students on the extent of straightening and on the incidence of other endodontic procedural errors. Nickel-titanium 0.02 taper hand files were compared with traditional stainless-steel 0.02 taper K-files. Sixty molar teeth, comprising maxillary and mandibular first and second molars, were treated by senior dental students. Instrumentation was with either nickel-titanium hand files or stainless-steel K-files. Preoperative and postoperative radiographs of each tooth were taken using an XCP precision instrument with a customized bite block to ensure accurate reproduction of radiographic angulation. The radiographs were scanned and the images stored as TIFF files. By superimposing tracings from the preoperative over the postoperative radiographs, the degree of deviation of the apical third of the root canal filling from the original canal was measured. The presence of other errors, such as strip perforation and instrument breakage, was established by examining the radiographs. In curved canals instrumented with stainless-steel K-files, the average deviation of the apical third of the canals was 14.44 degrees (+/- 10.33 degrees). When nickel-titanium hand files were used, the deviation was significantly reduced, to an average of 4.39 degrees (+/- 4.53 degrees). The incidence of other procedural errors was also significantly reduced by the use of nickel-titanium hand files.

  18. Prescribing errors during hospital inpatient care: factors influencing identification by pharmacists.

    PubMed

    Tully, Mary P; Buchan, Iain E

    2009-12-01

    To investigate the prevalence of prescribing errors identified by pharmacists in hospital inpatients and the factors influencing error identification rates by pharmacists throughout hospital admission. 880-bed university teaching hospital in North-west England. Data about prescribing errors identified by pharmacists (median 9, range 4-17, collecting data per day) when conducting routine work were prospectively recorded on 38 randomly selected days over 18 months. Proportion of new medication orders in which an error was identified; predictors of error identification rate, adjusted for workload and seniority of pharmacist, day of week, type of ward, or stage of patient admission. 33,012 new medication orders were reviewed for 5,199 patients; 3,455 errors (in 10.5% of orders) were identified for 2,040 patients (39.2%; median 1, range 1-12). Most were problem orders (1,456, 42.1%) or potentially significant errors (1,748, 50.6%); 197 (5.7%) were potentially serious; 1.6% (n = 54) were potentially severe or fatal. Errors were 41% (CI: 28-56%) more likely to be identified at a patient's admission than at other times, independent of confounders. Workload was the strongest predictor of error identification rates, with 40% (33-46%) fewer errors identified on the busiest days than at other times. Errors identified fell by 1.9% (1.5-2.3%) for every additional chart checked, independent of confounders. Pharmacists routinely identify errors, but increasing workload may reduce identification rates. Where resources are limited, they may be better spent on identifying and addressing errors immediately after admission to hospital.

  19. Impact of Extended-Duration Shifts on Medical Errors, Adverse Events, and Attentional Failures

    PubMed Central

    Barger, Laura K; Ayas, Najib T; Cade, Brian E; Cronin, John W; Rosner, Bernard; Speizer, Frank E; Czeisler, Charles A

    2006-01-01

    Background A recent randomized controlled trial in critical-care units revealed that the elimination of extended-duration work shifts (≥24 h) reduces the rates of significant medical errors and polysomnographically recorded attentional failures. This raised the concern that the extended-duration shifts commonly worked by interns may contribute to the risk of medical errors being made, and perhaps to the risk of adverse events more generally. Our current study assessed whether extended-duration shifts worked by interns are associated with significant medical errors, adverse events, and attentional failures in a diverse population of interns across the United States. Methods and Findings We conducted a Web-based survey, across the United States, in which 2,737 residents in their first postgraduate year (interns) completed 17,003 monthly reports. The association between the number of extended-duration shifts worked in the month and the reporting of significant medical errors, preventable adverse events, and attentional failures was assessed using a case-crossover analysis in which each intern acted as his/her own control. Compared to months in which no extended-duration shifts were worked, during months in which between one and four extended-duration shifts and five or more extended-duration shifts were worked, the odds ratios of reporting at least one fatigue-related significant medical error were 3.5 (95% confidence interval [CI], 3.3–3.7) and 7.5 (95% CI, 7.2–7.8), respectively. The respective odds ratios for fatigue-related preventable adverse events, 8.7 (95% CI, 3.4–22) and 7.0 (95% CI, 4.3–11), were also increased. Interns working five or more extended-duration shifts per month reported more attentional failures during lectures, rounds, and clinical activities, including surgery, and reported 300% more fatigue-related preventable adverse events resulting in a fatality. 
Conclusions In our survey, extended-duration work shifts were associated with an increased risk of significant medical errors, adverse events, and attentional failures in interns across the United States. These results have important public policy implications for postgraduate medical education. PMID:17194188

  20. Reducing errors in aircraft atmospheric inversion estimates of point-source emissions: the Aliso Canyon natural gas leak as a natural tracer experiment

    NASA Astrophysics Data System (ADS)

    Gourdji, S. M.; Yadav, V.; Karion, A.; Mueller, K. L.; Conley, S.; Ryerson, T.; Nehrkorn, T.; Kort, E. A.

    2018-04-01

    Urban greenhouse gas (GHG) flux estimation with atmospheric measurements and modeling, i.e. the ‘top-down’ approach, can potentially support GHG emission reduction policies by assessing trends in surface fluxes and detecting anomalies from bottom-up inventories. Aircraft-collected GHG observations also have the potential to help quantify point-source emissions that may not be adequately sampled by fixed surface tower-based atmospheric observing systems. Here, we estimate CH4 emissions from a known point source, the Aliso Canyon natural gas leak in Los Angeles, CA from October 2015–February 2016, using atmospheric inverse models with airborne CH4 observations from twelve flights ≈4 km downwind of the leak and surface sensitivities from a mesoscale atmospheric transport model. This leak event has been well-quantified previously using various methods by the California Air Resources Board, thereby providing high confidence in the mass-balance leak rate estimates of Conley et al (2016), used here for comparison to inversion results. Inversions with an optimal setup are shown to provide estimates of the leak magnitude, on average, within a third of the mass balance values, with remaining errors in estimated leak rates predominantly explained by modeled wind speed errors of up to 10 m s⁻¹, quantified by comparing airborne meteorological observations with modeled values along the flight track. An inversion setup using scaled observational wind speed errors in the model-data mismatch covariance matrix is shown to significantly reduce the influence of transport model errors on spatial patterns and estimated leak rates from the inversions. In sum, this study takes advantage of a natural tracer release experiment (i.e. the Aliso Canyon natural gas leak) to identify effective approaches for reducing the influence of transport model error on atmospheric inversions of point-source emissions, while suggesting future potential for integrating surface tower and aircraft atmospheric GHG observations in top-down urban emission monitoring systems.
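The covariance-scaling idea can be illustrated with a toy linear Bayesian inversion, in which observations with larger wind-speed error are assigned larger model-data mismatch variance and therefore less weight. All sizes, footprints, and scalings below are invented for illustration; this is not the study's setup:

```python
import numpy as np

# Toy linear inversion: y = H s + noise.
rng = np.random.default_rng(0)
n_obs, n_flux = 12, 3
H = rng.random((n_obs, n_flux))              # transport/footprint matrix
s_true = np.array([5.0, 1.0, 0.5])           # "true" fluxes
y = H @ s_true + rng.normal(0, 0.2, n_obs)   # synthetic observations

# Scale the model-data mismatch with per-observation wind-speed error,
# so observations with poorly modeled transport are down-weighted.
sigma_base = 0.2                             # baseline mismatch (concentration units)
wind_err = rng.uniform(0, 10, n_obs)         # |modeled - observed| wind speed
sigma = sigma_base + 0.05 * wind_err
R_inv = np.diag(1.0 / sigma ** 2)            # inverse mismatch covariance

Q_inv = np.eye(n_flux) / 10.0 ** 2           # weak zero-mean prior
s_hat = np.linalg.solve(H.T @ R_inv @ H + Q_inv, H.T @ R_inv @ y)
```

The posterior solve is the standard Gaussian result; only the construction of `sigma` from wind-speed error reflects the approach described in the abstract.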

  1. Reducing Errors in Satellite Simulated Views of Clouds with an Improved Parameterization of Unresolved Scales

    NASA Astrophysics Data System (ADS)

    Hillman, B. R.; Marchand, R.; Ackerman, T. P.

    2016-12-01

    Satellite instrument simulators have emerged as a means to reduce errors in model evaluation by producing simulated or pseudo-retrievals from model fields, which account for limitations in the satellite retrieval process. Because of the mismatch in resolved scales between satellite retrievals and large-scale models, model cloud fields must first be downscaled to scales consistent with satellite retrievals. This downscaling is analogous to that required for model radiative transfer calculations. The assumption is often made in both model radiative transfer codes and satellite simulators that the unresolved clouds follow maximum-random overlap with horizontally homogeneous cloud condensate amounts. We examine errors in simulated MISR and CloudSat retrievals that arise due to these assumptions by applying the MISR and CloudSat simulators to cloud resolving model (CRM) output generated by the Super-parameterized Community Atmosphere Model (SP-CAM). Errors are quantified by comparing simulated retrievals performed directly on the CRM fields with those simulated by first averaging the CRM fields to approximately 2-degree resolution, applying a "subcolumn generator" to regenerate pseudo-resolved cloud and precipitation condensate fields, and then applying the MISR and CloudSat simulators on the regenerated condensate fields. We show that errors due to both assumptions of maximum-random overlap and homogeneous condensate are significant (relative to uncertainties in the observations and other simulator limitations). The treatment of precipitation is particularly problematic for CloudSat-simulated radar reflectivity. We introduce an improved subcolumn generator for use with the simulators, and show that these errors can be greatly reduced by replacing the maximum-random overlap assumption with the more realistic generalized overlap and incorporating a simple parameterization of subgrid-scale cloud and precipitation condensate heterogeneity.
Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND NO. SAND2016-7485 A
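A maximum-random overlap subcolumn generator of the kind discussed above can be sketched with a rank-based scheme in the spirit of Räisänen et al. (2004): vertically contiguous cloudy layers overlap maximally, while layers separated by clear air overlap randomly. This is a simplified illustration, not the SP-CAM or simulator code:

```python
import numpy as np

def subcolumns_max_random(cf, n_sub, rng):
    """Generate binary cloud masks (n_sub x n_lay) whose per-layer cloud
    fractions match `cf` under maximum-random overlap. A layer is cloudy
    in a subcolumn when its rank r exceeds 1 - cf[k]."""
    n_lay = len(cf)
    mask = np.zeros((n_sub, n_lay), dtype=bool)
    for s in range(n_sub):
        r = rng.random()
        for k in range(n_lay):
            if k > 0:
                if r > 1.0 - cf[k - 1]:
                    pass                      # previous layer cloudy: keep rank (maximum overlap)
                else:
                    r = rng.random() * (1.0 - cf[k - 1])  # previous layer clear: re-draw (random overlap)
            mask[s, k] = r > 1.0 - cf[k]
    return mask

rng = np.random.default_rng(1)
cf = np.array([0.3, 0.5, 0.2])
masks = subcolumns_max_random(cf, 20000, rng)
```

The rank re-draw is restricted to the clear portion of the layer above, which is what preserves each layer's marginal cloud fraction while enforcing maximum overlap of contiguous cloud.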

  2. Reducing the Familiarity of Conjunction Lures with Pictures

    ERIC Educational Resources Information Center

    Lloyd, Marianne E.

    2013-01-01

    Four experiments were conducted to test whether conjunction errors were reduced after pictorial encoding and whether the semantic overlap between study and conjunction items would impact error rates. Across 4 experiments, compound words studied with a single picture had lower conjunction error rates during a recognition test than those words…

  3. Recommendations to Improve the Accuracy of Estimates of Physical Activity Derived from Self Report

    PubMed Central

    Ainsworth, Barbara E; Caspersen, Carl J; Matthews, Charles E; Mâsse, Louise C; Baranowski, Tom; Zhu, Weimo

    2013-01-01

    Context Assessment of physical activity using self-report has the potential for measurement error that can lead to incorrect inferences about physical activity behaviors and bias study results. Objective To provide recommendations to improve the accuracy of physical activity derived from self report. Process We provide an overview of presentations and a compilation of perspectives shared by the authors of this paper and workgroup members. Findings We identified a conceptual framework for reducing errors using physical activity self-report questionnaires. The framework identifies six steps to reduce error: (1) identifying the need to measure physical activity, (2) selecting an instrument, (3) collecting data, (4) analyzing data, (5) developing a summary score, and (6) interpreting data. Underlying the first four steps are behavioral parameters of type, intensity, frequency, and duration of physical activities performed, activity domains, and the location where activities are performed. We identified ways to reduce measurement error at each step and made recommendations for practitioners, researchers, and organizational units to reduce error in questionnaire assessment of physical activity. Conclusions Self-report measures of physical activity have a prominent role in research and practice settings. Measurement error can be reduced by applying the framework discussed in this paper. PMID:22287451

  4. Can a two-hour lecture by a pharmacist improve the quality of prescriptions in a pediatric hospital? A retrospective cohort study.

    PubMed

    Vairy, Stephanie; Corny, Jennifer; Jamoulle, Olivier; Levy, Arielle; Lebel, Denis; Carceller, Ana

    2017-12-01

    A high rate of prescription errors exists in pediatric teaching hospitals, especially during initial training. To determine the effectiveness of a two-hour lecture by a pharmacist on rates of prescription errors and quality of prescriptions. A two-hour lecture led by a pharmacist was provided to 11 junior pediatric residents (PGY-1) as part of a one-month immersion program. A control group included 15 residents without the intervention. We reviewed charts to analyze the first 50 prescriptions of each resident. Data were collected from 1300 prescriptions involving 451 patients, 550 in the intervention group and 750 in the control group. The rate of prescription errors in the intervention group was 9.6% compared to 11.3% in the control group (p=0.32), affecting 106 patients. Statistically significant differences between both groups were prescriptions with unwritten doses (p=0.01) and errors involving overdosing (p=0.04). We identified many errors as well as issues surrounding quality of prescriptions. We found a 10.6% prescription error rate. This two-hour lecture seems insufficient to reduce prescription errors among junior pediatric residents. This study highlights the most frequent types of errors and prescription quality issues that should be targeted by future educational interventions.

  5. Color-coded prefilled medication syringes decrease time to delivery and dosing errors in simulated prehospital pediatric resuscitations: A randomized crossover trial

    PubMed Central

    Stevens, Allen D.; Hernandez, Caleb; Jones, Seth; Moreira, Maria E.; Blumen, Jason R.; Hopkins, Emily; Sande, Margaret; Bakes, Katherine; Haukoos, Jason S.

    2016-01-01

    Background Medication dosing errors remain commonplace and may result in potentially life-threatening outcomes, particularly for pediatric patients where dosing often requires weight-based calculations. Novel medication delivery systems that may reduce dosing errors resonate with national healthcare priorities. Our goal was to evaluate novel, prefilled medication syringes labeled with color-coded volumes corresponding to the weight-based dosing of the Broselow Tape, compared to conventional medication administration, in simulated prehospital pediatric resuscitation scenarios. Methods We performed a prospective, block-randomized, cross-over study, where 10 full-time paramedics each managed two simulated pediatric arrests in situ using either prefilled, color-coded-syringes (intervention) or their own medication kits stocked with conventional ampoules (control). Each paramedic was paired with two emergency medical technicians to provide ventilations and compressions as directed. The ambulance patient compartment and the intravenous medication port were video recorded. Data were extracted from video review by blinded, independent reviewers. Results Median time to delivery of all doses for the intervention and control groups was 34 (95% CI: 28–39) seconds and 42 (95% CI: 36–51) seconds, respectively (difference = 9 [95% CI: 4–14] seconds). Using the conventional method, 62 doses were administered with 24 (39%) critical dosing errors; using the prefilled, color-coded syringe method, 59 doses were administered with 0 (0%) critical dosing errors (difference = 39%, 95% CI: 13–61%). Conclusions A novel color-coded, prefilled syringe decreased time to medication administration and significantly reduced critical dosing errors by paramedics during simulated prehospital pediatric resuscitations. PMID:26247145

  6. Impact of Feedback on Three Phases of Performance Monitoring

    PubMed Central

    Appelgren, Alva; Penny, William; Bengtsson, Sara L

    2013-01-01

    We investigated if certain phases of performance monitoring show differential sensitivity to external feedback and thus rely on distinct mechanisms. The phases of interest were: the error phase (FE), the phase of the correct response after errors (FEC), and the phase of correct responses following corrects (FCC). We tested accuracy and reaction time (RT) on 12 conditions of a continuous-choice-response task: the 2-back task. External feedback was either presented or not in FE and FEC, and delivered on 0%, 20%, or 100% of FCC trials. The FCC20 was matched to FE and FEC in the number of sounds received so that we could investigate when external feedback was most valuable to the participants. We found that external feedback led to a reduction in accuracy when presented on all the correct responses. Moreover, RT was significantly reduced for FCC100, which in turn correlated with the accuracy reduction. Interestingly, the correct response after an error was particularly sensitive to external feedback since accuracy was reduced when external feedback was presented during this phase but not for FCC20. Notably, error-monitoring was not influenced by feedback-type. The results are in line with models suggesting that the internal error-monitoring system is sufficient in cognitively demanding tasks where performance is ∼80%, as well as theories stipulating that external feedback directs attention away from the task. Our data highlight the first correct response after an error as particularly sensitive to external feedback, suggesting that important consolidation of response strategy takes place here. PMID:24217138

  7. Event-related potentials reflect impaired temporal interval learning following haloperidol administration.

    PubMed

    Forster, Sarah E; Zirnheld, Patrick; Shekhar, Anantha; Steinhauer, Stuart R; O'Donnell, Brian F; Hetrick, William P

    2017-09-01

    Signals carried by the mesencephalic dopamine system and conveyed to anterior cingulate cortex are critically implicated in probabilistic reward learning and performance monitoring. A common evaluative mechanism purportedly subserves both functions, giving rise to homologous medial frontal negativities in feedback- and response-locked event-related brain potentials (the feedback-related negativity (FRN) and the error-related negativity (ERN), respectively), reflecting dopamine-dependent prediction error signals to unexpectedly negative events. Consistent with this model, the dopamine receptor antagonist, haloperidol, attenuates the ERN, but effects on FRN have not yet been evaluated. ERN and FRN were recorded during a temporal interval learning task (TILT) following randomized, double-blind administration of haloperidol (3 mg; n = 18), diphenhydramine (an active control for haloperidol; 25 mg; n = 20), or placebo (n = 21) to healthy controls. Centroparietal positivities, the Pe and feedback-locked P300, were also measured and correlations between ERP measures and behavioral indices of learning, overall accuracy, and post-error compensatory behavior were evaluated. We hypothesized that haloperidol would reduce ERN and FRN, but that ERN would uniquely track automatic, error-related performance adjustments, while FRN would be associated with learning and overall accuracy. As predicted, ERN was reduced by haloperidol and in those exhibiting less adaptive post-error performance; however, these effects were limited to ERNs following fast timing errors. In contrast, the FRN was not affected by drug condition, although increased FRN amplitude was associated with improved accuracy. Significant drug effects on centroparietal positivities were also absent. Our results support a functional and neurobiological dissociation between the ERN and FRN.

  8. Color-coded prefilled medication syringes decrease time to delivery and dosing errors in simulated prehospital pediatric resuscitations: A randomized crossover trial.

    PubMed

    Stevens, Allen D; Hernandez, Caleb; Jones, Seth; Moreira, Maria E; Blumen, Jason R; Hopkins, Emily; Sande, Margaret; Bakes, Katherine; Haukoos, Jason S

    2015-11-01

    Medication dosing errors remain commonplace and may result in potentially life-threatening outcomes, particularly for pediatric patients where dosing often requires weight-based calculations. Novel medication delivery systems that may reduce dosing errors resonate with national healthcare priorities. Our goal was to evaluate novel, prefilled medication syringes labeled with color-coded volumes corresponding to the weight-based dosing of the Broselow Tape, compared to conventional medication administration, in simulated prehospital pediatric resuscitation scenarios. We performed a prospective, block-randomized, cross-over study, where 10 full-time paramedics each managed two simulated pediatric arrests in situ using either prefilled, color-coded syringes (intervention) or their own medication kits stocked with conventional ampoules (control). Each paramedic was paired with two emergency medical technicians to provide ventilations and compressions as directed. The ambulance patient compartment and the intravenous medication port were video recorded. Data were extracted from video review by blinded, independent reviewers. Median time to delivery of all doses for the intervention and control groups was 34 (95% CI: 28-39) seconds and 42 (95% CI: 36-51) seconds, respectively (difference=9 [95% CI: 4-14] seconds). Using the conventional method, 62 doses were administered with 24 (39%) critical dosing errors; using the prefilled, color-coded syringe method, 59 doses were administered with 0 (0%) critical dosing errors (difference=39%, 95% CI: 13-61%). A novel color-coded, prefilled syringe decreased time to medication administration and significantly reduced critical dosing errors by paramedics during simulated prehospital pediatric resuscitations. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. Influence of Forecast Accuracy of Photovoltaic Power Output on Capacity Optimization of Microgrid Composition under 30 min Power Balancing Control

    NASA Astrophysics Data System (ADS)

    Sone, Akihito; Kato, Takeyoshi; Shimakage, Toyonari; Suzuoki, Yasuo

    A microgrid (MG) is one measure for enabling high penetration of renewable energy (RE)-based distributed generators (DGs). If a number of MGs are controlled to maintain a predetermined electricity demand, with RE-based DGs counted as negative demand, they can contribute to supply-demand balancing of the whole electric power system. To construct a MG economically, optimizing the capacity of controllable DGs against RE-based DGs is essential. Using a numerical simulation model developed from a demonstration study of a MG with a PAFC and a NaS battery as controllable DGs and a photovoltaic power generation system (PVS) as the RE-based DG, this study examines the influence of PVS output forecast accuracy on capacity optimization. Three forecast cases of differing accuracy are compared. The main results are as follows. Even with no forecast error over each 30-min period (the ideal forecast method), the required NaS battery capacity reaches about 40% of the PVS capacity in order to mitigate instantaneous forecast errors within the 30-min window. With the actual forecast method, the capacity required to compensate for forecast error doubles. The influence of forecast error can be reduced by adjusting the scheduled power output of controllable DGs according to the weather forecast. Moreover, the required capacity can be reduced significantly if balancing-control errors within the MG are tolerated for a few percent of periods, because large forecast errors occur infrequently.

  10. Improving Papanicolaou test quality and reducing medical errors by using Toyota production system methods.

    PubMed

    Raab, Stephen S; Andrew-Jaja, Carey; Condel, Jennifer L; Dabbs, David J

    2006-01-01

    The objective of the study was to determine whether the Toyota production system process improves Papanicolaou test quality and patient safety. An 8-month nonconcurrent cohort study that included 464 case and 639 control women who had a Papanicolaou test was performed. Office workflow was redesigned using Toyota production system methods by introducing a 1-by-1 continuous flow process. We measured the frequency of Papanicolaou tests without a transformation zone component, follow-up and Bethesda System diagnostic frequency of atypical squamous cells of undetermined significance, and diagnostic error frequency. After the intervention, the percentage of Papanicolaou tests lacking a transformation zone component decreased from 9.9% to 4.7% (P = .001). The percentage of Papanicolaou tests with a diagnosis of atypical squamous cells of undetermined significance decreased from 7.8% to 3.9% (P = .007). The frequency of error per correlating cytologic-histologic specimen pair decreased from 9.52% to 7.84%. The introduction of the Toyota production system process resulted in improved Papanicolaou test quality.

  11. Complementary Reliability-Based Decodings of Binary Linear Block Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1997-01-01

    This correspondence presents a hybrid reliability-based decoding algorithm which combines the reprocessing method based on the most reliable basis and a generalized Chase-type algebraic decoder based on the least reliable positions. It is shown that reprocessing with a simple additional algebraic decoding effort achieves significant coding gain. For long codes, the order of reprocessing required to achieve asymptotic optimum error performance is reduced by approximately 1/3. This significantly reduces the computational complexity, especially for long codes. Also, a more efficient criterion for stopping the decoding process is derived based on the knowledge of the algebraic decoding solution.

  12. The importance of the exposure metric in air pollution epidemiology studies: When does it matter, and why?

    EPA Science Inventory

    Exposure error in ambient air pollution epidemiologic studies may introduce bias and/or attenuation of the health risk estimate, reduce statistical significance, and lower statistical power. Alternative exposure metrics are increasingly being used in place of central-site measure...

  13. Federal Government Information Systems Security Management and Governance Are Pacing Factors for Innovation

    ERIC Educational Resources Information Center

    Edwards, Gregory

    2011-01-01

    Security incidents resulting from human error or subversive actions have caused major financial losses, reduced business productivity or efficiency, and threatened national security. Some research suggests that information system security frameworks lack emphasis on human involvement as a significant cause for security problems in a rapidly…

  14. Landing Technique Improvements After an Aquatic-Based Neuromuscular Training Program in Physically Active Women.

    PubMed

    Scarneo, Samantha E; Root, Hayley J; Martinez, Jessica C; Denegar, Craig; Casa, Douglas J; Mazerolle, Stephanie M; Dann, Catie L; Aerni, Giselle A; DiStefano, Lindsay J

    2017-01-01

    Neuromuscular training programs (NTPs) improve landing technique and decrease vertical ground-reaction forces (VGRFs), resulting in injury-risk reduction. NTPs in an aquatic environment may elicit the same improvements as land-based programs with reduced joint stress. To examine the effects of an aquatic NTP on landing technique as measured by the Landing Error Scoring System (LESS) and VGRFs, immediately and 4 mo after the intervention. Repeated measures, pool and laboratory. Fifteen healthy, recreationally active women (age 21 ± 2 y, mass 62.02 ± 8.18 kg, height 164.74 ± 5.97 cm) who demonstrated poor landing technique (LESS-Real Time > 4). All participants completed an aquatic NTP 3 times/wk for 6 wk. Participants' landing technique was evaluated using a jump-landing task immediately before (PRE), immediately after (POST), and 4 mo after (RET) the intervention period. A single rater, blinded to time point, graded all videos using the LESS, which is a valid and reliable movement-screening tool. Peak VGRFs were measured during the stance phase of the jump-landing test. Repeated-measure analyses of variance with planned comparisons were performed to explore differences between time points. LESS scores were lower at POST (4.46 ± 1.69 errors) and at RET (4.2 ± 1.72 errors) than at PRE (6.30 ± 1.78 errors) (P < .01). No significant differences were observed between POST and RET (P > .05). Participants also landed with significantly lower peak VGRFs (P < .01) from PRE (2.69 ± .72 N) to POST (2.23 ± .66 N). The findings introduce evidence that an aquatic NTP improves landing technique and suggest that improvements are retained over time. These results show promise of using an aquatic NTP when there is a desire to reduce joint loading, such as early stages of rehabilitation, to improve biomechanics and reduce injury risk.

  15. Research on the output bit error rate of 2DPSK signal based on stochastic resonance theory

    NASA Astrophysics Data System (ADS)

    Yan, Daqin; Wang, Fuzhong; Wang, Shuo

    2017-12-01

    Binary differential phase-shift keying (2DPSK) signals are mainly used for high-speed data transmission. However, the bit error rate of a digital receiver is high in poor channel environments. To address this, a novel method based on stochastic resonance (SR) is proposed, aimed at reducing the bit error rate of coherently demodulated 2DPSK signals. Based on SR theory, a nonlinear receiver model is established to receive 2DPSK signals at small signal-to-noise ratios (SNR, between -15 dB and 5 dB) and is compared with the conventional demodulation method. The experimental results demonstrate that for input SNRs in the range of -15 dB to 5 dB, the output bit error rate of the SR-based nonlinear model declines significantly compared to the conventional model, falling by 86.15% when the input SNR equals -7 dB. Meanwhile, the peak value of the output signal spectrum is 4.25 times that of the conventional model. Consequently, the output signal is more easily detected and accuracy can be greatly improved.
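The canonical stochastic-resonance element behind such receivers is the overdamped bistable system dx/dt = a·x − b·x³ + u(t) + noise, in which a weak sub-threshold input, aided by noise, drives hopping between the two wells. The sketch below is a generic Euler–Maruyama integration with illustrative parameters, not the paper's actual 2DPSK receiver:

```python
import numpy as np

def bistable_sr(u, dt=0.01, a=1.0, b=1.0, D=0.3, seed=0):
    """Integrate dx/dt = a*x - b*x**3 + u(t) + sqrt(2*D)*xi(t)
    (Euler-Maruyama). Parameters are illustrative only."""
    rng = np.random.default_rng(seed)
    x = np.zeros(len(u))
    for i in range(1, len(u)):
        drift = a * x[i - 1] - b * x[i - 1] ** 3 + u[i - 1]
        x[i] = x[i - 1] + drift * dt + np.sqrt(2 * D * dt) * rng.normal()
    return x

# Weak binary input: too small to flip the state on its own, but noise
# can carry the state over the barrier, which is the SR mechanism.
t = np.arange(0, 50, 0.01)
u = 0.25 * np.sign(np.sin(2 * np.pi * 0.2 * t))
x = bistable_sr(u)
```

Demodulation would then threshold the well the state occupies; the paper's contribution is tuning such a nonlinearity so that channel noise helps rather than hurts detection.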

  16. Accelerated Compressed Sensing Based CT Image Reconstruction.

    PubMed

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R; Paul, Narinder S; Cobbold, Richard S C

    2015-01-01

    In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier-based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.
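The weighted-CS formulation can be illustrated with a tiny iterative soft-thresholding (ISTA) sketch, where per-measurement weights stand in for the statistically derived weights in the paper. All sizes, weights, and data below are invented; this is not the authors' reconstruction code:

```python
import numpy as np

def weighted_ista(A, y, w, lam=0.01, iters=500):
    """Sketch: minimize 0.5*||diag(w)(Ax - y)||^2 + lam*||x||_1 by ISTA.
    A larger weight w[i] means measurement i is trusted more
    (smaller assumed noise / interpolation error)."""
    W2 = w ** 2
    L = np.linalg.norm((A.T * W2) @ A, 2)     # Lipschitz constant of the gradient
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = (A.T * W2) @ (A @ x - y)       # gradient of the weighted data term
        x = x - step * grad
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(2)
A = rng.normal(size=(20, 30))                 # underdetermined sensing matrix
x_true = np.zeros(30)
x_true[[3, 11, 25]] = [1.0, -2.0, 1.5]        # sparse ground truth
y = A @ x_true
x_hat = weighted_ista(A, y, w=np.ones(20))
```

In the paper the sparsifying transform, the pseudopolar Radon operator, and the MAP-derived weights replace the generic `A`, identity sparsity, and uniform `w` used here.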

  17. Accelerated Compressed Sensing Based CT Image Reconstruction

    PubMed Central

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R.; Paul, Narinder S.; Cobbold, Richard S. C.

    2015-01-01

    In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier-based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization. PMID:26167200

  18. Refining Field Measurements of Methane Flux Rates from Abandoned Oil and Gas Wells

    NASA Astrophysics Data System (ADS)

    Lagron, C. S.; Kang, M.; Riqueros, N. S.; Jackson, R. B.

    2015-12-01

    Recent studies in Pennsylvania demonstrate the potential for significant methane emissions from abandoned oil and gas wells. A subset of tested wells was high emitting, with methane flux rates up to seven orders of magnitude greater than natural fluxes (up to 10⁵ mg CH4/hour, or about 2.5 LPM). These wells contribute disproportionately to the total methane emissions from abandoned oil and gas wells. The principles guiding the chamber design have been developed for lower flux rates, typically found in natural environments, and chamber design modifications may reduce uncertainty in flux rates associated with high-emitting wells. Kang et al. estimate errors of a factor of two in measured values based on previous studies. We conduct controlled releases of methane to refine error estimates and improve chamber design with a focus on high-emitters. Controlled releases of methane are conducted at 0.05 LPM, 0.50 LPM, 1.0 LPM, 2.0 LPM, 3.0 LPM, and 5.0 LPM, and at two chamber dimensions typically used in field measurements studies of abandoned wells. As most sources of error tabulated by Kang et al. tend to bias the results toward underreporting of methane emissions, a flux-targeted chamber design modification can reduce error margins and/or provide grounds for a potential upward revision of emission estimates.

  19. Improved Conflict Detection for Reducing Operational Errors in Air Traffic Control

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Hainz

    2003-01-01

    An operational error is an incident in which an air traffic controller allows the separation between two aircraft to fall below the minimum separation standard. The rates of such errors in the US have increased significantly over the past few years. This paper proposes new detection methods that can help correct this trend by improving on the performance of Conflict Alert, the existing software in the Host Computer System that is intended to detect and warn controllers of imminent conflicts. In addition to the usual trajectory based on the flight plan, a "dead-reckoning" trajectory (current velocity projection) is also generated for each aircraft and checked for conflicts. Filters for reducing common types of false alerts were implemented. The new detection methods were tested in three different ways. First, a simple flightpath command language was developed to generate precisely controlled encounters for the purpose of testing the detection software. Second, written reports and tracking data were obtained for actual operational errors that occurred in the field, and these were "replayed" to test the new detection algorithms. Finally, the detection methods were used to shadow live traffic, and performance was analysed, particularly with regard to the false-alert rate. The results indicate that the new detection methods can provide timely warnings of imminent conflicts more consistently than Conflict Alert.
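The dead-reckoning check described above can be sketched as a straight-line projection of both aircraft along their current velocities, flagged when horizontal and vertical separation minima are violated simultaneously. The minima, units, and sampling step below are illustrative; this is not the Host Computer System code:

```python
import numpy as np

def dead_reckoning_conflict(p1, v1, p2, v2, lookahead=300.0, step=5.0,
                            h_min=5.0, v_min=1000.0):
    """Flag a conflict if, at any sampled time within `lookahead` seconds,
    horizontal separation < h_min (nmi) while vertical separation < v_min (ft).
    p = (x_nmi, y_nmi, alt_ft); v = (vx_nmi_per_s, vy_nmi_per_s, vz_ft_per_s)."""
    for t in np.arange(0.0, lookahead + step, step):
        a = np.asarray(p1, float) + t * np.asarray(v1, float)
        b = np.asarray(p2, float) + t * np.asarray(v2, float)
        horiz = np.hypot(a[0] - b[0], a[1] - b[1])
        vert = abs(a[2] - b[2])
        if horiz < h_min and vert < v_min:
            return True, t          # conflict, with time of first violation
    return False, None
```

In the paper this projection runs alongside the flight-plan trajectory, with filters suppressing the false-alert patterns common to each.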

  20. In-hospital fellow coverage reduces communication errors in the surgical intensive care unit.

    PubMed

    Williams, Mallory; Alban, Rodrigo F; Hardy, James P; Oxman, David A; Garcia, Edward R; Hevelone, Nathanael; Frendl, Gyorgy; Rogers, Selwyn O

    2014-06-01

    Staff coverage strategies of intensive care units (ICUs) impact clinical outcomes. High-intensity staff coverage strategies are associated with lower morbidity and mortality. Accessible clinical expertise, teamwork, and effective communication have all been attributed to the success of this coverage strategy. We evaluate the impact of in-hospital fellow coverage (IHFC) on improving communication of cardiorespiratory events. A prospective observational study was performed in an academic tertiary care center with high-intensity staff coverage. The main outcome measure was resident-to-fellow communication of cardiorespiratory events during IHFC vs home coverage (HC) periods. Three hundred twelve cardiorespiratory events were collected in 114 surgical ICU patients in 134 study days. Complete data were available for 306 events. One hundred three communication errors occurred. IHFC was associated with significantly better communication of events compared to HC (P<.0001). Residents communicated 89% of events during IHFC vs 51% of events during HC (P<.001). Communication patterns of junior and midlevel residents were similar. Midlevel residents communicated 68% of all on-call events (87% IHFC vs 50% HC, P<.001). Junior residents communicated 66% of events (94% IHFC vs 52% HC, P<.001). Communication errors were lower in all ICUs during IHFC (P<.001). IHFC reduced communication errors. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Methods and apparatus for reducing peak wind turbine loads

    DOEpatents

    Moroz, Emilian Mieczyslaw

    2007-02-13

    A method for reducing peak loads of wind turbines in a changing wind environment includes measuring or estimating an instantaneous wind speed and direction at the wind turbine and determining a yaw error of the wind turbine relative to the measured instantaneous wind direction. The method further includes comparing the yaw error to a yaw error trigger that has different values at different wind speeds and shutting down the wind turbine when the yaw error exceeds the yaw error trigger corresponding to the measured or estimated instantaneous wind speed.
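
    The shutdown logic described in the patent abstract can be sketched as a lookup of allowable yaw error against wind speed, shutting down when the measured yaw error exceeds the speed-dependent trigger. The table values below are hypothetical, since the patent abstract publishes no specific thresholds:

```python
# Hypothetical yaw-error trigger table: (upper wind speed bound in m/s,
# allowable yaw error in degrees). Tighter tolerance at higher wind
# speeds, as the speed-dependent trigger implies; values are invented.
TRIGGER_TABLE = [(10.0, 45.0), (15.0, 30.0), (20.0, 20.0), (float("inf"), 10.0)]

def yaw_error_trigger(wind_speed_ms):
    """Allowable yaw error for the measured instantaneous wind speed."""
    for upper_bound, max_error_deg in TRIGGER_TABLE:
        if wind_speed_ms <= upper_bound:
            return max_error_deg

def should_shut_down(wind_speed_ms, wind_dir_deg, nacelle_dir_deg):
    """Shut the turbine down when yaw error exceeds the trigger."""
    # Signed angular difference wrapped to [-180, 180), then magnitude
    yaw_error = abs((wind_dir_deg - nacelle_dir_deg + 180.0) % 360.0 - 180.0)
    return yaw_error > yaw_error_trigger(wind_speed_ms)
```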

  2. Simulation: learning from mistakes while building communication and teamwork.

    PubMed

    Kuehster, Christina R; Hall, Carla D

    2010-01-01

    Medical errors are one of the leading causes of death annually in the United States. Many of these errors are related to poor communication and/or lack of teamwork. Using simulation as a teaching modality provides a dual role in helping to reduce these errors. Thorough integration of clinical practice with teamwork and communication in a safe environment increases the likelihood of reducing the error rates in medicine. Allowing practitioners to make potential errors in a safe environment such as simulation provides valuable lessons that improve retention, and such errors are then rarely repeated in practice.

  3. Does Educator Training or Experience Affect the Quality of Multiple-Choice Questions?

    PubMed

    Webb, Emily M; Phuong, Jonathan S; Naeger, David M

    2015-10-01

    Physicians receive little training on proper multiple-choice question (MCQ) writing methods. Well-constructed MCQs follow rules, which ensure that a question tests what it is intended to test. Questions that break these rules are described as "flawed." We examined whether the prevalence of flawed questions differed significantly between those with or without prior training in question writing and between those with different levels of educator experience. We assessed 200 unedited MCQs from a question bank for our senior medical student radiology elective: an equal number of questions (50) were written by faculty with previous training in MCQ writing, other faculty, residents, and medical students. Questions were scored independently by two readers for the presence of 11 distinct flaws described in the literature. Questions written by faculty with MCQ writing training had significantly fewer errors: mean 0.4 errors per question compared to a mean of 1.5-1.7 errors per question for the other groups (P < .001). There were no significant differences in the total number of errors between the untrained faculty, residents, and students (P values .35-.91). Among trained faculty, 17/50 questions (34%) were flawed, whereas other faculty wrote 38/50 (76%) flawed questions, residents 37/50 (74%), and students 44/50 (88%). Trained question writers' higher performance was mainly manifest in the reduced frequency of five specific errors. Faculty with training in effective MCQ writing made fewer errors in MCQ construction. Educator experience alone had no effect on the frequency of flaws; faculty without dedicated training, residents, and students performed similarly. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  4. Performance analysis of an integrated GPS/inertial attitude determination system. M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Sullivan, Wendy I.

    1994-01-01

    The performance of an integrated GPS/inertial attitude determination system is investigated using a linear covariance analysis. The principles of GPS interferometry are reviewed, and the major error sources of both interferometers and gyroscopes are discussed and modeled. A new figure of merit, attitude dilution of precision (ADOP), is defined for two possible GPS attitude determination methods, namely single difference and double difference interferometry. Based on this figure of merit, a satellite selection scheme is proposed. The performance of the integrated GPS/inertial attitude determination system is determined using a linear covariance analysis. Based on this analysis, it is concluded that the baseline errors (i.e., knowledge of the GPS interferometer baseline relative to the vehicle coordinate system) are the limiting factor in system performance. By reducing baseline errors, it should be possible to use lower quality gyroscopes without significantly reducing performance. For the cases considered, single difference interferometry is only marginally better than double difference interferometry. Finally, the performance of the system is found to be relatively insensitive to the satellite selection technique.

  5. Paediatric in-patient prescribing errors in Malaysia: a cross-sectional multicentre study.

    PubMed

    Khoo, Teik Beng; Tan, Jing Wen; Ng, Hoong Phak; Choo, Chong Ming; Bt Abdul Shukor, Intan Nor Chahaya; Teh, Siao Hean

    2017-06-01

    Background There is a lack of large comprehensive studies in developing countries on paediatric in-patient prescribing errors in different settings. Objectives To determine the characteristics of in-patient prescribing errors among paediatric patients. Setting General paediatric wards, neonatal intensive care units and paediatric intensive care units in government hospitals in Malaysia. Methods This is a cross-sectional multicentre study involving 17 participating hospitals. Drug charts were reviewed in each ward to identify the prescribing errors. All prescribing errors identified were further assessed for their potential clinical consequences, likely causes and contributing factors. Main outcome measures Incidence, types, potential clinical consequences, causes and contributing factors of the prescribing errors. Results The overall prescribing error rate was 9.2% out of 17,889 prescribed medications. There was no significant difference in the prescribing error rates between different types of hospitals or wards. The use of electronic prescribing had a higher prescribing error rate than manual prescribing (16.9 vs 8.2%, p < 0.05). Twenty-eight (1.7%) prescribing errors were deemed to have serious potential clinical consequences and 2 (0.1%) were judged to be potentially fatal. Most of the errors were attributed to human factors, i.e. performance or knowledge deficits. The most common contributing factors were lack of supervision and lack of knowledge. Conclusions Although electronic prescribing may potentially improve safety, it may conversely cause prescribing errors due to suboptimal interfaces and cumbersome work processes. Junior doctors need specific training in paediatric prescribing and close supervision to reduce prescribing errors in paediatric in-patients.

  6. Commentary: Reducing diagnostic errors: another role for checklists?

    PubMed

    Winters, Bradford D; Aswani, Monica S; Pronovost, Peter J

    2011-03-01

    Diagnostic errors are a widespread problem, although the true magnitude is unknown because they cannot currently be measured validly. These errors have received relatively little attention despite alarming estimates of associated harm and death. One promising intervention to reduce preventable harm is the checklist. This intervention has proven successful in aviation, in which situations are linear and deterministic (one alarm goes off and a checklist guides the flight crew to evaluate the cause). In health care, problems are multifactorial and complex. A checklist has been used to reduce central-line-associated bloodstream infections in intensive care units. Nevertheless, this checklist was incorporated in a culture-based safety program that engaged and changed behaviors and used robust measurement of infections to evaluate progress. In this issue, Ely and colleagues describe how three checklists could reduce the cognitive biases and mental shortcuts that underlie diagnostic errors, but point out that these tools still need to be tested. To be effective, they must reduce diagnostic errors (efficacy) and be routinely used in practice (effectiveness). Such tools must intuitively support how the human brain works, and under time pressures, clinicians rarely think in conditional probabilities when making decisions. To move forward, it is necessary to accurately measure diagnostic errors (which could come from mapping out the diagnostic process as the medication process has done and measuring errors at each step) and pilot test interventions such as these checklists to determine whether they work.

  7. A spectrally tunable solid-state source for radiometric, photometric, and colorimetric applications

    NASA Astrophysics Data System (ADS)

    Fryc, Irena; Brown, Steven W.; Eppeldauer, George P.; Ohno, Yoshihiro

    2004-10-01

    A spectrally tunable light source using a large number of LEDs and an integrating sphere has been designed and is being developed at NIST. The source is designed to be capable of producing spectral distributions that mimic various light sources in the visible region by feedback control of individual LEDs. The output spectral irradiance or radiance of the source will be calibrated by a reference instrument, and the source will be used as a spectroradiometric as well as photometric and colorimetric standard. The use of the tunable source mimicking spectra of display colors, for example, rather than a traditional incandescent standard lamp for calibration of colorimeters, can significantly reduce the spectral mismatch errors of a colorimeter measuring displays. A series of simulations has been conducted to predict the performance of the designed tunable source when used for calibration of colorimeters. The results indicate that the errors can be reduced by an order of magnitude compared with those when the colorimeters are calibrated against Illuminant A. Stray light errors of a spectroradiometer can also be effectively reduced by using the tunable source producing a blackbody spectrum at higher temperature (e.g., 9000 K). The source can also approximate various CIE daylight illuminants and common lamp spectral distributions for other photometric and colorimetric applications.

  8. Calibrating First-Order Strong Lensing Mass Estimates in Clusters of Galaxies

    NASA Astrophysics Data System (ADS)

    Reed, Brendan; Remolian, Juan; Sharon, Keren; Li, Nan; SPT Clusters Collaboration

    2018-01-01

    We investigate methods to reduce the statistical and systematic errors inherent to using the Einstein Radius as a first-order mass estimate in strong lensing galaxy clusters. By finding an empirical universal calibration function, we aim to enable a first-order mass estimate of large cluster data sets in a fraction of the time and effort of full-scale strong lensing mass modeling. We use 74 simulated clusters from the Argonne National Laboratory in a lens redshift slice of [0.159, 0.667] with various source redshifts in the range of [1.23, 2.69]. From the simulated density maps, we calculate the exact mass enclosed within the Einstein Radius. We find that the mass inferred from the Einstein Radius alone produces an error width of ~39% with respect to the true mass. We explore an array of polynomial and exponential correction functions with dependence on cluster redshift and projected radii of the lensed images, aiming to reduce the statistical and systematic uncertainty. We find that the error on the mass inferred from the Einstein Radius can be reduced significantly by using a universal correction function. Our study has implications for current and future large galaxy cluster surveys aiming to measure cluster mass, and the mass-concentration relation.

  9. Cognitive tests predict real-world errors: the relationship between drug name confusion rates in laboratory-based memory and perception tests and corresponding error rates in large pharmacy chains.

    PubMed

    Schroeder, Scott R; Salomon, Meghan M; Galanter, William L; Schiff, Gordon D; Vaida, Allen J; Gaunt, Michael J; Bryson, Michelle L; Rash, Christine; Falck, Suzanne; Lambert, Bruce L

    2017-05-01

    Drug name confusion is a common type of medication error and a persistent threat to patient safety. In the USA, roughly one per thousand prescriptions results in the wrong drug being filled, and most of these errors involve drug names that look or sound alike. Prior to approval, drug names undergo a variety of tests to assess their potential for confusability, but none of these preapproval tests has been shown to predict real-world error rates. We conducted a study to assess the association between error rates in laboratory-based tests of drug name memory and perception and real-world drug name confusion error rates. Eighty participants, comprising doctors, nurses, pharmacists, technicians and lay people, completed a battery of laboratory tests assessing visual perception, auditory perception and short-term memory of look-alike and sound-alike drug name pairs (eg, hydroxyzine/hydralazine). Laboratory test error rates (and other metrics) significantly predicted real-world error rates obtained from a large, outpatient pharmacy chain, with the best-fitting model accounting for 37% of the variance in real-world error rates. Cross-validation analyses confirmed these results, showing that the laboratory tests also predicted errors from a second pharmacy chain, with 45% of the variance being explained by the laboratory test data. Across two distinct pharmacy chains, there is a strong and significant association between drug name confusion error rates observed in the real world and those observed in laboratory-based tests of memory and perception. Regulators and drug companies seeking a validated preapproval method for identifying confusing drug names ought to consider using these simple tests. By using a standard battery of memory and perception tests, it should be possible to reduce the number of confusing look-alike and sound-alike drug name pairs that reach the market, which will help protect patients from potentially harmful medication errors. 
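
    The core analysis is an ordinary least-squares fit of real-world error rates on laboratory error rates, with R^2 quantifying the variance explained. A sketch with synthetic illustrative data (the rates below are invented, not the study's measurements):

```python
import numpy as np

# Synthetic illustrative data: laboratory confusion rates (proportion of
# trials) and real-world wrong-drug rates for ten name pairs.
rng = np.random.default_rng(0)
lab = np.array([0.05, 0.12, 0.30, 0.22, 0.08, 0.40, 0.15, 0.27, 0.10, 0.35])
real = 0.002 * lab + 0.0001 + rng.normal(0.0, 5e-5, lab.size)

# Ordinary least-squares fit and variance explained (R^2)
slope, intercept = np.polyfit(lab, real, 1)
pred = slope * lab + intercept
r2 = 1.0 - np.sum((real - pred) ** 2) / np.sum((real - real.mean()) ** 2)
```

In the study's cross-validation, the same kind of model fitted on one pharmacy chain's data was evaluated on a second chain's error rates.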
Published by the BMJ Publishing Group Limited.

  10. Using Automated Writing Evaluation to Reduce Grammar Errors in Writing

    ERIC Educational Resources Information Center

    Liao, Hui-Chuan

    2016-01-01

    Despite the recent development of automated writing evaluation (AWE) technology and the growing interest in applying this technology to language classrooms, few studies have looked at the effects of using AWE on reducing grammatical errors in L2 writing. This study identified the primary English grammatical error types made by 66 Taiwanese…

  11. Reduction in chemotherapy order errors with computerised physician order entry and clinical decision support systems.

    PubMed

    Aziz, Muhammad Tahir; Ur-Rehman, Tofeeq; Qureshi, Sadia; Bukhari, Nadeem Irfan

    Medication errors in chemotherapy are frequent and lead to patient morbidity and mortality, as well as increased rates of re-admission and length of stay, and considerable extra costs. Objective: This study investigated the proposition that computerised chemotherapy ordering reduces the incidence and severity of chemotherapy protocol errors. A computerised physician order entry of chemotherapy order (C-CO) with clinical decision support system was developed in-house, including standardised chemotherapy protocol definitions, automation of pharmacy distribution, clinical checks, labeling and invoicing. A prospective study was then conducted comparing the C-CO with a paper-based chemotherapy order (P-CO) in a 30-bed chemotherapy bay of a tertiary hospital. Both C-CO and P-CO orders, including pharmacoeconomic analysis and the severity of medication errors, were checked and validated by a clinical pharmacist. A group analysis and field trial were also conducted to assess clarity, feasibility and decision making. The C-CO was very usable in terms of its clarity and feasibility. The incidence of medication errors was significantly lower in the C-CO compared with the P-CO (10/3765 [0.26%] versus 134/5514 [2.4%]). There was also a reduction in dispensing time of chemotherapy protocols in the C-CO. The chemotherapy computerisation with clinical decision support system resulted in a significant decrease in the occurrence and severity of medication errors, improvements in chemotherapy dispensing and administration times, and reduction of chemotherapy cost.

  12. Using Six Sigma to reduce medication errors in a home-delivery pharmacy service.

    PubMed

    Castle, Lon; Franzblau-Isaac, Ellen; Paulsen, Jim

    2005-06-01

    Medco Health Solutions, Inc. conducted a project to reduce medication errors in its home-delivery service, which is composed of eight prescription-processing pharmacies, three dispensing pharmacies, and six call-center pharmacies. Medco uses the Six Sigma methodology to reduce process variation, establish procedures to monitor the effectiveness of medication safety programs, and determine when these efforts do not achieve performance goals. A team reviewed the processes in home-delivery pharmacy and suggested strategies to improve the data-collection and medication-dispensing practices. A variety of improvement activities were implemented, including a procedure for developing, reviewing, and enhancing sound-alike/look-alike (SALA) alerts and system enhancements to improve processing consistency across the pharmacies. "External nonconformances" were reduced for several categories of medication errors, including wrong-drug selection (33%), wrong directions (49%), and SALA errors (69%). Control charts demonstrated evidence of sustained process improvement and actual reduction in specific medication error elements. Establishing a continuous quality improvement process to ensure that medication errors are minimized is critical to any health care organization providing medication services.

  13. The impact of command signal power distribution, processing delays, and speed scaling on neurally-controlled devices

    NASA Astrophysics Data System (ADS)

    Marathe, A. R.; Taylor, D. M.

    2015-08-01

    Objective. Decoding algorithms for brain-machine interfacing (BMI) are typically only optimized to reduce the magnitude of decoding errors. Our goal was to systematically quantify how four characteristics of BMI command signals impact closed-loop performance: (1) error magnitude, (2) distribution of different frequency components in the decoding errors, (3) processing delays, and (4) command gain. Approach. To systematically evaluate these different command features and their interactions, we used a closed-loop BMI simulator where human subjects used their own wrist movements to command the motion of a cursor to targets on a computer screen. Random noise with three different power distributions and four different relative magnitudes was added to the ongoing cursor motion in real time to simulate imperfect decoding. These error characteristics were tested with four different visual feedback delays and two velocity gains. Main results. Participants had significantly more trouble correcting for errors with a larger proportion of low-frequency, slow-time-varying components than they did with jittery, higher-frequency errors, even when the error magnitudes were equivalent. When errors were present, a movement delay often increased the time needed to complete the movement by an order of magnitude more than the delay itself. Scaling down the overall speed of the velocity command can actually speed up target acquisition time when low-frequency errors and delays are present. Significance. This study is the first to systematically evaluate how the combination of these four key command signal features (including the relatively unexplored error power distribution) and their interactions impact closed-loop performance independent of any specific decoding method. The equations we derive relating closed-loop movement performance to these command characteristics can provide guidance on how best to balance these different factors when designing BMI systems. 
The equations reported here also provide an efficient way to compare a diverse range of decoding options offline.

  14. Optical vector network analyzer with improved accuracy based on polarization modulation and polarization pulling.

    PubMed

    Li, Wei; Liu, Jian Guo; Zhu, Ning Hua

    2015-04-15

    We report a novel optical vector network analyzer (OVNA) with improved accuracy based on polarization modulation and stimulated Brillouin scattering (SBS) assisted polarization pulling. The beating between adjacent higher-order optical sidebands which are generated because of the nonlinearity of an electro-optic modulator (EOM) introduces considerable error to the OVNA. In our scheme, the measurement error is significantly reduced by removing the even-order optical sidebands using polarization discrimination. The proposed approach is theoretically analyzed and experimentally verified. The experimental results show that the accuracy of the OVNA is greatly improved compared to a conventional OVNA.

  15. Crowded field photometry with deconvolved images.

    NASA Astrophysics Data System (ADS)

    Linde, P.; Spännare, S.

    A local implementation of the Lucy-Richardson algorithm has been used to deconvolve a set of crowded stellar field images. The effects of deconvolution on detection limits as well as on photometric and astrometric properties have been investigated as a function of the number of deconvolution iterations. Results show that deconvolution improves detection of faint stars, although artifacts are also found. Deconvolution makes more stars measurable without significant degradation of positional accuracy. The photometric precision is affected by deconvolution in several ways. Errors due to unresolved images are notably reduced, while flux redistribution between stars and background increases the errors.
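
    The Lucy-Richardson algorithm is a multiplicative update that iteratively refines an estimate of the unblurred scene. A minimal one-dimensional sketch on two blended point sources (hypothetical data; production pipelines such as this paper's local implementation add edge handling and noise treatment):

```python
import numpy as np

def lucy_richardson_1d(signal, psf, n_iter=100):
    """Minimal 1-D Richardson-Lucy deconvolution. Each pass multiplies the
    estimate by the blurred ratio of data to model, preserving positivity.
    Illustrative sketch only."""
    psf = np.asarray(psf, dtype=float)
    psf = psf / psf.sum()
    estimate = np.full(signal.shape, signal.mean(), dtype=float)
    for _ in range(n_iter):
        model = np.convolve(estimate, psf, mode="same")
        ratio = signal / np.maximum(model, 1e-12)
        estimate = estimate * np.convolve(ratio, psf[::-1], mode="same")
    return estimate

# Two blended "stars" blurred by a Gaussian PSF (hypothetical data)
psf = np.exp(-0.5 * (np.arange(-6, 7) / 2.0) ** 2)
truth = np.zeros(64)
truth[30], truth[34] = 1.0, 0.6
blurred = np.convolve(truth, psf / psf.sum(), mode="same")
restored = lucy_richardson_1d(blurred, psf, n_iter=100)
```

After iteration, flux is concentrated back toward the point sources, which is how blended images become separately measurable.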

  16. Extension of similarity test procedures to cooled engine components with insulating ceramic coatings

    NASA Technical Reports Server (NTRS)

    Gladden, H. J.

    1980-01-01

    Material thermal conductivity was analyzed for its effect on the thermal performance of air cooled gas turbine components, both with and without a ceramic thermal-barrier material, tested at reduced temperatures and pressures. The analysis shows that neglecting the material thermal conductivity can contribute significant errors when metal-wall-temperature test data taken on a turbine vane are extrapolated to engine conditions. This error in metal temperature for an uncoated vane is of opposite sign from that for a ceramic-coated vane. A correction technique is developed for both ceramic-coated and uncoated components.

  17. Application of a Noise Adaptive Contrast Sensitivity Function to Image Data Compression

    NASA Astrophysics Data System (ADS)

    Daly, Scott J.

    1989-08-01

    The visual contrast sensitivity function (CSF) has found increasing use in image compression as new algorithms optimize the display-observer interface in order to reduce the bit rate and increase the perceived image quality. In most compression algorithms, increasing the quantization intervals reduces the bit rate at the expense of introducing more quantization error, a potential image quality degradation. The CSF can be used to distribute this error as a function of spatial frequency such that it is undetectable by the human observer. Thus, instead of being mathematically lossless, the compression algorithm can be designed to be visually lossless, with the advantage of a significantly reduced bit rate. However, the CSF is strongly affected by image noise, changing in both shape and peak sensitivity. This work describes a model of the CSF that includes these changes as a function of image noise level by using the concepts of internal visual noise, and tests this model in the context of image compression with an observer study.

  18. Reducing patient identification errors related to glucose point-of-care testing.

    PubMed

    Alreja, Gaurav; Setia, Namrata; Nichols, James; Pantanowitz, Liron

    2011-01-01

    Patient identification (ID) errors in point-of-care testing (POCT) can cause test results to be transferred to the wrong patient's chart or prevent results from being transmitted and reported. Despite the implementation of patient barcoding and ongoing operator training at our institution, patient ID errors still occur with glucose POCT. The aim of this study was to develop a solution to reduce identification errors with POCT. Glucose POCT was performed by approximately 2,400 clinical operators throughout our health system. Patients are identified by scanning in wristband barcodes or by manual data entry using portable glucose meters. Meters are docked to upload data to a database server which then transmits data to any medical record matching the financial number of the test result. With a new model, meters connect to an interface manager where the patient ID (a nine-digit account number) is checked against patient registration data from admission, discharge, and transfer (ADT) feeds and only matched results are transferred to the patient's electronic medical record. With the new process, the patient ID is checked prior to testing, and testing is prevented until ID errors are resolved. When averaged over a period of a month, ID errors were reduced to 3 errors/month (0.015%) in comparison with 61.5 errors/month (0.319%) before implementing the new meters. Patient ID errors may occur with glucose POCT despite patient barcoding. The verification of patient identification should ideally take place at the bedside before testing occurs so that the errors can be addressed in real time. The introduction of an ADT feed directly to glucose meters reduced patient ID errors in POCT.
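
    The interface-manager check can be sketched as a simple gate: a result is filed only when its nine-digit account number is well formed and matches the ADT registration feed; otherwise it is held for resolution. The feed contents and function below are hypothetical illustrations, not the vendor's implementation:

```python
# Hypothetical ADT (admission/discharge/transfer) registration feed,
# keyed by the nine-digit account number; names are invented.
adt_feed = {
    "123456789": "DOE, JANE",
    "987654321": "ROE, RICHARD",
}

def release_result(patient_id, glucose_mg_dl):
    """Gate a glucose result: file it only on a verified ADT match,
    otherwise hold it so the ID error can be resolved in real time."""
    if len(patient_id) != 9 or not patient_id.isdigit():
        return ("held", "malformed ID")
    if patient_id not in adt_feed:
        return ("held", "no ADT match")
    return ("filed", (adt_feed[patient_id], glucose_mg_dl))
```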

  20. Awareness of deficits and error processing after traumatic brain injury.

    PubMed

    Larson, Michael J; Perlstein, William M

    2009-10-28

    Severe traumatic brain injury is frequently associated with alterations in performance monitoring, including reduced awareness of physical and cognitive deficits. We examined the relationship between awareness of deficits and electrophysiological indices of performance monitoring, including the error-related negativity and post-error positivity (Pe) components of the scalp-recorded event-related potential, in 16 traumatic brain injury survivors who completed a Stroop color-naming task while event-related potential measurements were recorded. Awareness of deficits was measured as the discrepancy between patient and significant-other ratings on the Frontal Systems Behavior Scale. The amplitude of the Pe, but not the error-related negativity, was reliably associated with decreased awareness of deficits. Results indicate that Pe amplitude may serve as an electrophysiological indicator of awareness of abilities and deficits.

  1. Accurate characterisation of hole size and location by projected fringe profilometry

    NASA Astrophysics Data System (ADS)

    Wu, Yuxiang; Dantanarayana, Harshana G.; Yue, Huimin; Huntley, Jonathan M.

    2018-06-01

    The ability to accurately estimate the location and geometry of holes is often required in the field of quality control and automated assembly. Projected fringe profilometry is a potentially attractive technique on account of being non-contacting, of lower cost, and orders of magnitude faster than the traditional coordinate measuring machine. However, we demonstrate in this paper that fringe projection is susceptible to significant (hundreds of µm) measurement artefacts in the neighbourhood of hole edges, which give rise to errors of a similar magnitude in the estimated hole geometry. A mechanism for the phenomenon is identified based on the finite size of the imaging system’s point spread function and the resulting bias produced near to sample discontinuities in geometry and reflectivity. A mathematical model is proposed, from which a post-processing compensation algorithm is developed to suppress such errors around the holes. The algorithm includes a robust and accurate sub-pixel edge detection method based on a Fourier descriptor of the hole contour. The proposed algorithm was found to significantly reduce the measurement artefacts near the hole edges. As a result, the errors in estimated hole radius were reduced by up to one order of magnitude, to a few tens of µm for hole radii in the range 2–15 mm, compared to those from the uncompensated measurements.
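
    One way to realise a Fourier-descriptor representation of a hole contour, in the spirit of the sub-pixel edge detection described above, is to keep only the low-order harmonics of the complex edge coordinates, yielding a smooth sub-pixel boundary estimate. A generic sketch (the paper's exact descriptor and harmonic count may differ):

```python
import numpy as np

def fourier_contour(points, n_harmonics=8, n_samples=200):
    """Reconstruct a smooth closed contour from ordered edge points by
    keeping only low-order harmonics of the complex coordinates x + iy.
    Generic illustration of a Fourier descriptor, not the paper's method."""
    z = points[:, 0] + 1j * points[:, 1]
    n = len(z)
    coeffs = np.fft.fft(z) / n
    freqs = np.fft.fftfreq(n, d=1.0 / n)  # integer harmonic numbers
    keep = np.abs(freqs) <= n_harmonics
    t = np.linspace(0.0, 1.0, n_samples, endpoint=False)
    recon = np.zeros(n_samples, dtype=complex)
    for c, f in zip(coeffs[keep], freqs[keep]):
        recon += c * np.exp(2j * np.pi * f * t)
    return np.column_stack([recon.real, recon.imag])

# Edge points of a hypothetical circular hole, radius 5, centre (2, 1)
theta = 2.0 * np.pi * np.arange(100) / 100
edge = np.column_stack([2.0 + 5.0 * np.cos(theta), 1.0 + 5.0 * np.sin(theta)])
smooth = fourier_contour(edge)
```

A radius estimate then follows from the distances of the smoothed contour points to their centroid.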

  2. Reducing visual deficits caused by refractive errors in school and preschool children: results of a pilot school program in the Andean region of Apurimac, Peru.

    PubMed

    Latorre-Arteaga, Sergio; Gil-González, Diana; Enciso, Olga; Phelan, Aoife; García-Muñoz, Angel; Kohler, Johannes

    2014-01-01

    Refractive error is defined as the inability of the eye to bring parallel rays of light into focus on the retina, resulting in nearsightedness (myopia), farsightedness (hyperopia) or astigmatism. Uncorrected refractive error in children is associated with increased morbidity and reduced educational opportunities. Vision screening (VS) is a method for identifying children with visual impairment or eye conditions likely to lead to visual impairment. To analyze the utility of vision screening conducted by teachers and to contribute to a better estimation of the prevalence of childhood refractive errors in Apurimac, Peru. Design: A pilot vision screening program in preschool (Group I) and elementary school children (Group II) was conducted with the participation of 26 trained teachers. Children whose visual acuity was <6/9 [20/30] (Group I) and ≤6/9 (Group II) in one or both eyes, measured with the Snellen Tumbling E chart at 6 m, were referred for a comprehensive eye exam. Specificity and positive predictive value to detect refractive error were calculated against clinical examination. Program assessment with participants was conducted to evaluate outcomes and procedures. A total sample of 364 children aged 3-11 were screened; 45 children were examined at Centro Oftalmológico Monseñor Enrique Pelach (COMEP) Eye Hospital. Prevalence of refractive error was 6.2% (Group I) and 6.9% (Group II); specificity of teacher vision screening was 95.8% and 93.0%, while positive predictive value was 59.1% and 47.8% for each group, respectively. Aspects highlighted to improve the program included extending training, increasing parental involvement, and helping referred children to attend the hospital. Prevalence of refractive error in children is significant in the region. Vision screening performed by trained teachers is a valid intervention for early detection of refractive error, including screening of preschool children. 
Program sustainability and improvements in education and quality of life resulting from childhood vision screening require further research.

  3. Outpatient Prescribing Errors and the Impact of Computerized Prescribing

    PubMed Central

    Gandhi, Tejal K; Weingart, Saul N; Seger, Andrew C; Borus, Joshua; Burdick, Elisabeth; Poon, Eric G; Leape, Lucian L; Bates, David W

    2005-01-01

    Background Medication errors are common among inpatients and many are preventable with computerized prescribing. Relatively little is known about outpatient prescribing errors or the impact of computerized prescribing in this setting. Objective To assess the rates, types, and severity of outpatient prescribing errors and understand the potential impact of computerized prescribing. Design Prospective cohort study in 4 adult primary care practices in Boston using prescription review, patient survey, and chart review to identify medication errors, potential adverse drug events (ADEs) and preventable ADEs. Participants Outpatients over age 18 who received a prescription from 24 participating physicians. Results We screened 1879 prescriptions from 1202 patients, and completed 661 surveys (response rate 55%). Of the prescriptions, 143 (7.6%; 95% confidence interval (CI) 6.4% to 8.8%) contained a prescribing error. Three errors led to preventable ADEs and 62 (43%; 3% of all prescriptions) had potential for patient injury (potential ADEs); 1 was potentially life-threatening (2%) and 15 were serious (24%). Errors in frequency (n=77, 54%) and dose (n=26, 18%) were common. The rates of medication errors and potential ADEs were not significantly different at basic computerized prescribing sites (4.3% vs 11.0%, P=.31; 2.6% vs 4.0%, P=.16) compared to handwritten sites. Advanced checks (including dose and frequency checking) could have prevented 95% of potential ADEs. Conclusions Prescribing errors occurred in 7.6% of outpatient prescriptions and many could have harmed patients. Basic computerized prescribing systems may not be adequate to reduce errors. More advanced systems with dose and frequency checking are likely needed to prevent potentially harmful errors. PMID:16117752
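The "advanced checks (including dose and frequency checking)" that record 3 credits with preventing 95% of potential ADEs can be illustrated with a minimal sketch. The drug names and limits below are hypothetical placeholders, not the study's formulary and not clinical guidance:

```python
# Hypothetical sketch of the kind of dose/frequency check an advanced
# computerized prescribing system might perform. Limits are illustrative only.
FORMULARY_LIMITS = {
    # drug: (max single dose in mg, max doses per day) -- made-up values
    "metformin": (1000, 3),
    "lisinopril": (40, 1),
}

def check_prescription(drug, dose_mg, doses_per_day):
    """Return a list of warnings for a prescribed dose and frequency."""
    limits = FORMULARY_LIMITS.get(drug)
    if limits is None:
        return [f"{drug}: not in formulary, manual review required"]
    max_dose, max_freq = limits
    warnings = []
    if dose_mg > max_dose:
        warnings.append(f"{drug}: dose {dose_mg} mg exceeds max {max_dose} mg")
    if doses_per_day > max_freq:
        warnings.append(f"{drug}: {doses_per_day}x/day exceeds max {max_freq}x/day")
    return warnings
```

A basic system would only validate drug name and format; the dose and frequency comparisons above are what distinguish the "advanced" checks the study recommends.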

  4. Detection and avoidance of errors in computer software

    NASA Technical Reports Server (NTRS)

    Kinsler, Les

    1989-01-01

    The acceptance test errors of a computer software project were examined to determine if the errors could be detected or avoided in earlier phases of development. GROAGSS (Gamma Ray Observatory Attitude Ground Support System) was selected as the software project to be examined. The development of the software followed the standard Flight Dynamics Software Development methods. GROAGSS was developed between August 1985 and April 1989. The project is approximately 250,000 lines of code, of which approximately 43,000 lines are reused from previous projects. GROAGSS had a total of 1715 Change Report Forms (CRFs) submitted during the entire development and testing. These changes contained 936 errors. Of these 936 errors, 374 were found during the acceptance testing. These acceptance test errors were first categorized into methods of avoidance, including: more clearly written requirements; detail review; code reading; structural unit testing; and functional system integration testing. The errors were later broken down in terms of effort to detect and correct, class of error, and probability that the prescribed detection method would be successful. These determinations were based on Software Engineering Laboratory (SEL) documents and interviews with the project programmers. A summary of the results of the categorizations is presented. The number of programming errors at the beginning of acceptance testing can be significantly reduced. The results of the existing development methodology are examined for ways of improvement. A basis is provided for the definition of a new development/testing paradigm. Monitoring of the new scheme will objectively determine its effectiveness in avoiding and detecting errors.

  5. Effects of health care provider work hours and sleep deprivation on safety and performance.

    PubMed

    Lockley, Steven W; Barger, Laura K; Ayas, Najib T; Rothschild, Jeffrey M; Czeisler, Charles A; Landrigan, Christopher P

    2007-11-01

    There has been increasing interest in the impact of resident-physician and nurse work hours on patient safety. The evidence demonstrates that work schedules have a profound effect on providers' sleep and performance, as well as on their safety and that of their patients. Nurses working shifts greater than 12.5 hours are at significantly increased risk of experiencing decreased vigilance on the job, suffering an occupational injury, or making a medical error. Physicians-in-training working traditional > 24-hour on-call shifts are at greatly increased risk of experiencing an occupational sharps injury or a motor vehicle crash on the drive home from work and of making a serious or even fatal medical error. As compared to when working 16-hour shifts, on-call residents have twice as many attentional failures when working overnight and commit 36% more serious medical errors. They also report making 300% more fatigue-related medical errors that lead to a patient's death. The weight of evidence strongly suggests that extended-duration work shifts significantly increase fatigue and impair performance and safety. From the standpoint of both providers and patients, the hours routinely worked by health care providers in the United States are unsafe. To reduce the unacceptably high rate of preventable fatigue-related medical error and injuries among health care workers, the United States must establish and enforce safe work-hour limits.

  6. Relationships between evidence-based practice, quality improvement and clinical error experience of nurses in Korean hospitals.

    PubMed

    Hwang, Jee-In; Park, Hyeoun-Ae

    2015-07-01

    This study investigated individual and work-related factors associated with nurses' perceptions of evidence-based practice (EBP) and quality improvement (QI), and the relationships between evidence-based practice, quality improvement and clinical errors. Understanding the factors affecting evidence-based practice and quality improvement activities and their relationships with clinical errors is important for designing strategies to promote evidence-based practice, quality improvement and patient safety. A cross-sectional survey was conducted with 594 nurses in two Korean teaching hospitals using the Evidence-Based Practice Questionnaire and a quality improvement scale developed in this study. Four hundred and forty-three nurses (74.6%) returned the completed survey. Nurses' ages and educational levels were significantly associated with evidence-based practice scores, whereas age and job position were associated with quality improvement scores. There were positive, moderate correlations between evidence-based practice and quality improvement scores. Nurses who had not made any clinical errors during the past 12 months had significantly higher quality improvement skills scores than those who had. The findings indicated the necessity of educational support regarding evidence-based practice and quality improvement for younger staff nurses who have no master's degree. Enhancing quality improvement skills may reduce clinical errors. Nurse managers should consider the characteristics of their staff when implementing educational and clinical strategies for evidence-based practice and quality improvement. © 2013 John Wiley & Sons Ltd.

  7. Error reduction in EMG signal decomposition

    PubMed Central

    Kline, Joshua C.

    2014-01-01

    Decomposition of the electromyographic (EMG) signal into constituent action potentials and the identification of individual firing instances of each motor unit in the presence of ambient noise are inherently probabilistic processes, whether performed manually or with automated algorithms. Consequently, they are subject to errors. We set out to classify and reduce these errors by analyzing 1,061 motor-unit action-potential trains (MUAPTs), obtained by decomposing surface EMG (sEMG) signals recorded during human voluntary contractions. Decomposition errors were classified into two general categories: location errors representing variability in the temporal localization of each motor-unit firing instance and identification errors consisting of falsely detected or missed firing instances. To mitigate these errors, we developed an error-reduction algorithm that combines multiple decomposition estimates to determine a more probable estimate of motor-unit firing instances with fewer errors. The performance of the algorithm is governed by a trade-off between the yield of MUAPTs obtained above a given accuracy level and the time required to perform the decomposition. When applied to a set of sEMG signals synthesized from real MUAPTs, the identification error was reduced by an average of 1.78%, improving the accuracy to 97.0%, and the location error was reduced by an average of 1.66 ms. The error-reduction algorithm in this study is not limited to any specific decomposition strategy. Rather, we propose it be used for other decomposition methods, especially when analyzing precise motor-unit firing instances, as occurs when measuring synchronization. PMID:25210159
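The central idea of record 7, combining multiple decomposition estimates into a more probable train of motor-unit firing instances, might be sketched as a tolerance-based vote across estimates. The function, tolerance, and voting rule below are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np

def consensus_firings(estimates, tol_ms=2.0, min_votes=2):
    """Combine several firing-instance estimates (lists of firing times in ms)
    into a consensus train: a candidate firing is kept if at least `min_votes`
    estimates contain a firing within `tol_ms` of it. Nearby duplicates are
    collapsed to the earliest time. Sketch only; parameters are assumptions."""
    all_times = np.sort(np.concatenate([np.asarray(e, float) for e in estimates]))
    consensus = []
    for t in all_times:
        votes = sum(
            1 for est in estimates
            if np.any(np.abs(np.asarray(est, float) - t) <= tol_ms)
        )
        # keep only well-supported firings, skipping near-duplicates
        if votes >= min_votes and (not consensus or t - consensus[-1] > tol_ms):
            consensus.append(float(t))
    return consensus
```

Firings reported by only one decomposition run (likely false detections) are dropped, while firings corroborated across runs survive, which mirrors the trade-off the abstract describes between accuracy and yield.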

  8. Internally-generated error signals in monkey frontal eye field during an inferred motion task

    PubMed Central

    Ferrera, Vincent P.; Barborica, Andrei

    2010-01-01

    An internal model for predictive saccades in frontal cortex was investigated by recording neurons in monkey frontal eye field during an inferred motion task. Monkeys were trained to make saccades to the extrapolated position of a small moving target that was rendered temporarily invisible and whose trajectory was altered. On roughly two-thirds of the trials, monkeys made multiple saccades while the target was invisible. Primary saccades were correlated with extrapolated target position. Secondary saccades significantly reduced residual errors resulting from imperfect accuracy of the first saccade. These observations suggest that the second saccade was corrective. As there was no visual feedback, corrective saccades could only be driven by an internally generated error signal. Neuronal activity in the frontal eye field was directionally tuned prior to both primary and secondary saccades. Separate subpopulations of cells encoded either saccade direction or direction error prior to the second saccade. These results suggest that FEF neurons encode the error after the first saccade, as well as the direction of the second saccade. Hence, FEF appears to contribute to detecting and correcting movement errors based on internally generated signals. PMID:20810882

  9. Trait anger in relation to neural and behavioral correlates of response inhibition and error-processing.

    PubMed

    Lievaart, Marien; van der Veen, Frederik M; Huijding, Jorg; Naeije, Lilian; Hovens, Johannes E; Franken, Ingmar H A

    2016-01-01

    Effortful control is considered to be an important factor in explaining individual differences in trait anger. In the current study, we sought to investigate the relation between anger-primed effortful control (i.e., inhibitory control and error-processing) and trait anger using an affective Go/NoGo task. Individuals low (LTA; n=45) and high (HTA; n=49) on trait anger were selected for this study. Behavioral performance (accuracy) and Event-Related Potentials (ERPs; i.e., N2, P3, ERN, Pe) were compared between both groups. Contrary to our predictions, we found no group differences regarding inhibitory control. That is, HTA and LTA individuals made comparable numbers of commission errors on NoGo trials and no significant differences were found on the N2 and P3 amplitudes. With respect to error-processing, we found reduced Pe amplitudes following errors in HTA individuals as compared to LTA individuals, whereas the ERN amplitudes were comparable for both groups. These results indicate that high trait anger individuals show deficits in later stages of error-processing, which may explain the continuation of impulsive behaviors in HTA individuals despite their negative consequences. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Use of machine learning methods to reduce predictive error of groundwater models.

    PubMed

    Xu, Tianfang; Valocchi, Albert J; Choi, Jaesik; Amir, Eyal

    2014-01-01

    Quantitative analyses of groundwater flow and transport typically rely on a physically-based model, which is inherently subject to error. Errors in model structure, parameters, and data lead to both random and systematic error even in the output of a calibrated model. We develop complementary data-driven models (DDMs) to reduce the predictive error of physically-based groundwater models. Two machine learning techniques, the instance-based weighting and support vector regression, are used to build the DDMs. This approach is illustrated using two real-world case studies of the Republican River Compact Administration model and the Spokane Valley-Rathdrum Prairie model. The two groundwater models have different hydrogeologic settings, parameterization, and calibration methods. In the first case study, cluster analysis is introduced for data preprocessing to make the DDMs more robust and computationally efficient. The DDMs reduce the root-mean-square error (RMSE) of the temporal, spatial, and spatiotemporal prediction of piezometric head of the groundwater model by 82%, 60%, and 48%, respectively. In the second case study, the DDMs reduce the RMSE of the temporal prediction of piezometric head of the groundwater model by 77%. It is further demonstrated that the effectiveness of the DDMs depends on the existence and extent of the structure in the error of the physically-based model. © 2013, National GroundWater Association.
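One of the two techniques named in record 10, instance-based weighting, can be sketched as a distance-weighted nearest-neighbour estimate of the physically-based model's residual: the data-driven model learns the calibrated model's error at known points and predicts it at new ones. Everything below (feature layout, inverse-distance weighting, `k`) is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

def knn_residual_correction(X_train, residuals, X_new, k=3):
    """Instance-based (inverse-distance-weighted k-NN) estimate of a
    physically-based model's error at new inputs. Subtracting this
    estimate from the model output is the complementary data-driven
    correction. Features and residuals are assumed precomputed."""
    X_train = np.asarray(X_train, dtype=float)
    residuals = np.asarray(residuals, dtype=float)
    corrections = []
    for x in np.atleast_2d(np.asarray(X_new, dtype=float)):
        d = np.linalg.norm(X_train - x, axis=1)   # distances to training points
        idx = np.argsort(d)[:k]                   # k nearest neighbours
        w = 1.0 / (d[idx] + 1e-9)                 # inverse-distance weights
        corrections.append(np.sum(w * residuals[idx]) / np.sum(w))
    return np.array(corrections)
```

The abstract's key caveat applies directly to this sketch: if the model's error has no spatial or temporal structure (residuals look like white noise in feature space), neighbouring residuals carry no information and the correction averages toward zero.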

  11. Refractive error and visual impairment in private school children in Ghana.

    PubMed

    Kumah, Ben D; Ebri, Anne; Abdul-Kabir, Mohammed; Ahmed, Abdul-Sadik; Koomson, Nana Ya; Aikins, Samual; Aikins, Amos; Amedo, Angela; Lartey, Seth; Naidoo, Kovin

    2013-12-01

    To assess the prevalence of refractive error and visual impairment in private school children in Ghana. A random selection of geographically defined classes in clusters was used to identify a sample of school children aged 12 to 15 years in the Ashanti Region. Children in 60 clusters were enumerated and examined in classrooms. The examination included visual acuity, retinoscopy, autorefraction under cycloplegia, and examination of anterior segment, media, and fundus. For quality assurance, a random sample of children with reduced and normal vision was selected and re-examined independently. A total of 2454 children attending 53 private schools were enumerated, and of these, 2435 (99.2%) were examined. Prevalence of uncorrected, presenting, and best visual acuity of 20/40 or worse in the better eye was 3.7, 3.5, and 0.4%, respectively. Refractive error was the cause of reduced vision in 71.7% of 152 eyes, amblyopia in 9.9%, retinal disorders in 5.9%, and corneal opacity in 4.6%. Exterior and anterior segment abnormalities occurred in 43 (1.8%) children. Myopia (at least -0.50 D) in one or both eyes was present in 3.2% of children when measured with retinoscopy and in 3.4% measured with autorefraction. Myopia was not significantly associated with gender (P = 0.82). Hyperopia (+2.00 D or more) in at least one eye was present in 0.3% of children with retinoscopy and autorefraction. The prevalence of reduced vision in Ghanaian private school children due to uncorrected refractive error was low. However, the prevalence of amblyopia, retinal disorders, and corneal opacities indicates the need for early interventions.

  12. Sustainability of protocolized handover of pediatric cardiac surgery patients to the intensive care unit.

    PubMed

    Chenault, Kristin; Moga, Michael-Alice; Shin, Minah; Petersen, Emily; Backer, Carl; De Oliveira, Gildasio S; Suresh, Santhanam

    2016-05-01

    Transfer of patient care among clinicians (handovers) is a common source of medical errors. While the immediate efficacy of handover-improvement initiatives is well documented, sustainability of practice changes that result in better processes of care is largely understudied. The objective of the current investigation was to evaluate the sustainability of a protocolized handover process in pediatric patients from the operating room after cardiac surgery to the intensive care unit. This was a prospective study with direct observation assessment of handover performance conducted in the cardiac ICU (CICU) of a free-standing, tertiary care children's hospital in the United States. Patient transitions from the operating room to the CICU, including the verbal handoff, were directly observed by a single independent observer in all phases of the study. A checklist of key elements identified errors classified as: (1) technical, (2) information omissions, and (3) realized errors. Total number of errors was compared across the different times of the study (preintervention, postintervention, and the current sustainability phase). A total of 119 handovers were studied: 41 preintervention, 38 postintervention, and 40 in the current sustainability phase. The median [interquartile range (IQR)] number of technical errors was significantly reduced in the sustainability phase compared to the preintervention and postintervention phases, 2 (1-3), 6 (5-7), and 2.5 (2-4), respectively, P = 0.0001. Similarly, the median (IQR) number of verbal information omissions was also significantly reduced in the sustainability phase compared to the preintervention and postintervention phases, 1 (1-1), 4 (3-5) and 2 (1-3), respectively. We demonstrate sustainability of an improved handover process using a checklist in children being transferred to the intensive care unit after cardiac surgery. Standardized handover processes can be a sustainable strategy to improve patient safety after pediatric cardiac surgery. 
© 2016 John Wiley & Sons Ltd.

  13. Error analysis for reducing noisy wide-gap concentric cylinder rheometric data for nonlinear fluids - Theory and applications

    NASA Technical Reports Server (NTRS)

    Borgia, Andrea; Spera, Frank J.

    1990-01-01

    This work discusses the propagation of errors for the recovery of the shear rate from wide-gap concentric cylinder viscometric measurements of non-Newtonian fluids. A least-squares regression of stress on angular velocity data to a system of arbitrary functions is used to propagate the errors for the series solution to the viscometric flow developed by Krieger and Elrod (1953) and Pawlowski (1953) ('power-law' approximation) and for the first term of the series developed by Krieger (1968). A numerical experiment shows that, for measurements affected by significant errors, the first term of the Krieger-Elrod-Pawlowski series ('infinite radius' approximation) and the power-law approximation may recover the shear rate as accurately as the full Krieger-Elrod-Pawlowski solution. An experiment on a clay slurry indicates that the clay has a larger yield stress at rest than during shearing, and that, for the range of shear rates investigated, a four-parameter constitutive equation approximates reasonably well its rheology. The error analysis presented is useful for studying the rheology of fluids such as particle suspensions, slurries, foams, and magma.
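The 'power-law' approximation referenced in record 13 recovers the shear rate at the inner cylinder from stress and angular-velocity data: for a power-law fluid in Couette flow, the inner-wall shear rate is 2Ω / [n(1 − κ^(2/n))], with κ the radius ratio and n the flow-behaviour index. A minimal sketch, assuming a single global n fitted by least squares (the cited works use local log-log slopes and series solutions):

```python
import numpy as np

def shear_rate_power_law(omega, tau_i, kappa):
    """Recover the inner-cylinder shear rate from angular velocity omega
    (rad/s) and inner-wall shear stress tau_i for a wide-gap concentric
    cylinder viscometer, using the power-law approximation.
    kappa = R_inner / R_outer. Sketch: assumes one global flow-behaviour
    index n for the whole data set."""
    omega = np.asarray(omega, dtype=float)
    tau_i = np.asarray(tau_i, dtype=float)
    # flow-behaviour index n = d ln(tau_i) / d ln(omega), via least squares
    n = np.polyfit(np.log(omega), np.log(tau_i), 1)[0]
    return 2.0 * omega / (n * (1.0 - kappa ** (2.0 / n)))
```

For a fluid that truly follows tau = K * (shear rate)^n, this recovery is exact; the paper's error analysis concerns how noise in the (omega, tau) data propagates through the fitted slope into the recovered shear rate.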

  14. The role of visual spatial attention in adult developmental dyslexia.

    PubMed

    Collis, Nathan L; Kohnen, Saskia; Kinoshita, Sachiko

    2013-01-01

    The present study investigated the nature of visual spatial attention deficits in adults with developmental dyslexia, using a partial report task with five-letter, digit, and symbol strings. Participants responded by a manual key press to one of nine alternatives, which included other characters in the string, allowing an assessment of position errors as well as intrusion errors. The results showed that the dyslexic adults performed significantly worse than age-matched controls with letter and digit strings but not with symbol strings. Both groups produced W-shaped serial position functions with letter and digit strings. The dyslexics' deficits with letter string stimuli were limited to position errors, specifically at the string-interior positions 2 and 4. These errors correlated with letter transposition reading errors (e.g., reading slat as "salt"), but not with the Rapid Automatized Naming (RAN) task. Overall, these results suggest that the dyslexic adults have a visual spatial attention deficit; however, the deficit does not reflect a reduced span in visual-spatial attention, but a deficit in processing a string of letters in parallel, probably due to difficulty in the coding of letter position.

  15. Dosimetric consequences of translational and rotational errors in frame-less image-guided radiosurgery

    PubMed Central

    2012-01-01

    Background To investigate geometric and dosimetric accuracy of frame-less image-guided radiosurgery (IG-RS) for brain metastases. Methods and materials Single fraction IG-RS was practiced in 72 patients with 98 brain metastases. Patient positioning and immobilization used either double- (n = 71) or single-layer (n = 27) thermoplastic masks. Pre-treatment set-up errors (n = 98) were evaluated with cone-beam CT (CBCT) based image-guidance (IG) and were corrected in six degrees of freedom without an action level. CBCT imaging after treatment measured intra-fractional errors (n = 64). Pre- and post-treatment errors were simulated in the treatment planning system and target coverage and dose conformity were evaluated. Three scenarios of 0 mm, 1 mm and 2 mm GTV-to-PTV (gross tumor volume, planning target volume) safety margins (SM) were simulated. Results Errors prior to IG were 3.9 mm ± 1.7 mm (3D vector) and the maximum rotational error was 1.7° ± 0.8° on average. The post-treatment 3D error was 0.9 mm ± 0.6 mm. No differences between double- and single-layer masks were observed. Intra-fractional errors were significantly correlated with the total treatment time with 0.7 mm ± 0.5 mm and 1.2 mm ± 0.7 mm for treatment times ≤23 minutes and >23 minutes (p < 0.01), respectively. Simulation of RS without image-guidance reduced target coverage and conformity to 75% ± 19% and 60% ± 25% of planned values. Each 3D set-up error of 1 mm decreased target coverage and dose conformity by 6% and 10% on average, respectively, with a large inter-patient variability. Pre-treatment correction of translations only but not rotations did not affect target coverage and conformity. Post-treatment errors reduced target coverage by >5% in 14% of the patients. A 1 mm safety margin fully compensated intra-fractional patient motion. Conclusions IG-RS with online correction of translational errors achieves high geometric and dosimetric accuracy. 
Intra-fractional errors decrease target coverage and conformity unless compensated with appropriate safety margins. PMID:22531060

  16. Evaluation of Trajectory Errors in an Automated Terminal-Area Environment

    NASA Technical Reports Server (NTRS)

    Oseguera-Lohr, Rosa M.; Williams, David H.

    2003-01-01

    A piloted simulation experiment was conducted to document the trajectory errors associated with use of an airplane's Flight Management System (FMS) in conjunction with a ground-based ATC automation system, Center-TRACON Automation System (CTAS) in the terminal area. Three different arrival procedures were compared: current-day (vectors from ATC), modified (current-day with minor updates), and data link with FMS lateral navigation. Six active airline pilots flew simulated arrivals in a fixed-base simulator. The FMS-datalink procedure resulted in the smallest time and path distance errors, indicating that use of this procedure could reduce the CTAS arrival-time prediction error by about half over the current-day procedure. Significant sources of error contributing to the arrival-time error were crosstrack errors and early speed reduction in the last 2-4 miles before the final approach fix. Pilot comments were all very positive, indicating the FMS-datalink procedure was easy to understand and use, and the increased head-down time and workload did not detract from the benefit. Issues that need to be resolved before this method of operation would be ready for commercial use include development of procedures acceptable to controllers, better speed conformance monitoring, and FMS database procedures to support the approach transitions.

  17. An Enhanced MEMS Error Modeling Approach Based on Nu-Support Vector Regression

    PubMed Central

    Bhatt, Deepak; Aggarwal, Priyanka; Bhattacharya, Prabir; Devabhaktuni, Vijay

    2012-01-01

    Micro Electro Mechanical System (MEMS)-based inertial sensors have made possible the development of a civilian land vehicle navigation system by offering a low-cost solution. However, the accurate modeling of the MEMS sensor errors is one of the most challenging tasks in the design of low-cost navigation systems. These sensors exhibit significant errors such as biases, drift, and noise, which are negligible for higher grade units. Different conventional techniques utilizing the Gauss Markov model and neural network method have been previously utilized to model the errors. However, the Gauss Markov model works unsatisfactorily in the case of MEMS units due to the presence of high inherent sensor errors. On the other hand, modeling the random drift utilizing a Neural Network (NN) is time consuming, thereby affecting its real-time implementation. We overcome these existing drawbacks by developing an enhanced Support Vector Machine (SVM) based error model. Unlike NNs, SVMs do not suffer from local minimisation or over-fitting problems and deliver a reliable global solution. Experimental results proved that the proposed SVM approach reduced the noise standard deviation by 10–35% for gyroscopes and 61–76% for accelerometers. Further, positional error drifts under static conditions improved by 41% and 80% in comparison to NN and GM approaches. PMID:23012552

  18. Analysis and Compensation of Modulation Angular Rate Error Based on Missile-Borne Rotation Semi-Strapdown Inertial Navigation System

    PubMed Central

    Zhang, Jiayu; Li, Jie; Zhang, Xi; Che, Xiaorui; Huang, Yugang; Feng, Kaiqiang

    2018-01-01

    The Semi-Strapdown Inertial Navigation System (SSINS) provides a new solution to attitude measurement of a high-speed rotating missile. However, micro-electro-mechanical-systems (MEMS) inertial measurement unit (MIMU) outputs are corrupted by significant sensor errors. In order to improve the navigation precision, a rotation modulation technology method called Rotation Semi-Strapdown Inertial Navigation System (RSSINS) is introduced into SINS. In fact, the stability of the modulation angular rate is difficult to achieve in a high-speed rotation environment. The changing rotary angular rate has an impact on the inertial sensor error self-compensation. In this paper, the influence of modulation angular rate error, including acceleration-deceleration process, and instability of the angular rate on the navigation accuracy of RSSINS is deduced and the error characteristics of the reciprocating rotation scheme are analyzed. A new compensation method is proposed to remove or reduce sensor errors so as to make it possible to maintain high precision autonomous navigation performance by MIMU when there is no external aid. Experiments have been carried out to validate the performance of the method. In addition, the proposed method is applicable for modulation angular rate error compensation under various dynamic conditions. PMID:29734707

  19. Enhancing the Quality of EAP Writing through Overt Teaching

    ERIC Educational Resources Information Center

    Wee, Roselind; Sim, Jacqueline; Jusoff, Kamaruzaman

    2009-01-01

    This paper examines how overt teaching is instrumental in reducing subject-verb agreement (SVA) errors of Malaysian EAP learners which in turn improves the quality of their writing. The researchers used overt teaching of these grammatical items, that is, SVA and investigated how this method has significantly benefitted the learners who were second…

  20. Understanding overlay signatures using machine learning on non-lithography context information

    NASA Astrophysics Data System (ADS)

    Overcast, Marshall; Mellegaard, Corey; Daniel, David; Habets, Boris; Erley, Georg; Guhlemann, Steffen; Thrun, Xaver; Buhl, Stefan; Tottewitz, Steven

    2018-03-01

    Overlay errors between two layers can be caused by non-lithography processes. While these errors can be compensated by the run-to-run system, such process and tool signatures are not always stable. In order to monitor the impact of non-lithography context on overlay at regular intervals, a systematic approach is needed. Using various machine learning techniques, significant context parameters that relate to deviating overlay signatures are automatically identified. Once the most influential context parameters are found, a run-to-run simulation is performed to see how much improvement can be obtained. The resulting analysis shows good potential for reducing the influence of hidden context parameters on overlay performance. Non-lithographic contexts are significant contributors, and their automatic detection and classification will enable the overlay roadmap, given the corresponding control capabilities.

  1. Extrapolating target tracks

    NASA Astrophysics Data System (ADS)

    Van Zandt, James R.

    2012-05-01

    Steady-state performance of a tracking filter is traditionally evaluated immediately after a track update. However, there is commonly a further delay (e.g., processing and communications latency) before the tracks can actually be used. We analyze the accuracy of extrapolated target tracks for four tracking filters: the Kalman filter with the Singer maneuver model and worst-case correlation time, the Kalman filter with piecewise constant white acceleration, the Kalman filter with continuous white acceleration, and the reduced state filter proposed by Mookerjee and Reifler [1, 2]. Performance evaluation of a tracking filter is significantly simplified by appropriate normalization. For the Kalman filter with the Singer maneuver model, the steady-state RMS error immediately after an update depends on only two dimensionless parameters [3]. By assuming a worst-case value of target acceleration correlation time, we reduce this to a single parameter without significantly changing the filter performance (within a few percent for air tracking) [4]. With this simplification, we find for all four filters that the RMS errors for the extrapolated state are functions of only two dimensionless parameters. We provide simple analytic approximations in each case.
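The covariance extrapolation underlying this latency analysis can be sketched for a one-dimensional constant-velocity track (a minimal sketch with hypothetical numbers, using the standard piecewise-constant white-acceleration process noise; the paper's normalized results are not reproduced here): the position RMS error grows with the extrapolation interval.

```python
# 2x2 matrix helpers (pure Python) for a 1-D constant-velocity track.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

def extrapolate(P, tau, q):
    """Propagate the track covariance P over a latency tau with
    piecewise-constant white acceleration of intensity q."""
    F = [[1.0, tau], [0.0, 1.0]]
    Q = [[q * tau**4 / 4, q * tau**3 / 2],
         [q * tau**3 / 2, q * tau**2]]
    FPFt = matmul(matmul(F, P), transpose(F))
    return [[FPFt[i][j] + Q[i][j] for j in range(2)] for i in range(2)]

P_update = [[1.0, 0.0], [0.0, 1.0]]   # covariance right after the update
P_delayed = extrapolate(P_update, tau=0.5, q=1.0)
rms_update = P_update[0][0] ** 0.5
rms_delayed = P_delayed[0][0] ** 0.5
print(rms_update, rms_delayed)
```

Evaluating this growth as a function of normalized latency is, in spirit, what the paper's dimensionless analysis does for all four filters.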

  2. Improving Assimilated Global Climate Data Using TRMM and SSM/I Rainfall and Moisture Data

    NASA Technical Reports Server (NTRS)

    Hou, Arthur Y.; Zhang, Sara Q.; daSilva, Arlindo M.; Olson, William S.

    1999-01-01

    Current global analyses contain significant errors in primary hydrological fields such as precipitation, evaporation, and related cloud and moisture in the tropics. Work has been underway at NASA's Data Assimilation Office to explore the use of TRMM and SSM/I-derived rainfall and total precipitable water (TPW) data in global data assimilation to directly constrain these hydrological parameters. We found that assimilating these data types improves not only the precipitation and moisture estimates but also key climate parameters directly linked to convection such as the outgoing longwave radiation, clouds, and the large-scale circulation in the tropics. We will present results showing that assimilating TRMM and SSM/I 6-hour averaged rain rates and TPW estimates significantly reduces the state-dependent systematic errors in assimilated products. Specifically, rainfall assimilation improves cloud and latent heating distributions, which, in turn, improves the cloudy-sky radiation and the large-scale circulation, while TPW assimilation reduces moisture biases to improve radiation in clear-sky regions. Rainfall and TPW assimilation also improves tropical forecasts beyond 1 day.

  3. Errors from approximation of ODE systems with reduced order models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassilevska, Tanya

    2016-12-30

    This code calculates the error from approximating systems of ordinary differential equations (ODEs) with Proper Orthogonal Decomposition (POD) reduced order model (ROM) methods, and compares and analyzes the errors for two POD ROM variants. The first variant is the standard POD ROM; the second is a modification of the method that uses the values of the time derivatives (a.k.a. time-derivative snapshots). The code compares the errors from the two variants under different conditions.
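As a rough illustration of the kind of error such a comparison quantifies (this is not the OSTI code itself; the snapshot data and the single-mode basis are hypothetical), the following sketch builds the leading POD mode from synthetic snapshots by power iteration and measures the relative projection error of a rank-1 reduced basis.

```python
import math

# Synthetic snapshots: a dominant spatial mode plus a weak secondary mode.
nx, nt = 8, 40
phi1 = [math.sin(math.pi * (i + 1) / (nx + 1)) for i in range(nx)]
phi2 = [math.sin(2 * math.pi * (i + 1) / (nx + 1)) for i in range(nx)]
snapshots = []
for k in range(nt):
    t = k / nt
    a, b = math.cos(t), 0.05 * math.sin(3 * t)
    snapshots.append([a * phi1[i] + b * phi2[i] for i in range(nx)])

def dominant_mode(snaps, iters=200):
    """Leading POD mode via power iteration on the spatial correlation matrix."""
    n = len(snaps[0])
    C = [[sum(s[i] * s[j] for s in snaps) for j in range(n)] for i in range(n)]
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

def rom_error(snaps, mode):
    """Relative L2 error of projecting the snapshots on a 1-mode basis."""
    err2 = tot2 = 0.0
    for s in snaps:
        c = sum(si * mi for si, mi in zip(s, mode))
        err2 += sum((si - c * mi) ** 2 for si, mi in zip(s, mode))
        tot2 += sum(si * si for si in s)
    return math.sqrt(err2 / tot2)

rel_err = rom_error(snapshots, dominant_mode(snapshots))
print(rel_err)
```

The time-derivative-snapshot variant mentioned above would enrich the snapshot set before the basis is computed; comparing the resulting `rel_err` values is the kind of comparison the code performs.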

  4. Classification and reduction of pilot error

    NASA Technical Reports Server (NTRS)

    Rogers, W. H.; Logan, A. L.; Boley, G. D.

    1989-01-01

    Human error is a primary or contributing factor in about two-thirds of commercial aviation accidents worldwide. With the ultimate goal of reducing pilot error accidents, this contract effort is aimed at understanding the factors underlying error events and reducing the probability of certain types of errors by modifying underlying factors such as flight deck design and procedures. A review of the literature relevant to error classification was conducted. Classification includes categorizing types of errors, the information processing mechanisms and factors underlying them, and identifying factor-mechanism-error relationships. The classification scheme developed by Jens Rasmussen was adopted because it provided a comprehensive yet basic error classification shell or structure that could easily accommodate addition of details on domain-specific factors. For these purposes, factors specific to the aviation environment were incorporated. Hypotheses concerning the relationship of a small number of underlying factors, information processing mechanisms, and error types identified in the classification scheme were formulated. ASRS data were reviewed and a simulation experiment was performed to evaluate and quantify the hypotheses.

  5. Quantifying the impact of material-model error on macroscale quantities-of-interest using multiscale a posteriori error-estimation techniques

    DOE PAGES

    Brown, Judith A.; Bishop, Joseph E.

    2016-07-20

    An a posteriori error-estimation framework is introduced to quantify and reduce modeling errors resulting from approximating complex mesoscale material behavior with a simpler macroscale model. Such errors may be prevalent when modeling welds and additively manufactured structures, where spatial variations and material textures may be present in the microstructure. We consider a case where a <100> fiber texture develops in the longitudinal scanning direction of a weld. Transversely isotropic elastic properties are obtained through homogenization of a microstructural model with this texture and are considered the reference weld properties within the error-estimation framework. Conversely, isotropic elastic properties are considered approximate weld properties since they contain no representation of texture. Errors introduced by using isotropic material properties to represent a weld are assessed through a quantified error bound in the elastic regime. Lastly, an adaptive error reduction scheme is used to determine the optimal spatial variation of the isotropic weld properties to reduce the error bound.

  6. Piecewise compensation for the nonlinear error of fiber-optic gyroscope scale factor

    NASA Astrophysics Data System (ADS)

    Zhang, Yonggang; Wu, Xunfeng; Yuan, Shun; Wu, Lei

    2013-08-01

    Fiber-Optic Gyroscope (FOG) scale factor nonlinear error will result in errors in a Strapdown Inertial Navigation System (SINS). In order to reduce the nonlinear error of the FOG scale factor in SINS, a compensation method is proposed in this paper based on piecewise curve fitting of the FOG output. Firstly, the causes of FOG scale factor error are introduced and the definition of nonlinear degree is provided. Then we introduce the method: the output range of the FOG is divided into several small pieces, and curve fitting is performed over each piece to obtain its scale factor parameters. Different scale factor parameters are used in different pieces to improve FOG output precision. These parameters are identified by using a three-axis turntable, and the nonlinear error of the FOG scale factor can be reduced. Finally, a three-axis swing experiment of the SINS verifies that the proposed method can reduce attitude output errors of the SINS by compensating the nonlinear error of the FOG scale factor and improve the precision of navigation. The results also demonstrate that the compensation scheme is easy to implement and can effectively compensate the nonlinear error of the FOG scale factor with only slightly increased computational complexity. This method can be used in FOG-based inertial technology to improve precision.
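A minimal sketch of the piecewise-fitting idea (the scale-factor model, test points, and segment boundaries below are hypothetical; the paper identifies parameters on a turntable rather than from a closed-form model): fitting one linear scale factor per output segment leaves a much smaller residual than a single global linear fit.

```python
# Hypothetical FOG model: true rate -> measured output with a mildly
# nonlinear scale factor.
def measured(rate):
    return 1.0002 * rate + 2e-6 * rate * abs(rate)

def linfit(xs, ys):
    """Ordinary least-squares line ys ~ k*xs + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    k = sxy / sxx
    return k, my - k * mx

rates = list(range(-200, 201, 5))        # deg/s test points on a turntable
outputs = [measured(r) for r in rates]

# Global calibration: one linear fit of rate vs. output over the whole range.
k, b = linfit(outputs, rates)
err_global = max(abs(k * o + b - r) for o, r in zip(outputs, rates))

# Piecewise calibration: one linear fit per sub-range (segments chosen by
# rate range here for simplicity).
pieces = [(-200, -100), (-100, 0), (0, 100), (100, 200)]
err_piece = 0.0
for lo, hi in pieces:
    idx = [i for i, r in enumerate(rates) if lo <= r <= hi]
    kp, bp = linfit([outputs[i] for i in idx], [rates[i] for i in idx])
    err_piece = max(err_piece,
                    max(abs(kp * outputs[i] + bp - rates[i]) for i in idx))

print(err_global, err_piece)
```

The residual drop from `err_global` to `err_piece` is the precision gain the piecewise scheme buys, at the cost of storing a few extra parameters.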

  7. Ultralow dose dentomaxillofacial CT imaging and iterative reconstruction techniques: variability of Hounsfield units and contrast-to-noise ratio

    PubMed Central

    Bischel, Alexander; Stratis, Andreas; Kakar, Apoorv; Bosmans, Hilde; Jacobs, Reinhilde; Gassner, Eva-Maria; Puelacher, Wolfgang; Pauwels, Ruben

    2016-01-01

    Objective: The aim of this study was to evaluate whether application of ultralow dose protocols and iterative reconstruction technology (IRT) influences quantitative Hounsfield units (HUs) and contrast-to-noise ratio (CNR) in dentomaxillofacial CT imaging. Methods: A phantom with inserts of five types of materials was scanned using protocols for (a) a clinical reference for navigated surgery (CT dose index volume 36.58 mGy), (b) low-dose sinus imaging (18.28 mGy) and (c) four ultralow dose protocols (4.14, 2.63, 0.99 and 0.53 mGy). All images were reconstructed using: (i) filtered back projection (FBP); (ii) IRT: adaptive statistical iterative reconstruction-50 (ASIR-50), ASIR-100 and model-based iterative reconstruction (MBIR); and (iii) standard (std) and bone kernel. Mean HU, CNR and average HU error after recalibration were determined. Each combination of protocols was compared using Friedman analysis of variance, followed by Dunn's multiple comparison test. Results: Pearson's sample correlation coefficients were all >0.99. Ultralow dose protocols using FBP showed errors of up to 273 HU. Std kernels had less HU variability than bone kernels. MBIR reduced the error value for the lowest dose protocol to 138 HU and retained the highest relative CNR. ASIR could not demonstrate significant advantages over FBP. Conclusions: Considering a potential dose reduction as low as 1.5% of a std protocol, ultralow dose protocols and IRT should be further tested for clinical dentomaxillofacial CT imaging. Advances in knowledge: HU as a surrogate for bone density may vary significantly in CT ultralow dose imaging. However, use of std kernels and MBIR technology reduces HU error values and may retain the highest CNR. PMID:26859336

  8. Using wide area differential GPS to improve total system error for precision flight operations

    NASA Astrophysics Data System (ADS)

    Alter, Keith Warren

    Total System Error (TSE) refers to an aircraft's total deviation from the desired flight path. TSE can be divided into Navigational System Error (NSE), the error attributable to the aircraft's navigation system, and Flight Technical Error (FTE), the error attributable to pilot or autopilot control. Improvement in either NSE or FTE reduces TSE and leads to the capability to fly more precise flight trajectories. The Federal Aviation Administration's Wide Area Augmentation System (WAAS) became operational for non-safety-critical applications in 2000 and will become operational for safety-critical applications in 2002. This navigation service will provide precise 3-D positioning (demonstrated to better than 5 meters horizontal and vertical accuracy) for civil aircraft in the United States. Perhaps more importantly, this navigation system, which provides continuous operation across large regions, enables new flight instrumentation concepts which allow pilots to fly aircraft significantly more precisely, both for straight and curved flight paths. This research investigates the capabilities of some of these new concepts, including the Highway-In-The-Sky (HITS) display, which not only improves FTE but also reduces pilot workload when compared to conventional flight instrumentation. Augmentation to the HITS display, including perspective terrain and terrain alerting, improves pilot situational awareness. Flight test results from demonstrations in Juneau, AK, and Lake Tahoe, CA, provide evidence of the overall feasibility of integrated, low-cost flight navigation systems based on these concepts. These systems, requiring no more computational power than current-generation low-end desktop computers, have immediate applicability to general aviation flight from Cessnas to business jets and can support safer and ultimately more economical flight operations. Commercial airlines may also, over time, benefit from these new technologies.
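When NSE and FTE are statistically independent, TSE is commonly characterized as their root-sum-square; a small example with hypothetical 95% error values illustrates why, once WAAS-class NSE is available, FTE becomes the dominant term (the numbers below are illustrative, not from the thesis).

```python
import math

def tse_rss(nse_95, fte_95):
    """Root-sum-square combination of navigation system error and flight
    technical error (both 95% values, same units)."""
    return math.sqrt(nse_95 ** 2 + fte_95 ** 2)

# Hypothetical numbers: improving NSE from 0.5 NM (conventional nav) to
# 0.01 NM (WAAS-class) makes FTE the dominant term.
before = tse_rss(0.5, 0.25)
after = tse_rss(0.01, 0.25)
print(before, after)
```

Once `after` is essentially equal to the FTE term alone, further TSE improvement has to come from better guidance displays such as HITS, which is the thesis's focus.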

  9. Analysis of naturalistic driving videos of fleet services drivers to estimate driver error and potentially distracting behaviors as risk factors for rear-end versus angle crashes.

    PubMed

    Harland, Karisa K; Carney, Cher; McGehee, Daniel

    2016-07-03

    The objective of this study was to estimate the prevalence and odds of fleet driver errors and potentially distracting behaviors just prior to rear-end versus angle crashes. We analyzed naturalistic driving videos from fleet services drivers for errors and potentially distracting behaviors occurring in the 6 s before crash impact. Categorical variables were examined using the Pearson's chi-square test, and continuous variables, such as eyes-off-road time, were compared using the Student's t-test. Multivariable logistic regression was used to estimate the odds of a driver error or potentially distracting behavior being present in the seconds before rear-end versus angle crashes. Of the 229 crashes analyzed, 101 (44%) were rear-end and 128 (56%) were angle crashes. Driver age, gender, and presence of passengers did not differ significantly by crash type. Over 95% of rear-end crashes involved inadequate surveillance compared to only 52% of angle crashes (P < .0001). Almost 65% of rear-end crashes involved a potentially distracting driver behavior, whereas less than 40% of angle crashes involved these behaviors (P < .01). On average, drivers spent 4.4 s with their eyes off the road while operating or manipulating their cell phone. Drivers in rear-end crashes had 3.06 times higher adjusted odds (95% confidence interval [CI], 1.73-5.44) of being potentially distracted than those in angle crashes. Fleet drivers' driving errors and potentially distracting behaviors are frequent. This analysis provides data to inform safe driving interventions for fleet services drivers. Further research is needed on effective interventions to reduce the likelihood of drivers' distracting behaviors and errors, thereby potentially reducing crashes.
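The odds-ratio arithmetic used in such an analysis can be sketched from a 2x2 table (the counts below are hypothetical, and this unadjusted Wald calculation is simpler than the paper's multivariable logistic regression, which adjusts for covariates):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table
        exposed:   a (rear-end)  b (angle)
        unexposed: c (rear-end)  d (angle)
    with a Wald 95% confidence interval computed on the log scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: distracted vs. not, in rear-end vs. angle crashes.
or_, lo, hi = odds_ratio_ci(a=65, b=36, c=50, d=78)
print(or_, lo, hi)
```

A confidence interval that excludes 1.0, as here, is what supports the paper's conclusion that distraction is more strongly associated with rear-end crashes.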

  10. Assessment of Satellite Surface Radiation Products in Highland Regions with Tibet Instrumental Data

    NASA Technical Reports Server (NTRS)

    Yang, Kun; Koike, Toshio; Stackhouse, Paul; Mikovitz, Colleen

    2006-01-01

    This study presents results of comparisons between instrumental radiation data in the elevated Tibetan Plateau and two global satellite products: the Global Energy and Water Cycle Experiment - Surface Radiation Budget (GEWEX-SRB) and International Satellite Cloud Climatology Project - Flux Data (ISCCP-FD). In general, shortwave radiation (SW) is estimated better by ISCCP-FD while longwave radiation (LW) is estimated better by GEWEX-SRB, but all the radiation components in both products are under-estimated. Severe and systematic errors were found in monthly-mean SRB SW (on plateau-average, -48 W/sq m for downward SW and -18 W/sq m for upward SW) and FD LW (on plateau-average, -37 W/sq m for downward LW and -62 W/sq m for upward LW). Errors in monthly-mean diurnal variations are even larger than the monthly mean errors. Though the LW errors can be reduced by about 10 W/sq m after a correction for the altitude difference between the site and the SRB and FD grids, these errors are still higher than those for other regions. The large errors in SRB SW were mainly due to a processing mistake for the elevation effect, but the errors in SRB LW were mainly due to significant errors in input data. We suggest reprocessing satellite surface radiation budget data, at least for highland areas like Tibet.

  11. Risk factors for refractive errors in primary school children (6-12 years old) in Nakhon Pathom Province.

    PubMed

    Yingyong, Penpimol

    2010-11-01

    Refractive error is one of the leading causes of visual impairment in children. An analysis of risk factors for refractive error is required to reduce and prevent this common eye disease. To identify the risk factors associated with refractive errors in primary school children (6-12 years old) in Nakhon Pathom province, a population-based cross-sectional analytic study was conducted between October 2008 and September 2009 in Nakhon Pathom. Refractive error, parental refractive status, and hours per week of near activities (studying, reading books, watching television, playing with video games, or working on the computer) were assessed in 377 children who participated in this study. The most common type of refractive error in primary school children was myopia. Myopic children were more likely to have parents with myopia. Children with myopia spent more time on near activities. The multivariate odds ratio (95% confidence interval) for two myopic parents was 6.37 (2.26-17.78) and for each diopter-hour per week of near work was 1.019 (1.005-1.033). Multivariate logistic regression models show no confounding effects between parental myopia and near work, suggesting that each factor has an independent association with myopia. Statistical analysis by logistic regression revealed that family history of refractive error and hours of near work were significantly associated with refractive error in primary school children.
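The diopter-hour metric referred to above is conventionally a weighted sum of weekly near-work hours; the weights below follow a common convention from the myopia literature and should be treated as an assumption, since the paper does not state its weighting.

```python
# Diopter-hours per week: weekly near-work hours weighted by typical
# working distance (weights are a convention from the myopia literature,
# assumed here, not taken from this paper).
def diopter_hours(reading_h, computer_h, tv_h):
    return 3 * reading_h + 2 * computer_h + 1 * tv_h

dh = diopter_hours(reading_h=7, computer_h=4, tv_h=10)

# With an odds ratio of 1.019 per diopter-hour/week, the odds multiplier
# for this child relative to one with zero near work:
odds_multiplier = 1.019 ** dh
print(dh, odds_multiplier)
```

A seemingly small per-unit odds ratio therefore compounds to a roughly twofold odds difference across a realistic range of near-work exposure.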

  12. A Comprehensive Quality Assurance Program for Personnel and Procedures in Radiation Oncology: Value of Voluntary Error Reporting and Checklists

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalapurakal, John A., E-mail: j-kalapurakal@northwestern.edu; Zafirovski, Aleksandar; Smith, Jeffery

    Purpose: This report describes the value of a voluntary error reporting system and the impact of a series of quality assurance (QA) measures including checklists and timeouts on reported error rates in patients receiving radiation therapy. Methods and Materials: A voluntary error reporting system was instituted with the goal of recording errors, analyzing their clinical impact, and guiding the implementation of targeted QA measures. In response to errors committed in relation to treatment of the wrong patient, wrong treatment site, and wrong dose, a novel initiative involving the use of checklists and timeouts for all staff was implemented. The impact of these and other QA initiatives was analyzed. Results: From 2001 to 2011, a total of 256 errors in 139 patients after 284,810 external radiation treatments (0.09% per treatment) were recorded in our voluntary error database. The incidence of errors related to patient/tumor site, treatment planning/data transfer, and patient setup/treatment delivery was 9%, 40.2%, and 50.8%, respectively. The compliance rate for the checklists and timeouts initiative was 97% (P<.001). These and other QA measures resulted in a significant reduction in many categories of errors. The introduction of checklists and timeouts has been successful in eliminating errors related to wrong patient, wrong site, and wrong dose. Conclusions: A comprehensive QA program that regularly monitors staff compliance together with a robust voluntary error reporting system can reduce or eliminate errors that could result in serious patient injury. We recommend the adoption of these relatively simple QA initiatives including the use of checklists and timeouts for all staff to improve the safety of patients undergoing radiation therapy in the modern era.

  13. Effect of endorectal balloon positioning errors on target deformation and dosimetric quality during prostate SBRT

    NASA Astrophysics Data System (ADS)

    Jones, Bernard L.; Gan, Gregory; Kavanagh, Brian; Miften, Moyed

    2013-11-01

    An inflatable endorectal balloon (ERB) is often used during stereotactic body radiation therapy (SBRT) for treatment of prostate cancer in order to reduce both intrafraction motion of the target and risk of rectal toxicity. However, the ERB can exert significant force on the prostate, and this work assessed the impact of ERB position errors on deformation of the prostate and treatment dose metrics. Seventy-one cone-beam computed tomography (CBCT) image datasets of nine patients with clinical stage T1cN0M0 prostate cancer were studied. An ERB (Flexi-Cuff, EZ-EM, Westbury, NY) inflated with 60 cm3 of air was used during simulation and treatment, and daily kilovoltage (kV) CBCT imaging was performed to localize the prostate. The shape of the ERB in each CBCT was analyzed to determine errors in position, size, and shape. A deformable registration algorithm was used to track the dose received by (and deformation of) the prostate, and dosimetric values such as D95, PTV coverage, and Dice coefficient for the prostate were calculated. The average balloon position error was 0.5 cm in the inferior direction, with errors ranging from 2 cm inferiorly to 1 cm superiorly. The prostate was deformed primarily in the AP direction, and tilted primarily in the anterior-posterior/superior-inferior plane. A significant correlation was seen between errors in depth of ERB insertion (DOI) and mean voxel-wise deformation, prostate tilt, Dice coefficient, and planning-to-treatment prostate inter-surface distance (p < 0.001). Dosimetrically, DOI is negatively correlated with prostate D95 and PTV coverage (p < 0.001). For the model of ERB studied, error in ERB position can cause deformations in the prostate that negatively affect treatment, and this additional aspect of setup error should be considered when ERBs are used for prostate SBRT. Before treatment, the ERB position should be verified, and the ERB should be adjusted if the error is observed to exceed tolerable values.

  14. Laboratory errors and patient safety.

    PubMed

    Miligy, Dawlat A

    2015-01-01

    Laboratory data are extensively used in medical practice; consequently, laboratory errors have a tremendous impact on patient safety. Therefore, programs designed to identify and reduce laboratory errors, as well as specific strategies, are required to minimize these errors and improve patient safety. The purpose of this paper is to identify some of the commonly encountered laboratory errors in our laboratory practice, their hazards to patient health care, and some measures and recommendations to minimize or eliminate these errors. Laboratory errors encountered during May 2008 were recorded and statistically evaluated (using simple percent distribution) in the laboratory department of a private hospital in Egypt. Errors were classified according to the laboratory phases and according to their implications for patient health. Data obtained from 1,600 testing procedures revealed a total of 14 errors (0.87 percent of total testing procedures). Most of the encountered errors lay in the pre- and post-analytic phases of the testing cycle (representing 35.7 and 50 percent, respectively, of total errors), while errors in the analytic phase represented only 14.3 percent of total errors. About 85.7 percent of total errors had no significant implication for patient health, being detected before test reports were submitted to patients. On the other hand, test errors that had already been submitted to patients and reached the physician represented 14.3 percent of total errors. Only 7.1 percent of the errors could have had an impact on patient diagnosis. The findings of this study were consistent with those published from the USA and other countries. This proves that laboratory problems are universal and need general standardization and benchmarking measures.
This is the first such data published from Arabic countries evaluating encountered laboratory errors, and it highlights the great need for universal standardization and benchmarking measures to control laboratory work.

  15. Precise method of compensating radiation-induced errors in a hot-cathode-ionization gauge with correcting electrode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saeki, Hiroshi, E-mail: saeki@spring8.or.jp; Magome, Tamotsu, E-mail: saeki@spring8.or.jp

    2014-10-06

    To compensate for pressure-measurement errors caused by a synchrotron radiation environment, a precise method using a hot-cathode-ionization-gauge head with a correcting electrode was developed and tested in a simulation experiment with excess electrons in the SPring-8 storage ring. This method improves measurement accuracy by correctly reducing the pressure-measurement errors caused by electrons originating from the external environment and by electrons originating from the primary gauge filament as influenced by the spatial conditions of the installed vacuum-gauge head. In the simulation experiment confirming the performance in reducing errors caused by the external environment, the pressure-measurement error using this method was less than several percent in the pressure range from 10⁻⁵ Pa to 10⁻⁸ Pa. Afterwards, to confirm the performance in reducing the error caused by spatial conditions, an additional experiment was carried out using a sleeve and showed that the improved correction was effective.

  16. Learning a locomotor task: with or without errors?

    PubMed

    Marchal-Crespo, Laura; Schneider, Jasmin; Jaeger, Lukas; Riener, Robert

    2014-03-04

    Robotic haptic guidance is the most commonly used robotic training strategy to reduce performance errors while training. However, research on motor learning has emphasized that errors are a fundamental neural signal that drives motor adaptation. Thus, researchers have proposed robotic therapy algorithms that amplify movement errors rather than decrease them. However, to date, no study has analyzed with precision which training strategy is the most appropriate to learn an especially simple task. In this study, the impact of robotic training strategies that amplify or reduce errors on muscle activation and motor learning of a simple locomotor task was investigated in twenty-two healthy subjects. The experiment was conducted with the MAgnetic Resonance COmpatible Stepper (MARCOS), a special robotic device developed for investigations in the MR scanner. The robot moved the dominant leg passively and the subject was requested to actively synchronize the non-dominant leg to achieve an alternating stepping-like movement. Learning with four different training strategies that reduce or amplify errors was evaluated: (i) haptic guidance: errors were eliminated by passively moving the limbs; (ii) no guidance: no robot disturbances were presented; (iii) error amplification: existing errors were amplified with repulsive forces; (iv) noise disturbance: errors were evoked intentionally with a randomly varying force disturbance on top of the no-guidance strategy. Additionally, the activation of four lower limb muscles was measured by means of surface electromyography (EMG). Strategies that reduce or do not amplify errors limit muscle activation during training and result in poor learning gains. Adding random disturbing forces during training seems to increase attention, and therefore improve motor learning. Error amplification seems to be the most suitable strategy for initially less skilled subjects, perhaps because subjects could better detect their errors and correct them.
Error strategies have great potential to evoke higher muscle activation and provoke better motor learning of simple tasks. Neuroimaging evaluation of brain regions involved in learning can provide valuable information on observed behavioral outcomes related to learning processes. The impact of these strategies on neurological patients needs further investigation.

  17. Reducing Individual Variation for fMRI Studies in Children by Minimizing Template Related Errors

    PubMed Central

    Weng, Jian; Dong, Shanshan; He, Hongjian; Chen, Feiyan; Peng, Xiaogang

    2015-01-01

    Spatial normalization is an essential process for group comparisons in functional MRI studies. In practice, there is a risk of normalization errors particularly in studies involving children, seniors or diseased populations and in regions with high individual variation. One way to minimize normalization errors is to create a study-specific template based on a large sample size. However, studies with a large sample size are not always feasible, particularly for children studies. The performance of templates with a small sample size has not been evaluated in fMRI studies in children. In the current study, this issue was encountered in a working memory task with 29 children in two groups. We compared the performance of different templates: a study-specific template created by the experimental population, a Chinese children template and the widely used adult MNI template. We observed distinct differences in the right orbitofrontal region among the three templates in between-group comparisons. The study-specific template and the Chinese children template were more sensitive for the detection of between-group differences in the orbitofrontal cortex than the MNI template. Proper templates could effectively reduce individual variation. Further analysis revealed a correlation between the BOLD contrast size and the norm index of the affine transformation matrix, i.e., the SFN, which characterizes the difference between a template and a native image and differs significantly across subjects. Thereby, we proposed and tested another method to reduce individual variation that included the SFN as a covariate in group-wise statistics. This correction exhibits outstanding performance in enhancing detection power in group-level tests. A training effect of abacus-based mental calculation was also demonstrated, with significantly elevated activation in the right orbitofrontal region that correlated with behavioral response time across subjects in the trained group. PMID:26207985

  18. Improvement on Timing Accuracy of LIDAR for Remote Sensing

    NASA Astrophysics Data System (ADS)

    Zhou, G.; Huang, W.; Zhou, X.; Huang, Y.; He, C.; Li, X.; Zhang, L.

    2018-05-01

    The traditional timing discrimination technique for laser rangefinding in remote sensing has limited measurement performance and a large timing error, and cannot meet the requirements of high-precision measurement and high-definition lidar imaging. To solve this problem, an improvement of timing accuracy based on improved leading-edge timing discrimination (LED) is proposed. Firstly, the method moves the timing point corresponding to a fixed threshold forward by amplifying the received signal multiple times. Then, the timing information is sampled, and the timing points are fitted with algorithms in MATLAB. Finally, the minimum timing error is calculated from the fitting function. Thereby, the timing error of the received signal from the lidar is compressed and the lidar data quality is improved. Experiments show that the timing error can be significantly reduced by amplifying the received signal and fitting the parameters, and a timing accuracy of 4.63 ps is achieved.
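The effect of amplifying the received signal on a fixed-threshold leading-edge discriminator can be sketched with a toy pulse (the Gaussian pulse shape, threshold, and sampling step are all hypothetical): amplification moves the threshold-crossing time earlier, which is the timing-point shift the method exploits.

```python
import math

def crossing_time(amplitude, threshold=0.2, dt=1e-3, n=10000):
    """First fixed-threshold crossing of a toy Gaussian received pulse,
    refined by linear interpolation between adjacent samples."""
    pulse = lambda t: amplitude * math.exp(-((t - 5.0) ** 2) / 2.0)
    for k in range(n):
        t0, t1 = k * dt, (k + 1) * dt
        v0, v1 = pulse(t0), pulse(t1)
        if v0 < threshold <= v1:          # rising edge crosses the threshold
            return t0 + dt * (threshold - v0) / (v1 - v0)
    return None

t_raw = crossing_time(amplitude=1.0)      # original received signal
t_amp = crossing_time(amplitude=4.0)      # same pulse amplified 4x
print(t_raw, t_amp)
```

Sampling the crossing times over several amplification factors and fitting them, as the paper does in MATLAB, is what allows the residual timing error to be estimated and compressed.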

  19. Ultra-deep mutant spectrum profiling: improving sequencing accuracy using overlapping read pairs.

    PubMed

    Chen-Harris, Haiyin; Borucki, Monica K; Torres, Clinton; Slezak, Tom R; Allen, Jonathan E

    2013-02-12

    High throughput sequencing is beginning to make a transformative impact in the area of viral evolution. Deep sequencing has the potential to reveal the mutant spectrum within a viral sample at high resolution, thus enabling the close examination of viral mutational dynamics both within- and between-hosts. The challenge however, is to accurately model the errors in the sequencing data and differentiate real viral mutations, particularly those that exist at low frequencies, from sequencing errors. We demonstrate that overlapping read pairs (ORP) -- generated by combining short fragment sequencing libraries and longer sequencing reads -- significantly reduce sequencing error rates and improve rare variant detection accuracy. Using this sequencing protocol and an error model optimized for variant detection, we are able to capture a large number of genetic mutations present within a viral population at ultra-low frequency levels (<0.05%). Our rare variant detection strategies have important implications beyond viral evolution and can be applied to any basic and clinical research area that requires the identification of rare mutations.
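A toy version of the overlapping-read-pair idea (a simplified consensus rule; real pipelines such as the authors' use Phred-scaled error models rather than this additive heuristic): where the two reads of a pair cover the same bases, agreement boosts confidence and disagreement is resolved toward the higher-quality call.

```python
def merge_overlap(read_f, qual_f, read_r, qual_r):
    """Consensus over the overlapping region of a read pair covering the
    same bases: at each position keep the base with higher quality."""
    merged, quals = [], []
    for bf, qf, br, qr in zip(read_f, qual_f, read_r, qual_r):
        if bf == br:
            merged.append(bf)
            quals.append(qf + qr)        # agreement boosts confidence
        elif qf >= qr:
            merged.append(bf)
            quals.append(qf - qr)        # disagreement lowers confidence
        else:
            merged.append(br)
            quals.append(qr - qf)
    return "".join(merged), quals

fwd   = "ACGTACGA"                          # sequencing error at last base
rev   = "ACGTACGT"
q_fwd = [30, 30, 30, 30, 30, 30, 30, 10]    # low quality at the error
q_rev = [30, 30, 30, 30, 30, 30, 30, 30]

merged, quals = merge_overlap(fwd, q_fwd, rev, q_rev)
print(merged, quals)
```

Because each overlapping base is observed twice, independent sequencing errors rarely coincide, which is why the ORP protocol pushes the effective error rate low enough to call sub-0.05% variants.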

  20. A numerical procedure for recovering true scattering coefficients from measurements with wide-beam antennas

    NASA Technical Reports Server (NTRS)

    Wang, Qinglin; Gogineni, S. P.

    1991-01-01

    A numerical procedure is presented for estimating the true scattering coefficient, sigma(sup 0), from measurements made using wide-beam antennas. The use of wide-beam antennas results in an inaccurate estimate of sigma(sup 0) if the narrow-beam approximation is used in the retrieval process. To reduce this error, a correction procedure was proposed that estimates the error resulting from the narrow-beam approximation and uses it to obtain a more accurate estimate of sigma(sup 0). An exponential model was assumed to account for the variation of sigma(sup 0) with incidence angle, and the model parameters are estimated from measured data. Based on the model and knowledge of the antenna pattern, the procedure calculates the error due to the narrow-beam approximation. The procedure is shown to provide a significant improvement in the estimation of sigma(sup 0) obtained with wide-beam antennas. The proposed procedure is also shown to be insensitive to the assumed sigma(sup 0) model.
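    One correction step can be illustrated numerically. The sketch below assumes an exponential sigma(sup 0) model and a Gaussian two-way antenna pattern — illustrative stand-ins, not the paper's actual pattern or fitting procedure — and subtracts the beam-averaging bias from the narrow-beam estimate in dB.

```python
import numpy as np

def corrected_sigma0(theta0_deg, sigma0_narrow_db, b, beamwidth_deg):
    """One correction step for the narrow-beam approximation.

    Assumes sigma0 falls off exponentially with incidence angle,
    sigma0(theta) = sigma0(theta0) * exp(-b * (theta - theta0)),
    and a Gaussian two-way antenna power pattern.
    """
    theta = np.linspace(theta0_deg - 3 * beamwidth_deg,
                        theta0_deg + 3 * beamwidth_deg, 2001)
    # Two-way power pattern with half-power width = beamwidth_deg.
    g2 = np.exp(-4 * np.log(2) * ((theta - theta0_deg) / beamwidth_deg) ** 2)
    model = np.exp(-b * (theta - theta0_deg))
    # Ratio of the beam-weighted return to the narrow-beam value
    # (uniform grid, so the grid spacing cancels in the ratio).
    k = (g2 * model).sum() / g2.sum()
    return sigma0_narrow_db - 10 * np.log10(k)
```

    For a flat sigma(sup 0) (b = 0) the correction vanishes; the wider the beam or the steeper the angular dependence, the larger the correction becomes.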

  1. QSRA: a quality-value guided de novo short read assembler.

    PubMed

    Bryant, Douglas W; Wong, Weng-Keen; Mockler, Todd C

    2009-02-24

    New rapid high-throughput sequencing technologies have sparked the creation of a new class of assembler. Since all high-throughput sequencing platforms incorporate errors in their output, short-read assemblers must be designed to account for this error while utilizing all available data. We have designed and implemented an assembler, the Quality-value guided Short Read Assembler (QSRA), created to take advantage of quality-value scores as a further method of dealing with error. Compared to previously published algorithms, our assembler shows significant improvements not only in speed but also in output quality. QSRA generally produced the highest genomic coverage while being faster than VCAKE. QSRA is extremely competitive in its longest-contig and N50/N80 contig lengths, producing results of similar quality to those of EDENA and VELVET. QSRA provides a step closer to the goal of de novo assembly of complex genomes, improving upon the original VCAKE algorithm by not only drastically reducing runtimes but also increasing the viability of the assembly algorithm through further error-handling capabilities.

  2. MERIT DEM: A new high-accuracy global digital elevation model and its merit to global hydrodynamic modeling

    NASA Astrophysics Data System (ADS)

    Yamazaki, D.; Ikeshima, D.; Neal, J. C.; O'Loughlin, F.; Sampson, C. C.; Kanae, S.; Bates, P. D.

    2017-12-01

    Digital Elevation Models (DEMs) are fundamental data for flood modelling. While precise airborne DEMs are available in developed regions, most parts of the world rely on spaceborne DEMs, which include non-negligible height errors. Here we present the most accurate global DEM to date at 90 m resolution, produced by eliminating major error components from the SRTM and AW3D DEMs. Using multiple satellite datasets and multiple filtering techniques, we addressed absolute bias, stripe noise, speckle noise and tree height bias in the spaceborne DEMs. After the error removal, significant improvements were found in flat regions where height errors were larger than the topography variability, and landscape features such as river networks and hill-valley structures became clearly represented. We found the topographic slope of the previous DEMs was largely distorted in most of the world's major floodplains (e.g. Ganges, Nile, Niger, Mekong) and swamp forests (e.g. Amazon, Congo, Vasyugan). The developed DEM will greatly reduce the uncertainty in both global and regional flood modelling.

  3. Influence of Tooth Spacing Error on Gears With and Without Profile Modifications

    NASA Technical Reports Server (NTRS)

    Padmasolala, Giri; Lin, Hsiang H.; Oswald, Fred B.

    2000-01-01

    A computer simulation was conducted to investigate the effectiveness of profile modification for reducing dynamic loads in gears with different tooth spacing errors. The simulation examined varying amplitudes of spacing error and differences in the span of teeth over which the error occurs. The modifications considered included both linear and parabolic tip relief. The analysis considered spacing error that varies around most of the gear circumference (similar to a typical sinusoidal error pattern) as well as a shorter span of spacing errors occurring on only a few teeth. The dynamic analysis was performed using a revised version of a NASA gear dynamics code, modified to add tooth spacing errors to the analysis. Results obtained from the investigation show that linear tip relief is more effective in reducing dynamic loads on gears with small spacing errors, but parabolic tip relief becomes more effective as the amplitude of spacing error increases. In addition, the parabolic modification is more effective for the more severe error case, where the error is spread over a longer span of teeth. The findings of this study can be used to design robust tooth profile modifications for improving the dynamic performance of gear sets with different tooth spacing errors.
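    The two relief shapes compared in the study can be written down directly. This is a minimal geometric sketch with hypothetical parameter names (roll angles and relief amount are placeholders, not values from the paper): relief is zero up to the start-of-relief roll angle and grows either linearly or quadratically toward the prescribed amount at the tooth tip.

```python
def tip_relief(roll, roll_start, roll_tip, amount, kind="linear"):
    """Material removed from the involute profile at a given roll angle.

    Zero before relief starts; ramps up to `amount` at the tooth tip.
    `kind` selects the linear or parabolic shape compared in the study.
    """
    if roll <= roll_start:
        return 0.0
    t = (roll - roll_start) / (roll_tip - roll_start)  # 0..1 along relief span
    return amount * t if kind == "linear" else amount * t * t

# Halfway along the relief span, the parabolic form removes half as
# much material as the linear form (0.25 vs. 0.5 of the tip amount).
mid_linear = tip_relief(0.6, 0.2, 1.0, 0.02)               # 0.01
mid_parab = tip_relief(0.6, 0.2, 1.0, 0.02, "parabolic")   # 0.005
```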

  4. Task errors by emergency physicians are associated with interruptions, multitasking, fatigue and working memory capacity: a prospective, direct observation study.

    PubMed

    Westbrook, Johanna I; Raban, Magdalena Z; Walter, Scott R; Douglas, Heather

    2018-01-09

    Interruptions and multitasking have been demonstrated in experimental studies to reduce individuals' task performance. These behaviours are frequently used by clinicians in high-workload, dynamic clinical environments, yet their effects have rarely been studied. To assess the relative contributions of interruptions and multitasking by emergency physicians to prescribing errors. 36 emergency physicians were shadowed over 120 hours. All tasks, interruptions and instances of multitasking were recorded. Physicians' working memory capacity (WMC) and preference for multitasking were assessed using the Operation Span Task (OSPAN) and Inventory of Polychronic Values. Following observation, physicians were asked about their sleep in the previous 24 hours. Prescribing errors were used as a measure of task performance. We performed multivariate analysis of prescribing error rates to determine associations with interruptions and multitasking, also considering physician seniority, age, psychometric measures, workload and sleep. Physicians experienced 7.9 interruptions/hour. 28 clinicians were observed prescribing 239 medication orders which contained 208 prescribing errors. While prescribing, clinicians were interrupted 9.4 times/hour. Error rates increased significantly if physicians were interrupted (rate ratio (RR) 2.82; 95% CI 1.23 to 6.49) or multitasked (RR 1.86; 95% CI 1.35 to 2.56) while prescribing. Having below-average sleep showed a >15-fold increase in clinical error rate (RR 16.44; 95% CI 4.84 to 55.81). WMC was protective against errors; for every 10-point increase on the 75-point OSPAN, a 19% decrease in prescribing errors was observed. There was no effect of polychronicity, workload, physician gender or above-average sleep on error rates. Interruptions, multitasking and poor sleep were associated with significantly increased rates of prescribing errors among emergency physicians. WMC mitigated the negative influence of these factors to an extent. 
These results confirm experimental findings in other fields and raise questions about the acceptability of the high rates of multitasking and interruption in clinical environments. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  5. ALGORITHM TO REDUCE APPROXIMATION ERROR FROM THE COMPLEX-VARIABLE BOUNDARY-ELEMENT METHOD APPLIED TO SOIL FREEZING.

    USGS Publications Warehouse

    Hromadka, T.V.; Guymon, G.L.

    1985-01-01

    An algorithm is presented for the numerical solution of the Laplace equation boundary-value problem, which is assumed to apply to soil freezing or thawing. The Laplace equation is numerically approximated by the complex-variable boundary-element method. The algorithm aids in reducing integrated relative error by providing a true measure of modeling error along the solution domain boundary. This measure of error can be used to select locations for adding, removing, or relocating nodal points on the boundary or to provide bounds for the integrated relative error of unknown nodal variable values along the boundary.

  6. Calibration of misalignment errors in the non-null interferometry based on reverse iteration optimization algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Xinmu; Hao, Qun; Hu, Yao; Wang, Shaopu; Ning, Yan; Li, Tengfei; Chen, Shufen

    2017-10-01

    Since it need not compensate the whole aberration introduced by the aspheric surface, the non-null test has an advantage over the null test in applicability. However, retrace error, caused by the path difference between the rays reflected from the surface under test (SUT) and the incident rays, is introduced into the measurement and makes up the residual wavefront aberrations (RWAs) together with surface figure error (SFE), misalignment error and other influences. Because it is difficult to separate from the RWAs, the misalignment error may remain after measurement, and it is hard to identify whether it has been removed. Studying the removal of misalignment error is therefore a primary task. A brief demonstration of the digital Moiré interferometric technique is presented, and a calibration method for misalignment error based on a reverse iteration optimization (RIO) algorithm in the non-null test is addressed. The proposed method operates mostly in the virtual system and requires no accurate adjustment of the real interferometer, which is a significant advantage in reducing the errors introduced by repeated, complicated manual adjustment, and thereby improves the accuracy of the aspheric surface test. Simulation verification is presented in this paper. The calibration accuracy of the position and attitude reaches at least 10^-5 mm and 0.0056×10^-6 rad, respectively. The simulation demonstrates that the influence of misalignment error can be precisely calculated and removed after calibration.

  7. Risk-Aware Planetary Rover Operation: Autonomous Terrain Classification and Path Planning

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Fuchs, Thoams J.; Steffy, Amanda; Maimone, Mark; Yen, Jeng

    2015-01-01

    Identifying and avoiding terrain hazards (e.g., soft soil and pointy embedded rocks) are crucial for the safety of planetary rovers. This paper presents a newly developed ground-based Mars rover operation tool that mitigates terrain risks by automatically identifying hazards on the terrain, evaluating their risks, and suggesting to operators safe path options that avoid potential risks while achieving specified goals. The tool will benefit rover operations by reducing operation cost, reducing the cognitive load of rover operators, preventing human errors, and, most importantly, significantly reducing the risk of the loss of rovers.

  8. Insufficient Hartree–Fock Exchange in Hybrid DFT Functionals Produces Bent Alkynyl Radical Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oyeyemi, Victor B.; Keith, John A.; Pavone, Michele

    2012-01-11

    Density functional theory (DFT) is often used to determine the electronic and geometric structures of molecules. While studying alkynyl radicals, we discovered that DFT exchange-correlation (XC) functionals containing less than ~22% Hartree–Fock (HF) exchange led to qualitatively different structures than those predicted from ab initio HF and post-HF calculations or DFT XCs containing 25% or more HF exchange. We attribute this discrepancy to rehybridization at the radical center due to electron delocalization across the triple bonds of the alkynyl groups, which itself is an artifact of self-interaction and delocalization errors. Inclusion of sufficient exact exchange reduces these errors and suppresses this erroneous delocalization; we find that a threshold amount is needed for accurate structure determinations. Finally, below this threshold, significant errors in predicted alkyne thermochemistry emerge as a consequence.

  9. MPI Runtime Error Detection with MUST: Advances in Deadlock Detection

    DOE PAGES

    Hilbrich, Tobias; Protze, Joachim; Schulz, Martin; ...

    2013-01-01

    The widely used Message Passing Interface (MPI) is complex and rich. As a result, application developers require automated tools to avoid and to detect MPI programming errors. We present the Marmot Umpire Scalable Tool (MUST) that detects such errors with significantly increased scalability. We present improvements to our graph-based deadlock detection approach for MPI, which cover future MPI extensions. Our enhancements also check complex MPI constructs that no previous graph-based detection approach handled correctly. Finally, we present optimizations for the processing of MPI operations that reduce runtime deadlock detection overheads. Existing approaches often require O(p) analysis time per MPI operation, for p processes. We empirically observe that our improvements lead to sub-linear or better analysis time per operation for a wide range of real-world applications.
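    The basic idea behind graph-based deadlock detection can be sketched as a cycle search on a wait-for graph. This is a generic DFS sketch over a hypothetical rank-to-blockers map, not MUST's actual wait-for model (which also handles AND/OR semantics for wildcard receives and collectives).

```python
def find_deadlock(wait_for):
    """Detect a cycle in a wait-for graph of MPI processes.

    wait_for maps each rank to the set of ranks it is blocked on
    (e.g. a rank inside a blocking MPI_Recv waits on its source).
    A cycle means no process in it can ever proceed: deadlock.
    """
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {r: WHITE for r in wait_for}

    def dfs(r, path):
        color[r] = GRAY
        path.append(r)
        for s in wait_for.get(r, ()):
            if color.get(s, WHITE) == GRAY:       # back edge -> cycle
                return path[path.index(s):]
            if color.get(s, WHITE) == WHITE:
                cycle = dfs(s, path)
                if cycle:
                    return cycle
        path.pop()
        color[r] = BLACK
        return None

    for r in wait_for:
        if color[r] == WHITE:
            cycle = dfs(r, [])
            if cycle:
                return cycle
    return None

# Rank 0 waits on 1, 1 waits on 2, 2 waits on 0: a classic recv cycle.
cycle = find_deadlock({0: {1}, 1: {2}, 2: {0}})   # [0, 1, 2]
ok = find_deadlock({0: {1}, 1: {2}, 2: set()})    # None
```

    The scalability work in the paper is precisely about keeping such per-operation graph updates cheaper than linear in the number of processes.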

  10. Pricing Employee Stock Options (ESOs) with Random Lattice

    NASA Astrophysics Data System (ADS)

    Chendra, E.; Chin, L.; Sukmana, A.

    2018-04-01

    Employee Stock Options (ESOs) are stock options granted by companies to their employees. Unlike standard options that can be traded by typical institutional or individual investors, employees cannot sell or transfer their ESOs to other investors. The sale restrictions may induce the ESO's holder to exercise early. In a much-cited paper, Hull and White propose a binomial lattice for valuing ESOs which assumes that employees voluntarily exercise their ESOs if the stock price reaches a horizontal psychological barrier. Due to nonlinearity errors, the numerical pricing results oscillate significantly, which may lead to large pricing errors. In this paper, we use the random lattice method to price the Hull-White ESO model. This method reduces the nonlinearity error by aligning a layer of nodes of the random lattice with the psychological barrier.
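    The barrier-exercise lattice being corrected can be sketched with a plain CRR binomial tree. This is an illustrative sketch only, not the Hull-White model in full (vesting periods and employee exit rates are omitted) and not the random lattice itself: on this plain lattice the price oscillates in the step count whenever no node layer falls exactly on the barrier, which is the nonlinearity error the random lattice removes by construction.

```python
import math

def eso_binomial(S0, K, barrier, r, sigma, T, steps):
    """CRR binomial price of a simplified Hull-White-style ESO.

    The holder exercises voluntarily whenever the stock price reaches
    the psychological barrier; at maturity the option pays max(S-K, 0).
    """
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)       # risk-neutral up-probability
    disc = math.exp(-r * dt)
    # Terminal payoffs, indexed by the number of up-moves j.
    values = [max(S0 * u**j * d**(steps - j) - K, 0.0)
              for j in range(steps + 1)]
    for i in range(steps - 1, -1, -1):
        new = []
        for j in range(i + 1):
            S = S0 * u**j * d**(i - j)
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            # Voluntary exercise at or above the psychological barrier.
            new.append(max(S - K, 0.0) if S >= barrier else cont)
        values = new
    return values[0]

price_barrier = eso_binomial(100, 100, 150, 0.05, 0.3, 5, 200)
price_no_barrier = eso_binomial(100, 100, 1e12, 0.05, 0.3, 5, 200)
```

    Early exercise at the barrier forfeits time value on a non-dividend stock, so the barrier price sits below the unconstrained (effectively European) price.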

  11. A Methodology for Validating Safety Heuristics Using Clinical Simulations: Identifying and Preventing Possible Technology-Induced Errors Related to Using Health Information Systems

    PubMed Central

    Borycki, Elizabeth; Kushniruk, Andre; Carvalho, Christopher

    2013-01-01

    Internationally, health information systems (HIS) safety has emerged as a significant concern for governments. Recently, research has emerged that has documented the ability of HIS to be implicated in the harm and death of patients. Researchers have attempted to develop methods that can be used to prevent or reduce technology-induced errors. Some researchers are developing methods that can be employed prior to systems release. These methods include the development of safety heuristics and clinical simulations. In this paper, we outline our methodology for developing safety heuristics specific to identifying the features or functions of a HIS user interface design that may lead to technology-induced errors. We follow this with a description of a methodological approach to validate these heuristics using clinical simulations. PMID:23606902

  12. Active full-shell grazing-incidence optics

    NASA Astrophysics Data System (ADS)

    Roche, Jacqueline M.; Elsner, Ronald F.; Ramsey, Brian D.; O'Dell, Stephen L.; Kolodziejczak, Jeffrey J.; Weisskopf, Martin C.; Gubarev, Mikhail V.

    2016-09-01

    MSFC has a long history of developing full-shell grazing-incidence x-ray optics for both narrow (pointed) and wide field (surveying) applications. The concept presented in this paper shows the potential to use active optics to switch between narrow and wide-field geometries, while maintaining large effective area and high angular resolution. In addition, active optics has the potential to reduce errors due to mounting and manufacturing lightweight optics. The design presented corrects low spatial frequency error and has significantly fewer actuators than other concepts presented thus far in the field of active x-ray optics. Using a finite element model, influence functions are calculated using active components on a full-shell grazing-incidence optic. Next, the ability of the active optic to effect a change of optical prescription and to correct for errors due to manufacturing and mounting is modeled.

  13. Body position reproducibility and joint alignment stability criticality on a muscular strength research device

    NASA Astrophysics Data System (ADS)

    Nunez, F.; Romero, A.; Clua, J.; Mas, J.; Tomas, A.; Catalan, A.; Castellsaguer, J.

    2005-08-01

    MARES (Muscle Atrophy Research and Exercise System) is a computerized ergometer for neuromuscular research to be flown and installed onboard the International Space Station in 2007. The validity of the data acquired depends on controlling and reducing all significant error sources, one of which is misalignment of the joint rotation axis with respect to the motor axis. The error induced on the measurements is proportional to the misalignment between the two axes; therefore, the restraint system's performance is critical [1]. The MARES HRS (Human Restraint System) assures alignment within an acceptable range while performing the exercise (results: elbow movement 13.94 mm ± 5.45; knee movement 22.36 mm ± 6.06) and reproducibility of human positioning (results: elbow movement 2.82 mm ± 1.56; knee movement 7.45 mm ± 4.8). These results allow measurement errors induced by misalignment to be limited.

  14. Active Full-Shell Grazing-Incidence Optics

    NASA Technical Reports Server (NTRS)

    Davis, Jacqueline M.; Elsner, Ronald F.; Ramsey, Brian D.; O'Dell, Stephen L.; Kolodziejczak, Jeffery; Weisskopf, Martin C.; Gubarev, Mikhail V.

    2016-01-01

    MSFC has a long history of developing full-shell grazing-incidence x-ray optics for both narrow (pointed) and wide field (surveying) applications. The concept presented in this paper shows the potential to use active optics to switch between narrow and wide-field geometries, while maintaining large effective area and high angular resolution. In addition, active optics has the potential to reduce errors due to mounting and manufacturing lightweight optics. The design presented corrects low spatial frequency error and has significantly fewer actuators than other concepts presented thus far in the field of active x-ray optics. Using a finite element model, influence functions are calculated using active components on a full-shell grazing-incidence optic. Next, the ability of the active optic to effect a change of optical prescription and to correct for errors due to manufacturing and mounting is modeled.

  15. Relevant reduction effect with a modified thermoplastic mask of rotational error for glottic cancer in IMRT

    NASA Astrophysics Data System (ADS)

    Jung, Jae Hong; Jung, Joo-Young; Cho, Kwang Hwan; Ryu, Mi Ryeong; Bae, Sun Hyun; Moon, Seong Kwon; Kim, Yong Ho; Choe, Bo-Young; Suh, Tae Suk

    2017-02-01

    The purpose of this study was to analyze the glottis rotational error (GRE) when using a thermoplastic mask for patients with glottic cancer undergoing intensity-modulated radiation therapy (IMRT). We selected 20 patients with glottic cancer who had received IMRT using tomotherapy. Both kilovoltage computed tomography (planning kVCT) and daily megavoltage CT (MVCT) images were used to evaluate the error. Six anatomical landmarks in the images were defined to evaluate the correlation between the absolute GRE (°) and the length of contact between the mask and the patient's underlying skin (mask, mm). We statistically analyzed the results using Pearson's correlation coefficient and a linear regression analysis (P < 0.05). The mask contact length and the absolute GRE were verified to have a statistical correlation (P < 0.01). We found statistical significance for each parameter in the linear regression analysis (mask versus absolute roll: P = 0.004 [P < 0.05]; mask versus 3D error: P = 0.000 [P < 0.05]). The range of 3D errors with contact by the mask was 1.2% - 39.7% between the maximum- and no-contact cases in this study. A thermoplastic mask with a tight, increased contact area may contribute to uncertainty in reproducibility through variation of the absolute GRE. Thus, we suggest that a modified mask, such as one that covers only the glottis area, can significantly reduce patients' setup errors during treatment.
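    The statistical analysis described — Pearson's correlation plus a least-squares line between mask contact length and absolute GRE — can be sketched as below. The data layout, helper name, and numbers are invented for illustration; only the analysis pattern follows the study.

```python
import numpy as np

def mask_gre_relation(mask_contact_mm, abs_gre_deg):
    """Pearson r and least-squares line for mask contact vs. |GRE|.

    One contact length (mm) and one absolute glottis rotational
    error (degrees) per patient.
    """
    x = np.asarray(mask_contact_mm, float)
    y = np.asarray(abs_gre_deg, float)
    r = np.corrcoef(x, y)[0, 1]
    slope, intercept = np.polyfit(x, y, 1)
    return r, slope, intercept

# Synthetic illustration: larger contact length, larger rotational error.
contact = [10, 20, 30, 40, 50]
gre = [0.5, 0.9, 1.6, 2.1, 2.4]
r, slope, intercept = mask_gre_relation(contact, gre)
```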

  16. Radar walking speed measurements of seniors in their apartments: technology for fall prevention.

    PubMed

    Cuddihy, Paul E; Yardibi, Tarik; Legenzoff, Zachary J; Liu, Liang; Phillips, Calvin E; Abbott, Carmen; Galambos, Colleen; Keller, James; Popescu, Mihail; Back, Jessica; Skubic, Marjorie; Rantz, Marilyn J

    2012-01-01

    Falls are a significant cause of injury and accidental death among persons over the age of 65. Gait velocity is one of the parameters which have been correlated to the risk of falling. We aim to build a system which monitors gait in seniors and reports any changes to caregivers, who can then perform a clinical assessment and perform corrective and preventative actions to reduce the likelihood of falls. In this paper, we deploy a Doppler radar-based gait measurement system into the apartments of thirteen seniors. In scripted walks, we show the system measures gait velocity with a mean error of 14.5% compared to the time recorded by a clinician. With a calibration factor, the mean error is reduced to 10.5%. The radar is a promising sensing technology for gait velocity in a day-to-day senior living environment.
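    The calibration step — a single multiplicative factor bringing the radar readings closer to the clinician's reference — can be sketched as follows. The speeds below are made up for illustration; the paper reports this kind of factor cutting mean error from 14.5% to 10.5%.

```python
def calibrate(radar_speeds, reference_speeds):
    """Least-squares multiplicative calibration factor for radar gait speed.

    Chooses factor so that factor * radar best matches the
    clinician-timed reference, then returns the calibrated speeds.
    """
    num = sum(r * t for r, t in zip(radar_speeds, reference_speeds))
    den = sum(r * r for r in radar_speeds)
    factor = num / den
    return factor, [factor * r for r in radar_speeds]

def mean_pct_error(est, ref):
    """Mean absolute percentage error against the reference."""
    return 100 * sum(abs(e - t) / t for e, t in zip(est, ref)) / len(ref)

radar = [0.9, 1.1, 0.8, 1.0]   # m/s, hypothetical radar readings
truth = [1.0, 1.2, 0.9, 1.1]   # m/s, hypothetical clinician reference
factor, calibrated = calibrate(radar, truth)
```

    A systematic under- or over-reading is absorbed by the factor, while scatter from walk to walk is not, which is why the error shrinks but does not vanish.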

  17. Virtual design and construction of plumbing systems

    NASA Astrophysics Data System (ADS)

    Filho, João Bosco P. Dantas; Angelim, Bruno Maciel; Guedes, Joana Pimentel; de Castro, Marcelo Augusto Farias; Neto, José de Paula Barros

    2016-12-01

    Traditionally, the design coordination process is carried out by overlaying and comparing 2D drawings made by different project participants. Detecting information errors in a composite drawing is especially challenging and error prone. This procedure usually leaves many design errors undetected until construction begins, which typically leads to rework. Correcting conflict issues that were not identified during the design and coordination phase reduces the overall productivity of everyone involved in the construction process. The identification of construction issues in the field generates Requests for Information (RFIs), which are one cause of delays. Applying Virtual Design and Construction (VDC) tools to the coordination process can bring significant value to architecture, structure, and mechanical, electrical, and plumbing (MEP) designs in terms of a reduced number of undetected errors and requests for information. This paper focuses on evaluating requests for information (RFIs) associated with the water/sanitary facilities of a BIM model. It is thus expected to improve water/sanitary facility designs, as well as to help the virtual construction team notice and identify design problems. This is an exploratory and descriptive research study using a qualitative methodology. The study classifies RFIs into six analyzed categories: correction, omission, validation of information, modification, divergence of information and verification. The results demonstrate VDC's contribution to improving plumbing system designs. Recommendations are suggested to identify and avoid these RFI types in the plumbing system design process or during virtual construction.

  18. Gravity and Nonconservative Force Model Tuning for the GEOSAT Follow-On Spacecraft

    NASA Technical Reports Server (NTRS)

    Lemoine, Frank G.; Zelensky, Nikita P.; Rowlands, David D.; Luthcke, Scott B.; Chinn, Douglas S.; Marr, Gregory C.; Smith, David E. (Technical Monitor)

    2000-01-01

    The US Navy's GEOSAT Follow-On spacecraft was launched on February 10, 1998, and the primary objective of the mission was to map the oceans using a radar altimeter. Three radar altimeter calibration campaigns were conducted in 1999 and 2000. The spacecraft is tracked by satellite laser ranging (SLR) and Doppler beacons, and a limited amount of data has been obtained from the Global Positioning System (GPS) receiver on board the satellite. Even with EGM96, the predicted radial orbit error due to gravity field mismodelling (to 70x70) remains high at 2.61 cm (compared to 0.88 cm for TOPEX). We report on the preliminary gravity model tuning for GFO using SLR and altimeter crossover data. Preliminary solutions using SLR and GFO/GFO crossover data from CalVal campaigns I and II in June-August 1999 and January-February 2000 have reduced the predicted radial orbit error to 1.9 cm, and further reduction will be possible when additional data are added to the solutions. The gravity model tuning has principally improved the low-order m-daily terms and has significantly reduced the geographically correlated error present in this satellite orbit. In addition to gravity field mismodelling, the largest contributor to the orbit error is non-conservative force mismodelling. We report on further non-conservative force model tuning results using available data from over one cycle in beta prime.

  19. The effect of information provision on reduction of errors in intravenous drug preparation and administration by nurses in ICU and surgical wards.

    PubMed

    Abbasinazari, Mohammad; Zareh-Toranposhti, Samaneh; Hassani, Abdollah; Sistanizad, Mohammad; Azizian, Homa; Panahi, Yunes

    2012-01-01

    Malpractice in the preparation and administration of intravenous (IV) medications has been reported frequently, and inadequate knowledge among nurses has been reported as a cause of such errors. We aimed to evaluate the role of educating nurses via wall posters and informative pamphlets in reducing errors in the preparation and administration of intravenous drugs in 2 wards (ICU and surgery) of a teaching hospital in Tehran, Iran. A trained observer was stationed in the 2 wards during different work shifts. He recorded and scored the nurses' practice in preparing and administering IV drugs before and after the education process. 400 observations were evaluated: 200 before education and 200 after. On a 0-10 quality scale, the mean ± SD scores before and after education in the 2 wards were 4.51 (± 1.24) and 6.15 (± 1.23), respectively. There was a significant difference between the scores before and after the intervention in the ICU (P<0.001), in surgery (P<0.001), and in the two wards combined (P<0.001). Educating nurses with wall posters and informative pamphlets on the correct preparation and administration of IV drugs can reduce the number of errors.

  20. Mucuna pruriens seed extract reduces oxidative stress in nigrostriatal tissue and improves neurobehavioral activity in paraquat-induced Parkinsonian mouse model.

    PubMed

    Yadav, Satyndra Kumar; Prakash, Jay; Chouhan, Shikha; Singh, Surya Pratap

    2013-06-01

    Parkinson's disease (PD) is a neurodegenerative disease which causes rigidity, resting tremor and postural instability. Treatment for this disease is still under investigation. Mucuna pruriens (L.) is a traditional herbal medicine, used in India as a neuroprotective agent since 1500 B.C. In the present study, we evaluated the therapeutic effects of an aqueous extract of M. pruriens (Mp) seed in a Parkinsonian mouse model developed by chronic exposure to paraquat (PQ). Results of our study revealed that the nigrostriatal portion of the Parkinsonian mouse brain showed significantly increased levels of nitrite and malondialdehyde (MDA) and reduced levels of catalase compared to the control. In the Parkinsonian mice, hanging time was decreased, whereas narrow beam walk time and foot-printing errors were increased. Treatment with the aqueous seed extract of Mp significantly increased the catalase activity and decreased the MDA and nitrite levels compared to the untreated Parkinsonian mouse brain. Mp treatment also improved the behavioral abnormalities: it increased hanging time and decreased narrow beam walk time and foot-printing errors compared to the untreated Parkinsonian mouse brain. Furthermore, we observed a significant reduction in tyrosine hydroxylase (TH) immunoreactivity in the substantia nigra (SN) and striatum regions of the brain after treatment with PQ, which was considerably restored by the Mp seed extract. Our results suggest that Mp seed extract treatment significantly reduced PQ-induced neurotoxicity, as evident from the decrease in oxidative damage, physiological abnormalities and immunohistochemical changes in the Parkinsonian mouse. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Patient safety in otolaryngology: a descriptive review.

    PubMed

    Danino, Julian; Muzaffar, Jameel; Metcalfe, Chris; Coulson, Chris

    2017-03-01

    Human evaluation and judgement may include errors that can have disastrous results. Within medicine and healthcare there has been slow progress towards major changes in safety. Healthcare lags behind other specialised industries, such as aviation and nuclear power, where there have been significant improvements in overall safety, especially in reducing the risk of errors. Following several high-profile cases in the USA during the 1990s, a report titled "To Err Is Human: Building a Safer Health System" was published. The report extrapolated that in the USA approximately 50,000 to 100,000 patients may die each year as a result of medical errors. Traditionally, otolaryngology has always been regarded as a "safe specialty". A study in the USA in 2004 inferred that there may be 2600 cases of major morbidity and 165 deaths within the specialty. MEDLINE via the PubMed interface was searched for English-language articles published between 2000 and 2012, each search combining two or three of the keywords noted earlier. The review is limited to several generic topics within patient safety in otolaryngology; other areas covered are current topics of recent interest or new advances in technology. There has been a heightened awareness of patient safety within the healthcare community; it has become a major priority. Focus has shifted from apportioning blame to preventing errors and implementing patient safety mechanisms in healthcare delivery. Errors can be divided into errors of action and errors of knowledge or planning. In healthcare there are several factors that may influence adverse events and patient safety. Although technology may improve patient safety, it also introduces new sources of error. Working with people increases the opportunities for safety netting, and team working has been shown to have a beneficial effect on patient safety. Any field of work involving human decision-making will always carry a risk of error. Within otolaryngology, although patient safety has evolved along similar themes as in other surgical specialties, there are several specific high-risk areas. Medical error is a common problem and its human cost is of immense importance. Steps to reduce such errors require the identification of high-risk practice within a complex healthcare system. The commitment to patient safety and quality improvement in medicine depends on personal responsibility and professional accountability.

  2. The use of compressive sensing and peak detection in the reconstruction of microtubules length time series in the process of dynamic instability.

    PubMed

    Mahrooghy, Majid; Yarahmadian, Shantia; Menon, Vineetha; Rezania, Vahid; Tuszynski, Jack A

    2015-10-01

    Microtubules (MTs) are intra-cellular cylindrical protein filaments. They exhibit a unique phenomenon of stochastic growth and shrinkage, called dynamic instability. In this paper, we introduce a theoretical framework for applying Compressive Sensing (CS) to sampled data of the microtubule length in the process of dynamic instability. To reduce data density and reconstruct the original signal at relatively low sampling rates, we applied CS to experimental MT filament length time series modeled as Dichotomous Markov Noise (DMN). The results show that using CS together with the wavelet transform significantly reduces the recovery errors compared with reconstruction in the absence of the wavelet transform, especially at low and medium sampling rates. At sampling rates between 0.2 and 0.5, the Root-Mean-Squared Error (RMSE) decreases by approximately a factor of 3, and between 0.5 and 1 the RMSE remains small. We also apply a peak detection technique to the wavelet coefficients to detect and closely approximate the growth and shrinkage phases of MTs for computing the essential dynamic instability parameters, i.e., the transition frequencies and especially the growth and shrinkage rates. The results show that using compressed sensing together with the peak detection technique and the wavelet transform reduces the recovery errors for these parameters. Copyright © 2015 Elsevier Ltd. All rights reserved.
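    The dynamic-instability parameters mentioned in this record (transition times, growth and shrinkage rates) can be illustrated with a toy calculation. The sketch below applies peak detection to the discrete derivative of a noiseless, hypothetical length trace; the paper's pipeline instead works on wavelet coefficients of CS-recovered data, and all rates and switch times here are invented for illustration:

```python
import numpy as np

# Toy microtubule length trace: growth at +0.05 um/step, a catastrophe
# (shrinkage at -0.15 um/step) between steps 100 and 150, then rescue
# back to growth.  All parameter values are hypothetical.
n_steps = 300
t = np.arange(n_steps)
v = np.where((t >= 100) & (t < 150), -0.15, 0.05)[:-1]  # per-step velocity
length = np.concatenate([[5.0], 5.0 + np.cumsum(v)])    # length time series

# "Peak detection" on the discrete derivative: a sign change in the
# first difference marks a catastrophe or rescue event.
d = np.diff(length)                                  # equals v (noiseless)
events = np.where(np.diff(np.sign(d)) != 0)[0] + 1   # transition indices

# Dynamic-instability parameters: growth/shrinkage rates as the mean
# derivative within each phase, and the transition frequency per step.
growth_rate = d[d > 0].mean()
shrink_rate = d[d < 0].mean()
transition_freq = len(events) / n_steps

print(events, growth_rate, shrink_rate, transition_freq)
```

    With noisy or compressed data, the thresholding built into the wavelet coefficients takes over the role that the exact sign change plays in this noiseless sketch.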

  3. Reduced-cost second-order algebraic-diagrammatic construction method for excitation energies and transition moments

    NASA Astrophysics Data System (ADS)

    Mester, Dávid; Nagy, Péter R.; Kállay, Mihály

    2018-03-01

    A reduced-cost implementation of the second-order algebraic-diagrammatic construction [ADC(2)] method is presented. We introduce approximations by restricting virtual natural orbitals and natural auxiliary functions, which results, on average, in more than an order of magnitude speedup compared to conventional, density-fitting ADC(2) algorithms. The present scheme is the successor of our previous approach [D. Mester, P. R. Nagy, and M. Kállay, J. Chem. Phys. 146, 194102 (2017)], which has been successfully applied to obtain singlet excitation energies with the linear-response second-order coupled-cluster singles and doubles model. Here we report further methodological improvements and the extension of the method to compute singlet and triplet ADC(2) excitation energies and transition moments. The various approximations are carefully benchmarked, and conservative truncation thresholds are selected which guarantee errors much smaller than the intrinsic error of the ADC(2) method. Using the canonical values as reference, we find that the mean absolute error for both singlet and triplet ADC(2) excitation energies is 0.02 eV, while that for oscillator strengths is 0.001 a.u. The rigorous cutoff parameters together with the significantly reduced operation count and storage requirements allow us to obtain accurate ADC(2) excitation energies and transition properties using triple-ζ basis sets for systems of up to one hundred atoms.

  4. Improving the Thermal, Radial and Temporal Accuracy of the Analytical Ultracentrifuge through External References

    PubMed Central

    Ghirlando, Rodolfo; Balbo, Andrea; Piszczek, Grzegorz; Brown, Patrick H.; Lewis, Marc S.; Brautigam, Chad A.; Schuck, Peter; Zhao, Huaying

    2013-01-01

    Sedimentation velocity (SV) is a method based on first principles that provides a precise hydrodynamic characterization of macromolecules in solution. Due to recent improvements in data analysis, the accuracy of experimental SV data emerges as a limiting factor in its interpretation. Our goal was to unravel the sources of experimental error and develop improved calibration procedures. We implemented the use of a Thermochron iButton® temperature logger to directly measure the temperature of a spinning rotor, and detected deviations that can translate into an error of as much as 10% in the sedimentation coefficient. We further designed a precision mask with equidistant markers to correct for instrumental errors in the radial calibration, which were observed to span a range of 8.6%. The need for an independent time calibration emerged with use of the current data acquisition software (Zhao et al., doi 10.1016/j.ab.2013.02.011), and we now show that smaller but significant time errors of up to 2% also occur with earlier versions. After application of these calibration corrections, the sedimentation coefficients obtained from eleven instruments displayed a significantly reduced standard deviation of ∼0.7%. This study demonstrates the need for external calibration procedures and regular control experiments with a sedimentation coefficient standard. PMID:23711724

  5. Improving the thermal, radial, and temporal accuracy of the analytical ultracentrifuge through external references.

    PubMed

    Ghirlando, Rodolfo; Balbo, Andrea; Piszczek, Grzegorz; Brown, Patrick H; Lewis, Marc S; Brautigam, Chad A; Schuck, Peter; Zhao, Huaying

    2013-09-01

    Sedimentation velocity (SV) is a method based on first principles that provides a precise hydrodynamic characterization of macromolecules in solution. Due to recent improvements in data analysis, the accuracy of experimental SV data emerges as a limiting factor in its interpretation. Our goal was to unravel the sources of experimental error and develop improved calibration procedures. We implemented the use of a Thermochron iButton temperature logger to directly measure the temperature of a spinning rotor and detected deviations that can translate into an error of as much as 10% in the sedimentation coefficient. We further designed a precision mask with equidistant markers to correct for instrumental errors in the radial calibration that were observed to span a range of 8.6%. The need for an independent time calibration emerged with use of the current data acquisition software (Zhao et al., Anal. Biochem., 437 (2013) 104-108), and we now show that smaller but significant time errors of up to 2% also occur with earlier versions. After application of these calibration corrections, the sedimentation coefficients obtained from 11 instruments displayed a significantly reduced standard deviation of approximately 0.7%. This study demonstrates the need for external calibration procedures and regular control experiments with a sedimentation coefficient standard. Published by Elsevier Inc.

  6. A Medication Safety Model: A Case Study in Thai Hospital

    PubMed Central

    Rattanarojsakul, Phichai; Thawesaengskulthai, Natcha

    2013-01-01

    Reaching zero defects is vital in medication services. Medication error can be reduced if its causes are recognized. The purpose of this study is to develop a conceptual framework of the causes of medication error in Thailand and to examine the relationship between these factors and their importance. The study was carried out through an in-depth case study and a survey of hospital personnel involved in the drug use process. The structured survey was based on Emergency Care Research Institute (ECRI) (2008) questionnaires focusing on the important factors that affect medication safety. Additional questionnaire items addressed the context of Thailand's private hospitals and were validated by five qualified hospital experts. Pearson correlation analysis revealed 14 important factors showing a linear relationship with drug administration error, with the exception of medication reconciliation. By independent-samples t-test, administration error in the hospital was significantly related to external impact. Multiple regression analysis of the details of medication administration also indicated that patient identification before administration of medication, detection of the risk of medication adverse effects, and assurance of medication administration at the right time, dosage, and route were statistically significant at the 0.05 level. The major implication of the study is a proposed medication safety model for a Thai private hospital. PMID:23985110

  7. A medication safety model: a case study in Thai hospital.

    PubMed

    Rattanarojsakul, Phichai; Thawesaengskulthai, Natcha

    2013-06-12

    Reaching zero defects is vital in medication services. Medication error can be reduced if its causes are recognized. The purpose of this study is to develop a conceptual framework of the causes of medication error in Thailand and to examine the relationship between these factors and their importance. The study was carried out through an in-depth case study and a survey of hospital personnel involved in the drug use process. The structured survey was based on Emergency Care Research Institute (ECRI) (2008) questionnaires focusing on the important factors that affect medication safety. Additional questionnaire items addressed the context of Thailand's private hospitals and were validated by five qualified hospital experts. Pearson correlation analysis revealed 14 important factors showing a linear relationship with drug administration error, with the exception of medication reconciliation. By independent-samples t-test, administration error in the hospital was significantly related to external impact. Multiple regression analysis of the details of medication administration also indicated that patient identification before administration of medication, detection of the risk of medication adverse effects, and assurance of medication administration at the right time, dosage, and route were statistically significant at the 0.05 level. The major implication of the study is a proposed medication safety model for a Thai private hospital.

  8. Experimental test of dense wavelength-division multiplexing using novel, periodic-group-delay-complemented dispersion compensation and dispersion-managed solitons

    NASA Astrophysics Data System (ADS)

    Mollenauer, Linn F.; Grant, Andrew; Liu, Xiang; Wei, Xing; Xie, Chongjin; Kang, Inuk

    2003-11-01

    In an all-Raman amplified, recirculating loop containing 100-km spans, we have tested dense wavelength-division multiplexing at 10 Gbits/s per channel, using dispersion-managed solitons and a novel, periodic-group-delay-complemented dispersion-compensation scheme that greatly reduces the timing jitter from interchannel collisions. The achieved working distances are ~9000 and ~20,000 km for uncorrected bit error rates of <10^-8 and <10^-3, respectively, the latter corresponding to the use of "enhanced" forward error correction; significantly, these distances are very close to those achievable in single-channel transmission in the same system.

  9. Patient Safety Culture and the Second Victim Phenomenon: Connecting Culture to Staff Distress in Nurses.

    PubMed

    Quillivan, Rebecca R; Burlison, Jonathan D; Browne, Emily K; Scott, Susan D; Hoffman, James M

    2016-08-01

    Second victim experiences can affect the well-being of health care providers and compromise patient safety. Many factors associated with improved coping after patient safety event involvement are also components of a strong patient safety culture, so supportive patient safety cultures may reduce second victim-related trauma. A cross-sectional survey study was conducted to assess the influence of patient safety culture on second victim-related distress. The Agency for Healthcare Research and Quality (AHRQ) Hospital Survey on Patient Safety Culture (HSOPSC) and the Second Victim Experience and Support Tool (SVEST), which was developed to assess organizational support and personal and professional distress after involvement in a patient safety event, were administered to nurses involved in direct patient care. Of 358 nurses at a specialized pediatric hospital, 169 (47.2%) completed both surveys. Hierarchical linear regression demonstrated that the patient safety culture survey dimension nonpunitive response to error was significantly associated with reductions in the second victim survey dimensions of psychological, physical, and professional distress (p < 0.001). As a mediator, organizational support fully explained the nonpunitive response to error-physical distress and nonpunitive response to error-professional distress relationships and partially explained the nonpunitive response to error-psychological distress relationship. The results suggest that punitive safety cultures may contribute to self-reported perceptions of second victim-related psychological, physical, and professional distress, which could reflect a lack of organizational support. Reducing punitive responses to error and encouraging supportive coworker, supervisor, and institutional interactions may be useful strategies to manage the severity of second victim experiences.

  10. Reducing Modeling Error of Graphical Methods for Estimating Volume of Distribution Measurements in PIB-PET study

    PubMed Central

    Guo, Hongbin; Renaut, Rosemary A; Chen, Kewei; Reiman, Eric M

    2010-01-01

    Graphical analysis methods are widely used in positron emission tomography quantification because of their simplicity and model independence. But they may, particularly for reversible kinetics, lead to bias in the estimated parameters. The source of the bias is commonly attributed to noise in the data. Assuming a two-tissue compartmental model, we investigate the bias that originates from modeling error. This bias is an intrinsic property of the simplified linear models used for limited scan durations, and it is exaggerated by random noise and numerical quadrature error. Conditions are derived under which Logan's graphical method either over- or under-estimates the distribution volume in the noise-free case. The bias caused by modeling error is quantified analytically. The presented analysis shows that the bias of graphical methods is inversely proportional to the dissociation rate. Furthermore, visual examination of the linearity of the Logan plot is not sufficient for guaranteeing that equilibrium has been reached. A new model which retains the elegant properties of graphical analysis methods is presented, along with a numerical algorithm for its solution. We perform simulations with the fibrillar amyloid β radioligand [11C]benzothiazole-aniline using published data from the University of Pittsburgh and Rotterdam groups. The results show that the proposed method significantly reduces the bias due to modeling error. Moreover, the results for data acquired over a 70-minute scan duration are at least as good as those obtained using existing methods for data acquired over a 90-minute scan duration. PMID:20493196
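    As background to the graphical analysis discussed in this record, the sketch below simulates a one-tissue compartment model (for which the Logan plot is exactly linear) and recovers the distribution volume from the late-time slope. The kinetic constants and plasma input are hypothetical, and this illustrates the classic Logan method rather than the modified model the paper proposes:

```python
import numpy as np

# Hypothetical one-tissue compartment model: K1 = 0.1 /min, k2 = 0.05 /min,
# so the true distribution volume is VT = K1/k2 = 2.0.
# Plasma input Cp(t) = exp(-0.1 t) in arbitrary units.
K1, k2, lam = 0.1, 0.05, 0.1
dt = 0.1
t = np.arange(0.0, 180.0, dt)
Cp = np.exp(-lam * t)
# Exact tissue response to this input (convolution solved in closed form).
CT = K1 * (np.exp(-k2 * t) - np.exp(-lam * t)) / (lam - k2)

def cumtrapz(y, dt):
    """Cumulative trapezoidal integral, same length as y, starting at 0."""
    return np.concatenate([[0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * dt)])

int_Cp, int_CT = cumtrapz(Cp, dt), cumtrapz(CT, dt)

# Logan plot: int_CT/CT versus int_Cp/CT.  For a one-tissue model this is
# linear with slope VT and intercept -1/k2 once equilibrium is reached.
late = t > 60.0                      # fit only the late, linear portion
x = int_Cp[late] / CT[late]
y = int_CT[late] / CT[late]
slope, intercept = np.polyfit(x, y, 1)
print(f"Logan slope (estimated VT) = {slope:.3f}")
```

    For reversible two-tissue kinetics and finite scan durations, the same fit acquires the model-dependent bias the paper analyzes, which is why apparent linearity alone does not guarantee equilibrium.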

  11. Follow on Research for Multi-Utility Technology Test Bed Aircraft at NASA Dryden Flight Research Center (FY13 Progress Report)

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi

    2013-01-01

    Modern aircraft employ a significant fraction of their weight in composite materials to reduce weight and improve performance. Aircraft aeroservoelastic models are typically characterized by significant levels of model parameter uncertainty due to the composite manufacturing process. Small modeling errors in the finite element model will eventually induce errors in the structural flexibility and mass, thus propagating into unpredictable errors in the unsteady aerodynamics and the control law design. One of the primary objectives of the Multi Utility Technology Test-bed (MUTT) aircraft is the flight demonstration of active flutter suppression; therefore, this study identifies the primary and secondary modes for structural model tuning based on the flutter analysis of the MUTT aircraft. A ground-vibration-test-validated structural dynamic finite element model of the MUTT aircraft is created and is improved using the in-house Multi-disciplinary Design, Analysis, and Optimization tool. Two different weight configurations of the MUTT aircraft are improved simultaneously in a single model-tuning procedure.

  12. Optics measurement algorithms and error analysis for the proton energy frontier

    NASA Astrophysics Data System (ADS)

    Langner, A.; Tomás, R.

    2015-03-01

    Optics measurement algorithms have been improved in preparation for the commissioning of the LHC at higher energy, i.e., with an increased damage potential. Due to machine protection considerations, the higher energy sets tighter limits on the maximum excitation amplitude and the total beam charge, reducing the signal-to-noise ratio of optics measurements. Furthermore, the precision in 2012 (4 TeV) was insufficient to understand beam size measurements and determine interaction point (IP) β-functions (β*). A new, more sophisticated algorithm has been developed which takes into account both the statistical and systematic errors involved in this measurement. This makes it possible to combine more beam position monitor measurements for deriving the optical parameters and is demonstrated to significantly improve the accuracy and precision. Measurements from the 2012 run have been reanalyzed; due to the improved algorithms, the derived optical parameters reach a significantly higher precision, with average error bars decreased by a factor of three to four. This allowed the calculation of β* values and proved fundamental to understanding the emittance evolution during the energy ramp.
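    The error-bar reduction obtained by combining more beam position monitor readings can be illustrated with generic inverse-variance weighting (a standard statistical device, not the specific LHC algorithm; all numbers below are invented):

```python
import numpy as np

# Hypothetical independent estimates of one optical parameter from
# several beam position monitors, each with its own statistical error.
values = np.array([1.02, 0.98, 1.01, 0.97, 1.03, 1.00])
sigmas = np.array([0.04, 0.05, 0.04, 0.06, 0.05, 0.04])

# Inverse-variance weighting: the combined estimate is the weighted
# mean, and its error bar shrinks roughly as 1/sqrt(N).
w = 1.0 / sigmas**2
combined = np.sum(w * values) / np.sum(w)
combined_sigma = 1.0 / np.sqrt(np.sum(w))

print(combined, combined_sigma)
```

    Handling the systematic errors, which do not average down this way, is precisely what required the more sophisticated algorithm described in the abstract.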

  13. Pricing and hedging derivative securities with neural networks: Bayesian regularization, early stopping, and bagging.

    PubMed

    Gençay, R; Qi, M

    2001-01-01

    We study the effectiveness of cross validation, Bayesian regularization, early stopping, and bagging to mitigate overfitting and improve generalization for pricing and hedging derivative securities with daily S&P 500 index call options from January 1988 to December 1993. Our results indicate that Bayesian regularization can generate significantly smaller pricing and delta-hedging errors than the baseline neural-network (NN) model and the Black-Scholes model for some years. While early stopping does not affect the pricing errors, it significantly reduces the hedging error (HE) in four of the six years we investigated. Although computationally most demanding, bagging seems to provide the most accurate pricing and delta hedging. Furthermore, the standard deviation of the MSPE of bagging is far less than that of the baseline model in all six years, and the standard deviation of the average HE of bagging is far less than that of the baseline model in five out of six years. We conclude that these techniques should be used at least in cases when no appropriate hints are available.
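    The Black-Scholes model that serves as the benchmark in this record can be sketched in a few lines; the price and the delta (the hedge ratio whose error the paper measures) come from the same closed form. The parameter values below are arbitrary illustrations:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price and delta (the hedge ratio)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    price = S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)
    delta = norm_cdf(d1)   # shares of the underlying held per short call
    return price, delta

# Hypothetical at-the-money option: spot 100, strike 100, 1 year to
# expiry, 5% risk-free rate, 20% volatility.
price, delta = bs_call(100.0, 100.0, 1.0, 0.05, 0.2)
print(f"price = {price:.4f}, delta = {delta:.4f}")
```

    The neural-network approaches in the paper replace this parametric formula with a learned pricing function, and the regularization schemes under study control how far that function can stray from the data.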

  14. The effectiveness of risk management program on pediatric nurses' medication error.

    PubMed

    Dehghan-Nayeri, Nahid; Bayat, Fariba; Salehi, Tahmineh; Faghihzadeh, Soghrat

    2013-09-01

    Medication therapy is one of the most complex and high-risk clinical processes that nurses deal with. Medication error is the most common type of error that brings about damage and death to patients, especially pediatric ones. However, these errors are preventable. Identifying and preventing undesirable events leading to medication errors are the main risk management activities. The aim of this study was to investigate the effectiveness of a risk management program on the pediatric nurses' medication error rate. This study is a quasi-experimental one with a comparison group. In this study, 200 nurses were recruited from two main pediatric hospitals in Tehran. In the experimental hospital, we applied the risk management program for a period of 6 months. Nurses of the control hospital followed the hospital's routine schedule. A pre- and post-test was performed to measure the frequency of medication error events. SPSS software, t-test, and regression analysis were used for data analysis. After the intervention, the medication error rate of nurses at the experimental hospital was significantly lower (P < 0.001) and the error-reporting rate was higher (P < 0.007) compared to before the intervention and also in comparison to the nurses of the control hospital. Based on the results of this study and taking into account the high-risk nature of the medical environment, applying quality-control programs such as risk management can effectively prevent the occurrence of hospital undesirable events. Nursing managers can reduce the medication error rate by applying risk management programs. However, this program cannot succeed without nurses' cooperation.

  15. Cathodal transcranial direct current stimulation in children with dystonia: a pilot open-label trial.

    PubMed

    Young, Scott J; Bertucco, Matteo; Sheehan-Stross, Rebecca; Sanger, Terence D

    2013-10-01

    Studies suggest that dystonia is associated with increased motor cortex excitability. Cathodal transcranial direct current stimulation can temporarily reduce motor cortex excitability. To test whether stimulation of the motor cortex can reduce dystonic symptoms in children, we measured tracking performance and muscle overflow using an electromyogram tracking task before and after stimulation. Of 10 participants, 3 showed a significant reduction in overflow, and a fourth showed a significant reduction in tracking error. Overflow decreased more when the hand contralateral to the cathode performed the task than when the hand ipsilateral to the cathode performed the task. Averaged over all participants, the results did not reach statistical significance. These results suggest that cathodal stimulation may allow a subset of children to control muscles or reduce involuntary overflow activity. Further testing is needed to confirm these results in a blinded trial and identify the subset of children who are likely to respond.

  16. Oil detection in the coastal marshes of Louisiana using MESMA applied to band subsets of AVIRIS data

    USGS Publications Warehouse

    Peterson, Seth H.; Roberts, Dar A.; Beland, Michael; Kokaly, Raymond F.; Ustin, Susan L.

    2015-01-01

    We mapped oil presence in the marshes of Barataria Bay, Louisiana following the Deepwater Horizon oil spill using Airborne Visible InfraRed Imaging Spectrometer (AVIRIS) data. Oil and non-photosynthetic vegetation (NPV) have very similar spectra, differing only in two narrow hydrocarbon absorption regions around 1700 and 2300 nm. Confusion between NPV and oil is expressed as an increase in oil fraction error with increasing NPV, as shown by Multiple Endmember Spectral Mixture Analysis (MESMA) applied to synthetic spectra generated with known endmember fractions. Significantly, the magnitude of error varied depending upon the type of NPV in the mixture. To reduce error, we used stable zone unmixing to identify a nine band subset that emphasized the hydrocarbon absorption regions, allowing for more accurate detection of oil presence using MESMA. When this band subset was applied to post-spill AVIRIS data acquired over Barataria Bay on several dates following the 2010 oil spill, accuracies ranged from 87.5% to 93.3%. Oil presence extended 10.5 m into the marsh for oiled shorelines, showing a reduced oil fraction with increasing distance from the shoreline.

  17. Attention and memory bias to facial emotions underlying negative symptoms of schizophrenia.

    PubMed

    Jang, Seon-Kyeong; Park, Seon-Cheol; Lee, Seung-Hwan; Cho, Yang Seok; Choi, Kee-Hong

    2016-01-01

    This study assessed bias in selective attention to facial emotions in the negative symptoms of schizophrenia and its influence on subsequent memory for facial emotions. Thirty people with schizophrenia who had high or low levels of negative symptoms (n = 15 each) and 21 healthy controls completed a visual probe detection task investigating selective attention bias (happy, sad, and angry faces randomly presented for 50, 500, or 1000 ms). A yes/no incidental facial memory task was then completed. Attention bias scores and recognition errors were calculated. Those with high negative symptoms exhibited reduced attention to emotional faces relative to neutral faces; those with low negative symptoms showed the opposite pattern when faces were presented for 500 ms, regardless of valence. Compared to healthy controls, those with high negative symptoms made more errors for happy faces in the memory task. Reduced attention to emotional faces in the probe detection task was significantly associated with less pleasure and motivation and with more recognition errors for happy faces in the schizophrenia group only. Attention bias away from emotional information relatively early in the attentional process, and the associated diminished positive memory, may relate to pathological mechanisms for negative symptoms.

  18. Associations between intrusive thoughts, reality discrimination and hallucination-proneness in healthy young adults.

    PubMed

    Smailes, David; Meins, Elizabeth; Fernyhough, Charles

    2015-01-01

    People who experience intrusive thoughts are at increased risk of developing hallucinatory experiences, as are people who have weak reality discrimination skills. No study has yet examined whether these two factors interact to make a person especially prone to hallucinatory experiences. The present study examined this question in a non-clinical sample. Participants were 160 students, who completed a reality discrimination task, as well as self-report measures of cannabis use, negative affect, intrusive thoughts and auditory hallucination-proneness. The possibility of an interaction between reality discrimination performance and level of intrusive thoughts was assessed using multiple regression. The number of reality discrimination errors and level of intrusive thoughts were independent predictors of hallucination-proneness. The reality discrimination errors × intrusive thoughts interaction term was significant, with participants who made many reality discrimination errors and reported high levels of intrusive thoughts being especially prone to hallucinatory experiences. Hallucinatory experiences are more likely to occur in people who report high levels of intrusive thoughts and have weak reality discrimination skills. If applicable to clinical samples, these findings suggest that improving patients' reality discrimination skills and reducing the number of intrusive thoughts they experience may reduce the frequency of hallucinatory experiences.

  19. RMP: Reduced-set matching pursuit approach for efficient compressed sensing signal reconstruction.

    PubMed

    Abdel-Sayed, Michael M; Khattab, Ahmed; Abu-Elyazeed, Mohamed F

    2016-11-01

    Compressed sensing enables the acquisition of sparse signals at a rate that is much lower than the Nyquist rate. Compressed sensing initially adopted ℓ1 minimization for signal reconstruction, which is computationally expensive. Several greedy recovery algorithms have recently been proposed for signal reconstruction at a lower computational complexity than the optimal ℓ1 minimization, while maintaining good reconstruction accuracy. In this paper, the Reduced-set Matching Pursuit (RMP) greedy recovery algorithm is proposed for compressed sensing. Unlike existing approaches, which select either too many or too few correlation values per iteration, RMP aims at selecting the most sufficient number of correlation values per iteration, which improves both the reconstruction time and the error. Furthermore, RMP prunes the estimated signal and hence excludes incorrectly selected values. The RMP algorithm achieves higher reconstruction accuracy at a significantly lower computational complexity than existing greedy recovery algorithms. It is even superior to ℓ1 minimization in terms of the normalized time-error product, a new metric introduced to measure the trade-off between reconstruction time and error. RMP's superior performance is illustrated with both noiseless and noisy samples.
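    The greedy-recovery family that RMP belongs to can be sketched with classic Orthogonal Matching Pursuit; RMP differs in keeping an adaptively chosen set of correlation values per iteration and pruning the estimate, which is not reproduced here. The problem sizes, seed, and sensing matrix below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

# A k-sparse signal measured through a random Gaussian sensing matrix.
m, n, k = 64, 128, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)            # sensing matrix
x = np.zeros(n)
x[[3, 17, 40, 77, 101]] = [1.0, -1.0, 2.0, -0.5, 1.5]   # true sparse signal
y = A @ x                                               # compressed samples

def omp(A, y, k):
    """Orthogonal Matching Pursuit: pick the column most correlated with
    the residual, then re-fit by least squares on the current support."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))      # best correlation
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(A, y, k)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

    At these sizes a single correlation value per iteration suffices; RMP's multi-value selection and pruning aim to keep this reliability while cutting the number of iterations on larger problems.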

  20. Diagnostic Errors in Ambulatory Care: Dimensions and Preventive Strategies

    ERIC Educational Resources Information Center

    Singh, Hardeep; Weingart, Saul N.

    2009-01-01

    Despite an increasing focus on patient safety in ambulatory care, progress in understanding and reducing diagnostic errors in this setting lag behind many other safety concerns such as medication errors. To explore the extent and nature of diagnostic errors in ambulatory care, we identified five dimensions of ambulatory care from which errors may…

  1. Critical older driver errors in a national sample of serious U.S. crashes.

    PubMed

    Cicchino, Jessica B; McCartt, Anne T

    2015-07-01

    Older drivers are at increased risk of crash involvement per mile traveled. The purpose of this study was to examine older driver errors in serious crashes to determine which errors are most prevalent. The National Highway Traffic Safety Administration's National Motor Vehicle Crash Causation Survey collected in-depth, on-scene data for a nationally representative sample of 5470 U.S. police-reported passenger vehicle crashes during 2005-2007 for which emergency medical services were dispatched. There were 620 crashes involving 647 drivers aged 70 and older, representing 250,504 crash-involved older drivers. The proportion of various critical errors made by drivers aged 70 and older were compared with those made by drivers aged 35-54. Driver error was the critical reason for 97% of crashes involving older drivers. Among older drivers who made critical errors, the most common were inadequate surveillance (33%) and misjudgment of the length of a gap between vehicles or of another vehicle's speed, illegal maneuvers, medical events, and daydreaming (6% each). Inadequate surveillance (33% vs. 22%) and gap or speed misjudgment errors (6% vs. 3%) were more prevalent among older drivers than middle-aged drivers. Seventy-one percent of older drivers' inadequate surveillance errors were due to looking and not seeing another vehicle or failing to see a traffic control rather than failing to look, compared with 40% of inadequate surveillance errors among middle-aged drivers. About two-thirds (66%) of older drivers' inadequate surveillance errors and 77% of their gap or speed misjudgment errors were made when turning left at intersections. When older drivers traveled off the edge of the road or traveled over the lane line, this was most commonly due to non-performance errors such as medical events (51% and 44%, respectively), whereas middle-aged drivers were involved in these crash types for other reasons. 
Gap or speed misjudgment errors and inadequate surveillance errors were significantly more prevalent among female older drivers than among female middle-aged drivers, but the prevalence of these errors did not differ significantly between older and middle-aged male drivers. These errors comprised 51% of errors among older female drivers but only 31% among older male drivers. Efforts to reduce older driver crash involvements should focus on diminishing the likelihood of the most common driver errors. Countermeasures that simplify or remove the need to make left turns across traffic such as roundabouts, protected left turn signals, and diverging diamond intersection designs could decrease the frequency of inadequate surveillance and gap or speed misjudgment errors. In the future, vehicle-to-vehicle and vehicle-to-infrastructure communications may also help protect older drivers from these errors. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. A novel post-processing scheme for two-dimensional electrical impedance tomography based on artificial neural networks

    PubMed Central

    2017-01-01

Objective Electrical Impedance Tomography (EIT) is a powerful non-invasive technique for imaging applications. The goal is to estimate the electrical properties of living tissues by measuring the potential at the boundary of the domain. Because it is safe for patient health and has no known hazards, EIT is an attractive and promising technology. However, it suffers from a particular technical difficulty: solving a nonlinear inverse problem in real time. Several nonlinear approaches have been proposed as replacements for the linear solver, but in practice very few are capable of stable, high-quality, real-time EIT imaging, either because of their low robustness to errors and inaccurate modeling or because they require considerable computational effort. Methods In this paper, a post-processing technique based on an artificial neural network (ANN) is proposed to obtain a nonlinear solution to the inverse problem, starting from a linear solution. While common ANN-based reconstruction methods estimate the solution directly from the measured data, the method proposed here enhances the solution obtained from a linear solver. Conclusion Applying a linear reconstruction algorithm before applying an ANN reduces the effects of noise and modeling errors. Hence, this approach significantly reduces the error associated with solving 2D inverse problems using machine-learning-based algorithms. Significance This work presents radical enhancements in the stability of nonlinear methods for biomedical EIT applications. PMID:29206856
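The post-processing idea in this record can be sketched numerically. The sketch below is not the paper's network: a closed-form ridge-regression map stands in for the ANN, and the synthetic "linear reconstructions" (a scale/offset distortion plus noise) are illustrative assumptions. It only demonstrates that learning a correction from linear solutions toward ground truth reduces reconstruction error.

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, n_pix = 200, 50, 16

# Synthetic ground-truth "images" (flattened) and linear-solver reconstructions
# carrying a systematic scale/offset distortion plus noise (illustrative model).
truth = rng.normal(size=(n_train, n_pix))
linear_recon = 0.8 * truth + 0.1 + 0.05 * rng.normal(size=truth.shape)

# Post-processing: learn a correction from linear solutions toward the truth.
# A closed-form ridge-regression map stands in here for the paper's ANN.
X = np.hstack([linear_recon, np.ones((n_train, 1))])   # bias column
W = np.linalg.solve(X.T @ X + 1e-3 * np.eye(X.shape[1]), X.T @ truth)

# Held-out evaluation: the post-processed error should drop well below
# the raw linear-reconstruction error.
truth_te = rng.normal(size=(n_test, n_pix))
recon_te = 0.8 * truth_te + 0.1 + 0.05 * rng.normal(size=truth_te.shape)
post_te = np.hstack([recon_te, np.ones((n_test, 1))]) @ W

mse_linear = np.mean((recon_te - truth_te) ** 2)
mse_post = np.mean((post_te - truth_te) ** 2)
```

The same two-stage structure (cheap linear solve, then a learned refinement) is what lets the approach absorb noise and modeling error before the nonlinear stage sees the data.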

  3. [Diagnostic Errors in Medicine].

    PubMed

    Buser, Claudia; Bankova, Andriyana

    2015-12-09

The recognition of diagnostic errors in everyday practice can help improve patient safety. The most common diagnostic errors are cognitive errors, followed by system-related errors and no-fault errors. Cognitive errors often result from mental shortcuts known as heuristics. The rate of cognitive errors can be reduced by a better understanding of heuristics and the use of checklists. The autopsy, as a retrospective quality assessment of clinical diagnosis, plays a crucial role in learning from diagnostic errors. Diagnostic errors occur more often in primary care than in hospital settings; on the other hand, inpatient errors are more severe than outpatient errors.

  4. Effects of data selection on the assimilation of AIRS data

    NASA Technical Reports Server (NTRS)

Joiner, Joanna; Brin, E.; Treadon, R.; Derber, J.; VanDelst, P.; DeSilva, A.; Marshall, J. Le; Poli, P.; Atlas, R.; Cruz, C.

    2006-01-01

The Atmospheric InfraRed Sounder (AIRS), flying aboard NASA's Earth Observing System (EOS) Aqua satellite with the Advanced Microwave Sounding Unit-A (AMSU-A), has been providing data for use in numerical weather prediction (NWP) and data assimilation systems (DAS) for over three years. The full AIRS data set is currently not transmitted in near-real-time (NRT) to the NWP centers. Instead, data sets with reduced spatial and spectral information are produced and made available in NRT. In this paper, we evaluate the use of different channel selections and error specifications. We achieved significant positive impact from the Aqua AIRS/AMSU-A combination in both hemispheres during our experimental time period of January 2003. The best results were obtained using a set of 156 channels that did not include any in the 6.7 μm water vapor band. The latter have a large influence on both temperature and humidity analyses. If observation and background errors are not properly specified, the partitioning of temperature and humidity information from these channels will not be correct, and this can lead to a degradation in forecast skill. We found that changing the specified channel errors had a significant effect on the amount of data that entered into the analysis as a result of quality control thresholds that are related to the errors. However, changing the channel errors within a relatively small window did not significantly impact forecast skill with the 155-channel set. We also examined the effects of different types of spatial data reduction on assimilated data sets and NWP forecast skill. Whether we picked the center or the warmest AIRS pixel in a 3x3 array affected the amount of data ingested by the analysis but had a negligible impact on the forecast skill.

  5. Anticipatory synergy adjustments reflect individual performance of feedforward force control.

    PubMed

    Togo, Shunta; Imamizu, Hiroshi

    2016-10-06

We grasp and dexterously manipulate objects through multi-digit synergies. In the framework of the uncontrolled manifold (UCM) hypothesis, a multi-digit synergy is defined as a coordinated control mechanism of the fingers that stabilizes variables important for task success, e.g., total force. Previous studies reported anticipatory synergy adjustments (ASAs), which correspond to a drop of the synergy index before a quick change of the total force. The present study compared the properties of ASAs with individual performance of feedforward force control to investigate the relationship between them. Subjects performed a total finger force production task that consisted of a phase in which they tracked a target line with visual information and a phase in which they produced a total force pulse without visual information. We quantified their multi-digit synergy through UCM analysis and observed significant ASAs before production of the total force pulse. The time of ASA initiation and the magnitude of the drop of the synergy index were significantly correlated with the error of the force pulse, but not with the tracking error. Almost all subjects showed a significant increase of the variance that affected the total force. Our study directly showed that ASAs reflect individual performance of feedforward force control independently of target-tracking performance, and suggests that the multi-digit synergy was weakened to adjust the multi-digit movements based on a prediction error so as to reduce the future error. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
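The UCM variance partition behind the synergy index can be sketched as follows, assuming a simple total-force task with equal finger weighting; the simulated forces and the per-dimension normalization are illustrative, not the study's data or exact formula.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_fingers = 100, 4

# Simulated per-finger forces: fingers covary negatively (row-centered noise),
# so trial-to-trial variability barely changes the total force -- a "good" synergy.
base = rng.normal(size=(n_trials, n_fingers))
forces = base - base.mean(axis=1, keepdims=True) + 0.1 * rng.normal(size=(n_trials, 1))

demeaned = forces - forces.mean(axis=0)
ort = np.ones(n_fingers) / np.sqrt(n_fingers)   # direction that changes total force
proj = demeaned @ ort
v_ort = np.mean(proj ** 2)                      # variance affecting total force
v_tot = np.mean(np.sum(demeaned ** 2, axis=1))  # total variance across fingers
v_ucm = v_tot - v_ort                           # variance leaving total force unchanged

# Synergy index (per-dimension normalization): positive when most
# variability lies within the UCM, i.e. the total force is stabilized.
dv = (v_ucm / (n_fingers - 1) - v_ort) / (v_tot / n_fingers)
```

An ASA would appear in such an analysis as a drop of `dv` (variance migrating from the UCM into the force-changing direction) shortly before the force pulse.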

  6. Demonstration of spectral calibration for stellar interferometry

    NASA Technical Reports Server (NTRS)

    Demers, Richard T.; An, Xin; Tang, Hong; Rud, Mayer; Wayne, Leonard; Kissil, Andrew; Kwack, Eug-Yun

    2006-01-01

A breadboard is under development to demonstrate the calibration of spectral errors in microarcsecond stellar interferometers. Analysis shows that thermally and mechanically stable hardware, in addition to careful optical design, can reduce the wavelength-dependent error to tens of nanometers. Calibration of the hardware can further reduce the error to the level of picometers. The results of thermal, mechanical, and optical analysis supporting the breadboard design will be shown.

  7. A revised radiation package of G-packed McICA and two-stream approximation: Performance evaluation in a global weather forecasting model

    NASA Astrophysics Data System (ADS)

    Baek, Sunghye

    2017-07-01

For more efficient and accurate computation of radiative flux, improvements have been achieved in two aspects: the integration of the radiative transfer equation over space and over angle. First, the treatment of the Monte Carlo independent column approximation (McICA) is modified with a focus on efficiency, using a reduced number of random samples ("G-packed") within a reconstructed and unified radiation package. The original McICA takes 20% of the CPU time of radiation in the Global/Regional Integrated Model system (GRIMs). The CPU time consumption of McICA is reduced by 70% without compromising accuracy. Second, parameterizations of shortwave two-stream approximations are revised to reduce errors with respect to the 16-stream discrete ordinate method. The delta-scaled two-stream approximation (TSA) is almost universally used in global circulation models (GCMs) but contains systematic errors that overestimate forward-peak scattering as solar elevation decreases. These errors are alleviated by adjusting the parameterizations for each scattering element: aerosol, liquid, ice, and snow cloud particles. Parameterizations are determined with 20,129 atmospheric columns of GRIMs data and tested with 13,422 independent data columns. The result shows that the root-mean-square error (RMSE) over all atmospheric layers is decreased by 39% on average without a significant increase in computational time. The revised TSA, developed and validated with a separate one-dimensional model, is implemented in GRIMs for mid-term numerical weather forecasting. Monthly averaged global forecast skill scores are unchanged with the revised TSA, but the temperature at lower levels of the atmosphere (pressure ≥ 700 hPa) is slightly increased (< 0.5 K) with the corrected atmospheric absorption.
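The delta scaling referred to here can be written out explicitly. The relations below are the standard delta-scaling (delta-Eddington) transformation with forward-peak fraction f = g²; whether the paper uses exactly this choice of f is not stated in the abstract, and its revised per-species parameterizations are not reproduced.

```python
def delta_scale(tau, omega, g):
    """Standard delta scaling of optical depth (tau), single-scattering
    albedo (omega), and asymmetry parameter (g), with forward-peak
    fraction f = g**2 (the delta-Eddington choice)."""
    f = g * g
    tau_d = (1.0 - omega * f) * tau
    omega_d = (1.0 - f) * omega / (1.0 - omega * f)
    g_d = (g - f) / (1.0 - f)           # algebraically equals g / (1 + g)
    return tau_d, omega_d, g_d

# Example: a moderately thick, strongly forward-scattering cloud layer.
tau_d, omega_d, g_d = delta_scale(tau=1.0, omega=0.9, g=0.85)
```

The scaling folds the sharp forward-scattering peak into unscattered transmission, which is why an uncorrected delta-scaled TSA tends to overestimate forward-peak scattering at low sun.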

  8. Reducing medication errors in critical care: a multimodal approach

    PubMed Central

    Kruer, Rachel M; Jarrell, Andrew S; Latif, Asad

    2014-01-01

    The Institute of Medicine has reported that medication errors are the single most common type of error in health care, representing 19% of all adverse events, while accounting for over 7,000 deaths annually. The frequency of medication errors in adult intensive care units can be as high as 947 per 1,000 patient-days, with a median of 105.9 per 1,000 patient-days. The formulation of drugs is a potential contributor to medication errors. Challenges related to drug formulation are specific to the various routes of medication administration, though errors associated with medication appearance and labeling occur among all drug formulations and routes of administration. Addressing these multifaceted challenges requires a multimodal approach. Changes in technology, training, systems, and safety culture are all strategies to potentially reduce medication errors related to drug formulation in the intensive care unit. PMID:25210478

  9. The effect of covariate mean differences on the standard error and confidence interval for the comparison of treatment means.

    PubMed

    Liu, Xiaofeng Steven

    2011-05-01

The use of covariates is commonly believed to reduce the unexplained error variance and the standard error for the comparison of treatment means, but the reduction in the standard error is neither guaranteed nor uniform over different sample sizes. The covariate mean differences between the treatment conditions can inflate the standard error of the covariate-adjusted mean difference and can actually produce a larger standard error for the adjusted mean difference than for the unadjusted mean difference. When the covariate observations are conceived of as randomly varying from one study to another, the covariate mean differences can be related to a Hotelling's T² statistic. Using this Hotelling's T² statistic, one can always find a minimum sample size to achieve a high probability of reducing the standard error and confidence interval width for the adjusted mean difference. ©2010 The British Psychological Society.
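A minimal sketch of the two-sample Hotelling's T² computation for covariate mean differences between treatment conditions; the group sizes, number of covariates, and effect size below are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(2)
n1, n2, p = 30, 30, 2   # illustrative group sizes and number of covariates

# Covariate observations for two treatment conditions with shifted means.
x1 = rng.normal(size=(n1, p))
x2 = rng.normal(loc=0.5, size=(n2, p))

d = x1.mean(axis=0) - x2.mean(axis=0)      # covariate mean difference
s_pooled = ((n1 - 1) * np.cov(x1, rowvar=False)
            + (n2 - 1) * np.cov(x2, rowvar=False)) / (n1 + n2 - 2)

# Two-sample Hotelling's T^2 for the covariate mean differences.
t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(s_pooled, d)
```

Large values of this statistic flag exactly the situation the abstract warns about: covariate imbalance that can inflate, rather than shrink, the standard error of the adjusted mean difference.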

  10. Improvement of tsunami detection in timeseries data of GPS buoys with the Continuous Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Chida, Y.; Takagawa, T.

    2017-12-01

The observation data of GPS buoys installed offshore of Japan are used for monitoring not only waves but also tsunamis. The real-time data were successfully used to upgrade the tsunami warnings just after the 2011 Tohoku earthquake. Huge tsunamis can be detected easily because the signal-to-noise ratio is high enough, but moderate tsunamis cannot. GPS data sometimes include error waveforms that resemble tsunamis, because positioning accuracy changes with the number and positions of GPS satellites. Distinguishing true tsunami waveforms from pseudo-tsunami waveforms is therefore important for tsunami detection. In this research, a method was developed to reduce misdetections of tsunamis in the observation data of GPS buoys and to increase the efficiency of tsunami detection. First, the error waveforms were extracted using the indexes of position dilution of precision, reliability of GPS satellite positioning, and the number of satellites used for the calculation. Then, the output from this procedure was analyzed with the Continuous Wavelet Transform (CWT) to characterize the time-frequency behavior of error waveforms and real tsunami waveforms. We found that error waveforms tended to appear when the accuracy of GPS buoy positioning was low. By extracting these waveforms, it was possible to remove about 43% of the error waveforms without reducing the tsunami detection rate. Moreover, we found that the power spectral amplitudes of the error waveforms and real tsunamis were similar in the long-period component (4-65 minutes); on the other hand, the short-period component (< 1 minute) of the error waveforms was significantly larger than that of the real tsunami waveforms. By thresholding the short-period component, further extraction of error waveforms became possible without a significant reduction of the tsunami detection rate.
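The short-period versus long-period discrimination described here can be illustrated with a toy continuous wavelet transform. The Morlet implementation, scales, and synthetic signals below are assumptions for demonstration, not the study's actual processing chain.

```python
import numpy as np

def morlet_power(x, scale, w0=6.0):
    """Mean power of x convolved with a complex Morlet wavelet at one scale
    (scale given in samples)."""
    t = np.arange(-4 * scale, 4 * scale + 1)
    wavelet = np.exp(1j * w0 * t / scale) * np.exp(-t**2 / (2.0 * scale**2))
    wavelet /= np.sqrt(scale)
    coef = np.convolve(x, wavelet, mode="same")
    return np.mean(np.abs(coef) ** 2)

dt = 6.0                          # sampling interval in seconds (assumed)
t = np.arange(0.0, 7200.0, dt)    # two hours of buoy displacement

# Long-period "tsunami-like" wave (20-minute period) vs. a short spurious
# step of the kind produced by degraded satellite positioning.
tsunami = np.sin(2.0 * np.pi * t / 1200.0)
glitch = np.zeros_like(t)
glitch[290:310] = 1.0             # ~2-minute pseudo-tsunami step

short_scale, long_scale = 5, 100  # roughly sub-minute vs. ~10-minute scales
ratio_tsunami = morlet_power(tsunami, short_scale) / morlet_power(tsunami, long_scale)
ratio_glitch = morlet_power(glitch, short_scale) / morlet_power(glitch, long_scale)
```

The ratio of short-scale to long-scale power is far larger for the glitch than for the smooth long-period wave, which is the property the thresholding step exploits.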

  11. Discrepancies in medication entries between anesthetic and pharmacy records using electronic databases.

    PubMed

    Vigoda, Michael M; Gencorelli, Frank J; Lubarsky, David A

    2007-10-01

Accurate recording of the disposition of controlled substances is required by regulatory agencies. Linking anesthesia information management systems (AIMS) with medication dispensing systems may facilitate automated reconciliation of medication discrepancies. In this retrospective investigation at a large academic hospital, we reviewed 11,603 cases (spanning an 8-month period), comparing records of medications (i.e., narcotics, benzodiazepines, ketamine, and thiopental) recorded as removed from our automated medication dispensing system with medications recorded as administered in our AIMS. In 15% of cases, we found discrepancies between dispensed and administered medications. Discrepancies occurred in both the AIMS (8% of cases) and the medication dispensing system (10% of cases). Although there were many different types of user errors, nearly 75% of them resulted from either an error in the amount of drug waste documented in the medication dispensing system (35%) or an error in documenting the medication in the AIMS (40%). A significant percentage of cases contained data entry errors in both the automated dispensing system and the AIMS. This error rate limits the current practicality of automating the necessary reconciliation. An electronic interface between an AIMS and a medication dispensing system could alert users to medication entry errors prior to finalizing a case, thus reducing the time (and cost) of reconciling discrepancies.

  12. Predicting and interpreting identification errors in military vehicle training using multidimensional scaling.

    PubMed

    Bohil, Corey J; Higgins, Nicholas A; Keebler, Joseph R

    2014-01-01

    We compared methods for predicting and understanding the source of confusion errors during military vehicle identification training. Participants completed training to identify main battle tanks. They also completed card-sorting and similarity-rating tasks to express their mental representation of resemblance across the set of training items. We expected participants to selectively attend to a subset of vehicle features during these tasks, and we hypothesised that we could predict identification confusion errors based on the outcomes of the card-sort and similarity-rating tasks. Based on card-sorting results, we were able to predict about 45% of observed identification confusions. Based on multidimensional scaling of the similarity-rating data, we could predict more than 80% of identification confusions. These methods also enabled us to infer the dimensions receiving significant attention from each participant. This understanding of mental representation may be crucial in creating personalised training that directs attention to features that are critical for accurate identification. Participants completed military vehicle identification training and testing, along with card-sorting and similarity-rating tasks. The data enabled us to predict up to 84% of identification confusion errors and to understand the mental representation underlying these errors. These methods have potential to improve training and reduce identification errors leading to fratricide.
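Classical (Torgerson) multidimensional scaling of a dissimilarity matrix, of the kind used to embed the similarity ratings, can be sketched as follows; the five-point "vehicle" layout is hypothetical, and nearest neighbors in the recovered space stand in for predicted confusions.

```python
import numpy as np

def classical_mds(dist, k=2):
    """Torgerson classical MDS: embed points from a pairwise-distance matrix."""
    n = dist.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    b = -0.5 * j @ (dist ** 2) @ j            # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    top = np.argsort(vals)[::-1][:k]          # largest eigenvalues first
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))

# Hypothetical perceived-similarity layout of five vehicles in 2D:
# items 3 and 4 nearly coincide, so they should be mutually confusable.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.2], [3.0, 3.0], [3.1, 2.9]])
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

emb = classical_mds(dist)

# Predicted confusions: each item's nearest neighbor in the recovered space.
emb_dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
np.fill_diagonal(emb_dist, np.inf)
nearest = emb_dist.argmin(axis=1)
```

Inspecting which axes separate the items in `emb` is the analogue of inferring which vehicle features received attention from each participant.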

  13. Suppression (but Not Reappraisal) Impairs Subsequent Error Detection: An ERP Study of Emotion Regulation's Resource-Depleting Effect

    PubMed Central

    Wang, Yan; Yang, Lixia; Wang, Yan

    2014-01-01

Past event-related potential (ERP) research shows that, after exerting effortful emotion inhibition, the neural correlates of performance monitoring (e.g., the error-related negativity) are weakened. An undetermined issue is whether all forms of emotion regulation uniformly impair later performance monitoring. The present study compared the cognitive consequences of two emotion regulation strategies, namely suppression and reappraisal. Participants were instructed to suppress their emotions while watching a sad movie, to adopt a neutral and objective attitude toward the movie, or to just watch the movie carefully. Then, after a mood scale, all participants completed an ostensibly unrelated Stroop task, during which ERPs (i.e., the error-related negativity (ERN), post-error positivity (Pe) and N450) were obtained. The reappraisal group successfully decreased their sad emotion relative to the other two groups. Compared with participants in the control group and the reappraisal group, those who suppressed their emotions during the sad movie showed a reduced ERN after error commission. Participants in the suppression group also made more errors in incongruent Stroop trials than the other two groups. There were no significant main effects or interactions of group for reaction time, Pe, and N450. Results suggest that reappraisal is both more effective and less resource-depleting than suppression. PMID:24777113

  14. Performance Improvement of Receivers Based on Ultra-Tight Integration in GNSS-Challenged Environments

    PubMed Central

    Qin, Feng; Zhan, Xingqun; Du, Gang

    2013-01-01

Ultra-tight integration was first proposed by Abbott in 2003 with the purpose of integrating a global navigation satellite system (GNSS) and an inertial navigation system (INS). This technology can improve the tracking performance of a receiver by reconfiguring the tracking loops in GNSS-challenged environments. In this paper, models of all error sources known to date in the phase-locked loops (PLLs) of a standard receiver and an ultra-tightly integrated GNSS/INS receiver are built, respectively. Based on these models, the tracking performances of the two receivers are compared to verify the improvement due to ultra-tight integration. Meanwhile, the PLL error distributions of the two receivers are also depicted to analyze the error changes in the tracking loops. These results show that the tracking error is significantly reduced in the ultra-tightly integrated GNSS/INS receiver, since the receiver's dynamics are estimated and compensated by an INS. Moreover, the mathematical relationship between the tracking performance of the ultra-tightly integrated GNSS/INS receiver and the quality of the selected inertial measurement unit (IMU) is derived from the error models and proved by the error comparisons of four ultra-tightly integrated GNSS/INS receivers aided by different-grade IMUs.

  15. Suppression (but not reappraisal) impairs subsequent error detection: an ERP study of emotion regulation's resource-depleting effect.

    PubMed

    Wang, Yan; Yang, Lixia; Wang, Yan

    2014-01-01

Past event-related potential (ERP) research shows that, after exerting effortful emotion inhibition, the neural correlates of performance monitoring (e.g., the error-related negativity) are weakened. An undetermined issue is whether all forms of emotion regulation uniformly impair later performance monitoring. The present study compared the cognitive consequences of two emotion regulation strategies, namely suppression and reappraisal. Participants were instructed to suppress their emotions while watching a sad movie, to adopt a neutral and objective attitude toward the movie, or to just watch the movie carefully. Then, after a mood scale, all participants completed an ostensibly unrelated Stroop task, during which ERPs (i.e., the error-related negativity (ERN), post-error positivity (Pe) and N450) were obtained. The reappraisal group successfully decreased their sad emotion relative to the other two groups. Compared with participants in the control group and the reappraisal group, those who suppressed their emotions during the sad movie showed a reduced ERN after error commission. Participants in the suppression group also made more errors in incongruent Stroop trials than the other two groups. There were no significant main effects or interactions of group for reaction time, Pe, and N450. Results suggest that reappraisal is both more effective and less resource-depleting than suppression.

  16. Self-reported medical, medication and laboratory error in eight countries: risk factors for chronically ill adults.

    PubMed

    Scobie, Andrea

    2011-04-01

To identify risk factors associated with self-reported medical, medication and laboratory error in eight countries. The data source was the Commonwealth Fund's 2008 International Health Policy Survey of chronically ill patients in eight countries. A multi-country telephone survey was conducted between 3 March and 30 May 2008 with patients in Australia, Canada, France, Germany, the Netherlands, New Zealand, the UK and the USA who self-reported being chronically ill. A bivariate analysis was performed to determine significant explanatory variables of medical, medication and laboratory error (P < 0.01) for inclusion in a binary logistic regression model. The final regression model included eight risk factors for self-reported error: age 65 and under, education level of some college or less, presence of two or more chronic conditions, high prescription drug use (four or more drugs), four or more doctors seen within 2 years, a care coordination problem, poor doctor-patient communication, and use of an emergency department. Risk factors with the greatest ability to predict experiencing an error encompassed issues with coordination of care and provider knowledge of a patient's medical history. The identification of these risk factors could help policymakers and organizations to proactively reduce the likelihood of error through greater examination of system- and organization-level practices.

  17. Cost-Effectiveness Analysis of an Automated Medication System Implemented in a Danish Hospital Setting.

    PubMed

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    To evaluate the cost-effectiveness of an automated medication system (AMS) implemented in a Danish hospital setting. An economic evaluation was performed alongside a controlled before-and-after effectiveness study with one control ward and one intervention ward. The primary outcome measure was the number of errors in the medication administration process observed prospectively before and after implementation. To determine the difference in proportion of errors after implementation of the AMS, logistic regression was applied with the presence of error(s) as the dependent variable. Time, group, and interaction between time and group were the independent variables. The cost analysis used the hospital perspective with a short-term incremental costing approach. The total 6-month costs with and without the AMS were calculated as well as the incremental costs. The number of avoided administration errors was related to the incremental costs to obtain the cost-effectiveness ratio expressed as the cost per avoided administration error. The AMS resulted in a statistically significant reduction in the proportion of errors in the intervention ward compared with the control ward. The cost analysis showed that the AMS increased the ward's 6-month cost by €16,843. The cost-effectiveness ratio was estimated at €2.01 per avoided administration error, €2.91 per avoided procedural error, and €19.38 per avoided clinical error. The AMS was effective in reducing errors in the medication administration process at a higher overall cost. The cost-effectiveness analysis showed that the AMS was associated with affordable cost-effectiveness rates. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  18. Effect of Numerical Error on Gravity Field Estimation for GRACE and Future Gravity Missions

    NASA Astrophysics Data System (ADS)

    McCullough, Christopher; Bettadpur, Srinivas

    2015-04-01

    In recent decades, gravity field determination from low Earth orbiting satellites, such as the Gravity Recovery and Climate Experiment (GRACE), has become increasingly more effective due to the incorporation of high accuracy measurement devices. Since instrumentation quality will only increase in the near future and the gravity field determination process is computationally and numerically intensive, numerical error from the use of double precision arithmetic will eventually become a prominent error source. While using double-extended or quadruple precision arithmetic will reduce these errors, the numerical limitations of current orbit determination algorithms and processes must be accurately identified and quantified in order to adequately inform the science data processing techniques of future gravity missions. The most obvious numerical limitation in the orbit determination process is evident in the comparison of measured observables with computed values, derived from mathematical models relating the satellites' numerically integrated state to the observable. Significant error in the computed trajectory will corrupt this comparison and induce error in the least squares solution of the gravitational field. In addition, errors in the numerically computed trajectory propagate into the evaluation of the mathematical measurement model's partial derivatives. These errors amalgamate in turn with numerical error from the computation of the state transition matrix, computed using the variational equations of motion, in the least squares mapping matrix. Finally, the solution of the linearized least squares system, computed using a QR factorization, is also susceptible to numerical error. Certain interesting combinations of each of these numerical errors are examined in the framework of GRACE gravity field determination to analyze and quantify their effects on gravity field recovery.
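Two of the numerical effects discussed here are easy to demonstrate: absorption of small increments in single precision, and the loss of accuracy when a least squares problem is solved via normal equations rather than an orthogonal (QR/SVD) factorization. The ill-conditioned design matrix below is an illustrative stand-in, not GRACE data.

```python
import numpy as np

# Single precision: near 1e8 the spacing between representable float32
# values is 8, so adding 1.0 is absorbed entirely; float64 keeps it.
lost = (np.float32(1e8) + np.float32(1.0)) - np.float32(1e8)
kept = (np.float64(1e8) + np.float64(1.0)) - np.float64(1e8)

# Least squares: forming normal equations squares the condition number,
# while an orthogonal (QR/SVD) factorization avoids that squaring.
t = np.linspace(0.0, 1.0, 50)
A = np.vander(t, 10, increasing=True)        # ill-conditioned design matrix
x_true = np.ones(10)
b = A @ x_true

x_qr, *_ = np.linalg.lstsq(A, b, rcond=None)    # orthogonal-factorization route
x_ne = np.linalg.solve(A.T @ A, A.T @ b)        # normal-equations route

err_qr = np.linalg.norm(x_qr - x_true)
err_ne = np.linalg.norm(x_ne - x_true)
```

The same conditioning argument is why the choice of factorization in the linearized least squares solve matters once instrument noise drops toward the double-precision floor.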

  19. A hybrid solution using computational prediction and measured data to accurately determine process corrections with reduced overlay sampling

    NASA Astrophysics Data System (ADS)

    Noyes, Ben F.; Mokaberi, Babak; Mandoy, Ram; Pate, Alex; Huijgen, Ralph; McBurney, Mike; Chen, Owen

    2017-03-01

Reducing overlay error via an accurate APC feedback system is one of the main challenges in high volume production of the current and future nodes in the semiconductor industry. The overlay feedback system directly affects the number of dies meeting overlay specification and the number of layers requiring dedicated exposure tools through the fabrication flow. Increasing the former number and reducing the latter number is beneficial for the overall efficiency and yield of the fabrication process. An overlay feedback system requires accurate determination of the overlay error, or fingerprint, on exposed wafers in order to determine corrections to be automatically and dynamically applied to the exposure of future wafers. Since current and future nodes require correction per exposure (CPE), the resolution of the overlay fingerprint must be high enough to accommodate CPE in the overlay feedback system, or overlay control module (OCM). Determining a high-resolution fingerprint from measured data requires extremely dense overlay sampling that takes a significant amount of measurement time. For static corrections this is acceptable, but in an automated dynamic correction system this method creates extreme bottlenecks for the throughput of the system, as new lots have to wait until the previous lot is measured. One solution is to use a less dense overlay sampling scheme and computationally up-sample the measured data to a dense fingerprint. That method uses a global fingerprint model over the entire wafer; measured localized overlay errors are therefore not always represented in its up-sampled output. This paper will discuss a hybrid system, shown in Fig. 1, that combines a computationally up-sampled fingerprint with the measured data to more accurately capture the actual fingerprint, including local overlay errors. Such a hybrid system is shown to result in reduced modelled residuals while determining the fingerprint, and in better on-product overlay performance.
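The hybrid idea, a global model plus interpolated measured residuals, can be sketched in one dimension. The smooth-plus-local-bump "fingerprint" and the polynomial global model below are illustrative assumptions, not the paper's actual wafer model.

```python
import numpy as np

# Dense "true" overlay fingerprint along one wafer axis: a smooth global
# term plus a localized error that a global model cannot represent.
x_dense = np.linspace(-1.0, 1.0, 200)
true_fp = 0.5 * x_dense**2 - 0.2 * x_dense + np.exp(-((x_dense - 0.3) / 0.08) ** 2)

# Sparse overlay sampling: every 10th site.
idx = np.arange(0, 200, 10)
x_meas, y_meas = x_dense[idx], true_fp[idx]

# Global model: a low-order polynomial fit, evaluated densely (up-sampling).
coef = np.polyfit(x_meas, y_meas, 2)
model_dense = np.polyval(coef, x_dense)

# Hybrid: add the interpolated measured residuals back onto the global model,
# so the local error survives in the up-sampled fingerprint.
resid = y_meas - np.polyval(coef, x_meas)
hybrid_dense = model_dense + np.interp(x_dense, x_meas, resid)

err_model = np.sqrt(np.mean((model_dense - true_fp) ** 2))
err_hybrid = np.sqrt(np.mean((hybrid_dense - true_fp) ** 2))
```

The residual term is what lets the hybrid fingerprint retain local overlay errors that the global model alone averages away.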

  20. The effect of keyboard key spacing on typing speed, error, usability, and biomechanics, Part 2: Vertical spacing.

    PubMed

    Pereira, Anna; Hsieh, Chih-Ming; Laroche, Charles; Rempel, David

    2014-06-01

The objective was to evaluate the effects of vertical key spacing on a conventional computer keyboard on typing speed, percentage error, usability, forearm muscle activity, and wrist posture for both females with small fingers and males with large fingers. Part I evaluated primarily horizontal key spacing and found that for male typists with large fingers, productivity and usability were similar for spacings of 17, 18, and 19 mm but were reduced for spacings of 16 mm. Few other key spacing studies are available, and the international standards that specify the spacing between keys on a keyboard have been mainly guided by design convention. Experienced female typists (n = 26) with small fingers (middle finger length ≤ 7.71 cm or finger breadth ≤ 1.93 cm) and male typists (n = 26) with large fingers (middle finger length ≥ 8.37 cm or finger breadth ≥ 2.24 cm) typed on five keyboards that differed primarily in vertical key spacing (17 x 18, 17 x 17, 17 x 16, 17 x 15.5, and 18 x 16 mm) while typing speed, error, fatigue, preference, forearm muscle activity, and wrist posture were recorded. Productivity and usability ratings were significantly worse for the keyboard with 15.5 mm vertical spacing compared to the other keyboards for both groups. There were few significant differences in usability ratings between the other keyboards. Reducing vertical key spacing, from 18 to 17 to 16 mm, had no significant effect on productivity or usability. The findings support the design of keyboards with vertical key spacings of 16, 17, or 18 mm. These findings may influence keyboard design and standards.

  1. Effect of audio instruction on tracking errors using a four-dimensional image-guided radiotherapy system.

    PubMed

    Nakamura, Mitsuhiro; Sawada, Akira; Mukumoto, Nobutaka; Takahashi, Kunio; Mizowaki, Takashi; Kokubo, Masaki; Hiraoka, Masahiro

    2013-09-06

The Vero4DRT (MHI-TM2000) is capable of performing X-ray image-based tracking (X-ray Tracking) that directly tracks the target or fiducial markers under continuous kV X-ray imaging. Previously, we have shown that irregular respiratory patterns increased X-ray Tracking errors. Thus, we assumed that audio instruction, which generally improves the periodicity of respiration, should reduce tracking errors. The purpose of this study was to assess the effect of audio instruction on X-ray Tracking errors. Anterior-posterior abdominal skin-surface displacements obtained from ten lung cancer patients under free breathing and simple audio instruction were used as an alternative to tumor motion in the superior-inferior direction. First, a sequential predictive model based on the Levinson-Durbin algorithm was created to estimate the future three-dimensional (3D) target position under continuous kV X-ray imaging while moving a steel ball target of 9.5 mm in diameter. After creating the predictive model, the future 3D target position was sequentially calculated from the current and past 3D target positions based on the predictive model every 70 ms under continuous kV X-ray imaging. Simultaneously, the system controller of the Vero4DRT calculated the corresponding pan and tilt rotational angles of the gimbaled X-ray head, which then adjusted its orientation to the target. The calculated and current rotational angles of the gimbaled X-ray head were recorded every 5 ms. The target position measured by the laser displacement gauge was synchronously recorded every 10 ms. Total tracking system errors (ET) were compared between free breathing and audio instruction. Audio instruction significantly improved breathing regularity (p < 0.01). The mean ± standard deviation of the 95th percentile of ET (E95T) was 1.7 ± 0.5 mm (range: 1.1-2.6 mm) under free breathing (E95T,FB) and 1.9 ± 0.5 mm (range: 1.2-2.7 mm) under audio instruction (E95T,AI).
E95T,AI was larger than E95T,FB for five patients; no significant difference was found between E95T,FB and E95T,AI (p = 0.21). Correlation analysis revealed that the rapid respiratory velocity significantly increased E95T. Although audio instruction improved breathing regularity, it also increased the respiratory velocity, which did not necessarily reduce tracking errors.
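
    The predictive step above can be illustrated with a minimal sketch: fit autoregressive prediction coefficients to a respiratory trace with the Levinson-Durbin recursion and extrapolate one step ahead. This is only a schematic of the published approach; the model order, the synthetic cosine trace, and the sampling interval below are illustrative assumptions, not the Vero4DRT implementation.

    ```python
    import numpy as np

    def levinson_durbin(r, order):
        """Solve the Yule-Walker equations for autoregressive prediction
        coefficients from the autocorrelation sequence r, using the
        Levinson-Durbin recursion."""
        a = np.zeros(order + 1)
        a[0] = 1.0
        err = r[0]                        # zero-lag power
        for i in range(1, order + 1):
            acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
            k = -acc / err                # reflection coefficient
            a[1:i] += k * a[i - 1:0:-1]   # update earlier coefficients
            a[i] = k
            err *= 1.0 - k * k            # shrink prediction-error power
        return a

    def predict_next(x, a):
        """One-step-ahead prediction x_hat[n] = -sum_j a[j] * x[n - j]."""
        p = len(a) - 1
        return -np.dot(a[1:], x[-1:-p - 1:-1])

    # Illustrative synthetic breathing trace sampled every 70 ms (assumed values)
    t = np.arange(200) * 0.07
    x = np.cos(2 * np.pi * t / 4.0)       # ~4 s respiratory period
    r = np.correlate(x, x, mode='full')[len(x) - 1:len(x) + 2] / len(x)
    a = levinson_durbin(r, order=2)
    x_hat = predict_next(x, a)            # predicted next target position
    ```

    In the real system this prediction runs sequentially, with the model refreshed as new imaged positions arrive.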

  2. Effect of audio instruction on tracking errors using a four‐dimensional image‐guided radiotherapy system

    PubMed Central

    Sawada, Akira; Mukumoto, Nobutaka; Takahashi, Kunio; Mizowaki, Takashi; Kokubo, Masaki; Hiraoka, Masahiro

    2013-01-01

    The Vero4DRT (MHI‐TM2000) is capable of performing X‐ray image‐based tracking (X‐ray Tracking) that directly tracks the target or fiducial markers under continuous kV X‐ray imaging. Previously, we showed that irregular respiratory patterns increased X‐ray Tracking errors. Thus, we assumed that audio instruction, which generally improves the periodicity of respiration, should reduce tracking errors. The purpose of this study was to assess the effect of audio instruction on X‐ray Tracking errors. Anterior‐posterior abdominal skin‐surface displacements obtained from ten lung cancer patients under free breathing and simple audio instruction were used as a surrogate for tumor motion in the superior‐inferior direction. First, a sequential predictive model based on the Levinson‐Durbin algorithm was created to estimate the future three‐dimensional (3D) target position under continuous kV X‐ray imaging while moving a steel ball target of 9.5 mm in diameter. After creating the predictive model, the future 3D target position was sequentially calculated from the current and past 3D target positions every 70 ms under continuous kV X‐ray imaging. Simultaneously, the system controller of the Vero4DRT calculated the corresponding pan and tilt rotational angles of the gimbaled X‐ray head, which then adjusted its orientation to the target. The calculated and current rotational angles of the gimbaled X‐ray head were recorded every 5 ms. The target position measured by the laser displacement gauge was synchronously recorded every 10 ms. Total tracking system errors (ET) were compared between free breathing and audio instruction. Audio instruction significantly improved breathing regularity (p < 0.01). The mean ± standard deviation of the 95th percentile of ET (E95T) was 1.7 ± 0.5 mm (range: 1.1–2.6 mm) under free breathing (E95T,FB) and 1.9 ± 0.5 mm (range: 1.2–2.7 mm) under audio instruction (E95T,AI). E95T,AI was larger than E95T,FB for five patients; no significant difference was found between E95T,FB and E95T,AI (p = 0.21). Correlation analysis revealed that rapid respiratory velocity significantly increased E95T. Although audio instruction improved breathing regularity, it also increased the respiratory velocity, which did not necessarily reduce tracking errors. PACS numbers: 87.55.ne, 87.57.N‐, 87.59.C‐. PMID: 24036880

  3. Teaching Cancer Patients the Value of Correct Positioning During Radiotherapy Using Visual Aids and Practical Exercises.

    PubMed

    Hansen, Helle; Nielsen, Berit Kjærside; Boejen, Annette; Vestergaard, Anne

    2018-06-01

    The aim of this study was to investigate if teaching patients about positioning before radiotherapy treatment would (a) reduce the residual rotational set-up errors, (b) reduce the number of repositionings and (c) improve patients' sense of control by increasing self-efficacy and reducing distress. Patients were randomized to either standard care (control group) or standard care and a teaching session combining visual aids and practical exercises (intervention group). Daily images from the treatment sessions were evaluated off-line. Both groups filled in a questionnaire before and at the end of the treatment course on various aspects of cooperation with the staff regarding positioning. Comparisons of residual rotational set-up errors showed an improvement in the intervention group compared to the control group. No significant differences were found in number of repositionings, self-efficacy or distress. Results show that it is possible to teach patients about positioning and thereby improve precision in positioning. Teaching patients about positioning did not seem to affect self-efficacy or distress scores at baseline and at the end of the treatment course.

  4. Perinatal choline supplementation attenuates behavioral alterations associated with neonatal alcohol exposure in rats.

    PubMed

    Thomas, Jennifer D; Garrison, Megan; O'Neill, Teresa M

    2004-01-01

    Children exposed to alcohol prenatally suffer from a variety of behavioral alterations, including hyperactivity and learning deficits. Given that women continue to drink alcohol during pregnancy, it is critical that effective interventions and treatments be identified. Previously, we reported that early postnatal choline supplementation can reduce the severity of learning deficits in rats exposed to alcohol prenatally. The present study examined whether choline supplementation can reduce the severity of behavioral alterations associated with alcohol exposure during the third trimester equivalent brain growth spurt. Male neonatal rats were assigned to one of three treatment groups. One group was exposed to alcohol (6.6 g/kg/day) from postnatal days (PD) 4-9 via an artificial rearing procedure. Artificially reared and normally reared control groups were included. One half of subjects from each treatment received daily subcutaneous injections of a choline chloride solution from PD 4-30, whereas the other half received saline vehicle injections. On PD 31-34, after choline treatment was complete, activity level was monitored and, on PD 40-42, subjects were tested on a serial spatial discrimination reversal learning task. Subjects exposed to alcohol were significantly hyperactive compared to controls. The severity of ethanol-induced hyperactivity was attenuated with choline treatment. In addition, subjects exposed to ethanol during the neonatal period committed a significantly greater number of perseverative-type errors on the reversal learning task compared to controls. Exposure to choline significantly reduced the number of ethanol-related errors. Importantly, these behavioral changes were not due to the acute effects of choline, but were related to long-lasting organizational effects of early choline supplementation. These data suggest that early dietary interventions may reduce the severity of fetal alcohol effects.

  5. Identifying and reducing error in cluster-expansion approximations of protein energies.

    PubMed

    Hahn, Seungsoo; Ashenberg, Orr; Grigoryan, Gevorg; Keating, Amy E

    2010-12-01

    Protein design involves searching a vast space for sequences that are compatible with a defined structure. This can pose significant computational challenges. Cluster expansion is a technique that can accelerate the evaluation of protein energies by generating a simple functional relationship between sequence and energy. The method consists of several steps. First, for a given protein structure, a training set of sequences with known energies is generated. Next, this training set is used to expand energy as a function of clusters consisting of single residues, residue pairs, and higher order terms, if required. The accuracy of the sequence-based expansion is monitored and improved using cross-validation testing and iterative inclusion of additional clusters. As a trade-off for evaluation speed, the cluster-expansion approximation causes prediction errors, which can be reduced by including more training sequences, including higher order terms in the expansion, and/or reducing the sequence space described by the cluster expansion. This article analyzes the sources of error and introduces a method whereby accuracy can be improved by judiciously reducing the described sequence space. The method is applied to describe the sequence-stability relationship for several protein structures: coiled-coil dimers and trimers, a PDZ domain, and T4 lysozyme as examples with computationally derived energies, and SH3 domains in amphiphysin-1 and endophilin-1 as examples where the expanded pseudo-energies are obtained from experiments. Our open-source software package Cluster Expansion Version 1.0 allows users to expand their own energy function of interest and thereby apply cluster expansion to custom problems in protein design. © 2010 Wiley Periodicals, Inc.
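
    As a rough illustration of the cluster-expansion idea (not the authors' Cluster Expansion 1.0 software), the sketch below fits point-cluster and pair-cluster coefficients to a training set of sequence/energy pairs by linear least squares; the two-letter alphabet and toy sequence length are invented for brevity.

    ```python
    import numpy as np

    AAS = "AV"   # toy two-letter alphabet; real designs use the 20 amino acids
    L = 4        # sequence length (assumed for this sketch)

    def features(seq):
        """Indicator features for point clusters (residue a at position i)
        and pair clusters (residues a, b at positions i < j)."""
        f = []
        for i in range(L):
            f.extend(1.0 if seq[i] == a else 0.0 for a in AAS)
        for i in range(L):
            for j in range(i + 1, L):
                f.extend(1.0 if (seq[i], seq[j]) == (a, b) else 0.0
                         for a in AAS for b in AAS)
        return np.array(f)

    def fit_ce(train_seqs, energies):
        """Least-squares fit of cluster coefficients to training energies."""
        X = np.array([features(s) for s in train_seqs])
        coeffs, *_ = np.linalg.lstsq(X, np.array(energies), rcond=None)
        return coeffs

    def predict(seq, coeffs):
        """Cluster-expansion estimate of the energy of a sequence."""
        return float(features(seq) @ coeffs)
    ```

    Once fit, `predict` replaces the expensive structure-based energy evaluation with a dot product, which is what makes large sequence-space searches tractable; the prediction error discussed in the abstract is the residual of this fit on held-out sequences.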

  6. Insight into biases and sequencing errors for amplicon sequencing with the Illumina MiSeq platform.

    PubMed

    Schirmer, Melanie; Ijaz, Umer Z; D'Amore, Rosalinda; Hall, Neil; Sloan, William T; Quince, Christopher

    2015-03-31

    With read lengths of currently up to 2 × 300 bp, high throughput, and low sequencing costs, Illumina's MiSeq is becoming one of the most widely used sequencing platforms worldwide. The platform is manageable and affordable even for smaller labs. This enables quick turnaround on a broad range of applications such as targeted gene sequencing, metagenomics, small genome sequencing and clinical molecular diagnostics. However, Illumina error profiles are still poorly understood and programs are therefore not designed for the idiosyncrasies of Illumina data. A better knowledge of the error patterns is essential for sequence analysis and vital if we are to draw valid conclusions. Studying true genetic variation in a population sample is fundamental for understanding diseases, evolution and origin. We conducted a large study on the error patterns for the MiSeq based on 16S rRNA amplicon sequencing data. We tested state-of-the-art library preparation methods for amplicon sequencing and showed that the library preparation method and the choice of primers are the most significant sources of bias and cause distinct error patterns. Furthermore we tested the efficiency of various error correction strategies and identified quality trimming (Sickle) combined with error correction (BayesHammer) followed by read overlapping (PANDAseq) as the most successful approach, reducing substitution error rates on average by 93%. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. Effects of measurement errors on psychometric measurements in ergonomics studies: Implications for correlations, ANOVA, linear regression, factor analysis, and linear discriminant analysis.

    PubMed

    Liu, Yan; Salvendy, Gavriel

    2009-05-01

    This paper aims to demonstrate the effects of measurement errors on psychometric measurements in ergonomics studies. A variety of sources can cause random measurement errors in ergonomics studies and these errors can distort virtually every statistic computed and lead investigators to erroneous conclusions. The effects of measurement errors on the five most widely used statistical analysis tools have been discussed and illustrated: correlation; ANOVA; linear regression; factor analysis; linear discriminant analysis. It has been shown that measurement errors can greatly attenuate correlations between variables, reduce the statistical power of ANOVA, distort (overestimate, underestimate or even change the sign of) regression coefficients, underrate the explanation contributions of the most important factors in factor analysis and depreciate the significance of the discriminant function and the discrimination abilities of individual variables in discriminant analysis. The discussions will be restricted to subjective scales and survey methods and their reliability estimates. Other methods applied in ergonomics research, such as physical and electrophysiological measurements and chemical and biomedical analysis methods, also have issues of measurement errors, but they are beyond the scope of this paper. As there has been increasing interest in the development and testing of theories in ergonomics research, it has become very important for ergonomics researchers to understand the effects of measurement errors on their experiment results, which the authors believe is critical to research progress in theory development and cumulative knowledge in the ergonomics field.
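
    The correlation-attenuation effect described above can be reproduced numerically. In classical test theory the observed correlation shrinks to roughly r_true · sqrt(rel_x · rel_y), where rel is each measure's reliability; the simulation below, with assumed reliabilities of 0.5, is a sketch of that result rather than an analysis from the paper.

    ```python
    import numpy as np

    # Simulate two latent variables correlated at r = 0.8, then observe each
    # through a noisy measurement whose error variance equals the true-score
    # variance (reliability 0.5). Classical test theory predicts the observed
    # correlation attenuates to 0.8 * sqrt(0.5 * 0.5) = 0.4.
    rng = np.random.default_rng(0)
    n = 200_000
    true_r = 0.8

    x = rng.standard_normal(n)
    y = true_r * x + np.sqrt(1 - true_r ** 2) * rng.standard_normal(n)

    x_obs = x + rng.standard_normal(n)    # unit-variance measurement error
    y_obs = y + rng.standard_normal(n)

    observed_r = np.corrcoef(x_obs, y_obs)[0, 1]   # close to 0.4, not 0.8
    ```

    The same mechanism underlies the other distortions listed in the abstract: error variance inflates within-group variability (reducing ANOVA power) and biases regression coefficients toward zero.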

  8. Target Uncertainty Mediates Sensorimotor Error Correction

    PubMed Central

    Vijayakumar, Sethu; Wolpert, Daniel M.

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects’ scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one’s response. By suggesting that subjects’ decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated. PMID:28129323

  9. Target Uncertainty Mediates Sensorimotor Error Correction.

    PubMed

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects' scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one's response. By suggesting that subjects' decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated.

  10. [Ambient air interference in oxygen intake measurements in liquid incubating media with the use of open polarographic cells].

    PubMed

    Miniaev, M V; Voronchikhina, L I

    2007-01-01

    A model of oxygen intake by aerobic bio-objects in liquid incubating media was applied to investigate the influence of the air-media interface area on the accuracy of oxygen intake measurements and on the error value. It was shown that intrusion of air oxygen increases the relative error to 24% in open polarographic cells and to 13% in cells with a reduced interface area. Results of modeling passive media oxygenation laid the basis for proposing a method to reduce the relative error by 66% for open cells and by 15% for cells with a reduced interface area.

  11. Attentional effects on orientation judgements are dependent on memory consolidation processes.

    PubMed

    Haskell, Christie; Anderson, Britt

    2016-11-01

    Are the effects of memory and attention on perception synergistic, antagonistic, or independent? Tested separately, memory and attention have been shown to affect the accuracy of orientation judgements. When multiple stimuli are presented sequentially versus simultaneously, error variance is reduced. When a target is validly cued, precision is increased. What if they are manipulated together? We combined memory and attention manipulations in an orientation judgement task to answer this question. Two circular gratings were presented sequentially or simultaneously. On some trials a brief luminance cue preceded the stimuli. Participants were cued to report the orientation of one of the two gratings by rotating a response grating. We replicated the finding that error variance is reduced on sequential trials. Critically, we found interacting effects of memory and attention. Valid cueing reduced the median, absolute error only when two stimuli appeared together and improved it to the level of performance on uncued sequential trials, whereas invalid cueing always increased error. This effect was not mediated by cue predictiveness; however, predictive cues reduced the standard deviation of the error distribution, whereas nonpredictive cues reduced "guessing". Our results suggest that, when the demand on memory is greater than a single stimulus, attention is a bottom-up process that prioritizes stimuli for consolidation. Thus attention and memory are synergistic.

  12. Current pulse: can a production system reduce medical errors in health care?

    PubMed

    Printezis, Antonios; Gopalakrishnan, Mohan

    2007-01-01

    One of the reasons for rising health care costs is medical errors, a majority of which result from faulty systems and processes. Health care in the past has used process-based initiatives such as Total Quality Management, Continuous Quality Improvement, and Six Sigma to reduce errors. These initiatives to redesign health care, reduce errors, and improve overall efficiency and customer satisfaction have had moderate success. The current trend is to apply the successful Toyota Production System (TPS) to health care, since its organizing principles have led to tremendous improvements in productivity and quality for Toyota and other businesses that have adapted them. This article presents insights on the effectiveness of TPS principles in health care and the challenges that lie ahead in successfully integrating this approach with other quality initiatives.

  13. Dysfunctional error-related processing in incarcerated youth with elevated psychopathic traits

    PubMed Central

    Maurer, J. Michael; Steele, Vaughn R.; Cope, Lora M.; Vincent, Gina M.; Stephen, Julia M.; Calhoun, Vince D.; Kiehl, Kent A.

    2016-01-01

    Adult psychopathic offenders show an increased propensity towards violence, impulsivity, and recidivism. A subsample of youth with elevated psychopathic traits represent a particularly severe subgroup characterized by extreme behavioral problems and comparable neurocognitive deficits as their adult counterparts, including perseveration deficits. Here, we investigate response-locked event-related potential (ERP) components (the error-related negativity [ERN/Ne] related to early error-monitoring processing and the error-related positivity [Pe] involved in later error-related processing) in a sample of incarcerated juvenile male offenders (n = 100) who performed a response inhibition Go/NoGo task. Psychopathic traits were assessed using the Hare Psychopathy Checklist: Youth Version (PCL:YV). The ERN/Ne and Pe were analyzed with classic windowed ERP components and principal component analysis (PCA). Using linear regression analyses, PCL:YV scores were unrelated to the ERN/Ne, but were negatively related to Pe mean amplitude. Specifically, the PCL:YV Facet 4 subscale reflecting antisocial traits emerged as a significant predictor of reduced amplitude of a subcomponent underlying the Pe identified with PCA. This is the first evidence to suggest a negative relationship between adolescent psychopathy scores and Pe mean amplitude. PMID:26930170

  14. Effects of monetary reward and punishment on information checking behaviour: An eye-tracking study.

    PubMed

    Li, Simon Y W; Cox, Anna L; Or, Calvin; Blandford, Ann

    2018-07-01

    The aim of the present study was to investigate the effect of error consequence, as reward or punishment, on individuals' checking behaviour following data entry. This study comprised two eye-tracking experiments that replicated and extended the investigation of Li et al. (2016) into the effect of monetary reward and punishment on data-entry performance. The first experiment adopted the same experimental setup as Li et al. (2016) but additionally used an eye tracker. The experiment validated Li et al.'s (2016) finding that, when compared to no error consequence, both reward and punishment led to improved data-entry performance in terms of reducing errors, and that no performance difference was found between reward and punishment. The second experiment extended the earlier study by associating error consequence with each individual trial, providing immediate performance feedback to participants. It was found that gradual increment (i.e. reward feedback) also led to significantly more accurate performance than no error consequence. It is unclear whether gradual increment is more effective than gradual decrement because of the small sample size tested. However, this study reasserts the effectiveness of reward on data-entry performance. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Error-rate prediction for programmable circuits: methodology, tools and studied cases

    NASA Astrophysics Data System (ADS)

    Velazco, Raoul

    2013-05-01

    This work presents an approach to predict the error rates due to Single Event Upsets (SEU) occurring in programmable circuits as a consequence of the impact of energetic particles present in the environment in which the circuits operate. For a chosen application, the error rate is predicted by combining the results obtained from radiation ground testing with the results of fault-injection campaigns performed off-beam, during which huge numbers of SEUs are injected during the execution of the studied application. The goal of this strategy is to obtain accurate results about different applications' error rates without using particle accelerator facilities, thus significantly reducing the cost of the sensitivity evaluation. As a case study, this methodology was applied to a complex processor, the PowerPC 7448, executing a program issued from a real space application, and to a crypto-processor application implemented in an SRAM-based FPGA and accepted to be embedded in the payload of a scientific satellite of NASA. The accuracy of the predicted error rates was confirmed by comparing, for the same circuit and application, predictions with measurements issued from radiation ground testing performed at the Cyclone cyclotron of the Heavy Ion Facility (HIF) at Louvain-la-Neuve, Belgium.
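
    The prediction combines two measured quantities: the fraction of injected SEUs that corrupt the application's output (from the fault-injection campaign) and the device SEU rate (from radiation ground testing). A minimal sketch of that arithmetic, with made-up figures rather than the paper's measurements:

    ```python
    # Error-rate prediction as the product of the fault-injection error rate
    # and the device SEU rate obtained from radiation ground testing.
    # All figures below are invented for illustration only.
    injected_faults = 100_000
    application_errors = 1_800                    # injected SEUs that changed the output
    tau = application_errors / injected_faults    # application error rate per SEU

    device_seu_rate = 2.5e-3                      # device upsets/day in the target environment (assumed)
    predicted_rate = tau * device_seu_rate        # predicted application errors/day
    ```

    The expensive beam time is spent only on measuring the device-level upset rate; the application-dependent factor tau comes from cheap off-beam injection, which is the cost saving the abstract describes.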

  16. Partial pressure analysis in space testing

    NASA Technical Reports Server (NTRS)

    Tilford, Charles R.

    1994-01-01

    For vacuum-system or test-article analysis it is often desirable to know the species and partial pressures of the vacuum gases. Residual gas or Partial Pressure Analyzers (PPA's) are commonly used for this purpose. These are mass spectrometer-type instruments, most commonly employing quadrupole filters. These instruments can be extremely useful, but they should be used with caution. Depending on the instrument design, calibration procedures, and conditions of use, measurements made with these instruments can be accurate to within a few percent, or in error by two or more orders of magnitude. Significant sources of error can include relative gas sensitivities that differ from handbook values by an order of magnitude, changes in sensitivity with pressure by as much as two orders of magnitude, changes in sensitivity with time after exposure to chemically active gases, and the dependence of the sensitivity for one gas on the pressures of other gases. However, for most instruments, these errors can be greatly reduced with proper operating procedures and conditions of use. In this paper, data are presented illustrating performance characteristics for different instruments and gases, operating parameters are recommended to minimize some errors, and calibrations procedures are described that can detect and/or correct other errors.

  17. Safety coaches in radiology: decreasing human error and minimizing patient harm.

    PubMed

    Dickerson, Julie M; Koch, Bernadette L; Adams, Janet M; Goodfriend, Martha A; Donnelly, Lane F

    2010-09-01

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program.

  18. Resource allocation for error resilient video coding over AWGN using optimization approach.

    PubMed

    An, Cheolhong; Nguyen, Truong Q

    2008-12-01

    The number of slices for error-resilient video coding is jointly optimized with 802.11a-like media access control and physical layers, using automatic repeat request and rate-compatible punctured convolutional codes over an additive white Gaussian noise channel, together with the channel time allocation for time division multiple access. For error-resilient video coding, the relation between the number of slices and coding efficiency is analyzed and formulated as a mathematical model. This model is applied to the joint optimization problem, and the problem is solved by a convex optimization method, the primal-dual decomposition method. We compare the performance of a video communication system that uses the optimal number of slices with one that codes a picture as a single slice. Numerical examples show that the end-to-end distortion used as the utility function can be significantly reduced with the optimal number of slices per picture, especially at low signal-to-noise ratio.

  19. An Analysis of Ripple and Error Fields Induced by a Blanket in the CFETR

    NASA Astrophysics Data System (ADS)

    Yu, Guanying; Liu, Xufeng; Liu, Songlin

    2016-10-01

    The Chinese Fusion Engineering Tokamak Reactor (CFETR) is an important intermediate device between ITER and DEMO. The Water Cooled Ceramic Breeder (WCCB) blanket, whose structural material is mainly Reduced Activation Ferritic/Martensitic (RAFM) steel, is one of the candidate conceptual blanket designs. An analysis of the ripple and error fields induced by the RAFM steel in the WCCB blanket is performed with the static magnetic analysis method in the ANSYS code. A significant additional magnetic field is produced by the blanket, leading to an increased ripple field. The maximum ripple along the separatrix line reaches 0.53%, which exceeds the acceptable design value of 0.5%. Furthermore, when one blanket module is removed for heating purposes, the resulting error field is calculated to be seriously in violation of the requirement. supported by National Natural Science Foundation of China (No. 11175207) and the National Magnetic Confinement Fusion Program of China (No. 2013GB108004)

  20. Decentralized control of sound radiation using iterative loop recovery.

    PubMed

    Schiller, Noah H; Cabell, Randolph H; Fuller, Chris R

    2010-10-01

    A decentralized model-based control strategy is designed to reduce low-frequency sound radiation from periodically stiffened panels. While decentralized control systems tend to be scalable, performance can be limited due to modeling error introduced by the unmodeled interaction between neighboring control units. Since bounds on modeling error are not known in advance, it is difficult to ensure the decentralized control system will be robust without making the controller overly conservative. Therefore an iterative approach is suggested, which utilizes frequency-shaped loop recovery. The approach accounts for modeling error introduced by neighboring control loops, requires no communication between subsystems, and is relatively simple. The control strategy is evaluated numerically using a model of a stiffened aluminum panel that is representative of the sidewall of an aircraft. Simulations demonstrate that the iterative approach can achieve significant reductions in radiated sound power from the stiffened panel without destabilizing neighboring control units.

  1. Three-dimensional ray-tracing model for the study of advanced refractive errors in keratoconus.

    PubMed

    Schedin, Staffan; Hallberg, Per; Behndig, Anders

    2016-01-20

    We propose a numerical three-dimensional (3D) ray-tracing model for the analysis of advanced corneal refractive errors. The 3D modeling was based on measured corneal elevation data by means of Scheimpflug photography. A mathematical description of the measured corneal surfaces from a keratoconus (KC) patient was used for the 3D ray tracing, based on Snell's law of refraction. A model of a commercial intraocular lens (IOL) was included in the analysis. By modifying the posterior IOL surface, it was shown that the imaging quality could be significantly improved. The RMS values were reduced by approximately 50% close to the retina, both for on- and off-axis geometries. The 3D ray-tracing model can constitute a basis for simulation of customized IOLs that are able to correct the advanced, irregular refractive errors in KC.
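
    The ray-tracing core of such a model is the vector form of Snell's law applied at each measured corneal surface. The sketch below is a generic implementation, not the authors' code; the corneal refractive index of 1.376 in the example is a textbook value assumed for illustration.

    ```python
    import numpy as np

    def refract(d, n, n1, n2):
        """Vector form of Snell's law: refract a unit ray direction d at a
        surface with unit normal n (oriented against the incoming ray),
        passing from refractive index n1 into n2. Returns None on total
        internal reflection."""
        eta = n1 / n2
        cos_i = -np.dot(d, n)
        sin2_t = eta ** 2 * (1.0 - cos_i ** 2)
        if sin2_t > 1.0:
            return None                   # total internal reflection
        cos_t = np.sqrt(1.0 - sin2_t)
        return eta * d + (eta * cos_i - cos_t) * n

    # Example: a ray hits the anterior corneal surface from air at 30 degrees
    theta = np.radians(30)
    d = np.array([np.sin(theta), 0.0, -np.cos(theta)])
    t = refract(d, np.array([0.0, 0.0, 1.0]), 1.0, 1.376)
    ```

    In a full model this refraction is applied per ray at each interface reconstructed from the Scheimpflug elevation data, and the RMS spread of the traced rays near the retina quantifies the imaging quality.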

  2. Decentralized Control of Sound Radiation Using Iterative Loop Recovery

    NASA Technical Reports Server (NTRS)

    Schiller, Noah H.; Cabell, Randolph H.; Fuller, Chris R.

    2009-01-01

    A decentralized model-based control strategy is designed to reduce low-frequency sound radiation from periodically stiffened panels. While decentralized control systems tend to be scalable, performance can be limited due to modeling error introduced by the unmodeled interaction between neighboring control units. Since bounds on modeling error are not known in advance, it is difficult to ensure the decentralized control system will be robust without making the controller overly conservative. Therefore an iterative approach is suggested, which utilizes frequency-shaped loop recovery. The approach accounts for modeling error introduced by neighboring control loops, requires no communication between subsystems, and is relatively simple. The control strategy is evaluated numerically using a model of a stiffened aluminum panel that is representative of the sidewall of an aircraft. Simulations demonstrate that the iterative approach can achieve significant reductions in radiated sound power from the stiffened panel without destabilizing neighboring control units.

  3. Robust estimation of thermodynamic parameters (ΔH, ΔS and ΔCp) for prediction of retention time in gas chromatography - Part II (Application).

    PubMed

    Claumann, Carlos Alberto; Wüst Zibetti, André; Bolzan, Ariovaldo; Machado, Ricardo A F; Pinto, Leonel Teixeira

    2015-12-18

In this work, parameter estimation for the retention factor in a GC model was analyzed under two different criteria: the sum of squared errors and the maximum absolute error; the relevant statistics are described for each case. The main contribution of this work is a specialized initialization scheme for the estimated parameters, which converges quickly (low computational time) and is based on knowledge of the surface of the error criterion. In an application to a series of alkanes, the specialized initialization significantly reduced the number of objective-function evaluations (and hence the computational time) in the parameter estimation. The reduction was between one and two orders of magnitude compared with simple random initialization. Copyright © 2015 Elsevier B.V. All rights reserved.
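In the simplest setting (ΔCp = 0, so ln k is linear in 1/T), the sum-of-squared-error criterion reduces to a linear least-squares fit. The sketch below uses synthetic temperatures and hypothetical parameter values, not the paper's alkane data:

```python
import numpy as np

# Simplified retention model with dCp = 0, so that ln k = a + b/T and the
# sum-of-squared-error criterion becomes linear least squares. Parameter
# values and temperatures are synthetic, not the paper's measurements.
a_true, b_true = -8.0, 3500.0
T = np.array([320.0, 340.0, 360.0, 380.0, 400.0])   # column temperatures, K
lnk = a_true + b_true / T

X = np.column_stack([np.ones_like(T), 1.0 / T])     # design matrix [1, 1/T]
(a_fit, b_fit), *_ = np.linalg.lstsq(X, lnk, rcond=None)
```

The full model with a nonzero ΔCp is nonlinear and is where a good initialization, as the abstract emphasizes, matters for convergence speed.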

  4. Reducing visual deficits caused by refractive errors in school and preschool children: results of a pilot school program in the Andean region of Apurimac, Peru

    PubMed Central

    Latorre-Arteaga, Sergio; Gil-González, Diana; Enciso, Olga; Phelan, Aoife; García-Muñoz, Ángel; Kohler, Johannes

    2014-01-01

Background Refractive error is defined as the inability of the eye to bring parallel rays of light into focus on the retina, resulting in nearsightedness (myopia), farsightedness (hyperopia) or astigmatism. Uncorrected refractive error in children is associated with increased morbidity and reduced educational opportunities. Vision screening (VS) is a method for identifying children with visual impairment or eye conditions likely to lead to visual impairment. Objective To analyze the utility of vision screening conducted by teachers and to contribute to a better estimation of the prevalence of childhood refractive errors in Apurimac, Peru. Design A pilot vision screening program in preschool (Group I) and elementary school children (Group II) was conducted with the participation of 26 trained teachers. Children whose visual acuity was <6/9 [20/30] (Group I) and ≤6/9 (Group II) in one or both eyes, measured with the Snellen Tumbling E chart at 6 m, were referred for a comprehensive eye exam. Specificity and positive predictive value to detect refractive error were calculated against clinical examination. Program assessment with participants was conducted to evaluate outcomes and procedures. Results A total sample of 364 children aged 3–11 years was screened; 45 children were examined at Centro Oftalmológico Monseñor Enrique Pelach (COMEP) Eye Hospital. Prevalence of refractive error was 6.2% (Group I) and 6.9% (Group II); specificity of teacher vision screening was 95.8% and 93.0%, while positive predictive value was 59.1% and 47.8% for each group, respectively. Aspects highlighted to improve the program included extending training, increasing parental involvement, and helping referred children to attend the hospital. Conclusion Prevalence of refractive error in children is significant in the region. Vision screening performed by trained teachers is a valid intervention for early detection of refractive error, including screening of preschool children. 
Program sustainability and improvements in education and quality of life resulting from childhood vision screening require further research. PMID:24560253
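The two validity metrics reported for the teacher screening follow directly from screening-versus-clinical-exam counts. A small sketch with illustrative counts, chosen only to land near Group I's figures (they are not the study's raw data):

```python
# Validity metrics for a screening test, computed against the reference
# clinical exam: specificity = TN / (TN + FP), and positive predictive
# value = TP / (TP + FP). Counts below are illustrative.
def specificity(tn, fp):
    return tn / (tn + fp)

def positive_predictive_value(tp, fp):
    return tp / (tp + fp)

spec = specificity(tn=205, fp=9)              # ~0.958
ppv = positive_predictive_value(tp=13, fp=9)  # ~0.591
```

Note that a high specificity can coexist with a modest PPV when the condition is uncommon, which is exactly the pattern the study reports.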

  5. Using Analysis Increments (AI) to Estimate and Correct Systematic Errors in the Global Forecast System (GFS) Online

    NASA Astrophysics Data System (ADS)

    Bhargava, K.; Kalnay, E.; Carton, J.; Yang, F.

    2017-12-01

Systematic forecast errors, arising from model deficiencies, form a significant portion of the total forecast error in weather prediction models like the Global Forecast System (GFS). While much effort has been expended to improve models, substantial model error remains. The aim here is to (i) estimate the model deficiencies in the GFS that lead to systematic forecast errors, and (ii) implement an online (i.e., within the model) correction scheme for the GFS following the methodology of Danforth et al. [2007] and Danforth and Kalnay [2008, GRL]. Analysis increments represent the corrections that new observations make to (in this case) the 6-hr forecast in the analysis cycle. Model bias corrections are estimated from the time average of the analysis increments divided by 6 hr, assuming that initial model errors grow linearly and, at first, ignoring the impact of observation bias. During 2012-2016, seasonal means of the 6-hr model bias are generally robust despite changes in model resolution and data assimilation systems, and their broad continental scales explain their insensitivity to model resolution. The daily bias dominates the sub-monthly analysis increments and consists primarily of diurnal and semidiurnal components, also requiring a low-dimensional correction. Analysis increments in 2015 and 2016 are reduced over oceans, which is attributed to improvements in the specification of the SSTs. These results encourage application of online correction, as suggested by Danforth and Kalnay, for the mean, seasonal, diurnal, and semidiurnal model biases in the GFS to reduce both systematic and random errors. As short-term error growth is still linear, estimated model bias corrections can be added as a forcing term in the model tendency equation to correct online. Preliminary experiments with the GFS, correcting temperature and specific humidity online, show a reduction in model bias in the 6-hr forecast. 
This approach can then be used to guide and optimize the design of sub-grid scale physical parameterizations, more accurate discretization of the model dynamics, boundary conditions, radiative transfer codes, and other potential model improvements which can then replace the empirical correction scheme. The analysis increments also provide guidance in testing new physical parameterizations.
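The correction scheme described above can be sketched numerically: average the analysis increments over a training period, divide by the 6-hr window, and add the result as a forcing term in the tendency equation. The numbers below are synthetic, not GFS output:

```python
import numpy as np

# Sketch of the online correction: the time-mean analysis increment
# (analysis minus 6-hr forecast) divided by 6 hr estimates the bias
# tendency, assuming linear short-term error growth. Synthetic numbers:
# the model drifts +0.3 K warm every 6 hr, so increments average ~-0.3 K.
rng = np.random.default_rng(0)
warm_bias = 0.3
increments = -warm_bias + 0.05 * rng.standard_normal(1000)

correction = np.mean(increments) / 6.0   # K per hour, applied as forcing

def corrected_tendency(model_tendency):
    # Add the empirical correction term to the model tendency equation.
    return model_tendency + correction
```

The sign works out naturally: a warm-biased model produces negative increments, so the estimated correction cools the tendency.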

  6. A Well-Calibrated Ocean Algorithm for Special Sensor Microwave/Imager

    NASA Technical Reports Server (NTRS)

    Wentz, Frank J.

    1997-01-01

I describe an algorithm for retrieving geophysical parameters over the ocean from special sensor microwave/imager (SSM/I) observations. This algorithm is based on a model for the brightness temperature T(sub B) of the ocean and intervening atmosphere. The retrieved parameters are the near-surface wind speed W, the columnar water vapor V, the columnar cloud liquid water L, and the line-of-sight wind W(sub LS). I restrict my analysis to ocean scenes free of rain, and when the algorithm detects rain, the retrievals are discarded. The model and algorithm are precisely calibrated using a very large in situ database containing 37,650 SSM/I overpasses of buoys and 35,108 overpasses of radiosonde sites. A detailed error analysis indicates that the T(sub B) model rms accuracy is between 0.5 and 1 K and that the rms retrieval accuracies for wind, vapor, and cloud are 0.9 m/s, 1.2 mm, and 0.025 mm, respectively. The error in specifying the cloud temperature will introduce an additional 10% error in the cloud water retrieval. The spatial resolution for these accuracies is 50 km. The systematic errors in the retrievals are smaller than the rms errors, being about 0.3 m/s, 0.6 mm, and 0.005 mm for W, V, and L, respectively. The one exception is the systematic error in wind speed of -1.0 m/s that occurs for observations within +/-20 deg of upwind. The inclusion of the line-of-sight wind W(sub LS) in the retrieval significantly reduces the error in wind speed due to wind direction variations. The wind error for upwind observations is reduced from -3.0 to -1.0 m/s. Finally, I find a small signal in the 19-GHz, horizontal polarization (h(sub pol)) T(sub B) residual DeltaT(sub BH) that is related to the effective air pressure of the water vapor profile. This information may be of some use in specifying the vertical distribution of water vapor.

  7. SU-F-T-471: Simulated External Beam Delivery Errors Detection with a Large Area Ion Chamber Transmission Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, D; Dyer, B; Kumaran Nair, C

Purpose: The Integral Quality Monitor (IQM), developed by iRT Systems GmbH (Koblenz, Germany), is a large-area, linac-mounted ion chamber used to monitor photon fluence during patient treatment. Our previous work evaluated the ion chamber's response to deviations from static 1×1 cm2 and 10×10 cm2 photon beams and other characteristics integral to its use in external beam error detection. The aim of this work is to simulate two external beam radiation delivery errors, quantify the detection of the simulated errors, and evaluate the reduction in patient harm resulting from detection. Methods: Two well-documented radiation oncology delivery errors were selected for simulation. The first error was recreated by modifying a wedged whole-breast treatment, removing the physical wedge and calculating the planned dose with the Pinnacle TPS (Philips Radiation Oncology Systems, Fitchburg, WI). The second error was recreated by modifying a static-gantry IMRT pharyngeal tonsil plan to be delivered in 3 unmodulated fractions. A radiation oncologist evaluated the dose for the simulated errors and predicted morbidity and mortality commensurate with the originally reported toxicity, indicating that the reported errors were adequately simulated. The ion chamber signal of the unmodified treatments was compared to the simulated error signal and evaluated again in the Pinnacle TPS, with radiation oncologist prediction of simulated patient harm. Results: Previous work established that transmission detector system measurements are stable to within a 0.5% standard deviation (SD). Errors causing a signal change greater than 20 SD (10%) were considered detected. The whole-breast and pharyngeal tonsil IMRT simulated errors increased the signal by 215% and 969%, respectively, indicating error detection after the first fraction and first IMRT segment, respectively. 
Conclusion: The transmission detector system demonstrated utility in detecting clinically significant errors and reducing patient toxicity/harm in simulated external beam delivery. Future work will evaluate the detection of other, smaller-magnitude delivery errors.
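The detection rule in the Results reduces to a simple threshold test: with 0.5% measurement stability (1 SD), any signal change beyond 20 SD (10% of the expected signal) counts as detected. A sketch:

```python
# Detection rule from the Results: signal stability is 0.5% (1 SD), and a
# change greater than 20 SD (10%) of the expected signal flags an error.
SD_PERCENT = 0.5
THRESHOLD_PERCENT = 20 * SD_PERCENT          # 10%

def error_detected(expected, measured):
    change = 100.0 * abs(measured - expected) / expected
    return change > THRESHOLD_PERCENT

# The simulated whole-breast error raised the signal by 215%:
flagged = error_detected(100.0, 315.0)
```

The 215% and 969% signal changes reported for the two simulated errors sit far above this 10% threshold, which is why both were detected immediately.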

  8. 42 CFR 431.992 - Corrective action plan.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CMS, designed to reduce improper payments in each program based on its analysis of the error causes in... State must take the following actions: (1) Data analysis. States must conduct data analysis such as reviewing clusters of errors, general error causes, characteristics, and frequency of errors that are...

  9. 42 CFR 431.992 - Corrective action plan.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CMS, designed to reduce improper payments in each program based on its analysis of the error causes in... State must take the following actions: (1) Data analysis. States must conduct data analysis such as reviewing clusters of errors, general error causes, characteristics, and frequency of errors that are...

  10. The effects of partial and full correction of refractive errors on sensorial and motor outcomes in children with refractive accommodative esotropia.

    PubMed

    Sefi-Yurdakul, Nazife; Kaykısız, Hüseyin; Koç, Feray

    2018-03-17

    To investigate the effects of partial and full correction of refractive errors on sensorial and motor outcomes in children with refractive accommodative esotropia (RAE). The records of pediatric cases with full RAE were reviewed; their first and last sensorial and motor findings were evaluated in two groups, classified as partial (Group 1) and full correction (Group 2) of refractive errors. The mean age at first admission was 5.84 ± 3.62 years in Group 1 (n = 35) and 6.35 ± 3.26 years in Group 2 (n = 46) (p = 0.335). Mean change in best corrected visual acuity (BCVA) was 0.24 ± 0.17 logarithm of the minimum angle of resolution (logMAR) in Group 1 and 0.13 ± 0.16 logMAR in Group 2 (p = 0.001). Duration of deviation, baseline refraction and amount of reduced refraction showed significant effects on change in BCVA (p < 0.05). Significant correlation was determined between binocular vision (BOV), duration of deviation and uncorrected baseline amount of deviation (p < 0.05). The baseline BOV rates were significantly high in fully corrected Group 2, and also were found to have increased in Group 1 (p < 0.05). Change in refraction was - 0.09 ± 1.08 and + 0.35 ± 0.76 diopters in Groups 1 and 2, respectively (p = 0.005). Duration of deviation, baseline refraction and the amount of reduced refraction had significant effects on change in refraction (p < 0.05). Change in deviation without refractive correction was - 0.74 ± 7.22 prism diopters in Group 1 and - 3.24 ± 10.41 prism diopters in Group 2 (p = 0.472). Duration of follow-up and uncorrected baseline deviation showed significant effects on change in deviation (p < 0.05). Although the BOV rates and BCVA were initially high in fully corrected patients, they finally improved significantly in both the fully and partially corrected patients. Full hypermetropic correction may also cause an increase in the refractive error with a possible negative effect on emmetropization. 
The negative effect of the duration of deviation on BOV and BCVA demonstrates the significance of early treatment in RAE cases.

  11. Fully Convolutional Networks for Ground Classification from LIDAR Point Clouds

    NASA Astrophysics Data System (ADS)

    Rizaldy, A.; Persello, C.; Gevaert, C. M.; Oude Elberink, S. J.

    2018-05-01

Deep Learning has been massively used for image classification in recent years. The use of deep learning for ground classification from LIDAR point clouds has also been recently studied. However, point clouds need to be converted into an image in order to use Convolutional Neural Networks (CNNs). In state-of-the-art techniques, this conversion is slow because each point is converted into a separate image. This approach leads to highly redundant computation during conversion and classification. The goal of this study is to design a more efficient data conversion and ground classification. This goal is achieved by first converting the whole point cloud into a single image. The classification is then performed by a Fully Convolutional Network (FCN), a modified version of a CNN designed for pixel-wise image classification. The proposed method is significantly faster than state-of-the-art techniques: on the ISPRS Filter Test dataset, it is 78 times faster for conversion and 16 times faster for classification. Our experimental analysis on the same dataset shows that the proposed method results in 5.22% total error, 4.10% type I error, and 15.07% type II error. Compared to the previous CNN-based technique and LAStools software, the proposed method reduces the total error and type I error (while the type II error is slightly higher). The method was also tested on a very high point density LIDAR point cloud, resulting in 4.02% total error, 2.15% type I error, and 6.14% type II error.
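The three error figures follow the standard ISPRS ground-filtering conventions: type I counts ground points wrongly rejected as non-ground, type II counts non-ground points wrongly accepted as ground, and the total pools both. A sketch with illustrative counts (not the paper's):

```python
# ISPRS ground-filtering error conventions: type I rejects true ground,
# type II accepts true non-ground; total pools both. Counts illustrative.
def filter_errors(ground_rejected, ground_total,
                  nonground_accepted, nonground_total):
    type1 = ground_rejected / ground_total
    type2 = nonground_accepted / nonground_total
    total = (ground_rejected + nonground_accepted) / \
            (ground_total + nonground_total)
    return total, type1, type2

total, t1, t2 = filter_errors(41, 1000, 151, 1000)
```

Because the total error weights the two classes by their sizes, it can sit well below the type II error when non-ground points are a minority, as in the paper's results.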

  12. Evaluation of analytical errors in a clinical chemistry laboratory: a 3 year experience.

    PubMed

    Sakyi, As; Laing, Ef; Ephraim, Rk; Asibey, Of; Sadique, Ok

    2015-01-01

Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there has been increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle, including the pre-, intra-, and post-analytical phases, and we discuss strategies pertinent to our setting to minimize their occurrence. We describe the pre-analytical, analytical, and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during the 3-year period from January 2010 to December 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical, and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests decreased significantly over the 3-year period, but this did not correspond to a reduction in the overall error rate over the same years (P = 0.90). Errors are embedded throughout our total testing process, especially in the pre-analytical and post-analytical phases. Strategic measures, including quality assessment programs for staff involved in pre-analytical processes, should be intensified.

  13. Model-based cost-effectiveness analysis of interventions aimed at preventing medication error at hospital admission (medicines reconciliation).

    PubMed

    Karnon, Jonathan; Campbell, Fiona; Czoski-Murray, Carolyn

    2009-04-01

Medication errors can lead to preventable adverse drug events (pADEs) that have significant cost and health implications. Errors often occur at care interfaces, and various interventions have been devised to reduce medication errors at the point of admission to hospital. The aim of this study is to assess the incremental costs and effects [measured as quality-adjusted life years (QALYs)] of a range of such interventions for which evidence of effectiveness exists. A previously published medication errors model was adapted to describe the pathway from errors occurring at admission through to the occurrence of pADEs. The baseline model was populated using literature-based values and then calibrated to observed outputs. Evidence of effects was derived from a systematic review of interventions aimed at preventing medication error at hospital admission. All five interventions for which evidence of effectiveness was identified are estimated to be extremely cost-effective when compared with the baseline scenario. The pharmacist-led reconciliation intervention has the highest expected net benefits and a probability of being cost-effective of over 60% at a QALY value of £10,000. The medication errors model provides reasonably strong evidence that some form of intervention to improve medicines reconciliation is a cost-effective use of NHS resources. The variation in the reported effectiveness of the few identified studies of medication error interventions illustrates the need for extreme attention to detail both in the development of interventions and in their evaluation, and may justify the primary evaluation of more than one specification of the included interventions.
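The cost-effectiveness comparison hinges on net monetary benefit at a given willingness-to-pay per QALY. A minimal sketch with hypothetical figures (the £10,000 threshold echoes the abstract; the ΔQALY and Δcost values are invented):

```python
# Incremental net monetary benefit at willingness-to-pay (WTP) lambda:
# NMB = lambda * delta_QALY - delta_cost; a positive NMB means the
# intervention is cost-effective at that threshold. Figures invented.
def net_monetary_benefit(delta_qaly, delta_cost, wtp=10_000):
    return wtp * delta_qaly - delta_cost

nmb = net_monetary_benefit(delta_qaly=0.05, delta_cost=150.0)
```

In a probabilistic model, the "probability of being cost-effective" reported in the abstract is the fraction of simulation draws in which this quantity is positive.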

  14. Human Error In Complex Systems

    NASA Technical Reports Server (NTRS)

    Morris, Nancy M.; Rouse, William B.

    1991-01-01

This report presents the results of research aimed at understanding the causes of human error in complex systems such as aircraft, nuclear power plants, and chemical processing plants. The research considered both slips (errors of action) and mistakes (errors of intention), and the influence of workload on them. Results indicated that humans respond to conditions in which errors are expected by attempting to reduce the incidence of errors, and that adaptation to conditions is a potent influence on human behavior in discretionary situations.

  15. A continuous quality improvement project to reduce medication error in the emergency department.

    PubMed

    Lee, Sara Bc; Lee, Larry Ly; Yeung, Richard Sd; Chan, Jimmy Ts

    2013-01-01

Medication errors are a common source of adverse healthcare incidents, particularly in the emergency department (ED), which has a number of factors that make it prone to medication errors. This project aims to reduce medication errors and improve the health and economic outcomes of clinical care in a Hong Kong ED. In 2009, a task group was formed to identify problems that potentially endanger medication safety and to develop strategies to eliminate them. Responsible officers were assigned to look after seven error-prone areas. Strategies were proposed, discussed, endorsed, and promulgated to eliminate the problems identified. Medication incidents (MI) fell from 16 before the improvement work to 6 afterwards. This project successfully established a concrete organizational structure to safeguard error-prone areas of medication safety in a sustainable manner.

  16. Preliminary Evidence for Reduced Post-Error Reaction Time Slowing in Hyperactive/Inattentive Preschool Children

    PubMed Central

    Berwid, Olga G.; Halperin, Jeffrey M.; Johnson, Ray E.; Marks, David J.

    2013-01-01

    Background Attention-Deficit/Hyperactivity Disorder has been associated with deficits in self-regulatory cognitive processes, some of which are thought to lie at the heart of the disorder. Slowing of reaction times (RTs) for correct responses following errors made during decision tasks has been interpreted as an indication of intact self-regulatory functioning and has been shown to be attenuated in school-aged children with ADHD. This study attempted to examine whether ADHD symptoms are associated with an early-emerging deficit in post-error slowing. Method A computerized two-choice RT task was administered to an ethnically diverse sample of preschool-aged children classified as either ‘control’ (n = 120) or ‘hyperactive/inattentive’ (HI; n = 148) using parent- and teacher-rated ADHD symptoms. Analyses were conducted to determine whether HI preschoolers exhibit a deficit in this self-regulatory ability. Results HI children exhibited reduced post-error slowing relative to controls on the trials selected for analysis. Supplementary analyses indicated that this may have been due to a reduced proportion of trials following errors on which HI children slowed rather than to a reduction in the absolute magnitude of slowing on all trials following errors. Conclusions High levels of ADHD symptoms in preschoolers may be associated with a deficit in error processing as indicated by post-error slowing. The results of supplementary analyses suggest that this deficit is perhaps more a result of failures to perceive errors than of difficulties with executive control. PMID:23387525
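Post-error slowing as measured in studies of this kind is the mean correct-trial RT after errors minus the mean correct-trial RT after correct responses. A sketch with made-up trial data:

```python
import numpy as np

# Post-error slowing: mean RT on correct trials that follow an error,
# minus mean RT on correct trials that follow a correct response.
# Positive values indicate slowing. Trial data below are invented.
def post_error_slowing(rts, correct):
    rts = np.asarray(rts, dtype=float)
    correct = np.asarray(correct, dtype=bool)
    post_err, post_corr = [], []
    for i in range(1, len(rts)):
        if not correct[i]:
            continue                # score correct trials only
        (post_corr if correct[i - 1] else post_err).append(rts[i])
    return np.mean(post_err) - np.mean(post_corr)

rts     = [420, 390, 510, 400, 410, 495, 405]
correct = [1,   0,   1,   1,   1,   0,   1]
pes = post_error_slowing(rts, correct)   # (510+405)/2 - (400+410)/2 = 52.5
```

The study's supplementary distinction, the proportion of post-error trials slowed versus the magnitude of slowing, would require scoring each post-error trial individually rather than averaging as here.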

  17. Effects of Optical Blur Reduction on Equivalent Intrinsic Blur

    PubMed Central

    Valeshabad, Ali Kord; Wanek, Justin; McAnany, J. Jason; Shahidi, Mahnaz

    2015-01-01

    Purpose To determine the effect of optical blur reduction on equivalent intrinsic blur, an estimate of the blur within the visual system, by comparing optical and equivalent intrinsic blur before and after adaptive optics (AO) correction of wavefront error. Methods Twelve visually normal individuals (age; 31 ± 12 years) participated in this study. Equivalent intrinsic blur (σint) was derived using a previously described model. Optical blur (σopt) due to high-order aberrations was quantified by Shack-Hartmann aberrometry and minimized using AO correction of wavefront error. Results σopt and σint were significantly reduced and visual acuity (VA) was significantly improved after AO correction (P ≤ 0.004). Reductions in σopt and σint were linearly dependent on the values before AO correction (r ≥ 0.94, P ≤ 0.002). The reduction in σint was greater than the reduction in σopt, although it was marginally significant (P = 0.05). σint after AO correlated significantly with σint before AO (r = 0.92, P < 0.001) and the two parameters were related linearly with a slope of 0.46. Conclusions Reduction in equivalent intrinsic blur was greater than the reduction in optical blur due to AO correction of wavefront error. This finding implies that VA in subjects with high equivalent intrinsic blur can be improved beyond that expected from the reduction in optical blur alone. PMID:25785538

  18. Effects of optical blur reduction on equivalent intrinsic blur.

    PubMed

    Kord Valeshabad, Ali; Wanek, Justin; McAnany, J Jason; Shahidi, Mahnaz

    2015-04-01

    To determine the effect of optical blur reduction on equivalent intrinsic blur, an estimate of the blur within the visual system, by comparing optical and equivalent intrinsic blur before and after adaptive optics (AO) correction of wavefront error. Twelve visually normal subjects (mean [±SD] age, 31 [±12] years) participated in this study. Equivalent intrinsic blur (σint) was derived using a previously described model. Optical blur (σopt) caused by high-order aberrations was quantified by Shack-Hartmann aberrometry and minimized using AO correction of wavefront error. σopt and σint were significantly reduced and visual acuity was significantly improved after AO correction (p ≤ 0.004). Reductions in σopt and σint were linearly dependent on the values before AO correction (r ≥ 0.94, p ≤ 0.002). The reduction in σint was greater than the reduction in σopt, although it was marginally significant (p = 0.05). σint after AO correlated significantly with σint before AO (r = 0.92, p < 0.001), and the two parameters were related linearly with a slope of 0.46. Reduction in equivalent intrinsic blur was greater than the reduction in optical blur after AO correction of wavefront error. This finding implies that visual acuity in subjects with high equivalent intrinsic blur can be improved beyond that expected from the reduction in optical blur alone.
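The abstract does not spell out the blur model; one common assumption (used here purely for illustration, not attributed to the authors) is that blur sources add in quadrature, so the intrinsic component can be recovered once the optical component is known:

```python
import math

# Assumed quadrature model (not stated in the abstract): total perceived
# blur combines optical and intrinsic components as
# sigma_total^2 = sigma_opt^2 + sigma_int^2.
def intrinsic_blur(sigma_total, sigma_opt):
    return math.sqrt(sigma_total ** 2 - sigma_opt ** 2)

# Illustrative values: reducing sigma_opt with AO exposes sigma_int.
sigma_int = intrinsic_blur(sigma_total=2.5, sigma_opt=1.5)   # 2.0
```

Under such a model, shrinking σopt alone cannot explain the larger drop in σint the study reports, which is what motivates the authors' conclusion about gains beyond optical correction.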

  19. Reduced-rank approximations to the far-field transform in the gridded fast multipole method

    NASA Astrophysics Data System (ADS)

    Hesford, Andrew J.; Waag, Robert C.

    2011-05-01

    The fast multipole method (FMM) has been shown to have a reduced computational dependence on the size of finest-level groups of elements when the elements are positioned on a regular grid and FFT convolution is used to represent neighboring interactions. However, transformations between plane-wave expansions used for FMM interactions and pressure distributions used for neighboring interactions remain significant contributors to the cost of FMM computations when finest-level groups are large. The transformation operators, which are forward and inverse Fourier transforms with the wave space confined to the unit sphere, are smooth and well approximated using reduced-rank decompositions that further reduce the computational dependence of the FMM on finest-level group size. The adaptive cross approximation (ACA) is selected to represent the forward and adjoint far-field transformation operators required by the FMM. However, the actual error of the ACA is found to be greater than that predicted using traditional estimates, and the ACA generally performs worse than the approximation resulting from a truncated singular-value decomposition (SVD). To overcome these issues while avoiding the cost of a full-scale SVD, the ACA is employed with more stringent accuracy demands and recompressed using a reduced, truncated SVD. The results show a greatly reduced approximation error that performs comparably to the full-scale truncated SVD without degrading the asymptotic computational efficiency associated with ACA matrix assembly.
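The recompression step (a generous low-rank factorization tightened by a truncated SVD) can be sketched in a generic setting. The smooth Gaussian kernel and the crude column-sampling factorization below stand in for the far-field operator and the ACA, which are not reproduced here:

```python
import numpy as np

# Stand-in for the far-field operator: a smooth kernel, hence numerically
# low rank. Only the recompression step mirrors the paper; the initial
# factorization is a crude column-sampling substitute for the ACA.
rng = np.random.default_rng(1)
x = rng.uniform(size=(60, 1))
y = rng.uniform(size=(1, 80))
A = np.exp(-(x - y) ** 2)

def truncated_svd_recompress(U, V, tol=1e-10):
    """Recompress a low-rank factorization A ~ U @ V: QR both factors,
    SVD the small core, and truncate singular values below tol."""
    Qu, Ru = np.linalg.qr(U)
    Qv, Rv = np.linalg.qr(V.T)
    W, s, Zt = np.linalg.svd(Ru @ Rv.T)
    k = int(np.sum(s > tol * s[0]))
    return Qu @ (W[:, :k] * s[:k]), Zt[:k] @ Qv.T

# Overshoot the rank (25 columns), then recompress toward the numerical rank.
U = A[:, :25]
V = np.linalg.lstsq(U, A, rcond=None)[0]
U2, V2 = truncated_svd_recompress(U, V)
rel_err = np.linalg.norm(A - U2 @ V2) / np.linalg.norm(A)
```

The point, as in the paper, is that the SVD here acts only on small factor matrices, so the recompression avoids the cost of a full-scale SVD of the operator itself.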

  20. Reduced-Rank Approximations to the Far-Field Transform in the Gridded Fast Multipole Method.

    PubMed

    Hesford, Andrew J; Waag, Robert C

    2011-05-10

    The fast multipole method (FMM) has been shown to have a reduced computational dependence on the size of finest-level groups of elements when the elements are positioned on a regular grid and FFT convolution is used to represent neighboring interactions. However, transformations between plane-wave expansions used for FMM interactions and pressure distributions used for neighboring interactions remain significant contributors to the cost of FMM computations when finest-level groups are large. The transformation operators, which are forward and inverse Fourier transforms with the wave space confined to the unit sphere, are smooth and well approximated using reduced-rank decompositions that further reduce the computational dependence of the FMM on finest-level group size. The adaptive cross approximation (ACA) is selected to represent the forward and adjoint far-field transformation operators required by the FMM. However, the actual error of the ACA is found to be greater than that predicted using traditional estimates, and the ACA generally performs worse than the approximation resulting from a truncated singular-value decomposition (SVD). To overcome these issues while avoiding the cost of a full-scale SVD, the ACA is employed with more stringent accuracy demands and recompressed using a reduced, truncated SVD. The results show a greatly reduced approximation error that performs comparably to the full-scale truncated SVD without degrading the asymptotic computational efficiency associated with ACA matrix assembly.

  1. Reduced-Rank Approximations to the Far-Field Transform in the Gridded Fast Multipole Method

    PubMed Central

    Hesford, Andrew J.; Waag, Robert C.

    2011-01-01

    The fast multipole method (FMM) has been shown to have a reduced computational dependence on the size of finest-level groups of elements when the elements are positioned on a regular grid and FFT convolution is used to represent neighboring interactions. However, transformations between plane-wave expansions used for FMM interactions and pressure distributions used for neighboring interactions remain significant contributors to the cost of FMM computations when finest-level groups are large. The transformation operators, which are forward and inverse Fourier transforms with the wave space confined to the unit sphere, are smooth and well approximated using reduced-rank decompositions that further reduce the computational dependence of the FMM on finest-level group size. The adaptive cross approximation (ACA) is selected to represent the forward and adjoint far-field transformation operators required by the FMM. However, the actual error of the ACA is found to be greater than that predicted using traditional estimates, and the ACA generally performs worse than the approximation resulting from a truncated singular-value decomposition (SVD). To overcome these issues while avoiding the cost of a full-scale SVD, the ACA is employed with more stringent accuracy demands and recompressed using a reduced, truncated SVD. The results show a greatly reduced approximation error that performs comparably to the full-scale truncated SVD without degrading the asymptotic computational efficiency associated with ACA matrix assembly. PMID:21552350

  2. Interdisciplinary Coordination Reviews: A Process to Reduce Construction Costs.

    ERIC Educational Resources Information Center

    Fewell, Dennis A.

    1998-01-01

    Interdisciplinary Coordination design review is instrumental in detecting coordination errors and omissions in construction documents. Cleansing construction documents of interdisciplinary coordination errors reduces time extensions, the largest source of change orders, and limits exposure to liability claims. Improving the quality of design…

  3. Balancing aggregation and smoothing errors in inverse models

    DOE PAGES

    Turner, A. J.; Jacob, D. J.

    2015-06-30

Inverse models use observations of a system (observation vector) to quantify the variables driving that system (state vector) by statistical optimization. When the observation vector is large, such as with satellite data, selecting a suitable dimension for the state vector is a challenge. A state vector that is too large cannot be effectively constrained by the observations, leading to smoothing error. However, reducing the dimension of the state vector leads to aggregation error as prior relationships between state vector elements are imposed rather than optimized. Here we present a method for quantifying aggregation and smoothing errors as a function of state vector dimension, so that a suitable dimension can be selected by minimizing the combined error. Reducing the state vector within the aggregation error constraints can have the added advantage of enabling analytical solution to the inverse problem with full error characterization. We compare three methods for reducing the dimension of the state vector from its native resolution: (1) merging adjacent elements (grid coarsening), (2) clustering with principal component analysis (PCA), and (3) applying a Gaussian mixture model (GMM) with Gaussian pdfs as state vector elements on which the native-resolution state vector elements are projected using radial basis functions (RBFs). The GMM method leads to somewhat lower aggregation error than the other methods, but more importantly it retains resolution of major local features in the state vector while smoothing weak and broad features.

  4. Balancing aggregation and smoothing errors in inverse models

    NASA Astrophysics Data System (ADS)

    Turner, A. J.; Jacob, D. J.

    2015-01-01

    Inverse models use observations of a system (observation vector) to quantify the variables driving that system (state vector) by statistical optimization. When the observation vector is large, such as with satellite data, selecting a suitable dimension for the state vector is a challenge. A state vector that is too large cannot be effectively constrained by the observations, leading to smoothing error. However, reducing the dimension of the state vector leads to aggregation error as prior relationships between state vector elements are imposed rather than optimized. Here we present a method for quantifying aggregation and smoothing errors as a function of state vector dimension, so that a suitable dimension can be selected by minimizing the combined error. Reducing the state vector within the aggregation error constraints can have the added advantage of enabling analytical solution to the inverse problem with full error characterization. We compare three methods for reducing the dimension of the state vector from its native resolution: (1) merging adjacent elements (grid coarsening), (2) clustering with principal component analysis (PCA), and (3) applying a Gaussian mixture model (GMM) with Gaussian pdfs as state vector elements on which the native-resolution state vector elements are projected using radial basis functions (RBFs). The GMM method leads to somewhat lower aggregation error than the other methods, but more importantly it retains resolution of major local features in the state vector while smoothing weak and broad features.

  5. Balancing aggregation and smoothing errors in inverse models

    NASA Astrophysics Data System (ADS)

    Turner, A. J.; Jacob, D. J.

    2015-06-01

    Inverse models use observations of a system (observation vector) to quantify the variables driving that system (state vector) by statistical optimization. When the observation vector is large, such as with satellite data, selecting a suitable dimension for the state vector is a challenge. A state vector that is too large cannot be effectively constrained by the observations, leading to smoothing error. However, reducing the dimension of the state vector leads to aggregation error as prior relationships between state vector elements are imposed rather than optimized. Here we present a method for quantifying aggregation and smoothing errors as a function of state vector dimension, so that a suitable dimension can be selected by minimizing the combined error. Reducing the state vector within the aggregation error constraints can have the added advantage of enabling analytical solution to the inverse problem with full error characterization. We compare three methods for reducing the dimension of the state vector from its native resolution: (1) merging adjacent elements (grid coarsening), (2) clustering with principal component analysis (PCA), and (3) applying a Gaussian mixture model (GMM) with Gaussian pdfs as state vector elements on which the native-resolution state vector elements are projected using radial basis functions (RBFs). The GMM method leads to somewhat lower aggregation error than the other methods, but more importantly it retains resolution of major local features in the state vector while smoothing weak and broad features.

  6. Improving laboratory data entry quality using Six Sigma.

    PubMed

    Elbireer, Ali; Le Chasseur, Julie; Jackson, Brooks

    2013-01-01

    Makerere University in Uganda provides clinical laboratory support to over 70 clients. With increased volume, manual data entry errors have steadily increased, prompting laboratory managers to employ the Six Sigma method to evaluate and reduce these errors. The purpose of this paper is to describe how laboratory data entry quality was improved by using Six Sigma. The Six Sigma Quality Improvement (QI) project team followed a sequence of steps, starting with defining project goals, measuring data entry errors to assess current performance, analyzing data and determining data-entry error root causes. Finally the team implemented changes and control measures to address the root causes and to maintain improvements. Establishing the Six Sigma project required considerable resources and maintaining the gains requires additional personnel time and dedicated resources. After initiating the Six Sigma project, there was a 60.5 percent reduction in data entry errors from 423 errors a month (i.e. 4.34 Six Sigma) in the first month, down to an average 166 errors/month (i.e. 4.65 Six Sigma) over 12 months. The team estimated the average cost of identifying and fixing a data entry error to be $16.25 per error. Thus, reducing errors by an average of 257 errors per month over one year has saved the laboratory an estimated $50,115 a year. The Six Sigma QI project provides a replicable framework for Ugandan laboratory staff and other resource-limited organizations to promote a quality environment. Laboratory staff can deliver excellent care at a lower cost, by applying QI principles. This innovative QI method of reducing data entry errors in medical laboratories may improve clinical workflow processes and yield cost savings across the health care continuum.
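
    The sigma levels quoted above follow from the defect rate under the conventional 1.5-sigma shift. A small sketch: the monthly opportunity count of 187,000 is a back-calculated assumption (the abstract does not report it), chosen so the baseline error count reproduces the reported ~4.34 sigma.

```python
from statistics import NormalDist

def sigma_level(defects, opportunities, shift=1.5):
    """Long-term process sigma implied by a defect count, using the
    conventional 1.5-sigma shift between short- and long-term sigma."""
    dpmo = 1_000_000 * defects / opportunities  # defects per million opportunities
    return NormalDist().inv_cdf(1.0 - dpmo / 1_000_000) + shift

# Assumed monthly data-entry volume (hypothetical; see lead-in)
entries_per_month = 187_000
baseline = sigma_level(423, entries_per_month)   # ~4.34
improved = sigma_level(166, entries_per_month)   # ~4.62
```

    With this fixed volume the improved process works out to ~4.62 sigma rather than the abstract's 4.65, which suggests the laboratory's monthly opportunity counts varied over the 12 months.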

  7. Erratum: Binary neutron stars with arbitrary spins in numerical relativity [Phys. Rev. D 92, 124012 (2015)

    NASA Astrophysics Data System (ADS)

    Tacik, Nick; Foucart, Francois; Pfeiffer, Harald P.; Haas, Roland; Ossokine, Serguei; Kaplan, Jeff; Muhlberger, Curran; Duez, Matt D.; Kidder, Lawrence E.; Scheel, Mark A.; Szilágyi, Béla

    2016-08-01

    The code used in [Phys. Rev. D 92, 124012 (2015)] erroneously computed the enthalpy at the center of the neutron stars. Upon correcting this error, density oscillations in evolutions of rotating neutron stars are significantly reduced (from ~20% to ~0.5%). Furthermore, it is possible to construct neutron stars with faster rotation rates.

  8. Role of the pharmacist in reducing healthcare costs: current insights

    PubMed Central

    Dalton, Kieran; Byrne, Stephen

    2017-01-01

    Global healthcare expenditure is escalating at an unsustainable rate. Money spent on medicines and managing medication-related problems continues to grow. The high prevalence of medication errors and inappropriate prescribing is a major issue within healthcare systems, and can often contribute to adverse drug events, many of which are preventable. As a result, there is a huge opportunity for pharmacists to have a significant impact on reducing healthcare costs, as they have the expertise to detect, resolve, and prevent medication errors and medication-related problems. The development of clinical pharmacy practice in recent decades has resulted in an increased number of pharmacists working in clinically advanced roles worldwide. Pharmacist-provided services and clinical interventions have been shown to reduce the risk of potential adverse drug events and improve patient outcomes, and the majority of published studies show that these pharmacist activities are cost-effective or have a good cost:benefit ratio. This review demonstrates that pharmacists can contribute to substantial healthcare savings across a variety of settings. However, there is a paucity of evidence in the literature highlighting the specific aspects of pharmacists’ work which are the most effective and cost-effective. Future high-quality economic evaluations with robust methodologies and study design are required to investigate which pharmacist services have significant clinical benefits for patients and deliver the greatest cost savings for healthcare budgets. PMID:29354549

  9. Perceived vs. measured effects of advanced cockpit systems on pilot workload and error: are pilots' beliefs misaligned with reality?

    PubMed

    Casner, Stephen M

    2009-05-01

    Four types of advanced cockpit systems were tested in an in-flight experiment for their effect on pilot workload and error. Twelve experienced pilots flew conventional cockpit and advanced cockpit versions of the same make and model airplane. In both airplanes, the experimenter dictated selected combinations of cockpit systems for each pilot to use while soliciting subjective workload measures and recording any errors that pilots made. The results indicate that the use of a GPS navigation computer helped reduce workload and errors during some phases of flight but raised them in others. Autopilots helped reduce some aspects of workload in the advanced cockpit airplane but did not appear to reduce workload in the conventional cockpit. Electronic flight and navigation instruments appeared to have no effect on workload or error. Despite this modest showing for advanced cockpit systems, pilots stated an overwhelming preference for using them during all phases of flight.

  10. ‘Why should I care?’ Challenging free will attenuates neural reaction to errors

    PubMed Central

    Pourtois, Gilles; Brass, Marcel

    2015-01-01

    Whether human beings have free will has been a philosophical question for centuries. The debate about free will has recently entered the public arena through mass media and newspaper articles commenting on scientific findings that leave little to no room for free will. Previous research has shown that encouraging such a deterministic perspective influences behavior, namely by promoting cursory and antisocial behavior. Here we propose that such behavioral changes may, at least partly, stem from a more basic neurocognitive process related to response monitoring, namely a reduced error detection mechanism. Our results show that the error-related negativity, a neural marker of error detection, was reduced in individuals led to disbelieve in free will. This finding shows that reducing the belief in free will has a specific impact on error detection mechanisms. More generally, it suggests that abstract beliefs about intentional control can influence basic and automatic processes related to action control. PMID:24795441

  11. Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.

    PubMed

    Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun

    2018-01-01

    Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) is used to identify and solve problems. The laboratory turnaround time for individual tests, total delay time in the sample reception area, and percentage of steps involving risks of medical errors and biological hazards in the overall process are measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.

  12. Fast Video Encryption Using the H.264 Error Propagation Property for Smart Mobile Devices

    PubMed Central

    Chung, Yongwha; Lee, Sungju; Jeon, Taewoong; Park, Daihee

    2015-01-01

    In transmitting video data securely over Video Sensor Networks (VSNs), since mobile handheld devices have limited resources in terms of processor clock speed and battery size, it is necessary to develop an efficient method to encrypt video data to meet the increasing demand for secure connections. Selective encryption methods can reduce the amount of computation needed while satisfying high-level security requirements. This is achieved by selecting an important part of the video data and encrypting it. In this paper, to ensure format compliance and security, we propose a special encryption method for H.264, which encrypts only the DC/ACs of I-macroblocks and the motion vectors of P-macroblocks. In particular, the proposed new selective encryption method exploits the error propagation property in an H.264 decoder and improves the collective performance by analyzing the tradeoff between the visual security level and the processing speed compared to typical selective encryption methods (i.e., I-frame, P-frame encryption, and combined I-/P-frame encryption). Experimental results show that the proposed method can significantly reduce the encryption workload without any significant degradation of visual security. PMID:25850068
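
    The idea of selective encryption can be sketched independently of H.264 bitstream details: encrypt only the byte ranges carrying perceptually critical data (the DC/AC coefficients of I-macroblocks and the motion vectors of P-macroblocks in the paper) and leave the rest untouched, so the stream stays format-compliant. The keystream construction below is a stand-in for illustration only; it is neither the paper's cipher nor production-grade cryptography.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """SHA-256 in counter mode as a stand-in stream cipher
    (illustration only, not production crypto)."""
    out = bytearray()
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(out[:n])

def selective_encrypt(data: bytes, ranges, key: bytes) -> bytes:
    """XOR-encrypt only the byte ranges flagged as important;
    everything outside the ranges is left untouched, so the
    surrounding bitstream remains parseable."""
    buf = bytearray(data)
    for start, end in ranges:
        ks = keystream(key + start.to_bytes(8, "big"), end - start)
        for i in range(start, end):
            buf[i] ^= ks[i - start]
    return bytes(buf)
```

    Because a stream-cipher XOR is its own inverse, applying the same function again decrypts, and the encryption workload scales with the selected fraction rather than the full stream size, which is the source of the speedup the paper measures.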

  13. Time accurate application of the MacCormack 2-4 scheme on massively parallel computers

    NASA Technical Reports Server (NTRS)

    Hudson, Dale A.; Long, Lyle N.

    1995-01-01

    Many recent computational efforts in turbulence and acoustics research have used higher order numerical algorithms. One popular method has been the explicit MacCormack 2-4 scheme. The MacCormack 2-4 scheme is second order accurate in time and fourth order accurate in space, and is stable for CFL numbers below 2/3. Current research has shown that the method can give accurate results but does exhibit significant Gibbs phenomena at sharp discontinuities. The impact of adding Jameson type second, third, and fourth order artificial viscosity was examined here. Category 2 problems, the nonlinear traveling wave and the Riemann problem, were computed using a CFL number of 0.25. This research has found that dispersion errors can be significantly reduced or nearly eliminated by using a combination of second and third order terms in the damping. Use of second and fourth order terms reduced the magnitude of dispersion errors but not as effectively as the second and third order combination. The program was coded using Thinking Machines' CM Fortran, a variant of Fortran 90/High Performance Fortran, and was executed on a 2K CM-200. Simple extrapolation boundary conditions were used for both problems.
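
    For reference, a minimal one-dimensional sketch of the 2-4 scheme in the common Gottlieb-Turkel form, applied to linear advection on a periodic grid (the Jameson-type artificial-viscosity terms studied in the paper are omitted, and the grid and CFL number are illustrative):

```python
import numpy as np

def maccormack_24_step(u, lam):
    """One step of the 2-4 MacCormack scheme (Gottlieb-Turkel form)
    for linear advection u_t + u_x = 0 on a periodic grid.
    lam = c*dt/dx must stay below 2/3 for stability."""
    # Predictor: one-sided forward differences
    up = u - (lam / 6.0) * (7.0 * (np.roll(u, -1) - u)
                            - (np.roll(u, -2) - np.roll(u, -1)))
    # Corrector: mirrored backward differences
    return 0.5 * (u + up - (lam / 6.0) * (7.0 * (up - np.roll(up, 1))
                                          - (np.roll(up, 1) - np.roll(up, 2))))

# Advect a smooth wave once around a periodic domain at CFL = 0.25
n, cfl = 100, 0.25
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.sin(2 * np.pi * x)
steps = int(round(n / cfl))          # one full period: c * t = 1
for _ in range(steps):
    u = maccormack_24_step(u, cfl)
err = np.max(np.abs(u - np.sin(2 * np.pi * x)))
```

    Averaging the forward one-sided difference of the predictor with the backward one of the corrector recovers the fourth-order central stencil (-f[i+2] + 8f[i+1] - 8f[i-1] + f[i-2])/(12 dx), which is where the scheme's fourth-order spatial accuracy comes from.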

  14. Adaptive Control of Small Outboard-Powered Boats for Survey Applications

    NASA Technical Reports Server (NTRS)

    VanZwieten, T.S.; VanZwieten, J.H.; Fisher, A.D.

    2009-01-01

    Four autopilot controllers have been developed in this work that can both hold a desired heading and follow a straight line. These PID, adaptive PID, neuro-adaptive, and adaptive augmenting control algorithms have all been implemented into a numerical simulation of a 33-foot center console vessel with wind, waves, and current disturbances acting in the perpendicular (across-track) direction of the boat's desired trajectory. Each controller is tested for its ability to follow a desired heading in the presence of these disturbances and then to follow a straight line at two different throttle settings for the same disturbances. These controllers were tuned for an input thrust of 2000 N and all four controllers showed good performance with none of the controllers significantly outperforming the others when holding a constant heading and following a straight line at this engine thrust. Each controller was then tested for a reduced engine thrust of 1200 N per engine where each of the three adaptive controllers reduced heading error and across-track error by approximately 50% after a 300 second tuning period when compared to the fixed gain PID, showing that significant robustness to changes in throttle setting was gained by using an adaptive algorithm.
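
    As an illustration of the fixed-gain baseline, here is a toy PID heading hold driving a crude first-order yaw model with a constant disturbance; the gains, dynamics, and disturbance magnitude are invented for the sketch and are not the paper's vessel model.

```python
class HeadingPID:
    """Fixed-gain PID heading controller (illustrative gains only)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = None

    def rudder(self, desired_deg, heading_deg):
        # Wrap the heading error into [-180, 180) degrees
        err = (desired_deg - heading_deg + 180.0) % 360.0 - 180.0
        self.integral += err * self.dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Crude yaw model: rudder command drives yaw acceleration, with linear
# damping and a constant disturbance standing in for wind/current
dt, heading, rate = 0.1, 0.0, 0.0
pid = HeadingPID(kp=0.8, ki=0.05, kd=1.2, dt=dt)
for _ in range(3000):                      # 300 s of simulated time
    u = pid.rudder(90.0, heading)
    rate += (u - 0.5 * rate + 2.0) * dt    # +2.0 deg/s^2 disturbance
    heading += rate * dt
```

    The integral term is what removes the steady-state offset the constant disturbance would otherwise leave; the adaptive variants in the paper go further by retuning gains online when the thrust, and hence the plant gain, changes.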

  15. Effect of bar-code technology on the safety of medication administration.

    PubMed

    Poon, Eric G; Keohane, Carol A; Yoon, Catherine S; Ditmore, Matthew; Bane, Anne; Levtzion-Korach, Osnat; Moniz, Thomas; Rothschild, Jeffrey M; Kachalia, Allen B; Hayes, Judy; Churchill, William W; Lipsitz, Stuart; Whittemore, Anthony D; Bates, David W; Gandhi, Tejal K

    2010-05-06

    Serious medication errors are common in hospitals and often occur during order transcription or administration of medication. To help prevent such errors, technology has been developed to verify medications by incorporating bar-code verification technology within an electronic medication-administration system (bar-code eMAR). We conducted a before-and-after, quasi-experimental study in an academic medical center that was implementing the bar-code eMAR. We assessed rates of errors in order transcription and medication administration on units before and after implementation of the bar-code eMAR. Errors that involved early or late administration of medications were classified as timing errors and all others as nontiming errors. Two clinicians reviewed the errors to determine their potential to harm patients and classified those that could be harmful as potential adverse drug events. We observed 14,041 medication administrations and reviewed 3082 order transcriptions. Observers noted 776 nontiming errors in medication administration on units that did not use the bar-code eMAR (an 11.5% error rate) versus 495 such errors on units that did use it (a 6.8% error rate)--a 41.4% relative reduction in errors (P<0.001). The rate of potential adverse drug events (other than those associated with timing errors) fell from 3.1% without the use of the bar-code eMAR to 1.6% with its use, representing a 50.8% relative reduction (P<0.001). The rate of timing errors in medication administration fell by 27.3% (P<0.001), but the rate of potential adverse drug events associated with timing errors did not change significantly. Transcription errors occurred at a rate of 6.1% on units that did not use the bar-code eMAR but were completely eliminated on units that did use it. Use of the bar-code eMAR substantially reduced the rate of errors in order transcription and in medication administration as well as potential adverse drug events, although it did not eliminate such errors. 
Our data show that the bar-code eMAR is an important intervention to improve medication safety. (ClinicalTrials.gov number, NCT00243373.) 2010 Massachusetts Medical Society

  16. Thirty Years of Improving the NCEP Global Forecast System

    NASA Astrophysics Data System (ADS)

    White, G. H.; Manikin, G.; Yang, F.

    2014-12-01

    Current eight day forecasts by the NCEP Global Forecast System are as accurate as five day forecasts 30 years ago. This revolution in weather forecasting reflects increases in computer power, improvements in the assimilation of observations, especially satellite data, improvements in model physics, improvements in observations and international cooperation and competition. One important component has been and is the diagnosis, evaluation and reduction of systematic errors. The effect of proposed improvements in the GFS on systematic errors is one component of the thorough testing of such improvements by the Global Climate and Weather Modeling Branch. Examples of reductions in systematic errors in zonal mean temperatures and winds and other fields will be presented. One challenge in evaluating systematic errors is uncertainty in what reality is. Model initial states can be regarded as the best overall depiction of the atmosphere, but can be misleading in areas of few observations or for fields not well observed such as humidity or precipitation over the oceans. Verification of model physics is particularly difficult. The Environmental Modeling Center emphasizes the evaluation of systematic biases against observations. Recently EMC has placed greater emphasis on synoptic evaluation and on precipitation, 2-meter temperatures and dew points and 10 meter winds. A weekly EMC map discussion reviews the performance of many models over the United States and has helped diagnose and alleviate significant systematic errors in the GFS, including a near surface summertime evening cold wet bias over the eastern US and a multi-week period when the GFS persistently developed bogus tropical storms off Central America. The GFS exhibits a wet bias for light rain and a dry bias for moderate to heavy rain over the continental United States. Significant changes to the GFS are scheduled to be implemented in the fall of 2014. 
These include higher resolution, improved physics and improvements to the assimilation. These changes significantly improve the tropospheric flow and reduce a tropical upper tropospheric warm bias. One important error remaining is the failure of the GFS to maintain deep convection over Indonesia and in the tropical west Pacific. This and other current systematic errors will be presented.

  17. Easing The Calculation Of Bolt-Circle Coordinates

    NASA Technical Reports Server (NTRS)

    Burley, Richard K.

    1995-01-01

    The Bolt Circle Calculation (BOLT-CALC) computer program reduces the significant time consumed in manually computing the trigonometry of the rectangular Cartesian coordinates of holes in a bolt circle as shown on a blueprint or drawing. It eliminates the risk of computational errors, particularly in cases involving many holes or coordinates expressed to many significant digits. The program assists in many practical situations arising in machine shops. Written in BASIC; also successfully compiled and implemented using Microsoft's QuickBasic v4.0.
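
    The underlying trigonometry is simple enough to sketch (in Python here rather than the program's BASIC; the function and parameter names are illustrative, not BOLT-CALC's):

```python
import math

def bolt_circle(n_holes, diameter, cx=0.0, cy=0.0, start_deg=0.0):
    """Cartesian coordinates of n equally spaced holes on a bolt
    circle of the given diameter, centred at (cx, cy), with the
    first hole at start_deg measured from the +x axis."""
    r = diameter / 2.0
    coords = []
    for k in range(n_holes):
        theta = math.radians(start_deg + 360.0 * k / n_holes)
        coords.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    return coords
```

    For example, a 4-hole pattern on a 2-unit bolt circle centred at the origin yields holes at (1, 0), (0, 1), (-1, 0), and (0, -1), to within floating-point rounding.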

  18. Effects of Error Correction during Assessment Probes on the Acquisition of Sight Words for Students with Moderate Intellectual Disabilities

    ERIC Educational Resources Information Center

    Waugh, Rebecca E.

    2010-01-01

    Simultaneous prompting is an errorless learning strategy designed to reduce the number of errors students make; however, research has shown a disparity in the number of errors students make during instructional versus probe trials. This study directly examined the effects of error correction versus no error correction during probe trials on the…

  19. Effects of Error Correction during Assessment Probes on the Acquisition of Sight Words for Students with Moderate Intellectual Disabilities

    ERIC Educational Resources Information Center

    Waugh, Rebecca E.; Alberto, Paul A.; Fredrick, Laura D.

    2011-01-01

    Simultaneous prompting is an errorless learning strategy designed to reduce the number of errors students make; however, research has shown a disparity in the number of errors students make during instructional versus probe trials. This study directly examined the effects of error correction versus no error correction during probe trials on the…

  20. Workflow interruptions, cognitive failure and near-accidents in health care.

    PubMed

    Elfering, Achim; Grebner, Simone; Ebener, Corinne

    2015-01-01

    Errors are frequent in health care. A specific model was tested that affirms failure in cognitive action regulation to mediate the influence of nurses' workflow interruptions and safety conscientiousness on near-accidents in health care. One hundred and sixty-five nurses from seven Swiss hospitals participated in a questionnaire survey. Structural equation modelling confirmed the hypothesised mediation model. Cognitive failure in action regulation significantly mediated the influence of workflow interruptions on near-accidents (p < .05). An indirect path from conscientiousness to near-accidents via cognitive failure in action regulation was also significant (p < .05). Compliance with safety regulations was significantly related to cognitive failure and near-accidents; moreover, cognitive failure mediated the association between compliance and near-accidents (p < .05). Contrary to expectations, compliance with safety regulations was not related to workflow interruptions. Workflow interruptions caused by colleagues, patients and organisational constraints are likely to trigger errors in nursing. Work redesign is recommended to reduce cognitive failure and improve safety of nurses and patients.

  1. Tropospheric Correction for InSAR Using Interpolated ECMWF Data and GPS Zenith Total Delay

    NASA Technical Reports Server (NTRS)

    Webb, Frank H.; Fishbein, Evan F.; Moore, Angelyn W.; Owen, Susan E.; Fielding, Eric J.; Granger, Stephanie L.; Bjorndahl, Fredrik; Lofgren, Johan

    2011-01-01

    To mitigate atmospheric errors caused by the troposphere, which is a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging, a tropospheric correction method has been developed using data from the European Centre for Medium-Range Weather Forecasts (ECMWF) and the Global Positioning System (GPS). The ECMWF data was interpolated using a Stretched Boundary Layer Model (SBLM), and ground-based GPS estimates of the tropospheric delay from the Southern California Integrated GPS Network were interpolated using modified Gaussian and inverse distance weighted interpolations. The resulting Zenith Total Delay (ZTD) correction maps have been evaluated, both separately and using a combination of the two data sets, for three short-interval InSAR pairs from Envisat during 2006 on an area stretching northeast from the Los Angeles basin towards Death Valley. Results show that the root mean square (rms) in the InSAR images was greatly reduced, meaning a significant reduction in the atmospheric noise of up to 32 percent. However, for some of the images, the rms increased and large errors remained after applying the tropospheric correction. The residuals showed a constant gradient over the area, suggesting that a remaining orbit error from Envisat was present. The orbit reprocessing in ROI_pac and the plane fitting both require that the only remaining error in the InSAR image be the orbit error. If this is not fulfilled, the correction can be made anyway, but it will be done using all remaining errors assuming them to be orbit errors. By correcting for tropospheric noise, the biggest error source is removed, and the orbit error becomes apparent and can be corrected for.
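
    The inverse-distance-weighted (IDW) step can be sketched generically; the station layout, values, and power parameter below are illustrative, and the study's modified-Gaussian variant is not reproduced.

```python
import numpy as np

def idw(stations, values, grid_pts, power=2.0, eps=1e-12):
    """Interpolate zenith total delay (ZTD) samples from GPS station
    locations onto image grid points by inverse-distance weighting."""
    stations = np.asarray(stations, float)   # (k, 2) station coordinates
    values = np.asarray(values, float)       # (k,)  ZTD at each station
    grid_pts = np.asarray(grid_pts, float)   # (m, 2) points to fill
    d = np.linalg.norm(grid_pts[:, None, :] - stations[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)             # eps keeps on-station points finite
    return (w @ values) / w.sum(axis=1)
```

    Subtracting the interpolated ZTD map, converted to the radar line of sight, from the interferogram is what removes the tropospheric phase screen and exposes the residual orbit error described above.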

  2. Geolocation error tracking of ZY-3 three line cameras

    NASA Astrophysics Data System (ADS)

    Pan, Hongbo

    2017-01-01

    The high-accuracy geolocation of high-resolution satellite images (HRSIs) is a key issue for mapping and integrating multi-temporal, multi-sensor images. In this manuscript, we propose a new geometric frame for analysing the geometric error of a stereo HRSI, in which the geolocation error can be divided into three parts: the epipolar direction, cross base direction, and height direction. With this frame, we proved that the height error of three line cameras (TLCs) is independent of nadir images, and that the terrain effect has a limited impact on the geolocation errors. For ZY-3 error sources, the drift error in both the pitch and roll angle and its influence on the geolocation accuracy are analysed. Epipolar and common tie-point constraints are proposed to study the bundle adjustment of HRSIs. Epipolar constraints explain that the relative orientation can reduce the number of compensation parameters in the cross base direction and have a limited impact on the height accuracy. The common tie points adjust the pitch-angle errors to be consistent with each other for TLCs. Therefore, free-net bundle adjustment of a single strip cannot significantly improve the geolocation accuracy. Furthermore, the epipolar and common tie-point constraints cause the error to propagate into the adjacent strip when multiple strips are involved in the bundle adjustment, which results in the same attitude uncertainty throughout the whole block. Two adjacent strips (Orbit 305 and Orbit 381, covering 7 and 12 standard scenes, respectively) and 308 ground control points (GCPs) were used for the experiments. The experiments validate the aforementioned theory. The planimetric and height root mean square errors were 2.09 and 1.28 m, respectively, when two GCPs were placed at the beginning and end of the block.

  3. Electronic witness system in IVF-patients perspective.

    PubMed

    Forte, Marina; Faustini, Federica; Maggiulli, Roberta; Scarica, Catello; Romano, Stefania; Ottolini, Christian; Farcomeni, Alessio; Palagiano, Antonio; Capalbo, Antonio; Ubaldi, Filippo Maria; Rienzi, Laura

    2016-09-01

    The objective of this study is to evaluate patient concerns about in vitro fertilization (IVF) errors and their satisfaction with electronic witness systems (EWS). The design of this study is a prospective single-center cohort study, set in a private IVF center. Four hundred eight infertile patients attending an IVF cycle at a GENERA center in Italy were equipped with an EWS. Although generally recognized as a very rare event in IVF, biological sample mix-up has been reported in the literature. For this reason, some IVF laboratories have introduced EWS with the aim to further reduce the risk of error during biological sample handling. Participating patients received a questionnaire developed through a Likert scale ranging from 1 to 6. Patient concerns about sample mix-up without and with an EWS were assessed. 90.4 % of patients expressed significant concerns relating to sample mix-up. The EWS reduced these concerns in 92.1 % of patients, 97.1 % of which were particularly satisfied with the electronic traceability of their gametes and embryos in the IVF laboratory. 97.1 % of patients felt highly comfortable with an IVF center equipped with an EWS. Female patients had a significantly higher appreciation of the EWS when compared to their male partners (p = 0.029). A significant mix-up event occurred in an Italian hospital during the study, and patients' satisfaction with the EWS increased significantly after the event (p = 0.032). The EWS, by sensibly reducing the risk of sample mix-up in IVF cycles, proved to be a trusted strategy from the patients' perspective.

  4. Colour and spatial cueing in low-prevalence visual search.

    PubMed

    Russell, Nicholas C C; Kunar, Melina A

    2012-01-01

    In visual search, 30-40% of targets with a prevalence rate of 2% are missed, compared to 7% of targets with a prevalence rate of 50% (Wolfe, Horowitz, & Kenner, 2005). This "low-prevalence" (LP) effect is thought to occur as participants are making motor errors, changing their response criteria, and/or quitting their search too soon. We investigate whether colour and spatial cues, known to improve visual search when the target has a high prevalence (HP), benefit search when the target is rare. Experiments 1 and 2 showed that although knowledge of the target's colour reduces miss errors overall, it does not eliminate the LP effect as more targets were missed at LP than at HP. Furthermore, detection of a rare target is significantly impaired if it appears in an unexpected colour-more so than if the prevalence of the target is high (Experiment 2). Experiment 3 showed that, if a rare target is exogenously cued, target detection is improved but still impaired relative to high-prevalence conditions. Furthermore, if the cue is absent or invalid, the percentage of missed targets increases. Participants were given the option to correct motor errors in all three experiments, which reduced but did not eliminate the LP effect. The results suggest that although valid colour and spatial cues improve target detection, participants still miss more targets at LP than at HP. Furthermore, invalid cues at LP are very costly in terms of miss errors. We discuss our findings in relation to current theories and applications of LP search.

  5. Utilizing Adjoint-Based Error Estimates for Surrogate Models to Accurately Predict Probabilities of Events

    DOE PAGES

    Butler, Troy; Wildey, Timothy

    2018-01-01

    In this study, we develop a procedure to utilize error estimates for samples of a surrogate model to compute robust upper and lower bounds on estimates of probabilities of events. We show that these error estimates can also be used in an adaptive algorithm to simultaneously reduce the computational cost and increase the accuracy in estimating probabilities of events using computationally expensive high-fidelity models. Specifically, we introduce the notion of reliability of a sample of a surrogate model, and we prove that utilizing the surrogate model for the reliable samples and the high-fidelity model for the unreliable samples gives precisely the same estimate of the probability of the output event as would be obtained by evaluation of the original model for each sample. The adaptive algorithm uses the additional evaluations of the high-fidelity model for the unreliable samples to locally improve the surrogate model near the limit state, which significantly reduces the number of high-fidelity model evaluations as the limit state is resolved. Numerical results based on a recently developed adjoint-based approach for estimating the error in samples of a surrogate are provided to demonstrate (1) the robustness of the bounds on the probability of an event, and (2) that the adaptive enhancement algorithm provides a more accurate estimate of the probability of the quantity-of-interest (QoI) event than standard response surface approximation methods at a lower computational cost.
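The reliability criterion described in the abstract can be sketched in a few lines: use the surrogate wherever its error estimate cannot move a sample across the limit state, and fall back to the high-fidelity model otherwise. The models and the error estimator below are illustrative stand-ins, not the paper's (which uses adjoint-based estimates); the equality check mirrors the theorem stated above.

```python
import numpy as np

# Hypothetical 1-D problem: the event is {q(x) > q_crit}. All names and
# models here are illustrative, not from the paper.
def high_fidelity(x):
    return np.sin(3.0 * x)          # stand-in for the expensive model

def surrogate(x):
    return 3.0 * x - 4.5 * x**3     # cheap polynomial approximation

def error_estimate(x):
    # Stand-in for an adjoint-based estimate of the surrogate's error;
    # taken exact here so the reliability argument is easy to see.
    return np.abs(high_fidelity(x) - surrogate(x))

q_crit = 0.5
rng = np.random.default_rng(0)
samples = rng.uniform(-1.0, 1.0, 10_000)

q_s = surrogate(samples)
err = error_estimate(samples)

# A sample is "reliable" when the error bound cannot flip its side of the
# limit state q = q_crit; otherwise evaluate the high-fidelity model.
reliable = np.abs(q_s - q_crit) > err
q = np.where(reliable, q_s, high_fidelity(samples))

p_event = np.mean(q > q_crit)
p_exact = np.mean(high_fidelity(samples) > q_crit)
assert p_event == p_exact           # identical to the all-high-fidelity estimate
print(p_event, np.mean(~reliable))  # event probability, high-fidelity fraction
```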

  6. Utilizing Adjoint-Based Error Estimates for Surrogate Models to Accurately Predict Probabilities of Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, Troy; Wildey, Timothy

    In this study, we develop a procedure to utilize error estimates for samples of a surrogate model to compute robust upper and lower bounds on estimates of probabilities of events. We show that these error estimates can also be used in an adaptive algorithm to simultaneously reduce the computational cost and increase the accuracy in estimating probabilities of events using computationally expensive high-fidelity models. Specifically, we introduce the notion of reliability of a sample of a surrogate model, and we prove that utilizing the surrogate model for the reliable samples and the high-fidelity model for the unreliable samples gives precisely the same estimate of the probability of the output event as would be obtained by evaluation of the original model for each sample. The adaptive algorithm uses the additional evaluations of the high-fidelity model for the unreliable samples to locally improve the surrogate model near the limit state, which significantly reduces the number of high-fidelity model evaluations as the limit state is resolved. Numerical results based on a recently developed adjoint-based approach for estimating the error in samples of a surrogate are provided to demonstrate (1) the robustness of the bounds on the probability of an event, and (2) that the adaptive enhancement algorithm provides a more accurate estimate of the probability of the quantity-of-interest (QoI) event than standard response surface approximation methods at a lower computational cost.

  7. Symmetric Blind Information Reconciliation for Quantum Key Distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiktenko, Evgeniy O.; Trushechkin, Anton S.; Lim, Charles Ci Wen

    Quantum key distribution (QKD) is a quantum-proof key-exchange scheme which is fast approaching the communication industry. An essential component in QKD is the information reconciliation step, which is used for correcting the quantum-channel noise errors. The recently suggested blind-reconciliation technique, based on low-density parity-check codes, offers remarkable prospects for efficient information reconciliation without an a priori quantum bit error rate estimation. We suggest an improvement of the blind-information-reconciliation protocol promoting a significant increase in the efficiency of the procedure and reducing its interactivity. The proposed technique is based on introducing symmetry in operations of parties, and the consideration of results of unsuccessful belief-propagation decodings.

  8. Symmetric Blind Information Reconciliation for Quantum Key Distribution

    DOE PAGES

    Kiktenko, Evgeniy O.; Trushechkin, Anton S.; Lim, Charles Ci Wen; ...

    2017-10-27

    Quantum key distribution (QKD) is a quantum-proof key-exchange scheme which is fast approaching the communication industry. An essential component in QKD is the information reconciliation step, which is used for correcting the quantum-channel noise errors. The recently suggested blind-reconciliation technique, based on low-density parity-check codes, offers remarkable prospects for efficient information reconciliation without an a priori quantum bit error rate estimation. We suggest an improvement of the blind-information-reconciliation protocol promoting a significant increase in the efficiency of the procedure and reducing its interactivity. The proposed technique is based on introducing symmetry in operations of parties, and the consideration of results of unsuccessful belief-propagation decodings.

  9. Symmetric Blind Information Reconciliation for Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Kiktenko, E. O.; Trushechkin, A. S.; Lim, C. C. W.; Kurochkin, Y. V.; Fedorov, A. K.

    2017-10-01

    Quantum key distribution (QKD) is a quantum-proof key-exchange scheme which is fast approaching the communication industry. An essential component in QKD is the information reconciliation step, which is used for correcting the quantum-channel noise errors. The recently suggested blind-reconciliation technique, based on low-density parity-check codes, offers remarkable prospects for efficient information reconciliation without an a priori quantum bit error rate estimation. We suggest an improvement of the blind-information-reconciliation protocol promoting a significant increase in the efficiency of the procedure and reducing its interactivity. The proposed technique is based on introducing symmetry in operations of parties, and the consideration of results of unsuccessful belief-propagation decodings.
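The syndrome-exchange step underlying code-based information reconciliation can be illustrated with a toy [7,4] Hamming code. This is a minimal sketch of the general idea only, not the authors' blind LDPC protocol; the parity-check matrix and keys are illustrative.

```python
import numpy as np

# Parity-check matrix of the [7,4] Hamming code; column j encodes the
# binary number j+1, so the syndrome of a single-bit error reads off the
# position of the flipped bit directly.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def syndrome(bits):
    return H @ bits % 2

rng = np.random.default_rng(1)
alice = rng.integers(0, 2, 7)   # Alice's sifted key block
bob = alice.copy()
bob[4] ^= 1                     # one quantum-channel bit flip

# Alice discloses her syndrome over the public channel; Bob compares it
# with his own. The XOR equals the syndrome of the error pattern.
s = syndrome(alice) ^ syndrome(bob)
if s.any():
    pos = int("".join(map(str, s)), 2) - 1  # error position from syndrome
    bob[pos] ^= 1                           # correct the flipped bit

assert np.array_equal(alice, bob)           # keys reconciled
```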

  10. Does applying technology throughout the medication use process improve patient safety with antineoplastics?

    PubMed

    Bubalo, Joseph; Warden, Bruce A; Wiegel, Joshua J; Nishida, Tess; Handel, Evelyn; Svoboda, Leanne M; Nguyen, Lam; Edillo, P Neil

    2014-12-01

    Medical errors, in particular medication errors, continue to be a troublesome factor in the delivery of safe and effective patient care. Antineoplastic agents represent a group of medications highly susceptible to medication errors due to their complex regimens and narrow therapeutic indices. As the majority of these medication errors are frequently associated with breakdowns in poorly defined systems, developing technologies and evolving workflows seem to be a logical approach to provide added safeguards against medication errors. This article will review both the pros and cons of today's technologies and their ability to simplify the medication use process, reduce medication errors, improve documentation, improve healthcare costs, and increase provider efficiency as they relate to the use of antineoplastic therapy throughout the medication use process. Several technologies, mainly computerized provider order entry (CPOE), barcode medication administration (BCMA), smart pumps, electronic medication administration record (eMAR), and telepharmacy, have been well described and proven to reduce medication errors, improve adherence to quality metrics, and/or improve healthcare costs in a broad scope of patients. Evidence for the use of these technologies during antineoplastic therapy, however, is weak at best and lacking for most. Specific to the antineoplastic medication use system, the only technology with data to adequately support a claim of reduced medication errors is CPOE. In addition to the benefits these technologies can provide, it is also important to recognize their potential to induce new types of errors and inefficiencies which can negatively impact patient care. The utilization of technology reduces but does not eliminate the potential for error. The evidence base to support technology in preventing medication errors is limited in general but even more deficient in the realm of antineoplastic therapy. Though CPOE has the best evidence to support its use in the antineoplastic population, benefit from many other technologies may have to be inferred based on data from other patient populations. As health systems begin to widely adopt and implement new technologies, it is important to critically assess their effectiveness in improving patient safety.

  11. Diagnostic decision-making and strategies to improve diagnosis.

    PubMed

    Thammasitboon, Satid; Cutrer, William B

    2013-10-01

    A significant portion of diagnostic errors arises through cognitive errors resulting from inadequate knowledge, faulty data gathering, and/or faulty verification. Experts estimate that 75% of diagnostic failures can be attributed to clinician diagnostic thinking failure. The cognitive processes that underlie the diagnostic thinking of clinicians are complex and intriguing, and it is imperative that clinicians acquire an explicit appreciation and application of different cognitive approaches to improve their decisions. A dual-process model that unifies many theories of decision-making has emerged as a promising template for understanding how clinicians think and judge efficiently in a diagnostic reasoning process. The identification and implementation of strategies for decreasing or preventing such diagnostic errors has become a growing area of interest and research. Suggested strategies to decrease diagnostic error incidence include increasing the clinician's clinical expertise and avoiding inherent cognitive errors. Implementing interventions focused solely on avoiding errors may work effectively for patient safety issues such as medication errors. Addressing cognitive errors, however, requires equal effort on expanding the individual clinician's expertise. Providing cognitive support to clinicians for robust diagnostic decision-making serves as the final strategic target for decreasing diagnostic errors. Clinical guidelines and algorithms offer another method for streamlining decision-making and decreasing the likelihood of cognitive diagnostic errors. Addressing cognitive processing errors is undeniably the most challenging task in reducing diagnostic errors. While many suggested approaches exist, they are mostly based on theories and sciences in cognitive psychology, decision-making, and education. The proposed interventions are primarily suggestions, and very few of them have been tested in actual practice settings. Collaborative research effort is required to effectively address cognitive processing errors. Researchers in various areas, including patient safety/quality improvement, decision-making, and problem solving, must work together to make medical diagnosis more reliable.

  12. Moderation of the Relationship Between Reward Expectancy and Prediction Error-Related Ventral Striatal Reactivity by Anhedonia in Unmedicated Major Depressive Disorder: Findings From the EMBARC Study

    PubMed Central

    Greenberg, Tsafrir; Chase, Henry W.; Almeida, Jorge R.; Stiffler, Richelle; Zevallos, Carlos R.; Aslam, Haris A.; Deckersbach, Thilo; Weyandt, Sarah; Cooper, Crystal; Toups, Marisa; Carmody, Thomas; Kurian, Benji; Peltier, Scott; Adams, Phillip; McInnis, Melvin G.; Oquendo, Maria A.; McGrath, Patrick J.; Fava, Maurizio; Weissman, Myrna; Parsey, Ramin; Trivedi, Madhukar H.; Phillips, Mary L.

    2016-01-01

    Objective Anhedonia, disrupted reward processing, is a core symptom of major depressive disorder. Recent findings demonstrate altered reward-related ventral striatal reactivity in depressed individuals, but the extent to which this is specific to anhedonia remains poorly understood. The authors examined the effect of anhedonia on reward expectancy (expected outcome value) and prediction error- (discrepancy between expected and actual outcome) related ventral striatal reactivity, as well as the relationship between these measures. Method A total of 148 unmedicated individuals with major depressive disorder and 31 healthy comparison individuals recruited for the multisite EMBARC (Establishing Moderators and Biosignatures of Antidepressant Response in Clinical Care) study underwent functional MRI during a well-validated reward task. Region of interest and whole-brain data were examined in the first- (N=78) and second- (N=70) recruited cohorts, as well as the total sample, of depressed individuals, and in healthy individuals. Results Healthy, but not depressed, individuals showed a significant inverse relationship between reward expectancy and prediction error-related right ventral striatal reactivity. Across all participants, and in depressed individuals only, greater anhedonia severity was associated with a reduced reward expectancy-prediction error inverse relationship, even after controlling for other symptoms. Conclusions The normal reward expectancy and prediction error-related ventral striatal reactivity inverse relationship concords with conditioning models, predicting a shift in ventral striatal responding from reward outcomes to reward cues. This study shows, for the first time, an absence of this relationship in two cohorts of unmedicated depressed individuals and a moderation of this relationship by anhedonia, suggesting reduced reward-contingency learning with greater anhedonia. 
These findings help elucidate neural mechanisms of anhedonia, as a step toward identifying potential biosignatures of treatment response. PMID:26183698

  13. Moderation of the Relationship Between Reward Expectancy and Prediction Error-Related Ventral Striatal Reactivity by Anhedonia in Unmedicated Major Depressive Disorder: Findings From the EMBARC Study.

    PubMed

    Greenberg, Tsafrir; Chase, Henry W; Almeida, Jorge R; Stiffler, Richelle; Zevallos, Carlos R; Aslam, Haris A; Deckersbach, Thilo; Weyandt, Sarah; Cooper, Crystal; Toups, Marisa; Carmody, Thomas; Kurian, Benji; Peltier, Scott; Adams, Phillip; McInnis, Melvin G; Oquendo, Maria A; McGrath, Patrick J; Fava, Maurizio; Weissman, Myrna; Parsey, Ramin; Trivedi, Madhukar H; Phillips, Mary L

    2015-09-01

    Anhedonia, disrupted reward processing, is a core symptom of major depressive disorder. Recent findings demonstrate altered reward-related ventral striatal reactivity in depressed individuals, but the extent to which this is specific to anhedonia remains poorly understood. The authors examined the effect of anhedonia on reward expectancy (expected outcome value) and prediction error- (discrepancy between expected and actual outcome) related ventral striatal reactivity, as well as the relationship between these measures. A total of 148 unmedicated individuals with major depressive disorder and 31 healthy comparison individuals recruited for the multisite EMBARC (Establishing Moderators and Biosignatures of Antidepressant Response in Clinical Care) study underwent functional MRI during a well-validated reward task. Region of interest and whole-brain data were examined in the first- (N=78) and second- (N=70) recruited cohorts, as well as the total sample, of depressed individuals, and in healthy individuals. Healthy, but not depressed, individuals showed a significant inverse relationship between reward expectancy and prediction error-related right ventral striatal reactivity. Across all participants, and in depressed individuals only, greater anhedonia severity was associated with a reduced reward expectancy-prediction error inverse relationship, even after controlling for other symptoms. The normal reward expectancy and prediction error-related ventral striatal reactivity inverse relationship concords with conditioning models, predicting a shift in ventral striatal responding from reward outcomes to reward cues. This study shows, for the first time, an absence of this relationship in two cohorts of unmedicated depressed individuals and a moderation of this relationship by anhedonia, suggesting reduced reward-contingency learning with greater anhedonia. 
These findings help elucidate neural mechanisms of anhedonia, as a step toward identifying potential biosignatures of treatment response.

  14. Temperature and pressure effects on capacitance probe cryogenic liquid level measurement accuracy

    NASA Technical Reports Server (NTRS)

    Edwards, Lawrence G.; Haberbusch, Mark

    1993-01-01

    The inaccuracies of liquid nitrogen and liquid hydrogen level measurements by use of a coaxial capacitance probe were investigated as a function of fluid temperatures and pressures. Significant liquid level measurement errors were found to occur due to changes in the fluids' dielectric constants over the operating temperature and pressure ranges of the cryogenic storage tanks. The level measurement inaccuracies can be reduced by using fluid dielectric correction factors based on measured fluid temperatures and pressures. The errors in the corrected liquid level measurements were estimated based on the reported calibration errors of the temperature and pressure measurement systems. Experimental liquid nitrogen (LN2) and liquid hydrogen (LH2) level measurements were obtained using the calibrated capacitance probe equations and also by the dielectric constant correction factor method. The liquid levels obtained by the capacitance probe for the two methods were compared with the liquid level estimated from the fluid temperature profiles. Results show that the dielectric constant corrected liquid levels agreed within 0.5 percent of the temperature profile estimated liquid level. The uncorrected dielectric constant capacitance liquid level measurements deviated from the temperature profile level by more than 5 percent. This paper identifies the magnitude of liquid level measurement error that can occur for LN2 and LH2 fluids due to temperature and pressure effects on the dielectric constants over the tank storage conditions from 5 to 40 psia. A method of reducing the level measurement errors by using dielectric constant correction factors based on fluid temperature and pressure measurements is derived. The improved accuracy by use of the correction factors is experimentally verified by comparing liquid levels derived from fluid temperature profiles.
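A minimal sketch of the correction-factor idea, assuming the common parallel liquid/vapor model for a coaxial probe; the capacitance and dielectric values below are illustrative, not the paper's calibration data.

```python
# For a probe of empty (vacuum) capacitance C0 spanning the full tank
# height, a liquid fraction h gives
#     C = C0 * (eps_liq * h + eps_vap * (1 - h))
# so a measured C can be inverted for h once eps_liq and eps_vap are
# known at the tank's actual temperature and pressure.

def level_fraction(C, C0, eps_liq, eps_vap):
    """Invert the parallel liquid/vapor capacitance model for level h."""
    return (C / C0 - eps_vap) / (eps_liq - eps_vap)

C0 = 100.0           # pF, empty-probe capacitance (assumed)
eps_vap = 1.004      # vapor dielectric constant, near unity
eps_liq_cal = 1.431  # LN2 dielectric constant at calibration conditions
eps_liq_hot = 1.398  # LN2 value at a warmer, higher-pressure state (illustrative)

# Simulate a measurement taken at the warmer condition with the tank 60% full.
h_true = 0.60
C_meas = C0 * (eps_liq_hot * h_true + eps_vap * (1 - h_true))

h_uncorrected = level_fraction(C_meas, C0, eps_liq_cal, eps_vap)  # stale constant
h_corrected = level_fraction(C_meas, C0, eps_liq_hot, eps_vap)    # corrected
print(h_uncorrected, h_corrected)  # only the corrected value recovers h_true
```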

  15. The positive financial impact of using an Intensive Care Information System in a tertiary Intensive Care Unit.

    PubMed

    Levesque, Eric; Hoti, Emir; de La Serna, Sofia; Habouchi, Houssam; Ichai, Philippe; Saliba, Faouzi; Samuel, Didier; Azoulay, Daniel

    2013-03-01

    In the French healthcare system, the intensive care budget allocated is directly dependent on the activity level of the center. To evaluate this activity level, it is necessary to code the medical diagnoses and procedures performed on Intensive Care Unit (ICU) patients. The aim of this study was to evaluate the effects of using an Intensive Care Information System (ICIS) on the incidence of coding errors and its impact on the ICU budget allocated. Since 2005, the documentation on and monitoring of every patient admitted to our ICU has been carried out using an ICIS. However, the coding process was performed manually until 2008. This study focused on two periods: the period of manual coding (year 2007) and the period of computerized coding (year 2008) which covered a total of 1403 ICU patients. The time spent on the coding process, the rate of coding errors (defined as patients missed/not coded or wrongly identified as undergoing major procedure/s) and the financial impact were evaluated for these two periods. With computerized coding, the time per admission decreased significantly (from 6.8 ± 2.8 min in 2007 to 3.6 ± 1.9 min in 2008, p<0.001). Similarly, a reduction in coding errors was observed (7.9% vs. 2.2%, p<0.001). This decrease in coding errors resulted in a reduced difference between the potential and real ICU financial supplements obtained in the respective years (€194,139 loss in 2007 vs. a €1628 loss in 2008). Using specific computer programs improves the intensive process of manual coding by shortening the time required as well as reducing errors, which in turn positively impacts the ICU budget allocation.

  16. New Insights into Handling Missing Values in Environmental Epidemiological Studies

    PubMed Central

    Roda, Célina; Nicolis, Ioannis; Momas, Isabelle; Guihenneuc, Chantal

    2014-01-01

    Missing data are unavoidable in environmental epidemiologic surveys. The aim of this study was to compare methods for handling large amounts of missing values: omission of missing values, single and multiple imputations (through linear regression or partial least squares regression), and a fully Bayesian approach. These methods were applied to the PARIS birth cohort, where indoor domestic pollutant measurements were performed in a random sample of babies' dwellings. A simulation study was conducted to assess performances of different approaches with a high proportion of missing values (from 50% to 95%). Different simulation scenarios were carried out, controlling the true value of the association (odds ratio of 1.0, 1.2, and 1.4), and varying the health outcome prevalence. When a large amount of data is missing, omitting these missing data reduced statistical power and inflated standard errors, which affected the significance of the association. Single imputation underestimated the variability, and considerably increased risk of type I error. All approaches were conservative, except the Bayesian joint model. In the case of a common health outcome, the fully Bayesian approach is the most efficient approach (low root mean square error, reasonable type I error, and high statistical power). Nevertheless for a less prevalent event, the type I error is increased and the statistical power is reduced. The estimated posterior distribution of the OR is useful to refine the conclusion. Among the methods handling missing values, no approach is absolutely the best, but when usual approaches (e.g., single imputation) are not sufficient, a joint modelling approach of the missingness process and the health association is more efficient when large amounts of data are missing. PMID:25226278
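The claim that omitting missing data inflates standard errors can be illustrated with a toy Monte Carlo simulation on synthetic data (not the PARIS cohort): with 80% of values missing completely at random, the complete-case standard error of a mean grows by roughly sqrt(1 / (1 - missing rate)).

```python
import numpy as np

# Toy illustration: estimate the standard error of a sample mean with and
# without 80% of measurements missing completely at random.
rng = np.random.default_rng(42)
n, missing_rate, n_rep = 1_000, 0.8, 2_000

se_full, se_cc = [], []
for _ in range(n_rep):
    x = rng.normal(0.0, 1.0, n)
    observed = x[rng.random(n) > missing_rate]      # complete cases only
    se_full.append(x.std(ddof=1) / np.sqrt(n))
    se_cc.append(observed.std(ddof=1) / np.sqrt(observed.size))

ratio = np.mean(se_cc) / np.mean(se_full)
print(np.mean(se_full), np.mean(se_cc), ratio)      # SE roughly doubles
```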

  17. Attitude guidance and tracking for spacecraft with two reaction wheels

    NASA Astrophysics Data System (ADS)

    Biggs, James D.; Bai, Yuliang; Henninger, Helen

    2018-04-01

    This paper addresses the guidance and tracking problem for a rigid spacecraft using two reaction wheels (RWs). The guidance problem is formulated as an optimal control problem on the special orthogonal group SO(3). The optimal motion is solved analytically as a function of time and is used to reduce the original guidance problem to one of computing the minimum of a nonlinear function. A tracking control using two RWs is developed that extends previous singular quaternion stabilisation controls to tracking controls on the rotation group. The controller is proved to locally asymptotically track the generated reference motions using Lyapunov's direct method. Simulations of a 3U CubeSat demonstrate that this tracking control is robust to initial rotation errors and angular velocity errors in the controlled axis. For initial angular velocity errors in the uncontrolled axis and under significant disturbances, the control fails to track. However, the singular tracking control is combined with a nano-magnetic torquer which simply damps the angular velocity in the uncontrolled axis and is shown to provide a practical control method for tracking in the presence of disturbances and initial condition errors.
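The attitude tracking error on SO(3) that such a controller must drive to the identity is conventionally R_e = R_d^T R, with rotation angle acos((trace(R_e) - 1) / 2). A minimal sketch with illustrative rotations (not the paper's controller):

```python
import numpy as np

def rot_z(t):
    """Rotation by angle t about the z axis (an element of SO(3))."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

R_d = rot_z(0.50)   # commanded (reference) attitude
R = rot_z(0.55)     # actual attitude: 0.05 rad tracking error

# Attitude error on the rotation group and its geodesic angle.
R_e = R_d.T @ R
angle = np.arccos(np.clip((np.trace(R_e) - 1.0) / 2.0, -1.0, 1.0))
print(angle)        # the 0.05 rad error is recovered
```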

  18. Experimental methods to validate measures of emotional state and readiness for duty in critical operations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weston, Louise Marie

    2007-09-01

    A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences and that some of these human errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. The effectiveness of pre-shift testing is a challenge because of the need to gather predictive data in a relatively short test period and the potential occurrence of learning effects due to a requirement for frequent testing. This report reviews the different types of reliability and validity methods and testing and statistical analysis procedures to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended.

  19. Progressive Care Nurses Improving Patient Safety by Limiting Interruptions During Medication Administration.

    PubMed

    Flynn, Fran; Evanish, Julie Q; Fernald, Josephine M; Hutchinson, Dawn E; Lefaiver, Cheryl

    2016-08-01

    Because of the high frequency of interruptions during medication administration, the effectiveness of strategies to limit interruptions during medication administration has been evaluated in numerous quality improvement initiatives in an effort to reduce medication administration errors. To evaluate the effectiveness of evidence-based strategies to limit interruptions during scheduled, peak medication administration times in 3 progressive cardiac care units (PCCUs). A secondary aim of the project was to evaluate the impact of limiting interruptions on medication errors. The percentages of interruptions and medication errors before and after implementation of evidence-based strategies to limit interruptions were measured by using direct observations of nurses on 2 PCCUs. Nurses in a third PCCU served as a comparison group. Interruptions (P < .001) and medication errors (P = .02) decreased significantly in 1 PCCU after implementation of evidence-based strategies to limit interruptions. Avoidable interruptions decreased 83% in PCCU1 and 53% in PCCU2 after implementation of the evidence-based strategies. Implementation of evidence-based strategies to limit interruptions in PCCUs decreases avoidable interruptions and promotes patient safety.

  20. Evaluation of Factors Influencing Accuracy of Principal Procedure Coding Based on ICD-9-CM: An Iranian Study

    PubMed Central

    Farzandipour, Mehrdad; Sheikhtaheri, Abbas

    2009-01-01

    To evaluate the accuracy of procedural coding and the factors that influence it, 246 records were randomly selected from four teaching hospitals in Kashan, Iran. "Recodes" were assigned blindly and then compared to the original codes. Furthermore, the coders' professional behaviors were carefully observed during the coding process. Coding errors were classified as major or minor. The relations between coding accuracy and possible influencing factors were analyzed by χ2 or Fisher exact tests as well as the odds ratio (OR) and the 95 percent confidence interval for the OR. The results showed that using a tabular index for rechecking codes reduces errors (83 percent vs. 72 percent accuracy). Further, more thorough documentation by the clinician positively affected coding accuracy, though this relation was not significant. Readability of records decreased errors overall (p = .003), including major ones (p = .012). Moreover, records with no abbreviations had fewer major errors (p = .021). In conclusion, not using abbreviations, ensuring more readable documentation, and paying more attention to available information increased coding accuracy and the quality of procedure databases. PMID:19471647
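The odds-ratio analysis mentioned above can be sketched for a hypothetical 2x2 table using the standard Woolf (log-based) 95% confidence interval; the counts are invented for illustration, not the Kashan study's data.

```python
import math

# Hypothetical 2x2 table relating record readability to coding accuracy:
#                 accurate  inaccurate
# readable            83        17
# not readable        72        28
a, b, c, d = 83, 17, 72, 28

odds_ratio = (a * d) / (b * c)                       # cross-product ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)         # SE of log(OR), Woolf method
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

Note that when the interval spans 1.0, as it does for these invented counts, the association would not be statistically significant at the 5% level.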
