Sample records for online error detection

  1. Single Versus Multiple Events Error Potential Detection in a BCI-Controlled Car Game With Continuous and Discrete Feedback.

    PubMed

    Kreilinger, Alex; Hiebel, Hannah; Müller-Putz, Gernot R

    2016-03-01

    This work aimed to find and evaluate a new method for detecting errors in continuous brain-computer interface (BCI) applications. Instead of classifying errors on a single-trial basis, the new method was based on multiple events (MEs) analysis to increase the accuracy of error detection. In a BCI-driven car game, based on motor imagery (MI), discrete events were triggered whenever subjects collided with coins and/or barriers. Coins counted as correct events, whereas barriers were errors. This new method, termed ME method, combined and averaged the classification results of single events (SEs) and determined the correctness of MI trials, which consisted of event sequences instead of SEs. The benefit of this method was evaluated in an offline simulation. In an online experiment, the new method was used to detect erroneous MI trials. Such MI trials were discarded and could be repeated by the users. We found that, even with low SE error potential (ErrP) detection rates, feasible accuracies can be achieved when combining MEs to distinguish erroneous from correct MI trials. Online, all subjects reached higher scores with error detection than without, at the cost of longer times needed for completing the game. Findings suggest that ErrP detection may become a reliable tool for monitoring continuous states in BCI applications when combining MEs. This paper demonstrates a novel technique for detecting errors in online continuous BCI applications, which yields promising results even with low single-trial detection rates.
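
    As a rough illustration of the multiple-events (ME) idea described above, the sketch below averages the outputs of a per-event ErrP classifier to decide whether a whole motor-imagery trial should be discarded. It is a minimal sketch, not the authors' implementation; the function name, the 0.5 threshold, and the use of probability-like scores are assumptions.

```python
import numpy as np

def classify_trial_from_events(event_scores, threshold=0.5):
    """Multiple-events (ME) style decision: average single-event (SE)
    error-classifier scores within one motor-imagery trial and compare
    the mean against a decision threshold.

    event_scores : per-event probabilities that an ErrP occurred
                   (output of any single-trial ErrP classifier).
    Returns True if the trial is flagged as erroneous.
    """
    scores = np.asarray(event_scores, dtype=float)
    if scores.size == 0:
        return False          # no events observed -> keep the trial
    return scores.mean() > threshold

# toy usage: three coin/barrier events within one MI trial
print(classify_trial_from_events([0.35, 0.72, 0.66]))  # True -> discard and repeat trial
```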

  2. Quantitative evaluation of patient-specific quality assurance using online dosimetry system

    NASA Astrophysics Data System (ADS)

    Jung, Jae-Yong; Shin, Young-Ju; Sohn, Seung-Chang; Min, Jung-Whan; Kim, Yon-Lae; Kim, Dong-Su; Choe, Bo-Young; Suh, Tae-Suk

    2018-01-01

    In this study, we investigated the clinical performance of an online dosimetry system (Mobius FX system, MFX) by 1) dosimetric plan verification using gamma passing rates and dose volume metrics and 2) error-detection capability evaluation by deliberately introduced machine errors. Eighteen volumetric modulated arc therapy (VMAT) plans were studied. To evaluate the clinical performance of the MFX, we used gamma analysis and dose volume histogram (DVH) analysis. In addition, to evaluate the error-detection capability, we used gamma analysis and DVH analysis utilizing three types of deliberately introduced errors (Type 1: gantry angle-independent multi-leaf collimator (MLC) error, Type 2: gantry angle-dependent MLC error, and Type 3: gantry angle error). In the dosimetric verification comparison of the physical dosimetry system (Delta4PT) and the online dosimetry system (MFX), the gamma passing rates of the two dosimetry systems showed very good agreement with the treatment planning system (TPS) calculation. For the average dose difference between the TPS calculation and the MFX measurement, most of the dose metrics showed good agreement within a tolerance of 3%. For the error-detection comparison of Delta4PT and MFX, the gamma passing rates of the two dosimetry systems did not meet the 90% acceptance criterion when the magnitude of error exceeded 2 mm and 1.5°, respectively, for error plans of Types 1, 2, and 3. For delivery with all error types, the average dose difference of the PTV due to error magnitude showed good agreement, within 1%, between the TPS calculation and the MFX measurement. Overall, the results of the online dosimetry system showed very good agreement with those of the physical dosimetry system. Our results suggest that a log file-based online dosimetry system is a very suitable verification tool for accurate and efficient clinical routines for patient-specific quality assurance (QA).

  3. Magneto-optical tracking of flexible laparoscopic ultrasound: model-based online detection and correction of magnetic tracking errors.

    PubMed

    Feuerstein, Marco; Reichl, Tobias; Vogel, Jakob; Traub, Joerg; Navab, Nassir

    2009-06-01

    Electromagnetic tracking is currently one of the most promising means of localizing flexible endoscopic instruments such as flexible laparoscopic ultrasound transducers. However, electromagnetic tracking is also susceptible to interference from ferromagnetic material, which distorts the magnetic field and leads to tracking errors. This paper presents new methods for real-time online detection and reduction of dynamic electromagnetic tracking errors when localizing a flexible laparoscopic ultrasound transducer. We use a hybrid tracking setup to combine optical tracking of the transducer shaft and electromagnetic tracking of the flexible transducer tip. A novel approach of modeling the poses of the transducer tip in relation to the transducer shaft allows us to reliably detect and significantly reduce electromagnetic tracking errors. For detecting errors of more than 5 mm, we achieved a sensitivity and specificity of 91% and 93%, respectively. The initial 3-D rms error of 6.91 mm was reduced to 3.15 mm.

  4. Detection of Error Related Neuronal Responses Recorded by Electrocorticography in Humans during Continuous Movements

    PubMed Central

    Milekovic, Tomislav; Ball, Tonio; Schulze-Bonhage, Andreas; Aertsen, Ad; Mehring, Carsten

    2013-01-01

    Background Brain-machine interfaces (BMIs) can translate the neuronal activity underlying a user’s movement intention into movements of an artificial effector. In spite of continuous improvements, errors in movement decoding are still a major problem of current BMI systems. If the difference between the decoded and intended movements becomes noticeable, it may lead to an execution error. Outcome errors, where subjects fail to reach a certain movement goal, are also present during online BMI operation. Detecting such errors can be beneficial for BMI operation: (i) errors can be corrected online after being detected and (ii) the adaptive BMI decoding algorithm can be updated to make fewer errors in the future. Methodology/Principal Findings Here, we show that error events can be detected from human electrocorticography (ECoG) during a continuous task with high precision, given a temporal tolerance of 300–400 milliseconds. We quantified the error detection accuracy and showed that, using only a small subset of 2×2 ECoG electrodes, 82% of detection information for outcome errors and 74% of detection information for execution errors available from all ECoG electrodes could be retained. Conclusions/Significance The error detection method presented here could be used to correct errors made during BMI operation or to adapt a BMI algorithm to make fewer errors in the future. Furthermore, our results indicate that a smaller ECoG implant could be used for error detection. Reducing the size of an ECoG electrode implant used for BMI decoding and error detection could significantly reduce the medical risk of implantation. PMID:23383315

  5. WE-D-BRA-04: Online 3D EPID-Based Dose Verification for Optimum Patient Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, H; Rozendaal, R; Olaciregui-Ruiz, I

    2015-06-15

    Purpose: To develop an online 3D dose verification tool based on EPID transit dosimetry to ensure optimum patient safety in radiotherapy treatments. Methods: A new software package was developed which processes EPID portal images online using a back-projection algorithm for the 3D dose reconstruction. The package processes portal images faster than the acquisition rate of the portal imager (∼ 2.5 fps). After a portal image is acquired, the software searches for “hot spots” in the reconstructed 3D dose distribution. A hot spot is defined in this study as a 4 cm³ cube where the average cumulative reconstructed dose exceeds the average total planned dose by at least 20% and 50 cGy. If a hot spot is detected, an alert is generated resulting in a linac halt. The software has been tested by irradiating an Alderson phantom after introducing various types of serious delivery errors. Results: In our first experiment the Alderson phantom was irradiated with two arcs from a 6 MV VMAT H&N treatment having a large leaf position error or a large monitor unit error. For both arcs and both errors the linac was halted before dose delivery was completed. When no error was introduced, the linac was not halted. The complete processing of a single portal frame, including hot spot detection, takes about 220 ms on a dual hexacore Intel Xeon X5650 CPU at 2.66 GHz. Conclusion: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for various kinds of gross delivery errors. The detection of hot spots was proven to be effective for the timely detection of these errors. Current work is focused on hot spot detection criteria for various treatment sites and the introduction of a clinical pilot program with online verification of hypo-fractionated (lung) treatments.
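
    The hot-spot rule quoted in this abstract (a cube of roughly 4 cm³ whose mean cumulative reconstructed dose exceeds the mean planned dose by at least 20% and 50 cGy) can be sketched as a simple grid scan. The following is illustrative only, assuming both dose distributions are available as 3D numpy arrays in cGy on a common grid; the cube size in voxels and all names are hypothetical.

```python
import numpy as np

def detect_hot_spot(recon_dose, planned_dose, cube_shape=(16, 16, 16),
                    rel_excess=0.20, abs_excess_cgy=50.0):
    """Scan the reconstructed 3D dose for a 'hot spot': a cube whose mean
    cumulative reconstructed dose exceeds the mean planned dose by at least
    `rel_excess` (fractional) AND `abs_excess_cgy` (cGy).

    recon_dose, planned_dose : 3D numpy arrays in cGy on the same grid.
    cube_shape : cube size in voxels (chosen so its volume is ~4 cm^3 for
                 the grid spacing in use; hypothetical default).
    Returns the corner index of the first hot spot found, or None.
    """
    dz, dy, dx = cube_shape
    nz, ny, nx = recon_dose.shape
    for z in range(0, nz - dz + 1, dz):
        for y in range(0, ny - dy + 1, dy):
            for x in range(0, nx - dx + 1, dx):
                r = recon_dose[z:z+dz, y:y+dy, x:x+dx].mean()
                p = planned_dose[z:z+dz, y:y+dy, x:x+dx].mean()
                if r >= p * (1.0 + rel_excess) and r - p >= abs_excess_cgy:
                    return (z, y, x)   # the caller would halt the linac here
    return None
```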

  6. Hybrid online sensor error detection and functional redundancy for systems with time-varying parameters.

    PubMed

    Feng, Jianyuan; Turksoy, Kamuran; Samadi, Sediqeh; Hajizadeh, Iman; Littlejohn, Elizabeth; Cinar, Ali

    2017-12-01

    Supervision and control systems rely on signals from sensors to receive information to monitor the operation of a system and adjust manipulated variables to achieve the control objective. However, sensor performance is often limited by the working conditions, and sensors may also be subjected to interference from other devices. Many different types of sensor errors such as outliers, missing values, drifts and corruption with noise may occur during process operation. A hybrid online sensor error detection and functional redundancy system is developed to detect errors in online signals, and replace erroneous or missing values detected with model-based estimates. The proposed hybrid system relies on two techniques, an outlier-robust Kalman filter (ORKF) and a locally-weighted partial least squares (LW-PLS) regression model, which leverage the advantages of automatic measurement error elimination with ORKF and data-driven prediction with LW-PLS. The system includes a nominal angle analysis (NAA) method to distinguish between signal faults and large changes in sensor values caused by real dynamic changes in process operation. The performance of the system is illustrated with clinical data from continuous glucose monitoring (CGM) sensors of people with type 1 diabetes. More than 50,000 CGM sensor errors were added to the original CGM signals from 25 clinical experiments, and the performance of the error detection and functional redundancy algorithms was then analyzed. The results indicate that the proposed system can successfully detect most of the erroneous signals and substitute them with reasonable estimated values computed by the functional redundancy system.
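
    The detect-and-substitute idea behind this hybrid system can be illustrated with a much simpler stand-in than the paper's ORKF/LW-PLS combination: a scalar Kalman filter whose innovation test flags outliers and missing values and replaces them with the model estimate. This is a minimal sketch; all parameter values and names below are assumptions for illustration.

```python
import numpy as np

def filter_cgm_stream(measurements, q=0.1, r=4.0, outlier_sigma=3.0):
    """Illustrative detect-and-substitute loop for a 1D sensor stream
    (e.g., CGM glucose in mg/dL) using a scalar random-walk Kalman filter.
    A sample whose innovation exceeds `outlier_sigma` standard deviations,
    or which is missing (NaN), is treated as a sensor error and replaced
    by the model estimate; otherwise the normal Kalman update is applied.
    Assumes the first sample is valid. Returns (cleaned_signal, error_flags).
    """
    x, p = float(measurements[0]), 1.0   # state estimate and its variance
    cleaned, flags = [], []
    for z in measurements:
        p = p + q                        # predict (random-walk model)
        s = p + r                        # innovation variance
        innov = z - x
        is_error = np.isnan(z) or abs(innov) > outlier_sigma * np.sqrt(s)
        if is_error:
            cleaned.append(x)            # functional redundancy: use the estimate
        else:
            k = p / s                    # Kalman gain
            x = x + k * innov
            p = (1.0 - k) * p
            cleaned.append(x)
        flags.append(is_error)
    return np.array(cleaned), np.array(flags)
```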

  7. An online detection system for aggregate sizes and shapes based on digital image processing

    NASA Astrophysics Data System (ADS)

    Yang, Jianhong; Chen, Sijia

    2017-02-01

    Traditional aggregate size measuring methods are time-consuming, taxing, and do not deliver online measurements. A new online detection system for determining aggregate size and shape, based on a digital camera with a charge-coupled device and subsequent digital image processing, has been developed to overcome these problems. The system captures images of aggregates while falling and flat lying. Using these data, the particle size and shape distribution can be obtained in real time. Here, we calibrate this method using standard globules. Our experiments show that the maximum particle size distribution error was only 3 wt%, while the maximum particle shape distribution error was only 2 wt% for data derived from falling aggregates, which had good dispersion. In contrast, the data for flat-lying aggregates had a maximum particle size distribution error of 12 wt% and a maximum particle shape distribution error of 10 wt%; their accuracy was clearly lower than for falling aggregates. However, they performed well for single-graded aggregates and did not require a dispersion device. Our system is low-cost and easy to install. It can successfully achieve online detection of aggregate size and shape with good reliability, and it has great potential for aggregate quality assurance.

  8. Online adaptation of a c-VEP Brain-computer Interface (BCI) based on error-related potentials and unsupervised learning.

    PubMed

    Spüler, Martin; Rosenstiel, Wolfgang; Bogdan, Martin

    2012-01-01

    The goal of a Brain-Computer Interface (BCI) is to control a computer by pure brain activity. Recently, BCIs based on code-modulated visual evoked potentials (c-VEPs) have shown great potential to establish high-performance communication. In this paper we present a c-VEP BCI that uses online adaptation of the classifier to reduce calibration time and increase performance. We compare two different approaches for online adaptation of the system: an unsupervised method and a method that uses the detection of error-related potentials. Both approaches were tested in an online study, in which an average accuracy of 96% was achieved with adaptation based on error-related potentials. This accuracy corresponds to an average information transfer rate of 144 bit/min, which is the highest bitrate reported so far for a non-invasive BCI. In a free-spelling mode, the subjects were able to write with an average of 21.3 error-free letters per minute, which shows the feasibility of the BCI system in a normal-use scenario. In addition we show that a calibration of the BCI system solely based on the detection of error-related potentials is possible, without knowing the true class labels.

  9. New-Sum: A Novel Online ABFT Scheme For General Iterative Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tao, Dingwen; Song, Shuaiwen; Krishnamoorthy, Sriram

    Emerging high-performance computing platforms, with large component counts and lower power margins, are anticipated to be more susceptible to soft errors in both logic circuits and memory subsystems. We present an online algorithm-based fault tolerance (ABFT) approach to efficiently detect and recover soft errors for general iterative methods. We design a novel checksum-based encoding scheme for matrix-vector multiplication that is resilient to both arithmetic and memory errors. Our design decouples the checksum updating process from the actual computation, and allows adaptive checksum overhead control. Building on this new encoding mechanism, we propose two online ABFT designs that can effectively recover from errors when combined with a checkpoint/rollback scheme.
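
    The classic checksum encoding that ABFT schemes of this kind build on can be sketched in a few lines: append a column-sum row to the matrix, carry it through the matrix-vector product, and compare it against the sum of the computed entries. This is only the textbook idea, not the paper's decoupled New-Sum scheme; the function names and tolerance are assumptions.

```python
import numpy as np

def checksum_encode(A):
    """Append a checksum row e^T A to matrix A (classic ABFT encoding)."""
    return np.vstack([A, A.sum(axis=0)])

def checked_matvec(A_enc, x, tol=1e-8):
    """Matrix-vector product with online error detection: the last entry of
    y = A_enc @ x carries the checksum e^T A x; it must equal the sum of the
    remaining entries up to floating-point tolerance. A mismatch signals a
    soft error in the computation or in memory."""
    y = A_enc @ x
    data, check = y[:-1], y[-1]
    if abs(data.sum() - check) > tol * max(1.0, abs(check)):
        raise RuntimeError("ABFT checksum mismatch: soft error detected")
    return data

# toy usage
A = np.random.rand(4, 4)
x = np.random.rand(4)
A_enc = checksum_encode(A)
print(np.allclose(checked_matvec(A_enc, x), A @ x))  # True when no fault occurs
```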

  10. Comparing Elicited Imitation and Word Monitoring as Measures of Implicit Knowledge

    ERIC Educational Resources Information Center

    Suzuki, Yuichi; DeKeyser, Robert

    2015-01-01

    The present study challenges the validity of elicited imitation (EI) as a measure for implicit knowledge, investigating to what extent online error detection and subsequent sentence repetition draw on implicit knowledge. To assess online detection during listening, a word monitoring component was built into an EI task. Advanced-level Japanese L2…

  11. Online production validation in a HEP environment

    NASA Astrophysics Data System (ADS)

    Harenberg, T.; Kuhl, T.; Lang, N.; Mättig, P.; Sandhoff, M.; Schwanenberger, C.; Volkmer, F.

    2017-03-01

    In high energy physics (HEP) event simulations, petabytes of data are processed and stored, requiring millions of CPU-years. This enormous demand for computing resources is handled by centers distributed worldwide, which form part of the LHC computing grid. The consumption of such a large amount of resources demands efficient production of simulations and early detection of potential errors. In this article we present a new monitoring framework for grid environments, which polls a measure of data quality during job execution. This online monitoring facilitates the early detection of configuration errors (especially in simulation parameters), and may thus contribute to significant savings in computing resources.

  12. Quality assurance for online adapted treatment plans: Benchmarking and delivery monitoring simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Taoran, E-mail: taoran.li.duke@gmail.com; Wu, Qiuwen; Yang, Yun

    Purpose: An important challenge facing online adaptive radiation therapy is the development of feasible and efficient quality assurance (QA). This project aimed to validate the deliverability of online adapted plans and develop a proof-of-concept online delivery monitoring system for online adaptive radiation therapy QA. Methods: The first part of this project benchmarked automatically online adapted prostate treatment plans using traditional portal dosimetry IMRT QA. The portal dosimetry QA results of online adapted plans were compared to original (unadapted) plans as well as randomly selected prostate IMRT plans from our clinic. In the second part, an online delivery monitoring system was designed and validated via a simulated treatment with intentional multileaf collimator (MLC) errors. This system was based on inputs from the dynamic machine information (DMI), which continuously reports actual MLC positions and machine monitor units (MUs) at intervals of 50 ms or less during delivery. Based on the DMI, the system performed two levels of monitoring/verification during the delivery: (1) dynamic monitoring of cumulative fluence errors resulting from leaf position deviations and visualization using fluence error maps (FEMs); and (2) verification of MLC positions against the treatment plan for potential errors in MLC motion and data transfer at each control point. Validation of the online delivery monitoring system was performed by introducing intentional systematic MLC errors (ranging from 0.5 to 2 mm) to the DMI files for both leaf banks. These DMI files were analyzed by the proposed system to evaluate the system’s performance in quantifying errors and revealing the source of errors, as well as to understand patterns in the FEMs. In addition, FEMs from 210 actual prostate IMRT beams were analyzed using the proposed system to further validate its ability to catch and identify errors, as well as establish error magnitude baselines for prostate IMRT delivery. Results: Online adapted plans were found to have similar delivery accuracy in comparison to clinical IMRT plans when validated with portal dosimetry IMRT QA. FEMs for the simulated deliveries with intentional MLC errors exhibited distinct patterns for different MLC error magnitudes and directions, indicating that the proposed delivery monitoring system is highly specific in detecting the source of errors. Implementing the proposed QA system for online adapted plans revealed excellent delivery accuracy: over 99% of leaf position differences were within 0.5 mm, and >99% of pixels in the FEMs had fluence errors within 0.5 MU. Patterns present in the FEMs and MLC control point analysis for actual patient cases agreed with the error pattern analysis results, further validating the system’s ability to reveal and differentiate MLC deviations. Calculation of the fluence map based on the DMI was performed within 2 ms after receiving each DMI input. Conclusions: The proposed online delivery monitoring system requires minimal additional resources and time commitment to the current clinical workflow while still maintaining high sensitivity to leaf position errors and specificity to error types. The presented online delivery monitoring system therefore represents a promising QA system candidate for online adaptive radiation therapy.

  13. Quality assurance for online adapted treatment plans: benchmarking and delivery monitoring simulation.

    PubMed

    Li, Taoran; Wu, Qiuwen; Yang, Yun; Rodrigues, Anna; Yin, Fang-Fang; Jackie Wu, Q

    2015-01-01

    An important challenge facing online adaptive radiation therapy is the development of feasible and efficient quality assurance (QA). This project aimed to validate the deliverability of online adapted plans and develop a proof-of-concept online delivery monitoring system for online adaptive radiation therapy QA. The first part of this project benchmarked automatically online adapted prostate treatment plans using traditional portal dosimetry IMRT QA. The portal dosimetry QA results of online adapted plans were compared to original (unadapted) plans as well as randomly selected prostate IMRT plans from our clinic. In the second part, an online delivery monitoring system was designed and validated via a simulated treatment with intentional multileaf collimator (MLC) errors. This system was based on inputs from the dynamic machine information (DMI), which continuously reports actual MLC positions and machine monitor units (MUs) at intervals of 50 ms or less during delivery. Based on the DMI, the system performed two levels of monitoring/verification during the delivery: (1) dynamic monitoring of cumulative fluence errors resulting from leaf position deviations and visualization using fluence error maps (FEMs); and (2) verification of MLC positions against the treatment plan for potential errors in MLC motion and data transfer at each control point. Validation of the online delivery monitoring system was performed by introducing intentional systematic MLC errors (ranging from 0.5 to 2 mm) to the DMI files for both leaf banks. These DMI files were analyzed by the proposed system to evaluate the system's performance in quantifying errors and revealing the source of errors, as well as to understand patterns in the FEMs. In addition, FEMs from 210 actual prostate IMRT beams were analyzed using the proposed system to further validate its ability to catch and identify errors, as well as establish error magnitude baselines for prostate IMRT delivery. Online adapted plans were found to have similar delivery accuracy in comparison to clinical IMRT plans when validated with portal dosimetry IMRT QA. FEMs for the simulated deliveries with intentional MLC errors exhibited distinct patterns for different MLC error magnitudes and directions, indicating that the proposed delivery monitoring system is highly specific in detecting the source of errors. Implementing the proposed QA system for online adapted plans revealed excellent delivery accuracy: over 99% of leaf position differences were within 0.5 mm, and >99% of pixels in the FEMs had fluence errors within 0.5 MU. Patterns present in the FEMs and MLC control point analysis for actual patient cases agreed with the error pattern analysis results, further validating the system's ability to reveal and differentiate MLC deviations. Calculation of the fluence map based on the DMI was performed within 2 ms after receiving each DMI input. The proposed online delivery monitoring system requires minimal additional resources and time commitment to the current clinical workflow while still maintaining high sensitivity to leaf position errors and specificity to error types. The presented online delivery monitoring system therefore represents a promising QA system candidate for online adaptive radiation therapy.
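
    A much simplified version of the fluence error map (FEM) idea described in these two records can be sketched as follows: accumulate, per leaf pair and crossplane bin, the difference between the MU-weighted actual and planned apertures over all control points. The data layout, field width, and function name are assumptions made for illustration; a clinical implementation would work directly from the DMI stream.

```python
import numpy as np

def fluence_error_map(planned_cps, actual_cps, grid_mm=1.0, width_mm=400.0):
    """Build a simplified cumulative fluence error map (FEM) from planned vs
    actual MLC apertures reported at each control point.

    planned_cps / actual_cps : lists of control points; each control point is
        (mu_delivered, left_positions_mm, right_positions_mm) with one left and
        one right leaf position per leaf pair (1D arrays, in mm).
    Returns a 2D array (leaf pairs x crossplane bins) of cumulative MU error.
    """
    n_pairs = len(planned_cps[0][1])
    n_bins = int(width_mm / grid_mm)
    xs = np.linspace(-width_mm / 2, width_mm / 2, n_bins)
    fem = np.zeros((n_pairs, n_bins))
    for (mu_p, lp, rp), (mu_a, la, ra) in zip(planned_cps, actual_cps):
        for i in range(n_pairs):
            open_p = (xs > lp[i]) & (xs < rp[i])   # planned aperture row
            open_a = (xs > la[i]) & (xs < ra[i])   # actual aperture row
            fem[i] += mu_a * open_a - mu_p * open_p
    return fem
```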

  14. Online detecting system of roller wear based on laser-linear array CCD technology

    NASA Astrophysics Data System (ADS)

    Guo, Yuan

    2010-10-01

    The roller is an important metallurgical tool in the rolling mill, and the surface of a roller directly affects the quality of the rolled product. After a period of use, a roller must be repaired or replaced. Examining the profile of a working roller between rolling intervals is called online detection of roller wear. The study of online roller wear detection is very important for reasonably selecting the grinding time, reducing the number of roller exchanges, improving product quality and realizing online roller grinding. By applying laser-linear array CCD detection technology, a non-contact method for online detection of roller wear is proposed. The principle, composition and operation process of the linear array CCD detecting system are described. An error compensation algorithm is derived to offset the shift of the roller axis in this measurement system, so the stability and accuracy are improved remarkably. Experiments prove that the accuracy of the detecting system meets the demands of the practical production process. It provides a new method of high-speed and high-accuracy online detection of roller wear.

  15. Online damage detection using recursive principal component analysis and recursive condition indicators

    NASA Astrophysics Data System (ADS)

    Krishnan, M.; Bhowmik, B.; Tiwari, A. K.; Hazra, B.

    2017-08-01

    In this paper, a novel baseline-free approach for continuous online damage detection of multi-degree-of-freedom vibrating structures using recursive principal component analysis (RPCA) in conjunction with online damage indicators is proposed. In this method, the acceleration data is used to obtain recursive proper orthogonal modes online using the rank-one perturbation method, and subsequently utilized to detect the change in the dynamic behavior of the vibrating system from its pristine state to contiguous linear/nonlinear states that indicate damage. The RPCA algorithm iterates the eigenvector and eigenvalue estimates of the sample covariance matrix with each new data point at successive time instants, using the rank-one perturbation method. An online condition indicator (CI) based on the L2 norm of the error between the actual response and the response projected using recursive eigenvector matrix updates over successive iterations is proposed. This eliminates the need for offline post-processing and facilitates online damage detection, especially when applied to streaming data. The proposed CI, named recursive residual error, is also adopted for simultaneous spatio-temporal damage detection. Numerical simulations performed on a five-degree-of-freedom nonlinear system under white noise and El Centro excitations, with different levels of nonlinearity simulating the damage scenarios, demonstrate the robustness of the proposed algorithm. Successful results obtained from practical case studies, involving experiments performed on a cantilever beam subjected to earthquake excitation for full-sensor and underdetermined cases, and data from recorded responses of the UCLA Factor building (full data and its subset), demonstrate the efficacy of the proposed methodology as an ideal candidate for real-time, reference-free structural health monitoring.
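
    The recursive residual error indicator can be illustrated with a simplified sketch that keeps a running covariance, extracts the leading principal directions, and records the L2 norm of the part of each new sample that falls outside that subspace. For brevity the eigenvectors are recomputed with a full eigendecomposition rather than the rank-one perturbation update used in the paper; the forgetting factor and all names are assumptions.

```python
import numpy as np

def recursive_residual_errors(X, n_modes=2, forgetting=0.99):
    """Compute a simple recursive condition indicator for streaming sensor
    data X (numpy array, time x channels): at each step, update a running
    covariance, extract the leading principal directions, project the new
    sample onto them, and record the L2 norm of the reconstruction residual.
    A sustained rise in the residual indicates a change (damage) in the system.
    """
    n = X.shape[1]
    cov = np.eye(n) * 1e-6
    mean = np.zeros(n)
    errors = []
    for x in X:
        mean = forgetting * mean + (1 - forgetting) * x
        d = x - mean
        cov = forgetting * cov + (1 - forgetting) * np.outer(d, d)
        _, vecs = np.linalg.eigh(cov)           # full eigendecomposition here;
        V = vecs[:, -n_modes:]                  # the paper updates it rank-one
        residual = d - V @ (V.T @ d)            # part of d outside the subspace
        errors.append(np.linalg.norm(residual))
    return np.array(errors)
```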

  16. Crowdsourcing for error detection in cortical surface delineations.

    PubMed

    Ganz, Melanie; Kondermann, Daniel; Andrulis, Jonas; Knudsen, Gitte Moos; Maier-Hein, Lena

    2017-01-01

    With the recent trend toward big data analysis, neuroimaging datasets have grown substantially in the past years. While larger datasets potentially offer important insights for medical research, one major bottleneck is the requirement for resources of medical experts needed to validate automatic processing results. To address this issue, the goal of this paper was to assess whether anonymous nonexperts from an online community can perform quality control of MR-based cortical surface delineations derived by an automatic algorithm. So-called knowledge workers from an online crowdsourcing platform were asked to annotate errors in automatic cortical surface delineations on 100 central, coronal slices of MR images. On average, annotations for 100 images were obtained in less than an hour. When using expert annotations as reference, the crowd on average achieves a sensitivity of 82 % and a precision of 42 %. Merging multiple annotations per image significantly improves the sensitivity of the crowd (up to 95 %), but leads to a decrease in precision (as low as 22 %). Our experiments show that the detection of errors in automatic cortical surface delineations generated by anonymous untrained workers is feasible. Future work will focus on increasing the sensitivity of our method further, such that the error detection tasks can be handled exclusively by the crowd and expert resources can be focused on error correction.

  17. The detection error of thermal test low-frequency cable based on M sequence correlation algorithm

    NASA Astrophysics Data System (ADS)

    Wu, Dongliang; Ge, Zheyang; Tong, Xin; Du, Chunlin

    2018-04-01

    The problem of the low accuracy and low efficiency of off-line detection of thermal test low-frequency cable faults can be addressed by designing a cable fault detection system in which an FPGA exports an M-sequence code (linear feedback shift register sequence) as the pulse signal source. The design principle of the SSTDR (spread spectrum time-domain reflectometry) reflection method and the hardware setup for on-line monitoring are discussed in this paper. Testing data show that the detection error increases with the fault location along the thermal test low-frequency cable.
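
    The SSTDR reflection principle referred to in this record amounts to cross-correlating the received waveform with the injected M-sequence and converting the lag of the strongest reflection peak into a distance. The sketch below uses a generic PN reference and an assumed propagation velocity; all names and defaults are illustrative, not the authors' implementation.

```python
import numpy as np

def locate_fault(reference, received, fs_hz, velocity_m_s=2.0e8):
    """Correlation-based (SSTDR-style) fault location sketch: cross-correlate
    the received waveform with the injected PN/M-sequence reference, take the
    strongest positive-lag reflection, and convert the round-trip delay to a
    one-way distance.

    reference : the injected M-sequence samples (any +/-1 PN code works here)
    received  : samples captured at the injection point
    fs_hz     : sampling rate; velocity_m_s : propagation velocity in the cable
    """
    corr = np.correlate(received, reference, mode="full")
    lags = np.arange(-len(reference) + 1, len(received))
    pos = lags > 0                              # ignore the direct (zero-lag) path
    peak_lag = lags[pos][np.argmax(np.abs(corr[pos]))]
    delay_s = peak_lag / fs_hz
    return 0.5 * velocity_m_s * delay_s         # one-way distance to the fault
```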

  18. Subthreshold muscle twitches dissociate oscillatory neural signatures of conflicts from errors.

    PubMed

    Cohen, Michael X; van Gaal, Simon

    2014-02-01

    We investigated the neural systems underlying conflict detection and error monitoring during rapid online error correction/monitoring mechanisms. We combined data from four separate cognitive tasks and 64 subjects in which EEG and EMG (muscle activity from the thumb used to respond) were recorded. In typical neuroscience experiments, behavioral responses are classified as "error" or "correct"; however, closer inspection of our data revealed that correct responses were often accompanied by "partial errors" - a muscle twitch of the incorrect hand ("mixed correct trials," ~13% of the trials). We found that these muscle twitches dissociated conflicts from errors in time-frequency domain analyses of EEG data. In particular, both mixed-correct trials and full error trials were associated with enhanced theta-band power (4-9 Hz) compared to correct trials. However, full errors were additionally associated with power and frontal-parietal synchrony in the delta band. Single-trial robust multiple regression analyses revealed a significant modulation of theta power as a function of partial error correction time, thus linking trial-to-trial fluctuations in power to conflict. Furthermore, single-trial correlation analyses revealed a qualitative dissociation between conflict and error processing, such that mixed correct trials were associated with positive theta-RT correlations whereas full error trials were associated with negative delta-RT correlations. These findings shed new light on the local and global network mechanisms of conflict monitoring and error detection, and their relationship to online action adjustment.

  19. Permanent-File-Validation Utility Computer Program

    NASA Technical Reports Server (NTRS)

    Derry, Stephen D.

    1988-01-01

    Errors in files detected and corrected during operation. Permanent File Validation (PFVAL) utility computer program provides CDC CYBER NOS sites with mechanism to verify integrity of permanent file base. Locates and identifies permanent file errors in Mass Storage Table (MST) and Track Reservation Table (TRT), in permanent file catalog entries (PFC's) in permit sectors, and in disk sector linkage. All detected errors written to listing file and system and job day files. Program operates by reading system tables, catalog track, permit sectors, and disk linkage bytes to validate expected and actual file linkages. Used extensively to identify and locate errors in permanent files and enable online correction, reducing computer-system downtime.

  20. Learning a visuomotor rotation: simultaneous visual and proprioceptive information is crucial for visuomotor remapping.

    PubMed

    Shabbott, Britne A; Sainburg, Robert L

    2010-05-01

    Visuomotor adaptation is mediated by errors between intended and sensory-detected arm positions. However, it is not clear whether visual-based errors that are shown during the course of motion lead to qualitatively different or more efficient adaptation than errors shown after movement. For instance, continuous visual feedback mediates online error corrections, which may facilitate or inhibit the adaptation process. We addressed this question by manipulating the timing of visual error information and task instructions during a visuomotor adaptation task. Subjects were exposed to a visuomotor rotation, during which they received continuous visual feedback (CF) of hand position with instructions to correct or not correct online errors, or knowledge-of-results (KR), provided as a static hand-path at the end of each trial. Our results showed that all groups improved performance with practice, and that online error corrections were inconsequential to the adaptation process. However, in contrast to the CF groups, the KR group showed relatively small reductions in mean error with practice, increased inter-trial variability during rotation exposure, and more limited generalization across target distances and workspace. Further, although the KR group showed improved performance with practice, after-effects were minimal when the rotation was removed. These findings suggest that simultaneous visual and proprioceptive information is critical in altering neural representations of visuomotor maps, although delayed error information may elicit compensatory strategies to offset perturbations.

  1. Double ErrP Detection for Automatic Error Correction in an ERP-Based BCI Speller.

    PubMed

    Cruz, Aniana; Pires, Gabriel; Nunes, Urbano J

    2018-01-01

    Brain-computer interface (BCI) is a useful device for people with severe motor disabilities. However, due to its low speed and low reliability, BCI still has very limited application in daily real-world tasks. This paper proposes a P300-based BCI speller combined with a double error-related potential (ErrP) detection to automatically correct erroneous decisions. This novel approach introduces a second error detection to infer whether a wrong automatic correction also elicits a second ErrP. Thus, two single-trial responses, instead of one, contribute to the final selection, improving the reliability of error detection. Moreover, to increase error detection, the evoked potential detected as target by the P300 classifier is combined with the evoked error potential at the feature level. Discriminable error and positive potentials (responses to correct feedback) were clearly identified. The proposed approach was tested on nine healthy participants and one tetraplegic participant. The online average accuracies for the first and second ErrPs were 88.4% and 84.8%, respectively. With automatic correction, we achieved an improvement of around 5%, reaching 89.9% spelling accuracy at an effective rate of 2.92 symbols/min. The proposed approach revealed that double ErrP detection can improve the reliability and speed of BCI systems.

  2. Online Deviation Detection for Medical Processes

    PubMed Central

    Christov, Stefan C.; Avrunin, George S.; Clarke, Lori A.

    2014-01-01

    Human errors are a major concern in many medical processes. To help address this problem, we are investigating an approach for automatically detecting when performers of a medical process deviate from the acceptable ways of performing that process as specified by a detailed process model. Such deviations could represent errors and, thus, detecting and reporting deviations as they occur could help catch errors before harm is done. In this paper, we identify important issues related to the feasibility of the proposed approach and empirically evaluate the approach for two medical procedures, chemotherapy and blood transfusion. For the evaluation, we use the process models to generate sample process executions that we then seed with synthetic errors. The process models describe the coordination of activities of different process performers in normal, as well as in exceptional situations. The evaluation results suggest that the proposed approach could be applied in clinical settings to help catch errors before harm is done. PMID:25954343

  3. Arduino-based noise robust online heart-rate detection.

    PubMed

    Das, Sangita; Pal, Saurabh; Mitra, Madhuchhanda

    2017-04-01

    This paper introduces a noise robust real time heart rate detection system from electrocardiogram (ECG) data. An online data acquisition system is developed to collect ECG signals from human subjects. Heart rate is detected using window-based autocorrelation peak localisation technique. A low-cost Arduino UNO board is used to implement the complete automated process. The performance of the system is compared with PC-based heart rate detection technique. Accuracy of the system is validated through simulated noisy ECG data with various levels of signal to noise ratio (SNR). The mean percentage error of detected heart rate is found to be 0.72% for the noisy database with five different noise levels.
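
    The window-based autocorrelation peak localisation used here for heart-rate detection can be sketched as follows: within each window, the lag of the strongest autocorrelation peak inside the physiological range gives the mean beat-to-beat interval. This is a generic sketch, not the authors' Arduino code; the window length and rate limits are assumptions.

```python
import numpy as np

def heart_rate_bpm(ecg, fs_hz, window_s=5.0, min_bpm=40, max_bpm=200):
    """Window-based autocorrelation heart-rate estimate: within each window,
    the lag of the strongest autocorrelation peak inside the physiological
    range gives the average beat-to-beat (RR) interval.
    Returns one heart-rate value (bpm) per window.
    """
    ecg = np.asarray(ecg, dtype=float)
    win = int(window_s * fs_hz)
    lo = int(fs_hz * 60.0 / max_bpm)           # shortest plausible RR lag
    hi = int(fs_hz * 60.0 / min_bpm)           # longest plausible RR lag
    rates = []
    for start in range(0, len(ecg) - win + 1, win):
        seg = ecg[start:start + win]
        seg = seg - seg.mean()
        ac = np.correlate(seg, seg, mode="full")[win - 1:]   # lags >= 0
        lag = lo + np.argmax(ac[lo:hi])
        rates.append(60.0 * fs_hz / lag)
    return np.array(rates)
```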

  4. Runtime Verification in Context: Can Optimizing Error Detection Improve Fault Diagnosis

    NASA Technical Reports Server (NTRS)

    Dwyer, Matthew B.; Purandare, Rahul; Person, Suzette

    2010-01-01

    Runtime verification has primarily been developed and evaluated as a means of enriching the software testing process. While many researchers have pointed to its potential applicability in online approaches to software fault tolerance, there has been a dearth of work exploring the details of how that might be accomplished. In this paper, we describe how a component-oriented approach to software health management exposes the connections between program execution, error detection, fault diagnosis, and recovery. We identify both research challenges and opportunities in exploiting those connections. Specifically, we describe how recent approaches to reducing the overhead of runtime monitoring aimed at error detection might be adapted to reduce the overhead and improve the effectiveness of fault diagnosis.

  5. VizieR Online Data Catalog: 2014-2017 photometry for ASASSN-13db (Sicilia-Aguilar+, 2017)

    NASA Astrophysics Data System (ADS)

    Sicilia-Aguilar, A.; Oprandi, A.; Froebrich, D.; Fang, M.; Prieto, J. L.; Stanek, K.; Scholz, A.; Kochanek, C. S.; Henning, T.; Gredel, R.; Holoien, T. S. W.; Rabus, M.; Shappee, B. J.; Billington, S. J.; Campbell-White, J.; Zegmott, T. J.

    2017-08-01

    Table 1 contains the full photometry from the All Sky Automated Survey for Supernovae (ASAS-SN) for the variable star ASASSN-13db. Detections with their errors and 5-sigma upper limits are given. Upper limits are marked by the "<" sign and have the error column set to 99.99. (1 data file).

  6. Self-awareness assessment during cognitive rehabilitation in children with acquired brain injury: a feasibility study and proposed model of child anosognosia.

    PubMed

    Krasny-Pacini, Agata; Limond, Jennifer; Evans, Jonathan; Hiebel, Jean; Bendjelida, Karim; Chevignard, Mathilde

    2015-01-01

    To compare three ways of assessing self-awareness in children with traumatic brain injury (TBI) and to propose a model of child anosognosia. Five single cases of children with severe TBI, aged 8-14, undergoing metacognitive training. Awareness was assessed using three different measures: two measures of metacognitive knowledge/intellectual awareness (a questionnaire and illustrated stories where child characters have everyday problems related to their executive dysfunction) and one measure of on-line/emergent awareness (post-task appraisal of task difficulty). All three measures showed good feasibility. Analysis of awareness deficit scores indicated large variability (1-100%). Three children showed dissociated scores. Based on these results, we propose a model of child self-awareness and anosognosia and a framework for awareness assessment for rehabilitation purposes. The model emphasizes (1) the role of on-line error detection in the construction of autobiographical memories that allow a child to build a self-knowledge of his/her strengths and difficulties; (2) the multiple components of awareness that need to be assessed separately; (3) the implications for rehabilitation: errorless versus error-based learning, rehabilitation approaches based on metacognition, rationale for rehabilitation intervention based on child's age and impaired awareness component, ethical and developmental consideration of confrontational methods. Self-awareness has multiple components that need to be assessed separately, to better adapt cognitive rehabilitation. Using questionnaires and discrepancy scores are not sufficient to assess awareness, because it does not include on-line error detection, which can be massively impaired in children, especially those with impaired executive functions. On-line error detection is important to promote and error-based learning is useful to allow a child to build a self-knowledge of his/her strengths and difficulties, in the absence of severe episodic memory problems. Metacognitive trainings may not be appropriate for younger children who have age appropriate developmentally immature self-awareness, nor for patients with brain injury if they suffer anosognosia because of their brain injury.

  7. A dedicated on-line detecting system for auto air dryers

    NASA Astrophysics Data System (ADS)

    Shi, Chao-yu; Luo, Zai

    2013-10-01

    According to the relevant automobile industry standard and the requirements of the manufacturer, this dedicated on-line detecting system is designed to address the low automation efficiency and low detection precision of domestic auto air dryer testing. Fast automatic detection is achieved by combining computer control, mechatronics and pneumatic technology. The system can detect the performance of the pressure regulating valve and the sealability of the auto air dryer; online analytical processing of the test data is available, and data saving and querying are also supported. Experimental analysis indicates that efficient and accurate detection of auto air dryer performance is realized, with test errors of less than 3%. Moreover, we carry out a type A evaluation of the uncertainty in the test data based on Bayesian theory, and the results show that the test uncertainties of all performance parameters are less than 0.5 kPa, which fully meets the requirements of the industrial operating site.

  8. On-line bolt-loosening detection method of key components of running trains using binocular vision

    NASA Astrophysics Data System (ADS)

    Xie, Yanxia; Sun, Junhua

    2017-11-01

    Bolt loosening, a hidden fault, affects the running quality of trains and can even cause serious safety accidents. However, existing fault detection approaches based on two-dimensional images cannot detect bolt loosening due to the lack of depth information. Therefore, we propose a novel online bolt-loosening detection method using binocular vision. Firstly, a target detection model based on a convolutional neural network (CNN) is used to locate the target regions. Then, stereo matching and three-dimensional reconstruction are performed to detect bolt-loosening faults. The experimental results show that the looseness of multiple bolts can be characterized by the method simultaneously. The measurement repeatability and precision are less than 0.03 mm and 0.09 mm, respectively, and the relative error is controlled within 1.09%.

  9. Predictive monitoring of actions, EEG recordings in virtual reality.

    PubMed

    Ozkan, Duru G; Pezzetta, Rachele

    2018-04-01

    Error-related negativity (ERN) is a signal that is associated with error detection. Joch and colleagues (Joch M, Hegele M, Maurer H, Müller H, Maurer LK. J Neurophysiol 118: 486-495, 2017) successfully separated the ERN as a response to online prediction error from feedback updates. We discuss the role of ERN in action and suggest insights from virtual reality techniques; we consider the potential benefit of self-evaluation in determining the mechanisms of ERN amplitude; finally, we review the oscillatory activity that has been claimed to accompany ERN.

  10. Online Learners’ Reading Ability Detection Based on Eye-Tracking Sensors

    PubMed Central

    Zhan, Zehui; Zhang, Lei; Mei, Hu; Fong, Patrick S. W.

    2016-01-01

    The detection of university online learners’ reading ability is generally problematic and time-consuming. Thus, eye-tracking sensors have been employed in this study to record temporal and spatial human eye movements. Learners’ pupils, blinks, fixations, saccades, and regressions are recognized as primary indicators for detecting reading abilities. A computational model is established from the empirical eye-tracking data by applying a multi-feature regularization machine learning mechanism based on a low-rank constraint. The model presents good generalization ability, with an error of only 4.9% when randomly run 100 times. It has obvious advantages in saving time and improving precision, with only 20 min of testing required to predict an individual learner’s reading ability. PMID:27626418

  11. Error detection capability of a novel transmission detector: a validation study for online VMAT monitoring.

    PubMed

    Pasler, Marlies; Michel, Kilian; Marrazzo, Livia; Obenland, Michael; Pallotta, Stefania; Björnsgard, Mari; Lutterbach, Johannes

    2017-09-01

    The purpose of this study was to characterize a new single large-area ionization chamber, the integral quality monitor system (iRT, Germany), for online and real-time beam monitoring. Signal stability, monitor unit (MU) linearity and dose rate dependence were investigated for static and arc deliveries and compared to independent ionization chamber measurements. The dose verification capability of the transmission detector system was evaluated by comparing calculated and measured detector signals for 15 volumetric modulated arc therapy plans. The error detection sensitivity was tested by introducing MLC position and linac output errors. Deviations in dose distributions between the original and error-induced plans were compared in terms of detector signal deviation, dose-volume histogram (DVH) metrics and 2D γ-evaluation (2%/2 mm and 3%/3 mm). The detector signal is linearly dependent on linac output and shows negligible (<0.4%) dose rate dependence up to 460 MU min⁻¹. Signal stability is within 1% for cumulative detector output; substantial variations were observed for the segment-by-segment signal. Calculated versus measured cumulative signal deviations ranged from -0.16% to 2.25%. DVH, mean 2D γ-value and detector signal evaluations showed increasing deviations with regard to the respective reference with growing MLC and dose output errors; good correlation between DVH metrics and detector signal deviation was found (e.g. PTV Dmean: R² = 0.97). Positional MLC errors of 1 mm and errors in linac output of 2% were identified with the transmission detector system. The extensive tests performed in this investigation show that the new transmission detector provides a stable and sensitive cumulative signal output and is suitable for beam monitoring during patient treatment.

  12. Error detection capability of a novel transmission detector: a validation study for online VMAT monitoring

    NASA Astrophysics Data System (ADS)

    Pasler, Marlies; Michel, Kilian; Marrazzo, Livia; Obenland, Michael; Pallotta, Stefania; Björnsgard, Mari; Lutterbach, Johannes

    2017-09-01

    The purpose of this study was to characterize a new single large-area ionization chamber, the integral quality monitor system (iRT, Germany), for online and real-time beam monitoring. Signal stability, monitor unit (MU) linearity and dose rate dependence were investigated for static and arc deliveries and compared to independent ionization chamber measurements. The dose verification capability of the transmission detector system was evaluated by comparing calculated and measured detector signals for 15 volumetric modulated arc therapy plans. The error detection sensitivity was tested by introducing MLC position and linac output errors. Deviations in dose distributions between the original and error-induced plans were compared in terms of detector signal deviation, dose-volume histogram (DVH) metrics and 2D γ-evaluation (2%/2 mm and 3%/3 mm). The detector signal is linearly dependent on linac output and shows negligible (<0.4%) dose rate dependence up to 460 MU min⁻¹. Signal stability is within 1% for cumulative detector output; substantial variations were observed for the segment-by-segment signal. Calculated versus measured cumulative signal deviations ranged from -0.16% to 2.25%. DVH, mean 2D γ-value and detector signal evaluations showed increasing deviations with regard to the respective reference with growing MLC and dose output errors; good correlation between DVH metrics and detector signal deviation was found (e.g. PTV Dmean: R² = 0.97). Positional MLC errors of 1 mm and errors in linac output of 2% were identified with the transmission detector system. The extensive tests performed in this investigation show that the new transmission detector provides a stable and sensitive cumulative signal output and is suitable for beam monitoring during patient treatment.
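
    For readers unfamiliar with the 2D γ-evaluation quoted in these two records (2%/2 mm and 3%/3 mm), a brute-force global gamma pass-rate computation looks roughly like the sketch below. It assumes two dose maps on identical grids, a global dose-difference criterion, and a low-dose cutoff; it is illustrative only and far slower than clinical implementations.

```python
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm, dd_percent=3.0, dta_mm=3.0, cutoff=0.1):
    """Brute-force global 2D gamma analysis (e.g. 3%/3 mm): for every reference
    pixel above a low-dose cutoff, search nearby evaluated pixels and take the
    minimum combined dose-difference / distance-to-agreement metric.
    Returns the fraction of evaluated reference pixels with gamma <= 1.
    """
    dd = dd_percent / 100.0 * ref.max()              # global dose criterion
    search = int(np.ceil(2 * dta_mm / spacing_mm))   # neighbourhood half-width
    ny, nx = ref.shape
    passed = total = 0
    for i in range(ny):
        for j in range(nx):
            if ref[i, j] < cutoff * ref.max():
                continue                             # skip low-dose region
            best = np.inf
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    k, l = i + di, j + dj
                    if not (0 <= k < ny and 0 <= l < nx):
                        continue
                    dist2 = (di**2 + dj**2) * spacing_mm**2
                    dose2 = (eval_[k, l] - ref[i, j])**2
                    best = min(best, dist2 / dta_mm**2 + dose2 / dd**2)
            total += 1
            passed += best <= 1.0                    # gamma^2 <= 1 means pass
    return passed / max(total, 1)
```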

  13. Online Error Reporting for Managing Quality Control Within Radiology.

    PubMed

    Golnari, Pedram; Forsberg, Daniel; Rosipko, Beverly; Sunshine, Jeffrey L

    2016-06-01

    Information technology systems within health care, such as the picture archiving and communication system (PACS) in radiology, can have a positive impact on production but can also risk compromising quality. The widespread use of PACS has removed the previous feedback loop between radiologists and technologists. Instead of direct communication of quality discrepancies found for an examination, the radiologist submitted a paper-based quality-control report. A web-based issue-reporting tool can help restore some of the feedback loop and also provide possibilities for more detailed analysis of submitted errors. The purpose of this study was to evaluate the hypothesis that data from use of an online error reporting software for quality control can focus our efforts within our department. For the 372,258 radiologic examinations conducted during the 6-month study period, 930 errors (390 exam protocol, 390 exam validation, and 150 exam technique) were submitted, corresponding to an error rate of 0.25 %. Within the exam protocol category, technologist documentation had the highest number of submitted errors in ultrasonography (77 errors [44 %]), while imaging protocol errors were the highest error subtype for the computed tomography modality (35 errors [18 %]). Positioning and incorrect accession had the highest errors in the exam technique and exam validation error categories, respectively, for nearly all of the modalities. An error rate of less than 1 % could signify a system with very high quality; however, a more likely explanation is that not all errors were detected or reported. Furthermore, staff reception of the error reporting system could also affect the reporting rate.

  14. Differences in the Gambling Behavior of Online and Non-Online Student Gamblers in a Controlled Laboratory Environment

    PubMed Central

    Montes, Kevin S.; Weatherly, Jeffrey N.

    2016-01-01

    Although research suggests that approximately 1 in 4 college students report having gambled online, few laboratory-based studies have been conducted enlisting online student gamblers. Moreover, it is unclear the extent to which differences in gambling behavior exist between online and non-online student gamblers. The current study examined if online gamblers would play more hands, commit more errors, and wager more credits than non-online student gamblers in a controlled, laboratory environment. Online (n = 19) and non-online (n = 26) student gamblers played video poker in three separate sessions and the number of hands played, errors committed, and credits wagered were recorded. Results showed that online student gamblers played more hands and committed more errors playing video poker than non-online student gamblers. The results from the current study extend previous research by suggesting that online gamblers engage in potentially more deleterious gambling behavior (e.g., playing more hands and committing more errors) than non-online gamblers. Additional research is needed to examine differences in the gambling behavior of online and non-online gamblers in a controlled, laboratory environment. PMID:27106027

  15. System identification for modeling for control of flexible structures

    NASA Technical Reports Server (NTRS)

    Mettler, Edward; Milman, Mark

    1986-01-01

    The major components of a design and operational flight strategy for flexible structure control systems are presented. In this strategy an initial distributed parameter control design is developed and implemented from available ground test data and on-orbit identification using sophisticated modeling and synthesis techniques. The reliability of this high performance controller is directly linked to the accuracy of the parameters on which the design is based. Because uncertainties inevitably grow without system monitoring, maintaining the control system requires an active on-line system identification function to supply parameter updates and covariance information. Control laws can then be modified to improve performance when the error envelopes are decreased. In terms of system safety and stability the covariance information is of equal importance to the parameter values themselves. If the on-line system ID function detects an increase in parameter error covariances, then corresponding adjustments must be made in the control laws to increase robustness. If the error covariances exceed some threshold, an autonomous calibration sequence could be initiated to restore the error envelopes to an acceptable level.

  16. Integrating Online and Offline Three-Dimensional Deep Learning for Automated Polyp Detection in Colonoscopy Videos.

    PubMed

    Lequan Yu; Hao Chen; Qi Dou; Jing Qin; Pheng Ann Heng

    2017-01-01

    Automated polyp detection in colonoscopy videos has been demonstrated to be a promising way for colorectal cancer prevention and diagnosis. Traditional manual screening is time consuming, operator dependent, and error prone; hence, automated detection approach is highly demanded in clinical practice. However, automated polyp detection is very challenging due to high intraclass variations in polyp size, color, shape, and texture, and low interclass variations between polyps and hard mimics. In this paper, we propose a novel offline and online three-dimensional (3-D) deep learning integration framework by leveraging the 3-D fully convolutional network (3D-FCN) to tackle this challenging problem. Compared with the previous methods employing hand-crafted features or 2-D convolutional neural network, the 3D-FCN is capable of learning more representative spatio-temporal features from colonoscopy videos, and hence has more powerful discrimination capability. More importantly, we propose a novel online learning scheme to deal with the problem of limited training data by harnessing the specific information of an input video in the learning process. We integrate offline and online learning to effectively reduce the number of false positives generated by the offline network and further improve the detection performance. Extensive experiments on the dataset of MICCAI 2015 Challenge on Polyp Detection demonstrated the better performance of our method when compared with other competitors.

  17. Differences in the Gambling Behavior of Online and Non-online Student Gamblers in a Controlled Laboratory Environment.

    PubMed

    Montes, Kevin S; Weatherly, Jeffrey N

    2017-03-01

    Although research suggests that approximately 1 in 4 college students report having gambled online, few laboratory-based studies have been conducted enlisting online student gamblers. Moreover, it is unclear the extent to which differences in gambling behavior exist between online and non-online student gamblers. The current study examined if online gamblers would play more hands, commit more errors, and wager more credits than non-online student gamblers in a controlled, laboratory environment. Online (n = 19) and non-online (n = 26) student gamblers played video poker in three separate sessions and the number of hands played, errors committed, and credits wagered were recorded. Results showed that online student gamblers played more hands and committed more errors playing video poker than non-online student gamblers. The results from the current study extend previous research by suggesting that online gamblers engage in potentially more deleterious gambling behavior (e.g., playing more hands and committing more errors) than non-online gamblers. Additional research is needed to examine differences in the gambling behavior of online and non-online gamblers in a controlled, laboratory environment.

  18. Using medication list--problem list mismatches as markers of potential error.

    PubMed Central

    Carpenter, James D.; Gorman, Paul N.

    2002-01-01

    The goal of this project was to specify and develop an algorithm that checks for drug and problem list mismatches in an electronic medical record (EMR). The algorithm is based on the premise that a patient's problem list and medication list should agree, and a mismatch may indicate a medication error. Successful development of this algorithm could mean detection of some errors, such as medication orders entered into the wrong patient record or drug therapy omissions, that are not otherwise detected via automated means. Additionally, mismatches may identify opportunities to improve problem list integrity. To assess the concept's feasibility, this study compared medications listed in a pharmacy information system with findings in an online nursing adult admission assessment, serving as a proxy for the problem list. Where drug and problem list mismatches were discovered, examination of the patient record confirmed the mismatch and identified any potential causes. Evaluation of the algorithm in diabetes treatment indicates that it successfully detects both potential medication errors and opportunities to improve problem list completeness. This algorithm, once fully developed and deployed, could prove a valuable way to improve the patient problem list and could decrease the risk of medication error. PMID:12463796
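
    The premise behind the algorithm can be pictured with a minimal sketch. The drug-to-indication table and the function below are hypothetical simplifications (the published algorithm is not reproduced here); they only show how a medication with no supporting entry on the problem list would be flagged for review.

    ```python
    # Hypothetical, drastically simplified indication table.
    DRUG_INDICATIONS = {
        "metformin": {"diabetes mellitus"},
        "insulin glargine": {"diabetes mellitus"},
        "lisinopril": {"hypertension", "heart failure"},
    }

    def find_mismatches(medication_list, problem_list):
        """Flag medications with no matching problem on the problem list."""
        problems = {p.lower() for p in problem_list}
        flagged = []
        for drug in medication_list:
            indications = DRUG_INDICATIONS.get(drug.lower(), set())
            if not indications & problems:
                flagged.append(drug)   # possible wrong-patient order or an
                                       # incomplete problem list
        return flagged

    print(find_mismatches(["metformin", "lisinopril"], ["hypertension"]))
    # ['metformin'] -> either an erroneous order or a missing problem entry
    ```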

  19. Detection and Correction of Silent Data Corruption for Large-Scale High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiala, David J; Mueller, Frank; Engelmann, Christian

    Faults have become the norm rather than the exception for high-end computing on clusters with tens or hundreds of thousands of cores. Exacerbating this situation, some of these faults remain undetected, manifesting themselves as silent errors that corrupt memory while applications continue to operate and report incorrect results. This paper studies the potential for redundancy to both detect and correct soft errors in MPI message-passing applications. Our study investigates the challenges inherent to detecting soft errors within MPI applications while providing transparent MPI redundancy. By assuming a model wherein corruption in application data manifests itself by producing differing MPI message data between replicas, we study the best suited protocols for detecting and correcting MPI data that are the result of corruption. To experimentally validate our proposed detection and correction protocols, we introduce RedMPI, an MPI library which resides in the MPI profiling layer. RedMPI is capable of both online detection and correction of soft errors that occur in MPI applications, without requiring any modifications to the application source, by utilizing either double or triple redundancy. Our results indicate that our most efficient consistency protocol can successfully protect applications experiencing even high rates of silent data corruption with runtime overheads between 0% and 30% compared to unprotected applications without redundancy. Using our fault injector within RedMPI, we observe that even a single soft error can have profound effects on running applications, causing a cascading pattern of corruption that in most cases spreads to all other processes. RedMPI's protection has been shown to successfully mitigate the effects of soft errors while allowing applications to complete with correct results even in the face of errors.
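
    The comparison-and-voting idea behind replica-based detection can be sketched without any MPI machinery. The snippet below is plain Python under stated assumptions (message payloads as byte strings, SHA-256 digests, simple majority voting); it is not RedMPI's implementation, but it shows why double redundancy only detects corruption while triple redundancy can also correct it.

    ```python
    import hashlib

    def check_replicas(messages):
        """Hash each replica's outgoing payload and majority-vote."""
        digests = [hashlib.sha256(m).hexdigest() for m in messages]
        counts = {d: digests.count(d) for d in set(digests)}
        winner, votes = max(counts.items(), key=lambda kv: kv[1])
        if votes == len(messages):
            return "clean", messages[0]
        if votes >= 2:                          # triple redundancy: correctable
            return "corrected", messages[digests.index(winner)]
        return "detected, uncorrectable", None  # 2 replicas or a 3-way split

    print(check_replicas([b"result=42", b"result=42", b"result=42"]))
    print(check_replicas([b"result=42", b"result=99", b"result=42"]))
    ```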

  20. Set-up uncertainties: online correction with X-ray volume imaging.

    PubMed

    Kataria, Tejinder; Abhishek, Ashu; Chadha, Pranav; Nandigam, Janardhan

    2011-01-01

    To determine interfractional three-dimensional set-up errors using X-ray volumetric imaging (XVI). Between December 2007 and August 2009, 125 patients underwent image-guided radiotherapy using online XVI. After matching of reference and acquired volume view images, set-up errors in the three translational directions were recorded and corrected online before treatment each day. Mean displacements, population systematic (Σ) and random (σ) errors were calculated and analyzed using SPSS (v16) software. The optimum clinical target volume (CTV) to planning target volume (PTV) margin was calculated using Van Herk's (2.5Σ + 0.7σ) and Stroom's (2Σ + 0.7σ) formulas. Patients were grouped into four cohorts: brain, head and neck, thorax, and abdomen-pelvis. The mean vector displacements recorded were 0.18 cm, 0.15 cm, 0.36 cm, and 0.35 cm for brain, head and neck, thorax, and abdomen-pelvis, respectively. Analysis of individual mean set-up errors revealed good agreement with the proposed 0.3 cm isotropic margins for brain and 0.5 cm isotropic margins for head and neck. Similarly, the proposed 0.5 cm circumferential and 1 cm craniocaudal margins were in agreement with the thorax and abdomen-pelvis cases. The calculated mean displacements were well within the CTV-PTV margin estimates of Van Herk (90% population coverage to a minimum of 95% of the prescribed dose) and Stroom (99% target volume coverage by 95% of the prescribed dose). Employing these individualized margins in a particular cohort ensures target coverage comparable to that described in the literature, which is further improved if XVI-aided set-up error detection and correction is used before treatment.
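
    The two margin recipes quoted above are simple linear combinations of the population systematic error Σ and random error σ, so they are easy to reproduce. The snippet below implements exactly those formulas; the example numbers are illustrative only and are not values reported by the study.

    ```python
    def van_herk_margin(sigma_sys, sigma_rand):
        """Van Herk CTV-to-PTV margin recipe: 2.5*Sigma + 0.7*sigma."""
        return 2.5 * sigma_sys + 0.7 * sigma_rand

    def stroom_margin(sigma_sys, sigma_rand):
        """Stroom CTV-to-PTV margin recipe: 2*Sigma + 0.7*sigma."""
        return 2.0 * sigma_sys + 0.7 * sigma_rand

    # Illustrative cohort with Sigma = 0.15 cm and sigma = 0.20 cm.
    print(round(van_herk_margin(0.15, 0.20), 2))  # about 0.52 cm
    print(round(stroom_margin(0.15, 0.20), 2))    # about 0.44 cm
    ```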

  1. Time-critical Database Condition Data Handling in the CMS Experiment During the First Data Taking Period

    NASA Astrophysics Data System (ADS)

    Cavallari, Francesca; de Gruttola, Michele; Di Guida, Salvatore; Govi, Giacomo; Innocente, Vincenzo; Pfeiffer, Andreas; Pierro, Antonio

    2011-12-01

    Automatic, synchronous and reliable population of the condition databases is critical for the correct operation of the online selection as well as of the offline reconstruction and analysis of data. In this complex infrastructure, monitoring and fast detection of errors is a very challenging task. In this paper, we describe the CMS experiment system to process and populate the Condition Databases and make condition data promptly available both online for the high-level trigger and offline for reconstruction. The data are automatically collected using centralized jobs or are "dropped" by the users in dedicated services (offline and online drop-box), which synchronize them and take care of writing them into the online database. Then they are automatically streamed to the offline database, and thus are immediately accessible offline worldwide. The condition data are managed by different users using a wide range of applications. In normal operation the database monitor is used to provide simple timing information and the history of all transactions for all database accounts, and in the case of faults it is used to return simple error messages and more complete debugging information.

  2. How do Community Pharmacies Recover from E-prescription Errors?

    PubMed Central

    Odukoya, Olufunmilola K.; Stone, Jamie A.; Chui, Michelle A.

    2014-01-01

    Background: The use of e-prescribing is increasing annually, with over 788 million e-prescriptions received in US pharmacies in 2012. Approximately 9% of e-prescriptions have medication errors. Objective: To describe the process used by community pharmacy staff to detect, explain, and correct e-prescription errors. Methods: The error recovery conceptual framework was employed for data collection and analysis. Thirteen pharmacists and 14 technicians from five community pharmacies in Wisconsin participated in the study. A combination of data collection methods was utilized, including direct observations, interviews, and focus groups. The transcription and content analysis of recordings were guided by the three-step error recovery model. Results: Most of the e-prescription errors were detected during the entering of information into the pharmacy system. These errors were detected by both pharmacists and technicians using a variety of strategies, which included: (1) performing double checks of e-prescription information; (2) printing the e-prescription to paper and confirming the information on the computer screen against the paper printout; and (3) using colored pens to highlight important information. Strategies used for explaining errors included: (1) careful review of the patient's medication history; (2) pharmacist consultation with patients; (3) consultation with another pharmacy team member; and (4) use of online resources. In order to correct e-prescription errors, participants made educated guesses about the prescriber's intent or contacted the prescriber via telephone or fax. When e-prescription errors were encountered in the community pharmacies, the primary goal of participants was to get the order right for patients by verifying the prescriber's intent. Conclusion: Pharmacists and technicians play an important role in preventing e-prescription errors through the detection of errors and the verification of prescribers' intent. Future studies are needed to examine factors that facilitate or hinder recovery from e-prescription errors. PMID:24373898

  3. Masked and unmasked error-related potentials during continuous control and feedback

    NASA Astrophysics Data System (ADS)

    Lopes Dias, Catarina; Sburlea, Andreea I.; Müller-Putz, Gernot R.

    2018-06-01

    The detection of error-related potentials (ErrPs) in tasks with discrete feedback is well established in the brain-computer interface (BCI) field. However, the decoding of ErrPs in tasks with continuous feedback is still in its early stages. Objective. We developed a task in which subjects have continuous control of a cursor's position by means of a joystick. The cursor's position was shown to the participants in two different modalities of continuous feedback: normal and jittered. The jittered feedback was created to mimic the instability that could exist if participants controlled the trajectory directly with brain signals. Approach. This paper studies the electroencephalographic (EEG)-measurable signatures caused by a loss of control over the cursor's trajectory, causing a target miss. Main results. In both feedback modalities, time-locked potentials revealed the typical frontal-central components of error-related potentials. Errors occurring during the jittered feedback (masked errors) were delayed in comparison to errors occurring during normal feedback (unmasked errors). Masked errors displayed lower peak amplitudes than unmasked errors. Time-locked classification analysis allowed a good distinction between correct and error classes (average Cohen's kappa, average TPR = 81.8% and average TNR = 96.4%). Time-locked classification analysis between masked error and unmasked error classes revealed results at chance level (average Cohen's kappa, average TPR = 60.9% and average TNR = 58.3%). Afterwards, we performed asynchronous detection of ErrPs, combining both masked and unmasked trials. The asynchronous detection of ErrPs in a simulated online scenario resulted in an average TNR of 84.0% and an average TPR of 64.9%. Significance. The time-locked classification results suggest that the masked and unmasked errors were indistinguishable in terms of classification. The asynchronous classification results suggest that the feedback modality did not hinder the asynchronous detection of ErrPs.

  4. Persistent aerial video registration and fast multi-view mosaicing.

    PubMed

    Molina, Edgardo; Zhu, Zhigang

    2014-05-01

    Capturing aerial imagery at high resolutions often leads to very low frame rate video streams, well under full motion video standards, due to bandwidth, storage, and cost constraints. Low frame rates make registration difficult when an aircraft is moving at high speeds or when the global positioning system (GPS) contains large errors or fails. We present a method that takes advantage of persistent cyclic video data collections to perform an online registration with drift correction. We split the persistent aerial imagery collection into individual cycles of the scene, identify and correct the registration errors on the first cycle in a batch operation, and then use the corrected base cycle as a reference pass to register and correct subsequent passes online. A set of multi-view panoramic mosaics is then constructed for each aerial pass for representation, presentation, and exploitation of the 3D dynamic scene. These sets of mosaics are all aligned to the reference cycle, allowing their direct use in change detection, tracking, and 3D reconstruction/visualization algorithms. Stereo viewing with adaptive baselines and varying view angles is realized by choosing a pair of mosaics from a set of multi-view mosaics. Further, the mosaics for the second and later passes can be generated and visualized online, as there is no further batch error correction.

  5. SBL-Online: Implementing Studio-Based Learning Techniques in an Online Introductory Programming Course to Address Common Programming Errors and Misconceptions

    ERIC Educational Resources Information Center

    Polo, Blanca J.

    2013-01-01

    Much research has been done regarding student programming errors, online education, and studio-based learning (SBL) in computer science education. This study furthers this area by bringing together this knowledge and applying it to proactively help students overcome impasses caused by common student programming errors. This project proposes a…

  6. Automatic on-line detection system design research on internal defects of metal materials based on optical fiber F-P sensing technology

    NASA Astrophysics Data System (ADS)

    Xia, Liu; Shan, Ning; Chao, Ban; Caoshan, Wang

    2016-10-01

    Metal materials are widely used in aerospace and other industrial fields because of their excellent characteristics, so detecting their internal defects is very important. Ultrasound is widely used for nondestructive detection, but conventional ultrasound detection instruments have shortcomings such as a low level of intelligence and long development cycles, which limit their development. In this paper, the theory of ultrasound detection is analyzed and a computational method for locating the position of internal defects is given. A non-contact optical fiber F-P interference cavity structure is designed and the initial cavity length is specified. A real-time on-line ultrasound detection setup for internal defects of metal materials is established based on the optical fiber F-P sensing system, a virtual instrument for automated ultrasound defect detection is developed with LabVIEW software, and an experimental study is carried out. The results show that this system can effectively locate internal defects of engineering structures in real time and on line, with higher measurement precision and a relative error of 6.7%, meeting the requirements of engineering practice. The system is simple to operate and easy to implement, and the software has a friendly interface, good expansibility, and a high level of intelligence.
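
    The basic locating computation referred to above is the standard pulse-echo relation: the defect depth follows from half the echo time of flight times the sound velocity in the material. The snippet below is a generic sketch with assumed numbers (the velocity and echo time are illustrative, not the paper's data).

    ```python
    STEEL_VELOCITY_M_S = 5900.0      # typical longitudinal velocity in steel (assumed)

    def defect_depth_mm(time_of_flight_s, velocity_m_s=STEEL_VELOCITY_M_S):
        # Divide by two: the ultrasonic pulse travels to the defect and back.
        return velocity_m_s * time_of_flight_s / 2.0 * 1000.0

    echo_at = 3.4e-6                 # hypothetical 3.4 microsecond echo
    print(round(defect_depth_mm(echo_at), 2), "mm")   # about 10.03 mm
    ```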

  7. Dynamic modelling and estimation of the error due to asynchronism in a redundant asynchronous multiprocessor system

    NASA Technical Reports Server (NTRS)

    Huynh, Loc C.; Duval, R. W.

    1986-01-01

    The use of redundant asynchronous multiprocessor systems to achieve ultrareliable fault-tolerant control systems shows great promise. Development has been hampered by the inability to determine whether differences in the outputs of redundant CPUs are due to failures or to accrued error built up by slight differences in CPU clock intervals. This study derives an analytical dynamic model of the difference between redundant CPUs due to differences in their clock intervals and uses this model with on-line parameter identification to identify the differences in the clock intervals. The ability of this methodology to accurately track errors due to asynchronism allows the generation of an error signal with the effect of asynchronism removed; this signal may be used to detect and isolate actual system failures.

  8. A novel onset detection technique for brain-computer interfaces using sound-production related cognitive tasks in simulated-online system

    NASA Astrophysics Data System (ADS)

    Song, YoungJae; Sepulveda, Francisco

    2017-02-01

    Objective. Self-paced EEG-based BCIs (SP-BCIs) have traditionally been avoided due to two sources of uncertainty: (1) precisely when an intentional command is sent by the brain, i.e., the command onset detection problem, and (2) how different the intentional command is when compared to non-specific (or idle) states. Performance evaluation is also a problem, and there are no suitable standard metrics available. In this paper we attempted to tackle these issues. Approach. Self-paced covert sound-production cognitive tasks (i.e., high-pitch and siren-like sounds) were used to distinguish between intentional commands (IC) and idle states. The IC states were chosen for their ease of execution and negligible overlap with common cognitive states. Band power and a digital wavelet transform were used for feature extraction, and the Davies-Bouldin index was used for feature selection. Classification was performed using linear discriminant analysis. Main results. Performance was evaluated under offline and simulated-online conditions. For the latter, a performance score called the true-false-positive (TFP) rate, ranging from 0 (poor) to 100 (perfect), was created to take into account both classification performance and onset timing errors. Averaging the results from the best performing IC task for all seven participants, a 77.7% true-positive (TP) rate was achieved in offline testing. For the simulated-online analysis, the best IC average TFP score was 76.67% (87.61% TP rate, 4.05% false-positive rate). Significance. Results were promising when compared to previous IC onset detection studies using motor imagery, in which the best TP rates were reported as 72.0% and 79.7%, and which, crucially, did not take timing errors into account. Moreover, based on our literature review, there is no previous covert sound-production onset detection system for SP-BCIs. Results showed that the proposed onset detection technique and TFP performance metric have good potential for use in SP-BCIs.

  9. Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.

    1992-01-01

    The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.

  10. Clustering and Recurring Anomaly Identification: Recurring Anomaly Detection System (ReADS)

    NASA Technical Reports Server (NTRS)

    McIntosh, Dawn

    2006-01-01

    This viewgraph presentation reviews the Recurring Anomaly Detection System (ReADS), a tool to analyze text reports, such as aviation reports and maintenance records: (1) text clustering algorithms group large quantities of reports and documents, reducing human error and fatigue; (2) it identifies interconnected reports, automating the discovery of possible recurring anomalies; and (3) it provides a visualization of the clusters and recurring anomalies. We have illustrated our techniques on data from Shuttle and ISS discrepancy reports, as well as ASRS data. ReADS has been integrated with a secure online search

  11. SU-C-BRA-03: An Automated and Quick Contour Error Detection for Auto Segmentation in Online Adaptive Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, J; Ates, O; Li, X

    Purpose: To develop a tool that can quickly and automatically assess the quality of contours generated by auto segmentation during online adaptive replanning. Methods: Due to the strict time requirements of online replanning and the lack of 'ground truth' contours in daily images, our method starts with assessing image registration accuracy, focusing on the surface of the organ in question. Several metrics tightly related to registration accuracy, including Jacobian maps, contour shell deformation, and voxel-based root mean square (RMS) analysis, were computed. To identify correct contours, additional metrics and an adaptive decision tree are introduced. As a proof of principle, tests were performed with CT sets (planning and daily CTs) acquired using a CT-on-rails system during routine CT-guided RT delivery for 20 prostate cancer patients. The contours generated on the daily CTs using an auto-segmentation tool (ADMIRE, Elekta, MIM), based on deformable image registration of the planning CT and daily CT, were tested. Results: The deformed contours of the 20 patients, with a total of 60 structures, were manually checked as baselines; overall, 49% of the contours were incorrect. To evaluate the quality of the local deformation, the Jacobian determinant (1.047±0.045) on the contours was analyzed. In an analysis of the deformed rectum contour shells, a higher error-contour detection rate (0.41) was obtained compared with 0.32 for the manual check. All automated detections took less than 5 seconds. Conclusion: The proposed method can effectively detect contour errors at both micro and macro scales by evaluating multiple deformable registration metrics in a parallel computing process. Future work will focus on improving practicability and optimizing the calculation algorithms and metric selection.
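
    Of the registration-accuracy metrics listed above, the Jacobian determinant is the most self-contained, and a generic version is sketched below. This is not the ADMIRE or decision-tree implementation; it only shows how a voxel-wise Jacobian determinant of a displacement field can be computed with NumPy (values near 1 indicate locally volume-preserving deformation, values at or below 0 indicate folding).

    ```python
    import numpy as np

    def jacobian_determinant(dvf):
        """Voxel-wise Jacobian determinant of a 3-D displacement field.

        dvf: array of shape (3, nz, ny, nx) with displacements in voxel units.
        """
        grads = [np.gradient(dvf[i]) for i in range(3)]   # d(u_i)/d(z, y, x)
        jac = np.zeros((3, 3) + dvf.shape[1:])
        for i in range(3):
            for j in range(3):
                jac[i, j] = grads[i][j]
                if i == j:
                    jac[i, j] += 1.0          # identity part of x + u(x)
        # Move the 3x3 axes to the end so the determinant is taken voxel-wise.
        return np.linalg.det(np.moveaxis(jac, (0, 1), (-2, -1)))

    rng = np.random.default_rng(0)
    dvf = 0.01 * rng.standard_normal((3, 8, 8, 8))        # near-identity field
    det = jacobian_determinant(dvf)
    print(det.mean().round(3), det.min().round(3))        # both close to 1.0
    ```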

  12. Error correcting mechanisms during antisaccades: contribution of online control during primary saccades and offline control via secondary saccades.

    PubMed

    Bedi, Harleen; Goltz, Herbert C; Wong, Agnes M F; Chandrakumar, Manokaraananthan; Niechwiej-Szwedo, Ewa

    2013-01-01

    Errors in eye movements can be corrected during the ongoing saccade through in-flight modifications (i.e., online control), or by programming a secondary eye movement (i.e., offline control). In a reflexive saccade task, the oculomotor system can use extraretinal information (i.e., efference copy) online to correct errors in the primary saccade, and offline retinal information to generate a secondary corrective saccade. The purpose of this study was to examine the error correction mechanisms in the antisaccade task. The roles of extraretinal and retinal feedback in maintaining eye movement accuracy were investigated by presenting visual feedback at the spatial goal of the antisaccade. We found that online control for antisaccade is not affected by the presence of visual feedback; that is, whether or not visual feedback is present, the duration of the deceleration interval was extended and significantly correlated with reduced antisaccade endpoint error. We postulate that the extended duration of deceleration is a feature of online control during volitional saccades to improve their endpoint accuracy. We found that secondary saccades were generated more frequently in the antisaccade task compared to the reflexive saccade task. Furthermore, we found evidence for a greater contribution from extraretinal sources of feedback in programming the secondary "corrective" saccades in the antisaccade task. Nonetheless, secondary saccades were more corrective for the remaining antisaccade amplitude error in the presence of visual feedback of the target. Taken together, our results reveal a distinctive online error control strategy through an extension of the deceleration interval in the antisaccade task. Target feedback does not improve online control; rather, it improves the accuracy of secondary saccades in the antisaccade task.

  13. Error Correcting Mechanisms during Antisaccades: Contribution of Online Control during Primary Saccades and Offline Control via Secondary Saccades

    PubMed Central

    Bedi, Harleen; Goltz, Herbert C.; Wong, Agnes M. F.; Chandrakumar, Manokaraananthan; Niechwiej-Szwedo, Ewa

    2013-01-01

    Errors in eye movements can be corrected during the ongoing saccade through in-flight modifications (i.e., online control), or by programming a secondary eye movement (i.e., offline control). In a reflexive saccade task, the oculomotor system can use extraretinal information (i.e., efference copy) online to correct errors in the primary saccade, and offline retinal information to generate a secondary corrective saccade. The purpose of this study was to examine the error correction mechanisms in the antisaccade task. The roles of extraretinal and retinal feedback in maintaining eye movement accuracy were investigated by presenting visual feedback at the spatial goal of the antisaccade. We found that online control for antisaccade is not affected by the presence of visual feedback; that is, whether or not visual feedback is present, the duration of the deceleration interval was extended and significantly correlated with reduced antisaccade endpoint error. We postulate that the extended duration of deceleration is a feature of online control during volitional saccades to improve their endpoint accuracy. We found that secondary saccades were generated more frequently in the antisaccade task compared to the reflexive saccade task. Furthermore, we found evidence for a greater contribution from extraretinal sources of feedback in programming the secondary “corrective” saccades in the antisaccade task. Nonetheless, secondary saccades were more corrective for the remaining antisaccade amplitude error in the presence of visual feedback of the target. Taken together, our results reveal a distinctive online error control strategy through an extension of the deceleration interval in the antisaccade task. Target feedback does not improve online control; rather, it improves the accuracy of secondary saccades in the antisaccade task. PMID:23936308

  14. On-line measurement of diameter of hot-rolled steel tube

    NASA Astrophysics Data System (ADS)

    Zhu, Xueliang; Zhao, Huiying; Tian, Ailing; Li, Bin

    2015-02-01

    The aim was to design an online diameter measurement system for a hot-rolled seamless steel tube production line. On the one hand, such a system can stimulate the development of domestic pipe measurement technology; on the other hand, it can give domestic hot-rolled seamless steel tube enterprises strong product competitiveness at low cost. Based on an analysis and comparison of various detection methods and techniques, this paper adopts a CCD camera-based online caliper design. The system mainly comprises a hardware measurement portion and an image processing section and, combined with software control technology and image processing technology, can complete the online measurement of hot tube diameter. Taking into account the complexity of the actual job site, a relatively simple and reasonable layout was chosen. The image processing section mainly addresses camera calibration and the application of Matlab functions, so that the diameter is calculated from the image by the algorithm and displayed directly. A simulation platform was built in the last phase of the design; images were successfully collected and processed, proving the feasibility and rationality of the design with an error of less than 2%. The design successfully uses photoelectric detection technology to solve real production problems.
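
    The core image-to-diameter step is a simple scaling from pixels to millimetres via a calibration factor, sketched below. All numbers here are hypothetical and only illustrate the kind of sub-2% relative error the design targets; this is not the Matlab implementation described above.

    ```python
    MM_PER_PIXEL = 0.42          # hypothetical calibration factor from a known target

    def tube_diameter_mm(edge_left_px, edge_right_px, mm_per_pixel=MM_PER_PIXEL):
        """Diameter = edge-to-edge width in pixels times the calibration factor."""
        return abs(edge_right_px - edge_left_px) * mm_per_pixel

    measured = tube_diameter_mm(118, 402)     # 284 px across the tube image
    nominal = 120.0                           # hypothetical nominal diameter, mm
    relative_error = abs(measured - nominal) / nominal
    print(round(measured, 2), "mm,", round(100 * relative_error, 2), "% error")
    ```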

  15. Experimental investigation of false positive errors in auditory species occurrence surveys

    USGS Publications Warehouse

    Miller, David A.W.; Weir, Linda A.; McClintock, Brett T.; Grant, Evan H. Campbell; Bailey, Larissa L.; Simons, Theodore R.

    2012-01-01

    False positive errors are a significant component of many ecological data sets, which in combination with false negative errors, can lead to severe biases in conclusions about ecological systems. We present results of a field experiment where observers recorded observations for known combinations of electronically broadcast calling anurans under conditions mimicking field surveys to determine species occurrence. Our objectives were to characterize false positive error probabilities for auditory methods based on a large number of observers, to determine if targeted instruction could be used to reduce false positive error rates, and to establish useful predictors of among-observer and among-species differences in error rates. We recruited 31 observers, ranging in abilities from novice to expert, that recorded detections for 12 species during 180 calling trials (66,960 total observations). All observers made multiple false positive errors and on average 8.1% of recorded detections in the experiment were false positive errors. Additional instruction had only minor effects on error rates. After instruction, false positive error probabilities decreased by 16% for treatment individuals compared to controls with broad confidence interval overlap of 0 (95% CI: -46 to 30%). This coincided with an increase in false negative errors due to the treatment (26%; -3 to 61%). Differences among observers in false positive and in false negative error rates were best predicted by scores from an online test and a self-assessment of observer ability completed prior to the field experiment. In contrast, years of experience conducting call surveys was a weak predictor of error rates. False positive errors were also more common for species that were played more frequently, but were not related to the dominant spectral frequency of the call. Our results corroborate other work that demonstrates false positives are a significant component of species occurrence data collected by auditory methods. Instructing observers to only report detections they are completely certain are correct is not sufficient to eliminate errors. As a result, analytical methods that account for false positive errors will be needed, and independent testing of observer ability is a useful predictor for among-observer variation in observation error rates.

  16. A hybrid strategy of offline adaptive planning and online image guidance for prostate cancer radiotherapy.

    PubMed

    Lei, Yu; Wu, Qiuwen

    2010-04-21

    Offline adaptive radiotherapy (ART) has been used to effectively correct and compensate for prostate motion and reduce the required margin. The efficacy depends on the characteristics of the patient setup error and interfraction motion through the whole treatment; specifically, systematic errors are corrected and random errors are compensated for through the margins. In online image-guided radiation therapy (IGRT) of prostate cancer, the translational setup error and inter-fractional prostate motion are corrected through pre-treatment imaging and couch correction at each fraction. However, the rotation and deformation of the target are not corrected and only accounted for with margins in treatment planning. The purpose of this study was to investigate whether the offline ART strategy is necessary for an online IGRT protocol and to evaluate the benefit of the hybrid strategy. First, to investigate the rationale of the hybrid strategy, 592 cone-beam-computed tomography (CBCT) images taken before and after each fraction for an online IGRT protocol from 16 patients were analyzed. Specifically, the characteristics of prostate rotation were analyzed. It was found that there exist systematic inter-fractional prostate rotations, and they are patient specific. These rotations, if not corrected, are persistent through the treatment fraction, and rotations detected in early fractions are representative of those in later fractions. These findings suggest that the offline adaptive replanning strategy is beneficial to the online IGRT protocol with further margin reductions. Second, to quantitatively evaluate the benefit of the hybrid strategy, 412 repeated helical CT scans from 25 patients during the course of treatment were included in the replanning study. Both low-risk patients (LRP, clinical target volume, CTV = prostate) and intermediate-risk patients (IRP, CTV = prostate + seminal vesicles) were included in the simulation. The contours of prostate and seminal vesicles were delineated on each CT. The benefit of margin reduction to compensate for both rotation and deformation in the hybrid strategy was evaluated geometrically. With the hybrid strategy, the planning margins can be reduced by 1.4 mm for LRP, and 2.0 mm for IRP, compared with the standard online IGRT only, to maintain the same 99% target volume coverage. The average relative reduction in planning target volume (PTV) based on the internal target volume (ITV) from PTV based on CTV is 19% for LRP, and 27% for IRP.

  17. Online 3D EPID-based dose verification: Proof of concept.

    PubMed

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
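
    The per-frame dose check described above compares a small set of summary metrics between the planned and reconstructed dose. The sketch below is a toy version under stated assumptions (doses in Gy, D2 taken as the 98th percentile, a 5% tolerance, synthetic arrays); it is not the authors' EPID reconstruction code.

    ```python
    import numpy as np

    def dose_checks(planned, reconstructed, target_mask, tolerance=0.05):
        """Compare mean dose in the target and mean / near-maximum dose (D2)
        in the non-target region receiving at least 10 cGy."""
        nontarget = (~target_mask) & (reconstructed >= 0.10)   # >= 10 cGy
        metrics = {
            "target_mean": (planned[target_mask].mean(),
                            reconstructed[target_mask].mean()),
            "nontarget_mean": (planned[nontarget].mean(),
                               reconstructed[nontarget].mean()),
            "nontarget_D2": (np.percentile(planned[nontarget], 98),
                             np.percentile(reconstructed[nontarget], 98)),
        }
        alerts = {name: abs(rec - plan) / plan > tolerance
                  for name, (plan, rec) in metrics.items()}
        return metrics, alerts

    rng = np.random.default_rng(1)
    planned = rng.uniform(0.0, 2.0, size=(20, 20, 20))
    target_mask = np.zeros_like(planned, dtype=bool)
    target_mask[8:12, 8:12, 8:12] = True
    reconstructed = planned * 1.08            # simulate an 8% overdosage
    _, alerts = dose_checks(planned, reconstructed, target_mask)
    print(alerts)                             # every metric exceeds the 5% tolerance
    ```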

  18. Development of a Versatile Laser-Ultrasonic System and Application to the Online Measurement for Process Control of Wall Thickness and Eccentricity of Seamless Tubes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robert V. Kolarik II

    2002-10-23

    A system for the online, non-contact measurement of wall thickness in steel seamless mechanical tubing has been developed and demonstrated at a tubing production line at the Timken Company in Canton, Ohio. The system utilizes laser-generation of ultrasound and laser-detection of time of flight with interferometry, laser-doppler velocimetry and pyrometry, all with fiber coupling. Accuracy (<1% error) and precision (1.5%) are at targeted levels. Cost and energy savings have exceeded estimates. The system has shown good reliability in measuring over 200,000 tubes in its first six months of deployment.

  19. Electrophysiological Correlates of Error Monitoring and Feedback Processing in Second Language Learning.

    PubMed

    Bultena, Sybrine; Danielmeier, Claudia; Bekkering, Harold; Lemhöfer, Kristin

    2017-01-01

    Humans monitor their behavior to optimize performance, which presumably relies on stable representations of correct responses. During second language (L2) learning, however, stable representations have yet to be formed while knowledge of the first language (L1) can interfere with learning, which in some cases results in persistent errors. In order to examine how correct L2 representations are stabilized, this study examined performance monitoring in the learning process of second language learners for a feature that conflicts with their first language. Using EEG, we investigated if L2 learners in a feedback-guided word gender assignment task showed signs of error detection in the form of an error-related negativity (ERN) before and after receiving feedback, and how feedback is processed. The results indicated that initially, response-locked negativities for correct (CRN) and incorrect (ERN) responses were of similar size, showing a lack of internal error detection when L2 representations are unstable. As behavioral performance improved following feedback, the ERN became larger than the CRN, pointing to the first signs of successful error detection. Additionally, we observed a second negativity following the ERN/CRN components, the amplitude of which followed a similar pattern as the previous negativities. Feedback-locked data indicated robust FRN and P300 effects in response to negative feedback across different rounds, demonstrating that feedback remained important in order to update memory representations during learning. We thus show that initially, L2 representations may often not be stable enough to warrant successful error monitoring, but can be stabilized through repeated feedback, which means that the brain is able to overcome L1 interference, and can learn to detect errors internally after a short training session. The results contribute a different perspective to the discussion on changes in ERN and FRN components in relation to learning, by extending the investigation of these effects to the language learning domain. Furthermore, these findings provide a further characterization of the online learning process of L2 learners.

  20. Patients' online access to their electronic health records and linked online services: a systematic interpretative review.

    PubMed

    de Lusignan, Simon; Mold, Freda; Sheikh, Aziz; Majeed, Azeem; Wyatt, Jeremy C; Quinn, Tom; Cavill, Mary; Gronlund, Toto Anne; Franco, Christina; Chauhan, Umesh; Blakey, Hannah; Kataria, Neha; Barker, Fiona; Ellis, Beverley; Koczan, Phil; Arvanitis, Theodoros N; McCarthy, Mary; Jones, Simon; Rafi, Imran

    2014-09-08

    To investigate the effect of providing patients online access to their electronic health record (EHR) and linked transactional services on the provision, quality and safety of healthcare. The objectives are also to identify and understand: barriers and facilitators for providing online access to their records and services for primary care workers; and their association with organisational/IT system issues. Primary care. A total of 143 studies were included. 17 were experimental in design and subject to risk of bias assessment, which is reported in a separate paper. Detailed inclusion and exclusion criteria have also been published elsewhere in the protocol. Our primary outcome measure was change in quality or safety as a result of implementation or utilisation of online records/transactional services. No studies reported changes in health outcomes, though eight detected medication errors and seven reported improved uptake of preventative care. Professional concerns over privacy were reported in 14 studies. 18 studies reported concern over a potential increased workload, with some showing an increased workload in email or online messaging, telephone contact remaining unchanged, and face-to-face contact staying the same or falling. Owing to heterogeneity in reporting, overall workload change was hard to predict. 10 studies reported that online access offered convenience, primarily for more advantaged patients, who were largely highly satisfied with the process when clinician responses were prompt. Patient online access and services offer increased convenience and satisfaction. However, professionals were concerned about the impact on workload and the risk to privacy. Studies correcting medication errors may improve patient safety. There may need to be a redesign of the business process to engage health professionals in online access and of the EHR to make it friendlier and provide equity of access to a wider group of patients. Systematic review registration number: PROSPERO CRD42012003091.

  1. Patients’ online access to their electronic health records and linked online services: a systematic interpretative review

    PubMed Central

    de Lusignan, Simon; Mold, Freda; Sheikh, Aziz; Majeed, Azeem; Wyatt, Jeremy C; Quinn, Tom; Cavill, Mary; Gronlund, Toto Anne; Franco, Christina; Chauhan, Umesh; Blakey, Hannah; Kataria, Neha; Barker, Fiona; Ellis, Beverley; Koczan, Phil; Arvanitis, Theodoros N; McCarthy, Mary; Jones, Simon; Rafi, Imran

    2014-01-01

    Objectives To investigate the effect of providing patients online access to their electronic health record (EHR) and linked transactional services on the provision, quality and safety of healthcare. The objectives are also to identify and understand: barriers and facilitators for providing online access to their records and services for primary care workers; and their association with organisational/IT system issues. Setting Primary care. Participants A total of 143 studies were included. 17 were experimental in design and subject to risk of bias assessment, which is reported in a separate paper. Detailed inclusion and exclusion criteria have also been published elsewhere in the protocol. Primary and secondary outcome measures Our primary outcome measure was change in quality or safety as a result of implementation or utilisation of online records/transactional services. Results No studies reported changes in health outcomes, though eight detected medication errors and seven reported improved uptake of preventative care. Professional concerns over privacy were reported in 14 studies. 18 studies reported concern over a potential increased workload, with some showing an increased workload in email or online messaging, telephone contact remaining unchanged, and face-to-face contact staying the same or falling. Owing to heterogeneity in reporting, overall workload change was hard to predict. 10 studies reported that online access offered convenience, primarily for more advantaged patients, who were largely highly satisfied with the process when clinician responses were prompt. Conclusions Patient online access and services offer increased convenience and satisfaction. However, professionals were concerned about the impact on workload and the risk to privacy. Studies correcting medication errors may improve patient safety. There may need to be a redesign of the business process to engage health professionals in online access and of the EHR to make it friendlier and provide equity of access to a wider group of patients. Systematic review registration number PROSPERO CRD42012003091. PMID:25200561

  2. Optical Fiber On-Line Detection System for Non-Touch Monitoring Roller Shape

    NASA Astrophysics Data System (ADS)

    Guo, Y.; Wang, Y. T.

    2006-10-01

    Based on the principle of the reflective displacement fiber-optic sensor, a high-accuracy non-contact on-line optical fiber measurement system for roller shape is presented, and the principle, composition, and operation of the detection system are explained. By using a novel probe with three optical fibers at equal transverse spacing, the effects of fluctuations in the light source, changes in the reflectivity of the target surface, and intensity losses in the fiber lines are automatically compensated. Meanwhile, an optical fiber sensor model for correcting static error based on a BP artificial neural network (ANN) is set up. Using interpolation and value filtering to process the signals effectively reduces the influence of random noise and of roller bearing vibration, remarkably enhancing accuracy and resolution. Experiments prove that the accuracy of the system meets the demands of the practical production process, providing a new method for high-speed, accurate, automatic on-line detection of mill roller shape.

  3. Volcanic ash modeling with the NMMB-MONARCH-ASH model: quantification of offline modeling errors

    NASA Astrophysics Data System (ADS)

    Marti, Alejandro; Folch, Arnau

    2018-03-01

    Volcanic ash modeling systems are used to simulate the atmospheric dispersion of volcanic ash and to generate forecasts that quantify the impacts from volcanic eruptions on infrastructures, air quality, aviation, and climate. The efficiency of response and mitigation actions is directly associated with the accuracy of the volcanic ash cloud detection and modeling systems. Operational forecasts build on offline coupled modeling systems in which meteorological variables are updated at the specified coupling intervals. Despite the concerns from other communities regarding the accuracy of this strategy, the quantification of the systematic errors and shortcomings associated with the offline modeling systems has received no attention. This paper employs the NMMB-MONARCH-ASH model to quantify these errors by employing different quantitative and categorical evaluation scores. The skills of the offline coupling strategy are compared against those from an online forecast considered to be the best estimate of the true outcome. Case studies are considered for a synthetic eruption with constant eruption source parameters and for two historical events, which suitably illustrate the severe aviation disruptive effects of European (2010 Eyjafjallajökull) and South American (2011 Cordón Caulle) volcanic eruptions. Evaluation scores indicate that systematic errors due to the offline modeling are of the same order of magnitude as those associated with the source term uncertainties. In particular, traditional offline forecasts employed in operational model setups can result in significant uncertainties, failing to reproduce, in the worst cases, up to 45-70 % of the ash cloud of an online forecast. These inconsistencies are anticipated to be even more relevant in scenarios in which the meteorological conditions change rapidly in time. The outcome of this paper encourages operational groups responsible for real-time advisories for aviation to consider employing computationally efficient online dispersal models.

  4. Use of pump current modulation of diode laser for increased sensitivity of detection of 13CO2 in human exhaled breath

    NASA Astrophysics Data System (ADS)

    Kireev, S. V.; Kondrashov, A. A.; Shnyrev, S. L.; Safagaraev, A. P.

    2018-03-01

    This paper reports that the use of a lock-in detection technique, with pump current modulation of a diode laser operating near a wavelength of 2 µm, allows the sensitivity of online detection of 13CO2 in expired air to be improved by more than three orders of magnitude. The 13CO2 detection sensitivity achieved in the paper is 60 ppb, with an error in the 13CO2 concentration measured in exhaled breath of 2.9% for an optical path length of 60 cm.
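
    The lock-in idea underlying pump-current modulation can be reproduced in a few lines: multiply the detector signal by in-phase and quadrature references at the modulation frequency and average over an integer number of periods. The numbers below (sample rate, modulation frequency, amplitude, noise level) are illustrative assumptions, not the paper's experimental parameters.

    ```python
    import numpy as np

    fs, f_mod, duration = 100_000.0, 1_000.0, 0.5         # Hz, Hz, s
    t = np.arange(0.0, duration, 1.0 / fs)
    true_amplitude = 2.0e-3                               # weak absorption signal
    clean = true_amplitude * np.sin(2 * np.pi * f_mod * t + 0.3)
    noisy = clean + 0.02 * np.random.default_rng(2).standard_normal(t.size)

    ref_i = np.sin(2 * np.pi * f_mod * t)                 # in-phase reference
    ref_q = np.cos(2 * np.pi * f_mod * t)                 # quadrature reference
    x = np.mean(noisy * ref_i)                            # averaging over whole
    y = np.mean(noisy * ref_q)                            # periods acts as a low-pass
    recovered = 2.0 * np.hypot(x, y)                      # demodulated amplitude
    print(f"recovered amplitude ~ {recovered:.1e}")       # close to 2e-3
    ```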

  5. WE-G-BRA-01: Patient Safety and Treatment Quality Improvement Through Incident Learning: Experience of a Non-Academic Proton Therapy Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Y; Johnson, R; Zhao, L

    2015-06-15

    Purpose: Incident learning has been proven to improve patient safety and treatment quality in conventional radiation therapy. However, to our knowledge its application in proton therapy has not yet been reported. In this study, we report our experience in developing and implementing an in-house incident learning system. Methods: An incident learning system was developed about 18 months ago based on published principles and tailored to our clinical practice and available resources. The system includes four layers of error detection and reporting: (1) dosimetry peer review; (2) physicist plan quality assurance (QA); (3) treatment delivery issues recorded by the on-call physicist; and (4) other incident reports. The first two layers of QA and reporting were mandatory for each treatment plan, through easy-to-use spreadsheets accessible only to the dosimetry and physics departments. Treatment delivery issues were recorded case by case by the on-call physicist. All other incidents were reported through an online incident report system, which can be anonymous. The incident reports include near misses in planning and delivery, process deviations, machine issues, workflow, and documentation. Periodic incident reviews were performed. Results: In total, about 116 errors were reported through dosimetry review, 137 errors through plan QA, 83 treatment issues through the physics on-call record, and 30 through the online incident report system. Only 8 incidents (2.2%) were considered to have a clinical impact on patients; the rest of the errors were either detected before reaching patients or had negligible dosimetric impact (<5% dose variance). Personnel training and process improvements were implemented upon periodic incident review. Conclusion: An incident learning system can be helpful in personnel training, error reduction, and patient safety and treatment quality improvement. The system needs to be catered to each clinic's practice and available resources. Incident and knowledge sharing among proton centers are encouraged.

  6. RACER: Effective Race Detection Using AspectJ

    NASA Technical Reports Server (NTRS)

    Bodden, Eric; Havelund, Klaus

    2008-01-01

    Programming errors occur frequently in large software systems, and even more so if these systems are concurrent. In the past, researchers have developed specialized programs to aid programmers in detecting concurrent programming errors such as deadlocks, livelocks, starvation and data races. In this work we propose a language extension to the aspect-oriented programming language AspectJ, in the form of three new built-in pointcuts, lock(), unlock() and maybeShared(), which allow programmers to monitor program events where locks are granted or handed back, and where values are accessed that may be shared amongst multiple Java threads. We decide thread-locality using a static thread-local objects analysis developed by others. Using the three new primitive pointcuts, researchers can directly implement efficient monitoring algorithms to detect concurrent programming errors online. As an example, we expose a new algorithm, which we call RACER, an adaptation of the well-known Eraser algorithm to the memory model of Java. We implemented the new pointcuts as an extension to the AspectBench Compiler, implemented the RACER algorithm using this language extension, and then applied the algorithm to the NASA K9 Rover Executive. Our experiments proved our implementation very effective: in the Rover Executive, RACER finds 70 data races, only one of which was previously known. We further applied the algorithm to two other multi-threaded programs written by computer science researchers, in which we found races as well.
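
    Since the record above includes no code, the sketch below illustrates the Eraser-style lockset refinement that RACER adapts, written as plain Python rather than the paper's AspectJ extension. Variable and lock names are invented for the example; the point is only that a shared variable whose candidate lockset becomes empty has no consistent protecting lock and is reported as a potential race.

    ```python
    class LocksetChecker:
        """Minimal Eraser-style lockset refinement (illustrative only)."""

        def __init__(self):
            self.candidate_locks = {}            # variable -> set of lock names

        def access(self, variable, locks_held):
            held = set(locks_held)
            if variable not in self.candidate_locks:
                self.candidate_locks[variable] = held
            else:
                self.candidate_locks[variable] &= held   # lockset intersection
            if not self.candidate_locks[variable]:
                print(f"potential data race on {variable!r}")

    checker = LocksetChecker()
    checker.access("rover.position", {"stateLock"})      # protected access
    checker.access("rover.position", {"stateLock"})      # still protected
    checker.access("rover.position", set())              # unprotected -> reported
    ```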

  7. The planning coordinator: A design architecture for autonomous error recovery and on-line planning of intelligent tasks

    NASA Technical Reports Server (NTRS)

    Farah, Jeffrey J.

    1992-01-01

    Developing a robust, task level, error recovery and on-line planning architecture is an open research area. There is previously published work on both error recovery and on-line planning; however, none incorporates error recovery and on-line planning into one integrated platform. The integration of these two functionalities requires an architecture that possesses the following characteristics. The architecture must provide for the inclusion of new information without the destruction of existing information. The architecture must provide for the relating of pieces of information, old and new, to one another in a non-trivial rather than trivial manner (e.g., object one is related to object two under the following constraints, versus, yes, they are related; no, they are not related). Finally, the architecture must be not only a stand alone architecture, but also one that can be easily integrated as a supplement to some existing architecture. This thesis proposal addresses architectural development. Its intent is to integrate error recovery and on-line planning onto a single, integrated, multi-processor platform. This intelligent x-autonomous platform, called the Planning Coordinator, will be used initially to supplement existing x-autonomous systems and eventually replace them.

  8. Toward Adversarial Online Learning and the Science of Deceptive Machines

    DTIC Science & Technology

    2015-11-14

    noise. Adversaries can take advantage of this inherent blind spot to avoid detection (mimicry). Adversarial label noise is the intentional switching...of classification labels leading to deterministic noise, error that the model cannot capture due to its generalization bias. An experiment in user...potentially infinite and with imperfect information. We will combine Monte-Carlo tree search (MCTS) with reinforcement learning because the manipulation

  9. Assessing the Library Homepages of COPLAC Institutions for Section 508 Accessibility Errors: Who's Accessible, Who's Not, and How the Online WebXACT Assessment Tool Can Help

    ERIC Educational Resources Information Center

    Huprich, Julia; Green, Ravonne

    2007-01-01

    The Council on Public Liberal Arts Colleges (COPLAC) library websites were assessed for Section 508 errors using the online WebXACT tool. Only three of the twenty-one institutions (14%) had zero accessibility errors. Eighty-six percent of the COPLAC institutions had an average of 1.24 errors. Section 508 compliance is required for institutions…

  10. Online patient safety education programme for junior doctors: is it worthwhile?

    PubMed

    McCarthy, S E; O'Boyle, C A; O'Shaughnessy, A; Walsh, G

    2016-02-01

    Increasing demand exists for blended approaches to the development of professionalism. Trainees of the Royal College of Physicians of Ireland participated in an online patient safety programme. The study aims were: (1) to determine whether the programme improved junior doctors' knowledge, attitudes and skills relating to error reporting, open communication and care for the second victim; and (2) to establish whether the methodology facilitated participants' learning. 208 junior doctors who completed the programme completed an online pre-questionnaire. Measures were "patient safety knowledge and attitudes", "medical safety climate" and "experience of learning". Sixty-two completed the post-questionnaire, representing a 30% matched response rate. Participating in the programme resulted in immediate (p < 0.01) improvement in skills such as knowing when and how to complete incident forms and disclosing errors to patients, in self-rated knowledge (p < 0.01), and in attitudes towards error reporting (p < 0.01). Sixty-three per cent disagreed that doctors routinely report medical errors and 42% disagreed that doctors routinely share information about medical errors and what caused them. Participants rated interactive features as the most positive elements of the programme. An online training programme on medical error improved self-rated knowledge, attitudes and skills in junior doctors and was deemed an effective learning tool. Perceptions of workplace issues, such as a poor culture of error reporting among doctors, may prevent improved attitudes being realised in practice. Online patient safety education has a role in practice-based initiatives aimed at developing professionalism and improving safety.

  11. Error Recovery in the Time-Triggered Paradigm with FTT-CAN.

    PubMed

    Marques, Luis; Vasconcelos, Verónica; Pedreiras, Paulo; Almeida, Luís

    2018-01-11

    Data networks are naturally prone to interferences that can corrupt messages, leading to performance degradation or even to critical failure of the corresponding distributed system. To improve resilience of critical systems, time-triggered networks are frequently used, based on communication schedules defined at design-time. These networks offer prompt error detection, but slow error recovery that can only be compensated with bandwidth overprovisioning. On the contrary, the Flexible Time-Triggered (FTT) paradigm uses online traffic scheduling, which enables a compromise between error detection and recovery that can achieve timely recovery with a fraction of the needed bandwidth. This article presents a new method to recover transmission errors in a time-triggered Controller Area Network (CAN) network, based on the Flexible Time-Triggered paradigm, namely FTT-CAN. The method is based on using a server (traffic shaper) to regulate the retransmission of corrupted or omitted messages. We show how to design the server to simultaneously: (1) meet a predefined reliability goal, when considering worst case error recovery scenarios bounded probabilistically by a Poisson process that models the fault arrival rate; and, (2) limit the direct and indirect interference in the message set, preserving overall system schedulability. Extensive simulations with multiple scenarios, based on practical and randomly generated systems, show a reduction of two orders of magnitude in the average bandwidth taken by the proposed error recovery mechanism, when compared with traditional approaches available in the literature based on adding extra pre-defined transmission slots.
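
    As a rough illustration of the probabilistic sizing idea described above, the sketch below picks the smallest number of retransmission slots per recovery window such that the probability of more corrupted messages than slots, under a Poisson fault-arrival model, stays below the allowed failure probability. The function names, fault rate, window length and reliability goal are illustrative assumptions and are not taken from the article.

        import math

        def poisson_cdf(k, mu):
            """P(N <= k) for a Poisson random variable with mean mu."""
            return sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k + 1))

        def server_budget(fault_rate_per_s, window_s, reliability_goal):
            """Smallest number of retransmission slots per window such that the
            probability of more corrupted messages than slots stays below the
            allowed failure probability (1 - reliability_goal)."""
            mu = fault_rate_per_s * window_s          # expected faults per window
            allowed_failure = 1.0 - reliability_goal
            k = 0
            while 1.0 - poisson_cdf(k, mu) > allowed_failure:
                k += 1
            return k

        # e.g. 1e-3 faults/s, a 100 ms recovery window and a 1 - 1e-9 reliability goal
        print(server_budget(1e-3, 0.1, 1 - 1e-9))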

  12. Error Recovery in the Time-Triggered Paradigm with FTT-CAN

    PubMed Central

    Pedreiras, Paulo; Almeida, Luís

    2018-01-01

    Data networks are naturally prone to interferences that can corrupt messages, leading to performance degradation or even to critical failure of the corresponding distributed system. To improve resilience of critical systems, time-triggered networks are frequently used, based on communication schedules defined at design-time. These networks offer prompt error detection, but slow error recovery that can only be compensated with bandwidth overprovisioning. On the contrary, the Flexible Time-Triggered (FTT) paradigm uses online traffic scheduling, which enables a compromise between error detection and recovery that can achieve timely recovery with a fraction of the needed bandwidth. This article presents a new method to recover transmission errors in a time-triggered Controller Area Network (CAN) network, based on the Flexible Time-Triggered paradigm, namely FTT-CAN. The method is based on using a server (traffic shaper) to regulate the retransmission of corrupted or omitted messages. We show how to design the server to simultaneously: (1) meet a predefined reliability goal, when considering worst case error recovery scenarios bounded probabilistically by a Poisson process that models the fault arrival rate; and, (2) limit the direct and indirect interference in the message set, preserving overall system schedulability. Extensive simulations with multiple scenarios, based on practical and randomly generated systems, show a reduction of two orders of magnitude in the average bandwidth taken by the proposed error recovery mechanism, when compared with traditional approaches available in the literature based on adding extra pre-defined transmission slots. PMID:29324723

  13. A Sequential Analysis of Responses in Online Debates to Postings of Students Exhibiting High Versus Low Grammar and Spelling Errors

    ERIC Educational Resources Information Center

    Jeong, Allan; Li, Haiying; Pan, Andy Jiaren

    2017-01-01

    Given that grammatical and spelling errors have been found to influence perceived competence and credibility in written communication, this study examined how a student's grammar and spelling errors affect how other students respond to the student's postings in four online debates hosted in asynchronous threaded discussions. Message-response…

  14. Investigating the error sources of the online state of charge estimation methods for lithium-ion batteries in electric vehicles

    NASA Astrophysics Data System (ADS)

    Zheng, Yuejiu; Ouyang, Minggao; Han, Xuebing; Lu, Languang; Li, Jianqiu

    2018-02-01

    State of charge (SOC) estimation is generally acknowledged as one of the most important functions of the battery management system for lithium-ion batteries in new energy vehicles. Although every effort is made to increase the estimation accuracy of the various online SOC estimation methods as much as possible within the limited on-chip resources, little of the literature discusses the error sources of those methods. This paper first reviews the commonly studied SOC estimation methods under a conventional classification, and proposes a novel perspective focusing on error analysis of these methods. SOC estimation methods are analyzed from the viewpoints of the measured values, models, algorithms and state parameters. Error flow charts are then proposed to trace the error sources from signal measurement through the models and algorithms for the online SOC estimation methods widely used in new energy vehicles. Finally, taking working conditions into consideration, the choice of more reliable and applicable SOC estimation methods is discussed, and the future development of the most promising online SOC estimation methods is suggested.
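
    To make one of the error sources concrete (errors in the measured values), the following toy sketch shows how a small current-sensor bias accumulates into SOC error under plain coulomb counting. It is an illustrative example only, with assumed numbers, and is not drawn from the paper's error flow charts.

        import numpy as np

        def coulomb_counting(current_a, dt_s, capacity_ah, soc0=1.0):
            """Ampere-hour integration: SOC decreases as charge is drawn."""
            drawn_ah = np.cumsum(current_a) * dt_s / 3600.0
            return soc0 - drawn_ah / capacity_ah

        # True discharge current vs. a measurement with a small sensor offset
        dt, capacity = 1.0, 50.0                     # 1 s samples, 50 Ah cell
        true_current = np.full(3600, 10.0)           # 10 A for one hour
        measured = true_current + 0.05               # 50 mA sensor offset

        soc_true = coulomb_counting(true_current, dt, capacity)
        soc_est = coulomb_counting(measured, dt, capacity)
        print(f"accumulated SOC error after 1 h: {abs(soc_est[-1] - soc_true[-1]):.4%}")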

  15. Online 3D EPID-based dose verification: Proof of concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozenda

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5–10 s irradiation time. Conclusions: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
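
    A minimal sketch of the volume-level comparison described above (mean dose in the target and non-target volumes, near-maximum dose D2 in the non-target volume) is given below, assuming voxelized dose arrays in Gy and boolean region masks. The tolerance value and demo data are illustrative assumptions and do not reflect the clinical system.

        import numpy as np

        def d_near_max(dose, mask, percent=2.0):
            """Dose received by the hottest `percent` of the masked volume (D2)."""
            return np.percentile(dose[mask], 100.0 - percent)

        def verify(planned, reconstructed, target_mask, tol_percent=10.0):
            """Compare planned vs. reconstructed 3D dose in two regions:
            the target volume and the non-target volume receiving >= 10 cGy."""
            nontarget_mask = (~target_mask) & (planned >= 0.10)   # doses in Gy
            checks = {
                "target mean": (planned[target_mask].mean(),
                                reconstructed[target_mask].mean()),
                "non-target mean": (planned[nontarget_mask].mean(),
                                    reconstructed[nontarget_mask].mean()),
                "non-target D2": (d_near_max(planned, nontarget_mask),
                                  d_near_max(reconstructed, nontarget_mask)),
            }
            for name, (plan, recon) in checks.items():
                dev = 100.0 * (recon - plan) / plan
                print(f"{name}: planned {plan:.2f} Gy, reconstructed {recon:.2f} Gy "
                      f"({dev:+.1f}%)  {'OK' if abs(dev) <= tol_percent else 'ALERT'}")

        # Toy demo with random dose grids
        rng = np.random.default_rng(0)
        planned = rng.uniform(0.0, 2.0, size=(20, 20, 20))
        target = planned > 1.5
        verify(planned, planned * 1.03, target)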

  16. Online measurement of bead geometry in GMAW-based additive manufacturing using passive vision

    NASA Astrophysics Data System (ADS)

    Xiong, Jun; Zhang, Guangjun

    2013-11-01

    Additive manufacturing based on gas metal arc welding is an advanced technique for depositing fully dense components with low cost. Despite this fact, techniques to achieve accurate control and automation of the process have not yet been perfectly developed. The online measurement of the deposited bead geometry is a key problem for reliable control. In this work a passive vision-sensing system, comprising two cameras and composite filtering techniques, was proposed for real-time detection of the bead height and width through deposition of thin walls. The nozzle to the top surface distance was monitored for eliminating accumulated height errors during the multi-layer deposition process. Various image processing algorithms were applied and discussed for extracting feature parameters. A calibration procedure was presented for the monitoring system. Validation experiments confirmed the effectiveness of the online measurement system for bead geometry in layered additive manufacturing.

  17. It Pays to Go Off-Track: Practicing with Error-Augmenting Haptic Feedback Facilitates Learning of a Curve-Tracing Task

    PubMed Central

    Williams, Camille K.; Tremblay, Luc; Carnahan, Heather

    2016-01-01

    Researchers in the domain of haptic training are now entering the long-standing debate regarding whether or not it is best to learn a skill by experiencing errors. Haptic training paradigms provide fertile ground for exploring how various theories about feedback, errors and physical guidance intersect during motor learning. Our objective was to determine how error minimizing, error augmenting and no haptic feedback while learning a self-paced curve-tracing task impact performance on delayed (1 day) retention and transfer tests, which indicate learning. We assessed performance using movement time and tracing error to calculate a measure of overall performance – the speed accuracy cost function. Our results showed that despite exhibiting the worst performance during skill acquisition, the error augmentation group had significantly better accuracy (but not overall performance) than the error minimization group on delayed retention and transfer tests. The control group’s performance fell between that of the two experimental groups but was not significantly different from either on the delayed retention test. We propose that the nature of the task (requiring online feedback to guide performance) coupled with the error augmentation group’s frequent off-target experience and rich experience of error-correction promoted information processing related to error-detection and error-correction that are essential for motor learning. PMID:28082937

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Z.; Pike, R.W.; Hertwig, T.A.

    An effective approach for source reduction in chemical plants has been demonstrated using on-line optimization with flowsheeting (ASPEN PLUS) for process optimization and parameter estimation and the Tjoa-Biegler algorithm implemented in a mathematical programming language (GAMS/MINOS) for data reconciliation and gross error detection. Results for a Monsanto sulfuric acid plant with a Bailey distributed control system showed a 25% reduction in the sulfur dioxide emissions and a 17% improvement in the profit over the current operating conditions. Details of the methods used are described.
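
    For readers unfamiliar with data reconciliation, the sketch below shows the classical weighted least-squares reconciliation step with a global chi-square test for gross errors on a toy flow split. It illustrates the general idea only; it is not the Tjoa-Biegler algorithm itself, and the network and numbers are assumptions, not the plant model used in the study.

        import numpy as np
        from scipy import stats

        def reconcile(x_meas, cov, A, b=None):
            """Weighted least-squares reconciliation of measurements x subject to
            linear balance constraints A x = b, plus a global chi-square test for
            gross errors (classical linear reconciliation, not Tjoa-Biegler)."""
            b = np.zeros(A.shape[0]) if b is None else b
            r = A @ x_meas - b                        # constraint residuals
            S = A @ cov @ A.T                         # residual covariance
            x_hat = x_meas - cov @ A.T @ np.linalg.solve(S, r)
            gamma = float(r @ np.linalg.solve(S, r))  # global test statistic
            crit = stats.chi2.ppf(0.95, df=A.shape[0])
            return x_hat, gamma, gamma > crit         # True -> gross error suspected

        # Toy flow network: stream 1 splits into streams 2 and 3 (x1 - x2 - x3 = 0)
        A = np.array([[1.0, -1.0, -1.0]])
        cov = np.diag([0.5**2, 0.3**2, 0.3**2])
        x_hat, gamma, gross = reconcile(np.array([100.0, 61.0, 42.0]), cov, A)
        print(x_hat, gamma, gross)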

  19. Caveat emptor: Erroneous safety information about opioids in online drug-information compendia.

    PubMed

    Talwar, Sonia R; Randhawa, Amarita S; Dankiewicz, Erica H; Crudele, Nancy T; Haddox, J David

    2016-01-01

    Healthcare professionals and consumers refer to online drug-information compendia (eg, Epocrates and WebMD) to learn about prescription medications, including opioid analgesics. With the significant risks associated with opioids, including abuse, misuse, and addiction, any of which can result in life-threatening overdose, it is important for those seeking information from online compendia to have access to current, accurate, and complete drug information to help support clinical treatment decisions. Although compendia are informative, readily available, and user friendly, studies have shown that they may contain errors. To review and identify misinformation in drug summaries of online drug-information compendia for selected opioid analgesic products and submit content corrections to the respective editors. Between 2011 and 2013, drug summaries for Purdue's prescription opioid analgesic products from seven leading online drug-information compendia were systematically reviewed, and the requests for corrections were retrospectively categorized and classified. At least 2 months following requests, the same compendia were then reexamined to assess the degree of error resolution. A total of 859 errors were identified, with the greatest percentage in Safety and Patient Education categories. Across the seven compendia, the complete or partial resolution of errors was 34 percent; therefore, nearly two thirds of the identified errors remain. The results of this analysis, consistent with past studies, demonstrate that online drug-information compendia may contain inaccurate information. Healthcare professionals and consumers must be informed of potential misinformation so they may consider using multiple resources to obtain accurate and current drug information, thereby helping to ensure safer use of prescription medications, such as opioids.

  20. A New KE-Free Online ICALL System Featuring Error Contingent Feedback

    ERIC Educational Resources Information Center

    Tokuda, Naoyuki; Chen, Liang

    2004-01-01

    As a first step towards implementing a human language teacher, we have developed a new template-based on-line ICALL (intelligent computer assisted language learning) system capable of automatically diagnosing learners' free-format translated inputs and returning error contingent feedback. The system architecture we have adopted allows language…

  1. Integrated model reference adaptive control and time-varying angular rate estimation for micro-machined gyroscopes

    NASA Astrophysics Data System (ADS)

    Tsai, Nan-Chyuan; Sue, Chung-Yang

    2010-02-01

    Owing to imposed but undesired accelerations such as quadrature error and cross-axis perturbation, a micro-machined gyroscope does not unconditionally remain at its resonant mode. Once the preset resonance is not sustained, the performance of the micro-gyroscope is accordingly degraded. In this article, a direct model reference adaptive control loop integrated with a modified disturbance estimating observer (MDEO) is proposed to guarantee resonant oscillation at the drive mode and counterbalance the undesired disturbances caused mainly by quadrature error and cross-axis perturbation. The controller parameters are updated on-line from the dynamic error between the MDEO output and the expected response. In addition, Lyapunov stability theory is employed to examine the stability of the closed-loop control system. Finally, the efficacy of the scheme in estimating the exerted time-varying angular rate, which is to be detected and measured by the gyroscope, is verified by intensive simulations.

  2. A fiber optic sensor for on-line non-touch monitoring of roll shape

    NASA Astrophysics Data System (ADS)

    Guo, Yuan; Qu, Weijian; Yuan, Qi

    2009-07-01

    Based on the principle of the reflective displacement fibre-optic sensor, a high-accuracy, non-contact, on-line optical fibre sensor for detecting roll shape is presented, and the principle, composition and operation of the detection system are described. By using a novel probe of three optical fibres at equal transverse spacing, the effects of fluctuations in the light source, changes in the reflectivity of the target surface and intensity losses in the fibre lines are automatically compensated. In addition, an optical fibre sensor model for correcting static error, based on a BP artificial neural network (ANN), is established. Processing the signals with interpolation and value filtering effectively reduces the influence of random noise and of roll-bearing vibration, so accuracy and resolution are enhanced markedly. Experiments show that the resolution is 1 μm and the precision reaches 0.1%, which meets the demands of the practical production process.

  3. A comparison of the use of bony anatomy and internal markers for offline verification and an evaluation of the potential benefit of online and offline verification protocols for prostate radiotherapy.

    PubMed

    McNair, Helen A; Hansen, Vibeke N; Parker, Christopher C; Evans, Phil M; Norman, Andrew; Miles, Elizabeth; Harris, Emma J; Del-Acroix, Louise; Smith, Elizabeth; Keane, Richard; Khoo, Vincent S; Thompson, Alan C; Dearnaley, David P

    2008-05-01

    To evaluate the utility of intraprostatic markers in the treatment verification of prostate cancer radiotherapy. Specific aims were: to compare the effectiveness of offline correction protocols, either using gold markers or bony anatomy; to estimate the potential benefit of online correction protocols using gold markers; to determine the presence and effect of intrafraction motion. Thirty patients with three gold markers inserted had pretreatment and posttreatment images acquired and were treated using an offline correction protocol and gold markers. Retrospectively, an offline protocol was applied using bony anatomy and an online protocol using gold markers. The systematic errors were reduced from 1.3, 1.9, and 2.5 mm to 1.1, 1.1, and 1.5 mm in the right-left (RL), superoinferior (SI), and anteroposterior (AP) directions, respectively, using the offline correction protocol and gold markers instead of bony anatomy. The subsequent decrease in margins was 1.7, 3.3, and 4 mm in the RL, SI, and AP directions, respectively. An offline correction protocol combined with an online correction protocol in the first four fractions reduced random errors further to 0.9, 1.1, and 1.0 mm in the RL, SI, and AP directions, respectively. A daily online protocol reduced all errors to <1 mm. Intrafraction motion had greater impact on the effectiveness of the online protocol than the offline protocols. An offline protocol using gold markers is effective in reducing the systematic error. The value of online protocols is reduced by intrafraction motion.

  4. Evaluating structural pattern recognition for handwritten math via primitive label graphs

    NASA Astrophysics Data System (ADS)

    Zanibbi, Richard; Mouchère, Harold; Viard-Gaudin, Christian

    2013-01-01

    Currently, structural pattern recognizer evaluations compare graphs of detected structure to target structures (i.e. ground truth) using recognition rates, recall and precision for object segmentation, classification and relationships. In document recognition, these target objects (e.g. symbols) are frequently comprised of multiple primitives (e.g. connected components, or strokes for online handwritten data), but current metrics do not characterize errors at the primitive level, from which object-level structure is obtained. Primitive label graphs are directed graphs defined over primitives and primitive pairs. We define new metrics obtained by Hamming distances over label graphs, which allow classification, segmentation and parsing errors to be characterized separately, or using a single measure. Recall and precision for detected objects may also be computed directly from label graphs. We illustrate the new metrics by comparing a new primitive-level evaluation to the symbol-level evaluation performed for the CROHME 2012 handwritten math recognition competition. A Python-based set of utilities for evaluating, visualizing and translating label graphs is publicly available.
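
    The sketch below illustrates the basic idea of counting Hamming-style disagreements over a primitive label graph, assuming a simple dictionary representation of node (primitive) labels and pair (relationship) labels. The representation and label names are assumptions for illustration and do not reproduce the CROHME evaluation utilities mentioned in the record.

        def label_graph_errors(truth_nodes, truth_edges, pred_nodes, pred_edges):
            """Hamming-style disagreement counts between two primitive label graphs.
            Nodes map primitive id -> symbol class; edges map (id_a, id_b) ->
            relationship label (e.g. 'same-symbol', 'right-of', or 'none')."""
            node_errors = sum(pred_nodes.get(p) != lab for p, lab in truth_nodes.items())
            all_pairs = set(truth_edges) | set(pred_edges)
            edge_errors = sum(truth_edges.get(pair, "none") != pred_edges.get(pair, "none")
                              for pair in all_pairs)
            return node_errors, edge_errors, node_errors + edge_errors

        # Two strokes forming "x^2": a classification error on stroke 2,
        # and a missing superscript relationship in the prediction
        truth_n = {1: "x", 2: "2"}
        truth_e = {(1, 2): "superscript"}
        pred_n = {1: "x", 2: "z"}
        pred_e = {}
        print(label_graph_errors(truth_n, truth_e, pred_n, pred_e))   # (1, 1, 2)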

  5. MTPA control of mechanical sensorless IPMSM based on adaptive nonlinear control.

    PubMed

    Najjar-Khodabakhsh, Abbas; Soltani, Jafar

    2016-03-01

    In this paper, an adaptive nonlinear control scheme is proposed for implementing the maximum torque per ampere (MTPA) control strategy for an interior permanent magnet synchronous motor (IPMSM) drive. The control scheme is developed in the rotor d-q axis reference frame using the adaptive input-output state feedback linearization (AIOFL) method, and the control stability of the drive system is supported by Lyapunov theory. The motor inductances are estimated online by an estimation law obtained from AIOFL, and the estimation errors of these parameters are proven to converge asymptotically to zero. Based on minimizing the motor current amplitude, the MTPA strategy is performed using a nonlinear optimization technique that considers the online reference torque, which is generated by a conventional rotor-speed PI controller. The online motor d-q reference currents generated by the MTPA strategy are used in the AIOFL controller to obtain the SV-PWM reference voltages and the online estimates of the motor d-q inductances. In addition, the stator resistance is estimated online using a conventional PI controller, and the rotor position is detected using the online estimates of the stator flux and the motor q-axis inductance. Simulation and experimental results prove the effectiveness and capability of the proposed control method. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  6. Online machining error estimation method of numerical control gear grinding machine tool based on data analysis of internal sensors

    NASA Astrophysics Data System (ADS)

    Zhao, Fei; Zhang, Chi; Yang, Guilin; Chen, Chinyin

    2016-12-01

    This paper presents an online method for estimating cutting error by analyzing internal sensor readings. The internal sensors of the numerical control (NC) machine tool are used to avoid installation problems. A mathematical estimation model of the cutting error is proposed to compute the relative position of the cutting point and the tool center point (TCP) from the internal sensor readings, based on the cutting theory of gears. To verify the effectiveness of the proposed model, it was tested in simulations and experiments on the gear generating grinding process. The cutting error of the gear was estimated and the factors that induce cutting error were analyzed. The simulations and experiments verify that the proposed approach is an efficient way to estimate the cutting error of the workpiece during the machining process.

  7. Bibliographic Instruction and the Development of Online Catalogs.

    ERIC Educational Resources Information Center

    McDonald, David R.; Searing, Susan E.

    1983-01-01

    Discusses the definition of an online library catalog; five factors to be considered by the online catalog designer; user-computer communication (error messages, help screens, prompts, unnatural language); online tutorials and offline instruction offered by bibliographic instruction librarians; and the current situation. Nine references are…

  8. Hardware-Efficient On-line Learning through Pipelined Truncated-Error Backpropagation in Binary-State Networks

    PubMed Central

    Mostafa, Hesham; Pedroni, Bruno; Sheik, Sadique; Cauwenberghs, Gert

    2017-01-01

    Artificial neural networks (ANNs) trained using backpropagation are powerful learning architectures that have achieved state-of-the-art performance in various benchmarks. Significant effort has been devoted to developing custom silicon devices to accelerate inference in ANNs. Accelerating the training phase, however, has attracted relatively little attention. In this paper, we describe a hardware-efficient on-line learning technique for feedforward multi-layer ANNs that is based on pipelined backpropagation. Learning is performed in parallel with inference in the forward pass, removing the need for an explicit backward pass and requiring no extra weight lookup. By using binary state variables in the feedforward network and ternary errors in truncated-error backpropagation, the need for any multiplications in the forward and backward passes is removed, and memory requirements for the pipelining are drastically reduced. Further reduction in addition operations owing to the sparsity in the forward neural and backpropagating error signal paths contributes to highly efficient hardware implementation. For proof-of-concept validation, we demonstrate on-line learning of MNIST handwritten digit classification on a Spartan 6 FPGA interfacing with an external 1Gb DDR2 DRAM, that shows small degradation in test error performance compared to an equivalently sized binary ANN trained off-line using standard back-propagation and exact errors. Our results highlight an attractive synergy between pipelined backpropagation and binary-state networks in substantially reducing computation and memory requirements, making pipelined on-line learning practical in deep networks. PMID:28932180
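
    The arithmetic style described above (binary states, ternary truncated errors, updates that reduce to sign-conditioned additions) can be illustrated with the toy single-layer sketch below. This is not the pipelined backpropagation architecture of the paper; the learning rule, sizes and thresholds are illustrative assumptions only.

        import numpy as np

        rng = np.random.default_rng(0)

        def binarize(x):
            """Binary state: +1 / -1."""
            return np.where(x >= 0, 1, -1)

        def ternarize(e, threshold=0.5):
            """Ternary error: -1, 0, +1 (small errors are truncated to zero)."""
            return np.sign(e) * (np.abs(e) > threshold)

        # Toy single-layer "network": learn a random target sign pattern.
        n_in, n_out, lr = 32, 4, 0.1
        W = rng.normal(scale=0.1, size=(n_out, n_in))
        W_target = rng.normal(size=(n_out, n_in))

        for step in range(2000):
            x = binarize(rng.normal(size=n_in))          # binary input state
            y = binarize(W @ x)                          # binary output state
            t = binarize(W_target @ x)                   # teacher signal
            e = ternarize(t - y)                         # ternary error in {-1, 0, +1}
            # The outer product of a ternary error with a binary input only ever
            # adds or subtracts the learning rate, so no true multiplications
            # are needed in hardware.
            W += lr * np.outer(e, x)

        test = binarize(rng.normal(size=(200, n_in)))
        acc = np.mean(binarize(test @ W.T) == binarize(test @ W_target.T))
        print("sign agreement on fresh inputs:", acc)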

  9. Hardware-Efficient On-line Learning through Pipelined Truncated-Error Backpropagation in Binary-State Networks.

    PubMed

    Mostafa, Hesham; Pedroni, Bruno; Sheik, Sadique; Cauwenberghs, Gert

    2017-01-01

    Artificial neural networks (ANNs) trained using backpropagation are powerful learning architectures that have achieved state-of-the-art performance in various benchmarks. Significant effort has been devoted to developing custom silicon devices to accelerate inference in ANNs. Accelerating the training phase, however, has attracted relatively little attention. In this paper, we describe a hardware-efficient on-line learning technique for feedforward multi-layer ANNs that is based on pipelined backpropagation. Learning is performed in parallel with inference in the forward pass, removing the need for an explicit backward pass and requiring no extra weight lookup. By using binary state variables in the feedforward network and ternary errors in truncated-error backpropagation, the need for any multiplications in the forward and backward passes is removed, and memory requirements for the pipelining are drastically reduced. Further reduction in addition operations owing to the sparsity in the forward neural and backpropagating error signal paths contributes to highly efficient hardware implementation. For proof-of-concept validation, we demonstrate on-line learning of MNIST handwritten digit classification on a Spartan 6 FPGA interfacing with an external 1Gb DDR2 DRAM, that shows small degradation in test error performance compared to an equivalently sized binary ANN trained off-line using standard back-propagation and exact errors. Our results highlight an attractive synergy between pipelined backpropagation and binary-state networks in substantially reducing computation and memory requirements, making pipelined on-line learning practical in deep networks.

  10. Research on the Error Characteristics of a 110 kV Optical Voltage Transformer under Three Conditions: In the Laboratory, Off-Line in the Field and During On-Line Operation

    PubMed Central

    Xiao, Xia; Hu, Haoliang; Xu, Yan; Lei, Min; Xiong, Qianzhu

    2016-01-01

    Optical voltage transformers (OVTs) have been applied in power systems. When performing accuracy performance tests of OVTs large differences exist between the electromagnetic environment and the temperature variation in the laboratory and on-site. Therefore, OVTs may display different error characteristics under different conditions. In this paper, OVT prototypes with typical structures were selected to be tested for the error characteristics with the same testing equipment and testing method. The basic accuracy, the additional error caused by temperature and the adjacent phase in the laboratory, the accuracy in the field off-line, and the real-time monitoring error during on-line operation were tested. The error characteristics under the three conditions—laboratory, in the field off-line and during on-site operation—were compared and analyzed. The results showed that the effect of the transportation process, electromagnetic environment and the adjacent phase on the accuracy of OVTs could be ignored for level 0.2, but the error characteristics of OVTs are dependent on the environmental temperature and are sensitive to the temperature gradient. The temperature characteristics during on-line operation were significantly superior to those observed in the laboratory. PMID:27537895

  11. Research on the Error Characteristics of a 110 kV Optical Voltage Transformer under Three Conditions: In the Laboratory, Off-Line in the Field and During On-Line Operation.

    PubMed

    Xiao, Xia; Hu, Haoliang; Xu, Yan; Lei, Min; Xiong, Qianzhu

    2016-08-16

    Optical voltage transformers (OVTs) have been applied in power systems. When performing accuracy performance tests of OVTs large differences exist between the electromagnetic environment and the temperature variation in the laboratory and on-site. Therefore, OVTs may display different error characteristics under different conditions. In this paper, OVT prototypes with typical structures were selected to be tested for the error characteristics with the same testing equipment and testing method. The basic accuracy, the additional error caused by temperature and the adjacent phase in the laboratory, the accuracy in the field off-line, and the real-time monitoring error during on-line operation were tested. The error characteristics under the three conditions-laboratory, in the field off-line and during on-site operation-were compared and analyzed. The results showed that the effect of the transportation process, electromagnetic environment and the adjacent phase on the accuracy of OVTs could be ignored for level 0.2, but the error characteristics of OVTs are dependent on the environmental temperature and are sensitive to the temperature gradient. The temperature characteristics during on-line operation were significantly superior to those observed in the laboratory.

  12. Adaptive and accelerated tracking-learning-detection

    NASA Astrophysics Data System (ADS)

    Guo, Pengyu; Li, Xin; Ding, Shaowen; Tian, Zunhua; Zhang, Xiaohu

    2013-08-01

    An improved online long-term visual tracking algorithm, named adaptive and accelerated TLD (AA-TLD) and based on the novel Tracking-Learning-Detection (TLD) framework, is introduced in this paper. The improvement focuses on two aspects. The first is adaptation: the algorithm does not depend on pre-defined scanning grids, because it generates the scale space online. The second is efficiency: algorithm-level acceleration is obtained from scale prediction, which employs an auto-regression and moving average (ARMA) model to learn the object motion and thereby narrow the detector's search range, and from a fixed number of positive and negative samples, which ensures a constant retrieval time; CPU and GPU parallelism is also used to achieve hardware acceleration. In addition, to obtain better results, some details of TLD are redesigned: detection results are integrated using a weight that includes both the normalized correlation coefficient and the scale size, and the distance-metric thresholds are adjusted online. A comparative experiment on success rate, center location error and execution time shows a performance and efficiency upgrade over the state-of-the-art TLD on partial TLD datasets and Shenzhou IX return capsule image sequences. The algorithm can be used in the field of video surveillance to meet the need for real-time video tracking.

  13. A Comparison of the Use of Bony Anatomy and Internal Markers for Offline Verification and an Evaluation of the Potential Benefit of Online and Offline Verification Protocols for Prostate Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNair, Helen A.; Hansen, Vibeke N.; Parker, Christopher

    2008-05-01

    Purpose: To evaluate the utility of intraprostatic markers in the treatment verification of prostate cancer radiotherapy. Specific aims were: to compare the effectiveness of offline correction protocols, either using gold markers or bony anatomy; to estimate the potential benefit of online correction protocols using gold markers; to determine the presence and effect of intrafraction motion. Methods and Materials: Thirty patients with three gold markers inserted had pretreatment and posttreatment images acquired and were treated using an offline correction protocol and gold markers. Retrospectively, an offline protocol was applied using bony anatomy and an online protocol using gold markers. Results: The systematic errors were reduced from 1.3, 1.9, and 2.5 mm to 1.1, 1.1, and 1.5 mm in the right-left (RL), superoinferior (SI), and anteroposterior (AP) directions, respectively, using the offline correction protocol and gold markers instead of bony anatomy. The subsequent decrease in margins was 1.7, 3.3, and 4 mm in the RL, SI, and AP directions, respectively. An offline correction protocol combined with an online correction protocol in the first four fractions reduced random errors further to 0.9, 1.1, and 1.0 mm in the RL, SI, and AP directions, respectively. A daily online protocol reduced all errors to <1 mm. Intrafraction motion had greater impact on the effectiveness of the online protocol than the offline protocols. Conclusions: An offline protocol using gold markers is effective in reducing the systematic error. The value of online protocols is reduced by intrafraction motion.

  14. Using Analysis Increments (AI) to Estimate and Correct Systematic Errors in the Global Forecast System (GFS) Online

    NASA Astrophysics Data System (ADS)

    Bhargava, K.; Kalnay, E.; Carton, J.; Yang, F.

    2017-12-01

    Systematic forecast errors, arising from model deficiencies, form a significant portion of the total forecast error in weather prediction models like the Global Forecast System (GFS). While much effort has been expended to improve models, substantial model error remains. The aim here is to (i) estimate the model deficiencies in the GFS that lead to systematic forecast errors, (ii) implement an online correction (i.e., within the model) scheme to correct GFS following the methodology of Danforth et al. [2007] and Danforth and Kalnay [2008, GRL]. Analysis Increments represent the corrections that new observations make on, in this case, the 6-hr forecast in the analysis cycle. Model bias corrections are estimated from the time average of the analysis increments divided by 6-hr, assuming that initial model errors grow linearly and first ignoring the impact of observation bias. During 2012-2016, seasonal means of the 6-hr model bias are generally robust despite changes in model resolution and data assimilation systems, and their broad continental scales explain their insensitivity to model resolution. The daily bias dominates the sub-monthly analysis increments and consists primarily of diurnal and semidiurnal components, also requiring a low dimensional correction. Analysis increments in 2015 and 2016 are reduced over oceans, which is attributed to improvements in the specification of the SSTs. These results encourage application of online correction, as suggested by Danforth and Kalnay, for mean, seasonal and diurnal and semidiurnal model biases in GFS to reduce both systematic and random errors. As the error growth in the short-term is still linear, estimated model bias corrections can be added as a forcing term in the model tendency equation to correct online. Preliminary experiments with GFS, correcting temperature and specific humidity online show reduction in model bias in 6-hr forecast. This approach can then be used to guide and optimize the design of sub-grid scale physical parameterizations, more accurate discretization of the model dynamics, boundary conditions, radiative transfer codes, and other potential model improvements which can then replace the empirical correction scheme. The analysis increments also provide guidance in testing new physical parameterizations.
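
    The essence of the online correction, estimating a forcing term from time-averaged analysis increments and adding it to the model tendency, can be sketched as follows. The toy one-dimensional field, numbers and function names are assumptions for illustration and are unrelated to the GFS implementation.

        import numpy as np

        def bias_forcing(analysis_increments, window_hours=6.0):
            """Time-mean analysis increment divided by the assimilation window:
            an estimate of the forcing needed to offset the model's systematic
            tendency error (increment = analysis minus 6-h forecast)."""
            return np.mean(analysis_increments, axis=0) / window_hours

        def corrected_tendency(model_tendency, forcing):
            """Online correction: add the estimated forcing to the model tendency."""
            return model_tendency + forcing

        # Toy example on a 1-D field: the model is systematically ~0.2 K / 6 h too cold,
        # so the analysis increments are on average positive.
        rng = np.random.default_rng(1)
        increments = 0.2 + 0.05 * rng.normal(size=(120, 64))   # 120 cycles, 64 grid points
        forcing = bias_forcing(increments)                     # ~ +0.033 K/h
        raw_tendency = -0.5 * np.ones(64)                      # K/h, from model physics
        print(corrected_tendency(raw_tendency, forcing)[:3])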

  15. The NEEDS Data Base Management and Archival Mass Memory System

    NASA Technical Reports Server (NTRS)

    Bailey, G. A.; Bryant, S. B.; Thomas, D. T.; Wagnon, F. W.

    1980-01-01

    A Data Base Management System and an Archival Mass Memory System are being developed that will have a 10^12-bit on-line and a 10^13-bit off-line storage capacity. The integrated system will accept packetized data from the data staging area at 50 Mbps, create a comprehensive directory, provide for file management, record the data, perform error detection and correction, accept user requests, retrieve the requested data files and provide the data to multiple users at a combined rate of 50 Mbps. Stored and replicated data files will have a bit error rate of less than 10^-9 even after ten years of storage. The integrated system will be demonstrated to prove the technology late in 1981.

  16. Fast online generalized multiscale finite element method using constraint energy minimization

    NASA Astrophysics Data System (ADS)

    Chung, Eric T.; Efendiev, Yalchin; Leung, Wing Tat

    2018-02-01

    Local multiscale methods often construct multiscale basis functions in the offline stage without taking into account input parameters, such as source terms, boundary conditions, and so on. These basis functions are then used in the online stage with a specific input parameter to solve the global problem at a reduced computational cost. Recently, online approaches have been introduced, where multiscale basis functions are adaptively constructed in some regions to reduce the error significantly. In multiscale methods, it is desired to have only 1-2 iterations to reduce the error to a desired threshold. Using Generalized Multiscale Finite Element Framework [10], it was shown that by choosing sufficient number of offline basis functions, the error reduction can be made independent of physical parameters, such as scales and contrast. In this paper, our goal is to improve this. Using our recently proposed approach [4] and special online basis construction in oversampled regions, we show that the error reduction can be made sufficiently large by appropriately selecting oversampling regions. Our numerical results show that one can achieve a three order of magnitude error reduction, which is better than our previous methods. We also develop an adaptive algorithm and enrich in selected regions with large residuals. In our adaptive method, we show that the convergence rate can be determined by a user-defined parameter and we confirm this by numerical simulations. The analysis of the method is presented.

  17. Analysis of Online Composite Mirror Descent Algorithm.

    PubMed

    Lei, Yunwen; Zhou, Ding-Xuan

    2017-03-01

    We study the convergence of the online composite mirror descent algorithm, which involves a mirror map to reflect the geometry of the data and a convex objective function consisting of a loss and a regularizer possibly inducing sparsity. Our error analysis provides convergence rates in terms of properties of the strongly convex differentiable mirror map and the objective function. For a class of objective functions with Hölder continuous gradients, the convergence rates of the excess (regularized) risk under polynomially decaying step sizes have the order [Formula: see text] after [Formula: see text] iterates. Our results improve the existing error analysis for the online composite mirror descent algorithm by avoiding averaging and removing boundedness assumptions, and they sharpen the existing convergence rates of the last iterate for online gradient descent without any boundedness assumptions. Our methodology mainly depends on a novel error decomposition in terms of an excess Bregman distance, refined analysis of self-bounding properties of the objective function, and the resulting one-step progress bounds.
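
    As a concrete instance of the algorithm analyzed above, the sketch below runs online composite mirror descent with the Euclidean mirror map, squared loss and an l1 regularizer, in which case each iterate reduces to a gradient step followed by soft-thresholding. The step-size schedule, regularization weight and data stream are illustrative assumptions rather than the paper's setting.

        import numpy as np

        def soft_threshold(v, tau):
            """Proximal map of tau * ||.||_1."""
            return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

        def online_composite_md(stream, dim, lam=0.05, eta0=0.05):
            """Online composite mirror descent with the Euclidean mirror map
            psi(w) = ||w||^2 / 2, squared loss and an l1 regularizer; each step
            is a gradient step on the loss followed by soft-thresholding."""
            w = np.zeros(dim)
            for t, (x, y) in enumerate(stream, start=1):
                eta = eta0 / np.sqrt(t)                  # polynomially decaying steps
                grad = (w @ x - y) * x                   # gradient of the loss at w_t
                w = soft_threshold(w - eta * grad, eta * lam)
            return w

        # Sparse regression stream: only the first 3 of 20 coordinates matter,
        # so the remaining coordinates should shrink toward zero.
        rng = np.random.default_rng(0)
        w_true = np.zeros(20)
        w_true[:3] = [2.0, -1.0, 0.5]
        stream = ((x, x @ w_true + 0.01 * rng.normal())
                  for x in rng.normal(size=(5000, 20)))
        print(np.round(online_composite_md(stream, 20), 2))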

  18. Fault detection of helicopter gearboxes using the multi-valued influence matrix method

    NASA Technical Reports Server (NTRS)

    Chin, Hsinyung; Danai, Kourosh; Lewicki, David G.

    1993-01-01

    In this paper we investigate the effectiveness of a pattern classifying fault detection system that is designed to cope with the variability of fault signatures inherent in helicopter gearboxes. For detection, the measurements are monitored on-line and flagged upon the detection of abnormalities, so that they can be attributed to a faulty or normal case. As such, the detection system is composed of two components, a quantization matrix to flag the measurements, and a multi-valued influence matrix (MVIM) that represents the behavior of measurements during normal operation and at fault instances. Both the quantization matrix and influence matrix are tuned during a training session so as to minimize the error in detection. To demonstrate the effectiveness of this detection system, it was applied to vibration measurements collected from a helicopter gearbox during normal operation and at various fault instances. The results indicate that the MVIM method provides excellent results when the full range of faults effects on the measurements are included in the training set.

  19. Using microwave Doppler radar in automated manufacturing applications

    NASA Astrophysics Data System (ADS)

    Smith, Gregory C.

    Since the beginning of the Industrial Revolution, manufacturers worldwide have used automation to improve productivity, gain market share, and meet growing or changing consumer demand for manufactured products. To stimulate further industrial productivity, manufacturers need more advanced automation technologies: "smart" part handling systems, automated assembly machines, CNC machine tools, and industrial robots that use new sensor technologies, advanced control systems, and intelligent decision-making algorithms to "see," "hear," "feel," and "think" at the levels needed to handle complex manufacturing tasks without human intervention. The investigator's dissertation offers three methods that could help make "smart" CNC machine tools and industrial robots possible: (1) A method for detecting acoustic emission using a microwave Doppler radar detector, (2) A method for detecting tool wear on a CNC lathe using a Doppler radar detector, and (3) An online non-contact method for detecting industrial robot position errors using a microwave Doppler radar motion detector. The dissertation studies indicate that microwave Doppler radar could be quite useful in automated manufacturing applications. In particular, the methods developed may help solve two difficult problems that hinder further progress in automating manufacturing processes: (1) Automating metal-cutting operations on CNC machine tools by providing a reliable non-contact method for detecting tool wear, and (2) Fully automating robotic manufacturing tasks by providing a reliable low-cost non-contact method for detecting on-line position errors. In addition, the studies offer a general non-contact method for detecting acoustic emission that may be useful in many other manufacturing and non-manufacturing areas, as well (e.g., monitoring and nondestructively testing structures, materials, manufacturing processes, and devices). By advancing the state of the art in manufacturing automation, the studies may help stimulate future growth in industrial productivity, which also promises to fuel economic growth and promote economic stability. The study also benefits the Department of Industrial Technology at Iowa State University and the field of Industrial Technology by contributing to the ongoing "smart" machine research program within the Department of Industrial Technology and by stimulating research into new sensor technologies within the University and within the field of Industrial Technology.

  20. Joint multifractal analysis based on wavelet leaders

    NASA Astrophysics Data System (ADS)

    Jiang, Zhi-Qiang; Yang, Yan-Hong; Wang, Gang-Jin; Zhou, Wei-Xing

    2017-12-01

    Mutually interacting components form complex systems and these components usually have long-range cross-correlated outputs. Using wavelet leaders, we propose a method for characterizing the joint multifractal nature of these long-range cross correlations; we call this method joint multifractal analysis based on wavelet leaders (MF-X-WL). We test the validity of the MF-X-WL method by performing extensive numerical experiments on dual binomial measures with multifractal cross correlations and bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. Both experiments indicate that MF-X-WL is capable of detecting cross correlations in synthetic data with acceptable estimating errors. We also apply the MF-X-WL method to pairs of series from financial markets (returns and volatilities) and online worlds (online numbers of different genders and different societies) and determine intriguing joint multifractal behavior.

  1. Weighing Rocky Exoplanets with Improved Radial Velocimetry

    NASA Astrophysics Data System (ADS)

    Xuesong Wang, Sharon; Wright, Jason; California Planet Survey Consortium

    2016-01-01

    The synergy between Kepler and the ground-based radial velocity (RV) surveys has yielded numerous discoveries of small, rocky exoplanets, opening the age of Earth analogs. However, most (29/33) of the RV-detected exoplanets smaller than 3 Earth radii do not have their masses constrained to better than 20%, limited by the current RV precision (1-2 m/s). Our work improves the RV precision of the Keck telescope, which is responsible for most of the mass measurements of small Kepler exoplanets. We have discovered and verified, for the first time, two of the dominant terms in Keck's RV systematic error budget: modeling errors (mostly in deconvolution) and telluric contamination. These two terms contribute 1 m/s and 0.6 m/s, respectively, to the RV error budget (RMS in quadrature), and they create spurious signals at periods of one sidereal year and its harmonics with amplitudes of 0.2-1 m/s. Left untreated, these errors can mimic the signals of Earth-like or Super-Earth planets in the Habitable Zone. Removing them will bring better precision to ten years' worth of Keck data and better constraints on the masses and compositions of small Kepler planets. As more precise RV instruments come online, we need advanced data analysis tools to overcome issues like these in order to detect an Earth twin (RV amplitude 8 cm/s). We are developing a new, open-source RV data analysis tool in Python, which uses Bayesian MCMC and Gaussian processes, to fully exploit the hardware improvements brought by new instruments like MINERVA and NASA's WIYN/EPDS.

  2. Identification and compensation of the temperature influences in a miniature three-axial accelerometer based on the least squares method

    NASA Astrophysics Data System (ADS)

    Grigorie, Teodor Lucian; Corcau, Ileana Jenica; Tudosie, Alexandru Nicolae

    2017-06-01

    The paper presents a way to obtain an intelligent miniaturized three-axial accelerometric sensor, based on the on-line estimation and compensation of the sensor errors generated by environmental temperature variation. Because the error is a strongly nonlinear function of both the environmental temperature and the acceleration exciting the sensor, it cannot be corrected off-line and requires an additional temperature sensor. The proposed identification methodology for the error model is based on the least squares method, which processes off-line the numerical values obtained from experimental testing of the accelerometer at different values of acceleration applied to its sensitivity axes and at different operating temperatures. A final analysis of the error level after compensation identifies the best variant of the matrix in the error model. The paper presents the results of the experimental testing of the accelerometer on all three sensitivity axes, the identification of the error model on each axis using the least squares method, and the validation of the obtained models against experimental values. For all three detection channels, the maximum absolute acceleration error due to environmental temperature variation was reduced by almost two orders of magnitude.
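
    A toy single-axis version of the least-squares identification and compensation described above is sketched below. The polynomial/bilinear error basis and the synthetic calibration data are assumptions for illustration, not the error-model matrix used in the paper.

        import numpy as np

        def fit_error_model(temp_c, accel_true, accel_meas):
            """Least-squares fit of the measurement error as a function of
            temperature and applied acceleration (toy quadratic/bilinear basis)."""
            err = accel_meas - accel_true
            X = np.column_stack([np.ones_like(temp_c), temp_c, temp_c**2,
                                 accel_true, accel_true * temp_c])
            coeffs, *_ = np.linalg.lstsq(X, err, rcond=None)
            return coeffs

        def compensate(accel_meas, temp_c, coeffs):
            """Subtract the modelled error, using the measured value as the
            acceleration regressor (adequate when errors are small)."""
            X = np.column_stack([np.ones_like(temp_c), temp_c, temp_c**2,
                                 accel_meas, accel_meas * temp_c])
            return accel_meas - X @ coeffs

        # Synthetic calibration data: bias and scale factor drift with temperature.
        rng = np.random.default_rng(0)
        T = rng.uniform(-20, 60, 500)
        a = rng.uniform(-2, 2, 500) * 9.81
        meas = a + 0.02 + 0.001 * T + 2e-5 * T**2 + 5e-4 * a * T + 0.005 * rng.normal(size=500)
        c = fit_error_model(T, a, meas)
        print("max residual error after compensation:",
              np.max(np.abs(compensate(meas, T, c) - a)))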

  3. Application of Energy Function as a Measure of Error in the Numerical Solution for Online Transient Stability Assessment

    NASA Astrophysics Data System (ADS)

    Sarojkumar, K.; Krishna, S.

    2016-08-01

    Online dynamic security assessment (DSA) is a computationally intensive task. To reduce the amount of computation, contingencies are screened: they are analyzed with the system described by a simpler model, which reduces the computational requirement, and contingencies that are certain not to cause instability are identified and eliminated from further scrutiny. The numerical method and step size used for screening should be chosen as a compromise between speed and accuracy. This paper proposes the use of an energy function as a measure of error in the numerical solution used for screening contingencies. The proposed measure of error can be used to determine the most accurate numerical method that satisfies the time constraint of online DSA. Case studies on a 17-generator system are reported.

  4. Prevention of prescription errors by computerized, on-line, individual patient related surveillance of drug order entry.

    PubMed

    Oliven, A; Zalman, D; Shilankov, Y; Yeshurun, D; Odeh, M

    2002-01-01

    Computerized prescription of drugs is expected to reduce the number of preventable drug-ordering errors. In the present study we evaluated the usefulness of a computerized drug order entry (CDOE) system in reducing prescription errors. A department of internal medicine using a comprehensive CDOE system, which also included patient-related drug-laboratory, drug-disease and drug-allergy on-line surveillance, was compared to a similar department in which drug orders were handwritten. CDOE reduced prescription errors to 25-35%. The causes of errors remained similar, and most errors in both departments were associated with abnormal renal function and electrolyte balance. Residual errors remaining in the CDOE-using department were due to handwriting on the typed order, failure to enter patients' diseases, and system failures. The use of CDOE was associated with a significant reduction in mean hospital stay and in the number of changes made to prescriptions. The findings of this study both quantify the impact of comprehensive CDOE on prescription errors and delineate the causes of the remaining errors.

  5. Testing the Recognition and Perception of Errors in Context

    ERIC Educational Resources Information Center

    Brandenburg, Laura C.

    2015-01-01

    This study tests the recognition of errors in context and whether the presence of errors affects the reader's perception of the writer's ethos. In an experimental, posttest only design, participants were randomly assigned a memo to read in an online survey: one version with errors and one version without. Of the six intentional errors in version…

  6. Online automatic tuning and control for fed-batch cultivation

    PubMed Central

    van Straten, Gerrit; van der Pol, Leo A.; van Boxtel, Anton J. B.

    2007-01-01

    Performance of controllers applied in biotechnological production is often below expectation. Online automatic tuning has the capability to improve control performance by adjusting control parameters. This work presents automatic tuning approaches for model reference specific growth rate control during fed-batch cultivation. The approaches are direct methods that use the error between observed specific growth rate and its set point; systematic perturbations of the cultivation are not necessary. Two automatic tuning methods proved to be efficient, in which the adaptation rate is based on a combination of the error, squared error and integral error. These methods are relatively simple and robust against disturbances, parameter uncertainties, and initialization errors. Application of the specific growth rate controller yields a stable system. The controller and automatic tuning methods are qualified by simulations and laboratory experiments with Bordetella pertussis. PMID:18157554

  7. Correction to: Apatinib: A Review in Advanced Gastric Cancer and Other Advanced Cancers.

    PubMed

    Scott, Lesley J

    2018-05-04

    An Online First version of this article was made available online at http://link.springer.com/journal/40265/onlineFirst/page/1 on 16 April 2018. Errors were subsequently identified in the article, and the following corrections should be noted.

  8. Detection and control of combustion instability based on the concept of dynamical system theory.

    PubMed

    Gotoda, Hiroshi; Shinoda, Yuta; Kobayashi, Masaki; Okuno, Yuta; Tachibana, Shigeru

    2014-02-01

    We propose an online method of detecting combustion instability based on the concept of dynamical system theory, including the characterization of the dynamic behavior of combustion instability. As an important case study relevant to combustion instability encountered in fundamental and practical combustion systems, we deal with the combustion dynamics close to lean blowout (LBO) in a premixed gas-turbine model combustor. The relatively regular pressure fluctuations generated by thermoacoustic oscillations transit to low-dimensional intermittent chaos owing to the intermittent appearance of burst with decreasing equivalence ratio. The translation error, which is characterized by quantifying the degree of parallelism of trajectories in the phase space, can be used as a control variable to prevent LBO.
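
    One common way to quantify the degree of parallelism of neighbouring trajectories in a reconstructed phase space is the Wayland-style translation error sketched below, computed on a time-delay embedding of the pressure signal. The embedding parameters and the exact statistic are illustrative assumptions and may differ from those used in the paper.

        import numpy as np

        def delay_embed(x, dim, tau):
            """Time-delay embedding of a scalar series."""
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

        def translation_error(x, dim=4, tau=5, n_ref=50, k=4, seed=0):
            """Degree of parallelism of neighbouring trajectories (Wayland-style):
            small values indicate deterministic, nearly parallel flow; values
            near 1 indicate stochastic, non-parallel motion."""
            emb = delay_embed(np.asarray(x, float), dim, tau)
            rng = np.random.default_rng(seed)
            refs = rng.choice(len(emb) - 1, size=n_ref, replace=False)
            errors = []
            for i in refs:
                d = np.linalg.norm(emb[:-1] - emb[i], axis=1)
                d[i] = np.inf
                nbrs = np.argsort(d)[:k]
                idx = np.concatenate(([i], nbrs))
                v = emb[idx + 1] - emb[idx]               # one-step translation vectors
                v_mean = v.mean(axis=0)
                errors.append(np.mean(np.sum((v - v_mean) ** 2, axis=1))
                              / np.sum(v_mean ** 2))
            return float(np.median(errors))

        t = np.arange(0, 200, 0.05)
        print("periodic:", translation_error(np.sin(t)),
              "noise:", translation_error(np.random.default_rng(1).normal(size=len(t))))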

  9. Detection and control of combustion instability based on the concept of dynamical system theory

    NASA Astrophysics Data System (ADS)

    Gotoda, Hiroshi; Shinoda, Yuta; Kobayashi, Masaki; Okuno, Yuta; Tachibana, Shigeru

    2014-02-01

    We propose an online method of detecting combustion instability based on the concept of dynamical system theory, including the characterization of the dynamic behavior of combustion instability. As an important case study relevant to combustion instability encountered in fundamental and practical combustion systems, we deal with the combustion dynamics close to lean blowout (LBO) in a premixed gas-turbine model combustor. The relatively regular pressure fluctuations generated by thermoacoustic oscillations transit to low-dimensional intermittent chaos owing to the intermittent appearance of burst with decreasing equivalence ratio. The translation error, which is characterized by quantifying the degree of parallelism of trajectories in the phase space, can be used as a control variable to prevent LBO.

  10. Variable complexity online sequential extreme learning machine, with applications to streamflow prediction

    NASA Astrophysics Data System (ADS)

    Lima, Aranildo R.; Hsieh, William W.; Cannon, Alex J.

    2017-12-01

    In situations where new data arrive continually, online learning algorithms are computationally much less costly than batch learning ones in maintaining the model up-to-date. The extreme learning machine (ELM), a single hidden layer artificial neural network with random weights in the hidden layer, is solved by linear least squares, and has an online learning version, the online sequential ELM (OSELM). As more data become available during online learning, information on the longer time scale becomes available, so ideally the model complexity should be allowed to change, but the number of hidden nodes (HN) remains fixed in OSELM. A variable complexity VC-OSELM algorithm is proposed to dynamically add or remove HN in the OSELM, allowing the model complexity to vary automatically as online learning proceeds. The performance of VC-OSELM was compared with OSELM in daily streamflow predictions at two hydrological stations in British Columbia, Canada, with VC-OSELM significantly outperforming OSELM in mean absolute error, root mean squared error and Nash-Sutcliffe efficiency at both stations.
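
    A minimal online sequential ELM, with a random sigmoid hidden layer and recursive least-squares updates of the output weights, is sketched below to make the chunk-by-chunk update rule concrete. The variable-complexity node addition/removal of VC-OSELM is not reproduced, and the toy data and sizes are illustrative assumptions.

        import numpy as np

        class OSELM:
            """Minimal online sequential extreme learning machine: random hidden
            layer, output weights updated chunk-by-chunk with recursive least
            squares.  (The node add/remove step of VC-OSELM is not shown.)"""

            def __init__(self, n_inputs, n_hidden, seed=0):
                rng = np.random.default_rng(seed)
                self.W = rng.uniform(-1, 1, (n_hidden, n_inputs))
                self.b = rng.uniform(-1, 1, n_hidden)
                self.P = None
                self.beta = None

            def _hidden(self, X):
                return 1.0 / (1.0 + np.exp(-(X @ self.W.T + self.b)))   # sigmoid HN

            def fit_initial(self, X, y):
                H = self._hidden(X)
                self.P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(H.shape[1]))
                self.beta = self.P @ H.T @ y

            def update(self, X, y):
                """Recursive least-squares update with a new chunk of data."""
                H = self._hidden(X)
                K = np.linalg.inv(np.eye(len(X)) + H @ self.P @ H.T)
                self.P -= self.P @ H.T @ K @ H @ self.P
                self.beta += self.P @ H.T @ (y - H @ self.beta)

            def predict(self, X):
                return self._hidden(X) @ self.beta

        # Toy streaming regression: data arrive in chunks of 100 samples.
        rng = np.random.default_rng(1)
        X = rng.uniform(-3, 3, (2000, 1))
        y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=2000)
        model = OSELM(n_inputs=1, n_hidden=25)
        model.fit_initial(X[:200], y[:200])
        for start in range(200, 2000, 100):
            model.update(X[start:start + 100], y[start:start + 100])
        print("RMSE:", np.sqrt(np.mean((model.predict(X) - y) ** 2)))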

  11. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources.

    PubMed

    Klumpp, John; Brandl, Alexander

    2015-03-01

    A particle counting and detection system is proposed that searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data (e.g., time between counts), as this was shown to be a more sensitive technique for detecting low count rate sources compared to analyzing counts per unit interval (Luo et al. 2013). Two distinct versions of the detection system are developed. The first is intended for situations in which the sample is fixed and can be measured for an unlimited amount of time. The second version is intended to detect sources that are physically moving relative to the detector, such as a truck moving past a fixed roadside detector or a waste storage facility passing beneath an airborne detector. In both cases, the detection system is expected to be active indefinitely; i.e., it is an online detection system. Both versions of the multi-energy detection systems are compared to their respective gross count rate detection systems in terms of Type I and Type II error rates and sensitivity.
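
    As a rough illustration of time-interval-based detection (not the authors' exact Bayesian formulation), one can update the posterior odds of "source present" versus "background only" from each measured inter-arrival time, treating counts in each energy window as independent Poisson processes. The rates and the alarm threshold below are arbitrary example values.

```python
import numpy as np

def update_log_odds(log_odds, dt_by_window, bkg_rates, src_rates):
    """Update the log posterior odds of 'source present' vs 'background only'
    from one inter-arrival time per energy window.

    Inter-arrival times of a Poisson process with rate r have density
    r * exp(-r * t); the energy windows are treated as independent.
    """
    for dt, b, s in zip(dt_by_window, bkg_rates, src_rates):
        r1, r0 = b + s, b
        log_odds += (np.log(r1) - r1 * dt) - (np.log(r0) - r0 * dt)
    return log_odds

# Example: two energy windows with background rates 5 and 2 counts/s and a
# hypothesised source contribution of 1 and 0.5 counts/s.
log_odds = 0.0                             # prior odds 1:1
for dts in [(0.12, 0.30), (0.08, 0.55)]:   # streams of measured intervals (s)
    log_odds = update_log_odds(log_odds, dts, (5.0, 2.0), (1.0, 0.5))
    if log_odds > np.log(99):              # roughly 99% posterior probability
        print("alarm")
```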

  12. Bayesian microsaccade detection

    PubMed Central

    Mihali, Andra; van Opheusden, Bas; Ma, Wei Ji

    2017-01-01

    Microsaccades are high-velocity fixational eye movements, with special roles in perception and cognition. The default microsaccade detection method is to determine when the smoothed eye velocity exceeds a threshold. We have developed a new method, Bayesian microsaccade detection (BMD), which performs inference based on a simple statistical model of eye positions. In this model, a hidden state variable changes between drift and microsaccade states at random times. The eye position is a biased random walk with different velocity distributions for each state. BMD generates samples from the posterior probability distribution over the eye state time series given the eye position time series. Applied to simulated data, BMD recovers the “true” microsaccades with fewer errors than alternative algorithms, especially at high noise. Applied to EyeLink eye tracker data, BMD detects almost all the microsaccades detected by the default method, but also apparent microsaccades embedded in high noise—although these can also be interpreted as false positives. Next we apply the algorithms to data collected with a Dual Purkinje Image eye tracker, whose higher precision justifies defining the inferred microsaccades as ground truth. When we add artificial measurement noise, the inferences of all algorithms degrade; however, at noise levels comparable to EyeLink data, BMD recovers the “true” microsaccades with 54% fewer errors than the default algorithm. Though unsuitable for online detection, BMD has other advantages: It returns probabilities rather than binary judgments, and it can be straightforwardly adapted as the generative model is refined. We make our algorithm available as a software package. PMID:28114483

  13. Cold-Rolled Strip Steel Stress Detection Technology Based on a Magnetoresistance Sensor and the Magnetoelastic Effect

    PubMed Central

    Guan, Ben; Zang, Yong; Han, Xiaohui; Zheng, Kailun

    2018-01-01

    Contactless stress detection technologies are in demand for shape control in the production of cold-rolled strips. This paper presents a novel contactless stress detection technology based on a magnetoresistance sensor and the magnetoelastic effect, enabling the detection of internal stress in manufactured cold-rolled strips. An experimental device was designed and produced. Characteristics of this detection technology were investigated through experiments assisted by theoretical analysis. Theoretically, a linear correlation exists between the internal stress of strip steel and the voltage output of the magnetoresistance sensor. Therefore, the sensitivity of the stress detection can be tuned by adjusting the supply voltage of the magnetoresistance sensor, the detection distance, and other relevant parameters. The stress detection experiments showed that the detection system has good repeatability and linearity, with the detection error controlled within 1.5%. Moreover, intrinsic factors of the detected strip steel, including thickness, carbon percentage, and crystal orientation, also affect the sensitivity of the detection system. The detection technology proposed in this research enables online contactless detection and meets the requirements for cold-rolled steel strips. PMID:29883387

  14. Cold-Rolled Strip Steel Stress Detection Technology Based on a Magnetoresistance Sensor and the Magnetoelastic Effect.

    PubMed

    Guan, Ben; Zang, Yong; Han, Xiaohui; Zheng, Kailun

    2018-05-21

    Contactless stress detection technologies are in demand for shape control in the production of cold-rolled strips. This paper presents a novel contactless stress detection technology based on a magnetoresistance sensor and the magnetoelastic effect, enabling the detection of internal stress in manufactured cold-rolled strips. An experimental device was designed and produced. Characteristics of this detection technology were investigated through experiments assisted by theoretical analysis. Theoretically, a linear correlation exists between the internal stress of strip steel and the voltage output of the magnetoresistance sensor. Therefore, the sensitivity of the stress detection can be tuned by adjusting the supply voltage of the magnetoresistance sensor, the detection distance, and other relevant parameters. The stress detection experiments showed that the detection system has good repeatability and linearity, with the detection error controlled within 1.5%. Moreover, intrinsic factors of the detected strip steel, including thickness, carbon percentage, and crystal orientation, also affect the sensitivity of the detection system. The detection technology proposed in this research enables online contactless detection and meets the requirements for cold-rolled steel strips.

  15. Hyperspectral imaging for food processing automation

    NASA Astrophysics Data System (ADS)

    Park, Bosoon; Lawrence, Kurt C.; Windham, William R.; Smith, Doug P.; Feldner, Peggy W.

    2002-11-01

    This paper presents research results demonstrating that hyperspectral imaging can be used effectively for detecting feces (from duodenum, ceca, and colon) and ingesta on the surface of poultry carcasses, and its potential application for real-time, on-line processing of poultry for automatic safety inspection. The hyperspectral imaging system included a line scan camera with a prism-grating-prism spectrograph, fiber optic line lighting, motorized lens control, and hyperspectral image processing software. Hyperspectral image processing algorithms, specifically band ratios of dual-wavelength (565/517 nm) images followed by thresholding, were effective for the identification of fecal and ingesta contamination of poultry carcasses. A multispectral imaging system including a common aperture camera with three optical trim filters (515.4 nm with 8.6-nm FWHM, 566.4 nm with 8.8-nm FWHM, and 631 nm with 10.2-nm FWHM), which were selected and validated by the hyperspectral imaging system, was developed for real-time, on-line application. The total image processing time required for the multispectral images captured by the common aperture camera was approximately 251 ms, or 3.99 frames/s. A preliminary test showed that the accuracy of the real-time multispectral imaging system in detecting feces and ingesta on corn/soybean-fed poultry carcasses was 96%. However, many false positive spots that cause system errors were also detected.
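
    The band-ratio step described above is simple to reproduce in outline: divide the image plane nearest 565 nm by the one nearest 517 nm and threshold the result. The sketch below assumes a calibrated reflectance cube ordered (rows, columns, bands); the threshold value is a placeholder, not the calibrated value used in the study.

```python
import numpy as np

def contamination_mask(cube, wavelengths, num_nm=565.0, den_nm=517.0, threshold=1.05):
    """Band-ratio contamination mask for a hyperspectral cube (rows, cols, bands).

    The ratio image of the bands nearest `num_nm` and `den_nm` is thresholded.
    `threshold` is an example value only.
    """
    wavelengths = np.asarray(wavelengths, dtype=float)
    i_num = int(np.argmin(np.abs(wavelengths - num_nm)))
    i_den = int(np.argmin(np.abs(wavelengths - den_nm)))
    ratio = cube[..., i_num] / np.clip(cube[..., i_den], 1e-6, None)  # avoid divide-by-zero
    return ratio > threshold
```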

  16. Real-time auto-adaptive margin generation for MLC-tracked radiotherapy

    NASA Astrophysics Data System (ADS)

    Glitzner, M.; Fast, M. F.; de Senneville, B. Denis; Nill, S.; Oelfke, U.; Lagendijk, J. J. W.; Raaymakers, B. W.; Crijns, S. P. M.

    2017-01-01

    In radiotherapy, abdominal and thoracic sites are candidates for performing motion tracking. With real-time control it is possible to adjust the multileaf collimator (MLC) position to the target position. However, positions are not perfectly matched, and position errors arise from system delays and the complicated response of the electromechanical MLC system. Although it is possible to compensate for part of these errors by using predictors, residual errors remain and need to be compensated to retain target coverage. This work presents a method to statistically describe tracking errors and to automatically derive a patient-specific, per-segment margin to compensate for the arising underdosage on-line, i.e. during plan delivery. The statistics of the geometric error between intended and actual machine position are derived using kernel density estimators. Subsequently a margin is calculated on-line according to a selected coverage parameter, which determines the amount of accepted underdosage. The margin is then applied to the actual segment to accommodate the positioning errors in the enlarged segment. The proof-of-concept was tested in an on-line tracking experiment and showed the ability to recover underdosages for two test cases, increasing V90% in the underdosed area by about 47% and 41%, respectively. The dose model used was able to predict the loss of dose due to tracking errors and could be used to infer the necessary margins. The implementation had a running time of 23 ms, which is compatible with the real-time requirements of MLC tracking systems. The auto-adaptivity to machine and patient characteristics makes the technique a generic yet intuitive candidate to avoid underdosages due to MLC tracking errors.
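
    The margin-generation idea can be illustrated compactly: fit a kernel density estimate to recent tracking errors and read off the error magnitude that covers a chosen fraction of the distribution. The sketch below is one-dimensional and ignores segment geometry and directionality, so it is only a simplified stand-in for the per-segment procedure described above.

```python
import numpy as np
from scipy.stats import gaussian_kde

def tracking_margin(errors_mm, coverage=0.95, grid_mm=np.linspace(-15, 15, 3001)):
    """Margin (mm) that covers `coverage` of the KDE-estimated error distribution."""
    kde = gaussian_kde(errors_mm)
    pdf = kde(grid_mm)
    cdf = np.cumsum(pdf)
    cdf /= cdf[-1]                       # normalise to a proper CDF
    return float(np.interp(coverage, cdf, grid_mm))

# Example with synthetic tracking errors (mm): delayed, slightly biased response.
rng = np.random.default_rng(0)
errors = rng.normal(loc=0.8, scale=1.5, size=2000)
print(f"margin for 95% coverage: {tracking_margin(errors):.1f} mm")
```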

  17. The Neural-fuzzy Thermal Error Compensation Controller on CNC Machining Center

    NASA Astrophysics Data System (ADS)

    Tseng, Pai-Chung; Chen, Shen-Len

    Geometric errors and structural thermal deformation are factors that influence the machining accuracy of a Computer Numerical Control (CNC) machining center. Therefore, researchers pay attention to thermal error compensation technologies for CNC machine tools. Some real-time error compensation techniques have been demonstrated successfully in both laboratories and industrial sites, but the compensation results still need to be enhanced. In this research, neural-fuzzy theory was used to derive a thermal prediction model. An IC-type thermometer was used to detect the temperature variation of the heat sources. The thermal drifts are measured online by a touch-triggered probe with a standard bar. A thermal prediction model is then derived by neural-fuzzy theory based on the temperature variation and the thermal drifts. A Graphic User Interface (GUI) system was also built to provide a user-friendly operation interface, using Inprise C++ Builder. The experimental results show that the thermal prediction model developed with the neural-fuzzy methodology can improve machining accuracy from 80 µm to 3 µm. Compared with multi-variable linear regression analysis, the compensation accuracy is increased from ±10 µm to ±3 µm.

  18. Optimized tuner selection for engine performance estimation

    NASA Technical Reports Server (NTRS)

    Simon, Donald L. (Inventor); Garg, Sanjay (Inventor)

    2013-01-01

    A methodology for minimizing the error in on-line Kalman filter-based aircraft engine performance estimation applications is presented. This technique specifically addresses the underdetermined estimation problem, where there are more unknown parameters than available sensor measurements. A systematic approach is applied to produce a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. Tuning parameter selection is performed using a multi-variable iterative search routine which seeks to minimize the theoretical mean-squared estimation error. Theoretical Kalman filter estimation error bias and variance values are derived at steady-state operating conditions, and the tuner selection routine is applied to minimize these values. The new methodology yields an improvement in on-line engine performance estimation accuracy.

  19. Advanced Water Vapor Lidar Detection System

    NASA Technical Reports Server (NTRS)

    Elsayed-Ali, Hani

    1998-01-01

    In the present water vapor lidar system, the detected signal is sent over long cables to a waveform digitizer in a CAMAC crate. This has the disadvantage of transmitting analog signals over a relatively long distance, where they are subject to pickup noise, leading to a decrease in the signal-to-noise ratio. Generally, errors in the measurement of water vapor with the DIAL method arise from both random and systematic sources. Systematic errors in DIAL measurements are caused by both atmospheric and instrumentation effects. The selection of an on-line alexandrite laser with narrow linewidth, suitable intensity, and high spectral purity, and its operation at the center of the water vapor lines, minimizes the influence of the laser spectral distribution on the DIAL measurement and avoids system overloads. Random errors are caused by noise in the detected signal. Variability of the photon statistics in the lidar return signal, noise resulting from detector dark current, and noise in the background signal are the main sources of random error. This type of error can be minimized by maximizing the signal-to-noise ratio, which can be increased in several ways. One way is to increase the laser pulse energy, by increasing its amplitude or the pulse repetition rate. Another way is to use a detector system with higher quantum efficiency and lower noise; in addition, the selection of a narrow-band optical filter that rejects most of the daytime background light while retaining high optical efficiency is important. Following acquisition of the lidar data, random errors in the DIAL measurement can be reduced by averaging the data, but this reduces the vertical and horizontal resolutions, so a trade-off is necessary to balance spatial resolution against measurement precision. Therefore, the main goal of this research effort is to increase the signal-to-noise ratio by a factor of 10 over the current system, using a newly evaluated, very low noise avalanche photodiode detector and constructing a 10 MHz waveform digitizer to replace the current CAMAC system.

  20. Microfluidic devices for sample preparation and rapid detection of foodborne pathogens.

    PubMed

    Kant, Krishna; Shahbazi, Mohammad-Ali; Dave, Vivek Priy; Ngo, Tien Anh; Chidambara, Vinayaka Aaydha; Than, Linh Quyen; Bang, Dang Duong; Wolff, Anders

    2018-03-10

    Rapid detection of foodborne pathogens at an early stage is imperative for preventing the outbreak of foodborne diseases, known as serious threats to human health. Conventional bacterial culturing methods for foodborne pathogen detection are time consuming, laborious, and offer poor diagnostic performance. This has prompted researchers to call the current status of detection approaches into question and to leverage new technologies for superior pathogen sensing outcomes. Novel strategies mainly rely on incorporating all the steps from sample preparation to detection in miniaturized devices for online monitoring of pathogens with high accuracy and sensitivity in a time-saving and cost-effective manner. Lab-on-a-chip is a rapidly growing area of diagnostics, which exploits different mechanical and biological techniques to detect very low concentrations of pathogens in food samples. This is achieved through streamlining the sample handling and concentration procedures, which subsequently reduces human errors and enhances the accuracy of the sensing methods. Integration of sample preparation techniques into these devices can effectively minimize the impact of complex food matrices on pathogen diagnosis and improve the limits of detection. Integration of pathogen-capturing bio-receptors on microfluidic devices is a crucial step, which can facilitate recognition abilities in harsh chemical and physical conditions, offering a great commercial benefit to the food-manufacturing sector. This article reviews recent advances in the current state of the art of sample preparation and concentration from food matrices, with a focus on bacterial capturing methods and sensing technologies, along with their advantages and limitations when integrated into microfluidic devices for online rapid detection of pathogens in foods and in the food production line. Copyright © 2018. Published by Elsevier Inc.

  1. VizieR Online Data Catalog: ROSAT detected quasars. I. (Brinkmann+ 1997)

    NASA Astrophysics Data System (ADS)

    Brinkmann, W.; Yuan, W.

    1996-09-01

    We have compiled a sample of all quasars with measured radio emission from the Veron-Cetty - Veron catalogue (1993, VV93) detected by ROSAT in the All-Sky Survey (RASS, Voges 1992), as targets of pointed observations, or as serendipitous sources from pointed observations publicly available from the ROSAT point source catalogue (ROSAT-SRC, Voges et al. 1995). The total number of ROSAT detected radio quasars from the above three sources is 654 objects. 69 of the objects are classified as radio-quiet using the defining line at a radio-loudness of 1.0, and 10 objects have no classification. The 5 GHz data are from the 87GB radio survey, the NED database, or from the Veron-Cetty - Veron catalogue. The power law indices and their errors are estimated from the two hardness ratios given by the SASS assuming Galactic absorption. The X-ray flux densities in the ROSAT band (0.1-2.4 keV) are calculated from the count rates using the energy-to-counts conversion factor for power law spectra and Galactic absorption. For the photon index we use the value obtained for an individual source if the estimated 1 sigma error is smaller than 0.5; otherwise we use the mean value 2.14. (1 data file).

  2. The Error Reporting in the ATLAS TDAQ System

    NASA Astrophysics Data System (ADS)

    Kolos, Serguei; Kazarov, Andrei; Papaevgeniou, Lykourgos

    2015-05-01

    The ATLAS Error Reporting provides a service that allows experts and the shift crew to track and address errors relating to the data taking components and applications. This service, called the Error Reporting Service (ERS), gives software applications the opportunity to collect and send comprehensive data about run-time errors to a place where they can be intercepted in real time by any other system component. Other ATLAS online control and monitoring tools use the ERS as one of their main inputs to address system problems in a timely manner and to improve the quality of acquired data. The actual destination of the error messages depends solely on the run-time environment in which the online applications are operating. When an application sends information to ERS, depending on the configuration, it may end up in a local file, a database, or distributed middleware, which can transport it to an expert system or display it to users. Thanks to the open framework design of ERS, new information destinations can be added at any moment without touching the reporting and receiving applications. The ERS Application Program Interface (API) is provided in three programming languages used in the ATLAS online environment: C++, Java and Python. All APIs use exceptions for error reporting, but each of them exploits advanced features of the given language to simplify end-user program writing. For example, in C++ a number of macros have been designed to generate hierarchies of exception classes at compile time. Using this approach a software developer can write a single line of code to generate boilerplate code for a fully qualified C++ exception class declaration with an arbitrary number of parameters and multiple constructors, which encapsulates all relevant static information about the given type of issue. When a corresponding error occurs at run time, the program only needs to create an instance of that class, passing relevant values to one of the available class constructors, and send this instance to ERS. This paper presents the original design solutions exploited for the ERS implementation and describes how it was used during the first ATLAS run period. The cross-system error reporting standardization introduced by ERS was one of the key points for the successful implementation of automated mechanisms for online error recovery.
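
    The following Python fragment is purely illustrative of the idea of programmatically generating an exception hierarchy that carries named parameters; it is not the ERS API and does not reproduce the C++ macros, only their spirit.

```python
def declare_issue(name, base=Exception, fields=()):
    """Create an exception class `name` derived from `base`, requiring the
    named keyword parameters and storing them as attributes (illustrative only)."""
    def __init__(self, message, **kwargs):
        missing = [f for f in fields if f not in kwargs]
        if missing:
            raise TypeError(f"{name} missing parameters: {missing}")
        Exception.__init__(self, message)
        self.__dict__.update(kwargs)          # attach the issue parameters
    return type(name, (base,), {"__init__": __init__, "fields": tuple(fields)})

# Hypothetical issue hierarchy, not part of any real ERS binding.
FileIssue = declare_issue("FileIssue", fields=("path",))
CannotOpenFile = declare_issue("CannotOpenFile", base=FileIssue, fields=("path", "errno"))

try:
    raise CannotOpenFile("configuration unreadable", path="/tmp/daq.cfg", errno=13)
except FileIssue as e:                        # caught anywhere in the hierarchy
    print(type(e).__name__, e, e.path)
```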

  3. Time-resolved in vivo luminescence dosimetry for online error detection in pulsed dose-rate brachytherapy.

    PubMed

    Andersen, Claus E; Nielsen, Søren Kynde; Lindegaard, Jacob Christian; Tanderup, Kari

    2009-11-01

    The purpose of this study is to present and evaluate a dose-verification protocol for pulsed dose-rate (PDR) brachytherapy based on in vivo time-resolved (1 s time resolution) fiber-coupled luminescence dosimetry. Five cervix cancer patients undergoing PDR brachytherapy (Varian GammaMed Plus with 192Ir) were monitored. The treatments comprised from 10 to 50 pulses (1 pulse/h) delivered by intracavitary/interstitial applicators (tandem-ring systems and/or needles). For each patient, one or two dosimetry probes were placed directly in or close to the tumor region using stainless steel or titanium needles. Each dosimeter probe consisted of a small aluminum oxide crystal attached to an optical fiber cable (1 mm outer diameter) that could guide radioluminescence (RL) and optically stimulated luminescence (OSL) from the crystal to special readout instrumentation. Positioning uncertainty and hypothetical dose-delivery errors (interchanged guide tubes or applicator movements from ±5 to ±15 mm) were simulated in software in order to assess the ability of the system to detect errors. For three of the patients, the authors found no significant differences (P>0.01) for comparisons between in vivo measurements and calculated reference values at the level of dose per dwell position, dose per applicator, or total dose per pulse. The standard deviations of the dose per pulse were less than 3%, indicating a stable dose delivery and a highly stable geometry of applicators and dosimeter probes during the treatments. For the two other patients, the authors noted significant deviations for three individual pulses and for one dosimeter probe. These deviations could have been due to applicator movement during the treatment and one incorrectly positioned dosimeter probe, respectively. Computer simulations showed that the likelihood of detecting a pair of interchanged guide tubes increased by a factor of 10 or more for the considered patients when going from integrating to time-resolved dose verification. The likelihood of detecting a ±15 mm displacement error increased by a factor of 1.5 or more. In vivo fiber-coupled RL/OSL dosimetry based on detectors placed in standard brachytherapy needles was demonstrated. The time-resolved dose-rate measurements were found to provide a good way to visualize the progression and stability of PDR brachytherapy dose delivery, and time-resolved dose-rate measurements provided an increased sensitivity for detection of dose-delivery errors compared with time-integrated dosimetry.

  4. Which Measures of Online Control Are Least Sensitive to Offline Processes?

    PubMed

    de Grosbois, John; Tremblay, Luc

    2018-02-28

    A major challenge to the measurement of online control is the contamination by offline, planning-based processes. The current study examined the sensitivity of four measures of online control to offline changes in reaching performance induced by prism adaptation and terminal feedback. These measures included the squared Z scores (Z²) of correlations of limb position at 75% movement time versus movement end, variable error, time after peak velocity, and a frequency-domain analysis (pPower). The results indicated that variable error and time after peak velocity were sensitive to the prism adaptation. Furthermore, only the Z² values were biased by the terminal feedback. Ultimately, the current study has demonstrated the sensitivity of limb kinematic measures to offline control processes and that pPower analyses may yield the most suitable measure of online control.
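
    The Z² measure mentioned above is conventionally obtained by correlating, across trials, the limb position at 75% of movement time with the position at movement end, Fisher-transforming the correlation and squaring it. A minimal sketch, assuming two equal-length arrays of trial positions, follows; the interpretation noted in the comments is the conventional one, not a claim from this abstract.

```python
import numpy as np

def z_squared(pos_75, pos_end):
    """Squared Fisher z of the trial-by-trial correlation between limb position
    at 75% of movement time and at movement end.

    Larger values are usually read as endpoints being determined early in the
    movement (little online correction); smaller values as more online amendment.
    """
    r = np.corrcoef(pos_75, pos_end)[0, 1]
    z = np.arctanh(r)          # Fisher r-to-z transform
    return z ** 2
```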

  5. A prospective study on an innovative online forum for peer reviewing of surgical science

    PubMed Central

    von Allmen, Regula S.; Carradice, Dan; Oosterling, Steven J.; McFarlane, Kirsty; Wijnhoven, Bas

    2017-01-01

    Background Peer review is important to the scientific process. However, the present system has been criticised and accused of bias, lack of transparency, failure to detect significant breakthrough and error. At the British Journal of Surgery (BJS), after surveying authors’ and reviewers’ opinions on peer review, we piloted an open online forum with the aim of improving the peer review process. Methods In December 2014, a web-based survey assessing attitudes towards open online review was sent to reviewers with a BJS account in Scholar One. From April to June 2015, authors were invited to allow their manuscripts to undergo online peer review in addition to the standard peer review process. The quality of each review was evaluated by editors and editorial assistants using a validated instrument based on a Likert scale. Results The survey was sent to 6635 reviewers. In all, 1454 (21.9%) responded. Support for online peer review was strong, with only 10% stating that they would not subject their manuscripts to online peer review. The most prevalent concern was about intellectual property, being highlighted in 118 of 284 comments (41.5%). Out of 265 eligible manuscripts, 110 were included in the online peer review trial. Around 7000 potential reviewers were invited to review each manuscript. In all, 44 of 110 manuscripts (40%) received 100 reviews from 59 reviewers, alongside 115 conventional reviews. The quality of the open forum reviews was lower than for conventional reviews (2.13 (± 0.75) versus 2.84 (± 0.71), P<0.001). Conclusion Open online peer review is feasible in this setting, but it attracts few reviews, of lower quality than conventional peer reviews. PMID:28662046

  6. A prospective study on an innovative online forum for peer reviewing of surgical science.

    PubMed

    Almquist, Martin; von Allmen, Regula S; Carradice, Dan; Oosterling, Steven J; McFarlane, Kirsty; Wijnhoven, Bas

    2017-01-01

    Peer review is important to the scientific process. However, the present system has been criticised and accused of bias, lack of transparency, failure to detect significant breakthrough and error. At the British Journal of Surgery (BJS), after surveying authors' and reviewers' opinions on peer review, we piloted an open online forum with the aim of improving the peer review process. In December 2014, a web-based survey assessing attitudes towards open online review was sent to reviewers with a BJS account in Scholar One. From April to June 2015, authors were invited to allow their manuscripts to undergo online peer review in addition to the standard peer review process. The quality of each review was evaluated by editors and editorial assistants using a validated instrument based on a Likert scale. The survey was sent to 6635 reviewers. In all, 1454 (21.9%) responded. Support for online peer review was strong, with only 10% stating that they would not subject their manuscripts to online peer review. The most prevalent concern was about intellectual property, being highlighted in 118 of 284 comments (41.5%). Out of 265 eligible manuscripts, 110 were included in the online peer review trial. Around 7000 potential reviewers were invited to review each manuscript. In all, 44 of 110 manuscripts (40%) received 100 reviews from 59 reviewers, alongside 115 conventional reviews. The quality of the open forum reviews was lower than for conventional reviews (2.13 (± 0.75) versus 2.84 (± 0.71), P<0.001). Open online peer review is feasible in this setting, but it attracts few reviews, of lower quality than conventional peer reviews.

  7. Online beam energy measurement of Beijing electron positron collider II linear accelerator

    NASA Astrophysics Data System (ADS)

    Wang, S.; Iqbal, M.; Liu, R.; Chi, Y.

    2016-02-01

    This paper describes the online beam energy measurement of the upgraded Beijing Electron Positron Collider II (BEPCII) linear accelerator (linac). It presents the calculation formula, gives a detailed error analysis, discusses the practical realization, and provides some verification. The method measures the beam energy by acquiring the horizontal beam position with three beam position monitors (BPMs), which eliminates the effect of orbit fluctuation and performs much better than using a single BPM. The error analysis indicates that this online measurement has further potential uses, such as forming part of a beam energy feedback system. The reliability of the method is also discussed and demonstrated in this paper.

  8. Online beam energy measurement of Beijing electron positron collider II linear accelerator.

    PubMed

    Wang, S; Iqbal, M; Liu, R; Chi, Y

    2016-02-01

    This paper describes the online beam energy measurement of the upgraded Beijing Electron Positron Collider II (BEPCII) linear accelerator (linac). It presents the calculation formula, gives a detailed error analysis, discusses the practical realization, and provides some verification. The method measures the beam energy by acquiring the horizontal beam position with three beam position monitors (BPMs), which eliminates the effect of orbit fluctuation and performs much better than using a single BPM. The error analysis indicates that this online measurement has further potential uses, such as forming part of a beam energy feedback system. The reliability of the method is also discussed and demonstrated in this paper.

  9. Intrarater Reliability and Other Psychometrics of the Health Promoting Activities Scale (HPAS).

    PubMed

    Muskett, Rachel; Bourke-Taylor, Helen; Hewitt, Alana

    The Health Promoting Activities Scale (HPAS) measures the self-rated frequency with which adults participate in activities that promote health. We evaluated the internal consistency, construct validity, and intrarater reliability of the HPAS with a cohort of mothers (N = 56) of school-age children. We used an online survey that included the HPAS and measures of mental and physical health. Statistical analysis included intraclass correlation coefficients (ICCs), measurement error, error range, limits of agreement, and minimum detectable change (MDC). The HPAS showed good internal consistency (Cronbach's α = .73). Construct validity was supported by a significant difference in HPAS scores among participants grouped by physical activity level; no other differences were significant. Results included a high aggregate ICC of .90 and an MDC of 5 points. Our evaluation of the HPAS revealed good reliability and stability, suggesting suitability for ongoing evaluation as an outcome measure. Copyright © 2017 by the American Occupational Therapy Association, Inc.

  10. On-line diagnosis of sequential systems

    NASA Technical Reports Server (NTRS)

    Sundstrom, R. J.

    1973-01-01

    A model for on-line diagnosis was investigated for discrete-time systems and resettable sequential systems. Generalized notions of a realization are discussed, along with fault tolerance and errors. Further investigation into the theory of on-line diagnosis is recommended at three levels: the binary state-assigned level, the logical circuit level, and the subsystem-network level.

  11. Author Correction to: Pooled Analyses of Phase III Studies of ADS-5102 (Amantadine) Extended-Release Capsules for Dyskinesia in Parkinson's Disease.

    PubMed

    Elmer, Lawrence W; Juncos, Jorge L; Singer, Carlos; Truong, Daniel D; Criswell, Susan R; Parashos, Sotirios; Felt, Larissa; Johnson, Reed; Patni, Rajiv

    2018-04-01

    An Online First version of this article was made available online at http://link.springer.com/journal/40263/onlineFirst/page/1 on 12 March 2018. An error was subsequently identified in the article, and the following correction should be noted.

  12. Positioning of head and neck patients for proton therapy using proton range probes: a proof of concept study

    NASA Astrophysics Data System (ADS)

    Hammi, A.; Placidi, L.; Weber, D. C.; Lomax, A. J.

    2018-01-01

    To exploit the full potential of proton therapy, accurate and on-line methods to verify the patient positioning and the proton range during treatment are desirable. Here we propose and validate an innovative technique for determining patient misalignment uncertainties through the use of a small number of low-dose, carefully selected proton pencil beams ('range probes', RP) with sufficient energy that their residual Bragg peak (BP) position and shape can be measured on exit. Since any change of the patient orientation in relation to these beams will result in changes of the density heterogeneities through which they pass, our hypothesis is that patient misalignments can be deduced from measured changes in Bragg curve (BC) shape and range. As such, a simple and robust methodology has been developed that estimates the average proton range and range dilution of the detected residual BC, in order to locate range probe positions with optimal prediction power for detecting misalignments. The validation of this RP-based approach has been split into two phases. First, we retrospectively investigate its potential to detect translational patient misalignments under real clinical conditions. Second, we test it for determining rotational errors of an anthropomorphic phantom that was systematically rotated using an in-house developed high-precision motion stage. Simulations of RPs in these two scenarios show that this approach could potentially determine translational errors to better than 1.5 mm and rotational errors to better than 1° using only three or five RP positions, respectively.
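
    The paper's exact definitions of average range and range dilution are not given in this abstract; common surrogates are the depth of the distal 80% dose point and the width of the distal falloff of the residual Bragg curve. The sketch below computes those two surrogates from a sampled depth-dose curve and is intended only as an illustration under that assumption.

```python
import numpy as np

def distal_depth(depth_mm, dose, level):
    """Depth where the dose falls to `level` * max on the distal side of the peak.
    Assumes the curve actually drops below `level` within the measured depths."""
    depth_mm = np.asarray(depth_mm, dtype=float)
    dose = np.asarray(dose, dtype=float) / np.max(dose)
    i_peak = int(np.argmax(dose))
    d, z = dose[i_peak:], depth_mm[i_peak:]
    below = np.nonzero(d < level)[0][0]            # first sample below `level`
    z0, z1, d0, d1 = z[below - 1], z[below], d[below - 1], d[below]
    return z0 + (d0 - level) * (z1 - z0) / (d0 - d1)   # linear interpolation

def range_and_dilution(depth_mm, dose):
    """Residual range (distal 80% depth) and a falloff-width surrogate (80%-20%)."""
    r80 = distal_depth(depth_mm, dose, 0.8)
    r20 = distal_depth(depth_mm, dose, 0.2)
    return r80, r20 - r80
```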

  13. Mark-Up-Based Writing Error Analysis Model in an On-Line Classroom.

    ERIC Educational Resources Information Center

    Feng, Cheng; Yano, Yoneo; Ogata, Hiroaki

    2000-01-01

    Describes a new component called the "Writing Error Analysis Model" (WEAM) in the CoCoA system for teaching writing composition in Japanese as a foreign language. The WEAM can be used for analyzing learners' morphological errors and selecting appropriate compositions for learners' revising exercises. (Author/VWL)

  14. The Seven Deadly Sins of Online Microcomputing.

    ERIC Educational Resources Information Center

    King, Alan

    1989-01-01

    Offers suggestions for avoiding common errors in online microcomputer use. Areas discussed include learning the basics; hardware protection; backup options; hard disk organization; software selection; file security; and the use of dedicated communications lines. (CLB)

  15. Simultaneous Online Measurement of H2O and CO2 in the Humid CO2 Adsorption/Desorption Process.

    PubMed

    Yu, Qingni; Ye, Sha; Zhu, Jingke; Lei, Lecheng; Yang, Bin

    2015-01-01

    A dew point meter (DP) and an infrared (IR) CO2 analyzer were connected in series in a humid CO2 adsorption/desorption system for simultaneous online measurement of H2O and CO2, respectively. A self-made humidifier, which flushes air over the surface of a saturated brine solution, was used to generate the humid air flow. With this method it is relatively easy to obtain a low H2O content in the air flow, and its fluctuation can be reduced compared with the bubbling method. Water calibration of the DP-IR detector is necessary to minimize the measurement error for H2O. The relative error (RA) for simultaneous online measurement of H2O and CO2 in the desorption process is lower than 0.1%. The higher RA in the adsorption of H2O is attributed to H2O adsorption on the transfer pipe and amplification of the measurement error. The high accuracy of simultaneous online measurement of H2O and CO2 is promising for investigating their co-adsorption/desorption behaviors, especially for direct CO2 capture from ambient air.

  16. On-line estimation of error covariance parameters for atmospheric data assimilation

    NASA Technical Reports Server (NTRS)

    Dee, Dick P.

    1995-01-01

    A simple scheme is presented for on-line estimation of covariance parameters in statistical data assimilation systems. The scheme is based on a maximum-likelihood approach in which estimates are produced on the basis of a single batch of simultaneous observations. Single-sample covariance estimation is reasonable as long as the number of available observations exceeds the number of tunable parameters by two or three orders of magnitude. Not much is known at present about model error associated with actual forecast systems. Our scheme can be used to estimate some important statistical model error parameters such as regionally averaged variances or characteristic correlation length scales. The advantage of the single-sample approach is that it does not rely on any assumptions about the temporal behavior of the covariance parameters: time-dependent parameter estimates can be continuously adjusted on the basis of current observations. This is of practical importance since it is likely to be the case that both model error and observation error strongly depend on the actual state of the atmosphere. The single-sample estimation scheme can be incorporated into any four-dimensional statistical data assimilation system that involves explicit calculation of forecast error covariances, including optimal interpolation (OI) and the simplified Kalman filter (SKF). The computational cost of the scheme is high but not prohibitive; on-line estimation of one or two covariance parameters in each analysis box of an operational boxed-OI system is currently feasible. A number of numerical experiments performed with an adaptive SKF and an adaptive version of OI, using a linear two-dimensional shallow-water model and artificially generated model error, are described. The performance of the nonadaptive versions of these methods turns out to depend rather strongly on correct specification of model error parameters. These parameters are estimated under a variety of conditions, including uniformly distributed model error and time-dependent model error statistics.
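
    A toy version of the single-batch maximum-likelihood idea: given an innovation vector v = y - H x_f with covariance S(q) = H (P_b + q I) H^T + R, a scalar model-error variance q is chosen to maximize the Gaussian likelihood of v. The parameterization P_b + q I is an assumption made here purely for illustration; the operational scheme estimates quantities such as regionally averaged variances and correlation length scales.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def estimate_model_error_variance(v, H, Pb, R):
    """Maximum-likelihood estimate of a scalar model-error variance q from a
    single batch of innovations v = y - H x_f, assuming forecast error
    covariance Pb + q * I (toy parameterization)."""
    def neg_log_like(log_q):
        q = np.exp(log_q)                                   # enforce q > 0
        S = H @ (Pb + q * np.eye(Pb.shape[0])) @ H.T + R    # innovation covariance
        _, logdet = np.linalg.slogdet(S)
        return 0.5 * (logdet + v @ np.linalg.solve(S, v))   # Gaussian -log likelihood

    res = minimize_scalar(neg_log_like, bounds=(-10.0, 10.0), method="bounded")
    return float(np.exp(res.x))
```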

  17. [Study on high accuracy detection of multi-component gas in oil-immerse power transformer].

    PubMed

    Fan, Jie; Chen, Xiao; Huang, Qi-Feng; Zhou, Yu; Chen, Gang

    2013-12-01

    In order to solve the problem of low accuracy and mutual interference in multi-component gas detection, a high-accuracy multi-component gas detection network was designed. A semiconductor laser with narrow bandwidth was utilized as the light source, and a novel long-path gas cell was also used in this system. By using a single sine signal to modulate the laser spectrum and applying space division multiplexing (SDM) and time division multiplexing (TDM) techniques, the detection of multi-component gas was achieved. The experiments indicate that the linear correlation coefficient is 0.99 and the relative measurement error is less than 4%. The system dynamic response time, determined by gradually filling the gas cell with a volume of multi-component gas, is less than 15 s. The system has the advantages of high accuracy and quick response, and can be used for real-time, on-line monitoring of fault gases in power transformers.

  18. Detecting and isolating abrupt changes in linear switching systems

    NASA Astrophysics Data System (ADS)

    Nazari, Sohail; Zhao, Qing; Huang, Biao

    2015-04-01

    In this paper, a novel fault detection and isolation (FDI) method for switching linear systems is developed. All input and output signals are assumed to be corrupted by measurement noise. In the proposed method, a 'lifted' linear model named the stochastic hybrid decoupling polynomial (SHDP) is introduced. The SHDP model governs the dynamics of the switching linear system across all modes and is independent of the switching sequence. The errors-in-variables (EIV) representation of the SHDP is derived and used for fault residual generation and isolation following the widely adopted local approach. The proposed FDI method can detect and isolate fault-induced abrupt changes in the switching models' parameters without estimating the switching modes. Furthermore, the analytical expressions of the gradient vector and Hessian matrix are obtained based on the EIV SHDP formulation, so that they can be used to implement the online fault detection scheme. The performance of the proposed method is illustrated by simulation examples.

  19. A service evaluation of on-line image-guided radiotherapy to lower extremity sarcoma: Investigating the workload implications of a 3 mm action level for image assessment and correction prior to delivery.

    PubMed

    Taylor, C; Parker, J; Stratford, J; Warren, M

    2018-05-01

    Although all systematic and random positional setup errors can be corrected for in their entirety during on-line image-guided radiotherapy, the use of a specified action level, below which no correction occurs, is also an option. The following service evaluation aimed to investigate the use of this 3 mm action level for on-line image assessment and correction (online, systematic set-up error and weekly evaluation) for lower extremity sarcoma, and to understand the impact on imaging frequency and patient positioning error within one cancer centre. All patients were immobilised using a thermoplastic shell attached to a plastic base and an individual moulded footrest. A retrospective analysis of 30 patients was performed. Patient setup and correctional data derived from cone beam CT analysis were retrieved. The timing, frequency and magnitude of corrections were evaluated. The population systematic and random errors were derived. 20% of patients had no systematic corrections over the duration of treatment, and 47% had one. The maximum number of systematic corrections per course of radiotherapy was 4, which occurred for 2 patients. 34% of correction episodes occurred within the first 5 fractions. All patients had at least one observed translational error during their treatment greater than 0.3 cm, and 80% of patients had at least one observed translational error during their treatment greater than 0.5 cm. The population systematic error was 0.14 cm, 0.10 cm, 0.14 cm and the random error was 0.27 cm, 0.22 cm, 0.23 cm in the lateral, caudocranial and anteroposterior directions. The required Planning Target Volume margin for the study population was 0.55 cm, 0.41 cm and 0.50 cm in the lateral, caudocranial and anteroposterior directions. The 3 mm action level for image assessment and correction prior to delivery reduced the imaging burden and focussed intervention on patients that exhibited greater positional variability. This strategy could be an efficient deployment of departmental resources if full daily correction of positional setup error is not possible. Copyright © 2017. Published by Elsevier Ltd.
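
    The reported margins are consistent with the widely used recipe M = 2.5Σ + 0.7σ (the van Herk formula); the abstract does not state which recipe was applied, so the following is only a plausibility check on the numbers quoted above.

```python
# Margin recipe M = 2.5 * Sigma + 0.7 * sigma applied to the reported
# systematic (Sigma) and random (sigma) errors, in cm.
for axis, Sigma, sigma in [("lateral", 0.14, 0.27),
                           ("caudocranial", 0.10, 0.22),
                           ("anteroposterior", 0.14, 0.23)]:
    print(f"{axis}: {2.5 * Sigma + 0.7 * sigma:.2f} cm")
# -> 0.54, 0.40, 0.51 cm, close to the reported 0.55, 0.41 and 0.50 cm.
```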

  20. A wavelet-based ECG delineation algorithm for 32-bit integer online processing

    PubMed Central

    2011-01-01

    Background Since the first well-known electrocardiogram (ECG) delineator based on Wavelet Transform (WT) presented by Li et al. in 1995, a significant research effort has been devoted to the exploitation of this promising method. Its ability to reliably delineate the major waveform components (mono- or bi-phasic P wave, QRS, and mono- or bi-phasic T wave) would make it a suitable candidate for efficient online processing of ambulatory ECG signals. Unfortunately, previous implementations of this method adopt non-linear operators such as root mean square (RMS) or floating point algebra, which are computationally demanding. Methods This paper presents a 32-bit integer, linear algebra advanced approach to online QRS detection and P-QRS-T waves delineation of a single lead ECG signal, based on WT. Results The QRS detector performance was validated on the MIT-BIH Arrhythmia Database (sensitivity Se = 99.77%, positive predictive value P+ = 99.86%, on 109010 annotated beats) and on the European ST-T Database (Se = 99.81%, P+ = 99.56%, on 788050 annotated beats). The ECG delineator was validated on the QT Database, showing a mean error between manual and automatic annotation below 1.5 samples for all fiducial points: P-onset, P-peak, P-offset, QRS-onset, QRS-offset, T-peak, T-offset, and a mean standard deviation comparable to other established methods. Conclusions The proposed algorithm exhibits reliable QRS detection as well as accurate ECG delineation, in spite of a simple structure built on integer linear algebra. PMID:21457580

  1. A wavelet-based ECG delineation algorithm for 32-bit integer online processing.

    PubMed

    Di Marco, Luigi Y; Chiari, Lorenzo

    2011-04-03

    Since the first well-known electrocardiogram (ECG) delineator based on Wavelet Transform (WT) presented by Li et al. in 1995, a significant research effort has been devoted to the exploitation of this promising method. Its ability to reliably delineate the major waveform components (mono- or bi-phasic P wave, QRS, and mono- or bi-phasic T wave) would make it a suitable candidate for efficient online processing of ambulatory ECG signals. Unfortunately, previous implementations of this method adopt non-linear operators such as root mean square (RMS) or floating point algebra, which are computationally demanding. This paper presents a 32-bit integer, linear algebra advanced approach to online QRS detection and P-QRS-T waves delineation of a single lead ECG signal, based on WT. The QRS detector performance was validated on the MIT-BIH Arrhythmia Database (sensitivity Se = 99.77%, positive predictive value P+ = 99.86%, on 109010 annotated beats) and on the European ST-T Database (Se = 99.81%, P+ = 99.56%, on 788050 annotated beats). The ECG delineator was validated on the QT Database, showing a mean error between manual and automatic annotation below 1.5 samples for all fiducial points: P-onset, P-peak, P-offset, QRS-onset, QRS-offset, T-peak, T-offset, and a mean standard deviation comparable to other established methods. The proposed algorithm exhibits reliable QRS detection as well as accurate ECG delineation, in spite of a simple structure built on integer linear algebra.

  2. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thoelking, J; Yuvaraj, S; Jens, F

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstructions based on TD measurements were compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors for leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could clearly be identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable for routine treatment plan verification. Funding Support, Disclosures, and Conflict of Interest: COIs: Frank Lohr: Elekta: research grant, travel grants, teaching honoraria; IBA: research grant, travel grants, teaching honoraria, advisory board; C-Rad: board honoraria, travel grants. Frederik Wenz: Elekta: research grant, teaching honoraria, consultant, advisory board; Zeiss: research grant, teaching honoraria, patent. Hansjoerg Wertz: Elekta: research grant, teaching honoraria; IBA: research grant.
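
    For context, the gamma index underlying these 2D/3D evaluations combines a dose-difference criterion and a distance-to-agreement (DTA) criterion; a point passes if the minimum combined metric is at most 1. The simplified one-dimensional sketch below shows only the core formula, with a 3%/3 mm global criterion as an example; clinical gamma evaluation adds dose thresholds, interpolation and higher dimensions.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=3.0, dd_frac=0.03):
    """Simplified 1D global gamma index between a reference and an evaluated
    dose profile (positions in mm, doses in arbitrary but common units)."""
    x_ref, d_ref = np.asarray(x_ref, float), np.asarray(d_ref, float)
    x_eval, d_eval = np.asarray(x_eval, float), np.asarray(d_eval, float)
    d_max = np.max(d_ref)                               # global normalisation
    gammas = np.empty(len(x_ref))
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dist2 = ((x_eval - xr) / dta_mm) ** 2           # spatial term
        dose2 = ((d_eval - dr) / (dd_frac * d_max)) ** 2  # dose term
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas                                        # pass rate: np.mean(gammas <= 1.0)
```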

  3. SU-E-T-571: Newly Emerging Integrated Transmission Detector Systems Provide Online Quality Assurance of External Beam Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, D; Chung, E; Hess, C

    2015-06-15

    Purpose: Two newly emerging transmission detectors positioned upstream from the patient have been evaluated for online quality assurance of external beam radiotherapy. The prototype of the Integral Quality Monitor (IQM), developed by iRT Systems GmbH (Koblenz, Germany), is a large-area ion chamber mounted on the linac accessory tray to monitor photon fluence, energy, beam shape, and gantry position during treatment. The ion chamber utilizes a thickness gradient which gives a variable response depending on beam position. The prototype of the Delta4 Discover™, developed by ScandiDos (Uppsala, Sweden), is a 4040-diode array mounted on the linac accessory tray that measures photon fluence during patient treatment. Both systems can be employed for patient-specific QA prior to treatment delivery. Methods: Our institution evaluated the reproducibility of measurements using various beam types, including VMAT treatment plans, with both the IQM ion chamber and the Delta4 Discover diode array. Additionally, the IQM's effect on photon fluence, its dose response, simulated beam error detection, and the accuracy of the integrated barometer, thermometer, and inclinometer were characterized. The evaluated photon beam errors are based on the annual tolerances specified in AAPM TG-142. Results: Repeated VMAT treatments were measured with 0.16% reproducibility by the IQM and 0.55% reproducibility by the Delta4 Discover. The IQM attenuated 6, 10, and 15 MV photon beams by 5.43±0.02%, 4.60±0.02%, and 4.21±0.03%, respectively. Photon beam profiles were affected <1.5% in the non-penumbra regions. The IQM ion chamber's dose response was linear, and the thermometer, barometer, and inclinometer agreed with other calibrated devices. The device detected variations in monitor units delivered (1%), field position (3 mm), single MLC leaf positions (13 mm), and photon energy. Conclusion: We have characterized two new transmission detector systems designed to provide in-vivo-like measurements upstream from the patient. Both systems demonstrate substantial utility for online treatment verification and QA of photon external beam radiotherapy.

  4. SU-E-J-199: A Software Tool for Quality Assurance of Online Replanning with MR-Linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, G; Ahunbay, E; Li, X

    2015-06-15

    Purpose: To develop a quality assurance software tool, ArtQA, capable of automatically checking radiation treatment plan parameters, verifying plan data transfer from the treatment planning system (TPS) to the record and verify (R&V) system, performing a secondary MU calculation considering the effect of the magnetic field from the MR-Linac, and verifying delivery and plan consistency, for online replanning. Methods: ArtQA was developed by creating interfaces to the TPS (e.g., Monaco, Elekta), the R&V system (Mosaiq, Elekta), and a secondary MU calculation system. The tool obtains plan parameters from the TPS via direct file reading, and retrieves plan data both transferred from the TPS and recorded during the actual delivery in the R&V system database via open database connectivity and structured query language. By comparing beam/plan datasets in the different systems, ArtQA detects and outputs discrepancies between the TPS, the R&V system, the secondary MU calculation system, and the delivery. To consider the effect of the 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA is capable of automatically checking plan integrity and logic consistency, detecting plan data transfer errors, performing secondary MU calculations with or without a transverse magnetic field, and verifying treatment delivery. The tool is efficient and effective for pre- and post-treatment QA checks of all available treatment parameters that may be impractical with the commonly used visual inspection. Conclusion: The software tool ArtQA can be used for quick and automatic pre- and post-treatment QA checks, eliminating human error associated with visual inspection. While this tool was developed for online replanning on the MR-Linac, where the QA needs to be performed rapidly while the patient is lying on the table waiting for treatment, ArtQA can be used as a general QA tool in radiation oncology practice. This work is partially supported by Elekta Inc.

  5. Random Weighting, Strong Tracking, and Unscented Kalman Filter for Soft Tissue Characterization.

    PubMed

    Shin, Jaehyun; Zhong, Yongmin; Oetomo, Denny; Gu, Chengfan

    2018-05-21

    This paper presents a new nonlinear filtering method based on the Hunt-Crossley model for online nonlinear soft tissue characterization. This method overcomes the problem of performance degradation in the unscented Kalman filter due to contact model error. It adopts the concept of Mahalanobis distance to identify contact model error, and further incorporates a scaling factor in the predicted state covariance to compensate for the identified model error. This scaling factor is determined according to the principle of innovation orthogonality to avoid the cumbersome computation of the Jacobian matrix, and the random weighting concept is adopted to improve the estimation accuracy of the innovation covariance. A master-slave robotic indentation system was developed to validate the performance of the proposed method. Simulation and experimental results, as well as comparison analyses, demonstrate the efficacy of the proposed method for online characterization of soft tissue parameters in the presence of contact model error.
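
    A stripped-down version of the covariance-inflation idea: compute the Mahalanobis distance of the innovation, compare it with a chi-square threshold, and scale the predicted covariance when the threshold is exceeded. The proportional scaling below is a simplification; the paper derives its scaling factor from innovation orthogonality with random weighting, which is not reproduced here.

```python
import numpy as np

def inflate_on_model_error(P_pred, H, R, innovation, chi2_threshold):
    """Detect model error via the Mahalanobis distance of the innovation and,
    if it exceeds a chi-square threshold, inflate the predicted covariance by
    a scalar factor (crude stand-in for the paper's scaling factor)."""
    S = H @ P_pred @ H.T + R                              # innovation covariance
    d2 = float(innovation @ np.linalg.solve(S, innovation))  # squared Mahalanobis distance
    if d2 <= chi2_threshold:
        return P_pred, 1.0                                # no model error flagged
    lam = d2 / chi2_threshold                             # proportional inflation factor
    return lam * P_pred, lam
```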

  6. An On-Line Solid Phase Extraction-Liquid Chromatography-Tandem Mass Spectrometry Method for the Determination of Perfluoroalkyl Acids in Drinking and Surface Waters

    PubMed Central

    Mazzoni, Michela; Rusconi, Marianna; Valsecchi, Sara; Martins, Claudia P. B.; Polesello, Stefano

    2015-01-01

    A UHPLC-MS/MS multiresidue method based on an on-line solid phase extraction (SPE) procedure was developed for the simultaneous determination of 9 perfluorinated carboxylates (from 4 to 12 carbon atoms) and 3 perfluorinated sulphonates (from 4 to 8 carbon atoms). This work proposes using on-line solid phase extraction before chromatographic separation and analysis to replace traditional methods of off-line SPE followed by direct injection into the LC-MS/MS. Manual sample preparation was reduced to sample centrifugation and acidification, thus eliminating several procedural errors and significantly reducing time consumption and costs. Ionization suppression between target perfluorinated analytes and their coeluting SIL-IS was detected for homologues with fewer than nine carbon atoms, but the quantitation was not affected. The total matrix effect corrected by SIL-IS, inclusive of extraction efficiency and ionization efficiency, ranged between −34 and +39%. Recoveries, between 76 and 134%, calculated in different matrices (tap water and rivers impacted by different pollution sources), were generally satisfactory. LODs and LOQs of this on-line SPE method, which also incorporate recovery losses, ranged from 0.2 to 5.0 ng/L and from 1 to 20 ng/L, respectively. The validated on-line SPE-LC-MS/MS method has been applied in a wide survey for the determination of perfluoroalkyl acids in Italian surface and ground waters. PMID:25834752

  7. Optical storage media data integrity studies

    NASA Technical Reports Server (NTRS)

    Podio, Fernando L.

    1994-01-01

    Optical disk-based information systems are being used in private industry and many Federal Government agencies for on-line and long-term storage of large quantities of data. The storage devices that are part of these systems are designed with powerful, but not unlimited, media error correction capabilities. The integrity of data stored on optical disks does not depend only on the life expectancy specifications for the medium. Different factors, including handling and storage conditions, may result in an increase in the size and frequency of medium errors. Monitoring this potential data degradation is crucial, especially for long-term applications. Efforts are being made by the Association for Information and Image Management Technical Committee C21, Storage Devices and Applications, to specify methods for monitoring and reporting to the user medium errors detected by the storage device while writing, reading or verifying the data stored on that medium. The Computer Systems Laboratory (CSL) of the National Institute of Standards and Technology (NIST) has a leadership role in the development of these standard techniques. In addition, CSL is researching other data integrity issues, including the investigation of error-resilient compression algorithms. NIST has conducted care and handling experiments on optical disk media with the objective of identifying possible causes of degradation. NIST work in data integrity and related standards activities is described.

  8. Architectural elements of hybrid navigation systems for future space transportation

    NASA Astrophysics Data System (ADS)

    Trigo, Guilherme F.; Theil, Stephan

    2018-06-01

    The fundamental limitations of inertial navigation, currently employed by most launchers, have raised interest in GNSS-aided solutions. Combining inertial measurements with GNSS outputs allows online inertial calibration, solving the issue of inertial drift. However, many challenges and design options unfold. In this work we analyse several architectural elements and design aspects of a hybrid GNSS/INS navigation system conceived for space transportation. The most fundamental architectural features, such as coupling depth, modularity between filter and inertial propagation, and the open-/closed-loop nature of the configuration, are discussed in light of the envisaged application. The importance of the inertial propagation algorithm and sensor class in the overall system is investigated, and the handling of sensor errors and uncertainties that arise with lower-grade sensors is also considered. In terms of GNSS outputs we consider receiver solutions (position and velocity) and raw measurements (pseudorange, pseudorange-rate and time-difference carrier phase). Receiver clock error handling options and atmospheric error correction schemes for these measurements are analysed under flight conditions. System performance with different GNSS measurements is estimated through covariance analysis, with the differences between loose and tight coupling emphasized through partial-outage simulation. Finally, we discuss options for filter algorithm robustness against non-linearities and system/measurement errors. A possible scheme for fault detection, isolation and recovery is also proposed.

  9. Effects of Listening Conditions, Error Types, and Ensemble Textures on Error Detection Skills

    ERIC Educational Resources Information Center

    Waggoner, Dori T.

    2011-01-01

    This study was designed with three main purposes: (a) to investigate the effects of two listening conditions on error detection accuracy, (b) to compare error detection responses for rhythm errors and pitch errors, and (c) to examine the influences of texture on error detection accuracy. Undergraduate music education students (N = 18) listened to…

  10. Quality Control of Meteorological Observations

    NASA Technical Reports Server (NTRS)

    Collins, William; Dee, Dick; Rukhovets, Leonid

    1999-01-01

    The problem of meteorological observation quality control (QC) was first formulated by L.S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, L.S. Gandin began adapting his ideas on complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. The CQC was first applied by L.S. Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations, and baseline checks were added for the checking and correction of a wide range of meteorological variables. Some other of Gandin's ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is the buddy check, a test of individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances used for the QC decision are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions. The system is then better able to accept extreme values observed in deep cyclones, jet streams and so on. The basic statements of this adaptive buddy check are described. Some results of the on-line QC, including moisture QC, are presented.
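
    A minimal sketch of an adaptive buddy check in this spirit is shown below: a suspect observation is tested against the mean of nearby non-suspect buddies, with a tolerance that scales with the locally estimated spread, so locally consistent extremes can be accepted. The variance estimate and thresholds are illustrative assumptions, not the operational implementation.

```python
# Hedged sketch of a buddy check with on-line re-estimated error variance:
# a suspect observation is compared with the mean of nearby non-suspect
# "buddies"; the tolerance scales with the locally estimated spread, so
# extreme but locally consistent values (e.g. deep cyclones) can be accepted.
import numpy as np

def buddy_check(suspect_value, buddy_values, n_sigma=4.0, min_buddies=3):
    """Return True if the suspect observation is accepted."""
    buddies = np.asarray(buddy_values, dtype=float)
    if buddies.size < min_buddies:
        return True                      # not enough evidence to reject
    local_mean = buddies.mean()
    # on-line estimate of the local error spread from the buddies (illustrative)
    local_std = buddies.std(ddof=1)
    tolerance = n_sigma * max(local_std, 1e-6)
    return abs(suspect_value - local_mean) <= tolerance

if __name__ == "__main__":
    print(buddy_check(985.0, [1002.0, 1001.5, 1003.1]))   # likely rejected
    print(buddy_check(985.0, [988.0, 984.5, 986.2]))      # accepted: buddies agree
```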

  11. Welding deviation detection algorithm based on extremum of molten pool image contour

    NASA Astrophysics Data System (ADS)

    Zou, Yong; Jiang, Lipei; Li, Yunhua; Xue, Long; Huang, Junfen; Huang, Jiqiang

    2016-01-01

    Welding deviation detection is the basis of robotic tracking welding, but on-line real-time measurement of the welding deviation is still not well solved by existing methods. Gas metal arc welding (GMAW) molten pool images contain plenty of information that is very important for the control of welding seam tracking. The physical meaning of the curvature extrema of the molten pool contour is revealed by studying the molten pool images: the feature points of the welding wire center and the molten tip center are the maximum and local maximum of the contour curvature, and the horizontal welding deviation is the position difference between these two extremum points. A new method of welding deviation detection is presented, including preprocessing the molten pool images, extracting and segmenting the contours, obtaining the contour extremum points, and calculating the welding deviation. Extracting the contours is the premise, segmenting the contour lines is the foundation, and obtaining the contour extremum points is the key. The contour images are extracted with a discrete dyadic wavelet transform and divided into two sub-contours, corresponding to the welding wire and the molten tip. The curvature at each point of the two sub-contour lines is calculated using a multi-point approximate curvature formula for plane curves, and the two curvature extremum points are the features needed for the welding deviation calculation. The results of the tests and analyses show that the maximum error of the obtained on-line welding deviation is 2 pixels (0.16 mm), and the algorithm is stable enough to meet the requirements of real-time pipeline control at a speed of less than 500 mm/min. The method can be applied to on-line automatic welding deviation detection.
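
    The sketch below illustrates the deviation computation described above: a standard finite-difference curvature estimate along each segmented sub-contour, with the curvature maxima taken as the wire-centre and molten-tip feature points and the horizontal offset reported in pixels. The curvature formula and step size are illustrative and not necessarily the authors' exact multi-point formula.

```python
# Illustrative sketch of the deviation computation: estimate discrete curvature
# along an already-segmented contour polyline, take the curvature maximum of
# each sub-contour as its feature point, and report the horizontal offset in
# pixels. np.roll wraps around, which assumes a closed (or wrapped) contour.
import numpy as np

def discrete_curvature(points, step=3):
    """Curvature estimate for a polyline of shape (N, 2) given as (x, y) pixels."""
    points = np.asarray(points, dtype=float)
    p_prev = np.roll(points, step, axis=0)
    p_next = np.roll(points, -step, axis=0)
    d1 = (p_next - p_prev) / 2.0                 # first derivative
    d2 = p_next - 2.0 * points + p_prev          # second derivative
    num = np.abs(d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0])
    den = (d1[:, 0] ** 2 + d1[:, 1] ** 2) ** 1.5 + 1e-9
    return num / den                             # |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2)

def welding_deviation(wire_contour, tip_contour):
    """Horizontal deviation (pixels) between wire-centre and molten-tip extrema."""
    wire_contour = np.asarray(wire_contour, dtype=float)
    tip_contour = np.asarray(tip_contour, dtype=float)
    wire_pt = wire_contour[np.argmax(discrete_curvature(wire_contour))]
    tip_pt = tip_contour[np.argmax(discrete_curvature(tip_contour))]
    return wire_pt[0] - tip_pt[0]
```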

  12. INS/EKF-based stride length, height and direction intent detection for walking assistance robots.

    PubMed

    Brescianini, Dario; Jung, Jun-Young; Jang, In-Hun; Park, Hyun Sub; Riener, Robert

    2011-01-01

    We propose an algorithm to obtain information on stride length, height difference, and direction based on the user's intent during walking. For exoskeleton robots used to assist paraplegic patients' walking, this information is used to generate gait patterns on-line by the robots themselves. To obtain this information, we attach an inertial measurement unit (IMU) to the crutches and apply an extended Kalman filter-based error correction method to reduce drift due to the bias of the IMU. The proposed method is verified in real walking scenarios, including walking, climbing stairs, and changing the direction of walking, with normal subjects. © 2011 IEEE

  13. Can an online clinical data management service help in improving data collection and data quality in a developing country setting?

    PubMed

    Wildeman, Maarten A; Zandbergen, Jeroen; Vincent, Andrew; Herdini, Camelia; Middeldorp, Jaap M; Fles, Renske; Dalesio, Otilia; van der Donk, Emile; Tan, I Bing

    2011-08-08

    Data collection by electronic medical record (EMR) systems has been proven to be helpful in data collection for scientific research and in improving healthcare. For a multi-centre trial in Indonesia and the Netherlands, a web-based system was selected to enable all participating centres to easily access data. This study assesses whether the introduction of a clinical trial data management service (CTDMS) composed of electronic case report forms (eCRF) can result in effective data collection and treatment monitoring. Data items entered were automatically checked for inconsistencies when submitted online. The data were divided into primary and secondary data items. We analysed both the total number of errors and the change in error rate, for both primary and secondary items, over the first five months of the trial. In the first five months, 51 patients were entered. The primary data error rate was 1.6%, whilst that for secondary data was 2.7%, against acceptable error rates for analysis of 1% and 2.5%, respectively. The presented analysis shows that, five months after the introduction of the CTDMS, the primary and secondary data error rates reflect acceptable levels of data quality. Furthermore, these error rates were decreasing over time. The digital nature of the CTDMS, as well as the online availability of the data, gives fast and easy insight into adherence to treatment protocols. As such, the CTDMS can serve as a tool to train and educate medical doctors and can improve treatment protocols.

  14. An Automated Method to Generate e-Learning Quizzes from Online Language Learner Writing

    ERIC Educational Resources Information Center

    Flanagan, Brendan; Yin, Chengjiu; Hirokawa, Sachio; Hashimoto, Kiyota; Tabata, Yoshiyuki

    2013-01-01

    In this paper, the entries of Lang-8, which is a Social Networking Site (SNS) site for learning and practicing foreign languages, were analyzed and found to contain similar rates of errors for most error categories reported in previous research. These similarly rated errors were then processed using an algorithm to determine corrections suggested…

  15. Online Adaboost-Based Parameterized Methods for Dynamic Distributed Network Intrusion Detection.

    PubMed

    Hu, Weiming; Gao, Jun; Wang, Yanguo; Wu, Ou; Maybank, Stephen

    2014-01-01

    Current network intrusion detection systems lack adaptability to the frequently changing network environments. Furthermore, intrusion detection in the new distributed architectures is now a major requirement. In this paper, we propose two online Adaboost-based intrusion detection algorithms. In the first algorithm, a traditional online Adaboost process is used where decision stumps are used as weak classifiers. In the second algorithm, an improved online Adaboost process is proposed, and online Gaussian mixture models (GMMs) are used as weak classifiers. We further propose a distributed intrusion detection framework, in which a local parameterized detection model is constructed in each node using the online Adaboost algorithm. A global detection model is constructed in each node by combining the local parametric models using a small number of samples in the node. This combination is achieved using an algorithm based on particle swarm optimization (PSO) and support vector machines. The global model in each node is used to detect intrusions. Experimental results show that the improved online Adaboost process with GMMs obtains a higher detection rate and a lower false alarm rate than the traditional online Adaboost process that uses decision stumps. Both algorithms outperform existing intrusion detection algorithms. It is also shown that our PSO- and SVM-based algorithm effectively combines the local detection models into the global model in each node; the global model in a node can handle the intrusion types that are found in other nodes, without sharing the samples of these intrusion types.
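
    A simplified sketch of on-line boosting in this spirit is given below, following Oza-Russell style weight updates; GaussianNB weak learners (trained incrementally via partial_fit) stand in for the decision stumps and on-line GMMs used by the authors, so this is an illustration rather than their algorithm.

```python
# Simplified on-line boosting sketch (Oza-Russell style updates). GaussianNB
# weak learners are a stand-in for the paper's decision stumps / on-line GMMs.
import numpy as np
from sklearn.naive_bayes import GaussianNB

class OnlineBoost:
    def __init__(self, n_learners=10, classes=(0, 1), seed=0):
        self.learners = [GaussianNB() for _ in range(n_learners)]
        self.lam_sc = np.zeros(n_learners)   # accumulated weight of correct samples
        self.lam_sw = np.zeros(n_learners)   # accumulated weight of misclassified samples
        self.classes = np.asarray(classes)
        self.rng = np.random.default_rng(seed)

    def partial_fit(self, x, y):
        x = np.asarray(x, dtype=float).reshape(1, -1)
        lam = 1.0
        for m, h in enumerate(self.learners):
            k = self.rng.poisson(lam)
            for _ in range(max(k, 1)):       # train at least once so predict() is valid
                h.partial_fit(x, [y], classes=self.classes)
            if h.predict(x)[0] == y:
                self.lam_sc[m] += lam
                eps = self.lam_sw[m] / (self.lam_sc[m] + self.lam_sw[m])
                lam *= 1.0 / (2.0 * (1.0 - eps))
            else:
                self.lam_sw[m] += lam
                eps = self.lam_sw[m] / (self.lam_sc[m] + self.lam_sw[m])
                lam *= 1.0 / (2.0 * eps)

    def predict(self, x):
        x = np.asarray(x, dtype=float).reshape(1, -1)
        votes = {}
        for m, h in enumerate(self.learners):
            total = self.lam_sc[m] + self.lam_sw[m]
            eps = min(max(self.lam_sw[m] / max(total, 1e-9), 1e-6), 1.0 - 1e-6)
            weight = np.log((1.0 - eps) / eps)   # weak-learner voting weight
            pred = h.predict(x)[0]
            votes[pred] = votes.get(pred, 0.0) + weight
        return max(votes, key=votes.get)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    model = OnlineBoost(n_learners=5)
    for _ in range(500):                      # stream of labelled connection records
        y = int(rng.integers(0, 2))
        x = rng.normal(loc=3.0 * y, size=4)   # toy features separable by class
        model.partial_fit(x, y)
    print(model.predict(rng.normal(loc=3.0, size=4)))   # expect class 1
```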

  16. Paradigm Shifts in Voluntary Force Control and Motor Unit Behaviors with the Manipulated Size of Visual Error Perception

    PubMed Central

    Chen, Yi-Ching; Lin, Yen-Ting; Chang, Gwo-Ching; Hwang, Ing-Shiou

    2017-01-01

    The detection of error information is an essential prerequisite of a feedback-based movement. This study investigated the differential behavior and neurophysiological mechanisms of a cyclic force-tracking task using error-reducing and error-enhancing feedback. The discharge patterns of a relatively large number of motor units (MUs) were assessed with custom-designed multi-channel surface electromyography following mathematical decomposition of the experimentally-measured signals. Force characteristics, force-discharge relation, and phase-locking cortical activities in the contralateral motor cortex to individual MUs were contrasted among the low (LSF), normal (NSF), and high scaling factor (HSF) conditions, in which the sizes of online execution errors were displayed with various amplification ratios. Along with a spectral shift of the force output toward a lower band, force output with a more phase-lead became less irregular, and tracking accuracy was worse in the LSF condition than in the HSF condition. The coherent discharge of high phasic (HP) MUs with the target signal was greater, and inter-spike intervals were larger, in the LSF condition than in the HSF condition. Force-tracking in the LSF condition manifested with stronger phase-locked EEG activity in the contralateral motor cortex to discharge of the (HP) MUs (LSF > NSF, HSF). The coherent discharge of the (HP) MUs during the cyclic force-tracking predominated the force-discharge relation, which increased inversely to the error scaling factor. In conclusion, the size of visualized error gates motor unit discharge, force-discharge relation, and the relative influences of the feedback and feedforward processes on force control. A smaller visualized error size favors voluntary force control using a feedforward process, in relation to a selective central modulation that enhances the coherent discharge of (HP) MUs. PMID:28348530

  17. Paradigm Shifts in Voluntary Force Control and Motor Unit Behaviors with the Manipulated Size of Visual Error Perception.

    PubMed

    Chen, Yi-Ching; Lin, Yen-Ting; Chang, Gwo-Ching; Hwang, Ing-Shiou

    2017-01-01

    The detection of error information is an essential prerequisite of a feedback-based movement. This study investigated the differential behavior and neurophysiological mechanisms of a cyclic force-tracking task using error-reducing and error-enhancing feedback. The discharge patterns of a relatively large number of motor units (MUs) were assessed with custom-designed multi-channel surface electromyography following mathematical decomposition of the experimentally-measured signals. Force characteristics, force-discharge relation, and phase-locking cortical activities in the contralateral motor cortex to individual MUs were contrasted among the low (LSF), normal (NSF), and high scaling factor (HSF) conditions, in which the sizes of online execution errors were displayed with various amplification ratios. Along with a spectral shift of the force output toward a lower band, force output with a more phase-lead became less irregular, and tracking accuracy was worse in the LSF condition than in the HSF condition. The coherent discharge of high phasic (HP) MUs with the target signal was greater, and inter-spike intervals were larger, in the LSF condition than in the HSF condition. Force-tracking in the LSF condition manifested with stronger phase-locked EEG activity in the contralateral motor cortex to discharge of the (HP) MUs (LSF > NSF, HSF). The coherent discharge of the (HP) MUs during the cyclic force-tracking predominated the force-discharge relation, which increased inversely to the error scaling factor. In conclusion, the size of visualized error gates motor unit discharge, force-discharge relation, and the relative influences of the feedback and feedforward processes on force control. A smaller visualized error size favors voluntary force control using a feedforward process, in relation to a selective central modulation that enhances the coherent discharge of (HP) MUs.

  18. Correction to: CASPer, an online pre-interview screen for personal/professional characteristics: prediction of national licensure scores.

    PubMed

    Dore, Kelly L; Reiter, Harold I; Kreuger, Sharyn; Norman, Geoffrey R

    2017-12-01

    In re-examining the paper "CASPer, an online pre-interview screen for personal/professional characteristics: prediction of national licensure scores" published in AHSE (22(2), 327-336), we recognized two errors of interpretation.

  19. BREAST: a novel method to improve the diagnostic efficacy of mammography

    NASA Astrophysics Data System (ADS)

    Brennan, P. C.; Tapia, K.; Ryan, J.; Lee, W.

    2013-03-01

    High quality breast imaging and accurate image assessment are critical to the early diagnosis, treatment and management of women with breast cancer. Breast Screen Reader Assessment Strategy (BREAST) provides a platform, accessible by researchers and clinicians world-wide, which will contain image databases, algorithms to assess reader performance and on-line systems for image evaluation. The platform will contribute to the diagnostic efficacy of breast imaging in Australia and beyond on two fronts: reducing errors in mammography, and transforming our assessment of novel technologies and techniques. Mammography is the primary diagnostic tool for detecting breast cancer, with over 800,000 women X-rayed each year in Australia; however, it fails to detect 30% of breast cancers, and a number of missed cancers are visible on the image [1-6]. BREAST will monitor these mistakes, identify reasons for mammographic errors, and facilitate innovative solutions to reduce error rates. The BREAST platform has the potential to enable expert assessment of breast imaging innovations anywhere in the world where experts or innovations are located. Currently, innovations are often assessed by limited numbers of individuals who happen to be geographically located close to the innovation, resulting in equivocal studies with low statistical power. BREAST will transform this current paradigm by enabling large numbers of experts to assess any new method or technology using our embedded evaluation methods. We are confident that this world-first system will play an important part in the future efficacy of breast imaging.

  20. Simultaneous quantification of the boar-taint compounds skatole and androstenone by surface-enhanced Raman scattering (SERS) and multivariate data analysis.

    PubMed

    Sørensen, Klavs M; Westley, Chloe; Goodacre, Royston; Engelsen, Søren Balling

    2015-10-01

    This study investigates the feasibility of using surface-enhanced Raman scattering (SERS) for the quantification of absolute levels of the boar-taint compounds skatole and androstenone in porcine fat. By investigating different types of nanoparticles, pH and aggregating agents, an optimized environment that promotes SERS of the analytes was developed and tested with different multivariate spectral pre-processing techniques, combined with variable selection on a series of analytical standards. The resulting method exhibited prediction errors (root mean square error of cross validation, RMSECV) of 2.4 × 10⁻⁶ M for skatole and 1.2 × 10⁻⁷ M for androstenone, with limits of detection corresponding to approximately 2.1 × 10⁻¹¹ M for skatole and approximately 1.8 × 10⁻¹⁰ M for androstenone. The method was subsequently tested on porcine fat extract, leading to prediction errors (RMSECV) of 0.17 μg/g for skatole and 1.5 μg/g for androstenone. This optimized SERS method, when combined with multivariate analysis, shows great potential for development into an on-line application, which would be the first of its kind, and opens up possibilities for simultaneous detection of other meat-quality metabolites or pathogen markers. Graphical abstract: Artistic rendering of a laser-illuminated gold colloid sphere with skatole and androstenone adsorbed on the surface.
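
    As a hedged illustration of the chemometric step, the sketch below computes the cross-validated prediction error (RMSECV) for a PLS calibration on spectra; the data are synthetic placeholders for SERS spectra of skatole standards, and PLS is used here as a generic stand-in for the multivariate model actually employed.

```python
# Hedged sketch: RMSECV for a PLS calibration on (preprocessed) spectra.
# The synthetic data stand in for SERS spectra of analytical standards.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

def rmsecv(spectra, concentrations, n_components=5):
    """Root mean square error of cross validation for a PLS calibration."""
    pls = PLSRegression(n_components=n_components)
    y_cv = cross_val_predict(pls, spectra, concentrations, cv=LeaveOneOut())
    return float(np.sqrt(np.mean((np.ravel(y_cv) - np.ravel(concentrations)) ** 2)))

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 200))                 # 30 spectra, 200 wavenumber channels
c = rng.uniform(0, 5e-6, size=30)              # known concentrations (mol/L)
X += np.outer(c * 1e6, rng.normal(size=200))   # add a concentration-dependent signal
print(f"RMSECV = {rmsecv(X, c):.2e} M")
```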

  1. Automatic lung segmentation using control feedback system: morphology and texture paradigm.

    PubMed

    Noor, Norliza M; Than, Joel C M; Rijal, Omar M; Kassim, Rosminah M; Yunus, Ashari; Zeki, Amir A; Anzidei, Michele; Saba, Luca; Suri, Jasjit S

    2015-03-01

    Interstitial Lung Disease (ILD) encompasses a wide array of diseases that share some common radiologic characteristics. When diagnosing such diseases, radiologists can be affected by heavy workload and fatigue, decreasing diagnostic accuracy. Automatic segmentation is the first step in implementing Computer Aided Diagnosis (CAD), which will help radiologists improve diagnostic accuracy and thereby reduce manual interpretation. The proposed automatic segmentation uses an initial thresholding- and morphology-based segmentation coupled with feedback that detects large deviations and applies a corrective segmentation. This feedback is analogous to a control system, which allows detection of abnormal or severe lung disease and provides feedback to an online segmentation, improving the overall performance of the system. This feedback system incorporates a texture paradigm. In this study we examined 48 male and 48 female patients, consisting of 15 normal and 81 abnormal cases. A senior radiologist chose the five levels needed for ILD diagnosis. The results of segmentation were displayed by comparing the automated and ground truth boundaries (courtesy of ImgTracer™ 1.0, AtheroPoint™ LLC, Roseville, CA, USA). The left lung's segmentation performance was 96.52% for Jaccard Index, 98.21% for Dice Similarity, 0.61 mm for Polyline Distance Metric (PDM), -1.15% for Relative Area Error and 4.09% for Area Overlap Error. The right lung's segmentation performance was 97.24% for Jaccard Index, 98.58% for Dice Similarity, 0.61 mm for PDM, -0.03% for Relative Area Error and 3.53% for Area Overlap Error. Overall, the segmentation has a similarity of 98.4%. The proposed segmentation is an accurate and fully automated system.
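
    For reference, the overlap metrics quoted above can be computed for binary masks as in the generic sketch below (automated segmentation versus ground truth); the area overlap error definition used here is one common convention and not necessarily the exact formula of the study software.

```python
# Minimal sketch of common segmentation overlap metrics for binary masks.
# Generic implementation, not the ImgTracer software used in the study.
import numpy as np

def segmentation_metrics(auto_mask, gt_mask):
    a = np.asarray(auto_mask, dtype=bool)
    g = np.asarray(gt_mask, dtype=bool)
    inter = np.logical_and(a, g).sum()
    union = np.logical_or(a, g).sum()
    return {
        "jaccard": inter / union,
        "dice": 2.0 * inter / (a.sum() + g.sum()),
        "relative_area_error_pct": 100.0 * (a.sum() - g.sum()) / g.sum(),
        "area_overlap_error_pct": 100.0 * (1.0 - inter / union),  # one common definition
    }

if __name__ == "__main__":
    auto = np.zeros((10, 10), dtype=bool); auto[2:8, 2:8] = True
    gt = np.zeros((10, 10), dtype=bool); gt[3:9, 3:9] = True
    print(segmentation_metrics(auto, gt))
```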

  2. Text Classification for Assisting Moderators in Online Health Communities

    PubMed Central

    Huh, Jina; Yetisgen-Yildiz, Meliha; Pratt, Wanda

    2013-01-01

    Objectives Patients increasingly visit online health communities to get help on managing health. The large scale of these online communities makes it impossible for the moderators to engage in all conversations; yet, some conversations need their expertise. Our work explores low-cost text classification methods for this new domain of determining whether a thread in an online health forum needs moderators’ help. Methods We employed a binary classifier on WebMD’s online diabetes community data. To train the classifier, we considered three feature types: (1) word unigrams, (2) sentiment analysis features, and (3) thread length. We applied feature selection methods based on χ² statistics and undersampling to account for unbalanced data. We then performed a qualitative error analysis to investigate the appropriateness of the gold standard. Results Using sentiment analysis features, feature selection methods, and balanced training data increased the AUC value up to 0.75 and the F1-score up to 0.54, compared to the baseline of using word unigrams with no feature selection methods on unbalanced data (0.65 AUC and 0.40 F1-score). The error analysis uncovered additional reasons why moderators respond to patients’ posts. Discussion We showed how feature selection methods and balanced training data can improve overall classification performance. We present implications of weighing precision versus recall for assisting moderators of online health communities. Our error analysis uncovered social, legal, and ethical issues around addressing community members’ needs. We also note challenges in producing a gold standard, and discuss potential solutions for addressing these challenges. Conclusion Social media environments provide popular venues in which patients gain health-related information. Our work contributes to understanding scalable solutions for providing moderators’ expertise in these large-scale, social media environments. PMID:24025513
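
    The sketch below illustrates a comparable pipeline under stated assumptions: unigram counts, chi-square feature selection, and random undersampling of the majority class feeding a simple binary classifier. The classifier, toy posts, and parameter values are placeholders, not the authors' exact setup.

```python
# Illustrative sketch (not the authors' pipeline): unigram features, chi^2
# feature selection, and random undersampling for a binary "needs moderator
# attention" classifier. Posts and thresholds are placeholders.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def undersample(texts, labels, seed=0):
    """Randomly drop majority-class examples until classes are balanced."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    idx_pos = np.flatnonzero(labels == 1)
    idx_neg = np.flatnonzero(labels == 0)
    n = min(idx_pos.size, idx_neg.size)
    keep = np.concatenate([rng.choice(idx_pos, n, replace=False),
                           rng.choice(idx_neg, n, replace=False)])
    rng.shuffle(keep)
    return [texts[i] for i in keep], labels[keep]

texts = ["I feel alone and scared", "what is a normal glucose level",
         "thanks everyone for the support", "should I stop my insulin?"]
labels = [1, 0, 0, 1]                      # 1 = thread needs a moderator
X_bal, y_bal = undersample(texts, labels)
clf = make_pipeline(CountVectorizer(ngram_range=(1, 1)),
                    SelectKBest(chi2, k=5),
                    LogisticRegression(max_iter=1000))
clf.fit(X_bal, y_bal)
print(clf.predict(["I do not know who to talk to"]))
```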

  3. Online referrals one way capitated groups gain efficiencies, reduce errors.

    PubMed

    2002-08-01

    An online referral system is just the latest money- and time-saving tool in the e-commerce arsenal at Hill Physicians Medical Group. Using a modified version of Healinx Corp.'s secure e-mail messaging platform, Hill is testing a custom-made online referral system at two primary care practices, and it appears to be helping the group boost its bottom line under capitation.

  4. Analysis of separation test for automatic brake adjuster based on linear radon transformation

    NASA Astrophysics Data System (ADS)

    Luo, Zai; Jiang, Wensong; Guo, Bin; Fan, Weijun; Lu, Yi

    2015-01-01

    The linear Radon transformation is applied to extract inflection points for an online test system under noisy conditions. The linear Radon transformation has strong noise and interference immunity because it fits the online test curve in several parts, which makes it easy to handle consecutive inflection points. We applied the linear Radon transformation to the separation test system to determine the separating clearance of an automatic brake adjuster. The experimental results show that the feature point extraction error of the gradient maximum optimal method is approximately ±0.100, while the feature point extraction error of the linear Radon transformation method can reach ±0.010, a lower error than the former. In addition, the linear Radon transformation is robust.

  5. Research integrity: the experience of a doubting Thomas.

    PubMed

    Hettinger, Thomas P

    2014-04-01

    The sensational "reactome array" paper published in Science in 2009 was investigated in Spain by the Ethics Committee of Consejo Superior de Investigaciones Cientificas (CSIC) after Science issued an editorial expression of concern. The paper was retracted in 2010 because of "skepticism" due to "errors" in chemistry. The "errors" were so profound that many readers expressed doubt that they were really errors, but part of an elaborate hoax. I conducted a forensic analysis of mass spectrometry data in the paper's Supporting Online Material (SOM) and was able to prove that thousands of data values were in fact fabricated. The SOM contains signatures of improper extensive spreadsheet manipulations of incorrect atomic and molecular mass values as well as impossibly repetitive deviations of found molecular mass values from their expected values. No evidence of real mass spectrometry data was detected. Both CSIC and Science have been content to retract the paper without acknowledging the fabrications or assigning responsibility for them. Neither CSIC nor Science has expressed interest in having an independent investigation determining how the paper came to be written, reviewed and published. Their weak response to this episode is a daunting signal that there is an impending crisis in research integrity and science journalism.

  6. [Patient safety in primary care: PREFASEG project].

    PubMed

    Catalán, Arantxa; Borrell, Francesc; Pons, Angels; Amado, Ester; Baena, José Miguel; Morales, Vicente

    2014-07-01

    The Institut Català de la Salut (ICS) has designed and integrated into the electronic clinical station of primary care a new software tool to support drug prescription, which can detect certain medication errors on-line. The software, called PREFASEG (which stands for secure drug prescription), aims to prevent adverse events related to medication use in the field of primary health care (PHC). This study was carried out on the computerized medical record called CPT, which is used by all 3,750 PHC physicians in our institution who prescribe through it. PREFASEG was integrated into eCAP in July 2010, and six months later we performed a cross-sectional study to evaluate its usefulness and refine its design. The software generates on-line alerts in five dimensions: drug interactions, redundant treatments, allergies, drug-disease contraindications, and drugs inadvisable in patients over 75 years. PREFASEG generated 1,162,765 alerts (one per 10 treatments prescribed), with therapeutic duplication (62%) the most frequently alerted. The overall acceptance rate is 35%; pharmacological redundancies (43%) and allergies (26%) are the most accepted. A total of 10,808 professionals (doctors and nurses) have accepted some of the recommendations of the program. PREFASEG is a feasible and highly efficient strategy to achieve an objective of the Quality Plan for the NHS. Copyright © 2014. Published by Elsevier Espana.

  7. Image guidance in prostate cancer - can offline corrections be an effective substitute for daily online imaging?

    PubMed

    Prasad, Devleena; Das, Pinaki; Saha, Niladri S; Chatterjee, Sanjoy; Achari, Rimpa; Mallick, Indranil

    2014-01-01

    The aim of this study was to determine whether a less resource-intensive and established offline correction protocol, the No Action Level (NAL) protocol, was as effective as daily online correction of setup deviations in curative high-dose radiotherapy of prostate cancer. A total of 683 daily megavoltage CT (MVCT) or kilovoltage cone-beam CT (kVCBCT) images of 30 patients with localized prostate cancer treated with intensity modulated radiotherapy were evaluated. Daily image guidance was performed and setup errors in three translational axes were recorded. The NAL protocol was simulated by using the mean shift calculated from the first five fractions and implementing it on all subsequent treatments. Using the imaging data from the remaining fractions, the daily residual error (RE) was determined. The proportion of fractions where the RE was greater than 3, 5 and 7 mm was calculated, as well as the actual PTV margin that would be required if the offline protocol were followed. Using the NAL protocol reduced the systematic but not the random errors. Corrections made using the NAL protocol resulted in small and acceptable REs in the mediolateral (ML) and superoinferior (SI) directions, with 46/533 (8.1%) and 48/533 (5%) residual shifts above 5 mm. However, residual errors greater than 5 mm in the anteroposterior (AP) direction remained in 181/533 (34%) of fractions. The PTV margins calculated based on residual errors were 5 mm, 5 mm and 13 mm in the ML, SI and AP directions, respectively. Offline correction using the NAL protocol resulted in unacceptably high residual errors in the AP direction, due to random uncertainties of rectal and bladder filling. Daily online imaging and correction remain the standard image guidance policy for highly conformal radiotherapy of prostate cancer.
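
    A minimal sketch of the NAL simulation logic is shown below: the systematic correction is the mean shift over the first five fractions, and residual errors for the remaining fractions are tabulated against action thresholds. Array shapes, thresholds, and the toy data are illustrative, not the study's measurements.

```python
# Sketch of a No Action Level (NAL) simulation: the systematic correction is
# the mean setup shift over the first calibration fractions; residual errors
# for the remaining fractions are compared with action thresholds.
import numpy as np

def nal_residuals(daily_shifts_mm, n_calibration=5, thresholds=(3.0, 5.0, 7.0)):
    """daily_shifts_mm: (n_fractions, 3) array of ML/SI/AP setup errors in mm."""
    shifts = np.asarray(daily_shifts_mm, dtype=float)
    correction = shifts[:n_calibration].mean(axis=0)      # applied from fraction 6 on
    residual = shifts[n_calibration:] - correction
    report = {}
    for t in thresholds:
        report[f">{t:g} mm"] = (np.abs(residual) > t).mean(axis=0)  # per-axis fraction
    return residual, report

rng = np.random.default_rng(2)
# toy data: larger day-to-day (random) variation in the AP axis, as in the study
shifts = rng.normal(loc=[1.0, -0.5, 2.0], scale=[2.0, 2.0, 5.0], size=(25, 3))
residual, report = nal_residuals(shifts)
print(report)
```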

  8. Towards a robust BCI: error potentials and online learning.

    PubMed

    Buttfield, Anna; Ferrez, Pierre W; Millán, José del R

    2006-06-01

    Recent advances in the field of brain-computer interfaces (BCIs) have shown that BCIs have the potential to provide a powerful new channel of communication, completely independent of muscular and nervous systems. However, while there have been successful laboratory demonstrations, there are still issues that need to be addressed before BCIs can be used by nonexperts outside the laboratory. At IDIAP Research Institute, we have been investigating several areas that we believe will allow us to improve the robustness, flexibility, and reliability of BCIs. One area is recognition of cognitive error states, that is, identifying errors through the brain's reaction to mistakes. The production of these error potentials (ErrP) in reaction to an error made by the user is well established. We have extended this work by identifying a similar but distinct ErrP that is generated in response to an error made by the interface (a misinterpretation of a command that the user has given). This ErrP can be satisfactorily identified in single trials and can be demonstrated to improve the theoretical performance of a BCI. A second area of research is online adaptation of the classifier. BCI signals change over time, both between sessions and within a single session, due to a number of factors. This means that a classifier trained on data from a previous session will probably not be optimal for a new session. In this paper, we present preliminary results from our investigations into supervised online learning that can be applied in the initial training phase. We also discuss the future direction of this research, including the combination of these two currently separate issues to create a potentially very powerful BCI.

  9. Automatic detection system of shaft part surface defect based on machine vision

    NASA Astrophysics Data System (ADS)

    Jiang, Lixing; Sun, Kuoyuan; Zhao, Fulai; Hao, Xiangyang

    2015-05-01

    Surface physical damage detection is an important part of shaft part quality inspection, and the traditional detection methods mostly rely on human visual identification, which has many disadvantages such as low efficiency and poor reliability. In order to improve the automation level of the quality inspection of shaft parts and establish the relevant industry quality standard, a machine vision inspection system connected to an MCU was designed to realize surface inspection of shaft parts. The system adopts a monochrome line-scan digital camera and uses dark-field, forward illumination to acquire images with high contrast. The images are segmented into binary images using the maximum between-cluster variance method after image filtering and image enhancement; the main contours are then extracted based on evaluation criteria of aspect ratio and area; then the coordinates of the centre of gravity of each defect area, namely the locating point coordinates, are calculated. Finally, the locations of the defect areas are marked by a coding pen communicating with the MCU. Experiments show that no defect was missed and the false alarm rate was lower than 5%, which demonstrates that the designed system meets the demands of on-line real-time inspection of shaft parts.
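
    A hedged sketch of such an image-processing chain using OpenCV is given below: median filtering, Otsu (maximum between-cluster variance) thresholding, contour extraction, screening by area and aspect ratio, and centroid computation. Parameter values and the synthetic test image are illustrative, not those of the original system.

```python
# Hedged sketch of the processing chain: filtering, Otsu thresholding, contour
# extraction, screening by area and aspect ratio, and centroid computation.
import cv2
import numpy as np

def detect_defects(gray, min_area=50, max_aspect=10.0):
    """Return centroid coordinates of candidate defect regions in a grayscale image."""
    blurred = cv2.medianBlur(gray, 5)                       # image filtering
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        area = cv2.contourArea(c)
        x, y, w, h = cv2.boundingRect(c)
        aspect = max(w, h) / max(min(w, h), 1)
        if area < min_area or aspect > max_aspect:          # screening criteria
            continue
        m = cv2.moments(c)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

if __name__ == "__main__":
    img = np.zeros((200, 400), dtype=np.uint8)
    cv2.circle(img, (150, 100), 12, 255, -1)                # synthetic defect
    print(detect_defects(img))
```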

  10. Distributed adaptive neural network control for a class of heterogeneous nonlinear multi-agent systems subject to actuation failures

    NASA Astrophysics Data System (ADS)

    Cui, Bing; Zhao, Chunhui; Ma, Tiedong; Feng, Chi

    2017-02-01

    In this paper, the cooperative adaptive consensus tracking problem for heterogeneous nonlinear multi-agent systems on a directed graph is addressed. Each follower is modelled as a general nonlinear system with unknown and nonidentical nonlinear dynamics, disturbances and actuator failures. Cooperative fault-tolerant neural network tracking controllers with online adaptive learning features are proposed to guarantee that all agents synchronise to the trajectory of one leader with bounded, adjustable synchronisation errors. With the help of a linear quadratic regulator-based optimal design, a graph-dependent Lyapunov proof provides error bounds that depend on the graph topology, one virtual matrix and some design parameters. Of particular interest is that, if the control gain is selected appropriately, the proposed control scheme can be implemented in a unified framework whether or not faults are present. Furthermore, fault detection and isolation do not need to be implemented. Finally, a simulation is given to verify the effectiveness of the proposed method.

  11. Is comprehension necessary for error detection? A conflict-based account of monitoring in speech production

    PubMed Central

    Nozari, Nazbanou; Dell, Gary S.; Schwartz, Myrna F.

    2011-01-01

    Despite the existence of speech errors, verbal communication is successful because speakers can detect (and correct) their errors. The standard theory of speech-error detection, the perceptual-loop account, posits that the comprehension system monitors production output for errors. Such a comprehension-based monitor, however, cannot explain the double dissociation between comprehension and error-detection ability observed in aphasic patients. We propose a new theory of speech-error detection which is instead based on the production process itself. The theory borrows from studies of forced-choice-response tasks the notion that error detection is accomplished by monitoring response conflict via a frontal brain structure, such as the anterior cingulate cortex. We adapt this idea to the two-step model of word production, and test the model-derived predictions on a sample of aphasic patients. Our results show a strong correlation between patients’ error-detection ability and the model’s characterization of their production skills, and no significant correlation between error detection and comprehension measures, thus supporting a production-based monitor generally, and the implemented conflict-based monitor in particular. The successful application of the conflict-based theory to error detection in linguistic as well as non-linguistic domains points to a domain-general monitoring system. PMID:21652015

  12. Online absolute pose compensation and steering control of industrial robot based on six degrees of freedom laser measurement

    NASA Astrophysics Data System (ADS)

    Yang, Juqing; Wang, Dayong; Fan, Baixing; Dong, Dengfeng; Zhou, Weihu

    2017-03-01

    In-situ intelligent manufacturing for large-volume equipment requires industrial robots with high-accuracy absolute positioning and orientation steering control. Conventional robots mainly employ offline calibration technology to identify and compensate for key robotic parameters. However, the dynamic and static parameters of a robot change nonlinearly. It is not possible to acquire a robot's actual parameters and control the absolute pose of the robot with high accuracy within a large workspace by offline calibration in real time. This study proposes a real-time online absolute pose steering control method for an industrial robot based on six degrees of freedom laser tracking measurement, which adopts comprehensive compensation and correction of differential movement variables. First, the pose steering control system and robot kinematics error model are constructed, and then the pose error compensation mechanism and algorithm are introduced in detail. By accurately acquiring the position and orientation of the robot end-tool, mapping the computed Jacobian matrix of the joint variables and correcting the joint variables, real-time online absolute pose compensation for an industrial robot is accurately implemented in simulations and experimental tests. The average positioning error is 0.048 mm and the orientation accuracy is better than 0.01 deg. The results demonstrate that the proposed method is feasible and that the online absolute accuracy of a robot is sufficiently enhanced.

  13. Error Type and Lexical Frequency Effects: Error Detection in Swedish Children with Language Impairment

    ERIC Educational Resources Information Center

    Hallin, Anna Eva; Reuterskiöld, Christina

    2017-01-01

    Purpose: The first aim of this study was to investigate if Swedish-speaking school-age children with language impairment (LI) show specific morphosyntactic vulnerabilities in error detection. The second aim was to investigate the effects of lexical frequency on error detection, an overlooked aspect of previous error detection studies. Method:…

  14. Evaluation of causes and frequency of medication errors during information technology downtime.

    PubMed

    Hanuscak, Tara L; Szeinbach, Sheryl L; Seoane-Vazquez, Enrique; Reichert, Brendan J; McCluskey, Charles F

    2009-06-15

    The causes and frequency of medication errors occurring during information technology downtime were evaluated. Individuals from a convenience sample of 78 hospitals who were directly responsible for supporting and maintaining clinical information systems (CISs) and automated dispensing systems (ADSs) were surveyed using an online tool between February 2007 and May 2007 to determine whether medication errors were reported during periods of system downtime. The errors were classified using the National Coordinating Council for Medication Error Reporting and Prevention severity scoring index. The percentage of respondents reporting downtime was estimated. Of the 78 eligible hospitals, 32 respondents with CIS and ADS responsibilities completed the online survey, for a response rate of 41%. For computerized prescriber order entry, patch installations and system upgrades caused an average downtime response of 57% over a 12-month period. Lost interfaces and interface malfunctions were reported for centralized and decentralized ADSs, with average downtime responses of 34% and 29%, respectively. The average downtime response was 31% for software malfunctions linked to clinical decision-support systems. Although patient harm did not result from 30 (54%) medication errors, the potential for harm was present for 9 (16%) of these errors. Medication errors occurred during CIS and ADS downtime despite the availability of backup systems and standard protocols for handling periods of system downtime. Efforts should be directed at reducing the frequency and length of downtime in order to minimize medication errors during such downtime.

  15. Analysis of the impact of error detection on computer performance

    NASA Technical Reports Server (NTRS)

    Shin, K. C.; Lee, Y. H.

    1983-01-01

    Conventionally, reliability analyses either assume that a fault/error is detected immediately following its occurrence, or neglect damage caused by latent errors. Though unrealistic, this assumption was imposed in order to avoid the difficulty of determining the respective probabilities that a fault induces an error and that the error is then detected a random amount of time after its occurrence. As a remedy for this problem, a model is proposed to analyze the impact of error detection on computer performance under moderate assumptions. Error latency, the time interval between the occurrence of an error and the moment of its detection, is used to measure the effectiveness of a detection mechanism. This model is used to: (1) predict the probability of producing an unreliable result, and (2) estimate the loss of computation due to fault and/or error.

  16. Automatic-repeat-request error control schemes

    NASA Technical Reports Server (NTRS)

    Lin, S.; Costello, D. J., Jr.; Miller, M. J.

    1983-01-01

    Error detection incorporated with automatic-repeat-request (ARQ) is widely used for error control in data communication systems. This method of error control is simple and provides high system reliability. If a properly chosen code is used for error detection, virtually error-free data transmission can be attained. Various types of ARQ and hybrid ARQ schemes, and error detection using linear block codes are surveyed.
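
    As a minimal illustration of error detection combined with ARQ, the sketch below attaches a CRC-32 checksum to each frame and retransmits (stop-and-wait style) whenever the receiver-side check fails; real links additionally use timers, sequence numbers, and explicit ACK/NAK frames, so this is a toy model only.

```python
# Minimal sketch of error detection + ARQ: each frame carries a CRC-32; a frame
# is accepted only if the checksum verifies, otherwise it is retransmitted
# (stop-and-wait discipline). Toy model: no timers or sequence numbers.
import zlib

def make_frame(payload: bytes) -> bytes:
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def check_frame(frame: bytes):
    payload, received_crc = frame[:-4], int.from_bytes(frame[-4:], "big")
    return payload if zlib.crc32(payload) == received_crc else None

def send_with_arq(payload: bytes, channel, max_retries=5):
    """channel(frame) models the (possibly corrupting) link and returns a frame."""
    for _ in range(max_retries):
        delivered = channel(make_frame(payload))
        if check_frame(delivered) is not None:
            return True                                    # receiver would ACK
    return False                                           # give up after retries

if __name__ == "__main__":
    import random
    def noisy_channel(frame, p_err=0.3):
        data = bytearray(frame)
        if random.random() < p_err:
            data[random.randrange(len(data))] ^= 0xFF      # corrupt one byte
        return bytes(data)
    print(send_with_arq(b"telemetry block", noisy_channel))
```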

  17. OOPS! Retractions, Corrections, and Amplifications in Online Environments.

    ERIC Educational Resources Information Center

    Ojala, Marydee

    1996-01-01

    Examines the practice and implications of issuing corrections, retractions, and amplifications in online databases. Not all database producers provide mechanisms to accommodate retractions and corrections, and it can be difficult for a searcher to find evidence of error correction. Sidebars illustrate both the lack of and evidence of…

  18. The Online Translator: Implementing National Standard 4.1.

    ERIC Educational Resources Information Center

    Burton, Christine

    2003-01-01

    A pedagogical idea for addressing National Standard 4.1 (Students demonstrate understanding of the nature of language through comparisons of language studied and their own) suggests the deliberate use of the online translator to illustrate to students the syntactical errors that occur when translating idioms from one language to another. (VWL)

  19. Correction to: A Comparison of the Energetic Cost of Running in Marathon Racing Shoes.

    PubMed

    Hoogkamer, Wouter; Kipp, Shalaya; Frank, Jesse H; Farina, Emily M; Luo, Geng; Kram, Rodger

    2018-06-01

    An Online First version of this article was made available online at https://link.springer.com/article/10.1007/s40279-017-0811-2 on 16 November 2017. An error was subsequently identified in the article, and the following correction should be noted.

  20. Correction to: Tanner-Whitehouse Skeletal Ages in Male Youth Soccer Players: TW2 or TW3?

    PubMed

    Malina, Robert M; Coelho-E-Silva, Manuel J; Figueiredo, António J; Philippaerts, Renaat M; Hirose, Norikazu; Reyes, Maria Eugenia Peña; Gilli, Giulio; Benso, Andrea; Vaeyens, Roel; Deprez, Dieter; Guglielmo, Luiz G A; Buranarugsa, Rojapon

    2018-04-01

    An Online First version of this article was made available online at https://link.springer.com/article/10.1007%2Fs40279-017-0799-7 on 29 October 2017. Errors were subsequently identified in the article, and the following corrections should be noted.

  1. What errors do peer reviewers detect, and does training improve their ability to detect them?

    PubMed

    Schroter, Sara; Black, Nick; Evans, Stephen; Godlee, Fiona; Osorio, Lyda; Smith, Richard

    2008-10-01

    To analyse data from a trial and report the frequencies with which major and minor errors are detected at a general medical journal, the types of errors missed and the impact of training on error detection. 607 peer reviewers at the BMJ were randomized to two intervention groups receiving different types of training (face-to-face training or a self-taught package) and a control group. Each reviewer was sent the same three test papers over the study period, each of which had nine major and five minor methodological errors inserted. BMJ peer reviewers. The quality of review, assessed using a validated instrument, and the number and type of errors detected before and after training. The number of major errors detected varied over the three papers. The interventions had small effects. At baseline (Paper 1) reviewers found an average of 2.58 of the nine major errors, with no notable difference between the groups. The mean number of errors reported was similar for the second and third papers, 2.71 and 3.0, respectively. Biased randomization was the error detected most frequently in all three papers, with over 60% of the reviewers who rejected the papers identifying this error. Reviewers who did not reject the papers found fewer errors, and the proportion finding biased randomization was less than 40% for each paper. Editors should not assume that reviewers will detect most major errors, particularly those concerned with the context of the study. Short training packages have only a slight impact on improving error detection.

  2. Ultrasensitive microchip based on smart microgel for real-time online detection of trace threat analytes.

    PubMed

    Lin, Shuo; Wang, Wei; Ju, Xiao-Jie; Xie, Rui; Liu, Zhuang; Yu, Hai-Rong; Zhang, Chuan; Chu, Liang-Yin

    2016-02-23

    Real-time online detection of trace threat analytes is critical for global sustainability, whereas the key challenge is how to efficiently convert and amplify analyte signals into simple readouts. Here we report an ultrasensitive microfluidic platform incorporating a smart microgel for real-time online detection of trace threat analytes. The microgel can swell in response to a specific stimulus in a flowing solution, resulting in efficient conversion of the stimulus signal into a significantly amplified flow-rate change; thus highly sensitive, fast, and selective detection can be achieved. We demonstrate this by incorporating an ion-recognizable microgel for detecting trace Pb²⁺, and connecting our platform with pipelines of tap water and wastewater for real-time online Pb²⁺ detection to achieve timely pollution warning and termination. This work provides a generalizable platform for incorporating myriad stimuli-responsive microgels to achieve ever-better performance for real-time online detection of various trace threat molecules, and may expand the scope of applications of detection techniques.

  3. Comparative analysis of neural network and regression based condition monitoring approaches for wind turbine fault detection

    NASA Astrophysics Data System (ADS)

    Schlechtingen, Meik; Ferreira Santos, Ilmar

    2011-07-01

    This paper presents the results of a comparison of three different model-based approaches for wind turbine fault detection in online SCADA data, applying the developed models to five real measured faults and anomalies. The regression-based model, as the simplest approach to building a normal behavior model, is compared to two artificial neural network based approaches: a full signal reconstruction and an autoregressive normal behavior model. Based on a real time series containing two generator bearing damages, the capability of identifying the incipient fault prior to the actual failure is investigated. The period after the first bearing damage is used to develop the three normal behavior models. The developed or trained models are used to investigate how the second damage manifests in the prediction error. Furthermore, the full signal reconstruction and the autoregressive approach are applied to further real time series containing gearbox bearing damages and stator temperature anomalies. The comparison revealed that all three models are capable of detecting incipient faults. However, they differ in the effort required for model development and in the remaining operational time after the first indication of damage. The general nonlinear neural network approaches outperform the regression model. The remaining seasonality in the regression model prediction error makes it difficult to detect abnormality and leads to increased alarm levels and thus a shorter remaining operational period. For the bearing damages and the stator anomalies under investigation, the full signal reconstruction neural network gave the best fault visibility and thus led to the highest confidence level.
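
    A hedged sketch of an autoregressive normal behaviour model of this kind is shown below: AR coefficients are fitted on fault-free data by least squares and the one-step prediction error is then monitored on new data, with a sustained rise in the residual treated as an incipient-fault indicator. The signals and model order are synthetic placeholders, not SCADA data.

```python
# Hedged sketch of an autoregressive normal-behaviour model: fit AR coefficients
# on fault-free data by least squares, then monitor the one-step prediction
# error on new data. A rising residual is treated as an incipient-fault sign.
import numpy as np

def _design_matrix(y, order):
    cols = [y[order - k - 1:len(y) - k - 1] for k in range(order)]  # lagged values
    return np.column_stack(cols + [np.ones(len(y) - order)])

def fit_ar(signal, order=5):
    """Least-squares fit of y[t] = a1*y[t-1] + ... + ap*y[t-p] + c."""
    y = np.asarray(signal, dtype=float)
    coef, *_ = np.linalg.lstsq(_design_matrix(y, order), y[order:], rcond=None)
    return coef

def prediction_error(signal, coef):
    y = np.asarray(signal, dtype=float)
    order = len(coef) - 1
    return y[order:] - _design_matrix(y, order) @ coef

rng = np.random.default_rng(3)
healthy = np.sin(np.linspace(0, 60, 2000)) + 0.05 * rng.normal(size=2000)
coef = fit_ar(healthy)
# added high-frequency component mimicking incipient bearing damage
faulty = healthy + 0.3 * np.sin(np.linspace(0, 600, 2000))
print("residual std, healthy vs faulty:",
      prediction_error(healthy, coef).std(), prediction_error(faulty, coef).std())
```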

  4. Mining pharmacovigilance data using Bayesian logistic regression with James-Stein type shrinkage estimation.

    PubMed

    An, Lihua; Fung, Karen Y; Krewski, Daniel

    2010-09-01

    Spontaneous adverse event reporting systems are widely used to identify adverse reactions to drugs following their introduction into the marketplace. In this article, a James-Stein type shrinkage estimation strategy was developed in a Bayesian logistic regression model to analyze pharmacovigilance data. This method is effective in detecting signals as it combines information and borrows strength across medically related adverse events. Computer simulation demonstrated that the shrinkage estimator is uniformly better than the maximum likelihood estimator in terms of mean squared error. This method was used to investigate the possible association of a series of diabetic drugs and the risk of cardiovascular events using data from the Canada Vigilance Online Database.
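
    As a generic illustration of James-Stein type shrinkage (not the authors' full Bayesian logistic regression), the sketch below shrinks a vector of per-event estimates toward their grand mean, which is the sense in which such estimators borrow strength across related adverse events.

```python
# Generic sketch of James-Stein type shrinkage: per-event estimates are shrunk
# toward the grand mean, borrowing strength across related adverse events.
# This is not the authors' Bayesian logistic regression model.
import numpy as np

def james_stein_shrink(estimates, variance):
    """Positive-part James-Stein estimator for independent estimates sharing an
    (approximately) common sampling variance."""
    y = np.asarray(estimates, dtype=float)
    k = y.size
    grand_mean = y.mean()
    ss = np.sum((y - grand_mean) ** 2)
    shrink = max(0.0, 1.0 - (k - 3) * variance / ss) if ss > 0 else 0.0
    return grand_mean + shrink * (y - grand_mean)

# toy example: noisy log-odds-ratio estimates for a group of related adverse events
rng = np.random.default_rng(4)
true_effects = np.array([0.2, 0.3, 0.25, 0.8, 0.22])
noisy = true_effects + rng.normal(scale=0.3, size=true_effects.size)
print(np.round(james_stein_shrink(noisy, variance=0.3 ** 2), 3))
```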

  5. Intrinsic interactive reinforcement learning - Using error-related potentials for real world human-robot interaction.

    PubMed

    Kim, Su Kyoung; Kirchner, Elsa Andrea; Stefes, Arne; Kirchner, Frank

    2017-12-14

    Reinforcement learning (RL) enables robots to learn their optimal behavioral strategy in dynamic environments based on feedback. Explicit human feedback during robot RL is advantageous, since an explicit reward function can be easily adapted. However, it is very demanding and tiresome for a human to continuously and explicitly generate feedback. Therefore, the development of implicit approaches is of high relevance. In this paper, we used an error-related potential (ErrP), an event-related activity in the human electroencephalogram (EEG), as an intrinsically generated implicit feedback (reward) for RL. Initially, we validated our approach with seven subjects in a simulated robot learning scenario. ErrPs were detected online in single trials with a balanced accuracy (bACC) of 91%, which was sufficient to learn to recognize gestures and the correct mapping between human gestures and robot actions in parallel. Finally, we validated our approach in a real robot scenario, in which seven subjects freely chose gestures and the real robot correctly learned the mapping between gestures and actions (ErrP detection: 90% bACC). In this paper, we demonstrated that intrinsically generated EEG-based human feedback in RL can successfully be used to implicitly improve gesture-based robot control during human-robot interaction. We call our approach intrinsic interactive RL.

  6. VizieR Online Data Catalog: Mira Variables in the OGLE Bulge fields (Groenewegen+, 2005)

    NASA Astrophysics Data System (ADS)

    Groenewegen, M. A. T.; Blommaert, J. A. D. L.

    2005-07-01

    Table 1 provides the results of the period analysis (up to 3 periods with error and amplitudes with error), and associated 2MASS and DENIS photometry. Table 2 provides the cross-correlation with other objects and special remarks. (4 data files).

  7. Teaching Statistics Online Using "Excel"

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  8. Demonstration of a quantum error detection code using a square lattice of four superconducting qubits

    PubMed Central

    Córcoles, A.D.; Magesan, Easwar; Srinivasan, Srikanth J.; Cross, Andrew W.; Steffen, M.; Gambetta, Jay M.; Chow, Jerry M.

    2015-01-01

    The ability to detect and deal with errors when manipulating quantum systems is a fundamental requirement for fault-tolerant quantum computing. Unlike classical bits that are subject to only digital bit-flip errors, quantum bits are susceptible to a much larger spectrum of errors, for which any complete quantum error-correcting code must account. Whilst classical bit-flip detection can be realized via a linear array of qubits, a general fault-tolerant quantum error-correcting code requires extending into a higher-dimensional lattice. Here we present a quantum error detection protocol on a two-by-two planar lattice of superconducting qubits. The protocol detects an arbitrary quantum error on an encoded two-qubit entangled state via quantum non-demolition parity measurements on another pair of error syndrome qubits. This result represents a building block towards larger lattices amenable to fault-tolerant quantum error correction architectures such as the surface code. PMID:25923200

  9. Demonstration of a quantum error detection code using a square lattice of four superconducting qubits.

    PubMed

    Córcoles, A D; Magesan, Easwar; Srinivasan, Srikanth J; Cross, Andrew W; Steffen, M; Gambetta, Jay M; Chow, Jerry M

    2015-04-29

    The ability to detect and deal with errors when manipulating quantum systems is a fundamental requirement for fault-tolerant quantum computing. Unlike classical bits that are subject to only digital bit-flip errors, quantum bits are susceptible to a much larger spectrum of errors, for which any complete quantum error-correcting code must account. Whilst classical bit-flip detection can be realized via a linear array of qubits, a general fault-tolerant quantum error-correcting code requires extending into a higher-dimensional lattice. Here we present a quantum error detection protocol on a two-by-two planar lattice of superconducting qubits. The protocol detects an arbitrary quantum error on an encoded two-qubit entangled state via quantum non-demolition parity measurements on another pair of error syndrome qubits. This result represents a building block towards larger lattices amenable to fault-tolerant quantum error correction architectures such as the surface code.

  10. Integrated analysis of error detection and recovery

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Lee, Y. H.

    1985-01-01

    An integrated modeling and analysis of error detection and recovery is presented. When fault latency and/or error latency exist, the system may suffer from multiple faults or error propagations which seriously deteriorate the fault-tolerant capability. Several detection models that enable analysis of the effect of detection mechanisms on the subsequent error handling operations and the overall system reliability were developed. Following detection of the faulty unit and reconfiguration of the system, the contaminated processes or tasks have to be recovered. The strategies of error recovery employed depend on the detection mechanisms and the available redundancy. Several recovery methods including the rollback recovery are considered. The recovery overhead is evaluated as an index of the capabilities of the detection and reconfiguration mechanisms.

  11. Certified dual-corrected radiation patterns of phased antenna arrays by offline–online order reduction of finite-element models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sommer, A., E-mail: a.sommer@lte.uni-saarland.de; Farle, O., E-mail: o.farle@lte.uni-saarland.de; Dyczij-Edlinger, R., E-mail: edlinger@lte.uni-saarland.de

    2015-10-15

    This paper presents a fast numerical method for computing certified far-field patterns of phased antenna arrays over broad frequency bands as well as wide ranges of steering and look angles. The proposed scheme combines finite-element analysis, dual-corrected model-order reduction, and empirical interpolation. To assure the reliability of the results, improved a posteriori error bounds for the radiated power and directive gain are derived. Both the reduced-order model and the error-bounds algorithm feature offline–online decomposition. A real-world example is provided to demonstrate the efficiency and accuracy of the suggested approach.

  12. Online Tools for Uncovering Data Quality (DQ) Issues in Satellite-Based Global Precipitation Products

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Heo, Gil

    2015-01-01

    Data quality (DQ) has many attributes or facets (e.g., errors, biases, systematic differences, uncertainties, benchmarks, false trends, false alarm ratio). Sources of DQ issues can be complicated (measurements, environmental conditions, surface types, algorithms, etc.) and difficult to identify, especially for multi-sensor and multi-satellite products with bias correction (TMPA, IMERG, etc.). Key questions are how to obtain DQ information quickly and easily, especially quantified information for a region of interest (existing parameters such as random error, the literature, do-it-yourself analyses, etc.), and how to apply that knowledge in research and applications. Here, we focus on online systems for integration of products and parameters, visualization and analysis, as well as investigation and extraction of DQ information.

  13. Research on On-Line Modeling of Fed-Batch Fermentation Process Based on v-SVR

    NASA Astrophysics Data System (ADS)

    Ma, Yongjun

    The fermentation process is very complex and nonlinear, and many parameters are not easy to measure directly on-line, so soft sensor modeling is a good solution. This paper introduces v-support vector regression (v-SVR) for soft sensor modeling of the fed-batch fermentation process. v-SVR is a novel type of learning machine that can control the accuracy of fitting and the prediction error by adjusting the parameter v. An on-line training algorithm is discussed in detail to reduce the training complexity of v-SVR. The experimental results show that v-SVR has a low error rate and better generalization with an appropriate v.
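
    Since the abstract only names the method, a minimal soft-sensor-style regression sketch using scikit-learn's NuSVR (the batch ν-SVR implementation) is shown below; the synthetic data and variable names are illustrative, and the paper's on-line training algorithm is not reproduced:

        import numpy as np
        from sklearn.svm import NuSVR

        # Synthetic "fermentation" data: predict a hard-to-measure variable (e.g. biomass)
        # from easily measured on-line variables; all names here are illustrative only.
        rng = np.random.default_rng(1)
        X = rng.uniform(size=(300, 3))                   # e.g. temperature, pH, feed rate (scaled)
        y = 5 * X[:, 0] + np.sin(6 * X[:, 1]) + 0.1 * rng.normal(size=300)

        model = NuSVR(nu=0.5, C=10.0, kernel="rbf")      # nu trades off support vectors vs. error
        model.fit(X[:200], y[:200])
        print("test RMSE:", np.sqrt(np.mean((model.predict(X[200:]) - y[200:]) ** 2)))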

  14. Machine tools error characterization and compensation by on-line measurement of artifact

    NASA Astrophysics Data System (ADS)

    Wahid Khan, Abdul; Chen, Wuyi; Wu, Lili

    2009-11-01

    Most manufacturing machine tools are utilized for mass production or batch production with high accuracy under a deterministic manufacturing principle. The volumetric accuracy of machine tools depends on the positional accuracy of the cutting tool, probe, or end effector relative to the workpiece in the workspace volume. In this research paper, a methodology is presented for volumetric calibration of machine tools by on-line measurement of an artifact or an object of a similar type. The machine tool geometric error characterization was carried out with a standard or an artifact having geometry similar to the mass-production or batch-production product. The artifact was measured at an arbitrary position in the volumetric workspace with a calibrated Renishaw touch trigger probe system. Positional errors were stored in a computer for compensation purposes, so that the manufacturing batch could subsequently be run through compensated codes. This methodology was found to be quite effective for manufacturing high-precision components with greater dimensional accuracy and reliability. Calibration by on-line measurement offers the advantage of improving the manufacturing process through the deterministic manufacturing principle and was found efficient and economical, but it is limited to the workspace or envelope surface of the measured artifact's geometry or profile.

  15. Application of round grating angle measurement composite error amendment in the online measurement accuracy improvement of large diameter

    NASA Astrophysics Data System (ADS)

    Wang, Biao; Yu, Xiaofen; Li, Qinzhao; Zheng, Yu

    2008-10-01

    Aiming at the influence factors of round grating dividing error, rolling-wheel eccentricity, and rolling-wheel surface shape errors, the paper provides an amendment method based on the rolling wheel to obtain a composite error model that includes all of the influence factors above, and then corrects the non-circular angle measurement error of the rolling wheel. We performed software simulation and experimental verification; the results indicate that the composite error amendment method can improve the diameter measurement accuracy achievable with the rolling-wheel approach. It has wide application prospects for measurement accuracies better than 5 μm/m.

  16. Optimizing ChIP-seq peak detectors using visual labels and supervised machine learning

    PubMed Central

    Goerner-Potvin, Patricia; Morin, Andreanne; Shao, Xiaojian; Pastinen, Tomi

    2017-01-01

    Motivation: Many peak detection algorithms have been proposed for ChIP-seq data analysis, but it is not obvious which algorithm and what parameters are optimal for any given dataset. In contrast, regions with and without obvious peaks can be easily labeled by visual inspection of aligned read counts in a genome browser. We propose a supervised machine learning approach for ChIP-seq data analysis, using labels that encode qualitative judgments about which genomic regions contain or do not contain peaks. The main idea is to manually label a small subset of the genome, and then learn a model that makes consistent peak predictions on the rest of the genome. Results: We created 7 new histone mark datasets with 12 826 visually determined labels, and analyzed 3 existing transcription factor datasets. We observed that default peak detection parameters yield high false positive rates, which can be reduced by learning parameters using a relatively small training set of labeled data from the same experiment type. We also observed that labels from different people are highly consistent. Overall, these data indicate that our supervised labeling method is useful for quantitatively training and testing peak detection algorithms. Availability and Implementation: Labeled histone mark data http://cbio.ensmp.fr/~thocking/chip-seq-chunk-db/, R package to compute the label error of predicted peaks https://github.com/tdhock/PeakError Contacts: toby.hocking@mail.mcgill.ca or guil.bourque@mcgill.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27797775

  17. Optimizing ChIP-seq peak detectors using visual labels and supervised machine learning.

    PubMed

    Hocking, Toby Dylan; Goerner-Potvin, Patricia; Morin, Andreanne; Shao, Xiaojian; Pastinen, Tomi; Bourque, Guillaume

    2017-02-15

    Many peak detection algorithms have been proposed for ChIP-seq data analysis, but it is not obvious which algorithm and what parameters are optimal for any given dataset. In contrast, regions with and without obvious peaks can be easily labeled by visual inspection of aligned read counts in a genome browser. We propose a supervised machine learning approach for ChIP-seq data analysis, using labels that encode qualitative judgments about which genomic regions contain or do not contain peaks. The main idea is to manually label a small subset of the genome, and then learn a model that makes consistent peak predictions on the rest of the genome. We created 7 new histone mark datasets with 12 826 visually determined labels, and analyzed 3 existing transcription factor datasets. We observed that default peak detection parameters yield high false positive rates, which can be reduced by learning parameters using a relatively small training set of labeled data from the same experiment type. We also observed that labels from different people are highly consistent. Overall, these data indicate that our supervised labeling method is useful for quantitatively training and testing peak detection algorithms. Labeled histone mark data http://cbio.ensmp.fr/~thocking/chip-seq-chunk-db/ , R package to compute the label error of predicted peaks https://github.com/tdhock/PeakError. toby.hocking@mail.mcgill.ca or guil.bourque@mcgill.ca. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
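
    To make the idea of a "label error" concrete, the simplified sketch below counts false-positive and false-negative regions given visually labeled regions and predicted peaks. The real PeakError R package linked above also supports peakStart/peakEnd labels; this two-label version is a simplification:

        def label_errors(labels, peaks):
            # labels: list of (start, end, kind) with kind in {"peaks", "noPeaks"};
            # peaks: list of predicted (start, end) intervals.
            # A "noPeaks" region overlapped by any predicted peak is a false positive;
            # a "peaks" region overlapped by none is a false negative.
            def overlaps(region, peak):
                return peak[0] < region[1] and region[0] < peak[1]
            fp = fn = 0
            for start, end, kind in labels:
                hit = any(overlaps((start, end), p) for p in peaks)
                if kind == "noPeaks" and hit:
                    fp += 1
                if kind == "peaks" and not hit:
                    fn += 1
            return fp, fn

        print(label_errors([(0, 100, "noPeaks"), (200, 300, "peaks")], [(240, 260)]))  # -> (0, 0)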

  18. Adaptive control of nonlinear system using online error minimum neural networks.

    PubMed

    Jia, Chao; Li, Xiaoli; Wang, Kang; Ding, Dawei

    2016-11-01

    In this paper, a new learning algorithm named OEM-ELM (Online Error Minimized ELM) is proposed, based on the ELM (Extreme Learning Machine) neural network algorithm and an extension of its main structure. The core idea of the OEM-ELM algorithm is: online learning, evaluation of network performance, and growth of the number of hidden nodes. It combines the advantages of OS-ELM and EM-ELM, improving identification capability while avoiding network redundancy. An adaptive controller based on the proposed OEM-ELM algorithm is set up, which has a stronger adaptive capability to changes in the environment. Adaptive control of a chemical process, the Continuous Stirred Tank Reactor (CSTR), is also given as an application. The simulation results show that, compared with the traditional ELM algorithm, the proposed algorithm can avoid network redundancy and greatly improve the control performance. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  19. Optimised to Fail: Card Readers for Online Banking

    NASA Astrophysics Data System (ADS)

    Drimer, Saar; Murdoch, Steven J.; Anderson, Ross

    The Chip Authentication Programme (CAP) has been introduced by banks in Europe to deal with the soaring losses due to online banking fraud. A handheld reader is used together with the customer’s debit card to generate one-time codes for both login and transaction authentication. The CAP protocol is not public, and was rolled out without any public scrutiny. We reverse engineered the UK variant of card readers and smart cards and here provide the first public description of the protocol. We found numerous weaknesses that are due to design errors such as reusing authentication tokens, overloading data semantics, and failing to ensure freshness of responses. The overall strategic error was excessive optimisation. There are also policy implications. The move from signature to PIN for authorising point-of-sale transactions shifted liability from banks to customers; CAP introduces the same problem for online banking. It may also expose customers to physical harm.

  20. Critical comparison of the on-line and off-line molecularly imprinted solid-phase extraction of patulin coupled with liquid chromatography.

    PubMed

    Lhotská, Ivona; Holznerová, Anežka; Solich, Petr; Šatínský, Dalibor

    2017-12-01

    Determining trace amounts of mycotoxin contamination requires sensitive and selective analytical tools. Improving the selectivity of sample pretreatment steps, including new and modern extraction techniques, is one way to achieve this. Molecularly imprinted polymers as selective sorbents for extraction undoubtedly meet these criteria. The presented work is focused on the hyphenation of on-line molecularly imprinted solid-phase extraction with a chromatography system using a column-switching approach. A critical comparison with a simultaneously developed off-line extraction procedure, an evaluation of the pros and cons of each method, and a determination of the reliability of both methods on real sample analysis were carried out. Both high-performance liquid chromatography methods, using off-line extraction on a molecularly imprinted polymer and an on-line column-switching approach, were validated, and the validation results were compared against each other. Although automation leads to significant time savings, fewer human errors, and no handling of toxic solvents, it reached worse detection limits (15 versus 6 μg/L), worse recovery values (68.3-123.5 versus 81.2-109.9%), and worse efficiency throughout the entire clean-up process in comparison with the off-line extraction method. The difficulties encountered, the compromises made during the optimization of on-line coupling, and their critical evaluation are presented in detail. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Online and offline awareness deficits: Anosognosia for spatial neglect.

    PubMed

    Chen, Peii; Toglia, Joan

    2018-04-12

    Anosognosia for spatial neglect (ASN) can be offline or online. Offline ASN is general unawareness of having experienced spatial deficits. Online ASN is an awareness deficit involving underestimation of spatial difficulties that are likely to occur in an upcoming task (anticipatory ASN) or have just occurred during the task (emergent ASN). We explored the relationships among spatial neglect, offline ASN, anticipatory ASN, and emergent ASN. Research Method/Design: Forty-four survivors of stroke answered questionnaires assessing offline and online self-awareness of spatial problems. The online questionnaire was administered immediately before and after each of 4 tests for spatial neglect, including shape cancellation, address and sentence copying, telephone dialing, and indented paragraph reading. Participants were certain they had difficulties in daily spatial tasks (offline awareness), in the task they were about to perform (anticipatory awareness), and in the task they had just performed (emergent awareness). Nonetheless, they consistently overestimated their spatial abilities, indicating ASN. Offline and online ASN appeared independent. Online ASN improved after task execution. Neglect severity was not positively correlated with offline ASN. Greater neglect severity correlated with both greater anticipatory and emergent ASN. Regardless of neglect severity, we found task-specific differences in emergent ASN but not in anticipatory ASN. Individuals with spatial neglect acknowledge their spatial difficulty (certainty of error occurrence) but may not necessarily recognize the extent of their difficulty (accuracy of error estimation). Our findings suggest that offline and online ASN are independent. A potential implication from the study is that familiar and challenging tasks may facilitate emergence of self-awareness. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  2. Grade Expectations: Mapping Stakeholder Views of Online Plagiarism Detection

    ERIC Educational Resources Information Center

    Ashe, Diana; Manning, Michelle

    2007-01-01

    Based upon a pilot study of the leading online plagiarism detection service, this article examines the views of faculty and students as the main stakeholders in the controversy over online plagiarism detection. Rather than give advice outside of a specific institutional context, this study offers an understanding of the reasoning that informs the…

  3. Student Consistency and Implications for Feedback in Online Assessment Systems

    ERIC Educational Resources Information Center

    Madhyastha, Tara M.; Tanimoto, Steven

    2009-01-01

    Most of the emphasis on mining online assessment logs has been to identify content-specific errors. However, the pattern of general "consistency" is domain independent, strongly related to performance, and can itself be a target of educational data mining. We demonstrate that simple consistency indicators are related to student outcomes,…

  4. Online Psychology: Trial and Error in Course Development

    ERIC Educational Resources Information Center

    Harman, Marsha J.

    2009-01-01

    Online courses appear to be the future if colleges and universities choose to increase enrollments with students who need more flexibility in scheduling. The challenge has been to create a course that is rigorous with the limitations to physical presence of the instructor and the parameters inherent in technological delivery. This article relates…

  5. The Clinical Practice Library of Medicine (CPLM): An on-line biomedical computer library. System documentation

    NASA Technical Reports Server (NTRS)

    Grams, R. R.

    1982-01-01

    A system designed to access a large range of available medical textbook information in an online interactive fashion is described. A high level query type database manager, INQUIRE, is used. Operating instructions, system flow diagrams, database descriptions, text generation, and error messages are discussed. User information is provided.

  6. Towards New Multiplatform Hybrid Online Laboratory Models

    ERIC Educational Resources Information Center

    Rodriguez-Gil, Luis; García-Zubia, Javier; Orduña, Pablo; López-de-Ipiña, Diego

    2017-01-01

    Online laboratories have traditionally been split between virtual labs, with simulated components; and remote labs, with real components. The former tend to provide less realism but to be easily scalable and less expensive to maintain, while the latter are fully real but tend to require a higher maintenance effort and be more error-prone. This…

  7. Improving integrity of on-line grammage measurement with traceable basic calibration.

    PubMed

    Kangasrääsiö, Juha

    2010-07-01

    The automatic control of grammage (basis weight) in paper and board production is based upon on-line grammage measurement. Furthermore, the automatic control of other quality variables such as moisture, ash content and coat weight, may rely on the grammage measurement. The integrity of Kr-85 based on-line grammage measurement systems was studied, by performing basic calibrations with traceably calibrated plastic reference standards. The calibrations were performed according to the EN ISO/IEC 17025 standard, which is a requirement for calibration laboratories. The observed relative measurement errors were 3.3% in the first time calibrations at the 95% confidence level. With the traceable basic calibration method, however, these errors can be reduced to under 0.5%, thus improving the integrity of on-line grammage measurements. Also a standardised algorithm, based on the experience from the performed calibrations, is proposed to ease the adjustment of the different grammage measurement systems. The calibration technique can basically be applied to all beta-radiation based grammage measurements. 2010 ISA. Published by Elsevier Ltd. All rights reserved.

  8. Forecasting daily streamflow using online sequential extreme learning machines

    NASA Astrophysics Data System (ADS)

    Lima, Aranildo R.; Cannon, Alex J.; Hsieh, William W.

    2016-06-01

    While nonlinear machine learning methods have been widely used in environmental forecasting, in situations where new data arrive continually, the need to make frequent model updates can become cumbersome and computationally costly. To alleviate this problem, an online sequential learning algorithm for single hidden layer feedforward neural networks - the online sequential extreme learning machine (OSELM) - is automatically updated inexpensively as new data arrive (and the new data can then be discarded). OSELM was applied to forecast daily streamflow at two small watersheds in British Columbia, Canada, at lead times of 1-3 days. Predictors used were weather forecast data generated by the NOAA Global Ensemble Forecasting System (GEFS), and local hydro-meteorological observations. OSELM forecasts were tested with daily, monthly or yearly model updates. More frequent updating gave smaller forecast errors, including errors for data above the 90th percentile. Larger datasets used in the initial training of OSELM helped to find better parameters (number of hidden nodes) for the model, yielding better predictions. With the online sequential multiple linear regression (OSMLR) as benchmark, we concluded that OSELM is an attractive approach as it easily outperformed OSMLR in forecast accuracy.
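
    The OSELM update that makes this cheap is a recursive least-squares style step that absorbs each new chunk of data, after which the chunk can be discarded. A compact numpy sketch of that update (regression only; the network size, activation, and small ridge term are choices made here, not the paper's settings):

        import numpy as np

        class OSELM:
            # Minimal online sequential extreme learning machine (regression) sketch.
            def __init__(self, n_inputs, n_hidden, seed=0):
                rng = np.random.default_rng(seed)
                self.W = rng.normal(size=(n_inputs, n_hidden))  # random input weights (never trained)
                self.b = rng.normal(size=n_hidden)              # random hidden biases (never trained)
                self.P = None
                self.beta = None

            def _hidden(self, X):
                return np.tanh(X @ self.W + self.b)

            def fit_initial(self, X, y):
                H = self._hidden(X)
                self.P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(H.shape[1]))  # small ridge for stability
                self.beta = self.P @ H.T @ y

            def update(self, X, y):
                # Recursive least-squares update for a new data chunk; the chunk is then discarded.
                H = self._hidden(X)
                K = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
                self.P -= self.P @ H.T @ K @ H @ self.P
                self.beta = self.beta + self.P @ H.T @ (y - H @ self.beta)

            def predict(self, X):
                return self._hidden(X) @ self.beta

    A daily-updated forecast along the lines described above would call fit_initial on the initial training window and then call update once per day (or month, or year) with the newest predictor-observation pairs.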

  9. Sensitivity in error detection of patient specific QA tools for IMRT plans

    NASA Astrophysics Data System (ADS)

    Lat, S. Z.; Suriyapee, S.; Sanghangthum, T.

    2016-03-01

    The high complexity of dose calculation in treatment planning and the accurate delivery of IMRT plans require a high-precision verification method. The purpose of this study is to investigate the error detection capability of patient-specific QA tools for IMRT plans. Two H&N and two prostate IMRT plans were studied with the MapCHECK2 and portal dosimetry QA tools. Measurements were undertaken for the original plans and for modified plans with errors introduced. The intentional errors comprised prescribed-dose errors (±2 to ±6%) and position shifts in the X-axis and Y-axis (±1 to ±5 mm). After measurement, gamma passing rates of the original and modified plans were compared. The average gamma passing rates for the original H&N and prostate plans were 98.3% and 100% for MapCHECK2 and 95.9% and 99.8% for portal dosimetry, respectively. In the H&N plans, MapCHECK2 can detect position shift errors starting from 3 mm, while portal dosimetry can detect errors starting from 2 mm. Both devices showed similar sensitivity in detecting position shift errors in the prostate plans. For the H&N plans, MapCHECK2 can detect dose errors starting at ±4%, whereas portal dosimetry can detect them from ±2%. For the prostate plans, both devices can identify dose errors starting from ±4%. The sensitivity of error detection depends on the type of error and plan complexity.

  10. A Mechanism for Error Detection in Speeded Response Time Tasks

    ERIC Educational Resources Information Center

    Holroyd, Clay B.; Yeung, Nick; Coles, Michael G. H.; Cohen, Jonathan D.

    2005-01-01

    The concept of error detection plays a central role in theories of executive control. In this article, the authors present a mechanism that can rapidly detect errors in speeded response time tasks. This error monitor assigns values to the output of cognitive processes involved in stimulus categorization and response generation and detects errors…

  11. The generalization ability of online SVM classification based on Markov sampling.

    PubMed

    Xu, Jie; Yan Tang, Yuan; Zou, Bin; Xu, Zongben; Li, Luoqing; Lu, Yang

    2015-03-01

    In this paper, we consider online support vector machine (SVM) classification learning algorithms with uniformly ergodic Markov chain (u.e.M.c.) samples. We establish the bound on the misclassification error of an online SVM classification algorithm with u.e.M.c. samples based on reproducing kernel Hilbert spaces and obtain a satisfactory convergence rate. We also introduce a novel online SVM classification algorithm based on Markov sampling, and present the numerical studies on the learning ability of online SVM classification based on Markov sampling for benchmark repository. The numerical studies show that the learning performance of the online SVM classification algorithm based on Markov sampling is better than that of classical online SVM classification based on random sampling as the size of training samples is larger.
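
    The abstract contrasts Markov-chain (u.e.M.c.) sampling of training examples with i.i.d. random sampling for online SVM classification. As a rough approximation only, the sketch below trains a linear SVM by online SGD on batches drawn by a simple ergodic chain over sample indices; the paper's kernel/RKHS algorithm is not reproduced, and the chain construction and parameters are assumptions:

        import numpy as np
        from sklearn.linear_model import SGDClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(5000, 10))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

        def markov_indices(n, steps, p_jump=0.1):
            # Ergodic Markov chain over sample indices: small local moves with an
            # occasional uniform restart (a stand-in for u.e.M.c. sampling).
            i = int(rng.integers(n))
            for _ in range(steps):
                i = int(rng.integers(n)) if rng.random() < p_jump else (i + int(rng.integers(-5, 6))) % n
                yield i

        clf = SGDClassifier(loss="hinge", alpha=1e-4)   # linear SVM trained by online SGD
        classes, batch = np.array([0, 1]), []
        for idx in markov_indices(len(X), steps=20000):
            batch.append(idx)
            if len(batch) == 50:
                clf.partial_fit(X[batch], y[batch], classes=classes)
                batch = []
        print("training accuracy:", round(clf.score(X, y), 3))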

  12. Procedural error monitoring and smart checklists

    NASA Technical Reports Server (NTRS)

    Palmer, Everett

    1990-01-01

    Human beings make and usually detect errors routinely. The same mental processes that allow humans to cope with novel problems can also lead to error. Bill Rouse has argued that errors are not inherently bad but their consequences may be. He proposes the development of error-tolerant systems that detect errors and take steps to prevent the consequences of the error from occurring. Research should be done on self and automatic detection of random and unanticipated errors. For self detection, displays should be developed that make the consequences of errors immediately apparent. For example, electronic map displays graphically show the consequences of horizontal flight plan entry errors. Vertical profile displays should be developed to make apparent vertical flight planning errors. Other concepts such as energy circles could also help the crew detect gross flight planning errors. For automatic detection, systems should be developed that can track pilot activity, infer pilot intent and inform the crew of potential errors before their consequences are realized. Systems that perform a reasonableness check on flight plan modifications by checking route length and magnitude of course changes are simple examples. Another example would be a system that checked the aircraft's planned altitude against a data base of world terrain elevations. Information is given in viewgraph form.
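
    As a toy illustration of the kind of reasonableness check described above (route length and magnitude of course changes), the following sketch flags suspicious flight-plan edits; the flat-earth distance approximation and all thresholds are invented for the example:

        import math

        def reasonableness_check(waypoints, max_leg_nm=500, max_turn_deg=120):
            # Flag suspiciously long legs or implausibly sharp course changes in a
            # modified flight plan. Thresholds and the distance formula are placeholders.
            def dist_nm(a, b):
                return math.hypot(b[0] - a[0], b[1] - a[1]) * 60.0  # degrees -> rough nautical miles
            def course_deg(a, b):
                return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0])) % 360.0
            alerts = []
            for i in range(len(waypoints) - 1):
                if dist_nm(waypoints[i], waypoints[i + 1]) > max_leg_nm:
                    alerts.append(f"leg {i}->{i + 1} is unusually long")
                if i >= 1:
                    turn = abs(course_deg(waypoints[i], waypoints[i + 1])
                               - course_deg(waypoints[i - 1], waypoints[i]))
                    turn = min(turn, 360.0 - turn)
                    if turn > max_turn_deg:
                        alerts.append(f"course change of {turn:.0f} deg at waypoint {i}")
            return alerts

        print(reasonableness_check([(40.0, -75.0), (41.0, -74.0), (30.0, -74.5), (42.5, -73.0)]))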

  13. Differential effects of primary motor cortex and cerebellar transcranial direct current stimulation on motor learning in healthy individuals: A randomized double-blind sham-controlled study.

    PubMed

    Ehsani, F; Bakhtiary, A H; Jaberzadeh, S; Talimkhani, A; Hajihasani, A

    2016-11-01

    The purpose of this study was to compare the effect of primary motor cortex (M1) and cerebellar anodal transcranial direct current stimulation (a-tDCS) on online and offline motor learning in healthy individuals. Fifty-nine healthy volunteers were randomly divided into three groups (n=20 in the two experimental groups and n=19 in the sham-control group). One experimental group received M1 a-tDCS and another received cerebellar a-tDCS. The main outcome measures were response time (RT) and number of errors during the serial response time test (SRTT), which were assessed prior to, 35 min after, and 48 h after the interventions. Reduction of response time (RT) and error numbers at the last block of the test compared to the first block was considered online learning. Comparison of assessments during retention tests was considered as short-term and long-term offline learning. Online RT reduction was not different among groups (P>0.05), while online error reduction was significantly greater in the cerebellar a-tDCS group than in the sham-control group (P<0.017). Moreover, a-tDCS on both M1 and cerebellar regions produced more long-term offline learning as compared to sham tDCS (P<0.01), while short-term offline RT reduction was significantly greater in the M1 a-tDCS group than in the sham-control group (P<0.05). The findings indicated that although cerebellar a-tDCS enhances online learning and M1 a-tDCS has more effect on short-term offline learning, both M1 and cerebellar a-tDCS can be used as a boosting technique for improvement of offline motor learning in healthy individuals. Crown Copyright © 2016. Published by Elsevier Ireland Ltd. All rights reserved.

  14. Is Comprehension Necessary for Error Detection? A Conflict-Based Account of Monitoring in Speech Production

    ERIC Educational Resources Information Center

    Nozari, Nazbanou; Dell, Gary S.; Schwartz, Myrna F.

    2011-01-01

    Despite the existence of speech errors, verbal communication is successful because speakers can detect (and correct) their errors. The standard theory of speech-error detection, the perceptual-loop account, posits that the comprehension system monitors production output for errors. Such a comprehension-based monitor, however, cannot explain the…

  15. Clover: Compiler directed lightweight soft error resilience

    DOE PAGES

    Liu, Qingrui; Lee, Dongyoon; Jung, Changhee; ...

    2015-05-01

    This paper presents Clover, a compiler directed soft error detection and recovery scheme for lightweight soft error resilience. The compiler carefully generates soft error tolerant code based on idempotent processing without explicit checkpoint. During program execution, Clover relies on a small number of acoustic wave detectors deployed in the processor to identify soft errors by sensing the wave made by a particle strike. To cope with DUE (detected unrecoverable errors) caused by the sensing latency of error detection, Clover leverages a novel selective instruction duplication technique called tail-DMR (dual modular redundancy). Once a soft error is detected by either the sensor or the tail-DMR, Clover takes care of the error as in the case of exception handling. To recover from the error, Clover simply redirects program control to the beginning of the code region where the error is detected. Lastly, the experimental results demonstrate that the average runtime overhead is only 26%, which is a 75% reduction compared to that of the state-of-the-art soft error resilience technique.

  16. The Case of the Pilfered Paper: Implications of Online Writing Assistance and Web-Based Plagiarism Detection Services

    ERIC Educational Resources Information Center

    Morgan, Phoebe; Vaughn, Jacqueline

    2010-01-01

    While there is nothing new about academic dishonesty, how it is committed, prevented, and detected has been dramatically transformed by the advent of online technologies. This article briefly describes the concurrent emergence of online writing assistance services and Web-based plagiarism detection tools and examines the implications of both for…

  17. A Conceptual Framework for Detecting Cheating in Online and Take-Home Exams

    ERIC Educational Resources Information Center

    D'Souza, Kelwyn A.; Siegfeldt, Denise V.

    2017-01-01

    Selecting the right methodology to use for detecting cheating in online exams requires considerable time and effort due to a wide variety of scholarly publications on academic dishonesty in online education. This article offers a cheating detection framework that can serve as a guideline for conducting cheating studies. The necessary theories and…

  18. Impact of Measurement Error on Synchrophasor Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yilu; Gracia, Jose R.; Ewing, Paul D.

    2015-07-01

    Phasor measurement units (PMUs), a type of synchrophasor, are powerful diagnostic tools that can help avert catastrophic failures in the power grid. Because of this, PMU measurement errors are particularly worrisome. This report examines the internal and external factors contributing to PMU phase angle and frequency measurement errors and gives a reasonable explanation for them. It also analyzes the impact of those measurement errors on several synchrophasor applications: event location detection, oscillation detection, islanding detection, and dynamic line rating. The primary finding is that dynamic line rating is more likely to be influenced by measurement error. Other findings include the possibility of reporting nonoscillatory activity as an oscillation as the result of error, failing to detect oscillations submerged by error, and the unlikely impact of error on event location and islanding detection.
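
    To give a feel for why small phase-angle errors matter, a back-of-the-envelope example using the standard lossless-line power-transfer relation P = V1·V2·sin(δ)/X; this relation and the per-unit numbers are textbook assumptions, not values taken from the report:

        import numpy as np

        V1 = V2 = 1.0                # per-unit bus voltages
        X = 0.1                      # per-unit line reactance
        delta = np.radians(10.0)     # true voltage-angle difference across the line
        angle_err = np.radians(0.5)  # PMU phase-angle measurement error

        P_true = V1 * V2 * np.sin(delta) / X
        P_meas = V1 * V2 * np.sin(delta + angle_err) / X
        print(f"flow estimate off by {100 * (P_meas - P_true) / P_true:.1f}%")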

  19. VizieR Online Data Catalog: R136 JKs photometry from VLT/SPHERE EAO (Khorrami+, 2017)

    NASA Astrophysics Data System (ADS)

    Khorrami, Z.; Vakili, F.; Lanz, T.; Langlois, M.; Lagadec, E.; Meyer, M. R.; Robbe-Dubois, S.; Abe, L.; Avenhaus, H.; Beuzit, J. L.; Gratton, R.; Mouillet, D.; Origne, A.; Petit, C.; Ramos, J.

    2017-03-01

    The SPHERE/IRDIS catalog of the common sources between J and Ks-band data on R136. The ID, Xpix and Ypix are the identification and pixel position in the IRDIS K and J image. σK and σJ are the total error (combination of PSF-fitting error, residual errors and the calibration error) in Ks and J images. CK and CJ are the Correlation coefficients between the input PSF and the star, in Ks and J data. (1 data file).

  20. Methods and apparatus using commutative error detection values for fault isolation in multiple node computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almasi, Gheorghe; Blumrich, Matthias Augustin; Chen, Dong

    Methods and apparatus perform fault isolation in multiple node computing systems using commutative error detection values (for example, checksums) to identify and to isolate faulty nodes. When information associated with a reproducible portion of a computer program is injected into a network by a node, a commutative error detection value is calculated. At intervals, node fault detection apparatus associated with the multiple node computer system retrieves commutative error detection values associated with the node and stores them in memory. When the computer program is executed again by the multiple node computer system, new commutative error detection values are created and stored in memory. The node fault detection apparatus identifies faulty nodes by comparing commutative error detection values associated with reproducible portions of the application program generated by a particular node from different runs of the application program. Differences in values indicate a possible faulty node.
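
    A minimal sketch of the idea (not the patented apparatus): accumulate an order-independent checksum per node for each reproducible program section, then compare across two runs; any node whose value differs is a candidate fault. All names and the additive checksum choice are illustrative.

        from collections import defaultdict

        def record_checksums(run_log):
            # run_log: iterable of (node_id, section_id, payload_bytes) for one run.
            # An order-independent (commutative) 32-bit additive checksum is accumulated
            # per node and per reproducible code section.
            sums = defaultdict(int)
            for node, section, payload in run_log:
                sums[(node, section)] = (sums[(node, section)] + sum(payload)) & 0xFFFFFFFF
            return sums

        def candidate_faulty_nodes(run_a, run_b):
            # Nodes whose checksum for any reproducible section differs between two runs.
            a, b = record_checksums(run_a), record_checksums(run_b)
            return sorted({node for (node, section), value in a.items() if b.get((node, section)) != value})

        run1 = [(0, "halo_exchange", b"abc"), (1, "halo_exchange", b"xyz")]
        run2 = [(0, "halo_exchange", b"abc"), (1, "halo_exchange", b"xyQ")]  # node 1 sent different data
        print(candidate_faulty_nodes(run1, run2))   # -> [1]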

  1. WE-H-BRC-09: Simulated Errors in Mock Radiotherapy Plans to Quantify the Effectiveness of the Physics Plan Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopan, O; Kalet, A; Smith, W

    2016-06-15

    Purpose: A standard tool for ensuring the quality of radiation therapy treatments is the initial physics plan review. However, little is known about its performance in practice. The goal of this study is to measure the effectiveness of physics plan review by introducing simulated errors into “mock” treatment plans and measuring the performance of plan review by physicists. Methods: We generated six mock treatment plans containing multiple errors. These errors were based on incident learning system data both within the department and internationally (SAFRON). These errors were scored for severity and frequency. Those with the highest scores were included in the simulations (13 errors total). Observer bias was minimized using a multiple co-correlated distractor approach. Eight physicists reviewed these plans for errors, with each physicist reviewing, on average, 3/6 plans. The confidence interval for the proportion of errors detected was computed using the Wilson score interval. Results: Simulated errors were detected in 65% of reviews [51–75%] (95% confidence interval [CI] in brackets). The following error scenarios had the highest detection rates: incorrect isocenter in DRRs/CBCT (91% [73–98%]) and a planned dose different from the prescribed dose (100% [61–100%]). Errors with low detection rates involved incorrect field parameters in the record-and-verify system (38% [18–61%]) and incorrect isocenter localization in the planning system (29% [8–64%]). Though pre-treatment QA failure was reliably identified (100%), less than 20% of participants reported the error that caused the failure. Conclusion: This is one of the first quantitative studies of error detection. Although physics plan review is a key safety measure and can identify some errors with high fidelity, other errors are more challenging to detect. These data will guide future work on standardization and automation. Creating new checks or improving existing ones (i.e., via automation) will help in detecting those errors with low detection rates.
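
    The confidence intervals quoted above were computed with the Wilson score interval; a small helper reproduces that calculation. The example counts are hypothetical, chosen only to land near the reported 65% [51–75%]:

        from math import sqrt

        def wilson_interval(k, n, z=1.96):
            # Wilson score interval for a binomial proportion (default ~95% confidence).
            p = k / n
            denom = 1.0 + z ** 2 / n
            centre = (p + z ** 2 / (2 * n)) / denom
            half = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
            return centre - half, centre + half

        low, high = wilson_interval(39, 60)   # hypothetical: 39 detections in 60 opportunities
        print(f"65% detected, 95% CI [{low:.0%}, {high:.0%}]")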

  2. Convergence and objective functions of some fault/noise-injection-based online learning algorithms for RBF networks.

    PubMed

    Ho, Kevin I-J; Leung, Chi-Sing; Sum, John

    2010-06-01

    In the last two decades, many online fault/noise injection algorithms have been developed to attain a fault-tolerant neural network. However, little theoretical work related to their convergence and objective functions has been reported. This paper studies six common fault/noise-injection-based online learning algorithms for radial basis function (RBF) networks, namely 1) injecting additive input noise, 2) injecting additive/multiplicative weight noise, 3) injecting multiplicative node noise, 4) injecting multiweight fault (random disconnection of weights), 5) injecting multinode fault during training, and 6) weight decay with injecting multinode fault. Based on the Gladyshev theorem, we show that the convergence of these six online algorithms is almost sure. Moreover, their true objective functions being minimized are derived. For injecting additive input noise during training, the objective function is identical to that of the Tikhonov regularizer approach. For injecting additive/multiplicative weight noise during training, the objective function is the simple mean square training error. Thus, injecting additive/multiplicative weight noise during training cannot improve the fault tolerance of an RBF network. Similar to injecting additive input noise, the objective functions of other fault/noise-injection-based online algorithms contain a mean square error term and a specialized regularization term.

  3. Correction to Cantor et al. (2005)

    ERIC Educational Resources Information Center

    Cantor, James M.; Blanchard, Ray; Robichaud, Lori K.; Christensen, Bruce K.

    2005-01-01

    This paper reports an error in the original article by James M. Cantor, Ray Blanchard, Lori K. Robichaud, and Bruce K. Christensen ("Psychological Bulletin," 2005, Vol. 131, No. 4, pp. 555-568). As a result of an editorial error the article listed the link to online supplemental data incorrectly. The correct URL is provided here. (The following…

  4. Optimal Tuner Selection for Kalman Filter-Based Aircraft Engine Performance Estimation

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Garg, Sanjay

    2010-01-01

    A linear point design methodology for minimizing the error in on-line Kalman filter-based aircraft engine performance estimation applications is presented. This technique specifically addresses the underdetermined estimation problem, where there are more unknown parameters than available sensor measurements. A systematic approach is applied to produce a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. Tuning parameter selection is performed using a multi-variable iterative search routine which seeks to minimize the theoretical mean-squared estimation error. This paper derives theoretical Kalman filter estimation error bias and variance values at steady-state operating conditions, and presents the tuner selection routine applied to minimize these values. Results from the application of the technique to an aircraft engine simulation are presented and compared to the conventional approach of tuner selection. Experimental simulation results are found to be in agreement with theoretical predictions. The new methodology is shown to yield a significant improvement in on-line engine performance estimation accuracy.

  5. IMU-based online kinematic calibration of robot manipulator.

    PubMed

    Du, Guanglong; Zhang, Ping

    2013-01-01

    Robot calibration is a useful diagnostic method for improving the positioning accuracy in robot production and maintenance. An online robot self-calibration method based on an inertial measurement unit (IMU) is presented in this paper. The method requires that the IMU be rigidly attached to the robot manipulator, which makes it possible to obtain the orientation of the manipulator from the orientation of the IMU in real time. This paper proposes an efficient approach that incorporates the Factored Quaternion Algorithm (FQA) and a Kalman Filter (KF) to estimate the orientation of the IMU. Then, an Extended Kalman Filter (EKF) is used to estimate kinematic parameter errors. Using this proposed orientation estimation method results in improved reliability and accuracy in determining the orientation of the manipulator. Compared with existing vision-based self-calibration methods, the great advantage of this method is that it does not need complex steps, such as camera calibration, image capture, and corner detection, which makes the robot calibration procedure more autonomous in a dynamic manufacturing environment. Experimental studies on a GOOGOL GRB3016 robot show that this method has better accuracy, convenience, and effectiveness than vision-based methods.

  6. Flexible coordinate measurement system based on robot for industries

    NASA Astrophysics Data System (ADS)

    Guo, Yin; Yang, Xue-you; Liu, Chang-jie; Ye, Sheng-hua

    2010-10-01

    A robot-based flexible coordinate measurement system, applicable to multiple vehicle models, is designed to meet the needs of online measurement for current mainstream mixed body-in-white (BIW) production lines. Moderate precision, good flexibility, and the absence of blind angles are the benefits of this measurement system. For this measurement system, a monocular structured-light vision sensor has been designed, which can measure not only edges but also planes, apertures, and other features. An effective way to perform fast on-site calibration of the whole system using a laser tracker has also been proposed, which unifies the various coordinate systems in industrial fields. The experimental results show a satisfactory precision of ±0.30 mm for this measurement system, which is sufficient for the needs of online measurement of the body-in-white (BIW) in the auto production line. The system achieves real-time detection and monitoring of the whole car-body manufacturing process and provides complete data support for correcting manufacturing errors promptly and accurately and for improving manufacturing precision.

  7. Electro-Optical Inspection For Tolerance Control As An Integral Part Of A Flexible Machining Cell

    NASA Astrophysics Data System (ADS)

    Renaud, Blaise

    1986-11-01

    Institut CERAC has been involved in optical metrology and 3-dimensional surface control for the last couple of years. Among the industrial applications considered is the on-line shape evaluation of machined parts within the manufacturing cell. The specific objective is to measure the machining errors and to compare them with the tolerances set by designers. An electro-optical sensing technique has been developed which relies on a projection Moire contouring optical method. A prototype inspection system has been designed, making use of video detection and computer image processing. Moire interferograms are interpreted, and the metrological information automatically retrieved. A structured database can be generated for subsequent data analysis and for real-time closed-loop corrective actions. A real-time kernel embedded into a synchronisation network (Petri-net) for the control of concurrent processes in the Electro-Optical Inspection (EOI) station was realised and implemented in a MODULA-2 program DIN01. The prototype system for on-line automatic tolerance control taking place within a flexible machining cell is described in this paper, together with the fast-prototype synchronisation program.

  8. Improved setup and positioning accuracy using a three‐point customized cushion/mask/bite‐block immobilization system for stereotactic reirradiation of head and neck cancer

    PubMed Central

    Wang, He; Wang, Congjun; Tung, Samuel; Dimmitt, Andrew Wilson; Wong, Pei Fong; Edson, Mark A.; Garden, Adam S.; Rosenthal, David I.; Fuller, Clifton D.; Gunn, Gary B.; Takiar, Vinita; Wang, Xin A.; Luo, Dershan; Yang, James N.; Wong, Jennifer

    2016-01-01

    The purpose of this study was to investigate the setup and positioning uncertainty of a custom cushion/mask/bite-block (CMB) immobilization system and determine the PTV margin for image-guided head and neck stereotactic ablative radiotherapy (HN-SABR). We analyzed 105 treatment sessions among 21 patients treated with HN-SABR for recurrent head and neck cancers using a custom CMB immobilization system. Initial patient setup was performed using the ExacTrac infrared (IR) tracking system, and initial setup errors were based on comparison of the ExacTrac IR tracking system to corrected online ExacTrac X-ray images registered to treatment plans. Residual setup errors were determined using repeat verification X-rays. The online ExacTrac corrections were compared to cone-beam CT (CBCT) before treatment to assess agreement. Intrafractional positioning errors were determined using prebeam X-rays. The systematic and random errors were analyzed. The initial translational setup errors were -0.8±1.3 mm, -0.8±1.6 mm, and 0.3±1.9 mm in the AP, CC, and LR directions, respectively, with a three-dimensional (3D) vector of 2.7±1.4 mm. The initial rotational errors were up to 2.4° if a 6D couch is not available. CBCT agreed with ExacTrac X-ray images to within 2 mm and 2.5°. The intrafractional uncertainties were 0.1±0.6 mm, 0.1±0.6 mm, and 0.2±0.5 mm in the AP, CC, and LR directions, respectively, and 0.0°±0.5°, 0.0°±0.6°, and -0.1°±0.4° in the yaw, roll, and pitch directions, respectively. The translational vector was 0.9±0.6 mm. The calculated PTV margins mPTV(90,95) were within 1.6 mm when using image guidance for online setup correction. The use of image guidance for online setup correction, in combination with our customized CMB device, highly restricted target motion during treatments and provided robust immobilization to ensure a minimum dose of 95% to the target volume with a 2.0 mm PTV margin for HN-SABR. PACS number(s): 87.55.ne PMID:27167275
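
    For context on how such PTV margins are commonly derived from systematic (Σ) and random (σ) error components, a sketch using the widely cited van Herk recipe M = 2.5Σ + 0.7σ is given below; this is an illustration only and may not be the exact margin formalism used in the paper:

        def ptv_margin_mm(sigma_systematic_mm, sigma_random_mm):
            # Widely used population margin recipe (van Herk): M = 2.5*Sigma + 0.7*sigma,
            # targeting >=95% of the prescribed dose in 90% of patients.
            return 2.5 * sigma_systematic_mm + 0.7 * sigma_random_mm

        # Illustrative SD values of the same order as the residual/intrafraction errors reported:
        print(f"{ptv_margin_mm(0.5, 0.6):.1f} mm")   # ~1.7 mm, consistent with a ~2 mm PTV margin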

  9. TU-C-BRE-11: 3D EPID-Based in Vivo Dosimetry: A Major Step Forward Towards Optimal Quality and Safety in Radiation Oncology Practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mijnheer, B; Mans, A; Olaciregui-Ruiz, I

    Purpose: To develop a 3D in vivo dosimetry method that is able to substitute for pre-treatment verification in an efficient way, and to terminate treatment delivery if the online measured 3D dose distribution deviates too much from the predicted dose distribution. Methods: A back-projection algorithm has been further developed and implemented to enable automatic 3D in vivo dose verification of IMRT/VMAT treatments using a-Si EPIDs. New software tools were clinically introduced to allow automated image acquisition, to periodically inspect the record-and-verify database, and to automatically run the EPID dosimetry software. The comparison of the EPID-reconstructed and planned dose distribution is done offline to automatically raise alerts and to schedule actions when deviations are detected. Furthermore, a software package for online dose reconstruction was also developed. The RMS of the difference between the cumulative planned and reconstructed 3D dose distributions was used for triggering a halt of a linac. Results: The implementation of fully automated 3D EPID-based in vivo dosimetry was able to replace pre-treatment verification for more than 90% of the patient treatments. The process has been fully automated and integrated in our clinical workflow where over 3,500 IMRT/VMAT treatments are verified each year. By optimizing the dose reconstruction algorithm and the I/O performance, the delivered 3D dose distribution is verified in less than 200 ms per portal image, which includes the comparison between the reconstructed and planned dose distribution. In this way it was possible to generate a trigger that can stop the irradiation at less than 20 cGy after introducing large delivery errors. Conclusion: The automatic offline solution facilitated the large scale clinical implementation of 3D EPID-based in vivo dose verification of IMRT/VMAT treatments; the online approach has been successfully tested for various severe delivery errors.
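
    The online component described above hinges on a simple decision rule: after each portal frame, compare the cumulative reconstructed 3D dose with the planned dose and request a beam hold when the RMS difference grows too large. A schematic version follows; the tolerance value is an assumption, not the clinical threshold:

        import numpy as np

        def should_halt(planned_cum, reconstructed_cum, rms_tol_gy=0.2):
            # RMS of the voxel-wise difference between cumulative planned and
            # EPID-reconstructed 3D dose; trigger a beam hold when it exceeds tolerance.
            rms = np.sqrt(np.mean((reconstructed_cum - planned_cum) ** 2))
            return rms > rms_tol_gy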

  10. Software error detection

    NASA Technical Reports Server (NTRS)

    Buechler, W.; Tucker, A. G.

    1981-01-01

    Several methods were employed to detect both the occurrence and source of errors in the operational software of the AN/SLQ-32, a large embedded real-time electronic warfare command and control system for the ROLM 1606 computer. The ROLM computer provides information about invalid addressing, improper use of privileged instructions, stack overflows, and unimplemented instructions. Additionally, software techniques were developed to detect invalid jumps, indices out of range, infinite loops, stack underflows, and field size errors. Finally, data are saved to provide information about the status of the system when an error is detected. This information includes I/O buffers, interrupt counts, stack contents, and recently passed locations. The various errors detected, techniques to assist in debugging problems, and segment simulation on a nontarget computer are discussed. These error detection techniques were a major factor in the success of finding the primary cause of error in 98% of over 500 system dumps.

  11. Principal Candidates Create Decision-Making Simulations to Prepare for the JOB

    ERIC Educational Resources Information Center

    Staub, Nancy A.; Bravender, Marlena

    2014-01-01

    Online simulations offer opportunities for trial and error decision-making. What better tool for a principal than to make decisions when the consequences will not have real-world ramifications. In this study, two groups of graduate students in a principal preparation program taking the same course in the same semester use online simulations…

  12. Understanding statements now a virtual cinch.

    PubMed

    Weber, Danielle B; Talaga, John

    2005-04-01

    With a click of the mouse, some patients are accessing and paying their hospital bills online. Novant Health revamped its patient billing process so it's easier to understand and use. Developing a clear, concise billing statement and then implementing an online bill presentment and payment system resulted in improved customer relations, fewer payment processing errors, and faster receipt of payment.

  13. Help! Active Student Learning and Error Remediation in an Online Calculus e-Help Community

    ERIC Educational Resources Information Center

    van de Sande, Carla; Leinhardt, Gaea

    2007-01-01

    Free, open, online homework help sites appear to be extremely popular and exist for many school subjects. Students can anonymously post problems at their convenience and receive responses from forum members. This mode of tutoring may be especially critical for school subjects such as calculus that are intrinsically challenging and have high…

  14. Increasing Students' Awareness of Their Behavior in Online Learning Environments with Visualizations and Achievement Badges

    ERIC Educational Resources Information Center

    Auvinen, Tapio; Hakulinen, Lasse; Malmi, Lauri

    2015-01-01

    In online learning environments where automatic assessment is used, students often resort to harmful study practices such as procrastination and trial-and-error. In this paper, we study two teaching interventions that were designed to address these issues in a university-level computer science course. In the first intervention, we used achievement…

  15. Performance analysis of a concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Costello, D. J., Jr.; Lin, S.; Kasami, T.

    1983-01-01

    A concatenated coding scheme for error control in data communications is analyzed. In this scheme, the inner code is used for both error correction and detection, whereas the outer code is used only for error detection. A retransmission is requested if the outer code detects the presence of errors after the inner code decoding. The probability of undetected error is derived and bounded. A particular example, proposed for the planetary program, is analyzed.

  16. Probability of undetected error after decoding for a concatenated coding scheme

    NASA Technical Reports Server (NTRS)

    Costello, D. J., Jr.; Lin, S.

    1984-01-01

    A concatenated coding scheme for error control in data communications is analyzed. In this scheme, the inner code is used for both error correction and detection, whereas the outer code is used only for error detection. A retransmission is requested if the outer code detects the presence of errors after the inner code decoding. The probability of undetected error is derived and bounded. A particular example, proposed for the NASA telecommand system, is analyzed.
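
    The two records above analyze the same structure: an inner code that corrects (and occasionally miscorrects) channel errors and an outer code used purely for detection, with retransmission on detected errors. A toy Monte Carlo sketch in the same spirit, with a Hamming(7,4) inner code and a simple additive checksum as the outer detection code; both code choices and all parameters are illustrative, not the papers' codes:

        import random
        random.seed(0)

        def hamming_encode(d):            # Hamming(7,4): single-error-correcting inner code
            d1, d2, d3, d4 = d
            return [d1 ^ d2 ^ d4, d1 ^ d3 ^ d4, d1, d2 ^ d3 ^ d4, d2, d3, d4]

        def hamming_decode(c):
            c = list(c)
            s = (c[0]^c[2]^c[4]^c[6]) + 2*(c[1]^c[2]^c[5]^c[6]) + 4*(c[3]^c[4]^c[5]^c[6])
            if s:
                c[s - 1] ^= 1             # correct the indicated position (wrong if >1 bit error)
            return [c[2], c[4], c[5], c[6]]

        def checksum(nibbles):            # toy 16-bit additive outer detection code
            return sum(v << (4 * (i % 4)) for i, v in enumerate(nibbles)) & 0xFFFF

        def send_frame(data_nibbles, p):
            # Transmit one frame over a binary symmetric channel with bit error rate p.
            # Returns (accepted, undetected); an outer-check failure triggers retransmission.
            chk = checksum(data_nibbles)
            nibbles = data_nibbles + [(chk >> (4 * i)) & 0xF for i in range(4)]
            received = []
            for v in nibbles:
                bits = [(v >> k) & 1 for k in range(4)]
                cw = [b ^ (random.random() < p) for b in hamming_encode(bits)]
                out = hamming_decode(cw)
                received.append(sum(b << k for k, b in enumerate(out)))
            data, chk_rx = received[:-4], received[-4:]
            ok = checksum(data) == sum(v << (4 * i) for i, v in enumerate(chk_rx))
            return ok, ok and data != data_nibbles

        frames, undetected, retransmissions = 10000, 0, 0
        for _ in range(frames):
            data = [random.randrange(16) for _ in range(16)]
            ok, bad = send_frame(data, p=0.02)
            undetected += bad
            retransmissions += (not ok)
        print(f"retransmission rate: {retransmissions / frames:.3f}, "
              f"undetected-error rate: {undetected / frames:.2e}")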

  17. Investigating Perceptual Biases, Data Reliability, and Data Discovery in a Methodology for Collecting Speech Errors From Audio Recordings.

    PubMed

    Alderete, John; Davies, Monica

    2018-04-01

    This work describes a methodology of collecting speech errors from audio recordings and investigates how some of its assumptions affect data quality and composition. Speech errors of all types (sound, lexical, syntactic, etc.) were collected by eight data collectors from audio recordings of unscripted English speech. Analysis of these errors showed that: (i) different listeners find different errors in the same audio recordings, but (ii) the frequencies of error patterns are similar across listeners; (iii) errors collected "online" using on the spot observational techniques are more likely to be affected by perceptual biases than "offline" errors collected from audio recordings; and (iv) datasets built from audio recordings can be explored and extended in a number of ways that traditional corpus studies cannot be.

  18. An accurate algorithm for the detection of DNA fragments from dilution pool sequencing experiments.

    PubMed

    Bansal, Vikas

    2018-01-01

    The short read lengths of current high-throughput sequencing technologies limit the ability to recover long-range haplotype information. Dilution pool methods for preparing DNA sequencing libraries from high molecular weight DNA fragments enable the recovery of long DNA fragments from short sequence reads. These approaches require computational methods for identifying the DNA fragments using aligned sequence reads and assembling the fragments into long haplotypes. Although a number of computational methods have been developed for haplotype assembly, the problem of identifying DNA fragments from dilution pool sequence data has not received much attention. We formulate the problem of detecting DNA fragments from dilution pool sequencing experiments as a genome segmentation problem and develop an algorithm that uses dynamic programming to optimize a likelihood function derived from a generative model for the sequence reads. This algorithm uses an iterative approach to automatically infer the mean background read depth and the number of fragments in each pool. Using simulated data, we demonstrate that our method, FragmentCut, has 25-30% greater sensitivity compared with an HMM based method for fragment detection and can also detect overlapping fragments. On a whole-genome human fosmid pool dataset, the haplotypes assembled using the fragments identified by FragmentCut had greater N50 length, 16.2% lower switch error rate and 35.8% lower mismatch error rate compared with two existing methods. We further demonstrate the greater accuracy of our method using two additional dilution pool datasets. FragmentCut is available from https://bansal-lab.github.io/software/FragmentCut. vibansal@ucsd.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
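
    FragmentCut itself is available at the URL above; purely to illustrate the underlying idea of likelihood-based segmentation of a read-depth profile by dynamic programming, here is a toy O(n^2) sketch with a constant-rate Poisson segment model and a fixed per-segment penalty. The real method's generative model, background-depth inference, and iteration are not reproduced:

        import numpy as np

        def segment_depth(depth, penalty=15.0):
            # Optimal partition of a binned read-depth profile into constant-rate Poisson
            # segments: maximize the sum of segment log-likelihoods minus a per-segment penalty.
            n = len(depth)
            csum = np.concatenate([[0.0], np.cumsum(depth)])
            best = np.full(n + 1, -np.inf)
            best[0] = 0.0
            back = np.zeros(n + 1, dtype=int)
            for j in range(1, n + 1):
                for i in range(j):
                    s, m = csum[j] - csum[i], j - i
                    rate = max(s, 1e-9) / m                  # MLE of the segment's Poisson mean
                    ll = s * np.log(rate) - m * rate         # Poisson log-likelihood (x! term is constant, dropped)
                    if best[i] + ll - penalty > best[j]:
                        best[j] = best[i] + ll - penalty
                        back[j] = i
            bounds, j = [], n
            while j > 0:                                     # backtrack segment boundaries
                bounds.append((back[j], j))
                j = back[j]
            return bounds[::-1]

        rng = np.random.default_rng(2)
        depth = np.concatenate([rng.poisson(2, 150),         # background coverage
                                rng.poisson(20, 100),        # a pooled high-coverage fragment
                                rng.poisson(2, 150)])
        print(segment_depth(depth))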

  19. A Swiss cheese error detection method for real-time EPID-based quality assurance and error prevention.

    PubMed

    Passarge, Michelle; Fix, Michael K; Manser, Peter; Stampanoni, Marco F M; Siebers, Jeffrey V

    2017-04-01

    To develop a robust and efficient process that detects relevant dose errors (dose errors of ≥5%) in external beam radiation therapy and directly indicates the origin of the error. The process is illustrated in the context of electronic portal imaging device (EPID)-based angle-resolved volumetric-modulated arc therapy (VMAT) quality assurance (QA), particularly as would be implemented in a real-time monitoring program. A Swiss cheese error detection (SCED) method was created as a paradigm for a cine EPID-based during-treatment QA. For VMAT, the method compares a treatment plan-based reference set of EPID images with images acquired over each 2° gantry angle interval. The process utilizes a sequence of independent consecutively executed error detection tests: an aperture check that verifies in-field radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment check to examine if rotation, scaling, and translation are within tolerances; pixel intensity check containing the standard gamma evaluation (3%, 3 mm) and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each check were determined. To test the SCED method, 12 different types of errors were selected to modify the original plan. A series of angle-resolved predicted EPID images were artificially generated for each test case, resulting in a sequence of precalculated frames for each modified treatment plan. The SCED method was applied multiple times for each test case to assess the ability to detect introduced plan variations. To compare the performance of the SCED process with that of a standard gamma analysis, both error detection methods were applied to the generated test cases with realistic noise variations. Averaged over ten test runs, 95.1% of all plan variations that resulted in relevant patient dose errors were detected within 2° and 100% within 14° (<4% of patient dose delivery). Including cases that led to slightly modified but clinically equivalent plans, 89.1% were detected by the SCED method within 2°. Based on the type of check that detected the error, determination of error sources was achieved. With noise ranging from no random noise to four times the established noise value, the averaged relevant dose error detection rate of the SCED method was between 94.0% and 95.8% and that of gamma between 82.8% and 89.8%. An EPID-frame-based error detection process for VMAT deliveries was successfully designed and tested via simulations. The SCED method was inspected for robustness with realistic noise variations, demonstrating that it has the potential to detect a large majority of relevant dose errors. Compared to a typical (3%, 3 mm) gamma analysis, the SCED method produced a higher detection rate for all introduced dose errors, identified errors in an earlier stage, displayed a higher robustness to noise variations, and indicated the error source. © 2017 American Association of Physicists in Medicine.
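
    The essence of the SCED approach is a fixed sequence of cheap, independent checks applied to every angle-resolved EPID frame, with the first failing check pointing at the error source. A much-simplified sketch of such a pipeline follows; the tolerances are placeholders, and the full gamma evaluation and high-gradient-region handling are omitted:

        import numpy as np

        def sced_checks(ref_img, meas_img, norm_tol=0.03, shift_tol_px=3, pix_tol=0.05):
            # Run a fixed sequence of independent checks on one angle-resolved EPID frame
            # and return the name of the first failing check (None if all pass).
            field = ref_img > 0.1 * ref_img.max()                  # crude in-field mask
            # 1. Aperture check: no significant signal outside the reference field.
            if (~field).any() and meas_img[~field].max() > 0.1 * ref_img.max():
                return "aperture"
            # 2. Output check: integral signal within tolerance of the reference.
            if abs(meas_img.sum() / ref_img.sum() - 1.0) > norm_tol:
                return "output"
            # 3. Alignment check: translation estimated from the cross-correlation peak.
            xc = np.fft.ifft2(np.fft.fft2(ref_img) * np.conj(np.fft.fft2(meas_img)))
            dy, dx = np.unravel_index(np.argmax(np.abs(xc)), xc.shape)
            dy = dy - ref_img.shape[0] if dy > ref_img.shape[0] // 2 else dy
            dx = dx - ref_img.shape[1] if dx > ref_img.shape[1] // 2 else dx
            if max(abs(dy), abs(dx)) > shift_tol_px:
                return "alignment"
            # 4. Pixel-intensity check: relative in-field deviation (gamma test omitted here).
            if (np.abs(meas_img[field] - ref_img[field]) / ref_img.max()).max() > pix_tol:
                return "pixel_intensity"
            return None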

  20. Competitive action video game players display rightward error bias during on-line video game play.

    PubMed

    Roebuck, Andrew J; Dubnyk, Aurora J B; Cochran, David; Mandryk, Regan L; Howland, John G; Harms, Victoria

    2017-09-12

    Research in asymmetrical visuospatial attention has identified a leftward bias in the general population across a variety of measures including visual attention and line-bisection tasks. In addition, increases in rightward collisions, or bumping, during visuospatial navigation tasks have been demonstrated in real world and virtual environments. However, little research has investigated these biases beyond the laboratory. The present study uses a semi-naturalistic approach and the online video game streaming service Twitch to examine navigational errors and assaults as skilled action video game players (n = 60) compete in Counter Strike: Global Offensive. This study showed a significant rightward bias in both fatal assaults and navigational errors. Analysis using the in-game ranking system as a measure of skill failed to show a relationship between bias and skill. These results suggest that a leftward visuospatial bias may exist in skilled players during online video game play. However, the present study was unable to account for some factors such as environmental symmetry and player handedness. In conclusion, video game streaming is a promising method for behavioural research in the future, however further study is required before one can determine whether these results are an artefact of the method applied, or representative of a genuine rightward bias.

  1. SU-F-T-471: Simulated External Beam Delivery Errors Detection with a Large Area Ion Chamber Transmission Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, D; Dyer, B; Kumaran Nair, C

    Purpose: The Integral Quality Monitor (IQM), developed by iRT Systems GmbH (Koblenz, Germany) is a large-area, linac-mounted ion chamber used to monitor photon fluence during patient treatment. Our previous work evaluated the change of the ion chamber’s response to deviations from static 1×1 cm2 and 10×10 cm2 photon beams and other characteristics integral to use in external beam detection. The aim of this work is to simulate two external beam radiation delivery errors, quantify the detection of simulated errors and evaluate the reduction in patient harm resulting from detection. Methods: Two well documented radiation oncology delivery errors were selected for simulation. The first error was recreated by modifying a wedged whole breast treatment, removing the physical wedge and calculating the planned dose with Pinnacle TPS (Philips Radiation Oncology Systems, Fitchburg, WI). The second error was recreated by modifying a static-gantry IMRT pharyngeal tonsil plan to be delivered in 3 unmodulated fractions. A radiation oncologist evaluated the dose for simulated errors and predicted morbidity and mortality commensurate with the originally reported toxicity, indicating that the reported errors were approximately simulated. The ion chamber signal of unmodified treatments was compared to the simulated error signal and evaluated in Pinnacle TPS again with radiation oncologist prediction of simulated patient harm. Results: Previous work established that transmission detector system measurements are stable within 0.5% standard deviation (SD). Errors causing signal change greater than 20 SD (10%) were considered detected. The whole breast and pharyngeal tonsil IMRT simulated errors increased the signal by 215% and 969%, respectively, indicating error detection after the first fraction and IMRT segment, respectively. Conclusion: The transmission detector system demonstrated utility in detecting clinically significant errors and reducing patient toxicity/harm in simulated external beam delivery. Future work will evaluate detection of other smaller magnitude delivery errors.
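
    The 20-SD rule quoted above amounts to flagging any delivery whose integrated chamber signal deviates from the expected value by more than 10% (20 x 0.5%). A minimal check of that rule, with invented signal values, might look like the following.

```python
def signal_error_detected(measured, expected, rel_sd=0.005, n_sd=20):
    """Flag a delivery if the relative signal change exceeds n_sd standard
    deviations of the transmission detector's baseline variation."""
    threshold = n_sd * rel_sd                    # 20 x 0.5% = 10%
    rel_change = abs(measured - expected) / expected
    return rel_change > threshold, rel_change

if __name__ == "__main__":
    # illustrative numbers only: a 215% signal increase is far above 10%
    flagged, change = signal_error_detected(measured=3.15, expected=1.00)
    print(flagged, f"{change:.0%}")              # True 215%
```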

  2. Online boosting for vehicle detection.

    PubMed

    Chang, Wen-Chung; Cho, Chih-Wei

    2010-06-01

    This paper presents a real-time vision-based vehicle detection system employing an online boosting algorithm. It is an online AdaBoost approach for a cascade of strong classifiers instead of a single strong classifier. Most existing cascades of classifiers must be trained offline and cannot effectively be updated when online tuning is required. The idea is to develop a cascade of strong classifiers for vehicle detection that is capable of being online trained in response to changing traffic environments. To make the online algorithm tractable, the proposed system must efficiently tune parameters based on incoming images and up-to-date performance of each weak classifier. The proposed online boosting method can improve system adaptability and accuracy to deal with novel types of vehicles and unfamiliar environments, whereas existing offline methods rely much more on extensive training processes to reach comparable results and cannot further be updated online. Our approach has been successfully validated in real traffic environments by performing experiments with an onboard charge-coupled-device camera in a roadway vehicle.
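
    One widely used way to boost online, in the spirit of the adaptive cascade described here, is an Oza-Russell-style update: each incoming sample is shown to each weak learner a Poisson-distributed number of times, and the sample's importance weight grows whenever a learner misclassifies it. The sketch below is a simplification, not the authors' cascade, and assumes weak learners exposing hypothetical update(x, y) and predict(x) methods with binary 0/1 labels.

```python
import math
import random

def poisson_sample(lam):
    """Draw from a Poisson distribution (Knuth's method)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

class OnlineBooster:
    """Simplified Oza-Russell-style online boosting over a fixed set of weak
    learners; the update(x, y)/predict(x) learner interface is an assumption."""

    def __init__(self, weak_learners):
        self.learners = weak_learners
        self.correct = [1e-9] * len(weak_learners)  # weighted correct counts
        self.wrong = [1e-9] * len(weak_learners)    # weighted mistake counts

    def update(self, x, y):
        lam = 1.0  # importance weight of this sample
        for i, learner in enumerate(self.learners):
            for _ in range(poisson_sample(lam)):    # present sample ~Poisson(lam) times
                learner.update(x, y)
            if learner.predict(x) == y:
                self.correct[i] += lam
                lam *= (self.correct[i] + self.wrong[i]) / (2.0 * self.correct[i])
            else:
                self.wrong[i] += lam
                lam *= (self.correct[i] + self.wrong[i]) / (2.0 * self.wrong[i])

    def predict(self, x):
        score = 0.0
        for i, learner in enumerate(self.learners):
            eps = self.wrong[i] / (self.correct[i] + self.wrong[i])
            alpha = math.log((1.0 - eps) / max(eps, 1e-9))
            score += alpha * (1.0 if learner.predict(x) == 1 else -1.0)
        return 1 if score >= 0.0 else 0
```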

  3. Statistically qualified neuro-analytic failure detection method and system

    DOEpatents

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic model modification of the deterministic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.

  4. CBrowse: a SAM/BAM-based contig browser for transcriptome assembly visualization and analysis.

    PubMed

    Li, Pei; Ji, Guoli; Dong, Min; Schmidt, Emily; Lenox, Douglas; Chen, Liangliang; Liu, Qi; Liu, Lin; Zhang, Jie; Liang, Chun

    2012-09-15

    To address the impending need for exploring rapidly increased transcriptomics data generated for non-model organisms, we developed CBrowse, an AJAX-based web browser for visualizing and analyzing transcriptome assemblies and contigs. Designed in a standard three-tier architecture with a data pre-processing pipeline, CBrowse is essentially a Rich Internet Application that offers many seamlessly integrated web interfaces and allows users to navigate, sort, filter, search and visualize data smoothly. The pre-processing pipeline takes the contig sequence file in FASTA format and its relevant SAM/BAM file as the input; detects putative polymorphisms, simple sequence repeats and sequencing errors in contigs and generates image, JSON and database-compatible CSV text files that are directly utilized by different web interfaces. CBrowse is a generic visualization and analysis tool that facilitates close examination of assembly quality, genetic polymorphisms, sequence repeats and/or sequencing errors in transcriptome sequencing projects. CBrowse is distributed under the GNU General Public License, available at http://bioinfolab.muohio.edu/CBrowse/ liangc@muohio.edu or liangc.mu@gmail.com; glji@xmu.edu.cn Supplementary data are available at Bioinformatics online.

  5. TH-AB-202-04: Auto-Adaptive Margin Generation for MLC-Tracked Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glitzner, M; Lagendijk, J; Raaymakers, B

    Purpose: To develop an auto-adaptive margin generator for MLC tracking. The generator is able to estimate errors arising in image guided radiotherapy, particularly on an MR-Linac, which depend on the latencies of machine and image processing, as well as on patient motion characteristics. From the estimated error distribution, a segment margin is generated, able to compensate errors up to a user-defined confidence. Method: In every tracking control cycle (TCC, 40ms), the desired aperture D(t) is compared to the actual aperture A(t), a delayed and imperfect representation of D(t). Thus an error e(t)=A(t)-D(t) is measured every TCC. Applying kernel-density-estimation (KDE), the cumulative distribution (CDF) of e(t) is estimated. With CDF-confidence limits, upper and lower error limits are extracted for motion axes along and perpendicular to the leaf-travel direction and applied as margins. To test the dosimetric impact, two representative motion traces were extracted from fast liver-MRI (10Hz). The traces were applied onto a 4D-motion platform and continuously tracked by an Elekta Agility 160 MLC using an artificially imposed tracking delay. Gafchromic film was used to detect dose exposition for static, tracked, and error-compensated tracking cases. The margin generator was parameterized to cover 90% of all tracking errors. Dosimetric impact was rated by calculating the ratio of underexposed points (>5% underdosage) to the total number of points inside the FWHM of the static exposure. Results: Without imposing adaptive margins, tracking experiments showed a ratio of underexposed points of 17.5% and 14.3% for two motion cases with imaging delays of 200ms and 300ms, respectively. Activating the margin generator yielded total suppression (<1%) of underdosed points. Conclusion: We showed that auto-adaptive error compensation using machine error statistics is possible for MLC tracking. The error compensation margins are calculated on-line, without the need to assume motion or machine models. Further strategies to reduce consequential overdosages are currently under investigation. This work was funded by the SoRTS consortium, which includes the industry partners Elekta, Philips and Technolution.
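
    The margin logic described above, estimating the error distribution online and taking confidence bounds from its CDF, can be approximated with empirical quantiles of the recent tracking errors. The paper itself uses a kernel density estimate, so the snippet below is a simplified stand-in with invented numbers, not the published generator.

```python
import numpy as np

def adaptive_margins(errors_mm, coverage=0.90):
    """Return (lower, upper) margin limits in mm that cover the requested
    fraction of recently observed tracking errors (desired - actual aperture)."""
    errors = np.asarray(errors_mm, dtype=float)
    tail = (1.0 - coverage) / 2.0
    lower = np.quantile(errors, tail)        # e.g. 5th percentile
    upper = np.quantile(errors, 1.0 - tail)  # e.g. 95th percentile
    return lower, upper

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # pretend 40 ms control-cycle errors, dominated by a ~300 ms tracking delay
    errors = rng.normal(loc=0.5, scale=1.2, size=2000)
    print(adaptive_margins(errors))  # roughly (-1.5, 2.5) mm for 90% coverage
```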

  6. Effects of Contextual Sight-Singing and Aural Skills Training on Error-Detection Abilities.

    ERIC Educational Resources Information Center

    Sheldon, Deborah A.

    1998-01-01

    Examines the effects of contextual sight-singing and ear training on pitch and rhythm error detection abilities among undergraduate instrumental music education majors. Shows that additional training produced better error detection, particularly with rhythm errors and in one-part examples. Maintains that differences attributable to texture were…

  7. MultiGeMS: detection of SNVs from multiple samples using model selection on high-throughput sequencing data.

    PubMed

    Murillo, Gabriel H; You, Na; Su, Xiaoquan; Cui, Wei; Reilly, Muredach P; Li, Mingyao; Ning, Kang; Cui, Xinping

    2016-05-15

    Single nucleotide variant (SNV) detection procedures are being utilized as never before to analyze the recent abundance of high-throughput DNA sequencing data, both on single and multiple sample datasets. Building on previously published work with the single sample SNV caller genotype model selection (GeMS), a multiple sample version of GeMS (MultiGeMS) is introduced. Unlike other popular multiple sample SNV callers, the MultiGeMS statistical model accounts for enzymatic substitution sequencing errors. It also addresses the multiple testing problem endemic to multiple sample SNV calling and utilizes high performance computing (HPC) techniques. A simulation study demonstrates that MultiGeMS ranks highest in precision among a selection of popular multiple sample SNV callers, while showing exceptional recall in calling common SNVs. Further, both simulation studies and real data analyses indicate that MultiGeMS is robust to low-quality data. We also demonstrate that accounting for enzymatic substitution sequencing errors not only improves SNV call precision at low mapping quality regions, but also improves recall at reference allele-dominated sites with high mapping quality. The MultiGeMS package can be downloaded from https://github.com/cui-lab/multigems xinping.cui@ucr.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. Classification of mislabelled microarrays using robust sparse logistic regression.

    PubMed

    Bootkrajang, Jakramate; Kabán, Ata

    2013-04-01

    Previous studies reported that labelling errors are not uncommon in microarray datasets. In such cases, the training set may become misleading, and the ability of classifiers to make reliable inferences from the data is compromised. Yet, few methods are currently available in the bioinformatics literature to deal with this problem. The few existing methods focus on data cleansing alone, without reference to classification, and their performance crucially depends on some tuning parameters. In this article, we develop a new method to detect mislabelled arrays simultaneously with learning a sparse logistic regression classifier. Our method may be seen as a label-noise robust extension of the well-known and successful Bayesian logistic regression classifier. To account for possible mislabelling, we formulate a label-flipping process as part of the classifier. The regularization parameter is automatically set using Bayesian regularization, which not only saves the computation time that cross-validation would take, but also eliminates any unwanted effects of label noise when setting the regularization parameter. Extensive experiments with both synthetic data and real microarray datasets demonstrate that our approach is able to counter the bad effects of labelling errors in terms of predictive performance, it is effective at identifying marker genes and simultaneously it detects mislabelled arrays to high accuracy. The code is available from http://cs.bham.ac.uk/∼jxb008. Supplementary data are available at Bioinformatics online.

  9. Erratum to "Combustion oscillation study in a kerosene fueled rocket-based combined-cycle engine combustor" [Acta Astronaut. 129 (2016) 260-270]

    NASA Astrophysics Data System (ADS)

    Huang, Zhi-Wei; He, Guo-Qiang; Qin, Fei; Xue, Rui; Wei, Xiang-Geng; Shi, Lei

    2017-03-01

    The publisher regrets that in the above article we found that Table 1 is present online, in the html version in ScienceDirect, but has been omitted in error from the final version of the PDF online and in the print version. The table can be found below:

  10. Online benefits solutions--a new trend in managing employee benefits programs.

    PubMed

    Ala, Mohammad; Brunaczki, Bernadette

    2003-01-01

    This article focuses on the array of online benefits solutions offered by technology companies and reports the benefits to both employers and employees. Some of the benefits include reduced paperwork, reduced errors, and reduced administration costs. Companies that can deliver these benefits will be in great demand to help manage benefits programs and streamline the administrative processes.

  11. Disparity between online and offline tests in accelerated aging tests of LED lamps under electric stress.

    PubMed

    Wang, Yao; Jing, Lei; Ke, Hong-Liang; Hao, Jian; Gao, Qun; Wang, Xiao-Xun; Sun, Qiang; Xu, Zhi-Jun

    2016-09-20

    The accelerated aging tests under electric stress for one type of LED lamp are conducted, and the differences between online and offline tests of the degradation of luminous flux are studied in this paper. The transformation of the two test modes is achieved with an adjustable AC voltage stabilized power source. Experimental results show that the exponential fitting of the luminous flux degradation in online tests possesses a higher fitting degree for most lamps, and the degradation rate of the luminous flux by online tests is always lower than that by offline tests. Bayes estimation and Weibull distribution are used to calculate the failure probabilities under the accelerated voltages, and then the reliability of the lamps under rated voltage of 220 V is estimated by use of the inverse power law model. Results show that the relative error of the lifetime estimation by offline tests increases as the failure probability decreases, and it cannot be neglected when the failure probability is less than 1%. The relative errors of lifetime estimation are 7.9%, 5.8%, 4.2%, and 3.5%, at the failure probabilities of 0.1%, 1%, 5%, and 10%, respectively.
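
    The inverse power law used for the extrapolation, lifetime L(V) = A·V^(-n), is linear in log-log space, so accelerated-stress lifetimes can be fitted by simple regression and extrapolated to the rated 220 V. The voltages and lifetimes below are invented for illustration and are not the paper's data.

```python
import numpy as np

def fit_inverse_power_law(voltages, lifetimes):
    """Fit log(L) = log(A) - n*log(V) and return (A, n)."""
    slope, intercept = np.polyfit(np.log(voltages), np.log(lifetimes), 1)
    return np.exp(intercept), -slope

if __name__ == "__main__":
    # hypothetical accelerated-test lifetimes (hours) at elevated voltages
    voltages = np.array([260.0, 280.0, 300.0])
    lifetimes = np.array([9000.0, 6500.0, 4800.0])
    A, n = fit_inverse_power_law(voltages, lifetimes)
    print(f"n = {n:.1f}, predicted lifetime at 220 V = {A * 220.0 ** -n:.0f} h")
```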

  12. Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection

    PubMed Central

    Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem

    2013-01-01

    The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model through remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by the covariance matrix descriptor encoding the moving information, then is classified into a normal or an abnormal frame. Experiments are conducted, on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset, to demonstrate the promising results of the proposed online LS-OC-SVM method. PMID:24351629

  13. Online least squares one-class support vector machines-based abnormal visual event detection.

    PubMed

    Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem

    2013-12-12

    The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model through remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by the covariance matrix descriptor encoding the moving information, then is classified into a normal or an abnormal frame. Experiments are conducted, on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset, to demonstrate the promising results of the proposed online LS-OC-SVM method.

  14. Online Recorded Data-Based Composite Neural Control of Strict-Feedback Systems With Application to Hypersonic Flight Dynamics.

    PubMed

    Xu, Bin; Yang, Daipeng; Shi, Zhongke; Pan, Yongping; Chen, Badong; Sun, Fuchun

    2017-09-25

    This paper investigates the online recorded data-based composite neural control of uncertain strict-feedback systems using the backstepping framework. In each step of the virtual control design, neural network (NN) is employed for uncertainty approximation. In previous works, most designs are directly toward system stability ignoring the fact how the NN is working as an approximator. In this paper, to enhance the learning ability, a novel prediction error signal is constructed to provide additional correction information for NN weight update using online recorded data. In this way, the neural approximation precision is highly improved, and the convergence speed can be faster. Furthermore, the sliding mode differentiator is employed to approximate the derivative of the virtual control signal, and thus, the complex analysis of the backstepping design can be avoided. The closed-loop stability is rigorously established, and the boundedness of the tracking error can be guaranteed. Through simulation of hypersonic flight dynamics, the proposed approach exhibits better tracking performance.

  15. Errors detected in pediatric oral liquid medication doses prepared in an automated workflow management system.

    PubMed

    Bledsoe, Sarah; Van Buskirk, Alex; Falconer, R James; Hollon, Andrew; Hoebing, Wendy; Jokic, Sladan

    2018-02-01

    The effectiveness of barcode-assisted medication preparation (BCMP) technology in detecting oral liquid dose preparation errors was evaluated. From June 1, 2013, through May 31, 2014, a total of 178,344 oral doses were processed at Children's Mercy, a 301-bed pediatric hospital, through an automated workflow management system. Doses containing errors detected by the system's barcode scanning system or classified as rejected by the pharmacist were further reviewed. Errors intercepted by the barcode-scanning system were classified as (1) expired product, (2) incorrect drug, (3) incorrect concentration, and (4) technological error. Pharmacist-rejected doses were categorized into 6 categories based on the root cause of the preparation error: (1) expired product, (2) incorrect concentration, (3) incorrect drug, (4) incorrect volume, (5) preparation error, and (6) other. Of the 178,344 doses examined, 3,812 (2.1%) errors were detected by either the barcode-assisted scanning system (1.8%, n = 3,291) or a pharmacist (0.3%, n = 521). The 3,291 errors prevented by the barcode-assisted system were classified most commonly as technological error and incorrect drug, followed by incorrect concentration and expired product. Errors detected by pharmacists were also analyzed. These 521 errors were most often classified as incorrect volume, preparation error, expired product, other, incorrect drug, and incorrect concentration. BCMP technology detected errors in 1.8% of pediatric oral liquid medication doses prepared in an automated workflow management system, with errors being most commonly attributed to technological problems or incorrect drugs. Pharmacists rejected an additional 0.3% of studied doses. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
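
    The headline rates follow directly from the raw counts, which makes for a quick consistency check.

```python
total = 178_344
barcode_errors = 3_291
pharmacist_rejects = 521

for label, n in [("barcode-detected", barcode_errors),
                 ("pharmacist-rejected", pharmacist_rejects),
                 ("total detected", barcode_errors + pharmacist_rejects)]:
    print(f"{label}: {n} / {total} = {100 * n / total:.1f}%")
# barcode-detected: 1.8%, pharmacist-rejected: 0.3%, total detected: 2.1%
```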

  16. Online Coregularization for Multiview Semisupervised Learning

    PubMed Central

    Li, Guohui; Huang, Kuihua

    2013-01-01

    We propose a novel online coregularization framework for multiview semisupervised learning based on the notion of duality in constrained optimization. Using the weak duality theorem, we reduce the online coregularization to the task of increasing the dual function. We demonstrate that the existing online coregularization algorithms in previous work can be viewed as an approximation of our dual ascending process using gradient ascent. New algorithms are derived based on the idea of ascending the dual function more aggressively. For practical purpose, we also propose two sparse approximation approaches for kernel representation to reduce the computational complexity. Experiments show that our derived online coregularization algorithms achieve risk and accuracy comparable to offline algorithms while consuming less time and memory. In particular, our online coregularization algorithms are able to deal with concept drift and maintain a much smaller error rate. This paper paves a way to the design and analysis of online coregularization algorithms. PMID:24194680

  17. Real-time detection of faecally contaminated drinking water with tryptophan-like fluorescence: defining threshold values.

    PubMed

    Sorensen, James P R; Baker, Andy; Cumberland, Susan A; Lapworth, Dan J; MacDonald, Alan M; Pedley, Steve; Taylor, Richard G; Ward, Jade S T

    2018-05-01

    We assess the use of fluorescent dissolved organic matter at excitation-emission wavelengths of 280 nm and 360 nm, termed tryptophan-like fluorescence (TLF), as an indicator of faecally contaminated drinking water. A significant logistic regression model was developed using TLF as a predictor of thermotolerant coliforms (TTCs) using data from groundwater- and surface water-derived drinking water sources in India, Malawi, South Africa and Zambia. A TLF threshold of 1.3 ppb dissolved tryptophan was selected to classify TTC contamination. Validation of the TLF threshold indicated a false-negative error rate of 15% and a false-positive error rate of 18%. The threshold was unsuccessful at classifying contaminated sources containing <10 TTC cfu per 100 mL, which we consider the current limit of detection. If only sources above this limit were classified, the false-negative error rate was very low at 4%. TLF intensity was very strongly correlated with TTC concentration (ρs = 0.80). A higher threshold of 6.9 ppb dissolved tryptophan is proposed to indicate heavily contaminated sources (≥100 TTC cfu per 100 mL). Current commercially available fluorimeters are easy-to-use, suitable for use online and in remote environments, require neither reagents nor consumables, and crucially provide an instantaneous reading. TLF measurements are not appreciably impaired by common interferents, such as pH, turbidity and temperature, within typical natural ranges. The technology is a viable option for the real-time screening of faecally contaminated drinking water globally. Copyright © 2017 Natural Environment Research Council (NERC), as represented by the British Geological Survey (BGS). Published by Elsevier B.V. All rights reserved.
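
    Applying the reported TLF thresholds is a simple two-level decision rule (1.3 ppb for any TTC contamination, 6.9 ppb for heavy contamination). The snippet below encodes that rule on invented readings; it is not the fitted logistic regression model from the study.

```python
def classify_tlf(tlf_ppb, any_threshold=1.3, heavy_threshold=6.9):
    """Classify a tryptophan-like fluorescence reading (ppb dissolved
    tryptophan) against the thresholds proposed in the abstract."""
    if tlf_ppb >= heavy_threshold:
        return "heavily contaminated (>=100 TTC cfu/100 mL likely)"
    if tlf_ppb >= any_threshold:
        return "contaminated (TTCs likely present)"
    return "no TTC contamination indicated"

if __name__ == "__main__":
    for reading in (0.6, 2.4, 9.1):   # illustrative readings only
        print(reading, "->", classify_tlf(reading))
```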

  18. Modeling the Swift BAT Trigger Algorithm with Machine Learning

    NASA Technical Reports Server (NTRS)

    Graff, Philip B.; Lien, Amy Y.; Baker, John G.; Sakamoto, Takanori

    2015-01-01

    To draw inferences about gamma-ray burst (GRB) source populations based on Swift observations, it is essential to understand the detection efficiency of the Swift burst alert telescope (BAT). This study considers the problem of modeling the Swift BAT triggering algorithm for long GRBs, a computationally expensive procedure, and models it using machine learning algorithms. A large sample of simulated GRBs from Lien et al. (2014) is used to train various models: random forests, boosted decision trees (with AdaBoost), support vector machines, and artificial neural networks. The best models have accuracies of approximately greater than 97% (approximately less than 3% error), which is a significant improvement on a cut in GRB flux which has an accuracy of 89.6% (10.4% error). These models are then used to measure the detection efficiency of Swift as a function of redshift z, which is used to perform Bayesian parameter estimation on the GRB rate distribution. We find a local GRB rate density of eta(sub 0) approximately 0.48(+0.41/-0.23) Gpc(exp -3) yr(exp -1) with power-law indices of eta(sub 1) approximately 1.7(+0.6/-0.5) and eta(sub 2) approximately -5.9(+5.7/-0.1) for GRBs above and below a break point of z(sub 1) approximately 6.8(+2.8/-3.2). This methodology is able to improve upon earlier studies by more accurately modeling Swift detection and using this for fully Bayesian model fitting. The code used in this analysis is publicly available online.

  19. Accurate Orientation Estimation Using AHRS under Conditions of Magnetic Distortion

    PubMed Central

    Yadav, Nagesh; Bleakley, Chris

    2014-01-01

    Low cost, compact attitude heading reference systems (AHRS) are now being used to track human body movements in indoor environments by estimation of the 3D orientation of body segments. In many of these systems, heading estimation is achieved by monitoring the strength of the Earth's magnetic field. However, the Earth's magnetic field can be locally distorted due to the proximity of ferrous and/or magnetic objects. Herein, we propose a novel method for accurate 3D orientation estimation using an AHRS, comprised of an accelerometer, gyroscope and magnetometer, under conditions of magnetic field distortion. The system performs online detection and compensation for magnetic disturbances, due to, for example, the presence of ferrous objects. The magnetic distortions are detected by exploiting variations in magnetic dip angle, relative to the gravity vector, and in magnetic strength. We investigate and show the advantages of using both magnetic strength and magnetic dip angle for detecting the presence of magnetic distortions. The correction method is based on a particle filter, which performs the correction using an adaptive cost function and by adapting the variance during particle resampling, so as to place more emphasis on the results of dead reckoning of the gyroscope measurements and less on the magnetometer readings. The proposed method was tested in an indoor environment in the presence of various magnetic distortions and under various accelerations (up to 3 g). In the experiments, the proposed algorithm achieves <2° static peak-to-peak error and <5° dynamic peak-to-peak error, significantly outperforming previous methods. PMID:25347584
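
    The two distortion cues used here, a change in magnetic field strength and a change in dip angle relative to the gravity vector, are cheap to compute from the raw accelerometer and magnetometer vectors. The sketch below flags a sample as distorted when either cue leaves a tolerance band around reference values; the tolerances and example vectors are illustrative, not taken from the paper.

```python
import numpy as np

def magnetic_distortion(accel, mag, ref_strength, ref_dip_deg,
                        strength_tol=0.1, dip_tol_deg=5.0):
    """Return True if the magnetometer reading looks distorted.

    accel, mag: 3-vectors (the accelerometer approximates gravity when static).
    ref_strength: expected magnetic field magnitude (same units as mag).
    ref_dip_deg: expected dip angle between the field and the horizontal plane.
    """
    accel = np.asarray(accel, float)
    mag = np.asarray(mag, float)
    strength = np.linalg.norm(mag)
    # dip angle = 90 deg minus the angle between the field and the gravity vector
    cos_angle = np.dot(mag, accel) / (strength * np.linalg.norm(accel))
    dip_deg = 90.0 - np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    strength_bad = abs(strength - ref_strength) / ref_strength > strength_tol
    dip_bad = abs(dip_deg - ref_dip_deg) > dip_tol_deg
    return strength_bad or dip_bad

if __name__ == "__main__":
    gravity = [0.0, 0.0, 9.81]
    clean_field = [20.0, 0.0, 45.0]     # ~66 deg dip, |B| ~ 49 (made-up units)
    distorted = [35.0, 10.0, 45.0]      # e.g. a nearby ferrous object
    print(magnetic_distortion(gravity, clean_field, 49.2, 66.0))  # False
    print(magnetic_distortion(gravity, distorted, 49.2, 66.0))    # True
```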

  20. Probability of Detection of Genotyping Errors and Mutations as Inheritance Inconsistencies in Nuclear-Family Data

    PubMed Central

    Douglas, Julie A.; Skol, Andrew D.; Boehnke, Michael

    2002-01-01

    Gene-mapping studies routinely rely on checking for Mendelian transmission of marker alleles in a pedigree, as a means of screening for genotyping errors and mutations, with the implicit assumption that, if a pedigree is consistent with Mendel’s laws of inheritance, then there are no genotyping errors. However, the occurrence of inheritance inconsistencies alone is an inadequate measure of the number of genotyping errors, since the rate of occurrence depends on the number and relationships of genotyped pedigree members, the type of errors, and the distribution of marker-allele frequencies. In this article, we calculate the expected probability of detection of a genotyping error or mutation as an inheritance inconsistency in nuclear-family data, as a function of both the number of genotyped parents and offspring and the marker-allele frequency distribution. Through computer simulation, we explore the sensitivity of our analytic calculations to the underlying error model. Under a random-allele–error model, we find that detection rates are 51%–77% for multiallelic markers and 13%–75% for biallelic markers; detection rates are generally lower when the error occurs in a parent than in an offspring, unless a large number of offspring are genotyped. Errors are especially difficult to detect for biallelic markers with equally frequent alleles, even when both parents are genotyped; in this case, the maximum detection rate is 34% for four-person nuclear families. Error detection in families in which parents are not genotyped is limited, even with multiallelic markers. Given these results, we recommend that additional error checking (e.g., on the basis of multipoint analysis) be performed, beyond routine checking for Mendelian consistency. Furthermore, our results permit assessment of the plausibility of an observed number of inheritance inconsistencies for a family, allowing the detection of likely pedigree—rather than genotyping—errors in the early stages of a genome scan. Such early assessments are valuable in either the targeting of families for resampling or discontinued genotyping. PMID:11791214
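
    The detection probabilities discussed above can be approximated by simulation: draw parental genotypes from the allele frequencies, transmit alleles to the offspring, plant a random-allele error in one genotyped family member, and count how often the resulting trio is Mendelian-inconsistent. The sketch below does this for a biallelic marker in a fully genotyped trio; it is a rough illustration, not the paper's analytic calculation.

```python
import random

def random_genotype(p):
    """Genotype as a sorted pair of alleles (0/1) for allele-1 frequency p."""
    return tuple(sorted(random.choices([0, 1], weights=[1 - p, p], k=2)))

def mendel_consistent(father, mother, child):
    """True if the child's genotype can be formed from one paternal and one
    maternal allele."""
    a, b = child
    return (a in father and b in mother) or (b in father and a in mother)

def detection_rate(p=0.5, trials=50_000):
    detected = 0
    for _ in range(trials):
        father, mother = random_genotype(p), random_genotype(p)
        child = tuple(sorted((random.choice(father), random.choice(mother))))
        # plant a random-allele error in a randomly chosen family member:
        # one allele is replaced by an allele drawn from the frequency distribution
        genos = {"father": father, "mother": mother, "child": child}
        who = random.choice(list(genos))
        g = list(genos[who])
        g[random.randrange(2)] = random.choices([0, 1], weights=[1 - p, p])[0]
        genos[who] = tuple(sorted(g))
        if not mendel_consistent(genos["father"], genos["mother"], genos["child"]):
            detected += 1
    return detected / trials

if __name__ == "__main__":
    # biallelic marker with equally frequent alleles: many errors go undetected
    print(f"estimated detection rate: {detection_rate(0.5):.2f}")
```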

  1. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1985-01-01

    A concatenated coding scheme for error control in data communications was analyzed. The inner code is used for both error correction and detection; however, the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. The probability of undetected error of the proposed scheme is derived. An efficient method for computing this probability is presented. The throughput efficiency of the proposed error control scheme incorporated with a selective repeat ARQ retransmission strategy is analyzed.
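
    A toy simulation conveys the logic of the scheme: an inner code corrects channel errors, an outer check detects whatever leaks through, a retransmission is requested when the outer check fails, and an undetected error occurs only when a corrupted block still passes the outer check. The repetition inner code and parity-bit outer check below are deliberately crude stand-ins, not the codes analyzed in the report, and here the inner code is used only for correction.

```python
import random

def inner_encode(bits):
    return [b for b in bits for _ in range(3)]           # (3,1) repetition code

def inner_decode(coded):
    """Majority-decode each triple (corrects any single flipped bit per triple)."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0 for i in range(0, len(coded), 3)]

def simulate(n_blocks=20_000, block_len=32, p=0.02):
    """Estimate the undetected-error and retransmission rates over a binary
    symmetric channel with crossover probability p."""
    undetected = retransmit = 0
    for _ in range(n_blocks):
        data = [random.randint(0, 1) for _ in range(block_len)]
        parity = sum(data) % 2                           # toy outer detection code
        received = [b ^ (random.random() < p) for b in inner_encode(data)]
        decoded = inner_decode(received)
        if sum(decoded) % 2 != parity:
            retransmit += 1                              # outer code detected errors
        elif decoded != data:
            undetected += 1                              # slipped past both codes
    return undetected / n_blocks, retransmit / n_blocks

if __name__ == "__main__":
    p_undetected, p_retransmit = simulate()
    print(f"undetected: {p_undetected:.5f}, retransmission rate: {p_retransmit:.4f}")
```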

  2. Form Overrides Meaning When Bilinguals Monitor for Errors

    PubMed Central

    Ivanova, Iva; Ferreira, Victor S.; Gollan, Tamar H.

    2016-01-01

    Bilinguals rarely produce unintended language switches, which may in part be because switches are detected and corrected by an internal monitor. But are language switches easier or harder to detect than within-language semantic errors? To approximate internal monitoring, bilinguals listened (Experiment 1) or read aloud (Experiment 2) stories, and detected language switches (translation equivalents or semantically unrelated to expected words) and within-language errors (semantically related or unrelated to expected words). Bilinguals detected semantically related within-language errors most slowly and least accurately, language switches more quickly and accurately than within-language errors, and (in Experiment 2), translation equivalents as quickly and accurately as unrelated language switches. These results suggest that internal monitoring of form (which can detect mismatches in language membership) completes earlier than, and is independent of, monitoring of meaning. However, analysis of reading times prior to error detection revealed meaning violations to be more disruptive for processing than language violations. PMID:28649169

  3. On-line Monitoring Device for High-voltage Switch Cabinet Partial Discharge Based on Pulse Current Method

    NASA Astrophysics Data System (ADS)

    Y Tao, S.; Zhang, X. Z.; Cai, H. W.; Li, P.; Feng, Y.; Zhang, T. C.; Li, J.; Wang, W. S.; Zhang, X. K.

    2017-12-01

    The pulse current method for partial discharge detection is generally applied in type testing and other off-line tests of electrical equipment at delivery. After intensive analysis of the present situation and existing problems of partial discharge detection in switch cabinets, this paper designed the circuit principle and signal extraction method for partial discharge on-line detection based on a high-voltage presence indicating system (VPIS), established a high-voltage switch cabinet partial discharge on-line detection circuit based on the pulse current method, developed background software integrated with real-time monitoring, judging and analyzing functions, carried out a real discharge simulation test on a real-type partial discharge defect simulation platform of a 10 kV switch cabinet, and verified the sensitivity and validity of the high-voltage switch cabinet partial discharge on-line monitoring device based on the pulse current method. The study presented in this paper is of great significance for switch cabinet maintenance and theoretical study of pulse current method on-line detection, and has provided a good implementation method for partial discharge on-line monitoring devices for 10 kV distribution network equipment.

  4. On-line fresh-cut lettuce quality measurement system using hyperspectral imaging

    USDA-ARS?s Scientific Manuscript database

    Lettuce, which is a main type of fresh-cut vegetable, has been used in various fresh-cut products. In this study, an online quality measurement system for detecting foreign substances on the fresh-cut lettuce was developed using hyperspectral reflectance imaging. The online detection system with a s...

  5. GenomePeek—an online tool for prokaryotic genome and metagenome analysis

    DOE PAGES

    McNair, Katelyn; Edwards, Robert A.

    2015-06-16

    As increases in prokaryotic sequencing take place, a method to quickly and accurately analyze this data is needed. Previous tools are mainly designed for metagenomic analysis and have limitations, such as long runtimes and significant false positive error rates. The online tool GenomePeek (edwards.sdsu.edu/GenomePeek) was developed to analyze both single genome and metagenome sequencing files, quickly and with low error rates. GenomePeek uses a sequence assembly approach where reads to a set of conserved genes are extracted, assembled and then aligned against the highly specific reference database. GenomePeek was found to be faster than traditional approaches while still keeping error rates low, as well as offering unique data visualization options.

  6. Aerial robot intelligent control method based on back-stepping

    NASA Astrophysics Data System (ADS)

    Zhou, Jian; Xue, Qian

    2018-05-01

    The aerial robot is characterized by strong nonlinearity, high coupling and parameter uncertainty; a self-adaptive back-stepping control method based on a neural network is therefore proposed in this paper. The uncertain part of the aerial robot model is compensated online by a Cerebellar Model Articulation Controller neural network, and robust control terms are designed to overcome the uncertainty error of the system during online learning. At the same time, a particle swarm algorithm is used to optimize and fix the parameters so as to improve the dynamic performance, and the control law is obtained by back-stepping recursion. Simulation results show that the designed control law achieves the desired attitude tracking performance and good robustness in the presence of uncertainties and large errors in the model parameters.

  7. Predicting non-linear dynamics by stable local learning in a recurrent spiking neural network.

    PubMed

    Gilra, Aditya; Gerstner, Wulfram

    2017-11-27

    The brain needs to predict how the body reacts to motor commands, but how a network of spiking neurons can learn non-linear body dynamics using local, online and stable learning rules is unclear. Here, we present a supervised learning scheme for the feedforward and recurrent connections in a network of heterogeneous spiking neurons. The error in the output is fed back through fixed random connections with a negative gain, causing the network to follow the desired dynamics. The rule for Feedback-based Online Local Learning Of Weights (FOLLOW) is local in the sense that weight changes depend on the presynaptic activity and the error signal projected onto the postsynaptic neuron. We provide examples of learning linear, non-linear and chaotic dynamics, as well as the dynamics of a two-link arm. Under reasonable approximations, we show, using the Lyapunov method, that FOLLOW learning is uniformly stable, with the error going to zero asymptotically.
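
    The flavour of such a local, error-feedback rule can be shown with a much simpler rate-based readout: fixed random "presynaptic" activities are combined by learned weights, the output error is fed back, and each weight changes in proportion to its own presynaptic activity times that error. This is a toy delta-rule sketch with synthetic signals, not the FOLLOW rule or a spiking network.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed random "presynaptic" activities over time (a stand-in for the filtered
# spike trains of a recurrent network).
T, N = 5000, 50
t = np.linspace(0.0, 10.0, T)
activity = np.sin(np.outer(t, rng.uniform(0.5, 5.0, N)) + rng.uniform(0, 2 * np.pi, N))

# Target trajectory built from hidden weights, so it is exactly representable.
target = activity @ rng.normal(0.0, 0.3, N)

w = np.zeros(N)      # readout weights, learned online
eta = 0.005          # learning rate
errors = np.empty(T)
for k in range(T):
    r = activity[k]              # presynaptic activities at this time step
    e = target[k] - w @ r        # output error, fed back to the readout
    w += eta * e * r             # local update: presynaptic activity times error
    errors[k] = e

print(f"mean |error| over first 10% of steps: {np.mean(np.abs(errors[: T // 10])):.4f}")
print(f"mean |error| over last 10% of steps:  {np.mean(np.abs(errors[-T // 10:])):.4f}")
```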

  8. Predicting non-linear dynamics by stable local learning in a recurrent spiking neural network

    PubMed Central

    Gerstner, Wulfram

    2017-01-01

    The brain needs to predict how the body reacts to motor commands, but how a network of spiking neurons can learn non-linear body dynamics using local, online and stable learning rules is unclear. Here, we present a supervised learning scheme for the feedforward and recurrent connections in a network of heterogeneous spiking neurons. The error in the output is fed back through fixed random connections with a negative gain, causing the network to follow the desired dynamics. The rule for Feedback-based Online Local Learning Of Weights (FOLLOW) is local in the sense that weight changes depend on the presynaptic activity and the error signal projected onto the postsynaptic neuron. We provide examples of learning linear, non-linear and chaotic dynamics, as well as the dynamics of a two-link arm. Under reasonable approximations, we show, using the Lyapunov method, that FOLLOW learning is uniformly stable, with the error going to zero asymptotically. PMID:29173280

  9. What are incident reports telling us? A comparative study at two Australian hospitals of medication errors identified at audit, detected by staff and reported to an incident system

    PubMed Central

    Westbrook, Johanna I.; Li, Ling; Lehnbom, Elin C.; Baysari, Melissa T.; Braithwaite, Jeffrey; Burke, Rosemary; Conn, Chris; Day, Richard O.

    2015-01-01

    Objectives To (i) compare medication errors identified at audit and observation with medication incident reports; (ii) identify differences between two hospitals in incident report frequency and medication error rates; (iii) identify prescribing error detection rates by staff. Design Audit of 3291 patient records at two hospitals to identify prescribing errors and evidence of their detection by staff. Medication administration errors were identified from a direct observational study of 180 nurses administering 7451 medications. Severity of errors was classified. Those likely to lead to patient harm were categorized as ‘clinically important’. Setting Two major academic teaching hospitals in Sydney, Australia. Main Outcome Measures Rates of medication errors identified from audit and from direct observation were compared with reported medication incident reports. Results A total of 12 567 prescribing errors were identified at audit. Of these 1.2/1000 errors (95% CI: 0.6–1.8) had incident reports. Clinically important prescribing errors (n = 539) were detected by staff at a rate of 218.9/1000 (95% CI: 184.0–253.8), but only 13.0/1000 (95% CI: 3.4–22.5) were reported. 78.1% (n = 421) of clinically important prescribing errors were not detected. A total of 2043 drug administrations (27.4%; 95% CI: 26.4–28.4%) contained ≥1 errors; none had an incident report. Hospital A had a higher frequency of incident reports than Hospital B, but a lower rate of errors at audit. Conclusions Prescribing errors with the potential to cause harm frequently go undetected. Reported incidents do not reflect the profile of medication errors which occur in hospitals or the underlying rates. This demonstrates the inaccuracy of using incident frequency to compare patient risk or quality performance within or across hospitals. New approaches including data mining of electronic clinical information systems are required to support more effective medication error detection and mitigation. PMID:25583702

  10. Local concurrent error detection and correction in data structures using virtual backpointers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, C.C.J.; Chen, P.P.; Fuchs, W.K.

    1989-11-01

    A new technique, based on virtual backpointers, is presented in this paper for local concurrent error detection and correction in linked data structures. Two new data structures utilizing virtual backpointers, the Virtual Double-Linked List and the B-Tree with Virtual Backpointers, are described. For these structures, double errors within a fixed-size checking window can be detected in constant time and single errors detected during forward moves can be corrected in constant time.

  11. Preliminary test result of PRESTo application to the southern Korean peninsula

    NASA Astrophysics Data System (ADS)

    Chi, H.; Lim, I.; Park, J.; Zollo, A.; Elia, L.; Iannaccone, G.

    2012-12-01

    KMA (Korea Meteorological Agency) and KIGAM (Korea Institute of Geoscience and Mineral Resources) started a project in 2007 to construct an EEWS (Earthquake Early Warning System) for South Korea. KIGAM has been operating ElarmS (Earthquake Alarms Systems), developed by the UC Berkeley Seismological Lab, in real-time mode since mid-2010. PRESTo (PRobabilistic and Evolutionary early warning SysTem) is a software platform coded by RISSC-lab in Italy for regional earthquake early warning. In comparison to ElarmS, PRESTo takes a different approach to estimating earthquake parameters by adopting probability functions. We conducted online and offline tests to evaluate the feasibility of PRESTo for the seismic network of the southern part of the Korean Peninsula. The 3D velocity model grids for P- and S-waves were calculated from the 1D velocity model used by KIGAM for routine work. Two magnitude equations, High Mag for events with magnitude over 4.0 and Low Mag for the others, were used without any change to the parameters applied to Naples in Italy, in order to investigate patterns in the difference between KIGAM's local magnitudes and PRESTo's. Offline and online tests of PRESTo were run on two PCs with Windows 7, Intel Core i7-2 3.4 GHz CPUs and 8 GB of memory. The offline simulation was applied to the 162 earthquakes with magnitude over 2.0 that occurred in the inland or offshore areas of the southern Korean Peninsula from January 2007 to May 2012. In the offline test, PRESTo detected 132 events, about 80.5% of the total. Of the detected events, 91% had reasonable resolution, with origin time errors within 15 seconds and location errors within 10 km; 106 events, 80% of the processed ones, had good resolution with location errors of less than 5 km. Events with magnitude less than 4.0 showed magnitudes 0.3-0.8 smaller than those in KIGAM's bulletin, whereas events with magnitude over 4.0 showed magnitudes similar to the bulletin values. Because of the computing resources PRESTo requires for online processing, we divided the study area into 4 zones for the online test. The KIGAM bulletin contained 13 earthquakes with magnitude over 2.0 from May to June 2012, and PRESTo detected 12 of them in real-time processing mode. The location accuracy of these events was slightly better than the results of ElarmS. PRESTo used a small velocity grid because the study area had been split into 4 parts, whereas ElarmS observed events over the whole area. The PRESTo instances for the four areas produced different results for the same events and triggered many false events when noisy stations were included in the localized network. Overall, in the test operation of PRESTo the magnitudes were about 0.3-0.8 smaller than those of KIGAM's bulletin. Parameter tuning is required to determine adequate magnitudes with regard to the seismic characteristics of the Korean Peninsula. With the current PRESTo system it is possible to obtain detailed analysis in a local area, but the computing burden grows as the size of the area and the number of stations increase. It is necessary to split the single package into several modules in order to reduce system resource usage and to handle a large number of data channels in the future.

  12. Detection of illicit online sales of fentanyls via Twitter

    PubMed Central

    Mackey, Tim K.; Kalyanam, Janani

    2017-01-01

    A counterfeit fentanyl crisis is currently underway in the United States.  Counterfeit versions of commonly abused prescription drugs laced with fentanyl are being manufactured, distributed, and sold globally, leading to an increase in overdose and death in countries like the United States and Canada.  Despite concerns from the U.S. Drug Enforcement Agency regarding covert and overt sale of fentanyls online, no study has examined the role of the Internet and social media on fentanyl illegal marketing and direct-to-consumer access.  In response, this study collected and analyzed five months of Twitter data (from June-November 2015) filtered for the keyword “fentanyl” using Amazon Web Services.  We then analyzed 28,711 fentanyl-related tweets using text filtering and a machine learning approach called a Biterm Topic Model (BTM) to detect underlying latent patterns or “topics” present in the corpus of tweets.  Using this approach we detected a subset of 771 tweets marketing the sale of fentanyls online and then filtered this down to nine unique tweets containing hyperlinks to external websites.  Six hyperlinks were associated with online fentanyl classified ads, 2 with illicit online pharmacies, and 1 could not be classified due to traffic redirection.  Importantly, the one illicit online pharmacy detected was still accessible and offered the sale of fentanyls and other controlled substances direct-to-consumers with no prescription required at the time of publication of this study.   Overall, we detected a relatively small sample of Tweets promoting illegal online sale of fentanyls.  However, the detection of even a few online sellers represents a public health danger and a direct violation of law that demands further study. PMID:29259769

  13. Detection of illicit online sales of fentanyls via Twitter.

    PubMed

    Mackey, Tim K; Kalyanam, Janani

    2017-01-01

    A counterfeit fentanyl crisis is currently underway in the United States.  Counterfeit versions of commonly abused prescription drugs laced with fentanyl are being manufactured, distributed, and sold globally, leading to an increase in overdose and death in countries like the United States and Canada.  Despite concerns from the U.S. Drug Enforcement Agency regarding covert and overt sale of fentanyls online, no study has examined the role of the Internet and social media on fentanyl illegal marketing and direct-to-consumer access.  In response, this study collected and analyzed five months of Twitter data (from June-November 2015) filtered for the keyword "fentanyl" using Amazon Web Services.  We then analyzed 28,711 fentanyl-related tweets using text filtering and a machine learning approach called a Biterm Topic Model (BTM) to detect underlying latent patterns or "topics" present in the corpus of tweets.  Using this approach we detected a subset of 771 tweets marketing the sale of fentanyls online and then filtered this down to nine unique tweets containing hyperlinks to external websites.  Six hyperlinks were associated with online fentanyl classified ads, 2 with illicit online pharmacies, and 1 could not be classified due to traffic redirection.  Importantly, the one illicit online pharmacy detected was still accessible and offered the sale of fentanyls and other controlled substances direct-to-consumers with no prescription required at the time of publication of this study.   Overall, we detected a relatively small sample of Tweets promoting illegal online sale of fentanyls.  However, the detection of even a few online sellers represents a public health danger and a direct violation of law that demands further study.
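
    The first filtering stage described, isolating tweets that mention the keyword, use sale-related language and carry an outbound hyperlink, is easy to reproduce with a keyword and URL filter. The example tweets below are fabricated, the sale-term list is an assumption, and this obviously omits the BTM topic-modelling step.

```python
import re

URL_RE = re.compile(r"https?://\S+")
SALE_TERMS = ("buy", "order", "for sale", "shipping", "no prescription")

def flag_tweet(text: str):
    """Return any hyperlinks in a tweet that also mentions the keyword and at
    least one sale-related term (a crude stand-in for topic modelling)."""
    lowered = text.lower()
    if "fentanyl" not in lowered:
        return []
    if not any(term in lowered for term in SALE_TERMS):
        return []
    return URL_RE.findall(text)

if __name__ == "__main__":
    fabricated = [
        "Another overdose story about fentanyl in the news today",
        "Buy fentanyl powder online, discreet shipping http://example.invalid/shop",
    ]
    for tweet in fabricated:
        print(flag_tweet(tweet))
```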

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Passarge, M; Fix, M K; Manser, P

    Purpose: To create and test an accurate EPID-frame-based VMAT QA metric to detect gross dose errors in real-time and to provide information about the source of error. Methods: A Swiss cheese model was created for an EPID-based real-time QA process. The system compares a treatment-plan-based reference set of EPID images with images acquired over each 2° gantry angle interval. The metric utilizes a sequence of independent consecutively executed error detection methods: a masking technique that verifies in-field radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment to quantify rotation, scaling and translation; standard gamma evaluation (3%, 3 mm) and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each test were determined. For algorithm testing, twelve different types of errors were selected to modify the original plan. Corresponding predictions for each test case were generated, which included measurement-based noise. Each test case was run multiple times (with different noise per run) to assess the ability to detect introduced errors. Results: Averaged over five test runs, 99.1% of all plan variations that resulted in patient dose errors were detected within 2° and 100% within 4° (∼1% of patient dose delivery). Including cases that led to slightly modified but clinically equivalent plans, 91.5% were detected by the system within 2°. Based on the type of method that detected the error, determination of error sources was achieved. Conclusion: An EPID-based during-treatment error detection system for VMAT deliveries was successfully designed and tested. The system utilizes a sequence of methods to identify and prevent gross treatment delivery errors. The system was inspected for robustness with realistic noise variations, demonstrating that it has the potential to detect a large majority of errors in real-time and indicate the error source. J. V. Siebers receives funding support from Varian Medical Systems.

  15. On-line high-speed rail defect detection.

    DOT National Transportation Integrated Search

    2004-10-01

    This report presents the results of phase 2 of the project "On-line high-speed rail defect detection," aimed at improving the reliability and the speed of current defect detection in rails. Ultrasonic guided waves, traveling in the rail running di...

  16. Documenting Uncertainty and Error in Gridded Growing Degree Day and Spring Onset Maps Generated by the USA National Phenology Network

    NASA Astrophysics Data System (ADS)

    Crimmins, T. M.; Switzer, J.; Rosemartin, A.; Marsh, L.; Gerst, K.; Crimmins, M.; Weltzin, J. F.

    2016-12-01

    Since 2016 the USA National Phenology Network (USA-NPN; www.usanpn.org) has produced and delivered daily maps and short-term forecasts of accumulated growing degree days and spring onset dates at fine spatial scale for the conterminous United States. Because accumulated temperature is a strong driver of phenological transitions in plants and animals, including leaf-out, flowering, fruit ripening, and migration, these data products have utility for a wide range of natural resource planning and management applications, including scheduling invasive species and pest detection and control activities, determining planting dates, anticipating allergy outbreaks and planning agricultural harvest dates. The USA-NPN is a national-scale program that supports scientific advancement and decision-making by collecting, storing, and sharing phenology data and information. We will be expanding the suite of gridded map products offered by the USA-NPN to include predictive species-specific maps of phenological transitions in plants and animals at fine spatial and temporal resolution in the future. Data products, such as the gridded maps currently produced by the USA-NPN, inherently contain uncertainty and error arising from multiple sources, including error propagated forward from underlying climate data and from the models implemented. As providing high-quality, vetted data in a transparent way is central to the USA-NPN, we aim to identify and report the sources and magnitude of uncertainty and error in gridded maps and forecast products. At present, we compare our real-time gridded products to independent, trustworthy data sources, such as the Climate Reference Network, on a daily basis and report Mean Absolute Error and bias through an interactive online dashboard.

  17. WE-A-17A-03: Catheter Digitization in High-Dose-Rate Brachytherapy with the Assistance of An Electromagnetic (EM) Tracking System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damato, AL; Bhagwat, MS; Buzurovic, I

    Purpose: To investigate the use of a system using EM tracking, postprocessing and error-detection algorithms for measuring brachytherapy catheter locations and for detecting errors and resolving uncertainties in treatment-planning catheter digitization. Methods: An EM tracker was used to localize 13 catheters in a clinical surface applicator (A) and 15 catheters inserted into a phantom (B). Two pairs of catheters in (B) crossed paths at a distance <2 mm, producing an undistinguishable catheter artifact in that location. EM data was post-processed for noise reduction and reformatted to provide the dwell location configuration. CT-based digitization was automatically extracted from the brachytherapy plan DICOM files (CT). EM dwell digitization error was characterized in terms of the average and maximum distance between corresponding EM and CT dwells per catheter. The error detection rate (detected errors / all errors) was calculated for 3 types of errors: swap of two catheter numbers; incorrect catheter number identification superior to the closest position between two catheters (mix); and catheter-tip shift. Results: The averages ± 1 standard deviation of the average and maximum registration error per catheter were 1.9±0.7 mm and 3.0±1.1 mm for (A) and 1.6±0.6 mm and 2.7±0.8 mm for (B). The error detection rate was 100% (A and B) for swap errors, mix errors, and shift >4.5 mm (A) and >5.5 mm (B); errors were detected for shifts on average >2.0 mm (A) and >2.4 mm (B). Both mix errors associated with undistinguishable catheter artifacts were detected and at least one of the involved catheters was identified. Conclusion: We demonstrated the use of an EM tracking system for localization of brachytherapy catheters, detection of digitization errors and resolution of undistinguishable catheter artifacts. Automatic digitization may be possible with a registration between the imaging and the EM frame of reference. Research funded by the Kaye Family Award 2012.
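
    The per-catheter registration error reported above is simply the mean and maximum Euclidean distance between corresponding EM and CT dwell positions, and a shift check falls out of the same comparison. The sketch below assumes the two digitizations are already ordered so that dwell i in one list corresponds to dwell i in the other; the coordinates and the tolerance are illustrative.

```python
import numpy as np

def catheter_registration_error(em_dwells, ct_dwells):
    """Average and maximum distance (same units as the input, e.g. mm) between
    corresponding EM-tracked and CT-digitized dwell positions."""
    d = np.linalg.norm(np.asarray(em_dwells) - np.asarray(ct_dwells), axis=1)
    return d.mean(), d.max()

def flag_shift(em_dwells, ct_dwells, tolerance_mm=4.5):
    """Flag a possible catheter-tip shift if the mean registration error
    exceeds an illustrative tolerance."""
    mean_err, _ = catheter_registration_error(em_dwells, ct_dwells)
    return mean_err > tolerance_mm

if __name__ == "__main__":
    ct = np.array([[0.0, 0.0, z] for z in range(10)], dtype=float)
    em = ct + np.random.default_rng(2).normal(0.0, 1.0, ct.shape)  # ~1-2 mm noise
    print(catheter_registration_error(em, ct))
    print(flag_shift(em, ct))                     # False for small registration noise
    print(flag_shift(em + [0.0, 0.0, 6.0], ct))   # True: simulated 6 mm tip shift
```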

  18. Towards an Enhanced Aspect-based Contradiction Detection Approach for Online Review Content

    NASA Astrophysics Data System (ADS)

    Nuradilah Azman, Siti; Ishak, Iskandar; Sharef, Nurfadhlina Mohd; Sidi, Fatimah

    2017-09-01

    User-generated content such as online reviews plays an important role in customers' purchase decisions. Many works have focused on identifying reviewer satisfaction in social media through sentiment analysis (SA) and opinion mining, and the large number of potential applications together with the growing volume of opinions expressed on the web has driven research interest in these areas. However, because of reviewer idiosyncrasy, different reviewers may hold different preferences and points of view on a particular subject, in this case hotel reviews. Research that focuses on contradiction detection in tourism-related online reviews, especially numerical contradiction, remains limited. The aims of this paper are therefore to investigate the types of contradiction that occur in online reviews, focusing mainly on hotel reviews; to provide useful material on processes and methods for identifying contradictions within the review text itself; and to identify opportunities for relevant future research on contradiction detection in online reviews. We also propose a model to detect numerical contradictions in user-generated content for the tourism industry.

  19. Errors, error detection, error correction and hippocampal-region damage: data and theories.

    PubMed

    MacKay, Donald G; Johnson, Laura W

    2013-11-01

    This review and perspective article outlines 15 observational constraints on theories of errors, error detection, and error correction, and their relation to hippocampal-region (HR) damage. The core observations come from 10 studies with H.M., an amnesic with cerebellar and HR damage but virtually no neocortical damage. Three studies examined the detection of errors planted in visual scenes (e.g., a bird flying in a fish bowl in a school classroom) and sentences (e.g., I helped themselves to the birthday cake). In all three experiments, H.M. detected reliably fewer errors than carefully matched memory-normal controls. Other studies examined the detection and correction of self-produced errors, with controls for comprehension of the instructions, impaired visual acuity, temporal factors, motoric slowing, forgetting, excessive memory load, lack of motivation, and deficits in visual scanning or attention. In these studies, H.M. corrected reliably fewer errors than memory-normal and cerebellar controls, and his uncorrected errors in speech, object naming, and reading aloud exhibited two consistent features: omission and anomaly. For example, in sentence production tasks, H.M. omitted one or more words in uncorrected encoding errors that rendered his sentences anomalous (incoherent, incomplete, or ungrammatical) reliably more often than controls. Besides explaining these core findings, the theoretical principles discussed here explain H.M.'s retrograde amnesia for once familiar episodic and semantic information; his anterograde amnesia for novel information; his deficits in visual cognition, sentence comprehension, sentence production, sentence reading, and object naming; and effects of aging on his ability to read isolated low frequency words aloud. These theoretical principles also explain a wide range of other data on error detection and correction and generate new predictions for future test. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Unmodeled observation error induces bias when inferring patterns and dynamics of species occurrence via aural detections

    USGS Publications Warehouse

    McClintock, Brett T.; Bailey, Larissa L.; Pollock, Kenneth H.; Simons, Theodore R.

    2010-01-01

    The recent surge in the development and application of species occurrence models has been associated with an acknowledgment among ecologists that species are detected imperfectly due to observation error. Standard models now allow unbiased estimation of occupancy probability when false negative detections occur, but this is conditional on no false positive detections and sufficient incorporation of explanatory variables for the false negative detection process. These assumptions are likely reasonable in many circumstances, but there is mounting evidence that false positive errors and detection probability heterogeneity may be much more prevalent in studies relying on auditory cues for species detection (e.g., songbird or calling amphibian surveys). We used field survey data from a simulated calling anuran system of known occupancy state to investigate the biases induced by these errors in dynamic models of species occurrence. Despite the participation of expert observers in simplified field conditions, both false positive errors and site detection probability heterogeneity were extensive for most species in the survey. We found that even low levels of false positive errors, constituting as little as 1% of all detections, can cause severe overestimation of site occupancy, colonization, and local extinction probabilities. Further, unmodeled detection probability heterogeneity induced substantial underestimation of occupancy and overestimation of colonization and local extinction probabilities. Completely spurious relationships between species occurrence and explanatory variables were also found. Such misleading inferences would likely have deleterious implications for conservation and management programs. We contend that all forms of observation error, including false positive errors and heterogeneous detection probabilities, must be incorporated into the estimation framework to facilitate reliable inferences about occupancy and its associated vital rate parameters.

  1. PRESAGE: Protecting Structured Address Generation against Soft Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram

    Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and lower computational overhead. Unfortunately, efficient detectors to detect faults during address generation (to index large arrays) have not been widely researched. We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that flows an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Enabling the flow of errors allows one to situate detectors at loop exit points, and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.

  2. PRESAGE: Protecting Structured Address Generation against Soft Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram

    Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and lower computational overhead. Unfortunately, efficient detectors to detect faults during address generation have not been widely researched (especially in the context of indexing large arrays). We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that propagates an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Ensuring the propagation of errors allows one to place detectors at loop exit points and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.
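
    The following sketch illustrates the idea behind detectors placed at loop exit points; it is not the PRESAGE compiler pass itself. The index is carried as a propagated running offset, and an independently recomputed expected final offset is compared against it when the loop exits, so a bit-flip in the address chain surfaces as a detectable mismatch rather than as a silent wrong access. Function and variable names are illustrative.

```python
# Illustrative sketch (not the PRESAGE compiler transformation) of structuring
# address computation so that a corrupted index propagates to a loop-exit
# detector instead of silently hitting one wrong element.  The index is
# carried as a running offset; an independently recomputed expected final
# offset is compared against it when the loop exits.

def sum_strided(a, start, stride, count):
    total = 0.0
    offset = start                             # propagated offset chain
    for _ in range(count):
        total += a[offset]
        offset += stride                       # a bit-flip here propagates forward
    expected_final = start + stride * count    # redundant recomputation
    if offset != expected_final:               # loop-exit detector
        raise RuntimeError("soft error detected in address computation")
    return total

print(sum_strided(list(range(100)), start=2, stride=3, count=10))  # 155.0
```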

  3. A Dual Frequency Carrier Phase Error Difference Checking Algorithm for the GNSS Compass.

    PubMed

    Liu, Shuo; Zhang, Lei; Li, Jian

    2016-11-24

    The performance of the Global Navigation Satellite System (GNSS) compass is related to the quality of carrier phase measurement. How to process the carrier phase error properly is important to improve the GNSS compass accuracy. In this work, we propose a dual frequency carrier phase error difference checking algorithm for the GNSS compass. The algorithm aims at eliminating large carrier phase errors in dual frequency double differenced carrier phase measurements according to the error difference between the two frequencies. The advantage of the proposed algorithm is that it does not need additional environment information and has a good performance on multiple large errors compared with previous research. The core of the proposed algorithm is to remove the geometrical distance from the dual frequency carrier phase measurement, so that the carrier phase error is separated and becomes detectable. We generate the Double Differenced Geometry-Free (DDGF) measurement according to the characteristic that the different frequency carrier phase measurements contain the same geometrical distance. Then, we propose the DDGF detection to detect the large carrier phase error difference between the two frequencies. The theoretical performance of the proposed DDGF detection is analyzed. An open sky test, a man-made multipath test and an urban vehicle test were carried out to evaluate the performance of the proposed algorithm. The result shows that the proposed DDGF detection is able to detect large errors in dual frequency carrier phase measurements by checking the error difference between the two frequencies. After the DDGF detection, the accuracy of the baseline vector is improved in the GNSS compass.
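
    The geometry-free idea can be illustrated numerically: because both carrier phase measurements contain the same geometric distance, a wavelength-scaled difference cancels it and leaves only the ambiguities and the between-frequency error difference. The sketch below uses assumed GPS L1/L2 wavelengths and a deliberately simplified phase model; it is not the paper's full double-differenced detection statistic.

```python
# Simplified numerical illustration of a geometry-free combination of
# dual-frequency carrier phases: the common geometric distance cancels,
# leaving the ambiguities plus the between-frequency error difference.
# GPS L1/L2 wavelengths are assumed; this is not the paper's full
# double-differenced detection statistic.

C = 299_792_458.0                       # speed of light, m/s
L1, L2 = C / 1575.42e6, C / 1227.60e6   # carrier wavelengths, m

def phase_cycles(geom_range_m, ambiguity_cycles, wavelength_m, error_m=0.0):
    """Carrier phase (cycles) = range/lambda + N + error/lambda."""
    return geom_range_m / wavelength_m + ambiguity_cycles + error_m / wavelength_m

rho = 20_200_000.0                      # geometric distance common to both frequencies, m
phi1 = phase_cycles(rho, ambiguity_cycles=5, wavelength_m=L1, error_m=0.00)
phi2 = phase_cycles(rho, ambiguity_cycles=3, wavelength_m=L2, error_m=0.05)  # large error on L2

# Geometry-free combination in metres: rho cancels exactly.
gf = L1 * phi1 - L2 * phi2
ambiguity_part = L1 * 5 - L2 * 3
print(f"residual after removing ambiguities: {gf - ambiguity_part:.3f} m")   # ~ -0.050
```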

  4. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Kasami, T.; Fujiwara, T.; Lin, S.

    1986-01-01

    In this paper, a concatenated coding scheme for error control in data communications is presented and analyzed. In this scheme, the inner code is used for both error correction and detection; however, the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. Probability of undetected error (or decoding error) of the proposed scheme is derived. An efficient method for computing this probability is presented. Throughput efficiency of the proposed error control scheme incorporated with a selective-repeat ARQ retransmission strategy is also analyzed. Three specific examples are presented. One of the examples is proposed for error control in the NASA Telecommand System.
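
    The retransmission decision described above can be sketched as follows: the inner decoder attempts correction and may flag a decoding failure, the outer code is used only for detection, and a retransmission is requested if either stage signals a problem. In this sketch CRC-32 stands in for the outer detection code purely for illustration; the codes analyzed in the paper are different.

```python
import zlib

# Hedged sketch of the retransmission decision in a concatenated scheme:
# the inner decoder attempts correction and may flag failure, the outer
# code only detects errors, and a retransmission is requested if either
# stage signals trouble.  CRC-32 is an illustrative stand-in for the
# outer detection code.

def outer_encode(payload: bytes) -> bytes:
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def outer_check(frame: bytes) -> bool:
    payload, tag = frame[:-4], frame[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == tag

def receive(frame: bytes, inner_decode):
    """inner_decode returns (decoded_frame, success_flag)."""
    decoded, inner_ok = inner_decode(frame)
    if not inner_ok:                 # inner decoder failed -> request ARQ
        return None, "retransmit"
    if not outer_check(decoded):     # residual error caught by outer code
        return None, "retransmit"
    return decoded[:-4], "accept"

# Example: identity inner decoder and a single corrupted byte.
frame = outer_encode(b"telecommand block")
corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]
print(receive(corrupted, lambda f: (f, True)))   # (None, 'retransmit')
print(receive(frame, lambda f: (f, True)))       # accepted payload
```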

  5. Self-checking self-repairing computer nodes using the mirror processor

    NASA Technical Reports Server (NTRS)

    Tamir, Yuval

    1992-01-01

    Circuitry added to fault-tolerant systems for concurrent error detection usually reduces performance. Using a technique called micro rollback, it is possible to eliminate most of the performance penalty of concurrent error detection. Error detection is performed in parallel with intermodule communication, and erroneous state changes are later undone. The author reports on the design and implementation of a VLSI RISC microprocessor, called the Mirror Processor (MP), which is capable of micro rollback. In order to achieve concurrent error detection, two MP chips operate in lockstep, comparing external signals and a signature of internal signals every clock cycle. If a mismatch is detected, both processors roll back to the beginning of the cycle when the error occurred. In some cases the erroneous state is corrected by copying a value from the fault-free processor to the faulty processor. The architecture, microarchitecture, and VLSI implementation of the MP, emphasizing its error-detection, error-recovery, and self-diagnosis capabilities, are described.

  6. Differential detection in quadrature-quadrature phase shift keying (Q2PSK) systems

    NASA Astrophysics Data System (ADS)

    El-Ghandour, Osama M.; Saha, Debabrata

    1991-05-01

    A generalized quadrature-quadrature phase shift keying (Q2PSK) signaling format is considered for differential encoding and differential detection. Performance in the presence of additive white Gaussian noise (AWGN) is analyzed. Symbol error rate is found to be approximately twice the symbol error rate in a quaternary DPSK system operating at the same Eb/N0. However, the bandwidth efficiency of differential Q2PSK is substantially higher than that of quaternary DPSK. When the error is due to AWGN, the ratio of double error rate to single error rate can be very high, and the ratio may approach zero at high SNR. To improve error rate, differential detection through maximum-likelihood decoding based on multiple or N symbol observations is considered. If N and SNR are large this decoding gives a 3-dB advantage in error rate over conventional N = 2 differential detection, fully recovering the energy loss (as compared to coherent detection) if the observation is extended to a large number of symbol durations.

  7. The current role of on-line extraction approaches in clinical and forensic toxicology.

    PubMed

    Mueller, Daniel M

    2014-08-01

    In today's clinical and forensic toxicological laboratories, automation is of interest because of its ability to optimize processes, to reduce manual workload and handling errors and to minimize exposure to potentially infectious samples. Extraction is usually the most time-consuming step; therefore, automation of this step is reasonable. Currently, from the field of clinical and forensic toxicology, methods using the following on-line extraction techniques have been published: on-line solid-phase extraction, turbulent flow chromatography, solid-phase microextraction, microextraction by packed sorbent, single-drop microextraction and on-line desorption of dried blood spots. Most of these published methods are either single-analyte or multicomponent procedures; methods intended for systematic toxicological analysis are relatively scarce. However, the use of on-line extraction will certainly increase in the near future.

  8. Local concurrent error detection and correction in data structures using virtual backpointers

    NASA Technical Reports Server (NTRS)

    Li, C. C.; Chen, P. P.; Fuchs, W. K.

    1987-01-01

    A new technique, based on virtual backpointers, for local concurrent error detection and correction in linked data structures is presented. Two new data structures, the Virtual Double Linked List, and the B-tree with Virtual Backpointers, are described. For these structures, double errors can be detected in O(1) time and errors detected during forward moves can be corrected in O(1) time. The application of a concurrent auditor process to data structure error detection and correction is analyzed, and an implementation is described, to determine the effect on mean time to failure of a multi-user shared database system. The implementation utilizes a Sequent shared memory multiprocessor system operating on a shared database of Virtual Double Linked Lists.

  9. Local concurrent error detection and correction in data structures using virtual backpointers

    NASA Technical Reports Server (NTRS)

    Li, Chung-Chi Jim; Chen, Paul Peichuan; Fuchs, W. Kent

    1989-01-01

    A new technique, based on virtual backpointers, for local concurrent error detection and correction in linked data structures is presented. Two new data structures, the Virtual Double Linked List, and the B-tree with Virtual Backpointers, are described. For these structures, double errors can be detected in O(1) time and errors detected during forward moves can be corrected in O(1) time. The application of a concurrent auditor process to data structure error detection and correction is analyzed, and an implementation is described, to determine the effect on mean time to failure of a multi-user shared database system. The implementation utilizes a Sequent shared memory multiprocessor system operating on a shared database of Virtual Double Linked Lists.
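
    The exact encoding used in the Virtual Double Linked List is not reproduced here; the sketch below shows one XOR-based variant of the same general idea, storing at each node a check word that ties the forward pointer to the predecessor so that a corrupted pointer is caught in O(1) during a forward move. The node layout and field names are assumptions for illustration only.

```python
# Illustrative sketch only: not the paper's exact Virtual Double Linked List
# encoding.  Each node stores a check word equal to XOR(predecessor index,
# successor index), so that during a forward move the structure can be
# verified in O(1) and a corrupted forward pointer (or check word) is
# detected immediately.

NIL = 0xFFFF  # sentinel index

class Node:
    def __init__(self, value):
        self.value = value
        self.next = NIL        # forward pointer (index into the node table)
        self.check = 0         # XOR of predecessor and successor indices

def build(values):
    nodes = [Node(v) for v in values]
    for i, n in enumerate(nodes):
        succ = i + 1 if i + 1 < len(nodes) else NIL
        pred = i - 1 if i > 0 else NIL
        n.next = succ
        n.check = pred ^ succ
    return nodes

def traverse(nodes):
    prev, cur = NIL, 0
    while cur != NIL:
        node = nodes[cur]
        if node.check != (prev ^ node.next):   # O(1) structural check
            raise RuntimeError(f"pointer error detected at node {cur}")
        yield node.value
        prev, cur = cur, node.next

lst = build([10, 20, 30, 40])
print(list(traverse(lst)))         # [10, 20, 30, 40]
lst[1].next = 3                    # corrupt a forward pointer (skip a node)
try:
    list(traverse(lst))
except RuntimeError as e:
    print(e)                       # pointer error detected at node 1
```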

  10. Exponential parameter and tracking error convergence guarantees for adaptive controllers without persistency of excitation

    NASA Astrophysics Data System (ADS)

    Chowdhary, Girish; Mühlegg, Maximilian; Johnson, Eric

    2014-08-01

    In model reference adaptive control (MRAC) the modelling uncertainty is often assumed to be parameterised with time-invariant unknown ideal parameters. The convergence of parameters of the adaptive element to these ideal parameters is beneficial, as it guarantees exponential stability, and makes an online learned model of the system available. Most MRAC methods, however, require persistent excitation (PE) of the states to guarantee that the adaptive parameters converge to the ideal values. Enforcing PE may be resource intensive and often infeasible in practice. This paper presents theoretical analysis and illustrative examples of an adaptive control method that leverages the increasing ability to record and process data online by using specifically selected and online recorded data concurrently with instantaneous data for adaptation. It is shown that when the system uncertainty can be modelled as a combination of known nonlinear bases, simultaneous exponential tracking and parameter error convergence can be guaranteed if the system states are exciting over finite intervals such that rich data can be recorded online; PE is not required. Furthermore, the rate of convergence is directly proportional to the minimum singular value of the matrix containing online recorded data. Consequently, an online algorithm to record and forget data is presented and its effects on the resulting switched closed-loop dynamics are analysed. It is also shown that when radial basis function neural networks (NNs) are used as adaptive elements, the method guarantees exponential convergence of the NN parameters to a compact neighbourhood of their ideal values without requiring PE. Flight test results on a fixed-wing unmanned aerial vehicle demonstrate the effectiveness of the method.

  11. SU-E-T-310: Targeting Safety Improvements Through Analysis of Near-Miss Error Detection Points in An Incident Learning Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novak, A; Nyflot, M; Sponseller, P

    2014-06-01

    Purpose: Radiation treatment planning involves a complex workflow that can make safety improvement efforts challenging. This study utilizes an incident reporting system to identify detection points of near-miss errors, in order to guide our departmental safety improvement efforts. Previous studies have examined where errors arise, but not where they are detected or their patterns. Methods: 1377 incidents were analyzed from a departmental near-miss error reporting system from 3/2012–10/2013. All incidents were prospectively reviewed weekly by a multi-disciplinary team, and assigned a near-miss severity score ranging from 0–4 reflecting potential harm (no harm to critical). A 98-step consensus workflow was used to determine origination and detection points of near-miss errors, categorized into 7 major steps (patient assessment/orders, simulation, contouring/treatment planning, pre-treatment plan checks, therapist/on-treatment review, post-treatment checks, and equipment issues). Categories were compared using ANOVA. Results: In the 7-step workflow, 23% of near-miss errors were detected within the same step in the workflow, while an additional 37% were detected by the next step in the workflow, and 23% were detected two steps downstream. Errors detected further from origination were more severe (p<.001; Figure 1). The most common source of near-miss errors was treatment planning/contouring, with 476 near misses (35%). Of those 476, only 72 (15%) were found before leaving treatment planning, 213 (45%) were found at physics plan checks, and 191 (40%) were caught at the therapist pre-treatment chart review or on portal imaging. Errors that passed through physics plan checks and were detected by therapists were more severe than other errors originating in contouring/treatment planning (1.81 vs 1.33, p<0.001). Conclusion: Errors caught by radiation treatment therapists tend to be more severe than errors caught earlier in the workflow, highlighting the importance of safety checks in dosimetry and physics. We are utilizing our findings to improve manual and automated checklists for dosimetry and physics.

  12. Error-Related Psychophysiology and Negative Affect

    ERIC Educational Resources Information Center

    Hajcak, G.; McDonald, N.; Simons, R.F.

    2004-01-01

    The error-related negativity (ERN/Ne) and error positivity (Pe) have been associated with error detection and response monitoring. More recently, heart rate (HR) and skin conductance (SC) have also been shown to be sensitive to the internal detection of errors. An enhanced ERN has consistently been observed in anxious subjects and there is some…

  13. Simulating and Detecting Radiation-Induced Errors for Onboard Machine Learning

    NASA Technical Reports Server (NTRS)

    Wagstaff, Kiri L.; Bornstein, Benjamin; Granat, Robert; Tang, Benyang; Turmon, Michael

    2009-01-01

    Spacecraft processors and memory are subjected to high radiation doses and therefore employ radiation-hardened components. However, these components are orders of magnitude more expensive than typical desktop components, and they lag years behind in terms of speed and size. We have integrated algorithm-based fault tolerance (ABFT) methods into onboard data analysis algorithms to detect radiation-induced errors, which ultimately may permit the use of spacecraft memory that need not be fully hardened, reducing cost and increasing capability at the same time. We have also developed a lightweight software radiation simulator, BITFLIPS, that permits evaluation of error detection strategies in a controlled fashion, including the specification of the radiation rate and selective exposure of individual data structures. Using BITFLIPS, we evaluated our error detection methods when using a support vector machine to analyze data collected by the Mars Odyssey spacecraft. We found ABFT error detection for matrix multiplication is very successful, while error detection for Gaussian kernel computation still has room for improvement.
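
    The matrix-multiplication check referred to above is commonly realized with checksum-augmented operands: append a column-sum row to A and a row-sum column to B, multiply, and verify that the checksum row and column of the product still match the sums of its body. The sketch below is this generic textbook form of ABFT, not the onboard implementation.

```python
import numpy as np

# Standard algorithm-based fault tolerance (ABFT) check for matrix
# multiplication: append a column-sum row to A and a row-sum column to B,
# multiply, and verify that the checksum row/column of the result still
# equal the sums of the computed product.  Generic textbook sketch, not
# the flight code.

def abft_matmul(A, B, tol=1e-8, inject_error=False):
    Ac = np.vstack([A, A.sum(axis=0)])                  # column-checksum row
    Br = np.hstack([B, B.sum(axis=1, keepdims=True)])   # row-checksum column
    C = Ac @ Br
    if inject_error:
        C[0, 0] += 1.0                                  # simulated radiation-induced error
    body = C[:-1, :-1]
    row_ok = np.allclose(C[-1, :-1], body.sum(axis=0), atol=tol)
    col_ok = np.allclose(C[:-1, -1], body.sum(axis=1), atol=tol)
    return body, (row_ok and col_ok)

rng = np.random.default_rng(1)
A, B = rng.normal(size=(4, 3)), rng.normal(size=(3, 5))
_, ok = abft_matmul(A, B)
print("clean run detected error:", not ok)              # False
_, ok = abft_matmul(A, B, inject_error=True)
print("faulty run detected error:", not ok)             # True
```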

  14. Virtual sensors for on-line wheel wear and part roughness measurement in the grinding process.

    PubMed

    Arriandiaga, Ander; Portillo, Eva; Sánchez, Jose A; Cabanes, Itziar; Pombo, Iñigo

    2014-05-19

    Grinding is an advanced machining process for the manufacturing of valuable complex and accurate parts for high added value sectors such as aerospace, wind generation, etc. Due to the extremely severe conditions inside grinding machines, critical process variables such as part surface finish or grinding wheel wear cannot be easily and cheaply measured on-line. In this paper a virtual sensor for on-line monitoring of those variables is presented. The sensor is based on the modelling ability of Artificial Neural Networks (ANNs) for stochastic and non-linear processes such as grinding; the selected architecture is the Layer-Recurrent neural network. The sensor makes use of the relation between the variables to be measured and power consumption in the wheel spindle, which can be easily measured. A sensor calibration methodology is presented, and the levels of error that can be expected are discussed. Validation of the new sensor is carried out by comparing the sensor's results with actual measurements carried out in an industrial grinding machine. Results show excellent estimation performance for both wheel wear and surface roughness. In the case of wheel wear, the absolute error is within the range of microns (average value 32 μm). In the case of surface finish, the absolute error is well below Ra 1 μm (average value 0.32 μm). The present approach can be easily generalized to other grinding operations.

  15. RECKONER: read error corrector based on KMC.

    PubMed

    Dlugosz, Maciej; Deorowicz, Sebastian

    2017-04-01

    The presence of sequencing errors in data produced by next-generation sequencers affects the quality of downstream analyses. Their accuracy can be improved by performing error correction of the sequencing reads. We introduce a new correction algorithm capable of processing high-error-rate data from eukaryotic genomes close to 500 Mbp in size using less than 4 GB of RAM in about 35 min on a 16-core computer. The program is freely available at http://sun.aei.polsl.pl/REFRESH/reckoner . sebastian.deorowicz@polsl.pl. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  16. An advanced SEU tolerant latch based on error detection

    NASA Astrophysics Data System (ADS)

    Xu, Hui; Zhu, Jianwei; Lu, Xiaoping; Li, Jingzhao

    2018-05-01

    This paper proposes a latch that can mitigate SEUs via an error detection circuit. The error detection circuit is hardened by a C-element and a stacked PMOS. In the hold state, a particle strike on the latch or the error detection circuit may cause a faulty logic state of the circuit. The error detection circuit can detect the upset node in the latch, and the faulty output will be corrected. The upset node in the error detection circuit can be corrected by the C-element. The power dissipation and propagation delay of the proposed latch are analyzed by HSPICE simulations. The proposed latch consumes about 77.5% less energy and has 33.1% less propagation delay than the triple modular redundancy (TMR) latch. Simulation results demonstrate that the proposed latch can mitigate SEUs effectively. Project supported by the National Natural Science Foundation of China (Nos. 61404001, 61306046), the Anhui Province University Natural Science Research Major Project (No. KJ2014ZD12), the Huainan Science and Technology Program (No. 2013A4011), and the National Natural Science Foundation of China (No. 61371025).

  17. Multimodal game bot detection using user behavioral characteristics.

    PubMed

    Kang, Ah Reum; Jeong, Seong Hoon; Mohaisen, Aziz; Kim, Huy Kang

    2016-01-01

    As the online service industry has continued to grow, illegal activities in the online world have drastically increased and become more diverse. Most illegal activities occur continuously because cyber assets, such as game items and cyber money in online games, can be monetized into real currency. The aim of this study is to detect game bots in a massively multiplayer online role playing game (MMORPG). We observed the behavioral characteristics of game bots and found that they execute repetitive tasks associated with gold farming and real money trading. We propose a game bot detection method based on user behavioral characteristics. The method of this paper was applied to real data provided by a major MMORPG company. Detection accuracy rate increased to 96.06 % on the banned account list.

  18. [Detection and classification of medication errors at Joan XXIII University Hospital].

    PubMed

    Jornet Montaña, S; Canadell Vilarrasa, L; Calabuig Muñoz, M; Riera Sendra, G; Vuelta Arce, M; Bardají Ruiz, A; Gallart Mora, M J

    2004-01-01

    Medication errors are multifactorial and multidisciplinary, and may originate in processes such as drug prescription, transcription, dispensation, preparation and administration. The goal of this work was to measure the incidence of detectable medication errors that arise within a unit dose drug distribution and control system, from drug prescription to drug administration, by means of an observational method confined to the Pharmacy Department, as well as a voluntary, anonymous report system. The acceptance of this voluntary report system's implementation was also assessed. A prospective descriptive study was conducted. Data collection was performed at the Pharmacy Department from a review of prescribed medical orders, a review of pharmaceutical transcriptions, a review of dispensed medication and a review of medication returned in unit dose medication carts. A voluntary, anonymous report system centralized in the Pharmacy Department was also set up to detect medication errors. Prescription errors were the most frequent (1.12%), closely followed by dispensation errors (1.04%). Transcription errors (0.42%) and administration errors (0.69%) had the lowest overall incidence. Voluntary report involved only 4.25% of all detected errors, whereas unit dose medication cart review contributed the most to error detection. Recognizing the incidence and types of medication errors that occur in a health-care setting allows us to analyze their causes and effect changes in different stages of the process in order to ensure maximal patient safety.

  19. A citizen science approach to optimising computer aided detection (CAD) in mammography

    NASA Astrophysics Data System (ADS)

    Ionescu, Georgia V.; Harkness, Elaine F.; Hulleman, Johan; Astley, Susan M.

    2018-03-01

    Computer aided detection (CAD) systems assist medical experts during image interpretation. In mammography, CAD systems prompt suspicious regions, which helps medical experts to detect early signs of cancer. This is a challenging task, and prompts may appear in regions that are actually normal, whilst genuine cancers may be missed. The effect prompting has on readers' performance is not fully known. In order to explore the effects of prompting errors, we have created an online game (Bat Hunt), designed for non-experts, that mirrors mammographic CAD. This allows us to explore a wider parameter space. Users are required to detect bats in images of flocks of birds, with image difficulty matched to the proportions of screening mammograms in different BI-RADS density categories. Twelve prompted conditions were investigated, along with unprompted detection. On average, players achieved a sensitivity of 0.33 for unprompted detection, and sensitivities of 0.75, 0.83, and 0.92, respectively, for 70%, 80%, and 90% of targets prompted, regardless of CAD specificity. False prompts distract players from finding unprompted targets if they appear in the same image. Player performance decreases when the number of false prompts increases, and increases proportionally with prompting sensitivity. The median d' was lowest for the unprompted condition (1.08) and highest for 90% prompting sensitivity with 0.5 false prompts per image (d' = 4.48).
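
    The d' values reported above are conventionally obtained from hit and false-alarm rates via the inverse normal CDF, d' = z(hit rate) - z(false-alarm rate). A minimal illustration follows; the rates used in the example are made up and are not the study's data.

```python
from statistics import NormalDist

# d' (sensitivity index) as conventionally computed from hit and
# false-alarm rates: d' = z(hit rate) - z(false-alarm rate).
# The rates below are made-up illustrative values, not the study's data.

def d_prime(hit_rate, false_alarm_rate):
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

print(round(d_prime(0.75, 0.05), 2))   # ~2.32
print(round(d_prime(0.33, 0.05), 2))   # lower sensitivity, ~1.2
```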

  20. After the Medication Error: Recent Nursing Graduates' Reflections on Adequacy of Education.

    PubMed

    Treiber, Linda A; Jones, Jackie H

    2018-05-01

    The purpose of this study was to better understand individual- and system-level factors surrounding making a medication error from the perspective of recent Bachelor of Science in Nursing graduates. Online survey mixed-methods items included perceptions of adequacy of preparatory nursing education, contributory variables, emotional responses, and treatment by employer following the error. Of the 168 respondents, 55% had made a medication error. Errors resulted from inexperience, rushing, technology, staffing, and patient acuity. Twenty-four percent did not report their errors. Key themes for improving education included more practice in varied clinical areas, intensive pharmacological preparation, practical instruction in functioning within the health care environment, and coping after making medication errors. Errors generally caused emotional distress in the error maker. Overall, perceived treatment after the error reflected supportive environments, where nurses were generally treated with respect, fair treatment, and understanding. Opportunities for nursing education include second victim awareness and reinforcing professional practice standards. [J Nurs Educ. 2018;57(5):275-280.]. Copyright 2018, SLACK Incorporated.

  1. Implementing a mixed-mode design for collecting administrative records: striking a balance between quality and burden

    EIA Publications

    2012-01-01

    RECS relies on actual records from energy suppliers to produce robust survey estimates of household energy consumption and expenditures. During the RECS Energy Supplier Survey (ESS), energy billing records are collected from the companies that supply electricity, natural gas, fuel oil/kerosene, and propane (LPG) to the interviewed households. As Federal agencies expand the use of administrative records to enhance, replace, or evaluate survey data, EIA has explored more flexible, reliable and efficient techniques to collect energy billing records. The ESS has historically been a mail-administered survey, but EIA introduced web data collection with the 2009 RECS ESS. In that survey, energy suppliers self-selected their reporting mode among several options: standardized paper form, on-line fillable form or spreadsheet, or failing all else, a nonstandard format of their choosing. In this paper, EIA describes where reporting mode appears to influence the data quality. We detail the reporting modes, the embedded and post-hoc quality control and consistency checks that were performed, the extent of detectable errors, and the methods used for correcting data errors. We explore by mode the levels of unit and item nonresponse, number of errors, and corrections made to the data. In summary, we find notable differences in data quality between modes and analyze where the benefits of offering these new modes outweigh the "costs".

  2. Automated chromatographic laccase-mediator-system activity assay.

    PubMed

    Anders, Nico; Schelden, Maximilian; Roth, Simon; Spiess, Antje C

    2017-08-01

    To study the interaction of laccases, mediators, and substrates in laccase-mediator systems (LMS), an on-line measurement was developed using high performance anion exchange chromatography equipped with a CarboPac™ PA 100 column coupled to pulsed amperometric detection (HPAEC-PAD). The developed method was optimized for overall chromatographic run time (45 to 120 min) and automated sample drawing. As an example, the Trametes versicolor laccase-induced oxidation of 1-(3,4-dimethoxyphenyl)-2-(2-methoxyphenoxy)-1,3-dihydroxypropane (adlerol) using 1-hydroxybenzotriazole (HBT) as mediator was measured and analyzed on-line. Since the Au electrode of the PAD detects only hydroxyl-group-containing substances, with a limit of detection in the milligram/liter range, not all products are measurable. Therefore, this method was applied for the quantification of adlerol and, based on adlerol conversion, for the quantification of the LMS activity at a specific T. versicolor laccase/HBT ratio. The automated chromatographic activity assay allowed for a defined reaction start of all laccase-mediator-system reaction mixtures, and the LMS reaction progress was automatically monitored for 48 h. The automatization enabled integrated monitoring overnight and over the weekend and minimized manual errors such as pipetting of solutions. The activity of the LMS based on adlerol consumption was determined to be 0.47 U/mg protein for a laccase/mediator ratio of 1.75 U laccase/g HBT. In the future, the automated method will allow for a fast screening of combinations of laccases, mediators, and substrates which are efficient for lignin modification. In particular, it allows for a fast and easy quantification of the oxidizing activity of an LMS on a lignin-related substrate which is not covered by typical colorimetric laccase assays.

  3. Deception Detection: The Relationship of Levels of Trust and Perspective Taking in Real-Time Online and Offline Communication Environments.

    PubMed

    Friend, Catherine; Fox Hamilton, Nicola

    2016-09-01

    Where humans have been found to detect lies or deception only at the rate of chance in offline face-to-face communication (F2F), computer-mediated communication (CMC) online can elicit higher rates of trust and sharing of personal information than F2F. How do levels of trust and empathetic personality traits like perspective taking (PT) relate to deception detection in real-time CMC compared to F2F? A between groups correlational design (N = 40) demonstrated that, through a paired deceptive conversation task with confederates, levels of participant trust could predict accurate detection online but not offline. Second, participant PT abilities could not predict accurate detection in either conversation medium. Finally, this study found that conversation medium also had no effect on deception detection. This study finds support for the effects of the Truth Bias and online disinhibition in deception, and further implications in law enforcement are discussed.

  4. Similarity between community structures of different online social networks and its impact on underlying community detection

    NASA Astrophysics Data System (ADS)

    Fan, W.; Yeung, K. H.

    2015-03-01

    Because social networking services are popular, many people register in more than one online social network. In this paper we study a set of users who have accounts on three online social networks, namely Foursquare, Facebook and Twitter. The community structure of this set of users may be reflected in all three online social networks. Therefore, high correlation between these reflections and the underlying community structure may be observed. In this work, community structures are detected in all three online social networks. Also, we investigate the similarity level of community structures across different networks. It is found that they show strong correlation with each other. The similarity between different networks may be helpful to find a community structure close to the underlying one. To verify this, we propose a method to increase the weights of some connections in the networks. With this method, new networks are generated to assist community detection. By doing this, the value of modularity can be improved and the new community structure matches the network's natural structure better. In this paper we also show that the detected community structures of online social networks are correlated with users' locations, which are identified on Foursquare. This information may also be useful for underlying community detection.

  5. Long-term object tracking combined offline with online learning

    NASA Astrophysics Data System (ADS)

    Hu, Mengjie; Wei, Zhenzhong; Zhang, Guangjun

    2016-04-01

    We propose a simple yet effective method for long-term object tracking. Different from the traditional visual tracking method, which mainly depends on frame-to-frame correspondence, we combine high-level semantic information with low-level correspondences. Our framework is formulated in a confidence selection framework, which allows our system to recover from drift and partly deal with occlusion. To summarize, our algorithm can be roughly decomposed into an initialization stage and a tracking stage. In the initialization stage, an offline detector is trained to get the object appearance information at the category level, which is used for detecting the potential target and initializing the tracking stage. The tracking stage consists of three modules: the online tracking module, detection module, and decision module. A pretrained detector is used for maintaining drift of the online tracker, while the online tracker is used for filtering out false positive detections. A confidence selection mechanism is proposed to optimize the object location based on the online tracker and detection. If the target is lost, the pretrained detector is utilized to reinitialize the whole algorithm when the target is relocated. During experiments, we evaluate our method on several challenging video sequences, and it demonstrates huge improvement compared with detection and online tracking only.

  6. Performance of an online translation tool when applied to patient educational material.

    PubMed

    Khanna, Raman R; Karliner, Leah S; Eck, Matthias; Vittinghoff, Eric; Koenig, Christopher J; Fang, Margaret C

    2011-11-01

    Language barriers may prevent clinicians from tailoring patient educational material to the needs of individuals with limited English proficiency. Online translation tools could fill this gap, but their accuracy is unknown. We evaluated the accuracy of an online translation tool for patient educational material. We selected 45 sentences from a pamphlet available in both English and Spanish, and translated it into Spanish using GoogleTranslate™ (GT). Three bilingual Spanish speakers then performed a blinded evaluation on these 45 sentences, comparing GT-translated sentences to those translated professionally, along four domains: fluency (grammatical correctness), adequacy (information preservation), meaning (connotation maintenance), and severity (perceived dangerousness of an error if present). In addition, evaluators indicated whether they had a preference for either the GT-translated or professionally translated sentences. The GT-translated sentences had significantly lower fluency scores compared to the professional translation (3.4 vs. 4.7, P < 0.001), but similar adequacy (4.2 vs. 4.5, P = 0.19) and meaning (4.5 vs. 4.8, P = 0.29) scores. The GT-translated sentences were more likely to have any error (39% vs. 22%, P = 0.05), but not statistically more likely to have a severe error (4% vs. 2%, P = 0.61). Evaluators preferred the professional translation for complex sentences, but not for simple ones. When applied to patient educational material, GT performed comparably to professional human translation in terms of preserving information and meaning, though it was slightly worse in preserving grammar. In situations where professional human translations are unavailable or impractical, online translation may someday fill an important niche. Copyright © 2011 Society of Hospital Medicine.

  7. Online, efficient and precision laser profiling of bronze-bonded diamond grinding wheels based on a single-layer deep-cutting intermittent feeding method

    NASA Astrophysics Data System (ADS)

    Deng, Hui; Chen, Genyu; He, Jie; Zhou, Cong; Du, Han; Wang, Yanyi

    2016-06-01

    In this study, an online, efficient and precision laser profiling approach that is based on a single-layer deep-cutting intermittent feeding method is described. The effects of the laser cutting depth and the track-overlap ratio of the laser cutting on the efficiency, precision and quality of laser profiling were investigated. Experiments on the online profiling of bronze-bonded diamond grinding wheels were performed using a pulsed fiber laser. The results demonstrate that an increase in the laser cutting depth caused an increase in the material removal efficiency during the laser profiling process. However, the maximum laser profiling efficiency was only achieved when the laser cutting depth was equivalent to the initial surface contour error of the grinding wheel. In addition, the selection of relatively high track-overlap ratios of laser cutting for the profiling of grinding wheels was beneficial with respect to the increase in the precision of laser profiling, whereas the efficiency and quality of the laser profiling were not affected by the change in the track-overlap ratio. After optimized process parameters were employed for online laser profiling, the circular run-out error and the parallelism error of the grinding wheel surface decreased from 83.1 μm and 324.6 μm to 11.3 μm and 3.5 μm, respectively. The surface contour precision of the grinding wheel significantly improved. The highest surface contour precision for grinding wheels of the same type that can be theoretically achieved after laser profiling is completely dependent on the peak power density of the laser. The higher the laser peak power density is, the higher the surface contour precision of the grinding wheel after profiling.

  8. Fault-tolerant quantum error detection.

    PubMed

    Linke, Norbert M; Gutierrez, Mauricio; Landsman, Kevin A; Figgatt, Caroline; Debnath, Shantanu; Brown, Kenneth R; Monroe, Christopher

    2017-10-01

    Quantum computers will eventually reach a size at which quantum error correction becomes imperative. Quantum information can be protected from qubit imperfections and flawed control operations by encoding a single logical qubit in multiple physical qubits. This redundancy allows the extraction of error syndromes and the subsequent detection or correction of errors without destroying the logical state itself through direct measurement. We show the encoding and syndrome measurement of a fault-tolerantly prepared logical qubit via an error detection protocol on four physical qubits, represented by trapped atomic ions. This demonstrates the robustness of a logical qubit to imperfections in the very operations used to encode it. The advantage persists in the face of large added error rates and experimental calibration errors.

  9. Design of the Detector II: A CMOS Gate Array for the Study of Concurrent Error Detection Techniques.

    DTIC Science & Technology

    1987-07-01

    …detection schemes and temporary failures. The circuit consists of six different adders with concurrent error detection (CED) schemes. The error detection schemes are simple duplication, duplication with functional dual implementation, duplication with different implementations, two-rail encoding, …

  10. Improvement of the Error-detection Mechanism in Adults with Dyslexia Following Reading Acceleration Training.

    PubMed

    Horowitz-Kraus, Tzipi

    2016-05-01

    The error-detection mechanism aids in preventing error repetition during a given task. Electroencephalography demonstrates that error detection involves two event-related potential components: error-related and correct-response negativities (ERN and CRN, respectively). Dyslexia is characterized by slow, inaccurate reading. In particular, individuals with dyslexia have a less active error-detection mechanism during reading than typical readers. In the current study, we examined whether a reading training programme could improve the ability to recognize words automatically (lexical representations) in adults with dyslexia, thereby resulting in more efficient error detection during reading. Behavioural and electrophysiological measures were obtained using a lexical decision task before and after participants trained with the reading acceleration programme. ERN amplitudes were smaller in individuals with dyslexia than in typical readers before training but increased following training, as did behavioural reading scores. Differences between the pre-training and post-training ERN and CRN components were larger in individuals with dyslexia than in typical readers. Also, the error-detection mechanism as represented by the ERN/CRN complex might serve as a biomarker for dyslexia and be used to evaluate the effectiveness of reading intervention programmes. Copyright © 2016 John Wiley & Sons, Ltd.

  11. Error detection method

    DOEpatents

    Olson, Eric J.

    2013-06-11

    An apparatus, program product, and method that run an algorithm on a hardware based processor, generate a hardware error as a result of running the algorithm, generate an algorithm output for the algorithm, compare the algorithm output to another output for the algorithm, and detect the hardware error from the comparison. The algorithm is designed to cause the hardware based processor to heat to a degree that increases the likelihood of hardware errors to manifest, and the hardware error is observable in the algorithm output. As such, electronic components may be sufficiently heated and/or sufficiently stressed to create better conditions for generating hardware errors, and the output of the algorithm may be compared at the end of the run to detect a hardware error that occurred anywhere during the run that may otherwise not be detected by traditional methodologies (e.g., due to cooling, insufficient heat and/or stress, etc.).
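
    The general pattern of the method is: run a deterministic, compute-heavy algorithm that stresses and heats the processor, then compare its output with a reference output for the same inputs; any mismatch indicates that a hardware error occurred somewhere during the run. The sketch below follows that pattern with an arbitrary illustrative stress kernel, not the patented method's specific algorithm.

```python
import hashlib

# General pattern of the method described above: run a deterministic,
# heat-generating computation, then compare its output against a reference
# output for the same inputs; any mismatch indicates a hardware error
# occurred somewhere during the run.  The stress kernel is an arbitrary
# illustrative choice.

def stress_kernel(n: int) -> bytes:
    """Long-running, deterministic, compute-heavy routine."""
    acc = 0
    for i in range(1, n):
        acc = (acc * 6364136223846793005 + i) & (2**64 - 1)
        acc ^= acc >> 29
    return acc.to_bytes(8, "little")

def run_and_check(n: int, reference_digest: str) -> bool:
    digest = hashlib.sha256(stress_kernel(n)).hexdigest()
    return digest == reference_digest    # False => hardware error detected

reference = hashlib.sha256(stress_kernel(200_000)).hexdigest()
print("hardware error detected:", not run_and_check(200_000, reference))
```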

  12. Field Validation of Food Service Listings: A Comparison of Commercial and Online Geographic Information System Databases

    PubMed Central

    Seliske, Laura; Pickett, William; Bates, Rebecca; Janssen, Ian

    2012-01-01

    Many studies examining the food retail environment rely on geographic information system (GIS) databases for location information. The purpose of this study was to validate information provided by two GIS databases, comparing the positional accuracy of food service places within a 1 km circular buffer surrounding 34 schools in Ontario, Canada. A commercial database (InfoCanada) and an online database (Yellow Pages) provided the addresses of food service places. Actual locations were measured using a global positioning system (GPS) device. The InfoCanada and Yellow Pages GIS databases provided the locations for 973 and 675 food service places, respectively. Overall, 749 (77.1%) and 595 (88.2%) of these were located in the field. The online database had a higher proportion of food service places found in the field. The GIS locations of 25% of the food service places were located within approximately 15 m of their actual location, 50% were within 25 m, and 75% were within 50 m. This validation study provided a detailed assessment of errors in the measurement of the location of food service places in the two databases. The location information was more accurate for the online database, however, when matching criteria were more conservative, there were no observed differences in error between the databases. PMID:23066385
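
    The positional error between a database-listed location and its GPS-measured location, and the percentile summary used above, can be reproduced with a great-circle (haversine) distance. The coordinates in the example below are synthetic, not data from the study.

```python
import math, statistics

# Positional error between database-listed and GPS-measured locations,
# summarized by percentiles.  Haversine great-circle distance is used;
# the coordinates below are synthetic illustrative values.

def haversine_m(lat1, lon1, lat2, lon2, radius_m=6_371_000.0):
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * radius_m * math.asin(math.sqrt(a))

gis_points = [(44.2312, -76.4860), (44.2330, -76.4901), (44.2290, -76.4855)]
gps_points = [(44.2313, -76.4862), (44.2329, -76.4899), (44.2295, -76.4850)]

errors = [haversine_m(*g, *m) for g, m in zip(gis_points, gps_points)]
q25, q50, q75 = statistics.quantiles(errors, n=4)
print([round(e, 1) for e in errors], "median:", round(q50, 1), "m")
```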

  13. Field validation of food service listings: a comparison of commercial and online geographic information system databases.

    PubMed

    Seliske, Laura; Pickett, William; Bates, Rebecca; Janssen, Ian

    2012-08-01

    Many studies examining the food retail environment rely on geographic information system (GIS) databases for location information. The purpose of this study was to validate information provided by two GIS databases, comparing the positional accuracy of food service places within a 1 km circular buffer surrounding 34 schools in Ontario, Canada. A commercial database (InfoCanada) and an online database (Yellow Pages) provided the addresses of food service places. Actual locations were measured using a global positioning system (GPS) device. The InfoCanada and Yellow Pages GIS databases provided the locations for 973 and 675 food service places, respectively. Overall, 749 (77.1%) and 595 (88.2%) of these were located in the field. The online database had a higher proportion of food service places found in the field. The GIS locations of 25% of the food service places were located within approximately 15 m of their actual location, 50% were within 25 m, and 75% were within 50 m. This validation study provided a detailed assessment of errors in the measurement of the location of food service places in the two databases. The location information was more accurate for the online database, however, when matching criteria were more conservative, there were no observed differences in error between the databases.

  14. Synthesis of Arbitrary Quantum Circuits to Topological Assembly: Systematic, Online and Compact.

    PubMed

    Paler, Alexandru; Fowler, Austin G; Wille, Robert

    2017-09-05

    It is challenging to transform an arbitrary quantum circuit into a form protected by surface code quantum error correcting codes (a variant of topological quantum error correction), especially if the goal is to minimise overhead. One of the issues is the efficient placement of magic state distillation sub circuits, so-called distillation boxes, in the space-time volume that abstracts the computation's required resources. This work presents a general, systematic, online method for the synthesis of such circuits. Distillation box placement is controlled by so-called schedulers. The work introduces a greedy scheduler generating compact box placements. The implemented software, whose source code is available at www.github.com/alexandrupaler/tqec, is used to illustrate and discuss synthesis examples. Synthesis and optimisation improvements are proposed.

  15. Development and experimental verification of a robust active noise control system for a diesel engine in submarines

    NASA Astrophysics Data System (ADS)

    Sachau, D.; Jukkert, S.; Hövelmann, N.

    2016-08-01

    This paper presents the development and experimental validation of an ANC (active noise control) system designed for a particular application in the exhaust line of a submarine. With this system, tonal components of the exhaust noise in the frequency band from 75 Hz to 120 Hz are reduced by more than 30 dB. The ANC system is based on the feedforward leaky FxLMS algorithm. The observability of the sound pressure in the standing wave field is ensured by using two error microphones. The noninvasive online plant identification method is used to increase the robustness of the controller. Online plant identification is extended by a time-varying convergence gain to improve performance in the presence of a slight error in the frequency of the reference signal.
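
    At the core of such a controller is the leaky FxLMS update: the reference signal is filtered through an estimate of the secondary path, and the adaptive weights are updated with an additional leakage term. The sketch below is a generic single-channel illustration with synthetic signals and an assumed secondary-path estimate; it is not the two-error-microphone submarine implementation.

```python
import numpy as np

# Minimal single-channel leaky FxLMS sketch: the reference is filtered by an
# estimate of the secondary path, and the adaptive weights are updated with a
# leakage term.  Generic illustration with synthetic signals and a synthetic
# secondary path; not the two-microphone submarine controller.

fs, n = 2000, 8000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 100 * t)                      # tonal reference (100 Hz)
primary = 0.8 * np.sin(2 * np.pi * 100 * t + 0.6)    # noise at the error mic
s_hat = np.array([0.0, 0.5, 0.3])                    # secondary-path estimate (assumed, also used as the true path here)

L, mu, leak = 16, 0.01, 1e-4
w = np.zeros(L)
x_buf, xf_buf = np.zeros(L), np.zeros(L)
sec_buf = np.zeros(len(s_hat))
residual = np.zeros(n)

for i in range(n):
    x_buf = np.roll(x_buf, 1); x_buf[0] = x[i]
    y = w @ x_buf                                    # anti-noise sample
    sec_buf = np.roll(sec_buf, 1); sec_buf[0] = y
    e = primary[i] + s_hat @ sec_buf                 # residual at the error mic
    xf = s_hat @ x_buf[:len(s_hat)]                  # filtered-reference sample
    xf_buf = np.roll(xf_buf, 1); xf_buf[0] = xf
    w = (1 - mu * leak) * w - mu * e * xf_buf        # leaky FxLMS update
    residual[i] = e

print("residual power, first vs last second:",
      round(np.mean(residual[:fs] ** 2), 4), round(np.mean(residual[-fs:] ** 2), 6))
```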

  16. Toward On-line Parameter Estimation of Concentric Tube Robots Using a Mechanics-based Kinematic Model

    PubMed Central

    Jang, Cheongjae; Ha, Junhyoung; Dupont, Pierre E.; Park, Frank Chongwoo

    2017-01-01

    Although existing mechanics-based models of concentric tube robots have been experimentally demonstrated to approximate the actual kinematics, determining accurate estimates of model parameters remains difficult due to the complex relationship between the parameters and available measurements. Further, because the mechanics-based models neglect some phenomena like friction, nonlinear elasticity, and cross section deformation, it is also not clear if model error is due to model simplification or to parameter estimation errors. The parameters of the superelastic materials used in these robots can be slowly time-varying, necessitating periodic re-estimation. This paper proposes a method for estimating the mechanics-based model parameters using an extended Kalman filter as a step toward on-line parameter estimation. Our methodology is validated through both simulation and experiments. PMID:28717554
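
    A generic form of the filter step is shown below: the parameters are treated as a slowly varying (random-walk) state, and each measurement updates the estimate through the linearized measurement Jacobian. The toy measurement model is an illustrative stand-in, not the concentric tube robot kinematics from the paper.

```python
import numpy as np

# Generic EKF for slowly varying parameters: the state is the parameter
# vector (random-walk model) and each measurement updates the estimate via
# the linearized measurement Jacobian.  The toy measurement model below is
# an illustrative stand-in, not the concentric tube robot model.

def ekf_step(theta, P, z, h, H_jac, Q, R):
    # Predict: random-walk parameters, so the state is unchanged and the
    # covariance grows by the process noise.
    P = P + Q
    # Update with measurement z.
    H = H_jac(theta)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    theta = theta + K @ (z - h(theta))
    P = (np.eye(len(theta)) - K @ H) @ P
    return theta, P

# Toy model: y = [a*sin(b), a*cos(b)] with true (a, b) = (2.0, 0.5).
h = lambda th: np.array([th[0] * np.sin(th[1]), th[0] * np.cos(th[1])])
H_jac = lambda th: np.array([[np.sin(th[1]),  th[0] * np.cos(th[1])],
                             [np.cos(th[1]), -th[0] * np.sin(th[1])]])

rng = np.random.default_rng(2)
true_theta = np.array([2.0, 0.5])
theta, P = np.array([1.0, 0.0]), np.eye(2)
Q, R = 1e-6 * np.eye(2), 1e-3 * np.eye(2)
for _ in range(200):
    z = h(true_theta) + rng.normal(0, np.sqrt(1e-3), size=2)
    theta, P = ekf_step(theta, P, z, h, H_jac, Q, R)
print(np.round(theta, 3))   # approaches [2.0, 0.5]
```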

  17. IMU-Based Online Kinematic Calibration of Robot Manipulator

    PubMed Central

    2013-01-01

    Robot calibration is a useful diagnostic method for improving the positioning accuracy in robot production and maintenance. An online robot self-calibration method based on inertial measurement unit (IMU) is presented in this paper. The method requires that the IMU is rigidly attached to the robot manipulator, which makes it possible to obtain the orientation of the manipulator with the orientation of the IMU in real time. This paper proposed an efficient approach which incorporates Factored Quaternion Algorithm (FQA) and Kalman Filter (KF) to estimate the orientation of the IMU. Then, an Extended Kalman Filter (EKF) is used to estimate kinematic parameter errors. Using this proposed orientation estimation method will result in improved reliability and accuracy in determining the orientation of the manipulator. Compared with the existing vision-based self-calibration methods, the great advantage of this method is that it does not need the complex steps, such as camera calibration, images capture, and corner detection, which make the robot calibration procedure more autonomous in a dynamic manufacturing environment. Experimental studies on a GOOGOL GRB3016 robot show that this method has better accuracy, convenience, and effectiveness than vision-based methods. PMID:24302854

  18. Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs

    NASA Astrophysics Data System (ADS)

    Ringenburg, Michael F.

    Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.

  19. What are incident reports telling us? A comparative study at two Australian hospitals of medication errors identified at audit, detected by staff and reported to an incident system.

    PubMed

    Westbrook, Johanna I; Li, Ling; Lehnbom, Elin C; Baysari, Melissa T; Braithwaite, Jeffrey; Burke, Rosemary; Conn, Chris; Day, Richard O

    2015-02-01

    To (i) compare medication errors identified at audit and observation with medication incident reports; (ii) identify differences between two hospitals in incident report frequency and medication error rates; (iii) identify prescribing error detection rates by staff. Audit of 3291 patient records at two hospitals to identify prescribing errors and evidence of their detection by staff. Medication administration errors were identified from a direct observational study of 180 nurses administering 7451 medications. Severity of errors was classified. Those likely to lead to patient harm were categorized as 'clinically important'. Two major academic teaching hospitals in Sydney, Australia. Rates of medication errors identified from audit and from direct observation were compared with reported medication incident reports. A total of 12 567 prescribing errors were identified at audit. Of these, 1.2/1000 errors (95% CI: 0.6-1.8) had incident reports. Clinically important prescribing errors (n = 539) were detected by staff at a rate of 218.9/1000 (95% CI: 184.0-253.8), but only 13.0/1000 (95% CI: 3.4-22.5) were reported. 78.1% (n = 421) of clinically important prescribing errors were not detected. A total of 2043 drug administrations (27.4%; 95% CI: 26.4-28.4%) contained ≥1 error; none had an incident report. Hospital A had a higher frequency of incident reports than Hospital B, but a lower rate of errors at audit. Prescribing errors with the potential to cause harm frequently go undetected. Reported incidents do not reflect the profile of medication errors which occur in hospitals or the underlying rates. This demonstrates the inaccuracy of using incident frequency to compare patient risk or quality performance within or across hospitals. New approaches including data mining of electronic clinical information systems are required to support more effective medication error detection and mitigation. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care.
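    The reporting rates above are quoted per 1000 errors with 95% confidence intervals; the short Python sketch below reproduces that style of calculation with a normal-approximation interval. The counts used are placeholders, not the study data.

```python
# Rate per 1000 with a normal-approximation 95% CI (placeholder counts).
import math

def rate_per_1000(events, total):
    p = events / total
    rate = 1000.0 * p
    half_width = 1000.0 * 1.96 * math.sqrt(p * (1.0 - p) / total)
    return rate, (rate - half_width, rate + half_width)

rate, ci = rate_per_1000(events=15, total=12567)   # hypothetical counts
print(f"{rate:.1f}/1000 (95% CI: {ci[0]:.1f}-{ci[1]:.1f})")
```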

  20. Experimental investigation of observation error in anuran call surveys

    USGS Publications Warehouse

    McClintock, B.T.; Bailey, L.L.; Pollock, K.H.; Simons, T.R.

    2010-01-01

    Occupancy models that account for imperfect detection are often used to monitor anuran and songbird species occurrence. However, presence-absence data arising from auditory detections may be more prone to observation error (e.g., false-positive detections) than are sampling approaches utilizing physical captures or sightings of individuals. We conducted realistic, replicated field experiments using a remote broadcasting system to simulate simple anuran call surveys and to investigate potential factors affecting observation error in these studies. Distance, time, ambient noise, and observer abilities were the most important factors explaining false-negative detections. Distance and observer ability were the best overall predictors of false-positive errors, but ambient noise and competing species also affected error rates for some species. False-positive errors made up 5% of all positive detections, with individual observers exhibiting false-positive rates between 0.5% and 14%. Previous research suggests false-positive errors of these magnitudes would induce substantial positive biases in standard estimators of species occurrence, and we recommend practices to mitigate false positives when developing occupancy monitoring protocols that rely on auditory detections. These recommendations include additional observer training, limiting the number of target species, and establishing distance and ambient noise thresholds during surveys. © 2010 The Wildlife Society.

  1. A system to use electromagnetic tracking for the quality assurance of brachytherapy catheter digitization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damato, Antonio L., E-mail: adamato@lroc.harvard.edu; Viswanathan, Akila N.; Don, Sarah M.

    2014-10-15

    Purpose: To investigate the use of a system using electromagnetic tracking (EMT), post-processing and an error-detection algorithm for detecting errors and resolving uncertainties in high-dose-rate brachytherapy catheter digitization for treatment planning. Methods: EMT was used to localize 15 catheters inserted into a phantom using a stepwise acquisition technique. Five distinct acquisition experiments were performed. Noise associated with the acquisition was calculated. The dwell location configuration was extracted from the EMT data. A CT scan of the phantom was performed, and five distinct catheter digitization sessions were performed. No a priori registration of the CT scan coordinate system with the EMT coordinate system was performed. CT-based digitization was automatically extracted from the brachytherapy plan DICOM files (CT), and rigid registration was performed between EMT and CT dwell positions. EMT registration error was characterized in terms of the mean and maximum distance between corresponding EMT and CT dwell positions per catheter. An algorithm for error detection and identification was presented. Three types of errors were systematically simulated: swap of two catheter numbers, partial swap of catheter number identification for parts of the catheters (mix), and catheter-tip shift. Error-detection sensitivity (number of simulated scenarios correctly identified as containing an error/number of simulated scenarios containing an error) and specificity (number of scenarios correctly identified as not containing errors/number of correct scenarios) were calculated. Catheter identification sensitivity (number of catheters correctly identified as erroneous across all scenarios/number of erroneous catheters across all scenarios) and specificity (number of catheters correctly identified as correct across all scenarios/number of correct catheters across all scenarios) were calculated. The mean detected and identified shift was calculated. Results: The maximum noise ±1 standard deviation associated with the EMT acquisitions was 1.0 ± 0.1 mm, and the mean noise was 0.6 ± 0.1 mm. Registration of all the EMT and CT dwell positions was associated with a mean catheter error of 0.6 ± 0.2 mm, a maximum catheter error of 0.9 ± 0.4 mm, a mean dwell error of 1.0 ± 0.3 mm, and a maximum dwell error of 1.3 ± 0.7 mm. Error detection and catheter identification sensitivity and specificity of 100% were observed for swap, mix and shift (≥2.6 mm for error detection; ≥2.7 mm for catheter identification) errors. A mean detected shift of 1.8 ± 0.4 mm and a mean identified shift of 1.9 ± 0.4 mm were observed. Conclusions: Registration of the EMT dwell positions to the CT dwell positions was possible with a residual mean error per catheter of 0.6 ± 0.2 mm and a maximum error for any dwell of 1.3 ± 0.7 mm. These low residual registration errors show that quality assurance of the general characteristics of the catheters and of possible errors affecting one specific dwell position is possible. The sensitivity and specificity of the catheter digitization verification algorithm was 100% for swap and mix errors and for shifts ≥2.6 mm. On average, shifts ≥1.8 mm were detected, and shifts ≥1.9 mm were detected and identified.
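    The registration step described above aligns EMT dwell positions to CT dwell positions rigidly and then summarizes residual distances per catheter. The sketch below shows one conventional way to do this (an SVD-based Kabsch solution); it is an assumption-laden simplification of the paper's workflow, with corresponding N x 3 point sets taken as given.

```python
# Rigid (rotation + translation) registration of corresponding point sets,
# followed by mean/maximum residual distances.
import numpy as np

def rigid_register(src, dst):
    """Return R, t minimising ||R @ src_i + t - dst_i|| in the least-squares sense."""
    src_c, dst_c = src - src.mean(axis=0), dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def residual_errors(emt, ct):
    """emt, ct: corresponding N x 3 dwell positions for one catheter."""
    R, t = rigid_register(emt, ct)
    d = np.linalg.norm((emt @ R.T + t) - ct, axis=1)
    return d.mean(), d.max()
```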

  2. Register file soft error recovery

    DOEpatents

    Fleischer, Bruce M.; Fox, Thomas W.; Wait, Charles D.; Muff, Adam J.; Watson, III, Alfred T.

    2013-10-15

    Register file soft error recovery including a system that includes a first register file and a second register file that mirrors the first register file. The system also includes an arithmetic pipeline for receiving data read from the first register file, and error detection circuitry to detect whether the data read from the first register file includes corrupted data. The system further includes error recovery circuitry to insert an error recovery instruction into the arithmetic pipeline in response to detecting the corrupted data. The inserted error recovery instruction replaces the corrupted data in the first register file with a copy of the data from the second register file.

  3. Diffraction analysis and evaluation of several focus- and track-error detection schemes for magneto-optical disk systems

    NASA Technical Reports Server (NTRS)

    Bernacki, Bruce E.; Mansuripur, M.

    1992-01-01

    A commonly used tracking method on pre-grooved magneto-optical (MO) media is the push-pull technique, and the astigmatic method is a popular focus-error detection approach. These two methods are analyzed using DIFFRACT, a general-purpose scalar diffraction modeling program, to observe the effects on the error signals due to focusing lens misalignment, Seidel aberrations, and optical crosstalk (feedthrough) between the focusing and tracking servos. Using the results of the astigmatic/push-pull system as a basis for comparison, a novel focus/track-error detection technique that utilizes a ring toric lens is evaluated as well as the obscuration method (focus error detection only).

  4. Error detection and correction unit with built-in self-test capability for spacecraft applications

    NASA Technical Reports Server (NTRS)

    Timoc, Constantin

    1990-01-01

    The objective of this project was to research and develop a 32-bit single chip Error Detection and Correction unit capable of correcting all single bit errors and detecting all double bit errors in the memory systems of a spacecraft. We designed the 32-bit EDAC (Error Detection and Correction unit) based on a modified Hamming code and according to the design specifications and performance requirements. We constructed a laboratory prototype (breadboard) which was converted into a fault simulator. The correctness of the design was verified on the breadboard using an exhaustive set of test cases. A logic diagram of the EDAC was delivered to JPL Section 514 on 4 Oct. 1988.
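    To make the single-error-correct/double-error-detect behaviour concrete, the sketch below implements a toy SEC-DED scheme: a Hamming(7,4) code extended with an overall parity bit. It illustrates the principle only and is not the 32-bit modified Hamming code used in the EDAC chip.

```python
# Toy SEC-DED: Hamming(7,4) plus an overall parity bit.
def encode(d):                               # d: list of 4 data bits, e.g. [1, 0, 1, 1]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p4 = d[1] ^ d[2] ^ d[3]
    code = [p1, p2, d[0], p4, d[1], d[2], d[3]]   # Hamming(7,4), positions 1..7
    overall = 0
    for b in code:
        overall ^= b
    return code + [overall]                       # extra overall-parity bit -> SEC-DED

def decode(code):
    c = list(code[:7])
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4               # 1-based position of a single-bit error
    overall = 0
    for b in code:
        overall ^= b                              # parity over all 8 received bits
    if syndrome == 0 and overall == 0:
        status = "no error"
    elif overall == 1:                            # odd parity: a single-bit error somewhere
        if syndrome:
            c[syndrome - 1] ^= 1                  # correct a bit in positions 1..7
        status = "single error corrected"
    else:                                         # even parity but nonzero syndrome: two errors
        status = "double error detected"
    return [c[2], c[4], c[5], c[6]], status

word = encode([1, 0, 1, 1])
word[2] ^= 1                                      # inject a single-bit error
print(decode(word))                               # -> ([1, 0, 1, 1], 'single error corrected')
```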

  5. Climbing fibers predict movement kinematics and performance errors.

    PubMed

    Streng, Martha L; Popa, Laurentiu S; Ebner, Timothy J

    2017-09-01

    Requisite for understanding cerebellar function is a complete characterization of the signals provided by complex spike (CS) discharge of Purkinje cells, the output neurons of the cerebellar cortex. Numerous studies have provided insights into CS function, with the most predominant view being that they are evoked by error events. However, several reports suggest that CSs encode other aspects of movements and do not always respond to errors or unexpected perturbations. Here, we evaluated CS firing during a pseudo-random manual tracking task in the monkey (Macaca mulatta). This task provides extensive coverage of the work space and relative independence of movement parameters, delivering a robust data set to assess the signals that activate climbing fibers. Using reverse correlation, we determined feedforward and feedback CS firing probability maps with position, velocity, and acceleration, as well as position error, a measure of tracking performance. The direction and magnitude of the CS modulation were quantified using linear regression analysis. The major findings are that CSs significantly encode all three kinematic parameters and position error, with acceleration modulation particularly common. The modulation is not related to "events," either for position error or kinematics. Instead, CSs are spatially tuned and provide a linear representation of each parameter evaluated. The CS modulation is largely predictive. Similar analyses show that the simple spike firing is modulated by the same parameters as the CSs. Therefore, CSs carry a broader array of signals than previously described and argue for climbing fiber input having a prominent role in online motor control. NEW & NOTEWORTHY This article demonstrates that complex spike (CS) discharge of cerebellar Purkinje cells encodes multiple parameters of movement, including motor errors and kinematics. The CS firing is not driven by error or kinematic events; instead it provides a linear representation of each parameter. In contrast with the view that CSs carry feedback signals, the CSs are predominantly predictive of upcoming position errors and kinematics. Therefore, climbing fibers carry multiple and predictive signals for online motor control. Copyright © 2017 the American Physiological Society.

  6. Online Estimation of Allan Variance Coefficients Based on a Neural-Extended Kalman Filter

    PubMed Central

    Miao, Zhiyong; Shen, Feng; Xu, Dingjie; He, Kunpeng; Tian, Chunmiao

    2015-01-01

    As a noise analysis method for inertial sensors, the traditional Allan variance method requires the storage of a large amount of data and manual analysis for an Allan variance graph. Although the existing online estimation methods avoid the storage of data and the painful procedure of drawing slope lines for estimation, they require complex transformations and even cause errors during the modeling of dynamic Allan variance. To solve these problems, first, a new state-space model that directly models the stochastic errors to obtain a nonlinear state-space model was established for inertial sensors. Then, a neural-extended Kalman filter algorithm was used to estimate the Allan variance coefficients. The real noises of an ADIS16405 IMU and fiber optic gyro-sensors were analyzed by the proposed method and traditional methods. The experimental results show that the proposed method is more suitable to estimate the Allan variance coefficients than the traditional methods. Moreover, the proposed method effectively avoids the storage of data and can be easily implemented using an online processor. PMID:25625903
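    For reference, the classical batch computation that the online estimators are compared against is the (overlapping) Allan variance of a recorded rate signal. The sketch below computes it from synthetic white-noise data; sampling rate, averaging factors, and noise level are illustrative assumptions.

```python
# Overlapping Allan variance of a gyro rate record (synthetic example).
import numpy as np

def allan_variance(rate, fs, m_list):
    tau0 = 1.0 / fs
    theta = np.cumsum(rate) * tau0           # integrated signal (e.g., angle)
    n = len(theta)
    taus, avars = [], []
    for m in m_list:
        if 2 * m >= n:
            break
        d = theta[2 * m:] - 2.0 * theta[m:n - m] + theta[:n - 2 * m]
        avars.append(np.sum(d ** 2) / (2.0 * (m * tau0) ** 2 * (n - 2 * m)))
        taus.append(m * tau0)
    return np.array(taus), np.array(avars)

fs = 100.0
rate = 0.01 * np.random.randn(100_000)       # synthetic white-noise rate signal
taus, avars = allan_variance(rate, fs, m_list=[1, 2, 5, 10, 20, 50, 100, 200])
# For pure white noise, sqrt(avar) falls off as 1/sqrt(tau) on a log-log plot.
```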

  7. [On-line monitoring of biomass in 1,3-propanediol fermentation by Fourier-transformed near-infrared spectra analysis].

    PubMed

    Wang, Lu; Liu, Tao; Chen, Yang; Sun, Yaqin; Xiu, Zhilong

    2017-01-25

    Biomass is an important parameter reflecting the fermentation dynamics. Real-time monitoring of biomass can be used to control and optimize a fermentation process. To overcome the deficiencies of measurement delay and manual errors from offline measurement, we designed an experimental platform for online monitoring of the biomass during a 1,3-propanediol fermentation process, based on Fourier-transformed near-infrared (FT-NIR) spectra analysis. By pre-processing the real-time sampled spectra and analyzing the sensitive spectral bands, a partial least-squares algorithm was proposed to establish a dynamic prediction model for the biomass change during a 1,3-propanediol fermentation process. Fermentation processes with substrate glycerol concentrations of 60 g/L and 40 g/L were used as the external validation experiments. The root mean square error of prediction (RMSEP) obtained by analyzing the experimental data was 0.3416 and 0.2743, respectively. These results showed that the established model gave good predictions and could be effectively used for online monitoring of the biomass during a 1,3-propanediol fermentation process.
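    A minimal sketch of this kind of calibration workflow is shown below: a partial least-squares model fitted to NIR spectra against offline biomass values and scored by RMSEP on an external validation set. Variable names, the component count, and the use of scikit-learn are assumptions, not details taken from the paper.

```python
# PLS calibration of spectra X (samples x wavelengths) against reference values y,
# evaluated by root mean square error of prediction (RMSEP).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def build_pls_model(X_cal, y_cal, n_components=8):
    pls = PLSRegression(n_components=n_components)
    pls.fit(X_cal, y_cal)
    return pls

def rmsep(model, X_val, y_val):
    y_pred = model.predict(X_val).ravel()
    return float(np.sqrt(np.mean((y_val - y_pred) ** 2)))
```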

  8. Adaptive h-refinement for reduced-order models

    DOE PAGES

    Carlberg, Kevin T.

    2014-11-05

    Our work presents a method to adaptively refine reduced-order models a posteriori without requiring additional full-order-model solves. The technique is analogous to mesh-adaptive h-refinement: it enriches the reduced-basis space online by ‘splitting’ a given basis vector into several vectors with disjoint support. The splitting scheme is defined by a tree structure constructed offline via recursive k-means clustering of the state variables using snapshot data. This method identifies the vectors to split online using a dual-weighted-residual approach that aims to reduce error in an output quantity of interest. The resulting method generates a hierarchy of subspaces online without requiring large-scale operations or full-order-model solves. Furthermore, it enables the reduced-order model to satisfy any prescribed error tolerance regardless of its original fidelity, as a completely refined reduced-order model is mathematically equivalent to the original full-order model. Experiments on a parameterized inviscid Burgers equation highlight the ability of the method to capture phenomena (e.g., moving shocks) not contained in the span of the original reduced basis.

  9. The Effect of Error Correction vs. Error Detection on Iranian Pre-Intermediate EFL Learners' Writing Achievement

    ERIC Educational Resources Information Center

    Abedi, Razie; Latifi, Mehdi; Moinzadeh, Ahmad

    2010-01-01

    This study tries to answer some ever-existent questions in writing fields regarding approaching the most effective ways to give feedback to students' errors in writing by comparing the effect of error correction and error detection on the improvement of students' writing ability. In order to achieve this goal, 60 pre-intermediate English learners…

  10. Fault-tolerant quantum error detection

    PubMed Central

    Linke, Norbert M.; Gutierrez, Mauricio; Landsman, Kevin A.; Figgatt, Caroline; Debnath, Shantanu; Brown, Kenneth R.; Monroe, Christopher

    2017-01-01

    Quantum computers will eventually reach a size at which quantum error correction becomes imperative. Quantum information can be protected from qubit imperfections and flawed control operations by encoding a single logical qubit in multiple physical qubits. This redundancy allows the extraction of error syndromes and the subsequent detection or correction of errors without destroying the logical state itself through direct measurement. We show the encoding and syndrome measurement of a fault-tolerantly prepared logical qubit via an error detection protocol on four physical qubits, represented by trapped atomic ions. This demonstrates the robustness of a logical qubit to imperfections in the very operations used to encode it. The advantage persists in the face of large added error rates and experimental calibration errors. PMID:29062889

  11. Improving NGDC Track-line Data Quality Control

    NASA Astrophysics Data System (ADS)

    Chandler, M. T.; Wessel, P.

    2004-12-01

    Ship-board gravity, magnetic and bathymetry data archived at the National Geophysical Data Center (NGDC) represent decades of seagoing research, containing over 4,500 cruises. Cruise data remain relevant despite the prominence of satellite altimetry-derived global grids because many geologic processes remain resolvable by oceanographic research alone. Due to the tremendous investment put forth by scientists and taxpayers to compile this vast archive and the significant errors found within it, additional quality assessment and corrections are warranted. These can best be accomplished by adding to existing quality control measures at NGDC. We are currently developing open source software to provide additional quality control. Along with NGDC's current sanity checking, new data at NGDC will also be subjected to an along-track "sniffer" which will detect and flag suspicious data for later graphical inspection using a visual editor. If new data pass these tests, they will undergo further scrutiny using a crossover error (COE) calculator which will compare new data values to existing values at points of intersection within the archive. Data passing these tests will be deemed "quality data" and suitable for permanent addition to the archive, while data that fail will be returned to the source institution for correction. Crossover errors will be stored and an online COE database will be available. The COE database will allow users to apply corrections to the NGDC track-line database to produce corrected data files. At no time will the archived data itself be modified. An attempt will also be made to reduce navigational errors for pre-GPS navigated cruises. Upon completion these programs will be used to explore and model systematic errors within the archive, generate correction tables for all cruises, and to quantify the error budget in marine geophysical observations. Software will be released and these procedures will be implemented in cooperation with NGDC staff.
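    The crossover error idea mentioned above can be sketched in a few lines: locate intersections between two tracks, interpolate each track's measured value at the crossing, and difference them. The planar segment-intersection test below is a simplification; real track-line processing (spherical geometry, navigation corrections, along-track filtering) is omitted.

```python
# Crossover error (COE) sketch for two survey tracks in a planar projection.
import numpy as np

def cross2(a, b):
    return a[0] * b[1] - a[1] * b[0]

def segment_intersection(p1, p2, q1, q2):
    """Return (t, u) parameters if segments p1-p2 and q1-q2 cross, else None."""
    r, s = p2 - p1, q2 - q1
    denom = cross2(r, s)
    if abs(denom) < 1e-12:
        return None                                   # parallel or degenerate
    t = cross2(q1 - p1, s) / denom
    u = cross2(q1 - p1, r) / denom
    if 0.0 <= t <= 1.0 and 0.0 <= u <= 1.0:
        return t, u
    return None

def crossover_errors(track_a, val_a, track_b, val_b):
    """Tracks are N x 2 position arrays; val_* the measured field values along them."""
    coes = []
    for i in range(len(track_a) - 1):
        for j in range(len(track_b) - 1):
            hit = segment_intersection(track_a[i], track_a[i + 1],
                                       track_b[j], track_b[j + 1])
            if hit is not None:
                t, u = hit
                a_val = (1 - t) * val_a[i] + t * val_a[i + 1]   # linear interpolation
                b_val = (1 - u) * val_b[j] + u * val_b[j + 1]
                coes.append(a_val - b_val)
    return np.array(coes)
```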

  12. Confidence-Based Data Association and Discriminative Deep Appearance Learning for Robust Online Multi-Object Tracking.

    PubMed

    Bae, Seung-Hwan; Yoon, Kuk-Jin

    2018-03-01

    Online multi-object tracking aims at estimating the tracks of multiple objects instantly with each incoming frame and the information provided up to that moment. It still remains a difficult problem in complex scenes, because of the large ambiguity in associating multiple objects in consecutive frames and the low discriminability between object appearances. In this paper, we propose a robust online multi-object tracking method that can handle these difficulties effectively. We first define the tracklet confidence using the detectability and continuity of a tracklet, and decompose a multi-object tracking problem into small subproblems based on the tracklet confidence. We then solve the online multi-object tracking problem by associating tracklets and detections in different ways according to their confidence values. Based on this strategy, tracklets sequentially grow with online-provided detections, and fragmented tracklets are linked up with others without any iterative and expensive association steps. For more reliable association between tracklets and detections, we also propose a deep appearance learning method to learn a discriminative appearance model from large training datasets, since conventional appearance learning methods do not provide a rich representation that can distinguish multiple objects with large appearance variations. In addition, we combine online transfer learning to improve appearance discriminability by adapting the pre-trained deep model during online tracking. Experiments with challenging public datasets show distinct performance improvement over other state-of-the-art batch and online tracking methods, and prove the effectiveness and usefulness of the proposed methods for online multi-object tracking.

  13. Alumina Concentration Detection Based on the Kernel Extreme Learning Machine.

    PubMed

    Zhang, Sen; Zhang, Tao; Yin, Yixin; Xiao, Wendong

    2017-09-01

    The concentration of alumina in the electrolyte is of great significance during the production of aluminum. An unsuitable alumina concentration may lead to unbalanced material distribution and low production efficiency, and may affect the stability of the aluminum reduction cell and the current efficiency. Existing methods cannot meet the needs for online measurement because industrial aluminum electrolysis has the characteristics of high temperature, strong magnetic fields, coupled parameters, and high nonlinearity. Currently, there are no sensors or equipment that can detect the alumina concentration online. Most companies acquire the alumina concentration from electrolyte samples analyzed with an X-ray fluorescence spectrometer. To solve this problem, the paper proposes a soft sensing model based on a kernel extreme learning machine algorithm that incorporates a kernel function into the extreme learning machine. K-fold cross validation is used to estimate the generalization error. The proposed soft sensing algorithm can detect the alumina concentration from electrical signals such as the voltages and currents of the anode rods. The prediction results show that the proposed approach gives more accurate estimates of the alumina concentration with faster learning speed compared with other methods such as the basic ELM, BP, and SVM.
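    The kernel extreme learning machine reduces to a closed-form regularized least-squares solution in kernel space. The sketch below shows that standard formulation with an RBF kernel; the feature construction from anode-rod voltages and currents, and the choice of C and gamma, are placeholders rather than the paper's setup.

```python
# Minimal kernel extreme learning machine (KELM) regressor with an RBF kernel.
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    return np.exp(-gamma * d2)

class KELMRegressor:
    def __init__(self, C=100.0, gamma=0.1):
        self.C, self.gamma = C, gamma

    def fit(self, X, y):
        self.X_train = X
        K = rbf_kernel(X, X, self.gamma)
        # beta = (K + I/C)^-1 y : regularised least squares in kernel space
        self.beta = np.linalg.solve(K + np.eye(len(X)) / self.C, y)
        return self

    def predict(self, X):
        return rbf_kernel(X, self.X_train, self.gamma) @ self.beta
```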

  14. Error-Analysis for Correctness, Effectiveness, and Composing Procedure.

    ERIC Educational Resources Information Center

    Ewald, Helen Rothschild

    The assumptions underpinning grammatical mistakes can often be detected by looking for patterns of errors in a student's work. Assumptions that negatively influence rhetorical effectiveness can similarly be detected through error analysis. On a smaller scale, error analysis can also reveal assumptions affecting rhetorical choice. Snags in the…

  15. Analysis of measured data of human body based on error correcting frequency

    NASA Astrophysics Data System (ADS)

    Jin, Aiyan; Peipei, Gao; Shang, Xiaomei

    2014-04-01

    Anthropometry measures all parts of the human body surface, and the measured data form the basis for analyzing and studying the human body, for establishing and modifying garment sizes, and for developing and operating online clothing stores. In this paper, several groups of measured data are collected, and the data errors are analyzed by examining the error frequency and applying the analysis of variance method from mathematical statistics. The accuracy of the measured data and the difficulty of measuring different parts of the human body are determined, the causes of data errors are studied further, and the key points for minimizing errors are summarized. This paper analyses the measured data based on error frequency and, in this way, provides reference material to support the development of the garment industry.

  16. Sleepiness and Safety: Where Biology Needs Technology.

    PubMed

    Abe, Takashi; Mollicone, Daniel; Basner, Mathias; Dinges, David F

    2014-04-01

    Maintaining human alertness and behavioral capability under conditions of sleep loss and circadian misalignment requires fatigue management technologies due to: (1) dynamic nonlinear modulation of performance capability by the interaction of sleep homeostatic drive and circadian regulation; (2) large differences among people in neurobehavioral vulnerability to sleep loss; (3) error in subjective estimates of the effects of fatigue on performance; and (4) the need to inform people of the need for recovery sleep. Two promising areas of technology have emerged for managing fatigue risk in safety-sensitive occupations. The first involves preventing fatigue by optimizing work schedules using biomathematical models of performance changes associated with sleep homeostatic and circadian dynamics. Increasingly these mathematical models account for individual differences to achieve a more accurate estimate of the timing and magnitude of fatigue effects on individuals. The second area involves technologies for detecting transient fatigue from drowsiness. The Psychomotor Vigilance Test (PVT), which has been extensively validated to be sensitive to deficits in attention from sleep loss and circadian misalignment, is an example in this category. Two shorter-duration versions of the PVT have recently been developed for evaluating whether operators have sufficient behavioral alertness prior to or during work. Another example is online tracking of the percentage of slow eyelid closures (PERCLOS), which has been shown to reflect momentary fluctuations of vigilance. Technologies for predicting and detecting sleepiness/fatigue have the potential to predict and prevent operator errors and accidents in safety-sensitive occupations, as well as physiological and mental diseases due to inadequate sleep and circadian misalignment.

  17. A novel variational Bayes multiple locus Z-statistic for genome-wide association studies with Bayesian model averaging

    PubMed Central

    Logsdon, Benjamin A.; Carty, Cara L.; Reiner, Alexander P.; Dai, James Y.; Kooperberg, Charles

    2012-01-01

    Motivation: For many complex traits, including height, the majority of variants identified by genome-wide association studies (GWAS) have small effects, leaving a significant proportion of the heritable variation unexplained. Although many penalized multiple regression methodologies have been proposed to increase the power to detect associations for complex genetic architectures, they generally lack mechanisms for false-positive control and diagnostics for model over-fitting. Our methodology is the first penalized multiple regression approach that explicitly controls Type I error rates and provides model over-fitting diagnostics through a novel normally distributed statistic defined for every marker within the GWAS, based on results from a variational Bayes spike regression algorithm. Results: We compare the performance of our method to the lasso and single marker analysis on simulated data and demonstrate that our approach has superior performance in terms of power and Type I error control. In addition, using the Women's Health Initiative (WHI) SNP Health Association Resource (SHARe) GWAS of African-Americans, we show that our method has power to detect additional novel associations with body height. These findings replicate, reaching a stringent cutoff of marginal association in a larger cohort. Availability: An R-package, including an implementation of our variational Bayes spike regression (vBsr) algorithm, is available at http://kooperberg.fhcrc.org/soft.html. Contact: blogsdon@fhcrc.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22563072

  18. TU-G-BRD-08: In-Vivo EPID Dosimetry: Quantifying the Detectability of Four Classes of Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, E; Phillips, M; Bojechko, C

    Purpose: EPID dosimetry is an emerging method for treatment verification and QA. Given that the in-vivo EPID technique is in clinical use at some centers, we investigate the sensitivity and specificity for detecting different classes of errors. We assess the impact of these errors using dose volume histogram endpoints. Though data exist for EPID dosimetry performed pre-treatment, this is the first study quantifying its effectiveness when used during patient treatment (in-vivo). Methods: We analyzed 17 patients; EPID images of the exit dose were acquired and used to reconstruct the planar dose at isocenter. This dose was compared to the TPS dose using a 3%/3mm gamma criteria. To simulate errors, modifications were made to treatment plans using four possible classes of error: 1) patient misalignment, 2) changes in patient body habitus, 3) machine output changes and 4) MLC misalignments. Each error was applied with varying magnitudes. To assess the detectability of the error, the area under a ROC curve (AUC) was analyzed. The AUC was compared to changes in D99 of the PTV introduced by the simulated error. Results: For systematic changes in the MLC leaves, changes in the machine output and patient habitus, the AUC varied from 0.78–0.97 scaling with the magnitude of the error. The optimal gamma threshold as determined by the ROC curve varied between 84–92%. There was little diagnostic power in detecting random MLC leaf errors and patient shifts (AUC 0.52–0.74). Some errors with weak detectability had large changes in D99. Conclusion: These data demonstrate the ability of EPID-based in-vivo dosimetry in detecting variations in patient habitus and errors related to machine parameters such as systematic MLC misalignments and machine output changes. There was no correlation found between the detectability of the error using the gamma pass rate, ROC analysis and the impact on the dose volume histogram. Funded by grant R18HS022244 from AHRQ.
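    The detectability analysis described above pools gamma pass rates from error-free and error-injected deliveries, computes the area under the ROC curve, and reads off an operating threshold. The sketch below reproduces that recipe on synthetic pass-rate values (the numbers are placeholders), using a Youden-index threshold choice as one plausible convention.

```python
# ROC/AUC detectability sketch on synthetic gamma pass rates.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
pass_error_free = rng.normal(97.0, 1.5, 50)    # % gamma pass, no simulated error
pass_with_error = rng.normal(90.0, 4.0, 50)    # % gamma pass, error injected

scores = np.concatenate([100 - pass_error_free, 100 - pass_with_error])  # higher = more suspicious
labels = np.concatenate([np.zeros(50), np.ones(50)])                     # 1 = error present

auc = roc_auc_score(labels, scores)
fpr, tpr, thresholds = roc_curve(labels, scores)
best = np.argmax(tpr - fpr)                    # Youden index picks the operating point
print(f"AUC = {auc:.2f}; flag deliveries with gamma pass rate < {100 - thresholds[best]:.1f}%")
```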

  19. On-line monitoring the extract process of Fu-fang Shuanghua oral solution using near infrared spectroscopy and different PLS algorithms

    NASA Astrophysics Data System (ADS)

    Kang, Qian; Ru, Qingguo; Liu, Yan; Xu, Lingyan; Liu, Jia; Wang, Yifei; Zhang, Yewen; Li, Hui; Zhang, Qing; Wu, Qing

    2016-01-01

    An on-line near infrared (NIR) spectroscopy monitoring method with an appropriate multivariate calibration method was developed for the extraction process of Fu-fang Shuanghua oral solution (FSOS). On-line NIR spectra were collected through two fiber optic probes, which were designed to transmit NIR radiation by a 2 mm flange. Partial least squares (PLS), interval PLS (iPLS) and synergy interval PLS (siPLS) algorithms were used comparatively for building the calibration regression models. During the extraction process, the feasibility of NIR spectroscopy was evaluated for determining the chlorogenic acid (CA) content, total phenolic acids contents (TPC), total flavonoids contents (TFC) and soluble solid contents (SSC). High performance liquid chromatography (HPLC), ultraviolet spectrophotometry (UV) and loss-on-drying methods were employed as reference methods. Experimental results showed that the performance of the siPLS model is the best compared with PLS and iPLS. The calibration models for CA, TPC, TFC and SSC had high determination coefficients (R2) (0.9948, 0.9992, 0.9950 and 0.9832) and low root mean square errors of cross validation (RMSECV) (0.0113, 0.0341, 0.1787 and 1.2158), which indicate a good correlation between reference values and NIR predicted values. The overall results show that the on-line detection method is feasible for real applications and would be of great value for monitoring the mixed decoction process of FSOS and other Chinese patent medicines.
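    A rough sketch of the synergy-interval PLS selection idea is given below: the spectrum is split into equal sub-intervals, a PLS model is cross-validated on every combination of a few intervals, and the combination with the lowest RMSECV is kept. Interval count, combination size, component number, and the use of scikit-learn are illustrative assumptions.

```python
# siPLS-style interval selection by cross-validated RMSECV.
import itertools
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

def sipls_select(X, y, n_intervals=10, n_combine=3, n_components=5, cv=5):
    edges = np.linspace(0, X.shape[1], n_intervals + 1, dtype=int)
    best = (np.inf, None)
    for combo in itertools.combinations(range(n_intervals), n_combine):
        cols = np.concatenate([np.arange(edges[i], edges[i + 1]) for i in combo])
        pls = PLSRegression(n_components=min(n_components, len(cols)))
        y_cv = cross_val_predict(pls, X[:, cols], y, cv=cv).ravel()
        rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
        if rmsecv < best[0]:
            best = (rmsecv, combo)
    return best   # (lowest RMSECV, winning combination of interval indices)
```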

  20. Calibration of a flexible measurement system based on industrial articulated robot and structured light sensor

    NASA Astrophysics Data System (ADS)

    Mu, Nan; Wang, Kun; Xie, Zexiao; Ren, Ping

    2017-05-01

    To realize online rapid measurement for complex workpieces, a flexible measurement system based on an articulated industrial robot with a structured light sensor mounted on the end-effector is developed. A method for calibrating the system parameters is proposed in which the hand-eye transformation parameters and the robot kinematic parameters are synthesized in the calibration process. An initial hand-eye calibration is first performed using a standard sphere as the calibration target. By applying the modified complete and parametrically continuous method, we establish a synthesized kinematic model that combines the initial hand-eye transformation and distal link parameters as a whole with the sensor coordinate system as the tool frame. According to the synthesized kinematic model, an error model is constructed based on spheres' center-to-center distance errors. Consequently, the error model parameters can be identified in a calibration experiment using a three-standard-sphere target. Furthermore, the redundancy of error model parameters is eliminated to ensure the accuracy and robustness of the parameter identification. Calibration and measurement experiments are carried out based on an ER3A-C60 robot. The experimental results show that the proposed calibration method enjoys high measurement accuracy, and this efficient and flexible system is suitable for online measurement in industrial scenes.

  1. Discrete-Time Zhang Neural Network for Online Time-Varying Nonlinear Optimization With Application to Manipulator Motion Generation.

    PubMed

    Jin, Long; Zhang, Yunong

    2015-07-01

    In this brief, a discrete-time Zhang neural network (DTZNN) model is first proposed, developed, and investigated for online time-varying nonlinear optimization (OTVNO). Newton iteration is then shown to be derivable from the proposed DTZNN model. In addition, to eliminate the explicit matrix-inversion operation, the quasi-Newton Broyden-Fletcher-Goldfarb-Shanno (BFGS) method is introduced, which can effectively approximate the inverse of the Hessian matrix. A DTZNN-BFGS model, the combination of the DTZNN model and the quasi-Newton BFGS method, is thus proposed and investigated for OTVNO. Theoretical analyses show that, with step-size h=1 and/or with zero initial error, the maximal residual error of the DTZNN model has an O(τ²) pattern, whereas the maximal residual error of the Newton iteration has an O(τ) pattern, with τ denoting the sampling gap. Besides, when h ≠ 1 and h ∈ (0,2), the maximal steady-state residual error of the DTZNN model has an O(τ²) pattern. Finally, an illustrative numerical experiment and an application example to manipulator motion generation are provided and analyzed to substantiate the efficacy of the proposed DTZNN and DTZNN-BFGS models for OTVNO.
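    To illustrate the online time-varying setting and the O(τ) behaviour attributed to plain Newton iteration, the sketch below tracks the minimiser of a simple moving quadratic with one Newton step per sampling instant and reports the steady-state residual for several sampling gaps. It is a simplified baseline, not an implementation of the DTZNN or DTZNN-BFGS models.

```python
# Online Newton iteration tracking the minimiser of f(x, t) = (x - sin(t))^2.
import numpy as np

def track_minimiser(tau, t_end=10.0):
    steps = int(t_end / tau)
    x = 0.0
    errs = []
    for k in range(steps):
        t = k * tau
        grad = 2.0 * (x - np.sin(t))                   # df/dx at the current sample
        hess = 2.0                                     # d2f/dx2 (constant here)
        x = x - grad / hess                            # one Newton step per sample
        errs.append(abs(x - np.sin((k + 1) * tau)))    # distance to the moving optimum
    return np.mean(errs[steps // 2:])                  # steady-state residual error

for tau in (0.1, 0.05, 0.01):
    print(f"tau = {tau:.2f}  residual ~ {track_minimiser(tau):.4f}")   # shrinks roughly like O(tau)
```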

  2. On-board adaptive model for state of charge estimation of lithium-ion batteries based on Kalman filter with proportional integral-based error adjustment

    NASA Astrophysics Data System (ADS)

    Wei, Jingwen; Dong, Guangzhong; Chen, Zonghai

    2017-10-01

    With the rapid development of battery-powered electric vehicles, the lithium-ion battery plays a critical role in the reliability of the vehicle system. In order to provide timely management and protection for battery systems, it is necessary to develop a reliable battery model and accurate battery parameter estimation to describe battery dynamic behaviors. Therefore, this paper focuses on an on-board adaptive model for state-of-charge (SOC) estimation of lithium-ion batteries. Firstly, a first-order equivalent circuit battery model is employed to describe the battery's dynamic characteristics. Then, the recursive least squares algorithm and an off-line identification method are used to provide good initial values of the model parameters to ensure filter stability and reduce the convergence time. Thirdly, an extended Kalman filter (EKF) is applied to estimate the battery SOC and model parameters on-line. Considering that the EKF is essentially a first-order Taylor approximation of the battery model, which contains inevitable model errors, a proportional integral-based error adjustment technique is employed to improve the performance of the EKF method and correct the model parameters. Finally, the experimental results on lithium-ion batteries indicate that the proposed EKF with proportional integral-based error adjustment method can provide a robust and accurate battery model and on-line parameter estimation.
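    A heavily simplified EKF sketch for this kind of SOC estimation is shown below, using a first-order RC equivalent-circuit model and a linear OCV(SOC) approximation. All parameter values and noise covariances are assumptions, and the paper's proportional integral-based error adjustment and online parameter correction are not reproduced.

```python
# Simplified EKF for SOC estimation with a first-order RC equivalent-circuit model.
import numpy as np

Q_cap, R0, R1, C1, dt = 2.0 * 3600, 0.05, 0.02, 1000.0, 1.0   # assumed cell parameters
a_ocv, b_ocv = 3.4, 0.8                                       # assumed linear OCV(SOC) = a + b*SOC

def ekf_soc(current, v_meas, x0=(0.9, 0.0)):
    x = np.array(x0, dtype=float)                  # state: [SOC, V_rc]
    P = np.diag([1e-2, 1e-3])
    Qn = np.diag([1e-7, 1e-6])                     # process-noise covariance (assumed)
    Rn = 1e-3                                      # measurement-noise variance (assumed)
    alpha = np.exp(-dt / (R1 * C1))
    soc_log = []
    for I, v in zip(current, v_meas):
        # Predict: coulomb counting for SOC, first-order RC branch relaxation.
        F = np.array([[1.0, 0.0], [0.0, alpha]])
        x = np.array([x[0] - I * dt / Q_cap,
                      alpha * x[1] + R1 * (1 - alpha) * I])
        P = F @ P @ F.T + Qn
        # Update with terminal voltage v = OCV(SOC) - V_rc - R0*I.
        H = np.array([b_ocv, -1.0])
        v_pred = a_ocv + b_ocv * x[0] - x[1] - R0 * I
        S = H @ P @ H + Rn
        K = P @ H / S
        x = x + K * (v - v_pred)
        P = (np.eye(2) - np.outer(K, H)) @ P
        soc_log.append(x[0])
    return np.array(soc_log)
```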

  3. Ultrasound fusion image error correction using subject-specific liver motion model and automatic image registration.

    PubMed

    Yang, Minglei; Ding, Hui; Zhu, Lei; Wang, Guangzhi

    2016-12-01

    Ultrasound fusion imaging is an emerging tool and benefits a variety of clinical applications, such as image-guided diagnosis and treatment of hepatocellular carcinoma and unresectable liver metastases. However, respiratory liver motion-induced misalignment of multimodal images (i.e., fusion error) compromises the effectiveness and practicability of this method. The purpose of this paper is to develop a subject-specific liver motion model and automatic registration-based method to correct the fusion error. An online-built subject-specific motion model and automatic image registration method for 2D ultrasound-3D magnetic resonance (MR) images were combined to compensate for the respiratory liver motion. The key steps included: 1) Build a subject-specific liver motion model for current subject online and perform the initial registration of pre-acquired 3D MR and intra-operative ultrasound images; 2) During fusion imaging, compensate for liver motion first using the motion model, and then using an automatic registration method to further correct the respiratory fusion error. Evaluation experiments were conducted on liver phantom and five subjects. In the phantom study, the fusion error (superior-inferior axis) was reduced from 13.90±2.38mm to 4.26±0.78mm by using the motion model only. The fusion error further decreased to 0.63±0.53mm by using the registration method. The registration method also decreased the rotation error from 7.06±0.21° to 1.18±0.66°. In the clinical study, the fusion error was reduced from 12.90±9.58mm to 6.12±2.90mm by using the motion model alone. Moreover, the fusion error decreased to 1.96±0.33mm by using the registration method. The proposed method can effectively correct the respiration-induced fusion error to improve the fusion image quality. This method can also reduce the error correction dependency on the initial registration of ultrasound and MR images. Overall, the proposed method can improve the clinical practicability of ultrasound fusion imaging. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. A Novel Hybrid Error Criterion-Based Active Control Method for on-Line Milling Vibration Suppression with Piezoelectric Actuators and Sensors

    PubMed Central

    Zhang, Xingwu; Wang, Chenxi; Gao, Robert X.; Yan, Ruqiang; Chen, Xuefeng; Wang, Shibin

    2016-01-01

    Milling vibration is one of the most serious factors affecting machining quality and precision. In this paper a novel hybrid error criterion-based frequency-domain LMS active control method is constructed and used for vibration suppression of milling processes by piezoelectric actuators and sensors, in which only one Fast Fourier Transform (FFT) is used and no Inverse Fast Fourier Transform (IFFT) is involved. The correction formulas are derived by a steepest descent procedure and the control parameters are analyzed and optimized. Then, a novel hybrid error criterion is constructed to improve the adaptability, reliability and anti-interference ability of the constructed control algorithm. Finally, based on piezoelectric actuators and acceleration sensors, a simulation of a spindle and a milling process experiment are presented to verify the proposed method. Besides, a protection program is added in the control flow to enhance the reliability of the control method in applications. The simulation and experiment results indicate that the proposed method is an effective and reliable way for on-line vibration suppression, and the machining quality can be obviously improved. PMID:26751448

  5. Learning by (video) example: a randomized study of communication skills training for end-of-life and error disclosure family care conferences.

    PubMed

    Schmitz, Connie C; Braman, Jonathan P; Turner, Norman; Heller, Stephanie; Radosevich, David M; Yan, Yelena; Miller, Jane; Chipman, Jeffrey G

    2016-11-01

    Teaching residents to lead end of life (EOL) and error disclosure (ED) conferences is important. We developed and tested an intervention using videotapes of EOL and error disclosure encounters from previous Objective Structured Clinical Exams. Residents (n = 72) from general and orthopedic surgery programs at 2 sites were enrolled. Using a prospective, pre-post, block group design with stratified randomization, we hypothesized the treatment group would outperform the control on EOL and ED cases. We also hypothesized that online course usage would correlate positively with post-test scores. All residents improved (pre-post). At the group level, treatment effects were insignificant, and post-test performance was unrelated to course usage. At the subgroup level for EOL, low performers assigned to treatment scored higher than controls at post-test; and within the treatment group, postgraduate year 3 residents outperformed postgraduate year 1 residents. To be effective, online curricula illustrating communication behaviors need face-to-face interaction, individual role play with feedback, and discussion. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Determination of 18 veterinary antibiotics in environmental water using high-performance liquid chromatography-q-orbitrap combined with on-line solid-phase extraction.

    PubMed

    Kim, Chansik; Ryu, Hong-Duck; Chung, Eu Gene; Kim, Yongseok

    2018-05-01

    The use of antibiotics and their occurrence in the environment have received significant attention in recent years owing to the generation of antibiotic-resistant bacteria. Antibiotic residues in water near livestock farming areas should be monitored to establish effective strategies for reducing the use of veterinary antibiotics. However, environmental water contamination resulting from veterinary antibiotics has not been studied extensively. In this work, we developed an analytical method for the simultaneous determination of multiple classes of veterinary antibiotic residues in environmental water using on-line solid-phase extraction (SPE)-high performance liquid chromatography (HPLC)-high resolution mass spectrometry (HRMS). Eighteen popular antibiotics (eight classes) were selected as target analytes based on veterinary antibiotics sales in South Korea in 2015. The developed method was validated by calibration-curve linearities, precisions, relative recoveries, and method detection limits (MDLs)/limits of quantification (LOQs) of the selected antibiotics, and applied to the analysis of environmental water samples (groundwater, river water, and wastewater-treatment-plant effluent). All calibration curves exhibited r² > 0.995 with MDLs ranging from 0.2 to 11.9 ng/L. Relative recoveries were between 50 and 150% with coefficients of variation below 20% for all analytes (spiked at 500 ng/L) in groundwater and river water samples. Relative standard deviations (RSDs) of standard-spiked samples were lower than 7% for all antibiotics. The on-line SPE system eliminates human-based SPE errors and affords excellent method reproducibility. Amoxicillin, ampicillin, clopidol, fenbendazole, flumequine, lincomycin, sulfadiazine, and trimethoprim were detected in environmental water samples in concentrations ranging from 1.26 to 127.49 ng/L. The developed method is a reliable analytical technique for the potential routine monitoring of veterinary antibiotics. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Latent error detection: A golden two hours for detection.

    PubMed

    Saward, Justin R E; Stanton, Neville A

    2017-03-01

    Undetected error in safety critical contexts generates a latent condition that can contribute to a future safety failure. The detection of latent errors post-task completion is observed in naval air engineers using a diary to record work-related latent error detection (LED) events. A systems view is combined with multi-process theories to explore sociotechnical factors associated with LED. Perception of cues in different environments facilitates successful LED, for which the deliberate review of past tasks within two hours of the error occurring and whilst remaining in the same or similar sociotechnical environment to that which the error occurred appears most effective. Identified ergonomic interventions offer potential mitigation for latent errors; particularly in simple everyday habitual tasks. It is thought safety critical organisations should look to engineer further resilience through the application of LED techniques that engage with system cues across the entire sociotechnical environment, rather than relying on consistent human performance. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  8. Error Detection/Correction in Collaborative Writing

    ERIC Educational Resources Information Center

    Pilotti, Maura; Chodorow, Martin

    2009-01-01

    In the present study, we examined error detection/correction during collaborative writing. Subjects were asked to identify and correct errors in two contexts: a passage written by the subject (familiar text) and a passage written by a person other than the subject (unfamiliar text). A computer program inserted errors in function words prior to the…

  9. A Corpus-Based System of Error Detection and Revision Suggestion for Spanish Learners in Taiwan: A Case Study

    ERIC Educational Resources Information Center

    Lu, Hui-Chuan; Chu, Yu-Hsin; Chang, Cheng-Yu

    2013-01-01

    Compared with English learners, Spanish learners have fewer resources for automatic error detection and revision and following the current integrative Computer Assisted Language Learning (CALL), we combined corpus-based approach and CALL to create the System of Error Detection and Revision Suggestion (SEDRS) for learning Spanish. Through…

  10. Computer-Assisted Detection of 90% of EFL Student Errors

    ERIC Educational Resources Information Center

    Harvey-Scholes, Calum

    2018-01-01

    Software can facilitate English as a Foreign Language (EFL) students' self-correction of their free-form writing by detecting errors; this article examines the proportion of errors which software can detect. A corpus of 13,644 words of written English was created, comprising 90 compositions written by Spanish-speaking students at levels A2-B2…

  11. Detection and avoidance of errors in computer software

    NASA Technical Reports Server (NTRS)

    Kinsler, Les

    1989-01-01

    The acceptance test errors of a computer software project were analyzed to determine whether the errors could have been detected or avoided in earlier phases of development. GROAGSS (Gamma Ray Observatory Attitude Ground Support System) was selected as the software project to be examined. The development of the software followed the standard Flight Dynamics Software Development methods. GROAGSS was developed between August 1985 and April 1989. The project is approximately 250,000 lines of code, of which approximately 43,000 lines are reused from previous projects. GROAGSS had a total of 1715 Change Report Forms (CRFs) submitted during the entire development and testing. These changes contained 936 errors. Of these 936 errors, 374 were found during the acceptance testing. These acceptance test errors were first categorized by method of avoidance, including: more clearly written requirements; detailed review; code reading; structural unit testing; and functional system integration testing. The errors were later broken down in terms of effort to detect and correct, class of error, and probability that the prescribed detection method would be successful. These determinations were based on Software Engineering Laboratory (SEL) documents and interviews with the project programmers. A summary of the results of the categorizations is presented. The number of programming errors at the beginning of acceptance testing can be significantly reduced. The results of the existing development methodology are examined for ways of improvement. A basis is provided for the definition of a new development/testing paradigm. Monitoring of the new scheme will objectively determine its effectiveness in avoiding and detecting errors.

  12. New double-byte error-correcting codes for memory systems

    NASA Technical Reports Server (NTRS)

    Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.

    1996-01-01

    Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.

  13. Accessibility assessment of assistive technology for the hearing impaired.

    PubMed

    Áfio, Aline Cruz Esmeraldo; Carvalho, Aline Tomaz de; Caravalho, Luciana Vieira de; Silva, Andréa Soares Rocha da; Pagliuca, Lorita Marlena Freitag

    2016-01-01

    To assess the automatic accessibility of assistive technology in online courses for the hearing impaired. This evaluation study was guided by the Assessment and Maintenance step proposed in the Model of Development of Digital Educational Material. The software Assessor and Simulator for the Accessibility of Sites (ASES) was used to analyze the online course "Education on Sexual and Reproductive Health: the use of condoms" according to national and international website accessibility standards. An error report generated by the program identified, in each didactic module, one error and two warnings related to two international principles and six warnings related to six national recommendations. The warnings relevant to hearing-impaired people were corrected, and the course was considered accessible by automatic assessment. We concluded that the pages of the course were judged, by the software used, to comply with web accessibility standards.

  14. Virtual Sensors for On-line Wheel Wear and Part Roughness Measurement in the Grinding Process

    PubMed Central

    Arriandiaga, Ander; Portillo, Eva; Sánchez, Jose A.; Cabanes, Itziar; Pombo, Iñigo

    2014-01-01

    Grinding is an advanced machining process for the manufacturing of valuable complex and accurate parts for high added value sectors such as aerospace, wind generation, etc. Due to the extremely severe conditions inside grinding machines, critical process variables such as part surface finish or grinding wheel wear cannot be easily and cheaply measured on-line. In this paper a virtual sensor for on-line monitoring of those variables is presented. The sensor is based on the modelling ability of Artificial Neural Networks (ANNs) for stochastic and non-linear processes such as grinding; the selected architecture is the Layer-Recurrent neural network. The sensor makes use of the relation between the variables to be measured and power consumption in the wheel spindle, which can be easily measured. A sensor calibration methodology is presented, and the levels of error that can be expected are discussed. Validation of the new sensor is carried out by comparing the sensor's results with actual measurements carried out in an industrial grinding machine. Results show excellent estimation performance for both wheel wear and surface roughness. In the case of wheel wear, the absolute error is within the range of microns (average value 32 μm). In the case of surface finish, the absolute error is well below Ra 1 μm (average value 0.32 μm). The present approach can be easily generalized to other grinding operations. PMID:24854055

  15. Accurate Heart Rate Monitoring During Physical Exercises Using PPG.

    PubMed

    Temko, Andriy

    2017-09-01

    The challenging task of heart rate (HR) estimation from the photoplethysmographic (PPG) signal, during intensive physical exercises, is tackled in this paper. The study presents a detailed analysis of a novel algorithm (WFPV) that exploits a Wiener filter to attenuate the motion artifacts, a phase vocoder to refine the HR estimate and user-adaptive post-processing to track the subject physiology. Additionally, an offline version of the HR estimation algorithm that uses Viterbi decoding is designed for scenarios that do not require online HR monitoring (WFPV+VD). The performance of the HR estimation systems is rigorously compared with existing algorithms on the publicly available database of 23 PPG recordings. On the whole dataset of 23 PPG recordings, the algorithms result in average absolute errors of 1.97 and 1.37 BPM in the online and offline modes, respectively. On the test dataset of 10 PPG recordings which were most corrupted with motion artifacts, WFPV has an error of 2.95 BPM on its own and 2.32 BPM in an ensemble with two existing algorithms. The error rate is significantly reduced when compared with the state-of-the-art PPG-based HR estimation methods. The proposed system is shown to be accurate in the presence of strong motion artifacts and in contrast to existing alternatives has very few free parameters to tune. The algorithm has a low computational cost and can be used for fitness tracking and health monitoring in wearable devices. The MATLAB implementation of the algorithm is provided online.
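    As a much reduced illustration of spectral HR estimation from PPG, the sketch below picks the dominant frequency in the heart-rate band of each windowed segment. The Wiener filtering against motion spectra, phase-vocoder refinement, and adaptive post-processing of the WFPV algorithm are not shown; window lengths and the sampling rate are assumptions.

```python
# Windowed spectral-peak HR estimate from a PPG signal (simplified illustration).
import numpy as np

def spectral_hr(ppg, fs=125.0, win_s=8.0, step_s=2.0, band=(0.7, 3.5)):
    win, step = int(win_s * fs), int(step_s * fs)
    hr = []
    for start in range(0, len(ppg) - win + 1, step):
        seg = ppg[start:start + win]
        seg = seg - seg.mean()
        spec = np.abs(np.fft.rfft(seg * np.hanning(win), n=8 * win))   # zero-padded spectrum
        freqs = np.fft.rfftfreq(8 * win, d=1.0 / fs)
        mask = (freqs >= band[0]) & (freqs <= band[1])                 # 42-210 BPM band
        hr.append(60.0 * freqs[mask][np.argmax(spec[mask])])           # dominant frequency -> BPM
    return np.array(hr)                                                # one estimate per window
```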

  16. Detecting and Characterizing Semantic Inconsistencies in Ported Code

    NASA Technical Reports Server (NTRS)

    Ray, Baishakhi; Kim, Miryung; Person, Suzette; Rungta, Neha

    2013-01-01

    Adding similar features and bug fixes often requires porting program patches from reference implementations and adapting them to target implementations. Porting errors may result from faulty adaptations or inconsistent updates. This paper investigates (1) the types of porting errors found in practice, and (2) how to detect and characterize potential porting errors. Analyzing version histories, we define five categories of porting errors, including incorrect control- and data-flow, code redundancy, inconsistent identifier renamings, etc. Leveraging this categorization, we design a static control- and data-dependence analysis technique, SPA, to detect and characterize porting inconsistencies. Our evaluation on code from four open-source projects shows that SPA can detect porting inconsistencies with 65% to 73% precision and 90% recall, and identify inconsistency types with 58% to 63% precision and 92% to 100% recall. In a comparison with two existing error detection tools, SPA improves precision by 14 to 17 percentage points.

  17. The geospatial data quality REST API for primary biodiversity data

    PubMed Central

    Otegui, Javier; Guralnick, Robert P.

    2016-01-01

    Summary: We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. Availability and implementation: The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26833340

  18. The geospatial data quality REST API for primary biodiversity data.

    PubMed

    Otegui, Javier; Guralnick, Robert P

    2016-06-01

    We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
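    The request and response formats of the Geospatial Data Quality API are documented at the project URLs above and are not reproduced here. As a hedged illustration of the kind of completeness and consistency checks such a service performs, the sketch below applies two basic tests to a single record locally; the Darwin-Core-style field names are an assumption for this example, not the API's schema.

    ```python
    def basic_geospatial_flags(record):
        """Return quality flags for one occurrence record (illustrative only).

        The VertNet Geospatial Data Quality API applies a far larger battery
        of completeness, consistency and error checks than these two.
        """
        flags = []
        lat = record.get("decimalLatitude")
        lon = record.get("decimalLongitude")
        if lat is None or lon is None:
            return ["missing_coordinates"]
        lat, lon = float(lat), float(lon)
        if not (-90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0):
            flags.append("coordinates_out_of_range")
        if lat == 0.0 and lon == 0.0:
            flags.append("zero_zero_coordinates")
        return flags

    print(basic_geospatial_flags({"decimalLatitude": 95.2, "decimalLongitude": -3.1}))
    ```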

  19. Educational intervention together with an on-line quality control program achieve recommended analytical goals for bedside blood glucose monitoring in a 1200-bed university hospital.

    PubMed

    Sánchez-Margalet, Víctor; Rodriguez-Oliva, Manuel; Sánchez-Pozo, Cristina; Fernández-Gallardo, María Francisca; Goberna, Raimundo

    2005-01-01

    Portable meters for blood glucose concentrations are used at the patient's bedside, as well as by patients for self-monitoring of blood glucose. Even though most devices have important technological advances that decrease operator error, the analytical goals proposed for the performance of glucose meters have recently been changed by the American Diabetes Association (ADA) to <5% analytical error and <7.9% total error. We studied 80 meters throughout the Virgen Macarena Hospital and found that most devices had a performance error higher than 10%. The aim of the present study was to establish a new system to control portable glucose meters, together with an educational program for nurses in a 1200-bed University Hospital, to achieve the recommended analytical goals and thereby improve the quality of diabetes care. We used portable glucose meters connected on-line to the laboratory after an educational program for nurses with responsibilities in point-of-care testing. We evaluated the system by assessing the total error of the glucometers using high- and low-level glucose control solutions. In a period of 6 months, we collected data from 5642 control samples obtained by 14 devices (Precision PCx) directly from the control program (QC manager). The average total error for the low-level glucose control (2.77 mmol/l) was 6.3% (range 5.5-7.6%), and even lower for the high-level glucose control (16.66 mmol/l), at 4.8% (range 4.1-6.5%). In conclusion, the performance of glucose meters used in our University Hospital with more than 1000 beds not only improved after the intervention, but the meters also achieved the analytical goals of the suggested ADA/National Academy of Clinical Biochemistry criteria for total error (<7.9% in the range 2.77-16.66 mmol/l glucose) and the optimal total error for high glucose concentrations of <5%, which will improve the quality of care of our patients.
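    The paper does not spell out its exact total-error calculation, so the sketch below uses one common formulation, TE = |bias| + 1.96 x CV, applied to repeated control measurements; the formulation choice and the control readings shown are assumptions.

    ```python
    import numpy as np

    def total_error_percent(measurements_mmol_l, target_mmol_l):
        """Estimate total analytical error (%) from repeated control measurements
        as |bias%| + 1.96 * CV% (one common convention, assumed here)."""
        m = np.asarray(measurements_mmol_l, dtype=float)
        bias_pct = abs(m.mean() - target_mmol_l) / target_mmol_l * 100.0
        cv_pct = m.std(ddof=1) / m.mean() * 100.0
        return bias_pct + 1.96 * cv_pct

    low_control = [2.70, 2.81, 2.76, 2.72, 2.79, 2.74]   # hypothetical readings
    print(round(total_error_percent(low_control, target_mmol_l=2.77), 1))
    ```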

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuangrod, T; Simpson, J; Greer, P

    Purpose: A real-time patient treatment delivery verification system using EPID (Watchdog) has been developed as an advanced patient safety tool. In a pilot study, data were acquired for 119 prostate and head and neck (HN) IMRT patient deliveries to generate body-site-specific action limits using statistical process control. The purpose of this study is to determine the sensitivity of Watchdog to detect clinically significant errors during treatment delivery. Methods: Watchdog utilizes a physics-based model to generate a series of predicted transit cine EPID images as a reference data set, and compares these in real-time to measured transit cine EPID images acquired during treatment using chi comparison (4%, 4 mm criteria) after the initial 2 s of treatment to allow for dose ramp-up. Four study cases were used: dosimetric (monitor unit) errors in prostate (7 fields) and HN (9 fields) IMRT treatments of (5%, 7%, 10%) and positioning (systematic displacement) errors in the same treatments of (5 mm, 7 mm, 10 mm). These errors were introduced by modifying the patient CT scan and re-calculating the predicted EPID data set. The error-embedded predicted EPID data sets were compared to the measured EPID data acquired during patient treatment. The treatment delivery percentage (measured from 2 s) at which Watchdog detected the error was determined. Results: Watchdog detected all simulated errors for all fields during delivery. The dosimetric errors were detected at average treatment delivery percentages of (4%, 0%, 0%) and (7%, 0%, 0%) for prostate and HN, respectively. For patient positional errors, the average treatment delivery percentages were (52%, 43%, 25%) and (39%, 16%, 6%). Conclusion: These results suggest that Watchdog can detect significant dosimetric and positioning errors in prostate and HN IMRT treatments in real-time, allowing for treatment interruption. Displacements of the patient take longer to detect; however, an incorrect body site or a very large geographic miss will be detected rapidly.
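    Watchdog's chi comparison runs on 2D transit cine EPID frames in real time; as a much simpler, hedged stand-in for a combined dose-difference/distance-to-agreement test, the sketch below computes a gamma-like index for two 1D profiles under 4%/4 mm criteria. The profile shapes and pixel spacing are assumptions.

    ```python
    import numpy as np

    def gamma_1d(reference, measured, spacing_mm, dose_tol=0.04, dist_tol_mm=4.0):
        """Per-point gamma-like index for two 1D dose profiles on the same grid.

        Brute-force search over all evaluated points; fine for short profiles.
        """
        ref = np.asarray(reference, dtype=float)
        mea = np.asarray(measured, dtype=float)
        x = np.arange(ref.size) * spacing_mm
        norm = dose_tol * ref.max()
        gammas = np.empty(ref.size)
        for i in range(ref.size):
            dose_term = (mea - ref[i]) / norm
            dist_term = (x - x[i]) / dist_tol_mm
            gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
        return gammas

    ref = np.exp(-((np.arange(100) - 50) / 15.0) ** 2)
    meas = 1.10 * ref                                        # 10% dosimetric error
    print((gamma_1d(ref, meas, spacing_mm=1.0) > 1.0).mean())  # failing fraction
    ```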

  1. Competitive learning with pairwise constraints.

    PubMed

    Covões, Thiago F; Hruschka, Eduardo R; Ghosh, Joydeep

    2013-01-01

    Constrained clustering has been an active research topic over the last decade. Most studies focus on batch-mode algorithms. This brief introduces two algorithms for on-line constrained learning, named on-line linear constrained vector quantization error (O-LCVQE) and constrained rival penalized competitive learning (C-RPCL). The former is a variant of the LCVQE algorithm for on-line settings, whereas the latter is an adaptation of the (on-line) RPCL algorithm to deal with constrained clustering. The accuracy results, in terms of the normalized mutual information (NMI), from experiments with nine datasets show that the partitions induced by O-LCVQE are competitive with those found by the (batch-mode) LCVQE. Compared with this formidable baseline algorithm, it is surprising that C-RPCL can provide better partitions (in terms of the NMI) for most of the datasets. Also, experiments on a large dataset show that on-line algorithms for constrained clustering can significantly reduce the computational time.

  2. The Watchdog Task: Concurrent error detection using assertions

    NASA Technical Reports Server (NTRS)

    Ersoz, A.; Andrews, D. M.; Mccluskey, E. J.

    1985-01-01

    The Watchdog Task, a software abstraction of the Watchdog-processor, is shown to be a powerful error detection tool with a great deal of flexibility and the advantages of watchdog techniques. A Watchdog Task system in Ada is presented; issues of recovery, latency, efficiency (communication) and preprocessing are discussed. Different applications, one of which is error detection on a single processor, are examined.

  3. A Review of Research on Error Detection. Technical Report No. 540.

    ERIC Educational Resources Information Center

    Meyer, Linda A.

    A review was conducted of the research on error detection studies completed with children, adolescents, and young adults to determine at what age children begin to detect errors in texts. The studies were grouped according to the subjects' ages. The focus of the review was on the following aspects of each study: the hypothesis that guided the…

  4. Evaluating suggestibility to additive and contradictory misinformation following explicit error detection in younger and older adults.

    PubMed

    Huff, Mark J; Umanath, Sharda

    2018-06-01

    In 2 experiments, we assessed age-related suggestibility to additive and contradictory misinformation (i.e., remembering of false details from an external source). After reading a fictional story, participants answered questions containing misleading details that were either additive (misleading details that supplemented an original event) or contradictory (errors that changed original details). On a final test, suggestibility was greater for additive than contradictory misinformation, and older adults endorsed fewer false contradictory details than younger adults. To mitigate suggestibility in Experiment 2, participants were warned about potential errors, instructed to detect errors, or instructed to detect errors after exposure to examples of additive and contradictory details. Again, suggestibility to additive misinformation was greater than contradictory, and older adults endorsed less contradictory misinformation. Only after detection instructions with misinformation examples were younger adults able to reduce contradictory misinformation effects, reducing these effects to the level of older adults. Additive misinformation, however, was immune to all warning and detection instructions. Thus, older adults were less susceptible to contradictory misinformation errors, and younger adults could match this misinformation rate when warning/detection instructions were strong. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  5. The Trace Analysis of DEET in Water using an On-line Preconcentration Column and Liquid Chromatography with UV Photodiode Array Detection

    EPA Science Inventory

    A method for the detection of trace levels of N,N-diethyl-m-toluamide (DEET) in water is discussed. The method utilizes an on-line preconcentration column in series with high performance liquid chromatography (HPLC) and UV photodiode array detection. DEET, a common insect repel...

  6. A Novel Image Steganography Technique for Secured Online Transaction Using DWT and Visual Cryptography

    NASA Astrophysics Data System (ADS)

    Anitha Devi, M. D.; ShivaKumar, K. B.

    2017-08-01

    The online payment ecosystem is a main target for cyber fraud, so end-to-end encryption is needed to maintain the integrity of secret information related to transactions carried out online. With access to payment-related sensitive information, which enables a great volume of money transactions every day, the payment infrastructure is a major target for hackers. The proposed system highlights an ideal approach for secure online fund-transfer transactions using a unique combination of visual cryptography and a Haar-based discrete wavelet transform steganography technique. This combination of data-hiding techniques reduces the amount of information shared between the consumer and the online merchant needed for a successful online transaction, while providing enhanced security for the customer's account details, thereby increasing customer confidence and preventing "identity theft" and "phishing". To evaluate the effectiveness of the proposed algorithm, root mean square error and peak signal-to-noise ratio have been used as evaluation parameters.

  7. On-line vs off-line electrical conductivity characterization. Polycarbonate composites developed with multiwalled carbon nanotubes by compounding technology

    NASA Astrophysics Data System (ADS)

    Llorens-Chiralt, R.; Weiss, P.; Mikonsaari, I.

    2014-05-01

    Material characterization is one of the key steps when conductive polymers are developed. The dispersion of carbon nanotubes (CNTs) in a polymeric matrix using melt mixing influences the final composite properties. Compounding becomes a trial-and-error process that consumes large amounts of material, time and money before competitive composites are obtained. Traditional methods for electrical conductivity characterization include compression and injection molding; both need extra equipment and moulds to obtain standard bars. This study aims to investigate the accuracy of the data obtained from absolute resistance recorded during melt compounding, using an on-line setup developed by our group, and to correlate these values with off-line characterization and processing parameters (screw/barrel configuration, throughput, screw speed, temperature profile and CNT percentage). Compounds developed with different percentages of multi-walled carbon nanotubes (MWCNTs) and polycarbonate have been characterized during and after extrusion. Measurements, on-line resistance and off-line resistivity, showed parallel response and reproducibility, confirming the validity of the method. The significance of the results stems from the fact that we are able to measure on-line resistance and to change compounding parameters during production to achieve reference values, reducing production/testing cost and ensuring material quality. This method also removes errors that can arise in test bar preparation, showing better correlation with compounding parameters.

  8. An Efficient Silent Data Corruption Detection Method with Error-Feedback Control and Even Sampling for HPC Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di, Sheng; Berrocal, Eduardo; Cappello, Franck

    The silent data corruption (SDC) problem is attracting more and more attention because it is expected to have a great impact on exascale HPC applications. SDC faults are hazardous in that they pass unnoticed by hardware and can lead to wrong computation results. In this work, we formulate SDC detection as a runtime one-step-ahead prediction method, leveraging multiple linear prediction methods in order to improve the detection results. The contributions are twofold: (1) we propose an error feedback control model that can reduce the prediction errors for different linear prediction methods, and (2) we propose a spatial-data-based even-sampling method to minimize the detection overheads (including memory and computation cost). We implement our algorithms in the Fault Tolerance Interface, a fault tolerance library with multiple checkpoint levels, such that users can conveniently protect their HPC applications against both SDC errors and fail-stop errors. We evaluate our approach by using large-scale traces from well-known, large-scale HPC applications, as well as by running those HPC applications on a real cluster environment. Experiments show that our error feedback control model can improve detection sensitivity by 34-189% for bit-flip memory errors injected with the bit positions in the range [20,30], without any degradation in detection accuracy. Furthermore, memory size can be reduced by 33% with our spatial-data even-sampling method, with only a slight and graceful degradation in the detection sensitivity.
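    As a hedged, minimal sketch of the one-step-ahead idea (not the paper's multi-predictor method with feedback-controlled bounds), the code below extrapolates each value linearly from its two predecessors and flags points whose prediction error far exceeds a running error estimate; the threshold factor is an assumption.

    ```python
    import numpy as np

    def detect_sdc(series, k=4.0):
        """Flag indices whose value deviates from a linear one-step-ahead
        prediction by more than k times a running (EWMA) prediction error."""
        flags, err_ewma = [], None
        for t in range(2, len(series)):
            pred = 2.0 * series[t - 1] - series[t - 2]   # linear extrapolation
            err = abs(series[t] - pred)
            if err_ewma is not None and err > k * max(err_ewma, 1e-12):
                flags.append(t)
            err_ewma = err if err_ewma is None else 0.9 * err_ewma + 0.1 * err
        return flags

    x = np.sin(np.linspace(0, 6, 200)).tolist()
    x[120] += 0.5            # injected bit-flip-like corruption
    print(detect_sdc(x))     # the corrupted point and its neighbours are flagged
    ```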

  9. Help prevent hospital errors

    MedlinePlus

  10. Repeat-aware modeling and correction of short read errors.

    PubMed

    Yang, Xiao; Aluru, Srinivas; Dorman, Karin S

    2011-02-15

    High-throughput short read sequencing is revolutionizing genomics and systems biology research by enabling cost-effective deep coverage sequencing of genomes and transcriptomes. Error detection and correction are crucial to many short read sequencing applications including de novo genome sequencing, genome resequencing, and digital gene expression analysis. Short read error detection is typically carried out by counting the observed frequencies of kmers in reads and validating those with frequencies exceeding a threshold. In case of genomes with high repeat content, an erroneous kmer may be frequently observed if it has few nucleotide differences with valid kmers with multiple occurrences in the genome. Error detection and correction were mostly applied to genomes with low repeat content and this remains a challenging problem for genomes with high repeat content. We develop a statistical model and a computational method for error detection and correction in the presence of genomic repeats. We propose a method to infer genomic frequencies of kmers from their observed frequencies by analyzing the misread relationships among observed kmers. We also propose a method to estimate the threshold useful for validating kmers whose estimated genomic frequency exceeds the threshold. We demonstrate that superior error detection is achieved using these methods. Furthermore, we break away from the common assumption of uniformly distributed errors within a read, and provide a framework to model position-dependent error occurrence frequencies common to many short read platforms. Lastly, we achieve better error correction in genomes with high repeat content. The software is implemented in C++ and is freely available under GNU GPL3 license and Boost Software V1.0 license at "http://aluru-sun.ece.iastate.edu/doku.php?id=redeem". We introduce a statistical framework to model sequencing errors in next-generation reads, which led to promising results in detecting and correcting errors for genomes with high repeat content.
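    As a hedged baseline sketch of the frequency-threshold step the abstract builds on (not the repeat-aware statistical model itself), the code below counts k-mers across reads and returns those observed fewer times than a coverage threshold; k and the threshold value are assumptions.

    ```python
    from collections import Counter

    def weak_kmers(reads, k=15, min_count=3):
        """Count k-mers across reads and return those below a coverage threshold.

        A plain frequency threshold mislabels k-mers from repeats or low-coverage
        regions, which is exactly what the cited model addresses.
        """
        counts = Counter()
        for read in reads:
            for i in range(len(read) - k + 1):
                counts[read[i:i + k]] += 1
        return {kmer for kmer, c in counts.items() if c < min_count}

    reads = ["ACGTACGTACGTACGTACGT"] * 5 + ["ACGTACGTACGAACGTACGT"]  # one erroneous read
    print(len(weak_kmers(reads)))   # the 6 k-mers touching the error are flagged
    ```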

  11. Quiet eye facilitates sensorimotor preprograming and online control of precision aiming in golf putting.

    PubMed

    Causer, Joe; Hayes, Spencer J; Hooper, James M; Bennett, Simon J

    2017-02-01

    An occlusion protocol was used to elucidate the respective roles of preprograming and online control during the quiet eye period of golf putting. Twenty-one novice golfers completed golf putts to 6-ft and 11-ft targets under full vision or with vision occluded on initiation of the backswing. Radial error (RE) was higher, and quiet eye was longer, when putting to the 11-ft versus 6-ft target, and in the occluded versus full vision condition. Quiet eye durations, as well as preprograming, online and dwell durations, were longer in low-RE compared to high-RE trials. The preprograming component of quiet eye was significantly longer in the occluded vision condition, whereas the online and dwell components were significantly longer in the full vision condition. These findings demonstrate an increase in preprograming when vision is occluded. However, this was not sufficient to overcome the need for online visual control during the quiet eye period. These findings suggest the quiet eye period is composed of preprograming and online control elements; however, online visual control of action is critical to performance.

  12. Using video recording to identify management errors in pediatric trauma resuscitation.

    PubMed

    Oakley, Ed; Stocker, Sergio; Staubli, Georg; Young, Simon

    2006-03-01

    To determine the ability of video recording to identify management errors in trauma resuscitation and to compare this method with medical record review. The resuscitation of children who presented to the emergency department of the Royal Children's Hospital between February 19, 2001, and August 18, 2002, for whom the trauma team was activated was video recorded. The tapes were analyzed, and management was compared with Advanced Trauma Life Support guidelines. Deviations from these guidelines were recorded as errors. Fifty video recordings were analyzed independently by 2 reviewers. Medical record review was undertaken for a cohort of the most seriously injured patients, and errors were identified. The errors detected with the 2 methods were compared. Ninety resuscitations were video recorded and analyzed. An average of 5.9 errors per resuscitation was identified with this method (range: 1-12 errors). Twenty-five children (28%) had an injury severity score of >11; there was an average of 2.16 errors per patient in this group. Only 10 (20%) of these errors were detected in the medical record review. Medical record review detected an additional 8 errors that were not evident on the video recordings. Concordance between independent reviewers was high, with 93% agreement. Video recording is more effective than medical record review in detecting management errors in pediatric trauma resuscitation. Management errors in pediatric trauma resuscitation are common and often involve basic resuscitation principles. Resuscitation of the most seriously injured children was associated with fewer errors. Video recording is a useful adjunct to trauma resuscitation auditing.

  13. Simultaneous message framing and error detection

    NASA Technical Reports Server (NTRS)

    Frey, A. H., Jr.

    1968-01-01

    Circuitry simultaneously inserts message framing information and detects noise errors in binary code data transmissions. Separate message groups are framed without requiring both framing bits and error-checking bits, and predetermined message sequences are separated from other message sequences without being hampered by intervening noise.

  14. Multi-bits error detection and fast recovery in RISC cores

    NASA Astrophysics Data System (ADS)

    Jing, Wang; Xing, Yang; Yuanfu, Zhao; Weigong, Zhang; Jiao, Shen; Keni, Qiu

    2015-11-01

    Particle-induced soft errors are a major threat to the reliability of microprocessors. Even worse, multi-bit upsets (MBUs) are increasingly common due to the rapidly shrinking feature size of ICs. Several architecture-level mechanisms have been proposed to protect microprocessors from soft errors, such as dual and triple modular redundancy (DMR and TMR). However, most of them are inefficient against the growing number of multi-bit errors or cannot balance critical-path delay, area and power penalties well. This paper proposes a novel architecture, self-recovery dual-pipeline (SRDP), to effectively provide soft error detection and recovery at low cost for general RISC structures. We focus on the following three aspects. First, an advanced DMR pipeline is devised to detect soft errors, especially MBUs. Second, SEU/MBU errors can be located by adding self-checking logic to pipeline stage registers. Third, a recovery scheme is proposed with a recovery cost of 1 or 5 clock cycles. Our evaluation of a prototype implementation shows that SRDP can detect up to 100% of particle-induced soft errors and recover from nearly 95% of them; the remaining 5% enter a specific trap.

  15. Observer detection of image degradation caused by irreversible data compression processes

    NASA Astrophysics Data System (ADS)

    Chen, Ji; Flynn, Michael J.; Gross, Barry; Spizarny, David

    1991-05-01

    Irreversible data compression methods have been proposed to reduce the data storage and communication requirements of digital imaging systems. In general, the error produced by compression increases as an algorithm's compression ratio is increased. We have studied the relationship between compression ratios and the detection of induced error using radiologic observers. The nature of the errors was characterized by calculating the power spectrum of the difference image. In contrast with studies designed to test whether detected errors alter diagnostic decisions, this study was designed to test whether observers could detect the induced error. A paired-film observer study was designed to test whether induced errors were detected. The study was conducted with chest radiographs selected and ranked for subtle evidence of interstitial disease, pulmonary nodules, or pneumothoraces. Images were digitized at 86 microns (4K X 5K) and 2K X 2K regions were extracted. A full-frame discrete cosine transform method was used to compress images at ratios varying between 6:1 and 60:1. The decompressed images were reprinted next to the original images in a randomized order with a laser film printer. The use of a film digitizer and a film printer which can reproduce all of the contrast and detail in the original radiograph makes the results of this study insensitive to instrument performance and primarily dependent on radiographic image quality. The results of this study define conditions for which errors associated with irreversible compression cannot be detected by radiologic observers. The results indicate that an observer can detect the errors introduced by this compression algorithm for compression ratios of 10:1 (1.2 bits/pixel) or higher.

  16. Online Classes See Cheating Go High-Tech

    ERIC Educational Resources Information Center

    Young, Jeffrey R.

    2012-01-01

    Easy A's may be even easier to score these days, with the growing popularity of online courses. Tech-savvy students are finding ways to cheat that let them ace online courses with minimal effort, in ways that are difficult to detect. The issue of online cheating may rise in prominence, as more and more institutions embrace online courses, and as…

  17. Error detection and reduction in blood banking.

    PubMed

    Motschman, T L; Moore, S B

    1996-12-01

    Error management plays a major role in facility process improvement efforts. By detecting and reducing errors, quality and, therefore, patient care improve. It begins with a strong organizational foundation of management attitude with clear, consistent employee direction and appropriate physical facilities. Clearly defined critical processes, critical activities, and SOPs act as the framework for operations as well as active quality monitoring. To assure that personnel can detect and report errors, they must be trained in both operational duties and error management practices. Use of simulated/intentional errors and incorporation of error detection into competency assessment keeps employees practiced and confident, and diminishes fear of the unknown. Personnel can clearly see that errors are indeed used as opportunities for process improvement and not for punishment. The facility must have a clearly defined and consistently used definition for reportable errors. Reportable errors should include those errors with potentially harmful outcomes as well as those errors that are "upstream," and thus further away from the outcome. A well-written error report consists of who, what, when, where, why/how, and follow-up to the error. Before correction can occur, an investigation to determine the underlying cause of the error should be undertaken. Obviously, the best corrective action is prevention. Correction can occur at five different levels; however, only three of these levels are directed at prevention. Prevention requires a method to collect and analyze data concerning errors. In the authors' facility a functional error classification method and a quality system-based classification have been useful. An active method to search for problems uncovers them further upstream, before they can have disastrous outcomes. In the continual quest for improving processes, an error management program is itself a process that needs improvement, and we must strive to always close the circle of quality assurance. Ultimately, the goal of better patient care will be the reward.

  18. Error management in blood establishments: results of eight years of experience (2003–2010) at the Croatian Institute of Transfusion Medicine

    PubMed Central

    Vuk, Tomislav; Barišić, Marijan; Očić, Tihomir; Mihaljević, Ivanka; Šarlija, Dorotea; Jukić, Irena

    2012-01-01

    Background. Continuous and efficient error management, including procedures from error detection to their resolution and prevention, is an important part of quality management in blood establishments. At the Croatian Institute of Transfusion Medicine (CITM), error management has been systematically performed since 2003. Materials and methods. Data derived from error management at the CITM during an 8-year period (2003–2010) formed the basis of this study. Throughout the study period, errors were reported to the Department of Quality Assurance. In addition to surveys and the necessary corrective activities, errors were analysed and classified according to the Medical Event Reporting System for Transfusion Medicine (MERS-TM). Results. During the study period, a total of 2,068 errors were recorded, including 1,778 (86.0%) in blood bank activities and 290 (14.0%) in blood transfusion services. As many as 1,744 (84.3%) errors were detected before issue of the product or service. Among the 324 errors identified upon release from the CITM, 163 (50.3%) errors were detected by customers and reported as complaints. In only five cases was an error detected after blood product transfusion however without any harmful consequences for the patients. All errors were, therefore, evaluated as “near miss” and “no harm” events. Fifty-two (2.5%) errors were evaluated as high-risk events. With regards to blood bank activities, the highest proportion of errors occurred in the processes of labelling (27.1%) and blood collection (23.7%). With regards to blood transfusion services, errors related to blood product issuing prevailed (24.5%). Conclusion. This study shows that comprehensive management of errors, including near miss errors, can generate data on the functioning of transfusion services, which is a precondition for implementation of efficient corrective and preventive actions that will ensure further improvement of the quality and safety of transfusion treatment. PMID:22395352

  19. Transient Faults in Computer Systems

    NASA Technical Reports Server (NTRS)

    Masson, Gerald M.

    1993-01-01

    A powerful technique particularly appropriate for the detection of errors caused by transient faults in computer systems was developed. The technique can be implemented in either software or hardware; the research conducted thus far primarily considered software implementations. The error detection technique developed has the distinct advantage of having provably complete coverage of all errors caused by transient faults that affect the output produced by the execution of a program. In other words, the technique does not have to be tuned to a particular error model to enhance error coverage. Also, the correctness of the technique can be formally verified. The technique uses time and software redundancy. The foundation for an effective, low-overhead, software-based certification trail approach to real-time error detection resulting from transient fault phenomena was developed.
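    The certification-trail technique itself records data from a first execution to accelerate and cross-check a second one; as a far simpler, hedged stand-in that only illustrates time redundancy for transient-fault detection, the sketch below runs a computation twice and compares the outputs.

    ```python
    def run_with_time_redundancy(fn, *args, retries=1):
        """Execute fn repeatedly and flag a possible transient fault on disagreement.

        Pure time redundancy: a fault corrupting only one execution is caught,
        while a deterministic fault corrupting every execution identically is not.
        """
        first = fn(*args)
        for _ in range(retries):
            if fn(*args) != first:
                raise RuntimeError("outputs disagree: possible transient fault")
        return first

    print(run_with_time_redundancy(sorted, [3, 1, 2]))
    ```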

  20. Structural analysis of online handwritten mathematical symbols based on support vector machines

    NASA Astrophysics Data System (ADS)

    Simistira, Foteini; Papavassiliou, Vassilis; Katsouros, Vassilis; Carayannis, George

    2013-01-01

    Mathematical expression recognition is still a very challenging task for the research community, mainly because of the two-dimensional (2D) structure of mathematical expressions (MEs). In this paper, we present a novel approach for the structural analysis between two on-line handwritten mathematical symbols of a ME, based on spatial features of the symbols. We introduce six features to represent the spatial affinity of the symbols and compare two multi-class classification methods that employ support vector machines (SVMs): one based on the "one-against-one" technique and one based on the "one-against-all", in identifying the relation between a pair of symbols (e.g., subscript, numerator). A dataset containing 1906 spatial relations derived from the Competition on Recognition of Online Handwritten Mathematical Expressions (CROHME) 2012 training dataset is constructed to evaluate the classifiers and compare them with the rule-based classifier of the ILSP-1 system that participated in the contest. The experimental results give an overall mean error rate of 2.61% for the "one-against-one" SVM approach, 6.57% for the "one-against-all" SVM technique and 12.31% for the ILSP-1 classifier.
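    A hedged sketch of the two multi-class SVM strategies compared above, using scikit-learn; the six spatial-affinity features are not reproduced, so synthetic six-dimensional data stands in for them, and the kernel and C value are assumptions.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.svm import SVC

    # placeholder for six spatial-affinity features and four relation labels
    X, y = make_classification(n_samples=600, n_features=6, n_informative=5,
                               n_redundant=0, n_classes=4, n_clusters_per_class=1,
                               random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

    ovo = SVC(kernel="rbf", C=10.0)                     # SVC trains one-vs-one internally
    ova = OneVsRestClassifier(SVC(kernel="rbf", C=10.0))
    for name, clf in [("one-against-one", ovo), ("one-against-all", ova)]:
        err = 1.0 - clf.fit(Xtr, ytr).score(Xte, yte)
        print(f"{name} error rate: {err:.3f}")
    ```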

  1. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory

    PubMed Central

    Tao, Qing

    2017-01-01

    Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, as the length of real-world time series increases, the data often contains many outliers. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average through modifying Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM. PMID:29391864

  2. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory.

    PubMed

    Yang, Haimin; Pan, Zhisong; Tao, Qing

    2017-01-01

    Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, as the length of real-world time series increases, the data often contains many outliers. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average through modifying Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM.
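    A hedged sketch of the mechanism described above (shrinking an Adam-style step when the tracked relative prediction error spikes); this paraphrases the idea rather than reproducing the authors' exact RoAdam update, and the clipping bounds and beta3 value are assumptions.

    ```python
    import numpy as np

    def roadam_like_step(params, grad, state, loss, prev_loss,
                         lr=1e-3, betas=(0.9, 0.999), beta3=0.999, eps=1e-8):
        """One Adam-like update whose effective step shrinks when the relative
        prediction error (current/previous loss) spikes, as with an outlier."""
        m, v, r, t = state
        t += 1
        rel = loss / max(prev_loss, eps)               # relative prediction error
        r = beta3 * r + (1 - beta3) * float(np.clip(rel, 1.0 / 3.0, 3.0))
        m = betas[0] * m + (1 - betas[0]) * grad
        v = betas[1] * v + (1 - betas[1]) * grad ** 2
        m_hat = m / (1 - betas[0] ** t)
        v_hat = v / (1 - betas[1] ** t)
        params = params - (lr / r) * m_hat / (np.sqrt(v_hat) + eps)
        return params, (m, v, r, t)

    state = (0.0, 0.0, 1.0, 0)                         # m, v, tracked error, step
    w, state = roadam_like_step(0.0, grad=2.0, state=state, loss=4.0, prev_loss=1.0)
    print(w)                                           # updated parameter after one step
    ```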

  3. Research on technology of online gas chromatograph for SF6 decomposition products

    NASA Astrophysics Data System (ADS)

    Li, L.; Fan, X. P.; Zhou, Y. Y.; Tang, N.; Zou, Z. L.; Liu, M. Z.; Huang, G. J.

    2017-12-01

    Sulfur hexafluoride (SF6) decomposition products were qualitatively and quantitatively analyzed by several gas chromatographs in the laboratory. Test conditions and methods were selected and optimized to minimize and eliminate the influence of SF6 on the detection of other trace components. The effective separation and detection of selected characteristic gases were achieved. A comparison among different types of gas chromatograph showed that the GPTR-S101 can effectively separate and detect SF6 decomposition products and has the best detection limit and sensitivity. On the basis of the GPTR-S101, an online gas chromatograph for SF6 decomposition products (GPTR-S201) was developed. It lays the foundation for further online monitoring and diagnosis of SF6.

  4. SU-E-T-105: An FMEA Survey of Intensity Modulated Radiation Therapy (IMRT) Step and Shoot Dose Delivery Failure Modes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faught, J Tonigan; Johnson, J; Stingo, F

    2015-06-15

    Purpose: To assess the perception of TG-142 tolerance level dose delivery failures in IMRT and the application of FMEA process to this specific aspect of IMRT. Methods: An online survey was distributed to medical physicists worldwide that briefly described 11 different failure modes (FMs) covered by basic quality assurance in step-and-shoot IMRT at or near TG-142 tolerance criteria levels. For each FM, respondents estimated the worst case H&N patient percent dose error and FMEA scores for Occurrence, Detectability, and Severity. Demographic data was also collected. Results: 181 individual and three group responses were submitted. 84% were from North America. Most (76%) individual respondents performed at least 80% clinical work and 92% were nationally certified. Respondent medical physics experience ranged from 2.5–45 years (average 18 years). 52% of individual respondents were at least somewhat familiar with FMEA, while 17% were not familiar. Several IMRT techniques, treatment planning systems and linear accelerator manufacturers were represented. All FMs received widely varying scores ranging from 1–10 for occurrence, at least 1–9 for detectability, and at least 1–7 for severity. Ranking FMs by RPN scores also resulted in large variability, with each FM being ranked both most risky (1st) and least risky (11th) by different respondents. On average MLC modeling had the highest RPN scores. Individual estimated percent dose errors and severity scores positively correlated (p<0.10) for each FM as expected. No universal correlations were found between the demographic information collected and scoring, percent dose errors, or ranking. Conclusion: FMs investigated overall were evaluated as low to medium risk, with average RPNs less than 110. The ranking of 11 FMs was not agreed upon by the community. Large variability in FMEA scoring may be caused by individual interpretation and/or experience, thus reflecting the subjective nature of the FMEA tool.

  5. Time pressure in scenario-based online construction safety quizzes and its effect on students' performance

    NASA Astrophysics Data System (ADS)

    Jaeger, Martin; Adair, Desmond

    2017-05-01

    Online quizzes have been shown to be effective learning and assessment approaches. However, if scenario-based online construction safety quizzes do not include time pressure similar to real-world situations, they reflect situations too ideally. The purpose of this paper is to compare engineering students' performance when carrying out an online construction safety quiz with time pressure versus an online construction safety quiz without time pressure. Two versions of an online construction safety quiz are developed and administered to randomly assigned engineering students based on a quasi-experimental post-test design. The findings contribute to scenario-based learning and assessment of construction safety in four ways. First, the results confirm earlier findings that 'intrinsic stress' does not seem to impair students' performance. Second, students who carry out the online construction safety quiz with time pressure are less likely to 'learn by trial and error'. Third, students exposed to time pressure appreciate that they become better prepared for real life. Finally, preparing students to work under time pressure is an important industry requirement. The results of this study should encourage engineering educators to explore and implement ways to include time pressure in scenario-based online quizzes and learning.

  6. Adaptive hidden Markov model-based online learning framework for bearing faulty detection and performance degradation monitoring

    NASA Astrophysics Data System (ADS)

    Yu, Jianbo

    2017-01-01

    This study proposes an adaptive-learning-based method for machine fault detection and health degradation monitoring. The kernel of the proposed method is an "evolving" model that uses an unsupervised online learning scheme, in which an adaptive hidden Markov model (AHMM) is used for online learning of the dynamic health changes of machines over their full life. A statistical index is developed for recognizing new health states in the machines. Those new health states are then described online by adding new hidden states to the AHMM. Furthermore, the health degradation in machines is quantified online by an AHMM-based health index (HI) that measures the similarity between two density distributions that describe the historic and current health states, respectively. When necessary, the proposed method characterizes the distinct operating modes of the machine and can learn both abrupt and gradual health changes online. Our method overcomes some drawbacks of HIs based on fixed monitoring models constructed in the offline phase (e.g., relatively low comprehensibility and applicability). Results from its application in a bearing life test reveal that the proposed method is effective in online detection and adaptive assessment of machine health degradation. This study provides a useful guide for developing a condition-based maintenance (CBM) system that uses an online learning method without considerable human intervention.

  7. Advanced Interactive Display Formats for Terminal Area Traffic Control

    NASA Technical Reports Server (NTRS)

    Grunwald, Arthur J.; Shaviv, G. E.

    1999-01-01

    This research project deals with an on-line dynamic method for automated viewing parameter management in perspective displays. Perspective images are optimized such that a human observer will perceive relevant spatial geometrical features with minimal errors. In order to compute the errors at which observers reconstruct spatial features from perspective images, a visual spatial-perception model was formulated. The model was employed as the basis of an optimization scheme aimed at seeking the optimal projection parameter setting. These ideas are implemented in the context of an air traffic control (ATC) application. A concept, referred to as an active display system, was developed. This system uses heuristic rules to identify relevant geometrical features of the three-dimensional air traffic situation. Agile, on-line optimization was achieved by a specially developed and custom-tailored genetic algorithm (GA), designed to deal with the multi-modal characteristics of the objective function and to exploit its time-evolving nature.

  8. Lattice Commissioning Strategy Simulation for the B Factory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, M.; Whittum, D.; Yan, Y.

    2011-08-26

    To prepare for the PEP-II turn-on, we have studied one commissioning strategy with simulated lattice errors. Features such as difference and absolute orbit analysis and correction are discussed. To prepare for the commissioning of the PEP-II injection line and high energy ring (HER), we have developed a system for on-line orbit analysis by merging two existing codes: LEGO and RESOLVE. With the LEGO-RESOLVE system, we can study the problem of finding quadrupole alignment and beam position monitor (BPM) offset errors with simulated data. We have increased the speed and versatility of the orbit analysis process by using a command file written in a script language designed specifically for RESOLVE. In addition, we have interfaced the LEGO-RESOLVE system to the control system of the B-Factory. In this paper, we describe online analysis features of the LEGO-RESOLVE system and present examples of practical applications.

  9. InSAR Unwrapping Error Correction Based on Quasi-Accurate Detection of Gross Errors (QUAD)

    NASA Astrophysics Data System (ADS)

    Kang, Y.; Zhao, C. Y.; Zhang, Q.; Yang, C. S.

    2018-04-01

    Unwrapping errors are common in InSAR processing and can seriously degrade the accuracy of the monitoring results. Based on a gross-error detection method, quasi-accurate detection (QUAD), a method for automatic correction of unwrapping errors is established in this paper. This method identifies and corrects the unwrapping errors by establishing a functional model between the true errors and interferograms. The basic principle and processing steps are presented. The method is then compared with the L1-norm method using simulated data. Results show that both methods can effectively suppress unwrapping errors when the ratio of unwrapping errors is low, and that the two methods can complement each other when the ratio of unwrapping errors is relatively high. Finally, real SAR data are used to test the phase unwrapping error correction. Results show that the new method can correct phase unwrapping errors successfully in practical applications.

  10. Application of an Optimal Tuner Selection Approach for On-Board Self-Tuning Engine Models

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Armstrong, Jeffrey B.; Garg, Sanjay

    2012-01-01

    An enhanced design methodology for minimizing the error in on-line Kalman filter-based aircraft engine performance estimation applications is presented in this paper. It specifically addresses the under-determined estimation problem, in which there are more unknown parameters than available sensor measurements. This work builds upon an existing technique for systematically selecting a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. While the existing technique was optimized for open-loop engine operation at a fixed design point, in this paper an alternative formulation is presented that enables the technique to be optimized for an engine operating under closed-loop control throughout the flight envelope. The theoretical Kalman filter mean squared estimation error at a steady-state closed-loop operating point is derived, and the tuner selection approach applied to minimize this error is discussed. A technique for constructing a globally optimal tuning parameter vector, which enables full-envelope application of the technology, is also presented, along with design steps for adjusting the dynamic response of the Kalman filter state estimates. Results from the application of the technique to linear and nonlinear aircraft engine simulations are presented and compared to the conventional approach of tuner selection. The new methodology is shown to yield a significant improvement in on-line Kalman filter estimation accuracy.
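    The contribution above concerns how the tuning parameter vector is chosen; for context only, the sketch below shows a generic linear Kalman filter predict/update cycle, not the tuner-selection methodology itself, and the toy system matrices are assumptions.

    ```python
    import numpy as np

    def kalman_step(x, P, z, A, H, Q, R):
        """One predict/update cycle of a linear Kalman filter."""
        # predict
        x = A @ x
        P = A @ P @ A.T + Q
        # update with measurement z
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

    # toy system with more states than sensors, as in the under-determined case
    A = np.array([[1.0, 0.1], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])
    Q, R = 1e-4 * np.eye(2), np.array([[1e-2]])
    x, P = np.zeros(2), np.eye(2)
    x, P = kalman_step(x, P, z=np.array([0.5]), A=A, H=H, Q=Q, R=R)
    print(x)
    ```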

  11. Maintaining Academic Integrity in On-Line Education.

    ERIC Educational Resources Information Center

    Heberling, Michael

    2002-01-01

    Discussion of academic cheating and plagiarism focuses on occurrences in online courses, based on experiences at Baker College (Michigan). Highlights include tools to fight plagiarism; using search engines to detect plagiarism; digital paper mills; plagiarism detection companies; and the role of administrators and faculty. (LRW)

  12. Syndromic surveillance for health information system failures: a feasibility study.

    PubMed

    Ong, Mei-Sing; Magrabi, Farah; Coiera, Enrico

    2013-05-01

    To explore the applicability of a syndromic surveillance method to the early detection of health information technology (HIT) system failures. A syndromic surveillance system was developed to monitor a laboratory information system at a tertiary hospital. Four indices were monitored: (1) total laboratory records being created; (2) total records with missing results; (3) average serum potassium results; and (4) total duplicated tests on a patient. The goal was to detect HIT system failures causing: data loss at the record level; data loss at the field level; erroneous data; and unintended duplication of data. Time-series models of the indices were constructed, and statistical process control charts were used to detect unexpected behaviors. The ability of the models to detect HIT system failures was evaluated using simulated failures, each lasting for 24 h, with error rates ranging from 1% to 35%. In detecting data loss at the record level, the model achieved a sensitivity of 0.26 when the simulated error rate was 1%, while maintaining a specificity of 0.98. Detection performance improved with increasing error rates, achieving a perfect sensitivity when the error rate was 35%. In the detection of missing results, erroneous serum potassium results and unintended repetition of tests, perfect sensitivity was attained when the error rate was as small as 5%. Decreasing the error rate to 1% resulted in a drop in sensitivity to 0.65-0.85. Syndromic surveillance methods can potentially be applied to monitor HIT systems, to facilitate the early detection of failures.
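    A hedged sketch of the statistical-process-control step applied to one of the four indices (total records created per day); the Shewhart-style 3-sigma limits, baseline length and synthetic counts are assumptions, whereas the study fitted time-series models before charting.

    ```python
    import numpy as np

    def control_chart_alarms(counts, baseline_days=30, sigma=3.0):
        """Flag days whose count falls outside mean +/- sigma*SD of a baseline window."""
        baseline = np.asarray(counts[:baseline_days], dtype=float)
        mu, sd = baseline.mean(), baseline.std(ddof=1)
        lo, hi = mu - sigma * sd, mu + sigma * sd
        return [i for i, c in enumerate(counts[baseline_days:], start=baseline_days)
                if c < lo or c > hi]

    rng = np.random.default_rng(1)
    daily_records = rng.poisson(5000, size=60).tolist()
    daily_records[45] = int(daily_records[45] * 0.7)    # simulated 30% record loss
    print(control_chart_alarms(daily_records))          # day 45 is flagged
    ```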

  13. Is there any electrophysiological evidence for subliminal error processing?

    PubMed

    Shalgi, Shani; Deouell, Leon Y

    2013-08-29

    The role of error awareness in executive control and modification of behavior is not fully understood. In line with many recent studies showing that conscious awareness is unnecessary for numerous high-level processes such as strategic adjustments and decision making, it was suggested that error detection can also take place unconsciously. The Error Negativity (Ne) component, long established as a robust error-related component that differentiates between correct responses and errors, was a fine candidate to test this notion: if an Ne is elicited also by errors which are not consciously detected, it would imply a subliminal process involved in error monitoring that does not necessarily lead to conscious awareness of the error. Indeed, for the past decade, the repeated finding of a similar Ne for errors which became aware and errors that did not achieve awareness, compared to the smaller negativity elicited by correct responses (Correct Response Negativity; CRN), has lent the Ne the prestigious status of an index of subliminal error processing. However, there were several notable exceptions to these findings. The study in the focus of this review (Shalgi and Deouell, 2012) sheds new light on both types of previous results. We found that error detection as reflected by the Ne is correlated with subjective awareness: when awareness (or more importantly lack thereof) is more strictly determined using the wagering paradigm, no Ne is elicited without awareness. This result effectively resolves the issue of why there are many conflicting findings regarding the Ne and error awareness. The average Ne amplitude appears to be influenced by individual criteria for error reporting and therefore, studies containing different mixtures of participants who are more confident of their own performance or less confident, or paradigms that either encourage or don't encourage reporting low confidence errors will show different results. Based on this evidence, it is no longer possible to unquestioningly uphold the notion that the amplitude of the Ne is unrelated to subjective awareness, and therefore, that errors are detected without conscious awareness.

  14. Activity Tracking for Pilot Error Detection from Flight Data

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Ashford, Rose (Technical Monitor)

    2002-01-01

    This report presents an application of activity tracking for pilot error detection from flight data, and describes issues surrounding such an application. It first describes the Crew Activity Tracking System (CATS), in-flight data collected from the NASA Langley Boeing 757 Airborne Research Integrated Experiment System aircraft, and a model of B757 flight crew activities. It then presents an example of CATS detecting actual in-flight crew errors.

  15. A PC-based computer package for automatic detection and location of earthquakes: Application to a seismic network in eastern Sicily (Italy)

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio; Giampiccolo, Elisabetta; Gresta, Stefano

    A few automated data acquisition and processing systems operate on mainframes; some run on UNIX-based workstations and others on personal computers equipped with either DOS/WINDOWS or UNIX-derived operating systems. Several large and complex software packages for automatic and interactive analysis of seismic data have been developed in recent years (mainly for UNIX-based systems). Some of these programs use a variety of artificial intelligence techniques. The first operational version of a new software package, named PC-Seism, for analyzing seismic data from a local network is presented in Patanè et al. (1999). This package, composed of three separate modules, provides an example of a new generation of visual object-oriented programs for interactive and automatic seismic data-processing running on a personal computer. In this work, we mainly discuss the automatic procedures implemented in the ASDP (Automatic Seismic Data-Processing) module and its real-time application to data acquired by a seismic network running in eastern Sicily. This software uses a multi-algorithm approach and a new procedure MSA (multi-station-analysis) for signal detection, phase grouping and event identification and location. It is designed for an efficient and accurate processing of local earthquake records provided by single-site and array stations. Results from ASDP processing of two different data sets recorded at Mt. Etna volcano by a regional network are analyzed to evaluate its performance. By comparing the ASDP pickings with those revised manually, the detection and subsequently the location capabilities of this software are assessed. The first data set is composed of 330 local earthquakes recorded in the Mt. Etna area during 1997 by the telemetry analog seismic network. The second data set comprises about 970 automatic locations of more than 2600 local events recorded at Mt. Etna during the last eruption (July 2001) by the present network. For the former data set, a comparison of the automatic results with the manual picks indicates that the ASDP module can accurately pick 80% of the P-waves and 65% of S-waves. The on-line application on the latter data set shows that automatic locations are affected by larger errors, due to the preliminary setting of the configuration parameters in the program. However, both automatic ASDP and manual hypocenter locations are comparable within the estimated error bounds. New improvements to the PC-Seism software for on-line analysis are also discussed.

  16. TH-AB-202-02: Real-Time Verification and Error Detection for MLC Tracking Deliveries Using An Electronic Portal Imaging Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J Zwan, B; Central Coast Cancer Centre, Gosford, NSW; Colvill, E

    2016-06-15

    Purpose: The added complexity of real-time adaptive multi-leaf collimator (MLC) tracking increases the likelihood of undetected MLC delivery errors. In this work we develop and test a system for real-time delivery verification and error detection for MLC tracking radiotherapy using an electronic portal imaging device (EPID). Methods: The delivery verification system relies on acquisition and real-time analysis of transit EPID image frames acquired at 8.41 fps. In-house software was developed to extract the MLC positions from each image frame. Three comparison metrics were used to verify the MLC positions in real-time: (1) field size, (2) field location and (3) field shape. The delivery verification system was tested for 8 VMAT MLC tracking deliveries (4 prostate and 4 lung) where real patient target motion was reproduced using a Hexamotion motion stage and a Calypso system. Sensitivity and detection delay were quantified for various types of MLC and system errors. Results: For both the prostate and lung test deliveries the MLC-defined field size was measured with an accuracy of 1.25 cm² (1 SD). The field location was measured with an accuracy of 0.6 mm and 0.8 mm (1 SD) for lung and prostate respectively. Field location errors (i.e. tracking in the wrong direction) with a magnitude of 3 mm were detected within 0.4 s of occurrence in the X direction and 0.8 s in the Y direction. Systematic MLC gap errors were detected as small as 3 mm. The method was not found to be sensitive to random MLC errors and individual MLC calibration errors up to 5 mm. Conclusion: EPID imaging may be used for independent real-time verification of MLC trajectories during MLC tracking deliveries. Thresholds have been determined for error detection and the system has been shown to be sensitive to a range of delivery errors.
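    As a hedged, simplified illustration of two of the three comparison metrics (field size and field location) computed from a binary MLC aperture mask, the sketch below compares a planned and a shifted aperture; it is not the in-house real-time software, and the pixel size and mask shapes are assumptions.

    ```python
    import numpy as np

    def field_size_and_centroid(aperture_mask, pixel_mm=1.0):
        """Return (area in cm^2, centroid (x, y) in mm) of a binary aperture mask."""
        mask = np.asarray(aperture_mask, dtype=bool)
        area_cm2 = mask.sum() * (pixel_mm ** 2) / 100.0
        ys, xs = np.nonzero(mask)
        return area_cm2, (xs.mean() * pixel_mm, ys.mean() * pixel_mm)

    planned = np.zeros((100, 100), dtype=bool);  planned[40:60, 30:70] = True
    measured = np.zeros((100, 100), dtype=bool); measured[40:60, 33:73] = True  # 3 mm X shift
    a_p, c_p = field_size_and_centroid(planned)
    a_m, c_m = field_size_and_centroid(measured)
    print(abs(a_p - a_m), np.hypot(c_p[0] - c_m[0], c_p[1] - c_m[1]))  # 0.0 cm^2, 3.0 mm
    ```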

  17. ClubSub-P: Cluster-Based Subcellular Localization Prediction for Gram-Negative Bacteria and Archaea

    PubMed Central

    Paramasivam, Nagarajan; Linke, Dirk

    2011-01-01

    The subcellular localization (SCL) of proteins provides important clues to their function in a cell. In our efforts to predict useful vaccine targets against Gram-negative bacteria, we noticed that misannotated start codons frequently lead to wrongly assigned SCLs. This and other problems in SCL prediction, such as the relatively high false-positive and false-negative rates of some tools, can be avoided by applying multiple prediction tools to groups of homologous proteins. Here we present ClubSub-P, an online database that combines existing SCL prediction tools into a consensus pipeline applied to more than 600 proteomes of fully sequenced microorganisms. On top of the consensus prediction at the level of single sequences, the tool uses clusters of homologous proteins from Gram-negative bacteria and from Archaea to eliminate false-positive and false-negative predictions. ClubSub-P can assign the SCL of proteins from Gram-negative bacteria and Archaea with high precision. The database is searchable and can easily be expanded using either new bacterial genomes or new prediction tools as they become available. This will further improve the performance of SCL prediction, as well as the detection of misannotated start codons and other annotation errors. ClubSub-P is available online at http://toolkit.tuebingen.mpg.de/clubsubp/. PMID:22073040

  18. Adaptive Online Sequential ELM for Concept Drift Tackling

    PubMed Central

    Basaruddin, Chan

    2016-01-01

    A machine learning method needs to adapt to changes in the environment over time. Such changes are known as concept drift. In this paper, we propose a concept drift tackling method as an enhancement of the Online Sequential Extreme Learning Machine (OS-ELM) and Constructive Enhancement OS-ELM (CEOS-ELM) by adding adaptive capability for classification and regression problems. The scheme is named adaptive OS-ELM (AOS-ELM). It is a single-classifier scheme that handles real drift, virtual drift, and hybrid drift well, and it also works well for sudden drift and recurrent context change types. The scheme is a simple unified method implemented in a few lines of code. We evaluated AOS-ELM on regression and classification problems using public concept drift data sets (SEA and STAGGER) and other public data sets such as MNIST, USPS, and IDS. Experiments show that our method gives a higher kappa value than a multiclassifier ELM ensemble. Even though AOS-ELM in practice does not need to increase the number of hidden nodes, we address some issues related to increasing the hidden nodes, such as error conditions and rank values. We propose taking the rank of the pseudoinverse matrix as an indicator parameter to detect an “underfitting” condition. PMID:27594879
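
    The rank-based “underfitting” indicator mentioned at the end of the abstract can be illustrated with a basic ELM hidden layer: if the hidden-layer output matrix (and hence its pseudoinverse) is rank-deficient, added nodes are not adding capacity. The sigmoid random projection and thresholds below are generic ELM assumptions, not the exact AOS-ELM formulation.

```python
# Minimal sketch of the rank check suggested in the abstract: build an ELM-style
# hidden-layer output matrix H and flag a possible "underfitting" condition when
# H (and hence its pseudoinverse) is rank-deficient. The sigmoid hidden layer
# and the sizes used are assumptions, not the exact AOS-ELM formulation.
import numpy as np

def hidden_output(X, n_hidden, rng):
    """Random-projection sigmoid hidden layer, as in a basic ELM."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def underfitting_indicator(X, n_hidden, rng=np.random.default_rng(0)):
    H = hidden_output(X, n_hidden, rng)
    rank = np.linalg.matrix_rank(H)
    return rank, rank < n_hidden        # True -> rank-deficient, possible underfitting

if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(50, 5))
    print(underfitting_indicator(X, n_hidden=20))   # 50 samples, 20 hidden nodes
    print(underfitting_indicator(X, n_hidden=80))   # more nodes than samples -> deficient
```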

  19. Laboratory and field based evaluation of chromatography ...

    EPA Pesticide Factsheets

    The Monitor for AeRosols and GAses in ambient air (MARGA) is an on-line ion-chromatography-based instrument designed for speciation of the inorganic gas and aerosol ammonium-nitrate-sulfate system. Previous work to characterize the performance of the MARGA has been primarily based on field comparison to other measurement methods to evaluate accuracy. While such studies are useful, the underlying reasons for disagreement among methods are not always clear. This study examines aspects of MARGA accuracy and precision specifically related to automated chromatography analysis. Using laboratory standards, analytical accuracy, precision, and method detection limits derived from the MARGA chromatography software are compared to an alternative software package (Chromeleon, Thermo Scientific Dionex). Field measurements are used to further evaluate instrument performance, including the MARGA’s use of an internal LiBr standard to control accuracy. Using gas/aerosol ratios and aerosol neutralization state as a case study, the impact of chromatography on measurement error is assessed. The new generation of on-line chromatography-based gas and particle measurement systems has many advantages, including simultaneous analysis of multiple pollutants. The Monitor for Aerosols and Gases in Ambient Air (MARGA) is such an instrument that is used in North America, Europe, and Asia for atmospheric process studies as well as routine monitoring. While the instrument has been evaluat

  20. ERP evidence for on-line syntactic computations in 2-year-olds.

    PubMed

    Brusini, Perrine; Dehaene-Lambertz, Ghislaine; Dutat, Michel; Goffinet, François; Christophe, Anne

    2016-06-01

    Syntax allows human beings to build an infinite number of sentences from a finite number of words. How this unique, productive power of human language unfolds over the course of language development is still hotly debated. When they listen to sentences comprising newly-learned words, do children generalize from their knowledge of the legal combinations of word categories or do they instead rely on strings of words stored in memory to detect syntactic errors? Using novel words taught in the lab, we recorded Evoked Response Potentials (ERPs) in two-year-olds and adults listening to grammatical and ungrammatical sentences containing syntactic contexts that had not been used during training. In toddlers, the ungrammatical use of words, even when they had just been learned, induced an early left anterior negativity (surfacing 100-400 ms after target word onset) followed by a late posterior positivity (surfacing 700-900 ms after target word onset) that was not observed in grammatical sentences. This late effect was remarkably similar to the P600 displayed by adults, suggesting that toddlers and adults perform similar syntactic computations. Our results thus show that toddlers build on-line expectations regarding the syntactic category of upcoming words in a sentence. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Analysis of hydroxamate siderophores in soil solution using liquid chromatography with mass spectrometry and tandem mass spectrometry with on-line sample preconcentration.

    PubMed

    Olofsson, Madelen A; Bylund, Dan

    2015-10-01

    A liquid chromatography with electrospray ionization mass spectrometry method was developed to quantitatively and qualitatively analyze 13 hydroxamate siderophores (ferrichrome, ferrirubin, ferrirhodin, ferrichrysin, ferricrocin, ferrioxamine B, D1, E and G, neocoprogen I and II, coprogen and triacetylfusarinine C). Samples were preconcentrated on-line by a switch-valve setup prior to analyte separation on a Kinetex C18 column. Gradient elution was performed using a mixture of an ammonium formate buffer and acetonitrile. Total analysis time including column conditioning was 20.5 min. Analytes were fragmented by applying collision-induced dissociation, enabling structural identification by tandem mass spectrometry. Limit of detection values for the selected ion monitoring method ranged from 71 pM to 1.5 nM, with corresponding values two to nine times higher for the multiple reaction monitoring method. The liquid chromatography with mass spectrometry method resulted in a robust and sensitive quantification of hydroxamate siderophores as indicated by retention time stability, linearity, sensitivity, precision and recovery. The analytical error of the methods, assessed through random-order, duplicate analysis of soil samples extracted with a mixture of 10 mM phosphate buffer and methanol, appears negligible in relation to between-sample variations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Optimal estimation of suspended-sediment concentrations in streams

    USGS Publications Warehouse

    Holtschlag, D.J.

    2001-01-01

    Optimal estimators are developed for computation of suspended-sediment concentrations in streams. The estimators are a function of parameters, computed by use of generalized least squares, which simultaneously account for effects of streamflow, seasonal variations in average sediment concentrations, a dynamic error component, and the uncertainty in concentration measurements. The parameters are used in a Kalman filter for on-line estimation and an associated smoother for off-line estimation of suspended-sediment concentrations. The accuracies of the optimal estimators are compared with alternative time-averaging interpolators and flow-weighting regression estimators by use of long-term daily-mean suspended-sediment concentration and streamflow data from 10 sites within the United States. For sampling intervals from 3 to 48 days, the standard errors of on-line and off-line optimal estimators ranged from 52.7 to 107%, and from 39.5 to 93.0%, respectively. The corresponding standard errors of linear and cubic-spline interpolators ranged from 48.8 to 158%, and from 50.6 to 176%, respectively. The standard errors of simple and multiple regression estimators, which did not vary with the sampling interval, were 124 and 105%, respectively. Thus, the optimal off-line estimator (Kalman smoother) had the lowest error characteristics of those evaluated. Because suspended-sediment concentrations are typically measured at less than 3-day intervals, use of optimal estimators will likely result in significant improvements in the accuracy of continuous suspended-sediment concentration records. Additional research on the integration of direct suspended-sediment concentration measurements and optimal estimators applied at hourly or shorter intervals is needed.
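
    The contrast drawn above between on-line (Kalman filter) and off-line (smoother) estimation can be illustrated with a toy scalar random-walk state standing in for log concentration; the noise variances, missing-sample handling, and the Rauch-Tung-Striebel smoother form below are illustrative assumptions, not the paper's generalized-least-squares parameterization.

```python
# Toy 1D sketch of on-line (Kalman filter) versus off-line (RTS smoother)
# estimation, as contrasted in the abstract. A scalar random-walk state stands
# in for log suspended-sediment concentration; all variances are invented.
import numpy as np

def kalman_filter(y, q=0.05, r=0.5):
    n = len(y)
    x, p = np.zeros(n), np.zeros(n)
    x_pred, p_pred = np.zeros(n), np.zeros(n)
    xk, pk = 0.0, 1.0                              # vague prior on the state
    for k in range(n):
        x_pred[k], p_pred[k] = xk, pk + q          # predict (random walk)
        if not np.isnan(y[k]):                     # update only where sampled
            gain = p_pred[k] / (p_pred[k] + r)
            xk = x_pred[k] + gain * (y[k] - x_pred[k])
            pk = (1 - gain) * p_pred[k]
        else:
            xk, pk = x_pred[k], p_pred[k]
        x[k], p[k] = xk, pk
    return x, p, x_pred, p_pred

def rts_smoother(x, p, x_pred, p_pred):
    """Rauch-Tung-Striebel backward pass over the filtered estimates."""
    xs, ps = x.copy(), p.copy()
    for k in range(len(x) - 2, -1, -1):
        c = p[k] / p_pred[k + 1]
        xs[k] = x[k] + c * (xs[k + 1] - x_pred[k + 1])
        ps[k] = p[k] + c**2 * (ps[k + 1] - p_pred[k + 1])
    return xs, ps

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    truth = np.cumsum(rng.normal(0, 0.2, 100))
    y = truth + rng.normal(0, 0.7, 100)
    y[5::7] = np.nan                               # pretend some days are unsampled
    xf, pf, xp, pp = kalman_filter(y)
    xs, _ = rts_smoother(xf, pf, xp, pp)
    print(np.round(xs[:5], 2))
```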

  3. “How Did We Get Here?”: Topic Drift in Online Health Discussions

    PubMed Central

    Hartzler, Andrea L; Huh, Jina; Hsieh, Gary; McDonald, David W; Pratt, Wanda

    2016-01-01

    Background Patients increasingly use online health communities to exchange health information and peer support. During the progression of health discussions, a change of topic—topic drift—can occur. Topic drift is a frequent phenomenon linked to incoherence and frustration in online communities and other forms of computer-mediated communication. For sensitive topics, such as health, such drift could have life-altering repercussions, yet topic drift has not been studied in these contexts. Objective Our goals were to understand topic drift in online health communities and then to develop and evaluate an automated approach to detect both topic drift and efforts of community members to counteract such drift. Methods We manually analyzed 721 posts from 184 threads from 7 online health communities within WebMD to understand topic drift, members’ reaction towards topic drift, and their efforts to counteract topic drift. Then, we developed an automated approach to detect topic drift and counteraction efforts. We detected topic drift by calculating cosine similarity between 229,156 posts from 37,805 threads and measuring change of cosine similarity scores from the threads’ first posts to their sequential posts. Using a similar approach, we detected counteractions to topic drift in threads by focusing on the irregular increase of similarity scores compared to the previous post in threads. Finally, we evaluated the performance of our automated approaches to detect topic drift and counteracting efforts by using a manually developed gold standard. Results Our qualitative analyses revealed that in threads of online health communities, topics change gradually, but usually stay within the global frame of topics for the specific community. Members showed frustration when topic drift occurred in the middle of threads but reacted positively to off-topic stories shared as separate threads. Although all types of members helped to counteract topic drift, original posters provided the most effort to keep threads on topic. Cosine similarity scores show promise for automatically detecting topical changes in online health discussions. In our manual evaluation, we achieved an F1 score of .71 and .73 for detecting topic drift and counteracting efforts to stay on topic, respectively. Conclusions Our analyses expand our understanding of topic drift in a health context and highlight practical implications, such as promoting off-topic discussions as a function of building rapport in online health communities. Furthermore, the quantitative findings suggest that an automated tool could help detect topic drift, support counteraction efforts to bring the conversation back on topic, and improve communication in these important communities. Findings from this study have the potential to reduce topic drift and improve online health community members’ experience of computer-mediated communication. Improved communication could enhance the personal health management of members who seek essential information and support during times of difficulty. PMID:27806924
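
    The cosine-similarity mechanics described above (compare each post to the thread's first post to detect drift, and look for an irregular jump in similarity to detect counteraction) can be sketched in a few lines; the TF-IDF representation, both thresholds, and the toy thread are assumptions and do not reproduce the paper's gold-standard evaluation.

```python
# Minimal sketch of the similarity-based drift check described in the abstract:
# compare each post in a thread to the first post (drift), and treat a sudden
# jump back up in that similarity as a possible counteraction effort.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def drift_flags(posts, drift_thresh=0.15, counter_jump=0.15):
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(posts)
    sim_to_first = cosine_similarity(tfidf, tfidf[0]).ravel()
    flags = []
    for i in range(1, len(posts)):
        drifted = sim_to_first[i] < drift_thresh
        counter = (sim_to_first[i] - sim_to_first[i - 1]) > counter_jump
        flags.append((i, drifted, counter))
    return flags

if __name__ == "__main__":
    thread = [
        "Has anyone tried the new insulin pump for type 1 diabetes?",
        "I switched to a new pump last year and my blood sugar control improved.",
        "Speaking of last year, we went on a great beach vacation.",
        "Back to the pump question: ask your endocrinologist about sensors.",
    ]
    for i, drifted, counter in drift_flags(thread):
        print(i, "drift" if drifted else "on-topic", "| counteraction" if counter else "")
```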

  4. On-line Flagging of Anomalies and Adaptive Sequential Hypothesis Testing for Fine-feature Characterization of Geosynchronous Satellites

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Payne, T.; Kinateder, K.; Dao, P.; Beecher, E.; Boone, D.; Elliott, B.

    The objective of on-line flagging in this paper is to perform interactive assessment of geosynchronous satellite anomalies, such as cross-tagging of satellites in a cluster, solar panel offset changes, etc. This assessment will utilize a Bayesian belief propagation procedure and will include automated updates of the baseline signature data for the satellite, while accounting for seasonal changes. Its purpose is to enable an ongoing, automated assessment of satellite behavior through its life cycle using the photometry data collected during the synoptic search performed by a ground- or space-based sensor as part of its metrics mission. Changes in the satellite features will be reported along with the probabilities of Type I and Type II errors. The objective of adaptive sequential hypothesis testing in this paper is to define future sensor tasking for the purpose of characterizing fine features of the satellite. The tasking will be designed to maximize new information with the least number of photometry data points collected during the synoptic search by a ground- or space-based sensor. Its calculation is based on information entropy techniques. The tasking is defined by considering a sequence of hypotheses regarding the fine features of the satellite. The optimal observation conditions are then ordered to maximize new information about a chosen fine feature. The combined objective of on-line flagging and adaptive sequential hypothesis testing is to progressively discover new information about the features of a geosynchronous satellite by leveraging the regular but sparse cadence of data collection during the synoptic search performed by a ground- or space-based sensor. Automated Algorithm to Detect Changes in Geostationary Satellite's Configuration and Cross-Tagging (Phan Dao, Air Force Research Laboratory/RVB): By characterizing geostationary satellites based on photometry and color photometry, analysts can evaluate satellite operational status and affirm its true identity. The process of ingesting photometry data and deriving satellite physical characteristics can be directed by analysts in a batch mode, meaning using a batch of recent data, or by automated algorithms in an on-line mode in which the assessment is updated with each new data point. Tools used for detecting changes to a satellite's status or identity, whether operated with a human in the loop or as automated algorithms, are generally not built to detect with minimum latency and traceable confidence intervals. To alleviate those deficiencies, we investigate the use of Hidden Markov Models (HMM), in a Bayesian network framework, to infer the hidden state (changed or unchanged) of a three-axis stabilized geostationary satellite using broadband and color photometry. Unlike frequentist statistics, which exploit only the stationary statistics of the observables in the database, the HMM exploits the temporal pattern of the observables as well. The algorithm also operates in a “learning” mode to gradually evolve the HMM and accommodate natural changes, such as the seasonal dependence of a GEO satellite's light curve. Our technique is designed to operate with missing color data. The version that ingests both panchromatic and color data can accommodate gaps in color photometry data. That attribute is important because while color indices, e.g.
Johnson R and B, enhance the belief (probability) of a hidden state, in real-world situations flux data are collected sporadically in untasked collects, and color data are limited and sometimes absent. Fluxes are measured with experimental error whose effect on the algorithm will be studied. Photometry data in the AFRL Geo Color Photometry Catalog and the Geo Observations with Latitudinal Diversity Simultaneously (GOLDS) data sets are used to simulate a wide variety of operational changes and identity cross-tags. The algorithm is tested against simulated sequences of observed magnitudes, mimicking the cadence of untasked SSN and other ground sensors, occasional operational changes, and the possible occurrence of cross-tags of in-cluster satellites. We would like to show that the on-line algorithm can detect change, sometimes right after the first post-change data point is analyzed, i.e., with zero latency. We also want to show the unsupervised “learning” capability that allows the HMM to evolve with time without user assistance; for example, users are not required to “label” the true state of the data points.
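
    A minimal version of the two-state (unchanged/changed) HMM idea described above is sketched below as a forward filter over a photometric residual series with Gaussian emissions; the transition matrix, emission parameters, and treatment of missing data are invented for illustration and are not the AFRL model.

```python
# Minimal two-state (unchanged / changed) HMM forward filter over a photometric
# residual series, in the spirit of the approach sketched above. Transition and
# emission parameters are invented for illustration; NaN entries stand in for
# missing color/flux data and are simply skipped in the update step.
import numpy as np

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def forward_filter(residuals, p_change=0.02):
    # state 0 = unchanged (residuals near 0), state 1 = changed (offset residuals)
    trans = np.array([[1 - p_change, p_change],
                      [0.05,         0.95]])
    mus, sigmas = np.array([0.0, 0.6]), np.array([0.15, 0.25])
    belief = np.array([0.99, 0.01])                 # prior: almost surely unchanged
    prob_changed = []
    for r in residuals:
        belief = belief @ trans                     # predict
        if not np.isnan(r):                         # update only when data exist
            belief = belief * gaussian(r, mus, sigmas)
            belief /= belief.sum()
        prob_changed.append(belief[1])
    return np.array(prob_changed)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    res = rng.normal(0, 0.15, 60)
    res[40:] += 0.6                                 # simulated operational change
    res[::9] = np.nan                               # sporadic, untasked collection
    print(np.round(forward_filter(res)[35:45], 2))
```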

  5. Prescribing Errors Involving Medication Dosage Forms

    PubMed Central

    Lesar, Timothy S

    2002-01-01

    CONTEXT Prescribing errors involving medication dose formulations have been reported to occur frequently in hospitals. No systematic evaluations of the characteristics of errors related to medication dosage formulation have been performed. OBJECTIVE To quantify the characteristics, frequency, and potential adverse patient effects of prescribing errors involving medication dosage forms. DESIGN Evaluation of all detected medication prescribing errors involving or related to medication dosage forms in a 631-bed tertiary care teaching hospital. MAIN OUTCOME MEASURES Type, frequency, and potential for adverse effects of prescribing errors involving or related to medication dosage forms. RESULTS A total of 1,115 clinically significant prescribing errors involving medication dosage forms were detected during the 60-month study period. The annual number of detected errors increased throughout the study period. Detailed analysis of the 402 errors detected during the last 16 months of the study demonstrated the most common errors to be: failure to specify a controlled-release formulation (total of 280 cases; 69.7%) both when prescribing using the brand name (148 cases; 36.8%) and when prescribing using the generic name (132 cases; 32.8%); and prescribing controlled-delivery formulations to be administered per tube (48 cases; 11.9%). The potential for adverse patient outcome was rated as potentially “fatal or severe” in 3 cases (0.7%), and “serious” in 49 cases (12.2%). Errors most commonly involved cardiovascular agents (208 cases; 51.7%). CONCLUSIONS Hospitalized patients are at risk for adverse outcomes due to prescribing errors related to inappropriate use of medication dosage forms. This information should be considered in the development of strategies to prevent adverse patient outcomes resulting from such errors. PMID:12213138

  6. Quick foot placement adjustments during gait are less accurate in individuals with focal cerebellar lesions.

    PubMed

    Hoogkamer, Wouter; Potocanac, Zrinka; Van Calenbergh, Frank; Duysens, Jacques

    2017-10-01

    Online gait corrections are frequently used to restore gait stability and prevent falling. They require shorter response times than voluntary movements, which suggests that subcortical pathways contribute to the execution of online gait corrections. To evaluate the potential role of the cerebellum in these pathways, we tested the hypotheses that online gait corrections would be less accurate in individuals with focal cerebellar damage than in neurologically intact controls and that this difference would be more pronounced for shorter available response times and for short-step gait corrections. We projected virtual stepping stones on an instrumented treadmill while some of the approaching stepping stones were shifted forward or backward, requiring participants to adjust their foot placement. Varying the timing of those shifts allowed us to address the effect of available response time on foot placement error. In agreement with our hypothesis, individuals with focal cerebellar lesions were less accurate in adjusting their foot placement in reaction to suddenly shifted stepping stones than neurologically intact controls. However, the cerebellar lesion group's foot placement error did not increase more with decreasing available response distance or for short-step versus long-step adjustments compared to the control group. Furthermore, foot placement error for the non-shifting stepping stones was also larger in the cerebellar lesion group as compared to the control group. Consequently, the reduced ability to accurately adjust foot placement during walking in individuals with focal cerebellar lesions appears to be a general movement control deficit, which could contribute to increased fall risk. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Growth monitoring and control in complex medium: a case study employing fed-batch penicillin fermentation and computer-aided on-line mass balancing.

    PubMed

    Mou, D G; Cooney, C L

    1983-01-01

    To broaden the practicality of on-line growth monitoring and control, its application in fed-batch penicillin fermentation using a high corn steep liquor (CSL) concentration (53 g/L) is demonstrated. By employing a calculation method that considers the vagaries of CSL consumption, overall and instantaneous carbon-balancing equations are successfully used to calculate, on-line, the cell concentration and instantaneous specific growth rate in the penicillin production phase. As a consequence, these equations, together with a feedback control strategy, enable computer control of the glucose feed and maintenance of the preselected production-phase growth rate with an error of less than 0.002 h⁻¹.
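
    A highly simplified version of the overall carbon balance can illustrate how biomass and specific growth rate might be tracked on-line; the yields, carbon fractions, and the neglect of CSL consumption dynamics below are assumptions, whereas the published method accounts for CSL carbon explicitly.

```python
# Highly simplified sketch of an overall carbon balance used to track biomass
# on-line, in the spirit of the approach described above. All carbon fractions
# are assumed values, and CSL carbon is ignored for brevity.
import numpy as np

SIGMA_X = 0.45      # g carbon per g dry cell weight (assumed)
SIGMA_G = 0.40      # g carbon per g glucose (assumed)
SIGMA_P = 0.54      # g carbon per g penicillin (assumed)

def biomass_from_carbon(glucose_fed, co2_carbon, penicillin, x0):
    """Cumulative carbon balance -> cell concentration trajectory (g/L)."""
    net_c = SIGMA_G * np.cumsum(glucose_fed) - np.cumsum(co2_carbon) \
            - SIGMA_P * np.cumsum(penicillin)
    return x0 + net_c / SIGMA_X

def specific_growth_rate(x, dt_h):
    """Instantaneous mu = (1/X) dX/dt from the estimated biomass series."""
    return np.gradient(x, dt_h) / x

if __name__ == "__main__":
    dt, n = 0.5, 48                            # half-hourly on-line estimates
    glucose = np.full(n, 0.6)                  # g/L glucose fed per interval (assumed)
    co2_c = np.full(n, 0.15)                   # g carbon/L evolved as CO2 per interval
    pen = np.full(n, 0.05)                     # g/L penicillin formed per interval
    x = biomass_from_carbon(glucose, co2_c, pen, x0=15.0)
    print(np.round(specific_growth_rate(x, dt)[:5], 4))
```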

  8. Detecting and Characterizing Semantic Inconsistencies in Ported Code

    NASA Technical Reports Server (NTRS)

    Ray, Baishakhi; Kim, Miryung; Person, Suzette J.; Rungta, Neha

    2013-01-01

    Adding similar features and bug fixes often requires porting program patches from reference implementations and adapting them to target implementations. Porting errors may result from faulty adaptations or inconsistent updates. This paper investigates (1) the types of porting errors found in practice, and (2) how to detect and characterize potential porting errors. Analyzing version histories, we define five categories of porting errors, including incorrect control- and data-flow, code redundancy, inconsistent identifier renamings, etc. Leveraging this categorization, we design a static control- and data-dependence analysis technique, SPA, to detect and characterize porting inconsistencies. Our evaluation on code from four open-source projects shows that SPA can detect porting inconsistencies with 65% to 73% precision and 90% recall, and identify inconsistency types with 58% to 63% precision and 92% to 100% recall. In a comparison with two existing error detection tools, SPA improves precision by 14 to 17 percentage points.

  9. Statistical approaches to account for false-positive errors in environmental DNA samples.

    PubMed

    Lahoz-Monfort, José J; Guillera-Arroita, Gurutzeta; Tingley, Reid

    2016-05-01

    Environmental DNA (eDNA) sampling is prone to both false-positive and false-negative errors. We review statistical methods to account for such errors in the analysis of eDNA data and use simulations to compare the performance of different modelling approaches. Our simulations illustrate that even low false-positive rates can produce biased estimates of occupancy and detectability. We further show that removing or classifying single PCR detections in an ad hoc manner under the suspicion that such records represent false positives, as sometimes advocated in the eDNA literature, also results in biased estimation of occupancy, detectability and false-positive rates. We advocate alternative approaches to account for false-positive errors that rely on prior information, or the collection of ancillary detection data at a subset of sites using a sampling method that is not prone to false-positive errors. We illustrate the advantages of these approaches over ad hoc classifications of detections and provide practical advice and code for fitting these models in maximum likelihood and Bayesian frameworks. Given the severe bias induced by false-negative and false-positive errors, the methods presented here should be more routinely adopted in eDNA studies. © 2015 John Wiley & Sons Ltd.
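
    The core of such a model is a mixture likelihood in which detections at a site arise either from true occupancy (with detection probability p11) or from false positives at unoccupied sites (with probability p10). The sketch below fits this generic single-season formulation by maximum likelihood; it is not the authors' code and omits the ancillary, error-free detection data they recommend for identifiability.

```python
# Minimal maximum-likelihood sketch of a single-season occupancy model that
# admits false positives: psi = occupancy probability, p11 = detection at
# occupied sites, p10 = false-positive detection at unoccupied sites.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom
from scipy.special import expit   # inverse-logit keeps parameters in (0, 1)

def neg_log_lik(theta, y, K):
    psi, p11, p10 = expit(theta)
    lik = psi * binom.pmf(y, K, p11) + (1 - psi) * binom.pmf(y, K, p10)
    return -np.sum(np.log(lik + 1e-300))

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    n_sites, K = 200, 6
    z = rng.random(n_sites) < 0.4                       # true occupancy states
    y = rng.binomial(K, np.where(z, 0.7, 0.05))         # detections incl. false positives
    fit = minimize(neg_log_lik, x0=np.zeros(3), args=(y, K), method="Nelder-Mead")
    print(np.round(expit(fit.x), 3))                    # psi, p11, p10 estimates
```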

  10. EEG-based decoding of error-related brain activity in a real-world driving task

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Chavarriaga, R.; Khaliliardali, Z.; Gheorghe, L.; Iturrate, I.; Millán, J. d. R.

    2015-12-01

    Objectives. Recent studies have started to explore the implementation of brain-computer interfaces (BCI) as part of driving assistant systems. The current study presents an EEG-based BCI that decodes error-related brain activity. Such information can be used, e.g., to predict a driver’s intended turning direction before reaching road intersections. Approach. We executed experiments in a car simulator (N = 22) and a real car (N = 8). While the subject was driving, a directional cue was shown before reaching an intersection, and we classified the presence or absence of error-related potentials in the EEG to infer whether the cued direction coincided with the subject’s intention. In this protocol, the directional cue can correspond to an estimation of the driving direction provided by a driving assistance system. We analyzed ERPs elicited during normal driving and evaluated the classification performance in both offline and online tests. Results. An average classification accuracy of 0.698 ± 0.065 was obtained in offline experiments in the car simulator, while tests in the real car yielded a performance of 0.682 ± 0.059. The results were significantly higher than chance level for all cases. Online experiments led to equivalent performances in both simulated and real car driving experiments. These results support the feasibility of decoding these signals to help estimate whether the driver’s intention coincides with the advice provided by the driving assistant in a real car. Significance. The study demonstrates a BCI system in real-world driving, extending the work from previous simulated studies. As far as we know, this is the first online study decoding a driver’s error-related brain activity in a real car. Given the encouraging results, the paradigm could be further improved by using more sophisticated machine learning approaches and possibly be combined with applications in intelligent vehicles.

  11. Coding for reliable satellite communications

    NASA Technical Reports Server (NTRS)

    Gaarder, N. T.; Lin, S.

    1986-01-01

    This research project was set up to study various kinds of coding techniques for error control in satellite and space communications for NASA Goddard Space Flight Center. During the project period, researchers investigated the following areas: (1) decoding of Reed-Solomon codes in terms of dual basis; (2) concatenated and cascaded error control coding schemes for satellite and space communications; (3) use of hybrid coding schemes (error correction and detection incorporated with retransmission) to improve system reliability and throughput in satellite communications; (4) good codes for simultaneous error correction and error detection, and (5) error control techniques for ring and star networks.

  12. Design and scheduling for periodic concurrent error detection and recovery in processor arrays

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Chung, Pi-Yu; Fuchs, W. Kent

    1992-01-01

    Periodic application of time-redundant error checking provides the trade-off between error detection latency and performance degradation. The goal is to achieve high error coverage while satisfying performance requirements. We derive the optimal scheduling of checking patterns in order to uniformly distribute the available checking capability and maximize the error coverage. Synchronous buffering designs using data forwarding and dynamic reconfiguration are described. Efficient single-cycle diagnosis is implemented by error pattern analysis and direct-mapped recovery cache. A rollback recovery scheme using start-up control for local recovery is also presented.

  13. A Review of System Identification Methods Applied to Aircraft

    NASA Technical Reports Server (NTRS)

    Klein, V.

    1983-01-01

    Airplane identification, equation error method, maximum likelihood method, parameter estimation in frequency domain, extended Kalman filter, aircraft equations of motion, aerodynamic model equations, criteria for the selection of a parsimonious model, and online aircraft identification are addressed.

  14. Neurometaplasticity: Glucoallostasis control of plasticity of the neural networks of error commission, detection, and correction modulates neuroplasticity to influence task precision

    NASA Astrophysics Data System (ADS)

    Welcome, Menizibeya O.; Dane, Şenol; Mastorakis, Nikos E.; Pereverzev, Vladimir A.

    2017-12-01

    The term "metaplasticity" is a recent one, meaning plasticity of synaptic plasticity. Correspondingly, neurometaplasticity simply means plasticity of neuroplasticity, indicating that a previous plastic event determines the current plasticity of neurons. Emerging studies suggest that neurometaplasticity underlies many neural activities and neurobehavioral disorders. In our previous work, we indicated that glucoallostasis is essential for the control of plasticity of the neural networks that control error commission, detection, and correction. Here we review recent works suggesting that task precision depends on the modulatory effects of neuroplasticity on the neural networks of error commission, detection, and correction. Furthermore, we discuss neurometaplasticity and its role in error commission, detection, and correction.

  15. On-orbit observations of single event upset in Harris HM-6508 1K RAMs, reissue A

    NASA Astrophysics Data System (ADS)

    Blake, J. B.; Mandel, R.

    1987-02-01

    The Harris HM-6508 1K x 1 RAMs are part of a subsystem of a satellite in a low, polar orbit. The memory module, used in the subsystem containing the RAMs, consists of three printed circuit cards, with each card containing eight 2K byte memory hybrids, for a total of 48K bytes. Each memory hybrid contains 16 HM-6508 RAM chips. On a regular basis all but 256 bytes of the 48K bytes are examined for bit errors. Two different techniques were used for detecting bit errors. The first technique, a memory check sum, was capable of automatically detecting all single bit and some double bit errors which occurred within a page of memory. A memory page consists of 256 bytes. Memory check sum tests are performed approximately every 90 minutes. To detect a multiple error or to determine the exact location of the bit error within the page the entire contents of the memory is dumped and compared to the load file. Memory dumps are normally performed once a month, or immediately after the check sum routine detects an error. Once the exact location of the error is found, the correct value is reloaded into memory. After the memory is reloaded, the contents of the memory location in question is verified in order to determine if the error was a soft error generated by an SEU or a hard error generated by a part failure or cosmic-ray induced latchup.
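
    The two-level scheme in this record (a per-page check sum every pass, then a full dump compared against the load file to pinpoint and reload the flipped bit) can be mimicked with a small simulation; the flight checksum algorithm is not given, so a simple modulo-256 byte sum is assumed below.

```python
# Small simulation of the two-level scheme described above: a per-page check
# sum that flags a page, followed by a dump-versus-load-file compare that
# pinpoints the flipped bit. The actual flight checksum algorithm is not given
# in the record, so a simple modulo-256 byte sum is assumed here.
PAGE = 256  # bytes per page, as in the record

def page_checksums(memory):
    return [sum(memory[i:i + PAGE]) % 256 for i in range(0, len(memory), PAGE)]

def locate_bit_errors(memory, load_file, page_index):
    """Dump compare for one flagged page: return (byte address, bit mask) pairs."""
    errors = []
    start = page_index * PAGE
    for addr in range(start, start + PAGE):
        diff = memory[addr] ^ load_file[addr]
        if diff:
            errors.append((addr, diff))
    return errors

if __name__ == "__main__":
    load_file = bytearray(range(256)) * 192          # 48K byte memory image
    memory = bytearray(load_file)
    memory[10_000] ^= 0x08                           # inject a single-event upset
    ref = page_checksums(load_file)
    for page, (a, b) in enumerate(zip(page_checksums(memory), ref)):
        if a != b:                                   # check sum flags the page
            print("page", page, locate_bit_errors(memory, load_file, page))
```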

  16. Relationship auditing of the FMA ontology

    PubMed Central

    Gu, Huanying (Helen); Wei, Duo; Mejino, Jose L.V.; Elhanan, Gai

    2010-01-01

    The Foundational Model of Anatomy (FMA) ontology is a domain reference ontology based on a disciplined modeling approach. Due to its large size, semantic complexity and manual data entry process, errors and inconsistencies are unavoidable and might remain within the FMA structure without detection. In this paper, we present computable methods to highlight candidate concepts for various relationship assignment errors. The process starts with locating structures formed by transitive structural relationships (part_of, tributary_of, branch_of) and examine their assignments in the context of the IS-A hierarchy. The algorithms were designed to detect five major categories of possible incorrect relationship assignments: circular, mutually exclusive, redundant, inconsistent, and missed entries. A domain expert reviewed samples of these presumptive errors to confirm the findings. Seven thousand and fifty-two presumptive errors were detected, the largest proportion related to part_of relationship assignments. The results highlight the fact that errors are unavoidable in complex ontologies and that well designed algorithms can help domain experts to focus on concepts with high likelihood of errors and maximize their effort to ensure consistency and reliability. In the future similar methods might be integrated with data entry processes to offer real-time error detection. PMID:19475727

  17. Towards a Collaborative Filtering Approach to Medication Reconciliation

    PubMed Central

    Hasan, Sharique; Duncan, George T.; Neill, Daniel B.; Padman, Rema

    2008-01-01

    A physician’s prescribing decisions depend on knowledge of the patient’s medication list. This knowledge is often incomplete, and errors or omissions could result in adverse outcomes. To address this problem, the Joint Commission recommends medication reconciliation for creating a more accurate list of a patient’s medications. In this paper, we develop techniques for automatic detection of omissions in medication lists, identifying drugs that the patient may be taking but are not on the patient’s medication list. Our key insight is that this problem is analogous to the collaborative filtering framework increasingly used by online retailers to recommend relevant products to customers. The collaborative filtering approach enables a variety of solution techniques, including nearest neighbor and co-occurrence approaches. We evaluate the effectiveness of these approaches using medication data from a long-term care center in the Eastern US. Preliminary results suggest that this framework may become a valuable tool for medication reconciliation. PMID:18998834
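
    The co-occurrence flavor of the collaborative filtering idea can be sketched directly: score each drug absent from a patient's list by how often it co-occurs with the drugs that are present across other patients' lists. The drug names, toy data, and scoring rule below are illustrative only, not the models evaluated in the paper.

```python
# Minimal co-occurrence scoring for omission detection, in the collaborative
# filtering spirit of the abstract above. Drug names and data are toy examples.
from collections import defaultdict
from itertools import combinations

def cooccurrence_counts(med_lists):
    counts = defaultdict(int)
    for meds in med_lists:
        for a, b in combinations(sorted(set(meds)), 2):
            counts[(a, b)] += 1
            counts[(b, a)] += 1
    return counts

def omission_scores(patient_list, med_lists):
    """Rank drugs missing from patient_list by co-occurrence with listed drugs."""
    counts = cooccurrence_counts(med_lists)
    candidates = {m for meds in med_lists for m in meds} - set(patient_list)
    scores = {c: sum(counts[(c, m)] for m in patient_list) for c in candidates}
    return sorted(scores.items(), key=lambda kv: -kv[1])

if __name__ == "__main__":
    histories = [
        ["metformin", "lisinopril", "atorvastatin"],
        ["metformin", "atorvastatin", "aspirin"],
        ["lisinopril", "atorvastatin", "aspirin"],
        ["metformin", "lisinopril", "aspirin"],
    ]
    print(omission_scores(["metformin", "lisinopril"], histories))
```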

  18. Towards a collaborative filtering approach to medication reconciliation.

    PubMed

    Hasan, Sharique; Duncan, George T; Neill, Daniel B; Padman, Rema

    2008-11-06

    A physician's prescribing decisions depend on knowledge of the patient's medication list. This knowledge is often incomplete, and errors or omissions could result in adverse outcomes. To address this problem, the Joint Commission recommends medication reconciliation for creating a more accurate list of a patient's medications. In this paper, we develop techniques for automatic detection of omissions in medication lists, identifying drugs that the patient may be taking but are not on the patient's medication list. Our key insight is that this problem is analogous to the collaborative filtering framework increasingly used by online retailers to recommend relevant products to customers. The collaborative filtering approach enables a variety of solution techniques, including nearest neighbor and co-occurrence approaches. We evaluate the effectiveness of these approaches using medication data from a long-term care center in the Eastern US. Preliminary results suggest that this framework may become a valuable tool for medication reconciliation.

  19. The NASA F-15 Intelligent Flight Control Systems: Generation II

    NASA Technical Reports Server (NTRS)

    Buschbacher, Mark; Bosworth, John

    2006-01-01

    The Second Generation (Gen II) control system for the F-15 Intelligent Flight Control System (IFCS) program implements direct adaptive neural networks to demonstrate robust tolerance to faults and failures. The direct adaptive tracking controller integrates learning neural networks (NNs) with a dynamic inversion control law. The term direct adaptive is used because the error between the reference model and the aircraft response is being compensated or directly adapted to minimize error without regard to knowing the cause of the error. No parameter estimation is needed for this direct adaptive control system. In the Gen II design, the feedback errors are regulated with a proportional-plus-integral (PI) compensator. This basic compensator is augmented with an online NN that changes the system gains via an error-based adaptation law to improve aircraft performance at all times, including normal flight, system failures, mispredicted behavior, or changes in behavior resulting from damage.

  20. Effects of vibration on inertial wind-tunnel model attitude measurement devices

    NASA Technical Reports Server (NTRS)

    Young, Clarence P., Jr.; Buehrle, Ralph D.; Balakrishna, S.; Kilgore, W. Allen

    1994-01-01

    Results of an experimental study of a wind tunnel model inertial angle-of-attack sensor response to a simulated dynamic environment are presented. The inertial device cannot distinguish between the gravity vector and the centrifugal accelerations associated with wind tunnel model vibration; this results in a model attitude measurement bias error. Significant bias error in model attitude measurement was found for the model system tested. The model attitude bias error was found to be dependent on vibration mode and amplitude. A first-order correction model was developed and used for estimating attitude measurement bias error due to dynamic motion. A method for correcting the output of the model attitude inertial sensor in the presence of model dynamics during on-line wind tunnel operation is proposed.

  1. Automated contouring error detection based on supervised geometric attribute distribution models for radiation therapy: A general strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsin-Chen; Tan, Jun; Dolly, Steven

    2015-02-15

    Purpose: One of the most critical steps in radiation therapy treatment is accurate tumor and critical organ-at-risk (OAR) contouring. Both manual and automated contouring processes are prone to errors and to a large degree of inter- and intraobserver variability. These are often due to the limitations of imaging techniques in visualizing human anatomy as well as to inherent anatomical variability among individuals. Physicians/physicists have to reverify all the radiation therapy contours of every patient before using them for treatment planning, which is tedious, laborious, and still not an error-free process. In this study, the authors developed a general strategy based on novel geometric attribute distribution (GAD) models to automatically detect radiation therapy OAR contouring errors and facilitate the current clinical workflow. Methods: Considering the radiation therapy structures’ geometric attributes (centroid, volume, and shape), the spatial relationship of neighboring structures, as well as anatomical similarity of individual contours among patients, the authors established GAD models to characterize the interstructural centroid and volume variations, and the intrastructural shape variations of each individual structure. The GAD models are scalable and deformable, and constrained by their respective principal attribute variations calculated from training sets with verified OAR contours. A new iterative weighted GAD model-fitting algorithm was developed for contouring error detection. Receiver operating characteristic (ROC) analysis was employed in a unique way to optimize the model parameters to satisfy clinical requirements. A total of forty-four head-and-neck patient cases, each of which includes nine critical OAR contours, were utilized to demonstrate the proposed strategy. Twenty-nine out of these forty-four patient cases were utilized to train the inter- and intrastructural GAD models. These training data and the remaining fifteen testing data sets were separately employed to test the effectiveness of the proposed contouring error detection strategy. Results: An evaluation tool was implemented to illustrate how the proposed strategy automatically detects the radiation therapy contouring errors for a given patient and provides 3D graphical visualization of error detection results as well. The contouring error detection results were achieved with an average sensitivity of 0.954/0.906 and an average specificity of 0.901/0.909 on the centroid/volume-related contouring errors of all the tested samples. As for the detection results on structural shape-related contouring errors, an average sensitivity of 0.816 and an average specificity of 0.94 on all the tested samples were obtained. The promising results indicated the feasibility of the proposed strategy for the detection of contouring errors with a low false detection rate. Conclusions: The proposed strategy can reliably identify contouring errors based upon inter- and intrastructural constraints derived from clinically approved contours. It holds great potential for improving the radiation therapy workflow. ROC and box plot analyses allow for analytical tuning of the system parameters to satisfy clinical requirements. Future work will focus on the improvement of strategy reliability by utilizing more training sets and additional geometric attribute constraints.
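
    As a much-simplified stand-in for the GAD idea (without the deformable models or the iterative weighted fitting), the sketch below z-scores a new contour's volume and centroid offset against statistics learned from training patients and flags large deviations; all numbers and the three-attribute reduction are assumptions.

```python
# Much-simplified stand-in for the GAD strategy above: compare a new contour's
# volume and its centroid offset from a reference structure against the mean
# and spread learned from training patients, and flag large z-scores. The
# paper's deformable models and iterative weighted fitting are not reproduced.
import numpy as np

def attribute_stats(train_volumes, train_offsets):
    """Training-set mean/SD for volume and centroid-offset attributes."""
    v, o = np.asarray(train_volumes), np.asarray(train_offsets)
    return (v.mean(), v.std(ddof=1)), (o.mean(axis=0), o.std(axis=0, ddof=1))

def flag_contour(volume, offset, stats, z_max=3.0):
    (v_mu, v_sd), (o_mu, o_sd) = stats
    z_volume = abs(volume - v_mu) / v_sd
    z_offset = np.max(np.abs((np.asarray(offset) - o_mu) / o_sd))
    return {"volume_error": z_volume > z_max, "centroid_error": z_offset > z_max}

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    train_vol = rng.normal(30.0, 3.0, 29)                 # cm^3, 29 training cases
    train_off = rng.normal([0, 25, -10], 2.0, (29, 3))    # mm offset from reference
    stats = attribute_stats(train_vol, train_off)
    print(flag_contour(31.0, [1, 26, -9], stats))         # plausible contour
    print(flag_contour(55.0, [0, 25, 40], stats))         # shifted/mislabeled contour
```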

  2. Comparison of direct and heterodyne detection optical intersatellite communication links

    NASA Technical Reports Server (NTRS)

    Chen, C. C.; Gardner, C. S.

    1987-01-01

    The performance of direct and heterodyne detection optical intersatellite communication links is evaluated and compared. It is shown that the performance of optical links is very sensitive to the pointing and tracking errors at the transmitter and receiver. In the presence of random pointing and tracking errors, optimal antenna gains exist that will minimize the required transmitter power. In addition to limiting the antenna gains, random pointing and tracking errors also impose a power penalty in the link budget. This power penalty is between 1.6 and 3 dB for a direct detection QPPM link, and 3 to 5 dB for a heterodyne QFSK system. For heterodyne systems, carrier phase noise is another major source of performance degradation that must be considered. In contrast, the loss due to synchronization error is small. The link budgets for direct and heterodyne detection systems are evaluated. It is shown that, for systems with large pointing and tracking errors, the link budget is dominated by the spatial tracking error, and the direct detection system shows superior performance because it is less sensitive to the spatial tracking error. On the other hand, for systems with small pointing and tracking jitters, the antenna gains are in general limited by the launch cost, and suboptimal antenna gains are often used in practice; in that case, the heterodyne system has a slightly higher power margin because of its higher receiver sensitivity.

  3. TH-B-BRC-00: How to Identify and Resolve Potential Clinical Errors Before They Impact Patients Treatment: Lessons Learned

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2016-06-15

    Radiation treatment consists of a chain of events influenced by the quality of machine operation, beam data commissioning, machine calibration, patient-specific data, simulation, treatment planning, imaging, and treatment delivery. There is always a chance that the clinical medical physicist may make, or fail to detect, an error in one of these events that may impact the patient’s treatment. In the clinical scenario, errors may be systematic and, without peer review, may have a low detectability because they are not part of routine QA procedures. During treatment, there might be errors on the machine that need attention. External reviews of some of the treatment delivery components by independent reviewers, like IROC, can detect errors, but may not be timely. The goal of this session is to help junior clinical physicists identify potential errors, and to present the quality assurance approach of performing a root-cause analysis to find and eliminate an error and to continually monitor for errors. A compilation of potential errors will be presented, with examples of the thought process required to spot the error and determine the root cause. Examples may include unusual machine operation, erratic electrometer readings, consistently lower electron output, variation in photon output, body parts inadvertently left in the beam, unusual treatment plans, poor normalization, hot spots, etc. Awareness of the possibility and detection of errors in any link of the treatment process chain will help improve the safe and accurate delivery of radiation to patients. Four experts will discuss how to identify errors in four areas of clinical treatment. D. Followill, NIH grant CA 180803.

  4. TH-B-BRC-01: How to Identify and Resolve Potential Clinical Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Das, I.

    2016-06-15

    Radiation treatment consists of a chain of events influenced by the quality of machine operation, beam data commissioning, machine calibration, patient-specific data, simulation, treatment planning, imaging, and treatment delivery. There is always a chance that the clinical medical physicist may make, or fail to detect, an error in one of these events that may impact the patient’s treatment. In the clinical scenario, errors may be systematic and, without peer review, may have a low detectability because they are not part of routine QA procedures. During treatment, there might be errors on the machine that need attention. External reviews of some of the treatment delivery components by independent reviewers, like IROC, can detect errors, but may not be timely. The goal of this session is to help junior clinical physicists identify potential errors, and to present the quality assurance approach of performing a root-cause analysis to find and eliminate an error and to continually monitor for errors. A compilation of potential errors will be presented, with examples of the thought process required to spot the error and determine the root cause. Examples may include unusual machine operation, erratic electrometer readings, consistently lower electron output, variation in photon output, body parts inadvertently left in the beam, unusual treatment plans, poor normalization, hot spots, etc. Awareness of the possibility and detection of errors in any link of the treatment process chain will help improve the safe and accurate delivery of radiation to patients. Four experts will discuss how to identify errors in four areas of clinical treatment. D. Followill, NIH grant CA 180803.

  5. Syndromic surveillance for health information system failures: a feasibility study

    PubMed Central

    Ong, Mei-Sing; Magrabi, Farah; Coiera, Enrico

    2013-01-01

    Objective To explore the applicability of a syndromic surveillance method to the early detection of health information technology (HIT) system failures. Methods A syndromic surveillance system was developed to monitor a laboratory information system at a tertiary hospital. Four indices were monitored: (1) total laboratory records being created; (2) total records with missing results; (3) average serum potassium results; and (4) total duplicated tests on a patient. The goal was to detect HIT system failures causing: data loss at the record level; data loss at the field level; erroneous data; and unintended duplication of data. Time-series models of the indices were constructed, and statistical process control charts were used to detect unexpected behaviors. The ability of the models to detect HIT system failures was evaluated using simulated failures, each lasting for 24 h, with error rates ranging from 1% to 35%. Results In detecting data loss at the record level, the model achieved a sensitivity of 0.26 when the simulated error rate was 1%, while maintaining a specificity of 0.98. Detection performance improved with increasing error rates, achieving a perfect sensitivity when the error rate was 35%. In the detection of missing results, erroneous serum potassium results and unintended repetition of tests, perfect sensitivity was attained when the error rate was as small as 5%. Decreasing the error rate to 1% resulted in a drop in sensitivity to 0.65–0.85. Conclusions Syndromic surveillance methods can potentially be applied to monitor HIT systems, to facilitate the early detection of failures. PMID:23184193
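
    The surveillance step can be illustrated with a minimal Shewhart-style control chart over one monitored index, flagging points that fall outside three standard deviations of a baseline window; the paper fits explicit time-series models first, which this sketch omits, and the baseline length and threshold are assumptions.

```python
# Minimal Shewhart-style control chart over one monitored index (e.g., the
# count of laboratory records created per hour), flagging points outside
# 3 sigma of a baseline window. The time-series modelling step used in the
# paper is omitted, so this shows only the flavor of the surveillance idea.
import numpy as np

def control_chart_flags(series, baseline_len=168, n_sigma=3.0):
    baseline = np.asarray(series[:baseline_len], dtype=float)
    mu, sigma = baseline.mean(), baseline.std(ddof=1)
    flags = []
    for t in range(baseline_len, len(series)):
        if abs(series[t] - mu) > n_sigma * sigma:
            flags.append((t, series[t]))
    return flags

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    counts = rng.poisson(120, 400).astype(float)    # hourly record counts
    counts[300:324] *= 0.4                          # simulated 24 h data-loss failure
    print(control_chart_flags(counts)[:3])
```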

  6. Performance Analysis of Hierarchical Group Key Management Integrated with Adaptive Intrusion Detection in Mobile ad hoc Networks

    DTIC Science & Technology

    2016-04-05

    Applications in wireless networks such as military battlefields, emergency response, mobile commerce, online gaming, and collaborative work are based on the... Keywords: Mobile ad hoc networks; Intrusion detection; Group communication systems.

  7. Partially supervised P300 speller adaptation for eventual stimulus timing optimization: target confidence is superior to error-related potential score as an uncertain label

    NASA Astrophysics Data System (ADS)

    Zeyl, Timothy; Yin, Erwei; Keightley, Michelle; Chau, Tom

    2016-04-01

    Objective. Error-related potentials (ErrPs) have the potential to guide classifier adaptation in BCI spellers, for addressing non-stationary performance as well as for online optimization of system parameters, by providing imperfect or partial labels. However, the usefulness of ErrP-based labels for BCI adaptation has not been established in comparison to other partially supervised methods. Our objective is to make this comparison by retraining a two-step P300 speller on a subset of confident online trials using naïve labels taken from speller output, where confidence is determined either by (i) ErrP scores, (ii) posterior target scores derived from the P300 potential, or (iii) a hybrid of these scores. We further wish to evaluate the ability of partially supervised adaptation and retraining methods to adjust to a new stimulus-onset asynchrony (SOA), a necessary step towards online SOA optimization. Approach. Eleven consenting able-bodied adults attended three online spelling sessions on separate days with feedback in which SOAs were set at 160 ms (sessions 1 and 2) and 80 ms (session 3). A post hoc offline analysis and a simulated online analysis were performed on sessions two and three to compare multiple adaptation methods. Area under the curve (AUC) and symbols spelled per minute (SPM) were the primary outcome measures. Main results. Retraining using supervised labels confirmed improvements of 0.9 percentage points (session 2, p < 0.01) and 1.9 percentage points (session 3, p < 0.05) in AUC using same-day training data over using data from a previous day, which supports classifier adaptation in general. Significance. Using posterior target score alone as a confidence measure resulted in the highest SPM of the partially supervised methods, indicating that ErrPs are not necessary to boost the performance of partially supervised adaptive classification. Partial supervision significantly improved SPM at a novel SOA, showing promise for eventual online SOA optimization.

  8. Detecting and correcting hard errors in a memory array

    DOEpatents

    Kalamatianos, John; John, Johnsy Kanjirapallil; Gelinas, Robert; Sridharan, Vilas K.; Nevius, Phillip E.

    2015-11-19

    Hard errors in the memory array can be detected and corrected in real-time using reusable entries in an error status buffer. Data may be rewritten to a portion of a memory array and a register in response to a first error in data read from the portion of the memory array. The rewritten data may then be written from the register to an entry of an error status buffer in response to the rewritten data read from the register differing from the rewritten data read from the portion of the memory array.

  9. Disclosure of Medical Errors in Oman

    PubMed Central

    Norrish, Mark I. K.

    2015-01-01

    Objectives: This study aimed to provide insight into the preferences for and perceptions of medical error disclosure (MED) by members of the public in Oman. Methods: Between January and June 2012, an online survey was used to collect responses from 205 members of the public across five governorates of Oman. Results: A disclosure gap was revealed between the respondents’ preferences for MED and perceived current MED practices in Oman. This disclosure gap extended to both the type of error and the person most likely to disclose the error. Errors resulting in patient harm were found to have a strong influence on individuals’ perceived quality of care. In addition, full disclosure was found to be highly valued by respondents and able to mitigate for a perceived lack of care in cases where medical errors led to damages. Conclusion: The perceived disclosure gap between respondents’ MED preferences and perceptions of current MED practices in Oman needs to be addressed in order to increase public confidence in the national health care system. PMID:26052463

  10. VizieR Online Data Catalog: V and R CCD photometry of visual binaries (Abad+, 2004)

    NASA Astrophysics Data System (ADS)

    Abad, C.; Docobo, J. A.; Lanchares, V.; Lahulla, J. F.; Abelleira, P.; Blanco, J.; Alvarez, C.

    2003-11-01

    Table 1 gives relevant data for the visual binaries observed. Observations were carried out over a short period of time; therefore, we assign the mean epoch (1998.58) to the totality of the data. Data for individual stars are presented, parameter by parameter, as averages with errors when several observations were obtained, together with the number of observations involved. Errors corresponding to astrometric relative positions between components are always present. For single observations, parameter fitting errors, especially for the dx and dy parameters, have been calculated by analysing the chi-squared statistic around the minimum. Following the rules for error propagation, theta and rho errors can be estimated. Table 1 then shows single-observation errors with an additional significant digit. When a star does not have known references, we include it in Table 2, where the J2000 position and magnitudes are from the USNO-A2.0 catalogue (Monet et al., 1998, Cat. ). (2 data files).

  11. The use of self checks and voting in software error detection - An empirical study

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.; Cha, Stephen S.; Knight, John C.; Shimeall, Timothy J.

    1990-01-01

    The results of an empirical study of software error detection using self checks and N-version voting are presented. Working independently, each of 24 programmers first prepared a set of self checks using just the requirements specification of an aerospace application, and then each added self checks to an existing implementation of that specification. The modified programs were executed to measure the error-detection performance of the checks and to compare this with error detection using simple voting among multiple versions. The analysis of the checks revealed that there are great differences in the ability of individual programmers to design effective checks. It was found that some checks that might have been effective failed to detect an error because they were badly placed, and there were numerous instances of checks signaling nonexistent errors. In general, specification-based checks alone were not as effective as specification-based checks combined with code-based checks. Self checks made it possible to identify faults that had not been detected previously by voting 28 versions of the program over a million randomly generated inputs. This appeared to result from the fact that the self checks could examine the internal state of the executing program, whereas voting examines only final results of computations. If internal states had to be identical in N-version voting systems, then there would be no reason to write multiple versions.

  12. Error detection and response adjustment in youth with mild spastic cerebral palsy: an event-related brain potential study.

    PubMed

    Hakkarainen, Elina; Pirilä, Silja; Kaartinen, Jukka; van der Meere, Jaap J

    2013-06-01

    This study evaluated the brain activation state during error making in youth with mild spastic cerebral palsy and a peer control group while carrying out a stimulus recognition task. The key question was whether patients were detecting their own errors and subsequently improving their performance in a future trial. Findings indicated that error responses of the group with cerebral palsy were associated with weak motor preparation, as indexed by the amplitude of the late contingent negative variation. However, patients were detecting their errors as indexed by the amplitude of the response-locked negativity and thus improved their performance in a future trial. Findings suggest that the consequence of error making on future performance is intact in a sample of youth with mild spastic cerebral palsy. Because the study group is small, the present findings need replication using a larger sample.

  13. Clinical implementation and error sensitivity of a 3D quality assurance protocol for prostate and thoracic IMRT

    PubMed Central

    Cotter, Christopher; Turcotte, Julie Catherine; Crawford, Bruce; Sharp, Gregory; Mah'D, Mufeed

    2015-01-01

    This work aims at three goals: first, to define a set of statistical parameters and plan structures for a 3D pretreatment thoracic and prostate intensity-modulated radiation therapy (IMRT) quality assurance (QA) protocol; second, to test whether the 3D QA protocol is able to detect certain clinical errors; and third, to compare the 3D QA method with QA performed with a single ion chamber and a 2D gamma test in detecting those errors. The 3D QA protocol measurements were performed on 13 prostate and 25 thoracic IMRT patients using IBA's COMPASS system. For each treatment planning structure included in the protocol, the following statistical parameters were evaluated: average absolute dose difference (AADD), percent structure volume with absolute dose difference greater than 6% (ADD6), and the 3D gamma test. To test the 3D QA protocol error sensitivity, two prostate and two thoracic step-and-shoot IMRT patients were investigated. Errors introduced to each of the treatment plans included energy switched from 6 MV to 10 MV, multileaf collimator (MLC) leaf errors, linac jaw errors, monitor unit (MU) errors, MLC and gantry angle errors, and detector shift errors. QA was performed on each plan using a single ion chamber and a 2D array of ion chambers for 2D and 3D QA. Based on the measurements performed, we established a uniform set of tolerance levels to determine whether QA passes for each IMRT treatment plan structure: the maximum allowed AADD is 6%; no more than 4% of any structure volume may have an absolute dose difference greater than 6% (ADD6); and no more than 4% of any structure volume may fail the 3D gamma test with 3%/3 mm DTA criteria. Of the three QA methods tested, the single ion chamber performed worst, detecting 4 of 18 introduced errors; 2D QA detected 11 of 18 errors, and 3D QA detected 14 of 18 errors. PACS number: 87.56.Fc PMID:26699299
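
    To make the protocol's per-structure statistics concrete, the sketch below computes AADD and ADD6 for one structure and applies the stated tolerance levels. The array names, the normalisation to the prescription dose, and the pass logic are assumptions for illustration; the published protocol also includes the 3D gamma test, which is omitted here.

    ```python
    import numpy as np

    # Minimal sketch of the per-structure statistics (array names and the
    # prescription-dose normalisation are assumptions): AADD is the average
    # absolute dose difference between measurement and plan inside a structure,
    # and ADD6 is the percentage of that structure's voxels whose absolute dose
    # difference exceeds 6%.

    def structure_stats(measured, planned, mask, prescription_dose):
        """measured, planned: 3-D dose arrays (Gy); mask: boolean structure mask."""
        diff_pct = 100.0 * np.abs(measured[mask] - planned[mask]) / prescription_dose
        aadd = diff_pct.mean()                   # average absolute dose difference (%)
        add6 = 100.0 * np.mean(diff_pct > 6.0)   # % of structure volume beyond 6%
        passes = (aadd <= 6.0) and (add6 <= 4.0) # tolerance levels quoted above
        return aadd, add6, passes
    ```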

  14. Publisher Correction: A molecular cross-linking approach for hybrid metal oxides

    NASA Astrophysics Data System (ADS)

    Jung, Dahee; Saleh, Liban M. A.; Berkson, Zachariah J.; El-Kady, Maher F.; Hwang, Jee Youn; Mohamed, Nahla; Wixtrom, Alex I.; Titarenko, Ekaterina; Shao, Yanwu; McCarthy, Kassandra; Guo, Jian; Martini, Ignacio B.; Kraemer, Stephan; Wegener, Evan C.; Saint-Cricq, Philippe; Ruehle, Bastian; Langeslay, Ryan R.; Delferro, Massimiliano; Brosmer, Jonathan L.; Hendon, Christopher H.; Gallagher-Jones, Marcus; Rodriguez, Jose; Chapman, Karena W.; Miller, Jeffrey T.; Duan, Xiangfeng; Kaner, Richard B.; Zink, Jeffrey I.; Chmelka, Bradley F.; Spokoyny, Alexander M.

    2018-03-01

    In the version of this Article originally published, Liban M. A. Saleh was incorrectly listed as Liban A. M. Saleh due to a technical error. This has now been amended in all online versions of the Article.

  15. Competency: an essential component of caring in nursing.

    PubMed

    Knapp, Bobbi

    2004-01-01

    Providing online e-learning for nurses significantly reduces medical errors by providing "just-in-time" reference and device training. Offering continuing education 24/7 assures continued competency in an ever-changing practice environment while fostering professional development and career mobility.

  16. Coherent detection of position errors in inter-satellite laser communications

    NASA Astrophysics Data System (ADS)

    Xu, Nan; Liu, Liren; Liu, De'an; Sun, Jianfeng; Luan, Zhu

    2007-09-01

    Due to its improved receiver sensitivity and wavelength selectivity, coherent detection has become an attractive alternative to direct detection in inter-satellite laser communications. A novel method for coherent detection of position-error information is proposed. A coherent communication system generally consists of a receive telescope, a local oscillator, an optical hybrid, a photoelectric detector and an optical phase lock loop (OPLL). Building on this system composition, the method adds a CCD and a computer as a position-error detector. The CCD captures the interference pattern while the transmission data from the transmitter laser are detected. After processing and analysis by the computer, target position information is obtained from characteristic parameters of the interference pattern. The position errors, used as the control signal of the PAT subsystem, drive the receiver telescope to keep tracking the target. A theoretical derivation and analysis is presented. The application extends to a coherent laser range finder, in which object distance and position information can be obtained simultaneously.

  17. Neural evidence for enhanced error detection in major depressive disorder.

    PubMed

    Chiu, Pearl H; Deldin, Patricia J

    2007-04-01

    Anomalies in error processing have been implicated in the etiology and maintenance of major depressive disorder. In particular, depressed individuals exhibit heightened sensitivity to error-related information and negative environmental cues, along with reduced responsivity to positive reinforcers. The authors examined the neural activation associated with error processing in individuals diagnosed with and without major depression and the sensitivity of these processes to modulation by monetary task contingencies. The error-related negativity and error-related positivity components of the event-related potential were used to characterize error monitoring in individuals with major depressive disorder and the degree to which these processes are sensitive to modulation by monetary reinforcement. Nondepressed comparison subjects (N=17) and depressed individuals (N=18) performed a flanker task under two external motivation conditions (i.e., monetary reward for correct responses and monetary loss for incorrect responses) and a nonmonetary condition. After each response, accuracy feedback was provided. The error-related negativity component assessed the degree of anomaly in initial error detection, and the error positivity component indexed recognition of errors. Across all conditions, the depressed participants exhibited greater amplitude of the error-related negativity component, relative to the comparison subjects, and equivalent error positivity amplitude. In addition, the two groups showed differential modulation by task incentives in both components. These data implicate exaggerated early error-detection processes in the etiology and maintenance of major depressive disorder. Such processes may then recruit excessive neural and cognitive resources that manifest as symptoms of depression.

  18. Evaluating the feasibility of using online software to collect patient information in a chiropractic practice-based research network.

    PubMed

    Kania-Richmond, Ania; Weeks, Laura; Scholten, Jeffrey; Reney, Mikaël

    2016-03-01

    Practice based research networks (PBRNs) are increasingly used as a tool for evidence based practice. We developed and tested the feasibility of using software to enable online collection of patient data within a chiropractic PBRN to support clinical decision making and research in participating clinics. To assess the feasibility of using online software to collect quality patient information. The study consisted of two phases: 1) Assessment of the quality of information provided, using a standardized form; and 2) Exploration of patients' perspectives and experiences regarding online information provision through semi-structured interviews. Data analysis was descriptive. Forty-five new patients were recruited. Thirty-six completed online forms, which were submitted by an appropriate person 100% of the time, with an error rate of less than 1%, and submitted in a timely manner 83% of the time. Twenty-one participants were interviewed. Overall, online forms were preferred given perceived security, ease of use, and enabling provision of more accurate information. Use of online software is feasible, provides high quality information, and is preferred by most participants. A pen-and-paper format should be available for patients with this preference and in case of technical difficulties.

  19. Robust online tracking via adaptive samples selection with saliency detection

    NASA Astrophysics Data System (ADS)

    Yan, Jia; Chen, Xi; Zhu, QiuPing

    2013-12-01

    Online tracking has been shown to be successful in tracking previously unknown objects. However, two important factors lead to the drift problem in online tracking: one is how to select correctly labeled samples even when the target locations are inaccurate, and the other is how to handle confusors that have features similar to the target. In this article, we propose a robust online tracking algorithm with adaptive sample selection based on saliency detection to overcome the drift problem. To avoid degrading the classifiers with misaligned samples, we introduce a saliency detection method into our tracking problem. Saliency maps and the strong classifiers are combined to extract the most correct positive samples. Our approach employs a simple yet effective saliency detection algorithm based on image spectral residual analysis. Furthermore, instead of using random patches as negative samples, we propose a selection criterion in which both saliency confidence and similarity are considered, with the benefit that confusors in the surrounding background are incorporated into the classifier update process before drift occurs. The tracking task is formulated as binary classification within an online boosting framework. Experimental results on several challenging video sequences demonstrate the accuracy and stability of our tracker.
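
    The spectral residual analysis mentioned above follows the standard recipe of comparing the log-amplitude spectrum with a locally averaged version of itself; a minimal sketch is given below. The filter sizes and normalisation are assumptions, and the integration with the boosting-based sample selection is not shown.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter, gaussian_filter

    # Minimal sketch of spectral-residual saliency (filter sizes are assumptions):
    # the log-amplitude spectrum of the grayscale image is compared with its local
    # average; the "residual" plus the original phase is transformed back to give
    # the saliency map used to weight candidate tracking samples.

    def spectral_residual_saliency(gray, avg_size=3, smooth_sigma=2.5):
        """gray: 2-D float array; returns a saliency map scaled to [0, 1]."""
        spectrum = np.fft.fft2(gray)
        log_amp = np.log(np.abs(spectrum) + 1e-12)
        phase = np.angle(spectrum)
        residual = log_amp - uniform_filter(log_amp, size=avg_size)
        recon = np.fft.ifft2(np.exp(residual + 1j * phase))
        saliency = gaussian_filter(np.abs(recon) ** 2, sigma=smooth_sigma)
        return (saliency - saliency.min()) / (saliency.max() - saliency.min() + 1e-12)
    ```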

  20. Study on the special vision sensor for detecting position error in robot precise TIG welding of some key part of rocket engine

    NASA Astrophysics Data System (ADS)

    Zhang, Wenzeng; Chen, Nian; Wang, Bin; Cao, Yipeng

    2005-01-01

    The rocket engine is a core component of aerospace transportation and thrust systems, and its research and development is very important in national defense, aviation and aerospace. A novel vision sensor is developed that can be used for error detection in arc length control and seam tracking in precise pulsed TIG welding of the extending part of the rocket engine jet tube. The vision sensor has many advantages, such as high-quality imaging, compactness and multiple functions. The optical design, mechanical design and circuit design of the vision sensor are described in detail. Utilizing the mirror image of the tungsten electrode in the weld pool, a novel method is proposed to detect the arc length and the seam tracking error of the tungsten electrode relative to the center line of the joint seam from a single weld image. A calculation model for the method is derived from the relation between the tungsten electrode, the weld pool, the mirror image of the tungsten electrode in the weld pool and the joint seam. New methodologies are given to detect the arc length and seam tracking error. By analyzing the experimental results, a systematic error correction method based on a linear function is developed to improve the detection precision of the arc length and the seam tracking error. Experimental results show that the final precision of the system reaches 0.1 mm in detecting the arc length and the seam tracking error of the tungsten electrode relative to the center line of the joint seam.

  1. Sensitivity Studies for Space-based Measurements of Atmospheric Total Column Carbon Dioxide Using Reflected Sunlight

    NASA Technical Reports Server (NTRS)

    Mao, Jianping; Kawa, S. Randolph

    2003-01-01

    A series of sensitivity studies is carried out to explore the feasibility of space-based global carbon dioxide (CO2) measurements for global and regional carbon cycle studies. The detection method uses absorption of reflected sunlight in the CO2 vibration-rotation band at 1.58 micron. The sensitivities of the detected radiances are calculated using the line-by-line model (LBLRTM), implemented with the DISORT (Discrete Ordinates Radiative Transfer) model to include atmospheric scattering in this band. The results indicate that (a) the small (approximately 1%) changes in CO2 near the Earth's surface are detectable in this CO2 band provided adequate sensor signal-to-noise ratio and spectral resolution are achievable; (b) the effects of other interfering constituents, such as water vapor, aerosols and cirrus clouds, on the radiance are significant, but the overall effects of the modification of light path length on the total back-to-space radiance sensitivity to CO2 change are minor for general cases, which means that the total column CO2 can generally be derived with high precision from the ratio of the on-line center to off-line radiances; (c) in combination with CO2 gas absorption, an aerosol/cirrus cloud layer exhibits differential scattering, which may modify the on-line to off-line radiance ratio and could lead to a large bias in the total column CO2 retrieval, and approaches to correct such bias need further investigation; (d) CO2 retrieval requires good knowledge of the atmospheric temperature profile, e.g. approximately 1 K RMS error in layer temperature, which is achievable from new atmospheric sounders in the near future; (e) the atmospheric path length, over which the CO2 absorption occurs, should be known in order to correctly interpret horizontal gradients of CO2 from the total column CO2 measurement; thus an additional sensor for surface pressure measurement needs to be attached for a complete measurement package.

  2. Spatio-temporal filtering techniques for the detection of disaster-related communication.

    PubMed

    Fitzhugh, Sean M; Ben Gibson, C; Spiro, Emma S; Butts, Carter T

    2016-09-01

    Individuals predominantly exchange information with one another through informal, interpersonal channels. During disasters and other disrupted settings, information spread through informal channels regularly outpaces official information provided by public officials and the press. Social scientists have long examined this kind of informal communication in the rumoring literature, but studying rumoring in disrupted settings has posed numerous methodological challenges. Measuring features of informal communication (timing, content, location) with any degree of precision has historically been extremely challenging in small studies and infeasible at large scales. We address this challenge by using online, informal communication from a popular microblogging website, for which we have precise spatial and temporal metadata. While the online environment provides a new means for observing rumoring, the abundance of data poses challenges for parsing hazard-related rumoring from countless other topics in numerous streams of communication. Rumoring about disaster events is typically temporally and spatially constrained to places where that event is salient. Accordingly, we use spatial and temporal subsampling to increase the resolution of our detection techniques. By filtering out data from known sources of error (per rumor theories), we greatly enhance the signal of disaster-related rumoring activity. We use these spatio-temporal filtering techniques to detect rumoring during a variety of disaster events, from high-casualty events in major population centers to minimally destructive events in remote areas. We consistently find three phases of response: anticipatory excitation where warnings and alerts are issued ahead of an event, primary excitation in and around the impacted area, and secondary excitation which frequently brings a convergence of attention from distant locales onto locations impacted by the event. Our results demonstrate the promise of spatio-temporal filtering techniques for "tuning" measurement of hazard-related rumoring to enable observation of rumoring at scales that have long been infeasible. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Identification and anti-oxidant capacity determination of phenolics and their glycosides in elderflower by on-line HPLC-CUPRAC method.

    PubMed

    Çelik, S Esin; Özyürek, Mustafa; Güçlü, Kubilay; Çapanoğlu, Esra; Apak, Reşat

    2014-01-01

    Development and application of an on-line cupric reducing anti-oxidant capacity (CUPRAC) assay coupled with HPLC for separation and on-line determination of phenolic anti-oxidants in elderflower (Sambucus nigra L.) extracts for their anti-oxidant capacity are significant for evaluating health-beneficial effects. Moreover, this work aimed to assay certain flavonoid glycosides of elderflower that could not be identified/quantified by other similar on-line HPLC methods (i.e. 2,2-diphenyl-1-picrylhydrazyl and 2,2'-azino-bis-3-ethylbenzothiazoline-6-sulphonic acid). To identify anti-oxidant constituents in elderflower by HPLC and to evaluate their individual anti-oxidant capacities by on-line HPLC-CUPRAC assay with a post-column derivatisation system. The separation and UV detection of polyphenols were performed on a C18 column using gradient elution with two different mobile phase solutions, that is acetonitrile and 1% glacial acetic acid, with detection at 340 nm. The HPLC-separated anti-oxidant polyphenols in the column effluent react with copper(II)-neocuproine in a reaction coil to reduce the latter to the copper(I)-neocuproine (Cu(I)-Nc) chelate having maximum absorption at 450 nm. The detection limits of tested compounds at 450 nm after post-column derivatisation were compared with those of UV detection at 340 nm without derivatisation. LOD values (µg/mL) of quercetin and its glycosides at 450 nm were lower than those of UV detection at 340 nm. This method was applied successfully to elderflower extract. The flavonol glycosides of quercetin and kaempferol bound to several sugar components (glucose, rhamnose, galactose and rutinose) were identified in the sample. The on-line HPLC-CUPRAC method was advantageous over on-line ABTS and DPPH methods for measuring the flavonoid glycosides of elderflower. Copyright © 2014 John Wiley & Sons, Ltd.

  4. Is there any electrophysiological evidence for subliminal error processing?

    PubMed Central

    Shalgi, Shani; Deouell, Leon Y.

    2013-01-01

    The role of error awareness in executive control and modification of behavior is not fully understood. In line with many recent studies showing that conscious awareness is unnecessary for numerous high-level processes such as strategic adjustments and decision making, it was suggested that error detection can also take place unconsciously. The Error Negativity (Ne) component, long established as a robust error-related component that differentiates between correct responses and errors, was a fine candidate to test this notion: if an Ne is elicited also by errors which are not consciously detected, it would imply a subliminal process involved in error monitoring that does not necessarily lead to conscious awareness of the error. Indeed, for the past decade, the repeated finding of a similar Ne for errors which became aware and errors that did not achieve awareness, compared to the smaller negativity elicited by correct responses (Correct Response Negativity; CRN), has lent the Ne the prestigious status of an index of subliminal error processing. However, there were several notable exceptions to these findings. The study in the focus of this review (Shalgi and Deouell, 2012) sheds new light on both types of previous results. We found that error detection as reflected by the Ne is correlated with subjective awareness: when awareness (or more importantly lack thereof) is more strictly determined using the wagering paradigm, no Ne is elicited without awareness. This result effectively resolves the issue of why there are many conflicting findings regarding the Ne and error awareness. The average Ne amplitude appears to be influenced by individual criteria for error reporting and therefore, studies containing different mixtures of participants who are more confident of their own performance or less confident, or paradigms that either encourage or don't encourage reporting low confidence errors will show different results. Based on this evidence, it is no longer possible to unquestioningly uphold the notion that the amplitude of the Ne is unrelated to subjective awareness, and therefore, that errors are detected without conscious awareness. PMID:24009548

  5. Magnetic-field sensing with quantum error detection under the effect of energy relaxation

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Yuichiro; Benjamin, Simon

    2017-03-01

    A solid state spin is an attractive system with which to realize an ultrasensitive magnetic field sensor. A spin superposition state will acquire a phase induced by the target field, and we can estimate the field strength from this phase. Recent studies have aimed at improving sensitivity through the use of quantum error correction (QEC) to detect and correct any bit-flip errors that may occur during the sensing period. Here we investigate the performance of a two-qubit sensor employing QEC under the effect of energy relaxation. Surprisingly, we find that the standard QEC technique to detect and recover from an error does not improve the sensitivity compared with a single-qubit sensor. This is a consequence of the fact that energy relaxation induces both phase-flip and bit-flip noise, where the former cannot be distinguished from the relative phase induced by the target field. However, we have found that we can improve the sensitivity if we adopt postselection to discard the state when an error is detected. Even when quantum error detection is moderately noisy, and allowing for the cost of the postselection technique, we find that this two-qubit system shows an advantage in sensing over a single qubit under the same conditions.

  6. The Nature of Change Detection and Online Representations of Scenes

    ERIC Educational Resources Information Center

    Ryan, Jennifer D.; Cohen, Neal J.

    2004-01-01

    This article provides evidence for implicit change detection and for the contribution of multiple memory sources to online representations. Multiple eye-movement measures distinguished original from changed scenes, even when college students had no conscious awareness for the change. Patients with amnesia showed a systematic deficit on 1 class of…

  7. Evaluation of Object Detection Algorithms for Ship Detection in the Visible Spectrum

    DTIC Science & Technology

    2013-12-01

    The Kodak KAI-2093 was assumed throughout the model to be the imaging sensor, and the sensor was assumed to have captured all of the evaluation imagery.

  8. Pressurized capillary electrochromatographic analysis of water-soluble vitamins by combining with on-line concentration technique.

    PubMed

    Jia, Li; Liu, Yaling; Du, Yanyan; Xing, Da

    2007-06-22

    A pressurized capillary electrochromatography (pCEC) system was developed for the separation of water-soluble vitamins, in which UV absorbance was used as the detection method and a monolithic silica-ODS column as the separation column. The parameters (type and content of organic solvent in the mobile phase, type and concentration of electrolyte, pH of the electrolyte buffer, applied voltage and flow rate) affecting the separation resolution were evaluated. The combination of two on-line concentration techniques, namely the solvent gradient zone-sharpening effect and field-enhanced sample stacking, was utilized to improve detection sensitivity by enabling the injection of large sample volumes. Coupling electrokinetic injection with the on-line concentration techniques was much more beneficial for the concentration of positively charged vitamins. Compared with the conventional injection mode, the enhancement in the detection sensitivities of water-soluble vitamins using the on-line concentration technique is in the range of 3- to 35-fold. The developed pCEC method was applied to evaluate water-soluble vitamins in corn.

  9. Automatic detection of MLC relative position errors for VMAT using the EPID-based picket fence test

    NASA Astrophysics Data System (ADS)

    Christophides, Damianos; Davies, Alex; Fleckney, Mark

    2016-12-01

    Multi-leaf collimators (MLCs) ensure the accurate delivery of treatments requiring complex beam fluences like intensity modulated radiotherapy and volumetric modulated arc therapy. The purpose of this work is to automate the detection of MLC relative position errors ⩾0.5 mm using electronic portal imaging device-based picket fence tests and compare the results to the qualitative assessment currently in use. Picket fence tests with and without intentional MLC errors were measured weekly on three Varian linacs. The picket fence images analysed covered a time period ranging between 14 and 20 months, depending on the linac. An algorithm was developed that calculated the MLC error for each leaf-pair present in the picket fence images. The baseline error distributions of each linac were characterised for an initial period of 6 months and compared with the intentional MLC errors using statistical metrics. The distributions of the median and of the one-sample Kolmogorov-Smirnov test p-value exhibited no overlap between baseline and intentional errors and were used retrospectively to automatically detect MLC errors in routine clinical practice. Agreement was found between the MLC errors detected by the automatic method and the fault reports during clinical use, as well as interventions for MLC repair and calibration. In conclusion, the method presented provides for full automation of MLC quality assurance, based on individual linac performance characteristics. The use of the automatic method has been shown to provide early warning for MLC errors that resulted in clinical downtime.
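
    A minimal sketch of how per-leaf-pair picket fence errors could be screened against a linac's baseline distribution is given below, using the two statistics named in the abstract (the median and a one-sample Kolmogorov-Smirnov test). The cut-off values and the error-extraction step are assumptions, not the published decision thresholds.

    ```python
    import numpy as np
    from scipy import stats

    # Illustrative sketch (limits are assumptions drawn from the abstract, not the
    # published values): per-leaf-pair position errors extracted from one picket
    # fence image are summarised by their median and by a one-sample KS test
    # against the linac's baseline error distribution; out-of-range values flag
    # the image for MLC investigation.

    def mlc_flags(leaf_errors_mm, baseline_mean, baseline_std,
                  median_limit=0.3, p_limit=0.05):
        """leaf_errors_mm: 1-D array of per-leaf-pair errors from one image."""
        median_error = np.median(np.abs(leaf_errors_mm))
        _, p_value = stats.kstest(leaf_errors_mm, "norm",
                                  args=(baseline_mean, baseline_std))
        return {
            "median_error_mm": median_error,
            "ks_p_value": p_value,
            "flag": (median_error > median_limit) or (p_value < p_limit),
        }
    ```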

  10. Learning styles of registered nurses enrolled in an online nursing program.

    PubMed

    Smith, Anita

    2010-01-01

    Technological advances assist in the proliferation of online nursing programs which meet the needs of the working nurse. Understanding online learning styles permits universities to adequately address the educational needs of the professional nurse returning for an advanced degree. The purpose of this study was to describe the learning styles of registered nurses (RNs) enrolled in an online master's nursing program or RN-bachelor of science in nursing (BSN) program. A descriptive, cross-sectional design was used. Kolb's learning style inventory (Version 3.1) was completed by 217 RNs enrolled in online courses at a Southeastern university. Descriptive statistical procedures were used for analysis. Thirty-one percent of the nurses were accommodators, 20% were assimilators, 19% were convergers, and 20% were divergers. Accommodators desire hands-on experiences, carrying out plans and tasks, and using an intuitive trial-and-error approach to problem solving. The learning styles of the RNs were similar to those of BSN students in traditional classroom settings. Despite their learning style, nurses felt that the online program met their needs. Implementing the technological innovations in nursing education requires an understanding of the hands-on learning of the RN so that the development of the online courses will satisfactorily meet the needs of the nurses who have chosen an online program. Copyright 2010 Elsevier Inc. All rights reserved.

  11. On-line multiple component analysis for efficient quantitative bioprocess development.

    PubMed

    Dietzsch, Christian; Spadiut, Oliver; Herwig, Christoph

    2013-02-20

    On-line monitoring devices for the precise determination of a multitude of components are a prerequisite for fast bioprocess quantification. On-line measured values have to be checked for quality and consistency in order to extract quantitative information from these data. In the present study we characterized a novel on-line sampling and analysis device comprising an automatic photometric robot. We connected this on-line device to a bioreactor and concomitantly measured six components (i.e. glucose, glycerol, ethanol, acetate, phosphate and ammonium) during different batch cultivations of Pichia pastoris. The on-line measured data did not show significant deviations from off-line samples and were consequently used for incremental rate and yield calculations. In this respect we highlighted the importance of data quality and discussed the phenomenon of error propagation. On-line calculated rates and yields depicted the physiological responses of the P. pastoris cells in unlimited and limited cultures. A more detailed analysis of the physiological state was possible by considering the off-line determined biomass dry weight and the calculation of specific rates. Here we present a novel device for on-line monitoring of bioprocesses, which ensures high data quality in real time and therefore represents a valuable tool for Process Analytical Technology (PAT). Copyright © 2012 Elsevier B.V. All rights reserved.
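
    As a reminder of why the discussed error propagation matters for incremental yield calculations, the sketch below propagates the measurement uncertainties of a product and a substrate increment into a yield coefficient using the standard first-order formula; the symbols and numbers are generic examples, not values from the study.

    ```python
    import math

    # Minimal sketch of first-order error propagation for a yield coefficient
    # (symbols are generic, not the study's exact quantities): with Y = dP / dS,
    # the propagated standard error follows from the relative uncertainties of
    # the two measured increments.

    def yield_with_error(dP, sigma_dP, dS, sigma_dS):
        Y = dP / dS
        sigma_Y = abs(Y) * math.sqrt((sigma_dP / dP) ** 2 + (sigma_dS / dS) ** 2)
        return Y, sigma_Y

    # Example: a 2.0 +/- 0.1 g/L product increment over a 10.0 +/- 0.2 g/L
    # substrate increment gives Y = 0.20 with sigma_Y of about 0.011 g/g.
    print(yield_with_error(2.0, 0.1, 10.0, 0.2))
    ```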

  12. Caffeine enhances real-world language processing: evidence from a proofreading task.

    PubMed

    Brunyé, Tad T; Mahoney, Caroline R; Rapp, David N; Ditman, Tali; Taylor, Holly A

    2012-03-01

    Caffeine has become the most prevalently consumed psychostimulant in the world, but its influences on daily real-world functioning are relatively unknown. The present work investigated the effects of caffeine (0 mg, 100 mg, 200 mg, 400 mg) on a commonplace language task that required readers to identify and correct 4 error types in extended discourse: simple local errors (misspelling 1- to 2-syllable words), complex local errors (misspelling 3- to 5-syllable words), simple global errors (incorrect homophones), and complex global errors (incorrect subject-verb agreement and verb tense). In 2 placebo-controlled, double-blind studies using repeated-measures designs, we found higher detection and repair rates for complex global errors, asymptoting at 200 mg in low consumers (Experiment 1) and peaking at 400 mg in high consumers (Experiment 2). In both cases, covariate analyses demonstrated that arousal state mediated the relationship between caffeine consumption and the detection and repair of complex global errors. Detection and repair rates for the other 3 error types were not affected by caffeine consumption. Taken together, we demonstrate that caffeine has differential effects on error detection and repair as a function of dose and error type, and this relationship is closely tied to caffeine's effects on subjective arousal state. These results support the notion that central nervous system stimulants may enhance global processing of language-based materials and suggest that such effects may originate in caffeine-related right hemisphere brain processes. Implications for understanding the relationships between caffeine consumption and real-world cognitive functioning are discussed. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  13. On the sensitivity of TG-119 and IROC credentialing to TPS commissioning errors.

    PubMed

    McVicker, Drew; Yin, Fang-Fang; Adamson, Justus D

    2016-01-08

    We investigate the sensitivity of IMRT commissioning using the TG-119 C-shape phantom and credentialing with the IROC head and neck phantom to treatment planning system commissioning errors. We introduced errors into the various aspects of the commissioning process for a 6X photon energy modeled using the analytical anisotropic algorithm within a commercial treatment planning system. Errors were implemented into the various components of the dose calculation algorithm including primary photons, secondary photons, electron contamination, and MLC parameters. For each error we evaluated the probability that it could be committed unknowingly during the dose algorithm commissioning stage, and the probability of it being identified during the verification stage. The clinical impact of each commissioning error was evaluated using representative IMRT plans including low and intermediate risk prostate, head and neck, mesothelioma, and scalp; the sensitivity of the TG-119 and IROC phantoms was evaluated by comparing dosimetric changes to the dose planes where film measurements occur and change in point doses where dosimeter measurements occur. No commissioning errors were found to have both a low probability of detection and high clinical severity. When errors do occur, the IROC credentialing and TG 119 commissioning criteria are generally effective at detecting them; however, for the IROC phantom, OAR point-dose measurements are the most sensitive despite being currently excluded from IROC analysis. Point-dose measurements with an absolute dose constraint were the most effective at detecting errors, while film analysis using a gamma comparison and the IROC film distance to agreement criteria were less effective at detecting the specific commissioning errors implemented here.

  14. Efficient detection of dangling pointer error for C/C++ programs

    NASA Astrophysics Data System (ADS)

    Zhang, Wenzhe

    2017-08-01

    Dangling pointer errors are pervasive in C/C++ programs and very hard to detect. This paper introduces an efficient detector for dangling pointer errors in C/C++ programs. By selectively leaving some memory accesses unmonitored, our method reduces the memory monitoring overhead and thus achieves better performance than previous methods. Experiments show that our method achieves an average speedup of 9% over a previous compiler-instrumentation-based method and more than 50% over a previous page-protection-based method.

  15. Method and apparatus for detecting timing errors in a system oscillator

    DOEpatents

    Gliebe, Ronald J.; Kramer, William R.

    1993-01-01

    A method of detecting timing errors in a system oscillator for an electronic device, such as a power supply, includes the step of comparing a system oscillator signal with a delayed generated signal and generating a signal representative of the timing error when the system oscillator signal is not identical to the delayed signal. An LED indicates to an operator that a timing error has occurred. A hardware circuit implements the above-identified method.

  16. Semi-Supervised Novelty Detection with Adaptive Eigenbases, and Application to Radio Transients

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Majid, Walid A.; Reed, Colorado J.; Wagstaff, Kiri L.

    2011-01-01

    We present a semi-supervised online method for novelty detection and evaluate its performance for radio astronomy time series data. Our approach uses adaptive eigenbases to combine 1) prior knowledge about uninteresting signals with 2) online estimation of the current data properties to enable highly sensitive and precise detection of novel signals. We apply the method to the problem of detecting fast transient radio anomalies and compare it to current alternative algorithms. Tests based on observations from the Parkes Multibeam Survey show both effective detection of interesting rare events and robustness to known false alarm anomalies.
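
    A minimal sketch of the eigenbasis idea is shown below: a subspace learned from known, uninteresting signals is used to reconstruct each incoming series, and a large residual marks a candidate novel signal. The basis size, the PCA-style construction, and the update policy noted in the comment are assumptions; the paper's adaptive update scheme is more involved.

    ```python
    import numpy as np

    # Minimal sketch of eigenbasis-based novelty scoring (dimensions and any
    # threshold are assumptions, not the paper's tuned values): a basis fitted
    # to known "boring" signals reconstructs each incoming time series, and a
    # large reconstruction residual flags a potentially novel signal.

    def fit_eigenbasis(X_known, k=10):
        """PCA-style basis from known uninteresting signals (rows = examples)."""
        mean = X_known.mean(axis=0)
        U, s, Vt = np.linalg.svd(X_known - mean, full_matrices=False)
        return mean, Vt[:k]                 # top-k principal directions

    def novelty_score(x, mean, basis):
        coeffs = basis @ (x - mean)
        recon = mean + basis.T @ coeffs
        return np.linalg.norm(x - recon)    # residual outside the learned subspace

    # Online use: periodically refit mean/basis from recent non-novel data so the
    # detector tracks slowly drifting instrument behaviour.
    ```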

  17. Online Soft Sensor of Humidity in PEM Fuel Cell Based on Dynamic Partial Least Squares

    PubMed Central

    Long, Rong; Chen, Qihong; Zhang, Liyan; Ma, Longhua; Quan, Shuhai

    2013-01-01

    Online monitoring of humidity in the proton exchange membrane (PEM) fuel cell is an important issue in maintaining proper membrane humidity. The cost and size of existing humidity sensors are prohibitive for online measurements. Online prediction of humidity using readily available measured data would therefore be beneficial to water management. In this paper, a novel soft sensor method based on dynamic partial least squares (DPLS) regression is proposed and applied to humidity prediction in a PEM fuel cell. In order to obtain humidity data and test the feasibility of the proposed DPLS-based soft sensor, a hardware-in-the-loop (HIL) test system is constructed. The time lag of the DPLS-based soft sensor is selected as 30 by comparing the root-mean-square error for different time lags. The performance of the proposed DPLS-based soft sensor is demonstrated by experimental results. PMID:24453923
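
    A minimal sketch of a DPLS-style soft sensor is given below: readily measured signals are stacked into a lagged regressor matrix (here using the time lag of 30 mentioned in the abstract) and mapped to humidity with a PLS model. The component count, variable layout, and scikit-learn implementation are assumptions for illustration.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Minimal DPLS soft-sensor sketch (lag length default, component count, and
    # data layout are assumptions): past values of the readily measured fuel-cell
    # signals form a lagged regressor matrix, and PLS maps them to humidity.

    def build_lagged_matrix(U, lag):
        """Stack rows [u(k), u(k-1), ..., u(k-lag)] for each time step k >= lag."""
        n, m = U.shape
        return np.hstack([U[lag - d : n - d] for d in range(lag + 1)])

    def fit_dpls(U, y, lag=30, n_components=5):
        """U: (n_samples, n_signals) measured inputs; y: humidity reference."""
        X = build_lagged_matrix(U, lag)
        model = PLSRegression(n_components=n_components)
        model.fit(X, y[lag:])
        return model

    # Online use: model.predict(build_lagged_matrix(U_new, 30)) gives the humidity
    # estimate from newly measured data.
    ```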

  18. ChromatoGate: A Tool for Detecting Base Mis-Calls in Multiple Sequence Alignments by Semi-Automatic Chromatogram Inspection

    PubMed Central

    Alachiotis, Nikolaos; Vogiatzi, Emmanouella; Pavlidis, Pavlos; Stamatakis, Alexandros

    2013-01-01

    Automated DNA sequencers generate chromatograms that contain raw sequencing data. They also generate data that translates the chromatograms into molecular sequences of A, C, G, T, or N (undetermined) characters. Since chromatogram translation programs frequently introduce errors, a manual inspection of the generated sequence data is required. As sequence numbers and lengths increase, visual inspection and manual correction of chromatograms and corresponding sequences on a per-peak and per-nucleotide basis becomes an error-prone, time-consuming, and tedious process. Here, we introduce ChromatoGate (CG), an open-source software that accelerates and partially automates the inspection of chromatograms and the detection of sequencing errors for bidirectional sequencing runs. To provide users full control over the error correction process, a fully automated error correction algorithm has not been implemented. Initially, the program scans a given multiple sequence alignment (MSA) for potential sequencing errors, assuming that each polymorphic site in the alignment may be attributed to a sequencing error with a certain probability. The guided MSA assembly procedure in ChromatoGate detects chromatogram peaks of all characters in an alignment that lead to polymorphic sites, given a user-defined threshold. The threshold value represents the sensitivity of the sequencing error detection mechanism. After this pre-filtering, the user only needs to inspect a small number of peaks in every chromatogram to correct sequencing errors. Finally, we show that correcting sequencing errors is important, because population genetic and phylogenetic inferences can be misled by MSAs with uncorrected mis-calls. Our experiments indicate that estimates of population mutation rates can be affected two- to three-fold by uncorrected errors. PMID:24688709
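
    The pre-filtering step described above can be illustrated with a small sketch that scans an alignment for polymorphic columns and flags those whose minority-character fraction exceeds a user-defined threshold; the function name and the threshold semantics are simplified assumptions, not ChromatoGate's implementation.

    ```python
    from collections import Counter

    # Illustrative sketch (threshold semantics simplified relative to the actual
    # ChromatoGate tool): scan an MSA for polymorphic columns and report those
    # whose minority characters exceed a user-defined fraction as candidate
    # sequencing-error sites to inspect in the chromatograms.

    def candidate_error_sites(msa, threshold=0.0):
        """msa: list of equal-length aligned sequences; returns column indices."""
        sites = []
        for col in range(len(msa[0])):
            counts = Counter(seq[col] for seq in msa if seq[col] not in "-N")
            if len(counts) > 1:
                total = sum(counts.values())
                minority = total - counts.most_common(1)[0][1]
                if minority / total > threshold:
                    sites.append(col)
        return sites

    # Example: candidate_error_sites(["ACGT", "ACGA", "ACGT"]) -> [3]
    ```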

  20. Adaptive skin detection based on online training

    NASA Astrophysics Data System (ADS)

    Zhang, Ming; Tang, Liang; Zhou, Jie; Rong, Gang

    2007-11-01

    Skin is a widely used cue for porn image classification. Most conventional methods are off-line training schemes. They usually use a fixed boundary to segment skin regions in images and are effective only under restricted conditions, e.g. good lighting and a single skin tone. This paper presents an adaptive online training scheme for skin detection that can handle these tough cases. In our approach, skin detection is treated as a classification problem on a Gaussian mixture model. For each image, the human face is detected and the face color is used to establish a primary estimate of the skin color distribution. Then an adaptive online training algorithm is used to find the real boundary between skin color and background color in the current image. Experimental results on 450 images showed that the proposed method is more robust in general situations than the conventional ones.
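
    A minimal sketch of the face-seeded, per-image adaptation idea is given below: Gaussian mixtures fitted to face-patch and background pixels of the current image classify the remaining pixels by comparing likelihoods. The component counts, colour space, and refitting schedule are assumptions, not the paper's exact online training algorithm.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Minimal sketch (component counts, colour space, and the source of background
    # pixels are assumptions): face-region pixels seed a skin-colour mixture for
    # the current image, a second mixture models background colours, and pixels
    # are labelled by comparing the two log-likelihoods.

    def fit_color_model(pixels, n_components=3):
        """pixels: (N, 3) array of RGB values sampled from a region."""
        return GaussianMixture(n_components=n_components).fit(pixels)

    def classify_pixels(image, skin_gmm, bg_gmm):
        """image: (H, W, 3) array; returns a boolean skin mask."""
        flat = image.reshape(-1, 3).astype(float)
        skin_ll = skin_gmm.score_samples(flat)   # per-pixel log-likelihoods
        bg_ll = bg_gmm.score_samples(flat)
        return (skin_ll > bg_ll).reshape(image.shape[:2])

    # Online adaptation: refit both mixtures on every frame (or every few frames)
    # so the decision boundary tracks the current lighting and skin tone.
    ```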

  1. "How Did We Get Here?": Topic Drift in Online Health Discussions.

    PubMed

    Park, Albert; Hartzler, Andrea L; Huh, Jina; Hsieh, Gary; McDonald, David W; Pratt, Wanda

    2016-11-02

    Patients increasingly use online health communities to exchange health information and peer support. During the progression of health discussions, a change of topic, known as topic drift, can occur. Topic drift is a frequent phenomenon linked to incoherence and frustration in online communities and other forms of computer-mediated communication. For sensitive topics, such as health, such drift could have life-altering repercussions, yet topic drift has not been studied in these contexts. Our goals were to understand topic drift in online health communities and then to develop and evaluate an automated approach to detect both topic drift and efforts of community members to counteract such drift. We manually analyzed 721 posts from 184 threads from 7 online health communities within WebMD to understand topic drift, members' reaction towards topic drift, and their efforts to counteract topic drift. Then, we developed an automated approach to detect topic drift and counteraction efforts. We detected topic drift by calculating cosine similarity between 229,156 posts from 37,805 threads and measuring the change of cosine similarity scores from the threads' first posts to their subsequent posts. Using a similar approach, we detected counteractions to topic drift in threads by focusing on the irregular increase of similarity scores compared to the previous post in threads. Finally, we evaluated the performance of our automated approaches to detect topic drift and counteracting efforts by using a manually developed gold standard. Our qualitative analyses revealed that in threads of online health communities, topics change gradually, but usually stay within the global frame of topics for the specific community. Members showed frustration when topic drift occurred in the middle of threads but reacted positively to off-topic stories shared as separate threads. Although all types of members helped to counteract topic drift, original posters provided the most effort to keep threads on topic. Cosine similarity scores show promise for automatically detecting topical changes in online health discussions. In our manual evaluation, we achieved F1 scores of .71 and .73 for detecting topic drift and counteracting efforts to stay on topic, respectively. Our analyses expand our understanding of topic drift in a health context and highlight practical implications, such as promoting off-topic discussions as a function of building rapport in online health communities. Furthermore, the quantitative findings suggest that an automated tool could help detect topic drift, support counteraction efforts to bring the conversation back on topic, and improve communication in these important communities. Findings from this study have the potential to reduce topic drift and improve online health community members' experience of computer-mediated communication. Improved communication could enhance the personal health management of members who seek essential information and support during times of difficulty. ©Albert Park, Andrea L Hartzler, Jina Huh, Gary Hsieh, David W McDonald, Wanda Pratt. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 02.11.2016.
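
    A minimal sketch of the similarity-based detection step is shown below: thread posts are vectorised with TF-IDF and compared with the first post, and a sharp drop marks candidate drift. The vectoriser choice and the 0.2 threshold are assumptions for illustration, not the gold-standard values used in the evaluation.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Illustrative sketch (the 0.2 drift threshold is an assumption, not the
    # paper's cut-off): each post in a thread is compared with the first post via
    # TF-IDF cosine similarity, and low-similarity posts are candidate drift.

    def drift_points(posts, drift_threshold=0.2):
        """posts: list of post texts in thread order."""
        tfidf = TfidfVectorizer(stop_words="english").fit_transform(posts)
        sims_to_first = cosine_similarity(tfidf[0], tfidf).ravel()
        drifted = [i for i in range(1, len(posts)) if sims_to_first[i] < drift_threshold]
        return sims_to_first, drifted

    # Counteraction detection would analogously look for posts whose similarity to
    # the previous post rises irregularly after a drifted stretch.
    ```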

  2. Detecting genotyping errors and describing black bear movement in northern Idaho

    Treesearch

    Michael K. Schwartz; Samuel A. Cushman; Kevin S. McKelvey; Jim Hayden; Cory Engkjer

    2006-01-01

    Non-invasive genetic sampling has become a favored tool to enumerate wildlife. Genetic errors, caused by poor quality samples, can lead to substantial biases in numerical estimates of individuals. We demonstrate how the computer program DROPOUT can detect amplification errors (false alleles and allelic dropout) in a black bear (Ursus americanus) dataset collected in...

  3. Integration of On-Line and Off-Line Diagnostic Algorithms for Aircraft Engine Health Management

    NASA Technical Reports Server (NTRS)

    Kobayashi, Takahisa; Simon, Donald L.

    2007-01-01

    This paper investigates the integration of on-line and off-line diagnostic algorithms for aircraft gas turbine engines. The on-line diagnostic algorithm is designed for in-flight fault detection. It continuously monitors engine outputs for anomalous signatures induced by faults. The off-line diagnostic algorithm is designed to track engine health degradation over the lifetime of an engine. It estimates engine health degradation periodically over the course of the engine's life. The estimate generated by the off-line algorithm is used to update the on-line algorithm. Through this integration, the on-line algorithm becomes aware of engine health degradation, and its effectiveness in detecting faults can be maintained while the engine continues to degrade. The benefit of this integration is investigated in a simulation environment using a nonlinear engine model.
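
    As a conceptual sketch of this integration, the snippet below shows an on-line residual check whose baseline is periodically adjusted by an off-line degradation estimate, so that slow wear is not mistaken for a fault. The threshold, residual definition, and degradation model are assumptions, not the paper's algorithms.

    ```python
    import numpy as np

    # Conceptual sketch (threshold, residual definition, and degradation handling
    # are assumptions): the on-line detector flags a fault when measured outputs
    # deviate from the degradation-corrected expectation by more than a threshold;
    # the off-line algorithm periodically supplies the degradation estimate.

    class OnlineFaultDetector:
        def __init__(self, threshold=3.0):
            self.threshold = threshold
            self.degradation_offset = 0.0     # updated by the off-line algorithm

        def update_baseline(self, estimated_degradation):
            """Called periodically with the off-line health-degradation estimate."""
            self.degradation_offset = np.asarray(estimated_degradation, dtype=float)

        def check(self, measured, nominal_model_output, sigma):
            """Return True if the degradation-corrected residual indicates a fault."""
            residual = measured - (nominal_model_output + self.degradation_offset)
            return bool(np.any(np.abs(residual) > self.threshold * sigma))
    ```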

  4. A Collaborative Assessment Among 11 Pharmaceutical Companies of Misinformation in Commonly Used Online Drug Information Compendia.

    PubMed

    Randhawa, Amarita S; Babalola, Olakiitan; Henney, Zachary; Miller, Michele; Nelson, Tanya; Oza, Meerat; Patel, Chandni; Randhawa, Anupma S; Riley, Joyce; Snyder, Scott; So, Sherri

    2016-05-01

    Online drug information compendia (ODIC) are valuable tools that health care professionals (HCPs) and consumers use to educate themselves on pharmaceutical products. Research suggests that these resources, although informative and easily accessible, may contain misinformation, posing risk for product misuse and patient harm. Evaluate drug summaries within ODIC for accuracy and completeness and identify product-specific misinformation. Between August 2014 and January 2015, medical information (MI) specialists from 11 pharmaceutical/biotechnology companies systematically evaluated 270 drug summaries within 5 commonly used ODIC for misinformation. Using a standardized approach, errors were identified; classified as inaccurate, incomplete, or omitted; and categorized per sections of the Full Prescribing Information (FPI). On review of each drug summary, content-correction requests were proposed and supported by the respective product's FPI. Across the 270 drug summaries reviewed within the 5 compendia, the median of the total number of errors identified was 782, with the greatest number of errors occurring in the categories of Dosage and Administration, Patient Education, and Warnings and Precautions. The majority of errors were classified as incomplete, followed by inaccurate and omitted. This analysis demonstrates that ODIC may contain misinformation. HCPs and consumers should be aware of the potential for misinformation and consider more than 1 drug information resource, including the FPI and Medication Guide as well as pharmaceutical/biotechnology companies' MI departments, to obtain unbiased, accurate, and complete product-specific drug information to help support the safe and effective use of prescription drug products. © The Author(s) 2016.

  5. On-line estimation and compensation of measurement delay in GPS/SINS integration

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Wang, Wei

    2008-10-01

    The chief aim of this paper is to propose a simple on-line estimation and compensation method for GPS/SINS measurement delay. The causes of time delay in GPS/SINS integration are analyzed. New Kalman filter state equations augmented with the measurement delay, together with modified measurement equations, are derived. Based on an open-loop Kalman filter, several simulations are run, the results of which show that with the proposed method the estimation and compensation error of the measurement delay is below 0.1 s.
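
    A toy sketch of the delay-augmented filtering idea is given below: a 1-D constant-velocity extended Kalman filter carries the unknown measurement delay as an extra state and uses the first-order approximation z ≈ p - tau*v as its measurement model. The dimensions, noise levels, and EKF linearisation are illustrative assumptions and differ from the paper's GPS/SINS formulation.

    ```python
    import numpy as np

    # Toy sketch (1-D model, noise levels, and linearisation are assumptions, not
    # the paper's GPS/SINS equations): the state [p, v, tau] is augmented with the
    # unknown measurement delay tau, and the delayed position measurement is
    # approximated to first order as z ~= p - tau * v, so tau only becomes
    # observable while the velocity varies.

    dt = 0.1
    F = np.array([[1.0, dt,  0.0],   # p_{k+1} = p_k + v_k * dt
                  [0.0, 1.0, 0.0],   # v_{k+1} = v_k (plus process noise)
                  [0.0, 0.0, 1.0]])  # tau_{k+1} = tau_k (delay assumed constant)
    Q = np.diag([1e-4, 1e-2, 1e-8])  # process noise covariance
    R = np.array([[0.05 ** 2]])      # measurement noise covariance

    def ekf_step(x, P, z):
        """One predict/update cycle for the delay-augmented state."""
        x = F @ x
        P = F @ P @ F.T + Q
        p, v, tau = x
        h = p - tau * v                  # linearised measurement model
        H = np.array([[1.0, -tau, -v]])  # Jacobian of h w.r.t. [p, v, tau]
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K.ravel() * (z - h)
        P = (np.eye(3) - K @ H) @ P
        return x, P

    # Usage: starting from x = [0, 0, 0] and a broad P, feed each delayed position
    # measurement z_k through ekf_step; x[2] then tracks the measurement delay.
    ```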

  6. MATLAB Simulation of Gradient-Based Neural Network for Online Matrix Inversion

    NASA Astrophysics Data System (ADS)

    Zhang, Yunong; Chen, Ke; Ma, Weimu; Li, Xiao-Dong

    This paper investigates the simulation of a gradient-based recurrent neural network for online solution of the matrix-inverse problem. Several important techniques are employed as follows to simulate such a neural system. 1) Kronecker product of matrices is introduced to transform a matrix-differential-equation (MDE) to a vector-differential-equation (VDE); i.e., finally, a standard ordinary-differential-equation (ODE) is obtained. 2) MATLAB routine "ode45" is introduced to solve the transformed initial-value ODE problem. 3) In addition to various implementation errors, different kinds of activation functions are simulated to show the characteristics of such a neural network. Simulation results substantiate the theoretical analysis and efficacy of the gradient-based neural network for online constant matrix inversion.
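
    The sketch below reproduces the basic simulation idea in Python: the matrix differential equation dX/dt = -gamma * A^T (A X - I), which is the gradient flow of ||AX - I||_F^2 / 2 (the linear-activation case), is vectorised in the spirit of the Kronecker-product MDE-to-ODE step and handed to a standard ODE solver in place of MATLAB's ode45. The test matrix and gain are arbitrary, and the activation-function variations studied in the paper are omitted.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Minimal sketch of the gradient-based network for online matrix inversion
    # (gain and test matrix are illustrative assumptions): integrate
    # dX/dt = -gamma * A^T (A X - I) from X(0) = 0; X(t) converges to inv(A).

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    gamma = 50.0
    n = A.shape[0]

    def rhs(t, x_vec):
        X = x_vec.reshape(n, n)          # un-vectorise the state
        dX = -gamma * A.T @ (A @ X - np.eye(n))
        return dX.ravel()                # re-vectorise for the ODE solver

    sol = solve_ivp(rhs, (0.0, 1.0), np.zeros(n * n), rtol=1e-8, atol=1e-10)
    X_final = sol.y[:, -1].reshape(n, n)
    print(np.allclose(X_final, np.linalg.inv(A), atol=1e-4))  # should print True
    ```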

  7. Hospital-based transfusion error tracking from 2005 to 2010: identifying the key errors threatening patient transfusion safety.

    PubMed

    Maskens, Carolyn; Downie, Helen; Wendt, Alison; Lima, Ana; Merkley, Lisa; Lin, Yulia; Callum, Jeannie

    2014-01-01

    This report provides a comprehensive analysis of transfusion errors occurring at a large teaching hospital and aims to determine key errors that are threatening transfusion safety, despite implementation of safety measures. Errors were prospectively identified from 2005 to 2010. Error data were coded on a secure online database called the Transfusion Error Surveillance System. Errors were defined as any deviation from established standard operating procedures. Errors were identified by clinical and laboratory staff. Denominator data for volume of activity were used to calculate rates. A total of 15,134 errors were reported with a median number of 215 errors per month (range, 85-334). Overall, 9083 (60%) errors occurred on the transfusion service and 6051 (40%) on the clinical services. In total, 23 errors resulted in patient harm: 21 of these errors occurred on the clinical services and two in the transfusion service. Of the 23 harm events, 21 involved inappropriate use of blood. Errors with no harm were 657 times more common than events that caused harm. The most common high-severity clinical errors were sample labeling (37.5%) and inappropriate ordering of blood (28.8%). The most common high-severity error in the transfusion service was sample accepted despite not meeting acceptance criteria (18.3%). The cost of product and component loss due to errors was $593,337. Errors occurred at every point in the transfusion process, with the greatest potential risk of patient harm resulting from inappropriate ordering of blood products and errors in sample labeling. © 2013 American Association of Blood Banks (CME).

  8. MO-FG-202-06: Improving the Performance of Gamma Analysis QA with Radiomics- Based Image Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wootton, L; Nyflot, M; Ford, E

    2016-06-15

    Purpose: The use of gamma analysis for IMRT quality assurance has well-known limitations. Traditionally, a simple thresholding technique is used to evaluate passing criteria. However, like any image, the gamma distribution is rich in information which thresholding mostly discards. We therefore propose a novel method of analyzing gamma images that uses quantitative image features borrowed from radiomics, with the goal of improving error detection. Methods: 368 gamma images were generated from 184 clinical IMRT beams. For each beam the dose to a phantom was measured with EPID dosimetry and compared to the TPS dose calculated with and without normally distributed (2 mm sigma) errors in MLC positions. The magnitudes of 17 intensity-histogram and size-zone radiomic features were derived from each image. The features that differed most significantly between image sets were determined with ROC analysis. A linear machine-learning model was trained on these features from 180 gamma images to classify images as with or without errors. The model was then applied to an independent validation set of 188 additional gamma distributions, half with and half without errors. Results: The most significant features for detecting errors were histogram kurtosis (p=0.007) and three size-zone metrics (p<1e-6 for each). The size-zone metrics detected clusters of high gamma-value pixels under mispositioned MLCs. The model applied to the validation set had an AUC of 0.8, compared to 0.56 for traditional gamma analysis with the decision threshold restricted to 98% or less. Conclusion: A radiomics-based image analysis method was developed that is more effective in detecting errors than traditional gamma analysis. Though the pilot study here considers only MLC position errors, radiomics-based methods for other error types are being developed, which may provide better error detection and useful information on the source of detected errors. This work was partially supported by a grant from the Agency for Healthcare Research and Quality, grant number R18 HS022244-01.
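
    A simplified sketch of the feature-plus-linear-classifier pipeline is given below, pairing an intensity-histogram feature (kurtosis) with a crude size-zone-style feature (area of the largest connected high-gamma region). The feature definitions, the 0.8 gamma cut-off, and the logistic-regression choice are assumptions, not the study's exact radiomics features or model.

    ```python
    import numpy as np
    from scipy import ndimage, stats
    from sklearn.linear_model import LogisticRegression

    # Illustrative sketch only (feature definitions and thresholds are assumptions,
    # not the study's radiomics set): summarise each gamma map with histogram
    # kurtosis and the area of the largest connected region of high gamma values,
    # then train a linear classifier to separate error-free from error deliveries.

    def gamma_features(gamma_map, high=0.8):
        kurt = stats.kurtosis(gamma_map.ravel())
        labeled, n = ndimage.label(gamma_map > high)
        largest_zone = max((np.sum(labeled == i) for i in range(1, n + 1)), default=0)
        return [kurt, largest_zone]

    def train_detector(gamma_maps, labels):
        """gamma_maps: list of 2-D arrays; labels: 1 = error introduced, 0 = baseline."""
        X = np.array([gamma_features(g) for g in gamma_maps])
        return LogisticRegression().fit(X, labels)

    # clf.predict_proba(gamma_features(new_map)) would then score new QA images.
    ```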

  9. Corrections of clinical chemistry test results in a laboratory information system.

    PubMed

    Wang, Sihe; Ho, Virginia

    2004-08-01

    The recently released reports by the Institute of Medicine, To Err Is Human and Patient Safety, have received national attention because of their focus on the problem of medical errors. Although a small number of studies have reported on errors in general clinical laboratories, there are, to our knowledge, no reported studies that focus on errors in pediatric clinical laboratory testing. To characterize the errors that led to corrections of pediatric clinical chemistry results in the laboratory information system, Misys. To provide initial data on the errors detected in pediatric clinical chemistry laboratories in order to improve patient safety in pediatric health care. All clinical chemistry staff members were informed of the study and were requested to report in writing when a correction was made in the laboratory information system, Misys. Errors were detected either by the clinicians (the results did not fit the patients' clinical conditions) or by the laboratory technologists (the results were double-checked, and the worksheets were carefully examined twice a day). No incident that was discovered before or during the final validation was included. On each Monday of the study, we generated a report from Misys that listed all of the corrections made during the previous week. We then categorized the corrections according to the types and stages of the incidents that led to the corrections. A total of 187 incidents were detected during the 10-month study, representing a 0.26% error detection rate per requisition. The distribution of the detected incidents included 31 (17%) preanalytic incidents, 46 (25%) analytic incidents, and 110 (59%) postanalytic incidents. The errors related to noninterfaced tests accounted for 50% of the total incidents and for 37% of the affected tests and orderable panels, while the noninterfaced tests and panels accounted for 17% of the total test volume in our laboratory. This pilot study provided the rate and categories of errors detected in a pediatric clinical chemistry laboratory based on the corrections of results in the laboratory information system. The findings also suggest that a direct interface of the instruments to the laboratory information system has a favorable effect on reducing laboratory errors.

  10. 77 FR 4293 - Agency Information Collection Activities; Submission to OMB for Review and Approval; Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-27

    ...-HQ- OECA-2011-0233, to (1) EPA online using www.regulations.gov (our preferred method), or by email... the OMB Inventory of Approved Burdens due to a mathematical error in determining the person hours per...

  11. Intrusion detection system using Online Sequence Extreme Learning Machine (OS-ELM) in advanced metering infrastructure of smart grid.

    PubMed

    Li, Yuancheng; Qiu, Rixuan; Jing, Sitong

    2018-01-01

    Advanced Metering Infrastructure (AMI) realizes two-way communication of electricity data by interconnecting with a computer network as the core component of the smart grid. At the same time, it brings many new security threats, and traditional intrusion detection methods cannot satisfy the security requirements of AMI. In this paper, an intrusion detection system based on the Online Sequence Extreme Learning Machine (OS-ELM) is established, which is used to detect attacks in AMI and is compared with other algorithms. Simulation results show that, compared with other intrusion detection methods, the OS-ELM-based method is superior in both detection speed and accuracy.
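
    The abstract gives no implementation details, so the following is a generic OS-ELM sketch (random fixed hidden layer, initial least-squares solution, recursive least-squares updates for each new chunk of records), with a small ridge term added for numerical stability; feature extraction from AMI traffic is left out.

```python
import numpy as np

class OSELM:
    """Minimal Online Sequential Extreme Learning Machine (targets as one-hot rows in T)."""

    def __init__(self, n_inputs, n_hidden, seed=None):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n_inputs, n_hidden))   # random input weights (fixed)
        self.b = rng.standard_normal(n_hidden)                # random biases (fixed)
        self.beta = None                                      # output weights (learned)
        self.P = None                                         # inverse covariance for RLS

    def _hidden(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))   # sigmoid activations

    def fit_initial(self, X0, T0):
        """Initial batch: ordinary least squares with a tiny ridge term for stability."""
        H0 = self._hidden(X0)
        self.P = np.linalg.inv(H0.T @ H0 + 1e-6 * np.eye(H0.shape[1]))
        self.beta = self.P @ H0.T @ T0

    def partial_fit(self, X, T):
        """Sequential chunk: recursive least-squares update of P and beta."""
        H = self._hidden(X)
        K = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
        self.P = self.P - self.P @ H.T @ K @ H @ self.P
        self.beta = self.beta + self.P @ H.T @ (T - H @ self.beta)

    def predict(self, X):
        return self._hidden(X) @ self.beta
```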

  12. Online virtual isocenter based radiation field targeting for high performance small animal microirradiation

    NASA Astrophysics Data System (ADS)

    Stewart, James M. P.; Ansell, Steve; Lindsay, Patricia E.; Jaffray, David A.

    2015-12-01

    Advances in precision microirradiators for small animal radiation oncology studies have provided the framework for novel translational radiobiological studies. Such systems target radiation fields at the scale required for small animal investigations, typically through a combination of on-board computed tomography image guidance and fixed, interchangeable collimators. Robust targeting accuracy of these radiation fields remains challenging, particularly at the millimetre scale field sizes achievable by the majority of microirradiators. Consistent and reproducible targeting accuracy is further hindered as collimators are removed and inserted during a typical experimental workflow. This investigation quantified this targeting uncertainty and developed an online method based on a virtual treatment isocenter to actively ensure high performance targeting accuracy for all radiation field sizes. The results indicated that the two-dimensional field placement uncertainty was as high as 1.16 mm at isocenter, with simulations suggesting this error could be reduced to 0.20 mm using the online correction method. End-to-end targeting analysis of a ball bearing target on radiochromic film sections showed an improved targeting accuracy, with the three-dimensional vector targeting error across six different collimators reduced from 0.56 ± 0.05 mm (mean ± SD) to 0.05 ± 0.05 mm for an isotropic imaging voxel size of 0.1 mm.
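
    As a rough illustration of the targeting-error measurement (not the paper's virtual-isocenter correction itself), the sketch below estimates the 2D offset, in millimetres, between a localized ball-bearing image and the planned isocentre pixel; it assumes the ball bearing appears as the brightest object in a background-corrected region of interest.

```python
import numpy as np

def targeting_offset(roi_image, pixel_mm, planned_iso_px):
    """Estimate the 2D offset (mm) between the imaged ball-bearing position and the
    planned isocentre pixel. The crude threshold segmentation and intensity centroid
    are stand-ins for the localization used by the on-line correction method."""
    img = np.asarray(roi_image, dtype=float)
    mask = img > 0.5 * img.max()                    # crude segmentation of the BB
    yy, xx = np.indices(img.shape)
    centroid = np.array([yy[mask].mean(), xx[mask].mean()])
    offset_px = centroid - np.asarray(planned_iso_px, dtype=float)
    return offset_px * pixel_mm                     # (row, column) offset in millimetres
```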

  13. Experimental demonstration of record high 19.125 Gb/s real-time end-to-end dual-band optical OFDM transmission over 25 km SMF in a simple EML-based IMDD system.

    PubMed

    Giddings, R P; Hugues-Salas, E; Tang, J M

    2012-08-27

    Record high 19.125 Gb/s real-time end-to-end dual-band optical OFDM (OOFDM) transmission is experimentally demonstrated, for the first time, in a simple electro-absorption modulated laser (EML)-based 25 km standard SMF system using intensity modulation and direct detection (IMDD). Adaptively modulated baseband (0-2 GHz) and passband (6.125 ± 2 GHz) OFDM RF sub-bands, supporting line rates of 10 Gb/s and 9.125 Gb/s respectively, are independently generated and detected with FPGA-based DSP clocked at only 100 MHz and DACs/ADCs operating at sampling speeds as low as 4 GS/s. The two OFDM sub-bands are electrically frequency-division-multiplexed (FDM) for intensity modulation of a single optical carrier by an EML. To maximize and balance the signal transmission performance of each sub-band, on-line adaptive features and on-line performance monitoring are fully exploited to optimize key OOFDM transceiver and system parameters, which include subcarrier characteristics within each individual OFDM sub-band, total and relative sub-band power, as well as EML operating conditions. The achieved 19.125 Gb/s over 25 km SMF OOFDM transmission system has an optical power budget of 13.5 dB, and shows almost identical bit error rate (BER) performances for both the baseband and passband signals. In addition, experimental investigations also indicate that the maximum achievable transmission capacity of the present system is mainly determined by the EML frequency chirp-enhanced chromatic dispersion effect, and that the passband BER performance is not affected by the intermixing effect induced by the two sub-bands, which, however, gives a 1.2 dB optical power penalty to the baseband signal transmission.

  14. A web-based incident reporting system: a two years' experience in an Italian research and teaching hospital.

    PubMed

    Bodina, A; Demarchi, A; Castaldi, S

    2014-01-01

    A web-based incident reporting system (IRS) is a tool that allows healthcare workers to voluntarily and anonymously report adverse events/near misses. In 2010, this system was introduced in a research and teaching hospital in a metropolitan area in northern Italy, in order to detect errors and to learn from failures in care delivery. The aim of this paper is to assess whether and how the IRS has proved to be a valuable tool to manage clinical risk and improve healthcare quality. Adverse events are reported anonymously by staff members with the use of an online template form available on the hospital intranet. We retrospectively reviewed the recorded data for each incident/near miss reported between January 2011 and December 2012. The number of reported incidents/near misses was 521 in 2011 and 442 in 2012. In the two years the admissions were 36,974 and 36,107 respectively. We noticed that nursing staff made more use of the IRS and that reported errors were mainly related to the prescription and administration of medications. Much of the international literature reports that adverse events and near misses occur in about 10% of admissions. Our data are far from that number, which means that a failure in reporting adverse events exists. This consideration, together with the high number of near misses in comparison with actual errors, leads us to speculate that adverse events with serious consequences for patients are only marginally reported. The lack of strong leadership that regards the IRS as an instrument for improving quality, and operators' reluctance to overcome the culture of blame, probably affect the IRS negatively.

  15. CUSUM-Logistic Regression analysis for the rapid detection of errors in clinical laboratory test results.

    PubMed

    Sampson, Maureen L; Gounden, Verena; van Deventer, Hendrik E; Remaley, Alan T

    2016-02-01

    The main drawback of the periodic analysis of quality control (QC) material is that test performance is not monitored in time periods between QC analyses, potentially leading to the reporting of faulty test results. The objective of this study was to develop a patient-based QC procedure for the more timely detection of test errors. Results from a Chem-14 panel measured on the Beckman LX20 analyzer were used to develop the model. Each test result was predicted from the other 13 members of the panel by multiple regression, which resulted in correlation coefficients between the predicted and measured results of >0.7 for 8 of the 14 tests. A logistic regression model, which utilized the measured test result, the predicted test result, the day of the week and the time of day, was then developed for predicting test errors. The output of the logistic regression was tallied by a daily CUSUM approach and used to predict test errors, with a fixed specificity of 90%. The mean average run length (ARL) before error detection by CUSUM-Logistic Regression (CSLR) was 20, with a mean sensitivity of 97%, which was considerably shorter than the mean ARL of 53 (sensitivity 87.5%) for a simple prediction model that only used the measured result for error detection. A CUSUM-Logistic Regression analysis of patient laboratory data can be an effective approach for the rapid and sensitive detection of clinical laboratory errors. Published by Elsevier Inc.
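
    A schematic reconstruction of the three stages described in the abstract, with sklearn models standing in for the original regressions; the exact features, reference value, and CUSUM threshold are assumptions to be tuned (e.g., to fix specificity at 90%).

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Stage 1: predict one analyte from the other 13 panel members (illustrative).
def fit_panel_predictor(panel_results, target_idx):
    X = np.delete(panel_results, target_idx, axis=1)
    y = panel_results[:, target_idx]
    return LinearRegression().fit(X, y)

# Stage 2: logistic model of error probability from measured value, predicted value,
# day of week and hour of day (feature choice assumed from the abstract).
def fit_error_model(measured, predicted, day_of_week, hour, error_labels):
    X = np.column_stack([measured, predicted, day_of_week, hour])
    return LogisticRegression().fit(X, error_labels)

# Stage 3: one-sided CUSUM of the logistic output; flag when the tally exceeds a
# threshold chosen empirically to give the desired specificity.
def cusum_alarm(error_probs, reference=0.5, threshold=5.0):
    s, alarms = 0.0, []
    for p in error_probs:
        s = max(0.0, s + (p - reference))
        alarms.append(s > threshold)
    return alarms
```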

  16. SU-E-T-392: Evaluation of Ion Chamber/film and Log File Based QA to Detect Delivery Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, C; Mason, B; Kirsner, S

    2015-06-15

    Purpose: Ion chamber and film (ICAF) measurement is a method used to verify patient dose prior to treatment. More recently, log file based QA has been shown to be an alternative to measurement based QA. In this study, we delivered VMAT plans with and without errors to determine if ICAF and/or log file based QA was able to detect the errors. Methods: Using two VMAT patients, the original treatment plan plus 7 additional plans with delivery errors introduced were generated and delivered. The erroneous plans had gantry, collimator, MLC, gantry and collimator, collimator and MLC, MLC and gantry, and gantry, collimator, and MLC errors. The gantry and collimator errors were off by 4° for one of the two arcs. The MLC error introduced was one in which the opening aperture did not move throughout the delivery of the field. For each delivery, an ICAF measurement was made as well as a dose comparison based upon log files. Passing criteria to evaluate the plans were an ion chamber difference of less than 5% and, for film, 90% of pixels passing 3 mm/3% gamma analysis (GA). For log file analysis, 90% of voxels had to pass the 3 mm/3% 3D GA and beam parameters had to match what was in the plan. Results: The two original plans were delivered and passed both ICAF and log file based QA. Both ICAF and log file QA met the dosimetry criteria on 4 of the 12 erroneous cases analyzed (2 cases were not analyzed). For the log file analysis, all 12 erroneous plans flagged a mismatch between the delivery and what was planned. The 8 plans that did not meet criteria all had MLC errors. Conclusion: Our study demonstrates that log file based pre-treatment QA was able to detect small errors that may not be detected using ICAF, and both methods were able to detect larger delivery errors.
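
    For reference, a brute-force global gamma passing rate on two aligned 2D dose grids can be computed as below; this is a simplified stand-in for the clinical gamma analysis software used in the study (3%/3 mm criteria, global normalization to the reference maximum).

```python
import numpy as np

def gamma_passing_rate(dose_eval, dose_ref, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Brute-force global gamma for two 2D dose grids on the same lattice.
    Returns the percentage of reference points with gamma <= 1."""
    ny, nx = dose_ref.shape
    yy, xx = np.meshgrid(np.arange(ny) * spacing_mm, np.arange(nx) * spacing_mm, indexing="ij")
    d_norm = dose_tol * dose_ref.max()              # global dose normalization
    gamma = np.full(dose_ref.shape, np.inf)
    for j in range(ny):
        for i in range(nx):
            dist2 = (yy - yy[j, i]) ** 2 + (xx - xx[j, i]) ** 2
            dose2 = (dose_eval - dose_ref[j, i]) ** 2
            gamma[j, i] = np.sqrt(np.min(dist2 / dist_tol_mm**2 + dose2 / d_norm**2))
    return 100.0 * np.mean(gamma <= 1.0)
```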

  17. Automated replication of cone beam CT-guided treatments in the Pinnacle(3) treatment planning system for adaptive radiotherapy.

    PubMed

    Hargrave, Catriona; Mason, Nicole; Guidi, Robyn; Miller, Julie-Anne; Becker, Jillian; Moores, Matthew; Mengersen, Kerrie; Poulsen, Michael; Harden, Fiona

    2016-03-01

    Time-consuming manual methods have been required to register cone-beam computed tomography (CBCT) images with plans in the Pinnacle(3) treatment planning system in order to replicate delivered treatments for adaptive radiotherapy. These methods rely on fiducial marker (FM) placement during CBCT acquisition or the image mid-point to localise the image isocentre. A quality assurance study was conducted to validate an automated CBCT-plan registration method utilising the Digital Imaging and Communications in Medicine (DICOM) Structure Set (RS) and Spatial Registration (RE) files created during online image-guided radiotherapy (IGRT). CBCTs of a phantom were acquired with FMs and predetermined setup errors using various online IGRT workflows. The CBCTs, DICOM RS and RE files were imported into Pinnacle(3) plans of the phantom and the resulting automated CBCT-plan registrations were compared to existing manual methods. A clinical protocol for the automated method was subsequently developed and tested retrospectively using CBCTs and plans for six bladder patients. The automated CBCT-plan registration method was successfully applied to thirty-four phantom CBCT images acquired with an online 0 mm action level workflow. Ten CBCTs acquired with other IGRT workflows required manual workarounds. This was addressed during the development and testing of the clinical protocol using twenty-eight patient CBCTs. The automated CBCT-plan registrations were instantaneous, replicating delivered treatments in Pinnacle(3) with errors of ±0.5 mm. These errors were comparable to mid-point-dependent manual registrations but superior to FM-dependent manual registrations. The automated CBCT-plan registration method quickly and reliably replicates delivered treatments in Pinnacle(3) for adaptive radiotherapy.
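
    A hedged sketch of one step of such an automated workflow: reading the rigid transform from a DICOM Spatial Registration (RE) object with pydicom and applying it to a plan isocentre. The attribute path follows the DICOM Spatial Registration IOD and may need adjusting for a particular vendor's exports; it is not the Pinnacle(3)-specific implementation.

```python
import numpy as np
import pydicom

def read_registration_matrix(reg_path):
    """Read the 4x4 homogeneous transform from a DICOM Spatial Registration object.
    The nesting below (RegistrationSequence -> MatrixRegistrationSequence ->
    MatrixSequence) is the standard IOD layout; adjust if your files differ."""
    ds = pydicom.dcmread(reg_path)
    item = ds.RegistrationSequence[0].MatrixRegistrationSequence[0].MatrixSequence[0]
    return np.array(item.FrameOfReferenceTransformationMatrix, dtype=float).reshape(4, 4)

def apply_to_isocentre(matrix, isocentre_mm):
    """Map a plan isocentre (x, y, z in mm) with the homogeneous registration matrix."""
    p = np.append(np.asarray(isocentre_mm, dtype=float), 1.0)
    return (matrix @ p)[:3]
```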

  18. MPI Runtime Error Detection with MUST: Advances in Deadlock Detection

    DOE PAGES

    Hilbrich, Tobias; Protze, Joachim; Schulz, Martin; ...

    2013-01-01

    The widely used Message Passing Interface (MPI) is complex and rich. As a result, application developers require automated tools to avoid and to detect MPI programming errors. We present the Marmot Umpire Scalable Tool (MUST) that detects such errors with significantly increased scalability. We present improvements to our graph-based deadlock detection approach for MPI, which cover future MPI extensions. Our enhancements also check complex MPI constructs that no previous graph-based detection approach handled correctly. Finally, we present optimizations for the processing of MPI operations that reduce runtime deadlock detection overheads. Existing approaches often require O(p) analysis time per MPI operation, for p processes. We empirically observe that our improvements lead to sub-linear or better analysis time per operation for a wide range of real-world applications.
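
    The core of graph-based deadlock detection is a cycle search in a wait-for graph. The sketch below shows only that basic check (AND semantics, one edge per blocking dependency); MUST's actual model additionally covers wildcard receives, collectives, and future MPI constructs.

```python
from collections import defaultdict

def find_deadlock(wait_edges):
    """Return one deadlocked cycle from (waiting_rank, awaited_rank) edges, or None."""
    graph = defaultdict(list)
    for a, b in wait_edges:
        graph[a].append(b)

    WHITE, GREY, BLACK = 0, 1, 2
    color, stack = defaultdict(int), []

    def dfs(node):
        color[node] = GREY
        stack.append(node)
        for nxt in graph[node]:
            if color[nxt] == GREY:                  # back edge -> cycle found
                return stack[stack.index(nxt):] + [nxt]
            if color[nxt] == WHITE:
                cycle = dfs(nxt)
                if cycle:
                    return cycle
        color[node] = BLACK
        stack.pop()
        return None

    for node in list(graph):
        if color[node] == WHITE:
            cycle = dfs(node)
            if cycle:
                return cycle
    return None

# Example: rank 0 waits for 1, 1 waits for 2, 2 waits for 0 -> prints [0, 1, 2, 0]
print(find_deadlock([(0, 1), (1, 2), (2, 0)]))
```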

  19. Decoding of DBEC-TBED Reed-Solomon codes. [Double-Byte-Error-Correcting, Triple-Byte-Error-Detecting

    NASA Technical Reports Server (NTRS)

    Deng, Robert H.; Costello, Daniel J., Jr.

    1987-01-01

    A problem in designing semiconductor memories is to provide some measure of error control without requiring excessive coding overhead or decoding time. In LSI and VLSI technology, memories are often organized on a multiple bit (or byte) per chip basis. For example, some 256K-bit DRAMs are organized as 32K x 8-bit bytes. Byte-oriented codes such as Reed-Solomon (RS) codes can provide efficient low-overhead error control for such memories. However, the standard iterative algorithm for decoding RS codes is too slow for these applications. The paper presents a special decoding technique for double-byte-error-correcting, triple-byte-error-detecting RS codes which is capable of high-speed operation. This technique is designed to find the error locations and the error values directly from the syndrome without having to use the iterative algorithm to find the error locator polynomial.
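
    To illustrate the direct (non-iterative) decoding idea, the sketch below solves for two error locations straight from four syndromes using Newton's identities. It deliberately works over the prime field GF(257) as a stand-in for the GF(2^8) byte field used in memory applications, and omits the companion step of computing the error values.

```python
# Direct two-error location from syndromes, illustrated over GF(257).
P = 257

def inv(a):
    return pow(a % P, P - 2, P)           # Fermat inverse in the prime field

def syndromes(received, n_syn=4, alpha=3):
    """S_k = sum_i r_i * alpha^(i*k), k = 1..n_syn (primitive-element convention assumed)."""
    return [sum(r * pow(alpha, i * k, P) for i, r in enumerate(received)) % P
            for k in range(1, n_syn + 1)]

def locate_two_errors(S, alpha=3, n=16):
    """Solve directly for the locator coefficients s1 = X1+X2, s2 = X1*X2 from
    Newton's identities (S3 = s1*S2 - s2*S1, S4 = s1*S3 - s2*S2), then test each
    position -- no iterative Berlekamp-Massey step."""
    S1, S2, S3, S4 = S
    det = (S1 * S3 - S2 * S2) % P
    s1 = (S1 * S4 - S2 * S3) * inv(det) % P
    s2 = (S2 * S4 - S3 * S3) * inv(det) % P
    return [i for i in range(n)
            if (pow(alpha, 2 * i, P) - s1 * pow(alpha, i, P) + s2) % P == 0]

# Example: errors of value 5 at position 2 and 7 at position 9 in a length-16 word.
word = [0] * 16
word[2], word[9] = 5, 7
print(locate_two_errors(syndromes(word)))   # -> [2, 9]
```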

  20. Adaboost multi-view face detection based on YCgCr skin color model

    NASA Astrophysics Data System (ADS)

    Lan, Qi; Xu, Zhiyong

    2016-09-01

    The traditional Adaboost face detection algorithm uses Haar-like features to train face classifiers, whose detection error rate is low in face regions. Under complex backgrounds, however, the classifiers easily misclassify background regions whose gray-level distribution resembles that of faces, so the false detection rate of the traditional Adaboost algorithm is high. Skin color, one of the most important features of a face, clusters well in the YCgCr color space, so non-face areas can be excluded quickly with a skin color model. Therefore, combining the advantages of the Adaboost algorithm and skin color detection, this paper proposes an Adaboost face detection method based on the YCgCr skin color model. Experiments show that, compared with the traditional algorithm, the proposed method significantly improves detection accuracy and reduces false detections.
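
    A hedged sketch of the skin-color stage only: the YCgCr transform mirrors digital YCbCr with a green-difference channel Cg in place of Cb, and the threshold ranges are illustrative placeholders rather than the paper's trained values. The resulting mask would be used to prune candidate windows before running the (more expensive) Adaboost cascade.

```python
import numpy as np

def skin_mask_ycgcr(rgb):
    """Return a boolean skin mask for an RGB image (uint8, 0-255).
    Transform coefficients follow the common YCgCr definition; the skin ranges
    below are illustrative and should be tuned on real training data."""
    rgb = rgb.astype(np.float64) / 255.0
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    Y  = 16.0  +  65.481 * R + 128.553 * G +  24.966 * B
    Cg = 128.0 -  81.085 * R + 112.000 * G -  30.915 * B
    Cr = 128.0 + 112.000 * R -  93.786 * G -  18.214 * B
    return (Cg > 85) & (Cg < 135) & (Cr > 130) & (Cr < 165)
```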

  1. Using goal- and grip-related information for understanding the correctness of other's actions: an ERP study.

    PubMed

    van Elk, Michiel; Bousardt, Roel; Bekkering, Harold; van Schie, Hein T

    2012-01-01

    Detecting errors in other's actions is of pivotal importance for joint action, competitive behavior and observational learning. Although many studies have focused on the neural mechanisms involved in detecting low-level errors, relatively little is known about error detection in everyday situations. The present study aimed to identify the functional and neural mechanisms whereby we understand the correctness of other's actions involving well-known objects (e.g. pouring coffee in a cup). Participants observed action sequences in which the correctness of the object grasped and the grip applied to a pair of objects were independently manipulated. Observation of object violations (e.g. grasping the empty cup instead of the coffee pot) resulted in a stronger P3-effect than observation of grip errors (e.g. grasping the coffee pot at the upper part instead of the handle), likely reflecting a reorienting response, directing attention to the relevant location. Following the P3-effect, a parietal slow wave positivity was observed that persisted for grip errors, likely reflecting the detection of an incorrect hand-object interaction. These findings provide new insight into the functional significance of the neurophysiological markers associated with the observation of incorrect actions and suggest that the P3-effect and the subsequent parietal slow wave positivity may reflect the detection of errors at different levels in the action hierarchy. This study thereby elucidates the cognitive processes that support the detection of action violations in the selection of objects and grips.

  2. Virtex-5QV Self Scrubber

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wojahn, Christopher K.

    2015-10-20

    This HDL code (hereafter referred to as "software") implements circuitry in Xilinx Virtex-5QV Field Programmable Gate Array (FPGA) hardware. This software allows the device to self-check the consistency of its own configuration memory for radiation-induced errors. The software then provides the capability to correct any single-bit errors detected in the memory using the device's inherent circuitry, or reload corrupted memory frames when larger errors occur that cannot be corrected with the device's built-in error correction and detection scheme.

  3. Optimizing the stimulus presentation paradigm design for the P300-based brain-computer interface using performance prediction.

    PubMed

    Mainsah, B O; Reeves, G; Collins, L M; Throckmorton, C S

    2017-08-01

    The role of a brain-computer interface (BCI) is to discern a user's intended message or action by extracting and decoding relevant information from brain signals. Stimulus-driven BCIs, such as the P300 speller, rely on detecting event-related potentials (ERPs) in response to a user attending to relevant or target stimulus events. However, this process is error-prone because the ERPs are embedded in noisy electroencephalography (EEG) data, representing a fundamental problem in communication of the uncertainty in the information that is received during noisy transmission. A BCI can be modeled as a noisy communication system and an information-theoretic approach can be exploited to design a stimulus presentation paradigm to maximize the information content that is presented to the user. However, previous methods that focused on designing error-correcting codes failed to provide significant performance improvements due to underestimating the effects of psycho-physiological factors on the P300 ERP elicitation process and a limited ability to predict online performance with their proposed methods. Maximizing the information rate favors the selection of stimulus presentation patterns with increased target presentation frequency, which exacerbates refractory effects and negatively impacts performance within the context of an oddball paradigm. An information-theoretic approach that seeks to understand the fundamental trade-off between information rate and reliability is desirable. We developed a performance-based paradigm (PBP) by tuning specific parameters of the stimulus presentation paradigm to maximize performance while minimizing refractory effects. We used a probabilistic-based performance prediction method as an evaluation criterion to select a final configuration of the PBP. With our PBP, we demonstrate statistically significant improvements in online performance, both in accuracy and spelling rate, compared to the conventional row-column paradigm. By accounting for refractory effects, an information-theoretic approach can be exploited to significantly improve BCI performance across a wide range of performance levels.

  4. Optimizing the stimulus presentation paradigm design for the P300-based brain-computer interface using performance prediction

    NASA Astrophysics Data System (ADS)

    Mainsah, B. O.; Reeves, G.; Collins, L. M.; Throckmorton, C. S.

    2017-08-01

    Objective. The role of a brain-computer interface (BCI) is to discern a user’s intended message or action by extracting and decoding relevant information from brain signals. Stimulus-driven BCIs, such as the P300 speller, rely on detecting event-related potentials (ERPs) in response to a user attending to relevant or target stimulus events. However, this process is error-prone because the ERPs are embedded in noisy electroencephalography (EEG) data, representing a fundamental problem in communication of the uncertainty in the information that is received during noisy transmission. A BCI can be modeled as a noisy communication system and an information-theoretic approach can be exploited to design a stimulus presentation paradigm to maximize the information content that is presented to the user. However, previous methods that focused on designing error-correcting codes failed to provide significant performance improvements due to underestimating the effects of psycho-physiological factors on the P300 ERP elicitation process and a limited ability to predict online performance with their proposed methods. Maximizing the information rate favors the selection of stimulus presentation patterns with increased target presentation frequency, which exacerbates refractory effects and negatively impacts performance within the context of an oddball paradigm. An information-theoretic approach that seeks to understand the fundamental trade-off between information rate and reliability is desirable. Approach. We developed a performance-based paradigm (PBP) by tuning specific parameters of the stimulus presentation paradigm to maximize performance while minimizing refractory effects. We used a probabilistic-based performance prediction method as an evaluation criterion to select a final configuration of the PBP. Main results. With our PBP, we demonstrate statistically significant improvements in online performance, both in accuracy and spelling rate, compared to the conventional row-column paradigm. Significance. By accounting for refractory effects, an information-theoretic approach can be exploited to significantly improve BCI performance across a wide range of performance levels.

  5. Incorporating profile information in community detection for online social networks

    NASA Astrophysics Data System (ADS)

    Fan, W.; Yeung, K. H.

    2014-07-01

    Community structure is an important feature in the study of complex networks. It is because nodes of the same community may have similar properties. In this paper we extend two popular community detection methods to partition online social networks. In our extended methods, the profile information of users is used for partitioning. We apply the extended methods in several sample networks of Facebook. Compared with the original methods, the community structures we obtain have higher modularity. Our results indicate that users' profile information is consistent with the community structure of their friendship network to some extent. To the best of our knowledge, this paper is the first to discuss how profile information can be used to improve community detection in online social networks.
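
    One simple way to fold profile information into modularity-based detection, shown below as an assumption rather than the paper's exact extension, is to weight each friendship edge by a blend of the link itself and the users' profile similarity, then run weighted modularity maximization (here via networkx).

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def profile_similarity(p1, p2):
    """Jaccard similarity between two users' profile attribute sets (illustrative)."""
    a, b = set(p1), set(p2)
    return len(a & b) / len(a | b) if a | b else 0.0

def detect_communities(friend_edges, profiles, alpha=0.5):
    """Blend the friendship link with profile similarity into an edge weight, then
    run weighted greedy modularity maximization."""
    G = nx.Graph()
    for u, v in friend_edges:
        w = (1.0 - alpha) + alpha * profile_similarity(profiles.get(u, []), profiles.get(v, []))
        G.add_edge(u, v, weight=w)
    return list(greedy_modularity_communities(G, weight="weight"))

# Toy example with hypothetical profile attributes:
profiles = {"a": ["uni_x", "city_y"], "b": ["uni_x"], "c": ["city_z"], "d": ["city_z"]}
print(detect_communities([("a", "b"), ("b", "c"), ("c", "d")], profiles))
```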

  6. Detecting and Correcting Errors in Rapid Aiming Movements: Effects of Movement Time, Distance, and Velocity

    ERIC Educational Resources Information Center

    Sherwood, David E.

    2010-01-01

    According to closed-loop accounts of motor control, movement errors are detected by comparing sensory feedback to an acquired reference state. Differences between the reference state and the movement-produced feedback results in an error signal that serves as a basis for a correction. The main question addressed in the current study was how…

  7. Intelligent complementary sliding-mode control for LUSMS-based X-Y-theta motion control stage.

    PubMed

    Lin, Faa-Jeng; Chen, Syuan-Yi; Shyu, Kuo-Kai; Liu, Yen-Hung

    2010-07-01

    An intelligent complementary sliding-mode control (ICSMC) system using a recurrent wavelet-based Elman neural network (RWENN) estimator is proposed in this study to control the mover position of a linear ultrasonic motors (LUSMs)-based X-Y-theta motion control stage for the tracking of various contours. By the addition of a complementary generalized error transformation, the complementary sliding-mode control (CSMC) can efficiently reduce the guaranteed ultimate bound of the tracking error by half compared with the sliding-mode control (SMC) while using the saturation function. To estimate a lumped uncertainty on-line and replace the hitting control of the CSMC directly, the RWENN estimator is adopted in the proposed ICSMC system. In the RWENN, each hidden neuron employs a different wavelet function as an activation function to improve both the convergent precision and the convergent time compared with the conventional Elman neural network (ENN). The estimation laws of the RWENN are derived using the Lyapunov stability theorem to train the network parameters on-line. A robust compensator is also proposed to confront the uncertainties, including the approximation error, optimal parameter vectors, and higher-order terms in the Taylor series. Finally, experimental results for the tracking of various contours show that the tracking performance of the ICSMC system is significantly improved compared with the SMC and CSMC systems.

  8. Single-primer fluorescent sequencing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruth, J.L.; Morgan, C.A.; Middendorf, L.R.

    Modified linker arm oligonucleotides complementary to standard M13 priming sites were synthesized, labelled with either one, two, or three fluoresceins, and purified by reverse-phase HPLC. When used as primers in standard dideoxy M13 sequencing with ³²P-dNTPs, normal autoradiographic patterns were obtained. To eliminate the radioactivity, direct on-line fluorescence detection was achieved by the use of a scanning 10 mW Argon laser emitting 488 nm light. Fluorescent bands were detected directly in standard 0.2 or 0.35 mm thick polyacrylamide gels at a distance of 24 cm from the loading wells by a photomultiplier tube filtered at 520 nm. Horizontal and temporal location of each band was displayed by computer as a band in real time, providing visual appearance similar to normal 4-lane autoradiograms. Using a single primer labelled with two fluoresceins, sequences of between 500 and 600 bases have been read in a single loading with better than 98% accuracy; up to 400 bases can be read reproducibly with no errors. More than 50 sequences have been determined by this method. This approach requires only 1-2 µg of cloned template, and produces continuous sequence data at about one band per minute.

  9. 77 FR 24703 - Agency Information Collection Activities; Submission to OMB for Review and Approval; Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-25

    ...-2012-0034, to: (1) EPA online using www.regulations.gov (our preferred method), or by email to docket... decrease in the labor hours in this ICR compared to the previous ICR due to a mathematical error in [[Page...

  10. 76 FR 14967 - Agency Information Collection Activities; Submission to OMB for Review and Approval; Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-18

    ...- OECA-2010-0374, to (1) EPA online using http://www.regulations.gov (our preferred method), or by email... hours. This is due to a mathematical error in the previous ICR. The increase in cost to Respondents and...

  11. LANDSAT-4 MSS Geometric Correction: Methods and Results

    NASA Technical Reports Server (NTRS)

    Brooks, J.; Kimmer, E.; Su, J.

    1984-01-01

    An automated image registration system such as that developed for LANDSAT-4 can produce all of the information needed to verify and calibrate the software and to evaluate system performance. The on-line MSS archive generation process which upgrades systematic correction data to geodetic correction data is described, as well as the control point library build subsystem which generates control point chips and support data for on-line upgrade of correction data. The system performance was evaluated for both temporal and geodetic registration. For temporal registration, 90% errors were computed to be 0.36 IFOV (instantaneous field of view = 82.7 meters) cross track, and 0.29 IFOV along track. Also, for actual production runs monitored, the 90% errors were 0.29 IFOV cross track and 0.25 IFOV along track. The system specification is 0.3 IFOV, 90% of the time, both cross and along track. For geodetic registration performance, the model bias was measured by designating control points in the geodetically corrected imagery.

  12. Clustering of tethered satellite system simulation data by an adaptive neuro-fuzzy algorithm

    NASA Technical Reports Server (NTRS)

    Mitra, Sunanda; Pemmaraju, Surya

    1992-01-01

    Recent developments in neuro-fuzzy systems indicate that the concepts of adaptive pattern recognition, when used to identify appropriate control actions corresponding to clusters of patterns representing system states in dynamic nonlinear control systems, may result in innovative designs. A modular, unsupervised neural network architecture, in which fuzzy learning rules have been embedded, is used for on-line identification of similar states. The architecture and control rules involved in Adaptive Fuzzy Leader Clustering (AFLC) allow this system to be incorporated in control systems for identification of system states corresponding to specific control actions. We have used this algorithm to cluster the simulation data of the Tethered Satellite System (TSS) to estimate the range of delta voltages necessary to maintain the desired length rate of the tether. The AFLC algorithm is capable of on-line estimation of the appropriate control voltages from the corresponding length error and length rate error without a priori knowledge of their membership functions and familiarity with the behavior of the Tethered Satellite System.
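
    As a simplified illustration of leader-style on-line clustering (omitting AFLC's fuzzy memberships and ART-style resonance tests), the sketch below assigns each incoming state vector to the nearest existing centroid if it lies within a vigilance radius, and otherwise opens a new cluster.

```python
import numpy as np

def leader_clustering(samples, vigilance=1.0, lr=0.1):
    """One-pass, on-line leader clustering: each sample either joins (and slightly
    moves) the nearest centroid within the vigilance radius, or starts a new cluster.
    Returns the centroids and the per-sample cluster labels."""
    centroids, labels = [], []
    for x in np.asarray(samples, dtype=float):
        if centroids:
            d = [np.linalg.norm(x - c) for c in centroids]
            k = int(np.argmin(d))
            if d[k] <= vigilance:
                centroids[k] += lr * (x - centroids[k])   # move winner toward the sample
                labels.append(k)
                continue
        centroids.append(x.copy())
        labels.append(len(centroids) - 1)
    return np.array(centroids), labels
```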

  13. Updating of aversive memories after temporal error detection is differentially modulated by mTOR across development

    PubMed Central

    Tallot, Lucille; Diaz-Mataix, Lorenzo; Perry, Rosemarie E.; Wood, Kira; LeDoux, Joseph E.; Mouly, Anne-Marie; Sullivan, Regina M.; Doyère, Valérie

    2017-01-01

    The updating of a memory is triggered whenever it is reactivated and a mismatch from what is expected (i.e., prediction error) is detected, a process that can be unraveled through the memory's sensitivity to protein synthesis inhibitors (i.e., reconsolidation). As noted in previous studies, in Pavlovian threat/aversive conditioning in adult rats, prediction error detection and its associated protein synthesis-dependent reconsolidation can be triggered by reactivating the memory with the conditioned stimulus (CS), but without the unconditioned stimulus (US), or by presenting a CS–US pairing with a different CS–US interval than during the initial learning. Whether similar mechanisms underlie memory updating in the young is not known. Using similar paradigms with rapamycin (an mTORC1 inhibitor), we show that preweaning rats (PN18–20) do form a long-term memory of the CS–US interval, and detect a 10-sec versus 30-sec temporal prediction error. However, the resulting updating/reconsolidation processes become adult-like after adolescence (PN30–40). Our results thus show that while temporal prediction error detection exists in preweaning rats, specific infant-type mechanisms are at play for associative learning and memory. PMID:28202715

  14. Memory and the Moses illusion: failures to detect contradictions with stored knowledge yield negative memorial consequences.

    PubMed

    Bottoms, Hayden C; Eslick, Andrea N; Marsh, Elizabeth J

    2010-08-01

    Although contradictions with stored knowledge are common in daily life, people often fail to notice them. For example, in the Moses illusion, participants fail to notice errors in questions such as "How many animals of each kind did Moses take on the Ark?" despite later showing knowledge that the Biblical reference is to Noah, not Moses. We examined whether error prevalence affected participants' ability to detect distortions in questions, and whether this in turn had memorial consequences. Many of the errors were overlooked, but participants were better able to catch them when they were more common. More generally, the failure to detect errors had negative memorial consequences, increasing the likelihood that the errors were used to answer later general knowledge questions. Methodological implications of this finding are discussed, as it suggests that typical analyses likely underestimate the size of the Moses illusion. Overall, answering distorted questions can yield errors in the knowledge base; most importantly, prior knowledge does not protect against these negative memorial consequences.

  15. Prevalence and pattern of prescription errors in a Nigerian kidney hospital.

    PubMed

    Babatunde, Kehinde M; Akinbodewa, Akinwumi A; Akinboye, Ayodele O; Adejumo, Ademola O

    2016-12-01

    To determine (i) the prevalence and pattern of prescription errors in our Centre and (ii) appraise pharmacists' intervention and correction of identified prescription errors. A descriptive, single-blinded cross-sectional study. Kidney Care Centre is a public specialist hospital. The monthly patient load averages 60 general out-patient cases and 17.4 in-patients. A total of 31 medical doctors (comprising 2 Consultant Nephrologists, 15 Medical Officers, and 14 House Officers), 40 nurses and 24 ward assistants participated in the study. One pharmacist runs the daily call schedule. Prescribers were blinded to the study. Prescriptions containing only galenicals were excluded. An error detection mechanism was set up to identify and correct prescription errors. Life-threatening prescriptions were discussed with the Quality Assurance Team of the Centre, who conveyed such errors to the prescriber without revealing the ongoing study. Prevalence of prescription errors, pattern of prescription errors, pharmacist's intervention. A total of 2,660 (75.0%) combined prescription errors were found, comprising one form of error or the other: illegitimacy 1,388 (52.18%), omission 1,221 (45.90%), and wrong dose 51 (1.92%); no errors of style were detected. Life-threatening errors were low (1.1-2.2%). Errors were found more commonly among junior doctors and non-medical doctors. Only 56 (1.6%) of the errors were detected and corrected during the process of dispensing. Prescription errors related to illegitimacy and omissions were highly prevalent. There is a need to improve the patient-to-healthcare giver ratio. A medication quality assurance unit is needed in our hospitals. No financial support was received by any of the authors for this study.

  16. Identifying key hospital service quality factors in online health communities.

    PubMed

    Jung, Yuchul; Hur, Cinyoung; Jung, Dain; Kim, Minki

    2015-04-07

    The volume of health-related user-created content, especially hospital-related questions and answers in online health communities, has rapidly increased. Patients and caregivers participate in online community activities to share their experiences, exchange information, and ask about recommended or discredited hospitals. However, there is little research on how to identify hospital service quality automatically from the online communities. In the past, in-depth analysis of hospitals has used random sampling surveys. However, such surveys are becoming impractical owing to the rapidly increasing volume of online data and the diverse analysis requirements of related stakeholders. As a solution for utilizing large-scale health-related information, we propose a novel approach to identify hospital service quality factors and overtime trends automatically from online health communities, especially hospital-related questions and answers. We defined social media-based key quality factors for hospitals. In addition, we developed text mining techniques to detect such factors that frequently occur in online health communities. After detecting these factors that represent qualitative aspects of hospitals, we applied a sentiment analysis to recognize the types of recommendations in messages posted within online health communities. Korea's two biggest online portals were used to test the effectiveness of detection of social media-based key quality factors for hospitals. To evaluate the proposed text mining techniques, we performed manual evaluations on the extraction and classification results, such as hospital name, service quality factors, and recommendation types using a random sample of messages (ie, 5.44% (9450/173,748) of the total messages). Service quality factor detection and hospital name extraction achieved average F1 scores of 91% and 78%, respectively. In terms of recommendation classification, performance (ie, precision) is 78% on average. Extraction and classification performance still has room for improvement, but the extraction results are applicable to more detailed analysis. Further analysis of the extracted information reveals that there are differences in the details of social media-based key quality factors for hospitals according to the regions in Korea, and the patterns of change seem to accurately reflect social events (eg, influenza epidemics). These findings could be used to provide timely information to caregivers, hospital officials, and medical officials for health care policies.
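
    The recommendation-classification stage can be sketched with a generic TF-IDF plus logistic-regression pipeline, as below; the toy English posts and labels are hypothetical stand-ins for the Korean community messages, and the real system also performs quality-factor and hospital-name extraction that is not shown.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

# Toy labelled posts (hypothetical stand-ins for community messages).
posts = ["The nurses were kind and the ward was clean",
         "Waited three hours and the staff were rude",
         "The doctor explained the treatment clearly",
         "Billing was confusing and expensive"]
labels = ["recommend", "not_recommend", "recommend", "not_recommend"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(posts, labels)

# Evaluation would normally use held-out, manually labelled samples, as in the study.
pred = clf.predict(posts)
print(f1_score(labels, pred, average="macro"))
```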

  17. Flight deck disturbance management: a simulator study of diagnosis and recovery from breakdowns in pilot-automation coordination.

    PubMed

    Nikolic, Mark I; Sarter, Nadine B

    2007-08-01

    To examine operator strategies for diagnosing and recovering from errors and disturbances as well as the impact of automation design and time pressure on these processes. Considerable efforts have been directed at error prevention through training and design. However, because errors cannot be eliminated completely, their detection, diagnosis, and recovery must also be supported. Research has focused almost exclusively on error detection. Little is known about error diagnosis and recovery, especially in the context of event-driven tasks and domains. With a confederate pilot, 12 airline pilots flew a 1-hr simulator scenario that involved three challenging automation-related tasks and events that were likely to produce erroneous actions or assessments. Behavioral data were compared with a canonical path to examine pilots' error and disturbance management strategies. Debriefings were conducted to probe pilots' system knowledge. Pilots seldom followed the canonical path to cope with the scenario events. Detection of a disturbance was often delayed. Diagnostic episodes were rare because of pilots' knowledge gaps and time criticality. In many cases, generic inefficient recovery strategies were observed, and pilots relied on high levels of automation to manage the consequences of an error. Our findings describe and explain the nature and shortcomings of pilots' error management activities. They highlight the need for improved automation training and design to achieve more timely detection, accurate explanation, and effective recovery from errors and disturbances. Our findings can inform the design of tools and techniques that support disturbance management in various complex, event-driven environments.

  18. Composite Interval Mapping Based on Lattice Design for Error Control May Increase Power of Quantitative Trait Locus Detection.

    PubMed

    He, Jianbo; Li, Jijie; Huang, Zhongwen; Zhao, Tuanjie; Xing, Guangnan; Gai, Junyi; Guan, Rongzhan

    2015-01-01

    Experimental error control is very important in quantitative trait locus (QTL) mapping. Although numerous statistical methods have been developed for QTL mapping, a QTL detection model based on an appropriate experimental design that emphasizes error control has not been developed. Lattice design is very suitable for experiments with large sample sizes, which is usually required for accurate mapping of quantitative traits. However, the lack of a QTL mapping method based on lattice design dictates that the arithmetic mean or adjusted mean of each line of observations in the lattice design had to be used as a response variable, resulting in low QTL detection power. As an improvement, we developed a QTL mapping method termed composite interval mapping based on lattice design (CIMLD). In the lattice design, experimental errors are decomposed into random errors and block-within-replication errors. Four levels of block-within-replication errors were simulated to show the power of QTL detection under different error controls. The simulation results showed that the arithmetic mean method, which is equivalent to a method under random complete block design (RCBD), was very sensitive to the size of the block variance and with the increase of block variance, the power of QTL detection decreased from 51.3% to 9.4%. In contrast to the RCBD method, the power of CIMLD and the adjusted mean method did not change for different block variances. The CIMLD method showed 1.2- to 7.6-fold higher power of QTL detection than the arithmetic or adjusted mean methods. Our proposed method was applied to real soybean (Glycine max) data as an example and 10 QTLs for biomass were identified that explained 65.87% of the phenotypic variation, while only three and two QTLs were identified by arithmetic and adjusted mean methods, respectively.

  19. A multiobserver study of the effects of including point-of-care patient photographs with portable radiography: a means to detect wrong-patient errors.

    PubMed

    Tridandapani, Srini; Ramamurthy, Senthil; Provenzale, James; Obuchowski, Nancy A; Evanoff, Michael G; Bhatti, Pamela

    2014-08-01

    To evaluate whether the presence of facial photographs obtained at the point-of-care of portable radiography leads to increased detection of wrong-patient errors. In this institutional review board-approved study, 166 radiograph-photograph combinations were obtained from 30 patients. Consecutive radiographs from the same patients resulted in 83 unique pairs (ie, a new radiograph and prior, comparison radiograph) for interpretation. To simulate wrong-patient errors, mismatched pairs were generated by pairing radiographs from different patients chosen randomly from the sample. Ninety radiologists each interpreted a unique randomly chosen set of 10 radiographic pairs, containing up to 10% mismatches (ie, error pairs). Radiologists were randomly assigned to interpret radiographs with or without photographs. The number of mismatches was identified, and interpretation times were recorded. Ninety radiologists with 21 ± 10 (mean ± standard deviation) years of experience were recruited to participate in this observer study. With the introduction of photographs, the proportion of errors detected increased from 31% (9 of 29) to 77% (23 of 30; P = .006). The odds ratio for detection of error with photographs to detection without photographs was 7.3 (95% confidence interval: 2.29-23.18). Observer qualifications, training, or practice in cardiothoracic radiology did not influence sensitivity for error detection. There is no significant difference in interpretation time for studies without photographs and those with photographs (60 ± 22 vs. 61 ± 25 seconds; P = .77). In this observer study, facial photographs obtained simultaneously with portable chest radiographs increased the identification of any wrong-patient errors, without substantial increase in interpretation time. This technique offers a potential means to increase patient safety through correct patient identification. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
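
    The reported odds ratio and confidence interval can be reproduced from the detection counts given in the abstract (23 of 30 errors detected with photographs, 9 of 29 without) using Woolf's log-odds method:

```python
import math

# Error-detection counts reported in the abstract.
a, b = 23, 30 - 23   # with photographs: detected, missed
c, d = 9, 29 - 9     # without photographs: detected, missed

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)        # Woolf's standard error of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.1f}, 95% CI ({lo:.2f}, {hi:.2f})")
# -> OR = 7.3, 95% CI (2.30, 23.18), matching the reported 7.3 (2.29-23.18) to rounding
```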

  20. Improvement of tsunami detection in timeseries data of GPS buoys with the Continuous Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Chida, Y.; Takagawa, T.

    2017-12-01

    The observation data of GPS buoys installed offshore of Japan are used for monitoring not only waves but also tsunamis. The real-time data were successfully used to upgrade the tsunami warnings just after the 2011 Tohoku earthquake. Huge tsunamis can be detected easily because the signal-to-noise ratio is high enough, but moderate tsunamis cannot. GPS data sometimes include error waveforms that resemble tsunamis because the positioning accuracy changes with the number and positions of GPS satellites. Distinguishing true tsunami waveforms from pseudo-tsunami ones is therefore important for tsunami detection. In this research, a method to reduce misdetections of tsunamis in the observation data of GPS buoys and to increase the efficiency of tsunami detection was developed. Firstly, the error waveforms were extracted by using the indexes of position dilution of precision, reliability of GPS satellite positioning, and the number of satellites used for calculation. Then, the output from this procedure was used for the Continuous Wavelet Transform (CWT) to analyze the time-frequency characteristics of error waveforms and real tsunami waveforms. We found that the error waveforms tended to appear when the accuracy of GPS buoy positioning was low. By extracting these waveforms, it was possible to remove about 43% of the error waveforms without reducing the tsunami detection rate. Moreover, we found that the amplitudes of the power spectra obtained from the error waveforms and real tsunamis were similar in the long-period component (4-65 minutes); on the other hand, the amplitude of the short-period component (< 1 minute) obtained from the error waveforms was significantly larger than that of the real tsunami waveforms. By thresholding the short-period component, further extraction of error waveforms became possible without a significant reduction of the tsunami detection rate.
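
    A rough sketch of the short-period screening step, assuming PyWavelets is available: compute a continuous wavelet transform of the sea-level record, average the power at periods below about one minute, and flag segments where that power is anomalously high (the scales, Morlet wavelet choice, and threshold are illustrative assumptions).

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def short_period_power_flag(sea_level, dt_s=1.0, short_max_s=60.0, power_thresh=1.0):
    """Flag time samples whose short-period CWT power is anomalously high, which the
    study associates with GPS positioning errors rather than tsunamis."""
    scales = np.arange(1, 128)
    coefs, freqs = pywt.cwt(np.asarray(sea_level, dtype=float), scales, "morl",
                            sampling_period=dt_s)
    periods = 1.0 / freqs                                   # seconds per cycle
    short_band = periods < short_max_s                      # < ~1 minute components
    short_power = np.mean(np.abs(coefs[short_band, :]) ** 2, axis=0)  # power vs. time
    return short_power > power_thresh                       # True where it looks like a GPS error
```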
