Sample records for false detection rate

  1. A false-alarm aware methodology to develop robust and efficient multi-scale infrared small target detection algorithm

    NASA Astrophysics Data System (ADS)

    Moradi, Saed; Moallem, Payman; Sabahi, Mohamad Farzan

    2018-03-01

    False alarm rate and detection rate are still two contradictory metrics for infrared small target detection in an infrared search and track system (IRST), despite the development of new detection algorithms. In certain circumstances, not detecting true targets is more tolerable than detecting false items as true targets. Hence, considering background clutter and detector noise as the sources of the false alarm in an IRST system, in this paper, a false alarm aware methodology is presented to reduce false alarm rate while the detection rate remains undegraded. To this end, advantages and disadvantages of each detection algorithm are investigated and the sources of the false alarms are determined. Two target detection algorithms having independent false alarm sources are chosen in a way that the disadvantages of one algorithm can be compensated by the advantages of the other. In this work, multi-scale average absolute gray difference (AAGD) and Laplacian of point spread function (LoPSF) are utilized as the cornerstones of the desired algorithm of the proposed methodology. After presenting a conceptual model for the desired algorithm, it is implemented through the most straightforward mechanism. The desired algorithm effectively suppresses background clutter and eliminates detector noise. Also, since the input images are processed through just four different scales, the desired algorithm has good capability for real-time implementation. Simulation results in terms of signal-to-clutter ratio and background suppression factor on real and simulated images prove the effectiveness and the performance of the proposed methodology. Since the desired algorithm was developed based on independent false alarm sources, our proposed methodology is expandable to any pair of detection algorithms which have different false alarm sources.
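
    The fusion idea in this abstract lends itself to a short sketch. The following is a hedged illustration only: the window pairs, the Gaussian-PSF assumption behind the LoPSF term, the pointwise-product fusion, and the global threshold are assumptions made for illustration, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter, gaussian_laplace

    def aagd_map(img, inner, outer):
        """Average absolute gray difference between an inner cell and a larger
        surrounding cell, both approximated here by box means."""
        return np.abs(uniform_filter(img, size=inner) - uniform_filter(img, size=outer))

    def detect_small_targets(img, scales=((3, 9), (5, 15), (7, 21), (9, 27)), sigma=1.5, k=5.0):
        img = np.asarray(img, dtype=float)
        # Multi-scale AAGD: maximum response over the four scales.
        aagd = np.max([aagd_map(img, i, o) for i, o in scales], axis=0)
        # LoPSF-like response: negative Laplacian of Gaussian, assuming a Gaussian PSF;
        # small bright blobs give large positive values, edges and flat clutter much less.
        lopsf = np.clip(-gaussian_laplace(img, sigma), 0.0, None)
        # Illustrative fusion: require both maps to respond, so a false alarm must
        # fool two detectors with (nominally) independent false-alarm sources.
        fused = aagd * lopsf
        return fused > fused.mean() + k * fused.std()
    ```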

  2. A statistical model of false negative and false positive detection of phase singularities.

    PubMed

    Jacquemet, Vincent

    2017-10-01

    The complexity of cardiac fibrillation dynamics can be assessed by analyzing the distribution of phase singularities (PSs) observed using mapping systems. Interelectrode distance, however, limits the accuracy of PS detection. To investigate in a theoretical framework the PS false negative and false positive rates in relation to the characteristics of the mapping system and fibrillation dynamics, we propose a statistical model of phase maps with controllable number and locations of PSs. In this model, phase maps are generated from randomly distributed PSs with physiologically-plausible directions of rotation. Noise and distortion of the phase are added. PSs are detected using topological charge contour integrals on regular grids of varying resolutions. Over 100 × 10⁶ realizations of the random field process are used to estimate average false negative and false positive rates using a Monte-Carlo approach. The false detection rates are shown to depend on the average distance between neighboring PSs expressed in units of interelectrode distance, following approximately a power law with exponents in the range of 1.14 to 2 for false negatives and around 2.8 for false positives. In the presence of noise or distortion of phase, false detection rates at high resolution tend to a non-zero noise-dependent lower bound. This model provides an easy-to-implement tool for benchmarking PS detection algorithms over a broad range of configurations with multiple PSs.
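
    The topological-charge detection step described above can be sketched compactly: on a regular grid, a PS sits inside an elementary 2 × 2 loop whose summed wrapped phase differences equal ±2π. The grid, the wrapping helper, and the example vortex below are illustrative assumptions, not the author's code.

    ```python
    import numpy as np

    def wrap(dphi):
        """Wrap phase differences to (-pi, pi]."""
        return (dphi + np.pi) % (2 * np.pi) - np.pi

    def phase_singularities(phase):
        """Topological charge (+1, -1, or 0) of each 2x2 cell of a sampled phase map."""
        # Wrapped phase differences along the four edges of each elementary loop.
        d1 = wrap(phase[:-1, 1:] - phase[:-1, :-1])   # top edge, left -> right
        d2 = wrap(phase[1:, 1:] - phase[:-1, 1:])     # right edge, top -> bottom
        d3 = wrap(phase[1:, :-1] - phase[1:, 1:])     # bottom edge, right -> left
        d4 = wrap(phase[:-1, :-1] - phase[1:, :-1])   # left edge, bottom -> top
        charge = (d1 + d2 + d3 + d4) / (2 * np.pi)    # contour integral / 2*pi
        return np.rint(charge).astype(int)

    # Example: a single phase singularity between the four central grid points.
    y, x = np.mgrid[-5:6, -5:6]
    phase = np.arctan2(y + 0.5, x + 0.5)              # offset avoids sampling the core exactly
    print(np.argwhere(phase_singularities(phase) != 0))
    ```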

  3. [Molecular beacon based PNA-FISH method combined with fluorescence scanning for rapid detection of Listeria monocytogenes].

    PubMed

    Wu, Shan; Zhang, Xiaofeng; Shuai, Jiangbing; Li, Ke; Yu, Huizhen; Jin, Chenchen

    2016-07-04

    To simplify the PNA-FISH (peptide nucleic acid-fluorescence in situ hybridization) test, a molecular beacon based PNA probe combined with fluorescence scanning detection technology was applied to replace the original microscope observation for detecting Listeria monocytogenes. The 5′ end and 3′ end of the L. monocytogenes-specific PNA probes were labeled with a fluorescent group and a quenching group, respectively, to form a molecular beacon based PNA probe. When the PNA probe was used for fluorescence scanning with the N1 treatment as the control, the false positive rate was 11.4% and the false negative rate was 0; with the N2 treatment as the control, the false positive rate decreased to 4.3%, but the false negative rate rose to 18.6%. When the beacon based PNA probe was used for fluorescence scanning, taking the N1 treatment as the blank control, the false positive rate was 8.6% and the false negative rate was 1.4%; taking the N2 treatment as the blank control, the false positive rate was 5.7% and the false negative rate was 1.4%. Compared with the PNA probe, the molecular beacon based PNA probe can effectively reduce false positives and false negatives. The success rates of hybridization of the two PNA probes were 83.3% and 95.2%, respectively, and the rates of the two beacon based PNA probes were 91.7% and 90.5%, respectively, which indicated that labeling both ends of the PNA probe does not decrease the hybridization rate with the target bacteria. The combination of liquid-phase PNA-FISH and fluorescence scanning can significantly improve detection efficiency.

  4. Statistical approaches to account for false-positive errors in environmental DNA samples.

    PubMed

    Lahoz-Monfort, José J; Guillera-Arroita, Gurutzeta; Tingley, Reid

    2016-05-01

    Environmental DNA (eDNA) sampling is prone to both false-positive and false-negative errors. We review statistical methods to account for such errors in the analysis of eDNA data and use simulations to compare the performance of different modelling approaches. Our simulations illustrate that even low false-positive rates can produce biased estimates of occupancy and detectability. We further show that removing or classifying single PCR detections in an ad hoc manner under the suspicion that such records represent false positives, as sometimes advocated in the eDNA literature, also results in biased estimation of occupancy, detectability and false-positive rates. We advocate alternative approaches to account for false-positive errors that rely on prior information, or the collection of ancillary detection data at a subset of sites using a sampling method that is not prone to false-positive errors. We illustrate the advantages of these approaches over ad hoc classifications of detections and provide practical advice and code for fitting these models in maximum likelihood and Bayesian frameworks. Given the severe bias induced by false-negative and false-positive errors, the methods presented here should be more routinely adopted in eDNA studies. © 2015 John Wiley & Sons Ltd.
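
    As a companion to the modelling discussion, here is a minimal sketch of a single-season occupancy likelihood with a false-positive detection probability, fitted by maximum likelihood on simulated data. The symbols (psi, p11, p10), the simulated detection histories, and the starting values are assumptions for illustration and do not reproduce the code provided with the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit  # inverse logit keeps probabilities in (0, 1)

    rng = np.random.default_rng(1)

    # Simulated detection histories: S sites x K eDNA replicates.
    S, K = 200, 4
    psi_true, p11_true, p10_true = 0.6, 0.7, 0.05   # occupancy, true- and false-positive probabilities
    z = rng.random(S) < psi_true                     # latent occupancy state
    y = np.where(z[:, None],
                 rng.random((S, K)) < p11_true,      # detections at occupied sites
                 rng.random((S, K)) < p10_true)      # false positives at unoccupied sites
    d = y.sum(axis=1)                                # detections per site

    def negloglik(theta):
        psi, p11, p10 = expit(theta)
        site_occ = p11 ** d * (1 - p11) ** (K - d)   # P(history | occupied)
        site_un = p10 ** d * (1 - p10) ** (K - d)    # P(history | unoccupied)
        return -np.sum(np.log(psi * site_occ + (1 - psi) * site_un))

    # Without constraints (or ancillary error-free detection data, as the abstract
    # advocates) this model has a mirror solution; starting with p11 > p10 selects
    # the intended one in this illustrative example.
    fit = minimize(negloglik, x0=np.array([0.0, 1.0, -2.0]), method="Nelder-Mead")
    print("psi, p11, p10 estimates:", np.round(expit(fit.x), 3))
    ```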

  5. Automatic multimodal detection for long-term seizure documentation in epilepsy.

    PubMed

    Fürbass, F; Kampusch, S; Kaniusas, E; Koren, J; Pirker, S; Hopfengärtner, R; Stefan, H; Kluge, T; Baumgartner, C

    2017-08-01

    This study investigated the sensitivity and false detection rate of a multimodal automatic seizure detection algorithm and its applicability to reduced electrode montages for long-term seizure documentation in epilepsy patients. An automatic seizure detection algorithm based on EEG, EMG, and ECG signals was developed. EEG/ECG recordings of 92 patients from two epilepsy monitoring units including 494 seizures were used to assess detection performance. EMG data were extracted by bandpass filtering of EEG signals. Sensitivity and false detection rate were evaluated for each signal modality and for reduced electrode montages. All focal seizures evolving to bilateral tonic-clonic (BTCS, n=50) and 89% of focal seizures (FS, n=139) were detected. Average sensitivity was 94% in temporal lobe epilepsy (TLE) patients and 74% in extratemporal lobe epilepsy (XTLE) patients. Overall detection sensitivity was 86%. Average false detection rate was 12.8 false detections per 24 h (FD/24 h) for TLE and 22 FD/24 h for XTLE patients. Utilization of 8 frontal and temporal electrodes reduced average sensitivity from 86% to 81%. Our automatic multimodal seizure detection algorithm shows high sensitivity with full and reduced electrode montages. Evaluation of different signal modalities and electrode montages paves the way for semi-automatic seizure documentation systems. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  6. Comparison of human and algorithmic target detection in passive infrared imagery

    NASA Astrophysics Data System (ADS)

    Weber, Bruce A.; Hutchinson, Meredith

    2003-09-01

    We have designed an experiment that compares the performance of human observers and a scale-insensitive target detection algorithm that uses pixel level information for the detection of ground targets in passive infrared imagery. The test database contains targets near clutter whose detectability ranged from easy to very difficult. Results indicate that human observers detect more "easy-to-detect" targets, and with far fewer false alarms, than the algorithm. For "difficult-to-detect" targets, human and algorithm detection rates are considerably degraded, and algorithm false alarms are excessive. Analysis of detections as a function of observer confidence shows that algorithm confidence attribution does not correspond to human attribution, and does not adequately correlate with correct detections. The best target detection score for any human observer was 84%, as compared to 55% for the algorithm at the same false alarm rate. At 81%, the maximum detection score for the algorithm, the same human observer had 6 false alarms per frame as compared to 29 for the algorithm. Detector ROC curves and observer-confidence analysis benchmark the algorithm and provide insights into algorithm deficiencies and possible paths to improvement.

  7. Experimental investigation of observation error in anuran call surveys

    USGS Publications Warehouse

    McClintock, B.T.; Bailey, L.L.; Pollock, K.H.; Simons, T.R.

    2010-01-01

    Occupancy models that account for imperfect detection are often used to monitor anuran and songbird species occurrence. However, presence-absence data arising from auditory detections may be more prone to observation error (e.g., false-positive detections) than are sampling approaches utilizing physical captures or sightings of individuals. We conducted realistic, replicated field experiments using a remote broadcasting system to simulate simple anuran call surveys and to investigate potential factors affecting observation error in these studies. Distance, time, ambient noise, and observer abilities were the most important factors explaining false-negative detections. Distance and observer ability were the best overall predictors of false-positive errors, but ambient noise and competing species also affected error rates for some species. False-positive errors made up 5% of all positive detections, with individual observers exhibiting false-positive rates between 0.5% and 14%. Previous research suggests false-positive errors of these magnitudes would induce substantial positive biases in standard estimators of species occurrence, and we recommend practices to mitigate false positives when developing occupancy monitoring protocols that rely on auditory detections. These recommendations include additional observer training, limiting the number of target species, and establishing distance and ambient noise thresholds during surveys. © 2010 The Wildlife Society.

  8. Improved multi-stage neonatal seizure detection using a heuristic classifier and a data-driven post-processor.

    PubMed

    Ansari, A H; Cherian, P J; Dereymaeker, A; Matic, V; Jansen, K; De Wispelaere, L; Dielman, C; Vervisch, J; Swarte, R M; Govaert, P; Naulaers, G; De Vos, M; Van Huffel, S

    2016-09-01

    After identifying the most seizure-relevant characteristics by a previously developed heuristic classifier, a data-driven post-processor using a novel set of features is applied to improve the performance. The main characteristics of the outputs of the heuristic algorithm are extracted by five sets of features including synchronization, evolution, retention, segment, and signal features. Then, a support vector machine and a decision making layer remove the falsely detected segments. Four datasets including 71 neonates (1023 h, 3493 seizures), recorded in two different university hospitals, are used to train and test the algorithm without removing the dubious seizures. The heuristic method resulted in a false alarm rate of 3.81 per hour and a good detection rate of 88% on the entire test databases. The post-processor effectively reduces the false alarm rate by 34% while the good detection rate decreases by 2%. This post-processing technique improves the performance of the heuristic algorithm. The structure of this post-processor is generic, improves our understanding of the core visually determined EEG features of neonatal seizures, and is applicable to other neonatal seizure detectors. The post-processor significantly decreases the false alarm rate at the expense of a small reduction of the good detection rate. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  9. An improved PCA method with application to boiler leak detection.

    PubMed

    Sun, Xi; Marquez, Horacio J; Chen, Tongwen; Riaz, Muhammad

    2005-07-01

    Principal component analysis (PCA) is a popular fault detection technique. It has been widely used in process industries, especially in the chemical industry. In industrial applications, achieving a system sensitive enough to detect incipient faults while keeping the false alarm rate to a minimum is a crucial issue. Although a lot of research has been focused on these issues for PCA-based fault detection and diagnosis methods, sensitivity of the fault detection scheme versus false alarm rate continues to be an important issue. In this paper, an improved PCA method is proposed to address this problem. In this method, a new data preprocessing scheme and a new fault detection scheme designed for Hotelling's T2 as well as the squared prediction error are developed. A dynamic PCA model is also developed for boiler leak detection. This new method is applied to boiler water/steam leak detection with real data from Syncrude Canada's utility plant in Fort McMurray, Canada. Our results demonstrate that the proposed method can effectively reduce false alarm rate, provide effective and correct leak alarms, and give early warning to operators.
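
    For readers unfamiliar with PCA-based monitoring, the standard Hotelling's T² / SPE machinery the paper builds on can be sketched as follows. This is generic textbook PCA monitoring (with the usual F-distribution and Jackson-Mudholkar limits), not the improved preprocessing or dynamic PCA model proposed by the authors; the number of components and the confidence level are arbitrary choices.

    ```python
    import numpy as np
    from scipy import stats

    def fit_pca_monitor(X_train, n_pc=3, alpha=0.99):
        """Fit a PCA monitoring model with T^2 and SPE (Q) control limits."""
        mu, sd = X_train.mean(0), X_train.std(0)
        Xs = (X_train - mu) / sd
        U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
        n = Xs.shape[0]
        P = Vt[:n_pc].T                          # retained loadings
        lam = (s[:n_pc] ** 2) / (n - 1)          # retained eigenvalues
        # Hotelling's T^2 limit for the retained subspace (F distribution).
        t2_lim = n_pc * (n - 1) * (n + 1) / (n * (n - n_pc)) * stats.f.ppf(alpha, n_pc, n - n_pc)
        # SPE (Q) limit via the Jackson-Mudholkar approximation on the residual eigenvalues.
        resid = (s[n_pc:] ** 2) / (n - 1)
        th1, th2, th3 = resid.sum(), (resid ** 2).sum(), (resid ** 3).sum()
        h0 = 1 - 2 * th1 * th3 / (3 * th2 ** 2)
        ca = stats.norm.ppf(alpha)
        spe_lim = th1 * (ca * np.sqrt(2 * th2 * h0 ** 2) / th1
                         + 1 + th2 * h0 * (h0 - 1) / th1 ** 2) ** (1 / h0)
        return dict(mu=mu, sd=sd, P=P, lam=lam, t2_lim=t2_lim, spe_lim=spe_lim)

    def monitor(model, X_new):
        """Flag samples whose T^2 or SPE statistic exceeds its control limit."""
        Xs = (X_new - model["mu"]) / model["sd"]
        T = Xs @ model["P"]                                   # scores
        t2 = np.sum(T ** 2 / model["lam"], axis=1)            # Hotelling's T^2
        spe = np.sum((Xs - T @ model["P"].T) ** 2, axis=1)    # squared prediction error
        return (t2 > model["t2_lim"]) | (spe > model["spe_lim"])
    ```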

  10. Development of Technologies for Early Detection and Stratification of Breast Cancer

    DTIC Science & Technology

    2012-10-01

    at the time of screening, and has an 8-10% false positive rate.3 These drawbacks lead to inaccurate patient diagnosis, which can allow potentially...95% recovery efficiency. Furthermore, using whole blood from healthy donors, we determined we have a zero false positive rate; that is, we have not...detected a single false positive event out of the dozen samples we ran. The technology we developed here is not only useful for the isolation of CTCs

  11. User acceptance of intelligent avionics: A study of automatic-aided target recognition

    NASA Technical Reports Server (NTRS)

    Becker, Curtis A.; Hayes, Brian C.; Gorman, Patrick C.

    1991-01-01

    User acceptance of new support systems typically was evaluated after the systems were specified, designed, and built. The current study attempts to assess user acceptance of an Automatic-Aided Target Recognition (ATR) system using an emulation of such a proposed system. The detection accuracy and false alarm level of the ATR system were varied systematically, and subjects rated the tactical value of systems exhibiting different performance levels. Both detection accuracy and false alarm level affected the subjects' ratings. The data from two experiments suggest a cut-off point in ATR performance below which the subjects saw little tactical value in the system. An ATR system seems to have obvious tactical value only if it functions at a correct detection rate of 0.7 or better with a false alarm level of 0.167 false alarms per square degree or fewer.

  12. Accurate mobile malware detection and classification in the cloud.

    PubMed

    Wang, Xiaolei; Yang, Yuexiang; Zeng, Yingzhi

    2015-01-01

    As the dominant player in the smartphone operating system market, Android has consequently attracted the attention of malware authors and researchers alike. The number of types of Android malware is increasing rapidly despite the considerable number of proposed malware analysis systems. In this paper, by taking advantage of the low false-positive rate of misuse detection and the ability of anomaly detection to detect zero-day malware, we propose a novel hybrid detection system based on a new open-source framework CuckooDroid, which enables the use of Cuckoo Sandbox's features to analyze Android malware through dynamic and static analysis. Our proposed system mainly consists of two parts: an anomaly detection engine performing abnormal apps detection through dynamic analysis, and a signature detection engine performing known malware detection and classification with the combination of static and dynamic analysis. We evaluate our system using 5560 malware samples and 6000 benign samples. Experiments show that our anomaly detection engine with dynamic analysis is capable of detecting zero-day malware with a low false negative rate (1.16%) and an acceptable false positive rate (1.30%); it is worth noting that our signature detection engine with hybrid analysis can accurately classify malware samples with an average positive rate of 98.94%. Considering the intensive computing resources required by the static and dynamic analysis, our proposed detection system should be deployed off-device, such as in the Cloud. The app store markets and ordinary users can access our detection system for malware detection through cloud service.

  13. A Viola-Jones based hybrid face detection framework

    NASA Astrophysics Data System (ADS)

    Murphy, Thomas M.; Broussard, Randy; Schultz, Robert; Rakvic, Ryan; Ngo, Hau

    2013-12-01

    Improvements in face detection performance would benefit many applications. The OpenCV library implements a standard solution, the Viola-Jones detector, with a statistically boosted rejection cascade of binary classifiers. Empirical evidence has shown that Viola-Jones underdetects in some instances. This research shows that a truncated cascade augmented by a neural network could recover these undetected faces. A hybrid framework is constructed, with a truncated Viola-Jones cascade followed by an artificial neural network, used to refine the face decision. Optimally, a truncation stage that captured all faces and allowed the neural network to remove the false alarms is selected. A feedforward backpropagation network with one hidden layer is trained to discriminate faces based upon the thresholding (detection) values of intermediate stages of the full rejection cascade. A clustering algorithm is used as a precursor to the neural network, to group significant overlappings. Evaluated on the CMU/VASC Image Database, comparison with an unmodified OpenCV approach shows: (1) a 37% increase in detection rates if constrained by the requirement of no increase in false alarms, (2) a 48% increase in detection rates if some additional false alarms are tolerated, and (3) an 82% reduction in false alarms with no reduction in detection rates. These results demonstrate improved face detection and could address the need for such improvement in various applications.

  14. Automatic detection of ECG cable interchange by analyzing both morphology and interlead relations.

    PubMed

    Han, Chengzong; Gregg, Richard E; Feild, Dirk Q; Babaeizadeh, Saeed

    2014-01-01

    ECG cable interchange can generate erroneous diagnoses. For algorithms detecting ECG cable interchange, high specificity is required to maintain a low total false positive rate because the prevalence of interchange is low. In this study, we propose and evaluate an improved algorithm for automatic detection and classification of ECG cable interchange. The algorithm was developed by using both ECG morphology information and redundancy information. ECG morphology features included QRS-T and P-wave amplitude, frontal axis and clockwise vector loop rotation. The redundancy features were derived based on the EASI™ lead system transformation. The classification was implemented using linear support vector machine. The development database came from multiple sources including both normal subjects and cardiac patients. An independent database was used to test the algorithm performance. Common cable interchanges were simulated by swapping either limb cables or precordial cables. For the whole validation database, the overall sensitivity and specificity for detecting precordial cable interchange were 56.5% and 99.9%, and the sensitivity and specificity for detecting limb cable interchange (excluding left arm-left leg interchange) were 93.8% and 99.9%. Defining precordial cable interchange or limb cable interchange as a single positive event, the total false positive rate was 0.7%. When the algorithm was designed for higher sensitivity, the sensitivity for detecting precordial cable interchange increased to 74.6% and the total false positive rate increased to 2.7%, while the sensitivity for detecting limb cable interchange was maintained at 93.8%. The low total false positive rate was maintained at 0.6% for the more abnormal subset of the validation database including only hypertrophy and infarction patients. The proposed algorithm can detect and classify ECG cable interchanges with high specificity and low total false positive rate, at the cost of decreased sensitivity for certain precordial cable interchanges. The algorithm could also be configured for higher sensitivity for different applications where a lower specificity can be tolerated. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. [Comparison between rapid detection method of enzyme substrate technique and multiple-tube fermentation technique in water coliform bacteria detection].

    PubMed

    Sun, Zong-ke; Wu, Rong; Ding, Pei; Xue, Jin-Rong

    2006-07-01

    To compare a rapid detection method, the enzyme substrate technique, with the multiple-tube fermentation technique for detecting coliform bacteria in water, inoculated and real water samples were used to compare the equivalence and false positive rates of the two methods. The results demonstrate that the enzyme substrate technique is equivalent to the multiple-tube fermentation technique (P = 0.059), and the false positive rates of the two methods show no statistically significant difference. It is suggested that the enzyme substrate technique can be used as a standard method for evaluating the microbiological safety of water.

  16. Methods for threshold determination in multiplexed assays

    DOEpatents

    Tammero, Lance F. Bentley; Dzenitis, John M; Hindson, Benjamin J

    2014-06-24

    Methods for determination of threshold values of signatures comprised in an assay are described. Each signature enables detection of a target. The methods determine a probability density function of negative samples and a corresponding false positive rate curve. A false positive criterion is established and a threshold for that signature is determined as a point at which the false positive rate curve intersects the false positive criterion. A method for quantitative analysis and interpretation of assay results together with a method for determination of a desired limit of detection of a signature in an assay are also described.
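
    A stripped-down numerical sketch of the idea: estimate the distribution of negative (target-absent) signals for a signature, form the false-positive-rate curve as a function of threshold, and take the threshold where that curve crosses the chosen false-positive criterion. The empirical-quantile shortcut and the simulated negatives are assumptions; the patent's actual per-signature probability-density procedure is not reproduced.

    ```python
    import numpy as np

    def threshold_for_fp_criterion(negative_signals, fp_criterion=0.01):
        """Threshold at which the empirical false-positive-rate curve of the
        negative samples crosses the chosen false-positive criterion."""
        neg = np.asarray(negative_signals, dtype=float)
        # FPR(t) = fraction of negatives above t, so FPR(t) equals fp_criterion
        # exactly at the (1 - fp_criterion) quantile of the negative distribution.
        return np.quantile(neg, 1.0 - fp_criterion)

    # Example with simulated negative-control signals for one signature.
    rng = np.random.default_rng(0)
    negatives = rng.normal(100.0, 8.0, size=5000)
    thr = threshold_for_fp_criterion(negatives, fp_criterion=0.001)
    print(f"signature threshold: {thr:.1f}")   # roughly mean + 3.1 sd for Gaussian negatives
    ```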

  17. Evaluation of machine learning algorithms for improved risk assessment for Down's syndrome.

    PubMed

    Koivu, Aki; Korpimäki, Teemu; Kivelä, Petri; Pahikkala, Tapio; Sairanen, Mikko

    2018-05-04

    Prenatal screening generates a great amount of data that is used for predicting risk of various disorders. Prenatal risk assessment is based on multiple clinical variables and overall performance is defined by how well the risk algorithm is optimized for the population in question. This article evaluates machine learning algorithms to improve the performance of first trimester screening for Down syndrome. Machine learning algorithms pose an adaptive alternative to develop better risk assessment models using the existing clinical variables. Two real-world data sets were used to experiment with multiple classification algorithms. Implemented models were tested with a third, real-world, data set and performance was compared to a predicate method, a commercial risk assessment software. The best-performing deep neural network model gave an area under the curve (AUC) of 0.96 and a detection rate of 78% at a 1% false positive rate with the test data. The support vector machine model gave an AUC of 0.95 and a detection rate of 61% at a 1% false positive rate with the same test data. When compared with the predicate method, the best support vector machine model was slightly inferior, but an optimized deep neural network model was able to give higher detection rates with the same false positive rate, or a similar detection rate with a markedly lower false positive rate. This finding could further improve first trimester screening for Down syndrome by using existing clinical variables and a large training data set derived from a specific population. Copyright © 2018 Elsevier Ltd. All rights reserved.
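
    The two headline numbers in this abstract (AUC and detection rate at a fixed 1% false-positive rate) can be read off classifier scores as sketched below; the simulated risk scores are placeholders, not the study's data.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(7)

    # Illustrative risk scores: unaffected pregnancies vs. affected cases.
    scores = np.r_[rng.normal(0.0, 1.0, 100_000), rng.normal(3.0, 1.0, 500)]
    labels = np.r_[np.zeros(100_000), np.ones(500)]

    auc = roc_auc_score(labels, scores)
    fpr, tpr, _ = roc_curve(labels, scores)
    dr_at_1pct = np.interp(0.01, fpr, tpr)     # detection rate at 1% false-positive rate

    print(f"AUC = {auc:.3f}, detection rate at 1% FPR = {dr_at_1pct:.1%}")
    ```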

  18. Quantitative assessment of hit detection and confirmation in single and duplicate high-throughput screenings.

    PubMed

    Wu, Zhijin; Liu, Dongmei; Sui, Yunxia

    2008-02-01

    The process of identifying active targets (hits) in high-throughput screening (HTS) usually involves 2 steps: first, removing or adjusting for systematic variation in the measurement process so that extreme values represent strong biological activity instead of systematic biases such as plate effect or edge effect and, second, choosing a meaningful cutoff on the calculated statistic to declare positive compounds. Both false-positive and false-negative errors are inevitable in this process. Common control or estimation of error rates is often based on an assumption of normal distribution of the noise. The error rates in hit detection, especially false-negative rates, are hard to verify because in most assays, only compounds selected in primary screening are followed up in confirmation experiments. In this article, the authors take advantage of a quantitative HTS experiment in which all compounds are tested 42 times over a wide range of 14 concentrations so true positives can be found through a dose-response curve. Using the activity status defined by dose curve, the authors analyzed the effect of various data-processing procedures on the sensitivity and specificity of hit detection, the control of error rate, and hit confirmation. A new summary score is proposed and demonstrated to perform well in hit detection and useful in confirmation rate estimation. In general, adjusting for positional effects is beneficial, but a robust test can prevent overadjustment. Error rates estimated under the normal assumption do not agree with actual error rates, because the tails of the noise distribution deviate from normality. However, false discovery rate based on an empirically estimated null distribution is very close to the observed false discovery proportion.
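
    The paper's point that a normal-theory null misstates tail error rates can be illustrated with a small simulation: compare the false discovery rate estimated from an empirically sampled null (e.g. control wells or resampled plates) with the one implied by a Gaussian assumption. The distributions, cutoffs, and sample sizes below are arbitrary assumptions, not the authors' data.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)

    # Simulated screen: heavy-tailed measurement noise plus a small set of true hits.
    null_scores = rng.standard_t(df=4, size=20_000)            # empirical null (e.g. control wells)
    scores = np.r_[rng.standard_t(df=4, size=9_800),           # inactive compounds
                   rng.normal(5.0, 1.0, size=200)]             # true actives

    def fdr_from_null(scores, null_scores, cutoff):
        """Estimated FDR at a cutoff: expected false calls under the empirical
        null divided by the number of compounds actually called."""
        called = np.sum(scores >= cutoff)
        expected_false = np.mean(null_scores >= cutoff) * len(scores)
        return min(1.0, expected_false / max(called, 1))

    for c in (2.0, 3.0, 4.0):
        called = max(np.sum(scores >= c), 1)
        gaussian_fdr = min(1.0, norm.sf(c) * len(scores) / called)   # normal-theory estimate
        print(f"cutoff {c:.1f}: empirical-null FDR {fdr_from_null(scores, null_scores, c):.3f}, "
              f"normal-theory FDR {gaussian_fdr:.3f}")
    ```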

  19. Unsupervised Anomaly Detection Based on Clustering and Multiple One-Class SVM

    NASA Astrophysics Data System (ADS)

    Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Kwon, Yongjin

    Intrusion detection systems (IDSs) have played an important role as devices to defend our networks from cyber attacks. However, since they are unable to detect unknown attacks, i.e., 0-day attacks, the ultimate challenge in the intrusion detection field is how to identify such attacks exactly in an automated manner. Over the past few years, several studies on solving these problems have been made on anomaly detection using unsupervised learning techniques such as clustering, one-class support vector machines (SVMs), etc. Although they enable one to construct intrusion detection models at low cost and effort, and have the capability to detect unforeseen attacks, they still have two main problems in intrusion detection: a low detection rate and a high false positive rate. In this paper, we propose a new anomaly detection method based on clustering and multiple one-class SVMs in order to improve the detection rate while maintaining a low false positive rate. We evaluated our method using the KDD Cup 1999 data set. Evaluation results show that our approach outperforms the existing algorithms reported in the literature, especially in the detection of unknown attacks.
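
    A minimal scikit-learn sketch of the "cluster first, then train one one-class SVM per cluster" idea follows. The feature scaling, cluster count, and SVM hyperparameters are illustrative assumptions, and none of the KDD Cup 1999 preprocessing used in the paper is reproduced.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import OneClassSVM

    class ClusteredOneClassSVM:
        """Fit one one-class SVM per K-means cluster of normal traffic; a test
        record is flagged anomalous only if its nearest cluster's SVM rejects it."""

        def __init__(self, n_clusters=5, nu=0.05, gamma="scale"):
            self.scaler = StandardScaler()
            self.km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
            self.svms = [OneClassSVM(nu=nu, gamma=gamma) for _ in range(n_clusters)]

        def fit(self, X_normal):
            Z = self.scaler.fit_transform(X_normal)
            labels = self.km.fit_predict(Z)
            for k, svm in enumerate(self.svms):
                svm.fit(Z[labels == k])          # one boundary per traffic cluster
            return self

        def predict(self, X):
            Z = self.scaler.transform(X)
            nearest = self.km.predict(Z)          # route each record to its cluster
            out = np.empty(len(Z), dtype=int)
            for k, svm in enumerate(self.svms):
                idx = nearest == k
                if idx.any():
                    out[idx] = svm.predict(Z[idx])   # +1 normal, -1 anomaly
            return out
    ```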

  20. E-commerce Review System to Detect False Reviews.

    PubMed

    Kolhar, Manjur

    2017-08-15

    E-commerce sites have been doing profitable business since their inception on high-speed, secure networks. Moreover, they continue to influence consumers through various methods. One of the most effective methods is the e-commerce review rating system, in which consumers provide review ratings for the products they have used. However, almost all e-commerce review rating systems are unable to provide cumulative review ratings. Furthermore, review ratings are influenced by positive and negative malicious feedback ratings, collectively called false reviews. In this paper, we propose an e-commerce review system framework developed using the cumulative sum (CUSUM) method to detect and remove malicious review ratings.
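
    Since the detector is built on the cumulative sum method, a minimal two-sided CUSUM sketch is shown below; the reference value k, the threshold h, the baseline estimate, and the simulated rating stream are assumptions for illustration only, not the paper's parameters.

    ```python
    import numpy as np

    def cusum_flags(ratings, target=None, k=0.5, h=4.0):
        """Two-sided CUSUM on a stream of review ratings. Returns indices where
        accumulated drift above or below the target exceeds h (in rating units)."""
        x = np.asarray(ratings, dtype=float)
        target = x[:20].mean() if target is None else target   # baseline from early reviews
        s_hi = s_lo = 0.0
        alarms = []
        for i, xi in enumerate(x):
            s_hi = max(0.0, s_hi + (xi - target - k))   # accumulates unusually high ratings
            s_lo = max(0.0, s_lo + (target - xi - k))   # accumulates unusually low ratings
            if s_hi > h or s_lo > h:
                alarms.append(i)
                s_hi = s_lo = 0.0                        # restart after an alarm
        return alarms

    # Example: honest ~4-star reviews followed by a burst of 5-star shilling.
    rng = np.random.default_rng(11)
    ratings = np.r_[rng.choice([3, 4, 5], 60, p=[0.2, 0.5, 0.3]), np.full(15, 5)]
    print(cusum_flags(ratings))
    ```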

  1. Wavelet method for CT colonography computer-aided polyp detection.

    PubMed

    Li, Jiang; Van Uitert, Robert; Yao, Jianhua; Petrick, Nicholas; Franaszek, Marek; Huang, Adam; Summers, Ronald M

    2008-08-01

    Computed tomographic colonography (CTC) computer aided detection (CAD) is a new method to detect colon polyps. Colonic polyps are abnormal growths that may become cancerous. Detection and removal of colonic polyps, particularly larger ones, has been shown to reduce the incidence of colorectal cancer. While high sensitivities and low false positive rates are consistently achieved for the detection of polyps sized 1 cm or larger, lower sensitivities and higher false positive rates occur when the goal of CAD is to identify "medium"-sized polyps, 6-9 mm in diameter. Such medium-sized polyps may be important for clinical patient management. We have developed a wavelet-based postprocessor to reduce false positives for this polyp size range. We applied the wavelet-based postprocessor to CTC CAD findings from 44 patients in whom 45 polyps with sizes of 6-9 mm were found at segmentally unblinded optical colonoscopy and visible on retrospective review of the CT colonography images. Prior to the application of the wavelet-based postprocessor, the CTC CAD system detected 33 of the polyps (sensitivity 73.33%) with 12.4 false positives per patient, a sensitivity comparable to that of expert radiologists. Fourfold cross validation with 5000 bootstraps showed that the wavelet-based postprocessor could reduce the false positives by 56.61% (p <0.001), to 5.38 per patient (95% confidence interval [4.41, 6.34]), without significant sensitivity degradation (32/45, 71.11%, 95% confidence interval [66.39%, 75.74%], p=0.1713). We conclude that this wavelet-based postprocessor can substantially reduce the false positive rate of our CTC CAD for this important polyp size range.

  2. Experimental investigation of false positive errors in auditory species occurrence surveys

    USGS Publications Warehouse

    Miller, David A.W.; Weir, Linda A.; McClintock, Brett T.; Grant, Evan H. Campbell; Bailey, Larissa L.; Simons, Theodore R.

    2012-01-01

    False positive errors are a significant component of many ecological data sets, which in combination with false negative errors, can lead to severe biases in conclusions about ecological systems. We present results of a field experiment where observers recorded observations for known combinations of electronically broadcast calling anurans under conditions mimicking field surveys to determine species occurrence. Our objectives were to characterize false positive error probabilities for auditory methods based on a large number of observers, to determine if targeted instruction could be used to reduce false positive error rates, and to establish useful predictors of among-observer and among-species differences in error rates. We recruited 31 observers, ranging in abilities from novice to expert, that recorded detections for 12 species during 180 calling trials (66,960 total observations). All observers made multiple false positive errors and on average 8.1% of recorded detections in the experiment were false positive errors. Additional instruction had only minor effects on error rates. After instruction, false positive error probabilities decreased by 16% for treatment individuals compared to controls with broad confidence interval overlap of 0 (95% CI: -46 to 30%). This coincided with an increase in false negative errors due to the treatment (26%; -3 to 61%). Differences among observers in false positive and in false negative error rates were best predicted by scores from an online test and a self-assessment of observer ability completed prior to the field experiment. In contrast, years of experience conducting call surveys was a weak predictor of error rates. False positive errors were also more common for species that were played more frequently, but were not related to the dominant spectral frequency of the call. Our results corroborate other work that demonstrates false positives are a significant component of species occurrence data collected by auditory methods. Instructing observers to only report detections they are completely certain are correct is not sufficient to eliminate errors. As a result, analytical methods that account for false positive errors will be needed, and independent testing of observer ability is a useful predictor for among-observer variation in observation error rates.

  3. Intelligent agent-based intrusion detection system using enhanced multiclass SVM.

    PubMed

    Ganapathy, S; Yogesh, P; Kannan, A

    2012-01-01

    Intrusion detection systems were used in the past along with various techniques to detect intrusions in networks effectively. However, most of these systems are able to detect the intruders only with a high false alarm rate. In this paper, we propose a new intelligent agent-based intrusion detection model for mobile ad hoc networks using a combination of attribute selection, outlier detection, and enhanced multiclass SVM classification methods. For this purpose, an effective preprocessing technique is proposed that improves the detection accuracy and reduces the processing time. Moreover, two new algorithms, namely, an Intelligent Agent Weighted Distance Outlier Detection algorithm and an Intelligent Agent-based Enhanced Multiclass Support Vector Machine algorithm, are proposed for detecting the intruders in a distributed database environment that uses intelligent agents for trust management and coordination in transaction processing. The experimental results of the proposed model show that this system detects anomalies with a low false alarm rate and a high detection rate when tested with the KDD Cup 99 data set.

  4. Intelligent Agent-Based Intrusion Detection System Using Enhanced Multiclass SVM

    PubMed Central

    Ganapathy, S.; Yogesh, P.; Kannan, A.

    2012-01-01

    Intrusion detection systems were used in the past along with various techniques to detect intrusions in networks effectively. However, most of these systems are able to detect the intruders only with a high false alarm rate. In this paper, we propose a new intelligent agent-based intrusion detection model for mobile ad hoc networks using a combination of attribute selection, outlier detection, and enhanced multiclass SVM classification methods. For this purpose, an effective preprocessing technique is proposed that improves the detection accuracy and reduces the processing time. Moreover, two new algorithms, namely, an Intelligent Agent Weighted Distance Outlier Detection algorithm and an Intelligent Agent-based Enhanced Multiclass Support Vector Machine algorithm, are proposed for detecting the intruders in a distributed database environment that uses intelligent agents for trust management and coordination in transaction processing. The experimental results of the proposed model show that this system detects anomalies with a low false alarm rate and a high detection rate when tested with the KDD Cup 99 data set. PMID:23056036

  5. Synthetic aperture radar target detection, feature extraction, and image formation techniques

    NASA Technical Reports Server (NTRS)

    Li, Jian

    1994-01-01

    This report presents new algorithms for target detection, feature extraction, and image formation with the synthetic aperture radar (SAR) technology. For target detection, we consider target detection with SAR and coherent subtraction. We also study how the image false alarm rates are related to the target template false alarm rates when target templates are used for target detection. For feature extraction from SAR images, we present a computationally efficient eigenstructure-based 2D-MODE algorithm for two-dimensional frequency estimation. For SAR image formation, we present a robust parametric data model for estimating high resolution range signatures of radar targets and for forming high resolution SAR images.

  6. Experimental and environmental factors affect spurious detection of ecological thresholds

    USGS Publications Warehouse

    Daily, Jonathan P.; Hitt, Nathaniel P.; Smith, David; Snyder, Craig D.

    2012-01-01

    Threshold detection methods are increasingly popular for assessing nonlinear responses to environmental change, but their statistical performance remains poorly understood. We simulated linear change in stream benthic macroinvertebrate communities and evaluated the performance of commonly used threshold detection methods based on model fitting (piecewise quantile regression [PQR]), data partitioning (nonparametric change point analysis [NCPA]), and a hybrid approach (significant zero crossings [SiZer]). We demonstrated that false detection of ecological thresholds (type I errors) and inferences on threshold locations are influenced by sample size, rate of linear change, and frequency of observations across the environmental gradient (i.e., sample-environment distribution, SED). However, the relative importance of these factors varied among statistical methods and between inference types. False detection rates were influenced primarily by user-selected parameters for PQR (τ) and SiZer (bandwidth) and secondarily by sample size (for PQR) and SED (for SiZer). In contrast, the location of reported thresholds was influenced primarily by SED. Bootstrapped confidence intervals for NCPA threshold locations revealed strong correspondence to SED. We conclude that the choice of statistical methods for threshold detection should be matched to experimental and environmental constraints to minimize false detection rates and avoid spurious inferences regarding threshold location.

  7. The introduction of the absolute risk for the detection of fetal aneuploidies in the first-trimester screening.

    PubMed

    Padula, Francesco; Laganà, Antonio Simone; Vitale, Salvatore Giovanni; D'Emidio, Laura; Coco, Claudio; Giannarelli, Diana; Cariola, Maria; Favilli, Alessandro; Giorlandino, Claudio

    2017-05-01

    Maternal age is a crucial factor in fetal aneuploidy screening, resulting in an increased rate of false-positive cases in older women and false-negative cases in younger women. The absolute risk (AR) is the simplest way to eliminate the background maternal age risk, as it represents the amount of improvement of the combined risk over the maternal background risk. The aim of this work is to assess the performance of the AR in the combined first-trimester screening for aneuploidies. We performed a retrospective validation of the AR in the combined first-trimester screening for fetal aneuploidies in an unselected population at the Altamedica Fetal-Maternal Medical Center in Rome between March 2007 and December 2008. Of 3845 women included in the study, we had complete follow-up on 2984. We found that an AR < 3 would identify 22 of 23 cases of aneuploidy, with a detection rate of 95.7% (95%CI 87.3-100), a false-positive rate of 8.7% (95%CI 7.7-9.7) and a false-negative rate of 4.3% (95%CI 0-12.7). In our study, the AR improves the detection rate for aneuploidy. Further research and a prospective study on a larger population would help to improve the AR in detecting most cases of aneuploidy.

  8. A Region Tracking-Based Vehicle Detection Algorithm in Nighttime Traffic Scenes

    PubMed Central

    Wang, Jianqiang; Sun, Xiaoyan; Guo, Junbin

    2013-01-01

    The preceding vehicles detection technique in nighttime traffic scenes is an important part of the advanced driver assistance system (ADAS). This paper proposes a region tracking-based vehicle detection algorithm via the image processing technique. First, the brightness of the taillights during nighttime is used as the typical feature, and we use the existing global detection algorithm to detect and pair the taillights. When the vehicle is detected, a time series analysis model is introduced to predict vehicle positions and the possible region (PR) of the vehicle in the next frame. Then, the vehicle is only detected in the PR. This could reduce the detection time and avoid the false pairing between the bright spots in the PR and the bright spots out of the PR. Additionally, we present a thresholds updating method to make the thresholds adaptive. Finally, experimental studies are provided to demonstrate the application and substantiate the superiority of the proposed algorithm. The results show that the proposed algorithm can simultaneously reduce both the false negative detection rate and the false positive detection rate.

  9. Convolutional neural network based deep-learning architecture for prostate cancer detection on multiparametric magnetic resonance images

    NASA Astrophysics Data System (ADS)

    Tsehay, Yohannes K.; Lay, Nathan S.; Roth, Holger R.; Wang, Xiaosong; Kwak, Jin Tae; Turkbey, Baris I.; Pinto, Peter A.; Wood, Brad J.; Summers, Ronald M.

    2017-03-01

    Prostate cancer (PCa) is the second most common cause of cancer related deaths in men. Multiparametric MRI (mpMRI) is the most accurate imaging method for PCa detection; however, it requires the expertise of experienced radiologists, leading to inconsistency across readers of varying experience. To increase inter-reader agreement and sensitivity, we developed a computer-aided detection (CAD) system that can automatically detect lesions on mpMRI that readers can use as a reference. We investigated a convolutional neural network based deep-learning (DCNN) architecture to find an improved solution for PCa detection on mpMRI. We adopted a network architecture from a state-of-the-art edge detector that takes an image as an input and produces an image probability map. Two-fold cross validation along with a receiver operating characteristic (ROC) analysis and free-response ROC (FROC) were used to determine our deep-learning based prostate-CAD's (CADDL) performance. The efficacy was compared to an existing prostate CAD system that is based on hand-crafted features, which was evaluated on the same test-set. CADDL had an 86% detection rate at a 20% false-positive rate while the top-down learning CAD had an 80% detection rate at the same false-positive rate, which translated to 94% and 85% detection rates at 10 false-positives per patient on the FROC. A CNN based CAD is able to detect cancerous lesions on mpMRI of the prostate with results comparable to an existing prostate-CAD, showing potential for further development.

  10. Decision-level fusion of SAR and IR sensor information for automatic target detection

    NASA Astrophysics Data System (ADS)

    Cho, Young-Rae; Yim, Sung-Hyuk; Cho, Hyun-Woong; Won, Jin-Ju; Song, Woo-Jin; Kim, So-Hyeon

    2017-05-01

    We propose a decision-level architecture that combines synthetic aperture radar (SAR) and an infrared (IR) sensor for automatic target detection. We present a new size-based feature, called target-silhouette to reduce the number of false alarms produced by the conventional target-detection algorithm. Boolean Map Visual Theory is used to combine a pair of SAR and IR images to generate the target-enhanced map. Then basic belief assignment is used to transform this map into a belief map. The detection results of sensors are combined to build the target-silhouette map. We integrate the fusion mass and the target-silhouette map on the decision level to exclude false alarms. The proposed algorithm is evaluated using a SAR and IR synthetic database generated by SE-WORKBENCH simulator, and compared with conventional algorithms. The proposed fusion scheme achieves higher detection rate and lower false alarm rate than the conventional algorithms.

  11. Sub-parts-per-billion level detection of dimethyl methyl phosphonate (DMMP) by quantum cascade laser photoacoustic spectroscopy.

    PubMed

    Mukherjee, Anadi; Dunayevskiy, Ilya; Prasanna, Manu; Go, Rowel; Tsekoun, Alexei; Wang, Xiaojun; Fan, Jenyu; Patel, C Kumar N

    2008-04-01

    The need for the detection of chemical warfare agents (CWAs) is no longer confined to battlefield environments because of at least one confirmed terrorist attack, the Tokyo Subway [Emerg. Infect. Dis. 5, 513 (1999)] in 1995, and a suspected one, i.e., a false alarm of a CWA in the Russell Senate Office Building [Washington Post, 9 February 2006, p. B01]. Therefore, detection of CWAs with high sensitivity and low false-alarm rates is considered an important priority for ensuring public safety. We report a minimum detection level for a CWA simulant, dimethyl methyl phosphonate (DMMP), of <0.5 ppb (parts in 10⁹) by use of a widely tunable external grating cavity quantum cascade laser and photoacoustic spectroscopy. With interferents present in Santa Monica, California street air, we demonstrate a false-alarm rate of 1:10⁶ at a detection threshold of 1.6 ppb.

  12. A novel seizure detection algorithm informed by hidden Markov model event states

    NASA Astrophysics Data System (ADS)

    Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian

    2016-06-01

    Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h⁻¹ versus 0.058 h⁻¹). All seizures were detected an average of 12.1 ± 6.9 s before the onset of unequivocal epileptic activity (unequivocal epileptic onset, UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce false positive rate relative to current industry standards.

  13. Detection of explosive remnants of war by neutron thermalisation.

    PubMed

    Brooks, F D; Drosg, M; Smit, F D; Wikner, C

    2012-01-01

    The HYDAD-D landmine detector (Brooks and Drosg, 2005) has been modified and field-tested for 17 months in a variety of soil conditions. Test objects containing about the same mass of hydrogen (20 g) as small explosive remnants of war, such as antipersonnel landmines, were detected with an efficiency of 100% when buried at cover depths up to 10 cm. The false alarm rate under the same conditions was 9%. Plots of detection efficiency versus false alarm rate are presented. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Single photon counting linear mode avalanche photodiode technologies

    NASA Astrophysics Data System (ADS)

    Williams, George M.; Huntington, Andrew S.

    2011-10-01

    The false count rate of a single-photon-sensitive photoreceiver consisting of a high-gain, low-excess-noise linear-mode InGaAs avalanche photodiode (APD) and a high-bandwidth transimpedance amplifier (TIA) is fit to a statistical model. The peak height distribution of the APD's multiplied dark current is approximated by the weighted sum of McIntyre distributions, each characterizing dark current generated at a different location within the APD's junction. The peak height distribution approximated in this way is convolved with a Gaussian distribution representing the input-referred noise of the TIA to generate the statistical distribution of the uncorrelated sum. The cumulative distribution function (CDF) representing count probability as a function of detection threshold is computed, and the CDF model fit to empirical false count data. It is found that only k=0 McIntyre distributions fit the empirically measured CDF at high detection threshold, and that false count rate drops faster than photon count rate as detection threshold is raised. Once fit to empirical false count data, the model predicts the improvement of the false count rate to be expected from reductions in TIA noise and APD dark current. Improvement by at least three orders of magnitude is thought feasible with further manufacturing development and a capacitive-feedback TIA (CTIA).
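
    The modelling chain described here (a peak-height distribution for the multiplied dark current, convolved with Gaussian TIA noise, then integrated to give count probability versus threshold) can be sketched numerically. Note that the exponential tail below is only a placeholder for the McIntyre mixture used in the paper, and all parameter values are arbitrary assumptions.

    ```python
    import numpy as np

    # Amplitude grid (arbitrary charge units) and an illustrative dark-current
    # peak-height PDF; an exponential tail stands in for the McIntyre mixture.
    x = np.linspace(0.0, 2.0e4, 4001)
    dx = x[1] - x[0]
    gain_mean = 1.0e3
    pdf_dark = np.exp(-x / gain_mean) / gain_mean

    # Gaussian input-referred TIA noise kernel, centred on zero.
    sigma_tia = 300.0
    half = int(5 * sigma_tia / dx)
    kernel = np.exp(-0.5 * ((np.arange(-half, half + 1) * dx) / sigma_tia) ** 2)
    kernel /= kernel.sum()

    # Distribution of the uncorrelated sum = convolution of the two densities.
    pdf_sum = np.convolve(pdf_dark, kernel, mode="same")

    # Count probability versus detection threshold = survival function (1 - CDF).
    cdf = np.cumsum(pdf_sum) * dx
    for thr in (1.0e3, 3.0e3, 5.0e3):
        p_exceed = 1.0 - cdf[np.searchsorted(x, thr)]
        print(f"threshold {thr:8.0f}: exceedance probability per sample = {p_exceed:.3e}")
    ```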

  15. Age-standardisation when target setting and auditing performance of Down syndrome screening programmes.

    PubMed

    Cuckle, Howard; Aitken, David; Goodburn, Sandra; Senior, Brian; Spencer, Kevin; Standing, Sue

    2004-11-01

    To describe and illustrate a method of setting Down syndrome screening targets and auditing performance that allows for differences in the maternal age distribution. A reference population was determined from a Gaussian model of maternal age. Target detection and false-positive rates were determined by standard statistical modelling techniques, except that the reference population rather than an observed population was used. Second-trimester marker parameters were obtained for Down syndrome from a large meta-analysis, and for unaffected pregnancies from the combined results of more than 600,000 screens in five centres. Audited detection and false-positive rates were the weighted average of the rates in five broad age groups corrected for viability bias. Weights were based on the age distributions in the reference population. Maternal age was found to approximate reasonably well to a Gaussian distribution with mean 27 years and standard deviation 5.5 years. Depending on marker combination, the target detection rates were 59 to 64% and false-positive rate 4.2 to 5.4% for a 1 in 250 term cut-off; 65 to 68% and 6.1 to 7.3% for 1 in 270 at mid-trimester. Among the five centres, the audited detection rate ranged from 7% below target to 10% above target, with audited false-positive rates better than the target by 0.3 to 1.5%. Age-standardisation should help to improve screening quality by allowing for intrinsic differences between programmes, so that valid comparisons can be made. Copyright 2004 John Wiley & Sons, Ltd.
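
    The age-standardisation step can be sketched in a few lines: weight each broad age group's observed detection and false-positive rates by that group's share of the Gaussian reference maternal-age distribution (mean 27 years, SD 5.5 years) quoted above. The group boundaries and the per-group rates below are made-up illustrations, not the centres' audited figures.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Reference maternal-age distribution from the abstract: N(27, 5.5^2).
    edges = np.array([15.0, 20.0, 25.0, 30.0, 35.0, 45.0])   # five broad age groups (assumed boundaries)
    weights = np.diff(norm.cdf(edges, loc=27.0, scale=5.5))
    weights /= weights.sum()                                  # renormalise over the covered age range

    # Illustrative audited per-group rates for one centre (not real data).
    detection_rate = np.array([0.55, 0.58, 0.62, 0.70, 0.82])
    false_positive = np.array([0.020, 0.030, 0.040, 0.070, 0.120])

    # Age-standardised rates = reference-weighted averages of the group rates.
    print("standardised detection rate     :", round(float(weights @ detection_rate), 3))
    print("standardised false-positive rate:", round(float(weights @ false_positive), 3))
    ```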

  16. A novel ECG detector performance metric and its relationship with missing and false heart rate limit alarms.

    PubMed

    Daluwatte, Chathuri; Vicente, Jose; Galeotti, Loriano; Johannesen, Lars; Strauss, David G; Scully, Christopher G

    Performance of ECG beat detectors is traditionally assessed over long intervals (e.g., 30 min), but only incorrect detections within a short interval (e.g., 10 s) may cause incorrect (i.e., missed + false) heart rate limit alarms (tachycardia and bradycardia). We propose a novel performance metric based on the distribution of incorrect beat detections over a short interval and assess its relationship with incorrect heart rate limit alarm rates. Six ECG beat detectors were assessed using performance metrics over a long interval (sensitivity and positive predictive value over 30 min) and a short interval (area under the empirical cumulative distribution function (AUecdf) for short-interval (i.e., 10 s) sensitivity and positive predictive value) on two ECG databases. False heart rate limit and asystole alarm rates calculated using a third ECG database were then correlated (Spearman's rank correlation) with each calculated performance metric. False alarm rates correlated with sensitivity calculated over the long interval (i.e., 30 min) (ρ=-0.8 and p<0.05) and with AUecdf for sensitivity (ρ=0.9 and p<0.05) in all assessed ECG databases. Sensitivity over 30 min grouped the two detectors with the lowest false alarm rates, while AUecdf for sensitivity provided further information to identify the two beat detectors with the highest false alarm rates as well, which could not be separated using sensitivity over 30 min alone. Short-interval performance metrics can provide insights on the potential of a beat detector to generate incorrect heart rate limit alarms. Published by Elsevier Inc.
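
    A sketch of the short-interval metric described above: compute beat-detection sensitivity in each 10-s window, then take the area under the empirical cumulative distribution function of those per-window values. The matching tolerance, the window length, and the omission of positive predictive value are simplifying assumptions, not the authors' exact definition.

    ```python
    import numpy as np

    def short_interval_sensitivity(true_beats, detected_beats, record_len, win=10.0, tol=0.15):
        """Sensitivity computed separately in each 10-s window: fraction of true
        beat times with a detection within +/- tol seconds."""
        true_beats = np.asarray(true_beats, dtype=float)
        detected_beats = np.asarray(detected_beats, dtype=float)
        per_window = []
        for t0 in np.arange(0.0, record_len, win):
            in_win = true_beats[(true_beats >= t0) & (true_beats < t0 + win)]
            if in_win.size == 0:
                continue                      # no reference beats in this window
            hits = [np.any(np.abs(detected_beats - b) <= tol) for b in in_win]
            per_window.append(np.mean(hits))
        return np.asarray(per_window)

    def auecdf(per_window_values, grid=np.linspace(0.0, 1.0, 101)):
        """Area under the empirical CDF of the per-window values on [0, 1];
        larger values indicate more windows with poor short-interval performance."""
        ecdf = np.array([(per_window_values <= g).mean() for g in grid])
        return np.trapz(ecdf, grid)
    ```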

  17. Automated Detection, Localization, and Classification of Traumatic Vertebral Body Fractures in the Thoracic and Lumbar Spine at CT

    PubMed Central

    Burns, Joseph E.; Yao, Jianhua; Muñoz, Hector

    2016-01-01

    Purpose To design and validate a fully automated computer system for the detection and anatomic localization of traumatic thoracic and lumbar vertebral body fractures at computed tomography (CT). Materials and Methods This retrospective study was HIPAA compliant. Institutional review board approval was obtained, and informed consent was waived. CT examinations in 104 patients (mean age, 34.4 years; range, 14–88 years; 32 women, 72 men), consisting of 94 examinations with positive findings for fractures (59 with vertebral body fractures) and 10 control examinations (without vertebral fractures), were performed. There were 141 thoracic and lumbar vertebral body fractures in the case set. The locations of fractures were marked and classified by a radiologist according to Denis column involvement. The CT data set was divided into training and testing subsets (37 and 67 examinations, respectively) for analysis by means of prototype software for fully automated spinal segmentation and fracture detection. Free-response receiver operating characteristic analysis was performed. Results Training set sensitivity for detection and localization of fractures within each vertebra was 0.82 (28 of 34 findings; 95% confidence interval [CI]: 0.68, 0.90), with a false-positive rate of 2.5 findings per patient. The sensitivity for fracture localization to the correct vertebra was 0.88 (23 of 26 findings; 95% CI: 0.72, 0.96), with a false-positive rate of 1.3. Testing set sensitivity for the detection and localization of fractures within each vertebra was 0.81 (87 of 107 findings; 95% CI: 0.75, 0.87), with a false-positive rate of 2.7. The sensitivity for fracture localization to the correct vertebra was 0.92 (55 of 60 findings; 95% CI: 0.79, 0.94), with a false-positive rate of 1.6. The most common cause of false-positive findings was nutrient foramina (106 of 272 findings [39%]). Conclusion The fully automated computer system detects and anatomically localizes vertebral body fractures in the thoracic and lumbar spine on CT images with a high sensitivity and a low false-positive rate. © RSNA, 2015. Online supplemental material is available for this article. PMID:26172532

  18. How to limit false positives in environmental DNA and metabarcoding?

    PubMed

    Ficetola, Gentile Francesco; Taberlet, Pierre; Coissac, Eric

    2016-05-01

    Environmental DNA (eDNA) and metabarcoding are boosting our ability to acquire data on species distribution in a variety of ecosystems. Nevertheless, like most sampling approaches, eDNA is not perfect. It can fail to detect species that are actually present, and false positives are also possible: a species may be apparently detected in areas where it is actually absent. Controlling false positives remains a major challenge for eDNA analyses: in this issue of Molecular Ecology Resources, Lahoz-Monfort et al. test the performance of multiple statistical modelling approaches to estimate the rate of detection and false positives from eDNA data. Here, we discuss the importance of controlling for false detection from the early steps of eDNA analyses (laboratory, bioinformatics) to improve the quality of results and allow an efficient use of the site occupancy-detection modelling (SODM) framework for limiting false presences in eDNA analysis. © 2016 John Wiley & Sons Ltd.

  19. Towards Development of a 3-State Self-Paced Brain-Computer Interface

    PubMed Central

    Bashashati, Ali; Ward, Rabab K.; Birch, Gary E.

    2007-01-01

    Most existing brain-computer interfaces (BCIs) detect specific mental activity in a so-called synchronous paradigm. Unlike synchronous systems, which are operational only at specific system-defined periods, self-paced (asynchronous) interfaces have the advantage of being operational at all times. The low-frequency asynchronous switch design (LF-ASD) is a 2-state self-paced BCI that detects the presence of a specific finger movement in the ongoing EEG. Recent evaluations of the 2-state LF-ASD show an average true positive rate of 41% at a fixed false positive rate of 1%. This paper proposes two designs for a 3-state self-paced BCI that is capable of handling the idle brain state. The two proposed designs aim at detecting right- and left-hand extensions from the ongoing EEG. They are formed of two consecutive detectors. The first detects the presence of a right- or a left-hand movement and the second classifies the detected movement as a right or a left one. In an offline analysis of the EEG data collected from four able-bodied individuals, the 3-state brain-computer interface shows performance comparable with a 2-state system and a significant performance improvement if used as a 2-state BCI, that is, in detecting the presence of a right- or a left-hand movement (regardless of the type of movement). It has an average true positive rate of 37.5% and 42.8% (at a false positive rate of 1%) in detecting right- and left-hand extensions, respectively, in the context of a 3-state self-paced BCI, and an average detection rate of 58.1% (at a false positive rate of 1%) in the context of a 2-state self-paced BCI. PMID:18288260

  20. Vision screening in preschool children: comparison of orthoptists and clinical medical officers as primary screeners.

    PubMed Central

    Bolger, P G; Stewart-Brown, S L; Newcombe, E; Starbuck, A

    1991-01-01

    OBJECTIVE--To see if there were differences in referral rates and abnormalities detected from two areas that were operating different preschool vision screening programmes. DESIGN--Cohort study using case notes of referrals. SETTING--Community based secondary referral centres in the county of Avon. PATIENTS--263 referrals from a child population of 7105 in Southmead district, an area that used orthoptists as primary vision screeners; 111 referrals from a child population of 2977 in Weston-super-Mare, an area that used clinical medical officers for screening. MAIN OUTCOME MEASURES--Amblyopia and squint detection rates, together with false positive referral rates. RESULTS--The amblyopia detection rate in Southmead district was significantly higher than in Weston-super-Mare (11/1000 children v 5/1000), as was the detection rate of squint (11/1000 v 3/1000). However, the false positive referral rate from Southmead was significantly lower than that from Weston-super-Mare (9/1000 v 23/1000). CONCLUSION--Preschool vision screening using orthoptists as primary screeners offers a more effective method of detecting visual abnormalities than using clinical medical officers. PMID:1747671

  1. First-trimester screening for early and late preeclampsia using maternal characteristics, biomarkers, and estimated placental volume.

    PubMed

    Sonek, Jiri; Krantz, David; Carmichael, Jon; Downing, Cathy; Jessup, Karen; Haidar, Ziad; Ho, Shannon; Hallahan, Terrence; Kliman, Harvey J; McKenna, David

    2018-01-01

    Preeclampsia is a major cause of perinatal morbidity and mortality. First-trimester screening has been shown in some studies to be effective in selecting patients at an increased risk for preeclampsia. We sought to evaluate the feasibility of screening for preeclampsia in the first trimester based on maternal characteristics, medical history, biomarkers, and placental volume. This is a prospective observational nonintervention cohort study in an unselected US population. Patients who presented for an ultrasound examination between 11-13+6 weeks' gestation were included. The following parameters were assessed and used to calculate the risk of preeclampsia: maternal characteristics (demographic, anthropometric, and medical history), maternal biomarkers (mean arterial pressure, uterine artery pulsatility index, placental growth factor, pregnancy-associated plasma protein A, and maternal serum alpha-fetoprotein), and estimated placental volume. After delivery, medical records were searched for the diagnosis of preeclampsia. Detection rates for early-onset preeclampsia (<34 weeks' gestation) and late-onset preeclampsia (≥34 weeks' gestation) at 5% and 10% false-positive rates were calculated using various combinations of markers. We screened 1288 patients, of whom 1068 (82.99%) were available for analysis. In all, 46 (4.3%) developed preeclampsia, with 13 (1.22%) having early-onset preeclampsia and 33 (3.09%) having late-onset preeclampsia. Using maternal characteristics, serum biomarkers, and uterine artery pulsatility index, the detection rate of early-onset preeclampsia was 85% at either a 5% or a 10% false-positive rate. With the same protocol, the detection rates for preeclampsia with delivery <37 weeks were 52% and 60% at 5% and 10% false-positive rates, respectively. Based on maternal characteristics, the detection rates for late-onset preeclampsia were 15% and 48% at 5% and 10% false-positive rates, while for preeclampsia at ≥37 weeks' gestation the detection rates were 24% and 43%, respectively. The detection rates for late-onset preeclampsia and preeclampsia with delivery at >37 weeks' gestation were not improved by the addition of biomarkers. Screening for preeclampsia at 11-13+6 weeks' gestation using maternal characteristics and biomarkers is associated with a high detection rate at a low false-positive rate. Screening for late-onset preeclampsia yields a much poorer performance. In this study the utility of estimated placental volume and mean arterial pressure was limited, but larger studies are needed to ultimately determine the effectiveness of these markers. Copyright © 2017 Elsevier Inc. All rights reserved.
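
    The detection-rate-at-fixed-false-positive-rate figures above can be illustrated with a small sketch (hypothetical variable names; this is not the screening algorithm used in the study): the risk cut-off is set so that a chosen fraction of unaffected pregnancies screen positive, and the detection rate is the fraction of affected pregnancies above that cut-off.

      import numpy as np

      def detection_rate_at_fpr(risk_affected, risk_unaffected, fpr=0.05):
          # Cut-off chosen so that a fraction `fpr` of unaffected pregnancies screen positive.
          cutoff = np.quantile(risk_unaffected, 1.0 - fpr)
          # Detection rate = fraction of affected pregnancies above the cut-off.
          return float(np.mean(risk_affected > cutoff))

      # e.g. dr_5  = detection_rate_at_fpr(risk_early_pe, risk_unaffected, 0.05)
      #      dr_10 = detection_rate_at_fpr(risk_early_pe, risk_unaffected, 0.10)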

  2. Using a Regression Method for Estimating Performance in a Rapid Serial Visual Presentation Target-Detection Task

    DTIC Science & Technology

    2017-12-01

    Record excerpt: stim_label contains values designating each stimulus as a target (true) or nontarget (false); both stim_time and stim_label should have length equal to the number of stimuli. Estimation errors depend strongly on the true values of hit rate and false-alarm rate, and based on its better estimation of hit rate and false-alarm rate, the regression method is preferred.

  3. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems

    PubMed Central

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks, in which attackers inject spam user profiles to disturb the list of recommended items. There are two characteristics common to all types of shilling attacks: 1) item abnormality: the rating of target items is always the maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need to be further improved. To solve these problems, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first dynamically partitions item-rating time series based on important points. Then, the chi-square distribution (χ2) is used to detect abnormal intervals. The experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes. PMID:26267477
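
    A minimal sketch of the flagging step only (the partitioning at important points is not reproduced; the names and significance level are assumptions, not the authors' code): an interval of an item's rating time series is tested against the item's overall rating distribution with a chi-square statistic.

      import numpy as np
      from scipy.stats import chi2

      def interval_is_abnormal(interval_counts, overall_counts, alpha=0.01):
          # interval_counts / overall_counts: counts per rating value (e.g. 1..5).
          observed = np.asarray(interval_counts, dtype=float)
          reference = np.asarray(overall_counts, dtype=float)
          expected = reference / reference.sum() * observed.sum()
          stat = np.sum((observed - expected) ** 2 / np.maximum(expected, 1e-9))
          dof = len(observed) - 1
          return stat > chi2.ppf(1.0 - alpha, dof)

      # Ratings piled on the maximum (or minimum) value within a short interval
      # inflate the statistic, so the interval is flagged as a possible attack.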

  4. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems.

    PubMed

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks, in which attackers inject spam user profiles to disturb the list of recommended items. There are two characteristics common to all types of shilling attacks: 1) item abnormality: the rating of target items is always the maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need to be further improved. To solve these problems, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first dynamically partitions item-rating time series based on important points. Then, the chi-square distribution (χ2) is used to detect abnormal intervals. The experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes.

  5. Analysis of different device-based intrathoracic impedance vectors for detection of heart failure events (from the Detect Fluid Early from Intrathoracic Impedance Monitoring study).

    PubMed

    Heist, E Kevin; Herre, John M; Binkley, Philip F; Van Bakel, Adrian B; Porterfield, James G; Porterfield, Linda M; Qu, Fujian; Turkel, Melanie; Pavri, Behzad B

    2014-10-15

    Detect Fluid Early from Intrathoracic Impedance Monitoring (DEFEAT-PE) is a prospective, multicenter study of multiple intrathoracic impedance vectors to detect pulmonary congestion (PC) events. Changes in intrathoracic impedance between the right ventricular (RV) coil and device can (RVcoil→Can) of implantable cardioverter-defibrillators (ICDs) and cardiac resynchronization therapy ICDs (CRT-Ds) are used clinically for the detection of PC events, but other impedance vectors and algorithms have not been studied prospectively. An initial 75-patient study was used to derive optimal impedance vectors to detect PC events, with 2 vector combinations selected for prospective analysis in DEFEAT-PE (ICD vectors: RVring→Can + RVcoil→Can, detection threshold 13 days; CRT-D vectors: left ventricular ring→Can + RVcoil→Can, detection threshold 14 days). Impedance changes were considered true positive if detected <30 days before an adjudicated PC event. One hundred sixty-two patients were enrolled (80 with ICDs and 82 with CRT-Ds), all with ≥1 previous PC event. One hundred forty-four patients provided study data, with 214 patient-years of follow-up and 139 PC events. Sensitivity for PC events of the prespecified algorithms was as follows: ICD: sensitivity 32.3%, false-positive rate 1.28 per patient-year; CRT-D: sensitivity 32.4%, false-positive rate 1.66 per patient-year. An alternative algorithm, ultimately approved by the US Food and Drug Administration (RVring→Can + RVcoil→Can, detection threshold 14 days), resulted in (for all patients) sensitivity of 21.6% and a false-positive rate of 0.9 per patient-year. The CRT-D thoracic impedance vector algorithm selected in the derivation study was not superior to the ICD algorithm RVring→Can + RVcoil→Can when studied prospectively. In conclusion, to achieve an acceptably low false-positive rate, the intrathoracic impedance algorithms studied in DEFEAT-PE resulted in low sensitivity for the prediction of heart failure events. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Robust and Accurate Anomaly Detection in ECG Artifacts Using Time Series Motif Discovery

    PubMed Central

    Sivaraks, Haemwaan

    2015-01-01

    Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats, which helps identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because they are not able to differentiate ECG artifacts from the real ECG signal, especially when the artifacts are similar to ECG signals in terms of shape and/or frequency. The problem leads to high vigilance demands for physicians and misinterpretation risk for nonspecialists. Therefore, this work proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts and can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design. In addition, every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experimental results on real ECG datasets are interpreted and evaluated by cardiologists. The proposed algorithm can mostly achieve 100% accuracy on detection (AoD), sensitivity, specificity, and positive predictive value with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts, compared with competitive anomaly detection methods. PMID:25688284

  7. Sentinel Lymph Node Detection Using Carbon Nanoparticles in Patients with Early Breast Cancer

    PubMed Central

    Lu, Jianping; Zeng, Yi; Chen, Xia; Yan, Jun

    2015-01-01

    Purpose: Carbon nanoparticles have a strong affinity for the lymphatic system. The purpose of this study was to evaluate the feasibility of sentinel lymph node biopsy using carbon nanoparticles in early breast cancer and to optimize the application procedure. Methods: First, we performed a pilot study to determine the optimal conditions for using carbon nanoparticles for sentinel lymph node (SLN) detection by investigating 36 clinically node-negative breast cancer patients. In the subsequent prospective study, 83 patients with clinically node-negative breast cancer were included to evaluate SLN detection using carbon nanoparticles; in another 83 patients, SLNs were detected using blue dye. SLN detection parameters were compared between the methods. All patients, irrespective of SLN status, underwent axillary lymph node dissection for verification of axillary node status after the SLN biopsy. Results: In the pilot study, a 1 ml carbon nanoparticle suspension injected 10-15 min before surgery was associated with the best detection rate. In the subsequent prospective study, with carbon nanoparticles, the identification rate, accuracy, and false negative rate were 100%, 96.4%, and 11.1%, respectively. The identification rate and accuracy were 88% and 95.5%, with a false negative rate of 15.8%, using the blue dye technique. The carbon nanoparticle suspension showed significantly superior results in identification rate (p = 0.001) and reduced false-negative results compared with the blue dye technique. Conclusion: Our study demonstrated the feasibility and accuracy of using carbon nanoparticles for SLN mapping in breast cancer patients. Carbon nanoparticles are useful for SLN detection in institutions without access to radioisotope. PMID:26296136

  8. Assessing environmental DNA detection in controlled lentic systems.

    PubMed

    Moyer, Gregory R; Díaz-Ferguson, Edgardo; Hill, Jeffrey E; Shea, Colin

    2014-01-01

    Little consideration has been given to environmental DNA (eDNA) sampling strategies for rare species. The certainty of species detection relies on understanding false positive and false negative error rates. We used artificial ponds together with logistic regression models to assess the detection of African jewelfish eDNA at varying fish densities (0, 0.32, 1.75, and 5.25 fish/m3). Our objectives were to determine the most effective water stratum for eDNA detection, estimate true and false positive eDNA detection rates, and assess the number of water samples necessary to minimize the risk of false negatives. There were 28 eDNA detections in 324 1-L water samples collected from four experimental ponds. The best-approximating model indicated that eDNA detection in a 1-L sample was 4.86 times more likely for every 2.53 fish/m3 (1 SD) increase in fish density and 1.67 times less likely for every 1.02 °C (1 SD) increase in water temperature. The best section of the water column in which to detect eDNA was the surface and, to a lesser extent, the bottom. Although no false positives were detected, the estimated likely number of false positives in samples from ponds that contained fish averaged 3.62. At high densities of African jewelfish, 3-5 L of water provided a >95% probability of detecting its eDNA. Conversely, at moderate and low densities, the number of water samples necessary to achieve a >95% probability of eDNA detection approximated 42-73 and >100 L, respectively. Potential biases associated with incomplete detection of eDNA could be alleviated via formal estimation of eDNA detection probabilities under an occupancy modeling framework; alternatively, the filtration of hundreds of liters of water may be required to achieve a high (e.g., 95%) level of certainty that African jewelfish eDNA will be detected at low densities (i.e., <0.32 fish/m3 or 1.75 g/m3).
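
    The sample-size figures above follow from a simple independence argument, sketched below; the per-litre detection probabilities are placeholders chosen only to land near the ranges quoted in the abstract, not estimates from the study.

      import math

      def samples_needed(p_per_litre, target=0.95):
          # Smallest n with 1 - (1 - p)^n >= target, i.e. n >= log(1 - target) / log(1 - p).
          if p_per_litre <= 0.0:
              return math.inf
          return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_per_litre))

      # samples_needed(0.50) ->  5   (high density: a few litres suffice)
      # samples_needed(0.07) -> 42   (moderate density: tens of litres)
      # samples_needed(0.03) -> 99   (low density: on the order of 100 L)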

  9. Generalized site occupancy models allowing for false positive and false negative errors

    USGS Publications Warehouse

    Royle, J. Andrew; Link, W.A.

    2006-01-01

    Site occupancy models have been developed that allow for imperfect species detection or "false negative" observations. Such models have become widely adopted in surveys of many taxa. The most fundamental assumption underlying these models is that "false positive" errors are not possible. That is, one cannot detect a species where it does not occur. However, such errors are possible in many sampling situations for a number of reasons, and even low false positive error rates can induce extreme bias in estimates of site occupancy when they are not accounted for. In this paper, we develop a model for site occupancy that allows for both false negative and false positive error rates. This model can be represented as a two-component finite mixture model and can be easily fitted using freely available software. We provide an analysis of avian survey data using the proposed model and present results of a brief simulation study evaluating the performance of the maximum-likelihood estimator and the naive estimator in the presence of false positive errors.
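
    A minimal sketch of such a two-component mixture, assuming a constant occupancy probability psi, a true detection probability p11 at occupied sites and a false positive probability p10 at unoccupied sites, with y detections out of K visits per site, fitted by maximum likelihood. This is an illustration under those assumptions, not the authors' implementation.

      import numpy as np
      from scipy.stats import binom
      from scipy.optimize import minimize
      from scipy.special import expit  # inverse logit keeps probabilities in (0, 1)

      def neg_log_lik(theta, y, K):
          psi, p11, p10 = expit(theta)
          # Mixture over the latent occupancy state of each site.
          lik = psi * binom.pmf(y, K, p11) + (1.0 - psi) * binom.pmf(y, K, p10)
          return -np.sum(np.log(np.maximum(lik, 1e-300)))

      def fit_occupancy_fp(y, K):
          res = minimize(neg_log_lik, x0=np.zeros(3), args=(np.asarray(y), K),
                         method="Nelder-Mead")
          return expit(res.x)  # (psi_hat, p11_hat, p10_hat)

      # y = number of detections per site over K repeat surveys, e.g. fit_occupancy_fp(y, 5)

    In practice a constraint such as p11 > p10 is needed to rule out the mirrored labelling of the two mixture components.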

  10. Real-time people and vehicle detection from UAV imagery

    NASA Astrophysics Data System (ADS)

    Gaszczak, Anna; Breckon, Toby P.; Han, Jiwan

    2011-01-01

    A generic and robust approach for the real-time detection of people and vehicles from an Unmanned Aerial Vehicle (UAV) is an important goal within the framework of fully autonomous UAV deployment for aerial reconnaissance and surveillance. Here we present an approach for the automatic detection of vehicles based on multiple trained cascaded Haar classifiers with secondary confirmation in thermal imagery. Additionally, we present a related approach for people detection in thermal imagery based on a similar cascaded classification technique combined with additional multivariate Gaussian shape matching. The results presented show the successful detection of vehicles and people under varying conditions in both isolated rural and cluttered urban environments with minimal false positive detections. Performance of the detector is optimized to reduce the overall false positive rate by aiming at the detection of each object of interest (vehicle/person) at least once in the environment (i.e., per search pattern flight path) rather than every object in each image frame. Currently the detection rate for people is ~70% and for vehicles ~80%, although the overall episodic object detection rate for each flight pattern exceeds 90%.

  11. [The Application of Gold-immunochromatographic Test Strip Reader in Serum Antibody Detection in Echinococcosis].

    PubMed

    Xie, Gui-lin; Yin, Jun-xia; Duan, Xin-yu; Feng, Xiao-hui; Wang, Yang; Jiang, Kai; Chen, Xin-mei

    2015-06-01

    One hundred and fifty-nine serum samples from hydatid disease patients and 80 serum samples from patients with other liver diseases were tested by gold-immunochromatographic assay and read both by the naked eye and by the gold-immunochromatographic test strip reader. The sensitivity, specificity, and accuracy of the eye-based reading were 92.4% (147/159), 85.0% (68/80), and 89.9% (215/239), which were lower than those of the reader-based detection (95.6%, 93.7%, and 95.0%, respectively). Its false negative rate (7.5%, 12/159) and false positive rate (15.0%, 12/80) were higher than those of the reader-based detection (4.4% and 6.3%, respectively).

  12. Multi-level scanning method for defect inspection

    DOEpatents

    Bokor, Jeffrey; Jeong, Seongtae

    2002-01-01

    A method for performing scanned defect inspection of a collection of contiguous areas using a specified false-alarm rate and capture rate within an inspection system that has characteristic seek times between inspection locations. The multi-stage method involves setting an increased false-alarm rate for a first stage of scanning, wherein subsequent stages of scanning inspect only the detected areas of probable defects at lowered values of the false-alarm rate. For scanning inspection operations wherein the seek time and area uncertainty are favorable, the method can substantially increase inspection throughput.
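
    A toy sketch of the two-stage idea under a simple Gaussian-noise assumption (the thresholds, noise model and remeasure step are illustrative, not the patent's specification): the fast first pass uses a relaxed false-alarm rate, and only the flagged areas are re-inspected at the stringent rate.

      import numpy as np
      from scipy.stats import norm

      def two_stage_inspect(first_scan, remeasure, far_stage1=1e-2, far_stage2=1e-6):
          # first_scan: per-area readings from the fast pass, ~N(0, 1) without defects.
          # remeasure(indices): slower, higher-precision readings for flagged areas.
          t1 = norm.isf(far_stage1)              # relaxed threshold, full-area scan
          flagged = np.where(first_scan > t1)[0]
          second = remeasure(flagged)            # re-inspect probable defects only
          t2 = norm.isf(far_stage2)              # stringent threshold, confirmation pass
          confirmed = flagged[second > t2]
          return flagged, confirmed

      # Throughput improves because the slow, low-false-alarm criterion is applied
      # only to the small fraction of areas surviving the first pass.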

  13. Electro-optical muzzle flash detection

    NASA Astrophysics Data System (ADS)

    Krieg, Jürgen; Eisele, Christian; Seiffer, Dirk

    2016-10-01

    Localizing a shooter in a complex scenario is a difficult task. Acoustic sensors can be used to detect blast waves, radar technology permits detection of the projectile, and a third method is to detect the muzzle flash using electro-optical devices. Detection of muzzle flash events is possible with focal plane arrays, line detectors and single-element detectors. In this paper, we show that detection of a muzzle flash works well in the shortwave infrared spectral range. A very low false alarm rate is important for the acceptance of an operational warning system in daily use. Using data from a detector with a high sampling rate, the temporal signature of a potential muzzle flash event can be analyzed and the false alarm rate can be reduced. Another important issue is the realization of the omnidirectional view required at an operational level. It is shown that a combination of single-element detectors and simple optics in an appropriate configuration is a capable solution.

  14. Measures and Interpretations of Vigilance Performance: Evidence Against the Detection Criterion

    NASA Technical Reports Server (NTRS)

    Balakrishnan, J. D.

    1998-01-01

    Operators' performance in a vigilance task is often assumed to depend on their choice of a detection criterion. When the signal rate is low this criterion is set high, causing the hit and false alarm rates to be low. With increasing time on task the criterion presumably tends to increase even further, thereby further decreasing the hit and false alarm rates. Virtually all of the empirical evidence for this simple interpretation is based on estimates of the bias measure Beta from signal detection theory. In this article, I describe a new approach to studying decision making that does not require the technical assumptions of signal detection theory. The results of this new analysis suggest that the detection criterion is never biased toward either response, even when the signal rate is low and the time on task is long. Two modifications of the signal detection theory framework are considered to account for this seemingly paradoxical result. The first assumes that the signal rate affects the relative sizes of the variances of the information distributions; the second assumes that the signal rate affects the logic of the operator's stopping rule. Actual or potential applications of this research include the improved training and performance assessment of operators in areas such as product quality control, air traffic control, and medical and clinical diagnosis.
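
    For reference, the textbook signal detection theory quantities mentioned above (sensitivity d' and the bias measure beta) can be computed from hit and false-alarm rates as sketched below; this is standard SDT under the equal-variance Gaussian model, not the article's new analysis.

      from scipy.stats import norm

      def dprime_beta(hit_rate, false_alarm_rate):
          z_hit = norm.ppf(hit_rate)
          z_fa = norm.ppf(false_alarm_rate)
          d_prime = z_hit - z_fa
          # beta: likelihood ratio at the criterion under equal-variance Gaussians.
          beta = norm.pdf(z_hit) / norm.pdf(z_fa)
          return d_prime, beta

      # dprime_beta(0.60, 0.05) -> d' ~ 1.9, beta ~ 3.7 (a criterion conventionally
      # read as biased toward "no signal" when the signal rate is low)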

  15. Fusion of thermal- and visible-band video for abandoned object detection

    NASA Astrophysics Data System (ADS)

    Beyan, Cigdem; Yigit, Ahmet; Temizel, Alptekin

    2011-07-01

    Timely detection of packages that are left unattended in public spaces is a security concern, and rapid detection is important for the prevention of potential threats. Because constant surveillance of such places is challenging and labor intensive, automated abandoned-object-detection systems aiding operators have started to be widely used. In many studies, stationary objects, such as people sitting on a bench, are also flagged as suspicious because abandoned items are defined as items newly added to the scene that remain stationary for a predefined time. Therefore, any stationary object results in an alarm, causing a high number of false alarms. These false alarms could be prevented by classifying suspicious items as living and nonliving objects. In this study, a system for abandoned object detection that aids operators surveilling indoor environments, such as airports and railway or metro stations, is proposed. By analyzing information from a thermal- and a visible-band camera, people and the objects left behind can be detected and discriminated as living and nonliving, reducing the false-alarm rate. Experiments demonstrate that using data obtained from a thermal camera in addition to a visible-band camera also increases the true detection rate of abandoned objects.

  16. Breast MR segmentation and lesion detection with cellular neural networks and 3D template matching.

    PubMed

    Ertaş, Gökhan; Gülçür, H Ozcan; Osman, Onur; Uçan, Osman N; Tunaci, Mehtap; Dursun, Memduh

    2008-01-01

    A novel fully automated system is introduced to facilitate lesion detection in dynamic contrast-enhanced magnetic resonance mammography (DCE-MRM). The system extracts breast regions from pre-contrast images using a cellular neural network, generates normalized maximum intensity-time ratio (nMITR) maps and performs 3D template matching with three layers of 12×12 cells to detect lesions. A breast is considered to be properly segmented when relative overlap is >0.85 and the misclassification rate is <0.10. Sensitivity and the false-positive rate per slice and per lesion are used to assess detection performance. The system was tested with a dataset of 2064 breast MR images (344 slices × 6 acquisitions over time) from 19 women containing 39 marked lesions. Ninety-seven percent of the breasts were segmented properly and all the lesions were detected correctly (detection sensitivity = 100%); however, there were some false-positive detections (31%/lesion, 10%/slice).

  17. Automatic near-real-time detection of CMEs in Mauna Loa K-Cor coronagraph images

    NASA Astrophysics Data System (ADS)

    Thompson, William T.; St. Cyr, Orville Chris; Burkepile, Joan; Posner, Arik

    2017-08-01

    A simple algorithm has been developed to detect the onset of coronal mass ejections (CMEs), together with an estimate of their speed, in near-real-time using images of the linearly polarized white-light solar corona taken by the K-Cor telescope at the Mauna Loa Solar Observatory (MLSO). The algorithm used is a variation on the Solar Eruptive Event Detection System (SEEDS) developed at George Mason University. The algorithm was tested against K-Cor data taken between 29 April 2014 and 20 February 2017, on days that the MLSO website marked as containing CMEs. This resulted in testing of 139 days' worth of data containing 171 CMEs. The detection rate varied from close to 80% in 2014-2015, when solar activity was high, down to as low as 20-30% in 2017, when activity was low. The difference in effectiveness with solar cycle is attributed to the difference in relative prevalence of strong CMEs between active and quiet periods. There were also twelve false detections during this time period, leading to an average false detection rate of 8.6% on any given day. However, half of the false detections were clustered into two short periods of a few days each when special conditions prevailed that increased the false detection rate. The K-Cor data were also compared with major Solar Energetic Particle (SEP) storms during this time period. There were three SEP events detected either at Earth or at one of the two STEREO spacecraft while K-Cor was observing during the relevant time period. The K-Cor CME detection algorithm successfully generated alerts for two of these events, with lead times of 1-3 hours before the SEP onset at 1 AU. The third event was not detected by the automatic algorithm because of the unusually broad width of the CME in position angle.

  18. The human false-negative rate of rescreening Pap tests. Measured in a two-arm prospective clinical trial.

    PubMed

    Renshaw, A A; Lezon, K M; Wilbur, D C

    2001-04-25

    Routine quality control rescreening often is used to calculate the false-negative rate (FNR) of gynecologic cytology. Theoretic analysis suggests that this is not appropriate, due to the high FNR of rescreening and the inability to actually measure it. The authors sought to determine the FNR of manual rescreening in a large, prospective, two-arm clinical trial using an analytic instrument in the evaluation. The results of the Autopap System Clinical Trial, encompassing 25,124 analyzed slides, were reviewed. The false-negative and false-positive rates at various thresholds were determined for routine primary screening, routine rescreening, Autopap primary screening, and Autopap rescreening by using a simple, standard methodology. The FNR of routine manual rescreening at the level of atypical squamous cells of undetermined significance (ASCUS) was 73%, more than 3 times the FNR of primary screening; 11 cases were detected. The FNR of Autopap rescreening was 34%; 80 cases were detected. Routine manual rescreening decreased the laboratory FNR by less than 1%; Autopap rescreening reduced the overall laboratory FNR by 5.7%. At the same time, the false-positive rate for Autopap screening was significantly less than that of routine manual screening at the ASCUS level (4.7% vs. 5.6%; P < 0.0001). Rescreening with the Autopap system remained more sensitive than manual rescreening at the low grade squamous intraepithelial lesions threshold (FNR of 58.8% vs. 100%, respectively), although the number of cases rescreened was low. Routine manual rescreening cannot be used to calculate the FNR of primary screening. Routine rescreening is an extremely ineffective method to detect error and thereby decrease a laboratory's FNR. The Autopap system is a much more effective way of detecting errors within a laboratory and reduces the laboratory's FNR by greater than 25%.
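
    The definitions at issue can be made concrete with a small worked sketch (the counts passed in would come from a trial such as the one above; the helper names are hypothetical):

      def fnr(missed, detected):
          # False-negative rate of a screening step = missed / (missed + detected).
          return missed / (missed + detected)

      def lab_fnr_after_rescreen(primary_missed, rescreen_detected, primary_detected):
          # Abnormal cases still missed after rescreening, over all abnormal cases.
          remaining_missed = primary_missed - rescreen_detected
          total_abnormal = primary_missed + primary_detected
          return remaining_missed / total_abnormal

      # If rescreening itself misses ~73% of the abnormal slides it reviews, it recovers
      # few of the primary screen's misses, so the laboratory FNR barely moves.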

  19. Endoscopic tri-modal imaging for detection of early neoplasia in Barrett's oesophagus: a multi-centre feasibility study using high-resolution endoscopy, autofluorescence imaging and narrow band imaging incorporated in one endoscopy system.

    PubMed

    Curvers, W L; Singh, R; Song, L-M Wong-Kee; Wolfsen, H C; Ragunath, K; Wang, K; Wallace, M B; Fockens, P; Bergman, J J G H M

    2008-02-01

    To investigate the diagnostic potential of endoscopic tri-modal imaging and the relative contribution of each imaging modality (i.e. high-resolution endoscopy (HRE), autofluorescence imaging (AFI) and narrow-band imaging (NBI)) for the detection of early neoplasia in Barrett's oesophagus. Prospective multi-centre study. Tertiary referral centres. 84 Patients with Barrett's oesophagus. The Barrett's oesophagus was inspected with HRE followed by AFI. All lesions detected with HRE and/or AFI were subsequently inspected in detail by NBI for the presence of abnormal mucosal and/or microvascular patterns. Biopsies were obtained from all suspicious lesions for blinded histopathological assessment followed by random biopsies. (1) Number of patients with early neoplasia diagnosed by HRE and AFI; (2) number of lesions with early neoplasia detected with HRE and AFI; and (3) reduction of false positive AFI findings after NBI. Per patient analysis: AFI identified all 16 patients with early neoplasia identified with HRE and detected an additional 11 patients with early neoplasia that were not identified with HRE. In three patients no abnormalities were seen but random biopsies revealed HGIN. After HRE inspection, AFI detected an additional 102 lesions; 19 contained HGIN/EC (false positive rate of AFI after HRE: 81%). Detailed inspection with NBI reduced this false positive rate to 26%. In this international multi-centre study, the addition of AFI to HRE increased the detection of both the number of patients and the number of lesions with early neoplasia in patients with Barrett's oesophagus. The false positive rate of AFI was reduced after detailed inspection with NBI.

  20. Effect of radiologists' diagnostic work-up volume on interpretive performance.

    PubMed

    Buist, Diana S M; Anderson, Melissa L; Smith, Robert A; Carney, Patricia A; Miglioretti, Diana L; Monsees, Barbara S; Sickles, Edward A; Taplin, Stephen H; Geller, Berta M; Yankaskas, Bonnie C; Onega, Tracy L

    2014-11-01

    To examine radiologists' screening performance in relation to the number of diagnostic work-ups performed after abnormal findings are discovered at screening mammography by the same radiologist or by different radiologists. In an institutional review board-approved HIPAA-compliant study, the authors linked 651 671 screening mammograms interpreted from 2002 to 2006 by 96 radiologists in the Breast Cancer Surveillance Consortium to cancer registries (standard of reference) to evaluate the performance of screening mammography (sensitivity, false-positive rate [FPR], and cancer detection rate [CDR]). Logistic regression was used to assess the association between the volume of recalled screening mammograms ("own" mammograms, where the radiologist who interpreted the diagnostic image was the same radiologist who had interpreted the screening image, and "any" mammograms, where the radiologist who interpreted the diagnostic image may or may not have been the radiologist who interpreted the screening image) and screening performance and whether the association between total annual volume and performance differed according to the volume of diagnostic work-up. Annually, 38% of radiologists performed the diagnostic work-up for 25 or fewer of their own recalled screening mammograms, 24% performed the work-up for 0-50, and 39% performed the work-up for more than 50. For the work-up of recalled screening mammograms from any radiologist, 24% of radiologists performed the work-up for 0-50 mammograms, 32% performed the work-up for 51-125, and 44% performed the work-up for more than 125. With increasing numbers of radiologist work-ups for their own recalled mammograms, the sensitivity (P = .039), FPR (P = .004), and CDR (P < .001) of screening mammography increased, yielding a stepped increase in women recalled per cancer detected from 17.4 for 25 or fewer mammograms to 24.6 for more than 50 mammograms. Increases in work-ups for any radiologist yielded significant increases in FPR (P = .011) and CDR (P = .001) and a nonsignificant increase in sensitivity (P = .15). Radiologists with a lower annual volume of any work-ups had consistently lower FPR, sensitivity, and CDR at all annual interpretive volumes. These findings support the hypothesis that radiologists may improve their screening performance by performing the diagnostic work-up for their own recalled screening mammograms and directly receiving feedback afforded by means of the outcomes associated with their initial decision to recall. Arranging for radiologists to work up a minimum number of their own recalled cases could improve screening performance but would need systems to facilitate this workflow.

  1. Uncertainty in biological monitoring: a framework for data collection and analysis to account for multiple sources of sampling bias

    USGS Publications Warehouse

    Ruiz-Gutierrez, Viviana; Hooten, Melvin B.; Campbell Grant, Evan H.

    2016-01-01

    Biological monitoring programmes are increasingly relying upon large volumes of citizen-science data to improve the scope and spatial coverage of information, challenging the scientific community to develop design and model-based approaches to improve inference. Recent statistical models in ecology have been developed to accommodate false-negative errors, although current work points to false-positive errors as equally important sources of bias. This is of particular concern for the success of any monitoring programme given that rates as small as 3% could lead to the overestimation of the occurrence of rare events by as much as 50%, and even small false-positive rates can severely bias estimates of occurrence dynamics. We present an integrated, computationally efficient Bayesian hierarchical model to correct for false-positive and false-negative errors in detection/non-detection data. Our model combines independent, auxiliary data sources with field observations to improve the estimation of false-positive rates, when a subset of field observations cannot be validated a posteriori or assumed as perfect. We evaluated the performance of the model across a range of occurrence rates, false-positive and false-negative errors, and quantity of auxiliary data. The model performed well under all simulated scenarios, and we were able to identify critical auxiliary data characteristics which resulted in improved inference. We applied our false-positive model to a large-scale, citizen-science monitoring programme for anurans in the north-eastern United States, using auxiliary data from an experiment designed to estimate false-positive error rates. Not correcting for false-positive rates resulted in biased estimates of occupancy in 4 of the 10 anuran species we analysed, leading to an overestimation of the average number of occupied survey routes by as much as 70%. The framework we present for data collection and analysis is able to efficiently provide reliable inference for occurrence patterns using data from a citizen-science monitoring programme. However, our approach is applicable to data generated by any type of research and monitoring programme, independent of skill level or scale, when effort is placed on obtaining auxiliary information on false-positive rates.

  2. Prospective multi-center study of an automatic online seizure detection system for epilepsy monitoring units.

    PubMed

    Fürbass, F; Ossenblok, P; Hartmann, M; Perko, H; Skupch, A M; Lindinger, G; Elezi, L; Pataraia, E; Colon, A J; Baumgartner, C; Kluge, T

    2015-06-01

    A method for the automatic detection of epileptic seizures in long-term scalp-EEG recordings, called EpiScan, is presented. EpiScan is used as an alarm device to notify the medical staff of epilepsy monitoring units (EMUs) in case of a seizure. A prospective multi-center study was performed in three EMUs including 205 patients. A comparison between EpiScan and the Persyst seizure detector on the prospective data is also presented. In addition, the detection results of EpiScan on retrospective EEG data of 310 patients and on the publicly available CHB-MIT dataset are shown. A detection sensitivity of 81% was reached for unequivocal electrographic seizures with a false alarm rate of only 7 per day. No statistically significant differences in the detection sensitivities could be found between the centers. The comparison to the Persyst seizure detector showed a lower false alarm rate for EpiScan, but the difference was not statistically significant. The automatic seizure detection method EpiScan showed high sensitivity and a low false alarm rate in a prospective multi-center study on a large number of patients. Its application as a seizure alarm device in EMUs becomes feasible and will raise the efficiency of video-EEG monitoring and the safety levels of patients. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  3. Performance of Spectrogram-Based Seizure Identification of Adult EEGs by Critical Care Nurses and Neurophysiologists.

    PubMed

    Amorim, Edilberto; Williamson, Craig A; Moura, Lidia M V R; Shafi, Mouhsin M; Gaspard, Nicolas; Rosenthal, Eric S; Guanci, Mary M; Rajajee, Venkatakrishna; Westover, M Brandon

    2017-07-01

    Continuous EEG screening using spectrograms or compressed spectral arrays (CSAs) by neurophysiologists has shorter review times with minimal loss of sensitivity for seizure detection when compared with visual analysis of raw EEG. Limited data are available on the performance characteristics of CSA-based seizure detection by neurocritical care nurses. This is a prospective cross-sectional study that was conducted in two academic neurocritical care units and involved 33 neurointensive care unit nurses and four neurophysiologists. All nurses underwent a brief training session before testing. Forty two-hour CSA segments of continuous EEG were reviewed and rated for the presence of seizures. Two experienced clinical neurophysiologists masked to the CSA data performed conventional visual analysis of the raw EEG and served as the gold standard. The overall accuracy was 55.7% among nurses and 67.5% among neurophysiologists. Nurse seizure detection sensitivity was 73.8%, and the false-positive rate was 1-per-3.2 hours. Sensitivity and false-alarm rate for the neurophysiologists was 66.3% and 1-per-6.4 hours, respectively. Interrater agreement for seizure screening was fair for nurses (Gwet AC1 statistic: 43.4%) and neurophysiologists (AC1: 46.3%). Training nurses to perform seizure screening utilizing continuous EEG CSA displays is feasible and associated with moderate sensitivity. Nurses and neurophysiologists had comparable sensitivities, but nurses had a higher false-positive rate. Further work is needed to improve sensitivity and reduce false-alarm rates.

  4. Research on Abnormal Detection Based on Improved Combination of K-means and SVDD

    NASA Astrophysics Data System (ADS)

    Hao, Xiaohong; Zhang, Xiaofeng

    2018-01-01

    In order to improve the efficiency of network intrusion detection and reduce the false alarm rate, this paper proposes an anomaly detection algorithm based on improved K-means and SVDD. The algorithm first uses the improved K-means algorithm to cluster the training samples of each class, so that each class is internally compact and well separated from the others. Then, for each cluster of training samples, the SVDD algorithm is used to construct a minimum enclosing hypersphere. The class membership of a test sample is determined by calculating its distance to the hyperspheres constructed by SVDD: if the distance from the test sample to the center of a hypersphere is smaller than its radius, the sample belongs to that class; otherwise it does not. After comparison against all hyperspheres, samples that belong to no class are flagged, yielding effective detection of anomalous test samples. In this paper, we use the KDD CUP99 data set to evaluate the proposed anomaly detection algorithm in simulation. The results show that the algorithm has a high detection rate and a low false alarm rate, making it an effective network security protection method.
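
    A rough sketch of that pipeline using scikit-learn, with OneClassSVM (RBF kernel) standing in for SVDD (the two are closely related but not identical) and plain KMeans standing in for the paper's improved variant; the parameter values and feature preprocessing are assumptions.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.svm import OneClassSVM

      def fit_per_class(X_train, y_train, n_clusters=3, nu=0.05):
          # Cluster each class, then fit one closed boundary per compact sub-cluster.
          models = {}
          for label in np.unique(y_train):
              Xc = X_train[y_train == label]
              k = min(n_clusters, len(Xc))
              assign = KMeans(n_clusters=k, n_init=10).fit_predict(Xc)
              models[label] = [OneClassSVM(kernel="rbf", nu=nu).fit(Xc[assign == c])
                               for c in range(k)]
          return models

      def classify(models, x):
          # Assign x to the class whose boundary it falls inside (highest score);
          # if it lies outside every boundary, flag it as an anomaly.
          x = np.asarray(x).reshape(1, -1)
          best_label, best_score = "anomaly", float("-inf")
          for label, boundaries in models.items():
              score = max(b.decision_function(x)[0] for b in boundaries)
              if score >= 0.0 and score > best_score:
                  best_label, best_score = label, score
          return best_label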

  5. Continuous detection of weak sensory signals in afferent spike trains: the role of anti-correlated interspike intervals in detection performance.

    PubMed

    Goense, J B M; Ratnam, R

    2003-10-01

    An important problem in sensory processing is deciding whether fluctuating neural activity encodes a stimulus or is due to variability in baseline activity. Neurons that subserve detection must examine incoming spike trains continuously, and quickly and reliably differentiate signals from baseline activity. Here we demonstrate that a neural integrator can perform continuous signal detection, with performance exceeding that of trial-based procedures, where spike counts in signal- and baseline windows are compared. The procedure was applied to data from electrosensory afferents of weakly electric fish (Apteronotus leptorhynchus), where weak perturbations generated by small prey add approximately 1 spike to a baseline of approximately 300 spikes s(-1). The hypothetical postsynaptic neuron, modeling an electrosensory lateral line lobe cell, could detect an added spike within 10-15 ms, achieving near ideal detection performance (80-95%) at false alarm rates of 1-2 Hz, while trial-based testing resulted in only 30-35% correct detections at that false alarm rate. The performance improvement was due to anti-correlations in the afferent spike train, which reduced both the amplitude and duration of fluctuations in postsynaptic membrane activity, and so decreased the number of false alarms. Anti-correlations can be exploited to improve detection performance only if there is memory of prior decisions.
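
    A toy sketch of continuous detection with a leaky integrator (an assumption-laden illustration, not the study's model of the postsynaptic cell): the spike train is exponentially filtered and a detection is declared when the filtered activity crosses a threshold chosen to meet a target false alarm rate on baseline data.

      import numpy as np

      def leaky_integrate(spike_times_s, duration_s, tau_s=0.010, dt_s=0.001):
          # Exponentially filtered spike count, a crude membrane-potential proxy.
          t = np.arange(0.0, duration_s, dt_s)
          counts, _ = np.histogram(spike_times_s, bins=np.append(t, duration_s))
          decay = np.exp(-dt_s / tau_s)
          v = np.zeros_like(t)
          v[0] = counts[0]
          for i in range(1, len(t)):
              v[i] = v[i - 1] * decay + counts[i]
          return t, v

      def threshold_for_far(v_baseline, dt_s, target_far_hz):
          # Smallest threshold whose upward-crossing rate on baseline activity
          # stays at or below the target false alarm rate.
          duration_s = len(v_baseline) * dt_s
          for thr in np.sort(np.unique(v_baseline)):
              crossings = np.sum((v_baseline[1:] >= thr) & (v_baseline[:-1] < thr))
              if crossings / duration_s <= target_far_hz:
                  return thr
          return float(np.max(v_baseline))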

  6. Improving staff response to seizures on the epilepsy monitoring unit with online EEG seizure detection algorithms.

    PubMed

    Rommens, Nicole; Geertsema, Evelien; Jansen Holleboom, Lisanne; Cox, Fieke; Visser, Gerhard

    2018-05-11

    User safety and the quality of diagnostics on the epilepsy monitoring unit (EMU) depend on the reaction to seizures. Online seizure detection might improve this. While good sensitivity and specificity have been reported, the added value above staff response is unclear. We ascertained the added value of two electroencephalograph (EEG) seizure detection algorithms in terms of additional detected seizures or faster detection time. EEG-video seizure recordings of people admitted to an EMU over one year were included, with a maximum of two seizures per subject. All recordings were retrospectively analyzed using Encevis EpiScan and BESA Epilepsy. Detection sensitivity and latency of the algorithms were compared to staff responses. False positive rates were estimated on 30 uninterrupted recordings (roughly 24 h per subject) of consecutive subjects admitted to the EMU. The EEG-video recordings included 188 seizures. The response rate of staff was 67%, of Encevis 67%, and of BESA Epilepsy 65%. Of the 62 seizures missed by staff, 66% were recognized by Encevis and 39% by BESA Epilepsy. The median latency was 31 s (staff), 10 s (Encevis), and 14 s (BESA Epilepsy). After correcting for walking time from the observation room to the subject, both algorithms detected faster than staff in 65% of detected seizures. The full recordings included 617 h of EEG. Encevis had a median false positive rate of 4.9 per 24 h and BESA Epilepsy of 2.1 per 24 h. EEG-video seizure detection algorithms may improve the reaction to seizures by increasing the total number of seizures detected and the speed of detection. The false positive rates are acceptable for use in a clinical setting. Implementation of these algorithms might result in faster diagnostic testing and better observation during seizures. Copyright © 2018. Published by Elsevier Inc.

  7. High resolution melting analysis for epidermal growth factor receptor mutations in formalin-fixed paraffin-embedded tissue and plasma free DNA from non-small cell lung cancer patients.

    PubMed

    Jing, Chang-Wen; Wang, Zhuo; Cao, Hai-Xia; Ma, Rong; Wu, Jian-Zhong

    2014-01-01

    The aim of this research was to explore a cost-effective, fast, easy to perform, and sensitive method for epidermal growth factor receptor (EGFR) mutation testing. High resolution melting analysis (HRM) was introduced, and its efficacy for detecting EGFR mutations in exons 18 to 21 was evaluated using formalin-fixed paraffin-embedded (FFPE) tissues and plasma free DNA from 120 patients. The total EGFR mutation rate was 37.5% (45/120) as detected by direct sequencing. There were 48 mutations in 120 FFPE tissues assessed by HRM. For plasma free DNA, the EGFR mutation rate was 25.8% (31/120). The sensitivity of HRM assays in FFPE samples was 100%. There was a low false-positive mutation rate but a high false-negative rate in plasma free DNA detected by HRM. Our results show that HRM analysis has the advantage of requiring only a small tumor sample. HRM applied to plasma free DNA showed a high false-negative rate but a low false-positive rate. Further research into appropriate methods and analysis needs to be performed before HRM on plasma free DNA could be accepted as an option in diagnostic or screening settings.

  8. An Integrated Intrusion Detection Model of Cluster-Based Wireless Sensor Network

    PubMed Central

    Sun, Xuemei; Yan, Bo; Zhang, Xinzhong; Rong, Chuitian

    2015-01-01

    Considering the characteristics of wireless sensor networks, this paper combines anomaly and misuse detection and proposes an integrated detection model for cluster-based wireless sensor networks, aiming at enhancing the detection rate and reducing the false alarm rate. An Adaboost algorithm with hierarchical structures is used for anomaly detection at sensor nodes, cluster-head nodes and Sink nodes. A Cultural-Algorithm and Artificial-Fish-Swarm-Algorithm optimized Back Propagation network is applied to misuse detection at the Sink node. Extensive simulations demonstrate that this integrated model has strong intrusion detection performance. PMID:26447696
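
    A simplified sketch of the anomaly-detection component only, using scikit-learn's AdaBoost on labelled normal/attack records; the hierarchical deployment across sensor, cluster-head and Sink nodes and the optimized back-propagation stage are not reproduced, and feature extraction is assumed to be done already.

      import numpy as np
      from sklearn.ensemble import AdaBoostClassifier
      from sklearn.model_selection import train_test_split

      def evaluate(X, y):
          # y: 1 = attack record, 0 = normal traffic record.
          X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                    stratify=y, random_state=0)
          y_te = np.asarray(y_te)
          clf = AdaBoostClassifier(n_estimators=100).fit(X_tr, y_tr)
          pred = clf.predict(X_te)
          detection_rate = float(np.mean(pred[y_te == 1] == 1))    # attacks flagged
          false_alarm_rate = float(np.mean(pred[y_te == 0] == 1))  # normal flagged as attack
          return detection_rate, false_alarm_rate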

  9. An Integrated Intrusion Detection Model of Cluster-Based Wireless Sensor Network.

    PubMed

    Sun, Xuemei; Yan, Bo; Zhang, Xinzhong; Rong, Chuitian

    2015-01-01

    Considering the characteristics of wireless sensor networks, this paper combines anomaly and misuse detection and proposes an integrated detection model for cluster-based wireless sensor networks, aiming at enhancing the detection rate and reducing the false alarm rate. An Adaboost algorithm with hierarchical structures is used for anomaly detection at sensor nodes, cluster-head nodes and Sink nodes. A Cultural-Algorithm and Artificial-Fish-Swarm-Algorithm optimized Back Propagation network is applied to misuse detection at the Sink node. Extensive simulations demonstrate that this integrated model has strong intrusion detection performance.

  10. Small Infrared Target Detection by Region-Adaptive Clutter Rejection for Sea-Based Infrared Search and Track

    PubMed Central

    Kim, Sungho; Lee, Joohyoung

    2014-01-01

    This paper presents a region-adaptive clutter rejection method for small target detection in sea-based infrared search and track. In the real world, clutter normally generates many false detections that impede the deployment of such detection systems. Incoming targets (missiles, boats, etc.) can be located in the sky, horizon and sea regions, which contain different types of clutter, such as clouds, the horizontal line and sea-glint. The characteristics of regional clutter were analyzed after geometrical-analysis-based region segmentation. The false detections caused by cloud clutter were removed by spatial attribute-based classification. Those caused by the horizontal line were removed using a heterogeneous background removal filter. False alarms caused by sun-glint were rejected using a temporal consistency filter, which is the most difficult part. The experimental results on various cluttered background sequences show that the proposed region-adaptive clutter rejection method produces fewer false alarms than the mean subtraction filter (MSF), with an acceptable degradation in detection rate. PMID:25054633

  11. Toward Non-Invasive and Automatic Intravenous Infiltration Detection: Evaluation of Bioimpedance and Skin Strain in a Pig Model.

    PubMed

    Bicen, A Ozan; West, Leanne L; Cesar, Liliana; Inan, Omer T

    2018-01-01

    Intravenous (IV) therapy is prevalent in hospital settings, where fluids are typically delivered with an IV into a peripheral vein of the patient. IV infiltration is the inadvertent delivery of fluids into the extravascular space rather than into the vein (and requires urgent treatment to avoid scarring and severe tissue damage), for which medical staff currently need to check patients periodically. In this paper, the performance of two non-invasive sensing modalities, electrical bioimpedance (EBI) and skin strain sensing, for the automatic detection of IV infiltration was investigated in an animal model. Infiltrations were physically simulated on the hind limb of anesthetized pigs, where the sensors for EBI and skin strain sensing were co-located. The obtained data were used to examine the ability to distinguish between infusion into the vein and an infiltration event using bioresistance and bioreactance (derived from EBI), as well as skin strain. Skin strain and bioresistance sensing could achieve detection rates greater than 0.9 for infiltration fluid volumes of 2 and 10 mL, respectively, at a given false positive (i.e., false alarm) rate of 0.05. Furthermore, the fusion of multiple sensing modalities could achieve a detection rate of 0.97 with a false alarm rate of 0.096 for a 5 mL infiltration fluid volume. EBI and skin strain sensing can enable non-invasive and real-time IV infiltration detection systems. Fusion of multiple sensing modalities can help to detect an expanded range of leaking fluid volumes. The performance results and comparisons provided in this paper are an important step towards clinical translation of sensing technologies for detecting IV infiltration.
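
    A hedged sketch of score-level fusion at a fixed false alarm rate, in the spirit of the evaluation above; logistic-regression fusion and the feature names are assumptions rather than the fusion rule reported in the paper, and a real evaluation would score held-out data rather than the training set.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def detection_rate_at_far(scores_event, scores_normal, far=0.05):
          # Threshold set so that a fraction `far` of normal-infusion scores exceed it.
          thr = np.quantile(scores_normal, 1.0 - far)
          return float(np.mean(scores_event > thr))

      def fuse_and_evaluate(ebi_features, strain_features, labels, far=0.05):
          # labels: 1 = infiltration, 0 = normal infusion into the vein.
          X = np.column_stack([ebi_features, strain_features])
          labels = np.asarray(labels)
          fused = LogisticRegression().fit(X, labels).predict_proba(X)[:, 1]
          return detection_rate_at_far(fused[labels == 1], fused[labels == 0], far)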

  12. Toward Non-Invasive and Automatic Intravenous Infiltration Detection: Evaluation of Bioimpedance and Skin Strain in a Pig Model

    PubMed Central

    Bicen, A. Ozan; West, Leanne L.; Cesar, Liliana

    2018-01-01

    Intravenous (IV) therapy is prevalent in hospital settings, where fluids are typically delivered with an IV into a peripheral vein of the patient. IV infiltration is the inadvertent delivery of fluids into the extravascular space rather than into the vein (and requires urgent treatment to avoid scarring and severe tissue damage), for which medical staff currently need to check patients periodically. In this paper, the performance of two non-invasive sensing modalities, electrical bioimpedance (EBI) and skin strain sensing, for the automatic detection of IV infiltration was investigated in an animal model. Infiltrations were physically simulated on the hind limb of anesthetized pigs, where the sensors for EBI and skin strain sensing were co-located. The obtained data were used to examine the ability to distinguish between infusion into the vein and an infiltration event using bioresistance and bioreactance (derived from EBI), as well as skin strain. Skin strain and bioresistance sensing could achieve detection rates greater than 0.9 for infiltration fluid volumes of 2 and 10 mL, respectively, at a given false positive (i.e., false alarm) rate of 0.05. Furthermore, the fusion of multiple sensing modalities could achieve a detection rate of 0.97 with a false alarm rate of 0.096 for a 5 mL infiltration fluid volume. EBI and skin strain sensing can enable non-invasive and real-time IV infiltration detection systems. Fusion of multiple sensing modalities can help to detect an expanded range of leaking fluid volumes. The performance results and comparisons provided in this paper are an important step towards clinical translation of sensing technologies for detecting IV infiltration. PMID:29692956

  13. Hidden Markov Models and Neural Networks for Fault Detection in Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic

    1994-01-01

    None given. (From conclusion): Neural networks combined with hidden Markov models (HMMs) can provide excellent detection and false-alarm-rate performance in fault detection applications. Modified models allow for novelty detection. The paper also covers key contributions of the neural network model and the application status.

  14. Serum Free Light Chain Assay and κ/λ Ratio: Performance in Patients With Monoclonal Gammopathy-High False Negative Rate for κ/λ Ratio

    PubMed Central

    Singh, Gurmukh

    2017-01-01

    Background: Serum free light chain assay (SFLCA) and κ/λ ratio, and protein electrophoretic methods are used in the diagnosis and monitoring of monoclonal gammopathies. Methods: Results for serum free light chains, serum and urine protein electrophoreses and immunofixation electrophoreses in 468 patients with a diagnosis of monoclonal gammopathy were compared. The results of the two methods were graded as concordant, non-concordant or discordant with the established diagnoses to assess the relative performance of the methods. Results of the κ/λ ratio in samples with monoclonal protein detectable by electrophoretic methods were also analyzed. Results: Protein electrophoreses results were concordant with the established diagnoses significantly more often than the κ/λ ratio. The false negative rate for the κ/λ ratio was higher than that for electrophoretic methods. The κ/λ ratio was falsely negative in about 27% of the 1,860 samples with detectable monoclonal immunoglobulin. The false negative rate was higher in lesions with lambda chains (32%) than in those with kappa chains (24%). The false negative rate for the κ/λ ratio was over 55% in samples with monoclonal gammopathy of undetermined significance. Even at first encounter, the false negative rates for κ/λ ratios for monoclonal gammopathy of undetermined significance, smoldering myeloma and multiple myeloma were 66.98%, 23.08%, and 30.15%, respectively, with the false negative rate for lambda chain lesions being higher. Conclusions: Electrophoretic studies of serum and urine are superior to SFLCA and the κ/λ ratio. An abnormal κ/λ ratio, per se, is not diagnostic of monoclonal gammopathy, and a normal κ/λ ratio does not exclude monoclonal gammopathy. False negative rates for lesions with lambda chains are higher than those for lesions with kappa chains. Electrophoretic studies of urine are underutilized. The clinical usefulness and medical necessity of SFLCA and the κ/λ ratio in routine clinical testing are questionable. PMID:27924175

  15. Serum Free Light Chain Assay and κ/λ Ratio: Performance in Patients With Monoclonal Gammopathy-High False Negative Rate for κ/λ Ratio.

    PubMed

    Singh, Gurmukh

    2017-01-01

    Serum free light chain assay (SFLCA) and κ/λ ratio, and protein electrophoretic methods are used in the diagnosis and monitoring of monoclonal gammopathies. Results for serum free light chains, serum and urine protein electrophoreses and immunofixation electrophoreses in 468 patients with a diagnosis of monoclonal gammopathy were compared. The results of the two methods were graded as concordant, non-concordant or discordant with the established diagnoses to assess the relative performance of the methods. Results of the κ/λ ratio in samples with monoclonal protein detectable by electrophoretic methods were also analyzed. Protein electrophoreses results were concordant with the established diagnoses significantly more often than the κ/λ ratio. The false negative rate for the κ/λ ratio was higher than that for electrophoretic methods. The κ/λ ratio was falsely negative in about 27% of the 1,860 samples with detectable monoclonal immunoglobulin. The false negative rate was higher in lesions with lambda chains (32%) than in those with kappa chains (24%). The false negative rate for the κ/λ ratio was over 55% in samples with monoclonal gammopathy of undetermined significance. Even at first encounter, the false negative rates for κ/λ ratios for monoclonal gammopathy of undetermined significance, smoldering myeloma and multiple myeloma were 66.98%, 23.08%, and 30.15%, respectively, with the false negative rate for lambda chain lesions being higher. Electrophoretic studies of serum and urine are superior to SFLCA and the κ/λ ratio. An abnormal κ/λ ratio, per se, is not diagnostic of monoclonal gammopathy, and a normal κ/λ ratio does not exclude monoclonal gammopathy. False negative rates for lesions with lambda chains are higher than those for lesions with kappa chains. Electrophoretic studies of urine are underutilized. The clinical usefulness and medical necessity of SFLCA and the κ/λ ratio in routine clinical testing are questionable.

  16. Enhancing the detection of barcoded reads in high throughput DNA sequencing data by controlling the false discovery rate.

    PubMed

    Buschmann, Tilo; Zhang, Rong; Brash, Douglas E; Bystrykh, Leonid V

    2014-08-07

    DNA barcodes are short unique sequences used to label DNA or RNA-derived samples in multiplexed deep sequencing experiments. During the demultiplexing step, barcodes must be detected and their position identified. In some cases (e.g., with PacBio SMRT), the position of the barcode and DNA context is not well defined. Many reads start inside the genomic insert so that adjacent primers might be missed. The matter is further complicated by coincidental similarities between barcode sequences and reference DNA. Therefore, a robust strategy is required in order to detect barcoded reads and avoid a large number of false positives or negatives. For mass inference problems such as this one, false discovery rate (FDR) methods are powerful and balanced solutions. Since existing FDR methods cannot be applied to this particular problem, we present an adapted FDR method that is suitable for the detection of barcoded reads as well as suggest possible improvements. In our analysis, barcode sequences showed high rates of coincidental similarities with the Mus musculus reference DNA. This problem became more acute when the length of the barcode sequence decreased and the number of barcodes in the set increased. The method presented in this paper controls the tail area-based false discovery rate to distinguish between barcoded and unbarcoded reads. This method helps to establish the highest acceptable minimal distance between reads and barcode sequences. In a proof-of-concept experiment we correctly detected barcodes in 83% of the reads with a precision of 89%. Sensitivity improved to 99% at 99% precision when the adjacent primer sequence was incorporated in the analysis. The analysis was further improved using a paired end strategy. Following an analysis of the data for sequence variants induced in the Atp1a1 gene of C57BL/6 murine melanocytes by ultraviolet light and conferring resistance to ouabain, we found no evidence of cross-contamination of DNA material between samples. Our method offers a proper quantitative treatment of the problem of detecting barcoded reads in a noisy sequencing environment. It is based on the false discovery rate statistics that allows a proper trade-off between sensitivity and precision to be chosen.
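
    As a rough illustration of the idea (not the authors' implementation), the sketch below picks the largest acceptable read-to-barcode mismatch threshold by comparing observed minimal distances against an empirical null built from random sequences, keeping a threshold only while the estimated tail-area FDR stays below a chosen level. The function names, the Hamming-distance simplification, and the uniform-random null model are assumptions.

      # Hypothetical sketch: choose the largest barcode-mismatch threshold whose
      # estimated false discovery rate (FDR) stays below alpha. Not the paper's code.
      import random

      def hamming(a, b):
          return sum(x != y for x, y in zip(a, b))

      def min_barcode_distance(read, barcodes):
          k = len(barcodes[0])
          return min(hamming(read[:k], bc) for bc in barcodes)

      def choose_threshold(reads, barcodes, alpha=0.05, n_null=10000, seed=1):
          """Largest distance threshold with estimated FDR <= alpha."""
          rng = random.Random(seed)
          k = len(barcodes[0])
          # Null model: random sequences stand in for unbarcoded reads.
          null = [min_barcode_distance("".join(rng.choice("ACGT") for _ in range(k)), barcodes)
                  for _ in range(n_null)]
          obs = [min_barcode_distance(r, barcodes) for r in reads]
          best = None
          for t in range(k + 1):
              discoveries = sum(d <= t for d in obs)
              if discoveries == 0:
                  continue
              # Expected false discoveries if *all* reads were unbarcoded (conservative).
              expected_false = len(reads) * sum(d <= t for d in null) / n_null
              if expected_false / discoveries <= alpha:
                  best = t
          return best

      barcodes = ["ACGTACGT", "TTGCAATC", "GATCGTAA"]
      reads = ["ACGTACGTTTTCA", "GGGGCCCCAATTA", "ACGAACGTCCGAT"]
      print(choose_threshold(reads, barcodes, alpha=0.05))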

  17. Improved detection and false alarm rejection for chemical vapors using passive hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Marinelli, William J.; Miyashiro, Rex; Gittins, Christopher M.; Konno, Daisei; Chang, Shing; Farr, Matt; Perkins, Brad

    2013-05-01

    Two AIRIS sensors were tested at Dugway Proving Grounds against chemical agent vapor simulants. The primary objectives of the test were to: 1) assess performance of algorithm improvements designed to reduce false alarm rates, with a special emphasis on solar effects, and 2) evaluate performance in target detection at 5 km. The tests included 66 total releases comprising alternating 120 kg glacial acetic acid (GAA) and 60 kg triethyl phosphate (TEP) events. The AIRIS sensors had common algorithms, detection thresholds, and sensor parameters. The sensors used the target set defined for the Joint Service Lightweight Chemical Agent Detector (JSLSCAD) with TEP substituted for GA and GAA substituted for VX. They were exercised at two sites located at either 3 km or 5 km from the release point. Data from the tests will be presented showing that: 1) excellent detection capability was obtained at both ranges with significantly shorter alarm times at 5 km, 2) inter-sensor comparison revealed very comparable performance, 3) false alarm rates < 1 incident per 10 hours running time over 143 hours of sensor operations were achieved, 4) algorithm improvements eliminated both solar and cloud false alarms. The algorithms enabling the improved false alarm rejection will be discussed. The sensor technology has recently been extended to address the problem of detection of liquid and solid chemical agents and toxic industrial chemicals on surfaces. The phenomenology and applicability of passive infrared hyperspectral imaging to this problem will be discussed and demonstrated.

  18. Automatic mouse ultrasound detector (A-MUD): A new tool for processing rodent vocalizations.

    PubMed

    Zala, Sarah M; Reitschmidt, Doris; Noll, Anton; Balazs, Peter; Penn, Dustin J

    2017-01-01

    House mice (Mus musculus) emit complex ultrasonic vocalizations (USVs) during social and sexual interactions, which have features similar to bird song (i.e., they are composed of several different types of syllables, uttered in succession over time to form a pattern of sequences). Manually processing complex vocalization data is time-consuming and potentially subjective, and therefore, we developed an algorithm that automatically detects mouse ultrasonic vocalizations (Automatic Mouse Ultrasound Detector or A-MUD). A-MUD is a script that runs on STx acoustic software (S_TOOLS-STx version 4.2.2), which is free for scientific use. This algorithm improved the efficiency of processing USV files, as it was 4-12 times faster than manual segmentation, depending upon the size of the file. We evaluated A-MUD error rates using manually segmented sound files as a 'gold standard' reference, and compared them to a commercially available program. A-MUD had lower error rates than the commercial software, as it detected significantly more correct positives, and fewer false positives and false negatives. The errors generated by A-MUD were mainly false negatives, rather than false positives. This study is the first to systematically compare error rates for automatic ultrasonic vocalization detection methods, and A-MUD and subsequent versions will be made available for the scientific community.

  19. Detection, isolation and diagnosability analysis of intermittent faults in stochastic systems

    NASA Astrophysics Data System (ADS)

    Yan, Rongyi; He, Xiao; Wang, Zidong; Zhou, D. H.

    2018-02-01

    Intermittent faults (IFs) have the properties of unpredictability, non-determinacy, inconsistency and repeatability, switching systems between faulty and healthy status. In this paper, the fault detection and isolation (FDI) problem for IFs in a class of linear stochastic systems is investigated. Detection and isolation of IFs requires: (1) detecting every appearing time and disappearing time of an IF; (2) detecting each appearing (disappearing) time before the subsequent disappearing (appearing) time; and (3) determining where the IFs occur. Based on the outputs of the designed observers, a novel set of residuals is constructed using the sliding-time window technique, and two hypothesis tests are proposed to detect all the appearing and disappearing times of IFs. The isolation problem of IFs is also considered. Furthermore, within a statistical framework, the definition of the diagnosability of IFs is proposed, and a sufficient condition for the diagnosability of IFs is given. Quantitative performance analysis results for the false alarm rate and missing detection rate are discussed, and the influences of some key parameters of the proposed scheme on performance indices such as the false alarm rate and missing detection rate are analysed rigorously. The effectiveness of the proposed scheme is illustrated via a simulation example of an unmanned helicopter longitudinal control system.
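
    A minimal sketch of the sliding-window idea, under the simplifying assumption of a scalar residual with known noise standard deviation (the paper's observer-based residual construction is not reproduced here): the window mean is tested against a threshold set by a per-window false-alarm probability, and rising/falling edges of the test decision give candidate appearing and disappearing times.

      # Hedged sketch: sliding-window mean test on a residual sequence for
      # intermittent-fault appearing/disappearing times. Parameters are illustrative.
      import numpy as np
      from scipy.stats import norm

      def detect_intermittent(residuals, sigma, window=20, alpha=1e-3):
          """Return (appear_times, disappear_times) flagged by the window test."""
          thr = norm.ppf(1 - alpha) * sigma / np.sqrt(window)   # one-sided test on window mean
          means = np.convolve(residuals, np.ones(window) / window, mode="valid")
          active = means > thr                                  # hypothesis test per window position
          edges = np.diff(active.astype(int))
          appear = np.where(edges == 1)[0] + window             # roughly the first post-window sample
          disappear = np.where(edges == -1)[0] + window
          return appear.tolist(), disappear.tolist()

      # Example: noise-only residuals with a fault pulse injected between samples 300 and 400.
      rng = np.random.default_rng(0)
      r = rng.normal(0.0, 1.0, 1000)
      r[300:400] += 2.0
      print(detect_intermittent(r, sigma=1.0))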

  20. Machine-learning-based real-bogus system for the HSC-SSP moving object detection pipeline

    NASA Astrophysics Data System (ADS)

    Lin, Hsing-Wen; Chen, Ying-Tung; Wang, Jen-Hung; Wang, Shiang-Yu; Yoshida, Fumi; Ip, Wing-Huen; Miyazaki, Satoshi; Terai, Tsuyoshi

    2018-01-01

    Machine-learning techniques are widely applied in many modern optical sky surveys, e.g., Pan-STARRS1, PTF/iPTF, and the Subaru/Hyper Suprime-Cam survey, to reduce human intervention in data verification. In this study, we have established a machine-learning-based real-bogus system to reject false detections in the Subaru/Hyper-Suprime-Cam Strategic Survey Program (HSC-SSP) source catalog. Therefore, the HSC-SSP moving object detection pipeline can operate more effectively due to the reduction of false positives. To train the real-bogus system, we use stationary sources as the real training set and "flagged" data as the bogus set. The training set contains 47 features, most of which are photometric measurements and shape moments generated from the HSC image reduction pipeline (hscPipe). Our system can reach a true positive rate (tpr) ˜96% with a false positive rate (fpr) ˜1% or tpr ˜99% at fpr ˜5%. Therefore, we conclude that stationary sources are decent real training samples, and using photometry measurements and shape moments can reject false positives effectively.
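
    A hedged sketch of the evaluation logic described above, with synthetic stand-ins for the 47 hscPipe features and a random forest in place of the authors' unspecified classifier; the point is only how a true positive rate is read off at a fixed false positive rate.

      # Illustrative real-bogus evaluation: train a classifier, then report TPR at a
      # fixed FPR from the ROC curve. Feature values below are synthetic assumptions.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import roc_curve
      from sklearn.model_selection import train_test_split

      def tpr_at_fpr(y_true, scores, target_fpr=0.01):
          fpr, tpr, _ = roc_curve(y_true, scores)
          return np.interp(target_fpr, fpr, tpr)

      # Synthetic stand-in for 47 photometric/shape features (1 = real, 0 = bogus).
      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(0.0, 1.0, (5000, 47)), rng.normal(0.8, 1.0, (5000, 47))])
      y = np.r_[np.zeros(5000), np.ones(5000)]
      Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
      scores = clf.predict_proba(Xte)[:, 1]
      print("TPR at 1% FPR:", tpr_at_fpr(yte, scores, 0.01))
      print("TPR at 5% FPR:", tpr_at_fpr(yte, scores, 0.05))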

  1. Sensor Anomaly Detection in Wireless Sensor Networks for Healthcare

    PubMed Central

    Haque, Shah Ahsanul; Rahman, Mustafizur; Aziz, Syed Mahfuzul

    2015-01-01

    Wireless Sensor Networks (WSN) are vulnerable to various sensor faults and faulty measurements. This vulnerability hinders efficient and timely response in various WSN applications, such as healthcare. For example, faulty measurements can create false alarms which may require unnecessary intervention from healthcare personnel. Therefore, an approach to differentiate between real medical conditions and false alarms will improve remote patient monitoring systems and quality of healthcare service afforded by WSN. In this paper, a novel approach is proposed to detect sensor anomaly by analyzing collected physiological data from medical sensors. The objective of this method is to effectively distinguish false alarms from true alarms. It predicts a sensor value from historic values and compares it with the actual sensed value for a particular instance. The difference is compared against a threshold value, which is dynamically adjusted, to ascertain whether the sensor value is anomalous. The proposed approach has been applied to real healthcare datasets and compared with existing approaches. Experimental results demonstrate the effectiveness of the proposed system, providing high Detection Rate (DR) and low False Positive Rate (FPR). PMID:25884786
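
    A minimal sketch of the predict-and-compare scheme described above, assuming a simple moving-average predictor and a threshold scaled by recent variability; the paper's actual predictor and threshold-adjustment rule are not specified here.

      # Sketch: predict each sensor value from its recent history, compare with the
      # actual reading, and flag an anomaly when the deviation exceeds a dynamically
      # adjusted threshold. Window size and multiplier are illustrative.
      import numpy as np

      def detect_anomalies(values, window=10, k=3.0):
          """Return indices flagged as anomalous sensor readings."""
          flags = []
          for i in range(window, len(values)):
              hist = values[i - window:i]
              predicted = hist.mean()                  # simple historical predictor
              threshold = k * max(hist.std(), 1e-6)    # threshold adapts to recent variability
              if abs(values[i] - predicted) > threshold:
                  flags.append(i)
          return flags

      # Example: a heart-rate-like series with one faulty spike.
      rng = np.random.default_rng(0)
      hr = 75 + rng.normal(0, 1.5, 200)
      hr[120] = 140                                    # faulty measurement
      print(detect_anomalies(hr))                      # expected to include index 120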

  2. Fusion of KLMS and blob based pre-screener for buried landmine detection using ground penetrating radar

    NASA Astrophysics Data System (ADS)

    Baydar, Bora; Akar, Gözde Bozdaǧi.; Yüksel, Seniha E.; Öztürk, Serhat

    2016-05-01

    In this paper, a decision level fusion using multiple pre-screener algorithms is proposed for the detection of buried landmines from Ground Penetrating Radar (GPR) data. The Kernel Least Mean Square (KLMS) and the Blob Filter pre-screeners are fused together to work in real time with fewer false alarms and higher true detection rates. The effect of the kernel variance is investigated for the KLMS algorithm. Also, the results of the KLMS and KLMS+Blob filter algorithms are compared to the LMS method in terms of processing time and false alarm rates. The proposed algorithm is tested on both simulated data and real data collected at the field of IPA Defence at METU, Ankara, Turkey.

  3. PRESAGE: Protecting Structured Address Generation against Soft Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram

    Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and low computational overhead. Unfortunately, efficient detectors to detect faults during address generation (to index large arrays) have not been widely researched. We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that propagates an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Enabling the flow of errors allows one to situate detectors at loop exit points, and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.

  4. PRESAGE: Protecting Structured Address Generation against Soft Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram

    Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and low computational overhead. Unfortunately, efficient detectors to detect faults during address generation have not been widely researched (especially in the context of indexing large arrays). We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that propagates an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Ensuring the propagation of errors allows one to place detectors at loop exit points and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.

  5. Using Dictionary Pair Learning for Seizure Detection.

    PubMed

    Ma, Xin; Yu, Nana; Zhou, Weidong

    2018-02-13

    Automatic seizure detection is extremely important in the monitoring and diagnosis of epilepsy. The paper presents a novel method based on dictionary pair learning (DPL) for seizure detection in long-term intracranial electroencephalogram (EEG) recordings. First, wavelet filtering and differential filtering are applied to the EEG data, and a kernel function is used to make the signal linearly separable. In DPL, the synthesis dictionary and analysis dictionary are learned jointly from the original training samples with an alternating minimization method, and sparse coefficients are obtained by linear projection instead of costly l0-norm or l1-norm optimization. Finally, the reconstruction residuals associated with the seizure and nonseizure sub-dictionary pairs are calculated as the decision values, and postprocessing is performed to improve the recognition rate and reduce the false detection rate of the system. A total of 530 h of recordings from 20 patients with 81 seizures were used to evaluate the system. Our proposed method achieved an average segment-based sensitivity of 93.39%, specificity of 98.51%, and event-based sensitivity of 96.36% with a false detection rate of 0.236/h.

  6. Application of data fusion technology based on D-S evidence theory in fire detection

    NASA Astrophysics Data System (ADS)

    Cai, Zhishan; Chen, Musheng

    2015-12-01

    Judgment and identification based on single fire characteristic parameter information in fire detection is subject to environmental disturbances, and accordingly its detection performance is limited with the increase of false positive rate and false negative rate. The compound fire detector employs information fusion technology to judge and identify multiple fire characteristic parameters in order to improve the reliability and accuracy of fire detection. The D-S evidence theory is applied to the multi-sensor data-fusion: first normalize the data from all sensors to obtain the normalized basic probability function of the fire occurrence; then conduct the fusion processing using the D-S evidence theory; finally give the judgment results. The results show that the method meets the goal of accurate fire signal identification and increases the accuracy of fire alarm, and therefore is simple and effective.
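
    A small sketch of the fusion step, assuming two sensors and a two-element frame of discernment {fire, no_fire}; the basic probability assignments are invented for illustration, and only Dempster's rule of combination itself is taken from the theory the paper applies.

      # Dempster's rule of combination for two fire-detection sensors.
      # Mass values below are toy numbers, not measurements from the paper.
      from itertools import product

      def dempster_combine(m1, m2):
          """Combine two mass functions whose focal sets are frozensets."""
          combined, conflict = {}, 0.0
          for (a, wa), (b, wb) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + wa * wb
              else:
                  conflict += wa * wb
          return {s: w / (1.0 - conflict) for s, w in combined.items()}

      FIRE, NOFIRE = frozenset({"fire"}), frozenset({"no_fire"})
      THETA = FIRE | NOFIRE   # mass on the whole frame represents ignorance

      smoke_sensor = {FIRE: 0.6, NOFIRE: 0.1, THETA: 0.3}
      temp_sensor  = {FIRE: 0.7, NOFIRE: 0.1, THETA: 0.2}
      print(dempster_combine(smoke_sensor, temp_sensor))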

  7. The Use of Meteorological Data to Improve Contrail Detection in Thermal Imagery over Ireland.

    NASA Technical Reports Server (NTRS)

    Whelan, Gillian M.; Cawkwell, Fiona; Mannstein, Hermann; Minnis, Patrick

    2009-01-01

    Aircraft-induced contrails have been found to have a net warming influence on the climate system, with strong regional dependence. Persistent linear contrails are detectable in 1 km thermal imagery and, using an automated Contrail Detection Algorithm (CDA), can be identified on the basis of their different properties at the 11 and 12 μm wavelengths. The algorithm's ability to distinguish contrails from other linear features depends on the sensitivity of its tuning parameters. In order to keep the number of false identifications low, the algorithm imposes strict limits on contrail size, linearity and intensity. This paper investigates whether including additional information (i.e., meteorological data) within the CDA may allow these criteria to be less rigorous, thus increasing the contrail-detection rate without increasing the false alarm rate.

  8. Deep belief networks for false alarm rejection in forward-looking ground-penetrating radar

    NASA Astrophysics Data System (ADS)

    Becker, John; Havens, Timothy C.; Pinar, Anthony; Schulz, Timothy J.

    2015-05-01

    Explosive hazards are one of the most deadly threats in modern conflicts. The U.S. Army is interested in a reliable way to detect these hazards at range. A promising way of accomplishing this task is using a forward-looking ground-penetrating radar (FLGPR) system. Recently, the Army has been testing a system that utilizes both L-band and X-band radar arrays on a vehicle mounted platform. Using data from this system, we sought to improve the performance of a constant false-alarm-rate (CFAR) prescreener through the use of a deep belief network (DBN). DBNs have also been shown to perform exceptionally well at generalized anomaly detection. They combine unsupervised pre-training with supervised fine-tuning to generate low-dimensional representations of high-dimensional input data. We seek to take advantage of these two properties by training a DBN on the features of the CFAR prescreener's false alarms (FAs) and then use that DBN to separate FAs from true positives. Our analysis shows that this method improves the detection statistics significantly. By training the DBN on a combination of image features, we were able to significantly increase the probability of detection while maintaining a nominal number of false alarms per square meter. Our research shows that DBNs are a good candidate for improving detection rates in FLGPR systems.

  9. Looped back fiber mode for reduction of false alarm in leak detection using distributed optical fiber sensor.

    PubMed

    Chelliah, Pandian; Murgesan, Kasinathan; Samvel, Sosamma; Chelamchala, Babu Rao; Tammana, Jayakumar; Nagarajan, Murali; Raj, Baldev

    2010-07-10

    Optical-fiber-based sensors have inherent advantages, such as immunity to electromagnetic interference, compared to the conventional sensors. Distributed optical fiber sensor (DOFS) systems, such as Raman and Brillouin distributed temperature sensors are used for leak detection. The inherent noise of fiber-based systems leads to occasional false alarms. In this paper, a methodology is proposed to overcome this. This uses a looped back fiber mode in DOFS and voting logic is employed to considerably reduce the false alarm rate.

  10. Fusion of ultrasonic and infrared signatures for personnel detection by a mobile robot

    NASA Astrophysics Data System (ADS)

    Carroll, Matthew S.; Meng, Min; Cadwallender, William K.

    1992-04-01

    Passive infrared sensors used for intrusion detection, especially those used on mobile robots, are vulnerable to false alarms caused by clutter objects such as radiators, steam pipes, and windows, as well as false alarms deliberately caused by decoy objects. To overcome these sources of false alarms, we are now combining thermal and ultrasonic signals, the result being a more robust system for detecting personnel. Our paper will discuss the fusion strategies used for combining sensor information. Our first strategy uses a statistical classifier with features such as the sonar cross-section, the received thermal energy, and ultrasonic range. Our second strategy uses a 3-layered neural classifier trained by backpropagation. The probability of correct classification and the false alarm rate for both strategies will be presented in the paper.

  11. Double versus single reading of mammograms in a breast cancer screening programme: a cost-consequence analysis.

    PubMed

    Posso, Margarita C; Puig, Teresa; Quintana, Ma Jesus; Solà-Roca, Judit; Bonfill, Xavier

    2016-09-01

    To assess the costs and health-related outcomes of double versus single reading of digital mammograms in a breast cancer screening programme. Based on data from 57,157 digital screening mammograms from women aged 50-69 years, we compared costs, false-positive results, positive predictive value and cancer detection rate using four reading strategies: double reading with and without consensus and arbitration, and single reading with first reader only and second reader only. Four highly trained radiologists read the mammograms. Double reading with consensus and arbitration was 15 % (Euro 334,341) more expensive than single reading with first reader only. False-positive results were more frequent at double reading with consensus and arbitration than at single reading with first reader only (4.5 % and 4.2 %, respectively; p < 0.001). The positive predictive value (9.3 % and 9.1 %; p = 0.812) and cancer detection rate were similar for both reading strategies (4.6 and 4.2 per 1000 screens; p = 0.283). Our results suggest that changing to single reading of mammograms could produce savings in breast cancer screening. Single reading could reduce the frequency of false-positive results without changing the cancer detection rate. These results are not conclusive and cannot be generalized to other contexts with less trained radiologists. • Double reading of digital mammograms is more expensive than single reading. • Compared to single reading, double reading yields a higher proportion of false-positive results. • The cancer detection rate was similar for double and single readings. • Single reading may be a cost-effective strategy in breast cancer screening programmes.

  12. Unmodeled observation error induces bias when inferring patterns and dynamics of species occurrence via aural detections

    USGS Publications Warehouse

    McClintock, Brett T.; Bailey, Larissa L.; Pollock, Kenneth H.; Simons, Theodore R.

    2010-01-01

    The recent surge in the development and application of species occurrence models has been associated with an acknowledgment among ecologists that species are detected imperfectly due to observation error. Standard models now allow unbiased estimation of occupancy probability when false negative detections occur, but this is conditional on no false positive detections and sufficient incorporation of explanatory variables for the false negative detection process. These assumptions are likely reasonable in many circumstances, but there is mounting evidence that false positive errors and detection probability heterogeneity may be much more prevalent in studies relying on auditory cues for species detection (e.g., songbird or calling amphibian surveys). We used field survey data from a simulated calling anuran system of known occupancy state to investigate the biases induced by these errors in dynamic models of species occurrence. Despite the participation of expert observers in simplified field conditions, both false positive errors and site detection probability heterogeneity were extensive for most species in the survey. We found that even low levels of false positive errors, constituting as little as 1% of all detections, can cause severe overestimation of site occupancy, colonization, and local extinction probabilities. Further, unmodeled detection probability heterogeneity induced substantial underestimation of occupancy and overestimation of colonization and local extinction probabilities. Completely spurious relationships between species occurrence and explanatory variables were also found. Such misleading inferences would likely have deleterious implications for conservation and management programs. We contend that all forms of observation error, including false positive errors and heterogeneous detection probabilities, must be incorporated into the estimation framework to facilitate reliable inferences about occupancy and its associated vital rate parameters.
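
    The bias can be illustrated with a quick simulation: fitting a standard occupancy model that accounts for false negatives but assumes no false positives to data containing a 1% per-visit false-positive probability inflates the occupancy estimate. All parameter values below are illustrative, not taken from the study.

      # Quick simulation of the bias described above (scipy-based MLE sketch).
      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import binom

      def simulate(psi, p_det, p_false, n_sites=2000, n_visits=5, seed=0):
          rng = np.random.default_rng(seed)
          occupied = rng.random(n_sites) < psi
          p = np.where(occupied, p_det, p_false)   # false positives at unoccupied sites
          counts = (rng.random((n_sites, n_visits)) < p[:, None]).sum(axis=1)
          return counts, n_visits

      def fit_occupancy(counts, n_visits):
          """MLE of (psi, p) under the usual no-false-positive occupancy model."""
          def nll(theta):
              psi, p = 1 / (1 + np.exp(-theta))    # logit parameterisation keeps values in (0, 1)
              lik = psi * binom.pmf(counts, n_visits, p) + (1 - psi) * (counts == 0)
              return -np.log(lik + 1e-300).sum()
          res = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
          return 1 / (1 + np.exp(-res.x))

      counts, n = simulate(psi=0.4, p_det=0.5, p_false=0.01)
      psi_hat, p_hat = fit_occupancy(counts, n)
      print("true psi = 0.40, estimated psi =", round(float(psi_hat), 3))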

  13. Detecting Seismic Events Using a Supervised Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Burks, L.; Forrest, R.; Ray, J.; Young, C.

    2017-12-01

    We explore the use of supervised hidden Markov models (HMMs) to detect seismic events in streaming seismogram data. Current methods for seismic event detection include simple triggering algorithms, such as STA/LTA and the Z-statistic, which can lead to large numbers of false positives that must be investigated by an analyst. The hypothesis of this study is that more advanced detection methods, such as HMMs, may decrease false positives while maintaining accuracy similar to current methods. We train a binary HMM classifier using 2 weeks of 3-component waveform data from the International Monitoring System (IMS) that was carefully reviewed by an expert analyst to pick all seismic events. Using an ensemble of simple and discrete features, such as the triggering of STA/LTA, the HMM predicts the time at which transition occurs from noise to signal. Compared to the STA/LTA detection algorithm, the HMM detects more true events, but the false positive rate remains unacceptably high. Future work to potentially decrease the false positive rate may include using continuous features, a Gaussian HMM, and multi-class HMMs to distinguish between types of seismic waves (e.g., P-waves and S-waves). Acknowledgement: Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.SAND No: SAND2017-8154 A
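
    A hedged sketch of the modelling idea (not the trained classifier from the study): a two-state noise/signal HMM over a single discrete feature such as an STA/LTA trigger, decoded with Viterbi so that the first decoded signal state marks the noise-to-signal transition. The transition and emission probabilities are invented.

      # Two-state (noise/signal) HMM with Viterbi decoding over a binary trigger feature.
      import numpy as np

      def viterbi(obs, log_pi, log_A, log_B):
          """Most likely state path for observation symbols obs (0/1)."""
          n_states, T = log_A.shape[0], len(obs)
          delta = np.zeros((T, n_states))
          back = np.zeros((T, n_states), dtype=int)
          delta[0] = log_pi + log_B[:, obs[0]]
          for t in range(1, T):
              trans = delta[t - 1][:, None] + log_A          # (from-state, to-state)
              back[t] = trans.argmax(axis=0)
              delta[t] = trans.max(axis=0) + log_B[:, obs[t]]
          path = [delta[-1].argmax()]
          for t in range(T - 1, 0, -1):
              path.append(back[t, path[-1]])
          return path[::-1]

      # States: 0 = noise, 1 = seismic signal; observation: STA/LTA trigger fired (1) or not (0).
      log_pi = np.log([0.99, 0.01])
      log_A = np.log([[0.995, 0.005],      # noise tends to stay noise
                      [0.02,  0.98]])      # a signal persists once started
      log_B = np.log([[0.95, 0.05],        # triggers are rare in noise
                      [0.30, 0.70]])       # and common during a signal
      obs = [0]*50 + [1, 0, 1, 1, 1, 1, 0, 1] + [0]*50
      states = viterbi(np.array(obs), log_pi, log_A, log_B)
      print("first detected signal sample:", states.index(1) if 1 in states else None)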

  14. Automatic near-real-time detection of CMEs in Mauna Loa K-Cor coronagraph images

    NASA Astrophysics Data System (ADS)

    Thompson, W. T.; St Cyr, O. C.; Burkepile, J.; Posner, A.

    2017-12-01

    A simple algorithm has been developed to detect the onset of coronal mass ejections (CMEs), together with an estimate of their speed, in near-real-time using images of the linearly polarized white-light solar corona taken by the K-Cor telescope at the Mauna Loa Solar Observatory (MLSO). The algorithm used is a variation on the Solar Eruptive Event Detection System (SEEDS) developed at George Mason University. The algorithm was tested against K-Cor data taken between 29 April 2014 and 20 February 2017, on days which the MLSO website marked as containing CMEs. This resulted in testing of 139 days' worth of data containing 171 CMEs. The detection rate varied from close to 80% in 2014-2015, when solar activity was high, down to as low as 20-30% in 2017, when activity was low. The difference in effectiveness with solar cycle is attributed to the difference in relative prevalence of strong CMEs between active and quiet periods. There were also twelve false detections during this time period, leading to an average false detection rate of 8.6% on any given day. However, half of the false detections were clustered into two short periods of a few days each when special conditions prevailed to increase the false detection rate. The K-Cor data were also compared with major Solar Energetic Particle (SEP) storms during this time period. There were three SEP events detected either at Earth or at one of the two STEREO spacecraft where K-Cor was observing during the relevant time period. The K-Cor CME detection algorithm successfully generated alerts for two of these events, with lead times of 1-3 hours before the SEP onset at 1 AU. The third event was not detected by the automatic algorithm because of the unusually broad width of the CME in position angle.

  15. Pulse Oximetry and Auscultation for Congenital Heart Disease Detection.

    PubMed

    Hu, Xiao-Jing; Ma, Xiao-Jing; Zhao, Qu-Ming; Yan, Wei-Li; Ge, Xiao-Ling; Jia, Bing; Liu, Fang; Wu, Lin; Ye, Ming; Liang, Xue-Cun; Zhang, Jing; Gao, Yan; Zhai, Xiao-Wen; Huang, Guo-Ying

    2017-10-01

    Pulse oximetry (POX) has been confirmed as a specific screening modality for critical congenital heart disease (CCHD), with moderate sensitivity. However, POX is not able to detect most serious and critical cardiac lesions (major congenital heart disease [CHD]) without hypoxemia. In this study, we investigated the accuracy and feasibility of the addition of cardiac auscultation to POX as a screening method for asymptomatic major CHD. A multicenter prospective observational screening study was conducted at 15 hospitals in Shanghai between July 1, 2012, and December 31, 2014. Newborns with either an abnormal POX or cardiac auscultation were defined as screen positive. All screen-positive newborns underwent further echocardiography. False-negative results were identified by clinical follow-up, parents' feedback, and telephone review. We assessed the accuracy of POX plus cardiac auscultation for the detection of major CHD. CHD screening was completed in all 15 hospitals, with a screening rate of 94.0% to 99.8%. In total, 167 190 consecutive asymptomatic newborn infants were screened, of which 203 had major CHD (44 critical and 159 serious). The sensitivity of POX plus cardiac auscultation was 95.5% (95% confidence interval 84.9%-98.7%) for CCHD and 92.1% (95% confidence interval 87.7%-95.1%) for major CHD. The false-positive rate was 1.2% for detecting CCHD and 1.1% for detecting major CHD. In our current study, we show that using POX plus cardiac auscultation significantly improved the detection rate of major CHD in the early neonatal stage, with high sensitivity and a reasonable false-positive rate. It provides strong evidence and a reliable method for neonatal CHD screening. Copyright © 2017 by the American Academy of Pediatrics.

  16. High-Speed Incoming Infrared Target Detection by Fusion of Spatial and Temporal Detectors

    PubMed Central

    Kim, Sungho

    2015-01-01

    This paper presents a method for detecting high-speed incoming targets by the fusion of spatial and temporal detectors to achieve a high detection rate for an active protection system (APS). Incoming targets have different image velocities according to the target-camera geometry. Therefore, single-detector approaches, such as a 1D temporal filter, 2D spatial filter or 3D matched filter, cannot provide a high detection rate with moderate false alarms. The target speed variation was analyzed according to the incoming angle and target velocity. A distant target at the firing time appears almost stationary in the image, and its image speed increases slowly. Such speed-varying targets are detected stably by fusing the spatial and temporal filters. The stationary target detector is activated by an almost-zero temporal contrast filter (TCF) value and identifies targets using a spatial filter called the modified mean subtraction filter (M-MSF). A small-motion (sub-pixel velocity) target detector is activated by a small TCF value and finds targets using the same spatial filter. A large-motion (pixel-velocity) target detector works when the TCF value is high. The final target detection decision is made by fusing the three detectors based on threat priority. The experimental results on various target sequences show that the proposed fusion-based target detector produces the highest detection rate with an acceptable false alarm rate. PMID:25815448

  17. Two stage algorithm vs commonly used approaches for the suspect screening of complex environmental samples analyzed via liquid chromatography high resolution time of flight mass spectroscopy: A test study.

    PubMed

    Samanipour, Saer; Baz-Lomba, Jose A; Alygizakis, Nikiforos A; Reid, Malcolm J; Thomaidis, Nikolaos S; Thomas, Kevin V

    2017-06-09

    LC-HR-QTOF-MS recently has become a commonly used approach for the analysis of complex samples. However, identification of small organic molecules in complex samples with the highest level of confidence is a challenging task. Here we report on the implementation of a two stage algorithm for LC-HR-QTOF-MS datasets. We compared the performances of the two stage algorithm, implemented via NIVA_MZ_Analyzer™, with two commonly used approaches (i.e. feature detection and XIC peak picking, implemented via UNIFI by Waters and TASQ by Bruker, respectively) for the suspect analysis of four influent wastewater samples. We first evaluated the cross platform compatibility of LC-HR-QTOF-MS datasets generated via instruments from two different manufacturers (i.e. Waters and Bruker). Our data showed that with an appropriate spectral weighting function the spectra recorded by the two tested instruments are comparable for our analytes. As a consequence, we were able to perform full spectral comparison between the data generated via the two studied instruments. Four extracts of wastewater influent were analyzed for 89 analytes, thus 356 detection cases. The analytes were divided into 158 detection cases of artificial suspect analytes (i.e. verified by target analysis) and 198 true suspects. The two stage algorithm resulted in a zero rate of false positive detection, based on the artificial suspect analytes while producing a rate of false negative detection of 0.12. For the conventional approaches, the rates of false positive detection varied between 0.06 for UNIFI and 0.15 for TASQ. The rates of false negative detection for these methods ranged between 0.07 for TASQ and 0.09 for UNIFI. The effect of background signal complexity on the two stage algorithm was evaluated through the generation of a synthetic signal. We further discuss the boundaries of applicability of the two stage algorithm. The importance of background knowledge and experience in evaluating the reliability of results during the suspect screening was evaluated. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Optimizing the TESS Planet Finding Pipeline

    NASA Astrophysics Data System (ADS)

    Chitamitara, Aerbwong; Smith, Jeffrey C.; Tenenbaum, Peter; TESS Science Processing Operations Center

    2017-10-01

    The Transiting Exoplanet Survey Satellite (TESS) is a new NASA all-sky planet-finding survey that will observe stars within 200 light years that are 10-100 times brighter than those observed by the highly successful Kepler mission. TESS is expected to detect ~1000 planets smaller than Neptune and dozens of Earth-size planets. As in the Kepler mission, the Science Processing Operations Center (SPOC) processing pipeline at NASA Ames Research Center is tasked with calibrating the raw pixel data, generating systematic-error-corrected light curves, and then detecting and validating transit signals. The Transiting Planet Search (TPS) component of the pipeline must be modified and tuned for the new data characteristics of TESS. For example, because each sector is viewed for as little as 28 days, the pipeline will identify transiting planets based on a minimum of two transit signals rather than three, as in the Kepler mission. This may result in a significantly higher false positive rate. The study presented here measures the detection efficiency of the TESS pipeline using simulated data. Transiting planets identified by TPS are compared to transiting planets from the simulated transit model using the measured epochs, periods, transit durations and the expected detection statistic of injected transit signals (expected MES). From these comparisons, the recovery and false positive rates of TPS are measured. Measurements of recovery in TPS are then used to adjust TPS configuration parameters to maximize the planet recovery rate and minimize false detections. The improvements in recovery rate between the initial TPS configuration and various adjustments will be presented and discussed.
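
    A rough sketch of the injection-recovery bookkeeping described above: a TPS detection counts as recovered when its period and epoch agree with an injected signal within tolerances, and the remainder are false detections. The record layout and tolerance values are assumptions, not SPOC pipeline parameters.

      # Illustrative injection-recovery matching; tolerances and dict keys are assumptions.
      def match_detections(injected, detected, period_tol=0.01, epoch_tol=0.1):
          """injected/detected: lists of dicts with 'period' (days) and 'epoch' (days)."""
          recovered, matched = 0, set()
          for det in detected:
              for i, inj in enumerate(injected):
                  if i in matched:
                      continue
                  if (abs(det["period"] - inj["period"]) < period_tol * inj["period"]
                          and abs(det["epoch"] - inj["epoch"]) < epoch_tol):
                      recovered += 1
                      matched.add(i)
                      break
          recovery_rate = recovered / len(injected) if injected else 0.0
          false_rate = (len(detected) - recovered) / len(detected) if detected else 0.0
          return recovery_rate, false_rate

      injected = [{"period": 3.21, "epoch": 1325.4}, {"period": 12.8, "epoch": 1330.1}]
      detected = [{"period": 3.209, "epoch": 1325.43}, {"period": 7.7, "epoch": 1327.0}]
      print(match_detections(injected, detected))   # (0.5, 0.5)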

  19. Dynamic sample size detection in learning command line sequence for continuous authentication.

    PubMed

    Traore, Issa; Woungang, Isaac; Nakkabi, Youssef; Obaidat, Mohammad S; Ahmed, Ahmed Awad E; Khalilian, Bijan

    2012-10-01

    Continuous authentication (CA) consists of authenticating the user repetitively throughout a session with the goal of detecting and protecting against session hijacking attacks. While the accuracy of the detector is central to the success of CA, the detection delay or length of an individual authentication period is important as well since it is a measure of the window of vulnerability of the system. However, high accuracy and small detection delay are conflicting requirements that need to be balanced for optimum detection. In this paper, we propose the use of sequential sampling technique to achieve optimum detection by trading off adequately between detection delay and accuracy in the CA process. We illustrate our approach through CA based on user command line sequence and naïve Bayes classification scheme. Experimental evaluation using the Greenberg data set yields encouraging results consisting of a false acceptance rate (FAR) of 11.78% and a false rejection rate (FRR) of 1.33%, with an average command sequence length (i.e., detection delay) of 37 commands. When using the Schonlau (SEA) data set, we obtain FAR = 4.28% and FRR = 12%.
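
    A minimal sketch of sequential sampling on command-line data, assuming per-command probabilities for the legitimate user and a generic impostor model (a naive Bayes simplification) and Wald SPRT boundaries derived from target FAR/FRR; the toy probabilities are not from the Greenberg or SEA data sets.

      # SPRT-style sequential decision over a command stream; values are toy examples.
      import math

      def sprt_authenticate(commands, p_user, p_impostor, far=0.01, frr=0.05, floor=1e-4):
          """Return ('accept'|'reject'|'undecided', number of commands consumed)."""
          upper = math.log((1 - frr) / far)      # cross above -> accept as legitimate user
          lower = math.log(frr / (1 - far))      # cross below -> reject (possible hijack)
          llr = 0.0
          for n, cmd in enumerate(commands, start=1):
              llr += math.log(p_user.get(cmd, floor) / p_impostor.get(cmd, floor))
              if llr >= upper:
                  return "accept", n
              if llr <= lower:
                  return "reject", n
          return "undecided", len(commands)

      p_user = {"ls": 0.30, "cd": 0.25, "vim": 0.20, "make": 0.15, "git": 0.10}
      p_impostor = {"ls": 0.20, "cd": 0.20, "cat": 0.20, "scp": 0.20, "wget": 0.20}
      print(sprt_authenticate(["ls", "vim", "make", "git", "ls", "cd", "vim"], p_user, p_impostor))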

  20. Variation in the standard deviation of the lure rating distribution: Implications for estimates of recollection probability.

    PubMed

    Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin

    2017-10-01

    In word recognition semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection (DPSD). With an additional parameter, for the probability of false (lure) recollection the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.

  1. False recognition of facial expressions of emotion: causes and implications.

    PubMed

    Fernández-Dols, José-Miguel; Carrera, Pilar; Barchard, Kimberly A; Gacitua, Marta

    2008-08-01

    This article examines the importance of semantic processes in the recognition of emotional expressions, through a series of three studies on false recognition. The first study found a high frequency of false recognition of prototypical expressions of emotion when participants viewed slides and video clips of nonprototypical fearful and happy expressions. The second study tested whether semantic processes caused false recognition. The authors found that participants made significantly higher error rates when asked to detect expressions that corresponded to semantic labels than when asked to detect visual stimuli. Finally, given that previous research reported that false memories are less prevalent in younger children, the third study tested whether false recognition of prototypical expressions increased with age. The authors found that 67% of eight- to nine-year-old children reported nonpresent prototypical expressions of fear in a fearful context, but only 40% of 6- to 7-year-old children did so. Taken together, these three studies demonstrate the importance of semantic processes in the detection and categorization of prototypical emotional expressions.

  2. Costas loop lock detection in the advanced receiver

    NASA Technical Reports Server (NTRS)

    Mileant, A.; Hinedi, S.

    1989-01-01

    The advanced receiver currently being developed uses a Costas digital loop to demodulate the subcarrier. Previous analyses of lock detector algorithms for Costas loops have ignored the effects of the inherent correlation between the samples of the phase-error process. Accounting for this correlation is necessary to achieve the desired lock-detection probability for a given false-alarm rate. Both analysis and simulations are used to quantify the effects of phase correlation on lock detection for the square-law and the absolute-value type detectors. Results are obtained which depict the lock-detection probability as a function of loop signal-to-noise ratio for a given false-alarm rate. The mathematical model and computer simulation show that the square-law detector experiences less degradation due to phase jitter than the absolute-value detector and that the degradation in detector signal-to-noise ratio is more pronounced for square-wave than for sine-wave signals.
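
    An illustrative sketch of a square-law lock detector with a threshold set for a target false-alarm rate; it deliberately ignores the phase-sample correlation that the paper analyses, and all signal parameters are assumptions.

      # Square-law lock statistic sum(I^2 - Q^2) compared against a Monte-Carlo
      # threshold drawn from the noise-only (out-of-lock) distribution.
      import numpy as np

      def lock_statistic(i_samples, q_samples):
          return np.sum(i_samples**2 - q_samples**2)

      def threshold_for_pfa(sigma, n, pfa, trials=20_000, seed=0):
          """Out of lock, I and Q are modelled as i.i.d. Gaussian noise."""
          rng = np.random.default_rng(seed)
          stats = (rng.normal(0, sigma, (trials, n))**2
                   - rng.normal(0, sigma, (trials, n))**2).sum(axis=1)
          return np.quantile(stats, 1.0 - pfa)

      rng = np.random.default_rng(1)
      sigma, n, amp = 1.0, 256, 1.0
      thr = threshold_for_pfa(sigma, n, pfa=1e-2)

      # In lock, the carrier power sits (mostly) on the I arm.
      i_lock = amp + rng.normal(0, sigma, n)
      q_lock = rng.normal(0, sigma, n)
      print("threshold:", round(thr, 1), " statistic in lock:", round(lock_statistic(i_lock, q_lock), 1))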

  3. Suspicious Behavior Detection System for an Open Space Parking Based on Recognition of Human Elemental Actions

    NASA Astrophysics Data System (ADS)

    Inomata, Teppei; Kimura, Kouji; Hagiwara, Masafumi

    Studies of video surveillance applications for preventing various crimes such as stealing and violence have become a hot topic. This paper proposes a new video surveillance system, based on image processing, that can detect suspicious behaviors such as a car break-in and vandalism in an open-space parking lot. The proposed system has the following features: it 1) deals with a time-series data flow, 2) recognizes “human elemental actions” using statistical features, and 3) detects suspicious behavior using the subspace method and AdaBoost. We conducted experiments to test the performance of the proposed system using open-space parking scenes. As a result, we obtained a false positive rate of about 10.0% and a false negative rate of about 4.6%.

  4. Contamination Event Detection with Multivariate Time-Series Data in Agricultural Water Monitoring †

    PubMed Central

    Mao, Yingchi; Qi, Hai; Ping, Ping; Li, Xiaofang

    2017-01-01

    Time series data of multiple water quality parameters are obtained from the water sensor networks deployed in the agricultural water supply network. The accurate and efficient detection and warning of contamination events to prevent pollution from spreading is one of the most important issues when pollution occurs. In order to comprehensively reduce the event detection deviation, a spatial–temporal-based event detection approach with multivariate time-series data for water quality monitoring (M-STED) was proposed. The M-STED approach includes three parts. The first part is that M-STED adopts a Rule K algorithm to select backbone nodes as the nodes in the CDS, and forward the sensed data of multiple water parameters. The second part is to determine the state of each backbone node with back propagation neural network models and the sequential Bayesian analysis in the current timestamp. The third part is to establish a spatial model with Bayesian networks to estimate the state of the backbones in the next timestamp and trace the “outlier” node to its neighborhoods to detect a contamination event. The experimental results indicate that the average detection rate is more than 80% with M-STED and the false detection rate is lower than 9%, respectively. The M-STED approach can improve the rate of detection by about 40% and reduce the false alarm rate by about 45%, compared with the event detection with a single water parameter algorithm, S-STED. Moreover, the proposed M-STED can exhibit better performance in terms of detection delay and scalability. PMID:29207535

  5. Automatic mouse ultrasound detector (A-MUD): A new tool for processing rodent vocalizations

    PubMed Central

    Reitschmidt, Doris; Noll, Anton; Balazs, Peter; Penn, Dustin J.

    2017-01-01

    House mice (Mus musculus) emit complex ultrasonic vocalizations (USVs) during social and sexual interactions, which have features similar to bird song (i.e., they are composed of several different types of syllables, uttered in succession over time to form a pattern of sequences). Manually processing complex vocalization data is time-consuming and potentially subjective, and therefore, we developed an algorithm that automatically detects mouse ultrasonic vocalizations (Automatic Mouse Ultrasound Detector or A-MUD). A-MUD is a script that runs on STx acoustic software (S_TOOLS-STx version 4.2.2), which is free for scientific use. This algorithm improved the efficiency of processing USV files, as it was 4–12 times faster than manual segmentation, depending upon the size of the file. We evaluated A-MUD error rates using manually segmented sound files as a ‘gold standard’ reference, and compared them to a commercially available program. A-MUD had lower error rates than the commercial software, as it detected significantly more correct positives, and fewer false positives and false negatives. The errors generated by A-MUD were mainly false negatives, rather than false positives. This study is the first to systematically compare error rates for automatic ultrasonic vocalization detection methods, and A-MUD and subsequent versions will be made available for the scientific community. PMID:28727808

  6. Mixed group validation: a method to address the limitations of criterion group validation in research on malingering detection.

    PubMed

    Frederick, R I

    2000-01-01

    Mixed group validation (MGV) is offered as an alternative to criterion group validation (CGV) to estimate the true positive and false positive rates of tests and other diagnostic signs. CGV requires perfect confidence about each research participant's status with respect to the presence or absence of pathology. MGV determines diagnostic efficiencies based on group data; knowing an individual's status with respect to pathology is not required. MGV can use relatively weak indicators to validate better diagnostic signs, whereas CGV requires perfect diagnostic signs to avoid error in computing true positive and false positive rates. The process of MGV is explained, and a computer simulation demonstrates the soundness of the procedure. MGV of the Rey 15-Item Memory Test (Rey, 1958) for 723 pre-trial criminal defendants resulted in higher estimates of true positive rates and lower estimates of false positive rates as compared with prior research conducted with CGV. The author demonstrates how MGV addresses all the criticisms Rogers (1997b) outlined for differential prevalence designs in malingering detection research. Copyright 2000 John Wiley & Sons, Ltd.
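
    The core arithmetic of MGV can be sketched as a small linear system: if two groups have different (estimated) base rates of the condition and the sign's observed positive rate is recorded in each, the true positive and false positive rates follow by solving two equations. The numbers below are invented for illustration.

      # Solve p_i = b_i*TPR + (1 - b_i)*FPR for (TPR, FPR) from two mixed groups.
      import numpy as np

      def mixed_group_validation(base_rates, positive_rates):
          b = np.asarray(base_rates, dtype=float)
          p = np.asarray(positive_rates, dtype=float)
          A = np.column_stack([b, 1.0 - b])
          tpr, fpr = np.linalg.solve(A, p)
          return tpr, fpr

      # Group 1: 70% presumed malingerers, 62% fail the sign; group 2: 20% presumed, 27% fail.
      tpr, fpr = mixed_group_validation([0.70, 0.20], [0.62, 0.27])
      print(f"estimated TPR = {tpr:.2f}, FPR = {fpr:.2f}")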

  7. [Exclusive use of blue dye to detect sentinel lymph nodes in breast cancer].

    PubMed

    Bühler H, Simón; Rojas P, Hugo; Cayazzo M, Daniela; Cunill C, Eduardo; Vesperinas A, Gonzalo; Hamilton S, James

    2008-08-01

    The use of a dye and radiocolloid to detect sentinel lymph nodes in breast cancer increases the detection rate. However, the use of either method alone does not modify the false negative rate; therefore, there is no formal contraindication to the exclusive use of dye to detect nodes. To report a prospective analysis of the exclusive blue dye technique for sentinel node biopsy in patients with early breast cancer. We analyzed the first 100 women with pathologically proven breast cancer who met the inclusion criteria. Patent blue dye was used as the colorant. In the first 25 cases the sentinel node was identified using radiocolloid and blue dye and then an axillary dissection was performed. In the next 25 women, blue dye was used exclusively for detection and an axillary dissection was performed. In the next 50 cases, blue dye was used and only an isolated sentinel node biopsy was performed. In 92 of the 100 women a sentinel node was successfully detected. In the first 50 women, the false negative rate of sentinel lymph node detection was 6.9%. No complications occurred. During follow-up, lasting three to 29 months, no axillary relapse was observed. Sentinel node biopsy in patients with early breast cancer using exclusively blue dye is feasible and safe.

  8. Coincidence and covariance data acquisition in photoelectron and -ion spectroscopy. II. Analysis and applications

    NASA Astrophysics Data System (ADS)

    Mikosch, Jochen; Patchkovskii, Serguei

    2013-10-01

    We use an analytical theory of noisy Poisson processes, developed in the preceding companion publication, to compare coincidence and covariance measurement approaches in photoelectron and -ion spectroscopy. For non-unit detection efficiencies, coincidence data acquisition (DAQ) suffers from false coincidences. The rate of false coincidences grows quadratically with the rate of elementary ionization events. To minimize false coincidences for rare event outcomes, very low event rates may hence be required. Coincidence measurements exhibit high tolerance to noise introduced by unstable experimental conditions. Covariance DAQ on the other hand is free of systematic errors as long as stable experimental conditions are maintained. In the presence of noise, all channels in a covariance measurement become correlated. Under favourable conditions, covariance DAQ may allow orders of magnitude reduction in measurement times. Finally, we use experimental data for strong-field ionization of 1,3-butadiene to illustrate how fluctuations in experimental conditions can contaminate a covariance measurement, and how such contamination can be detected.
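
    The quadratic growth of false coincidences with event rate can be illustrated with a small Poisson simulation; the per-shot detection efficiencies and the simplified definition of a false coincidence (any registered coincidence in a shot with more than one ionization event) are assumptions made for illustration.

      # Poisson simulation: false coincidences grow roughly quadratically with the
      # mean number of ionization events per shot when detection efficiencies are < 1.
      import numpy as np

      def coincidence_rates(mean_events, eff_e=0.6, eff_i=0.5, shots=200_000, seed=0):
          rng = np.random.default_rng(seed)
          events = rng.poisson(mean_events, shots)
          electrons = rng.binomial(events, eff_e)
          ions = rng.binomial(events, eff_i)
          # A coincidence is registered when at least one electron AND one ion arrive;
          # here it is counted as (potentially) false whenever >1 event occurred that shot.
          registered = (electrons > 0) & (ions > 0)
          false_reg = registered & (events > 1)
          return registered.mean(), false_reg.mean()

      for rate in (0.05, 0.1, 0.2, 0.4):
          total, false = coincidence_rates(rate)
          print(f"events/shot={rate:.2f}  coincidences/shot={total:.4f}  false/shot={false:.5f}")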

  9. Environmental Influences on Diel Calling Behavior in Baleen Whales

    DTIC Science & Technology

    2013-09-30

    ... to allow known calls (e.g., right whale upcall and gunshot, fin whale 20-Hz pulses, humpback whale downsweeps, sei whale low-frequency downsweeps) ... fin, humpback, sei, and North Atlantic right whales. Real-time detections were evaluated after recovery of the gliders by (1) comparing the acoustic ... from both an aircraft and ship. The overall false detection rate for individual calls was 14%, and for right, humpback, and fin whales, false ...

  10. 40 CFR 265.222 - Action leakage rate.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to § 265.221(a). The action leakage rate is the maximum design flow rate that the leak detection... Surface Impoundments, § 265.222 Action leakage rate. (a) The owner or operator of surface...

  11. Expert system constant false alarm rate processor

    NASA Astrophysics Data System (ADS)

    Baldygo, William J., Jr.; Wicks, Michael C.

    1993-10-01

    The requirements for high detection probability and low false alarm probability in modern wide area surveillance radars are rarely met due to spatial variations in clutter characteristics. Many filtering and CFAR detection algorithms have been developed to effectively deal with these variations; however, any single algorithm is likely to exhibit excessive false alarms and intolerably low detection probabilities in a dynamically changing environment. A great deal of research has led to advances in the state of the art in Artificial Intelligence (AI) and numerous areas have been identified for application to radar signal processing. The approach suggested here, discussed in a patent application submitted by the authors, is to intelligently select the filtering and CFAR detection algorithms being executed at any given time, based upon the observed characteristics of the interference environment. This approach requires sensing the environment, employing the most suitable algorithms, and applying an appropriate multiple algorithm fusion scheme or consensus algorithm to produce a global detection decision.
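
    For concreteness, here is a sketch of one candidate algorithm such an expert system might select among: a textbook one-dimensional cell-averaging CFAR detector. It is an illustration only, not the processor described in the paper; the training/guard sizes and false alarm probability are arbitrary.

      import numpy as np

      def ca_cfar(power, num_train=16, num_guard=2, pfa=1e-4):
          """Cell-averaging CFAR on a 1-D power profile.

          For each cell under test, the noise level is estimated from num_train
          training cells on each side (excluding num_guard guard cells); the
          threshold multiplier alpha follows the classic CA-CFAR scaling for
          exponentially distributed noise."""
          n = 2 * num_train
          alpha = n * (pfa ** (-1.0 / n) - 1.0)
          half = num_train + num_guard
          detections = np.zeros_like(power, dtype=bool)
          for i in range(half, len(power) - half):
              lead = power[i - half : i - num_guard]
              lag = power[i + num_guard + 1 : i + half + 1]
              noise = (lead.sum() + lag.sum()) / n
              detections[i] = power[i] > alpha * noise
          return detections

      # Example: exponential clutter with two injected targets.
      rng = np.random.default_rng(1)
      x = rng.exponential(1.0, 512)
      x[100] += 20.0
      x[300] += 15.0
      print(np.flatnonzero(ca_cfar(x)))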

  12. Acoustic detection of cracks in the anvil of a large-volume cubic high-pressure apparatus

    NASA Astrophysics Data System (ADS)

    Yan, Zhaoli; Chen, Bin; Tian, Hao; Cheng, Xiaobin; Yang, Jun

    2015-12-01

    A large-volume cubic high-pressure apparatus with three pairs of tungsten carbide anvils is the most popular device for synthetic diamond production. Currently, the consumption of anvils is one of the important costs for the diamond production industry. If one of the anvils is fractured during the production process, the other five anvils in the apparatus may be endangered as a result of a sudden loss of pressure. It is of critical importance to detect and replace cracked anvils before they fracture, both to reduce the cost of diamond production and to ensure safety. An acoustic detection method is studied in this paper. Two new features, nested power spectrum centroid and modified power spectrum variance, are proposed and combined with linear prediction coefficients to construct a feature vector. A support vector machine model is trained for classification. A sliding time window is proposed for decision-level information fusion. The experiments and analysis show that the recognition rate of anvil cracks is 95%, while the false-alarm rate is as low as 5.8 × 10⁻⁴ during a time window; this false-alarm rate indicates that at most one false alarm occurs every 2 months at a confidence level of 90%. An instrument to monitor anvil cracking was designed based on a digital signal processor and has been running for more than eight months in a diamond production field. In this time, two anvil-crack incidents occurred and were detected by the instrument correctly. In addition, no false alarms occurred.

  13. Acoustic detection of cracks in the anvil of a large-volume cubic high-pressure apparatus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Zhaoli, E-mail: zl-yan@mail.ioa.ac.cn; Tian, Hao; Cheng, Xiaobin

    2015-12-15

    A large-volume cubic high-pressure apparatus with three pairs of tungsten carbide anvils is the most popular device for synthetic diamond production. Currently, the consumption of anvils is one of the important costs for the diamond production industry. If one of the anvils is fractured during the production process, the other five anvils in the apparatus may be endangered as a result of a sudden loss of pressure. It is of critical importance to detect and replace cracked anvils before they fracture, both to reduce the cost of diamond production and to ensure safety. An acoustic detection method is studied in this paper. Two new features, nested power spectrum centroid and modified power spectrum variance, are proposed and combined with linear prediction coefficients to construct a feature vector. A support vector machine model is trained for classification. A sliding time window is proposed for decision-level information fusion. The experiments and analysis show that the recognition rate of anvil cracks is 95%, while the false-alarm rate is as low as 5.8 × 10⁻⁴ during a time window; this false-alarm rate indicates that at most one false alarm occurs every 2 months at a confidence level of 90%. An instrument to monitor anvil cracking was designed based on a digital signal processor and has been running for more than eight months in a diamond production field. In this time, two anvil-crack incidents occurred and were detected by the instrument correctly. In addition, no false alarms occurred.

  14. Pharmacovigilance data mining with methods based on false discovery rates: a comparative simulation study.

    PubMed

    Ahmed, I; Thiessard, F; Miremont-Salamé, G; Bégaud, B; Tubert-Bitter, P

    2010-10-01

    The early detection of adverse reactions caused by drugs that are already on the market is the prime concern of pharmacovigilance efforts; the methods in use for postmarketing surveillance are aimed at detecting signals pointing to potential safety concerns, on the basis of reports from health-care providers and from information available in various databases. Signal detection methods based on the estimation of false discovery rate (FDR) have recently been proposed. They address the limitation of arbitrary detection thresholds of the automatic methods in current use, including those last updated by the US Food and Drug Administration and the World Health Organization's Uppsala Monitoring Centre. We used two simulation procedures to compare the false-positive performances for three current methods: the reporting odds ratio (ROR), the information component (IC), the gamma Poisson shrinkage (GPS), and also for two FDR-based methods derived from the GPS model and Fisher's test. Large differences in FDR rates were associated with the signal-detection methods currently in use. These differences ranged from 0.01 to 12% in an analysis that was restricted to signals with at least three reports. The numbers of signals generated were also highly variable. Among fixed-size lists of signals, the FDR was lowered when the FDR-based approaches were used. Overall, the outcomes in both simulation studies suggest that improvement in effectiveness can be expected from use of the FDR-based GPS method.
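
    As a simplified stand-in for the FDR-based approaches compared in the study, the sketch below screens hypothetical drug-event 2x2 tables with a one-sided Fisher exact test and the Benjamini-Hochberg step-up rule; the methods actually evaluated (e.g., the FDR-based GPS) are more elaborate, so this only conveys the flavour of FDR-controlled signal detection.

      import numpy as np
      from scipy.stats import fisher_exact

      def bh_signals(tables, fdr=0.05):
          """Flag drug-event pairs as signals while controlling the false discovery rate.

          Each table is [[a, b], [c, d]]: reports with drug & event, drug only,
          event only, neither. P-values come from a one-sided Fisher exact test for
          over-reporting and are thresholded with the Benjamini-Hochberg procedure."""
          pvals = np.array([fisher_exact(t, alternative="greater")[1] for t in tables])
          order = np.argsort(pvals)
          m = len(pvals)
          thresh = fdr * (np.arange(1, m + 1) / m)
          passed = pvals[order] <= thresh
          k = passed.nonzero()[0].max() + 1 if passed.any() else 0
          signals = np.zeros(m, dtype=bool)
          signals[order[:k]] = True     # reject the k smallest p-values (step-up rule)
          return signals, pvals

      # Hypothetical report counts for three drug-event pairs.
      tables = [[[12, 88], [30, 4000]],
                [[3, 97], [120, 3900]],
                [[1, 99], [40, 4000]]]
      sig, p = bh_signals(tables)
      print(list(zip(p.round(4), sig)))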

  15. Detecting differentially expressed genes in heterogeneous diseases using half Student's t-test.

    PubMed

    Hsu, Chun-Lun; Lee, Wen-Chung

    2010-12-01

    Microarray technology provides information about hundreds of thousands of gene-expression measurements in a single experiment. To search for disease-related genes, researchers test for those genes that are differentially expressed between the case subjects and the control subjects. The authors propose a new test, the 'half Student's t-test', specifically for detecting differentially expressed genes in heterogeneous diseases. Monte-Carlo simulation shows that the test maintains the nominal α level quite well for both normal and non-normal distributions. Power of the half Student's t is higher than that of the conventional 'pooled' Student's t when there is heterogeneity in the disease under study. The power gain by using the half Student's t can reach ∼10% when the standard deviation of the case group is 50% larger than that of the control group. Application to a colon cancer data set reveals that when the false discovery rate (FDR) is controlled at 0.05, the half Student's t can detect 344 differentially expressed genes, whereas the pooled Student's t can detect only 65 genes. Alternatively, if only 50 genes are to be selected, the FDR for the pooled Student's t has to be set at 0.0320 (false positive rate of ∼3%), but for the half Student's t, it can be set as low as 0.0001 (false positive rate of about one per ten thousand). The half Student's t-test is to be recommended for the detection of differentially expressed genes in heterogeneous diseases.

  16. Kepler Reliability and Occurrence Rates

    NASA Astrophysics Data System (ADS)

    Bryson, Steve

    2016-10-01

    The Kepler mission has produced tables of exoplanet candidates (the "KOI table"), as well as tables of transit detections (the "TCE table"), hosted at the Exoplanet Archive (http://exoplanetarchive.ipac.caltech.edu). Transit detections in the TCE table that are plausibly due to a transiting object are selected for inclusion in the KOI table. KOI table entries that have not been identified as false positives (FPs) or false alarms (FAs) are classified as planet candidates (PCs, Mullally et al. 2015). A subset of PCs have been confirmed as planetary transits with greater than 99% probability, but most PCs have <99% probability of being true planets. The fraction of PCs that are true transiting planets is the PC reliability rate. The overall PC population is believed to have a reliability rate >90% (Morton & Johnson 2011).

  17. Hierarchical Leak Detection and Localization Method in Natural Gas Pipeline Monitoring Sensor Networks

    PubMed Central

    Wan, Jiangwen; Yu, Yang; Wu, Yinfeng; Feng, Renjian; Yu, Ning

    2012-01-01

    In light of the problems of low recognition efficiency, high false rates and poor localization accuracy in traditional pipeline security detection technology, this paper proposes a type of hierarchical leak detection and localization method for use in natural gas pipeline monitoring sensor networks. In the signal preprocessing phase, original monitoring signals are dealt with by wavelet transform technology to extract the single mode signals as well as characteristic parameters. In the initial recognition phase, a multi-classifier model based on SVM is constructed and characteristic parameters are sent as input vectors to the multi-classifier for initial recognition. In the final decision phase, an improved evidence combination rule is designed to integrate initial recognition results for final decisions. Furthermore, a weighted average localization algorithm based on time difference of arrival is introduced for determining the leak point’s position. Experimental results illustrate that this hierarchical pipeline leak detection and localization method could effectively improve the accuracy of the leak point localization and reduce the undetected rate as well as false alarm rate. PMID:22368464
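
    The localization step can be illustrated with the standard time-difference-of-arrival relation for a leak between two pipeline sensors, combined here as a weighted average over sensor pairs; the sensor positions, wave speed and weights below are hypothetical, and this is only a sketch, not the authors' algorithm.

      import numpy as np

      def locate_leak(pairs, wave_speed=340.0):
          """Weighted-average leak localization from time-difference-of-arrival data.

          Each entry in pairs is (x_a, x_b, dt, weight): positions of two sensors
          along the pipeline (metres), the arrival-time difference dt = t_a - t_b
          (seconds), and a confidence weight. For a leak between the sensors,
              x = (x_a + x_b + wave_speed * dt) / 2."""
          estimates, weights = [], []
          for x_a, x_b, dt, w in pairs:
              estimates.append((x_a + x_b + wave_speed * dt) / 2.0)
              weights.append(w)
          return float(np.average(estimates, weights=weights))

      # Hypothetical example: leak at ~350 m, sensors at 0 m / 1000 m and 0 m / 600 m,
      # acoustic wave speed 340 m/s.
      pairs = [(0.0, 1000.0, (350 - 650) / 340.0, 1.0),
               (0.0, 600.0, (350 - 250) / 340.0, 0.8)]
      print(locate_leak(pairs))   # -> 350.0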

  18. Hierarchical leak detection and localization method in natural gas pipeline monitoring sensor networks.

    PubMed

    Wan, Jiangwen; Yu, Yang; Wu, Yinfeng; Feng, Renjian; Yu, Ning

    2012-01-01

    In light of the problems of low recognition efficiency, high false rates and poor localization accuracy in traditional pipeline security detection technology, this paper proposes a type of hierarchical leak detection and localization method for use in natural gas pipeline monitoring sensor networks. In the signal preprocessing phase, original monitoring signals are dealt with by wavelet transform technology to extract the single mode signals as well as characteristic parameters. In the initial recognition phase, a multi-classifier model based on SVM is constructed and characteristic parameters are sent as input vectors to the multi-classifier for initial recognition. In the final decision phase, an improved evidence combination rule is designed to integrate initial recognition results for final decisions. Furthermore, a weighted average localization algorithm based on time difference of arrival is introduced for determining the leak point's position. Experimental results illustrate that this hierarchical pipeline leak detection and localization method could effectively improve the accuracy of the leak point localization and reduce the undetected rate as well as false alarm rate.

  19. Rapid response radiation sensors for homeland security applications

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Guss, Paul

    2014-09-01

    The National Security Technologies, LLC, Remote Sensing Laboratory is developing a rapid response radiation detection system for homeland security field applications. The intelligence-driven system is deployed only when non-radiological information about the target is verifiable. The survey area is often limited, so the detection range is small; in most cases covering a distance of 10 meters or less suffices. Definitive response is required in no more than 3 seconds and should minimize false negative alarms, but can err on the side of positive false alarms. The detection system is rapidly reconfigurable in terms of size, shape, and outer appearance; it is a plug-and-play system. Multiple radiation detection components (viz., two or more sodium iodide scintillators) are used to independently "over-determine" the existence of the threat object. Rapid response electronic dose rate meters are also included in the equipment suite. Carefully studied threat signatures are the basis of the decision making. The use of Rad-Detect predictive modeling provides information on the nature of the threat object. Rad-Detect provides accurate dose rate from heavily shielded large sources; for example those lost in Mexico were Category 1 radiation sources (~3,000 Ci of 60Co), the most dangerous of five categories defined by the International Atomic Energy Agency. Taken out of their shielding containers, Category 1 sources can kill anyone who is exposed to them at close range for a few minutes to an hour. Whenever possible sub-second data acquisition will be attempted, and, when deployed, the system will be characterized for false alarm rates. Although the radiation detection materials selected are fast (viz., faster scintillators), their speed is secondary to sensitivity, which is of primary importance. Results from these efforts will be discussed and demonstrated.

  20. Nonbleeding adenomas: Evidence of systematic false-negative fecal immunochemical test results and their implications for screening effectiveness-A modeling study.

    PubMed

    van der Meulen, Miriam P; Lansdorp-Vogelaar, Iris; van Heijningen, Else-Mariëtte B; Kuipers, Ernst J; van Ballegooijen, Marjolein

    2016-06-01

    If some adenomas do not bleed over several years, they will cause systematic false-negative fecal immunochemical test (FIT) results. The long-term effectiveness of FIT screening has been estimated without accounting for such systematic false-negativity. There are now data with which to evaluate this issue. The authors developed one microsimulation model (MISCAN [MIcrosimulation SCreening ANalysis]-Colon) without systematic false-negative FIT results and one model that allowed a percentage of adenomas to be systematically missed in successive FIT screening rounds. Both variants were adjusted to reproduce the first-round findings of the Dutch CORERO FIT screening trial. The authors then compared simulated detection rates in the second screening round with those observed, and adjusted the simulated percentage of systematically missed adenomas to those data. Finally, the authors calculated the impact of systematic false-negative FIT results on the effectiveness of repeated FIT screening. The model without systematic false-negativity simulated higher detection rates in the second screening round than observed. These observed rates could be reproduced when assuming that FIT systematically missed 26% of advanced and 73% of nonadvanced adenomas. To reduce the false-positive rate in the second round to the observed level, the authors also had to assume that 30% of false-positive findings were systematically false-positive. Systematic false-negative FIT testing limits the long-term reduction of biennial FIT screening in the incidence of colorectal cancer (35.6% vs 40.9%) and its mortality (55.2% vs 59.0%) in participants. The results of the current study provide convincing evidence based on the combination of real-life and modeling data that a percentage of adenomas are systematically missed by repeat FIT screening. This impairs the efficacy of FIT screening. Cancer 2016;122:1680-8. © 2016 American Cancer Society. © 2016 American Cancer Society.

  1. Patient-Specific Early Seizure Detection from Scalp EEG

    PubMed Central

    Minasyan, Georgiy R.; Chatten, John B.; Chatten, Martha Jane; Harner, Richard N.

    2010-01-01

    Objective Develop a method for automatic detection of seizures prior to or immediately after clinical onset using features derived from scalp EEG. Methods This detection method is patient-specific. It uses recurrent neural networks and a variety of input features. For each patient we trained and optimized the detection algorithm for two cases: 1) during the period immediately preceding seizure onset, and 2) during the period immediately following seizure onset. Continuous scalp EEG recordings (duration 15 – 62 h, median 25 h) from 25 patients, including a total of 86 seizures, were used in this study. Results Pre-onset detection was successful in 14 of the 25 patients. For these 14 patients, all of the testing seizures were detected prior to seizure onset, with a median pre-onset time of 51 sec and a false positive rate of 0.06/h. Post-onset detection had 100% sensitivity, a false positive rate of 0.023/h, and a median delay of 4 sec after onset. Conclusions The unique results of this study relate to pre-onset detection. Significance Our results suggest that reliable pre-onset seizure detection may be achievable for a significant subset of epilepsy patients without use of invasive electrodes. PMID:20461014

  2. [Full Sibling Identification by IBS Scoring Method and Establishment of the Query Table of Its Critical Value].

    PubMed

    Li, R; Li, C T; Zhao, S M; Li, H X; Li, L; Wu, R G; Zhang, C C; Sun, H Y

    2017-04-01

    To establish a query table of IBS critical value and identification power for the detection systems with different numbers of STR loci under different false judgment standards. Samples of 267 pairs of full siblings and 360 pairs of unrelated individuals were collected and 19 autosomal STR loci were genotyped by the Goldeneye™ 20A system. The full siblings were determined using the IBS scoring method according to the 'Regulation for biological full sibling testing'. The critical values and identification power for the detection systems with different numbers of STR loci under different false judgment standards were calculated by theoretical methods. According to the formal IBS scoring criteria, the identification power of full siblings and unrelated individuals was 0.7640 and the rate of false judgment was 0. The results of theoretical calculation were consistent with those of sample observation. The query table of IBS critical value for identification of full sibling detection systems with different numbers of STR loci was successfully established. The IBS scoring method defined by the regulation has high detection efficiency and a low false judgment rate, which provides a relatively conservative result. The query table of IBS critical value for identification of full sibling detection systems with different numbers of STR loci provides important reference data for interpreting the results of full sibling testing and has considerable practical value. Copyright © by the Editorial Department of Journal of Forensic Medicine
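
    A minimal sketch of identity-by-state (IBS) scoring between two STR profiles, assuming the usual convention of counting 0, 1 or 2 shared alleles per locus; the locus names and the critical value shown are placeholders, since the real cut-offs come from the paper's query table.

      def ibs_score(profile_a, profile_b):
          """Sum the number of alleles shared identical-by-state (0, 1 or 2) per locus.

          Each profile maps a locus name to a tuple of two allele designations."""
          score = 0
          for locus, alleles_a in profile_a.items():
              remaining = list(alleles_a)
              for allele in profile_b[locus]:
                  if allele in remaining:
                      remaining.remove(allele)   # count each allele at most once
                      score += 1
          return score

      # Hypothetical two-locus example; in practice 19+ autosomal STR loci are typed.
      sib1 = {"D3S1358": (15, 16), "vWA": (17, 18)}
      sib2 = {"D3S1358": (15, 17), "vWA": (17, 18)}
      score = ibs_score(sib1, sib2)
      critical_value = 3   # placeholder; the real cut-off comes from the query table
      print(score, "full siblings" if score >= critical_value else "unrelated")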

  3. Effects of Sampling and Spatio/Temporal Granularity in Traffic Monitoring on Anomaly Detectability

    NASA Astrophysics Data System (ADS)

    Ishibashi, Keisuke; Kawahara, Ryoichi; Mori, Tatsuya; Kondoh, Tsuyoshi; Asano, Shoichiro

    We quantitatively evaluate how sampling and spatio/temporal granularity in traffic monitoring affect the detectability of anomalous traffic. Those parameters also affect the monitoring burden, so network operators face a trade-off between the monitoring burden and detectability and need to know the optimal parameter values. We derive equations to calculate the false positive ratio and false negative ratio for given values of the sampling rate, granularity, statistics of normal traffic, and volume of anomalies to be detected. Specifically, assuming that the normal traffic has a Gaussian distribution, which is parameterized by its mean and standard deviation, we analyze how sampling and monitoring granularity change these distribution parameters. This analysis is based on observation of backbone traffic, which is spatially uncorrelated and exhibits temporal long-range dependence. Then we derive the equations for detectability. With those equations, we can answer the practical questions that arise in actual network operations: what sampling rate to set to find a given volume of anomaly, or, if the required sampling rate is too high for actual operation, what granularity is optimal for finding the anomaly given a lower limit on the sampling rate.
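
    The flavour of such a detectability calculation can be sketched as follows, under the simplified assumptions of independent packet sampling (binomial thinning) and a plain Gaussian traffic model with a fixed threshold; the paper's own derivation additionally accounts for long-range dependence and granularity, so this is illustrative only.

      from math import sqrt
      from scipy.stats import norm

      def detectability(mu, sigma, anomaly, sampling_rate, threshold):
          """Approximate false positive and false negative ratios for threshold
          detection on sampled traffic counts.

          Normal traffic per interval is Gaussian(mu, sigma**2) packets. Under
          independent packet sampling with rate p, the sampled count has mean p*m
          and variance p**2*s**2 + p*(1-p)*m (law of total variance)."""
          p = sampling_rate

          def sampled(mean, var):
              return p * mean, p * p * var + p * (1.0 - p) * mean

          m0, v0 = sampled(mu, sigma ** 2)             # normal traffic only
          m1, v1 = sampled(mu + anomaly, sigma ** 2)   # traffic plus anomaly
          fpr = norm.sf(threshold, loc=m0, scale=sqrt(v0))    # alarm without anomaly
          fnr = norm.cdf(threshold, loc=m1, scale=sqrt(v1))   # miss with anomaly
          return fpr, fnr

      # Hypothetical numbers: 1e6 packets/interval of normal traffic (sd 5e4),
      # an anomaly adding 2e5 packets, 1-in-100 packet sampling.
      print(detectability(mu=1e6, sigma=5e4, anomaly=2e5, sampling_rate=0.01,
                          threshold=0.01 * 1.1e6))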

  4. Interval Breast Cancer Rates and Histopathologic Tumor Characteristics after False-Positive Findings at Mammography in a Population-based Screening Program.

    PubMed

    Hofvind, Solveig; Sagstad, Silje; Sebuødegård, Sofie; Chen, Ying; Roman, Marta; Lee, Christoph I

    2018-04-01

    Purpose To compare rates and tumor characteristics of interval breast cancers (IBCs) detected after a negative versus false-positive screening among women participating in the Norwegian Breast Cancer Screening Program. Materials and Methods The Cancer Registry Regulation approved this retrospective study. Information about 423 445 women aged 49-71 years who underwent 789 481 full-field digital mammographic screening examinations during 2004-2012 was extracted from the Cancer Registry of Norway. Rates and odds ratios of IBC among women with a negative (the reference group) versus a false-positive screening were estimated by using logistic regression models adjusted for age at diagnosis and county of residence. Results A total of 1302 IBCs were diagnosed after 789 481 screening examinations, of which 7.0% (91 of 1302) were detected among women with a false-positive screening as the most recent breast imaging examination before detection. By using negative screening as the reference, adjusted odds ratios of IBCs were 3.3 (95% confidence interval [CI]: 2.6, 4.2) and 2.8 (95% CI: 1.8, 4.4) for women with a false-positive screening without and with needle biopsy, respectively. Women with a previous negative screening had a significantly lower proportion of tumors that were 10 mm or less (14.3% [150 of 1049] vs 50.0% [seven of 14], respectively; P < .01) and grade I tumors (13.2% [147 of 1114] vs 42.9% [six of 14]; P < .01), but a higher proportion of cases with lymph nodes positive for cancer (40.9% [442 of 1080] vs 13.3% [two of 15], respectively; P = .03) compared with women with a previous false-positive screening with benign biopsy. A retrospective review of the screening mammographic examinations identified 42.9% (39 of 91) of the false-positive cases to be the same lesion as the IBC. Conclusion By using a negative screening as the reference, a false-positive screening examination increased the risk of an IBC three-fold. The tumor characteristics of IBC after a negative screening were less favorable compared with those detected after a previous false-positive screening. © RSNA, 2017 Online supplemental material is available for this article.

  5. DNA sequencing of maternal plasma reliably identifies trisomy 18 and trisomy 13 as well as Down syndrome: an international collaborative study

    PubMed Central

    Palomaki, Glenn E.; Deciu, Cosmin; Kloza, Edward M.; Lambert-Messerlian, Geralyn M.; Haddow, James E.; Neveux, Louis M.; Ehrich, Mathias; van den Boom, Dirk; Bombard, Allan T.; Grody, Wayne W.; Nelson, Stanley F.; Canick, Jacob A.

    2012-01-01

    Purpose: To determine whether maternal plasma cell–free DNA sequencing can effectively identify trisomy 18 and 13. Methods: Sixty-two pregnancies with trisomy 18 and 12 with trisomy 13 were selected from a cohort of 4,664 pregnancies along with matched euploid controls (including 212 additional Down syndrome and matched controls already reported), and their samples tested using a laboratory-developed, next-generation sequencing test. Interpretation of the results for chromosomes 18 and 13 included adjustment for GC content bias. Results: Among the 99.1% of samples interpreted (1,971/1,988), observed trisomy 18 and 13 detection rates were 100% (59/59) and 91.7% (11/12) at false-positive rates of 0.28% and 0.97%, respectively. Among the 17 samples without an interpretation, three were trisomy 18. If z-score cutoffs for trisomy 18 and 13 were raised slightly, the overall false-positive rates for the three aneuploidies could be as low as 0.1% (2/1,688) at an overall detection rate of 98.9% (280/283) for common aneuploidies. An independent academic laboratory confirmed performance in a subset. Conclusion: Among high-risk pregnancies, sequencing circulating cell–free DNA detects nearly all cases of Down syndrome, trisomy 18, and trisomy 13, at a low false-positive rate. This can potentially reduce invasive diagnostic procedures and related fetal losses by 95%. Evidence supports clinical testing for these aneuploidies. PMID:22281937

  6. Quantifying Human Performance of a Dynamic Military Target Detection Task: An Application of the Theory of Signal Detection.

    DTIC Science & Technology

    1995-06-01

    applied to analyze numerous experimental tasks (Macmillan and Creelman, 1991). One of these tasks, target detection, is the subject of this research. In ...between each associated pair of false alarm rate and hit rate z-scores is d' for the bias level associated with the pairing (Macmillan and Creelman, 1991) ...unequal variance in normal distributions (Macmillan and Creelman, 1991). ...(1966). It is described in detail for the interested reader by Green and
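
    The quantity referred to in this excerpt is the standard signal detection theory index d' = z(hit rate) − z(false alarm rate). A minimal computation, with the common 1/(2N) correction for rates of exactly 0 or 1, is sketched below; the trial counts and rates are examples only.

      from scipy.stats import norm

      def d_prime(hit_rate, fa_rate, n_signal=None, n_noise=None):
          """Sensitivity index d' = z(hit rate) - z(false alarm rate).

          Rates of exactly 0 or 1 make z undefined, so the common 1/(2N)
          correction is applied when trial counts are supplied."""
          if n_signal and hit_rate in (0.0, 1.0):
              hit_rate = min(max(hit_rate, 1 / (2 * n_signal)), 1 - 1 / (2 * n_signal))
          if n_noise and fa_rate in (0.0, 1.0):
              fa_rate = min(max(fa_rate, 1 / (2 * n_noise)), 1 - 1 / (2 * n_noise))
          return norm.ppf(hit_rate) - norm.ppf(fa_rate)

      # Example: an 80% hit rate at a 20% false alarm rate gives d' of about 1.68.
      print(round(d_prime(0.80, 0.20), 2))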

  7. Robust Ground Target Detection by SAR and IR Sensor Fusion Using Adaboost-Based Feature Selection

    PubMed Central

    Kim, Sungho; Song, Woo-Jin; Kim, So-Hyun

    2016-01-01

    Long-range ground targets are difficult to detect in a noisy cluttered environment using either synthetic aperture radar (SAR) images or infrared (IR) images. SAR-based detectors can provide a high detection rate with a high false alarm rate to background scatter noise. IR-based approaches can detect hot targets but are affected strongly by the weather conditions. This paper proposes a novel target detection method by decision-level SAR and IR fusion using an Adaboost-based machine learning scheme to achieve a high detection rate and low false alarm rate. The proposed method consists of individual detection, registration, and fusion architecture. This paper presents a single framework of a SAR and IR target detection method using modified Boolean map visual theory (modBMVT) and feature-selection based fusion. Previous methods applied different algorithms to detect SAR and IR targets because of the different physical image characteristics. One method that is optimized for IR target detection produces unsuccessful results in SAR target detection. This study examined the image characteristics and proposed a unified SAR and IR target detection method by inserting a median local average filter (MLAF, pre-filter) and an asymmetric morphological closing filter (AMCF, post-filter) into the BMVT. The original BMVT was optimized to detect small infrared targets. The proposed modBMVT can remove the thermal and scatter noise by the MLAF and detect extended targets by attaching the AMCF after the BMVT. Heterogeneous SAR and IR images were registered automatically using the proposed RANdom SAmple Region Consensus (RANSARC)-based homography optimization after a brute-force correspondence search using the detected target centers and regions. The final targets were detected by feature-selection based sensor fusion using Adaboost. The proposed method showed good SAR and IR target detection performance through feature selection-based decision fusion on a synthetic database generated by OKTAL-SE. PMID:27447635
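
    A loose, hypothetical sketch of the final fusion stage only: per-candidate SAR and IR features are concatenated and an AdaBoost classifier performs the feature-selection-based decision fusion. The features here are synthetic, and none of the preceding modBMVT, filtering or RANSARC registration steps are reproduced.

      import numpy as np
      from sklearn.ensemble import AdaBoostClassifier

      rng = np.random.default_rng(2)

      # Synthetic candidate-level features after registration: a few SAR descriptors
      # and a few IR descriptors per detected candidate region.
      n = 1000
      y = rng.integers(0, 2, n)                          # 1 = true target
      sar = rng.normal(y[:, None] * 0.8, 1.0, (n, 4))    # SAR features (clutter-noisy)
      ir = rng.normal(y[:, None] * 0.5, 1.0, (n, 4))     # IR features (weather-limited)
      X = np.hstack([sar, ir])

      fused = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X[:800], y[:800])
      print("fused accuracy:", fused.score(X[800:], y[800:]))
      # Stump weights indicate which SAR/IR features the boosting stage selected.
      print("feature importances:", fused.feature_importances_.round(3))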

  8. Robust Ground Target Detection by SAR and IR Sensor Fusion Using Adaboost-Based Feature Selection.

    PubMed

    Kim, Sungho; Song, Woo-Jin; Kim, So-Hyun

    2016-07-19

    Long-range ground targets are difficult to detect in a noisy cluttered environment using either synthetic aperture radar (SAR) images or infrared (IR) images. SAR-based detectors can provide a high detection rate with a high false alarm rate to background scatter noise. IR-based approaches can detect hot targets but are affected strongly by the weather conditions. This paper proposes a novel target detection method by decision-level SAR and IR fusion using an Adaboost-based machine learning scheme to achieve a high detection rate and low false alarm rate. The proposed method consists of individual detection, registration, and fusion architecture. This paper presents a single framework of a SAR and IR target detection method using modified Boolean map visual theory (modBMVT) and feature-selection based fusion. Previous methods applied different algorithms to detect SAR and IR targets because of the different physical image characteristics. One method that is optimized for IR target detection produces unsuccessful results in SAR target detection. This study examined the image characteristics and proposed a unified SAR and IR target detection method by inserting a median local average filter (MLAF, pre-filter) and an asymmetric morphological closing filter (AMCF, post-filter) into the BMVT. The original BMVT was optimized to detect small infrared targets. The proposed modBMVT can remove the thermal and scatter noise by the MLAF and detect extended targets by attaching the AMCF after the BMVT. Heterogeneous SAR and IR images were registered automatically using the proposed RANdom SAmple Region Consensus (RANSARC)-based homography optimization after a brute-force correspondence search using the detected target centers and regions. The final targets were detected by feature-selection based sensor fusion using Adaboost. The proposed method showed good SAR and IR target detection performance through feature selection-based decision fusion on a synthetic database generated by OKTAL-SE.

  9. A source-attractor approach to network detection of radiation sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Qishi; Barry, M. L..; Grieme, M.

    Radiation source detection using a network of detectors is an active field of research for homeland security and defense applications. We propose the Source-attractor Radiation Detection (SRD) method to aggregate measurements from a network of detectors for radiation source detection. The SRD method models a potential radiation source as a magnet-like attractor that pulls in pre-computed virtual points from the detector locations. A detection decision is made if a sufficient level of attraction, quantified by the increase in the clustering of the shifted virtual points, is observed. Compared with traditional methods, SRD has the following advantages: i) it does not require an accurate estimate of the source location from limited and noise-corrupted sensor readings, unlike the localization-based methods, and ii) its virtual point shifting and clustering calculation involve simple arithmetic operations based on the number of detectors, avoiding the high computational complexity of grid-based likelihood estimation methods. We evaluate its detection performance using canonical datasets from the Domestic Nuclear Detection Office's (DNDO) Intelligent Radiation Sensor Systems (IRSS) tests. SRD achieves both a lower false alarm rate and a lower false negative rate compared to three existing algorithms for network source detection.

  10. Bio-ALIRT biosurveillance detection algorithm evaluation.

    PubMed

    Siegrist, David; Pavlin, J

    2004-09-24

    Early detection of disease outbreaks by a medical biosurveillance system relies on two major components: 1) the contribution of early and reliable data sources and 2) the sensitivity, specificity, and timeliness of biosurveillance detection algorithms. This paper describes an effort to assess leading detection algorithms by arranging a common challenge problem and providing a common data set. The objectives of this study were to determine whether automated detection algorithms can reliably and quickly identify the onset of natural disease outbreaks that are surrogates for possible terrorist pathogen releases, and do so at acceptable false-alert rates (e.g., once every 2-6 weeks). Historic de-identified data were obtained from five metropolitan areas over 23 months; these data included International Classification of Diseases, Ninth Revision (ICD-9) codes related to respiratory and gastrointestinal illness syndromes. An outbreak detection group identified and labeled two natural disease outbreaks in these data and provided them to analysts for training of detection algorithms. All outbreaks in the remaining test data were identified but not revealed to the detection groups until after their analyses. The algorithms established a probability of outbreak for each day's counts. The probability of outbreak was assessed as an "actual" alert for different false-alert rates. The best algorithms were able to detect all of the outbreaks at false-alert rates of one every 2-6 weeks. They were often able to detect the outbreak on the same day that human investigators had identified as its true start. Because minimal data exists for an actual biologic attack, determining how quickly an algorithm might detect such an attack is difficult. However, application of these algorithms in combination with other data-analysis methods to historic outbreak data indicates that biosurveillance techniques for analyzing syndrome counts can rapidly detect seasonal respiratory and gastrointestinal illness outbreaks. Further research is needed to assess the value of electronic data sources for predictive detection. In addition, simulations need to be developed and implemented to better characterize the size and type of biologic attack that can be detected by current methods by challenging them under different projected operational conditions.
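
    The evaluated Bio-ALIRT algorithms are not specified in this abstract, so the sketch below uses a generic one-sided CUSUM on daily syndrome counts simply to illustrate how a decision threshold trades timeliness against a target false-alert rate (e.g., one alert per 2-6 weeks on outbreak-free data); all numbers are hypothetical.

      import numpy as np

      def cusum_alerts(counts, baseline_mean, baseline_sd, k=0.5, h=4.0):
          """One-sided CUSUM on standardized daily syndrome counts.

          k is the allowance (in SD units) and h the decision threshold; a larger h
          lowers the false-alert rate at the cost of slower detection."""
          s = 0.0
          alerts = []
          for day, c in enumerate(counts):
              z = (c - baseline_mean) / baseline_sd
              s = max(0.0, s + z - k)
              if s > h:
                  alerts.append(day)
                  s = 0.0          # reset after an alert
          return alerts

      # Hypothetical data: 60 outbreak-free days, then 10 days adding ~15 visits/day.
      rng = np.random.default_rng(3)
      counts = rng.poisson(50, 70).astype(float)
      counts[60:] += 15
      print(cusum_alerts(counts, baseline_mean=50, baseline_sd=np.sqrt(50)))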

  11. US women's attitudes to false positive mammography results and detection of ductal carcinoma in situ: cross sectional survey

    PubMed Central

    Schwartz, Lisa M; Woloshin, Steven; Sox, Harold C; Fischhoff, Baruch; Welch, H Gilbert

    2000-01-01

    Objective To determine women's attitudes to and knowledge of both false positive mammography results and the detection of ductal carcinoma in situ after screening mammography. Design Cross sectional survey. Setting United States. Participants 479 women aged 18-97 years who did not report a history of breast cancer. Main outcome measures Attitudes to and knowledge of false positive results and the detection of ductal carcinoma in situ after screening mammography. Results Women were aware that false positive results do occur. Their median estimate of the false positive rate for 10 years of annual screening was 20% (25th percentile estimate, 10%; 75th percentile estimate, 45%). The women were highly tolerant of false positives: 63% thought that 500 or more false positives per life saved was reasonable and 37% would tolerate 10 000 or more. Women who had had a false positive result (n=76) expressed the same high tolerance: 39% would tolerate 10 000 or more false positives. 62% of women did not want to take false positive results into account when deciding about screening. Only 8% of women thought that mammography could harm a woman without breast cancer, and 94% doubted the possibility of non-progressive breast cancers. Few had heard about ductal carcinoma in situ, a cancer that may not progress, but when informed, 60% of women wanted to take into account the possibility of it being detected when deciding about screening. Conclusions Women are aware of false positives and seem to view them as an acceptable consequence of screening mammography. In contrast, most women are unaware that screening can detect cancers that may never progress but feel that such information would be relevant. Education should perhaps focus less on false positives and more on the less familiar outcome of detection of ductal carcinoma in situ. PMID:10856064

  12. Where Have All the Interactions Gone? Estimating the Coverage of Two-Hybrid Protein Interaction Maps

    PubMed Central

    Huang, Hailiang; Jedynak, Bruno M; Bader, Joel S

    2007-01-01

    Yeast two-hybrid screens are an important method for mapping pairwise physical interactions between proteins. The fraction of interactions detected in independent screens can be very small, and an outstanding challenge is to determine the reason for the low overlap. Low overlap can arise from either a high false-discovery rate (interaction sets have low overlap because each set is contaminated by a large number of stochastic false-positive interactions) or a high false-negative rate (interaction sets have low overlap because each misses many true interactions). We extend capture–recapture theory to provide the first unified model for false-positive and false-negative rates for two-hybrid screens. Analysis of yeast, worm, and fly data indicates that 25% to 45% of the reported interactions are likely false positives. Membrane proteins have higher false-discovery rates on average, and signal transduction proteins have lower rates. The overall false-negative rate ranges from 75% for worm to 90% for fly, which arises from a roughly 50% false-negative rate due to statistical undersampling and a 55% to 85% false-negative rate due to proteins that appear to be systematically lost from the assays. Finally, statistical model selection conclusively rejects the Erdös-Rényi network model in favor of the power law model for yeast and the truncated power law for worm and fly degree distributions. Much as genome sequencing coverage estimates were essential for planning the human genome sequencing project, the coverage estimates developed here will be valuable for guiding future proteomic screens. All software and datasets are available in Datasets S1 and S2, Figures S1–S5, and Tables S1−S6, and are also available from our Web site, http://www.baderzone.org. PMID:18039026
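
    The simplest capture-recapture estimate (Lincoln-Petersen) conveys the idea of inferring interactome size and screen coverage from the overlap of two screens; the paper's unified model is considerably richer, separating stochastic from systematic false negatives and modelling false positives, so treat this only as a sketch with made-up counts.

      def lincoln_petersen(n1, n2, overlap):
          """Estimate total interactome size and per-screen coverage from two
          independent interaction screens, assuming uniform detectability.

          n1, n2: interactions reported by each screen; overlap: reported by both."""
          if overlap == 0:
              raise ValueError("no overlap: size is not estimable")
          total = n1 * n2 / overlap
          return total, n1 / total, n2 / total

      # Hypothetical example: two screens report 4,000 and 5,000 interactions, 400 shared.
      total, cov1, cov2 = lincoln_petersen(4000, 5000, 400)
      print(f"estimated true interactions: {total:.0f}, coverage: {cov1:.0%} / {cov2:.0%}")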

  13. INFRARED-BASED BLINK DETECTING GLASSES FOR FACIAL PACING: TOWARDS A BIONIC BLINK

    PubMed Central

    Frigerio, Alice; Hadlock, Tessa A; Murray, Elizabeth H; Heaton, James T

    2015-01-01

    IMPORTANCE Facial paralysis remains one of the most challenging conditions to effectively manage, often causing life-altering deficits in both function and appearance. Facial rehabilitation via pacing and robotic technology has great yet unmet potential. A critical first step towards reanimating symmetrical facial movement in cases of unilateral paralysis is the detection of healthy movement to use as a trigger for stimulated movement. OBJECTIVE To test a blink detection system that can be attached to standard eyeglasses and used as part of a closed-loop facial pacing system. DESIGN Standard safety glasses were equipped with an infrared (IR) emitter/detector pair oriented horizontally across the palpebral fissure, creating a monitored IR beam that became interrupted when the eyelids closed. SETTING Tertiary care Facial Nerve Center. PARTICIPANTS 24 healthy volunteers. MAIN OUTCOME MEASURE Video-quantified blinking was compared with both IR sensor signal magnitude and rate of change in healthy participants with their gaze in repose, while they shifted gaze from central to far peripheral positions, and during the production of particular facial expressions. RESULTS Blink detection based on signal magnitude achieved 100% sensitivity in forward gaze, but generated false-detections on downward gaze. Calculations of peak rate of signal change (first derivative) typically distinguished blinks from gaze-related lid movements. During forward gaze, 87% of detected blink events were true positives, 11% were false positives, and 2% false negatives. Of the 11% false positives, 6% were associated with partial eyelid closures. During gaze changes, false blink detection occurred 6.3% of the time during lateral eye movements, 10.4% during upward movements, 46.5% during downward movements, and 5.6% for movements from an upward or downward gaze back to the primary gaze. Facial expressions disrupted sensor output if they caused substantial squinting or shifted the glasses. CONCLUSION AND RELEVANCE Our blink detection system provides a reliable, non-invasive indication of eyelid closure using an invisible light beam passing in front of the eye. Future versions will aim to mitigate detection errors by using multiple IR emitter/detector pairs mounted on the glasses, and alternative frame designs may reduce shifting of the sensors relative to the eye during facial movements. PMID:24699708
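
    A toy version of the decision rule described above: require both a drop in the IR beam signal and a large first derivative, so that slower gaze-related lid movements of similar depth are rejected. The sampling rate, thresholds and synthetic trace are assumptions, not the device's actual parameters.

      import numpy as np

      def detect_blinks(signal, fs, drop_thresh, slope_thresh, refractory=0.15):
          """Flag blink onsets in an IR beam signal sampled at fs Hz.

          A blink is declared when the signal falls below drop_thresh AND the
          magnitude of its first derivative exceeds slope_thresh; a refractory
          period (seconds) suppresses duplicate detections within one closure."""
          deriv = np.gradient(signal) * fs
          candidates = (signal < drop_thresh) & (np.abs(deriv) > slope_thresh)
          onsets, last = [], -np.inf
          for i in np.flatnonzero(candidates):
              if i - last > refractory * fs:
                  onsets.append(int(i))
                  last = i
          return onsets

      # Hypothetical 1-second trace at 500 Hz: a fast blink plus a slow, equally deep
      # gaze-related dip that should NOT trigger a detection.
      fs = 500
      sig = np.ones(fs)
      sig[100:120] = 0.2                                      # fast blink (40 ms)
      sig[300:400] -= 0.8 * np.sin(np.linspace(0, np.pi, 100))  # slow dip
      print(detect_blinks(sig, fs, drop_thresh=0.5, slope_thresh=50.0))  # -> [100]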

  14. Improved peak detection in mass spectrum by incorporating continuous wavelet transform-based pattern matching.

    PubMed

    Du, Pan; Kibbe, Warren A; Lin, Simon M

    2006-09-01

    A major problem for current peak detection algorithms is that noise in mass spectrometry (MS) spectra gives rise to a high rate of false positives. The false positive rate is especially problematic in detecting peaks with low amplitudes. Usually, various baseline correction algorithms and smoothing methods are applied before attempting peak detection. This approach is very sensitive to the amount of smoothing and aggressiveness of the baseline correction, which contribute to making peak detection results inconsistent between runs, instrumentation and analysis methods. Most peak detection algorithms simply identify peaks based on amplitude, ignoring the additional information present in the shape of the peaks in a spectrum. In our experience, 'true' peaks have characteristic shapes, and providing a shape-matching function that provides a 'goodness of fit' coefficient should provide a more robust peak identification method. Based on these observations, a continuous wavelet transform (CWT)-based peak detection algorithm has been devised that identifies peaks with different scales and amplitudes. By transforming the spectrum into wavelet space, the pattern-matching problem is simplified and in addition provides a powerful technique for identifying and separating the signal from the spike noise and colored noise. This transformation, with the additional information provided by the 2D CWT coefficients can greatly enhance the effective signal-to-noise ratio. Furthermore, with this technique no baseline removal or peak smoothing preprocessing steps are required before peak detection, and this improves the robustness of peak detection under a variety of conditions. The algorithm was evaluated with SELDI-TOF spectra with known polypeptide positions. Comparisons with two other popular algorithms were performed. The results show the CWT-based algorithm can identify both strong and weak peaks while keeping false positive rate low. The algorithm is implemented in R and will be included as an open source module in the Bioconductor project.
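
    SciPy ships a CWT-based peak finder in the same spirit (scipy.signal.find_peaks_cwt); the sketch below applies it to a synthetic noisy spectrum with a sloping baseline and no separate smoothing or baseline-correction step. It is not the authors' R/Bioconductor implementation, and the synthetic peak positions and widths are invented.

      import numpy as np
      from scipy.signal import find_peaks_cwt

      rng = np.random.default_rng(4)

      # Synthetic "spectrum": three Gaussian peaks of different widths and amplitudes
      # on a sloping baseline, plus noise.
      x = np.arange(2000)
      spectrum = (1.0 * np.exp(-((x - 400) / 8.0) ** 2)
                  + 0.3 * np.exp(-((x - 900) / 20.0) ** 2)
                  + 0.6 * np.exp(-((x - 1500) / 4.0) ** 2))
      spectrum += 0.0002 * x                      # slowly varying baseline
      spectrum += rng.normal(0, 0.03, x.size)     # noise

      # Peaks must show a consistent ridge across scales and a minimum SNR, so no
      # explicit baseline removal or smoothing is applied beforehand.
      peaks = find_peaks_cwt(spectrum, widths=np.arange(2, 40), min_snr=3)
      print(peaks)   # expected near 400, 900 and 1500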

  15. [Preimplantation genetic diagnosis of Duchenne muscular dystrophy by single cell triplex PCR].

    PubMed

    Wu, Yue-Li; Wu, Ling-Qian; Li, Yan-Ping; Liu, Dong-E; Zeng, Qiao; Zhu, Hai-Yan; Pan, Qian; Liang, De-Sheng; Hu, Hao; Long, Zhi-Gao; Li, Juan; Dai, He-Ping; Xia, Kun; Xia, Jia-Hui

    2007-04-01

    To detect two exons of Duchenne muscular dystrophy (DMD) gene and a gender discrimination locus amelogenin gene by single cell triplex PCR, and to evaluate the possibility of this technique for preimplantation genetic diagnosis (PGD) in DMD family with DMD deletion mutation. Single lymphocytes from a normal male, a normal female, two DMD patients (exon 8 and 47 deleted, respectively) and single blastomeres from the couples treated by the in vitro fertilization pre-embryo transfer (IVF-ET) and without family history of DMD were obtained. Exons 8 and 47 of DMD gene were amplified by a triplex PCR assay, the amelogenin gene on X and Y chromosomes were co-amplified to analyze the correlation between embryo gender and deletion status. In the normal single lymphocytes, the amplification rate of exons 8 and 47 of DMD and amelogenin gene were 93.8%, 93.8%, and 95.3% respectively. The false positive rate was 3.3%. In the exon 8 deleted DMD patient, the amplification rate of exon 47 of DMD and amelogenin gene was 95.8%, and the false positive rate was 3.3%. In the exon 47 deleted DMD patient, the amplification rate of exon 8 of DMD and amelogenin gene was 95.8%, and the false positive rate was 0. In the single blastomeres, the amplification rate of exons 8 and 47 of DMD and amelogenin gene was 82.5%, 80.0% and 77.5%, respectively, and the false positive rate was 0. The single cell triplex PCR protocol for the detection of DMD and amelogenin gene is highly sensitive, specific and reliable, and can be used for PGD in those DMD families with DMD deletion mutation.

  16. Grouped fuzzy SVM with EM-based partition of sample space for clustered microcalcification detection.

    PubMed

    Wang, Huiya; Feng, Jun; Wang, Hongyu

    2017-07-20

    Detection of clustered microcalcification (MC) from mammograms plays an essential role in computer-aided diagnosis for early stage breast cancer. To tackle problems associated with the diversity of data structures of MC lesions and the variability of normal breast tissues, multi-pattern sample space learning is required. In this paper, a novel grouped fuzzy Support Vector Machine (SVM) algorithm with sample space partition based on Expectation-Maximization (EM) (called G-FSVM) is proposed for clustered MC detection. The diversified pattern of training data is partitioned into several groups based on the EM algorithm. Then a series of fuzzy SVMs are integrated for classification with each group of samples from the MC lesions and normal breast tissues. From the DDSM database, a total of 1,064 suspicious regions were selected from 239 mammograms, and the measured Accuracy, True Positive Rate (TPR), False Positive Rate (FPR) and EVL = TPR*(1-FPR) were 0.82, 0.78, 0.14 and 0.72, respectively. The proposed method incorporates the merits of fuzzy SVM and multi-pattern sample space learning, decomposing the MC detection problem into a series of simple two-class classifications. Experimental results from synthetic data and the DDSM database demonstrate that our integrated classification framework reduces the false positive rate significantly while maintaining the true positive rate.

  17. Signal detection of adverse events with imperfect confirmation rates in vaccine safety studies using self-controlled case series design.

    PubMed

    Xu, Stanley; Newcomer, Sophia; Nelson, Jennifer; Qian, Lei; McClure, David; Pan, Yi; Zeng, Chan; Glanz, Jason

    2014-05-01

    The Vaccine Safety Datalink project captures electronic health record data including vaccinations and medically attended adverse events on 8.8 million enrollees annually from participating managed care organizations in the United States. While the automated vaccination data are generally of high quality, a presumptive adverse event based on diagnosis codes in automated health care data may not be true (misclassification). Consequently, analyses using automated health care data can generate false positive results, where an association between the vaccine and outcome is incorrectly identified, as well as false negative findings, where a true association or signal is missed. We developed novel conditional Poisson regression models and fixed effects models that accommodate misclassification of the adverse event outcome for the self-controlled case series design. We conducted simulation studies to evaluate their performance in signal detection in vaccine safety hypothesis-generating (screening) studies. We also reanalyzed four previously identified signals in a recent vaccine safety study using the newly proposed models. Our simulation studies demonstrated that (i) outcome misclassification resulted in both false positive and false negative signals in screening studies; (ii) the newly proposed models reduced the rates of both false positive and false negative signals. In reanalyses of four previously identified signals using the novel statistical models, the incidence rate ratio estimates and statistical significances were similar to those using conventional models and including only medical record review-confirmed cases. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Ship detection in panchromatic images: a new method and its DSP implementation

    NASA Astrophysics Data System (ADS)

    Yao, Yuan; Jiang, Zhiguo; Zhang, Haopeng; Wang, Mengfei; Meng, Gang

    2016-03-01

    In this paper, a new ship detection method is proposed after analyzing the characteristics of panchromatic remote sensing images and ship targets. Firstly, AdaBoost (Adaptive Boosting) classifiers trained by Haar features are utilized to perform coarse detection of ship targets. Then LSD (Line Segment Detector) is adopted to extract the line features in target slices to perform fine detection. Experimental results on a dataset of panchromatic remote sensing images with a spatial resolution of 2 m show that the proposed algorithm can achieve a high detection rate and a low false alarm rate. Meanwhile, the algorithm can meet the needs of practical applications on a DSP (Digital Signal Processor).

  19. First day of life pulse oximetry screening to detect congenital heart defects.

    PubMed

    Meberg, Alf; Brügmann-Pieper, Sabine; Due, Reidar; Eskedal, Leif; Fagerli, Ingebjørg; Farstad, Teresa; Frøisland, Dag Helge; Sannes, Catharina Hovland; Johansen, Ole Jakob; Keljalic, Jasmina; Markestad, Trond; Nygaard, Egil Andre; Røsvik, Alet; Silberg, Inger Elisabeth

    2008-06-01

    To evaluate the efficacy of first day of life pulse oximetry screening to detect congenital heart defects (CHDs). We performed a population-based prospective multicenter study of postductal (foot) arterial oxygen saturation (SpO(2)) in apparently healthy newborns after transfer from the delivery suite to the nursery. SpO(2) < 95% led to further diagnostic evaluations. Of 57,959 live births, 50,008 (86%) were screened. In the screened population, 35 CHDs were [corrected] classified as critical (ductus dependent, cyanotic). CHDs were prospectively registered and diagnosed in 658/57,959 (1.1%) [corrected] Of the infants screened, 324 (0.6%) failed the test. Of these, 43 (13%) had CHDs (27 critical), and 134 (41%) had pulmonary diseases or other disorders. The remaining 147 infants (45%) were healthy with transitional circulation. The median age for babies with CHDs at failing the test was 6 hours (range, 1-21 hours). For identifying critical CHDs, the pulse oximetry screening had a sensitivity rate of 77.1% (95% CI, 59.4-89.0), specificity rate of 99.4% (95% CI, 99.3-99.5), and a false-positive rate of 0.6% (95% CI, 0.5-0.7). Early pulse oximetry screening promotes early detection of critical CHDs and other potentially severe diseases. The sensitivity rate for detecting critical CHDs is high, and the false-positive rate is low.

  20. Risk assessment of false-positive quantitative real-time PCR results in food, due to detection of DNA originating from dead cells.

    PubMed

    Wolffs, Petra; Norling, Börje; Rådström, Peter

    2005-03-01

    Real-time PCR technology is increasingly used for detection and quantification of pathogens in food samples. A main disadvantage of nucleic acid detection is the inability to distinguish between signals originating from viable cells and DNA released from dead cells. In order to gain knowledge concerning risks of false-positive results due to detection of DNA originating from dead cells, quantitative PCR (qPCR) was used to investigate the degradation kinetics of free DNA in four types of meat samples. Results showed that the fastest degradation rate was observed (1 log unit per 0.5 h) in chicken homogenate, whereas the slowest rate was observed in pork rinse (1 log unit per 120.5 h). Overall results indicated that degradation occurred faster in chicken samples than in pork samples and faster at higher temperatures. Based on these results, it was concluded that, especially in pork samples, there is a risk of false-positive PCR results. This was confirmed in a quantitative study on cell death and signal persistence over a period of 28 days, employing three different methods, i.e. viable counts, direct qPCR, and finally floatation, a recently developed discontinuous density centrifugation method, followed by qPCR. Results showed that direct qPCR resulted in an overestimation of up to 10 times of the amount of cells in the samples compared to viable counts, due to detection of DNA from dead cells. However, after using floatation prior to qPCR, results resembled the viable count data. This indicates that by using of floatation as a sample treatment step prior to qPCR, the risk of false-positive PCR results due to detection of dead cells, can be minimized.

  1. Evaluation of usefulness of fine-needle aspiration cytology in the diagnosis of tumours of the accessory parotid gland: a preliminary analysis of a case series in Japan.

    PubMed

    Iguchi, Hiroyoshi; Wada, Tadashi; Matsushita, Naoki; Oishi, Masahiro; Teranishi, Yuichi; Yamane, Hideo

    2014-07-01

    The accuracy and sensitivity of fine-needle aspiration cytology (FNAC) in this analysis were not satisfactory, and the false-negative rate seemed to be higher than for parotid tumours. The possibility of low-grade malignancy should be considered in the surgical treatment of accessory parotid gland (APG) tumours, even if the preoperative results of FNAC suggest that the tumour is benign. Little is known about the usefulness of FNAC in the preoperative evaluation of APG tumours, probably due to the paucity of APG tumour cases. We examined the usefulness of FNAC in the detection of malignant APG tumours. We conducted a retrospective analysis of 3 cases from our hospital, along with 18 previously reported Japanese cases. We compared the preoperative FNAC results with postoperative histopathological diagnoses of APG tumours and evaluated the accuracy, sensitivity, specificity and false-negative rates of FNAC in detecting malignant APG tumours. There were four false-negative cases (19.0%), three of mucoepidermoid carcinomas and one of malignant lymphoma. One false-positive result was noted in the case of a myoepithelioma, which was cytologically diagnosed as suspected adenoid cystic carcinoma. The accuracy, sensitivity and specificity of FNAC in detecting malignant tumours were 76.2%, 60.0% and 90.9%, respectively.

  2. [The incidence of human papilloma virus-associated vulvar cancer in younger women is increasing and wide local excision with sentinel lymph node biopsy has become standard].

    PubMed

    Fehr, Mathias K

    2011-10-01

    Sentinel lymph node (SLN) dissections have been shown to be sensitive for the evaluation of nodal basins for metastatic disease and are associated with decreased short-term and long-term morbidity when compared with complete lymph node dissection. There has been increasing interest in the use of SLN technology in gynecologic cancers. This review assesses the current evidence-based literature for the use of SLN dissections in gynecologic malignancies. Recent literature continues to support the safety and feasibility of SLN biopsy for early stage vulvar cancer with negative predictive value approaching 100% and low false negative rates. For endometrial cancer, in contrast, most studies have reported low false-negative rates but variable sensitivities and low detection rates of the sentinel node. Studies examining the utility of SLN biopsy in early-stage cervical cancer remain promising with detection rates, sensitivities, and false-negative rates greater than 90% for stage 1B1 tumors. SLN dissections have been shown to be effective and safe in certain, select vulvar cancer patients and can be considered an alternative surgical approach for these patients. For endometrial and cervical cancer, SLN dissection continues to show encouraging results but needs further investigation.

  3. [Detecting fire smoke based on the multispectral image].

    PubMed

    Wei, Ying-Zhuo; Zhang, Shao-Wu; Liu, Yan-Wei

    2010-04-01

    Smoke detection is very important for preventing forest fires at an early stage. Because traditional technologies based on video and image processing are easily affected by dynamic background information, they suffer from three limitations: low anti-interference ability, high false detection rates, and difficulty in distinguishing fire smoke from water fog. A novel method for detecting smoke based on multispectral images is proposed in the present paper. Using a multispectral digital imaging technique, multispectral image series of fire smoke and water fog were obtained in the band range of 400 to 720 nm, and the images were divided into bins. The Euclidean distance among the bins was taken as a measure of the spectrogram difference. After obtaining the spectral feature vectors of dynamic regions, the regions of fire smoke and water fog were extracted according to the spectrogram feature difference between target and background. Indoor and outdoor experiments show that the multispectral smoke detection method can be applied to smoke detection and can effectively distinguish fire smoke from water fog. Combined with video image processing methods, the multispectral image detection method can also be applied to forest fire surveillance, reducing the false alarm rate in forest fire detection.
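
    The band-wise comparison described above can be sketched in a few lines of Python; the spectral signatures, band count and decision threshold below are invented for illustration and are not the values used in the paper.

        # Minimal sketch: Euclidean distance between the spectral feature vector of a
        # candidate (dynamic) region and a background reference, used as the
        # spectrogram-difference measure. All numbers are illustrative assumptions.
        import numpy as np

        def spectral_distance(region_bands, background_bands):
            """Euclidean distance between two spectral feature vectors (one value per band)."""
            region = np.asarray(region_bands, dtype=float)
            background = np.asarray(background_bands, dtype=float)
            return float(np.linalg.norm(region - background))

        background = np.array([0.21, 0.24, 0.26, 0.27, 0.25])   # assumed band means, 400-720 nm
        smoke = np.array([0.35, 0.39, 0.41, 0.38, 0.33])
        water_fog = np.array([0.23, 0.25, 0.27, 0.28, 0.26])

        THRESHOLD = 0.15   # illustrative decision threshold
        for name, signature in [("smoke", smoke), ("water fog", water_fog)]:
            d = spectral_distance(signature, background)
            print(f"{name}: distance = {d:.3f} -> {'alarm' if d > THRESHOLD else 'no alarm'}")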

  4. A real-time tracking system of infrared dim and small target based on FPGA and DSP

    NASA Astrophysics Data System (ADS)

    Rong, Sheng-hui; Zhou, Hui-xin; Qin, Han-lin; Wang, Bing-jian; Qian, Kun

    2014-11-01

    A core technology in an infrared warning system is the detection and tracking of dim and small targets against complicated backgrounds. Consequently, running the detection algorithm on a hardware platform has high practical value in the military field. In this paper, a real-time detection and tracking system for infrared dim and small targets, with an FPGA (Field Programmable Gate Array) and a DSP (Digital Signal Processor) as its core, was designed, and the corresponding detection and tracking algorithm and signal flow are elaborated. In the first stage, the FPGA obtains the infrared image sequence from the sensor, suppresses background clutter with a mathematical morphology method, and enhances target intensity with a Laplacian of Gaussian operator. In the second stage, the DSP obtains both the original image and the filtered image from the FPGA via the video port, segments targets from the filtered image with an adaptive threshold segmentation method, and removes false targets with a pipeline filter. Experimental results show that our system achieves a higher detection rate and a lower false alarm rate.
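
    The processing chain described above (morphological clutter suppression, Laplacian of Gaussian enhancement, adaptive thresholding) can be sketched in software; the following Python/SciPy version is a minimal stand-in for the FPGA/DSP implementation, with kernel sizes and the threshold factor chosen as assumptions.

        # Minimal sketch of the filtering stages described above, in NumPy/SciPy rather
        # than on FPGA/DSP hardware: white top-hat to suppress background clutter,
        # Laplacian of Gaussian to enhance point-like targets, and a mean + k*sigma
        # adaptive threshold. Kernel sizes and k are illustrative assumptions.
        import numpy as np
        from scipy import ndimage

        def detect_small_targets(frame, structure_size=5, log_sigma=1.5, k=4.0):
            background_free = ndimage.white_tophat(frame, size=structure_size)
            enhanced = -ndimage.gaussian_laplace(background_free, sigma=log_sigma)  # peaks positive
            threshold = enhanced.mean() + k * enhanced.std()
            return enhanced > threshold

        rng = np.random.default_rng(0)
        frame = rng.normal(100.0, 5.0, (128, 128))
        frame[64, 64] += 40.0                     # inject a dim point target
        print("detections at:", np.argwhere(detect_small_targets(frame)))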

  5. Network Algorithms for Detection of Radiation Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S; Brooks, Richard R; Wu, Qishi

    In support of national defense, the Domestic Nuclear Detection Office's (DNDO) Intelligent Radiation Sensor Systems (IRSS) program supported the development of networks of radiation counters for detecting, localizing, and identifying low-level, hazardous radiation sources. Industry teams developed the first generation of such networks with tens of counters and demonstrated several of their capabilities in indoor and outdoor characterization tests. Subsequently, these test measurements have been used in algorithm replays using various sub-networks of counters. Test measurements combined with algorithm outputs are used to extract Key Measurements and Benchmark (KMB) datasets. We present two selective analyses of these datasets: (a) a notional border monitoring scenario that highlights the benefits of a network of counters compared to individual detectors, and (b) new insights into the Sequential Probability Ratio Test (SPRT) detection method, which lead to its adaptations for improved detection. Using KMB datasets from an outdoor test, we construct a notional border monitoring scenario, wherein twelve 2 x 2 NaI detectors are deployed on the periphery of a 21 x 21 meter square region. A Cs-137 (175 uCi) source is moved across this region, starting several meters outside and finally moving away. The measurements from individual counters and the network were processed using replays of a particle filter algorithm developed under the IRSS program. The algorithm outputs from KMB datasets clearly illustrate the benefits of combining measurements from all networked counters: the source was detected before it entered the region, during its trajectory inside, and until it moved several meters away. When individual counters were used for detection, the source was detected for much shorter durations and was sometimes missed in the interior region. The application of the SPRT for detecting radiation sources requires choosing the detection threshold, which in turn requires a source strength estimate, typically specified as a multiplier of the background radiation level. A judicious selection of this source multiplier is essential to achieve optimal detection probability at a specified false alarm rate. Typically, this threshold is chosen from the Receiver Operating Characteristic (ROC) by varying the source multiplier estimate. The ROC is expected to have a monotonically increasing profile between the detection probability and the false alarm rate. We derived ROCs for multiple indoor tests using KMB datasets, which revealed an unexpected loop shape: as the multiplier increases, detection probability and false alarm rate both increase up to a limit and then both contract. Consequently, two detection probabilities correspond to the same false alarm rate, and the higher one is achieved at a lower multiplier, which is the desired operating point. Using Chebyshev's inequality we analytically confirm this shape. Then, we present two improved network-SPRT methods by (a) using the threshold offset as a weighting factor for the binary decisions from individual detectors in a weighted majority voting fusion rule, and (b) applying a composite SPRT derived using measurements from all counters.
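
    The first of the two improved fusion rules mentioned above, weighting each counter's binary decision by its threshold offset, can be illustrated with a small sketch; the statistics, thresholds and fusion cutoff below are assumptions, not the IRSS algorithms themselves.

        # Minimal sketch of an offset-weighted majority voting fusion rule: each
        # counter votes +1 (source) or -1 (background), weighted by how far its test
        # statistic sits from its own threshold. All values are illustrative.
        def fuse_decisions(statistics, thresholds, fusion_cutoff=0.0):
            score = 0.0
            for stat, thr in zip(statistics, thresholds):
                vote = 1.0 if stat >= thr else -1.0   # binary decision per counter
                weight = abs(stat - thr)              # confidence = threshold offset
                score += vote * weight
            return score > fusion_cutoff

        # Three counters clearly above threshold, one marginally below: declare a source.
        print(fuse_decisions(statistics=[5.2, 4.8, 6.1, 2.9], thresholds=[3.0, 3.0, 3.0, 3.0]))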

  6. An Underwater Target Detection System for Electro-Optical Imagery Data

    DTIC Science & Technology

    2010-06-01

    detection and segmentation of underwater mine-like objects in the EO images captured with a CCD-based image sensor. The main focus of this research is to...develop a robust detection algorithm that can be used to detect low contrast and partial underwater objects from the EO imagery with low false alarm rate...underwater target detection I. INTRODUCTION Automatic detection and recognition of underwater objects from EO imagery poses a serious challenge due to poor

  7. [Analysis of 163 rib fractures by imaging examination].

    PubMed

    Song, Tian-fu; Wang, Chao-chao

    2014-12-01

    To explore the application of imaging examinations to rib fracture sites in forensic identification. Features including the fracture sites, the number of imaging examinations performed, and the first radiological technique used at diagnosis were retrospectively analyzed in 56 cases with a total of 163 rib fractures. The detection rate of rib fractures within 14 days was 65.6%. For anterior rib fractures, the initial detection rate by X-ray was 76.2%, rising to 90.5% on a second X-ray examination, while the corresponding detection rates by CT were 66.7% and 80.0%. For rib fractures in the axillary section, the initial detection rate by X-ray was 27.6%, rising to 58.6% on a second X-ray examination, while the corresponding detection rates by CT were 54.3% and 80.4%. For posterior rib fractures, the initial detection rate by X-ray was 63.6%, rising to 81.8% on a second X-ray examination, while the corresponding detection rates by CT were 50.0% and 70.0%. It is important to pay attention to the use of combined imaging examinations and to follow-up results. In cases of suspected rib fracture in the axillary section, CT examination is suggested when the X-ray is negative, to avoid false-negative findings.

  8. ADAPTIVE WATER SENSOR SIGNAL PROCESSING: EXPERIMENTAL RESULTS AND IMPLICATIONS FOR ONLINE CONTAMINANT WARNING SYSTEMS

    EPA Science Inventory

    A contaminant detection technique and its optimization algorithms have two principal functions. One is the adaptive signal treatment that suppresses background noise and enhances contaminant signals, leading to a promising detection of water quality changes at a false rate as low...

  9. Fusion of Heterogeneous Intrusion Detection Systems for Network Attack Detection

    PubMed Central

    Kaliappan, Jayakumar; Thiagarajan, Revathi; Sundararajan, Karpagam

    2015-01-01

    An intrusion detection system (IDS) helps to identify different types of attacks in general, but the detection rate is higher for certain specific categories of attacks. This paper is built on the idea that each IDS is efficient in detecting a specific type of attack. In the proposed Multiple IDS Unit (MIU), there are five IDS units, and each IDS follows a unique algorithm to detect attacks. Feature selection is done with the help of a genetic algorithm. The selected features of the input traffic are passed on to the MIU for processing. The decision from each IDS is termed a local decision. The fusion unit inside the MIU processes all the local decisions with a majority voting rule and makes the final decision. The proposed system shows a very good improvement in detection rate and reduces the false alarm rate. PMID:26295058
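
    The fusion step described above reduces to a majority vote over the five local decisions; a minimal sketch, with the tie-breaking choice as an assumption, is shown below.

        # Minimal sketch: majority-vote fusion of binary local IDS decisions
        # (1 = attack, 0 = normal). With five units there is never a tie, but ties
        # are broken toward "attack" here as an assumption.
        from collections import Counter

        def fuse_local_decisions(local_decisions):
            votes = Counter(local_decisions)
            return 1 if votes[1] >= votes[0] else 0

        local = [1, 0, 1, 1, 0]   # example decisions from the five IDS units
        print("final decision:", "attack" if fuse_local_decisions(local) else "normal")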

  10. Fusion of Heterogeneous Intrusion Detection Systems for Network Attack Detection.

    PubMed

    Kaliappan, Jayakumar; Thiagarajan, Revathi; Sundararajan, Karpagam

    2015-01-01

    An intrusion detection system (IDS) helps to identify different types of attacks in general, but the detection rate is higher for certain specific categories of attacks. This paper is built on the idea that each IDS is efficient in detecting a specific type of attack. In the proposed Multiple IDS Unit (MIU), there are five IDS units, and each IDS follows a unique algorithm to detect attacks. Feature selection is done with the help of a genetic algorithm. The selected features of the input traffic are passed on to the MIU for processing. The decision from each IDS is termed a local decision. The fusion unit inside the MIU processes all the local decisions with a majority voting rule and makes the final decision. The proposed system shows a very good improvement in detection rate and reduces the false alarm rate.

  11. [Computed tomography with computer-assisted detection of pulmonary nodules in dogs and cats].

    PubMed

    Niesterok, C; Piesnack, S; Köhler, C; Ludewig, E; Alef, M; Kiefer, I

    2015-01-01

    The aim of this study was to assess the potential benefit of computer-assisted detection (CAD) of pulmonary nodules in veterinary medicine. Therefore, the CAD rate was compared to the detection rates of two individual examiners in terms of its sensitivity and false-positive findings. We included 51 dogs and 16 cats with pulmonary nodules previously diagnosed by computed tomography. First, the number of nodules ≥ 3 mm was recorded for each patient by two independent examiners. Subsequently, each examiner used the CAD software for automated nodule detection. With the knowledge of the CAD results, a final consensus decision on the number of nodules was achieved. The software used was a commercially available CAD program. The sensitivity of examiner 1 was 89.2%, while that of examiner 2 reached 87.4%. CAD had a sensitivity of 69.4%. With CAD, the sensitivity of examiner 1 increased to 94.7% and that of examiner 2 to 90.8%. The CAD-system, which we used in our study, had a moderate sensitivity of 69.4%. Despite its severe limitations, with a high level of false-positive and false-negative results, CAD increased the examiners' sensitivity. Therefore, its supportive role in diagnostics appears to be evident.

  12. Multi-temporal change image inference towards false alarms reduction for an operational photogrammetric rockfall detection system

    NASA Astrophysics Data System (ADS)

    Partsinevelos, Panagiotis; Kallimani, Christina; Tripolitsiotis, Achilleas

    2015-06-01

    Rockfall incidents affect civil security and hamper the sustainable growth of hard-to-access mountainous areas due to casualties, injuries, and infrastructure loss. Rockfall occurrences cannot be easily prevented, and previous studies of multi-sensor early rockfall detection systems have focused on large-scale incidents. However, even a single rock may cause the loss of a human life along transportation routes; thus, it is highly important to establish methods for the early detection of small-scale rockfall incidents. Terrestrial photogrammetric techniques are prone to a series of errors leading to false alarms, including vegetation, wind, and non-relevant changes in the scene under consideration. In this study, photogrammetric monitoring of rockfall-prone slopes is established and the resulting multi-temporal change imagery is processed in order to minimize false alarms. Remote sensing imagery analysis techniques are integrated to enhance early detection of a rockfall. Experimental data demonstrated that an operational system able to identify a 10-cm rock movement within a 10% false alarm rate is technically feasible.

  13. Robust Face Detection from Still Images

    DTIC Science & Technology

    2014-01-01

    significant change in false acceptance rates. Keywords— face detection; illumination; skin color variation; Haar-like features; OpenCV I. INTRODUCTION... OpenCV and an algorithm which used histogram equalization. The test is performed against 17 subjects under 576 viewing conditions from the extended Yale...original OpenCV algorithm proved the least accurate, having a hit rate of only 75.6%. It also had the lowest FAR but only by a slight margin at 25.2

  14. Evaluation of BACTEC 9240 blood culture system by using high-volume aerobic resin media.

    PubMed Central

    Schwabe, L D; Thomson, R B; Flint, K K; Koontz, F P

    1995-01-01

    The BACTEC 9240 blood culture system (Becton Dickinson Diagnostic Instrument Systems, Sparks, Md.) is one of three automated, continuous-monitoring systems that is widely used in clinical laboratories. The BACTEC 9240 was compared with the BACTEC NR 660 for the detection of organisms and bacteremic episodes; time to detection of positive cultures; number of false-positive and false-negative cultures; and time needed to load, process, and perform quality control functions by using high-volume aerobic media. Blood specimens (5,282) were inoculated in equal volumes (5 to 10 ml per bottle) into BACTEC Plus Aerobic/F (9240 system) and BACTEC Plus NR26 (660 system) bottles. Clinically significant isolates were detected in 6.6% of cultures, representing 348 microorganisms and 216 bacteremic episodes. Two hundred forty-eight microorganisms were detected by both systems, 48 by the 9240 only and 52 by the 660 only (P = not significant). Of the bacteremic episodes, 158 were detected by both systems, 27 by the 9240 only and 31 by the 660 only (P = not significant). Analysis of data by month revealed equivalent recovery rates for both systems, with the exception of a 30-day period at one study site during which the 660 system detected significantly more microorganisms. Following a proprietary hardware design retrofit of the 9240 instrument, detection rates were again equivalent for the remaining three months at this study site. Positive cultures detected by both systems were detected an average of 4.3 h faster by the 9240 system (21 versus 25.3 h). The numbers of false-positive cultures for the 9240 and 660 systems were 40 (1.0%) and 9 ( < 1.0%), respectively.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:7494044

  15. Multisensor fusion for the detection of mines and minelike targets

    NASA Astrophysics Data System (ADS)

    Hanshaw, Terilee

    1995-06-01

    The US Army's Communications and Electronics Command, through the auspices of its Night Vision and Electronic Sensors Directorate (CECOM-NVESD), is actively applying multisensor techniques to the detection of mine targets. This multisensor research results from the 'detection activity' with its broad range of operational conditions and targets. Multisensor operation justifies significant attention by yielding high target detection and low false alarm statistics. Furthermore, recent advances in sensor and computing technologies make its practical application realistic and affordable. Since its WWI beginnings, the mine detection field has investigated the known spectra for applicable mine observation phenomena. Countless sensors, algorithms, processors, networks, and other techniques have been investigated to determine their candidacy for mine detection. CECOM-NVESD efforts have addressed a wide range of sensors spanning the spectrum from gravity field perturbations, magnetic field disturbances, seismic sounding, electromagnetic fields, and earth-penetrating radar imagery to infrared/visible/ultraviolet surface imaging technologies. Supplementary analysis has considered each sensor candidate's applicability by testing under field conditions (versus laboratory conditions) to determine fieldability. As these field conditions directly affect the probability of detection and false alarms, sensor employment and design must be considered. Consequently, as a given sensor's performance is influenced directly by the operational conditions, tradeoffs are necessary. At present, mass-produced and fielded mine detection techniques are limited to those incorporating a single sensor/processor methodology, such as pulse induction and magnetometry, as found in hand-held detectors. The most sensitive fielded systems can detect minute metal components in small mine targets but produce very high false alarm rates, reducing velocity in operational environments. Furthermore, the actual speed of advance for the entire mission (convoy, movement to engagement, etc.) is determined by the level of difficulty presented by the clearance or avoidance activities required in response to the potential 'targets' marked throughout a detection activity. Therefore, the application of fielded hand-held systems to convoy operations is clearly impractical. CECOM-NVESD efforts are presently seeking to overcome these operational limitations by substantially increasing speed of detection while reducing the false alarm rate through the application of multisensor techniques. The CECOM-NVESD application of multisensor techniques through integration/fusion methods is defined in this paper.

  16. A Method of Face Detection with Bayesian Probability

    NASA Astrophysics Data System (ADS)

    Sarker, Goutam

    2010-10-01

    The objective of face detection is to identify all images which contain a face, irrespective of its orientation, illumination conditions, etc. This is a hard problem, because faces are highly variable in size, shape, lighting conditions, etc. Many methods have been designed and developed to detect faces in a single image. The present paper is based on an 'Appearance Based Method', which relies on learning facial and non-facial features from image examples. This, in turn, is based on statistical analysis of examples and counter-examples of facial images and employs a Bayesian conditional classification rule to estimate the probability that a face (or non-face) is present within an image frame. The detection rate of the present system is very high, and thereby the numbers of false positive and false negative detections are substantially low.
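
    The classification rule described above amounts to comparing class posteriors under learned face and non-face appearance models; the sketch below uses toy Gaussian likelihoods and assumed priors purely to show the form of the decision, not the models trained in the paper.

        # Minimal sketch of a Bayesian conditional classification rule: label a window
        # "face" when likelihood x prior under the face model exceeds that of the
        # non-face model. The Gaussian models and priors are illustrative assumptions.
        import numpy as np
        from scipy.stats import multivariate_normal

        face_model = multivariate_normal(mean=[0.7, 0.4], cov=[[0.02, 0.0], [0.0, 0.02]])
        non_face_model = multivariate_normal(mean=[0.3, 0.6], cov=[[0.05, 0.0], [0.0, 0.05]])
        prior_face, prior_non_face = 0.2, 0.8     # assumed class priors

        def is_face(feature_vector):
            p_face = face_model.pdf(feature_vector) * prior_face
            p_non_face = non_face_model.pdf(feature_vector) * prior_non_face
            return p_face > p_non_face

        print(is_face(np.array([0.68, 0.42])))    # True under these toy models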

  17. Improved Conflict Detection for Reducing Operational Errors in Air Traffic Control

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Hainz

    2003-01-01

    An operational error is an incident in which an air traffic controller allows the separation between two aircraft to fall below the minimum separation standard. The rates of such errors in the US have increased significantly over the past few years. This paper proposes new detection methods that can help correct this trend by improving on the performance of Conflict Alert, the existing software in the Host Computer System that is intended to detect and warn controllers of imminent conflicts. In addition to the usual trajectory based on the flight plan, a "dead-reckoning" trajectory (current velocity projection) is also generated for each aircraft and checked for conflicts. Filters for reducing common types of false alerts were implemented. The new detection methods were tested in three different ways. First, a simple flightpath command language was developed to generate precisely controlled encounters for the purpose of testing the detection software. Second, written reports and tracking data were obtained for actual operational errors that occurred in the field, and these were "replayed" to test the new detection algorithms. Finally, the detection methods were used to shadow live traffic, and performance was analysed, particularly with regard to the false-alert rate. The results indicate that the new detection methods can provide timely warnings of imminent conflicts more consistently than Conflict Alert.
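
    The dead-reckoning probe described above projects each aircraft along its current velocity and checks the projected separation against the minimum standard; a minimal sketch under assumed values (separation minimum, look-ahead horizon, positions and speeds) follows.

        # Minimal sketch of a dead-reckoning conflict probe: straight-line projection
        # of current position/velocity and a single horizontal separation minimum.
        # The parameters are illustrative assumptions, not Conflict Alert settings.
        import numpy as np

        SEPARATION_NM = 5.0     # assumed horizontal separation standard
        LOOKAHEAD_MIN = 10.0    # probe horizon in minutes

        def dead_reckoning_conflict(p1, v1, p2, v2, step_min=0.5):
            """Return (conflict flag, first time of predicted loss of separation)."""
            for t in np.arange(0.0, LOOKAHEAD_MIN + step_min, step_min):
                if np.linalg.norm((p1 + v1 * t) - (p2 + v2 * t)) < SEPARATION_NM:
                    return True, t
            return False, None

        # Positions in NM, velocities in NM per minute (two converging tracks).
        conflict, t = dead_reckoning_conflict(np.array([0.0, 0.0]), np.array([8.0, 0.0]),
                                              np.array([40.0, 3.0]), np.array([-7.0, 0.0]))
        print("conflict predicted:", conflict, "at t =", t, "min")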

  18. Computer-aided diagnosis of contrast-enhanced spectral mammography: A feasibility study.

    PubMed

    Patel, Bhavika K; Ranjbar, Sara; Wu, Teresa; Pockaj, Barbara A; Li, Jing; Zhang, Nan; Lobbes, Mark; Zhang, Bin; Mitchell, J Ross

    2018-01-01

    To evaluate whether the use of a computer-aided diagnosis-contrast-enhanced spectral mammography (CAD-CESM) tool can further increase the diagnostic performance of CESM compared with that of experienced radiologists. This IRB-approved retrospective study analyzed 50 lesions described on CESM from August 2014 to December 2015. Histopathologic analyses, used as the criterion standard, revealed 24 benign and 26 malignant lesions. An expert breast radiologist manually outlined lesion boundaries on the different views. A set of morphologic and textural features were then extracted from the low-energy and recombined images. Machine-learning algorithms with feature selection were used along with statistical analysis to reduce, select, and combine features. Selected features were then used to construct a predictive model using a support vector machine (SVM) classification method in a leave-one-out-cross-validation approach. The classification performance was compared against the diagnostic predictions of 2 breast radiologists with access to the same CESM cases. Based on the SVM classification, CAD-CESM correctly identified 45 of 50 lesions in the cohort, resulting in an overall accuracy of 90%. The detection rate for the malignant group was 88% (3 false-negative cases) and 92% for the benign group (2 false-positive cases). Compared with the model, radiologist 1 had an overall accuracy of 78% and a detection rate of 92% (2 false-negative cases) for the malignant group and 62% (10 false-positive cases) for the benign group. Radiologist 2 had an overall accuracy of 86% and a detection rate of 100% for the malignant group and 71% (8 false-positive cases) for the benign group. The results of our feasibility study suggest that a CAD-CESM tool can provide complementary information to radiologists, mainly by reducing the number of false-positive findings. Copyright © 2017 Elsevier B.V. All rights reserved.
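
    The classification step described above can be sketched with a standard toolkit; the snippet below trains an SVM and scores it with leave-one-out cross-validation on synthetic features standing in for the morphologic and textural CESM features (all data are assumptions).

        # Minimal sketch: SVM with leave-one-out cross-validation, matching the study
        # design above, on toy data (assumption) rather than the CESM features.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(50, 8))                 # 50 lesions x 8 selected features (toy)
        y = (X[:, 0] + X[:, 1] > 0).astype(int)      # 0 = benign, 1 = malignant (toy labels)

        model = SVC(kernel="rbf", C=1.0)
        y_pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
        print("leave-one-out accuracy:", accuracy_score(y, y_pred))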

  19. Mammography screening using independent double reading with consensus: is there a potential benefit for computer-aided detection?

    PubMed

    Skaane, Per; Kshirsagar, Ashwini; Hofvind, Solveig; Jahr, Gunnar; Castellino, Ronald A

    2012-04-01

    Double reading improves the cancer detection rate in mammography screening. Single reading with computer-aided detection (CAD) has been considered to be an alternative to double reading. Little is known about the potential benefit of CAD in breast cancer screening with double reading. To compare prospective independent double reading of screen-film (SFM) and full-field digital (FFDM) mammography in population-based screening with retrospective standalone CAD performance on the baseline mammograms of the screen-detected cancers and subsequent cancers diagnosed during the follow-up period. The study had ethics committee approval. A 5-point rating scale for probability of cancer was used for 23,923 (SFM = 16,983; FFDM = 6940) screening mammograms. Of 208 evaluable cancers, 104 were screen-detected and 104 were subsequent (44 interval and 60 next screening round) cancers. Baseline mammograms of subsequent cancers were retrospectively classified in consensus without information about cancer location, histology, or CAD prompting as normal, non-specific minimal signs, significant minimal signs, and false-negatives. The baseline mammograms of the screen-detected cancers and subsequent cancers were evaluated by CAD. Significant minimal signs and false-negatives were considered 'actionable' and potentially diagnosable if correctly prompted by CAD. CAD correctly marked 94% (98/104) of the baseline mammograms of the screen-detected cancers (SFM = 95% [61/64]; FFDM = 93% [37/40]), including 96% (23/24) of those with discordant interpretations. Considering only those baseline examinations of subsequent cancers prospectively interpreted as normal and retrospectively categorized as 'actionable', CAD input at baseline screening had the potential to increase the cancer detection rate from 0.43% to 0.51% (P = 0.13); and to increase cancer detection by 16% ([104 + 17]/104) and decrease interval cancers by 20% (from 44 to 35). CAD may have the potential to increase cancer detection by up to 16%, and to reduce the number of interval cancers by up to 20% in SFM and FFDM screening programs using independent double reading with consensus review. The influence of true- and false-positive CAD marks on decision-making can, however, only be evaluated in a prospective clinical study.

  20. Narrowband signal detection in the SETI field test

    NASA Technical Reports Server (NTRS)

    Cullers, D. Kent; Deans, Stanley R.

    1986-01-01

    Various methods for detecting narrow-band signals are evaluated. The characteristics of synchronized and unsynchronized pulses are examined. Synchronous, square law, regular pulse, and the general form detections are discussed. The CW, single pulse, synchronous, and four pulse detections are analyzed in terms of false alarm rate and threshold relative to average noise power. Techniques for saving memory and retaining sensitivity are described. Consideration is given to nondrifting CW detection, asynchronous pulse detection, interpolative and extrapolative pulse detectors, and finite and infinite pulses.

  1. Noise-tolerant instantaneous heart rate and R-peak detection using short-term autocorrelation for wearable healthcare systems.

    PubMed

    Fujii, Takahide; Nakano, Masanao; Yamashita, Ken; Konishi, Toshihiro; Izumi, Shintaro; Kawaguchi, Hiroshi; Yoshimoto, Masahiko

    2013-01-01

    This paper describes a robust method of Instantaneous Heart Rate (IHR) and R-peak detection from noisy electrocardiogram (ECG) signals. Generally, the IHR is calculated from the R-wave interval. Then, the R-waves are extracted from the ECG using a threshold. However, in wearable bio-signal monitoring systems, noise increases the incidence of misdetection and false detection of R-peaks. To prevent incorrect detection, we introduce a short-term autocorrelation (STAC) technique and a small-window autocorrelation (SWAC) technique, which leverages the similarity of QRS complex waveforms. Simulation results show that the proposed method improves the noise tolerance of R-peak detection.
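
    The core idea of estimating the R-R interval from the dominant autocorrelation lag of a short window can be sketched as follows; the synthetic pulse train, sampling rate and lag bounds are assumptions, and the published STAC/SWAC method involves additional steps not reproduced here.

        # Minimal sketch: instantaneous heart rate from the dominant short-term
        # autocorrelation lag of a window, restricted to physiologically plausible
        # R-R intervals. Signal and parameters are illustrative assumptions.
        import numpy as np

        def instantaneous_heart_rate(window, fs, min_rr_s=0.4, max_rr_s=1.5):
            x = window - window.mean()
            acf = np.correlate(x, x, mode="full")[len(x) - 1:]   # non-negative lags
            lo, hi = int(min_rr_s * fs), int(max_rr_s * fs)
            rr_lag = lo + int(np.argmax(acf[lo:hi]))             # dominant periodicity
            return 60.0 * fs / rr_lag

        fs = 250                                                  # Hz (assumed sampling rate)
        t = np.arange(0, 5, 1 / fs)
        ecg_like = (np.sin(2 * np.pi * 1.2 * t) > 0.99).astype(float)   # ~72 bpm pulse train
        ecg_like += 0.05 * np.random.default_rng(1).normal(size=t.size)
        print(f"estimated IHR: {instantaneous_heart_rate(ecg_like, fs):.1f} bpm")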

  2. Wavelength band selection method for multispectral target detection.

    PubMed

    Karlholm, Jörgen; Renhorn, Ingmar

    2002-11-10

    A framework is proposed for the selection of wavelength bands for multispectral sensors by use of hyperspectral reference data. Using the results from the detection theory we derive a cost function that is minimized by a set of spectral bands optimal in terms of detection performance for discrimination between a class of small rare targets and clutter with known spectral distribution. The method may be used, e.g., in the design of multispectral infrared search and track and electro-optical missile warning sensors, where a low false-alarm rate and a high-detection probability for detection of small targets against a clutter background are of critical importance, but the required high frame rate prevents the use of hyperspectral sensors.

  3. A hierarchical approach for online temporal lobe seizure detection in long-term intracranial EEG recordings

    NASA Astrophysics Data System (ADS)

    Liang, Sheng-Fu; Chen, Yi-Chun; Wang, Yu-Lin; Chen, Pin-Tzu; Yang, Chia-Hsiang; Chiueh, Herming

    2013-08-01

    Objective. Around 1% of the world's population is affected by epilepsy, and nearly 25% of patients cannot be treated effectively by available therapies. The presence of closed-loop seizure-triggered stimulation provides a promising solution for these patients. Realization of fast, accurate, and energy-efficient seizure detection is the key to such implants. In this study, we propose a two-stage on-line seizure detection algorithm with low-energy consumption for temporal lobe epilepsy (TLE). Approach. Multi-channel signals are processed through independent component analysis and the most representative independent component (IC) is automatically selected to eliminate artifacts. Seizure-like intracranial electroencephalogram (iEEG) segments are fast detected in the first stage of the proposed method and these seizures are confirmed in the second stage. The conditional activation of the second-stage signal processing reduces the computational effort, and hence energy, since most of the non-seizure events are filtered out in the first stage. Main results. Long-term iEEG recordings of 11 patients who suffered from TLE were analyzed via leave-one-out cross validation. The proposed method has a detection accuracy of 95.24%, a false alarm rate of 0.09/h, and an average detection delay time of 9.2 s. For the six patients with mesial TLE, a detection accuracy of 100.0%, a false alarm rate of 0.06/h, and an average detection delay time of 4.8 s can be achieved. The hierarchical approach provides a 90% energy reduction, yielding effective and energy-efficient implementation for real-time epileptic seizure detection. Significance. An on-line seizure detection method that can be applied to monitor continuous iEEG signals of patients who suffered from TLE was developed. An IC selection strategy to automatically determine the most seizure-related IC for seizure detection was also proposed. The system has advantages of (1) high detection accuracy, (2) low false alarm, (3) short detection latency, and (4) energy-efficient design for hardware implementation.

  4. Waveforms for Active Sensing: Optical Waveform Design and Analysis for Ballistic Imaging Through Turbid Media

    DTIC Science & Technology

    2008-04-04

    Poisson process, the probabilities of false alarm PFA and detection PD are computed as ... receiver operating characteristics (ROC) (PD versus PFA) curves for detecting opaque objects in heavy fog, considering a detector with Xe = 20 and a laser power as in ... objects in this scattering medium up to a distance of 30 MFPs. Figure 4(b) shows the system performance curves by fixing the false alarm rate PFA at
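
    The exact expressions are not recoverable from this excerpt, but the general form of the Poisson detection statistics it refers to can be sketched: the false alarm probability is the chance that background counts alone reach the detection threshold, and the detection probability is the same tail probability under signal plus background. The count rates below are assumptions.

        # Minimal sketch of Poisson PFA/PD for a photon-counting detector; the rates
        # and thresholds are illustrative, not the report's values.
        from scipy.stats import poisson

        def pfa_pd(threshold, background_counts, signal_counts):
            pfa = poisson.sf(threshold - 1, background_counts)                 # P(N >= thr | H0)
            pd = poisson.sf(threshold - 1, background_counts + signal_counts)  # P(N >= thr | H1)
            return pfa, pd

        for thr in (5, 10, 15):
            pfa, pd = pfa_pd(thr, background_counts=3.0, signal_counts=8.0)
            print(f"threshold={thr:2d}  PFA={pfa:.4f}  PD={pd:.4f}")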

  5. Towards Accurate Node-Based Detection of P2P Botnets

    PubMed Central

    2014-01-01

    Botnets are a serious security threat to the current Internet infrastructure. In this paper, we propose a novel direction for P2P botnet detection called node-based detection. This approach focuses on the network characteristics of individual nodes. Based on our model, we examine each node's flows and extract useful features over a given time period. We have tested our approach on real-life data sets and achieved detection rates of 99-100% and low false positive rates of 0-2%. Comparison with other similar approaches on the same data sets shows that our approach outperforms the existing approaches. PMID:25089287

  6. Neural network for photoplethysmographic respiratory rate monitoring

    NASA Astrophysics Data System (ADS)

    Johansson, Anders

    2001-10-01

    The photoplethysmographic signal (PPG) includes respiratory components seen as frequency modulation of the heart rate (respiratory sinus arrhythmia, RSA), amplitude modulation of the cardiac pulse, and respiratory induced intensity variations (RIIV) in the PPG baseline. The aim of this study was to evaluate the accuracy of these components in determining respiratory rate, and to combine the components in a neural network for improved accuracy. The primary goal is to design a PPG ventilation monitoring system. PPG signals were recorded from 15 healthy subjects. From these signals, the systolic waveform, diastolic waveform, respiratory sinus arrhythmia, pulse amplitude and RIIV were extracted. By using simple algorithms, the rates of false positive and false negative detection of breaths were calculated for each of the five components in a separate analysis. Furthermore, a simple neural network (NN) was tried out in a combined pattern recognition approach. In the separate analysis, the error rates (sum of false positives and false negatives) ranged from 9.7% (pulse amplitude) to 14.5% (systolic waveform). The corresponding value of the NN analysis was 9.5-9.6%.

  7. A statistical method for the detection of variants from next-generation resequencing of DNA pools.

    PubMed

    Bansal, Vikas

    2010-06-15

    Next-generation sequencing technologies have enabled the sequencing of several human genomes in their entirety. However, the routine resequencing of complete genomes remains infeasible. The massive capacity of next-generation sequencers can be harnessed for sequencing specific genomic regions in hundreds to thousands of individuals. Sequencing-based association studies are currently limited by the low level of multiplexing offered by sequencing platforms. Pooled sequencing represents a cost-effective approach for studying rare variants in large populations. To utilize the power of DNA pooling, it is important to accurately identify sequence variants from pooled sequencing data. Detection of rare variants from pooled sequencing represents a different challenge than detection of variants from individual sequencing. We describe a novel statistical approach, CRISP [Comprehensive Read analysis for Identification of Single Nucleotide Polymorphisms (SNPs) from Pooled sequencing] that is able to identify both rare and common variants by using two approaches: (i) comparing the distribution of allele counts across multiple pools using contingency tables and (ii) evaluating the probability of observing multiple non-reference base calls due to sequencing errors alone. Information about the distribution of reads between the forward and reverse strands and the size of the pools is also incorporated within this framework to filter out false variants. Validation of CRISP on two separate pooled sequencing datasets generated using the Illumina Genome Analyzer demonstrates that it can detect 80-85% of SNPs identified using individual sequencing while achieving a low false discovery rate (3-5%). Comparison with previous methods for pooled SNP detection demonstrates the significantly lower false positive and false negative rates for CRISP. Implementation of this method is available at http://polymorphism.scripps.edu/~vbansal/software/CRISP/.
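
    The two checks described above can be illustrated with standard statistical tests; the read counts, error rate and test choices below are simplified assumptions and not the CRISP implementation itself.

        # Minimal sketch of the two pooled-variant checks described above, on assumed
        # counts: (i) are non-reference calls concentrated in particular pools (a real
        # variant) rather than spread evenly (sequencing error)? (ii) does the total
        # non-reference count exceed what the error rate alone would explain?
        import numpy as np
        from scipy.stats import chi2_contingency, binomtest

        alt_counts = np.array([14, 0, 1, 0, 0, 1])                 # non-reference reads per pool
        ref_counts = np.array([186, 200, 199, 200, 200, 199])      # reference reads per pool
        error_rate = 0.005                                         # assumed per-base error rate

        chi2, p_pools, _, _ = chi2_contingency(np.vstack([alt_counts, ref_counts]))

        total_alt = int(alt_counts.sum())
        total_reads = int((alt_counts + ref_counts).sum())
        p_error = binomtest(total_alt, total_reads, error_rate, alternative="greater").pvalue

        print(f"pool-skew p-value: {p_pools:.3g}, error-only p-value: {p_error:.3g}")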

  8. Supporting the Development and Adoption of Automatic Lameness Detection Systems in Dairy Cattle: Effect of System Cost and Performance on Potential Market Shares

    PubMed Central

    Van Weyenberg, Stephanie; Van Nuffel, Annelies; Lauwers, Ludwig; Vangeyte, Jürgen

    2017-01-01

    Simple Summary Most prototypes of systems to automatically detect lameness in dairy cattle are still not available on the market. Estimating their potential adoption rate could support developers in defining development goals towards commercially viable and well-adopted systems. We simulated the potential market shares of such prototypes to assess the effect of altering the system cost and detection performance on the potential adoption rate. We found that system cost and lameness detection performance indeed substantially influence the potential adoption rate. In order for farmers to prefer automatic detection over current visual detection, the usefulness that farmers attach to a system with specific characteristics should be higher than that of visual detection. As such, we concluded that low system costs and high detection performances are required before automatic lameness detection systems become applicable in practice. Abstract Most automatic lameness detection system prototypes have not yet been commercialized, and are hence not yet adopted in practice. Therefore, the objective of this study was to simulate the effect of detection performance (percentage missed lame cows and percentage false alarms) and system cost on the potential market share of three automatic lameness detection systems relative to visual detection: a system attached to the cow, a walkover system, and a camera system. Simulations were done using a utility model derived from survey responses obtained from dairy farmers in Flanders, Belgium. Overall, systems attached to the cow had the largest market potential, but were still not competitive with visual detection. Increasing the detection performance or lowering the system cost led to higher market shares for automatic systems at the expense of visual detection. The willingness to pay for extra performance was €2.57 per % less missed lame cows, €1.65 per % less false alerts, and €12.7 for lame leg indication, respectively. The presented results could be exploited by system designers to determine the effect of adjustments to the technology on a system’s potential adoption rate. PMID:28991188

  9. Sentinel lymph node detection in patients with early cervical cancer.

    PubMed

    Acharya, B C; Jihong, L

    2009-01-01

    Lymph node status is the most important independent prognostic factor in early-stage cervical cancer. Intraoperative lymphatic mapping and sentinel lymph node detection have been increasingly evaluated in the treatment of a variety of solid tumors, particularly breast cancer and cutaneous melanoma. This study evaluated the feasibility of these procedures in patients undergoing radical hysterectomy with pelvic lymphadenectomy for early cervical cancer. A total of 30 patients with histologically diagnosed FIGO stage IA to IIA cervical cancer were enrolled in this study. They were scheduled to undergo radical abdominal hysterectomy and pelvic lymphadenectomy after injection of patent blue dye into the cervix. A total of 60 SLNs (mean 2.5) were detected in 24 patients, giving a detection rate of 80%. Bilateral SLNs were detected in 70.1% of cases. SLNs were identified in the obturator and external iliac areas in 50% and 31.7% of cases, respectively; no SLNs were discovered in the common iliac region. Seven patients (23.3%) had lymph node metastases; one of these had a false-negative SLN. The false-negative rate and negative predictive value were 14.3% and 94.4%, respectively. SLN detection with the blue dye technique is a feasible procedure in cervical cancer. Patent blue dye is a cheap, safe and effective tracer for detecting the sentinel node in carcinoma of the cervix.

  10. Detection of exudates in fundus imagery using a constant false-alarm rate (CFAR) detector

    NASA Astrophysics Data System (ADS)

    Khanna, Manish; Kapoor, Elina

    2014-05-01

    Diabetic retinopathy is the leading cause of blindness in adults in the United States. The presence of exudates in fundus imagery is an early sign of diabetic retinopathy, so detection of these lesions is essential in preventing further ocular damage. In this paper we present a novel technique to automatically detect exudates in fundus imagery that is robust against spatial and temporal variations of background noise. The detection threshold is adjusted dynamically, based on the local noise statistics around the pixel under test, in order to maintain a pre-determined, constant false alarm rate (CFAR). The CFAR detector is often used to detect bright targets in radar imagery, where the background clutter can vary considerably from scene to scene and with angle to the scene. Similarly, the CFAR detector addresses the challenge of detecting exudate lesions in RGB and multispectral fundus imagery, where the background clutter often exhibits variations in brightness and texture. These variations present a challenge to common global-thresholding detection algorithms and other methods. Performance of the CFAR algorithm was tested against a publicly available, annotated diabetic retinopathy database, and preliminary testing suggests that the performance of the CFAR detector is superior to techniques such as Otsu thresholding.
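
    A cell-averaging CFAR detector of the kind described above can be sketched for 2-D imagery; the guard and training window sizes and the scale factor below are assumptions, not the parameters of the exudate detector.

        # Minimal sketch of a 2-D cell-averaging CFAR: each pixel is compared against
        # a threshold built from the local mean and standard deviation of a training
        # band (a large window with a guard window removed around the pixel under
        # test). Window sizes and the scale factor are illustrative assumptions.
        import numpy as np
        from scipy.ndimage import uniform_filter

        def cfar_detect(image, guard=2, train=8, scale=4.0):
            img = image.astype(float)
            big, small = 2 * (guard + train) + 1, 2 * guard + 1
            sum_big = uniform_filter(img, big) * big ** 2
            sum_small = uniform_filter(img, small) * small ** 2
            sumsq_big = uniform_filter(img ** 2, big) * big ** 2
            sumsq_small = uniform_filter(img ** 2, small) * small ** 2
            n = big ** 2 - small ** 2
            mean = (sum_big - sum_small) / n
            var = np.maximum((sumsq_big - sumsq_small) / n - mean ** 2, 0.0)
            return img > mean + scale * np.sqrt(var)

        rng = np.random.default_rng(0)
        frame = rng.normal(50.0, 3.0, (64, 64))
        frame[32, 32] += 30.0                      # bright lesion-like pixel
        print("detections:", np.argwhere(cfar_detect(frame)))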

  11. Acoustic intrusion detection and positioning system

    NASA Astrophysics Data System (ADS)

    Berman, Ohad; Zalevsky, Zeev

    2002-08-01

    Acoustic sensors are becoming more and more applicable as a military battlefield technology. These sensors allow detection and direction estimation with a low false alarm rate and a high probability of detection. Recent technological progress in these fields of research, together with the evolution of sophisticated algorithms, allows the successful integration of these sensors into battlefield technologies. In this paper the performance of an acoustic sensor for the detection of avionic vessels is investigated and analyzed.

  12. Clinical utility of contrast-enhanced spectral mammography as an adjunct for tomosynthesis-detected architectural distortion.

    PubMed

    Patel, Bhavika K; Naylor, Michelle E; Kosiorek, Heidi E; Lopez-Alvarez, Yania M; Miller, Adrian M; Pizzitola, Victor J; Pockaj, Barbara A

    To supplement tomosynthesis-detected architectural distortions (AD) with CESM in order to better characterize malignant versus benign lesions. Retrospective review of CESM performed prior to biopsy of AD, with pathology classified as benign, radial scar, or malignant. Forty-nine lesions in 45 patients were included: 29 invasive cancers and 1 DCIS (range, 0.4-4.7 cm), 9 radial scars, and 10 benign lesions. Thirty-seven (75.5%) ADs had associated enhancement. PPV was 78.4% (29/37), sensitivity 96.7% (29/30), specificity 57.9% (11/19), and NPV 91.7% (11/12). The false-positive rate was 21.6% (8/37) and the false-negative rate 8.3% (1/12). Accuracy was 81.6% (40/49). The high sensitivity and NPV of CESM in patients with AD are promising for its use as an adjunct tool in diagnosing malignancy and avoiding unnecessary biopsy, respectively. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Space debris detection in optical image sequences.

    PubMed

    Xi, Jiangbo; Wen, Desheng; Ersoy, Okan K; Yi, Hongwei; Yao, Dalei; Song, Zongxi; Xi, Shaobo

    2016-10-01

    We present a high-accuracy, low false-alarm rate, and low computational-cost methodology for removing stars and noise and detecting space debris with low signal-to-noise ratio (SNR) in optical image sequences. First, time-index filtering and bright star intensity enhancement are implemented to remove stars and noise effectively. Then, a multistage quasi-hypothesis-testing method is proposed to detect the pieces of space debris with continuous and discontinuous trajectories. For this purpose, a time-index image is defined and generated. Experimental results show that the proposed method can detect space debris effectively without any false alarms. When the SNR is higher than or equal to 1.5, the detection probability can reach 100%, and when the SNR is as low as 1.3, 1.2, and 1, it can still achieve 99%, 97%, and 85% detection probabilities, respectively. Additionally, two large sets of image sequences are tested to show that the proposed method performs stably and effectively.

  14. Application of a CO2 dial system for infrared detection of forest fire and reduction of false alarm

    NASA Astrophysics Data System (ADS)

    Bellecci, C.; Francucci, M.; Gaudio, P.; Gelfusa, M.; Martellucci, S.; Richetta, M.; Lo Feudo, T.

    2007-04-01

    Forest fires can be the cause of serious environmental and economic damage. For this reason, considerable effort has been directed toward forest protection and fire fighting. The means traditionally used for early fire detection consist mainly of human observers dispersed over forest regions. A significant improvement in early-warning capability could be obtained by using automatic detection apparatus. In order to detect small forest fires early and minimize false alarms, the use of a lidar system and the DIAL (differential absorption lidar) technique is considered. A first evaluation of the lowest detectable concentration is estimated by numerical simulation. The theoretical model is also used to assess the capability of the DIAL system to monitor wooded areas. Fixing the burning rate for several fuels, the maximum detection range is evaluated. Finally, the results of the simulations are reported.

  15. Application of passive imaging polarimetry in the discrimination and detection of different color targets of identical shapes using color-blind imaging sensors

    NASA Astrophysics Data System (ADS)

    El-Saba, A. M.; Alam, M. S.; Surpanani, A.

    2006-05-01

    An important aspect of automatic pattern recognition systems is their ability to efficiently discriminate and detect the proper targets with few false alarms. In this paper we extend the application of passive imaging polarimetry to effectively discriminate and detect targets of identical shape but different color using a color-blind imaging sensor. For this case study we demonstrate that traditional color-blind, polarization-insensitive imaging sensors that rely only on the spatial distribution of targets suffer from high false detection rates, especially in scenarios where multiple targets of identical shape are present. On the other hand, we show that color-blind, polarization-sensitive imaging sensors can successfully and efficiently discriminate and detect true targets based on their color alone. We highlight the main advantages of using our proposed polarization-encoded imaging sensor.

  16. Toward Failure Modeling In Complex Dynamic Systems: Impact of Design and Manufacturing Variations

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; McAdams, Daniel A.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    When designing vehicle vibration monitoring systems for aerospace devices, it is common to use well-established models of vibration features to determine whether failures or defects exist. Most of the algorithms used for failure detection rely on these models to detect significant changes during a flight environment. In actual practice, however, most vehicle vibration monitoring systems are corrupted by high rates of false alarms and missed detections. Research conducted at the NASA Ames Research Center has determined that a major reason for the high rates of false alarms and missed detections is the numerous sources of statistical variation that are not taken into account in the modeling assumptions. In this paper, we address one such source of variation, namely, variations caused during the design and manufacturing of the rotating machinery components that make up aerospace systems. We present a novel way of modeling the vibration response by including design variations via probabilistic methods. The results demonstrate initial feasibility of the method, showing great promise for developing a general methodology for designing more accurate aerospace vehicle vibration monitoring systems.

  17. Is screening with digital imaging using one retinal view adequate?

    PubMed

    Herbert, H M; Jordan, K; Flanagan, D W

    2003-05-01

    To compare the detection of diabetic retinopathy from digital images with slit-lamp biomicroscopy, and to determine whether British Diabetic Association (BDA) screening criteria are attained (>80% sensitivity, >95% specificity, <5% technical failure). Diabetics referred for screening were studied in a prospective fashion. A single 45 degrees fundus image was obtained using the nonmydriatic digital camera. Each patient subsequently underwent slit-lamp biomicroscopy and diabetic retinopathy grading by a consultant ophthalmologist. Diabetic retinopathy and maculopathy was graded according to the Early Treatment of Diabetic Retinopathy Study. A total of 145 patients (288 eyes) were identified for screening. Of these, 26% of eyes had diabetic retinopathy, and eight eyes (3%) had sight-threatening diabetic retinopathy requiring treatment. The sensitivity for detection of any diabetic retinopathy was 38% and the specificity 95%. There was a 4% technical failure rate. There were 42/288 false negatives and 10/288 false positives. Of the 42 false negatives, 18 represented diabetic maculopathy, 20 represented peripheral diabetic retinopathy and four eyes had both macular and peripheral changes. Three eyes in the false-negative group (1% of total eyes) had sight-threatening retinopathy. There was good concordance between the two consultants (79% agreement on slit-lamp biomicroscopy and 84% on digital image interpretation). The specificity value and technical failure rate compare favourably with BDA guidelines. The low sensitivity for detection of any retinopathy reflects failure to detect minimal maculopathy and retinopathy outside the 45 degrees image. This could be improved by an additional nasal image and careful evaluation of macular images with a low threshold for slit-lamp biomicroscopy if image quality is poor.

  18. Discrete False-Discovery Rate Improves Identification of Differentially Abundant Microbes.

    PubMed

    Jiang, Lingjing; Amir, Amnon; Morton, James T; Heller, Ruth; Arias-Castro, Ery; Knight, Rob

    2017-01-01

    Differential abundance testing is a critical task in microbiome studies that is complicated by the sparsity of data matrices. Here we adapt for microbiome studies a solution from the field of gene expression analysis to produce a new method, discrete false-discovery rate (DS-FDR), that greatly improves the power to detect differential taxa by exploiting the discreteness of the data. Additionally, DS-FDR is relatively robust to the number of noninformative features, and thus removes the problem of filtering taxonomy tables by an arbitrary abundance threshold. We show by using a combination of simulations and reanalysis of nine real-world microbiome data sets that this new method outperforms existing methods at the differential abundance testing task, producing a false-discovery rate that is up to threefold more accurate, and halves the number of samples required to find a given difference (thus increasing the efficiency of microbiome experiments considerably). We therefore expect DS-FDR to be widely applied in microbiome studies. IMPORTANCE DS-FDR can achieve higher statistical power to detect significant findings in sparse and noisy microbiome data compared to the commonly used Benjamini-Hochberg procedure and other FDR-controlling procedures.
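
    For readers unfamiliar with the baseline being compared against, the Benjamini-Hochberg step-up procedure can be sketched in a few lines; this is the standard BH rule, not the discrete DS-FDR permutation machinery itself, and the p-values are illustrative.

        # Minimal sketch of the Benjamini-Hochberg step-up procedure (the comparison
        # baseline mentioned above). Returns a boolean mask of rejected hypotheses.
        import numpy as np

        def benjamini_hochberg(p_values, alpha=0.05):
            p = np.asarray(p_values, dtype=float)
            order = np.argsort(p)
            ranked = p[order]
            m = len(p)
            below = ranked <= alpha * np.arange(1, m + 1) / m
            rejected = np.zeros(m, dtype=bool)
            if below.any():
                cutoff = int(np.max(np.nonzero(below)))   # largest k satisfying the step-up rule
                rejected[order[:cutoff + 1]] = True
            return rejected

        print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20, 0.74]))
        # -> [ True  True False False False False]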

  19. Modulation of Emotional Appraisal by False Physiological Feedback during fMRI

    PubMed Central

    Gray, Marcus A.; Harrison, Neil A.; Wiens, Stefan; Critchley, Hugo D.

    2007-01-01

    Background James and Lange proposed that emotions are the perception of physiological reactions. Two-level theories of emotion extend this model to suggest that cognitive interpretations of physiological changes shape self-reported emotions. Correspondingly, false physiological feedback of evoked or tonic bodily responses can alter emotional attributions. Moreover, anxiety states are proposed to arise from detection of a mismatch between actual and anticipated states of physiological arousal. However, the neural underpinnings of these phenomena have not previously been examined. Methodology/Principal Findings We undertook a functional brain imaging (fMRI) experiment to investigate how both primary and second-order levels of physiological (viscerosensory) representation impact the processing of external emotional cues. 12 participants were scanned while judging face stimuli during both exercise and non-exercise conditions in the context of true and false auditory feedback of tonic heart rate. We observed that the perceived emotional intensity/salience of neutral faces was enhanced by false feedback of increased heart rate. Regional changes in neural activity corresponding to this behavioural interaction were observed within the right anterior insula, bilateral mid insula, and amygdala. In addition, right anterior insula activity was enhanced by asynchronous relative to synchronous cardiac feedback, even with no change in perceived or actual heart rate, suggesting this region serves as a comparator to detect physiological mismatches. Finally, BOLD activity within the right anterior insula and amygdala predicted the corresponding changes in perceived intensity ratings at both a group and an individual level. Conclusions/Significance Our findings identify the neural substrates supporting behavioural effects of false physiological feedback, and highlight mechanisms that underlie subjective anxiety states, including the importance of the right anterior insula in guiding second-order “cognitive” representations of bodily arousal state. PMID:17579718

  20. A study of malware detection on smart mobile devices

    NASA Astrophysics Data System (ADS)

    Yu, Wei; Zhang, Hanlin; Xu, Guobin

    2013-05-01

    The growing use of smart mobile devices for everyday applications has stimulated the spread of mobile malware, especially on popular mobile platforms. As a consequence, malware detection becomes ever more critical in sustaining the mobile market and providing a better user experience. In this paper, we review existing malware and detection schemes. Using real-world malware samples with known signatures, we evaluate four popular commercial anti-virus tools, and our data show that these tools can achieve high detection accuracy. To deal with new malware with unknown signatures, we study anomaly-based detection using a decision tree algorithm. We evaluate the effectiveness of our detection scheme using malware and legitimate software samples. Our data show that the detection scheme using a decision tree can achieve a detection rate of up to 90% and a false positive rate as low as 10%.
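
    The anomaly-based scheme evaluated above can be sketched with a standard decision-tree classifier; the synthetic feature vectors below stand in for real behavioural features (an assumption), and the point is only to show how detection and false positive rates are measured.

        # Minimal sketch: decision-tree malware classifier on toy feature vectors,
        # reporting detection rate and false positive rate on a held-out split.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import confusion_matrix

        rng = np.random.default_rng(42)
        X_benign = rng.normal(0.0, 1.0, size=(200, 6))     # toy "legitimate" feature vectors
        X_malware = rng.normal(1.5, 1.0, size=(200, 6))    # toy "malware" feature vectors
        X = np.vstack([X_benign, X_malware])
        y = np.array([0] * 200 + [1] * 200)                # 0 = legitimate, 1 = malware

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
        tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
        print(f"detection rate: {tp / (tp + fn):.2f}, false positive rate: {fp / (fp + tn):.2f}")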

  1. Hypothesis tests for the detection of constant speed radiation moving sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir

    2015-07-01

    Radiation Portal Monitors are deployed in linear networks to detect radiological material in motion. As a complement to single-channel and multichannel detection algorithms, which are inefficient at very low signal-to-noise ratios, temporal correlation algorithms have been introduced. Hypothesis test methods based on the empirically estimated mean and variance of the signals delivered by the different channels have shown a significant gain in the tradeoff between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods, taking advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that, in the four relevant configurations of a pedestrian source carrier under high and low count-rate radioactive background, and a vehicle source carrier under the same high and low count-rate radioactive background, the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarm, while guaranteeing the stability of its optimization parameter for signal-to-noise ratio variations between 2 and 0.8. (authors)
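
    A single-channel version of the Poisson counting test alluded to above can be sketched as follows; the background rate, window length and alarm level are assumptions, and the published method additionally exploits temporal correlation between channels.

        # Minimal sketch: one-sided Poisson test of a channel's counts against a known
        # background rate; alarm when the counts are implausibly high under background
        # alone. All numerical values are illustrative assumptions.
        from scipy.stats import poisson

        def channel_alarm(counts, background_rate_cps, window_s, p_false_alarm=1e-3):
            expected = background_rate_cps * window_s
            p_value = poisson.sf(counts - 1, expected)    # P(N >= counts | background only)
            return p_value < p_false_alarm, p_value

        for counts in (28, 45):
            alarm, p = channel_alarm(counts, background_rate_cps=5.0, window_s=5.0)
            print(f"counts={counts}: p-value={p:.2e}, alarm={alarm}")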

  2. Simple immunoassay for detection of PCBs in transformer oil.

    PubMed

    Glass, Thomas R; Ohmura, Naoya; Taemi, Yukihiro; Joh, Takashi

    2005-07-01

    A rapid and inexpensive procedure to detect polychlorinated biphenyls (PCBs) in transformer oil is needed to facilitate identification and removal of PCB contaminated transformers. Here we describe a simple two-step liquid-liquid extraction using acidic dimethyl sulfoxide in conjunction with an immunoassay for detecting PCBs in transformer oil. The process described is faster and simpler than any previous immunoassay while maintaining comparable detection limit and false negative rate. Cross reactivity data, characterizing the immunoassay response to the four Kanechlor technical mixtures of PCBs in oil, are presented. Forty-five used transformer oil samples were analyzed by gas chromatography-high-resolution mass spectrometry and were also evaluated using the immunoassay protocol developed. Results presented show zero false negatives at a 1.4 ppm nominal cutoff for the transformer oils analyzed.

  3. Autofluorescence imaging to optimize 5-ALA-induced fluorescence endoscopy of bladder carcinoma.

    PubMed

    Frimberger, D; Zaak, D; Stepp, H; Knüchel, R; Baumgartner, R; Schneede, P; Schmeller, N; Hofstetter, A

    2001-09-01

    To design an optical system for detecting autofluorescence (AF) of bladder tumors and to determine the success of reducing the false-positive rate of 5-aminolevulinic acid-induced fluorescence endoscopy (AFE). AFE provides significantly higher sensitivity in detecting and localizing bladder carcinoma compared with white light endoscopy. The specificity of AFE is equivalent to white light endoscopy, mostly because of the false-positive fluorescence of chronic cystitis lesions. Laser-induced spectral autofluorescence detection is also an efficient method in the diagnosis of bladder carcinoma. Bladder tissue was excited to AF using the D-Light (375 to 440 nm) after regular AFE with detection of fluorescence-positive areas. The optical image was produced using a special RGB camera. Biopsies were taken from AFE-positive areas, the peritumoral edges, and normal bladder mucosa. The AF images of the suspicious areas were compared with the AFE images and the histologic results. A total of 43 biopsies were histologically examined (24 benign and 19 neoplastic). AF imaging showed contrast differences between papillary tumors, flat lesions, and normal mucosa. The combination of AFE with AF raised the specificity of AFE alone from 67% to 88%. AF imaging is possible. The value of the method in reducing the false-positive rate of the highly sensitive AFE needs to be validated with higher numbers. The combination of AF with AFE had a 20% higher specificity than AFE alone in our study.

  4. Is sequencing better than phenotypic tests for the detection of pyrazinamide resistance?

    PubMed

    Bouzouita, I; Cabibbe, A M; Trovato, A; Draoui, H; Ghariani, A; Midouni, B; Essalah, L; Mehiri, E; Cirillo, D M; Slim-Saidi, L

    2018-06-01

    Phenotypic tests used to detect pyrazinamide (PZA) resistance are slow and have a high rate of false resistance. To evaluate the accuracy of pncA sequencing for the detection of PZA resistance in Mycobacterium tuberculosis strains isolated in Tunisia. A total of 82 isolates, 41 resistant and 41 susceptible to PZA on BACTEC™ MGIT™ 960, were sequenced for pncA. Whole genome sequencing was performed for strains that were phenotypically resistant and had wild-type pncA in addition to MGIT retesting with a modified protocol. Twenty-three strains resistant to PZA with negative pyrazinamidase (PZase) activity harboured a mutation in the promoter or coding region of pncA. However, 18 strains resistant to PZA did not present any mutation. Repeat MGIT 960 showed that 16 of 18 M. tuberculosis isolates were falsely resistant to PZA. Compared with MGIT, PZase activity assay and pncA sequencing both presented a sensitivity of 92.0% (95%CI 73.9-99.0) and a specificity of respectively 96.5% (positive predictive value [PPV] 92.0%, negative predictive value [NPV] 96.5%) and 100.0% (PPV 100.0%, NPV 96.6%). The standard MGIT assay showed a high rate of false resistance to PZA, and the PZase activity assay is slow. pncA sequencing could therefore represent a rapid, accurate, alternative test to detect PZA resistance.
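
    For readers who want to verify how such figures are computed, the snippet below evaluates sensitivity, specificity, PPV, and NPV from a generic 2x2 confusion matrix; the counts are placeholders, not the study's data:

    ```python
    # Generic 2x2 diagnostic-test arithmetic (placeholder counts, not the study's data).
    def diagnostic_metrics(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)   # resistant isolates correctly called resistant
        specificity = tn / (tn + fp)   # susceptible isolates correctly called susceptible
        ppv = tp / (tp + fp)           # positive predictive value
        npv = tn / (tn + fn)           # negative predictive value
        return sensitivity, specificity, ppv, npv

    sens, spec, ppv, npv = diagnostic_metrics(tp=45, fp=5, fn=5, tn=95)
    print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, PPV {ppv:.1%}, NPV {npv:.1%}")
    ```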

  5. Improving signal-to-noise in the direct imaging of exoplanets and circumstellar disks with MLOCI

    NASA Astrophysics Data System (ADS)

    Wahhaj, Zahed; Cieza, Lucas A.; Mawet, Dimitri; Yang, Bin; Canovas, Hector; de Boer, Jozua; Casassus, Simon; Ménard, François; Schreiber, Matthias R.; Liu, Michael C.; Biller, Beth A.; Nielsen, Eric L.; Hayward, Thomas L.

    2015-09-01

    We present a new algorithm designed to improve the signal-to-noise ratio (S/N) of point and extended source detections around bright stars in direct imaging data. One of our innovations is that we insert simulated point sources into the science images, which we then try to recover with maximum S/N. This improves the S/N of real point sources elsewhere in the field. The algorithm, based on the locally optimized combination of images (LOCI) method, is called Matched LOCI or MLOCI. We show with Gemini Planet Imager (GPI) data on HD 135344 B and Near-Infrared Coronagraphic Imager (NICI) data on several stars that the new algorithm can improve the S/N of point source detections by 30-400% over past methods. We also find no increase in false detection rates. No prior knowledge of candidate companion locations is required to use MLOCI. On the other hand, while non-blind applications may yield linear combinations of science images that seem to increase the S/N of true sources by a factor >2, they can also yield false detections at high rates. This is a potential pitfall when trying to confirm marginal detections or to redetect point sources found in previous epochs. These findings are relevant to any method where the coefficients of the linear combination are considered tunable, e.g., LOCI and principal component analysis (PCA). Thus we recommend that false detection rates be analyzed when using these techniques. Based on observations obtained at the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the NSF on behalf of the Gemini partnership: the National Science Foundation (USA), the Science and Technology Facilities Council (UK), the National Research Council (Canada), CONICYT (Chile), the Australian Research Council (Australia), Ministério da Ciência e Tecnologia (Brazil) and Ministerio de Ciencia, Tecnología e Innovación Productiva (Argentina).

  6. Analysis of Infrared Signature Variation and Robust Filter-Based Supersonic Target Detection

    PubMed Central

    Sun, Sun-Gu; Kim, Kyung-Tae

    2014-01-01

    The difficulty of small infrared target detection originates from the variations of infrared signatures. This paper presents the fundamental physics of infrared target variations and reports the results of variation analysis of infrared images acquired using a long wave infrared camera over a 24-hour period for different types of backgrounds. The detection parameters, such as signal-to-clutter ratio were compared according to the recording time, temperature and humidity. Through variation analysis, robust target detection methodologies are derived by controlling thresholds and designing a temporal contrast filter to achieve high detection rate and low false alarm rate. Experimental results validate the robustness of the proposed scheme by applying it to the synthetic and real infrared sequences. PMID:24672290

  7. A research using hybrid RBF/Elman neural networks for intrusion detection system secure model

    NASA Astrophysics Data System (ADS)

    Tong, Xiaojun; Wang, Zhu; Yu, Haining

    2009-10-01

    A hybrid RBF/Elman neural network model that can be employed for both anomaly detection and misuse detection is presented in this paper. IDSs using the hybrid neural network can detect temporally dispersed and collaborative attacks effectively because of its memory of past events. The RBF network is employed as a real-time pattern classifier, and the Elman network is employed to retain the memory of past events. The IDSs using the hybrid neural network are evaluated against the intrusion detection evaluation data sponsored by the U.S. Defense Advanced Research Projects Agency (DARPA). Experimental results are presented as ROC curves. Experiments show that IDSs using this hybrid neural network improve the detection rate and decrease the false positive rate effectively.

  8. Screening for subclinical Cushing's syndrome in type 2 diabetes mellitus: low false-positive rates with nocturnal salivary cortisol.

    PubMed

    Gagliardi, L; Chapman, I M; O'Loughlin, P; Torpy, D J

    2010-04-01

    The diagnosis of subclinical Cushing's syndrome (SCS) is important, but its relative rarity amongst patients with common metabolic disorders requires a simple test with a low false-positive rate. Using nocturnal salivary cortisol (NSC), which we first validated in patients with suspected and proven Cushing's syndrome, we screened 106 overweight patients with type 2 diabetes mellitus, a group at high risk of SCS and nontumoral hypothalamic-pituitary-adrenal axis perturbations. Our hypothesis was that a lower false-positive rate with NSC was likely, compared with that reported with the dexamethasone suppression test (DST) (10-20%), currently the foundation of diagnosis of SCS. No participant had clinically apparent Cushing's syndrome. Three participants had an elevated NSC but further testing excluded SCS. In this study, NSC had a lower false-positive rate (3%) than previously reported for the DST. Given the reported excellent performance of NSC in detection of hypercortisolism, the low false-positive rate in SCS suggests NSC may be superior to the DST for SCS screening. The NSC and DST should be compared directly in metabolic disorder patients; although our data suggest the patient group will need to be substantially larger to definitively determine the optimal screening test. Georg Thieme Verlag KG Stuttgart New York.

  9. Reducing false positives of microcalcification detection systems by removal of breast arterial calcifications.

    PubMed

    Mordang, Jan-Jurre; Gubern-Mérida, Albert; den Heeten, Gerard; Karssemeijer, Nico

    2016-04-01

    In the past decades, computer-aided detection (CADe) systems have been developed to aid screening radiologists in the detection of malignant microcalcifications. These systems are useful to avoid perceptual oversights and can increase the radiologists' detection rate. However, due to the high number of false positives marked by these CADe systems, they are not yet suitable as an independent reader. Breast arterial calcifications (BACs) are one of the most frequent false positives marked by CADe systems. In this study, a method is proposed for the elimination of BACs as positive findings. Removal of these false positives will increase the performance of the CADe system in finding malignant microcalcifications. A multistage method is proposed for the removal of BAC findings. The first stage consists of a microcalcification candidate selection, segmentation and grouping of the microcalcifications, and classification to remove obvious false positives. In the second stage, a case-based selection is applied where cases are selected which contain BACs. In the final stage, BACs are removed from the selected cases. The BACs removal stage consists of a GentleBoost classifier trained on microcalcification features describing their shape, topology, and texture. Additionally, novel features are introduced to discriminate BACs from other positive findings. The CADe system was evaluated with and without BACs removal. Here, both systems were applied on a validation set containing 1088 cases of which 95 cases contained malignant microcalcifications. After bootstrapping, free-response receiver operating characteristics and receiver operating characteristics analyses were carried out. Performance between the two systems was compared at 0.98 and 0.95 specificity. At a specificity of 0.98, the sensitivity increased from 37% to 52% and the sensitivity increased from 62% up to 76% at a specificity of 0.95. Partial areas under the curve in the specificity range of 0.8-1.0 were significantly different between the system without BACs removal and the system with BACs removal, 0.129 ± 0.009 versus 0.144 ± 0.008 (p<0.05), respectively. Additionally, the sensitivity at one false positive per 50 cases and one false positive per 25 cases increased as well, 37% versus 51% (p<0.05) and 58% versus 67% (p<0.05) sensitivity, respectively. Additionally, the CADe system with BACs removal reduces the number of false positives per case by 29% on average. The same sensitivity at one false positive per 50 cases in the CADe system without BACs removal can be achieved at one false positive per 80 cases in the CADe system with BACs removal. By using dedicated algorithms to detect and remove breast arterial calcifications, the performance of CADe systems can be improved, in particular, at false positive rates representative for operating points used in screening.

  10. Incorporation of operator knowledge for improved HMDS GPR classification

    NASA Astrophysics Data System (ADS)

    Kennedy, Levi; McClelland, Jessee R.; Walters, Joshua R.

    2012-06-01

    The Husky Mine Detection System (HMDS) detects and alerts operators to potential threats observed in ground-penetrating radar (GPR) data. In the current system architecture, the classifiers have been trained using available data from multiple training sites. Changes in target types, clutter types, and operational conditions may result in statistical differences between the training data and the testing data for the underlying features used by the classifier, potentially resulting in an increased false alarm rate or a lower probability of detection for the system. In the current mode of operation, the automated detection system alerts the human operator when a target-like object is detected. The operator then uses data visualization software, contextual information, and human intuition to decide whether the alarm presented is an actual target or a false alarm. When the statistics of the training data and the testing data are mismatched, the automated detection system can overwhelm the analyst with an excessive number of false alarms. This is evident in the performance of, and the data collected from, deployed systems. This work demonstrates that analyst feedback can be successfully used to re-train a classifier to account for variable testing data statistics not originally captured in the initial training data.

  11. Foraging Parameters Influencing the Detection and Interpretation of Area-Restricted Search Behaviour in Marine Predators: A Case Study with the Masked Booby

    PubMed Central

    Sommerfeld, Julia; Kato, Akiko; Ropert-Coudert, Yan; Garthe, Stefan; Hindell, Mark A.

    2013-01-01

    Identification of Area-restricted search (ARS) behaviour is used to better understand foraging movements and strategies of marine predators. Track-based descriptive analyses are commonly used to detect ARS behaviour, but they may be biased by factors such as foraging trip duration or non-foraging behaviours (i.e. resting on the water). Using first-passage time analysis we tested if (I) daylight resting at the sea surface positions falsely increase the detection of ARS behaviour and (II) short foraging trips are less likely to include ARS behaviour in Masked Boobies Sula dactylatra. We further analysed whether ARS behaviour may be used as a proxy to identify important feeding areas. Depth-acceleration and GPS-loggers were simultaneously deployed on chick-rearing adults to obtain (1) location data every 4 minutes and (2) detailed foraging activity such as diving rates, time spent sitting on the water surface and in flight. In 82% of 50 foraging trips, birds adopted ARS behaviour. In 19.3% of 57 detected ARS zones, birds spent more than 70% of total ARS duration resting on the water, suggesting that these ARS zones were falsely detected. Based on generalized linear mixed models, the probability of detecting false ARS zones was 80%. False ARS zones mostly occurred during short trips in close proximity to the colony, with low or no diving activity. This demonstrates the need to account for resting on the water surface positions in marine animals when determining ARS behaviour based on foraging locations. Dive rates were positively correlated with trip duration and the probability of ARS behaviour increased with increasing number of dives, suggesting that the adoption of ARS behaviour in Masked Boobies is linked to enhanced foraging activity. We conclude that ARS behaviour may be used as a proxy to identify important feeding areas in this species. PMID:23717471

  12. cn.MOPS: mixture of Poissons for discovering copy number variations in next-generation sequencing data with a low false discovery rate

    PubMed Central

    Klambauer, Günter; Schwarzbauer, Karin; Mayr, Andreas; Clevert, Djork-Arné; Mitterecker, Andreas; Bodenhofer, Ulrich; Hochreiter, Sepp

    2012-01-01

    Quantitative analyses of next-generation sequencing (NGS) data, such as the detection of copy number variations (CNVs), remain challenging. Current methods detect CNVs as changes in the depth of coverage along chromosomes. Technological or genomic variations in the depth of coverage thus lead to a high false discovery rate (FDR), even upon correction for GC content. In the context of association studies between CNVs and disease, a high FDR means many false CNVs, thereby decreasing the discovery power of the study after correction for multiple testing. We propose ‘Copy Number estimation by a Mixture Of PoissonS’ (cn.MOPS), a data processing pipeline for CNV detection in NGS data. In contrast to previous approaches, cn.MOPS incorporates modeling of depths of coverage across samples at each genomic position. Therefore, cn.MOPS is not affected by read count variations along chromosomes. Using a Bayesian approach, cn.MOPS decomposes variations in the depth of coverage across samples into integer copy numbers and noise by means of its mixture components and Poisson distributions, respectively. The noise estimate allows for reducing the FDR by filtering out detections having high noise that are likely to be false detections. We compared cn.MOPS with the five most popular methods for CNV detection in NGS data using four benchmark datasets: (i) simulated data, (ii) NGS data from a male HapMap individual with implanted CNVs from the X chromosome, (iii) data from HapMap individuals with known CNVs, (iv) high coverage data from the 1000 Genomes Project. cn.MOPS outperformed its five competitors in terms of precision (1–FDR) and recall for both gains and losses in all benchmark data sets. The software cn.MOPS is publicly available as an R package at http://www.bioinf.jku.at/software/cnmops/ and at Bioconductor. PMID:22302147
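
    A deliberately simplified sketch of the underlying modeling idea (a posterior over integer copy numbers given Poisson-distributed read counts across samples at one genomic position) is shown below; the counts, prior, and baseline estimate are invented, and this is not the cn.MOPS implementation:

    ```python
    # Toy version of the idea: at one genomic position, model each sample's read count
    # as Poisson(lambda * c / 2) for an integer copy number c and compute a per-sample
    # posterior over c. Invented counts; not the cn.MOPS algorithm.
    import numpy as np
    from scipy.stats import poisson

    counts = np.array([98, 101, 52, 148, 97])     # hypothetical read counts, one per sample
    copy_numbers = np.arange(0, 5)                # candidate integer copy numbers 0..4
    lam = np.median(counts)                       # crude estimate of the copy-number-2 coverage

    # Expected rate per copy number (a small epsilon rate stands in for CN = 0).
    rates = np.where(copy_numbers == 0, 0.05 * lam, lam * copy_numbers / 2.0)
    log_lik = poisson.logpmf(counts[:, None], rates[None, :])

    post = np.exp(log_lik - log_lik.max(axis=1, keepdims=True))
    post /= post.sum(axis=1, keepdims=True)       # uniform prior over copy numbers

    for i, row in enumerate(post):
        print(f"sample {i}: most likely copy number {copy_numbers[row.argmax()]}")
    ```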

  13. cn.MOPS: mixture of Poissons for discovering copy number variations in next-generation sequencing data with a low false discovery rate.

    PubMed

    Klambauer, Günter; Schwarzbauer, Karin; Mayr, Andreas; Clevert, Djork-Arné; Mitterecker, Andreas; Bodenhofer, Ulrich; Hochreiter, Sepp

    2012-05-01

    Quantitative analyses of next-generation sequencing (NGS) data, such as the detection of copy number variations (CNVs), remain challenging. Current methods detect CNVs as changes in the depth of coverage along chromosomes. Technological or genomic variations in the depth of coverage thus lead to a high false discovery rate (FDR), even upon correction for GC content. In the context of association studies between CNVs and disease, a high FDR means many false CNVs, thereby decreasing the discovery power of the study after correction for multiple testing. We propose 'Copy Number estimation by a Mixture Of PoissonS' (cn.MOPS), a data processing pipeline for CNV detection in NGS data. In contrast to previous approaches, cn.MOPS incorporates modeling of depths of coverage across samples at each genomic position. Therefore, cn.MOPS is not affected by read count variations along chromosomes. Using a Bayesian approach, cn.MOPS decomposes variations in the depth of coverage across samples into integer copy numbers and noise by means of its mixture components and Poisson distributions, respectively. The noise estimate allows for reducing the FDR by filtering out detections having high noise that are likely to be false detections. We compared cn.MOPS with the five most popular methods for CNV detection in NGS data using four benchmark datasets: (i) simulated data, (ii) NGS data from a male HapMap individual with implanted CNVs from the X chromosome, (iii) data from HapMap individuals with known CNVs, (iv) high coverage data from the 1000 Genomes Project. cn.MOPS outperformed its five competitors in terms of precision (1-FDR) and recall for both gains and losses in all benchmark data sets. The software cn.MOPS is publicly available as an R package at http://www.bioinf.jku.at/software/cnmops/ and at Bioconductor.

  14. Foraging parameters influencing the detection and interpretation of area-restricted search behaviour in marine predators: a case study with the masked booby.

    PubMed

    Sommerfeld, Julia; Kato, Akiko; Ropert-Coudert, Yan; Garthe, Stefan; Hindell, Mark A

    2013-01-01

    Identification of Area-restricted search (ARS) behaviour is used to better understand foraging movements and strategies of marine predators. Track-based descriptive analyses are commonly used to detect ARS behaviour, but they may be biased by factors such as foraging trip duration or non-foraging behaviours (i.e. resting on the water). Using first-passage time analysis we tested if (I) daylight resting at the sea surface positions falsely increase the detection of ARS behaviour and (II) short foraging trips are less likely to include ARS behaviour in Masked Boobies Sula dactylatra. We further analysed whether ARS behaviour may be used as a proxy to identify important feeding areas. Depth-acceleration and GPS-loggers were simultaneously deployed on chick-rearing adults to obtain (1) location data every 4 minutes and (2) detailed foraging activity such as diving rates, time spent sitting on the water surface and in flight. In 82% of 50 foraging trips, birds adopted ARS behaviour. In 19.3% of 57 detected ARS zones, birds spent more than 70% of total ARS duration resting on the water, suggesting that these ARS zones were falsely detected. Based on generalized linear mixed models, the probability of detecting false ARS zones was 80%. False ARS zones mostly occurred during short trips in close proximity to the colony, with low or no diving activity. This demonstrates the need to account for resting on the water surface positions in marine animals when determining ARS behaviour based on foraging locations. Dive rates were positively correlated with trip duration and the probability of ARS behaviour increased with increasing number of dives, suggesting that the adoption of ARS behaviour in Masked Boobies is linked to enhanced foraging activity. We conclude that ARS behaviour may be used as a proxy to identify important feeding areas in this species.

  15. False-positive rate determination of protein target discovery using a covalent modification- and mass spectrometry-based proteomics platform.

    PubMed

    Strickland, Erin C; Geer, M Ariel; Hong, Jiyong; Fitzgerald, Michael C

    2014-01-01

    Detection and quantitation of protein-ligand binding interactions is important in many areas of biological research. Stability of proteins from rates of oxidation (SPROX) is an energetics-based technique for identifying the proteins targets of ligands in complex biological mixtures. Knowing the false-positive rate of protein target discovery in proteome-wide SPROX experiments is important for the correct interpretation of results. Reported here are the results of a control SPROX experiment in which chemical denaturation data is obtained on the proteins in two samples that originated from the same yeast lysate, as would be done in a typical SPROX experiment except that one sample would be spiked with the test ligand. False-positive rates of 1.2-2.2% and <0.8% are calculated for SPROX experiments using Q-TOF and Orbitrap mass spectrometer systems, respectively. Our results indicate that the false-positive rate is largely determined by random errors associated with the mass spectral analysis of the isobaric mass tag (e.g., iTRAQ®) reporter ions used for peptide quantitation. Our results also suggest that technical replicates can be used to effectively eliminate such false positives that result from this random error, as is demonstrated in a SPROX experiment to identify yeast protein targets of the drug, manassantin A. The impact of ion purity in the tandem mass spectral analyses and of background oxidation on the false-positive rate of protein target discovery using SPROX is also discussed.

  16. Change-based threat detection in urban environments with a forward-looking camera

    NASA Astrophysics Data System (ADS)

    Morton, Kenneth, Jr.; Ratto, Christopher; Malof, Jordan; Gunter, Michael; Collins, Leslie; Torrione, Peter

    2012-06-01

    Roadside explosive threats continue to pose a significant risk to soldiers and civilians in conflict areas around the world. These objects are easy to manufacture and procure, but due to their ad hoc nature, they are difficult to reliably detect using standard sensing technologies. Although large roadside explosive hazards may be difficult to conceal in rural environments, urban settings provide a much more complicated background where seemingly innocuous objects (e.g., piles of trash, roadside debris) may be used to obscure threats. Since direct detection of all innocuous objects would flag too many objects to be of use, techniques must be employed to reduce the number of alarms generated and highlight only a limited subset of possibly threatening regions for the user. In this work, change detection techniques are used to reduce false alarm rates and increase detection capabilities for possible threat identification in urban environments. The proposed model leverages data from multiple video streams collected over the same regions by first applying video aligning and then using various distance metrics to detect changes based on image keypoints in the video streams. Data collected at an urban warfare simulation range at an Eastern US test site was used to evaluate the proposed approach, and significant reductions in false alarm rates compared to simpler techniques are illustrated.

  17. ERASE-Seq: Leveraging replicate measurements to enhance ultralow frequency variant detection in NGS data

    PubMed Central

    Kamps-Hughes, Nick; McUsic, Andrew; Kurihara, Laurie; Harkins, Timothy T.; Pal, Prithwish; Ray, Claire

    2018-01-01

    The accurate detection of ultralow allele frequency variants in DNA samples is of interest in both research and medical settings, particularly in liquid biopsies where cancer mutational status is monitored from circulating DNA. Next-generation sequencing (NGS) technologies employing molecular barcoding have shown promise but significant sensitivity and specificity improvements are still needed to detect mutations in a majority of patients before the metastatic stage. To address this we present analytical validation data for ERASE-Seq (Elimination of Recurrent Artifacts and Stochastic Errors), a method for accurate and sensitive detection of ultralow frequency DNA variants in NGS data. ERASE-Seq differs from previous methods by creating a robust statistical framework to utilize technical replicates in conjunction with background error modeling, providing a 10 to 100-fold reduction in false positive rates compared to published molecular barcoding methods. ERASE-Seq was tested using spiked human DNA mixtures with clinically realistic DNA input quantities to detect SNVs and indels between 0.05% and 1% allele frequency, the range commonly found in liquid biopsy samples. Variants were detected with greater than 90% sensitivity and a false positive rate below 0.1 calls per 10,000 possible variants. The approach represents a significant performance improvement compared to molecular barcoding methods and does not require changing molecular reagents. PMID:29630678
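
    The core statistical idea, accepting a variant only if its read support is improbable under a background error model independently in each technical replicate, can be sketched as follows; the error rate, depths, and threshold are assumptions, not the ERASE-Seq code:

    ```python
    # Sketch: call a low-frequency variant only if its read support is unlikely under
    # a per-site background error model in *both* technical replicates.
    from scipy.stats import binom

    def replicate_call(alt_reads, depth, bg_error_rate, alpha=1e-3):
        """p-value per replicate = P(X >= alt_reads | Binomial(depth, bg_error_rate))."""
        pvals = [binom.sf(a - 1, d, bg_error_rate) for a, d in zip(alt_reads, depth)]
        return all(p < alpha for p in pvals), pvals

    # Hypothetical site: ~0.2% allele frequency, background error 0.02%, two replicates.
    called, pvals = replicate_call(alt_reads=[9, 11], depth=[5000, 5200], bg_error_rate=2e-4)
    print(called, [f"{p:.2e}" for p in pvals])

    # A stochastic error appearing in only one replicate is rejected.
    called2, _ = replicate_call(alt_reads=[9, 1], depth=[5000, 5200], bg_error_rate=2e-4)
    print(called2)
    ```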

  18. Using the morphology of photoplethysmogram peaks to detect changes in posture.

    PubMed

    Linder, Stephen P; Wendelken, Suzanne M; Wei, Edward; McGrath, Susan P

    2006-06-01

    The morphology of the pulsatile component of the photoplethysmogram (PPG) has been shown to vary with physiology, but changes in the morphology caused by the baroreflex response to orthostatic stress have not been investigated. Using two FDA-approved Nonin pulse oximeters placed on the finger and ear, we monitored 11 subjects, for three trials each, as they stood from a supine position. Each cardiac cycle was automatically extracted from the PPG waveform and characterized using statistics corresponding to normalized peak width, instantaneous heart rate, and amplitude of the pulsatile component of the ear PPG. A nonparametric Wilcoxon rank sum test was then used to detect, in real time, changes in these features with p < 0.01. In all 33 trials, the standing event was detected as an abrupt change in at least two of these features, with only one false alarm. In 26 trials, an abrupt change was detected in all three features, with no false alarms. An increase in the normalized peak width was detected before an increase in heart rate, and in 21 trials a peak in this feature was detected before or as standing commenced. During standing, the pulse rate always increases, and the amplitude of the ear PPG then decreases by a factor of two or more. We hypothesize that the baroreflex first reduces the percentage of time that blood flow is stagnant during the cardiac cycle, then increases the heart rate, and finally vasoconstricts the peripheral tissue in order to reestablish a nominal blood pressure. These three features can therefore be used as a detector of the baroreflex response to changes in posture or other forms of blood volume sequestration.
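
    A minimal sketch of such a rank-sum change test on a streaming beat-by-beat feature (a synthetic "normalized peak width" series with an abrupt shift) is given below; the window lengths, signal, and threshold are assumptions for illustration, not the authors' processing chain:

    ```python
    # Sketch: flag an abrupt change in a beat-by-beat PPG feature by comparing a
    # fixed baseline window with a sliding test window using a Wilcoxon rank-sum test.
    import numpy as np
    from scipy.stats import ranksums

    rng = np.random.default_rng(2)
    # Synthetic "normalized peak width" per cardiac cycle: supine, then standing.
    feature = np.r_[rng.normal(0.30, 0.02, 120), rng.normal(0.38, 0.02, 60)]

    win = 20
    baseline = feature[:win]
    for beat in range(win, len(feature) - win):
        _, p = ranksums(baseline, feature[beat:beat + win])
        if p < 0.01:  # illustrative alpha; a real detector would guard against chance triggers
            print(f"change flagged at beat {beat} (p = {p:.2e})")
            break
    ```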

  19. DECIPHER, a Search-Based Approach to Chimera Identification for 16S rRNA Sequences

    PubMed Central

    Wright, Erik S.; Yilmaz, L. Safak

    2012-01-01

    DECIPHER is a new method for finding 16S rRNA chimeric sequences by the use of a search-based approach. The method is based upon detecting short fragments that are uncommon in the phylogenetic group where a query sequence is classified but frequently found in another phylogenetic group. The algorithm was calibrated for full sequences (fs_DECIPHER) and short sequences (ss_DECIPHER) and benchmarked against WigeoN (Pintail), ChimeraSlayer, and Uchime using artificially generated chimeras. Overall, ss_DECIPHER and Uchime provided the highest chimera detection for sequences 100 to 600 nucleotides long (79% and 81%, respectively), but Uchime's performance deteriorated for longer sequences, while ss_DECIPHER maintained a high detection rate (89%). Both methods had low false-positive rates (1.3% and 1.6%). The more conservative fs_DECIPHER, benchmarked only for sequences longer than 600 nucleotides, had an overall detection rate lower than that of ss_DECIPHER (75%) but higher than those of the other programs. In addition, fs_DECIPHER had the lowest false-positive rate among all the benchmarked programs (<0.20%). DECIPHER was outperformed only by ChimeraSlayer and Uchime when chimeras were formed from closely related parents (less than 10% divergence). Given the differences in the programs, it was possible to detect over 89% of all chimeras with just the combination of ss_DECIPHER and Uchime. Using fs_DECIPHER, we detected between 1% and 2% additional chimeras in the RDP, SILVA, and Greengenes databases from which chimeras had already been removed with Pintail or Bellerophon. DECIPHER was implemented in the R programming language and is directly accessible through a webpage or by downloading the program as an R package (http://DECIPHER.cee.wisc.edu). PMID:22101057

  20. Target attribute-based false alarm rejection in small infrared target detection

    NASA Astrophysics Data System (ADS)

    Kim, Sungho

    2012-11-01

    Infrared search and track is an important research area in military applications. Although there is a large body of work on small infrared target detection methods, these methods are difficult to apply in the field because of the high false alarm rates caused by clutter. This paper presents a novel target attribute extraction and machine learning-based target discrimination method. Eight kinds of target features are extracted and analyzed statistically. Learning-based classifiers such as SVM and Adaboost are developed and compared with conventional classifiers on real infrared images. In addition, the generalization capability is inspected for various infrared clutter types.
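
    A hypothetical sketch of the learning-based discrimination step, comparing an SVM and AdaBoost on a small synthetic feature vector per detection, is shown below; the features and data are invented, not the eight attributes used in the paper:

    ```python
    # Sketch: compare an SVM and AdaBoost for discriminating targets from clutter
    # using a small feature vector per detection (synthetic features, not the paper's).
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    n = 400
    clutter = rng.normal([1.0, 0.4, 2.0], 0.4, size=(n, 3))   # e.g. contrast, size, smoothness
    targets = rng.normal([1.8, 0.7, 1.2], 0.4, size=(n, 3))
    X = np.vstack([clutter, targets])
    y = np.r_[np.zeros(n), np.ones(n)]

    for name, clf in [("SVM", SVC(kernel="rbf", gamma="scale")),
                      ("AdaBoost", AdaBoostClassifier(n_estimators=100))]:
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name}: mean cross-validated accuracy {acc:.3f}")
    ```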

  1. RATIR Follow-up of LIGO/Virgo Gravitational Wave Events

    NASA Astrophysics Data System (ADS)

    Golkhou, V. Zach; Butler, Nathaniel R.; Strausbaugh, Robert; Troja, Eleonora; Kutyrev, Alexander; Lee, William H.; Román-Zúñiga, Carlos G.; Watson, Alan M.

    2018-04-01

    We have recently witnessed the first multi-messenger detection of colliding neutron stars through gravitational waves (GWs) and electromagnetic (EM) waves (GW 170817) thanks to the joint efforts of LIGO/Virgo and Space/Ground-based telescopes. In this paper, we report on the RATIR follow-up observation strategies and show the results for the trigger G194575. This trigger is not of astrophysical interest; however, it is of great interest to the robust design of a follow-up engine to explore large sky-error regions. We discuss the development of an image-subtraction pipeline for the six-color, optical/NIR imaging camera RATIR. Considering a two-band (i and r) campaign in the fall of 2015, we find that the requirement of simultaneous detection in both bands leads to a factor ∼10 reduction in false alarm rate, which can be further reduced using additional bands. We also show that the performance of our proposed algorithm is robust to fluctuating observing conditions, maintaining a low false alarm rate with a modest decrease in system efficiency that can be overcome utilizing repeat visits. Expanding our pipeline to search for either optical or NIR detections (three or more bands), considering separately the optical riZ and NIR YJH bands, should result in a false alarm rate ≈1% and an efficiency ≈90%. RATIR’s simultaneous optical/NIR observations are expected to yield about one candidate transient in the vast 100 deg2 LIGO error region for prioritized follow-up with larger aperture telescopes.
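
    The quoted reduction in false alarm rate from requiring a simultaneous two-band detection can be illustrated with a toy Monte Carlo under the assumption of independent per-band false candidates; the per-band probability below is invented and unrelated to the RATIR pipeline:

    ```python
    # Toy Monte Carlo: if spurious single-band candidates occur independently with
    # probability p per searched sky cell, requiring coincidence in two bands
    # suppresses the false alarm rate to roughly p**2.
    import numpy as np

    rng = np.random.default_rng(4)
    p_single = 0.10                      # assumed per-band false-candidate probability
    cells = 1_000_000                    # number of sky cells searched

    band_i = rng.random(cells) < p_single
    band_r = rng.random(cells) < p_single

    far_single = band_i.mean()
    far_coincident = (band_i & band_r).mean()
    print(f"single band: {far_single:.3f}, two-band coincidence: {far_coincident:.4f} "
          f"(reduction ~x{far_single / far_coincident:.0f})")
    ```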

  2. Automatic discrimination of fine roots in minirhizotron images.

    PubMed

    Zeng, Guang; Birchfield, Stanley T; Wells, Christina E

    2008-01-01

    Minirhizotrons provide detailed information on the production, life history and mortality of fine roots. However, manual processing of minirhizotron images is time-consuming, limiting the number and size of experiments that can reasonably be analysed. Previously, an algorithm was developed to automatically detect and measure individual roots in minirhizotron images. Here, species-specific root classifiers were developed to discriminate detected roots from bright background artifacts. Classifiers were developed from training images of peach (Prunus persica), freeman maple (Acer x freemanii) and sweetbay magnolia (Magnolia virginiana) using the Adaboost algorithm. True- and false-positive rates for classifiers were estimated using receiver operating characteristic curves. Classifiers gave true positive rates of 89-94% and false positive rates of 3-7% when applied to nontraining images of the species for which they were developed. The application of a classifier trained on one species to images from another species resulted in little or no reduction in accuracy. These results suggest that a single root classifier can be used to distinguish roots from background objects across multiple minirhizotron experiments. By incorporating root detection and discrimination algorithms into an open-source minirhizotron image analysis application, many analysis tasks that are currently performed by hand can be automated.

  3. Explosive hazard detection using MIMO forward-looking ground penetrating radar

    NASA Astrophysics Data System (ADS)

    Shaw, Darren; Ho, K. C.; Stone, Kevin; Keller, James M.; Popescu, Mihail; Anderson, Derek T.; Luke, Robert H.; Burns, Brian

    2015-05-01

    This paper proposes a machine learning algorithm for subsurface object detection with multiple-input-multiple-output (MIMO) forward-looking ground-penetrating radar (FLGPR). By detecting hazards using FLGPR, standoff distances of up to tens of meters can be achieved, but at the cost of degraded performance due to high false alarm rates. The proposed system utilizes an anomaly detection prescreener to identify potential object locations. At each alarm location, multiple one-dimensional (1D) spectral features, two-dimensional (2D) spectral features, and log-Gabor statistic features are extracted. The ability of these features to reduce the number of false alarms and increase the probability of detection is evaluated for both co-polarizations present in the Akela MIMO array. Classification is performed by a Support Vector Machine (SVM) with lane-based cross-validation for training and testing. Class imbalance and optimized SVM kernel parameters are considered during classifier training.
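
    A minimal sketch of lane-held-out cross-validation for an imbalanced SVM classifier (using scikit-learn's GroupKFold, with synthetic features, labels, and lane assignments) is given below; it is not the paper's feature extraction or tuning procedure:

    ```python
    # Sketch: lane-held-out cross-validation for an SVM with class weighting to
    # handle the target/false-alarm imbalance. Features, labels, and lanes are synthetic.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import GroupKFold, cross_val_score

    rng = np.random.default_rng(5)
    n = 600
    X = rng.normal(size=(n, 8))                  # stand-in for spectral/log-Gabor features
    y = (rng.random(n) < 0.15).astype(int)       # ~15% true targets (imbalanced)
    X[y == 1] += 0.8                             # give targets some separation
    lanes = rng.integers(0, 6, size=n)           # which test lane each alarm came from

    clf = SVC(kernel="rbf", class_weight="balanced")
    scores = cross_val_score(clf, X, y, groups=lanes, cv=GroupKFold(n_splits=6),
                             scoring="roc_auc")
    print("lane-held-out ROC AUC per fold:", scores.round(3))
    ```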

  4. Automated detection of retinal nerve fiber layer defects on fundus images: false positive reduction based on vessel likelihood

    NASA Astrophysics Data System (ADS)

    Muramatsu, Chisako; Ishida, Kyoko; Sawada, Akira; Hatanaka, Yuji; Yamamoto, Tetsuya; Fujita, Hiroshi

    2016-03-01

    Early detection of glaucoma is important for slowing or halting progression of the disease and preventing total blindness. We have previously proposed an automated scheme for detection of retinal nerve fiber layer defect (NFLD), which is one of the early signs of glaucoma observed on retinal fundus images. In this study, a new multi-step detection scheme was included to improve detection of subtle and narrow NFLDs. In addition, new features were added to distinguish between NFLDs and blood vessels, which are frequent sites of false positives (FPs). The result was evaluated on a new test dataset consisting of 261 cases, including 130 cases with NFLDs. Using the proposed method, the initial detection rate was improved from 82% to 98%. At a sensitivity of 80%, the number of FPs per image was reduced from 4.25 to 1.36. The results indicate the potential usefulness of the proposed method for early detection of glaucoma.

  5. The ship edge feature detection based on high and low threshold for remote sensing image

    NASA Astrophysics Data System (ADS)

    Li, Xuan; Li, Shengyang

    2018-05-01

    In this paper, a method based on high and low thresholds is proposed to detect ship edge features, addressing the low accuracy caused by noise. We analyze the relationship between the human vision system and the target features, and determine the ship target by detecting its edge features. First, a second-order differential method is used to enhance image quality. Second, to improve the edge operator, high and low threshold contrast is introduced to separate edge and non-edge points; treating edges as the foreground and non-edges as the background, image segmentation is applied to achieve edge detection and remove false edges. Finally, the edge features are described based on the edge detection result, and the ship target is determined. The experimental results show that the proposed method effectively reduces the number of false edges and achieves high accuracy in remote sensing ship edge detection.
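
    The generic high/low (hysteresis) thresholding step on which such edge detectors build can be illustrated with OpenCV's Canny operator on a synthetic image; this is only a sketch of the thresholding idea, not the authors' improved operator:

    ```python
    # Sketch: high/low-threshold (hysteresis) edge detection on a synthetic image.
    # Pixels above the high threshold are strong edges; pixels between the thresholds
    # are kept only if connected to a strong edge, which suppresses many false edges.
    import numpy as np
    import cv2

    img = np.zeros((200, 300), dtype=np.uint8)
    cv2.rectangle(img, (80, 60), (220, 140), 180, -1)   # bright rectangular "ship"
    noise = np.random.default_rng(6).normal(0, 15, img.shape)
    noisy = np.clip(img.astype(float) + noise, 0, 255).astype(np.uint8)

    smoothed = cv2.GaussianBlur(noisy, (5, 5), 1.5)
    edges_loose = cv2.Canny(smoothed, 20, 40)     # low thresholds: many false edges
    edges_tight = cv2.Canny(smoothed, 60, 150)    # tuned high/low pair: fewer false edges
    print("edge pixels (loose thresholds):", int(np.count_nonzero(edges_loose)))
    print("edge pixels (high/low tuned):  ", int(np.count_nonzero(edges_tight)))
    ```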

  6. [Feasibility analysis of sentinel lymph node biopsy in patients with breast cancer after local lumpectomy].

    PubMed

    Wang, J; Wang, X; Wang, W Y; Liu, J Q; Xing, Z Y; Wang, X

    2016-07-01

    To explore the feasibility, safety and clinical application value of sentinel lymph node biopsy (SLNB) in patients with breast cancer after local lumpectomy. Clinical data of 195 patients who previously received local lumpectomy from January 2005 to April 2015 were retrospectively analyzed. All patients with pathologic stage T1-2N0M0 breast cancer underwent SLNB. Methylene blue, carbon nanoparticle suspension, technetium-99m-labeled dextran, or a combination of these was used in the SLNB. The interval from lumpectomy to SLNB was 1-91 days (mean, 18.3 days) and the maximum diameter of tumors before the first operation was 0.2-4.5 cm (mean, 1.8 cm). The sentinel lymph node was successfully found in all cases, giving a detection rate of 100%. Forty-two patients received axillary lymph node dissection (ALND) and 19 patients had pathologically positive sentinel lymph nodes, with an accuracy of 97.6%, a sensitivity of 95.0%, a false negative rate of 5.0%, a specificity of 100%, and a false positive rate of 0. Logistic regression analysis suggested that patient age was significantly associated with sentinel lymph node metastasis after local lumpectomy. For early breast cancer and after breast tumor biopsy, the influence of local lumpectomy on the detection rate and accuracy of sentinel lymph node biopsy is not significant. Sentinel lymph node biopsy with an appropriately chosen tracing technique may still provide a high detection rate and accuracy.

  7. Position and volume estimation of atmospheric nuclear detonations from video reconstruction

    NASA Astrophysics Data System (ADS)

    Schmitt, Daniel T.

    Recent work in digitizing films of foundational atmospheric nuclear detonations from the 1950s provides an opportunity to perform deeper analysis on these historical tests. This work leverages multi-view geometry and computer vision techniques to provide an automated means to perform three-dimensional analysis of the blasts for several points in time. The accomplishment of this requires careful alignment of the films in time, detection of features in the images, matching of features, and multi-view reconstruction. Sub-explosion features can be detected with a 67% hit rate and 22% false alarm rate. Hotspot features can be detected with a 71.95% hit rate, 86.03% precision and a 0.015% false positive rate. Detected hotspots are matched across 57-109 degree viewpoints with 76.63% average correct matching by defining their location relative to the center of the explosion, rotating them to the alternative viewpoint, and matching them collectively. When 3D reconstruction is applied to the hotspot matching it completes an automated process that has been used to create 168 3D point clouds with 31.6 points per reconstruction with each point having an accuracy of 0.62 meters with 0.35, 0.24, and 0.34 meters of accuracy in the x-, y- and z-direction respectively. As a demonstration of using the point clouds for analysis, volumes are estimated and shown to be consistent with radius-based models and in some cases improve on the level of uncertainty in the yield calculation.

  8. Genotypic and Phenotypic Characterization of Escherichia coli Isolates from Feces, Hands, and Soils in Rural Bangladesh via the Colilert Quanti-Tray System

    PubMed Central

    Islam, M. Aminul; Pickering, Amy J.; Roy, Subarna; Fuhrmeister, Erica R.; Ercumen, Ayse; Harris, Angela; Bishai, Jason; Schwab, Kellogg J.

    2014-01-01

    The increased awareness of the role of environmental matrices in enteric disease transmission has resulted in the need for rapid, field-based methods for fecal indicator bacteria and pathogen detection. Evidence of the specificity of β-glucuronidase-based assays for detection of Escherichia coli from environmental matrices relevant to enteric pathogen transmission in developing countries, such as hands, soils, and surfaces, is limited. In this study, we quantify the false-positive rate of a β-glucuronidase-based E. coli detection assay (Colilert) for two environmental reservoirs in Bangladeshi households (hands and soils) and three fecal composite sources (cattle, chicken, and humans). We investigate whether or not the isolation source of E. coli influences phenotypic and genotypic characteristics. Phenotypic characteristics include results of biochemical assays provided by the API-20E test; genotypic characteristics include the Clermont phylogroup and the presence of enteric and/or environmental indicator genes sfmH, rfaI, and fucK. Our findings demonstrate no statistically significant difference in the false-positive rate of Colilert for environmental compared to enteric samples. E. coli isolates from all source types are genetically diverse, representing six of the seven phylogroups, and there is no difference in relative frequency of phylogroups between enteric and environmental samples. We conclude that Colilert, and likely other β-glucuronidase-based assays, is appropriate for detection of E. coli on hands and in soils with low false-positive rates. Furthermore, E. coli isolated from hands and soils in Bangladeshi households are diverse and indistinguishable from cattle, chicken, and human fecal isolates, using traditional biochemical assays and phylogrouping. PMID:25548044

  9. Genotypic and phenotypic characterization of Escherichia coli isolates from feces, hands, and soils in rural Bangladesh via the Colilert Quanti-Tray System.

    PubMed

    Julian, Timothy R; Islam, M Aminul; Pickering, Amy J; Roy, Subarna; Fuhrmeister, Erica R; Ercumen, Ayse; Harris, Angela; Bishai, Jason; Schwab, Kellogg J

    2015-03-01

    The increased awareness of the role of environmental matrices in enteric disease transmission has resulted in the need for rapid, field-based methods for fecal indicator bacteria and pathogen detection. Evidence of the specificity of β-glucuronidase-based assays for detection of Escherichia coli from environmental matrices relevant to enteric pathogen transmission in developing countries, such as hands, soils, and surfaces, is limited. In this study, we quantify the false-positive rate of a β-glucuronidase-based E. coli detection assay (Colilert) for two environmental reservoirs in Bangladeshi households (hands and soils) and three fecal composite sources (cattle, chicken, and humans). We investigate whether or not the isolation source of E. coli influences phenotypic and genotypic characteristics. Phenotypic characteristics include results of biochemical assays provided by the API-20E test; genotypic characteristics include the Clermont phylogroup and the presence of enteric and/or environmental indicator genes sfmH, rfaI, and fucK. Our findings demonstrate no statistically significant difference in the false-positive rate of Colilert for environmental compared to enteric samples. E. coli isolates from all source types are genetically diverse, representing six of the seven phylogroups, and there is no difference in relative frequency of phylogroups between enteric and environmental samples. We conclude that Colilert, and likely other β-glucuronidase-based assays, is appropriate for detection of E. coli on hands and in soils with low false-positive rates. Furthermore, E. coli isolated from hands and soils in Bangladeshi households are diverse and indistinguishable from cattle, chicken, and human fecal isolates, using traditional biochemical assays and phylogrouping. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  10. Levels of Office Blood Pressure and Their Operating Characteristics for Detecting Masked Hypertension Based on Ambulatory Blood Pressure Monitoring

    PubMed Central

    Lin, Feng-Chang; Tuttle, Laura A.; Shimbo, Daichi; Diaz, Keith M.; Olsson, Emily; Stankevitz, Kristin; Hinderliter, Alan L.

    2015-01-01

    BACKGROUND Masked hypertension (MH), defined as a nonelevated office blood pressure (BP) with an elevated out-of-office BP average, conveys cardiovascular risk similar to or approaching sustained hypertension, making its detection of potential clinical importance. However, it may not be feasible or cost-effective to perform ambulatory BP monitoring (ABPM) on all patients with a nonelevated office BP. There likely exists a level of office BP below which ABPM is not warranted because the probability of MH is low. METHODS We analyzed data from 294 adults aged ≥30 years not on BP-lowering medication with office BP <140/90 mm Hg, all of whom underwent 24-hour ABPM. We calculated sensitivity, false-positive rate, and likelihood ratios (LRs) for the range of office BP cutoffs from 110 to 138 mm Hg systolic and from 68 to 88 mm Hg diastolic for detecting MH. RESULTS The systolic BP cutoff with the highest +LR for detecting MH (1.8) was 120 mm Hg, and the diastolic cutoff with the highest +LR (2.4) was 82 mm Hg. However, the systolic level of 120 mm Hg had a false-positive rate of 42%, and the diastolic level of 82 mm Hg had a sensitivity of only 39%. CONCLUSIONS The cutoff of office BP with the best overall operating characteristics for diagnosing MH is approximately 120/82 mm Hg. However, this cutoff may have an unacceptably high false-positive rate. Clinical risk tools to identify patients with nonelevated office BP for whom ABPM should be considered will likely need to include factors in addition to office BP. PMID:24898379
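
    The cutoff-sweeping analysis can be reproduced in outline as follows; the office-BP values, MH labels, and prevalence are simulated, so the numbers will not match the study:

    ```python
    # Sketch: sweep a systolic office-BP cutoff and compute sensitivity, false-positive
    # rate, and the positive likelihood ratio for detecting masked hypertension.
    # The office-BP values and MH labels below are simulated, not the study data.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 300
    has_mh = rng.random(n) < 0.3                               # assumed MH prevalence
    office_sbp = np.where(has_mh, rng.normal(128, 7, n), rng.normal(120, 9, n))

    for cutoff in range(110, 140, 4):
        flagged = office_sbp >= cutoff
        sens = flagged[has_mh].mean()
        fpr = flagged[~has_mh].mean()
        lr_pos = sens / fpr if fpr > 0 else float("inf")
        print(f"cutoff {cutoff} mm Hg: sensitivity {sens:.2f}, FPR {fpr:.2f}, LR+ {lr_pos:.2f}")
    ```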

  11. Analysis of 27 antibiotic residues in raw cow's milk and milk-based products--validation of Delvotest® T.

    PubMed

    Bion, Cindy; Beck Henzelin, Andrea; Qu, Yajuan; Pizzocri, Giuseppe; Bolzoni, Giuseppe; Buffoli, Elena

    2016-01-01

    Delvotest® T was evaluated for its capability at detecting residues of 27 antibiotics in raw cow's milk and in some dairy ingredients (skimmed and full-cream milk powders). The kit was used as a screening tool for the qualitative determination of antibiotics from different families in a single test. Results delivered by such a method are expressed as 'positive' or 'negative', referring to the claimed screening target concentration (STC). Validation was conducted according to the European Community Reference Laboratories' (CRLs) residues guidelines of 20 January 2010 and performed by two laboratories, one located in Europe and the other in Asia. Five criteria were evaluated including detection capability at STC, false-positive (FP) rate, false-negative (FN) rate, robustness and cross-reactivity using visual reading and Delvoscan®. STCs were set at or below the corresponding maximum residue limit (MRL), as fixed by European Regulation EC No. 37/2010. Four antibiotics (nafcillin, oxytetracycline, tetracycline and rifaximin) out of 27 had a false-negative rate ranging from 1.7% to 4.9%; however, it was still compliant with the CRLs' requirements. Globally, Delvotest T can be recommended for the analysis of the surveyed antibiotics in raw cow's milk, skimmed and full-cream milk powders. Additional compounds were tested such as sulfamethazine, spiramycin and erythromycin; however, detection at the corresponding MRL was not achievable and these compounds were removed from the validation. Other drugs from the sulfonamide, aminoglycoside or macrolide families not detected by the test at the MRL were not evaluated in this study. Regarding the reliability of this rapid test to milk-based preparations, additional experiments should be performed on a larger range of compounds and samples to validate the Delvotest T in such matrices.

  12. Clinical outcome of subchromosomal events detected by whole‐genome noninvasive prenatal testing

    PubMed Central

    Helgeson, J.; Wardrop, J.; Boomer, T.; Almasri, E.; Paxton, W. B.; Saldivar, J. S.; Dharajiya, N.; Monroe, T. J.; Farkas, D. H.; Grosu, D. S.

    2015-01-01

    Abstract Objective A novel algorithm to identify fetal microdeletion events in maternal plasma has been developed and used in clinical laboratory‐based noninvasive prenatal testing. We used this approach to identify the subchromosomal events 5pdel, 22q11del, 15qdel, 1p36del, 4pdel, 11qdel, and 8qdel in routine testing. We describe the clinical outcomes of those samples identified with these subchromosomal events. Methods Blood samples from high‐risk pregnant women submitted for noninvasive prenatal testing were analyzed using low coverage whole genome massively parallel sequencing. Sequencing data were analyzed using a novel algorithm to detect trisomies and microdeletions. Results In testing 175 393 samples, 55 subchromosomal deletions were reported. The overall positive predictive value for each subchromosomal aberration ranged from 60% to 100% for cases with diagnostic and clinical follow‐up information. The total false positive rate was 0.0017% for confirmed false positives results; false negative rate and sensitivity were not conclusively determined. Conclusion Noninvasive testing can be expanded into the detection of subchromosomal copy number variations, while maintaining overall high test specificity. In the current setting, our results demonstrate high positive predictive values for testing of rare subchromosomal deletions. © 2015 The Authors. Prenatal Diagnosis published by John Wiley & Sons Ltd. PMID:26088833

  13. A reverse order interview does not aid deception detection regarding intentions

    PubMed Central

    Fenn, Elise; McGuire, Mollie; Langben, Sara; Blandón-Gitlin, Iris

    2015-01-01

    Promising recent research suggests that more cognitively demanding interviews improve deception detection accuracy. Would these cognitively demanding techniques work in the same way when discriminating between true and false future intentions? In Experiment 1 participants planned to complete a task, but instead were intercepted and interviewed about their intentions. Participants lied or told the truth, and were subjected to high (reverse order) or low (sequential order) cognitive load interviews. Third-party observers watched these interviews and indicated whether they thought the person was lying or telling the truth. Subjecting participants to a reverse compared to sequential interview increased the misidentification rate and the appearance of cognitive load in truth tellers. People lying about false intentions were not better identified. In Experiment 2, a second set of third-party observers rated behavioral cues. Consistent with Experiment 1, truth tellers, but not liars, exhibited more behaviors associated with lying and fewer behaviors associated with truth telling in the reverse than sequential interview. Together these results suggest that certain cognitively demanding interviews may be less useful when interviewing to detect false intentions. Explaining a true intention while under higher cognitive demand places truth tellers at risk of being misclassified. There may be such a thing as too much cognitive load induced by certain techniques PMID:26379610

  14. A reverse order interview does not aid deception detection regarding intentions.

    PubMed

    Fenn, Elise; McGuire, Mollie; Langben, Sara; Blandón-Gitlin, Iris

    2015-01-01

    Promising recent research suggests that more cognitively demanding interviews improve deception detection accuracy. Would these cognitively demanding techniques work in the same way when discriminating between true and false future intentions? In Experiment 1 participants planned to complete a task, but instead were intercepted and interviewed about their intentions. Participants lied or told the truth, and were subjected to high (reverse order) or low (sequential order) cognitive load interviews. Third-party observers watched these interviews and indicated whether they thought the person was lying or telling the truth. Subjecting participants to a reverse compared to sequential interview increased the misidentification rate and the appearance of cognitive load in truth tellers. People lying about false intentions were not better identified. In Experiment 2, a second set of third-party observers rated behavioral cues. Consistent with Experiment 1, truth tellers, but not liars, exhibited more behaviors associated with lying and fewer behaviors associated with truth telling in the reverse than sequential interview. Together these results suggest that certain cognitively demanding interviews may be less useful when interviewing to detect false intentions. Explaining a true intention while under higher cognitive demand places truth tellers at risk of being misclassified. There may be such a thing as too much cognitive load induced by certain techniques.

  15. "Hook-like effect" causes false-negative point-of-care urine pregnancy testing in emergency patients.

    PubMed

    Griffey, Richard T; Trent, Caleb J; Bavolek, Rebecca A; Keeperman, Jacob B; Sampson, Christopher; Poirier, Robert F

    2013-01-01

    Failure to detect pregnancy in the emergency department (ED) can have important consequences. Urine human chorionic gonadotropin (uhCG) point-of-care (POC) assays are valued for rapidly detecting early pregnancy with high sensitivity. However, under certain conditions, POC uhCG tests can fail to detect pregnancy. In investigating a series of late first-trimester false-negative pregnancy tests in our ED, a novel and distinct causative phenomenon was recently elucidated in our institution. We discuss uhCG POC tests, review our false-negative rate, and describe mechanisms for false negatives and potential remedies. The false-negative POC uhCG rate is very low, but in the setting of a large volume of tests, the numbers are worth consideration. In positive uhCG POC tests, free and fixed antibodies bind hCG to form a "sandwich"; hCG is present in several variant forms that change in their concentrations at different stages of pregnancy. When in excess, intact hCG can saturate the antibodies, preventing sandwich formation (hook effect phenomenon). Some assays may include an antibody that does not recognize certain variants present in later stages of pregnancy. When this variant is in excess, it can bind one antibody avidly and the other not at all, resulting in a false-negative test (hook-like phenomenon). In both situations, dilution is key to an accurate test. Manufacturers should consider that uhCG tests are routinely used at many stages of pregnancy. Characterizing uhCG variants recognized by their tests and eliminating lot-to-lot variability may help improve uhCG test performance. Clinicians need to be aware of and familiarize themselves with the limitations of the specific type of uhCG POC tests used in their practice, recognizing that under certain circumstances, false-negative tests can occur. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. A Metric for Reducing False Positives in the Computer-Aided Detection of Breast Cancer from Dynamic Contrast-Enhanced Magnetic Resonance Imaging Based Screening Examinations of High-Risk Women.

    PubMed

    Levman, Jacob E D; Gallego-Ortiz, Cristina; Warner, Ellen; Causer, Petrina; Martel, Anne L

    2016-02-01

    Magnetic resonance imaging (MRI)-enabled cancer screening has been shown to be a highly sensitive method for the early detection of breast cancer. Computer-aided detection systems have the potential to improve the screening process by helping standardize radiologists' performance at a high level of diagnostic accuracy. This retrospective study was approved by the institutional review board of Sunnybrook Health Sciences Centre. This study compares the performance of a proposed method for computer-aided detection (based on the second-order spatial derivative of the relative signal intensity) with the signal enhancement ratio (SER) on MRI-based breast screening examinations. Comparison is performed using receiver operating characteristic (ROC) curve analysis as well as free-response receiver operating characteristic (FROC) curve analysis. A modified computer-aided detection system combining the proposed approach with the SER method is also presented. The proposed method provides improvements in the rates of false positive markings over the SER method in the detection of breast cancer (as assessed by FROC analysis). The modified computer-aided detection system that incorporates both the proposed method and the SER method yields ROC results equal to those produced by SER while simultaneously providing improvements over the SER method in terms of false positives per noncancerous exam. The proposed method for identifying malignancies outperforms the SER method in terms of false positives on a challenging dataset containing many small lesions and may play a useful role in breast cancer screening by MRI as part of a computer-aided detection system.
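
    For context, the signal enhancement ratio referenced above is a simple voxel-wise kinetic measure computed from pre-contrast, early post-contrast, and delayed post-contrast signal intensities. The sketch below shows one common formulation in Python; the exact time points and any normalization used in the study are assumptions rather than details taken from the abstract.

```python
import numpy as np

def signal_enhancement_ratio(pre, early, late, eps=1e-6):
    """Voxel-wise signal enhancement ratio for DCE-MRI.

    SER = (S_early - S_pre) / (S_late - S_pre) is a common washout measure;
    the specific acquisition time points used in the study are assumed here.
    """
    pre, early, late = (np.asarray(a, dtype=float) for a in (pre, early, late))
    return (early - pre) / np.maximum(late - pre, eps)  # eps guards against division by ~0

# Toy 2x2 "images": higher SER indicates faster washout (more suspicious kinetics).
pre = np.array([[100.0, 100.0], [100.0, 100.0]])
early = np.array([[180.0, 140.0], [120.0, 200.0]])
late = np.array([[150.0, 160.0], [130.0, 150.0]])
print(signal_enhancement_ratio(pre, early, late))
```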

  17. Assessment of error rates in acoustic monitoring with the R package monitoR

    USGS Publications Warehouse

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    Detecting population-scale reactions to climate change and land-use change may require monitoring many sites for many years, a process that is suited for an automated system. We developed and tested monitoR, an R package for long-term, multi-taxa acoustic monitoring programs. We tested monitoR with two northeastern songbird species: black-throated green warbler (Setophaga virens) and ovenbird (Seiurus aurocapilla). We compared detection results from monitoR in 52 10-minute surveys recorded at 10 sites in Vermont and New York, USA to a subset of songs identified by a human that were of a single song type and had visually identifiable spectrograms (e.g. a signal-to-noise ratio of at least 10 dB: 166 out of 439 total songs for black-throated green warbler, 502 out of 990 total songs for ovenbird). monitoR’s automated detection process uses a ‘score cutoff’, which is the minimum match needed for an unknown event to be considered a detection; each event thereby results in a true positive, true negative, false positive or false negative detection. At the chosen score cutoffs, monitoR correctly identified presence for black-throated green warbler and ovenbird in 64% and 72% of the 52 surveys using binary point matching, respectively, and 73% and 72% of the 52 surveys using spectrogram cross-correlation, respectively. Of individual songs, 72% of black-throated green warbler songs and 62% of ovenbird songs were identified by binary point matching. Spectrogram cross-correlation identified 83% of black-throated green warbler songs and 66% of ovenbird songs. False positive rates were  for song event detection.
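
    As a rough illustration of the score-cutoff logic described above (not the monitoR implementation itself, which is an R package), the following Python sketch thresholds a trace of template-matching scores and tallies true and false positives and negatives against hypothetical ground-truth bins; all numbers are invented.

```python
import numpy as np

def detect_events(score_trace, score_cutoff):
    """Flag time bins whose template-matching score meets the score cutoff
    (the minimum match required to call a detection)."""
    return np.flatnonzero(np.asarray(score_trace) >= score_cutoff)

def confusion_counts(detected_bins, true_bins, n_bins):
    """True/false positives and negatives for binned detections."""
    det = np.zeros(n_bins, bool); det[list(detected_bins)] = True
    tru = np.zeros(n_bins, bool); tru[list(true_bins)] = True
    return {"tp": int((det & tru).sum()), "fp": int((det & ~tru).sum()),
            "fn": int((~det & tru).sum()), "tn": int((~det & ~tru).sum())}

# Toy correlation scores over 10 time bins, with songs truly present in bins 2 and 7.
scores = [0.1, 0.2, 0.8, 0.3, 0.1, 0.6, 0.2, 0.9, 0.1, 0.2]
hits = detect_events(scores, score_cutoff=0.7)
print(hits, confusion_counts(hits, true_bins=[2, 7], n_bins=10))
```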

  18. Knowledge-based tracking algorithm

    NASA Astrophysics Data System (ADS)

    Corbeil, Allan F.; Hawkins, Linda J.; Gilgallon, Paul F.

    1990-10-01

    This paper describes the Knowledge-Based Tracking (KBT) algorithm, for which a real-time flight test demonstration was recently conducted at Rome Air Development Center (RADC). In KBT processing, the radar signal in each resolution cell is thresholded at a lower than normal setting to detect low RCS targets. This lower threshold produces a larger than normal false alarm rate. Therefore, additional signal processing, including spectral filtering, CFAR and knowledge-based acceptance testing, is performed to eliminate some of the false alarms. TSC's knowledge-based Track-Before-Detect (TBD) algorithm is then applied to the data from each azimuth sector to detect target tracks. In this algorithm, tentative track templates are formed for each threshold crossing and knowledge-based association rules are applied to the range, Doppler, and azimuth measurements from successive scans. Lastly, an M-associations-out-of-N-scans rule is used to declare a detection. This scan-to-scan integration enhances the probability of target detection while maintaining an acceptably low output false alarm rate. For a real-time demonstration of the KBT algorithm, the L-band radar in the Surveillance Laboratory (SL) at RADC was used to illuminate a small Cessna 310 test aircraft. The received radar signal was digitized and processed by an ST-100 Array Processor and VAX computer network in the lab. The ST-100 performed all of the radar signal processing functions, including Moving Target Indicator (MTI) pulse cancelling, FFT Doppler filtering, and CFAR detection. The VAX computers performed the remaining range-Doppler clustering, beamsplitting and TBD processing functions. The KBT algorithm provided a 9.5 dB improvement relative to single scan performance with a nominal real-time delay of less than one second between illumination and display.
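
    The scan-to-scan integration step can be illustrated with binomial arithmetic: an M-out-of-N rule raises the cumulative probability of detection while sharply suppressing the cumulative false alarm probability. The per-scan probabilities below are illustrative assumptions, not values from the RADC demonstration.

```python
from math import comb

def m_of_n_probability(p, m, n):
    """Probability of at least m successes in n independent scans,
    each with per-scan probability p (binomial tail)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(m, n + 1))

# Illustrative numbers only: a low per-scan threshold gives Pd = 0.6 and
# Pfa = 1e-2 per cell; a 3-of-5 rule boosts Pd while crushing Pfa.
pd_scan, pfa_scan = 0.6, 1e-2
print("cumulative Pd :", m_of_n_probability(pd_scan, 3, 5))
print("cumulative Pfa:", m_of_n_probability(pfa_scan, 3, 5))
```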

  19. Adaptive error detection for HDR/PDR brachytherapy: Guidance for decision making during real-time in vivo point dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kertzscher, Gustavo, E-mail: guke@dtu.dk; Andersen, Claus E., E-mail: clan@dtu.dk; Tanderup, Kari, E-mail: karitand@rm.dk

    Purpose: This study presents an adaptive error detection algorithm (AEDA) for real-time in vivo point dosimetry during high dose rate (HDR) or pulsed dose rate (PDR) brachytherapy (BT) where the error identification, in contrast to existing approaches, does not depend on an a priori reconstruction of the dosimeter position. Instead, the treatment is judged based on dose rate comparisons between measurements and calculations of the most viable dosimeter position provided by the AEDA in a data driven approach. As a result, the AEDA compensates for false error cases related to systematic effects of the dosimeter position reconstruction. Given its nearly exclusive dependence on stable dosimeter positioning, the AEDA allows for a substantially simplified and time efficient real-time in vivo BT dosimetry implementation. Methods: In the event of a measured potential treatment error, the AEDA proposes the most viable dosimeter position out of alternatives to the original reconstruction by means of a data driven matching procedure between dose rate distributions. If measured dose rates do not differ significantly from the most viable alternative, the initial error indication may be attributed to a mispositioned or misreconstructed dosimeter (false error). However, if the error declaration persists, no viable dosimeter position can be found to explain the error, hence the discrepancy is more likely to originate from a misplaced or misreconstructed source applicator or from erroneously connected source guide tubes (true error). Results: The AEDA applied on two in vivo dosimetry implementations for pulsed dose rate BT demonstrated that the AEDA correctly described effects responsible for initial error indications. The AEDA was able to correctly identify the major part of all permutations of simulated guide tube swap errors and simulated shifts of individual needles from the original reconstruction. Unidentified errors corresponded to scenarios where the dosimeter position was sufficiently symmetric with respect to error and no-error source position constellations. The AEDA was able to correctly identify all false errors represented by mispositioned dosimeters, contrary to an error detection algorithm relying on the original reconstruction. Conclusions: The study demonstrates that the AEDA error identification during HDR/PDR BT relies on a stable dosimeter position rather than on an accurate dosimeter reconstruction, and the AEDA’s capacity to distinguish between true and false error scenarios. The study further shows that the AEDA can offer guidance in decision making in the event of potential errors detected with real-time in vivo point dosimetry.

  20. Visual Sensor Based Abnormal Event Detection with Moving Shadow Removal in Home Healthcare Applications

    PubMed Central

    Lee, Young-Sook; Chung, Wan-Young

    2012-01-01

    Vision-based abnormal event detection for home healthcare systems can be greatly improved using visual sensor-based techniques able to detect, track and recognize objects in the scene. However, in moving object detection and tracking processes, moving cast shadows can be misclassified as part of objects or as moving objects. Shadow removal is an essential step for developing video surveillance systems. The primary goal is to design novel computer vision techniques that can extract objects more accurately and discriminate between abnormal and normal activities. To improve the accuracy of object detection and tracking, our proposed shadow removal algorithm is employed. Abnormal event detection based on visual sensing, using shape feature variation and 3-D trajectory, is presented to overcome low fall detection rates. The experimental results showed that the success rate of detecting abnormal events was 97% with a false positive rate of 2%. Our proposed algorithm can distinguish diverse fall activities such as forward falls, backward falls, and falling sideways from normal activities. PMID:22368486

  1. Accuracy and utility of an epigenetic biomarker for smoking in populations with varying rates of false self-report.

    PubMed

    Andersen, Allan M; Philibert, Robert A; Gibbons, Fredrick X; Simons, Ronald L; Long, Jeffrey

    2017-09-01

    Better biomarkers to detect smoking are needed given the tremendous public health burden caused by smoking. Current biomarkers to detect smoking have significant limitations, notably a short half-life for detection and lack of sensitivity for light smokers. These limitations may be particularly problematic in populations with less accurate self-reporting. Prior epigenome-wide association studies indicate that methylation status at cg05575921, a CpG residue located in the aryl hydrocarbon receptor repressor (AHRR) gene, may be a robust indicator of smoking status in individuals with as little as half of a pack-year of smoking. In this study, we show that a novel droplet digital PCR assay for measuring methylation at cg05575921 can reliably detect smoking status, as confirmed by serum cotinine, in populations with different demographic characteristics, smoking histories, and rates of false-negative self-report of smoking behavior. Using logistic regression models, we show that obtaining maximum accuracy in predicting smoking status depends on appropriately weighting self-report and cg05575921 methylation according to the characteristics of the sample being tested. Furthermore, models using only cg05575921 methylation to predict smoking perform nearly as well as those also including self-report across populations. In conclusion, cg05575921 has significant potential as a clinical biomarker to detect smoking in populations with varying rates of accuracy in self-report of smoking behavior. This article is a U.S. Government work and is in the public domain in the USA.
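
    A minimal sketch of the modeling idea, weighting cg05575921 methylation together with self-report in a logistic regression, is shown below. The data are synthetic and the effect sizes are invented; this is not the authors' model, their assay, or their cohort.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
# Synthetic stand-ins: cotinine-confirmed smoking status, cg05575921 beta values
# (lower methylation in smokers), and imperfect self-report. All values are invented.
smoker = rng.integers(0, 2, n)
methylation = np.where(smoker == 1, rng.normal(0.55, 0.10, n), rng.normal(0.85, 0.05, n))
self_report = np.where(smoker == 1, rng.random(n) < 0.7, rng.random(n) < 0.02).astype(int)

# Weight methylation and self-report jointly to predict confirmed smoking status.
X = np.column_stack([methylation, self_report])
model = LogisticRegression().fit(X, smoker)
print("weights (methylation, self-report):", model.coef_[0])
print("training accuracy:", model.score(X, smoker))
```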

  2. Performance of CT scan of abdomen and pelvis in detecting asymptomatic synchronous metastasis in breast cancer.

    PubMed

    James, Justin; Teo, Melanie; Ramachandran, Vivekananda; Law, Michael; Stoney, David; Cheng, Michael

    2017-10-01

    In many centres in Australia, CT scan of abdomen and pelvis (CTAP) is a commonly used staging investigation to detect asymptomatic synchronous metastasis (ASM) in newly diagnosed breast cancer. However, its routine use is not supported by strong evidence either on its cost effectiveness or on its specificity. Despite contrary recommendations by international guidelines, this staging investigation is widely used among new early breast cancers (EBC). This retrospective study aims to assess the cost effectiveness and usefulness of CTAP in new breast cancers. All patients with primary invasive breast cancers who underwent breast cancer treatment through the Eastern Health breast unit during a 50-month period from January 2012 were included in the study. All staging CTAP results were reviewed to evaluate the yield, false positive rate and cost of investigation per single positive result. Odds ratios for positive test results were calculated for five possible risk factors (age less than 40 years, stage III disease, presence of LVI, HER2 positive disease and presence of metastasis in lymph nodes). 49% (n = 285) of all breast cancer patients underwent staging CTAP, which led to the detection of 4 ASM (overall yield of 1%). The overall false positive rate was 15% because of 42 indeterminate results needing further tests. Based merely on approved billing rates, this amounted to $40,733 per single ASM identified. Presence of lymph node metastasis did not increase the chance of a positive test result (OR = 1.3; CI: 0.13-12.69). Staging CTAP is associated with a high false positive rate and low yield, especially among EBCs. It is desirable to choose this investigation more selectively than is currently practiced. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
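
    The headline figures follow from simple arithmetic on the reported counts, and the odds ratios are standard 2x2-table calculations. The sketch below reproduces the yield and false positive rate arithmetic and includes a generic odds ratio helper applied to a purely hypothetical table, since the abstract does not give the underlying 2x2 cell counts.

```python
import math

# Counts reported in the abstract.
n_scanned, n_asm, n_indeterminate = 285, 4, 42
print("overall yield      : %.1f%%" % (100 * n_asm / n_scanned))            # quoted as ~1%
print("false positive rate: %.1f%%" % (100 * n_indeterminate / n_scanned))  # quoted as 15%

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% confidence interval for a 2x2 table [[a, b], [c, d]]."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, (or_ * math.exp(-z * se), or_ * math.exp(z * se))

# Hypothetical cell counts only, to make the calculation explicit; these are not
# the study's data.
print(odds_ratio_ci(3, 120, 1, 161))
```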

  3. A posture recognition based fall detection system for monitoring an elderly person in a smart home environment.

    PubMed

    Yu, Miao; Rhuma, Adel; Naqvi, Syed Mohsen; Wang, Liang; Chambers, Jonathon

    2012-11-01

    We propose a novel computer vision based fall detection system for monitoring an elderly person in a home care application. Background subtraction is applied to extract the foreground human body and the result is improved by using certain post-processing. Information from ellipse fitting and a projection histogram along the axes of the ellipse are used as the features for distinguishing different postures of the human. These features are then fed into a directed acyclic graph support vector machine (DAGSVM) for posture classification, the result of which is then combined with derived floor information to detect a fall. From a dataset of 15 people, we show that our fall detection system can achieve a high fall detection rate (97.08%) and a very low false detection rate (0.8%) in a simulated home environment.

  4. Auditory false perception in schizophrenia: Development and validation of auditory signal detection task.

    PubMed

    Chhabra, Harleen; Sowmya, Selvaraj; Sreeraj, Vanteemar S; Kalmady, Sunil V; Shivakumar, Venkataram; Amaresha, Anekal C; Narayanaswamy, Janardhanan C; Venkatasubramanian, Ganesan

    2016-12-01

    Auditory hallucinations constitute an important symptom component in 70-80% of schizophrenia patients. These hallucinations are proposed to occur due to an imbalance between perceptual expectation and external input, resulting in attachment of meaning to abstract noises; signal detection theory has been proposed to explain these phenomena. In this study, we describe the development of an auditory signal detection task using a carefully chosen set of English words that could be tested successfully in schizophrenia patients coming from varying linguistic, cultural and social backgrounds. Schizophrenia patients with significant auditory hallucinations (N=15) and healthy controls (N=15) performed the auditory signal detection task wherein they were instructed to differentiate between a 5-s burst of plain white noise and voiced-noise. The analysis showed that false alarms (p=0.02), discriminability index (p=0.001) and decision bias (p=0.004) were significantly different between the two groups. There was a significant negative correlation between false alarm rate and decision bias. These findings extend further support for impaired perceptual expectation system in schizophrenia patients. Copyright © 2016 Elsevier B.V. All rights reserved.
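
    The reported measures come from standard equal-variance signal detection theory: the discriminability index d' and the decision bias (criterion c) are z-transforms of the hit and false alarm rates. A small sketch with illustrative rates, not the study's data, follows.

```python
from statistics import NormalDist

def sdt_measures(hit_rate, fa_rate):
    """Discriminability (d') and decision bias (criterion c) from signal
    detection theory: d' = z(H) - z(FA), c = -(z(H) + z(FA)) / 2."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate), -(z(hit_rate) + z(fa_rate)) / 2

# Illustrative rates only (not the study's data): a listener with a liberal bias
# produces more false alarms even when underlying sensitivity is unchanged.
print("liberal bias     :", sdt_measures(hit_rate=0.80, fa_rate=0.35))
print("conservative bias:", sdt_measures(hit_rate=0.80, fa_rate=0.10))
```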

  5. An effective method on pornographic images realtime recognition

    NASA Astrophysics Data System (ADS)

    Wang, Baosong; Lv, Xueqiang; Wang, Tao; Wang, Chengrui

    2013-03-01

    In this paper, skin detection, texture filtering and face detection are used to extract features from an image library, and these features are used to train a decision tree classifier whose rules distinguish unknown images. In experiments based on more than twenty thousand images, the precision rate reaches 76.21% when testing on 13,025 pornographic images, with an elapsed time of less than 0.2 s, indicating good practical applicability. Among the steps mentioned above, a new skin detection model, called the irregular polygon region skin detection model and based on the YCbCr color space, is proposed. This skin detection model can lower the false detection rate of skin detection. A new method called sequence region labeling on binary connected areas calculates features of connected areas; it is faster and needs less memory than other recursive methods.

  6. A P2P Botnet detection scheme based on decision tree and adaptive multilayer neural networks.

    PubMed

    Alauthaman, Mohammad; Aslam, Nauman; Zhang, Li; Alasem, Rafe; Hossain, M A

    2018-01-01

    In recent years, Botnets have been adopted as a popular method for carrying and spreading malicious code on the Internet. This malicious code paves the way for many fraudulent activities, including spam mail, distributed denial-of-service attacks and click fraud. While many Botnets are set up using a centralized communication architecture, peer-to-peer (P2P) Botnets can adopt a decentralized architecture using an overlay network for exchanging command and control data, making their detection even more difficult. This work presents a method of P2P Bot detection based on an adaptive multilayer feed-forward neural network in cooperation with decision trees. A classification and regression tree is applied as a feature selection technique to select relevant features. With these features, a multilayer feed-forward neural network training model is created using a resilient back-propagation learning algorithm. A comparison of feature set selection based on the decision tree, principal component analysis and the ReliefF algorithm indicated that the neural network model with feature selection based on the decision tree has better identification accuracy along with lower rates of false positives. The usefulness of the proposed approach is demonstrated by conducting experiments on real network traffic datasets. In these experiments, an average detection rate of 99.08% with a false positive rate of 0.75% was observed.
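
    A rough scikit-learn sketch of the overall pipeline, CART-based feature selection feeding a multilayer feed-forward network, is given below on synthetic data. scikit-learn's MLP trains with Adam or L-BFGS rather than resilient back-propagation, so this approximates rather than reproduces the paper's training scheme, and the feature set is invented.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for network-traffic features (the real datasets and feature
# names are not reproduced here).
X, y = make_classification(n_samples=2000, n_features=30, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# CART-based feature selection feeding a multilayer feed-forward network.
model = make_pipeline(
    SelectFromModel(DecisionTreeClassifier(random_state=0)),
    MLPClassifier(hidden_layer_sizes=(20, 10), max_iter=1000, random_state=0),
)
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```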

  7. Controlling false-negative errors in microarray differential expression analysis: a PRIM approach.

    PubMed

    Cole, Steve W; Galic, Zoran; Zack, Jerome A

    2003-09-22

    Theoretical considerations suggest that current microarray screening algorithms may fail to detect many true differences in gene expression (Type II analytic errors). We assessed 'false negative' error rates in differential expression analyses by conventional linear statistical models (e.g. t-test), microarray-adapted variants (e.g. SAM, Cyber-T), and a novel strategy based on hold-out cross-validation. The latter approach employs the machine-learning algorithm Patient Rule Induction Method (PRIM) to infer minimum thresholds for reliable change in gene expression from Boolean conjunctions of fold-induction and raw fluorescence measurements. Monte Carlo analyses based on four empirical data sets show that conventional statistical models and their microarray-adapted variants overlook more than 50% of genes showing significant up-regulation. Conjoint PRIM prediction rules recover approximately twice as many differentially expressed transcripts while maintaining strong control over false-positive (Type I) errors. As a result, experimental replication rates increase and total analytic error rates decline. RT-PCR studies confirm that gene inductions detected by PRIM but overlooked by other methods represent true changes in mRNA levels. PRIM-based conjoint inference rules thus represent an improved strategy for high-sensitivity screening of DNA microarrays. Freestanding JAVA application at http://microarray.crump.ucla.edu/focus

  8. Coincidence and coherent data analysis methods for gravitational wave bursts in a network of interferometric detectors

    NASA Astrophysics Data System (ADS)

    Arnaud, Nicolas; Barsuglia, Matteo; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Kreckelbergh, Stephane; Porter, Edward K.

    2003-11-01

    Network data analysis methods are the only way to properly separate real gravitational wave (GW) transient events from detector noise. They can be divided into two generic classes: the coincidence method and the coherent analysis. The former uses lists of selected events provided by each interferometer belonging to the network and tries to correlate them in time to identify a physical signal. Instead of this binary treatment of detector outputs (signal present or absent), the latter method involves first the merging of the interferometer data and looks for a common pattern, consistent with an assumed GW waveform and a given source location in the sky. The thresholds are only applied later, to validate or reject the hypothesis made. As coherent algorithms use more complete information than coincidence methods, they are expected to provide better detection performances, but at a higher computational cost. An efficient filter must yield a good compromise between a low false alarm rate (hence triggering on data at a manageable rate) and a high detection efficiency. Therefore, the comparison of the two approaches is achieved using so-called receiver operating characteristics (ROC), giving the relationship between the false alarm rate and the detection efficiency for a given method. This paper investigates this question via Monte Carlo simulations, using the network model developed in a previous article. Its main conclusions are the following. First, a three-interferometer network such as Virgo-LIGO is found to be too small to reach good detection efficiencies at low false alarm rates: larger configurations are needed to reach a confidence level high enough to validate a detected event as a true GW. In addition, an efficient network must contain interferometers with comparable sensitivities: studying the three-interferometer LIGO network shows that the 2-km interferometer with half sensitivity leads to a strong reduction of performance as compared to a network of three interferometers with full sensitivity. Finally, it is shown that coherent analyses are feasible for burst searches and are clearly more efficient than coincidence strategies. Therefore, developing such methods should be an important goal of a worldwide collaborative data analysis.

  9. Ellipsoids for anomaly detection in remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Grosklos, Guenchik; Theiler, James

    2015-05-01

    For many target and anomaly detection algorithms, a key step is the estimation of a centroid (relatively easy) and a covariance matrix (somewhat harder) that characterize the background clutter. For a background that can be modeled as a multivariate Gaussian, the centroid and covariance lead to an explicit probability density function that can be used in likelihood ratio tests for optimal detection statistics. But ellipsoidal contours can characterize a much larger class of multivariate density functions, and the ellipsoids that characterize the outer periphery of the distribution are most appropriate for detection in the low false alarm rate regime. Traditionally the sample mean and sample covariance are used to estimate ellipsoid location and shape, but these quantities are confounded both by large lever-arm outliers and by non-Gaussian distributions within the ellipsoid of interest. This paper compares a variety of centroid and covariance estimation schemes with the aim of characterizing the periphery of the background distribution. In particular, we consider a robust variant of the Khachiyan algorithm for the minimum-volume enclosing ellipsoid. The performance of these different approaches is evaluated on multispectral and hyperspectral remote sensing imagery using coverage plots of ellipsoid volume versus false alarm rate.
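
    The conventional baseline the paper compares against can be sketched as a Mahalanobis-distance (RX-style) detector built from the sample mean and covariance: pixels far outside the fitted ellipsoid are flagged as anomalies. The robust Khachiyan minimum-volume-ellipsoid variant is not reproduced here, and the spectra below are synthetic.

```python
import numpy as np

def mahalanobis_anomaly_scores(background, pixels):
    """Score pixels by squared Mahalanobis distance to the background ellipsoid
    defined by the sample mean and covariance; high scores flag anomalies."""
    mu = background.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(background, rowvar=False))
    d = pixels - mu
    return np.einsum('ij,jk,ik->i', d, cov_inv, d)  # per-row quadratic form

rng = np.random.default_rng(1)
background = rng.multivariate_normal([0, 0, 0], np.diag([1.0, 2.0, 0.5]), size=5000)
test = np.vstack([background[:5], np.array([[6.0, -5.0, 3.0]])])  # last row is an implant
print(mahalanobis_anomaly_scores(background, test).round(2))
```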

  10. A soft kinetic data structure for lesion border detection.

    PubMed

    Kockara, Sinan; Mete, Mutlu; Yip, Vincent; Lee, Brendan; Aydin, Kemal

    2010-06-15

    The medical imaging and image processing techniques, ranging from microscopic to macroscopic, has become one of the main components of diagnostic procedures to assist dermatologists in their medical decision-making processes. Computer-aided segmentation and border detection on dermoscopic images is one of the core components of diagnostic procedures and therapeutic interventions for skin cancer. Automated assessment tools for dermoscopic images have become an important research field mainly because of inter- and intra-observer variations in human interpretations. In this study, a novel approach-graph spanner-for automatic border detection in dermoscopic images is proposed. In this approach, a proximity graph representation of dermoscopic images in order to detect regions and borders in skin lesion is presented. Graph spanner approach is examined on a set of 100 dermoscopic images whose manually drawn borders by a dermatologist are used as the ground truth. Error rates, false positives and false negatives along with true positives and true negatives are quantified by digitally comparing results with manually determined borders from a dermatologist. The results show that the highest precision and recall rates obtained to determine lesion boundaries are 100%. However, accuracy of assessment averages out at 97.72% and borders errors' mean is 2.28% for whole dataset.

  11. A robust hypothesis test for the sensitive detection of constant speed radiation moving sources

    NASA Astrophysics Data System (ADS)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir; Boudergui, Karim; Moline, Yoann; Sannié, Guillaume; Gameiro, Jordan; Normand, Stéphane; Méchin, Laurence

    2015-09-01

    Radiation Portal Monitors are deployed in linear networks to detect radiological material in motion. As a complement to single and multichannel detection algorithms, which are inefficient at very low signal-to-noise ratios, temporal correlation algorithms have been introduced. Hypothesis test methods based on the empirically estimated mean and variance of the signals delivered by the different channels have shown significant gains in the tradeoff between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods, taking advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that, in the four relevant configurations (a pedestrian or a vehicle source carrier, each under high and low count rate radioactive backgrounds), the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarm. It also guarantees that the optimal coverage factor for this compromise remains stable for signal-to-noise ratio variations between 2 and 0.8, therefore allowing the final user to parametrize the test with the sole prior knowledge of the background amplitude.
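
    The underlying idea, testing whether registered counts are statistically compatible with a Poisson background, can be sketched with a single-channel exceedance p-value as below. The numbers are invented and the sketch is a simplification; it is not the multichannel temporal-correlation test proposed in the paper.

```python
from scipy.stats import poisson

def poisson_exceedance_pvalue(observed, expected_background):
    """P-value for observing `observed` or more counts when the background
    follows a Poisson law with mean `expected_background`."""
    return poisson.sf(observed - 1, expected_background)

# Illustrative counts only: a channel registering 30 counts against an expected
# background of 18 counts over the same integration interval.
p = poisson_exceedance_pvalue(30, 18.0)
print("p-value: %.4f ->" % p, "alarm" if p < 0.01 else "no alarm")
```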

  12. Computer-aided detection and diagnosis of masses and clustered microcalcifications from digital mammograms

    NASA Astrophysics Data System (ADS)

    Nishikawa, Robert M.; Giger, Maryellen L.; Doi, Kunio; Vyborny, Carl J.; Schmidt, Robert A.; Metz, Charles E.; Wu, Chris Y.; Yin, Fang-Fang; Jiang, Yulei; Huo, Zhimin; Lu, Ping; Zhang, Wei; Ema, Takahiro; Bick, Ulrich; Papaioannou, John; Nagel, Rufus H.

    1993-07-01

    We are developing an 'intelligent' workstation to assist radiologists in diagnosing breast cancer from mammograms. The hardware for the workstation will consist of a film digitizer, a high speed computer, a large volume storage device, a film printer, and 4 high resolution CRT monitors. The software for the workstation is a comprehensive package of automated detection and classification schemes. Two rule-based detection schemes have been developed, one for breast masses and the other for clustered microcalcifications. The sensitivity of both schemes is 85% with a false-positive rate of approximately 3.0 and 1.5 false detections per image, for the mass and cluster detection schemes, respectively. Computerized classification is performed by an artificial neural network (ANN). The ANN has a sensitivity of 100% with a specificity of 60%. Currently, the ANN, which is a three-layer, feed-forward network, requires as input ratings of 14 different radiographic features of the mammogram that were determined subjectively by a radiologist. We are in the process of developing automated techniques to objectively determine these 14 features. The workstation will be placed in the clinical reading area of the radiology department in the near future, where controlled clinical tests will be performed to measure its efficacy.

  13. Reliable motion detection of small targets in video with low signal-to-clutter ratios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, S.A.; Naylor, R.B.

    1995-07-01

    Studies show that vigilance decreases rapidly after several minutes when human operators are required to search live video for infrequent intrusion detections. Therefore, there is a need for systems which can automatically detect targets in live video and reserve the operator's attention for assessment only. Thus far, automated systems have not simultaneously provided adequate detection sensitivity, false alarm suppression, and ease of setup when used in external, unconstrained environments. This unsatisfactory performance can be exacerbated by poor video imagery with low contrast, high noise, dynamic clutter, image misregistration, and/or the presence of small, slow, or erratically moving targets. This paper describes a highly adaptive video motion detection and tracking algorithm which has been developed as part of Sandia's Advanced Exterior Sensor (AES) program. The AES is a wide-area detection and assessment system for use in unconstrained exterior security applications. The AES detection and tracking algorithm provides good performance under stressing data and environmental conditions. Features of the algorithm include: reliable detection, with a negligible false alarm rate, of variable velocity targets having low signal-to-clutter ratios; reliable tracking of targets that exhibit motion that is non-inertial, i.e., varies in direction and velocity; automatic adaptation to both infrared and visible imagery with variable quality; and suppression of false alarms caused by sensor flaws and/or cutouts.

  14. Supervised machine learning on a network scale: application to seismic event classification and detection

    NASA Astrophysics Data System (ADS)

    Reynen, Andrew; Audet, Pascal

    2017-09-01

    A new method using a machine learning technique is applied to event classification and detection at seismic networks. This method is applicable to a variety of network sizes and settings. The algorithm makes use of a small catalogue of known observations across the entire network. Two attributes, the polarization and frequency content, are used as input to regression. These attributes are extracted at predicted arrival times for P and S waves using only an approximate velocity model, as the attributes are calculated over large time spans. This method of waveform characterization is shown to be able to distinguish between blasts and earthquakes with 99 per cent accuracy using a network of 13 stations located in Southern California. The combination of machine learning with generalized waveform features is further applied to event detection in Oklahoma, United States. The event detection algorithm makes use of a pair of unique seismic phases to locate events, with a precision directly related to the sampling rate of the generalized waveform features. Over a week of data from 30 stations in Oklahoma, United States, is used to automatically detect 25 times more events than the catalogue of the local geological survey, with a false detection rate of less than 2 per cent. This method provides a highly confident way of detecting and locating events. Furthermore, a large number of seismic events can be automatically detected with low false alarm rates, allowing for a larger automatic event catalogue with a high degree of trust.

  15. Detection of CIN by naked eye visualization after application of acetic acid.

    PubMed

    Londhe, M; George, S S; Seshadri, L

    1997-06-01

    A prospective study was undertaken to determine the sensitivity and specificity of acetic acid application to the cervix followed by naked eye visualization as a screening test for the detection of cervical intraepithelial neoplasia. Three hundred and seventy-two sexually active women in the reproductive age group were studied. All the women underwent the Papanicolaou test, the acetic acid test and colposcopy. One hundred and seventy-five women were acetic acid test negative and 197 women were acetic acid test positive. The sensitivity of the acetic acid test was 72.4%, specificity 54% and false negative rate 15.2%, as compared to the Papanicolaou test, which had a sensitivity of 13.2%, specificity of 96.3% and false negative rate of 24.4%. The advantage of the acetic acid test lies in its easy technique, low cost and high sensitivity, which are important factors for determining the efficacy of any screening programme in developing countries.
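
    The reported figures are standard screening-test quantities computed from a 2x2 table of test result versus colposcopy-confirmed disease. The sketch below shows the formulas on hypothetical counts; the study's raw table is not given in the abstract, and the exact denominator used for its "false negative rate" is not stated, so one common choice is assumed.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity and specificity from a 2x2 screening table, plus the share of
    negative test results that are falsely negative (one plausible reading of the
    abstract's 'false negative rate'; its exact denominator is an assumption)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "false_neg_among_test_negatives": fn / (fn + tn),
    }

# Hypothetical counts chosen only to illustrate the formulas, not the study's data.
print(screening_metrics(tp=50, fp=40, fn=10, tn=100))
```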

  16. To Control False Positives in Gene-Gene Interaction Analysis: Two Novel Conditional Entropy-Based Approaches

    PubMed Central

    Lin, Meihua; Li, Haoli; Zhao, Xiaolei; Qin, Jiheng

    2013-01-01

    Genome-wide analysis of gene-gene interactions has been recognized as a powerful avenue to identify the missing genetic components that cannot be detected using current single-point association analysis. Recently, several model-free methods (e.g. the commonly used information-based metrics and several logistic regression-based metrics) were developed for detecting non-linear dependence between genetic loci, but they are potentially at risk of inflated false positive error, in particular when the main effects at one or both loci are salient. In this study, we proposed two conditional entropy-based metrics to address this limitation. Extensive simulations demonstrated that the two proposed metrics, provided the disease is rare, could maintain a consistently correct false positive rate. In the scenarios for a common disease, our proposed metrics achieved better or comparable control of false positive error, compared to four previously proposed model-free metrics. In terms of power, our methods outperformed several competing metrics in a range of common disease models. Furthermore, in real data analyses, both metrics succeeded in detecting interactions and were competitive with the originally reported results or the logistic regression approaches. In conclusion, the proposed conditional entropy-based metrics are promising alternatives to current model-based approaches for detecting genuine epistatic effects. PMID:24339984
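
    As a toy illustration of the basic information-theoretic building block, the sketch below computes a conditional entropy H(Y|X) from a count table. The paper's metrics condition on genotypes at pairs of loci and are specifically constructed to control false positives in the presence of main effects; that construction is not reproduced here, and the counts are invented.

```python
import numpy as np

def conditional_entropy(joint):
    """H(Y | X) in bits from a joint count table with X on rows and Y on columns."""
    p = np.asarray(joint, dtype=float)
    p /= p.sum()
    px = p.sum(axis=1, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        return -np.nansum(p * np.log2(p / px))

# Toy genotype-by-phenotype count table (rows: genotype at a locus, columns: case/control).
# Purely illustrative counts.
counts = [[40, 60], [55, 45], [70, 30]]
print("H(Y|X) =", round(conditional_entropy(counts), 4))
```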

  17. Current automated 3D cell detection methods are not a suitable replacement for manual stereologic cell counting

    PubMed Central

    Schmitz, Christoph; Eastwood, Brian S.; Tappan, Susan J.; Glaser, Jack R.; Peterson, Daniel A.; Hof, Patrick R.

    2014-01-01

    Stereologic cell counting has had a major impact on the field of neuroscience. A major bottleneck in stereologic cell counting is that the user must manually decide whether or not each cell is counted according to three-dimensional (3D) stereologic counting rules by visual inspection within hundreds of microscopic fields-of-view per investigated brain or brain region. Reliance on visual inspection forces stereologic cell counting to be very labor-intensive and time-consuming, and is the main reason why biased, non-stereologic two-dimensional (2D) “cell counting” approaches have remained in widespread use. We present an evaluation of the performance of modern automated cell detection and segmentation algorithms as a potential alternative to the manual approach in stereologic cell counting. The image data used in this study were 3D microscopic images of thick brain tissue sections prepared with a variety of commonly used nuclear and cytoplasmic stains. The evaluation compared the numbers and locations of cells identified unambiguously and counted exhaustively by an expert observer with those found by three automated 3D cell detection algorithms: nuclei segmentation from the FARSIGHT toolkit, nuclei segmentation by 3D multiple level set methods, and the 3D object counter plug-in for ImageJ. Of these methods, FARSIGHT performed best, with true-positive detection rates between 38 and 99% and false-positive rates from 3.6 to 82%. The results demonstrate that the current automated methods suffer from lower detection rates and higher false-positive rates than are acceptable for obtaining valid estimates of cell numbers. Thus, at present, stereologic cell counting with manual decision for object inclusion according to unbiased stereologic counting rules remains the only adequate method for unbiased cell quantification in histologic tissue sections. PMID:24847213

  18. p-hacking by post hoc selection with multiple opportunities: Detectability by skewness test?: Comment on Simonsohn, Nelson, and Simmons (2014).

    PubMed

    Ulrich, Rolf; Miller, Jeff

    2015-12-01

    Simonsohn, Nelson, and Simmons (2014) have suggested a novel test to detect p-hacking in research, that is, when researchers report excessive rates of "significant effects" that are truly false positives. Although this test is very useful for identifying true effects in some cases, it fails to identify false positives in several situations when researchers conduct multiple statistical tests (e.g., reporting the most significant result). In these cases, p-curves are right-skewed, thereby mimicking the existence of real effects even if no effect is actually present. (c) 2015 APA, all rights reserved.

  19. Reflectance-based skin detection in the short wave infrared band and its application to video

    NASA Astrophysics Data System (ADS)

    Langston, Tye

    2016-10-01

    Robust reflectance-based skin detection is a potentially powerful tool for security and search and rescue applications, especially when applied to video. However, to be useful it must be able to account for the variations of human skin, as well as other items in the environment that could cause false detections. This effort focused on identifying a robust skin detection scheme that is appropriate for video application. Skin reflectance was modeled to identify unique skin features and compare them to potential false positive materials. Based on these comparisons, specific wavelength bands were selected and different combinations of two and three optical filters were used for actively identifying skin, as well as identifying and removing potential false positive materials. One wavelength combination (1072/1250 nm) was applied to video using both single- and dual-camera configurations based on its still image performance, as well as its appropriateness for video application. There are several important factors regarding the extension of still image skin detection to video, including light available for detection (solar irradiance and reflectance intensity), overall intensity differences between different optical filters, optical component light loss, frame rate, time lag when switching between filters, image coregistration, and camera auto gain behavior.

  20. Colonic polyps: application value of computer-aided detection in computed tomographic colonography.

    PubMed

    Zhang, Hui-Mao; Guo, Wei; Liu, Gui-Feng; An, Dong-Hong; Gao, Shuo-Hui; Sun, Li-Bo; Yang, Hai-Shan

    2011-02-01

    Colonic polyps are frequently encountered in clinics. Computed tomographic colonography (CTC), as a painless and quick examination, has high clinical value. In this study, we evaluated the application value of computer-aided detection (CAD) in CTC detection of colonic polyps in the Chinese population. CTC was performed with a GE 64-row multidetector computed tomography (MDCT) scanner. Data from 50 CTC patients (39 patients positive for at least one polyp of ≥ 0.5 cm in size and the other 11 patients negative by endoscopic detection) were retrospectively reviewed, first without computer-aided detection (CAD) and then with CAD, by four radiologists (two experienced and two inexperienced) blinded to colonoscopy findings. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of detected colonic polyps, as well as the areas under the ROC curves (Az value), with and without CAD were calculated. CAD increased the overall sensitivity, specificity, positive predictive value, negative predictive value and accuracy of colonic polyp detection by experienced and inexperienced readers. The sensitivity in detecting small polyps (5 - 9 mm) with CAD increased from 82% to 93% for experienced readers and from 44% to 82% for inexperienced readers (P > 0.05 and P < 0.001, respectively). With the use of CAD, the overall false positive rate and false negative rate for the detection of polyps by experienced and inexperienced readers decreased to different degrees. Among 13 sessile polyps not detected by CAD, two were ≥ 1.0 cm, eleven were 5 - 9 mm in diameter, and nine were flat-shaped lesions. The application of CAD in combination with CTC can increase the ability to detect colonic polyps, particularly for inexperienced readers. However, CAD is of limited value for the detection of flat polyps.

  1. Applying a CAD-generated imaging marker to assess short-term breast cancer risk

    NASA Astrophysics Data System (ADS)

    Mirniaharikandehei, Seyedehnafiseh; Zarafshani, Ali; Heidari, Morteza; Wang, Yunzhi; Aghaei, Faranak; Zheng, Bin

    2018-02-01

    Whether using computer-aided detection (CAD) helps improve radiologists' performance in reading and interpreting mammograms remains controversial due to higher false-positive detection rates. The objective of this study is to investigate and test a new hypothesis that CAD-generated false-positives, in particular the bilateral summation of false-positives, constitute a potential imaging marker associated with short-term breast cancer risk. An image dataset involving negative screening mammograms acquired from 1,044 women was retrospectively assembled. Each case involves 4 images: the craniocaudal (CC) and mediolateral oblique (MLO) views of the left and right breasts. In the next subsequent mammography screening, 402 cases were positive for cancer and 642 remained negative. A CAD scheme was applied to process all "prior" negative mammograms. Several features were extracted from the CAD scheme, including detection seeds, the total number of false-positive regions, the average detection score, and the sum of detection scores in CC and MLO view images. The features computed from the two bilateral images of the left and right breasts from either the CC or MLO view were then combined. In order to predict the likelihood of each testing case being positive in the next subsequent screening, two logistic regression models were trained and tested using a leave-one-case-out based cross-validation method. Data analysis demonstrated a maximum prediction accuracy with an area under the ROC curve of AUC=0.65+/-0.017 and a maximum adjusted odds ratio of 4.49 with a 95% confidence interval of [2.95, 6.83]. The results also illustrated an increasing trend in the adjusted odds ratio and risk prediction scores (p<0.01). Thus, the study showed that CAD-generated false-positives might provide a new quantitative imaging marker to help assess short-term breast cancer risk.

  2. Reducing false positives of microcalcification detection systems by removal of breast arterial calcifications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mordang, Jan-Jurre, E-mail: Jan-Jurre.Mordang@radboudumc.nl; Gubern-Mérida, Albert; Karssemeijer, Nico

    Purpose: In the past decades, computer-aided detection (CADe) systems have been developed to aid screening radiologists in the detection of malignant microcalcifications. These systems are useful to avoid perceptual oversights and can increase the radiologists’ detection rate. However, due to the high number of false positives marked by these CADe systems, they are not yet suitable as an independent reader. Breast arterial calcifications (BACs) are one of the most frequent false positives marked by CADe systems. In this study, a method is proposed for the elimination of BACs as positive findings. Removal of these false positives will increase the performance of the CADe system in finding malignant microcalcifications. Methods: A multistage method is proposed for the removal of BAC findings. The first stage consists of a microcalcification candidate selection, segmentation and grouping of the microcalcifications, and classification to remove obvious false positives. In the second stage, a case-based selection is applied where cases are selected which contain BACs. In the final stage, BACs are removed from the selected cases. The BACs removal stage consists of a GentleBoost classifier trained on microcalcification features describing their shape, topology, and texture. Additionally, novel features are introduced to discriminate BACs from other positive findings. Results: The CADe system was evaluated with and without BACs removal. Here, both systems were applied on a validation set containing 1088 cases of which 95 cases contained malignant microcalcifications. After bootstrapping, free-response receiver operating characteristics and receiver operating characteristics analyses were carried out. Performance between the two systems was compared at 0.98 and 0.95 specificity. At a specificity of 0.98, the sensitivity increased from 37% to 52%, and the sensitivity increased from 62% up to 76% at a specificity of 0.95. Partial areas under the curve in the specificity range of 0.8–1.0 were significantly different between the system without BACs removal and the system with BACs removal, 0.129 ± 0.009 versus 0.144 ± 0.008 (p<0.05), respectively. Additionally, the sensitivity at one false positive per 50 cases and one false positive per 25 cases increased as well, 37% versus 51% (p<0.05) and 58% versus 67% (p<0.05) sensitivity, respectively. Additionally, the CADe system with BACs removal reduces the number of false positives per case by 29% on average. The same sensitivity at one false positive per 50 cases in the CADe system without BACs removal can be achieved at one false positive per 80 cases in the CADe system with BACs removal. Conclusions: By using dedicated algorithms to detect and remove breast arterial calcifications, the performance of CADe systems can be improved, in particular at false positive rates representative of operating points used in screening.

  3. FTIR gas analysis with improved sensitivity and selectivity for CWA and TIC detection

    NASA Astrophysics Data System (ADS)

    Phillips, Charles M.; Tan, Huwei

    2010-04-01

    This presentation describes the use of an FTIR (Fourier Transform Infrared)-based spectrometer designed to continuously monitor ambient air for the presence of chemical warfare agents (CWAs) and toxic industrial chemicals (TICs). The necessity of a reliable system capable of quickly and accurately detecting very low levels of CWAs and TICs while simultaneously retaining a negligible false alarm rate will be explored. Technological advancements in FTIR sensing have reduced noise while increasing selectivity and speed of detection. These novel analyzer design characteristics are discussed in detail and descriptions are provided which show how optical throughput, gas cell form factor, and detector response are optimized. The hardware and algorithms described here will explain why this FTIR system is very effective for the simultaneous detection and speciation of a wide variety of toxic compounds at ppb concentrations. Analytical test data will be reviewed demonstrating the system's sensitivity to and selectivity for specific CWAs and TICs; this will include recent data acquired as part of the DHS ARFCAM (Autonomous Rapid Facility Chemical Agent Monitor) project. These results include analyses of the data from live agent testing for the determination of CWA detection limits, immunity to interferences, detection times, residual noise analysis and false alarm rates. Sensing systems such as this are critical for effective chemical hazard identification which is directly relevant to the CBRNE community.

  4. Fetal heart rate deceleration detection using a discrete cosine transform implementation of singular spectrum analysis.

    PubMed

    Warrick, P A; Precup, D; Hamilton, E F; Kearney, R E

    2007-01-01

    To develop a singular-spectrum analysis (SSA) based change-point detection algorithm applicable to fetal heart rate (FHR) monitoring to improve the detection of deceleration events. We present a method for decomposing a signal into near-orthogonal components via the discrete cosine transform (DCT) and apply this in a novel online manner to change-point detection based on SSA. The SSA technique forms models of the underlying signal that can be compared over time; models that are sufficiently different indicate signal change points. To adapt the algorithm to deceleration detection where many successive similar change events can occur, we modify the standard SSA algorithm to hold the reference model constant under such conditions, an approach that we term "base-hold SSA". The algorithm is applied to a database of 15 FHR tracings that have been preprocessed to locate candidate decelerations and is compared to the markings of an expert obstetrician. Of the 528 true and 1285 false decelerations presented to the algorithm, the base-hold approach improved on standard SSA, reducing the number of missed decelerations from 64 to 49 (21.9%) while maintaining the same reduction in false-positives (278). The standard SSA assumption that changes are infrequent does not apply to FHR analysis where decelerations can occur successively and in close proximity; our base-hold SSA modification improves detection of these types of event series.
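
    A much-simplified sketch of the idea is given below: each window is described by a low-order DCT model and change is scored as the distance between the reference-window and test-window models, on a synthetic FHR-like trace. It is a stand-in for, not a reproduction of, the paper's SSA algorithm; the base-hold variant would additionally freeze the reference model while successive decelerations are being flagged.

```python
import numpy as np
from scipy.fft import dct

def dct_model(window, n_coeffs=5):
    """Low-order DCT description of a signal window (a stand-in for the paper's
    DCT-based SSA decomposition)."""
    c = dct(np.asarray(window, dtype=float), norm="ortho")
    c[n_coeffs:] = 0.0
    return c

def change_score(reference, test, n_coeffs=5):
    """Distance between low-order models of a reference and a test window;
    large values suggest a change point (e.g., the onset of a deceleration)."""
    return np.linalg.norm(dct_model(reference, n_coeffs) - dct_model(test, n_coeffs))

# Synthetic FHR-like trace: a baseline around 140 bpm with a dip (deceleration).
t = np.arange(0, 120)
fhr = 140 + np.random.default_rng(2).normal(0, 1.5, t.size)
fhr[60:90] -= 25 * np.sin(np.pi * (t[60:90] - 60) / 30)
print("no change :", round(change_score(fhr[0:30], fhr[30:60]), 2))
print("change    :", round(change_score(fhr[30:60], fhr[60:90]), 2))
```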

  5. Screen-detected versus interval cancers: Effect of imaging modality and breast density in the Flemish Breast Cancer Screening Programme.

    PubMed

    Timmermans, Lore; Bleyen, Luc; Bacher, Klaus; Van Herck, Koen; Lemmens, Kim; Van Ongeval, Chantal; Van Steen, Andre; Martens, Patrick; De Brabander, Isabel; Goossens, Mathieu; Thierens, Hubert

    2017-09-01

    To investigate if direct radiography (DR) performs better than screen-film mammography (SF) and computed radiography (CR) in dense breasts in a decentralized organised Breast Cancer Screening Programme. To this end, screen-detected versus interval cancers were studied in different BI-RADS density classes for these imaging modalities. The study cohort consisted of 351,532 women who participated in the Flemish Breast Cancer Screening Programme in 2009 and 2010. Information on screen-detected and interval cancers, breast density scores of radiologist second readers, and imaging modality was obtained by linkage of the databases of the Centre of Cancer Detection and the Belgian Cancer Registry. Overall, 67% of occurring breast cancers are screen detected and 33% are interval cancers, with DR performing better than SF and CR. The interval cancer rate increases gradually with breast density, regardless of modality. In the high-density class, the interval cancer rate exceeds the cancer detection rate for SF and CR, but not for DR. DR is superior to SF and CR with respect to cancer detection rates for high-density breasts. To reduce the high interval cancer rate in dense breasts, use of an additional imaging technique in screening can be taken into consideration. • Interval cancer rate increases gradually with breast density, regardless of modality. • Cancer detection rate in high-density breasts is superior in DR. • IC rate exceeds CDR for SF and CR in high-density breasts. • DR performs better in high-density breasts for third readings and false-positives.

  6. False match elimination for face recognition based on SIFT algorithm

    NASA Astrophysics Data System (ADS)

    Gu, Xuyuan; Shi, Ping; Shao, Meide

    2011-06-01

    The SIFT (Scale Invariant Feature Transform) is a well known algorithm used to detect and describe local features in images. It is invariant to image scale and rotation, and robust to noise and illumination changes. In this paper, a novel SIFT-based method for face recognition is proposed, which combines an optimized SIFT, mutual matching and Progressive Sample Consensus (PROSAC), and can effectively eliminate false matches in face recognition. Experiments on the ORL face database show that many false matches can be eliminated and a better recognition rate is achieved.
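
    A sketch of the matching-and-pruning stages using OpenCV is shown below: mutual (cross-checked) nearest-neighbour matching followed by a robust geometric fit that discards outlying matches. Plain RANSAC is used here as a stand-in for PROSAC (recent OpenCV builds also expose cv2.USAC_PROSAC), and the image paths in the usage comment are placeholders.

```python
import cv2
import numpy as np

def filter_matches(img1, img2):
    """Mutual SIFT matching followed by a robust geometric fit to drop false matches."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # crossCheck=True keeps only matches that are mutual nearest neighbours.
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    matches = matcher.match(des1, des2)

    # Fit a homography robustly and keep only geometrically consistent matches.
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    _, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return [m for m, keep in zip(matches, inlier_mask.ravel()) if keep]

# Usage (paths are placeholders): load two face images in grayscale and count the
# matches that survive false-match elimination.
# a, b = cv2.imread("face_a.png", 0), cv2.imread("face_b.png", 0)
# print(len(filter_matches(a, b)))
```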

  7. Detecting and Discriminating Gravitational Microlensing in the SuperMACHO Survey

    NASA Astrophysics Data System (ADS)

    Garg, Arti

    2010-02-01

    The SuperMACHO Project is a 5 year survey to determine the nature of the lens population responsible for the excess gravitational microlensing rate toward the Large Magellanic Cloud observed by the MACHO project. The MACHO results indicate a large population of compact lenses toward the clouds, and the observed lensing rate is consistent with a Milky Way halo comprising up to ~20% Massive Compact Halo Objects (MACHOs), dark matter that is most likely baryonic. This work describes the method by which gravitational microlensing is detected in the SuperMACHO survey. Based on the MACHO findings and the SuperMACHO observing strategy and selection criteria, we expect fewer than 10^-6 of the sources monitored to be lensed at any time. Our detection criteria are designed to minimize false positives while preserving a statistically significant detection rate. We provide an overview of the detection criteria. We also discuss the selection criteria used to discriminate between microlensing and other astrophysical transients.

  8. Multilayer Statistical Intrusion Detection in Wireless Networks

    NASA Astrophysics Data System (ADS)

    Hamdi, Mohamed; Meddeb-Makhlouf, Amel; Boudriga, Noureddine

    2008-12-01

    The rapid proliferation of mobile applications and services has introduced new vulnerabilities that do not exist in fixed wired networks. Traditional security mechanisms, such as access control and encryption, turn out to be inefficient in modern wireless networks. Given the shortcomings of these protection mechanisms, an important research focus is intrusion detection systems (IDSs). This paper proposes a multilayer statistical intrusion detection framework for wireless networks. The architecture is well suited to wireless networks because the underlying detection models rely on radio parameters and traffic models. Accurate correlation between radio and traffic anomalies allows enhancing the efficiency of the IDS. A radio signal fingerprinting technique based on the maximal overlap discrete wavelet transform (MODWT) is developed. Moreover, a geometric clustering algorithm is presented. Depending on the characteristics of the fingerprinting technique, the clustering algorithm permits control of the false positive and false negative rates. Finally, simulation experiments have been carried out to validate the proposed IDS.

  9. Histoplasma Urinary Antigen Testing Obviates the Need for Coincident Serum Antigen Testing.

    PubMed

    Libert, Diane; Procop, Gary W; Ansari, Mohammad Q

    2018-03-07

    Serum and urine antigen (SAg, UAg) detection are common tests for Histoplasma capsulatum. UAg detection is more widely used and reportedly has a higher sensitivity. We investigated whether SAg detection contributes meaningfully to the initial evaluation of patients with suspected histoplasmosis. We reviewed 20,285 UAg and 1,426 SAg tests ordered from 1997 to 2016 and analyzed paired UAg and SAg tests completed on the same patient within 1 week. We determined the positivity rate for each test. Of 601 paired specimens, 542 were concurrent negatives and 48 were concurrent positives (98% agreement). Medical records were available for eight of 11 pairs with discrepant results. UAg was falsely positive in six instances, truly positive once, and falsely negative once. These findings support using a single antigen detection test, rather than both UAg and SAg, as an initial screen for suspected histoplasmosis. This aligns with the current practice of most physicians.

  10. Comparison of probability statistics for automated ship detection in SAR imagery

    NASA Astrophysics Data System (ADS)

    Henschel, Michael D.; Rey, Maria T.; Campbell, J. W. M.; Petrovic, D.

    1998-12-01

    This paper discusses the initial results of a recent operational trial of the Ocean Monitoring Workstation's (OMW) ship detection algorithm, which is essentially a Constant False Alarm Rate filter applied to Synthetic Aperture Radar data. The choice of probability distribution and the methodologies for calculating scene-specific statistics are discussed in some detail, and an empirical basis for the choice of probability distribution is given. We compare the results using a 1-look k-distribution function with various parameter choices and methods of estimation. As a special case of sea clutter statistics, the application of a χ²-distribution is also discussed. Comparisons are made with reference to RADARSAT data collected during the Maritime Command Operation Training exercise conducted in Atlantic Canadian waters in June 1998. Reference is also made to previously collected statistics. The OMW is a commercial software suite that provides modules for automated vessel detection, oil spill monitoring, and environmental monitoring. This work has been undertaken to fine-tune the OMW algorithms, with special emphasis on the false alarm rate of each algorithm.

  11. Comparison of body mass index, waist circumference, and waist to height ratio in the prediction of hypertension and diabetes mellitus: Filipino-American women cardiovascular study.

    PubMed

    Battie, Cynthia A; Borja-Hart, Nancy; Ancheta, Irma B; Flores, Rene; Rao, Goutham; Palaniappan, Latha

    2016-12-01

    The relative ability of three obesity indices to predict hypertension (HTN) and diabetes (DM), and the validity of using Asian-specific thresholds for these indices, were examined in Filipino-American women (FAW). Filipino-American women (n = 382), 40-65 years of age, were screened for HTN and DM in four major US cities. Body mass index (BMI), waist circumference (WC) and waist circumference to height ratio (WHtR) were measured. ROC analyses determined that the three obesity measurements were similar in predicting HTN and DM (AUC: 0.6-0.7). The universal WC threshold of ≥35 in. missed 13% of the hypertensive patients and 12% of the diabetic patients. The Asian WC threshold of ≥31.5 in. increased detection of HTN and DM but with a high rate of false positives. The traditional BMI ≥25 kg/m² threshold missed 35% of those with hypertension and 24% of those with diabetes. The Asian BMI threshold improved detection but resulted in a high rate of false positives. The suggested WHtR cut-off of ≥0.5 missed only 1% of those with HTN and 0% of those with DM. The three obesity measurements had similar but modest ability to predict HTN and DM in FAW. Using Asian-specific thresholds increased accuracy but with a high rate of false positives. Whether FAW, especially at older ages, should be encouraged to reach these lower thresholds needs further investigation because of the high false positive rates.

  12. Detecting outliers when fitting data with nonlinear regression – a new method based on robust nonlinear regression and the false discovery rate

    PubMed Central

    Motulsky, Harvey J; Brown, Ronald E

    2006-01-01

    Background: Nonlinear regression, like linear regression, assumes that the scatter of data around the ideal curve follows a Gaussian or normal distribution. This assumption leads to the familiar goal of regression: to minimize the sum of the squares of the vertical or Y-value distances between the points and the curve. Outliers can dominate the sum-of-the-squares calculation and lead to misleading results. However, we know of no practical method for routinely identifying outliers when fitting curves with nonlinear regression. Results: We describe a new method for identifying outliers when fitting data with nonlinear regression. We first fit the data using a robust form of nonlinear regression, based on the assumption that scatter follows a Lorentzian distribution. We devised a new adaptive method that gradually becomes more robust as the method proceeds. To define outliers, we adapted the false discovery rate approach to handling multiple comparisons. We then remove the outliers, and analyze the data using ordinary least-squares regression. Because the method combines robust regression and outlier removal, we call it the ROUT method. When analyzing simulated data, where all scatter is Gaussian, our method falsely detects one or more outliers in only about 1-3% of experiments. When analyzing data contaminated with one or several outliers, the ROUT method performs well at outlier identification, with an average False Discovery Rate less than 1%. Conclusion: Our method, which combines a new method of robust nonlinear regression with a new method of outlier identification, identifies outliers from nonlinear curve fits with reasonable power and few false positives. PMID:16526949
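
    As a loose illustration of the outlier-flagging step described above (not the authors' ROUT code), the sketch below fits a curve with a robust loss, converts scaled residuals to approximate p-values, and applies the Benjamini-Hochberg false discovery rate procedure. The exponential model, the soft-L1 loss standing in for the Lorentzian assumption, and the 1% FDR level are assumptions made for the example.

    ```python
    # Hedged sketch: robust fit + FDR-based outlier flagging (illustrative, not the ROUT code).
    import numpy as np
    from scipy.optimize import least_squares
    from scipy import stats

    def flag_outliers_fdr(x, y, model, p0, fdr_q=0.01):
        # Robust fit; soft-L1 loss stands in for the Lorentzian assumption in the paper.
        fit = least_squares(lambda p: model(x, p) - y, p0, loss="soft_l1")
        resid = y - model(x, fit.x)
        # Robust (MAD-based) scale estimate of the residual scatter.
        scale = 1.4826 * np.median(np.abs(resid - np.median(resid)))
        dof = max(len(y) - len(p0), 1)
        # Two-sided p-values for each scaled residual under an approximate t distribution.
        pvals = 2 * stats.t.sf(np.abs(resid / scale), df=dof)
        # Benjamini-Hochberg: keep the largest k such that p_(k) <= q * k / n.
        order = np.argsort(pvals)
        thresh = fdr_q * np.arange(1, len(y) + 1) / len(y)
        passed = pvals[order] <= thresh
        k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
        outliers = np.zeros(len(y), dtype=bool)
        outliers[order[:k]] = True
        return fit.x, outliers

    # Example with an exponential-decay model and one injected outlier.
    model = lambda x, p: p[0] * np.exp(-p[1] * x)
    x = np.linspace(0, 10, 50)
    rng = np.random.default_rng(0)
    y = model(x, [5.0, 0.4]) + rng.normal(0, 0.1, x.size)
    y[20] += 3.0
    params, mask = flag_outliers_fdr(x, y, model, p0=[1.0, 1.0])
    print(params, np.nonzero(mask)[0])
    ```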

  13. Diagnostic and prognostic value of additional SPECT/CT in sentinel lymph node mapping in breast cancer patients.

    PubMed

    Stanzel, Susanne; Pernthaler, Birgit; Schwarz, Thomas; Bjelic-Radisic, Vesna; Kerschbaumer, Stefan; Aigner, Reingard M

    2018-06-01

    The aim of the study was to demonstrate the diagnostic and prognostic value of SPECT/CT in sentinel lymph node mapping (SLNM) in patients with invasive breast cancer. 114 patients with invasive breast cancer and clinically negative lymph nodes were included in this retrospective study, having been referred for SLNM with 99m Tc-nanocolloid. Planar image acquisition was accomplished in a one-day or two-day protocol depending on the schedule of the surgical procedure. Low-dose SPECT/CT was performed after the planar images. The sentinel lymph node biopsy (SLNB) was considered false negative if a primary recurrence developed within 12 months after SLNB in the axilla from which a tumor-free SLN had been removed. Between December 2009 and December 2011, 114 patients (pts.) underwent SLNM with additional SPECT/CT. Planar imaging identified 139 SLNs in 109 pts., of which 42 nodes were tumor-positive (n = 41 pts.). SPECT/CT identified 151 additional SLNs in 81 pts., of which 19 were tumor-positive and led to a therapy change (axillary lymph node dissection) in 11 pts. (9.6%). Of the overall 61 tumor-positive SLNs (n = 52 pts.), SPECT/CT detected all, whereas planar imaging detected only 42 of 61 (P < 0.0001). No patient had lymph node metastasis within 12 months after SLNB in the axilla from which a tumor-free SLN had been removed, resulting in a false-negative rate of 0%. The local relapse rate was 1.8%, leading to a 4-year disease-free survival rate of 90%. Among patients with breast cancer, the use of SPECT/CT-aided SLNM was associated with a higher SLN detection rate, owing to better anatomical localization and the identification of SLNs not visible on planar imaging. This led to therapeutic consequences and an excellent false-negative rate and 4-year disease-free survival rate. Schattauer GmbH.

  14. An effective hair detection algorithm for dermoscopic melanoma images of skin lesions

    NASA Astrophysics Data System (ADS)

    Chakraborti, Damayanti; Kaur, Ravneet; Umbaugh, Scott; LeAnder, Robert

    2016-09-01

    Dermoscopic images are obtained using the method of skin surface microscopy. Pigmented skin lesions are evaluated in terms of texture features such as color and structure. Artifacts, such as hairs, bubbles, black frames, ruler marks, etc., create obstacles that prevent accurate detection of skin lesions by both clinicians and computer-aided diagnosis. In this article, we propose a new algorithm for the automated detection of hairs, using an adaptive Canny edge-detection method, followed by morphological filtering and an arithmetic addition operation. The algorithm was applied to 50 dermoscopic melanoma images. In order to ascertain this method's relative detection accuracy, it was compared to the Razmjooy hair-detection method [1], using segmentation error (SE), true detection rate (TDR) and false positioning rate (FPR). The new method produced 6.57% SE, 96.28% TDR and 3.47% FPR, compared to 15.751% SE, 86.29% TDR and 11.74% FPR produced by the Razmjooy method [1]. Because of the 7.27-9.99% improvement in those parameters, we conclude that the new algorithm produces much better results for detecting thick, thin, dark and light hairs. The new method proposed here also shows an appreciable difference in the rate of detecting bubbles.
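
    A rough OpenCV sketch of the kind of pipeline described above is given below; the Canny thresholds, kernel size, and the final inpainting step are illustrative choices, not the parameters or post-processing used in the paper.

    ```python
    # Hedged sketch of a Canny + morphology hair mask (illustrative parameters only).
    import cv2

    def hair_mask(image_bgr, canny_lo=40, canny_hi=120, kernel_size=5):
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        # Edge map picks up thin elongated structures such as hairs.
        edges = cv2.Canny(gray, canny_lo, canny_hi)
        # Morphological closing joins broken hair segments into continuous strands.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (kernel_size, kernel_size))
        closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)
        # Dilation thickens the strands so they fully cover the hair pixels.
        return cv2.dilate(closed, kernel, iterations=1)

    def remove_hairs(image_bgr):
        mask = hair_mask(image_bgr)
        # The abstract describes an arithmetic addition step; inpainting is used here
        # only as an illustrative way to produce a hair-free image from the mask.
        return cv2.inpaint(image_bgr, mask, inpaintRadius=3, flags=cv2.INPAINT_TELEA)
    ```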

  15. Alarm characterization for a continuous glucose monitor that replaces traditional blood glucose monitoring.

    PubMed

    McGarraugh, Geoffrey

    2010-01-01

    Continuous glucose monitoring (CGM) devices available in the United States are approved for use as adjuncts to self-monitoring of blood glucose (SMBG); all CGM alarms require SMBG confirmation before treatment. In this report, an analysis method is proposed to determine the CGM threshold alarm accuracy required to eliminate SMBG confirmation. The proposed method builds on the Clinical and Laboratory Standards Institute (CLSI) guideline for evaluating CGM threshold alarms using data from an in-clinic study of subjects with type 1 diabetes. The CLSI method proposes a maximum time limit of ±30 minutes for the detection of hypo- and hyperglycemic events but does not include limits for glucose measurement accuracy. The International Standards Organization (ISO) standard for SMBG glucose measurement accuracy (ISO 15197) is ±15 mg/dl for glucose <75 mg/dl and ±20% for glucose ≥75 mg/dl. This standard was combined with the CLSI method to more completely characterize the accuracy of CGM alarms. Incorporating the ISO 15197 accuracy margins, FreeStyle Navigator CGM system alarms detected 70 mg/dl hypoglycemia within 30 minutes at a rate of 70.3%, with a false alarm rate of 11.4%. The device detected high glucose in the range of 140-300 mg/dl within 30 minutes at an average rate of 99.2%, with a false alarm rate of 2.1%. Self-monitoring of blood glucose confirmation is necessary for detecting and treating hypoglycemia with the FreeStyle Navigator CGM system, but at high glucose levels, SMBG confirmation adds little incremental value to CGM alarms. 2010 Diabetes Technology Society.

  16. Automated detection scheme of architectural distortion in mammograms using adaptive Gabor filter

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Ruriha; Teramoto, Atsushi; Matsubara, Tomoko; Fujita, Hiroshi

    2013-03-01

    Breast cancer is a serious health concern for all women. Computer-aided detection for mammography has been used for detecting masses and micro-calcifications. However, there are challenges regarding the sensitivity of automated detection of architectural distortion. In this study, we propose a novel automated method for detecting architectural distortion. Our method consists of analysis of the mammary gland structure, detection of the distorted region, and reduction of false positive results. We developed an adaptive Gabor filter for analyzing the mammary gland structure that decides the filter parameters depending on the thickness of the gland structure. As for post-processing, healthy mammary glands that run from the nipple to the chest wall are eliminated by angle analysis. Moreover, background mammary glands are removed based on the intensity output image obtained from the adaptive Gabor filter. The distorted region of the mammary gland is then detected as an initial candidate using a concentration index followed by binarization and labeling. False positives in the initial candidates are eliminated using 23 types of characteristic features and a support vector machine. In the experiments, we compared the automated detection results with interpretations by a radiologist using 50 cases (200 images) from the Digital Database for Screening Mammography (DDSM). As a result, the true positive rate was 82.72%, and the number of false positives per image was 1.39. These results indicate that the proposed method may be useful for detecting architectural distortion in mammograms.
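
    The gland-structure analysis relies on an orientation-selective Gabor filter bank; the following sketch shows a generic, fixed-parameter version of such a bank. The paper's adaptive parameter selection based on gland thickness is not reproduced here, and all parameter values are placeholders.

    ```python
    # Hedged sketch: orientation-selective Gabor filter bank for analysing oriented
    # (gland-like) structures; parameters are illustrative, not the paper's adaptive values.
    import cv2
    import numpy as np

    def gabor_bank_response(gray, n_orientations=12, ksize=31, sigma=4.0, lambd=10.0):
        gray = gray.astype(np.float32)
        responses = []
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            kern = cv2.getGaborKernel((ksize, ksize), sigma, theta, lambd, gamma=0.5, psi=0)
            responses.append(cv2.filter2D(gray, cv2.CV_32F, kern))
        stack = np.stack(responses)          # shape: (orientations, H, W)
        magnitude = stack.max(axis=0)        # strongest oriented response per pixel
        orientation = stack.argmax(axis=0)   # index of the dominant orientation per pixel
        return magnitude, orientation
    ```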

  17. Detecting Earthquakes over a Seismic Network using Single-Station Similarity Measures

    NASA Astrophysics Data System (ADS)

    Bergen, Karianne J.; Beroza, Gregory C.

    2018-03-01

    New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network; event-pair extraction, pairwise pseudo-association, and event resolution complete a post-processing pipeline that combines single-station similarity measures (e.g. FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected move-out. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is both sensitive and maintains a low false detection rate. As a test case, we apply our approach to two weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalog (including 95% of the catalog events), and less than 1% of these candidate events are false detections.

  18. Peak tree: a new tool for multiscale hierarchical representation and peak detection of mass spectrometry data.

    PubMed

    Zhang, Peng; Li, Houqiang; Wang, Honghui; Wong, Stephen T C; Zhou, Xiaobo

    2011-01-01

    Peak detection is one of the most important steps in mass spectrometry (MS) analysis. However, the detection result is greatly affected by severe spectrum variations. Unfortunately, most current peak detection methods are neither flexible enough to revise false detection results nor robust enough to resist spectrum variations. To improve flexibility, we introduce the peak tree to represent the peak information in MS spectra. Each tree node is a peak judgment over a range of scales, and each tree decomposition, as a set of nodes, is a candidate peak detection result. To improve robustness, we combine peak detection and common peak alignment into a closed-loop framework, which finds the optimal decomposition via both peak intensity and common peak information. The common peak information is derived, and iteratively refined, from the density clustering of the latest peak detection result. Finally, we present an improved ant colony optimization biomarker selection method to build a whole MS analysis system. Experiments show that our peak detection method can better resist spectrum variations and provide higher sensitivity and lower false detection rates than conventional methods. The benefits of our peak-tree-based system for MS disease analysis are also demonstrated on real SELDI data.

  19. Adaptive Impact-Driven Detection of Silent Data Corruption for HPC Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di, Sheng; Cappello, Franck

    For exascale HPC applications, silent data corruption (SDC) is one of the most dangerous problems because there is no indication that there are errors during the execution. We propose an adaptive impact-driven method that can detect SDCs dynamically. The key contributions are threefold. (1) We carefully characterize 18 real-world HPC applications and discuss the runtime data features, as well as the impact of the SDCs on their execution results. (2) We propose an impact-driven detection model that does not blindly improve the prediction accuracy, but instead detects only influential SDCs to guarantee user-acceptable execution results. (3) Our solution can adapt to dynamic prediction errors based on local runtime data and can automatically tune detection ranges for guaranteeing low false alarms. Experiments show that our detector can detect 80-99.99% of SDCs with a false alarm rate less than 1% of iterations for most cases. The memory cost and detection overhead are reduced to 15% and 6.3%, respectively, for a large majority of applications.
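
    A stripped-down illustration of the general prediction-based idea, not the authors' adaptive impact-driven detector: each new value is compared against a linear extrapolation of the previous samples, and the acceptable deviation is scaled to the recent local prediction errors. The tolerance factor and history length are arbitrary.

    ```python
    # Hedged sketch: flag a value as a suspected silent data corruption when it falls
    # outside an adaptive range around a linear extrapolation of recent samples.
    from collections import deque

    class SdcDetector:
        def __init__(self, tolerance=4.0, history=16):
            self.tolerance = tolerance
            self.recent_errors = deque(maxlen=history)  # recent prediction errors
            self.prev = None
            self.prev2 = None

        def check(self, value):
            suspect = False
            if self.prev is not None and self.prev2 is not None:
                predicted = 2.0 * self.prev - self.prev2          # linear extrapolation
                error = abs(value - predicted)
                if self.recent_errors:
                    typical = max(sum(self.recent_errors) / len(self.recent_errors), 1e-12)
                    # Only large deviations relative to recent behaviour are flagged,
                    # which keeps the false alarm rate low.
                    suspect = error > self.tolerance * typical
                if not suspect:
                    self.recent_errors.append(error)
            self.prev2, self.prev = self.prev, value
            return suspect
    ```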

  20. CNV detection method optimized for high-resolution arrayCGH by normality test.

    PubMed

    Ahn, Jaegyoon; Yoon, Youngmi; Park, Chihyun; Park, Sanghyun

    2012-04-01

    High-resolution arrayCGH platform makes it possible to detect small gains and losses which previously could not be measured. However, current CNV detection tools fitted to early low-resolution data are not applicable to larger high-resolution data. When CNV detection tools are applied to high-resolution data, they suffer from high false-positives, which increases validation cost. Existing CNV detection tools also require optimal parameter values. In most cases, obtaining these values is a difficult task. This study developed a CNV detection algorithm that is optimized for high-resolution arrayCGH data. This tool operates up to 1500 times faster than existing tools on a high-resolution arrayCGH of whole human chromosomes which has 42 million probes whose average length is 50 bases, while preserving false positive/negative rates. The algorithm also uses a normality test, thereby removing the need for optimal parameters. To our knowledge, this is the first formulation for CNV detecting problems that results in a near-linear empirical overall complexity for real high-resolution data. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Vehicle tracking using fuzzy-based vehicle detection window with adaptive parameters

    NASA Astrophysics Data System (ADS)

    Chitsobhuk, Orachat; Kasemsiri, Watjanapong; Glomglome, Sorayut; Lapamonpinyo, Pipatphon

    2018-04-01

    In this paper, a fuzzy-based vehicle tracking system is proposed. The proposed system consists of two main processes: vehicle detection and vehicle tracking. In the first process, the Gradient-based Adaptive Threshold Estimation (GATE) algorithm is adopted to provide a suitable threshold value for Sobel edge detection. The estimated threshold can adapt to the changes in illumination conditions throughout the day. This leads to greater vehicle detection performance compared to a fixed user-defined threshold. In the second process, this paper proposes a novel vehicle tracking algorithm, namely Fuzzy-based Vehicle Analysis (FBA), in order to reduce the false estimation in vehicle tracking caused by the uneven edges of large vehicles and by vehicles changing lanes. The proposed FBA algorithm employs the average edge density and the Horizontal Moving Edge Detection (HMED) algorithm to alleviate those problems by adopting fuzzy rule-based algorithms to rectify the vehicle tracking. The experimental results demonstrate that the proposed system provides high vehicle detection accuracy of about 98.22%. In addition, it also offers a low false detection rate of about 3.92%.
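
    A minimal sketch of the gradient-based adaptive thresholding idea follows; because the GATE algorithm itself is not specified in the abstract, the threshold here is simply derived from the mean and standard deviation of the Sobel gradient magnitude, which is an assumption made for illustration.

    ```python
    # Hedged sketch: Sobel edge map with a threshold adapted to the gradient statistics
    # of the current frame, standing in for the GATE step described in the abstract.
    import cv2
    import numpy as np

    def adaptive_sobel_edges(gray, k=1.5):
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
        magnitude = cv2.magnitude(gx, gy)
        # The threshold follows the illumination of the scene: mean plus k standard deviations.
        threshold = magnitude.mean() + k * magnitude.std()
        return (magnitude > threshold).astype(np.uint8) * 255
    ```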

  2. Investigating strength and frequency effects in recognition memory using type-2 signal detection theory.

    PubMed

    Higham, Philip A; Perfect, Timothy J; Bruno, Davide

    2009-01-01

    Criterion- versus distribution-shift accounts of frequency and strength effects in recognition memory were investigated with Type-2 signal detection receiver operating characteristic (ROC) analysis, which provides a measure of metacognitive monitoring. Experiment 1 demonstrated a frequency-based mirror effect, with a higher hit rate and lower false alarm rate, for low frequency words compared with high frequency words. In Experiment 2, the authors manipulated item strength with repetition, which showed an increased hit rate but no effect on the false alarm rate. Whereas Type-1 indices were ambiguous as to whether these effects were based on a criterion- or distribution-shift model, the two models predict opposite effects on Type-2 distractor monitoring under some assumptions. Hence, Type-2 ROC analysis discriminated between potential models of recognition that could not be discriminated using Type-1 indices alone. In Experiment 3, the authors manipulated Type-1 response bias by varying the number of old versus new response categories to confirm the assumptions made in Experiments 1 and 2. The authors conclude that Type-2 analyses are a useful tool for investigating recognition memory when used in conjunction with more traditional Type-1 analyses.

  3. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks.

    PubMed

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-06-13

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens' quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and 26% higher in a scenario with a false positive rate of 15%.
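
    A minimal scikit-learn sketch of the one-class SVM approach that the study found most effective is shown below; the synthetic sensor features, kernel, and nu value are placeholders rather than the study's actual settings.

    ```python
    # Hedged sketch: one-class SVM anomaly detection over WSN readings
    # (features, kernel and nu are illustrative choices, not the study's settings).
    import numpy as np
    from sklearn.svm import OneClassSVM
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    normal_readings = rng.normal(loc=[20.0, 55.0], scale=[2.0, 5.0], size=(500, 2))  # e.g. temperature, humidity
    attack_readings = rng.normal(loc=[35.0, 10.0], scale=[1.0, 2.0], size=(20, 2))   # injected anomalies

    scaler = StandardScaler().fit(normal_readings)
    model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
    model.fit(scaler.transform(normal_readings))

    pred = model.predict(scaler.transform(np.vstack([normal_readings[:50], attack_readings])))
    # +1 = normal, -1 = anomaly; the fraction of -1 among the normal rows
    # approximates the false positive rate, the fraction among attacks the detection rate.
    print((pred[:50] == -1).mean(), (pred[50:] == -1).mean())
    ```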

  4. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks

    PubMed Central

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-01-01

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens’ quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and 26% higher in a scenario with a false positive rate of 15%. PMID:27304957

  5. On Adaptive Cell-Averaging CFAR (Constant False-Alarm Rate) Radar Signal Detection

    DTIC Science & Technology

    1987-10-01

    One approach to adaptive detection in a nonstationary noise and clutter background is to compare the processed target signal to an adaptive threshold.
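
    Since only a fragment of this report's abstract is recoverable, the generic cell-averaging CFAR sketch below is included purely to illustrate the technique named in the title: the threshold for each cell under test is formed from the average power of neighbouring reference cells, which keeps the false-alarm rate approximately constant as the noise and clutter level varies. The training/guard cell counts and the design false-alarm probability are arbitrary illustrative values.

    ```python
    # Hedged sketch: one-dimensional cell-averaging CFAR detector (generic textbook form,
    # not the specific adaptive scheme analysed in the report).
    import numpy as np

    def ca_cfar(power, num_train=16, num_guard=2, pfa=1e-4):
        n = len(power)
        detections = np.zeros(n, dtype=bool)
        # Scaling factor giving the desired false-alarm probability with N training cells.
        alpha = num_train * (pfa ** (-1.0 / num_train) - 1.0)
        half = num_train // 2 + num_guard
        for i in range(half, n - half):
            leading = power[i - half : i - num_guard]            # training cells before the guard band
            trailing = power[i + num_guard + 1 : i + half + 1]   # training cells after the guard band
            noise_estimate = np.concatenate([leading, trailing]).mean()
            detections[i] = power[i] > alpha * noise_estimate
        return detections
    ```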

  6. A novel onset detection technique for brain-computer interfaces using sound-production related cognitive tasks in simulated-online system

    NASA Astrophysics Data System (ADS)

    Song, YoungJae; Sepulveda, Francisco

    2017-02-01

    Objective. Self-paced EEG-based BCIs (SP-BCIs) have traditionally been avoided due to two sources of uncertainty: (1) precisely when an intentional command is sent by the brain, i.e., the command onset detection problem, and (2) how different the intentional command is when compared to non-specific (or idle) states. Performance evaluation is also a problem and there are no suitable standard metrics available. In this paper we attempted to tackle these issues. Approach. Self-paced covert sound-production cognitive tasks (i.e., high pitch and siren-like sounds) were used to distinguish between intentional commands (IC) and idle states. The IC states were chosen for their ease of execution and negligible overlap with common cognitive states. Band power and a digital wavelet transform were used for feature extraction, and the Davies-Bouldin index was used for feature selection. Classification was performed using linear discriminant analysis. Main results. Performance was evaluated under offline and simulated-online conditions. For the latter, a performance score called true-false-positive (TFP) rate, ranging from 0 (poor) to 100 (perfect), was created to take into account both classification performance and onset timing errors. Averaging the results from the best-performing IC task for all seven participants, a 77.7% true-positive (TP) rate was achieved in offline testing. For simulated-online analysis the best IC average TFP score was 76.67% (87.61% TP rate, 4.05% false-positive rate). Significance. Results were promising when compared to previous IC onset detection studies using motor imagery, in which best TP rates were reported as 72.0% and 79.7%, and which, crucially, did not take timing errors into account. Moreover, based on our literature review, there is no previous covert sound-production onset detection system for SP-BCIs. Results showed that the proposed onset detection technique and TFP performance metric have good potential for use in SP-BCIs.

  7. Neural Network Target Identification System for False Alarm Reduction

    NASA Technical Reports Server (NTRS)

    Ye, David; Edens, Weston; Lu, Thomas T.; Chao, Tien-Hsin

    2009-01-01

    A multi-stage automated target recognition (ATR) system has been designed to perform computer vision tasks with adequate proficiency in mimicking human vision. The system is able to detect, identify, and track targets of interest. Potential regions of interest (ROIs) are first identified by the detection stage using an Optimum Trade-off Maximum Average Correlation Height (OT-MACH) filter combined with a wavelet transform. False positives are then eliminated by the verification stage using feature extraction methods in conjunction with neural networks. Feature extraction transforms the ROIs using filtering and binning algorithms to create feature vectors. A feed-forward back-propagation neural network (NN) is then trained to classify each feature vector and remove false positives. This paper discusses tests of the system performance and the parameter optimization process that adapts the system to various targets and datasets. The test results show that the system was successful in substantially reducing the false positive rate when tested on a sonar image dataset.

  8. Multiratio fusion change detection with adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Hytla, Patrick C.; Balster, Eric J.; Vasquez, Juan R.; Neuroth, Robert M.

    2017-04-01

    A ratio-based change detection method known as multiratio fusion (MRF) is proposed and tested. The MRF framework builds on other change detection components proposed in this work: dual ratio (DR) and multiratio (MR). The DR method involves two ratios coupled with adaptive thresholds to maximize detected changes and minimize false alarms. The use of two ratios is shown to outperform the single ratio case when the means of the image pairs are not equal. MR change detection builds on the DR method by including negative imagery to produce four total ratios with adaptive thresholds. Inclusion of negative imagery is shown to improve detection sensitivity and to boost detection performance in certain target and background cases. MRF further expands this concept by fusing together the ratio outputs using a routine in which detections must be verified by two or more ratios to be classified as a true changed pixel. The proposed method is tested with synthetically generated test imagery and real datasets with results compared to other methods found in the literature. DR is shown to significantly outperform the standard single ratio method. MRF produces excellent change detection results that exhibit up to a 22% performance improvement over other methods from the literature at low false-alarm rates.
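
    A compressed sketch of the ratio-and-fusion idea follows (not the exact DR/MR/MRF pipeline): each ratio image gets a data-driven threshold, and a pixel is declared changed only when enough ratios agree. The log-ratio statistics and the vote count are illustrative simplifications.

    ```python
    # Hedged sketch: dual/multi-ratio change detection with adaptive thresholds and a
    # simple agreement-based fusion, loosely following the DR/MR/MRF description.
    import numpy as np

    def ratio_change_map(img_a, img_b, k=3.0, min_votes=2):
        eps = 1e-6
        a = img_a.astype(np.float64) + eps
        b = img_b.astype(np.float64) + eps
        ratios = [a / b, b / a]                                    # the two "dual" ratios
        ratios += [(a.max() - a + eps) / (b.max() - b + eps),      # ratios of negative imagery,
                   (b.max() - b + eps) / (a.max() - a + eps)]      # as in the multiratio extension
        votes = np.zeros(a.shape, dtype=int)
        for r in ratios:
            log_r = np.log(r)
            # Adaptive threshold from the statistics of each log-ratio image.
            threshold = log_r.mean() + k * log_r.std()
            votes += (log_r > threshold).astype(int)
        # Fusion: a pixel counts as changed only if enough ratios agree.
        return votes >= min_votes
    ```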

  9. Detecting local diversity-dependence in diversification.

    PubMed

    Xu, Liang; Etienne, Rampal S

    2018-04-06

    Whether there are ecological limits to species diversification is a hotly debated topic. Molecular phylogenies show slowdowns in lineage accumulation, suggesting that speciation rates decline with increasing diversity. A maximum-likelihood (ML) method to detect diversity-dependent (DD) diversification from phylogenetic branching times exists, but it assumes that diversity-dependence is a global phenomenon and therefore ignores that the underlying species interactions are mostly local, and not all species in the phylogeny co-occur locally. Here, we explore whether this ML method based on the nonspatial diversity-dependence model can detect local diversity-dependence, by applying it to phylogenies, simulated with a spatial stochastic model of local DD speciation, extinction, and dispersal between two local communities. We find that type I errors (falsely detecting diversity-dependence) are low, and the power to detect diversity-dependence is high when dispersal rates are not too low. Interestingly, when dispersal is high the power to detect diversity-dependence is even higher than in the nonspatial model. Moreover, estimates of intrinsic speciation rate, extinction rate, and ecological limit strongly depend on dispersal rate. We conclude that the nonspatial DD approach can be used to detect diversity-dependence in clades of species that live in not too disconnected areas, but parameter estimates must be interpreted cautiously. © 2018 The Author(s). Evolution published by Wiley Periodicals, Inc. on behalf of The Society for the Study of Evolution.

  10. Detection of epileptic seizure in EEG signals using linear least squares preprocessing.

    PubMed

    Roshan Zamir, Z

    2016-09-01

    An epileptic seizure is a transient event of abnormal excessive neuronal discharge in the brain. This unwanted event can be obstructed by detection of electrical changes in the brain that happen before the seizure takes place. The automatic detection of seizures is necessary since the visual screening of EEG recordings is a time-consuming task and requires experts to improve the diagnosis. Much of the prior research on seizure detection has been based on artificial neural networks, genetic programming, and wavelet transforms. Although the highest achieved accuracy for classification is 100%, there are drawbacks, such as the existence of unbalanced datasets and the lack of investigation of performance consistency. To address these, four linear least squares-based preprocessing models are proposed to extract key features of an EEG signal in order to detect seizures. The first two models are newly developed. The original signal (EEG) is approximated by a sinusoidal curve. Its amplitude is formed by a polynomial function and compared with the predeveloped spline function. Different statistical measures, namely classification accuracy, true positive and negative rates, false positive and negative rates and precision, are utilised to assess the performance of the proposed models. These metrics are derived from confusion matrices obtained from classifiers. Different classifiers are used over the original dataset and the set of extracted features. The proposed models significantly reduce the dimension of the classification problem and the computational time while the classification accuracy is improved in most cases. The first and third models are promising feature extraction methods with a classification accuracy of 100%. Logistic, LazyIB1, LazyIB5, and J48 are the best classifiers. Their true positive and negative rates are 1 while false positive and negative rates are 0 and the corresponding precision values are 1. Numerical results suggest that these models are robust and efficient for detecting epileptic seizures. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
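
    The performance measures listed above follow directly from the confusion matrix; the helper below is a generic sketch of those computations and is not tied to the paper's classifiers or datasets.

    ```python
    # Hedged sketch: confusion-matrix metrics named in the abstract,
    # computed for a binary seizure / non-seizure labelling.
    import numpy as np

    def seizure_metrics(y_true, y_pred):
        y_true = np.asarray(y_true, dtype=bool)
        y_pred = np.asarray(y_pred, dtype=bool)
        tp = np.sum(y_true & y_pred)
        tn = np.sum(~y_true & ~y_pred)
        fp = np.sum(~y_true & y_pred)
        fn = np.sum(y_true & ~y_pred)
        return {
            "accuracy": (tp + tn) / len(y_true),
            "true_positive_rate": tp / max(tp + fn, 1),   # sensitivity
            "true_negative_rate": tn / max(tn + fp, 1),   # specificity
            "false_positive_rate": fp / max(fp + tn, 1),
            "false_negative_rate": fn / max(fn + tp, 1),
            "precision": tp / max(tp + fp, 1),
        }

    print(seizure_metrics([1, 1, 0, 0, 1], [1, 1, 0, 1, 1]))
    ```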

  11. Facial expression system on video using widrow hoff

    NASA Astrophysics Data System (ADS)

    Jannah, M.; Zarlis, M.; Mawengkang, H.

    2018-03-01

    Facial expression recognition is an interesting research area. This research relates human feelings to computer applications, such as human-computer interaction, data compression, facial animation and face detection from video. The purpose of this research is to create a facial expression system that captures images from a video camera. The system in this research uses the Widrow-Hoff learning method for training and testing images with an Adaptive Linear Neuron (ADALINE) approach. The system performance is evaluated by two parameters, detection rate and false positive rate. The system accuracy depends on good technique and on the face positions that are trained and tested.
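
    The Widrow-Hoff (least-mean-squares) rule used to train an ADALINE unit is compact enough to show directly; the sketch below is a generic version and does not reproduce the facial-expression system's feature extraction or video pipeline.

    ```python
    # Hedged sketch: ADALINE trained with the Widrow-Hoff (LMS) delta rule.
    import numpy as np

    class Adaline:
        def __init__(self, n_features, learning_rate=0.01):
            self.w = np.zeros(n_features)
            self.b = 0.0
            self.lr = learning_rate

        def train(self, X, targets, epochs=50):
            for _ in range(epochs):
                for x, t in zip(X, targets):
                    output = np.dot(self.w, x) + self.b          # linear activation
                    error = t - output
                    self.w += self.lr * error * x                # Widrow-Hoff update
                    self.b += self.lr * error

        def predict(self, X):
            # Threshold the linear output for a binary decision (e.g. expression present or not).
            return np.where(X @ self.w + self.b >= 0.0, 1, -1)
    ```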

  12. E/N effects on K0 values revealed by high precision measurements under low field conditions

    NASA Astrophysics Data System (ADS)

    Hauck, Brian C.; Siems, William F.; Harden, Charles S.; McHugh, Vincent M.; Hill, Herbert H.

    2016-07-01

    Ion mobility spectrometry (IMS) is used to detect chemical warfare agents, explosives, and narcotics. While IMS has a low rate of false positives, their occurrence causes the loss of time and money as the alarm is verified. Because numerous variables affect the reduced mobility (K0) of an ion, wide detection windows are required in order to ensure a low false negative response rate. Wide detection windows, however, reduce response selectivity, and interferents with similar K0 values may be mistaken for targeted compounds and trigger a false positive alarm. Detection windows could be narrowed if reference K0 values were accurately known for specific instrumental conditions. Unfortunately, there is a lack of confidence in the literature values due to discrepancies in the reported K0 values and their lack of reported error. This creates the need for the accurate control and measurement of each variable affecting ion mobility, as well as for a central, accurate IMS database for reference and calibration. A new ion mobility spectrometer has been built that reduces the error of measurements affecting K0 by an order of magnitude, to less than ±0.2%. Precise measurements of ±0.002 cm² V⁻¹ s⁻¹ or better have been produced and, as a result, an unexpected relationship between K0 and the electric field to number density ratio (E/N) has been discovered in which the K0 values of ions decreased as a function of E/N along a second-degree polynomial trend line towards an apparent asymptote at approximately 4 Td.

  13. Assessment of target detection limits in hyperspectral data

    NASA Astrophysics Data System (ADS)

    Gross, W.; Boehler, J.; Schilling, H.; Middelmann, W.; Weyermann, J.; Wellig, P.; Oechslin, R.; Kneubuehler, M.

    2015-10-01

    Hyperspectral remote sensing data can be used for civil and military applications to detect and classify target objects that cannot be reliably separated using broadband sensors. The comparatively low spatial resolution is compensated by the fact that small targets, even below image resolution, can still be classified. The goal of this paper is to determine the target size to spatial resolution ratio for successful classification of different target and background materials. Airborne hyperspectral data is used to simulate data with known mixture ratios and to estimate the detection threshold for given false alarm rates. The data was collected in July 2014 over Greding, Germany, using airborne aisaEAGLE and aisaHAWK hyperspectral sensors. On the ground, various target materials were placed on natural background. The targets were four square molton patches with an edge length of 7 meters in the colors black, white, grey and green. Also, two different types of polyethylene (camouflage nets) with an edge length of approximately 5.5 meters were deployed. Synthetic data is generated from the original data using spectral mixtures. Target signatures are linearly combined with different background materials in specific ratios. The simulated mixtures are appended to the original data and the target areas are removed for evaluation. Commonly used classification algorithms, e.g. Matched Filtering and the Adaptive Cosine Estimator, are used to determine the detection limit. Fixed false alarm rates are employed to find and analyze certain regions where false alarms usually occur first. A combination of 18 targets and 12 backgrounds is analyzed for three VNIR and two SWIR data sets of the same area.
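
    The two detectors named in the abstract have standard closed-form scores; the numpy sketch below computes both, with background statistics estimated from the scene itself, which is an assumption of this illustration rather than a detail taken from the paper.

    ```python
    # Hedged sketch: spectral Matched Filter (MF) and Adaptive Cosine Estimator (ACE) scores
    # for a pixel matrix X (pixels x bands) and a target signature t.
    import numpy as np

    def mf_and_ace(X, t):
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularised background covariance
        cov_inv = np.linalg.inv(cov)
        d = X - mu          # centred pixels
        s = t - mu          # centred target signature
        st_ci = s @ cov_inv
        mf = d @ st_ci / (st_ci @ s)                               # matched-filter score per pixel
        ace = (d @ st_ci) ** 2 / ((st_ci @ s) * np.einsum('ij,jk,ik->i', d, cov_inv, d))
        return mf, ace

    # Thresholding either score at a value chosen for a fixed false-alarm rate yields the detection map.
    ```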

  14. Improving and Assessing Planet Sensitivity of the GPI Exoplanet Survey with a Forward Model Matched Filter

    NASA Astrophysics Data System (ADS)

    Ruffio, Jean-Baptiste; Macintosh, Bruce; Wang, Jason J.; Pueyo, Laurent; Nielsen, Eric L.; De Rosa, Robert J.; Czekala, Ian; Marley, Mark S.; Arriaga, Pauline; Bailey, Vanessa P.; Barman, Travis; Bulger, Joanna; Chilcote, Jeffrey; Cotten, Tara; Doyon, Rene; Duchêne, Gaspard; Fitzgerald, Michael P.; Follette, Katherine B.; Gerard, Benjamin L.; Goodsell, Stephen J.; Graham, James R.; Greenbaum, Alexandra Z.; Hibon, Pascale; Hung, Li-Wei; Ingraham, Patrick; Kalas, Paul; Konopacky, Quinn; Larkin, James E.; Maire, Jérôme; Marchis, Franck; Marois, Christian; Metchev, Stanimir; Millar-Blanchaer, Maxwell A.; Morzinski, Katie M.; Oppenheimer, Rebecca; Palmer, David; Patience, Jennifer; Perrin, Marshall; Poyneer, Lisa; Rajan, Abhijith; Rameau, Julien; Rantakyrö, Fredrik T.; Savransky, Dmitry; Schneider, Adam C.; Sivaramakrishnan, Anand; Song, Inseok; Soummer, Remi; Thomas, Sandrine; Wallace, J. Kent; Ward-Duong, Kimberly; Wiktorowicz, Sloane; Wolff, Schuyler

    2017-06-01

    We present a new matched-filter algorithm for direct detection of point sources in the immediate vicinity of bright stars. The stellar point-spread function (PSF) is first subtracted using a Karhunen-Loéve image processing (KLIP) algorithm with angular and spectral differential imaging (ADI and SDI). The KLIP-induced distortion of the astrophysical signal is included in the matched-filter template by computing a forward model of the PSF at every position in the image. To optimize the performance of the algorithm, we conduct extensive planet injection and recovery tests and tune the exoplanet spectra template and KLIP reduction aggressiveness to maximize the signal-to-noise ratio (S/N) of the recovered planets. We show that only two spectral templates are necessary to recover any young Jovian exoplanets with minimal S/N loss. We also developed a complete pipeline for the automated detection of point-source candidates, the calculation of receiver operating characteristics (ROC), contrast curves based on false positives, and completeness contours. We process in a uniform manner more than 330 data sets from the Gemini Planet Imager Exoplanet Survey and assess GPI typical sensitivity as a function of the star and the hypothetical companion spectral type. This work allows for the first time a comparison of different detection algorithms at a survey scale accounting for both planet completeness and false-positive rate. We show that the new forward model matched filter allows the detection of 50% fainter objects than a conventional cross-correlation technique with a Gaussian PSF template for the same false-positive rate.

  15. Automatic Near-Real-Time Detection of CMEs in Mauna Loa K-Cor Coronagraph Images

    NASA Astrophysics Data System (ADS)

    Thompson, W. T.; St. Cyr, O. C.; Burkepile, J. T.; Posner, A.

    2017-10-01

    A simple algorithm has been developed to detect the onset of coronal mass ejections (CMEs), together with speed estimates, in near-real time using linearly polarized white-light solar coronal images from the Mauna Loa Solar Observatory K-Cor telescope. Ground observations in the low corona can warn of CMEs well before they appear in space coronagraphs. The algorithm used is a variation on the Solar Eruptive Event Detection System developed at George Mason University. It was tested against K-Cor data taken between 29 April 2014 and 20 February 2017, on days identified as containing CMEs. This resulted in testing of 139 days' worth of data containing 171 CMEs. The detection rate varied from close to 80% when solar activity was high down to as low as 20-30% when activity was low. The difference in effectiveness with solar cycle is attributed to the relative prevalence of strong CMEs between active and quiet periods. There were also 12 false detections, leading to an average false detection rate of 8.6%. The K-Cor data were also compared with major solar energetic particle (SEP) storms during this time period. There were three SEP events detected either at Earth or at one of the two STEREO spacecraft when K-Cor was observing during the relevant time period. The algorithm successfully generated alerts for two of these events, with lead times of 1-3 h before the SEP onset at 1 AU. The third event was not detected by the automatic algorithm because of the unusually broad width in position angle.

  16. Optimization and validation of CEDIA drugs of abuse immunoassay tests in serum on Hitachi 912.

    PubMed

    Kirschbaum, Katrin M; Musshoff, Frank; Schmithausen, Ricarda; Stockhausen, Sarah; Madea, Burkhard

    2011-10-10

    Due to the sensitive limits of detection of chromatographic methods and the low limit values regarding the screening of drugs under the terms of impairment in safe driving (§ 24a StVG, Street Traffic Law in Germany), preliminary immunoassay (IA) tests should also be able to detect low concentrations of legal and illegal drugs in serum in forensic cases. False negatives should be avoided, and the rate of false-positive samples should be low because of cost and time. An optimization of IA cutoff values and a validation of the assay are required for each laboratory. In a retrospective study, results for serum samples containing amphetamine, methylenedioxy derivatives, cannabinoids, benzodiazepines, cocaine (metabolites), methadone and opiates obtained with CEDIA drugs of abuse reagents on a Hitachi 912 autoanalyzer were compared with quantitative results of chromatographic methods (gas or liquid chromatography coupled with mass spectrometry (GC/MS or LC/MS)). First, sensitivity, specificity, positive and negative predictive values and overall misclassification rates were evaluated by contingency tables and compared to ROC analyses and Youden indices. Second, ideal cutoffs were statistically calculated on the basis of sensitivity and specificity as the decisive statistical criteria, with a focus on high sensitivity (low rates of false negatives), i.e. using the Youden index. Immunoassay (IA) and confirmatory results were available for 3014 blood samples. Sensitivity was 90% or more for nearly all analytes: amphetamines (IA cutoff 9.5 ng/ml), methylenedioxy derivatives (IA cutoff 5.5 ng/ml), cannabinoids (IA cutoff 14.5 ng/ml), benzodiazepines (IA cutoff >0 ng/ml). The opiate test showed a sensitivity of 86% for an IA cutoff value of >0 ng/ml. Values for specificity ranged between 33% (methadone, IA cutoff 10 ng/ml) and 90% (cocaine, IA cutoff 20 ng/ml). Lower cutoff values than recommended by ROC analyses were chosen for most tests to decrease the rate of false negatives. The analyses enabled the definition of cutoff values with good sensitivity. Small rates of false positives can be accepted in forensic cases. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
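
    The cutoff optimisation described above rests on the Youden index, J = sensitivity + specificity - 1; the sketch below scans candidate immunoassay cutoffs against chromatographic confirmation labels and returns the cutoff with maximal J. It is a generic illustration, not the study's statistics code.

    ```python
    # Hedged sketch: choose an immunoassay cutoff by maximising the Youden index
    # against chromatographic confirmation results (illustrative data handling only).
    import numpy as np

    def best_cutoff_by_youden(ia_values, confirmed_positive):
        ia_values = np.asarray(ia_values, dtype=float)
        confirmed_positive = np.asarray(confirmed_positive, dtype=bool)
        best = (None, -np.inf)
        for cutoff in np.unique(ia_values):
            called_positive = ia_values >= cutoff
            sensitivity = called_positive[confirmed_positive].mean() if confirmed_positive.any() else 0.0
            specificity = (~called_positive[~confirmed_positive]).mean() if (~confirmed_positive).any() else 0.0
            j = sensitivity + specificity - 1.0    # Youden index
            if j > best[1]:
                best = (cutoff, j)
        return best   # (cutoff with maximal J, value of J)
    ```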

  17. High false-negative rate of anti-HCV among Egyptian patients on regular hemodialysis.

    PubMed

    El-Sherif, Assem; Elbahrawy, Ashraf; Aboelfotoh, Atef; Abdelkarim, Magdy; Saied Mohammad, Abdel-Gawad; Abdallah, Abdallah Mahmoud; Mostafa, Sadek; Elmestikawy, Amr; Elwassief, Ahmed; Salah, Mohamed; Abdelbaseer, Mohamed Ali; Abdelwahab, Kouka Saadeldin

    2012-07-01

    Routine serological testing for hepatitis C virus (HCV) infection among hemodialysis (HD) patients is currently recommended. A dilemma exists regarding the value of serology because some investigators have reported a high rate of false-negative serologic testing. In this study, we aimed to determine the false-negative rate of anti-HCV among Egyptian HD patients. Seventy-eight HD patients, negative for anti-HCV, anti-HIV, and hepatitis B surface antigen, were tested for HCV RNA by reverse transcriptase polymerase chain reaction (RT-PCR). In the next step, the viral load was quantified by real-time PCR in RT-PCR-positive patients. Risk factors for HCV infection, as well as clinical and biochemical indicators of liver disease, were compared between false-negative and true-negative anti-HCV HD patients. The frequency of false-negative anti-HCV was 17.9%. Frequency of blood transfusion, duration of HD, dialysis at multiple centers, and diabetes mellitus were not identified as risk factors for HCV infection. The frequency of false-negative results had a linear relation to the prevalence of HCV infection in the HD units. Timely identification of HCV within dialysis units is needed in order to lower the risk of HCV spread within the HD units. The high false-negative rate of anti-HCV among HD patients in our study justifies testing a large number of patients for a precise assessment of the effectiveness of nucleic acid amplification technology testing in screening HD patients. © 2012 The Authors. Hemodialysis International © 2012 International Society for Hemodialysis.

  18. Study of false positives in 5-ALA induced photodynamic diagnosis of bladder carcinoma

    NASA Astrophysics Data System (ADS)

    Draga, Ronald O. P.; Grimbergen, Matthijs C. M.; Kok, Esther T.; Jonges, Trudy G. N.; Bosch, J. L. H. R.

    2009-02-01

    Photodynamic diagnosis (PDD) is a technique that enhances the detection of tumors during cystoscopy using a photosensitizer, which accumulates primarily in cancerous cells and fluoresces when illuminated by violet-blue light. A disadvantage of PDD is its relatively low specificity. In this retrospective study we aimed to identify predictors of false positive findings in PDD. Factors such as gender, age, recent transurethral resection of bladder tumors (TURBT), previous intravesical therapy (IVT) and urinary tract infections (UTIs) were examined for association with the false positive rates in a multivariate analysis. Data from 366 procedures and 200 patients were collected. Patients were instilled with 5-aminolevulinic acid (5-ALA) intravesically, and 1253 biopsies were taken from tumors and suspicious lesions. Female gender and TURBT are independent predictors of false positives in PDD. However, previous intravesical therapy with Bacille Calmette-Guérin is also an important predictor of false positives. The false positive rate decreases during the first 9-12 weeks after the latest TURBT and the latest intravesical chemotherapy. Although false positives increase shortly after IVT and TURBT, PDD improves the diagnostic sensitivity and results in more adequate treatment strategies in a significant number of patients.

  19. Kepler's Final Survey Catalog

    NASA Astrophysics Data System (ADS)

    Mullally, S. E.

    2017-12-01

    The Kepler mission was designed to detect transiting exoplanets and has succeeded in finding over 4000 candidates. These candidates include approximately 50 terrestrial-sized worlds near the habitable zone of their GKM dwarf stars (shown in the figure against stellar temperature). However, not all transit detections are created equal. False positives, such as background eclipsing binaries, can mimic the signal of a transiting planet. Additionally, at Kepler's detection limit, noise, either from the star or from the detector, can create signals that also mimic a transiting planet. For the Data Release 25 Kepler catalog we simulated these false alarms and determined how often known false alarms are called candidates. When this reliability information is combined with our studies of catalog completeness, this catalog can be used to understand the occurrence rate of exoplanets, even for the small, temperate planet candidates found by Kepler. I will discuss the automated methods we used to create and characterize this latest catalog, highlighting how we balanced the completeness and reliability of the long-period candidates. While Kepler has been very successful at detecting transiting terrestrial-sized exoplanets, many of these detections are around stars that are too dim for successful follow-up work. Future missions will pick up where Kepler left off and find small planets around some of the brightest and smallest stars.

  20. Accounting for false-positive acoustic detections of bats using occupancy models

    USGS Publications Warehouse

    Clement, Matthew J.; Rodhouse, Thomas J.; Ormsbee, Patricia C.; Szewczak, Joseph M.; Nichols, James D.

    2014-01-01

    4. Synthesis and applications. Our results suggest that false positives sufficient to affect inferences may be common in acoustic surveys for bats. We demonstrate an approach that can estimate occupancy, regardless of the false-positive rate, when acoustic surveys are paired with capture surveys. Applications of this approach include monitoring the spread of White-Nose Syndrome, estimating the impact of climate change and informing conservation listing decisions. We calculate a site-specific probability of occupancy, conditional on survey results, which could inform local permitting decisions, such as for wind energy projects. More generally, the magnitude of false positives suggests that false-positive occupancy models can improve accuracy in research and monitoring of bats and provide wildlife managers with more reliable information.

  1. False-negative rate cannot be reduced by lowering the haemoglobin concentration cut-off in colorectal cancer screening using faecal immunochemical test.

    PubMed

    Ibañez-Sanz, Gemma; Garcia, Montse; Milà, Núria; Rodríguez-Moranta, Francisco; Binefa, Gemma; Gómez-Matas, Javier; Benito, Llúcia; Padrol, Isabel; Barenys, Mercè; Moreno, Victor

    2017-09-01

    The aim of this study was to analyse false-negative (FN) results of the faecal immunochemical test (FIT) and its determinants in a colorectal cancer screening programme in Catalonia. We carried out a cross-sectional study among 218 screenees with a negative FIT result who agreed to undergo a colonoscopy. A false-negative result was defined as the detection, at colonoscopy, of intermediate/high-risk polyps or colorectal cancer in a patient with a previous negative FIT (<20 µgHb/g). Multivariate logistic regression models were constructed to identify sociodemographic (sex, age) and screening variables (quantitative faecal haemoglobin, colonoscopy findings) related to FN results. Adjusted odds ratios and their 95% confidence intervals were estimated. There were 15.6% FN FIT results. Faecal haemoglobin was undetected in 45.5% of these results and was below 4 µgHb/g in 94.0% of the individuals with a FN result. About 60% of the lesions were located in the proximal colon, whereas the expected percentage was 30%. Decreasing the positivity threshold of FIT does not increase the detection rate of advanced neoplasia, but may increase the costs and potential adverse effects.

  2. Qualitative PCR method for Roundup Ready soybean: interlaboratory study.

    PubMed

    Kodama, Takashi; Kasahara, Masaki; Minegishi, Yasutaka; Futo, Satoshi; Sawada, Chihiro; Watai, Masatoshi; Akiyama, Hiroshi; Teshima, Reiko; Kurosawa, Yasunori; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2011-01-01

    Quantitative and qualitative methods based on PCR have been developed for genetically modified organisms (GMO). Interlaboratory studies were previously conducted for GMO quantitative methods; in this study, an interlaboratory study was conducted for a qualitative method for a GM soybean, Roundup Ready soy (RR soy), with primer pairs designed for the quantitative method of RR soy studied previously. Fourteen laboratories in Japan participated. Each participant extracted DNA from 1.0 g each of the soy samples containing 0, 0.05, and 0.10% of RR soy, and performed PCR with primer pairs for an internal control gene (Le1) and RR soy followed by agarose gel electrophoresis. The PCR product amplified in this PCR system for Le1 was detected from all samples. The sensitivity, specificity, and false-negative and false-positive rates of the method were obtained from the results of RR soy detection. False-negative rates at the level of 0.05 and 0.10% of the RR soy samples were 6.0 and 2.3%, respectively, revealing that the LOD of the method was somewhat below 0.10%. The current study demonstrated that the qualitative method would be practical for monitoring the labeling system of GM soy in kernel lots.

  3. A hybrid protection approaches for denial of service (DoS) attacks in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Gunasekaran, Mahalakshmi; Periakaruppan, Subathra

    2017-06-01

    Wireless sensor networks (WSNs) contain distributed autonomous devices with the capability of sensing physical and environmental conditions. During the clustering operation, higher energy consumption drains the battery power and shortens the network lifetime. Hence, WSN devices are initially operated in a low-power sleep mode to maximise the lifetime, but the arrival of attacks disrupts this low-power operation; such attacks are denial of service (DoS) attacks. Conventional intrusion detection (ID) approaches, such as rule-based and anomaly-based methods, detect DoS attacks effectively, but their energy consumption and false detection rates are high. The absence of attack information, and the failure to broadcast its impact to the other cluster heads (CHs), makes the arrival of DoS attacks easier. This article combines isolation and routing tables to detect the attack in the specific cluster and broadcasts the information to the other CHs. The intercommunication between the CHs prevents DoS attacks effectively. In addition, a swarm-based defence approach is proposed to migrate from the faulty channel to a normal operating channel through frequency hopping. A comparative analysis of the proposed table-based intrusion detection systems (IDSs) and the swarm-based defence approach against a traditional IDS, in terms of transmission overhead/efficiency, energy consumption, and false positive/negative rates, demonstrates the capability of DoS prediction and prevention in WSNs.

  4. Algorithm for automatic analysis of electro-oculographic data

    PubMed Central

    2013-01-01

    Background: Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. Methods: The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. Results: The algorithm achieved 93% detection sensitivity for blinks with a 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. Conclusion: The developed algorithm enables reliable analysis of EOG data recorded both during EEG and as a stand-alone measurement. PMID:24160372

  5. Algorithm for automatic analysis of electro-oculographic data.

    PubMed

    Pettersson, Kati; Jagadeesan, Sharman; Lukander, Kristian; Henelius, Andreas; Haeggström, Edward; Müller, Kiti

    2013-10-25

    Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. The algorithm achieved 93% detection sensitivity for blinks with a 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. The developed algorithm enables reliable analysis of EOG data recorded both during EEG and as a stand-alone measurement.
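
    A minimal sketch of the amplitude-threshold idea described above, using a robust (median + k x MAD) rule on the signal derivative; the threshold heuristic, the constant k, and the synthetic signal are illustrative assumptions, not the authors' auto-calibration procedure.

    # Illustrative sketch of amplitude-threshold event detection on an EOG-like
    # signal. The threshold heuristic (median + k*MAD of the derivative) is an
    # assumption for demonstration, not the published auto-calibration rule.
    import numpy as np

    def detect_events(signal, fs, k=8.0):
        """Return sample indices where the signal derivative exceeds an
        automatically estimated amplitude threshold."""
        d = np.abs(np.diff(signal)) * fs                 # approximate velocity
        mad = np.median(np.abs(d - np.median(d)))        # robust spread estimate
        threshold = np.median(d) + k * mad               # auto-calibrated threshold
        return np.where(d > threshold)[0], threshold

    if __name__ == "__main__":
        fs = 250.0
        t = np.arange(0, 10, 1 / fs)
        eog = 0.02 * np.random.randn(t.size)             # baseline noise
        eog[1000:1010] += np.linspace(0, 3.0, 10)        # synthetic saccade-like ramp
        idx, thr = detect_events(eog, fs)
        print(f"auto threshold = {thr:.1f}; suprathreshold samples: {idx}")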

  6. Saposin-like protein 2 has an immunodiagnostic potential for detecting Fasciolosis gigantica.

    PubMed

    Kueakhai, Pornanan; Changklungmoa, Narin; Chaithirayanon, Kulathida; Phatsara, Manussabhorn; Preyavichyapugdee, Narin; Riengrojpitak, Suda; Sangpairoj, Kant; Chusongsang, Phiraphol; Sobhon, Prasert

    2015-01-01

    Saposin-like protein 2 (SAP-2) plays an important role in the digestive process of Fasciola gigantica (Fg). It is one of the major proteins synthesized by the caecal epithelial cells and released into the fluke's excretion-secretion. Therefore, FgSAP-2 is a plausible target for detecting fasciolosis. A polyclonal antibody (PoAb) against recombinant FgSAP-2 was produced by immunizing rabbits with the recombinant protein (rFgSAP-2) and used in a sandwich ELISA to detect circulating FgSAP-2 in sera of mice experimentally infected with F. gigantica metacercariae. The assay could detect rFgSAP-2 and the native FgSAP-2 in the excretory-secretory (ES) and whole body (WB) fractions of adult F. gigantica at concentrations as low as 38 pg/ml, 24 ng/ml, and 102 ng/ml, respectively. Sera from mice experimentally infected with F. gigantica also tested positive by this sandwich ELISA, which exhibited sensitivity, specificity, false positive rate, false negative rate and accuracy of 99.99, 98.67, 1.33, 0.01 and 99.32%, respectively. Therefore, this assay could be used for the diagnosis of fasciolosis caused by F. gigantica. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Transitioning to future air traffic management: effects of imperfect automation on controller attention and performance.

    PubMed

    Rovira, Ericka; Parasuraman, Raja

    2010-06-01

    This study examined whether benefits of conflict probe automation would occur in a future air traffic scenario in which air traffic service providers (ATSPs) are not directly responsible for freely maneuvering aircraft but are controlling other nonequipped aircraft (mixed-equipage environment). The objective was to examine how the type of automation imperfection (miss vs. false alarm) affects ATSP performance and attention allocation. Research has shown that the type of automation imperfection leads to differential human performance costs. Twelve full-performance-level ATSPs participated in four 30-min scenarios. Dependent variables included conflict detection and resolution performance, eye movements, and subjective ratings of trust and self-confidence. ATSPs detected conflicts faster and more accurately with reliable automation, as compared with manual performance. When the conflict probe automation was unreliable, conflict detection performance declined with both miss (25% conflicts detected) and false alarm automation (50% conflicts detected). When the primary task of conflict detection was automated, even highly reliable yet imperfect automation (miss or false alarm) resulted in serious negative effects on operator performance. The further in advance that conflict probe automation predicts a conflict, the greater the uncertainty of prediction; thus, designers should provide users with feedback on the state of the automation or other tools that allow for inspection and analysis of the data underlying the conflict probe algorithm.

  8. Visual and semi-automatic non-invasive detection of interictal fast ripples: A potential biomarker of epilepsy in children with tuberous sclerosis complex.

    PubMed

    Bernardo, Danilo; Nariai, Hiroki; Hussain, Shaun A; Sankar, Raman; Salamon, Noriko; Krueger, Darcy A; Sahin, Mustafa; Northrup, Hope; Bebin, E Martina; Wu, Joyce Y

    2018-04-03

    We aim to establish that interictal fast ripples (FR; 250-500 Hz) are detectable on scalp EEG, and to investigate their association with epilepsy. Scalp EEG recordings of a subset of children with tuberous sclerosis complex (TSC)-associated epilepsy from two large multicenter observational TSC studies were analyzed and compared to control children without epilepsy or any other brain-based diagnoses. FR were identified by human visual review and compared with semi-automated review utilizing a deep learning-based FR detector. Seven out of 7 children with TSC-associated epilepsy had scalp FR compared to 0 out of 4 children in the control group (p = 0.003). The automatic detector had a sensitivity of 98% and an average false positive rate of 11.2 false positives per minute. Non-invasive detection of interictal scalp FR was feasible with both visual and semi-automated review. Interictal scalp FR occurred exclusively in children with TSC-associated epilepsy and were absent in controls without epilepsy. The proposed detector achieves high sensitivity of FR detection; however, expert review of the results to reduce false positives is advised. Interictal FR are detectable on scalp EEG and may potentially serve as a biomarker of epilepsy in children with TSC. Copyright © 2018 International Federation of Clinical Neurophysiology. All rights reserved.

  9. Biomarker Reference Sets for Cancers in Women — EDRN Public Portal

    Cancer.gov

    The purpose of this study is to develop a standard reference set of specimens for use by investigators participating in the National Cancer Institute's Early Detection Research Network (EDRN) in defining false positive rates for new cancer biomarkers in women.

  10. Multi-Stage System for Automatic Target Recognition

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin; Lu, Thomas T.; Ye, David; Edens, Weston; Johnson, Oliver

    2010-01-01

    A multi-stage automated target recognition (ATR) system has been designed to perform computer vision tasks with adequate proficiency in mimicking human vision. The system is able to detect, identify, and track targets of interest. Potential regions of interest (ROIs) are first identified by the detection stage using an Optimum Trade-off Maximum Average Correlation Height (OT-MACH) filter combined with a wavelet transform. False positives are then eliminated by the verification stage using feature extraction methods in conjunction with neural networks. Feature extraction transforms the ROIs using filtering and binning algorithms to create feature vectors. A feedforward back-propagation neural network (NN) is then trained to classify each feature vector and to remove false positives. A system parameter optimization process was developed to adapt to various targets and datasets. The objective was to design an efficient computer vision system that can learn to detect multiple targets in large images with unknown backgrounds. Because the target size is small relative to the image size in this problem, there are many regions of the image that could potentially contain the target. A cursory analysis of every region can be computationally efficient, but may yield too many false positives. On the other hand, a detailed analysis of every region can yield better results, but may be computationally inefficient. The multi-stage ATR system was designed to achieve an optimal balance between accuracy and computational efficiency by incorporating both models. The detection stage first identifies potential ROIs where the target may be present by performing a fast Fourier domain OT-MACH filter-based correlation. Because the threshold for this stage is chosen with the goal of detecting all true positives, a number of false positives are also detected as ROIs. The verification stage then transforms the regions of interest into feature space and eliminates false positives using an artificial neural network classifier. The multi-stage system allows the detection sensitivity and the identification specificity to be tuned individually in each stage, making it easier to optimize ATR operation for a specific goal. The test results show that the system was successful in substantially reducing the false positive rate when tested on sonar and video image datasets.
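
    As a rough illustration of the detection stage's idea (frequency-domain template correlation followed by thresholding to propose ROIs), the sketch below uses a plain FFT cross-correlation with a hypothetical 5 x 5 template; the OT-MACH filter, wavelet transform, and neural-network verification stage are not reproduced.

    # Minimal sketch of Fourier-domain correlation for ROI proposal. The target,
    # scene, and threshold are synthetic placeholders, not the ATR system's data.
    import numpy as np

    def correlation_peaks(scene, template, threshold=0.7):
        """Return (row, col) offsets where normalized circular FFT correlation exceeds threshold."""
        pad = np.zeros_like(scene)
        pad[:template.shape[0], :template.shape[1]] = template
        corr = np.real(np.fft.ifft2(np.fft.fft2(scene) * np.conj(np.fft.fft2(pad))))
        corr /= corr.max()
        return np.argwhere(corr > threshold)

    scene = np.random.rand(128, 128) * 0.2        # low-intensity background
    target = np.ones((5, 5))
    scene[40:45, 60:65] += target                 # implanted target near (40, 60)
    print(correlation_peaks(scene, target)[:5])   # offsets near the implanted target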

  11. Retrospective study evaluating the performance of a first-trimester combined screening for trisomy 21 in an Italian unselected population

    PubMed Central

    Padula, Francesco; Cignini, Pietro; Giannarelli, Diana; Brizzi, Cristiana; Coco, Claudio; D’Emidio, Laura; Giorgio, Elsa; Giorlandino, Maurizio; Mangiafico, Lucia; Mastrandrea, Marialuisa; Milite, Vincenzo; Mobili, Luisa; Nanni, Cinzia; Raffio, Raffaella; Taramanni, Cinzia; Vigna, Roberto; Mesoraca, Alvaro; Bizzoco, Domenico; Gabrielli, Ivan; Di Giacomo, Gianluca; Barone, Maria Antonietta; Cima, Antonella; Giorlandino, Francesca Romana; Emili, Sabrina; Cupellaro, Marina; Giorlandino, Claudio

    2014-01-01

    Objectives: To assess the performance of a combined first-trimester screening for trisomy 21 in an unselected Italian population referred to a specialized private center for prenatal medicine. Methods: A retrospective validation of first-trimester screening algorithms [risk calculation based on maternal age and nuchal translucency (NT) alone, maternal age and serum parameters (free β-hCG and PAPP-A) alone and a combination of both] for fetal aneuploidies evaluated in an unselected Italian population at Artemisia Fetal-Maternal Medical Centre in Rome. All measurements were performed between 11+0 and 13+6 weeks of gestation, between April 2007 and December 2008. Results: Of 3,610 single fetuses included in the study, we had a complete follow-up on 2,984. Fourteen of 17 cases of trisomy 21 were detected when a cut-off of 1:300 was applied [detection rate (DR) 82.4%, 95% confidence interval (CI) 64.2–100; false-positive rate (FPR) 4.7%, 95% CI 3.9–5.4; false-negative rate (FNR) 17.6%, 95% CI 0–35.8%]. Conclusion: In our study population the detection rate for trisomy 21, using the combined risk calculation based on maternal age, fetal NT, maternal PAPP-A and free β-hCG levels, was superior to the application of either parameter alone. The algorithm has been validated for first trimester screening in the Italian population. PMID:26266002
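
    A brief sketch of how a detection rate and a 95% confidence interval of the kind quoted above can be computed; the counts mirror the 14-of-17 figure in the abstract, while the Wald (normal-approximation) interval is an assumed choice for illustration, not necessarily the authors' method.

    # Detection rate with a normal-approximation (Wald) 95% confidence interval,
    # clipped to [0, 1]. Counts follow the abstract; the CI formula is assumed.
    import math

    def rate_with_ci(successes, total, z=1.96):
        p = successes / total
        half = z * math.sqrt(p * (1 - p) / total)
        return p, max(0.0, p - half), min(1.0, p + half)

    dr, lo, hi = rate_with_ci(14, 17)
    print(f"DR = {dr:.1%} (95% CI {lo:.1%}-{hi:.1%})")   # roughly 82.4% (64.2%-100%)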

  12. First trimester combined test for Down syndrome screening in unselected pregnancies - a report of a 13-year experience.

    PubMed

    Lee, Fa-Kung; Chen, Li-Ching; Cheong, Mei-Leng; Chou, Ching-Yu; Tsai, Ming-Song

    2013-12-01

    To analyze the performance of the first trimester Down syndrome screening in a single medical center in Northern Taiwan. From April 1999 to June 2012, a total of 25,104 pregnant women at gestational age of 10 weeks to 13 weeks 6 days received the first trimester "combined test" for Down syndrome screening. The test combines the ultrasound scan of nuchal translucency thickness and maternal biochemical serum levels of pregnancy-associated plasma protein A (PAPP-A) and free beta-human chorionic gonadotropin (β-hCG). A positive screen was defined as an estimated Down syndrome risk ≥ 1/270, and either chorionic villous sampling or amniocentesis was performed for fetal chromosomal analyses. Seventy-eight of the 25,104 pregnancies were proven to have fetal chromosome anomalies. The detection rates for trisomy 21, trisomy 18, Turner syndrome, and other chromosome anomalies were 87.5% (21/24), 69.2% (9/13), 81.8% (9/11), and 60% (18/30), respectively, with a false positive rate (FPR) of 5.4% (1353/25,026). The detection rates for trisomy 21 by gestational age at 11, 12, and 13 weeks were 92.3%, 87.5%, and 66.7%, respectively. The first trimester combined test is an effective screening tool for Down syndrome detection with an acceptably low false positive rate. The best timing for screening is between 11 and 12 weeks' gestation. Copyright © 2013. Published by Elsevier B.V.

  13. Supporting the Development and Adoption of Automatic Lameness Detection Systems in Dairy Cattle: Effect of System Cost and Performance on Potential Market Shares.

    PubMed

    Van De Gucht, Tim; Van Weyenberg, Stephanie; Van Nuffel, Annelies; Lauwers, Ludwig; Vangeyte, Jürgen; Saeys, Wouter

    2017-10-08

    Most automatic lameness detection system prototypes have not yet been commercialized, and are hence not yet adopted in practice. Therefore, the objective of this study was to simulate the effect of detection performance (percentage missed lame cows and percentage false alarms) and system cost on the potential market share of three automatic lameness detection systems relative to visual detection: a system attached to the cow, a walkover system, and a camera system. Simulations were done using a utility model derived from survey responses obtained from dairy farmers in Flanders, Belgium. Overall, systems attached to the cow had the largest market potential, but were still not competitive with visual detection. Increasing the detection performance or lowering the system cost led to higher market shares for automatic systems at the expense of visual detection. The willingness to pay for extra performance was €2.57 per percentage point fewer missed lame cows, €1.65 per percentage point fewer false alerts, and €12.70 for lame leg indication. The presented results could be exploited by system designers to determine the effect of adjustments to the technology on a system's potential adoption rate.

  14. Intelligent detection and identification in fiber-optical perimeter intrusion monitoring system based on the FBG sensor network

    NASA Astrophysics Data System (ADS)

    Wu, Huijuan; Qian, Ya; Zhang, Wei; Li, Hanyu; Xie, Xin

    2015-12-01

    A real-time intelligent fiber-optic perimeter intrusion detection system (PIDS) based on the fiber Bragg grating (FBG) sensor network is presented in this paper. To distinguish the effects of different intrusion events, a novel real-time behavior impact classification method is proposed based on the essential statistical characteristics of the signal's profile in the time domain. The features are extracted by principal component analysis (PCA) and are then used to identify the event with a K-nearest neighbor classifier. Simulation and field tests are both carried out to validate its effectiveness. The average identification rate (IR) for five sample signals in the simulation test is as high as 96.67%, and the recognition rate for eight typical signals in the field test, which include both fence-mounted and ground-buried sensing signals, reaches 96.52%. In addition, a high detection rate (DR) and a low false alarm rate (FAR) can be obtained simultaneously based on autocorrelation characteristics analysis and a hierarchical detection and identification flow.
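
    The classification chain named above (PCA feature extraction followed by a K-nearest-neighbour classifier) can be sketched as below; the synthetic class-separated data, the number of principal components, and k are placeholders standing in for the FBG intrusion signals and the paper's settings.

    # Sketch of a PCA + KNN identification chain on synthetic stand-in data for
    # five intrusion classes. Component count and k are illustrative choices.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(loc=c, scale=1.0, size=(60, 64)) for c in range(5)])
    y = np.repeat(np.arange(5), 60)                 # five hypothetical event classes

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = make_pipeline(PCA(n_components=8), KNeighborsClassifier(n_neighbors=5))
    clf.fit(X_tr, y_tr)
    print("held-out identification rate:", clf.score(X_te, y_te))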

  15. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    USGS Publications Warehouse

    Hobbs, Michael T.; Brehme, Cheryl S.

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  16. GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D

    This paper introduces a novel graph-analytic approach for detecting anomalies in network flow data called GraphPrints. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets, small induced subgraphs that describe local topology. By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified, and furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level and 0.05% at the IP level, with 100% true positive rates at both.
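
    A much-reduced sketch of the GraphPrints idea: summarize each traffic time slice by counts of small induced subgraphs and flag slices whose counts are outliers. Here a single graphlet (the triangle) and a robust z-score replace the full graphlet vector and the outlier detector used in the paper; graph sizes and thresholds are arbitrary.

    # Conceptual sketch: per-slice triangle counts as a reduced "fingerprint",
    # with a robust z-score used to flag anomalous slices.
    import networkx as nx
    import numpy as np

    def slice_fingerprint(graph):
        """Very reduced fingerprint: total triangle count in the slice graph."""
        return sum(nx.triangles(graph).values()) // 3

    def flag_anomalies(fingerprints, k=3.5):
        x = np.asarray(fingerprints, dtype=float)
        mad = np.median(np.abs(x - np.median(x))) or 1.0
        scores = 0.6745 * (x - np.median(x)) / mad      # robust z-scores
        return np.where(np.abs(scores) > k)[0]

    slices = [nx.gnp_random_graph(50, 0.05, seed=s) for s in range(20)]
    slices.append(nx.complete_graph(20))                # implanted anomaly (index 20)
    counts = [slice_fingerprint(g) for g in slices]
    print("anomalous slice indices:", flag_anomalies(counts))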

  17. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates.

    PubMed

    Hobbs, Michael T; Brehme, Cheryl S

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  18. Application of partial inversion pulse to ultrasonic time-domain correlation method to measure the flow rate in a pipe

    NASA Astrophysics Data System (ADS)

    Wada, Sanehiro; Furuichi, Noriyuki; Shimada, Takashi

    2017-11-01

    This paper proposes the application of a novel ultrasonic pulse, called a partial inversion pulse (PIP), to the measurement of the velocity profile and flow rate in a pipe using the ultrasound time-domain correlation (UTDC) method. In general, the measured flow rate depends on the velocity profile in the pipe; thus, on-site calibration is the only method of checking the accuracy of on-site flow rate measurements. Flow rate calculation using UTDC is based on the integration of the measured velocity profile. The advantages of this method compared with the ultrasonic pulse Doppler method include an unrestricted velocity range and applicability to flow fields with an insufficient number of reflectors. However, it has been previously reported that the measurable velocity range for UTDC is limited by false detections. Considering the application of this method to on-site flow fields, the issue of velocity range is important. To reduce the effect of false detections, a PIP signal, which is an ultrasound signal that contains a partially inverted region, was developed in this study. The advantages of the PIP signal are that it incurs little additional hardware cost and no additional software cost compared with conventional methods. The effects of inversion on the characteristics of the ultrasound transmission were estimated through numerical calculation. Then, experimental measurements were performed at a national standard calibration facility for water flow rate in Japan. The experimental results demonstrate that measurements made using a PIP signal are more accurate and yield a higher detection ratio than measurements using a normal pulse signal.

  19. FraudMiner: A Novel Credit Card Fraud Detection Model Based on Frequent Itemset Mining

    PubMed Central

    Seeja, K. R.; Zareapoor, Masoumeh

    2014-01-01

    This paper proposes an intelligent credit card fraud detection model for detecting fraud from highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding legal as well as fraud transaction patterns for each customer by using frequent itemset mining. A matching algorithm is also proposed to find to which pattern (legal or fraud) the incoming transaction of a particular customer is closer, and a decision is made accordingly. In order to handle the anonymous nature of the data, no preference is given to any of the attributes and each attribute is considered equally for finding the patterns. The performance evaluation of the proposed model is done on the UCSD Data Mining Contest 2009 Dataset (anonymous and imbalanced), and it is found that the proposed model has a very high fraud detection rate, balanced classification rate, and Matthews correlation coefficient, and a much lower false alarm rate than other state-of-the-art classifiers. PMID:25302317

  20. FraudMiner: a novel credit card fraud detection model based on frequent itemset mining.

    PubMed

    Seeja, K R; Zareapoor, Masoumeh

    2014-01-01

    This paper proposes an intelligent credit card fraud detection model for detecting fraud from highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding legal as well as fraud transaction patterns for each customer by using frequent itemset mining. A matching algorithm is also proposed to find to which pattern (legal or fraud) the incoming transaction of a particular customer is closer, and a decision is made accordingly. In order to handle the anonymous nature of the data, no preference is given to any of the attributes and each attribute is considered equally for finding the patterns. The performance evaluation of the proposed model is done on the UCSD Data Mining Contest 2009 Dataset (anonymous and imbalanced), and it is found that the proposed model has a very high fraud detection rate, balanced classification rate, and Matthews correlation coefficient, and a much lower false alarm rate than other state-of-the-art classifiers.
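
    A toy sketch of the per-customer pattern-matching idea: treat the attribute values appearing in at least a minimum fraction of a customer's past legal (or fraud) transactions as that customer's pattern, then label a new transaction by which pattern it overlaps more. This stands in for the frequent itemset mining and matching algorithm; all attribute names and support values are made up.

    # Toy per-customer legal/fraud pattern matching. Attribute values and
    # support thresholds are hypothetical placeholders.
    from collections import Counter

    def frequent_pattern(transactions, min_support=0.5):
        counts = Counter(item for t in transactions for item in t)
        cutoff = min_support * len(transactions)
        return {item for item, c in counts.items() if c >= cutoff}

    def classify(transaction, legal_pattern, fraud_pattern):
        legal_overlap = len(transaction & legal_pattern)
        fraud_overlap = len(transaction & fraud_pattern)
        return "fraud" if fraud_overlap > legal_overlap else "legal"

    legal_history = [{"low_amount", "home_country", "daytime"}] * 8
    fraud_history = [{"high_amount", "foreign_ip", "night"}] * 2
    legal = frequent_pattern(legal_history)
    fraud = frequent_pattern(fraud_history)
    print(classify({"high_amount", "foreign_ip", "daytime"}, legal, fraud))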

  1. Detecting earthquakes over a seismic network using single-station similarity measures

    NASA Astrophysics Data System (ADS)

    Bergen, Karianne J.; Beroza, Gregory C.

    2018-06-01

    New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network; event-pair extraction, pairwise pseudo-association, and event resolution complete a post-processing pipeline that combines single-station similarity measures (e.g. FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected moveout. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is both sensitive and maintains a low false detection rate. As a test case, we apply our approach to 2 weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalogue (including 95 per cent of the catalogue events), and less than 1 per cent of these candidate events are false detections.
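
    A rough sketch of the network-association step described above, in the spirit of pairwise pseudo-association: detections from different stations that fall within a fixed time tolerance of each other support the same candidate event, without any moveout model. The tolerance and the toy detection times are assumptions for illustration only.

    # Toy pairwise association of single-station detection times into candidate
    # events; no moveout model, just a time tolerance between station pairs.
    from itertools import combinations

    def candidate_events(detections, tol=30.0):
        """detections: dict station -> sorted list of detection times (s)."""
        pairs = []
        for (sa, ta_list), (sb, tb_list) in combinations(detections.items(), 2):
            for ta in ta_list:
                for tb in tb_list:
                    if abs(ta - tb) <= tol:
                        pairs.append((min(ta, tb), sa, sb))
        return sorted(pairs)

    dets = {"ST01": [100.0, 560.0], "ST02": [112.0, 900.0], "ST03": [108.0]}
    print(candidate_events(dets))   # three supporting pairs around t = 100-112 s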

  2. Reformatted images improve the detection rate of acute traumatic subdural hematomas on brain CT compared with axial images alone.

    PubMed

    Amrhein, Timothy J; Mostertz, William; Matheus, Maria Gisele; Maass-Bolles, Genevieve; Sharma, Komal; Collins, Heather R; Kranz, Peter G

    2017-02-01

    Subdural hematomas (SDHs) comprise a significant percentage of missed intracranial hemorrhage on axial brain CT. SDH detection rates could be improved with the addition of reformatted images. Though reformatted images are produced at some centers, their potential additional diagnostic sensitivity has not yet been investigated. The purpose of our study was to determine whether the addition of coronal and sagittal reformatted images to an axial brain CT increases the sensitivity and specificity for detection of acute traumatic SDH. We retrospectively reviewed consecutive brain CTs acquired for acute trauma that contained new SDHs. An equivalent number of normal brain CTs served as controls. Paired sets of images were created for each case: (1) axial images only ("axial only") and (2) axial, coronal, and sagittal images ("reformat added"). Three readers interpreted both the axial only and companion reformat added for each case, separated by 1 month. Reading times and SDH detection rates were compared. One hundred SDH and 100 negative examinations were collected. Sensitivity and specificity for the axial-only scans were 75.7 and 94.3%, respectively, compared with 88.3 and 98.3% for reformat added. There was a 24.3% false negative (missed SDH) rate with axial-only scans versus 11.7% with reformat added (p < 0.001). Median reader interpretation times were longer with the addition of reformatted images (125 versus 89 s), but this difference was not significant (p = 0.23). The addition of coronal and sagittal images in trauma brain CT resulted in improved sensitivity and specificity as well as a reduction in SDH false negatives by greater than 50%. Reformatted images substantially reduce the number of missed SDHs compared with axial images alone.

  3. Identifying Threats Using Graph-based Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Eberle, William; Holder, Lawrence; Cook, Diane

    Much of the data collected during the monitoring of cyber and other infrastructures is structural in nature, consisting of various types of entities and relationships between them. The detection of threatening anomalies in such data is crucial to protecting these infrastructures. We present an approach to detecting anomalies in a graph-based representation of such data that explicitly represents these entities and relationships. The approach consists of first finding normative patterns in the data using graph-based data mining and then searching for small, unexpected deviations to these normative patterns, assuming illicit behavior tries to mimic legitimate, normative behavior. The approach is evaluated using several synthetic and real-world datasets. Results show that the approach has high true-positive rates, low false-positive rates, and is capable of detecting complex structural anomalies in real-world domains including email communications, cellphone calls and network traffic.

  4. Parameter Transient Behavior Analysis on Fault Tolerant Control System

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine (Technical Monitor); Shin, Jong-Yeob

    2003-01-01

    In a fault tolerant control (FTC) system, a parameter varying FTC law is reconfigured based on fault parameters estimated by fault detection and isolation (FDI) modules. FDI modules require some time to detect fault occurrences in aero-vehicle dynamics. This paper illustrates analysis of an FTC system based on estimated fault parameter transient behavior, which may include false fault detections during a short time interval. Using Lyapunov function analysis, the upper bound of an induced-L2 norm of the FTC system performance is calculated as a function of the fault detection time and the exponential decay rate of the Lyapunov function.

  5. Online anomaly detection in wireless body area networks for reliable healthcare monitoring.

    PubMed

    Salem, Osman; Liu, Yaning; Mehaoua, Ahmed; Boutaba, Raouf

    2014-09-01

    In this paper, we propose a lightweight approach for online detection of faulty measurements by analyzing the data collected from medical wireless body area networks. The proposed framework performs sequential data analysis using a smart phone as a base station, and takes into account the constrained resources of the smart phone, such as processing power and storage capacity. The main objective is to raise alarms only when patients enter an emergency situation, and to discard false alarms triggered by faulty measurements or ill-behaved sensors. The proposed approach is based on the Haar wavelet decomposition, nonseasonal Holt-Winters forecasting, and the Hampel filter, combining spatial and temporal analysis. Our objective is to reduce false alarms resulting from unreliable measurements and to reduce unnecessary healthcare intervention. We apply the proposed approach to a real physiological dataset. Our experimental results prove the effectiveness of our approach in achieving good detection accuracy with a low false alarm rate. The simplicity and the processing speed of our proposed framework make it useful and efficient for real-time diagnosis.
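
    One of the building blocks named above, the Hampel filter, is easy to sketch: a sliding-window test that replaces any sample deviating from the local median by more than k scaled median absolute deviations. The window length, k, and the toy heart-rate series are illustrative assumptions.

    # Sliding-window Hampel filter: samples deviating from the local median by
    # more than k scaled MADs are treated as faulty and replaced by the median.
    import numpy as np

    def hampel(x, window=7, k=3.0):
        x = np.asarray(x, dtype=float).copy()
        half = window // 2
        for i in range(half, len(x) - half):
            win = x[i - half:i + half + 1]
            med = np.median(win)
            mad = 1.4826 * np.median(np.abs(win - med))
            if mad > 0 and abs(x[i] - med) > k * mad:
                x[i] = med                      # treat as a faulty measurement
        return x

    hr = np.array([72, 71, 73, 180, 72, 74, 73, 72, 71, 72], dtype=float)
    print(hampel(hr))                           # the spike at 180 is replaced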

  6. Discrimination of plant-parasitic nematodes from complex soil communities using ecometagenetics.

    PubMed

    Porazinska, Dorota L; Morgan, Matthew J; Gaspar, John M; Court, Leon N; Hardy, Christopher M; Hodda, Mike

    2014-07-01

    Many plant pathogens are microscopic, cryptic, and difficult to diagnose. The new approach of ecometagenetics, involving ultrasequencing, bioinformatics, and biostatistics, has the potential to improve diagnoses of plant pathogens such as nematodes from the complex mixtures found in many agricultural and biosecurity situations. We tested this approach on a gradient of complexity ranging from a few individuals from a few species of known nematode pathogens in a relatively defined substrate to a complex and poorly known suite of nematode pathogens in a complex forest soil, including its associated biota of unknown protists, fungi, and other microscopic eukaryotes. We added three known but contrasting species (Pratylenchus neglectus, the closely related P. thornei, and Heterodera avenae) to half the set of substrates, leaving the other half without them. We then tested whether all nematode pathogens (known and unknown, indigenous and experimentally added) were consistently detected as present or absent. We always detected the Pratylenchus spp. correctly and with the number of sequence reads proportional to the numbers added. However, a single cyst of H. avenae was only identified approximately half the time it was present. Other plant-parasitic nematodes and nematodes from other trophic groups were detected well, but other eukaryotes were detected less consistently. DNA sampling errors or informatic errors or both were involved in misidentification of H. avenae; however, the proportions of each varied in the different bioinformatic pipelines and with different parameters used. To a large extent, false-positive and false-negative errors were complementary: pipelines and parameters with the highest false-positive rates had the lowest false-negative rates and vice versa. Sources of error identified included assumptions in the bioinformatic pipelines, slight differences in primer regions, the number of sequence reads regarded as the minimum threshold for inclusion in analysis, and inaccessible DNA in resistant life stages. Identification of the sources of error allows us to suggest ways to improve identification using ecometagenetics.

  7. Consensus review of discordant findings maximizes cancer detection rate in double-reader screening mammography: Irish National Breast Screening Program experience.

    PubMed

    Shaw, Colette M; Flanagan, Fidema L; Fenlon, Helen M; McNicholas, Michelle M

    2009-02-01

    To assess consensus review of discordant screening mammography findings in terms of its sensitivity, safety, and effect on overall performance in the first 6 years of operation of the Irish National Breast Screening Program (NBSP). Women who participated in the Irish NBSP gave written informed consent for use of their data for auditing purposes. Local ethics committee approval was obtained. The study population consisted of women who participated in the Irish NBSP and underwent initial screening mammography at one of the two screening centers serving the eastern part of Ireland between 2000 and 2005. Independent double reading of mammograms was performed. When the readers disagreed regarding referral, the case was reviewed by a consensus panel. Of the 128 569 screenings performed, 1335 (1%) were discussed by consensus. Of the 1335 cases discussed by consensus, 606 (45.39%) were recalled for further assessment. This resulted in an overall recall rate of 4.41%. In those recalled to assessment, 71 cases of malignant disease were diagnosed (ductal carcinoma in situ, n = 24; invasive cancer, n = 47). The remaining 729 patients were returned to biennial screening. Of these 729 patients, seven had false-negative findings that were identified in the subsequent screening round. Use of the highest reader recall method, in which a patient is recalled if her findings are deemed abnormal by either reader, could potentially increase the cancer detection rate by 0.6 per 1000 women screened but would increase the recall rate by 12.69% and the number of false-positive findings by 15.37%. The consensus panel identified 71 (7.33%) of 968 cancers diagnosed. Consensus review substantially reduced the number of cases recalled and was associated with a low false-negative rate.

  8. How frequently do allegations of scientific misconduct occur in ecology and evolution, and what happens afterwards?

    PubMed

    Moreno-Rueda, Gregorio

    2013-03-01

    Scientific misconduct obstructs the advance of knowledge in science. Its impact in some disciplines is still poorly known, as is the frequency with which it is detected. Here, I examine how frequently editors of ecology and evolution journals detect scientific misconduct. On average, editors managed 0.114 allegations of misconduct per year. Editors considered 6 of 14 allegations (42.9%) to be true, but only in 2 cases were the authors declared guilty, the remaining being dropped for lack of proof. The annual rate of allegations that were probably warranted was 0.053, although the rate of demonstrated misconduct was 0.018, while the rate of false or erroneous allegations was 0.024. Considering that several cases of misconduct are probably not reported, these findings suggest that editors detect less than one-third of all fraudulent papers.

  9. Earthquake Fingerprints: Representing Earthquake Waveforms for Similarity-Based Detection

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Beroza, G. C.

    2016-12-01

    New earthquake detection methods, such as Fingerprint and Similarity Thresholding (FAST), use fast approximate similarity search to identify similar waveforms in long-duration data without templates (Yoon et al. 2015). These methods have two key components: fingerprint extraction and an efficient search algorithm. Fingerprint extraction converts waveforms into fingerprints, compact signatures that represent short-duration waveforms for identification and search. Earthquakes are detected using an efficient indexing and search scheme, such as locality-sensitive hashing, that identifies similar waveforms in a fingerprint database. The quality of the search results, and thus the earthquake detection results, is strongly dependent on the fingerprinting scheme. Fingerprint extraction should map similar earthquake waveforms to similar waveform fingerprints to ensure a high detection rate, even under additive noise and small distortions. Additionally, fingerprints corresponding to noise intervals should be mutually dissimilar to minimize false detections. In this work, we compare the performance of multiple fingerprint extraction approaches for the earthquake waveform similarity search problem. We apply existing audio fingerprinting (used in content-based audio identification systems) and time series indexing techniques and present modified versions that are specifically adapted for seismic data. We also explore data-driven fingerprinting approaches that can take advantage of labeled or unlabeled waveform data. For each fingerprinting approach we measure its ability to identify similar waveforms in a low signal-to-noise setting, and quantify the trade-off between true and false detection rates in the presence of persistent noise sources. We compare the performance using known event waveforms from eight independent stations in the Northern California Seismic Network.

  10. Tactical Conflict Detection in Terminal Airspace

    NASA Technical Reports Server (NTRS)

    Tang, Huabin; Robinson, John E.; Denery, Dallas G.

    2010-01-01

    Air traffic systems have long relied on automated short-term conflict prediction algorithms to warn controllers of impending conflicts (losses of separation). The complexity of terminal airspace has proven difficult for such systems as it often leads to excessive false alerts. Thus, the legacy system, called Conflict Alert, which provides short-term alerts in both en-route and terminal airspace currently, is often inhibited or degraded in areas where frequent false alerts occur, even though the alerts are provided only when an aircraft is in dangerous proximity of other aircraft. This research investigates how a minimal level of flight intent information may be used to improve short-term conflict detection in terminal airspace such that it can be used by the controller to maintain legal aircraft separation. The flight intent information includes a site-specific nominal arrival route and inferred altitude clearances in addition to the flight plan that includes the RNAV (Area Navigation) departure route. A new tactical conflict detection algorithm is proposed, which uses a single analytic trajectory, determined by the flight intent and the current state information of the aircraft, and includes a complex set of current, dynamic separation standards for terminal airspace to define losses of separation. The new algorithm is compared with an algorithm that imitates a known en-route algorithm and another that imitates Conflict Alert by analysis of false-alert rate and alert lead time with recent real-world data of arrival and departure operations and a large set of operational error cases from Dallas/Fort Worth TRACON (Terminal Radar Approach Control). The new algorithm yielded a false-alert rate of two per hour and an average alert lead time of 38 seconds.

  11. AdaBoost-based algorithm for network intrusion detection.

    PubMed

    Hu, Weiming; Hu, Wei; Maybank, Steve

    2008-04-01

    Network intrusion detection aims at distinguishing the attacks on the Internet from normal use of the Internet. It is an indispensable part of the information security system. Due to the variety of network behaviors and the rapid development of attack fashions, it is necessary to develop fast machine-learning-based intrusion detection algorithms with high detection rates and low false-alarm rates. In this correspondence, we propose an intrusion detection algorithm based on the AdaBoost algorithm. In the algorithm, decision stumps are used as weak classifiers. The decision rules are provided for both categorical and continuous features. By combining the weak classifiers for continuous features and the weak classifiers for categorical features into a strong classifier, the relations between these two different types of features are handled naturally, without any forced conversions between continuous and categorical features. Adaptable initial weights and a simple strategy for avoiding overfitting are adopted to improve the performance of the algorithm. Experimental results show that our algorithm has low computational complexity and error rates, as compared with algorithms of higher computational complexity, as tested on the benchmark sample data.
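
    A compact sketch of the classifier structure described above (AdaBoost over decision stumps); the randomly generated features and labels are stand-ins for network-connection data, and the number of boosting rounds is an arbitrary choice.

    # AdaBoost with decision stumps on synthetic stand-in data. scikit-learn's
    # default weak learner for AdaBoostClassifier is a depth-1 tree (a stump).
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 10))
    y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # toy "attack" label

    clf = AdaBoostClassifier(n_estimators=50)       # 50 boosted decision stumps
    clf.fit(X[:400], y[:400])
    print("test accuracy:", clf.score(X[400:], y[400:]))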

  12. Detection of Foreign Matter in Transfusion Solution Based on Gaussian Background Modeling and an Optimized BP Neural Network

    PubMed Central

    Zhou, Fuqiang; Su, Zhen; Chai, Xinghua; Chen, Lipeng

    2014-01-01

    This paper proposes a new method to detect and identify foreign matter mixed in a plastic bottle filled with transfusion solution. A spin-stop mechanism and mixed illumination style are applied to obtain high contrast images between moving foreign matter and a static transfusion background. The Gaussian mixture model is used to model the complex background of the transfusion image and to extract moving objects. A set of features of moving objects are extracted and selected by the ReliefF algorithm, and optimal feature vectors are fed into the back propagation (BP) neural network to distinguish between foreign matter and bubbles. The mind evolutionary algorithm (MEA) is applied to optimize the connection weights and thresholds of the BP neural network to obtain a higher classification accuracy and faster convergence rate. Experimental results show that the proposed method can effectively detect visible foreign matter in 250-mL transfusion bottles. The misdetection rate and false alarm rate are low, and the detection accuracy and detection speed are satisfactory. PMID:25347581
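
    The moving-object extraction step (Gaussian mixture background modeling) can be sketched with OpenCV's MOG2 background subtractor on a synthetic frame sequence; the frame sizes, the simulated moving "particle", and the subtractor parameters are illustrative, and the ReliefF feature selection and BP-network classification stages are not shown.

    # Gaussian mixture background subtraction (OpenCV MOG2) on synthetic frames
    # with a small bright object moving across a static, noisy background.
    import numpy as np
    import cv2

    subtractor = cv2.createBackgroundSubtractorMOG2(history=50, varThreshold=16)

    for t in range(60):
        frame = np.full((120, 160), 40, dtype=np.uint8)                # static background
        frame += np.random.randint(0, 5, frame.shape, dtype=np.uint8)  # sensor noise
        x = 10 + 2 * t
        if x < 150:
            cv2.circle(frame, (x, 60), 3, 255, -1)                     # moving foreign matter
        mask = subtractor.apply(frame)                                 # foreground mask

    print("foreground pixels in last frame:", int(np.count_nonzero(mask)))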

  13. Machine learning for real time remote detection

    NASA Astrophysics Data System (ADS)

    Labbé, Benjamin; Fournier, Jérôme; Henaff, Gilles; Bascle, Bénédicte; Canu, Stéphane

    2010-10-01

    Infrared systems are key to providing enhanced capabilities to military forces, such as automatic threat control and protection from air, naval and ground attacks. Key requirements for such a system to produce operational benefits are real-time processing as well as high efficiency in terms of detection and false alarm rate. These are serious issues since the system must deal with a large number of objects and categories to be recognized (small vehicles, armored vehicles, planes, buildings, etc.). Statistical-learning-based algorithms are promising candidates to meet these requirements when combined with selected discriminant features and real-time implementation. This paper proposes a new decision architecture benefiting from recent advances in machine learning by using an effective method for level set estimation. While building the decision function, the proposed approach performs variable selection based on a discriminative criterion. Moreover, the use of level sets makes it possible to reject unknown or ambiguous objects, thus preserving the false alarm rate. Experimental evidence reported on real-world infrared images demonstrates the validity of our approach.

  14. Mixture models for detecting differentially expressed genes in microarrays.

    PubMed

    Jones, Liat Ben-Tovim; Bean, Richard; McLachlan, Geoffrey J; Zhu, Justin Xi

    2006-10-01

    An important and common problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. As this problem concerns the selection of significant genes from a large pool of candidate genes, it needs to be carried out within the framework of multiple hypothesis testing. In this paper, we focus on the use of mixture models to handle the multiplicity issue. With this approach, a measure of the local FDR (false discovery rate) is provided for each gene. An attractive feature of the mixture model approach is that it provides a framework for the estimation of the prior probability that a gene is not differentially expressed, and this probability can subsequently be used in forming a decision rule. The rule can also be formed to take the false negative rate into account. We apply this approach to a well-known publicly available data set on breast cancer, and discuss our findings with reference to other approaches.
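
    A small sketch of the local FDR computation under a two-component mixture, where the marginal density f(z) = pi0*f0(z) + (1 - pi0)*f1(z) and the local FDR at z is pi0*f0(z)/f(z); the kernel density estimate for f and the simple plug-in estimate of pi0 below are illustrative choices, not the fitting procedure used in the paper.

    # Local FDR under a two-component mixture: f0 is the theoretical N(0,1) null,
    # f is estimated by a Gaussian KDE, and pi0 by a crude plug-in estimator.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    z = np.concatenate([rng.normal(0, 1, 900),        # non-differentially expressed
                        rng.normal(3, 1, 100)])       # differentially expressed

    f_hat = stats.gaussian_kde(z)                     # marginal density estimate
    pi0 = min(1.0, np.mean(np.abs(z) < 1) / (stats.norm.cdf(1) - stats.norm.cdf(-1)))

    local_fdr = pi0 * stats.norm.pdf(z) / f_hat(z)
    print("genes with local FDR < 0.2:", int(np.sum(local_fdr < 0.2)))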

  15. Cluster-level statistical inference in fMRI datasets: The unexpected behavior of random fields in high dimensions.

    PubMed

    Bansal, Ravi; Peterson, Bradley S

    2018-06-01

    Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control for false positive findings in these multiple hypotheses testing. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%: the high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average less than one of those very large clusters in each brain-wide analysis. Nonparametric methods, in contrast, estimated distributions from those large clusters, and therefore, by construct rejected the large clusters as false positives at the nominal FWERs. Those rejected clusters were outlying values in the distribution of cluster size but cannot be distinguished from true positive findings without further analyses, including assessing whether fMRI signal in those regions correlates with other clinical, behavioral, or cognitive measures. Rejecting the large clusters, however, significantly reduced the statistical power of nonparametric methods in detecting true findings compared with parametric methods, which would have detected most true findings that are essential for making valid biological inferences in MRI data. Parametric analyses, in contrast, detected most true findings while generating relatively few false positives: on average, less than one of those very large clusters would be deemed a true finding in each brain-wide analysis. We therefore recommend the continued use of parametric methods that model nonstationary smoothness for cluster-level, familywise control of false positives, particularly when using a Cluster Defining Threshold of 2.5 or higher, and subsequently assessing rigorously the biological plausibility of the findings, even for large clusters. Finally, because nonparametric methods yielded a large reduction in statistical power to detect true positive findings, we conclude that the modest reduction in false positive findings that nonparametric analyses afford does not warrant a re-analysis of previously published fMRI studies using nonparametric techniques. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. Sensor Data Quality and Angular Rate Down-Selection Algorithms on SLS EM-1

    NASA Technical Reports Server (NTRS)

    Park, Thomas; Smith, Austin; Oliver, T. Emerson

    2018-01-01

    The NASA Space Launch System Block 1 launch vehicle is equipped with an Inertial Navigation System (INS) and multiple Rate Gyro Assemblies (RGA) that are used in the Guidance, Navigation, and Control (GN&C) algorithms. The INS provides the inertial position, velocity, and attitude of the vehicle along with both angular rate and specific force measurements. Additionally, multiple sets of co-located rate gyros supply angular rate data. The collection of angular rate data, taken along the launch vehicle, is used to separate out vehicle motion from flexible body dynamics. Since the system architecture uses redundant sensors, the capability was developed to evaluate the health (or validity) of the independent measurements. A suite of Sensor Data Quality (SDQ) algorithms is responsible for assessing the angular rate data from the redundant sensors. When failures are detected, SDQ will take the appropriate action and disqualify or remove faulted sensors from forward processing. Additionally, the SDQ algorithms contain logic for down-selecting the angular rate data used by the GNC software from the set of healthy measurements. This paper explores the trades and analyses that were performed in selecting a set of robust fault-detection algorithms included in the GN&C flight software. These trades included both an assessment of hardware-provided health and status data as well as an evaluation of different algorithms based on time-to-detection, type of failures detected, and probability of detecting false positives. We then provide an overview of the algorithms used for both fault-detection and measurement down selection. We next discuss the role of trajectory design, flexible-body models, and vehicle response to off-nominal conditions in setting the detection thresholds. Lastly, we present lessons learned from software integration and hardware-in-the-loop testing.

  17. An economic evaluation of second-trimester genetic ultrasonography for prenatal detection of down syndrome.

    PubMed

    Vintzileos, A M; Ananth, C V; Fisher, A J; Smulian, J C; Day-Salvatore, D; Beazoglou, T; Knuppel, R A

    1998-11-01

    The objective of this study was to perform an economic evaluation of second-trimester genetic ultrasonography for prenatal detection of Down syndrome. More specifically, we sought to determine the following: (1) the diagnostic accuracy requirements (from the cost-benefit point of view) of genetic ultrasonography versus genetic amniocentesis for women at increased risk for fetal Down syndrome and (2) the possible economic impact of second-trimester genetic ultrasonography for the US population on the basis of the ultrasonographic accuracies reported in previously published studies. A cost-benefit equation was developed from the hypothesis that the cost of universal genetic amniocentesis of patients at increased risk for carrying a fetus with Down syndrome should be at least equal to the cost of universal genetic ultrasonography with amniocentesis used only for those with abnormal ultrasonographic results. The main components of the equation included the diagnostic accuracy of genetic ultrasonography (sensitivity and specificity for detecting Down syndrome), the costs of the amniocentesis package and genetic ultrasonography, and the lifetime cost of Down syndrome cases not detected by the genetic ultrasonography. After appropriate manipulation of the equation a graph was constructed, representing the balance between sensitivity and false-positive rate of genetic ultrasonography; this was used to examine the accuracy of previously published studies from the cost-benefit point of view. Sensitivity analyses included individual risks for Down syndrome ranging from 1:261 (risk of a 35-year-old at 18 weeks' gestation) to 1:44 (risk of a 44-year-old at 18 weeks' gestation). This economic evaluation was conducted from the societal perspective. Genetic ultrasonography was found to be economically beneficial only if the overall sensitivity for detecting Down syndrome was >74%. Even then, the cost-benefit ratio depended on the corresponding false-positive rate. Of the 7 published studies that used multiple ultrasonographic markers for genetic ultrasonography, 6 had accuracies compatible with benefits. The required ultrasonographic accuracy (sensitivity and false-positive rate) varied according to the prevalence of Down syndrome in the population tested. The cost-benefit ratio of second-trimester genetic ultrasonography depends on its diagnostic accuracy, and it is beneficial only when its overall sensitivity for Down syndrome is >74%.
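
    A worked toy version of the cost-benefit comparison described above, with entirely hypothetical costs and performance figures: universal amniocentesis is weighed against universal genetic ultrasonography followed by amniocentesis only for positive scans, plus the lifetime cost of Down syndrome cases the ultrasound misses. The numbers below are placeholders and do not reproduce the paper's cost-benefit equation or its 74% break-even sensitivity.

    # Hypothetical cost comparison of the two screening strategies described above.
    def strategy_costs(n, prevalence, sens, fpr,
                       c_amnio, c_ultrasound, c_missed_case):
        affected = n * prevalence
        universal_amnio = n * c_amnio
        positives = sens * affected + fpr * (n - affected)
        ultrasound_first = (n * c_ultrasound
                            + positives * c_amnio
                            + (1 - sens) * affected * c_missed_case)
        return universal_amnio, ultrasound_first

    # hypothetical inputs: 10,000 screened women at the 1:261 risk level
    ua, uf = strategy_costs(10_000, 1 / 261, sens=0.80, fpr=0.10,
                            c_amnio=1_500, c_ultrasound=200, c_missed_case=450_000)
    print(f"universal amniocentesis: ${ua:,.0f}  ultrasound-first: ${uf:,.0f}")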

  18. High-throughput discovery of rare human nucleotide polymorphisms by Ecotilling

    PubMed Central

    Till, Bradley J.; Zerr, Troy; Bowers, Elisabeth; Greene, Elizabeth A.; Comai, Luca; Henikoff, Steven

    2006-01-01

    Human individuals differ from one another at only ∼0.1% of nucleotide positions, but these single nucleotide differences account for most heritable phenotypic variation. Large-scale efforts to discover and genotype human variation have been limited to common polymorphisms. However, these efforts overlook rare nucleotide changes that may contribute to phenotypic diversity and genetic disorders, including cancer. Thus, there is an increasing need for high-throughput methods to robustly detect rare nucleotide differences. Toward this end, we have adapted the mismatch discovery method known as Ecotilling for the discovery of human single nucleotide polymorphisms. To increase throughput and reduce costs, we developed a universal primer strategy and implemented algorithms for automated band detection. Ecotilling was validated by screening 90 human DNA samples for nucleotide changes in 5 gene targets and by comparing results to public resequencing data. To increase throughput for discovery of rare alleles, we pooled samples 8-fold and found Ecotilling to be efficient relative to resequencing, with a false negative rate of 5% and a false discovery rate of 4%. We identified 28 new rare alleles, including some that are predicted to damage protein function. The detection of rare damaging mutations has implications for models of human disease. PMID:16893952

  19. cn.FARMS: a latent variable model to detect copy number variations in microarray data with a low false discovery rate.

    PubMed

    Clevert, Djork-Arné; Mitterecker, Andreas; Mayr, Andreas; Klambauer, Günter; Tuefferd, Marianne; De Bondt, An; Talloen, Willem; Göhlmann, Hinrich; Hochreiter, Sepp

    2011-07-01

    Cost-effective oligonucleotide genotyping arrays like the Affymetrix SNP 6.0 are still the predominant technique for measuring DNA copy number variations (CNVs). However, CNV detection methods for microarrays overestimate both the number and the size of CNV regions and, consequently, suffer from a high false discovery rate (FDR). A high FDR means that many wrongly detected CNVs will not be associated with a disease in a clinical study, yet correction for multiple testing must still take them into account, thereby decreasing the study's discovery power. To control the FDR, we propose a probabilistic latent variable model, 'cn.FARMS', which is optimized by a Bayesian maximum a posteriori approach. cn.FARMS controls the FDR through the information gain of the posterior over the prior. The prior represents the null hypothesis of copy number 2 for all samples, from which the posterior can deviate only through strong and consistent signals in the data. On HapMap data, cn.FARMS clearly outperformed the two most prevalent methods with respect to sensitivity and FDR. The software cn.FARMS is publicly available as an R package at http://www.bioinf.jku.at/software/cnfarms/cnfarms.html.
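
    The information-gain idea can be pictured with a toy calculation: score each candidate region by the Kullback-Leibler divergence of a Gaussian posterior over copy number from a prior centered at copy number 2, and keep only regions whose gain exceeds a cutoff. This is only a conceptual sketch under those assumptions, not the cn.FARMS model or its R implementation; all numbers are invented.

        # Illustrative sketch (not the cn.FARMS implementation): score a candidate
        # region by the information gain (KL divergence) of a Gaussian posterior
        # over copy number relative to a prior fixed at copy number 2, and report
        # only regions whose gain exceeds a threshold. All numbers are made up.
        import math

        def gaussian_kl(mu_q, var_q, mu_p, var_p):
            """KL(q || p) between two univariate Gaussians."""
            return 0.5 * (math.log(var_p / var_q)
                          + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

        PRIOR_MU, PRIOR_VAR = 2.0, 0.25      # null hypothesis: copy number 2

        candidates = {                        # region -> (posterior mean, posterior variance)
            "chr1:1.2Mb": (2.05, 0.24),       # weak, inconsistent signal
            "chr8:42.7Mb": (3.10, 0.05),      # strong, consistent gain
        }

        THRESHOLD = 0.5                       # arbitrary information-gain cutoff
        for region, (mu, var) in candidates.items():
            gain = gaussian_kl(mu, var, PRIOR_MU, PRIOR_VAR)
            status = "call CNV" if gain > THRESHOLD else "discard"
            print(f"{region}: information gain {gain:.2f} -> {status}")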

  20. An Adaptive Ship Detection Algorithm for Hrws SAR Images Under Complex Background: Application to SENTINEL1A Data

    NASA Astrophysics Data System (ADS)

    He, G.; Xia, Z.; Chen, H.; Li, K.; Zhao, Z.; Guo, Y.; Feng, P.

    2018-04-01

    Real-time ship detection using synthetic aperture radar (SAR) plays a vital role in disaster response and maritime security. High-resolution wide-swath (HRWS) SAR imagery in particular offers high resolution and wide coverage simultaneously, significantly improving wide-area ocean surveillance. In this study, a novel method is developed for ship target detection in HRWS SAR images. First, an adaptive sliding window is applied to propose suspected ship target areas based on an analysis of the SAR backscattering intensity image. Then, backscattering intensity and texture features extracted from training samples of manually selected ship and non-ship image slices are used to train a support vector machine (SVM) that classifies the proposed ship slices. The approach is verified on Sentinel-1A data acquired in interferometric wide swath mode. The results demonstrate the improved performance of the proposed method over the constant false alarm rate (CFAR) method: classification accuracy improved from 88.5% to 96.4% and the false alarm rate fell from 11.5% to 3.6%.
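
    A minimal sketch of the second stage, assuming scikit-learn and invented intensity/texture feature values, could look as follows; it is not the authors' implementation.

        # Minimal sketch of the SVM stage described above: train on
        # intensity/texture features from labelled ship and non-ship slices,
        # then classify a new candidate slice. Feature names and values are invented.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        # columns: [mean backscatter, backscatter variance, texture contrast]
        ship_feats = rng.normal([0.8, 0.30, 0.6], 0.05, size=(50, 3))
        clutter_feats = rng.normal([0.3, 0.05, 0.2], 0.05, size=(50, 3))
        X = np.vstack([ship_feats, clutter_feats])
        y = np.array([1] * 50 + [0] * 50)     # 1 = ship, 0 = non-ship

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        clf.fit(X, y)

        candidate = np.array([[0.75, 0.28, 0.55]])   # features of a proposed slice
        print("ship" if clf.predict(candidate)[0] == 1 else "false alarm")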

  1. Stochastic resonance-enhanced laser-based particle detector.

    PubMed

    Dutta, A; Werner, C

    2009-01-01

    This paper presents a laser-based particle detector whose response was enhanced by modulating the laser diode with a white-noise generator. A laser sheet was generated to cast a shadow of the object on a 200 dots-per-inch, 512 x 1 pixel linear sensor array. The laser diode was modulated with a white-noise generator to achieve stochastic resonance. The white-noise generator essentially amplified the wide-bandwidth (several hundred MHz) noise produced by a reverse-biased Zener diode operating in junction-breakdown mode. The gain of the amplifier in the white-noise generator was set such that the receiver operating characteristic plot provided the best discriminability. A monofiber 40 AWG (approximately 80 microm) wire was detected with approximately 88% true-positive rate and approximately 19% false-positive rate in the presence of white-noise modulation, and with approximately 71% true-positive rate and approximately 15% false-positive rate in its absence.

  2. Probabilistic resident space object detection using archival THEMIS fluxgate magnetometer data

    NASA Astrophysics Data System (ADS)

    Brew, Julian; Holzinger, Marcus J.

    2018-05-01

    Ground-based optical and radar measurements have recently demonstrated viable detection of small space objects at geosynchronous altitudes. In general, however, these methods are limited to objects larger than 10 cm. This paper examines the use of magnetometers to detect plausible flyby encounters with charged space objects using a matched-filter, signal-existence binary hypothesis test. Relevant processing and reduction of archival fluxgate magnetometer data from the NASA THEMIS mission are discussed in detail. Using the proposed methodology and a false alarm rate of 10%, 285 plausible detections with probability of detection greater than 80% are claimed, and several are reviewed in detail.
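
    The signal-existence test can be sketched generically as a sliding matched filter on a one-dimensional series, with the detection threshold set from the empirical noise-only distribution at a 10% per-window false alarm rate; the template, injected signal, and threshold choice below are illustrative assumptions, not the THEMIS processing chain.

        # Hedged sketch of a matched-filter existence test on a 1-D
        # magnetometer-like time series: correlate the data with a flyby
        # template, then declare a detection when the statistic exceeds a
        # threshold chosen from the empirical noise distribution for a
        # 10% per-window false alarm rate. Signal shapes are synthetic.
        import numpy as np

        rng = np.random.default_rng(1)
        n, sigma = 2_000, 1.0
        template = np.exp(-0.5 * np.linspace(-3, 3, 60) ** 2)   # Gaussian-like perturbation
        template /= np.linalg.norm(template)

        noise = rng.normal(0.0, sigma, n)
        signal = noise.copy()
        signal[1_000:1_060] += 3.0 * template                    # inject one "flyby"

        def mf_statistic(x, h):
            """Sliding matched-filter output (correlation with the template)."""
            return np.correlate(x, h, mode="valid")

        # Threshold from noise-only data at a 10% per-window false alarm rate.
        noise_stat = mf_statistic(noise, template)
        threshold = np.quantile(noise_stat, 0.90)

        stat = mf_statistic(signal, template)
        detections = np.flatnonzero(stat > threshold)
        print(f"threshold={threshold:.2f}, windows above threshold: {detections.size}")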

  3. Maternal cfDNA screening for Down syndrome--a cost sensitivity analysis.

    PubMed

    Cuckle, Howard; Benn, Peter; Pergament, Eugene

    2013-07-01

    This study aimed to determine the principal factors contributing to the cost of avoiding a birth with Down syndrome by using cell-free DNA (cfDNA) to replace conventional screening. A range of unit costs was assigned to each item in the screening process. Detection rates were estimated by meta-analysis and modeling. The marginal cost associated with the detection of additional cases using cfDNA was estimated as the difference in average costs divided by the difference in detection. The main factor was the unit cost of cfDNA testing. For example, when a combined test costing $150 with a 3% false-positive rate and invasive testing at $1000 is replaced by cfDNA tests costing $2000, $1500, $1000, and $500, the marginal cost is $8.0m, $5.8m, $3.6m, and $1.4m, respectively. Costs were lower when replacing a quadruple test and higher for a 5% false-positive rate, but the relative importance of the cfDNA unit cost was unchanged. A contingent policy whereby 10% to 20% of women were selected for cfDNA testing by conventional screening was considerably more cost-efficient. Costs were sensitive to cfDNA uptake. Universal cfDNA screening for Down syndrome will only become affordable to public health purchasers if costs fall substantially. Until this happens, the contingent use of cfDNA is recommended. © 2013 John Wiley & Sons, Ltd.
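
    A back-of-the-envelope version of the marginal-cost calculation (difference in average cost divided by difference in detection) is sketched below; the detection rates, prevalence, referral rates, and function names are placeholder assumptions rather than the study's modelled inputs.

        # Back-of-the-envelope sketch of the marginal-cost comparison described
        # above: (difference in average cost per pregnancy) divided by (difference
        # in Down syndrome cases detected). All numbers are placeholder assumptions.
        def average_cost(screen_cost, referral_rate, invasive_cost):
            """Average cost per woman screened: screening plus follow-up invasive tests."""
            return screen_cost + referral_rate * invasive_cost

        def marginal_cost_per_case(n_women, prevalence,
                                   cost_a, referral_a, det_a,
                                   cost_b, referral_b, det_b,
                                   invasive_cost=1_000):
            cases = n_women * prevalence
            extra_cost = n_women * (average_cost(cost_b, referral_b, invasive_cost)
                                    - average_cost(cost_a, referral_a, invasive_cost))
            extra_detections = cases * (det_b - det_a)
            return extra_cost / extra_detections

        # Combined test ($150, 3% referral, ~85% detection) vs cfDNA at $1,000 (~99% detection)
        mc = marginal_cost_per_case(n_women=100_000, prevalence=1 / 500,
                                    cost_a=150, referral_a=0.03, det_a=0.85,
                                    cost_b=1_000, referral_b=0.003, det_b=0.99)
        print(f"marginal cost per additional case detected: ${mc:,.0f}")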

  4. Directed area search using socio-biological vision algorithms and cognitive Bayesian reasoning

    NASA Astrophysics Data System (ADS)

    Medasani, S.; Owechko, Y.; Allen, D.; Lu, T. C.; Khosla, D.

    2010-04-01

    Volitional search systems that assist the analyst by searching for specific targets or objects, such as vehicles, factories, and airports, in wide-area overhead imagery need to overcome multiple problems present in current manual and automatic approaches. These problems include finding targets hidden in terabytes of information, relatively few pixels on target, long intervals between interesting regions, time-consuming analysis requiring many analysts, no a priori representative examples or templates of interest, the need to detect multiple classes of objects, and the need for very high detection rates with very low false alarm rates. This paper describes a conceptual analyst-centric framework that utilizes existing technology modules to search for and locate occurrences of targets of interest (e.g., buildings, mobile targets of military significance, factories, nuclear plants) in video imagery of large areas. Our framework takes simple queries from the analyst and finds the queried targets with minimal interaction from the analyst. It uses a hybrid approach that combines biologically inspired bottom-up attention, socio-biologically inspired object recognition for volitionally recognizing targets, and hierarchical Bayesian networks for modeling and representing domain knowledge. This approach offers high accuracy and a low false alarm rate, and it can handle both low-level visual information and high-level domain knowledge in a single framework. Such a system would be of immense help for search and rescue efforts, intelligence gathering, change detection systems, and other surveillance systems.

  5. Brain fingerprinting field studies comparing P300-MERMER and P300 brainwave responses in the detection of concealed information.

    PubMed

    Farwell, Lawrence A; Richardson, Drew C; Richardson, Graham M

    2013-08-01

    Brain fingerprinting detects concealed information stored in the brain by measuring brainwave responses. We compared P300 and P300-MERMER event-related brain potentials for error rate/accuracy and statistical confidence in four field/real-life studies. 76 tests detected the presence or absence of information regarding (1) real-life events including felony crimes; (2) real crimes with substantial consequences (either a judicial outcome, i.e., evidence admitted in court, or a $100,000 reward for beating the test); (3) knowledge unique to FBI agents; and (4) knowledge unique to explosives (EOD/IED) experts. With both P300 and P300-MERMER, the error rate was 0%: determinations were 100% accurate, with no false negatives, false positives, or indeterminates. Countermeasures had no effect. Median statistical confidence for determinations was 99.9% with P300-MERMER and 99.6% with P300. Brain fingerprinting methods and scientific standards for laboratory and field applications are discussed. Major differences in methods that produce different results are identified. Markedly different methods in other studies have produced error rates over 10 times higher and statistical confidences markedly lower than those of these studies, our previous studies, and independent replications. Data support the hypothesis that accuracy, reliability, and validity depend on following the brain fingerprinting scientific standards outlined herein.

  6. Improvement of the sentinel lymph node detection rate of cervical sentinel lymph node biopsy using real-time fluorescence navigation with indocyanine green in head and neck skin cancer.

    PubMed

    Nakamura, Yasuhiro; Fujisawa, Yasuhiro; Nakamura, Yoshiyuki; Maruyama, Hiroshi; Furuta, Jun-ichi; Kawachi, Yasuhiro; Otsuka, Fujio

    2013-06-01

    The standard technique using lymphoscintigraphy, blue dye and a gamma probe has established a reliable method for sentinel node biopsy for skin cancer. However, the detection rate of cervical sentinel lymph nodes (SLN) is generally lower than that of inguinal or axillary SLN because of the complexity of lymphatic drainage in the head and neck region and the "shine-through" phenomenon. Recently, indocyanine green fluorescence imaging has been reported as a new method to detect SLN. We hypothesized that fluorescence navigation with indocyanine green in combination with the standard technique would improve the detection rate of cervical sentinel nodes. We performed cervical sentinel node biopsies using the standard technique in 20 basins of 18 patients (group A) and using fluorescence navigation in combination with the standard technique in 12 basins of 16 patients (group B). The mean number of sentinel nodes was two per basin (range, 1-4) in group A and three per basin (range, 1-5) in group B. The detection rate of sentinel nodes was 83% (29/35) in group A and 95% (36/38) in group B. The false-negative rate was 6% (1/18 patients) in group A and 0% in group B. Fluorescence navigation with indocyanine green may improve the cervical sentinel node detection rate. However, greater collection of data regarding the usefulness of cervical sentinel node biopsy using indocyanine green is necessary. © 2013 Japanese Dermatological Association.

  7. Assessment of Data and Knowledge Fusion Strategies for Diagnostics and Prognostics

    DTIC Science & Technology

    2001-04-05

    ... prognostic technologies has proven effective in reducing false alarm rates, increasing confidence levels in early fault detection, and predicting time ... or better than the sum of the parts. Specific to health management, this means reduced uncertainty in current condition assessment ... time synchronous averaged vibration features. [Figure 1: Fusion Application Areas]

  8. A gradient-boosting approach for filtering de novo mutations in parent-offspring trios.

    PubMed

    Liu, Yongzhuang; Li, Bingshan; Tan, Renjie; Zhu, Xiaolin; Wang, Yadong

    2014-07-01

    Whole-genome and -exome sequencing on parent-offspring trios is a powerful approach to identifying disease-associated genes by detecting de novo mutations in patients. Accurate detection of de novo mutations from sequencing data is a critical step in trio-based genetic studies. Existing bioinformatic approaches usually yield high error rates due to sequencing artifacts and alignment issues, which may either miss true de novo mutations or call too many false ones, making downstream validation and analysis difficult. In particular, current approaches have much worse specificity than sensitivity, and developing effective filters to discriminate genuine from spurious de novo mutations remains an unsolved challenge. In this article, we curated 59 sequence features in whole genome and exome alignment context which are considered to be relevant to discriminating true de novo mutations from artifacts, and then employed a machine-learning approach to classify candidates as true or false de novo mutations. Specifically, we built a classifier, named De Novo Mutation Filter (DNMFilter), using gradient boosting as the classification algorithm. We built the training set using experimentally validated true and false de novo mutations as well as collected false de novo mutations from an in-house large-scale exome-sequencing project. We evaluated DNMFilter's theoretical performance and investigated relative importance of different sequence features on the classification accuracy. Finally, we applied DNMFilter on our in-house whole exome trios and one CEU trio from the 1000 Genomes Project and found that DNMFilter could be coupled with commonly used de novo mutation detection approaches as an effective filtering approach to significantly reduce false discovery rate without sacrificing sensitivity. The software DNMFilter implemented using a combination of Java and R is freely available from the website at http://humangenome.duke.edu/software. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
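
    The filtering idea, trained on labelled true and false de novo calls, can be sketched with scikit-learn's gradient boosting classifier as below; the feature set and data are invented for illustration and this is not the DNMFilter code.

        # Sketch of the filtering idea (not the DNMFilter implementation): train a
        # gradient boosting classifier on alignment-context features of validated
        # true and false de novo mutation calls, then score new candidates.
        # Feature names and data are invented for illustration.
        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(7)
        n = 600
        # columns: [child allele balance, parental alt reads, mapping quality, strand bias]
        true_dnm = np.column_stack([rng.normal(0.5, 0.08, n), rng.poisson(0.2, n),
                                    rng.normal(58, 3, n), rng.normal(0.0, 0.5, n)])
        false_dnm = np.column_stack([rng.normal(0.25, 0.10, n), rng.poisson(2.5, n),
                                     rng.normal(45, 8, n), rng.normal(1.5, 0.8, n)])
        X = np.vstack([true_dnm, false_dnm])
        y = np.array([1] * n + [0] * n)                 # 1 = true de novo, 0 = artifact

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05, max_depth=3)
        clf.fit(X_tr, y_tr)

        scores = clf.predict_proba(X_te)[:, 1]          # probability a call is a true DNM
        print(f"held-out AUC: {roc_auc_score(y_te, scores):.3f}")
        print("feature importances:", np.round(clf.feature_importances_, 2))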

  9. Automated detection of lung nodules with three-dimensional convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Pérez, Gustavo; Arbeláez, Pablo

    2017-11-01

    Lung cancer is the cancer type with the highest mortality rate worldwide. It has been shown that early detection with computed tomography (CT) scans can reduce deaths caused by this disease. Manual detection of cancer nodules is costly and time-consuming. We present a general framework for the detection of nodules in lung CT images. Our method pre-processes a patient's CT with filtering and extracts the lungs from the entire volume using a previously calculated per-patient mask. From the extracted lungs, we perform a candidate generation stage using morphological operations, followed by the training of a three-dimensional convolutional neural network for feature representation and classification of the extracted candidates for false positive reduction. We perform experiments on the publicly available LIDC-IDRI dataset. Our candidate extraction approach produces precise candidates with a recall of 99.6%. In addition, the false positive reduction stage successfully classifies candidates and increases precision by a factor of 7.000.
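
    A minimal PyTorch sketch of the false-positive-reduction stage is shown below: a small 3-D convolutional network classifying candidate patches as nodule versus non-nodule. The architecture, patch size, and layer sizes are assumptions for illustration, not the authors' network.

        # Minimal 3-D CNN sketch for candidate classification (nodule vs non-nodule).
        # Architecture and 32^3 patch size are illustrative assumptions.
        import torch
        import torch.nn as nn

        class CandidateClassifier(nn.Module):
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool3d(2),                      # 32^3 -> 16^3
                    nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool3d(2),                      # 16^3 -> 8^3
                )
                self.classifier = nn.Sequential(
                    nn.Flatten(),
                    nn.Linear(32 * 8 * 8 * 8, 64), nn.ReLU(),
                    nn.Linear(64, 2),                     # nodule / non-nodule logits
                )

            def forward(self, x):
                return self.classifier(self.features(x))

        model = CandidateClassifier()
        patch = torch.randn(4, 1, 32, 32, 32)             # batch of candidate patches
        logits = model(patch)
        print(logits.shape)                                # torch.Size([4, 2])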

  10. Applying a 2D based CAD scheme for detecting micro-calcification clusters using digital breast tomosynthesis images: an assessment

    NASA Astrophysics Data System (ADS)

    Park, Sang Cheol; Zheng, Bin; Wang, Xiao-Hui; Gur, David

    2008-03-01

    Digital breast tomosynthesis (DBT) has emerged as a promising imaging modality for screening mammography. However, visually detecting micro-calcification clusters depicted on DBT images is a difficult task. Computer-aided detection (CAD) schemes for detecting micro-calcification clusters depicted on mammograms can achieve high performance, and the use of CAD results can assist radiologists in detecting subtle micro-calcification clusters. In this study, we compared the performance of an available 2D based CAD scheme with one that includes a new grouping and scoring method when applied to both projection and reconstructed DBT images. We selected a dataset involving 96 DBT examinations acquired on 45 women. Each DBT image set included 11 low-dose projection images and a varying number of reconstructed image slices ranging from 18 to 87. In this dataset, 20 true-positive micro-calcification clusters were visually detected on the projection images and 40 on the reconstructed images, respectively. We first applied the CAD scheme previously developed in our laboratory to the DBT dataset. We then tested a new grouping method that defines an independent cluster by grouping the same cluster detected on different projection or reconstructed images, and we compared four scoring methods to assess CAD performance. The maximum sensitivity levels observed for the different grouping and scoring methods were 70% and 88% for the projection and reconstructed images, with maximum false-positive rates of 4.0 and 15.9 per examination, respectively. This preliminary study demonstrates that (1) among the maximum, minimum, and average CAD-generated scores, using the maximum score of the grouped cluster regions achieved the highest performance level, (2) the histogram-based scoring method is reasonably effective in reducing false-positive detections on the projection images, but overall CAD sensitivity there is lower due to the lower signal-to-noise ratio, and (3) CAD achieved both higher sensitivity and a higher false-positive rate (per examination) on the reconstructed images. We concluded that, without changing the detection threshold or performing pre-filtering to possibly increase detection sensitivity, current CAD schemes developed and optimized for 2D mammograms perform relatively poorly on DBT; they need to be re-optimized using DBT datasets, and new grouping and scoring methods need to be incorporated, before they are used on DBT examinations.
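
    The grouping-and-scoring step can be illustrated with a toy sketch that merges detections of the same cluster found on different slices by in-plane proximity and scores each group with the maximum CAD score of its members (the variant reported above as best performing); the coordinates, radius, and scores are invented.

        # Toy sketch of grouping per-slice detections into clusters and applying
        # the max-score rule. Coordinates (mm), radius, and scores are invented.
        detections = [                      # (x_mm, y_mm, slice_index, cad_score)
            (31.0, 52.5, 12, 0.61),
            (31.4, 52.1, 13, 0.74),         # same cluster, adjacent slice
            (30.9, 53.0, 14, 0.58),
            (80.2, 20.3, 40, 0.33),         # unrelated detection elsewhere
        ]

        RADIUS_MM = 5.0                     # detections closer than this form one cluster

        def group_detections(dets, radius):
            groups = []
            for d in dets:
                for g in groups:
                    gx, gy = g[0][0], g[0][1]
                    if (d[0] - gx) ** 2 + (d[1] - gy) ** 2 <= radius ** 2:
                        g.append(d)
                        break
                else:
                    groups.append([d])
            return groups

        for i, group in enumerate(group_detections(detections, RADIUS_MM)):
            score = max(d[3] for d in group)          # max-score rule
            print(f"cluster {i}: {len(group)} detections, score {score:.2f}")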

  11. Image processing improvement for optical observations of space debris with the TAROT telescopes

    NASA Astrophysics Data System (ADS)

    Thiebaut, C.; Theron, S.; Richard, P.; Blanchet, G.; Klotz, A.; Boër, M.

    2016-07-01

    CNES is involved in the Inter-Agency Space Debris Coordination Committee (IADC) and observes space debris with two robotic, fully automated ground-based telescopes (TAROT) operated by the CNRS. An image processing algorithm devoted to debris detection in geostationary orbit is implemented in the standard pipeline. However, this algorithm cannot handle images acquired in debris tracking mode, even though that mode is preferred for debris detectability. We present an algorithm improvement for this mode and give results in terms of false detection rate.

  12. Use of joint two-view information for computerized lesion detection on mammograms: improvement of microcalcification detection accuracy

    NASA Astrophysics Data System (ADS)

    Sahiner, Berkman; Gurcan, Metin N.; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Petrick, Nicholas; Helvie, Mark A.

    2002-05-01

    We are developing new techniques to improve the accuracy of computerized microcalcification detection by using the joint two-view information on craniocaudal (CC) and mediolateral-oblique (MLO) views. After cluster candidates were detected using a single-view detection technique, candidates on CC and MLO views were paired using their radial distances from the nipple. Object pairs were classified with a joint two-view classifier that used the similarity of objects in a pair. Each cluster candidate was also classified as a true microcalcification cluster or a false-positive (FP) using its single-view features. The outputs of these two classifiers were fused. A data set of 38 pairs of mammograms from our database was used to train the new detection technique. The independent test set consisted of 77 pairs of mammograms from the University of South Florida public database. At a per-film sensitivity of 70%, the FP rates were 0.17 and 0.27 with the fusion and single-view detection methods, respectively. Our results indicate that correspondence of cluster candidates on two different views provides valuable additional information for distinguishing false from true microcalcification clusters.
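
    A toy sketch of the pairing-and-fusion idea is given below: candidates on the two views are matched by radial distance from the nipple, and a candidate's single-view score is boosted by its best match on the other view. The tolerance, weights, and scores are illustrative assumptions, not the trained classifiers described above.

        # Toy sketch: pair CC and MLO candidates by radial distance from the
        # nipple and fuse scores. Distances, scores, tolerance, and the fusion
        # rule are illustrative assumptions.
        cc_candidates = [                      # (id, radial distance mm, single-view score)
            ("cc1", 42.0, 0.55),
            ("cc2", 90.0, 0.40),
        ]
        mlo_candidates = [
            ("mlo1", 43.5, 0.60),
            ("mlo2", 15.0, 0.35),
        ]

        TOL_MM = 5.0                           # allowed radial-distance mismatch

        def fused_score(cand, others, tol=TOL_MM, weight=0.5):
            """Single-view score plus a bonus from the best-matching candidate
            on the other view (zero if nothing matches within tolerance)."""
            matches = [o for o in others if abs(cand[1] - o[1]) <= tol]
            bonus = max((o[2] for o in matches), default=0.0)
            return cand[2] + weight * bonus

        for cand in cc_candidates:
            print(cand[0], round(fused_score(cand, mlo_candidates), 2))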

  13. Comparison of human observer and algorithmic target detection in nonurban forward-looking infrared imagery

    NASA Astrophysics Data System (ADS)

    Weber, Bruce A.

    2005-07-01

    We have performed an experiment that compares the performance of human observers with that of a robust algorithm for the detection of targets in difficult, nonurban forward-looking infrared imagery. Our purpose was to benchmark the comparison and document performance differences for future algorithm improvement. The scale-insensitive detection algorithm, used as a benchmark by the Night Vision Electronic Sensors Directorate for algorithm evaluation, employed a combination of contrastlike features to locate targets. Detection receiver operating characteristic curves and observer-confidence analyses were used to compare human and algorithmic responses and to gain insight into differences. The test database contained ground targets, in natural clutter, whose detectability, as judged by human observers, ranged from easy to very difficult. In general, as compared with human observers, the algorithm detected most of the same targets, but correlated confidence with correct detections poorly and produced many more false alarms at any useful level of performance. Though characterizing human performance was not the intent of this study, results suggest that previous observational experience was not a strong predictor of human performance, and that combining individual human observations by majority vote significantly reduced false-alarm rates.

  14. A prospective clinical trial to compare the performance of dried blood spots prenatal screening for Down's syndrome with conventional non-invasive testing technology.

    PubMed

    Hu, Huiying; Jiang, Yulin; Zhang, Minghui; Liu, Shanying; Hao, Na; Zhou, Jing; Liu, Juntao; Zhang, Xiaojin; Ma, Liangkun

    2017-03-01

    The aims were to evaluate, side by side, the efficiency of dried blood spot (DBS) screening against serum screening for Down's syndrome and then to construct a two-tier strategy in which secondary fetal cell-free DNA (cfDNA) screening is applied to the high-risk women flagged by the primary blood testing, yielding a practical tactic for identifying fetal Down's syndrome. One thousand eight hundred and thirty-seven low-risk Chinese women with singleton pregnancies were enrolled in the study. Alpha-fetoprotein and free beta human chorionic gonadotropin were measured in serum as well as in the parallel DBS samples. A subset of the high-risk pregnant women identified by primary blood testing (n = 38) also underwent secondary cfDNA screening. Diagnostic amniocentesis was used to confirm the screening results. The true positive rate for Down's syndrome detection was 100% for both blood screening methods, whereas the false-positive rate was 3.0% for DBS and 4.0% for serum screening. DBS correlated well with serum screening for Down's syndrome detection. Three of the 38 primary high-risk women displayed chromosomal abnormalities on cfDNA analysis, which were confirmed by amniocentesis. Both the detection rate and the false-positive rate for Down's syndrome were comparable between DBS and the serum test. In addition, primary blood screening followed by secondary cfDNA analysis, a "before and after" two-tier screening strategy, can markedly decrease the false-positive rate and thereby dramatically reduce the demand for invasive diagnostic procedures. Impact statement: Children born with Down's syndrome display a wide range of mental and physical disabilities, and there is currently no effective treatment to ease the burden and anxiety of affected families and society. This study evaluates the efficiency of dried blood spots against serum screening for Down's syndrome and constructs a two-tier strategy in which secondary cfDNA screening is applied to the high-risk women identified by primary blood testing. The results demonstrate that fetal cfDNA analysis can reduce the false-positive rate to nearly zero while identifying all true positives. We therefore recommend that cfDNA analysis be used as a secondary screening tool on top of the primary blood protein screening to further minimize the number of unnecessary invasive diagnostic procedures.

  15. Testing the effectiveness of automated acoustic sensors for monitoring vocal activity of Marbled Murrelets Brachyramphus marmoratus

    USGS Publications Warehouse

    Cragg, Jenna L.; Burger, Alan E.; Piatt, John F.

    2015-01-01

    Cryptic nest sites and secretive breeding behavior make population estimates and monitoring of Marbled Murrelets Brachyramphus marmoratus difficult and expensive. Standard audio-visual and radar protocols have been refined but require intensive field time by trained personnel. We examined the detection range of automated sound recorders (Song Meters; Wildlife Acoustics Inc.) and the reliability of automated recognition models (“recognizers”) for identifying and quantifying Marbled Murrelet vocalizations during the 2011 and 2012 breeding seasons at Kodiak Island, Alaska. The detection range of murrelet calls by Song Meters was estimated to be 60 m. Recognizers detected 20 632 murrelet calls (keer and keheer) from a sample of 268 h of recordings, yielding 5 870 call series, which compared favorably with human scanning of spectrograms (on average detecting 95% of the number of call series identified by a human observer, but not necessarily the same call series). The false-negative rate (percentage of murrelet call series that the recognizers failed to detect) was 32%, mainly involving weak calls and short call series. False-positives (other sounds included by recognizers as murrelet calls) were primarily due to complex songs of other bird species, wind, and rain. False-positive rates were lowest in forest nesting habitat (48%) and highest in shrubby vegetation where calls of other birds were common (97%–99%). Acoustic recorders tracked spatial and seasonal trends in vocal activity, with higher call detections in high-quality forested habitat and during late July/early August. Automated acoustic monitoring of Marbled Murrelet calls could provide cost-effective, valuable information for assessing habitat use and temporal and spatial trends in nesting activity; reliability is dependent on careful placement of sensors to minimize false-positives and on prudent application of digital recognizers with visual checking of spectrograms.

  16. Masquerade Detection Using a Taxonomy-Based Multinomial Modeling Approach in UNIX Systems

    DTIC Science & Technology

    2008-08-25

    ... primarily the modeling of statistical features, such as the frequency of events, the duration of events, the co-occurrence of multiple events ... are identified, we can extract features representing such behavior while auditing the user's behavior. [Figure 1: Taxonomy of Linux and Unix ...] ... achieved when the features are extracted just from simple commands. [Table excerpt: Method, Hit Rate, False Positive Rate; ocSVM using simple cmds (freq.-based ...)]

  17. Milagro Observations of Potential TeV Emitters

    NASA Astrophysics Data System (ADS)

    Abeysekara, Anushka; Linnemann, James

    2012-03-01

    We searched for point sources in Milagro sky maps at the locations in four catalogs of potential TeV emitting sources. Our candidates are selected from the Fermi 2FGL pulsars, Fermi 2FGL extragalactic sources, TeVCat extragalactic sources, and from the BL Lac TeV Candidate list published by Costamante and Ghisellini in 2002. The False Discovery Rate (FDR) statistical procedure is used to select the sources. The FDR procedure controls the fraction of false detections. Our results are presented in this talk.
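
    For reference, a generic Benjamini-Hochberg selection step that controls the expected fraction of false detections at level q can be sketched as follows; the p-values are invented and this is not the Milagro analysis code.

        # Generic Benjamini-Hochberg sketch of an FDR-controlling selection step:
        # keep the candidates passing the BH cutoff so that the expected fraction
        # of false detections stays below q. The p-values are invented.
        import numpy as np

        def benjamini_hochberg(p_values, q=0.05):
            """Return a boolean mask of p-values selected at FDR level q."""
            p = np.asarray(p_values)
            order = np.argsort(p)
            ranked = p[order]
            m = len(p)
            thresholds = q * (np.arange(1, m + 1) / m)
            below = np.nonzero(ranked <= thresholds)[0]
            selected = np.zeros(m, dtype=bool)
            if below.size:
                selected[order[: below.max() + 1]] = True
            return selected

        p_vals = [0.0002, 0.009, 0.04, 0.12, 0.30, 0.62]
        print(benjamini_hochberg(p_vals, q=0.05))   # first two candidates survive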

  18. Dual-Probe Real-Time PCR Assay for Detection of Variola or Other Orthopoxviruses with Dried Reagents

    DTIC Science & Technology

    2008-09-10

    ... can naturally produce disease in humans that closely resembles smallpox, with up to 15% mortality rates ... [authors: Aitichou M, Saleh S, Park K, Huggins J, O'Guinn M, Jahrling PB, Ibrahim MS] ... false positive was obtained with dried real-time PCR reagents, resulting in 98% specificity. The false positive was obtained from one of two hantavirus ...

  19. Detection of Fundus Lesions Using Classifier Selection

    NASA Astrophysics Data System (ADS)

    Nagayoshi, Hiroto; Hiramatsu, Yoshitaka; Sako, Hiroshi; Himaga, Mitsutoshi; Kato, Satoshi

    A system for detecting fundus lesions caused by diabetic retinopathy from fundus images is being developed. The system can screen the images in advance in order to reduce the inspection workload on doctors. One of the difficulties that must be addressed in completing this system is how to remove false positives (which tend to arise near blood vessels) without decreasing the detection rate of lesions in other areas. To overcome this difficulty, we developed classifier selection according to the position of a candidate lesion, and we introduced new features that can distinguish true lesions from false positives. A system incorporating classifier selection and these new features was tested in experiments using 55 fundus images with some lesions and 223 images without lesions. The results of the experiments confirm the effectiveness of the proposed system, namely, degrees of sensitivity and specificity of 98% and 81%, respectively.

  20. Detection of sentinel lymph nodes in patients with early stage cervical cancer.

    PubMed

    Seong, Seok Ju; Park, Hyun; Yang, Kwang Moon; Kim, Tae Jin; Lim, Kyung Taek; Shim, Jae Uk; Park, Chong Taik; Lee, Ki Heon

    2007-02-01

    The purpose of this study was to determine the feasibility of identifying the sentinel lymph nodes (SNs) as well as to evaluate factors that might influence the SN detection rate in patients with cervical cancer of the uterus. Eighty nine patients underwent intracervical injection of 1% isosulfan blue dye at the time of planned radical hysterectomy and lymphadenectomy between January 2003 and December 2003. With the visual detection of lymph nodes that stained blue, SNs were identified and removed separately. Then all patients underwent complete pelvic lymph node dissection and/or para-aortic lymph node dissection. SNs were identified in 51 of 89 (57.3%) patients. The most common site for SN detection was the external iliac area. Metastatic nodes were detected in 21 of 89 (23.5%) patients. One false negative SN was obtained. Successful SN detection was more likely in patients younger than 50 yr (p=0.02) and with a history of preoperative conization (p=0.05). However, stage, histological type, surgical procedure and neoadjuvant chemotherapy showed no significant difference for SN detection rate. Therefore, the identification of SNs with isosulfan blue dye is feasible and safe. The SN detection rate was high in patients younger than 50 yr or with a history of preoperative conization.

  1. Occupancy Modeling for Improved Accuracy and Understanding of Pathogen Prevalence and Dynamics

    PubMed Central

    Colvin, Michael E.; Peterson, James T.; Kent, Michael L.; Schreck, Carl B.

    2015-01-01

    Most pathogen detection tests are imperfect, with a sensitivity < 100%, thereby resulting in the potential for a false negative, where a pathogen is present but not detected. False negatives in a sample inflate the number of non-detections, negatively biasing estimates of pathogen prevalence. Histological examination of tissues as a diagnostic test can be advantageous because multiple pathogens can be examined and it provides important information on associated pathological changes to the host. However, it is usually less sensitive than molecular or microbiological tests for specific pathogens. Our study objectives were to 1) develop a hierarchical occupancy model to examine pathogen prevalence in spring Chinook salmon Oncorhynchus tshawytscha and their distribution among host tissues, 2) use the model to estimate pathogen-specific test sensitivities and infection rates, and 3) illustrate the effect of using replicate within-host sampling on sample sizes required to detect a pathogen. We examined histological sections of replicate tissue samples from spring Chinook salmon O. tshawytscha collected after spawning for common pathogens seen in this population: Apophallus/echinostome metacercariae, Parvicapsula minibicornis, Nanophyetus salmincola/metacercariae, and Renibacterium salmoninarum. A hierarchical occupancy model was developed to estimate pathogen- and tissue-specific test sensitivities and to obtain unbiased estimates of host- and organ-level infection rates. Model-estimated sensitivities and host- and organ-level infection rates varied among pathogens, and the model-estimated infection rate was higher than prevalence unadjusted for test sensitivity, confirming that the unadjusted prevalence was negatively biased. The modeling approach provides a way to use hierarchically structured pathogen detection data from lower-sensitivity diagnostic tests, such as histology, to obtain unbiased pathogen prevalence estimates with associated uncertainties. Accounting for test sensitivity using within-host replicate samples also required fewer individual fish to be sampled. This approach is useful for evaluating pathogen or microbe community dynamics when test sensitivity is <100%. PMID:25738709
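
    The core of the occupancy idea can be sketched with a small simulation: given replicate tissue samples per fish and an imperfect test, prevalence and per-replicate sensitivity are estimated jointly by maximum likelihood rather than taking raw positives as prevalence. The model below is a simplified zero-inflated binomial under assumed perfect specificity, with invented parameter values, not the authors' hierarchical model.

        # Simplified occupancy-style sketch: jointly estimate infection prevalence
        # (psi) and per-replicate test sensitivity (p) from replicate samples per
        # fish, assuming perfect specificity. Parameter values are invented.
        import numpy as np
        from math import comb, log
        from scipy.optimize import minimize

        rng = np.random.default_rng(3)
        TRUE_PSI, TRUE_P, K, N_FISH = 0.4, 0.6, 3, 500

        infected = rng.random(N_FISH) < TRUE_PSI
        positives = np.where(infected, rng.binomial(K, TRUE_P, N_FISH), 0)

        def neg_log_lik(params):
            psi, p = params
            ll = 0.0
            for y in positives:
                if y > 0:           # must be infected: binomial detections
                    ll += log(psi) + log(comb(K, y)) + y * log(p) + (K - y) * log(1 - p)
                else:               # all-negative: infected-but-missed or uninfected
                    ll += log(psi * (1 - p) ** K + (1 - psi))
            return -ll

        res = minimize(neg_log_lik, x0=[0.5, 0.5], bounds=[(0.01, 0.99)] * 2)
        naive = np.mean(positives > 0)
        print(f"naive prevalence: {naive:.2f}  MLE psi: {res.x[0]:.2f}  MLE p: {res.x[1]:.2f}")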

  2. Occupancy modeling for improved accuracy and understanding of pathogen prevalence and dynamics

    USGS Publications Warehouse

    Colvin, Michael E.; Peterson, James T.; Kent, Michael L.; Schreck, Carl B.

    2015-01-01

    Most pathogen detection tests are imperfect, with a sensitivity < 100%, thereby resulting in the potential for a false negative, where a pathogen is present but not detected. False negatives in a sample inflate the number of non-detections, negatively biasing estimates of pathogen prevalence. Histological examination of tissues as a diagnostic test can be advantageous because multiple pathogens can be examined and it provides important information on associated pathological changes to the host. However, it is usually less sensitive than molecular or microbiological tests for specific pathogens. Our study objectives were to 1) develop a hierarchical occupancy model to examine pathogen prevalence in spring Chinook salmon Oncorhynchus tshawytscha and their distribution among host tissues, 2) use the model to estimate pathogen-specific test sensitivities and infection rates, and 3) illustrate the effect of using replicate within-host sampling on sample sizes required to detect a pathogen. We examined histological sections of replicate tissue samples from spring Chinook salmon O. tshawytscha collected after spawning for common pathogens seen in this population: Apophallus/echinostome metacercariae, Parvicapsula minibicornis, Nanophyetus salmincola/metacercariae, and Renibacterium salmoninarum. A hierarchical occupancy model was developed to estimate pathogen- and tissue-specific test sensitivities and to obtain unbiased estimates of host- and organ-level infection rates. Model-estimated sensitivities and host- and organ-level infection rates varied among pathogens, and the model-estimated infection rate was higher than prevalence unadjusted for test sensitivity, confirming that the unadjusted prevalence was negatively biased. The modeling approach provides a way to use hierarchically structured pathogen detection data from lower-sensitivity diagnostic tests, such as histology, to obtain unbiased pathogen prevalence estimates with associated uncertainties. Accounting for test sensitivity using within-host replicate samples also required fewer individual fish to be sampled. This approach is useful for evaluating pathogen or microbe community dynamics when test sensitivity is <100%.

  3. Detection of breast cancer with full-field digital mammography and computer-aided detection.

    PubMed

    The, Juliette S; Schilling, Kathy J; Hoffmeister, Jeffrey W; Friedmann, Euvondia; McGinnis, Ryan; Holcomb, Richard G

    2009-02-01

    The purpose of this study was to evaluate computer-aided detection (CAD) performance with full-field digital mammography (FFDM). CAD (Second Look, version 7.2) was used to evaluate 123 cases of breast cancer detected with FFDM (Senographe DS). Retrospectively, CAD sensitivity was assessed using breast density, mammographic presentation, histopathology results, and lesion size. To determine the case-based false-positive rate, patients with four standard views per case were included in the study group. Eighteen unilateral mammography examinations with nonstandard views were excluded, resulting in a sample of 105 bilateral cases. CAD detected 115 (94%) of 123 cancer cases: six of six (100%) in fatty breasts, 63 of 66 (95%) in breasts containing scattered fibroglandular densities, 43 of 46 (93%) in heterogeneously dense breasts, and three of five (60%) in extremely dense breasts. CAD detected 93% (41/44) of cancers manifesting as calcifications, 92% (57/62) as masses, and 100% (17/17) as mixed masses and calcifications. CAD detected 94% of the invasive ductal carcinomas (n = 63), 100% of the invasive lobular carcinomas (n = 7), 91% of the other invasive carcinomas (n = 11), and 93% of the ductal carcinomas in situ (n = 42). CAD sensitivity for cancers 1-10 mm (n = 55) was 89%; 11-20 mm (n = 37), 97%; 21-30 mm (n = 16), 100%; and larger than 30 mm (n = 15), 93%. The CAD false-positive rate was 2.3 marks per four-image case. CAD with FFDM showed a high sensitivity in identifying cancers manifesting as calcifications and masses. Sensitivity was maintained in cancers with lower mammographic sensitivity, including invasive lobular carcinomas and small neoplasms (1-20 mm). CAD with FFDM should be effective in assisting radiologists with earlier detection of breast cancer. Future studies are needed to assess CAD accuracy in larger populations.

  4. Next-Generation Sequencing Analysis of the Diversity of Human Noroviruses in Japanese Oysters.

    PubMed

    Imamura, Saiki; Kanezashi, Hiromi; Goshima, Tomoko; Haruna, Mika; Okada, Tsukasa; Inagaki, Nobuya; Uema, Masashi; Noda, Mamoru; Akimoto, Keiko

    2017-08-01

    To obtain detailed information on the diversity of infectious norovirus in oysters (Crassostrea gigas), oysters obtained from fish producers at six different sites (sites A, B, C, D, E, and F) in Japan were analyzed once a month from October 2015 to February 2016. To avoid false-positive polymerase chain reaction (PCR) results derived from noninfectious virus particles, samples were pretreated with RNase before reverse transcription-PCR (RT-PCR). RT-PCR products were subjected to next-generation sequencing to identify norovirus genotypes in oysters. All GI genotypes were detected during the investigation period. The detection rate and proportion of norovirus GI genotypes differed depending on the sampling site and month. GII.3, GII.4, GII.13, GII.16, and GII.17 were detected in this study. Both the detection rate and proportion of norovirus GII genotypes also differed depending on the sampling site and month. Overall, the detection rate and proportion of GII.3 were highest from October to December among all detected genotypes. In January, the detection rates of GII.4 and GII.17 reached the same level as that of GII.3. The proportion of GII.17 was relatively low from October to December, whereas it was the highest in January. To our knowledge, this is the first investigation of noroviruses in oysters in Japan based on a method that can distinguish their infectivity.

  5. Improving and Assessing Planet Sensitivity of the GPI Exoplanet Survey with a Forward Model Matched Filter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruffio, Jean-Baptiste; Macintosh, Bruce; Nielsen, Eric L.

    We present a new matched-filter algorithm for direct detection of point sources in the immediate vicinity of bright stars. The stellar point-spread function (PSF) is first subtracted using a Karhunen-Loève image processing (KLIP) algorithm with angular and spectral differential imaging (ADI and SDI). The KLIP-induced distortion of the astrophysical signal is included in the matched-filter template by computing a forward model of the PSF at every position in the image. To optimize the performance of the algorithm, we conduct extensive planet injection and recovery tests and tune the exoplanet spectra template and KLIP reduction aggressiveness to maximize the signal-to-noise ratio (S/N) of the recovered planets. We show that only two spectral templates are necessary to recover any young Jovian exoplanets with minimal S/N loss. We also developed a complete pipeline for the automated detection of point-source candidates, the calculation of receiver operating characteristics (ROC), contrast curves based on false positives, and completeness contours. We process in a uniform manner more than 330 data sets from the Gemini Planet Imager Exoplanet Survey and assess GPI typical sensitivity as a function of the star and the hypothetical companion spectral type. This work allows for the first time a comparison of different detection algorithms at a survey scale accounting for both planet completeness and false-positive rate. We show that the new forward model matched filter allows the detection of 50% fainter objects than a conventional cross-correlation technique with a Gaussian PSF template for the same false-positive rate.
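
    The injection-and-recovery logic used to build completeness and false-positive estimates can be sketched schematically as below; the noise model, threshold, and flux levels are invented and this is not the GPI pipeline.

        # Schematic injection-and-recovery sketch: fake sources are injected at
        # known flux into noise, a fixed S/N threshold is applied, and completeness
        # is tallied per flux level while source-free trials give the
        # false-positive rate. All parameters are invented.
        import numpy as np

        rng = np.random.default_rng(5)
        NOISE, THRESH, N_TRIALS = 1.0, 3.0, 2_000

        def detected(flux):
            """One trial: is an injected source of this flux recovered above threshold?"""
            measurement = flux + rng.normal(0.0, NOISE)
            return measurement / NOISE > THRESH

        for flux in (2.0, 4.0, 6.0):
            completeness = np.mean([detected(flux) for _ in range(N_TRIALS)])
            print(f"flux {flux:.1f} x noise: completeness {completeness:.2f}")

        # False-positive rate per trial from source-free injections
        fp_rate = np.mean([detected(0.0) for _ in range(N_TRIALS)])
        print(f"false-positive rate at S/N > {THRESH:.0f}: {fp_rate:.4f}")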

  6. STFT or CWT for the detection of Doppler ultrasound embolic signals.

    PubMed

    Gonçalves, Ivo B; Leiria, Ana; Moura, M M M

    2013-09-01

    Aiming at reliable detection and localization of cerebral blood flow emboli, embolic signals were added to simulated middle cerebral artery Doppler signals and analysed. The short-time Fourier transform (STFT) and the continuous wavelet transform (CWT) were used in the evaluation. The following parameters were used in this study: the powers of the added embolic signals were 5, 6, 6.5, 7, 7.5, 8 and 9 dB; the mother wavelets for CWT analysis were Morlet, Mexican hat, Meyer, Gaussian (order 4) and Daubechies (orders 4 and 8); and the detection thresholds (assessed in terms of false positives, false negatives and sensitivity) were 2 and 3.5 dB for the CWT and STFT, respectively. The results indicate that although the STFT allows emboli to be detected accurately, better time localization can be achieved with the CWT. Among the CWT analyses, the best overall results were obtained with the Mexican hat mother wavelet, with optimal sensitivity (100% detection rate) for nearly all emboli power values studied. Copyright © 2013 John Wiley & Sons, Ltd.
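
    For orientation, the two transforms can be computed side by side on a synthetic Doppler segment containing a short embolic transient, for example with scipy and PyWavelets as sketched below; the signal parameters and the crude peak-energy localization are illustrative assumptions only.

        # Side-by-side sketch: STFT and Mexican-hat CWT of a simulated Doppler
        # segment containing a short, high-intensity embolic transient. Signal
        # parameters are invented; these library calls are one common way to
        # compute the two transforms, not the authors' processing chain.
        import numpy as np
        import pywt
        from scipy.signal import stft

        fs = 8_000                                   # sampling rate, Hz
        t = np.arange(0, 0.5, 1 / fs)
        doppler = 0.3 * np.sin(2 * np.pi * 600 * t)  # background flow signal
        embolus = np.exp(-((t - 0.25) / 0.002) ** 2) * np.sin(2 * np.pi * 1_500 * t)
        x = doppler + 2.0 * embolus + 0.05 * np.random.default_rng(0).normal(size=t.size)

        f, tt, Zxx = stft(x, fs=fs, nperseg=256)     # time-frequency grid
        coefs, freqs = pywt.cwt(x, scales=np.arange(1, 64), wavelet="mexh",
                                sampling_period=1 / fs)

        # Crude localization: time of peak energy in each representation
        print("STFT peak at t =", tt[np.argmax(np.abs(Zxx).max(axis=0))])
        print("CWT  peak at t =", t[np.argmax(np.abs(coefs).max(axis=0))])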

  7. Efficient Mining and Detection of Sequential Intrusion Patterns for Network Intrusion Detection Systems

    NASA Astrophysics Data System (ADS)

    Shyu, Mei-Ling; Huang, Zifang; Luo, Hongli

    In recent years, pervasive computing infrastructures have greatly improved the interaction between humans and systems. As we place more reliance on these computing infrastructures, we also face threats of network intrusion and other new forms of undesirable IT-based activity. Hence, network security has become an extremely important issue, closely connected with homeland security, business transactions, and people's daily lives. Accurate and efficient intrusion detection technologies are required to safeguard network systems and the critical information transmitted in them. In this chapter, a novel network intrusion detection framework for mining and detecting sequential intrusion patterns is proposed. The proposed framework consists of a Collateral Representative Subspace Projection Modeling (C-RSPM) component for supervised classification, and an inter-transactional association rule mining method based on Layer Divided Modeling (LDM) for temporal pattern analysis. Experiments on the KDD99 data set and a traffic data set generated by a private LAN testbed show promising results, with high detection rates, low processing time, and low false alarm rates in mining and detecting sequential intrusion patterns.

  8. Detecting Recombination Hotspots from Patterns of Linkage Disequilibrium.

    PubMed

    Wall, Jeffrey D; Stevison, Laurie S

    2016-08-09

    With recent advances in DNA sequencing technologies, it has become increasingly easy to use whole-genome sequencing of unrelated individuals to assay patterns of linkage disequilibrium (LD) across the genome. One type of analysis that is commonly performed is to estimate local recombination rates and identify recombination hotspots from patterns of LD. One method for detecting recombination hotspots, LDhot, has been used in a handful of species to further our understanding of the basic biology of recombination. For the most part, the effectiveness of this method (e.g., power and false positive rate) is unknown. In this study, we run extensive simulations to compare the effectiveness of three different implementations of LDhot. We find large differences in the power and false positive rates of these different approaches, as well as a strong sensitivity to the window size used (with smaller window sizes leading to more accurate estimation of hotspot locations). We also compared our LDhot simulation results with comparable simulation results obtained from a Bayesian maximum-likelihood approach for identifying hotspots. Surprisingly, we found that the latter computationally intensive approach had substantially lower power over the parameter values considered in our simulations. Copyright © 2016 Wall and Stevison.

  9. Use of the ecf1 gene to detect Shiga toxin-producing Escherichia coli in beef samples.

    PubMed

    Livezey, Kristin W; Groschel, Bettina; Becker, Michael M

    2015-04-01

    Escherichia coli O157:H7 and six serovars (O26, O103, O121, O111, O145, and O45) are frequently implicated in severe clinical illness worldwide. Standard testing methods using stx, eae, and O serogroup-specific gene sequences for detecting the top six non-O157 STEC bear the disadvantage that these genes may reside, independently, in different nonpathogenic organisms, leading to false-positive results. The ecf operon has previously been identified in the large enterohemolysin-encoding plasmid of eae-positive Shiga toxin-producing E. coli (STEC). Here, we explored the utility of the ecf operon as a single marker to detect eae-positive STEC from pure broth and primary meat enrichments. Analysis of 501 E. coli isolates demonstrated a strong correlation (99.6%) between the presence of the ecf1 gene and the combined presence of stx, eae, and ehxA genes. Two large studies were carried out to determine the utility of an ecf1 detection assay to detect non-O157 STEC strains in enriched meat samples in comparison to the results using the U. S. Department of Agriculture Food Safety and Inspection Service (FSIS) method that detects stx and eae genes. In ground beef samples (n = 1,065), the top six non-O157 STEC were detected in 4.0% of samples by an ecf1 detection assay and in 5.0% of samples by the stx- and eae-based method. In contrast, in beef samples composed largely of trim (n = 1,097), the top six non-O157 STEC were detected at 1.1% by both methods. Estimation of false-positive rates among the top six non-O157 STEC revealed a lower rate using the ecf1 detection method (0.5%) than using the eae and stx screening method (1.1%). Additionally, the ecf1 detection assay detected STEC strains associated with severe illness that are not included in the FSIS regulatory definition of adulterant STEC.

  10. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    PubMed Central

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing. PMID:28981533

  11. Bayesian performance metrics of binary sensors in homeland security applications

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz P.; Forrester, Thomas C.

    2008-04-01

    Bayesian performance metrics based on parameters such as prior probability, probability of detection (or accuracy), false alarm rate, and positive predictive value characterize the performance of binary sensors, i.e., sensors that produce only a binary response: true target or false target. Such binary sensors, very common in homeland security, produce an alarm that can be true or false; examples include X-ray airport inspection, IED inspection, product quality control, medical cancer diagnosis, parts of ATR, and many others. In this paper, we analyze direct and inverse conditional probabilities in the context of Bayesian inference and binary sensors, using X-ray luggage inspection statistical results as a guideline.
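
    The underlying Bayesian relationship can be made concrete with a one-line application of Bayes' theorem: the positive predictive value follows from the prior probability, the probability of detection, and the false alarm rate. The numbers in the sketch below are purely illustrative.

        # Worked sketch of the Bayesian relationship: the positive predictive
        # value (probability an alarm is a true target) from the prior, the
        # probability of detection, and the false alarm rate. Numbers are
        # illustrative only.
        def positive_predictive_value(prior, p_detection, p_false_alarm):
            """P(target | alarm) from P(target), P(alarm | target), P(alarm | no target)."""
            p_alarm = prior * p_detection + (1.0 - prior) * p_false_alarm
            return prior * p_detection / p_alarm

        # A rare threat inspected by a good but imperfect sensor:
        print(positive_predictive_value(prior=0.001, p_detection=0.95, p_false_alarm=0.05))
        # ~0.019: even a 5% false alarm rate makes most alarms false when the prior is small.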

  12. Multicenter Trial of Sentinel Node Biopsy for Breast Cancer Using Both Technetium Sulfur Colloid and Isosulfan Blue Dye

    PubMed Central

    Tafra, Lorraine; Lannin, Donald R.; Swanson, Melvin S.; Van Eyk, Jason J.; Verbanac, Kathryn M.; Chua, Arlene N.; Ng, Peter C.; Edwards, Maxine S.; Halliday, Bradford E.; Henry, C. Alan; Sommers, Linda M.; Carman, Claire M.; Molin, Melinda R.; Yurko, John E.; Perry, Roger R.; Williams, Robert

    2001-01-01

    Objective To determine the factors associated with false-negative results on sentinel node biopsy and sentinel node localization (identification rate) in patients with breast cancer enrolled in a multicenter trial using a combination technique of isosulfan blue with technetium sulfur colloid (Tc99). Summary Background Data Sentinel node biopsy is a diagnostic test used to detect breast cancer metastases. To test the reliability of this method, a complete lymph node dissection must be performed to determine the false-negative rate. Single-institution series have reported excellent results, although one multicenter trial reported a false-negative rate as high as 29% using radioisotope alone. A multicenter trial was initiated to test combined use of Tc99 and isosulfan blue. Methods Investigators (both private-practice and academic surgeons) were recruited after attending a course on the technique of sentinel node biopsy. No investigator participated in a learning trial before entering patients. Tc99 and isosulfan blue were injected into the peritumoral region. Results Five hundred twenty-nine patients underwent 535 sentinel node biopsy procedures for an overall identification rate in finding a sentinel node of 87% and a false-negative rate of 13%. The identification rate increased and the false-negative rate decreased to 90% and 4.3%, respectively, after investigators had performed more than 30 cases. Univariate analysis of tumor showed the poorest success rate with older patients and inexperienced surgeons. Multivariate analysis identified both age and experience as independent predictors of failure. However, with older patients, inexperienced surgeons, and patients with five or more metastatic axillary nodes, the false-negative rate was consistently greater. Conclusions This multicenter trial, from both private practice and academic institutions, is an excellent indicator of the general utility of sentinel node biopsy. It establishes the factors that play an important role (patient age, surgical experience, tumor location) and those that are irrelevant (prior surgery, tumor size, Tc99 timing). This widens the applicability of the technique and identifies factors that require further investigation. PMID:11141225

  13. Subliminal stimulation and somatosensory signal detection.

    PubMed

    Ferrè, Elisa Raffaella; Sahani, Maneesh; Haggard, Patrick

    2016-10-01

    Only a small fraction of sensory signals is consciously perceived. The brain's perceptual systems may include mechanisms of feedforward inhibition that protect the cortex from subliminal noise, thus reserving cortical capacity and conscious awareness for significant stimuli. Here we provide a new view of these mechanisms based on signal detection theory, and gain control. We demonstrated that subliminal somatosensory stimulation decreased sensitivity for the detection of a subsequent somatosensory input, largely due to increased false alarm rates. By delivering the subliminal somatosensory stimulus and the to-be-detected somatosensory stimulus to different digits of the same hand, we show that this effect spreads across the sensory surface. In addition, subliminal somatosensory stimulation tended to produce an increased probability of responding "yes", whether the somatosensory stimulus was present or not. Our results suggest that subliminal stimuli temporarily reduce input gain, avoiding excessive responses to further small inputs. This gain control may be automatic, and may precede discriminative classification of inputs into signals or noise. Crucially, we found that subliminal inputs influenced false alarm rates only on blocks where the to-be-detected stimuli were present, and not on pre-test control blocks where they were absent. Participants appeared to adjust their perceptual criterion according to a statistical distribution of stimuli in the current context, with the presence of supraliminal stimuli having an important role in the criterion-setting process. These findings clarify the cognitive mechanisms that reserve conscious perception for salient and important signals. Copyright © 2016 Elsevier B.V. All rights reserved.
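
    The signal-detection quantities behind this kind of analysis can be computed from hit and false alarm rates as sketched below; the rates are invented to mimic the reported pattern of increased false alarms with similar hit rates, and the sketch is not the authors' analysis.

        # Signal-detection-theory sketch: sensitivity (d') and criterion (c) from
        # hit and false alarm rates for two conditions. The rates are invented to
        # mimic "higher false alarms, similar hits" under subliminal pre-stimulation.
        from scipy.stats import norm

        def d_prime_and_criterion(hit_rate, fa_rate):
            z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
            return z_hit - z_fa, -0.5 * (z_hit + z_fa)

        baseline = d_prime_and_criterion(hit_rate=0.70, fa_rate=0.10)
        subliminal = d_prime_and_criterion(hit_rate=0.70, fa_rate=0.25)

        print("baseline   d'=%.2f c=%.2f" % baseline)
        print("subliminal d'=%.2f c=%.2f" % subliminal)
        # Sensitivity drops and the criterion becomes more liberal when the false
        # alarm rate rises while the hit rate stays the same.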

  14. Effects of Demographic History on the Detection of Recombination Hotspots from Linkage Disequilibrium

    PubMed Central

    Dapper, Amy L; Payseur, Bret A

    2018-01-01

    Abstract In some species, meiotic recombination is concentrated in small genomic regions. These “recombination hotspots” leave signatures in fine-scale patterns of linkage disequilibrium, raising the prospect that the genomic landscape of hotspots can be characterized from sequence variation. This approach has led to the inference that hotspots evolve rapidly in some species, but are conserved in others. Historic demographic events, such as population bottlenecks, are known to affect patterns of linkage disequilibrium across the genome, violating population genetic assumptions of this approach. Although such events are prevalent, demographic history is generally ignored when making inferences about the evolution of recombination hotspots. To determine the effect of demography on the detection of recombination hotspots, we use the coalescent to simulate haplotypes with a known recombination landscape. We measure the ability of popular linkage disequilibrium-based programs to detect hotspots across a range of demographic histories, including population bottlenecks, hidden population structure, population expansions, and population contractions. We find that demographic events have the potential to greatly reduce the power and increase the false positive rate of hotspot discovery. Neither the power nor the false positive rate of hotspot detection can be predicted without also knowing the demographic history of the sample. Our results suggest that ignoring demographic history likely overestimates the power to detect hotspots and therefore underestimates the degree of hotspot sharing between species. We suggest strategies for incorporating demographic history into population genetic inferences about recombination hotspots. PMID:29045724

  15. Novel approach for low-cost muzzle flash detection system

    NASA Astrophysics Data System (ADS)

    Voskoboinik, Asher

    2008-04-01

    A low-cost muzzle flash detection system based on CMOS sensor technology is proposed. This low-cost technology makes it possible to detect various transient events with characteristic times between dozens of microseconds up to dozens of milliseconds while sophisticated algorithms successfully separate them from false alarms by utilizing differences in geometrical characteristics and/or temporal signatures. The proposed system consists of off-the-shelf smart CMOS cameras with built-in signal and image processing capabilities for pre-processing together with allocated memory for storing a buffer of images for further post-processing. Such a sensor does not require sending large amounts of raw data to a real-time processing unit; instead, all calculations are performed in situ and the processing results are the output of the sensor. This patented CMOS muzzle flash detection concept exhibits high-performance detection capability with very low false-alarm rates. It was found that most false alarms due to sun glints are from sources at distances of 500-700 meters from the sensor and can be distinguished from muzzle flash signals by temporal examination techniques. This makes it possible to eliminate up to 80% of false alarms due to sun specular reflections on the battlefield. An additional effort to distinguish sun glints from suspected muzzle flash signals is made by optimizing the spectral band in the near-IR region. The proposed system can be used for muzzle flash detection of small arms, missiles, and rockets, and for other military applications.

  16. Initial experience with computer aided detection for microcalcification in digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Harkness, E. F.; Lim, Y. Y.; Wilson, M. W.; Haq, R.; Zhou, J.; Tate, C.; Maxwell, A. J.; Astley, S. M.; Gilbert, F. J.

    2015-03-01

    Digital breast tomosynthesis (DBT) addresses limitations of 2-D projection imaging for detection of masses. Microcalcification clusters may be more difficult to appreciate in DBT as individual calcifications within clusters may appear on different slices. This research aims to evaluate the performance of ImageChecker 3D Calc CAD v1.0. Women were recruited as part of the TOMMY trial. From the trial, 169 were included in this study. The DBT images were processed with the computer aided detection (CAD) algorithm. Three consultant radiologists reviewed the images and recorded whether CAD prompts were on or off target. 79/80 (98.8%) malignant cases had a prompt on the area of microcalcification. In these cases, there were 1-15 marks (median 5), with the majority of false prompts (n=326/431) due to benign (68%) and vascular (24%) calcifications. Of 89 normal/benign cases, 27 (30%) had no prompts; the remainder had 1-13 prompts (median 3), and the majority of false prompts (n=238) were due to benign calcifications (77%). CAD is effective in prompting malignant microcalcification clusters and may overcome the difficulty of detecting clusters in slice images. Although there was a high rate of false prompts, further advances in the software may improve specificity.

  17. Gut transcription in Helicoverpa zea is dynamically altered in response to baculovirus infection

    USDA-ARS?s Scientific Manuscript database

    The Helicoverpa zea transcriptome was analyzed 24 hours after H. zea larvae fed on artificial diet laced with Helicoverpa zea single nucleopolyhedrovirus (HzSNPV). Significant differential regulation of 1,139 putative genes (P<0.05 T-test with Benjamini and Hochberg False Discovery Rate) was detect...

  18. Multimodal Sensor Fusion for Personnel Detection

    DTIC Science & Technology

    2011-07-01

    video). Efficacy of UGS systems is often limited by high false alarm rates because the onboard data processing algorithms may not be able to correctly...humans) and animals (e.g., donkeys, mules, and horses). The humans walked alone and in groups with and without backpacks; the animals were led by their

  19. Developing a Qualia-Based Multi-Agent Architecture for Use in Malware Detection

    DTIC Science & Technology

    2010-03-01

    executables were correctly classified with a 6% false positive rate [7]. Kolter and Maloof expand Schultz’s work by analyzing different...Proceedings of the 2001 IEEE Symposium on Security and Privacy. Los Alamitos, CA: IEEE Computer Society, 2001. [8] J. Z. Kolter and M. A. Maloof

  20. Image and Sensor Data Processing for Target Acquisition and Recognition.

    DTIC Science & Technology

    1980-11-01

    technological, cost, size and weight constraints. The critical problem seems to be detecting the target with an acceptable false alarm rate, rather than...for its operation, loss of detail can result from adjacent differing pixels being forced into the same displayed grey-level. This may be detected in... [Fig. 1 caption: high contrast scene demonstrating loss of detail in a local area of low contrast]

  1. A critical reappraisal of false negative sentinel lymph node biopsy in melanoma.

    PubMed

    Manca, G; Romanini, A; Rubello, D; Mazzarri, S; Boni, G; Chiacchio, S; Tredici, M; Duce, V; Tardelli, E; Volterrani, D; Mariani, G

    2014-06-01

    Lymphatic mapping and sentinel lymph node biopsy (SLNB) have completely changed the clinical management of cutaneous melanoma. This procedure has been accepted worldwide as a recognized method for nodal staging. SLNB is able to accurately determine nodal basin status, providing the most useful prognostic information. However, SLNB is not a perfect diagnostic test. Several large-scale studies have reported a relatively high false-negative rate (5.6-21%), correctly defined as the proportion of false-negative results with respect to the total number of "actual" positive lymph nodes. The main purpose of this review is to address the technical issues that nuclear physicians, surgeons, and pathologists should carefully consider to improve the accuracy of SLNB by minimizing its false-negative rate. In particular, SPECT/CT imaging has been shown to identify a greater number of sentinel lymph nodes (SLNs) than planar lymphoscintigraphy. Furthermore, the international guidelines lack a unified operational definition for identifying SLNs, which may be partly responsible for this relatively high false-negative rate of SLNB. Therefore, the scientific community should agree on a radioactive counting-rate threshold so that the surgeon can be better radioguided to detect all the lymph nodes that are most likely to harbor metastases. Another possible source of error may be linked to the examination of the harvested SLNs by conventional histopathological methods. A more careful and extensive SLN analysis (e.g. molecular analysis by RT-PCR) is able to find more positive nodes, so that the false-negative rate is reduced. Older age at diagnosis, deeper lesions, histologic ulceration, and head-neck anatomical location of the primary lesion are the clinical factors associated with false-negative SLNBs in melanoma patients. There is still much controversy about the clinical significance of a false-negative SLNB on the prognosis of melanoma patients. Indeed, most studies have failed to show worse melanoma-specific survival for false-negative compared with true-positive SLNB patients.

  2. Enhanced detection and visualization of anomalies in spectral imagery

    NASA Astrophysics Data System (ADS)

    Basener, William F.; Messinger, David W.

    2009-05-01

    Anomaly detection algorithms applied to hyperspectral imagery are able to reliably identify man-made objects from a natural environment based on statistical/geometric likelihood. The process is more robust than target identification, which requires precise prior knowledge of the object of interest, but has an inherently higher false alarm rate. Standard anomaly detection algorithms measure deviation of pixel spectra from a parametric model (either statistical or linear mixing) estimating the image background. The topological anomaly detector (TAD) creates a fully non-parametric, graph theory-based, topological model of the image background and measures deviation from this background using codensity. In this paper we present a large-scale comparative test of TAD against 80+ targets in four full HYDICE images using the entire canonical target set for generation of ROC curves. TAD is compared against several statistics-based detectors, including local RX and subspace RX. Even a perfect anomaly detection algorithm would have a high practical false alarm rate in most scenes simply because the user/analyst is not interested in every anomalous object. To assist the analyst in identifying and sorting objects of interest, we investigate coloring of the anomalies with principal components projections using statistics computed from the anomalies. This gives a very useful colorization of anomalies in which objects of similar material tend to have the same color, enabling an analyst to quickly sort and identify anomalies of highest interest.
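
    The statistics-based baselines mentioned above are variants of the RX detector, which is essentially a per-pixel Mahalanobis distance to the estimated image background. The sketch below implements the plain global RX statistic (not TAD, and not the local or subspace variants) on a synthetic cube.

    ```python
    # Global RX anomaly detector: squared Mahalanobis distance of each pixel spectrum
    # to the global background mean/covariance. Thresholding the score map trades
    # detection rate against false alarm rate (i.e. sweeps out a ROC curve).
    import numpy as np

    def global_rx(cube):
        """RX scores for a hyperspectral cube of shape (rows, cols, bands)."""
        rows, cols, bands = cube.shape
        X = cube.reshape(-1, bands).astype(float)
        mu = X.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))  # pseudo-inverse guards against singularity
        diff = X - mu
        scores = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
        return scores.reshape(rows, cols)

    cube = np.random.rand(64, 64, 30)          # synthetic stand-in for a HYDICE scene
    scores = global_rx(cube)
    anomaly_mask = scores > np.percentile(scores, 99.5)
    ```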

  3. Imperfect pathogen detection from non-invasive skin swabs biases disease inference

    USGS Publications Warehouse

    DiRenzo, Graziella V.; Grant, Evan H. Campbell; Longo, Ana; Che-Castaldo, Christian; Zamudio, Kelly R.; Lips, Karen

    2018-01-01

    1. Conservation managers rely on accurate estimates of disease parameters, such as pathogen prevalence and infection intensity, to assess disease status of a host population. However, these disease metrics may be biased if low-level infection intensities are missed by sampling methods or laboratory diagnostic tests. These false negatives underestimate pathogen prevalence and overestimate mean infection intensity of infected individuals. 2. Our objectives were two-fold. First, we quantified false negative error rates of Batrachochytrium dendrobatidis on non-invasive skin swabs collected from an amphibian community in El Copé, Panama. We swabbed amphibians twice in sequence, and we used a recently developed hierarchical Bayesian estimator to assess disease status of the population. Second, we developed a novel hierarchical Bayesian model to simultaneously account for imperfect pathogen detection from field sampling and laboratory diagnostic testing. We evaluated the performance of the model using simulations and varying sampling design to quantify the magnitude of bias in estimates of pathogen prevalence and infection intensity. 3. We show that Bd detection probability from skin swabs was related to host infection intensity, where Bd infections < 10 zoospores have < 95% probability of being detected. If imperfect Bd detection was not considered, then Bd prevalence was underestimated by as much as 16%. In the Bd-amphibian system, this indicates a need to correct for imperfect pathogen detection caused by skin swabs in persisting host communities with low-level infections. More generally, our results have implications for study designs in other disease systems, particularly those with similar objectives, biology, and sampling decisions. 4. Uncertainty in pathogen detection is an inherent property of most sampling protocols and diagnostic tests, where the magnitude of bias depends on the study system, type of infection, and false negative error rates. Given that it may be difficult to know this information in advance, we advocate that the most cautious approach is to assume all errors are possible and to accommodate them by adjusting sampling designs. The modeling framework presented here improves the accuracy in estimating pathogen prevalence and infection intensity.

  4. Post-processing for improving hyperspectral anomaly detection accuracy

    NASA Astrophysics Data System (ADS)

    Wu, Jee-Cheng; Jiang, Chi-Ming; Huang, Chen-Liang

    2015-10-01

    Anomaly detection is an important topic in the exploitation of hyperspectral data. Based on the Reed-Xiaoli (RX) detector and a morphology operator, this research proposes a novel technique for improving the accuracy of hyperspectral anomaly detection. Firstly, the RX-based detector is used to process a given input scene. Then, a post-processing scheme using a morphology operator is employed to detect those pixels around high-scoring anomaly pixels. Tests were conducted using two real hyperspectral images with ground truth information, and the results, based on receiver operating characteristic curves, illustrate that the proposed method reduces the false alarm rate of the RX-based detector.
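
    The abstract does not spell out the exact morphological rule, so the sketch below shows one plausible reading as an assumption: strong RX detections seed a mask, the mask is dilated, and the dilated region is intersected with a looser threshold so that pixels around high-scoring anomalies are recovered.

    ```python
    # One possible post-processing scheme (an assumption, not the paper's exact operator):
    # dilate the strong-detection mask and keep neighboring pixels that are themselves
    # moderately anomalous, instead of simply lowering the single global threshold.
    import numpy as np
    from scipy.ndimage import binary_dilation

    def postprocess_rx(scores, primary_pct=99.9, secondary_pct=99.0, radius=1):
        primary = scores > np.percentile(scores, primary_pct)      # strong anomalies
        secondary = scores > np.percentile(scores, secondary_pct)  # looser threshold
        structure = np.ones((2 * radius + 1, 2 * radius + 1), dtype=bool)
        grown = binary_dilation(primary, structure=structure)      # neighborhood of strong hits
        return primary | (grown & secondary)
    ```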

  5. An explosives detection system for airline security using coherent x-ray scattering technology

    NASA Astrophysics Data System (ADS)

    Madden, Robert W.; Mahdavieh, Jacob; Smith, Richard C.; Subramanian, Ravi

    2008-08-01

    L-3 Communications Security and Detection Systems (SDS) has developed a new system for automated alarm resolution in airline baggage Explosive Detection Systems (EDS) based on coherent x-ray scattering spectroscopy. The capabilities of the system were demonstrated in tests with concealed explosives at the Transportation Security Laboratory and airline passenger baggage at Orlando International Airport. The system uses x-ray image information to identify suspicious objects and performs targeted diffraction measurements to classify them. This extra layer of detection capability affords a significant reduction in the rate of false alarm objects that must presently be resolved by opening passenger bags for hand inspection.

  6. Tables of square-law signal detection statistics for Hann spectra with 50 percent overlap

    NASA Technical Reports Server (NTRS)

    Deans, Stanley R.; Cullers, D. Kent

    1991-01-01

    The Search for Extraterrestrial Intelligence, currently being planned by NASA, will require that an enormous amount of data be analyzed in real time by special purpose hardware. It is expected that overlapped Hann data windows will play an important role in this analysis. In order to understand the statistical implication of this approach, it has been necessary to compute detection statistics for overlapped Hann spectra. Tables of signal detection statistics are given for false alarm rates from 10^-14 to 10^-1 and signal detection probabilities from 0.50 to 0.99; the number of computed spectra ranges from 4 to 2000.
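
    The report tabulates statistics for 50%-overlapped Hann spectra, whose bins are correlated. As a simpler point of comparison (a textbook special case, not a reproduction of the tabulated values), the sketch below computes the threshold and detection probability for a single independent square-law bin, where the noise-only power is exponential and the signal-plus-noise power follows a noncentral chi-square law with 2 degrees of freedom.

    ```python
    # Single-bin square-law detection (independent bins, no window overlap).
    import numpy as np
    from scipy.stats import ncx2

    def single_bin_threshold(pfa, noise_power=1.0):
        # Noise-only bin power is exponential with mean noise_power:
        # Pfa = exp(-T / noise_power)  =>  T = noise_power * ln(1 / Pfa)
        return noise_power * np.log(1.0 / pfa)

    def single_bin_pd(snr, pfa, noise_power=1.0):
        # With a deterministic signal, 2 * power / noise_power is noncentral chi-square
        # with 2 degrees of freedom and noncentrality 2 * SNR.
        T = single_bin_threshold(pfa, noise_power)
        return ncx2.sf(2.0 * T / noise_power, df=2, nc=2.0 * snr)

    for pfa in (1e-14, 1e-7, 1e-1):
        print(pfa, single_bin_pd(snr=30.0, pfa=pfa))
    ```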

  7. False-Positive Rate of AKI Using Consensus Creatinine-Based Criteria.

    PubMed

    Lin, Jennie; Fernandez, Hilda; Shashaty, Michael G S; Negoianu, Dan; Testani, Jeffrey M; Berns, Jeffrey S; Parikh, Chirag R; Wilson, F Perry

    2015-10-07

    Use of small changes in serum creatinine to diagnose AKI allows for earlier detection but may increase diagnostic false-positive rates because of inherent laboratory and biologic variabilities of creatinine. We examined serum creatinine measurement characteristics in a prospective observational clinical reference cohort of 2267 adult patients with AKI by Kidney Disease Improving Global Outcomes creatinine criteria and used these data to create a simulation cohort to model AKI false-positive rates. We simulated up to seven successive blood draws on an equal population of hypothetical patients with unchanging true serum creatinine values. Error terms generated from laboratory and biologic variabilities were added to each simulated patient's true serum creatinine value to obtain the simulated measured serum creatinine for each blood draw. We determined the proportion of patients who would be erroneously diagnosed with AKI by Kidney Disease Improving Global Outcomes creatinine criteria. Within the clinical cohort, 75.0% of patients received four serum creatinine draws within at least one 48-hour period during hospitalization. After four simulated creatinine measurements that accounted for laboratory variability calculated from assay characteristics and 4.4% of biologic variability determined from the clinical cohort and publicly available data, the overall false-positive rate for AKI diagnosis was 8.0% (interquartile range =7.9%-8.1%), whereas patients with true serum creatinine ≥1.5 mg/dl (representing 21% of the clinical cohort) had a false-positive AKI diagnosis rate of 30.5% (interquartile range =30.1%-30.9%) versus 2.0% (interquartile range =1.9%-2.1%) in patients with true serum creatinine values <1.5 mg/dl (P<0.001). Use of small serum creatinine changes to diagnose AKI is limited by high false-positive rates caused by inherent variability of serum creatinine at higher baseline values, potentially misclassifying patients with CKD in AKI studies. Copyright © 2015 by the American Society of Nephrology.
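
    A toy Monte Carlo in the spirit of the simulation described above. The 4.4% biologic variability is quoted in the abstract; the 3% laboratory CV, the assumption that all four draws fall within a single 48-hour window, and the use of only the "rise of at least 0.3 mg/dl within 48 hours" component of the KDIGO creatinine criterion are illustrative simplifications, not the paper's parameters.

    ```python
    # Toy simulation of false-positive AKI diagnoses from repeated creatinine draws.
    # Assumed, not from the paper: 3% laboratory CV, all draws within one 48 h window,
    # and only the ">= 0.3 mg/dl rise within 48 h" part of the KDIGO criterion.
    import numpy as np

    rng = np.random.default_rng(0)

    def false_positive_rate(true_cr, n_draws=4, n_patients=100_000,
                            lab_cv=0.03, bio_cv=0.044, delta=0.3):
        total_cv = np.sqrt(lab_cv**2 + bio_cv**2)
        measured = true_cr * (1 + total_cv * rng.standard_normal((n_patients, n_draws)))
        running_min = np.minimum.accumulate(measured, axis=1)     # lowest value seen so far
        flagged = (measured - running_min >= delta).any(axis=1)   # any later draw >= earlier + delta
        return flagged.mean()

    for cr in (0.8, 1.5, 2.5):   # true serum creatinine in mg/dl
        print(cr, false_positive_rate(cr))
    # Because the variability is multiplicative, higher baseline creatinine produces larger
    # absolute fluctuations and hence more spurious 0.3 mg/dl "increases".
    ```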

  8. Detection of influenza-like illness aberrations by directly monitoring Pearson residuals of fitted negative binomial regression models.

    PubMed

    Chan, Ta-Chien; Teng, Yung-Chu; Hwang, Jing-Shiang

    2015-02-21

    Emerging novel influenza outbreaks have increasingly been a threat to the public and a major concern of public health departments. Real-time data in seamless surveillance systems such as health insurance claims data for influenza-like illnesses (ILI) are ready for analysis, making it highly desirable to develop practical techniques to analyze such ready-made data for outbreak detection so that the public can receive timely influenza epidemic warnings. This study proposes a simple and effective approach to analyze area-based health insurance claims data including outpatient and emergency department (ED) visits for early detection of any aberrations of ILI. The health insurance claims data during 2004-2009 from a national health insurance research database were used for developing early detection methods. The proposed approach fitted the daily new ILI visits and monitored the Pearson residuals directly for aberration detection. First, negative binomial regression was used for both outpatient and ED visits to adjust for potentially influential factors such as holidays, weekends, seasons, temporal dependence and temperature. Second, if the Pearson residuals exceeded 1.96, aberration signals were issued. The empirical validation of the model was done in 2008 and 2009. In addition, we designed a simulation study to compare the time of outbreak detection, non-detection probability and false alarm rate between the proposed method and modified CUSUM. The model successfully detected the aberrations of the 2009 pandemic (H1N1) influenza virus in northern, central and southern Taiwan. The proposed approach was more sensitive in identifying aberrations in ED visits than those in outpatient visits. Simulation studies demonstrated that the proposed approach could detect the aberrations earlier, and with lower non-detection probability and mean false alarm rate in detecting aberrations compared to modified CUSUM methods. The proposed simple approach was able to filter out temporal trends, adjust for temperature, and issue warning signals for the first wave of the influenza epidemic in a timely and accurate manner.
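
    A minimal sketch of the residual-monitoring step using statsmodels, assuming a daily pandas DataFrame with hypothetical covariate columns; the paper's full model additionally adjusts for season and temporal dependence.

    ```python
    # Fit a negative binomial regression to daily ILI visits and flag days whose
    # Pearson residual exceeds 1.96. Covariate names are hypothetical placeholders.
    import statsmodels.api as sm

    def flag_aberrations(daily, cutoff=1.96):
        """daily: pandas DataFrame indexed by date with columns
        'visits', 'weekend', 'holiday', 'temperature'."""
        X = sm.add_constant(daily[['weekend', 'holiday', 'temperature']])
        fit = sm.GLM(daily['visits'], X, family=sm.families.NegativeBinomial()).fit()
        pearson = fit.resid_pearson              # (observed - fitted) / sqrt(fitted variance)
        return daily.index[pearson > cutoff]     # dates flagged as ILI aberrations
    ```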

  9. Evaluating suggestibility to additive and contradictory misinformation following explicit error detection in younger and older adults.

    PubMed

    Huff, Mark J; Umanath, Sharda

    2018-06-01

    In 2 experiments, we assessed age-related suggestibility to additive and contradictory misinformation (i.e., remembering false details from an external source). After reading a fictional story, participants answered questions containing misleading details that were either additive (misleading details that supplemented an original event) or contradictory (errors that changed original details). On a final test, suggestibility was greater for additive than contradictory misinformation, and older adults endorsed fewer false contradictory details than younger adults. To mitigate suggestibility in Experiment 2, participants were warned about potential errors, instructed to detect errors, or instructed to detect errors after exposure to examples of additive and contradictory details. Again, suggestibility to additive misinformation was greater than contradictory, and older adults endorsed less contradictory misinformation. Only after detection instructions with misinformation examples were younger adults able to reduce contradictory misinformation effects and reduced these effects to the level of older adults. Additive misinformation, however, was immune to all warning and detection instructions. Thus, older adults were less susceptible to contradictory misinformation errors, and younger adults could match this misinformation rate when warning/detection instructions were strong. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. An UGS radar with micro-Doppler capabilities for wide area persistent surveillance

    NASA Astrophysics Data System (ADS)

    Tahmoush, Dave; Silvious, Jerry; Clark, John

    2010-04-01

    Detecting humans and distinguishing them from natural fauna is an important issue in security applications to reduce false alarm rates. In particular, it is important to detect and classify people who are walking in remote locations and transmit back detections over extended periods at a low cost and with minimal maintenance. The ability to discriminate humans from animals and vehicles at long range would give a distinct sensor advantage. The reduction in false positive detections due to animals would increase the usefulness of detections, while dismount identification could reduce friendly-fire incidents. We developed and demonstrated a compact radar technology that is scalable to a variety of ultra-lightweight and low-power platforms for wide area persistent surveillance as an unattended, unmanned, and man-portable ground sensor. The radar uses micro-Doppler processing to characterize the tracks of moving targets and to then eliminate unimportant detections due to animals or civilian activity. This paper presents the system and data on humans, vehicles, and animals at multiple angles and directions of motion, demonstrates the signal processing approach that makes the targets visually recognizable, and verifies that the UGS radar has enough micro-Doppler capability to distinguish between humans, vehicles, and animals.

  11. HacDivSel: Two new methods (haplotype-based and outlier-based) for the detection of divergent selection in pairs of populations

    PubMed Central

    2017-01-01

    The detection of genomic regions involved in local adaptation is an important topic in current population genetics. There are several detection strategies available depending on the kind of genetic and demographic information at hand. A common drawback is the high risk of false positives. In this study we introduce two complementary methods for the detection of divergent selection from populations connected by migration. Both methods have been developed with the aim of being robust to false positives. The first method combines haplotype information with inter-population differentiation (FST). Evidence of divergent selection is concluded only when both the haplotype pattern and the FST value support it. The second method is developed for independently segregating markers i.e. there is no haplotype information. In this case, the power to detect selection is attained by developing a new outlier test based on detecting a bimodal distribution. The test computes the FST outliers and then assumes that those of interest would have a different mode. We demonstrate the utility of the two methods through simulations and the analysis of real data. The simulation results showed power ranging from 60–95% in several of the scenarios whilst the false positive rate was controlled below the nominal level. The analysis of real samples consisted of phased data from the HapMap project and unphased data from intertidal marine snail ecotypes. The results illustrate that the proposed methods could be useful for detecting locally adapted polymorphisms. The software HacDivSel implements the methods explained in this manuscript. PMID:28423003

  12. Fishing in the Water: Effect of Sampled Water Volume on Environmental DNA-Based Detection of Macroinvertebrates.

    PubMed

    Mächler, Elvira; Deiner, Kristy; Spahn, Fabienne; Altermatt, Florian

    2016-01-05

    Accurate detection of organisms is crucial for the effective management of threatened and invasive species because false detections directly affect the implementation of management actions. The use of environmental DNA (eDNA) as a species detection tool is in a rapid development stage; however, concerns about accurate detections using eDNA have been raised. We evaluated the effect of sampled water volume (0.25 to 2 L) on the detection rate for three macroinvertebrate species. Additionally, we tested (depending on the sampled water volume) what amount of total extracted DNA should be screened to reduce uncertainty in detections. We found that all three species were detected in all volumes of water. Surprisingly, however, only one species had a positive relationship between an increased sample volume and an increase in the detection rate. We conclude that the optimal sample volume might depend on the species-habitat combination and should be tested for the system where management actions are warranted. Nevertheless, we recommend sampling water volumes of at least 1 L and screening at least 14 μL of extracted eDNA for each sample to reduce uncertainty in detections when studying macroinvertebrates in rivers and using our molecular workflow.

  13. Two-step glutamate dehydrogenase antigen real-time polymerase chain reaction assay for detection of toxigenic Clostridium difficile.

    PubMed

    Goldenberg, S D; Cliff, P R; Smith, S; Milner, M; French, G L

    2010-01-01

    Current diagnosis of Clostridium difficile infection (CDI) relies upon detection of toxins A/B in stool by enzyme immunoassay [EIA(A/B)]. This strategy is unsatisfactory because it has a low sensitivity resulting in significant false negatives. We investigated the performance of a two-step algorithm for diagnosis of CDI using detection of glutamate dehydrogenase (GDH). GDH-positive samples were tested for C. difficile toxin B gene (tcdB) by polymerase chain reaction (PCR). The performance of the two-step protocol was compared with toxin detection by the Meridian Premier EIA kit in 500 consecutive stool samples from patients with suspected CDI. The reference standard among samples that were positive by either EIA(A/B) or GDH testing was culture cytotoxin neutralisation (culture/CTN). Thirty-six (7%) of 500 samples were identified as true positives by culture/CTN. EIA(A/B) identified 14 of the positive specimens with 22 false negatives and two false positives. The two-step protocol identified 34 of the positive samples with two false positives and two false negatives. EIA(A/B) had a sensitivity of 39%, specificity of 99%, positive predictive value of 88% and negative predictive value of 95%. The two-step algorithm performed better, with corresponding values of 94%, 99%, 94% and 99% respectively. Screening for GDH before confirmation of positives by PCR is cheaper than screening all specimens by PCR and is an effective method for routine use. Current EIA(A/B) tests for CDI are of inadequate sensitivity and should be replaced; however, this may result in apparent changes in CDI rates that would need to be explained in national surveillance statistics. Copyright 2009 The Hospital Infection Society. Published by Elsevier Ltd. All rights reserved.
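
    The reported operating characteristics follow directly from the counts in the abstract (500 samples, 36 culture/CTN positives); the short check below reproduces them.

    ```python
    # Reproduce the sensitivity, specificity, PPV and NPV reported above from the
    # confusion-matrix counts given in the abstract (500 samples, 36 true positives).
    def metrics(tp, fp, fn, tn):
        return dict(sensitivity=tp / (tp + fn), specificity=tn / (tn + fp),
                    ppv=tp / (tp + fp), npv=tn / (tn + fn))

    total, positives = 500, 36
    # EIA(A/B): 14 detected, 22 false negatives, 2 false positives
    print(metrics(tp=14, fp=2, fn=22, tn=total - positives - 2))
    # Two-step GDH -> tcdB PCR: 34 detected, 2 false positives, 2 false negatives
    print(metrics(tp=34, fp=2, fn=2, tn=total - positives - 2))
    ```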

  14. A pdf-Free Change Detection Test Based on Density Difference Estimation.

    PubMed

    Bu, Li; Alippi, Cesare; Zhao, Dongbin

    2018-02-01

    The ability to detect online changes in stationarity or time variance in a data stream is a hot research topic with striking implications. In this paper, we propose a novel probability density function-free change detection test, which is based on the least squares density-difference estimation method and operates online on multidimensional inputs. The test does not require any assumption about the underlying data distribution, and is able to operate immediately after having been configured by adopting a reservoir sampling mechanism. Thresholds required to detect a change are automatically derived once a false positive rate is set by the application designer. Comprehensive experiments validate the effectiveness of the proposed method in terms of both detection promptness and accuracy.

  15. Preliminary study of detection of buried landmines using a programmable hyperspectral imager

    NASA Astrophysics Data System (ADS)

    McFee, John E.; Ripley, Herb T.; Buxton, Roger; Thriscutt, Andrew M.

    1996-05-01

    Experiments were conducted to determine if buried mines could be detected by measuring the change in reflectance spectra of vegetation above mine burial sites. Mines were laid using hand methods and simulated mechanical methods and spectral images were obtained over a three month period using a casi hyperspectral imager scanned from a personnel lift. Mines were not detectable by measurement of the shift of the red edge of vegetative spectra. By calculating the linear correlation coefficient image, some mines in light vegetative cover (grass, grass/blueberries) were apparently detected, but mines buried in heavy vegetation cover (deep ferns) were not detectable. Due to problems with ground truthing, accurate probabilities of detection and false alarm rates were not obtained.

  16. Sensitive test for sea mine identification based on polarization-aided image processing.

    PubMed

    Leonard, I; Alfalou, A; Brosseau, C

    2013-12-02

    Techniques are widely sought to detect and identify sea mines. This issue is characterized by complicated mine shapes and underwater light propagation dependencies. In a preliminary study we use a preprocessing step for denoising underwater images before applying the algorithm for mine detection. Once a mine is detected, the protocol for identifying it is activated. Among many correlation filters, we have focused our attention on the asymmetric segmented phase-only filter for quantifying the recognition rate because it allows us to significantly increase the number of reference images in the fabrication of this filter. Yet they are not entirely satisfactory in terms of recognition rate, and the obtained images proved to be of low quality. In this report, we propose a way to improve upon this preliminary study by using a single-wavelength polarimetric camera in order to denoise the images. This permits us to enhance images and improve depth visibility. We present illustrative results using in situ polarization imaging of a target through a milk-water mixture and demonstrate that our challenging objective of increasing the detection rate and decreasing the false alarm rate has been achieved.

  17. Generic, scalable and decentralized fault detection for robot swarms.

    PubMed

    Tarapore, Danesh; Christensen, Anders Lyhne; Timmis, Jon

    2017-01-01

    Robot swarms are large-scale multirobot systems with decentralized control which means that each robot acts based only on local perception and on local coordination with neighboring robots. The decentralized approach to control confers a number of potential benefits. In particular, inherent scalability and robustness are often highlighted as key distinguishing features of robot swarms compared with systems that rely on traditional approaches to multirobot coordination. It has, however, been shown that swarm robotics systems are not always fault tolerant. To realize the robustness potential of robot swarms, it is thus essential to give systems the capacity to actively detect and accommodate faults. In this paper, we present a generic fault-detection system for robot swarms. We show how robots with limited and imperfect sensing capabilities are able to observe and classify the behavior of one another. In order to achieve this, the underlying classifier is an immune system-inspired algorithm that learns to distinguish between normal behavior and abnormal behavior online. Through a series of experiments, we systematically assess the performance of our approach in a detailed simulation environment. In particular, we analyze our system's capacity to correctly detect robots with faults, false positive rates, performance in a foraging task in which each robot exhibits a composite behavior, and performance under perturbations of the task environment. Results show that our generic fault-detection system is robust, that it is able to detect faults in a timely manner, and that it achieves a low false positive rate. The developed fault-detection system has the potential to enable long-term autonomy for robust multirobot systems, thus increasing the usefulness of robots for a diverse repertoire of upcoming applications in the area of distributed intelligent automation.

  18. Generic, scalable and decentralized fault detection for robot swarms

    PubMed Central

    Christensen, Anders Lyhne; Timmis, Jon

    2017-01-01

    Robot swarms are large-scale multirobot systems with decentralized control which means that each robot acts based only on local perception and on local coordination with neighboring robots. The decentralized approach to control confers a number of potential benefits. In particular, inherent scalability and robustness are often highlighted as key distinguishing features of robot swarms compared with systems that rely on traditional approaches to multirobot coordination. It has, however, been shown that swarm robotics systems are not always fault tolerant. To realize the robustness potential of robot swarms, it is thus essential to give systems the capacity to actively detect and accommodate faults. In this paper, we present a generic fault-detection system for robot swarms. We show how robots with limited and imperfect sensing capabilities are able to observe and classify the behavior of one another. In order to achieve this, the underlying classifier is an immune system-inspired algorithm that learns to distinguish between normal behavior and abnormal behavior online. Through a series of experiments, we systematically assess the performance of our approach in a detailed simulation environment. In particular, we analyze our system’s capacity to correctly detect robots with faults, false positive rates, performance in a foraging task in which each robot exhibits a composite behavior, and performance under perturbations of the task environment. Results show that our generic fault-detection system is robust, that it is able to detect faults in a timely manner, and that it achieves a low false positive rate. The developed fault-detection system has the potential to enable long-term autonomy for robust multirobot systems, thus increasing the usefulness of robots for a diverse repertoire of upcoming applications in the area of distributed intelligent automation. PMID:28806756

  19. Comparison between Scalp EEG and Behind-the-Ear EEG for Development of a Wearable Seizure Detection System for Patients with Focal Epilepsy

    PubMed Central

    Gu, Ying; Cleeren, Evy; Dan, Jonathan; Claes, Kasper; Hunyadi, Borbála

    2017-01-01

    A wearable electroencephalogram (EEG) device for continuous monitoring of patients suffering from epilepsy would provide valuable information for the management of the disease. Currently no EEG setup is small and unobtrusive enough to be used in daily life. Recording behind the ear could prove to be a solution for a wearable EEG setup. This article examines the feasibility of recording epileptic EEG from behind the ear. This is achieved by comparison with scalp EEG recordings. Traditional scalp EEG and behind-the-ear EEG were simultaneously acquired from 12 patients with temporal, parietal, or occipital lobe epilepsy. Behind-the-ear EEG consisted of cross-head channels and unilateral channels. The analysis of electrooculography (EOG) artifacts resulting from eye blinking showed that EOG artifacts were absent on cross-head channels and had significantly smaller amplitudes on unilateral channels. The temporal waveform and frequency content during seizures from behind-the-ear EEG visually resembled those from scalp EEG. Further, coherence analysis confirmed that behind-the-ear EEG acquired meaningful epileptic discharges similarly to scalp EEG. Moreover, automatic seizure detection based on support vector machine (SVM) showed that comparable seizure detection performance can be achieved using these two recordings. With scalp EEG, detection had a median sensitivity of 100% and a false detection rate of 1.14 per hour, while, with behind-the-ear EEG, it had a median sensitivity of 94.5% and a false detection rate of 0.52 per hour. These findings demonstrate the feasibility of detecting seizures from EEG recordings behind the ear for patients with focal epilepsy. PMID:29295522

  20. Reliability analysis and fault-tolerant system development for a redundant strapdown inertial measurement unit. [inertial platforms

    NASA Technical Reports Server (NTRS)

    Motyka, P.

    1983-01-01

    A methodology is developed and applied for quantitatively analyzing the reliability of a dual, fail-operational redundant strapdown inertial measurement unit (RSDIMU). A Markov evaluation model is defined in terms of the operational states of the RSDIMU to predict system reliability. A 27-state model is defined based upon a candidate redundancy management system which can detect and isolate a spectrum of failure magnitudes. The results of parametric studies are presented which show the effect on reliability of the gyro failure rate, both the gyro and accelerometer failure rates together, false alarms, probability of failure detection, probability of failure isolation, probability of damage effects, and mission time. A technique is developed and evaluated for generating dynamic thresholds for detecting and isolating failures of the dual, separated IMU. Special emphasis is given to the detection of multiple, nonconcurrent failures. Digital simulation time histories are presented which show the thresholds obtained and their effectiveness in detecting and isolating sensor failures.
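
    As an illustration of the Markov evaluation approach only (not the 27-state RSDIMU model), the sketch below solves a three-state fail-operational miniature with a hypothetical failure rate and detection coverage and reports reliability over mission time.

    ```python
    # Three-state continuous-time Markov miniature of a fail-operational dual unit.
    # Parameters are hypothetical; the real model in the report has 27 states.
    import numpy as np
    from scipy.linalg import expm

    lam = 1e-4      # per-hour failure rate of one unit (assumed)
    cov = 0.98      # probability a failure is detected and isolated (assumed)

    # States: 0 = both units good, 1 = one failure detected (fail-operational), 2 = system failed.
    Q = np.array([
        [-2 * lam,  2 * lam * cov,  2 * lam * (1 - cov)],
        [0.0,      -lam,            lam                ],
        [0.0,       0.0,            0.0                ],
    ])

    p0 = np.array([1.0, 0.0, 0.0])
    for t in (10.0, 100.0, 1000.0):            # mission time in hours
        pt = p0 @ expm(Q * t)                  # P(t) = P(0) exp(Q t)
        print(t, "reliability =", 1.0 - pt[2])
    ```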

  1. Automated Epileptic Seizure Detection Based on Wearable ECG and PPG in a Hospital Environment

    PubMed Central

    De Cooman, Thomas; Gu, Ying; Cleeren, Evy; Claes, Kasper; Van Paesschen, Wim; Van Huffel, Sabine; Hunyadi, Borbála

    2017-01-01

    Electrocardiography has added value for the automatic detection of seizures in temporal lobe epilepsy (TLE) patients. The wired hospital system is not suited for a long-term seizure detection system at home. To address this need, the performance of two wearable devices, based on electrocardiography (ECG) and photoplethysmography (PPG), is compared with hospital ECG using an existing seizure detection algorithm. This algorithm classifies the seizures on the basis of heart rate features, extracted from the heart rate increase. The algorithm was applied to recordings of 11 patients in a hospital setting with 701 h capturing 47 (fronto-)temporal lobe seizures. The sensitivities of the hospital system, the wearable ECG device and the wearable PPG device were 57%, 70% and 32%, respectively, with corresponding false alarm rates of 1.92, 2.11 and 1.80 per hour. Whereas seizure detection performance using the wrist-worn PPG device was considerably lower, the performance using the wearable ECG proved to be similar to that of the hospital ECG. PMID:29027928

  2. Automated Epileptic Seizure Detection Based on Wearable ECG and PPG in a Hospital Environment.

    PubMed

    Vandecasteele, Kaat; De Cooman, Thomas; Gu, Ying; Cleeren, Evy; Claes, Kasper; Paesschen, Wim Van; Huffel, Sabine Van; Hunyadi, Borbála

    2017-10-13

    Electrocardiography has added value for the automatic detection of seizures in temporal lobe epilepsy (TLE) patients. The wired hospital system is not suited for a long-term seizure detection system at home. To address this need, the performance of two wearable devices, based on electrocardiography (ECG) and photoplethysmography (PPG), is compared with hospital ECG using an existing seizure detection algorithm. This algorithm classifies the seizures on the basis of heart rate features, extracted from the heart rate increase. The algorithm was applied to recordings of 11 patients in a hospital setting with 701 h capturing 47 (fronto-)temporal lobe seizures. The sensitivities of the hospital system, the wearable ECG device and the wearable PPG device were 57%, 70% and 32%, respectively, with corresponding false alarm rates of 1.92, 2.11 and 1.80 per hour. Whereas seizure detection performance using the wrist-worn PPG device was considerably lower, the performance using the wearable ECG proved to be similar to that of the hospital ECG.

  3. Empirical Bayes method for reducing false discovery rates of correlation matrices with block diagonal structure.

    PubMed

    Pacini, Clare; Ajioka, James W; Micklem, Gos

    2017-04-12

    Correlation matrices are important in inferring relationships and networks between regulatory or signalling elements in biological systems. With currently available technology, sample sizes for experiments are typically small, meaning that these correlations can be difficult to estimate. At a genome-wide scale estimation of correlation matrices can also be computationally demanding. We develop an empirical Bayes approach to improve covariance estimates for gene expression, where we assume the covariance matrix takes a block diagonal form. Our method shows lower false discovery rates than existing methods on simulated data. Applied to a real data set from Bacillus subtilis, we demonstrate its ability to detect known regulatory units and interactions between them. We demonstrate that, compared to existing methods, our method is able to find significant covariances and also to control false discovery rates, even when the sample size is small (n=10). The method can be used to find potential regulatory networks, and it may also be used as a pre-processing step for methods that calculate, for example, partial correlations, so enabling the inference of the causal and hierarchical structure of the networks.

  4. Experimental evaluation of fingerprint verification system based on double random phase encoding

    NASA Astrophysics Data System (ADS)

    Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi

    2006-03-01

    We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurate verification of an authorized individual decreases when the fingerprint is significantly shifted. In this paper, a review of the proposed system is presented and a preprocessing step for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
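
    The paper estimates the displacement with an optimized core-detection template; as a generic stand-in (an assumption, not the authors' method), the sketch below estimates the translation between two images by standard phase correlation.

    ```python
    # Standard phase-correlation estimate of the integer translation between two images.
    import numpy as np

    def estimate_shift(img_a, img_b):
        """Estimate the (row, col) displacement of img_b relative to img_a."""
        Fa, Fb = np.fft.fft2(img_a), np.fft.fft2(img_b)
        cross_power = np.conj(Fa) * Fb
        cross_power /= np.abs(cross_power) + 1e-12          # keep phase information only
        corr = np.fft.ifft2(cross_power).real
        peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
        dims = np.array(corr.shape, dtype=float)
        peak[peak > dims / 2] -= dims[peak > dims / 2]       # wrap to signed displacements
        return tuple(peak)

    # If the estimated displacement exceeded the permissible level, the user would be
    # asked to present the fingerprint again, as in the preprocessing step above.
    ```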

  5. TeraSCREEN: multi-frequency multi-mode Terahertz screening for border checks

    NASA Astrophysics Data System (ADS)

    Alexander, Naomi E.; Alderman, Byron; Allona, Fernando; Frijlink, Peter; Gonzalo, Ramón; Hägelen, Manfred; Ibáñez, Asier; Krozer, Viktor; Langford, Marian L.; Limiti, Ernesto; Platt, Duncan; Schikora, Marek; Wang, Hui; Weber, Marc Andree

    2014-06-01

    The challenge for any security screening system is to identify potentially harmful objects such as weapons and explosives concealed under clothing. Classical border and security checkpoints are no longer capable of fulfilling the demands of today's ever growing security requirements, especially with respect to the high throughput generally required which entails a high detection rate of threat material and a low false alarm rate. TeraSCREEN proposes to develop an innovative concept of multi-frequency multi-mode Terahertz and millimeter-wave detection with new automatic detection and classification functionalities. The system developed will demonstrate, at a live control point, the safe automatic detection and classification of objects concealed under clothing, whilst respecting privacy and increasing current throughput rates. This innovative screening system will combine multi-frequency, multi-mode images taken by passive and active subsystems which will scan the subjects and obtain complementary spatial and spectral information, thus allowing for automatic threat recognition. The TeraSCREEN project, which will run from 2013 to 2016, has received funding from the European Union's Seventh Framework Programme under the Security Call. This paper will describe the project objectives and approach.

  6. Pedestrian detection in infrared image using HOG and Autoencoder

    NASA Astrophysics Data System (ADS)

    Chen, Tianbiao; Zhang, Hao; Shi, Wenjie; Zhang, Yu

    2017-11-01

    In order to guarantee the safety of driving at night, a vehicle-mounted night vision system is used to detect pedestrians in front of the car and to raise an alarm against potential danger. To decrease the false positive rate (FPR) and increase the true positive rate (TPR), a pedestrian detection method based on HOG and Autoencoder (HOG+Autoencoder) is presented. Firstly, the HOG features of the input images are computed and encoded by the autoencoder. Then the encoded features are classified by Softmax. During training, the autoencoder was trained without supervision and the Softmax classifier with supervision; the two were then stacked into a single model and fine-tuned on labeled images. An experiment was conducted to compare the detection performance of HOG and HOG+Autoencoder, using images collected by a vehicle-mounted infrared camera. There were 80,000 images in the training set and 20,000 in the testing set, with a ratio of 1:3 between positive and negative images. The results show that at a TPR of 95%, the FPR of HOG+Autoencoder is 0.4%, while the FPR of HOG alone is 5%.
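
    A minimal scikit-image/scikit-learn stand-in for the pipeline, under stated assumptions: an MLPRegressor trained to reconstruct its input plays the role of the autoencoder, multinomial logistic regression plays the role of the Softmax layer, and the joint fine-tuning stage is omitted.

    ```python
    # HOG features -> autoencoder code -> softmax-style classifier (simplified stand-in).
    import numpy as np
    from skimage.feature import hog
    from sklearn.neural_network import MLPRegressor
    from sklearn.linear_model import LogisticRegression

    def hog_features(images):
        """images: iterable of equally sized grayscale arrays."""
        return np.array([hog(img, orientations=9, pixels_per_cell=(8, 8),
                             cells_per_block=(2, 2)) for img in images])

    def train(images, labels, code_size=64):
        X = hog_features(images)
        ae = MLPRegressor(hidden_layer_sizes=(code_size,), activation='relu',
                          max_iter=500).fit(X, X)           # unsupervised: reconstruct the HOG vector
        encode = lambda F: np.maximum(F @ ae.coefs_[0] + ae.intercepts_[0], 0.0)  # hidden-layer code
        clf = LogisticRegression(max_iter=1000).fit(encode(X), labels)            # supervised stage
        return encode, clf

    def predict(encode, clf, images):
        return clf.predict(encode(hog_features(images)))
    ```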

  7. Availability of tissue rinse liquid-based cytology for the rapid diagnosis of sentinel lymph node metastasis and improved bilateral detection by photodynamic eye camera.

    PubMed

    Kato, Hidenori; Ohba, Yoko; Yamazaki, Hiroyuki; Minobe, Shin-Ichiro; Sudo, Satoko; Todo, Yukiharu; Okamoto, Kazuhira; Yamashiro, Katsushige

    2015-08-01

    In sentinel lymph node navigation surgery for early invasive cervical cancer, high sensitivity and specificity require that sentinel nodes be detected bilaterally and that the pathological diagnosis be sensitive enough to detect micrometastasis. To address these problems, we evaluated tissue rinse liquid-based cytology and the photodynamic eye. From 2005 to 2013, 102 patients with Stage Ib1 uterine cervical cancer were subjected to sentinel lymph node navigation surgery with Technetium-99 m colloid and blue dye. For the most recent 11 patients in whom bilateral sentinel node detection was not achieved, the photodynamic eye was selectively examined. The detected sentinel node was cut along the minor axis into 2 mm slices, soaked in 10 ml CytoRich red and then subjected to tissue rinse liquid-based cytology at the time of surgery. With the accumulation of 102 Ib1 patients subjected to sentinel lymph node navigation surgery, the bilateral sentinel node detection rate was 67.7%. The photodynamic eye was examined for the recent 11 patients who did not have bilateral signals. Out of these 11, 10 patients obtained bilateral signals successfully. During the period of examining the photodynamic eye, a total of 34 patients were subjected to sentinel lymph node navigation surgery. Thus, the overall bilateral detection rate increased to 97% in this subset. Two hundred and five lymph nodes were available as sentinel nodes. The sensitivity of tissue rinse liquid-based cytology was 91.7%, and the specificity was 100%. False positivity was 0% and false negativity was 8.3%. Detection failure was observed only for one micrometastasis and one case of isolated tumor cells. The combination of photodynamic eye detection and tissue rinse liquid-based cytology can be a promising method for more reliable sentinel node detection. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. Comparison of ASGARD and UFOCapture

    NASA Technical Reports Server (NTRS)

    Blaauw, Rhiannon C.; Cruse, Katherine S.

    2011-01-01

    The Meteoroid Environment Office is undertaking a comparison between UFOCapture/Analyzer and ASGARD (All Sky and Guided Automatic Realtime Detection). To accomplish this, video output from a Watec video camera on a 17 mm Schneider lens (25 degree field of view) was split and input into the two different meteor detection software packages. The purpose of this study is to compare the sensitivity, false alarm rates, and trajectory information of the two systems, among other quantities. The important components of each software package are highlighted, and comments are made about the detection/rejection algorithms and the amount of user labor required for each system.

  9. Neyman Pearson detection of K-distributed random variables

    NASA Astrophysics Data System (ADS)

    Tucker, J. Derek; Azimi-Sadjadi, Mahmood R.

    2010-04-01

    In this paper a new detection method for sonar imagery is developed in K-distributed background clutter. The equation for the log-likelihood is derived and compared to the corresponding counterparts derived for the Gaussian and Rayleigh assumptions. Test results of the proposed method on a data set of synthetic underwater sonar images are also presented. This database contains images with targets of different shapes inserted into backgrounds generated using a correlated K-distributed model. Results illustrating the effectiveness of the K-distributed detector are presented in terms of probability of detection, false alarm rate, and correct classification rate for various bottom clutter scenarios.
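
    Not the closed-form log-likelihood derived in the paper, but a complementary sketch: simulate K-distributed clutter through its compound representation (gamma texture modulating unit-power Rayleigh speckle) and pick the amplitude threshold that attains a target false alarm rate. The shape parameter below is an illustrative value.

    ```python
    # Monte Carlo threshold selection for a target false alarm rate in K-distributed clutter.
    import numpy as np

    rng = np.random.default_rng(1)

    def k_clutter_amplitude(n, shape=2.0, mean_power=1.0):
        texture = rng.gamma(shape, scale=mean_power / shape, size=n)  # local clutter power
        speckle = rng.rayleigh(scale=np.sqrt(0.5), size=n)            # unit-power Rayleigh speckle
        return np.sqrt(texture) * speckle

    def threshold_for_pfa(pfa, shape=2.0, n=2_000_000):
        return np.quantile(k_clutter_amplitude(n, shape), 1.0 - pfa)

    for pfa in (1e-2, 1e-3, 1e-4):
        print(pfa, threshold_for_pfa(pfa))
    ```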

  10. Detection of right-to-left shunts: comparison between the International Consensus and Spencer Logarithmic Scale criteria.

    PubMed

    Lao, Annabelle Y; Sharma, Vijay K; Tsivgoulis, Georgios; Frey, James L; Malkoff, Marc D; Navarro, Jose C; Alexandrov, Andrei V

    2008-10-01

    International Consensus Criteria (ICC) consider right-to-left shunt (RLS) present when Transcranial Doppler (TCD) detects even one microbubble (microB). Spencer Logarithmic Scale (SLS) offers more grades of RLS with detection of >30 microB corresponding to a large shunt. We compared the yield of ICC and SLS in detection and quantification of a large RLS. We prospectively evaluated paradoxical embolism in consecutive patients with ischemic strokes or transient ischemic attack (TIA) using injections of 9 cc saline agitated with 1 cc of air. Results were classified according to ICC [negative (no microB), grade I (1-20 microB), grade II (>20 microB or "shower" appearance of microB), and grade III ("curtain" appearance of microB)] and SLS criteria [negative (no microB), grade I (1-10 microB), grade II (11-30 microB), grade III (31-100 microB), grade IV (101-300 microB), grade V (>300 microB)]. The RLS size was defined as large (>4 mm) using diameter measurement of the septal defects on transesophageal echocardiography (TEE). TCD comparison to TEE showed 24 true positive, 48 true negative, 4 false positive, and 2 false negative cases (sensitivity 92.3%, specificity 92.3%, positive predictive value (PPV) 85.7%, negative predictive value (NPV) 96%, and accuracy 92.3%) for any RLS presence. Both ICC and SLS were 100% sensitive for detection of large RLS. ICC and SLS criteria yielded a false positive rate of 24.4% and 7.7%, respectively, when compared to TEE. Although both grading scales provide agreement as to any shunt presence, using the Spencer Scale grade III or higher can decrease by one-half the number of false positive TCD diagnoses predicting large RLS on TEE.

  11. Visual analysis of trash bin processing on garbage trucks in low resolution video

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Loibner, Gernot

    2015-03-01

    We present a system for trash can detection and counting from a camera which is mounted on a garbage collection truck. A working prototype has been successfully implemented and tested with several hours of real-world video. The detection pipeline consists of HOG detectors for two trash can sizes, and mean-shift tracking and low-level image processing for the analysis of the garbage disposal process. Considering the harsh environment and unfavorable imaging conditions, the process already works well enough that very useful measurements can be extracted from the video data. The false positive/false negative rate of the full processing pipeline is about 5-6% in fully automatic operation. Video data of a full day (about 8 hrs) can be processed in about 30 minutes on a standard PC.

  12. Open-area concealed-weapon detection system

    NASA Astrophysics Data System (ADS)

    Pati, P.; Mather, P.

    2011-06-01

    Concealed Weapon Detection (CWD) has become a significant challenge to present-day security needs; individuals carrying weapons into airplanes, schools, and secured establishments are a threat to public security. Although controlled screening of people for concealed weapons has been employed in many establishments, procedures and equipment are designed to work in restricted environments such as airport passport control, military checkpoints, hospitals, and school and university entrances. Furthermore, screening systems do not effectively discriminate between threat and non-threat metal objects, leading to a high rate of false alarms, which can become a liability to the daily operational needs of establishments. Therefore, the design and development of a new CWD system that operates in a large open-area environment with large numbers of people, with a reduced incidence of false alarms and increased location accuracy, is essential.

  13. Using YOLO based deep learning network for real time detection and localization of lung nodules from low dose CT scans

    NASA Astrophysics Data System (ADS)

    Ramachandran S., Sindhu; George, Jose; Skaria, Shibon; V. V., Varun

    2018-02-01

    Lung cancer is the leading cause of cancer-related deaths in the world. The survival rate can be improved if lung nodules are detected early. This has also led to more focus being given to computer-aided detection (CAD) and diagnosis of lung nodules. The variability in shape, size and texture of lung nodules is a challenge to be faced when developing these detection systems. In the proposed work we use convolutional neural networks to learn the features for nodule detection, replacing the traditional method of handcrafting features like geometric shape or texture. Our network uses the DetectNet architecture based on YOLO (You Only Look Once) to detect the nodules in lung CT scans. In this architecture, object detection is treated as a regression problem with a single convolutional network simultaneously predicting multiple bounding boxes and class probabilities for those boxes. By training on chest CT scans from the Lung Image Database Consortium (LIDC) using NVIDIA DIGITS and the Caffe deep learning framework, we show that nodule detection using this single neural network can result in reasonably low false positive rates with high sensitivity and precision.
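
    A sketch of the evaluation step implied by the sensitivity/false-positive trade-off above: match predicted boxes to ground-truth nodule boxes by intersection over union (IoU) and count hits and false positives per scan. The 0.5 IoU threshold is a common default, not a value taken from the paper.

    ```python
    # Greedy IoU matching of predicted boxes (x1, y1, x2, y2) to ground-truth nodule boxes.
    def iou(a, b):
        ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
        iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
        inter = ix * iy
        area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
        return inter / (area(a) + area(b) - inter) if inter > 0 else 0.0

    def evaluate(predictions, ground_truth, thr=0.5):
        matched, tp = set(), 0
        for p in predictions:
            hit = next((i for i, g in enumerate(ground_truth)
                        if i not in matched and iou(p, g) >= thr), None)
            if hit is not None:
                matched.add(hit)
                tp += 1
        fp = len(predictions) - tp
        sensitivity = tp / len(ground_truth) if ground_truth else 0.0
        return sensitivity, fp   # fp gives false positives per scan when run per CT volume
    ```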

  14. Performance of fusion algorithms for computer-aided detection and classification of mines in very shallow water obtained from testing in navy Fleet Battle Exercise-Hotel 2000

    NASA Astrophysics Data System (ADS)

    Ciany, Charles M.; Zurawski, William; Kerfoot, Ian

    2001-10-01

    The performance of Computer Aided Detection/Computer Aided Classification (CAD/CAC) Fusion algorithms on side-scan sonar images was evaluated using data taken at the Navy's Fleet Battle Exercise-Hotel held in Panama City, Florida, in August 2000. A 2-of-3 binary fusion algorithm is shown to provide robust performance. The algorithm accepts the classification decisions and associated contact locations from three different CAD/CAC algorithms, clusters the contacts based on Euclidean distance, and then declares a valid target when a clustered contact is declared by at least 2 of the 3 individual algorithms. This simple binary fusion provided a 96 percent probability of correct classification at a false alarm rate of 0.14 false alarms per image per side. The performance represented a 3.8:1 reduction in false alarms over the best-performing single CAD/CAC algorithm, with no loss in probability of correct classification.
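
    A minimal sketch of the 2-of-3 voting rule described above: contacts are clustered by Euclidean distance (single-linkage, at an illustrative 5 m radius, which is not a value from the paper), and a cluster is declared a target when at least two distinct algorithms contributed a contact to it.

    ```python
    # 2-of-3 binary fusion of CAD/CAC contacts via distance clustering and voting.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    def fuse(contacts, radius=5.0, min_votes=2):
        """contacts: list of (x, y, algorithm_id). Returns fused target locations."""
        if len(contacts) < 2:
            return []                                       # cannot reach 2 votes
        coords = np.array([(x, y) for x, y, _ in contacts], dtype=float)
        algs = np.array([a for _, _, a in contacts])
        labels = fcluster(linkage(coords, method='single'), t=radius, criterion='distance')
        targets = []
        for lbl in np.unique(labels):
            members = labels == lbl
            if len(set(algs[members])) >= min_votes:        # contacts from >= 2 distinct algorithms
                targets.append(tuple(coords[members].mean(axis=0)))
        return targets

    # Hypothetical contacts: the first two agree within 5 m, the third is an isolated call.
    print(fuse([(10.0, 20.0, 'A'), (11.5, 20.5, 'B'), (300.0, 40.0, 'C')]))
    ```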

  15. The Large Synoptic Survey Telescope as a Near-Earth Object discovery machine

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne; Slater, Colin T.; Moeyens, Joachim; Allen, Lori; Axelrod, Tim; Cook, Kem; Ivezić, Željko; Jurić, Mario; Myers, Jonathan; Petry, Catherine E.

    2018-03-01

    Using the most recent prototypes, design, and as-built system information, we test and quantify the capability of the Large Synoptic Survey Telescope (LSST) to discover Potentially Hazardous Asteroids (PHAs) and Near-Earth Objects (NEOs). We empirically estimate an expected upper limit to the false detection rate in LSST image differencing, using measurements on DECam data and prototype LSST software, and find it to be about 450 deg⁻². We show that this rate is already tractable with the current prototype of the LSST Moving Object Processing System (MOPS) by processing a 30-day simulation consistent with the measured false detection rates. We proceed to evaluate the performance of the LSST baseline survey strategy for PHAs and NEOs using a high-fidelity simulated survey pointing history. We find that LSST alone, using its baseline survey strategy, will detect 66% of the PHA and 61% of the NEO population objects brighter than H = 22, with an uncertainty in the estimate of ±5 percentage points. By generating and examining variations on the baseline survey strategy, we show it is possible to further improve the discovery yields. In particular, we find that extending the LSST survey by two additional years and doubling the MOPS search window increases the completeness for PHAs to 86% (including those discovered by contemporaneous surveys) without jeopardizing other LSST science goals (77% for NEOs). This equates to reducing the undiscovered population of PHAs by an additional 26% (15% for NEOs), relative to the baseline survey.

  16. Affected sib pair tests in inbred populations.

    PubMed

    Liu, W; Weir, B S

    2004-11-01

    The affected-sib-pair (ASP) method for detecting linkage between a disease locus and marker loci was first established 50 years ago, and since then numerous modifications have been made. We modify two identity-by-state (IBS) test statistics of Lange (1986a, 1986b) to allow for inbreeding in the population. We evaluate the power and false positive rates of the modified tests under three disease models, using simulated data. Before estimating false positive rates, we demonstrate that IBS tests are tests of both linkage and linkage disequilibrium (LD) between marker and disease loci; therefore, the null hypothesis of IBS tests should be no linkage and no LD. When the population inbreeding coefficient is large, the false positive rates of Lange's tests become much larger than the nominal value, while those of our modified tests remain close to the nominal value. To estimate power with a controlled false positive rate, we choose the cutoff values based on simulated datasets under the null hypothesis, so that both Lange's tests and the modified tests generate the same false positive rate. The powers of Lange's z-test and our modified z-test are very close and do not change much with increasing inbreeding. The power of the modified chi-square test also stays stable when the inbreeding coefficient increases. However, the power of Lange's chi-square test increases with increasing inbreeding and is larger than that of our modified chi-square test for large inbreeding coefficients. The power is high under a recessive disease model for both Lange's tests and the modified tests, though the power is low for additive and dominant disease models. Allowing for inbreeding is therefore appropriate, at least for diseases known to be recessive.

  17. Conditional Outlier Detection for Clinical Alerting

    PubMed Central

    Hauskrecht, Milos; Valko, Michal; Batal, Iyad; Clermont, Gilles; Visweswaran, Shyam; Cooper, Gregory F.

    2010-01-01

    We develop and evaluate a data-driven approach for detecting unusual (anomalous) patient-management actions using past patient cases stored in an electronic health record (EHR) system. Our hypothesis is that patient-management actions that are unusual with respect to past patients may be due to a potential error and that it is worthwhile to raise an alert if such a condition is encountered. We evaluate this hypothesis using data obtained from the electronic health records of 4,486 post-cardiac surgical patients. We base the evaluation on the opinions of a panel of experts. The results support that anomaly-based alerting can have reasonably low false alert rates and that stronger anomalies are correlated with higher alert rates. PMID:21346986

  18. Conditional outlier detection for clinical alerting.

    PubMed

    Hauskrecht, Milos; Valko, Michal; Batal, Iyad; Clermont, Gilles; Visweswaran, Shyam; Cooper, Gregory F

    2010-11-13

    We develop and evaluate a data-driven approach for detecting unusual (anomalous) patient-management actions using past patient cases stored in an electronic health record (EHR) system. Our hypothesis is that patient-management actions that are unusual with respect to past patients may be due to a potential error and that it is worthwhile to raise an alert if such a condition is encountered. We evaluate this hypothesis using data obtained from the electronic health records of 4,486 post-cardiac surgical patients. We base the evaluation on the opinions of a panel of experts. The results support that anomaly-based alerting can have reasonably low false alert rates and that stronger anomalies are correlated with higher alert rates.

  19. The Direct Imaging Search for Earth 2.0: Quantifying Biases and Planetary False Positives

    NASA Astrophysics Data System (ADS)

    Guimond, Claire Marie; Cowan, Nicolas B.

    2018-06-01

    Direct imaging is likely the best way to characterize the atmospheres of Earth-sized exoplanets in the habitable zone of Sun-like stars. Previously, Stark et al. estimated the Earth twin yield of future direct imaging missions, such as LUVOIR and HabEx. We extend this analysis to other types of planets, which will act as false positives for Earth twins. We define an Earth twin as any exoplanet within half an e-folding of 1 au in semimajor axis and 1 R⊕ in planetary radius, orbiting a G-dwarf. Using Monte Carlo analyses, we quantify the biases and planetary false-positive rates of Earth searches. That is, given a pale dot at the correct projected separation and brightness to be a candidate Earth, what are the odds that it is, in fact, an Earth twin? Our notional telescope has a diameter of 10 m, an inner working angle of 3λ/D, and an outer working angle of 10λ/D (62 mas and 206 mas at 1.0 μm). With no precursor knowledge and one visit per star, 77% of detected candidate Earths are actually un-Earths; their mean radius is 2.3 R⊕, a sub-Neptune. The odds improve if we image every planet at its optimal orbital phase, either by relying on precursor knowledge or by performing multi-epoch direct imaging. In such a targeted search, 47% of detected Earth twin candidates are false positives, and they have a mean radius of 1.7 R⊕. The false-positive rate is insensitive to stellar spectral type and the assumption of circular orbits.

  20. Detection theory applied to high intensity focused ultrasound (HIFU) treatment evaluation

    NASA Astrophysics Data System (ADS)

    Sanghvi, Narendra; Wunderlich, Adam; Seip, Ralf; Tavakkoli, Jahangir; Dines, Kris; Baily, Michael; Crum, Lawrence

    2003-04-01

    The aim of this work is to develop a HIFU treatment evaluation algorithm based on 1-D pulse/echo (P/E) ultrasound data taken during HIFU exposures. The algorithm is applicable to large treatment volumes resulting from several overlapping elementary exposures. Treatments consisted of multiple HIFU exposures with an on-time of 3 seconds each, spaced 3 mm apart, and an off-time of 6 seconds in between HIFU exposures. The HIFU was paused for approximately 70 milliseconds every 0.5 seconds, while P/E data were acquired along the beam axis using a confocal imaging transducer. Data were collected from multiple in vitro and in vivo tissue treatments, including shams. The cumulative energy change in the P/E data was found for every HIFU exposure, as a function of depth. Subsequently, a likelihood ratio test with a fixed false alarm rate was used to derive a positive or negative lesion-creation decision for that position. For false alarm rates less than 5%, positive treatment outcomes were consistently detected for better than 90% of the HIFU exposures. In addition, the algorithm outcome correlated with the applied HIFU intensity level. Lesion formation was therefore successfully detected as a function of dosage. [Work supported by NIH SBIR Grant 2 R 44 CA 83244-02.]
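
    The decision rule described above fixes the false alarm rate and then tests each exposure. A minimal sketch of that idea, assuming the threshold is set from the empirical distribution of the detection statistic (cumulative energy change) under sham exposures, is shown below; the synthetic data and variable names are illustrative.

```python
# Hedged sketch of a fixed-false-alarm-rate decision rule: the threshold on a
# detection statistic is set from its empirical distribution under sham
# (no-lesion) exposures so that only a chosen fraction of shams exceed it.
# The synthetic distributions and names below are illustrative.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic detection statistics: sham exposures vs. actual HIFU exposures.
sham_stats = rng.normal(loc=0.0, scale=1.0, size=500)
hifu_stats = rng.normal(loc=3.0, scale=1.2, size=500)

false_alarm_rate = 0.05
# Threshold = the (1 - Pfa) quantile of the sham distribution.
threshold = np.quantile(sham_stats, 1.0 - false_alarm_rate)

detections = hifu_stats > threshold
print(f"Threshold for Pfa={false_alarm_rate:.0%}: {threshold:.2f}")
print(f"Detection rate on treated exposures: {detections.mean():.1%}")
```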

  1. Sentinel lymph node mapping in melanoma: the issue of false-negative findings.

    PubMed

    Manca, Gianpiero; Rubello, Domenico; Romanini, Antonella; Boni, Giuseppe; Chiacchio, Serena; Tredici, Manuel; Mazzarri, Sara; Duce, Valerio; Colletti, Patrick M; Volterrani, Duccio; Mariani, Giuliano

    2014-07-01

    Management of cutaneous melanoma has changed after the introduction into clinical routine of sentinel lymph node biopsy (SLNB) for nodal staging. By defining the nodal basin status, SLNB provides powerful prognostic information. Nevertheless, some debate still surrounds the accuracy of this procedure in terms of false-negative rate. Several large-scale studies have reported a relatively high false-negative rate (5.6%-21%), correctly defined as the proportion of false-negative results with respect to the total number of "actual" positive lymph nodes. In this review, we identified all the technical aspects that the nuclear medicine physician, the surgeon, and the pathologist should take into account to improve the accuracy of the procedure and minimize the false-negative rate. In particular, SPECT/CT imaging detects more SLNs than those found by planar lymphoscintigraphy. Furthermore, the nuclear medicine community should reach a consensus on the radioactive counting rate threshold to better guide the surgeon in identifying the lymph nodes with the highest likelihood of housing metastases ("true biologic SLNs"). Analysis of the harvested SLNs by conventional techniques is a further potential source of error. More accurate SLN analysis (eg, molecular analysis by reverse transcriptase-polymerase chain reaction) and more extensive SLN sampling identify more positive nodes, thus reducing the false-negative rate. The clinical factors identifying patients at higher risk of local recurrence after a negative SLNB include older age at diagnosis, deeper lesions, histological ulceration, and head-neck anatomic location of the primary lesion. The clinical impact of a false-negative SLNB on the prognosis of melanoma patients remains controversial, because the majority of studies have failed to demonstrate an overall statistically significant disadvantage in melanoma-specific survival for false-negative SLNB patients compared with true-positive SLNB patients. When new, more effective drugs become available in the adjuvant setting for stage III melanoma patients, the implications of an accurate staging procedure for the sentinel lymph nodes will be crucial for both patients and clinicians. Standardization and accuracy of SLN identification, removal, and analysis are required.

  2. Social-Cognitive Biases in Simulated Airline Luggage Screening

    NASA Technical Reports Server (NTRS)

    Brown, Jeremy R.; Madhavan, Poomima

    2011-01-01

    This study illustrated how social cognitive biases affect the decision-making process of airline luggage screeners. Participants (n = 96) performed a computer-simulated task to detect hidden weapons in 200 x-ray images of passenger luggage. Participants saw each image for two seconds (high time pressure) or six seconds (low time pressure). Participants observed pictures of the "passenger" who owned the luggage. The "pre-anchor" group answered questions about the passenger before the luggage image appeared, the "post-anchor" group answered questions after the luggage appeared, and the "no-anchor" group answered no questions. Participants either stopped or did not stop the bag, and rated their confidence in their decision. Participants under high time pressure had lower hit rates and higher false alarm rates. Significant differences between the pre-, no-, and post-anchor groups were based on the gender and race of the passengers. Participants had higher false alarm rates in response to male than female passengers.

  3. Quantitative Analysis of Situational Awareness (QUASA): Applying Signal Detection Theory to True/False Probes and Self-Ratings

    DTIC Science & Technology

    2004-06-01

    obtained. Further refinements of the technique based on recent research in experimental psychology are also considered. INTRODUCTION The key...an established line of research in psychology in which objective and subjective metrics are combined to analyse the degree of ‘calibration’ in... Creelman, 1991). A notable exception is the study by Kunimoto et al. (2001) in which confidence ratings were subjected to SDT analysis to evaluate the

  4. The effect of information about false negative and false positive rates on people's attitudes towards colorectal cancer screening using faecal occult blood testing (FOBt).

    PubMed

    Miles, Anne; Rodrigues, Vania; Sevdalis, Nick

    2013-11-01

    To examine the impact of numeric risk information about false negative (FN) and false positive (FP) rates in faecal occult blood testing (FOBt) on attitudes towards screening. 95 people aged 45-59, living in England, read 6 hypothetical vignettes, presented online, about the use of FOB testing to detect bowel cancer, in which information about FN and FP rates was systematically varied. Both verbal and numeric FN risk information reduced people's interest in screening compared with no FN information. Numeric FN risk information reduced people's perceptions of screening effectiveness and lowered perceived trust in the results of screening compared with both verbal FN information and no FN information. FP information did not affect attitudes towards FOB testing. There was limited evidence that FN information reduced interest and perceptions of screening effectiveness more in educated groups. Numeric FN risk information decreased people's perceptions of screening effectiveness and trust in the results of screening but did not affect people's interest in screening any more than verbal FN risk information. Numeric FN information could be added to patient information without affecting interest in screening, although this needs to be replicated in a larger, more representative sample. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  5. Validation of an Arab name algorithm in the determination of Arab ancestry for use in health research.

    PubMed

    El-Sayed, Abdulrahman M; Lauderdale, Diane S; Galea, Sandro

    2010-12-01

    Data about Arab-Americans, a growing ethnic minority, are not routinely collected in vital statistics, registry, or administrative data in the USA. The difficulty in identifying Arab-Americans using publicly available data sources is a barrier to health research about this group. Here, we validate an empirically based probabilistic Arab name algorithm (ANA) for identifying Arab-Americans in health research. We used data from all Michigan birth certificates between 2000 and 2005. Fathers' surnames and mothers' maiden names were coded as Arab or non-Arab according to the ANA. We calculated sensitivity, specificity, and positive (PPV) and negative predictive values (NPV) of Arab ethnicity inferred using the ANA as compared to self-reported Arab ancestry. Statewide, the ANA had a specificity of 98.9%, a sensitivity of 50.3%, a PPV of 57.0%, and an NPV of 98.6%. Both the false-positive and false-negative rates were higher among men than among women. As the concentration of Arab-Americans in a study locality increased, the ANA false-positive rate increased and false-negative rate decreased. The ANA is highly specific but only moderately sensitive as a means of detecting Arab ancestry. Future research should compare health characteristics among Arab-American populations defined by Arab ancestry and those defined by the ANA.
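
    The validation metrics quoted above follow directly from a 2x2 comparison of algorithm-inferred versus self-reported ancestry. The sketch below computes sensitivity, specificity, PPV, and NPV from such a table; the counts are toy values chosen only to roughly reproduce the reported statewide rates and are not the actual Michigan tallies.

```python
# Sketch of the validation metrics reported for the Arab name algorithm (ANA):
# sensitivity, specificity, PPV and NPV from a 2x2 comparison of algorithm-inferred
# vs. self-reported ancestry. The counts below are illustrative toy values chosen
# to roughly match the reported rates, not the actual study data.
def screening_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # fraction of true Arab-ancestry records flagged
    specificity = tn / (tn + fp)   # fraction of non-Arab records correctly not flagged
    ppv = tp / (tp + fp)           # probability a flagged record truly has Arab ancestry
    npv = tn / (tn + fn)           # probability an unflagged record truly does not
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = screening_metrics(tp=503, fp=380, fn=497, tn=34165)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")
```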

  6. Improving Spectral Image Classification through Band-Ratio Optimization and Pixel Clustering

    NASA Astrophysics Data System (ADS)

    O'Neill, M.; Burt, C.; McKenna, I.; Kimblin, C.

    2017-12-01

    The Underground Nuclear Explosion Signatures Experiment (UNESE) seeks to characterize non-prompt observables from underground nuclear explosions (UNE). As part of this effort, we evaluated the ability of DigitalGlobe's WorldView-3 (WV3) to detect and map UNE signatures. WV3 is the current state-of-the-art, commercial, multispectral imaging satellite; however, it has relatively limited spectral and spatial resolutions. These limitations impede image classifiers from detecting targets that are spatially small and lack distinct spectral features. In order to improve classification results, we developed custom algorithms to reduce false positive rates while increasing true positive rates via a band-ratio optimization and pixel clustering front-end. The clusters resulting from these algorithms were processed with standard spectral image classifiers such as Mixture-Tuned Matched Filter (MTMF) and Adaptive Coherence Estimator (ACE). WV3 and AVIRIS data of Cuprite, Nevada, were used as a validation data set. These data were processed with a standard classification approach using MTMF and ACE algorithms. They were also processed using the custom front-end prior to the standard approach. A comparison of the results shows that the custom front-end significantly increases the true positive rate and decreases the false positive rate. This work was done by National Security Technologies, LLC, under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy. DOE/NV/25946-3283.

  7. Use of Near-Infrared Spectroscopy and Chemometrics for the Nondestructive Identification of Concealed Damage in Raw Almonds (Prunus dulcis).

    PubMed

    Rogel-Castillo, Cristian; Boulton, Roger; Opastpongkarn, Arunwong; Huang, Guangwei; Mitchell, Alyson E

    2016-07-27

    Concealed damage (CD) is defined as a brown discoloration of the kernel interior (nutmeat) that appears only after moderate to high heat treatment (e.g., blanching, drying, roasting, etc.). Raw almonds with CD have no visible defects before heat treatment. Currently, there are no screening methods available for detecting CD in raw almonds. Herein, the feasibility of using near-infrared (NIR) spectroscopy between 1125 and 2153 nm for the detection of CD in almonds is demonstrated. Almond kernels with CD have lower NIR absorbance in the regions related to oil, protein, and carbohydrates. With the use of partial least squares discriminant analysis (PLS-DA) and selection of specific wavelengths, three classification models were developed. The calibration models have false-positive and false-negative error rates ranging between 12.4 and 16.1% and between 10.6 and 17.2%, respectively. The percent error rates ranged between 8.2 and 9.2%. Second-derivative preprocessing of the selected wavelengths resulted in the most robust predictive model.

  8. Application of matrix-assisted laser desorption ionization time-of-flight mass spectrometry in the screening of vanA-positive Enterococcus faecium.

    PubMed

    Wang, Li-jun; Lu, Xin-xin; Wu, Wei; Sui, Wen-jun; Zhang, Gui

    2014-01-01

    In order to evaluate a rapid matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) assay for screening vancomycin-resistant Enterococcus faecium, a total of 150 E. faecium clinical strains were studied, including 60 vancomycin-resistant E. faecium (VREF) isolates and 90 vancomycin-susceptible (VSEF) strains. Vancomycin resistance genes were detected by sequencing. E. faecium isolates were identified by MALDI-TOF MS. A genetic algorithm model with ClinProTools software was generated using spectra of 30 VREF isolates and 30 VSEF isolates. Using this model, 90 test isolates were discriminated between VREF and VSEF. The results showed that all sixty VREF isolates carried the vanA gene. The performance of VREF detection by the genetic algorithm model of MALDI-TOF MS compared to the sequencing method was: sensitivity = 80%, specificity = 90%, false positive rate = 10%, false negative rate = 10%, positive predictive value = 80%, negative predictive value = 90%. MALDI-TOF MS can be used as a screening test for discrimination between vanA-positive E. faecium and vanA-negative E. faecium.

  9. The performance of spatially offset Raman spectroscopy for liquid explosive detection

    NASA Astrophysics Data System (ADS)

    Loeffen, Paul W.; Maskall, Guy; Bonthron, Stuart; Bloomfield, Matthew; Tombling, Craig; Matousek, Pavel

    2016-10-01

    Aviation security requirements adopted in 2014 require liquids to be screened at most airports throughout Europe, North America and Australia. Cobalt's unique Spatially Offset Raman Spectroscopy (SORS™) technology has proven extremely effective at screening liquids, aerosols and gels (LAGS) with extremely low false alarm rates. SORS is compatible with a wide range of containers, including coloured, opaque or clear plastics, glass and paper, as well as duty-free bottles in STEBs (secure tamper-evident bags). Our award-winning Insight range has been specially developed for table-top screening at security checkpoints. Insight systems use our patented SORS technology for rapid and accurate chemical analysis of substances in unopened non-metallic containers. Insight100M™ and the latest member of the range - Insight200M™ - also screen metallic containers. Our unique systems screen liquids, aerosols and gels with the highest detection capability and lowest false alarm rates of any ECAC-approved scanner, with several hundred units already in use at airports including eight of the top ten European hubs. This paper presents an analysis of real performance data for these systems.

  10. Searching for Exoplanets using Artificial Intelligence

    NASA Astrophysics Data System (ADS)

    Pearson, Kyle Alexander; Palafox, Leon; Griffith, Caitlin Ann

    2017-10-01

    In the last decade, over a million stars were monitored to detect transiting planets. The large volume of data obtained from current and future missions (e.g. Kepler, K2, TESS and LSST) requires automated methods to detect the signature of a planet. Manual interpretation of potential exoplanet candidates is labor intensive and subject to human error, the results of which are difficult to quantify. Here we present a new method of detecting exoplanet candidates in large planetary search projects which, unlike current methods, uses a neural network. Neural networks, also called "deep learning" or "deep nets", are a state-of-the-art machine learning technique designed to give a computer perception into a specific problem by training it to recognize patterns. Unlike past transit detection algorithms, the deep net learns to characterize the data instead of relying on hand-coded metrics that humans perceive as the most representative. Exoplanet transits have different shapes as a result of, e.g., the planetary and stellar atmospheres and the transit geometry. Thus, a simple template does not suffice to capture the subtle details, especially if the signal is below the noise or strong systematics are present. Current false-positive rates from the Kepler data are estimated at around 12.3% for Earth-like planets, and there has been no study of the false negative rates. It is therefore important to ask how the properties of current algorithms affect the results of the Kepler mission and future missions such as TESS, which flies next year. These uncertainties affect the fundamental research derived from missions, such as the discovery of habitable planets, estimates of their occurrence rates, and our understanding of the nature and evolution of planetary systems.

  11. TU-G-204-09: The Effects of Reduced- Dose Lung Cancer Screening CT On Lung Nodule Detection Using a CAD Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, S; Lo, P; Kim, G

    2015-06-15

    Purpose: While Lung Cancer Screening CT is being performed at low doses, the purpose of this study was to investigate the effects of further reducing dose on the performance of a CAD nodule-detection algorithm. Methods: We selected 50 cases from our local database of National Lung Screening Trial (NLST) patients for which we had both the image series and the raw CT data from the original scans. All scans were acquired with fixed mAs (25 for standard-sized patients, 40 for large patients) on a 64-slice scanner (Sensation 64, Siemens Healthcare). All images were reconstructed with 1-mm slice thickness, B50 kernel. 10 of the cases had at least one nodule reported on the NLST reader forms. Based on a previously-published technique, we added noise to the raw data to simulate reduced-dose versions of each case at 50% and 25% of the original NLST dose (i.e. approximately 1.0 and 0.5 mGy CTDIvol). For each case at each dose level, the CAD detection algorithm was run and nodules greater than 4 mm in diameter were reported. These CAD results were compared to “truth”, defined as the approximate nodule centroids from the NLST reports. Subject-level mean sensitivities and false-positive rates were calculated for each dose level. Results: The mean sensitivities of the CAD algorithm were 35% at the original dose, 20% at 50% dose, and 42.5% at 25% dose. The false-positive rates, in decreasing-dose order, were 3.7, 2.9, and 10 per case. In certain cases, particularly in larger patients, there were severe photon-starvation artifacts, especially in the apical region due to the high-attenuating shoulders. Conclusion: The detection task was challenging for the CAD algorithm at all dose levels, including the original NLST dose. However, the false-positive rate at 25% dose approximately tripled, suggesting a loss of CAD robustness somewhere between 0.5 and 1.0 mGy. NCI grant U01 CA181156 (Quantitative Imaging Network); Tobacco Related Disease Research Project grant 22RT-0131.

  12. Density estimation of Yangtze finless porpoises using passive acoustic sensors and automated click train detection.

    PubMed

    Kimura, Satoko; Akamatsu, Tomonari; Li, Songhai; Dong, Shouyue; Dong, Lijun; Wang, Kexiong; Wang, Ding; Arai, Nobuaki

    2010-09-01

    A method is presented to estimate the density of finless porpoises using stationed passive acoustic monitoring. The number of click trains detected by stereo acoustic data loggers (A-tags) was converted to an estimate of the density of porpoises. First, an automated off-line filter was developed to detect click trains among noise, and the detection and false-alarm rates were calculated. Second, a density estimation model was proposed. The cue-production rate was measured by biologging experiments. The probability of detecting a cue and the size of the monitored area were calculated from the source level, beam patterns, and a sound-propagation model. The effect of group size on the cue-detection rate was examined. Third, the proposed model was applied to estimate the density of finless porpoises at four locations from the Yangtze River to the inside of Poyang Lake. The estimated mean daily density of porpoises decreased from the main stream to the lake. Long-term monitoring over 466 days from June 2007 to May 2009 showed that the density varied between 0 and 4.79 porpoises/km²; however, the density was below 1 porpoise/km² during 94% of the period. These results suggest a potential gap and seasonal migration of the population at the bottleneck of Poyang Lake.
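
    A hedged sketch of a generic cue-counting density estimator of the kind described above is given below: detected click trains are corrected for false alarms and converted to animal density using the cue-production rate, the probability of detecting a cue, and the effective monitored area. The exact form of the paper's model may differ, and every number in the example is illustrative.

```python
# Hedged sketch of a generic cue-counting density estimator: detected cues are
# corrected for false alarms and divided by the expected number of cues per animal
# and the effective monitored area. All numbers are illustrative assumptions.
import math

def cue_count_density(n_detected, false_alarm_prop, detection_prob,
                      monitored_radius_km, cue_rate_per_hour, monitoring_hours):
    """Estimated animals per km^2 around one stationed acoustic data logger."""
    effective_area = math.pi * monitored_radius_km ** 2 * detection_prob  # km^2
    true_cues = n_detected * (1.0 - false_alarm_prop)          # remove false alarms
    expected_cues_per_animal = cue_rate_per_hour * monitoring_hours
    return true_cues / (effective_area * expected_cues_per_animal)

# Toy usage: 240 click trains in 24 h, 10% false alarms, 0.4 detection probability
# within a 0.3 km radius, and 30 click trains produced per animal per hour.
print(cue_count_density(240, 0.10, 0.4, 0.3, 30, 24))
```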

  13. Measuring target detection performance in paradigms with high event rates.

    PubMed

    Bendixen, Alexandra; Andersen, Søren K

    2013-05-01

    Combining behavioral and neurophysiological measurements inevitably implies mutual constraints, such as when the neurophysiological measurement requires fast-paced stimulus presentation and hence the attribution of a behavioral response to a particular preceding stimulus becomes ambiguous. We develop and test a method for validly assessing behavioral detection performance in spite of this ambiguity. We examine four approaches taken in the literature to treat such situations. We analytically derive a new variant of computing the classical parameters of signal detection theory, hit and false alarm rates, adapted to fast-paced paradigms. Each of the previous approaches shows specific shortcomings (susceptibility to the choice of response window, biased estimates of behavioral detection performance). Superior performance of our new approach is demonstrated for both simulated and empirical behavioral data. Further evidence is provided by a reliable correspondence between behavioral performance and the N2b component as an electrophysiological indicator of target detection. The appropriateness of our approach is substantiated by both theoretical and empirical arguments. We demonstrate an easy-to-implement solution for measuring target detection performance independent of the rate of event presentation. By overcoming the measurement bias of previous approaches, our method will help to clarify the behavioral relevance of different measures of cortical activation. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
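
    For context, the sketch below implements the classical window-based scoring that the paper refines: a response is counted as a hit if it falls within a fixed window after a target and as a false alarm otherwise. The authors' adapted variant is not reproduced here; the event times and window length are illustrative.

```python
# Hedged sketch of classical window-based scoring of hits and false alarms in a
# fast-paced paradigm. The adapted variant proposed in the paper is not reproduced;
# event times and the response window are illustrative.
import numpy as np

def score_responses(target_times, response_times, window=(0.1, 1.0)):
    target_times = np.asarray(target_times, dtype=float)
    response_times = np.asarray(response_times, dtype=float)
    hit_flags = np.zeros(len(target_times), dtype=bool)
    false_alarms = 0
    for rt in response_times:
        lags = rt - target_times
        in_window = (lags >= window[0]) & (lags <= window[1]) & ~hit_flags
        if in_window.any():
            # Attribute the response to the earliest unmatched target in the window.
            hit_flags[np.argmax(in_window)] = True
        else:
            false_alarms += 1
    hit_rate = hit_flags.mean() if len(target_times) else float("nan")
    return hit_rate, false_alarms

hr, fa = score_responses(target_times=[1.0, 4.0, 7.5], response_times=[1.4, 5.2, 9.0])
print(f"hit rate = {hr:.2f}, false alarms = {fa}")
```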

  14. A Neutron Based Interrogation System For SNM In Cargo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kane, Steven Z.; Koltick, David S.

    A complete system has been simulated using experimentally obtained input parameters for the detection of special nuclear materials (SNM). A variation of the associated particle imaging (API) technique, referred to as reverse associated particle imaging detection (RAPID), has been developed in the context of detecting 5-kg spherical samples of U-235 in cargo. The RAPID technique allows for the interrogation of containers at neutron production rates between approximately 1×10⁸ neutrons/s and 3×10⁸ neutrons/s. The figure of merit for the system is the time to detect the threat material with 95% probability of detection and a 10⁻⁴ false positive rate per interrogated voxel of cargo. Detection times of 5 minutes were found for a maximally loaded cargo container uniformly filled with iron, and as low as 1 second for containers loaded to 1/4 of full capacity with either iron or wood. The worst-case system performance, a 30-minute interrogation time, occurs for a maximally loaded container of wood at 0.4 g/cm³.

  15. Method for oil pipeline leak detection based on distributed fiber optic technology

    NASA Astrophysics Data System (ADS)

    Chen, Huabo; Tu, Yaqing; Luo, Ting

    1998-08-01

    Pipeline leak detection remains a difficult problem. Traditional leak detection methods suffer from problems such as high false alarm or missed detection rates and poor leak-location capability. To address these problems, a method for oil pipeline leak detection based on a distributed optical fiber sensor with a special coating is presented. The fiber's coating interacts with hydrocarbon molecules in the oil, which alters the refractive index of the coating and hence modifies the light-guiding properties of the fiber. The pipeline leak location can then be determined by optical time-domain reflectometry (OTDR). An oil pipeline leak detection system was designed based on this principle. The system offers real-time operation, simultaneous multi-point detection, and high location accuracy. Finally, some factors that may influence detection are analyzed and preliminary improvements are proposed.

  16. Space moving target detection using time domain feature

    NASA Astrophysics Data System (ADS)

    Wang, Min; Chen, Jin-yong; Gao, Feng; Zhao, Jin-yu

    2018-01-01

    Traditional space target detection methods mainly use the spatial characteristics of the star map to detect targets and cannot make full use of time domain information. This paper presents a new space moving target detection method based on time domain features. We first construct the time-spectral data of the star map, then analyze the time domain features of the main objects (targets, stars, and the background) in star maps, and finally detect moving targets using the single-pulse feature of the time domain signal. Experimental results on real star map data show that the proposed method can effectively detect the trajectories of moving targets in a star map sequence, with a detection probability of 99% at a false alarm rate of about 8×10⁻⁵, outperforming the compared algorithms.

  17. A novel ship CFAR detection algorithm based on adaptive parameter enhancement and wake-aided detection in SAR images

    NASA Astrophysics Data System (ADS)

    Meng, Siqi; Ren, Kan; Lu, Dongming; Gu, Guohua; Chen, Qian; Lu, Guojun

    2018-03-01

    Synthetic aperture radar (SAR) is an indispensable and useful method for marine monitoring. With the growing number of SAR sensors, high-resolution images can be acquired that contain more target structure information, such as finer spatial detail. This paper presents a novel adaptive parameter transform (APT) domain constant false alarm rate (CFAR) detector to highlight targets. The whole method operates on the APT domain values. Firstly, the image is mapped into the new transform domain. Secondly, false candidate target pixels are screened out by the CFAR detector to highlight the target ships. Thirdly, the ship pixels are replaced by homogeneous sea pixels, and the enhanced image is processed by the Niblack algorithm to obtain a binary wake image. Finally, the normalized Hough transform (NHT) is used to detect wakes in the binary image as verification of the presence of ships. Experiments on real SAR images validate that the proposed transform enhances the target structure and improves the contrast of the image. The algorithm performs well in both ship and ship wake detection.
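
    The CFAR screening step can be illustrated with a generic cell-averaging CFAR on a 2-D intensity image, as sketched below: each pixel is compared against a threshold scaled from the local background estimated in a training ring outside a guard band. The adaptive parameter transform of the paper is not reproduced, and the window sizes and scale factor are illustrative.

```python
# Hedged sketch of a generic cell-averaging CFAR detector on a 2-D intensity image.
# This is not the paper's APT-domain detector; window sizes and the scale factor
# (which sets the nominal false alarm rate) are illustrative.
import numpy as np

def ca_cfar_2d(img, guard=2, train=6, scale=8.0):
    """Return a boolean detection mask the same shape as img."""
    img = np.asarray(img, dtype=float)
    pad = guard + train
    padded = np.pad(img, pad, mode="reflect")
    mask = np.zeros_like(img, dtype=bool)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = padded[i:i + 2 * pad + 1, j:j + 2 * pad + 1].copy()
            # Exclude the cell under test and its guard band from the background estimate.
            window[train:train + 2 * guard + 1, train:train + 2 * guard + 1] = np.nan
            background = np.nanmean(window)
            mask[i, j] = img[i, j] > scale * background
    return mask

# Toy usage: a bright 2x2 "ship" on exponentially distributed sea clutter.
rng = np.random.default_rng(0)
sea = rng.exponential(1.0, size=(64, 64))
sea[30:32, 30:32] += 25.0
print(ca_cfar_2d(sea).sum(), "pixels flagged")
```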

  18. New reporting procedures based on long-term method detection levels and some considerations for interpretations of water-quality data provided by the U.S. Geological Survey National Water Quality Laboratory

    USGS Publications Warehouse

    Childress, Carolyn J. Oblinger; Foreman, William T.; Connor, Brooke F.; Maloney, Thomas J.

    1999-01-01

    This report describes the U.S. Geological Survey National Water Quality Laboratory's approach for determining long-term method detection levels and establishing reporting levels, details relevant new reporting conventions, and provides preliminary guidance on interpreting data reported with the new conventions. At the long-term method detection level concentration, the risk of a false positive detection (analyte reported present at the long-term method detection level when not in sample) is no more than 1 percent. However, at the long-term method detection level, the risk of a false negative occurrence (analyte reported not present when present at the long-term method detection level concentration) is up to 50 percent. Because this false negative rate is too high for use as a default 'less than' reporting level, a more reliable laboratory reporting level is set at twice the determined long-term method detection level. For all methods, concentrations measured between the laboratory reporting level and the long-term method detection level will be reported as estimated concentrations. Non-detections will be censored to the laboratory reporting level. Adoption of the new reporting conventions requires a full understanding of how low-concentration data can be used and interpreted and places responsibility for using and presenting final data with the user rather than with the laboratory. Users must consider that (1) new laboratory reporting levels may differ from previously established minimum reporting levels, (2) long-term method detection levels and laboratory reporting levels may change over time, and (3) estimated concentrations are less certain than concentrations reported above the laboratory reporting level. The availability of uncensored but qualified low-concentration data for interpretation and statistical analysis is a substantial benefit to the user. A decision to censor data after they are reported from the laboratory may still be made by the user, if merited, on the basis of the intended use of the data.
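
    A simplified sketch of the reporting convention described above is given below: the laboratory reporting level (LRL) is set at twice the long-term method detection level (LT-MDL), concentrations between the LT-MDL and the LRL are reported as estimated, and non-detections are censored to the LRL. Treating a measurement below the LT-MDL as a non-detection is an assumption of this sketch, and the numeric values are illustrative.

```python
# Simplified sketch of the reporting convention described above. Treating values
# below the LT-MDL as non-detections is an assumption of this sketch; the example
# concentrations and LT-MDL are illustrative.
def report_concentration(measured, lt_mdl):
    lrl = 2.0 * lt_mdl
    if measured is None or measured < lt_mdl:
        return f"< {lrl:g}"          # non-detection, censored to the LRL
    if measured < lrl:
        return f"E {measured:g}"     # estimated concentration
    return f"{measured:g}"           # quantified above the LRL

for value in (None, 0.3, 0.7, 2.5):  # concentrations in, e.g., micrograms per liter
    print(report_concentration(value, lt_mdl=0.5))
```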

  19. Design of oil pipeline leak detection and communication system based on optical fiber technology

    NASA Astrophysics Data System (ADS)

    Tu, Yaqing; Chen, Huabo

    1999-08-01

    The integrity of oil pipelines is always a major concern of operators. Pipeline leaks not only lead to loss of oil but also pollute the environment. A new pipeline leak detection and communication system based on optical fiber technology is presented to ensure pipeline reliability. By combining a direct leak detection method with an indirect one, the system greatly reduces the false alarm rate. According to the practical features of oil pipelines, the pipeline communication system is designed using state-of-the-art optical fiber communication technology. The system offers high leak-location accuracy and good real-time performance, effectively overcoming the disadvantages of traditional leak detection methods and communication systems.

  20. Fast iterative censoring CFAR algorithm for ship detection from SAR images

    NASA Astrophysics Data System (ADS)

    Gu, Dandan; Yue, Hui; Zhang, Yuan; Gao, Pengcheng

    2017-11-01

    Ship detection is one of the essential techniques for ship recognition from synthetic aperture radar (SAR) images. This paper presents a fast iterative detection procedure to eliminate the influence of target returns on the estimation of local sea clutter distributions for constant false alarm rate (CFAR) detectors. A fast block detector is first employed to extract potential target sub-images; then, an iterative censoring CFAR algorithm is used to detect ship candidates from each target block adaptively and efficiently. Parallel detection is possible, and the statistical parameters of a G0 distribution that fits the local sea clutter well can be estimated quickly using an integral image operator. Experimental results on TerraSAR-X images demonstrate the effectiveness of the proposed technique.
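
    The integral-image operator mentioned above is what makes repeated re-estimation of local clutter statistics fast: once a cumulative-sum image is built, the sum (and hence the mean) of any rectangular window costs four array look-ups. A minimal sketch is shown below; the window coordinates are illustrative and the G0 distribution fitting of the paper is not reproduced.

```python
# Hedged sketch of local-mean estimation with an integral image. The G0 fitting
# and censoring logic of the paper are not reproduced; coordinates are illustrative.
import numpy as np

def integral_image(img):
    # Pad with a leading row/column of zeros so window sums need no edge cases.
    return np.pad(np.cumsum(np.cumsum(img, axis=0), axis=1), ((1, 0), (1, 0)))

def window_mean(ii, r0, c0, r1, c1):
    """Mean of img[r0:r1, c0:c1] computed from the integral image ii."""
    total = ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]
    return total / ((r1 - r0) * (c1 - c0))

rng = np.random.default_rng(0)
img = rng.gamma(shape=2.0, scale=1.0, size=(512, 512))
ii = integral_image(img)
print(np.isclose(window_mean(ii, 100, 200, 164, 264), img[100:164, 200:264].mean()))
```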

  1. Sequential feature selection for detecting buried objects using forward looking ground penetrating radar

    NASA Astrophysics Data System (ADS)

    Shaw, Darren; Stone, Kevin; Ho, K. C.; Keller, James M.; Luke, Robert H.; Burns, Brian P.

    2016-05-01

    Forward looking ground penetrating radar (FLGPR) has the benefit of detecting objects at a significant standoff distance. The FLGPR signal is radiated over a large surface area and the radar signal return is often weak. Improving detection, especially for targets buried in roads, while maintaining an acceptable false alarm rate remains a challenging task. Various kinds of features have been developed over the years to increase FLGPR detection performance. This paper focuses on investigating the use of as many features as possible for detecting buried targets and uses the sequential feature selection technique to automatically choose the features that contribute most to improving performance. Experimental results using data collected at a government test site are presented.
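
    A minimal sketch of sequential forward feature selection of the kind described above is given below: starting from an empty set, the feature whose addition most improves a cross-validated score is added at each step. The classifier, the scoring metric, and the synthetic data are placeholder assumptions, not the FLGPR features or models used in the paper.

```python
# Hedged sketch of sequential forward feature selection. The classifier, scoring
# metric, and synthetic data are placeholders, not the paper's FLGPR features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def sequential_forward_selection(X, y, n_select):
    remaining = list(range(X.shape[1]))
    selected = []
    for _ in range(n_select):
        scores = []
        for f in remaining:
            cols = selected + [f]
            score = cross_val_score(LogisticRegression(max_iter=1000),
                                    X[:, cols], y, cv=5).mean()
            scores.append((score, f))
        best_score, best_f = max(scores)   # greedily keep the best-scoring addition
        selected.append(best_f)
        remaining.remove(best_f)
        print(f"added feature {best_f}, CV accuracy {best_score:.3f}")
    return selected

# Toy data: 200 samples, 10 candidate features, only the first three informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + 0.8 * X[:, 1] - 0.6 * X[:, 2] + rng.normal(scale=0.5, size=200) > 0).astype(int)
print(sequential_forward_selection(X, y, n_select=3))
```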

  2. Sensitivity and specificity of intrathecal fluorescein and white light excitation for detecting intraoperative cerebrospinal fluid leak in endoscopic skull base surgery: a prospective study.

    PubMed

    Raza, Shaan M; Banu, Matei A; Donaldson, Angela; Patel, Kunal S; Anand, Vijay K; Schwartz, Theodore H

    2016-03-01

    The intraoperative detection of CSF leaks during endonasal endoscopic skull base surgery is critical to preventing postoperative CSF leaks. Intrathecal fluorescein (ITF) has been used at varying doses to aid in the detection of intraoperative CSF leaks. However, the sensitivity and specificity of ITF at certain dosages are unknown. A prospective database of all endoscopic endonasal procedures was reviewed. All patients received 25 mg ITF diluted in 10 ml CSF and were pretreated with dexamethasone and Benadryl. Immediately after surgery, the operating surgeon prospectively noted whether there was an intraoperative CSF leak and whether fluorescein was identified. The sensitivity, specificity, and positive and negative predictive values of ITF for detecting intraoperative CSF leak were calculated, and factors correlating with postoperative CSF leak were determined. Of 419 patients, 35.8% did not show a CSF leak. Fluorescein-tinted CSF (true positive) was noted in 59.7% of patients and no false positives were encountered. CSF without fluorescein staining (false negative) was noted in 4.5% of patients. The sensitivity and specificity of ITF were 92.9% and 100%, respectively. The negative and positive predictive values were 88.8% and 100%, respectively. Postoperative CSF leaks occurred only in true positives, at a rate of 2.8%. ITF is extremely specific and very sensitive for detecting intraoperative CSF leaks. Although false negatives can occur, these patients do not appear to be at risk for postoperative CSF leak. The use of ITF may help surgeons prevent postoperative CSF leaks by intraoperatively detecting and confirming a watertight repair.

  3. A novel liquid-phase piezoelectric immunosensor for detecting Schistosoma japonicum circulating antigen.

    PubMed

    Wen, Zhili; Wang, Shiping; Wu, Zhaoyang; Shen, Guoli

    2011-09-01

    A new liquid-phase piezoelectric immunosensor (LP-PEIS) that can quantitatively detect Schistosoma japonicum (Sj) circulating antigens (SjCAg) was developed. IgG antibodies were purified from the sera of rabbits that had been infected with or immunized against Sj and were immobilized on the surface of the piezoelectric quartz crystal of the LP-PEIS via staphylococcal protein A (SPA). The sensor was first used to detect SjCAg in sera of Sj-infected rabbits in order to establish the optimum detection conditions. Finally, the LP-PEIS, under the optimum conditions, was used to detect SjCAg in sera of patients infected with Sj and was compared with sandwich ELISA. The optimum conditions of the LP-PEIS for detecting SjCAg were thus established. In the detection of sera from patients with acute schistosomiasis, the LP-PEIS had a higher positive rate (100%) and a lower false positive rate (3.0%) than sandwich ELISA (92.8%, 6.0%); however, the difference was not statistically significant. The LP-PEIS can quantitatively detect SjCAg in patients' sera as well as sandwich ELISA can. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  4. Comprehensive Detection of Gas Plumes from Multibeam Water Column Images with Minimisation of Noise Interferences

    PubMed Central

    Zhao, Jianhu; Zhang, Hongmei; Wang, Shiqi

    2017-01-01

    Multibeam echosounder systems (MBES) can record the backscatter strengths of gas plumes in water column (WC) images, which may indicate the possible occurrence of gas at certain depths. Manual or automatic detection is generally adopted to find gas plumes, but frequently results in low efficiency and high false detection rates because the WC images are polluted by noise. To improve the efficiency and reliability of the detection, a comprehensive detection method is proposed in this paper. In the proposed method, the characteristics of the WC background noise are first analyzed and described. Then, mean-standard deviation threshold segmentation is applied to the time-angle and depth-angle images for denoising, an intersection operation is performed on the two segmented images to further weaken noise in the WC data, and the gas plumes are detected from the intersection image using a morphological constraint. The proposed method was tested in shallow-water and deepwater experiments. In these experiments, the detection was performed automatically and achieved higher correct detection rates than traditional methods. The performance of the proposed method is analyzed and discussed. PMID:29186014

  5. Characterization of difference of Gaussian filters in the detection of mammographic regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Catarious, David M. Jr.; Baydush, Alan H.; Floyd, Carey E. Jr.

    2006-11-15

    In this article, we present a characterization of the effect of difference of Gaussians (DoG) filters in the detection of mammographic regions. DoG filters have been used previously in mammographic mass computer-aided detection (CAD) systems. As DoG filters are constructed from the subtraction of two bivariate Gaussian distributions, they require the specification of three parameters: the size of the filter template and the standard deviations of the constituent Gaussians. The influence of these three parameters in the detection of mammographic masses has not been characterized. In this work, we aim to determine how the parameters affect (1) the physical descriptors of the detected regions, (2) the true and false positive rates, and (3) the classification performance of the individual descriptors. To this end, 30 DoG filters are created from the combination of three template sizes and four values for each of the Gaussians' standard deviations. The filters are used to detect regions in a study database of 181 craniocaudal-view mammograms extracted from the Digital Database for Screening Mammography. To describe the physical characteristics of the identified regions, morphological and textural features are extracted from each of the detected regions. Differences in the mean values of the features caused by altering the DoG parameters are examined through statistical and empirical comparisons. The parameters' effects on the true and false positive rates are determined by examining the mean malignant sensitivities and false positives per image (FPpI). Finally, the effect on the classification performance is described by examining the variation in FPpI at the point where 81% of the malignant masses in the study database are detected. Overall, the findings of the study indicate that increasing the standard deviations of the Gaussians used to construct a DoG filter results in a dramatic decrease in the number of regions identified at the expense of missing a small number of malignancies. The sharp reduction in the number of identified regions allowed the identification of textural differences between large and small mammographic regions. We find that the classification performances of the features that achieve the lowest average FPpI are influenced by all three of the parameters.
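
    The construction of a DoG template from its three parameters can be sketched as below: two bivariate Gaussians of different widths are evaluated on the same grid and subtracted. The parameter values are illustrative and are not among the 30 combinations evaluated in the study.

```python
# Hedged sketch of building a difference-of-Gaussians (DoG) template from its three
# parameters: template size and the two Gaussian standard deviations. The values
# used here are illustrative, not those evaluated in the study.
import numpy as np

def dog_filter(template_size, sigma_narrow, sigma_wide):
    """Return a (template_size x template_size) DoG template."""
    half = (template_size - 1) / 2.0
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    r2 = x ** 2 + y ** 2
    g_narrow = np.exp(-r2 / (2 * sigma_narrow ** 2)) / (2 * np.pi * sigma_narrow ** 2)
    g_wide = np.exp(-r2 / (2 * sigma_wide ** 2)) / (2 * np.pi * sigma_wide ** 2)
    return g_narrow - g_wide   # positive centre surrounded by a negative annulus

# Candidate regions could then be obtained by convolving an image with the template
# and thresholding the response (e.g. with scipy.ndimage.convolve).
kernel = dog_filter(template_size=21, sigma_narrow=3.0, sigma_wide=6.0)
print(kernel.shape, float(kernel.sum()))
```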

  6. Effects of Demographic History on the Detection of Recombination Hotspots from Linkage Disequilibrium.

    PubMed

    Dapper, Amy L; Payseur, Bret A

    2018-02-01

    In some species, meiotic recombination is concentrated in small genomic regions. These "recombination hotspots" leave signatures in fine-scale patterns of linkage disequilibrium, raising the prospect that the genomic landscape of hotspots can be characterized from sequence variation. This approach has led to the inference that hotspots evolve rapidly in some species, but are conserved in others. Historic demographic events, such as population bottlenecks, are known to affect patterns of linkage disequilibrium across the genome, violating population genetic assumptions of this approach. Although such events are prevalent, demographic history is generally ignored when making inferences about the evolution of recombination hotspots. To determine the effect of demography on the detection of recombination hotspots, we use the coalescent to simulate haplotypes with a known recombination landscape. We measure the ability of popular linkage disequilibrium-based programs to detect hotspots across a range of demographic histories, including population bottlenecks, hidden population structure, population expansions, and population contractions. We find that demographic events have the potential to greatly reduce the power and increase the false positive rate of hotspot discovery. Neither the power nor the false positive rate of hotspot detection can be predicted without also knowing the demographic history of the sample. Our results suggest that ignoring demographic history likely overestimates the power to detect hotspots and therefore underestimates the degree of hotspot sharing between species. We suggest strategies for incorporating demographic history into population genetic inferences about recombination hotspots. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  7. Fast Vessel Detection in Gaofen-3 SAR Images with Ultrafine Strip-Map Mode

    PubMed Central

    Liu, Lei; Qiu, Xiaolan; Lei, Bin

    2017-01-01

    This study aims to detect vessels with lengths ranging from about 70 to 300 m in Gaofen-3 (GF-3) SAR images with ultrafine strip-map (UFS) mode as fast as possible. Based on an analysis of the characteristics of vessels in GF-3 SAR imagery, an effective vessel detection method is proposed in this paper. Firstly, an iterative constant false alarm rate (CFAR) method is employed to detect potential ship pixels. Secondly, a mean-shift operation is applied to each potential ship pixel to identify the candidate target region. During the mean-shift process, we maintain a selection matrix recording which pixels can be taken; these pixels are called the valid points of the candidate target. The l1-norm regression is used to extract the principal axis and detect the valid points. Finally, two kinds of false alarms, bright lines and azimuth ambiguities, are removed by comparing the valid area of the candidate target with a pre-defined value and by computing the displacement between the true target and the corresponding replicas, respectively. Experimental results on three GF-3 SAR images with UFS mode demonstrate the effectiveness and efficiency of the proposed method. PMID:28678197

  8. Automated asteroseismic peak detections

    NASA Astrophysics Data System (ADS)

    García Saravia Ortiz de Montellano, Andrés; Hekker, S.; Themeßl, N.

    2018-05-01

    Space observatories such as Kepler have provided data that can potentially revolutionize our understanding of stars. Through detailed asteroseismic analyses we are capable of determining fundamental stellar parameters and revealing the stellar internal structure with unprecedented accuracy. However, such detailed analyses, known as peak bagging, have so far been obtained for only a small percentage of the observed stars, while most of the scientific potential of the available data remains unexplored. One of the major challenges in peak bagging is identifying how many solar-like oscillation modes are visible in a power density spectrum. Identification of oscillation modes is usually done by visual inspection, which is time-consuming and has a degree of subjectivity. Here, we present a peak-detection algorithm especially suited for the detection of solar-like oscillations. It reliably characterizes the solar-like oscillations in a power density spectrum and estimates their parameters without human intervention. Furthermore, we provide a metric characterizing the false positive and false negative rates, giving further information about the reliability of a detected oscillation mode or the significance of a lack of detected oscillation modes. The algorithm presented here opens the possibility for detailed and automated peak bagging of the thousands of solar-like oscillators observed by Kepler.

  9. Impaired Context Processing is Attributable to Global Neuropsychological Impairment in Schizophrenia and Psychotic Bipolar Disorder

    PubMed Central

    Hill, S. Kristian; Gold, James M.; Keefe, Richard S. E.; Clementz, Brett A.; Gershon, Elliot; Keshavan, Matcheri S.; Pearlson, Godfrey; Tamminga, Carol A.; Sweeney, John A.

    2017-01-01

    Background: Context processing may reflect a specific cognitive impairment in schizophrenia. Whether impaired context processing is observed across psychotic disorders or among relatives of affected individuals, and whether it is a deficit that is independent from the generalized neuropsychological deficits seen in psychotic disorders, are less established. Methods: Schizophrenia, schizoaffective, and psychotic bipolar probands (n = 660), their first-degree relatives (n = 741), and healthy individuals (n = 308) studied by the Bipolar-Schizophrenia Network on Intermediate Phenotypes consortium performed an expectancy task requiring use of contextual information to overcome a pre-potent response. Sensitivity for target detection and false alarm rates on trials requiring inhibition or goal maintenance were measured. Results: Proband groups and relatives with psychosis spectrum personality traits demonstrated reduced target sensitivity and elevated false alarm rates. False alarm rate was higher under inhibition vs goal maintenance conditions although this difference was attenuated in schizophrenia and schizoaffective proband groups. After accounting for global neuropsychological impairment, as reflected by the composite score from the Brief Assessment of Cognition in Schizophrenia neuropsychological battery, deficits in schizophrenia and bipolar proband groups were no longer significant. Performance measures were moderately familial. Conclusion: Reduced target detection, but not a specific deficit in context processing, is observed across psychotic disorders. Impairments in both goal maintenance and response inhibition appear to contribute comparably to deficits in schizophrenia and schizoaffective disorder, whereas greater difficulty with response inhibition underlies deficits in bipolar disorder. Yet, these deficits are not independent from the generalized neurocognitive impairment observed in schizophrenia and psychotic bipolar disorder. PMID:27306316

  10. Hidden Markov models and neural networks for fault detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic

    1994-01-01

    Neural networks combined with hidden Markov models (HMMs) can provide excellent detection and false alarm rate performance in fault detection applications, as shown in this viewgraph presentation. Modified models allow for novelty detection. Key contributions of neural network models are: (1) excellent nonparametric discrimination capability; (2) good estimation of posterior state probabilities, even in high dimensions, which allows them to be embedded within an overall probabilistic model (HMM); and (3) simplicity of implementation compared to other nonparametric models. The neural network/HMM monitoring model is currently being integrated with the new Deep Space Network (DSN) antenna controller software and will be monitoring a new DSN 34-m antenna (DSS-24) on-line by July 1994.

  11. Cluster signal-to-noise analysis for evaluation of the information content in an image.

    PubMed

    Weerawanich, Warangkana; Shimizu, Mayumi; Takeshita, Yohei; Okamura, Kazutoshi; Yoshida, Shoko; Yoshiura, Kazunori

    2018-01-01

    (1) To develop an observer-free method of analysing image quality related to observer performance in a detection task and (2) to analyse observer behaviour patterns in the detection of small mass changes in cone-beam CT images. 13 observers detected holes in a Teflon phantom in cone-beam CT images. Using the same images, we developed a new method, cluster signal-to-noise analysis, to detect the holes by applying various cut-off values using ImageJ and reconstructing cluster signal-to-noise curves. We then evaluated the correlation between cluster signal-to-noise analysis and the observer performance test. We measured the background noise in each image to evaluate the relationship with the false positive rates (FPRs) of the observers. Correlations between mean FPRs and intra- and interobserver variations were also evaluated. Moreover, we calculated true positive rates (TPRs) and accuracies from the background noise and evaluated their correlations with the TPRs of the observers. Cluster signal-to-noise curves were derived in cluster signal-to-noise analysis; they express the detection of signals (true holes) relative to noise (false holes). This method correlated highly with the observer performance test (R² = 0.9296). In noisy images, increasing background noise resulted in higher FPRs and larger intra- and interobserver variations. TPRs and accuracies calculated from background noise had high correlations with the actual TPRs from observers; R² was 0.9244 and 0.9338, respectively. Cluster signal-to-noise analysis can simulate the detection performance of observers and thus replace the observer performance test in the evaluation of image quality. Erroneous decision-making increased with increasing background noise.

  12. Indirect Estimation of Radioactivity in Containerized Cargo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarman, Kenneth D.; Scherrer, Chad; Smith, Eric L.

    Detecting illicit nuclear and radiological material in containerized cargo challenges the state of the art in detection systems. Current systems are being evaluated and new systems envisioned to address the need for a high probability of detection, to thwart potential threats, together with extremely low nuisance and false alarm rates, to maintain the flow of commerce given the enormous volume of commodities imported in shipping containers. Maintaining the flow of commerce also means that primary inspection must be rapid, requiring relatively indirect measurements of cargo from outside the containers. With increasing information content in such indirect measurements, it is natural to ask how the information might be combined to improve detection. Toward this end, we present an approach to estimating isotopic activity of naturally occurring radioactive material in cargo grouped by commodity type, combining container manifest data with radiography and gamma spectroscopy aligned to location along the container. The heart of this approach is our statistical model of gamma counts within peak regions of interest, which captures the effects of background suppression, counting noise, convolution of neighboring cargo contributions, and down-scattered photons to provide physically constrained estimates of counts due to decay of specific radioisotopes in cargo alone. Coupled to that model, we use a mechanistic model of self-attenuated radiation flux to estimate the isotopic activity within cargo, segmented by location within each container, that produces those counts. We demonstrate our approach by applying it to a set of measurements taken at the Port of Seattle in 2006. This approach to synthesizing disparate available data streams and extraction of cargo characteristics holds the potential to improve primary inspection using current detection capabilities and to enable simulation-based evaluation of new candidate detection systems.

  13. Comprehensive Biothreat Cluster Identification by PCR/Electrospray-Ionization Mass Spectrometry

    PubMed Central

    Sampath, Rangarajan; Mulholland, Niveen; Blyn, Lawrence B.; Massire, Christian; Whitehouse, Chris A.; Waybright, Nicole; Harter, Courtney; Bogan, Joseph; Miranda, Mary Sue; Smith, David; Baldwin, Carson; Wolcott, Mark; Norwood, David; Kreft, Rachael; Frinder, Mark; Lovari, Robert; Yasuda, Irene; Matthews, Heather; Toleno, Donna; Housley, Roberta; Duncan, David; Li, Feng; Warren, Robin; Eshoo, Mark W.; Hall, Thomas A.; Hofstadler, Steven A.; Ecker, David J.

    2012-01-01

    Technology for comprehensive identification of biothreats in environmental and clinical specimens is needed to protect citizens in the case of a biological attack. This is a challenge because there are dozens of bacterial and viral species that might be used in a biological attack and many have closely related near-neighbor organisms that are harmless. The biothreat agent, along with its near neighbors, can be thought of as a biothreat cluster or a biocluster for short. The ability to comprehensively detect the important biothreat clusters with resolution sufficient to distinguish the near neighbors with an extremely low false positive rate is required. A technological solution to this problem can be achieved by coupling biothreat group-specific PCR with electrospray ionization mass spectrometry (PCR/ESI-MS). The biothreat assay described here detects ten bacterial and four viral biothreat clusters on the NIAID priority pathogen and HHS/USDA select agent lists. Detection of each of the biothreat clusters was validated by analysis of a broad collection of biothreat organisms and near neighbors prepared by spiking biothreat nucleic acids into nucleic acids extracted from filtered environmental air. Analytical experiments were carried out to determine breadth of coverage, limits of detection, linearity, sensitivity, and specificity. Further, the assay breadth was demonstrated by testing a diverse collection of organisms from each biothreat cluster. The biothreat assay as configured was able to detect all the target organism clusters and did not misidentify any of the near-neighbor organisms as threats. Coupling biothreat cluster-specific PCR to electrospray ionization mass spectrometry simultaneously provides the breadth of coverage, discrimination of near neighbors, and an extremely low false positive rate due to the requirement that an amplicon with a precise base composition of a biothreat agent be detected by mass spectrometry. PMID:22768032

  14. Should all women with cervical atypia be referred for colposcopy: a HARNET study. Harrisburgh Area Research Network.

    PubMed

    Slawson, D C; Bennett, J H; Simon, L J; Herman, J M

    1994-04-01

    Clinicians who manage women with Papanicolaou (Pap) smears showing atypical squamous cells of undetermined significance (ASCUS) may miss clinically significant cervical disease by repeating the cytology alone. We evaluated the ability of the human papillomavirus (HPV) screen and the naked-eye examination after a cervical acetic acid wash to enhance the follow-up Pap smear in predicting an abnormal colposcopic biopsy. Pap smears were performed on all women (N = 7458) attending six family practice offices for a health maintenance examination from August 1989 through February 1991. Consenting subjects with ASCUS underwent repeat cytological testing, an HPV screen, and a cervical acetic acid wash examination immediately before colposcopy after a 4- to 6-month waiting period. Of the 122 consenting women identified with ASCUS, 67 (55%) demonstrated abnormalities on biopsy, including 26 with condyloma, 26 with cervical intraepithelial neoplasia I (CIN I), and 15 with CIN II to III. The false-negative rate, 58%, of the follow-up Pap smear alone for detecting these cases of condyloma and CIN was significantly decreased (false-negative rate, 27%) with the use of the cervical acetic acid wash as an adjunctive test. There was no additional reduction in the false-negative rate with the use of the HPV screen. Of the 15 subjects with high-grade cervical lesions (CIN II to III), 14 had either an abnormal follow-up Pap smear or an abnormal cervical acetic acid wash examination. Among women with cervical atypia, a single follow-up Pap smear alone failed to detect one third of the cases of high-grade disease. Ninety-three percent of these cases were detected, however, with a follow-up Pap smear and an acetic acid wash. Our one subject with a high-grade lesion missed with this combination of tests had an unsatisfactory Pap smear. Use of both tests together may reliably guide clinical decisions regarding the management of cervical atypia.

  15. A fourth order PDE based fuzzy c- means approach for segmentation of microscopic biopsy images in presence of Poisson noise for cancer detection.

    PubMed

    Kumar, Rajesh; Srivastava, Subodh; Srivastava, Rajeev

    2017-07-01

    For cancer detection from microscopic biopsy images, the image segmentation step used to segment cells and nuclei plays an important role, and the accuracy of the segmentation approach dominates the final results. Microscopic biopsy images also have intrinsic Poisson noise, and if it is present the segmentation results may not be accurate. The objective is to propose an efficient fuzzy c-means based segmentation approach which can also handle the noise present in the image during the segmentation process itself, i.e. noise removal and segmentation are combined in one step. To address these issues, this paper proposes a fourth order partial differential equation (FPDE) based nonlinear filter adapted to Poisson noise, combined with a fuzzy c-means segmentation method. This approach effectively handles the segmentation problem of blocky artifacts while achieving a good tradeoff between Poisson noise removal and edge preservation of the microscopic biopsy images during the segmentation process for cancer detection from cells. The proposed approach is tested on a breast cancer microscopic biopsy data set with region of interest (ROI) segmented ground truth images. The data set contains 31 benign and 27 malignant images of size 896 × 768; ROI ground truth is available for all 58 images. Finally, the results obtained from the proposed approach are compared with the results of popular segmentation algorithms: fuzzy c-means, color k-means, texture based segmentation, and total variation fuzzy c-means approaches. The experimental results show that the proposed approach provides better results in terms of various performance measures, such as Jaccard coefficient, dice index, Tanimoto coefficient, area under curve, accuracy, true positive rate, true negative rate, false positive rate, false negative rate, random index, global consistency error, and variance of information, compared to the other segmentation approaches used for cancer detection. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. [Importance of AFPplus-screening in daily practice. Evaluation of 2641 analyses].

    PubMed

    Longoni, S; Ackermann, M; Viollier, E H

    1997-04-30

    Since 1991 the measurement of 3 biochemical parameters for the risk assessment of trisomy 21 pregnancy (AFPplus, Alpha-check) has also been offered by private laboratories in Switzerland. This work aimed to prove the reliability of this risk analysis in the hands of practitioners and private laboratories. The investigated cohort included 2,641 pregnant women. An anonymous questionnaire on the outcome of pregnancy furnished information in 97% of cases (2,561 cases). At a trisomy 21 risk cutoff of 1 in 380, 9% of the pregnant women had conspicuous AFPplus results (trisomy 21 risk > 1:380; age distribution at term: 17 to 45 years; median 29.4 years). Among the conspicuous cases were 8 with trisomy 21, 1 with trisomy 18, 5 spontaneous abortions and 3 special cases. Seven of a total of 9 trisomies were detected by the screening. From these data the following diagnostic parameters for AFPplus were calculated: specificity 91.0%, sensitivity 77.8%, positive predictive value 99.9%, false positive rate 9.0%, false negative rate 22.2%. The use of AFPplus by a practitioner cooperating with a private laboratory thus reveals a detection rate comparable to that known from larger centers. AFPplus allows the statistical, age-related risk of a pregnant woman before age 35 to be individualized. Above age 35, AFPplus may support the indication for cases suitable for cytogenetic investigation. Thorough personal information of the patient before the 16th week of pregnancy, by the practitioner or through genetic counseling, has without any doubt first priority.
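
    The reported detection-side figures follow directly from the counts quoted above (7 of 9 trisomies flagged by the screen). A small worked check of that arithmetic:

      def sensitivity_fnr(tp, fn):
          """Sensitivity (detection rate) and false negative rate from a 2x2 table's positive row."""
          sens = tp / (tp + fn)
          return sens, 1.0 - sens

      # 7 of 9 trisomies flagged by the screen:
      sens, fnr = sensitivity_fnr(tp=7, fn=2)
      print(round(sens, 3), round(fnr, 3))   # 0.778 0.222 -> 77.8% / 22.2%, as reported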

  17. Monthly and seasonally verification of precipitation in Poland

    NASA Astrophysics Data System (ADS)

    Starosta, K.; Linkowska, J.

    2009-04-01

    The national meteorological service of Poland, the Institute of Meteorology and Water Management (IMWM), joined COSMO (the Consortium for Small Scale Modelling) in July 2004. In Poland, the COSMO_PL model version 3.5 ran until June 2007; since July 2007, model version 4.0 has been running. The model runs in an operational mode at 14-km grid spacing, twice a day (00 UTC, 12 UTC). For scientific research, a model with 7-km grid spacing is also run. Monthly and seasonal verification of the 24-hour (06 UTC - 06 UTC) accumulated precipitation is presented in this paper. The precipitation field of COSMO_LM was verified against a rain gauge network (308 points). The verification was made for every month and all seasons from December 2007 to December 2008, for three forecast days and for selected thresholds: 0.5, 1, 2.5, 5, 10, 20, 25 and 30 mm. The following indices from the contingency table were calculated: FBI (frequency bias), POD (probability of detection), PON (probability of detection of non-events), FAR (false alarm rate), TSS (true skill statistic), HSS (Heidke skill score) and ETS (equitable threat score). Percentile ranks and the ROC (relative operating characteristic) are also presented. The ROC is a graph of the hit rate (Y-axis) against the false alarm rate (X-axis) for different decision thresholds.
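
    For reference, these contingency-table indices have standard definitions; the sketch below computes them from hits (a), false alarms (b), misses (c) and correct non-events (d). Both the false alarm ratio b/(a+b) and the false alarm rate b/(b+d) are returned, since the abbreviation FAR is used for either in the verification literature:

      def verification_indices(a, b, c, d):
          """a = hits, b = false alarms, c = misses, d = correct non-events."""
          n = a + b + c + d
          pod  = a / (a + c)                    # probability of detection (hit rate)
          pon  = d / (b + d)                    # probability of detection of non-events
          pofd = b / (b + d)                    # false alarm rate (X-axis of the ROC)
          far  = b / (a + b)                    # false alarm ratio
          fbi  = (a + b) / (a + c)              # frequency bias
          tss  = pod - pofd                     # true skill statistic (Hanssen-Kuipers)
          hss  = 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
          a_r  = (a + b) * (a + c) / n          # hits expected by chance
          ets  = (a - a_r) / (a + b + c - a_r)  # equitable threat score
          return dict(FBI=fbi, POD=pod, PON=pon, POFD=pofd, FAR=far,
                      TSS=tss, HSS=hss, ETS=ets)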

  19. ADEPT, a dynamic next generation sequencing data error-detection program with trimming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Shihai; Lo, Chien-Chi; Li, Po-E

    Illumina is the most widely used next generation sequencing technology and produces millions of short reads that contain errors. These sequencing errors constitute a major problem in applications such as de novo genome assembly, metagenomics analysis and single nucleotide polymorphism discovery. In this study, we present ADEPT, a dynamic error detection method based on the quality scores of each nucleotide and its neighboring nucleotides, together with their positions within the read, which are compared to the position-specific quality score distribution of all bases within the sequencing run. This method greatly improves upon other available methods in terms of the true positive rate of error discovery without affecting the false positive rate, particularly within the middle of reads. We conclude that ADEPT is the only tool to date that dynamically assesses errors within reads by comparing position-specific and neighboring base quality scores with the distribution of quality scores for the dataset being analyzed. The result is a method that is less prone to position-dependent under-prediction, which is one of the most prominent issues in error prediction. The outcome is that ADEPT improves upon prior efforts in identifying true errors, primarily within the middle of reads, while reducing the false positive rate.
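
    As a toy illustration of the general idea of position-aware quality screening (not the ADEPT algorithm itself, whose exact statistic is described in the paper), a base can be flagged when its quality score falls well below the distribution observed at the same read position across the run:

      import numpy as np

      def flag_suspect_bases(read_quals, pos_mean, pos_std, n_sigma=3.0):
          """Flag bases whose quality is unusually low for their read position.

          read_quals : quality scores of one read (1-D array)
          pos_mean   : per-position mean quality over the whole run
          pos_std    : per-position standard deviation over the whole run
          """
          q = np.asarray(read_quals, dtype=float)
          z = (q - np.asarray(pos_mean)[:len(q)]) / np.maximum(np.asarray(pos_std)[:len(q)], 1e-6)
          return np.where(z < -n_sigma)[0]    # indices of suspect bases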

  20. ADEPT, a dynamic next generation sequencing data error-detection program with trimming

    DOE PAGES

    Feng, Shihai; Lo, Chien-Chi; Li, Po-E; ...

    2016-02-29

    Illumina is the most widely used next generation sequencing technology and produces millions of short reads that contain errors. These sequencing errors constitute a major problem in applications such as de novo genome assembly, metagenomics analysis and single nucleotide polymorphism discovery. In this study, we present ADEPT, a dynamic error detection method based on the quality scores of each nucleotide and its neighboring nucleotides, together with their positions within the read, which are compared to the position-specific quality score distribution of all bases within the sequencing run. This method greatly improves upon other available methods in terms of the true positive rate of error discovery without affecting the false positive rate, particularly within the middle of reads. We conclude that ADEPT is the only tool to date that dynamically assesses errors within reads by comparing position-specific and neighboring base quality scores with the distribution of quality scores for the dataset being analyzed. The result is a method that is less prone to position-dependent under-prediction, which is one of the most prominent issues in error prediction. The outcome is that ADEPT improves upon prior efforts in identifying true errors, primarily within the middle of reads, while reducing the false positive rate.

  1. Revisiting Recombination Signal in the Tick-Borne Encephalitis Virus: A Simulation Approach

    PubMed Central

    Johansson, Magnus; Norberg, Peter

    2016-01-01

    The hypothesis of widespread reticulate evolution in Tick-Borne Encephalitis virus (TBEV) has recently gained momentum with several publications describing past recombination events involving various TBEV clades. Despite a large body of work, no consensus has yet emerged on TBEV evolutionary dynamics. Understanding the occurrence and frequency of recombination in TBEV has significant implications for epidemiology, evolution, and vaccination with live vaccines. In this study, we investigated the possibility of detecting recombination events in TBEV by simulating recombinations at several locations on the virus’ phylogenetic tree and for different lengths of recombining fragments. We derived estimates of true and false positive rates for the detection of past recombination events for seven recombination detection algorithms. Our analytical framework can be applied to any investigation dealing with the difficult task of distinguishing genuine recombination signal from background noise. Our results suggest that the problem of false positives associated with low detection P-values in TBEV is more insidious than generally acknowledged. We reappraised the recombination signals present in the empirical data, and showed that reliable signals could only be obtained in a few cases when highly genetically divergent strains were involved, whereas false positives were common among genetically similar strains. We thus conclude that recombination among wild-type TBEV strains may occur, which has potential implications for vaccination with live vaccines, but that these events are surprisingly rare. PMID:27760182

  2. cnvScan: a CNV screening and annotation tool to improve the clinical utility of computational CNV prediction from exome sequencing data.

    PubMed

    Samarakoon, Pubudu Saneth; Sorte, Hanne Sørmo; Stray-Pedersen, Asbjørg; Rødningen, Olaug Kristin; Rognes, Torbjørn; Lyle, Robert

    2016-01-14

    With advances in next generation sequencing technology and analysis methods, single nucleotide variants (SNVs) and indels can be detected with high sensitivity and specificity in exome sequencing data. Recent studies have demonstrated the ability to detect disease-causing copy number variants (CNVs) in exome sequencing data. However, exonic CNV prediction programs have shown high false positive CNV counts, which is the major limiting factor for the applicability of these programs in clinical studies. We have developed a tool (cnvScan) to improve the clinical utility of computational CNV prediction in exome data. cnvScan can accept input from any CNV prediction program. cnvScan consists of two steps: CNV screening and CNV annotation. CNV screening evaluates CNV prediction using quality scores and refines this using an in-house CNV database, which greatly reduces the false positive rate. The annotation step provides functionally and clinically relevant information using multiple source datasets. We assessed the performance of cnvScan on CNV predictions from five different prediction programs using 64 exomes from Primary Immunodeficiency (PIDD) patients, and identified PIDD-causing CNVs in three individuals from two different families. In summary, cnvScan reduces the time and effort required to detect disease-causing CNVs by reducing the false positive count and providing annotation. This improves the clinical utility of CNV detection in exome data.

  3. Pitfalls in the molecular genetic diagnosis of Leber hereditary optic neuropathy (LHON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johns, D.R.; Neufeld, M.J.

    1993-10-01

    Pathogenetic mutations in mtDNA are found in the majority of patients with Leber hereditary optic neuropathy (LHON), and molecular genetic techniques to detect them are important for diagnosis. A false-positive molecular genetic error has adverse consequences for the diagnosis of this maternally inherited disease. The authors found a number of mtDNA polymorphisms that occur adjacent to known LHON-associated mutations and that confound their molecular genetic detection. These transition mutations occur at mtDNA nt 11779 (SfaNI site loss, 11778 mutation), nt 3459 (BsaHI site loss, 3460 mutation), nt 15258 (AccI site loss, 15257 mutation), nt 14485 (mismatch primer Sau3AI site loss, 14484 mutation), and nt 13707 (BstNI site loss, 13708 mutation). Molecular genetic detection of the most common pathogenetic mtDNA mutations in LHON, using a single restriction enzyme, may be confounded by adjacent polymorphisms that occur with a false-positive rate of 2%-7%. 19 refs.

  4. Application of artificial neural network to search for gravitational-wave signals associated with short gamma-ray bursts

    NASA Astrophysics Data System (ADS)

    Kim, Kyungmin; Harry, Ian W.; Hodge, Kari A.; Kim, Young-Min; Lee, Chang-Hwan; Lee, Hyun Kyu; Oh, John J.; Oh, Sang Hoon; Son, Edwin J.

    2015-12-01

    We apply a machine learning algorithm, the artificial neural network, to the search for gravitational-wave signals associated with short gamma-ray bursts (GRBs). The multi-dimensional samples consisting of data corresponding to the statistical and physical quantities from the coherent search pipeline are fed into the artificial neural network to distinguish simulated gravitational-wave signals from background noise artifacts. Our result shows that the data classification efficiency at a fixed false alarm probability (FAP) is improved by the artificial neural network in comparison to the conventional detection statistic. Specifically, the distance at 50% detection probability at a fixed false positive rate is increased about 8%-14% for the considered waveform models. We also evaluate a few seconds of the gravitational-wave data segment using the trained networks and obtain the FAP. We suggest that the artificial neural network can be a complementary method to the conventional detection statistic for identifying gravitational-wave signals related to the short GRBs.

  5. Limits of detection and decision. Part 3

    NASA Astrophysics Data System (ADS)

    Voigtman, E.

    2008-02-01

    It has been shown that the MARLAP (Multi-Agency Radiological Laboratory Analytical Protocols) method for estimating the Currie detection limit, which is based on 'critical values of the non-centrality parameter of the non-central t distribution', is intrinsically biased, even if no calibration curve or regression is used. This completed the refutation of the method, begun in Part 2. With the field cleared of obstructions, the true theory underlying Currie's limits of decision, detection and quantification, as they apply in a simple linear chemical measurement system (CMS) having heteroscedastic, Gaussian measurement noise and using weighted least squares (WLS) processing, was then derived. Extensive Monte Carlo simulations were performed, on 900 million independent calibration curves, for linear, "hockey stick" and quadratic noise precision models (NPMs). With errorless NPM parameters, all the simulation results were found to be in excellent agreement with the derived theoretical expressions. Even with as much as 30% noise on all of the relevant NPM parameters, the worst absolute errors in the rates of false positives and false negatives were only 0.3%.
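
    For orientation, in the simplest homoscedastic Gaussian case with errorless parameters, Currie's decision and detection limits reduce to the familiar expressions below; the heteroscedastic, weighted-least-squares case treated in the paper generalizes them.

      L_C = z_{1-\alpha}\,\sigma_0, \qquad
      L_D = L_C + z_{1-\beta}\,\sigma_D \approx 2\,z_{1-\alpha}\,\sigma_0 \approx 3.29\,\sigma_0
      \quad (\alpha = \beta = 0.05,\ \sigma_D \approx \sigma_0)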

  6. Evaluation of culture- and PCR-based detection methods for Escherichia coli O157:H7 in inoculated ground beef.

    PubMed

    Arthur, Terrance M; Bosilevac, Joseph M; Nou, Xiangwu; Koohmaraie, Mohammad

    2005-08-01

    Currently, several beef processors employ test-and-hold systems for increased quality control of ground beef. In such programs, each lot of product must be tested and found negative for Escherichia coli O157:H7 prior to release of the product into commerce. Optimization of three testing attributes (detection time, specificity, and sensitivity) is critical to the success of such strategies. Because ground beef is a highly perishable product, the testing methodology used must be as rapid as possible. The test also must have a low false-positive result rate so product is not needlessly discarded. False-negative results cannot be tolerated because they would allow contaminated product to be released and potentially cause disease. In this study, two culture-based and three PCR-based methods for detecting E. coli O157:H7 in ground beef were compared for their abilities to meet the above criteria. Ground beef samples were individually spiked with five genetically distinct strains of E. coli O157: H7 at concentrations of 17 and 1.7 CFU/65 g and then subjected to the various testing methodologies. There was no difference (P > 0.05) in the abilities of the PCR-based methods to detect E. coli O157:H7 inoculated in ground beef at 1.7 CFU/65 g. The culture-based systems detected more positive samples than did the PCR-based systems, but the detection times (21 to 48 h) were at least 9 h longer than those for the PCR-based methods (7.5 to 12 h). Ground beef samples were also spiked with potentially cross-reactive strains. The PCR-based systems that employed an immunomagnetic separation step prior to detection produced fewer false-positive results.

  7. Deep learning algorithms for detecting explosive hazards in ground penetrating radar data

    NASA Astrophysics Data System (ADS)

    Besaw, Lance E.; Stimac, Philip J.

    2014-05-01

    Buried explosive hazards (BEHs) have been, and continue to be, one of the most deadly threats in modern conflicts. Current handheld sensors rely on a highly trained operator for them to be effective in detecting BEHs. New algorithms are needed to reduce the burden on the operator and improve the performance of handheld BEH detectors. Traditional anomaly detection and discrimination algorithms use "hand-engineered" feature extraction techniques to characterize and classify threats. In this work we use a Deep Belief Network (DBN) to transcend the traditional approaches of BEH detection (e.g., principal component analysis and real-time novelty detection techniques). DBNs are pretrained using an unsupervised learning algorithm to generate compressed representations of unlabeled input data and form feature detectors. They are then fine-tuned using a supervised learning algorithm to form a predictive model. Using ground penetrating radar (GPR) data collected by a robotic cart swinging a handheld detector, our research demonstrates that relatively small DBNs can learn to model GPR background signals and detect BEHs with an acceptable false alarm rate (FAR). In this work, our DBNs achieved 91% probability of detection (Pd) with 1.4 false alarms per square meter when evaluated on anti-tank and anti-personnel targets at temperate and arid test sites. This research demonstrates that DBNs are a viable approach to detect and classify BEHs.

  8. Towards Comprehensive Variation Models for Designing Vehicle Monitoring Systems

    NASA Technical Reports Server (NTRS)

    McAdams, Daniel A.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    When designing vehicle vibration monitoring systems for aerospace devices, it is common to use well-established models of vibration features to determine whether failures or defects exist. Most of the algorithms used for failure detection rely on these models to detect significant changes in a flight environment. In actual practice, however, most vehicle vibration monitoring systems are corrupted by high rates of false alarms and missed detections. This crucial roadblock makes their implementation in real vehicles (e.g., helicopter transmissions and aircraft engines) difficult, making their operation costly and unreliable. Research conducted at the NASA Ames Research Center has determined that a major reason for the high rates of false alarms and missed detections is the numerous sources of statistical variations that are not taken into account in the modeling assumptions. In this paper, we address one such source of variations, namely, those caused during the design and manufacturing of rotating machinery components that make up aerospace systems. We present a novel way of modeling the vibration response by including design variations via probabilistic methods. Using such models, we develop a methodology to account for design and manufacturing variations, and explore the changes in the vibration response to determine its stochastic nature. We explore the potential of the methodology using a nonlinear cam-follower model, where the spring stiffness values are assumed to follow a normal distribution. The results demonstrate initial feasibility of the method, showing great promise in developing a general methodology for designing more accurate aerospace vehicle monitoring systems.

  9. Linker-assisted immunoassay and liquid chromatography/mass spectrometry for the analysis of glyphosate

    USGS Publications Warehouse

    Lee, E.A.; Zimmerman, L.R.; Bhullar, B.S.; Thurman, E.M.

    2002-01-01

    A novel, sensitive, linker-assisted enzyme-linked immunosorbent assay (L'ELISA) was compared to on-line solid-phase extraction (SPE) with high-performance liquid chromatography/mass spectrometry (HPLC/MS) for the analysis of glyphosate in surface water and groundwater samples. The L'ELISA used succinic anhydride to derivatize glyphosate, which mimics the epitopic attachment of glyphosate to horseradish peroxidase hapten. Thus, L'ELISA recognized the derivatized glyphosate more effectively (detection limit of 0.1 μg/L) and with increased sensitivity (10-100 times) over conventional ELISA and showed the potential for other applications. The precision and accuracy of L'ELISA was then compared with on-line SPE/HPLC/MS, which detected glyphosate and its degradate derivatized with 9-fluorenylmethyl chloroformate using negative-ion electrospray (detection limit 0.1 μg/L, relative standard deviation ±15%). Derivatization efficiency and matrix effects were minimized by adding an isotope-labeled glyphosate (2-13C15N). The accuracy of L'ELISA gave a false positive rate of 18% between 0.1 and 1.0 μg/L and a false positive rate of only 1% above 1.0 μg/L. The relative standard deviation was ±20%. The correlation of L'ELISA and HPLC/MS for 66 surface water and groundwater samples was 0.97 with a slope of 1.28, with many detections of glyphosate and its degradate in surface water but not in groundwater.

  10. Securing Collaborative Spectrum Sensing against Untrustworthy Secondary Users in Cognitive Radio Networks

    NASA Astrophysics Data System (ADS)

    Wang, Wenkai; Li, Husheng; Sun, Yan(Lindsay); Han, Zhu

    2009-12-01

    Cognitive radio is a revolutionary paradigm to mitigate the spectrum scarcity problem in wireless networks. In cognitive radio networks, collaborative spectrum sensing is considered an effective method to improve the performance of primary user detection. In current collaborative spectrum sensing schemes, secondary users are usually assumed to report their sensing information honestly. However, compromised nodes can send false sensing information to mislead the system. In this paper, we study the detection of untrustworthy secondary users in cognitive radio networks. We first analyze the case when there is only one compromised node in collaborative spectrum sensing schemes. Then we investigate the scenario in which there are multiple compromised nodes. Defense schemes are proposed to detect malicious nodes according to their reporting histories. We calculate the suspicious level of all nodes based on their reports. The reports from nodes with high suspicious levels are excluded from decision-making. Compared with existing defense methods, the proposed scheme can effectively differentiate malicious nodes from honest nodes. As a result, it can significantly improve the performance of collaborative sensing. For example, when there are 10 secondary users, with the primary user detection rate being equal to 0.99, one malicious user can make the false alarm rate increase to 72%. The proposed scheme can reduce it to 5%. Two malicious users can make the false alarm rate increase to 85%, and the proposed scheme reduces it to 8%.
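
    A minimal sketch of the reporting-history idea (the weighting scheme here is hypothetical and not the authors' exact suspicious-level formula): each node's report is compared with the fused decision, nodes that disagree too often accumulate suspicion, and highly suspicious nodes are excluded from the vote.

      def update_suspicion(suspicion, reports, fused_decision, step=0.1, decay=0.02):
          """Increase suspicion of nodes whose 0/1 report contradicts the fused decision."""
          for node, report in reports.items():
              if report != fused_decision:
                  suspicion[node] = suspicion.get(node, 0.0) + step
              else:
                  suspicion[node] = max(0.0, suspicion.get(node, 0.0) - decay)
          return suspicion

      def fuse(reports, suspicion, threshold=0.5):
          """Majority vote over nodes that are not (yet) considered suspicious."""
          trusted = [r for n, r in reports.items() if suspicion.get(n, 0.0) < threshold]
          if not trusted:
              return 0
          return int(sum(trusted) >= len(trusted) / 2.0)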

  11. A multi-scale tensor voting approach for small retinal vessel segmentation in high resolution fundus images.

    PubMed

    Christodoulidis, Argyrios; Hurtut, Thomas; Tahar, Houssem Ben; Cheriet, Farida

    2016-09-01

    Segmenting the retinal vessels from fundus images is a prerequisite for many CAD systems for the automatic detection of diabetic retinopathy lesions. So far, research efforts have concentrated mainly on the accurate localization of the large to medium diameter vessels. However, failure to detect the smallest vessels at the segmentation step can lead to false positive lesion detection counts in a subsequent lesion analysis stage. In this study, a new hybrid method for the segmentation of the smallest vessels is proposed. Line detection and perceptual organization techniques are combined in a multi-scale scheme. Small vessels are reconstructed from the perceptual-based approach via tracking and pixel painting. The segmentation was validated in a high resolution fundus image database including healthy and diabetic subjects using pixel-based as well as perceptual-based measures. The proposed method achieves 85.06% sensitivity rate, while the original multi-scale line detection method achieves 81.06% sensitivity rate for the corresponding images (p<0.05). The improvement in the sensitivity rate for the database is 6.47% when only the smallest vessels are considered (p<0.05). For the perceptual-based measure, the proposed method improves the detection of the vasculature by 7.8% against the original multi-scale line detection method (p<0.05). Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Assessment of frequency specific auditory steady-state response using amplitude modulation with 2-order exponential envelope.

    PubMed

    Cevallos-Larrea, Pablo; Pereira, Thobias; Santos, Wagner; Frota, Silvana M; Infantosi, Antonio F; Ichinose, Roberto M; Tierra-Criollo, Carlos

    2016-08-01

    This study investigated the performance of Frequency Specific Auditory Steady-State Response (FS-ASSR) detection elicited by an amplitude-modulated tone with a 2nd-order exponential envelope (AM2), using the objective response detection (ORD) techniques of the Spectral F-Test (SFT) and Magnitude Squared Coherence (MSC). ASSRs from 24 normal hearing adults were obtained during binaural multi-tone stimulation with amplitude-modulation (AM) and AM2 at intensities of 60, 45 and 30 dBSPL. The carrier frequencies were 500, 1000, 2000, and 4000 Hz, modulated between 77 and 105 Hz. AM2 achieved FS-ASSR amplitudes 16%, 18% and 12% higher than AM at 60, 45 and 30 dBSPL, respectively, with the largest increase at 500 Hz (22.5%). AM2 increased the detection rate (DR) by up to 8.3% at 500 Hz at 30 dBSPL, which is particularly beneficial for FS-ASSR detection near the hearing threshold. In addition, responses at 1000 and 4000 Hz were consistently increased. The MSC and SFT showed no differences in DR. The false detection rate (FDR) was close to 5% for both techniques and both tone types. Detection times to reach DR over 90% were 3.5 and 4.9 min at 60 and 45 dBSPL, respectively. Further investigation concerning efficient multiple FS-ASSR is still necessary, such as testing subjects with hearing loss.
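
    For context, the magnitude-squared coherence statistic commonly used for objective response detection over M epochs, with Y_i(f) the spectrum of epoch i at the modulation frequency f, is

      \hat{\kappa}^2(f) = \frac{\bigl|\sum_{i=1}^{M} Y_i(f)\bigr|^2}{M \sum_{i=1}^{M} |Y_i(f)|^2}

    where values near 1 indicate a phase-locked response and the detection threshold is chosen for a nominal 5% false detection rate.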

  13. Role of Urine Drug Testing in the Current Opioid Epidemic.

    PubMed

    Mahajan, Gagan

    2017-12-01

    While the evidence for urine drug testing for patients on chronic opioid therapy is weak, the guidelines created by numerous medical societies and state and federal regulatory agencies recommend that it be included as one of the tools used to monitor patients for compliance with chronic opioid therapy. To get the most comprehensive results, clinicians should order both an immunoassay screen and confirmatory urine drug test. The immunoassay screen, which can be performed as an in-office point-of-care test or as a laboratory-based test, is a cheap and convenient study to order. Limitations of an immunoassay screen, however, include having a high threshold of detectability and only providing qualitative information about a select number of drug classes. Because of these restrictions, clinicians should understand that immunoassay screens have high false-positive and false-negative rates. Despite these limitations, though, the results can assist the clinician with making preliminary treatment decisions. In comparison, a confirmatory urine drug test, which can only be performed as a laboratory-based test, has a lower threshold of detectability and provides both qualitative and quantitative information. A urine drug test's greater degree of specificity allows for a relatively low false-negative and false-positive rate in contrast to an immunoassay screen. Like any other diagnostic test, an immunoassay screen and a confirmatory urine drug test both possess limitations. Clinicians must keep this in mind when interpreting an unexpected test result and consult with their laboratory when in doubt about the meaning of the test result to avoid making erroneous decisions that negatively impact both the patient and clinician.

  14. Real-time heart rate measurement for multi-people using compressive tracking

    NASA Astrophysics Data System (ADS)

    Liu, Lingling; Zhao, Yuejin; Liu, Ming; Kong, Lingqin; Dong, Liquan; Ma, Feilong; Pang, Zongguang; Cai, Zhi; Zhang, Yachu; Hua, Peng; Yuan, Ruifeng

    2017-09-01

    The rise of the aging population has created a demand for inexpensive, unobtrusive, automated health care solutions. Image photoplethysmography (IPPG) aids in the development of these solutions by allowing physiological signals to be extracted from video data. However, the main deficiencies of recent IPPG methods are that they are non-automated, non-real-time and susceptible to motion artifacts (MA). In this paper, a real-time heart rate (HR) detection method for multiple subjects simultaneously is proposed and realized using the open computer vision (OpenCV) library. It consists of automatically capturing facial video of multiple subjects through a webcam, detecting the region of interest (ROI) in the video, reducing the false detection rate with our improved Adaboost algorithm, reducing MA with our improved compressive tracking (CT) algorithm, a wavelet noise-suppression algorithm for denoising, and multi-threading for higher detection speed. For comparison, HR was measured simultaneously using a medical pulse oximetry device for every subject during all sessions. Experimental results on a data set of 30 subjects show that the maximum average absolute error of heart rate estimation is less than 8 beats per minute (BPM), and the processing speed has almost reached real time: experiments with video recordings of ten subjects at a pixel resolution of 600 × 800 pixels show an average processing speed of about 17 frames per second (fps).
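
    A bare-bones sketch of the spectral step of IPPG heart-rate estimation (illustrative only; the pipeline described above adds face detection, the improved Adaboost and compressive tracking stages, and wavelet denoising): average the ROI's green channel in each frame, then take the dominant frequency in the physiological band.

      import numpy as np

      def estimate_hr(green_means, fps, lo_bpm=45.0, hi_bpm=180.0):
          """Estimate heart rate (BPM) from a per-frame mean-green-channel signal."""
          x = np.asarray(green_means, dtype=float)
          x = x - x.mean()                              # remove DC component
          spectrum = np.abs(np.fft.rfft(x))
          freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)  # Hz
          band = (freqs >= lo_bpm / 60.0) & (freqs <= hi_bpm / 60.0)
          if not band.any():
              return float('nan')
          peak = freqs[band][np.argmax(spectrum[band])]
          return 60.0 * peak                            # beats per minute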

  15. Anomaly metrics to differentiate threat sources from benign sources in primary vehicle screening.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, Israel Dov; Mengesha, Wondwosen

    2011-09-01

    Discrimination of benign sources from threat sources at ports of entry (POE) is of great importance for efficient screening of cargo and vehicles using radiation portal monitors (RPMs). Currently, the ability of RPMs to distinguish these radiological sources is seriously hampered by the energy resolution of the deployed detectors. As naturally occurring radioactive materials (NORM) are ubiquitous in commerce, false alarms are problematic: they require additional resources in secondary inspection and impact commerce. To increase the sensitivity of such detection systems without increasing false alarm rates, alarm metrics need to incorporate the ability to distinguish benign and threat sources. Principal component analysis (PCA) and clustering techniques were implemented in the present study and investigated for their potential to lower false alarm rates and/or increase sensitivity to weaker threat sources without loss of specificity. Results of the investigation demonstrated improved sensitivity and specificity in discriminating benign sources from threat sources.
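
    One plausible reading of the PCA-based metric, sketched here purely as an assumption rather than as the implementation used in the study: fit principal components to benign (NORM-dominated) spectra and score new vehicles by reconstruction error, so sources that do not resemble commerce-typical NORM stand out.

      import numpy as np
      from sklearn.decomposition import PCA

      def fit_norm_model(benign_spectra, n_components=5):
          """Fit a PCA model of benign (NORM) gamma spectra (n_samples x n_channels)."""
          pca = PCA(n_components=n_components)
          pca.fit(benign_spectra)
          return pca

      def anomaly_score(pca, spectrum):
          """Reconstruction error of a spectrum under the benign PCA model."""
          s = np.asarray(spectrum, dtype=float).reshape(1, -1)
          recon = pca.inverse_transform(pca.transform(s))
          return float(np.linalg.norm(s - recon))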

  16. Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ondrej Linda; Todd Vollmer; Milos Manic

    The planned large-scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks. The implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures multiple communication streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require specification of a sensitivity threshold, which inevitably leads to a tradeoff between false positive and false negative rates. In order to alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams, allowing accurate normal network behavior models to be learned. Furthermore, the developed system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.

  17. The Third Swift Burst Alert Telescope Gamma-Ray Burst Catalog

    NASA Astrophysics Data System (ADS)

    Lien, Amy; Sakamoto, Takanori; Barthelmy, Scott D.; Baumgartner, Wayne H.; Cannizzo, John K.; Chen, Kevin; Collins, Nicholas R.; Cummings, Jay R.; Gehrels, Neil; Krimm, Hans A.; Markwardt, Craig. B.; Palmer, David M.; Stamatikos, Michael; Troja, Eleonora; Ukwatta, T. N.

    2016-09-01

    To date, the Burst Alert Telescope (BAT) onboard Swift has detected ˜1000 gamma-ray bursts (GRBs), of which ˜360 GRBs have redshift measurements, ranging from z = 0.03 to z = 9.38. We present the analyses of the BAT-detected GRBs for the past ˜11 years up through GRB 151027B. We report summaries of both the temporal and spectral analyses of the GRB characteristics using event data (i.e., data for each photon within approximately 250 s before and 950 s after the BAT trigger time), and discuss the instrumental sensitivity and selection effects of GRB detections. We also explore the GRB properties with redshift when possible. The result summaries and data products are available at http://swift.gsfc.nasa.gov/results/batgrbcat/index.html. In addition, we perform searches for GRB emissions before or after the event data using the BAT survey data. We estimate the false detection rate to be only one false detection in this sample. There are 15 ultra-long GRBs (˜2% of the BAT GRBs) in this search with confirmed emission beyond ˜1000 s of event data, and only two GRBs (GRB 100316D and GRB 101024A) with detections in the survey data prior to the starting of event data.

  18. THE THIRD SWIFT BURST ALERT TELESCOPE GAMMA-RAY BURST CATALOG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lien, Amy; Baumgartner, Wayne H.; Cannizzo, John K.

    2016-09-20

    To date, the Burst Alert Telescope (BAT) onboard Swift has detected ∼1000 gamma-ray bursts (GRBs), of which ∼360 GRBs have redshift measurements, ranging from z = 0.03 to z = 9.38. We present the analyses of the BAT-detected GRBs for the past ∼11 years up through GRB 151027B. We report summaries of both the temporal and spectral analyses of the GRB characteristics using event data (i.e., data for each photon within approximately 250 s before and 950 s after the BAT trigger time), and discuss the instrumental sensitivity and selection effects of GRB detections. We also explore the GRB properties with redshift when possible. The result summaries and data products are available at http://swift.gsfc.nasa.gov/results/batgrbcat/index.html. In addition, we perform searches for GRB emissions before or after the event data using the BAT survey data. We estimate the false detection rate to be only one false detection in this sample. There are 15 ultra-long GRBs (∼2% of the BAT GRBs) in this search with confirmed emission beyond ∼1000 s of event data, and only two GRBs (GRB 100316D and GRB 101024A) with detections in the survey data prior to the starting of event data.

  19. Computer-aided detection of pulmonary embolism at CT pulmonary angiography: can it improve performance of inexperienced readers?

    PubMed

    Blackmon, Kevin N; Florin, Charles; Bogoni, Luca; McCain, Joshua W; Koonce, James D; Lee, Heon; Bastarrika, Gorka; Thilo, Christian; Costello, Philip; Salganicoff, Marcos; Joseph Schoepf, U

    2011-06-01

    To evaluate the effect of a computer-aided detection (CAD) algorithm on the performance of novice readers for detection of pulmonary embolism (PE) at CT pulmonary angiography (CTPA). We included CTPA examinations of 79 patients (50 female, 52 ± 18 years). Studies were evaluated by two independent inexperienced readers who marked all vessels containing PE. After 3 months all studies were reevaluated by the same two readers, this time aided by CAD prototype. A consensus read by three expert radiologists served as the reference standard. Statistical analysis used χ(2) and McNemar testing. Expert consensus revealed 119 PEs in 32 studies. For PE detection, the sensitivity of CAD alone was 78%. Inexperienced readers' initial interpretations had an average per-PE sensitivity of 50%, which improved to 71% (p < 0.001) with CAD as a second reader. False positives increased from 0.18 to 0.25 per study (p = 0.03). Per-study, the readers initially detected 27/32 positive studies (84%); with CAD this number increased to 29.5 studies (92%; p = 0.125). Our results suggest that CAD significantly improves the sensitivity of PE detection for inexperienced readers with a small but appreciable increase in the rate of false positives.

  20. A multicenter hospital-based diagnosis study of automated breast ultrasound system in detecting breast cancer among Chinese women.

    PubMed

    Zhang, Xi; Lin, Xi; Tan, Yanjuan; Zhu, Ying; Wang, Hui; Feng, Ruimei; Tang, Guoxue; Zhou, Xiang; Li, Anhua; Qiao, Youlin

    2018-04-01

    The automated breast ultrasound system (ABUS) is a potential method for breast cancer detection; however, its diagnostic performance remains unclear. We conducted a hospital-based multicenter diagnostic study to evaluate the clinical performance of the ABUS for breast cancer detection by comparing it to handheld ultrasound (HHUS) and mammography (MG). Eligible participants underwent HHUS and ABUS testing; women aged 40-69 years additionally underwent MG. Images were interpreted using the Breast Imaging Reporting and Data System (BI-RADS). Women in the BI-RADS categories 1-2 were considered negative. Women classified as BI-RADS 3 underwent magnetic resonance imaging to distinguish true- and false-negative results. Core aspiration or surgical biopsy was performed in women classified as BI-RADS 4-5, followed by a pathological diagnosis. Kappa values and agreement rates were calculated between ABUS, HHUS and MG. A total of 1,973 women were included in the final analysis. Of these, 1,353 (68.6%) and 620 (31.4%) were classified as BI-RADS categories 1-3 and 4-5, respectively. In the older age group, the agreement rate and Kappa value between the ABUS and HHUS were 94.0% and 0.860 (P<0.001), respectively; they were 89.2% and 0.735 (P<0.001) between the ABUS and MG, respectively. Regarding consistency between imaging and pathology results, 78.6% of women classified as BI-RADS 4-5 based on the ABUS were diagnosed with precancerous lesions or cancer; which was 7.2% higher than that of women based on HHUS. For BI-RADS 1-2, the false-negative rates of the ABUS and HHUS were almost identical and were much lower than those of MG. We observed a good diagnostic reliability for the ABUS. Considering its performance for breast cancer detection in women with high-density breasts and its lower operator dependence, the ABUS is a promising option for breast cancer detection in China.

  1. Sample Selection for Training Cascade Detectors.

    PubMed

    Vállez, Noelia; Deniz, Oscar; Bueno, Gloria

    2015-01-01

    Automatic detection systems usually require large and representative training datasets in order to obtain good detection and false positive rates. Training datasets are such that the positive set has few samples and/or the negative set should represent anything except the object of interest. In this respect, the negative set typically contains orders of magnitude more images than the positive set. However, imbalanced training databases lead to biased classifiers. In this paper, we focus our attention on a negative sample selection method to properly balance the training data for cascade detectors. The method is based on the selection of the most informative false positive samples generated in one stage to feed the next stage. The results show that the proposed cascade detector with sample selection obtains on average better partial AUC and smaller standard deviation than the other compared cascade detectors.
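
    The selection step can be sketched as a standard hard-negative mining loop (illustrative only, with a user-supplied train_stage callable; this is not the authors' exact procedure): after each stage is trained, the false positives the cascade still produces on the negative pool become the 'most informative' negatives for the next stage.

      def train_cascade(train_stage, positives, negative_pool, n_stages, n_neg_per_stage):
          """Train a cascade, feeding each stage the previous stages' false positives.

          train_stage     : callable(positives, negatives) -> classifier with .predict()
          negative_pool   : list of negative samples (anything except the object)
          n_neg_per_stage : number of negatives used to train each stage
          """
          stages, negatives = [], list(negative_pool[:n_neg_per_stage])
          for _ in range(n_stages):
              clf = train_stage(positives, negatives)
              stages.append(clf)
              # keep only negatives the cascade still misclassifies (false positives)
              hard = [x for x in negative_pool
                      if all(s.predict([x])[0] == 1 for s in stages)]
              negatives = hard[:n_neg_per_stage]
              if not negatives:
                  break
          return stages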

  2. Modeling inter-signal arrival times for accurate detection of CAN bus signal injection attacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Michael Roy; Bridges, Robert A; Combs, Frank L

    Modern vehicles rely on hundreds of on-board electronic control units (ECUs) communicating over in-vehicle networks. As external interfaces to the car control networks (such as the on-board diagnostic (OBD) port, auxiliary media ports, etc.) become common, and vehicle-to-vehicle / vehicle-to-infrastructure technology arrives in the near future, the attack surface for vehicles grows, exposing control networks to potentially life-critical attacks. This paper addresses the need for securing the CAN bus by detecting anomalous traffic patterns via unusual refresh rates of certain commands. While previous works have identified signal frequency as an important feature for CAN bus intrusion detection, this paper provides the first such algorithm with experiments on five attack scenarios. Our data-driven anomaly detection algorithm requires only five seconds of training time (on normal data) and achieves true positive / false discovery rates of 0.9998/0.00298, respectively (micro-averaged across the five experimental tests).
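
    A simplified sketch of frequency-based CAN monitoring (not the algorithm from this paper): learn the typical inter-arrival time of each arbitration ID from a window of attack-free traffic, then flag frames whose timing deviates strongly from that baseline.

      import statistics
      from collections import defaultdict

      def learn_baseline(normal_frames):
          """normal_frames: iterable of (timestamp, can_id) from attack-free traffic."""
          gaps, last = defaultdict(list), {}
          for t, cid in normal_frames:
              if cid in last:
                  gaps[cid].append(t - last[cid])
              last[cid] = t
          return {cid: (statistics.mean(g), statistics.stdev(g))
                  for cid, g in gaps.items() if len(g) > 2}

      def is_anomalous(baseline, cid, gap, n_sigma=4.0):
          """Flag a frame whose inter-arrival gap is far from the learned mean."""
          if cid not in baseline:
              return False                  # unseen ID: handle separately
          mean, std = baseline[cid]
          return abs(gap - mean) > n_sigma * max(std, 1e-6)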

  3. Urban Area Detection in Very High Resolution Remote Sensing Images Using Deep Convolutional Neural Networks.

    PubMed

    Tian, Tian; Li, Chang; Xu, Jinkang; Ma, Jiayi

    2018-03-18

    Detecting urban areas from very high resolution (VHR) remote sensing images plays an important role in the field of Earth observation. The recently-developed deep convolutional neural networks (DCNNs), which can extract rich features from training data automatically, have achieved outstanding performance on many image classification databases. Motivated by this fact, we propose a new urban area detection method based on DCNNs in this paper. The proposed method mainly includes three steps: (i) a visual dictionary is obtained based on the deep features extracted by pre-trained DCNNs; (ii) urban words are learned from labeled images; (iii) the urban regions are detected in a new image based on the nearest dictionary word criterion. The qualitative and quantitative experiments on different datasets demonstrate that the proposed method can obtain a remarkable overall accuracy (OA) and kappa coefficient. Moreover, it can also strike a good balance between the true positive rate (TPR) and false positive rate (FPR).
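
    A rough sketch of the dictionary-and-nearest-word idea (the k-means clustering and the 0/1 urban labels here are assumptions for illustration, not details taken from the paper): cluster deep features into a visual dictionary, give each word the majority label of its labeled patches, and classify new patches by their nearest word.

      import numpy as np
      from sklearn.cluster import KMeans

      def build_dictionary(deep_features, n_words=64):
          """Cluster deep features (n_samples x n_dims) into a visual dictionary."""
          km = KMeans(n_clusters=n_words, n_init=10, random_state=0)
          km.fit(deep_features)
          return km

      def label_words(km, labeled_features, labels):
          """Assign each dictionary word the majority 0/1 label of its member patches."""
          assign = km.predict(labeled_features)
          word_label = np.zeros(km.n_clusters, dtype=int)
          for w in range(km.n_clusters):
              members = labels[assign == w]
              word_label[w] = int(round(members.mean())) if members.size else 0
          return word_label

      def classify(km, word_label, features):
          """Nearest-dictionary-word classification of new patches (1 = urban)."""
          return word_label[km.predict(features)]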

  4. Robust and Effective Component-based Banknote Recognition for the Blind

    PubMed Central

    Hasanuzzaman, Faiz M.; Yang, Xiaodong; Tian, YingLi

    2012-01-01

    We develop a novel camera-based computer vision technology to automatically recognize banknotes for assisting visually impaired people. Our banknote recognition system is robust and effective with the following features: 1) high accuracy: high true recognition rate and low false recognition rate, 2) robustness: handles a variety of currency designs and bills in various conditions, 3) high efficiency: recognizes banknotes quickly, and 4) ease of use: helps blind users to aim the target for image capture. To make the system robust to a variety of conditions including occlusion, rotation, scaling, cluttered background, illumination change, viewpoint variation, and worn or wrinkled bills, we propose a component-based framework by using Speeded Up Robust Features (SURF). Furthermore, we employ the spatial relationship of matched SURF features to detect if there is a bill in the camera view. This process largely alleviates false recognition and can guide the user to correctly aim at the bill to be recognized. The robustness and generalizability of the proposed system is evaluated on a dataset including both positive images (with U.S. banknotes) and negative images (no U.S. banknotes) collected under a variety of conditions. The proposed algorithm, achieves 100% true recognition rate and 0% false recognition rate. Our banknote recognition system is also tested by blind users. PMID:22661884

  5. [Prospective performance evaluation of first trimester screenings in Germany for risk calculation through http://www.firsttrimester.net].

    PubMed

    Kleinsorge, F; Smetanay, K; Rom, J; Hörmansdörfer, C; Hörmannsdörfer, C; Scharf, A; Schmidt, P

    2010-12-01

    In 2008, 2,351 first trimester screenings were calculated using a newly developed internet database (http://www.firsttrimester.net) to evaluate the risk for the presence of Down's syndrome. All data were evaluated by the conventional first trimester screening according to Nicolaides (FTS), based on the previous JOY software, and by the advanced first trimester screening (AFS). After receiving feedback on the karyotype, the rates of correct positives, correct negatives, false positives and false negatives, as well as the sensitivity and specificity, were calculated and compared. Overall, 255 cases that were analysed by both methods were investigated. These included 2 cases of Down's syndrome and one case of trisomy 18. The FTS and the AFS each had a sensitivity of 100%. The specificity was 88.5% for the FTS and 93.0% for the AFS. As already shown in former studies, the higher specificity of the AFS results from a reduction of the false positive rate (28 to 17 cases). As a consequence, the AFS, with a detection rate of 100%, decreases the rate of further invasive diagnostics in pregnant women by yielding 39% fewer positive-tested women. © Georg Thieme Verlag KG Stuttgart · New York.

  6. Comparative performance analysis for computer aided lung nodule detection and segmentation on ultra-low-dose vs. standard-dose CT

    NASA Astrophysics Data System (ADS)

    Wiemker, Rafael; Rogalla, Patrik; Opfer, Roland; Ekin, Ahmet; Romano, Valentina; Bülow, Thomas

    2006-03-01

    The performance of computer aided lung nodule detection (CAD) and computer aided nodule volumetry is compared between standard-dose (70-100 mAs) and ultra-low-dose CT images (5-10 mAs). A direct quantitative performance comparison was possible, since for each patient both an ultra-low-dose and a standard-dose CT scan were acquired within the same examination session. The data sets were recorded with a multi-slice CT scanner at the Charité university hospital in Berlin with 1 mm slice thickness. Our computer aided nodule detection and segmentation algorithms were deployed on both ultra-low-dose and standard-dose CT data without any dose-specific fine-tuning or preprocessing. As a reference standard, 292 nodules from 20 patients were visually identified, each nodule in both the ultra-low-dose and standard-dose data sets. The CAD performance was analyzed by means of multiple FROC curves for different lower thresholds of the nodule diameter. For nodules with a volume-equivalent diameter equal to or larger than 4 mm (149 nodule pairs), we observed a detection rate of 88% at a median false positive rate of 2 per patient in standard-dose images, and an 86% detection rate in ultra-low-dose images, also at 2 FPs per patient. Including even smaller nodules equal to or larger than 2 mm (272 nodule pairs), we observed a detection rate of 86% in standard-dose images, and an 84% detection rate in ultra-low-dose images, both at a rate of 5 FPs per patient. Moreover, we observed a correlation of 94% between the volume-equivalent nodule diameter as automatically measured on ultra-low-dose versus on standard-dose images, indicating that ultra-low-dose CT is also feasible for growth-rate assessment in follow-up examinations. The comparable performance of lung nodule CAD in ultra-low-dose and standard-dose images is of particular interest with respect to lung cancer screening of asymptomatic patients.
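
    Reading a sensitivity off an FROC curve at a fixed false-positive rate per patient, as reported above (e.g. 2 FPs per patient), can be sketched as follows; the candidate scores, boolean ground-truth flags, and patient count are assumed inputs, and per-lesion hit bookkeeping is deliberately simplified.

      import numpy as np

      def froc_sensitivity_at(scores, is_true_nodule, n_patients, target_fp_per_patient):
          """Sensitivity at the laxest threshold whose FPs/patient stays within the target."""
          order = np.argsort(scores)[::-1]            # sweep threshold from strict to lax
          hits = np.asarray(is_true_nodule, bool)[order]
          tp = np.cumsum(hits)
          fp_per_patient = np.cumsum(~hits) / n_patients
          ok = fp_per_patient <= target_fp_per_patient
          return float(tp[ok][-1] / hits.sum()) if ok.any() else 0.0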

  7. Efficacy and Safety of Blue Light Flexible Cystoscopy with Hexaminolevulinate in the Surveillance of Bladder Cancer: A Phase III, Comparative, Multicenter Study.

    PubMed

    Daneshmand, Siamak; Patel, Sanjay; Lotan, Yair; Pohar, Kamal; Trabulsi, Edouard; Woods, Michael; Downs, Tracy; Huang, William; Jones, Jeffrey; O'Donnell, Michael; Bivalacqua, Trinity; DeCastro, Joel; Steinberg, Gary; Kamat, Ashish; Resnick, Matthew; Konety, Badrinath; Schoenberg, Mark; Jones, J Stephen

    2018-05-01

    We compared blue light flexible cystoscopy with white light flexible cystoscopy for the detection of bladder cancer during surveillance. Patients at high risk for recurrence received hexaminolevulinate intravesically before white light flexible cystoscopy and randomization to blue light flexible cystoscopy. All suspicious lesions were documented. Patients with suspicious lesions were referred to the operating room for repeat white and blue light cystoscopy. All suspected lesions were biopsied or resected and specimens were examined by an independent pathology consensus panel. The primary study end point was the proportion of patients with histologically confirmed malignancy detected only with blue light flexible cystoscopy. Additional end points were the false-positive rate, carcinoma in situ detection and additional tumors detected only with blue light cystoscopy. Following surveillance 103 of the 304 patients were referred, including 63 with confirmed malignancy, of whom 26 had carcinoma in situ. In 13 of the 63 patients (20.6%, 95% CI 11.5-32.7) recurrence was seen only with blue light flexible cystoscopy (p <0.0001). Five of these cases were confirmed as carcinoma in situ. Operating room examination confirmed carcinoma in situ in 26 of 63 patients (41%), which was detected only with blue light cystoscopy in 9 of the 26 (34.6%, 95% CI 17.2-55.7, p <0.0001). Blue light cystoscopy identified additional malignant lesions in 29 of the 63 patients (46%). The false-positive rate was 9.1% for white and blue light cystoscopy. None of the 12 adverse events during surveillance were serious. Office based blue light flexible cystoscopy significantly improves the detection of patients with recurrent bladder cancer and it is safe when used for surveillance. Blue light cystoscopy in the operating room significantly improves the detection of carcinoma in situ and detects lesions that are missed with white light cystoscopy. Copyright © 2018 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  8. Detecting Smoking Events Using Accelerometer Data Collected Via Smartwatch Technology: Validation Study

    PubMed Central

    Cole, Casey A; Anshari, Dien; Lambert, Victoria; Thrasher, James F

    2017-01-01

    Background Smoking is the leading cause of preventable death in the world today. Ecological research on smoking in context currently relies on self-reported smoking behavior. Emerging smartwatch technology may more objectively measure smoking behavior by automatically detecting smoking sessions using robust machine learning models. Objective This study aimed to examine the feasibility of detecting smoking behavior using smartwatches. The second aim of this study was to compare the success of observing smoking behavior with smartwatches to that of conventional self-reporting. Methods A convenience sample of smokers was recruited for this study. Participants (N=10) recorded 12 hours of accelerometer data using a mobile phone and smartwatch. During these 12 hours, they engaged in various daily activities, including smoking, for which they logged the beginning and end of each smoking session. Raw data were classified as either smoking or nonsmoking using a machine learning model for pattern recognition. The accuracy of the model was evaluated by comparing the output with a detailed description of a modeled smoking session. Results In total, 120 hours of data were collected from participants and analyzed. The accuracy of self-reported smoking was approximately 78% (96/123). Our model was successful in detecting 100 of 123 (81%) smoking sessions recorded by participants. After eliminating sessions from the participants that did not adhere to study protocols, the true positive detection rate of the smartwatch-based detection increased to more than 90%. During the 120 hours of combined observation time, only 22 false positive smoking sessions were detected, resulting in a 2.8% false positive rate. Conclusions Smartwatch technology can provide an accurate, nonintrusive means of monitoring smoking behavior in natural contexts. The use of machine learning algorithms for passively detecting smoking sessions may enrich ecological momentary assessment protocols and cessation intervention studies that often rely on self-reported behaviors and may not allow for targeted data collection and communications around smoking events. PMID:29237580
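
    The abstract does not specify the classifier, so the sketch below is only a generic stand-in for the described pipeline: window the tri-axial accelerometer stream, compute simple per-window statistics, and train an off-the-shelf classifier on labeled smoking/non-smoking windows. The 20 Hz sampling rate and 5 s windows are assumptions.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def window_features(acc, fs=20, win_s=5):
          """acc: (n_samples, 3) accelerometer array -> per-window summary features."""
          win = fs * win_s
          feats = []
          for i in range(acc.shape[0] // win):
              w = acc[i * win:(i + 1) * win]
              mag = np.linalg.norm(w, axis=1)
              feats.append(np.r_[w.mean(0), w.std(0), mag.mean(), mag.std()])
          return np.array(feats)

      # Hypothetical usage with labeled training windows y:
      # clf = RandomForestClassifier().fit(window_features(train_acc), y)
      # smoking_windows = clf.predict(window_features(new_acc))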

  9. Demonstration of the use of ADAPT to derive predictive maintenance algorithms for the KSC central heat plant

    NASA Technical Reports Server (NTRS)

    Hunter, H. E.

    1972-01-01

    The Avco Data Analysis and Prediction Techniques (ADAPT) were employed to determine laws capable of detecting failures in a heat plant up to three days in advance of the occurrence of the failure. The projected performance of algorithms yielded a detection probability of 90% with false alarm rates of the order of 1 per year for a sample rate of 1 per day with each detection, followed by 3 hourly samplings. This performance was verified on 173 independent test cases. The program also demonstrated diagnostic algorithms and the ability to predict the time of failure to approximately plus or minus 8 hours up to three days in advance of the failure. The ADAPT programs produce simple algorithms which have a unique possibility of a relatively low cost updating procedure. The algorithms were implemented on general purpose computers at Kennedy Space Flight Center and tested against current data.

  10. VizieR Online Data Catalog: Bright white dwarfs IRAC photometry (Barber+, 2016)

    NASA Astrophysics Data System (ADS)

    Barber, S. D.; Belardi, C.; Kilic, M.; Gianninas, A.

    2017-07-01

    Mid-infrared photometry, like the 3.4 and 4.6um photometry available from WISE, is necessary to detect emission from a debris disc orbiting a WD. WISE, however, has poor spatial resolution (6 arcsec beam size) and is known to have a 75 per cent false positive rate for detecting dusty discs around WDs fainter than 14.5(15) mag in W1(W2) (Barber et al. 2014ApJ...786...77B). To mitigate this high rate of spurious detections, we compile higher spatial resolution archival data from the InfraRed Array Camera (IRAC) on the Spitzer Space Telescope. We query the Spitzer Heritage Archive for any observations within 10 arcsec of the 1265 WDs from Gianninas et al. (2011, Cat. J/ApJ/743/138) and find 907 Astronomical Observing Requests (AORs) for 381 WDs. (1 data file).

  11. SAR/multispectral image fusion for the detection of environmental hazards with a GIS

    NASA Astrophysics Data System (ADS)

    Errico, Angela; Angelino, Cesario Vincenzo; Cicala, Luca; Podobinski, Dominik P.; Persechino, Giuseppe; Ferrara, Claudia; Lega, Massimiliano; Vallario, Andrea; Parente, Claudio; Masi, Giuseppe; Gaetano, Raffaele; Scarpa, Giuseppe; Amitrano, Donato; Ruello, Giuseppe; Verdoliva, Luisa; Poggi, Giovanni

    2014-10-01

    In this paper we propose a GIS-based methodology, using optical and SAR remote sensing data, together with more conventional sources, for the detection of small cattle breeding areas, potentially responsible for hazardous littering. This specific environmental problem is very relevant for the Caserta area, in southern Italy, where many small buffalo breeding farms exist which are not even known to the productive activity register, and are not easily monitored and surveyed. Experiments on a test area, with available specific ground truth, prove that the proposed system is characterized by a very large detection probability and a negligible false alarm rate.

  12. Effect of subliminal visual material on an auditory signal detection task.

    PubMed

    Moroney, E; Bross, M

    1984-02-01

    An experiment assessed the effect of subliminally embedded visual material on an auditory detection task. 22 women and 19 men were presented tachistoscopically with words designated as "emotional" or "neutral" on the basis of prior GSRs and a Word Rating List under four conditions: (a) Unembedded Neutral, (b) Embedded Neutral, (c) Unembedded Emotional, and (d) Embedded Emotional. On each trial subjects made forced choices concerning the presence or absence of an auditory tone (1000 Hz) at threshold level; hit and false alarm rates were used to compute non-parametric indices for sensitivity (A') and response bias (B"). While over-all analyses of variance yielded no significant differences, further examination of the data suggests the presence of subliminally "receptive" and "non-receptive" subpopulations.
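
    For reference, the nonparametric indices mentioned above are commonly computed with Grier's (1971) formulas; the snippet below is a generic illustration from hit rate h and false-alarm rate f, not code from the study.

      def a_prime(h, f):
          """Nonparametric sensitivity index A' (Grier, 1971)."""
          if h == f:
              return 0.5
          if h > f:
              return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))
          return 0.5 - ((f - h) * (1 + f - h)) / (4 * f * (1 - h))

      def b_double_prime(h, f):
          """Nonparametric response-bias index B'' (Grier, 1971)."""
          num = h * (1 - h) - f * (1 - f)
          den = h * (1 - h) + f * (1 - f)
          return num / den if den else 0.0

      # Example: a_prime(0.7, 0.2) ≈ 0.83, b_double_prime(0.7, 0.2) ≈ 0.14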

  13. Mobile chemical detector (AP2C+SP4E) as an aid for medical decision making in the battlefield.

    PubMed

    Eisenkraft, Arik; Markel, Gal; Simovich, Shirley; Layish, Ido; Hoffman, Azik; Finkelstein, Arseny; Rotman, Eran; Dushnitsky, Tsvika; Krivoy, Amir

    2007-09-01

    The combination of the AP2C unit with the SP4E kit composes a lightweight mobile detector of chemical warfare agents (CWA), such as nerve and mustard agents, with both vapor- and liquid-sampling capabilities. This apparatus was recently introduced into our military medical units as an aid for detection of CWA on casualties. Importantly, critical information regarding its applicability in the battlefield was absent. In view of the serious consequences that might follow a proclamation of CWA recognition in the battlefield, a high false-positive rate calls into question the utilization of this apparatus as a medical decision tool. We have therefore conducted a field experiment to test the false-positive rate as well as analyze possible factors leading to false-positive readings with this device. The experiment was carried out before and after a 4-day army field exercise, using a standard AP2C device, a SP4E surface sampling kit, and a specially designed medical sampling kit for casualties, intended for medical teams. Soldiers were examined at rest, after mild exercise, and after 4 days in the field. The readings with the AP2C alone were compared to the combination of AP2C and SP4E and to the medical sampling kit. Various body fluids served as negative controls. Remarkably, we found a false-positive rate of 57% at rest and after mild exercise, and an even higher rate of 64% after the 4-day field exercise with the AP2C detector alone, as compared to almost no false-positive readings with the combination of AP2C and SP4E. Strikingly, the medical sampling kit yielded numerous false-positive readings, even in normal body fluids such as blood, urine, and saliva. We therefore see no place for using the medical sampling kit due to an unacceptably high rate of false-positive readings. Finally, we have designed an algorithm that uses the entire apparatus of AP2C and SP4E as a reliable validation tool for medical triage in the setting of exposure to nerve agents in the battlefield.

  14. A prospective investigation of fluorescence imaging to detect sentinel lymph nodes at robotic-assisted endometrial cancer staging.

    PubMed

    Paley, Pamela J; Veljovich, Dan S; Press, Joshua Z; Isacson, Christina; Pizer, Ellen; Shah, Chirag

    2016-07-01

    The accuracy of sentinel lymph node mapping has been shown in endometrial cancer, but studies to date have primarily focused on cohorts at low risk for nodal involvement. In our practice, we acknowledge the lack of benefit of lymphadenectomy in the low-risk subgroup and omit lymph node removal in these patients. Thus, our aim was to evaluate the feasibility and accuracy of sentinel node mapping in women at sufficient risk for nodal metastasis warranting lymphadenectomy and in whom the potential benefit of avoiding nodal procurement could be realized. To evaluate the detection rate and accuracy of fluorescence-guided sentinel lymph node mapping in endometrial cancer patients undergoing robotic-assisted staging. One hundred twenty-three endometrial cancer patients undergoing sentinel lymph node mapping using indocyanine green were prospectively evaluated. Two mL (1.0 mg/mL) of dye were injected into the cervical stroma divided between the 2-3 and 9-10 o'clock positions at the time of uterine manipulator placement. Before hysterectomy, the retroperitoneal spaces were developed and fluorescence imaging was used for sentinel node detection. Identified sentinel nodes were removed and submitted for touch prep intraoperatively, followed by permanent assessment with routine hematoxylin and eosin levels. Patients then underwent hysterectomy, bilateral salpingo-oophorectomy, and completion bilateral pelvic and periaortic lymphadenectomy based on intrauterine risk factors determined intraoperatively (tumor size >2 cm, >50% myometrial invasion, and grade 3 histology). Of 123 patients enrolled, at least 1 sentinel node was detected in 119 (96.7%). Ninety-nine patients (80%) had bilateral pelvic or periaortic sentinel nodes detected. A total of 85 patients met criteria warranting completion lymphadenectomy. In 14 patients (16%) periaortic lymphadenectomy was not feasible, and the mean number of pelvic nodes procured was 13 (6-22). Of the 71 patients undergoing pelvic and periaortic lymphadenectomy, the mean nodal count was 23.2 (8-51). Of patients undergoing lymphadenectomy, 10.6% had lymph node metastasis on final hematoxylin and eosin evaluation. Notably, the sentinel node was the only positive node in 44% of cases. There were no cases in which final pathology of the sentinel node was negative and metastatic disease was detected upon completion lymphadenectomy in the non-sentinel nodes (no false negatives), yielding a sensitivity of 100%. Of the 14 sentinel nodes ultimately found to harbor metastases, 3 were negative on touch prep, yielding a sensitivity of 78.6% for intraoperative detection of sentinel node involvement. In all 3 of the false-negative touch preps, final pathology detected a single micrometastasis (0.24 mm, 1.4 mm, 1.5 mm). As expected, there were no false-positive results, yielding a specificity of 100%. No complications related to sentinel node mapping or allergic reactions to the dye were encountered. Intraoperative sentinel node mapping using fluorescence imaging with indocyanine green in endometrial cancer patients is feasible and yields high detection rates. In our pilot study, sentinel node mapping identified all women with Stage IIIC disease. Low false-negative rates are encouraging, and if confirmed in multi-institutional trials, this approach would be anticipated to reduce the morbidity, operative times, and costs associated with complete pelvic and periaortic lymphadenectomy. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. External Quality Assessment for Avian Influenza A (H7N9) Virus Detection Using Armored RNA

    PubMed Central

    Sun, Yu; Jia, Tingting; Sun, Yanli; Han, Yanxi; Wang, Lunan; Zhang, Rui; Zhang, Kuo; Lin, Guigao; Xie, Jiehong

    2013-01-01

    An external quality assessment (EQA) program for the molecular detection of avian influenza A (H7N9) virus was implemented by the National Center for Clinical Laboratories (NCCL) of China in June 2013. Virus-like particles (VLPs) that contained full-length RNA sequences of the hemagglutinin (HA), neuraminidase (NA), matrix protein (MP), and nucleoprotein (NP) genes from the H7N9 virus (armored RNAs) were constructed. The EQA panel, comprising 6 samples with different concentrations of armored RNAs positive for H7N9 viruses and four H7N9-negative samples (including one sample positive for only the MP gene of the H7N9 virus), was distributed to 79 laboratories in China that carry out the molecular detection of H7N9 viruses. The overall performances of the data sets were classified according to the results for the H7 and N9 genes. Consequently, we received 80 data sets (one participating group provided two sets of results) which were generated using commercial (n = 60) or in-house (n = 17) reverse transcription-quantitative PCR (qRT-PCR) kits and a commercial assay that employed isothermal amplification method (n = 3). The results revealed that the majority (82.5%) of the data sets correctly identified the H7N9 virus, while 17.5% of the data sets needed improvements in their diagnostic capabilities. These “improvable” data sets were derived mostly from false-negative results for the N9 gene at relatively low concentrations. The false-negative rate was 5.6%, and the false-positive rate was 0.6%. In addition, we observed varied diagnostic capabilities between the different commercially available kits and the in-house-developed assays, with the assay manufactured by BioPerfectus Technologies (Jiangsu, China) performing better than the others. Overall, the majority of laboratories have reliable diagnostic capacities for the detection of H7N9 virus. PMID:24088846

  16. External quality assessment for Avian Influenza A (H7N9) Virus detection using armored RNA.

    PubMed

    Sun, Yu; Jia, Tingting; Sun, Yanli; Han, Yanxi; Wang, Lunan; Zhang, Rui; Zhang, Kuo; Lin, Guigao; Xie, Jiehong; Li, Jinming

    2013-12-01

    An external quality assessment (EQA) program for the molecular detection of avian influenza A (H7N9) virus was implemented by the National Center for Clinical Laboratories (NCCL) of China in June 2013. Virus-like particles (VLPs) that contained full-length RNA sequences of the hemagglutinin (HA), neuraminidase (NA), matrix protein (MP), and nucleoprotein (NP) genes from the H7N9 virus (armored RNAs) were constructed. The EQA panel, comprising 6 samples with different concentrations of armored RNAs positive for H7N9 viruses and four H7N9-negative samples (including one sample positive for only the MP gene of the H7N9 virus), was distributed to 79 laboratories in China that carry out the molecular detection of H7N9 viruses. The overall performances of the data sets were classified according to the results for the H7 and N9 genes. Consequently, we received 80 data sets (one participating group provided two sets of results) which were generated using commercial (n = 60) or in-house (n = 17) reverse transcription-quantitative PCR (qRT-PCR) kits and a commercial assay that employed isothermal amplification method (n = 3). The results revealed that the majority (82.5%) of the data sets correctly identified the H7N9 virus, while 17.5% of the data sets needed improvements in their diagnostic capabilities. These "improvable" data sets were derived mostly from false-negative results for the N9 gene at relatively low concentrations. The false-negative rate was 5.6%, and the false-positive rate was 0.6%. In addition, we observed varied diagnostic capabilities between the different commercially available kits and the in-house-developed assays, with the assay manufactured by BioPerfectus Technologies (Jiangsu, China) performing better than the others. Overall, the majority of laboratories have reliable diagnostic capacities for the detection of H7N9 virus.

  17. Understanding environmental DNA detection probabilities: A case study using a stream-dwelling char Salvelinus fontinalis

    USGS Publications Warehouse

    Wilcox, Taylor M; Mckelvey, Kevin S.; Young, Michael K.; Sepulveda, Adam; Shepard, Bradley B.; Jane, Stephen F; Whiteley, Andrew R.; Lowe, Winsor H.; Schwartz, Michael K.

    2016-01-01

    Environmental DNA sampling (eDNA) has emerged as a powerful tool for detecting aquatic animals. Previous research suggests that eDNA methods are substantially more sensitive than traditional sampling. However, the factors influencing eDNA detection and the resulting sampling costs are still not well understood. Here we use multiple experiments to derive independent estimates of eDNA production rates and downstream persistence from brook trout (Salvelinus fontinalis) in streams. We use these estimates to parameterize models comparing the false negative detection rates of eDNA sampling and traditional backpack electrofishing. We find that, using the protocols in this study, eDNA had reasonable detection probabilities at extremely low animal densities (e.g., probability of detection 0.18 at densities of one fish per stream kilometer) and very high detection probabilities at population-level densities (e.g., probability of detection > 0.99 at densities of ≥ 3 fish per 100 m). This is substantially more sensitive than traditional electrofishing for determining the presence of brook trout and may translate into important cost savings when animals are rare. Our findings are consistent with a growing body of literature showing that eDNA sampling is a powerful tool for the detection of aquatic species, particularly those that are rare and difficult to sample using traditional methods.
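
    A toy calculation (not the authors' model) shows how such detection probabilities translate into sampling effort: if each field replicate detects the target independently with probability p, the number of replicates needed for a desired cumulative detection probability follows directly. Treating the reported 0.18 as a per-sample probability is an assumption made only for illustration.

      import math

      def replicates_needed(p_per_sample, target=0.95):
          """Smallest k such that 1 - (1 - p)^k >= target."""
          return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_per_sample))

      # replicates_needed(0.18) -> 16 samples for 95% cumulative detection;
      # replicates_needed(0.99) -> 1 sample.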

  18. Modeling and Detection of Ice Particle Accretion in Aircraft Engine Compression Systems

    NASA Technical Reports Server (NTRS)

    May, Ryan D.; Simon, Donald L.; Guo, Ten-Huei

    2012-01-01

    The accretion of ice particles in the core of commercial aircraft engines has been an ongoing aviation safety challenge. While no accidents have resulted from this phenomenon to date, numerous engine power loss events ranging from uneventful recoveries to forced landings have been recorded. As a first step to enabling mitigation strategies during ice accretion, a detection scheme must be developed that is capable of being implemented on board modern engines. In this paper, a simple detection scheme is developed and tested using a realistic engine simulation with approximate ice accretion models based on data from a compressor design tool. These accretion models are implemented as modified Low Pressure Compressor maps and have the capability to shift engine performance based on a specified level of ice blockage. Based on results from this model, it is possible to detect the accretion of ice in the engine core by observing shifts in the typical sensed engine outputs. Results are presented in which, for a 0.1 percent false positive rate, a true positive detection rate of 98 percent is achieved.
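
    A toy residual-threshold sketch (not the NASA scheme itself) illustrates the idea of detecting accretion from shifts in the sensed outputs: pick a threshold from nominal-operation residuals to meet a target false-positive rate, then require the shift to persist for a few samples before declaring a detection. The persistence count and quantile rule are assumptions.

      import numpy as np

      def choose_threshold(nominal_residuals, target_fpr=0.001):
          """Residual magnitude exceeded by only target_fpr of nominal data."""
          return np.quantile(np.abs(nominal_residuals), 1.0 - target_fpr)

      def detect_shift(residuals, threshold, persist=3):
          """Return the index of the first run of `persist` exceedances, else None."""
          run = 0
          for i, exceeded in enumerate(np.abs(residuals) > threshold):
              run = run + 1 if exceeded else 0
              if run >= persist:
                  return i
          return None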

  19. A New Intrusion Detection Method Based on Antibody Concentration

    NASA Astrophysics Data System (ADS)

    Zeng, Jie; Li, Tao; Li, Guiyang; Li, Haibo

    Antibody is one kind of protein that fights against the harmful antigen in human immune system. In modern medical examination, the health status of a human body can be diagnosed by detecting the intrusion intensity of a specific antigen and the concentration indicator of corresponding antibody from human body’s serum. In this paper, inspired by the principle of antigen-antibody reactions, we present a New Intrusion Detection Method Based on Antibody Concentration (NIDMBAC) to reduce false alarm rate without affecting detection rate. In our proposed method, the basic definitions of self, nonself, antigen and detector in the intrusion detection domain are given. Then, according to the antigen intrusion intensity, the change of antibody number is recorded from the process of clone proliferation for detectors based on the antigen classified recognition. Finally, building upon the above works, a probabilistic calculation method for the intrusion alarm production, which is based on the correlation between the antigen intrusion intensity and the antibody concentration, is proposed. Our theoretical analysis and experimental results show that our proposed method has a better performance than traditional methods.

  20. Spatially offset Raman spectroscopy for explosives detection through difficult (opaque) containers

    NASA Astrophysics Data System (ADS)

    Maskall, Guy T.; Bonthron, Stuart; Crawford, David

    2013-10-01

    With the continuing threat to aviation security from homemade explosive devices, the restrictions on taking a volume of liquid greater than 100 ml onto an aircraft remain in place. From January 2014, these restrictions will gradually be reduced via a phased implementation of technological screening of Liquids, Aerosols and Gels (LAGs). Raman spectroscopy offers a highly sensitive, and specific, technique for the detection and identification of chemicals. Spatially Offset Raman Spectroscopy (SORS), in particular, offers significant advantages over conventional Raman spectroscopy for detecting and recognizing contents within optically challenging (Raman active) containers. Containers vary enormously in their composition; glass type, plastic type, thickness, reflectance, and pigmentation are all variable and cause an infinite range of absorbances, fluorescence backgrounds, Rayleigh backscattered laser light, and container Raman bands. In this paper we show that the data processing chain for Cobalt Light Systems' INSIGHT100 bottle scanner is robust to such variability. We discuss issues of model selection for the detection stage and demonstrate an overall detection rate across a wide range of threats and containers of 97% with an associated false alarm rate of 0.1% or lower.

  1. Early detection of lung cancer recurrence after stereotactic ablative radiation therapy: radiomics system design

    NASA Astrophysics Data System (ADS)

    Dammak, Salma; Palma, David; Mattonen, Sarah; Senan, Suresh; Ward, Aaron D.

    2018-02-01

    Stereotactic ablative radiotherapy (SABR) is the standard treatment recommendation for Stage I non-small cell lung cancer (NSCLC) patients who are inoperable or who refuse surgery. This option is well tolerated by even unfit patients and has a low recurrence risk post-treatment. However, SABR induces changes in the lung parenchyma that can appear similar to those of recurrence, and the difference between the two at an early follow-up time point is not easily distinguishable for an expert physician. We hypothesized that a radiomics signature derived from standard-of-care computed tomography (CT) imaging can detect cancer recurrence within six months of SABR treatment. This study reports on the design phase of our work, with external validation planned in future work. In this study, we performed cross-validation experiments with four feature selection approaches and seven classifiers on an 81-patient data set. We extracted 104 radiomics features from the consolidative and the peri-consolidative regions on the follow-up CT scans. The best results were achieved using the sum of estimated Mahalanobis distances (Maha) for supervised forward feature selection and a trainable automatic radial basis support vector classifier (RBSVC). This system produced an area under the receiver operating characteristic curve (AUC) of 0.84, an error rate of 16.4%, a false negative rate of 12.7%, and a false positive rate of 20.0% for leave-one-patient-out cross-validation. This suggests that once validated on an external data set, radiomics could reliably detect post-SABR recurrence and form the basis of a tool assisting physicians in making salvage treatment decisions.
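
    A hedged sketch of the design-phase pipeline follows: greedy forward selection scored by a Mahalanobis distance between the two classes (a simple stand-in for the paper's sum-of-estimated-Mahalanobis-distances criterion) feeding an RBF-kernel SVM evaluated with leave-one-out cross-validation. The feature count, covariance regularization, and accuracy scoring are illustrative assumptions rather than the authors' settings.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import LeaveOneOut, cross_val_score

      def class_separation(X, y):
          """Mahalanobis-style distance between class means with a pooled covariance."""
          X0, X1 = X[y == 0], X[y == 1]
          pooled = np.atleast_2d(np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False))
          pooled = pooled + 1e-6 * np.eye(X.shape[1])   # small ridge for stability
          diff = X0.mean(0) - X1.mean(0)
          return float(diff @ np.linalg.solve(pooled, diff))

      def forward_select(X, y, n_keep=5):
          """Greedy forward feature selection maximizing class separation."""
          chosen, remaining = [], list(range(X.shape[1]))
          while remaining and len(chosen) < n_keep:
              best = max(remaining, key=lambda j: class_separation(X[:, chosen + [j]], y))
              chosen.append(best)
              remaining.remove(best)
          return chosen

      # Hypothetical usage on a radiomics feature matrix X and recurrence labels y:
      # feats = forward_select(X, y)
      # loo_accuracy = cross_val_score(SVC(kernel="rbf", gamma="scale"),
      #                                X[:, feats], y, cv=LeaveOneOut()).mean()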

  2. Real-time distributed fiber optic sensor for security systems: Performance, event classification and nuisance mitigation

    NASA Astrophysics Data System (ADS)

    Mahmoud, Seedahmed S.; Visagathilagar, Yuvaraja; Katsifolis, Jim

    2012-09-01

    The success of any perimeter intrusion detection system depends on three important performance parameters: the probability of detection (POD), the nuisance alarm rate (NAR), and the false alarm rate (FAR). The most fundamental parameter, POD, is normally related to a number of factors such as the event of interest, the sensitivity of the sensor, the installation quality of the system, and the reliability of the sensing equipment. The suppression of nuisance alarms without degrading sensitivity in fiber optic intrusion detection systems is key to maintaining acceptable performance. Signal processing algorithms that maintain the POD and eliminate nuisance alarms are crucial for achieving this. In this paper, a robust event classification system using supervised neural networks together with a level crossings (LCs) based feature extraction algorithm is presented for the detection and recognition of intrusion and non-intrusion events in a fence-based fiber-optic intrusion detection system. A level crossings algorithm is also used with a dynamic threshold to suppress torrential rain-induced nuisance alarms in a fence system. Results show that rain-induced nuisance alarms can be suppressed for rainfall rates in excess of 100 mm/hr with the simultaneous detection of intrusion events. The use of a level crossing based detection and novel classification algorithm is also presented for a buried pipeline fiber optic intrusion detection system for the suppression of nuisance events and discrimination of intrusion events. The sensor employed for both types of systems is a distributed bidirectional fiber-optic Mach-Zehnder (MZ) interferometer.
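
    The level-crossings feature itself is simple to illustrate: count, per analysis window, how many times the sensor signal crosses each of a set of amplitude levels, and feed those counts to the supervised classifier. The specific levels and fixed windowing shown here are assumptions, and the dynamic-threshold rain suppression is not shown.

      import numpy as np

      def level_crossing_features(window, levels):
          """Number of crossings of each amplitude level within one signal window."""
          feats = []
          for lv in levels:
              above = window > lv
              feats.append(int(np.count_nonzero(above[1:] != above[:-1])))
          return np.array(feats)

      # e.g. level_crossing_features(signal_window, levels=np.linspace(-1.0, 1.0, 8)),
      # with the resulting counts fed to a small supervised neural network.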

  3. GISentinel: a software platform for automatic ulcer detection on capsule endoscopy videos

    NASA Astrophysics Data System (ADS)

    Yi, Steven; Jiao, Heng; Meng, Fan; Leighton, Jonathon A.; Shabana, Pasha; Rentz, Lauri

    2014-03-01

    In this paper, we present a novel and clinically valuable software platform for automatic ulcer detection in the gastrointestinal (GI) tract from Capsule Endoscopy (CE) videos. Typical CE videos take about 8 hours and have to be reviewed manually by physicians to detect and locate diseases such as ulcers and bleeding. The process is time consuming, and the lengthy manual review makes missed findings likely. Working with our collaborators, we focused on developing a software platform called GISentinel, which can fully automate GI tract ulcer detection and classification. This software includes 3 parts: frequency-based Log-Gabor filter region of interest (ROI) extraction; a unique feature selection and validation method (e.g. illumination-invariant features, color-independent features, and symmetrical texture features); and cascade SVM classification for handling "ulcer vs. non-ulcer" cases. In the experiments, this software gave decent results. Frame-wise, the ulcer detection rate is 69.65% (319/458). Instance-wise, the ulcer detection rate is 82.35% (28/34). The false alarm rate is 16.43% (34/207). This work is part of our innovative 2D/3D-based GI tract disease detection software platform. The final goal of this software is to intelligently find and classify major GI tract diseases, such as bleeding, ulcers, and polyps, from CE videos. This paper mainly describes the automatic ulcer detection functional module.

  4. [Applying competitive polymerase chain reaction to the detection of hepatitis B virus DNA].

    PubMed

    Wang, Ling; Yang, Peng; Li, Shuang-qing; Xu, Shu-hui; Cao, Gui-qun; Zhang, Fa-qiang; Zhang, Mei-xia; Chen, Qing-ying; Xia, Qing-jie; Liu, Kai; Tang, Fang; Zhang, Yuan-zheng

    2004-11-01

    To reduce the rate of accidental false negative results in the HBV DNA PCR test on clinical serum samples, a competitive polymerase chain reaction (C-PCR) was used to decrease the false negative ratio. In the C-PCR, a constructed inner control DNA was added for co-amplification with the HBV target DNA. In a 20 microl C-PCR system, about 60 to 200 copies of inner control DNA could give an apparent co-amplification signal band after electrophoresis on a 2% agarose gel. Five of 120 samples of clinical serum (4.2%) could not be amplified. C-PCR has the advantage of yielding information on false negatives in the HBV DNA PCR assay of clinical serum samples.

  5. Large-scale benchmarking reveals false discoveries and count transformation sensitivity in 16S rRNA gene amplicon data analysis methods used in microbiome studies.

    PubMed

    Thorsen, Jonathan; Brejnrod, Asker; Mortensen, Martin; Rasmussen, Morten A; Stokholm, Jakob; Al-Soud, Waleed Abu; Sørensen, Søren; Bisgaard, Hans; Waage, Johannes

    2016-11-25

    There is an immense scientific interest in the human microbiome and its effects on human physiology, health, and disease. A common approach for examining bacterial communities is high-throughput sequencing of 16S rRNA gene hypervariable regions, aggregating sequence-similar amplicons into operational taxonomic units (OTUs). Strategies for detecting differential relative abundance of OTUs between sample conditions include classical statistical approaches as well as a plethora of newer methods, many borrowing from the related field of RNA-seq analysis. This effort is complicated by unique data characteristics, including sparsity, sequencing depth variation, and nonconformity of read counts to theoretical distributions, which is often exacerbated by exploratory and/or unbalanced study designs. Here, we assess the robustness of available methods for (1) inference in differential relative abundance analysis and (2) beta-diversity-based sample separation, using a rigorous benchmarking framework based on large clinical 16S microbiome datasets from different sources. Running more than 380,000 full differential relative abundance tests on real datasets with permuted case/control assignments and in silico-spiked OTUs, we identify large differences in method performance on a range of parameters, including false positive rates, sensitivity to sparsity and case/control balances, and spike-in retrieval rate. In large datasets, methods with the highest false positive rates also tend to have the best detection power. For beta-diversity-based sample separation, we show that library size normalization has very little effect and that the distance metric is the most important factor in terms of separation power. Our results, generalizable to datasets from different sequencing platforms, demonstrate how the choice of method considerably affects analysis outcome. Here, we give recommendations for tools that exhibit low false positive rates, have good retrieval power across effect sizes and case/control proportions, and have low sparsity bias. Result output from some commonly used methods should be interpreted with caution. We provide an easily extensible framework for benchmarking of new methods and future microbiome datasets.

  6. Chimeric 16S rRNA sequence formation and detection in Sanger and 454-pyrosequenced PCR amplicons

    PubMed Central

    Haas, Brian J.; Gevers, Dirk; Earl, Ashlee M.; Feldgarden, Mike; Ward, Doyle V.; Giannoukos, Georgia; Ciulla, Dawn; Tabbaa, Diana; Highlander, Sarah K.; Sodergren, Erica; Methé, Barbara; DeSantis, Todd Z.; Petrosino, Joseph F.; Knight, Rob; Birren, Bruce W.

    2011-01-01

    Bacterial diversity among environmental samples is commonly assessed with PCR-amplified 16S rRNA gene (16S) sequences. Perceived diversity, however, can be influenced by sample preparation, primer selection, and formation of chimeric 16S amplification products. Chimeras are hybrid products between multiple parent sequences that can be falsely interpreted as novel organisms, thus inflating apparent diversity. We developed a new chimera detection tool called Chimera Slayer (CS). CS detects chimeras with greater sensitivity than previous methods, performs well on short sequences such as those produced by the 454 Life Sciences (Roche) Genome Sequencer, and can scale to large data sets. By benchmarking CS performance against sequences derived from a controlled DNA mixture of known organisms and a simulated chimera set, we provide insights into the factors that affect chimera formation such as sequence abundance, the extent of similarity between 16S genes, and PCR conditions. Chimeras were found to reproducibly form among independent amplifications and contributed to false perceptions of sample diversity and the false identification of novel taxa, with less-abundant species exhibiting chimera rates exceeding 70%. Shotgun metagenomic sequences of our mock community appear to be devoid of 16S chimeras, supporting a role for shotgun metagenomics in validating novel organisms discovered in targeted sequence surveys. PMID:21212162

  7. Aircraft Detection in High-Resolution SAR Images Based on a Gradient Textural Saliency Map.

    PubMed

    Tan, Yihua; Li, Qingyun; Li, Yansheng; Tian, Jinwen

    2015-09-11

    This paper proposes a new automatic and adaptive aircraft target detection algorithm for high-resolution synthetic aperture radar (SAR) images of airports. The proposed method is based on a gradient textural saliency map under the contextual cues of the apron area. Firstly, candidate regions where aircraft may exist are detected within the apron area. Secondly, a directional local gradient distribution detector is used to obtain a gradient textural saliency map in favor of the candidate regions. Finally, the targets are detected by segmenting the saliency map using a CFAR-type algorithm. Real high-resolution airborne SAR image data are used to verify the proposed algorithm. The results demonstrate that this algorithm can detect aircraft targets quickly and accurately, and decrease the false alarm rate.
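
    The CFAR-type segmentation step can be illustrated generically; the one-dimensional cell-averaging CFAR below shows only the principle (the paper applies a CFAR-type rule to a 2-D saliency map), and the training/guard cell counts and scale factor are illustrative.

      import numpy as np

      def ca_cfar(x, n_train=16, n_guard=4, scale=3.0):
          """Flag cells exceeding `scale` times the mean of surrounding training cells."""
          half = n_train // 2 + n_guard
          det = np.zeros(len(x), dtype=bool)
          for i in range(half, len(x) - half):
              left = x[i - half:i - n_guard]             # training cells left of the guard band
              right = x[i + n_guard + 1:i + half + 1]    # training cells right of the guard band
              det[i] = x[i] > scale * np.concatenate([left, right]).mean()
          return det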

  8. Unsupervised Ensemble Anomaly Detection Using Time-Periodic Packet Sampling

    NASA Astrophysics Data System (ADS)

    Uchida, Masato; Nawata, Shuichi; Gu, Yu; Tsuru, Masato; Oie, Yuji

    We propose an anomaly detection method for finding patterns in network traffic that do not conform to legitimate (i.e., normal) behavior. The proposed method trains a baseline model describing the normal behavior of network traffic without using manually labeled traffic data. The trained baseline model is used as the basis for comparison with the audit network traffic. This anomaly detection works in an unsupervised manner through the use of time-periodic packet sampling, which is used in a manner that differs from its intended purpose — the lossy nature of packet sampling is used to extract normal packets from the unlabeled original traffic data. Evaluation using actual traffic traces showed that the proposed method has false positive and false negative rates in the detection of anomalies regarding TCP SYN packets comparable to those of a conventional method that uses manually labeled traffic data to train the baseline model. Performance variation due to the probabilistic nature of sampled traffic data is mitigated by using ensemble anomaly detection that collectively exploits multiple baseline models in parallel. Alarm sensitivity is adjusted for the intended use by using maximum- and minimum-based anomaly detection that effectively take advantage of the performance variations among the multiple baseline models. Testing using actual traffic traces showed that the proposed anomaly detection method performs as well as one using manually labeled traffic data and better than one using randomly sampled (unlabeled) traffic data.

  9. Evaluation of information-theoretic similarity measures for content-based retrieval and detection of masses in mammograms.

    PubMed

    Tourassi, Georgia D; Harrawood, Brian; Singh, Swatee; Lo, Joseph Y; Floyd, Carey E

    2007-01-01

    The purpose of this study was to evaluate image similarity measures employed in an information-theoretic computer-assisted detection (IT-CAD) scheme. The scheme was developed for content-based retrieval and detection of masses in screening mammograms. The study is aimed toward an interactive clinical paradigm where physicians query the proposed IT-CAD scheme on mammographic locations that are either visually suspicious or indicated as suspicious by other cuing CAD systems. The IT-CAD scheme provides an evidence-based, second opinion for query mammographic locations using a knowledge database of mass and normal cases. In this study, eight entropy-based similarity measures were compared with respect to retrieval precision and detection accuracy using a database of 1820 mammographic regions of interest. The IT-CAD scheme was then validated on a separate database for false positive reduction of progressively more challenging visual cues generated by an existing, in-house mass detection system. The study showed that the image similarity measures fall into one of two categories; one category is better suited to the retrieval of semantically similar cases while the second is more effective with knowledge-based decisions regarding the presence of a true mass in the query location. In addition, the IT-CAD scheme yielded a substantial reduction in false-positive detections while maintaining high detection rate for malignant masses.
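
    Of the eight entropy-based measures compared, mutual information is the easiest to sketch; the histogram-based estimate below between a query ROI and a stored ROI is only one generic example, and the bin count is an arbitrary choice.

      import numpy as np

      def mutual_information(roi_a, roi_b, bins=32):
          """Mutual information (in nats) estimated from the joint intensity histogram."""
          joint, _, _ = np.histogram2d(roi_a.ravel(), roi_b.ravel(), bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))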

  10. Evaluation of information-theoretic similarity measures for content-based retrieval and detection of masses in mammograms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tourassi, Georgia D.; Harrawood, Brian; Singh, Swatee

    The purpose of this study was to evaluate image similarity measures employed in an information-theoretic computer-assisted detection (IT-CAD) scheme. The scheme was developed for content-based retrieval and detection of masses in screening mammograms. The study is aimed toward an interactive clinical paradigm where physicians query the proposed IT-CAD scheme on mammographic locations that are either visually suspicious or indicated as suspicious by other cuing CAD systems. The IT-CAD scheme provides an evidence-based, second opinion for query mammographic locations using a knowledge database of mass and normal cases. In this study, eight entropy-based similarity measures were compared with respect to retrieval precision and detection accuracy using a database of 1820 mammographic regions of interest. The IT-CAD scheme was then validated on a separate database for false positive reduction of progressively more challenging visual cues generated by an existing, in-house mass detection system. The study showed that the image similarity measures fall into one of two categories; one category is better suited to the retrieval of semantically similar cases while the second is more effective with knowledge-based decisions regarding the presence of a true mass in the query location. In addition, the IT-CAD scheme yielded a substantial reduction in false-positive detections while maintaining high detection rate for malignant masses.

  11. Architecture for an artificial immune system.

    PubMed

    Hofmeyr, S A; Forrest, S

    2000-01-01

    An artificial immune system (ARTIS) is described which incorporates many properties of natural immune systems, including diversity, distributed computation, error tolerance, dynamic learning and adaptation, and self-monitoring. ARTIS is a general framework for a distributed adaptive system and could, in principle, be applied to many domains. In this paper, ARTIS is applied to computer security in the form of a network intrusion detection system called LISYS. LISYS is described and shown to be effective at detecting intrusions, while maintaining low false positive rates. Finally, similarities and differences between ARTIS and Holland's classifier systems are discussed.

  12. Proceedings of Symposium on Analysis and Detection of Explosives (3rd) Held in Mannheim-Neuostheim (Germany, F.R.) on 10-13 July 1989

    DTIC Science & Technology

    1989-07-13

    tetrabutylammonium hydroxide (TBAOH) regenerant. Because of the cost of TBAOH, we recently procured an autoregenerant accessory (in essence, an ion exchange system ... examples of each of the above), the Al Exfinder 150, Al Model 85 explosives detecting doorway (which was not used for searching bags) and the Jasmin ... The Jasmin Simtec Exdetex 2 (which is really designed for building search) proved to have a very high sensitivity and a very low false alarm rate but ...

  13. Abort Trigger False Positive and False Negative Analysis Methodology for Threshold-Based Abort Detection

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon

    2015-01-01

    This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS) which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as being an extension of control theory; and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.

  14. Validation of an Arab names algorithm in the determination of Arab ancestry for use in health research

    PubMed Central

    El-Sayed, Abdulrahman M.; Lauderdale, Diane S.; Galea, Sandro

    2010-01-01

    Objective Data about Arab-Americans, a growing ethnic minority, is not routinely collected in vital statistics, registry, or administrative data in the US. The difficulty in identifying Arab-Americans using publicly available data sources is a barrier to health research about this group. Here, we validate an empirically-based, probabilistic Arab name algorithm (ANA) for identifying Arab-Americans in health research. Design We used data from all Michigan birth certificates between 2000-2005. Fathers’ surnames and mothers’ maiden names were coded as Arab or non-Arab according to the ANA. We calculated sensitivity, specificity, and positive (PPV) and negative predictive values (NPV) of Arab ethnicity inferred using the ANA as compared to self-reported Arab ancestry. Results State-wide, the ANA had a specificity of 98.9%, a sensitivity of 50.3%, a PPV of 57.0%, and a NPV of 98.6%. Both the false positive and false negative rates were higher among men than among women. As the concentration of Arab-Americans in a study locality increased, the ANA false positive rate increased and false-negative rate decreased. Conclusion The ANA is highly specific but only moderately sensitive as a means of detecting Arab ancestry. Future research should compare health characteristics among Arab-American populations defined by Arab ancestry and those defined by the ANA. PMID:20845117

  15. False-positive IgM for CMV in pregnant women with autoimmune disease: a novel prognostic factor for poor pregnancy outcome.

    PubMed

    De Carolis, S; Santucci, S; Botta, A; Garofalo, S; Martino, C; Perrelli, A; Salvi, S; Degennaro, Va; de Belvis, Ag; Ferrazzani, S; Scambia, G

    2010-06-01

    Our aims were to assess the frequency of false-positive IgM antibodies for cytomegalovirus in pregnant women with autoimmune diseases and in healthy women (controls) and to determine their relationship with pregnancy outcome. Data from 133 pregnancies in 118 patients with autoimmune diseases and from 222 pregnancies in 198 controls were assessed. When positive IgM for cytomegalovirus was detected, IgG avidity, cytomegalovirus isolation and polymerase chain reaction for CMV-DNA in maternal urine and amniotic fluid samples were performed in order to identify primary infection or false positivity. A statistically significantly higher rate of false-positive IgM was found in pregnancies with autoimmune diseases (16.5%) in comparison with controls (0.9%). A worse pregnancy outcome was observed among patients with autoimmune disease and false cytomegalovirus IgM in comparison with those without false positivity: earlier week of delivery (p = 0.017), lower neonatal birth weight (p = 0.0004) and neonatal birth weight percentile (p = 0.002), higher rate of intrauterine growth restriction (p = 0.02) and babies weighing less than 2000 g (p = 0.025) were encountered. The presence of false cytomegalovirus IgM in patients with autoimmune diseases could be used as a novel prognostic index of poor pregnancy outcome: it may reflect a non-specific activation of the immune system that could negatively affect pregnancy outcome. Lupus (2010) 19, 844-849.

  16. [Application of digital 3D technique combined with nanocarbon-aided navigation in endoscopic sentinel lymph node biopsy for breast cancer].

    PubMed

    Zhang, Pu-Sheng; Luo, Yun-Feng; Yu, Jin-Long; Fang, Chi-Hua; Shi, Fu-Jun; Deng, Jian-Wen

    2016-08-20

    To study the clinical value of digital 3D technique combined with nanocarbon-aided navigation in endoscopic sentinel lymph node biopsy for breast cancer. Thirty-nine female patients with stage I/II breast cancer admitted to our hospital between September 2014 and September 2015 were recruited. CT lymphography data of the patients were segmented to reconstruct digital 3D models, which were imported into FreeForm Modeling Surgical System Platform for visual simulation surgery before operation. Endoscopic sentinel lymph node biopsy and endoscopic axillary lymph node dissection were then carried out, and the accuracy and clinical value of digital 3D technique in endoscopic sentinel lymph node biopsy were analyzed. The 3D models faithfully represented the surgical anatomy of the patients and clearly displayed the 3D relationship among the sentinel lymph nodes, axillary lymph nodes, axillary vein, pectoralis major, pectoralis minor muscle and latissimus dorsi. In the biopsy, the detection rate of sentinel lymph nodes was 100% in the patients with a coincidence rate of 87.18% (34/39), a sensitivity of 91.67% (11/12), and a false negative rate of 8.33% (1/12). Complications such as limb pain, swelling, wound infection, and subcutaneous seroma were not found in these patients 6 months after the operation. Endoscopic sentinel lymph node biopsy assisted by digital 3D technique and nanocarbon-aided navigation allows a high detection rate of sentinel lymph nodes with a high sensitivity and a low false negative rate and can serve as a new method for sentinel lymph node biopsy for breast cancer.

  17. BinQuasi: a peak detection method for ChIP-sequencing data with biological replicates.

    PubMed

    Goren, Emily; Liu, Peng; Wang, Chao; Wang, Chong

    2018-04-19

    ChIP-seq experiments that are aimed at detecting DNA-protein interactions require biological replication to draw inferential conclusions, however there is no current consensus on how to analyze ChIP-seq data with biological replicates. Very few methodologies exist for the joint analysis of replicated ChIP-seq data, with approaches ranging from combining the results of analyzing replicates individually to joint modeling of all replicates. Combining the results of individual replicates analyzed separately can lead to reduced peak classification performance compared to joint modeling. Currently available methods for joint analysis may fail to control the false discovery rate at the nominal level. We propose BinQuasi, a peak caller for replicated ChIP-seq data, that jointly models biological replicates using a generalized linear model framework and employs a one-sided quasi-likelihood ratio test to detect peaks. When applied to simulated data and real datasets, BinQuasi performs favorably compared to existing methods, including better control of false discovery rate than existing joint modeling approaches. BinQuasi offers a flexible approach to joint modeling of replicated ChIP-seq data which is preferable to combining the results of replicates analyzed individually. Source code is freely available for download at https://cran.r-project.org/package=BinQuasi, implemented in R. pliu@iastate.edu or egoren@iastate.edu. Supplementary material is available at Bioinformatics online.
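
    For readers unfamiliar with the nominal-level FDR control discussed above, the standard Benjamini-Hochberg step-up rule applied to a vector of peak p-values looks like the sketch below; it is a generic reference, not part of BinQuasi.

      import numpy as np

      def benjamini_hochberg(pvals, alpha=0.05):
          """Return a boolean mask of rejected hypotheses at FDR level alpha."""
          p = np.asarray(pvals, float)
          order = np.argsort(p)
          m = len(p)
          passed = p[order] <= alpha * np.arange(1, m + 1) / m
          k = int(np.max(np.nonzero(passed)[0]) + 1) if passed.any() else 0
          rejected = np.zeros(m, dtype=bool)
          rejected[order[:k]] = True
          return rejected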

  18. Human Factors Evaluation of Conflict Detection Tool for Terminal Area

    NASA Technical Reports Server (NTRS)

    Verma, Savita Arora; Tang, Huabin; Ballinger, Deborah; Chinn, Fay Cherie; Kozon, Thomas E.

    2013-01-01

    A conflict detection and resolution tool, Terminal-area Tactical Separation-Assured Flight Environment (T-TSAFE), is being developed to improve the timeliness and accuracy of alerts and reduce the false alert rate observed with the currently deployed technology. The legacy system in use today, Conflict Alert, relies primarily on a dead reckoning algorithm, whereas T-TSAFE uses intent information to augment dead reckoning. In previous experiments, T-TSAFE was found to reduce the rate of false alerts and increase the time between the alert to the controller and a loss of separation over the legacy system. In the present study, T-TSAFE was tested under two meteorological conditions: 1) all aircraft operated under instrument flight rules, and 2) some aircraft operated under mixed operating conditions. The tool was used to visually alert controllers to predicted losses of separation throughout the terminal airspace, and to show compression errors on final approach. The performance of T-TSAFE on final approach was compared with Automated Terminal Proximity Alert (ATPA), a tool recently deployed by the FAA. Results show that controllers did not report differences in workload or situational awareness between the T-TSAFE and ATPA cones but did prefer T-TSAFE features over ATPA functionality. T-TSAFE will provide one tool that shows alerts in the data blocks and compression errors via cones on the final approach, implementing all tactical conflict detection and alerting via one tool in TRACON airspace.

  19. pKWmEB: integration of Kruskal-Wallis test with empirical Bayes under polygenic background control for multi-locus genome-wide association study.

    PubMed

    Ren, Wen-Long; Wen, Yang-Jun; Dunwell, Jim M; Zhang, Yuan-Ming

    2018-03-01

    Although nonparametric methods in genome-wide association studies (GWAS) are robust in quantitative trait nucleotide (QTN) detection, the absence of polygenic background control in single-marker association in genome-wide scans results in a high false positive rate. To overcome this issue, we proposed an integrated nonparametric method for multi-locus GWAS. First, a new model transformation was used to whiten the covariance matrix of polygenic matrix K and environmental noise. Using the transferred model, Kruskal-Wallis test along with least angle regression was then used to select all the markers that were potentially associated with the trait. Finally, all the selected markers were placed into multi-locus model, these effects were estimated by empirical Bayes, and all the nonzero effects were further identified by a likelihood ratio test for true QTN detection. This method, named pKWmEB, was validated by a series of Monte Carlo simulation studies. As a result, pKWmEB effectively controlled false positive rate, although a less stringent significance criterion was adopted. More importantly, pKWmEB retained the high power of Kruskal-Wallis test, and provided QTN effect estimates. To further validate pKWmEB, we re-analyzed four flowering time related traits in Arabidopsis thaliana, and detected some previously reported genes that were not identified by the other methods.
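
    Only the single-marker Kruskal-Wallis scan in the pipeline above is simple enough to sketch here; the whitening transformation, least angle regression step, empirical Bayes shrinkage, and likelihood ratio test are omitted, and the 0/1/2 genotype coding is an assumption.

      import numpy as np
      from scipy.stats import kruskal

      def kw_scan(genotypes, phenotype):
          """One Kruskal-Wallis p-value per marker (columns of `genotypes`)."""
          pvals = []
          for g in genotypes.T:
              groups = [phenotype[g == k] for k in np.unique(g)]
              groups = [grp for grp in groups if len(grp) > 0]
              if len(groups) < 2:
                  pvals.append(1.0)
                  continue
              try:
                  pvals.append(kruskal(*groups).pvalue)
              except ValueError:      # e.g. all phenotype values identical
                  pvals.append(1.0)
          return np.array(pvals)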

  20. Multispectra CWT-based algorithm (MCWT) in mass spectra for peak extraction.

    PubMed

    Hsueh, Huey-Miin; Kuo, Hsun-Chih; Tsai, Chen-An

    2008-01-01

    An important objective in mass spectrometry (MS) is to identify a set of biomarkers that can be used to distinguish between patients under distinct treatments (or conditions) from tens or hundreds of spectra. A common two-step approach involving peak extraction and quantification is employed to identify the features of scientific interest. The selected features are then used for further investigation to understand the underlying biological mechanism of individual proteins or to develop genomic biomarkers for early diagnosis. However, the use of inadequate or ineffective peak detection and peak alignment algorithms in the peak extraction step may lead to a high rate of false positives. It is therefore crucial to reduce the false positive rate when detecting biomarkers from tens or hundreds of spectra. Here a new procedure is introduced for feature extraction in mass spectrometry data that extends the continuous wavelet transform-based (CWT-based) algorithm to multiple spectra. The proposed multispectra CWT-based algorithm (MCWT) not only can perform peak detection for multiple spectra but also carry out peak alignment at the same time. The authors' MCWT algorithm constructs a reference, which integrates information from multiple raw spectra, for feature extraction. The algorithm is applied to a SELDI-TOF mass spectra data set provided by CAMDA 2006 with known polypeptide m/z positions. This new approach is easy to implement and it outperforms the existing peak extraction method from the Bioconductor PROcess package.
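
    A crude analogue of the MCWT idea (not the authors' algorithm) can be sketched with SciPy: combine spectra sharing a common m/z axis into a reference, run CWT-based peak detection once on that reference, and read per-spectrum intensities at the shared peak positions so detection and alignment happen together. The width range is an arbitrary choice.

      import numpy as np
      from scipy.signal import find_peaks_cwt

      def mcwt_like_peaks(spectra, widths=np.arange(1, 30)):
          """spectra: (n_spectra, n_mz) array on a common m/z grid."""
          reference = np.mean(spectra, axis=0)          # pool information across spectra
          peak_idx = find_peaks_cwt(reference, widths)  # CWT-based peak detection
          return peak_idx, spectra[:, peak_idx]         # aligned intensities per spectrum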
