Sample records for anomaly detection techniques

  1. Statistical Techniques For Real-time Anomaly Detection Using Spark Over Multi-source VMware Performance Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solaimani, Mohiuddin; Iftekhar, Mohammed; Khan, Latifur

    Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint with higher accuracy by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
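    The window-based statistical detection described above can be sketched, outside Spark, as a sliding-window z-score test; the window length and deviation threshold below are illustrative assumptions, not the paper's parameters:

```python
# A minimal sketch of window-based statistical detection on a stream,
# assuming a simple z-score rule; this is not the paper's Spark implementation.
from collections import deque
from statistics import mean, stdev

class WindowAnomalyDetector:
    """Flag a point that deviates more than `k` standard deviations
    from the mean of the preceding window."""

    def __init__(self, window=20, k=3.0):
        self.window = deque(maxlen=window)
        self.k = k

    def observe(self, x):
        anomalous = False
        if len(self.window) >= 2:
            m, s = mean(self.window), stdev(self.window)
            if s > 0 and abs(x - m) > self.k * s:
                anomalous = True
        self.window.append(x)
        return anomalous

detector = WindowAnomalyDetector(window=20, k=3.0)
stream = [9.9 if i % 2 else 10.1 for i in range(50)] + [100.0] + [9.9, 10.1] * 5
flags = [detector.observe(x) for x in stream]  # only the spike at index 50 is flagged
```

    A cluster-based variant, as the abstract notes, would replace the single window statistic with per-cluster statistics so that continuous anomalies do not contaminate the baseline.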

  2. A Survey on Anomaly Based Host Intrusion Detection System

    NASA Astrophysics Data System (ADS)

    Jose, Shijoe; Malathi, D.; Reddy, Bharath; Jayaseeli, Dorathi

    2018-04-01

    An intrusion detection system (IDS) is hardware, software, or a combination of the two that monitors network or system activities for signs of malicious behavior. In computer security, designing a robust intrusion detection system is one of the most fundamental and important problems. The primary function of such a system is to detect intrusions and alert the user in a timely manner; when the IDS discovers an intrusion, it sends an alert message to the system administrator. Anomaly detection is an important problem that has been researched within diverse research areas and application domains. This survey tries to provide a structured and comprehensive overview of the research on anomaly detection. Each of the existing anomaly detection techniques has relative strengths and weaknesses. The current state of experimental practice in the field of anomaly-based intrusion detection is reviewed, and recent studies are surveyed. This survey provides a study of existing anomaly detection techniques and of how techniques used in one area can be applied in another application domain.

  3. Seismic data fusion anomaly detection

    NASA Astrophysics Data System (ADS)

    Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David

    2014-06-01

    Detecting anomalies in non-stationary signals has valuable applications in many fields, including medicine and meteorology, such as identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes via seismographic data. Given the many available anomaly detection algorithms, it is important to compare the possible methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach uses an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other method uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives", or transformations, of the observed signal for anomalies. Possible perspectives include wavelet de-noising, the Fourier transform, peak filtering, etc. In order to evaluate these techniques via signal fusion metrics, we apply signal preprocessing techniques such as de-noising to the original signal and then use a neural network to find anomalies in the generated signal. From this secondary result it is possible to use data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The results show which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.

  4. Automated Network Anomaly Detection with Learning, Control and Mitigation

    ERIC Educational Resources Information Center

    Ippoliti, Dennis

    2014-01-01

    Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…

  5. Evaluation of Anomaly Detection Method Based on Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Fontugne, Romain; Himura, Yosuke; Fukuda, Kensuke

    The number of threats on the Internet is rapidly increasing, and anomaly detection has become of increasing importance. High-speed backbone networks are particularly affected, but analyzing their traffic is a complicated task due to the amount of data, the lack of payload data, asymmetric routing, and the use of sampling techniques. Most anomaly detection schemes focus on the statistical properties of network traffic and highlight anomalous traffic through its singularities. In this paper, we concentrate on unusual traffic distributions, which are easily identifiable in temporal-spatial space (e.g., time/address or port). We present an anomaly detection method that uses a pattern recognition technique to identify anomalies in pictures representing traffic. The main advantage of this method is its ability to detect attacks involving mice flows. We evaluate the parameter set and the effectiveness of this approach by analyzing six years of Internet traffic collected from a trans-Pacific link. We show several examples of detected anomalies and compare our results with those of two other methods. The comparison indicates that the anomalies detected only by the pattern-recognition-based method are mainly malicious traffic with a few packets.

  6. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. Previous results on extending traditional anomaly detection techniques are summarized. The focus of this paper is a new technique for attention focusing.

  7. Modeling And Detecting Anomalies In Scada Systems

    NASA Astrophysics Data System (ADS)

    Svendsen, Nils; Wolthusen, Stephen

    The detection of attacks and intrusions based on anomalies is hampered by the limits of specificity underlying the detection techniques. However, in the case of many critical infrastructure systems, domain-specific knowledge and models can impose constraints that potentially reduce error rates. At the same time, attackers can use their knowledge of system behavior to mask their manipulations, causing adverse effects to be observed only after a significant period of time. This paper describes elementary statistical techniques that can be applied to detect anomalies in critical infrastructure networks. A SCADA system employed in liquefied natural gas (LNG) production is used as a case study.
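    One elementary statistical technique of the kind the paper refers to is the one-sided CUSUM drift detector, which accumulates small sustained deviations of a process variable from its nominal setpoint; the slack and threshold parameters below are our assumptions, not values from the paper:

```python
# Illustrative one-sided CUSUM drift detector; slack and threshold
# are hypothetical tuning parameters.
def cusum(samples, target, slack=0.5, threshold=5.0):
    """Return indices where the CUSUM statistic crosses `threshold`,
    signalling a sustained upward shift away from `target`."""
    s, alarms = 0.0, []
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - target - slack))  # accumulate positive drift
        if s > threshold:
            alarms.append(i)
            s = 0.0  # restart accumulation after an alarm
    return alarms

# A process value that holds at its setpoint of 10, then drifts upward by 2 units,
# as a masked manipulation of a SCADA sensor might.
data = [10.0] * 20 + [12.0] * 10
alarms = cusum(data, target=10.0)  # → [23, 27]
```

    Because the statistic accumulates over time, slow manipulations below the slack band are exactly the case the abstract warns about: their effects surface only after a significant period.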

  8. Robust and Accurate Anomaly Detection in ECG Artifacts Using Time Series Motif Discovery

    PubMed Central

    Sivaraks, Haemwaan

    2015-01-01

    Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats, which helps identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because they are not able to differentiate ECG artifacts from the real ECG signal, especially artifacts that are similar to ECG signals in terms of shape and/or frequency. The problem leads to high vigilance demands on physicians and misinterpretation risk for nonspecialists. Therefore, this work proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts and can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design. In addition, every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experimental results on real ECG datasets were interpreted and evaluated by cardiologists. Our proposed algorithm can mostly achieve 100% accuracy of detection (AoD), sensitivity, specificity, and positive predictive value with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts, compared with competing anomaly detection methods. PMID:25688284
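    The motif-discovery core can be illustrated with a brute-force "discord" search: the subsequence whose nearest non-overlapping neighbour lies farthest away is the most anomalous beat. This is generic time-series machinery under invented data, not the authors' cardiologist-guided algorithm:

```python
# Brute-force time-series discord search (O(n^2), illustration only).
def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def find_discord(series, m):
    """Return (start, score) of the length-m subsequence farthest from its
    nearest non-overlapping neighbour."""
    n = len(series) - m + 1
    best_idx, best_score = -1, -1.0
    for i in range(n):
        nearest = min(
            euclidean(series[i:i + m], series[j:j + m])
            for j in range(n) if abs(i - j) >= m  # skip trivial self-matches
        )
        if nearest > best_score:
            best_idx, best_score = i, nearest
    return best_idx, best_score

beat = [0.0, 1.0, 0.0, -1.0]                          # idealized repeating beat
series = beat * 5 + [0.0, 3.0, 0.0, -1.0] + beat * 5  # one corrupted cycle
idx, _ = find_discord(series, m=4)  # the discord window overlaps the corruption
```

    Every normal beat has an exact match elsewhere in the series, so only windows overlapping the corrupted sample receive a nonzero discord score.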

  9. Clustering and Recurring Anomaly Identification: Recurring Anomaly Detection System (ReADS)

    NASA Technical Reports Server (NTRS)

    McIntosh, Dawn

    2006-01-01

    This viewgraph presentation reviews the Recurring Anomaly Detection System (ReADS), a tool to analyze text reports such as aviation reports and maintenance records: (1) text clustering algorithms group large quantities of reports and documents, reducing human error and fatigue; (2) the system identifies interconnected reports, automating the discovery of possible recurring anomalies; (3) it provides a visualization of the clusters and recurring anomalies. The techniques are illustrated on data from Shuttle and ISS discrepancy reports, as well as ASRS data. ReADS has been integrated with a secure online search
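    The report-clustering step can be sketched as grouping word-count vectors by cosine similarity; the toy reports and threshold are invented for illustration, and ReADS itself uses a far more complete text-mining pipeline:

```python
# Toy single-pass clustering of text reports by cosine similarity of
# word-count vectors; threshold and data are hypothetical.
import math
from collections import Counter

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

def cluster(reports, threshold=0.5):
    clusters = []  # each cluster: list of (report_index, vector)
    for i, text in enumerate(reports):
        vec = Counter(text.lower().split())
        for members in clusters:
            if cosine(vec, members[0][1]) >= threshold:
                members.append((i, vec))  # join the first similar cluster
                break
        else:
            clusters.append([(i, vec)])  # otherwise start a new cluster
    return [[i for i, _ in members] for members in clusters]

reports = [
    "hydraulic leak detected in landing gear",
    "landing gear hydraulic leak observed",
    "radio interference during descent",
]
groups = cluster(reports)  # → [[0, 1], [2]]
```

    A cluster that keeps accumulating reports over time is the signature of a possible recurring anomaly.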

  10. Min-max hyperellipsoidal clustering for anomaly detection in network security.

    PubMed

    Sarasamma, Suseela T; Zhu, Qiuming A

    2006-08-01

    A novel hyperellipsoidal clustering technique is presented for an intrusion-detection system in network security. Hyperellipsoidal clusters that maximize intracluster similarity and minimize intercluster similarity are generated from training data sets. The novelty of the technique lies in the fact that the parameters needed to construct higher-order data models in general multivariate Gaussian functions are incrementally derived from the data sets using accretive processes. The technique is implemented in a feedforward neural network that uses a Gaussian radial basis function as the model generator. An evaluation based on the inclusiveness and exclusiveness of samples with respect to specific criteria is applied to accretively learn the output clusters of the neural network. One significant advantage of this technique is its ability to detect individual anomaly types that are hard to detect with other anomaly-detection schemes. Applying this technique, several feature subsets of the tcptrace network-connection records were identified that give above 95% detection at false-positive rates below 5%.
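    The ellipsoidal idea rests on the Mahalanobis distance: points far from a training cluster, measured relative to its covariance, fall outside the fitted hyperellipsoid. A minimal two-dimensional sketch of that distance test follows; it is our simplification, and the paper's accretive RBF-network construction is considerably more elaborate:

```python
# Squared Mahalanobis distance to a 2-D training cluster (pure Python).
def mean_cov_2d(points):
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    return (mx, my), ((sxx, sxy), (sxy, syy))

def mahalanobis_sq(p, mu, cov):
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = ((d / det, -b / det), (-c / det, a / det))  # 2x2 matrix inverse
    dx, dy = p[0] - mu[0], p[1] - mu[1]
    return dx * (inv[0][0] * dx + inv[0][1] * dy) + dy * (inv[1][0] * dx + inv[1][1] * dy)

train = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (1.1, 1.2), (0.9, 0.8)]
mu, cov = mean_cov_2d(train)
inlier = mahalanobis_sq((1.0, 1.0), mu, cov)   # at the cluster centre
outlier = mahalanobis_sq((5.0, 5.0), mu, cov)  # far outside the ellipsoid
```

    Thresholding this distance carves out an ellipsoidal acceptance region, the building block that the hyperellipsoidal clusters generalize.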

  11. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
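    The residual-monitoring idea can be sketched by comparing sensed outputs against a nominal model and flagging samples whose residual leaves a tolerance band; the single linear trim relation and the band width below are hypothetical stand-ins for the paper's piecewise-linear state-space engine model:

```python
# Residual monitoring against a (here trivially linear) nominal model.
def nominal_model(throttle):
    return 2.0 * throttle + 100.0  # assumed nominal trim relation

def residual_monitor(samples, band=5.0):
    """samples: (throttle, sensed_output) pairs; True marks an anomaly."""
    return [abs(sensed - nominal_model(u)) > band for u, sensed in samples]

healthy = [(10.0, 120.4), (20.0, 139.7), (30.0, 160.2)]
faulty = [(10.0, 120.3), (20.0, 152.0), (30.0, 160.1)]  # bias injected at one sample
flags = residual_monitor(healthy + faulty)  # → [False, False, False, False, True, False]
```

    The paper's central point survives even in this toy: detection quality is only as good as the nominal model, which is why updating the trim points improves performance.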

  12. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  13. A hybrid approach for efficient anomaly detection using metaheuristic methods

    PubMed Central

    Ghanem, Tamer F.; Elkilani, Wail S.; Abdul-kader, Hatem M.

    2014-01-01

    Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated with a multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative-selection-based detector generation. The evaluation of this approach is performed on the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1%, compared with competing machine learning algorithms. PMID:26199752
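    A toy sketch of the negative-selection principle the approach draws on: random candidate detectors are kept only if they match no "self" (normal) sample, so the surviving detectors cover non-self space. The paper's multi-start search and genetic-algorithm tuning are omitted, and the radii and value ranges here are invented:

```python
# Negative-selection detector generation (toy, 2-D, random search).
import random

def matches(detector, sample, radius=1.5):
    return sum((d - s) ** 2 for d, s in zip(detector, sample)) ** 0.5 < radius

def generate_detectors(self_set, count, dim, rng):
    detectors = []
    while len(detectors) < count:
        candidate = [rng.uniform(0.0, 10.0) for _ in range(dim)]
        if not any(matches(candidate, s) for s in self_set):  # negative selection
            detectors.append(candidate)
    return detectors

rng = random.Random(42)
self_set = [[2.0, 2.0], [2.5, 1.8], [1.8, 2.3]]  # region of normal behaviour
detectors = generate_detectors(self_set, count=50, dim=2, rng=rng)
# by construction, no retained detector matches any normal sample;
# a test sample matching some detector would be flagged as non-self
```

    Metaheuristics such as the multi-start method replace the pure random search above with guided restarts, improving how evenly the detectors tile the non-self space.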

  14. A hybrid approach for efficient anomaly detection using metaheuristic methods.

    PubMed

    Ghanem, Tamer F; Elkilani, Wail S; Abdul-Kader, Hatem M

    2015-07-01

    Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated with a multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative-selection-based detector generation. The evaluation of this approach is performed on the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1%, compared with competing machine learning algorithms.

  15. The Monitoring, Detection, Isolation and Assessment of Information Warfare Attacks Through Multi-Level, Multi-Scale System Modeling and Model Based Technology

    DTIC Science & Technology

    2004-01-01

    …login identity to the one under which the system call is executed, the parameters of the system call execution, file names including full path… Anomaly detection: COAST-EIMDT, distributed on target hosts; EMERALD, distributed on target hosts and security servers; signature recognition; anomaly… uses a centralized architecture, and employs an anomaly detection technique for intrusion detection. The EMERALD project [80] proposes a

  16. A lightweight network anomaly detection technique

    DOE PAGES

    Kim, Jinoh; Yoo, Wucherl; Sim, Alex; ...

    2017-03-13

    While network anomaly detection is essential in network operations and management, it becomes ever more challenging to perform the first line of detection against the exponentially increasing volume of network traffic. In this paper, we develop a technique for the first line of online anomaly detection with two important considerations: (i) availability of traffic attributes during the monitoring time, and (ii) computational scalability for streaming data. The presented learning technique is lightweight and highly scalable, using an approximation based on grid partitioning of the given dimensional space. With the public traffic traces of KDD Cup 1999 and NSL-KDD, we show that our technique yields 98.5% and 83% detection accuracy, respectively, with only a couple of readily available traffic attributes that can be obtained without post-processing. Finally, the results are at least comparable with those of classical learning methods, including decision tree and random forest, with approximately two orders of magnitude faster learning performance.
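    Grid partitioning of the attribute space can be sketched as follows: cells are populated from assumed-normal training traffic, and a test point that lands in an empty or sparse cell is flagged. The cell width and the sparsity cutoff are our hypothetical parameters, not the paper's configuration:

```python
# Grid-partitioning anomaly scoring over a 2-D attribute space.
from collections import Counter

def cell(point, width=1.0):
    return tuple(int(v // width) for v in point)

def fit(points, width=1.0):
    return Counter(cell(p, width) for p in points)

def is_anomalous(counts, point, width=1.0, min_count=2):
    return counts[cell(point, width)] < min_count  # sparse cell => anomaly

train = [(0.2, 0.3), (0.4, 0.1), (0.5, 0.5), (0.3, 0.2), (1.2, 1.1), (1.4, 1.3)]
counts = fit(train)
```

    Scoring a point is a single hash lookup regardless of the training volume, which is the computational-scalability property the abstract emphasizes for streaming data.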

  17. A lightweight network anomaly detection technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jinoh; Yoo, Wucherl; Sim, Alex

    While network anomaly detection is essential in network operations and management, it becomes ever more challenging to perform the first line of detection against the exponentially increasing volume of network traffic. In this paper, we develop a technique for the first line of online anomaly detection with two important considerations: (i) availability of traffic attributes during the monitoring time, and (ii) computational scalability for streaming data. The presented learning technique is lightweight and highly scalable, using an approximation based on grid partitioning of the given dimensional space. With the public traffic traces of KDD Cup 1999 and NSL-KDD, we show that our technique yields 98.5% and 83% detection accuracy, respectively, with only a couple of readily available traffic attributes that can be obtained without post-processing. Finally, the results are at least comparable with those of classical learning methods, including decision tree and random forest, with approximately two orders of magnitude faster learning performance.

  18. Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)

    2001-01-01

    The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is basically a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes the uncertainty and vagueness typically found in image analysis. ANN provides learning capabilities, and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along slide basket cables utilized by the astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another current, potentially viable application at NASA is in detecting anomalies of the NASA Space Shuttle Orbiter's radiator panels.

  19. Multi-Level Anomaly Detection on Time-Varying Graph Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Robert A; Collins, John P; Ferragut, Erik M

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  20. Structural Anomaly Detection Using Fiber Optic Sensors and Inverse Finite Element Method

    NASA Technical Reports Server (NTRS)

    Quach, Cuong C.; Vazquez, Sixto L.; Tessler, Alex; Moore, Jason P.; Cooper, Eric G.; Spangler, Jan. L.

    2005-01-01

    NASA Langley Research Center is investigating a variety of techniques for mitigating aircraft accidents due to structural component failure. One technique under consideration combines distributed fiber optic strain sensing with an inverse finite element method for detecting and characterizing structural anomalies that may provide early indication of airframe structure degradation. The technique identifies structural anomalies that result in observable changes in localized strain but do not impact the overall surface shape. Surface shape information is provided by an inverse finite element method that computes full-field displacements and internal loads using strain data from in-situ fiber optic sensors. This paper describes a prototype of such a system and reports results from a series of laboratory tests conducted on a test coupon subjected to increasing levels of damage.

  1. Lidar detection algorithm for time and range anomalies.

    PubMed

    Ben-David, Avishai; Davidson, Charles E; Vanderbeek, Richard G

    2007-10-10

    A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly where the question "is a target (aerosol cloud) present at range R within time t(1) to t(2)" is addressed, and for range anomaly where the question "is a target present at time t within ranges R(1) and R(2)" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one over range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO(2) lidar measurements of bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed.
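    The Gaussian-mixture step can be illustrated with a small one-dimensional, two-component EM fit that separates background detection scores from anomalous ones; the initialization and the data are ours, and the lidar-specific preprocessing is omitted:

```python
# Two-component 1-D Gaussian mixture fitted by expectation-maximization.
import math

def em_two_gaussians(xs, iters=50):
    # crude initialisation: split the sorted data at the median
    xs = sorted(xs)
    lo, hi = xs[: len(xs) // 2], xs[len(xs) // 2:]
    mu = [sum(lo) / len(lo), sum(hi) / len(hi)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in xs:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-((x - mu[k]) ** 2) / (2 * var[k])) for k in (0, 1)]
            z = p[0] + p[1]
            resp.append((p[0] / z, p[1] / z))
        # M-step: re-estimate means, variances, and mixing weights
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(1e-6, sum(r[k] * (x - mu[k]) ** 2
                                   for r, x in zip(resp, xs)) / nk)
            pi[k] = nk / len(xs)
    return mu, var, pi

# detection scores: a low-valued background cluster and a high-valued anomaly cluster
scores = [0.1, 0.2, 0.15, 0.05, 0.18, 0.12, 5.0, 5.2, 4.9]
mu, var, pi = em_two_gaussians(scores)
```

    Once the two components are fitted, a detection threshold falls naturally between the background mean and the anomaly mean, with the component densities giving probabilities of detection and false alarm.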

  2. Development of references of anomalies detection on P91 material using Self-Magnetic Leakage Field (SMLF) technique

    NASA Astrophysics Data System (ADS)

    Husin, Shuib; Afiq Pauzi, Ahmad; Yunus, Salmi Mohd; Ghafar, Mohd Hafiz Abdul; Adilin Sekari, Saiful

    2017-10-01

    This technical paper demonstrates the successful application of the self-magnetic leakage field (SMLF) technique in detecting anomalies in the weldment of a thick P91 material joint (1 inch thick). Boiler components such as boiler tubes and penthouse stub boilers, and energy piping such as the hot reheat pipe (HRP) and the H-balance energy piping to the turbine, are made of P91 material. P91 is a ferromagnetic material; therefore the self-magnetic leakage field (SMLF) technique is applicable for detecting anomalies within the material (internal defects). The technique is categorized as a non-destructive technique (NDT). It is the second passive method, after acoustic emission (AE), in which information on structural radiation (magnetic field and energy waves) is used. The measured magnetic leakage field of a product or component is a magnetic leakage field occurring on the component’s surface in the zone of stable dislocation slip bands under the influence of operational (in-service) or residual stresses, or in zones of maximum inhomogeneity of the metal structure in new products or components. Inter-granular and trans-granular cracks, inclusions, voids, cavities, and corrosion are types of inhomogeneity and discontinuity in material that this technique reveals through the magnetic leakage field output. The technique does not require surface preparation of the component to be inspected. It is a contact-type inspection, meaning the sensor has to touch, or be in contact with, the component’s surface during inspection. The results of applying the SMLF technique to the developed P91 reference blocks demonstrate that the technique is practical for anomaly inspection and detection as well as for identifying anomalies’ locations.
    The evaluation of this passive self-magnetic leakage field (SMLF) technique has been verified by other conventional non-destructive tests (NDTs) on the reference blocks, in which simulated defects/anomalies were created inside the weldment. The results from the inspection test showed that the peak of the magnetic leakage field gradient distribution is found at the location of the defect/anomaly in the reference block, in agreement with the evidence of the anomaly seen on the radiographic test (RT) film.

  3. A Comparative Study of Unsupervised Anomaly Detection Techniques Using Honeypot Data

    NASA Astrophysics Data System (ADS)

    Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Inoue, Daisuke; Eto, Masashi; Nakao, Koji

    Intrusion Detection Systems (IDS) have received considerable attention among network security researchers as one of the most promising countermeasures to defend crucial computer systems or networks against attackers on the Internet. Over the past few years, many machine learning techniques have been applied to IDSs to improve their performance and to construct them with low cost and effort. In particular, unsupervised anomaly detection techniques have a significant advantage in their capability to identify unforeseen attacks, i.e., 0-day attacks, and to build intrusion detection models without any labeled (i.e., pre-classified) training data in an automated manner. In this paper, we conduct a set of experiments to evaluate and analyze the performance of the major unsupervised anomaly detection techniques, using real traffic data obtained at our honeypots deployed inside and outside the campus network of Kyoto University, and using various evaluation criteria: performance evaluation by similarity measurements and the size of training data, overall performance, detection ability for unknown attacks, and time complexity. Our experimental results give some practical and useful guidelines to IDS researchers and operators, so that they can gain insight into applying these techniques to the area of intrusion detection and devise more effective intrusion detection models.

  4. Detection of sinkholes or anomalies using full seismic wave fields.

    DOT National Transportation Integrated Search

    2013-04-01

    This research presents an application of two-dimensional (2-D) time-domain waveform tomography for detection of embedded sinkholes and anomalies. The measured seismic surface wave fields were inverted using a full waveform inversion (FWI) technique, ...

  5. Network Anomaly Detection Based on Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.
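    A single level of the Haar wavelet transform shows the underlying mechanics: approximation coefficients carry the smoothed signal while a large detail coefficient marks abrupt local change. This is generic wavelet machinery only, not the paper's fifteen traffic features or its system-identification stage:

```python
# One level of the Haar wavelet transform (pure Python).
import math

def haar_level(signal):
    """Return (approximation, detail) coefficients for one Haar level."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2.0)
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2.0)
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

sig = [4.0, 4.0, 4.0, 4.0, 4.0, 12.0, 4.0, 4.0]  # flat traffic with one burst
approx, detail = haar_level(sig)
# only the pair containing the burst yields a large-magnitude detail coefficient
```

    Thresholding the detail coefficients localizes abrupt changes in time, while the approximation sequence feeds the slower-scale model of normal behavior.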

  6. A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.
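The coarse-level scoring idea (node probabilities aggregated into subgraph scores so that anomalies surface at multiple scales) can be sketched generically. This is not the paper's BTER generalization, only the aggregation principle under an independence assumption, with made-up probabilities:

```python
import math

def subgraph_score(node_probs, members):
    """Coarse-level anomaly score: sum of member nodes' negative
    log-probabilities (surprisal), assuming node independence. A subgraph
    scores high when its nodes are jointly improbable under the model."""
    return sum(-math.log(node_probs[n]) for n in members)

probs = {'a': 0.9, 'b': 0.8, 'c': 0.05}       # 'c' behaves improbably
communities = {'c1': ['a', 'b'], 'c2': ['c']}  # hypothetical community split
scores = {k: subgraph_score(probs, v) for k, v in communities.items()}
print(max(scores, key=scores.get))  # → 'c2', the community with the anomaly
```

Narrowing focus then means descending from the highest-scoring subgraph to its highest-surprisal member nodes.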

  7. A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization

    DOE PAGES

    Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.; ...

    2016-01-01

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  8. Post-processing for improving hyperspectral anomaly detection accuracy

    NASA Astrophysics Data System (ADS)

    Wu, Jee-Cheng; Jiang, Chi-Ming; Huang, Chen-Liang

    2015-10-01

    Anomaly detection is an important topic in the exploitation of hyperspectral data. Based on the Reed-Xiaoli (RX) detector and a morphology operator, this research proposes a novel technique for improving the accuracy of hyperspectral anomaly detection. First, the RX-based detector is used to process a given input scene. Then, a post-processing scheme using a morphology operator is employed to detect those pixels around high-scoring anomaly pixels. Tests were conducted using two real hyperspectral images with ground truth information, and the results, based on receiver operating characteristic curves, illustrate that the proposed method reduced the false alarm rates of the RX-based detector.
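A minimal sketch of the two stages on a synthetic cube, assuming a global RX detector and a 3x3 binary dilation as the morphology operator (the paper does not specify which operator it uses):

```python
import numpy as np

def rx_scores(cube):
    """Global RX detector: squared Mahalanobis distance of every pixel
    spectrum to the scene mean, under the scene covariance."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b)
    d = X - X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(b)  # regularized covariance
    inv = np.linalg.inv(cov)
    return np.einsum('ij,jk,ik->i', d, inv, d).reshape(h, w)

def dilate3x3(mask):
    """3x3 binary dilation (wrap-around at edges): grows detections to
    include pixels around high-scoring anomaly pixels."""
    out = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out

rng = np.random.default_rng(0)
cube = rng.normal(size=(20, 20, 5))   # synthetic 5-band scene
cube[10, 10] += 8.0                   # implanted anomalous spectrum
scores = rx_scores(cube)
mask = scores > np.percentile(scores, 99.5)      # stage 1: RX threshold
print(mask[10, 10], dilate3x3(mask)[10, 11])     # stage 2: grown neighborhood
```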

  9. Anomaly Detection in Power Quality at Data Centers

    NASA Technical Reports Server (NTRS)

    Grichine, Art; Solano, Wanda M.

    2015-01-01

    The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion follow Gaussian distributions, effective set-points are computed using this model while maintaining a low false positive count.
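Under the Gaussian assumption described above, alarm set-points reduce to mean plus or minus k standard deviations. A sketch with hypothetical voltage readings (the numbers are illustrative, not NCCIPS data):

```python
import numpy as np

def gaussian_setpoints(samples, k=3.0):
    """Alarm set-points under a Gaussian model: mean +/- k standard deviations.
    With k=3, roughly 0.27% of normal readings trip the alarm, which keeps
    the false positive count low."""
    mu, sigma = np.mean(samples), np.std(samples)
    return mu - k * sigma, mu + k * sigma

rng = np.random.default_rng(1)
voltage = rng.normal(230.0, 1.5, size=10_000)  # hypothetical nominal readings
lo, hi = gaussian_setpoints(voltage)
anomalous = (voltage < lo) | (voltage > hi)
print(round(anomalous.mean(), 4))  # close to the theoretical 0.0027
```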

  10. A modified anomaly detection method for capsule endoscopy images using non-linear color conversion and Higher-order Local Auto-Correlation (HLAC).

    PubMed

    Hu, Erzhong; Nosato, Hirokazu; Sakanashi, Hidenori; Murakawa, Masahiro

    2013-01-01

    Capsule endoscopy is a patient-friendly form of endoscopy broadly utilized in gastrointestinal examination. However, the efficacy of diagnosis is restricted by the large quantity of images. This paper presents a modified anomaly detection method by which both known and unknown anomalies in capsule endoscopy images of the small intestine are expected to be detected. To achieve this goal, the paper introduces feature extraction using a non-linear color conversion and Higher-order Local Auto-Correlation (HLAC) features, and makes use of image partitioning and a subspace method for anomaly detection. Experiments are conducted on several major anomalies with combinations of the proposed techniques. As a result, the proposed method achieved 91.7% and 100% detection accuracy for swelling and bleeding, respectively, demonstrating the effectiveness of the proposed method.
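HLAC features are sums of products of pixel values at fixed displacement patterns. A reduced sketch with only the zeroth-order feature and the eight first-order (one-neighbor) correlations is shown below; the full HLAC family uses a larger standard set of masks up to second order, and the color conversion and subspace stages are omitted:

```python
import numpy as np

def hlac_features(img):
    """Grayscale HLAC features up to first order: the image sum (order 0)
    plus autocorrelations of each pixel with one neighbor at the 8
    displacement offsets (order 1), using wrap-around shifts."""
    feats = [img.sum()]  # order 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if (dy, dx) == (0, 0):
                continue
            shifted = np.roll(np.roll(img, dy, 0), dx, 1)
            feats.append((img * shifted).sum())  # order 1
    return np.array(feats)

img = np.zeros((8, 8)); img[2:5, 3] = 1.0  # a short vertical structure
f = hlac_features(img)
print(f.shape)  # 9-dimensional feature vector per image partition
```

In the subspace method, feature vectors from normal partitions span a reference subspace, and a partition whose vector projects poorly onto it is flagged as anomalous.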

  11. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    The paper deals with infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper describes an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique, and the IR Contrast feature imaging application, which are based on the models provided in this paper.
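The two core measurements can be illustrated simply. The contrast definition below, (T_defect - T_ref)/T_ref, is an assumption for illustration (the paper's calibrated model is more elaborate), and the half-max width is counted in samples along a hypothetical spatial profile:

```python
import numpy as np

def normalized_contrast(defect_ts, ref_ts):
    """Normalized contrast evolution: relative temperature difference between
    a pixel over the suspected anomaly and a defect-free reference pixel."""
    defect_ts = np.asarray(defect_ts, float)
    ref_ts = np.asarray(ref_ts, float)
    return (defect_ts - ref_ts) / ref_ts

def half_max_width(profile):
    """Half-max edge detection: indication width = number of samples whose
    contrast is at least half of the peak contrast."""
    p = np.asarray(profile, float)
    return int(np.sum(p >= p.max() / 2))

# Hypothetical spatial contrast profile across an indication.
profile = np.array([0.0, 0.1, 0.4, 0.9, 1.0, 0.8, 0.3, 0.1, 0.0])
print(half_max_width(profile))  # → 3 samples above the 0.5 half-max level
```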

  12. Attention focusing and anomaly detection in systems monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, Richard J.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. The focus of this paper is a new technique for attention focusing. The technique involves reasoning about the distance between two frequency distributions, and is used to detect both anomalous system parameters and 'broken' causal dependencies. These two forms of information together isolate the locus of anomalous behavior in the system being monitored.
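The core operation, reasoning about the distance between two frequency distributions, can be sketched with total variation distance. The paper does not name its measure here, so TV is a stand-in, and the distributions are hypothetical sensor-value frequencies:

```python
def total_variation(p, q):
    """Distance between two frequency distributions: half the L1 difference.
    0 means identical behavior; values near 1 mean the observed sensor
    behavior has drifted far from the nominal model."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

nominal = [0.7, 0.2, 0.1]   # learned frequencies of binned sensor values
observed = [0.2, 0.3, 0.5]  # frequencies during the current window
print(total_variation(nominal, observed))  # → 0.5, a large shift
```

A large distance on one parameter flags it as anomalous; a large distance between jointly observed parameter pairs, relative to the learned dependency, indicates a "broken" causal link.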

  13. Spatially-Aware Temporal Anomaly Mapping of Gamma Spectra

    NASA Astrophysics Data System (ADS)

    Reinhart, Alex; Athey, Alex; Biegalski, Steven

    2014-06-01

    For security, environmental, and regulatory purposes it is useful to continuously monitor wide areas for unexpected changes in radioactivity. We report on a temporal anomaly detection algorithm which uses mobile detectors to build a spatial map of background spectra, allowing sensitive detection of any anomalies through many days or months of monitoring. We adapt previously developed anomaly detection methods, which compare spectral shape rather than count rate, to function with limited background data, allowing sensitive detection of small changes in spectral shape from day to day. To demonstrate this technique we collected daily observations over a period of six weeks on a 0.33 square mile research campus and performed source injection simulations.

  14. Fiber Optic Bragg Grating Sensors for Thermographic Detection of Subsurface Anomalies

    NASA Technical Reports Server (NTRS)

    Allison, Sidney G.; Winfree, William P.; Wu, Meng-Chou

    2009-01-01

    Conventional thermography with an infrared imager has been shown to be an extremely viable technique for nondestructively detecting subsurface anomalies such as thickness variations due to corrosion. A recently developed technique using fiber optic sensors to measure temperature holds potential for performing similar inspections without requiring an infrared imager. The structure is heated using a heat source such as a quartz lamp with fiber Bragg grating (FBG) sensors at the surface of the structure to detect temperature. Investigated structures include a stainless steel plate with thickness variations simulated by small platelets attached to the back side using thermal grease. A relationship is shown between the FBG sensor thermal response and variations in material thickness. For comparison, finite element modeling was performed and found to agree closely with the fiber optic thermography results. This technique shows potential for applications where FBG sensors are already bonded to structures for Integrated Vehicle Health Monitoring (IVHM) strain measurements and can serve dual-use by also performing thermographic detection of subsurface anomalies.

  15. Wolffian duct derivative anomalies: technical considerations when encountered during robot-assisted radical prostatectomy.

    PubMed

    Acharya, Sujeet S; Gundeti, Mohan S; Zagaja, Gregory P; Shalhav, Arieh L; Zorn, Kevin C

    2009-04-01

    Although malformations of the genitourinary tract are typically identified during childhood, they can remain silent until incidental detection during the evaluation and treatment of other pathologies in adulthood. The advent of the minimally invasive era in urologic surgery has given rise to unique challenges in the surgical management of anomalies of the genitourinary tract. This article reviews the embryology of anomalies of Wolffian duct (WD) derivatives, with specific attention to the seminal vesicles, vas deferens, ureter, and kidneys. This is followed by a discussion of the history of the laparoscopic approach to WD derivative anomalies. Finally, we present two cases to describe technical considerations when managing these anomalies encountered during robot-assisted radical prostatectomy. The University of Chicago Robotic Laparoscopic Radical Prostatectomy (RLRP) database was reviewed for cases where anomalies of WD derivatives were encountered. We describe how modifications in technique allowed for completion of the procedure without difficulty. Of the 1230 RLRP procedures performed at our institution by three surgeons, only two cases (0.16%) were noted to have a WD anomaly. These cases were completed without difficulty by making simple modifications in technique. Although such anomalies are uncommon, it is important for the urologist to be familiar with their origin and surgical management, particularly when they are detected incidentally during surgery. Simple modifications in technique allow for completion of RLRP without difficulty.

  16. Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets

    PubMed Central

    Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin

    2017-01-01

    A road traffic anomaly denotes a road segment that is anomalous in terms of vehicle traffic flow. Detecting road traffic anomalies from GPS (Global Positioning System) snippet data is becoming critical in urban computing, since such anomalies often suggest underlying events. However, the noisy and sparse nature of GPS snippet data introduces multiple problems that make detecting road traffic anomalies very challenging. To address these issues, we propose a two-stage solution consisting of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference, incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large GPS snippets via a tensor decomposition technique. RAT then calculates the anomalous degree for each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one month of GPS location data from more than eight thousand taxicabs in Beijing. The evaluation results show the advantages of our method over other baseline techniques. PMID:28282948

  17. Detection of geothermal anomalies in Tengchong, Yunnan Province, China from MODIS multi-temporal night LST imagery

    NASA Astrophysics Data System (ADS)

    Li, H.; Kusky, T. M.; Peng, S.; Zhu, M.

    2012-12-01

    Thermal infrared (TIR) remote sensing is an important technique in the exploration of geothermal resources. In this study, a geothermal survey is conducted in Tengchong area of Yunnan province in China using multi-temporal MODIS LST (Land Surface Temperature). The monthly night MODIS LST data from Mar. 2000 to Mar. 2011 of the study area were collected and analyzed. The 132 month average LST map was derived and three geothermal anomalies were identified. The findings of this study agree well with the results from relative geothermal gradient measurements. Finally, we conclude that TIR remote sensing is a cost-effective technique to detect geothermal anomalies. Combining TIR remote sensing with geological analysis and the understanding of geothermal mechanism is an accurate and efficient approach to geothermal area detection.

  18. SU-G-JeP4-03: Anomaly Detection of Respiratory Motion by Use of Singular Spectrum Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kotoku, J; Kumagai, S; Nakabayashi, S

    Purpose: The implementation of automatic anomaly detection for respiratory motion is a very important technique for preventing accidental damage during radiation therapy. Here, we propose an automatic anomaly detection method using singular spectrum analysis. Methods: The anomaly detection procedure consists of four parts: 1) measurement of normal respiratory motion data of a patient; 2) calculation of a trajectory matrix representing the normal time-series features; 3) real-time monitoring and calculation of a trajectory matrix of the real-time data; 4) calculation of an anomaly score from the similarity of the two feature matrices. Patient motion was observed by a marker-less tracking system using a depth camera. Results: Two types of motion, e.g. coughing and a sudden stop of breathing, were successfully detected in our real-time application. Conclusion: Automatic anomaly detection of respiratory motion using singular spectrum analysis succeeded for coughing and a sudden stop of breathing, and the algorithm is promising for clinical use. This work was supported by JSPS KAKENHI Grant Number 15K08703.
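Steps 2-4 of the procedure can be sketched with a Hankel trajectory matrix and an SVD subspace similarity score. The window length, rank, score definition, and the simulated breathing traces below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def trajectory_matrix(series, window):
    """Steps 2/3: Hankel (trajectory) matrix whose columns are lagged windows."""
    s = np.asarray(series, float)
    return np.stack([s[i:i + window] for i in range(len(s) - window + 1)], axis=1)

def ssa_anomaly_score(normal, test, window=20, rank=3):
    """Step 4: 1 minus the fraction of test-trajectory energy captured by the
    leading left singular vectors learned from the normal trajectory matrix.
    Near 0 = similar to normal breathing; larger = anomalous motion."""
    U, _, _ = np.linalg.svd(trajectory_matrix(normal, window), full_matrices=False)
    Ur = U[:, :rank]
    T = trajectory_matrix(test, window)
    return 1.0 - np.linalg.norm(Ur.T @ T) ** 2 / np.linalg.norm(T) ** 2

t = np.arange(400) * 0.2
breathing = np.sin(t)                                          # regular respiration
cough = np.sin(t[:100]) + 3.0 * (np.abs(t[:100] - 10) < 0.5)   # transient spike
print(ssa_anomaly_score(breathing[:200], breathing[200:300]),  # ~0
      ssa_anomaly_score(breathing[:200], cough))               # clearly larger
```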

  19. Anomaly Detection Techniques for Ad Hoc Networks

    ERIC Educational Resources Information Center

    Cai, Chaoli

    2009-01-01

    Anomaly detection is an important and indispensable aspect of any computer security mechanism. Ad hoc and mobile networks consist of a number of peer mobile nodes that are capable of communicating with each other absent a fixed infrastructure. Arbitrary node movements and lack of centralized control make them vulnerable to a wide variety of…

  20. Anomaly-based intrusion detection for SCADA systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D.; Usynin, A.; Hines, J. W.

    2006-07-01

    Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition systems (SCADA). These systems have been the focus of increased security concern, as there are worries that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimated that malicious online actions might cause $75 billion in damage in 2007. One of the interesting countermeasures for enhancing information system security is intrusion detection. This paper briefly discusses the history of research in intrusion detection techniques and introduces the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly-based intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT) and is applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)
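The AAKR step predicts what each multivariate observation "should" look like under normal operation; the residual between prediction and observation is then monitored (by the SPRT, summarized here in a comment). The bandwidth and the simulated sensor data below are illustrative assumptions:

```python
import numpy as np

def aakr_predict(memory, query, bandwidth=1.0):
    """Auto-associative kernel regression: reconstruct the 'normal' version of
    a query vector as a Gaussian-kernel weighted average of stored memory
    vectors collected during fault-free operation."""
    d2 = np.sum((memory - query) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    w /= w.sum() + 1e-12
    return w @ memory

rng = np.random.default_rng(2)
# Memory of normal operation: three correlated sensors around (1, 2, 3).
memory = rng.normal(0.0, 0.1, size=(200, 3)) + np.array([1.0, 2.0, 3.0])

normal_q = np.array([1.05, 2.0, 2.95])   # an ordinary reading
attack_q = np.array([1.0, 5.0, 3.0])     # sensor 2 spoofed by an attacker
r_norm = np.linalg.norm(aakr_predict(memory, normal_q) - normal_q)
r_att = np.linalg.norm(aakr_predict(memory, attack_q) - attack_q)
# The SPRT would accumulate these residuals over time and raise an alarm
# once the evidence for the 'faulty' hypothesis crosses a threshold.
print(r_norm < r_att)
```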

  1. A robust background regression based score estimation algorithm for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhao, Rui; Du, Bo; Zhang, Liangpei; Zhang, Lefei

    2016-12-01

    Anomaly detection has become a hot topic in the hyperspectral image analysis and processing fields in recent years. The most important issue for hyperspectral anomaly detection is the background estimation and suppression. Unreasonable or non-robust background estimation usually leads to unsatisfactory anomaly detection results. Furthermore, the inherent nonlinearity of hyperspectral images may cover up the intrinsic data structure in the anomaly detection. In order to implement robust background estimation, as well as to explore the intrinsic data structure of the hyperspectral image, we propose a robust background regression based score estimation algorithm (RBRSE) for hyperspectral anomaly detection. The Robust Background Regression (RBR) is actually a label assignment procedure which segments the hyperspectral data into a robust background dataset and a potential anomaly dataset with an intersection boundary. In the RBR, a kernel expansion technique, which explores the nonlinear structure of the hyperspectral data in a reproducing kernel Hilbert space, is utilized to formulate the data as a density feature representation. A minimum squared loss relationship is constructed between the data density feature and the corresponding assigned labels of the hyperspectral data, to formulate the foundation of the regression. Furthermore, a manifold regularization term which explores the manifold smoothness of the hyperspectral data, and a maximization term of the robust background average density, which suppresses the bias caused by the potential anomalies, are jointly appended in the RBR procedure. After this, a paired-dataset based k-nn score estimation method is undertaken on the robust background and potential anomaly datasets, to implement the detection output. 
The experimental results show that RBRSE achieves better ROC curves, AUC values, and background-anomaly separation than several other state-of-the-art anomaly detection methods, and is easy to implement in practice.

  2. Confabulation Based Real-time Anomaly Detection for Wide-area Surveillance Using Heterogeneous High Performance Computing Architecture

    DTIC Science & Technology

    2015-06-01

    system accuracy. The AnRAD system was also generalized for the additional application of network intrusion detection. A self-structuring technique... to Host-based Intrusion Detection Systems using Contiguous and Discontiguous System Call Patterns," IEEE Transactions on Computers, 63(4), pp. 807... square kilometer areas. The anomaly recognition and detection (AnRAD) system was built as a cogent confabulation network. It represented road

  3. Department of Defense Fiscal Year (FY) 2005 Budget Estimates. Research, Development, Test and Evaluation, Defense-Wide. Volume 1 - Defense Advanced Research Projects Agency

    DTIC Science & Technology

    2004-02-01

    UNCLASSIFIED − Conducted experiments to determine the usability of general-purpose anomaly detection algorithms to monitor a large, complex military...reaction and detection modules to perform tailored analysis sequences to monitor environmental conditions, health hazards and physiological states...scalability of lab proven anomaly detection techniques for intrusion detection in real world high volume environments. Narrative Title FY 2003

  4. Road Anomalies Detection System Evaluation.

    PubMed

    Silva, Nuno; Shah, Vaibhav; Soares, João; Rodrigues, Helena

    2018-06-21

    Anomalies on road pavement cause discomfort to drivers and passengers, and may cause mechanical failure or even accidents. Governments spend millions of Euros every year on road maintenance, often causing traffic jams and congestion on urban roads on a daily basis. This paper analyses the difference between the deployment of a road anomaly detection and identification system in a “conditioned” setup and in a real-world setup, where the system performed worse than in the “conditioned” setup. It also presents a system performance analysis based on the training data sets; on the complexity of the attributes, examined through the application of PCA techniques; and on the attributes in the context of each anomaly type, using acceleration standard deviation attributes to observe how different anomaly classes are distributed in the Cartesian coordinate system. Overall, we describe the main insights on road anomaly detection challenges to support the design and deployment of a new iteration of our system, towards a road anomaly detection service that provides information about road conditions to drivers and government entities.
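The PCA-based attribute-complexity analysis can be sketched as follows. The three nearly redundant acceleration attributes are hypothetical stand-ins for the paper's feature set; the point is that the explained-variance ratios reveal how much independent information the attributes actually carry:

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD on mean-centered data: returns the projection onto the top
    principal components and the fraction of variance each one explains."""
    Xc = X - X.mean(axis=0)
    _, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (S ** 2) / (S ** 2).sum()
    return Xc @ Vt[:n_components].T, explained[:n_components]

rng = np.random.default_rng(3)
base = rng.normal(size=(300, 1))
# Hypothetical acceleration features: two strongly correlated attributes
# plus one low-variance attribute.
X = np.hstack([base,
               2 * base + 0.01 * rng.normal(size=(300, 1)),
               0.05 * rng.normal(size=(300, 1))])
proj, explained = pca(X)
print(explained[0] > 0.95)  # one component carries almost all the information
```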

  5. Thermal wake/vessel detection technique

    DOEpatents

    Roskovensky, John K [Albuquerque, NM; Nandy, Prabal [Albuquerque, NM; Post, Brian N [Albuquerque, NM

    2012-01-10

    A computer-automated method for detecting a vessel in water based on an image of a portion of Earth includes generating a thermal anomaly mask. The thermal anomaly mask flags each pixel of the image initially deemed to be a wake pixel based on a comparison of a thermal value of each pixel against other thermal values of other pixels localized about each pixel. Contiguous pixels flagged by the thermal anomaly mask are grouped into pixel clusters. A shape of each of the pixel clusters is analyzed to determine whether each of the pixel clusters represents a possible vessel detection event. The possible vessel detection events are represented visually within the image.
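The thermal anomaly mask step can be crudely sketched by comparing each pixel against the statistics of a local neighborhood; the 5x5 wrap-around box and the threshold k are illustrative assumptions, since the patent abstract does not specify the localization scheme:

```python
import numpy as np

def thermal_mask(img, k=2.0):
    """Flag pixels whose thermal value exceeds the mean of a 5x5 local
    neighborhood by more than k local standard deviations (neighborhood
    statistics gathered via wrap-around shifts for brevity)."""
    acc = np.zeros_like(img); acc2 = np.zeros_like(img); n = 0
    for dy in range(-2, 3):
        for dx in range(-2, 3):
            sh = np.roll(np.roll(img, dy, 0), dx, 1)
            acc += sh; acc2 += sh ** 2; n += 1
    mu = acc / n
    sigma = np.sqrt(np.maximum(acc2 / n - mu ** 2, 1e-12))
    return (img - mu) > k * sigma

sea = np.full((12, 12), 10.0)  # uniform sea-surface temperature
sea[6, 4:8] = 14.0             # a warm linear wake signature
mask = thermal_mask(sea)
print(mask[6, 4:8].all())      # the wake pixels are flagged
```

Grouping the flagged pixels into contiguous clusters and testing each cluster's elongation would then correspond to the shape-analysis stage.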

  6. Visual analytics of anomaly detection in large data streams

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.; Sharma, Ratnesh K.; Mehta, Abhay

    2009-01-01

    Most data streams usually are multi-dimensional, high-speed, and contain massive volumes of continuous information. They are seen in daily applications, such as telephone calls, retail sales, data center performance, and oil production operations. Many analysts want insight into the behavior of this data. They want to catch the exceptions in flight to reveal the causes of the anomalies and to take immediate action. To guide the user in finding the anomalies in the large data stream quickly, we derive a new automated neighborhood threshold marking technique, called AnomalyMarker. This technique is built on cell-based data streams and user-defined thresholds. We extend the scope of the data points around the threshold to include the surrounding areas. The idea is to define a focus area (marked area) which enables users to (1) visually group the interesting data points related to the anomalies (i.e., problems that occur persistently or occasionally) for observing their behavior; (2) discover the factors related to the anomaly by visualizing the correlations between the problem attribute with the attributes of the nearby data items from the entire multi-dimensional data stream. Mining results are quickly presented in graphical representations (i.e., tooltip) for the user to zoom into the problem regions. Different algorithms are introduced which try to optimize the size and extent of the anomaly markers. We have successfully applied this technique to detect data stream anomalies in large real-world enterprise server performance and data center energy management.

  7. Experiments on Adaptive Techniques for Host-Based Intrusion Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.

    2001-09-01

    This research explores four experiments with adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs), because of their use of reinforcement learning, which allows learning of exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which is unable to detect novel exploits, and anomaly detection, which detects too many events, including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.

  8. Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets.

    PubMed

    Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin

    2017-03-09

    A road traffic anomaly denotes a road segment that is anomalous in terms of vehicle traffic flow. Detecting road traffic anomalies from GPS (Global Positioning System) snippet data is becoming critical in urban computing, since such anomalies often suggest underlying events. However, the noisy and sparse nature of GPS snippet data introduces multiple problems that make detecting road traffic anomalies very challenging. To address these issues, we propose a two-stage solution consisting of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference, incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large GPS snippets via a tensor decomposition technique. RAT then calculates the anomalous degree for each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one month of GPS location data from more than eight thousand taxicabs in Beijing. The evaluation results show the advantages of our method over other baseline techniques.

  9. Machine Learning in Intrusion Detection

    DTIC Science & Technology

    2005-07-01

    machine learning tasks. Anomaly detection provides the core technology for a broad spectrum of security-centric applications. In this dissertation, we examine various aspects of anomaly based intrusion detection in computer security. First, we present a new approach to learn program behavior for intrusion detection. Text categorization techniques are adopted to convert each process to a vector and calculate the similarity between two program activities. Then the k-nearest neighbor classifier is employed to classify program behavior as normal or intrusive. We demonstrate

  10. Mesoscale, Radiometrically Referenced, Multi-Temporal Hyperspectral Data for Co2 Leak Detection by Locating Spatial Variation of Biophysically Relevant Parameters

    NASA Astrophysics Data System (ADS)

    McCann, Cooper Patrick

    Low-cost flight-based hyperspectral imaging systems have the potential to provide valuable information for ecosystem and environmental studies, as well as to aid in land management and land health monitoring. This thesis describes (1) a bootstrap method of producing mesoscale, radiometrically referenced hyperspectral data using the Landsat surface reflectance (LaSRC) data product as a reference target, (2) biophysically relevant basis functions to model the reflectance spectra, (3) an unsupervised classification technique based on natural histogram splitting of these biophysically relevant parameters, and (4) local and multi-temporal anomaly detection. The bootstrap method extends standard processing techniques to remove uneven illumination conditions between flight passes, allowing the creation of radiometrically self-consistent data. Through selective spectral and spatial resampling, LaSRC data is used as a radiometric reference target. Advantages of the bootstrap method include the need for minimal site access, no ancillary instrumentation, and automated data processing. Data from a flight on 06/02/2016 are compared with concurrently collected ground-based reflectance spectra as a means of validation, achieving an average error of 2.74%. Fitting reflectance spectra using basis functions based on biophysically relevant spectral features allows both noise and data reduction while shifting information from spectral bands to biophysical features. Histogram splitting is used to determine a clustering based on natural splittings of these fit parameters. The Indian Pines reference data enabled comparisons of the efficacy of this technique against established techniques. The splitting technique is shown to be an improvement over the ISODATA clustering technique, with an overall accuracy of 34.3/19.0% before merging and 40.9/39.2% after merging.
This improvement is also seen in kappa before/after merging: 24.8/30.5 for the histogram splitting technique compared to 15.8/28.5 for ISODATA. Three hyperspectral flights over the Kevin Dome area, covering 1843 ha, acquired 06/21/2014, 06/24/2015, and 06/26/2016, are examined with different methods of anomaly detection. Detection of anomalies within a single data set is examined to determine, on a local scale, areas that are significantly different from their surroundings. Additionally, the detection and identification of persistent and non-persistent anomalies across multiple data sets was investigated.

  11. Detection of submicron scale cracks and other surface anomalies using positron emission tomography

    DOEpatents

    Cowan, Thomas E.; Howell, Richard H.; Colmenares, Carlos A.

    2004-02-17

Detection of submicron scale cracks and other mechanical and chemical surface anomalies using PET. This surface technique has sufficient sensitivity to detect single voids or pits of sub-millimeter size, and single cracks or fissures of millimeter-scale length, micrometer-scale depth, and nanometer-scale width. This technique can also be applied to detect surface regions of differing chemical reactivity. It may be utilized in a scanning or survey mode to simultaneously detect such mechanical or chemical features over large interior or exterior surface areas of parts as large as about 50 cm in diameter. The technique involves exposing a surface to a short-lived radioactive gas for a time period, removing the excess gas to leave a partial monolayer, determining the location and shape of the cracks, voids, porous regions, etc., and calculating the width, depth, and length thereof. Detection of 0.01 mm deep cracks using a 3 mm detector resolution has been accomplished using this technique.

  12. Anomaly Detection Techniques with Real Test Data from a Spinning Turbine Engine-Like Rotor

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Woike, Mark R.; Oza, Nikunj C.; Matthews, Bryan L.

    2012-01-01

Online detection techniques to monitor the health of rotating engine components are becoming increasingly attractive to aircraft engine manufacturers in order to increase safety of operation and lower maintenance costs. Health monitoring remains challenging to implement, especially in the presence of scattered loading conditions, crack size, component geometry, and material properties. The current trend, however, is to utilize noninvasive types of health monitoring or nondestructive techniques to detect hidden flaws and mini-cracks before any catastrophic event occurs. These techniques go further to evaluate material discontinuities and other anomalies that have grown to the level of critical defects that can lead to failure. Generally, health monitoring is highly dependent on sensor systems capable of performing in various engine environmental conditions and able to transmit a signal upon a predetermined crack length, while acting in a neutral form upon the overall performance of the engine system.

  13. Survey of Machine Learning Methods for Database Security

    NASA Astrophysics Data System (ADS)

    Kamra, Ashish; Ber, Elisa

    Application of machine learning techniques to database security is an emerging area of research. In this chapter, we present a survey of various approaches that use machine learning/data mining techniques to enhance the traditional security mechanisms of databases. There are two key database security areas in which these techniques have found applications, namely, detection of SQL Injection attacks and anomaly detection for defending against insider threats. Apart from the research prototypes and tools, various third-party commercial products are also available that provide database activity monitoring solutions by profiling database users and applications. We present a survey of such products. We end the chapter with a primer on mechanisms for responding to database anomalies.

  14. Visualization of hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Hogervorst, Maarten A.; Bijl, Piet; Toet, Alexander

    2007-04-01

We developed four new techniques to visualize hyperspectral image data for man-in-the-loop target detection. The methods respectively: (1) display the subsequent bands as a movie ("movie"), (2) map the data onto three channels and display these as a colour image ("colour"), (3) display the correlation between the pixel signatures and a known target signature ("match"), and (4) display the output of a standard anomaly detector ("anomaly"). The movie technique requires no assumptions about the target signature and involves no information loss. The colour technique produces a single image that can be displayed in real time; a disadvantage of this technique is loss of information. A display of the match between a target signature and the pixel signatures can be interpreted quickly and easily, but this technique relies on precise knowledge of the target signature. The anomaly detector signifies pixels with signatures that deviate from the (local) background. We performed a target detection experiment with human observers to determine their relative performance with the four techniques. The results show that the "match" presentation yields the best performance, followed by "movie" and "anomaly", while performance with the "colour" presentation was the poorest. Each scheme has its advantages and disadvantages and is more or less suited for real-time and post-hoc processing. The rationale is that the final interpretation is best done by a human observer. In contrast to automatic target recognition systems, the interpretation of hyperspectral imagery by the human visual system is robust to noise and image transformations and requires a minimal number of assumptions (about the signatures of target and background, target shape, etc.). When more knowledge about target and background is available, this may be used to help the observer interpret the data (aided target detection).
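The "match" visualization described above can be sketched with a toy example (not the authors' implementation): one common way to score the correlation between a pixel signature and a known target signature is the cosine similarity of the two spectra. All spectra below are invented four-band values.

```python
import math

def spectral_match(pixel, target):
    """'Match' score: cosine similarity between a pixel's spectrum
    and a known target spectrum (1.0 = identical shape)."""
    dot = sum(p * t for p, t in zip(pixel, target))
    return dot / (math.hypot(*pixel) * math.hypot(*target))

target = [0.2, 0.6, 0.9, 0.4]          # hypothetical target signature
vegetation = [0.8, 0.3, 0.2, 0.1]      # hypothetical background pixel
camouflage = [0.21, 0.58, 0.88, 0.42]  # pixel resembling the target

print(spectral_match(camouflage, target))  # close to 1.0
print(spectral_match(vegetation, target))  # clearly lower
```

Displaying this score per pixel yields the "match" image; as the record notes, the score is only meaningful when the target signature is known precisely.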

  15. Security inspection in ports by anomaly detection using hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Rivera, Javier; Valverde, Fernando; Saldaña, Manuel; Manian, Vidya

    2013-05-01

Applying hyperspectral imaging technology in port security is crucial for the detection of possible threats or illegal activities. One of the most common problems that cargo suffers is tampering, which represents a danger to society because it creates a channel to smuggle illegal and hazardous products. If cargo has been altered, security inspections of that cargo should reveal anomalies that indicate the nature of the tampering. Hyperspectral images can detect anomalies by gathering information through multiple electromagnetic bands. The spectra extracted from these bands can be used to detect surface anomalies from different materials. Based on this technology, a scenario was built in which a hyperspectral camera was used to inspect the cargo for any surface anomalies and a user interface shows the results. The spectra of items, altered by different materials that can be used to conceal illegal products, are analyzed and classified in order to provide information about the tampered cargo. The image is analyzed with a variety of techniques such as multiple feature extraction algorithms, autonomous anomaly detection, and target spectrum detection. The results are exported to a workstation or mobile device and shown in an easy-to-use interface. This process could enhance the current capabilities of security systems that are already implemented, providing a more complete approach to detecting threats and illegal cargo.

  16. Survey of Anomaly Detection Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, B

This survey defines the problem of anomaly detection and provides an overview of existing methods. The methods are categorized into two general classes: generative and discriminative. A generative approach involves building a model that represents the joint distribution of the input features and the output labels of system behavior (e.g., normal or anomalous) and then applying the model to formulate a decision rule for detecting anomalies. A discriminative approach, on the other hand, aims directly to find the decision rule with the smallest error rate that distinguishes between normal and anomalous behavior. For each approach, we give an overview of popular techniques and provide references to state-of-the-art applications.
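The generative class described above can be illustrated with a minimal sketch (not from the survey): fit a one-dimensional Gaussian to samples of normal behavior, then derive a k-sigma decision rule from the fitted model. All data values are invented.

```python
import statistics

def fit_gaussian(normal_data):
    """Generative step: model the 'normal' class with a Gaussian."""
    mu = statistics.fmean(normal_data)
    sigma = statistics.stdev(normal_data)
    return mu, sigma

def is_anomalous(x, mu, sigma, k=3.0):
    """Decision rule derived from the model: flag points more than
    k standard deviations from the nominal mean."""
    return abs(x - mu) > k * sigma

normal = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3]
mu, sigma = fit_gaussian(normal)
print(is_anomalous(10.1, mu, sigma))  # False
print(is_anomalous(15.0, mu, sigma))  # True
```

A discriminative method would instead learn the threshold directly from labeled normal and anomalous examples, without modeling the distribution.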

  17. WE-H-BRC-06: A Unified Machine-Learning Based Probabilistic Model for Automated Anomaly Detection in the Treatment Plan Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, X; Liu, S; Kalet, A

Purpose: The purpose of this work was to investigate the ability of a machine-learning based probabilistic approach to detect radiotherapy treatment plan anomalies given initial disease class information. Methods: In total, we obtained 1112 unique treatment plans with five plan parameters and disease information from a Mosaiq treatment management system database for use in the study. The plan parameters include prescription dose, fractions, fields, modality, and technique. The disease information includes disease site and T, M, and N disease stages. A Bayesian network method was employed to model the probabilistic relationships between tumor disease information, plan parameters, and an anomaly flag. A Bayesian learning method with a Dirichlet prior was used to learn the joint probabilities between dependent variables in error-free plan data and data with artificially induced anomalies. In the study, we randomly sampled anomalous data in a specified anomaly space. We tested the approach with three groups of plan anomalies: improper concurrence of the values of all five plan parameters, of any two of the five parameters, and all single plan parameter value anomalies. In total, 16 types of plan anomalies were covered by the study. For each type, we trained an individual Bayesian network. Results: We found that the true positive rate (recall) and positive predictive value (precision) for detecting concurrence anomalies of five plan parameters in new patient cases were 94.45±0.26% and 93.76±0.39%, respectively. For the other 15 types of plan anomalies, the average recall and precision were 93.61±2.57% and 93.78±3.54%, respectively. The computation time to detect a plan anomaly of each type in a new plan is ∼0.08 seconds. Conclusion: The proposed method for treatment plan anomaly detection was found effective in the initial tests.
The results suggest that this type of model could be applied to develop plan anomaly detection tools that assist manual and automated plan checks. The senior author received research grants from ViewRay Inc. and Varian Medical Systems.
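The core idea, learning smoothed probabilities of plan-parameter/disease combinations from error-free plans and flagging improbable concurrences, can be sketched roughly as follows. This is not the authors' Bayesian network: the field names, the toy plan data, and the add-alpha (Dirichlet-style) smoothing are illustrative assumptions.

```python
from collections import Counter

def train(plans, alpha=1.0):
    """Learn a smoothed conditional probability P(modality | site) from
    error-free plans, with an add-alpha (Dirichlet-style) prior."""
    pair_counts = Counter((p["site"], p["modality"]) for p in plans)
    site_counts = Counter(p["site"] for p in plans)
    modalities = {p["modality"] for p in plans}

    def p_modality_given_site(site, modality):
        return (pair_counts[(site, modality)] + alpha) / (
            site_counts[site] + alpha * len(modalities))

    return p_modality_given_site

# hypothetical error-free plan records
plans = [{"site": "prostate", "modality": "photon"}] * 40 + \
        [{"site": "prostate", "modality": "electron"}] * 1 + \
        [{"site": "skin", "modality": "electron"}] * 20
prob = train(plans)

# an improbable pairing becomes a candidate anomaly flag
print(prob("prostate", "electron") < 0.1)  # True
```

A full network would model all five plan parameters and the staging variables jointly; the thresholding of low-probability combinations is the same in spirit.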

  18. Anomaly Monitoring Method for Key Components of Satellite

    PubMed Central

    Fan, Linjun; Xiao, Weidong; Tang, Jun

    2014-01-01

This paper presents a fault diagnosis method for key components of satellites, called the Anomaly Monitoring Method (AMM), which consists of state estimation based on Multivariate State Estimation Techniques (MSET) and anomaly detection based on the Sequential Probability Ratio Test (SPRT). On the basis of a failure analysis of lithium-ion batteries (LIBs), we divided LIB failures into internal failure, external failure, and thermal runaway, and selected the electrolyte resistance (Re) and the charge transfer resistance (Rct) as the key parameters for state estimation. Then, using actual in-orbit telemetry data for the key parameters of LIBs, we obtained the actual residual value (RX) and healthy residual value (RL) of the LIBs from the MSET state estimation, and from these residual values we detected anomalous states using SPRT-based anomaly detection. Lastly, we conducted an example application of AMM to LIBs and, according to the results, validated its feasibility and effectiveness by comparing it with the threshold detection method (TDM). PMID:24587703
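The SPRT stage described above can be sketched in a simplified scalar form (not the paper's implementation): Wald's sequential test accumulates a log-likelihood ratio over the residual sequence and stops as soon as either threshold is crossed. The hypothesis means, noise level, and residual values below are invented.

```python
import math

def sprt(residuals, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test for a mean shift
    (H0: mean mu0 = healthy vs H1: mean mu1 = anomalous) in
    Gaussian residuals with known sigma."""
    upper = math.log((1 - beta) / alpha)   # cross -> accept H1 (anomaly)
    lower = math.log(beta / (1 - alpha))   # cross -> accept H0 (healthy)
    llr = 0.0
    for i, r in enumerate(residuals):
        llr += (mu1 - mu0) * (r - (mu0 + mu1) / 2) / sigma ** 2
        if llr >= upper:
            return "anomaly", i
        if llr <= lower:
            return "healthy", i
    return "undecided", len(residuals) - 1

healthy = [0.0, 0.1, -0.1, 0.0] * 3                       # residuals near mu0
shifted = [1.2, 0.9, 1.1, 1.0, 1.3, 0.8, 1.1, 1.0, 1.2, 1.1]  # near mu1
print(sprt(healthy))
print(sprt(shifted))
```

The appeal of SPRT here is that it reaches a decision with the fewest samples, on average, for the chosen error rates alpha and beta.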

  19. Inflight and Preflight Detection of Pitot Tube Anomalies

    NASA Technical Reports Server (NTRS)

    Mitchell, Darrell W.

    2014-01-01

    The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates sensors' response time to the dynamic component (noise) found in the signal of these same sensors. This analysis technique has used existing electrical signals of Pitot tube sensors that result from measured processes during inflight conditions and/or induced signals in preflight conditions to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants. The application of this technology for the detection of aircraft anomalies is innovative. Instead of determining the health of process monitoring at a steady-state condition, this technology will be used to quickly inform the pilot when an air-speed indication becomes faulty under any flight condition as well as during preflight preparation.

  20. A new approach for structural health monitoring by applying anomaly detection on strain sensor data

    NASA Astrophysics Data System (ADS)

    Trichias, Konstantinos; Pijpers, Richard; Meeuwissen, Erik

    2014-03-01

Structural Health Monitoring (SHM) systems help to monitor critical infrastructures (bridges, tunnels, etc.) remotely and provide up-to-date information about their physical condition. In addition, SHM helps to predict a structure's life and required maintenance in a cost-efficient way. Typically, inspection data gives insight into structural health. The global structural behavior, and predominantly the structural loading, is generally measured with vibration and strain sensors. Acoustic emission sensors are increasingly used for measuring global crack activity near critical locations. In this paper, we present a procedure for local structural health monitoring by applying Anomaly Detection (AD) to data from strain sensors applied along expected crack paths. Sensor data is analyzed by automatic anomaly detection in order to find crack activity at an early stage. This approach targets the monitoring of critical structural locations, such as welds, near which strain sensors can be applied during construction, and/or locations with limited inspection possibilities during structural operation. We investigate several anomaly detection techniques to detect changes in statistical properties that indicate structural degradation. The most effective one is a novel polynomial fitting technique, which tracks slow changes in sensor data. Our approach has been tested on a representative test structure (bridge deck) in a lab environment, under constant and variable amplitude fatigue loading. In both cases, the evolving cracks at the monitored locations were successfully detected, autonomously, by our AD monitoring tool.
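A polynomial-fitting detector of the kind described can be sketched as follows, assuming (since the paper gives no implementation details) a degree-1 polynomial fitted over a sliding window, with an alarm when the fitted trend exceeds a baseline bound. The window size, slope bound, and strain values are invented.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = a + b*x (degree-1 polynomial)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def drift_alarms(signal, window=20, max_slope=0.05):
    """Fit a polynomial over a sliding window and flag windows whose
    trend (slope) exceeds a baseline bound, indicating slow degradation."""
    alarms = []
    xs = list(range(window))
    for start in range(len(signal) - window + 1):
        _, slope = linear_fit(xs, signal[start:start + window])
        if abs(slope) > max_slope:
            alarms.append(start)
    return alarms

stable = [100.0 + 0.2 * ((i * 7) % 3 - 1) for i in range(40)]   # noisy, no trend
drifting = stable + [100.0 + 0.5 * i for i in range(1, 21)]     # slow drift begins
print(drift_alarms(stable))  # []
print(drift_alarms(drifting))
```

Because the fit averages over the window, short noise spikes barely move the slope, while a slow monotonic change, the signature of crack growth at a strain gauge, accumulates into an alarm.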

  1. Urban Classification Techniques Using the Fusion of LiDAR and Spectral Data

    DTIC Science & Technology

    2012-09-01

Photogrammetry and Remote Sensing, 62, 43–63. Stein, D., Beaven, S., Hoff, L., Winter, E., Schaum, A., & Stocker, A. (2002). Anomaly detection from... TECHNIQUES USING THE FUSION OF LIDAR AND SPECTRAL DATA by Justin E. Mesina, September 2012. Thesis Advisor: Richard C. Olsen. Second... from shadow anomalies. The fused results, however, were not as accurate in differentiating trees from grasses as using only spectral results. Overall the

  2. Detecting an atomic clock frequency anomaly using an adaptive Kalman filter algorithm

    NASA Astrophysics Data System (ADS)

    Song, Huijie; Dong, Shaowu; Wu, Wenjun; Jiang, Meng; Wang, Weixiong

    2018-06-01

The abnormal frequencies of an atomic clock mainly comprise frequency jumps and frequency drift jumps, and atomic clock frequency anomaly detection is a key technique in time-keeping. The Kalman filter algorithm, as a linear optimal algorithm, has been widely used in real-time detection of abnormal frequencies. In order to obtain an optimal state estimation, the observation model and dynamic model of the Kalman filter algorithm should satisfy Gaussian white noise conditions; the detection performance is degraded if anomalies affect the observation model or dynamic model. The adaptive Kalman filter algorithm, applied here to clock frequency anomaly detection, uses the residuals given by the prediction to build an adaptive factor, and the predicted state covariance matrix is corrected in real time by this adaptive factor. The results show that the model error is reduced and the detection performance is improved. The effectiveness of the algorithm is verified on frequency jump simulations, frequency drift jump simulations, and measured atomic clock data using the chi-square test.
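The residual-driven adaptation above can be sketched in a simplified scalar form (a random-walk state model, not the paper's full filter): the squared normalized innovation both flags an anomaly and inflates the predicted covariance so the filter de-weights its model after a jump. The noise levels, threshold, and frequency series are invented.

```python
def adaptive_kalman_detect(measurements, q=1e-4, r=1e-2, c=3.0):
    """Scalar random-walk Kalman filter with a residual-based adaptive
    factor. Innovations beyond c standard deviations are flagged as
    frequency anomalies, and the predicted covariance is inflated by
    the squared normalized innovation (the 'adaptive factor')."""
    x, p = measurements[0], 1.0
    anomalies = []
    for k, z in enumerate(measurements[1:], start=1):
        p_pred = p + q                      # predict
        s = p_pred + r                      # innovation variance
        nu = z - x                          # innovation (residual)
        ratio = nu * nu / s                 # squared normalized innovation
        if ratio > c * c:
            anomalies.append(k)
            p_pred *= ratio / (c * c)       # adaptive factor: de-weight model
            s = p_pred + r
        g = p_pred / s                      # Kalman gain
        x = x + g * nu                      # update state
        p = (1 - g) * p_pred
    return anomalies

clock = [0.0] * 30 + [0.5] * 30             # frequency jump at sample 30
print(adaptive_kalman_detect(clock))
```

Without the inflation step the filter would trust its model too much after the jump, drag the state estimate slowly, and keep flagging many subsequent samples; the adaptive factor lets it re-converge quickly.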

  3. Advances in soil gas geochemical exploration for natural resources: Some current examples and practices

    NASA Astrophysics Data System (ADS)

    McCarthy, J. Howard, Jr.; Reimer, G. Michael

    1986-11-01

    Field studies have demonstrated that gas anomalies are found over buried mineral deposits. Abnormally high concentrations of sulfur gases and carbon dioxide and abnormally low concentrations of oxygen are commonly found over sulfide ore deposits. Helium anomalies are commonly associated with uranium deposits and geothermal areas. Helium and hydrocarbon gas anomalies have been detected over oil and gas deposits. Gases are sampled by extracting them from the pore space of soil, by degassing soil or rock, or by adsorbing them on artificial collectors. The two most widely used techniques for gas analysis are gas chromatography and mass spectrometry. The detection of gas anomalies at or near the surface may be an effective method to locate buried mineral deposits.

  4. Optimization of Archeological Anomalies using GIS method for Magnetic and Resistivity Study at Sungai Batu, Lembah Bujang, Kedah (Malaysia)

    NASA Astrophysics Data System (ADS)

    Yusoh, R.; Saad, R.; Saidin, M.; Anda, S. T.; Muhammad, S. B.; Ashraf, M. I. M.; Hazreek, Z. A. M.

    2018-04-01

Magnetic and resistivity methods have become reliable options in archeological exploration, and using the two together has become popular. However, the methods sense anomalies differently, and interpreting each set of anomalies directly yields a large coverage area for excavation. To overcome this issue, the two sets of anomalies can be combined using ArcGIS software to reduce the excavation coverage area. The case study, located at Sungai Batu, Lembah Bujang, near lot SB2ZZ where a buried clay-brick monument is expected, is well suited to this technique. Magnetic and resistivity surveys were carried out at the study area, where the anomaly coverage areas for the magnetic and resistivity methods were 531.5 m2 and 636 m2 respectively, giving a total anomaly area of 764 m2. By applying the combined technique, the anomaly area was reduced to 403.7 m2, a reduction of 47.16%. The area not suspected of containing the clay-brick monument increased from 15.86% to 55.54%, reducing the cost and labor of excavation.

  5. Detection and characterization of buried lunar craters with GRAIL data

    NASA Astrophysics Data System (ADS)

    Sood, Rohan; Chappaz, Loic; Melosh, Henry J.; Howell, Kathleen C.; Milbury, Colleen; Blair, David M.; Zuber, Maria T.

    2017-06-01

We used gravity mapping observations from NASA's Gravity Recovery and Interior Laboratory (GRAIL) to detect, characterize and validate the presence of large impact craters buried beneath the lunar maria. In this paper we focus on two prominent anomalies detected in the GRAIL data using the gravity gradiometry technique. Our detection strategy is applied to both free-air and Bouguer gravity field observations to identify gravitational signatures that are similar to those observed over buried craters. The presence of buried craters is further supported by individual analysis of regional free-air gravity anomalies, Bouguer gravity anomaly maps, and forward modeling. Our best candidate, for which we propose the informal name of Earhart Crater, is approximately 200 km in diameter and forms part of the northwestern rim of Lacus Somniorum. The other candidate, for which we propose the informal name of Ashoka Anomaly, is approximately 160 km in diameter and lies completely buried beneath Mare Tranquillitatis. Other large, still unrecognized, craters undoubtedly underlie other portions of the Moon's vast mare lavas.

  6. GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D

This paper introduces GraphPrints, a novel graph-analytic approach for detecting anomalies in network flow data. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets, small induced subgraphs that describe local topology. By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified, and furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level and 0.05% at the IP level, with 100% true positive rates at both.
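The pipeline above, graphlet counts per time slice followed by outlier detection on the count sequence, can be sketched with one graphlet type (the triangle) and a robust median-deviation test. This is a toy stand-in for GraphPrints, not its implementation; the edge lists and threshold are invented.

```python
from itertools import combinations

def triangle_count(edges):
    """Count triangles, a 3-node graphlet, in an undirected edge list."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return sum(1 for u, v, w in combinations(sorted(adj), 3)
               if v in adj[u] and w in adj[u] and w in adj[v])

def anomalous_slices(slices, k=3.0):
    """Flag time slices whose graphlet count deviates from the rest by
    more than k median absolute deviations (a robust outlier test)."""
    counts = [triangle_count(s) for s in slices]
    med = sorted(counts)[len(counts) // 2]
    mad = sorted(abs(c - med) for c in counts)[len(counts) // 2] or 1.0
    return [i for i, c in enumerate(counts) if abs(c - med) > k * mad]

normal = [(1, 2), (2, 3), (1, 3), (3, 4)]                   # one triangle
burst = normal + [(4, 5), (4, 6), (5, 6), (1, 4), (2, 4)]   # densified traffic
slices = [normal] * 6 + [burst]
print(anomalous_slices(slices))  # [6]
```

The full method counts many graphlet types to form a richer per-slice fingerprint, and applies the same idea per IP to single out misbehaving hosts.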

  7. Artificial intelligence techniques for ground test monitoring of rocket engines

    NASA Technical Reports Server (NTRS)

    Ali, Moonis; Gupta, U. K.

    1990-01-01

An expert system is being developed which can detect anomalies in Space Shuttle Main Engine (SSME) sensor data significantly earlier than the redline algorithm currently in use. The training of such an expert system focuses on two approaches which are based on low frequency and high frequency analyses of sensor data. Both approaches are being tested on data from SSME tests and their results compared with the findings of NASA and Rocketdyne experts. Prototype implementations have detected the presence of anomalies earlier than the redline algorithms currently in use. It therefore appears that these approaches have the potential of detecting anomalies early enough to shut down the engine or take other corrective action before severe damage to the engine occurs.

  8. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
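The time-series component described above can be sketched in an RST-flavored form (a simplification, not the MODVOLC or hybrid algorithm itself): build per-pixel reference statistics from a stack of same-month scenes, then flag pixels radiating several standard deviations above their own history rather than above a fixed global threshold. The scenes and temperatures below are invented.

```python
import statistics

def reference_stats(stack):
    """Per-pixel mean and standard deviation from a reference image
    stack (e.g. all scenes from the same calendar month)."""
    pixels = list(zip(*stack))
    return ([statistics.fmean(p) for p in pixels],
            [statistics.stdev(p) for p in pixels])

def thermal_anomalies(image, means, stds, k=3.0):
    """RST-style index: flag pixels radiating more than k standard
    deviations above their time-series reference."""
    return [i for i, v in enumerate(image)
            if stds[i] > 0 and (v - means[i]) / stds[i] > k]

# hypothetical 4-pixel scenes: brightness temperatures (K)
stack = [[290.0, 291.0, 289.5, 290.5],
         [291.0, 290.0, 290.5, 289.5],
         [289.5, 290.5, 290.0, 291.0],
         [290.5, 289.5, 291.0, 290.0]]
means, stds = reference_stats(stack)
new_scene = [290.2, 310.0, 290.1, 290.4]   # pixel 1 is an active hot spot
print(thermal_anomalies(new_scene, means, stds))  # [1]
```

Normalizing by each pixel's own history is what lets such methods catch low-temperature anomalies that a single global threshold, as in a simple point operation, would miss; the record's estimate of at least 80 images per calendar month reflects the need for stable reference statistics.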

  9. Microgravity and Electrical Resistivity Techniques for Detection of Caves and Clandestine Tunnels

    NASA Astrophysics Data System (ADS)

    Crawford, N. C.; Croft, L. A.; Cesin, G. L.; Wilson, S.

    2006-05-01

The Center for Cave and Karst Studies (CCKS) has been using microgravity to locate caves from the ground's surface since 1985. The geophysical subsurface investigations began during a period when explosive and toxic vapors were rising from the karst aquifer under Bowling Green into homes, businesses, and schools. The USEPA provided the funding for this Superfund Emergency, and the CCKS was able to drill numerous wells into low-gravity anomalies to confirm and even map the routes of caves in the underlying limestone bedrock. In every case, a low-gravity anomaly indicated a bedrock cave, a cave with a collapsed roof, or a location where a bedrock cave had collapsed and filled with alluvium. At numerous locations, several wells were cored into microgravity anomalies, and in every case additional wells were drilled on both sides of the anomalies to confirm that the technique was in fact reliable. The wells cored on both sides of the anomalies did not intersect caves but instead intersected virtually solid limestone. Microgravity also easily detected storm sewers and even sanitary sewers, sometimes six meters (twenty feet) beneath the surface. Microgravity has also been used on many occasions to investigate sinkhole collapses. It identified potential collapse areas by detecting voids in the unconsolidated material above bedrock. The CCKS has experimented with other geophysical techniques, particularly ground penetrating radar, seismic, and electrical resistivity. In the late 1990s the CCKS started using the Swift/Sting resistivity meter to perform karst geophysical subsurface investigations. The system provides good depth-to-bedrock data, but it is often difficult to interpret bedrock caves from the modeled data.
The approach typically used now by the CCKS for karst subsurface investigations is to run electrical resistivity traverses followed by microgravity over suspect areas identified in the modeled resistivity data. Some areas of high resistivity indicate caves, but others simply indicate pockets of dry limestone, and their signatures look virtually identical. Therefore, the CCKS performs microgravity over all suspect areas along the resistivity traverses. A low-gravity anomaly that corresponds with a high-resistivity anomaly indicates a cave location; a high-resistivity anomaly without a corresponding low-gravity anomaly indicates a pocket of dry limestone. Numerous cored wells have been drilled both into the anomalies and on both sides of them to confirm the cave locations and to establish that the technique is accurate. The September 11, 2001 World Trade Center catastrophe was the catalyst for the formation of a program within the CCKS to apply these techniques for locating bedrock caves and voids in unconsolidated materials to search and rescue and to locating clandestine tunnels. We are now in the third year of a grant from the Kentucky Science and Technology Center to develop a robot that will measure microgravity and apply other geophysical techniques. The robot has the potential for detecting clandestine tunnels under the U.S. border as well as military applications. The system will soon be tested over known tunnels and then in a blind test along a section of the U.S. border at Nogales, Arizona.

  10. A function approximation approach to anomaly detection in propulsion system test data

    NASA Technical Reports Server (NTRS)

    Whitehead, Bruce A.; Hoyt, W. A.

    1993-01-01

Ground test data from propulsion systems such as the Space Shuttle Main Engine (SSME) can be automatically screened for anomalies by a neural network. The neural network screens data after being trained with nominal data only. Given the values of 14 measurements reflecting external influences on the SSME at a given time, the neural network predicts the expected nominal value of a desired engine parameter at that time. We compared the ability of three different function-approximation techniques to perform this nominal value prediction: a novel neural network architecture based on Gaussian bar basis functions, a conventional back propagation neural network, and linear regression. These three techniques were tested with real data from six SSME ground tests containing two anomalies. The basis function network trained more rapidly than back propagation. It yielded nominal predictions with a confidence interval tight enough to distinguish anomalous deviations from the nominal fluctuations in an engine parameter. Since the function-approximation approach requires nominal training data only, it is capable of detecting unknown classes of anomalies for which training data are not available.
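The scheme above, predict the nominal value from external influences, then flag observations outside a confidence band around the prediction, can be sketched with the simplest of the three approximators (linear regression) on a single influence variable. The training data, band width, and values are invented.

```python
import statistics

def train_nominal_model(influences, parameter):
    """Fit a nominal-value predictor (1-feature least squares) and record
    the spread of nominal residuals for a confidence band."""
    mx, my = statistics.fmean(influences), statistics.fmean(parameter)
    b = sum((x - mx) * (y - my) for x, y in zip(influences, parameter)) \
        / sum((x - mx) ** 2 for x in influences)
    a = my - b * mx
    resid_sd = statistics.stdev(y - (a + b * x)
                                for x, y in zip(influences, parameter))
    return a, b, resid_sd

def detect(influence, observed, model, k=4.0):
    """Anomalous if the observation leaves the nominal confidence band."""
    a, b, resid_sd = model
    return abs(observed - (a + b * influence)) > k * resid_sd

influences = [float(i) for i in range(1, 9)]
parameter = [2 * x + 5 + 0.1 * (-1) ** i for i, x in enumerate(influences)]
model = train_nominal_model(influences, parameter)
print(detect(4.0, 13.0, model))  # False (within nominal band)
print(detect(4.0, 14.5, model))  # True  (anomalous deviation)
```

Because only nominal data is needed for training, any sufficiently large residual is reported, including anomaly classes never seen before, which is exactly the property the record highlights.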

  11. Unsupervised Anomaly Detection Based on Clustering and Multiple One-Class SVM

    NASA Astrophysics Data System (ADS)

    Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Kwon, Yongjin

Intrusion detection systems (IDS) have played an important role as devices to defend our networks from cyber attacks. However, since an IDS is unable to detect unknown attacks, i.e., 0-day attacks, the ultimate challenge in the intrusion detection field is how to identify such attacks exactly in an automated manner. Over the past few years, several studies have addressed these problems through anomaly detection using unsupervised learning techniques such as clustering, one-class support vector machines (SVM), etc. Although these enable one to construct intrusion detection models at low cost and effort, and have the capability to detect unforeseen attacks, they still suffer from two main problems in intrusion detection: a low detection rate and a high false positive rate. In this paper, we propose a new anomaly detection method based on clustering and multiple one-class SVMs in order to improve the detection rate while maintaining a low false positive rate. We evaluated our method using the KDD Cup 1999 data set. Evaluation results show that our approach outperforms the existing algorithms reported in the literature, especially in detecting unknown attacks.
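The two-stage structure, cluster the normal traffic, then fit one one-class boundary per cluster, can be sketched as follows. To stay self-contained, a radius threshold per cluster stands in for the one-class SVM, and the tiny k-means, the 2-D points, and the k-sigma radius are all invented for illustration.

```python
import math
import statistics

def kmeans(points, centers, iters=10):
    """Tiny k-means: assign each point to its nearest centre,
    then recompute centres from the assignments."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)), key=lambda j: math.dist(p, centers[j]))
            groups[i].append(p)
        centers = [tuple(statistics.fmean(c) for c in zip(*g)) for g in groups]
    return centers, groups

def fit_boundaries(groups, centers, k=3.0):
    """One boundary per cluster (a radius threshold stands in here
    for the per-cluster one-class SVM of the paper)."""
    radii = []
    for g, c in zip(groups, centers):
        ds = [math.dist(p, c) for p in g]
        radii.append(statistics.fmean(ds) + k * statistics.stdev(ds))
    return radii

def is_attack(p, centers, radii):
    """A point outside every cluster's boundary is flagged as anomalous."""
    return all(math.dist(p, c) > r for c, r in zip(centers, radii))

cluster_a = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
             (-0.1, 0.0), (0.0, -0.1), (0.1, 0.1)]
cluster_b = [(x + 5, y + 5) for x, y in cluster_a]
centers, groups = kmeans(cluster_a + cluster_b, [(0.0, 0.0), (5.0, 5.0)])
radii = fit_boundaries(groups, centers)
print(is_attack((10.0, 10.0), centers, radii))  # True
print(is_attack((0.05, 0.05), centers, radii))  # False
```

Fitting one boundary per cluster, rather than a single global one, is what lets the method keep tight boundaries around multi-modal normal traffic and thus lower the false positive rate.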

  12. An Investigation of State-Space Model Fidelity for SSME Data

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2008-01-01

In previous studies, a variety of unsupervised anomaly detection techniques were applied to SSME (Space Shuttle Main Engine) data. The observed results indicated that the identification of certain anomalies was specific to the algorithmic method under consideration. This is the reason why one of the follow-on goals of these previous investigations was to build an architecture to support the best capabilities of all algorithms. We appeal to that goal here by investigating a cascade, serial architecture for the best performing and most suitable candidates from previous studies. As a precursor to a formal ROC (Receiver Operating Characteristic) curve analysis for validation of resulting anomaly detection algorithms, our primary focus here is to investigate the model fidelity as measured by variants of the AIC (Akaike Information Criterion) for state-space based models. We show that placing constraints on a state-space model during or after the training of the model introduces a modest level of suboptimality. Furthermore, we compare the fidelity of all candidate models, including those embodying the cascade, serial architecture. We make recommendations on the most suitable candidates for application to subsequent anomaly detection studies as measured by AIC-based criteria.

  13. Anomaly detection for machine learning redshifts applied to SDSS galaxies

    NASA Astrophysics Data System (ADS)

    Hoyle, Ben; Rau, Markus Michael; Paech, Kerstin; Bonnett, Christopher; Seitz, Stella; Weller, Jochen

    2015-10-01

    We present an analysis of anomaly detection for machine learning redshift estimation. Anomaly detection allows the removal of poor training examples, which can adversely influence redshift estimates. Anomalous training examples may be photometric galaxies with incorrect spectroscopic redshifts, or galaxies with one or more poorly measured photometric quantities. We select 2.5 million `clean' SDSS DR12 galaxies with reliable spectroscopic redshifts, and 6730 `anomalous' galaxies with spectroscopic redshift measurements which are flagged as unreliable. We contaminate the clean base galaxy sample with galaxies with unreliable redshifts and attempt to recover the contaminating galaxies using the Elliptical Envelope technique. We then train four machine learning architectures for redshift analysis on both the contaminated sample and on the preprocessed `anomaly-removed' sample and measure redshift statistics on a clean validation sample generated without any preprocessing. We find an improvement on all measured statistics of up to 80 per cent when training on the anomaly-removed sample as compared with training on the contaminated sample for each of the machine learning routines explored. We further describe a method to estimate the contamination fraction of a base data sample.
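
    The Elliptical Envelope idea, fit an ellipse to the bulk of the data and discard points far outside it, can be sketched as follows. This is a simplified, non-robust variant (plain sample mean/covariance plus a Mahalanobis-distance cutoff) on 2-D synthetic points rather than SDSS photometry; scikit-learn's `EllipticEnvelope` additionally uses a robust covariance estimate.

```python
import random

def mean_cov(X):
    """Sample mean and 2x2 covariance of 2-D points."""
    n = len(X)
    mx = sum(p[0] for p in X) / n
    my = sum(p[1] for p in X) / n
    sxx = sum((p[0] - mx) ** 2 for p in X) / n
    syy = sum((p[1] - my) ** 2 for p in X) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in X) / n
    return (mx, my), (sxx, sxy, syy)

def mahalanobis2(p, mean, cov):
    """Squared Mahalanobis distance using the closed-form 2x2 inverse."""
    dx, dy = p[0] - mean[0], p[1] - mean[1]
    sxx, sxy, syy = cov
    det = sxx * syy - sxy * sxy
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

rng = random.Random(42)
clean = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(300)]
contaminants = [(rng.uniform(6, 9), rng.uniform(6, 9)) for _ in range(10)]
sample = clean + contaminants

mean, cov = mean_cov(sample)
# ~99% chi-square(2) cutoff; points beyond it are removed before training
flagged = [p for p in sample if mahalanobis2(p, mean, cov) > 9.21]
```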

  14. Reliable detection of fluence anomalies in EPID-based IMRT pretreatment quality assurance using pixel intensity deviations

    PubMed Central

    Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.

    2012-01-01

    Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ∼1 mm2 areas and ≥2% in ∼20 mm2 areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified. PMID:22894421
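
    The PID strategy can be sketched on a toy image pair: deviations are expressed in percent of the reference image, a 3×3 median filter suppresses background noise, and a threshold flags the anomaly. The grid size, noise level, and 2% threshold below are illustrative assumptions, not the paper's EPID parameters.

```python
import random
import statistics

W = H = 20
rng = random.Random(7)
reference = [[100.0 for _ in range(W)] for _ in range(H)]
measured = [[100.0 + rng.gauss(0, 0.5) for _ in range(W)] for _ in range(H)]
# Inject a 3x3, +5% fluence anomaly centered on (10, 10)
for i in range(9, 12):
    for j in range(9, 12):
        measured[i][j] += 5.0

def pid(meas, ref):
    """Pixel intensity deviation, in percent of the reference."""
    return [[100.0 * (meas[i][j] - ref[i][j]) / ref[i][j]
             for j in range(W)] for i in range(H)]

def median3(img):
    """3x3 median filter (edges left untouched) to suppress background noise."""
    out = [row[:] for row in img]
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            out[i][j] = statistics.median(
                img[a][b] for a in (i - 1, i, i + 1) for b in (j - 1, j, j + 1))
    return out

dev = median3(pid(measured, reference))
hits = [(i, j) for i in range(H) for j in range(W) if abs(dev[i][j]) >= 2.0]
```

    The median filter erodes the anomaly's corners but keeps its core well above the threshold, while single noisy pixels are wiped out entirely.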

  15. Spatial-temporal event detection in climate parameter imagery.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKenna, Sean Andrew; Gutierrez, Karen A.

    Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and in Southern India.
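
    A minimal form of the regression-based approach: fit a linear trend to each pixel's time series and flag time steps whose residual exceeds a z-score cutoff. The NDVI-like series below is synthetic; the 3-sigma threshold is a common illustrative choice, not the paper's parametric-mapping statistic.

```python
import random

def linfit(ts):
    """Ordinary least squares y = a + b*t for a regularly sampled series."""
    n = len(ts)
    t_mean = (n - 1) / 2.0
    y_mean = sum(ts) / n
    b = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(ts)) / \
        sum((t - t_mean) ** 2 for t in range(n))
    return y_mean - b * t_mean, b

def anomalies(ts, z=3.0):
    """Indices whose residual from the fitted trend exceeds z residual-sigmas."""
    a, b = linfit(ts)
    res = [y - (a + b * t) for t, y in enumerate(ts)]
    sd = (sum(r * r for r in res) / len(res)) ** 0.5
    return [t for t, r in enumerate(res) if abs(r) > z * sd]

rng = random.Random(3)
# One pixel's weekly NDVI: slow greening trend plus noise
ndvi = [0.30 + 0.002 * t + rng.gauss(0, 0.01) for t in range(52)]
ndvi[30] -= 0.15   # simulated one-week vegetation drop
```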

  16. Identifying High-Risk Patients without Labeled Training Data: Anomaly Detection Methodologies to Predict Adverse Outcomes

    PubMed Central

    Syed, Zeeshan; Saeed, Mohammed; Rubinfeld, Ilan

    2010-01-01

    For many clinical conditions, only a small number of patients experience adverse outcomes. Developing risk stratification algorithms for these conditions typically requires collecting large volumes of data to capture enough positive and negative examples for training. This process is slow, expensive, and may not be appropriate for new phenomena. In this paper, we explore different anomaly detection approaches to identify high-risk patients as cases that lie in sparse regions of the feature space. We study three broad categories of anomaly detection methods: classification-based, nearest neighbor-based, and clustering-based techniques. When evaluated on data from the National Surgical Quality Improvement Program (NSQIP), these methods were able to successfully identify patients at an elevated risk of mortality and rare morbidities following inpatient surgical procedures. PMID:21347083
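
    The nearest neighbor-based category reduces to a simple rule: a case is high-risk if it sits far from the bulk of previously seen cases. The sketch below uses a 1-D synthetic feature as a stand-in for NSQIP variables; the k = 5 and top-1% threshold are illustrative assumptions.

```python
import random

def knn_score(x, train, k=5):
    """Anomaly score = mean distance to the k nearest training points;
    cases in sparse regions of the feature space score high."""
    d = sorted(abs(x - t) for t in train)
    return sum(d[:k]) / k

rng = random.Random(0)
# Synthetic 1-D stand-in for a physiological feature of benign cases
train = [rng.gauss(50, 5) for _ in range(300)]
scores = sorted(knn_score(t, train) for t in train)
threshold = scores[int(0.99 * (len(scores) - 1))]   # top-1% in-sample score
```

    No labeled adverse outcomes are needed: the threshold is set purely from how the unlabeled training population is distributed.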

  17. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
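
    The central quantity, a normalized contrast evolution whose peak time encodes anomaly depth, can be illustrated with a deliberately crude surrogate. The functional form below is invented for illustration (a rise-and-fall curve peaking at a characteristic time t_star that stands in for the flat-bottom-hole depth scale); it is not the paper's calibrated simulation.

```python
import math

def contrast(t, t_star, amp=0.2):
    """Toy normalized-contrast evolution: rises while heat reaches the
    anomaly, then decays, peaking near the characteristic time t_star."""
    return amp * (t / t_star) * math.exp(1.0 - t / t_star)

def peak_time(t_star, times):
    """Time of maximum contrast on a discrete sampling grid."""
    return times[max(range(len(times)), key=lambda i: contrast(times[i], t_star))]

times = [0.01 * (i + 1) for i in range(1000)]
shallow_peak = peak_time(0.5, times)   # shallow anomaly: early contrast peak
deep_peak = peak_time(2.0, times)      # deeper anomaly: later contrast peak
```

    Matching the measured peak time against a family of such simulated curves is, in spirit, how the EFBH/EUG depth estimate is read off.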

  18. Hyperspectral target detection using heavy-tailed distributions

    NASA Astrophysics Data System (ADS)

    Willis, Chris J.

    2009-09-01

    One promising approach to target detection in hyperspectral imagery exploits a statistical mixture model to represent scene content at a pixel level. The process then goes on to look for pixels which are rare, when judged against the model, and marks them as anomalies. It is assumed that military targets will themselves be rare and therefore likely to be detected amongst these anomalies. For the typical assumption of multivariate Gaussianity for the mixture components, the presence of the anomalous pixels within the training data will have a deleterious effect on the quality of the model. In particular, the derivation process itself is adversely affected by the attempt to accommodate the anomalies within the mixture components. This will bias the statistics of at least some of the components away from their true values and towards the anomalies. In many cases this will result in a reduction in the detection performance and an increased false alarm rate. This paper considers the use of heavy-tailed statistical distributions within the mixture model. Such distributions are better able to account for anomalies in the training data within the tails of their distributions, and the balance of the pixels within their central masses. This means that an improved model of the majority of the pixels in the scene may be produced, ultimately leading to a better anomaly detection result. The anomaly detection techniques are examined using both synthetic data and hyperspectral imagery with injected anomalous pixels. A range of results is presented for the baseline Gaussian mixture model and for models accommodating heavy-tailed distributions, for different parameterizations of the algorithms. These include scene understanding results, anomalous pixel maps at given significance levels and Receiver Operating Characteristic curves.
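
    The key property exploited above, that heavy-tailed fits down-weight anomalies instead of letting them bias the statistics, can be shown with a single-component Student-t location estimate (an iteratively reweighted, EM-style scheme). This is a hedged simplification, not the paper's full mixture model; data are synthetic 1-D pixel values.

```python
import random

def t_robust_mean(xs, nu=3.0, iters=50):
    """EM-style location estimate for a Student-t with nu degrees of
    freedom: each point gets weight w = (nu+1)/(nu + d^2/s^2), so
    far-out anomalies are progressively down-weighted."""
    mu = sum(xs) / len(xs)
    scale2 = sum((x - mu) ** 2 for x in xs) / len(xs)
    for _ in range(iters):
        w = [(nu + 1) / (nu + (x - mu) ** 2 / scale2) for x in xs]
        mu = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
        scale2 = sum(wi * (x - mu) ** 2 for wi, x in zip(w, xs)) / len(xs)
    return mu

rng = random.Random(5)
pixels = [rng.gauss(10.0, 1.0) for _ in range(500)] + [30.0] * 25  # ~5% anomalies
gauss_mean = sum(pixels) / len(pixels)   # Gaussian fit: pulled toward anomalies
robust_mean = t_robust_mean(pixels)      # t fit: stays near the background
```

    The Gaussian estimate is visibly biased toward the injected anomalies, exactly the training-data contamination effect the paper describes, while the t estimate keeps the background statistics nearly intact.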

  19. Detection of Anomalies in Hydrometric Data Using Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Lauzon, N.; Lence, B. J.

    2002-12-01

    This work focuses on the detection of anomalies in hydrometric data sequences, such as 1) outliers, which are individual data having statistical properties that differ from those of the overall population; 2) shifts, which are sudden changes over time in the statistical properties of the historical records of data; and 3) trends, which are systematic changes over time in the statistical properties. For the purpose of the design and management of water resources systems, it is important to be aware of these anomalies in hydrometric data, for they can induce a bias in the estimation of water quantity and quality parameters. These anomalies may be viewed as specific patterns affecting the data, and therefore pattern recognition techniques can be used for identifying them. However, the number of possible patterns is very large for each type of anomaly and consequently large computing capacities are required to account for all possibilities using the standard statistical techniques, such as cluster analysis. Artificial intelligence techniques, such as the Kohonen neural network and fuzzy c-means, are clustering techniques commonly used for pattern recognition in several areas of engineering and have recently begun to be used for the analysis of natural systems. They require much less computing capacity than the standard statistical techniques, and therefore are well suited for the identification of outliers, shifts and trends in hydrometric data. This work constitutes a preliminary study, using synthetic data representing hydrometric data that can be found in Canada. The analysis of the results obtained shows that the Kohonen neural network and fuzzy c-means are reasonably successful in identifying anomalies. This work also addresses the problem of uncertainties inherent to the calibration procedures that fit the clusters to the possible patterns for both the Kohonen neural network and fuzzy c-means. 
Indeed, for the same database, different sets of clusters can be established with these calibration procedures. A simple method for analyzing uncertainties associated with the Kohonen neural network and fuzzy c-means is developed here. The method combines the results from several sets of clusters, either from the Kohonen neural network or fuzzy c-means, so as to provide an overall diagnosis as to the identification of outliers, shifts and trends. The results indicate an improvement in the performance for identifying anomalies when the method of combining cluster sets is used, compared with when only one cluster set is used.
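
    Fuzzy c-means itself is compact enough to sketch in full. The block below is a generic 1-D implementation on synthetic flow-like data, not the authors' calibrated setup: records whose highest membership falls in the small, distant cluster play the role of outliers.

```python
import random

def fuzzy_cmeans(xs, c=2, m=2.0, iters=50):
    """Fuzzy c-means on 1-D data; returns cluster centers and memberships
    u[i][k] (fuzzifier m controls how soft the memberships are)."""
    lo, hi = min(xs), max(xs)
    centers = [lo + (k + 0.5) * (hi - lo) / c for k in range(c)]
    u = [[0.0] * c for _ in xs]
    for _ in range(iters):
        for i, x in enumerate(xs):
            d = [abs(x - ck) + 1e-12 for ck in centers]
            for k in range(c):
                u[i][k] = 1.0 / sum((d[k] / dj) ** (2.0 / (m - 1.0)) for dj in d)
        for k in range(c):
            num = sum(u[i][k] ** m * x for i, x in enumerate(xs))
            den = sum(u[i][k] ** m for i, _ in enumerate(xs))
            centers[k] = num / den
    return centers, u

rng = random.Random(1)
# Synthetic weekly flows with a handful of shifted (outlier-like) records
flows = [rng.gauss(100, 5) for _ in range(100)] + \
        [rng.gauss(180, 5) for _ in range(8)]
centers, u = fuzzy_cmeans(flows)
labels = [max(range(2), key=lambda k: u[i][k]) for i in range(len(flows))]
outlier_cluster = max(range(2), key=lambda k: centers[k])
```

    Running the same procedure from several initializations and combining the resulting cluster sets is, in essence, the uncertainty-handling method the abstract describes.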

  20. Detection of Landmines by Neutron Backscattering: Effects of Soil Moisture on the Detection System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baysoy, D. Y.; Subasi, M.

    2010-01-21

    Detection of buried landmines using the neutron backscattering (NBS) technique is a well-established method. It depends on detecting a hydrogen anomaly in dry soil. Since a landmine and its plastic casing contain many more hydrogen atoms than dry soil, this anomaly can be detected by observing a rise in the number of neutrons moderated to thermal or epithermal energies. However, the presence of moisture in the soil limits the effectiveness of the measurements. In this work, a landmine detection system using the NBS technique was designed. A series of Monte Carlo calculations was carried out to determine the limits of the system due to the moisture content of the soil. In the simulations, an isotropic fast neutron source (²⁵²Cf, 100 μg) and a neutron detection system consisting of five ³He detectors were used in a practicable geometry. To see the effects of soil moisture on the efficiency of the detection system, soils with different water contents were tested.
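
    The physics can be caricatured in a toy 1-D Monte Carlo, which is emphatically not the paper's transport simulation: a fast neutron random-walks in depth, each collision with hydrogen (probability given by a crude hydrogen fraction) halves its "energy" while collisions with heavier soil nuclei are treated as lossless, and neutrons returning to the surface below an energy threshold count as thermalized backscatter. All constants are illustrative assumptions.

```python
import random

def thermal_backscatter(h_fraction, n_neutrons=20000, seed=9):
    """Toy 1-D Monte Carlo of neutron moderation and backscatter."""
    rng = random.Random(seed)
    counts = 0
    for _ in range(n_neutrons):
        depth, energy = 0, 1.0
        for _ in range(60):                # bounded history length
            depth += rng.choice((-1, 1))
            if depth < 0:                  # back at the surface/detector
                if energy < 0.05:          # reached only after >= 5 H collisions
                    counts += 1
                break
            if rng.random() < h_fraction:  # collision with a hydrogen nucleus
                energy *= 0.5
    return counts

dry_soil = thermal_backscatter(h_fraction=0.05)
over_mine = thermal_backscatter(h_fraction=0.40)  # hydrogen-rich mine casing
```

    Raising the hydrogen fraction of the medium, as soil moisture does, inflates the thermal return even without a mine, which is exactly the confounding effect the paper quantifies.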

  1. Magnetic Resonance Imaging of Developmental Anomalies of the Uterus and the Vagina in Pediatric Patients.

    PubMed

    Gould, Sharon W; Epelman, Monica

    2015-08-01

    Developmental anomalies of the uterus and the vagina are associated with infertility and miscarriage and are most commonly detected in the postpubertal age-group. These conditions may also present in younger patients as a mass or pain owing to obstruction of the uterus or the vagina. Associated urinary tract anomalies are common, as well. Accurate diagnosis and thorough description of these anomalies is essential for appropriate management; however, evaluation may be difficult in an immature reproductive tract. Magnetic resonance imaging technique pertinent to imaging of the pediatric female reproductive tract is presented and illustrated along with the findings associated with various anomalies. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Model-Biased, Data-Driven Adaptive Failure Prediction

    NASA Technical Reports Server (NTRS)

    Leen, Todd K.

    2004-01-01

    This final report, which contains a research summary and a viewgraph presentation, addresses clustering and data simulation techniques for failure prediction. The researchers applied their techniques to both helicopter gearbox anomaly detection and segmentation of Earth Observing System (EOS) satellite imagery.

  3. Multimodal noninvasive and invasive imaging of extracranial venous abnormalities indicative of CCSVI: Results of the PREMiSe pilot study

    PubMed Central

    2013-01-01

    Background There is no established noninvasive or invasive diagnostic imaging modality at present that can serve as a ‘gold standard’ or “benchmark” for the detection of the venous anomalies, indicative of chronic cerebrospinal venous insufficiency (CCSVI). We investigated the sensitivity and specificity of 2 invasive vs. 2 noninvasive imaging techniques for the detection of extracranial venous anomalies in the internal jugular veins (IJVs) and azygos vein/vertebral veins (VVs) in patients with multiple sclerosis (MS). Methods The data for this multimodal imaging comparison pilot study was collected in phase 2 of the “Prospective Randomized Endovascular therapy in Multiple Sclerosis” (PREMiSe) study using standardized imaging techniques. Thirty MS subjects were screened initially with Doppler sonography (DS), out of which 10 did not fulfill noninvasive screening procedure requirements on DS that consisted of ≥2 venous hemodynamic extracranial criteria. Accordingly, 20 MS patients with relapsing MS were enrolled into the multimodal diagnostic imaging study. For magnetic resonance venography (MRV), IJVs abnormal findings were considered absent or pinpoint flow, whereas abnormal VVs flow was classified as absent. Abnormalities of the VVs were determined only using non-invasive testing. Catheter venography (CV) was considered abnormal when ≥50% lumen restriction was detected, while intravascular ultrasound (IVUS) was considered abnormal when ≥50% restriction of the lumen or intra-luminal defects or reduced pulsatility was found. Non-invasive and invasive imaging modality comparisons between left, right and total IJVs and between the VVs and azygos vein were performed. Because there is no reliable way of non-invasively assessing the azygos vein, the VVs abnormalities detected by the non-invasive testing were compared to the azygos abnormalities detected by the invasive testing. 
All image modalities were analyzed in a blinded manner by more than one viewer, upon which consensus was reached. The sensitivity and specificity were calculated using contingency tables denoting the presence or absence of vein-specific abnormality findings between all imaging modalities used individually as the benchmark. Results The sensitivity of CV + IVUS was 68.4% for the right and 90% for the left IJV and 85.7% for the azygos vein/VVs, compared to venous anomalies detected on DS. Compared to the venous anomalies detected on MRV, the sensitivity of CV + IVUS was 71.4% in right and 100% in left IJVs and 100% in the azygos vein/VVs; however, the specificity was 38.5%, 38.9% and 11.8%, respectively. The sensitivity between the two invasive imaging techniques, used as benchmarks, ranged from 72.7% for the right IJV to 90% for the azygos vein but the IVUS showed a higher rate of venous anomalies than the CV. There was excellent correspondence between identifying collateral veins on MRV and CV. Conclusions Noninvasive DS screening for the detection of venous anomalies indicative of CCSVI may be a reliable approach for identifying patients eligible for further multimodal invasive imaging testing of the IJVs. However, the noninvasive screening methods were inadequate to depict the total amount of azygos vein/VVs anomalies identified with invasive testing. This pilot study, with limited sample size, shows that both a non-invasive and invasive multimodal imaging diagnostic approach should be recommended to depict a range of extracranial venous anomalies indicative of CCSVI. However, the lack of invasive testing on the study subjects whose results were negative on the DS screening and of healthy controls limits the further generalizability of our findings. In addition, the findings from the 2 invasive techniques confirmed the existence of severe extracranial venous anomalies that significantly impaired normal blood outflow from the brain in this group of MS patients. 
PMID:24139135

  4. Thermal and TEC anomalies detection using an intelligent hybrid system around the time of the Saravan, Iran, (Mw = 7.7) earthquake of 16 April 2013

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2014-02-01

    A powerful earthquake of Mw = 7.7 struck the Saravan region (28.107° N, 62.053° E) in Iran on 16 April 2013. To date, the design of an automated anomaly detection method for nonlinear time series of earthquake precursors has remained an attractive and challenging task. Artificial Neural Networks (ANN) and Particle Swarm Optimization (PSO) have revealed strong potential for accurate time series prediction. This paper presents the first study of an integration of the ANN and PSO methods in the research of earthquake precursors to detect unusual variations of the thermal and total electron content (TEC) seismo-ionospheric anomalies induced by the strong Saravan earthquake. In this study, to overcome stagnation in local minima during ANN training, PSO is used as the optimization method instead of traditional training algorithms. The proposed hybrid method detected a considerable number of anomalies 4 and 8 days preceding the earthquake. Since, in this case study, ionospheric TEC anomalies induced by seismic activity are easily confounded with background fluctuations due to solar activity, a multi-resolution time series processing technique based on the wavelet transform has been applied to the TEC signal variations. Since agreement among the final results of several robust methods is a convincing indication of a method's efficiency, the thermal and TEC anomalies detected using the ANN + PSO method were compared with the anomalies observed by implementing the mean, median, wavelet, Kalman filter, Auto-Regressive Integrated Moving Average (ARIMA), Support Vector Machine (SVM) and Genetic Algorithm (GA) methods. The results indicate that the ANN + PSO method is quite promising and deserves serious attention as a new tool for thermal and TEC seismo-anomaly detection.
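
    The PSO component is easy to isolate. The block below is a minimal particle swarm minimizing a toy quadratic that stands in for an ANN training loss; the swarm parameters (inertia 0.7, c1 = c2 = 1.5) are common textbook defaults, not the paper's settings.

```python
import random

def pso(f, dim, n_particles=30, iters=200, seed=0,
        w=0.7, c1=1.5, c2=1.5, span=5.0):
    """Minimal particle swarm optimization minimizing f over R^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-span, span) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

def toy_loss(p):
    """Stand-in for an ANN training loss with known minimum at (1, -2)."""
    return (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2

best, best_val = pso(toy_loss, dim=2)
```

    Because every particle is attracted both to its own best and to the swarm's best position, the search is less prone to the local-minimum stagnation that motivates replacing gradient-based ANN training in the paper.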

  5. Processing the Bouguer anomaly map of Biga and the surrounding area by the cellular neural network: application to the southwestern Marmara region

    NASA Astrophysics Data System (ADS)

    Aydogan, D.

    2007-04-01

    An image processing technique called the cellular neural network (CNN) approach is used in this study to locate geological features giving rise to gravity anomalies such as faults or the boundary of two geologic zones. CNN is a stochastic image processing technique based on template optimization using the neighborhood relationships of cells. These cells can be characterized by a functional block diagram that is typical of neural network theory. The functionality of CNN is described in its entirety by a number of small matrices (A, B and I) called the cloning template. CNN can also be considered to be a nonlinear convolution of these matrices. This template describes the strength of the nearest neighbor interconnections in the network. The recurrent perceptron learning algorithm (RPLA) is used in optimization of cloning template. The CNN and standard Canny algorithms were first tested on two sets of synthetic gravity data with the aim of checking the reliability of the proposed approach. The CNN method was compared with classical derivative techniques by applying the cross-correlation method (CC) to the same anomaly map as this latter approach can detect some features that are difficult to identify on the Bouguer anomaly maps. This approach was then applied to the Bouguer anomaly map of Biga and its surrounding area, in Turkey. Structural features in the area between Bandirma, Biga, Yenice and Gonen in the southwest Marmara region are investigated by applying the CNN and CC to the Bouguer anomaly map. Faults identified by these algorithms are generally in accordance with previously mapped surface faults. These examples show that the geologic boundaries can be detected from Bouguer anomaly maps using the cloning template approach. A visual evaluation of the outputs of the CNN and CC approaches is carried out, and the results are compared with each other. 
This approach provides quantitative solutions based on just a few assumptions, which makes the method more powerful than the classical methods.
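
    Stripped of its feedback dynamics, the CNN's feedforward B template acts as a convolution of the anomaly map, and boundary detection reduces to the sketch below. The Sobel-style gradient template is an illustrative choice, not an RPLA-optimized cloning template, and the step-shaped grid is a synthetic stand-in for a Bouguer map.

```python
def convolve(grid, template):
    """3x3 convolution of the anomaly map with a feedforward template B
    (the linear core of a cellular neural network cell)."""
    h, w = len(grid), len(grid[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i][j] = sum(template[a + 1][b + 1] * grid[i + a][j + b]
                            for a in (-1, 0, 1) for b in (-1, 0, 1))
    return out

# Synthetic Bouguer-like map: a density contrast across a vertical boundary
grid = [[10.0 if j < 8 else 25.0 for j in range(16)] for i in range(16)]
edge_template = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal-gradient B
response = convolve(grid, edge_template)
boundary = [(i, j) for i in range(16) for j in range(16)
            if abs(response[i][j]) > 1.0]
```

    Only the two pixel columns straddling the density step respond, which is exactly the kind of geologic-boundary trace the paper extracts from real anomaly maps.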

  6. Anomaly Detection in Gamma-Ray Vehicle Spectra with Principal Components Analysis and Mahalanobis Distances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tardiff, Mark F.; Runkle, Robert C.; Anderson, K. K.

    2006-01-23

    The goal of primary radiation monitoring in support of routine screening and emergency response is to detect characteristics in vehicle radiation signatures that indicate the presence of potential threats. Two conceptual approaches to analyzing gamma-ray spectra for threat detection are isotope identification and anomaly detection. While isotope identification is the time-honored method, an emerging technique is anomaly detection, which uses benign vehicle gamma-ray signatures to define an expectation of the radiation signature for vehicles that do not pose a threat. Newly acquired spectra are then compared to this expectation using statistical criteria that reflect acceptable false alarm rates and probabilities of detection. The gamma-ray spectra analyzed here were collected at a U.S. land Port of Entry (POE) using a NaI-based radiation portal monitor (RPM). The raw data were analyzed to develop a benign vehicle expectation by decimating the original pulse-height channels to 35 energy bins, extracting composite variables via principal components analysis (PCA), and estimating statistically weighted distances from the mean vehicle spectrum with the Mahalanobis distance (MD) metric. This paper reviews the methods used to establish the anomaly identification criteria and presents a systematic analysis of the response of the combined PCA and MD algorithm to modeled mono-energetic gamma-ray sources.
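
    A scaled-down sketch of the PCA-plus-distance pipeline: synthetic 3-bin "spectra" stand in for the 35 energy bins, a single principal component is extracted by power iteration, and the anomaly score is a standardized distance along it. This is a simplification of the paper's multi-component PCA + Mahalanobis-distance scheme, with all data invented for illustration.

```python
import random

def covariance(X):
    """Sample mean and covariance of a list of equal-length vectors."""
    n, d = len(X), len(X[0])
    mu = [sum(x[k] for x in X) / n for k in range(d)]
    C = [[sum((x[a] - mu[a]) * (x[b] - mu[b]) for x in X) / n
          for b in range(d)] for a in range(d)]
    return mu, C

def first_pc(C, iters=200):
    """Leading eigenvector of C by power iteration."""
    d = len(C)
    v = [1.0] * d
    for _ in range(iters):
        v = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]
    return v

rng = random.Random(0)
# Benign spectra: 3 energy bins that co-vary with overall vehicle intensity
benign = []
for _ in range(400):
    s = rng.gauss(100, 10)
    benign.append([s + rng.gauss(0, 2),
                   0.5 * s + rng.gauss(0, 2),
                   0.2 * s + rng.gauss(0, 2)])

mu, C = covariance(benign)
pc = first_pc(C)
proj = [sum(pc[k] * (x[k] - mu[k]) for k in range(3)) for x in benign]
sd = (sum(p * p for p in proj) / len(proj)) ** 0.5

def anomaly_score(x):
    """Distance along the leading component, in benign standard deviations."""
    return abs(sum(pc[k] * (x[k] - mu[k]) for k in range(3))) / sd

threat = [160.0, 20.0, 100.0]   # energy distribution benign traffic never shows
```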

  7. The role of noninvasive and invasive diagnostic imaging techniques for detection of extra-cranial venous system anomalies and developmental variants

    PubMed Central

    2013-01-01

    The extra-cranial venous system is complex and not well studied in comparison to the peripheral venous system. A newly proposed vascular condition, named chronic cerebrospinal venous insufficiency (CCSVI), described initially in patients with multiple sclerosis (MS) has triggered intense interest in better understanding of the role of extra-cranial venous anomalies and developmental variants. So far, there is no established diagnostic imaging modality, non-invasive or invasive, that can serve as the “gold standard” for detection of these venous anomalies. However, consensus guidelines and standardized imaging protocols are emerging. Most likely, a multimodal imaging approach will ultimately be the most comprehensive means for screening, diagnostic and monitoring purposes. Further research is needed to determine the spectrum of extra-cranial venous pathology and to compare the imaging findings with pathological examinations. The ability to define and reliably detect noninvasively these anomalies is an essential step toward establishing their incidence and prevalence. The role for these anomalies in causing significant hemodynamic consequences for the intra-cranial venous drainage in MS patients and other neurologic disorders, and in aging, remains unproven. PMID:23806142

  8. Improving the accuracy of canal seepage detection through geospatial techniques

    NASA Astrophysics Data System (ADS)

    Arshad, Muhammad

    With climatic change, many western states in the United States are experiencing drought conditions. Numerous irrigation districts are losing significant amounts of water from their canal systems due to leakage. Every year, on average, 2 million acres of prime cropland in the US is lost to soil erosion, waterlogging, and salinity. Lining of canals could save an enormous amount of water for irrigating crops, but at present, due to soaring costs of construction and environmental mitigation, adopting such a program on a large scale would be prohibitively expensive. Conventional techniques of seepage detection are expensive, time consuming, and labor intensive, besides being not very accurate. Technological advancements in remote sensing have made it possible to investigate irrigation canals for seepage site identification. In this research, band-9 in the [NIR] region and band-45 in the [TIR] region of airborne MASTER data have been utilized to highlight anomalies along an irrigation canal at Phoenix, Arizona. High-resolution (1 to 4 meter pixel) satellite images, provided by private companies for scientific research and made available by Google to the public on Google Earth, are then successfully used to separate those anomalies into water activity sites, natural vegetation, and man-made structures, thereby greatly improving the seepage detection ability of airborne remote sensing. This innovative technique is much faster and more cost effective than conventional techniques and past airborne remote sensing techniques for verification of anomalies along irrigation canals. This technique also solves one of the long-standing problems of discriminating the false impression of seepage sites due to dense natural vegetation, terrain relief, and low depressions of natural drainages from true water-related activity sites.

  9. Koopman Operator Framework for Time Series Modeling and Analysis

    NASA Astrophysics Data System (ADS)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
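
    The "identify a linear model form directly from data, then forecast and flag residuals" workflow can be shown in scalar form. The block below is a DMD-flavored sketch, not the paper's framework: a two-step linear model x[t+1] = a·x[t] + b·x[t−1] is fit by least squares (a sinusoid satisfies such a recurrence exactly, with a = 2 cos ω and b = −1), then used to flag forecast residuals on a test window with an injected fault.

```python
import math

def fit_linear_model(x):
    """Least-squares fit of x[t+1] = a*x[t] + b*x[t-1], a scalar
    companion-form linear representation identified from data,
    solved via the 2x2 normal equations."""
    s11 = s12 = s22 = r1 = r2 = 0.0
    for t in range(1, len(x) - 1):
        s11 += x[t] * x[t]
        s12 += x[t] * x[t - 1]
        s22 += x[t - 1] * x[t - 1]
        r1 += x[t] * x[t + 1]
        r2 += x[t - 1] * x[t + 1]
    det = s11 * s22 - s12 * s12
    return (s22 * r1 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det

train = [math.sin(0.2 * t) for t in range(300)]
a, b = fit_linear_model(train)

test = [math.sin(0.2 * t) for t in range(300, 400)]
test[50] += 0.8                        # injected fault
resid = {t + 1: abs(test[t + 1] - (a * test[t] + b * test[t - 1]))
         for t in range(1, len(test) - 1)}
flagged = sorted(t for t, r in resid.items() if r > 0.1)
```

    The fault is flagged at the fault time and in its immediate wake (the corrupted value also feeds the next two one-step forecasts), without ever needing the nonlinear generative model, which is the point of working with Koopman-style linear model forms.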

  10. Image Analysis via Soft Computing: Prototype Applications at NASA KSC and Product Commercialization

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steve

    2011-01-01

    This slide presentation reviews the use of "soft computing," which differs from "hard computing" in that it is more tolerant of imprecision, partial truth, uncertainty, and approximation, and its use in image analysis. Soft computing provides flexible information processing to handle real-life ambiguous situations and achieve tractability, robustness, low solution cost, and a closer resemblance to human decision making. Several systems are or have been developed: Fuzzy Reasoning Edge Detection (FRED), Fuzzy Reasoning Adaptive Thresholding (FRAT), image enhancement techniques, and visual/pattern recognition. These systems are compared with examples that show the effectiveness of each. NASA applications that are reviewed are: Real-Time (RT) Anomaly Detection, Real-Time (RT) Moving Debris Detection, and the Columbia investigation. The RT anomaly detection reviewed the case of a damaged cable for the emergency egress system. The use of these techniques is further illustrated in the Columbia investigation with the location and detection of foam debris. There are several applications in commercial usage: image enhancement, human screening and privacy protection, visual inspection, 3D heart visualization, tumor detection, and X-ray image enhancement.

  11. Use of ARFI elastography in the prediction of placental invasion anomaly via a new Virtual Touch Quantification Technique.

    PubMed

    Cim, Numan; Tolunay, Harun Egemen; Boza, Baris; Arslan, Harun; Ates, Can; İlik, İbrahim; Tezcan, Fatih Mehmet; Yıldızhan, Recep; Sahin, Hanım Guler; Yavuz, Alpaslan

    2018-03-22

    We aimed to evaluate the efficiency of placental elasticity in predicting a placental invasion anomaly with the Virtual Touch Quantification (VTQ) technique. Pregnant women in the third trimester with a suspected placental invasion anomaly were enrolled into the research (n = 58). The placenta was evaluated and divided into three equal parts: the foetal edge (inner 1/3 of the placenta), the maternal edge (outer 1/3 of the placenta) and the central part (central 1/3 of the placenta). Shear wave velocity (SWV) measurements were used in the elastographic evaluation of placentas by VTQ. We performed the measurements at different regions of the placenta to sample its various areas. Acoustic Radiation Force Impulse (ARFI) elastography scores were significantly higher in the group in which an invasion was detected during the surgery of patients with preoperative suspicion of placental invasion. A significant difference in the measurements of the inner, central and outer thirds of the placenta between the groups was found (p < .001). In this study, we have shown higher SWV scores in placental measurements of patients with preoperatively suspected anomalies and an invasion detected during their surgery. These findings may reflect an event at the level of tissue elasticity, and we hope that the use of the VTQ technique may contribute to an early prediction of placental invasions before surgery in the future via new research. Impact statement What is already known on this subject? Placental invasion anomalies (PIAs) are characterized by haemorrhages which can threaten the mother's life. Placental invasion anomalies are among the most important causes of maternal mortality and morbidity. Early diagnosis is very important in reducing mortality and morbidity. Gray-scale ultrasonography (US) is mostly used in the early diagnosis of PIAs. Acoustic radiation force impulse (ARFI) elastography is a new elastographic ultrasonography technique. 
We aimed to evaluate a new method for the early diagnosis of PIAs using the ARFI technique. There is no study on the diagnosis of PIAs by ARFI in the literature to our knowledge. We think that this original study will contribute to the literature. What do the results of this study add? We showed the accuracy of ARFI in the determination of PIAs. ARFI scores were significantly higher in the group in which an invasion was detected during the surgery of patients with preoperative suspicion of placental invasion. What are the implications of these findings for clinical practice and/or further research? Our findings may reflect an event at the level of tissue elasticity, and we hope that the use of the VTQ technique may contribute to the early prediction of placental invasions before surgery in the future via new research. Early diagnosis of placental invasion anomalies may reduce mortality and morbidity.

  12. Small-scale anomaly detection in panoramic imaging using neural models of low-level vision

    NASA Astrophysics Data System (ADS)

    Casey, Matthew C.; Hickman, Duncan L.; Pavlou, Athanasios; Sadler, James R. E.

    2011-06-01

    Our understanding of sensory processing in animals has reached the stage where we can exploit neurobiological principles in commercial systems. In human vision, one brain structure that offers insight into how we might detect anomalies in real-time imaging is the superior colliculus (SC). The SC is a small structure that rapidly orients our eyes to a movement, sound or touch that it detects, even when the stimulus is small-scale; think of a camouflaged movement or the rustle of leaves. This automatic orientation allows us to prioritize the use of our eyes to raise awareness of a potential threat, such as a predator approaching stealthily. In this paper we describe the application of a neural network model of the SC to the detection of anomalies in panoramic imaging. The neural approach consists of a mosaic of topographic maps that are each trained using competitive Hebbian learning to rapidly detect image features of a pre-defined shape and scale. What makes this approach interesting is that the competition between neurons automatically filters noise while still generalizing the desired shape and scale. We present the results of this technique applied to the real-time detection of obscured targets in visible-band panoramic CCTV images. Using background subtraction to highlight potential movement, the technique correctly identifies targets as little as 3 pixels wide while filtering out small-scale noise.
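The winner-take-all competition the abstract describes can be sketched with a toy competitive-learning routine on image patches. This is a minimal illustration, not the authors' SC model: the function names, unit count and patch shapes are all hypothetical.

```python
def train_competitive(patches, n_units=2, lr=0.1, epochs=30):
    """Winner-take-all competitive learning: for each input patch, only the
    closest unit updates, moving toward the patch (a Hebbian-style
    'winner strengthens' rule). Units are seeded from evenly spaced
    patches to avoid dead units."""
    dim = len(patches[0])
    units = [list(patches[i * len(patches) // n_units]) for i in range(n_units)]
    for _ in range(epochs):
        for p in patches:
            # winner = unit closest to the input patch
            w = min(range(n_units),
                    key=lambda u: sum((units[u][i] - p[i]) ** 2 for i in range(dim)))
            for i in range(dim):
                units[w][i] += lr * (p[i] - units[w][i])
    return units

def response(units, patch):
    """Detection score: squared distance of a patch to its best-matching unit.
    Patches resembling a learned shape score near zero; unfamiliar shapes
    (and isolated noise) score high."""
    return min(sum((u[i] - patch[i]) ** 2 for i in range(len(patch))) for u in units)
```

Trained on two "background" patch shapes, the model scores a third, unfamiliar shape much higher, which is the filtering behaviour the paper attributes to neural competition.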

  13. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics when combined with Big Data technologies and predictive techniques have been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.
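The alerting idea above, predicting when a degradation metric will pass a critical threshold, can be sketched with a least-squares trend fit and extrapolation. The function names and the linear-trend assumption are illustrative, not taken from the study.

```python
def fit_trend(times, values):
    """Ordinary least-squares line fit; returns (slope, intercept)."""
    n = len(times)
    tm, vm = sum(times) / n, sum(values) / n
    slope = (sum((t - tm) * (v - vm) for t, v in zip(times, values))
             / sum((t - tm) ** 2 for t in times))
    return slope, vm - slope * tm

def predict_crossing(times, values, threshold):
    """Extrapolate the fitted degradation trend to the time at which the
    metric crosses the threshold (may lie in the past if already crossed).
    Returns None for a flat trend that never crosses."""
    slope, intercept = fit_trend(times, values)
    if slope == 0:
        return None
    return (threshold - intercept) / slope
```

A real deployment would fit such trends over streaming windows and raise an alert when the predicted crossing time falls within the planning horizon.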

  14. Spatial association analysis between hydrocarbon fields and sedimentary residual magnetic anomalies using Weights of Evidence: An example from the Triassic Province of Algeria

    NASA Astrophysics Data System (ADS)

    Allek, Karim; Boubaya, Djamel; Bouguern, Abderrahmane; Hamoudi, Mohamed

    2016-12-01

    The presence of near-surface magnetic anomalies over oil and gas accumulations and their contribution to exploration remain somewhat controversial despite encouraging results and an improved understanding of genetic links between hydrocarbon seepage-induced alterations and near-surface magnetic minerals. This controversy is likely to remain since the cause of shallow-sourced sedimentary magnetic anomalies may well be microseepage related, but could also result from other sources such as cultural features and detrital magnetite. The definite way of discriminating between them remains a challenge. In this paper we examine means to deal with this particular purpose using a Bayesian technique known as 'Weights-of-Evidence'. The technique is implemented in GIS to explore spatial associations between known hydrocarbon fields within the central Triassic province of Algeria and sedimentary residual magnetic anomalies. We use the results to show possible application of the method to the recognition of some characteristics (amplitude and width) of anomalies assumed to be induced by hydrocarbon microseepages. Our results reveal strong spatial association with certain typical class of anomalies, confirming therefore hypothesis that hydrocarbon microseepages may result in detectable magnetic anomalies. It is possible to use the anomalies occurring outside the known gas and oil fields to make informed decisions in the selection of new targets for more detailed hydrocarbon exploration.
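The Weights-of-Evidence calculation behind this kind of spatial-association analysis can be sketched directly from cell counts. The variable names are illustrative and the counts in the usage example are made up for demonstration; this is not the authors' GIS implementation.

```python
import math

def weights_of_evidence(n_bd, n_b, n_d, n_total):
    """Compute W+, W- and the contrast C = W+ - W- from cell counts:
    n_bd: cells showing both the binary evidence pattern B (e.g. a magnetic
          anomaly class) and a known deposit/field D,
    n_b:  cells showing B,  n_d: cells containing D,  n_total: all cells."""
    p_b_d = n_bd / n_d                        # P(B | D)
    p_b_nd = (n_b - n_bd) / (n_total - n_d)   # P(B | not D)
    w_plus = math.log(p_b_d / p_b_nd)         # weight where B is present
    w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))  # weight where B is absent
    return w_plus, w_minus, w_plus - w_minus
```

A large positive contrast C indicates a strong spatial association between the anomaly class and the known fields, which is the statistic used to rank anomaly classes as exploration evidence.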

  15. Enabling the Discovery of Recurring Anomalies in Aerospace System Problem Reports using High-Dimensional Clustering Techniques

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.; Akella, Ram; Diev, Vesselin; Kumaresan, Sakthi Preethi; McIntosh, Dawn M.; Pontikakis, Emmanuel D.; Xu, Zuobing; Zhang, Yi

    2006-01-01

    This paper describes the results of a significant research and development effort conducted at NASA Ames Research Center to develop new text mining techniques to discover anomalies in free-text reports regarding system health and safety of two aerospace systems. We discuss two problems of significant importance in the aviation industry. The first problem is that of automatic anomaly discovery about an aerospace system through the analysis of tens of thousands of free-text problem reports that are written about the system. The second problem that we address is that of automatic discovery of recurring anomalies, i.e., anomalies that may be described in different ways by different authors, at varying times and under varying conditions, but that are truly about the same part of the system. The intent of recurring anomaly identification is to determine project or system weaknesses or high-risk issues. The discovery of recurring anomalies is a key goal in building safe, reliable, and cost-effective aerospace systems. We address the anomaly discovery problem on thousands of free-text reports using two strategies: (1) as an unsupervised learning problem, where an algorithm takes free-text reports as input and automatically groups them into different bins, with each bin corresponding to a different unknown anomaly category; and (2) as a supervised learning problem, where the algorithm classifies the free-text reports into one of a number of known anomaly categories. We then discuss the application of these methods to the problem of discovering recurring anomalies. In fact, the special nature of recurring anomalies (very small cluster sizes) requires incorporating new methods and measures to enhance the original approach to anomaly detection.
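The first (unsupervised) strategy can be sketched as bag-of-words TF-IDF vectors plus a greedy similarity grouping. This is a toy stand-in, not NASA's actual text mining system; the similarity threshold and function names are arbitrary choices for illustration.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Bag-of-words TF-IDF vectors for a list of free-text reports."""
    tokenized = [d.lower().split() for d in docs]
    df = Counter(t for doc in tokenized for t in set(doc))
    n = len(docs)
    # idf = ln(n / df): terms appearing in every report get weight 0
    return [{t: tf * math.log(n / df[t]) for t, tf in Counter(doc).items()}
            for doc in tokenized]

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def group_reports(docs, threshold=0.2):
    """Greedy grouping: each report joins the first group whose representative
    it resembles; otherwise it starts a new group, i.e. a new unknown
    anomaly category."""
    vecs = tfidf_vectors(docs)
    groups, reps = [], []
    for i, v in enumerate(vecs):
        for g, r in enumerate(reps):
            if cosine(v, r) >= threshold:
                groups[g].append(i)
                break
        else:
            groups.append([i])
            reps.append(v)
    return groups
```

On four short reports describing two underlying faults, the grouping recovers the two bins, which is the behaviour the unsupervised strategy relies on at scale.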

  16. Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.

    PubMed

    Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda

    2014-05-01

    We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies, and we suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus, which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of an anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is applied here to the problem of anomaly detection for a video annotation system.

  17. TargetVue: Visual Analysis of Anomalous User Behaviors in Online Communication Systems.

    PubMed

    Cao, Nan; Shi, Conglei; Lin, Sabrina; Lu, Jie; Lin, Yu-Ru; Lin, Ching-Yung

    2016-01-01

    Users with anomalous behaviors in online communication systems (e.g. email and social media platforms) are potential threats to society. Automated anomaly detection based on advanced machine learning techniques has been developed to combat this issue; challenges remain, though, due to the difficulty of obtaining proper ground truth for model training and evaluation. Therefore, substantial human judgment on the automated analysis results is often required to better adjust the performance of anomaly detection. Unfortunately, techniques that allow users to understand the analysis results more efficiently, to make a confident judgment about anomalies, and to explore data in their context, are still lacking. In this paper, we propose a novel visual analysis system, TargetVue, which detects anomalous users via an unsupervised learning model and visualizes the behaviors of suspicious users in behavior-rich context through novel visualization designs and multiple coordinated contextual views. Particularly, TargetVue incorporates three new ego-centric glyphs to visually summarize a user's behaviors which effectively present the user's communication activities, features, and social interactions. An efficient layout method is proposed to place these glyphs on a triangle grid, which captures similarities among users and facilitates comparisons of behaviors of different users. We demonstrate the power of TargetVue through its application in a social bot detection challenge using Twitter data, a case study based on email records, and an interview with expert users. Our evaluation shows that TargetVue is beneficial to the detection of users with anomalous communication behaviors.

  18. SmartFABER: Recognizing fine-grained abnormal behaviors for early detection of mild cognitive impairment.

    PubMed

    Riboni, Daniele; Bettini, Claudio; Civitarese, Gabriele; Janjua, Zaffar Haider; Helaoui, Rim

    2016-02-01

    In an ageing world population, more citizens are at risk of cognitive impairment, with negative consequences for their ability to live independently, their quality of life and the sustainability of healthcare systems. Cognitive neuroscience researchers have identified behavioral anomalies that are significant indicators of cognitive decline. A general goal is the design of innovative methods and tools for continuously monitoring the functional abilities of seniors at risk and reporting behavioral anomalies to clinicians. SmartFABER is a pervasive system targeting this objective. A non-intrusive sensor network continuously acquires data about the senior's interaction with the home environment during daily activities. A novel hybrid statistical and knowledge-based technique is used to analyse these data and detect behavioral anomalies, whose history is presented to clinicians through a dashboard. Unlike related works, SmartFABER can detect abnormal behaviors at a fine-grained level. We have fully implemented the system and evaluated it using real datasets, partly generated by performing activities in a smart home laboratory, and partly acquired during several months of monitoring of the instrumented home of a senior diagnosed with MCI. Experimental results, including comparisons with other activity recognition techniques, show the effectiveness of SmartFABER in terms of recognition rates.

  19. Assessing the impact of background spectral graph construction techniques on the topological anomaly detection algorithm

    NASA Astrophysics Data System (ADS)

    Ziemann, Amanda K.; Messinger, David W.; Albano, James A.; Basener, William F.

    2012-06-01

    Anomaly detection algorithms have historically been applied to hyperspectral imagery in order to identify pixels whose material content is incongruous with the background material in the scene. Typically, the application involves extracting man-made objects from natural and agricultural surroundings. A large challenge in designing these algorithms is determining which pixels initially constitute the background material within an image. The topological anomaly detection (TAD) algorithm constructs a graph theory-based, fully non-parametric topological model of the background in the image scene, and uses codensity to measure deviation from this background. In TAD, the initial graph theory structure of the image data is created by connecting an edge between any two pixel vertices x and y if the Euclidean distance between them is less than some resolution r. While this type of proximity graph is among the most well-known approaches to building a geometric graph from a given set of data, there is a wide variety of geometrically based techniques. In this paper, we present a comparative test of the performance of TAD across four different constructions of the initial graph: the mutual k-nearest neighbor graph, the sigma-local graph for two different values of σ > 1, and the proximity graph originally implemented in TAD.
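Two of the graph constructions compared in the paper, the original proximity graph and the mutual k-nearest neighbor graph, can be sketched as follows. This is a minimal illustration on lists of points, not the TAD implementation.

```python
def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def proximity_graph(points, r):
    """Edge between any two vertices whose Euclidean distance is below the
    resolution r -- the construction originally used in TAD."""
    n = len(points)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if euclid(points[i], points[j]) < r}

def mutual_knn_graph(points, k):
    """Edge (i, j) only if i is among j's k nearest neighbours AND j is
    among i's -- a sparser, locally adaptive alternative."""
    n = len(points)
    nbrs = [set(sorted(range(n), key=lambda j: euclid(points[i], points[j]))[1:k + 1])
            for i in range(n)]
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if j in nbrs[i] and i in nbrs[j]}
```

On the same point set the mutual k-NN graph is typically sparser than the proximity graph, which is exactly the kind of structural difference whose effect on TAD the paper evaluates.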

  20. Geophysical techniques in detection to river embankments - A case study: To locate sites of potential leaks using surface-wave and electrical methods

    USGS Publications Warehouse

    Chen, C.; Liu, J.; Xu, S.; Xia, J.; ,

    2004-01-01

    Geophysical technologies are very effective in environmental, engineering and groundwater applications. Parameters delineating the nature of near-surface materials, such as compressional-wave velocity and shear-wave velocity, can be obtained using shallow seismic methods. Electrical methods are primary approaches for investigating groundwater and detecting leakage. Both methods were applied to the inspection of embankments in the hope of obtaining evidence of the strength and moisture inside the embankment body. A technological experiment was carried out in 2003 to detect hidden defects in the embankment of the Yangtze River at Songzi, Hubei, China. Surface-wave and DC multi-channel array resistivity sounding techniques were used to detect hidden defects, such as pipe-seeps, inside and under the dike. This paper discusses the exploration strategy and the effect of geological characteristics. A practical approach combining seismic and electrical resistivity measurements was applied in the experiment to locate potential pipe-seeps in the embankment. The method derives a potential leak factor, based on the shear-wave velocity and the resistivity of the medium, to evaluate anomalies. An anomaly found in one surveyed segment of the embankment was verified: a pipe-seep had occurred there during the 1998 flooding.

  1. HPNAIDM: The High-Performance Network Anomaly/Intrusion Detection and Mitigation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yan

    Identifying traffic anomalies and attacks rapidly and accurately is critical for large network operators. With the rapid growth of network bandwidth, such as the next generation DOE UltraScience Network, and the fast emergence of new attacks/viruses/worms, existing network intrusion detection systems (IDS) are insufficient because they: • Are mostly host-based and not scalable to high-performance networks; • Are mostly signature-based and unable to adaptively recognize flow-level unknown attacks; • Cannot differentiate malicious events from unintentional anomalies. To address these challenges, we proposed and developed a new paradigm called the high-performance network anomaly/intrusion detection and mitigation (HPNAIDM) system. The new paradigm is significantly different from existing IDSes, with the following features (research thrusts): • Online traffic recording and analysis on high-speed networks; • Online adaptive flow-level anomaly/intrusion detection and mitigation; • Integrated approach for false positive reduction. Our research prototype and evaluation demonstrate that the HPNAIDM system is highly effective and economically feasible. Beyond satisfying the pre-set goals, we even exceeded them significantly (see more details in the next section). Overall, our project harvested 23 publications (2 book chapters, 6 journal papers and 15 peer-reviewed conference/workshop papers). In addition, we built a website for technique dissemination, which hosts two system prototype releases for the research community. We also filed a patent application and developed strong international and domestic collaborations spanning both academia and industry.

  2. Operator based integration of information in multimodal radiological search mission with applications to anomaly detection

    NASA Astrophysics Data System (ADS)

    Benedetto, J.; Cloninger, A.; Czaja, W.; Doster, T.; Kochersberger, K.; Manning, B.; McCullough, T.; McLane, M.

    2014-05-01

    Successful performance of a radiological search mission depends on effective utilization of a mixture of signals. Examples of modalities include, e.g., EO imagery and gamma radiation data, or radiation data collected during multiple events. In addition, elevation data or spatial proximity can be used to enhance the performance of acquisition systems. State-of-the-art techniques in the processing and exploitation of complex information manifolds rely on diffusion operators. Our approach involves machine learning techniques based on analysis of joint data-dependent graphs and their associated diffusion kernels. The significant eigenvectors of the derived fused graph Laplacian and Schrödinger operators then form the new representation, which provides integrated features from the heterogeneous input data. The families of data-dependent Laplacian and Schrödinger operators on joint data graphs are integrated by means of appropriately designed fusion metrics. These fused representations are used for target and anomaly detection.
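A minimal sketch of the fusion step might look like the following, assuming two affinity matrices over the same vertex set (e.g. one from imagery, one from radiation data) and ignoring the Schrödinger potential term; all names are illustrative, not from the paper.

```python
def fused_laplacian(w1, w2, alpha=0.5):
    """Combine two modality affinity matrices (same vertex set) into one
    graph Laplacian L = D - W, where W is a convex combination of the
    modalities and D is the diagonal degree matrix of W."""
    n = len(w1)
    W = [[alpha * w1[i][j] + (1 - alpha) * w2[i][j] for j in range(n)]
         for i in range(n)]
    L = [[-W[i][j] for j in range(n)] for i in range(n)]
    for i in range(n):
        L[i][i] = sum(W[i]) - W[i][i]  # degree on the diagonal
    return L

def smoothness(L, f):
    """Quadratic form f^T L f: small when f varies slowly over the fused
    graph, large when f jumps across strong edges -- the property the
    Laplacian eigenvectors exploit to build integrated features."""
    n = len(L)
    return sum(f[i] * L[i][j] * f[j] for i in range(n) for j in range(n))
```

In the full method one would take the leading eigenvectors of this fused operator as the joint representation; the quadratic form above is the objective those eigenvectors minimize.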

  3. [Transthoracic and transesophageal echocardiography in the pre- and postoperative assessment of interatrial communication].

    PubMed

    San Román, J A; Vilacosta, I; Zamorano, J; Castillo, J A; Rollán, M J; Villanueva, M A; Almería, C; Sánchez-Harguindey, L

    1993-12-01

    Transthoracic echocardiography is the most useful noninvasive method to diagnose atrial septal defect. It is suggested by some authors that transesophageal echocardiography is more accurate than transthoracic echocardiography in this setting. Our aim was to compare the usefulness of both techniques in: 1) diagnosing atrial septal defect, 2) detecting associated anomalies and 3) postoperative assessment. Pre- and postoperative transthoracic and transesophageal echocardiography were performed in 27 patients in whom the diagnosis of atrial septal defect was confirmed at surgery. Transthoracic echocardiography demonstrated the defect in 20 patients (74%) (8 ostium primum, 10 ostium secundum and 2 sinus venosus). All 27 patients (100%) were correctly diagnosed by transesophageal echocardiography (8 ostium primum, 12 ostium secundum and 7 sinus venosus). Defect size determined by transthoracic echocardiography had a poor correlation with the surgical measurement (r = 0.34). A good correlation was obtained when transesophageal and surgical defect size measurements were compared (r = 0.85; p < 0.05). Transesophageal echocardiography was superior in detecting associated anomalies (5 patients with partial anomalous pulmonary venous drainage, 3 with persistence of a left superior vena cava and 1 with an atrial septal aneurysm). Moreover, this technique better determined residual atrial septal defect, and detected a postsurgical inferior vena cava connection to the left atrium. Transesophageal echocardiography is superior to transthoracic echocardiography in diagnosing sinus venosus-type atrial septal defect, in detecting associated anomalies and in postoperative assessment. Transthoracic echocardiography is diagnostic in the majority of patients with ostium primum and ostium secundum types of atrial septal defect.

  4. Diagnosis of fetal syndromes by three- and four-dimensional ultrasound: is there any improvement?

    PubMed

    Barišić, Lara Spalldi; Stanojević, Milan; Kurjak, Asim; Porović, Selma; Gaber, Ghalia

    2017-08-28

    With all of our present knowledge, high-technology diagnostic equipment, electronic databases and other available supporting resources, detection of fetal syndromes is still a challenge for healthcare providers in the prenatal as well as the postnatal period. Prenatal diagnosis of fetal syndromes is not straightforward; it is a difficult puzzle that needs to be assembled and solved. Detection of one anomaly should always raise suspicion of the existence of further anomalies, and can be a trigger to investigate further and raise awareness of possible syndromes. Highly specialized software systems for three- and four-dimensional ultrasound (3D/4D US) have enabled detailed depiction of fetal anatomy and assessment of the dynamics of fetal structural and functional development in real time. With recent advances in 3D/4D US technology, antenatal diagnosis of fetal anomalies and syndromes has shifted from the 2nd to the 1st trimester of pregnancy. It remains an open question what can and should be done after the prenatal diagnosis of a fetal syndrome. The 3D and 4D US techniques have improved the detection accuracy of fetal abnormalities and syndromes from early pregnancy onwards. It is not easy to make a prenatal diagnosis of a fetal syndrome, so supporting tools, such as online integrated databases, are needed to increase diagnostic precision. The aim of this paper is to present the possibilities of different US techniques in the prenatal detection of some fetal syndromes.

  5. On Modeling of Adversary Behavior and Defense for Survivability of Military MANET Applications

    DTIC Science & Technology

    2015-01-01

    anomaly detection technique. b) A system-level majority-voting based intrusion detection system with m being the number of verifiers used to perform...pp. 1254 - 1263. [5] R. Mitchell, and I.R. Chen, “Adaptive Intrusion Detection for Unmanned Aircraft Systems based on Behavior Rule Specification...and adaptively trigger the best attack strategies while avoiding detection and eviction. The second step is to model defense behavior of defenders

  6. The intermediate wavelength magnetic anomaly field of the north Pacific and possible source distributions

    NASA Technical Reports Server (NTRS)

    Labrecque, J. L.; Cande, S. C.; Jarrard, R. D. (Principal Investigator)

    1983-01-01

    A technique that eliminates external field sources and the effects of strike aliasing was used to extract from marine survey data the intermediate wavelength magnetic anomaly field for (B) in the North Pacific. A strong correlation exists between this field and the MAGSAT field, although a directional sensitivity in the MAGSAT field can be detected. The intermediate wavelength field is correlated with tectonic features. Island arcs appear as positive anomalies of induced origin, likely due to variations in crustal thickness. Seamount chains and oceanic plateaus are also manifested by strong anomalies. The primary contribution to many of these anomalies appears to be due to a remanent magnetization. The source parameters for the remainder of these features presently remain ambiguous. Results indicate that the sea surface field is a valuable source of information for secular variation analysis and the resolution of intermediate wavelength source parameters.

  7. Conditional anomaly detection methods for patient-management alert systems

    PubMed Central

    Valko, Michal; Cooper, Gregory; Seybert, Amy; Visweswaran, Shyam; Saul, Melissa; Hauskrecht, Milos

    2010-01-01

    Anomaly detection methods can be very useful in identifying unusual or interesting patterns in data. A recently proposed conditional anomaly detection framework extends anomaly detection to the problem of identifying anomalous patterns on a subset of attributes in the data. The anomaly always depends (is conditioned) on the value of the remaining attributes. The work presented in this paper focuses on instance-based methods for detecting conditional anomalies. The methods rely on the distance metric to identify examples in the dataset that are most critical for detecting the anomaly. We investigate various metrics and metric learning methods to optimize the performance of the instance-based anomaly detection methods. We show the benefits of the instance-based methods on two real-world detection problems: detection of unusual admission decisions for patients with community-acquired pneumonia and detection of unusual orders of an HPF4 test that is used to confirm heparin-induced thrombocytopenia, a life-threatening condition caused by heparin therapy. PMID:25392850
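The instance-based idea can be sketched as a nearest-neighbour score over the context attributes: an observation is conditionally anomalous when its target value disagrees with the targets of its closest neighbours. This toy version (hypothetical function name, plain Euclidean distance, simple disagreement fraction) stands in for the metric-learning methods the paper actually investigates.

```python
def knn_conditional_anomaly(data, x, y, k=3):
    """Conditional anomaly score for the pair (x, y): find the k training
    examples whose context attributes are closest to x, and return the
    fraction of those neighbours whose target value differs from y.
    data is a list of (context_vector, target) pairs."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    nbrs = sorted(data, key=lambda ex: dist(ex[0], x))[:k]
    return sum(1 for _, cy in nbrs if cy != y) / k
```

In the paper's setting the context would be a patient's condition attributes and the target an admission decision or lab order; a score near 1 flags a decision unlike those made for similar patients.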

  8. Time-Frequency Methods for Structural Health Monitoring †

    PubMed Central

    Pyayt, Alexander L.; Kozionov, Alexey P.; Mokhov, Ilya I.; Lang, Bernhard; Meijer, Robert J.; Krzhizhanovskaya, Valeria V.; Sloot, Peter M. A.

    2014-01-01

    Detection of early warning signals for the imminent failure of large and complex engineered structures is a daunting challenge with many open research questions. In this paper we report on novel ways to perform Structural Health Monitoring (SHM) of flood protection systems (levees, earthen dikes and concrete dams) using sensor data. We present a robust data-driven anomaly detection method that combines time-frequency feature extraction, using wavelet analysis and phase shift, with one-sided classification techniques to identify the onset of failure anomalies in real-time sensor measurements. The methodology has been successfully tested at three operational levees. We detected a dam leakage in the retaining dam (Germany) and “strange” behaviour of sensors installed in a Boston levee (UK) and a Rhine levee (Germany). PMID:24625740
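The pipeline described above, time-frequency feature extraction followed by one-sided classification, can be sketched with a single-level Haar transform and a distance-to-normal threshold. This is a deliberate simplification of the wavelet and phase-shift features used in the paper; the class name, slack factor and threshold rule are illustrative assumptions.

```python
def haar_features(signal):
    """One level of the Haar wavelet transform: pairwise averages (trend)
    and pairwise differences (detail) form a simple time-frequency
    feature vector."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return avg + det

class OneSidedDetector:
    """One-sided (one-class) detector: fitted on 'normal' sensor windows
    only, it flags any window whose features lie too far from their mean."""
    def fit(self, normals, slack=3.0):
        feats = [haar_features(s) for s in normals]
        n, d = len(feats), len(feats[0])
        self.mean = [sum(f[j] for f in feats) / n for j in range(d)]
        dists = [self._dist(f) for f in feats]
        # threshold = slack times the largest distance seen on normal data
        self.threshold = slack * max(dists) if max(dists) > 0 else slack
        return self

    def _dist(self, f):
        return sum((a - b) ** 2 for a, b in zip(f, self.mean)) ** 0.5

    def is_anomaly(self, signal):
        return self._dist(haar_features(signal)) > self.threshold
```

The one-sided formulation matters here for the same reason it does in the paper: failures of flood defences are too rare to supply labelled anomalous training data, so only normal behaviour can be modelled.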

  9. Estimation of anomaly location and size using electrical impedance tomography.

    PubMed

    Kwon, Ohin; Yoon, Jeong Rock; Seo, Jin Keun; Woo, Eung Je; Cho, Young Gu

    2003-01-01

    We developed a new algorithm that estimates the locations and sizes of anomalies in an electrically conducting medium based on the electrical impedance tomography (EIT) technique. When only boundary current and voltage measurements are available, it is not practically feasible to reconstruct accurate high-resolution cross-sectional conductivity or resistivity images of a subject. In this paper, we focus our attention on the estimation of the locations and sizes of anomalies whose conductivity values differ from those of the background tissues. We showed the performance of the algorithm with experimental results using a 32-channel EIT system and a saline phantom. With about 1.73% measurement error in boundary current-voltage data, we found that the minimal size (area) of the detectable anomaly is about 0.72% of the size (area) of the phantom. Potential applications include the monitoring of impedance-related physiological events and bubble detection in two-phase flow. Since this new algorithm requires neither a forward solver nor a time-consuming minimization process, it is fast enough for various real-time applications in medicine and nondestructive testing.

  10. SOME GEOCHEMICAL METHODS OF URANIUM EXPLORATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Illsley, C.T.; Bills, C.W.; Pollock, J.W.

    Geochemical research and development projects were carried out to provide basic information which may be applied to exploration or general studies of uranium geology. The applications and limitations of various aspects of geochemistry to uranium geological problems are considered. Modifications of existing analytical techniques were made and tested in the laboratory and in the field. These include the rapid quantitative determination of uranium in water, soil and peat, and of trace amounts of sulfate and phosphate in water. A "geochemical anomaly" has been defined as a significant departure from the average background abundance of an element where the distribution has not been disturbed by mineralization. The detection and significance of geochemical anomalies are directly related to the mobility of the element being sought in the zone of weathering. The mobility of uranium is governed by complex physical, chemical, and biological factors. For uranium anomalies in surface materials, the chemical factors affecting mobility are the most significant. The effects of pH, solubility, coprecipitation, adsorption, complexing, or compound formation are discussed in relation to anomalies detected in water, soil, and stream sediments. (auth)

  11. Millimeter Wave Detection of Localized Anomalies in the Space Shuttle External Fuel Tank Insulating Foam and Acreage Heat Tiles

    NASA Technical Reports Server (NTRS)

    Kharkovsky, S.; Case, J. T.; Zoughi, R.; Hepburn, F.

    2005-01-01

    The Space Shuttle Columbia's catastrophic accident emphasizes the growing need for developing and applying effective, robust and life-cycle oriented nondestructive testing (NDT) methods for inspecting the shuttle external fuel tank spray on foam insulation (SOFI) and its protective acreage heat tiles. Millimeter wave NDT techniques were one of the methods chosen for evaluating their potential for inspecting these structures. Several panels with embedded anomalies (mainly voids) were produced and tested for this purpose. Near-field and far-field millimeter wave NDT methods were used for producing millimeter wave images of the anomalies in SOFI panel and heat tiles. This paper presents the results of an investigation for the purpose of detecting localized anomalies in two SOFI panels and a set of heat tiles. To this end, reflectometers at a relatively wide range of frequencies (Ka-band (26.5 - 40 GHz) to W-band (75 - 110 GHz)) and utilizing different types of radiators were employed. The results clearly illustrate the utility of these methods for this purpose.

  12. Microwave and Millimeter Wave Nondestructive Evaluation of the Space Shuttle External Tank Insulating Foam

    NASA Technical Reports Server (NTRS)

    Shrestha, S.; Kharkovsky, S.; Zoughi, R.; Hepburn, F.

    2005-01-01

    The Space Shuttle Columbia's catastrophic failure has been attributed to a piece of external fuel tank insulating SOFI (Spray On Foam Insulation) foam striking the leading edge of the left wing of the orbiter, causing significant damage to some of the protecting heat tiles. The accident emphasizes the growing need to develop effective, robust and life-cycle oriented methods of nondestructive testing and evaluation (NDT&E) of complex conductor-backed insulating foam and protective acreage heat tiles used in the space shuttle fleet and in future multi-launch space vehicles. The insulating SOFI foam is constructed from closed-cell foam. In the microwave regime this foam is in the family of low-permittivity and low-loss dielectric materials. Near-field microwave and millimeter wave NDT methods were among the techniques chosen for this purpose. To this end several flat and thick SOFI foam panels, two structurally complex panels similar to the external fuel tank and a "blind" panel were used in this investigation. Several anomalies such as voids and disbonds were embedded in these panels at various locations. The location and properties of the embedded anomalies in the "blind" panel were not disclosed to the investigating team prior to the investigation. Three frequency bands were used in this investigation covering a frequency range of 8-75 GHz. Moreover, the influence of signal polarization was also investigated. Overall the results of this investigation were very promising for detecting the presence of anomalies in different panels covered with relatively thick insulating SOFI foam. Different types of anomalies were detected in foam up to 9 in. thick. Many of the anomalies in the more complex panels were also detected. When investigating the blind panel no false positives were detected. Anomalies in between and underneath bolt heads were not easily detected.
This paper presents the results of this investigation along with a discussion of the capabilities of the method used.

  13. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    NASA Astrophysics Data System (ADS)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
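The abstract does not disclose the clustering algorithm itself. A minimal sketch of the general idea, assuming a plain k-means clustering and a distance-to-nearest-centroid anomaly score (all names, data, and parameters below are illustrative, not from the paper):

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means, deterministically seeded with the first k points."""
    C = X[:k].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - C[None, :, :]) ** 2).sum(-1), axis=1)
        # Recompute centroids; keep the old centroid if a cluster empties.
        C = np.array([X[labels == j].mean(axis=0) if np.any(labels == j) else C[j]
                      for j in range(k)])
    return C

def anomaly_scores(X, C):
    """Score each sample by its distance to the nearest centroid:
    samples far from every cluster are candidate anomalies."""
    return np.sqrt(((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)).min(axis=1)

# Two dense clusters plus one far-away outlier (row 100)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (50, 2)),
               rng.normal(5, 0.1, (50, 2)),
               [[20.0, 20.0]]])
C = kmeans(X, k=2)
scores = anomaly_scores(X, C)
```

Grouping high-score points that are adjacent in space and time would then turn individual outliers into the "anomalous events" the framework ranks.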

  14. Anomaly Detection for Beam Loss Maps in the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Valentino, Gianluca; Bruce, Roderik; Redaelli, Stefano; Rossi, Roberto; Theodoropoulos, Panagiotis; Jaster-Merz, Sonja

    2017-07-01

    In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss map campaigns, as well as in standard operation. Due to the complexity of the system, consisting of more than 50 collimators per beam, it is difficult with such methods to identify small changes in the collimation hierarchy, which may be due to setting errors or beam orbit drifts. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy.
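The PCA-plus-Local-Outlier-Factor pipeline can be sketched with scikit-learn. The synthetic "loss maps" and all settings below are illustrative assumptions, not the paper's actual features or parameters:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import LocalOutlierFactor

# Synthetic loss maps: one row per measurement, one column per loss monitor.
rng = np.random.default_rng(0)
nominal = rng.normal(1.0, 0.05, (200, 120))
drifted = nominal[:5].copy()
drifted[:, 40:50] += 0.5          # a local change at a few collimators

X = np.vstack([nominal, drifted])            # rows 200-204 are the drifted maps
Z = PCA(n_components=5).fit_transform(X)     # compress the correlated monitors
lof = LocalOutlierFactor(n_neighbors=20).fit(Z)
scores = lof.negative_outlier_factor_        # more negative = more anomalous
suspects = np.argsort(scores)[:5]            # the five most anomalous maps
```

PCA removes the strong correlations between nearby monitors before LOF compares each map's local density to that of its neighbors.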

  15. Particle Filtering for Model-Based Anomaly Detection in Sensor Networks

    NASA Technical Reports Server (NTRS)

    Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon

    2012-01-01

    A novel technique has been developed for anomaly detection of rocket engine test stand (RETS) data. The objective was to develop a system that postprocesses a CSV file containing the sensor readings and activities (time-series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective; in general they have met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are iid, i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploit the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time. However, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data is a mixture of multi-variate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation Maximization (EM) algorithm.
The parameters thus learned are used for calculating the joint distribution of the observations. However, this GMM assumption is essentially an approximation and signals the potential viability of non-parametric density estimators. This is the key idea underlying the new approach.
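The GMM baseline described above can be sketched with scikit-learn's `GaussianMixture` (which runs EM internally): fit the mixture on nominal data, then flag test points whose log-likelihood falls below nearly all of the training data. The data, number of components, and threshold are illustrative assumptions:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Nominal sensor readings drawn from two operating regimes
rng = np.random.default_rng(0)
nominal = np.vstack([rng.normal([0, 0], 0.2, (300, 2)),
                     rng.normal([3, 3], 0.2, (300, 2))])
gmm = GaussianMixture(n_components=2, random_state=0).fit(nominal)  # EM fit

# Flag points less likely than the 1st percentile of the nominal data
test = np.array([[0.1, -0.1],   # regime A: normal
                 [3.0, 2.9],    # regime B: normal
                 [1.5, 1.5]])   # between regimes: anomalous
threshold = np.percentile(gmm.score_samples(nominal), 1)
flags = gmm.score_samples(test) < threshold
```

The abstract's closing point is that this Gaussian assumption is only an approximation, which motivates the non-parametric density estimators of the new approach.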

  16. Electric field variations measured continuously in free air over a conductive thin zone in the tilted Lias-epsilon black shales near Osnabrück, Northwest Germany

    NASA Astrophysics Data System (ADS)

    Gurk, M.; Bosch, F. P.; Tougiannidis, N.

    2013-04-01

    Common studies on the static electric field distribution over a conductivity anomaly use the self-potential method. However, this method is time consuming and requires nonpolarizable electrodes to be placed in the ground. Moreover, the information gained by this method is restricted to the horizontal variations of the electric field. To overcome the limitation in the self-potential technique, we conducted a field experiment using a non-conventional technique to assess the static electric field over a conductivity anomaly. We use two metallic potential probes arranged on an insulated boom with a separation of 126 cm. When placed in the electric field of the free air, a surface charge is induced on each probe, which tries to equalize with the potential of the surrounding atmosphere. The use of a plasma source at both probes facilitated continuous and quicker measurement of the electric field in the air. The present study shows first experimental measurements with a modified potential probe technique (MPP) along a 600-meter-long transect to demonstrate the general feasibility of this method for studying the static electric field distribution over shallow conductivity anomalies. Field measurements were carried out on a test site on top of the Bramsche Massif near Osnabrück (Northwest Germany) to benefit from a variety of available near surface data over an almost vertical conductivity anomaly. High resolution self-potential data served in a numerical analysis to estimate the expected individual components of the electric field vector. During the experiment we found more anomalies in the vertical and horizontal components of the electric field than self-potential anomalies. These contrasting findings are successfully cross-validated with conventional near surface geophysical methods.
Among these methods, we used self-potential, radiomagnetotelluric, electric resistivity tomography and induced polarization data to derive 2D conductivity models of the subsurface in order to infer the geometrical properties and the origin of the conductivity anomaly in the survey area. The presented study demonstrates the feasibility of electric field measurements in free air to detect and study near surface conductivity anomalies. Variations in Ez correlate well with the conductivity distribution obtained from resistivity methods. Compared to the self-potential technique, continuous free-air measurements of the electric field are more rapid and of better lateral resolution, combined with the unique ability to analyze vertical components of the electric field, which are of particular importance to detect lateral conductivity contrasts. Mapping Ez in free air is a good tool to precisely map lateral changes of the electric field distribution in areas where SP generation fails. MPP offers interesting applications in other geophysical techniques, e.g., in time-domain electromagnetics, DC and IP. With this method we were able to reveal a ca. 150 m wide zone of enhanced electric field strength.

  17. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks.

    PubMed

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-06-13

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens' quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and at least 26% higher in a scenario with a false positive rate of 15%.
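The one-class SVM the study recommends is trained only on normal traffic and then flags reports that fall outside the learned boundary. A minimal scikit-learn sketch; the two features and all parameters are illustrative assumptions, not the paper's:

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Train on normal sensor reports only; features are hypothetical,
# e.g. (sensor reading, inter-arrival time in seconds).
rng = np.random.default_rng(0)
normal = rng.normal([20.0, 1.0], [1.0, 0.1], (500, 2))
detector = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal)

reports = np.array([[20.3, 1.02],   # plausible report
                    [35.0, 0.01]])  # tampered/injected values
labels = detector.predict(reports)  # +1 = accepted as normal, -1 = flagged
```

The `nu` parameter bounds the fraction of training points treated as outliers, which is how the false positive budget (5% or 15% in the study's scenarios) can be tuned.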

  18. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks

    PubMed Central

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-01-01

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens’ quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and at least 26% higher in a scenario with a false positive rate of 15%. PMID:27304957

  19. A Testbed for Data Fusion for Helicopter Diagnostics and Prognostics

    DTIC Science & Technology

    2003-03-01

    ...and algorithm design and tuning in order to develop advanced diagnostic and prognostic techniques for aircraft health monitoring. Here a... ...and development of models for diagnostics, prognostics, and anomaly detection. (Figure 5: VMEP Server Browser Interface) ...detections, and prognostic prediction time horizons. The VMEP system and in particular the web component are ideal for performing data collection...

  20. OPAD data analysis

    NASA Astrophysics Data System (ADS)

    Buntine, Wray L.; Kraft, Richard; Whitaker, Kevin; Cooper, Anita E.; Powers, W. T.; Wallace, Tim L.

    1993-06-01

    Data obtained in the framework of an Optical Plume Anomaly Detection (OPAD) program intended to create a rocket engine health monitor based on spectrometric detections of anomalous atomic and molecular species in the exhaust plume are analyzed. The major results include techniques for handling data noise, methods for registration of spectra to wavelength, and a simple automatic process for estimating the metallic component of a spectrum.

  1. Automated synthesis, insertion and detection of polyps for CT colonography

    NASA Astrophysics Data System (ADS)

    Sezille, Nicolas; Sadleir, Robert J. T.; Whelan, Paul F.

    2003-03-01

    CT Colonography (CTC) is a new non-invasive colon imaging technique which has the potential to replace conventional colonoscopy for colorectal cancer screening. A novel system which facilitates automated detection of colorectal polyps at CTC is introduced. As exhaustive testing of such a system using real patient data is not feasible, more complete testing is achieved through synthesis of artificial polyps and insertion into real datasets. The polyp insertion is semi-automatic: candidate points are manually selected using a custom GUI; suitable points are then determined automatically from an analysis of the local neighborhood surrounding each of the candidate points. Local density and orientation information are used to generate polyps based on an elliptical model. Anomalies are identified from the modified dataset by analyzing the axial images. Detected anomalies are classified as potential polyps or natural features using 3D morphological techniques. The final results are flagged for review. The system was evaluated using 15 scenarios. The sensitivity of the system was found to be 65% with 34% false positive detections. Automated diagnosis at CTC is possible and thorough testing is facilitated by augmenting real patient data with computer generated polyps. Ultimately, automated diagnosis will enhance standard CTC and increase performance.

  2. Interactions between Cytokines, Congenital Anomalies of Kidney and Urinary Tract and Chronic Kidney Disease

    PubMed Central

    Simões e Silva, Ana Cristina; Valério, Flávia Cordeiro; Vasconcelos, Mariana Affonso; Miranda, Débora Marques; Oliveira, Eduardo Araújo

    2013-01-01

    Fetal hydronephrosis is the most common anomaly detected on antenatal ultrasound, affecting 1–5% of pregnancies. Postnatal investigation has the major aim of detecting infants with severe urinary tract obstruction and clinically significant urinary tract anomalies among the heterogeneous universe of patients. Congenital uropathies are frequent causes of pediatric chronic kidney disease (CKD). Imaging techniques clearly contribute to this purpose; however, sometimes these exams are invasive, very expensive, and not sufficient to precisely define the best approach as well as the prognosis. Recently, biomarkers have become a focus of clinical research as potentially useful diagnostic tools in pediatric urological diseases. In this regard, recent studies suggest a role for cytokines and chemokines in the pathophysiology of CAKUT and for the progression to CKD. Some authors proposed that the evaluation of these inflammatory mediators might help the management of postnatal uropathies and the detection of patients at high risk of developing chronic kidney disease. Therefore, the aim of this paper is to review general aspects of cytokines and the link between cytokines, CAKUT, and CKD by including experimental and clinical evidence. PMID:24066006

  3. Freezing of Gait Detection in Parkinson's Disease: A Subject-Independent Detector Using Anomaly Scores.

    PubMed

    Pham, Thuy T; Moore, Steven T; Lewis, Simon John Geoffrey; Nguyen, Diep N; Dutkiewicz, Eryk; Fuglevand, Andrew J; McEwan, Alistair L; Leong, Philip H W

    2017-11-01

    Freezing of gait (FoG) is common in Parkinsonian gait and strongly relates to falls. Current clinical FoG assessments are patients' self-report diaries and experts' manual video analysis. Both are subjective and yield moderate reliability. Existing detection algorithms have been predominantly designed in subject-dependent settings. In this paper, we aim to develop an automated FoG detector for subject-independent settings. After extracting highly relevant features, we apply anomaly detection techniques to detect FoG events. Specifically, feature selection is performed using correlation and clusterability metrics. From a list of 244 feature candidates, 36 candidates were selected using saliency and robustness criteria. We develop an anomaly score detector with adaptive thresholding to identify FoG events. Then, using accuracy metrics, we reduce the feature list to seven candidates. Our novel multichannel freezing index was the most selective across all window sizes, achieving sensitivity (specificity) of (). On the other hand, freezing index from the vertical axis was the best choice for a single input, achieving sensitivity (specificity) of () for ankle and () for back sensors. Our subject-independent method is not only significantly more accurate than those previously reported, but also uses a much smaller window (e.g., versus ) and/or lower tolerance (e.g., versus ).
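The paper's exact anomaly score and thresholding rule are not reproduced here; a generic sketch of "anomaly score with adaptive thresholding" is a rolling robust threshold (median + k·MAD over a trailing window), with the trace and constants below as illustrative assumptions:

```python
import numpy as np

def adaptive_threshold_detector(score, win=50, k=6.0):
    """Flag samples whose anomaly score exceeds a locally adapted,
    robust threshold: median + k * MAD over a trailing window."""
    flags = np.zeros(len(score), dtype=bool)
    for t in range(win, len(score)):
        w = score[t - win:t]
        med = np.median(w)
        mad = np.median(np.abs(w - med)) + 1e-9   # robust spread estimate
        flags[t] = score[t] > med + k * mad
    return flags

# Synthetic anomaly-score trace: quiet baseline plus one sustained burst,
# loosely standing in for a freezing index rising during a FoG episode.
rng = np.random.default_rng(0)
score = rng.normal(1.0, 0.05, 400)
score[200:220] += 2.0
flags = adaptive_threshold_detector(score)
```

Because the threshold adapts to the recent baseline, the same detector can be applied across subjects without per-subject calibration, which is the point of the subject-independent design.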

  4. FRaC: a feature-modeling approach for semi-supervised and unsupervised anomaly detection.

    PubMed

    Noto, Keith; Brodley, Carla; Slonim, Donna

    2012-01-01

    Anomaly detection involves identifying rare data instances (anomalies) that come from a different class or distribution than the majority (which are simply called "normal" instances). Given a training set of only normal data, the semi-supervised anomaly detection task is to identify anomalies in the future. Good solutions to this task have applications in fraud and intrusion detection. The unsupervised anomaly detection task is different: Given unlabeled, mostly-normal data, identify the anomalies among them. Many real-world machine learning tasks, including many fraud and intrusion detection tasks, are unsupervised because it is impractical (or impossible) to verify all of the training data. We recently presented FRaC, a new approach for semi-supervised anomaly detection. FRaC is based on using normal instances to build an ensemble of feature models, and then identifying instances that disagree with those models as anomalous. In this paper, we investigate the behavior of FRaC experimentally and explain why FRaC is so successful. We also show that FRaC is a superior approach for the unsupervised as well as the semi-supervised anomaly detection task, compared to well-known state-of-the-art anomaly detection methods, LOF and one-class support vector machines, and to an existing feature-modeling approach.
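FRaC's core idea, predict each feature from the others and score how much an instance disagrees with those predictions, can be sketched as follows. Note the substitutions: FRaC uses ensembles of learned models and a normalized-surprisal score, whereas this illustration uses plain least-squares regressors and a squared-residual score:

```python
import numpy as np

def fit_feature_models(X):
    """For each feature j, fit a model predicting X[:, j] from the other
    features (a linear stand-in for FRaC's per-feature learners)."""
    models = []
    for j in range(X.shape[1]):
        A = np.hstack([np.delete(X, j, axis=1), np.ones((len(X), 1))])
        coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        sigma = (X[:, j] - A @ coef).std() + 1e-9
        models.append((coef, sigma))
    return models

def disagreement_score(models, x):
    """Aggregate disagreement between observed and predicted feature
    values (a crude stand-in for FRaC's normalized surprisal)."""
    total = 0.0
    for j, (coef, sigma) in enumerate(models):
        a = np.append(np.delete(x, j), 1.0)
        total += ((x[j] - a @ coef) / sigma) ** 2
    return total

# Normal data with a learnable relation between features: x1 is about 2 * x0
rng = np.random.default_rng(0)
x0 = rng.normal(0, 1, 500)
X = np.column_stack([x0, 2 * x0 + rng.normal(0, 0.1, 500)])
models = fit_feature_models(X)

s_normal = disagreement_score(models, np.array([1.0, 2.0]))    # obeys the relation
s_anomaly = disagreement_score(models, np.array([1.0, -2.0]))  # violates it
```

An instance can be anomalous even when every individual feature value is in range, as long as the values jointly violate the relations the feature models learned from normal data.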

  5. FRaC: a feature-modeling approach for semi-supervised and unsupervised anomaly detection

    PubMed Central

    Brodley, Carla; Slonim, Donna

    2011-01-01

    Anomaly detection involves identifying rare data instances (anomalies) that come from a different class or distribution than the majority (which are simply called “normal” instances). Given a training set of only normal data, the semi-supervised anomaly detection task is to identify anomalies in the future. Good solutions to this task have applications in fraud and intrusion detection. The unsupervised anomaly detection task is different: Given unlabeled, mostly-normal data, identify the anomalies among them. Many real-world machine learning tasks, including many fraud and intrusion detection tasks, are unsupervised because it is impractical (or impossible) to verify all of the training data. We recently presented FRaC, a new approach for semi-supervised anomaly detection. FRaC is based on using normal instances to build an ensemble of feature models, and then identifying instances that disagree with those models as anomalous. In this paper, we investigate the behavior of FRaC experimentally and explain why FRaC is so successful. We also show that FRaC is a superior approach for the unsupervised as well as the semi-supervised anomaly detection task, compared to well-known state-of-the-art anomaly detection methods, LOF and one-class support vector machines, and to an existing feature-modeling approach. PMID:22639542

  6. Deep-cascade: Cascading 3D Deep Neural Networks for Fast Anomaly Detection and Localization in Crowded Scenes.

    PubMed

    Sabokrou, Mohammad; Fayyaz, Mohsen; Fathy, Mahmood; Klette, Reinhard

    2017-02-17

    This paper proposes a fast and reliable method for anomaly detection and localization in video data showing crowded scenes. Time-efficient anomaly localization is an ongoing challenge and subject of this paper. We propose a cubic-patch-based method, characterised by a cascade of classifiers, which makes use of an advanced feature-learning approach. Our cascade of classifiers has two main stages. First, a light but deep 3D autoencoder is used for early identification of "many" normal cubic patches. This deep network operates on small cubic patches as the first stage, before carefully resizing remaining candidates of interest, and evaluating those at the second stage using a more complex and deeper 3D convolutional neural network (CNN). We divide the deep autoencoder and the CNN into multiple sub-stages which operate as cascaded classifiers. Shallow layers of the cascaded deep networks (designed as Gaussian classifiers, acting as weak single-class classifiers) detect "simple" normal patches such as background patches, and more complex normal patches are detected at deeper layers. It is shown that the proposed novel technique (a cascade of two cascaded classifiers) performs comparable to current top-performing detection and localization methods on standard benchmarks, but outperforms those in general with respect to required computation time.
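The efficiency argument rests on the cascade structure: a cheap stage confidently dismisses "many" easy normal patches so only a few pay for the expensive stage. A toy sketch of that control flow (simple statistics stand in for the paper's autoencoder and CNN stages; all thresholds are invented):

```python
import numpy as np

def cascade_score(patches, cheap_thresh=0.3):
    """Two-stage cascade sketch: a cheap statistic rejects flat 'background'
    patches early; only survivors pay for the expensive stage."""
    scores = np.zeros(len(patches))
    expensive_calls = 0
    for i, p in enumerate(patches):
        if np.std(p) < cheap_thresh:        # stage 1: flat patch -> normal, stop
            continue
        expensive_calls += 1                # stage 2 (stand-in for the deep CNN)
        scores[i] = np.abs(p - p.mean()).max()
    return scores, expensive_calls

rng = np.random.default_rng(0)
background = [rng.normal(0, 0.1, 64) for _ in range(90)]   # flat patches
busy = [rng.normal(0, 1.0, 64) for _ in range(10)]         # need the deep stage
scores, calls = cascade_score(background + busy)
```

Here only 10 of the 100 patches reach the expensive stage, which mirrors how the cascade trades a cheap first pass for large savings in total computation time.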

  7. Anomaly Detection at Multiple Scales (ADAMS)

    DTIC Science & Technology

    2011-11-09

    ...must resort to generating their own data that simulates insider attacks. The Schonlau dataset is the most widely used for academic study. It... ...measurements are estimated by well-known software plagiarism tools. As explained above, there are many different techniques for code transformation...

  8. Camouflage target detection via hyperspectral imaging plus information divergence measurement

    NASA Astrophysics Data System (ADS)

    Chen, Yuheng; Chen, Xinhua; Zhou, Jiankang; Ji, Yiqun; Shen, Weimin

    2016-01-01

    Target detection is one of the most important applications in remote sensing. Nowadays, accurate camouflage target discrimination often relies on spectral imaging techniques, due to their high-resolution spectral/spatial information acquisition ability as well as the wealth of available data processing methods. In this paper, hyperspectral imaging together with the spectral information divergence measure is used to solve the camouflage target detection problem. A self-developed visual-band hyperspectral imaging device is adopted to collect data cubes of a certain experimental scene before spectral information divergences are computed so as to discriminate camouflaged targets and anomalies. Full-band information divergences are measured to evaluate the target detection effect visually and quantitatively. Information divergence measurement is proved to be a low-cost and effective tool for the target detection task and can be further developed for other target detection applications beyond spectral imaging.
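Spectral information divergence (SID) treats each normalized spectrum as a probability distribution and sums the two relative entropies between them. A minimal sketch; the five-band spectra below are invented for illustration:

```python
import numpy as np

def sid(s1, s2, eps=1e-12):
    """Spectral information divergence: normalize each spectrum to a
    probability distribution p, q, then return D(p||q) + D(q||p)."""
    p = s1 / s1.sum() + eps
    q = s2 / s2.sum() + eps
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# Illustrative 5-band reflectance spectra
grass     = np.array([0.10, 0.15, 0.20, 0.60, 0.70])
camo_net  = np.array([0.11, 0.14, 0.22, 0.40, 0.45])  # similar spectral shape
bare_soil = np.array([0.30, 0.35, 0.40, 0.42, 0.45])  # different spectral shape
```

Because SID compares spectral shape rather than absolute brightness, a camouflage material that matches a background in color but not in spectral profile yields a larger divergence than genuine background pixels.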

  9. A Healthcare Utilization Analysis Framework for Hot Spotting and Contextual Anomaly Detection

    PubMed Central

    Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram

    2012-01-01

    Patient medical records today contain vast amounts of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify in a timely manner heavy users of the system and their patterns of utilization so that targeted intervention programs can be instituted, and anomaly detection: the ability to identify anomalous utilization cases where the patients incurred levels of utilization that are unexpected given their clinical characteristics, which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease-specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map patient’s clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection.
In future work we plan to incorporate additional sources of observational data including EMRs and disease registries, and develop analytics models to leverage temporal relationships among medical encounters to provide more in-depth insights. PMID:23304306
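The contextual anomaly detection component can be sketched as a regression of utilization on clinical characteristics, flagging patients whose actual utilization deviates strongly from the model's expectation. The features and coefficients below are invented stand-ins, not the paper's model:

```python
import numpy as np

# Hypothetical clinical characteristics and an outcome (visit counts)
rng = np.random.default_rng(0)
n = 300
severity = rng.uniform(0, 10, n)             # illustrative clinical feature
comorbid = rng.integers(0, 5, n).astype(float)
visits = 2 + 1.5 * severity + 3 * comorbid + rng.normal(0, 1, n)
visits[0] = 80.0                             # one patient with unexplained heavy use

# Fit expected utilization from characteristics, then flag large residuals
A = np.column_stack([severity, comorbid, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, visits, rcond=None)
resid = visits - A @ coef
z = (resid - resid.mean()) / resid.std()
anomalies = np.where(np.abs(z) > 4)[0]       # contextual anomalies
```

The key property is contextual: a high absolute utilization is not flagged if the patient's characteristics explain it, while a moderate level can be flagged when the model expects far less.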

  10. A healthcare utilization analysis framework for hot spotting and contextual anomaly detection.

    PubMed

    Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram

    2012-01-01

    Patient medical records today contain vast amounts of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify in a timely manner heavy users of the system and their patterns of utilization so that targeted intervention programs can be instituted, and anomaly detection: the ability to identify anomalous utilization cases where the patients incurred levels of utilization that are unexpected given their clinical characteristics, which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease-specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map patient's clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection.
In future work we plan to incorporate additional sources of observational data including EMRs and disease registries, and develop analytics models to leverage temporal relationships among medical encounters to provide more in-depth insights.

  11. Neutron activation determination of iridium, gold, platinum, and silver in geologic samples

    USGS Publications Warehouse

    Millard, H.T.

    1987-01-01

    Low-level methods for the determination of iridium and other noble metals have become increasingly important in recent years due to interest in locating abundance anomalies associated with the Cretaceous and Tertiary (K-T) boundary. Typical iridium anomalies are in the range of 1 to 100 ??g/kg (ppb). Thus methods with detection limits near 0.1 ??g/kg should be adequate to detect K-T boundary anomalies. Radiochemical neutron activation analysis methods continue to be required although instrumental neutron activation analysis techniques employing elaborate gamma-counters are under development. In the procedure developed in this study samples irradiated in the epithermal neutron facility of the U. S. Geological Survey TRIGA Reactor (Denver, Colorado) are treated with a mini-fire assay technique. The iridium, gold, and silver are collected in a 1-gram metallic lead button. Primary contaminants at this stage are arsenic and antimony. These can be removed by heating the button with a mixture of sodium perioxide and sodium hydroxide. The resulting 0.2-gram lead bead is counted in a Compton suppression spectrometer. Carrier yields are determined by reirradiation of the lead beads. This procedure has been applied to the U.S.G.S. Standard Rock PCC-1 and samples from K-T boundary sites in the Western Interior of North America. ?? 1987 Akade??miai Kiado??.

  12. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data.

    PubMed

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets that differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied to unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection, and the life science and medical domains. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. In conclusion, we give advice on algorithm selection for typical real-world tasks.
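
    One of the simplest detectors in the family evaluated here, the global k-nearest-neighbor anomaly score, can be sketched as follows. This is an illustrative toy reimplementation on synthetic data, not the evaluation code published with the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Unlabeled data: a dense normal cluster plus a few global outliers.
normal = rng.normal(0, 1, size=(300, 2))
outliers = rng.uniform(6, 8, size=(5, 2))
X = np.vstack([normal, outliers])

def knn_score(X, k=10):
    """Global k-NN anomaly score: mean distance to the k nearest neighbors."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude self-distance
    return np.sort(d, axis=1)[:, :k].mean(axis=1)

scores = knn_score(X)
top5 = np.argsort(scores)[-5:]             # the five highest-scoring points
print(sorted(top5.tolist()))
```

The injected outliers (indices 300-304) receive the largest scores. Local methods such as LOF differ by normalizing each point's score against the scores of its neighborhood, which matters when cluster densities vary.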

  13. Steganography anomaly detection using simple one-class classification

    NASA Astrophysics Data System (ADS)

    Rodriguez, Benjamin M.; Peterson, Gilbert L.; Agaian, Sos S.

    2007-04-01

    There are several security issues tied to multimedia when implementing the various applications in the cellular phone and wireless industry. One primary concern is the potential ease of implementing a steganography system. Traditionally, the only mechanism to embed information into a media file has been with a desktop computer. However, as the cellular phone and wireless industry matures, it becomes much simpler for the same techniques to be performed using a cell phone. In this paper, two methods are compared that classify cell phone images as either anomalous or clean, where a clean image is one in which no alterations have been made and an anomalous image is one in which information has been hidden within the image. An image in which information has been hidden is known as a stego image. The main difficulty in detecting steganographic content in cell phone images with machine learning is that classifiers are trained on specific embedding procedures in order to determine whether a given method has been used to generate a stego image. This leads to a possible flaw in the system when the learned stego model is faced with a new stego method that does not match the existing model. The proposed solution to this problem is to develop systems that detect steganography as an anomaly, making the embedding method irrelevant to detection. Two classification methods applicable to this anomaly detection problem are single-class support vector machines (SVMs) and Parzen-window. Empirical comparison of the two approaches shows that Parzen-window outperforms the single-class SVM, most likely because it generalizes less.
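
    A Parzen-window (kernel density) anomaly detector of the kind compared here can be sketched as follows: estimate the density of "clean" feature vectors and flag test samples whose density falls below a low percentile of the training densities. The features, bandwidth, and threshold below are hypothetical stand-ins for the image features used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# "Clean" training images summarized as feature vectors; stego-like images are
# expected to fall in low-density regions of the clean-feature distribution.
clean_train = rng.normal(0, 1, size=(200, 3))
test = np.vstack([rng.normal(0, 1, size=(20, 3)),    # clean test samples
                  rng.normal(5, 0.5, size=(3, 3))])  # anomalous (stego-like)

def parzen_log_density(x, data, h=0.5):
    """Gaussian Parzen-window estimate of log density (constants dropped)."""
    sq = ((x[:, None, :] - data[None, :, :]) ** 2).sum(-1) / (2 * h * h)
    m = (-sq).max(axis=1)                 # log-sum-exp stabilization
    return m + np.log(np.exp(-sq - m[:, None]).sum(axis=1) / len(data))

scores = parzen_log_density(test, clean_train)
threshold = np.percentile(parzen_log_density(clean_train, clean_train), 1)
flagged = np.flatnonzero(scores < threshold)
print(flagged.tolist())
```

The three stego-like samples (indices 20-22) are flagged; a single-class SVM would instead learn a boundary around the clean class and flag points outside it.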

  14. Analysis of LANDSAT-4 TM Data for Lithologic and Image Mapping Purpose

    NASA Technical Reports Server (NTRS)

    Podwysocki, M. H.; Salisbury, J. W.; Bender, L. V.; Jones, O. D.; Mimms, D. L.

    1984-01-01

    Lithologic mapping techniques using the near infrared bands of the Thematic Mapper onboard the LANDSAT 4 satellite are investigated. These methods are coupled with digital masking to test the capability of mapping geologic materials. Data are examined under medium to low Sun angle illumination conditions to determine the detection limits of materials with absorption features. Several detection anomalies are observed and explained.

  15. Large Scale System Defense

    DTIC Science & Technology

    2008-10-01

    AD); Aeolos, a distributed intrusion detection and event correlation infrastructure; STAND, a training-set sanitization technique applicable to ADs... Summary of findings: (a) Automatic Patch Generation; (b) Better Patch Management; (c) Artificial Diversity; (d) Distributed Anomaly Detection

  16. Propulsion health monitoring of a turbine engine disk using spin test data

    NASA Astrophysics Data System (ADS)

    Abdul-Aziz, Ali; Woike, Mark; Oza, Nikunj; Matthews, Bryan; Baakilini, George

    2010-03-01

    Online detection techniques to monitor the health of rotating engine components are becoming increasingly attractive options to aircraft engine companies in order to increase safety of operation and lower maintenance costs. Health monitoring remains challenging to implement, especially in the presence of scattered loading conditions and variable crack sizes, component geometries, and material properties. The current trend, however, is to utilize noninvasive health monitoring or nondestructive techniques to detect hidden flaws and mini cracks before any catastrophic event occurs. These techniques go further to evaluate material discontinuities and other anomalies that have grown to the level of critical defects that can lead to failure. Generally, health monitoring is highly dependent on sensor systems that are capable of performing in various engine environmental conditions and able to transmit a signal upon a predetermined crack length, while acting in a neutral form upon the overall performance of the engine system. Efforts are under way at NASA Glenn Research Center, through support of the Intelligent Vehicle Health Management Project (IVHM), to develop and implement such sensor technology for a wide variety of applications. These efforts are focused on developing high-temperature, wireless, low-cost, and durable products. Therefore, in an effort to address the technical issues concerning health monitoring of a rotor disk, this paper considers data collected from an experimental study using high-frequency capacitive sensor technology to capture blade tip clearance and tip timing measurements in a rotating engine-like disk to predict disk faults and assess its structural integrity. The experimental results, collected at a range of rotational speeds from tests conducted at the NASA Glenn Research Center's Rotordynamics Laboratory, will be evaluated using multiple data-driven anomaly detection techniques to identify anomalies in the disk. 
This study is expected to present a select evaluation of online health monitoring of a rotating disk using these high caliber sensors and test the capability of the in-house spin system.

  17. A model for anomaly classification in intrusion detection systems

    NASA Astrophysics Data System (ADS)

    Ferreira, V. O.; Galhardi, V. V.; Gonçalves, L. B. L.; Silva, R. C.; Cansian, A. M.

    2015-09-01

    Intrusion Detection Systems (IDS) are traditionally divided into two types according to the detection methods they employ, namely (i) misuse detection and (ii) anomaly detection. Anomaly detection has been widely used, and its main advantage is the ability to detect new attacks. However, the analysis of the anomalies generated can become expensive, since they often carry no clear information about the malicious events they represent. In this context, this paper presents a model for automated classification of alerts generated by an anomaly-based IDS. The main goal is either to classify detected anomalies into well-defined attack taxonomies or to identify whether an alert is a false positive misclassified by the IDS. Some common attacks on computer networks were considered, and we achieved important results that can equip security analysts with better resources for their analyses.

  18. An immunity-based anomaly detection system with sensor agents.

    PubMed

    Okamoto, Takeshi; Ishida, Yoshiteru

    2009-01-01

    This paper proposes an immunity-based anomaly detection system with sensor agents based on the specificity and diversity of the immune system. Each agent is specialized to react to the behavior of a specific user. Multiple diverse agents decide whether the behavior is normal or abnormal. Conventional systems have used only a single sensor to detect anomalies, while the immunity-based system makes use of multiple sensors, which leads to improvements in detection accuracy. In addition, we propose an evaluation framework for the anomaly detection system, which is capable of evaluating the differences in detection accuracy between internal and external anomalies. This paper focuses on anomaly detection in users' command sequences on UNIX-like systems. In experiments, the immunity-based system outperformed some of the best conventional systems.
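
    The multiple-sensor idea can be sketched as a majority vote over diverse per-user agents. The scalar "commands per minute" profile, thresholds, and vote rule below are illustrative stand-ins, not taken from the paper.

```python
# Each "sensor agent" reacts to deviation from one learned user profile (here a
# scalar rate); the final decision is a majority vote over the diverse agents.
def make_agent(profile_mean, threshold=3.0):
    def agent(session_rate):
        return abs(session_rate - profile_mean) > threshold  # True = "abnormal" vote
    return agent

# Three diverse agents specialized to slightly different views of one user.
agents = [make_agent(m) for m in (9.8, 10.0, 10.2)]

def classify(session_rate):
    votes = sum(a(session_rate) for a in agents)
    return "abnormal" if votes > len(agents) / 2 else "normal"

print(classify(10.1), classify(17.5))
```

Because each agent sees the behavior through a different profile, a session must deviate from the consensus of several sensors before it is declared abnormal, which reduces single-sensor false alarms.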

  19. Advanced Cyber Industrial Control System Tactics, Techniques, and Procedures (ACI TTP) for Department of Defense (DOD) Industrial Control Systems (ICS)

    DTIC Science & Technology

    2016-08-10

    enable JCS managers to detect advanced cyber attacks, mitigate the effects of those attacks, and recover their networks following an attack. It also... managers of ICS networks to Detect, Mitigate, and Recover from nation-state-level cyber attacks (strategic, deliberate, well-trained, and funded...Successful Detection of cyber anomalies is best achieved when IT and ICS managers remain in close coordination. The Integrity Checks Table

  20. Constraints on lithospheric structure from satellite potential field data: Africa and Asia. Analysis and interpretation of MAGSAT anomalies over North Africa

    NASA Technical Reports Server (NTRS)

    Phillips, R. J.

    1986-01-01

    Crustal anomaly detection with MAGSAT data is frustrated by the inherent resolving power of the data and by contamination from the external and core fields. The quality of the data might be tested by modeling specific tectonic features which produce anomalies that fall within the proposed resolution and crustal amplitude capabilities of the MAGSAT fields. To test this hypothesis, the north African hotspots associated with Ahaggar, Tibesti, and Darfur have been modeled as magnetic induction anomalies due solely to a shallower depth to the Curie isotherm surface beneath these features. The MAGSAT data were reduced by subtracting the external and core fields to isolate the scalar and vertical-component crustal signals. The predicted model magnetic signal arising from the surface topography of the uplift and the Curie isotherm surface was calculated at MAGSAT altitudes by the Fourier transform technique, modified to allow for variable magnetization. In summary, it is suggested that the region beneath Ahaggar is associated with a strong thermal anomaly, and the predicted anomaly best fits the associated MAGSAT anomaly if the African plate is moving in a northeasterly direction.

  1. Machine learning for the automatic detection of anomalous events

    NASA Astrophysics Data System (ADS)

    Fisher, Wendy D.

    In this dissertation, we describe our research contributions for a novel approach to the application of machine learning for the automatic detection of anomalous events. We work in two different domains to ensure a robust data-driven workflow that could be generalized for monitoring other systems. Specifically, in our first domain, we begin with the identification of internal erosion events in earth dams and levees (EDLs) using geophysical data collected from sensors located on the surface of the levee. As EDLs across the globe reach the end of their design lives, effectively monitoring their structural integrity is of critical importance. The second domain of interest is related to mobile telecommunications, where we investigate a system for automatically detecting non-commercial base station routers (BSRs) operating in protected frequency space. The presence of non-commercial BSRs can disrupt the connectivity of end users, cause service issues for the commercial providers, and introduce significant security concerns. We provide our motivation, experimentation, and results from investigating a generalized novel data-driven workflow using several machine learning techniques. In Chapter 2, we present results from our performance study that uses popular unsupervised clustering algorithms to gain insights into our real-world problems, and evaluate our results using internal and external validation techniques. Using EDL passive seismic data from an experimental laboratory earth embankment, results consistently show a clear separation of events from non-events in four of the five clustering algorithms applied. Chapter 3 uses a multivariate Gaussian machine learning model to identify anomalies in our experimental data sets. For the EDL work, we used experimental data from two different laboratory earth embankments. Additionally, we explore five wavelet transform methods for signal denoising. The best performance is achieved with the Haar wavelets. 
We achieve up to 97.3% overall accuracy and less than 1.4% false negatives in anomaly detection. In Chapter 4, we investigate two-class and one-class support vector machines (SVMs) for an effective anomaly detection system. We again use the two different EDL data sets from experimental laboratory earth embankments (each having approximately 80% normal and 20% anomalies) to ensure our workflow is robust enough to work with multiple data sets and different types of anomalous events (e.g., cracks and piping). We apply Haar wavelet-denoising techniques and extract nine spectral features from decomposed segments of the time series data. The two-class SVM with 10-fold cross validation achieved over 94% overall accuracy and a 96% F1-score. Our approach provides a means for automatically identifying anomalous events using various machine learning techniques. Detecting internal erosion events in aging EDLs, earlier than is currently possible, can allow more time to prevent or mitigate catastrophic failures. Results show that we can successfully separate normal from anomalous data observations in passive seismic data, and provide a step towards techniques for continuous real-time monitoring of EDL health. Our lightweight non-commercial BSR detection system also shows promise in separating commercial from non-commercial BSR scans without the need for prior geographic location information, extensive time-lapse surveys, or a database of known commercial carriers. (Abstract shortened by ProQuest.)
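
    The multivariate Gaussian model of Chapter 3 can be sketched as follows: fit a mean and covariance to normal observations and flag points whose squared Mahalanobis distance (equivalently, whose Gaussian density) is extreme. The two-dimensional features, threshold, and data below are synthetic placeholders; the dissertation's wavelet-denoised seismic features are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(4)

# Fit a multivariate Gaussian to "normal" (non-event) feature vectors.
normal = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], size=500)
mu = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def mahalanobis_sq(x):
    """Squared Mahalanobis distance of each row of x from the fitted model."""
    d = x - mu
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)

# Threshold at the 99.9th percentile of the training distances; larger
# distances (lower densities) are declared anomalous events.
tau = np.percentile(mahalanobis_sq(normal), 99.9)
test = np.array([[0.5, -0.2],    # ordinary observation
                 [6.0, 6.0]])    # anomalous observation
flags = mahalanobis_sq(test) > tau
print(flags.tolist())
```

In practice the threshold would be chosen on a labeled validation set to trade false negatives against false alarms, as the dissertation does via its reported accuracy and false-negative rates.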

  2. A Real-Time Earthquake Precursor Detection Technique Using TEC from a GPS Network

    NASA Astrophysics Data System (ADS)

    Alp Akyol, Ali; Arikan, Feza; Arikan, Orhan

    2016-07-01

    Anomalies have been observed in the ionospheric electron density distribution prior to strong earthquakes. However, most of the reported results were obtained by analyses performed after the earthquakes, so their implementation in practice is highly problematic. Recently, a novel earthquake precursor detection technique based on spatio-temporal analysis of Total Electron Content (TEC) data obtained from the Turkish National Permanent GPS Network (TNPGN) was developed by the IONOLAB group (www.ionolab.org). In the present study, the developed detection technique is implemented in a causal setup over the available test-phase data set, which enables real-time implementation. The performance of the developed earthquake precursor detection technique is evaluated using 10-fold cross validation over the data obtained in 2011. Among the 23 earthquakes with magnitudes higher than 5, the developed technique detected precursors of 14 earthquakes while producing 8 false alarms. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR 14/001 projects.

  3. Radio Frequency Based Programmable Logic Controller Anomaly Detection

    DTIC Science & Technology

    2013-09-01

    include wireless radios, IEEE 802.15 Blue- tooth devices, cellular phones, and IEEE 802.11 WiFi networking devices. While wireless communication...MacKenzie, H. Shamoon Malware and SCADA Security What are the Im- pacts? . Technical Report, Tofino Security, Sep 2012. 61. Mateti,P. Hacking Techniques

  4. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data

    PubMed Central

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets that differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied to unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection, and the life science and medical domains. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. In conclusion, we give advice on algorithm selection for typical real-world tasks. PMID:27093601

  5. Statistical Traffic Anomaly Detection in Time-Varying Communication Networks

    DTIC Science & Technology

    2015-02-01

    methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. Statistical Traffic Anomaly Detection in Time...our methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. Index Terms—Statistical anomaly detection...anomaly detection but also for understanding the normal traffic in time-varying networks. C. Comparison with vanilla stochastic methods For both types

  6. Statistical Traffic Anomaly Detection in Time Varying Communication Networks

    DTIC Science & Technology

    2015-02-01

    methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. Statistical Traffic Anomaly Detection in Time...our methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. Index Terms—Statistical anomaly detection...anomaly detection but also for understanding the normal traffic in time-varying networks. C. Comparison with vanilla stochastic methods For both types

  7. Radar and infrared remote sensing of geothermal features at Pilgrim Springs, Alaska

    NASA Technical Reports Server (NTRS)

    Dean, K. G.; Forbes, R. B.; Turner, D. L.; Eaton, F. D.; Sullivan, K. D.

    1982-01-01

    High-altitude radar and thermal imagery collected by the NASA research aircraft WB57F were used to examine the structural setting and distribution of radiant temperatures of geothermal anomalies in the Pilgrim Springs, Alaska area. Like-polarized radar imagery with perpendicular look directions provides the best structural data for lineament analysis, although more than half the mapped lineaments are easily detectable on conventional aerial photography. Radiometer data and imagery from a thermal scanner were used to evaluate radiant surface temperatures, which ranged from 3 to 17 C. Density slicing of the evening imagery detected thermal anomalies associated with geothermal heat sources. The study indicates that high-altitude predawn thermal imagery may be able to locate relatively large areas of hot ground in site-specific studies in the vegetated Alaskan terrain. This imagery will probably not detect gentle lateral gradients.
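
    Density slicing amounts to binning pixel values into discrete temperature classes and highlighting the top classes. A minimal sketch with a made-up radiant-temperature image and illustrative slice boundaries:

```python
import numpy as np

# Radiant-temperature image (degrees C); density slicing assigns each pixel to
# a discrete temperature class, making warm geothermal ground stand out.
temps = np.array([[ 3.0,  4.5,  5.0],
                  [ 6.0, 14.0, 16.5],
                  [ 5.5, 15.0,  4.0]])

bins = [3, 6, 9, 12, 15, 17]       # slice boundaries (deg C), illustrative
sliced = np.digitize(temps, bins)  # class index per pixel

hot = sliced >= 5                  # top slice: pixels of 15 deg C and above
print(np.argwhere(hot).tolist())
```

The two warmest pixels are isolated as the thermal-anomaly class; on real scanner imagery the slice boundaries would be tuned to separate geothermal ground from the ambient background distribution.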

  8. Magnetic Gradient Horizontal Operator (MHGO) useful for detecting objects buried at shallow depth: cultural heritage (Villa degli Antonini, Rota Rio)

    NASA Astrophysics Data System (ADS)

    Di Filippo, Michele; Di Nezza, Maria

    2016-04-01

    Several factors were taken into consideration in order to appropriately tailor the geophysical exploration of the cultural heritage sites. Given that each site had long been neglected and in recent times used as an illegal dumping area, we thoroughly evaluated the advantages and limitations of each specific technique for this investigation, as well as the general conditions and history of each site. We took into account the extension of the areas to be investigated and the need for rapid data acquisition and processing. Furthermore, the survey required instrumentation sensitive to small background contrasts and as little affected as possible by background noise sources. In order to ascertain the existence and location of buried walls, a magnetic gradiometer survey (MAG) was planned. The map of the magnetic anomalies is computed not with reduction to the pole (RTP) but with a magnetic horizontal gradient operator (MHGO). The MHGO generates, from a grid of vertical gradients, a grid of steepest slopes (i.e., the magnitude of the gradient) at each point on the surface. The MHGO is reported as a number (rise over run) rather than in degrees, and its direction is opposite to that of the slope. The MHGO is zero for a horizontal surface and approaches infinity as the slope approaches the vertical. The gradient data are especially useful for detecting objects buried at shallow depth. The map reveals some details of the anomalies of the geomagnetic field. Magnetic anomalies due to walls are more evident than in the total intensity map, whereas anomalies due to concentrations of debris are very weak. In this work we describe the results of magnetometry investigations at two archaeological sites: "Villa degli Antonini" (Genzano, Rome) and Rota Ria (Mugnano in Teverina, Viterbo). 
Since the main goal of the investigation was to understand the nature of the magnetic anomalies with a cost-effective method, we also carried out detection and location of buried underground structures using different geophysical instruments and techniques (EMI, GPR, and microgravity), and so far have excavated only a targeted sector of the labeled anomaly area in order to test the validity of the geophysical survey.
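
    The steepest-slope idea behind the MHGO can be sketched on a synthetic grid: compute the horizontal gradient magnitude of a gridded field and observe that it peaks over the edge of a wall-like source. This is an illustrative numpy sketch, not the authors' processing chain.

```python
import numpy as np

# Gridded field with a step-like anomaly across a buried "wall" at column 25.
x, y = np.meshgrid(np.arange(50), np.arange(50))
field = np.tanh((x - 25.0) / 2.0)

gy, gx = np.gradient(field)      # per-cell slopes along rows and columns
magnitude = np.hypot(gx, gy)     # steepest slope (rise over run) at each node

# The gradient magnitude peaks over the source edge regardless of field sign,
# which is why linear highs in this map trace buried walls.
peak_cols = magnitude.argmax(axis=1)
print(np.unique(peak_cols).tolist())
```

Every row's maximum falls on the column of the buried edge, so a wall appears as a continuous linear high in the gradient-magnitude map even where the total-field anomaly changes sign.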

  9. Application of Ground-Penetrating Radar for Detecting Internal Anomalies in Tree Trunks with Irregular Contours.

    PubMed

    Li, Weilin; Wen, Jian; Xiao, Zhongliang; Xu, Shengxia

    2018-02-22

    To assess the health condition of tree trunks, it is necessary to estimate the layers and anomalies of their internal structure. The main objective of this paper is to investigate the internal part of tree trunks while accounting for their irregular contours. In this respect, we used ground-penetrating radar (GPR) for non-invasive detection of defects and deterioration in living tree trunks. The Hilbert transform algorithm and the reflection amplitudes were used to estimate the relative dielectric constant. The point cloud data technique was applied as well to extract the irregular contours of trunks. The feasibility and accuracy of the methods were examined through numerical simulations and laboratory and field measurements. The results demonstrated that the applied methodology allowed accurate characterization of the internal inhomogeneity. Furthermore, the point cloud technique resolved the trunk well by providing high-precision coordinate information. This study also demonstrated that cross-section tomography provided images with high resolution and accuracy. These integrated techniques thus proved to be promising for observing tree trunks and other cylindrical objects. The applied approaches offer great promise for future 3D reconstruction of tomographic images with radar waves.
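
    The Hilbert-transform step can be sketched as follows: compute the analytic signal of a radar trace and take its magnitude to obtain the envelope, from which the reflection amplitude and arrival time are picked. The trace, wavelet, and sampling rate below are synthetic placeholders, not the paper's field data; the analytic signal is built with a numpy-only FFT construction.

```python
import numpy as np

def envelope(x):
    """Instantaneous amplitude via the analytic signal (FFT-based Hilbert transform)."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0       # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0           # keep the Nyquist bin
    return np.abs(np.fft.ifft(np.fft.fft(x) * h))

# Synthetic GPR trace: an oscillatory wavelet reflected at t = 0.2 plus noise.
fs = 1000.0
t = np.arange(0, 0.5, 1 / fs)
wavelet = np.exp(-(t - 0.2) ** 2 / (2 * 0.005 ** 2)) * np.cos(2 * np.pi * 100 * (t - 0.2))
trace = wavelet + 0.01 * np.random.default_rng(5).normal(size=t.size)

env = envelope(trace)
arrival = t[env.argmax()]          # picked two-way travel time
amplitude = env.max()              # picked reflection amplitude
print(arrival, amplitude)
```

The picked amplitude and travel time are the quantities the paper feeds into its estimate of the relative dielectric constant.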

  10. Semi-supervised anomaly detection - towards model-independent searches of new physics

    NASA Astrophysics Data System (ADS)

    Kuusela, Mikael; Vatanen, Tommi; Malmi, Eric; Raiko, Tapani; Aaltonen, Timo; Nagai, Yoshikazu

    2012-06-01

    Most classification algorithms used in high energy physics fall under the category of supervised machine learning. Such methods require a training set containing both signal and background events and are prone to classification errors should this training data be systematically inaccurate, for example due to the assumed MC model. To complement such model-dependent searches, we propose an algorithm based on semi-supervised anomaly detection techniques which does not require an MC training sample for the signal data. We first model the background using a multivariate Gaussian mixture model. We then search for deviations from this model by fitting to the observations a mixture of the background model and a number of additional Gaussians. This allows us to perform pattern recognition of any anomalous excess over the background. We show by comparison to neural network classifiers that such an approach is much more robust to misspecification of the signal MC than supervised classification. In cases where there is an unexpected signal, a neural network might fail to identify it correctly, while anomaly detection does not suffer from such a limitation. On the other hand, when there are no systematic errors in the training data, both methods perform comparably.
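
    The fitting strategy can be sketched in one dimension: model the background, then fit the observations with a mixture of the fixed background model plus one extra Gaussian whose mean is scanned, taking the likelihood-maximizing location as the anomalous excess. This is a deliberately simplified sketch (a single background Gaussian, fixed signal width and mixing weight); the paper fits a multivariate Gaussian mixture with free parameters.

```python
import numpy as np

rng = np.random.default_rng(6)

# Background-only training sample, and observations possibly containing a
# localized excess ("signal") on top of the background.
background = rng.normal(0.0, 1.0, size=5000)
observed = np.concatenate([rng.normal(0.0, 1.0, 950), rng.normal(4.0, 0.3, 50)])

def log_norm(x, mu, sig):
    return -0.5 * ((x - mu) / sig) ** 2 - np.log(sig * np.sqrt(2.0 * np.pi))

# Background model: a single Gaussian fitted to the background sample.
mu_b, sig_b = background.mean(), background.std()

# Fit observations with (1 - w) * background + w * N(m, 0.3), scanning the
# extra component's mean m over a grid (mixing weight fixed for simplicity).
w = 0.05
grid = np.linspace(-6.0, 6.0, 241)
ll = [np.log((1 - w) * np.exp(log_norm(observed, mu_b, sig_b))
             + w * np.exp(log_norm(observed, m, 0.3))).sum() for m in grid]
signal_mean = float(grid[int(np.argmax(ll))])
print(signal_mean)
```

The likelihood is maximized when the extra Gaussian sits on the injected excess near 4.0, illustrating how an unexpected signal is found without any signal MC.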

  11. MODVOLC2: A Hybrid Time Series Analysis for Detecting Thermal Anomalies Applied to Thermal Infrared Satellite Data

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Wright, R.; Pilger, E.

    2009-12-01

    We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT) to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, and it detects pixels that fall outside the envelope of normal thermal behavior, (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than using MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for the MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. 
We also found that MODVOLC2 achieved good results on multiple sensors (MODIS and GOES), which provides confidence that MODVOLC2 can be run on future instruments regardless of their spatial and temporal resolutions. The improved performance of MODVOLC2 over MODVOLC makes possible the detection of lower temperature thermal anomalies that will be useful in improving our ability to document Earth’s volcanic eruptions as well as detect possible low temperature thermal precursors to larger eruptions.
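
    Steps (2)-(3) of MODVOLC2 can be sketched as a per-pixel reference/variability envelope test. The image stack below is synthetic and the 5-sigma threshold is illustrative, not the algorithm's actual tuning.

```python
import numpy as np

rng = np.random.default_rng(7)

# Per-pixel time series of radiance-like values for one calendar month.
n_images, h, w = 100, 8, 8
stack = rng.normal(300.0, 2.0, size=(n_images, h, w))
stack[-1, 4, 4] = 330.0     # a hot pixel (e.g., active lava) in the newest image

# Reference image = per-pixel mean of prior images; variability = per-pixel std.
reference = stack[:-1].mean(axis=0)
variability = stack[:-1].std(axis=0)

# Detect pixels in the newest image that fall outside the normal envelope.
newest = stack[-1]
z = (newest - reference) / variability
detections = np.argwhere(z > 5.0)
print(detections.tolist())
```

Only the injected hot pixel exceeds the envelope, mirroring the paper's point that good reference and variability images (hence enough images per calendar month) are what make the time-series test reliable.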

  12. Geothermal area detection using Landsat ETM+ thermal infrared data and its mechanistic analysis—A case study in Tengchong, China

    NASA Astrophysics Data System (ADS)

    Qin, Qiming; Zhang, Ning; Nan, Peng; Chai, Leilei

    2011-08-01

    Thermal infrared (TIR) remote sensing is an important technique in the exploration of geothermal resources. In this study, a geothermal survey is conducted in the Tengchong area of Yunnan province, China, using TIR data from the Landsat-7 Enhanced Thematic Mapper Plus (ETM+) sensor. After radiometric calibration, atmospheric correction, and emissivity calculation, a simple but efficient single-channel algorithm with acceptable precision is applied to retrieve the land surface temperature (LST) of the study area. LST anomaly areas with temperatures about 4-10 K higher than the background are identified. Four geothermal areas are identified through discussion of the geothermal mechanism and further analysis of the regional geologic structure. The research reveals that the distribution of geothermal areas is consistent with the fault development in the study area. Magmatism contributes an abundant thermal source to the study area, and the faults provide channels for heat transfer from the Earth's interior to the land surface, facilitating the presence of geothermal anomalies. Finally, we conclude that TIR remote sensing is a cost-effective technique to detect LST anomalies. Combining TIR remote sensing with geological analysis and an understanding of the geothermal mechanism is an accurate and efficient approach to geothermal area detection.

  13. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7798, September 2016, US Army Research Laboratory. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server, by Christian D Schlesiger, Computational and Information Sciences Directorate, ARL.

  14. Using Multiple Robust Parameter Design Techniques to Improve Hyperspectral Anomaly Detection Algorithm Performance

    DTIC Science & Technology

    2009-03-01

    Set negative pixel values = 0 (remove bad pixels): [m,n] = size(data_matrix_new); for i = 1:m, for j = 1:n, if ... everything from packaging toothpaste to high speed fluid dynamics. While future engagements will continue to require the development of specialized

  15. Multiple input electrode gap controller

    DOEpatents

    Hysinger, C.L.; Beaman, J.J.; Melgaard, D.K.; Williamson, R.L.

    1999-07-27

    A method and apparatus for controlling vacuum arc remelting (VAR) furnaces by estimation of electrode gap based on a plurality of secondary estimates derived from furnace outputs. The estimation is preferably performed by Kalman filter. Adaptive gain techniques may be employed, as well as detection of process anomalies such as glows. 17 figs.

  16. Multiple input electrode gap controller

    DOEpatents

    Hysinger, Christopher L.; Beaman, Joseph J.; Melgaard, David K.; Williamson, Rodney L.

    1999-01-01

    A method and apparatus for controlling vacuum arc remelting (VAR) furnaces by estimation of electrode gap based on a plurality of secondary estimates derived from furnace outputs. The estimation is preferably performed by Kalman filter. Adaptive gain techniques may be employed, as well as detection of process anomalies such as glows.

  17. Adiabatic Quantum Anomaly Detection and Machine Learning

    NASA Astrophysics Data System (ADS)

    Pudenz, Kristen; Lidar, Daniel

    2012-02-01

    We present methods of anomaly detection and machine learning using adiabatic quantum computing. The machine learning algorithm is a boosting approach which seeks to optimally combine somewhat accurate classification functions to create a unified classifier which is much more accurate than its components. This algorithm then becomes the first part of the larger anomaly detection algorithm. In the anomaly detection routine, we first use adiabatic quantum computing to train two classifiers which detect two sets, the overlap of which forms the anomaly class. We call this the learning phase. Then, in the testing phase, the two learned classification functions are combined to form the final Hamiltonian for an adiabatic quantum computation, the low energy states of which represent the anomalies in a binary vector space.

  18. Tone calibration technique: A digital signaling scheme for mobile applications

    NASA Technical Reports Server (NTRS)

    Davarian, F.

    1986-01-01

    Residual carrier modulation is conventionally used in a communication link to assist the receiver with signal demodulation and detection. Although suppressed carrier modulation has a slight power advantage over the residual carrier approach in systems enjoying a high level of stability, it lacks sufficient robustness to be used in channels severely contaminated by noise, interference and propagation effects. In mobile links, in particular, the vehicle motion and multipath waveform propagation affect the received carrier in an adverse fashion. A residual carrier scheme that uses a pilot carrier to calibrate a mobile channel against multipath fading anomalies is described. The benefits of this scheme, known as tone calibration technique, are described. A brief study of the system performance in the presence of implementation anomalies is also given.

  19. Anomaly and signature filtering improve classifier performance for detection of suspicious access to EHRs.

    PubMed

    Kim, Jihoon; Grillo, Janice M; Boxwala, Aziz A; Jiang, Xiaoqian; Mandelbaum, Rose B; Patel, Bhakti A; Mikels, Debra; Vinterbo, Staal A; Ohno-Machado, Lucila

    2011-01-01

    Our objective is to facilitate semi-automated detection of suspicious access to EHRs. Previously we have shown that a machine learning method can play a role in identifying potentially inappropriate access to EHRs. However, the problem of sampling informative instances to build a classifier still remained. We developed an integrated filtering method leveraging both anomaly detection based on symbolic clustering and signature detection, a rule-based technique. We applied the integrated filtering to 25.5 million access records in an intervention arm, and compared this with 8.6 million access records in a control arm where no filtering was applied. On the training set with cross-validation, the AUC was 0.960 in the control arm and 0.998 in the intervention arm. The difference in false negative rates on the independent test set was significant, P=1.6×10−6. Our study suggests that utilization of integrated filtering strategies to facilitate the construction of classifiers can be helpful.

  20. Anomaly and Signature Filtering Improve Classifier Performance For Detection Of Suspicious Access To EHRs

    PubMed Central

    Kim, Jihoon; Grillo, Janice M; Boxwala, Aziz A; Jiang, Xiaoqian; Mandelbaum, Rose B; Patel, Bhakti A; Mikels, Debra; Vinterbo, Staal A; Ohno-Machado, Lucila

    2011-01-01

    Our objective is to facilitate semi-automated detection of suspicious access to EHRs. Previously we have shown that a machine learning method can play a role in identifying potentially inappropriate access to EHRs. However, the problem of sampling informative instances to build a classifier still remained. We developed an integrated filtering method leveraging both anomaly detection based on symbolic clustering and signature detection, a rule-based technique. We applied the integrated filtering to 25.5 million access records in an intervention arm, and compared this with 8.6 million access records in a control arm where no filtering was applied. On the training set with cross-validation, the AUC was 0.960 in the control arm and 0.998 in the intervention arm. The difference in false negative rates on the independent test set was significant, P=1.6×10−6. Our study suggests that utilization of integrated filtering strategies to facilitate the construction of classifiers can be helpful. PMID:22195129

  1. Dispersive Phase in the L-band InSAR Image Associated with Heavy Rain Episodes

    NASA Astrophysics Data System (ADS)

    Furuya, M.; Kinoshita, Y.

    2017-12-01

    Interferometric synthetic aperture radar (InSAR) is a powerful geodetic technique that allows us to detect ground displacements with unprecedented spatial resolution, and has been used to detect displacements due to earthquakes, volcanic eruptions, and glacier motion. At the same time, because the microwaves propagate through the ionosphere and troposphere, we often encounter non-negligible phase anomalies in InSAR data. Correcting for the ionosphere and troposphere is therefore a long-standing issue for high-precision geodetic measurements. Conversely, if ground displacements are negligible, an InSAR image can reveal the details of the atmosphere. Kinoshita and Furuya (2017, SOLA) detected a phase anomaly in ALOS/PALSAR InSAR data associated with heavy rain over the Niigata area, Japan, and performed a numerical weather model simulation to reproduce the anomaly; ALOS/PALSAR is a satellite-based L-band SAR sensor launched by JAXA in 2006 and terminated in 2011. The phase anomaly could be largely reproduced using the output data from the weather model. However, numerical weather model outputs can account only for the non-dispersive effect in the phase anomaly. In a severe weather event, we may also expect a dispersive effect caused by the presence of free electrons. In Global Navigation Satellite System (GNSS) positioning, dual-frequency measurements allow the ionospheric dispersive component to be separated from the tropospheric non-dispersive component. In contrast, SAR imaging is based on a single carrier frequency, and thus no operational ionospheric corrections have been performed in InSAR data analyses. Recently, Gomba et al. (2016) detailed the processing strategy of the split spectrum method (SSM) for InSAR, which splits the finite bandwidth of the range spectrum and virtually allows for dual-frequency measurements. We apply the L-band InSAR SSM to heavy rain episodes in which precipitation rates above 50 mm/hour were reported. We report the presence of phase anomalies in both the dispersive and non-dispersive components. While the original phase anomaly turns out to be mostly due to the non-dispersive effect, we can also recognize local anomalies in the dispersive component. We will discuss the geophysical implications, and may show several case studies.

  2. Efficient dynamic events discrimination technique for fiber distributed Brillouin sensors.

    PubMed

    Galindez, Carlos A; Madruga, Francisco J; Lopez-Higuera, Jose M

    2011-09-26

    A technique to detect real-time variations of temperature or strain in Brillouin-based distributed fiber sensors is proposed and investigated in this paper. The technique is based on anomaly detection methods such as the RX algorithm. Detection and isolation of dynamic events from static ones are demonstrated by properly processing the Brillouin gain values obtained with a standard BOTDA system. Results also suggest that better signal-to-noise ratio, dynamic range and spatial resolution can be obtained. For a pump pulse of 5 ns, the spatial resolution is enhanced (from 0.541 m, obtained by direct gain measurement, to 0.418 m with the technique presented here), since the analysis concentrates on the variation of the Brillouin gain rather than only on averaging the signal over time. © 2011 Optical Society of America
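
    The RX-style scoring that such techniques build on reduces to a Mahalanobis distance of each observation from the background statistics. A minimal NumPy sketch with synthetic stand-ins for the Brillouin gain traces (the array shapes and values are illustrative, not from the paper):

```python
import numpy as np

def rx_scores(samples, background):
    """RX anomaly score: squared Mahalanobis distance of each row of
    `samples` from the mean/covariance estimated on `background`."""
    mu = background.mean(axis=0)
    cov = np.cov(background, rowvar=False)
    cov_inv = np.linalg.inv(np.atleast_2d(cov))
    diff = samples - mu
    # sum_j sum_k diff[i,j] * cov_inv[j,k] * diff[i,k] for every row i
    return np.einsum('ij,jk,ik->i', diff, cov_inv, diff)

rng = np.random.default_rng(0)
bg = rng.normal(0.0, 1.0, size=(500, 3))        # "static" gain observations
obs = np.vstack([rng.normal(0.0, 1.0, (5, 3)),
                 np.full((1, 3), 6.0)])          # last row: a dynamic event
scores = rx_scores(obs, bg)
```

    Samples drawn from the background distribution score near the chi-squared mean of the feature dimension, while the injected event scores far higher, which is what lets static and dynamic contributions be separated.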

  3. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field.

    PubMed

    Christiansen, Peter; Nielsen, Lars N; Steen, Kim A; Jørgensen, Rasmus N; Karstoft, Henrik

    2016-11-11

    Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained to detect and classify a predefined set of object types. These algorithms have difficulty detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles such as people and animals occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks" (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45-90 m) than RCNN, while RCNN has similar performance at short range (0-30 m). However, DeepAnomaly has far fewer model parameters and a 7.28-times faster processing time per image (25 ms versus 182 ms). The high accuracy, low computation time and low memory footprint make it suitable, unlike most CNN-based methods, for a real-time system running on an embedded GPU (Graphics Processing Unit).

  4. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field

    PubMed Central

    Christiansen, Peter; Nielsen, Lars N.; Steen, Kim A.; Jørgensen, Rasmus N.; Karstoft, Henrik

    2016-01-01

    Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained to detect and classify a predefined set of object types. These algorithms have difficulty detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles such as people and animals occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including “Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks” (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45–90 m) than RCNN, while RCNN has similar performance at short range (0–30 m). However, DeepAnomaly has far fewer model parameters and a 7.28-times faster processing time per image (25 ms versus 182 ms). The high accuracy, low computation time and low memory footprint make it suitable, unlike most CNN-based methods, for a real-time system running on an embedded GPU (Graphics Processing Unit). PMID:27845717

  5. The use of Compton scattering in detecting anomaly in soil-possible use in pyromaterial detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abedin, Ahmad Firdaus Zainal; Ibrahim, Noorddin; Zabidi, Noriza Ahmad

    Compton scattering can provide a signature for land-mine detection based on the dependence of the scattered-photon energy change on density anomalies. In this study, 4.43 MeV gamma rays from an Am-Be source were used to perform Compton scattering. Two detectors were placed near the source, at a distance of 8 cm and with a radius of 1.9 cm. Thallium-doped sodium iodide NaI(Tl) detectors were used for detecting the gamma rays. Nine anomalies were used in this simulation. Each anomaly is a cylinder with a radius of 10 cm and a height of 8.9 cm, buried 5 cm deep in a soil bed measuring 80 cm in radius and 53.5 cm in height. Monte Carlo methods indicated that the scattering of photons is directly proportional to the density of the anomalies. The difference between the detector response with and without an anomaly, namely the contrast ratio, is linearly related to the anomaly density. Anomalies of air, wood and water give positive contrast ratios, whereas explosive, sand, concrete, graphite, limestone and polyethylene give negative contrast ratios. Overall, the contrast ratios exceed 2% for all anomalies. The strong contrast ratios result in good detection capability and good distinction between anomalies.

  6. Toward the detection of abnormal chest radiographs the way radiologists do it

    NASA Astrophysics Data System (ADS)

    Alzubaidi, Mohammad; Patel, Ameet; Panchanathan, Sethuraman; Black, John A., Jr.

    2011-03-01

    Computer Aided Detection (CADe) and Computer Aided Diagnosis (CADx) are relatively recent areas of research that attempt to employ feature extraction, pattern recognition, and machine learning algorithms to aid radiologists in detecting and diagnosing abnormalities in medical images. These computational methods are based on the assumption that there are distinct classes of abnormalities, each with distinguishing features that set it apart from the other classes. In practice, however, abnormalities in chest radiographs tend to be very heterogeneous. The literature suggests that thoracic (chest) radiologists develop their ability to detect abnormalities by developing a sense of what is normal, so that anything abnormal attracts their attention. This paper discusses an approach to CADe based on a technique called anomaly detection (which aims to detect outliers in data sets) for the purpose of detecting atypical regions in chest radiographs. Applying anomaly detection to chest radiographs, however, requires a basis for extracting features from corresponding anatomical locations in different radiographs. This paper proposes a method for doing this, and describes how it can be used to support CADe.

  7. Analysis and interpretation of MAGSAT anomalies over north Africa

    NASA Technical Reports Server (NTRS)

    Phillips, R. J.

    1985-01-01

    Crustal anomaly detection with MAGSAT data is frustrated by the inherent resolving power of the data and by contamination from external and core fields. The quality of the data might be tested by modeling specific tectonic features that produce anomalies falling within the proposed resolution and crustal amplitude capabilities of MAGSAT fields. To test this hypothesis, the north African hotspots associated with Ahaggar, Tibesti and Darfur were modeled as magnetic induction anomalies. MAGSAT data were reduced by subtracting external and core fields to isolate scalar and vertical-component crustal signals. Of the three volcanic areas, only the Ahaggar region had an associated anomaly with magnitude above the error limits of the data. The hotspot hypothesis was tested for Ahaggar by checking whether the predicted magnetic signal matched the MAGSAT anomaly. The predicted magnetic signal arising from the surface topography of the uplift and the Curie isothermal surface was calculated at MAGSAT altitudes by a Fourier transform technique modified to allow for variable magnetization. The Curie isotherm surface was calculated using a method for the temperature distribution in a moving plate above a fixed hotspot. The magnetic signal was calculated for a fixed plate as well as for a number of plate velocities and directions.

  8. Investigation of a Neural Network Implementation of a TCP Packet Anomaly Detection System

    DTIC Science & Technology

    2004-05-01

    …recognize new attack variants. Artificial neural networks (ANNs) have the ability to learn from patterns and to… Computational Intelligence Techniques in Intrusion Detection Systems. In IASTED International Conference on Neural Networks and Computational Intelligence, pp. … Neural Network Training: Overfitting May be Harder than Expected. In Proceedings of the Fourteenth National Conference on Artificial Intelligence, AAAI-97

  9. Retrieving Temperature Anomaly in the Global Subsurface and Deeper Ocean From Satellite Observations

    NASA Astrophysics Data System (ADS)

    Su, Hua; Li, Wene; Yan, Xiao-Hai

    2018-01-01

    Retrieving subsurface and deeper ocean (SDO) dynamic parameters from satellite observations is crucial for effectively understanding ocean interior anomalies and dynamic processes, but it is challenging to accurately estimate the subsurface thermal structure on a global scale from sea surface parameters. This study proposes a new approach based on Random Forest (RF) machine learning to retrieve the subsurface temperature anomaly (STA) in the global ocean from multisource satellite observations, including sea surface height anomaly (SSHA), sea surface temperature anomaly (SSTA), sea surface salinity anomaly (SSSA), and sea surface wind anomaly (SSWA), with in situ Argo data used for RF training and testing. The Argo STA data were used to validate the accuracy and reliability of the results from the RF model. The results indicated that SSHA, SSTA, SSSA, and SSWA together are useful predictors for detecting SDO thermal information and obtaining accurate STA estimates. The proposed method also outperformed support vector regression (SVR) in global STA estimation. It will be a useful technique for studying SDO thermal variability and its role in the global climate system from global-scale satellite observations.
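
    The retrieval setup described above, training a Random Forest on surface predictors against subsurface labels, can be sketched with scikit-learn. The data below are synthetic stand-ins for the satellite surface fields and Argo matchups, so nothing here reproduces the paper's actual configuration:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 2000
# Synthetic surface predictors, one row per grid point: SSHA, SSTA, SSSA, SSWA
X = rng.normal(size=(n, 4))
# Synthetic "Argo" subsurface temperature anomaly, loosely driven by SSHA/SSTA
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.05 * rng.normal(size=n)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X[:1500], y[:1500])            # train on "in situ" matchup points
sta_pred = model.predict(X[1500:])       # retrieve STA at held-out points
r = np.corrcoef(sta_pred, y[1500:])[0, 1]  # skill against withheld labels
```

    In the real application each depth level would get its own regression, and validation would compare the retrieved STA field against withheld Argo profiles rather than a synthetic target.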

  10. Detection of Anomalous Machining Damages in Inconel 718 and TI 6-4 by Eddy Current Techniques

    NASA Astrophysics Data System (ADS)

    Lo, C. C. H.; Shimon, M.; Nakagawa, N.

    2010-02-01

    This paper reports on an eddy current (EC) study aimed at detecting anomalous machining damages in Inconel 718 and Ti 6-4 samples, including (i) surface discontinuities such as re-depositing of chips onto the machined surface, and (ii) microstructural damages manifested as a white surface layer and a subsurface layer of distorted grains, typically tens of microns thick. A series of pristine and machine-damaged coupons were studied by EC scans using a differential probe operated at 2 MHz to detect discontinuous surface anomalies, and by swept high frequency EC (SHFEC) measurements from 0.5 MHz to 65.5 MHz using proprietary detection coils to detect surface microstructural damages. In general, the EC c-scan data from machine-damaged surfaces show spatial variations with larger standard deviations than those from the undamaged surfaces. In some cases, the c-scan images exhibit characteristic bipolar indications in good spatial correlation with surface anomalies revealed by optical microscopy and laser profilometry. Results of the SHFEC measurements indicate a reduced near-surface conductivity of the damaged surfaces compared to the undamaged surfaces.

  11. Advanced Unsupervised Classification Methods to Detect Anomalies on Earthen Levees Using Polarimetric SAR Imagery

    PubMed Central

    Marapareddy, Ramakalavathi; Aanstoos, James V.; Younan, Nicolas H.

    2016-01-01

    Fully polarimetric Synthetic Aperture Radar (polSAR) data analysis has wide applications for terrain and ground cover classification. The dynamics of surface and subsurface water events can lead to slope instability resulting in slough slides on earthen levees. Early detection of these anomalies by a remote sensing approach could save time versus direct assessment. We used L-band Synthetic Aperture Radar (SAR) to screen levees for anomalies. SAR technology, due to its high spatial resolution and soil penetration capability, is a good choice for identifying problematic areas on earthen levees. Using the parameters entropy (H), anisotropy (A), alpha (α), and eigenvalues (λ, λ1, λ2, and λ3), we implemented several unsupervised classification algorithms for the identification of anomalies on the levee. The classification techniques applied are H/α, H/A, A/α, Wishart H/α, Wishart H/A/α, and H/α/λ classification algorithms. In this work, the effectiveness of the algorithms was demonstrated using quad-polarimetric L-band SAR imagery from the NASA Jet Propulsion Laboratory’s (JPL’s) Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR). The study area is a section of the lower Mississippi River valley in the Southern USA, where earthen flood control levees are maintained by the US Army Corps of Engineers. PMID:27322270

  12. Apparatus for detecting a magnetic anomaly contiguous to remote location by squid gradiometer and magnetometer systems

    DOEpatents

    Overton, Jr., William C.; Steyert, Jr., William A.

    1984-01-01

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  13. Apparatus and method for detecting a magnetic anomaly contiguous to remote location by SQUID gradiometer and magnetometer systems

    DOEpatents

    Overton, W.C. Jr.; Steyert, W.A. Jr.

    1981-05-22

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  14. Detection of Anomalies in Citrus Leaves Using Laser-Induced Breakdown Spectroscopy (LIBS).

    PubMed

    Sankaran, Sindhuja; Ehsani, Reza; Morgan, Kelly T

    2015-08-01

    Nutrient assessment and management are important for maintaining productivity in citrus orchards. In this study, laser-induced breakdown spectroscopy (LIBS) was applied for rapid, real-time detection of citrus anomalies. LIBS spectra were collected from citrus leaves with anomalies such as diseases (Huanglongbing, citrus canker) and nutrient deficiencies (iron, manganese, magnesium, zinc), and compared with those of healthy leaves. Baseline correction, wavelet multivariate denoising, and normalization techniques were applied to the LIBS spectra before analysis. After spectral pre-processing, features were extracted using principal component analysis and classified using two models, quadratic discriminant analysis and support vector machine (SVM). The SVM yielded a high average classification accuracy of 97.5%, with a high average canker classification accuracy of 96.5%. Peak analysis indicated that, of the 11 peaks found in all samples, high intensities were observed at 229.7, 247.9, 280.3, 393.5, 397.0, and 769.8 nm. Future studies using controlled experiments with variable nutrient applications are required for quantification of foliar nutrients by LIBS-based sensing.
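
    The feature-extraction step named above (normalization followed by principal component analysis) can be sketched in NumPy. The synthetic "spectra" and the position of the extra emission line are placeholders, and the downstream QDA/SVM classifiers are omitted:

```python
import numpy as np

def pca_features(spectra, n_components=3):
    """Normalize each spectrum, mean-center, and project onto the
    top principal components obtained from an SVD of the data matrix."""
    spectra = spectra / np.linalg.norm(spectra, axis=1, keepdims=True)
    centered = spectra - spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)  # rows of vt = PCs
    return centered @ vt[:n_components].T

rng = np.random.default_rng(1)
healthy = rng.normal(1.0, 0.05, size=(20, 200))   # 20 spectra, 200 channels
canker = rng.normal(1.0, 0.05, size=(20, 200))
canker[:, 50] += 2.0    # a stronger emission line in the anomalous leaves
features = pca_features(np.vstack([healthy, canker]))
```

    Because the group difference dominates the variance, the first principal component separates the two classes, which is what makes the low-dimensional features usable by a simple classifier afterwards.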

  15. Discovering System Health Anomalies Using Data Mining Techniques

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.

    2005-01-01

    We present a data mining framework for the analysis and discovery of anomalies in high-dimensional time series of sensor measurements that would be found in an Integrated System Health Monitoring system. We specifically treat the problem of discovering anomalous features in the time series that may be indicative of a system anomaly, or, in the case of a manned system, an anomaly due to the human. Identification of these anomalies is crucial to building stable, reusable, and cost-efficient systems. The framework consists of an analysis platform and new algorithms that can scale to thousands of sensor streams to discover temporal anomalies. We discuss the mathematical framework that underlies the system and also describe in detail how this framework is general enough to encompass both discrete and continuous sensor measurements. We also describe a new set of data mining algorithms based on kernel methods and hidden Markov models that allow for the rapid assimilation, analysis, and discovery of system anomalies. We then describe the performance of the system on a real-world problem in the aircraft domain, where we analyze cockpit data as well as data from the aircraft propulsion, control, and guidance systems. These discrete and continuous sensor measurements are handled seamlessly in order to discover anomalous flights. We conclude with recommendations that describe the tradeoffs in building an integrated scalable platform for robust anomaly detection in ISHM applications.

  16. RIDES: Robust Intrusion Detection System for IP-Based Ubiquitous Sensor Networks

    PubMed Central

    Amin, Syed Obaid; Siddiqui, Muhammad Shoaib; Hong, Choong Seon; Lee, Sungwon

    2009-01-01

    The IP-based Ubiquitous Sensor Network (IP-USN) is an effort to build the “Internet of things”. By utilizing IP for low-power networks, we can benefit from the existing, well-established tools and technologies of IP networks. Along with many other unresolved issues, securing IP-USN is of great concern for researchers so that future market satisfaction and demands can be met. Without proper security measures, both reactive and proactive, it is hard to envisage an IP-USN realm. In this paper we present the design of an IDS (Intrusion Detection System) called RIDES (Robust Intrusion DEtection System) for IP-USN. RIDES is a hybrid intrusion detection system, incorporating both signature-based and anomaly-based intrusion detection components. For signature-based intrusion detection, this paper discusses only the implementation of a distributed pattern-matching algorithm with the help of a signature-code, a dynamically created attack-signature identifier; other aspects, such as the creation of rules, are not discussed. For anomaly-based detection, we propose a scoring classifier based on the SPC (Statistical Process Control) technique known as CUSUM charts. We also investigate the settings of the related parameters and their effects on the performance of both components. PMID:22412321

  17. RIDES: Robust Intrusion Detection System for IP-Based Ubiquitous Sensor Networks.

    PubMed

    Amin, Syed Obaid; Siddiqui, Muhammad Shoaib; Hong, Choong Seon; Lee, Sungwon

    2009-01-01

    The IP-based Ubiquitous Sensor Network (IP-USN) is an effort to build the "Internet of things". By utilizing IP for low-power networks, we can benefit from the existing, well-established tools and technologies of IP networks. Along with many other unresolved issues, securing IP-USN is of great concern for researchers so that future market satisfaction and demands can be met. Without proper security measures, both reactive and proactive, it is hard to envisage an IP-USN realm. In this paper we present the design of an IDS (Intrusion Detection System) called RIDES (Robust Intrusion DEtection System) for IP-USN. RIDES is a hybrid intrusion detection system, incorporating both signature-based and anomaly-based intrusion detection components. For signature-based intrusion detection, this paper discusses only the implementation of a distributed pattern-matching algorithm with the help of a signature-code, a dynamically created attack-signature identifier; other aspects, such as the creation of rules, are not discussed. For anomaly-based detection, we propose a scoring classifier based on the SPC (Statistical Process Control) technique known as CUSUM charts. We also investigate the settings of the related parameters and their effects on the performance of both components.
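
    The CUSUM control chart underlying the anomaly component accumulates deviations from a target mean and raises an alarm once the cumulative sum exceeds a decision threshold. A generic one-sided tabular CUSUM sketch (the parameter values and traffic numbers are illustrative, not the RIDES scoring classifier itself):

```python
def cusum_chart(samples, target_mean, k=0.5, h=5.0):
    """One-sided tabular CUSUM: accumulate positive deviations beyond
    the slack k; flag an anomaly whenever the statistic exceeds h."""
    s, alarms = 0.0, []
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - target_mean) - k)
        if s > h:
            alarms.append(i)
    return alarms

normal_traffic = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2]   # fluctuates around target
attack_traffic = [3.0, 3.5, 2.8, 3.2]               # sustained upward shift
alarms = cusum_chart(normal_traffic + attack_traffic, target_mean=0.0)
```

    Small fluctuations are absorbed by the slack term, while a sustained shift drives the statistic over the threshold within a couple of samples, which is why CUSUM charts are well suited to detecting small persistent changes rather than single spikes.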

  18. Network anomaly detection system with optimized DS evidence theory.

    PubMed

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have applied fusion methods and DS evidence theory to network anomaly detection, but with low performance, and without accounting for the complicated and varied nature of network features. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs the sensors' regression ability to address complex networks. Through four kinds of experiments, we find that our network anomaly detection model achieves a better detection rate, and that the RBPA and ODS optimization methods improve system performance significantly.

  19. Network Anomaly Detection System with Optimized DS Evidence Theory

    PubMed Central

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have applied fusion methods and DS evidence theory to network anomaly detection, but with low performance, and without accounting for the complicated and varied nature of network features. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs the sensors' regression ability to address complex networks. Through four kinds of experiments, we find that our network anomaly detection model achieves a better detection rate, and that the RBPA and ODS optimization methods improve system performance significantly. PMID:25254258
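
    Dempster's rule of combination, on which DS-evidence systems like this one build, can be sketched for two sensors over the frame {normal, anomaly}. The sensor weighting and RBPA refinements of the paper are not reproduced; the mass values are illustrative:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule for basic probability assignments whose focal
    elements are frozensets; conflicting mass is renormalized away."""
    combined, conflict = {}, 0.0
    for (a, p), (b, q) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + p * q
        else:
            conflict += p * q
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

N, A = frozenset(['normal']), frozenset(['anomaly'])
theta = N | A                       # the whole frame (total ignorance)
sensor1 = {A: 0.7, theta: 0.3}      # fairly confident it is an anomaly
sensor2 = {A: 0.6, N: 0.1, theta: 0.3}
fused = combine(sensor1, sensor2)
```

    Two partially confident sensors reinforce each other, so the fused belief in "anomaly" exceeds either sensor's alone; the ODS weighting in the paper would additionally discount sensors with a poor track record before this combination step.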

  20. Non-invasive Florentine Renaissance Panel Painting Replica Structures Investigation by Using Terahertz Time-Domain Imaging (THz-TDI) Technique

    NASA Astrophysics Data System (ADS)

    Koch Dandolo, Corinna L.; Picollo, Marcello; Cucci, Costanza; Jepsen, Peter Uhd

    2016-11-01

    The potential of the Terahertz Time-Domain Imaging (THz-TDI) technique for the non-invasive inspection of panel paintings has been considered in detail. THz-TD data acquired on a replica of a panel painting, made in imitation of Italian Renaissance panel paintings, were processed in order to provide insight into the limits and potential of the technique in detecting different kinds of underdrawings and paint layers. Constituent layers, construction techniques, and anomalies were identified and localized by interpreting the extracted THz dielectric stratigraphy.

  1. Intelligent agent-based intrusion detection system using enhanced multiclass SVM.

    PubMed

    Ganapathy, S; Yogesh, P; Kannan, A

    2012-01-01

    Intrusion detection systems have been used in the past along with various techniques to detect intrusions in networks effectively. However, most of these systems are able to detect intruders only with a high false alarm rate. In this paper, we propose a new intelligent agent-based intrusion detection model for mobile ad hoc networks using a combination of attribute selection, outlier detection, and enhanced multiclass SVM classification methods. For this purpose, an effective preprocessing technique is proposed that improves the detection accuracy and reduces the processing time. Moreover, two new algorithms, namely an Intelligent Agent Weighted Distance Outlier Detection algorithm and an Intelligent Agent-based Enhanced Multiclass Support Vector Machine algorithm, are proposed for detecting intruders in a distributed database environment that uses intelligent agents for trust management and coordination in transaction processing. The experimental results of the proposed model show that this system detects anomalies with a low false alarm rate and a high detection rate when tested with the KDD Cup 99 data set.

  2. Intelligent Agent-Based Intrusion Detection System Using Enhanced Multiclass SVM

    PubMed Central

    Ganapathy, S.; Yogesh, P.; Kannan, A.

    2012-01-01

    Intrusion detection systems have been used in the past along with various techniques to detect intrusions in networks effectively. However, most of these systems are able to detect intruders only with a high false alarm rate. In this paper, we propose a new intelligent agent-based intrusion detection model for mobile ad hoc networks using a combination of attribute selection, outlier detection, and enhanced multiclass SVM classification methods. For this purpose, an effective preprocessing technique is proposed that improves the detection accuracy and reduces the processing time. Moreover, two new algorithms, namely an Intelligent Agent Weighted Distance Outlier Detection algorithm and an Intelligent Agent-based Enhanced Multiclass Support Vector Machine algorithm, are proposed for detecting intruders in a distributed database environment that uses intelligent agents for trust management and coordination in transaction processing. The experimental results of the proposed model show that this system detects anomalies with a low false alarm rate and a high detection rate when tested with the KDD Cup 99 data set. PMID:23056036
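
    The record names an Intelligent Agent Weighted Distance Outlier Detection algorithm without giving details; a generic weighted-distance outlier score in the same spirit (score = average weighted distance to the k nearest neighbors, with the attribute weights, k, and data all illustrative assumptions, not the authors' algorithm) might look like:

```python
# Hedged sketch of a weighted-distance outlier score: each record's score
# is its average distance to its k nearest neighbors under per-attribute
# weights (as might come out of an attribute-selection stage). The agent
# coordination described in the abstract is not modeled here.
def weighted_dist(a, b, w):
    return sum(wi * (x - y) ** 2 for wi, x, y in zip(w, a, b)) ** 0.5

def outlier_scores(records, w, k=2):
    scores = []
    for i, r in enumerate(records):
        dists = sorted(weighted_dist(r, o, w)
                       for j, o in enumerate(records) if j != i)
        scores.append(sum(dists[:k]) / k)   # mean distance to k NN
    return scores

# Four clustered records plus one obvious outlier at (5, 5).
records = [(0.0, 1.0), (0.1, 1.1), (0.2, 0.9), (0.1, 1.0), (5.0, 5.0)]
weights = (1.0, 0.5)   # illustrative attribute weights
scores = outlier_scores(records, weights)
```

    Records whose score exceeds a chosen threshold would be passed on as suspected intrusions.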

  3. Research for Key Techniques of Geophysical Recognition System of Hydrocarbon-induced Magnetic Anomalies Based on Hydrocarbon Seepage Theory

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Hao, T.; Zhao, B.

    2009-12-01

    Hydrocarbon seepage effects can cause magnetic alteration zones near the surface, and the magnetic anomalies induced by these alteration zones can thus be used to locate oil-gas potential regions. In order to reduce the inaccuracy and non-uniqueness of hydrocarbon anomalies recognized from magnetic data alone, and to meet the requirement of integrated management and synthetic analysis of multi-source geoscientific data, it is necessary to construct a recognition system that integrates the functions of data management, real-time processing, synthetic evaluation, and geologic mapping. In this paper, research on the key techniques of the system is discussed. Image processing methods can be applied to potential-field images to facilitate visual interpretation and geological understanding. In gravity or magnetic images, anomalies with identical frequency-domain characteristics but different spatial distributions are reflected differently in texture and the relevant textural statistics. Texture is a description of the structural arrangement and spatial variation of a dataset or an image, and has been applied in many research fields. Textural analysis is a procedure that extracts textural features by image processing methods and thus obtains a quantitative or qualitative description of texture. When two kinds of anomalies have no distinct difference in amplitude or overlap in the frequency spectrum, they may still be distinguishable by their texture, which can be considered as textural contrast. Therefore, for the recognition system we propose a new “magnetic spots” recognition method based on image processing techniques.
The method can be divided into three major steps: first, separate local anomalies caused by shallow, relatively small sources from the total magnetic field, and pre-process the local magnetic anomaly data by image processing methods (including histogram-equalization-based image display and object recognition and extraction) so that magnetic anomalies can be expressed as points, lines and polygons with spatial correlation; second, mine the spatial characteristics and correlations of the magnetic anomalies using textural statistics and analysis, and study the features of known anomalous objects (closures, hydrocarbon-bearing structures, igneous rocks, etc.) in the same research area; finally, classify the anomalies, cluster them according to their similarity, and predict hydrocarbon-induced “magnetic spots” in combination with geologic, drilling and rock-core data. The system uses ArcGIS as the secondary development platform, inherits the basic functions of ArcGIS, and develops two main special functional modules: a module for conventional potential-field data processing methods and a module for feature extraction and enhancement based on image processing and analysis techniques. The system can be applied to realize the geophysical detection and recognition of near-surface hydrocarbon seepage anomalies, provide technical support for locating oil-gas potential regions, and help geophysical data processing and interpretation advance more efficiently.

  4. Infrared thermography based diagnosis of inter-turn fault and cooling system failure in three phase induction motor

    NASA Astrophysics Data System (ADS)

    Singh, Gurmeet; Naikan, V. N. A.

    2017-12-01

    Thermography has been widely used as a technique for anomaly detection in induction motors. The International Electrical Testing Association (NETA) has proposed guidelines for the thermographic inspection of electrical systems and rotating equipment. These guidelines help in detecting anomalies and estimating their severity; however, they focus only on the location of the hotspot rather than on diagnosing the fault. This paper addresses two such faults, the inter-turn fault and cooling-system failure, both of which result in an increase of stator temperature. The present paper proposes two thermal profile indicators based on thermal analysis of IRT images. These indicators comply with the NETA standard and help in correctly diagnosing an inter-turn fault and a failure of the cooling system. The work has been experimentally validated on induction motors in healthy and seeded-fault scenarios.

  5. Quantum-state anomaly detection for arbitrary errors using a machine-learning technique

    NASA Astrophysics Data System (ADS)

    Hara, Satoshi; Ono, Takafumi; Okamoto, Ryo; Washio, Takashi; Takeuchi, Shigeki

    2016-10-01

    The accurate detection of small deviations in given density matrices is important for quantum information processing. This is a difficult task because of the intrinsic fluctuations in density matrices reconstructed using a limited number of experiments. We previously proposed a method for decoherence error detection using a machine-learning technique [S. Hara, T. Ono, R. Okamoto, T. Washio, and S. Takeuchi, Phys. Rev. A 89, 022104 (2014), 10.1103/PhysRevA.89.022104]. However, the previous method is not valid when the errors are simply changes in phase. Here, we propose a method that is valid for arbitrary errors in density matrices. The performance of the proposed method is verified using both numerical simulation data and real experimental data.

  6. Real-time anomaly detection for very short-term load forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Jian; Hong, Tao; Yue, Meng

    Although the recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components, a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.

  7. Real-time anomaly detection for very short-term load forecasting

    DOE PAGES

    Luo, Jian; Hong, Tao; Yue, Meng

    2018-01-06

    Although the recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components, a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.
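
    A minimal sketch of the two-component idea described above, assuming a naive persistence forecast in place of the paper's dynamic regression model, and an adaptive threshold set at k times the rolling residual deviation (both are stand-ins, not the authors' implementation):

```python
# Hedged sketch: a simple one-step forecast plus an adaptive anomaly
# threshold. A load value is flagged when its forecast residual exceeds
# k times the rolling standard deviation of recent residuals. The
# persistence forecast and k=3 are illustrative assumptions.
import statistics

def detect_anomalies(loads, window=8, k=3.0):
    flags, residuals = [], []
    for t in range(1, len(loads)):
        pred = loads[t - 1]             # naive persistence forecast
        resid = loads[t] - pred
        recent = residuals[-window:]
        if len(recent) >= 3:
            sigma = statistics.pstdev(recent) or 1e-9
            flags.append((t, abs(resid) > k * sigma))
        else:
            flags.append((t, False))    # not enough history yet
        residuals.append(resid)
    return flags

# A smooth load series with one corrupted observation at index 8.
loads = [100, 101, 102, 101, 103, 102, 104, 103, 200, 104]
flags = dict(detect_anomalies(loads))
```

    The spike at index 8 produces a residual far outside the adaptive band and is flagged, while ordinary fluctuations are not.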

  8. System for evaluating weld quality using eddy currents

    DOEpatents

    Todorov, Evgueni I.; Hay, Jacob

    2017-12-12

    Electromagnetic and eddy current techniques for fast automated real-time and near-real-time inspection and monitoring systems for high-production-rate joining processes. An eddy current system, array, and method for the fast examination of welds to detect anomalies such as missed seam (MS) and lack of penetration (LOP), with the system, array, and methods capable of detecting and sizing surface and slightly subsurface flaws at various orientations in connection with at least the first and second weld passes.

  9. Using statistical anomaly detection models to find clinical decision support malfunctions.

    PubMed

    Ray, Soumi; McEvoy, Dustin S; Aaron, Skye; Hickman, Thu-Trang; Wright, Adam

    2018-05-11

    Malfunctions in Clinical Decision Support (CDS) systems occur due to a multitude of reasons, and often go unnoticed, leading to potentially poor outcomes. Our goal was to identify malfunctions within CDS systems. We evaluated 6 anomaly detection models: (1) Poisson Changepoint Model, (2) Autoregressive Integrated Moving Average (ARIMA) Model, (3) Hierarchical Divisive Changepoint (HDC) Model, (4) Bayesian Changepoint Model, (5) Seasonal Hybrid Extreme Studentized Deviate (SHESD) Model, and (6) E-Divisive with Median (EDM) Model and characterized their ability to find known anomalies. We analyzed 4 CDS alerts with known malfunctions from the Longitudinal Medical Record (LMR) and Epic® (Epic Systems Corporation, Madison, WI, USA) at Brigham and Women's Hospital, Boston, MA. The 4 rules recommend lead testing in children, aspirin therapy in patients with coronary artery disease, pneumococcal vaccination in immunocompromised adults and thyroid testing in patients taking amiodarone. Poisson changepoint, ARIMA, HDC, Bayesian changepoint and the SHESD model were able to detect anomalies in an alert for lead screening in children and in an alert for pneumococcal conjugate vaccine in immunocompromised adults. EDM was able to detect anomalies in an alert for monitoring thyroid function in patients on amiodarone. Malfunctions/anomalies occur frequently in CDS alert systems. It is important to be able to detect such anomalies promptly. Anomaly detection models are useful tools to aid such detections.
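
    Of the six models evaluated, the single-changepoint Poisson model is the simplest to sketch: scan for the split of a daily alert-count series that maximizes the two-segment Poisson log-likelihood over the no-change model (a generic formulation, not necessarily the authors' exact implementation; the counts below are fabricated for illustration):

```python
# Hedged sketch: single-changepoint detection on daily CDS alert counts
# under a Poisson model. A large likelihood gain from splitting the
# series at some day tau suggests the alert's firing rate shifted,
# i.e. a possible malfunction.
import math

def poisson_loglik(counts):
    lam = sum(counts) / len(counts)      # MLE rate for the segment
    if lam == 0:
        return 0.0
    return sum(c * math.log(lam) - lam - math.lgamma(c + 1) for c in counts)

def best_changepoint(counts):
    base = poisson_loglik(counts)        # no-change model
    best_tau, best_gain = None, 0.0
    for tau in range(2, len(counts) - 1):
        gain = (poisson_loglik(counts[:tau])
                + poisson_loglik(counts[tau:]) - base)
        if gain > best_gain:
            best_tau, best_gain = tau, gain
    return best_tau, best_gain

# An alert firing ~20x/day suddenly drops to ~2x/day on day 10.
series = [21, 19, 20, 22, 18, 20, 21, 19, 20, 22, 2, 1, 3, 2, 1, 2, 3, 1]
tau, gain = best_changepoint(series)
```

    In practice the likelihood gain would be compared against a significance threshold before raising a malfunction alarm.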

  10. Prevalence, prenatal diagnosis and clinical features of oculo-auriculo-vertebral spectrum: a registry-based study in Europe

    PubMed Central

    Barisic, Ingeborg; Odak, Ljubica; Loane, Maria; Garne, Ester; Wellesley, Diana; Calzolari, Elisa; Dolk, Helen; Addor, Marie-Claude; Arriola, Larraitz; Bergman, Jorieke; Bianca, Sebastiano; Doray, Berenice; Khoshnood, Babak; Klungsoyr, Kari; McDonnell, Bob; Pierini, Anna; Rankin, Judith; Rissmann, Anke; Rounding, Catherine; Queisser-Luft, Annette; Scarano, Gioacchino; Tucker, David

    2014-01-01

    Oculo-auriculo-vertebral spectrum is a complex developmental disorder characterised mainly by anomalies of the ear, hemifacial microsomia, epibulbar dermoids and vertebral anomalies. The aetiology is largely unknown, and the epidemiological data are limited and inconsistent. We present the largest population-based epidemiological study to date, using data provided by the large network of congenital anomalies registries in Europe. The study population included infants diagnosed with oculo-auriculo-vertebral spectrum during the 1990–2009 period from 34 registries active in 16 European countries. Of the 355 infants diagnosed with oculo-auriculo-vertebral spectrum, there were 95.8% (340/355) live born, 0.8% (3/355) fetal deaths, 3.4% (12/355) terminations of pregnancy for fetal anomaly and 1.5% (5/340) neonatal deaths. In 18.9%, there was prenatal detection of anomaly/anomalies associated with oculo-auriculo-vertebral spectrum, 69.7% were diagnosed at birth, 3.9% in the first week of life and 6.1% within 1 year of life. Microtia (88.8%), hemifacial microsomia (49.0%) and ear tags (44.4%) were the most frequent anomalies, followed by atresia/stenosis of external auditory canal (25.1%), diverse vertebral (24.3%) and eye (24.3%) anomalies. There was a high rate (69.5%) of associated anomalies of other organs/systems. The most common were congenital heart defects present in 27.8% of patients. The prevalence of oculo-auriculo-vertebral spectrum, defined as microtia/ear anomalies and at least one major characteristic anomaly, was 3.8 per 100 000 births. Twinning, assisted reproductive techniques and maternal pre-pregnancy diabetes were confirmed as risk factors. The high rate of different associated anomalies points to the need of performing an early ultrasound screening in all infants born with this disorder. PMID:24398798

  11. Inductive System Monitors Tasks

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Inductive Monitoring System (IMS) software developed at Ames Research Center uses artificial intelligence and data mining techniques to build system-monitoring knowledge bases from archived or simulated sensor data. This information is then used to detect unusual or anomalous behavior that may indicate an impending system failure. Currently helping analyze data from systems that help fly and maintain the space shuttle and the International Space Station (ISS), the IMS extracts classes of nominal behavior from archived sensor data; these data classes are then used to build a monitoring knowledge base. In real time, IMS performs monitoring functions: determining and displaying the degree of deviation from nominal performance. IMS trend analyses can detect conditions that may indicate a failure or required system maintenance. The development of IMS was motivated by the difficulty of producing detailed diagnostic models of some system components due to complexity or unavailability of design information. Successful applications have ranged from real-time monitoring of aircraft engine and control systems to anomaly detection in space shuttle and ISS data. IMS was used on shuttle missions STS-121, STS-115, and STS-116 to search the Wing Leading Edge Impact Detection System (WLEIDS) data for signs of possible damaging impacts during launch. It independently verified findings of the WLEIDS Mission Evaluation Room (MER) analysts and indicated additional points of interest that were subsequently investigated by the MER team. In support of the Exploration Systems Mission Directorate, IMS is being deployed as an anomaly detection tool on ISS mission control consoles in the Johnson Space Center Mission Operations Directorate. IMS has been trained to detect faults in the ISS Control Moment Gyroscope (CMG) systems. In laboratory tests, it has already detected several minor anomalies in real-time CMG data. 
When tested on archived data, IMS was able to detect precursors of the CMG1 failure nearly 15 hours in advance of the actual failure event. In the Aeronautics Research Mission Directorate, IMS successfully performed real-time engine health analysis. IMS was able to detect simulated failures and actual engine anomalies in an F/A-18 aircraft during the course of 25 test flights. IMS is also being used in colla
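
    The core IMS idea, learning classes of nominal behavior from archived data and scoring live vectors by their distance to the nearest learned class, can be caricatured as follows; the greedy bounding-box clustering, the tolerance, and the sample vectors are illustrative simplifications, not NASA's actual IMS algorithm:

```python
# Hedged sketch of inductive monitoring: group archived nominal sensor
# vectors into bounding-box classes, then score a live vector by its
# Euclidean distance to the nearest box (0 = inside a known nominal
# class, larger = more anomalous).
def learn_classes(nominal, tol=1.0):
    boxes = []                       # each class: (mins, maxs) per sensor
    for v in nominal:
        for lo, hi in boxes:
            if all(l - tol <= x <= h + tol for x, l, h in zip(v, lo, hi)):
                for i, x in enumerate(v):      # expand matching box
                    lo[i] = min(lo[i], x)
                    hi[i] = max(hi[i], x)
                break
        else:
            boxes.append(([*v], [*v]))         # start a new class
    return boxes

def deviation(v, boxes):
    def dist(box):
        lo, hi = box
        return sum(max(l - x, 0, x - h) ** 2
                   for x, l, h in zip(v, lo, hi)) ** 0.5
    return min(dist(b) for b in boxes)

# Archived nominal data containing two operating modes.
nominal = [(1.0, 10.0), (1.1, 10.2), (0.9, 9.8), (5.0, 2.0), (5.2, 2.1)]
boxes = learn_classes(nominal, tol=0.5)
```

    A live vector inside either learned mode scores zero; a vector between the modes gets a positive deviation that can be trended over time, loosely mirroring how IMS displays degree of deviation from nominal performance.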

  12. 22nd Annual Logistics Conference and Exhibition

    DTIC Science & Technology

    2006-04-20

    Prognostics & Health Management at GE Dr. Piero P. Bonissone Industrial AI Lab GE Global Research NCD Select detection model Anomaly detection results...Mode 213 x Failure mode histogram 2130014 Anomaly detection from event-log data Diagnostics/ Prognostics Using...Failure Monitoring & Assessment Tactical C4ISR Sense Respond 7 •Diagnostics, Prognostics and health management

  13. Temporal Data-Driven Sleep Scheduling and Spatial Data-Driven Anomaly Detection for Clustered Wireless Sensor Networks

    PubMed Central

    Li, Gang; He, Bin; Huang, Hongwei; Tang, Limin

    2016-01-01

    The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most of the existing works based on the spatial–temporal correlation can be divided into two parts: redundancy reduction and anomaly detection. These two parts are pursued separately in existing works. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS can reduce data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on long and linear cluster structure in the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented in a cooperative way in the cluster. To keep the precision of sensor data, spatial data-driven anomaly detection based on the spatial correlation and Kriging method is realized to generate an anomaly indicator. The experiment results show that cooperative TDSS can realize non-uniform sensing effectively to reduce the energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data. PMID:27690035

  14. Pre-seismic anomalies from optical satellite observations: a review

    NASA Astrophysics Data System (ADS)

    Jiao, Zhong-Hu; Zhao, Jing; Shan, Xinjian

    2018-04-01

    Detecting various anomalies in optical satellite data prior to strong earthquakes is key to understanding and forecasting earthquake activity, because such data capture thermal-radiation-related phenomena during seismic preparation phases. Satellite observations serve as a powerful tool for monitoring earthquake preparation areas at a global scale and in a nearly real-time manner. Over the past several decades, many different data sources have been utilized in this field, and progressive anomaly detection approaches have been developed. This paper reviews the progress and development of pre-seismic anomaly detection technology in this decade. First, precursor parameters, including parameters from the top of the atmosphere, in the atmosphere, and on the Earth's surface, are stated and discussed. Second, different anomaly detection methods, which are used to extract anomalous signals that probably indicate future seismic events, are presented. Finally, certain critical problems with the current research are highlighted, and new developing trends and perspectives for future work are discussed. The development of Earth observation satellites and anomaly detection algorithms can enrich the available information sources, provide advanced tools for multilevel earthquake monitoring, and improve short- and medium-term forecasting, which play a large and growing role in pre-seismic anomaly detection research.

  15. Integrated geophysical study of the geothermal system in the southern part of Nisyros Island, Greece

    NASA Astrophysics Data System (ADS)

    Lagios, E.; Apostolopoulos, G.

    1995-10-01

    The study of the high-enthalpy geothermal field of Nisyros Island is of great importance, because of the planned construction of a geothermal power station. The purpose of the applied geophysical surveys — gravity, SP, VLF and audio-magnetotelluric — in southernmost Nisyros was to investigate the major and minor faulting zones which are geothermally active, i.e. whether geothermal fluid circulation occurs in these zones. The survey lines, four parallel traverses of about 1500 m length, were chosen to be almost transverse to the main faults of the area. The SP method was the main reconnaissance technique, with the VLF and gravity measurements correlating with the "SP model". Previously proposed SP data acquisition and reduction techniques were used, followed by a 2-D interpretation of the SP map which apparently locates the position of the fracture zones (geothermally active). The SP and VLF anomalies are believed to be generated by the same source (subsurface flow of fluid, heat and ions). Hence, at the place of a vertical geothermal fluid circulation zone, the curve of SP dipole-like anomaly changes its behaviour and the curve of the VLF anomaly takes maximum values for the in-phase component and minimum values for the out-of-phase component. On the VLF map of the survey area, the zones detected with the SP interpretation coincide with the maximum values of the VLF in-phase component. The geothermal fluid circulation zones, detected by the SP method, appear to be well correlated with corresponding features derived from the gravity and the AMT surveys. In particular, the AMT soundings indicate two zones of geothermal fluid circulation instead of the one the SP method detected in the central part of the investigated area.

  16. Systematic Screening for Subtelomeric Anomalies in a Clinical Sample of Autism

    ERIC Educational Resources Information Center

    Wassink, Thomas H.; Losh, Molly; Piven, Joseph; Sheffield, Val C.; Ashley, Elizabeth; Westin, Erik R.; Patil, Shivanand R.

    2007-01-01

    High-resolution karyotyping detects cytogenetic anomalies in 5-10% of cases of autism. Karyotyping, however, may fail to detect abnormalities of chromosome subtelomeres, which are gene-rich regions prone to anomalies. We assessed whether panels of FISH probes targeted to subtelomeres could detect abnormalities beyond those identified by…

  17. Apollo experience report: Flight anomaly resolution

    NASA Technical Reports Server (NTRS)

    Lobb, J. D.

    1975-01-01

    The identification of flight anomalies, the determination of their causes, and the approaches taken for corrective action are described. Interrelationships of the broad range of disciplines involved with the complex systems and the team concept employed to ensure timely and accurate resolution of anomalies are discussed. The documentation techniques and the techniques for management of anomaly resolution are included. Examples of specific anomalies are presented in the original form of their progressive documentation. Flight anomaly resolution functioned as a part of the real-time mission support and postflight testing, and results were included in the postflight documentation.

  18. A comparative study of linear and nonlinear anomaly detectors for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Goldberg, Hirsh; Nasrabadi, Nasser M.

    2007-04-01

    In this paper we implement various linear and nonlinear subspace-based anomaly detectors for hyperspectral imagery. First, a dual window technique is used to separate the local area around each pixel into two regions - an inner-window region (IWR) and an outer-window region (OWR). Pixel spectra from each region are projected onto a subspace which is defined by projection bases that can be generated in several ways. Here we use three common pattern classification techniques (Principal Component Analysis (PCA), Fisher Linear Discriminant (FLD) Analysis, and the Eigenspace Separation Transform (EST)) to generate projection vectors. In addition to these three algorithms, the well-known Reed-Xiaoli (RX) anomaly detector is also implemented. Each of the four linear methods is then implicitly defined in a high- (possibly infinite-) dimensional feature space by using a nonlinear mapping associated with a kernel function. Using a common machine-learning technique known as the kernel trick all dot products in the feature space are replaced with a Mercer kernel function defined in terms of the original input data space. To determine how anomalous a given pixel is, we then project the current test pixel spectra and the spectral mean vector of the OWR onto the linear and nonlinear projection vectors in order to exploit the statistical differences between the IWR and OWR pixels. Anomalies are detected if the separation of the projection of the current test pixel spectra and the OWR mean spectra are greater than a certain threshold. Comparisons are made using receiver operating characteristics (ROC) curves.
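
    For reference, the baseline RX detector mentioned above reduces to a Mahalanobis distance from the background statistics. A minimal global-RX sketch for two-band pixels, omitting the dual-window IWR/OWR machinery and the kernelized variants, with fabricated pixel values for illustration:

```python
# Hedged sketch of the global RX anomaly detector for 2-band pixels:
# score each pixel by its Mahalanobis distance from the scene mean under
# the scene covariance. Real detectors use many bands and local
# (dual-window) statistics; this keeps the 2x2 case so the covariance
# inverse can be written by hand.
def rx_scores(pixels):
    n = len(pixels)
    mu = [sum(p[i] for p in pixels) / n for i in range(2)]
    # 2x2 background covariance (MLE)
    c = [[sum((p[i] - mu[i]) * (p[j] - mu[j]) for p in pixels) / n
          for j in range(2)] for i in range(2)]
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    inv = [[c[1][1] / det, -c[0][1] / det],
           [-c[1][0] / det, c[0][0] / det]]
    scores = []
    for p in pixels:
        d = [p[0] - mu[0], p[1] - mu[1]]
        scores.append(d[0] * (inv[0][0] * d[0] + inv[0][1] * d[1])
                      + d[1] * (inv[1][0] * d[0] + inv[1][1] * d[1]))
    return scores

# Background clutter near (10, 20) plus one spectrally distinct target.
pixels = [(10, 20), (11, 21), (9, 19), (10, 21), (11, 19), (30, 5)]
scores = rx_scores(pixels)
```

    Pixels whose score exceeds a threshold are declared anomalous, which is exactly the thresholding step the abstract describes for the projected statistics.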

  19. Detection of Coal Fires: A Case Study Conducted on Indian Coal Seams Using Neural Network and Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Singh, B. B.

    2016-12-01

    India produces the majority of its electricity from coal, but a huge quantity of coal burns every day due to coal fires, which also threaten the environment as a source of severe pollutants. In the present study we demonstrate the use of a neural network based approach with an integrated particle swarm optimization (PSO) inversion technique. The self-potential (SP) data set is used for the early detection of coal fires. The study was conducted over the East Basuria colliery, Jharia Coal Field, Jharkhand, India. The causative source was modelled as an inclined sheet-like anomaly and synthetic data were generated. The neural network scheme consists of an input layer, hidden layers and an output layer. The input layer corresponds to the SP data and the output layer is the estimated depth of the coal fire. A synthetic data set was modelled with some of the known parameters associated with the causative body, such as depth, conductivity, inclination angle and half-width, and gives a very low misfit error of 0.0032%. The method was therefore found accurate in predicting the depth of the source body. The technique was applied to the real data set and the model was trained until a very good coefficient of determination (R²) value of 0.98 was obtained. The depth of the source body was found to be 12.34 m with a misfit error of 0.242%. The inversion results were compared with the lithologs obtained from a nearby well, which correspond to the L3 coal seam. The depth of the coal fire exactly matched the half-width of the anomaly, which suggests that the fire is widely spread. The inclination angle of the anomaly was 135.51°, which indicates the development of geometrically complex fracture planes. These fractures may develop due to anisotropic weakness of the ground and act as passages for air; as a result, coal fires spread along these fracture planes. 
The results obtained from the neural network were compared with the PSO inversion results and were found to be in complete agreement. PSO has already been established as a well-proven technique for modelling SP anomalies. Therefore, for successful control and mitigation, SP surveys coupled with neural network and PSO techniques prove to be a novel and economical approach alongside other existing geophysical techniques. Keywords: PSO, Coal fire, Self-Potential, Inversion, Neural Network
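
    The PSO inversion step can be sketched generically: a swarm searches for source parameters (amplitude k, horizontal position x0, depth h) of a simplified SP source model that minimize the misfit to the observed curve. The point-source model, bounds, and swarm settings below are illustrative assumptions, not the study's inclined-sheet parameterization:

```python
# Hedged sketch of PSO inversion of a self-potential (SP) profile.
# sp_model is a simplified polarized point-source response, used here
# only to demonstrate the swarm search; the study inverted an inclined
# sheet model instead.
import random

def sp_model(x, k, x0, h):
    return k * (x - x0) / ((x - x0) ** 2 + h ** 2) ** 1.5

def misfit(params, xs, obs):
    return sum((sp_model(x, *params) - v) ** 2 for x, v in zip(xs, obs))

def pso(xs, obs, bounds, n=40, iters=300, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pcost = [misfit(p, xs, obs) for p in pos]
    gi = min(range(n), key=lambda i: pcost[i])
    gbest, gcost = pbest[gi][:], pcost[gi]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            c = misfit(pos[i], xs, obs)
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return gbest

# Noiseless synthetic profile generated with a "true" source at 12 m depth.
xs = [float(x) for x in range(-50, 51, 2)]
true = (5000.0, 4.0, 12.0)                 # k, x0, depth
obs = [sp_model(x, *true) for x in xs]
best = pso(xs, obs, bounds=[(1000, 10000), (-20, 20), (2, 30)])
```

    On noiseless synthetic data the recovered depth should land close to the true 12 m, loosely mirroring the depth-estimation workflow the abstract describes.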

  20. Topological anomaly detection performance with multispectral polarimetric imagery

    NASA Astrophysics Data System (ADS)

    Gartley, M. G.; Basener, W.

    2009-05-01

    Polarimetric imaging has demonstrated utility for increasing the contrast of manmade targets above natural background clutter. Manual detection of manmade targets in multispectral polarimetric imagery can be a challenging and subjective process for large datasets. Analyst exploitation may be improved by utilizing conventional anomaly detection algorithms such as RX. In this paper we examine the performance of a relatively new approach to anomaly detection, which leverages topology theory, applied to spectral polarimetric imagery. Detection results for manmade targets embedded in a complex natural background will be presented for both the RX and Topological Anomaly Detection (TAD) approaches. We will also present detailed results examining detection sensitivities relative to: (1) the number of spectral bands, (2) utilization of Stokes images versus intensity images, and (3) airborne versus spaceborne measurements.

  1. Quantum machine learning for quantum anomaly detection

    NASA Astrophysics Data System (ADS)

    Liu, Nana; Rebentrost, Patrick

    2018-04-01

    Anomaly detection is used for identifying data that deviate from "normal" data patterns. Its usage on classical data finds diverse applications in many important areas such as finance, fraud detection, medical diagnoses, data cleaning, and surveillance. With the advent of quantum technologies, anomaly detection of quantum data, in the form of quantum states, may become an important component of quantum applications. Machine-learning algorithms are playing pivotal roles in anomaly detection using classical data. Two widely used algorithms are the kernel principal component analysis and the one-class support vector machine. We find corresponding quantum algorithms to detect anomalies in quantum states. We show that these two quantum algorithms can be performed using resources that are logarithmic in the dimensionality of quantum states. For pure quantum states, these resources can also be logarithmic in the number of quantum states used for training the machine-learning algorithm. This makes these algorithms potentially applicable to big quantum data applications.

  2. Systematic review and meta-analysis of isolated posterior fossa malformations on prenatal ultrasound imaging (part 1): nomenclature, diagnostic accuracy and associated anomalies.

    PubMed

    D'Antonio, F; Khalil, A; Garel, C; Pilu, G; Rizzo, G; Lerman-Sagie, T; Bhide, A; Thilaganathan, B; Manzoli, L; Papageorghiou, A T

    2016-06-01

    To explore the outcome in fetuses with prenatal diagnosis of posterior fossa anomalies apparently isolated on ultrasound imaging. MEDLINE and EMBASE were searched electronically utilizing combinations of relevant medical subject headings for 'posterior fossa' and 'outcome'. The posterior fossa anomalies analyzed were Dandy-Walker malformation (DWM), mega cisterna magna (MCM), Blake's pouch cyst (BPC) and vermian hypoplasia (VH). The outcomes observed were rate of chromosomal abnormalities, additional anomalies detected at prenatal magnetic resonance imaging (MRI), additional anomalies detected at postnatal imaging and concordance between prenatal and postnatal diagnoses. Only isolated cases of posterior fossa anomalies - defined as having no cerebral or extracerebral additional anomalies detected on ultrasound examination - were included in the analysis. Quality assessment of the included studies was performed using the Newcastle-Ottawa Scale for cohort studies. We used meta-analyses of proportions to combine data and fixed- or random-effects models according to the heterogeneity of the results. Twenty-two studies including 531 fetuses with posterior fossa anomalies were included in this systematic review. The prevalence of chromosomal abnormalities in fetuses with isolated DWM was 16.3% (95% CI, 8.7-25.7%). The prevalence of additional central nervous system (CNS) abnormalities that were missed at ultrasound examination and detected only at prenatal MRI was 13.7% (95% CI, 0.2-42.6%), and the prevalence of additional CNS anomalies that were missed at prenatal imaging and detected only after birth was 18.2% (95% CI, 6.2-34.6%). Prenatal diagnosis was not confirmed after birth in 28.2% (95% CI, 8.5-53.9%) of cases. MCM was not significantly associated with additional anomalies detected at prenatal MRI or detected after birth. Prenatal diagnosis was not confirmed postnatally in 7.1% (95% CI, 2.3-14.5%) of cases. 
The rate of chromosomal anomalies in fetuses with isolated BPC was 5.2% (95% CI, 0.9-12.7%) and there was no associated CNS anomaly detected at prenatal MRI or only after birth. Prenatal diagnosis of BPC was not confirmed after birth in 9.8% (95% CI, 2.9-20.1%) of cases. The rate of chromosomal anomalies in fetuses with isolated VH was 6.5% (95% CI, 0.8-17.1%) and there were no additional anomalies detected at prenatal MRI (0% (95% CI, 0.0-45.9%)). The proportion of cerebral anomalies detected only after birth was 14.2% (95% CI, 2.9-31.9%). Prenatal diagnosis was not confirmed after birth in 32.4% (95% CI, 18.3-48.4%) of cases. DWM apparently isolated on ultrasound imaging is a condition with a high risk for chromosomal and associated structural anomalies. Isolated MCM and BPC have a low risk for aneuploidy or associated structural anomalies. The small number of cases with isolated VH prevents robust conclusions regarding their management from being drawn. Copyright © 2015 ISUOG. Published by John Wiley & Sons Ltd.

  3. Traffic Pattern Detection Using the Hough Transformation for Anomaly Detection to Improve Maritime Domain Awareness

    DTIC Science & Technology

    2013-12-01

    Programming code in the Python language used in AIS data preprocessing is contained in Appendix A. The MATLAB programming code used to apply the Hough...described in Chapter III is applied to archived AIS data in this chapter. The implementation of the method, including programming techniques used, is...is contained in the second. To provide a proof of concept for the algorithm described in Chapter III, the PYTHON programming language was used for
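
    The record above applies the Hough transformation to archived AIS vessel tracks. As an illustration only (the thesis's own MATLAB/Python code is not reproduced here), a minimal Hough voting scheme accumulates (theta, rho) votes so that collinear points, such as a straight shipping lane, pile up in one bin:

```python
import math
from collections import Counter

def hough_lines(points, n_theta=180, rho_res=1.0):
    """Vote each point into (theta, rho) bins; peaks correspond to straight tracks."""
    votes = Counter()
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            votes[(t, round(rho / rho_res))] += 1
    return votes

# Points along the line y = x (a straight "track") plus one stray detection
track = [(i, i) for i in range(10)] + [(3, 9)]
(t_peak, rho_bin), count = hough_lines(track).most_common(1)[0]
print(count)  # all ten on-track points share the winning bin
```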

  4. SSME HPOTP post-test diagnostic system enhancement project

    NASA Technical Reports Server (NTRS)

    Bickmore, Timothy W.

    1995-01-01

    An assessment of engine and component health is routinely made after each test or flight firing of a space shuttle main engine (SSME). Currently, this health assessment is done by teams of engineers who manually review sensor data, performance data, and engine and component operating histories. Based on a review of information from these various sources, an evaluation is made as to the health of each component of the SSME and the preparedness of the engine for another test or flight. The objective of this project is to further develop a computer program which automates the analysis of test data from the SSME high-pressure oxidizer turbopump (HPOTP) in order to detect and diagnose anomalies. This program fits into a larger system, the SSME Post-Test Diagnostic System (PTDS), which will eventually be extended to assess the health and status of most SSME components on the basis of test data analysis. The HPOTP module is an expert system, which uses 'rules-of-thumb' obtained from interviews with experts from NASA Marshall Space Flight Center (MSFC) to detect and diagnose anomalies. Analyses of the raw test data are first performed using pattern recognition techniques, which result in features such as spikes, shifts, peaks, and drifts being detected and written to a database. The HPOTP module then looks for combinations of these features which are indicative of known anomalies, using the rules gathered from the turbomachinery experts. Results of this analysis are then displayed via a graphical user interface which provides ranked lists of anomalies and observations by engine component, along with supporting data plots for each.
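
    The feature-extraction step described above, detecting spikes and similar events in raw sensor traces, can be sketched with a simple robust jump detector. The series and the threshold k are illustrative, not the PTDS rules:

```python
def detect_spikes(series, k=3.0):
    """Flag indices whose jump from the previous sample exceeds k robust deviations."""
    diffs = [series[i] - series[i - 1] for i in range(1, len(series))]
    med = sorted(diffs)[len(diffs) // 2]                       # median difference
    mad = sorted(abs(d - med) for d in diffs)[len(diffs) // 2] or 1e-9  # robust spread
    return [i for i, d in enumerate(diffs, start=1) if abs(d - med) > k * mad]

data = [1.0, 1.1, 0.9, 1.0, 6.0, 1.0, 1.1, 0.95, 1.05, 1.0]
print(detect_spikes(data))  # flags the jumps into and out of the spike
```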

  5. New sensors and techniques for the structural health monitoring of propulsion systems.

    PubMed

    Woike, Mark; Abdul-Aziz, Ali; Oza, Nikunj; Matthews, Bryan

    2013-01-01

    The ability to monitor the structural health of rotating components, especially in the hot sections of turbine engines, is of major interest to the aero community for improving engine safety and reliability. The use of instrumentation for these applications remains very challenging. It requires sensors and techniques that are highly accurate, are able to operate in a high-temperature environment, and can detect minute changes and hidden flaws before catastrophic events occur. The National Aeronautics and Space Administration (NASA), through the Aviation Safety Program (AVSP), has taken a lead role in the development of new sensor technologies and techniques for the in situ structural health monitoring of gas turbine engines. This paper presents a summary of key results and findings obtained from three different structural health monitoring approaches that have been investigated. This includes evaluating the performance of a novel microwave blade tip clearance sensor; a vibration-based crack detection technique using an externally mounted capacitive blade tip clearance sensor; and lastly the results of using data-driven anomaly detection algorithms for detecting cracks in a rotating disk.

  6. New Sensors and Techniques for the Structural Health Monitoring of Propulsion Systems

    PubMed Central

    2013-01-01

    The ability to monitor the structural health of rotating components, especially in the hot sections of turbine engines, is of major interest to the aero community for improving engine safety and reliability. The use of instrumentation for these applications remains very challenging. It requires sensors and techniques that are highly accurate, are able to operate in a high-temperature environment, and can detect minute changes and hidden flaws before catastrophic events occur. The National Aeronautics and Space Administration (NASA), through the Aviation Safety Program (AVSP), has taken a lead role in the development of new sensor technologies and techniques for the in situ structural health monitoring of gas turbine engines. This paper presents a summary of key results and findings obtained from three different structural health monitoring approaches that have been investigated. This includes evaluating the performance of a novel microwave blade tip clearance sensor; a vibration-based crack detection technique using an externally mounted capacitive blade tip clearance sensor; and lastly the results of using data-driven anomaly detection algorithms for detecting cracks in a rotating disk. PMID:23935425

  7. Application effectiveness of the microtremor survey method in the exploration of geothermal resources

    NASA Astrophysics Data System (ADS)

    Tian, Baoqing; Xu, Peifen; Ling, Suqun; Du, Jianguo; Xu, Xueqiu; Pang, Zhonghe

    2017-10-01

    Geophysical techniques are critical tools in geothermal resource surveys. In recent years, the microtremor survey method, which has two branch techniques (the microtremor sounding technique and the two-dimensional (2D) microtremor profiling technique), has become a common method for geothermal resource exploration. The results of microtremor surveys provide important deep information for probing the structures of geothermal storing basins and researching heat-controlling structures, as well as providing a basis for positioning geothermal wells for drilling. In this paper, the southern Jiangsu geothermal resources area is taken as a study example. By comparing microtremor survey results with drilling conclusions, and analyzing microtremor survey effectiveness together with geological and technical factors such as observation radius and sampling frequency, we study the applicability of the microtremor survey method and the optimal way of working with it to achieve better detection results. A comparative study of survey results and geothermal drilling results shows that the microtremor sounding technique effectively distinguishes sub-layers and determines the depth of geothermal reservoirs in areas with excellent layer conditions. The error in depth is generally no more than 8% compared with the drilling results, and deeper targets can be detected by adjusting the size of the probing radius. The 2D microtremor profiling technique precisely probes buried structures, which appear as low-velocity anomalies in the apparent S-wave velocity profile. Such anomalies are the critical signature by which the 2D microtremor profiling technique distinguishes and explains buried geothermal structures. 2D microtremor profiling results provide an important basis for precisely siting geothermal wells and reducing the risk of drilling dry wells.

  8. Complementary role of magnetic resonance imaging in the study of the fetal urinary system.

    PubMed

    Gómez Huertas, M; Culiañez Casas, M; Molina García, F S; Carrillo Badillo, M P; Pastor Pons, E

    2016-01-01

    Urinary system birth defects represent the abnormality most often detected in prenatal studies, accounting for 30% to 50% of all structural anomalies present at birth. The most common disorders are urinary tract dilation, developmental variants, cystic kidney diseases, kidney tumors, and bladder defects. These anomalies can present in isolation or in association with various syndromes. They are normally evaluated with sonography, and the use of magnetic resonance imaging (MRI) is considered only in inconclusive cases. In this article, we show the potential of fetal MRI as a technique to complement sonography in the study of fetal urinary system anomalies. We show the additional information that MRI can provide in each entity, especially in the evaluation of kidney function through diffusion-weighted sequences. Copyright © 2016 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  9. Remote Structural Health Monitoring and Advanced Prognostics of Wind Turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Douglas Brown; Bernard Laskowski

    The prospect of substantial investment in wind energy generation represents a significant capital investment strategy. In order to maximize the life-cycle of wind turbines, associated rotors, gears, and structural towers, a capability to detect and predict (prognostics) the onset of mechanical faults at a sufficiently early stage for maintenance actions to be planned would significantly reduce both maintenance and operational costs. Advancement towards this effort has been made through the development of anomaly detection, fault detection and fault diagnosis routines to identify selected fault modes of a wind turbine based on available sensor data preceding an unscheduled emergency shutdown. The anomaly detection approach employs spectral techniques to find an approximation of the data using a combination of attributes that capture the bulk of variability in the data. Fault detection and diagnosis (FDD) is performed using a neural network-based classifier trained from baseline and fault data recorded during known failure conditions. The approach has been evaluated for known baseline conditions and three selected failure modes: pitch rate failure, low oil pressure failure and a gearbox gear-tooth failure. Experimental results demonstrate the approach can distinguish between these failure modes and normal baseline behavior within a specified statistical accuracy.
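
    The spectral approach described above, approximating the data with a combination of attributes that capture the bulk of its variability, is in the spirit of PCA. A hedged sketch that scores samples by reconstruction error after projection onto the top principal components (synthetic data, not turbine measurements):

```python
import numpy as np

def pca_anomaly_scores(X_train, X_test, n_components=1):
    """Score test rows by reconstruction error after projecting onto the
    top principal components of the (assumed normal) training data."""
    mu = X_train.mean(axis=0)
    U, S, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
    P = Vt[:n_components]          # principal directions (rows)
    centered = X_test - mu
    recon = centered @ P.T @ P     # projection back into the principal subspace
    return np.linalg.norm(centered - recon, axis=1)

rng = np.random.default_rng(1)
t = rng.uniform(-1, 1, 100)
X_train = np.column_stack([t, 2 * t]) + rng.normal(0, 0.05, (100, 2))  # data near a line
X_test = np.array([[0.5, 1.0], [0.5, -1.0]])  # on-trend point vs off-trend point
scores = pca_anomaly_scores(X_train, X_test)
print(scores[1] > scores[0])  # the off-trend point has the larger reconstruction error
```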

  10. Unsupervised Ensemble Anomaly Detection Using Time-Periodic Packet Sampling

    NASA Astrophysics Data System (ADS)

    Uchida, Masato; Nawata, Shuichi; Gu, Yu; Tsuru, Masato; Oie, Yuji

    We propose an anomaly detection method for finding patterns in network traffic that do not conform to legitimate (i.e., normal) behavior. The proposed method trains a baseline model describing the normal behavior of network traffic without using manually labeled traffic data. The trained baseline model is used as the basis for comparison with the audit network traffic. This anomaly detection works in an unsupervised manner through the use of time-periodic packet sampling, which is used in a manner that differs from its intended purpose — the lossy nature of packet sampling is used to extract normal packets from the unlabeled original traffic data. Evaluation using actual traffic traces showed that the proposed method has false positive and false negative rates in the detection of anomalies regarding TCP SYN packets comparable to those of a conventional method that uses manually labeled traffic data to train the baseline model. Performance variation due to the probabilistic nature of sampled traffic data is mitigated by using ensemble anomaly detection that collectively exploits multiple baseline models in parallel. Alarm sensitivity is adjusted for the intended use by using maximum- and minimum-based anomaly detection that effectively take advantage of the performance variations among the multiple baseline models. Testing using actual traffic traces showed that the proposed anomaly detection method performs as well as one using manually labeled traffic data and better than one using randomly sampled (unlabeled) traffic data.
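
    The ensemble idea above, training several baseline models on different sampled subsets and adjusting sensitivity via maximum- or minimum-based scoring, can be sketched with simple z-score baselines. The traffic values and thresholds are illustrative, not the paper's traces:

```python
import random
import statistics

def make_baseline(sample):
    """Train one baseline model: a z-score detector from a (sampled) traffic trace."""
    mu, sd = statistics.mean(sample), statistics.stdev(sample)
    return lambda x: abs(x - mu) / sd

random.seed(42)
traffic = [random.gauss(100, 10) for _ in range(1000)]  # e.g. SYN packets per interval
# Each baseline model sees a different randomly sampled subset of the traffic
models = [make_baseline(random.sample(traffic, 200)) for _ in range(5)]

def ensemble_score(x, mode="max"):
    # "max" raises sensitivity (alarm if ANY model flags); "min" lowers it (ALL must flag)
    scores = [m(x) for m in models]
    return max(scores) if mode == "max" else min(scores)

print(ensemble_score(100), ensemble_score(200, "min"))  # normal rate vs anomalous rate
```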

  11. Feasibility of anomaly detection and characterization using trans-admittance mammography with 60 × 60 electrode array

    NASA Astrophysics Data System (ADS)

    Zhao, Mingkang; Wi, Hun; Lee, Eun Jung; Woo, Eung Je; In Oh, Tong

    2014-10-01

    Electrical impedance imaging has the potential to detect an early stage of breast cancer because tumors have higher admittivity values than normal breast tissues. The tumor size and the extent of axillary lymph node involvement are important parameters for evaluating the breast cancer survival rate. Additionally, anomaly characterization is required to distinguish a malignant tumor from a benign one. In order to overcome the limitations of breast cancer detection using impedance measurement probes, we developed a high-density trans-admittance mammography (TAM) system with a 60 × 60 electrode array and produced trans-admittance maps obtained at several frequency pairs. We applied an anomaly detection algorithm to the high-density TAM system to estimate the volume and position of a breast tumor. We tested four different sizes of anomaly with three different conductivity contrasts at four different depths. From multifrequency trans-admittance maps, we can readily observe the transversal position of an anomaly and estimate its volume and depth. In particular, the depth estimates were accurate and independent of anomaly size and conductivity contrast when applying the new formula based on the Laplacian of the trans-admittance map. The volume estimation depended on the conductivity contrast between the anomaly and the background in the breast phantom. We characterized two test anomalies using frequency-difference trans-admittance data to eliminate the dependency on anomaly position and size. We confirmed the anomaly detection and characterization algorithm with the high-density TAM system on bovine breast tissue. The results show the feasibility of detecting the size and position of an anomaly and of characterizing tissue for breast cancer screening.
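
    The record does not give the Laplacian-based depth formula itself. As an illustration only, a discrete 5-point Laplacian applied to a 2-D map highlights a localized anomaly against a flat background (all values hypothetical):

```python
def laplacian(grid):
    """5-point-stencil discrete Laplacian of a 2-D map (interior cells only)."""
    h, w = len(grid), len(grid[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i][j] = (grid[i - 1][j] + grid[i + 1][j]
                         + grid[i][j - 1] + grid[i][j + 1]
                         - 4 * grid[i][j])
    return out

# Flat background map with one raised "anomaly" cell
grid = [[1.0] * 5 for _ in range(5)]
grid[2][2] = 2.0
lap = laplacian(grid)
peak = max((abs(v), (i, j)) for i, row in enumerate(lap) for j, v in enumerate(row))
print(peak[1])  # the Laplacian magnitude peaks at the anomaly's transversal position
```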

  12. Challenges of using electrical resistivity method to locate karst conduits-A field case in the Inner Bluegrass Region, Kentucky

    USGS Publications Warehouse

    Zhu, J.; Currens, J.C.; Dinger, J.S.

    2011-01-01

    Conduits serve as major pathways for groundwater flow in karst aquifers. Locating them from the surface, however, is one of the most challenging tasks in karst research. Geophysical methods are often deployed to help locate voids by mapping variations in the physical properties of the subsurface. Conduits can cause significant, detectable contrasts in some physical properties; however, other subsurface features, such as water-bearing fractures, often yield similar contrasts that are difficult to distinguish from the effects of conduits. This study used the electrical resistivity method to search for an unmapped karst conduit that recharges Royal Spring in the Inner Bluegrass karst region, Kentucky, USA. Three types of resistivity techniques (surface 2D survey, quasi-3D survey, and time-lapse survey) were used to map and characterize resistivity anomalies. Some of the major anomalies were selected as drilling targets to verify the existence of the conduits. Drilling near an anomaly identified by an electrical resistivity profile resulted in successful penetration of a major water-filled conduit. The drilling results also suggest that, in this study area, low-resistivity anomalies are in general associated with water-bearing features. However, differences in the anomaly signals between the water-filled conduit and other water-bearing features, such as water-filled fracture zones, were indistinguishable. The electrical resistivity method is useful in conduit detection by providing potential drilling targets. Knowledge of the geology and hydrogeology of the site and professional judgment also played important roles in locating the major conduit. © 2011 Elsevier B.V.

  13. Fault Detection and Safety in Closed-Loop Artificial Pancreas Systems

    PubMed Central

    2014-01-01

    Continuous subcutaneous insulin infusion pumps and continuous glucose monitors enable individuals with type 1 diabetes to achieve tighter blood glucose control and are critical components in a closed-loop artificial pancreas. Insulin infusion sets can fail and continuous glucose monitor sensor signals can suffer from a variety of anomalies, including signal dropout and pressure-induced sensor attenuations. In addition to hardware-based failures, software and human-induced errors can cause safety-related problems. Techniques for fault detection, safety analyses, and remote monitoring techniques that have been applied in other industries and applications, such as chemical process plants and commercial aircraft, are discussed and placed in the context of a closed-loop artificial pancreas. PMID:25049365

  14. A Hybrid Semi-Supervised Anomaly Detection Model for High-Dimensional Data.

    PubMed

    Song, Hongchao; Jiang, Zhuqing; Men, Aidong; Yang, Bo

    2017-01-01

    Anomaly detection, which aims to identify observations that deviate from a nominal sample, is a challenging task for high-dimensional data. Traditional distance-based anomaly detection methods compute the neighborhood distance between each observation and suffer from the curse of dimensionality in high-dimensional space; for example, the distances between any pair of samples are similar and each sample may perform like an outlier. In this paper, we propose a hybrid semi-supervised anomaly detection model for high-dimensional data that consists of two parts: a deep autoencoder (DAE) and an ensemble k-nearest neighbor graphs- (K-NNG-) based anomaly detector. Benefiting from the ability of nonlinear mapping, the DAE is first trained to learn the intrinsic features of a high-dimensional dataset to represent the high-dimensional data in a more compact subspace. Several nonparametric KNN-based anomaly detectors are then built from different subsets that are randomly sampled from the whole dataset. The final prediction is made by all the anomaly detectors. The performance of the proposed method is evaluated on several real-life datasets, and the results confirm that the proposed hybrid model improves the detection accuracy and reduces the computational complexity.
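
    The KNN-based component of the model above can be approximated, for illustration, by a mean-distance-to-k-nearest-neighbors score (the DAE compression step is omitted and the data are illustrative):

```python
from math import dist

def knn_anomaly_score(x, data, k=3):
    """Mean distance to the k nearest neighbors; large values suggest an anomaly."""
    ds = sorted(dist(x, p) for p in data)
    return sum(ds[:k]) / k

normal = [(i * 0.1, i * 0.1) for i in range(20)]  # tight diagonal cloud of normal points
near = knn_anomaly_score((0.5, 0.5), normal)      # inside the cloud: small score
far = knn_anomaly_score((10.0, -10.0), normal)    # far from the cloud: large score
print(near < far)  # True
```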

  15. A Hybrid Semi-Supervised Anomaly Detection Model for High-Dimensional Data

    PubMed Central

    Jiang, Zhuqing; Men, Aidong; Yang, Bo

    2017-01-01

    Anomaly detection, which aims to identify observations that deviate from a nominal sample, is a challenging task for high-dimensional data. Traditional distance-based anomaly detection methods compute the neighborhood distance between each observation and suffer from the curse of dimensionality in high-dimensional space; for example, the distances between any pair of samples are similar and each sample may perform like an outlier. In this paper, we propose a hybrid semi-supervised anomaly detection model for high-dimensional data that consists of two parts: a deep autoencoder (DAE) and an ensemble k-nearest neighbor graphs- (K-NNG-) based anomaly detector. Benefiting from the ability of nonlinear mapping, the DAE is first trained to learn the intrinsic features of a high-dimensional dataset to represent the high-dimensional data in a more compact subspace. Several nonparametric KNN-based anomaly detectors are then built from different subsets that are randomly sampled from the whole dataset. The final prediction is made by all the anomaly detectors. The performance of the proposed method is evaluated on several real-life datasets, and the results confirm that the proposed hybrid model improves the detection accuracy and reduces the computational complexity. PMID:29270197

  16. Non-supervised method for early forest fire detection and rapid mapping

    NASA Astrophysics Data System (ADS)

    Artés, Tomás; Boca, Roberto; Liberta, Giorgio; San-Miguel, Jesús

    2017-09-01

    Natural hazards are a challenge for society, and the scientific community's efforts in prevention and damage mitigation have increased substantially. Monitoring and prevention are the most important means of minimizing natural hazard damage. This work focuses particularly on forest fires. This phenomenon depends on small-scale factors, and fire behavior is strongly related to the local weather. Forest fire spread forecasting is a complex task because of the scale of the phenomena, the uncertainty of the input data, and the time constraints of forest fire monitoring. Forest fire simulators have been improved with calibration techniques that mitigate data uncertainty and take into account complex factors such as the atmosphere. Such techniques dramatically increase the computational cost in a context where the time available to provide a forecast is a hard constraint. Furthermore, early mapping of the fire becomes crucial for assessing it. In this work, a non-supervised method for early forest fire detection and mapping is proposed. As its main sources, the method uses daily thermal anomalies from MODIS and VIIRS combined with a land cover map to identify and monitor forest fires with very few resources. The method relies on a clustering technique (the DBSCAN algorithm) and on filtering thermal anomalies to detect forest fires. In addition, a concave hull (alpha shape algorithm) is applied to obtain a rapid, very coarse mapping of the fire area. The method therefore has potential for high-resolution forest fire rapid mapping based on satellite imagery, using the extent of each early fire detection, and points the way to automatic rapid mapping of fires at high resolution while processing as little data as possible.
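
    The clustering step named above can be sketched with a minimal pure-Python DBSCAN; the hotspot coordinates, eps, and min_pts below are illustrative, whereas the operational method would cluster MODIS/VIIRS thermal-anomaly locations:

```python
from math import dist

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: label each point with a cluster id, or -1 for noise."""
    labels = [None] * len(points)

    def neighbors(i):
        return [j for j, q in enumerate(points) if dist(points[i], q) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # tentatively noise (may later become a border point)
            continue
        cluster += 1
        labels[i] = cluster
        seeds = nbrs
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reclassified as a border point
            if labels[j] is None:
                labels[j] = cluster
                jn = neighbors(j)
                if len(jn) >= min_pts:
                    seeds.extend(jn)  # j is a core point: keep expanding the cluster
    return labels

# Two dense hotspot groups (two fires) plus one isolated detection (noise)
hotspots = [(0, 0), (0.3, 0.2), (0.1, 0.4), (5, 5), (5.2, 5.1), (5.1, 4.8), (20, 20)]
labels = dbscan(hotspots, eps=1.0, min_pts=3)
print(labels)
```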

  17. Anomaly detection in reconstructed quantum states using a machine-learning technique

    NASA Astrophysics Data System (ADS)

    Hara, Satoshi; Ono, Takafumi; Okamoto, Ryo; Washio, Takashi; Takeuchi, Shigeki

    2014-02-01

    The accurate detection of small deviations in given density matrices is important for quantum information processing. Here we propose a method based on the concept of data mining. We demonstrate that the proposed method can more accurately detect small erroneous deviations in reconstructed density matrices, which contain intrinsic fluctuations due to the limited number of samples, than a naive method of checking the trace distance from the average of the given density matrices. This method has the potential to be a key tool in broad areas of physics where the detection of small deviations of quantum states reconstructed using a limited number of samples is essential.
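
    The naive baseline mentioned above checks the trace distance between density matrices. For reference, a small NumPy sketch of the trace distance itself, with illustrative example states:

```python
import numpy as np

def trace_distance(rho, sigma):
    """Trace distance 0.5 * ||rho - sigma||_1 between two density matrices."""
    return 0.5 * np.abs(np.linalg.eigvalsh(rho - sigma)).sum()

rho = np.array([[1.0, 0.0], [0.0, 0.0]])    # pure state |0><0|
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])  # maximally mixed qubit state
print(trace_distance(rho, sigma))  # 0.5
```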

  18. Framework for behavioral analytics in anomaly identification

    NASA Astrophysics Data System (ADS)

    Touma, Maroun; Bertino, Elisa; Rivera, Brian; Verma, Dinesh; Calo, Seraphin

    2017-05-01

    Behavioral Analytics (BA) relies on digital breadcrumbs to build user profiles and create clusters of entities that exhibit a large degree of similarity. The prevailing assumption is that an entity will assimilate the group behavior of the cluster it belongs to. Our understanding of BA and its application in different domains continues to evolve and is a direct result of the growing interest in Machine Learning research. When trying to detect security threats, we use BA techniques to identify anomalies, defined in this paper as deviations from the group behavior. Early research papers in this field reveal a high number of false positives, where a security alert is triggered by a deviation from the cluster's learned behavior that is still within the norm of what the system defines as acceptable behavior. Further, domain-specific security policies tend to be narrow and inadequately represent what an entity can do. Hence, they: a) limit the amount of useful data during the learning phase; and, b) lead to violations of policy during the execution phase. In this paper, we propose a framework for future research on the role of policies and behavior security in a coalition setting, with emphasis on anomaly detection and individuals' deviation from group activities.

  19. Expectation Maximization and its Application in Modeling, Segmentation and Anomaly Detection

    DTIC Science & Technology

    2008-05-01

    ...incomplete data problems. The incompleteness of the data may be due to missing data, censored distributions, etc. One such case is a...Estimation Techniques in Computer Huiyan, Z., Yongfeng, C., Wen, Y. SAR Image Segmentation Using MPM Constrained Stochastic Relaxation. Civil Engineering
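
    The expectation-maximization algorithm named in this record's title can be illustrated with a two-component one-dimensional Gaussian mixture, alternating E-steps (responsibilities) and M-steps (parameter updates). The synthetic data are illustrative, not the report's SAR imagery:

```python
import math
import random

def em_gmm_1d(data, iters=50):
    """EM for a two-component 1-D Gaussian mixture; returns (means, variances, weights)."""
    mu = [min(data), max(data)]   # crude initialization at the data extremes
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate parameters from the responsibilities
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk, 1e-6)
            pi[k] = nk / len(data)
    return mu, var, pi

random.seed(0)
data = [random.gauss(0, 1) for _ in range(200)] + [random.gauss(8, 1) for _ in range(200)]
mu, var, pi = em_gmm_1d(data)
print(sorted(mu))  # component means recovered near 0 and 8
```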

  20. An Investigation of Techniques for Detecting Data Anomalies in Earned Value Management Data

    DTIC Science & Technology

    2011-12-01

    Management Studio Harte Hanks Trillium Software Trillium Software System IBM Info Sphere Foundation Tools Informatica Data Explorer Informatica ...Analyst Informatica Developer Informatica Administrator Pitney Bowes Business Insight Spectrum SAP BusinessObjects Data Quality Management DataFlux...menting quality monitoring efforts and tracking data quality improvements Informatica http://www.informatica.com/products_services/Pages/index.aspx

  1. Respiratory Artefact Removal in Forced Oscillation Measurements: A Machine Learning Approach.

    PubMed

    Pham, Thuy T; Thamrin, Cindy; Robinson, Paul D; McEwan, Alistair L; Leong, Philip H W

    2017-08-01

    Respiratory artefact removal for the forced oscillation technique can be treated as an anomaly detection problem. Manual removal is currently considered the gold standard, but this approach is laborious and subjective. Most existing automated techniques use simple statistics and/or reject anomalous data points. Unfortunately, simple statistics are insensitive to numerous artefacts, leading to low reproducibility of results. Furthermore, rejecting anomalous data points causes an imbalance between the inspiratory and expiratory contributions. From a machine learning perspective, such methods are unsupervised and can be considered simple feature extraction. We hypothesize that supervised techniques can be used to find improved features that are more discriminative and more highly correlated with the desired output. Features thus found are then used for anomaly detection by applying quartile thresholding, which rejects a complete breath if any of its features is out of range. The thresholds are determined by both saliency and performance metrics rather than by qualitative assumptions as in previous works. Feature ranking indicates that our new landmark features are among the highest-scoring candidates regardless of age and across saliency criteria. F1-scores, receiver operating characteristic curves, and variability of the mean resistance metrics show that the proposed scheme outperforms previous simple feature extraction approaches. Our subject-independent detector, 1IQR-SU, demonstrated approval rates of 80.6% for adults and 98% for children, higher than existing methods. Our new features are more relevant, our removal is objective and comparable to the manual method, and this work is a critical step toward automating quality control for the forced oscillation technique.
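
    The quartile thresholding described above can be sketched with Tukey-style fences: a breath is rejected when one of its features falls outside [Q1 - k*IQR, Q3 + k*IQR]. The feature values, the crude quartile estimator, and k below are illustrative, not the paper's 1IQR-SU settings:

```python
def iqr_bounds(values, k=1.5):
    """Tukey-style fences from a crude quartile estimate of the values."""
    s = sorted(values)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]  # simple quartile picks, good enough for a sketch
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

def reject_breaths(feature_per_breath, k=1.5):
    """Indices of breaths whose feature value lies outside the fences."""
    lo, hi = iqr_bounds(feature_per_breath, k)
    return [i for i, v in enumerate(feature_per_breath) if not (lo <= v <= hi)]

# Per-breath resistance values with one artefact-corrupted breath
resistance = [4.1, 4.3, 3.9, 4.0, 4.2, 9.5, 4.1, 4.0]
print(reject_breaths(resistance))  # [5]
```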

  2. [Prenatal diagnosis. Review, personal and prospective studies].

    PubMed

    Engel, E; Empson, J; DeLozier, D; McGee, B; da Costa Woodson, E; Engel-de Montmollin, M; Carter, T; Lorber, C; Cassidy, S B; Millis, J; Heller, R M; Boehm, F; Vanhooydonk, J

    1979-07-07

    1. In a review of methods developed for the identification of fetal malformations, the technique, risks and results of amniocentesis are presented. 2. Large series already published have demonstrated the relative simplicity and feasibility of the procedure as well as current indications for its utilization. These include the detection of chromosomal anomalies, the determination of sex (in certain sex-linked disorders), documentation of enzymatic and metabolic deficiencies, and the demonstration of open lesions of the neural tube by appropriate techniques. 3. Experience with over 500 cases personally tested by the authors entirely confirms the major indications for and benefits of this modern method for the detection and prevention of severe congenital anomalies during early pregnancy. 4. The identification of chromosomal alterations is currently the major objective of the method. Increased risks are associated with pregnancies involving a maternal age of 35 years or older (which account for 1-3% of aneuploidies), the birth of a previous infant with free trisomy 21 (1% recurrence risk) or secondary to a parental chromosome translocation (as much as 10% risk of aneuploidy). Fetal karyotyping for determination of sex, in cases where the mother is a carrier of an X-linked recessive gene (on average, 50% of male offspring will be affected), is an inadequate method of diagnosis to be utilized only until alternative techniques render possible specific diagnosis of the anomalies under consideration (hemophilias A and B, muscular dystrophy, etc.). 5. Several of these techniques are now nearing development through the advent of fetoscopy and advanced ultrasound methodology, and have already been applied to the detection of certain sex-linked disorders and also for diagnosis of hemoglobinopathies (thalassemias, sickle cell anemia) and other conditions requiring the obtaining of fetal blood for diagnosis. 
Technology allowing direct examination of fetal parts by means of optical instruments is particularly useful in cases where a severe fetal morphologic malformation cannot currently be identified by indirect visualization (ultrasound) or by analysis of cytogenetic or molecular markers. 6. Pathological accumulations of alpha-fetoprotein, which are associated with diverse feto-placental abnormalities (particularly open malformations of the neural tube), can be detected in the amniotic fluid and/or maternal blood. As an extension of this approach, it is foreseeable that conditions existing prenatally will be diagnosed in a growing number of cases from the study of fetal cells and molecules isolated from the venous blood of pregnant women. This will become feasible thanks to well-developed techniques that allow separation of fetal from maternal cells and metabolites, and to extremely fine analytic techniques, notably examination of the DNA itself by means of restriction enzymes.

  3. NDE of ceramics and ceramic composites

    NASA Technical Reports Server (NTRS)

    Vary, Alex; Klima, Stanley J.

    1991-01-01

Although nondestructive evaluation (NDE) techniques for ceramics are fairly well developed, in many cases they are difficult to apply for high-probability detection of the minute flaws that can cause failure in monolithic ceramics. Conventional NDE techniques are available for monolithic and fiber-reinforced ceramic matrix composites, but the more exact quantitative techniques that are needed are still being investigated and developed. Needs range from detection of flaws below the 100 micron level in monolithic ceramics to global imaging of fiber architecture and matrix densification anomalies in ceramic composites. NDE techniques that will ultimately be applicable to production and quality control of ceramic structures are still emerging from the laboratory. Needs differ depending on the processing stage, fabrication method, and nature of the finished product. NDE techniques are being developed in concert with materials processing research, where they can provide feedback to processing development and quality improvement. NDE techniques also serve as research tools for materials characterization and for understanding failure processes, e.g., during thermomechanical testing.

  4. Modeling EEG Waveforms with Semi-Supervised Deep Belief Nets: Fast Classification and Anomaly Measurement

    PubMed Central

    Wulsin, D. F.; Gupta, J. R.; Mani, R.; Blanco, J. A.; Litt, B.

    2011-01-01

Clinical electroencephalography (EEG) records vast amounts of complex human data, yet it is still reviewed primarily by human readers. Deep Belief Nets (DBNs) are a relatively new type of multi-layer neural network commonly tested on two-dimensional image data, but they are rarely applied to time-series data such as EEG. We apply DBNs in a semi-supervised paradigm to model EEG waveforms for classification and anomaly detection. DBN performance was comparable to standard classifiers on our EEG dataset, and classification time was found to be 1.7 to 103.7 times faster than the other high-performing classifiers. We demonstrate how the unsupervised step of DBN learning produces an autoencoder that can naturally be used in anomaly measurement. We compare the use of raw, unprocessed data—a rarity in automated physiological waveform analysis—to hand-chosen features and find that raw data produces comparable classification and better anomaly measurement performance. These results indicate that DBNs and raw data inputs may be more effective for online automated EEG waveform recognition than other common techniques. PMID:21525569
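The anomaly-measurement idea above, scoring samples by how poorly an autoencoder reconstructs them, can be sketched as follows. This is an illustrative stand-in using a linear (PCA-style) autoencoder on synthetic data, not the authors' DBN; all names and data are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
# "Normal" samples lie near a 2-dimensional subspace of a 16-dim space.
normal = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 16))
normal += 0.05 * rng.normal(size=normal.shape)

# Linear autoencoder fitted by SVD: encode to 2 components, decode back.
mean = normal.mean(axis=0)
_, _, Vt = np.linalg.svd(normal - mean, full_matrices=False)
components = Vt[:2]

def anomaly_score(x):
    """Reconstruction error: large when x leaves the learned subspace."""
    code = (x - mean) @ components.T      # encode
    recon = code @ components + mean      # decode
    return np.linalg.norm(x - recon, axis=-1)

scores_normal = anomaly_score(normal)
outlier = 3.0 * rng.normal(size=16)      # far from the learned subspace
print(anomaly_score(outlier) > scores_normal.max())  # True
```

A DBN replaces the linear encoder/decoder with stacked nonlinear layers, but the scoring principle (reconstruction error as anomaly measure) is the same.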

  5. Visualizing Uncertainty for Data Fusion Graphics: Review of Selected Literature and Industry Approaches

    DTIC Science & Technology

    2015-06-09

anomaly detection, which is generally considered part of high level information fusion (HLIF) involving temporal-geospatial data as well as meta-data... Anomaly detection in the Maritime defence and security domain typically focusses on trying to identify vessels that are behaving in an unusual... manner compared with lawful vessels operating in the area – an applied case of target detection among distractors. Anomaly detection is a complex problem

  6. Multilayer Statistical Intrusion Detection in Wireless Networks

    NASA Astrophysics Data System (ADS)

    Hamdi, Mohamed; Meddeb-Makhlouf, Amel; Boudriga, Noureddine

    2008-12-01

The rapid proliferation of mobile applications and services has introduced new vulnerabilities that do not exist in fixed wired networks. Traditional security mechanisms, such as access control and encryption, turn out to be inefficient in modern wireless networks. Given the shortcomings of these protection mechanisms, an important research focus is intrusion detection systems (IDSs). This paper proposes a multilayer statistical intrusion detection framework for wireless networks. The architecture is well suited to wireless networks because the underlying detection models rely on radio parameters and traffic models. Accurate correlation between radio and traffic anomalies enhances the efficiency of the IDS. A radio signal fingerprinting technique based on the maximal overlap discrete wavelet transform (MODWT) is developed. Moreover, a geometric clustering algorithm is presented. Depending on the characteristics of the fingerprinting technique, the clustering algorithm permits control of the false positive and false negative rates. Finally, simulation experiments have been carried out to validate the proposed IDS.

  7. An incremental anomaly detection model for virtual machines.

    PubMed

    Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu

    2017-01-01

The Self-Organizing Map (SOM) algorithm, an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organizing and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Moreover, cloud platforms with large numbers of virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing character, which leaves the algorithm with low accuracy and poor scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to reduce detection time by taking into account the large scale and high dynamics of virtual machines on cloud platforms. To demonstrate its effectiveness, experiments have been performed on the common KDD Cup benchmark dataset and on a real dataset. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms.
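The underlying SOM-based scoring can be illustrated with a minimal sketch: train a small map on normal data only, then flag samples whose quantization error (distance to the nearest prototype) exceeds anything seen during training. This is a plain SOM on synthetic data, not the paper's IISOM model:

```python
import numpy as np

rng = np.random.default_rng(1)
normal = rng.normal(size=(400, 4))        # synthetic "normal" VM metrics

# Minimal 4x4 SOM trained on normal behaviour only.
grid = np.array([(i, j) for i in range(4) for j in range(4)], float)
weights = rng.normal(size=(16, 4))
steps = 5 * len(normal)
for t, x in enumerate(np.tile(normal, (5, 1))):
    frac = 1 - t / steps
    lr = 0.5 * frac                                    # decaying learning rate
    sigma = 2.0 * frac + 0.1                           # shrinking neighbourhood
    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
    h = np.exp(-((grid - grid[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
    weights += lr * h[:, None] * (x - weights)

def quantization_error(x):
    """Distance to the nearest SOM prototype; high for anomalous samples."""
    return np.sqrt(((weights - x) ** 2).sum(axis=1)).min()

threshold = max(quantization_error(x) for x in normal)
print(quantization_error(np.full(4, 6.0)) > threshold)  # True: flagged
```

IISOM's contributions (heuristic initialization, weighted distances, neighborhood search) accelerate exactly this train-then-score loop.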

  8. Novel Hyperspectral Anomaly Detection Methods Based on Unsupervised Nearest Regularized Subspace

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Chen, Y.; Tan, K.; Du, P.

    2018-04-01

Anomaly detection has been of great interest in hyperspectral imagery analysis. Most conventional anomaly detectors merely take advantage of spectral and spatial information within neighboring pixels. In this paper, two methods, the Unsupervised Nearest Regularized Subspace-based with Outlier Removal Anomaly Detector (UNRSORAD) and Local Summation UNRSORAD (LSUNRSORAD), are proposed. They are based on the concept that each pixel in the background can be approximately represented by its spatial neighborhood, while anomalies cannot. Using a dual window, each test pixel is approximated by a linear combination of the surrounding data. Because outliers in the dual window degrade detection accuracy, the proposed detectors remove outlier pixels that are significantly different from the majority of pixels. To make full use of the local spatial distribution information in the pixels neighboring the pixel under test, we adopt a local summation dual-window sliding strategy. The residual image is constructed by subtracting the predicted background from the original hyperspectral imagery, and anomalies can be detected in the residual image. Experimental results show that the proposed methods greatly improve detection accuracy compared with other traditional detection methods.
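The core background-representation step can be sketched as follows: each test pixel is approximated by a ridge-regularized linear combination of its dual-window neighbors, and the representation residual serves as the anomaly score. This simplified sketch on synthetic data omits the outlier-removal and local-summation refinements that distinguish UNRSORAD and LSUNRSORAD:

```python
import numpy as np

rng = np.random.default_rng(2)
bands = 30
base = rng.normal(size=bands)                        # background spectrum
cube = np.tile(base, (20, 20, 1)) + 0.01 * rng.normal(size=(20, 20, bands))
cube[10, 10] += rng.normal(size=bands)               # implanted anomaly

def residual_score(cube, r, c, inner=1, outer=2, lam=1e-3):
    """Ridge-regularized least-squares representation of the test pixel by
    its dual-window neighbours; anomalies are poorly represented."""
    y = cube[r, c]
    neigh = [cube[i, j]
             for i in range(r - outer, r + outer + 1)
             for j in range(c - outer, c + outer + 1)
             if max(abs(i - r), abs(j - c)) > inner
             and 0 <= i < cube.shape[0] and 0 <= j < cube.shape[1]]
    A = np.array(neigh).T                            # bands x neighbours
    alpha = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)
    return np.linalg.norm(y - A @ alpha)

print(residual_score(cube, 10, 10) > 10 * residual_score(cube, 5, 5))  # True
```

The inner window excludes the pixel's immediate surroundings so that a compact anomaly cannot "explain" itself through contaminated neighbors.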

  9. An incremental anomaly detection model for virtual machines

    PubMed Central

    Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu

    2017-01-01

The Self-Organizing Map (SOM) algorithm, an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organizing and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Moreover, cloud platforms with large numbers of virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing character, which leaves the algorithm with low accuracy and poor scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to reduce detection time by taking into account the large scale and high dynamics of virtual machines on cloud platforms. To demonstrate its effectiveness, experiments have been performed on the common KDD Cup benchmark dataset and on a real dataset. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms. PMID:29117245

  10. Advancements of Data Anomaly Detection Research in Wireless Sensor Networks: A Survey and Open Issues

    PubMed Central

    Rassam, Murad A.; Zainal, Anazida; Maarof, Mohd Aizaini

    2013-01-01

Wireless Sensor Networks (WSNs) are important and necessary platforms for the future, as the concept of the “Internet of Things” has emerged lately. They are used for monitoring, tracking, or controlling many applications in industry, health care, habitat, and the military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements to design efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches into five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future work. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed. PMID:23966182

  11. ISHM Anomaly Lexicon for Rocket Test

    NASA Technical Reports Server (NTRS)

    Schmalzel, John L.; Buchanan, Aubri; Hensarling, Paula L.; Morris, Jonathan; Turowski, Mark; Figueroa, Jorge F.

    2007-01-01

Integrated Systems Health Management (ISHM) is a comprehensive capability. An ISHM system must detect anomalies, identify causes of such anomalies, predict future anomalies, and help identify the consequences of anomalies, for example by suggesting mitigation steps. The system should also provide users with appropriate navigation tools to facilitate the flow of information into and out of the ISHM system. Central to the ability of the ISHM to detect anomalies is a clearly defined catalog of anomalies. Further, this lexicon of anomalies must be organized in ways that make it accessible to a suite of tools used to manage the data, information and knowledge (DIaK) associated with a system. In particular, it is critical to ensure that there is optimal mapping between target anomalies and the algorithms associated with their detection. During the early development of our ISHM architecture and approach, it became clear that a lexicon of anomalies would be important to the development of critical anomaly detection algorithms. In our work in the rocket engine test environment at John C. Stennis Space Center, we have access to a repository of discrepancy reports (DRs) that are generated in response to squawks identified during post-test data analysis. The DR is the tool used to document anomalies and the methods used to resolve the issue. These DRs have been generated for many different tests and for all test stands. The result is that they represent a comprehensive summary of the anomalies associated with rocket engine testing. Fig. 1 illustrates some of the data that can be extracted from a DR. Such information includes affected transducer channels, a narrative description of the observed anomaly, and the steps used to correct the problem. The primary goal of the anomaly lexicon development efforts we have undertaken is to create a lexicon that could be used in support of an associated health assessment database system (HADS) co-development effort.
There are a number of significant byproducts of the anomaly lexicon compilation effort: (1) it allows determination of the frequency distribution of anomalies, helping to identify those with the potential for high return on investment if included in automated detection as part of an ISHM system; (2) the availability of a regular lexicon provides base anomaly name choices to help maintain consistency in the DR collection process; and (3) although developed for the rocket engine test environment, most of the anomalies are not specific to rocket testing and thus can be reused in other applications.

  12. Very Large Graphs for Information Extraction (VLG) Detection and Inference in the Presence of Uncertainty

    DTIC Science & Technology

    2015-09-21

this framework, MIT LL carried out a one-year proof-of-concept study to determine the capabilities and challenges in the detection of anomalies in... extremely large graphs [5]. Under this effort, two real datasets were considered, and algorithms for data modeling and anomaly detection were developed... is required in a well-defined experimental framework for the detection of anomalies in very large graphs. This study is intended to inform future

  13. An Adaptive Network-based Fuzzy Inference System for the detection of thermal and TEC anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake of 11 August 2012

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-09-01

Anomaly detection is extremely important for forecasting the date, location and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS), a well-known hybrid neuro-fuzzy network for modeling non-linear complex systems, is proposed to detect the thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake that struck NW Iran on 11 August 2012. The anomalies detected using the proposed method are also compared with those observed by applying classical and intelligent methods, including Interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN) and Support Vector Machine (SVM) methods. The dataset, which comprises Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), spans 62 days. If the difference between the value predicted by the ANFIS method and the observed value exceeds a pre-defined threshold, then the observed value, in the absence of non-seismic driving parameters, can be regarded as a precursory anomaly. For the two precursors, LST and TEC, the ANFIS method shows very good agreement with the other implemented classical and intelligent methods, indicating that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. The detection of the thermal and TEC anomalies thus derives its credibility from the overall efficiency of the five integrated methods.
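The thresholding rule described above (flag a day when the prediction residual exceeds a pre-defined threshold) can be sketched as follows; a moving-median predictor stands in for the ANFIS model, and the series and spike are synthetic illustrations:

```python
import numpy as np

rng = np.random.default_rng(3)
days = np.arange(62)
# Synthetic 62-day TEC-like series with a spike near the end of the window.
tec = 20 + 2 * np.sin(2 * np.pi * days / 27) + 0.3 * rng.normal(size=62)
tec[59] += 4.0                                   # precursor-like anomaly

# Stand-in predictor: median of the preceding 7 days (the paper uses ANFIS).
pred = np.array([np.median(tec[max(0, t - 7):t]) if t else tec[0]
                 for t in days])
resid = tec - pred
threshold = 2.0 * resid[:50].std()               # calibrated on earlier days
anomalous_days = days[np.abs(resid) > threshold]
print(59 in anomalous_days)                      # True: the spike is flagged
```

Any of the five methods in the paper can be substituted for the predictor; only the residual-versus-threshold decision rule is shown here.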

  14. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines

    PubMed Central

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-01-01

In a complex system, condition monitoring (CM) can collect the system working status. The condition is mainly sensed by the pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict upcoming anomalies, faults, or failures. There is also some research which focuses on the faults or anomalies of the sensing element (i.e., the sensor) to enhance system reliability. However, existing approaches ignore the correlation between the sensor selection strategy and data anomaly detection, which can also improve system reliability. To address this issue, we study a new scheme which includes a sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets utilized for the evaluation were provided by the National Aeronautics and Space Administration (NASA) Ames Research Center and were used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing the two different sensor selection strategies, the effect of the selection method on data anomaly detection is demonstrated. PMID:27136561
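The scheme's three steps (weight sensors by mutual information, model cross-sensor relations with GPR, flag large prediction residuals) can be sketched as follows. The sensors, injected fault, and thresholding rule here are synthetic illustrations, not the NASA PHM data or the authors' exact procedure:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(4)
t = np.linspace(0, 10, 200)
target = np.sin(t) + 0.05 * rng.normal(size=t.size)      # monitored sensor
corr = np.sin(t + 0.1) + 0.05 * rng.normal(size=t.size)  # correlated sensor
noise = rng.normal(size=t.size)                          # unrelated sensor

# Step 1: weight candidate sensors by mutual information with the target.
X = np.column_stack([corr, noise])
mi = mutual_info_regression(X, target, random_state=0)
best = int(np.argmax(mi))                 # picks the correlated sensor

# Step 2: GPR models the cross-sensor relation on healthy data.
gpr = GaussianProcessRegressor(alpha=1e-2).fit(
    X[:150, best:best + 1], target[:150])

# Step 3: flag test points whose prediction residual is abnormally large.
resid_train = target[:150] - gpr.predict(X[:150, best:best + 1])
threshold = 6 * resid_train.std()
test_target = target[150:].copy()
test_target[20] += 1.5                    # injected sensor fault
resid = np.abs(test_target - gpr.predict(X[150:, best:best + 1]))
print(bool(resid[20] > threshold))        # True: the fault is flagged
```

The `alpha` term adds observation noise to the GPR so the training residuals, and hence the threshold, reflect normal sensor scatter.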

  15. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines.

    PubMed

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-04-29

In a complex system, condition monitoring (CM) can collect the system working status. The condition is mainly sensed by the pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict upcoming anomalies, faults, or failures. There is also some research which focuses on the faults or anomalies of the sensing element (i.e., the sensor) to enhance system reliability. However, existing approaches ignore the correlation between the sensor selection strategy and data anomaly detection, which can also improve system reliability. To address this issue, we study a new scheme which includes a sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets utilized for the evaluation were provided by the National Aeronautics and Space Administration (NASA) Ames Research Center and were used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing the two different sensor selection strategies, the effect of the selection method on data anomaly detection is demonstrated.

  16. Variable Discretisation for Anomaly Detection using Bayesian Networks

    DTIC Science & Technology

    2017-01-01

Bayesian network implementations usually require each variable to take on a finite number of mutually... Variable Discretisation for Anomaly Detection using Bayesian Networks, Jonathan Legg, National Security and ISR Division, Defence Science and Technology Group, DST-Group-TR-3328. ABSTRACT: Anomaly detection is the process by which low probability events are automatically found against a

  17. Enhanced detection and visualization of anomalies in spectral imagery

    NASA Astrophysics Data System (ADS)

    Basener, William F.; Messinger, David W.

    2009-05-01

Anomaly detection algorithms applied to hyperspectral imagery are able to reliably identify man-made objects in a natural environment based on statistical/geometric likelihood. The process is more robust than target identification, which requires precise prior knowledge of the object of interest, but it has an inherently higher false alarm rate. Standard anomaly detection algorithms measure the deviation of pixel spectra from a parametric model (either statistical or linear mixing) estimating the image background. The topological anomaly detector (TAD) creates a fully non-parametric, graph-theory-based, topological model of the image background and measures deviation from this background using codensity. In this paper we present a large-scale comparative test of TAD against 80+ targets in four full HYDICE images using the entire canonical target set for generation of ROC curves. TAD is compared against several statistics-based detectors, including local RX and subspace RX. Even a perfect anomaly detection algorithm would have a high practical false alarm rate in most scenes simply because the user/analyst is not interested in every anomalous object. To assist the analyst in identifying and sorting objects of interest, we investigate coloring of the anomalies with principal components projections using statistics computed from the anomalies. This gives a very useful colorization of anomalies in which objects of similar material tend to have the same color, enabling an analyst to quickly sort and identify anomalies of highest interest.
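For reference, the (global) RX detector that TAD is compared against scores each pixel by its Mahalanobis distance from image-wide background statistics. A minimal sketch on synthetic spectra:

```python
import numpy as np

rng = np.random.default_rng(5)
bands = 20
background = rng.normal(size=(1000, bands))            # background spectra
pixels = np.vstack([background, np.full(bands, 3.0)])  # plus one anomaly

# Global RX: squared Mahalanobis distance from image-wide mean/covariance.
mu = pixels.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(pixels, rowvar=False))
diff = pixels - mu
rx = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)

print(int(np.argmax(rx)))  # 1000: the implanted anomaly scores highest
```

Local RX replaces the image-wide statistics with those of a sliding window around each pixel; TAD avoids these parametric statistics entirely.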

  18. Pediatric tinnitus: Incidence of imaging anomalies and the impact of hearing loss.

    PubMed

    Kerr, Rhorie; Kang, Elise; Hopkins, Brandon; Anne, Samantha

    2017-12-01

Guidelines exist for evaluation and management of tinnitus in adults; however, lack of evidence in children limits the applicability of these guidelines to pediatric patients. The objective of this study is to determine the incidence of inner ear anomalies detected on imaging studies within the pediatric population with tinnitus and to evaluate whether the presence of hearing loss increases the rate of detection of anomalies in comparison to normal-hearing patients. Retrospective review of all children with a diagnosis of tinnitus from 2010 to 2015 at a tertiary care academic center. 102 pediatric patients with tinnitus were identified. Overall, 53 patients had imaging studies, with 6 abnormal findings (11.3%). 51/102 patients had hearing loss, of whom 33 had imaging studies, demonstrating 6 inner ear anomalies. This is an incidence of 18.2% for inner ear anomalies identified in patients with hearing loss (95% confidence interval (CI) of 7.0-35.5%). 4 of these 6 inner ear anomalies were vestibular aqueduct abnormalities. The other two anomalies were cochlear hypoplasia and bilateral semicircular canal dysmorphism. 51 patients had no hearing loss and of these patients, 20 had imaging studies with no inner ear abnormalities detected. There was no statistical difference in the incidence of abnormal imaging findings between patients with and without hearing loss (Fisher's exact test, p = 0.072). CONCLUSION: There is a high incidence of anomalies detected in imaging studies done in pediatric patients with tinnitus, especially in the presence of hearing loss. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Electrical Resistivity Measurements: a Review

    NASA Astrophysics Data System (ADS)

    Singh, Yadunath

Worldwide interest in the use of ceramic materials for aerospace and other advanced engineering applications has led to the need for inspection techniques capable of detecting unusual electrical and thermal anomalies in these compounds. Modern ceramic materials offer many attractive physical, electrical and mechanical properties for a wide and rapidly growing range of industrial applications; moreover, specific use may be made of their electrical resistance, chemical resistance, and thermal barrier properties. In this review, we report the development of various techniques for the resistivity measurement of solid samples.

  20. SERC 2014-2018 Technical Plan

    DTIC Science & Technology

    2013-10-25

assurance-case analysis that are not only more powerful in anomaly detection, but also leading to stronger possibilities for positive assurance and to... base for responding to cyber-attacks; and (c) combine techniques developed for automatic control systems in a manner that will both enable defense

  1. Neural network architectures to analyze OPAD data

    NASA Technical Reports Server (NTRS)

    Whitaker, Kevin W.

    1992-01-01

    A prototype Optical Plume Anomaly Detection (OPAD) system is now installed on the space shuttle main engine (SSME) Technology Test Bed (TTB) at MSFC. The OPAD system requirements dictate the need for fast, efficient data processing techniques. To address this need of the OPAD system, a study was conducted into how artificial neural networks could be used to assist in the analysis of plume spectral data.

  2. Anomaly-Based Intrusion Detection Systems Utilizing System Call Data

    DTIC Science & Technology

    2012-03-01

Functionality Description Persistence mechanism Mimicry technique Camouflage malware image: • renaming its image • appending its image to victim... particular industrial plant. Exactly which one was targeted still remains unknown; however, a majority of the attacks took place in Iran [24]. Due... plant to unstable phase and eventually physical damage. It is interesting to note that a particular block of code - block DB8061 - is automatically

  3. Lower Mantle S-wave Velocity Model under the Western United States

    NASA Astrophysics Data System (ADS)

    Nelson, P.; Grand, S. P.

    2016-12-01

Deep mantle plumes created by thermal instabilities at the core-mantle boundary have been an explanation for intraplate volcanism since the 1970's. Recently, broad slow-velocity conduits in the lower mantle underneath some hotspots have been observed (French and Romanowicz, 2015); however, the direct detection of a classical thin mantle plume using seismic tomography has remained elusive. Herein, we present a seismic tomography technique designed to image a deep mantle plume under the Yellowstone Hotspot in the western United States, utilizing SKS and SKKS waves in conjunction with finite frequency tomography. Synthetic resolution tests show the technique can resolve a 235 km diameter lower mantle plume with a 1.5% Gaussian velocity perturbation even if a realistic amount of random noise is added to the data. The Yellowstone Hotspot presents a unique opportunity to image a thin plume because it is the only hotspot with a purported deep origin that has a large enough aperture and density of seismometers to accurately sample the lower mantle at the length scales required to image a plume. Previous regional tomography studies, largely based on S-wave data, have imaged a cylindrically shaped slow anomaly extending down to 900 km under the hotspot, but they could not resolve it any deeper (Schmandt et al., 2010; Obrebski et al., 2010). To test whether the anomaly extends deeper, we measured and inverted travel times of over 40,000 SKS and SKKS waves in two frequency bands recorded at 2400+ stations deployed during 2006-2012. Our preliminary model shows narrow slow-velocity anomalies in the lower mantle with no fast anomalies. The slow anomalies are offset from the Yellowstone hotspot and may be diapirs rising from the base of the mantle.

  4. Hyperspectral anomaly detection using Sony PlayStation 3

    NASA Astrophysics Data System (ADS)

    Rosario, Dalton; Romano, João; Sepulveda, Rene

    2009-05-01

We present a proof-of-principle demonstration using Sony's IBM Cell processor-based PlayStation 3 (PS3) to run, in near real-time, a hyperspectral anomaly detection algorithm (HADA) on real hyperspectral (HS) long-wave infrared imagery. The PS3 console proved to be ideal for doing precisely the kind of heavy computational lifting HS-based algorithms require, and the fact that it is a relatively open platform makes programming scientific applications feasible. The PS3 HADA is a unique parallel random-sampling-based anomaly detection approach that does not require prior spectra of the clutter background. The PS3 HADA is designed to handle known underlying difficulties (e.g., target shape/scale uncertainties) often ignored in the development of autonomous anomaly detection algorithms. The effort is part of an ongoing cooperative contribution between the Army Research Laboratory and the Army's Armament, Research, Development and Engineering Center, which aims at demonstrating performance of innovative algorithmic approaches for applications requiring autonomous anomaly detection using passive sensors.

  5. A novel approach for detection of anomalies using measurement data of the Ironton-Russell bridge

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Norouzi, Mehdi; Hunt, Victor; Helmicki, Arthur

    2015-04-01

Data models have been increasingly used in recent years for documenting the normal behavior of structures and hence detecting and classifying anomalies. Many machine learning algorithms have been proposed by various researchers to model operational and functional changes in structures; however, few studies have been applied to actual measurement data, owing to limited access to long-term measurements of structures and to their damaged states. By monitoring the structure during construction and reviewing the effect of construction events on the measurement data, this study introduces a new approach to detect, and eventually classify, anomalies during and after construction. First, the implementation of the sensor network, which developed as the bridge was being built, and its current status are detailed. Second, the proposed anomaly detection algorithm is applied to the collected data and, finally, detected anomalies are validated against the archived construction events.

  6. Accumulating pyramid spatial-spectral collaborative coding divergence for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Zou, Huanxin; Zhou, Shilin

    2016-03-01

Detection of anomalous targets of various sizes in hyperspectral data has received a lot of attention in reconnaissance and surveillance applications. Many anomaly detectors have been proposed in the literature. However, current methods are susceptible to anomalies in the processing window range and often make critical assumptions about the distribution of the background data. Motivated by the fact that anomalous pixels are often distinctive from their local background, in this letter we propose a novel hyperspectral anomaly detection framework for real-time remote sensing applications. The proposed framework consists of four major components: sparse feature learning, pyramid grid window selection, joint spatial-spectral collaborative coding and multi-level divergence fusion. It exploits the collaborative representation difference in the feature space to locate potential anomalies and is totally unsupervised, without any prior assumptions. Experimental results on airborne hyperspectral data demonstrate that the proposed method adapts to anomalies over a large range of sizes and is well suited for parallel processing.

  7. nu-Anomica: A Fast Support Vector Based Novelty Detection Technique

    NASA Technical Reports Server (NTRS)

    Das, Santanu; Bhaduri, Kanishka; Oza, Nikunj C.; Srivastava, Ashok N.

    2009-01-01

    In this paper we propose nu-Anomica, a novel anomaly detection technique that can be trained on huge data sets with much reduced running time compared to the benchmark one-class Support Vector Machines algorithm. In nu-Anomica, the idea is to train the machine such that it provides a close approximation to the exact decision plane using fewer training points, without losing much of the generalization performance of the classical approach. We have tested the proposed algorithm on a variety of continuous data sets under different conditions. We show that under all test conditions the developed procedure closely preserves the accuracy of standard one-class Support Vector Machines while reducing both training and test time by a factor of 5-20.
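
    The benchmark the authors compare against is the standard one-class SVM. As a minimal, hedged illustration of the underlying idea of scoring how far a query lies from the nominal training data, here is a toy Gaussian-kernel novelty scorer; this is not nu-Anomica and not an SVM, and the function name, data, and threshold are illustrative assumptions:

```python
import numpy as np

def kernel_novelty_scores(train, test, bandwidth=1.0):
    """Score test points by mean Gaussian-kernel similarity to training data.

    Low scores mean a point lies far from all nominal training samples,
    i.e. it is a novelty candidate (a crude stand-in for a one-class
    decision function, not the nu-Anomica algorithm)."""
    # Pairwise squared distances between test and train points.
    d2 = ((test[:, None, :] - train[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * bandwidth ** 2)).mean(axis=1)

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(200, 2))      # training: nominal data only
queries = np.array([[0.0, 0.0], [8.0, 8.0]])      # in-distribution vs far outlier
scores = kernel_novelty_scores(normal, queries)
is_anomaly = scores < 0.05                        # illustrative threshold
```

Training on nominal data only, then thresholding a similarity (or distance) score, is the common shape of all one-class methods discussed in this record.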

  8. Detection of admittivity anomaly on high-contrast heterogeneous backgrounds using frequency difference EIT.

    PubMed

    Jang, J; Seo, J K

    2015-06-01

    This paper describes a multiple background subtraction method in frequency difference electrical impedance tomography (fdEIT) to detect an admittivity anomaly within a high-contrast background conductivity distribution. The proposed method extends the conventional weighted frequency difference EIT method, whose use has previously been limited to detecting admittivity anomalies in a roughly homogeneous background. The proposed method can be viewed as multiple weighted difference imaging in fdEIT. Although the spatial resolution of fdEIT output images is very low due to the inherent ill-posedness, numerical simulations and phantom experiments demonstrate the method's feasibility for detecting anomalies. It has potential application in stroke detection in a head model, which is highly heterogeneous due to the skull.

  9. NTilt as an improved enhanced tilt derivative filter for edge detection of potential field anomalies

    NASA Astrophysics Data System (ADS)

    Nasuti, Yasin; Nasuti, Aziz

    2018-07-01

    We develop a new phase-based filter, called NTilt, to enhance the edges of geological sources from potential-field data; it introduces vertical derivatives of the analytical signal of different orders into the tilt derivative equation. This equalizes signals from sources buried at different depths. To evaluate the designed filter, we compared its results with those of recently applied methods, testing against both synthetic data and measured data from the Finnmark region of northern Norway. The results demonstrate that the new filter permits better definition of the edges of causative anomalies and better highlights several anomalies that are either absent from the tilt derivative and other methods or poorly defined. The proposed technique also improves the delineation of the actual edges of deep-seated anomalies compared with the tilt derivative and other methods. The NTilt filter provides more accurate and sharper edges, makes nearby anomalies more distinguishable, and avoids introducing false edges, reducing ambiguity in potential-field interpretation. The filter thus appears promising for better qualitative interpretation of gravity and magnetic data in comparison with more commonly used filters.

  10. Multi-Level Modeling of Complex Socio-Technical Systems - Phase 1

    DTIC Science & Technology

    2013-06-06

    is to detect anomalous organizational outcomes, diagnose the causes of these anomalies, and decide upon appropriate compensation schemes. All of...monitor process outcomes. The purpose of this monitoring is to detect anomalous process outcomes, diagnose the causes of these anomalies, and decide upon...monitor work outcomes in terms of performance. The purpose of this monitoring is to detect anomalous work outcomes, diagnose the causes of these anomalies

  11. Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ondrej Linda; Todd Vollmer; Milos Manic

    The planned large scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks. The implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures multiple communication streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require specification of a sensitivity threshold, which inevitably leads to a tradeoff between false-positive and false-negative rates. In order to alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams, allowing accurate normal network behavior models to be learned. Furthermore, the developed system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.

  12. Residual Error Based Anomaly Detection Using Auto-Encoder in SMD Machine Sound.

    PubMed

    Oh, Dong Yul; Yun, Il Dong

    2018-04-24

    Detecting an anomaly or an abnormal situation from given noise is highly useful in environments where a machine must be constantly verified and monitored. As deep learning algorithms have matured, recent studies have focused on this problem. However, there are too many variables to define anomalies, and human annotation of a large collection of abnormal data labeled at the class level is very labor-intensive. In this paper, we propose to detect abnormal operation sounds or outliers in a very complex machine while reducing the data-driven annotation cost. The architecture of the proposed model is based on an auto-encoder, and it uses the residual error, which reflects reconstruction quality, to identify anomalies. We assess our model using Surface-Mounted Device (SMD) machine sound, which is very complex, as experimental data, and state-of-the-art performance is achieved for anomaly detection.
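
    The paper's model is a deep auto-encoder; as a hedged sketch of the residual-error principle it relies on, the example below substitutes a linear reconstruction model (PCA) for the auto-encoder and scores test points by the norm of the reconstruction residual. All names and data here are illustrative, not from the paper:

```python
import numpy as np

def pca_residual_scores(train, test, n_components=1):
    """Reconstruction-residual anomaly score using PCA as a linear
    stand-in for an auto-encoder (illustrative only)."""
    mean = train.mean(axis=0)
    # Principal axes of the nominal training data.
    _, _, vt = np.linalg.svd(train - mean, full_matrices=False)
    components = vt[:n_components]
    # Encode then decode; the residual is what the model cannot explain.
    recon = (test - mean) @ components.T @ components + mean
    return np.linalg.norm(test - recon, axis=1)

rng = np.random.default_rng(1)
t = rng.uniform(0, 1, size=(300, 1))
train = np.hstack([t, 2 * t]) + rng.normal(0, 0.01, size=(300, 2))  # near a line
test = np.array([[0.5, 1.0],      # on the nominal subspace -> small residual
                 [0.5, -1.0]])    # off the subspace -> large residual
scores = pca_residual_scores(train, test)
```

A deep auto-encoder replaces the linear projection with a learned nonlinear encoder/decoder, but the decision rule (threshold the residual) is the same.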

  13. First trimester PAPP-A in the detection of non-Down syndrome aneuploidy.

    PubMed

    Ochshorn, Y; Kupferminc, M J; Wolman, I; Orr-Urtreger, A; Jaffa, A J; Yaron, Y

    2001-07-01

    Combined first trimester screening using pregnancy associated plasma protein-A (PAPP-A), free beta-human chorionic gonadotrophin, and nuchal translucency (NT), is currently accepted as probably the best combination for the detection of Down syndrome (DS). Current first trimester algorithms provide computed risks only for DS. However, low PAPP-A is also associated with other chromosome anomalies such as trisomy 13, 18, and sex chromosome aneuploidy. Thus, using currently available algorithms, some chromosome anomalies may not be detected. The purpose of the present study was to establish a low-end cut-off value for PAPP-A that would increase the detection rates for non-DS chromosome anomalies. The study included 1408 patients who underwent combined first trimester screening. To determine a low-end cut-off value for PAPP-A, a Receiver-Operator Characteristic (ROC) curve analysis was performed. In the entire study group there were 18 cases of chromosome anomalies (trisomy 21, 13, 18, sex chromosome anomalies), 14 of which were among screen-positive patients, a detection rate of 77.7% for all chromosome anomalies (95% CI: 55.7-99.7%). ROC curve analysis detected a statistically significant cut-off for PAPP-A at 0.25 MoM. If the definition of screen-positive were to also include patients with PAPP-A<0.25 MoM, the detection rate would increase to 88.8% for all chromosome anomalies (95% CI: 71.6-106%). This low cut-off value may be used until specific algorithms are implemented for non-Down syndrome aneuploidy. Copyright 2001 John Wiley & Sons, Ltd.
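
    The study's cut-off selection by ROC analysis can be sketched generically: pick the threshold maximizing Youden's J (sensitivity + specificity - 1) on marker values where low values indicate cases. The data, function name, and resulting cut-off below are synthetic illustrations, not the study's PAPP-A data or its 0.25 MoM result:

```python
import numpy as np

def best_cutoff(values, is_case):
    """Threshold maximizing Youden's J = sensitivity + specificity - 1,
    with low values indicating cases (as with low PAPP-A MoM)."""
    best_j, best_c = -1.0, None
    for c in np.unique(values):
        pred = values <= c                    # screen-positive if below cutoff
        tp = np.sum(pred & is_case)
        fn = np.sum(~pred & is_case)
        fp = np.sum(pred & ~is_case)
        tn = np.sum(~pred & ~is_case)
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0
        if j > best_j:
            best_j, best_c = j, c
    return best_c, best_j

# Synthetic marker values (MoM-like): cases cluster low, controls near 1.0.
rng = np.random.default_rng(2)
controls = rng.lognormal(mean=0.0, sigma=0.3, size=500)
cases = rng.lognormal(mean=-1.5, sigma=0.3, size=20)
values = np.concatenate([controls, cases])
labels = np.concatenate([np.zeros(500, bool), np.ones(20, bool)])
cutoff, j = best_cutoff(values, labels)
```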

  14. Evaluation schemes for video and image anomaly detection algorithms

    NASA Astrophysics Data System (ADS)

    Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael

    2016-05-01

    Video anomaly detection is a critical research area in computer vision. It is a natural first step before applying object recognition algorithms. Many algorithms that detect anomalies (outliers) in videos and images have been introduced in recent years. However, these algorithms behave and perform differently based on differences in the domains and tasks to which they are subjected. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability in a particular domain/task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. Many evaluation metrics have been used in the literature, such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these different metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is very critical, since the choice can introduce positive or negative bias in the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies. We analyze the biases these choices introduce by measuring the performance of an existing anomaly detection algorithm under each.
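
    One such evaluation scheme, deciding when a proposed detection counts as a true positive, can be sketched as greedy one-to-one matching within a spatial tolerance. The tolerance, data, and function name below are illustrative assumptions; other schemes (e.g. region overlap) introduce different biases, which is exactly the point the abstract makes:

```python
import numpy as np

def match_detections(detected, truth, tol=5.0):
    """Greedy one-to-one matching: a detection is a true positive if it lies
    within `tol` pixels of a not-yet-matched ground-truth anomaly.
    The choice of `tol` biases the reported precision/recall."""
    truth_used = np.zeros(len(truth), dtype=bool)
    tp = 0
    for d in detected:
        dists = np.linalg.norm(truth - d, axis=1)
        dists[truth_used] = np.inf            # each truth matches at most once
        j = int(np.argmin(dists))
        if dists[j] <= tol:
            truth_used[j] = True
            tp += 1
    fp = len(detected) - tp
    fn = len(truth) - tp
    precision = tp / max(len(detected), 1)
    recall = tp / max(len(truth), 1)
    return precision, recall, fp, fn

truth = np.array([[10.0, 10.0], [50.0, 50.0]])
detected = np.array([[12.0, 11.0], [80.0, 20.0]])   # one hit, one false alarm
precision, recall, fp, fn = match_detections(detected, truth)
```

Sweeping a detector's score threshold and re-running this matching at each point is what produces the precision-recall and ROC curves discussed above.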

  15. A new comparison of hyperspectral anomaly detection algorithms for real-time applications

    NASA Astrophysics Data System (ADS)

    Díaz, María.; López, Sebastián.; Sarmiento, Roberto

    2016-10-01

    Due to the high spectral resolution that remotely sensed hyperspectral images provide, there has been an increasing interest in anomaly detection. The aim of anomaly detection is to single out pixels whose spectral signature differs significantly from the background spectra. Basically, anomaly detectors mark pixels with a certain score, considering as anomalies those whose scores exceed a threshold. Receiver Operating Characteristic (ROC) curves have been widely used as an assessment measure to compare the performance of different algorithms. ROC curves are graphical plots which illustrate the trade-off between false positive and true positive rates. However, they are of limited use for deeper comparisons because they discard relevant factors required in real-time applications, such as run times, misclassification costs, and the ability to mark anomalies with high scores. This last factor is fundamental in anomaly detection in order to distinguish anomalies easily from the background without any post-processing. An extensive set of simulations has been run with different anomaly detection algorithms, comparing their performance and efficiency using several additional metrics to complement ROC curve analysis. Results support our proposal and demonstrate that ROC curves alone do not provide a good visualization of detection performance. Moreover, a figure of merit is proposed in this paper which condenses into a single global metric all the measures yielded by the proposed additional metrics. This figure, named Detection Efficiency (DE), takes into account several crucial aspects of performance assessment that ROC curves do not consider. Results demonstrate that the algorithms with the best detection performance according to ROC curves do not have the highest DE values. Consequently, the recommendation to use extra measures to properly evaluate performance is supported and justified by the conclusions drawn from the simulations.

  16. A hyperspectral imagery anomaly detection algorithm based on local three-dimensional orthogonal subspace projection

    NASA Astrophysics Data System (ADS)

    Zhang, Xing; Wen, Gongjian

    2015-10-01

    Anomaly detection (AD) is increasingly important in hyperspectral imagery analysis, with many practical applications. The local orthogonal subspace projection (LOSP) detector is a popular anomaly detector which exploits local endmembers/eigenvectors around the pixel under test (PUT) to construct a background subspace. However, this subspace only takes advantage of the spectral information, while the spatial correlation of the background clutter is neglected, which makes the anomaly detection result sensitive to the accuracy of the estimated subspace. In this paper, a local three-dimensional orthogonal subspace projection (3D-LOSP) algorithm is proposed. First, using both spectral and spatial information jointly, three directional background subspaces are created along the image height direction, the image width direction and the spectral direction, respectively. Then, the three corresponding orthogonal subspaces are calculated. After that, each vector of the local cube along the three directions is projected onto the corresponding orthogonal subspace. Finally, a composite score is formed from the three directional operators. In 3D-LOSP, anomalies are redefined as targets that are not only spectrally different from the background but also spatially distinct. Thanks to the addition of spatial information, the robustness of the anomaly detection result is greatly improved by the proposed 3D-LOSP algorithm. Notably, the proposed algorithm is an expansion of LOSP, and the same idea can be applied to many other spectral-based anomaly detection methods. Experiments with real hyperspectral images have demonstrated the stability of the detection result.

  17. An Optimized Method to Detect BDS Satellites' Orbit Maneuvering and Anomalies in Real-Time.

    PubMed

    Huang, Guanwen; Qin, Zhiwei; Zhang, Qin; Wang, Le; Yan, Xingyuan; Wang, Xiaolei

    2018-02-28

    The orbital maneuvers of Global Navigation Satellite System (GNSS) constellations decrease the performance and accuracy of positioning, navigation, and timing (PNT). Because satellites in the Chinese BeiDou Navigation Satellite System (BDS) are in Geostationary Orbit (GEO) and Inclined Geosynchronous Orbit (IGSO), maneuvers occur more frequently. Moreover, the precise start moment of a BDS satellite's orbit maneuvering cannot be obtained by common users. This paper presents an improved real-time detection method for BDS satellites' orbit maneuvering and anomalies with higher timeliness and higher accuracy. The main contributions are as follows: (1) instead of the previous two-step method, a new one-step method with higher accuracy is proposed to determine the start moment and the pseudo random noise code (PRN) of the maneuvering satellite; (2) BDS Medium Earth Orbit (MEO) orbital maneuvers are detected for the first time, using the proposed station selection strategy; and (3) classified non-maneuvering anomalies are detected by a new median robust method using a weak anomaly detection factor and a strong anomaly detection factor. Data from the Multi-GNSS Experiment (MGEX) in 2017 were used for experimental analysis. The results show that the start moment of orbital maneuvers and the period of non-maneuver anomalies can be determined more accurately in real time. When orbital maneuvers and anomalies occurred, the proposed method improved data utilization by 91 and 95 min, respectively, in 2017.
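
    The abstract does not fully specify the median robust method, but a standard median/MAD (median absolute deviation) detector with two factors conveys the weak/strong idea. The thresholds, synthetic data, and names below are illustrative assumptions, not the authors' calibrated values:

```python
import numpy as np

def mad_flags(series, weak=3.0, strong=6.0):
    """Robust z-scores from the median absolute deviation (MAD); the weak
    and strong detection factors here are illustrative choices."""
    med = np.median(series)
    mad = np.median(np.abs(series - med))
    z = 0.6745 * np.abs(series - med) / mad   # ~|standard normal| on nominal data
    return z >= weak, z >= strong

rng = np.random.default_rng(3)
resid = rng.normal(0.0, 0.02, size=100)       # synthetic nominal orbit residuals
resid[40] = 0.09                              # weak anomaly
resid[70] = 0.60                              # strong anomaly (e.g. a maneuver)
weak_mask, strong_mask = mad_flags(resid)
```

Using the median and MAD instead of mean and standard deviation keeps the baseline itself from being dragged by the very anomalies being sought.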

  18. An Optimized Method to Detect BDS Satellites’ Orbit Maneuvering and Anomalies in Real-Time

    PubMed Central

    Huang, Guanwen; Qin, Zhiwei; Zhang, Qin; Wang, Le; Yan, Xingyuan; Wang, Xiaolei

    2018-01-01

    The orbital maneuvers of Global Navigation Satellite System (GNSS) constellations decrease the performance and accuracy of positioning, navigation, and timing (PNT). Because satellites in the Chinese BeiDou Navigation Satellite System (BDS) are in Geostationary Orbit (GEO) and Inclined Geosynchronous Orbit (IGSO), maneuvers occur more frequently. Moreover, the precise start moment of a BDS satellite’s orbit maneuvering cannot be obtained by common users. This paper presents an improved real-time detection method for BDS satellites’ orbit maneuvering and anomalies with higher timeliness and higher accuracy. The main contributions are as follows: (1) instead of the previous two-step method, a new one-step method with higher accuracy is proposed to determine the start moment and the pseudo random noise code (PRN) of the maneuvering satellite; (2) BDS Medium Earth Orbit (MEO) orbital maneuvers are detected for the first time, using the proposed station selection strategy; and (3) classified non-maneuvering anomalies are detected by a new median robust method using a weak anomaly detection factor and a strong anomaly detection factor. Data from the Multi-GNSS Experiment (MGEX) in 2017 were used for experimental analysis. The results show that the start moment of orbital maneuvers and the period of non-maneuver anomalies can be determined more accurately in real time. When orbital maneuvers and anomalies occurred, the proposed method improved data utilization by 91 and 95 min, respectively, in 2017. PMID:29495638

  19. Active Learning with Rationales for Identifying Operationally Significant Anomalies in Aviation

    NASA Technical Reports Server (NTRS)

    Sharma, Manali; Das, Kamalika; Bilgic, Mustafa; Matthews, Bryan; Nielsen, David Lynn; Oza, Nikunj C.

    2016-01-01

    A major focus of the commercial aviation community is discovery of unknown safety events in flight operations data. Data-driven unsupervised anomaly detection methods are better at capturing unknown safety events compared to rule-based methods which only look for known violations. However, not all statistical anomalies that are discovered by these unsupervised anomaly detection methods are operationally significant (e.g., represent a safety concern). Subject Matter Experts (SMEs) have to spend significant time reviewing these statistical anomalies individually to identify a few operationally significant ones. In this paper we propose an active learning algorithm that incorporates SME feedback in the form of rationales to build a classifier that can distinguish between uninteresting and operationally significant anomalies. Experimental evaluation on real aviation data shows that our approach improves detection of operationally significant events by as much as 75% compared to the state-of-the-art. The learnt classifier also generalizes well to additional validation data sets.

  20. Improved determination of vector lithospheric magnetic anomalies from MAGSAT data

    NASA Technical Reports Server (NTRS)

    Ravat, Dhananjay

    1993-01-01

    Scientific contributions made in developing new methods to isolate and map vector magnetic anomalies from measurements made by Magsat are described. In addition to the objective of the proposal, the isolation and mapping of equatorial vector lithospheric Magsat anomalies, the isolation of polar ionospheric fields during the period was also studied. Significant progress was also made in isolating the polar delta(Z) component and scalar anomalies, as well as in integrating and synthesizing various techniques for removing equatorial and polar ionospheric effects. The significant contributions of this research are: (1) development of empirical/analytical techniques for modeling ionospheric fields in Magsat data and removing them from uncorrected anomalies to obtain better estimates of lithospheric anomalies (this task was accomplished for equatorial delta(X), delta(Z), and delta(B) component and polar delta(Z) and delta(B) component measurements); (2) integration of important processing techniques developed during the last decade with the newly developed ionospheric field modeling into an optimum processing scheme; and (3) implementation of the above processing scheme to map the most robust magnetic anomalies of the lithosphere (components as well as scalar).

  1. High-Resolution Millimeter Wave Detection of Vertical Cracks in the Space Shuttle External Tank (ET) Spray-on-Foam Insulation (SOFI)

    NASA Technical Reports Server (NTRS)

    Kharkovsky, S.; Zoughi, R.; Hepburn, Frank L.

    2006-01-01

    Space Shuttle Columbia's catastrophic failure has been attributed to a piece of spray-on-foam insulation (SOFI) that was dislodged from the external tank (ET) and struck the leading edge of the left wing. A piece of SOFI was also dislodged during Space Shuttle Discovery's flight in 2005, and recently a crack was detected in its ET foam prior to its successful launch. Millimeter wave nondestructive testing methods have been considered as potentially effective inspection tools for evaluating the integrity of the SOFI. Recently, the potential of these methods for detecting vertical cracks in SOFI was explored using a focused millimeter wave reflectometer at 150 GHz. The results showed the capability of these methods for detecting tight vertical cracks (also as a function of crack opening dimension) in SOFI panels, both exposed and covered by a piece of SOFI ramp simulating a more realistic and challenging situation. Some crack-like anomalies were also detected in a blind SOFI panel. This paper presents the background for these techniques, representative images of the vertical crack in the SOFI panel and the crack-like anomalies in the blind panel, and a discussion of the practical attributes of these inspection methods.

  2. Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering

    NASA Astrophysics Data System (ADS)

    Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech

    2015-03-01

    We present a novel method for automated anomaly detection on auto fluorescent data provided by the National Institute of Health (NIH). This is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, track the progression over time, and test the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures such as wavelet analysis or dimensionality reduction algorithms followed by a classification algorithm, e.g., Support Vector Machine. The method that we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality preserving properties. By applying LE, we are able to represent the data in the form of eigenimages, some of which accentuate the visibility of anomalies. We pick significant eigenimages and proceed with the VMF algorithm that classifies anomalies across all of these eigenimages simultaneously. To evaluate our performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection on single images and a combination of PCA and VMF. LE combined with VMF algorithm performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of using a nonlinear approach to represent the data and the effectiveness of VMF, which operates on the images as a data cube rather than individual images.
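
    The matched-filter building block that VMF vectorizes across eigenimages can be sketched for a single image as normalized cross-correlation of an anomaly template over the image. This is a generic sketch; the template, image, and function name are illustrative assumptions, not the NIH data or the paper's filter:

```python
import numpy as np

def matched_filter(image, template):
    """Normalized cross-correlation of a small template over an image
    (a single-image matched filter; VMF applies this jointly across
    a stack of eigenimages)."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.linalg.norm(t)
    h, w = image.shape
    out = np.zeros((h - th + 1, w - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + th, j:j + tw]
            p = patch - patch.mean()
            pn = np.linalg.norm(p)
            out[i, j] = (p * t).sum() / (pn * tn) if pn > 0 else 0.0
    return out

image = np.zeros((8, 8))
image[3:5, 4:6] = 1.0                      # bright 2x2 "lesion"
template = np.zeros((4, 4))
template[1:3, 1:3] = 1.0                   # the lesion centered in a 4x4 window
response = matched_filter(image, template)
peak = np.unravel_index(np.argmax(response), response.shape)
```

The response peaks where the image locally matches the template; thresholding the peak score gives the detection decision.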

  3. Application of the Augmented Operator Function Model for Developing Cognitive Metrics in Persistent Surveillance

    DTIC Science & Technology

    2013-09-26

    vehicle-lengths between frames. The low specificity of object detectors in WAMI means all vehicle detections are treated equally. Motion clutter...timing of the anomaly. If an anomaly was detected, recent activity would have a priority over older activity. This is due to the reasoning that if the...this could be a potential anomaly detected. Other baseline activities include normal work hours, religious observance times and interactions between

  4. Critical Infrastructure Protection and Resilience Literature Survey: Modeling and Simulation

    DTIC Science & Technology

    2014-11-01

    2013 Page 34 of 63 Below the yellow set is a purple cluster bringing together detection, anomaly, intrusion, sensors, monitoring and alerting (early...hazards and threats to security56 Water ADWICE, PSS®SINCAL ADWICE for real-time anomaly detection in water management systems57 One tool that...Systems. Cybernetics and Information Technologies. 2008;8(4):57-68. 57. Raciti M, Cucurull J, Nadjm-Tehrani S. Anomaly detection in water management

  5. Symbolic Time-Series Analysis for Anomaly Detection in Mechanical Systems

    DTIC Science & Technology

    2006-08-01

    Amol Khatkhate, Asok Ray, Fellow, IEEE, Eric Keller, Shalabh Gupta, and Shin C. Chin Abstract—This paper examines the efficacy of a novel method for...recognition. KHATKHATE et al.: SYMBOLIC TIME-SERIES ANALYSIS FOR ANOMALY DETECTION 447 Asok Ray (F’02) received graduate degrees in electrical...anomaly detection has been proposed by Ray [6], where the underlying information on the dynamical behavior of complex systems is derived based on

  6. Autonomous detection of crowd anomalies in multiple-camera surveillance feeds

    NASA Astrophysics Data System (ADS)

    Nordlöf, Jonas; Andersson, Maria

    2016-10-01

    A novel approach for autonomous detection of anomalies in crowded environments is presented in this paper. The proposed model uses a Gaussian mixture probability hypothesis density (GM-PHD) filter as a feature extractor in conjunction with different Gaussian mixture hidden Markov models (GM-HMMs). Results, based on both simulated and recorded data, indicate that this method can track and detect anomalies on-line in individual crowds through multiple camera feeds in a crowded environment.

  7. Deep learning on temporal-spectral data for anomaly detection

    NASA Astrophysics Data System (ADS)

    Ma, King; Leung, Henry; Jalilian, Ehsan; Huang, Daniel

    2017-05-01

    Detecting anomalies is important for continuous monitoring of sensor systems. One significant challenge is to use sensor data and autonomously detect changes that cause different conditions to occur. Using deep learning methods, we are able to monitor and detect changes as a result of some disturbance in the system. We utilize deep neural networks for sequence analysis of time series. We use a multi-step method for anomaly detection. We train the network to learn spectral and temporal features from the acoustic time series. We test our method using fiber-optic acoustic data from a pipeline.

  8. Firefly Algorithm in detection of TEC seismo-ionospheric anomalies

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, Mehdi

    2015-07-01

    Anomaly detection in time series of different earthquake precursors is an essential step toward an early warning system with acceptable uncertainty. Since these time series are often nonlinear, complex, and massive, the predictor should be able to detect discord patterns in large data volumes in a short time. This study presents the Firefly Algorithm (FA) as a simple and robust predictor for detecting TEC (Total Electron Content) seismo-ionospheric anomalies around the times of several powerful earthquakes, including Chile (27 February 2010), Varzeghan (11 August 2012) and Saravan (16 April 2013). Outstanding anomalies were observed 7 and 5 days before the Chile and Varzeghan earthquakes, respectively, and 3 and 8 days prior to the Saravan earthquake.
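
    The abstract does not give the detection formulation, but the core firefly update the method builds on is standard: each firefly moves toward every brighter one with attractiveness decaying in distance, plus a shrinking random walk. The sketch below applies it to a toy minimization; all parameters and names are illustrative assumptions, not the study's configuration:

```python
import numpy as np

def firefly_minimize(f, bounds, n=15, iters=60, beta0=1.0, gamma=0.05,
                     alpha=0.2, seed=0):
    """Canonical firefly update: firefly i moves toward every brighter
    (lower-cost) firefly j, with attractiveness beta0 * exp(-gamma * r^2)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n, 2))
    cost = np.array([f(p) for p in x])
    init_best = float(cost.min())
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if cost[j] < cost[i]:                  # firefly j is brighter
                    r2 = float(np.sum((x[i] - x[j]) ** 2))
                    step = beta0 * np.exp(-gamma * r2) * (x[j] - x[i])
                    x[i] = np.clip(x[i] + step + alpha * rng.uniform(-0.5, 0.5, 2),
                                   lo, hi)
                    cost[i] = f(x[i])
        alpha *= 0.97                                  # shrink the random walk
    best = int(np.argmin(cost))
    return x[best], float(cost[best]), init_best

sphere = lambda p: float(np.sum(p ** 2))               # toy objective, minimum at 0
best_x, best_cost, init_cost = firefly_minimize(sphere, bounds=(-5.0, 5.0))
```

In a discord-detection setting, the objective would instead score candidate subsequences of the TEC time series by their dissimilarity from the rest of the data.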

  9. INDUCTIVE SYSTEM HEALTH MONITORING WITH STATISTICAL METRICS

    NASA Technical Reports Server (NTRS)

    Iverson, David L.

    2005-01-01

    Model-based reasoning is a powerful method for performing system monitoring and diagnosis. Building models for model-based reasoning is often a difficult and time-consuming process. The Inductive Monitoring System (IMS) software was developed to provide a technique to automatically produce health monitoring knowledge bases for systems that are either difficult to model (simulate) with a computer or which require computer models that are too complex to use for real-time monitoring. IMS processes nominal data sets collected either directly from the system or from simulations to build a knowledge base that can be used to detect anomalous behavior in the system. Machine learning and data mining techniques are used to characterize typical system behavior by extracting general classes of nominal data from archived data sets. In particular, a clustering algorithm forms groups of nominal values for sets of related parameters. This establishes constraints on those parameter values that should hold during nominal operation. During monitoring, IMS provides a statistically weighted measure of the deviation of current system behavior from the established normal baseline. If the deviation increases beyond the expected level, an anomaly is suspected, prompting further investigation by an operator or automated system. IMS has shown potential to be an effective, low-cost technique to produce system monitoring capability for a variety of applications. We describe the training and system health monitoring techniques of IMS. We also present the application of IMS to a data set from the Space Shuttle Columbia STS-107 flight. IMS was able to detect an anomaly in the launch telemetry shortly after a foam impact damaged Columbia's thermal protection system.
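
    The cluster-then-monitor loop described above can be sketched as follows. This is a minimal stand-in, not the IMS software; the cluster count, synthetic "telemetry," and names are illustrative assumptions:

```python
import numpy as np

def kmeans(data, init, iters=50):
    """Tiny k-means: forms groups of nominal parameter values, mirroring the
    clustering step that builds the monitoring knowledge base."""
    centers = init.copy()
    for _ in range(iters):
        d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for c in range(len(centers)):
            if np.any(labels == c):
                centers[c] = data[labels == c].mean(axis=0)
    return centers

def deviation(x, centers):
    """Monitoring score: distance from the current sample to the nearest
    nominal cluster; large values suggest an anomaly."""
    return float(np.min(np.linalg.norm(centers - x, axis=1)))

rng = np.random.default_rng(4)
# Synthetic nominal telemetry with three operating modes.
nominal = np.vstack([rng.normal(m, 0.1, size=(100, 2))
                     for m in ([0.0, 0.0], [1.0, 1.0], [0.0, 1.0])])
centers = kmeans(nominal, init=nominal[[0, 100, 200]])
ok = deviation(np.array([1.0, 1.05]), centers)    # near a known mode
bad = deviation(np.array([3.0, -2.0]), centers)   # far from all modes
```

IMS additionally weights the deviation statistically per parameter; here a plain Euclidean distance stands in for that weighted measure.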

  10. Latent Space Tracking from Heterogeneous Data with an Application for Anomaly Detection

    DTIC Science & Technology

    2015-11-01

    specific, if the anomaly behaves as a sudden outlier after which the data stream goes back to normal state, then the anomalous data point should be...introduced three types of anomalies, all of them sudden outliers. 438 J. Huang and X. Ning Table 2. Synthetic dataset: AUC and parameters method...Latent Space Tracking from Heterogeneous Data with an Application for Anomaly Detection Jiaji Huang1(B) and Xia Ning2 1 Department of Electrical

  11. Anomaly detection of microstructural defects in continuous fiber reinforced composites

    NASA Astrophysics Data System (ADS)

    Bricker, Stephen; Simmons, J. P.; Przybyla, Craig; Hardie, Russell

    2015-03-01

    Ceramic matrix composites (CMC) with continuous fiber reinforcements have the potential to enable the next generation of high-speed hypersonic vehicles and/or significant improvements in gas turbine engine performance, due to their toughness when subjected to high mechanical loads at extreme temperatures (2200F+). Reinforced fiber composites (RFC) provide increased fracture toughness, crack growth resistance, and strength, though little is known about how stochastic variation and imperfections in the material affect material properties. In this work, tools are developed for quantifying anomalies within the microstructure at several scales. The detection and characterization of anomalous microstructure is a critical step in linking production techniques to properties, as well as in accurate material simulation and property prediction for the integrated computational materials engineering (ICME) of RFC-based components. It is desired to find statistical outliers for any number of material characteristics, such as fibers, fiber coatings, and pores. Here, fiber orientation, or `velocity', and the `velocity' gradient are developed and examined for anomalous behavior. Anomalous behavior in the CMC is categorized by multivariate Gaussian mixture modeling: a Gaussian mixture is employed to estimate the probability density function (PDF) of the features in question, and anomalies are classified by their likelihood of belonging to the statistical normal behavior for that feature.
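
    The Gaussian-mixture likelihood test can be sketched in its one-component special case: fit a single multivariate Gaussian to nominal feature vectors and flag low-likelihood samples. The data and threshold below are illustrative assumptions, not the paper's fiber features:

```python
import numpy as np

def gaussian_loglik(train, test):
    """Log-likelihood under a single multivariate Gaussian fitted to nominal
    feature vectors (a one-component special case of the mixture model)."""
    mu = train.mean(axis=0)
    cov = np.cov(train, rowvar=False)
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    d = train.shape[1]
    diff = test - mu
    # Mahalanobis distance of each row, via einsum.
    maha = np.einsum('ij,jk,ik->i', diff, inv, diff)
    return -0.5 * (maha + logdet + d * np.log(2 * np.pi))

rng = np.random.default_rng(5)
# Synthetic stand-in features: (orientation, gradient magnitude).
features = rng.normal([0.0, 1.0], [0.1, 0.2], size=(500, 2))
ll = gaussian_loglik(features, np.array([[0.0, 1.0],     # typical sample
                                         [0.8, 3.0]]))   # statistical outlier
threshold = np.quantile(gaussian_loglik(features, features), 0.01)
flags = ll < threshold
```

A full mixture replaces the single Gaussian with a weighted sum of components (fit by EM), but the decision rule (threshold the likelihood) is the same.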

  12. Summary of Progress on SIG Ft. Ord ESTCP DemVal

    DTIC Science & Technology

    2007-04-01

    We report on progress under an ESTCP demonstration plan dedicated to demonstrating active-learning-based UXO detection on an actual former UXO site...Ft. Ord), using EMI data. In addition to describing the details of the active-learning algorithm, we discuss techniques that were required when...terms of two dipole-moment magnitudes and two resonant frequencies. Information-theoretic active learning is then conducted on all anomalies to

  13. The Use of Signal Dimensionality for Automatic QC of Seismic Array Data

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.; Draganov, D.; Maceira, M.; Gomez, M.

    2014-12-01

    A significant problem in seismic array analysis is the inclusion of bad sensor channels in the beam-forming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. We examine the signal dimension in a similar way to the method addressing node-traffic anomalies in large computer systems. We explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for identifying bad array elements. We show preliminary results applied to arrays in Kazakhstan (Makanchi) and Argentina (Malargue).

  14. Anomaly Detection Based on Sensor Data in Petroleum Industry Applications

    PubMed Central

    Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra

    2015-01-01

    Anomaly detection is the problem of finding patterns in data that do not conform to an a priori expected behavior. This is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset, where these anomalous samples are indicated as outliers. Anomaly detection has recently attracted the attention of the research community, because of its relevance in real-world applications, like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation. However, in either case, it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one of the application contexts where these problems are present. The correct detection of such types of unusual information empowers the decision maker with the capacity to act on the system in order to correctly avoid, correct or react to the situations associated with them. In that application context, heavy extraction machines for pumping and generation operations, like turbomachines, are each intensively monitored by hundreds of sensors that send high-frequency measurements for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel, fast, high-quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal addresses the aforementioned task while coping with the lack of labeled training data. We perform a series of empirical studies comparing our approach to other methods applied to benchmark problems and a real-life application related to oil platform turbomachinery anomaly detection. PMID:25633599
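
    The distance-based notion of an outlier defined above (samples distant, in a given metric, from the rest of the dataset) can be sketched minimally as follows. This is not the paper's YASA plus one-class SVM pipeline; it is an invented illustration in which a sample is flagged when its average distance to the rest of the data far exceeds the typical average distance.

```python
def mean_distance(i, data):
    # Average absolute distance from sample i to every other sample.
    return sum(abs(data[i] - x) for j, x in enumerate(data) if j != i) / (len(data) - 1)

def distance_outliers(data, factor=2.0):
    # A sample is an outlier when its average distance to the rest of the
    # dataset exceeds `factor` times the typical average distance.
    dists = [mean_distance(i, data) for i in range(len(data))]
    typical = sum(dists) / len(dists)
    return [data[i] for i, d in enumerate(dists) if d > factor * typical]

# Invented sensor readings: one reading far from the rest.
readings = [10.1, 10.3, 9.8, 10.0, 10.2, 25.0]
print(distance_outliers(readings))  # → [25.0]
```

    A one-class SVM generalizes this idea by learning a boundary around the normal data in feature space instead of using raw pairwise distances.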

  15. Data cleaning in the energy domain

    NASA Astrophysics Data System (ADS)

    Akouemo Kengmo Kenfack, Hermine N.

    This dissertation addresses the problem of data cleaning in the energy domain, especially for natural gas and electric time series. The detection and imputation of anomalies improves the performance of forecasting models necessary to lower purchasing and storage costs for utilities and plan for peak energy loads or distribution shortages. There are various types of anomalies, each induced by diverse causes and sources depending on the field of study. The definition of false positives also depends on the context. The analysis is focused on energy data because of the availability of data and information to make a theoretical and practical contribution to the field. A probabilistic approach based on hypothesis testing is developed to decide if a data point is anomalous based on the level of significance. Furthermore, the probabilistic approach is combined with statistical regression models to handle time series data. Domain knowledge of energy data and the survey of causes and sources of anomalies in energy are incorporated into the data cleaning algorithm to improve the accuracy of the results. The data cleaning method is evaluated on simulated data sets in which anomalies were artificially inserted and on natural gas and electric data sets. In the simulation study, the performance of the method is evaluated for both detection and imputation on all identified causes of anomalies in energy data. The testing on utilities' data evaluates the percentage of improvement brought to forecasting accuracy by data cleaning. A cross-validation study of the results is also performed to demonstrate the performance of the data cleaning algorithm on smaller data sets and to calculate a confidence interval for the results. The data cleaning algorithm is able to successfully identify energy time series anomalies. Replacing those anomalies improves the accuracy of forecasting models.
The process is automatic, which is important because many data cleaning processes require human input and become impractical for very large data sets. The techniques are also applicable to other fields such as econometrics and finance, but the exogenous factors of the time series data need to be well defined.
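
    The hypothesis-testing idea can be sketched as: fit a regression model to the series, standardize the residuals, flag points whose residuals are significant at the chosen level, and impute them from the model. The snippet below is an assumed simplification (ordinary least squares against the time index, a z-style test on residuals), not the dissertation's algorithm; the series and critical value are invented.

```python
def linear_fit(ys):
    # Ordinary least squares of y against the time index t = 0..n-1.
    n = len(ys)
    mx, my = (n - 1) / 2, sum(ys) / n
    sxx = sum((t - mx) ** 2 for t in range(n))
    sxy = sum((t - mx) * (y - my) for t, y in enumerate(ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def clean_series(ys, z_crit=2.5):
    # Flag points whose standardized residual from the fitted trend is
    # significant (|z| > z_crit) and impute them with the fitted value.
    slope, intercept = linear_fit(ys)
    residuals = [y - (slope * t + intercept) for t, y in enumerate(ys)]
    sd = (sum(r * r for r in residuals) / len(residuals)) ** 0.5
    return [round(slope * t + intercept, 2) if abs(r) > z_crit * sd else y
            for t, (y, r) in enumerate(zip(ys, residuals))]

# Invented series: a linear trend with one spike at t = 4.
print(clean_series([1, 2, 3, 4, 50, 6, 7, 8, 9, 10]))
# → [1, 2, 3, 4, 9.64, 6, 7, 8, 9, 10]
```

    In practice the regression model would encode seasonality and weather drivers, and the significance level would be tuned to trade false positives against missed anomalies.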

  16. Enzyme leaching of surficial geochemical samples for detecting hydromorphic trace-element anomalies associated with precious-metal mineralized bedrock buried beneath glacial overburden in northern Minnesota

    USGS Publications Warehouse

    Clark, Robert J.; Meier, A.L.; Riddle, G.; ,

    1990-01-01

    One objective of the International Falls and Roseau, Minnesota, CUSMAP projects was to develop a means of conducting regional-scale geochemical surveys in areas where bedrock is buried beneath complex glacially derived overburden. Partial analysis of B-horizon soils offered hope for detecting subtle hydromorphic trace-element dispersion patterns. An enzyme-based partial leach selectively removes metals from oxide coatings on the surfaces of soil materials without attacking their matrix. Most trace-element concentrations in the resulting solutions are in the part-per-trillion to low part-per-billion range, necessitating determinations by inductively coupled plasma/mass spectrometry. The resulting data show greater contrasts for many trace elements than with other techniques tested. Spatially, many trace metal anomalies are locally discontinuous, but anomalous trends within larger areas are apparent. In many instances, the source for an anomaly seems to be either basal till or bedrock. Ground water flow is probably the most important mechanism for transporting metals toward the surface, although ionic diffusion, electrochemical gradients, and capillary action may play a role in anomaly dispersal. Sample sites near the Rainy Lake-Seine River fault zone, a regional shear zone, often have anomalous concentrations of a variety of metals, commonly including Zn and/or one or more metals which substitute for Zn in sphalerite (Cd, Ge, Ga, and Sn). Shifts in background concentrations of Bi, Sb, and As show a trend across the area indicating a possible regional zoning of lode-Au mineralization. Soil anomalies of Ag, Co, and Tl parallel basement structures, suggesting areas that may have potential for Cobalt/Thunder Bay-type silver veins. An area around Baudette, Minnesota, which is underlain by quartz-chlorite-carbonate-altered shear zones, is anomalous in Ag, As, Bi, Co, Mo, Te, Tl, and W.
Anomalies of Ag, As, Bi, Te, and W tend to follow the fault zones, suggesting potential for lode-Au deposits. Soil anomalies of Co, Mo, and Tl appear to follow northwest-striking structures that cross the shear zones, suggesting that Thunder Bay-type mineralization may have overprinted earlier mineralization along the shear zones.

  17. Disparity : scalable anomaly detection for clusters.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desai, N.; Bradshaw, R.; Lusk, E.

    2008-01-01

    In this paper, we describe disparity, a tool that does parallel, scalable anomaly detection for clusters. Disparity uses basic statistical methods and scalable reduction operations to perform data reduction on client nodes and uses these results to locate node anomalies. We discuss the implementation of disparity and present results of its use on a SiCortex SC5832 system.
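
    The statistical reduction idea can be sketched under the assumption that each node contributes one reduced scalar metric: a global mean and standard deviation are computed (in a real cluster this would be a parallel reduction across nodes), and nodes far from the mean are reported. Node names, values, and the threshold below are invented, not taken from the paper.

```python
def node_anomalies(metrics, k=1.5):
    # metrics maps node name -> one reduced scalar metric per node
    # (e.g. a load average). A reduction yields the global mean and
    # standard deviation; nodes more than k standard deviations from
    # the mean are reported. k is a tunable sensitivity.
    values = list(metrics.values())
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return sorted(name for name, v in metrics.items() if abs(v - mean) > k * sd)

# Invented load averages for a five-node cluster; n05 is misbehaving.
loads = {"n01": 0.9, "n02": 1.1, "n03": 1.0, "n04": 1.05, "n05": 10.0}
print(node_anomalies(loads))  # → ['n05']
```

    Note that with only a handful of nodes an extreme outlier inflates the standard deviation itself, which is why a fairly low k is used here; robust statistics (median, MAD) behave better at scale.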

  18. Integrated System Health Management (ISHM) for Test Stand and J-2X Engine: Core Implementation

    NASA Technical Reports Server (NTRS)

    Figueroa, Jorge F.; Schmalzel, John L.; Aguilar, Robert; Shwabacher, Mark; Morris, Jon

    2008-01-01

    ISHM capability enables a system to detect anomalies, determine causes and effects, predict future anomalies, and provide an integrated awareness of the health of the system to users (operators, customers, management, etc.). NASA Stennis Space Center, NASA Ames Research Center, and Pratt & Whitney Rocketdyne have implemented a core ISHM capability that encompasses the A1 Test Stand and the J-2X Engine. The implementation incorporates all aspects of ISHM: from anomaly detection (e.g. leaks), to root-cause analysis based on failure mode and effects analysis (FMEA), to a user interface for an integrated visualization of the health of the system (Test Stand and Engine). The implementation provides a low functional capability level (FCL) in that it is populated with few algorithms and approaches for anomaly detection, and root-cause trees from a limited FMEA effort. However, it is a demonstration of a credible ISHM capability, and it is inherently designed for continuous and systematic augmentation of the capability. The ISHM capability is grounded on an integrating software environment used to create an ISHM model of the system. The ISHM model follows an object-oriented approach: it includes all elements of the system (from schematics) and provides for compartmentalized storage of information associated with each element. For instance, a sensor object contains a transducer electronic data sheet (TEDS) with information that might be used by algorithms and approaches for anomaly detection, diagnostics, etc. Similarly, a component, such as a tank, contains a Component Electronic Data Sheet (CEDS). Each element also includes a Health Electronic Data Sheet (HEDS) that contains health-related information such as anomalies and health state.
Some practical aspects of the implementation include: (1) near real-time data flow from the test stand data acquisition system through the ISHM model, for near real-time detection of anomalies and diagnostics, (2) insertion of the J-2X predictive model providing predicted sensor values for comparison with measured values and use in anomaly detection and diagnostics, and (3) insertion of third-party anomaly detection algorithms into the integrated ISHM model.

  19. Robust and efficient anomaly detection using heterogeneous representations

    NASA Astrophysics Data System (ADS)

    Hu, Xing; Hu, Shiqiang; Xie, Jinhua; Zheng, Shiyou

    2015-05-01

    Various approaches have been proposed for video anomaly detection. Yet these approaches typically suffer from one or more limitations: they often characterize the pattern using its internal information, but ignore its external relationship, which is important for local anomaly detection. Moreover, the high dimensionality and the lack of robustness of the pattern representation may lead to problems including overfitting, increased computational cost and memory requirements, and a high false-alarm rate. We propose a video anomaly detection framework which relies on a heterogeneous representation to account for both the pattern's internal information and its external relationship. The internal information is characterized by slow features learned by slow feature analysis from low-level representations, and the external relationship is characterized by the spatial contextual distances. The heterogeneous representation is compact, robust, efficient, and discriminative for anomaly detection. Moreover, both the pattern's internal information and external relationship can be taken into account in the proposed framework. Extensive experiments demonstrate the robustness and efficiency of our approach by comparison with the state-of-the-art approaches on the widely used benchmark datasets.

  20. Final report for LDRD project 11-0029 : high-interest event detection in large-scale multi-modal data sets : proof of concept.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohrer, Brandon Robinson

    2011-09-01

    Events of interest to data analysts are sometimes difficult to characterize in detail. Rather, they consist of anomalies, events that are unpredicted, unusual, or otherwise incongruent. The purpose of this LDRD was to test the hypothesis that a biologically-inspired anomaly detection algorithm could be used to detect contextual, multi-modal anomalies. There currently is no other solution to this problem, but the existence of a solution would have a great national security impact. The technical focus of this research was the application of a brain-emulating cognition and control architecture (BECCA) to the problem of anomaly detection. One aspect of BECCA in particular was discovered to be critical to improved anomaly detection capabilities: its feature creator. During the course of this project the feature creator was developed and tested against multiple data types. Development direction was drawn from psychological and neurophysiological measurements. Major technical achievements include the creation of hierarchical feature sets created from both audio and imagery data.

  1. SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon Rueff; Lyle Roybal; Denis Vollmer

    2013-01-01

    There is a significant need to protect the nation’s energy infrastructures from malicious actors using cyber methods. Supervisory, Control, and Data Acquisition (SCADA) systems may be vulnerable due to the insufficient security implemented during the design and deployment of these control systems. This is particularly true in older legacy SCADA systems that are still commonly in use. The purpose of INL’s research on the SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) project was to determine if and how data compression techniques could be used to identify and protect SCADA systems from cyber attacks. Initially, the concept was centered on how to train a compression algorithm to recognize normal control system traffic versus hostile network traffic. Because large portions of the TCP/IP message traffic (called packets) are repetitive, the concept of using compression techniques to differentiate “non-normal” traffic was proposed. In this manner, malicious SCADA traffic could be identified at the packet level prior to completing its payload. Previous research has shown that SCADA network traffic has traits desirable for compression analysis. This work investigated three different approaches to identify malicious SCADA network traffic using compression techniques. The preliminary analyses and results presented herein are clearly able to differentiate normal from malicious network traffic at the packet level at a very high confidence level for the conditions tested. Additionally, the master dictionary approach used in this research appears to initially provide a meaningful way to categorize and compare packets within a communication channel.
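
    A hedged sketch of the dictionary-compression idea, using zlib's preset-dictionary support (this is an illustration, not INL's algorithm; the packet contents and ratio threshold are invented): a deflate dictionary is built from normal traffic, and a packet that the dictionary barely helps to compress is treated as suspect.

```python
import zlib

def compressed_size(packet: bytes, dictionary: bytes) -> int:
    # Deflate the packet against a preset dictionary built from normal
    # traffic; packets resembling the dictionary compress much better.
    c = zlib.compressobj(level=9, zdict=dictionary)
    return len(c.compress(packet) + c.flush())

def looks_malicious(packet: bytes, dictionary: bytes, ratio=0.9) -> bool:
    # If the dictionary barely helps (compressed size stays close to the
    # raw length), the packet does not resemble normal traffic.
    return compressed_size(packet, dictionary) > ratio * len(packet)

# Invented stand-ins: a repetitive "normal" polling message and a
# high-entropy payload unlike anything in the dictionary.
dictionary = b"READ COILS unit=1 addr=100 count=8 " * 20
ok = b"READ COILS unit=1 addr=100 count=8 "
bad = bytes(range(256))
print(looks_malicious(ok, dictionary), looks_malicious(bad, dictionary))
```

    The repetitiveness of SCADA polling traffic noted in the abstract is exactly what makes this discrimination work: normal packets deflate to a small fraction of their raw size against the dictionary, while unfamiliar payloads do not.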

  2. Randomized subspace-based robust principal component analysis for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Weiwei; Yang, Gang; Li, Jialin; Zhang, Dianfa

    2018-01-01

    A randomized subspace-based robust principal component analysis (RSRPCA) method for anomaly detection in hyperspectral imagery (HSI) is proposed. The RSRPCA combines advantages of randomized column subspace and robust principal component analysis (RPCA). It assumes that the background has low-rank properties, and the anomalies are sparse and do not lie in the column subspace of the background. First, RSRPCA implements random sampling to sketch the original HSI dataset from columns and to construct a randomized column subspace of the background. Structured random projections are also adopted to sketch the HSI dataset from rows. Sketching from columns and rows could greatly reduce the computational requirements of RSRPCA. Second, the RSRPCA adopts the columnwise RPCA (CWRPCA) to eliminate the negative effects of sampled anomaly pixels, purifying the previously constructed randomized column subspace by removing sampled anomaly columns. The CWRPCA decomposes the submatrix of the HSI data into a low-rank matrix (i.e., background component), a noisy matrix (i.e., noise component), and a sparse anomaly matrix (i.e., anomaly component) with only a small proportion of nonzero columns. The algorithm of inexact augmented Lagrange multiplier is utilized to optimize the CWRPCA problem and estimate the sparse matrix. Nonzero columns of the sparse anomaly matrix point to sampled anomaly columns in the submatrix. Third, all the pixels are projected onto the complemental subspace of the purified randomized column subspace of the background, and the anomaly pixels in the original HSI data are finally exactly located. Several experiments on three real hyperspectral images are carefully designed to investigate the detection performance of RSRPCA, and the results are compared with four state-of-the-art methods. Experimental results show that the proposed RSRPCA outperforms the four comparison methods both in detection performance and in computational time.

  3. Mean Gravity Anomaly Prediction Techniques with a Comparative Analysis of the Accuracy and Economy of Selected Methods.

    DTIC Science & Technology

    1982-03-01

    gravity anomaly values computed from measured gravity at discrete points (x,y) within the 1° x 1° area. If the Δg are Bouguer gravity anomalies, the mean Δg is...a 1° x 1° mean Bouguer anomaly. If the Δg are free-air gravity anomalies, the mean Δg is a 1° x 1° mean free-air gravity anomaly. Either anomaly form can...it requires less subjective judgment. Predictions in continental areas always are made using Bouguer gravity anomalies because this anomaly form is

  4. Spacecraft Orbit Anomaly Representation Using Thrust-Fourier-Coefficients with Orbit Determination Toolbox

    NASA Astrophysics Data System (ADS)

    Ko, H.; Scheeres, D.

    2014-09-01

    Representing spacecraft orbit anomalies between two separate states is a challenging but important problem in achieving space situational awareness for an active spacecraft. Incorporating such a capability could play an essential role in analyzing satellite behavior as well as in estimating the trajectory of the space object. A general way to deal with the anomaly problem is to add an estimated perturbing acceleration, such as dynamic model compensation (DMC), into an orbit determination process based on pre- and post-anomaly tracking data. It is a time-consuming numerical process to find valid coefficients to compensate for the unknown dynamics of the anomaly. Even if the orbit determination filter with DMC can crudely estimate an unknown acceleration, this approach does not consider any fundamental element of the unknown dynamics for a given anomaly. In this paper, a new way of representing a spacecraft anomaly using an interpolation technique with Thrust-Fourier-Coefficients (TFCs) is introduced, and several anomaly cases are studied using this interpolation method. It provides a very efficient way of reconstructing the fundamental elements of the dynamics for a given spacecraft anomaly. Any maneuver performed by a satellite transitioning between two arbitrary orbital states can be represented as an equivalent maneuver using an interpolation technique with the TFCs. Given unconnected orbit states between two epochs due to a spacecraft anomaly, it is possible to obtain a unique control law using the TFCs that is able to generate the desired secular behavior for the given orbital changes. This interpolation technique can capture the fundamental elements of combined unmodeled anomaly events. The interpolated orbit trajectory, using the TFCs compensating for a given anomaly, can be used to improve the quality of orbit fits through the anomaly period and therefore help to obtain a good orbit determination solution after the anomaly. The Orbit Determination Toolbox (ODTBX) is modified to incorporate this technique in order to verify the performance of this interpolation approach. Spacecraft anomaly cases are based on either single or multiple low- or high-thrust maneuvers, and the unknown thrust accelerations are recovered and compared with the true thrust acceleration. The advantage of this approach is the ease of appending the TFCs and their dynamics to the pre-built ODTBX, which enables us to blend post-anomaly tracking data to improve the performance of the interpolation representation in the absence of detailed information about a maneuver. It allows us to improve space situational awareness in the areas of uncertainty propagation, anomaly characterization and track correlation.

  5. Anomaly detection of turbopump vibration in Space Shuttle Main Engine using statistics and neural networks

    NASA Technical Reports Server (NTRS)

    Lo, C. F.; Wu, K.; Whitehead, B. A.

    1993-01-01

    Statistical and neural network methods have been applied to investigate the feasibility of detecting anomalies in the turbopump vibration of the SSME. The anomalies are detected based on the amplitude of peaks of fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods proved feasible for detecting the vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank; this method is applicable to on-line operation. The neural network method likewise needs enough data to train the neural networks. The testing procedure can be utilized at any time so long as the characteristics of the components remain unchanged.

  6. Temporal Methods to Detect Content-Based Anomalies in Social Media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skryzalin, Jacek; Field, Jr., Richard; Fisher, Andrew N.

    Here, we develop a method for time-dependent topic tracking and meme trending in social media. Our objective is to identify time periods whose content differs significantly from normal, and we utilize two techniques to do so. The first is an information-theoretic analysis of the distributions of terms emitted during different periods of time. In the second, we cluster documents from each time period and analyze the tightness of each clustering. We also discuss a method of combining the scores created by each technique, and we provide ample empirical analysis of our methodology on various Twitter datasets.
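
    The first technique, comparing the distributions of terms emitted during different time periods, can be sketched with a Jensen-Shannon divergence over word counts. This is an assumed, simplified stand-in for the paper's information-theoretic analysis, and the example texts are invented: identical periods score 0, fully divergent periods score 1 bit.

```python
from collections import Counter
import math

def js_divergence(text_a: str, text_b: str) -> float:
    # Jensen-Shannon divergence (base 2) between the term distributions
    # of two time periods; higher values mean more divergent content.
    pa, pb = Counter(text_a.split()), Counter(text_b.split())
    na, nb = sum(pa.values()), sum(pb.values())
    vocab = set(pa) | set(pb)
    P = {w: pa[w] / na for w in vocab}
    Q = {w: pb[w] / nb for w in vocab}
    M = {w: (P[w] + Q[w]) / 2 for w in vocab}
    def kl(p, q):
        # Kullback-Leibler divergence, skipping zero-probability terms.
        return sum(p[w] * math.log2(p[w] / q[w]) for w in vocab if p[w] > 0)
    return 0.5 * kl(P, M) + 0.5 * kl(Q, M)

baseline = "game tonight great game go team"
trending = "breaking outage outage reported downtown"
print(round(js_divergence(baseline, baseline), 3),
      round(js_divergence(baseline, trending), 3))  # → 0.0 1.0
```

    Flagging a period as anomalous then reduces to thresholding its divergence against a rolling baseline of recent periods.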

  7. Evolutionary neural networks for anomaly detection based on the behavior of a program.

    PubMed

    Han, Sang-Jun; Cho, Sung-Bae

    2006-06-01

    The process of learning the behavior of a given program by using machine-learning techniques (based on system-call audit data) is effective for detecting intrusions. Rule learning, neural networks, statistics, and hidden Markov models (HMMs) are some of the representative methods for intrusion detection. Among them, neural networks are known for good performance in learning system-call sequences. In order to apply this knowledge to real-world problems successfully, it is important to determine the structures and weights of these neural networks. However, finding the appropriate structures requires a very long time because there are no suitable analytical solutions. In this paper, a novel intrusion-detection technique based on evolutionary neural networks (ENNs) is proposed. One advantage of using ENNs is that it takes less time to obtain superior neural networks than with conventional approaches, because they discover the structures and weights of the neural networks simultaneously. Experimental results with the 1999 Defense Advanced Research Projects Agency (DARPA) Intrusion Detection Evaluation (IDEVAL) data confirm that ENNs are promising tools for intrusion detection.
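
    A deliberately minimal toy sketch of the evolutionary idea: a (1+1) evolution strategy mutates the weights of a linear threshold unit and keeps a child whenever it matches or beats the parent's training accuracy. The paper evolves full network structures and weights on system-call data; here both the fixed architecture and the feature values are invented for illustration.

```python
import random

def fitness(w, data):
    # Fraction of (features, label) pairs a linear threshold unit gets right.
    correct = 0
    for x, y in data:
        pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
        correct += (pred == y)
    return correct / len(data)

def evolve(data, dim, generations=300, seed=0):
    # (1+1) evolution strategy: perturb all weights with Gaussian noise and
    # keep the child whenever it matches or beats the parent's accuracy,
    # so fitness never decreases.
    rng = random.Random(seed)
    w = [rng.uniform(-1, 1) for _ in range(dim)]
    for _ in range(generations):
        child = [wi + rng.gauss(0, 0.3) for wi in w]
        if fitness(child, data) >= fitness(w, data):
            w = child
    return w

# Invented toy traces: features are frequencies of two call types plus a
# constant bias input; label 1 marks an intrusive trace.
data = [([0.1, 0.2, -1.0], 0), ([0.2, 0.1, -1.0], 0),
        ([0.9, 0.8, -1.0], 1), ([0.8, 0.9, -1.0], 1)]
w = evolve(data, dim=3)
print(fitness(w, data))
```

    Real ENNs also mutate the topology (adding and removing neurons and connections), which is what removes the need to hand-pick a network structure.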

  8. Independent component analysis (ICA) and self-organizing map (SOM) approach to multidetection system for network intruders

    NASA Astrophysics Data System (ADS)

    Abdi, Abdi M.; Szu, Harold H.

    2003-04-01

    With the growing rate of interconnection among computer systems, network security is becoming a real challenge. An Intrusion Detection System (IDS) is designed to protect the availability, confidentiality and integrity of critical network information systems. Today's approach to network intrusion detection involves the use of rule-based expert systems to identify indications of known attacks or anomalies. However, these techniques are less successful in identifying today's attacks. Hackers are perpetually inventing new and previously unanticipated techniques to compromise information infrastructure. This paper proposes a dynamic way of detecting network intruders in time series data. The proposed approach consists of a two-step process. First, we obtain an efficient multi-user detection method, employing the recently introduced complexity minimization approach as a generalization of standard ICA. Second, we identify an unsupervised learning neural network architecture based on Kohonen's Self-Organizing Map for potential functional clustering. These two steps working together adaptively will provide a pseudo-real-time novelty detection attribute to supplement the current intrusion detection statistical methodology.

  9. Implementing Classification on a Munitions Response Project

    DTIC Science & Technology

    2011-12-01

    Detection Dig List ● IVS/Seed Site Planning Decisions Dig All Anomalies Site Characterization Implementing Classification on a Munitions Response...Details ● Seed emplacement ● EM61-MK2 detection survey ● RTK GPS ● Select anomalies for further investigation ● Collect cued data using MetalMapper...5.2 mV in channel 2 ● 938 anomalies selected ● All QC seeds detected using this threshold ● Some just inside the 60-cm halo ● IVS reproducibility

  10. Fuzzy Kernel k-Medoids algorithm for anomaly detection problems

    NASA Astrophysics Data System (ADS)

    Rustam, Z.; Talita, A. S.

    2017-07-01

    An Intrusion Detection System (IDS) is an essential part of security systems to strengthen the security of information systems. IDS can be used to detect abuse by intruders who try to get into the network system in order to access and utilize the available data sources in the system. There are two approaches to IDS: Misuse Detection and Anomaly Detection (behavior-based intrusion detection). Fuzzy clustering-based methods have been widely used to solve Anomaly Detection problems. Besides using the fuzzy membership concept to assign an object to a cluster, approaches such as combining fuzzy and possibilistic membership, or feature-weight-based methods, are also used. We propose Fuzzy Kernel k-Medoids, which combines fuzzy and possibilistic membership, as a powerful method for solving anomaly detection problems, since in numerical experiments it is able to classify IDS benchmark data into five different classes simultaneously. We classify the KDDCup'99 IDS benchmark data set into five different classes simultaneously; the best performance was achieved using 30% of the training data, with clustering accuracy reaching 90.28%.
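
    For intuition, here is a crisp (non-fuzzy, non-kernel) k-medoids sketch on an invented two-cluster dataset; the paper's method replaces the hard assignment below with fuzzy and possibilistic membership degrees and a kernel-induced distance. The exhaustive search over medoid sets is only viable for tiny datasets.

```python
import itertools

def kmedoids(points, k, dist):
    # Exhaustive k-medoids for tiny datasets: pick the k data points that
    # minimize the total distance from every point to its nearest medoid.
    # (Unlike k-means, the cluster centers are always actual data points.)
    def cost(medoids):
        return sum(min(dist(p, m) for m in medoids) for p in points)
    return min(itertools.combinations(points, k), key=cost)

# Invented 2-D records: a "normal" cluster and a small "attack" cluster.
normal = [(1.0, 1.0), (1.1, 0.9), (0.9, 1.1)]
attack = [(8.0, 8.0), (8.2, 7.9)]
sq_dist = lambda a, b: (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
medoids = kmedoids(normal + attack, 2, sq_dist)
print(sorted(medoids))
```

    Swapping `sq_dist` for a kernel-induced distance and the hard `min` assignment for graded memberships yields the fuzzy kernel variant the abstract describes.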

  11. Live birth rates and safety profile using dydrogesterone for luteal phase support in assisted reproductive techniques

    PubMed Central

    Nadarajah, Ravichandran; Rajesh, Hemashree; Wong, Ker Yi; Faisal, Fazlin; Yu, Su Ling

    2017-01-01

    INTRODUCTION Assisted reproductive techniques (ARTs) result in a deficient luteal phase, requiring the administration of intramuscular, intravaginal or oral exogenous progesterone. Dydrogesterone, an oral retroprogesterone with good bioavailability, has been used in assisted reproductive cycles with outcomes that are comparable to those of vaginal or intramuscular progesterone. However, there are limited reviews on its use for luteal phase support in ARTs, in terms of pregnancy outcomes and associated fetal anomalies. This study aimed to review the live birth rates and associated fetal anomalies of women who were given dydrogesterone for luteal phase support in assisted reproductive cycles at a tertiary hospital in Singapore. METHODS This retrospective descriptive study included 1,050 women who underwent in vitro fertilisation/intracytoplasmic sperm injection at the Centre for Assisted Reproduction of Singapore General Hospital between 2000 and 2011. The women were given dydrogesterone for luteal phase support. The main outcome measures were rates of pregnancy, live birth, miscarriage and fetal anomalies. RESULTS The pregnancy and live birth rates were 34.7% and 27.7%, respectively. Among those who achieved pregnancy, 17.0% miscarried, 0.8% had ectopic pregnancies and 0.3% had molar pregnancies. Fetal anomalies were detected in 1.9% of pregnancies, all of which were terminated by choice. CONCLUSION Since the outcomes of dydrogesterone are comparable to those of intramuscular and vaginal progesterone, it is a reasonable option to provide luteal phase support for women who are uncomfortable with injections or vaginal insertions. Randomised controlled studies are needed to determine the optimal dosage of dydrogesterone for luteal phase support in ARTs. PMID:27090598

  12. Anomaly detection in hyperspectral imagery: statistics vs. graph-based algorithms

    NASA Astrophysics Data System (ADS)

    Berkson, Emily E.; Messinger, David W.

    2016-05-01

    Anomaly detection (AD) algorithms are frequently applied to hyperspectral imagery, but different algorithms produce different outlier results depending on the image scene content and the assumed background model. This work provides the first comparison of anomaly score distributions between common statistics-based anomaly detection algorithms (RX and subspace-RX) and the graph-based Topological Anomaly Detector (TAD). Anomaly scores in statistical AD algorithms should theoretically approximate a chi-squared distribution; however, this is rarely the case with real hyperspectral imagery. The expected distribution of scores found with graph-based methods remains unclear. We also look for general trends in algorithm performance with varied scene content. Three separate scenes were extracted from the hyperspectral MegaScene image taken over downtown Rochester, NY with the VIS-NIR-SWIR ProSpecTIR instrument. In order of most to least cluttered, we study an urban, suburban, and rural scene. The three AD algorithms were applied to each scene, and the distributions of the most anomalous 5% of pixels were compared. We find that subspace-RX performs better than RX, because the data becomes more normal when the highest variance principal components are removed. We also see that compared to statistical detectors, anomalies detected by TAD are easier to separate from the background. Due to their different underlying assumptions, the statistical and graph-based algorithms highlighted different anomalies within the urban scene. These results will lead to a deeper understanding of these algorithms and their applicability across different types of imagery.
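
    The RX detector compared above can be sketched for a toy two-band image (a global-covariance RX; the pixel values are invented): each pixel's anomaly score is its squared Mahalanobis distance from the scene's mean spectrum, so spectra unlike the background score highest.

```python
def rx_scores(pixels):
    # Global RX detector for two-band pixels: squared Mahalanobis
    # distance of each pixel from the scene's mean spectrum.
    n = len(pixels)
    m0 = sum(p[0] for p in pixels) / n
    m1 = sum(p[1] for p in pixels) / n
    # Sample covariance of the two bands.
    c00 = sum((p[0] - m0) ** 2 for p in pixels) / n
    c11 = sum((p[1] - m1) ** 2 for p in pixels) / n
    c01 = sum((p[0] - m0) * (p[1] - m1) for p in pixels) / n
    # Closed-form inverse of the 2x2 covariance matrix.
    det = c00 * c11 - c01 * c01
    i00, i11, i01 = c11 / det, c00 / det, -c01 / det
    scores = []
    for p in pixels:
        d0, d1 = p[0] - m0, p[1] - m1
        scores.append(d0 * d0 * i00 + 2 * d0 * d1 * i01 + d1 * d1 * i11)
    return scores

background = [(0.30, 0.50), (0.32, 0.52), (0.29, 0.49), (0.31, 0.51)]
target = [(0.90, 0.10)]  # spectrally unlike the background
scores = rx_scores(background + target)
print(max(range(len(scores)), key=scores.__getitem__))  # index of the anomaly
```

    Subspace-RX, as the abstract notes, first removes the highest-variance principal components so the residual statistics are closer to the chi-squared ideal before this distance is computed.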

  13. Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity.

    PubMed

    Napoletano, Paolo; Piccoli, Flavio; Schettini, Raimondo

    2018-01-12

    Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM) imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs) and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art.
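
    The region-based scoring idea can be sketched without the CNN: raw pixel patches stand in for the CNN feature vectors, and each subregion is scored by its distance to the nearest entry in a dictionary of anomaly-free subregions (images, patch size, and the implanted defect below are synthetic):

```python
import random

def patch_features(image, size):
    """Split a 2-D image (list of rows) into non-overlapping size x size
    subregions and flatten each into a feature vector."""
    feats = []
    for r in range(0, len(image) - size + 1, size):
        for c in range(0, len(image[0]) - size + 1, size):
            feats.append([image[r + i][c + j] for i in range(size) for j in range(size)])
    return feats

def anomaly_scores(test_image, dictionary, size=4):
    """Score each subregion by its distance to the nearest anomaly-free
    dictionary entry (raw pixels stand in for CNN features here)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [min(dist(f, d) for d in dictionary) for f in patch_features(test_image, size)]

def make_image(rng):
    return [[rng.gauss(0.5, 0.05) for _ in range(16)] for _ in range(16)]

rng = random.Random(1)
dictionary = [f for _ in range(20) for f in patch_features(make_image(rng), 4)]
defective = make_image(rng)
for i in range(4):  # implant a bright defect covering the top-left subregion
    for j in range(4):
        defective[i][j] = 1.5
scores = anomaly_scores(defective, dictionary)
print(scores.index(max(scores)))  # subregion 0 holds the defect
```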

  14. Visualization techniques and graphical user interfaces in syndromic surveillance systems. Summary from the Disease Surveillance Workshop, Sept. 11-12, 2007; Bangkok, Thailand.

    PubMed

    Moore, Kieran M; Edge, Graham; Kurc, Andrew R

    2008-11-14

    Timeliness is a critical asset to the detection of public health threats when using syndromic surveillance systems. In order for epidemiologists to effectively distinguish which events are indicative of a true outbreak, the ability to utilize specific data streams from generalized data summaries is necessary. Taking advantage of graphical user interfaces and visualization capacities of current surveillance systems makes it easier for users to investigate detected anomalies by generating custom graphs, maps, plots, and temporal-spatial analysis of specific syndromes or data sources.

  15. Visualization techniques and graphical user interfaces in syndromic surveillance systems. Summary from the Disease Surveillance Workshop, Sept. 11–12, 2007; Bangkok, Thailand

    PubMed Central

    Moore, Kieran M; Edge, Graham; Kurc, Andrew R

    2008-01-01

    Timeliness is a critical asset to the detection of public health threats when using syndromic surveillance systems. In order for epidemiologists to effectively distinguish which events are indicative of a true outbreak, the ability to utilize specific data streams from generalized data summaries is necessary. Taking advantage of graphical user interfaces and visualization capacities of current surveillance systems makes it easier for users to investigate detected anomalies by generating custom graphs, maps, plots, and temporal-spatial analysis of specific syndromes or data sources. PMID:19025683

  16. Space shuttle main engine fault detection using neural networks

    NASA Technical Reports Server (NTRS)

    Bishop, Thomas; Greenwood, Dan; Shew, Kenneth; Stevenson, Fareed

    1991-01-01

    A method for on-line Space Shuttle Main Engine (SSME) anomaly detection and fault typing using a feedback neural network is described. The method involves the computation of features representing time-variance of SSME sensor parameters, using historical test case data. The network is trained, using backpropagation, to recognize a set of fault cases. The network is then able to diagnose new fault cases correctly. An essential element of the training technique is the inclusion of randomly generated data along with the real data, in order to span the entire input space of potential non-nominal data.
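
    The training trick of padding real exemplars with randomly generated off-nominal data can be illustrated on a single logistic unit, a deliberately reduced stand-in for the backpropagation-trained feedback network; the features (time-variance of two sensor channels), value ranges, and labels below are synthetic:

```python
import math
import random

def train_logistic(samples, labels, epochs=500, lr=0.5):
    """Single logistic unit trained by gradient descent -- a deliberately
    reduced stand-in for the backpropagation-trained network."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            g = 1.0 / (1.0 + math.exp(-z)) - y  # dLoss/dz for log loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

random.seed(6)
# features: time-variance of two sensor channels (nominal runs are quiet);
# random off-nominal padding spans the input space, as in the abstract
nominal = [[random.uniform(0.0, 0.2), random.uniform(0.0, 0.2)] for _ in range(40)]
faults = [[random.uniform(0.5, 1.0), random.uniform(0.5, 1.0)] for _ in range(40)]
padding = [[random.uniform(0.4, 1.0), random.uniform(0.4, 1.0)] for _ in range(20)]
w, b = train_logistic(nominal + faults + padding, [0] * 40 + [1] * 60)
print(predict(w, b, [0.1, 0.1]), predict(w, b, [0.8, 0.7]))
```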

  17. First and second trimester screening for fetal structural anomalies.

    PubMed

    Edwards, Lindsay; Hui, Lisa

    2018-04-01

    Fetal structural anomalies are found in up to 3% of all pregnancies and ultrasound-based screening has been an integral part of routine prenatal care for decades. The prenatal detection of fetal anomalies allows for optimal perinatal management, providing expectant parents with opportunities for additional imaging, genetic testing, and the provision of information regarding prognosis and management options. Approximately one-half of all major structural anomalies can now be detected in the first trimester, including acrania/anencephaly, abdominal wall defects, holoprosencephaly and cystic hygromata. Due to the ongoing development of some organ systems, however, some anomalies will not be evident until later in the pregnancy. To this end, the second trimester anatomy scan is recommended by professional societies as the standard investigation for the detection of fetal structural anomalies. The reported detection rates of structural anomalies vary according to the organ system being examined, and are also dependent upon factors such as the equipment settings and sonographer experience. Technological advances over the past two decades continue to support the role of ultrasound as the primary imaging modality in pregnancy, and the safety of ultrasound for the developing fetus is well established. With increasing capabilities and experience, detailed examination of the central nervous system and cardiovascular system is possible, with dedicated examinations such as the fetal neurosonogram and the fetal echocardiogram now widely performed in tertiary centers. Magnetic resonance imaging (MRI) is well recognized for its role in the assessment of fetal brain anomalies; other potential indications for fetal MRI include lung volume measurement (in cases of congenital diaphragmatic hernia), and pre-surgical planning prior to fetal spina bifida repair. 
When a major structural abnormality is detected prenatally, genetic testing with chromosomal microarray is recommended over routine karyotype due to its higher genomic resolution. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Madagascar's escape from Africa: A high-resolution plate reconstruction for the Western Somali Basin and implications for supercontinent dispersal

    NASA Astrophysics Data System (ADS)

    Phethean, Jordan J. J.; Kalnins, Lara M.; van Hunen, Jeroen; Biffi, Paolo G.; Davies, Richard J.; McCaffrey, Ken J. W.

    2016-12-01

    Accurate reconstructions of the dispersal of supercontinent blocks are essential for testing continental breakup models. Here, we provide a new plate tectonic reconstruction of the opening of the Western Somali Basin during the breakup of East and West Gondwana. The model is constrained by a new comprehensive set of spreading lineaments, detected in this heavily sedimented basin using a novel technique based on directional derivatives of free-air gravity anomalies. Vertical gravity gradient and free-air gravity anomaly maps also enable the detection of extinct mid-ocean ridge segments, which can be directly compared to several previous ocean magnetic anomaly interpretations of the Western Somali Basin. The best matching interpretations have basin symmetry around the M0 anomaly; these are then used to temporally constrain our plate tectonic reconstruction. The reconstruction supports a tight fit for Gondwana fragments prior to breakup, and predicts that the continent-ocean transform margin lies along the Rovuma Basin, not along the Davie Fracture Zone (DFZ) as commonly thought. According to our reconstruction, the DFZ represents a major ocean-ocean fracture zone formed by the coalescence of several smaller fracture zones during evolving plate motions as Madagascar drifted southwards, and offshore Tanzania is an obliquely rifted, rather than transform, margin. New seismic reflection evidence for oceanic crust inboard of the DFZ strongly supports these conclusions. Our results provide important new constraints on the still enigmatic driving mechanism of continental rifting, the nature of the lithosphere in the Western Somali Basin, and its resource potential.
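
    The basic operation underlying the lineament-detection technique is the directional derivative of a gridded field; a central-difference sketch on a toy grid (real input would be a free-air gravity anomaly grid, and the azimuth would be swept to pick out spreading fabric):

```python
import math

def directional_derivative(grid, theta, spacing=1.0):
    """Directional derivative of a 2-D field (e.g. a free-air gravity
    anomaly grid) along azimuth theta, via central differences."""
    ux, uy = math.cos(theta), math.sin(theta)
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            gx = (grid[r][c + 1] - grid[r][c - 1]) / (2 * spacing)
            gy = (grid[r + 1][c] - grid[r - 1][c]) / (2 * spacing)
            out[r][c] = gx * ux + gy * uy
    return out

# a linear ramp in x: unit derivative along x, (numerically) zero along y
ramp = [[float(c) for c in range(6)] for _ in range(6)]
dx = directional_derivative(ramp, 0.0)
dy = directional_derivative(ramp, math.pi / 2)
print(dx[2][2], dy[2][2])
```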

  19. Underground structure characterization using motor vehicles as passive seismic sources

    NASA Astrophysics Data System (ADS)

    Kuzma, H. A.; Liu, Y.; Zhao, Y.; Rector, J.; Vaidya, S.

    2009-12-01

    The ability to detect and characterize underground voids will be critical to the success of On-Site Inspections (OSI) as mandated by the nuclear Comprehensive Test Ban Treaty (CTBT). OSIs may be conducted in order to successfully locate the Ground Zero of underground tests as well as infrastructure related to testing. Recently, our team has shown the potential of a new technique to detect underground objects using the amplitude of seismic surface waves generated by motor vehicles. In an experiment conducted in June, 2009 we were able to detect an abandoned railroad tunnel by recognizing a clear pattern in the surface waves scattered by the tunnel, using a signal generated by driving a car on a dirt road across the tunnel. Synthetic experiments conducted using physically realistic wave-equation models further suggest that the technique can be readily applied to detecting underground features: it may be possible to image structures of importance to OSI simply by laying out an array of geophones (or using an array already in place for passive listening for event aftershocks) and driving vehicles around the site. We present evidence from a set of field experiments and from synthetic modeling and inversion studies to illustrate adaptations of the technique for OSI. Figure caption: Signature of an abandoned underground railroad tunnel at Donner Summit, CA. To produce this image, a line of geophones was placed along a dirt road perpendicular to the tunnel (black box) and a single car was driven along the road. A normalized mean power-spectrum is displayed on a log scale as a function of meters from the center of the tunnel. The top of the tunnel was 18m below ground surface. The tunnel anomaly is made up of a shadow (light) directly above the tunnel and amplitude build-up (dark) on either side of the tunnel. The size of the anomaly (6 orders of magnitude) suggests that the method can be extended to find deep structures at greater distances from the source and receivers.

  20. Integration of Audit Data Analysis and Mining Techniques into Aide

    DTIC Science & Technology

    2006-07-01

    results produced by the anomaly-detection subsystem. A successor system to NIDES, called EMERALD [35], currently under development at SRI, extends...to represent attack scenarios in a networked environment. eBayes of SRI’s Emerald uses Bayes net technology to analyze bursts of traffic [40...snmpget2, we have to resort to the TCP raw data (packets) to see what operations these connections performed. (E.g., long login trails in the PSSWD attack

  1. Mitigating the Insider Threat with High-Dimensional Anomaly Detection

    DTIC Science & Technology

    2004-12-01

    a more serious attack. Various systems such as NSM [56], GrIDS [57], snort [58], Emerald [59], and Spice [60] generate alerts for portscan...reboot etc. The user measurements include the user profiles such as time of login , duration of user session, cumulative CPU time, names of files...already been implemented in a real-time system for information retrieval [3]. A technique developed at SRI in the Emerald system [22] uses historical

  2. Anomaly Detection Techniques for the Condition Monitoring of Tidal Turbines

    DTIC Science & Technology

    2014-09-29

    particularly beneficial to this industry. This paper explores the use of the CRISP-DM data mining process model for identifying key trends within...within tidal turbines with limited historical data. Using the CRISP-DM data mining methodology (Wirth & Hipp, 2000), key relationships between...indicate a change in the response of the system, indicating the possible onset of a fault. 1.2.1. CRISP-DM The CRISP-DM (Cross-Industry Standard

  3. Passenger baggage object database (PBOD)

    NASA Astrophysics Data System (ADS)

    Gittinger, Jaxon M.; Suknot, April N.; Jimenez, Edward S.; Spaulding, Terry W.; Wenrich, Steve A.

    2018-04-01

    Detection of anomalies of interest in x-ray images is an ever-evolving problem that requires the rapid development of automatic detection algorithms. Automatic detection algorithms are developed using machine learning techniques, which would require developers to obtain the x-ray machine that was used to create the training images and to compile all associated metadata for those images by hand. The Passenger Baggage Object Database (PBOD) and data acquisition application were designed and developed for acquiring and persisting 2-D and 3-D x-ray image data and associated metadata. PBOD was specifically created to capture simulated airline passenger "stream of commerce" luggage data, but could be applied to other areas of x-ray imaging to utilize machine-learning methods.

  4. A Load-Based Temperature Prediction Model for Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Sobhani, Masoud

    Electric load forecasting, as a basic requirement for decision-making in power utilities, has been improved in various aspects over the past decades. Many factors may affect the accuracy of the load forecasts, such as data quality, goodness of the underlying model and load composition. Due to the strong correlation between the input variables (e.g., weather and calendar variables) and the load, the quality of input data plays a vital role in forecasting practice. Even if the forecasting model were able to capture most of the salient features of the load, low-quality input data may result in inaccurate forecasts. Most of the data cleansing efforts in the load forecasting literature have been devoted to the load data; few studies have focused on weather data cleansing for load forecasting. This research proposes an anomaly detection method for temperature data. The method consists of two components: a load-based temperature prediction model and a detection technique. The effectiveness of the proposed method is demonstrated through two case studies: one based on the data from the Global Energy Forecasting Competition 2014, and the other based on the data published by ISO New England. The results show that by removing the detected observations from the original input data, the final load forecast accuracy is enhanced.

  5. Identifying Threats Using Graph-based Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Eberle, William; Holder, Lawrence; Cook, Diane

    Much of the data collected during the monitoring of cyber and other infrastructures is structural in nature, consisting of various types of entities and relationships between them. The detection of threatening anomalies in such data is crucial to protecting these infrastructures. We present an approach to detecting anomalies in a graph-based representation of such data that explicitly represents these entities and relationships. The approach consists of first finding normative patterns in the data using graph-based data mining and then searching for small, unexpected deviations to these normative patterns, assuming illicit behavior tries to mimic legitimate, normative behavior. The approach is evaluated using several synthetic and real-world datasets. Results show that the approach has high true-positive rates, low false-positive rates, and is capable of detecting complex structural anomalies in real-world domains including email communications, cellphone calls and network traffic.
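
    A much-reduced sketch of the normative-pattern idea: mine frequent (source label, relation, target label) patterns and score each edge by its pattern's rarity. Real graph-based AD mines whole subgraphs rather than single edges, and the entities and relations below are invented:

```python
from collections import Counter

def edge_anomaly_scores(edges, labels):
    """Find normative (frequent) edge patterns, then score each edge by
    the rarity of its (source label, relation, target label) pattern --
    a simplified stand-in for graph-based normative-pattern mining."""
    patterns = Counter((labels[s], rel, labels[t]) for s, rel, t in edges)
    total = sum(patterns.values())
    return {(s, rel, t): 1.0 - patterns[(labels[s], rel, labels[t])] / total
            for s, rel, t in edges}

labels = {"a1": "analyst", "a2": "analyst", "a3": "analyst", "x": "server"}
edges = [("a1", "reads", "x"), ("a2", "reads", "x"), ("a3", "reads", "x"),
         ("a1", "deletes", "x")]  # small, unexpected deviation from the norm
scores = edge_anomaly_scores(edges, labels)
worst = max(scores, key=scores.get)
print(worst)
```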

  6. Real-time Bayesian anomaly detection in streaming environmental data

    NASA Astrophysics Data System (ADS)

    Hill, David J.; Minsker, Barbara S.; Amir, Eyal

    2009-04-01

    With large volumes of data arriving in near real time from environmental sensors, there is a need for automated detection of anomalous data caused by sensor or transmission errors or by infrequent system behaviors. This study develops and evaluates three automated anomaly detection methods using dynamic Bayesian networks (DBNs), which perform fast, incremental evaluation of data as they become available, scale to large quantities of data, and require no a priori information regarding process variables or types of anomalies that may be encountered. This study investigates these methods' abilities to identify anomalies in eight meteorological data streams from Corpus Christi, Texas. The results indicate that DBN-based detectors, using either robust Kalman filtering or Rao-Blackwellized particle filtering, outperform a DBN-based detector using Kalman filtering, with the former having false positive/negative rates of less than 2%. These methods were successful at identifying data anomalies caused by two real events: a sensor failure and a large storm.
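
    A pared-down sketch of the Kalman-filtering flavour of such detectors: a 1-D random-walk filter that flags observations whose innovation is improbably large and excludes them from the state update. The study's detectors are DBN-based and multivariate; the noise levels and the threshold k below are illustrative:

```python
import random

def kalman_anomalies(stream, q=0.01, r=0.25, k=4.0):
    """1-D Kalman filter with a random-walk state model; an observation is
    flagged when its innovation exceeds k predicted standard deviations,
    and flagged points are excluded from the state update."""
    x, p = stream[0], 1.0  # state estimate and its variance
    flagged = []
    for i, z in enumerate(stream[1:], start=1):
        p_pred = p + q          # predict step
        s = p_pred + r          # innovation variance
        innovation = z - x
        if abs(innovation) > k * s ** 0.5:
            flagged.append(i)   # anomaly: do not let it corrupt the state
            continue
        gain = p_pred / s       # update step
        x += gain * innovation
        p = (1 - gain) * p_pred
    return flagged

random.seed(3)
data = [20 + 0.01 * t + random.gauss(0, 0.3) for t in range(300)]
data[150] += 8  # a single spurious spike, e.g. a transmission error
print(kalman_anomalies(data))
```

    The filter is incremental, evaluating each observation as it arrives, which is the property that makes this family of methods suitable for streaming sensor data.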

  7. Conditional Anomaly Detection with Soft Harmonic Functions

    PubMed Central

    Valko, Michal; Kveton, Branislav; Valizadegan, Hamed; Cooper, Gregory F.; Hauskrecht, Milos

    2012-01-01

    In this paper, we consider the problem of conditional anomaly detection that aims to identify data instances with an unusual response or a class label. We develop a new non-parametric approach for conditional anomaly detection based on the soft harmonic solution, with which we estimate the confidence of the label to detect anomalous mislabeling. We further regularize the solution to avoid the detection of isolated examples and examples on the boundary of the distribution support. We demonstrate the efficacy of the proposed method on several synthetic and UCI ML datasets in detecting unusual labels when compared to several baseline approaches. We also evaluate the performance of our method on a real-world electronic health record dataset where we seek to identify unusual patient-management decisions. PMID:25309142
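
    The soft harmonic solution has the closed form f = (L + gamma*I)^-1 * gamma*y, with L the graph Laplacian of a similarity graph over the data. A small Gauss-Seidel sketch on a toy graph (the weights, gamma, and the mislabeled node are invented; the sign test at the end is a crude stand-in for the paper's confidence estimate):

```python
def soft_harmonic(weights, y, gamma=1.0, iters=200):
    """Gauss-Seidel solution of (L + gamma*I) f = gamma*y, where L is the
    graph Laplacian of the similarity matrix `weights`; f[i] is the
    smoothed confidence of node i's label."""
    n = len(y)
    f = list(y)
    for _ in range(iters):
        for i in range(n):
            num = gamma * y[i] + sum(weights[i][j] * f[j] for j in range(n) if j != i)
            den = gamma + sum(weights[i][j] for j in range(n) if j != i)
            f[i] = num / den
    return f

y = [1.0, 1.0, 1.0, 1.0, -1.0]  # node 4's label disagrees with its neighbourhood
weights = [[0.0 if i == j else 1.0 for j in range(5)] for i in range(5)]
f = soft_harmonic(weights, y)
suspicious = [i for i, (fi, yi) in enumerate(zip(f, y)) if fi * yi < 0]
print(suspicious)  # the anomalously labeled node
```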

  8. Conditional Anomaly Detection with Soft Harmonic Functions.

    PubMed

    Valko, Michal; Kveton, Branislav; Valizadegan, Hamed; Cooper, Gregory F; Hauskrecht, Milos

    2011-01-01

    In this paper, we consider the problem of conditional anomaly detection that aims to identify data instances with an unusual response or a class label. We develop a new non-parametric approach for conditional anomaly detection based on the soft harmonic solution, with which we estimate the confidence of the label to detect anomalous mislabeling. We further regularize the solution to avoid the detection of isolated examples and examples on the boundary of the distribution support. We demonstrate the efficacy of the proposed method on several synthetic and UCI ML datasets in detecting unusual labels when compared to several baseline approaches. We also evaluate the performance of our method on a real-world electronic health record dataset where we seek to identify unusual patient-management decisions.

  9. A Stochastic-entropic Approach to Detect Persistent Low-temperature Volcanogenic Thermal Anomalies

    NASA Astrophysics Data System (ADS)

    Pieri, D. C.; Baxter, S.

    2011-12-01

    Eruption prediction is a chancy idiosyncratic affair, as volcanoes often manifest waxing and/or waning pre-eruption emission, geodetic, and seismic behavior that is unsystematic. Thus, fundamental to increased prediction accuracy and precision are good and frequent assessments of the time-series behavior of relevant precursor geophysical, geochemical, and geological phenomena, especially when volcanoes become restless. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), in orbit since 1999 on the NASA Terra Earth Observing System satellite is an important capability for detection of thermal eruption precursors (even subtle ones) and increased passive gas emissions. The unique combination of ASTER high spatial resolution multi-spectral thermal IR imaging data (90m/pixel; 5 bands in the 8-12 um region), combined with simultaneous visible and near-IR imaging data, and stereo-photogrammetric capabilities make it a useful, especially thermal, precursor detection tool. The JPL ASTER Volcano Archive consisting of 80,000+ ASTER volcano images allows systematic analysis of (a) baseline thermal emissions for 1550+ volcanoes, (b) important aspects of the time-dependent thermal variability, and (c) the limits of detection of temporal dynamics of eruption precursors. We are analyzing a catalog of the magnitude, frequency, and distribution of ASTER-documented volcano thermal signatures, compiled from 2000 onward, at 90m/pixel. Low contrast thermal anomalies of relatively low apparent absolute temperature (e.g., summit lakes, fumarolically altered areas, geysers, very small sub-pixel hotspots), for which the signal-to-noise ratio may be marginal (e.g., scene confusion due to clouds, water and water vapor, fumarolic emissions, variegated ground emissivity, and their combinations), are particularly important to discern and monitor. 
We have developed a technique to detect persistent hotspots that takes into account in-scene observed pixel joint frequency distributions over time, temperature contrast, and Shannon entropy. Preliminary analyses of Fogo Volcano and Yellowstone hotspots, among others, indicate that this is a very sensitive technique with good potential to be applied over the entire ASTER global night-time archive. We will discuss our progress in creating the global thermal anomaly catalog as well as algorithm approach and results. This work was carried out at the Jet Propulsion Laboratory of the California Institute of Technology under contract to NASA.
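
    The abstract does not spell out the algorithm, so the sketch below merely illustrates two of its named ingredients on synthetic 100-pixel scenes: persistence of a pixel in the hot tail across repeat observations, and Shannon entropy of a scene's temperature distribution (all thresholds and data are invented):

```python
import math
import random
import statistics

def shannon_entropy(values, bins=16):
    """Shannon entropy (bits) of a histogram of scene temperatures;
    a low-entropy (quiet) scene makes weak anomalies easier to discern."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def persistent_hotspots(scenes, min_frac=0.8):
    """Flag pixels sitting above mean + 2 sigma in at least min_frac of scenes."""
    hits = [0] * len(scenes[0])
    for scene in scenes:
        mu, sd = statistics.fmean(scene), statistics.pstdev(scene)
        for i, t in enumerate(scene):
            if t > mu + 2 * sd:
                hits[i] += 1
    return [i for i, h in enumerate(hits) if h / len(scenes) >= min_frac]

random.seed(4)
scenes = []
for _ in range(10):  # ten repeat scenes; pixel 7 is persistently a few K warm
    scene = [280.0 + random.gauss(0.0, 1.0) for _ in range(100)]
    scene[7] += 6.0
    scenes.append(scene)
print(persistent_hotspots(scenes), round(shannon_entropy(scenes[0]), 2))
```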

  10. Thermal remote sensing as a part of Exupéry volcano fast response system

    NASA Astrophysics Data System (ADS)

    Zakšek, Klemen; Hort, Matthias

    2010-05-01

    In order to understand the eruptive potential of a volcanic system, one has to characterize its actual state of stress, which requires proper monitoring strategies. As several volcanoes in highly populated areas, especially in south-east Asia, are still nearly unmonitored, a mobile volcano monitoring system is currently being developed in Germany. One of the major novelties of this mobile volcano fast response system, called Exupéry, is the direct inclusion of satellite based observations. Remote sensing data are introduced together with ground based field measurements into the GIS database, where the statistical properties of all recorded data are estimated. Using physical modelling and statistical methods we hope to constrain the probability of future eruptions. The emphasis of this contribution is on using thermal remote sensing as a tool for monitoring active volcanoes. One can detect thermal anomalies originating from a volcano by comparing signals in the mid- and thermal-infrared spectra. A reliable and effective thermal anomaly detection algorithm, based on a threshold on the so-called normalized thermal index (NTI), was developed by Wright (2002) for the MODIS sensor. This is the method we use in Exupéry, where we characterize each detected thermal anomaly by temperature, area, heat flux and effusion rate. Recent work has shown that radiant flux is the most robust parameter for this characterization. Its derivation depends on the atmosphere, the satellite viewing angle and sensor characteristics. Some of these influences are easy to correct using standard remote sensing pre-processing techniques; however, some noise still remains in the data. In addition, satellites in polar orbits have long revisit times and thus might fail to follow a fast-evolving volcanic crisis. We are therefore currently testing a Kalman filter on the simultaneous use of MODIS and AVHRR data to improve the thermal anomaly characterization. The advantage of this technique is that it increases the temporal resolution by using images from different satellites with different resolution and sensitivity. The algorithm has been tested for an eruption at Mt. Etna (2002) and successfully captures more details of the eruption evolution than would be seen using only one satellite source. At the moment, only MODIS data (a sensor aboard NASA's Terra and Aqua satellites) are used operationally in Exupéry. As MODIS is a meteorological sensor, it is also suitable for producing general overview images of the crisis area. Therefore, for each processed MODIS image we also produce an RGB image in which some basic meteorological features are classified, e.g. clouds, volcanic ash plumes, ocean, etc. In the case of a detected hotspot an additional image is created; it contains the original measured radiances of the selected channels for the crisis area. All anomaly and processing parameters are additionally written into an XML file. The results are available in the web GIS at worst two hours after NASA provides level 1b data online.
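
    The NTI of Wright (2002) referenced here is a simple band ratio of mid-infrared (MODIS band 22, ~3.96 um) to thermal-infrared (band 32, ~12 um) spectral radiance; -0.8 is the published night-time MODVOLC cutoff, while the radiance values below are only illustrative:

```python
def nti(mir_radiance, tir_radiance):
    """Normalized Thermal Index: NTI = (MIR - TIR) / (MIR + TIR),
    computed from mid-IR and thermal-IR spectral radiances."""
    return (mir_radiance - tir_radiance) / (mir_radiance + tir_radiance)

def hotspot_pixels(mir, tir, threshold=-0.8):
    """MODVOLC-style detection: a pixel is flagged when its NTI exceeds
    the night-time threshold of -0.8."""
    return [i for i, (m, t) in enumerate(zip(mir, tir)) if nti(m, t) > threshold]

# illustrative radiances (W m^-2 sr^-1 um^-1): background pixels emit weakly
# in the MIR, while an active lava surface raises MIR radiance sharply
mir = [0.4, 0.5, 3.2, 0.4]
tir = [8.0, 8.2, 9.0, 8.1]
print([round(nti(m, t), 2) for m, t in zip(mir, tir)])
print(hotspot_pixels(mir, tir))  # only the lava pixel is flagged
```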

  11. Detailed gravity anomalies from GEOS-3 satellite altimetry data

    NASA Technical Reports Server (NTRS)

    Gopalapillai, G. S.; Mourad, A. G.

    1978-01-01

    A technique for deriving mean gravity anomalies from dense altimetry data was developed. A combination of both deterministic and statistical techniques was used. The basic mathematical model was based on the Stokes' equation, which describes the analytical relationship between mean gravity anomalies and the geoid undulation at a point; this undulation is a linear function of the altimetry data at that point. The overdetermined problem resulting from the excessive altimetry data available was solved using least-squares principles. These principles enable the simultaneous estimation of the associated standard deviations, reflecting the internal consistency based on the accuracy estimates provided for the altimetry data as well as for the terrestrial anomaly data. Several test computations were made of the anomalies and their accuracy estimates using GEOS-3 data.
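
    The least-squares step can be illustrated in its simplest form: a single-parameter weighted adjustment of redundant observations that also propagates a standard deviation for the estimate, echoing the simultaneous accuracy estimation described above (observation values and a-priori accuracies below are invented):

```python
def weighted_least_squares(a, y, sigma):
    """Single-parameter weighted least squares for y_i = a_i * x + e_i,
    with weights 1/sigma_i^2; returns the estimate and its standard
    deviation (a toy version of the overdetermined adjustment)."""
    w = [1.0 / s ** 2 for s in sigma]
    num = sum(wi * ai * yi for wi, ai, yi in zip(w, a, y))
    den = sum(wi * ai * ai for wi, ai in zip(w, a))
    return num / den, (1.0 / den) ** 0.5

# four redundant observations of one mean anomaly (mGal), unequal accuracy
a = [1.0, 1.0, 1.0, 1.0]
y = [25.3, 24.8, 26.0, 23.5]
sigma = [0.5, 0.5, 2.0, 2.0]  # a-priori accuracy estimates
x_hat, sd = weighted_least_squares(a, y, sigma)
print(round(x_hat, 2), round(sd, 2))  # accurate observations dominate
```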

  12. Rate based failure detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Brett Emery Trabun; Gamage, Thoshitha Thanushka; Bakken, David Edward

    This disclosure describes, in part, a system management component and failure detection component for use in a power grid data network to identify anomalies within the network and systematically adjust the quality of service of data published by publishers and subscribed to by subscribers within the network. In one implementation, subscribers may identify a desired data rate, a minimum acceptable data rate, desired latency, minimum acceptable latency and a priority for each subscription. The failure detection component may identify an anomaly within the network and a source of the anomaly. Based on the identified anomaly, data rates and/or data paths may be adjusted in real time to ensure that the power grid data network does not become overloaded and/or fail.

  13. Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity

    PubMed Central

    Schettini, Raimondo

    2018-01-01

    Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM) imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs) and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art. PMID:29329268

  14. Paternal psychological response after ultrasonographic detection of structural fetal anomalies with a comparison to maternal response: a cohort study.

    PubMed

    Kaasen, Anne; Helbig, Anne; Malt, Ulrik Fredrik; Naes, Tormod; Skari, Hans; Haugen, Guttorm Nils

    2013-07-12

    In Norway almost all pregnant women attend one routine ultrasound examination. Detection of fetal structural anomalies triggers psychological stress responses in the women affected. Despite the frequent use of ultrasound examination in pregnancy, little attention has been devoted to the psychological response of the expectant father following the detection of fetal anomalies. This is important for later fatherhood and the psychological interaction within the couple. We aimed to describe paternal psychological responses shortly after detection of structural fetal anomalies by ultrasonography, and to compare paternal and maternal responses within the same couple. A prospective observational study was performed at a tertiary referral centre for fetal medicine. Pregnant women with a structural fetal anomaly detected by ultrasound and their partners (study group, n=155) and 100 with normal ultrasound findings (comparison group) were included shortly after sonographic examination (inclusion period: May 2006-February 2009). Gestational age was >12 weeks. We used psychometric questionnaires to assess self-reported social dysfunction, health perception, and psychological distress (intrusion, avoidance, arousal, anxiety, and depression): the Impact of Event Scale, the General Health Questionnaire and the Edinburgh Postnatal Depression Scale. Fetal anomalies were classified according to severity and diagnostic or prognostic ambiguity at the time of assessment. Median (range) gestational age at inclusion in the study and comparison group was 19 (12-38) and 19 (13-22) weeks, respectively. Men and women in the study group had significantly higher levels of psychological distress than men and women in the comparison group on all psychometric endpoints. The lowest level of distress in the study group was associated with the least severe anomalies with no diagnostic or prognostic ambiguity (p < 0.033). Men had lower scores than women on all psychometric outcome variables. The correlation in distress scores between men and women was high in the fetal anomaly group (p < 0.001), but non-significant in the comparison group. Severity of the anomaly, including ambiguity, significantly influenced paternal response. Men reported lower scores on all psychometric outcomes than women. This knowledge may facilitate support for both expectant parents to reduce strain within the family after detection of a fetal anomaly.

  15. Apollo-Soyuz pamphlet no. 4: Gravitational field. [experimental design

    NASA Technical Reports Server (NTRS)

    Page, L. W.; From, T. P.

    1977-01-01

    Two Apollo Soyuz experiments designed to detect gravity anomalies from spacecraft motion are described. The geodynamics experiment (MA-128) measured large-scale gravity anomalies by detecting small accelerations of Apollo in the 222 km orbit, using Doppler tracking from the ATS-6 satellite. Experiment MA-089 measured 300 km anomalies on the earth's surface by detecting minute changes in the separation between Apollo and the docking module. Topics discussed in relation to these experiments include the Doppler effect, gravimeters, and the discovery of mascons on the moon.

  16. Monitoring volcanic thermal activity by Robust Satellite Techniques: achievements and perspectives

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Marchese, F.; Mazzeo, G.; Pergola, N.

    2009-12-01

    Satellite data have been increasingly used in recent decades to study active volcanoes and to monitor variations in thermal activity in the space-time domain. Several satellite techniques and original methods devoted to hotspot detection and thermal monitoring have been developed and tested. Among them, a multi-temporal approach named RST (Robust Satellite Techniques) has shown high performance in detecting hotspots, with a low false-positive rate under different observational and atmospheric conditions, and has also shown potential for detecting the low-level thermal anomalies that may announce incoming eruptions. As the RST scheme is intrinsically exportable to different geographic areas and satellite sensors, it has been applied and tested on a number of volcanoes and in different environmental conditions. This work presents the major results and outcomes of studies carried out on Etna and Stromboli (Italy), Merapi (Java, Indonesia), Asamayama (Japan), and Jebel Al Tair (Yemen) using different satellite systems and sensors (e.g. NOAA-AVHRR, EOS-MODIS, MSG-SEVIRI). Performance on hotspot detection, early warning and real-time monitoring, together with capabilities in possible thermal precursor identification, will be presented and discussed.

  17. Identification of magnetic anomalies based on ground magnetic data analysis using multifractal modeling: a case study in Qoja-Kandi, East Azerbaijan Province, Iran

    NASA Astrophysics Data System (ADS)

    Mansouri, E.; Feizi, F.; Karbalaei Ramezanali, A. A.

    2015-07-01

    Ground magnetic anomaly separation using the reduction-to-the-pole (RTP) technique and the fractal concentration-area (C-A) method has been applied to the Qoja-Kandi prospecting area in NW Iran. The geophysical survey that produced the ground magnetic data was conducted for magnetic element exploration. Firstly, the RTP technique was applied to recognize underground magnetic anomalies, and the RTP anomalies were classified into different populations based on this method. As a result, determining drilling points from the RTP technique alone was complicated. Next, the C-A method was applied to the RTP Magnetic Anomalies (RTP-MA) to demonstrate the magnetic susceptibility concentration. This identification was appropriate for increasing the resolution of the drilling point determination and decreasing the drilling risk, due to the economic costs of underground prospecting. In this study, the results of C-A modeling on the RTP-MA are compared with data from 8 boreholes. The results show that there is a good correlation between the anomalies derived via the C-A method and the log reports of the boreholes. Two boreholes were drilled in the magnetic susceptibility concentration, based on multifractal modeling data analyses, between 63 533.1 and 66 296 nT. Drilling results show appropriate magnetite thickness with grades greater than 20 % total Fe. Also, anomalies associated with andesite units host the iron mineralization.

  18. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
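    The similarity measure at the heart of this approach can be sketched in a few lines (baseline dynamic-programming LCS; the geometric-mean normalization and the best-match outlier score below are common choices assumed here, and the paper's hybrid algorithm is a faster replacement for the same LCS computation):

```python
def lcs_length(a, b):
    """Classic O(len(a)*len(b)) dynamic-programming LCS length,
    using a rolling row to keep memory linear in len(b)."""
    prev = [0] * (len(b) + 1)
    for x in a:
        cur = [0]
        for j, y in enumerate(b, 1):
            cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[j - 1]))
        prev = cur
    return prev[-1]

def normalized_lcs(a, b):
    # Similarity in [0, 1]: LCS length divided by the geometric mean
    # of the two sequence lengths (an assumed normalization).
    if not a or not b:
        return 0.0
    return lcs_length(a, b) / (len(a) * len(b)) ** 0.5

def anomaly_score(seq, cluster):
    # Outlier score: 1 minus the best similarity to any cluster member.
    return 1.0 - max(normalized_lcs(seq, c) for c in cluster)

normal = ["ABCDE", "ABCDF", "ABCEF"]  # toy "normal" symbol sequences
print(anomaly_score("ABCDE", normal))  # 0.0 -> clearly normal
print(anomaly_score("ZZZZZ", normal))  # 1.0 -> clearly anomalous
```

    In practice the pairwise similarities feed an unsupervised clustering step, and sequences far from every cluster are passed to the outlier-analysis stage.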

  19. Prevalence and distribution of dental anomalies in orthodontic patients.

    PubMed

    Montasser, Mona A; Taha, Mahasen

    2012-01-01

    To study the prevalence and distribution of dental anomalies in a sample of orthodontic patients. The dental casts, intraoral photographs, and panoramic and lateral cephalometric radiographs of 509 Egyptian orthodontic patients were studied. Patients were examined for dental anomalies in number, size, shape, position, and structure. The prevalence of each dental anomaly was calculated and compared between sexes. Of the total study sample, 32.6% of the patients had at least one dental anomaly other than agenesis of third molars; 32.1% of females and 33.5% of males had at least one dental anomaly other than agenesis of third molars. The most commonly detected dental anomalies were impaction (12.8%) and ectopic eruption (10.8%). The total prevalence of hypodontia (excluding third molars) and hyperdontia was 2.4% and 2.8%, respectively, with similar distributions in females and males. Gemination and accessory roots were reported in this study; each of these anomalies was detected in 0.2% of patients. In addition to genetic and racial factors, environmental factors could have a more important influence on the prevalence of dental anomalies in every population. Impaction, ectopic eruption, hyperdontia, hypodontia, and microdontia were the most common dental anomalies, while fusion and dentinogenesis imperfecta were absent.

  20. Mining Building Energy Management System Data Using Fuzzy Anomaly Detection and Linguistic Descriptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumidu Wijayasekara; Ondrej Linda; Milos Manic

    Building Energy Management Systems (BEMSs) are essential components of modern buildings that utilize digital control technologies to minimize energy consumption while maintaining high levels of occupant comfort. However, BEMSs can only achieve these energy savings when properly tuned and controlled. Since the indoor environment depends on uncertain criteria such as weather, occupancy, and thermal state, the performance of a BEMS can be sub-optimal at times. Unfortunately, the complexity of the BEMS control mechanism, the large amount of data available, and the inter-relations between the data can make identifying these sub-optimal behaviors difficult. This paper proposes a novel Fuzzy Anomaly Detection and Linguistic Description (Fuzzy-ADLD) based method for improving the understandability of BEMS behavior for improved state-awareness. The presented method is composed of two main parts: 1) detection of anomalous BEMS behavior and 2) linguistic representation of BEMS behavior. The first part utilizes a modified nearest neighbor clustering algorithm and a fuzzy logic rule extraction technique to build a model of normal BEMS behavior. The second part computes the most relevant linguistic description of the identified anomalies. The presented Fuzzy-ADLD method was applied to a real-world BEMS and compared against a traditional alarm-based BEMS. In six different scenarios, the Fuzzy-ADLD method identified anomalous behavior either as fast as or faster (by an hour or more) than the alarm-based BEMS. In addition, the Fuzzy-ADLD method identified cases that were missed by the alarm-based system, demonstrating potential for increased state-awareness of abnormal building behavior.
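    The clustering side of such a model-of-normal-behavior can be sketched as follows (a minimal nearest-neighbor-clustering sketch with a hypothetical radius; the actual Fuzzy-ADLD method adds fuzzy rule extraction and linguistic description on top of this idea):

```python
def euclidean(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

class NNClusterModel:
    """Sketch of nearest-neighbor clustering for anomaly detection:
    a training point within `radius` of an existing prototype joins that
    cluster; otherwise it seeds a new prototype. At test time, a point
    farther than `radius` from every prototype is flagged anomalous."""
    def __init__(self, radius):
        self.radius = radius
        self.prototypes = []

    def fit(self, points):
        for p in points:
            if not self.prototypes or min(
                    euclidean(p, c) for c in self.prototypes) > self.radius:
                self.prototypes.append(p)
        return self

    def is_anomalous(self, p):
        return min(euclidean(p, c) for c in self.prototypes) > self.radius

# Hypothetical normal BEMS operating states (temperature, scaled airflow):
model = NNClusterModel(radius=1.0).fit([(20.0, 0.4), (20.5, 0.5), (21.0, 0.45)])
print(model.is_anomalous((20.4, 0.45)))  # near the normal data -> False
print(model.is_anomalous((28.0, 0.9)))   # far from all prototypes -> True
```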

  1. Spatial and Temporal scales of time-averaged 700 MB height anomalies

    NASA Technical Reports Server (NTRS)

    Gutzler, D.

    1981-01-01

    The monthly and seasonal forecasting technique is based to a large extent on the extrapolation of trends in the positions of the centers of time averaged geopotential height anomalies. The complete forecasted height pattern is subsequently drawn around the forecasted anomaly centers. The efficacy of this technique was tested and time series of observed monthly mean and 5 day mean 700 mb geopotential heights were examined. Autocorrelation statistics are generated to document the tendency for persistence of anomalies. These statistics are compared to a red noise hypothesis to check for evidence of possible preferred time scales of persistence. Space-time spectral analyses at middle latitudes are checked for evidence of periodicities which could be associated with predictable month-to-month trends. A local measure of the average spatial scale of anomalies is devised for guidance in the completion of the anomaly pattern around the forecasted centers.
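    The red-noise comparison behind the persistence statistics can be sketched as follows (assuming an AR(1) null hypothesis whose autocorrelation decays as r(1)**tau, with the lag-1 autocorrelation estimated from the series itself):

```python
def autocorrelation(x, lag):
    """Sample autocorrelation of a series at the given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

def red_noise_fit(x, max_lag):
    """Red-noise (AR(1)) null: r(tau) = r(1) ** tau. Returns (observed,
    expected) autocorrelations for lags 1..max_lag; persistence beyond
    red noise shows up as observed values well above the expected ones."""
    r1 = autocorrelation(x, 1)
    observed = [autocorrelation(x, tau) for tau in range(1, max_lag + 1)]
    expected = [r1 ** tau for tau in range(1, max_lag + 1)]
    return observed, expected

# Toy height-anomaly series (arbitrary units):
x = [1.0, 2.0, 1.5, 2.5, 2.0, 3.0, 2.5, 3.5]
observed, expected = red_noise_fit(x, 3)
print(abs(observed[0] - expected[0]) < 1e-12)  # True: lag 1 matches by construction
```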

  2. Spent nuclear fuel assembly inspection using neutron computed tomography

    NASA Astrophysics Data System (ADS)

    Pope, Chad Lee

    The research presented here focuses on spent nuclear fuel assembly inspection using neutron computed tomography. Experimental measurements involving neutron beam transmission through a spent nuclear fuel assembly serve as benchmark measurements for an MCNP simulation model. Comparison of measured results to simulation results shows good agreement. Generation of tomography images from MCNP tally results was accomplished using adapted versions of built in MATLAB algorithms. Multiple fuel assembly models were examined to provide a broad set of conclusions. Tomography images revealing assembly geometric information including the fuel element lattice structure and missing elements can be obtained using high energy neutrons. A projection difference technique was developed which reveals the substitution of unirradiated fuel elements for irradiated fuel elements, using high energy neutrons. More subtle material differences such as altering the burnup of individual elements can be identified with lower energy neutrons provided the scattered neutron contribution to the image is limited. The research results show that neutron computed tomography can be used to inspect spent nuclear fuel assemblies for the purpose of identifying anomalies such as missing elements or substituted elements. The ability to identify anomalies in spent fuel assemblies can be used to deter diversion of material by increasing the risk of early detection as well as improve reprocessing facility operations by confirming the spent fuel configuration is as expected or allowing segregation if anomalies are detected.
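    The projection difference technique lends itself to a very small sketch (hypothetical transmission values and tolerance; in practice the reference projection would come from an MCNP simulation of the intact assembly):

```python
def projection_difference(measured, reference, tolerance):
    """Sketch of a projection-difference check: subtract the expected
    neutron-transmission projection (reference assembly) from the
    measured one, bin by bin; bins whose difference exceeds `tolerance`
    flag a possible missing or substituted fuel element."""
    return [i for i, (m, r) in enumerate(zip(measured, reference))
            if abs(m - r) > tolerance]

reference = [0.20, 0.21, 0.20, 0.22, 0.21, 0.20]   # expected transmission
measured  = [0.20, 0.21, 0.35, 0.22, 0.21, 0.20]   # bin 2 transmits too much
print(projection_difference(measured, reference, tolerance=0.05))  # [2]
```

    Elevated transmission in a bin suggests missing material along that ray; depressed transmission can indicate substitution by denser material.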

  3. Infrared contrast data analysis method for quantitative measurement and monitoring in flash infrared thermography

    NASA Astrophysics Data System (ADS)

    Koshti, Ajay M.

    2015-04-01

    The paper provides information on a new infrared (IR) image contrast data post-processing method that involves converting raw data to normalized contrast versus time evolutions from the flash infrared thermography inspection video data. Thermal measurement features such as peak contrast, peak contrast time, persistence time, and persistence energy are calculated from the contrast evolutions. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat bottom holes in a test plate of the subject material. The measurement features are used to monitor growth of anomalies and to characterize the void-like anomalies. The method was developed to monitor and analyze void-like anomalies in reinforced carbon-carbon (RCC) materials used on the wing leading edge of the NASA Space Shuttle Orbiters, but it is equally applicable to other materials. The thermal measurement features relate to anomaly characteristics such as depth and size. Calibration of the contrast is used to provide an assessment of the anomaly depth and width, which correspond to the depth and diameter of the equivalent flat bottom hole (EFBH) from the calibration data. An edge detection technique called the half-max is used to measure the width and length of the anomaly. Results of the half-max width and the EFBH diameter are compared with actual widths to evaluate the utility of the IR Contrast method. Some thermal measurements relate to the gap thickness of the delaminations. Results of the IR Contrast method on RCC hardware are provided. Keywords: normalized contrast, flash infrared thermography.
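    Extracting the thermal measurement features from a contrast evolution can be sketched as follows (the persistence definitions below are simplified stand-ins for the paper's, and the half-of-peak threshold is an assumption):

```python
def contrast_features(contrast, frame_period, threshold=0.5):
    """Extract features from a normalized-contrast evolution (one value
    per video frame). Here `persistence time` is taken as the duration
    the contrast stays above `threshold` times the peak, and
    `persistence energy` as the contrast integrated over that window."""
    peak = max(contrast)
    peak_time = contrast.index(peak) * frame_period
    above = [c for c in contrast if c >= threshold * peak]
    return {"peak_contrast": peak,
            "peak_contrast_time": peak_time,
            "persistence_time": len(above) * frame_period,
            "persistence_energy": sum(above) * frame_period}

# Synthetic contrast evolution sampled at 60 frames/s:
evolution = [0.0, 0.1, 0.3, 0.8, 1.0, 0.9, 0.6, 0.4, 0.2, 0.1]
features = contrast_features(evolution, frame_period=1 / 60)
print(features["peak_contrast"])       # 1.0
print(features["peak_contrast_time"])  # peak at frame 4 -> ~0.067 s
```

    Tracking these features across repeated inspections is what allows growth of an anomaly to be monitored over time.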

  4. Classification of the Regional Ionospheric Disturbance Based on Machine Learning Techniques

    NASA Astrophysics Data System (ADS)

    Terzi, Merve Begum; Arikan, Orhan; Karatay, Secil; Arikan, Feza; Gulyaeva, Tamara

    2016-08-01

    In this study, Total Electron Content (TEC) estimated from GPS receivers is used to model the regional and local variability that differs from global activity, along with solar and geomagnetic indices. For the automated classification of regional disturbances, a classification technique based on Support Vector Machines (SVMs), a robust machine learning method that has found widespread use, is proposed. The performance of the developed classification technique is demonstrated for the midlatitude ionosphere over Anatolia using TEC estimates generated from GPS data provided by the Turkish National Permanent GPS Network (TNPGN-Active) for the solar maximum year of 2011. By applying the developed classification technique to Global Ionospheric Map (GIM) TEC data, which is provided by the NASA Jet Propulsion Laboratory (JPL), it is shown that SVM can be a suitable learning method to detect anomalies in TEC variations.

  5. Global Anomaly Detection in Two-Dimensional Symmetry-Protected Topological Phases

    NASA Astrophysics Data System (ADS)

    Bultinck, Nick; Vanhove, Robijn; Haegeman, Jutho; Verstraete, Frank

    2018-04-01

    Edge theories of symmetry-protected topological phases are well known to possess global symmetry anomalies. In this Letter we focus on two-dimensional bosonic phases protected by an on-site symmetry and analyze the corresponding edge anomalies in more detail. Physical interpretations of the anomaly in terms of an obstruction to orbifolding and constructing symmetry-preserving boundaries are connected to the cohomology classification of symmetry-protected phases in two dimensions. Using the tensor network and matrix product state formalism we numerically illustrate our arguments and discuss computational detection schemes to identify symmetry-protected order in a ground state wave function.

  6. Model selection for anomaly detection

    NASA Astrophysics Data System (ADS)

    Burnaev, E.; Erofeev, P.; Smolyakov, D.

    2015-12-01

    Anomaly detection based on one-class classification algorithms is broadly used in many applied domains like image processing (e.g. detecting whether a patient is "cancerous" or "healthy" from a mammography image), network intrusion detection, etc. The performance of an anomaly detection algorithm crucially depends on the kernel used to measure similarity in a feature space. The standard approaches to kernel selection used in two-class classification problems (e.g. cross-validation) cannot be applied directly due to the specific nature of the data (the absence of data from a second, abnormal class). In this paper we generalize several kernel selection methods from the binary-class case to the case of one-class classification and perform an extensive comparison of these approaches using both synthetic and real-world data.
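    The one-class setting can be illustrated with a small sketch (a mean-kernel-similarity score with a leave-one-out threshold is used here as a stand-in for a one-class classifier; the RBF kernel, gamma value, and data are all hypothetical, not the paper's algorithms):

```python
import math

def rbf_kernel(x, y, gamma):
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def novelty_score(x, train, gamma):
    """Mean kernel similarity to the normal training set; low values
    suggest anomalies. A kernel-density-flavored surrogate for a
    one-class classifier."""
    return sum(rbf_kernel(x, t, gamma) for t in train) / len(train)

def fit_threshold(train, gamma, quantile=0.0):
    # With no abnormal class available, the decision threshold must come
    # from the normal data itself: use leave-one-out scores of the
    # training points (quantile 0.0 takes the minimum).
    scores = sorted(novelty_score(t, train[:i] + train[i + 1:], gamma)
                    for i, t in enumerate(train))
    return scores[int(quantile * (len(scores) - 1))]

train = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1)]  # normal class only
gamma = 1.0
threshold = fit_threshold(train, gamma)
print(novelty_score((0.05, 0.05), train, gamma) >= threshold)  # True (inlier)
print(novelty_score((5.0, 5.0), train, gamma) >= threshold)    # False (outlier)
```

    The kernel-selection problem the paper addresses is precisely that both `gamma` and the threshold must be chosen using normal-class data alone.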

  7. Techniques of adrenal venous sampling in patients with inferior vena cava or renal vein anomalies.

    PubMed

    Endo, Kenji; Morita, Satoru; Suzaki, Shingo; Yamazaki, Hiroshi; Nishina, Yu; Sakai, Shuji

    2018-06-01

    To review the techniques and technical success rate of adrenal venous sampling (AVS) in patients with inferior vena cava (IVC) or renal vein anomalies. The techniques and success rates of AVS were retrospectively reviewed in 15 patients with anomalies [8 with a double IVC (dIVC), 3 with a left IVC (ltIVC), 2 with a retroaortic left renal vein (LRV), and 2 with a circumaortic LRV]. Among the 11 patients with IVC anomalies, the success rates for sampling the right and left adrenal veins (RAV and LAV) were 81.8 and 90.9%, respectively. In dIVC, the LAV was selected using the following four methods: approaching through the right IVC from the right femoral vein, flipping the LAV catheter tip in the LRV (n = 4) or the interiliac-communicating vein (n = 1), or approaching through the ltIVC from the right (n = 1) or left (n = 2) femoral vein. Among the four patients with LRV anomalies, the success rate was 100% for each adrenal vein. AVS can be successfully performed in patients with these anomalies. The key to technical success is understanding the venous anatomy based on pre-procedural CT images and choosing appropriate methods.

  8. EMPACT 3D: an advanced EMI discrimination sensor for CONUS and OCONUS applications

    NASA Astrophysics Data System (ADS)

    Keranen, Joe; Miller, Jonathan S.; Schultz, Gregory; Sander-Olhoeft, Morgan; Laudato, Stephen

    2018-04-01

    We recently developed a new, man-portable, electromagnetic induction (EMI) sensor designed to detect and classify small, unexploded sub-munitions and discriminate them from non-hazardous debris. The ability to distinguish innocuous metal clutter from potentially hazardous unexploded ordnance (UXO) and other explosive remnants of war (ERW) before excavation can significantly accelerate land reclamation efforts by eliminating time spent removing harmless scrap metal. The EMI sensor employs a multi-axis transmitter and receiver configuration to produce data sufficient for anomaly discrimination. A real-time data inversion routine produces intrinsic and extrinsic anomaly features describing the polarizability, location, and orientation of the anomaly under test. We discuss data acquisition and post-processing software development, and results from laboratory and field tests demonstrating the discrimination capability of the system. Data acquisition and real-time processing emphasize ease-of-use, quality control (QC), and display of discrimination results. Integration of the QC and discrimination methods into the data acquisition software reduces the time required between sensor data collection and the final anomaly discrimination result. The system supports multiple concepts of operations (CONOPs) including: 1) a non-GPS cued configuration in which detected anomalies are discriminated and excavated immediately following the anomaly survey; 2) GPS integration to survey multiple anomalies to produce a prioritized dig list with global anomaly locations; and 3) a dynamic mapping configuration supporting detection followed by discrimination and excavation of targets of interest.

  9. Innovative Sensors for Pipeline Crawlers: Rotating Permanent Magnet Inspection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. Bruce Nestleroth; Richard J. Davis; Stephanie Flamberg

    2006-09-30

    Internal inspection of pipelines is an important tool for ensuring safe and reliable delivery of fossil energy products. Current inspection systems that are propelled through the pipeline by the product flow cannot be used to inspect all pipelines because of the various physical barriers they may encounter. To facilitate inspection of these "unpiggable" pipelines, recent inspection development efforts have focused on a new generation of powered inspection platforms that are able to crawl slowly inside a pipeline and can maneuver past the physical barriers that limit internal inspection applicability, such as bore restrictions, low product flow rate, and low pressure. The first step in this research was to review existing inspection technologies for applicability and compatibility with crawler systems. Most existing inspection technologies, including magnetic flux leakage and ultrasonic methods, had significant implementation limitations including mass, physical size, inspection energy coupling requirements and technology maturity. The remote field technique was the most promising but power consumption was high and anomaly signals were low, requiring sensitive detectors and electronics. After reviewing each inspection technology, it was decided to investigate the potential for a new inspection method. The new inspection method takes advantage of advances in permanent magnet strength, along with their wide availability and low cost. Called rotating permanent magnet inspection (RPMI), this patent pending technology employs pairs of permanent magnets rotating around the central axis of a cylinder to induce high current densities in the material under inspection. Anomalies and wall thickness variations are detected with an array of sensors that measure local changes in the magnetic field produced by the induced current flowing in the material.
This inspection method is an alternative to the common concentric coil remote field technique that induces low-frequency eddy currents in ferromagnetic pipes and tubes. Since this is a new inspection method, both theory and experiment were used to determine fundamental capabilities and limitations. Fundamental finite element modeling analysis and experimental investigations performed during this development have led to the derivation of a first order analytical equation for designing rotating magnetizers to induce current and positioning sensors to record signals from anomalies. Experimental results confirm the analytical equation and the finite element calculations provide a firm basis for the design of RPMI systems. Experimental results have shown that metal loss anomalies and wall thickness variations can be detected with an array of sensors that measure local changes in the magnetic field produced by the induced current flowing in the material. The design exploits the phenomenon that circumferential currents are easily detectable at distances well away from the magnets. Current changes at anomalies were detectable with commercial low cost Hall Effect sensors. Commercial analog to digital converters can be used to measure the sensor output and data analysis can be performed in real time using PC computer systems. The technology was successfully demonstrated during two blind benchmark tests where numerous metal loss defects were detected. For this inspection technology, the detection threshold is a function of wall thickness and corrosion depth. For thinner materials, the detection threshold was experimentally shown to be comparable to magnetic flux leakage. For wall thicknesses greater than three tenths of an inch, the detection threshold increases with wall thickness. The potential for metal loss anomaly sizing was demonstrated in the second benchmarking study, again with accuracy comparable to existing magnetic flux leakage technologies. 
The rotating permanent magnet system has the potential for inspecting unpiggable pipelines since the magnetizer configurations can be sufficiently small with respect to the bore of the pipe to pass obstructions that limit the application of many inspection technologies. Also, since the largest dimension of the Hall Effect sensor is two tenths of an inch, the sensor packages can be small, flexible and light. The power consumption, on the order of ten watts, is low compared to some inspection systems; this would enable autonomous systems to inspect longer distances between charges. This project showed there are no technical barriers to building a field ready unit that can pass through narrow obstructions, such as plug valves. The next step in project implementation is to build a field ready unit that can begin to establish optimal performance capabilities including detection thresholds, sizing capability, and wall thickness limitations.

  10. Crustal modeling of the central part of the Northern Western Desert, Egypt using gravity data

    NASA Astrophysics Data System (ADS)

    Alrefaee, H. A.

    2017-05-01

    The Bouguer anomaly map of the central part of the Northern Western Desert, Egypt, was used to construct six 2D gravity models to investigate the nature, physical properties, and structures of the crust and upper mantle. The crustal models were constrained and constructed by integrating results from different geophysical techniques and available geological information. The depth to the basement surface from eight wells across the study area, the depths to the Conrad and Moho interfaces, and the physical properties of the sediments, basement, crust, and upper mantle from previous petrophysical and crustal studies were used to establish the gravity models. The Euler deconvolution technique was applied to the Bouguer anomaly map to detect the subsurface fault trends. Edge detection techniques were calculated to outline the boundaries of subsurface structural features. A basement structural map was interpreted to reveal the subsurface structural setting of the area. The crustal models reveal an increase of the gravity field from south to north due to northward thinning of the crust. The models also reveal a deformed and rugged basement surface whose depth increases northward from 1.6 km to 6 km. In contrast to the basement, the Conrad and Moho interfaces are nearly flat and become shallower northward: the depth to the Conrad interface (the thickness of the upper crust) ranges from 18 km to 21 km, while the depth to the Moho (the crustal thickness) ranges from 31.5 km to 34 km. The crust beneath the study area is normal continental crust with obvious thinning toward the continental margin at the Mediterranean coast.
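    The idea behind gradient-based edge detection on gravity data can be shown with a 1-D sketch (a central-difference horizontal gradient over a synthetic Bouguer profile; real edge-detection techniques such as the tilt derivative operate on 2-D grids):

```python
def horizontal_gradient(profile, spacing):
    """Central-difference horizontal gradient of a Bouguer anomaly
    profile (e.g. mGal/km). Gradient maxima mark edges of lateral
    density contrasts such as faults or basement contacts."""
    return [(profile[i + 1] - profile[i - 1]) / (2 * spacing)
            for i in range(1, len(profile) - 1)]

def edge_positions(profile, spacing):
    grad = [abs(g) for g in horizontal_gradient(profile, spacing)]
    # Local maxima of gradient magnitude approximate the edge locations.
    return [i + 1 for i in range(1, len(grad) - 1)
            if grad[i] > grad[i - 1] and grad[i] > grad[i + 1]]

# Synthetic profile with a step in the anomaly centered at index 5,
# as produced by a buried contact between two density blocks:
profile = [10.0, 10.0, 10.0, 10.2, 11.0, 15.0, 19.0, 19.8, 20.0, 20.0, 20.0]
print(edge_positions(profile, spacing=1.0))  # [5] -> edge over the contact
```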

  11. SmartMal: a service-oriented behavioral malware detection framework for mobile devices.

    PubMed

    Wang, Chao; Wu, Zhizhong; Li, Xi; Zhou, Xuehai; Wang, Aili; Hung, Patrick C K

    2014-01-01

    This paper presents SmartMal--a novel service-oriented behavioral malware detection framework for vehicular and mobile devices. The highlight of SmartMal is to introduce service-oriented architecture (SOA) concepts and behavior analysis into the malware detection paradigms. The proposed framework relies on a client-server architecture: the client continuously extracts various features and transfers them to the server, and the server's main task is to detect anomalies using state-of-the-art detection algorithms. Multiple distributed servers simultaneously analyze the feature vector using various detectors, and information fusion is used to concatenate the results of the detectors. We also propose a cycle-based statistical approach for mobile device anomaly detection. We accomplish this by analyzing the users' regular usage patterns. Empirical results suggest that the proposed framework and the novel anomaly detection algorithm are highly effective in detecting malware on Android devices.
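    A cycle-based statistical check on usage patterns can be sketched as follows (a simplified illustration of the idea, not the paper's exact algorithm: per-slot statistics over a daily cycle, with a hypothetical 3-sigma rule):

```python
import statistics

def cycle_profile(history, period):
    """Learn per-slot mean and standard deviation of usage over a cycle
    (e.g. activity per time-of-day slot, pooled across days)."""
    buckets = [[] for _ in range(period)]
    for i, u in enumerate(history):
        buckets[i % period].append(u)
    means = [statistics.mean(b) for b in buckets]
    stds = [statistics.stdev(b) if len(b) > 1 else 0.0 for b in buckets]
    return means, stds

def cycle_anomalies(day, means, stds, k=3.0):
    # Flag slots deviating more than k sigma from the learned profile
    # for that point in the cycle.
    return [i for i, u in enumerate(day)
            if stds[i] > 0 and abs(u - means[i]) > k * stds[i]]

# Three days of history, 4 slots per day (quiet, busy, busy, quiet):
history = [1.0, 5.0, 5.2, 1.1,
           0.9, 5.1, 5.0, 1.0,
           1.1, 4.9, 5.1, 0.9]
means, stds = cycle_profile(history, period=4)
print(cycle_anomalies([1.0, 5.0, 9.0, 1.0], means, stds))  # [2]
```

    Comparing each sample against the statistics of its own slot in the cycle is what lets normal busy periods pass while genuinely unusual activity in those same periods is flagged.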

  13. Detection of spatio-temporal change of ocean acoustic velocity for observing seafloor crustal deformation applying seismological methods

    NASA Astrophysics Data System (ADS)

    Eto, S.; Nagai, S.; Tadokoro, K.

    2011-12-01

    Our group has developed a system for observing seafloor crustal deformation that combines acoustic ranging and kinematic GPS positioning techniques. One of the effective ways to reduce the estimation error of the submarine benchmark positions in our system is modeling the variation of ocean acoustic velocity. We estimated various 1-dimensional velocity models with depth under some constraints, because our simple acquisition procedure for acoustic ranging data makes it difficult to estimate a 3-dimensional acoustic velocity structure including temporal change. We then applied the joint hypocenter determination method from seismology [Kissling et al., 1994] to the acoustic ranging data. We assume two conditions as constraints in the inversion procedure: 1) the acoustic velocity is fixed in the deeper part, because it is usually stable in both space and time, and 2) each inverted velocity model should decrease with depth. We found two remarkable spatio-temporal changes of acoustic velocity: 1) variations of travel-time residuals at the same points within a short time, and 2) larger differences between residuals at neighboring points derived from travel times to different benchmarks. The first result cannot be explained only by the effect of atmospheric condition changes, including heating by sunlight. To verify the residual variations mentioned as the second result, we have performed forward modeling of acoustic ranging data with velocity models to which velocity anomalies were added. We calculate travel times with a pseudo-bending ray tracing method [Um and Thurber, 1987] to examine the effects of a velocity anomaly on the travel-time differences. Comparison between these residuals and the travel-time differences in the forward modeling shows that velocity anomaly bodies at shallower depth can produce these anomalous residuals, which may indicate moving water bodies.
We need to apply an acoustic velocity structure model with velocity anomalies in the acoustic ranging data analysis and/or to develop a new system with a large number of sea surface stations to detect them, which may reduce the error of the seafloor benchmark positions.

  14. The oxygen isotope partition function ratio of water and the structure of liquid water

    USGS Publications Warehouse

    O'Neil, J.R.; Adami, L.H.

    1969-01-01

    By means of the CO2-equilibration technique, the temperature dependence and absolute values of the oxygen isotope partition function ratio of liquid water have been determined, often at 1° intervals, from -2 to 85°. A linear relationship between ln (Q2/Q1) (H2O) and T^-1 was obtained that is explicable in terms of the Bigeleisen-Mayer theory of isotopic fractionation. The data are incompatible with conventional, multicomponent mixture models of water because liquid water behaves isotopically as a singly structured homogeneous substance over the entire temperature range studied. A two-species model of water is proposed in which approximately 30% of the hydrogen bonds in ice are broken on melting at 0° and in which this per cent of monomer changes by only a small amount over the entire liquid range. Because of the high precision and the fundamental property determined, the isotopic fractionation technique is particularly well suited to the detection of thermal anomalies. No anomalies were observed and those previously reported are ascribed to under-estimates of experimental error.
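    The linearity claim and the residual-based anomaly check can be illustrated numerically (synthetic, exactly linear data with an assumed slope; real measurements would add scatter, and a thermal anomaly would appear as a systematic residual):

```python
def linear_fit(xs, ys):
    """Ordinary least squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical fractionation data: ln(Q2/Q1) taken as linear in 1/T (K).
temps_c = [0, 20, 40, 60, 80]
inv_t = [1.0 / (t + 273.15) for t in temps_c]
ln_q = [0.0040 * x for x in inv_t]  # synthetic, exactly linear (assumed slope)
a, b = linear_fit(inv_t, ln_q)
residuals = [y - (a * x + b) for x, y in zip(inv_t, ln_q)]
# No systematic residual here -> no thermal anomaly detected.
print(max(abs(r) for r in residuals) < 1e-12)  # True
```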

  15. Unsupervised Approaches for Post-Processing in Computationally Efficient Waveform-Similarity-Based Earthquake Detection

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Yoon, C. E.; O'Reilly, O. J.; Beroza, G. C.

    2015-12-01

    Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
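    The similarity-graph view of detector output can be sketched in a few lines (thresholded edges, connected components via union-find, and summed edge weight as a confidence score; a simplified sketch of the graph-based post-processing idea, not the authors' exact pipeline):

```python
def connected_components(n, edges):
    """Union-find over candidate-event vertices with path halving."""
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i, j, _ in edges:
        parent[find(i)] = find(j)
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

def score_candidates(n, similarities, threshold):
    """Keep edges above `threshold`, group candidates by connected
    component, and use summed edge weight as a confidence score:
    isolated, low-scoring candidates are likely false detections."""
    edges = [(i, j, w) for i, j, w in similarities if w >= threshold]
    confidence = [0.0] * n
    for i, j, w in edges:
        confidence[i] += w
        confidence[j] += w
    return connected_components(n, edges), confidence

# 5 candidate detections; 0, 1, 2 are mutually similar repeating events.
sims = [(0, 1, 0.9), (1, 2, 0.8), (0, 2, 0.7), (3, 4, 0.1)]
groups, conf = score_candidates(5, sims, threshold=0.5)
print(sorted(map(sorted, groups)))  # [[0, 1, 2], [3], [4]]
print(conf[0] > conf[3])            # True: candidate 0 is well supported
```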

  16. Digital image processing for information extraction.

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1973-01-01

    The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.

  17. Identification of magnetic anomalies based on ground magnetic data analysis using multifractal modelling: a case study in Qoja-Kandi, East Azerbaijan Province, Iran

    NASA Astrophysics Data System (ADS)

    Mansouri, E.; Feizi, F.; Karbalaei Ramezanali, A. A.

    2015-10-01

    Ground magnetic anomaly separation using the reduction-to-the-pole (RTP) technique and the fractal concentration-area (C-A) method has been applied to the Qoja-Kandi prospecting area in northwestern Iran. The geophysical survey resulting in the ground magnetic data was conducted for magnetic element exploration. Firstly, the RTP technique was applied to recognize underground magnetic anomalies. RTP anomalies were classified into different populations based on the current method. For this reason, determining the drilling point area from the RTP technique alone was complicated for the magnetic anomalies in the center and north of the studied area. Next, the C-A method was applied to the RTP magnetic anomalies (RTP-MA) to delineate magnetic susceptibility concentrations. This identification was appropriate for increasing the resolution of drilling point area determination and decreasing drilling risk, given the economic costs of underground prospecting. In this study, the results of C-A modelling on the RTP-MA are compared with data from 8 boreholes. The results show a good correlation between anomalies derived via the C-A method and the borehole log reports. Two boreholes were drilled in magnetic susceptibility concentrations, based on multifractal modelling data analyses, between 63 533.1 and 66 296 nT. Drilling results showed appropriate magnetite thickness with grades greater than 20 % Fe. The anomalies are associated with andesite units that host the iron mineralization.

  18. A Testbed for Data Fusion for Engine Diagnostics and Prognostics1

    DTIC Science & Technology

    2002-03-01

    detected; too late to be useful for prognostics development. Table 1. Table of acronyms ACRONYM MEANING AD Anomaly detector...strictly defined points. Determining where we are on the engine health curve is the first step in prognostics. Fault detection/diagnostic reasoning... Detection As described above, the ability of the monitoring system to detect an anomaly is especially important for knowledge-based systems, i.e.,

  19. Integrated Multivariate Health Monitoring System for Helicopters Main Rotor Drives: Development and Validation with In-Service Data

    DTIC Science & Technology

    2014-10-02

    potential advantages of using multivariate classification/discrimination/anomaly detection methods on real world accelerometric condition monitoring ...case of false anomaly reports. A possible explanation of this phenomenon could be given 8 ANNUAL CONFERENCE OF THE PROGNOSTICS AND HEALTH MANAGEMENT...of those helicopters. 1. Anomaly detection by means of a self-learning Shewhart control chart. A problem highlighted by the experts of AgustaWestland

  20. An Overview on Prenatal Screening for Chromosomal Aberrations.

    PubMed

    Hixson, Lucas; Goel, Srishti; Schuber, Paul; Faltas, Vanessa; Lee, Jessica; Narayakkadan, Anjali; Leung, Ho; Osborne, Jim

    2015-10-01

    This article is a review of current and emerging methods used for prenatal detection of chromosomal aneuploidies. Chromosomal anomalies in the developing fetus can occur in any pregnancy and lead to death prior to or shortly after birth or to costly lifelong disabilities. Early detection of fetal chromosomal aneuploidies, an atypical number of certain chromosomes, can help parents evaluate their pregnancy options. Current screening methods include maternal serum sampling and nuchal translucency testing, which are minimally invasive but lack sensitivity and specificity. The gold standard, karyotyping, requires amniocentesis or chorionic villus sampling, which are highly invasive and carry a risk of miscarriage. In addition, many of these methods have long turnaround times, which can cause anxiety in mothers. Next-generation sequencing of fetal DNA in maternal blood enables minimally invasive, sensitive, and reasonably rapid analysis of fetal chromosomal anomalies and can be of clinical utility to parents. This review covers traditional methods and next-generation sequencing techniques for diagnosing aneuploidies in terms of clinical utility, technological characteristics, and market potential. © 2015 Society for Laboratory Automation and Screening.
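
    As a rough illustration of how sequencing-based screening can flag an aneuploidy, one common statistic (not necessarily the one used in the methods reviewed here) compares the fraction of reads mapping to a chromosome against its distribution in euploid reference samples; every number below is an invented example:

```python
import numpy as np

# Fraction of sequencing reads mapping to chromosome 21 in hypothetical
# euploid reference pregnancies, and in one hypothetical test sample.
ref_chr21_fraction = np.array([0.0130, 0.0131, 0.0129, 0.0132, 0.0130, 0.0128])
sample_fraction = 0.0138

# A z-score against the reference distribution; values above ~3 are a
# typical flag for trisomy 21 in this style of analysis.
z = (sample_fraction - ref_chr21_fraction.mean()) / ref_chr21_fraction.std(ddof=1)
print(round(z, 2))  # → 5.66
```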

  1. Detecting ship targets in spaceborne infrared image based on modeling radiation anomalies

    NASA Astrophysics Data System (ADS)

    Wang, Haibo; Zou, Zhengxia; Shi, Zhenwei; Li, Bo

    2017-09-01

    Using infrared imaging sensors to detect ship targets in the ocean environment has many advantages compared to other sensor modalities, such as better thermal sensitivity and all-weather detection capability. We propose a new ship detection method for spaceborne infrared images based on modeling radiation anomalies. The proposed method can be decomposed into two stages. In the first stage, a test infrared image is densely divided into a set of image patches and the radiation anomaly of each patch is estimated by a Gaussian Mixture Model (GMM); target candidates are then obtained from the anomalous patches. In the second stage, target candidates are further checked by a more discriminative criterion to obtain the final detection result. The main innovation of the proposed method is inspired by the biological mechanism that human eyes are sensitive to unusual and anomalous patches against a complex background. Experimental results on the short wavelength infrared band (1.560 - 2.300 μm) and long wavelength infrared band (10.30 - 12.50 μm) of the Landsat-8 satellite show that the proposed method achieves the desired ship detection accuracy with higher recall than other classical ship detection methods.
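
    The first stage, scoring patches against a background GMM, can be sketched as follows. The two-feature patch representation, the component count, and the percentile threshold are illustrative assumptions, with scikit-learn standing in for the paper's implementation:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical patch features (mean radiance, radiance std) for background patches.
background = rng.normal(loc=[100.0, 5.0], scale=[3.0, 0.5], size=(500, 2))
test_patches = np.vstack([background[:5], [[160.0, 20.0]]])  # last patch is a hot target

# Fit a background GMM, then score patches by their log-likelihood under it.
gmm = GaussianMixture(n_components=2, random_state=0).fit(background)
log_lik = gmm.score_samples(test_patches)

# Patches that are very unlikely under the background model are target candidates.
threshold = np.percentile(gmm.score_samples(background), 1)
candidates = np.where(log_lik < threshold)[0]
print(candidates)
```

    The second, more discriminative check would then run only on the few indices in `candidates`.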

  2. Analysis of SSEM Sensor Data Using BEAM

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Park, Han; James, Mark

    2004-01-01

    A report describes analysis of space shuttle main engine (SSME) sensor data using Beacon-based Exception Analysis for Multimissions (BEAM) [NASA Tech Briefs articles, the two most relevant being Beacon-Based Exception Analysis for Multimissions (NPO- 20827), Vol. 26, No.9 (September 2002), page 32 and Integrated Formulation of Beacon-Based Exception Analysis for Multimissions (NPO- 21126), Vol. 27, No. 3 (March 2003), page 74] for automated detection of anomalies. A specific implementation of BEAM, using the Dynamical Invariant Anomaly Detector (DIAD), is used to find anomalies commonly encountered during SSME ground test firings. The DIAD detects anomalies by computing coefficients of an autoregressive model and comparing them to expected values extracted from previous training data. The DIAD was trained using nominal SSME test-firing data. DIAD detected all the major anomalies including blade failures, frozen sense lines, and deactivated sensors. The DIAD was particularly sensitive to anomalies caused by faulty sensors and unexpected transients. The system offers a way to reduce SSME analysis time and cost by automatically indicating specific time periods, signals, and features contributing to each anomaly. The software described here executes on a standard workstation and delivers analyses in seconds, a computing time comparable to or faster than the test duration itself, offering potential for real-time analysis.
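
    The DIAD idea of comparing autoregressive-model coefficients against values learned from nominal data can be sketched as below; the signals, the AR order, and the plain least-squares fit are illustrative assumptions, not the BEAM implementation:

```python
import numpy as np

def ar_coeffs(x, order=4):
    """Least-squares AR fit: x[t] ~ sum_k a[k] * x[t-1-k]."""
    X = np.column_stack([x[order - 1 - k : len(x) - 1 - k] for k in range(order)])
    a, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    return a

rng = np.random.default_rng(1)
t = np.arange(4000)
signal = np.sin(0.05 * t)
nominal = signal + 0.02 * rng.normal(size=t.size)   # training firing
repeat = signal + 0.02 * rng.normal(size=t.size)    # another nominal firing
faulty = signal + 1.0 * rng.normal(size=t.size)     # noisy, failing sensor channel

# Flag a channel when its AR coefficients drift far from the trained values.
a_train = ar_coeffs(nominal)
d_nominal = np.linalg.norm(ar_coeffs(repeat) - a_train)
d_faulty = np.linalg.norm(ar_coeffs(faulty) - a_train)
print(d_faulty > d_nominal)
```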

  3. Intelligent system for a remote diagnosis of a photovoltaic solar power plant

    NASA Astrophysics Data System (ADS)

    Sanz-Bobi, M. A.; Muñoz San Roque, A.; de Marcos, A.; Bada, M.

    2012-05-01

    Usually small and mid-sized photovoltaic solar power plants are located in rural areas and typically they operate unattended. Some technicians are in charge of the supervision of these plants and, if an alarm is automatically issued, they try to investigate the problem and correct it. Sometimes these anomalies are detected some hours or days after they begin. Also the analysis of the causes once the anomaly is detected can take some additional time. All these factors motivated the development of a methodology able to perform continuous and automatic monitoring of the basic parameters of a photovoltaic solar power plant in order to detect anomalies as soon as possible, to diagnose their causes, and to immediately inform the personnel in charge of the plant. The methodology proposed starts from the study of the most significant failure modes of a photovoltaic plant through a FMEA and using this information, its typical performance is characterized by the creation of its normal behaviour models. They are used to detect the presence of a failure in an incipient or current form. Once an anomaly is detected, an automatic and intelligent diagnosis process is started in order to investigate the possible causes. The paper will describe the main features of a software tool able to detect anomalies and to diagnose them in a photovoltaic solar power plant.

  4. Method and system for monitoring environmental conditions

    DOEpatents

    Kulesz, James J [Oak Ridge, TN; Lee, Ronald W [Oak Ridge, TN

    2010-11-16

    A system for detecting the occurrence of anomalies includes a plurality of spaced apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. At least one software agent is capable of changing the operation of at least one of the controllers in response to the detection of an anomaly by a sensor.

  5. Method for locating underground anomalies by diffraction of electromagnetic waves passing between spaced boreholes

    DOEpatents

    Lytle, R. Jeffrey; Lager, Darrel L.; Laine, Edwin F.; Davis, Donald T.

    1979-01-01

    Underground anomalies or discontinuities, such as holes, tunnels, and caverns, are located by lowering an electromagnetic signal transmitting antenna down one borehole and a receiving antenna down another, the ground to be surveyed for anomalies being situated between the boreholes. Electronic transmitting and receiving equipment associated with the antennas is activated and the antennas are lowered in unison at the same rate down their respective boreholes a plurality of times, each time with the receiving antenna at a different level with respect to the transmitting antenna. The transmitted electromagnetic waves diffract at each edge of an anomaly. This causes minimal signal reception at the receiving antenna. Triangulation of the straight lines between the antennas for the depths at which the signal minimums are detected precisely locates the anomaly. Alternatively, phase shifts of the transmitted waves may be detected to locate an anomaly, the phase shift being distinctive for the waves directed at the anomaly.

  6. Genetic algorithm for TEC seismo-ionospheric anomalies detection around the time of the Solomon (Mw = 8.0) earthquake of 06 February 2013

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-08-01

    On 6 February 2013, at 12:12:27 local time (01:12:27 UTC) a seismic event registering Mw 8.0 struck the Solomon Islands, located at the boundaries of the Australian and Pacific tectonic plates. Time series prediction is an important and widely interesting topic in the research of earthquake precursors. This paper describes a new computational intelligence approach to detect the unusual variations of the total electron content (TEC) seismo-ionospheric anomalies induced by the powerful Solomon earthquake using genetic algorithm (GA). The GA detected a considerable number of anomalous occurrences on earthquake day and also 7 and 8 days prior to the earthquake in a period of high geomagnetic activities. In this study, also the detected TEC anomalies using the proposed method are compared to the results dealing with the observed TEC anomalies by applying the mean, median, wavelet, Kalman filter, ARIMA, neural network and support vector machine methods. The accordance in the final results of all eight methods is a convincing indication for the efficiency of the GA method. It indicates that GA can be an appropriate non-parametric tool for anomaly detection in a non linear time series showing the seismo-ionospheric precursors variations.

  7. A study of the effect of seasonal climatic factors on the electrical resistivity response of three experimental graves

    NASA Astrophysics Data System (ADS)

    Jervis, John R.; Pringle, Jamie K.

    2014-09-01

    Electrical resistivity surveys have proven useful for locating clandestine graves in a number of forensic searches. However, some aspects of grave detection with resistivity surveys remain imperfectly understood. One such aspect is the effect of seasonal changes in climate on the resistivity response of graves. In this study, resistivity survey data collected over three years over three simulated graves were analysed in order to assess how the graves' resistivity anomalies varied seasonally and when they could most easily be detected. Thresholds were used to identify anomalies, and the ‘residual volume' of grave-related anomalies was calculated as the area bounded by the relevant thresholds multiplied by the anomaly's average value above the threshold. The residual volume of a resistivity anomaly associated with a buried pig cadaver showed evidence of repeating annual patterns and was moderately correlated with the soil moisture budget. This anomaly was easiest to detect between January and April each year, after prolonged periods of high net gain in soil moisture. The resistivity response of a wrapped cadaver was more complex, although it also showed evidence of seasonal variation during the third year after burial. We suggest that the observed variation in the graves' resistivity anomalies was caused by seasonal change in survey data noise levels, which was in turn influenced by the soil moisture budget. It is possible that similar variations occur elsewhere for sites with seasonal climate variations and this could affect successful detection of other subsurface features. Further research to investigate how different climates and soil types affect seasonal variation in grave-related resistivity anomalies would be useful.
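
    One reading of the 'residual volume' definition above (anomaly area multiplied by the average exceedance over the threshold) can be computed directly; the grid values, threshold, and cell size are invented for illustration:

```python
import numpy as np

# Hypothetical gridded resistivity values (ohm·m) over a simulated grave.
grid = np.array([
    [50, 52, 51, 50],
    [51, 70, 75, 52],
    [50, 72, 68, 51],
    [52, 51, 50, 50],
], dtype=float)

threshold = 60.0
mask = grid > threshold
cell_area = 0.25  # m^2 per grid cell (0.5 m spacing), an assumed survey geometry

# 'Residual volume' = area bounded by the threshold x average value above it.
area = mask.sum() * cell_area
avg_above = (grid[mask] - threshold).mean()
residual_volume = area * avg_above
print(residual_volume)  # → 11.25
```

    Tracking this quantity survey-by-survey is what lets the seasonal pattern in the grave-related anomaly be quantified.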

  8. Techniques for interpretation of geoid anomalies

    NASA Technical Reports Server (NTRS)

    Chapman, M. E.

    1979-01-01

    For purposes of geological interpretation, techniques are developed to compute directly the geoid anomaly over models of density within the earth. Ideal bodies such as line segments, vertical sheets, and rectangles are first used to calculate the geoid anomaly. Realistic bodies are modeled with formulas for two-dimensional polygons and three-dimensional polyhedra. By using Fourier transform methods the two-dimensional geoid is seen to be a filtered version of the gravity field, in which the long-wavelength components are magnified and the short-wavelength components diminished.
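
    The "filtered version of the gravity field" statement can be illustrated numerically. A standard flat-earth, wavenumber-domain approximation (an assumption here, not taken from the paper) is geoid_hat(k) = gravity_hat(k) / (gamma * |k|), which magnifies long wavelengths and diminishes short ones:

```python
import numpy as np

gamma = 9.81                 # normal gravity, m/s^2
L, n = 2_000_000.0, 4096     # profile length (m) and sample count
x = np.linspace(0.0, L, n, endpoint=False)

# Synthetic gravity anomaly: equal-amplitude 1000 km and 50 km components.
grav = 1e-4 * (np.sin(2 * np.pi * x / 1_000_000) + np.sin(2 * np.pi * x / 50_000))

k = 2 * np.pi * np.fft.rfftfreq(n, d=L / n)
G = np.fft.rfft(grav)
N = np.zeros_like(G)
N[1:] = G[1:] / (gamma * k[1:])   # skip k = 0: the mean geoid level is undefined

# Bins 2 and 40 hold the 1000 km and 50 km components; in the geoid the
# long-wavelength part is amplified by the wavelength ratio (20x).
ratio = np.abs(N[2]) / np.abs(N[40])
print(round(ratio, 2))  # → 20.0
```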

  9. Gravity anomaly detection: Apollo/Soyuz

    NASA Technical Reports Server (NTRS)

    Vonbun, F. O.; Kahn, W. D.; Bryan, J. W.; Schmid, P. E.; Wells, W. T.; Conrad, D. T.

    1976-01-01

    The Goddard Apollo-Soyuz Geodynamics Experiment is described. It was performed to demonstrate the feasibility of tracking and recovering high frequency components of the earth's gravity field by utilizing a synchronous orbiting tracking station such as ATS-6. Gravity anomalies of 5 mgal or larger having wavelengths of 300 to 1000 kilometers on the earth's surface are important for geologic studies of the upper layers of the earth's crust. Short wavelength gravity anomalies were detected from space. Two prime areas of data collection were selected for the experiment: (1) the center of the African continent and (2) the Indian Ocean Depression centered at 5° north latitude and 75° east longitude. Preliminary results show that the detectability objective of the experiment was met in both areas as well as at several additional anomalous areas around the globe. Gravity anomalies of the Karakoram and Himalayan mountain ranges, ocean trenches, as well as the Diamantina Depth, can be seen. Maps outlining the anomalies discovered are shown.


  10. Application of filtering techniques in preprocessing magnetic data

    NASA Astrophysics Data System (ADS)

    Liu, Haijun; Yi, Yongping; Yang, Hongxia; Hu, Guochuang; Liu, Guoming

    2010-08-01

    High precision magnetic exploration is a popular geophysical technique because of its simplicity and effectiveness. Interpretation in high precision magnetic exploration is always difficult because of noise and disturbance factors, so an effective preprocessing method is needed to remove the influence of interference before further processing. The common way to do this is filtering, and many filtering methods exist. In this paper we introduce in detail three popular filtering techniques: regularized filtering, sliding-average filtering, and compensation smoothing filtering. We then designed the workflow of a filtering program based on these techniques and implemented it in DELPHI. To verify it, we applied it to preprocess magnetic data from a site in China. Comparing the initial contour map with the filtered contour map clearly shows the effectiveness of our program: the filtered contour map is smooth, and the high-frequency parts of the data have been removed. After filtering, we separated useful signals from noise, minor anomalies from major anomalies, and local anomalies from regional anomalies, making it easy to focus on the useful information. Our program can be used to preprocess magnetic data, and the results demonstrate its effectiveness.
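
    Of the three techniques, sliding-average filtering is the simplest to sketch: a centered moving average suppresses high-frequency disturbance while preserving the broad (regional) anomaly. The synthetic profile and window length below are illustrative assumptions:

```python
import numpy as np

def sliding_average(profile, window=5):
    """Centered moving-average filter; suppresses high-frequency noise."""
    kernel = np.ones(window) / window
    return np.convolve(profile, kernel, mode="same")

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 400)
regional = 200.0 * np.exp(-((x - 5.0) ** 2))     # broad regional anomaly, nT
noise = rng.normal(scale=15.0, size=x.size)      # high-frequency disturbance
filtered = sliding_average(regional + noise, window=21)

# The filtered profile should track the regional anomaly more closely than the raw one.
raw_err = np.abs(noise).mean()
flt_err = np.abs(filtered - regional).mean()
print(flt_err < raw_err)
```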

  11. Designing a holistic end-to-end intelligent network analysis and security platform

    NASA Astrophysics Data System (ADS)

    Alzahrani, M.

    2018-03-01

    A firewall protects a network from outside attacks; however, once an attack enters the network, it is difficult to detect. Significant incidents have occurred recently: millions of Yahoo email accounts were stolen, and crucial data from institutions are being held for ransom. For two years, Yahoo's system administrators were unaware that there were intruders inside the network. This happened due to the lack of intelligent tools for monitoring user behaviour in the internal network. This paper discusses the design of an intelligent anomaly/malware detection system with proper proactive actions. The aim is to equip the system administrator with a proper tool to battle insider attackers. The proposed system adopts machine learning to analyse users' behaviour through the runtime behaviour of each node in the network. The machine learning techniques include deep learning, an evolving machine learning perceptron, a hybrid of neural networks and fuzzy logic, as well as predictive memory techniques. The proposed system is extended to deal with larger networks using agent techniques.

  12. Computed tomography and magnetic resonance angiography in the evaluation of aberrant origin of the external carotid artery branches.

    PubMed

    Cappabianca, Salvatore; Scuotto, Assunta; Iaselli, Francesco; Pignatelli di Spinazzola, Nicoletta; Urraro, Fabrizio; Sarti, Giuseppe; Montemarano, Marcella; Grassi, Roberto; Rotondo, Antonio

    2012-07-01

    The aim of our study was to evaluate the prevalence of aberrant origin of the branches of the external carotid artery (ECA) in 97 patients by computed tomography angiography (CTA) and magnetic resonance angiography (MRA) and to compare the accuracy of these two techniques in the visualization of the ECA system. All patients underwent CTA and MRA examination of the head and neck. Multiplanar and volumetric reformations were obtained in all cases. For each set of images, the presence of aberrant origin of the branches of the external carotid artery was investigated. MRA and CTA images of each patient were compared to define their information content. Anatomical anomalies were found in 88 heminecks, with a prevalence of 53.3%. In the 61 patients in whom CTA was performed before MRA, the latter method showed only 92% of the abnormalities detected at the first examination; in the 36 patients in whom MRA was performed first, CTA identified all of the anomalies highlighted by the former, adding 12 new ones. Knowledge of the anomalies of origin of the ECA branches is essential for the head and neck surgeon; the high prevalence of anomalies found in our series, as in previous studies, indicates the opportunity to perform CTA or MRA of the head and neck before any surgical or interventional procedure. CTA is the method of choice in the evaluation of anomalies of origin of the branches of the ECA and in the definition of their course.

  13. Proactive malware detection

    NASA Astrophysics Data System (ADS)

    Gloster, Jonathan; Diep, Michael; Dredden, David; Mix, Matthew; Olsen, Mark; Price, Brian; Steil, Betty

    2014-06-01

    Small-to-medium sized businesses lack resources to deploy and manage high-end advanced solutions to deter sophisticated threats from well-funded adversaries, but evidence shows that these types of businesses are becoming key targets. As malicious code and network attacks become more sophisticated, classic signature-based virus and malware detection methods are less effective. To augment the current malware methods of detection, we developed a proactive approach to detect emerging malware threats using open source tools and intelligence to discover patterns and behaviors of malicious attacks and adversaries. Technical and analytical skills are combined to track adversarial behavior, methods and techniques. We established a controlled (separated domain) network to identify, monitor, and track malware behavior to increase understanding of the methods and techniques used by cyber adversaries. We created a suite of tools that observe the network and system performance looking for anomalies that may be caused by malware. The toolset collects information from open-source tools and provides meaningful indicators that the system was under or has been attacked. When malware is discovered, we analyzed and reverse engineered it to determine how it could be detected and prevented. Results have shown that with minimum resources, cost effective capabilities can be developed to detect abnormal behavior that may indicate malicious software.

  14. Anomaly Detection and Life Pattern Estimation for the Elderly Based on Categorization of Accumulated Data

    NASA Astrophysics Data System (ADS)

    Mori, Taketoshi; Ishino, Takahito; Noguchi, Hiroshi; Shimosaka, Masamichi; Sato, Tomomasa

    2011-06-01

    We propose a life pattern estimation method and an anomaly detection method for elderly people living alone. In our observation system for such people, we deploy pyroelectric sensors in the house and measure the person's activities continuously in order to grasp the person's life pattern. The data are transferred successively to the operation center and displayed precisely to the nurses in the center, who then decide whether the data indicate an anomaly. In the system, people whose life features resemble each other are categorized into the same group. Anomalies that occurred in the past are shared within the group and utilized in the anomaly detection algorithm. This algorithm is based on an "anomaly score" computed from the activeness of the person, which is approximately proportional to the frequency of the sensor response in a minute. The anomaly score is calculated as the difference between the present activeness and the long-term average of past activeness. Thus, the score is positive if the present activeness is higher than the past average, and negative if it is lower. If the score exceeds a certain threshold, an anomaly event has occurred. Moreover, we developed an activity estimation algorithm that estimates the residents' basic activities, such as rising and going out. The estimation is shown to the nurses together with the residents' anomaly scores. The nurses can understand the residents' health conditions by combining these two pieces of information.
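
    The anomaly score described above reduces to a one-line computation; the activeness values and alert threshold below are invented for illustration, and the absolute-value alert rule is one possible reading of "exceeds a certain threshold":

```python
import numpy as np

def anomaly_score(activeness_now, activeness_history):
    """Current activeness minus its long-term average, per the description above."""
    return activeness_now - np.mean(activeness_history)

# Hypothetical sensor-response counts per minute, averaged per day.
history = [42, 45, 40, 44, 43, 41, 46]   # past long-term activeness
today_quiet = 12                          # unusually low activity today

score = anomaly_score(today_quiet, history)
threshold = 20.0  # assumed alert threshold
print(score, abs(score) > threshold)  # → -31.0 True
```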

  15. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.

    2013-12-01

    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beamforming in routine event detection or location processing. The idea stems from methods used for large computer servers: when monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time series.
We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis), but here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for identifying bad array elements through a jackknifing process to isolate the anomalous channels, so that an automated analysis system might discard them prior to FK analysis and beamforming on events of interest.
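
    The jackknifing idea described above can be sketched as follows: compute a subspace dimensionality for the array with each channel left out in turn, and flag the channel whose removal most reduces the dimension. The synthetic array, variance fraction, and malfunction model are illustrative assumptions:

```python
import numpy as np

def effective_dim(data, var_frac=0.95):
    """Number of principal components needed to explain var_frac of the variance."""
    data = data - data.mean(axis=1, keepdims=True)
    s = np.linalg.svd(data, compute_uv=False)
    frac = np.cumsum(s**2) / np.sum(s**2)
    return int(np.searchsorted(frac, var_frac) + 1)

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 1000)
common = np.sin(2 * np.pi * 1.5 * t)   # coherent array-wide signal
array = np.array([common + 0.05 * rng.normal(size=t.size) for _ in range(8)])
array[3] = rng.normal(size=t.size)     # channel 3 malfunctions (pure noise)

# Jackknife: dropping the bad channel collapses the data back to a low-dimensional
# subspace, so the minimum over leave-one-out dimensions points at the bad element.
dims = [effective_dim(np.delete(array, ch, axis=0)) for ch in range(8)]
bad_channel = int(np.argmin(dims))
print(bad_channel)  # → 3
```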

  16. Using Machine Learning for Behavior-Based Access Control: Scalable Anomaly Detection on TCP Connections and HTTP Requests

    DTIC Science & Technology

    2013-11-01

    machine learning techniques used in BBAC to make predictions about the intent of actors establishing TCP connections and issuing HTTP requests. We discuss pragmatic challenges and solutions we encountered in implementing and evaluating BBAC, discussing (a) the general concepts underlying BBAC, (b) challenges we have encountered in identifying suitable datasets, (c) mitigation strategies to cope...and describe current plans for transitioning BBAC capabilities into the Department of Defense together with lessons learned for the machine learning

  17. Remote sensing applied to agriculture: Basic principles, methodology, and applications

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Mendonca, F. J.

    1981-01-01

    The general principles of remote sensing techniques as applied to agriculture and the methods of data analysis are described. The theoretical spectral responses of crops; the reflectance, transmittance, and absorptance of plants; interactions of plants and soils with reflected energy; leaf morphology; and factors which affect the reflectance of vegetation cover are discussed. The methodologies of visual and computer-aided analyses of LANDSAT data are presented. Finally, a case study wherein infrared film was used to detect crop anomalies and other data applications are described.

  18. An Integrated Intrusion Detection Model of Cluster-Based Wireless Sensor Network

    PubMed Central

    Sun, Xuemei; Yan, Bo; Zhang, Xinzhong; Rong, Chuitian

    2015-01-01

    Considering wireless sensor network characteristics, this paper combines anomaly and mis-use detection and proposes an integrated detection model of cluster-based wireless sensor network, aiming at enhancing detection rate and reducing false rate. Adaboost algorithm with hierarchical structures is used for anomaly detection of sensor nodes, cluster-head nodes and Sink nodes. Cultural-Algorithm and Artificial-Fish-Swarm-Algorithm optimized Back Propagation is applied to mis-use detection of Sink node. Plenty of simulation demonstrates that this integrated model has a strong performance of intrusion detection. PMID:26447696

  19. An Integrated Intrusion Detection Model of Cluster-Based Wireless Sensor Network.

    PubMed

    Sun, Xuemei; Yan, Bo; Zhang, Xinzhong; Rong, Chuitian

    2015-01-01

    Considering wireless sensor network characteristics, this paper combines anomaly and mis-use detection and proposes an integrated detection model of cluster-based wireless sensor network, aiming at enhancing detection rate and reducing false rate. Adaboost algorithm with hierarchical structures is used for anomaly detection of sensor nodes, cluster-head nodes and Sink nodes. Cultural-Algorithm and Artificial-Fish-Swarm-Algorithm optimized Back Propagation is applied to mis-use detection of Sink node. Plenty of simulation demonstrates that this integrated model has a strong performance of intrusion detection.

  20. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R. J.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by.

  1. Evaluation of Süleymanköy (Diyarbakir, Eastern Turkey) and Seferihisar (Izmir, Western Turkey) Self Potential Anomalies with Multilayer Perceptron Neural Networks

    NASA Astrophysics Data System (ADS)

    Kaftan, Ilknur; Sindirgi, Petek

    2013-04-01

    Self-potential (SP) is one of the oldest geophysical methods and provides important information about near-surface structures. Several methods have been developed to interpret SP data using simple geometries. This study investigated the inverse solution of the SP anomaly of a buried, polarized sphere via Multilayer Perceptron Neural Networks (MLPNN). The polarization angle (α) and depth to the centre of the sphere (h) were estimated. The MLPNN was applied to synthetic and field SP data. In order to assess the capability of the method in detecting the number of sources, MLPNN was applied to different spherical models at different depths and locations. Additionally, the performance of MLPNN was tested by adding random noise to the same synthetic test data; similar parameters were successfully obtained for the sphere model under different S/N ratios. The MLPNN method was then applied to two field examples. The first is a cross section taken from the SP anomaly map of the Ergani-Süleymanköy (Turkey) copper mine. MLPNN was also applied to SP data from the Seferihisar, Izmir (Western Turkey) geothermal field. The MLPNN results showed good agreement with the original synthetic data set, and the technique gave satisfactory results after the addition of 5% and 10% Gaussian noise. The MLPNN results were compared to other SP interpretation techniques, such as Normalized Full Gradient (NFG), inverse solution and nomogram methods, and all of the techniques showed strong similarity. Consequently, the synthetic and field applications of this study show that MLPNN provides reliable evaluation of self-potential data modelled by a sphere.

  2. Anomaly Detection In Additively Manufactured Parts Using Laser Doppler Vibrometery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, Carlos A.

    Additively manufactured parts are susceptible to non-uniform structure caused by the unique manufacturing process, which can lead to structural weakness or catastrophic failure. Using laser Doppler vibrometry and frequency response analysis, non-contact detection of anomalies in additively manufactured parts may be possible. Preliminary tests show promise for small-scale detection, but further work is necessary.

  3. The Condensate Database for Big Data Analysis

    NASA Astrophysics Data System (ADS)

    Gallaher, D. W.; Lv, Q.; Grant, G.; Campbell, G. G.; Liu, Q.

    2014-12-01

    Although massive amounts of cryospheric data have been and are being generated at an unprecedented rate, the vast majority of this otherwise valuable data has been "sitting in the dark", with very limited quality assurance or runtime access for higher-level data analytics such as anomaly detection. This has significantly hindered data-driven scientific discovery and advances in the polar research and Earth sciences community. In an effort to solve this problem, we have investigated and developed innovative techniques for the construction of a "condensate database", which is much smaller than the original data yet still captures its key characteristics (e.g., spatio-temporal norms and changes). In addition, we take advantage of parallel databases that make use of low-cost GPU processors. As a result, efficient anomaly detection and quality assurance can be achieved with in-memory data analysis or limited I/O requests. The challenge lies in the fact that cryospheric data are massive and diverse, with normal/abnormal patterns spanning a wide range of spatial and temporal scales. This project consists of investigations in three main areas: (1) adaptive neighborhood-based thresholding in both space and time; (2) compressive-domain pattern detection and change analysis; and (3) hybrid and adaptive condensation of multi-modal, multi-scale cryospheric data.
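
    The first investigation area, neighborhood-based thresholding, can be sketched in its simplest spatial form: flag a grid cell whose value deviates from its neighbors' mean by more than k standard deviations. This is a single-time-step toy version, assuming a small 2-D grid; the project's actual method also adapts over time and operates on condensed data:

```python
import statistics

def neighborhood_anomalies(grid, k=3.0, radius=1):
    """Flag cells deviating more than k standard deviations from the mean
    of their spatial neighborhood (single-time-step sketch only)."""
    rows, cols = len(grid), len(grid[0])
    flags = []
    for i in range(rows):
        for j in range(cols):
            # Collect the neighborhood, excluding the cell itself.
            nbr = [grid[a][b]
                   for a in range(max(0, i - radius), min(rows, i + radius + 1))
                   for b in range(max(0, j - radius), min(cols, j + radius + 1))
                   if (a, b) != (i, j)]
            mu = statistics.mean(nbr)
            sd = statistics.pstdev(nbr)
            if sd > 0 and abs(grid[i][j] - mu) > k * sd:
                flags.append((i, j))
    return flags

# Toy grid with mild background variation and one spike:
grid = [[(i + j) % 2 for j in range(5)] for i in range(5)]
grid[2][2] = 50
flags = neighborhood_anomalies(grid)
```

    Because the threshold is derived from each cell's own neighborhood rather than a global cutoff, the same rule adapts across regions with very different local variability.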

  4. Automatic RST-based system for a rapid detection of man-made disasters

    NASA Astrophysics Data System (ADS)

    Tramutoli, Valerio; Corrado, Rosita; Filizzola, Carolina; Livia Grimaldi, Caterina Sara; Mazzeo, Giuseppe; Marchese, Francesco; Pergola, Nicola

    2010-05-01

    Man-made disasters may cause injuries to citizens and damage to critical infrastructure. When such disasters cannot be prevented or foreseen, the goal is at least to detect the accident rapidly, in order to intervene as soon as possible and minimize damage. In this context, the combination of a Robust Satellite Technique (RST), able to identify actual accidents without false alarms, and satellite sensors with high temporal resolution promises both reliable and timely detection of abrupt Thermal Infrared (TIR) transients related to dangerous explosions. A processing chain based on the RST approach has been developed by the DIFA-UNIBAS team in the framework of the GMOSS and G-MOSAIC projects to automatically identify harmful events in MSG-SEVIRI images. Maps of thermal anomalies are generated every 15 minutes (i.e., the SEVIRI repeat rate) over a selected area, together with KML files (containing the latitude and longitude of each "thermally" anomalous SEVIRI pixel centre, the time of image acquisition, the relative intensity of the anomalies, etc.) for rapid visualization of the accident position, even in Google Earth. Results achieved for gas pipelines recently exploded or attacked in Russia and in Iraq will be presented in this work.
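
    The core of an RST-style detection is a local variation index: the current TIR signal of a pixel compared against the historical mean and standard deviation built from past acquisitions of the same place at the same observation time. A simplified single-pixel sketch; the brightness temperatures and the alert threshold below are invented for illustration, and the operational chain works per-pixel over long series of co-registered SEVIRI images:

```python
import statistics

def rst_index(history, current):
    """RST-style local variation index: how many historical standard
    deviations the current TIR signal lies from its multi-temporal mean
    for the same site and acquisition time (simplified sketch)."""
    mu = statistics.mean(history)
    sigma = statistics.pstdev(history)
    return (current - mu) / sigma

# A pixel whose historical brightness temperatures cluster near 290 K:
history = [289.5, 290.2, 290.0, 289.8, 290.5, 289.9, 290.1]
index = rst_index(history, 295.0)   # an explosion-like TIR transient
anomalous = index > 3.0             # illustrative alert threshold
```

    Normalizing by the pixel's own historical variability is what keeps the false-alarm rate low: only excursions that are exceptional for that specific pixel and time trigger an alert.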

  5. Analysis of potential urban unstable areas and landslide-induced damages on Volterra historical site through a remote sensing approach

    NASA Astrophysics Data System (ADS)

    Del Soldato, Matteo; Bianchini, Silvia; Nolesini, Teresa; Frodella, William; Casagli, Nicola

    2017-04-01

    Multisystem remote sensing techniques were exploited to provide a comprehensive overview of the stability of the Volterra (Italy) site with regard to its landscape, urban fabric and cultural heritage. Interferometric Synthetic Aperture Radar (InSAR) techniques allow precise measurements of Earth surface displacement, as well as the detection of building deformations over large urban areas. In the field of cultural heritage conservation, infrared thermography (IRT) provides surface temperature mapping and can therefore detect various potential criticalities, such as moisture, seepage areas, cracks and structural anomalies. Between winter 2014 and spring 2015 the historical center and south-western sectors of Volterra (Tuscany region, central Italy) were affected by instability phenomena. The spatial distribution, typology and effect on the urban fabric of the landslide phenomena were investigated by analyzing the geological and geomorphological settings, traditional geotechnical monitoring and advanced remote sensing data such as Persistent Scatterers Interferometry (PSI). The ground deformation rates and the maximum settlement values derived from SAR acquisitions of the historical ENVISAT and recent COSMO-SkyMed sensors, in 2003-2009 and 2010-2015 respectively, were compared with background geological data, constructive features, in situ evidence and detailed field inspections in order to classify landslide-damaged buildings. In this way, the detected movements and their potential correspondence with recognized damage were investigated in order to assess deformation and damage to the built-up areas of Volterra. The IRT technique was applied to survey the surface temperature of Volterra's historical wall enclosure, and highlighted thermal anomalies on this cultural heritage element of the site.
    The obtained results made it possible to better correlate the recognized deformations with landslide effects in the urban fabric, providing useful information for future risk-mitigation strategies to be planned by the local authorities and the technicians and conservators involved.

  6. Failure Control Techniques for the SSME

    NASA Technical Reports Server (NTRS)

    Taniguchi, M. H.

    1987-01-01

    Since ground testing of the Space Shuttle Main Engine (SSME) began in 1975, the detection of engine anomalies and the prevention of major damage have been achieved by a multi-faceted detection/shutdown system. The system continues the monitoring task today and consists of the following: sensors, automatic redline and other limit logic, redundant sensors and controller voting logic, conditional decision logic, and human monitoring. Typically, on the order of 300 to 500 measurements are sensed and recorded for each test, while on the order of 100 are used for control and monitoring. Despite extensive monitoring by the current detection system, twenty-seven (27) major incidents have occurred. This number may appear insignificant compared with the more than 1200 hot-fire tests that have taken place since 1976; nevertheless, it points to the need for, and the future benefits of, a more advanced failure detection system.

  7. Algorithms Based on CWT and Classifiers to Control Cardiac Alterations and Stress Using an ECG and a SCR

    PubMed Central

    Villarejo, María Viqueira; Zapirain, Begoña García; Zorrilla, Amaia Méndez

    2013-01-01

    This paper presents the results of using a commercial pulsimeter as an electrocardiogram (ECG) for wireless detection of cardiac alterations and stress levels for home monitoring. For these purposes, signal processing techniques (the Continuous Wavelet Transform (CWT) and the J48 classifier, respectively) have been used. The designed algorithm analyses the ECG signal and is able to detect the heart rate (99.42%), arrhythmia (93.48%) and extrasystoles (99.29%). The detection of stress level is complemented with the Skin Conductance Response (SCR), with a success rate of 94.02%. Heart rate variability did not add value to stress detection in this case. With this pulsimeter, it is possible to prevent and detect anomalies in a non-intrusive way as part of a telemedicine system. It can also be used during physical activity, because the CWT minimizes motion artifacts. PMID:23666135
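
    How a wavelet transform pulls QRS-like spikes out of a slow baseline can be sketched with a single CWT scale: convolving the signal with a Ricker (Mexican-hat) wavelet whose width matches the spike duration suppresses the slow drift and emphasizes the beats. Everything below (the synthetic signal, sampling rate, scale, and thresholds) is invented for illustration and does not reproduce the paper's pipeline or classifier:

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican-hat) wavelet, a common choice for QRS-like spikes."""
    t = np.arange(points) - (points - 1) / 2.0
    A = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return A * (1 - (t / a) ** 2) * np.exp(-(t ** 2) / (2 * a ** 2))

fs = 250                                   # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
ecg = 0.05 * np.sin(2 * np.pi * 1.7 * t)   # stand-in for baseline wander
r_idx = np.arange(100, len(t), 200)        # one beat every 0.8 s -> 75 bpm
ecg[r_idx] += 1.0                          # idealized R spikes

# One CWT scale: the zero-mean wavelet rejects the slow baseline.
cwt_row = np.convolve(ecg, ricker(31, 4.0), mode="same")
peaks = np.where(cwt_row > 0.5 * cwt_row.max())[0]
# Cluster contiguous indices into beats, then derive the heart rate.
beats = [g[0] for g in np.split(peaks, np.where(np.diff(peaks) > 10)[0] + 1)]
bpm = 60.0 * fs / np.median(np.diff(beats))
```

    Because the wavelet has zero mean, the 1.7 Hz baseline contributes almost nothing to the transformed signal, which is the same property that makes the CWT robust to motion artifacts during physical activity.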

  8. Algorithms based on CWT and classifiers to control cardiac alterations and stress using an ECG and a SCR.

    PubMed

    Villarejo, María Viqueira; Zapirain, Begoña García; Zorrilla, Amaia Méndez

    2013-05-10

    This paper presents the results of using a commercial pulsimeter as an electrocardiogram (ECG) for wireless detection of cardiac alterations and stress levels for home monitoring. For these purposes, signal processing techniques (the Continuous Wavelet Transform (CWT) and the J48 classifier, respectively) have been used. The designed algorithm analyses the ECG signal and is able to detect the heart rate (99.42%), arrhythmia (93.48%) and extrasystoles (99.29%). The detection of stress level is complemented with the Skin Conductance Response (SCR), with a success rate of 94.02%. Heart rate variability did not add value to stress detection in this case. With this pulsimeter, it is possible to prevent and detect anomalies in a non-intrusive way as part of a telemedicine system. It can also be used during physical activity, because the CWT minimizes motion artifacts.

  9. Intrusion-aware alert validation algorithm for cooperative distributed intrusion detection schemes of wireless sensor networks.

    PubMed

    Shaikh, Riaz Ahmed; Jameel, Hassan; d'Auriol, Brian J; Lee, Heejo; Lee, Sungyoung; Song, Young-Jae

    2009-01-01

    Existing anomaly and intrusion detection schemes for wireless sensor networks have mainly focused on the detection of intrusions. Once an intrusion is detected, an alert or claim is generated. However, unidentified malicious nodes in the network could send faulty anomaly and intrusion claims about legitimate nodes to the other nodes. Verifying the validity of such claims is a critical and challenging issue that is not considered in existing cooperative distributed anomaly and intrusion detection schemes for wireless sensor networks. In this paper, we propose a validation algorithm that addresses this problem. The algorithm utilizes the concept of intrusion-aware reliability, which helps to provide adequate reliability at a modest communication cost. We also provide a security resiliency analysis of the proposed intrusion-aware alert validation algorithm.

  10. Intrusion-Aware Alert Validation Algorithm for Cooperative Distributed Intrusion Detection Schemes of Wireless Sensor Networks

    PubMed Central

    Shaikh, Riaz Ahmed; Jameel, Hassan; d’Auriol, Brian J.; Lee, Heejo; Lee, Sungyoung; Song, Young-Jae

    2009-01-01

    Existing anomaly and intrusion detection schemes for wireless sensor networks have mainly focused on the detection of intrusions. Once an intrusion is detected, an alert or claim is generated. However, unidentified malicious nodes in the network could send faulty anomaly and intrusion claims about legitimate nodes to the other nodes. Verifying the validity of such claims is a critical and challenging issue that is not considered in existing cooperative distributed anomaly and intrusion detection schemes for wireless sensor networks. In this paper, we propose a validation algorithm that addresses this problem. The algorithm utilizes the concept of intrusion-aware reliability, which helps to provide adequate reliability at a modest communication cost. We also provide a security resiliency analysis of the proposed intrusion-aware alert validation algorithm. PMID:22454568

  11. Remote sensing applications for diagnostics of the radioactive pollution of the ground surface and in the atmosphere

    NASA Astrophysics Data System (ADS)

    Pulinets, Sergey; Ouzounov, Dimitar; Boyarchuk, Kirill; Laverov, Nikolay

    2013-04-01

    Radioactive pollution, through its ionization of the air, can drastically change the conductivity of the atmospheric boundary layer (as was experimentally demonstrated during the period of atmospheric nuclear tests) and, through the global electric circuit, produce anomalous variations in the atmosphere. As an additional effect, the ions created by air ionization serve as centers of water vapor condensation and of nucleation of aerosol-size particles; this process is accompanied by latent heat release. Both anomalies (ionospheric and thermal) can be monitored by remote sensing techniques, both from satellites (IR sensors and ionospheric probes) and from the ground (GPS receivers, ground-based ionosondes, VLF propagation sounding, and ground measurements of air temperature and humidity). We monitored the majority of transient events (the Three Mile Island and Chernobyl nuclear power plant emergencies) and stationary sources such as the Gabon natural nuclear reactor and sites of underground nuclear tests, and were able to detect thermal anomalies and, in the majority of cases, ionospheric anomalies as well. Immediately after the March 11, 2011 earthquake and tsunami in Japan, we began to continuously survey the long-wavelength energy flux (10-13 microns) measurable at the top of the atmosphere from POES/NOAA/AVHRR polar-orbit satellites. Our preliminary results show the presence of hot spots at the top of the atmosphere over the Fukushima Daiichi Nuclear Power Plant (FDNPP); given their persistence over the same region, they are most likely not of meteorological origin. On March 14 and 21 we detected a significant increase in radiation at the top of the atmosphere, which coincides with reported radioactive gas leaks from the FDNPP. After March 21 the intensity of the energy flux in the atmosphere started to decline, which has been confirmed by a ground radiometer network.
    Using a ground-based ionosonde, we were also able to detect the ionospheric anomaly associated with the largest radioactive release on March 21. We present new theoretical estimates and experimental measurements showing that the heat flux released during ionization of the atmospheric boundary layer under significant radioactive pollution is sufficient for anomalous heat fluxes to be recorded by satellite-borne remote sounding instruments (infrared radiometers), and that ionospheric anomalies are generated by changes in boundary layer conductivity.

  12. Routine screening for fetal anomalies: expectations.

    PubMed

    Goldberg, James D

    2004-03-01

    Ultrasound has become a routine part of prenatal care. Despite this, the sensitivity and specificity of the procedure are unclear to many patients and healthcare providers. In a small study from Canada, 54.9% of women reported that they had received no information about ultrasound before their examination. In addition, 37.2% of women indicated that they were unaware of any fetal problems that ultrasound could not detect. Most centers that perform ultrasound do not have their own statistics regarding sensitivity and specificity; it is necessary to rely on large collaborative studies. Unfortunately, wide variations exist among these studies, with detection rates for fetal anomalies between 13.3% and 82.4%. The Eurofetus study is the largest prospective study performed to date and, because of the time and expense involved in this type of study, a similar study is not likely to be repeated. The overall detection rate for anomalous fetuses was 64.1%. It is important to note that in this study, ultrasounds were performed in tertiary centers with significant experience in detecting fetal malformations. The RADIUS study also demonstrated a significantly improved detection rate for anomalies before 24 weeks in tertiary versus community centers (35% versus 13%). Two concepts emerge from reviewing these data. First, patients must be made aware of the limitations of ultrasound in detecting fetal anomalies. This information is critical to allow them to make informed decisions about whether to undergo ultrasound examination and to prepare them for potential outcomes. Second, to achieve the detection rates reported in the Eurofetus study, ultrasound examination must be performed in centers that have extensive experience in the detection of fetal anomalies.

  13. Defects detection and non-destructive testing (NDT) techniques in paintings: a unified approach through measurements of deformation

    NASA Astrophysics Data System (ADS)

    Sfarra, S.; Ibarra-Castanedo, C.; Ambrosini, D.; Paoletti, D.; Bendada, A.; Maldague, X.

    2013-05-01

    The present study is focused on two topics. The first is a mathematical model, useful for understanding the deformation of paintings, which uses straining devices that are adjustable and micrometrically controlled via a pin supported in a hollow cylinder. The strains were analyzed by the holographic interferometry (HI) technique using an appropriate frame. The second concerns the need to improve conservators' knowledge of defect detection and defect propagation in acrylic paintings characterized by underdrawings and pentimenti. To fulfill this task, a sample was manufactured to clarify several uncertainties concerning the influence of external factors on their conservation. Subsurface anomalies were also retrieved by the near-infrared reflectography (NIRR) and transmittography (NIRT) techniques, using LED lamps and several narrow-band filters working at different wavelengths, mounted on a CMOS camera, in combination with UV imaging. In addition, a sponge glued to the rear side of the canvas was impregnated with a precise amount of water by means of a syringe to verify the "stretcher effect" with the digital speckle photography (DSP) technique (using MatPIV). The same effect also affects the sharp transition of the canvas at the stretcher's edge; here, a possible mechanism is direct mechanical contact between stretcher and canvas, which was investigated by the HI technique. Finally, advanced algorithms applied to the square heating thermography (SHT) data proved very useful for detecting three Mylar® inserts simulating different types of defects. These fabricated defects were also identified by the optical techniques, while visual inspection was the only method capable of detecting biological damage.

  14. Transient ice mass variations over Greenland detected by the combination of GPS and GRACE data

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Liu, L.; Khan, S. A.; van Dam, T. M.; Zhang, E.

    2017-12-01

    Over the past decade, the Greenland Ice Sheet (GrIS) has been undergoing significant warming and ice mass loss. This mass loss has not been a steady process but has had substantial temporal and spatial variability. Here we apply multi-channel singular spectrum analysis to crustal deformation time series measured at about 50 Global Positioning System (GPS) stations mounted on bedrock around the Greenland coast, and to mass changes inferred from the Gravity Recovery and Climate Experiment (GRACE), to detect transient changes in the ice mass balance of the GrIS. We detect two transient anomalies: a negative melting anomaly (Anomaly 1) that peaked around 2010, and a positive melting anomaly (Anomaly 2) that peaked between 2012 and 2013. The GRACE data show that both anomalies caused significant mass changes south of 74°N but negligible changes north of 74°N. Both anomalies caused their maximum mass change in the southeast GrIS, followed by the west GrIS near Jakobshavn. Our results also show that the mass change caused by Anomaly 1 first reached its maximum in late 2009 in the southeast GrIS and then migrated to the west GrIS. In Anomaly 2, by contrast, the southeast GrIS was the last region to reach its maximum mass change, in early 2013, and the west GrIS near Jakobshavn was the second-to-last. Most of the GPS data show spatiotemporal patterns similar to those obtained from the GRACE data. However, some GPS time series show discrepancies in either space or time because of data gaps and differing sensitivities to mass loading changes. In particular, loading deformation measured by GPS can be significantly affected by local dynamic mass changes, which have little impact on GRACE observations.

  15. Locality-constrained anomaly detection for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Liu, Jiabin; Li, Wei; Du, Qian; Liu, Kui

    2015-12-01

    Detecting a target with a low occurrence probability against an unknown background in a hyperspectral image, namely anomaly detection, is of practical significance. The Reed-Xiaoli (RX) algorithm is considered a classic anomaly detector; it calculates the Mahalanobis distance between the local background and the pixel under test. Local RX, an adaptive RX detector, employs a dual-window strategy that treats the pixels in the frame between the inner and outer windows as the local background. However, the detector is sensitive to anomalous pixels (i.e., outliers) contained in that local region. In this paper, a locality-constrained anomaly detector is proposed that removes outliers from the local background region before employing the RX algorithm. Specifically, a local linear representation is designed to exploit the internal relationship between the linearly correlated pixels in the local background region and the pixel under test and its neighbors. Experimental results demonstrate that the proposed detector improves on the original local RX algorithm.
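
    The RX score itself is just the Mahalanobis distance of each pixel spectrum from the background statistics. A minimal sketch of the global variant on synthetic three-band data (the local variant in the paper replaces the scene-wide statistics with the pixels between the dual windows; the data here are invented):

```python
import numpy as np

def rx_scores(pixels):
    """Global RX anomaly scores: squared Mahalanobis distance of each
    pixel spectrum (rows) from the scene mean and covariance."""
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov_inv = np.linalg.inv(cov)
    diff = pixels - mu
    # Per-row quadratic form diff @ cov_inv @ diff.T, diagonal only.
    return np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

rng = np.random.default_rng(0)
background = rng.multivariate_normal([0, 0, 0], np.eye(3), size=500)
scene = np.vstack([background, [[6.0, 6.0, 6.0]]])  # one anomalous spectrum
scores = rx_scores(scene)   # the last pixel gets the highest score
```

    The paper's point is visible even in this toy: if outliers leak into the background used for mu and cov, the statistics are inflated and the scores of true anomalies are suppressed, which is what the locality constraint guards against.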

  16. Detection of anomalous events

    DOEpatents

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or another classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configured to adjust the number of false alerts. The anomaly detector can be used with a variety of probability density functions, including normal Gaussian distributions and irregular distributions, as well as functions associated with continuous or discrete variables.
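
    The scoring idea can be illustrated with a single Gaussian source: score each event by its negative log-likelihood, so rare events from any distribution land on one comparable scale, and derive the alert threshold from a user-chosen alert budget rather than a hard-coded cutoff. Everything below (the distribution, its parameters, and the 1% budget) is an invented illustration, not the patented algorithm:

```python
import math
import random

def anomaly_scores(events, mu, sigma):
    """Score events by improbability under a fitted Gaussian: the rarer
    the event, the higher the score. Using -log p makes scores from
    disparate sources comparable on one scale."""
    def neg_log_pdf(x):
        z = (x - mu) / sigma
        return 0.5 * z * z + math.log(sigma * math.sqrt(2 * math.pi))
    return [neg_log_pdf(x) for x in events]

random.seed(1)
events = [random.gauss(100.0, 10.0) for _ in range(1000)]
scores = anomaly_scores(events, mu=100.0, sigma=10.0)

# "Regulatability": pick the threshold from an alert budget, e.g. flag
# only the top 1% most improbable events.
threshold = sorted(scores)[int(0.99 * len(scores))]
alerts = [e for e, s in zip(events, scores) if s > threshold]
```

    Setting the threshold at a score quantile rather than a raw value is what lets an operator dial the expected number of false alerts up or down without retuning each detector.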

  17. Quantifying Performance Bias in Label Fusion

    DTIC Science & Technology

    2012-08-21

    detect ), may provide the end-user with the means to appropriately adjust the performance and optimal thresholds for performance by fusing legacy systems...boolean combination of classification systems in ROC space: An application to anomaly detection with HMMs. Pattern Recognition, 43(8), 2732-2752. 10...Shamsuddin, S. (2009). An overview of neural networks use in anomaly intrusion detection systems. Paper presented at the Research and Development (SCOReD

  18. Bio-Inspired Distributed Decision Algorithms for Anomaly Detection

    DTIC Science & Technology

    2017-03-01

    TERMS DIAMoND, Local Anomaly Detector, Total Impact Estimation, Threat Level Estimator 16. SECURITY CLASSIFICATION OF: 17. LIMITATION OF ABSTRACT UU...21 4.2 Performance of the DIAMoND Algorithm as a DNS-Server Level Attack Detection and Mitigation...with 6 Nodes ........................................................................................ 13 8 Hierarchical 2- Level Topology

  19. Anomaly Detection in the Right Hemisphere: The Influence of Visuospatial Factors

    ERIC Educational Resources Information Center

    Smith, Stephen D.; Dixon, Michael J.; Tays, William J.; Bulman-Fleming, M. Barbara

    2004-01-01

    Previous research with both brain-damaged and neurologically intact populations has demonstrated that the right cerebral hemisphere (RH) is superior to the left cerebral hemisphere (LH) at detecting anomalies (or incongruities) in objects (Ramachandran, 1995; Smith, Tays, Dixon, & Bulman-Fleming, 2002). The current research assesses whether the RH…

  20. A Semiparametric Model for Hyperspectral Anomaly Detection

    DTIC Science & Technology

    2012-01-01

    treeline ) in the presence of natural background clutter (e.g., trees, dirt roads, grasses). Each target consists of about 7 × 4 pixels, and each pixel...vehicles near the treeline in Cube 1 (Figure 1) constitutes the target set, but, since anomaly detectors are not designed to detect a particular target

  1. Analysis of gravity data beneath Endut geothermal prospect using horizontal gradient and Euler deconvolution

    NASA Astrophysics Data System (ADS)

    Supriyanto, Noor, T.; Suhanto, E.

    2017-07-01

    The Endut geothermal prospect is located in Banten Province, Indonesia. The geological setting of the area is dominated by Quaternary volcanics, Tertiary sediments and Tertiary rock intrusions. The area is in the preliminary study phase covering geology, geochemistry, and geophysics. As part of the geophysical study, gravity measurements have been carried out and analyzed in order to understand the geological conditions, especially the subsurface fault structures that control the geothermal system in the Endut area. After preconditioning was applied to the gravity data, the complete Bouguer anomaly was analyzed using derivative methods such as the Horizontal Gradient (HG) and Euler Deconvolution (ED) to clarify the existence of fault structures. These techniques detected the boundaries of anomalous bodies and fault structures, which were compared with the lithologies in the geological map. The results of the analysis will be useful for building a more realistic conceptual model of the Endut geothermal area.
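
    The horizontal gradient itself is simple to state: HG(x, y) = sqrt((∂g/∂x)² + (∂g/∂y)²), and its maxima trace the edges of density contrasts such as faults. A hedged sketch on a synthetic step anomaly (grid spacing and the 5 mGal contrast are invented; real data would be a gridded complete Bouguer anomaly):

```python
import numpy as np

def horizontal_gradient(g, dx=1.0, dy=1.0):
    """Horizontal gradient magnitude of a gridded gravity anomaly:
    HG = sqrt((dg/dx)^2 + (dg/dy)^2). Maxima of HG mark the edges of
    density contrasts, e.g. fault contacts."""
    gy, gx = np.gradient(g, dy, dx)   # gradients along rows, then columns
    return np.hypot(gx, gy)

# Synthetic anomaly: a fault-like step contact along column 10.
grid = np.zeros((20, 20))
grid[:, 10:] = 5.0                 # mGal contrast across the contact
hg = horizontal_gradient(grid)
edge_cols = hg.argmax(axis=1)      # HG peaks line up with the step
```

    Euler deconvolution complements this by also estimating the depth of the gradient sources, which is why the two methods are typically interpreted together.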

  2. Hyperspectral data collection for the assessment of target detection algorithms: the Viareggio 2013 trial

    NASA Astrophysics Data System (ADS)

    Rossi, Alessandro; Acito, Nicola; Diani, Marco; Corsini, Giovanni; De Ceglie, Sergio Ugo; Riccobono, Aldo; Chiarantini, Leandro

    2014-10-01

    Airborne hyperspectral imagery is valuable for military and civilian applications, such as target identification and the detection of anomalies and changes across multiple acquisitions. In target detection (TD) applications, the performance assessment of different algorithms is an important and critical issue. In this context, the small number of publicly available hyperspectral datasets motivated us to perform an extensive measurement campaign covering various operating scenarios. The campaign was organized by CISAM in cooperation with the University of Pisa, Selex ES and CSSN-ITE, and was conducted in Viareggio, Italy, in May 2013. The Selex ES airborne hyperspectral sensor SIM.GA was mounted on board an airplane to collect images over different sites in the morning and afternoon of two subsequent days. This paper describes the hyperspectral data collection of the trial. Four different sites were set up, representing a complex urban scenario, two parking lots and a rural area. Targets with dimensions comparable to the sensor's ground resolution were deployed at the sites to reproduce different operating situations. Extensive ground truth documentation completes the data collection. Experiments to test anomalous change detection techniques were set up by changing the position of the deployed targets. Search and rescue scenarios were simulated to evaluate the performance of anomaly detection algorithms. Moreover, the reflectance signatures of the targets were measured on the ground to allow spectral matching under varying atmospheric and illumination conditions. The paper presents preliminary results that show the effectiveness of hyperspectral data exploitation for the object detection tasks of interest in this work.

  3. Quantifying Methane Flux from a Prominent Seafloor Crater with Water Column Imagery Filtering and Bubble Quantification Techniques

    NASA Astrophysics Data System (ADS)

    Mitchell, G. A.; Gharib, J. J.; Doolittle, D. F.

    2015-12-01

    Methane gas flux from the seafloor to the atmosphere is an important variable for global carbon cycle and climate models, yet it is poorly constrained. Methodologies used to estimate seafloor gas flux commonly employ a combination of acoustic and optical techniques. These techniques often use hull-mounted multibeam echosounders (MBES) to quickly ensonify large volumes of the water column and detect acoustic backscatter anomalies indicative of gas bubble plumes. Detection of these water column anomalies with an MBES provides information on the lateral distribution of the plumes, their midwater dimensions, and their positions on the seafloor. Seafloor plume locations are then targeted for visual investigation using a remotely operated vehicle (ROV) to determine bubble emission rates, venting behaviors, bubble sizes, and ascent velocities. Once these variables have been measured in situ, gas flux is extrapolated over the survey area using the number of remotely mapped flares. This methodology was applied to a geophysical survey conducted in 2013 over a large seafloor crater that developed in response to an oil well blowout in 1983 offshore Papua New Guinea. The site was investigated by multibeam and sidescan mapping, sub-bottom profiling, 2-D high-resolution multi-channel seismic reflection, and ROV video and coring operations. Numerous water column plumes were detected in the data, suggesting vigorously active vents within and near the seafloor crater (Figure 1). This study uses dual-frequency MBES datasets (Reson 7125, 200/400 kHz) and ROV video imagery of the active hydrocarbon seeps to estimate the total gas flux from the crater. Plumes of bubbles were extracted from the water column data using threshold filtering techniques. Analysis of video images of the seep emission sites within the crater provided estimates of bubble size, expulsion frequency, and ascent velocity.
    The average gas flux characteristics derived from the ROV video observations are extrapolated over the number of individual flares detected acoustically to estimate the gas flux from the survey area. The combination of water column filtering and ROV observations yields a gas flux estimate in the range of 2.2-6.6 mol CH4/min.
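
    The extrapolation step reduces to simple arithmetic once the per-bubble quantities are measured: bubble volume from the observed radius, moles per bubble from the ideal gas law at seafloor pressure, then scaling by expulsion frequency and acoustic flare count. Every number below is a hypothetical stand-in, not a value from this survey:

```python
import math

# Back-of-envelope flux extrapolation in the spirit of the survey
# workflow. All parameter values are hypothetical stand-ins.
R = 8.314                                   # J/(mol K)
depth_m = 50.0                              # assumed water depth
pressure = 101325.0 * (1 + depth_m / 10.0)  # ~1 atm per 10 m of seawater
temperature = 288.0                         # K, assumed bottom temperature

bubble_radius = 0.005                       # m, as if measured from ROV video
bubble_volume = 4.0 / 3.0 * math.pi * bubble_radius ** 3
bubbles_per_min_per_seep = 300.0            # hypothetical expulsion frequency
n_flares = 100                              # hypothetical acoustic flare count

# Moles of gas per bubble via the ideal gas law, then scale up.
mol_per_bubble = pressure * bubble_volume / (R * temperature)
flux_mol_per_min = n_flares * bubbles_per_min_per_seep * mol_per_bubble
```

    The sensitivity is worth noting: flux scales with the cube of the bubble radius, so the in-situ bubble-size measurement from the ROV video dominates the uncertainty of the extrapolated total.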

  4. Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1997-01-01

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify diagnostic information hidden in a measured signal that is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, which requires immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic, unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS was successfully developed under this contract. In summary, the program objectives to design, develop, test and evaluate an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program lies in its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that the ATMS can save significant time and man-hours in analyzing engine test and flight data and in evaluating large volumes of dynamic test data.

  5. A case study to detect the leakage of underground pressureless cement sewage water pipe using GPR, electrical, and chemical data.

    PubMed

    Liu, Guanqun; Jia, Yonggang; Liu, Hongjun; Qiu, Hanxue; Qiu, Dongling; Shan, Hongxian

    2002-03-01

    The exploration and determination of leakage in underground pressureless nonmetallic pipes is difficult. This paper introduces a comprehensive method combining Ground Penetrating Radar (GPR), electric potential survey, and geochemical survey for leakage detection of an underground pressureless nonmetallic sewage pipe. Theoretically, within the influence zone of a leakage spot, the marked changes in the electromagnetic and physical-chemical properties of the underground media are reflected as anomalies in GPR and electrical survey plots. GPR and electrical surveys have the advantage of being fast and accurate in delineating the anomaly scope. In-situ analysis of the geophysical surveys can guide the geochemical survey; water and soil sampling and analysis then provide the evidence for judging whether an anomaly is caused by pipe leakage. On the basis of previous tests and practical surveys, the GPR waveforms, electric potential curves, contour maps, and chemical survey results are each classified into three types according to the extent or indexes of the anomalies in order to locate the leakage spots. When all three survey methods show type I anomalies at an anomalous spot, that spot is regarded as the most probable leakage location; otherwise, it is downgraded to a less suspect point. The suspected leakage spots should be confirmed by reference to site conditions, because some anomalies are caused by other factors. Subsequent excavation proved that this method of determining suspected locations by anomaly type is effective and economical. The comprehensive method combining GPR, electric potential survey, and geochemical survey, with its advantages of speed and accuracy, is an effective approach to leakage detection in underground nonmetallic pressureless pipes.

  6. Cost Analysis of Following Up Incomplete Low-Risk Fetal Anatomy Ultrasounds.

    PubMed

    O'Brien, Karen; Shainker, Scott A; Modest, Anna M; Spiel, Melissa H; Resetkova, Nina; Shah, Neel; Hacker, Michele R

    2017-03-01

    To examine the clinical utility and cost of follow-up ultrasounds performed as a result of suboptimal views at the time of initial second-trimester ultrasound in a cohort of low-risk pregnant women. We conducted a retrospective cohort study of women at low risk for fetal structural anomalies who had second-trimester ultrasounds at 16 to less than 24 weeks of gestation from 2011 to 2013. We determined the probability of women having follow-up ultrasounds as a result of suboptimal views at the time of the initial second-trimester ultrasound, and calculated the probability of detecting an anomaly on follow-up ultrasound. These probabilities were used to estimate the national cost of our current ultrasound practice, and the cost to identify one fetal anomaly on follow-up ultrasound. During the study period, 1,752 women met inclusion criteria. Four fetuses (0.23% [95% CI 0.06-0.58]) were found to have anomalies at the initial ultrasound. Because of suboptimal views, 205 women (11.7%) returned for a follow-up ultrasound, and one (0.49% [95% CI 0.01-2.7]) anomaly was detected. Two women (0.11%) still had suboptimal views and returned for an additional follow-up ultrasound, with no anomalies detected. When the incidence of incomplete ultrasounds was applied to a similar low-risk national cohort, the annual cost of these follow-up scans was estimated at $85,457,160. In our cohort, the cost to detect an anomaly on follow-up ultrasound was approximately $55,000. The clinical yield of performing follow-up ultrasounds because of suboptimal views on low-risk second-trimester ultrasounds is low. Since so few fetal abnormalities were identified on follow-up scans, this added cost and patient burden may not be warranted. © 2016 Wiley Periodicals, Inc.

  7. Development of anomaly detection models for deep subsurface monitoring

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.

    2017-12-01

    Deep subsurface repositories are used for waste disposal and carbon sequestration. Monitoring deep subsurface repositories for potential anomalies is challenging, not only because the number of sensor networks and the quality of data are often limited, but also because of the lack of labeled data needed to train and validate machine learning (ML) algorithms. Although physical simulation models may be applied to predict anomalies (or the system's nominal state, for that matter), the accuracy of such predictions may be limited by inherent conceptual and parameter uncertainties. The main objective of this study was to demonstrate the potential of data-driven models for leakage detection in carbon sequestration repositories. Monitoring data collected during an artificial CO2 release test at a carbon sequestration repository were used; these include both scalar time series (pressure) and vector time series (distributed temperature sensing). For each type of data, separate online anomaly detection algorithms were developed using the baseline (no-leak) experiment data and then tested on the leak experiment data. The performance of a number of different online algorithms was compared. Results show the importance of including contextual information in the dataset to mitigate the impact of reservoir noise and reduce the false positive rate. The developed algorithms were integrated into a generic Web-based platform for real-time anomaly detection.

  8. Machine intelligence-based decision-making (MIND) for automatic anomaly detection

    NASA Astrophysics Data System (ADS)

    Prasad, Nadipuram R.; King, Jason C.; Lu, Thomas

    2007-04-01

    Any event deemed out of the ordinary may be called an anomaly. Anomalies, by virtue of their definition, are events that occur spontaneously with no prior indication of their existence or appearance. The effects of anomalies are typically unknown until they actually occur, and they aggregate over time to show a noticeable change from the original behavior. An evolved behavior would in general be very difficult to correct unless the anomalous event that caused it can be detected early and any consequence attributed to the specific anomaly. Substantial time and effort are required to back-track the cause of abnormal behavior and to recreate the event sequence leading to it. There is therefore a critical need to automatically detect anomalous behavior as and when it occurs, and to do so with the operator in the loop. Human-machine interaction results in better machine learning and a better decision-support mechanism. This is the fundamental concept of intelligent control, where machine learning is enhanced by interaction with human operators, and vice versa. The paper discusses a revolutionary framework for the characterization, detection, identification, learning, and modeling of anomalous behavior in observed phenomena arising from a large class of unknown and uncertain dynamical systems.

  9. Cross-linguistic variation in the neurophysiological response to semantic processing: Evidence from anomalies at the borderline of awareness

    PubMed Central

    Tune, Sarah; Schlesewsky, Matthias; Small, Steven L.; Sanford, Anthony J.; Bohan, Jason; Sassenhagen, Jona; Bornkessel-Schlesewsky, Ina

    2014-01-01

    The N400 event-related brain potential (ERP) has played a major role in the examination of how the human brain processes meaning. For current theories of the N400, classes of semantic inconsistencies which do not elicit N400 effects have proven particularly influential. Semantic anomalies that are difficult to detect are a case in point (“borderline anomalies”, e.g. “After an air crash, where should the survivors be buried?”), engendering a late positive ERP response but no N400 effect in English (Sanford, Leuthold, Bohan, & Sanford, 2011). In three auditory ERP experiments, we demonstrate that this result is subject to cross-linguistic variation. In a German version of Sanford and colleagues' experiment (Experiment 1), detected borderline anomalies elicited both N400 and late positivity effects compared to control stimuli or to missed borderline anomalies. Classic easy-to-detect semantic (non-borderline) anomalies showed the same pattern as in English (N400 plus late positivity). The cross-linguistic difference in the response to borderline anomalies was replicated in two additional studies with a slightly modified task (Experiment 2a: German; Experiment 2b: English), with a reliable LANGUAGE × ANOMALY interaction for the borderline anomalies confirming that the N400 effect is subject to systematic cross-linguistic variation. We argue that this variation results from differences in the language-specific default weighting of top-down and bottom-up information, concluding that N400 amplitude reflects the interaction between the two information sources in the form-to-meaning mapping. PMID:24447768

  10. OceanXtremes: Scalable Anomaly Detection in Oceanographic Time-Series

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Armstrong, E. M.; Chin, T. M.; Gill, K. M.; Greguska, F. R., III; Huang, T.; Jacob, J. C.; Quach, N.

    2016-12-01

    The oceanographic community must meet the challenge to rapidly identify features and anomalies in complex and voluminous observations to further science and improve decision support. Given this data-intensive reality, we are developing an anomaly detection system, called OceanXtremes, powered by an intelligent, elastic Cloud-based analytic service backend that enables execution of domain-specific, multi-scale anomaly and feature detection algorithms across the entire archive of 15 to 30-year ocean science datasets. Our parallel analytics engine is extending the NEXUS system and exploits multiple open-source technologies: Apache Cassandra as a distributed spatial "tile" cache, Apache Spark for in-memory parallel computation, and Apache Solr for spatial search and storing pre-computed tile statistics and other metadata. OceanXtremes provides these key capabilities: parallel generation (Spark on a compute cluster) of 15 to 30-year ocean climatologies (e.g. sea surface temperature or SST) in hours or overnight, using simple pixel averages or customizable Gaussian-weighted "smoothing" over latitude, longitude, and time; parallel pre-computation, tiling, and caching of anomaly fields (daily variables minus a chosen climatology) with pre-computed tile statistics; parallel detection (over the time-series of tiles) of anomalies or phenomena by regional area-averages exceeding a specified threshold (e.g. high SST in El Nino or SST "blob" regions), or more complex, custom data mining algorithms; shared discovery and exploration of ocean phenomena and anomalies (facet search using Solr), along with unexpected correlations between key measured variables; and scalable execution for all capabilities on a hybrid Cloud, using our on-premise OpenStack Cloud cluster or at Amazon.
The key idea is that the parallel data-mining operations will be run "near" the ocean data archives (a local "network" hop) so that we can efficiently access the thousands of files making up a three decade time-series. The presentation will cover the architecture of OceanXtremes, parallelization of the climatology computation and anomaly detection algorithms using Spark, example results for SST and other time-series, and parallel performance metrics.
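
The core anomaly computation described above (a daily field minus a chosen climatology, area-averaged over a region and compared to a threshold) can be sketched as follows. This is an illustrative simplification, not OceanXtremes code; the function name, grid representation, and 2.0-degree threshold are assumptions.

```python
# Sketch: flag days whose region-averaged anomaly (daily field minus
# climatology) exceeds a threshold. Grids are plain lists of rows.
def detect_anomalous_days(daily_fields, climatology, threshold):
    """Return indices of days whose area-averaged anomaly exceeds
    `threshold`; `daily_fields` is a list of 2-D grids matching the
    shape of `climatology`."""
    flagged = []
    for day, field in enumerate(daily_fields):
        total, count = 0.0, 0
        for row_f, row_c in zip(field, climatology):
            for v, c in zip(row_f, row_c):
                total += v - c          # per-pixel anomaly
                count += 1
        if total / count > threshold:   # area-averaged anomaly
            flagged.append(day)
    return flagged
```

In the real system this subtraction is pre-computed and tiled in parallel with Spark; the sketch only shows the arithmetic a single tile would perform.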

  11. Image Analysis Based on Soft Computing and Applied on Space Shuttle During the Liftoff Process

    NASA Technical Reports Server (NTRS)

    Dominquez, Jesus A.; Klinko, Steve J.

    2007-01-01

    Imaging techniques based on Soft Computing (SC) and developed at Kennedy Space Center (KSC) have been implemented on a variety of prototype applications related to the safe operation of the Space Shuttle during the liftoff process. These SC-based prototype applications include detection and tracking of moving Foreign Object Debris (FOD) during the Space Shuttle liftoff, visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad, and visual detection of distant birds approaching the Space Shuttle launch pad. This SC-based image analysis capability developed at KSC was also used to analyze images acquired during the accident of the Space Shuttle Columbia and estimate the trajectory and velocity of the foam that caused the accident.

  12. System and Method for Outlier Detection via Estimating Clusters

    NASA Technical Reports Server (NTRS)

    Iverson, David J. (Inventor)

    2016-01-01

    An efficient method and system for real-time or offline analysis of multivariate sensor data for use in anomaly detection, fault detection, and system health monitoring is provided. Models automatically derived from training data, typically nominal system data acquired from sensors in normally operating conditions or from detailed simulations, are used to identify unusual, out of family data samples (outliers) that indicate possible system failure or degradation. Outliers are determined through analyzing a degree of deviation of current system behavior from the models formed from the nominal system data. The deviation of current system behavior is presented as an easy to interpret numerical score along with a measure of the relative contribution of each system parameter to any off-nominal deviation. The techniques described herein may also be used to "clean" the training data.
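
A minimal sketch of the scoring idea in this record: a sample's deviation from models formed from nominal data is reported as a numerical score plus each parameter's relative contribution. The cluster centers, which the patented system derives automatically from training data, are supplied directly here for illustration; the distance metric and attribution scheme are assumptions.

```python
# Sketch: outlier score = distance to the nearest nominal cluster
# center, with each parameter's share of the squared deviation
# reported as its "contribution" to the off-nominal behavior.
def outlier_score(sample, centers):
    """Return (score, contributions) relative to the nearest center."""
    best = None
    for center in centers:
        diffs = [(s - c) ** 2 for s, c in zip(sample, center)]
        dist = sum(diffs) ** 0.5
        if best is None or dist < best[0]:
            best = (dist, diffs)
    score, diffs = best
    total = sum(diffs) or 1.0
    # fraction of the squared deviation attributable to each parameter
    contributions = [d / total for d in diffs]
    return score, contributions
```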

  13. Discrepancy of cytogenetic analysis in Western and eastern Taiwan.

    PubMed

    Chang, Yu-Hsun; Chen, Pui-Yi; Li, Tzu-Ying; Yeh, Chung-Nan; Li, Yi-Shian; Chu, Shao-Yin; Lee, Ming-Liang

    2013-06-01

    This study aimed at investigating the results of second-trimester amniocyte karyotyping in western and eastern Taiwan, and identifying any regional differences in the prevalence of fetal chromosomal anomalies. From 2004 to 2009, pregnant women who underwent amniocentesis in their second trimester at three hospitals in western Taiwan and at four hospitals in eastern Taiwan were included. All the cytogenetic analyses of cultured amniocytes were performed in the cytogenetics laboratory of the Genetic Counseling Center of Hualien Buddhist Tzu Chi General Hospital. We used the chi-square test, Student t test, and Mann-Whitney U test to evaluate the variants of clinical indications, amniocyte karyotyping results, and prevalence and types of chromosomal anomalies in western and eastern Taiwan. During the study period, 3573 samples, 1990 (55.7%) from western Taiwan and 1583 (44.3%) from eastern Taiwan, were collected and analyzed. The main indication for amniocyte karyotyping was advanced maternal age (69.0% in western Taiwan, 67.1% in eastern Taiwan). The detection rates of chromosomal anomalies by amniocyte karyotyping in eastern Taiwan (45/1582, 2.8%) did not differ significantly from that in western Taiwan (42/1989, 2.1%) (p = 1.58). Mothers who had abnormal ultrasound findings and histories of familial hereditary diseases or chromosomal anomalies had higher detection rates of chromosomal anomalies (9.3% and 7.2%, respectively). The detection rate of autosomal anomalies was higher in eastern Taiwan (93.3% vs. 78.6%, p = 0.046), but the detection rate of sex-linked chromosomal anomalies was higher in western Taiwan (21.4% vs. 6.7%, p = 0.046). We demonstrated regional differences in second-trimester amniocyte karyotyping results and established a database of common chromosomal anomalies that could be useful for genetic counseling, especially in eastern Taiwan. Copyright © 2012. Published by Elsevier B.V.

  14. An investigation of thermal anomalies in the Central American volcanic chain and evaluation of the utility of thermal anomaly monitoring in the prediction of volcanic eruptions. [Central America

    NASA Technical Reports Server (NTRS)

    Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.

    1975-01-01

    The author has identified the following significant results. Ground truth data collection proves that significant anomalies exist at 13 volcanoes within the test site of Central America. The dimensions and temperature contrast of these ten anomalies are large enough to be detected by the Skylab 192 instrument. The dimensions and intensity of thermal anomalies have changed at most of these volcanoes during the Skylab mission.

  15. System and method for anomaly detection

    DOEpatents

    Scherrer, Chad

    2010-06-15

    A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrence as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.
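
The frequency-table scoring step can be illustrated with a much-simplified sketch: observations are counted in a table, and a new observation is scored by how rare its variable combination is. The single joint table and the add-one smoothing are assumptions for illustration; the patent builds a conditional independence model over derived variables rather than one joint table.

```python
import math
from collections import Counter

# Sketch: rare variable combinations receive high anomaly scores.
def build_table(observations):
    """Count occurrences of each variable combination."""
    return Counter(tuple(o) for o in observations)

def score(observation, table, total):
    # add-one smoothing so unseen combinations get a finite score
    count = table.get(tuple(observation), 0)
    return -math.log((count + 1) / (total + 1))
```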

  16. A primitive study on unsupervised anomaly detection with an autoencoder in emergency head CT volumes

    NASA Astrophysics Data System (ADS)

    Sato, Daisuke; Hanaoka, Shouhei; Nomura, Yukihiro; Takenaga, Tomomi; Miki, Soichiro; Yoshikawa, Takeharu; Hayashi, Naoto; Abe, Osamu

    2018-02-01

    Purpose: The target disorders of emergency head CT are wide-ranging. Therefore, people working in an emergency department desire a computer-aided detection system for general disorders. In this study, we proposed an unsupervised anomaly detection method for emergency head CT using an autoencoder and evaluated its anomaly detection performance. Methods: We used a 3D convolutional autoencoder (3D-CAE), which contains 11 layers in the convolution block and 6 layers in the deconvolution block. In the training phase, we trained the 3D-CAE using 10,000 3D patches extracted from 50 normal cases. In the test phase, we calculated the abnormality of each voxel in 38 emergency head CT volumes (22 abnormal cases and 16 normal cases) and evaluated the likelihood of lesion existence. Results: Our method achieved a sensitivity of 68% and a specificity of 88%, with an area under the receiver operating characteristic curve of 0.87. This shows that the method has moderate accuracy in distinguishing normal CT cases from abnormal ones. Conclusion: Our method has potential for anomaly detection in emergency head CT.
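
The voxel-wise abnormality calculation can be sketched as reconstruction-error scoring, the standard mechanism behind autoencoder-based anomaly detection: voxels the model cannot reconstruct well get high abnormality scores. The trained 3D-CAE is replaced here by a hypothetical stand-in reconstructor (a 1-D moving average, for brevity) so that only the scoring logic is shown; the threshold is likewise an assumption.

```python
# Sketch: abnormality = squared reconstruction error per element,
# thresholded into an anomaly map. `reconstruct` is a stand-in for
# the trained autoencoder, NOT the 3D-CAE from the paper.
def reconstruct(volume):
    # stand-in reconstructor: 1-D moving average over neighbors
    out = []
    for i in range(len(volume)):
        window = volume[max(0, i - 1): i + 2]
        out.append(sum(window) / len(window))
    return out

def abnormality_map(volume, threshold):
    recon = reconstruct(volume)
    errors = [(v - r) ** 2 for v, r in zip(volume, recon)]
    return [e > threshold for e in errors]
```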

  17. Effects of Sampling and Spatio/Temporal Granularity in Traffic Monitoring on Anomaly Detectability

    NASA Astrophysics Data System (ADS)

    Ishibashi, Keisuke; Kawahara, Ryoichi; Mori, Tatsuya; Kondoh, Tsuyoshi; Asano, Shoichiro

    We quantitatively evaluate how sampling and spatio/temporal granularity in traffic monitoring affect the detectability of anomalous traffic. These parameters also affect the monitoring burden, so network operators face a trade-off between the monitoring burden and detectability and need to know the optimal parameter values. We derive equations to calculate the false positive ratio and false negative ratio for given values of the sampling rate, granularity, statistics of normal traffic, and volume of anomalies to be detected. Specifically, assuming that the normal traffic has a Gaussian distribution, which is parameterized by its mean and standard deviation, we analyze how sampling and monitoring granularity change these distribution parameters. This analysis is based on observation of backbone traffic, which exhibits spatially uncorrelated and temporally long-range dependent behavior. We then derive the equations for detectability. With these equations, we can answer the practical questions that arise in actual network operations: what sampling rate to set in order to find a given volume of anomaly, or, if that sampling rate is too high for actual operation, what granularity is optimal for finding the anomaly given a lower limit on the sampling rate.
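
Under the paper's Gaussian assumption, the false positive and false negative ratios of a threshold detector follow from standard Gaussian tail probabilities, as in the hedged sketch below. The closed forms here are generic textbook expressions, not the paper's exact equations, and all parameter values are illustrative.

```python
import math

# Sketch: given the mean/std of (sampled) normal traffic, a detection
# threshold, and the anomaly volume to be found, compute the false
# positive ratio (normal traffic alone exceeds the threshold) and the
# false negative ratio (traffic plus anomaly stays below it).
def gaussian_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def detectability(mu, sigma, threshold, anomaly_volume):
    fpr = 1.0 - gaussian_cdf(threshold, mu, sigma)
    fnr = gaussian_cdf(threshold, mu + anomaly_volume, sigma)
    return fpr, fnr
```

Sampling and coarser granularity enter this picture by changing `mu` and `sigma` of the monitored aggregate, which is exactly the effect the paper analyzes.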

  18. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Urnes, James M., Sr. (Inventor); Smith, Timothy A. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
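
The claimed sequence (modeling error, significance test, persistence check, indication) can be sketched as follows. The z-score significance test against an assumed sensor noise level and the specific limits are illustrative assumptions; the patent does not prescribe them.

```python
# Sketch: indicate a structural anomaly only when the modeling error
# stays statistically significant for longer than a persistence
# threshold, suppressing isolated noise spikes.
def detect_structural_anomaly(measurements, expected, sigma,
                              z_limit=3.0, persistence_threshold=3):
    persistence = 0
    for m, e in zip(measurements, expected):
        z = abs(m - e) / sigma          # significance of modeling error
        persistence = persistence + 1 if z > z_limit else 0
        if persistence > persistence_threshold:
            return True                 # anomaly indicated
    return False
```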

  19. Spectral anomaly methods for aerial detection using KUT nuisance rejection

    NASA Astrophysics Data System (ADS)

    Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.

    2015-06-01

    This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical due to higher speeds, larger field of view, and geographically induced background changes. As well, the possible large altitude or stand-off distance variations cause significant steps in background count rate as well as spectral changes due to increased gamma-ray scatter with detection at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems; a NaI(Tl)-detector-based system and a CsI detector array. The optimization here details the adaptation of the spectral windows for a particular set of target sources to aerial detection and the tailoring for the specific detectors. As well, the methodology and results for background rejection methods optimized for the aerial gamma-ray detection using Potassium, Uranium and Thorium (KUT) nuisance rejection are shown. Results indicate that use of a realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection due to altitude changes and geographically induced steps such as at land-water interfaces.

  20. Formal Methods for Information Protection Technology. Task 2: Mathematical Foundations, Architecture and Principles of Implementation of Multi-Agent Learning Components for Attack Detection in Computer Networks. Part 2

    DTIC Science & Technology

    2003-11-01

    Lafayette, IN 47907. [Lane et al-97b] T. Lane and C. E. Brodley. Sequence matching and learning in anomaly detection for computer security. Proceedings of... Mining, pp 259-263. 1998. [Lane et al-98b] T. Lane and C. E. Brodley. Temporal sequence learning and data reduction for anomaly detection... W. Lee, C. Park, and S. Stolfo. Towards Automatic Intrusion Detection using NFR. 1st USENIX Workshop on Intrusion Detection and Network Monitoring

  1. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems.

    PubMed

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often take place in a manufacturing system. Real-time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called the PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for these purposes. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel with the data logging system to identify operational faults and behavioural anomalies effectively.
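
The indexing-and-lookup idea attributed to PLAT can be illustrated with a minimal sketch: log patterns observed in nominal PLC operation are stored in a hash-based structure, and records whose patterns never occurred in nominal operation are flagged. The pattern definition (a tuple of signal values) is an assumption for illustration, not PLAT's actual scheme.

```python
# Sketch: nominal model = hash-indexed set of observed signal
# patterns; lookup of each new record is O(1) on average, which is
# what makes this kind of check fast enough for real-time use.
def build_nominal_model(nominal_logs):
    return {tuple(record) for record in nominal_logs}   # hashed index

def find_anomalies(logs, model):
    """Return indices of log records absent from the nominal model."""
    return [i for i, record in enumerate(logs)
            if tuple(record) not in model]
```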

  2. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems

    PubMed Central

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often take place in a manufacturing system. Real-time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called the PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for these purposes. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel with the data logging system to identify operational faults and behavioural anomalies effectively. PMID:27974882

  3. Congenital aplastic-hypoplastic lumbar pedicle in infants and young children

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yousefzadeh, D.K.; El-Khoury, G.Y.; Lupetin, A.R.

    1982-01-01

    Nine cases of congenital aplastic-hypoplastic lumbar pedicle (mean age 27 months) are described. Their data are compared with those of 18 other reported cases (mean age 24.7 years), and the following conclusions are drawn: (1) Almost exclusively, the pedicular defect in infants and young children is due to a developmental anomaly rather than destruction by malignancy or infectious processes. (2) This anomaly, we think, is more common than it is believed to be. (3) Unlike adults, infants and young children rarely develop hypertrophy and/or sclerosis of the contralateral pedicle. (4) Detection of a pedicular anomaly is more than satisfying a radiographic curiosity and may lead to the discovery of other coexisting anomalies. (5) Ultrasonic screening of patients with congenital pedicular defects may detect the associated genitourinary anomalies, if present, and justify further studies in a selected group of patients.

  4. Observed TEC Anomalies by GNSS Sites Preceding the Aegean Sea Earthquake of 2014

    NASA Astrophysics Data System (ADS)

    Ulukavak, Mustafa; Yalçınkaya, Mualla

    2016-11-01

    In recent years, Total Electron Content (TEC) data obtained from Global Navigation Satellite System (GNSS) receivers have been widely used to detect seismo-ionospheric anomalies. In this study, Global Positioning System Total Electron Content (GPS-TEC) data were used to investigate abnormal ionospheric behavior prior to the 2014 Aegean Sea earthquake (40.305°N 25.453°E, 24 May 2014, 09:25:03 UT, Mw:6.9). Data obtained from three Continuously Operating Reference Stations in Turkey (CORS-TR) and two International GNSS Service (IGS) sites near the epicenter of the earthquake were used to detect ionospheric anomalies before the earthquake. The solar activity index (F10.7) and the geomagnetic activity index (Dst), which both relate to space weather conditions, were used to analyze these pre-earthquake ionospheric anomalies. Examination of these indices indicated high solar activity between May 8 and 15, 2014. The first significant increase (positive anomaly) in Vertical Total Electron Content (VTEC) was detected on May 14, 2014, 10 days before the earthquake. This positive anomaly can be attributed to the high solar activity. The indices do not indicate high solar or geomagnetic activity after May 15, 2014. Abnormal ionospheric TEC changes (a negative anomaly) were observed at all stations one day before the earthquake. These changes were below the lower bound by approximately 10-20 TEC units (TECU) and may be considered an ionospheric precursor of the 2014 Aegean Sea earthquake.
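
Bound-based TEC anomaly screening of the kind used in such studies can be sketched as follows: each day's VTEC is compared against mean ± kσ bounds computed over a trailing window of previous days, and excursions below the lower bound mark negative anomalies. The 15-day window and k = 2 are common choices in the TEC-precursor literature, assumed here rather than taken from this paper.

```python
import statistics

# Sketch: flag VTEC values outside mean +/- k*sigma bounds computed
# over a trailing window; returns (index, "negative"/"positive") pairs.
def tec_anomalies(vtec, window=15, k=2.0):
    flags = []
    for i in range(window, len(vtec)):
        past = vtec[i - window:i]
        mu = statistics.mean(past)
        sd = statistics.pstdev(past)
        lower, upper = mu - k * sd, mu + k * sd
        if vtec[i] < lower:
            flags.append((i, "negative"))
        elif vtec[i] > upper:
            flags.append((i, "positive"))
    return flags
```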

  5. Eddy-Current Inspection of Ball Bearings

    NASA Technical Reports Server (NTRS)

    Bankston, B.

    1985-01-01

    Custom eddy-current probe locates surface anomalies. Low friction air cushion within cone allows ball to roll easily. Eddy current probe reliably detects surface and near-surface cracks, voids, and material anomalies in bearing balls or other spherical objects. Defects in ball surface detected by probe displayed on CRT and recorded on strip-chart recorder.

  6. Archean Isotope Anomalies as a Window into the Differentiation History of the Earth

    NASA Astrophysics Data System (ADS)

    Wainwright, A. N.; Debaille, V.; Zincone, S. A.

    2018-05-01

    No resolvable µ142Nd anomaly was detected in Paleo- Mesoarchean rocks of São Francisco and West African cratons. The lack of µ142Nd anomalies outside of North America and Greenland implies the Earth differentiated into at least two distinct domains.

  7. Detecting buried explosive hazards with handheld GPR and deep learning

    NASA Astrophysics Data System (ADS)

    Besaw, Lance E.

    2016-05-01

    Buried explosive hazards (BEHs), including traditional landmines and homemade improvised explosives, have proven difficult to detect and defeat during and after conflicts around the world. Despite their various sizes, shapes, and construction materials, ground penetrating radar (GPR) is an excellent phenomenology for detecting BEHs due to its ability to sense localized differences in electromagnetic properties. Handheld GPR detectors are common equipment for detecting BEHs because of their flexibility (in part due to the human operator) and effectiveness in cluttered environments. With modern digital electronics and positioning systems, handheld GPR sensors can sense and map variation in electromagnetic properties while searching for BEHs. Additionally, large-scale computers have demonstrated an insatiable appetite for ingesting massive datasets and extracting meaningful relationships. This is nowhere more evident than in the maturation of deep learning artificial neural networks (ANNs) for image and speech recognition, now commonplace in industry and academia. This confluence of sensing, computing, and pattern recognition technologies offers great potential for developing automatic target recognition techniques to assist GPR operators searching for BEHs. In this work, deep learning ANNs are used to detect BEHs and discriminate them from harmless clutter. We apply these techniques to a multi-antenna handheld GPR with a centimeter-accurate positioning system that was used to collect data over prepared lanes containing a wide range of BEHs. This work demonstrates that deep learning ANNs can automatically extract meaningful information from complex GPR signatures, complementing existing GPR anomaly detection and classification techniques.

  8. GBAS Ionospheric Anomaly Monitoring Based on a Two-Step Approach

    PubMed Central

    Zhao, Lin; Yang, Fuxin; Li, Liang; Ding, Jicheng; Zhao, Yuxin

    2016-01-01

    As one significant component of space environmental weather, the ionosphere has to be monitored using Global Positioning System (GPS) receivers for the Ground-Based Augmentation System (GBAS), because an ionospheric anomaly can pose a potential threat to GBAS support of safety-critical services. The traditional code-carrier divergence (CCD) methods, which have been widely used to detect variations of the ionospheric gradient for GBAS, adopt a linear time-invariant low-pass filter to suppress the effect of high-frequency noise on the detection of the ionospheric anomaly. However, there is a trade-off between response time and estimation accuracy due to the fixed time constants. To overcome this limitation, a two-step approach (TSA) is proposed that integrates cascaded linear time-invariant low-pass filters with an adaptive Kalman filter to detect the ionospheric gradient anomaly. The performance of the proposed method is tested using simulated and real-world data, respectively. The simulation results show that the TSA can detect ionospheric gradient anomalies quickly, even under severe noise. Compared to the traditional CCD methods, the experiments with real-world GPS data indicate that the average estimation accuracy of the ionospheric gradient improves by more than 31.3%, and the average response time to an ionospheric gradient at a rate of 0.018 m/s improves by more than 59.3%, demonstrating the ability of TSA to detect a small ionospheric gradient more rapidly. PMID:27240367
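    The two-step structure described above (a fixed-constant low-pass stage followed by an adaptive Kalman tracking stage, with an alarm on the tracked gradient) can be sketched in a few lines of Python. This is a toy illustration under invented parameters, not the authors' TSA: the filter constants, alarm threshold, and the simulated divergence ramp are all hypothetical.

```python
# Toy two-step gradient monitor: fixed low-pass smoothing, then a scalar
# Kalman filter tracking the smoothed divergence, then a threshold test.
# All constants are illustrative, not the values used in the paper.

def low_pass(samples, alpha=0.2):
    """First-order low-pass filter with a fixed time constant."""
    out, state = [], samples[0]
    for x in samples:
        state = alpha * x + (1.0 - alpha) * state
        out.append(state)
    return out

def kalman_track(samples, q=1e-4, r=1e-2):
    """Scalar Kalman filter tracking a slowly varying gradient."""
    est, p, track = samples[0], 1.0, []
    for z in samples:
        p += q                 # predict: process noise inflates variance
        k = p / (p + r)        # Kalman gain adapts to current uncertainty
        est += k * (z - est)   # update with the new measurement
        p *= (1.0 - k)
        track.append(est)
    return track

def detect(samples, threshold=0.01):
    """Alarm wherever the tracked gradient exceeds the threshold."""
    return [abs(v) > threshold for v in kalman_track(low_pass(samples))]

# Quiet data followed by a sustained code-carrier divergence ramp.
data = [0.0] * 50 + [0.002 * i for i in range(1, 31)]
alarms = detect(data)
```

    On this synthetic input the quiet segment raises no alarm, while the sustained ramp eventually trips the threshold.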

  9. Pre-seismic anomalies in remotely sensed land surface temperature measurements: The case study of 2003 Boumerdes earthquake

    NASA Astrophysics Data System (ADS)

    Bellaoui, Mebrouk; Hassini, Abdelatif; Bouchouicha, Kada

    2017-05-01

    Detection of thermal anomalies prior to earthquake events has been widely reported by researchers over the past decade. One of the popular approaches for anomaly detection is the Robust Satellite Techniques (RST) approach. In this paper, we apply this method to a collection of six years of MODIS satellite data, representing land surface temperature (LST) images, to study the 21 May 2003 Boumerdes, Algeria, earthquake. The thermal anomaly results were compared with the ambient temperature variation measured at three meteorological stations of the Algerian National Office of Meteorology (ONM) (DELLYS-AFIR, TIZI-OUZOU, and DAR-EL-BEIDA). The results confirm RST as a highly effective approach for earthquake monitoring.

  10. [A case report of Ebstein's anomaly treated with Hetzer's procedure].

    PubMed

    Sako, H; Hadama, T; Shigemitsu, O; Miyamoto, S; Anai, H; Wada, T; Iwata, E; Mori, Y; Soeda, T; Takakura, K

    2001-02-01

    A 27-year-old male who had been diagnosed with Ebstein's anomaly was admitted with uncontrollable congestive heart failure. The echocardiogram revealed severe tricuspid valve incompetence and the electrocardiogram showed atrial fibrillation. He underwent Hetzer's repair procedure for tricuspid valve incompetence and Minzioni's right atrial isolation technique to restore sinus rhythm. His congestive heart failure quickly disappeared and sinus rhythm was restored after operation. He was discharged 3 weeks postoperatively and remains well 22 months after his operation. Hetzer's technique for tricuspid valve repair in Ebstein's anomaly restructures the valve mechanism at the level of the true tricuspid anulus by using the most mobile leaflet for valve closure without plication of the atrialized chamber. We conclude that Hetzer's procedure is an effective operation for Ebstein's anomaly.

  11. Intraoperative conjoined lumbosacral nerve roots associated with spondylolisthesis.

    PubMed

    Popa, Iulian; Poenaru, Dan V; Oprea, Manuel D; Andrei, Diana

    2013-07-01

    Lumbosacral nerve root anomalies may produce low back pain and are a reported cause of failed back surgery. They are usually left undiagnosed, especially in endoscopic discectomy techniques. Any surgery for entrapment disorders performed on a patient with an undiagnosed lumbosacral nerve root anomaly may lead to serious neural injuries because of an improper surgical technique or decompression. In this report, we describe our experience with a case of L5-S1 spondylolisthesis and associated congenital lumbosacral nerve root anomalies discovered during the surgical intervention, and the difficulties raised by such a discovery. Careful examination of coronal and axial views obtained through high-quality Magnetic Resonance Imaging may lead to a proper diagnosis of this condition and to adequate surgical planning, minimizing intraoperative complications.

  12. Expanding living kidney donor criteria with ex-vivo surgery for renal anomalies

    PubMed Central

    McGregor, Thomas B.; Rampersad, Christie; Patel, Premal

    2016-01-01

    Introduction: Renal transplantation remains the gold standard treatment for end-stage renal disease, with living donor kidneys providing the best outcomes in terms of allograft survival. As the number of patients on the waitlist continues to grow, efforts to expand the donor pool are ongoing. Accepting donors with renal anomalies has been explored as a potential way to expand the living donor pool. We sought to determine how many patients presented with anatomic renal anomalies at our transplant centre and to describe the ex-vivo surgical techniques used to render these kidneys suitable for transplantation. Methods: A retrospective review was performed of all patients referred for surgical suitability to undergo laparoscopic donor nephrectomy between January 2011 and January 2015. Patient charts were analyzed for demographic information, perioperative variables, urological histories, and postoperative outcomes. Results: A total of 96 referrals were identified, of which 81 patients underwent laparoscopic donor nephrectomy. Of these patients, 11 (13.6%) were identified as having a renal anomaly that could potentially exclude them from the donation process. These anomalies included five patients with unilateral nephrolithiasis, four patients with large renal cysts (>4 cm diameter), one patient with an angiomyolipoma (AML), and one patient with a calyceal diverticulum filled with stones. A description of the ex-vivo surgical techniques used to correct these renal anomalies is provided. Conclusions: We have shown that ex-vivo surgical techniques can safely and effectively correct some renal anomalies and render the kidneys transplantable, helping to expand the living donor pool. PMID:27800047

  13. Early Evaluation of the Fetal Heart.

    PubMed

    Hernandez-Andrade, Edgar; Patwardhan, Manasi; Cruz-Lemini, Mónica; Luewan, Suchaya

    2017-01-01

    Evaluation of the fetal heart at 11-13 + 6 weeks of gestation is indicated for women with a family history of congenital heart defects (CHD), a previous child with CHD, or an ultrasound finding associated with cardiac anomalies. The accuracy of early detection of CHD is highly related to the experience of the operator. The 4-chamber view and outflow tracts are the most important planes for identification of an abnormal heart, and can be obtained in the majority of fetuses from 11 weeks of gestation onward. Transvaginal ultrasound is the preferred route for fetal cardiac examination prior to 12 weeks of gestation, whereas, after 12 weeks, the fetal heart can be reliably evaluated by transabdominal ultrasound. Cardiac defects, such as ventricular septal defects, tetralogy of Fallot, Ebstein's anomaly, or cardiac tumors, are unlikely to be identified at ≤14 weeks of gestation. Additional ultrasound techniques, such as spatiotemporal image correlation and the evaluation of volumes by a fetal-heart expert, can improve the detection of congenital heart disease. The evaluation of fetal cardiac function at 11-13 + 6 weeks of gestation can be useful for early identification of fetuses at risk of anemia due to hemoglobinopathies, such as hemoglobin Bart's disease. © 2017 S. Karger AG, Basel.

  14. Detection of micronuclei formation and nuclear anomalies in regenerative nodules of human cirrhotic livers and relationship to hepatocellular carcinoma.

    PubMed

    de Almeida, Terezinha M B; Leitão, Regina C; Andrade, Joyce D; Beçak, Willy; Carrilho, Flair J; Sonohara, Shigueko

    2004-04-01

    Human cirrhosis is considered an important factor in hepatocarcinogenesis. The lack of substantial genetic and cytogenetic data on human cirrhosis led us to investigate spontaneous micronucleus formation as an indicator of chromosomal damage. The analysis was performed on hepatocytes of regenerative, macroregenerative, and tumoral nodules from 30 cases of cirrhosis (paraffin-embedded archival material), retrospectively selected: cryptogenic, hepatitis C virus, and hepatitis C virus associated with hepatocellular carcinoma (HCC). Thirteen control liver samples from healthy organ donors were included. Micronucleated hepatocytes were analyzed with Feulgen fast-green staining. The spontaneous frequency of micronucleated hepatocytes in both regenerative and macroregenerative nodules of all cirrhotic patients was significantly higher than in the normal control group. There was no significant difference in the frequency of micronucleated hepatocytes in regenerative nodules compared with macroregenerative nodules for all cases analyzed, whereas a significantly higher frequency of micronucleated hepatocytes was detected in tumoral nodules compared with cirrhotic regenerative nodules and normal parenchyma. A higher frequency of the nuclear anomalies termed broken eggs was observed in hepatitis C virus-related samples. Chromatinic losses and genotoxicity already existed in the cirrhotic regenerative nodules, which might predispose to development of HCC.

  15. Practical method to identify orbital anomaly as spacecraft breakup in the geostationary region

    NASA Astrophysics Data System (ADS)

    Hanada, Toshiya; Uetsuhara, Masahiko; Nakaniwa, Yoshitaka

    2012-07-01

    Identifying a spacecraft breakup is an essential issue in defining the current orbital debris environment. This paper proposes a practical method to identify an orbital anomaly, which appears as a significant discontinuity in the observation data, as a spacecraft breakup. The proposed method is applicable to orbital anomalies in the geostationary region. Long-term orbital evolution of breakup fragments indicates that their orbital planes will converge into several corresponding regions in inertial space even if the breakup epoch is not specified. This empirical method combines that conclusion with the search strategy developed at Kyushu University, which can identify the origins of observed objects as fragments released from a specified spacecraft. The practical method starts by selecting a spacecraft that experienced an orbital anomaly and formulating a hypothesis that fragments were generated by the anomaly. Then, the search strategy is applied to predict the behavior of the groups of fragments hypothetically generated. The outcome of this predictive analysis effectively specifies when, where, and how optical measurements with ground-based telescopes should be conducted. Objects detected based on this outcome are presumed to come from the anomaly, so that the anomaly can be confirmed as a spacecraft breakup that released the detected objects. This paper also demonstrates observation planning for a spacecraft anomaly in the geostationary region.

  16. Detecting Pulsing Denial-of-Service Attacks with Nondeterministic Attack Intervals

    NASA Astrophysics Data System (ADS)

    Luo, Xiapu; Chan, Edmond W. W.; Chang, Rocky K. C.

    2009-12-01

    This paper addresses the important problem of detecting pulsing denial-of-service (PDoS) attacks, which send a sequence of attack pulses to reduce TCP throughput. Unlike previous works, which focused on a restricted form of attacks, we consider a very broad class of attacks. In particular, our attack model admits any attack interval between two adjacent pulses, whether deterministic or not. It also includes the traditional flooding-based attacks as a limiting case (i.e., zero attack interval). Our main contribution is Vanguard, a new anomaly-based detection scheme for this class of PDoS attacks. The Vanguard detection is based on three traffic anomalies induced by the attacks, and it detects them using a CUSUM algorithm. We have prototyped Vanguard and evaluated it on a testbed. The experimental results show that Vanguard is more effective than the previous methods that are based on other traffic anomalies (after a transformation using wavelet transform, Fourier transform, and autocorrelation) and detection algorithms (e.g., dynamic time warping).
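    The CUSUM stage of such a detector can be sketched as follows. This is a generic one-sided CUSUM under invented parameters (the `slack`, `threshold`, and synthetic traffic statistic are illustrative stand-ins), not Vanguard's actual decision rule.

```python
# One-sided CUSUM change detector: accumulate positive deviations of a
# traffic statistic from its expected mean and alarm when the running sum
# crosses a threshold. Parameters and data are illustrative only.

def cusum(samples, target_mean, slack=0.5, threshold=5.0):
    """Return the first index at which the CUSUM statistic alarms, or -1."""
    s = 0.0
    for i, x in enumerate(samples):
        # Only deviations exceeding the slack contribute; s never goes negative.
        s = max(0.0, s + (x - target_mean - slack))
        if s > threshold:
            return i
    return -1

# Normal traffic statistic around 1.0, then a PDoS-induced shift to 4.0.
normal = [1.0, 0.8, 1.2, 1.1, 0.9] * 4
attacked = normal + [4.0] * 10
clean_alarm = cusum(normal, target_mean=1.0)      # no alarm expected
alarm_at = cusum(attacked, target_mean=1.0)       # alarms shortly after index 20
```

    The slack term makes the detector ignore ordinary fluctuations, while a sustained shift accumulates quickly; here the shift starts at index 20 and the alarm fires a few samples later.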

  17. A scalable architecture for online anomaly detection of WLCG batch jobs

    NASA Astrophysics Data System (ADS)

    Kuehn, E.; Fischer, M.; Giffels, M.; Jung, C.; Petzold, A.

    2016-10-01

    For data centres it is increasingly important to monitor network usage and learn from network usage patterns. In particular, configuration issues or misbehaving batch jobs that prevent smooth operation need to be detected as early as possible. At the GridKa data and computing centre we therefore operate a tool, BPNetMon, for monitoring traffic data and characteristics of WLCG batch jobs and pilots locally on different worker nodes. On the one hand, local information by itself is not sufficient to detect anomalies, for several reasons: e.g., the underlying job distribution on a single worker node might change, or there might be a local misconfiguration. On the other hand, a centralised anomaly detection approach does not scale with respect to network communication or computational costs. We therefore propose a scalable architecture based on concepts of a super-peer network.

  18. Detection of emerging sunspot regions in the solar interior.

    PubMed

    Ilonidis, Stathis; Zhao, Junwei; Kosovichev, Alexander

    2011-08-19

    Sunspots are regions where strong magnetic fields emerge from the solar interior and where major eruptive events occur. These energetic events can cause power outages, interrupt telecommunication and navigation services, and pose hazards to astronauts. We detected subsurface signatures of emerging sunspot regions before they appeared on the solar disc. Strong acoustic travel-time anomalies on the order of 12 to 16 seconds were detected as deep as 65,000 kilometers. These anomalies were associated with magnetic structures that emerged with an average speed of 0.3 to 0.6 kilometer per second and caused high peaks in the photospheric magnetic flux rate 1 to 2 days after the detection of the anomalies. Thus, synoptic imaging of subsurface magnetic activity may allow anticipation of large sunspot regions before they become visible, improving space weather forecast.

  19. Comparison of outliers and novelty detection to identify ionospheric TEC irregularities during geomagnetic storm and substorm

    NASA Astrophysics Data System (ADS)

    Pattisahusiwa, Asis; Houw Liong, The; Purqon, Acep

    2016-08-01

    In this study, we compare two learning mechanisms, outlier detection and novelty detection, for detecting ionospheric TEC disturbances caused by the November 2004 geomagnetic storm and the January 2005 substorm. The mechanisms are applied using the v-SVR learning algorithm, a regression version of SVM. Our results show that both mechanisms are quite accurate in learning the TEC data. However, novelty detection is more accurate than outlier detection in extracting anomalies related to geomagnetic events. The anomalies flagged by outlier detection are mostly related to trends in the data, while those flagged by novelty detection are associated with the geomagnetic events. Novelty detection also shows evidence of large-scale travelling ionospheric disturbances (LSTIDs) during the geomagnetic events.
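    The contrast between the two mechanisms can be illustrated with a deliberately simplified stand-in: plain z-score thresholds replace the paper's v-SVR models. Fitting the model on the whole series (outlier detection) lets the storm inflate the baseline statistics and partly mask itself, while fitting on quiet-time data only (novelty detection) isolates the storm-time TEC values. All numbers are synthetic.

```python
# Outlier vs. novelty detection with a toy z-score model. The only
# difference is the reference data the model is fitted on.

def mean_std(values):
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / len(values)
    return m, var ** 0.5

def flag(values, reference, k=3.0):
    """Flag values lying more than k standard deviations from the reference mean."""
    m, s = mean_std(reference)
    return [abs(v - m) > k * s for v in values]

quiet = [10.0, 10.5, 9.8, 10.2, 9.9, 10.1, 10.3, 9.7]  # quiet-time TEC (TECU)
storm = [25.0, 27.0, 26.0]                             # storm-time TEC
series = quiet + storm

outlier_flags = flag(series, reference=series)  # fitted on everything: storm masks itself
novelty_flags = flag(series, reference=quiet)   # fitted on quiet data only: storm stands out
```

    With the whole series as reference, the storm values inflate the standard deviation so much that nothing exceeds three sigma; with the quiet reference, every storm value is flagged.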

  20. Chromosomal microarray in clinical diagnosis: a study of 337 patients with congenital anomalies and developmental delays or intellectual disability.

    PubMed

    Sansović, Ivona; Ivankov, Ana-Maria; Bobinec, Adriana; Kero, Mijana; Barišić, Ingeborg

    2017-06-14

    To determine the diagnostic yield and the criteria that could help to classify and interpret the copy number variations (CNVs) detected by the chromosomal microarray (CMA) technique in patients with congenital and developmental abnormalities, including dysmorphia, developmental delay (DD) or intellectual disability (ID), autism spectrum disorders (ASD), and congenital anomalies (CA). CMA analysis was performed in 337 patients with DD/ID with or without dysmorphism, ASD, and/or CA. In 30 of the 337 patients, chromosomal imbalances had previously been detected by classical cytogenetic and molecular cytogenetic methods. In 73 of the 337 patients, clinically relevant variants were detected and better characterized; most of them were >1 Mb. Variants of unknown clinical significance (VOUS) were discovered in 35 patients. The most common VOUS size category was <300 kb (40.5%). Deletions and de novo imbalances were more frequent in the pathogenic CNV category than in the VOUS category. CMA had a high diagnostic yield of 43/307, excluding patients previously detected by other methods. CMA was valuable in establishing the diagnosis in a high proportion of patients. Criteria for the classification and interpretation of CNVs include CNV size and type, mode of inheritance, and genotype-phenotype correlation. The Agilent ISCA v2 Human Genome 8x60K oligonucleotide microarray format proved to offer reasonable resolution for clinical use, particularly in the regions that are recommended by the International Standard Cytogenomic Array (ISCA) Consortium and associated with well-established syndromes.

  1. Privacy-preserving outlier detection through random nonlinear data distortion.

    PubMed

    Bhaduri, Kanishka; Stefanski, Mark D; Srivastava, Ashok N

    2011-02-01

    Consider a scenario in which the data owner has some private or sensitive data and wants a data miner to access them for studying important patterns without revealing the sensitive information. Privacy-preserving data mining aims to solve this problem by randomly transforming the data prior to their release to the data miners. Previous works considered only linear data perturbations (additive, multiplicative, or a combination of both) when studying the usefulness of the perturbed output. In this paper, we discuss nonlinear data distortion using potentially nonlinear random data transformations and show how it can be useful for privacy-preserving anomaly detection from sensitive data sets. We develop bounds on the expected accuracy of the nonlinear distortion and also quantify privacy using standard definitions. A highlight of this approach is that it allows the user to control the amount of privacy by varying the degree of nonlinearity. We show how the general transformation can be used for anomaly detection in practice for two specific problem instances: a linear model and a popular nonlinear model using the sigmoid function. We also analyze the proposed nonlinear transformation in full generality and then show that, for specific cases, it is distance preserving. A main contribution of this paper is the discussion of the relationship between the invertibility of a transformation and privacy preservation, and the application of these techniques to outlier detection. Experiments conducted on real-life data sets demonstrate the effectiveness of the approach.
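    The sigmoid-based instance mentioned above can be caricatured as follows: each value passes through a randomly shifted and scaled tanh whose parameters only the data owner knows, hiding the raw values while leaving an extreme point separable in the released data. The transform, its parameter ranges, and the z-score detector are illustrative assumptions, not the paper's exact scheme or bounds.

```python
# Toy nonlinear random distortion for privacy-preserving outlier detection.
# The random scale/shift act as a private key; the miner sees only the
# distorted values, yet the anomaly still stands out.
import math
import random

def distort(values, seed=7):
    rng = random.Random(seed)        # seed plays the role of a private key
    a = rng.uniform(0.8, 1.2)        # random scale (illustrative range)
    b = rng.uniform(-0.1, 0.1)       # random shift (illustrative range)
    return [math.tanh(a * (v - b)) for v in values]

def flag_outliers(values, k=3.0):
    """Simple z-score outlier test run by the data miner on released data."""
    m = sum(values) / len(values)
    s = (sum((v - m) ** 2 for v in values) / len(values)) ** 0.5
    return [abs(v - m) > k * s for v in values]

data = [0.01 * i - 0.1 for i in range(20)] + [5.0]  # last point is anomalous
released = distort(data)                            # miner never sees `data`
flags = flag_outliers(released)
```

    Because tanh is hard to invert without the random parameters, the raw values stay private, yet the distorted anomaly remains far outside the distorted bulk of the data.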

  2. Development of multiple source data processing for structural analysis at a regional scale. [digital remote sensing in geology

    NASA Technical Reports Server (NTRS)

    Carrere, Veronique

    1990-01-01

    Various image processing techniques developed for enhancement and extraction of linear features, of interest to the structural geologist, from digital remote sensing, geologic, and gravity data, are presented. These techniques include: (1) automatic detection of linear features and construction of rose diagrams from Landsat MSS data; (2) enhancement of principal structural directions using selective filters on Landsat MSS, Spacelab panchromatic, and HCMM NIR data; (3) directional filtering of Spacelab panchromatic data using Fast Fourier Transform; (4) detection of linear/elongated zones of high thermal gradient from thermal infrared data; and (5) extraction of strong gravimetric gradients from digitized Bouguer anomaly maps. Processing results can be compared to each other through the use of a geocoded database to evaluate the structural importance of each lineament according to its depth: superficial structures in the sedimentary cover, or deeper ones affecting the basement. These image processing techniques were successfully applied to achieve a better understanding of the transition between Provence and the Pyrenees structural blocks, in southeastern France, for an improved structural interpretation of the Mediterranean region.

  3. Deep learning algorithms for detecting explosive hazards in ground penetrating radar data

    NASA Astrophysics Data System (ADS)

    Besaw, Lance E.; Stimac, Philip J.

    2014-05-01

    Buried explosive hazards (BEHs) have been, and continue to be, one of the most deadly threats in modern conflicts. Current handheld sensors rely on a highly trained operator for them to be effective in detecting BEHs. New algorithms are needed to reduce the burden on the operator and improve the performance of handheld BEH detectors. Traditional anomaly detection and discrimination algorithms use "hand-engineered" feature extraction techniques to characterize and classify threats. In this work we use a Deep Belief Network (DBN) to transcend the traditional approaches of BEH detection (e.g., principal component analysis and real-time novelty detection techniques). DBNs are pretrained using an unsupervised learning algorithm to generate compressed representations of unlabeled input data and form feature detectors. They are then fine-tuned using a supervised learning algorithm to form a predictive model. Using ground penetrating radar (GPR) data collected by a robotic cart swinging a handheld detector, our research demonstrates that relatively small DBNs can learn to model GPR background signals and detect BEHs with an acceptable false alarm rate (FAR). In this work, our DBNs achieved 91% probability of detection (Pd) with 1.4 false alarms per square meter when evaluated on anti-tank and anti-personnel targets at temperate and arid test sites. This research demonstrates that DBNs are a viable approach to detect and classify BEHs.

  4. Improving the geological interpretation of magnetic and gravity satellite anomalies

    NASA Technical Reports Server (NTRS)

    Hinze, W. J.; Braile, L. W. (Principal Investigator); Vonfrese, R. R. B.

    1985-01-01

    Current limitations in the quantitative interpretation of satellite-elevation geopotential field data and magnetic anomaly data were investigated along with techniques to overcome them. A major result was the preparation of an improved scalar magnetic anomaly map of South America and adjacent marine areas directly from the original MAGSAT data. In addition, comparisons of South American and Euro-African data show a strong correlation of anomalies along the Atlantic rifted margins of the continents.

  5. Vascular anomalies: classification, imaging characteristics and implications for interventional radiology treatment approaches

    PubMed Central

    Prajapati, H J S; Martin, L G; Patel, T H

    2014-01-01

    The term vascular anomaly represents a broad spectrum of vascular pathology, including proliferating vascular tumours and vascular malformations. While the treatment of most vascular anomalies is multidisciplinary, interventional radiology procedures, including embolic therapy, sclerotherapy and laser coagulation among others, are playing an increasingly important role in vascular anomaly management. This review discusses the diagnosis and treatment of common vascular malformations, with emphasis on the technique, efficacy and complications of different interventional radiology procedures. PMID:24588666

  6. Energy and remote sensing. [satellite exploration, monitoring, siting

    NASA Technical Reports Server (NTRS)

    Summers, R. A.; Smith, W. L.; Short, N. M.

    1977-01-01

    Exploration for uranium, thorium, oil, gas and geothermal activity through remote sensing techniques is considered; satellite monitoring of coal-derived CO2 in the atmosphere, and the remote assessment of strip mining and land restoration are also mentioned. Reference is made to color ratio composites based on Landsat data, which may aid in the detection of uranium deposits, and to computer-enhanced black and white airborne scanning imagery, which may locate geothermal anomalies. Other applications of remote sensing to energy resources management, including mapping of transportation networks and power plant siting, are discussed.

  7. Improvements in the simulation code of the SOX experiment

    NASA Astrophysics Data System (ADS)

    Caminata, A.; Agostini, M.; Altenmüeller, K.; Appel, S.; Atroshchenko, V.; Bellini, G.; Benziger, J.; Bick, D.; Bonfini, G.; Bravo, D.; Caccianiga, B.; Calaprice, F.; Carlini, M.; Cavalcante, P.; Chepurnov, A.; Choi, K.; Cribier, M.; D'Angelo, D.; Davini, S.; Derbin, A.; Di Noto, L.; Drachnev, I.; Durero, M.; Etenko, A.; Farinon, S.; Fischer, V.; Fomenko, K.; Franco, D.; Gabriele, F.; Gaffiot, J.; Galbiati, C.; Gschwender, M.; Ghiano, C.; Giammarchi, M.; Goeger-Neff, M.; Goretti, A.; Gromov, M.; Hagner, C.; Houdy, Th.; Hungerford, E.; Ianni, Aldo; Ianni, Andrea; Jonquères, N.; Jany, A.; Jedrzejczak, K.; Jeschke, D.; Kobychev, V.; Korablev, D.; Korga, G.; Kornoukhov, V.; Kryn, D.; Lachenmaier, T.; Lasserre, T.; Laubenstein, M.; Lehnert, B.; Link, J.; Litvinovich, E.; Lombardi, F.; Lombardi, P.; Ludhova, L.; Lukyanchenko, G.; Machulin, I.; Manecki, S.; Maneschg, W.; Manuzio, G.; Marcocci, S.; Maricic, J.; Mention, G.; Meroni, E.; Meyer, M.; Miramonti, L.; Misiaszek, M.; Montuschi, M.; Mosteiro, P.; Muratova, V.; Musenich, R.; Neumair, B.; Oberauer, L.; Obolensky, M.; Ortica, F.; Pallavicini, M.; Papp, L.; Pocar, A.; Ranucci, G.; Razeto, A.; Re, A.; Romani, A.; Roncin, R.; Rossi, N.; Schönert, S.; Scola, L.; Semenov, D.; Skorokhvatov, M.; Smirnov, O.; Sotnikov, A.; Sukhotin, S.; Suvorov, Y.; Tartaglia, R.; Testera, G.; Thurn, J.; Toropova, M.; Unzhakov, E.; Veyssiére, C.; Vishneva, A.; Vivier, M.; Vogelaar, R. B.; von Feilitzsch, F.; Wang, H.; Weinz, S.; Winter, J.; Wojcik, M.; Wurm, M.; Yokley, Z.; Zaimidoroga, O.; Zavatarelli, S.; Zuber, K.; Zuzel, G.

    2017-09-01

    The aim of the SOX experiment is to test the hypothesis of the existence of light sterile neutrinos through a short-baseline experiment. Electron antineutrinos will be produced by a high-activity source and detected in the Borexino detector. Both an oscillometry approach and a conventional disappearance analysis will be performed and, when combined, SOX will be able to investigate most of the anomaly region at 95% c.l. This paper focuses on the improvements made to the simulation code and on the techniques (calibrations) used to validate the results.

  8. Steganalysis feature improvement using expectation maximization

    NASA Astrophysics Data System (ADS)

    Rodriguez, Benjamin M.; Peterson, Gilbert L.; Agaian, Sos S.

    2007-04-01

    Images and data files provide an excellent opportunity for concealing illegal or clandestine material. Currently, there are over 250 different tools which embed data into an image without causing noticeable changes to the image. From a forensics perspective, when a system is confiscated or an image of a system is generated, the investigator needs a tool that can scan and accurately identify files suspected of containing malicious information. The identification process is termed the steganalysis problem, which covers both blind identification, in which only normal images are available for training, and multi-class identification, in which both clean and stego images at several embedding rates are available for training. In this paper, a clustering and classification technique (Expectation Maximization with mixture models) is investigated to determine whether a digital image contains hidden information. The steganalysis problem is posed as both anomaly detection and multi-class detection, with the various clusters representing clean images and stego images with between 1% and 10% embedding percentage. Based on the results, it is concluded that the EM classification technique is highly suitable for both blind detection and the multi-class problem.
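    The clustering step can be illustrated with a minimal 1-D two-component Gaussian-mixture EM fit, where one component stands in for clean-image features and the other for stego-image features. The single scalar feature and the synthetic values are assumptions for illustration; the paper works with multi-dimensional steganalysis features and more clusters.

```python
# Tiny 1-D two-component Gaussian-mixture EM. Cluster means recovered from
# mixed "clean" and "stego" feature values. Data are synthetic.
import math
import random

def em_two_gaussians(xs, iters=50):
    """Fit a 2-component 1-D Gaussian mixture by EM; return (means, weights)."""
    mu = [min(xs), max(xs)]          # spread initial means across the data
    sigma = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in xs:
            dens = [pi[k] / (sigma[k] * math.sqrt(2 * math.pi))
                    * math.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2)
                    for k in range(2)]
            z = sum(dens)
            resp.append([d / z for d in dens])
        # M-step: re-estimate parameters from the responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk
            sigma[k] = max(math.sqrt(var), 1e-3)
            pi[k] = nk / len(xs)
    return mu, pi

rng = random.Random(0)
clean = [rng.gauss(0.0, 0.3) for _ in range(200)]  # "clean image" feature
stego = [rng.gauss(3.0, 0.3) for _ in range(200)]  # "stego image" feature
means, weights = em_two_gaussians(clean + stego)
```

    With well-separated components, EM recovers the two cluster means and roughly equal mixing weights; a new image would then be assigned to whichever cluster gives it the higher posterior.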

  9. Capacitance probe for detection of anomalies in non-metallic plastic pipe

    DOEpatents

    Mathur, Mahendra P.; Spenik, James L.; Condon, Christopher M.; Anderson, Rodney; Driscoll, Daniel J.; Fincham, Jr., William L.; Monazam, Esmail R.

    2010-11-23

    The disclosure relates to analysis of materials using a capacitive sensor to detect anomalies through comparison of measured capacitances. The capacitive sensor is used in conjunction with a capacitance measurement device, a location device, and a processor in order to generate a capacitance versus location output which may be inspected for the detection and localization of anomalies within the material under test. The components may be carried as payload on an inspection vehicle which may traverse through a pipe interior, allowing evaluation of nonmetallic or plastic pipes when the piping exterior is not accessible. In an embodiment, supporting components are solid-state devices powered by a low voltage on-board power supply, providing for use in environments where voltage levels may be restricted.

  10. Model-Based Anomaly Detection for a Transparent Optical Transmission System

    NASA Astrophysics Data System (ADS)

    Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.

    In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.
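    The alarm-decision idea, discriminating among causes by the size of the residual between a physics-model prediction and a measurement, can be caricatured with simple cutoffs. The thresholds, labels, and decibel values below are invented for illustration; the chapter's actual rules are derived statistically from the amplifier models and uncertainty characterization.

```python
# Toy residual-based alarm rule: compare the gap between a physics-model
# prediction of span loss and the measured loss against the measurement-noise
# scale. Thresholds and labels are illustrative assumptions only.

def classify_residual(measured_db, predicted_db, noise_sigma=0.1):
    """Attribute a measured-vs-predicted loss residual to a hypothetical cause."""
    r = abs(measured_db - predicted_db)
    if r <= 3 * noise_sigma:
        return "measurement noise"   # within expected uncertainty
    if r <= 10 * noise_sigma:
        return "anomalous loss"      # significant but plausible span change
    return "pump failure"            # far outside any modeled behavior

verdict_small = classify_residual(10.25, 10.0)
verdict_medium = classify_residual(10.8, 10.0)
verdict_large = classify_residual(13.0, 10.0)
```

    The point of the design is that the physics model supplies the expected value and the statistical model supplies the uncertainty scale; only the combination yields a meaningful alarm rule.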

  11. Inversion of the perturbation GPS-TEC data induced by tsunamis in order to estimate the sea level anomaly.

    NASA Astrophysics Data System (ADS)

    Rakoto, Virgile; Lognonné, Philippe; Rolland, Lucie; Coïsson, Pierdavide; Drilleau, Mélanie

    2017-04-01

    Large underwater earthquakes (Mw > 7) can transmit part of their energy to the surrounding ocean through large sea-floor motions, generating tsunamis that propagate over long distances. The forcing effect of tsunami waves on the atmosphere generates internal gravity waves, which produce detectable ionospheric perturbations when they reach the upper atmosphere. These perturbations are frequently observed in the total electron content (TEC) measured with multi-frequency Global Navigation Satellite System (GNSS) data (e.g., GPS, GLONASS). In this paper, we performed for the first time an inversion of the sea level anomaly from GPS TEC data, using a least-squares (LSQ) inversion based on a normal-mode summation modeling technique. Using the 2012 Haida Gwaii tsunami in the far field as a test case, we showed that the peak-to-peak amplitude of the sea level anomaly is inverted with less than 10% error. Nevertheless, we cannot invert the second wave arriving 20 minutes later; this second wave is generally explained by coastal reflection, which the normal-mode modeling does not take into account. Our technique is then applied to two other tsunamis: the 2006 Kuril Islands tsunami in the far field, and the 2011 Tohoku tsunami in the nearer field. This demonstrates that inversion using a normal-mode approach is able to estimate fairly well the amplitude of the first arrivals of the tsunami. In the future, we plan to invert the TEC data in real time in order to retrieve the tsunami height.
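    The LSQ step can be sketched generically: if a linear forward model (standing in here for the normal-mode summation) maps sea-level-anomaly coefficients to predicted TEC perturbations, the coefficients are recovered from noisy TEC data by ordinary least squares. The operator `G`, the coefficients, and the noise level are synthetic placeholders, not the paper's actual forward model.

```python
# Generic linear least-squares inversion: recover model coefficients m from
# noisy observations d = G m + n. G stands in for a normal-mode summation
# operator; all values are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)

n_obs, n_modes = 120, 5
G = rng.normal(size=(n_obs, n_modes))               # stand-in forward operator
m_true = np.array([0.4, -0.2, 0.1, 0.05, -0.3])     # "sea level" coefficients
d_obs = G @ m_true + 0.01 * rng.normal(size=n_obs)  # noisy "TEC" observations

# LSQ solution: m_hat = argmin_m ||G m - d_obs||^2
m_hat, *_ = np.linalg.lstsq(G, d_obs, rcond=None)
rel_error = float(np.linalg.norm(m_hat - m_true) / np.linalg.norm(m_true))
```

    With many more observations than modes and a well-conditioned operator, the recovered coefficients are close to the true ones despite the observation noise.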

  12. Semi-Supervised Novelty Detection with Adaptive Eigenbases, and Application to Radio Transients

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Majid, Walid A.; Reed, Colorado J.; Wagstaff, Kiri L.

    2011-01-01

    We present a semi-supervised online method for novelty detection and evaluate its performance for radio astronomy time series data. Our approach uses adaptive eigenbases to combine 1) prior knowledge about uninteresting signals with 2) online estimation of the current data properties to enable highly sensitive and precise detection of novel signals. We apply the method to the problem of detecting fast transient radio anomalies and compare it to current alternative algorithms. Tests based on observations from the Parkes Multibeam Survey show both effective detection of interesting rare events and robustness to known false alarm anomalies.
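
    The eigenbasis idea above can be illustrated with a fixed (non-adaptive) principal-component basis: the residual after projecting onto a basis learned from "uninteresting" signals serves as the novelty score. All data and names below are synthetic assumptions, not the authors' pipeline, and the online adaptation of the basis is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Training data: known/uninteresting signals living near a 3-D subspace.
basis_true = rng.standard_normal((3, 50))
train = rng.standard_normal((500, 3)) @ basis_true \
        + 0.01 * rng.standard_normal((500, 50))

# Build an eigenbasis from the top principal components of the training set.
mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
U = Vt[:3]                        # rank-3 eigenbasis of "uninteresting" signals

def novelty_score(x):
    """Residual energy after projecting onto the learned eigenbasis."""
    c = x - mean
    return np.linalg.norm(c - c @ U.T @ U)

known = rng.standard_normal(3) @ basis_true   # signal inside the known subspace
novel = rng.standard_normal(50)               # signal outside the subspace
print(novelty_score(known) < novelty_score(novel))
```

    In-family signals project almost entirely onto the eigenbasis and score near zero; genuinely novel signals leave a large residual.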

  13. [The advantages of early midtrimester targeted fetal systematic organ screening for the detection of fetal anomalies--will a global change start in Israel?].

    PubMed

    Bronshtein, Moshe; Solt, Ido; Blumenfeld, Zeev

    2014-06-01

    Despite more than three decades of universal popularity of fetal sonography as an integral part of pregnancy evaluation, there is still no unequivocal agreement regarding the optimal timing of fetal sonographic screening and the type of ultrasound (transvaginal vs. abdominal). The approach considered here is transvaginal systematic sonography at 14-17 weeks for fetal organ screening. The evaluation of over 72,000 early (14-17 weeks) and late (18-24 weeks) fetal ultrasonographic systematic organ screenings revealed that 96% of the malformations are detectable in the early screening, with an incidence of 1:50 gestations. Only 4% of the fetal anomalies are diagnosed later in pregnancy. Over 99% of the fetal cardiac anomalies are detectable in the early screening, and most of them appear in low-risk gestations. Therefore, we suggest a new platform of fetal sonographic evaluation and follow-up: the extensive systematic fetal organ screening should be performed at 14-17 weeks gestation by an expert sonographer trained in the detection of fetal malformations. This examination should also include fetal cardiac echography. Three additional ultrasound examinations are suggested during pregnancy: the first, performed by the patient's obstetrician at 6-7 weeks, for the exclusion of ectopic pregnancy, confirmation of fetal viability, dating, assessment of chorionicity in multiple gestations, and visualization of the maternal adnexae. The other two, at 22-26 and 32-34 weeks, require less training and should be performed by an obstetrician qualified in the sonographic detection of fetal anomalies. The advantages of early midtrimester targeted fetal systematic organ screening for the detection of fetal anomalies may dictate a global change.

  14. Geoid undulations and gravity anomalies over the Aral Sea, the Black Sea and the Caspian Sea from a combined GEOS-3/SEASAT/GEOSAT altimeter data set

    NASA Technical Reports Server (NTRS)

    Au, Andrew Y.; Brown, Richard D.; Welker, Jean E.

    1991-01-01

    Satellite-based altimetric data taken by GEOS-3, SEASAT, and GEOSAT over the Aral Sea, the Black Sea, and the Caspian Sea are analyzed, and a least squares collocation technique is used to predict the geoid undulations on a 0.25x0.25 deg. grid and to transform these geoid undulations into free-air gravity anomalies. Rapp's 180x180 geopotential model is used as the reference surface for the collocation procedure. The result of the geoid-to-gravity transformation is, however, sensitive to the information content of the reference geopotential model used. For example, considerable detailed surface gravity data were incorporated into the reference model over the Black Sea, resulting in a reference model with significant information content at short wavelengths. Thus, estimation of short-wavelength gravity anomalies from gridded geoid heights is generally reliable over regions such as the Black Sea, using the conventional collocation technique with local empirical covariance functions. Over regions such as the Caspian Sea, where detailed surface data are generally not incorporated into the reference model, unconventional techniques are needed to obtain reliable gravity anomalies. Based on the predicted gravity anomalies over these inland seas, speculative tectonic structures are identified and geophysical processes are inferred.

  15. Characterization of normality of chaotic systems including prediction and detection of anomalies

    NASA Astrophysics Data System (ADS)

    Engler, Joseph John

    Accurate prediction and control pervades domains such as engineering, physics, chemistry, and biology. Often, it is discovered that the systems under consideration cannot be well represented by linear, periodic, or random data. It has been shown that these systems exhibit deterministic chaos behavior. Deterministic chaos describes systems which are governed by deterministic rules but whose data appear to have random or quasi-periodic distributions. Deterministically chaotic systems characteristically exhibit sensitive dependence upon initial conditions, manifested through rapid divergence of states initially close to one another. Due to this characterization, it has been deemed impossible to accurately predict future states of these systems for longer time scales. Fortunately, the deterministic nature of these systems allows for accurate short term predictions, given that the dynamics of the system are well understood. This fact has been exploited in the research community and has resulted in various algorithms for short term predictions. Detection of normality in deterministically chaotic systems is critical in understanding the system sufficiently to be able to predict future states. Due to the sensitivity to initial conditions, the detection of normal operational states for a deterministically chaotic system can be challenging. The addition of small perturbations to the system, which may result in bifurcation of the normal states, further complicates the problem. The detection of anomalies and prediction of future states of the chaotic system allows for greater understanding of these systems. The goal of this research is to produce methodologies for determining states of normality for deterministically chaotic systems, detection of anomalous behavior, and the more accurate prediction of future states of the system. Additionally, the ability to detect subtle system state changes is discussed.
The dissertation addresses these goals by proposing new representational techniques and novel prediction methodologies. The value and efficiency of these methods are explored in various case studies. Presented is an overview of chaotic systems with examples taken from the real world. A representation schema for rapid understanding of the various states of deterministically chaotic systems is presented. This schema is then used to detect anomalies and system state changes. Additionally, a novel prediction methodology which utilizes Lyapunov exponents to facilitate longer term prediction accuracy is presented and compared with other nonlinear prediction methodologies. These novel methodologies are then demonstrated on applications such as wind energy, cyber security and classification of social networks.
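
    One concrete ingredient mentioned above, the Lyapunov exponent, can be estimated numerically. This sketch is not taken from the dissertation: it averages log|f'(x)| along an orbit of the logistic map at r = 4, a standard textbook example of deterministic chaos.

```python
import numpy as np

def largest_lyapunov(r=4.0, x0=0.3, n=100_000, burn=1_000):
    """Estimate the largest Lyapunov exponent of the logistic map x -> r x (1 - x)."""
    x = x0
    for _ in range(burn):                          # discard the transient
        x = r * x * (1 - x)
        x = min(max(x, 1e-12), 1 - 1e-12)          # guard against numerical collapse
    total = 0.0
    for _ in range(n):
        deriv = abs(r * (1 - 2 * x))
        total += np.log(max(deriv, 1e-12))         # log |f'(x)|, guarded near x = 1/2
        x = r * x * (1 - x)
        x = min(max(x, 1e-12), 1 - 1e-12)
    return total / n

lam = largest_lyapunov()
print(lam)  # theory predicts ln 2 ~ 0.693 for r = 4
```

    A positive exponent quantifies the exponential divergence of nearby states, which is exactly what bounds the useful prediction horizon discussed above.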

  16. MUSIC algorithm for location searching of dielectric anomalies from S-parameters using microwave imaging

    NASA Astrophysics Data System (ADS)

    Park, Won-Kwang; Kim, Hwa Pyung; Lee, Kwang-Jae; Son, Seong-Ho

    2017-11-01

    Motivated by biomedical engineering applications in early-stage breast cancer detection, we investigated the use of the MUltiple SIgnal Classification (MUSIC) algorithm for locating small anomalies using S-parameters. We considered the application of MUSIC to functional imaging where a small number of dipole antennas are used. Our approach is based on the application of the Born approximation or physical factorization. We analyzed cases in which the anomaly is small or large relative to the wavelength, and linked the structure of the left-singular vectors to the nonzero singular values of a Multi-Static Response (MSR) matrix whose elements are the S-parameters. Using simulations, we demonstrated the strengths and weaknesses of the MUSIC algorithm in detecting both small and extended anomalies.
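
    A minimal MUSIC location search can be sketched as follows, assuming a rank-one Born-approximation MSR matrix and a simplified 2-D Green's function. The antenna layout, wavenumber, and anomaly position are all hypothetical, not the authors' S-parameter setup.

```python
import numpy as np

k = 2 * np.pi                                   # wavenumber (wavelength = 1)
angles = np.linspace(0, 2 * np.pi, 12, endpoint=False)
antennas = 3.0 * np.column_stack([np.cos(angles), np.sin(angles)])
z_true = np.array([0.4, -0.2])                  # hypothetical anomaly location

def steering(p):
    """Normalized (simplified) 2-D Green's function vector at point p."""
    d = np.linalg.norm(antennas - p, axis=1)
    g = np.exp(1j * k * d) / np.sqrt(d)
    return g / np.linalg.norm(g)

# Under the Born approximation, a single point anomaly gives a rank-one
# MSR matrix K = tau * g(z) g(z)^T; its left-singular vector spans the
# signal subspace, and everything orthogonal to it is the noise subspace.
K = 0.5 * np.outer(steering(z_true), steering(z_true))
U, s, _ = np.linalg.svd(K)
noise_basis = U[:, 1:]

# The MUSIC functional 1/||P_noise g(r)|| peaks at the anomaly location.
xs = np.linspace(-1, 1, 41)
grid = [np.array([x, y]) for x in xs for y in xs]
image = [1.0 / np.linalg.norm(noise_basis.conj().T @ steering(p)) for p in grid]
estimate = grid[int(np.argmax(image))]
print(np.linalg.norm(estimate - z_true) < 0.05)
```

    The grid point maximizing the imaging functional coincides with the injected anomaly location, which is the essence of the location search analyzed in the paper.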

  17. Spherical earth gravity and magnetic anomaly analysis by equivalent point source inversion

    NASA Technical Reports Server (NTRS)

    Von Frese, R. R. B.; Hinze, W. J.; Braile, L. W.

    1981-01-01

    To facilitate geologic interpretation of satellite elevation potential field data, analysis techniques are developed and verified in the spherical domain that are commensurate with conventional flat earth methods of potential field interpretation. A powerful approach to the spherical earth problem relates potential field anomalies to a distribution of equivalent point sources by least squares matrix inversion. Linear transformations of the equivalent source field lead to corresponding geoidal anomalies, pseudo-anomalies, vector anomaly components, spatial derivatives, continuations, and differential magnetic pole reductions. A number of examples using 1 deg-averaged surface free-air gravity anomalies and POGO satellite magnetometer data for the United States, Mexico, and Central America illustrate the capabilities of the method.

  18. Integrity Verification for SCADA Devices Using Bloom Filters and Deep Packet Inspection

    DTIC Science & Technology

    2014-03-27

    prevent intrusions in smart grids [PK12]. Parthasarathy proposed an anomaly detection based IDS that takes into account system state. In his implementation...Security, 25(7):498–506, 10 2006. [LMV12] O. Linda, M. Manic, and T. Vollmer. Improving cyber-security of smart grid systems via anomaly detection and...6 2012. 114 [PK12] S. Parthasarathy and D. Kundur. Bloom filter based intrusion detection for smart grid SCADA. In Electrical & Computer Engineering

  19. Methods for computational disease surveillance in infection prevention and control: Statistical process control versus Twitter's anomaly and breakout detection algorithms.

    PubMed

    Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A

    2018-02-01

    Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts flagged similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
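
    The SPC side of the comparison can be sketched with a basic Shewhart chart on synthetic counts. This is illustrative only; the ABD algorithms discussed above additionally model seasonality and autocorrelation, which a plain 3-sigma chart does not.

```python
import numpy as np

rng = np.random.default_rng(7)
monthly_counts = rng.poisson(5, size=24).astype(float)  # synthetic HAI counts
monthly_counts[18] = 25.0                               # injected outbreak month

# Shewhart-style control limits: center line +/- 3 standard deviations.
center = monthly_counts.mean()
sigma = monthly_counts.std(ddof=1)
ucl = center + 3 * sigma                                # upper control limit
lcl = max(center - 3 * sigma, 0.0)                      # lower limit, floored at 0

out_of_control = np.flatnonzero((monthly_counts > ucl) | (monthly_counts < lcl))
print(out_of_control.tolist())
```

    The injected outbreak month is flagged as out of control; seasonal surveillance data would need the more elaborate ABD-style treatment described in the abstract.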

  20. Anomaly Detection for Next-Generation Space Launch Ground Operations

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Iverson, David L.; Hall, David R.; Taylor, William M.; Patterson-Hine, Ann; Brown, Barbara; Ferrell, Bob A.; Waterman, Robert D.

    2010-01-01

    NASA is developing new capabilities that will enable future human exploration missions while reducing mission risk and cost. The Fault Detection, Isolation, and Recovery (FDIR) project aims to demonstrate the utility of integrated vehicle health management (IVHM) tools in the domain of ground support equipment (GSE) to be used for the next generation launch vehicles. In addition to demonstrating the utility of IVHM tools for GSE, FDIR aims to mature promising tools for use on future missions and document the level of effort - and hence cost - required to implement an application with each selected tool. One of the FDIR capabilities is anomaly detection, i.e., detecting off-nominal behavior. The tool we selected for this task uses a data-driven approach. Unlike rule-based and model-based systems that require manual extraction of system knowledge, data-driven systems take a radically different approach to reasoning. At the basic level, they start with data that represent nominal functioning of the system and automatically learn expected system behavior. The behavior is encoded in a knowledge base that represents "in-family" system operations. During real-time system monitoring or during post-flight analysis, incoming data is compared to that nominal system operating behavior knowledge base; a distance representing deviation from nominal is computed, providing a measure of how far "out of family" current behavior is. We describe the selected tool for FDIR anomaly detection - Inductive Monitoring System (IMS), how it fits into the FDIR architecture, the operations concept for the GSE anomaly monitoring, and some preliminary results of applying IMS to a Space Shuttle GSE anomaly.
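
    The data-driven idea behind IMS can be caricatured as distance-to-nominal scoring. This is a deliberate simplification (IMS clusters nominal data into a knowledge base rather than keeping raw samples), and all sensor names and values below are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# "Knowledge base" of nominal operation: 1000 readings of two hypothetical
# GSE channels (e.g., a temperature and a pressure), learned from data alone.
nominal = rng.normal(loc=[50.0, 1.2], scale=[2.0, 0.1], size=(1000, 2))

def out_of_family(sample, archive=nominal):
    """Distance from an incoming reading to the closest nominal reading."""
    return float(np.min(np.linalg.norm(archive - sample, axis=1)))

in_family_score = out_of_family(np.array([51.0, 1.25]))   # near nominal
off_nominal_score = out_of_family(np.array([70.0, 0.5]))  # far from nominal
print(in_family_score < off_nominal_score)
```

    The score is exactly the "distance representing deviation from nominal" described above: small for in-family behavior, large for out-of-family behavior, with no rules or models authored by hand.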

  1. Occurrence and Detectability of Thermal Anomalies on Europa

    NASA Astrophysics Data System (ADS)

    Hayne, Paul O.; Christensen, Philip R.; Spencer, John R.; Abramov, Oleg; Howett, Carly; Mellon, Michael; Nimmo, Francis; Piqueux, Sylvain; Rathbun, Julie A.

    2017-10-01

    Endogenic activity is likely on Europa, given its young surface age and ongoing tidal heating by Jupiter. Temperature is a fundamental signature of activity, as witnessed on Enceladus, where plumes emanate from vents with strongly elevated temperatures. Recent observations suggest the presence of similar water plumes at Europa. Even if plumes are uncommon, resurfacing may produce elevated surface temperatures, perhaps due to near-surface liquid water. Detecting endogenic activity on Europa is one of the primary mission objectives of NASA’s planned Europa Clipper flyby mission. Here, we use a probabilistic model to assess the likelihood of detectable thermal anomalies on the surface of Europa. The Europa Thermal Emission Imaging System (E-THEMIS) investigation is designed to characterize Europa’s thermal behavior and identify any thermal anomalies due to recent or ongoing activity. We define “detectability” on the basis of expected E-THEMIS measurements, which include multi-spectral infrared emission, both day and night. Thermal anomalies on Europa may take a variety of forms, depending on the resurfacing style, frequency, and duration of events: 1) subsurface melting due to hot spots, 2) shear heating on faults, and 3) eruptions of liquid water or warm ice on the surface. We use numerical and analytical models to estimate temperatures for these features. Once activity ceases, lifetimes of thermal anomalies are estimated to be 100 - 1000 yr. On average, Europa’s 10 - 100 Myr surface age implies a resurfacing rate of ~3 - 30 km2/yr. The typical size of resurfacing features determines their frequency of occurrence. For example, if ~100 km2 chaos features dominate recent resurfacing, we expect one event every few years to decades. Smaller features, such as double-ridges, may be active much more frequently. We model each feature type as a statistically independent event, with probabilities weighted by their observed coverage of Europa’s surface. 
Our results show that if Europa is resurfaced continuously by the processes considered, there is a >99% chance that E-THEMIS will detect a thermal anomaly due to endogenic activity. Therefore, if no anomalies are detected, these models can be ruled out, or revised.

  2. Tactile sensor of hardness recognition based on magnetic anomaly detection

    NASA Astrophysics Data System (ADS)

    Xue, Lingyun; Zhang, Dongfang; Chen, Qingguang; Rao, Huanle; Xu, Ping

    2018-03-01

    Hardness sensing, as one kind of tactile sensing, plays an important role in intelligent robot applications such as gripping, agricultural harvesting, prosthetic hands, and so on. Recently, with the rapid development of high-performance magnetic field sensing technology, a number of magnetic sensors have been developed for intelligent applications. The tunnel magnetoresistance (TMR) sensor, based on the magnetoresistance principle, serves as the sensitive element for detecting magnetic fields and has proven its excellent ability in weak-field detection. In this paper, a new method based on magnetic anomaly detection is proposed to detect hardness by touch. The sensor is composed of an elastic body, a ferrous probe, a TMR element, and a permanent magnet. When the elastic body embedded with the ferrous probe touches an object under a certain force, the elastic body deforms; correspondingly, the ferrous probe is displaced and the background magnetic field is distorted. The distorted magnetic field is detected by the TMR element, and the output signal is sampled over time. The slope of the magnetic signal versus sampling time differs for objects of different hardness. The results indicate that the magnetic anomaly sensor can recognize hardness rapidly, within 150 ms of the tactile moment. The hardness sensor based on the magnetic anomaly detection principle proposed in this paper has the advantages of simple structure, low cost, and rapid response, and it shows great application potential in the field of intelligent robots.

  3. A Doubly Stochastic Change Point Detection Algorithm for Noisy Biological Signals.

    PubMed

    Gold, Nathan; Frasch, Martin G; Herry, Christophe L; Richardson, Bryan S; Wang, Xiaogang

    2017-01-01

    Experimentally and clinically collected time series data are often contaminated with significant confounding noise, creating short, noisy time series. This noise, due to natural variability and measurement error, poses a challenge to conventional change point detection methods. We propose a novel and robust statistical method for change point detection for noisy biological time sequences. Our method is a significant improvement over traditional change point detection methods, which only examine a potential anomaly at a single time point. In contrast, our method considers all suspected anomaly points and considers the joint probability distribution of the number of change points and the elapsed time between two consecutive anomalies. We validate our method with three simulated time series, a widely accepted benchmark data set, two geological time series, a data set of ECG recordings, and a physiological data set of heart rate variability measurements from a fetal sheep model of human labor, comparing it to three existing methods. Our method demonstrates significantly improved performance over the existing point-wise detection methods.
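
    A generic change-point baseline of the point-wise kind the authors improve upon can be sketched with a CUSUM statistic. This is not the paper's doubly stochastic method; the series and the shift are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)
# Noisy series with a single mean shift at index 100.
series = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(1.5, 1.0, 100)])

# CUSUM of deviations from the global mean: for an upward mean shift, the
# cumulative sum drifts down before the change and up after it, so the
# change point sits at the minimum of the CUSUM curve.
cusum = np.cumsum(series - series.mean())
change_point = int(np.argmin(cusum))
print(abs(change_point - 100) <= 10)
```

    Even this simple statistic locates the shift to within a few samples on clean data; the value of methods like the one above lies in remaining robust when the noise is large relative to the shift.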

  4. Application of the Geo-Anomaly Unit Concept in Quantitative Delineation and Assessment of Gold Ore Targets in Western Shandong Uplift Terrain, Eastern China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen Yongqing, E-mail: ydonglai@mail.cgs.gov.cn; Zhao Pengda; Chen Jianguo

    2001-03-15

    A number of large and giant ore deposits have been discovered within the relatively small areas of lithospheric structure anomalies, including various boundary zones of tectonic plates. The regions have become the well-known intercontinental ore-forming belts, such as the circum-Pacific gold-copper, copper-molybdenum, and tungsten-tin metallogenic belts. These belts are typical geological anomalous areas. An investigation into the hydrothermal ore deposits in different regions in the former Soviet Union illustrated that the geologic structures of ore fields of almost all major commercial deposits have distinct features compared with the neighboring areas. These areas with distinct features are defined as geo-anomalies. A geo-anomaly refers to a geologic body, or a combination of bodies, whose composition, texture-structure, and genesis are significantly different from those of their surroundings. A geo-anomaly unit (GU) is an area containing distinct features that can be delineated with integrated ore-forming information using computer techniques on the basis of the geo-anomaly concept. Herein, the GU concept is illustrated by a case study of delineating the gold ore targets in the western Shandong uplift terrain, eastern China. It includes: (1) analyses of gold ore-forming factors; (2) compilation of a normalized regional geochemical map and extraction of geochemical anomalies; (3) compilation of a gravitational and aeromagnetic tectonic skeleton map and extraction of gravitational and aeromagnetic anomalies; (4) extraction of circular and linear anomalies from remote-sensing Landsat TM images; (5) establishment of a geo-anomaly conceptual model associated with known gold mineralization; (6) establishment of gold ore-forming favorability by computing techniques; and (7) delineation and assessment of ore-forming units. The units with high favorability are suggested as ore targets.

  5. Connection of stratospheric QBO with global atmospheric general circulation and tropical SST. Part I: methodology and composite life cycle

    NASA Astrophysics Data System (ADS)

    Huang, Bohua; Hu, Zeng-Zhen; Kinter, James L.; Wu, Zhaohua; Kumar, Arun

    2012-01-01

    The stratospheric quasi-biennial oscillation (QBO) and its association with the interannual variability in the stratosphere and troposphere, as well as in tropical sea surface temperature anomalies (SSTA), are examined in the context of a QBO life cycle. The analysis is based on the ERA40 and NCEP/NCAR reanalyses, radiosonde observations at Singapore, and other observation-based datasets. Both reanalyses reproduce the QBO life cycle and its associated variability in the stratosphere reasonably well, except that some long-term changes are detected only in the NCEP/NCAR reanalysis. In order to separate QBO from variability on other time scales and to eliminate the long-term changes, a scale separation technique [Ensemble Empirical Mode Decomposition (EEMD)] is applied to the raw data. The QBO component of zonal wind anomalies at 30 hPa, extracted using the EEMD method, is defined as a QBO index. Using this index, the QBO life cycle composites of stratosphere and troposphere variables, as well as SSTA, are constructed and examined. The composite features in the stratosphere are generally consistent with previous investigations. The correlations between the QBO and tropical Pacific SSTA depend on the phase in a QBO life cycle. On average, cold (warm) SSTA peaks about half a year after the maximum westerlies (easterlies) at 30 hPa. The connection of the QBO with the troposphere seems to be associated with the differences of temperature anomalies between the stratosphere and troposphere. While the anomalies in the stratosphere propagate downward systematically, some anomalies in the troposphere develop and expand vertically. Therefore, it is possible that the temperature difference between the troposphere and stratosphere may alter the atmospheric stability and tropical deep convection, which modulates the Walker circulation and SSTA in the equatorial Pacific Ocean.

  6. Characterization of a photometric anomaly in lunar Mare Nubium

    NASA Astrophysics Data System (ADS)

    Korokhin, Viktor; Shkuratov, Yuriy; Kaydash, Vadym; Basilevsky, Alexander; Rohachova, Larysa; Velikodsky, Yuri; Opanasenko, Nickolay; Videen, Gorden; Stankevich, Dmitry; Kaluhina, Olena

    2016-03-01

    A novel approach of constructing photometrically seamless mosaics of reflectance, color-ratios, and phase-curve slopes using LROC WAC images has been developed, which can be used to map the photometric parameters of the lunar surface. The approach takes into account both geometric corrections with data on local topography and photometric corrections using our simple photometric model. New mosaics obtained with the technique allow more reliable studies of structural and chemical characteristics of the lunar surface. This approach has been applied to analyze the photometric anomaly (21.6 S, 17.7 W, ~40 km in size) in Mare Nubium detected earlier with our Earth-based observations. For each point of the scene the parameters were calculated using the least-square method for several tens of source WAC images. Clementine mosaics also were used in the analysis, e.g., in order to estimate the parameter of maturity degree Is/FeO. The anomaly has low FeO and TiO2 abundance and reveals a higher slope of the phase function than its surroundings. Thermal data from LRO Diviner measurements do not show anomalies in this region. We consider this area as a shallow flooding of an elevated formation of highland composition, the material of which could have been excavated and mixed up with upper layers of the lunar surface through meteoroid impacts. The anomalous behavior of the phase function can be explained by the difference of surface structure in the anomaly and surrounding regions on the scale of less than several centimeters. This may be due to larger quantities of small fragments of rocks and clumps on the surface and/or the presence of agglomerates having open structure.

  7. AnRAD: A Neuromorphic Anomaly Detection Framework for Massive Concurrent Data Streams.

    PubMed

    Chen, Qiuwen; Luley, Ryan; Wu, Qing; Bishop, Morgan; Linderman, Richard W; Qiu, Qinru

    2018-05-01

    The evolution of high performance computing technologies has enabled the large-scale implementation of neuromorphic models and pushed the research in computational intelligence into a new era. Among the machine learning applications, unsupervised detection of anomalous streams is especially challenging due to the requirements of detection accuracy and real-time performance. Designing a computing framework that harnesses the growing computing power of multicore systems while maintaining high sensitivity and specificity to the anomalies is an urgent research topic. In this paper, we propose anomaly recognition and detection (AnRAD), a bioinspired detection framework that performs probabilistic inferences. We analyze the feature dependency and develop a self-structuring method that learns an efficient confabulation network using unlabeled data. This network is capable of fast incremental learning, which continuously refines the knowledge base using streaming data. Compared with several existing anomaly detection approaches, our method provides competitive detection quality. Furthermore, we exploit the massively parallel structure of the AnRAD framework. Our implementations of the detection algorithm on the graphics processing unit and the Xeon Phi coprocessor both obtain substantial speedups over the sequential implementation on a general-purpose microprocessor. The framework provides real-time service to concurrent data streams within diversified knowledge contexts, and can be applied to large problems with multiple local patterns. Experimental results demonstrate high computing performance and memory efficiency. For vehicle behavior detection, the framework is able to monitor up to 16000 vehicles (data streams) and their interactions in real time with a single commodity coprocessor, and uses less than 0.2 ms for one testing subject. 
Finally, the detection network is ported to our spiking neural network simulator to show the potential of adapting to the emerging neuromorphic architectures.

  8. Towards radiological diagnosis of abdominal adhesions based on motion signatures derived from sequences of cine-MRI images.

    PubMed

    Fenner, John; Wright, Benjamin; Emberey, Jonathan; Spencer, Paul; Gillott, Richard; Summers, Angela; Hutchinson, Charles; Lawford, Pat; Brenchley, Paul; Bardhan, Karna Dev

    2014-06-01

    This paper reports novel development and preliminary application of an image registration technique for diagnosis of abdominal adhesions imaged with cine-MRI (cMRI). Adhesions can severely compromise the movement and physiological function of the abdominal contents, and their presence is difficult to detect. The image registration approach presented here is designed to expose anomalies in movement of the abdominal organs, providing a movement signature that is indicative of underlying structural abnormalities. Validation of the technique was performed using structurally based in vitro and in silico models, supported with Receiver Operating Characteristic (ROC) methods. For the more challenging cases presented to the small cohort of 4 observers, the AUC (area under curve) improved from a mean value of 0.67 ± 0.02 (without image registration assistance) to a value of 0.87 ± 0.02 when image registration support was included. Also, in these cases, a reduction in time to diagnosis was observed, decreasing by between 20% and 50%. These results provided sufficient confidence to apply the image registration diagnostic protocol to sample magnetic resonance imaging data from healthy volunteers as well as a patient suffering from encapsulating peritoneal sclerosis (an extreme form of adhesions) where immobilization of the gut by cocooning of the small bowel is observed. The results as a whole support the hypothesis that movement analysis using image registration offers a possible method for detecting underlying structural anomalies and encourages further investigation. Copyright © 2014 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  9. Trust Management in Mobile Ad Hoc Networks for Bias Minimization and Application Performance Maximization

    DTIC Science & Technology

    2014-02-26

    set of anomaly detection rules 62 I.-R. Chen et al. / Ad Hoc Networks 19 (2014) 59–74 Author’s personal copy including the interval rule (for...deficiencies in anomaly detection (e.g., imperfection of rules) by a false negative probability (PHfn) of misidentifying an unhealthy node as a...multimedia servers, Multimedia Syst. 8 (2) (2000) 83–91. [53] R. Mitchell, I.R. Chen, Adaptive intrusion detection for unmanned aircraft systems based on

  10. Using Physical Models for Anomaly Detection in Control Systems

    NASA Astrophysics Data System (ADS)

    Svendsen, Nils; Wolthusen, Stephen

    Supervisory control and data acquisition (SCADA) systems are increasingly used to operate critical infrastructure assets. However, the inclusion of advanced information technology and communications components and elaborate control strategies in SCADA systems increase the threat surface for external and subversion-type attacks. The problems are exacerbated by site-specific properties of SCADA environments that make subversion detection impractical; and by sensor noise and feedback characteristics that degrade conventional anomaly detection systems. Moreover, potential attack mechanisms are ill-defined and may include both physical and logical aspects.

  11. Integration of heterogeneous data for classification in hyperspectral satellite imagery

    NASA Astrophysics Data System (ADS)

    Benedetto, J.; Czaja, W.; Dobrosotskaya, J.; Doster, T.; Duke, K.; Gillis, D.

    2012-06-01

    As new remote sensing modalities emerge, it becomes increasingly important to find more suitable algorithms for fusion and integration of different data types for the purposes of target/anomaly detection and classification. Typical techniques that deal with this problem are based on performing detection/classification/segmentation separately in chosen modalities, and then integrating the resulting outcomes into a more complete picture. In this paper we provide a broad analysis of a new approach, based on creating fused representations of the multi-modal data, which then can be subjected to analysis by means of the state-of-the-art classifiers or detectors. In this scenario we shall consider the hyperspectral imagery combined with spatial information. Our approach involves machine learning techniques based on analysis of joint data-dependent graphs and their associated diffusion kernels. Then, the significant eigenvectors of the derived fused graph Laplace operator form the new representation, which provides integrated features from the heterogeneous input data. We compare these fused approaches with analysis of integrated outputs of spatial and spectral graph methods.
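    The fused-representation idea described above (a joint data-dependent graph with a Gaussian diffusion kernel, whose Laplacian eigenvectors serve as integrated features) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; the kernel width, feature count, and toy data are all assumptions:

```python
import numpy as np

def fused_graph_features(spectral, spatial, sigma=1.0, n_features=3):
    """Illustrative fusion: build one affinity graph whose edge weights
    combine spectral and spatial distances, then use the leading
    nontrivial eigenvectors of the graph Laplacian as fused features."""
    X = np.hstack([spectral, spatial])          # joint data-dependent graph
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))          # Gaussian (diffusion) kernel
    D = np.diag(W.sum(axis=1))
    L = D - W                                   # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)              # eigenpairs, ascending order
    return vecs[:, 1:1 + n_features]            # skip the trivial constant mode

# toy example: 6 "pixels" with 4 spectral bands and 2 spatial coordinates
rng = np.random.default_rng(0)
spectral = rng.normal(size=(6, 4))
spatial = rng.normal(size=(6, 2))
F = fused_graph_features(spectral, spatial)
print(F.shape)  # (6, 3)
```

    The fused features F can then be fed to any off-the-shelf classifier or anomaly detector, which is the workflow the abstract describes.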

  12. Meteoritic Sulfur Isotopic Analysis

    NASA Technical Reports Server (NTRS)

    Thiemens, Mark H.

    1996-01-01

    Funds were requested to continue our program in meteoritic sulfur isotopic analysis. We have recently detected a potential nucleosynthetic sulfur isotopic anomaly. We will search for potential carriers. The documentation of bulk systematics and the possible relation to nebular chemistry and oxygen isotopes will be explored. Analytical techniques for delta(sup 33)S, delta(sup 34)S, and delta(sup 36)S isotopic analysis were improved. Analysis of sub-milligram samples is now possible. A possible relation between sulfur isotopes and oxygen was detected, with similar group systematics noted, particularly in the case of aubrites, ureilites and enstatite chondrites. A possible nucleosynthetic excess of S-33 has been noted in bulk ureilites and an oldhamite separate from Norton County. High-energy proton (approximately 1 GeV) bombardments of iron foils were done to experimentally determine S-33 and S-36 spallogenic yields for quantitation of isotopic measurements in iron meteorites. Techniques for measurement of mineral separates were perfected and an analysis program initiated. The systematic behavior of bulk sulfur isotopes will continue to be explored.

  13. Tunnel Detection Using Seismic Methods

    NASA Astrophysics Data System (ADS)

    Miller, R.; Park, C. B.; Xia, J.; Ivanov, J.; Steeples, D. W.; Ryden, N.; Ballard, R. F.; Llopis, J. L.; Anderson, T. S.; Moran, M. L.; Ketcham, S. A.

    2006-05-01

    Surface seismic methods have shown great promise for use in detecting clandestine tunnels in areas where unauthorized movement beneath secure boundaries has been or is a matter of concern for authorities. Unauthorized infiltration beneath national borders and into or out of secure facilities is possible at many sites by tunneling. Developments in acquisition, processing, and analysis techniques using multi-channel seismic imaging have opened the door to a vast number of near-surface applications, including anomaly detection and delineation, specifically tunnels. Body waves have great potential based on modeling and very preliminary empirical studies trying to capitalize on diffracted energy. A primary limitation of all seismic energy is the natural attenuation of high-frequency energy by earth materials and the difficulty in transmitting a high-amplitude source pulse with a broad spectrum above 500 Hz into the earth. Surface waves have shown great potential since the development of multi-channel analysis methods (e.g., MASW). Both shear-wave velocity and backscatter energy from surface waves have been shown through modeling and empirical studies to have great promise in detecting the presence of anomalies, such as tunnels. Success in developing and evaluating various seismic approaches for detecting tunnels relies on investigations at known tunnel locations, in a variety of geologic settings, employing a wide range of seismic methods, and targeting a range of uniquely different tunnel geometries, characteristics, and host lithologies. Body-wave research at the Moffat tunnels in Winter Park, Colorado, provided well-defined diffraction-looking events that correlated with the subsurface location of the tunnel complex. Natural voids related to karst have been studied in Kansas, Oklahoma, Alabama, and Florida using shear-wave velocity imaging techniques based on the MASW approach. 
Manmade tunnels, culverts, and crawl spaces have been the target of multi-modal analysis in Kansas and California. Clandestine tunnels used for illegal entry into the U.S. from Mexico were studied at two different sites along the southern border of California. All these studies represent the empirical basis for suggesting that surface seismic methods have a significant role to play in tunnel detection, and that methods are under development, and very nearly at hand, that will provide an effective tool in appraising and maintaining perimeter security. As broadband sources, gravity-coupled towed spreads, and automated analysis software continue to advance, so does the applicability of routinely deployed seismic imaging systems that can be operated by technicians, with interpretation aids for nearly real-time target selection. Key to making these systems commercial is the development of enhanced imaging techniques in geologically noisy areas and highly variable surface terrain.

  14. Sleep Deprivation Attack Detection in Wireless Sensor Network

    NASA Astrophysics Data System (ADS)

    Bhattasali, Tapalina; Chaki, Rituparna; Sanyal, Sugata

    2012-02-01

    Deployment of a sensor network in a hostile environment makes it particularly vulnerable to battery drainage attacks, because it is impossible to recharge or replace the battery power of sensor nodes. Among different types of security threats, low-power sensor nodes are immensely affected by attacks which cause random drainage of the energy level of sensors, leading to the death of nodes. The most dangerous attack in this category is sleep deprivation, where the target of the intruder is to maximize the power consumption of sensor nodes so that their lifetime is minimized. Most of the existing work on sleep deprivation attack detection involves a lot of overhead, leading to poor throughput. The need of the day is to design a model for detecting intrusions accurately in an energy-efficient manner. This paper proposes a hierarchical framework based on a distributed collaborative mechanism for efficiently detecting sleep deprivation torture in a wireless sensor network. The proposed model uses an anomaly detection technique in two steps to reduce the probability of false intrusion.

  15. Caldera unrest detected with seawater temperature anomalies at Deception Island, Antarctic Peninsula

    NASA Astrophysics Data System (ADS)

    Berrocoso, M.; Prates, G.; Fernández-Ros, A.; Peci, L. M.; de Gil, A.; Rosado, B.; Páez, R.; Jigena, B.

    2018-04-01

    Increased thermal activity was detected to coincide with the onset of volcano inflation in the seawater-filled caldera at Deception Island. This thermal activity was manifested in pulses of high water temperature that coincided with ocean tide cycles. The seawater temperature anomalies were detected by a thermometric sensor attached to the tide gauge (bottom pressure sensor). This was installed where the seawater circulation and the locations of known thermal anomalies, fumaroles and thermal springs, together favor the detection of water warmed within the caldera. Detection of the increased thermal activity was also possible because sea ice, which covers the entire caldera during the austral winter months, insulates the water and thus reduces temperature exchange between seawater and atmosphere. In these conditions, the water temperature data has been shown to provide significant information about Deception volcano activity. The detected seawater temperature increase, also observed in soil temperature readings, suggests rapid and near-simultaneous increase in geothermal activity with onset of caldera inflation and an increased number of seismic events observed in the following austral summer.

  16. Detection of Pelger-Huet anomaly based on augmented fast marching method and speeded up robust features.

    PubMed

    Sun, Minglei; Yang, Shaobao; Jiang, Jinling; Wang, Qiwei

    2015-01-01

    Pelger-Huet anomaly (PHA) and Pseudo Pelger-Huet anomaly (PPHA) are neutrophils with abnormal morphology. They have a bilobed or unilobed nucleus and excessively clumped chromatin. Currently, detection of these cells depends mainly on manual microscopic examination by a clinician; thus, detection quality is limited by the clinician's efficiency and subjectivity. In this paper, a detection method for PHA and PPHA is proposed based on karyomorphism and chromatin distribution features. Firstly, the skeleton of the nucleus is extracted using an augmented Fast Marching Method (AFMM), and the width distribution is obtained through a distance transform. Then, caryoplastin in the nucleus is extracted based on Speeded Up Robust Features (SURF), and a K-nearest-neighbor (KNN) classifier is constructed to analyze the features. Experiments show that the sensitivity and specificity of this method reached 87.5% and 83.33%, respectively, which means that the detection accuracy of PHA is acceptable. Meanwhile, the detection method should be helpful for the automatic morphological classification of blood cells.
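    The final classification step of such a pipeline is a K-nearest-neighbor vote over feature vectors. A minimal KNN sketch follows; the two-dimensional feature vectors and labels below are hypothetical stand-ins for the SURF and width-distribution features, not the paper's data:

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest neighbors."""
    dists = np.linalg.norm(train_X - query, axis=1)   # Euclidean distances
    nearest = np.argsort(dists)[:k]                   # indices of k closest
    votes = train_y[nearest]
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]                  # majority label

# toy data: label 1 = "PHA-like" features, label 0 = "normal"
train_X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
train_y = np.array([0, 0, 1, 1])
print(knn_predict(train_X, train_y, np.array([0.85, 0.85])))  # 1
```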

  17. Anomalies in the detection of change: When changes in sample size are mistaken for changes in proportions.

    PubMed

    Fiedler, Klaus; Kareev, Yaakov; Avrahami, Judith; Beier, Susanne; Kutzner, Florian; Hütter, Mandy

    2016-01-01

    Detecting changes in performance, sales, markets, risks, social relations, or public opinions constitutes an important adaptive function. In a sequential paradigm devised to investigate the detection of change, every trial provides a sample of binary outcomes (e.g., correct vs. incorrect student responses). Participants have to decide whether the proportion of a focal feature (e.g., correct responses) in the population from which the sample is drawn has decreased, remained constant, or increased. Strong and persistent anomalies in change detection arise when changes in proportional quantities vary orthogonally to changes in absolute sample size. Proportional increases are readily detected, and nonchanges are erroneously perceived as increases, when absolute sample size increases. Conversely, decreasing sample size facilitates the correct detection of proportional decreases and the erroneous perception of nonchanges as decreases. These anomalies are, however, confined to experienced samples of elementary raw events from which proportions have to be inferred inductively. They disappear when sample proportions are described as percentages in a normalized probability format. To explain these challenging findings, it is essential to understand the inductive-learning constraints imposed on decisions from experience.

  18. A novel approach for pilot error detection using Dynamic Bayesian Networks.

    PubMed

    Saada, Mohamad; Meng, Qinggang; Huang, Tingwen

    2014-06-01

    In the last decade, Dynamic Bayesian Networks (DBNs) have become one of the most attractive probabilistic modelling extensions of Bayesian Networks (BNs) for working under uncertainty from a temporal perspective. Despite this popularity, not many researchers have attempted to study the use of these networks in anomaly detection, or the implications of data anomalies for the outcome of such models. An abnormal change in the modelled environment's data at a given time will cause a trailing chain effect on the data of all related environment variables in the current and consecutive time slices. Although this effect fades with time, it can still have an ill effect on the outcome of such models. In this paper we propose an algorithm for pilot error detection, using DBNs as the modelling framework for learning and detecting anomalous data. We base our experiments on the actions of an aircraft pilot, and a flight simulator was created for running the experiments. The proposed anomaly detection algorithm has achieved good results in detecting pilot errors and their effects on the whole system.
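    The likelihood-based idea behind temporal models of this kind can be illustrated with a deliberately simplified stand-in: a first-order Markov transition model learned from nominal action sequences, flagging observed transitions whose probability is unexpectedly low. The state encoding, sequences, and threshold below are invented for illustration; the paper's actual DBN is far richer than this sketch:

```python
import numpy as np

def fit_transitions(sequences, n_states):
    """Estimate a first-order Markov transition matrix from nominal runs."""
    counts = np.ones((n_states, n_states))        # Laplace smoothing
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def anomalous_steps(seq, P, threshold=0.1):
    """Indices of steps whose observed transition is improbable under P."""
    return [i + 1 for i, (a, b) in enumerate(zip(seq, seq[1:]))
            if P[a, b] < threshold]

# nominal behavior cycles through actions 0 -> 1 -> 2
nominal = [[0, 1, 2, 0, 1, 2] * 5, [0, 1, 2] * 8]
P = fit_transitions(nominal, n_states=3)
print(anomalous_steps([0, 1, 2, 2, 0, 1], P))  # [3]: the 2 -> 2 step is flagged
```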

  19. Data-Driven Anomaly Detection Performance for the Ares I-X Ground Diagnostic Prototype

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Schwabacher, Mark A.; Matthews, Bryan L.

    2010-01-01

    In this paper, we will assess the performance of a data-driven anomaly detection algorithm, the Inductive Monitoring System (IMS), which can be used to detect simulated Thrust Vector Control (TVC) system failures. However, the ability of IMS to detect these failures in a true operational setting may be related to the realistic nature of how they are simulated. As such, we will investigate both a low fidelity and high fidelity approach to simulating such failures, with the latter based upon the underlying physics. Furthermore, the ability of IMS to detect anomalies that were previously unknown and not previously simulated will be studied in earnest, as well as apparent deficiencies or misapplications that result from using the data-driven paradigm. Our conclusions indicate that robust detection performance of simulated failures using IMS is not appreciably affected by the use of a high fidelity simulation. However, we have found that the inclusion of a data-driven algorithm such as IMS into a suite of deployable health management technologies does add significant value.
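    The data-driven monitor assessed above, IMS, summarizes nominal training data as clusters and scores a new observation by its distance to the nearest cluster (zero meaning nominal). A minimal sketch of that idea follows; the greedy hyperbox clustering and its radius are illustrative assumptions, not NASA's implementation:

```python
import numpy as np

def build_boxes(data, init_radius=0.5):
    """Greedy one-pass clustering of nominal vectors into hyperboxes."""
    boxes = []                                   # each box: [lo, hi] arrays
    for x in data:
        for box in boxes:
            if np.all(x >= box[0] - init_radius) and np.all(x <= box[1] + init_radius):
                box[0] = np.minimum(box[0], x)   # grow the box to include x
                box[1] = np.maximum(box[1], x)
                break
        else:
            boxes.append([x.copy(), x.copy()])   # start a new box
    return boxes

def anomaly_score(x, boxes):
    """Distance from x to the nearest box; 0 means inside a nominal cluster."""
    def dist(box):
        d = np.maximum(box[0] - x, 0) + np.maximum(x - box[1], 0)
        return np.linalg.norm(d)
    return min(dist(b) for b in boxes)

nominal = np.array([[1.0, 1.0], [1.1, 0.9], [0.9, 1.1]])
boxes = build_boxes(nominal)
print(anomaly_score(np.array([1.0, 1.0]), boxes))      # 0.0
print(anomaly_score(np.array([5.0, 1.0]), boxes) > 1)  # True
```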

  20. Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.

    PubMed

    Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen

    2015-11-01

    Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented a variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is embedded in the EUROCAT Central Database for annual application to centrally held registry data. The methodology was incrementally adapted to improve its utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating the date of conception as the unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised-risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies.
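    The case-count-based scan window can be sketched as follows. This is a hedged simplification: it slides a window of w consecutive cases and asks, via a Poisson tail probability under a uniform-rate assumption, how improbable it is to see w cases in that short a span. EUROCAT's actual statistic and its lookup-table calibration are more involved, and all numbers below are toy data:

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam)."""
    return 1 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))

def scan_windows(case_times, w, total_period, alpha=0.01):
    """Flag windows of w consecutive cases that are too tightly spaced
    relative to the overall case rate."""
    n = len(case_times)
    rate = n / total_period                    # overall case rate
    flagged = []
    for i in range(n - w + 1):
        span = case_times[i + w - 1] - case_times[i]
        expected = rate * max(span, 1e-9)      # expected cases in this span
        if poisson_sf(w, expected) < alpha:    # w cases in a short span
            flagged.append((case_times[i], case_times[i + w - 1]))
    return flagged

# toy data: months (e.g., estimated conception dates), with a burst near month 30
times = [1, 5, 9, 14, 20, 29.0, 29.5, 30.0, 30.5, 40, 48]
print(scan_windows(times, w=4, total_period=48))  # [(29.0, 30.5)]
```

    Defining windows by case count rather than calendar time, as above, is the key design choice the abstract highlights.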

  1. Discrimination between pre-seismic electromagnetic anomalies and solar activity effects

    NASA Astrophysics Data System (ADS)

    Koulouras, G.; Balasis, G.; Kiourktsidis, I.; Nannos, E.; Kontakos, K.; Stonham, J.; Ruzhin, Y.; Eftaxias, K.; Cavouras, D.; Nomicos, C.

    2009-04-01

    Laboratory studies suggest that electromagnetic emissions in a wide frequency spectrum ranging from kilohertz (kHz) to very high megahertz (MHz) frequencies are produced by the opening of microcracks, with the MHz radiation appearing earlier than the kHz radiation. Earthquakes are large-scale fracture phenomena in the Earth's heterogeneous crust. Thus, the radiated kHz-MHz electromagnetic emissions are detectable not only in the laboratory but also at a geological scale. Clear MHz-to-kHz electromagnetic anomalies have been systematically detected over periods ranging from a few days to a few hours prior to recent destructive earthquakes in Greece. We should bear in mind that whether electromagnetic precursors to earthquakes exist is an important question not only for earthquake prediction but mainly for understanding the physical processes of earthquake generation. An open question in this field of research is the classification of a detected electromagnetic anomaly as a pre-seismic signal associated with earthquake occurrence. Indeed, electromagnetic fluctuations in the frequency range of MHz are known to be related to a few sources, including atmospheric noise (due to lightning), man-made composite noise, solar-terrestrial noise (resulting from the Sun-solar wind-magnetosphere-ionosphere-Earth's surface chain) or cosmic noise, and finally, the lithospheric effect, namely pre-seismic activity. We focus on this point in this paper. We suggest that if a combination of detected kHz and MHz electromagnetic anomalies satisfies the set of criteria presented herein, these anomalies could be considered as candidate precursory phenomena of an impending earthquake.

  2. Discrimination between preseismic electromagnetic anomalies and solar activity effects

    NASA Astrophysics Data System (ADS)

    Koulouras, Gr; Balasis, G.; Kontakos, K.; Ruzhin, Y.; Avgoustis, G.; Kavouras, D.; Nomicos, C.

    2009-04-01

    Laboratory studies suggest that electromagnetic emissions in a wide frequency spectrum ranging from kHz to very high MHz frequencies are produced by the opening of microcracks, with the MHz radiation appearing earlier than the kHz radiation. Earthquakes are large-scale fracture phenomena in the Earth's heterogeneous crust. Thus, the radiated kHz-MHz electromagnetic emissions are detectable not only at the laboratory scale but also at the geological scale. Clear MHz-to-kHz electromagnetic anomalies have been systematically detected over periods ranging from a few days to a few hours prior to recent destructive earthquakes in Greece. We bear in mind that whether electromagnetic precursors to earthquakes exist is an important question not only for earthquake prediction but mainly for understanding the physical processes of earthquake generation. An open question in this field of research is the classification of a detected electromagnetic anomaly as a pre-seismic signal associated with earthquake occurrence. Indeed, electromagnetic fluctuations in the frequency range of MHz are known to be related to a few sources, i.e., they might be atmospheric noise (due to lightning), man-made composite noise, solar-terrestrial noise (resulting from the Sun-solar wind-magnetosphere-ionosphere-Earth's surface chain) or cosmic noise, and finally, the lithospheric effect, namely pre-seismic activity. We focus on this point. We suggest that if a combination of detected kHz and MHz electromagnetic anomalies satisfies the set of criteria presented herein, these anomalies could be considered as candidate precursory phenomena of an impending earthquake.

  3. Acute maternal social dysfunction, health perception and psychological distress after ultrasonographic detection of a fetal structural anomaly.

    PubMed

    Kaasen, A; Helbig, A; Malt, U F; Naes, T; Skari, H; Haugen, G

    2010-08-01

    To predict acute psychological distress in pregnant women following detection of a fetal structural anomaly by ultrasonography, and to relate these findings to a comparison group. A prospective, observational study. Tertiary referral centre for fetal medicine. One hundred and eighty pregnant women with a fetal structural anomaly detected by ultrasound (study group) and 111 with normal ultrasound findings (comparison group) were included within a week following sonographic examination after gestational age 12 weeks (inclusion period: May 2006 to February 2009). Social dysfunction and health perception were assessed by the corresponding subscales of the General Health Questionnaire (GHQ-28). Psychological distress was assessed using the Impact of Events Scale (IES-22), Edinburgh Postnatal Depression Scale (EPDS) and the anxiety and depression subscales of the GHQ-28. Fetal anomalies were classified according to severity and diagnostic or prognostic ambiguity at the time of assessment. Social dysfunction, health perception and psychological distress (intrusion, avoidance, arousal, anxiety, depression). The least severe anomalies with no diagnostic or prognostic ambiguity induced the lowest levels of IES intrusive distress (P = 0.025). Women included after 22 weeks of gestation (24%) reported significantly higher GHQ distress than women included earlier in pregnancy (P = 0.003). The study group had significantly higher levels of psychosocial distress than the comparison group on all psychometric endpoints. Psychological distress was predicted by gestational age at the time of assessment, severity of the fetal anomaly, and ambiguity concerning diagnosis or prognosis.

  4. Correction of Stahl ear deformity using a suture technique.

    PubMed

    Khan, Muhammad Adil Abbas; Jose, Rajive M; Ali, Syed Nadir; Yap, Lok Huei

    2010-09-01

    Correction of partial ear deformities can be a challenging task for the plastic surgeon. There are no standard techniques for correcting many of these deformities, and several different techniques are described in the literature. Stahl ear is one such anomaly, characterized by an accessory third crus in the ear cartilage, giving rise to an irregular helical rim. The conventional techniques for correcting this deformity include excision of the cartilage, repositioning of the cartilage, or scoring techniques. We recently encountered a case of Stahl ear deformity and undertook correction using internal sutures with very good results. The technical details of the surgery are described along with a review of the literature on correcting similar anomalies.

  5. Detecting errors and anomalies in computerized materials control and accountability databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteson, R.; Hench, K.; Yarbro, T.

    The Automated MC and A Database Assessment project is aimed at improving anomaly and error detection in materials control and accountability (MC and A) databases and increasing confidence in the data that they contain. Anomalous data resulting in poor categorization of nuclear material inventories greatly reduces the value of the database information to users. Therefore it is essential that MC and A data be assessed periodically for anomalies or errors. Anomaly detection can identify errors in databases and thus provide assurance of the integrity of data. An expert system has been developed at Los Alamos National Laboratory that examines these large databases for anomalous or erroneous data. For several years, MC and A subject matter experts at Los Alamos have been using this automated system to examine the large amounts of accountability data that the Los Alamos Plutonium Facility generates. These data are collected and managed by the Material Accountability and Safeguards System, a near-real-time computerized nuclear material accountability and safeguards system. This year they have expanded the user base, customizing the anomaly detector for the varying requirements of different groups of users. This paper describes the progress in customizing the expert systems to the needs of the users of the data and reports on their results.

  6. Value of brain MRI when sonography raises suspicion of agenesis of the corpus callosum in fetuses.

    PubMed

    Jarre, A; Llorens Salvador, R; Montoliu Fornas, G; Montoya Filardi, A

    To evaluate the role of magnetic resonance imaging (MRI) in fetuses with a previous sonographic suspicion of agenesis of the corpus callosum (ACC), to confirm the diagnosis and to detect associated intracranial anomalies. Single-center retrospective and descriptive observational study of the brain MRI performed in 78 fetuses with sonographic suspicion of ACC between January 2006 and December 2015. Two experts in fetal imaging reviewed the MRI findings to evaluate the presence and morphology of the corpus callosum. When ACC was detected, the whole fetal brain anatomy was thoroughly studied to determine the presence of associated anomalies. Prenatal MR imaging findings were compared to postnatal brain MRI or necropsy findings when available. Fetal MRI diagnosed 45 cases of ACC, 12 partial (26.7%) and 33 complete (73.3%). In 28 cases (62.2%), associated intracranial anomalies were identified. The most frequent abnormality was ventriculomegaly (78.6%), followed by cortical malformations (53.6%), posterior fossa anomalies (25%) and midline anomalies (10.7%). Fetal brain MRI has an important role in the diagnosis of ACC and the detection of associated anomalies. Performing fetal brain MRI is important in fetuses with sonographic suspicion of ACC. Copyright © 2017 SERAM. Publicado por Elsevier España, S.L.U. All rights reserved.

  7. IR Thermography of International Space Station Radiator Panels

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay; Winfree, William; Morton, Richard; Howell, Patricia

    2010-01-01

    Several non-flight qualification test radiators were inspected using flash thermography. Flash thermography data analysis used raw and second derivative images to detect anomalies (Echotherm and Mosaic). Simple contrast evolutions were plotted for the detected anomalies to help in anomaly characterization. Many out-of-family indications were noted. Some out-of-family indications were classified as cold spot indications and are due to additional adhesive or adhesive layer behind the facesheet. Some out-of-family indications were classified as hot spot indications and are due to void, unbond or lack of adhesive behind the facesheet. The IR inspection helped in assessing expected manufacturing quality of the radiators.

  8. How much does the MSW effect contribute to the reactor antineutrino anomaly?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdiviesso, G. A.

    2015-05-15

    It has been pointed out that there is a 5.7 ± 2.3 discrepancy between the predicted and the observed reactor antineutrino flux in very short baseline experiments. Several causes for this anomaly have been discussed, including a possible non-standard fourth, sterile neutrino. In order to quantify how non-standard this anomaly really is, the standard MSW effect is reviewed. Since reactor antineutrinos are produced in a dense medium (the nuclear fuel) and usually detected in a less dense one (water or scintillator), non-adiabatic effects are expected to occur, creating a difference between the creation and detection mixing angles.

  9. Radioactive anomaly discrimination from spectral ratios

    DOEpatents

    Maniscalco, James; Sjoden, Glenn; Chapman, Mac Clements

    2013-08-20

    A method for discriminating a radioactive anomaly from naturally occurring radioactive materials includes detecting a first number of gamma photons having energies in a first range of energy values within a predetermined period of time and detecting a second number of gamma photons having energies in a second range of energy values within the predetermined period of time. The method further includes determining, in a controller, a ratio of the first number of gamma photons having energies in the first range and the second number of gamma photons having energies in the second range, and determining that a radioactive anomaly is present when the ratio exceeds a threshold value.
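    The claimed method reduces to counting photons in two energy windows over a fixed dwell time and thresholding their ratio. A small sketch follows; the energy windows and threshold value are invented for illustration and are not taken from the patent:

```python
def spectral_ratio_alarm(energies_kev, low_window=(0, 400),
                         high_window=(400, 3000), threshold=2.0):
    """Count gamma photons in two energy windows and flag an anomaly
    when the low/high count ratio exceeds a threshold."""
    n_low = sum(low_window[0] <= e < low_window[1] for e in energies_kev)
    n_high = sum(high_window[0] <= e < high_window[1] for e in energies_kev)
    if n_high == 0:
        return n_low > 0, float("inf")        # all counts in the low window
    ratio = n_low / n_high
    return ratio > threshold, ratio

# NORM-like spectrum: counts spread across both windows
print(spectral_ratio_alarm([100, 200, 300, 600, 1460, 2614]))   # (False, 1.0)
# anomaly-like spectrum: excess low-energy counts
print(spectral_ratio_alarm([50, 100, 150, 200, 250, 600]))      # (True, 5.0)
```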

  10. Dataset of anomalies and malicious acts in a cyber-physical subsystem.

    PubMed

    Laso, Pedro Merino; Brosset, David; Puentes, John

    2017-10-01

    This article presents a dataset produced to investigate how data and information quality estimations enable the detection of anomalies and malicious acts in cyber-physical systems. Data were acquired using a cyber-physical subsystem consisting of liquid containers for fuel or water, along with its automated control and data acquisition infrastructure. The described data consist of temporal series representing five operational scenarios - normal, anomalies, breakdown, sabotages, and cyber-attacks - corresponding to 15 different real situations. The dataset is publicly available in the .zip file published with the article, to investigate and compare faulty operation detection and characterization methods for cyber-physical systems.

  11. Dual Use Corrosion Inhibitor and Penetrant for Anomaly Detection in Neutron/X Radiography

    NASA Technical Reports Server (NTRS)

    Hall, Phillip B. (Inventor); Novak, Howard L. (Inventor)

    2004-01-01

    A dual purpose corrosion inhibitor and penetrant composition sensitive to radiography interrogation is provided. The corrosion inhibitor mitigates or eliminates corrosion on the surface of a substrate upon which the corrosion inhibitor is applied. In addition, the corrosion inhibitor provides for the attenuation of a signal used during radiography interrogation thereby providing for detection of anomalies on the surface of the substrate.

  12. Characterization and Modeling of Materials Responsible for Planetary Crustal Magnetism

    NASA Astrophysics Data System (ADS)

    Strauss, Becky E.

    Earth and Mercury are the only terrestrial planets in our solar system with present-day magnetic dipole fields generated by internal dynamo systems. In contrast, Mars and the Moon show evidence of past dipole fields in the form of crustal magnetic anomalies; to hold measurable magnetizations, crustal materials must have been exposed to an applied field. While the physical principles of magnetic recording are consistent between terrestrial planets, the particular conditions at each planet control the mechanisms by which crustal materials may be magnetized and limit the types of minerals that can retain magnetic remanence. As the suite of magnetic materials used for studies of remanence expands, the need for new methods follows. The integration of rock magnetic techniques with microscopy and chemical analyses enables the reconstruction of increasingly comprehensive narratives of remanence acquisition and alteration, even in materials that are challenging to study using traditional methods. This thesis demonstrates the utility of a materials approach to rock magnetism by applying techniques designed for terrestrial use in a planetary context. The first of two case studies focuses on calcite cave deposits as a means to demonstrate how novel techniques can be used to unlock previously inaccessible archives of magnetic information. Tandem magnetic and microscopic analyses improve our understanding of the rock magnetic properties of weakly magnetic stalagmites and their potential for paleomagnetic research, as well as illuminating the pathways of remanence acquisition in cave systems. The second case study addresses the magnetic anomalies recently detected by the MESSENGER orbiter at Mercury. These anomalies are consistent with remanence acquired in a dipole field. However, in the absence of physical samples, the types of magnetic minerals that could be holding remanence in Mercury's hot, highly reducing surface environment have not yet been determined. 
Orbital data is combined with fundamental rock magnetic principles to constrain the magnetic mineralogy of Mercury and to propose mechanisms of magnetization and remagnetization in the lithosphere.

  13. Extending TOPS: Ontology-driven Anomaly Detection and Analysis System

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Michaelis, A.

    2010-12-01

    The Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending TOPS to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. We can query the knowledge base and answer questions about dataset compatibilities, similarities and dependencies so that we can, for example, automatically analyze similar datasets in order to verify a given anomaly occurrence in multiple data sources. We are further extending the system to go beyond anomaly detection towards reasoning about possible causes of anomalies that are also encoded in the knowledge base as either learned or implied knowledge. This enables us to scale up the analysis by eliminating a large number of anomalies early in the processing, either by failing to verify them from other sources or by matching them directly with other observable events, without having to perform an extensive and time-consuming exploration and analysis. The knowledge is captured using the OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. The information is stored using a Sesame server and is accessible through both a Java API and web services using the SeRQL and SPARQL query languages. Inference is provided by the OWLIM component integrated with Sesame.
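
    The query pattern described above can be illustrated with a toy in-memory triple store: the schema connects anomalies to possible causes and to datasets that can corroborate them, and a pattern query answers questions such as "which datasets can verify this anomaly?". The relation names and instances below are illustrative stand-ins, not TOPS's actual OWL schema or its Sesame/SeRQL interface.

```python
# Toy in-memory triple store standing in for the OWL/Sesame knowledge base
# described above. Relation names and instances are illustrative stand-ins,
# not TOPS's actual ontology schema.
triples = {
    ("ndvi_anomaly", "type", "Anomaly"),
    ("ndvi_anomaly", "possibleCause", "fire"),
    ("ndvi_anomaly", "possibleCause", "deforestation"),
    ("ndvi_anomaly", "verifiableWith", "thermal_band_data"),
    ("fire", "observableIn", "thermal_band_data"),
}

def query(s=None, p=None, o=None):
    """Match a triple pattern; None is a wildcard, playing the role of a
    variable in a SeRQL/SPARQL query."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Which datasets can corroborate the NDVI anomaly, and what might cause it?
datasets = [o for (_, _, o) in query("ndvi_anomaly", "verifiableWith")]
causes = sorted(o for (_, _, o) in query("ndvi_anomaly", "possibleCause"))
print(datasets, causes)
```

    A production system would express the same patterns as SPARQL queries against the triple store; the wildcard arguments here correspond directly to query variables.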

  14. Ranking Causal Anomalies via Temporal and Dynamical Analysis on Vanishing Correlations.

    PubMed

    Cheng, Wei; Zhang, Kai; Chen, Haifeng; Jiang, Guofei; Chen, Zhengzhang; Wang, Wei

    2016-08-01

    The modern world has witnessed a dramatic increase in our ability to collect, transmit and distribute real-time monitoring and surveillance data from large-scale information systems and cyber-physical systems. Detecting system anomalies thus attracts a significant amount of interest in many fields such as security, fault management, and industrial optimization. Recently, the invariant network has been shown to be a powerful way of characterizing complex system behaviours. In the invariant network, a node represents a system component and an edge indicates a stable, significant interaction between two components. Structures and evolution of the invariant network, in particular the vanishing correlations, can shed important light on locating causal anomalies and performing diagnosis. However, existing approaches to detecting causal anomalies with the invariant network often use the percentage of vanishing correlations to rank possible causal components, which has several limitations: 1) fault propagation in the network is ignored; 2) the root causal anomalies may not always be the nodes with a high percentage of vanishing correlations; 3) temporal patterns of vanishing correlations are not exploited for robust detection. To address these limitations, in this paper we propose a network-diffusion-based framework to identify significant causal anomalies and rank them. Our approach can effectively model fault propagation over the entire invariant network, and can perform joint inference on both the structural and the time-evolving broken invariance patterns. As a result, it can locate high-confidence anomalies that are truly responsible for the vanishing correlations, and can compensate for unstructured measurement noise in the system. Extensive experiments on synthetic datasets, bank information system datasets, and coal plant cyber-physical system datasets demonstrate the effectiveness of our approach.
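
    The diffusion idea can be sketched in a few lines: seed each component with its fraction of vanished invariants, then smooth the scores over the invariant network so a fault and its neighborhood reinforce each other. This is a generic random-walk-with-restart smoothing under assumed parameters, not the paper's exact formulation.

```python
import numpy as np

# Generic random-walk-with-restart smoothing over an invariant network:
# a simplified sketch of the diffusion-based ranking idea, not the
# authors' exact model.
def rank_causal(adj, broken_frac, alpha=0.5, iters=100):
    deg = adj.sum(axis=1)
    P = adj / np.where(deg > 0, deg, 1.0)[:, None]   # row-stochastic walk
    r = broken_frac.astype(float)
    for _ in range(iters):
        r = alpha * P.T @ r + (1 - alpha) * broken_frac
    return np.argsort(-r)          # component indices, most causal first

# Toy invariant network: component 0 is the fault, linked to 1 and 2.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)
broken = np.array([0.9, 0.5, 0.5, 0.1])   # fraction of vanished invariants
print(rank_causal(adj, broken))
```

    The restart term keeps each node anchored to its own evidence, while the walk term lets a densely broken neighborhood lift the true root cause above nodes that merely sit next to it.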

  15. Solving the muon g -2 anomaly in deflected anomaly mediated SUSY breaking with messenger-matter interactions

    NASA Astrophysics Data System (ADS)

    Wang, Fei; Wang, Wenyu; Yang, Jin Min

    2017-10-01

    We propose to introduce general messenger-matter interactions in the deflected anomaly mediated supersymmetry (SUSY) breaking (AMSB) scenario to explain the gμ-2 anomaly. Scenarios with complete or incomplete grand unified theory (GUT) multiplet messengers are discussed in turn. The introduction of incomplete GUT multiplets can be advantageous in various aspects. We found that the gμ-2 anomaly can be solved in both scenarios under current constraints, including the gluino mass bounds, while the scenarios with incomplete GUT representation messengers are more favored by the gμ-2 data. We also found that the gluino is bounded above by about 2.5 TeV (2.0 TeV) in scenario A and 3.0 TeV (2.7 TeV) in scenario B if the generalized deflected AMSB scenarios are used to fully account for the gμ-2 anomaly at the 3σ (2σ) level. Such a gluino should be accessible in future LHC searches. Dark matter (DM) constraints, including DM relic density and direct detection bounds, favor scenario B with incomplete GUT multiplets. Much of the allowed parameter space for scenario B could be covered by future DM direct detection experiments.

  16. Congenital anomalies of the left brachiocephalic vein detected in adults on computed tomography.

    PubMed

    Yamamuro, Hiroshi; Ichikawa, Tamaki; Hashimoto, Jun; Ono, Shun; Nagata, Yoshimi; Kawada, Shuichi; Kobayashi, Makiko; Koizumi, Jun; Shibata, Takeo; Imai, Yutaka

    2017-10-01

    Anomalous left brachiocephalic vein (BCV) is a rare and little-known systemic venous anomaly. We evaluated congenital anomalies of the left BCV in adults detected during computed tomography (CT) examinations. This retrospective study included 81,425 patients without congenital heart disease who underwent chest CT. We reviewed the recorded reports and CT images for congenital anomalies of the left BCV, including aberrant and supernumerary BCVs. The associated congenital aortic anomalies were assessed. Among 73,407 cases at a university hospital, 22 (16 males, 6 females; mean age, 59 years) with aberrant left BCVs were found by keyword searches of the recorded reports (0.03%). Among 8018 cases at the branch hospital, 5 (4 males, 1 female; mean age, 67 years) with aberrant left BCVs were found by CT image review (0.062%). There was no significant difference in the incidence of aberrant left BCV between the two groups. Two cases had double left BCVs. Eleven cases showed high aortic arches. Two cases had a right aortic arch, one case had an incomplete double aortic arch, and one case was associated with coarctation. Aberrant left BCV on CT examination in adults was extremely rare. Some cases were associated with aortic arch anomalies.

  17. Cardiac veins: collateral venous drainage pathways in chronic hemodialysis patients.

    PubMed

    Ozmen, Evrim; Algin, Oktay

    2016-07-12

    Venous anomalies are diagnostic and therapeutic challenges. Subclavian or superior vena cava stenosis can develop in patients with central venous catheters for long-term hemodialysis, with venous return then achieved via the cardiac veins and coronary sinus. These abnormalities are not rare, especially in patients with a history of central venous catheter placement. Detection of these anomalies and of subclavian vein stenosis before the surgical creation of hemodialysis fistulae or tunneled central venous catheter placement may prevent unnecessary interventions in these patients. The multidetector computed tomography (MDCT) technique can give further information compared with fluoroscopy or digital subtraction angiography in the management of these patients. This case report describes interesting aspects of central vein complications in hemodialysis patients. In conclusion, data on thoracic venous return are limited, and further prospective studies with large patient numbers are required. MDCT with 3D reconstruction is particularly useful for the accurate evaluation of venous patency, variations, and collateral circulation. It is also an excellent tool for choosing and planning treatment.

  18. Abandoned underground storage tank location using fluxgate magnetic surveying: A case study

    USGS Publications Warehouse

    Van Biersel, T. P.; Bristoll, B.C.; Taylor, R.W.; Rose, J.

    2002-01-01

    In 1993, during the removal of a diesel and a gasoline underground storage tank at the municipal garage of the Village of Kohler, Sheboygan County, Wisconsin, soil testing revealed environmental contamination at the site. A site investigation revealed the possibility of a second on-site source of petroleum contamination. Limited historical data and the present usage of structures within the suspected source area precluded the use of most invasive sampling methods and most geophysical techniques. A fluxgate magnetometer survey, followed by confirmatory excavation, was conducted at the site. The fluxgate magnetometer survey identified nine possible magnetic anomalies within the 18 × 25 m area. The subsequent excavation near the anomalies revealed the presence of five paired and two individual 2000 L underground storage tanks. The fluxgate magnetometer survey, although affected by the proximity of buildings, was able to detect the buried tanks within 3 m of the brick structures, using a 1.5 × 1.5 m sampling array.

  19. Inter-annual Variations in Snow/Firn Density over the Greenland Ice Sheet by Combining GRACE gravimetry and Envisat Altimetry

    NASA Astrophysics Data System (ADS)

    Su, X.; Shum, C. K.; Guo, J.; Howat, I.; Jezek, K. C.; Luo, Z.; Zhou, Z.

    2017-12-01

    Satellite altimetry has been used to monitor elevation and volume change of polar ice sheets since the 1990s. In order to derive mass change from the measured volume change, different density assumptions are commonly used in the research community, which may cause discrepancies in accurately estimating ice sheet mass balance. In this study, we investigate the inter-annual anomalies of mass change from GRACE gravimetry and elevation change from Envisat altimetry during years 2003-2009, with the objective of determining inter-annual variations of snow/firn density over the Greenland ice sheet (GrIS). High positive correlations (0.6 or higher) between these two inter-annual anomalies are found over 93% of the GrIS, which suggests that both techniques detect the same geophysical process at the inter-annual timescale. Interpreting the two anomalies in terms of near-surface density variations, over 80% of the GrIS the inter-annual variation in average density lies between the densities of snow and pure ice. In particular, at the Summit of Central Greenland, we validate the satellite-estimated density against in situ data available from 75 snow pits and 9 ice cores. This study provides constraints on the currently applied density assumptions for the GrIS.
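
    The density inference reduces to a simple ratio: if the inter-annual mass anomaly (from GRACE) and the elevation anomaly (from Envisat) reflect the same process, an effective near-surface density is the mass anomaly divided by the volume anomaly. All numbers below are invented for illustration, not the study's data.

```python
# Hedged sketch of the density inference: effective near-surface density is
# the mass anomaly divided by the volume anomaly (elevation anomaly times
# area). The inputs below are illustrative, not the study's data.
RHO_SNOW, RHO_ICE = 350.0, 917.0   # kg/m^3, typical bounding densities

def effective_density(mass_anom_kg, elev_anom_m, area_m2):
    volume_anom_m3 = elev_anom_m * area_m2
    return mass_anom_kg / volume_anom_m3

rho = effective_density(mass_anom_kg=5.0e12, elev_anom_m=0.02, area_m2=5.0e11)
print(rho, RHO_SNOW <= rho <= RHO_ICE)   # density between snow and ice?
```

    A value between the snow and ice densities, as in this example, is consistent with the finding reported above for most of the GrIS.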

  20. Visual analytics techniques for large multi-attribute time series data

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.

    2008-01-01

    Time series data commonly occur when variables are monitored over time. Many real-world applications involve the comparison of long time series across multiple variables (multi-attributes). Often business people want to compare this year's monthly sales with last year's sales to make decisions. Data warehouse administrators (DBAs) want to know their daily data loading job performance. DBAs need to detect outliers early enough to act upon them. In this paper, two new visual analytic techniques are introduced: the color cell-based Visual Time Series Line Charts and Maps highlight significant changes over time in long time series data, and the new Visual Content Query facilitates finding the contents and histories of interesting patterns and anomalies, which leads to root cause identification. We have applied both methods to two real-world applications, mining enterprise data warehouse and customer credit card fraud data, to illustrate the wide applicability and usefulness of these techniques.

  1. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    NASA Astrophysics Data System (ADS)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

    The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a framework that enables the joint analysis of multiple sensor modalities. It also provides plugin-based analysis interfaces for developing sensor- and image-processing-based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. Using this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard braking, pothole crossing, and speed bump crossing. Upon such detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.
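
    A minimal detector of the kind described can be sketched with thresholds on accelerometer and speed streams; the threshold values, event labels, and sampling rate below are illustrative assumptions, not the authors' tuned parameters.

```python
# Threshold-based on-road event detector in the spirit of the mobile-sensor
# approach above; thresholds and labels are illustrative assumptions.
def detect_events(accel_z, speed, dt=0.1,
                  pothole_thresh=8.0, brake_decel=3.0):
    """accel_z: vertical acceleration samples (m/s^2, gravity removed);
    speed: forward speed samples (m/s). Returns (sample index, label)."""
    events = []
    for i, az in enumerate(accel_z):
        if abs(az) > pothole_thresh:          # sharp vertical spike
            events.append((i, "pothole_or_bump"))
    for i in range(1, len(speed)):
        decel = (speed[i - 1] - speed[i]) / dt
        if decel > brake_decel:               # sustained deceleration
            events.append((i, "hard_brake"))
    return events

az = [0.1, 0.2, 9.5, 0.3, 0.1]          # vertical spike at sample 2
v = [15.0, 15.0, 15.0, 14.5, 14.0]      # 5 m/s^2 deceleration at samples 3-4
print(detect_events(az, v))
```

    In the described system, each detected event's timestamp would then index into the recorded video to extract the corresponding clip.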

  2. CHAMP: a locally adaptive unmixing-based hyperspectral anomaly detection algorithm

    NASA Astrophysics Data System (ADS)

    Crist, Eric P.; Thelen, Brian J.; Carrara, David A.

    1998-10-01

    Anomaly detection offers a means by which to identify potentially important objects in a scene without prior knowledge of their spectral signatures. As such, this approach is less sensitive to variations in target class composition, atmospheric and illumination conditions, and sensor gain settings than would be a spectral matched filter or similar algorithm. The best existing anomaly detectors generally fall into one of two categories: those based on local Gaussian statistics, and those based on linear mixing models. Unmixing-based approaches better represent the real distribution of data in a scene, but are typically derived and applied on a global or scene-wide basis. Locally adaptive approaches allow detection of more subtle anomalies by accommodating the spatial non-homogeneity of background classes in a typical scene, but provide a poorer representation of the true underlying background distribution. The CHAMP algorithm combines the best attributes of both approaches, applying a linear-mixing-model approach in a spatially adaptive manner. The algorithm itself, and test results on simulated and actual hyperspectral image data, are presented in this paper.
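
    The unmixing side of such an approach can be sketched as follows: model each pixel as a linear combination of background endmembers and score it by the residual norm. The spatially adaptive part of CHAMP (re-estimating endmembers per local window) is omitted here, and the endmember spectra are invented for illustration.

```python
import numpy as np

# Unmixing-residual anomaly scoring: fit each pixel as a linear mix of
# background endmembers; a large residual means the pixel is poorly
# explained by the background model. Endmembers are invented examples.
def unmixing_anomaly_scores(pixels, endmembers):
    """pixels: (n, bands); endmembers: (k, bands). Returns residual norms."""
    E = endmembers.T                                    # (bands, k)
    coeffs, *_ = np.linalg.lstsq(E, pixels.T, rcond=None)
    residual = pixels.T - E @ coeffs
    return np.linalg.norm(residual, axis=0)

E = np.array([[1.0, 0.0, 0.0],       # background endmember 1
              [0.0, 1.0, 0.0]])      # background endmember 2
pix = np.array([[0.7, 0.3, 0.0],     # explainable mixture: residual ~ 0
                [0.2, 0.2, 0.9]])    # off-model spectrum: anomalous
scores = unmixing_anomaly_scores(pix, E)
print(scores)
```

    A locally adaptive variant would re-derive `E` within a sliding window around each pixel rather than using one scene-wide set.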

  3. Radiation anomaly detection algorithms for field-acquired gamma energy spectra

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen

    2015-08-01

    The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of the threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources, and works even in the strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and relative merits of these different algorithms will be discussed and demonstrated.
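
    The window-comparison principle behind spectral comparison ratios can be illustrated in a simplified, generic form: compare counts in paired energy windows against the balance expected from background, in units of Poisson counting uncertainty. This is a toy sketch of the general idea, not PNNL's NSCRAD algorithm.

```python
import math

# Toy spectral-comparison check: how many sigma does the observed balance
# between two energy windows deviate from the background's balance?
# A simplified illustration of the general principle only.
def window_ratio_sigma(obs_a, obs_b, bkg_a, bkg_b):
    """d = obs_a - k*obs_b with k = bkg_a/bkg_b; near zero for spectra
    shaped like background, large for an injected line in window A."""
    k = bkg_a / bkg_b
    d = obs_a - k * obs_b
    var = obs_a + k * k * obs_b        # Poisson variance propagation
    return d / math.sqrt(var)

# Background puts equal counts in both windows (k = 1); a source adds
# 300 extra counts in window A.
print(window_ratio_sigma(obs_a=1300, obs_b=1000, bkg_a=500, bkg_b=500))
```

    Because the statistic is a ratio of windows rather than an absolute count rate, it is relatively insensitive to overall background intensity, which is the property that makes this family of methods attractive in urban searches.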

  4. Microarray-based comparative genomic hybridization analysis in neonates with congenital anomalies: detection of chromosomal imbalances.

    PubMed

    Emy Dorfman, Luiza; Leite, Júlio César L; Giugliani, Roberto; Riegel, Mariluce

    2015-01-01

    To identify chromosomal imbalances by whole-genome microarray-based comparative genomic hybridization (array-CGH) in DNA samples of neonates with congenital anomalies of unknown cause from a birth defects monitoring program at a public maternity hospital. A blind genomic analysis was performed retrospectively in 35 stored DNA samples of neonates born between July of 2011 and December of 2012. All potential DNA copy number variations (CNVs) detected were matched with those reported in public genomic databases, and their clinical significance was evaluated. Out of a total of 35 samples tested, 13 genomic imbalances were detected in 12/35 cases (34.3%). In 4/35 cases (11.4%), chromosomal imbalances could be defined as pathogenic; in 5/35 (14.3%) cases, DNA CNVs of uncertain clinical significance were identified; and in 4/35 cases (11.4%), normal variants were detected. Among the four cases with results considered causally related to the clinical findings, two of the four (50%) showed causative alterations already associated with well-defined microdeletion syndromes. In two of the four samples (50%), the chromosomal imbalances found, although predicted as pathogenic, had not been previously associated with recognized clinical entities. Array-CGH analysis allowed for a higher rate of detection of chromosomal anomalies, and this determination is especially valuable in neonates with congenital anomalies of unknown etiology, or in cases in which karyotype results cannot be obtained. Moreover, although the interpretation of the results must be refined, this method is a robust and precise tool that can be used in the first-line investigation of congenital anomalies, and should be considered for prospective/retrospective analyses of DNA samples by birth defect monitoring programs. Copyright © 2014 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  5. Choosing options for ultrasound screening in pregnancy and comparing cost effectiveness: a decision analysis approach.

    PubMed

    Roberts, T; Mugford, M; Piercy, J

    1998-09-01

    To compare the cost effectiveness of different programmes of routine antenatal ultrasound screening to detect four key fetal anomalies: serious cardiac anomalies, spina bifida, Down's syndrome and lethal anomalies, using existing evidence. Decision analysis was used based on the best data currently available, including expert opinion from the Royal College of Obstetricians and Gynaecologists Working Party and secondary data from the literature, to predict the likely outcomes in terms of malformations detected by each screening programme. Results applicable in clinics, hospitals or GP practices delivering antenatal screening. The number of cases with a 'target' malformation correctly detected antenatally. There was substantial overlap between the cost ranges of each screening programme, demonstrating considerable uncertainty about the relative economic efficiency of alternative programmes for ultrasound screening. The cheapest, but not the most effective, screening programme consisted of one second-trimester ultrasound scan. The cost per target anomaly detected (cost effectiveness) for this programme was in the range £5,000-£109,000, but in any 1000 women it will also fail to detect between 3.6 and 4.7 target anomalies. The range of uncertainty in the costs did not allow selection of any one programme as a clear choice for NHS purchasers. The results suggested that the overall allocation of resources for routine ultrasound screening in the UK is not currently economically efficient, but that certain scenarios for ultrasound screening are potentially within the range of cost effectiveness reached by other, possibly competing, screening programmes. The model highlighted the weakness of available evidence and demonstrated the need for more information both about current practice and costs.
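
    The headline figure is simple arithmetic: cost per target anomaly detected equals programme cost divided by the expected number of detections. A hedged sketch with invented numbers (the scan cost and detection rate below are not the study's figures):

```python
# Toy cost-effectiveness arithmetic of the kind used in the analysis above.
# The per-scan cost and detection rate are invented for illustration.
def cost_per_detection(cost_per_scan_gbp, n_women, detections_per_1000):
    total_cost = cost_per_scan_gbp * n_women
    detected = detections_per_1000 * n_women / 1000.0
    return total_cost / detected

print(cost_per_detection(cost_per_scan_gbp=30.0, n_women=1000,
                         detections_per_1000=4.0))
```

    The wide ranges reported above arise because both inputs (unit cost and detection rate) are uncertain, and the ratio amplifies that uncertainty.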

  6. High-resolution x-ray absorption spectroscopy studies of metal compounds in neurodegenerative brain tissue

    NASA Astrophysics Data System (ADS)

    Collingwood, J. F.; Mikhaylova, A.; Davidson, M. R.; Batich, C.; Streit, W. J.; Eskin, T.; Terry, J.; Barrea, R.; Underhill, R. S.; Dobson, J.

    2005-01-01

    Fluorescence mapping and microfocus X-ray absorption spectroscopy are used to detect, locate and identify iron biominerals and other inorganic metal accumulations in neurodegenerative brain tissue at sub-cellular resolution (<5 microns). Recent progress in developing the technique is reviewed. Synchrotron X-rays are used to map tissue sections for metals of interest, and XANES and XAFS are used to characterise anomalous concentrations of the metals in-situ so that they can be correlated with tissue structures and disease pathology. Iron anomalies associated with biogenic magnetite, ferritin and haemoglobin are located and identified in an avian tissue model with a pixel resolution ~5 microns. Subsequent studies include brain tissue sections from transgenic Huntington's mice, and the first high-resolution mapping and identification of iron biominerals in human Alzheimer's and control autopsy brain tissue. Technical developments include use of microfocus diffraction to obtain structural information about biominerals in-situ, and depositing sample location grids by lithography for the location of anomalies by conventional microscopy. The combined techniques provide a breakthrough in the study of both intra- and extra-cellular iron compounds and related metals in tissue. The information to be gained from this approach has implications for future diagnosis and treatment of neurodegeneration, and for our understanding of the mechanisms involved.

  7. Drought in the Horn of Africa: attribution of a damaging and repeating extreme event

    NASA Astrophysics Data System (ADS)

    Marthews, Toby; Otto, Friederike; Mitchell, Daniel; Dadson, Simon; Jones, Richard

    2015-04-01

    We have applied detection and attribution techniques to the severe drought that hit the Horn of Africa in 2014. The short rains failed in late 2013 in Kenya, South Sudan, Somalia and southern Ethiopia, leading to a very dry growing season January to March 2014, and subsequently to the current drought in many agricultural areas of the sub-region. We have made use of the weather@home project, which uses publicly-volunteered distributed computing to provide a large ensemble of simulations sufficient to sample regional climate uncertainty. Based on this, we have estimated the occurrence rates of the kinds of rare and extreme events implicated in this large-scale drought. From land surface model runs based on these ensemble simulations, we have estimated the impacts of climate anomalies during this period, and therefore we can reliably identify some factors of the ongoing drought as attributable to human-induced climate change. The UNFCCC's Adaptation Fund is attempting to support projects that bring about an adaptation to "the adverse effects of climate change", but in order to formulate such projects we need a much clearer way to assess how much climate change is human-induced and how much is a consequence of climate anomalies and large-scale teleconnections, which can only be provided by robust attribution techniques.

  8. Dictionary-Driven Ischemia Detection From Cardiac Phase-Resolved Myocardial BOLD MRI at Rest.

    PubMed

    Bevilacqua, Marco; Dharmakumar, Rohan; Tsaftaris, Sotirios A

    2016-01-01

    Cardiac Phase-resolved Blood-Oxygen-Level Dependent (CP-BOLD) MRI provides a unique opportunity to image an ongoing ischemia at rest. However, it requires post-processing to evaluate the extent of ischemia. To address this, here we propose an unsupervised ischemia detection (UID) method which relies on the inherent spatio-temporal correlation between oxygenation and wall motion to formalize a joint learning and detection problem based on dictionary decomposition. Considering input data of a single subject, it treats ischemia as an anomaly and iteratively learns dictionaries to represent only normal observations (corresponding to myocardial territories remote to ischemia). Anomaly detection is based on a modified version of One-class Support Vector Machines (OCSVM) to directly regulate the margins by incorporating the dictionary-based representation errors. A measure of ischemic extent (IE) is estimated, reflecting the relative portion of the myocardium affected by ischemia. For visualization purposes, an ischemia likelihood map is created by estimating posterior probabilities from the OCSVM outputs, thus indicating how likely the classification is to be correct. UID is evaluated on synthetic data and in a 2D CP-BOLD data set from a canine experimental model emulating acute coronary syndromes. Comparing early ischemic territories identified with UID against infarct territories (after several hours of ischemia), we find that IE, as measured by UID, is highly correlated (Pearson's r=0.84) with respect to infarct size. When advances in automated registration and segmentation of CP-BOLD images and full coverage 3D acquisitions become available, we hope that this method can enable pixel-level assessment of ischemia with this truly non-invasive imaging technique.
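
    The representation-error idea can be sketched without the modified OCSVM: learn a small basis ("dictionary") from normal observations only, then score new segments by how poorly that basis reconstructs them, with plain thresholding standing in for the one-class SVM. The signals below are synthetic stand-ins for myocardial time series, not CP-BOLD data.

```python
import numpy as np

# Learn a basis from normal signals only (via SVD), then score new signals
# by reconstruction error. Plain thresholding replaces the paper's
# modified one-class SVM; signals are synthetic stand-ins.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 50)
normal = np.array([np.sin(t + ph) for ph in rng.uniform(0.0, 0.2, 20)])

# Dictionary atoms = top right singular vectors of the normal set.
_, _, Vt = np.linalg.svd(normal, full_matrices=False)
D = Vt[:2]                        # two atoms span the phase-shifted family

def recon_error(x):
    return float(np.linalg.norm(x - (x @ D.T) @ D))

ok = recon_error(np.sin(t + 0.1))            # in-family: tiny error
bad = recon_error(np.sign(np.sin(3.0 * t)))  # off-family: large error
print(ok, bad)
```

    In the paper's setting, these per-segment errors feed the OCSVM's margin rather than a fixed threshold, which adapts the decision boundary to the subject's own normal variability.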

  9. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.

    2015-10-01

    We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is also confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.
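
    The score in question is easy to reproduce: the cosine similarity between a reference shape and its noisy observation degrades with the noise level, which is why a fixed acceptance threshold fails and an adaptive one is needed. A toy illustration with a synthetic peak standing in for an ECG cycle shape:

```python
import numpy as np

# Cosine similarity between a reference shape and noisy observations of it:
# the score drops as noise grows, motivating a noise-adaptive threshold.
# The synthetic peak is a stand-in, not real ECG data.
def cosine_sim(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
reference = np.exp(-((t - 0.5) ** 2) / 0.002)     # idealized narrow peak

low_noise = reference + 0.05 * rng.standard_normal(t.size)
high_noise = reference + 1.0 * rng.standard_normal(t.size)
print(cosine_sim(reference, low_noise), cosine_sim(reference, high_noise))
```

    The paper's contribution is an analytical form for the distribution of this score as a function of noise level alone, from which the adaptive threshold follows in closed form rather than by simulation.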

  10. An approach to online network monitoring using clustered patterns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jinoh; Sim, Alex; Suh, Sang C.

    Network traffic monitoring is a core element in network operations and management for various purposes such as anomaly detection, change detection, and fault/failure detection. In this study, we introduce a new approach to online monitoring using a pattern-based representation of the network traffic. Unlike the past online techniques limited to a single variable to summarize (e.g., sketch), the focus of this study is on capturing the network state from the multivariate attributes under consideration. To this end, we employ clustering with its benefit of the aggregation of multidimensional variables. The clustered result represents the state of the network with regard to the monitored variables, which can also be compared with the previously observed patterns visually and quantitatively. Finally, we demonstrate the proposed method with two popular use cases, one for estimating state changes and the other for identifying anomalous states, to confirm its feasibility.
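
    The pattern-based summary can be sketched as: cluster each traffic window's multivariate samples, keep the centroids as the window's "pattern", and compare windows by centroid distance. The tiny fixed-iteration k-means and synthetic two-feature traffic below are illustrative, not the authors' implementation.

```python
import numpy as np

# Summarize a traffic window by cluster centroids over multivariate
# features, then compare windows by centroid distance. A tiny
# fixed-iteration k-means on synthetic features; illustrative only.
def centroids(X, k=2, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
        C = np.array([X[labels == j].mean(0) if np.any(labels == j) else C[j]
                      for j in range(k)])
    return C[np.argsort(C[:, 0])]          # canonical order for comparison

def window_distance(A, B):
    return float(np.linalg.norm(centroids(A) - centroids(B)))

rng = np.random.default_rng(2)
base = np.concatenate([rng.normal(0, 0.1, (50, 2)),    # two traffic modes
                       rng.normal(5, 0.1, (50, 2))])
same = base + rng.normal(0, 0.05, base.shape)          # same network state
shifted = base + np.array([3.0, 3.0])                  # anomalous state change
print(window_distance(base, same), window_distance(base, shifted))
```

    Keeping only centroids makes the per-window summary compact enough for online comparison against a history of previously observed patterns.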

  11. An approach to online network monitoring using clustered patterns

    DOE PAGES

    Kim, Jinoh; Sim, Alex; Suh, Sang C.; ...

    2017-03-13

    Network traffic monitoring is a core element in network operations and management for various purposes such as anomaly detection, change detection, and fault/failure detection. In this study, we introduce a new approach to online monitoring using a pattern-based representation of the network traffic. Unlike the past online techniques limited to a single variable to summarize (e.g., sketch), the focus of this study is on capturing the network state from the multivariate attributes under consideration. To this end, we employ clustering with its benefit of the aggregation of multidimensional variables. The clustered result represents the state of the network with regard to the monitored variables, which can also be compared with the previously observed patterns visually and quantitatively. Finally, we demonstrate the proposed method with two popular use cases, one for estimating state changes and the other for identifying anomalous states, to confirm its feasibility.

  12. Securing the User's Work Environment

    NASA Technical Reports Server (NTRS)

    Cardo, Nicholas P.

    2004-01-01

    High performance computing at the Numerical Aerospace Simulation Facility at NASA Ames Research Center includes C90's, J90's and Origin 2000's. Not only is it necessary to protect these systems from outside attacks, but also to provide a safe working environment on the systems. With the right tools, security anomalies in the user's work environment can be detected and corrected. Validating file ownership against users' permissions will reduce the risk of inadvertent data compromise. The detection of extraneous directories and files hidden amongst user home directories is important for identifying potential compromises. The first runs of these utilities detected over 350,000 files with problems. With periodic scans, automated correction of problems takes only minutes. Tools for detecting these types of problems, as well as their development techniques, will be discussed with emphasis on consistency, portability and efficiency for both UNICOS and IRIX.
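
    A scan of the sort described can be sketched with the standard library: walk a directory tree and flag world-writable files and files whose owner differs from an expected uid. The specific checks and demo path are illustrative assumptions, not the NAS facility's utilities.

```python
import os
import stat
import tempfile

# Walk a directory tree and flag world-writable files and files with an
# unexpected owner. Illustrative checks only, not the NAS tools.
def scan(root, expected_uid=None):
    findings = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.lstat(path)          # lstat: do not follow symlinks
            except OSError:
                continue                     # file vanished mid-scan
            if st.st_mode & stat.S_IWOTH:
                findings.append((path, "world-writable"))
            if expected_uid is not None and st.st_uid != expected_uid:
                findings.append((path, "unexpected-owner"))
    return findings

# Demo: a deliberately world-writable file in a scratch directory.
d = tempfile.mkdtemp()
p = os.path.join(d, "report.txt")
open(p, "w").close()
os.chmod(p, 0o646)
print(scan(d))
```

    Run periodically, such a scan produces a short actionable list, which is what makes automated correction fast once the initial backlog is cleared.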

  13. Development and Application of Wide Bandwidth Magneto-Resistive Sensor Based Eddy Current Probe

    NASA Technical Reports Server (NTRS)

    Wincheski, Russell A.; Simpson, John

    2010-01-01

    The integration of magneto-resistive sensors into eddy current probes can significantly expand the capabilities of conventional eddy current nondestructive evaluation techniques. The room temperature solid-state sensors have typical bandwidths in the megahertz range and resolutions of tens of microgauss. The low frequency sensitivity of magneto-resistive sensors has been capitalized upon in previous research to fabricate very low frequency eddy current sensors for deep flaw detection in multilayer conductors. In this work a modified probe design is presented to expand the capabilities of the device. The new probe design incorporates a dual induction source enabling operation from low frequency deep flaw detection to high frequency high resolution near surface material characterization. Applications of the probe for the detection of localized near surface conductivity anomalies are presented. Finite element modeling of the probe is shown to be in good agreement with experimental measurements.

  14. Methods for Finding Legacy Wells in Residential and Commercial Areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammack, Richard W.; Veloski, Garret A.

    In 1919, the enthusiasm surrounding a short-lived gas play in Versailles Borough, Pennsylvania resulted in the drilling of many needless wells. The legacy of this activity exists today in the form of abandoned, unplugged gas wells that are a continuing source of fugitive methane in the midst of a residential and commercial area. Flammable concentrations of methane have been detected near building foundations, forcing people from their homes and businesses until methane concentrations decreased. Despite mitigation efforts, methane problems persist and have caused some buildings to be permanently abandoned and demolished. This paper describes the use of magnetic and methane sensing methods by the National Energy Technology Laboratory (NETL) to locate abandoned gas wells in Versailles Borough, where site access is limited and existing infrastructure can interfere. Here, wells are located between closely spaced houses and beneath buildings and parking lots. Wells are seldom visible, often because wellheads and internal casing strings have been removed, and external casing has been cut off below ground level. The magnetic survey of Versailles Borough identified 53 strong, monopole magnetic anomalies that are presumed to indicate the locations of steel-cased wells. This hypothesis was tested by excavating the location of one strong, monopole magnetic anomaly that was within an area of anomalous methane concentrations. The excavation uncovered an unplugged gas well that was within 0.2 m of the location of the maximum magnetic signal. Truck-mounted methane surveys of Versailles Borough detected numerous methane anomalies that were useful for narrowing search areas. Methane sources identified during truck-mounted surveys included strong sources such as sewers and methane mitigation vents. However, inconsistent wind direction and speed, especially between buildings, made locating weaker methane sources (such as leaking wells) difficult.
Walking surveys with the methane detector mounted on a cart or wagon were more effective for detecting leaking wells because the instrument's air inlet was near the ground, where: 1) the methane concentration from subsurface sources (including wells) was at a maximum, and 2) there was less displacement of methane anomalies from their sources by air currents. The Versailles Borough survey found 15 methane anomalies that coincided with the locations of well-type magnetic anomalies; the methane sources for these anomalies were assumed to be leaking wells. For abandoned well locations where the wellhead and all casing strings have been removed and there is no magnetic anomaly, leaking wellbores can sometimes be detected by methane surveys. Unlike magnetic anomalies, methane anomalies can be: 1) ephemeral, 2) significantly displaced from the well location, and 3) from non-well sources that cannot be discriminated without isotopic analysis. If methane surveys are used for well location, the air inlet to the instrument should be kept as close to the ground as possible to minimize the likelihood of detecting methane from distant, wind-blown sources.
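    Picking strong monopole anomalies out of a survey profile amounts to flagging local maxima above a background threshold. The sketch below is a simplified stand-in for that step; the threshold and minimum-separation values are illustrative, not the NETL survey parameters.

```python
def pick_magnetic_anomalies(readings, threshold_nt, min_separation=5):
    """Flag indices of local maxima in a 1-D total-field magnetic profile
    that exceed a background threshold (in nT), skipping picks closer than
    min_separation samples to the previous pick."""
    picks = []
    for i in range(1, len(readings) - 1):
        is_peak = (readings[i] >= threshold_nt
                   and readings[i] >= readings[i - 1]
                   and readings[i] > readings[i + 1])
        if is_peak and (not picks or i - picks[-1] >= min_separation):
            picks.append(i)
    return picks
```

    In a real survey the same idea is applied to a 2-D grid, and each pick is then cross-checked against the methane anomalies, as the abstract describes.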

  15. Metal Whiskers: Failure Modes and Mitigation Strategies

    NASA Technical Reports Server (NTRS)

    Brusse, Jay A.; Leidecker, Henning

    2007-01-01

    Metal coatings, especially tin, zinc and cadmium, are unpredictably susceptible to the formation of electrically conductive, crystalline filaments referred to as metal whiskers. The use of such coatings in and around electrical systems presents a risk of electrical shorting. Examples of metal whisker formation are shown with emphasis on optical inspection techniques to improve probability of detection. The failure modes (i.e., electrical shorting behavior) associated with metal whiskers are described. Based on an almost 9-year-long study, the benefits of polyurethane conformal coat (namely, Arathane 5750) to protect electrical conductors from whisker-induced short circuit anomalies are discussed.

  16. Human recognition in a video network

    NASA Astrophysics Data System (ADS)

    Bhanu, Bir

    2009-10-01

    Video networking is an emerging interdisciplinary field with significant and exciting scientific and technological challenges. It has great promise in solving many real-world problems and enabling a broad range of applications, including smart homes, video surveillance, environment and traffic monitoring, elderly care, intelligent environments, and entertainment in public and private spaces. This paper provides an overview of the design of a wireless video network as an experimental environment, covering camera selection, hand-off and control, and anomaly detection. It addresses challenging questions for individual identification using gait and face at a distance and presents new techniques and their comparison for robust identification.

  17. Anomalies and gravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mielke, Eckehard W.

    Anomalies in Yang-Mills type gauge theories of gravity are reviewed. Particular attention is paid to the relation between the Dirac spin, the axial current j5 and the non-covariant gauge spin C. Using diagrammatic techniques, we show that only generalizations of the U(1) Pontrjagin four-form F ∧ F = dC arise in the chiral anomaly, even when coupled to gravity. Implications for Ashtekar's canonical approach to quantum gravity are discussed.
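    For context, the standard Abelian (ABJ) chiral anomaly ties the divergence of the axial current to the U(1) Pontryagin density, which is exact because the four-form is the exterior derivative of the Chern-Simons three-form. This is a textbook result stated here for orientation; normalization conventions vary:

```latex
\partial_\mu j_5^{\,\mu}
  \;=\; \frac{e^2}{16\pi^2}\,
        \epsilon^{\mu\nu\rho\sigma} F_{\mu\nu} F_{\rho\sigma},
\qquad
F \wedge F \;=\; \mathrm{d}C,
\quad
C \;=\; A \wedge \mathrm{d}A .
```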

  18. Optimize the Coverage Probability of Prediction Interval for Anomaly Detection of Sensor-Based Monitoring Series

    PubMed Central

    Liu, Datong; Peng, Yu; Peng, Xiyuan

    2018-01-01

    Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide an uncertainty representation, probability prediction methods (e.g., Gaussian process regression (GPR) and relevance vector machine (RVM)) are especially adaptable to anomaly detection for sensing series. Generally, one key parameter of prediction models is the coverage probability (CP), which controls the judging threshold of the testing sample and is usually set to a default value (e.g., 90% or 95%). There are few criteria for determining the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator, the receiver operating characteristic curve of prediction interval (ROC-PI), based on the definition of the ROC curve, which depicts the trade-off between the PI width and the PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs; minimizing it with the simulated annealing (SA) algorithm yields the optimal CP. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. In particular, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application. PMID:29587372
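    The CP-selection idea can be illustrated with the classic (unmodified) Youden index J = sensitivity + specificity - 1, maximized over a small CP grid rather than optimized with the paper's modified index and simulated annealing. The function name, the CP grid, and the use of standardized residuals are assumptions for this sketch.

```python
import math

def youden_best_cp(residuals_z, labels, cps=(0.80, 0.90, 0.95, 0.99)):
    """Pick the prediction-interval coverage probability (CP) that
    maximizes J = sensitivity + specificity - 1. residuals_z are
    standardized prediction errors (residual / predictive std);
    labels mark the true anomalies."""
    def z_crit(cp):
        # two-sided normal critical value: solve erf(z/sqrt(2)) = cp
        lo, hi = 0.0, 10.0
        for _ in range(60):
            mid = (lo + hi) / 2
            if math.erf(mid / math.sqrt(2)) < cp:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    best_cp, best_j = None, -2.0
    for cp in cps:
        z = z_crit(cp)
        flags = [abs(r) > z for r in residuals_z]  # outside the PI => anomaly
        tp = sum(f and y for f, y in zip(flags, labels))
        fn = sum((not f) and y for f, y in zip(flags, labels))
        tn = sum((not f) and (not y) for f, y in zip(flags, labels))
        fp = sum(f and (not y) for f, y in zip(flags, labels))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_cp, best_j = cp, j
    return best_cp, best_j
```

    Too small a CP flags normal noise as anomalous (poor specificity); too large a CP misses real anomalies (poor sensitivity); the index trades the two off.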

  19. A system for learning statistical motion patterns.

    PubMed

    Hu, Weiming; Xiao, Xuejuan; Fu, Zhouyu; Xie, Dan; Tan, Tieniu; Maybank, Steve

    2006-09-01

    Analysis of motion patterns is an effective approach for anomaly detection and behavior prediction. Current approaches for the analysis of motion patterns depend on known scenes, where objects move in predefined ways. It is highly desirable to automatically construct object motion patterns which reflect the knowledge of the scene. In this paper, we present a system for automatically learning motion patterns for anomaly detection and behavior prediction based on a proposed algorithm for robustly tracking multiple objects. In the tracking algorithm, foreground pixels are clustered using a fast, accurate fuzzy K-means algorithm. Growing and prediction of the cluster centroids of foreground pixels ensure that each cluster centroid is associated with a moving object in the scene. In the algorithm for learning motion patterns, trajectories are clustered hierarchically using spatial and temporal information, and then each motion pattern is represented with a chain of Gaussian distributions. Based on the learned statistical motion patterns, statistical methods are used to detect anomalies and predict behaviors. Our system is tested using image sequences acquired, respectively, from a crowded real traffic scene and a model traffic scene. Experimental results show the robustness of the tracking algorithm, the efficiency of the algorithm for learning motion patterns, and the encouraging performance of the algorithms for anomaly detection and behavior prediction.
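    The soft clustering step can be sketched with plain fuzzy c-means on 2-D points: each point gets a membership in every cluster, and centroids are membership-weighted means. This is the textbook algorithm, not the paper's "fast accurate" variant, whose speed-ups are not reproduced here; the evenly-spaced initialization is an assumption for the example.

```python
def fuzzy_kmeans(points, k, m=2.0, iters=50):
    """Fuzzy c-means on a list of (x, y) tuples. m > 1 controls fuzziness;
    m -> 1 approaches hard K-means. Returns the k centroids."""
    # deterministic initialization: k evenly spaced input points
    step = max(len(points) // k, 1)
    centers = [points[i * step] for i in range(k)]
    n = len(points)
    for _ in range(iters):
        # membership update: u_ij = 1 / sum_c (d_ij / d_ic)^(2/(m-1))
        u = []
        for p in points:
            d = [max(((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2) ** 0.5, 1e-12)
                 for c in centers]
            u.append([1.0 / sum((d[j] / d[c]) ** (2.0 / (m - 1.0))
                                for c in range(k))
                      for j in range(k)])
        # centroid update: mean weighted by u_ij^m
        for j in range(k):
            w = [u[i][j] ** m for i in range(n)]
            tot = sum(w)
            centers[j] = (sum(w[i] * points[i][0] for i in range(n)) / tot,
                          sum(w[i] * points[i][1] for i in range(n)) / tot)
    return centers
```

    In the tracking context, each foreground pixel plays the role of a point, and a centroid tracks one moving object between frames.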

  20. Accurate mobile malware detection and classification in the cloud.

    PubMed

    Wang, Xiaolei; Yang, Yuexiang; Zeng, Yingzhi

    2015-01-01

    As the dominant player in the smartphone operating system market, Android has attracted the attention of malware authors and researchers alike. The number of Android malware variants is increasing rapidly despite the considerable number of proposed malware analysis systems. In this paper, by taking advantage of the low false-positive rate of misuse detection and the ability of anomaly detection to detect zero-day malware, we propose a novel hybrid detection system based on a new open-source framework, CuckooDroid, which enables the use of Cuckoo Sandbox's features to analyze Android malware through dynamic and static analysis. Our proposed system mainly consists of two parts: an anomaly detection engine performing abnormal app detection through dynamic analysis, and a signature detection engine performing known malware detection and classification with a combination of static and dynamic analysis. We evaluate our system using 5,560 malware samples and 6,000 benign samples. Experiments show that our anomaly detection engine with dynamic analysis is capable of detecting zero-day malware with a low false-negative rate (1.16%) and an acceptable false-positive rate (1.30%); it is worth noting that our signature detection engine with hybrid analysis can accurately classify malware samples with an average true-positive rate of 98.94%. Considering the intensive computing resources required by static and dynamic analysis, our proposed detection system should be deployed off-device, such as in the cloud. App store markets and ordinary users can then access our detection system for malware detection through a cloud service.
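    The two-stage decision logic of such a hybrid system can be sketched as follows: the signature engine answers first for known families (exploiting its low false-positive rate), and only unmatched samples fall through to the anomaly engine's behavior-score threshold. The verdict labels, threshold, and signature format below are assumptions for illustration, not CuckooDroid's actual interface.

```python
def hybrid_verdict(app_signature, behavior_score, known_signatures,
                   anomaly_threshold=0.7):
    """Two-stage malware decision. known_signatures maps a signature
    string to a malware family name; behavior_score in [0, 1] comes
    from dynamic analysis of the app's runtime behavior."""
    family = known_signatures.get(app_signature)
    if family is not None:
        return ("malware", family)        # misuse detection: known family
    if behavior_score >= anomaly_threshold:
        return ("suspicious", "unknown")  # anomaly detection: possible zero-day
    return ("benign", None)
```

    Because both engines rely on heavyweight static and dynamic analysis, this decision pipeline is the part that would run server-side in the cloud deployment the abstract recommends.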
