Science.gov

Sample records for anomaly detection system

  1. Ferret Workflow Anomaly Detection System

    DTIC Science & Technology

    2005-02-28

    The Ferret workflow anomaly detection system project 2003-2004 has provided validation and anomaly detection in accredited workflows in secure...completed to accomplish a goal. Anomaly detection is the determination that a condition departs from the expected. The baseline behavior from which the

  2. System and method for anomaly detection

    DOEpatents

    Scherrer, Chad

    2010-06-15

    A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrence as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.
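
    The scoring pipeline this patent abstract outlines (frequency tables feeding a conditional independence model, with records scored by their likelihood) can be illustrated compactly. The sketch below is only a loose reading of the abstract, assuming categorical network-flow fields and Laplace-smoothed counts; all field names, values and the smoothing constant are illustrative.

      from collections import Counter, defaultdict
      import math

      def fit_conditional_model(records, condition_key):
          """Fit a simple conditional-independence model: every other field is
          treated as independent given condition_key. Frequency tables keep
          the raw counts, as in the abstract."""
          cond_counts = Counter()
          feature_tables = defaultdict(Counter)  # (field, condition value) -> value counts
          for rec in records:
              c = rec[condition_key]
              cond_counts[c] += 1
              for k, v in rec.items():
                  if k != condition_key:
                      feature_tables[(k, c)][v] += 1
          return cond_counts, feature_tables

      def score(rec, condition_key, cond_counts, feature_tables, alpha=1.0):
          """Anomaly score = negative log-likelihood under the model
          (higher = more anomalous), with Laplace smoothing alpha."""
          total = sum(cond_counts.values())
          c = rec[condition_key]
          logp = math.log((cond_counts[c] + alpha) / (total + alpha * len(cond_counts)))
          for k, v in rec.items():
              if k == condition_key:
                  continue
              table = feature_tables[(k, c)]
              logp += math.log((table[v] + alpha) / (sum(table.values()) + alpha * (len(table) + 1)))
          return -logp

      # Toy "network observation" records; all field names are illustrative.
      flows = [{"proto": "tcp", "port": "80", "flag": "SYN"},
               {"proto": "tcp", "port": "443", "flag": "SYN"},
               {"proto": "udp", "port": "53", "flag": "NONE"}] * 50
      model = fit_conditional_model(flows, condition_key="proto")
      print(score({"proto": "tcp", "port": "6667", "flag": "URG"}, "proto", *model))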

  3. An immunity-based anomaly detection system with sensor agents.

    PubMed

    Okamoto, Takeshi; Ishida, Yoshiteru

    2009-01-01

    This paper proposes an immunity-based anomaly detection system with sensor agents based on the specificity and diversity of the immune system. Each agent is specialized to react to the behavior of a specific user. Multiple diverse agents decide whether the behavior is normal or abnormal. Conventional systems have used only a single sensor to detect anomalies, while the immunity-based system makes use of multiple sensors, which leads to improvements in detection accuracy. In addition, we propose an evaluation framework for the anomaly detection system, which is capable of evaluating the differences in detection accuracy between internal and external anomalies. This paper focuses on anomaly detection in users' command sequences on UNIX-like systems. In experiments, the immunity-based system outperformed some of the best conventional systems.

  4. An Immunity-Based Anomaly Detection System with Sensor Agents

    PubMed Central

    Okamoto, Takeshi; Ishida, Yoshiteru

    2009-01-01

    This paper proposes an immunity-based anomaly detection system with sensor agents based on the specificity and diversity of the immune system. Each agent is specialized to react to the behavior of a specific user. Multiple diverse agents decide whether the behavior is normal or abnormal. Conventional systems have used only a single sensor to detect anomalies, while the immunity-based system makes use of multiple sensors, which leads to improvements in detection accuracy. In addition, we propose an evaluation framework for the anomaly detection system, which is capable of evaluating the differences in detection accuracy between internal and external anomalies. This paper focuses on anomaly detection in users' command sequences on UNIX-like systems. In experiments, the immunity-based system outperformed some of the best conventional systems. PMID:22291560
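
    A minimal sketch of the multi-agent idea in this pair of records, assuming each agent holds a simple per-user command profile and the agents decide by majority vote; the profiles, the overlap score and the threshold are illustrative stand-ins for the paper's immunity-based sensors.

      from collections import Counter

      class SensorAgent:
          """An agent specialized to one user's command profile -- a simplified
          stand-in for the paper's immunity-based sensors."""
          def __init__(self, user, history, threshold=0.6):
              self.user = user
              self.profile = Counter(history)
              self.threshold = threshold

          def considers_normal(self, session):
              # Fraction of commands in the session that this user's profile has seen.
              seen = sum(1 for cmd in session if cmd in self.profile)
              return seen / max(len(session), 1) >= self.threshold

      def is_anomalous(session, agents):
          """Multiple diverse agents vote; flag the session if most call it abnormal."""
          votes = [agent.considers_normal(session) for agent in agents]
          return sum(votes) < len(votes) / 2

      # Hypothetical UNIX command histories for two users.
      agents = [SensorAgent("alice", ["ls", "cd", "vim", "git", "make"]),
                SensorAgent("bob", ["ls", "cd", "python", "pytest"])]
      print(is_anomalous(["ls", "cd", "vim"], agents))            # expected: False
      print(is_anomalous(["nc", "chmod", "wget", "dd"], agents))  # expected: True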

  5. Clustering and Recurring Anomaly Identification: Recurring Anomaly Detection System (ReADS)

    NASA Technical Reports Server (NTRS)

    McIntosh, Dawn

    2006-01-01

    This viewgraph presentation reviews the Recurring Anomaly Detection System (ReADS). The Recurring Anomaly Detection System is a tool to analyze text reports, such as aviation reports and maintenance records: (1) text clustering algorithms group large quantities of reports and documents, reducing human error and fatigue; (2) it identifies interconnected reports, automating the discovery of possible recurring anomalies; and (3) it provides a visualization of the clusters and recurring anomalies. We have illustrated our techniques on data from Shuttle and ISS discrepancy reports, as well as ASRS data. ReADS has been integrated with a secure online search

  6. Network anomaly detection system with optimized DS evidence theory.

    PubMed

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has attracted increasing attention with the fast development of computer networks. Some researchers have applied fusion methods and DS evidence theory to network anomaly detection, but with low performance, and they did not consider the complicated and varied features of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs each sensor's regression ability to handle complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate and that the RBPA and ODS optimization methods can improve system performance significantly.

  7. Network Anomaly Detection System with Optimized DS Evidence Theory

    PubMed Central

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has attracted increasing attention with the fast development of computer networks. Some researchers have applied fusion methods and DS evidence theory to network anomaly detection, but with low performance, and they did not consider the complicated and varied features of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs each sensor's regression ability to handle complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate and that the RBPA and ODS optimization methods can improve system performance significantly. PMID:25254258
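
    The weighting scheme these two records describe (per-sensor weights derived from past accuracy applied before evidence combination) can be sketched as classical Shafer discounting followed by Dempster's rule. The masses, weights and two-element frame below are illustrative, and the paper's regression-based BPA is not reproduced here.

      from itertools import product

      FRAME = frozenset({"normal", "anomaly"})

      def discount(m, w):
          """Shafer discounting: scale a sensor's masses by its reliability weight w
          (here standing in for weights learned from past prediction accuracy)."""
          out = {A: w * v for A, v in m.items() if A != FRAME}
          out[FRAME] = 1.0 - sum(out.values())
          return out

      def combine(m1, m2):
          """Dempster's rule of combination for two basic probability assignments."""
          fused, conflict = {}, 0.0
          for (A, v1), (B, v2) in product(m1.items(), m2.items()):
              inter = A & B
              if inter:
                  fused[inter] = fused.get(inter, 0.0) + v1 * v2
              else:
                  conflict += v1 * v2
          return {A: v / (1.0 - conflict) for A, v in fused.items()}

      # Two sensors' BPAs over {normal, anomaly}; masses and weights are illustrative.
      m1 = {frozenset({"anomaly"}): 0.7, frozenset({"normal"}): 0.2, FRAME: 0.1}
      m2 = {frozenset({"anomaly"}): 0.5, frozenset({"normal"}): 0.3, FRAME: 0.2}
      print(combine(discount(m1, 0.9), discount(m2, 0.6)))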

  8. Probabilistic Anomaly Detection in Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic

    1993-01-01

    This paper describes probabilistic methods for novelty detection when using pattern recognition methods for fault monitoring of dynamic systems. The problem of novelty detection is particularly acute when prior knowledge and data only allow one to construct an incomplete prior model of the system. Hence, some allowance must be made in model design so that a classifier will be robust to data generated by classes not included in the training phase.

  9. System for Anomaly and Failure Detection (SAFD) system development

    NASA Technical Reports Server (NTRS)

    Oreilly, D.

    1993-01-01

    The System for Anomaly and Failure Detection (SAFD) algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failures as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions. This task assignment originally specified developing a platform for executing the algorithm during hot fire tests at Technology Test Bed (TTB) and installing the SAFD algorithm on that platform. Two units were built and installed in the Hardware Simulation Lab and at the TTB in December 1991. Since that time, the task primarily entailed improvement and maintenance of the systems, additional testing to prove the feasibility of the algorithm, and support of hot fire testing. This document addresses the work done since the last report of June 1992. The work on the System for Anomaly and Failure Detection during this period included improving the platform and the algorithm, testing the algorithm against previous test data and in the Hardware Simulation Lab, installing other algorithms on the system, providing support for operations at the Technology Test Bed, and providing routine maintenance.

  10. System for Anomaly and Failure Detection (SAFD) system development

    NASA Technical Reports Server (NTRS)

    Oreilly, D.

    1992-01-01

    This task specified developing the hardware and software necessary to implement the System for Anomaly and Failure Detection (SAFD) algorithm, developed under Technology Test Bed (TTB) Task 21, on the TTB engine stand. This effort involved building two units: one unit to be installed in the Block II Space Shuttle Main Engine (SSME) Hardware Simulation Lab (HSL) at Marshall Space Flight Center (MSFC), and one unit to be installed at the TTB engine stand. Rocketdyne personnel from the HSL performed the task. The SAFD algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failure as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions.

  11. System for Anomaly and Failure Detection (SAFD) system development

    NASA Astrophysics Data System (ADS)

    Oreilly, D.

    1992-07-01

    This task specified developing the hardware and software necessary to implement the System for Anomaly and Failure Detection (SAFD) algorithm, developed under Technology Test Bed (TTB) Task 21, on the TTB engine stand. This effort involved building two units: one unit to be installed in the Block II Space Shuttle Main Engine (SSME) Hardware Simulation Lab (HSL) at Marshall Space Flight Center (MSFC), and one unit to be installed at the TTB engine stand. Rocketdyne personnel from the HSL performed the task. The SAFD algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failure as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions.

  12. Attention focusing and anomaly detection in systems monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, Richard J.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. The focus of this paper is a new technique for attention focusing. The technique involves reasoning about the distance between two frequency distributions, and is used to detect both anomalous system parameters and 'broken' causal dependencies. These two forms of information together isolate the locus of anomalous behavior in the system being monitored.
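
    A small sketch of the core operation described here: computing a distance between the nominal and the observed frequency distribution of each sensor and surfacing the sensors that have drifted the most. The Jensen-Shannon distance and the threshold are assumptions; the paper does not commit to this particular metric.

      import numpy as np

      def js_distance(p, q, eps=1e-12):
          """Jensen-Shannon distance between two frequency distributions."""
          p = np.asarray(p, float) + eps
          q = np.asarray(q, float) + eps
          p, q = p / p.sum(), q / q.sum()
          m = 0.5 * (p + q)
          kl = lambda a, b: float(np.sum(a * np.log(a / b)))
          return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

      def focus_attention(nominal, observed, threshold=0.2):
          """Rank sensors whose observed histogram drifted furthest from nominal --
          the candidates to bring to the operator's attention."""
          scores = {name: js_distance(nominal[name], observed[name]) for name in nominal}
          return sorted(((s, n) for n, s in scores.items() if s > threshold), reverse=True)

      # Hypothetical per-sensor histograms over four value bins.
      nominal = {"pressure": [10, 70, 15, 5], "temp": [20, 60, 15, 5]}
      observed = {"pressure": [12, 68, 14, 6], "temp": [2, 10, 30, 58]}
      print(focus_attention(nominal, observed))  # 'temp' should surface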

  13. Extending TOPS: Knowledge Management System for Anomaly Detection and Analysis

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Michaelis, A.

    2009-12-01

    The Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending TOPS to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. While there are large numbers of anomaly detection algorithms for multivariate datasets, we are extending this capability beyond anomaly detection itself towards an automated analysis that would discover the possible causes of the anomalies. There are often indirect connections between datasets that manifest themselves during the occurrence of external events, and rather than searching exhaustively throughout all the datasets, our goal is to capture this knowledge and provide it to the system during automated analysis. This results in more efficient processing: since we do not need to process all the datasets with the original anomaly detection algorithms, which is often compute intensive, and we do not need to store all the datasets in order to search for possible connections, we achieve data reduction and can download selected data on demand based on our analysis. For example, an anomaly observed in vegetation Net Primary Production (NPP) can relate to an anomaly in vegetation Leaf Area Index (LAI), which is a fairly direct connection, as LAI is one of the inputs for NPP; however, the change in LAI could be caused by a fire event, which is not directly connected with NPP. Because we are able to capture this knowledge, we can analyze fire datasets, and if there is a match with the NPP anomaly, we can infer that a fire is a likely cause. The knowledge is captured using the OWL ontology language, where connections are defined in a schema

  14. Extending TOPS: Ontology-driven Anomaly Detection and Analysis System

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Michaelis, A.

    2010-12-01

    Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending the Terrestrial Observation and Prediction System (TOPS) to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. We can query the knowledge base and answer questions about dataset compatibilities, similarities and dependencies so that we can, for example, automatically analyze similar datasets in order to verify a given anomaly occurrence in multiple data sources. We are further extending the system to go beyond anomaly detection towards reasoning about possible causes of anomalies that are also encoded in the knowledge base as either learned or implied knowledge. This enables us to scale up the analysis by eliminating a large number of anomalies early on during the processing by either failure to verify them from other sources, or matching them directly with other observable events without having to perform an extensive and time-consuming exploration and analysis. The knowledge is captured using OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. The information is stored using Sesame server and is accessible through both Java API and web services using SeRQL and SPARQL query languages. Inference is provided using OWLIM component integrated with Sesame.

  15. Anomaly-based intrusion detection for SCADA systems

    SciTech Connect

    Yang, D.; Usynin, A.; Hines, J. W.

    2006-07-01

    Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security attention, and there are concerns that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimated that malicious online actions could cause on the order of $75 billion in losses as of 2007. One interesting countermeasure for enhancing information system security is intrusion detection. This paper briefly discusses the history of research in intrusion detection techniques and introduces the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT) and is applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)
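
    A minimal sketch of the AAKR-plus-SPRT monitoring scheme this abstract describes, with synthetic data in place of SCADA signals; the kernel bandwidth, the assumed mean shift under the fault hypothesis, and the error rates are illustrative choices, not values from the paper.

      import numpy as np

      def aakr_predict(memory, query, bandwidth=1.0):
          """Auto-associative kernel regression: reconstruct the query as a
          kernel-weighted average of memory vectors from normal operation."""
          d2 = np.sum((memory - query) ** 2, axis=1)
          w = np.exp(-d2 / (2.0 * bandwidth ** 2))
          return (w / w.sum()) @ memory

      def sprt(residuals, sigma, m=1.0, alpha=0.01, beta=0.01):
          """Sequential probability ratio test on the residual stream:
          H0 = residual mean 0 versus H1 = residual mean m, variance sigma^2."""
          upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
          llr = 0.0
          for i, r in enumerate(residuals):
              llr += (m / sigma ** 2) * (r - m / 2.0)
              if llr >= upper:
                  return "anomaly", i
              if llr <= lower:
                  llr = 0.0  # accept H0 for this stretch and keep monitoring
          return "normal", len(residuals)

      # Toy two-sensor example: normal-operation memory, then a drifting signal.
      rng = np.random.default_rng(0)
      memory = rng.normal(0.0, 0.1, size=(200, 2))
      test = rng.normal(0.0, 0.1, size=(50, 2)) + np.linspace(0, 1, 50)[:, None]
      resid = [np.linalg.norm(x - aakr_predict(memory, x)) for x in test]
      print(sprt(resid, sigma=0.1))  # should report an anomaly partway through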

  16. Host-Based Anomaly Detection Using Wrapping File Systems

    DTIC Science & Technology

    2004-01-01

    experimental data acquired from this sensor. FWRAP employs the Probabilistic Anomaly Detection (PAD) algorithm previously reported in our work on Windows...Registry Anomaly Detection. The detector is first trained by operating the host computer for some amount of time and a model specific to the target

  17. Conditional anomaly detection methods for patient-management alert systems

    PubMed Central

    Valko, Michal; Cooper, Gregory; Seybert, Amy; Visweswaran, Shyam; Saul, Melissa; Hauskrecht, Milos

    2010-01-01

    Anomaly detection methods can be very useful in identifying unusual or interesting patterns in data. A recently proposed conditional anomaly detection framework extends anomaly detection to the problem of identifying anomalous patterns on a subset of attributes in the data. The anomaly always depends (is conditioned) on the value of the remaining attributes. The work presented in this paper focuses on instance-based methods for detecting conditional anomalies. The methods rely on the distance metric to identify examples in the dataset that are most critical for detecting the anomaly. We investigate various metrics and metric learning methods to optimize the performance of the instance-based anomaly detection methods. We show the benefits of the instance-based methods on two real-world detection problems: detection of unusual admission decisions for patients with community-acquired pneumonia and detection of unusual orders of an HPF4 test that is used to confirm heparin-induced thrombocytopenia, a life-threatening condition caused by heparin therapy. PMID:25392850

  18. Conditional anomaly detection methods for patient-management alert systems.

    PubMed

    Valko, Michal; Cooper, Gregory; Seybert, Amy; Visweswaran, Shyam; Saul, Melissa; Hauskrecht, Milos

    2008-07-01

    Anomaly detection methods can be very useful in identifying unusual or interesting patterns in data. A recently proposed conditional anomaly detection framework extends anomaly detection to the problem of identifying anomalous patterns on a subset of attributes in the data. The anomaly always depends (is conditioned) on the value of the remaining attributes. The work presented in this paper focuses on instance-based methods for detecting conditional anomalies. The methods rely on the distance metric to identify examples in the dataset that are most critical for detecting the anomaly. We investigate various metrics and metric learning methods to optimize the performance of the instance-based anomaly detection methods. We show the benefits of the instance-based methods on two real-world detection problems: detection of unusual admission decisions for patients with community-acquired pneumonia and detection of unusual orders of an HPF4 test that is used to confirm heparin-induced thrombocytopenia, a life-threatening condition caused by heparin therapy.
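
    The instance-based scheme in these two records can be sketched as a nearest-neighbour estimate of how likely the target value is given the context attributes. The standardized Euclidean metric below is just one choice (the papers study metric learning explicitly), and the clinical field names and data are hypothetical.

      import numpy as np

      def conditional_anomaly_score(X_context, y_target, x_query, y_query, k=25):
          """Instance-based conditional anomaly score: how unusual is the target
          value y_query given context x_query, judged from the k nearest training
          examples under a (standardized) Euclidean metric."""
          mu, sd = X_context.mean(axis=0), X_context.std(axis=0)
          d = np.linalg.norm((X_context - mu) / sd - (x_query - mu) / sd, axis=1)
          nn = np.argsort(d)[:k]
          p = (np.sum(y_target[nn] == y_query) + 1.0) / (k + 2.0)  # smoothed P(y | context)
          return 1.0 - p  # high score = target value rarely seen in this context

      # Hypothetical data: context = [age, severity], target = admitted (0/1).
      rng = np.random.default_rng(1)
      X = np.column_stack([rng.uniform(20, 90, 500), rng.uniform(0, 1, 500)])
      y = (X[:, 1] > 0.5).astype(int)  # in this toy world, severe cases get admitted
      print(conditional_anomaly_score(X, y, np.array([85.0, 0.9]), 0))  # anomalous decision
      print(conditional_anomaly_score(X, y, np.array([85.0, 0.9]), 1))  # expected decision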

  19. Rule-based expert system for maritime anomaly detection

    NASA Astrophysics Data System (ADS)

    Roy, Jean

    2010-04-01

    Maritime domain operators/analysts have a mandate to be aware of all that is happening within their areas of responsibility. This mandate derives from the needs to defend sovereignty, protect infrastructures, counter terrorism, detect illegal activities, etc., and it has become more challenging in the past decade, as commercial shipping turned into a potential threat. In particular, a huge portion of the data and information made available to the operators/analysts is mundane, from maritime platforms going about normal, legitimate activities, and it is very challenging for them to detect and identify the non-mundane. To achieve such anomaly detection, they must establish numerous relevant situational facts from a variety of sensor data streams. Unfortunately, many of the facts of interest just cannot be observed; the operators/analysts thus use their knowledge of the maritime domain and their reasoning faculties to infer these facts. As they are often overwhelmed by the large amount of data and information, automated reasoning tools could be used to support them by inferring the necessary facts, ultimately providing indications and warning on a small number of anomalous events worthy of their attention. Along this line of thought, this paper describes a proof-of-concept prototype of a rule-based expert system implementing automated rule-based reasoning in support of maritime anomaly detection.

  20. Log Summarization and Anomaly Detection for Troubleshooting Distributed Systems

    SciTech Connect

    Gunter, Dan; Tierney, Brian L.; Brown, Aaron; Swany, Martin; Bresnahan, John; Schopf, Jennifer M.

    2007-08-01

    Today's system monitoring tools are capable of detecting system failures such as host failures, OS errors, and network partitions in near-real time. Unfortunately, the same cannot yet be said of the end-to-end distributed software stack. Any given action, for example, reliably transferring a directory of files, can involve a wide range of complex and interrelated actions across multiple pieces of software: checking user certificates and permissions, getting details for all files, performing third-party transfers, understanding re-try policy decisions, etc. We present an infrastructure for troubleshooting complex middleware, a general purpose technique for configurable log summarization, and an anomaly detection technique that works in near-real time on running Grid middleware. We present results gathered using this infrastructure from instrumented Grid middleware and applications running on the Emulab testbed. From these results, we analyze the effectiveness of several algorithms at accurately detecting a variety of performance anomalies.

  1. Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.

    PubMed

    Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda

    2014-05-01

    We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus, which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of an anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is here applied to the problem of anomaly detection for a video annotation system.

  2. A Bayesian Hidden Markov Model-based approach for anomaly detection in electronic systems

    NASA Astrophysics Data System (ADS)

    Dorj, E.; Chen, C.; Pecht, M.

    Early detection of anomalies in any system or component prevents impending failures and enhances performance and availability. The complex architecture of electronics, the interdependency of component functionalities, and the miniaturization of most electronic systems make it difficult to detect and analyze anomalous behaviors. A Hidden Markov Model-based classification technique determines unobservable hidden behaviors of complex and remotely inaccessible electronic systems using observable signals. This paper presents a data-driven approach for anomaly detection in electronic systems based on a Bayesian Hidden Markov Model classification technique. The posterior parameters of the Hidden Markov Models are estimated using the conjugate prior method. An application of the developed Bayesian Hidden Markov Model-based anomaly detection approach is presented for detecting anomalous behavior in Insulated Gate Bipolar Transistors using experimental data. The detection results illustrate that the developed anomaly detection approach can help detect anomalous behaviors in electronic systems, which can help prevent system downtime and catastrophic failures.
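
    A rough sketch of likelihood-based HMM anomaly detection in the spirit of this record, using the third-party hmmlearn package; a maximum-likelihood GaussianHMM stands in for the paper's Bayesian conjugate-prior estimation, and the data, window size and threshold rule are synthetic assumptions.

      import numpy as np
      from hmmlearn.hmm import GaussianHMM  # third-party package: pip install hmmlearn

      rng = np.random.default_rng(0)
      healthy = rng.normal(0.0, 1.0, size=(2000, 1))   # stand-in for healthy sensor data
      model = GaussianHMM(n_components=3, covariance_type="full", n_iter=50)
      model.fit(healthy)

      def window_score(window):
          """Average log-likelihood of a window under the trained HMM."""
          return model.score(window) / len(window)

      # Threshold set from healthy windows; new windows scoring far below it are flagged.
      baseline = np.array([window_score(healthy[i:i + 100]) for i in range(0, 1900, 100)])
      threshold = baseline.mean() - 3 * baseline.std()

      degraded = rng.normal(2.5, 1.5, size=(100, 1))   # simulated anomalous window
      print(window_score(degraded) < threshold)        # True -> anomaly flagged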

  3. Anomaly-Based Intrusion Detection Systems Utilizing System Call Data

    DTIC Science & Technology

    2012-03-01

    HLLW.Raleka.A, Alasrou.A, Kassbot, Shelp.A, Blaster, Francette) • E-mail worms – 9 instances (5 variants of w32.Netsky and 4 variants of w32.Beagle...For instance, the Beagle worm drops itself into the system folder, and then it e-mails its dropper. However, our prototype system

  4. NADIR (Network Anomaly Detection and Intrusion Reporter): A prototype network intrusion detection system

    SciTech Connect

    Jackson, K.A.; DuBois, D.H.; Stallings, C.A.

    1990-01-01

    The Network Anomaly Detection and Intrusion Reporter (NADIR) is an expert system which is intended to provide real-time security auditing for intrusion and misuse detection at Los Alamos National Laboratory's Integrated Computing Network (ICN). It is based on three basic assumptions: (1) that statistical analysis of computer system and user activities may be used to characterize normal system and user behavior, and that, given the resulting statistical profiles, behavior which deviates beyond certain bounds can be detected; (2) that expert system techniques can be applied to security auditing and intrusion detection; and (3) that successful intrusion detection may take place while monitoring a limited set of network activities such as user authentication and access control, file movement and storage, and job scheduling. NADIR has been developed to employ these basic concepts while monitoring the audited activities of more than 8000 ICN users.

  5. Detecting Biosphere anomalies hotspots

    NASA Astrophysics Data System (ADS)

    Guanche-Garcia, Yanira; Mahecha, Miguel; Flach, Milan; Denzler, Joachim

    2017-04-01

    The amount of satellite remote sensing measurements currently available allows data-driven methods to be applied to investigate environmental processes. The detection of anomalies or abnormal events is crucial to monitor the Earth system and to analyze their impacts on ecosystems and society. By means of a combination of statistical methods, this study proposes an intuitive and efficient methodology to detect areas that present hotspots of anomalies, i.e. higher levels of abnormal or extreme events, or more severe phases, during our historical records. Biosphere variables from a preliminary version of the Earth System Data Cube developed within the CAB-LAB project (http://earthsystemdatacube.net/) have been used in this study. This database comprises several atmosphere and biosphere variables spanning 11 years (2001-2011) with 8-day temporal resolution and 0.25° global spatial resolution. In this study, we have used 10 variables that measure the biosphere. The methodology applied to detect abnormal events follows the intuitive idea that anomalies are time steps that are not well represented by a previously estimated statistical model [1]. We combine Autoregressive Moving Average (ARMA) models with a distance metric such as the Mahalanobis distance to detect abnormal events in multiple biosphere variables. In a first step, we pre-treat the variables by removing the seasonality and normalizing them locally (μ=0, σ=1). Additionally, we have regionalized the area of study into subregions of similar climate conditions using the Köppen climate classification. For each climate region and variable, we have selected the best ARMA parameters by means of a Bayesian criterion. We then obtain the residuals by comparing the fitted models with the original data. To detect the extreme residuals from the 10 variables, we compute the Mahalanobis distance to the data's mean (Hotelling's T^2), which considers the covariance matrix of the joint
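
    A condensed sketch of the ARMA-plus-Mahalanobis pipeline outlined above, with synthetic series standing in for the deseasonalised, normalised Earth System Data Cube variables; the ARMA order, the injected anomaly and the chi-square cut-off are illustrative assumptions.

      import numpy as np
      from scipy.stats import chi2
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(0)
      n, k = 500, 3                                   # time steps, biosphere variables
      data = rng.normal(size=(n, k)).cumsum(axis=0) * 0.01 + rng.normal(size=(n, k))
      data[400:410] += 4.0                            # injected anomalous episode

      # 1) Fit an ARMA(1,1) model per (already deseasonalised, normalised) variable.
      residuals = np.column_stack(
          [ARIMA(data[:, j], order=(1, 0, 1)).fit().resid for j in range(k)]
      )

      # 2) Mahalanobis distance of the joint residual vector at each time step.
      diff = residuals - residuals.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(residuals, rowvar=False))
      d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

      # 3) Flag time steps beyond the chi-square 99.9% quantile (k degrees of freedom).
      print(np.where(d2 > chi2.ppf(0.999, df=k))[0])  # should cover the injected episode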

  6. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    SciTech Connect

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.

  7. Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)

    2001-01-01

    The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is basically a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes uncertainty and vagueness typically found in image analysis. ANN provides learning capabilities, and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along slide basket cables utilized by the astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another current, potentially viable application at NASA is in detecting anomalies of the NASA Space Shuttle Orbiter's radiator panels.

  8. Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge

    SciTech Connect

    Ondrej Linda; Todd Vollmer; Milos Manic

    2012-08-01

    The planned large scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks. The implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures multiple communications streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require specification of a sensitivity threshold, which inevitably leads to a tradeoff between false positive and false negative rates. In order to alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams, allowing accurate normal network behavior models to be learned. Furthermore, the developed system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.

  9. Addressing the Challenges of Anomaly Detection for Cyber Physical Energy Grid Systems

    SciTech Connect

    Ferragut, Erik M; Laska, Jason A; Melin, Alexander M; Czejdo, Bogdan

    2013-01-01

    The consolidation of cyber communications networks and physical control systems within the energy smart grid introduces a number of new risks. Unfortunately, these risks are largely unknown and poorly understood, yet include very high impact losses from attack and component failures. One important aspect of risk management is the detection of anomalies and changes. However, anomaly detection within cyber security remains a difficult, open problem, with special challenges in dealing with false alert rates and heterogeneous data. Furthermore, the integration of cyber and physical dynamics is often intractable. Because of their broad scope, energy grid cyber-physical systems must be analyzed at multiple scales, from individual components up to network-level dynamics. We describe an improved approach to anomaly detection that combines three important aspects. First, system dynamics are modeled using a reduced order model for greater computational tractability. Second, a probabilistic and principled approach to anomaly detection is adopted that allows for regulation of false alerts and comparison of anomalies across heterogeneous data sources. Third, a hierarchy of aggregations is constructed to support interactive and automated analyses of anomalies at multiple scales.

  10. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems

    PubMed Central

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often take place in a manufacturing system. Real-time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called the PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for these purposes. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively. PMID:27974882

  11. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems.

    PubMed

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often take place in a manufacturing system. Real-time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called the PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for these purposes. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively.
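
    The hash-table idea in these two records can be sketched as a dictionary keyed on PLC state transitions that stores nominal dwell-time ranges: unseen transitions map to operational faults and out-of-range dwell times to behavioural anomalies. The state names, dwell times and tolerance below are illustrative, not PLAT's actual model.

      from collections import defaultdict

      def build_nominal_model(log):
          """Nominal model as a hash table: each (state, next_state) transition seen
          in the fault-free PLC log maps to its min/max dwell time."""
          table = defaultdict(lambda: [float("inf"), float("-inf")])
          for (state, dwell), (next_state, _) in zip(log, log[1:]):
              lo, hi = table[(state, next_state)]
              table[(state, next_state)] = [min(lo, dwell), max(hi, dwell)]
          return dict(table)

      def check(state, dwell, next_state, model, tolerance=0.2):
          """O(1) lookup: unknown transitions are operational faults, out-of-range
          dwell times are behavioural anomalies."""
          if (state, next_state) not in model:
              return "operational fault"
          lo, hi = model[(state, next_state)]
          if not lo * (1 - tolerance) <= dwell <= hi * (1 + tolerance):
              return "behavioural anomaly"
          return "ok"

      # Hypothetical PLC log: (state, seconds spent in that state).
      nominal_log = [("IDLE", 2.0), ("CLAMP", 1.1), ("DRILL", 5.0), ("RELEASE", 1.0)] * 100
      model = build_nominal_model(nominal_log)
      print(check("CLAMP", 1.2, "DRILL", model))   # ok
      print(check("CLAMP", 9.0, "DRILL", model))   # behavioural anomaly
      print(check("CLAMP", 1.2, "EJECT", model))   # operational fault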

  12. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. Previous results on extending traditional anomaly detection techniques are summarized. The focus of this paper is a new technique for attention focusing.

  13. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. Previous results on extending traditional anomaly detection techniques are summarized. The focus of this paper is a new technique for attention focusing.

  14. High Order Non-Stationary Markov Models and Anomaly Propagation Analysis in Intrusion Detection System (IDS)

    DTIC Science & Technology

    2007-02-01

    true alarms from false positives. At the host level, a new anomaly detection mechanism that employs non-stationary Markov models is proposed....mitigate false positives, network-based correlation of collected anomalies from different hosts is suggested, as well as a new means of host-based anomaly...detection. The concept of anomaly propagation is based on the premise that false alarms do not propagate within the network. Unless anomaly

  15. An Auctioning Reputation System Based on Anomaly Detection

    DTIC Science & Technology

    2005-01-01

    systems used by online auction houses do not address the concern of a buyer shopping for commodities: finding a good bargain. These systems do not provide...INTRODUCTION Online auction houses such as eBay have emerged as a convenient way to buy and sell items over the Internet. eBay alone has over 147 million...Pattern Classification. John Wiley and Sons, Inc., New York, NY, 2nd edition, 2000. [11] eBay Inc. Policy: Seller shill bidding. Published online at http

  16. HPNAIDM: The High-Performance Network Anomaly/Intrusion Detection and Mitigation System

    SciTech Connect

    Chen, Yan

    2013-12-05

    Identifying traffic anomalies and attacks rapidly and accurately is critical for large network operators. With the rapid growth of network bandwidth, such as the next generation DOE UltraScience Network, and the fast emergence of new attacks/viruses/worms, existing network intrusion detection systems (IDS) are insufficient because they: • Are mostly host-based and not scalable to high-performance networks; • Are mostly signature-based and unable to adaptively recognize flow-level unknown attacks; • Cannot differentiate malicious events from unintentional anomalies. To address these challenges, we proposed and developed a new paradigm called the high-performance network anomaly/intrusion detection and mitigation (HPNAIDM) system. The new paradigm is significantly different from existing IDSes with the following features (research thrusts): • Online traffic recording and analysis on high-speed networks; • Online adaptive flow-level anomaly/intrusion detection and mitigation; • Integrated approach for false positive reduction. Our research prototype and evaluation demonstrate that the HPNAIDM system is highly effective and economically feasible. Beyond satisfying the pre-set goals, we even exceeded them significantly (see more details in the next section). Overall, our project harvested 23 publications (2 book chapters, 6 journal papers and 15 peer-reviewed conference/workshop papers). Besides, we built a website for technique dissemination, which hosts two system prototype releases to the research community. We also filed a patent application and developed strong international and domestic collaborations spanning both academia and industry.

  17. Symbolic Time-Series Analysis for Anomaly Detection in Mechanical Systems

    DTIC Science & Technology

    2006-08-01

    IEEE/ASME Transactions on Mechatronics, Vol. 11, No. 4, August 2006, 439. Symbolic Time-Series Analysis for Anomaly Detection in Mechanical Systems...fabricated as a multi-degree-of-freedom (DOF) mass-beam structure excited by oscillatory motion of two vibrators as shown in Fig. 1. Physical dimensions of...1. The dynamical system attains stationary behavior, in the fast time scale of machine vibrations, under persistent excitation in the vicinity of the

  18. Detecting Patterns of Anomalies

    DTIC Science & Technology

    2009-03-01

    ct)P(bt|ct), where A, B and C are mutually exclusive subsets of attributes with at most k elements. This ratio is similar to the previous formula, but...to be dependent if μ(A,B) ≥ βμ (2.1), where βμ is a threshold parameter, set empirically to a low value of 0.1 in our experiments. Thus, for a

  19. Automated anomaly detection processor

    NASA Astrophysics Data System (ADS)

    Kraiman, James B.; Arouh, Scott L.; Webb, Michael L.

    2002-07-01

    Robust exploitation of tracking and surveillance data will provide an early warning and cueing capability for military and civilian Law Enforcement Agency operations. This will improve dynamic tasking of limited resources and hence operational efficiency. The challenge is to rapidly identify threat activity within a huge background of noncombatant traffic. We discuss development of an Automated Anomaly Detection Processor (AADP) that exploits multi-INT, multi-sensor tracking and surveillance data to rapidly identify and characterize events and/or objects of military interest, without requiring operators to specify threat behaviors or templates. The AADP has successfully detected an anomaly in traffic patterns in Los Angeles, analyzed ship track data collected during a Fleet Battle Experiment to detect simulated mine laying behavior amongst maritime noncombatants, and is currently under development for surface vessel tracking within the Coast Guard's Vessel Traffic Service to support port security, ship inspection, and harbor traffic control missions, and to monitor medical surveillance databases for early alert of a bioterrorist attack. The AADP can also be integrated into combat simulations to enhance model fidelity of multi-sensor fusion effects in military operations.

  20. Can we detect regional methane anomalies? A comparison between three observing systems

    NASA Astrophysics Data System (ADS)

    Cressot, Cindy; Pison, Isabelle; Rayner, Peter J.; Bousquet, Philippe; Fortems-Cheiney, Audrey; Chevallier, Frédéric

    2016-07-01

    A Bayesian inversion system is used to evaluate the capability of the current global surface network and of the space-borne GOSAT/TANSO-FTS and IASI instruments to quantify surface flux anomalies of methane at various spatial (global, semi-hemispheric and regional) and time (seasonal, yearly, 3-yearly) scales. The evaluation is based on a signal-to-noise ratio analysis, the signal being the methane fluxes inferred from the surface-based inversion from 2000 to 2011 and the noise (i.e., precision) of each of the three observing systems being computed from the Bayesian equation. At the global and semi-hemispheric scales, all observing systems detect flux anomalies at most of the tested timescales. At the regional scale, some seasonal flux anomalies are detected by the three observing systems, but year-to-year anomalies and longer-term trends are only poorly detected. Moreover, reliably detected regions depend on the reference surface-based inversion used as the signal. Indeed, tropical flux inter-annual variability, for instance, can be attributed mostly to Africa in the reference inversion or spread between tropical regions in Africa and America. Our results show that inter-annual analyses of methane emissions inferred by atmospheric inversions should always include an uncertainty assessment and that the attribution of current trends in atmospheric methane to particular regions needs increased effort, for instance, gathering more observations (in the future) and improving transport models. At all scales, GOSAT generally shows the best performance of the three observing systems.

  1. Apparatus for detecting a magnetic anomaly contiguous to remote location by squid gradiometer and magnetometer systems

    DOEpatents

    Overton, Jr., William C.; Steyert, Jr., William A.

    1984-01-01

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  2. Apparatus for detecting a magnetic anomaly contiguous to remote location by SQUID gradiometer and magnetometer systems

    SciTech Connect

    Overton, W.C. Jr.; Steyert, W.A. Jr.

    1984-03-13

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  3. Model-Based Anomaly Detection for a Transparent Optical Transmission System

    NASA Astrophysics Data System (ADS)

    Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.

    In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.

  4. Anomaly Detection at Multiple Scales

    DTIC Science & Technology

    2011-11-07

    Arlington, VA, November 7, 2011. Anomaly Detection at Multiple Scales...Personal temperament and mental health • Distress, instability, or other vulnerability...Anomaly Detection at Multiple Scales (ADAMS) Detect

  5. Fetal Central Nervous System Anomalies Detected by Magnetic Resonance Imaging: A Two-Year Experience

    PubMed Central

    Sefidbakht, Sepideh; Dehghani, Sakineh; Safari, Maryam; Vafaei, Homeira; Kasraeian, Maryam

    2016-01-01

    Background: Magnetic resonance imaging (MRI) is gradually becoming more common for visualizing the fetus more thoroughly than ultrasound (US), especially for neurological anomalies, which are the most common indications for fetal MRI and are a matter of concern for both families and society. Objectives: We investigated fetal MRIs carried out in our center for the frequency of central nervous system anomalies. This is the first such report in southern Iran. Materials and Methods: One hundred and seven (107) pregnant women with suspected fetal anomalies on prenatal ultrasound entered a cross-sectional retrospective study from 2011 to 2013. A 1.5 T Siemens Avanto scanner was employed for sequences including T2 HASTE and Trufisp images in axial, coronal, and sagittal planes relative to the mother's body, T2 HASTE and Trufisp images relative to the specific fetal body part being evaluated, and T1 flash images in at least one plane based on clinical indication. We investigated any abnormality in the central nervous system and performed descriptive analysis to obtain frequencies. Results: Mean gestational age ± standard deviation (SD) for fetuses was 25.54 ± 5.22 weeks, and mean maternal age ± SD was 28.38 ± 5.80 years. Eighty out of 107 (74.7%) patients were referred with an initial impression of borderline ventriculomegaly. A total of 18 out of 107 (16.82%) patients were found to have fetuses with CNS anomalies, and the remainder were neurologically normal. Detected anomalies were as follows: 3 (16.6%) fetuses each had the Dandy-Walker variant and Arnold-Chiari II (with myelomeningocele). Complete agenesis of the corpus callosum, partial agenesis of the corpus callosum, and aqueductal stenosis were each seen in 2 (11.1%) fetuses. Arnold-Chiari II without myelomeningocele, anterior spina bifida associated with neurenteric cyst, arachnoid cyst, lissencephaly, and isolated enlarged cisterna magna each presented in one (5.5%) fetus. One fetus had concomitant schizencephaly and complete agenesis of

  6. Survey of Anomaly Detection Methods

    SciTech Connect

    Ng, B

    2006-10-12

    This survey defines the problem of anomaly detection and provides an overview of existing methods. The methods are categorized into two general classes: generative and discriminative. A generative approach involves building a model that represents the joint distribution of the input features and the output labels of system behavior (e.g., normal or anomalous) then applies the model to formulate a decision rule for detecting anomalies. On the other hand, a discriminative approach aims directly to find the decision rule, with the smallest error rate, that distinguishes between normal and anomalous behavior. For each approach, we will give an overview of popular techniques and provide references to state-of-the-art applications.
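
    The generative/discriminative split defined in this survey can be made concrete with a toy contrast: a Gaussian density model of normal behaviour whose low-likelihood points are flagged, versus a logistic-regression decision rule learned directly from labelled normal/anomalous examples (the latter via scikit-learn). All data and thresholds below are synthetic illustrations.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      normal = rng.normal(0, 1, size=(500, 2))
      anomalous = rng.normal(4, 1, size=(25, 2))
      x_new = np.array([[3.5, 3.5]])

      # Generative: model p(x) of normal behaviour and flag low-density points.
      mu, cov = normal.mean(axis=0), np.cov(normal, rowvar=False)
      inv, logdet = np.linalg.inv(cov), np.linalg.slogdet(cov)[1]
      def log_density(x):
          d = x - mu
          return -0.5 * (d @ inv @ d + logdet + 2 * np.log(2 * np.pi))
      threshold = np.quantile([log_density(x) for x in normal], 0.01)
      print("generative flags:", log_density(x_new[0]) < threshold)

      # Discriminative: learn the normal/anomalous decision rule directly from labels.
      X = np.vstack([normal, anomalous])
      y = np.array([0] * len(normal) + [1] * len(anomalous))
      clf = LogisticRegression().fit(X, y)
      print("discriminative flags:", bool(clf.predict(x_new)[0]))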

  7. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems.

    PubMed

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. There are two characteristics of all types of shilling attacks: 1) item abnormality: the rating of target items is always maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need to be further improved. To solve these problems, this paper proposes an item anomaly detection method based on dynamic partitioning for time series. This method first dynamically partitions item-rating time series based on important points. Then, we use the chi-square distribution (χ2) to detect abnormal intervals. The experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable toward different attack models and filler sizes.

  8. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems

    PubMed Central

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. All types of shilling attacks share two characteristics: 1) item abnormality: the rating of target items is always the maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject the attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need to be further improved. To solve these problems, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first dynamically partitions item-rating time series at important points. Then, the chi-square (χ²) distribution is used to detect abnormal intervals. Experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes. PMID:26267477

  9. Characterization of normality of chaotic systems including prediction and detection of anomalies

    NASA Astrophysics Data System (ADS)

    Engler, Joseph John

    Accurate prediction and control pervade domains such as engineering, physics, chemistry, and biology. Often, it is discovered that the systems under consideration cannot be well represented as linear, periodic, or random processes. It has been shown that such systems exhibit deterministic chaos. Deterministic chaos describes systems that are governed by deterministic rules but whose data appear to follow random or quasi-periodic distributions. Deterministically chaotic systems characteristically exhibit sensitive dependence upon initial conditions, manifested through rapid divergence of states initially close to one another. Because of this characterization, it has been deemed impossible to accurately predict future states of these systems over longer time scales. Fortunately, the deterministic nature of these systems allows for accurate short-term predictions, provided the dynamics of the system are well understood. This fact has been exploited in the research community and has resulted in various algorithms for short-term prediction. Detecting normality in deterministically chaotic systems is critical to understanding the system sufficiently to be able to predict future states. Due to the sensitivity to initial conditions, the detection of normal operational states for a deterministically chaotic system can be challenging. The addition of small perturbations to the system, which may result in bifurcation of the normal states, further complicates the problem. The detection of anomalies and prediction of future states of the chaotic system allow for greater understanding of these systems. The goal of this research is to produce methodologies for determining states of normality for deterministically chaotic systems, detecting anomalous behavior, and more accurately predicting future states of the system. Additionally, the ability to detect subtle system state changes is discussed. The dissertation addresses these goals by proposing new representational

  10. Anomaly detection in clinical processes.

    PubMed

    Huang, Zhengxing; Lu, Xudong; Duan, Huilong

    2012-01-01

    Meaningful anomalies in clinical processes may be related to care performance or even patient survival. It is imperative that such anomalies be detected in a timely manner so that useful and actionable knowledge can be extracted and presented to clinicians. Many previous approaches assume prior knowledge about the structure of clinical processes, with which anomalies are detected in a supervised manner. For a majority of clinical settings, however, clinical processes are complex, ad hoc, and even unknown a priori. In this paper, we investigate how to facilitate detection of anomalies in an unsupervised manner. An anomaly detection model is presented by applying a density-based clustering method to patient careflow logs. Using the learned model, it is possible to detect whether a particular patient careflow trace is anomalous with respect to the normal traces in the logs. The approach has been validated on real data sets collected from a Chinese hospital.
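
    The sketch below illustrates the unsupervised, density-based idea under simplifying assumptions: careflow traces are reduced to invented activity-count vectors and clustered with DBSCAN, whose noise points are treated as anomalous traces; the paper's actual trace representation and distance may differ.

```python
# Minimal sketch: density-based clustering over patient careflow traces.
# Traces are reduced to activity-count vectors purely for illustration.
from collections import Counter
import numpy as np
from sklearn.cluster import DBSCAN

activities = ["admit", "triage", "lab", "imaging", "treat", "discharge"]
traces = [
    ["admit", "triage", "lab", "treat", "discharge"],
    ["admit", "triage", "lab", "treat", "discharge"],
    ["admit", "triage", "imaging", "lab", "treat", "discharge"],
    ["admit", "lab", "lab", "lab", "lab", "imaging"],   # unusual careflow
]

def to_vector(trace):
    counts = Counter(trace)
    return [counts[a] for a in activities]

X = np.array([to_vector(t) for t in traces], dtype=float)
labels = DBSCAN(eps=1.5, min_samples=2).fit_predict(X)
for trace, label in zip(traces, labels):
    if label == -1:                     # DBSCAN noise points = anomalous traces
        print("anomalous trace:", trace)
```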

  11. A function approximation approach to anomaly detection in propulsion system test data

    NASA Technical Reports Server (NTRS)

    Whitehead, Bruce A.; Hoyt, W. A.

    1993-01-01

    Ground test data from propulsion systems such as the Space Shuttle Main Engine (SSME) can be automatically screened for anomalies by a neural network. The neural network screens data after being trained on nominal data only. Given the values of 14 measurements reflecting external influences on the SSME at a given time, the neural network predicts the expected nominal value of a desired engine parameter at that time. We compared the ability of three different function-approximation techniques to perform this nominal-value prediction: a novel neural network architecture based on Gaussian bar basis functions, a conventional back-propagation neural network, and linear regression. These three techniques were tested with real data from six SSME ground tests containing two anomalies. The basis function network trained more rapidly than back propagation. It yielded nominal predictions with a confidence interval tight enough to distinguish anomalous deviations from the nominal fluctuations of an engine parameter. Since the function-approximation approach requires nominal training data only, it is capable of detecting unknown classes of anomalies for which training data are not available.
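
    A rough sketch of the nominal-value-prediction scheme is shown below, with a plain linear regressor standing in for the Gaussian bar basis-function network and simulated data in place of the 14 SSME measurements; the anomaly test is a simple residual threshold.

```python
# Sketch of nominal-value prediction for anomaly screening.
# A plain linear regressor stands in for the Gaussian-bar basis network;
# the 14 "external influence" inputs are simulated here.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
X_train = rng.normal(size=(1000, 14))                     # nominal external influences
w = rng.normal(size=14)
y_train = X_train @ w + rng.normal(0.0, 0.1, size=1000)   # nominal engine parameter

model = LinearRegression().fit(X_train, y_train)
sigma = np.std(y_train - model.predict(X_train))          # nominal residual spread

# At test time, flag samples whose measured value leaves the nominal band.
X_test = rng.normal(size=(5, 14))
y_test = X_test @ w + rng.normal(0.0, 0.1, size=5)
y_test[2] += 1.0                                          # injected anomaly
residual = np.abs(y_test - model.predict(X_test))
print("anomalous samples:", np.where(residual > 4 * sigma)[0])
```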

  12. Seismic data fusion anomaly detection

    NASA Astrophysics Data System (ADS)

    Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David

    2014-06-01

    Detecting anomalies in non-stationary signals has valuable applications in many fields, including medicine and meteorology, such as identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes from seismographic data. Given the many choices of anomaly detection algorithms, it is important to compare possible methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach uses an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other method uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives", or transformations, of the observed signal for anomalies. Possible perspectives include wavelet de-noising, the Fourier transform, peak filtering, etc. In order to evaluate these techniques via signal fusion metrics, we must apply signal preprocessing techniques such as de-noising to the original signal and then use a neural network to find anomalies in the resulting signal. From this secondary result it is possible to apply data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The results show which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.
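
    The following sketch shows only the first approach's preprocessing idea under stated assumptions: PyWavelets performs the wavelet de-noising, and a simple deviation-from-trend threshold stands in for the neural-network detector; the signal and the injected transient are synthetic.

```python
# Sketch: wavelet de-noising as one "perspective" before anomaly detection.
# PyWavelets performs the de-noising; a deviation-from-trend threshold
# stands in for the paper's neural-network detector.
import numpy as np
import pywt

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 5 * t) + 0.2 * rng.normal(size=t.size)
signal[700] += 3.0                                  # injected transient anomaly

# Soft-threshold the detail coefficients to suppress background noise.
coeffs = pywt.wavedec(signal, "db4", level=4)
coeffs = [coeffs[0]] + [pywt.threshold(c, 0.3, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: signal.size]

# Flag samples that deviate strongly from the slowly varying trend.
trend = np.convolve(denoised, np.ones(31) / 31, mode="same")
residual = np.abs(denoised - trend)
anomalies = np.where(residual > residual.mean() + 5 * residual.std())[0]
print("anomalous sample indices:", anomalies)
```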

  13. Distance Metric Learning for Conditional Anomaly Detection.

    PubMed

    Valko, Michal; Hauskrecht, Milos

    2008-01-01

    Anomaly detection methods can be very useful in identifying unusual or interesting patterns in data. A recently proposed conditional anomaly detection framework extends anomaly detection to the problem of identifying anomalous patterns on a subset of attributes in the data. The anomaly always depends on (is conditioned on) the values of the remaining attributes. The work presented in this paper focuses on instance-based methods for detecting conditional anomalies. The methods depend heavily on the distance metric that lets us identify the examples in the dataset that are most critical for detecting the anomaly. To optimize the performance of the anomaly detection methods we explore and study metric learning methods. We evaluate the quality of our methods on the Pneumonia PORT dataset by detecting unusual admission decisions for patients with community-acquired pneumonia. The results of our metric learning methods show improved detection performance over standard distance metrics, which is very promising for building automated anomaly detection systems for a variety of intelligent monitoring applications.
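
    A toy sketch of the instance-based idea follows: a case is scored by how often its recorded label disagrees with those of its nearest neighbours, with a hand-picked diagonal feature weighting standing in for the learned metric; the attributes, labels, and weights are all invented.

```python
# Sketch of instance-based conditional anomaly detection: a case is suspicious
# when its recorded label disagrees with the labels of its nearest neighbours.
# A diagonal feature weighting stands in for the paper's learned metric.
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 3))                   # patient attributes (simulated)
y = (X[:, 0] > 0).astype(int)                   # admission decision
y[10] = 1 - y[10]                               # one anomalous decision

weights = np.array([5.0, 0.5, 0.5])             # stand-in "learned" metric

def neighbour_disagreement(i, k=7):
    d = np.sqrt((((X - X[i]) ** 2) * weights).sum(axis=1))
    neighbours = np.argsort(d)[1:k + 1]         # exclude the case itself
    return np.mean(y[neighbours] != y[i])

scores = np.array([neighbour_disagreement(i) for i in range(len(X))])
print("most anomalous cases:", np.argsort(scores)[-3:][::-1])
```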

  14. Distance Metric Learning for Conditional Anomaly Detection

    PubMed Central

    Valko, Michal; Hauskrecht, Milos

    2010-01-01

    Anomaly detection methods can be very useful in identifying unusual or interesting patterns in data. A recently proposed conditional anomaly detection framework extends anomaly detection to the problem of identifying anomalous patterns on a subset of attributes in the data. The anomaly always depends on (is conditioned on) the values of the remaining attributes. The work presented in this paper focuses on instance-based methods for detecting conditional anomalies. The methods depend heavily on the distance metric that lets us identify the examples in the dataset that are most critical for detecting the anomaly. To optimize the performance of the anomaly detection methods we explore and study metric learning methods. We evaluate the quality of our methods on the Pneumonia PORT dataset by detecting unusual admission decisions for patients with community-acquired pneumonia. The results of our metric learning methods show improved detection performance over standard distance metrics, which is very promising for building automated anomaly detection systems for a variety of intelligent monitoring applications. PMID:20485452

  15. Anomaly detection: eye movement patterns.

    PubMed

    Ni, W; Fodor, J D; Crain, S; Shankweiler, D

    1998-09-01

    The symptom of a garden path in sentence processing is an important anomaly in the input string. This anomaly signals to the parser that an error has occurred, and provides cues for how to repair it. Anomaly detection is thus an important aspect of sentence processing. In the present study, we investigated how the parser responds to unambiguous sentences that contain syntactic anomalies and pragmatic anomalies, examining records of eye movement during reading. While sensitivity to the two kinds of anomaly was very rapid and essentially simultaneous, qualitative differences existed in the patterns of first-pass reading times and eye regressions. The results are compatible with the proposal that syntactic information and pragmatic information are used differently in garden-path recovery.

  16. Mining Building Energy Management System Data Using Fuzzy Anomaly Detection and Linguistic Descriptions

    SciTech Connect

    Dumidu Wijayasekara; Ondrej Linda; Milos Manic; Craig Rieger

    2014-08-01

    Building Energy Management Systems (BEMSs) are essential components of modern buildings that utilize digital control technologies to minimize energy consumption while maintaining high levels of occupant comfort. However, BEMSs can only achieve these energy savings when properly tuned and controlled. Since the indoor environment depends on uncertain criteria such as weather, occupancy, and thermal state, the performance of a BEMS can be sub-optimal at times. Unfortunately, the complexity of the BEMS control mechanism, the large amount of available data, and the inter-relations between the data can make identifying these sub-optimal behaviors difficult. This paper proposes a novel Fuzzy Anomaly Detection and Linguistic Description (Fuzzy-ADLD) method for improving the understandability of BEMS behavior for improved state-awareness. The presented method is composed of two main parts: 1) detection of anomalous BEMS behavior and 2) linguistic representation of BEMS behavior. The first part utilizes a modified nearest-neighbor clustering algorithm and a fuzzy logic rule extraction technique to build a model of normal BEMS behavior. The second part computes the most relevant linguistic description of the identified anomalies. The presented Fuzzy-ADLD method was applied to a real-world BEMS and compared against a traditional alarm-based BEMS. In six different scenarios, the Fuzzy-ADLD method identified anomalous behavior either as fast as or faster (by an hour or more) than the alarm-based BEMS. In addition, the Fuzzy-ADLD method identified cases that were missed by the alarm-based system, demonstrating its potential for increased state-awareness of abnormal building behavior.

  17. Realization and detection of Weyl semimetals and the chiral anomaly in cold atomic systems

    NASA Astrophysics Data System (ADS)

    He, Wen-Yu; Zhang, Shizhong; Law, K. T.

    2016-07-01

    In this work, we describe a method to realize a three-dimensional Weyl semimetal by coupling multilayers of a honeycomb optical lattice in the presence of a pair of Raman lasers. The Raman lasers render each isolated honeycomb layer a Chern insulator. With finite interlayer coupling, the bulk gap of the system closes at certain out-of-plane momenta due to Raman assisted tunneling and results in the Weyl semimetal phase. Using experimentally relevant parameters, we show that both one pair and two pairs of Weyl points can be realized by tuning the interlayer coupling strength. We suggest that Landau-Zener tunneling can be used to detect Weyl points and show that the transition probability increases dramatically when the Weyl point emerges. The realization of chiral anomaly by using a magnetic-field gradient is also discussed.

  18. Anomaly detection on cup anemometers

    NASA Astrophysics Data System (ADS)

    Vega, Enrique; Pindado, Santiago; Martínez, Alejandro; Meseguer, Encarnación; García, Luis

    2014-12-01

    The performances of two rotor-damaged commercial anemometers (Vector Instruments A100 LK) were studied. The calibration results (i.e., the transfer function) were very linear, the aerodynamic behavior being more efficient than that shown by the same anemometers equipped with undamaged rotors. No detection of the anomaly (the rotors’ damage) was possible based on the calibration results. However, the Fourier analysis clearly revealed this anomaly.

  19. Application-Level Anomaly Detection for the Master Caution Panel

    DTIC Science & Technology

    2005-07-01

    AFRL-IF-RS-TR-2005-278, Final Technical Report, July 2005: Application-Level Anomaly Detection for the Master Caution Panel. Author: Salvatore J. Stolfo. Contract F30602-02-2-0209. Keywords: Master Caution Panel, distributed system, anomaly detection, machine learning, probabilistic anomaly detection algorithm.

  20. Data Mining for Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Biswas, Gautam; Mack, Daniel; Mylaraswamy, Dinkar; Bharadwaj, Raj

    2013-01-01

    The Vehicle Integrated Prognostics Reasoner (VIPR) program describes methods for enhanced diagnostics as well as a prognostic extension to the current state-of-the-art Aircraft Diagnostic and Maintenance System (ADMS). VIPR introduced a new anomaly detection function for discovering previously undetected and undocumented situations in which there are clear deviations from nominal behavior. Once a baseline (nominal model of operations) is established, detection and analysis are split between on-aircraft outlier generation and off-aircraft expert analysis to characterize and classify events that may not have been anticipated by individual system providers. Offline expert analysis is supported by data curation and data mining algorithms that can be applied in the contexts of supervised and unsupervised learning. In this report, we discuss efficient methods to implement the Kolmogorov complexity measure using compression algorithms, and run a systematic empirical analysis to determine the best compression measure. Our experiments established that the combination of the DZIP compression algorithm and the CiDM distance measure provides the best results for capturing relevant properties of time series data encountered in aircraft operations. This combination was used as the basis for developing an unsupervised learning algorithm to define "nominal" flight segments using historical flight segments.
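
    Since DZIP and CiDM are specific to the report, the sketch below illustrates the shared underlying idea with the widely used normalized compression distance computed via zlib; the flight-parameter series and the quantization step are invented for illustration.

```python
# Illustration of a compression-based distance between time-series segments.
# The report's DZIP/CiDM combination is not reproduced here; the normalized
# compression distance (NCD) with zlib shows the same underlying idea.
import zlib
import numpy as np

def ncd(a: bytes, b: bytes) -> float:
    ca, cb = len(zlib.compress(a)), len(zlib.compress(b))
    cab = len(zlib.compress(a + b))
    return (cab - min(ca, cb)) / max(ca, cb)

def quantize(x):
    # Crude 8-bit quantization so a flight-parameter series becomes a byte string.
    x = np.asarray(x, dtype=float)
    scaled = np.interp(x, (x.min(), x.max()), (0, 255)).astype(np.uint8)
    return scaled.tobytes()

rng = np.random.default_rng(5)
nominal_a = quantize(np.sin(np.linspace(0, 20, 500)) + 0.05 * rng.normal(size=500))
nominal_b = quantize(np.sin(np.linspace(0, 20, 500)) + 0.05 * rng.normal(size=500))
off_nominal = quantize(rng.normal(size=500))

print("nominal vs nominal:", round(ncd(nominal_a, nominal_b), 3))
print("nominal vs off-nominal:", round(ncd(nominal_a, off_nominal), 3))
```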

  1. Verification of anomalies of the central nervous system detected by prenatal ultrasound.

    PubMed

    Wald, M; Lawrenz, K; Deutinger, J; Weninger, M

    2004-06-01

    The accuracy of fetal ultrasound (US) in diagnosing central nervous system (CNS) malformations was assessed with the aim to define in which cases US is reliable enough to assist in decisions on medical indication for abortions without resorting to magnetic resonance imaging (MRI). Retrospective analysis of the course of 69 fetuses with anomalies of the CNS detected on prenatal US in a university hospital. General Hospital of Vienna, University of Vienna, Austria. Prenatal US diagnosis was verified by postpartal US, MRI or computed tomography (CT) in the live births, and by autopsy of the fetus in cases of pregnancy termination. Abortion was induced in 40 fetuses for anencephaly (n = 4), exencephaly (n = 6), dorsal dysraphism (n = 6), encephalocele (n = 3), pronounced hydrocephaly (n = 11), holoprosencephaly (n = 4), Dandy Walker cyst (n = 5), and 1 complex syndrome - all confirmed on autopsy. In 29 live births, hydrocephaly, meningomyelocele, and microcephaly had always been correctly identified prenatally. Four Chiari malformations had been missed. Agenesis of the corpus callosum had remained unnoticed in 4 out of 14 cases and been erroneously reported in 5. Diagnostic errors were frequent for Dandy-Walker cyst and great cerebellomedullary cistern. Transabdominal fetal US did not lead to unjustified interventions. Inaccuracy in diagnosing abnormalities of the posterior fossa and the median telencephalon as well as aetiological clarification of hydrocephalus require additional MRI of the fetal CNS in patients selected accordingly.

  2. Astrometric solar system anomalies

    SciTech Connect

    Nieto, Michael Martin; Anderson, John D

    2009-01-01

    There are at least four unexplained anomalies connected with astrometric data. Perhaps the most disturbing is the fact that when a spacecraft on a flyby trajectory approaches the Earth within 2000 km or less, it often experiences a change in total orbital energy per unit mass. Next, a secular change in the astronomical unit (AU) is definitely a concern; it is increasing by about 15 cm yr⁻¹. The other two anomalies are perhaps less disturbing because of known sources of nongravitational acceleration. The first is an apparent slowing of the two Pioneer spacecraft as they exit the solar system in opposite directions. Some astronomers and physicists are convinced this effect is of concern, but many others are convinced it is produced by nearly identical thermal emission from both spacecraft, in a direction away from the Sun, thereby producing an acceleration toward the Sun. The fourth anomaly is a measured increase in the eccentricity of the Moon's orbit. Here again, an increase is expected from tidal friction in both the Earth and Moon; however, there is a reported unexplained increase that is significant at the three-sigma level. It is prudent to suspect that all four anomalies have mundane explanations, or that one or more of them are the result of systematic error. Yet they might eventually be explained by new physics; for example, a slightly modified theory of gravitation is not ruled out, perhaps analogous to Einstein's 1916 explanation of the excess precession of Mercury's perihelion.

  3. Apparatus and method for detecting a magnetic anomaly contiguous to remote location by SQUID gradiometer and magnetometer systems

    DOEpatents

    Overton, W.C. Jr.; Steyert, W.A. Jr.

    1981-05-22

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  4. System and method for the detection of anomalies in an image

    DOEpatents

    Prasad, Lakshman; Swaminarayan, Sriram

    2013-09-03

    Preferred aspects of the present invention can include receiving a digital image at a processor; segmenting the digital image into a hierarchy of feature layers comprising one or more fine-scale features defining a foreground object embedded in one or more coarser-scale features defining a background to the one or more fine-scale features in the segmentation hierarchy; detecting a first fine-scale foreground feature as an anomaly with respect to a first background feature within which it is embedded; and constructing an anomalous feature layer by synthesizing spatially contiguous anomalous fine-scale features. Additional preferred aspects of the present invention can include detecting non-pervasive changes between sets of images in response at least in part to one or more difference images between the sets of images.

  5. Network Event Recording Device: An automated system for Network anomaly detection, and notification. Draft

    SciTech Connect

    Simmons, D.G.; Wilkins, R.

    1994-09-01

    The goal of the Network Event Recording Device (NERD) is to provide a flexible autonomous system for network logging and notification when significant network anomalies occur. The NERD is also charged with increasing the efficiency and effectiveness of currently implemented network security procedures. While it has always been possible for network and security managers to review log files for evidence of network irregularities, the NERD provides real-time display of network activity, as well as constant monitoring and notification services for managers. Similarly, real-time display and notification of possible security breaches will provide improved effectiveness in combating resource infiltration from both inside and outside the immediate network environment.

  6. State of the Art in Anomaly Detection and Reaction

    DTIC Science & Technology

    1999-07-01

    This paper presents a view of the state of the art in anomaly detection and reaction (ADR) technology. The paper develops the view from six sources...reaction tools. The broader scope of anomaly detection and reaction also includes vulnerability scanners, infraction scanners, and security compliance...government off-the-shelf (GOTS) ADR systems; and (5) descriptions of current research in anomaly detection and reaction. Tables show intrusion detection tools

  7. Anomaly Detection in Dynamic Networks

    SciTech Connect

    Turcotte, Melissa

    2014-10-14

    Anomaly detection in dynamic communication networks has many important security applications. These networks can be extremely large and so detecting any changes in their structure can be computationally challenging; hence, computationally fast, parallelisable methods for monitoring the network are paramount. For this reason the methods presented here use independent node and edge based models to detect locally anomalous substructures within communication networks. As a first stage, the aim is to detect changes in the data streams arising from node or edge communications. Throughout the thesis simple, conjugate Bayesian models for counting processes are used to model these data streams. A second stage of analysis can then be performed on a much reduced subset of the network comprising nodes and edges which have been identified as potentially anomalous in the first stage. The first method assumes communications in a network arise from an inhomogeneous Poisson process with piecewise constant intensity. Anomaly detection is then treated as a changepoint problem on the intensities. The changepoint model is extended to incorporate seasonal behavior inherent in communication networks. This seasonal behavior is also viewed as a changepoint problem acting on a piecewise constant Poisson process. In a static time frame, inference is made on this extended model via a Gibbs sampling strategy. In a sequential time frame, where the data arrive as a stream, a novel, fast Sequential Monte Carlo (SMC) algorithm is introduced to sample from the sequence of posterior distributions of the change points over time. A second method is considered for monitoring communications in a large scale computer network. The usage patterns in these types of networks are very bursty in nature and don’t fit a Poisson process model. For tractable inference, discrete time models are considered, where the data are aggregated into discrete time periods and probability models are fitted to the
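
    As a bare-bones illustration of the changepoint idea (omitting the conjugate Bayesian treatment, seasonality, Gibbs sampling, and SMC described in the thesis), the sketch below finds a single rate change in a simulated count stream by maximizing a two-segment Poisson log-likelihood.

```python
# Sketch: single changepoint in the rate of an event-count stream, found by
# maximizing the two-segment Poisson log-likelihood. The thesis uses conjugate
# Bayesian models, seasonality, Gibbs sampling, and SMC; this is the bare idea.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(6)
counts = np.r_[rng.poisson(3, 60), rng.poisson(12, 40)]   # rate change at t = 60

def two_segment_loglik(counts, tau):
    left, right = counts[:tau], counts[tau:]
    ll = poisson.logpmf(left, left.mean()).sum()
    ll += poisson.logpmf(right, right.mean()).sum()
    return ll

candidates = range(5, len(counts) - 5)
best_tau = max(candidates, key=lambda tau: two_segment_loglik(counts, tau))
print("estimated changepoint:", best_tau)
```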

  8. Ground Viewing Perspective Hyperspectral Anomaly Detection

    DTIC Science & Technology

    2008-09-01

    features autonomous clutter background characterization (ACBC), adaptive anomaly detection , and constrained subspace target classification, where the first...materials (targets or non-targets). The first stage has two main components, ACBC and anomaly detection . The uniqueness of this first stage is that a random...ignored in the development of autonomous anomaly detection algorithms. Experimental results, using no prior information about the clutter background

  9. Conscious and unconscious detection of semantic anomalies.

    PubMed

    Hannon, Brenda

    2015-01-01

    When asked "What superhero is associated with bats, Robin, the Penguin, Metropolis, Catwoman, the Riddler, the Joker, and Mr. Freeze?" people frequently fail to notice the anomalous word Metropolis. The goals of this study were to determine whether the detection of semantic anomalies, like Metropolis, is conscious or unconscious, and whether this detection is immediate or delayed. To achieve these goals, participants answered anomalous and nonanomalous questions while their reading times for words were recorded. Comparisons between detected and undetected anomalies revealed slower reading times for detected anomalies, a finding that suggests that people immediately and consciously detected the anomalies. Further, comparisons between the first and second words following undetected anomalies and those following nonanomalous controls revealed some slower reading times for the first and second words, a finding that suggests that people may have unconsciously detected the anomalies but that this detection was delayed. Taken together, these findings support the idea that when we are immediately aware of a semantic anomaly (i.e., immediate conscious detection), our language processes make immediate adjustments in order to reconcile the contradictory information of the anomaly with the surrounding text; however, even when we are not consciously aware of semantic anomalies, our language processes still make these adjustments, although the adjustments are delayed (i.e., delayed unconscious detection).

  10. Recent Results on "Approximations to Optimal Alarm Systems for Anomaly Detection"

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2009-01-01

    An optimal alarm system and its approximations may use Kalman filtering for univariate linear dynamic systems driven by Gaussian noise to provide a layer of predictive capability. Predicted Kalman filter future process values and a fixed critical threshold can be used to construct a candidate level-crossing event over a predetermined prediction window. An optimal alarm system can be designed to elicit the fewest false alarms for a fixed detection probability in this particular scenario.
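
    The sketch below illustrates only the predictive element under simplifying assumptions: a scalar linear-Gaussian state is propagated over the prediction window and an alarm is raised when the probability of crossing the critical threshold becomes large; it is not the optimal alarm construction itself, and all parameters are invented.

```python
# Sketch of the predictive element: propagate a scalar linear-Gaussian state
# forward and alarm when the probability of crossing a critical threshold
# within the prediction window exceeds a design value. This illustrates the
# level-crossing idea, not the full optimal alarm construction.
import numpy as np
from scipy.stats import norm

a, q = 0.97, 0.05          # state transition and process-noise variance
threshold = 3.0            # fixed critical level
p_alarm = 0.2              # alarm when the crossing probability exceeds this
horizon = 10               # prediction window length

def crossing_probability(x_hat, p_hat):
    """Largest single-step probability that the predicted state exceeds the
    threshold within the window (a true level-crossing event would be tighter)."""
    probs = []
    for _ in range(horizon):
        x_hat = a * x_hat
        p_hat = a * a * p_hat + q
        probs.append(1.0 - norm.cdf(threshold, loc=x_hat, scale=np.sqrt(p_hat)))
    return max(probs)

print(crossing_probability(x_hat=1.0, p_hat=0.1) > p_alarm)   # quiet state
print(crossing_probability(x_hat=2.9, p_hat=0.1) > p_alarm)   # near the level
```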

  11. Thesaurus anomaly detection by user action monitoring.

    PubMed

    Bitencourt, Jeferson L; Cancian, Píndaro S; Pacheco, Edson J; Nohama, Percy; Schulz, Stefan

    2007-01-01

    The construction and maintenance of a medical thesaurus is a non-trivial task, due to the inherent complexity of a proper medical terminology. We present a methodology for transaction-based anomaly detection in the process of thesaurus maintenance. Our experience is based on lexicographic work with the MorphoSaurus lexicons, which are the basis for a mono- and cross-lingual biomedical information retrieval system. Any "edit" or "delete" action within these lexicons that undoes an action defined earlier was defined as anomalous. We identify four types of such anomalies. We also analyzed to what extent the anomalous lexicon entries had been detected by an alternative, corpus-based approach.
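
    A toy sketch of the transaction-based definition is given below: an "edit" or "delete" that reverts an earlier action on the same lexicon entry is reported; the entries, values, and anomaly rules are invented simplifications of the four anomaly types in the paper.

```python
# Toy sketch of transaction-based anomaly flagging: an edit or delete that
# undoes an earlier action on the same lexicon entry is reported. Entry and
# field names are invented for illustration.
transactions = [
    ("add",    "entry:cardi", "heart"),
    ("edit",   "entry:cardi", "cardiac"),
    ("edit",   "entry:cardi", "heart"),      # reverts the previous edit
    ("add",    "entry:hepat", "liver"),
    ("delete", "entry:hepat", None),         # undoes the earlier add
]

history = {}                                  # entry -> list of (action, value)
for action, entry, value in transactions:
    past = history.setdefault(entry, [])
    if action == "edit" and any(a in ("add", "edit") and v == value for a, v in past):
        print(f"anomaly: edit of {entry} restores an earlier value ({value!r})")
    if action == "delete" and any(a == "add" for a, _ in past):
        print(f"anomaly: delete of {entry} undoes an earlier add")
    past.append((action, value))
```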

  12. Anomaly Detection by Reasoning from Evidence in Mobile Wireless Networks

    DTIC Science & Technology

    2008-08-01

    Anomaly detection is concerned with identification of abnormal patterns of behavior of a system. Traditional supervised machine learning methods of...for anomaly detection . Partitional clustering methods such as K-means require the number K of clusters to be specified by a user. Three heuristics

  13. Spacecraft Environmental Anomalies Expert System

    DTIC Science & Technology

    1994-02-23

    An expert system has been developed by The Aerospace Corporation, Space and Environment Technology Center for use in the diagnosis of satellite...anomalies caused by the space environment. The expert system is designed to determine the probable cause of an anomaly from the following candidates...in the satellite. The expert system is a rule-based system that uses the Texas Instrument’s Personal Consultant Plus expert - system shell. The expert

  14. Anomaly detection and reconstruction from random projections.

    PubMed

    Fowler, James E; Du, Qian

    2012-01-01

    Compressed-sensing methodology typically employs random projections simultaneously with signal acquisition to accomplish dimensionality reduction within a sensor device. The effect of such random projections on the preservation of anomalous data is investigated. The popular RX anomaly detector is derived for the case in which global anomalies are to be identified directly in the random-projection domain, and it is determined, via both random simulation and empirical observation, that strongly anomalous vectors are likely to be identifiable by the projection-domain RX detector even in low-dimensional projections. Finally, a reconstruction procedure for hyperspectral imagery is developed wherein projection-domain anomaly detection is employed to partition the data set, permitting the anomaly and normal pixel classes to be reconstructed separately in order to improve the representation of the anomaly pixels.
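
    The following sketch shows the projection-domain RX idea under simplifying assumptions: pixels are projected with a Gaussian random matrix and scored by Mahalanobis distance from the projected background statistics; the hyperspectral data are simulated.

```python
# Sketch of global RX anomaly detection in a random-projection domain:
# pixels are projected with a Gaussian matrix and scored by Mahalanobis
# distance from the projected background mean and covariance.
import numpy as np

rng = np.random.default_rng(7)
n_pixels, n_bands, n_proj = 2000, 120, 20

pixels = rng.normal(0.0, 1.0, size=(n_pixels, n_bands))
pixels[-5:] += 6.0                         # a few strongly anomalous pixels

P = rng.normal(size=(n_bands, n_proj)) / np.sqrt(n_proj)   # random projection
Y = pixels @ P

mu = Y.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(Y, rowvar=False))
diff = Y - mu
rx_scores = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

print("top-scoring pixels:", np.argsort(rx_scores)[-5:])
```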

  15. Spacecraft environmental anomalies expert system

    NASA Technical Reports Server (NTRS)

    Koons, H. C.; Gorney, D. J.

    1988-01-01

    A microcomputer-based expert system is being developed at the Aerospace Corporation Space Sciences Laboratory to assist in the diagnosis of satellite anomalies caused by the space environment. The expert system is designed to address anomalies caused by surface charging, bulk charging, single event effects and total radiation dose. These effects depend on the orbit of the satellite, the local environment (which is highly variable), the satellite exposure time and the hardness of the circuits and components of the satellite. The expert system is a rule-based system that uses the Texas Instruments Personal Consultant Plus expert system shell. The completed expert system knowledge base will include 150 to 200 rules, as well as a spacecraft attributes database, an historical spacecraft anomalies database, and a space environment database which is updated in near real-time. Currently, the expert system is undergoing development and testing within the Aerospace Corporation Space Sciences Laboratory.

  16. Congenital renal anomalies detected in adulthood

    PubMed Central

    Muttarak, M; Sriburi, T

    2012-01-01

    Objective To document the types of congenital renal anomalies detected in adulthood, the clinical presentation and complications of these renal anomalies, and the most useful imaging modality in detecting a renal anomaly. Materials and methods This study was approved by the institutional review board and informed consent was waived. Between January 2007 and January 2011, the clinical data and imaging studies of 28 patients older than 18 years diagnosed with renal anomaly at the authors’ institution were retrospectively reviewed. Renal anomalies in this study included only those with abnormality in position and in form. Results Of these 28 patients, 22 underwent imaging studies and their results constituted the material of this study. Of the 22 patients, 14 had horseshoe kidneys (HSK), four had crossed renal ectopia and four had malrotation. Sixteen patients were men and six were women. The patients ranged in age from 19 to 74 years (mean age 51.1 years). Clinical presentations were abdominal pain (13), fever (13), haematuria (4), palpable mass (2), asymptomatic (2), polyuria (1) dysuria (1), blurred vision (1), and headache with weakness of left extremities (1). Imaging studies included abdominal radiograph (15), intravenous pyelography (IVP) (8), retrograde pyelography (RP) (4), ultrasonography (US) (7), and computed tomography (CT) (9). Associated complications included urinary tract stones (17), urinary tract infection (16), hydronephrosis (12), and tumours (2). Abdominal radiograph suggested renal anomalies in nine out of 15 studies. IVP, RP, US and CT suggested anomalies in all patients who had these studies performed. However, CT was the best imaging modality to evaluate anatomy, function and complications of patients with renal anomalies. Conclusion HSK was the most common renal anomaly, with abdominal pain and fever being the most common presentations. UTI and stones were the most common complications. IVP, RP, US and CT can be used to diagnose renal

  17. An enhanced stream mining approach for network anomaly detection

    NASA Astrophysics Data System (ADS)

    Bellaachia, Abdelghani; Bhatt, Rajat

    2005-03-01

    Network anomaly detection is one of the hot topics in the market today. Currently, researchers are trying to find ways in which machines could automatically learn both normal and anomalous behavior and thus detect anomalies if and when they occur. The most important applications that could spring from such systems are intrusion detection and spam mail detection. In this paper, the primary focus is on the problem and solution of "real time" network intrusion detection, although the underlying theory discussed may be used for other anomaly detection applications (such as spam detection or spyware detection) as well. Since a machine needs a learning process of its own, data mining has been chosen as the preferred technique. The object of this paper is to present a real-time clustering system, which we call Enhanced Stream Mining (ESM), that can analyze packet information (headers and data) to determine intrusions.
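
    A minimal sketch of a streaming clustering detector in this spirit is given below (it is not the ESM algorithm itself): packet feature vectors are assigned to the nearest centroid, centroids are updated online, and a record far from every centroid is flagged and seeds a new cluster; the feature fields and thresholds are invented.

```python
# Sketch of a streaming clustering detector for packet features: a record far
# from every existing centroid is flagged as a potential intrusion and starts
# a new cluster. Feature fields are invented for illustration.
import numpy as np

class StreamClusterer:
    def __init__(self, radius, learning_rate=0.05):
        self.radius = radius
        self.lr = learning_rate
        self.centroids = []

    def observe(self, x):
        x = np.asarray(x, dtype=float)
        if not self.centroids:
            self.centroids.append(x.copy())
            return False
        dists = [np.linalg.norm(x - c) for c in self.centroids]
        i = int(np.argmin(dists))
        if dists[i] <= self.radius:
            self.centroids[i] += self.lr * (x - self.centroids[i])   # online update
            return False
        self.centroids.append(x.copy())                              # far from all
        return True                                                   # flag as anomalous

# Feature vector per packet: [payload length, header length, inter-arrival time]
detector = StreamClusterer(radius=50.0)
rng = np.random.default_rng(8)
for _ in range(500):
    detector.observe([rng.normal(400, 10), rng.normal(40, 2), rng.normal(5, 1)])
print("anomalous packet?", detector.observe([1500, 40, 0.01]))
```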

  18. Investigation of a Neural Network Implementation of a TCP Packet Anomaly Detection System

    DTIC Science & Technology

    2004-05-01

    as the range 1024–65535. Some types of port scans may be detected through this attribute, as well as trojans and distributed denial of service (DDoS)... Summary of DARPA 1999 week 5 detects (Date / Attack / Classifier / Details): 04/05/99 Portsweep – Flags: lone FIN packets; Sequence #: SEQ=ACK=0. 04/05/99 Neptune DoS – IP: ... port (final stage of attack). 04/06/99 Neptune DoS – IP: private IP address 10.20.30.40; Flags: SYN packets with low source ports; Ports: low ports to low

  19. Anomaly Detection for Resilient Control Systems Using Fuzzy-Neural Data Fusion Engine

    SciTech Connect

    Ondrej Linda; Milos Manic; Timothy R. McJunkin

    2011-08-01

    Resilient control systems in critical infrastructures require increased cyber-security and state-awareness. One of the necessary conditions for achieving the desired high level of resiliency is timely reporting and understanding of the status and behavioral trends of the control system. This paper describes the design and development of a neural-network based data-fusion system for increased state-awareness of resilient control systems. The proposed system consists of a dedicated data-fusion engine for each component of the control system. Each data-fusion engine implements a three-layered alarm system consisting of: (1) conventional threshold-based alarms, (2) an anomalous behavior detector using self-organizing maps, and (3) prediction-error based alarms using neural network based signal forecasting. The proposed system was integrated with a model of the Idaho National Laboratory Hytest facility, which is a testing facility for hybrid energy systems. Experimental results demonstrate that the implemented data fusion system provides timely plant performance monitoring and cyber-state reporting.

  20. System for closure of a physical anomaly

    SciTech Connect

    Bearinger, Jane P; Maitland, Duncan J; Schumann, Daniel L; Wilson, Thomas S

    2014-11-11

    Systems for closure of a physical anomaly. Closure is accomplished by a closure body with an exterior surface. The exterior surface contacts the opening of the anomaly and closes the anomaly. The closure body has a primary shape for closing the anomaly and a secondary shape for being positioned in the physical anomaly. The closure body preferably comprises a shape memory polymer.

  1. Artificial immune system via Euclidean Distance Minimization for anomaly detection in bearings

    NASA Astrophysics Data System (ADS)

    Montechiesi, L.; Cocconcelli, M.; Rubini, R.

    2016-08-01

    In recent years new diagnostic methodologies have emerged, with particular interest in machinery operating in non-stationary conditions. Continuous speed changes and variable loads make spectrum analysis non-trivial: a variable speed means a variable characteristic fault frequency related to the damage, which is no longer recognizable in the spectrum. To overcome this problem the scientific community has proposed different approaches, which fall into two main categories: model-based approaches and expert systems. In this context the paper presents a simple expert system derived from the mechanisms of the immune system, called Euclidean Distance Minimization, and its application to a real case of bearing fault recognition. The proposed method is a simplification of the original process, adapted from the class of Artificial Immune Systems, which proved to be useful and promising in different application fields. Comparative results are provided, with a complete explanation of the algorithm and its functioning.
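
    The sketch below shows the distance-minimization idea as a nearest-prototype classifier over simple vibration features, which is only an approximation of the algorithm described in the paper; the bearing signals and features are simulated.

```python
# Sketch of classification by Euclidean distance minimization: feature vectors
# ("antibodies") for known bearing conditions are stored, and a new vibration
# feature vector is assigned to the closest class. Signals are simulated.
import numpy as np

rng = np.random.default_rng(9)

def features(signal):
    # Simple time-domain features: RMS, kurtosis, crest factor.
    rms = np.sqrt(np.mean(signal ** 2))
    kurt = np.mean((signal - signal.mean()) ** 4) / signal.var() ** 2
    crest = np.max(np.abs(signal)) / rms
    return np.array([rms, kurt, crest])

healthy = rng.normal(0, 1.0, size=(20, 4096))
faulty = rng.normal(0, 1.0, size=(20, 4096))
faulty[:, ::200] += 8.0                     # periodic impacts from a damaged race

prototypes = {
    "healthy": np.mean([features(s) for s in healthy], axis=0),
    "faulty": np.mean([features(s) for s in faulty], axis=0),
}

test = rng.normal(0, 1.0, size=4096)
test[::200] += 8.0
label = min(prototypes, key=lambda k: np.linalg.norm(features(test) - prototypes[k]))
print("classified as:", label)
```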

  2. Contextual Detection of Anomalies within Hyperspectral Images

    DTIC Science & Technology

    2011-03-01

    Keywords: Hyperspectral Imagery (HSI), unsupervised target detection, target identification, contextual anomaly detection. ... processing. Hyperspectral imaging has a wide range of applications within remote sensing, not limited to terrain classification, environmental monitoring... Johnson, R. J. (2008). Improved feature extraction, feature selection, and identification techniques that create a fast unsupervised hyperspectral

  3. Attention focussing and anomaly detection in real-time systems monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, Richard J.; Chien, Steve A.; Fayyad, Usama M.; Porta, Harry J.

    1993-01-01

    In real-time monitoring situations, more information is not necessarily better. When faced with complex emergency situations, operators can experience information overload and a compromising of their ability to react quickly and correctly. We describe an approach to focusing operator attention in real-time systems monitoring based on a set of empirical and model-based measures for determining the relative importance of sensor data.

  4. Predictability in space launch vehicle anomaly detection using intelligent neuro-fuzzy systems

    NASA Technical Reports Server (NTRS)

    Gulati, Sandeep; Toomarian, Nikzad; Barhen, Jacob; Maccalla, Ayanna; Tawel, Raoul; Thakoor, Anil; Daud, Taher

    1994-01-01

    Included in this viewgraph presentation on intelligent neuroprocessors for launch vehicle health management systems (HMS) are the following: where the flight failures have been in launch vehicles; cumulative delay time; breakdown of operations hours; failure of Mars Probe; vehicle health management (VHM) cost optimizing curve; target HMS-STS auxiliary power unit location; APU monitoring and diagnosis; and integration of neural networks and fuzzy logic.

  5. Development of a Computer Architecture to Support the Optical Plume Anomaly Detection (OPAD) System

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1996-01-01

    The NASA OPAD spectrometer system relies heavily on extensive software which repetitively extracts spectral information from the engine plume and reports the amounts of metals which are present in the plume. The development of this software is at a sufficiently advanced stage where it can be used in actual engine tests to provide valuable data on engine operation and health. This activity will continue and, in addition, the OPAD system is planned to be used in flight aboard space vehicles. The two implementations, test-stand and in-flight, may have some differing requirements. For example, the data stored during a test-stand experiment are much more extensive than in the in-flight case. In both cases though, the majority of the requirements are similar. New data from the spectrograph is generated at a rate of once every 0.5 sec or faster. All processing must be completed within this period of time to maintain real-time performance. Every 0.5 sec, the OPAD system must report the amounts of specific metals within the engine plume, given the spectral data. At present, the software in the OPAD system performs this function by solving the inverse problem. It uses powerful physics-based computational models (the SPECTRA code), which receive amounts of metals as inputs to produce the spectral data that would have been observed, had the same metal amounts been present in the engine plume. During the experiment, for every spectrum that is observed, an initial approximation is performed using neural networks to establish an initial metal composition which approximates as accurately as possible the real one. Then, using optimization techniques, the SPECTRA code is repetitively used to produce a fit to the data, by adjusting the metal input amounts until the produced spectrum matches the observed one to within a given level of tolerance. This iterative solution to the original problem of determining the metal composition in the plume requires a relatively long period of time
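
    The iterative inverse fit can be sketched as follows, with a toy Gaussian-line forward model standing in for the SPECTRA code and invented line positions and metal amounts; scipy's least-squares routine plays the role of the optimization step that adjusts the metal inputs until the modelled spectrum matches the observed one.

```python
# Sketch of the iterative inverse fit: a toy forward model stands in for the
# SPECTRA code, and a least-squares loop adjusts the metal amounts until the
# modelled spectrum matches the observed one. Line positions/widths are invented.
import numpy as np
from scipy.optimize import least_squares

wavelengths = np.linspace(300, 800, 400)
line_centers = {"Fe": 372.0, "Ni": 341.5, "Cr": 425.4}   # illustrative only

def forward_model(amounts):
    """Toy stand-in for SPECTRA: each metal adds one Gaussian emission line."""
    spectrum = np.zeros_like(wavelengths)
    for amount, center in zip(amounts, line_centers.values()):
        spectrum += amount * np.exp(-0.5 * ((wavelengths - center) / 2.0) ** 2)
    return spectrum

true_amounts = np.array([1.2, 0.3, 0.05])
observed = forward_model(true_amounts) + 0.01 * np.random.default_rng(10).normal(size=400)

initial_guess = np.array([1.0, 0.1, 0.1])     # e.g. from the neural-network stage
fit = least_squares(lambda a: forward_model(a) - observed, initial_guess, bounds=(0, np.inf))
print(dict(zip(line_centers, fit.x.round(3))))
```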

  6. Use of color lights for the detection of anomalies in quality systems.

    PubMed

    Báez, G; De la Vega, E; Castro, C; Elizarraras, R

    2012-01-01

    Eye care is a first-order concern in industry, because most assembly and manufacturing companies, whatever their products, must attend directly to the health of their employees, especially eye health. The lighting system, the lamp characteristics, and the job tasks are factors that affect the visual performance of the worker. Each of these factors, either alone or in combination, influences the visual performance of the employee, and therefore their safety and efficiency. Some of the reported symptoms are problems with visual fixation, eye redness, tearing, headache, blurred vision, eyelid heaviness, and dry eyes [7]. The study involved 48 people, 27 male and 21 female, aged 17 to 58 years. The experiment used an illumination system based on light-emitting diodes (LEDs) of five different colors (white, blue, green, red, and yellow); LEDs were chosen because they are sources of monochromatic light with low power consumption and low heat dissipation.

  7. Deep learning on temporal-spectral data for anomaly detection

    NASA Astrophysics Data System (ADS)

    Ma, King; Leung, Henry; Jalilian, Ehsan; Huang, Daniel

    2017-05-01

    Detecting anomalies is important for continuous monitoring of sensor systems. One significant challenge is to use sensor data and autonomously detect changes that cause different conditions to occur. Using deep learning methods, we are able to monitor and detect changes as a result of some disturbance in the system. We utilize deep neural networks for sequence analysis of time series. We use a multi-step method for anomaly detection. We train the network to learn spectral and temporal features from the acoustic time series. We test our method using fiber-optic acoustic data from a pipeline.
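
    A simplified sketch of the temporal-spectral pipeline is given below: a spectrogram provides the spectral features per time frame, and a PCA reconstruction error stands in for the deep network's anomaly score; the acoustic signal and the injected disturbance are synthetic.

```python
# Sketch of the temporal-spectral pipeline: compute a spectrogram of the
# acoustic series, learn a low-dimensional model of normal frames, and flag
# frames with high reconstruction error. PCA stands in for the deep network.
import numpy as np
from scipy.signal import spectrogram
from sklearn.decomposition import PCA

rng = np.random.default_rng(11)
fs = 1000.0
t = np.arange(0, 60, 1 / fs)
acoustic = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.normal(size=t.size)
acoustic[40_000:41_000] += np.sin(2 * np.pi * 180 * t[:1000])   # disturbance at ~40 s

f, times, Sxx = spectrogram(acoustic, fs=fs, nperseg=256)
frames = np.log1p(Sxx.T)                        # one spectral vector per time frame

train = frames[: len(frames) // 2]              # assume the first half is normal
pca = PCA(n_components=5).fit(train)
recon = pca.inverse_transform(pca.transform(frames))
errors = np.mean((frames - recon) ** 2, axis=1)

threshold = errors[: len(frames) // 2].mean() + 5 * errors[: len(frames) // 2].std()
print("anomalous frame times (s):", times[errors > threshold])
```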

  8. Spectral anomaly detection in deep shadows.

    PubMed

    Kanaev, Andrey V; Murray-Krezan, Jeremy

    2010-03-20

    Although several hyperspectral anomaly detection algorithms have proven useful when illumination conditions provide for enough light, many of these same detection algorithms fail to perform well when shadows are also present. To date, no general approach to the problem has been demonstrated. In this paper, a novel hyperspectral anomaly detection algorithm that adapts the dimensionality of the spectral detection subspace to multiple illumination levels is described. The novel detection algorithm is applied to reflectance domain hyperspectral data that represents a variety of illumination conditions: well illuminated and poorly illuminated (i.e., shadowed). Detection results obtained for objects located in deep shadows and light-shadow transition areas suggest superiority of the novel algorithm over standard subspace RX detection.

  9. Sensitivity of fetal anomaly detection as a function of time.

    PubMed

    Dervaux, B; Leleu, H; Lebrun, T; Levi, S; Grandjean, H

    1998-06-18

    In this paper, we show that the ratio of the number of fetal anomalies detected by ultrasound (US) to the total number of cases is not a consistent estimator of US sensitivity. As Eddy pointed out, when a disease evolves over time, the sensitivity of a test also varies over time according to the development of the disease. To correctly assess the detection capability of a test, it is therefore necessary to estimate a time-continuous function (the sensitivity function) instead of a single parameter. From a methodological point of view, by considering the "detectability" time of a fetal anomaly as a random variable and parametrizing its distribution function, we estimate the probability that an anomaly is detected conditional upon the precise timing of the ultrasound examinations actually performed during pregnancy. We fit this model with Eurofetus data (about 7,300 abnormal fetuses), and we compare estimates for different kinds of anomalies (classification based on the system involved and/or the severity of the handicap). To allow for the heterogeneity of anomalies regarding the detectability time, we generally adopt mixture models. For instance, we select a bi-gamma distribution for major malformations and estimate that 63% of such anomalies are detectable quite early in pregnancy (conditional mean: 15.2 weeks of amenorrhea (WA) +/- 4.2 WA), the others becoming detectable later (30.3 WA +/- 6.4 WA). Such results are then integrated into a cost-effectiveness analysis.
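
    The sketch below illustrates the sensitivity-function idea under strong simplifications: given a fitted two-component gamma mixture for the detectability time (parameters chosen only to roughly match the means and SDs quoted above), the probability of detection for a scan schedule is approximated by the probability that the anomaly becomes detectable before the last scan, assuming detection once detectable.

```python
# Sketch of the time-dependent sensitivity idea: given a distribution for the
# gestational age at which an anomaly becomes detectable, the chance of
# detection by a schedule of scans is approximated by the probability that
# detectability precedes the last scan (assuming detection once detectable).
# The mixture parameters below are invented, not the Eurofetus estimates.
import numpy as np
from scipy.stats import gamma

# Mixture: 63% "early detectable" component, 37% "late detectable" component.
w_early = 0.63
early = gamma(a=13.0, scale=1.17)       # mean ~15.2, SD ~4.2 weeks of amenorrhea
late = gamma(a=22.0, scale=1.38)        # mean ~30.3, SD ~6.4 weeks of amenorrhea

def detection_probability(scan_weeks):
    t_last = max(scan_weeks)
    return w_early * early.cdf(t_last) + (1 - w_early) * late.cdf(t_last)

print(round(detection_probability([12, 22]), 3))        # two-scan policy
print(round(detection_probability([12, 22, 32]), 3))    # adding a third-trimester scan
```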

  10. Anomaly Detection Using Behavioral Approaches

    NASA Astrophysics Data System (ADS)

    Benferhat, Salem; Tabia, Karim

    Behavioral approaches, which represent normal/abnormal activities, have been widely used in recent years in intrusion detection and computer security. Nevertheless, most works have shown that they are ineffective for detecting novel attacks involving new behaviors. In this paper, we first study this recurring problem, due on the one hand to inadequate handling of anomalous and unusual audit events and on the other hand to insufficient decision rules that do not meet the objectives of behavioral approaches. We then propose to enhance the standard decision rules in order to fit the requirements of behavioral approaches and better detect novel attacks. Experimental studies carried out on real and simulated http traffic show that these enhanced decision rules improve the detection of most novel attacks without triggering higher false alarm rates.

  11. Anomaly Detection for Discrete Sequences: A Survey

    SciTech Connect

    Chandola, Varun; Banerjee, Arindam; Kumar, Vipin

    2012-01-01

    This survey attempts to provide a comprehensive and structured overview of the existing research for the problem of detecting anomalies in discrete/symbolic sequences. The objective is to provide a global understanding of the sequence anomaly detection problem and how existing techniques relate to each other. The key contribution of this survey is the classification of the existing research into three distinct categories, based on the problem formulation that they are trying to solve. These problem formulations are: 1) identifying anomalous sequences with respect to a database of normal sequences; 2) identifying an anomalous subsequence within a long sequence; and 3) identifying a pattern in a sequence whose frequency of occurrence is anomalous. We show how each of these problem formulations is characteristically distinct from each other and discuss their relevance in various application domains. We review techniques from many disparate and disconnected application domains that address each of these formulations. Within each problem formulation, we group techniques into categories based on the nature of the underlying algorithm. For each category, we provide a basic anomaly detection technique, and show how the existing techniques are variants of the basic technique. This approach shows how different techniques within a category are related or different from each other. Our categorization reveals new variants and combinations that have not been investigated before for anomaly detection. We also provide a discussion of relative strengths and weaknesses of different techniques. We show how techniques developed for one problem formulation can be adapted to solve a different formulation, thereby providing several novel adaptations to solve the different problem formulations. We also highlight the applicability of the techniques that handle discrete sequences to other related areas such as online anomaly detection and time series anomaly detection.

  12. Hyperspectral Anomaly Detection in Urban Scenarios

    NASA Astrophysics Data System (ADS)

    Rejas Ayuga, J. G.; Martínez Marín, R.; Marchamalo Sacristán, M.; Bonatti, J.; Ojeda, J. C.

    2016-06-01

    We have studied the spectral features of reflectance and emissivity for the pattern recognition of urban materials in several single hyperspectral scenes, through a comparative analysis of anomaly detection methods and their relationship with city surfaces, with the aim of improving information extraction processes. Spectral ranges of the visible-near infrared (VNIR), shortwave infrared (SWIR), and thermal infrared (TIR) from hyperspectral data cubes of the AHS sensor and of HyMAP and MASTER for two cities, Alcalá de Henares (Spain) and San José (Costa Rica) respectively, have been used. In this research no prior knowledge of the targets is assumed; thus, the pixels are automatically separated according to their spectral information, significantly differentiated with respect to a background, either globally for the full scene or locally by image segmentation. Several experiments on urban and semi-urban scenarios have been designed, analyzing the behaviour of the standard RX anomaly detector and of different methods based on subspace, image projection, and segmentation-based anomaly detection. A new technique for anomaly detection in hyperspectral data, called DATB (Detector of Anomalies from Thermal Background), based on dimensionality reduction by projecting targets with unknown spectral signatures onto a background calculated from thermal spectrum wavelengths, is presented. First results and their consequences for non-supervised classification and information extraction processes are discussed.

  13. Elimination of character-resembling anomalies within a detected region using density-dependent reference point construction in an automated license plate recognition system

    NASA Astrophysics Data System (ADS)

    Chai, Hum Yan; Meng, Liang Kim; Mohamed, Hamam; Woon, Hon Hock; Lai, Khin Wee

    2016-11-01

    The problem of eliminating character-resembling blobs in a detected region at the plate detection stage of an automated license plate recognition system is addressed. The proposed method amplifies the slight differences between the non-character blobs (anomalies) and the character blobs (true signal) to enhance tractability. The method rests on two propositions: (1) the anomalies are usually located around the true signal, and (2) suspected anomalous blobs should be given less emphasis in computing a reference point. The first proposition is based on prior knowledge and observation; the second is based on the fact that a reference point that takes anomalies into account is contaminated and thus misleading. The gist of the method is a methodology that weights the blobs differently according to their location when computing the reference point, so that the reference point approximates the representative value of the true-signal properties more accurately, thus giving the effect of amplifying the slight differences. The performance of the method is evaluated on both its capability and its consistency in solving certain types of anomalies.

  14. Anomaly detection using classified eigenblocks in GPR image

    NASA Astrophysics Data System (ADS)

    Kim, Min Ju; Kim, Seong Dae; Lee, Seung-eui

    2016-05-01

    Automatic landmine detection systems using ground penetrating radar (GPR) have been widely researched. For an automatic mine detection system, speed is an important factor. Many techniques for mine detection have been developed on a statistical basis. Among them, a detection technique employing Principal Component Analysis (PCA) has been used for clutter reduction and anomaly detection. However, the PCA technique can retard the entire process because of the large basis dimension and the large number of inner product operations. In order to overcome this problem, we propose a fast anomaly detection system using the 2D DCT and PCA. Our experiments use a set of data obtained from a test site where anti-tank and anti-personnel mines are buried. We evaluate the proposed system in terms of the ROC curve. The results show that the proposed system performs much better than conventional PCA systems from the viewpoint of speed and false alarm rate.
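
    A rough sketch of the 2D DCT plus PCA idea follows, though the paper's exact pipeline may differ: blocks of a simulated GPR B-scan are summarized by their low-frequency 2D DCT coefficients and scored with an RX-style statistic in a whitened PCA subspace.

```python
# Sketch of the 2D DCT + PCA idea on a GPR B-scan: blocks are summarized by
# their low-frequency DCT coefficients and scored by an RX-style statistic in
# a whitened PCA subspace. The exact pipeline in the paper may differ.
import numpy as np
from scipy.fft import dctn
from sklearn.decomposition import PCA

rng = np.random.default_rng(12)
bscan = rng.normal(0, 1.0, size=(256, 256))          # background clutter
bscan[128:144, 176:192] += 4.0                        # buried-object signature

block = 16
features, positions = [], []
for r in range(0, 256, block):
    for c in range(0, 256, block):
        patch = bscan[r:r + block, c:c + block]
        coeffs = dctn(patch, norm="ortho")[:4, :4]    # low-frequency 2D DCT terms
        features.append(coeffs.ravel())
        positions.append((r, c))

X = np.array(features)
pca = PCA(n_components=5, whiten=True).fit(X)
scores = np.sum(pca.transform(X) ** 2, axis=1)        # RX-style anomaly score
print("most anomalous block at:", positions[int(np.argmax(scores))])
```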

  15. Conditional Anomaly Detection with Soft Harmonic Functions.

    PubMed

    Valko, Michal; Kveton, Branislav; Valizadegan, Hamed; Cooper, Gregory F; Hauskrecht, Milos

    2011-01-01

    In this paper, we consider the problem of conditional anomaly detection that aims to identify data instances with an unusual response or a class label. We develop a new non-parametric approach for conditional anomaly detection based on the soft harmonic solution, with which we estimate the confidence of the label to detect anomalous mislabeling. We further regularize the solution to avoid the detection of isolated examples and examples on the boundary of the distribution support. We demonstrate the efficacy of the proposed method on several synthetic and UCI ML datasets in detecting unusual labels when compared to several baseline approaches. We also evaluate the performance of our method on a real-world electronic health record dataset where we seek to identify unusual patient-management decisions.

  16. Conditional Anomaly Detection with Soft Harmonic Functions

    PubMed Central

    Valko, Michal; Kveton, Branislav; Valizadegan, Hamed; Cooper, Gregory F.; Hauskrecht, Milos

    2012-01-01

    In this paper, we consider the problem of conditional anomaly detection that aims to identify data instances with an unusual response or a class label. We develop a new non-parametric approach for conditional anomaly detection based on the soft harmonic solution, with which we estimate the confidence of the label to detect anomalous mislabeling. We further regularize the solution to avoid the detection of isolated examples and examples on the boundary of the distribution support. We demonstrate the efficacy of the proposed method on several synthetic and UCI ML datasets in detecting unusual labels when compared to several baseline approaches. We also evaluate the performance of our method on a real-world electronic health record dataset where we seek to identify unusual patient-management decisions. PMID:25309142

  17. Anomaly detection and localization in crowded scenes.

    PubMed

    Li, Weixin; Mahadevan, Vijay; Vasconcelos, Nuno

    2014-01-01

    The detection and localization of anomalous behaviors in crowded scenes is considered, and a joint detector of temporal and spatial anomalies is proposed. The proposed detector is based on a video representation that accounts for both appearance and dynamics, using a set of mixture-of-dynamic-textures models. These models are used to implement 1) a center-surround discriminant saliency detector that produces spatial saliency scores, and 2) a model of normal behavior that is learned from training data and produces temporal saliency scores. Spatial and temporal anomaly maps are then defined at multiple spatial scales, by considering the scores of these operators at progressively larger regions of support. The multiscale scores act as potentials of a conditional random field that guarantees global consistency of the anomaly judgments. A data set of densely crowded pedestrian walkways is introduced and used to evaluate the proposed anomaly detector. Experiments on this and other data sets show that the proposed detector achieves state-of-the-art anomaly detection results.

  18. Anomaly Detection Techniques for Ad Hoc Networks

    ERIC Educational Resources Information Center

    Cai, Chaoli

    2009-01-01

    Anomaly detection is an important and indispensable aspect of any computer security mechanism. Ad hoc and mobile networks consist of a number of peer mobile nodes that are capable of communicating with each other absent a fixed infrastructure. Arbitrary node movements and lack of centralized control make them vulnerable to a wide variety of…

  19. Anomaly Detection Techniques for Ad Hoc Networks

    ERIC Educational Resources Information Center

    Cai, Chaoli

    2009-01-01

    Anomaly detection is an important and indispensable aspect of any computer security mechanism. Ad hoc and mobile networks consist of a number of peer mobile nodes that are capable of communicating with each other absent a fixed infrastructure. Arbitrary node movements and lack of centralized control make them vulnerable to a wide variety of…

  20. Hyperspectral anomaly detection using enhanced global factors

    NASA Astrophysics Data System (ADS)

    Paciencia, Todd J.; Bauer, Kenneth W.

    2016-05-01

    Dimension reduction techniques have become a popular unsupervised approach for detecting anomalies in hyperspectral imagery. Although they demonstrate promising results on specific images in the literature, these methods can be difficult to interpret directly and often require tuning of their parameters to achieve high performance on a specific set of images. This lack of generality is compounded by the need to remove noise and atmospheric-absorption spectral bands from the image prior to detection. Without a process for this band selection, and without making the methods adaptable to different image compositions, performance becomes difficult to maintain across a wider variety of images. Here, we present a framework that uses factor analysis to provide robust band selection and a more meaningful dimension reduction with which to detect anomalies in the imagery. Measurable characteristics of the image are used to create an automated decision process that allows the algorithm to adjust to a particular image while maintaining high detection performance. The framework and its algorithms are detailed, and results are shown for forest, desert, sea, rural, urban, anomaly-sparse, and anomaly-dense imagery types from different sensors. Additionally, the method is compared to current state-of-the-art methods and is shown to be computationally efficient.

  1. Trust based Fusion over Noisy Channels through Anomaly Detection in Cognitive Radio Networks

    DTIC Science & Technology

    2011-11-01

    Title-page and report-documentation fragments only: Trust based Fusion over Noisy Channels through Anomaly Detection in Cognitive Radio Networks, Shameek Bhattacharjee, Department of EECS, University of... Keywords: cognitive radio networks, attacks, anomaly detection, trust. General terms: algorithms, performance, security, theory.

  2. OPAD data analysis. [Optical Plumes Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.; Kraft, Richard; Whitaker, Kevin; Cooper, Anita E.; Powers, W. T.; Wallace, Tim L.

    1993-01-01

    Data obtained in the framework of an Optical Plume Anomaly Detection (OPAD) program intended to create a rocket engine health monitor based on spectrometric detections of anomalous atomic and molecular species in the exhaust plume are analyzed. The major results include techniques for handling data noise, methods for registration of spectra to wavelength, and a simple automatic process for estimating the metallic component of a spectrum.

  3. Anomaly and error detection in computerized materials control & accountability databases

    SciTech Connect

    Whiteson, R.; Hoffbauer, B.; Yarbro, T.F.

    1997-09-01

    United States Department of Energy sites use computerized material control and accountability (MC&A) systems to manage the large amounts of data necessary to control and account for their nuclear materials. Theft or diversion of materials from these sites would likely result in anomalies in the data, and erroneous information greatly reduces the value of the information to its users. Therefore, it is essential that MC&A data be periodically assessed for anomalies or errors. At Los Alamos National Laboratory, we have been developing expert systems to provide efficient, cost-effective, automated error and anomaly detection. Automated anomaly detection can provide assurance of the integrity of data, reduce inventory frequency, enhance assurance of physical inventory, detect errors in databases, and gain a better perspective on overall facility operations. The Automated MC&A Database Assessment Project is aimed at improving anomaly and error detection in MC&A databases and increasing confidence in the data. We are working with data from the Los Alamos Plutonium Facility and the Material Accountability and Safeguards System, the Facility's near-real-time computerized nuclear material accountability and safeguards system. This paper describes progress in customizing the expert systems to the needs of the users of the data and reports on our results.

  4. The role of noninvasive and invasive diagnostic imaging techniques for detection of extra-cranial venous system anomalies and developmental variants.

    PubMed

    Dolic, Kresimir; Siddiqui, Adnan H; Karmon, Yuval; Marr, Karen; Zivadinov, Robert

    2013-06-27

    The extra-cranial venous system is complex and not well studied in comparison to the peripheral venous system. A newly proposed vascular condition, named chronic cerebrospinal venous insufficiency (CCSVI), described initially in patients with multiple sclerosis (MS) has triggered intense interest in better understanding of the role of extra-cranial venous anomalies and developmental variants. So far, there is no established diagnostic imaging modality, non-invasive or invasive, that can serve as the "gold standard" for detection of these venous anomalies. However, consensus guidelines and standardized imaging protocols are emerging. Most likely, a multimodal imaging approach will ultimately be the most comprehensive means for screening, diagnostic and monitoring purposes. Further research is needed to determine the spectrum of extra-cranial venous pathology and to compare the imaging findings with pathological examinations. The ability to define and reliably detect noninvasively these anomalies is an essential step toward establishing their incidence and prevalence. The role for these anomalies in causing significant hemodynamic consequences for the intra-cranial venous drainage in MS patients and other neurologic disorders, and in aging, remains unproven.

  5. The role of noninvasive and invasive diagnostic imaging techniques for detection of extra-cranial venous system anomalies and developmental variants

    PubMed Central

    2013-01-01

    The extra-cranial venous system is complex and not well studied in comparison to the peripheral venous system. A newly proposed vascular condition, named chronic cerebrospinal venous insufficiency (CCSVI), described initially in patients with multiple sclerosis (MS) has triggered intense interest in better understanding of the role of extra-cranial venous anomalies and developmental variants. So far, there is no established diagnostic imaging modality, non-invasive or invasive, that can serve as the “gold standard” for detection of these venous anomalies. However, consensus guidelines and standardized imaging protocols are emerging. Most likely, a multimodal imaging approach will ultimately be the most comprehensive means for screening, diagnostic and monitoring purposes. Further research is needed to determine the spectrum of extra-cranial venous pathology and to compare the imaging findings with pathological examinations. The ability to define and reliably detect noninvasively these anomalies is an essential step toward establishing their incidence and prevalence. The role for these anomalies in causing significant hemodynamic consequences for the intra-cranial venous drainage in MS patients and other neurologic disorders, and in aging, remains unproven. PMID:23806142

  6. Multiple-Instance Learning for Anomaly Detection in Digital Mammography.

    PubMed

    Quellec, Gwenole; Lamard, Mathieu; Cozic, Michel; Coatrieux, Gouenou; Cazuguel, Guy

    2016-07-01

    This paper describes a computer-aided detection and diagnosis system for breast cancer, the most common form of cancer among women, using mammography. The system relies on the Multiple-Instance Learning (MIL) paradigm, which has proven useful for medical decision support in previous works from our team. In the proposed framework, breasts are first partitioned adaptively into regions. Then, features derived from the detection of lesions (masses and microcalcifications) as well as textural features, are extracted from each region and combined in order to classify mammography examinations as "normal" or "abnormal". Whenever an abnormal examination record is detected, the regions that induced that automated diagnosis can be highlighted. Two strategies are evaluated to define this anomaly detector. In a first scenario, manual segmentations of lesions are used to train an SVM that assigns an anomaly index to each region; local anomaly indices are then combined into a global anomaly index. In a second scenario, the local and global anomaly detectors are trained simultaneously, without manual segmentations, using various MIL algorithms (DD, APR, mi-SVM, MI-SVM and MILBoost). Experiments on the DDSM dataset show that the second approach, which is only weakly-supervised, surprisingly outperforms the first approach, even though it is strongly-supervised. This suggests that anomaly detectors can be advantageously trained on large medical image archives, without the need for manual segmentation.

  7. Gravity anomaly detection: Apollo/Soyuz

    NASA Technical Reports Server (NTRS)

    Vonbun, F. O.; Kahn, W. D.; Bryan, J. W.; Schmid, P. E.; Wells, W. T.; Conrad, D. T.

    1976-01-01

    The Goddard Apollo-Soyuz Geodynamics Experiment is described. It was performed to demonstrate the feasibility of tracking and recovering high-frequency components of the earth's gravity field by utilizing a synchronous orbiting tracking station such as ATS-6. Gravity anomalies of 5 mgal or larger having wavelengths of 300 to 1000 kilometers on the earth's surface are important for geologic studies of the upper layers of the earth's crust. Short-wavelength gravity anomalies were detected from space. Two prime areas of data collection were selected for the experiment: (1) the center of the African continent and (2) the Indian Ocean Depression centered at 5 deg north latitude and 75 deg east longitude. Preliminary results show that the detectability objective of the experiment was met in both areas as well as at several additional anomalous areas around the globe. Gravity anomalies of the Karakoram and Himalayan mountain ranges, ocean trenches, as well as the Diamantina Depth, can be seen. Maps outlining the anomalies discovered are shown.

  8. Hyperspectral Anomaly Detection by Graph Pixel Selection.

    PubMed

    Yuan, Yuan; Ma, Dandan; Wang, Qi

    2016-12-01

    Hyperspectral anomaly detection (AD) is an important problem in remote sensing field. It can make full use of the spectral differences to discover certain potential interesting regions without any target priors. Traditional Mahalanobis-distance-based anomaly detectors assume the background spectrum distribution conforms to a Gaussian distribution. However, this and other similar distributions may not be satisfied for the real hyperspectral images. Moreover, the background statistics are susceptible to contamination of anomaly targets which will lead to a high false-positive rate. To address these intrinsic problems, this paper proposes a novel AD method based on the graph theory. We first construct a vertex- and edge-weighted graph and then utilize a pixel selection process to locate the anomaly targets. Two contributions are claimed in this paper: 1) no background distributions are required which makes the method more adaptive and 2) both the vertex and edge weights are considered which enables a more accurate detection performance and better robustness to noise. Intensive experiments on the simulated and real hyperspectral images demonstrate that the proposed method outperforms other benchmark competitors. In addition, the robustness of the proposed method has been validated by using various window sizes. This experimental result also demonstrates the valuable characteristic of less computational complexity and less parameter tuning for real applications.

  9. Applications of TOPS Anomaly Detection Framework to Amazon Drought Analysis

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Ganguly, S.; Michaelis, A.; Hashimoto, H.

    2011-12-01

    The Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending TOPS to include capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. While there are large numbers of anomaly detection algorithms for multivariate datasets, we are extending this capability beyond anomaly detection itself and towards an automated analysis that would discover the possible causes of the anomalies. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. The knowledge is captured using the OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. We have integrated this knowledge base with a framework for deploying an ensemble of anomaly detection algorithms on large volumes of Earth science datasets and applied it to specific scientific applications that support research conducted by our group. In one early application, we were able to process a large number of MODIS, TRMM, and CERES datasets, along with ground-based weather and river flow observations, to detect the evolution of the 2010 drought in the Amazon, identify the affected area, and publish the results in three weeks. A similar analysis of the 2005 drought using the same data sets took nearly two years, highlighting the potential contribution of our anomaly framework in accelerating scientific discoveries.

  10. Anomaly Detection in Power Quality at Data Centers

    NASA Technical Reports Server (NTRS)

    Grichine, Art; Solano, Wanda M.

    2015-01-01

    The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
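
    The Gaussian set-point idea described above can be sketched in a few lines. This is not the StruxureWare configuration itself, only an illustration of mean-plus-k-sigma alarm limits; the nominal 480 V feed and the value k = 3 are assumptions chosen for the example.

    ```python
    import numpy as np

    def gaussian_setpoints(history, k=3.0):
        """Compute lower/upper alarm set-points as mean +/- k standard deviations,
        which keeps the expected false-positive rate low if the monitored
        quantity is roughly Gaussian (about 0.3% for k = 3)."""
        mu, sigma = np.mean(history), np.std(history, ddof=1)
        return mu - k * sigma, mu + k * sigma

    def flag_anomalies(readings, low, high):
        # Flag any reading that falls outside the learned set-points.
        readings = np.asarray(readings)
        return (readings < low) | (readings > high)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        voltage_history = rng.normal(480.0, 2.0, size=10_000)   # nominal 480 V feed
        lo, hi = gaussian_setpoints(voltage_history)
        new_readings = [480.5, 479.2, 467.0, 493.5]              # last two are off-nominal
        print(flag_anomalies(new_readings, lo, hi))
    ```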

  11. Anomaly Detection Framework Based on Matching Pursuit for Network Security Enhancement

    DTIC Science & Technology

    2010-11-01

    Header and indexing fragments only (RTO-MP-IST-091, paper P11): Anomaly Detection Framework Based on Matching Pursuit for Network Security Enhancement, Rafał Renk, Witold... The surviving abstract text notes that intrusion detection systems can be classified into two main groups depending on the detection technique employed, anomaly detection and signature-based detection, and that the anomaly detection techniques the work focuses on rely on the existence of a reliable characterization of what is normal.

  12. Video behavior profiling for anomaly detection.

    PubMed

    Xiang, Tao; Gong, Shaogang

    2008-05-01

    This paper aims to address the problem of modelling video behaviour captured in surveillance videos for the applications of online normal behaviour recognition and anomaly detection. A novel framework is developed for automatic behaviour profiling and online anomaly sampling/detection without any manual labelling of the training dataset. The framework consists of the following key components: (1) A compact and effective behaviour representation method is developed based on discrete scene event detection. The similarity between behaviour patterns is measured based on modelling each pattern using a Dynamic Bayesian Network (DBN). (2) Natural grouping of behaviour patterns is discovered through a novel spectral clustering algorithm with unsupervised model selection and feature selection on the eigenvectors of a normalised affinity matrix. (3) A composite generative behaviour model is constructed which is capable of generalising from a small training set to accommodate variations in unseen normal behaviour patterns. (4) A run-time accumulative anomaly measure is introduced to detect abnormal behaviour, while normal behaviour patterns are recognised when sufficient visual evidence has become available based on an online Likelihood Ratio Test (LRT) method. This ensures robust and reliable anomaly detection and normal behaviour recognition in the shortest possible time. The effectiveness and robustness of our approach is demonstrated through experiments using noisy and sparse datasets collected from both indoor and outdoor surveillance scenarios. In particular, it is shown that a behaviour model trained using an unlabelled dataset is superior to those trained using the same but labelled dataset in detecting anomaly from an unseen video. The experiments also suggest that our online LRT-based behaviour recognition approach is advantageous over the commonly used Maximum Likelihood (ML) method in differentiating ambiguities among different behaviour classes observed online.

  13. The role of visualization and interaction in maritime anomaly detection

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran

    2011-01-01

    The surveillance of large sea, air or land areas normally involves the analysis of large volumes of heterogeneous data from multiple sources. Timely detection and identification of anomalous behavior or any threat activity is an important objective for enabling homeland security. While it is worth acknowledging that many existing mining applications support identification of anomalous behavior, autonomous anomaly detection systems for area surveillance are rarely used in the real world. We argue that such capabilities and applications present two critical challenges: (1) they need to provide adequate user support and (2) they need to involve the user in the underlying detection process. In order to encourage the use of anomaly detection capabilities in surveillance systems, this paper analyzes the challenges that existing anomaly detection and behavioral analysis approaches present regarding their use and maintenance by users. We analyze input parameters, detection process, model representation and outcomes. We discuss the role of visualization and interaction in the anomaly detection process. Practical examples from our current research within the maritime domain illustrate key aspects presented.

  14. Seismic Anomaly Detection Using Symbolic Representation Methods

    NASA Astrophysics Data System (ADS)

    Christodoulou, Vyron; Bi, Yaxin; Wilkie, George; Zhao, Guoze

    2016-08-01

    In this work we investigate the use of symbolic representation methods for anomaly detection in different electromagnetic sequential time-series datasets. An issue that is often overlooked regarding symbolic representation and its performance in anomaly detection is the use of a quantitative accuracy metric. Until recently only visual representations have been used to show the efficiency of an algorithm at detecting anomalies. We therefore propose a novel accuracy metric that takes into account the length of the sliding window of such symbolic representation algorithms and demonstrate its utility. For the evaluation of the accuracy metric, HOT-SAX is used, a method that aggregates data points by means of sliding windows. A HOT-SAX variant using overlapping windows is also introduced and achieves better results under the newly defined accuracy metric. Both methods are evaluated on ten different benchmark datasets; based on the empirical evidence, we then apply them to the Earth's geomagnetic data gathered by the SWARM satellites and by terrestrial sources around the epicenters of two seismic events in the Yunnan region of China.
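
    For readers unfamiliar with the underlying machinery, the sketch below shows a standard SAX-style symbolic representation and a brute-force version of the discord search that HOT-SAX accelerates with those symbolic words. It is not the authors' variant or their accuracy metric; the window length, alphabet size and the injected burst are illustrative assumptions.

    ```python
    import numpy as np

    BREAKPOINTS = np.array([-0.6745, 0.0, 0.6745])   # Gaussian breakpoints, 4-symbol alphabet

    def znorm(w):
        return (w - w.mean()) / (w.std() + 1e-12)

    def sax_word(window, n_segments=8):
        """Symbolic representation: z-normalize, reduce with Piecewise Aggregate
        Approximation, and map each segment mean to one of 4 symbols."""
        paa = znorm(window).reshape(n_segments, -1).mean(axis=1)
        return "".join("abcd"[s] for s in np.searchsorted(BREAKPOINTS, paa))

    def discord_scores(series, win=64, step=8):
        """Brute-force discord search (the computation HOT-SAX speeds up using the
        SAX words above): each window is scored by the distance to its nearest
        non-overlapping window, so one-of-a-kind shapes get the highest scores."""
        starts = np.arange(0, len(series) - win + 1, step)
        W = np.stack([znorm(series[s:s + win]) for s in starts])
        scores = np.empty(len(starts))
        for i, si in enumerate(starts):
            others = np.abs(starts - si) >= win          # exclude overlapping windows
            d = np.linalg.norm(W[others] - W[i], axis=1)
            scores[i] = d.min()
        return starts, scores

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        t = np.arange(4096)
        signal = np.sin(2 * np.pi * t / 128) + 0.05 * rng.normal(size=t.size)
        signal[2000:2064] = np.sin(2 * np.pi * np.arange(64) / 16)   # injected burst
        starts, scores = discord_scores(signal)
        best = int(starts[int(np.argmax(scores))])
        print(best, sax_word(signal[best:best + 64]))    # discord falls near index 2000
    ```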

  15. A hybrid approach for efficient anomaly detection using metaheuristic methods.

    PubMed

    Ghanem, Tamer F; Elkilani, Wail S; Abdul-Kader, Hatem M

    2015-07-01

    Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation, yet the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated with the multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative-selection-based detector generation. The evaluation of this approach is performed using the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors, with an accuracy of 96.1% compared to other competing machine learning algorithms.

  16. A hybrid approach for efficient anomaly detection using metaheuristic methods

    PubMed Central

    Ghanem, Tamer F.; Elkilani, Wail S.; Abdul-kader, Hatem M.

    2014-01-01

    Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation, yet the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated with the multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative-selection-based detector generation. The evaluation of this approach is performed using the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors, with an accuracy of 96.1% compared to other competing machine learning algorithms. PMID:26199752

  17. Algorithm development for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Rosario, Dalton S.

    2008-10-01

    This dissertation proposes and evaluates a novel anomaly detection algorithm suite for ground-to-ground, or air-to-ground, applications requiring automatic target detection using hyperspectral (HS) data. Targets are manmade objects in natural background clutter under unknown illumination and atmospheric conditions. The use of statistical models herein is purely for motivation of particular formulas for calculating anomaly output surfaces. In particular, formulas from semiparametrics are utilized to obtain novel forms for output surfaces, and alternative scoring algorithms are proposed to calculate output surfaces that are comparable to those of semiparametrics. Evaluation uses both simulated data and real HS data from a joint data collection effort between the Army Research Laboratory and the Army Armament Research Development & Engineering Center. A data transformation method is presented for use by the two-sample data structure univariate semiparametric and nonparametric scoring algorithms, such that the two-sample data are mapped from their original multivariate space to a univariate domain, where the statistical power of the univariate scoring algorithms is shown to be improved relative to existing multivariate scoring algorithms testing the same two-sample data. An exhaustive simulation experimental study is conducted to assess the performance of different HS anomaly detection techniques, where the null and alternative hypotheses are completely specified, including all parameters, using multivariate normal and mixtures of multivariate normal distributions. Finally, for ground-to-ground anomaly detection applications, where the unknown scales of targets add to the problem complexity, a novel global anomaly detection algorithm suite is introduced, featuring autonomous partial random sampling (PRS) of the data cube. The PRS method is proposed to automatically sample the unknown background clutter in the test HS imagery, and by repeating multiple times this

  18. The Frequencies of the Urinary Anomalies which were Detected in a Foetal Autopsy Study

    PubMed Central

    Gupta, Tulika; Kapoor, Kanchan; Sharma, A.; Huria, A.

    2012-01-01

    Aim: The detection of foetal urinary abnormalities in the antenatal period will help in adequate postnatal management and will also have a bearing on the decision to terminate the pregnancy. The purpose of the present study was to detect urinary anomalies in the antenatal period by performing autopsies of aborted foetuses. Settings and Design: A cross-sectional study. Methods and Material: A total of 226 aborted foetuses were autopsied. The urinary anomalies related to the renal parenchyma, the pelvi-ureteral system and the urinary bladder were recorded. The associated anomalies of the other organ systems were also noted. The incidences of the different urinary anomalies among the aborted foetuses were calculated. The gestational ages at which the various anomalies were detected were also studied. Results: Twenty-nine of the 226 foetuses were found to have 34 urinary anomalies. Renal agenesis was the single most common anomaly. Overall, anomalies related to the renal parenchyma accounted for 67.65% of all the urinary anomalies, while anomalies of the pelvi-ureteral system and the bladder constituted 20.59% of the detected urinary anomalies. The anomalies of the renal parenchyma (renal agenesis and horse-shoe and polycystic kidneys) were more frequently seen in foetuses with a shorter gestational age than in the foetuses which showed pelvi-ureteral anomalies. The cumulative incidence of foetuses with urinary anomalies by 30 weeks of gestation was 12.83%. Conclusions: A significant proportion of the aborted foetuses were found to have urinary anomalies. Early antenatal detection of these and associated anomalies is significant, as it may help in an early postnatal diagnosis and management. The degree and extent of the detected anomalies could also help in decision making regarding therapeutic abortions and future pregnancies. PMID:23373012

  19. Anomaly Detection Based on Sensor Data in Petroleum Industry Applications

    PubMed Central

    Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra

    2015-01-01

    Anomaly detection is the problem of finding patterns in data that do not conform to an a priori expected behavior. This is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset, where these anomalous samples are indicated as outliers. Anomaly detection has recently attracted the attention of the research community, because of its relevance in real-world applications, like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation. However, in either case, it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one of the application contexts where these problems are present. The correct detection of such types of unusual information empowers the decision maker with the capacity to act on the system in order to correctly avoid, correct or react to the situations associated with them. In that application context, heavy extraction machines for pumping and generation operations, like turbomachines, are intensively monitored by hundreds of sensors each that send measurements with a high frequency for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel fast and high quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal is meant for dealing with the aforementioned task and to cope with the lack of labeled training data. As a result, we perform a series of empirical studies comparing our approach to other methods applied to benchmark problems and a real-life application related to oil platform turbomachinery anomaly detection. PMID:25633599

  20. Anomaly detection based on sensor data in petroleum industry applications.

    PubMed

    Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra

    2015-01-27

    Anomaly detection is the problem of finding patterns in data that do not conform to an a priori expected behavior. This is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset, where these anomalous samples are indicated as outliers. Anomaly detection has recently attracted the attention of the research community, because of its relevance in real-world applications, like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation. However, in either case, it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one of the application contexts where these problems are present. The correct detection of such types of unusual information empowers the decision maker with the capacity to act on the system in order to correctly avoid, correct or react to the situations associated with them. In that application context, heavy extraction machines for pumping and generation operations, like turbomachines, are intensively monitored by hundreds of sensors each that send measurements with a high frequency for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel fast and high quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal is meant for dealing with the aforementioned task and to cope with the lack of labeled training data. As a result, we perform a series of empirical studies comparing our approach to other methods applied to benchmark problems and a real-life application related to oil platform turbomachinery anomaly detection.

  1. Innovative Statistical Inference for Anomaly Detection in Hyperspectral Imagery

    DTIC Science & Technology

    2004-09-01

    Title-page and report-documentation fragments only: Innovative Statistical Inference for Anomaly Detection in Hyperspectral Imagery, Dalton Rosario, ARL-TR-3339, September 2004, Sensors and Electron Devices... Subject terms: hyperspectral anomaly detection, large sample theory. A remaining abstract fragment refers to the effectiveness of both algorithms.

  2. Fuzzy Kernel k-Medoids algorithm for anomaly detection problems

    NASA Astrophysics Data System (ADS)

    Rustam, Z.; Talita, A. S.

    2017-07-01

    An Intrusion Detection System (IDS) is an essential part of security systems, strengthening the security of information systems. An IDS can be used to detect abuse by intruders who try to get into the network system in order to access and exploit the available data sources. There are two approaches to IDS, Misuse Detection and Anomaly Detection (behavior-based intrusion detection). Fuzzy clustering-based methods have been widely used to solve anomaly detection problems. Besides using the fuzzy membership concept to assign an object to a cluster, other approaches, such as combining fuzzy and possibilistic membership or feature-weighted methods, are also used. We propose Fuzzy Kernel k-Medoids, which combines fuzzy and possibilistic membership, as a powerful method for solving the anomaly detection problem, since in numerical experiments it is able to classify IDS benchmark data into five different classes simultaneously. We classify the KDDCup'99 IDS benchmark data set into five classes simultaneously; the best performance, a clustering accuracy of 90.28 percent, was achieved using 30% of the data for training.
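
    As a rough illustration of the kernel k-medoids skeleton underlying the proposal (without the fuzzy and possibilistic memberships the paper adds), the sketch below runs plain k-medoids on distances induced by an RBF kernel. The kernel parameter, cluster count and synthetic data are assumptions for the example only.

    ```python
    import numpy as np

    def rbf_kernel_distance(X, gamma=0.5):
        """Pairwise distances induced by an RBF kernel:
        d(i, j)^2 = k(i, i) + k(j, j) - 2 k(i, j) = 2 - 2 exp(-gamma ||xi - xj||^2)."""
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
        K = np.exp(-gamma * np.maximum(d2, 0.0))
        return np.sqrt(np.maximum(2.0 - 2.0 * K, 0.0))

    def k_medoids(D, k=5, n_iter=50, seed=0):
        """Classic alternating k-medoids on a precomputed distance matrix."""
        rng = np.random.default_rng(seed)
        medoids = rng.choice(len(D), size=k, replace=False)
        for _ in range(n_iter):
            labels = np.argmin(D[:, medoids], axis=1)
            new_medoids = medoids.copy()
            for j in range(k):
                members = np.where(labels == j)[0]
                if members.size:
                    # The medoid is the member minimizing total distance to its cluster.
                    new_medoids[j] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
            if np.array_equal(new_medoids, medoids):
                break
            medoids = new_medoids
        return labels, medoids

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        X = np.vstack([rng.normal(c, 0.3, size=(100, 4)) for c in range(5)])  # 5 synthetic traffic classes
        D = rbf_kernel_distance(X)
        labels, medoids = k_medoids(D, k=5)
        print(np.bincount(labels))            # cluster sizes (roughly 100 each, init permitting)
    ```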

  3. Profile-based adaptive anomaly detection for network security.

    SciTech Connect

    Zhang, Pengchu C. (Sandia National Laboratories, Albuquerque, NM); Durgin, Nancy Ann

    2005-11-01

    As information systems become increasingly complex and pervasive, they become inextricably intertwined with the critical infrastructure of national, public, and private organizations. The problem of recognizing and evaluating threats against these complex, heterogeneous networks of cyber and physical components is a difficult one, yet a solution is vital to ensuring security. In this paper we investigate profile-based anomaly detection techniques that can be used to address this problem. We focus primarily on the area of network anomaly detection, but the approach could be extended to other problem domains. We investigate using several data analysis techniques to create profiles of network hosts and perform anomaly detection using those profiles. The "profiles" reduce multi-dimensional vectors representing "normal behavior" into fewer dimensions, thus allowing pattern and cluster discovery. New events are compared against the profiles, producing a quantitative measure of how "anomalous" the event is. Most network intrusion detection systems (IDSs) detect malicious behavior by searching for known patterns in the network traffic. This approach suffers from several weaknesses, including a lack of generalizability, an inability to detect stealthy or novel attacks, and lack of flexibility regarding alarm thresholds. Our research focuses on enhancing current IDS capabilities by addressing some of these shortcomings. We identify and evaluate promising techniques for data mining and machine-learning. The algorithms are "trained" by providing them with a series of data-points from "normal" network traffic. A successful algorithm can be trained automatically and efficiently, will have a low error rate (low false alarm and miss rates), and will be able to identify anomalies in "pseudo real-time" (i.e., while the intrusion is still in progress, rather than after the fact). We also build a prototype anomaly detection tool that demonstrates how the techniques might
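
    One plausible way to realize the profile idea sketched above is to reduce the "normal" vectors with PCA and score new events by reconstruction error. The snippet below is such a sketch, not the prototype tool described in the report; the feature dimensionality, component count and percentile threshold are illustrative assumptions.

    ```python
    import numpy as np

    class HostProfile:
        """Learn a low-dimensional profile of 'normal' feature vectors for a host;
        score new events by how poorly the profile reconstructs them."""

        def __init__(self, n_components=3):
            self.n_components = n_components

        def fit(self, X):
            self.mu = X.mean(axis=0)
            _, _, Vt = np.linalg.svd(X - self.mu, full_matrices=False)
            self.basis = Vt[:self.n_components]          # principal directions
            # Calibrate a threshold from the training residuals (99th percentile).
            self.threshold = np.percentile(self.score(X), 99)
            return self

        def score(self, X):
            Z = X - self.mu
            recon = Z @ self.basis.T @ self.basis
            return np.linalg.norm(Z - recon, axis=1)      # reconstruction error

        def is_anomalous(self, X):
            return self.score(X) > self.threshold

    if __name__ == "__main__":
        rng = np.random.default_rng(4)
        normal_traffic = rng.normal(size=(5000, 12))      # e.g. per-flow features
        profile = HostProfile().fit(normal_traffic)
        probe = rng.normal(size=(3, 12))
        probe[0] += 8.0                                   # off-profile event
        print(profile.is_anomalous(probe))
    ```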

  4. A lightweight network anomaly detection technique

    DOE PAGES

    Kim, Jinoh; Yoo, Wucherl; Sim, Alex; ...

    2017-03-13

    While network anomaly detection is essential in network operations and management, it becomes further challenging to perform the first line of detection against the exponentially increasing volume of network traffic. In this paper, we develop a technique for the first line of online anomaly detection with two important considerations: (i) availability of traffic attributes during the monitoring time, and (ii) computational scalability for streaming data. The presented learning technique is lightweight and highly scalable with the beauty of approximation based on the grid partitioning of the given dimensional space. With the public traffic traces of KDD Cup 1999 and NSL-KDD, we show that our technique yields 98.5% and 83% detection accuracy, respectively, with only a couple of readily available traffic attributes that can be obtained without the help of post-processing. Finally, the results are at least comparable with the classical learning methods including decision tree and random forest, with approximately two orders of magnitude faster learning performance.
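
    A minimal sketch of grid-partition-based scoring in the spirit of this technique (not the published algorithm's exact formulation) is shown below; the bin count, minimum cell support and two-attribute feature space are assumptions made for the example.

    ```python
    import numpy as np
    from collections import Counter

    class GridDetector:
        """Partition each attribute's range into equal-width bins learned from
        training traffic, and remember how often each grid cell was occupied.
        A new record landing in a cell that was never (or rarely) seen during
        training is flagged as anomalous."""

        def __init__(self, n_bins=20, min_count=5):
            self.n_bins = n_bins
            self.min_count = min_count

        def fit(self, X):
            self.lo, self.hi = X.min(axis=0), X.max(axis=0)
            self.counts = Counter(map(tuple, self._cell(X)))
            return self

        def _cell(self, X):
            scaled = (np.atleast_2d(X) - self.lo) / (self.hi - self.lo + 1e-12)
            return np.clip((scaled * self.n_bins).astype(int), 0, self.n_bins - 1)

        def is_anomalous(self, X):
            return np.array([self.counts.get(tuple(c), 0) < self.min_count
                             for c in self._cell(X)])

    if __name__ == "__main__":
        rng = np.random.default_rng(5)
        train = rng.normal(size=(50_000, 2))      # two readily available traffic attributes
        det = GridDetector().fit(train)
        probes = np.array([[0.1, -0.2],           # typical traffic
                           [3.9, -3.8]])          # far corner of the grid, rarely populated
        print(det.is_anomalous(probes))           # expected: [False  True]
    ```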

  5. Statistical anomaly detection for individuals with cognitive impairments.

    PubMed

    Chang, Yao-Jen; Lin, Kang-Ping; Chou, Li-Der; Chen, Shu-Fang; Ma, Tian-Shyan

    2014-01-01

    We study anomaly detection in a context that considers user trajectories as input and tries to identify anomalies for users following normal routes such as taking public transportation from the workplace to home or vice versa. Trajectories are modeled as a discrete-time series of axis-parallel constraints ("boxes") in the 2-D space. The anomaly can be estimated by considering two trajectories, where one trajectory is the current movement pattern and the other is a weighted trajectory collected from N norms. The proposed system was implemented and evaluated with eight individuals with cognitive impairments. The experimental results showed that recall was 95.0% and precision was 90.9% on average without false alarm suppression. False alarms and false negatives dropped when axis rotation was applied. The precision with axis rotation was 97.6% and the recall was 98.8%. The average time used for sending locations, running anomaly detection, and issuing warnings was in the range of 15.1-22.7 s. Our findings suggest that the ability to adapt anomaly detection devices for appropriate timing of self-alerts will be particularly important.
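
    The box-per-time-step idea can be illustrated with a short sketch. This is not the deployed system; the margin, trip counts and coordinates below are invented for the example, and the learned boxes are simply the padded min/max envelope of the recorded normal trips.

    ```python
    import numpy as np

    def learn_route_boxes(trajectories, margin=0.0005):
        """trajectories: array (n_trips, n_steps, 2) of (lat, lon) samples taken at
        the same discrete time steps. Each step gets an axis-parallel box covering
        the normal trips, padded by a small margin (degrees)."""
        lo = trajectories.min(axis=0) - margin
        hi = trajectories.max(axis=0) + margin
        return lo, hi                             # each of shape (n_steps, 2)

    def anomaly_flags(current, lo, hi):
        """Flag every time step whose position falls outside its learned box."""
        return np.any((current < lo) | (current > hi), axis=1)

    if __name__ == "__main__":
        rng = np.random.default_rng(6)
        base = np.cumsum(rng.normal(0, 1e-4, size=(60, 2)), axis=0) + [25.03, 121.56]
        normal_trips = base + rng.normal(0, 5e-5, size=(20, 60, 2))   # 20 recorded commutes
        lo, hi = learn_route_boxes(normal_trips)
        wandering = base.copy()
        wandering[40:] += 0.01                    # user deviates from the usual route
        print(np.where(anomaly_flags(wandering, lo, hi))[0][:5])      # first flagged time steps
    ```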

  6. Preseismic ionospheric anomalies detected before the 2016 Kumamoto earthquake

    NASA Astrophysics Data System (ADS)

    Iwata, Takuya; Umeno, Ken

    2017-03-01

    On 15 April 2016, the Kumamoto earthquake (Mw 7.3) occurred in Japan with no warning signals. Global Navigation Satellite System (GNSS) receivers provide useful information on disturbances in the ionosphere by calculating changes in total electron content (TEC), which is the number of electrons in the ionosphere. Here we show that our recently proposed correlation analysis of TEC data can detect preseismic ionospheric anomalies from public GNSS data. Our method detected an ionospheric anomaly several tens of minutes before the 2016 Kumamoto earthquake near its epicenter. Furthermore, we give an indicator to distinguish between preseismic TEC anomalies and medium-scale traveling ionospheric disturbances (MSTIDs) by calculating the anomalous area rates. These results support the hypothesis that precursory phenomena exist before large earthquakes.

  7. Investigation of the collision line broadening problem as applicable to the NASA Optical Plume Anomaly Detection (OPAD) system, phase 1

    NASA Technical Reports Server (NTRS)

    Dean, Timothy C.; Ventrice, Carl A.

    1995-01-01

    As a final report for phase 1 of the project, the researchers are submitting to the Tennessee Tech Office of Research the following two papers (reprinted in this report): 'Collision Line Broadening Effects on Spectrometric Data from the Optical Plume Anomaly System (OPAD),' presented at the 30th AIAA/ASME/SAE/ASEE Joint Propulsion Conference, 27-29 June 1994, and 'Calculation of Collision Cross Sections for Atomic Line Broadening in the Plume of the Space Shuttle Main Engine (SSME),' presented at the IEEE Southeastcon '95, 26-29 March 1995. These papers fully state the problem and the progress made up to the end of NASA Fiscal Year 1994. The NASA OPAD system was devised to predict concentrations of anomalous species in the plume of the Space Shuttle Main Engine (SSME) through analysis of spectrometric data. The self absorption of the radiation of these plume anomalies is highly dependent on the line shape of the atomic transition of interest. The Collision Line Broadening paper discusses the methods used to predict line shapes of atomic transitions in the environment of a rocket plume. The Voigt profile is used as the line shape factor since both Doppler and collisional line broadening are significant. Methods used to determine the collisional cross sections are discussed and the results are given and compared with experimental data. These collisional cross sections are then incorporated into the current self absorbing radiative model and the predicted spectrum is compared to actual spectral data collected from the Stennis Space Center Diagnostic Test Facility rocket engine. The second paper included in this report investigates an analytical method for determining the cross sections for collision line broadening by molecular perturbers, using effective central force interaction potentials. These cross sections are determined for several atomic species with H2, one of the principal constituents of the SSME plume environment, and compared with experimental data.

  8. Investigation of the collision line broadening problem as applicable to the NASA Optical Plume Anomaly Detection (OPAD) system, phase 1

    NASA Astrophysics Data System (ADS)

    Dean, Timothy C.; Ventrice, Carl A.

    1995-05-01

    As a final report for phase 1 of the project, the researchers are submitting to the Tennessee Tech Office of Research the following two papers (reprinted in this report): 'Collision Line Broadening Effects on Spectrometric Data from the Optical Plume Anomaly System (OPAD),' presented at the 30th AIAA/ASME/SAE/ASEE Joint Propulsion Conference, 27-29 June 1994, and 'Calculation of Collision Cross Sections for Atomic Line Broadening in the Plume of the Space Shuttle Main Engine (SSME),' presented at the IEEE Southeastcon '95, 26-29 March 1995. These papers fully state the problem and the progress made up to the end of NASA Fiscal Year 1994. The NASA OPAD system was devised to predict concentrations of anomalous species in the plume of the Space Shuttle Main Engine (SSME) through analysis of spectrometric data. The self absorption of the radiation of these plume anomalies is highly dependent on the line shape of the atomic transition of interest. The Collision Line Broadening paper discusses the methods used to predict line shapes of atomic transitions in the environment of a rocket plume. The Voigt profile is used as the line shape factor since both Doppler and collisional line broadening are significant. Methods used to determine the collisional cross sections are discussed and the results are given and compared with experimental data. These collisional cross sections are then incorporated into the current self absorbing radiative model and the predicted spectrum is compared to actual spectral data collected from the Stennis Space Center Diagnostic Test Facility rocket engine. The second paper included in this report investigates an analytical method for determining the cross sections for collision line broadening by molecular perturbers, using effective central force interaction potentials. These cross sections are determined for several atomic species with H2, one of the principal constituents of the SSME plume environment, and compared with experimental data.

  9. Maintaining defender's reputation in anomaly detection against insider attacks.

    PubMed

    Zhang, Nan; Yu, Wei; Fu, Xinwen; Das, Sajal K

    2010-06-01

    We address issues related to establishing a defender's reputation in anomaly detection against two types of attackers: 1) smart insiders, who learn from historic attacks and adapt their strategies to avoid detection/punishment, and 2) naïve attackers, who blindly launch their attacks without knowledge of the history. In this paper, we propose two novel algorithms for reputation establishment--one for systems solely consisting of smart insiders and the other for systems in which both smart insiders and naïve attackers are present. The theoretical analysis and performance evaluation show that our reputation-establishment algorithms can significantly improve the performance of anomaly detection against insider attacks in terms of the tradeoff between detection and false positives.

  10. Evidence-based anomaly detection in clinical domains.

    PubMed

    Hauskrecht, Milos; Valko, Michal; Kveton, Branislav; Visweswaran, Shyam; Cooper, Gregory F

    2007-10-11

    Anomaly detection methods can be very useful in identifying interesting or concerning events. In this work, we develop and examine new probabilistic anomaly detection methods that let us evaluate management decisions for a specific patient and identify those decisions that are highly unusual with respect to patients with the same or similar condition. The statistics used in this detection are derived from probabilistic models such as Bayesian networks that are learned from a database of past patient cases. We evaluate our methods on the problem of detection of unusual hospitalization patterns for patients with community acquired pneumonia. The results show very encouraging detection performance with 0.5 precision at 0.53 recall and give us hope that these techniques may provide the basis of intelligent monitoring systems that alert clinicians to the occurrence of unusual events or decisions.

  11. Evidence-based Anomaly Detection in Clinical Domains

    PubMed Central

    Hauskrecht, Milos; Valko, Michal; Kveton, Branislav; Visweswaran, Shyam; Cooper, Gregory F.

    2007-01-01

    Anomaly detection methods can be very useful in identifying interesting or concerning events. In this work, we develop and examine new probabilistic anomaly detection methods that let us evaluate management decisions for a specific patient and identify those decisions that are highly unusual with respect to patients with the same or similar condition. The statistics used in this detection are derived from probabilistic models such as Bayesian networks that are learned from a database of past patient cases. We evaluate our methods on the problem of detection of unusual hospitalization patterns for patients with community acquired pneumonia. The results show very encouraging detection performance with 0.5 precision at 0.53 recall and give us hope that these techniques may provide the basis of intelligent monitoring systems that alert clinicians to the occurrence of unusual events or decisions. PMID:18693850

  12. Remote detection of geochemical soil anomalies

    USGS Publications Warehouse

    Canney, F.C.

    1970-01-01

    This paper describes a preliminary experiment that was made to compare the spectral reflectance from trees growing in soil over a mineral deposit with reflectance from trees of the same species growing in a nearby unmineralized area. Although the measurements were made on a relatively small number of trees, some significant differences were obtained and the over-all results are encouraging enough to warrant additional studies. Preliminary results suggest that measurement of spectral reflectance may become a dramatic new way of detecting geochemical soil anomalies by remote means in tree-covered areas.

  13. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time, model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during an operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
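
    A compact sketch of the claimed processing chain (residual against a model prediction, a significance test, and a persistence check) might look as follows; the z-score formulation, thresholds and simulated strain signal are assumptions for illustration, not the patented implementation.

    ```python
    import numpy as np

    def detect_structural_anomaly(measured, predicted, resid_std,
                                  z_threshold=3.0, persistence_threshold=5):
        """Compare real-time measurements with model-predicted values; declare an
        anomaly only after the error stays statistically significant for several
        consecutive samples (persistence), which suppresses isolated spikes."""
        persistence = 0
        for k, (m, p) in enumerate(zip(measured, predicted)):
            error = m - p                          # modeling error signal
            z = abs(error) / resid_std             # statistical significance of the error
            persistence = persistence + 1 if z > z_threshold else 0
            if persistence >= persistence_threshold:
                return True, k                     # structural anomaly indicated
        return False, None

    if __name__ == "__main__":
        rng = np.random.default_rng(7)
        predicted = np.sin(np.linspace(0, 10, 500))          # expected response at a location
        measured = predicted + rng.normal(0, 0.02, 500)
        measured[300:] += 0.15                               # sustained off-nominal shift
        print(detect_structural_anomaly(measured, predicted, resid_std=0.02))
    ```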

  14. Anomaly based vessel detection in visible and infrared images

    NASA Astrophysics Data System (ADS)

    Islam, Mohammad Moinul; Islam, Mohammed Nazrul; Asari, K. Vijayan; Karim, Mohammad A.

    2009-02-01

    Detection of small vessels is a challenging task for the navy, coast guard and port authorities for security purposes. Vessel identification is more complex than other object detection problems because of the variability in vessel shapes, features and orientations. Current methods for vessel detection are primarily based on segmentation techniques, which are not as efficient and also require different algorithms for visible and infrared images. In this paper, a new vessel detection technique is proposed employing anomaly detection. The input intensity image is first converted to a feature space using difference-of-Gaussian filters. Then a detector filter in the form of the Mahalanobis distance is applied to the feature points to detect anomalies whose characteristics differ from their surroundings. Anomalies are detected as bright spots in both visible and infrared images: the larger the gray value of a pixel, the more anomalous it is considered to be. The detector output is then post-processed and a binary image is constructed in which the boat edges with strong variance relative to the background are identified, along with a few outliers from the background. The resultant image is then clustered to identify the location of the vessel. The main contribution of this paper is an algorithm which can reliably detect small vessels in visible and infrared images. The proposed method is investigated using real-life vessel images and is found to perform well in both visible and infrared images with the same system parameters.
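
    The difference-of-Gaussian feature extraction followed by a Mahalanobis-distance detector can be sketched as below. This is only an illustrative reconstruction, not the authors' code; the filter scales, the 99.5th-percentile threshold and the synthetic sea-clutter image are assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter, label

    def dog_features(img, sigmas=(1, 2, 4, 8)):
        """Stack difference-of-Gaussian responses as a per-pixel feature vector."""
        feats = [gaussian_filter(img, s) - gaussian_filter(img, 2 * s) for s in sigmas]
        return np.stack(feats, axis=-1)               # (H, W, n_features)

    def mahalanobis_map(feats):
        """Score each pixel by its Mahalanobis distance from the global background
        statistics; pixels unlike their surroundings light up as anomalies."""
        X = feats.reshape(-1, feats.shape[-1])
        mu = X.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1]))
        d = np.einsum('ij,jk,ik->i', X - mu, cov_inv, X - mu)
        return d.reshape(feats.shape[:2])

    if __name__ == "__main__":
        rng = np.random.default_rng(8)
        sea = rng.normal(0.0, 0.05, size=(256, 256))   # stand-in for IR sea clutter
        sea[120:128, 100:140] += 0.8                   # small vessel
        score = mahalanobis_map(dog_features(sea))
        mask = score > np.percentile(score, 99.5)      # binary detection image
        blobs, n = label(mask)
        print(n, np.argwhere(mask).mean(axis=0))       # number of blobs, centroid of flagged pixels
    ```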

  15. Anomaly Detection for Next-Generation Space Launch Ground Operations

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Iverson, David L.; Hall, David R.; Taylor, William M.; Patterson-Hine, Ann; Brown, Barbara; Ferrell, Bob A.; Waterman, Robert D.

    2010-01-01

    NASA is developing new capabilities that will enable future human exploration missions while reducing mission risk and cost. The Fault Detection, Isolation, and Recovery (FDIR) project aims to demonstrate the utility of integrated vehicle health management (IVHM) tools in the domain of ground support equipment (GSE) to be used for the next generation launch vehicles. In addition to demonstrating the utility of IVHM tools for GSE, FDIR aims to mature promising tools for use on future missions and document the level of effort - and hence cost - required to implement an application with each selected tool. One of the FDIR capabilities is anomaly detection, i.e., detecting off-nominal behavior. The tool we selected for this task uses a data-driven approach. Unlike rule-based and model-based systems that require manual extraction of system knowledge, data-driven systems take a radically different approach to reasoning. At the basic level, they start with data that represent nominal functioning of the system and automatically learn expected system behavior. The behavior is encoded in a knowledge base that represents "in-family" system operations. During real-time system monitoring or during post-flight analysis, incoming data is compared to that nominal system operating behavior knowledge base; a distance representing deviation from nominal is computed, providing a measure of how far "out of family" current behavior is. We describe the selected tool for FDIR anomaly detection - Inductive Monitoring System (IMS), how it fits into the FDIR architecture, the operations concept for the GSE anomaly monitoring, and some preliminary results of applying IMS to a Space Shuttle GSE anomaly.
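
    IMS itself learns clusters of nominal parameter ranges; the sketch below is a heavily simplified stand-in that builds min/max envelopes over crudely clustered nominal data and reports how far a new vector falls outside the nearest envelope. The cluster count, data sizes and one-pass clustering are assumptions for the example, not the IMS algorithm.

    ```python
    import numpy as np

    class NominalEnvelopeMonitor:
        """Learn per-cluster min/max envelopes from nominal data (a rough stand-in
        for an IMS-style knowledge base) and report how far a new vector falls
        outside the nearest envelope: 0 means 'in family'."""

        def fit(self, X, n_clusters=8, seed=0):
            rng = np.random.default_rng(seed)
            # Cheap clustering: random centers, one nearest-center assignment pass.
            centers = X[rng.choice(len(X), n_clusters, replace=False)]
            labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
            self.envelopes = [(X[labels == j].min(0), X[labels == j].max(0))
                              for j in range(n_clusters) if np.any(labels == j)]
            return self

        def deviation(self, x):
            dists = []
            for lo, hi in self.envelopes:
                # Distance outside the box along each dimension, 0 if inside.
                out = np.maximum(lo - x, 0) + np.maximum(x - hi, 0)
                dists.append(np.linalg.norm(out))
            return min(dists)

    if __name__ == "__main__":
        rng = np.random.default_rng(9)
        nominal = rng.normal(size=(2000, 6))           # nominal GSE sensor snapshots
        mon = NominalEnvelopeMonitor().fit(nominal)
        print(mon.deviation(nominal[0]))               # ~0, in family
        print(mon.deviation(nominal[0] + 10.0))        # large, out of family
    ```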

  16. Automatic detection of anomalies in Space Shuttle Main Engine turbopumps

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.; Whitehead, B. A.; Wu, Kewei

    1992-01-01

    A prototype expert system (developed on both PC and Symbolics 3670 lisp machine) for detecting anomalies in turbopump vibration data has been tested with data from ground tests 902-473, 902-501, 902-519, and 904-097 of the Space Shuttle Main Engine (SSME). The expert system has been utilized to analyze vibration data from each of the following SSME components: high-pressure oxidizer turbopump, high-pressure fuel turbopump, low-pressure fuel turbopump, and preburner boost pump. The expert system locates and classifies peaks in the power spectral density of each 0.4-sec window of steady-state data. Peaks representing the fundamental and harmonic frequencies of both shaft rotation and bearing cage rotation are identified by the expert system. Anomalies are then detected on the basis of sequential criteria and two threshold criteria set individually for the amplitude of each of these peaks: a prior threshold used during the first few windows of data in a test, and a posterior threshold used thereafter. In most cases the anomalies detected by the expert system agree with those reported by NASA. The two cases where there is significant disagreement will be further studied and the system design refined accordingly.
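
    To make the peak-tracking scheme concrete, the sketch below estimates the PSD of a 0.4 s vibration window, reads off the levels at the shaft frequency and its harmonics, and compares each against an individually set threshold. It is not the expert system described here; the sampling rate, shaft speed and the prior thresholds derived from a nominal window are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.signal import welch

    def harmonic_peak_levels(x, fs, shaft_hz, n_harmonics=4, window_s=0.4):
        """Estimate the PSD of one 0.4 s window and read off the level at the
        shaft frequency and its harmonics (the peaks being tracked)."""
        n = int(window_s * fs)
        f, psd = welch(x[:n], fs=fs, nperseg=min(n, 1024))
        levels = []
        for k in range(1, n_harmonics + 1):
            idx = np.argmin(np.abs(f - k * shaft_hz))   # bin nearest the k-th harmonic
            levels.append(psd[idx])
        return np.array(levels)

    def anomalous(levels, thresholds):
        """Compare each tracked peak with its individually set threshold."""
        return levels > thresholds

    if __name__ == "__main__":
        fs, shaft_hz = 10_240, 600.0                     # illustrative values
        t = np.arange(0, 0.4, 1 / fs)
        rng = np.random.default_rng(10)
        nominal = 0.5 * np.sin(2 * np.pi * shaft_hz * t) + 0.05 * rng.normal(size=t.size)
        faulty = nominal + 0.6 * np.sin(2 * np.pi * 2 * shaft_hz * t)   # elevated 2x harmonic
        prior = 3.0 * harmonic_peak_levels(nominal, fs, shaft_hz)       # prior thresholds
        print(anomalous(harmonic_peak_levels(faulty, fs, shaft_hz), prior))  # 2x harmonic flagged
    ```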

  17. Automatic detection of anomalies in Space Shuttle Main Engine turbopumps

    NASA Astrophysics Data System (ADS)

    Lo, Ching F.; Whitehead, B. A.; Wu, Kewei

    1992-07-01

    A prototype expert system (developed on both PC and Symbolics 3670 lisp machine) for detecting anomalies in turbopump vibration data has been tested with data from ground tests 902-473, 902-501, 902-519, and 904-097 of the Space Shuttle Main Engine (SSME). The expert system has been utilized to analyze vibration data from each of the following SSME components: high-pressure oxidizer turbopump, high-pressure fuel turbopump, low-pressure fuel turbopump, and preburner boost pump. The expert system locates and classifies peaks in the power spectral density of each 0.4-sec window of steady-state data. Peaks representing the fundamental and harmonic frequencies of both shaft rotation and bearing cage rotation are identified by the expert system. Anomalies are then detected on the basis of sequential criteria and two threshold criteria set individually for the amplitude of each of these peaks: a prior threshold used during the first few windows of data in a test, and a posterior threshold used thereafter. In most cases the anomalies detected by the expert system agree with those reported by NASA. The two cases where there is significant disagreement will be further studied and the system design refined accordingly.

  18. Automatic detection of anomalies in Space Shuttle Main Engine turbopumps

    NASA Technical Reports Server (NTRS)

    Lo, Ching F. (Principal Investigator); Whitehead, Bruce; Wu, Kewei; Rogers, George

    1992-01-01

    A prototype expert system for detecting anomalies in turbopump vibration data has been tested with data from ground tests 902-473, 902-501, 902-519, and 904-097 of the Space Shuttle Main Engine (SSME). The expert system has been utilized to analyze vibration data from each of the following SSME components: high-pressure oxidizer turbopump, high-pressure fuel turbopump, low-pressure fuel turbopump, and preburner boost pump. The expert system locates and classifies peaks in the power spectral density of each 0.4-s window of steady-state data. Peaks representing the fundamental and harmonic frequencies of both shaft rotation and bearing cage rotation are identified by the expert system. Anomalies are then detected on the basis of two thresholds set individually for the amplitude of each of these peaks: a prior threshold used during the first few windows of data in a test, and a posterior threshold used thereafter. In most cases the anomalies detected by the expert system agree with those reported by NASA. The two cases where there is significant disagreement will be further studied and the system design refined accordingly.

  19. Detecting syntactic and semantic anomalies in schizophrenia.

    PubMed

    Moro, Andrea; Bambini, Valentina; Bosia, Marta; Anselmetti, Simona; Riccaboni, Roberta; Cappa, Stefano F; Smeraldi, Enrico; Cavallaro, Roberto

    2015-12-01

    One of the major challenges in the study of language in schizophrenia is to identify specific levels of the linguistic structure that might be selectively impaired. While historically a main semantic deficit has been widely claimed, results are mixed, with evidence of syntactic impairment as well. This might be due to heterogeneity in materials and paradigms across studies, which often do not allow tapping into single linguistic components. Moreover, the interaction between linguistic and neurocognitive deficits is still unclear. In this study, we concentrated on syntactic and semantic knowledge. We employed an anomaly detection task including short and long sentences with either syntactic errors violating the principles of Universal Grammar, or a novel form of semantic errors, resulting from a contradiction in the computation of the whole sentence meaning. Fifty-eight patients with a diagnosis of schizophrenia were compared to 30 healthy subjects. Results showed that, in patients, only the ability to identify syntactic anomalies, both in short and long sentences, was impaired. This result cannot be explained by working memory abilities or psychopathological features. These findings suggest the presence of an impairment of syntactic knowledge in schizophrenia, at least partially independent of the cognitive and psychopathological profile. By contrast, we cannot conclude that there is a semantic impairment, at least in terms of compositional semantics abilities.

  20. Automated anomaly detection for Orbiter High Temperature Reusable Surface Insulation

    NASA Astrophysics Data System (ADS)

    Cooper, Eric G.; Jones, Sharon M.; Goode, Plesent W.; Vazquez, Sixto L.

    1992-11-01

    The description, analysis, and experimental results of a method for identifying possible defects on High Temperature Reusable Surface Insulation (HRSI) of the Orbiter Thermal Protection System (TPS) is presented. Currently, a visual postflight inspection of Orbiter TPS is conducted to detect and classify defects as part of the Orbiter maintenance flow. The objective of the method is to automate the detection of defects by identifying anomalies between preflight and postflight images of TPS components. The initial version is intended to detect and label gross (greater than 0.1 inches in the smallest dimension) anomalies on HRSI components for subsequent classification by a human inspector. The approach is a modified Golden Template technique where the preflight image of a tile serves as the template against which the postflight image of the tile is compared. Candidate anomalies are selected as a result of the comparison and processed to identify true anomalies. The processing methods are developed and discussed, and the results of testing on actual and simulated tile images are presented. Solutions to the problems of brightness and spatial normalization, timely execution, and minimization of false positives are also discussed.

  1. Automated anomaly detection for Orbiter High Temperature Reusable Surface Insulation

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; Jones, Sharon M.; Goode, Plesent W.; Vazquez, Sixto L.

    1992-01-01

    The description, analysis, and experimental results of a method for identifying possible defects on High Temperature Reusable Surface Insulation (HRSI) of the Orbiter Thermal Protection System (TPS) is presented. Currently, a visual postflight inspection of Orbiter TPS is conducted to detect and classify defects as part of the Orbiter maintenance flow. The objective of the method is to automate the detection of defects by identifying anomalies between preflight and postflight images of TPS components. The initial version is intended to detect and label gross (greater than 0.1 inches in the smallest dimension) anomalies on HRSI components for subsequent classification by a human inspector. The approach is a modified Golden Template technique where the preflight image of a tile serves as the template against which the postflight image of the tile is compared. Candidate anomalies are selected as a result of the comparison and processed to identify true anomalies. The processing methods are developed and discussed, and the results of testing on actual and simulated tile images are presented. Solutions to the problems of brightness and spatial normalization, timely execution, and minimization of false positives are also discussed.

  2. Statistical Traffic Anomaly Detection in Time-Varying Communication Networks

    DTIC Science & Technology

    2015-02-01

    ... based anomaly detection methods are considered to be more economic and promising since they can identify novel attacks. In this work we focus on change ... dynamically. We formulate the anomaly detection problem as a binary composite hypothesis testing problem and develop a model-free and a model-based ...

  3. Statistical Traffic Anomaly Detection in Time Varying Communication Networks

    DTIC Science & Technology

    2015-02-01

    ... based anomaly detection methods are considered to be more economic and promising since they can identify novel attacks. In this work we focus on change ... dynamically. We formulate the anomaly detection problem as a binary composite hypothesis testing problem and develop a model-free and a model-based ...

  4. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field

    PubMed Central

    Christiansen, Peter; Nielsen, Lars N.; Steen, Kim A.; Jørgensen, Rasmus N.; Karstoft, Henrik

    2016-01-01

    Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained for detecting and classifying a predefined set of object types. These algorithms have difficulties in detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, like people and animals, occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including “Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks” (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45–90 m) than RCNN. RCNN has a similar performance at a short range (0–30 m). However, DeepAnomaly has much fewer model parameters and (182 ms/25 ms =) a 7.28-times faster processing time per image. Unlike most CNN-based methods, the high accuracy, the low computation time and the low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit). PMID:27845717

  5. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field.

    PubMed

    Christiansen, Peter; Nielsen, Lars N; Steen, Kim A; Jørgensen, Rasmus N; Karstoft, Henrik

    2016-11-11

    Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained for detecting and classifying a predefined set of object types. These algorithms have difficulties in detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, like people and animals, occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks" (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45-90 m) than RCNN. RCNN has a similar performance at a short range (0-30 m). However, DeepAnomaly has much fewer model parameters and (182 ms/25 ms =) a 7.28-times faster processing time per image. Unlike most CNN-based methods, the high accuracy, the low computation time and the low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit).

  6. Bayesian Filtering Approaches for Detecting Anomalies in Environmental Sensor Data

    NASA Astrophysics Data System (ADS)

    Hill, D. J.; Minsker, B. S.

    2006-12-01

    Recent advances in sensor technology are facilitating the deployment of sensors into the environment that can produce measurements at high spatial and/or temporal resolutions. Not only can these data be used to better characterize the system for improved modeling, but they can also be used to produce better understandings of the mechanisms of environmental processes. One such use of these data is anomaly detection to identify data that deviate from historical patterns. These anomalous data can be caused by sensor or data transmission errors or by infrequent system behaviors that are often of interest to the scientific or public safety communities. Thus, anomaly detection has many practical applications, such as data quality assurance and control (QA/QC), where anomalous data are treated as data errors; focused data collection, where anomalous data indicate segments of data that are of interest to researchers; or event detection, where anomalous data signal system behaviors that could result in a natural disaster, for example. Traditionally, most anomaly detection has been carried out manually with the assistance of data visualization tools; however, due to the large volume of data produced by environmental sensors, manual techniques are not always feasible. This study develops an automated anomaly detection method that employs dynamic Bayesian networks (DBNs) to model the states of the environmental system in which the sensors are deployed. The DBN is an artificial intelligence technique that models the evolution of the discrete and/or continuous valued states of a dynamic system by tracking changes in the system states over time. Two commonly used types of DBNs are hidden Markov models and Kalman filters. In this study, DBNs will be used to predict the expected value of unknown system states, as well as the likelihood of particular sensor measurements of those states. Unlikely measurements are then considered anomalous. The performance of the DBN based anomaly
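
    One of the DBN variants mentioned above, the Kalman filter, flags a measurement as anomalous when it is unlikely under the filter's one-step-ahead predictive distribution. A minimal scalar sketch follows; the random-walk model, noise variances, and likelihood threshold are illustrative assumptions rather than the study's actual configuration.

      import numpy as np
      from scipy.stats import norm

      # Scalar Kalman filter used as an anomaly detector: a measurement is
      # flagged when it is unlikely under the one-step-ahead predictive
      # distribution. All parameter values are illustrative assumptions.
      q, r = 0.01, 0.25          # process and measurement noise variances
      x_est, p_est = 0.0, 1.0    # initial state mean and variance
      threshold = 1e-3           # predictive-density threshold for "anomalous"

      def step(z, x_est, p_est):
          # Predict (random-walk state model).
          x_pred, p_pred = x_est, p_est + q
          # Likelihood of the measurement under the predictive distribution.
          lik = norm.pdf(z, loc=x_pred, scale=np.sqrt(p_pred + r))
          # Update.
          k = p_pred / (p_pred + r)
          return x_pred + k * (z - x_pred), (1 - k) * p_pred, lik

      for z in [0.1, 0.05, -0.2, 4.0, 0.15]:          # 4.0 is an injected spike
          x_est, p_est, lik = step(z, x_est, p_est)
          print(f"z={z:5.2f}  likelihood={lik:.3g}  anomaly={lik < threshold}")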

  7. Impact of maternal body mass index on the antenatal detection of congenital anomalies.

    PubMed

    Best, K E; Tennant, P W G; Bell, R; Rankin, J

    2012-11-01

    To investigate the association between maternal body mass index (BMI) and antenatal ultrasound detection of congenital anomalies. Population-based register study. North of England (UK). All pregnancies (n = 3096) associated with a congenital anomaly notified to the Northern Congenital Abnormality Survey (NorCAS) during 2006-2009. Cases with chromosomal and teratogenic anomalies (n = 611) or without information on antenatal scanning (n = 4) were excluded. Adjusted odds ratios (aORs) and 95% confidence intervals (CIs) for antenatal detection according to maternal BMI categories were estimated using logistic regression. For all anomalies combined, cases were defined as 'detected' if any congenital anomaly was suspected antenatally. Organ system-specific anomalies were defined as detected if an anomaly of the correct system was suspected. Antenatal detection of any anomaly occurred in 1146 of 2483 (46.2%) cases with normal karyotype. The odds of detection were significantly decreased in obese (BMI ≥ 30 kg/m(2)) women compared with women of recommended BMI (18.5-24.9 kg/m(2); aOR, 0.77; 95% CI, 0.60-0.99; P = 0.046). Cardiovascular system anomalies were suspected antenatally in 109 of 945 (11.5%) cases. The odds of detecting a cardiovascular anomaly were significantly greater in underweight women (BMI < 18.5 kg/m(2)) than in women of recommended BMI (aOR, 2.95; 95% CI, 1.13-7.70; P = 0.027). There was no association between BMI and detection in any other organ system or between BMI and termination of pregnancy for fetal anomaly. Antenatal ultrasound detection of a congenital anomaly is decreased in obese pregnant women. This has implications for the scanning and counselling of obese women. © 2012 The Authors BJOG An International Journal of Obstetrics and Gynaecology © 2012 RCOG.

  8. Voila: Visual Anomaly Detection and Monitoring with Streaming Spatiotemporal Data.

    PubMed

    Cao, Nan; Lin, Chaoguang; Zhu, Qiuhan; Lin, Yu-Ru; Teng, Xian; Wen, Xidao

    2017-08-30

    The increasing availability of spatiotemporal data continuously collected from various sources provides new opportunities for a timely understanding of the data in their spatial and temporal context. Finding abnormal patterns in such data poses significant challenges. Given that there is often no clear boundary between normal and abnormal patterns, existing solutions are limited in their capacity of identifying anomalies in large, dynamic and heterogeneous data, interpreting anomalies in their multifaceted, spatiotemporal context, and allowing users to provide feedback in the analysis loop. In this work, we introduce a unified visual interactive system and framework, Voila, for interactively detecting anomalies in spatiotemporal data collected from a streaming data source. The system is designed to meet two requirements in real-world applications, i.e., online monitoring and interactivity. We propose a novel tensor-based anomaly analysis algorithm with visualization and interaction design that dynamically produces contextualized, interpretable data summaries and allows for interactively ranking anomalous patterns based on user input. Using the "smart city" as an example scenario, we demonstrate the effectiveness of the proposed framework through quantitative evaluation and qualitative case studies.

  9. Spectral anomaly methods for aerial detection using KUT nuisance rejection

    NASA Astrophysics Data System (ADS)

    Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.

    2015-06-01

    This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical due to higher speeds, larger field of view, and geographically induced background changes. As well, the possible large altitude or stand-off distance variations cause significant steps in background count rate as well as spectral changes due to increased gamma-ray scatter with detection at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems; a NaI(Tl)-detector-based system and a CsI detector array. The optimization here details the adaptation of the spectral windows for a particular set of target sources to aerial detection and the tailoring for the specific detectors. As well, the methodology and results for background rejection methods optimized for the aerial gamma-ray detection using Potassium, Uranium and Thorium (KUT) nuisance rejection are shown. Results indicate that use of a realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection due to altitude changes and geographically induced steps such as at land-water interfaces.
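
    As a rough illustration of the spectral-comparison-ratio idea, counts in a few energy windows can be compared against expectations predicted from a background window, with large standardized deviations raising the anomaly metric. The window definitions and the simple deviation test below are illustrative assumptions, not the actual NSCRAD algorithm or its KUT nuisance rejection.

      import numpy as np

      # Highly simplified sketch of a spectral-comparison anomaly metric:
      # counts in target energy windows are compared with an expectation
      # predicted from a broad background window; large standardized
      # deviations raise the metric. Illustrative only, not NSCRAD itself.
      def anomaly_metric(window_counts, bkg_counts, ratios_nominal):
          """window_counts: observed counts per target window (array)
          bkg_counts: observed counts in a broad background window (scalar)
          ratios_nominal: nominal window/background count ratios from training"""
          expected = ratios_nominal * bkg_counts
          z = (window_counts - expected) / np.sqrt(np.maximum(expected, 1.0))
          return np.max(np.abs(z))

      ratios = np.array([0.05, 0.08, 0.02])                           # learned from nominal data
      print(anomaly_metric(np.array([52, 81, 20]), 1000, ratios))     # roughly in family
      print(anomaly_metric(np.array([52, 81, 95]), 1000, ratios))     # third window elevated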

  10. Statistical Anomaly Detection for Monitoring of Human Dynamics

    NASA Astrophysics Data System (ADS)

    Kamiya, K.; Fuse, T.

    2015-05-01

    Understanding of human dynamics has drawn attention in various areas. Due to the widespread use of positioning technologies based on GPS or public Wi-Fi, location information can be obtained with high spatio-temporal resolution and at low cost. By collecting sets of individual location information in real time, monitoring of human dynamics is now considered possible and is expected to lead to dynamic traffic control in the future. Although this monitoring focuses on detecting anomalous states of human dynamics, anomaly detection methods have been developed ad hoc and are not fully systematized. This research aims to define an anomaly detection problem for human dynamics monitoring with gridded population data and to develop an anomaly detection method based on that definition. Based on a comprehensive review, we discuss the characteristics of anomaly detection for human dynamics monitoring and categorize our problem as a semi-supervised anomaly detection problem that detects contextual anomalies behind time-series data. We developed an anomaly detection method based on a sticky HDP-HMM, which is able to estimate the number of hidden states according to the input data. Results of an experiment with synthetic data showed that the proposed method has good fundamental performance with respect to the detection rate. In an experiment with real gridded population data, an anomaly was detected when and where an actual social event had occurred.

  11. Mobile gamma-ray scanning system for detecting radiation anomalies associated with /sup 226/Ra-bearing materials

    SciTech Connect

    Myrick, T.E.; Blair, M.S.; Doane, R.W.; Goldsmith, W.A.

    1982-11-01

    A mobile gamma-ray scanning system has been developed by Oak Ridge National Laboratory for use in the Department of Energy's remedial action survey programs. The unit consists of a NaI(Tl) detection system housed in a specially-equipped van. The system is operator controlled through an on-board mini-computer, with data output provided on the computer video screen, strip chart recorders, and an on-line printer. Data storage is provided by a floppy disk system. Multichannel analysis capabilities are included for qualitative radionuclide identification. A /sup 226/Ra-specific algorithm is employed to identify locations containing residual radium-bearing materials. This report presents the details of the system description, software development, and scanning methods utilized with the ORNL system. Laboratory calibration and field testing have established the system sensitivity, field of view, and other performance characteristics, the results of which are also presented. Documentation of the instrumentation and computer programs is included.

  12. Investigations as a prerequisite for genetic counseling after termination of pregnancy based on sonographic detection of serious central nervous system or skeletal anomalies.

    PubMed

    Kaasen, Anne; Prescott, Trine E; Heiberg, Arvid; Scott, Helge; Haugen, Guttorm

    2008-01-01

    The primary aim was to evaluate which investigation performed after sonographic detection of central nervous system (CNS) or skeletal anomalies had the highest diagnostic yield. The secondary aim was to estimate recurrence risk. The design was a retrospective review of patients' records at a tertiary fetal medicine referral center. Pregnancy terminations (n=97) because of CNS or skeletal anomalies during a 17-year period, within 12-24 weeks gestation. Two medical geneticists and one genetic counselor reviewed charts independently. Primary ultrasound diagnosis, change in diagnosis following supplementary examinations in addition to prenatal ultrasound (medical history, autopsy, post-mortem X-ray, karyotyping, targeted DNA analysis and investigations for infection), the most useful method to determine diagnosis, and recurrence risk estimate including inter-rater agreement. Mean gestational age was 19.8 weeks. All three investigators agreed in each case on which investigation constituted the best basis to determine the most precise diagnosis. The examinations performed in addition to prenatal ultrasound provided important diagnostic information in 54 cases (56%) and altered recurrence risk in 22 (23%) cases; in eight of these cases the risk estimate was increased. In nine cases (9%) the investigators disagreed in their estimates of recurrence risk. Kappa for inter-rater agreement was >0.90. A panel of diagnostic investigations, depending on the organ system involved, allows for a more precise diagnosis and a more reliable estimate of recurrence risk than prenatal ultrasound alone. In some instances, recurrence risk estimation is not straightforward as evidenced by lack of consensus.

  13. Anomaly detection enhanced classification in computer intrusion detection

    SciTech Connect

    Fugate, M. L.; Gattiker, J. R.

    2002-01-01

    This report describes work with the goal of enhancing capabilities in computer intrusion detection. The work builds upon a study of classification performance that compared various methods of classifying information derived from computer network packets into attack versus normal categories, based on a labeled training dataset. This previous work validates our classification methods, and clears the ground for studying whether and how anomaly detection can be used to enhance this performance. The DARPA project that initiated the dataset used here concluded that anomaly detection should be examined to boost the performance of machine learning in the computer intrusion detection task. This report investigates the data set for aspects that will be valuable for anomaly detection applications, and supports these results with models constructed from the data. In this report, the term anomaly detection means learning a model from unlabeled data, and using this to make some inference about future data. Our data is a feature vector derived from network packets: an 'example' or 'sample'. On the other hand, classification means building a model from labeled data, and using that model to classify unlabeled (future) examples. There is some precedent in the literature for combining these methods. One approach is to stage the two techniques, using anomaly detection to segment data into two sets for classification. An interpretation of this is a method to combat nonstationarity in the data. In our previous work, we demonstrated that the data has substantial temporal nonstationarity. With classification methods that can be thought of as learning a decision surface between two statistical distributions, performance is expected to degrade significantly when classifying examples that are from regions not well represented in the training set. Anomaly detection can be seen as a problem of learning the density (landscape) or the support (boundary) of a statistical distribution so that

  14. Discovering System Health Anomalies Using Data Mining Techniques

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.

    2005-01-01

    We present a data mining framework for the analysis and discovery of anomalies in high-dimensional time series of sensor measurements that would be found in an Integrated System Health Monitoring system. We specifically treat the problem of discovering anomalous features in the time series that may be indicative of a system anomaly, or in the case of a manned system, an anomaly due to the human. Identification of these anomalies is crucial to building stable, reusable, and cost-efficient systems. The framework consists of an analysis platform and new algorithms that can scale to thousands of sensor streams to discover temporal anomalies. We discuss the mathematical framework that underlies the system and also describe in detail how this framework is general enough to encompass both discrete and continuous sensor measurements. We also describe a new set of data mining algorithms based on kernel methods and hidden Markov models that allow for the rapid assimilation, analysis, and discovery of system anomalies. We then describe the performance of the system on a real-world problem in the aircraft domain where we analyze the cockpit data from aircraft as well as data from the aircraft propulsion, control, and guidance systems. These data are discrete and continuous sensor measurements and are dealt with seamlessly in order to discover anomalous flights. We conclude with recommendations that describe the tradeoffs in building an integrated scalable platform for robust anomaly detection in ISHM applications.

  15. Thermal and TEC anomalies detection using an intelligent hybrid system around the time of the Saravan, Iran, (Mw = 7.7) earthquake of 16 April 2013

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2014-02-01

    A powerful earthquake of Mw = 7.7 struck the Saravan region (28.107° N, 62.053° E) in Iran on 16 April 2013. To date, selecting an automated anomaly detection method for nonlinear time series of earthquake precursors has been an attractive and challenging task. Artificial Neural Network (ANN) and Particle Swarm Optimization (PSO) methods have shown strong potential for accurate time series prediction. This paper presents the first study integrating ANN and PSO in the research of earthquake precursors to detect unusual variations of the thermal and total electron content (TEC) seismo-ionospheric anomalies induced by the strong Saravan earthquake. In this study, to overcome stagnation in local minima during ANN training, PSO is used as the optimization method instead of traditional algorithms for training the ANN. The proposed hybrid method detected a considerable number of anomalies 4 and 8 days preceding the earthquake. Since, in this case study, ionospheric TEC anomalies induced by seismic activity are confounded with background fluctuations due to solar activity, a multi-resolution time series processing technique based on the wavelet transform has been applied to the TEC signal variations. Since agreement among the final results of several robust methods is a convincing indication of a method's efficiency, the thermal and TEC anomalies detected using the ANN + PSO method were compared to the anomalies observed by implementing the mean, median, wavelet, Kalman filter, Auto-Regressive Integrated Moving Average (ARIMA), Support Vector Machine (SVM) and Genetic Algorithm (GA) methods. The results indicate that the ANN + PSO method is quite promising and deserves serious attention as a new tool for detecting thermal and TEC seismo-ionospheric anomalies.

  16. Automated Network Anomaly Detection with Learning, Control and Mitigation

    ERIC Educational Resources Information Center

    Ippoliti, Dennis

    2014-01-01

    Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…

  17. Automated Network Anomaly Detection with Learning, Control and Mitigation

    ERIC Educational Resources Information Center

    Ippoliti, Dennis

    2014-01-01

    Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…

  18. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    PubMed

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibits anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.
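
    The Pareto depth idea can be illustrated by peeling successive Pareto fronts of the criterion vectors: non-dominated points have depth 1, removing them exposes the second front, and so on. The sketch below computes depths for two "smaller is better" criteria; it illustrates the depth concept only and is not the full PDA detection algorithm.

      import numpy as np

      # Pareto depth of points under two "smaller is better" criteria.
      # Depth 1 = first (non-dominated) Pareto front; deeper points are
      # dominated by more layers. Illustrative only, not the PDA detector.
      def pareto_depths(points):
          pts = np.asarray(points, dtype=float)
          depths = np.zeros(len(pts), dtype=int)
          remaining = np.arange(len(pts))
          depth = 0
          while remaining.size:
              depth += 1
              sub = pts[remaining]
              nondom = []
              for i, p in enumerate(sub):
                  # p is dominated if some other point is <= in all criteria
                  # and strictly < in at least one.
                  dominated = np.any(np.all(sub <= p, axis=1) & np.any(sub < p, axis=1))
                  if not dominated:
                      nondom.append(i)
              idx = remaining[nondom]
              depths[idx] = depth
              remaining = np.setdiff1d(remaining, idx)
          return depths

      pts = [(0.1, 0.9), (0.5, 0.5), (0.9, 0.1), (0.6, 0.7), (0.95, 0.95)]
      print(pareto_depths(pts))   # first three points lie on the first front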

  19. Is Host-Based Anomaly Detection + Temporal Correlation = Worm Causality

    DTIC Science & Technology

    2007-03-06

    combination of host-based anomaly detection and temporal correlation of network events. The contribution of this paper is a systematic exploration of this...suggest that it is feasible to establish attack causality accurately using anomaly detection and temporal event correlation in enterprise network environments with tens of thousands of hosts.

  20. Claycap anomaly detection using hyperspectral remote sensing and lidargrammetric techniques

    NASA Astrophysics Data System (ADS)

    Garcia Quijano, Maria Jose

    Clay capped waste sites are a common method to dispose of the more than 40 million tons of hazardous waste produced in the United States every year (EPA, 2003). Due to the potential threat that hazardous waste poses, it is essential to monitor closely the performance of these facilities. Development of a monitoring system that exploits spectral and topographic changes over hazardous waste sites is presented. Spectral anomaly detection is based upon the observed changes in absolute reflectance and spectral derivatives in centipede grass (Eremochloa ophiuroides) under different irrigation levels. The spectral features that provide the best separability among irrigation levels were identified using Stepwise Discriminant Analyses. The Red Edge Position was selected as a suitable discriminant variable to compare the performance of a global and a local anomaly detection algorithm using a DAIS 3715 hyperspectral image. Topographical anomaly detection is assessed by evaluating the vertical accuracy of two LIDAR datasets acquired from two different altitudes (700 m and 1,200 m AGL) over a clay-capped hazardous site at the Savannah River National Laboratory, SC using the same Optech ALTM 2050 and Cessna 337 platform. Additionally, a quantitative comparison is performed to determine the effect that decreasing platform altitude and increasing posting density have on the vertical accuracy of the LIDAR data collected.

  1. Recent Advances in Ionospheric Anomalies detection

    NASA Astrophysics Data System (ADS)

    Titov, Anton; Vyacheslav, Khattatov

    2016-07-01

    The variability of ionospheric parameters and ionospheric anomalies is the subject of intensive research. Ionospheric disturbances caused by solar activity, the passage of the terminator, artificial heating of the high-latitude ionosphere, and seismic events are widely known and studied in the literature. Each of these types of anomalies is the subject of study and analysis, and analyzing them provides an opportunity to improve our understanding of the mechanisms of ionospheric disturbances. To address this problem, we propose to develop a method of modeling the ionosphere based on the assimilation of large amounts of observational data.

  2. Lymphatic system anomalies in Crouzon syndrome

    PubMed Central

    Bourgeois, Pierre; Moniotte, Stéphane

    2009-01-01

    Crouzon syndrome is a rare genetic disorder characterised mainly by distinctive malformations of the skull and facial region and caused by mutations in the fibroblast growth factor receptor 2 (FGFR2) gene. No study reported on oedemas related to lymphatic system abnormalities in these patients. A case of Crouzon syndrome displaying classic facial anomalies but also with bilateral lower limb oedema is reported in whom lymphoscintigraphic investigation of the limbs clearly delineated the presence of lymphatic system anomalies. PMID:21686735

  3. An Adaptive Network-based Fuzzy Inference System for the detection of thermal and TEC anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake of 11 August 2012

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-09-01

    Anomaly detection is extremely important for forecasting the date, location and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS) is proposed to detect the thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake that jolted NW Iran on 11 August 2012. ANFIS is a well-known hybrid neuro-fuzzy network for modeling nonlinear complex systems. In this study, the thermal and TEC anomalies detected using the proposed method are also compared to the anomalies observed by applying classical and intelligent methods, including Interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN) and Support Vector Machine (SVM) methods. The dataset, comprising Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), covers 62 days. If the difference between the value predicted by the ANFIS method and the observed value exceeds the pre-defined threshold, then the observed precursor value, in the absence of non-seismic effective parameters, can be regarded as a precursory anomaly. For the two precursors, LST and TEC, the ANFIS method shows very good agreement with the other implemented classical and intelligent methods, indicating that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. This paper indicates that the detected thermal and TEC anomalies derive their credibility from the overall efficiencies and potentialities of the five integrated methods.

  4. Detection of Low Temperature Volcanogenic Thermal Anomalies with ASTER

    NASA Astrophysics Data System (ADS)

    Pieri, D. C.; Baxter, S.

    2009-12-01

    Predicting volcanic eruptions is a thorny problem, as volcanoes typically exhibit idiosyncratic waxing and/or waning pre-eruption emission, geodetic, and seismic behavior. It is no surprise that increasing our accuracy and precision in eruption prediction depends on assessing the time-progressions of all relevant precursor geophysical, geochemical, and geological phenomena, and on more frequently observing volcanoes when they become restless. The ASTER instrument on the NASA Terra Earth Observing System satellite in low earth orbit provides important capabilities in the area of detection of volcanogenic anomalies such as thermal precursors and increased passive gas emissions. Its unique high spatial resolution multi-spectral thermal IR imaging data (90m/pixel; 5 bands in the 8-12um region), bore-sighted with visible and near-IR imaging data, and combined with off-nadir pointing and stereo-photogrammetric capabilities make ASTER a potentially important volcanic precursor detection tool. We are utilizing the JPL ASTER Volcano Archive (http://ava.jpl.nasa.gov) to systematically examine 80,000+ ASTER volcano images to analyze (a) thermal emission baseline behavior for over 1500 volcanoes worldwide, (b) the form and magnitude of time-dependent thermal emission variability for these volcanoes, and (c) the spatio-temporal limits of detection of pre-eruption temporal changes in thermal emission in the context of eruption precursor behavior. We are creating and analyzing a catalog of the magnitude, frequency, and distribution of volcano thermal signatures worldwide as observed from ASTER since 2000 at 90m/pixel. Of particular interest as eruption precursors are small low contrast thermal anomalies of low apparent absolute temperature (e.g., melt-water lakes, fumaroles, geysers, grossly sub-pixel hotspots), for which the signal-to-noise ratio may be marginal (e.g., scene confusion due to clouds, water and water vapor, fumarolic emissions, variegated ground emissivity, and

  5. Hierarchical Kohonenen net for anomaly detection in network security.

    PubMed

    Sarasamma, Suseela T; Zhu, Qiuming A; Huff, Julie

    2005-04-01

    A novel multilevel hierarchical Kohonen Net (K-Map) for an intrusion detection system is presented. Each level of the hierarchical map is modeled as a simple winner-take-all K-Map. One significant advantage of this multilevel hierarchical K-Map is its computational efficiency. Unlike other statistical anomaly detection methods such as nearest neighbor approach, K-means clustering or probabilistic analysis that employ distance computation in the feature space to identify the outliers, our approach does not involve costly point-to-point computation in organizing the data into clusters. Another advantage is the reduced network size. We use the classification capability of the K-Map on selected dimensions of data set in detecting anomalies. Randomly selected subsets that contain both attacks and normal records from the KDD Cup 1999 benchmark data are used to train the hierarchical net. We use a confidence measure to label the clusters. Then we use the test set from the same KDD Cup 1999 benchmark to test the hierarchical net. We show that a hierarchical K-Map in which each layer operates on a small subset of the feature space is superior to a single-layer K-Map operating on the whole feature space in detecting a variety of attacks in terms of detection rate as well as false positive rate.

  6. Anomaly Detection in Test Equipment via Sliding Mode Observers

    NASA Technical Reports Server (NTRS)

    Solano, Wanda M.; Drakunov, Sergey V.

    2012-01-01

    Nonlinear observers were originally developed based on the ideas of variable structure control, and for the purpose of detecting disturbances in complex systems. In this anomaly detection application, these observers were designed for estimating the distributed state of fluid flow in a pipe described by a class of advection equations. The observer algorithm uses collected data in a piping system to estimate the distributed system state (pressure and velocity along a pipe containing liquid gas propellant flow) using only boundary measurements. These estimates are then used to further estimate and localize possible anomalies such as leaks or foreign objects, and instrumentation metering problems such as incorrect flow meter orifice plate size. The observer algorithm has the following parts: a mathematical model of the fluid flow, observer control algorithm, and an anomaly identification algorithm. The main functional operation of the algorithm is in creating the sliding mode in the observer system implemented as software. Once the sliding mode starts in the system, the equivalent value of the discontinuous function in sliding mode can be obtained by filtering out the high-frequency chattering component. In control theory, "observers" are dynamic algorithms for the online estimation of the current state of a dynamic system by measurements of an output of the system. Classical linear observers can provide optimal estimates of a system state in case of uncertainty modeled by white noise. For nonlinear cases, the theory of nonlinear observers has been developed and its success is mainly due to the sliding mode approach. Using the mathematical theory of variable structure systems with sliding modes, the observer algorithm is designed in such a way that it steers the output of the model to the output of the system obtained via a variety of sensors, in spite of possible mismatches between the assumed model and actual system. The unique properties of sliding mode control

  7. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    NASA Technical Reports Server (NTRS)

    Viswanathan, Arun

    2012-01-01

    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/ Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection and operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a significantly poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this

  8. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    NASA Technical Reports Server (NTRS)

    Viswanathan, Arun

    2012-01-01

    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/ Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection and operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a significantly poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this

  9. Adaptive Anomaly Detection using Isolation Forest

    DTIC Science & Technology

    2009-12-20

    ... detectors such as ORCA [8] and one-class SVM [31], and the density-based anomaly detector LOF [9]. The rest of the paper is organised as follows. Section 2 ... that a value can be computed using this measure. We use k = 5 in our experiments. The second experiment compares HS*-Trees with ORCA [8], one-class SVM (first mentioned in [31]) and LOF [9]. ORCA employs distance-based definition (ii), stated in section 3.1, to rank anomalies; LOF is the state-of ...
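
    For context on the method family the snippet refers to, the minimal example below uses scikit-learn's standard (batch) isolation forest to flag points that are isolated by few random splits. This is not the adaptive, streaming HS*-Trees approach evaluated in the report; the contamination level and synthetic data are illustrative assumptions.

      import numpy as np
      from sklearn.ensemble import IsolationForest

      rng = np.random.default_rng(0)
      normal = rng.normal(0, 1, size=(500, 2))          # dense nominal cluster
      outliers = rng.uniform(-6, 6, size=(10, 2))       # sparse anomalies
      X = np.vstack([normal, outliers])

      clf = IsolationForest(n_estimators=100, contamination=0.02, random_state=0).fit(X)
      labels = clf.predict(X)                           # -1 = anomaly, +1 = normal
      print("flagged anomalies:", np.sum(labels == -1))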

  10. Post-processing for improving hyperspectral anomaly detection accuracy

    NASA Astrophysics Data System (ADS)

    Wu, Jee-Cheng; Jiang, Chi-Ming; Huang, Chen-Liang

    2015-10-01

    Anomaly detection is an important topic in the exploitation of hyperspectral data. Based on the Reed-Xiaoli (RX) detector and a morphology operator, this research proposes a novel technique for improving the accuracy of hyperspectral anomaly detection. Firstly, the RX-based detector is used to process a given input scene. Then, a post-processing scheme using a morphology operator is employed to detect those pixels around high-scoring anomaly pixels. Tests were conducted using two real hyperspectral images with ground truth information, and the results, based on receiver operating characteristic curves, illustrated that the proposed method reduced the false alarm rates of the RX-based detector.
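
    The RX detector scores each pixel by the Mahalanobis distance of its spectrum from the scene background statistics, and the post-processing step described above can be approximated by dilating the mask of high-scoring pixels so that neighboring pixels are also flagged for inspection. The threshold, structuring element, and synthetic cube below are illustrative assumptions, not the paper's exact scheme.

      import numpy as np
      from scipy.ndimage import binary_dilation

      def rx_scores(cube):
          """Global RX: Mahalanobis distance of each pixel spectrum from the scene mean."""
          h, w, b = cube.shape
          X = cube.reshape(-1, b)
          mu = X.mean(axis=0)
          cov_inv = np.linalg.inv(np.cov(X, rowvar=False) + 1e-6 * np.eye(b))
          d = X - mu
          return np.einsum('ij,jk,ik->i', d, cov_inv, d).reshape(h, w)

      rng = np.random.default_rng(0)
      cube = rng.normal(size=(64, 64, 20))                 # synthetic hyperspectral cube
      cube[30, 30] += 5.0                                  # implanted anomalous pixel

      scores = rx_scores(cube)
      mask = scores > np.percentile(scores, 99.5)          # high-scoring anomaly pixels
      grown = binary_dilation(mask, structure=np.ones((3, 3)))   # morphological post-processing
      print("pixels before/after dilation:", mask.sum(), grown.sum())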

  11. Identification and detection of anomalies through SSME data analysis

    NASA Technical Reports Server (NTRS)

    Pereira, Lisa; Ali, Moonis

    1990-01-01

    The goal of the ongoing research described in this paper is to analyze real-time ground test data in order to identify patterns associated with the anomalous engine behavior, and on the basis of this analysis to develop an expert system which detects anomalous engine behavior in the early stages of fault development. A prototype of the expert system has been developed and tested on the high frequency data of two SSME tests, namely Test #901-0516 and Test #904-044. The comparison of our results with the post-test analyses indicates that the expert system detected the presence of the anomalies in a significantly early stage of fault development.

  12. Load characterization and anomaly detection for voice over IP traffic.

    PubMed

    Mandjes, Michel; Saniee, Iraj; Stolyar, Alexander L

    2005-09-01

    We consider the problem of traffic anomaly detection in IP networks. Traffic anomalies typically arise when there is focused overload or when a network element fails and it is desired to infer these purely from the measured traffic. We derive new general formulae for the variance of the cumulative traffic over a fixed time interval and show how the derived analytical expression simplifies for the case of voice over IP traffic, the focus of this paper. To detect load anomalies, we show it is sufficient to consider cumulative traffic over relatively long intervals such as 5 min. We also propose simple anomaly detection tests including detection of over/underload. This approach substantially extends the current practice in IP network management where only the first-order statistics and fixed thresholds are used to identify abnormal behavior. We conclude with the application of the scheme to field data from an operational network.
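
    The over/underload test described above amounts to comparing the cumulative count in a long interval (e.g., 5 min) against a band derived from its mean and variance. A minimal sketch follows; the nominal mean, variance, and 3-sigma band are illustrative assumptions, not the formulae derived in the paper.

      import numpy as np

      # Cumulative-traffic anomaly test over 5-minute intervals: flag an
      # interval whose total count falls outside a band around the nominal
      # mean. The statistics and the 3-sigma band are illustrative.
      nominal_mean = 1500.0        # expected arrivals per 5-minute interval
      nominal_var = 1600.0         # variance of the cumulative count
      band = 3 * np.sqrt(nominal_var)

      def check_interval(count):
          if count > nominal_mean + band:
              return "overload"
          if count < nominal_mean - band:
              return "underload"
          return "normal"

      for c in [1510, 1495, 1700, 1320]:
          print(c, check_interval(c))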

  13. On local anomaly detection and analysis for clinical pathways.

    PubMed

    Huang, Zhengxing; Dong, Wei; Ji, Lei; Yin, Liangying; Duan, Huilong

    2015-11-01

    Anomaly detection, as an imperative task for clinical pathway (CP) analysis and improvement, can provide useful and actionable knowledge of interest to clinical experts to be potentially exploited. Existing studies mainly focused on the detection of global anomalous inpatient traces of CPs using the similarity measures in a structured manner, which brings order in the chaos of CPs, may decline the accuracy of similarity measure between inpatient traces, and may distort the efficiency of anomaly detection. In addition, local anomalies that exist in some subsegments of events or behaviors in inpatient traces are easily overlooked by existing approaches since they are designed for detecting global or large anomalies. In this study, we employ a probabilistic topic model to discover underlying treatment patterns, and assume any significant unexplainable deviations from the normal behaviors surmised by the derived patterns are strongly correlated with anomalous behaviours. In this way, we can figure out the detailed local abnormal behaviors and the associations between these anomalies such that diagnostic information on local anomalies can be provided. The proposed approach is evaluated via a clinical data-set, including 2954 unstable angina patient traces and 483,349 clinical events, extracted from a Chinese hospital. Using the proposed method, local anomalies are detected from the log. In addition, the identified associations between the detected local anomalies are derived from the log, which leads to clinical concern about the reasons for these anomalies in CPs. The correctness of the proposed approach has been evaluated by three experienced cardiologists at the hospital. For four types of local anomalies (i.e., unexpected events, early events, delay events, and absent events), the proposed approach achieves 94%, 71%, 77%, and 93.2% in terms of recall. This is quite remarkable as we do not use prior knowledge.

  14. Evaluation of Anomaly Detection Method Based on Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Fontugne, Romain; Himura, Yosuke; Fukuda, Kensuke

    The number of threats on the Internet is rapidly increasing, and anomaly detection has become of increasing importance. High-speed backbone traffic is particularly degraded, but its analysis is a complicated task due to the amount of data, the lack of payload data, the asymmetric routing and the use of sampling techniques. Most anomaly detection schemes focus on the statistical properties of network traffic and highlight anomalous traffic through their singularities. In this paper, we concentrate on unusual traffic distributions, which are easily identifiable in temporal-spatial space (e.g., time/address or port). We present an anomaly detection method that uses a pattern recognition technique to identify anomalies in pictures representing traffic. The main advantage of this method is its ability to detect attacks involving mice flows. We evaluate the parameter set and the effectiveness of this approach by analyzing six years of Internet traffic collected from a trans-Pacific link. We show several examples of detected anomalies and compare our results with those of two other methods. The comparison indicates that the only anomalies detected by the pattern-recognition-based method are mainly malicious traffic with a few packets.

  15. Unsupervised Ensemble Anomaly Detection Using Time-Periodic Packet Sampling

    NASA Astrophysics Data System (ADS)

    Uchida, Masato; Nawata, Shuichi; Gu, Yu; Tsuru, Masato; Oie, Yuji

    We propose an anomaly detection method for finding patterns in network traffic that do not conform to legitimate (i.e., normal) behavior. The proposed method trains a baseline model describing the normal behavior of network traffic without using manually labeled traffic data. The trained baseline model is used as the basis for comparison with the audit network traffic. This anomaly detection works in an unsupervised manner through the use of time-periodic packet sampling, which is used in a manner that differs from its intended purpose — the lossy nature of packet sampling is used to extract normal packets from the unlabeled original traffic data. Evaluation using actual traffic traces showed that the proposed method has false positive and false negative rates in the detection of anomalies regarding TCP SYN packets comparable to those of a conventional method that uses manually labeled traffic data to train the baseline model. Performance variation due to the probabilistic nature of sampled traffic data is mitigated by using ensemble anomaly detection that collectively exploits multiple baseline models in parallel. Alarm sensitivity is adjusted for the intended use by using maximum- and minimum-based anomaly detection that effectively take advantage of the performance variations among the multiple baseline models. Testing using actual traffic traces showed that the proposed anomaly detection method performs as well as one using manually labeled traffic data and better than one using randomly sampled (unlabeled) traffic data.

  16. Detection of fluid density anomalies using remote imaging techniques

    NASA Astrophysics Data System (ADS)

    Smart, Clara J.

    Systematic and remote imaging techniques capable of detecting fluid density anomalies will allow for effective scientific sampling, improved geologic and biologic spatial understanding and analysis of temporal changes. This work presents algorithms for detection of anomalous fluids using an ROV-mounted high resolution imaging suite, specifically the structured light laser sensor and 1350kHz multibeam sonar system. As the ROV-mounted structured light laser sensor passes over areas of active flow, the turbulent nature of the density anomaly causes the projected laser line, imaged at the seafloor, to blur and distort. Detection of this phenomenon was initially presented in 2013 with significant limitations including false positive results for active venting. Advancements to the detection algorithm presented in this work include intensity normalization algorithms and the implementation of a support vector machine classification algorithm. Results showing clear differentiation between areas of plain seafloor, bacteria or biology, and active venting are presented for multiple hydrothermal vent fields. Survey altitudes and the direction of travel impact laser data gathered over active vent sites. To determine the implications of these survey parameters, data collected over a single hydrothermal vent at three altitudes with four headings per altitude are analyzed. Changing survey geometry will impact the resolution and intensity of the laser line images; therefore, normalization and processing considerations are presented to maintain signal quality. The spatial distribution of the detected density anomaly will also be discussed as it is impacted by survey range and vehicle heading. While surveying hypersaline brine pools, the observed acoustic responses from the 1350kHz high frequency multibeam sonar system indicate sensitivity to changes in acoustic impedance and therefore the density of a fluid. Internal density stratification was detected acoustically, appearing as multiple

  17. Anomaly detection applied to a materials control and accounting database

    SciTech Connect

    Whiteson, R.; Spanks, L.; Yarbro, T.

    1995-09-01

    An important component of the national mission of reducing the nuclear danger includes accurate recording of the processing and transportation of nuclear materials. Nuclear material storage facilities, nuclear chemical processing plants, and nuclear fuel fabrication facilities collect and store large amounts of data describing transactions that involve nuclear materials. To maintain confidence in the integrity of these data, it is essential to identify anomalies in the databases. Anomalous data could indicate error, theft, or diversion of material. Yet, because of the complex and diverse nature of the data, analysis and evaluation are extremely tedious. This paper describes the authors' work in the development of analysis tools to automate the anomaly detection process for the Material Accountability and Safeguards System (MASS) that tracks and records the activities associated with accountable quantities of nuclear material at Los Alamos National Laboratory. Using existing guidelines that describe valid transactions, the authors have created an expert system that identifies transactions that do not conform to the guidelines. Thus, this expert system can be used to focus the attention of the expert or inspector directly on significant phenomena.

  18. Multi-criteria anomaly detection in urban noise sensor networks.

    PubMed

    Dauwe, Samuel; Oldoni, Damiano; De Baets, Bernard; Van Renterghem, Timothy; Botteldooren, Dick; Dhoedt, Bart

    2014-01-01

    The growing concern of citizens about the quality of their living environment and the emergence of low-cost microphones and data acquisition systems triggered the deployment of numerous noise monitoring networks spread over large geographical areas. Due to the local character of noise pollution in an urban environment, a dense measurement network is needed in order to accurately assess the spatial and temporal variations. The use of consumer grade microphones in this context appears to be very cost-efficient compared to the use of measurement microphones. However, the lower reliability of these sensing units requires a strong quality control of the measured data. To automatically validate sensor (microphone) data, prior to their use in further processing, a multi-criteria measurement quality assessment model for detecting anomalies such as microphone breakdowns, drifts and critical outliers was developed. Each of the criteria results in a quality score between 0 and 1. An ordered weighted average (OWA) operator combines these individual scores into a global quality score. The model is validated on datasets acquired from a real-world, extensive noise monitoring network consisting of more than 50 microphones. Over a period of more than a year, the proposed approach successfully detected several microphone faults and anomalies.
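
    As an illustration of the score-fusion step described above, the short sketch below implements an ordered weighted average (OWA) operator in Python; the example criteria values and weights are hypothetical and not taken from the paper.

      import numpy as np

      def owa(scores, weights):
          # Ordered weighted average: weights are applied to the scores sorted in descending
          # order, so the operator can range from min through mean to max behaviour.
          scores = np.sort(np.asarray(scores, dtype=float))[::-1]
          weights = np.asarray(weights, dtype=float)
          assert scores.shape == weights.shape and np.isclose(weights.sum(), 1.0)
          return float(np.dot(scores, weights))

      # Hypothetical per-criterion quality scores for one microphone, each in [0, 1].
      criteria = [0.95, 0.80, 0.10, 0.90]
      print(owa(criteria, [0.1, 0.2, 0.3, 0.4]))   # weights emphasising the worst scores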

  19. Lidar detection algorithm for time and range anomalies

    NASA Astrophysics Data System (ADS)

    Ben-David, Avishai; Davidson, Charles E.; Vanderbeek, Richard G.

    2007-10-01

    A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly where the question "is a target (aerosol cloud) present at range R within time t1 to t2" is addressed, and for range anomaly where the question "is a target present at time t within ranges R1 and R2" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one over range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO2 lidar measurements of bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed.
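
    A rough sketch of the two-hypothesis Gaussian-mixture step is given below, using scikit-learn's EM implementation to fit a two-component mixture to one-dimensional detection scores and derive a threshold. The low-rank orthogonal projection preprocessing and the explicit detection/false-alarm probability calculation from the paper are omitted, and the threshold rule (posterior of the higher-mean component reaching 0.5) is an assumption for illustration.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      def mixture_threshold(detection_scores, n_grid=2000):
          # Fit a two-component 1-D Gaussian mixture (background vs. anomaly hypotheses) with EM
          # and return the score at which the posterior of the "anomaly" component reaches 0.5.
          x = np.asarray(detection_scores, dtype=float).reshape(-1, 1)
          gmm = GaussianMixture(n_components=2, random_state=0).fit(x)
          anomaly_comp = int(np.argmax(gmm.means_.ravel()))   # assume anomalies have the larger mean
          grid = np.linspace(x.min(), x.max(), n_grid).reshape(-1, 1)
          post = gmm.predict_proba(grid)[:, anomaly_comp]
          above = np.where(post >= 0.5)[0]
          return float(grid[above[0], 0]) if len(above) else float(x.max())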

  20. Mixtures of Probabilistic Principal Component Analyzers for Anomaly Detection

    SciTech Connect

    Fang, Yi; Ganguly, Auroop R

    2007-01-01

    Anomaly detection tools have been increasingly used in recent years to generate predictive insights on rare events. The typical challenges encountered in such applications include a large number of data dimensions and absence of labeled data. An anomaly detection strategy for these scenarios is dimensionality reduction followed by clustering in the reduced space, with the degree of anomaly of an event or observation quantified by statistical distance from the clusters. However, most research efforts so far are focused on single abrupt anomalies, while the correlation between observations is completely ignored. In this paper, we address the problem of detection of both abrupt and sustained anomalies with high dimensions. The task becomes more challenging than only detecting abrupt outliers because of the gradual and indiscriminant changes in sustained anomalies. We utilize a mixture model of probabilistic principal component analyzers to quantify each observation by probabilistic measures. A statistical process control method is then used to monitor both abrupt and gradual changes. On the other hand, the mixture model can be regarded as a trade-off strategy between linear and nonlinear dimensionality reductions in terms of computational efficiency. This compromise is particularly important in real-time deployment. The proposed method is evaluated on simulated and benchmark data, as well as on data from wide-area sensors at a truck weigh station test-bed.
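
    The sketch below is a simplified stand-in for the approach described above: it scores observations with a single probabilistic PCA likelihood (scikit-learn's PCA.score_samples) and monitors the scores with an EWMA control chart, rather than the full mixture of probabilistic principal component analyzers; all parameter values are illustrative assumptions.

      import numpy as np
      from sklearn.decomposition import PCA

      def ppca_ewma_monitor(X_train, X_stream, n_components=5, lam=0.2, k=3.0):
          # Score each observation by its log-likelihood under a probabilistic PCA model,
          # then track an EWMA of the scores to flag both abrupt and gradual (sustained) changes.
          pca = PCA(n_components=n_components).fit(X_train)
          train_ll = pca.score_samples(X_train)          # per-sample log-likelihood (PPCA model)
          mu, sigma = train_ll.mean(), train_ll.std()
          limit = k * sigma * np.sqrt(lam / (2 - lam))   # asymptotic EWMA control limit
          ewma, flags = mu, []
          for ll in pca.score_samples(X_stream):
              ewma = lam * ll + (1 - lam) * ewma
              flags.append(abs(ewma - mu) > limit)
          return np.array(flags)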

  1. A New Methodology for Early Anomaly Detection of BWR Instabilities

    SciTech Connect

    Ivanov, K. N.

    2005-11-27

    The objective of the performed research is to develop an early anomaly detection methodology so as to enhance safety, availability, and operational flexibility of Boiling Water Reactor (BWR) nuclear power plants. The technical approach relies on suppression of potential power oscillations in BWRs by detecting small anomalies at an early stage and taking appropriate prognostic actions based on an anticipated operation schedule. The research utilizes a model of coupled (two-phase) thermal-hydraulic and neutron flux dynamics, which is used as a generator of time series data for anomaly detection at an early stage. The model captures critical nonlinear features of coupled thermal-hydraulic and nuclear reactor dynamics and (slow time-scale) evolution of the anomalies as non-stationary parameters. The time series data derived from this nonlinear non-stationary model serves as the source of information for generating the symbolic dynamics for characterization of model parameter changes that quantitatively represent small anomalies. The major focus of the presented research activity was on developing and qualifying algorithms of pattern recognition for power instability based on anomaly detection from time series data, which later can be used to formulate real-time decision and control algorithms for suppression of power oscillations for a variety of anticipated operating conditions. The research being performed in the framework of this project is essential to make significant improvement in the capability of thermal instability analyses for enhancing safety, availability, and operational flexibility of currently operating and next generation BWRs.

  2. Online Learning and Sequential Anomaly Detection in Trajectories.

    PubMed

    Laxhammar, Rikard; Falkman, Göran

    2013-09-12

    Detection of anomalous trajectories is an important problem in the surveillance domain. Various algorithms based on learning of normal trajectory patterns have been proposed for this problem. Yet, these algorithms typically suffer from one or more limitations: They are not designed for sequential analysis of incomplete trajectories or online learning based on an incrementally updated training set. Moreover, they typically involve tuning of many parameters, including ad-hoc anomaly thresholds, and may therefore suffer from overfitting and poorly-calibrated alarm rates. In this article, we propose and investigate the Sequential Hausdorff Nearest-Neighbour Conformal Anomaly Detector (SHNN-CAD) for online learning and sequential anomaly detection in trajectories. This is a parameter-light algorithm that offers a well-founded approach to the calibration of the anomaly threshold. The discords algorithm, originally proposed by Keogh et al., is another parameter-light anomaly detection algorithm that has previously been shown to have good classification performance on a wide range of time-series datasets, including trajectory data. We implement and investigate the performance of SHNN-CAD and the discords algorithm on four different labelled trajectory datasets. The results show that SHNN-CAD achieves competitive classification performance with minimum parameter tuning during unsupervised online learning and sequential anomaly detection in trajectories.

  3. Online Learning and Sequential Anomaly Detection in Trajectories.

    PubMed

    Laxhammar, Rikard; Falkman, Göran

    2014-06-01

    Detection of anomalous trajectories is an important problem in the surveillance domain. Various algorithms based on learning of normal trajectory patterns have been proposed for this problem. Yet, these algorithms typically suffer from one or more limitations: They are not designed for sequential analysis of incomplete trajectories or online learning based on an incrementally updated training set. Moreover, they typically involve tuning of many parameters, including ad-hoc anomaly thresholds, and may therefore suffer from overfitting and poorly-calibrated alarm rates. In this article, we propose and investigate the Sequential Hausdorff Nearest-Neighbor Conformal Anomaly Detector (SHNN-CAD) for online learning and sequential anomaly detection in trajectories. This is a parameter-light algorithm that offers a well-founded approach to the calibration of the anomaly threshold. The discords algorithm, originally proposed by Keogh et al., is another parameter-light anomaly detection algorithm that has previously been shown to have good classification performance on a wide range of time-series datasets, including trajectory data. We implement and investigate the performance of SHNN-CAD and the discords algorithm on four different labeled trajectory datasets. The results show that SHNN-CAD achieves competitive classification performance with minimum parameter tuning during unsupervised online learning and sequential anomaly detection in trajectories.
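
    A compact sketch of the conformal step underlying SHNN-CAD is shown below, using the symmetric Hausdorff distance to the nearest training trajectory as the nonconformity measure and the standard conformal p-value; the sequential handling of incomplete trajectories and the incremental training-set update from the paper are not reproduced here, and the epsilon value is illustrative.

      import numpy as np
      from scipy.spatial.distance import directed_hausdorff

      def hausdorff(a, b):
          # Symmetric Hausdorff distance between two trajectories (arrays of (x, y) points).
          return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

      def nonconformity(traj, training_set):
          # Distance to the nearest neighbour in the training set.
          return min(hausdorff(traj, t) for t in training_set)

      def conformal_anomaly(test_traj, training_set, epsilon=0.05):
          # Flag the trajectory as anomalous if its conformal p-value falls below epsilon.
          alphas = [nonconformity(t, [u for u in training_set if u is not t]) for t in training_set]
          alpha_test = nonconformity(test_traj, training_set)
          p_value = (1 + sum(a >= alpha_test for a in alphas)) / (len(alphas) + 1)
          return p_value < epsilon, p_value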

  4. Association of Copy Number Variants With Specific Ultrasonographically Detected Fetal Anomalies

    PubMed Central

    Donnelly, Jennifer C; Platt, Lawrence D; Rebarber, Andrei; Zachary, Julia; Grobman, William A; Wapner, Ronald J

    2014-01-01

    Objective To evaluate the association of other-than-common benign copy number variants with specific fetal abnormalities detected by ultrasonogram. Methods Fetuses with structural anomalies were compared to fetuses without detected abnormalities for the frequency of other-than-common benign copy number variants. This is a secondary analysis from the previously published National Institute of Child Health and Human Development microarray trial. Ultrasound reports were reviewed and details of structural anomalies were entered into a nonhierarchical web-based database. The frequency of other-than-common benign copy number variants (ie, either pathogenic or variants of uncertain significance) not detected by karyotype was calculated for each anomaly in isolation and in the presence of other anomalies and compared to the frequency in fetuses without detected abnormalities. Results Of 1,082 fetuses with anomalies detected on ultrasound, 752 had a normal karyotype. Other-than-common benign copy number variants were present in 61 (8.1%) of these euploid fetuses. Fetuses with anomalies in more than one system had a 13.0% frequency of other-than-common benign copy number variants, which was significantly higher (p<0.001) than the frequency (3.6%) in fetuses without anomalies (n = 1966). Specific organ systems in which isolated anomalies were nominally significantly associated with other-than-common benign copy number variants were the renal (p= 0.036) and cardiac systems (p=0.012) but did not meet the adjustment for multiple comparisons. Conclusions When a fetal anomaly is detected on ultrasonogram, chromosomal microarray offers additional information over karyotype, the degree of which depends on the organ system involved. PMID:24901266

  5. Evaluation schemes for video and image anomaly detection algorithms

    NASA Astrophysics Data System (ADS)

    Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael

    2016-05-01

    Video anomaly detection is a critical research area in computer vision. It is a natural first step before applying object recognition algorithms. There are many algorithms that detect anomalies (outliers) in videos and images that have been introduced in recent years. However, these algorithms behave and perform differently based on differences in domains and tasks to which they are subjected. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability in a particular domain/task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. There are many evaluation metrics that have been used in the literature such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these different metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is very critical since the choice can introduce positive or negative bias in the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies. We analyze the biases introduced by these by measuring the performance of an existing anomaly detection algorithm.
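
    For reference, the sketch below computes the ROC and precision-recall metrics mentioned above with scikit-learn; the frame-level labels and detector scores are made-up placeholders, not data from the paper.

      import numpy as np
      from sklearn.metrics import roc_curve, auc, precision_recall_curve, average_precision_score

      # Hypothetical frame-level ground truth (1 = anomaly) and detector scores.
      y_true = np.array([0, 0, 1, 0, 1, 1, 0, 0, 1, 0])
      scores = np.array([0.1, 0.3, 0.8, 0.2, 0.6, 0.9, 0.4, 0.1, 0.7, 0.3])

      fpr, tpr, _ = roc_curve(y_true, scores)
      prec, rec, _ = precision_recall_curve(y_true, scores)
      print("AUC:", auc(fpr, tpr), "AP:", average_precision_score(y_true, scores))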

  6. USBeSafe: Applying One Class SVM for Effective USB Event Anomaly Detection

    DTIC Science & Technology

    2016-04-25

    as the attack hides in plain sight. In this thesis, we present USBeSafe as a first-of-its-kind machine learning-based anomaly detection framework ... machine learning techniques, specifically one-class support vector machines, to create an offline USB event anomaly detection system that serves as the basis ...
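
    A hedged sketch of the one-class SVM step named in the fragment above follows; the USB-event feature set, its dimensionality, and the nu/gamma settings are placeholders, since the thesis text is not available here.

      import numpy as np
      from sklearn.svm import OneClassSVM
      from sklearn.preprocessing import StandardScaler

      # Hypothetical feature vectors for benign USB event sequences
      # (e.g., inter-arrival time, transfer type, payload length) -- placeholders only.
      benign = np.random.default_rng(0).normal(size=(500, 3))

      scaler = StandardScaler().fit(benign)
      ocsvm = OneClassSVM(kernel="rbf", nu=0.01, gamma="scale").fit(scaler.transform(benign))

      new_events = np.random.default_rng(1).normal(size=(5, 3))
      print(ocsvm.predict(scaler.transform(new_events)))   # -1 = anomalous, +1 = consistent with training data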

  7. Anomalies

    SciTech Connect

    Bardeen, W.A.

    1985-08-01

    Anomalies have a diverse impact on many aspects of physical phenomena. The role of anomalies in determining physical structure from the amplitude for decay to the foundations of superstring theory will be reviewed. 36 refs.

  8. Cross correlation anomaly detection system

    NASA Technical Reports Server (NTRS)

    Micka, E. Z. (Inventor)

    1975-01-01

    This invention provides a method for automatically inspecting the surface of an object, such as an integrated circuit chip, whereby the data obtained from the light reflected from the surface by a scanning light beam are automatically compared with data representing acceptable values for each unique surface. A signal output is provided indicating acceptance or rejection of the chip. Acceptance is based on predetermined statistical confidence intervals calculated from known good regions of the object being tested, or their representative values. The method can utilize a known good chip, a photographic mask from which the I.C. was fabricated, or a computer stored replica of each pattern being tested.

  9. Gravitational anomalies in the solar system?

    NASA Astrophysics Data System (ADS)

    Iorio, Lorenzo

    2015-02-01

    Mindful of the anomalous perihelion precession of Mercury discovered by Le Verrier in the second half of the nineteenth century and its successful explanation by Einstein with his General Theory of Relativity in the early years of the twentieth century, discrepancies between observed effects in our Solar system and their theoretical predictions on the basis of the currently accepted laws of gravitation applied to known matter-energy distributions have the potential of paving the way for remarkable advances in fundamental physics. This is particularly important now more than ever, given that most of the universe seems to be made of unknown substances dubbed Dark Matter and Dark Energy. Should this not be directly the case, Solar system anomalies could anyhow lead to advancements either in cumulative science, as shown to us by the discovery of Neptune in the first half of the nineteenth century, or in technology itself. Moreover, investigations in one of these directions can serendipitously enrich the other one as well. The current status of some alleged gravitational anomalies in the Solar system is critically reviewed. They are: (a) Possible anomalous advances of planetary perihelia. (b) Unexplained orbital residuals of a recently discovered moon of Uranus (Mab). (c) The lingering unexplained secular increase of the eccentricity of the orbit of the Moon. (d) The so-called Faint Young Sun Paradox. (e) The secular decrease of the mass parameter of the Sun. (f) The Flyby Anomaly. (g) The Pioneer Anomaly. (h) The anomalous secular increase of the astronomical unit.

  10. Anomalies.

    ERIC Educational Resources Information Center

    Online-Offline, 1999

    1999-01-01

    This theme issue on anomalies includes Web sites, CD-ROMs and software, videos, books, and additional resources for elementary and junior high school students. Pertinent activities are suggested, and sidebars discuss UFOs, animal anomalies, and anomalies from nature; and resources covering unexplained phenomena such as crop circles, Easter Island,…

  11. Anomalies.

    ERIC Educational Resources Information Center

    Online-Offline, 1999

    1999-01-01

    This theme issue on anomalies includes Web sites, CD-ROMs and software, videos, books, and additional resources for elementary and junior high school students. Pertinent activities are suggested, and sidebars discuss UFOs, animal anomalies, and anomalies from nature; and resources covering unexplained phenomena such as crop circles, Easter Island,…

  12. Visual analytics of anomaly detection in large data streams

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.; Sharma, Ratnesh K.; Mehta, Abhay

    2009-01-01

    Most data streams usually are multi-dimensional, high-speed, and contain massive volumes of continuous information. They are seen in daily applications, such as telephone calls, retail sales, data center performance, and oil production operations. Many analysts want insight into the behavior of this data. They want to catch the exceptions in flight to reveal the causes of the anomalies and to take immediate action. To guide the user in finding the anomalies in the large data stream quickly, we derive a new automated neighborhood threshold marking technique, called AnomalyMarker. This technique is built on cell-based data streams and user-defined thresholds. We extend the scope of the data points around the threshold to include the surrounding areas. The idea is to define a focus area (marked area) which enables users to (1) visually group the interesting data points related to the anomalies (i.e., problems that occur persistently or occasionally) for observing their behavior; (2) discover the factors related to the anomaly by visualizing the correlations between the problem attribute with the attributes of the nearby data items from the entire multi-dimensional data stream. Mining results are quickly presented in graphical representations (i.e., tooltip) for the user to zoom into the problem regions. Different algorithms are introduced which try to optimize the size and extent of the anomaly markers. We have successfully applied this technique to detect data stream anomalies in large real-world enterprise server performance and data center energy management.

  13. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines.

    PubMed

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-04-29

    In a complex system, condition monitoring (CM) can collect the system working status. The condition is mainly sensed by the pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict the upcoming anomalies, faults, or failures. There is also some research which focuses on the faults or anomalies of the sensing element (i.e., sensor) to enhance the system reliability. However, existing approaches ignore the correlation between sensor selecting strategy and data anomaly detection, which can also improve the system reliability. To address this issue, we study a new scheme which includes sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets that are utilized to carry out the evaluation are provided by National Aeronautics and Space Administration (NASA) Ames Research Center and have been used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing the two different sensor selection strategies, the effectiveness of the selection method for data anomaly detection is demonstrated.

  14. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines

    PubMed Central

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-01-01

    In a complex system, condition monitoring (CM) can collect the system working status. The condition is mainly sensed by the pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict the upcoming anomalies, faults, or failures. There is also some research which focuses on the faults or anomalies of the sensing element (i.e., sensor) to enhance the system reliability. However, existing approaches ignore the correlation between sensor selecting strategy and data anomaly detection, which can also improve the system reliability. To address this issue, we study a new scheme which includes sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets that are utilized to carry out the evaluation are provided by National Aeronautics and Space Administration (NASA) Ames Research Center and have been used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing the two different sensor selection strategies, the effectiveness of the selection method for data anomaly detection is demonstrated. PMID:27136561
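
    The sketch below illustrates, under assumptions, the combination described above: mutual information weights the relevance of the supporting sensors to a monitored sensor, Gaussian Process Regression predicts the monitored sensor, and readings whose residual exceeds a multiple of the predictive standard deviation are flagged. The weighting-by-multiplication scheme, kernel choice, and threshold are illustrative, not the paper's exact formulation.

      import numpy as np
      from sklearn.feature_selection import mutual_info_regression
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      def detect_sensor_anomalies(X_train, y_train, X_test, y_test, k_sigma=3.0):
          # Weight the supporting sensors by mutual information with the monitored sensor,
          # predict the monitored sensor with GPR, and flag readings with large residuals.
          X_train, X_test = np.asarray(X_train, float), np.asarray(X_test, float)
          mi = mutual_info_regression(X_train, y_train)
          w = mi / (mi.sum() + 1e-12)
          kernel = RBF(length_scale=1.0) + WhiteKernel()
          gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train * w, y_train)
          pred, std = gpr.predict(X_test * w, return_std=True)
          return np.abs(np.asarray(y_test, float) - pred) > k_sigma * std   # True where anomalous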

  15. A Robust Method to Detect BeiDou Navigation Satellite System Orbit Maneuvering/Anomalies and Its Applications to Precise Orbit Determination

    PubMed Central

    Ye, Fei; Yuan, Yunbin; Tan, Bingfeng; Ou, Jikun

    2017-01-01

    The failure to detect anomalies and maneuvering of the orbits of navigation satellite sensors will deteriorate the performance of positioning and orbit determination. Motivated by the influence of the frequent maneuvering of BDS GEO and IGSO satellites, this paper analyzes the limitations of existing methods for detecting BDS orbit maneuvering and anomalies, and develops a method to solve this problem based on the RMS model of orbit mutual differences proposed in this paper. The performance of this method was assessed by comparison with the health flag of broadcast ephemeris, precise orbit products of GFZ, the O-C values of a GNSS station and a conventional method. The results show that the performance of the method developed in this paper is better than that of the conventional method when the periodicity and trend items are obvious. Meanwhile, three additional verification results show that the method developed in this paper can find error information in the merged broadcast ephemeris provided by iGMAS. Furthermore, the testing results show that, based on the robust thresholds constructed in this paper, the detection of anomalies and of maneuvering does not affect each other. In addition, the precise orbits of the maneuvering satellites can be determined when the detected maneuver information is used; the root mean square (RMS) values of the orbit overlap comparison for GEO-03/IGSO-03 in Radial, Along, Cross and 1D-RMS are 0.7614/0.4460 m, 1.8901/0.3687 m, 0.3392/0.2069 m and 2.0657/0.6145 m, respectively. PMID:28509847

  16. A Robust Method to Detect BeiDou Navigation Satellite System Orbit Maneuvering/Anomalies and Its Applications to Precise Orbit Determination.

    PubMed

    Ye, Fei; Yuan, Yunbin; Tan, Bingfeng; Ou, Jikun

    2017-05-16

    The failure to detect anomalies and maneuvering of the orbits of navigation satellite sensors will deteriorate the performance of positioning and orbit determination. Motivated by the influence of the frequent maneuvering of BDS GEO and IGSO satellites, this paper analyzes the limitations of existing methods for detecting BDS orbit maneuvering and anomalies, and develops a method to solve this problem based on the RMS model of orbit mutual differences proposed in this paper. The performance of this method was assessed by comparison with the health flag of broadcast ephemeris, precise orbit products of GFZ, the O-C values of a GNSS station and a conventional method. The results show that the performance of the method developed in this paper is better than that of the conventional method when the periodicity and trend items are obvious. Meanwhile, three additional verification results show that the method developed in this paper can find error information in the merged broadcast ephemeris provided by iGMAS. Furthermore, the testing results show that, based on the robust thresholds constructed in this paper, the detection of anomalies and of maneuvering does not affect each other. In addition, the precise orbits of the maneuvering satellites can be determined when the detected maneuver information is used; the root mean square (RMS) values of the orbit overlap comparison for GEO-03/IGSO-03 in Radial, Along, Cross and 1D-RMS are 0.7614/0.4460 m, 1.8901/0.3687 m, 0.3392/0.2069 m and 2.0657/0.6145 m, respectively.

  17. Anomaly Detection In Additively Manufactured Parts Using Laser Doppler Vibrometery

    SciTech Connect

    Hernandez, Carlos A.

    2015-09-29

    Additively manufactured parts are susceptible to non-uniform structure caused by the unique manufacturing process. This can lead to structural weakness or catastrophic failure. Using laser Doppler vibrometry and frequency response analysis, non-contact detection of anomalies in additively manufactured parts may be possible. Preliminary tests show promise for small scale detection, but more future work is necessary.

  18. Anomaly detection of blast furnace condition using tuyere cameras

    NASA Astrophysics Data System (ADS)

    Yamahira, Naoshi; Hirata, Takehide; Tsuda, Kazuro; Morikawa, Yasuyuki; Takata, Yousuke

    2016-09-01

    We present a method of anomaly detection using multivariate statistical process control (MSPC) to detect the abnormal behavior of a blast furnace. Tuyere cameras attached circumferentially at the lower side of a blast furnace are used to monitor the inside of the furnace, and this method extracts abnormal behavior from the image intensities. It is confirmed that our method detects abnormalities earlier than operators notice them. Moreover, misalignment of the cameras does not affect detection performance, which is an important property in actual use.

  19. Dependence-Based Anomaly Detection Methodologies

    DTIC Science & Technology

    2012-08-16

    tricks the user into entering their Netflix login. Detecting it is out of our scope and requires site authentication (i.e., certification verification) and user education. The preliminary

  20. [Anomaly Detection of Multivariate Time Series Based on Riemannian Manifolds].

    PubMed

    Xu, Yonghong; Hou, Xiaoying; Li, Shuting; Cui, Jie

    2015-06-01

    Multivariate time series problems exist widely in production and everyday life. Anomaly detection has provided valuable information in financial, hydrological and meteorological fields, as well as in research areas such as earthquake analysis, video surveillance and medicine. In order to quickly and efficiently find exceptions in a time sequence and present them in an intuitive way, in this study we combined Riemannian manifolds with statistical process control charts: based on a sliding window, the covariance matrix is used as the description of the time sequence, achieving anomaly detection in multivariate time series together with its visualization. We used MA simulated data streams and abnormal electrocardiogram data from MIT-BIH as experimental objects to verify the anomaly detection method. The results showed that the method was reasonable and effective.
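
    A small sketch of the idea, under assumptions, follows: each sliding window of the multivariate series is described by its covariance matrix, consecutive covariance matrices are compared with the affine-invariant Riemannian distance, and a Shewhart-style control limit flags abrupt changes. The specific distance and control-chart rule are illustrative choices, not necessarily those of the paper.

      import numpy as np
      from scipy.linalg import eigvalsh

      def riemann_dist(A, B):
          # Affine-invariant Riemannian distance between SPD matrices A and B.
          ev = eigvalsh(B, A)                  # generalized eigenvalues, i.e. eig(inv(A) @ B)
          return np.sqrt(np.sum(np.log(ev) ** 2))

      def window_cov_distances(X, win=50, step=10):
          # Slide a window over a (time x channels) series, describe each window by its
          # covariance matrix, and return Riemannian distances between consecutive windows.
          covs = [np.cov(X[i:i + win].T) + 1e-6 * np.eye(X.shape[1])
                  for i in range(0, len(X) - win + 1, step)]
          return np.array([riemann_dist(covs[i - 1], covs[i]) for i in range(1, len(covs))])

      def control_chart_flags(dists, k=3.0):
          mu, sigma = dists.mean(), dists.std()
          return dists > mu + k * sigma        # windows whose change exceeds the control limit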

  1. Identifying Threats Using Graph-based Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Eberle, William; Holder, Lawrence; Cook, Diane

    Much of the data collected during the monitoring of cyber and other infrastructures is structural in nature, consisting of various types of entities and relationships between them. The detection of threatening anomalies in such data is crucial to protecting these infrastructures. We present an approach to detecting anomalies in a graph-based representation of such data that explicitly represents these entities and relationships. The approach consists of first finding normative patterns in the data using graph-based data mining and then searching for small, unexpected deviations from these normative patterns, assuming illicit behavior tries to mimic legitimate, normative behavior. The approach is evaluated using several synthetic and real-world datasets. Results show that the approach has high true-positive rates, low false-positive rates, and is capable of detecting complex structural anomalies in real-world domains including email communications, cellphone calls and network traffic.

  2. Symbolic Representation of Electromagnetic Data for Seismic Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Christodoulou, Vyron; Bi, Yaxin; Wilkie, George; Zhao, Guoze

    2016-08-01

    In this work we investigate the use of symbolic representation methods for anomaly detection in different electromagnetic sequential time series datasets. An issue that is often overlooked regarding symbolic representation and its performance in anomaly detection is the use of a quantitative accuracy metric. Until recently, only visual representations have been used to show the efficiency of an algorithm to detect anomalies. In this respect we propose a novel accuracy metric that takes into account the length of the sliding window of such symbolic representation algorithms, and we present its utility. For the evaluation of the accuracy metric, HOT-SAX is used, a method that aggregates data points by use of sliding windows. A HOT-SAX variant, with the use of overlapping windows, is also introduced that achieves better results based on the newly defined accuracy metric. Both algorithms are evaluated on ten benchmark and real terrestrial and satellite datasets.

  3. Near-Real Time Anomaly Detection for Scientific Sensor Data

    NASA Astrophysics Data System (ADS)

    Gallegos, I.; Gates, A.; Tweedie, C. E.; goswami, S.; Jaimes, A.; Gamon, J. A.

    2011-12-01

    Environmental scientists use advanced sensor technology such as meteorological towers, wireless sensor networks and robotic trams equipped with sensors to perform data collection at remote research sites. Because the amount of environmental sensor data acquired in real time by such instruments is increasing, the ability to evaluate the accuracy of the data in near real time and to check that the instrumentation is operating correctly are both critical in order not to lose valuable time and information. The goal of the research is to define a software engineering-based solution that provides the foundation for defining reusable templates for formally specifying data properties and for automatically generating code that can monitor data streams to identify anomalies in near real time. The research effort has resulted in a data property categorization that is based on a literature survey of 15 projects that collected environmental data from sensors and a case study conducted in the Arctic. More than 500 published data properties were manually extracted and analyzed from the surveyed projects. The data property categorization revealed recurrent data patterns. Using these patterns and the Specification and Pattern System (SPS) from the software-engineering community as a model, we developed the Data Specification and Pattern System (D-SPS) to capture data properties. D-SPS is the foundation for the Data Property Specification (DaProS) prototype tool that assists scientists in specification of sensor data properties. A series of experiments has been conducted in collaboration with experts working with Eddy covariance (EC) data from the Jornada Basin Experimental Range (JER) and with hyper-spectral data collected using robotic tram systems from the Arctic. The goal of the experiments was to determine whether the approach is effective for specifying data properties and identifying anomalies in sensor data. A complementary Sensor Data

  4. Multivariate anomaly detection for Earth observations: a comparison of algorithms and feature extraction techniques

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Gans, Fabian; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus; Rodner, Erik; Bathiany, Sebastian; Bodesheim, Paul; Guanche, Yanira; Sippel, Sebastian; Mahecha, Miguel D.

    2017-08-01

    Today, many processes at the Earth's surface are constantly monitored by multiple data streams. These observations have become central to advancing our understanding of vegetation dynamics in response to climate or land use change. Another set of important applications is monitoring effects of extreme climatic events, other disturbances such as fires, or abrupt land transitions. One important methodological question is how to reliably detect anomalies in an automated and generic way within multivariate data streams, which typically vary seasonally and are interconnected across variables. Although many algorithms have been proposed for detecting anomalies in multivariate data, only a few have been investigated in the context of Earth system science applications. In this study, we systematically combine and compare feature extraction and anomaly detection algorithms for detecting anomalous events. Our aim is to identify suitable workflows for automatically detecting anomalous patterns in multivariate Earth system data streams. We rely on artificial data that mimic typical properties and anomalies in multivariate spatiotemporal Earth observations like sudden changes in basic characteristics of time series such as the sample mean, the variance, changes in the cycle amplitude, and trends. This artificial experiment is needed as there is no gold standard for the identification of anomalies in real Earth observations. Our results show that a well-chosen feature extraction step (e.g., subtracting seasonal cycles, or dimensionality reduction) is more important than the choice of a particular anomaly detection algorithm. Nevertheless, we identify three detection algorithms (k-nearest neighbors mean distance, kernel density estimation, a recurrence approach) and their combinations (ensembles) that outperform other multivariate approaches as well as univariate extreme-event detection methods. Our results therefore provide an effective workflow to automatically detect
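
    The sketch below illustrates one of the better-performing workflows reported above, under simplifying assumptions: the feature extraction step subtracts a mean seasonal cycle, and the k-nearest-neighbours mean distance is used as the anomaly score; window lengths, k, and the climatology scheme are placeholders.

      import numpy as np
      from sklearn.neighbors import NearestNeighbors

      def deseasonalize(X, period=365):
          # Subtract the mean seasonal cycle (feature extraction step) from a (time x variables)
          # array; assumes at least one full period of data is available.
          X = np.asarray(X, dtype=float)
          doy = np.arange(len(X)) % period
          clim = np.array([X[doy == d].mean(axis=0) for d in range(period)])
          return X - clim[doy]

      def knn_mean_distance_score(X, k=10):
          # Anomaly score = mean distance to the k nearest neighbours (excluding the point itself).
          nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
          dist, _ = nn.kneighbors(X)
          return dist[:, 1:].mean(axis=1)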

  5. Unsupervised Topic Discovery by Anomaly Detection

    DTIC Science & Technology

    2013-09-01

  6. Remote detection of geobotanical anomalies associated with hydrocarbon microseepage

    NASA Technical Reports Server (NTRS)

    Rock, B. N.

    1985-01-01

    As part of the continuing study of the Lost River, West Virginia NASA/Geosat Test Case Site, an extensive soil gas survey of the site was conducted during the summer of 1983. This soil gas survey has identified an order of magnitude methane, ethane, propane, and butane anomaly that is precisely coincident with the linear maple anomaly reported previously. This and other maple anomalies were previously suggested to be indicative of anaerobic soil conditions associated with hydrocarbon microseepage. In vitro studies support the view that anomalous distributions of native tree species tolerant of anaerobic soil conditions may be useful indicators of methane microseepage in heavily vegetated areas of the United States characterized by deciduous forest cover. Remote sensing systems which allow discrimination and mapping of native tree species and/or species associations will provide the exploration community with a means of identifying vegetation distributional anomalies indicative of microseepage.

  7. Remote detection of geobotanical anomalies associated with hydrocarbon microseepage

    NASA Technical Reports Server (NTRS)

    Rock, B. N.

    1985-01-01

    As part of the continuing study of the Lost River, West Virginia NASA/Geosat Test Case Site, an extensive soil gas survey of the site was conducted during the summer of 1983. This soil gas survey has identified an order of magnitude methane, ethane, propane, and butane anomaly that is precisely coincident with the linear maple anomaly reported previously. This and other maple anomalies were previously suggested to be indicative of anaerobic soil conditions associated with hydrocarbon microseepage. In vitro studies support the view that anomalous distributions of native tree species tolerant of anaerobic soil conditions may be useful indicators of methane microseepage in heavily vegetated areas of the United States characterized by deciduous forest cover. Remote sensing systems which allow discrimination and mapping of native tree species and/or species associations will provide the exploration community with a means of identifying vegetation distributional anomalies indicative of microseepage.

  8. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

    Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process at near real-time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system’s source code; however, anomaly detection properties, which often are embedded or hard-coded in the source code, are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly detection property or a modification to an existing property is needed. This poster describes the tool support that has been developed, based on software engineering techniques, to address these challenges. The overall tool support allows scientists to specify and reuse anomaly detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection at near-real time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. With guidance provided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that otherwise would have limited availability to the scientific community.

  9. The use of Compton scattering in detecting anomaly in soil-possible use in pyromaterial detection

    NASA Astrophysics Data System (ADS)

    Abedin, Ahmad Firdaus Zainal; Ibrahim, Noorddin; Zabidi, Noriza Ahmad; Demon, Siti Zulaikha Ngah

    2016-01-01

    Compton scattering can provide a signature for land mine detection based on the dependence of the scattered-photon energy change on density anomalies. In this study, 4.43 MeV gamma rays from an Am-Be source were used to perform Compton scattering. Two detectors with a radius of 1.9 cm were placed at a distance of 8 cm from the source. Thallium-doped sodium iodide NaI(Tl) detectors were used for detecting the gamma rays. Nine anomalies were used in this simulation. Each anomaly is a cylinder with a radius of 10 cm and a height of 8.9 cm, buried 5 cm deep in a soil bed measuring 80 cm in radius and 53.5 cm in height. Monte Carlo simulations indicated that the scattering of photons is directly proportional to the density of the anomalies. The difference between the detector response with and without an anomaly, namely the contrast ratio, has a linear relationship with the anomaly density. Anomalies of air, wood and water give positive contrast ratio values, whereas explosive, sand, concrete, graphite, limestone and polyethylene give negative contrast ratio values. Overall, the contrast ratio values are greater than 2% for all anomalies. The strong contrast ratios result in good detection capability and distinction between anomalies.

  10. The use of Compton scattering in detecting anomaly in soil-possible use in pyromaterial detection

    SciTech Connect

    Abedin, Ahmad Firdaus Zainal; Ibrahim, Noorddin; Zabidi, Noriza Ahmad; Demon, Siti Zulaikha Ngah

    2016-01-22

    Compton scattering can provide a signature for land mine detection based on the dependence of the scattered-photon energy change on density anomalies. In this study, 4.43 MeV gamma rays from an Am-Be source were used to perform Compton scattering. Two detectors with a radius of 1.9 cm were placed at a distance of 8 cm from the source. Thallium-doped sodium iodide NaI(Tl) detectors were used for detecting the gamma rays. Nine anomalies were used in this simulation. Each anomaly is a cylinder with a radius of 10 cm and a height of 8.9 cm, buried 5 cm deep in a soil bed measuring 80 cm in radius and 53.5 cm in height. Monte Carlo simulations indicated that the scattering of photons is directly proportional to the density of the anomalies. The difference between the detector response with and without an anomaly, namely the contrast ratio, has a linear relationship with the anomaly density. Anomalies of air, wood and water give positive contrast ratio values, whereas explosive, sand, concrete, graphite, limestone and polyethylene give negative contrast ratio values. Overall, the contrast ratio values are greater than 2% for all anomalies. The strong contrast ratios result in good detection capability and distinction between anomalies.

  11. Compressive Hyperspectral Imaging and Anomaly Detection

    DTIC Science & Technology

    2013-03-01

    ... a simple, yet effective method of using the spatial information to increase the accuracy of target detection: the idea is to apply TV denoising [4], by which isolated false alarm pixels are usually eliminated. In the total variation denoising model [4], given an image I ∈ R², an L1 minimization problem is solved to denoise the image.
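
    As an illustration of the TV-denoising idea in the fragment above, the sketch below smooths a detection score map with scikit-image's Chambolle total-variation denoiser so that isolated false-alarm pixels are suppressed before thresholding; it does not reproduce the report's exact L1 minimization model, and the score map, weight, and threshold are synthetic placeholders.

      import numpy as np
      from skimage.restoration import denoise_tv_chambolle

      # Hypothetical per-pixel detection score map (larger = more target-like).
      scores = np.random.default_rng(0).random((64, 64))
      scores[30, 30] = 1.0                                   # an isolated false-alarm pixel

      smoothed = denoise_tv_chambolle(scores, weight=0.2)    # TV denoising flattens isolated spikes
      detections = smoothed > 0.8                            # threshold the smoothed map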

  12. Radio Frequency Based Programmable Logic Controller Anomaly Detection

    DTIC Science & Technology

    2013-09-01

  13. Hyperspectral anomaly detection using Sony PlayStation 3

    NASA Astrophysics Data System (ADS)

    Rosario, Dalton; Romano, João; Sepulveda, Rene

    2009-05-01

    We present a proof-of-principle demonstration using Sony's IBM Cell processor-based PlayStation 3 (PS3) to run, in near real time, a hyperspectral anomaly detection algorithm (HADA) on real hyperspectral (HS) long-wave infrared imagery. The PS3 console proved to be ideal for doing precisely the kind of heavy computational lifting HS-based algorithms require, and the fact that it is a relatively open platform makes programming scientific applications feasible. The PS3 HADA is a unique parallel, random-sampling-based anomaly detection approach that does not require prior spectra of the clutter background. The PS3 HADA is designed to handle known underlying difficulties (e.g., target shape/scale uncertainties) often ignored in the development of autonomous anomaly detection algorithms. The effort is part of an ongoing cooperative contribution between the Army Research Laboratory and the Army's Armament, Research, Development and Engineering Center, which aims at demonstrating performance of innovative algorithmic approaches for applications requiring autonomous anomaly detection using passive sensors.

  14. Robust and efficient anomaly detection using heterogeneous representations

    NASA Astrophysics Data System (ADS)

    Hu, Xing; Hu, Shiqiang; Xie, Jinhua; Zheng, Shiyou

    2015-05-01

    Various approaches have been proposed for video anomaly detection. Yet these approaches typically suffer from one or more limitations: they often characterize the pattern using its internal information, but ignore its external relationship which is important for local anomaly detection. Moreover, the high-dimensionality and the lack of robustness of pattern representation may lead to problems, including overfitting, increased computational cost and memory requirements, and high false alarm rate. We propose a video anomaly detection framework which relies on a heterogeneous representation to account for both the pattern's internal information and external relationship. The internal information is characterized by slow features learned by slow feature analysis from low-level representations, and the external relationship is characterized by the spatial contextual distances. The heterogeneous representation is compact, robust, efficient, and discriminative for anomaly detection. Moreover, both the pattern's internal information and external relationship can be taken into account in the proposed framework. Extensive experiments demonstrate the robustness and efficiency of our approach by comparison with the state-of-the-art approaches on the widely used benchmark datasets.

  15. SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) 2013

    SciTech Connect

    Gordon Rueff; Lyle Roybal; Denis Vollmer

    2013-01-01

    There is a significant need to protect the nation’s energy infrastructures from malicious actors using cyber methods. Supervisory, Control, and Data Acquisition (SCADA) systems may be vulnerable due to the insufficient security implemented during the design and deployment of these control systems. This is particularly true in older legacy SCADA systems that are still commonly in use. The purpose of INL’s research on the SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) project was to determine if and how data compression techniques could be used to identify and protect SCADA systems from cyber attacks. Initially, the concept was centered on how to train a compression algorithm to recognize normal control system traffic versus hostile network traffic. Because large portions of the TCP/IP message traffic (called packets) are repetitive, the concept of using compression techniques to differentiate “non-normal” traffic was proposed. In this manner, malicious SCADA traffic could be identified at the packet level prior to completing its payload. Previous research has shown that SCADA network traffic has traits desirable for compression analysis. This work investigated three different approaches to identify malicious SCADA network traffic using compression techniques. The preliminary analyses and results presented herein are clearly able to differentiate normal from malicious network traffic at the packet level at a very high confidence level for the conditions tested. Additionally, the master dictionary approach used in this research appears to initially provide a meaningful way to categorize and compare packets within a communication channel.
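
    The SPADUC algorithms themselves are not detailed in this abstract; the sketch below only illustrates the general idea of compression-based similarity, using zlib with a preset dictionary built from normal packets so that traffic unlike the dictionary compresses comparatively poorly. The dictionary contents and the interpretation of the ratio are placeholders, not the project's actual method.

      import zlib

      def compression_score(packet: bytes, dictionary: bytes) -> float:
          # Ratio of compressed size with vs. without a dictionary of normal traffic;
          # packets similar to the dictionary compress much better (lower ratio).
          comp = zlib.compressobj(level=9, zdict=dictionary)
          with_dict = comp.compress(packet) + comp.flush()
          return len(with_dict) / max(len(zlib.compress(packet, 9)), 1)

      normal = b"\x03\x00\x00\x16READ_HOLDING_REGISTERS" * 50       # placeholder "master dictionary"
      print(compression_score(b"\x03\x00\x00\x16READ_HOLDING_REGISTERS", normal))  # low ratio
      print(compression_score(b"\xde\xad\xbe\xef" * 8, normal))                     # higher ratio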

  16. Solar cell anomaly detection method and apparatus

    NASA Technical Reports Server (NTRS)

    Miller, Emmett L. (Inventor); Shumka, Alex (Inventor); Gauthier, Michael K. (Inventor)

    1981-01-01

    A method is provided for detecting cracks and other imperfections in a solar cell, which includes scanning a narrow light beam back and forth across the cell in a raster pattern, while monitoring the electrical output of the cell to find locations where the electrical output varies significantly. The electrical output can be monitored on a television type screen containing a raster pattern with each point on the screen corresponding to a point on the solar cell surface, and with the brightness of each point on the screen corresponding to the electrical output from the cell which was produced when the light beam was at the corresponding point on the cell. The technique can be utilized to scan a large array of interconnected solar cells, to determine which ones are defective.

  17. A spring window for geobotanical anomaly detection

    NASA Technical Reports Server (NTRS)

    Bell, R.; Labovitz, M. L.; Masuoka, E. J.

    1985-01-01

    The observation of senescence of deciduous vegetation to detect soil heavy metal mineralization is discussed. A gridded sampling of two sites of Quercus alba L. in south-central Virginia in 1982 is studied. The data reveal that smaller leaf blade lengths are observed in the soil site with copper, lead, and zinc concentrations. A random study in 1983 of red and white Q. rubra L., Q. prinus L., and Acer rubrum L., to confirm previous results is described. The observations of blade length and bud breaks show a 7-10 day lag in growth in the mineral site for the oak trees; however, the maple trees are not influenced by the minerals.

  18. A spring window for geobotanical anomaly detection

    NASA Technical Reports Server (NTRS)

    Bell, R.; Labovitz, M. L.; Masuoka, E. J.

    1985-01-01

    The observation of senescence of deciduous vegetation to detect soil heavy metal mineralization is discussed. A gridded sampling of two sites of Quercus alba L. in south-central Virginia in 1982 is studied. The data reveal that smaller leaf blade lengths are observed in the soil site with copper, lead, and zinc concentrations. A random study in 1983 of red and white Q. rubra L., Q. prinus L., and Acer rubrum L., to confirm previous results is described. The observations of blade length and bud breaks show a 7-10 day lag in growth in the mineral site for the oak trees; however, the maple trees are not influenced by the minerals.

  19. Sensor Anomaly Detection in Wireless Sensor Networks for Healthcare

    PubMed Central

    Haque, Shah Ahsanul; Rahman, Mustafizur; Aziz, Syed Mahfuzul

    2015-01-01

    Wireless Sensor Networks (WSN) are vulnerable to various sensor faults and faulty measurements. This vulnerability hinders efficient and timely response in various WSN applications, such as healthcare. For example, faulty measurements can create false alarms which may require unnecessary intervention from healthcare personnel. Therefore, an approach to differentiate between real medical conditions and false alarms will improve remote patient monitoring systems and quality of healthcare service afforded by WSN. In this paper, a novel approach is proposed to detect sensor anomaly by analyzing collected physiological data from medical sensors. The objective of this method is to effectively distinguish false alarms from true alarms. It predicts a sensor value from historic values and compares it with the actual sensed value for a particular instance. The difference is compared against a threshold value, which is dynamically adjusted, to ascertain whether the sensor value is anomalous. The proposed approach has been applied to real healthcare datasets and compared with existing approaches. Experimental results demonstrate the effectiveness of the proposed system, providing high Detection Rate (DR) and low False Positive Rate (FPR). PMID:25884786

  20. Sensor anomaly detection in wireless sensor networks for healthcare.

    PubMed

    Haque, Shah Ahsanul; Rahman, Mustafizur; Aziz, Syed Mahfuzul

    2015-04-15

    Wireless Sensor Networks (WSN) are vulnerable to various sensor faults and faulty measurements. This vulnerability hinders efficient and timely response in various WSN applications, such as healthcare. For example, faulty measurements can create false alarms which may require unnecessary intervention from healthcare personnel. Therefore, an approach to differentiate between real medical conditions and false alarms will improve remote patient monitoring systems and quality of healthcare service afforded by WSN. In this paper, a novel approach is proposed to detect sensor anomaly by analyzing collected physiological data from medical sensors. The objective of this method is to effectively distinguish false alarms from true alarms. It predicts a sensor value from historic values and compares it with the actual sensed value for a particular instance. The difference is compared against a threshold value, which is dynamically adjusted, to ascertain whether the sensor value is anomalous. The proposed approach has been applied to real healthcare datasets and compared with existing approaches. Experimental results demonstrate the effectiveness of the proposed system, providing high Detection Rate (DR) and low False Positive Rate (FPR).
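
    A minimal sketch of the predict-and-compare scheme described above follows; the paper's actual predictor is not specified here, so a moving-average prediction and a spread-based dynamic threshold are assumed purely for illustration.

      import numpy as np

      def flag_anomalies(readings, win=10, k=3.0, min_thresh=0.5):
          # Predict each sensor value from the mean of the previous `win` readings and flag it
          # when the residual exceeds a dynamically adjusted threshold (k x recent residual spread).
          readings = np.asarray(readings, dtype=float)
          flags = np.zeros(len(readings), dtype=bool)
          residuals = []
          for i in range(win, len(readings)):
              pred = readings[i - win:i].mean()
              resid = abs(readings[i] - pred)
              thresh = max(k * (np.std(residuals[-win:]) if residuals else 0.0), min_thresh)
              flags[i] = resid > thresh
              residuals.append(resid)
          return flags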

  1. A Program to Compute Magnetic Anomaly Detection Probabilities. Revision 2

    DTIC Science & Technology

    1990-03-01

    Washington, D.C., June 1964. 3. "Magnetic Airborne Detector Program," Summary Technical Report of Division 6, National Defense Research Committee, Vol. 5...AD-A225 427 NPS71-88-001 (Second Revision) NAVAL POSTGRADUATE SCHOOL Monterey, California A PROGRAM TO COMPUTE MAGNETIC ANOMALY...to Compute Magnetic Anomaly Detection Probabilities", was referred to its author for his comments. 2. It is the author’s opinion that the report

  2. An Anomaly Clock Detection Algorithm for a Robust Clock Ensemble

    DTIC Science & Technology

    2009-11-01

    41 st Annual Precise Time and Time Interval (PTTI) Meeting 121 AN ANOMALY CLOCK DETECTION ALGORITHM FOR A ROBUST CLOCK ENSEMBLE...clocks are in phase and on frequency all the time with advantages of relatively simple, robust, fully redundant, and improved performance. It allows...Algorithm parameters, such as the sliding window width as a function of the time constant, and the minimum detectable levels have been optimized and

  3. A Semiparametric Model for Hyperspectral Anomaly Detection

    DTIC Science & Technology

    2012-01-01

    known that the performance of kernel methods crucially depends on the kernel function and its parameter(s) [11]. More recently, Gurram and Kwon in [12...700 VNIR HS spectral imager, which is commercially available off the shelf. The system produces HS data cubes of fixed dimensions R = 640 by C = 640...window in X (a data cube). The data format of X is shown in (1), where r (r = 1, ..., R) and c (c = 1, ..., C) index pixels x_rc in the R × C spatial

  4. Gaussian Process for Activity Modeling and Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Liao, W.; Rosenhahn, B.; Yang, M. Ying

    2015-08-01

    Complex activity modeling and identification of anomalies is one of the most interesting and desired capabilities for automated video behavior analysis. A number of different approaches have been proposed in the past to tackle this problem. There are two main challenges for activity modeling and anomaly detection: 1) most existing approaches require sufficient data and supervision for learning; 2) the most interesting abnormal activities arise rarely and are ambiguous among typical activities, i.e., hard to define precisely. In this paper, we propose a novel approach to model complex activities and detect anomalies by using non-parametric Gaussian Process (GP) models in a crowded and complicated traffic scene. In comparison with parametric models such as the HMM, GP models are nonparametric and offer several advantages. Our GP models exploit implicit spatial-temporal dependence among local activity patterns. The learned GP regression models give a probabilistic prediction of regional activities at the next time interval based on present observations. An anomaly is detected by comparing the actual observations with the prediction in real time. We verify the effectiveness and robustness of the proposed model on the QMUL Junction Dataset. Furthermore, we provide a publicly available, manually labeled ground truth for this data set.
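
    As a rough illustration of the prediction-and-compare step (not the authors' implementation), a GP regression model can be trained to predict regional activity at the next interval from recent observations, and an interval flagged when the observation falls outside the model's predictive band; the lag, kernel, and 3-sigma rule below are assumptions, and scikit-learn's GaussianProcessRegressor stands in for the paper's GP models.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      # toy regional activity series (e.g. aggregated optical-flow magnitude per interval)
      rng = np.random.default_rng(0)
      t = np.arange(200, dtype=float)
      activity = np.sin(t / 10.0) + 0.1 * rng.standard_normal(200)
      activity[150] += 2.0                       # injected abnormal interval

      lag = 5   # predict activity at time t from the previous `lag` observations (assumed)
      X = np.array([activity[i - lag:i] for i in range(lag, len(activity))])
      y = activity[lag:]

      gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
      gp.fit(X[:100], y[:100])                   # train on the first part of the series

      mean, std = gp.predict(X[100:], return_std=True)
      # flag intervals where the observation falls outside the predictive 3-sigma band
      anomalous = np.abs(y[100:] - mean) > 3.0 * std
      print("anomalous intervals near samples:", np.where(anomalous)[0] + 100 + lag)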

  5. A Feasibility Study on the Application of the ScriptGenE Framework as an Anomaly Detection System in Industrial Control Systems

    DTIC Science & Technology

    2015-09-17

    not reflect the official policy or position of the United States Air Force, the United States Department of Defense or the United States Government...anomalies when the training data also contains anomalous behavior in Experiment 2. The ADS achieves true positive rate (TPR) of 0.9011 and false positive rate...EtherNet/IP EtherNet Industrial Protocol EWS engineering workstation FN false negative FP false positive FNR false negative rate FPR false positive rate FTP

  6. Limitations of Aneuploidy and Anomaly Detection in the Obese Patient.

    PubMed

    Zozzaro-Smith, Paula; Gray, Lisa M; Bacak, Stephen J; Thornburg, Loralei L

    2014-07-17

    Obesity is a worldwide epidemic and can have a profound effect on pregnancy risks. Obese patients tend to be older and are at increased risk for structural fetal anomalies and aneuploidy, making screening options critically important for these women. Failure rates for first-trimester nuchal translucency (NT) screening increase with obesity, while the ability to detect soft-markers declines, limiting ultrasound-based screening options. Obesity also decreases the chances of completing the anatomy survey and increases the residual risk of undetected anomalies. Additionally, non-invasive prenatal testing (NIPT) is less likely to provide an informative result in obese patients. Understanding the limitations and diagnostic accuracy of aneuploidy and anomaly screening in obese patients can help guide clinicians in counseling patients on the screening options.

  7. Anomaly Detection and Life Pattern Estimation for the Elderly Based on Categorization of Accumulated Data

    NASA Astrophysics Data System (ADS)

    Mori, Taketoshi; Ishino, Takahito; Noguchi, Hiroshi; Shimosaka, Masamichi; Sato, Tomomasa

    2011-06-01

    We propose a life pattern estimation method and an anomaly detection method for elderly people living alone. In our observation system for such people, we deploy pyroelectric sensors in the house and measure the person's activities continuously in order to grasp the person's life pattern. The data are transferred successively to the operation center and displayed to the nurses there, who then decide whether the data indicate an anomaly or not. In the system, people whose life patterns resemble each other are categorized into the same group. Anomalies that occurred in the past are shared within the group and utilized in the anomaly detection algorithm. This algorithm is based on an "anomaly score" computed from the person's activeness, which is approximately proportional to the frequency of sensor responses per minute. The anomaly score is calculated from the difference between the present activeness and its long-term average in the past. Thus, the score is positive if the present activeness is higher than the past average, and negative if it is lower. If the score exceeds a certain threshold, an anomaly event is declared. Moreover, we developed an activity estimation algorithm that estimates the residents' basic activities such as rising, going out, and so on. The estimates are shown to the nurses together with the residents' anomaly scores, so the nurses can understand the residents' health conditions by combining these two pieces of information.
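
    A minimal sketch of the described scoring rule, assuming activeness is measured as pyroelectric sensor responses per minute, that the long-term reference is an average over past days, and that scores are aggregated per hour; the alert threshold is an illustrative value, not the system's.

      import numpy as np

      def anomaly_score(counts_per_minute, past_days, block=60):
          # hourly activeness today minus its long-term average over past days
          present = np.asarray(counts_per_minute, float).reshape(-1, block).mean(axis=1)
          past = np.asarray(past_days, float).reshape(len(past_days), -1, block).mean(axis=2)
          return present - past.mean(axis=0)     # positive: more active than usual

      # toy usage: a quiet morning compared with 30 typical days
      past = np.random.poisson(3.0, size=(30, 1440))
      today = np.random.poisson(3.0, size=1440)
      today[480:600] = 0                         # no activity from 08:00 to 10:00
      score = anomaly_score(today, past)
      threshold = 1.5                            # illustrative alert level
      print("alert hours:", np.where(np.abs(score) > threshold)[0])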

  8. Anomaly detection for machine learning redshifts applied to SDSS galaxies

    NASA Astrophysics Data System (ADS)

    Hoyle, Ben; Rau, Markus Michael; Paech, Kerstin; Bonnett, Christopher; Seitz, Stella; Weller, Jochen

    2015-10-01

    We present an analysis of anomaly detection for machine learning redshift estimation. Anomaly detection allows the removal of poor training examples, which can adversely influence redshift estimates. Anomalous training examples may be photometric galaxies with incorrect spectroscopic redshifts, or galaxies with one or more poorly measured photometric quantities. We select 2.5 million `clean' SDSS DR12 galaxies with reliable spectroscopic redshifts, and 6730 `anomalous' galaxies with spectroscopic redshift measurements which are flagged as unreliable. We contaminate the clean base galaxy sample with galaxies with unreliable redshifts and attempt to recover the contaminating galaxies using the Elliptical Envelope technique. We then train four machine learning architectures for redshift analysis on both the contaminated sample and on the preprocessed `anomaly-removed' sample, and measure redshift statistics on a clean validation sample generated without any preprocessing. We find an improvement on all measured statistics of up to 80 per cent when training on the anomaly-removed sample as compared with training on the contaminated sample for each of the machine learning routines explored. We further describe a method to estimate the contamination fraction of a base data sample.
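
    A minimal sketch of the anomaly-removal step, assuming scikit-learn's EllipticEnvelope as the Elliptical Envelope technique; the synthetic magnitudes, the contamination value, and the five-column photometry layout are placeholders rather than the authors' settings.

      import numpy as np
      from sklearn.covariance import EllipticEnvelope

      # photometry: rows are galaxies, columns are e.g. u, g, r, i, z magnitudes (assumed)
      rng = np.random.default_rng(1)
      clean = rng.normal(loc=[19, 18, 17.5, 17.2, 17.0], scale=0.3, size=(5000, 5))
      contaminants = rng.uniform(low=14, high=24, size=(50, 5))
      photometry = np.vstack([clean, contaminants])

      # fit a robust elliptical envelope and keep only the inliers for training
      envelope = EllipticEnvelope(contamination=0.01, random_state=0)
      labels = envelope.fit_predict(photometry)      # +1 inlier, -1 outlier
      training_sample = photometry[labels == 1]
      print("removed", int((labels == -1).sum()), "suspected anomalous galaxies")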

  9. Security inspection in ports by anomaly detection using hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Rivera, Javier; Valverde, Fernando; Saldaña, Manuel; Manian, Vidya

    2013-05-01

    Applying hyperspectral imaging technology in port security is crucial for the detection of possible threats or illegal activities. One of the most common problems that cargo suffers is tampering. This represents a danger to society because it creates a channel to smuggle illegal and hazardous products. If a cargo is altered, security inspections of that cargo should reveal anomalies that indicate the nature of the tampering. Hyperspectral images can detect anomalies by gathering information through multiple electromagnetic bands. The spectra extracted from these bands can be used to detect surface anomalies from different materials. Based on this technology, a scenario was built in which a hyperspectral camera was used to inspect the cargo for any surface anomalies and a user interface shows the results. The spectra of items altered by different materials that can be used to conceal illegal products are analyzed and classified in order to provide information about the tampered cargo. The image is analyzed with a variety of techniques such as multiple feature extraction algorithms, autonomous anomaly detection, and target spectrum detection. The results are exported to a workstation or mobile device and shown in an easy-to-use interface. This process could enhance the current capabilities of security systems that are already implemented, providing a more complete approach to detecting threats and illegal cargo.

  10. An expert system for diagnosing anomalies of spacecraft

    NASA Technical Reports Server (NTRS)

    Lauriente, Michael; Durand, Rick; Vampola, AL; Koons, Harry C.; Gorney, David

    1994-01-01

    Although the analysis of anomalous behavior of satellites is difficult because it is a very complex process, it is important to be able to make an accurate assessment in a timely manner when an anomaly is observed. Spacecraft operators may have to take corrective action or to 'safe' the spacecraft; space-environment forecasters may have to assess the environmental situation and issue warnings and alerts regarding hazardous conditions; and scientists and engineers may want to gain knowledge for future designs to mitigate the problems. Anomalies can be hardware problems, software errors, environmentally induced effects, or even the result of poor workmanship. Spacecraft anomalies attributable to electrostatic discharges have been known to cause command errors. A goal is to develop an automated system based on this concept to reduce the number of personnel required to operate large programs or missions such as the Hubble Space Telescope (HST) and Mission to Planet Earth (MTPE). Although expert systems that detect anomalous behavior of satellites during operations are established, diagnosis of the anomaly is a complex procedure and is a new development.

  11. Change and Anomaly Detection in Real-Time GPS Data

    NASA Astrophysics Data System (ADS)

    Granat, R.; Pierce, M.; Gao, X.; Bock, Y.

    2008-12-01

    The California Real-Time Network (CRTN) is currently generating real-time GPS position data at a rate of 1-2 Hz at over 80 locations. The CRTN data presents the possibility of studying dynamical solid earth processes in a way that complements existing seismic networks. To realize this possibility we have developed a prototype system for detecting changes and anomalies in the real-time data. Through this system, we can correlate changes in multiple stations in order to detect signals with geographical extent. Our approach involves developing a statistical model for each GPS station in the network, and then using those models to segment the time series into a number of discrete states described by the model. We use a hidden Markov model (HMM) to describe the behavior of each station; fitting the model to the data requires neither labeled training examples nor a priori information about the system. As such, HMMs are well suited to this problem domain, in which the data remains largely uncharacterized. There are two main components to our approach. The first is the model fitting algorithm, regularized deterministic annealing expectation-maximization (RDAEM), which provides robust, high-quality results. The second is a web service infrastructure that connects the data to the statistical modeling analysis and allows us to easily present the results of that analysis through a web portal interface. This web service approach facilitates the automatic updating of station models to keep pace with dynamical changes in the data. Our web portal interface is critical to the process of interpreting the data. A Google Maps interface allows users to visually interpret state changes not only on individual stations but across the entire network. Users can drill down from the map interface to inspect detailed results for individual stations, download the time series data, and inspect fitted models. Alternatively, users can use the web portal to look at the evolution of changes on the
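
    The RDAEM fitting algorithm itself is not reproduced here; the sketch below fits a generic Gaussian HMM (via the hmmlearn package) to a toy displacement series and reports where the decoded state sequence changes, which is the kind of segmentation the abstract describes. The number of states and the synthetic offset are assumptions.

      import numpy as np
      from hmmlearn.hmm import GaussianHMM

      # toy 1 Hz east/north displacement series with a step-like offset (e.g. a slip event)
      rng = np.random.default_rng(2)
      quiet = rng.normal(0.0, 1.0, size=(3000, 2))
      offset = rng.normal(5.0, 1.0, size=(1000, 2))
      positions = np.vstack([quiet, offset, quiet])

      # fit a 2-state Gaussian HMM; each state models one background level of the station
      model = GaussianHMM(n_components=2, covariance_type="full", n_iter=50, random_state=0)
      model.fit(positions)
      states = model.predict(positions)

      # report where the decoded state sequence changes (candidate anomalies)
      changes = np.where(np.diff(states) != 0)[0] + 1
      print("state changes near samples:", changes[:10])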

  12. Clutter and anomaly removal for enhanced target detection

    NASA Astrophysics Data System (ADS)

    Basener, William F.

    2010-04-01

    In this paper we investigate the use of anomaly detection to identify pixels to be removed prior to covariance computation. The resulting covariance matrix provides a better model of the image background and is less likely to be tainted by target spectra. In our tests, this method results in robust improvement in target detection performance for quadratic detection algorithms. Tests are conducted using imagery and targets freely available online. The imagery was acquired over Cooke City, Montana, a small town near Yellowstone Park, using the HyMap V/NIR/SWIR sensor with 126 spectral bands. There are three vehicle and four fabric targets located in the town and surrounding area.
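
    A sketch of the two-pass idea under simple assumptions: a first-pass quadratic (Mahalanobis/RX-style) score identifies the most anomalous pixels, which are then excluded before recomputing the background mean and covariance; the 99th-percentile cutoff and the synthetic spectra are illustrative, not the paper's settings.

      import numpy as np

      def rx_scores(pixels, mean, cov):
          # quadratic (Mahalanobis) anomaly score for each spectral pixel
          inv_cov = np.linalg.inv(cov)
          centered = pixels - mean
          return np.einsum("ij,jk,ik->i", centered, inv_cov, centered)

      rng = np.random.default_rng(3)
      background = rng.normal(0.0, 1.0, size=(10000, 20))      # 20-band background pixels
      targets = rng.normal(4.0, 1.0, size=(20, 20))
      pixels = np.vstack([background, targets])

      # pass 1: covariance from all pixels, flag the most anomalous ones
      scores = rx_scores(pixels, pixels.mean(axis=0), np.cov(pixels, rowvar=False))
      keep = scores < np.percentile(scores, 99)                # drop the top 1 percent

      # pass 2: recompute background statistics from the retained pixels only
      clean_mean = pixels[keep].mean(axis=0)
      clean_cov = np.cov(pixels[keep], rowvar=False)
      final_scores = rx_scores(pixels, clean_mean, clean_cov)
      print("highest scores with the cleaned background model:",
            np.argsort(final_scores)[-5:])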

  13. Anomaly Detection in Time-Evolving Climate Graphs

    NASA Astrophysics Data System (ADS)

    Liess, S.; Agrawal, S.; Das, K.; Atluri, G.; Steinbach, M.; Steinhaeuser, K.; Kumar, V.

    2016-12-01

    The spatio-temporal observations that are available for different climate variables such as pressure, temperature, wind, and humidity have been studied to understand how changes in one variable at a location exhibit similarity with changes in a different variable at a location thousands of kilometers away. These non-trivial long-distance relationships, called teleconnections, are often useful in understanding the underlying physical phenomena driving extreme events, which are becoming more common with the changing climate. Networks constructed using these data sets have the ability to capture these relationships at a global scale. These networks have been analyzed using a variety of network-based approaches, such as community detection and anomaly detection, that have shown promise in capturing interesting climate phenomena. In this research we plan to construct time-evolving climate networks such that their edges represent causal relationships, and then discover anomalies in such 'causal' climate networks. As part of this research, we will address several limitations of previous work in anomaly detection using climate networks. First, we will take into account spatial and temporal dependencies while constructing the networks, which existing work has largely ignored. Second, we will learn Granger causality to define causal relationships among different nodes. Third, we will build heterogeneous climate networks that involve nodes from different climate variables. Fourth, we will construct a Granger graphical model to understand the long-range temporal dependency in the data. Finally, we will use a community-evolution-based notion of anomaly detection on the time-evolving causal networks to discover deviations from expected behavior.

  14. GPR anomaly detection with robust principal component analysis

    NASA Astrophysics Data System (ADS)

    Masarik, Matthew P.; Burns, Joseph; Thelen, Brian T.; Kelly, Jack; Havens, Timothy C.

    2015-05-01

    This paper investigates the application of Robust Principal Component Analysis (RPCA) to ground penetrating radar as a means to improve GPR anomaly detection. The method consists of a preprocessing routine to smoothly align the ground and remove the ground response (haircut), followed by mapping to the frequency domain, applying RPCA, and then mapping the sparse component of the RPCA decomposition back to the time domain. A prescreener is then applied to the time-domain sparse component to perform anomaly detection. The emphasis of the RPCA algorithm on sparsity has the effect of significantly increasing the apparent signal-to-clutter ratio (SCR) as compared to the original data, thereby enabling improved anomaly detection. This method is compared to detrending (spatial-mean removal) and classical principal component analysis (PCA), and the RPCA-based processing is seen to provide substantial improvements in the apparent SCR over both of these alternative processing schemes. In particular, the algorithm has been applied to field-collected impulse GPR data and has shown significant improvement in terms of the ROC curve relative to detrending and PCA.
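
    A compact sketch of the low-rank-plus-sparse decomposition at the core of RPCA, implemented with a standard inexact augmented Lagrange multiplier (principal component pursuit) iteration; the ground-alignment preprocessing, frequency-domain mapping, and prescreener described in the abstract are omitted, and all parameter choices are illustrative.

      import numpy as np

      def rpca(M, max_iter=200, tol=1e-7):
          # decompose M into low-rank L plus sparse S via inexact ALM
          m, n = M.shape
          lam = 1.0 / np.sqrt(max(m, n))
          Y = M / max(np.linalg.norm(M, 2), np.abs(M).max() / lam)   # dual variable
          mu = 1.25 / np.linalg.norm(M, 2)
          S = np.zeros_like(M)
          for _ in range(max_iter):
              # low-rank update: singular value thresholding
              U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
              L = U @ np.diag(np.maximum(s - 1.0 / mu, 0)) @ Vt
              # sparse update: soft thresholding
              R = M - L + Y / mu
              S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0)
              Y = Y + mu * (M - L - S)
              if np.linalg.norm(M - L - S) / np.linalg.norm(M) < tol:
                  break
              mu *= 1.5
          return L, S

      # toy B-scan: smooth clutter plus a localized strong response
      rng = np.random.default_rng(4)
      clutter = np.outer(np.sin(np.linspace(0, 3, 128)), np.cos(np.linspace(0, 5, 256)))
      data = clutter + 0.01 * rng.standard_normal((128, 256))
      data[60:63, 100:103] += 1.0                           # buried-object response
      L, S = rpca(data)
      print("sparse-component energy near the target:", np.abs(S[60:63, 100:103]).mean())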

  15. BEARS: a multi-mission anomaly response system

    NASA Astrophysics Data System (ADS)

    Roberts, Bryce A.

    2009-05-01

    The Mission Operations Group at UC Berkeley's Space Sciences Laboratory operates a highly automated ground station and presently a fleet of seven satellites, each with its own associated command and control console. However, the requirement for prompt anomaly detection and resolution is shared between the ground segment and all spacecraft. The efficient, low-cost operation and "lights-out" staffing of the Mission Operations Group requires that controllers and engineers be notified of spacecraft and ground system problems around the clock. The Berkeley Emergency Anomaly and Response System (BEARS) is an in-house developed web- and paging-based software system that meets this need. BEARS was developed as a replacement for an existing emergency reporting software system that was too closed-source, platform-specific, expensive, and antiquated to expand or maintain. To avoid these limitations, the new system design leverages cross-platform, open-source software products such as MySQL, PHP, and Qt. Anomaly notifications and responses make use of the two-way paging capabilities of modern smart phones.

  16. Evaluation of anomaly detection algorithm using trans-admittance mammography with 60 × 60 electrode array.

    PubMed

    Zhao, Mingkang; Wi, Hun; Oh, Tong In; Woo, Eung Je

    2013-01-01

    Electrical impedance imaging has the potential to detect an early stage of breast cancer due to the higher admittivity values of cancerous tissue compared with those of normal breast tissues. In particular, tumor size and the extent of axillary lymph node involvement are important parameters for evaluating the breast cancer survival rate. We applied the anomaly detection algorithm to the high-density trans-admittance mammography system for estimating the size and position of breast cancer. We tested four different sizes of anomaly with three different conductivity contrasts at five different depths. From a frequency-difference trans-admittance map, we can readily observe the transversal position and estimate its size and depth. However, the size estimation was dependent on the admittivity contrast between anomaly and background, which calls for a detection algorithm that is robust to the conductivity contrast.

  17. Energy Detection Based on Undecimated Discrete Wavelet Transform and Its Application in Magnetic Anomaly Detection

    PubMed Central

    Nie, Xinhua; Pan, Zhongming; Zhang, Dasha; Zhou, Han; Chen, Min; Zhang, Wenna

    2014-01-01

    Magnetic anomaly detection (MAD) is a passive approach for detection of a ferromagnetic target, and its performance is often limited by external noises. Considering that one major noise source is fractal noise (also called 1/f noise) with a power spectral density of 1/f^α (0 < α < 2), an energy detection method based on the undecimated discrete wavelet transform (UDWT) is proposed in this paper. Firstly, the foundations of magnetic anomaly detection and the UDWT are introduced in brief, and a possible detection system based on a giant magneto-impedance (GMI) magnetic sensor is also given. Then the proposed energy detection based on the UDWT is described in detail, and the theoretical probabilities of false alarm and detection for a given detection threshold are presented. Notably, no a priori assumptions regarding the ferromagnetic target or the magnetic noise probability are necessary for our method, and unlike the discrete wavelet transform (DWT), the UDWT is shift invariant. Finally, some simulations are performed and the results show that the detection performance of our proposed detector is better than that of the conventional energy detector even in Gaussian white noise, especially when the spectral parameter α is less than 1.0. In addition, a real-world experiment was done to demonstrate the advantages of the proposed method. PMID:25343484

  18. Energy detection based on undecimated discrete wavelet transform and its application in magnetic anomaly detection.

    PubMed

    Nie, Xinhua; Pan, Zhongming; Zhang, Dasha; Zhou, Han; Chen, Min; Zhang, Wenna

    2014-01-01

    Magnetic anomaly detection (MAD) is a passive approach for detection of a ferromagnetic target, and its performance is often limited by external noises. Considering that one major noise source is fractal noise (also called 1/f noise) with a power spectral density of 1/f^α (0 < α < 2), an energy detection method based on the undecimated discrete wavelet transform (UDWT) is proposed in this paper. Firstly, the foundations of magnetic anomaly detection and the UDWT are introduced in brief, and a possible detection system based on a giant magneto-impedance (GMI) magnetic sensor is also given. Then the proposed energy detection based on the UDWT is described in detail, and the theoretical probabilities of false alarm and detection for a given detection threshold are presented. Notably, no a priori assumptions regarding the ferromagnetic target or the magnetic noise probability are necessary for our method, and unlike the discrete wavelet transform (DWT), the UDWT is shift invariant. Finally, some simulations are performed and the results show that the detection performance of our proposed detector is better than that of the conventional energy detector even in Gaussian white noise, especially when the spectral parameter α is less than 1.0. In addition, a real-world experiment was done to demonstrate the advantages of the proposed method.
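
    A rough sketch of an undecimated-wavelet energy detector using PyWavelets' stationary wavelet transform (pywt.swt); the wavelet family, decomposition level, window length, and threshold below are assumptions, and the synthetic magnetometer record only mimics coloured noise plus a weak target response.

      import numpy as np
      import pywt

      def udwt_energy(signal, wavelet="db4", level=3, window=64):
          # sliding-window energy of undecimated wavelet detail coefficients
          coeffs = pywt.swt(signal, wavelet, level=level)       # list of (cA, cD) pairs
          detail_energy = np.sum([cD ** 2 for _, cD in coeffs], axis=0)
          kernel = np.ones(window) / window
          return np.convolve(detail_energy, kernel, mode="same")

      # toy magnetometer record: coloured noise plus a weak anomaly-shaped pulse
      rng = np.random.default_rng(5)
      n = 1024                                                  # length must divide 2**level
      noise = np.cumsum(rng.standard_normal(n)) * 0.02          # crude coloured noise
      target = 0.4 * np.exp(-0.5 * ((np.arange(n) - 600) / 5.0) ** 2)
      signal = noise + target

      energy = udwt_energy(signal)
      threshold = energy.mean() + 4.0 * energy.std()            # detection threshold (assumed)
      print("samples above threshold:", np.where(energy > threshold)[0][:5])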

  19. Progressive anomaly detection in medical data using vital sign signals

    NASA Astrophysics Data System (ADS)

    Gao, Cheng; Lee, Li-Chien; Li, Yao; Chang, Chein-I.; Hu, Peter; Mackenzie, Colin

    2016-05-01

    Vital Sign Signals (VSSs) have been widely used for medical data analysis. One classic approach is to use the Logistic Regression Model (LRM) to describe the data to be analyzed. There are two challenging issues with this approach. One is how many VSSs need to be used in the model, since many VSSs are available for this purpose. The other is, once the number of VSSs is determined, which VSSs should be used. To date, these two issues have been resolved by empirical selection. This paper addresses both issues from a hyperspectral imaging perspective. If we view a patient with several collected vital sign signals as a pixel vector in a hyperspectral image, then each vital sign signal can be considered a particular band. In light of this interpretation, each VSS can be ranked by band prioritization, as commonly used for band selection in hyperspectral imaging. To resolve the issue of how many VSSs should be used for data analysis, we further develop Progressive Band Processing of Anomaly Detection (PBPAD), which allows users to detect anomalies in medical data using prioritized VSSs one after another, so that data changes between bands can be dictated by the profiles provided by PBPAD. As a result, there is no need to determine the number of VSSs or which VSSs should be used, because all VSSs are used in their prioritized order. To demonstrate the utility of PBPAD in medical data analysis, anomaly detection is implemented in progressive band processing to find anomalies which correspond to abnormal patients. The data used for the experiments were collected at the University of Maryland School of Medicine Shock Trauma Center (STC). The results are evaluated against those obtained by the Logistic Regression Model (LRM).

  20. Inflight and Preflight Detection of Pitot Tube Anomalies

    NASA Technical Reports Server (NTRS)

    Mitchell, Darrell W.

    2014-01-01

    The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates a sensor's response time to the dynamic component (noise) found in the signal of that same sensor. The technique uses the existing electrical signals of Pitot tube sensors, produced by measured processes during inflight conditions or by induced signals in preflight conditions, to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants; its application to the detection of aircraft anomalies is innovative. Instead of assessing the health of process monitoring at a steady-state condition, this technology will be used to quickly inform the pilot when an air-speed indication becomes faulty under any flight condition as well as during preflight preparation.

  1. Video Anomaly Detection with Compact Feature Sets for Online Performance.

    PubMed

    Leyva, Roberto; Sanchez, Victor; Li, Chang-Tsun

    2017-04-18

    Over the past decade, video anomaly detection has been explored with remarkable results. However, research on methodologies suitable for online performance is still very limited. In this paper, we present an online framework for video anomaly detection. The key aspect of our framework is a compact set of highly descriptive features, which is extracted from a novel cell structure that helps to define support regions in a coarse-to-fine fashion. Based on the scene's activity, only a limited number of support regions are processed, thus limiting the size of the feature set. Specifically, we use foreground occupancy and optical flow features. The framework uses an inference mechanism that evaluates the compact feature set via Gaussian Mixture Models, Markov Chains and Bag-of-Words in order to detect abnormal events. Our framework also considers the joint response of the models in the local spatio-temporal neighborhood to increase detection accuracy. We test our framework on popular existing datasets and on a new dataset comprising a wide variety of realistic videos captured by surveillance cameras. This particular dataset includes surveillance videos depicting criminal activities, car accidents and other dangerous situations. Evaluation results show that our framework outperforms other online methods and attains a very competitive detection performance compared to state-of-the-art non-online methods.

  2. [Congenital anomalies of the central nervous system in autopsy specimens].

    PubMed

    Sobaniec-Lotowska, M; Ostapiuk, H; Sulkowski, S; Sobaniec, W; Sulik, M; Famulski, W

    1989-02-01

    On the basis of an analysis of 2398 autopsies of infants aged up to 1 year, congenital anomalies of the central nervous system were found in 194 cases (8.1%). Most of these anomalies were noted in the group of newborns (85%), and the most frequent anomalies were: myelomeningocele (35.6%), multiple anomalies (20.1%), congenital hydrocephalus (17%), anencephaly (14.4%) and corpus callosum malformations (3.6%). Myelomeningocele, congenital hydrocephalus, anencephaly and true microcephaly were more frequent in girls, while multiple anomalies and corpus callosum malformations were more frequent in boys.

  3. Detection of chiral anomaly and valley transport in Dirac semimetals

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng; Zhang, Enze; Liu, Yanwen; Chen, Zhigang; Liang, Sihang; Cao, Junzhi; Yuan, Xiang; Tang, Lei; Li, Qian; Gu, Teng; Wu, Yizheng; Zou, Jin; Xiu, Faxian

    The chiral anomaly is a non-conservation of chiral charge pumped by a topologically nontrivial gauge field, which has been predicted to exist in the emergent quasiparticle excitations in Dirac and Weyl semimetals. However, so far, such a pumping process has not been clearly demonstrated and lacks a convincing experimental identification. Here, we report the detection of the charge pumping effect and the related valley transport in Cd3As2 driven by external electric and magnetic fields (EB). We find that the chiral imbalance leads to a non-zero gyrotropic coefficient, which can be confirmed by the EB-generated Kerr effect. By applying B along the current direction, we observe a negative magnetoresistance despite the giant positive one in other directions, a clear indication of the chiral anomaly. Remarkably, a robust nonlocal response in valley diffusion originating from the chiral anomaly persists up to room temperature when B is parallel to E. The ability to manipulate the valley polarization in a Dirac semimetal opens up a brand-new route to understand its fundamental properties through external fields and to utilize the chiral fermions in valleytronic applications.

  4. Anomaly detection of flight routes through optimal waypoint

    NASA Astrophysics Data System (ADS)

    Pusadan, M. Y.; Buliali, J. L.; Ginardi, R. V. H.

    2017-01-01

    One of the deciding factors of a flight is its route. A flight route is determined by coordinates (latitude and longitude); the defined coordinates are the waypoints. An anomaly occurs if the aircraft flies outside the specified waypoint area. In the case of flight data, anomalies are identified from problems in the flight route based on ADS-B data. This study aims to determine the optimal waypoints of the flight route. The proposed methods are: i) Agglomerative Hierarchical Clustering (AHC) over several segments based on the range of coordinates (latitude and longitude) at every waypoint; ii) the cophenetic correlation coefficient (c) to determine the correlation between the members of each cluster; iii) cubic spline interpolation as a graphical representation of the connections between the coordinates at every waypoint; and iv) Euclidean distance to measure the distances between waypoints and the two centroids resulting from AHC clustering. The experimental results are a cophenetic correlation coefficient of 0.691 ≤ c ≤ 0.974, five segments generated from the range of waypoint coordinate areas, and shortest and longest centroid-to-waypoint distances of 0.46 and 2.18, respectively. It is concluded that the shortest distance is used as the reference coordinate of the optimal waypoint, while the farthest distance can indicate a potentially detected anomaly.
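
    A sketch of the clustering and distance steps with SciPy, assuming simple synthetic ADS-B positions; the linkage method, the two-cluster cut, and the idea of flagging the largest centroid distances are illustrative choices consistent with, but not identical to, the procedure in the abstract.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, cophenet, fcluster
      from scipy.spatial.distance import pdist, cdist

      # toy ADS-B positions (latitude, longitude) around one waypoint segment
      rng = np.random.default_rng(6)
      on_route = rng.normal([-0.65, 119.75], [0.02, 0.02], size=(200, 2))
      off_route = rng.normal([-0.40, 119.95], [0.02, 0.02], size=(10, 2))
      points = np.vstack([on_route, off_route])

      # agglomerative hierarchical clustering and cophenetic correlation coefficient
      Z = linkage(points, method="average")
      c, _ = cophenet(Z, pdist(points))
      labels = fcluster(Z, t=2, criterion="maxclust")           # two clusters, as in the study

      # Euclidean distance from each point to the two cluster centroids
      centroids = np.array([points[labels == k].mean(axis=0) for k in (1, 2)])
      dist_to_nearest = cdist(points, centroids).min(axis=1)
      print(f"cophenetic correlation c = {c:.3f}")
      print("largest centroid distances (potential anomalies):", np.sort(dist_to_nearest)[-3:])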

  5. Anomaly detection in the right hemisphere: The influence of visuospatial factors.

    PubMed

    Smith, Stephen D; Dixon, Michael J; Tays, William J; Bulman-Fleming, M Barbara

    2004-08-01

    Previous research with both brain-damaged and neurologically intact populations has demonstrated that the right cerebral hemisphere (RH) is superior to the left cerebral hemisphere (LH) at detecting anomalies (or incongruities) in objects (Ramachandran, 1995; Smith, Tays, Dixon, & Bulman-Fleming, 2002). The current research assesses whether the RH advantage for anomaly detection is due to the RH superiority for visuospatial skills or is a distinct cognitive process. Sixty undergraduate participants completed tasks assessing anomaly detection, mental rotation, and global and local perceptual abilities. The results demonstrate that anomaly detection is negatively correlated with mental rotation. These findings suggest that anomaly detection is not simply a function of visuospatial skills.

  6. A new prior for bayesian anomaly detection: application to biosurveillance.

    PubMed

    Shen, Y; Cooper, G F

    2010-01-01

    Bayesian anomaly detection computes posterior probabilities of anomalous events by combining prior beliefs and evidence from data. However, the specification of prior probabilities can be challenging. This paper describes a Bayesian prior in the context of disease outbreak detection. The goal is to provide a meaningful, easy-to-use prior that yields a posterior probability of an outbreak that performs at least as well as a standard frequentist approach. If this goal is achieved, the resulting posterior could be usefully incorporated into a decision analysis about how to act in light of a possible disease outbreak. This paper describes a Bayesian method for anomaly detection that combines learning from data with a semi-informative prior probability over patterns of anomalous events. A univariate version of the algorithm is presented here for ease of illustration of the essential ideas. The paper describes the algorithm in the context of disease-outbreak detection, but it is general and can be used in other anomaly detection applications. For this application, the semi-informative prior specifies that an increased count over baseline is expected for the variable being monitored, such as the number of respiratory chief complaints per day at a given emergency department. The semi-informative prior is derived from the baseline prior, which is estimated from historical data. The evaluation reported here used semi-synthetic data to evaluate the detection performance of the proposed Bayesian method and a control chart method, which is a standard frequentist algorithm that is closest to the Bayesian method in terms of the type of data it uses. The disease-outbreak detection performance of the Bayesian method was statistically significantly better than that of the control chart method when proper baseline periods were used to estimate the baseline behavior to avoid seasonal effects. When using longer baseline periods, the Bayesian method performed as well as the

  7. Application of Improved SOM Neural Network in Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Jiang, Xueying; Liu, Kean; Yan, Jiegou; Chen, Wenhui

    To address the high false alarm rate, false negative rate, long training time, and other issues of the SOM neural network algorithm, the authors present an improved anomaly detection SOM algorithm, FPSOM, which introduces an adaptive learning rate so that the network can adaptively learn the original sample space and better reflect the structure of the original data. Combined with the artificial neural network, the authors also present an intelligent detection model and its training module, describe the main implementation of the FPSOM neural network algorithm, and finally carry out simulation experiments on the KDDCUP data sets. The experiments show that the new algorithm is better than SOM: it greatly shortens the training time, effectively improves the detection rate, and reduces the false positive rate.
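
    The FPSOM modifications are not specified in this record, so the sketch below uses a plain SOM (the MiniSom package) trained on mostly normal records and scores each record by its distance to the best-matching unit, a common SOM-based anomaly criterion; the map size, training length, and percentile threshold are assumptions.

      import numpy as np
      from minisom import MiniSom

      rng = np.random.default_rng(7)
      normal_traffic = rng.normal(0.0, 1.0, size=(2000, 10))    # scaled connection features
      attacks = rng.normal(4.0, 1.0, size=(20, 10))
      records = np.vstack([normal_traffic, attacks])

      # train a small SOM on (mostly) normal traffic
      som = MiniSom(8, 8, records.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
      som.train_random(normal_traffic, 5000)

      # anomaly score: distance from each record to its best-matching unit
      weights = som.get_weights()
      def score(x):
          i, j = som.winner(x)
          return np.linalg.norm(x - weights[i, j])

      scores = np.array([score(x) for x in records])
      threshold = np.percentile(scores[:2000], 99)               # set from normal traffic
      print("flagged records:", np.where(scores > threshold)[0][-5:])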

  8. Detecting errors and anomalies in computerized materials control and accountability databases

    SciTech Connect

    Whiteson, R.; Hench, K.; Yarbro, T.; Baumgart, C.

    1998-12-31

    The Automated MC and A Database Assessment project is aimed at improving anomaly and error detection in materials control and accountability (MC and A) databases and increasing confidence in the data that they contain. Anomalous data resulting in poor categorization of nuclear material inventories greatly reduces the value of the database information to users. Therefore it is essential that MC and A data be assessed periodically for anomalies or errors. Anomaly detection can identify errors in databases and thus provide assurance of the integrity of data. An expert system has been developed at Los Alamos National Laboratory that examines these large databases for anomalous or erroneous data. For several years, MC and A subject matter experts at Los Alamos have been using this automated system to examine the large amounts of accountability data that the Los Alamos Plutonium Facility generates. These data are collected and managed by the Material Accountability and Safeguards System, a near-real-time computerized nuclear material accountability and safeguards system. This year they have expanded the user base, customizing the anomaly detector for the varying requirements of different groups of users. This paper describes the progress in customizing the expert systems to the needs of the users of the data and reports on their results.

  9. Anomaly Detection in Multiple Scale for Insider Threat Analysis

    SciTech Connect

    Kim, Yoohwan; Sheldon, Frederick T; Hively, Lee M

    2012-01-01

    We propose a method to quantify malicious insider activity with statistical and graph-based analysis aided by semantic scoring rules. Different types of personal activities or interactions are monitored to form a set of directed weighted graphs. The semantic scoring rules assign higher scores to events that are more significant and suspicious. We then build personal activity profiles in the form of score tables. Profiles are created at multiple scales, where the low-level profiles are aggregated toward more stable higher-level profiles within the subject or object hierarchy. Further, the profiles are created at different time scales such as day, week, or month. During operation, the insider's current activity profile is compared to the historical profiles to produce an anomaly score. For each subject with a high anomaly score, a subgraph of connected subjects is extracted to look for any related score movement. Finally, the subjects are ranked by their anomaly scores to help the analysts focus on high-scoring subjects. The threat-ranking component supports the interaction between the User Dashboard and the Insider Threat Knowledge Base portal. The portal includes a repository for historical results, i.e., adjudicated cases containing all of the information first presented to the user and any additional insights to help the analysts. In this paper we show the framework of the proposed system and the operational algorithms.

  10. Structural Anomaly Detection Using Fiber Optic Sensors and Inverse Finite Element Method

    NASA Technical Reports Server (NTRS)

    Quach, Cuong C.; Vazquez, Sixto L.; Tessler, Alex; Moore, Jason P.; Cooper, Eric G.; Spangler, Jan. L.

    2005-01-01

    NASA Langley Research Center is investigating a variety of techniques for mitigating aircraft accidents due to structural component failure. One technique under consideration combines distributed fiber optic strain sensing with an inverse finite element method for detecting and characterizing structural anomalies, i.e., anomalies that may provide early indication of airframe structure degradation. The technique identifies structural anomalies that result in observable changes in localized strain but do not impact the overall surface shape. Surface shape information is provided by an inverse finite element method that computes full-field displacements and internal loads using strain data from in-situ fiber optic sensors. This paper describes a prototype of such a system and reports results from a series of laboratory tests conducted on a test coupon subjected to increasing levels of damage.

  12. A Comparative Evaluation of Anomaly Detection Algorithms for Maritime Video Surveillance

    DTIC Science & Technology

    2011-01-01

    A variety of anomaly detection algorithms have been applied to surveillance tasks for detecting threats with some success. However, it is not clear...which anomaly detection algorithms should be used for domains such as ground-based maritime video surveillance. For example, recently introduced...Also, the reasons for the performance differences of anomaly detection algorithms on problems of varying difficulty are not well understood. We

  13. A Rule-based Track Anomaly Detection Algorithm for Maritime Force Protection

    DTIC Science & Technology

    2014-08-01

    UNCLASSIFIED UNCLASSIFIED A Rule-based Track Anomaly Detection Algorithm for Maritime Force Protection S. Boinepalli and...detection tool using a Rule-based Algorithm that can detect anomalies in a set of pre-recorded tracks using their curvature, speed and weave. We...Australia 2014 AR 016-049 August 2014 APPROVED FOR PUBLIC RELEASE UNCLASSIFIED UNCLASSIFIED A Rule-based Track Anomaly Detection

  14. Anomaly Detection Using an Ensemble of Feature Models.

    PubMed

    Noto, Keith; Brodley, Carla; Slonim, Donna

    2010-12-13

    We present a new approach to semi-supervised anomaly detection. Given a set of training examples believed to come from the same distribution or class, the task is to learn a model that will be able to distinguish examples in the future that do not belong to the same class. Traditional approaches typically compare the position of a new data point to the set of "normal" training data points in a chosen representation of the feature space. For some data sets, the normal data may not have discernible positions in feature space, but do have consistent relationships among some features that fail to appear in the anomalous examples. Our approach learns to predict the values of training set features from the values of other features. After we have formed an ensemble of predictors, we apply this ensemble to new data points. To combine the contribution of each predictor in our ensemble, we have developed a novel, information-theoretic anomaly measure that our experimental results show selects against noisy and irrelevant features. Our results on 47 data sets show that for most data sets, this approach significantly improves performance over current state-of-the-art feature space distance and density-based approaches.

  15. Anomaly Detection Using an Ensemble of Feature Models

    PubMed Central

    Noto, Keith; Brodley, Carla; Slonim, Donna

    2011-01-01

    We present a new approach to semi-supervised anomaly detection. Given a set of training examples believed to come from the same distribution or class, the task is to learn a model that will be able to distinguish examples in the future that do not belong to the same class. Traditional approaches typically compare the position of a new data point to the set of “normal” training data points in a chosen representation of the feature space. For some data sets, the normal data may not have discernible positions in feature space, but do have consistent relationships among some features that fail to appear in the anomalous examples. Our approach learns to predict the values of training set features from the values of other features. After we have formed an ensemble of predictors, we apply this ensemble to new data points. To combine the contribution of each predictor in our ensemble, we have developed a novel, information-theoretic anomaly measure that our experimental results show selects against noisy and irrelevant features. Our results on 47 data sets show that for most data sets, this approach significantly improves performance over current state-of-the-art feature space distance and density-based approaches. PMID:22020249
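
    A simplified sketch of the ensemble-of-feature-models idea: one regressor per feature is trained to predict that feature from all the others, and a new point is scored by its averaged standardized prediction error; plain linear regressors and the z-scored average stand in for the paper's predictors and information-theoretic anomaly measure.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(8)
      # training data with a consistent cross-feature relationship: f2 ~ f0 + f1
      f01 = rng.normal(size=(500, 2))
      train = np.column_stack([f01, f01.sum(axis=1) + 0.05 * rng.normal(size=500)])

      # one predictor per feature, trained to predict it from all the other features
      models = []
      for j in range(train.shape[1]):
          others = np.delete(train, j, axis=1)
          models.append(LinearRegression().fit(others, train[:, j]))
      resid_std = np.array([np.std(train[:, j] - m.predict(np.delete(train, j, axis=1)))
                            for j, m in enumerate(models)])

      def anomaly_score(x):
          # average standardized prediction error across the feature ensemble
          errs = [abs(x[j] - m.predict(np.delete(x, j).reshape(1, -1))[0]) / resid_std[j]
                  for j, m in enumerate(models)]
          return float(np.mean(errs))

      normal_point = np.array([0.5, -0.2, 0.3])                  # respects f2 ~ f0 + f1
      broken_point = np.array([0.5, -0.2, 3.0])                  # relationship violated
      print(anomaly_score(normal_point), anomaly_score(broken_point))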

  16. Anomaly detection of microstructural defects in continuous fiber reinforced composites

    NASA Astrophysics Data System (ADS)

    Bricker, Stephen; Simmons, J. P.; Przybyla, Craig; Hardie, Russell

    2015-03-01

    Ceramic matrix composites (CMC) with continuous fiber reinforcements have the potential to enable the next generation of high-speed hypersonic vehicles and/or significant improvements in gas turbine engine performance due to their exhibited toughness when subjected to high mechanical loads at extreme temperatures (2200 F+). Reinforced fiber composites (RFC) provide increased fracture toughness, crack growth resistance, and strength, though little is known about how stochastic variation and imperfections in the material affect material properties. In this work, tools are developed for quantifying anomalies within the microstructure at several scales. The detection and characterization of anomalous microstructure is a critical step in linking production techniques to properties, as well as in accurate material simulation and property prediction for the integrated computational materials engineering (ICME) of RFC-based components. It is desired to find statistical outliers for any number of material characteristics such as fibers, fiber coatings, and pores. Here, fiber orientation, or `velocity', and `velocity' gradient are developed and examined for anomalous behavior. Categorizing anomalous behavior in the CMC is approached by multivariate Gaussian mixture modeling. A Gaussian mixture is employed to estimate the probability density function (PDF) of the features in question, and anomalies are classified by their likelihood of belonging to the statistically normal behavior for that feature.
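
    A minimal sketch of the mixture-modeling step, assuming two-dimensional features (orientation 'velocity' and its gradient) and scikit-learn's GaussianMixture; the number of components, the synthetic populations, and the likelihood percentile used as a threshold are illustrative.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(9)
      # toy two-population feature set: (orientation 'velocity', 'velocity' gradient)
      nominal = np.vstack([rng.normal([0.0, 0.0], [0.1, 0.05], size=(3000, 2)),
                           rng.normal([0.4, 0.0], [0.1, 0.05], size=(3000, 2))])
      defects = rng.normal([1.5, 0.8], [0.1, 0.05], size=(30, 2))
      features = np.vstack([nominal, defects])

      # estimate the PDF of "normal" microstructural behaviour with a Gaussian mixture
      gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
      gmm.fit(nominal)

      log_density = gmm.score_samples(features)
      threshold = np.percentile(gmm.score_samples(nominal), 0.5)  # bottom 0.5 percent
      print("anomalous samples:", np.where(log_density < threshold)[0][:5], "...")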

  17. Machine intelligence-based decision-making (MIND) for automatic anomaly detection

    NASA Astrophysics Data System (ADS)

    Prasad, Nadipuram R.; King, Jason C.; Lu, Thomas

    2007-04-01

    Any event deemed out of the ordinary may be called an anomaly. Anomalies, by virtue of their definition, are events that occur spontaneously with no prior indication of their existence or appearance. The effects of anomalies are typically unknown until they actually occur, and their effects aggregate over time to show a noticeable change from the original behavior. An evolved behavior would in general be very difficult to correct unless the anomalous event that caused such behavior can be detected early and any consequence attributed to the specific anomaly. Substantial time and effort are required to back-track the cause of abnormal behavior and to recreate the event sequence leading to it. There is therefore a critical need to automatically detect anomalous behavior as and when it occurs, and to do so with the operator in the loop. Human-machine interaction results in better machine learning and a better decision-support mechanism. This is the fundamental concept of intelligent control, where machine learning is enhanced by interaction with human operators, and vice versa. The paper discusses a revolutionary framework for the characterization, detection, identification, learning, and modeling of anomalous behavior in observed phenomena arising from a large class of unknown and uncertain dynamical systems.

  18. A High-Order Statistical Tensor Based Algorithm for Anomaly Detection in Hyperspectral Imagery

    PubMed Central

    Geng, Xiurui; Sun, Kang; Ji, Luyan; Zhao, Yongchao

    2014-01-01

    Recently, high-order statistics have received more and more interest in the field of hyperspectral anomaly detection. However, most existing high-order statistics based anomaly detection methods require stepwise iterations, since they are direct applications of blind source separation. Moreover, these methods usually produce multiple detection maps rather than a single anomaly distribution image. In this study, we exploit the concept of the coskewness tensor and propose a new anomaly detection method, called COSD (coskewness detector). COSD does not need iteration and can produce a single detection map. Experiments based on both simulated and real hyperspectral data sets verify the effectiveness of our algorithm. PMID:25366706

  19. A high-order statistical tensor based algorithm for anomaly detection in hyperspectral imagery.

    PubMed

    Geng, Xiurui; Sun, Kang; Ji, Luyan; Zhao, Yongchao

    2014-11-04

    Recently, high-order statistics have received more and more interest in the field of hyperspectral anomaly detection. However, most existing high-order statistics based anomaly detection methods require stepwise iterations, since they are direct applications of blind source separation. Moreover, these methods usually produce multiple detection maps rather than a single anomaly distribution image. In this study, we exploit the concept of the coskewness tensor and propose a new anomaly detection method, called COSD (coskewness detector). COSD does not need iteration and can produce a single detection map. Experiments based on both simulated and real hyperspectral data sets verify the effectiveness of our algorithm.
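
    As a heavily simplified illustration of a coskewness-based score (not necessarily the exact COSD statistic), one can whiten the pixels, form the third-order coskewness tensor, and score each pixel by contracting the tensor with that pixel three times; the band count and synthetic spectra are placeholders.

      import numpy as np

      rng = np.random.default_rng(10)
      bands, n = 8, 5000
      background = rng.normal(0.0, 1.0, size=(n, bands))
      anomalies = rng.normal(3.0, 0.5, size=(10, bands))
      pixels = np.vstack([background, anomalies])               # rows are spectra

      # whiten the data with the global mean and covariance
      mean = pixels.mean(axis=0)
      cov = np.cov(pixels, rowvar=False)
      eigval, eigvec = np.linalg.eigh(cov)
      whitener = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T
      Z = (pixels - mean) @ whitener

      # coskewness tensor K_ijk = E[z_i z_j z_k] and a per-pixel third-order score
      K = np.einsum("ni,nj,nk->ijk", Z, Z, Z) / Z.shape[0]
      scores = np.abs(np.einsum("ijk,ni,nj,nk->n", K, Z, Z, Z))
      print("highest third-order scores at pixels:", np.argsort(scores)[-5:])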

  20. From Signature-Based Towards Behaviour-Based Anomaly Detection (Extended Abstract)

    DTIC Science & Technology

    2010-11-01

    RTO-MP-IST-091 P2 - 1 From Signature-Based Towards Behaviour-Based Anomaly Detection (Extended Abstract) Pavel Minarik, Jan Vykopal...A 3. DATES COVERED - 4. TITLE AND SUBTITLE From Signature-Based Towards Behaviour-Based Anomaly Detection (Extended Abstract) 5a. CONTRACT...Prescribed by ANSI Std Z39-18 From Signature-Based Towards Behaviour-Based Anomaly Detection P2 - 2 RTO-MP-IST-091 DEEP PACKET INSPECTION Every

  1. OceanXtremes: Scalable Anomaly Detection in Oceanographic Time-Series

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Armstrong, E. M.; Chin, T. M.; Gill, K. M.; Greguska, F. R., III; Huang, T.; Jacob, J. C.; Quach, N.

    2016-12-01

    The oceanographic community must meet the challenge to rapidly identify features and anomalies in complex and voluminous observations to further science and improve decision support. Given this data-intensive reality, we are developing an anomaly detection system, called OceanXtremes, powered by an intelligent, elastic Cloud-based analytic service backend that enables execution of domain-specific, multi-scale anomaly and feature detection algorithms across the entire archive of 15 to 30-year ocean science datasets. Our parallel analytics engine extends the NEXUS system and exploits multiple open-source technologies: Apache Cassandra as a distributed spatial "tile" cache, Apache Spark for in-memory parallel computation, and Apache Solr for spatial search and storing pre-computed tile statistics and other metadata. OceanXtremes provides these key capabilities: Parallel generation (Spark on a compute cluster) of 15 to 30-year Ocean Climatologies (e.g. sea surface temperature or SST) in hours or overnight, using simple pixel averages or customizable Gaussian-weighted "smoothing" over latitude, longitude, and time; Parallel pre-computation, tiling, and caching of anomaly fields (daily variables minus a chosen climatology) with pre-computed tile statistics; Parallel detection (over the time-series of tiles) of anomalies or phenomena by regional area-averages exceeding a specified threshold (e.g. high SST in El Nino or SST "blob" regions), or more complex, custom data mining algorithms; Shared discovery and exploration of ocean phenomena and anomalies (facet search using Solr), along with unexpected correlations between key measured variables; Scalable execution for all capabilities on a hybrid Cloud, using our on-premise OpenStack Cloud cluster or at Amazon. The key idea is that the parallel data-mining operations will be run "near" the ocean data archives (a local "network" hop) so that we can efficiently access the thousands of files making up a three decade time
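
    The distributed Spark/Cassandra/Solr machinery is the heart of OceanXtremes, but the underlying anomaly computation (daily field minus a day-of-year climatology, then a regional area-average threshold test) can be sketched in plain NumPy; the array sizes, region, and 1.5-degree threshold are illustrative.

      import numpy as np

      rng = np.random.default_rng(11)
      n_days, n_lat, n_lon = 1825, 20, 30                       # five years of daily SST tiles
      day_of_year = np.arange(n_days) % 365
      sst = 20 + 3 * np.sin(2 * np.pi * day_of_year / 365)[:, None, None] \
            + 0.5 * rng.standard_normal((n_days, n_lat, n_lon))
      sst[1000:1060] += 2.0                                      # injected warm "blob" period

      # climatology: mean field for each day of year (a simple pixel average)
      climatology = np.zeros((365, n_lat, n_lon))
      for d in range(365):
          climatology[d] = sst[day_of_year == d].mean(axis=0)

      # anomaly field and regional area-average exceedance test
      anomaly = sst - climatology[day_of_year]
      region_mean = anomaly[:, 5:15, 10:20].mean(axis=(1, 2))    # chosen region of interest
      events = np.where(region_mean > 1.5)[0]                    # threshold in degrees C
      print("high-anomaly days:", events[:5], "...")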

  2. [Multi-DSP parallel processing technique of hyperspectral RX anomaly detection].

    PubMed

    Guo, Wen-Ji; Zeng, Xiao-Ru; Zhao, Bao-Wei; Ming, Xing; Zhang, Gui-Feng; Lü, Qun-Bo

    2014-05-01

    To satisfy the requirements of high speed, real-time processing, and mass data storage for RX anomaly detection of hyperspectral image data, this paper proposes a multi-DSP parallel processing system for hyperspectral images based on the CPCI Express standard bus architecture. The hardware topology of the system combines the tight coupling of four DSPs sharing a data bus and memory unit with the interconnection of Link ports. On this hardware platform, by assigning a parallel processing task to each DSP in consideration of the RX anomaly detection algorithm and the 3D structure of the spectral image data, a four-DSP parallel processing technique is proposed that computes the mean and covariance matrix of the whole image by spatially partitioning the image. The experimental results show that, with equivalent detection performance, the proposed four-DSP parallel RX anomaly detection technique achieves roughly four times the time efficiency of a single-DSP process, overcoming the constraint that the DSP's internal storage capacity places on processing very large images while meeting the demands of real-time processing of the spectral data.
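
    A sketch of the spatial-partitioning idea on a general-purpose machine: each partition (standing in for one DSP) accumulates a pixel count, a sum vector, and a sum of outer products, the partial results are merged into the global mean and covariance, and the RX statistic is then evaluated for every pixel; thread workers and the synthetic cube are stand-ins for the DSP hardware and real imagery.

      import numpy as np
      from concurrent.futures import ThreadPoolExecutor

      rng = np.random.default_rng(12)
      rows, cols, bands = 256, 256, 32
      cube = rng.normal(0.0, 1.0, size=(rows, cols, bands))
      cube[100, 100] += 6.0                                      # implanted anomaly

      def partial_stats(block):
          # per-partition accumulators: pixel count, sum vector, sum of outer products
          X = block.reshape(-1, bands)
          return X.shape[0], X.sum(axis=0), X.T @ X

      # spatial partitioning: one stripe per worker (stand-in for one stripe per DSP)
      stripes = np.array_split(cube, 4, axis=0)
      with ThreadPoolExecutor(max_workers=4) as pool:
          parts = list(pool.map(partial_stats, stripes))

      # merge the partial sums into the global mean and covariance
      n = sum(p[0] for p in parts)
      mean = sum(p[1] for p in parts) / n
      cov = sum(p[2] for p in parts) / n - np.outer(mean, mean)

      # RX statistic for every pixel
      Z = cube.reshape(-1, bands) - mean
      rx = np.einsum("ij,jk,ik->i", Z, np.linalg.inv(cov), Z).reshape(rows, cols)
      print("most anomalous pixel:", np.unravel_index(rx.argmax(), rx.shape))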

  3. A new approach for structural health monitoring by applying anomaly detection on strain sensor data

    NASA Astrophysics Data System (ADS)

    Trichias, Konstantinos; Pijpers, Richard; Meeuwissen, Erik

    2014-03-01

    Structural Health Monitoring (SHM) systems help to monitor critical infrastructures (bridges, tunnels, etc.) remotely and provide up-to-date information about their physical condition. In addition, they help to predict the structure's life and required maintenance in a cost-efficient way. Typically, inspection data give insight into the structural health. The global structural behavior, and predominantly the structural loading, is generally measured with vibration and strain sensors. Acoustic emission sensors are more and more used for measuring global crack activity near critical locations. In this paper, we present a procedure for local structural health monitoring by applying Anomaly Detection (AD) on strain sensor data from sensors that are applied along expected crack paths. Sensor data are analyzed by automatic anomaly detection in order to find crack activity at an early stage. This approach targets the monitoring of critical structural locations, such as welds, near which strain sensors can be applied during construction, and/or locations with limited inspection possibilities during structural operation. We investigate several anomaly detection techniques to detect changes in statistical properties that indicate structural degradation. The most effective one is a novel polynomial fitting technique, which tracks slow changes in sensor data. Our approach has been tested on a representative test structure (bridge deck) in a lab environment, under constant and variable amplitude fatigue loading. In both cases, the evolving cracks at the monitored locations were successfully detected, autonomously, by our AD monitoring tool.
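
    A rough sketch of the polynomial-fitting idea: fit a low-order polynomial to each window of strain data, track the linear-trend coefficient, and flag windows whose trend drifts beyond limits derived from a healthy baseline period; the window length, polynomial order, and six-sigma limit are assumptions, not the authors' settings.

      import numpy as np

      def trend_per_window(signal, window=200, order=2):
          # slope of a low-order polynomial fitted to each non-overlapping window
          trends = []
          x = np.arange(window)
          for start in range(0, len(signal) - window + 1, window):
              coeffs = np.polyfit(x, signal[start:start + window], order)
              trends.append(coeffs[-2])             # linear term tracks slow change
          return np.array(trends)

      rng = np.random.default_rng(13)
      cycles = np.sin(np.linspace(0, 400 * np.pi, 20000))        # fatigue-loading cycles
      drift = np.concatenate([np.zeros(15000), np.linspace(0, 0.5, 5000)])  # crack growth
      strain = cycles + drift + 0.02 * rng.standard_normal(20000)

      trends = trend_per_window(strain)
      baseline = trends[:50]                                      # healthy reference period
      limit = baseline.mean() + 6 * baseline.std()
      print("first window exceeding the healthy trend limit:", np.argmax(trends > limit))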

  4. Anomaly Detection for Data Reduction in an Unattended Ground Sensor (UGS) Field

    DTIC Science & Technology

    2014-09-01

    Framework integrates super-resolution, contrast, and deblur research algorithms as well as the Force Protection Surveillance System (FPSS), a full-motion...report describes the design and implementation of a data reduction technique for video sensors that are part of a larger unattended ground sensor (UGS...network. The data reduction technique is based on anomaly detection in full-motion video and subsequent statistical analysis techniques that allow the

  5. Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets

    PubMed Central

    Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin

    2017-01-01

    Road traffic anomaly denotes a road segment that is anomalous in terms of traffic flow of vehicles. Detecting road traffic anomalies from GPS (Global Positioning System) snippets data is becoming critical in urban computing since they often suggest underlying events. However, the noisy and sparse nature of GPS snippets data has introduced multiple problems, which make the detection of road traffic anomalies very challenging. To address these issues, we propose a two-stage solution which consists of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large GPS snippets via a tensor decomposition technique. Then RAT calculates the anomalous degree for each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one-month GPS location data from more than eight thousand taxicabs in Beijing. The evaluation results show the advantages of our method over other baseline techniques. PMID:28282948

  6. Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets.

    PubMed

    Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin

    2017-03-09

    Road traffic anomaly denotes a road segment that is anomalous in terms of traffic flow of vehicles. Detecting road traffic anomalies from GPS (Global Positioning System) snippets data is becoming critical in urban computing since they often suggest underlying events. However, the noisy and sparse nature of GPS snippets data has introduced multiple problems, which make the detection of road traffic anomalies very challenging. To address these issues, we propose a two-stage solution which consists of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large GPS snippets via a tensor decomposition technique. Then RAT calculates the anomalous degree for each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one-month GPS location data from more than eight thousand taxi cabs in Beijing. The evaluation results show the advantages of our method over other baseline techniques.

  7. Extending TOPS: A Prototype MODIS Anomaly Detection Architecture

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Srivastava, A. N.

    2008-12-01

    The management and processing of Earth science data has been gaining importance over the last decade due to higher data volumes generated by a larger number of instruments, and due to the increase in complexity of Earth science models that use this data. The volume of data itself is often a limiting factor in obtaining the information needed by the scientists; without more sophisticated data volume reduction technologies, possible key information may not be discovered. We are especially interested in automatic identification of disturbances within the ecosystems (e.g., wildfires, droughts, floods, insect/pest damage, wind damage, logging), and focusing our analysis efforts on the identified areas. There are dozens of variables that define the health of our ecosystem and both long-term and short-term changes in these variables can serve as early indicators of natural disasters and shifts in climate and ecosystem health. These changes can have profound socio-economic impacts and we need to develop capabilities for identification, analysis and response to these changes in a timely manner. Because the ecosystem consists of a large number of variables, there can be a disturbance that is only apparent when we examine relationships among multiple variables despite the fact that none of them is by itself alarming. We have to be able to extract information from multiple sensors and observations and discover these underlying relationships. As the data volumes increase, there is also potential for a large number of anomalies to "flood" the system, so we need to provide the ability to automatically select the most likely and most important ones, and the ability to analyze each anomaly with minimal involvement of scientists. We describe a prototype architecture for anomaly driven data reduction for both near-real-time and archived surface reflectance data from the MODIS instrument collected over Central California and test it using Orca and One-Class Support Vector Machines
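
    As a hedged illustration of the kind of detector named at the end of the abstract, the snippet below trains a One-Class Support Vector Machine on synthetic "normal" reflectance-like feature vectors and scores new pixels; it is not the TOPS/MODIS pipeline, and the feature construction is invented for the example.

```python
# Hedged sketch of one-class anomaly scoring on reflectance-like feature vectors.
# Feature construction here is synthetic and purely illustrative, not the TOPS pipeline.
from sklearn.svm import OneClassSVM
import numpy as np

rng = np.random.default_rng(0)
normal_pixels = rng.normal(loc=0.3, scale=0.05, size=(2000, 7))    # 7 reflectance bands
disturbed_pixels = rng.normal(loc=0.15, scale=0.05, size=(20, 7))  # e.g. burned/flooded area

model = OneClassSVM(kernel="rbf", nu=0.01, gamma="scale").fit(normal_pixels)
labels = model.predict(np.vstack([normal_pixels[:10], disturbed_pixels]))  # +1 normal, -1 anomaly
print(labels)
```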

  8. FRaC: a feature-modeling approach for semi-supervised and unsupervised anomaly detection

    PubMed Central

    Brodley, Carla; Slonim, Donna

    2011-01-01

    Anomaly detection involves identifying rare data instances (anomalies) that come from a different class or distribution than the majority (which are simply called “normal” instances). Given a training set of only normal data, the semi-supervised anomaly detection task is to identify anomalies in the future. Good solutions to this task have applications in fraud and intrusion detection. The unsupervised anomaly detection task is different: Given unlabeled, mostly-normal data, identify the anomalies among them. Many real-world machine learning tasks, including many fraud and intrusion detection tasks, are unsupervised because it is impractical (or impossible) to verify all of the training data. We recently presented FRaC, a new approach for semi-supervised anomaly detection. FRaC is based on using normal instances to build an ensemble of feature models, and then identifying instances that disagree with those models as anomalous. In this paper, we investigate the behavior of FRaC experimentally and explain why FRaC is so successful. We also show that FRaC is a superior approach for the unsupervised as well as the semi-supervised anomaly detection task, compared to well-known state-of-the-art anomaly detection methods, LOF and one-class support vector machines, and to an existing feature-modeling approach. PMID:22639542

  9. FRaC: a feature-modeling approach for semi-supervised and unsupervised anomaly detection.

    PubMed

    Noto, Keith; Brodley, Carla; Slonim, Donna

    2012-01-01

    Anomaly detection involves identifying rare data instances (anomalies) that come from a different class or distribution than the majority (which are simply called "normal" instances). Given a training set of only normal data, the semi-supervised anomaly detection task is to identify anomalies in the future. Good solutions to this task have applications in fraud and intrusion detection. The unsupervised anomaly detection task is different: Given unlabeled, mostly-normal data, identify the anomalies among them. Many real-world machine learning tasks, including many fraud and intrusion detection tasks, are unsupervised because it is impractical (or impossible) to verify all of the training data. We recently presented FRaC, a new approach for semi-supervised anomaly detection. FRaC is based on using normal instances to build an ensemble of feature models, and then identifying instances that disagree with those models as anomalous. In this paper, we investigate the behavior of FRaC experimentally and explain why FRaC is so successful. We also show that FRaC is a superior approach for the unsupervised as well as the semi-supervised anomaly detection task, compared to well-known state-of-the-art anomaly detection methods, LOF and one-class support vector machines, and to an existing feature-modeling approach.
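
    A much-simplified, hedged sketch of the FRaC idea follows: one model per feature is trained to predict that feature from the others using normal data, and a test instance is scored by how strongly it disagrees with those predictions. FRaC proper scores disagreement with per-feature surprisal and entropy terms; a standardized squared error is substituted here only to keep the sketch short, and all data and model choices are illustrative.

```python
# Hedged, simplified sketch of the FRaC idea (not the authors' exact scoring function).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def fit_feature_models(X_normal):
    models, residual_std = [], []
    for j in range(X_normal.shape[1]):
        other = np.delete(X_normal, j, axis=1)
        m = RandomForestRegressor(n_estimators=50, random_state=0).fit(other, X_normal[:, j])
        residual_std.append(np.std(X_normal[:, j] - m.predict(other)) + 1e-9)
        models.append(m)
    return models, np.array(residual_std)

def frac_like_score(X, models, residual_std):
    score = np.zeros(len(X))
    for j, m in enumerate(models):
        pred = m.predict(np.delete(X, j, axis=1))
        score += ((X[:, j] - pred) / residual_std[j]) ** 2    # disagreement with feature model j
    return score

rng = np.random.default_rng(1)
X_train = rng.normal(size=(500, 4))
X_train[:, 3] = X_train[:, 0] + 0.1 * rng.normal(size=500)    # learn a feature relationship
X_test = rng.normal(size=(5, 4))
X_test[:, 3] = X_test[:, 0]                                   # normal-looking instances
X_test[0, 3] = X_test[0, 0] + 5.0                             # break the learned relation
models, stds = fit_feature_models(X_train)
print(frac_like_score(X_test, models, stds))                  # instance 0 scores highest
```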

  10. Feasibility of anomaly detection and characterization using trans-admittance mammography with 60 × 60 electrode array

    NASA Astrophysics Data System (ADS)

    Zhao, Mingkang; Wi, Hun; Lee, Eun Jung; Woo, Eung Je; In Oh, Tong

    2014-10-01

    Electrical impedance imaging has the potential to detect an early stage of breast cancer due to higher admittivity values compared with those of normal breast tissues. The tumor size and extent of axillary lymph node involvement are important parameters to evaluate the breast cancer survival rate. Additionally, the anomaly characterization is required to distinguish a malignant tumor from a benign tumor. In order to overcome the limitation of breast cancer detection using impedance measurement probes, we developed the high density trans-admittance mammography (TAM) system with a 60 × 60 electrode array and produced trans-admittance maps obtained at several frequency pairs. We applied the anomaly detection algorithm to the high density TAM system for estimating the volume and position of a breast tumor. We tested four different sizes of anomaly with three different conductivity contrasts at four different depths. From multifrequency trans-admittance maps, we can readily observe the transversal position and estimate its volume and depth. Notably, the estimated depth values were accurate and independent of the size and conductivity contrast when applying the new formula using the Laplacian of the trans-admittance map. The volume estimation was dependent on the conductivity contrast between anomaly and background in the breast phantom. We characterized two testing anomalies using frequency difference trans-admittance data to eliminate the dependency on anomaly position and size. We confirmed the anomaly detection and characterization algorithm with the high density TAM system on bovine breast tissue. Both results showed the feasibility of detecting the size and position of an anomaly and of tissue characterization for breast cancer screening.

  11. Feasibility of anomaly detection and characterization using trans-admittance mammography with 60 × 60 electrode array.

    PubMed

    Zhao, Mingkang; Wi, Hun; Lee, Eun Jung; Woo, Eung Je; Oh, Tong In

    2014-10-07

    Electrical impedance imaging has the potential to detect an early stage of breast cancer due to higher admittivity values compared with those of normal breast tissues. The tumor size and extent of axillary lymph node involvement are important parameters to evaluate the breast cancer survival rate. Additionally, the anomaly characterization is required to distinguish a malignant tumor from a benign tumor. In order to overcome the limitation of breast cancer detection using impedance measurement probes, we developed the high density trans-admittance mammography (TAM) system with a 60 × 60 electrode array and produced trans-admittance maps obtained at several frequency pairs. We applied the anomaly detection algorithm to the high density TAM system for estimating the volume and position of a breast tumor. We tested four different sizes of anomaly with three different conductivity contrasts at four different depths. From multifrequency trans-admittance maps, we can readily observe the transversal position and estimate its volume and depth. Notably, the estimated depth values were accurate and independent of the size and conductivity contrast when applying the new formula using the Laplacian of the trans-admittance map. The volume estimation was dependent on the conductivity contrast between anomaly and background in the breast phantom. We characterized two testing anomalies using frequency difference trans-admittance data to eliminate the dependency on anomaly position and size. We confirmed the anomaly detection and characterization algorithm with the high density TAM system on bovine breast tissue. Both results showed the feasibility of detecting the size and position of an anomaly and of tissue characterization for breast cancer screening.
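
    The depth-estimation formula itself is not reproduced in the abstract, so the snippet below is only a hedged illustration of one ingredient it mentions: computing the Laplacian of a (simulated) trans-admittance map and reading off the transversal position of the anomaly at its extremum. The map, grid size, and blur width are invented for the example.

```python
# Hedged illustration only: locate an anomaly's transversal position from the Laplacian of a
# simulated trans-admittance map. This is not the paper's depth-estimation formula.
import numpy as np
from scipy.ndimage import laplace, gaussian_filter

grid = np.zeros((60, 60))
grid[34, 22] = 1.0
tam_map = gaussian_filter(grid, sigma=4)          # simulated trans-admittance perturbation
lap = laplace(tam_map)                            # discrete Laplacian of the map
row, col = np.unravel_index(np.argmin(lap), lap.shape)
print("estimated transversal position:", (row, col))
```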

  12. Online anomaly detection in crowd scenes via structure analysis.

    PubMed

    Yuan, Yuan; Fang, Jianwu; Wang, Qi

    2015-03-01

    Abnormal behavior detection in crowd scenes remains a challenge in the field of computer vision. To tackle this problem, this paper starts from a novel structural modeling of crowd behavior. We first propose an informative structural context descriptor (SCD) for describing individuals in the crowd, which introduces the potential energy function of interparticle forces from solid-state physics to intuitively conduct visual contextual cueing. For computing the crowd SCD variation effectively, we then design a robust multi-object tracker to associate the targets in different frames, which employs the incremental analytical ability of the 3-D discrete cosine transform (DCT). By online spatial-temporal analysis of the SCD variation of the crowd, the abnormality is finally localized. Our contribution mainly lies in three aspects: 1) the new exploration of abnormality detection from structural modeling, where the motion difference between individuals is computed by a novel selective histogram of optical flow, enabling the proposed method to deal with more kinds of anomalies; 2) the SCD description, which can effectively represent the relationships among individuals; and 3) the 3-D DCT multi-object tracker, which can robustly associate a limited number of (instead of all) targets, making tracking analysis feasible in high-density crowd situations. Experimental results on several publicly available crowd video datasets verify the effectiveness of the proposed method.

  13. Validation of semantic illusions independent of anomaly detection: evidence from eye movements.

    PubMed

    Cook, Anne E; Walsh, Erinn K; Bills, Margaret A A; Kircher, John C; O'Brien, Edward J

    2016-12-09

    Several theorists have argued that readers fail to detect semantic anomalies during reading, and that these effects are indicative of "shallow processing" behaviours. Previous studies of semantic anomalies such as the Moses illusion have focused primarily on explicit detection tasks. In the present study, we examined participants' eye movements as they read true/false statements that were non-anomalous, or contained a semantic anomaly that was either high- or low-related to the correct information. Analyses of reading behaviours revealed that only low-related detected anomalies resulted in initial processing difficulty, but both detected and undetected anomalies, regardless of whether they were high- or low-related, resulted in delayed processing difficulty. The results extend previous findings on semantic anomalies and are discussed in terms of the RI-Val model of text processing.

  14. A Comparative Study of Unsupervised Anomaly Detection Techniques Using Honeypot Data

    NASA Astrophysics Data System (ADS)

    Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Inoue, Daisuke; Eto, Masashi; Nakao, Koji

    Intrusion Detection Systems (IDS) have received considerable attention among network security researchers as one of the most promising countermeasures to defend our crucial computer systems or networks against attackers on the Internet. Over the past few years, many machine learning techniques have been applied to IDSs so as to improve their performance and to construct them with low cost and effort. Especially, unsupervised anomaly detection techniques have a significant advantage in their capability to identify unforeseen attacks, i.e., 0-day attacks, and to build intrusion detection models without any labeled (i.e., pre-classified) training data in an automated manner. In this paper, we conduct a set of experiments to evaluate and analyze performance of the major unsupervised anomaly detection techniques using real traffic data which are obtained at our honeypots deployed inside and outside of the campus network of Kyoto University, and using various evaluation criteria, i.e., performance evaluation by similarity measurements and the size of training data, overall performance, detection ability for unknown attacks, and time complexity. Our experimental results give some practical and useful guidelines to IDS researchers and operators, so that they can acquire insight to apply these techniques to the area of intrusion detection, and devise more effective intrusion detection models.

  15. Temporal Data-Driven Sleep Scheduling and Spatial Data-Driven Anomaly Detection for Clustered Wireless Sensor Networks.

    PubMed

    Li, Gang; He, Bin; Huang, Hongwei; Tang, Limin

    2016-09-28

    The spatial-temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most of the existing works based on the spatial-temporal correlation can be divided into two parts: redundancy reduction and anomaly detection. These two parts are pursued separately in existing works. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS can reduce data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on long and linear cluster structure in the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented in a cooperative way in the cluster. To keep the precision of sensor data, spatial data-driven anomaly detection based on the spatial correlation and Kriging method is realized to generate an anomaly indicator. The experiment results show that cooperative TDSS can realize non-uniform sensing effectively to reduce the energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data.

  16. Temporal Data-Driven Sleep Scheduling and Spatial Data-Driven Anomaly Detection for Clustered Wireless Sensor Networks

    PubMed Central

    Li, Gang; He, Bin; Huang, Hongwei; Tang, Limin

    2016-01-01

    The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most of the existing works based on the spatial–temporal correlation can be divided into two parts: redundancy reduction and anomaly detection. These two parts are pursued separately in existing works. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS can reduce data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on long and linear cluster structure in the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented in a cooperative way in the cluster. To keep the precision of sensor data, spatial data-driven anomaly detection based on the spatial correlation and Kriging method is realized to generate an anomaly indicator. The experiment results show that cooperative TDSS can realize non-uniform sensing effectively to reduce the energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data. PMID:27690035

  17. AnRAD: A Neuromorphic Anomaly Detection Framework for Massive Concurrent Data Streams.

    PubMed

    Chen, Qiuwen; Luley, Ryan; Wu, Qing; Bishop, Morgan; Linderman, Richard W; Qiu, Qinru

    2017-03-17

    The evolution of high performance computing technologies has enabled the large-scale implementation of neuromorphic models and pushed the research in computational intelligence into a new era. Among the machine learning applications, unsupervised detection of anomalous streams is especially challenging due to the requirements of detection accuracy and real-time performance. Designing a computing framework that harnesses the growing computing power of the multicore systems while maintaining high sensitivity and specificity to the anomalies is an urgent research topic. In this paper, we propose anomaly recognition and detection (AnRAD), a bioinspired detection framework that performs probabilistic inferences. We analyze the feature dependency and develop a self-structuring method that learns an efficient confabulation network using unlabeled data. This network is capable of fast incremental learning, which continuously refines the knowledge base using streaming data. Compared with several existing anomaly detection approaches, our method provides competitive detection quality. Furthermore, we exploit the massive parallel structure of the AnRAD framework. Our implementations of the detection algorithm on the graphic processing unit and the Xeon Phi coprocessor both obtain substantial speedups over the sequential implementation on general-purpose microprocessor. The framework provides real-time service to concurrent data streams within diversified knowledge contexts, and can be applied to large problems with multiple local patterns. Experimental results demonstrate high computing performance and memory efficiency. For vehicle behavior detection, the framework is able to monitor up to 16,000 vehicles (data streams) and their interactions in real time with a single commodity coprocessor, and uses less than 0.2 ms for one testing subject. Finally, the detection network is ported to our spiking neural network simulator to show the potential of adapting to the emerging

  18. Using QR Factorization for Real-Time Anomaly Detection in Hyperspectral Images

    DTIC Science & Technology

    2012-03-22

    vector µ and a new covariance matrix S are calculated using Equation 3 and Equation 4 respectively. Note that the contribution of the new pixel is... Finding Hyperspectral Anomalies. Military Operations Research, 13(4), 19-36. Stein, D. W., Beaven, S. G., Hoff, L. E., Winter, E. M., Schaum, A. P... anomaly detection methods have focused on analysis after the entire image has been collected. As useful as post-collection anomaly detection is, there is

  19. Radon anomalies: When are they possible to be detected?

    NASA Astrophysics Data System (ADS)

    Passarelli, Luigi; Woith, Heiko; Seyis, Cemil; Nikkhoo, Mehdi; Donner, Reik

    2017-04-01

    Records of the Radon noble gas in different environments like soil, air, groundwater, rock, caves, and tunnels, typically display cyclic variations including diurnal (S1), semidiurnal (S2) and seasonal components. But there are also cases where these cycles are absent. Interestingly, radon emission can also be affected by transient processes, which inhibit or enhance the radon carrying process at the surface. This results in transient changes in the radon emission rate, which are superimposed on the low and high frequency cycles. The complexity in the spectral contents of the radon time-series makes any statistical analysis aiming at understanding the physical driving processes a challenging task. In the past decades there have been several attempts to relate changes in radon emission rate with physical triggering processes such as earthquake occurrence. One of the problems in this type of investigation is to objectively detect anomalies in the radon time-series. In the present work, we propose a simple and objective statistical method for detecting changes in the radon emission rate time-series. The method uses non-parametric statistical tests (e.g., Kolmogorov-Smirnov) to compare empirical distributions of radon emission rate by sequentially applying various time windows to the time-series. The statistical test indicates whether two empirical distributions of data originate from the same distribution at a desired significance level. We test the algorithm on synthetic data in order to explore the sensitivity of the statistical test to the sample size. We successively apply the test to six radon emission rate recordings from stations located around the Marmara Sea obtained within the MARsite project (MARsite has received funding from the European Union's Seventh Programme for research, technological development and demonstration under grant agreement No 308417). We conclude that the test performs relatively well at identifying transient changes in the radon emission
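
    A hedged sketch of the windowed two-sample Kolmogorov-Smirnov screening described above is given below, using SciPy; the window length, significance level, and synthetic radon series are all assumptions made for the example.

```python
# Hedged sketch: compare the empirical distributions of radon emission rate in two successive
# windows with a two-sample Kolmogorov-Smirnov test and flag boundaries where the null
# hypothesis (same distribution) is rejected.
import numpy as np
from scipy.stats import ks_2samp

def ks_change_scan(series, window=200, alpha=0.01):
    change_points = []
    for start in range(window, len(series) - window, window):
        stat, p = ks_2samp(series[start - window:start], series[start:start + window])
        if p < alpha:
            change_points.append(start)
    return change_points

rng = np.random.default_rng(2)
radon = rng.normal(1000, 50, size=3000)
radon[2000:] += 300                                # transient enhancement of emission rate
print(ks_change_scan(radon))
```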

  20. Developing a new, passive diffusion sampling array to detect helium anomalies associated with volcanic unrest

    USGS Publications Warehouse

    Dame, Brittany E; Solomon, D Kip; Evans, William C.; Ingebritsen, Steven E.

    2015-01-01

    Helium (He) concentration and 3He/4He anomalies in soil gas and spring water are potentially powerful tools for investigating hydrothermal circulation associated with volcanism and could perhaps serve as part of a hazards warning system. However, in operational practice, He and other gases are often sampled only after volcanic unrest is detected by other means. A new passive diffusion sampler suite, intended to be collected after the onset of unrest, has been developed and tested as a relatively low-cost method of determining He-isotope composition pre- and post-unrest. The samplers, each with a distinct equilibration time, passively record He concentration and isotope ratio in springs and soil gas. Once collected and analyzed, the He concentrations in the samplers are used to deconvolve the time history of the He concentration and the 3He/4He ratio at the collection site. The current suite consisting of three samplers is sufficient to deconvolve both the magnitude and the timing of a step change in in situ concentration if the suite is collected within 100 h of the change. The effects of temperature and prolonged deployment on the suite's capability of recording He anomalies have also been evaluated. The suite has captured a significant 3He/4He soil gas anomaly at Horseshoe Lake near Mammoth Lakes, California. The passive diffusion sampler suite appears to be an accurate and affordable alternative for determining He anomalies associated with volcanic unrest.

  1. Developing a new, passive diffusion sampler suite to detect helium anomalies associated with volcanic unrest

    NASA Astrophysics Data System (ADS)

    Dame, Brittany E.; Solomon, D. Kip; Evans, William C.; Ingebritsen, Steven E.

    2015-03-01

    Helium (He) concentration and 3He/4He anomalies in soil gas and spring water are potentially powerful tools for investigating hydrothermal circulation associated with volcanism and could perhaps serve as part of a hazards warning system. However, in operational practice, He and other gases are often sampled only after volcanic unrest is detected by other means. A new passive diffusion sampler suite, intended to be collected after the onset of unrest, has been developed and tested as a relatively low-cost method of determining He-isotope composition pre- and post-unrest. The samplers, each with a distinct equilibration time, passively record He concentration and isotope ratio in springs and soil gas. Once collected and analyzed, the He concentrations in the samplers are used to deconvolve the time history of the He concentration and the 3He/4He ratio at the collection site. The current suite consisting of three samplers is sufficient to deconvolve both the magnitude and the timing of a step change in in situ concentration if the suite is collected within 100 h of the change. The effects of temperature and prolonged deployment on the suite's capability of recording He anomalies have also been evaluated. The suite has captured a significant 3He/4He soil gas anomaly at Horseshoe Lake near Mammoth Lakes, California. The passive diffusion sampler suite appears to be an accurate and affordable alternative for determining He anomalies associated with volcanic unrest.

  2. ENTVis: A Visual Analytic Tool for Entropy-Based Network Traffic Anomaly Detection.

    PubMed

    Zhou, Fangfang; Huang, Wei; Zhao, Ying; Shi, Yang; Liang, Xing; Fan, Xiaoping

    2015-01-01

    Entropy-based traffic metrics have received substantial attention in network traffic anomaly detection because entropy can provide fine-grained metrics of traffic distribution characteristics. However, some practical issues--such as ambiguity, lack of detailed distribution information, and a large number of false positives--affect the application of entropy-based traffic anomaly detection. In this work, we introduce a visual analytic tool called ENTVis to help users understand entropy-based traffic metrics and achieve accurate traffic anomaly detection. ENTVis provides three coordinated views and rich interactions to support a coherent visual analysis on multiple perspectives: the timeline group view for perceiving situations and finding hints of anomalies, the Radviz view for clustering similar anomalies in a period, and the matrix view for understanding traffic distributions and diagnosing anomalies in detail. Several case studies have been performed to verify the usability and effectiveness of our method. A further evaluation was conducted via expert review.
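
    As a hedged illustration of the entropy-based traffic metrics that ENTVis visualizes (not the tool itself), the snippet below computes the Shannon entropy of a per-time-bin source-address distribution and flags bins whose entropy deviates strongly from the overall baseline; the feature choice and threshold are assumptions.

```python
# Hedged sketch of an entropy-based traffic metric: low entropy in a bin can indicate that a
# single address dominates the traffic (e.g. a DDoS victim or scanner).
import numpy as np

def shannon_entropy(counts):
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

def entropy_anomalies(bins_of_counts, k=3.0):
    ent = np.array([shannon_entropy(c) for c in bins_of_counts])
    mu, sigma = ent.mean(), ent.std() + 1e-12
    return np.where(np.abs(ent - mu) > k * sigma)[0], ent

rng = np.random.default_rng(3)
bins = [rng.multinomial(10000, np.full(256, 1 / 256)) for _ in range(48)]   # balanced traffic
bins[30] = rng.multinomial(10000, np.r_[0.9, np.full(255, 0.1 / 255)])      # one address dominates
flagged, _ = entropy_anomalies(bins)
print(flagged)
```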

  3. Detection of Lexical and Morphological Anomalies by Children with and without Language Impairment

    ERIC Educational Resources Information Center

    Pawlowska, Monika; Robinson, Sarah; Seddoh, Amebu

    2014-01-01

    Purpose: The abilities of 5-year-old children with and without language impairment (LI) to detect anomalies involving lexical items and grammatical morphemes in stories were compared. The influence of sentence versus discourse context on lexical anomaly detection rates was explored. Method: The participants were read 3 story scripts and asked to…

  5. State of the Art in Anomaly Detection and Reaction: An Update

    DTIC Science & Technology

    2000-02-24

    This paper is a supplement to the author’s May 1999 report, "State of the Art in Anomaly Detection and Reaction," (MP-99B000020). Although this...state of the art in anomaly detection and reaction as it is described in the 1999 report. There have been some noteworthy developments in the past year

  6. Clusters versus GPUs for Parallel Target and Anomaly Detection in Hyperspectral Images

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2010-12-01

    Remotely sensed hyperspectral sensors provide image data containing rich information in both the spatial and the spectral domain, and this information can be used to address detection tasks in many applications. In many surveillance applications, the size of the objects (targets) searched for constitutes a very small fraction of the total search area and the spectral signatures associated to the targets are generally different from those of the background, hence the targets can be seen as anomalies. In hyperspectral imaging, many algorithms have been proposed for automatic target and anomaly detection. Given the dimensionality of hyperspectral scenes, these techniques can be time-consuming and difficult to apply in applications requiring real-time performance. In this paper, we develop several new parallel implementations of automatic target and anomaly detection algorithms. The proposed parallel algorithms are quantitatively evaluated using hyperspectral data collected by the NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) system over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex.

  7. Parallel implementation of RX anomaly detection on multi-core processors: impact of data partitioning strategies

    NASA Astrophysics Data System (ADS)

    Molero, Jose M.; Garzón, Ester M.; García, Inmaculada; Plaza, Antonio

    2011-11-01

    Anomaly detection is an important task for remotely sensed hyperspectral data exploitation. One of the most widely used and successful algorithms for anomaly detection in hyperspectral images is the Reed-Xiaoli (RX) algorithm. Despite its wide acceptance and high computational complexity when applied to real hyperspectral scenes, few documented parallel implementations of this algorithm exist, in particular for multi-core processors. The advantage of multi-core platforms over other specialized parallel architectures is that they are a low-power, inexpensive, widely available and well-known technology. A critical issue in the parallel implementation of RX is the sample covariance matrix calculation, which can be approached in global or local fashion. This aspect is crucial for the RX implementation since the consideration of a local or global strategy for the computation of the sample covariance matrix is expected to affect both the scalability of the parallel solution and the anomaly detection results. In this paper, we develop new parallel implementations of the RX in multi-core processors and specifically investigate the impact of different data partitioning strategies when parallelizing its computations. For this purpose, we consider both global and local data partitioning strategies in the spatial domain of the scene, and further analyze their scalability in different multi-core platforms. The numerical effectiveness of the considered solutions is evaluated using receiver operating characteristics (ROC) curves, analyzing their capacity to detect thermal hot spots (anomalies) in hyperspectral data collected by the NASA's Airborne Visible Infra-Red Imaging Spectrometer system over the World Trade Center in New York, five days after the terrorist attacks of September 11th, 2001.

  8. A new morphological anomaly detection algorithm for hyperspectral images and its GPU implementation

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2011-10-01

    Anomaly detection is considered a very important task for hyperspectral data exploitation. It is now routinely applied in many application domains, including defence and intelligence, public safety, precision agriculture, geology, or forestry. Many of these applications require timely responses for swift decisions which depend upon high computing performance of algorithm analysis. However, with the recent explosion in the amount and dimensionality of hyperspectral imagery, this problem calls for the incorporation of parallel computing techniques. In the past, clusters of computers have offered an attractive solution for fast anomaly detection in hyperspectral data sets already transmitted to Earth. However, these systems are expensive and difficult to adapt to on-board data processing scenarios, in which low-weight and low-power integrated components are essential to reduce mission payload and obtain analysis results in (near) real-time, i.e., at the same time as the data is collected by the sensor. An exciting new development in the field of commodity computing is the emergence of commodity graphics processing units (GPUs), which can now bridge the gap towards on-board processing of remotely sensed hyperspectral data. In this paper, we develop a new morphological algorithm for anomaly detection in hyperspectral images along with an efficient GPU implementation of the algorithm. The algorithm is implemented on latest-generation GPU architectures, and evaluated with regards to other anomaly detection algorithms using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex. The proposed GPU implementation achieves real-time performance in the considered case study.

  9. Multiprobe in-situ measurement of magnetic field in a minefield via a distributed network of miniaturized low-power integrated sensor systems for detection of magnetic field anomalies

    NASA Astrophysics Data System (ADS)

    Javadi, Hamid H. S.; Bendrihem, David; Blaes, B.; Boykins, Kobe; Cardone, John; Cruzan, C.; Gibbs, J.; Goodman, W.; Lieneweg, U.; Michalik, H.; Narvaez, P.; Perrone, D.; Rademacher, Joel D.; Snare, R.; Spencer, Howard; Sue, Miles; Weese, J.

    1998-09-01

    Based on technologies developed for the Jet Propulsion Laboratory (JPL) Free-Flying-Magnetometer (FFM) concept, we propose to modify the present design of FFMs for detection of mines and arsenals with large magnetic signature. The result will be an integrated miniature sensor system capable of identifying local magnetic field anomaly caused by a magnetic dipole moment. Proposed integrated sensor system is in line with the JPL technology road-map for development of autonomous, intelligent, networked, integrated systems with a broad range of applications. In addition, advanced sensitive magnetic sensors (e.g., silicon micromachined magnetometer, laser pumped helium magnetometer) are being developed for future NASA space plasma probes. It is envisioned that a fleet of these Integrated Sensor Systems (ISS) units will be dispersed on a mine-field via an aerial vehicle (a low-flying airplane or helicopter). The number of such sensor systems in each fleet and the corresponding in-situ probe-grid cell size is based on the strength of magnetic anomaly of the target and ISS measurement resolution of magnetic field vector. After a specified time, ISS units will transmit the measured magnetic field and attitude data to an air-borne platform for further data processing. The cycle of data acquisition and transmission will be continued until batteries run out. Data analysis will allow a local deformation of the Earth's magnetic field vector by a magnetic dipole moment to be detected. Each ISS unit consists of miniaturized sensitive 3- axis magnetometer, high resolution analog-to-digital converter (ADC), Field Programmable Gate Array (FPGA)-based data subsystem, Li-batteries and power regulation circuitry, memory, S-band transmitter, single-patch antenna, and a sun angle sensor. ISS unit is packaged with non-magnetic components and the electronic design implements low-magnetic signature circuits. Care is undertaken to guarantee no corruption of magnetometer sensitivity as a result

  10. A Mobile Device System for Early Warning of ECG Anomalies

    PubMed Central

    Szczepański, Adam; Saeed, Khalid

    2014-01-01

    With the rapid increase in computational power of mobile devices the amount of ambient intelligence-based smart environment systems has increased greatly in recent years. A proposition of such a solution is described in this paper, namely real time monitoring of an electrocardiogram (ECG) signal during everyday activities for identification of life threatening situations. The paper, being both research and review, describes previous work of the authors, current state of the art in the context of the authors' work and the proposed aforementioned system. Although parts of the solution were described in earlier publications of the authors, the whole concept is presented completely for the first time along with the prototype implementation on mobile device—a Windows 8 tablet with Modern UI. The system has three main purposes. The first goal is the detection of sudden rapid cardiac malfunctions and informing the people in the patient's surroundings, family and friends and the nearest emergency station about the deteriorating health of the monitored person. The second goal is a monitoring of ECG signals under non-clinical conditions to detect anomalies that are typically not found during diagnostic tests. The third goal is to register and analyze repeatable, long-term disturbances in the regular signal and finding their patterns. PMID:24955946

  11. Resampling approach for anomaly detection in multispectral images

    SciTech Connect

    Theiler, J. P.; Cai, D.

    2003-01-01

    We propose a novel approach for identifying the 'most unusual' samples in a data set, based on a resampling of data attributes. The resampling produces a 'background class' and then binary classification is used to distinguish the original training set from the background. Those in the training set that are most like the background (i.e., most unlike the rest of the training set) are considered anomalous. Although by their nature, anomalies do not permit a positive definition (if I knew what they were, I wouldn't call them anomalies), one can make 'negative definitions' (I can say what does not qualify as an interesting anomaly). By choosing different resampling schemes, one can identify different kinds of anomalies. For multispectral images, anomalous pixels correspond to locations on the ground with unusual spectral signatures or, depending on how feature sets are constructed, unusual spatial textures.
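
    The following hedged sketch shows one concrete instance of the resampling idea: build the "background class" by independently permuting each attribute column (destroying inter-attribute structure), train a binary classifier to separate real data from background, and treat the real samples that look most background-like as anomalous. The permutation scheme and classifier choice are illustrative, not necessarily the authors' specific scheme.

```python
# Hedged sketch of anomaly scoring by resampling attributes into a "background class".
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def resampling_anomaly_scores(X, random_state=0):
    rng = np.random.default_rng(random_state)
    background = np.column_stack([rng.permutation(X[:, j]) for j in range(X.shape[1])])
    data = np.vstack([X, background])
    labels = np.r_[np.ones(len(X)), np.zeros(len(background))]     # 1 = real, 0 = background
    clf = RandomForestClassifier(n_estimators=200, random_state=random_state).fit(data, labels)
    return clf.predict_proba(X)[:, 0]            # probability of "background" = anomaly score

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 2))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=500)  # tight joint structure
X[0] = [2.0, -2.0]                               # violates the joint structure, not the marginals
print(np.argsort(resampling_anomaly_scores(X))[-3:])
```

    Choosing a different resampling scheme changes which departures from the training data are treated as anomalous, which is the flexibility the abstract points out.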

  12. Data-Driven Anomaly Detection Performance for the Ares I-X Ground Diagnostic Prototype

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Schwabacher, Mark A.; Matthews, Bryan L.

    2010-01-01

    In this paper, we will assess the performance of a data-driven anomaly detection algorithm, the Inductive Monitoring System (IMS), which can be used to detect simulated Thrust Vector Control (TVC) system failures. However, the ability of IMS to detect these failures in a true operational setting may be related to the realistic nature of how they are simulated. As such, we will investigate both a low fidelity and high fidelity approach to simulating such failures, with the latter based upon the underlying physics. Furthermore, the ability of IMS to detect anomalies that were previously unknown and not previously simulated will be studied in earnest, as well as apparent deficiencies or misapplications that result from using the data-driven paradigm. Our conclusions indicate that robust detection performance of simulated failures using IMS is not appreciably affected by the use of a high fidelity simulation. However, we have found that the inclusion of a data-driven algorithm such as IMS into a suite of deployable health management technologies does add significant value.

  13. An anomaly detection and isolation scheme with instance-based learning and sequential analysis

    SciTech Connect

    Yoo, T. S.; Garcia, H. E.

    2006-07-01

    This paper presents an online anomaly detection and isolation (FDI) technique using an instance-based learning method combined with a sequential change detection and isolation algorithm. The proposed method uses kernel density estimation techniques to build statistical models of the given empirical data (null hypothesis). The null hypothesis is associated with the set of alternative hypotheses modeling the abnormalities of the systems. The decision procedure involves a sequential change detection and isolation algorithm. Notably, the proposed method enjoys asymptotic optimality, as the applied change detection and isolation algorithm is optimal in minimizing the worst mean detection/isolation delay for a given mean time before a false alarm or a false isolation. The applicability and performance of this methodology are illustrated with a redundant sensor data set. (authors)
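
    A hedged sketch combining the two ingredients the abstract names, a kernel density estimate of normal behaviour and a sequential (CUSUM-style) change statistic, is given below. The single shifted alternative hypothesis and the alarm threshold are assumptions for illustration; the paper's multi-hypothesis isolation step is not reproduced.

```python
# Hedged sketch: KDE model of the null (normal) behaviour plus a CUSUM-style cumulative
# log-likelihood ratio against one shifted alternative.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
normal_data = rng.normal(0.0, 1.0, size=2000)
null_kde = gaussian_kde(normal_data)                 # empirical model of normal behaviour
alt_kde = gaussian_kde(normal_data + 2.0)            # alternative hypothesis: mean shifted by +2

def cusum_alarm(stream, threshold=10.0):
    score = 0.0
    for t, x in enumerate(stream):
        llr = np.log(alt_kde(x)[0] + 1e-300) - np.log(null_kde(x)[0] + 1e-300)
        score = max(0.0, score + llr)                # CUSUM recursion
        if score > threshold:
            return t                                  # alarm time
    return None

stream = np.r_[rng.normal(0, 1, 300), rng.normal(2, 1, 100)]   # change occurs at t = 300
print(cusum_alarm(stream))
```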

  14. Multiple Kernel Learning for Heterogeneous Anomaly Detection: Algorithm and Aviation Safety Case Study

    NASA Technical Reports Server (NTRS)

    Das, Santanu; Srivastava, Ashok N.; Matthews, Bryan L.; Oza, Nikunj C.

    2010-01-01

    The world-wide aviation system is one of the most complex dynamical systems ever developed and is generating data at an extremely rapid rate. Most modern commercial aircraft record several hundred flight parameters including information from the guidance, navigation, and control systems, the avionics and propulsion systems, and the pilot inputs into the aircraft. These parameters may be continuous measurements or binary or categorical measurements recorded in one second intervals for the duration of the flight. Currently, most approaches to aviation safety are reactive, meaning that they are designed to react to an aviation safety incident or accident. In this paper, we discuss a novel approach based on the theory of multiple kernel learning to detect potential safety anomalies in very large data bases of discrete and continuous data from world-wide operations of commercial fleets. We pose a general anomaly detection problem which includes both discrete and continuous data streams, where we assume that the discrete streams have a causal influence on the continuous streams. We also assume that atypical sequence of events in the discrete streams can lead to off-nominal system performance. We discuss the application domain, novel algorithms, and also discuss results on real-world data sets. Our algorithm uncovers operationally significant events in high dimensional data streams in the aviation industry which are not detectable using state of the art methods

  15. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data

    PubMed Central

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied on unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection as well as in the life science and medical domain. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-funded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, computational effort, the impact of parameter settings as well as the global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks. PMID:27093601

  16. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data.

    PubMed

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied on unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection as well as in the life science and medical domain. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-funded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, computational effort, the impact of parameter settings as well as the global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks.

  17. Structural chromosomal anomalies detected by prenatal genetic diagnosis: our experience.

    PubMed

    Farcaş, Simona; Crişan, C D; Andreescu, Nicoleta; Stoian, Monica; Motoc, A G M

    2013-01-01

    Prenatal diagnosis is now widely available and provides important genetic information about the fetus at an unprecedented pace. In this paper, we present our experience with the genetic diagnosis and counseling offered for pregnancies in which a structural chromosomal aberration was found. The study group comprised 528 prenatal samples of amniotic fluid and chorionic villi received by our laboratory from 2006 through October 2012 for cytogenetic diagnosis. The appropriate genetic investigation was selected based on the indications for prenatal diagnosis. The cases with structural chromosomal anomalies and polymorphic variants were analyzed with regard to maternal age, gestational age, referral indications and type of chromosomal anomaly found. A total number of 21 structural chromosomal anomalies and polymorphic variants were identified in the study group. Out of these 21 structural chromosomal anomalies and polymorphic variants, six deletions and microdeletions, four cases with an abnormally long "p" arm of an acrocentric chromosome, two duplications, two reciprocal translocations, two inversions, two additions, one Robertsonian translocation associated with trisomy 13, one 9q heteromorphism and one complex chromosome rearrangement were noted. To the best of our knowledge, this is the first Romanian study in which the diagnostic strategies and the management of prenatal cases with structural rearrangements are presented. The data provided about the diagnostic strategy and the management of prenatal cases with structural chromosomal anomalies represent a useful tool in the genetic counseling of pregnancies diagnosed with rare structural chromosomal anomalies.

  18. Detecting anomalies in CMB maps: a new method

    SciTech Connect

    Neelakanta, Jayanth T.

    2015-10-01

    Ever since WMAP announced its first results, different analyses have shown that there is weak evidence for several large-scale anomalies in the CMB data. While the evidence for each anomaly appears to be weak, the fact that there are multiple seemingly unrelated anomalies makes it difficult to account for them via a single statistical fluke. So, one is led to considering a combination of these anomalies. But, if we "hand-pick" the anomalies (test statistics) to consider, we are making an a posteriori choice. In this article, we propose two statistics that do not suffer from this problem. The statistics are linear and quadratic combinations of the a_{ℓm}'s with random coefficients, and they test the null hypothesis that the a_{ℓm}'s are independent, normally-distributed, zero-mean random variables with an m-independent variance. The motivation for considering multiple modes is this: because most physical models that lead to large-scale anomalies result in coupling multiple ℓ and m modes, the "coherence" of this coupling should get enhanced if a combination of different modes is considered. In this sense, the statistics are thus much more generic than those that have been hitherto considered in literature. Using fiducial data, we demonstrate that the method works and discuss how it can be used with actual CMB data to make quite general statements about the incompatibility of the data with the null hypothesis.

  19. Detecting anomalies in CMB maps: a new method

    NASA Astrophysics Data System (ADS)

    Neelakanta, Jayanth T.

    2015-10-01

    Ever since WMAP announced its first results, different analyses have shown that there is weak evidence for several large-scale anomalies in the CMB data. While the evidence for each anomaly appears to be weak, the fact that there are multiple seemingly unrelated anomalies makes it difficult to account for them via a single statistical fluke. So, one is led to considering a combination of these anomalies. But, if we "hand-pick" the anomalies (test statistics) to consider, we are making an a posteriori choice. In this article, we propose two statistics that do not suffer from this problem. The statistics are linear and quadratic combinations of the a_{ℓm}'s with random coefficients, and they test the null hypothesis that the a_{ℓm}'s are independent, normally-distributed, zero-mean random variables with an m-independent variance. The motivation for considering multiple modes is this: because most physical models that lead to large-scale anomalies result in coupling multiple ℓ and m modes, the "coherence" of this coupling should get enhanced if a combination of different modes is considered. In this sense, the statistics are thus much more generic than those that have been hitherto considered in literature. Using fiducial data, we demonstrate that the method works and discuss how it can be used with actual CMB data to make quite general statements about the incompatibility of the data with the null hypothesis.
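
    A toy, hedged version of the proposed test can be written down directly: draw fixed random coefficients, form a linear and a quadratic combination of (simulated) a_{ℓm}'s, and compare the observed values to a Monte Carlo null of independent, zero-mean, unit-variance Gaussians. The mode count, coefficient distribution, and injected signal are invented for the example and are not the authors' exact construction.

```python
# Hedged toy version of random-coefficient linear and quadratic statistics on simulated a_lm's,
# with Monte Carlo p-values under the null hypothesis of independent standard normal modes.
import numpy as np

rng = np.random.default_rng(6)
n_modes = 200                                       # stand-in for the set of (l, m) modes used
coeff = rng.normal(size=n_modes)                    # random coefficients, fixed before testing

def statistics(alm):
    linear = coeff @ alm
    quadratic = coeff @ (alm ** 2)
    return linear, quadratic

null_lin, null_quad = zip(*(statistics(rng.normal(size=n_modes)) for _ in range(5000)))
observed = rng.normal(size=n_modes)
observed[:20] += 0.5                                # inject a weak coherent coupling across modes
obs_lin, obs_quad = statistics(observed)
p_lin = np.mean(np.abs(null_lin) >= abs(obs_lin))
p_quad = np.mean(np.abs(null_quad) >= abs(obs_quad))
print(p_lin, p_quad)
```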

  20. A healthcare utilization analysis framework for hot spotting and contextual anomaly detection.

    PubMed

    Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram

    2012-01-01

    Patient medical records today contain vast amount of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify in a timely manner heavy users of the systems and their patterns of utilization so that targeted intervention programs can be instituted, and anomaly detection: the ability to identify anomalous utilization cases where the patients incurred levels of utilization that are unexpected given their clinical characteristics which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map patient's clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. In future work we plan to incorporate additional sources of observational data including EMRs and disease registries, and develop analytics models to leverage temporal relationships among

  1. A Healthcare Utilization Analysis Framework for Hot Spotting and Contextual Anomaly Detection

    PubMed Central

    Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram

    2012-01-01

    Patient medical records today contain vast amount of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify in a timely manner heavy users of the systems and their patterns of utilization so that targeted intervention programs can be instituted, and anomaly detection: the ability to identify anomalous utilization cases where the patients incurred levels of utilization that are unexpected given their clinical characteristics which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map patient’s clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. In future work we plan to incorporate additional sources of observational data including EMRs and disease registries, and develop analytics models to leverage temporal relationships among

  2. On Predictability of System Anomalies in Real World

    DTIC Science & Technology

    2011-08-01

    Different from prior work such as mining statistical models of availability in large-scale distributed systems (e.g., SETI@home [44]), this work focuses on quantifying the predictability of real-world system anomalies. (Cited: J.-M. Vincent and D. Anderson, "Mining for statistical models of availability in large-scale distributed systems: An empirical study of SETI@home," in Proc. of MASCOTS, Sept. 2009.)

  3. Detecting Distributed Network Traffic Anomaly with Network-Wide Correlation Analysis

    NASA Astrophysics Data System (ADS)

    Zonglin, Li; Guangmin, Hu; Xingmiao, Yao; Dan, Yang

    2008-12-01

    Distributed network traffic anomaly refers to a traffic abnormal behavior involving many links of a network and caused by the same source (e.g., DDoS attack, worm propagation). The anomaly transiting in a single link might be unnoticeable and hard to detect, while the anomalous aggregation from many links can be prevailing, and does more harm to the networks. Aiming at the similar features of distributed traffic anomaly on many links, this paper proposes a network-wide detection method by performing anomalous correlation analysis of traffic signals' instantaneous parameters. In our method, traffic signals' instantaneous parameters are firstly computed, and their network-wide anomalous space is then extracted via traffic prediction. Finally, an anomaly is detected by a global correlation coefficient of anomalous space. Our evaluation using Abilene traffic traces demonstrates the excellent performance of this approach for distributed traffic anomaly detection.
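
    A toy sketch of the network-wide idea (not the paper's implementation): each link's traffic is forecast with a simple trailing-average predictor, the residuals play the role of the anomalous space, and a global correlation coefficient across links summarizes how coordinated the residuals are. The forecaster, window length, surge size, and synthetic traffic below are placeholders.

        import numpy as np

        rng = np.random.default_rng(1)

        def residuals(series, window=12):
            """Residuals of a naive forecaster that predicts each value as the mean of the previous `window` samples."""
            pred = np.array([series[t - window:t].mean() for t in range(window, len(series))])
            return series[window:] - pred

        def global_correlation(link_traffic, window=12):
            """Mean pairwise correlation of the per-link residuals (a simple 'global correlation coefficient')."""
            res = np.array([residuals(s, window) for s in link_traffic])
            corr = np.corrcoef(res)
            iu = np.triu_indices_from(corr, k=1)
            return corr[iu].mean()

        # Synthetic example: 8 links with independent noise, plus a coordinated surge on the last 20 steps.
        n_links, n_steps = 8, 500
        traffic = rng.normal(100, 5, size=(n_links, n_steps))
        traffic[:, -20:] += np.linspace(0, 50, 20)        # same-source anomaly seen on every link

        print("global correlation (quiet period):    ", round(global_correlation(traffic[:, :400]), 3))
        print("global correlation (anomalous period):", round(global_correlation(traffic[:, 380:]), 3))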

  4. High-resolution microarray in the assessment of fetal anomalies detected by ultrasound.

    PubMed

    Charan, Poonam; Woodrow, Nicole; Walker, Sue P; Ganesamoorthy, Devika; McGillivray, George; Palma-Dias, Ricardo

    2014-02-01

    The main aim of this study was to determine the feasibility of using high-resolution microarray to assist with prenatal diagnosis of ultrasound-detected fetal abnormality and to describe the frequency of abnormal results in different categories of fetal anomalies. A prospective cross-sectional study was conducted on women diagnosed with a fetal anomaly (or anomalies) between February 2009 and December 2011 who were offered testing by microarray analysis (Affymetrix 2.7M SNP) and fluorescent in situ hybridisation (FISH) instead of standard karyotyping. Fetal anomalies were categorised according to organ system involvement. One hundred and eighteen women consented to testing with microarray. Eleven of one hundred and eighteen (9.3%) cases had aneuploidy detected by FISH. Of the remaining 107, 23 (21.5%) had an abnormality detected on microarray, only three of which would have been detected using the combination of six-probe FISH and banded karyotype. The maximum expected yield for six-probe FISH and karyotype was thus 14/118 (11.8%), compared to 34/118 (28.8%), P < 0.0001. Of the 23 abnormalities detected with microarray, 10 (43%) were pathogenic, six (26%) were long continuous stretches of homozygosity and seven (30%) were of uncertain significance. The maximum yield was in cases with cardiovascular (100%); multiple (40%); central nervous system (CNS) (25%) and skeletal (9%) abnormalities. This study has confirmed the feasibility of translation of microarray into clinical practice. In 11.8% (14/118) of the cases, a genetic basis for the abnormality would have been identified with FISH and banded karyotype. This figure is approximately tripled to 28.8% (34/118) if we offer FISH and microarray. The highest yields for imbalances were in multiple, cardiovascular, CNS and skeletal abnormalities. © 2014 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.

  5. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    NASA Astrophysics Data System (ADS)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

    The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a multimodal sensor analysis framework which enables the analysis of sensors in multimodal aspect. It also provides plugin-based analyzing interfaces to develop sensor and image processing based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. With the usage of this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard brake, pothole crossing, and speed bump crossing. Upon such detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.
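
    For illustration only, a minimal accelerometer rule of the kind such a detector might start from; the sampling rate, jerk threshold, merging gap, and synthetic trace below are invented and are not taken from the framework described above.

        import numpy as np

        def detect_events(accel, fs=50, jerk_threshold=20.0, min_gap_s=1.0):
            """Flag candidate on-road anomalies (hard brake, pothole, speed bump) at instants
            where the sample-to-sample change in acceleration magnitude (jerk) is large.

            accel : (N, 3) array of accelerometer samples in m/s^2
            fs    : sampling rate in Hz
            """
            magnitude = np.linalg.norm(accel, axis=1)
            jerk = np.abs(np.diff(magnitude)) * fs          # finite-difference jerk estimate
            candidates = np.flatnonzero(jerk > jerk_threshold)

            # Merge detections closer together than min_gap_s into single events.
            events, min_gap = [], int(min_gap_s * fs)
            for idx in candidates:
                if not events or idx - events[-1] > min_gap:
                    events.append(idx)
            return [i / fs for i in events]                 # event times in seconds

        # Synthetic 10 s trace: gravity plus mild noise, with a sharp jolt at t = 5 s.
        rng = np.random.default_rng(0)
        fs = 50
        t = np.arange(0, 10, 1 / fs)
        accel = np.column_stack([np.zeros_like(t), np.zeros_like(t),
                                 9.81 + 0.05 * rng.standard_normal(t.size)])
        accel[int(5 * fs), 2] += 6.0
        print("candidate anomaly times (s):", detect_events(accel, fs))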

  6. Gaussian Process Regression-Based Video Anomaly Detection and Localization With Hierarchical Feature Representation.

    PubMed

    Cheng, Kai-Wen; Chen, Yie-Tarng; Fang, Wen-Hsien

    2015-12-01

    This paper presents a hierarchical framework for detecting local and global anomalies via hierarchical feature representation and Gaussian process regression (GPR) which is fully non-parametric and robust to the noisy training data, and supports sparse features. While most research on anomaly detection has focused more on detecting local anomalies, we are more interested in global anomalies that involve multiple normal events interacting in an unusual manner, such as car accidents. To simultaneously detect local and global anomalies, we cast the extraction of normal interactions from the training videos as a problem of finding the frequent geometric relations of the nearby sparse spatio-temporal interest points (STIPs). A codebook of interaction templates is then constructed and modeled using the GPR, based on which a novel inference method for computing the likelihood of an observed interaction is also developed. Thereafter, these local likelihood scores are integrated into globally consistent anomaly masks, from which anomalies can be succinctly identified. To the best of our knowledge, it is the first time GPR is employed to model the relationship of the nearby STIPs for anomaly detection. Simulations based on four widespread datasets show that the new method outperforms the main state-of-the-art methods with lower computational burden.

  7. Neural Network Noise Anomaly Recognition System and Method

    DTIC Science & Technology

    2000-10-04

    determine when an input waveform deviates from learned noise characteristics. A plurality of neural networks is preferably provided, each of which receives a plurality of samples of intervals or windows of the input waveform. Each of the neural networks produces an output based on whether an anomaly is detected with respect to the noise that the neural network is trained to detect. The plurality of outputs of the neural networks is preferably applied to

  8. Small-scale anomaly detection in panoramic imaging using neural models of low-level vision

    NASA Astrophysics Data System (ADS)

    Casey, Matthew C.; Hickman, Duncan L.; Pavlou, Athanasios; Sadler, James R. E.

    2011-06-01

    Our understanding of sensory processing in animals has reached the stage where we can exploit neurobiological principles in commercial systems. In human vision, one brain structure that offers insight into how we might detect anomalies in real-time imaging is the superior colliculus (SC). The SC is a small structure that rapidly orients our eyes to a movement, sound or touch that it detects, even when the stimulus may be on a small-scale; think of a camouflaged movement or the rustle of leaves. This automatic orientation allows us to prioritize the use of our eyes to raise awareness of a potential threat, such as a predator approaching stealthily. In this paper we describe the application of a neural network model of the SC to the detection of anomalies in panoramic imaging. The neural approach consists of a mosaic of topographic maps that are each trained using competitive Hebbian learning to rapidly detect image features of a pre-defined shape and scale. What makes this approach interesting is the ability of the competition between neurons to automatically filter noise, yet with the capability of generalizing the desired shape and scale. We will present the results of this technique applied to the real-time detection of obscured targets in visible-band panoramic CCTV images. Using background subtraction to highlight potential movement, the technique is able to correctly identify targets which span as little as 3 pixels wide while filtering small-scale noise.

  9. Min-max hyperellipsoidal clustering for anomaly detection in network security.

    PubMed

    Sarasamma, Suseela T; Zhu, Qiuming A

    2006-08-01

    A novel hyperellipsoidal clustering technique is presented for an intrusion-detection system in network security. Hyperellipsoidal clusters toward maximum intracluster similarity and minimum intercluster similarity are generated from training data sets. The novelty of the technique lies in the fact that the parameters needed to construct higher order data models in general multivariate Gaussian functions are incrementally derived from the data sets using accretive processes. The technique is implemented in a feedforward neural network that uses a Gaussian radial basis function as the model generator. An evaluation based on the inclusiveness and exclusiveness of samples with respect to specific criteria is applied to accretively learn the output clusters of the neural network. One significant advantage of this is its ability to detect individual anomaly types that are hard to detect with other anomaly-detection schemes. Applying this technique, several feature subsets of the tcptrace network-connection records that give above 95% detection at false-positive rates below 5% were identified.

  10. A robust background regression based score estimation algorithm for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhao, Rui; Du, Bo; Zhang, Liangpei; Zhang, Lefei

    2016-12-01

    Anomaly detection has become a hot topic in the hyperspectral image analysis and processing fields in recent years. The most important issue for hyperspectral anomaly detection is the background estimation and suppression. Unreasonable or non-robust background estimation usually leads to unsatisfactory anomaly detection results. Furthermore, the inherent nonlinearity of hyperspectral images may cover up the intrinsic data structure in the anomaly detection. In order to implement robust background estimation, as well as to explore the intrinsic data structure of the hyperspectral image, we propose a robust background regression based score estimation algorithm (RBRSE) for hyperspectral anomaly detection. The Robust Background Regression (RBR) is actually a label assignment procedure which segments the hyperspectral data into a robust background dataset and a potential anomaly dataset with an intersection boundary. In the RBR, a kernel expansion technique, which explores the nonlinear structure of the hyperspectral data in a reproducing kernel Hilbert space, is utilized to formulate the data as a density feature representation. A minimum squared loss relationship is constructed between the data density feature and the corresponding assigned labels of the hyperspectral data, to formulate the foundation of the regression. Furthermore, a manifold regularization term which explores the manifold smoothness of the hyperspectral data, and a maximization term of the robust background average density, which suppresses the bias caused by the potential anomalies, are jointly appended in the RBR procedure. After this, a paired-dataset based k-nn score estimation method is undertaken on the robust background and potential anomaly datasets, to implement the detection output. The experimental results show that RBRSE achieves superior ROC curves, AUC values, and background-anomaly separation than some of the other state-of-the-art anomaly detection methods, and is easy to implement

  11. Radiation anomaly detection algorithms for field-acquired gamma energy spectra

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen

    2015-08-01

    The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of the threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign radioactive materials in the form of naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (or NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work in spite of a strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and relative merits of these different algorithms will be discussed and demonstrated.

  12. A Multi-Agent Framework for Anomalies Detection on Distributed Firewalls Using Data Mining Techniques

    NASA Astrophysics Data System (ADS)

    Karoui, Kamel; Ftima, Fakher Ben; Ghezala, Henda Ben

    The integration of agents and data mining has emerged as a promising area for distributed problem solving. Applying this integration to distributed firewalls will facilitate the anomalies detection process. In this chapter, we present a set of algorithms and mining techniques to analyse, manage and detect anomalies on distributed firewalls' policy rules using the multi-agent approach; first, for each firewall, a static agent will execute a set of data mining techniques to generate a new set of efficient firewall policy rules. Then, a mobile agent will exploit these sets of optimized rules to detect possible anomalies on a specific firewall (intra-firewall anomalies) or between firewalls (inter-firewall anomalies). An experimental case study will be presented to demonstrate the usefulness of our approach.

  13. Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data

    NASA Technical Reports Server (NTRS)

    Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney

    2012-01-01

    This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account fixed effects: flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, the multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and failures, biases, and trends in flight actuators. These anomalies were missed by the existing airline monitoring of FOQA data exceedances.

  14. Real Time Detection of Anomalies in Streaming Radar and Rain Gauge Data

    NASA Astrophysics Data System (ADS)

    Hill, D. J.; Minsker, B.; Amir, E.; Choi, J.

    2008-12-01

    Radar-rainfall data are being used in an increasing number of real-time applications because of their wide spatial and temporal coverage. Because of uncertainties in radar measurements and the relationship between radar measurements and rainfall on the ground, radar-rainfall data are often combined with rain gauge data to improve their accuracy. While rain gauges can provide accurate estimates of rainfall, their data are sometimes subject to a number of errors caused by the environment in which the gauges are deployed. This study develops a method for automatically detecting anomalies (i.e. data that deviate markedly from historical patterns) in both radar and rain gauge data through integration and modeling of data from these two different sources. These anomalous data can be caused by sensor or data transmission errors or by infrequent system behaviors that may be of interest to the scientific or public safety communities. This study develops an automated anomaly detection method that employs a Dynamic Bayesian Network to assimilate data from multiple rain gauges and weather radar (NEXRAD) into an uncertain model of the current rainfall. Filtering (e.g. Kalman filtering) can then be used to infer the likelihood that a particular gauge measurement is anomalous. Measurements with a high likelihood of being anomalous are classified as such. The method developed in this study performs fast, incremental evaluation of data as they become available; scales to large quantities of data; and requires no a priori information regarding process variables or types of anomalies that may be encountered. The performance of the anomaly detector developed in this study is demonstrated using a precipitation sensor network composed of a NEXRAD weather radar and several near-real-time telemetered rain gauges deployed by the USGS in Chicago. The results indicate that the method performs well at identifying anomalous data caused by a real sensor failure.
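
    A single-gauge sketch of the filtering step, using a scalar random-walk Kalman filter in place of the full Dynamic Bayesian Network; the process and measurement variances, the 3-sigma flagging rule, and the synthetic record below are illustrative assumptions.

        import numpy as np

        def kalman_anomaly_flags(measurements, q=0.5, r=1.0, threshold=3.0):
            """Run a scalar random-walk Kalman filter over a rain-gauge series and flag
            measurements whose standardized innovation exceeds `threshold` sigma.

            q: process noise variance, r: measurement noise variance (illustrative values).
            """
            x, p = measurements[0], 1.0                    # state estimate and its variance
            flags = [False]
            for z in measurements[1:]:
                # Predict (random-walk model), then compute the innovation and its variance.
                p_pred = p + q
                innovation = z - x
                s = p_pred + r
                flags.append(abs(innovation) / np.sqrt(s) > threshold)

                # Update with the measurement (even anomalous ones, to keep the sketch simple).
                k = p_pred / s
                x = x + k * innovation
                p = (1 - k) * p_pred
            return np.array(flags)

        # Synthetic gauge record: slowly varying intensity with one spurious spike (transmission error).
        rng = np.random.default_rng(2)
        truth = np.clip(np.cumsum(rng.normal(0, 0.3, 200)), 0, None)
        obs = truth + rng.normal(0, 1.0, 200)
        obs[120] += 15.0
        print("flagged indices:", np.flatnonzero(kalman_anomaly_flags(obs)))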

  15. Discovering Recurring Anomalies in Text Reports Regarding Complex Space Systems

    NASA Technical Reports Server (NTRS)

    Zane-Ulman, Brett; Srivastava, Ashok N.

    2005-01-01

    Many existing complex space systems have a significant amount of historical maintenance and problem databases that are stored in unstructured text forms. For some platforms, these reports may be encoded as scanned images rather than searchable text. The problem that we address in this paper is the discovery of recurring anomalies and relationships between different problem reports that may indicate larger systemic problems. We will illustrate our techniques on data from discrepancy reports regarding software anomalies in the Space Shuttle. These free-text reports are written by a number of different people; thus the emphasis and wording vary considerably.

  16. Single and multi-subject clustering of flow cytometry data for cell-type identification and anomaly detection.

    PubMed

    Pouyan, Maziyar Baran; Jindal, Vasu; Birjandtalab, Javad; Nourani, Mehrdad

    2016-08-10

    Measurement of various markers of single cells using flow cytometry has several biological applications. These applications include improving our understanding of behavior of cellular systems, identifying rare cell populations and personalized medication. A common critical issue in the existing methods is identification of the number of cellular populations which heavily affects the accuracy of results. Furthermore, anomaly detection is crucial in flow cytometry experiments. In this work, we propose a two-stage clustering technique for cell type identification in single subject flow cytometry data and extend it for anomaly detection among multiple subjects. Our experimentation on 42 flow cytometry datasets indicates high performance and accurate clustering (F-measure > 91 %) in identifying main cellular populations. Furthermore, our anomaly detection technique evaluated on Acute Myeloid Leukemia dataset results in only <2 % false positives.

  17. Detecting Anomaly Regions in Satellite Image Time Series Based on Seasonal Autocorrelation Analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Anomaly regions in satellite images can reflect unexpected changes of land cover caused by flood, fire, landslide, etc. Detecting anomaly regions in satellite image time series is important for studying the dynamic processes of land cover changes as well as for disaster monitoring. Although several methods have been developed to detect land cover changes using satellite image time series, they are generally designed for detecting inter-annual or abrupt land cover changes, but do not focus on detecting spatial-temporal changes in continuous images. In order to identify spatial-temporal dynamic processes of unexpected changes of land cover, this study proposes a method for detecting anomaly regions in each image of satellite image time series based on seasonal autocorrelation analysis. The method was validated with a case study to detect spatial-temporal processes of a severe flooding using Terra/MODIS image time series. Experiments demonstrated the advantages of the method: (1) it can effectively detect anomaly regions in each image of the satellite image time series, showing the spatial-temporal varying process of anomaly regions; (2) it is flexible in meeting detection accuracy requirements (e.g., a z-value or significance level), with overall accuracy up to 89% and precision above 90%; and (3) it does not need time series smoothing and can detect anomaly regions in noisy satellite images with high reliability.
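
    An illustrative per-pixel sketch in the same spirit (not the paper's algorithm): a seasonal baseline is built from previous years of imagery and observations are flagged when their z-score exceeds a chosen threshold; the array layout, threshold, and synthetic data are assumptions.

        import numpy as np

        def seasonal_anomaly_mask(series, z_threshold=4.0):
            """Flag anomalous pixels in a satellite image time series.

            series : array of shape (n_years, n_periods, height, width), e.g. yearly stacks of
                     16-day composites. For every pixel and period, the mean and standard deviation
                     over the earlier years define the seasonal expectation; the last year is scored
                     against it. Returns a boolean mask of shape (n_periods, height, width).
            """
            history, latest = series[:-1], series[-1]
            mean = history.mean(axis=0)
            std = history.std(axis=0) + 1e-6          # avoid division by zero
            z = (latest - mean) / std
            return np.abs(z) > z_threshold

        # Synthetic example: 10 years x 23 periods of a 50x50 NDVI-like index, with a "flood"
        # (sharp drop) in the last year at periods 10-12 in a small region.
        rng = np.random.default_rng(3)
        seasonal = 0.5 + 0.3 * np.sin(np.linspace(0, 2 * np.pi, 23))[None, :, None, None]
        series = seasonal + rng.normal(0, 0.03, size=(10, 23, 50, 50))
        series[-1, 10:13, 20:30, 20:30] -= 0.4
        mask = seasonal_anomaly_mask(series)
        print("anomalous pixels per period (periods 8-14):", mask.sum(axis=(1, 2))[8:15])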

  18. Anomaly Detection for Beam Loss Maps in the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Valentino, Gianluca; Bruce, Roderik; Redaelli, Stefano; Rossi, Roberto; Theodoropoulos, Panagiotis; Jaster-Merz, Sonja

    2017-07-01

    In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss map campaigns, as well as in standard operation. Due to the complexity of the system, consisting of more than 50 collimators per beam, it is difficult to identify small changes in the collimation hierarchy, which may be due to setting errors or beam orbit drifts with such methods. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy.
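
    A minimal sketch of the two-step idea with scikit-learn defaults standing in for the operational tuning: the high-dimensional loss maps are projected onto a few principal components and each map is scored with the Local Outlier Factor; the simulated loss maps and all parameter choices are illustrative.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neighbors import LocalOutlierFactor

        rng = np.random.default_rng(4)

        # Rows are simulated beam loss maps: one reading per loss monitor around the ring.
        n_maps, n_monitors = 300, 2000
        nominal = np.exp(rng.normal(0, 1, n_monitors))               # fixed nominal loss pattern
        X = nominal * rng.lognormal(0, 0.1, size=(n_maps, n_monitors))

        # Perturb the hierarchy in the last five maps (e.g. one collimator region leaking more).
        X[-5:, 500:520] *= 5.0

        # Step 1: reduce dimensionality with PCA on log-losses.
        scores = PCA(n_components=10).fit_transform(np.log(X))

        # Step 2: Local Outlier Factor; -1 marks maps flagged as anomalous.
        labels = LocalOutlierFactor(n_neighbors=20).fit_predict(scores)
        print("indices of anomalous loss maps:", np.flatnonzero(labels == -1))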

  19. Classification of SD-OCT volumes for DME detection: an anomaly detection approach

    NASA Astrophysics Data System (ADS)

    Sankar, S.; Sidibé, D.; Cheung, Y.; Wong, T. Y.; Lamoureux, E.; Milea, D.; Meriaudeau, F.

    2016-03-01

    Diabetic Macular Edema (DME) is the leading cause of blindness amongst diabetic patients worldwide. It is characterized by accumulation of water molecules in the macula leading to swelling. Early detection of the disease helps prevent further loss of vision. Naturally, automated detection of DME from Optical Coherence Tomography (OCT) volumes plays a key role. To this end, a pipeline for detecting DME diseases in OCT volumes is proposed in this paper. The method is based on anomaly detection using Gaussian Mixture Model (GMM). It starts with pre-processing the B-scans by resizing, flattening, filtering and extracting features from them. Both intensity and Local Binary Pattern (LBP) features are considered. The dimensionality of the extracted features is reduced using PCA. As the last stage, a GMM is fitted with features from normal volumes. During testing, features extracted from the test volume are evaluated with the fitted model for anomaly and classification is made based on the number of B-scans detected as outliers. The proposed method is tested on two OCT datasets achieving a sensitivity and a specificity of 80% and 93% on the first dataset, and 100% and 80% on the second one. Moreover, experiments show that the proposed method achieves better classification performances than other recently published works.
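
    A compact sketch of the GMM stage only (not the published pipeline), with random vectors standing in for the intensity/LBP features: a mixture model is fitted to features from normal volumes, test B-scans are scored by log-likelihood, and the volume is classified by how many B-scans fall below a threshold; all numbers are placeholders.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(5)

        # Pretend features: rows are B-scan descriptors (e.g. LBP histograms after flattening).
        normal_train = rng.normal(0, 1, size=(500, 40))
        test_volume = rng.normal(0, 1, size=(60, 40))
        test_volume[20:30] += 2.5                         # B-scans with fluid-like deviations

        # Reduce dimensionality, then fit a GMM on features from normal volumes only.
        pca = PCA(n_components=10).fit(normal_train)
        gmm = GaussianMixture(n_components=3, random_state=0).fit(pca.transform(normal_train))

        # Score the test volume: B-scans with low log-likelihood are treated as outliers, and the
        # volume is classified by how many B-scans fall below the threshold (the cutoff of 5 is arbitrary).
        log_lik = gmm.score_samples(pca.transform(test_volume))
        threshold = np.percentile(gmm.score_samples(pca.transform(normal_train)), 1)
        n_outliers = int((log_lik < threshold).sum())
        print(f"outlying B-scans: {n_outliers}/{len(test_volume)} -> "
              f"{'DME suspected' if n_outliers > 5 else 'normal'}")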

  20. Enabling the Discovery of Recurring Anomalies in Aerospace System Problem Reports using High-Dimensional Clustering Techniques

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok, N.; Akella, Ram; Diev, Vesselin; Kumaresan, Sakthi Preethi; McIntosh, Dawn M.; Pontikakis, Emmanuel D.; Xu, Zuobing; Zhang, Yi

    2006-01-01

    This paper describes the results of a significant research and development effort conducted at NASA Ames Research Center to develop new text mining techniques to discover anomalies in free-text reports regarding system health and safety of two aerospace systems. We discuss two problems of significant importance in the aviation industry. The first problem is that of automatic anomaly discovery about an aerospace system through the analysis of tens of thousands of free-text problem reports that are written about the system. The second problem that we address is that of automatic discovery of recurring anomalies, i.e., anomalies that may be described in different ways by different authors, at varying times and under varying conditions, but that are truly about the same part of the system. The intent of recurring anomaly identification is to determine project or system weaknesses or high-risk issues. The discovery of recurring anomalies is a key goal in building safe, reliable, and cost-effective aerospace systems. We address the anomaly discovery problem on thousands of free-text reports using two strategies: (1) as an unsupervised learning problem where an algorithm takes free-text reports as input and automatically groups them into different bins, where each bin corresponds to a different unknown anomaly category; and (2) as a supervised learning problem where the algorithm classifies the free-text reports into one of a number of known anomaly categories. We then discuss the application of these methods to the problem of discovering recurring anomalies. In fact the special nature of recurring anomalies (very small cluster sizes) requires incorporating new methods and measures to enhance the original approach for anomaly detection.
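
    A small sketch of the unsupervised strategy (1): free-text reports are vectorized with TF-IDF and grouped with k-means so that each bin is a candidate anomaly category; the toy reports and the number of clusters are invented for illustration and are not the authors' data or method.

        from sklearn.cluster import KMeans
        from sklearn.feature_extraction.text import TfidfVectorizer

        # Toy stand-ins for free-text problem reports (real inputs would be thousands of reports).
        reports = [
            "hydraulic pump pressure dropped during ascent checkout",
            "pressure transducer reading erratic on hydraulic pump line",
            "software timer overflow caused display freeze in cockpit unit",
            "cockpit display froze after counter overflow in timing software",
            "corrosion found on fastener near access panel",
            "panel fastener corrosion noted during routine inspection",
        ]

        # Vectorize and cluster; each cluster is a candidate (possibly recurring) anomaly category.
        X = TfidfVectorizer(stop_words="english").fit_transform(reports)
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

        for cluster in range(3):
            print(f"cluster {cluster}:")
            for text, lab in zip(reports, labels):
                if lab == cluster:
                    print("  -", text)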

  1. GPU implementation of target and anomaly detection algorithms for remotely sensed hyperspectral image analysis

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2010-08-01

    Automatic target and anomaly detection are considered very important tasks for hyperspectral data exploitation. These techniques are now routinely applied in many application domains, including defence and intelligence, public safety, precision agriculture, geology, or forestry. Many of these applications require timely responses for swift decisions which depend upon high computing performance of algorithm analysis. However, with the recent explosion in the amount and dimensionality of hyperspectral imagery, this problem calls for the incorporation of parallel computing techniques. In the past, clusters of computers have offered an attractive solution for fast anomaly and target detection in hyperspectral data sets already transmitted to Earth. However, these systems are expensive and difficult to adapt to on-board data processing scenarios, in which low-weight and low-power integrated components are essential to reduce mission payload and obtain analysis results in (near) real-time, i.e., at the same time as the data is collected by the sensor. An exciting new development in the field of commodity computing is the emergence of commodity graphics processing units (GPUs), which can now bridge the gap towards on-board processing of remotely sensed hyperspectral data. In this paper, we describe several new GPU-based implementations of target and anomaly detection algorithms for hyperspectral data exploitation. The parallel algorithms are implemented on latest-generation Tesla C1060 GPU architectures, and quantitatively evaluated using hyperspectral data collected by NASA's AVIRIS system over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex.

  2. Comparison of magnetic resonance urography with ultrasound studies in detection of fetal urogenital anomalies.

    PubMed

    Kajbafzadeh, Abdol-Mohammad; Payabvash, Seyedmehdi; Sadeghi, Zhina; Elmi, Azadeh; Jamal, Ashraf; Hantoshzadeh, Zohreh; Eslami, Laleh; Mehdizadeh, Mehrzad

    2008-02-01

    Prenatal ultrasonography detects the vast majority of urogenital anomalies, but in some cases the diagnosis remains in doubt. We assessed the potential of magnetic resonance urography (MRU) in the evaluation of different urogenital anomalies in fetuses when ultrasound study was equivocal. We retrospectively reviewed the medical records of 46 fetuses in whom the presence of urogenital anomalies was suspected at ultrasound studies, but remained inconclusive. Fetal MRU was performed within the same week as ultrasound studies. All patients underwent MRU, comprising overview, fast, thick-slab, heavily T2-weighted sequences, followed by focused, high-resolution T2-weighted sequences obtained in sagittal, axial and coronal planes. T1-weighted sequences were obtained in selected cases for assessment of the gastrointestinal tract. All MRU results were compared with ultrasound findings. Sensitivity of each imaging modality was estimated based on definite diagnoses made after birth or abortion. The mean (range) gestational age was 27 (18-36)weeks. The final diagnosis was ureteropelvic junction obstruction in 12, ureteral dilation (due to vesicoureteral junction obstruction) in five, ureterocele in five, posterior urethral valve in 16, multicystic dysplastic kidney in six, mesenteric cyst in one and abdominoscrotal hydrocele in one. Overall diagnostic sensitivity of fetal MRU was 96% compared to sonography with 58% sensitivity (p<0.05). Fetal MRU studies provided additional information to sonography in 17 (37%) cases, and were especially more sensitive in evaluation of ureteral anatomy. Fetal MRU can accurately diagnose a wide variety of urinary tract disorders and must be regarded as a valuable complementary tool to ultrasound in the assessment of the urinary system, particularly in cases of inconclusive ultrasound findings. The present study had a selection bias, as only fetuses with possible anomalies proposed by sonography were referred for MRU; however, this is the

  3. Autonomous detection of crowd anomalies in multiple-camera surveillance feeds

    NASA Astrophysics Data System (ADS)

    Nordlöf, Jonas; Andersson, Maria

    2016-10-01

    A novel approach for autonomous detection of anomalies in crowded environments is presented in this paper. The proposed model uses a Gaussian mixture probability hypothesis density (GM-PHD) filter as a feature extractor in conjunction with different Gaussian mixture hidden Markov models (GM-HMMs). Results, based on both simulated and recorded data, indicate that this method can track and detect anomalies on-line in individual crowds through multiple camera feeds in a crowded environment.

  4. [Quality of praenatal diagnostic ultrasound - comparison of sonographically detected foetal anomalies with diagnostic findings verified post-partum in Switzerland].

    PubMed

    Stiller, R; Huch, R; Huch, A; Zimmermann, R

    2001-10-01

    Comparison of all praenatally detected cases of foetal anomalies to actual diagnostic findings post partum during a one year period in Switzerland. A retrospective questionnaire-based evaluation including the 5 university hospitals and 6 large hospitals in Switzerland as a population-based study. Analysis of all foetal anomalies detected praenatally by ultrasound in the year 1995 in these centres. 347 cases have been included in the study. 89 % of cases were detected using screening methods. Two-thirds were referred by obstetrical practitioners and GPs. 62 % of the pregnancies were completed and 33 % terminated, while the rest resulted in abortion or stillbirth. In terminated pregnancies there was an 82 % agreement between praenatal and postmortem findings. In 18 % of cases, sonographic results and clinical/post-mortem diagnosis were in agreement about the presence of major foetal anomalies; additional minor anomalies unperceived by sonography, however, were seen post mortem. There was no false positive case. Without ultrasound screening almost 90 % of anomalies would have been missed due to the absence of clinical symptoms. The Swiss two-step system for praenatal ultrasound screening, based on screening scans done by the obstetrician and GP in practice, or residents in the public outpatient clinics respectively, and the detailed scan done by a subspecialized perinatologist, shows excellent results, especially in the subgroup of terminated pregnancies.

  5. Anomaly Detection Techniques with Real Test Data from a Spinning Turbine Engine-Like Rotor

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Woike, Mark R.; Oza, Nikunj C.; Matthews, Bryan L.

    2012-01-01

    Online detection techniques to monitor the health of rotating engine components are becoming increasingly attractive to aircraft engine manufacturers in order to increase safety of operation and lower maintenance costs. Health monitoring remains a challenge to easily implement, especially in the presence of scattered loading conditions, crack size, component geometry, and materials properties. The current trend, however, is to utilize noninvasive types of health monitoring or nondestructive techniques to detect hidden flaws and mini-cracks before any catastrophic event occurs. These techniques go further to evaluate material discontinuities and other anomalies that have grown to the level of critical defects that can lead to failure. Generally, health monitoring is highly dependent on sensor systems capable of performing in various engine environmental conditions and able to transmit a signal upon a predetermined crack length, while acting in a neutral form upon the overall performance of the engine system.

  6. Particle Filtering for Model-Based Anomaly Detection in Sensor Networks

    NASA Technical Reports Server (NTRS)

    Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon

    2012-01-01

    A novel technique has been developed for anomaly detection of rocket engine test stand (RETS) data. The objective was to develop a system that postprocesses a csv file containing the sensor readings and activities (time-series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective; in general they have met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are iid, i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploit the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time. However, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data is a mixture of multi-variate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation Maximization (EM) algorithm. The

  7. Lunar magnetic anomalies detected by the Apollo substatellite magnetometers

    USGS Publications Warehouse

    Hood, L.L.; Coleman, P.J.; Russell, C.T.; Wilhelms, D.E.

    1979-01-01

    Properties of lunar crustal magnetization thus far deduced from Apollo subsatellite magnetometer data are reviewed using two of the most accurate presently available magnetic anomaly maps - one covering a portion of the lunar near side and the other a part of the far side. The largest single anomaly found within the region of coverage on the near-side map correlates exactly with a conspicuous, light-colored marking in western Oceanus Procellarum called Reiner Gamma. This feature is interpreted as an unusual deposit of ejecta from secondary craters of the large nearby primary impact crater Cavalerius. An age for Cavalerius (and, by implication, for Reiner Gamma) of 3.2 ± 0.2 × 10^9 y is estimated. The main (30 × 60 km) Reiner Gamma deposit is nearly uniformly magnetized in a single direction, with a minimum mean magnetization intensity of ~7 × 10^-2 G cm^3/g (assuming a density of 3 g/cm^3), or about 700 times the stable magnetization component of the most magnetic returned samples. Additional medium-amplitude anomalies exist over the Fra Mauro Formation (Imbrium basin ejecta emplaced ~3.9 × 10^9 y ago) where it has not been flooded by mare basalt flows, but are nearly absent over the maria and over the craters Copernicus, Kepler, and Reiner and their encircling ejecta mantles. The mean altitude of the far-side anomaly gap is much higher than that of the near-side map and the surface geology is more complex, so individual anomaly sources have not yet been identified. However, it is clear that a concentration of especially strong sources exists in the vicinity of the craters Van de Graaff and Aitken. Numerical modeling of the associated fields reveals that the source locations do not correspond with the larger primary impact craters of the region and, by analogy with Reiner Gamma, may be less conspicuous secondary crater ejecta deposits. The reason for a special concentration of strong sources in the Van de Graaff-Aitken region is unknown, but may be indirectly

  8. Target detection using the background model from the topological anomaly detection algorithm

    NASA Astrophysics Data System (ADS)

    Dorado Munoz, Leidy P.; Messinger, David W.; Ziemann, Amanda K.

    2013-05-01

    The Topological Anomaly Detection (TAD) algorithm has been used as an anomaly detector in hyperspectral and multispectral images. TAD is an algorithm based on graph theory that constructs a topological model of the background in a scene, and computes an anomalousness ranking for all of the pixels in the image with respect to the background in order to identify pixels with uncommon or strange spectral signatures. The pixels that are modeled as background are clustered into groups or connected components, which could be representative of spectral signatures of materials present in the background. Therefore, the idea of using the background components given by TAD in target detection is explored in this paper. In this way, these connected components are characterized in three different approaches, where the mean signature and endmembers for each component are calculated and used as background basis vectors in Orthogonal Subspace Projection (OSP) and Adaptive Subspace Detector (ASD). Likewise, the covariance matrix of those connected components is estimated and used in detectors: Constrained Energy Minimization (CEM) and Adaptive Coherence Estimator (ACE). The performance of these approaches and the different detectors is compared with a global approach, where the background characterization is derived directly from the image. Experiments and results using self-test data set provided as part of the RIT blind test target detection project are shown.

  9. Anomaly detection in hyperspectral imagery: statistics vs. graph-based algorithms

    NASA Astrophysics Data System (ADS)

    Berkson, Emily E.; Messinger, David W.

    2016-05-01

    Anomaly detection (AD) algorithms are frequently applied to hyperspectral imagery, but different algorithms produce different outlier results depending on the image scene content and the assumed background model. This work provides the first comparison of anomaly score distributions between common statistics-based anomaly detection algorithms (RX and subspace-RX) and the graph-based Topological Anomaly Detector (TAD). Anomaly scores in statistical AD algorithms should theoretically approximate a chi-squared distribution; however, this is rarely the case with real hyperspectral imagery. The expected distribution of scores found with graph-based methods remains unclear. We also look for general trends in algorithm performance with varied scene content. Three separate scenes were extracted from the hyperspectral MegaScene image taken over downtown Rochester, NY with the VIS-NIR-SWIR ProSpecTIR instrument. In order of most to least cluttered, we study an urban, suburban, and rural scene. The three AD algorithms were applied to each scene, and the distributions of the most anomalous 5% of pixels were compared. We find that subspace-RX performs better than RX, because the data becomes more normal when the highest variance principal components are removed. We also see that compared to statistical detectors, anomalies detected by TAD are easier to separate from the background. Due to their different underlying assumptions, the statistical and graph-based algorithms highlighted different anomalies within the urban scene. These results will lead to a deeper understanding of these algorithms and their applicability across different types of imagery.
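
    For reference, a bare-bones global RX detector (the squared Mahalanobis distance of each pixel spectrum from the scene mean and covariance); subspace-RX and the graph-based TAD are not reproduced here, and the synthetic scene below is only a placeholder.

        import numpy as np

        def rx_scores(cube):
            """Global RX anomaly scores for a hyperspectral cube of shape (rows, cols, bands)."""
            rows, cols, bands = cube.shape
            X = cube.reshape(-1, bands).astype(float)
            mu = X.mean(axis=0)
            cov = np.cov(X, rowvar=False)
            cov_inv = np.linalg.pinv(cov)                  # pseudo-inverse for numerical stability
            centered = X - mu
            # Squared Mahalanobis distance per pixel.
            scores = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
            return scores.reshape(rows, cols)

        # Synthetic scene: correlated background plus one spectrally distinct pixel.
        rng = np.random.default_rng(6)
        background = rng.multivariate_normal(np.zeros(5), np.eye(5) * 0.1 + 0.05, size=(60, 60))
        background[30, 30] += np.array([2.0, -1.5, 1.0, 0.5, -2.0])
        scores = rx_scores(background)
        print("most anomalous pixel:", np.unravel_index(scores.argmax(), scores.shape))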

  10. A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization

    DOE PAGES

    Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.; ...

    2016-01-01

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  11. A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization

    SciTech Connect

    Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.; Laska, Jason A.; Sullivan, Blair D.

    2016-01-01

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  12. Multi-Level Anomaly Detection on Time-Varying Graph Data

    SciTech Connect

    Bridges, Robert A; Collins, John P; Ferragut, Erik M; Laska, Jason A; Sullivan, Blair D

    2015-01-01

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.
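
    A toy sketch of the general multi-scale idea (not the BTER-based model used in these papers): per-edge Bernoulli probabilities are estimated from training graphs, and a new graph is scored at the node and graph levels by negative log-likelihood; the community-structured generator and all sizes below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 20

        def edge_model(train_graphs, eps=1e-3):
            """Per-edge Bernoulli probabilities estimated from a stack of training adjacency matrices."""
            p = np.mean(train_graphs, axis=0)
            return np.clip(p, eps, 1 - eps)

        def node_scores(adj, p):
            """Negative log-likelihood of each node's incident edges under the edge model."""
            ll = adj * np.log(p) + (1 - adj) * np.log(1 - p)
            np.fill_diagonal(ll, 0.0)
            return -ll.sum(axis=1)

        def sample(prob):
            """Draw an undirected graph (no self-loops) from independent edge probabilities."""
            a = (rng.random((n, n)) < prob).astype(float)
            a = np.triu(a, 1)
            return a + a.T

        # Training graphs: two 10-node communities, dense inside, sparse between.
        block = np.full((n, n), 0.02)
        block[:10, :10] = 0.4
        block[10:, 10:] = 0.4
        train = np.array([sample(block) for _ in range(200)])
        p = edge_model(train)

        # Test graph: node 0 suddenly connects across communities.
        test = sample(block)
        test[0, 10:] = 1.0
        test[10:, 0] = 1.0

        scores = node_scores(test, p)
        print("graph-level score:", round(scores.sum() / 2, 1))   # each edge counted twice in node sums
        print("most anomalous node:", int(scores.argmax()))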

  13. A hyperspectral imagery anomaly detection algorithm based on local three-dimensional orthogonal subspace projection

    NASA Astrophysics Data System (ADS)

    Zhang, Xing; Wen, Gongjian

    2015-10-01

    Anomaly detection (AD) becomes increasingly important in hyperspectral imagery analysis with many practical applications. The local orthogonal subspace projection (LOSP) detector is a popular anomaly detector which exploits local endmembers/eigenvectors around the pixel under test (PUT) to construct the background subspace. However, this subspace only takes advantage of the spectral information, while the spatial correlation of the background clutter is neglected, which makes the anomaly detection result sensitive to the accuracy of the estimated subspace. In this paper, a local three-dimensional orthogonal subspace projection (3D-LOSP) algorithm is proposed. First, using both spectral and spatial information jointly, three directional background subspaces are created along the image height direction, the image width direction and the spectral direction, respectively. Then, the three corresponding orthogonal subspaces are calculated. After that, each vector along the three directions of the local cube is projected onto the corresponding orthogonal subspace. Finally, a composite score is given through the three directional operators. In 3D-LOSP, anomalies are redefined as targets that are not only spectrally different from the background but also spatially distinct. Thanks to the addition of the spatial information, the robustness of the anomaly detection result has been improved greatly by the proposed 3D-LOSP algorithm. It is noteworthy that the proposed algorithm is an extension of LOSP, and this idea can inspire many other spectral-based anomaly detection methods. Experiments with real hyperspectral images have proved the stability of the detection result.
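
    A spectral-only simplification for illustration (the local windows and the two spatial directions of 3D-LOSP are not reproduced): each pixel is projected onto the orthogonal complement of a global background subspace and the residual energy is used as the anomaly score; the subspace dimension and the synthetic cube are assumptions.

        import numpy as np

        def osp_scores(cube, k=3):
            """Anomaly scores from orthogonal subspace projection with a global background subspace.

            cube : (rows, cols, bands) hyperspectral image
            k    : dimension of the background subspace (leading principal directions)
            """
            rows, cols, bands = cube.shape
            X = cube.reshape(-1, bands).astype(float)
            X = X - X.mean(axis=0)
            # Background subspace spanned by the k leading right singular vectors of the data.
            _, _, vt = np.linalg.svd(X, full_matrices=False)
            B = vt[:k].T                                   # (bands, k), orthonormal columns
            P_perp = np.eye(bands) - B @ B.T               # projector onto the orthogonal complement
            residual = X @ P_perp
            return (residual ** 2).sum(axis=1).reshape(rows, cols)

        # Synthetic cube: low-rank background plus one pixel with an off-subspace spectral signature.
        rng = np.random.default_rng(8)
        basis = rng.normal(size=(3, 30))
        cube = (rng.normal(size=(40, 40, 3)) @ basis) + rng.normal(0, 0.05, size=(40, 40, 30))
        cube[10, 10] += np.linspace(-1, 1, 30) * 2.0
        scores = osp_scores(cube, k=3)
        print("most anomalous pixel:", np.unravel_index(scores.argmax(), scores.shape))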

  14. Probable swirls detected as photometric anomalies in Oceanus Procellarum

    NASA Astrophysics Data System (ADS)

    Shkuratov, Yu.; Kaydash, V.; Gerasimenko, S.; Opanasenko, N.; Velikodsky, Yu.; Korokhin, V.; Videen, G.; Pieters, C.

    2010-07-01

    Images of the lunar nearside obtained by telescopes of Maidanak Observatory (Uzbekistan) and Simeiz Observatory (Crimea, Ukraine) equipped with Canon CMOS cameras and Sony CCD LineScan camera were used to study photometric properties of the lunar nearside in several spectral bands. A wide range of lunar phase angles was covered, and the method of phase ratios to assess the steepness of the phase function at different phase angles is applied. We found several areas with photometric anomalies in the south-west portion of the lunar disk that we refer to as Oceanus Procellarum anomalies. The areas being unique on the lunar nearside do not obey the inverse correlation between albedo and phase-curve slope, demonstrating high phase-curve slopes at intermediate albedo. Low-Sun images acquired with Lunar Orbiter IV and Apollo-16 cameras do not reveal anomalous topography of the regions, at least for scales larger than several tens of meters. The areas also do not have any thermal inertia, radar (70 and 3.8 cm), magnetic, or chemical/mineral peculiarities. On the other hand they exhibit a polarimetric signature that we interpret to be due to the presence of a porous regolith upper layer consisting of dust particles. The anomalies may be interpreted as regions of very fresh shallow regolith disturbances caused by impacts of meteoroid swarms consisting of rather small impactors. This origin is similar to one of the hypotheses for the origin of lunar swirls like the Reiner-γ formation. The photometric difference between the shallow and pervasive (Reiner-γ class) swirls is that the latter appear to have a significant amount of immature soils in the upper surface layers.

  15. Overlapping image segmentation for context-dependent anomaly detection

    NASA Astrophysics Data System (ADS)

    Theiler, James; Prasad, Lakshman

    2011-06-01

    The challenge of finding small targets in big images lies in the characterization of the background clutter. The more homogeneous the background, the more distinguishable a typical target will be from its background. One way to homogenize the background is to segment the image into distinct regions, each of which is individually homogeneous, and then to treat each region separately. In this paper we will report on experiments in which the target is unspecified (it is an anomaly), and various segmentation strategies are employed, including an adaptive hierarchical tree-based scheme. We find that segmentations that employ overlap achieve better performance in the low false alarm rate regime.

  16. Data mining method for anomaly detection in the supercomputer task flow

    NASA Astrophysics Data System (ADS)

    Voevodin, Vadim; Voevodin, Vladimir; Shaikhislamov, Denis; Nikitenko, Dmitry

    2016-10-01

    The efficiency of most supercomputer applications is extremely low. At the same time, the user rarely even suspects that their applications may be wasting computing resources. Software tools need to be developed to help detect inefficient applications and report them to the users. We suggest an algorithm for detecting anomalies in the supercomputer's task flow, based on data mining methods. System monitoring is used to calculate integral characteristics for every job executed, and the data is used as input for our classification method based on the Random Forest algorithm. The proposed approach can currently classify the application as one of three classes - normal, suspicious and definitely anomalous. The proposed approach has been demonstrated on actual applications running on the "Lomonosov" supercomputer.
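
    A schematic sketch of the classification step: integral per-job characteristics (the feature names below are invented) are fed to a Random Forest that separates normal, suspicious, and anomalous jobs; real deployments would of course train on labelled monitoring data rather than the synthetic jobs generated here.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(9)

        # Integral per-job characteristics from system monitoring (illustrative feature names):
        # [cpu_load, l1_cache_miss_rate, memory_bw_GBps, interconnect_MBps, gpu_load]
        def make_jobs(n, kind):
            base = {"normal":     [0.85, 0.02, 40, 300, 0.7],
                    "suspicious": [0.45, 0.05, 15, 100, 0.2],
                    "anomalous":  [0.05, 0.10,  2,   5, 0.0]}[kind]
            return rng.normal(base, [0.1, 0.01, 5, 50, 0.1], size=(n, 5))

        X = np.vstack([make_jobs(400, "normal"), make_jobs(150, "suspicious"), make_jobs(80, "anomalous")])
        y = np.array(["normal"] * 400 + ["suspicious"] * 150 + ["anomalous"] * 80)

        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.25, random_state=0, stratify=y)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
        print("held-out accuracy:", round(clf.score(X_test, y_test), 3))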

  17. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics when combined with Big Data technologies and predictive techniques have been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.

  18. Electronic systems failures and anomalies attributed to electromagnetic interference

    NASA Technical Reports Server (NTRS)

    Leach, R. D. (Editor); Alexander, M. B. (Editor)

    1995-01-01

    The effects of electromagnetic interference can be very detrimental to electronic systems utilized in space missions. Assuring that subsystems and systems are electrically compatible is an important engineering function necessary to assure mission success. This reference publication will acquaint the reader with spacecraft electronic systems failures and anomalies caused by electromagnetic interference and will show the importance of electromagnetic compatibility activities in conjunction with space flight programs. It is also hoped that the report will illustrate that evolving electronic systems are increasingly sensitive to electromagnetic interference and that NASA personnel must continue to diligently pursue electromagnetic compatibility on space flight systems.

  19. Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering

    NASA Astrophysics Data System (ADS)

    Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech

    2015-03-01

    We present a novel method for automated anomaly detection on autofluorescence data provided by the National Institutes of Health (NIH). This is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, track the progression over time, and test the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures such as wavelet analysis or dimensionality reduction algorithms followed by a classification algorithm, e.g., Support Vector Machine. The method that we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality-preserving properties. By applying LE, we are able to represent the data in the form of eigenimages, some of which accentuate the visibility of anomalies. We pick significant eigenimages and proceed with the VMF algorithm, which classifies anomalies across all of these eigenimages simultaneously. To evaluate our performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection on single images and a combination of PCA and VMF. LE combined with VMF performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of using a nonlinear approach to represent the data and the effectiveness of VMF, which operates on the images as a data cube rather than as individual images.
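
    A rough illustration of the LE-plus-matched-filtering idea, assuming scikit-learn's SpectralEmbedding as the Laplacian Eigenmaps step and a small synthetic image in place of the NIH autofluorescence data; the patch size, template, and threshold are placeholders, not the authors' VMF implementation:

```python
import numpy as np
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.manifold import SpectralEmbedding
from scipy.signal import correlate2d

rng = np.random.default_rng(1)
img = rng.random((32, 32))            # stand-in for a retinal autofluorescence image
img[20:23, 20:23] += 1.5              # small synthetic anomaly

patches = extract_patches_2d(img, (3, 3))       # one 3x3 patch per interior pixel
X = patches.reshape(len(patches), -1)           # (900, 9) per-pixel feature vectors
emb = SpectralEmbedding(n_components=3, n_neighbors=10).fit_transform(X)
eigenimages = emb.T.reshape(3, 30, 30)          # one "eigenimage" per component

template = np.ones((3, 3)) / 9.0                # crude local anomaly template
# Matched-filter response accumulated jointly over all eigenimages.
response = sum(correlate2d(e, template, mode="same") for e in eigenimages)
anomaly_mask = np.abs(response) > np.percentile(np.abs(response), 99)
print("flagged pixels:", int(anomaly_mask.sum()))
```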

  20. MedMon: securing medical devices through wireless monitoring and anomaly detection.

    PubMed

    Zhang, Meng; Raghunathan, Anand; Jha, Niraj K

    2013-12-01

    Rapid advances in personal healthcare systems based on implantable and wearable medical devices promise to greatly improve the quality of diagnosis and treatment for a range of medical conditions. However, the increasing programmability and wireless connectivity of medical devices also open up opportunities for malicious attackers. Unfortunately, implantable/wearable medical devices come with extreme size and power constraints, and unique usage models, making it infeasible to simply borrow conventional security solutions such as cryptography. We propose a general framework for securing medical devices based on wireless channel monitoring and anomaly detection. Our proposal is based on a medical security monitor (MedMon) that snoops on all the radio-frequency wireless communications to/from medical devices and uses multi-layered anomaly detection to identify potentially malicious transactions. Upon detection of a malicious transaction, MedMon takes appropriate response actions, which could range from passive (notifying the user) to active (jamming the packets so that they do not reach the medical device). A key benefit of MedMon is that it is applicable to existing medical devices that are in use by patients, with no hardware or software modifications to them. Consequently, it also leads to zero power overheads on these devices. We demonstrate the feasibility of our proposal by developing a prototype implementation for an insulin delivery system using off-the-shelf components (USRP software-defined radio). We evaluate its effectiveness under several attack scenarios. Our results show that MedMon can detect virtually all naive attacks and a large fraction of more sophisticated attacks, suggesting that it is an effective approach to enhancing the security of medical devices.

  1. Representation-learning for anomaly detection in complex x-ray cargo imagery

    NASA Astrophysics Data System (ADS)

    Andrews, Jerone T. A.; Jaccard, Nicolas; Rogers, Thomas W.; Griffin, Lewis D.

    2017-05-01

    Existing approaches to automated security image analysis focus on the detection of particular classes of threat. However, this mode of inspection is ineffectual when dealing with mature classes of threat, for which adversaries have refined effective concealment techniques. Furthermore, these methods may be unable to detect potential threats that have never been seen before. Therefore, in this paper, we investigate an anomaly detection framework, at X-ray image patch-level, based on: (i) image representations, and (ii) the detection of anomalies relative to those representations. We present encouraging preliminary results, using representations learnt using convolutional neural networks, as well as several contributions to a general-purpose anomaly detection algorithm based on decision-tree learning.

  2. A new comparison of hyperspectral anomaly detection algorithms for real-time applications

    NASA Astrophysics Data System (ADS)

    Díaz, María.; López, Sebastián.; Sarmiento, Roberto

    2016-10-01

    Due to the high spectral resolution that remotely sensed hyperspectral images provide, there has been an increasing interest in anomaly detection. The aim of anomaly detection is to single out pixels whose spectral signature differs significantly from the background spectra. Basically, anomaly detectors mark pixels with a certain score, considering as anomalies those whose scores are higher than a threshold. Receiver Operating Characteristic (ROC) curves have been widely used as an assessment measure in order to compare the performance of different algorithms. ROC curves are graphical plots which illustrate the trade-off between false positive and true positive rates. However, they are limited for making deep comparisons because they discard relevant factors required in real-time applications, such as run times, costs of misclassification, and the ability to mark anomalies with high scores. This last factor is fundamental in anomaly detection in order to distinguish anomalies easily from the background without any posterior processing. An extensive set of simulations has been made using different anomaly detection algorithms, comparing their performances and efficiencies using several extra metrics in order to complement the ROC curve analysis. Results support our proposal and demonstrate that ROC curves do not by themselves provide a good visualization of detection performance. Moreover, a figure of merit is proposed in this paper which encompasses in a single global metric all the measures yielded by the proposed additional metrics. This figure, named Detection Efficiency (DE), takes into account several crucial types of performance assessment that ROC curves do not consider. Results demonstrate that algorithms with the best detection performance according to ROC curves do not have the highest DE values. Consequently, the recommendation of using extra measures to properly evaluate performance has been supported and justified by
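
    A hedged sketch of how ROC analysis can be complemented with run-time and score-separation measures on synthetic data; the combined "detection efficiency" expression at the end is an assumed illustration only, not the DE figure of merit defined in the paper:

```python
import time
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
background = rng.normal(0.0, 1.0, size=(5000, 30))     # synthetic background pixels
anomalies = rng.normal(1.5, 1.0, size=(50, 30))        # synthetic anomalous pixels
data = np.vstack([background, anomalies])
labels = np.r_[np.zeros(len(background)), np.ones(len(anomalies))]

def rx_scores(x):
    # Global RX: Mahalanobis distance of each pixel to the data mean.
    mu = x.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(x, rowvar=False))
    d = x - mu
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)

t0 = time.perf_counter()
scores = rx_scores(data)
runtime = time.perf_counter() - t0

auc = roc_auc_score(labels, scores)
# Score separation: how far anomaly scores sit above the background's 99th percentile.
margin = np.median(scores[labels == 1]) / np.percentile(scores[labels == 0], 99)
efficiency = auc * min(margin, 2.0) / (1.0 + runtime)   # assumed illustrative combination
print(f"AUC={auc:.3f}  runtime={runtime:.4f}s  margin={margin:.2f}  efficiency={efficiency:.3f}")
```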

  3. Neural network based approach for anomaly detection in the lungs region by electrical impedance tomography.

    PubMed

    Minhas, Atul S; Reddy, M Ramasubba

    2005-08-01

    In this paper, we have shown a simple procedure to detect anomalies in the lungs region by electrical impedance tomography. The main aim of the present study is to investigate the possibility of anomaly detection by using neural networks. Radial basis function neural networks are used as classifiers to classify the anomaly as belonging to the anterior or posterior region of the left lung or the right lung. The neural networks are trained and tested with the simulated data obtained by solving the mathematical model equation governing current flow through the simulated thoracic region. The equation solution and model simulation are done with FEMLAB. The effect of adding a higher number of neurons to the hidden layer can be clearly seen by the reduction in classification error. The study shows that there is interaction between the size (radius) and conductivity of anomalies and for some combination of these two factors the classification error of neural networks will be very small.

  4. Advancements of data anomaly detection research in wireless sensor networks: a survey and open issues.

    PubMed

    Rassam, Murad A; Zainal, Anazida; Maarof, Mohd Aizaini

    2013-08-07

    Wireless Sensor Networks (WSNs) are important and necessary platforms for the future as the concept "Internet of Things" has emerged lately. They are used for monitoring, tracking, or controlling of many applications in industry, health care, habitat, and military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements to design efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches in five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future works. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed.

  5. Advancements of Data Anomaly Detection Research in Wireless Sensor Networks: A Survey and Open Issues

    PubMed Central

    Rassam, Murad A.; Zainal, Anazida; Maarof, Mohd Aizaini

    2013-01-01

    Wireless Sensor Networks (WSNs) are important and necessary platforms for the future as the concept “Internet of Things” has emerged lately. They are used for monitoring, tracking, or controlling of many applications in industry, health care, habitat, and military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements to design efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches in five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future works. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed. PMID:23966182

  6. A Statistical Detection of an Anomaly from a Few Noisy Tomographic Projections

    NASA Astrophysics Data System (ADS)

    Fillatre, Lionel; Nikiforov, Igor

    2005-12-01

    The problem of detecting an anomaly/target from a very limited number of noisy tomographic projections is addressed from the statistical point of view. The imaged object is composed of an environment, considered as a nuisance parameter, with a possibly hidden anomaly/target. The GLR test is used to solve the problem. When the projection linearly depends on the nuisance parameters, the GLR test coincides with an optimal statistical invariant test.
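
    A minimal sketch of an invariant GLR-type test under a linear nuisance model, assuming Gaussian noise with known variance and synthetic projections; the anomaly is detected from the residual energy after projecting out the nuisance subspace. This is a textbook-style illustration, not the authors' exact formulation:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(3)
m, p, sigma = 40, 5, 0.1          # projections, nuisance parameters, noise std
H = rng.normal(size=(m, p))       # linear dependence on the nuisance parameters

def glr_statistic(y):
    # Residual after a least-squares fit of the nuisance-only model.
    theta_hat, *_ = np.linalg.lstsq(H, y, rcond=None)
    resid = y - H @ theta_hat
    return resid @ resid / sigma**2   # ~ chi2(m - p) under H0

threshold = chi2.ppf(0.99, df=m - p)     # 1% false-alarm probability

y0 = H @ rng.normal(size=p) + sigma * rng.normal(size=m)   # nuisance only
y1 = y0 + 0.3 * rng.normal(size=m)                          # with an additive anomaly
print("detect without anomaly:", glr_statistic(y0) > threshold)
print("detect with anomaly:   ", glr_statistic(y1) > threshold)
```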

  7. Evaluation of Anomaly Detection Capability for Ground-Based Pre-Launch Shuttle Operations. Chapter 8

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2010-01-01

    This chapter will provide a thorough end-to-end description of the process for evaluation of three different data-driven algorithms for anomaly detection to select the best candidate for deployment as part of a suite of IVHM (Integrated Vehicle Health Management) technologies. These algorithms were deemed sufficiently mature to be considered viable candidates for deployment in support of the maiden launch of Ares I-X, the successor to the Space Shuttle for NASA's Constellation program. Data-driven algorithms are just one of three different types being deployed. The other two types of algorithms being deployed include a "rule-based" expert system and a "model-based" system. Within these two categories, the deployable candidates have already been selected based upon qualitative factors such as flight heritage. For the rule-based system, SHINE (Spacecraft High-speed Inference Engine) has been selected for deployment; it is a component of BEAM (Beacon-based Exception Analysis for Multimissions), a patented technology developed at NASA's JPL (Jet Propulsion Laboratory), and serves to aid in the management and identification of operational modes. For the "model-based" system, a commercially available package developed by QSI (Qualtech Systems, Inc.), TEAMS (Testability Engineering and Maintenance System), has been selected for deployment to aid in diagnosis. In the context of this particular deployment, distinctions among the use of the terms "data-driven," "rule-based," and "model-based" can be found in. Although there are three different categories of algorithms that have been selected for deployment, our main focus in this chapter will be on the evaluation of three candidates for data-driven anomaly detection. These algorithms will be evaluated on their capability for robustly detecting incipient faults or failures in the ground-based phase of pre-launch space shuttle operations, rather than based on heritage as performed in previous studies. Robust

  8. CTS TEP thermal anomalies: Heat pipe system performance

    NASA Technical Reports Server (NTRS)

    Marcus, B. D.

    1977-01-01

    Part of the investigation of the thermal anomalies of the transmitter experiment package (TEP) on the Communications Technology Satellite (CTS), which were observed on four occasions in 1977, is summarized. Specifically, the possible failure modes of the variable conductance heat pipe system (VCHPS) used for principal thermal control of the high-power traveling wave tube in the TEP are considered. Further, the investigation examines how those malfunctions may have given rise to the TEP thermal anomalies. Using CTS flight data, ground test results, analysis conclusions, and other relevant information, the investigation concentrated on artery depriming as the most likely VCHPS failure mode. Included in the study as possible depriming mechanisms were freezing of the working fluid, Marangoni flow, and gas evolution within the arteries. The report concludes that while depriming of the heat pipe arteries is consistent with the bulk of the observed data, the factors which cause the arteries to deprime have yet to be identified.

  9. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    The number of available Earth observations (EOs) is currently increasing substantially. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues might result in anomalous multivariate data constellations and have to be identified before corrupting subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to some statistical process control (SPC) algorithm. The basic idea is to raise an alarm when these sensor data depict some anomalous pattern according to the SPC, i.e. the production chain is considered 'out of control'. In fact, the industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of different algorithms typically applied by SPC systems and evaluate their capability to detect e.g. known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event, and (2) analyzing the limits of the individual approaches with respect to the magnitude and spatio-temporal size of the event as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatiotemporal events and directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g. http://baci-h2020.eu
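
    As one concrete SPC baseline of the kind compared in such studies, a minimal Hotelling T-squared control chart on synthetic multivariate observations; the control limit below is the standard phase-II limit for individual future observations, and all data are placeholders:

```python
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(4)
p, n = 4, 500                                   # variables, in-control calibration samples
calib = rng.normal(size=(n, p))                 # "in control" calibration phase
mu = calib.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(calib, rowvar=False))

def t2(x):
    # Hotelling T^2 statistic for a single observation vector.
    d = x - mu
    return float(d @ cov_inv @ d)

alpha = 0.01
ucl = p * (n + 1) * (n - 1) / (n * (n - p)) * f_dist.ppf(1 - alpha, p, n - p)

new_obs = rng.normal(loc=[0, 0, 3, 0], size=p)  # observation with a shift in one variable
print("T2 =", round(t2(new_obs), 2), "out of control:", t2(new_obs) > ucl)
```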

  10. Efficient detection of anomaly patterns through global search in remotely sensed big data

    NASA Astrophysics Data System (ADS)

    Marinoni, Andrea; Gamba, Paolo

    2016-10-01

    In order to leverage computational complexity and avoid information losses, "big data" analysis requires a new class of algorithms and methods to be designed and implemented. In this sense, information theory-based techniques can play a key role to effectively unveil change and anomaly patterns within big data sets. A framework that aims at detecting the anomaly patterns of a given dataset is introduced. The proposed method, namely PROMODE, relies on a representation of the given dataset performed by means of undirected bipartite graphs. Then the anomalies are searched and detected by progressively spanning the graph. The proposed architecture delivers a computational load that is less than that carried by typical frameworks in literature, so that PROMODE can be considered as a valid algorithm for efficient detection of change patterns in remotely sensed big data.

  11. Anomaly detection of turbopump vibration in Space Shuttle Main Engine using statistics and neural networks

    NASA Technical Reports Server (NTRS)

    Lo, C. F.; Wu, K.; Whitehead, B. A.

    1993-01-01

    Statistical and neural network methods have been applied to investigate the feasibility of detecting anomalies in the turbopump vibration of the SSME. The anomalies are detected based on the amplitude of peaks of fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods are able to detect the vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank; this method is applicable for on-line operation. The neural network method also needs a sufficient data basis to train the network. The testing procedure can be utilized at any time so long as the characteristics of the components remain unchanged.
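
    A small sketch of the statistical variant on synthetic vibration signals: PSD peak amplitudes at the fundamental and harmonics are compared against a baseline distribution built from nominal runs. Signal parameters are placeholders, not SSME turbopump data:

```python
import numpy as np
from scipy.signal import welch

fs, f0 = 5000.0, 400.0                     # sample rate and fundamental frequency (Hz)
rng = np.random.default_rng(5)
t = np.arange(0, 2.0, 1 / fs)

def peak_amplitudes(signal, harmonics=(1, 2, 3)):
    # PSD value at the fundamental and its harmonics.
    f, pxx = welch(signal, fs=fs, nperseg=2048)
    return np.array([pxx[np.argmin(np.abs(f - k * f0))] for k in harmonics])

# Baseline distribution from nominal runs.
nominal = np.array([peak_amplitudes(np.sin(2*np.pi*f0*t) + 0.5*rng.normal(size=t.size))
                    for _ in range(30)])
mu, sd = nominal.mean(axis=0), nominal.std(axis=0)

# Test run with an elevated second harmonic.
test = peak_amplitudes(np.sin(2*np.pi*f0*t) + 0.8*np.sin(2*np.pi*2*f0*t)
                       + 0.5*rng.normal(size=t.size))
z = (test - mu) / (sd + 1e-12)
print("z-scores per harmonic:", np.round(z, 1), "anomaly:", bool(np.any(z > 3)))
```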

  12. The antilock braking system anomaly: a drinking driver problem?

    PubMed

    Harless, David W; Hoffer, George E

    2002-05-01

    Antilock braking systems (ABS) have held promise for reducing the incidence of accidents because they reduce stopping times on slippery surfaces and allow drivers to maintain steering control during emergency braking. Farmer et al. (Accident Anal. Prevent. 29 (1997) 745) provide evidence that antilock brakes are beneficial to nonoccupants: a set of 1992 model General Motors vehicles equipped with antilock brakes were involved in significantly fewer fatal crashes in which occupants of other vehicles, pedestrians, or bicyclists were killed. But, perversely, the risk of death for occupants of vehicles equipped with antilock brakes increased significantly after adoption. Farmer (Accident Anal. Prevent. 33 (2001) 361) updates the analysis for 1996-1998 and finds a significant attenuation in the ABS anomaly. Researchers have put forward two hypotheses to explain this antilock brake anomaly: risk compensation and improper operation of antilock brake-equipped vehicles. We provide strong evidence for the improper operation hypothesis by showing that the antilock brake anomaly is confined largely to drinking drivers. Further, we show that the attenuation phenomenon occurs consistently after the first three to four years of vehicle service.

  13. Stochastic anomaly detection in eye-tracking data for quantification of motor symptoms in Parkinson's disease

    NASA Astrophysics Data System (ADS)

    Jansson, Daniel; Medvedev, Alexander; Axelson, Hans; Nyholm, Dag

    2013-10-01

    Two methods for distinguishing between healthy controls and patients diagnosed with Parkinson's disease by means of recorded smooth pursuit eye movements are presented and evaluated. Both methods are based on the principles of stochastic anomaly detection and make use of orthogonal series approximation for probability distribution estimation. The first method relies on the identification of a Wiener-type model of the smooth pursuit system and attempts to find statistically significant differences between the estimated parameters in healthy controls and patients with Parkinson's disease. The second method applies the same statistical method to distinguish between the gaze trajectories of healthy and Parkinson subjects attempting to track visual stimuli. Both methods show promising results, where healthy controls and patients with Parkinson's disease are effectively separated in terms of the considered metric. The results are preliminary because of the small number of participating test subjects, but they are indicative of the potential of the presented methods as diagnosing or staging tools for Parkinson's disease.

  14. Stochastic anomaly detection in eye-tracking data for quantification of motor symptoms in Parkinson's disease.

    PubMed

    Jansson, Daniel; Medvedev, Alexander; Axelson, Hans; Nyholm, Dag

    2015-01-01

    Two methods for distinguishing between healthy controls and patients diagnosed with Parkinson's disease by means of recorded smooth pursuit eye movements are presented and evaluated. Both methods are based on the principles of stochastic anomaly detection and make use of orthogonal series approximation for probability distribution estimation. The first method relies on the identification of a Wiener model of the smooth pursuit system and attempts to find statistically significant differences between the estimated parameters in healthy controls and patients with Parkinson's disease. The second method applies the same statistical method to distinguish between the gaze trajectories of healthy and Parkinson subjects tracking visual stimuli. Both methods show promising results, where healthy controls and patients with Parkinson's disease are effectively separated in terms of the considered metric. The results are preliminary because of the small number of participating test subjects, but they are indicative of the potential of the presented methods as diagnosing or staging tools for Parkinson's disease.

  15. Satellite Microwave Detected SST Anomalies and Hurricane Intensification

    NASA Astrophysics Data System (ADS)

    Sun, D.; Kafatos, M.; Cervone, G.; Boybeyi, Z.; Yang, R.

    2006-12-01

    The year 2005 was a record-breaking year for Atlantic hurricanes. There were 28 named storms and 15 hurricanes, including three Category 5 hurricanes: Katrina, Rita, and the strongest hurricane on record, Wilma. Katrina became the costliest and one of the deadliest hurricanes in US history. Better understanding and prediction of hurricanes will allow societies to be better prepared to minimize life and property damages. SST data from remotely sensed infrared measurements, like GOES, AVHRR, and MODIS, show missing values over the cloudy regions associated with hurricanes, while satellite microwave measurements, like the Tropical Rainfall Measuring Mission (TRMM) microwave imager (TMI), can provide SST even under cloudy conditions. Both satellite measurements and buoy observations show that SST increases in advance of significant hurricane intensification. This is probably because a tropical cyclone needs a period of time to accumulate energy to develop into a hurricane. Moreover, hurricane intensification may also be related to the actual location of high SST. Our results indicate a pre-existing high SST anomaly (SSTA) located at the right side of the storm track for Hurricane Katrina. Numerical simulations of three control experiments also confirm the importance of the relative positioning of the SSTA with respect to the storm track. Similar situations are also found for Hurricanes Rita and Wilma. On the contrary, if there is no high SSTA at the right location, a hurricane may not be able to undergo further intensification. This may explain why not all tropical cyclones associated with warm waters attain peak intensity (categories 4 and 5) during their life cycle. Using this finding, during this year we successfully predicted, several days in advance, that tropical storm Ernesto would not develop into a hurricane again after it re-entered the ocean following its first landfall in Cuba.

  16. Robust and Accurate Anomaly Detection in ECG Artifacts Using Time Series Motif Discovery

    PubMed Central

    Sivaraks, Haemwaan

    2015-01-01

    Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats which helps identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because these methods are not able to differentiate ECG artifacts from the real ECG signal, especially ECG artifacts that are similar to ECG signals in terms of shape and/or frequency. The problem leads to high vigilance for physicians and misinterpretation risk for nonspecialists. Therefore, this work proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts and can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design. In addition, every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experimental results on real ECG datasets are interpreted and evaluated by cardiologists. Our proposed algorithm can mostly achieve 100% accuracy of detection (AoD), sensitivity, specificity, and positive predictive value with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts, compared with competitive anomaly detection methods. PMID:25688284
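
    A brute-force sketch of the underlying discord/motif idea on a synthetic ECG-like series, using nearest-neighbor distances between z-normalized subsequences; the authors' method additionally encodes cardiologist knowledge and artifact handling, which are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(6)
beat = np.sin(np.linspace(0, 2 * np.pi, 50)) ** 3        # toy beat shape
ecg = np.tile(beat, 40) + 0.05 * rng.normal(size=2000)
ecg[1200:1250] += 0.8                                     # injected anomalous beat
m = 50                                                    # subsequence (beat) length

def znorm(x):
    return (x - x.mean()) / (x.std() + 1e-12)

windows = np.array([znorm(ecg[i:i + m]) for i in range(len(ecg) - m + 1)])
best_i, best_d = -1, -1.0
for i in range(len(windows)):
    # Distance to the nearest neighbor, excluding trivially overlapping windows.
    dists = np.linalg.norm(windows - windows[i], axis=1)
    dists[max(0, i - m):i + m] = np.inf
    nn = dists.min()
    if nn > best_d:
        best_i, best_d = i, nn       # the discord: least similar to any other beat
print("discord starts near sample", best_i)
```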

  17. Robust and accurate anomaly detection in ECG artifacts using time series motif discovery.

    PubMed

    Sivaraks, Haemwaan; Ratanamahatana, Chotirat Ann

    2015-01-01

    Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats which helps identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because these methods are not able to differentiate ECG artifacts from the real ECG signal, especially ECG artifacts that are similar to ECG signals in terms of shape and/or frequency. The problem leads to high vigilance for physicians and misinterpretation risk for nonspecialists. Therefore, this work proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts and can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design. In addition, every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experimental results on real ECG datasets are interpreted and evaluated by cardiologists. Our proposed algorithm can mostly achieve 100% accuracy of detection (AoD), sensitivity, specificity, and positive predictive value with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts, compared with competitive anomaly detection methods.

  18. Anomaly Detection in Host Signaling Pathways for the Early Prognosis of Acute Infection

    PubMed Central

    O’Hern, Corey S.; Shattuck, Mark D.; Ogle, Serenity; Forero, Adriana; Morrison, Juliet; Slayden, Richard; Katze, Michael G.

    2016-01-01

    Developing diagnostic tools to distinguish between acute viral and bacterial respiratory infections is critical to improve patient care and limit the overuse of antibiotics in the medical community. The identification of prognostic respiratory virus biomarkers provides an early warning system that is capable of predicting which subjects will become symptomatic, expanding our medical diagnostic capabilities and treatment options for acute infectious diseases. The host response to acute infection may be viewed as a deterministic signaling network responsible for maintaining the health of the host organism. We identify pathway signatures that reflect the very earliest perturbations in the host response to acute infection. These pathways provide a means to monitor the health state of the host using anomaly detection to quantify and predict health outcomes to pathogens. PMID:27532264

  19. Anomaly Detection in Host Signaling Pathways for the Early Prognosis of Acute Infection.

    PubMed

    Wang, Kun; Langevin, Stanley; O'Hern, Corey S; Shattuck, Mark D; Ogle, Serenity; Forero, Adriana; Morrison, Juliet; Slayden, Richard; Katze, Michael G; Kirby, Michael

    2016-01-01

    Developing diagnostic tools to distinguish between acute viral and bacterial respiratory infections is critical to improve patient care and limit the overuse of antibiotics in the medical community. The identification of prognostic respiratory virus biomarkers provides an early warning system that is capable of predicting which subjects will become symptomatic, expanding our medical diagnostic capabilities and treatment options for acute infectious diseases. The host response to acute infection may be viewed as a deterministic signaling network responsible for maintaining the health of the host organism. We identify pathway signatures that reflect the very earliest perturbations in the host response to acute infection. These pathways provide a means to monitor the health state of the host using anomaly detection to quantify and predict health outcomes to pathogens.

  20. Anomaly Detection using Multi-channel FLAC for Supporting Diagnosis of ECG

    NASA Astrophysics Data System (ADS)

    Ye, Jiaxing; Kobayashi, Takumi; Murakawa, Masahiro; Higuchi, Tetsuya; Otsu, Nobuyuki

    In this paper, we propose an approach for abnormality detection in multi-channel ECG signals. This system serves as a front end to detect the irregular sections in ECG signals where symptoms may be observed. The doctor can thereby focus only on the detected suspect symptom sections, ignoring the disease-free parts. Hence the workload of inspection by doctors is significantly reduced and diagnosis efficiency can be sharply improved. For extracting the predominant characteristics of multi-channel ECG signals, we propose multi-channel Fourier local auto-correlations (m-FLAC) features on multi-channel complex spectrograms. The method characterizes the amplitude and phase information as well as the temporal dynamics of the multi-channel ECG signal. At the anomaly detection stage, we employ the complex subspace method for statistically modeling the normal (healthy) ECG patterns as in one-class learning. Then, we investigate the input ECG signals by measuring their deviation distance to the trained subspace. ECG sections with disordered spectral distributions can be effectively discerned based on such a distance metric. To validate the proposed approach, we conducted experiments on an ECG dataset. The experimental results demonstrated the effectiveness of the proposed approach, including promising performance and high efficiency compared to conventional methods.
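
    A simplified sketch of the one-class subspace idea on synthetic multi-channel signals, assuming plain log-magnitude spectrogram features rather than the authors' complex m-FLAC features, with PCA reconstruction error standing in for the complex subspace distance:

```python
import numpy as np
from scipy.signal import stft
from sklearn.decomposition import PCA

fs = 360.0
rng = np.random.default_rng(7)

def features(segment):
    # segment: (n_channels, n_samples) -> concatenated mean log-spectra per channel
    feats = []
    for ch in segment:
        _, _, z = stft(ch, fs=fs, nperseg=64)
        feats.append(np.log(np.abs(z) + 1e-6).mean(axis=1))
    return np.concatenate(feats)

# Learn the "normal" subspace from healthy segments (synthetic here).
normal = [features(rng.normal(size=(2, 720))) for _ in range(100)]
pca = PCA(n_components=5).fit(normal)

def anomaly_score(segment):
    x = features(segment)
    recon = pca.inverse_transform(pca.transform([x]))[0]
    return np.linalg.norm(x - recon)          # deviation from the trained subspace

baseline = np.array([anomaly_score(rng.normal(size=(2, 720))) for _ in range(50)])
odd = anomaly_score(rng.normal(size=(2, 720)) + np.sin(np.arange(720) * 0.5))
print("threshold:", np.percentile(baseline, 99).round(2), "odd score:", odd.round(2))
```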

  1. Reasoning about anomalies: a study of the analytical process of detecting and identifying anomalous behavior in maritime traffic data

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran; Ziemke, Tom; Kronhamn, Thomas

    2009-05-01

    The goal of visual analytical tools is to support the analytical reasoning process, maximizing human perceptual, understanding and reasoning capabilities in complex and dynamic situations. Visual analytics software must be built upon an understanding of the reasoning process, since it must provide appropriate interactions that allow a true discourse with the information. In order to deepen our understanding of the human analytical process and guide developers in the creation of more efficient anomaly detection systems, this paper investigates the human analytical process of detecting and identifying anomalous behavior in maritime traffic data. The main focus of this work is to capture the entire analysis process that an analyst goes through, from the raw data to the detection and identification of anomalous behavior. Three different sources are used in this study: a literature survey of the science of analytical reasoning, requirements specified by experts from organizations with an interest in port security, and user field studies conducted in different marine surveillance control centers. Furthermore, this study elaborates on how to support the human analytical process using data mining, visualization and interaction methods. The contribution of this paper is twofold: (1) within visual analytics, to contribute to the science of analytical reasoning with practical understanding of users' tasks in order to develop a taxonomy of interactions that support the analytical reasoning process, and (2) within anomaly detection, to facilitate the design of future anomaly detection systems when fully automatic approaches are not viable and human participation is needed.

  2. Countering Botnets: Anomaly-Based Detection, Comprehensive Analysis, and Efficient Mitigation

    DTIC Science & Technology

    2011-05-01

    Countering Botnets: Anomaly-Based Detection, Comprehensive Analysis, and Efficient Mitigation. Georgia Tech Research Corporation, May 2011, final report (grant FA8750-08-2-0141). The efforts cover five general areas: (1) botnet detection, (2) botnet analysis, (3) botnet mitigation, (4) add-on tasks to the original contract, including the

  3. Low frequency of Y anomaly detected in Australian Brahman cow-herds

    PubMed Central

    de Camargo, Gregório M.F.; Porto-Neto, Laercio R.; Fortes, Marina R.S.; Bunch, Rowan J.; Tonhati, Humberto; Reverter, Antonio; Moore, Stephen S.; Lehnert, Sigrid A.

    2015-01-01

    Indicine cattle have lower reproductive performance in comparison to taurine cattle. A chromosomal anomaly characterized by the presence of Y markers in females was reported and associated with infertility in cattle. The aim of this study was to investigate the occurrence of the anomaly in Brahman cows. Brahman cows (n = 929) were genotyped for a Y chromosome specific region using real-time PCR. Only six out of 929 cows had the anomaly (0.6%). The anomaly frequency was much lower in Brahman cows than in the crossbred population in which it was first detected. It also seems that the anomaly doesn't affect pregnancy in the population. Due to the low frequency, association analyses couldn't be executed. Further, the SNP signal of the pseudoautosomal boundary region of the Y chromosome was investigated using an HD SNP chip. Pooled DNA of “non-pregnant” and “pregnant” cows was compared and no difference in SNP allele frequency was observed. Results suggest that the anomaly had a very low frequency in this Australian Brahman population and had no effect on reproduction. Further studies comparing pregnant cows and cows that failed to conceive should be executed after better assembly and annotation of the Y chromosome in cattle. PMID:25750859

  4. Low frequency of Y anomaly detected in Australian Brahman cow-herds.

    PubMed

    de Camargo, Gregório M F; Porto-Neto, Laercio R; Fortes, Marina R S; Bunch, Rowan J; Tonhati, Humberto; Reverter, Antonio; Moore, Stephen S; Lehnert, Sigrid A

    2015-02-01

    Indicine cattle have lower reproductive performance in comparison to taurine cattle. A chromosomal anomaly characterized by the presence of Y markers in females was reported and associated with infertility in cattle. The aim of this study was to investigate the occurrence of the anomaly in Brahman cows. Brahman cows (n = 929) were genotyped for a Y chromosome specific region using real-time PCR. Only six out of 929 cows had the anomaly (0.6%). The anomaly frequency was much lower in Brahman cows than in the crossbred population in which it was first detected. It also seems that the anomaly doesn't affect pregnancy in the population. Due to the low frequency, association analyses couldn't be executed. Further, the SNP signal of the pseudoautosomal boundary region of the Y chromosome was investigated using an HD SNP chip. Pooled DNA of "non-pregnant" and "pregnant" cows was compared and no difference in SNP allele frequency was observed. Results suggest that the anomaly had a very low frequency in this Australian Brahman population and had no effect on reproduction. Further studies comparing pregnant cows and cows that failed to conceive should be executed after better assembly and annotation of the Y chromosome in cattle.

  5. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
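
    A minimal sketch of the time-series component on synthetic brightness-temperature images: per-pixel reference statistics are built from a stack of historical same-month images, and an RST-style z-score flags hot pixels. Sizes and thresholds are placeholders:

```python
import numpy as np

rng = np.random.default_rng(8)
stack = 290 + rng.normal(0, 2, size=(80, 64, 64))      # ~80 historical reference images
mean_img, std_img = stack.mean(axis=0), stack.std(axis=0)

current = 290 + rng.normal(0, 2, size=(64, 64))
current[30:33, 40:43] += 25                             # small synthetic hot anomaly

z = (current - mean_img) / (std_img + 1e-6)             # per-pixel standardized deviation
hotspots = np.argwhere(z > 5)
print("hotspot pixels:", len(hotspots))
```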

  6. Using new edges for anomaly detection in computer networks

    DOEpatents

    Neil, Joshua Charles

    2015-05-19

    Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data of a frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.
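
    An illustrative sketch of the new-edge idea on synthetic edge events: per-source baseline rates of creating previously unseen edges, smoothed edge log-probabilities, and a combined path score. Thresholding against an empirical distribution of historical scores, as described above, is omitted for brevity:

```python
import math
from collections import defaultdict

# Historical directed edge events (synthetic placeholders).
history = ([("a", "b")] * 6 + [("a", "c")] * 3 + [("b", "c")] * 5 + [("c", "d")] * 4)

seen = defaultdict(set)
new_edge_events = defaultdict(int)
total_events = defaultdict(int)
for src, dst in history:
    total_events[src] += 1
    if dst not in seen[src]:
        new_edge_events[src] += 1
        seen[src].add(dst)

def edge_logprob(src, dst):
    # Smoothed probability that src initiates a new (previously unseen) edge.
    p_new = (new_edge_events[src] + 1) / (total_events[src] + 2)
    return math.log(p_new if dst not in seen[src] else 1 - p_new)

def path_score(path):
    # Combine edge probabilities along a path; lower scores are more surprising.
    return sum(edge_logprob(s, d) for s, d in zip(path, path[1:]))

# A familiar path scores higher than one built from brand-new edges.
print(round(path_score(["a", "b", "c"]), 2), round(path_score(["a", "d", "e"]), 2))
```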

  7. Using new edges for anomaly detection in computer networks

    DOEpatents

    Neil, Joshua Charles

    2017-07-04

    Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data of a frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.

  8. Improving genome annotations using phylogenetic profile anomaly detection.

    PubMed

    Mikkelsen, Tarjei S; Galagan, James E; Mesirov, Jill P

    2005-02-15

    A promising strategy for refining genome annotations is to detect features that conflict with known functional or evolutionary relationships between groups of genes. Previous work in this area has been focused on investigating the absence of 'housekeeping' genes or components of well-studied pathways. We have sought to develop a method for improving new annotations that can automatically synthesize and use the information available in a database of other annotated genomes. We show that a probabilistic model of phylogenetic profiles, trained from a database of curated genome annotations, can be used to reliably detect errors in new annotations. We use our method to identify 22 genes that were missed in previously published annotations of prokaryotic genomes. The method was evaluated using MATLAB and open source software referenced in this work. Scripts and datasets are available from the authors upon request. tarjei@broad.mit.edu.

  9. [A Hyperspectral Imagery Anomaly Detection Algorithm Based on Gauss-Markov Model].

    PubMed

    Gao, Kun; Liu, Ying; Wang, Li-jing; Zhu, Zhen-yu; Cheng, Hao-bo

    2015-10-01

    With the development of spectral imaging technology, hyperspectral anomaly detection is increasingly widely used in remote sensing imagery processing. The traditional RX anomaly detection algorithm neglects the spatial correlation of images. Besides, it does not effectively reduce the data dimension, which costs too much processing time and shows low validity on hyperspectral data. Hyperspectral images follow a Gauss-Markov Random Field (GMRF) in the spatial and spectral dimensions. The inverse of the covariance matrix can be calculated directly from the Gauss-Markov parameters, which avoids the huge computational cost on hyperspectral data. This paper proposes an improved RX anomaly detection algorithm based on a three-dimensional GMRF. The hyperspectral imagery data is simulated with the GMRF model, and the GMRF parameters are estimated with the approximated maximum likelihood method. The detection operator is constructed with the GMRF estimation parameters. The pixel being detected is taken as the centre of a local optimization window, called the GMRF detection window. The abnormality degree is calculated with the mean vector and inverse covariance matrix, both computed within the window. The image is detected pixel by pixel as the GMRF window moves. The traditional RX detection algorithm, the regional hypothesis detection algorithm based on GMRF, and the algorithm proposed in this paper are simulated with AVIRIS hyperspectral data. Simulation results show that the proposed anomaly detection method is able to improve the detection efficiency and reduce the false alarm rate. We collected operation time statistics for the three algorithms in the same computing environment. The results show that the proposed algorithm reduces the operation time by 45.2%, which demonstrates good computational efficiency.
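
    For contrast, a minimal sliding-window (local) RX detector on a synthetic data cube; this is the classical baseline whose per-window covariance estimation the GMRF-based method above is designed to accelerate, not the proposed algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(9)
H, W, B, win = 40, 40, 10, 9                  # image size, bands, window size
cube = rng.normal(size=(H, W, B))
cube[20, 20] += 4.0                           # spectral anomaly at one pixel

half = win // 2
scores = np.zeros((H, W))
for i in range(half, H - half):
    for j in range(half, W - half):
        # Local background statistics from the surrounding window.
        local = cube[i-half:i+half+1, j-half:j+half+1].reshape(-1, B)
        mu = local.mean(axis=0)
        cov = np.cov(local, rowvar=False) + 1e-3 * np.eye(B)   # regularized covariance
        d = cube[i, j] - mu
        scores[i, j] = d @ np.linalg.inv(cov) @ d              # Mahalanobis distance

print("max RX score at:", np.unravel_index(scores.argmax(), scores.shape))
```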

  10. Detection of chromosomal anomalies in endometrial atypical hyperplasia and carcinoma by using fluorescence in situ hybridization.

    PubMed

    Qian, Junqi; Weber, Deena; Cochran, Richard; Hossain, Deloar; Bostwick, David G

    2010-04-25

    Endometrial cancer is the most common pelvic gynecological malignancy. The diagnosis of well-differentiated endometrial adenocarcinoma, atypical hyperplasia, and hyperplasia is often challenging. The authors sought to investigate the utility of chromosomal anomalies for the detection of endometrial hyperplasia and carcinoma using multitarget fluorescence in situ hybridization (FISH). Samples were collected by endometrial Tao brush and processed by liquid-based cytological preparation protocol from consecutive cases to include 50 benign, 50 hyperplasia without atypia, 47 atypical hyperplasia, and 53 endometrial cancers. Each was hybridized using fluorescence-labeled DNA probes to chromosomes 1, 8, and 10. The FISH signals were enumerated in 100 cells per case, and the chromosomal anomalies were correlated with pathologic findings, including histologic diagnoses on matched endometrial tissue samples. Numeric chromosomal anomalies were found in 0% (0 of 50) of benign, 20% (10 of 50) of hyperplasia, 74% (35 of 47) of atypical hyperplasia, and 87% (46 of 53) of carcinoma specimens. The mean percentage of cells with chromosomal changes was 55% in cancer specimens, which was significantly higher than that in hyperplasia without atypia (13%, P < .0001) and atypical hyperplasia (32%, P = .003). The most frequent chromosomal anomaly was gain of chromosome 1. FISH anomalies had an overall sensitivity of 81% and specificity of 90% for the detection of atypical hyperplasia and/or endometrial carcinoma. There was no association with grade of endometrial carcinoma. Multitarget FISH appears to be useful for the differential diagnosis of hyperplasia, atypical hyperplasia, and endometrial adenocarcinoma, with a high level of sensitivity and specificity. It is also a potential tool for the early detection of neoplastic cells in endometrial cytology specimens. Endometrial hyperplasia with FISH-detected chromosomal anomalies may represent a clinically significant subset of cases that

  11. Some practical issues in anomaly detection and exploitation of regions of interest in hyperspectral images.

    PubMed

    Goudail, François; Roux, Nicolas; Baarstad, Ivar; Løke, Trond; Kaspersen, Peter; Alouini, Mehdi; Normandin, Xavier

    2006-07-20

    We address a method for detecting anomalies in hyperspectral images, which consists in performing the detection when the spectral signatures of the targets are unknown. We show that, in real hyperspectral images, use of the full spectral resolution may not be necessary for detection but that the correlation properties of spectral fluctuations have to be taken into account in the design of the detection algorithm. Anomaly detectors are useful for detecting regions of interest (ROIs), but, as they are prone to false alarms, one must further analyze the ROIs obtained to decide whether they correspond to real targets. We propose a method for exploiting these ROIs that consists in generating a single image in which the contrast of the ROI is optimized.

  12. Bio-Inspired Distributed Decision Algorithms for Anomaly Detection

    DTIC Science & Technology

    2017-03-01

    i.e. they connect to a larger number of nodes, e) the CAIDA Autonomous System graph for May 2004 [9], and f) the CAIDA Autonomous System graph for... built to specific network traces. We plan to obtain and support re-play of Witty Worm traces from CAIDA managed by UCSD... DIAMoND... (partially) scale-free networks with weak hubs is very low, and in most cases does not even exceed 50%. For systems with strong hubs, such as the real CAIDA

  13. 3D Reconstruction For The Detection Of Cranial Anomalies

    NASA Astrophysics Data System (ADS)

    Kettner, B.; Shalev, S.; Lavelle, C.

    1986-01-01

    There is a growing interest in the use of three-dimensional (3D) cranial reconstruction from CT scans for surgical planning. A low-cost imaging system has been developed, which provides pseudo-3D images which may be manipulated to reveal the craniofacial skeleton as a whole or any particular component region. The contrast between congenital (hydrocephalic), normocephalic and acquired (carcinoma of the maxillary sinus) anomalous cranial forms demonstrates the potential of this system.

  14. Model-Based Reasoning in the Detection of Satellite Anomalies

    DTIC Science & Technology

    1990-12-01

    Conference on Artificial Intelligence, 1363-1368, Detroit, Michigan, August 1989. Chu, Wei-Hai, "Generic Expert System Shell for Diagnostic Reasoning... Intelligence, 1324-1330, Detroit, Michigan, August 1989. de Kleer, Johan and Brian C. Williams, "Diagnosing Multiple Faults," Artificial Intelligence, 32(1): 97... Benjamin Kuipers, "Model-Based Monitoring of Dynamic Systems," Proceedings of the Eleventh International Joint Conference on Artificial Intelligence, 1238

  15. Maritime Anomaly Detection: Domain Introduction and Review of Selected Literature

    DTIC Science & Technology

    2011-10-01

    operators, not to fully replace them. The amount of data that enters a system is typically astronomical, and a single person cannot manage and... Natural language processing: the use of structured data is very common today. However, a huge portion of the relevant data is in... Current gaps in MAD are identified from the data and information, processing and systems perspectives. The selected literature review is structured

  16. A non-parametric approach to anomaly detection in hyperspectral images

    NASA Astrophysics Data System (ADS)

    Veracini, Tiziana; Matteoli, Stefania; Diani, Marco; Corsini, Giovanni; de Ceglie, Sergio U.

    2010-10-01

    In the past few years, spectral analysis of data collected by hyperspectral sensors aimed at automatic anomaly detection has become an interesting area of research. In this paper, we are interested in an Anomaly Detection (AD) scheme for hyperspectral images in which spectral anomalies are defined with respect to a statistical model of the background Probability Density Function (PDF). The characterization of the PDF of hyperspectral imagery is not trivial. We approach the background PDF estimation through the Parzen Windowing PDF estimator (PW). PW is a flexible and valuable tool for accurately modeling unknown PDFs in a non-parametric fashion. Although such an approach is well known and has been widely employed, its use within an AD scheme has not yet been investigated. For practical purposes, the PW ability to estimate PDFs is strongly influenced by the choice of the bandwidth matrix, which controls the degree of smoothing of the resulting PDF approximation. Here, a Bayesian approach is employed to carry out the bandwidth selection. The resulting estimated background PDF is then used to detect spectral anomalies within a detection scheme based on the Neyman-Pearson approach. Real hyperspectral imagery is used for an experimental evaluation of the proposed strategy.
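
    A minimal sketch of the scheme, assuming scikit-learn's KernelDensity with a fixed Gaussian bandwidth in place of the Bayesian bandwidth-matrix selection, and a score-quantile threshold chosen for a desired false-alarm rate; the data are synthetic placeholders:

```python
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(10)
background = rng.normal(size=(2000, 8))           # background pixel spectra
kde = KernelDensity(kernel="gaussian", bandwidth=0.5).fit(background)

pfa = 1e-2                                         # desired false-alarm rate
log_density = kde.score_samples(background)
threshold = np.quantile(log_density, pfa)          # low background density => anomaly

test = np.vstack([rng.normal(size=(100, 8)),       # background-like test pixels
                  rng.normal(3.0, 1.0, size=(5, 8))])   # planted anomalies
is_anomaly = kde.score_samples(test) < threshold
print("detections:", int(is_anomaly.sum()), "of", len(test))
```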

  17. Anomaly Detection in the Right Hemisphere: The Influence of Visuospatial Factors

    ERIC Educational Resources Information Center

    Smith, Stephen D.; Dixon, Michael J.; Tays, William J.; Bulman-Fleming, M. Barbara

    2004-01-01

    Previous research with both brain-damaged and neurologically intact populations has demonstrated that the right cerebral hemisphere (RH) is superior to the left cerebral hemisphere (LH) at detecting anomalies (or incongruities) in objects (Ramachandran, 1995; Smith, Tays, Dixon, & Bulman-Fleming, 2002). The current research assesses whether the RH…

  18. Dual Use Corrosion Inhibitor and Penetrant for Anomaly Detection in Neutron/X Radiography

    NASA Technical Reports Server (NTRS)

    Hall, Phillip B. (Inventor); Novak, Howard L. (Inventor)

    2004-01-01

    A dual purpose corrosion inhibitor and penetrant composition sensitive to radiography interrogation is provided. The corrosion inhibitor mitigates or eliminates corrosion on the surface of a substrate upon which the corrosion inhibitor is applied. In addition, the corrosion inhibitor provides for the attenuation of a signal used during radiography interrogation thereby providing for detection of anomalies on the surface of the substrate.

  20. Using Machine Learning for Advanced Anomaly Detection and Classification

    NASA Astrophysics Data System (ADS)

    Lane, B.; Poole, M.; Camp, M.; Murray-Krezan, J.

    2016-09-01

    Machine Learning (ML) techniques have successfully been used in a wide variety of applications to automatically detect and potentially classify changes in activity, or a series of activities, by utilizing large amounts of data, sometimes even seemingly unrelated data. The amount of data being collected, processed, and stored in the Space Situational Awareness (SSA) domain has grown at an exponential rate and is now better suited for ML. This paper describes the development of advanced algorithms to deliver significant improvements in the characterization of deep space objects and indication and warning (I&W) using a global network of telescopes that are collecting photometric data on a multitude of space-based objects. The Phase II Air Force Research Laboratory (AFRL) Small Business Innovative Research (SBIR) project Autonomous Characterization Algorithms for Change Detection and Characterization (ACDC), contracted to ExoAnalytic Solutions Inc., is providing the ability to detect and identify photometric signature changes due to potential space object changes (e.g. stability, tumble rate, aspect ratio), and to correlate observed changes to potential behavioral changes using a variety of techniques, including supervised learning. Furthermore, these algorithms run in real time on data being collected and processed by the ExoAnalytic Space Operations Center (EspOC), providing timely alerts and warnings while dynamically creating collection requirements for the EspOC for the algorithms that generate higher fidelity I&W. This paper will discuss the recently implemented ACDC algorithms, including the general design approach and results to date. The usage of supervised algorithms, such as Support Vector Machines, Neural Networks, k-Nearest Neighbors, etc., and unsupervised algorithms, for example k-means, Principal Component Analysis, Hierarchical Clustering, etc., and the implementations of these algorithms are explored. Results of applying these algorithms to EspOC data both in an off

  1. Anomaly detection for a vibrating structure: A subspace identification/tracking approach.

    PubMed

    Candy, J V; Franco, S N; Ruggiero, E L; Emmons, M C; Lopez, I M; Stoops, L M

    2017-08-01

    Mechanical devices operating in noisy environments lead to low signal-to-noise ratios creating a challenging signal processing problem to monitor the vibrational signature of the device in real-time. To detect/classify a particular type of device from noisy vibration data, it is necessary to identify signatures that make it unique. Resonant (modal) frequencies emitted offer a signature characterizing its operation. The monitoring of structural modes to determine the condition of a device under investigation is essential, especially if it is a critical entity of an operational system. The development of a model-based scheme capable of the on-line tracking of structural modal frequencies by applying both system identification methods to extract a modal model and state estimation methods to track their evolution is discussed along with the development of an on-line monitor capable of detecting anomalies in real-time. An application of this approach to an unknown structural device is discussed illustrating the approach and evaluating its performance.

  2. Compendium of Anomaly Detection and Reaction Tools and Projects

    DTIC Science & Technology

    2000-05-17

    of Data Network packets Reactions Alerts: paging and/or e-mailing system administrators (Enterprise version) Responses: Output to any SNMP compliant...Allain, in an e-mail dated April 25, 2000, to the Infosec e-mailing list, in response to a query about the use of the CVE (Common Vulnerabilities and

  3. Radio signal anomalies detected with MEXART in 2012 during the recovery phase of geomagnetic storms

    NASA Astrophysics Data System (ADS)

    Carrillo-Vargas, Armando; Pérez-Enríquez, Román; López-Montes, Rebeca; Rodríguez-Martínez, Mario; Ugalde-Calvillo, Luis Gerardo

    2016-11-01

    In this work we present MEXART observations in 2012 from 17 radio sources in which we detected anomalies in the radio signals of these sources occurring during the recovery phase of some geomagnetic storms. We performed FFT and wavelet analysis of the radio signals during these periods and found that, rather than IPS, the anomalies seem to originate in the ionosphere, especially because of the frequencies at which they are observed. We discuss these results under the view that the source of the geomagnetic storm is no longer in the interplanetary medium.

  4. Developing an automatic classification system of vegetation anomalies for early warning with the ASAP (Anomaly hot Spots of Agricultural Production) system

    NASA Astrophysics Data System (ADS)

    Meroni, M.; Rembold, F.; Urbano, F.; Lemoine, G.

    2016-12-01

    Anomaly maps and time profiles of remote sensing derived indicators relevant to monitoring crop and vegetation stress can be accessed online thanks to a rapidly growing number of web based portals. However, timely and systematic global analysis and coherent interpretation of such information, as needed for example for SDG 2 related monitoring, remains challenging. With the ASAP system (Anomaly hot Spots of Agricultural Production) we propose a two-step analysis to provide monthly warning of production deficits in water-limited agriculture worldwide. The first step is fully automated and aims at classifying each administrative unit (1st sub-national level) into a number of possible warning levels, ranging from "none" to "watch" and up to "extended alarm". The second step involves the verification of the automatic warnings and their integration into a short national level analysis by agricultural analysts. In this paper we describe the methodological development of the automatic vegetation anomaly classification system. Warnings are triggered only during the crop growing season, defined by a remote sensing based phenology. The classification takes into consideration the fraction of the agricultural and rangeland area of each administrative unit that is affected by a severe anomaly of two rainfall-based indicators (the Standardized Precipitation Index (SPI), computed at 1- and 3-month scales) and one biophysical indicator (the cumulative NDVI from the start of the growing season). The severity of the warning thus depends on the timing, the nature and the number of indicators for which an anomaly is detected. The prototype system uses global NDVI images from the METOP sensor, while a second version is being developed based on 1 km MODIS NDVI with temporal smoothing and near real time filtering. A specific water balance model is also under development to include agricultural water stress information in addition to the SPI. The monthly warning classification and crop

  5. Magnetic anomaly detection (MAD) of ferromagnetic pipelines using principal component analysis (PCA)

    NASA Astrophysics Data System (ADS)

    Sheinker, Arie; Moldwin, Mark B.

    2016-04-01

    The magnetic anomaly detection (MAD) method is used for detection of visually obscured ferromagnetic objects. The method exploits the magnetic field originating from the ferromagnetic object, which constitutes an anomaly in the ambient earth’s magnetic field. Traditionally, MAD is used to detect objects with a magnetic field of a dipole structure, where far from the object it can be considered as a point source. In the present work, we expand MAD to the case of a non-dipole source, i.e. a ferromagnetic pipeline. We use principal component analysis (PCA) to calculate the principal components, which are then employed to construct an effective detector. Experiments conducted in our lab with real-world data validate the above analysis. The simplicity, low computational complexity, and the high detection rate make the proposed detector attractive for real-time, low power applications.

  6. Anomaly Detection in Radiation Sensor Data with Applications to Transportation Security

    SciTech Connect

    Omitaomu, Olufemi A; Ganguly, Auroop R; Patton, Bruce W; Protopopescu, Vladimir A

    2009-01-01

    In this paper, we present a new approach for detecting trucks transporting illicit radioactive materials using radiation data. The approach is motivated by the high number of false alarms that typically results when using radiation portal monitors. Our approach is a three-stage anomaly detection process that consists of transforming the radiation sensor data into wavelet coefficients, representing the transformed data in binary form, and detecting anomalies among data sets using a proximity-based method. The approach is evaluated using simulated radiation data, and the results are encouraging. From a transportation security perspective, our results indicate that the concomitant use of gross count and spectroscopy radiation data improves identification of trucks transporting illicit radioactive materials. The results also suggest that the use of additional heterogeneous data with radiation data may enhance the reliability of the detection process. Further testing with real radiation data and mixture of cargo is needed to fully validate the results.

  7. Rapid Anomaly Detection and Tracking via Compressive Time-Spectra Measurement

    DTIC Science & Technology

    2016-02-12

    matrix. Yet at the same time, due to a deterministic sequence of permutations and inversions of its rows and columns, the STOne matrix has the...measurement strategies to perform and optimize compressed domain anomaly detection. Throughout the project we compared STOne (Sum-to-One) and Walsh...image previews and full-scale image reconstructions. These matrices were dubbed Sum-to-One, or STOne pattern. 1 New modes for change detection

  8. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
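
    As an illustration of the similarity measure described above, the following is a minimal sketch of the normalized longest common subsequence (LCS) between two symbol sequences, using the standard quadratic dynamic program rather than the faster hybrid algorithm the paper proposes; the example sequences are hypothetical.

```python
# Hedged sketch: normalized LCS similarity between two symbol sequences.
def lcs_length(a, b):
    """Length of the longest common subsequence of sequences a and b."""
    prev = [0] * (len(b) + 1)
    for x in a:
        curr = [0] * (len(b) + 1)
        for j, y in enumerate(b, start=1):
            curr[j] = prev[j - 1] + 1 if x == y else max(prev[j], curr[j - 1])
        prev = curr
    return prev[-1]


def normalized_lcs_similarity(a, b):
    """LCS length normalized by the longer sequence; values near 1.0 mean similar order."""
    if not a or not b:
        return 0.0
    return lcs_length(a, b) / max(len(a), len(b))


if __name__ == "__main__":
    # Hypothetical symbol sequences (e.g. discretized event symbols).
    s1 = ["open", "read", "seek", "read", "close"]
    s2 = ["open", "read", "read", "write", "close"]
    print(normalized_lcs_similarity(s1, s2))  # 0.8
```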

  9. Towards spatial localisation of harmful algal blooms; statistics-based spatial anomaly detection

    NASA Astrophysics Data System (ADS)

    Shutler, J. D.; Grant, M. G.; Miller, P. I.

    2005-10-01

    Harmful algal blooms are believed to be increasing in occurrence and their toxins can be concentrated by filter-feeding shellfish and cause amnesia or paralysis when ingested. As a result fisheries and beaches in the vicinity of blooms may need to be closed and the local population informed. For this avoidance planning timely information on the existence of a bloom, its species and an accurate map of its extent would be prudent. Current research to detect these blooms from space has mainly concentrated on spectral approaches towards determining species. We present a novel statistics-based background-subtraction technique that produces improved descriptions of an anomaly's extent from remotely-sensed ocean colour data. This is achieved by extracting bulk information from a background model; this is complemented by a computer vision ramp filtering technique to specifically detect the perimeter of the anomaly. The complete extraction technique uses temporal-variance estimates which control the subtraction of the scene of interest from the time-weighted background estimate, producing confidence maps of anomaly extent. Through the variance estimates the method learns the associated noise present in the data sequence, providing robustness, and allowing generic application. Further, the use of the median for the background model reduces the effects of anomalies that appear within the time sequence used to generate it, allowing seasonal variations in the background levels to be closely followed. To illustrate the detection algorithm's application, it has been applied to two spectrally different oceanic regions.
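
    A minimal sketch of the general idea described above, assuming a stack of co-registered past scenes: the temporal median serves as the background model and the temporal standard deviation scales the subtraction into a confidence-like map. The ramp filtering of the anomaly perimeter and the authors' exact time weighting are not reproduced here.

```python
import numpy as np


def anomaly_confidence_map(stack, scene, eps=1e-6):
    """Confidence-like map of how anomalous `scene` is relative to the
    temporal background described by `stack` (shape: time x rows x cols)."""
    background = np.median(stack, axis=0)       # robust background, tolerant of past anomalies
    variability = np.std(stack, axis=0) + eps   # learned per-pixel noise level
    return (scene - background) / variability   # large positive values = likely anomaly


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stack = rng.normal(1.0, 0.1, size=(20, 64, 64))  # e.g. past ocean-colour scenes
    scene = rng.normal(1.0, 0.1, size=(64, 64))
    scene[30:34, 30:34] += 1.0                       # implanted "bloom"
    conf = anomaly_confidence_map(stack, scene)
    print(conf.max() > 5)                            # True: the bloom stands out
```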

  10. Fetal anomaly detection by second-trimester ultrasonography in a tertiary center.

    PubMed

    VanDorsten, J P; Hulsey, T C; Newman, R B; Menard, M K

    1998-04-01

    Our purpose was to determine the relative accuracy of indicated versus screening second-trimester ultrasonography for detection of fetal anomalies and to assess the cost effectiveness of anomaly screening. The study population consisted of 2031 pregnant women with singleton gestations who prospectively underwent ultrasonographic scanning between 15 and 22 weeks and received complete obstetric care at the Medical University of South Carolina between July 1, 1993, and June 30, 1996. Patients were divided into two groups: (1) indicated and (2) screening. The cost of screening ultrasonography was compared with the cost of newborn care for selected anomalous fetuses. Forty-seven fetuses (2.3%) were diagnosed by ultrasonography as having a major anomaly: 8.6% in the indicated group and 0.68% in the screening group (p=0.001). The sensitivity for detecting the anomalous fetus was 75.0% overall: 89.7% in the indicated group and 47.6% in the screening group (p=0.001). Of the 47 patients diagnosed with fetal anomalies, 11 (23.4%) chose pregnancy termination; of the 35 (74.5%) live-born anomalous infants, 29 (82.9%) were discharged alive. Projected newborn cost savings offset the cost of routine midtrimester screening. Detection of anomalous fetuses was significantly better in the indicated compared with the screening group. Nevertheless, routine ultrasonographic screening appeared cost-effective in our population.

  11. Method of sensitivity analysis in anomaly detection algorithms for hyperspectral images

    NASA Astrophysics Data System (ADS)

    Messer, Adam J.; Bauer, Kenneth W.

    2017-05-01

    Anomaly detection within hyperspectral images often relies on the critical step of thresholding to declare specific pixels anomalous based on their anomaly scores. When the detector is built upon sound statistical assumptions, this threshold is often probabilistically based, as with the RX detector and its chi-squared threshold. However, when the detector lacks a statistical framework, or when the background pixels of the image violate the required assumptions, thresholding becomes complicated and can lead to performance instability. We present a method to test the sensitivity of thresholding to small changes in the characteristics of the anomalies, based on their Mahalanobis distance to the background class. In doing so, we highlight issues in detector thresholding techniques by comparing statistical approaches against heuristic methods of thresholding.
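
    As a reminder of the probabilistic baseline mentioned above, the sketch below computes a chi-squared declaration threshold for RX-style anomaly scores at a chosen false-alarm rate; it illustrates the statistical thresholding being analyzed, not the paper's sensitivity method, and the band count and rate are illustrative assumptions.

```python
from scipy.stats import chi2


def rx_threshold(n_bands, false_alarm_rate=1e-3):
    """Score threshold for RX-style anomaly scores assumed chi-squared
    distributed with one degree of freedom per spectral band."""
    return chi2.ppf(1.0 - false_alarm_rate, df=n_bands)


if __name__ == "__main__":
    print(rx_threshold(n_bands=50))  # roughly 86.7 for 50 bands at a 0.1% false-alarm rate
```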

  12. A Stochastic-entropic Approach to Detect Persistent Low-temperature Volcanogenic Thermal Anomalies

    NASA Astrophysics Data System (ADS)

    Pieri, D. C.; Baxter, S.

    2011-12-01

    Eruption prediction is a chancy idiosyncratic affair, as volcanoes often manifest waxing and/or waning pre-eruption emission, geodetic, and seismic behavior that is unsystematic. Thus, fundamental to increased prediction accuracy and precision are good and frequent assessments of the time-series behavior of relevant precursor geophysical, geochemical, and geological phenomena, especially when volcanoes become restless. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), in orbit since 1999 on the NASA Terra Earth Observing System satellite, is an important capability for detection of thermal eruption precursors (even subtle ones) and increased passive gas emissions. The unique combination of ASTER high spatial resolution multi-spectral thermal IR imaging data (90 m/pixel; 5 bands in the 8-12 μm region), combined with simultaneous visible and near-IR imaging data and stereo-photogrammetric capabilities, makes it a useful precursor detection tool, especially for thermal precursors. The JPL ASTER Volcano Archive, consisting of 80,000+ ASTER volcano images, allows systematic analysis of (a) baseline thermal emissions for 1550+ volcanoes, (b) important aspects of the time-dependent thermal variability, and (c) the limits of detection of temporal dynamics of eruption precursors. We are analyzing a catalog of the magnitude, frequency, and distribution of ASTER-documented volcano thermal signatures, compiled from 2000 onward, at 90 m/pixel. Low contrast thermal anomalies of relatively low apparent absolute temperature (e.g., summit lakes, fumarolically altered areas, geysers, very small sub-pixel hotspots), for which the signal-to-noise ratio may be marginal (e.g., scene confusion due to clouds, water and water vapor, fumarolic emissions, variegated ground emissivity, and their combinations), are particularly important to discern and monitor. We have developed a technique to detect persistent hotspots that takes into account in-scene observed pixel joint frequency

  13. Evidence for consciousness-related anomalies in random physical systems

    NASA Astrophysics Data System (ADS)

    Radin, Dean I.; Nelson, Roger D.

    1989-12-01

    Speculations about the role of consciousness in physical systems are frequently observed in the literature concerned with the interpretation of quantum mechanics. While only three experimental investigations can be found on this topic in physics journals, more than 800 relevant experiments have been reported in the literature of parapsychology. A well-defined body of empirical evidence from this domain was reviewed using meta-analytic techniques to assess methodological quality and overall effect size. Results showed effects conforming to chance expectation in control conditions and unequivocal non-chance effects in experimental conditions. This quantitative literature review agrees with the findings of two earlier reviews, suggesting the existence of some form of consciousness-related anomaly in random physical systems.

  14. Detecting ship targets in spaceborne infrared image based on modeling radiation anomalies

    NASA Astrophysics Data System (ADS)

    Wang, Haibo; Zou, Zhengxia; Shi, Zhenwei; Li, Bo

    2017-09-01

    Using infrared imaging sensors to detect ship targets in the ocean environment has many advantages compared to other sensor modalities, such as better thermal sensitivity and all-weather detection capability. We propose a new ship detection method for spaceborne infrared images based on modeling radiation anomalies. The proposed method can be decomposed into two stages, where in the first stage, a test infrared image is densely divided into a set of image patches and the radiation anomaly of each patch is estimated by a Gaussian Mixture Model (GMM), and thereby target candidates are obtained from anomalous image patches. In the second stage, target candidates are further checked by a more discriminative criterion to obtain the final detection result. The main innovation of the proposed method is inspired by the biological mechanism that human eyes are sensitive to unusual and anomalous patches among a complex background. The experimental results on the short wavelength infrared band (1.560 - 2.300 μm) and long wavelength infrared band (10.30 - 12.50 μm) of the Landsat-8 satellite show that the proposed method achieves the desired ship detection accuracy with higher recall than other classical ship detection methods.
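
    A minimal sketch in the spirit of the first stage described above: patches are scored by their negative log-likelihood under a GMM. As a simplifying assumption of this sketch, the GMM is fitted to a target-free background scene, and the patch size, component count and diagonal covariance are illustrative choices rather than values from the paper.

```python
import numpy as np
from sklearn.mixture import GaussianMixture


def extract_patches(image, patch=8):
    """Densely tile the image into non-overlapping patches (a simplification)."""
    h, w = image.shape
    coords = [(i, j) for i in range(0, h - patch + 1, patch)
              for j in range(0, w - patch + 1, patch)]
    feats = np.array([image[i:i + patch, j:j + patch].ravel() for i, j in coords])
    return coords, feats


def fit_background_gmm(image, patch=8, n_components=3):
    """Fit a GMM to patches of an (assumed target-free) background scene."""
    _, feats = extract_patches(image, patch)
    return GaussianMixture(n_components=n_components,
                           covariance_type="diag", random_state=0).fit(feats)


def patch_anomaly_scores(gmm, image, patch=8):
    """Negative log-likelihood per patch; higher means more anomalous."""
    coords, feats = extract_patches(image, patch)
    return coords, -gmm.score_samples(feats)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train = rng.normal(20.0, 1.0, size=(128, 128))  # simulated sea background
    test = rng.normal(20.0, 1.0, size=(128, 128))
    test[40:48, 64:72] += 15.0                      # bright "ship" patch
    gmm = fit_background_gmm(train)
    coords, scores = patch_anomaly_scores(gmm, test)
    print(coords[int(np.argmax(scores))])           # (40, 64): the implanted target
```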

  15. Temporal anomaly detection: an artificial immune approach based on T cell activation, clonal size regulation and homeostasis.

    PubMed

    Antunes, Mário J; Correia, Manuel E

    2010-01-01

    This paper presents an artificial immune system (AIS) based on Grossman's tunable activation threshold (TAT) for temporal anomaly detection. We describe the generic AIS framework and the TAT model adopted for simulating T cell behaviour, emphasizing two novel important features: the temporal dynamic adjustment of T cell clonal size and its associated homeostasis mechanism. We also present some promising results obtained with artificially generated data sets, aiming to test the appropriateness of using TAT in dynamically changing environments, to distinguish new unseen patterns as part of what should be detected as normal or as anomalous. We conclude by discussing the results obtained thus far with artificially generated data sets.

  16. Application of Artificial Bee Colony algorithm in TEC seismo-ionospheric anomalies detection

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2015-09-01

    In this study, the efficiency of the Artificial Bee Colony (ABC) algorithm is investigated to detect the TEC (Total Electron Content) seismo-ionospheric anomalies around the time of some strong earthquakes, including Chile (27 February 2010; 01 April 2014), Varzeghan (11 August 2012), Saravan (16 April 2013) and Papua New Guinea (29 March 2015). In comparison with other anomaly detection algorithms, ABC has a number of advantages, which can be enumerated as (1) detection of discord patterns in large nonlinear data sets in a short time, (2) simplicity, (3) fewer control parameters and (4) efficiency in solving multimodal and multidimensional optimization problems. The results of this study also support the TEC time series as a robust earthquake precursor.

  17. A scalable architecture for online anomaly detection of WLCG batch jobs

    NASA Astrophysics Data System (ADS)

    Kuehn, E.; Fischer, M.; Giffels, M.; Jung, C.; Petzold, A.

    2016-10-01

    For data centres it is increasingly important to monitor network usage and learn from network usage patterns. In particular, configuration issues or misbehaving batch jobs that prevent smooth operation need to be detected as early as possible. At the GridKa data and computing centre we therefore operate a tool, BPNetMon, for monitoring traffic data and characteristics of WLCG batch jobs and pilots locally on different worker nodes. On the one hand, local information by itself is not sufficient to detect anomalies, for several reasons: for example, the underlying job distribution on a single worker node might change, or there might be a local misconfiguration. On the other hand, a centralised anomaly detection approach does not scale in terms of network communication or computational cost. We therefore propose a scalable architecture based on concepts of a super-peer network.

  18. Beyond Trisomy 21: Additional Chromosomal Anomalies Detected through Routine Aneuploidy Screening

    PubMed Central

    Metcalfe, Amy; Hippman, Catriona; Pastuck, Melanie; Johnson, Jo-Ann

    2014-01-01

    Prenatal screening is often misconstrued by patients as screening for trisomy 21 alone; however, other chromosomal anomalies are often detected. This study aimed to systematically review the literature and use diagnostic meta-analysis to derive pooled detection and false positive rates for aneuploidies other than trisomy 21 with different prenatal screening tests. Non-invasive prenatal testing had the highest detection (DR) and lowest false positive (FPR) rates for trisomy 13 (DR: 90.3%; FPR: 0.2%), trisomy 18 (DR: 98.1%; FPR: 0.2%), and 45,X (DR: 92.2%; FPR: 0.1%); however, most estimates came from high-risk samples. The first trimester combined test also had high DRs for all conditions studied (trisomy 13 DR: 83.1%; FPR: 4.4%; trisomy 18 DR: 91.9%; FPR: 3.5%; 45,X DR: 70.1%; FPR: 5.4%; triploidy DR: 100%; FPR: 6.3%). Second trimester triple screening had the lowest DRs and highest FPRs for all conditions (trisomy 13 DR: 43.9%; FPR: 8.1%; trisomy 18 DR: 70.5%; FPR: 3.3%; 45,X DR: 77.2%; FPR: 9.3%). Prenatal screening tests differ in their ability to accurately detect chromosomal anomalies. Patients should be counseled about the ability of prenatal screening to detect anomalies other than trisomy 21 prior to undergoing screening. PMID:26237381

  19. Anomaly Detection in Gamma-Ray Vehicle Spectra with Principal Components Analysis and Mahalanobis Distances

    SciTech Connect

    Tardiff, Mark F.; Runkle, Robert C.; Anderson, K. K.; Smith, L. E.

    2006-01-23

    The goal of primary radiation monitoring in support of routine screening and emergency response is to detect characteristics in vehicle radiation signatures that indicate the presence of potential threats. Two conceptual approaches to analyzing gamma-ray spectra for threat detection are isotope identification and anomaly detection. While isotope identification is the time-honored method, an emerging technique is anomaly detection that uses benign vehicle gamma-ray signatures to define an expectation of the radiation signature for vehicles that do not pose a threat. Newly acquired spectra are then compared to this expectation using statistical criteria that reflect acceptable false alarm rates and probabilities of detection. The gamma-ray spectra analyzed here were collected at a U.S. land Port of Entry (POE) using a NaI-based radiation portal monitor (RPM). The raw data were analyzed to develop a benign vehicle expectation by decimating the original pulse-height channels to 35 energy bins, extracting composite variables via principal components analysis (PCA), and estimating statistically weighted distances from the mean vehicle spectrum with the Mahalanobis distance (MD) metric. This paper reviews the methods used to establish the anomaly identification criteria and presents a systematic analysis of the response of the combined PCA and MD algorithm to modeled mono-energetic gamma-ray sources.
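
    A minimal sketch of the combined PCA and MD idea described above, assuming benign spectra have already been decimated to energy bins (rows are vehicles, columns are bins): PCA extracts composite variables from benign data, and new spectra are scored by their Mahalanobis distance in the component space. The bin count, component count and the simulated spectra are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA


class SpectrumAnomalyDetector:
    """PCA on benign spectra followed by Mahalanobis-distance scoring."""

    def __init__(self, n_components=5):
        self.pca = PCA(n_components=n_components)

    def fit(self, benign_spectra):
        z = self.pca.fit_transform(benign_spectra)
        self.mean = z.mean(axis=0)
        self.inv_cov = np.linalg.inv(np.cov(z, rowvar=False))
        return self

    def mahalanobis(self, spectra):
        z = self.pca.transform(spectra) - self.mean
        return np.sqrt(np.einsum("ij,jk,ik->i", z, self.inv_cov, z))


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    benign = rng.poisson(100, size=(500, 35)).astype(float)  # 500 vehicles x 35 energy bins
    detector = SpectrumAnomalyDetector().fit(benign)
    threat = benign[:1].copy()
    threat[0, 18:21] += 400.0                                 # spiked energy bins
    print(detector.mahalanobis(threat)[0] > detector.mahalanobis(benign).mean())  # True
```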

  20. Millimeter Wave Detection of Localized Anomalies in the Space Shuttle External Fuel Tank Insulating Foam

    NASA Technical Reports Server (NTRS)

    Kharkovsky, S.; Case, J. T.; Abou-Khousa, M. A.; Zoughi, R.; Hepburn, F.

    2006-01-01

    The Space Shuttle Columbia's catastrophic accident emphasizes the growing need for developing and applying effective, robust and life-cycle oriented nondestructive testing (NDT) methods for inspecting the shuttle external fuel tank spray-on foam insulation (SOFI). Millimeter wave NDT techniques were one of the methods chosen for evaluating their potential for inspecting these structures. Several panels with embedded anomalies (mainly voids) were produced and tested for this purpose. Near-field and far-field millimeter wave NDT methods were used for producing images of the anomalies in these panels. This paper presents the results of an investigation for the purpose of detecting localized anomalies in several SOFI panels. To this end, reflectometers at a relatively wide range of frequencies (Ka-band (26.5 - 40 GHz) to W-band (75 - 110 GHz)) and utilizing different types of radiators were employed. The resulting raw images revealed a significant amount of information about the interior of these panels. However, using simple image processing techniques, the results were improved further, in particular as they relate to detecting the smaller anomalies. This paper presents the results of this investigation and a discussion of these results.

  1. Local anomaly detection algorithm based on sliding windows in spectral space

    NASA Astrophysics Data System (ADS)

    Li, Zhiyong; Zhou, Shilin; Han, Yong; Wang, Liangliang

    2014-11-01

    In this paper, a novel local way to implement a hyperspectral anomaly detector is presented. Usually, local detectors are implemented within spatial windows of the image scene, but the proposed approach is implemented within windows of the spectral space. As multivariate data, hyperspectral image datasets can be considered a low-dimensional manifold embedded in the high-dimensional spectral space. In real environments, nonlinear spectral mixing occurs frequently. In these situations, the whole dataset is distributed over one or more nonlinear manifolds in high-dimensional space, such as a hyper-curved surface or a nonlinear hyper-simplex. However, the majority of global and local detectors for hyperspectral images are based on linear projections. They are built on the assumption that the geometric distribution of the data is a linear manifold and are therefore incapable of dealing with nonlinear manifold data, even within spatially local windows. In this paper, a novel anomaly detection algorithm based on local linear manifolds is put forward to handle these nonlinear manifold problems. In the algorithm, local neighborhood relationships are established in spectral space, and an anomaly detector based on linear projection is then carried out in these local areas. This is similar to using sliding windows in the spectral space. The results are compared with a classic spatially local algorithm using a real hyperspectral image and demonstrate effectiveness in improving detection of weak anomalies and decreasing false alarms.

  2. An expert system for diagnosing environmentally induced spacecraft anomalies

    NASA Technical Reports Server (NTRS)

    Rolincik, Mark; Lauriente, Michael; Koons, Harry C.; Gorney, David

    1992-01-01

    A new rule-based, machine independent analytical tool was designed for diagnosing spacecraft anomalies using an expert system. Expert systems provide an effective method for saving knowledge, allow computers to sift through large amounts of data pinpointing significant parts, and most importantly, use heuristics in addition to algorithms, which allow approximate reasoning and inference and the ability to attack problems not rigidly defined. The knowledge base consists of over two-hundred (200) rules and provides links to historical and environmental databases. The environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward chaining rules into a backward chaining sequence, prompting the user for information pertinent to the causes considered. The use of heuristics frees the user from searching through large amounts of irrelevant information and allows the user to input partial information (varying degrees of confidence in an answer) or 'unknown' to any question. The modularity of the expert system allows for easy updates and modifications. It not only provides scientists with needed risk analysis and confidence not found in algorithmic programs, but is also an effective learning tool, and the window implementation makes it very easy to use. The system currently runs on a Micro VAX II at Goddard Space Flight Center (GSFC). The inference engine used is NASA's C Language Integrated Production System (CLIPS).

  3. Theory and experiments in model-based space system anomaly management

    NASA Astrophysics Data System (ADS)

    Kitts, Christopher Adam

    This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms were developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.

  4. Conformal prediction for anomaly detection and collision alert in space surveillance

    NASA Astrophysics Data System (ADS)

    Chen, Huimin; Chen, Genshe; Blasch, Erik; Pham, Khanh

    2013-05-01

    Anomaly detection has been considered as an important technique for detecting critical events in a wide range of data rich applications where a majority of the data is inconsequential and/or uninteresting. We study the detection of anomalous behaviors among space objects using the theory of conformal prediction for distribution-independent on-line learning to provide collision alerts with a desirable confidence level. We exploit the fact that conformal predictors provide valid forecasted sets at specified confidence levels under the relatively weak assumption that the normal training data, together with the normal testing data, are generated from the same distribution. If the actual observation is not included in the conformal prediction set, it is classified as anomalous at the corresponding significance level. Interpreting the significance level as an upper bound of the probability that a normal observation is mistakenly classified as anomalous, we can conveniently adjust the sensitivity to anomalies while controlling the false alarm rate without having to find the application specific threshold. The proposed conformal prediction method was evaluated for a space surveillance application using the open source North American Aerospace Defense Command (NORAD) catalog data. The validity of the prediction sets is justified by the empirical error rate that matches the significance level. In addition, experiments with simulated anomalous data indicate that anomaly detection sensitivity with conformal prediction is superior to that of the existing methods in declaring potential collision events.
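
    A minimal sketch of conformal anomaly detection in the sense described above, assuming a k-nearest-neighbour distance as the nonconformity measure: an observation is declared anomalous when its conformal p-value falls below the chosen significance level, which then upper-bounds the false alarm rate. The NORAD catalog data and the paper's actual nonconformity measure are not reproduced; k and the significance level are illustrative.

```python
import numpy as np


def knn_nonconformity(point, reference, k=5):
    """Sum of distances to the k nearest reference points (larger = stranger)."""
    d = np.sort(np.linalg.norm(reference - point, axis=1))
    return d[:k].sum()


def conformal_p_value(x, train, k=5):
    """Fraction of training points at least as nonconforming as x (smoothed)."""
    scores = np.array([knn_nonconformity(train[i], np.delete(train, i, axis=0), k)
                       for i in range(len(train))])
    alpha_x = knn_nonconformity(x, train, k)
    return (np.sum(scores >= alpha_x) + 1) / (len(train) + 1)


if __name__ == "__main__":
    rng = np.random.default_rng(2)
    normal = rng.normal(0.0, 1.0, size=(200, 2))  # e.g. features of normal object behaviour
    outlier = np.array([6.0, 6.0])
    epsilon = 0.05                                # significance level = false-alarm bound
    print(conformal_p_value(outlier, normal) < epsilon)  # True: declared anomalous
```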

  5. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R. J.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by.

  6. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R. J.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by.

  7. An earthquake from space: detection of precursory magnetic anomalies from Swarm satellites before the 2015 M8 Nepal Earthquake

    NASA Astrophysics Data System (ADS)

    De Santis, A.; Balasis, G.; Pavón-Carrasco, F. J.; Cianchini, G.; Mandea, M.

    2015-12-01

    A large earthquake of around magnitude 8 occurred on 25 April 2015, 06:26 UTC, with its epicenter in Nepal, causing more than 9000 fatalities and devastating destruction. The simultaneous orbiting of ESA's three Swarm satellites in the topside ionosphere makes it possible to look for possible pre-earthquake magnetic anomalous signals, likely due to some lithosphere-atmosphere-ionosphere (LAI) coupling. First, a wavelet analysis was performed for the same day as the earthquake (from the external magnetic point of view, an exceptionally quiet day), with the result that an anomalous and persistent ULF signal (from around 3 to 6 UTC) is clearly detected before the earthquake. After this single-spot analysis, we performed a more extensive analysis for two months around the earthquake occurrence, to confirm or refute the cause-effect relationship. From the series of magnetic anomalies detected (during night-time and magnetically quiet times) from the Swarm satellites, we show that the cumulative number of anomalies follows the same typical power-law behavior of a critical system approaching its critical time, in our case the large seismic event of 25 April 2015, and then recovers as in the typical recovery phase after a large earthquake. The impressive similarity of this behavior to the analogous behavior seen in seismic data analysis provides strong support for the lithospheric origin of the satellite magnetic anomalies, as due to the LAI coupling during the preparation phase of the Nepal earthquake.

  8. Improvements in the method of radiation anomaly detection by spectral comparison ratios.

    PubMed

    Pfund, D M; Anderson, K K; Detwiler, R S; Jarman, K D; McDonald, B S; Milbrath, B D; Myjak, M J; Paradis, N C; Robinson, S M; Woodring, M L

    2016-04-01

    We present a new procedure for configuring the Nuisance-rejection Spectral Comparison Ratio Anomaly Detection (N-SCRAD) method. The procedure minimizes detectable count rates of source spectra at a specified false positive rate using simulated annealing. We also present a new method for correcting the estimates of background variability used in N-SCRAD to current conditions of the total count rate. The correction lowers detection thresholds for a specified false positive rate, enabling greater sensitivity to targets. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection

    SciTech Connect

    Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D; Reed, Joel W; Goodall, John R

    2016-01-01

    This paper introduces a novel graph-analytic approach for detecting anomalies in network flow data called GraphPrints. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets, small induced subgraphs that describe local topology. By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified, and furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level and 0.05% at the IP level, with 100% true positive rates at both.
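
    A much-simplified sketch in the spirit of GraphPrints, not its implementation: each time slice of flow records becomes a graph, a small topology count vector stands in for the full graphlet counts, and slices whose vector deviates strongly from the median over all slices are flagged. The count features, the MAD-based outlier rule and the threshold are assumptions of this sketch.

```python
import numpy as np
import networkx as nx


def topology_vector(edge_list):
    """Edges, wedges (paths of length 2) and triangles of one traffic graph."""
    g = nx.Graph()
    g.add_edges_from((a, b) for a, b in edge_list if a != b)
    degrees = np.array([d for _, d in g.degree()], dtype=float)
    wedges = float(np.sum(degrees * (degrees - 1) / 2.0))
    triangles = sum(nx.triangles(g).values()) / 3.0
    return np.array([g.number_of_edges(), wedges, triangles], dtype=float)


def flag_anomalous_slices(slices, z_thresh=5.0):
    """Indices of time slices whose topology vector deviates strongly
    (in robust MAD units) from the median over all slices."""
    vectors = np.array([topology_vector(s) for s in slices])
    median = np.median(vectors, axis=0)
    mad = np.median(np.abs(vectors - median), axis=0) + 1e-9
    z = np.abs(vectors - median) / mad
    return np.where(z.max(axis=1) > z_thresh)[0]


if __name__ == "__main__":
    import random
    random.seed(0)
    slices = [[(random.randrange(30), random.randrange(30)) for _ in range(40)]
              for _ in range(24)]                                        # sparse "normal" traffic
    slices.append([(i, j) for i in range(30) for j in range(i + 1, 30)])  # dense anomalous slice
    print(flag_anomalous_slices(slices))  # index 24 (the dense slice) should be flagged
```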

  10. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.

    2015-10-01

    We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is also confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.
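
    A minimal sketch of cosine-similarity shape screening with a noise-adaptive threshold. The closed-form distribution derived in the paper is not reproduced; instead the threshold is calibrated empirically by simulating noise around the reference shape at the estimated noise level, which is an assumption of this sketch rather than the authors' analytical procedure.

```python
import numpy as np


def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def adaptive_threshold(reference, noise_sigma, false_alarm=0.01, n_sim=5000, seed=0):
    """Similarity level that a normal-but-noisy shape falls below only
    `false_alarm` of the time, estimated by simulation at the given noise level."""
    rng = np.random.default_rng(seed)
    sims = [cosine_similarity(reference,
                              reference + rng.normal(0.0, noise_sigma, reference.shape))
            for _ in range(n_sim)]
    return float(np.quantile(sims, false_alarm))


if __name__ == "__main__":
    t = np.linspace(0.0, 2.0 * np.pi, 200)
    reference = np.sin(t)                 # stand-in for a normal ECG cycle shape
    sigma = 0.5                           # estimated measurement noise level
    threshold = adaptive_threshold(reference, sigma)
    observed = -np.sin(t) + np.random.default_rng(1).normal(0.0, sigma, t.shape)
    print(cosine_similarity(reference, observed) < threshold)  # True: shape anomaly flagged
```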

  11. Capacitance probe for detection of anomalies in non-metallic plastic pipe

    DOEpatents

    Mathur, Mahendra P.; Spenik, James L.; Condon, Christopher M.; Anderson, Rodney; Driscoll, Daniel J.; Fincham, Jr., William L.; Monazam, Esmail R.

    2010-11-23

    The disclosure relates to analysis of materials using a capacitive sensor to detect anomalies through comparison of measured capacitances. The capacitive sensor is used in conjunction with a capacitance measurement device, a location device, and a processor in order to generate a capacitance versus location output which may be inspected for the detection and localization of anomalies within the material under test. The components may be carried as payload on an inspection vehicle which may traverse through a pipe interior, allowing evaluation of nonmetallic or plastic pipes when the piping exterior is not accessible. In an embodiment, supporting components are solid-state devices powered by a low voltage on-board power supply, providing for use in environments where voltage levels may be restricted.

  12. Electrical Resistivity Tomography Using Wenner β - Schlumberger Configuration for Anomaly Detection in The Soil

    NASA Astrophysics Data System (ADS)

    Pebriyanto, Y.; Dahlan, K.; Sari, Y. W.

    2017-03-01

    In subsurface exploration investigations many methods are used; one of them is Electrical Resistivity Tomography (ERT). The ERT method is able to measure the electrical properties of the material below the earth's surface based on the resistivity of the material, by injecting electric current and measuring the potential at the surface. The data obtained are then input into the RES2DINV software for final processing into a 2D image. In this research, two configurations, Wenner-Schlumberger and Wenner β - Schlumberger, were tested for detecting anomalies in homogeneous soil. A wooden box containing homogeneous soil was used for the test. Three anomalies (wood, stone, and wet soil) were placed in different positions and the variation of resistivity was detected. We found that the Wenner β - Schlumberger configuration results in a smaller resistivity error than the Wenner-Schlumberger configuration.

  13. Identifying High-Risk Patients without Labeled Training Data: Anomaly Detection Methodologies to Predict Adverse Outcomes

    PubMed Central

    Syed, Zeeshan; Saeed, Mohammed; Rubinfeld, Ilan

    2010-01-01

    For many clinical conditions, only a small number of patients experience adverse outcomes. Developing risk stratification algorithms for these conditions typically requires collecting large volumes of data to capture enough positive and negative examples for training. This process is slow, expensive, and may not be appropriate for new phenomena. In this paper, we explore different anomaly detection approaches to identify high-risk patients as cases that lie in sparse regions of the feature space. We study three broad categories of anomaly detection methods: classification-based, nearest neighbor-based, and clustering-based techniques. When evaluated on data from the National Surgical Quality Improvement Program (NSQIP), these methods were able to successfully identify patients at an elevated risk of mortality and rare morbidities following inpatient surgical procedures. PMID:21347083
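
    A minimal sketch of the nearest-neighbour flavour of the idea above: patients whose feature vectors lie in sparse regions of the feature space (large average distance to their k nearest neighbours) receive high anomaly scores. The feature matrix, k and the cutoff percentile are illustrative assumptions, not details of the NSQIP study.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors


def sparse_region_scores(features, k=10):
    """Average distance to the k nearest neighbours; larger = sparser region."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(features)
    distances, _ = nn.kneighbors(features)
    return distances[:, 1:].mean(axis=1)   # drop the zero self-distance


if __name__ == "__main__":
    rng = np.random.default_rng(3)
    cohort = rng.normal(0.0, 1.0, size=(1000, 8))  # hypothetical patient feature vectors
    scores = sparse_region_scores(cohort)
    flagged = np.where(scores > np.percentile(scores, 99))[0]
    print(len(flagged))                            # about 10 patients flagged for review
```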

  14. Identifying High-Risk Patients without Labeled Training Data: Anomaly Detection Methodologies to Predict Adverse Outcomes.

    PubMed

    Syed, Zeeshan; Saeed, Mohammed; Rubinfeld, Ilan

    2010-11-13

    For many clinical conditions, only a small number of patients experience adverse outcomes. Developing risk stratification algorithms for these conditions typically requires collecting large volumes of data to capture enough positive and negative examples for training. This process is slow, expensive, and may not be appropriate for new phenomena. In this paper, we explore different anomaly detection approaches to identify high-risk patients as cases that lie in sparse regions of the feature space. We study three broad categories of anomaly detection methods: classification-based, nearest neighbor-based, and clustering-based techniques. When evaluated on data from the National Surgical Quality Improvement Program (NSQIP), these methods were able to successfully identify patients at an elevated risk of mortality and rare morbidities following inpatient surgical procedures.

  15. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding.

    PubMed

    Krasichkov, Alexander S; Grigoriev, Eugene B; Bogachev, Mikhail I; Nifontov, Eugene M

    2015-10-01

    We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is also confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.

  16. Epsilon-optimal non-Bayesian anomaly detection for parametric tomography.

    PubMed

    Fillatre, Lionel; Nikiforov, Igor; Retraint, Florent

    2008-11-01

    The non-Bayesian detection of an anomaly from a single or a few noisy tomographic projections is considered as a statistical hypothesis testing problem. It is supposed that a radiography is composed of an imaged nonanomalous background medium, considered as a deterministic nuisance parameter, with a possibly hidden anomaly. Because the full voxel-by-voxel reconstruction is impossible, an original tomographic method based on the parametric models of the nonanomalous background medium and radiographic process is proposed to fill the gap in the missing data. Exploiting this "parametric tomography," a new detection scheme with a limited loss of optimality is proposed as an alternative to the nonlinear generalized likelihood ratio test, which is intractable in the context of nondestructive testing of objects with uncertainties in their physical/geometrical properties. The theoretical results are illustrated by the processing of real radiographies for nuclear fuel rod inspection.

  17. Improving Non-Linear Approaches to Anomaly Detection, Class Separation, and Visualization

    DTIC Science & Technology

    2014-12-26

    thickness (mm), two-hour serum insulin (μU/ml), body mass index (weight in kg/(height in m)²), diabetes pedigree function, and age (years). Patients...is O(Nmr + m³) but also requires large m to ensure sufficient sampling [139]. Drineas and Mahoney [67] developed a data-dependent non-uniform...IMPROVING NON-LINEAR APPROACHES TO ANOMALY DETECTION, CLASS SEPARATION, AND VISUALIZATION DISSERTATION Todd J. Paciencia, Major, USAF AFIT-ENS-DS-14

  18. Microarray-based comparative genomic hybridization analysis in neonates with congenital anomalies: detection of chromosomal imbalances.

    PubMed

    Emy Dorfman, Luiza; Leite, Júlio César L; Giugliani, Roberto; Riegel, Mariluce

    2015-01-01

    To identify chromosomal imbalances by whole-genome microarray-based comparative genomic hybridization (array-CGH) in DNA samples of neonates with congenital anomalies of unknown cause from a birth defects monitoring program at a public maternity hospital. A blind genomic analysis was performed retrospectively in 35 stored DNA samples of neonates born between July of 2011 and December of 2012. All potential DNA copy number variations (CNVs) detected were matched with those reported in public genomic databases, and their clinical significance was evaluated. Out of a total of 35 samples tested, 13 genomic imbalances were detected in 12/35 cases (34.3%). In 4/35 cases (11.4%), chromosomal imbalances could be defined as pathogenic; in 5/35 (14.3%) cases, DNA CNVs of uncertain clinical significance were identified; and in 4/35 cases (11.4%), normal variants were detected. Among the four cases with results considered causally related to the clinical findings, two of the four (50%) showed causative alterations already associated with well-defined microdeletion syndromes. In two of the four samples (50%), the chromosomal imbalances found, although predicted as pathogenic, had not been previously associated with recognized clinical entities. Array-CGH analysis allowed for a higher rate of detection of chromosomal anomalies, and this determination is especially valuable in neonates with congenital anomalies of unknown etiology, or in cases in which karyotype results cannot be obtained. Moreover, although the interpretation of the results must be refined, this method is a robust and precise tool that can be used in the first-line investigation of congenital anomalies, and should be considered for prospective/retrospective analyses of DNA samples by birth defect monitoring programs. Copyright © 2014 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  19. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
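
    A minimal sketch of residual-based monitoring in the spirit of the architecture above: sensed outputs are compared against model-predicted outputs and an anomaly is declared when the residual persistently leaves a noise-scaled band. The engine model is a stand-in constant here rather than the paper's piecewise linear design, and the band width, calibration window and persistence count are assumptions.

```python
import numpy as np


def monitor_residuals(sensed, predicted, band=3.0, persistence=5, calibration=50):
    """Yield sample indices at which an anomaly is declared: the residual has
    exceeded `band` noise standard deviations for `persistence` samples in a row."""
    residual = np.asarray(sensed, dtype=float) - np.asarray(predicted, dtype=float)
    sigma = np.std(residual[:calibration]) + 1e-9  # noise level from an assumed-nominal window
    run = 0
    for i, r in enumerate(residual):
        run = run + 1 if abs(r) > band * sigma else 0
        if run >= persistence:
            yield i


if __name__ == "__main__":
    rng = np.random.default_rng(4)
    predicted = np.full(500, 100.0)                    # stand-in model output
    sensed = predicted + rng.normal(0.0, 0.5, 500)
    sensed[300:] += 5.0                                # seeded step fault
    print(next(monitor_residuals(sensed, predicted)))  # first detection near sample 304
```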

  20. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  1. Using reactive tracers to detect flow field anomalies in water treatment reactors.

    PubMed

    Gresch, Markus; Braun, Daniel; Gujer, Willi

    2011-02-01

    The hydraulics of water and wastewater treatment reactors has a major impact on their performance and control. The residence time distribution as a measure for the hydraulics represents macroscopic mixing in an integrated way with no spatial information. However, with regard to optimal sensor location for process control and for process optimisation measures, spatial information about macro-mixing is helpful. Spatially distributed measurements of reactive tracers can provide this information. In this paper we generally discuss how reactive tracers can be used to detect and characterize distinct large scale flow structures. It is shown that tracer substances are particularly suited if their reaction time scale is similar to the time scale of the large scale flow structure. For nitrifying activated sludge systems, ammonium is identified to be a suitable tracer. In a comprehensive experimental study at a real aeration tank, two distinct large scale flow features were identified by distributed ammonium measurements. Flow velocity measurements using acoustic Doppler velocimetry clearly supported the nature of these flow field anomalies. Ion-selective electrodes are a well suited device for ammonium measurements providing the temporal resolution that is needed for such an analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.

  2. Symmetry fractionalization and anomaly detection in three-dimensional topological phases

    NASA Astrophysics Data System (ADS)

    Chen, Xie; Hermele, Michael

    2016-11-01

    In a phase with fractional excitations, topological properties are enriched in the presence of global symmetry. In particular, fractional excitations can transform under symmetry in a fractionalized manner, resulting in different symmetry enriched topological (SET) phases. While a good deal is now understood in 2D regarding what symmetry fractionalization patterns are possible, the situation in 3D is much more open. A new feature in 3D is the existence of loop excitations, so to study 3D SET phases, first we need to understand how to properly describe the fractionalized action of symmetry on loops. Using a dimensional reduction procedure, we show that these loop excitations exist as the boundary between two 2D SET phases, and the symmetry action is characterized by the corresponding difference in SET orders. Moreover, similar to the 2D case, we find that some seemingly possible symmetry fractionalization patterns are actually anomalous and cannot be realized strictly in 3D. We detect such anomalies using the flux fusion method we introduced previously in 2D. To illustrate these ideas, we use the 3D Z2 gauge theory with Z2 global symmetry as an example, and enumerate and describe the corresponding SET phases. In particular, we find four nonanomalous SET phases and one anomalous SET phase, which we show can be realized as the surface of a 4D system with symmetry protected topological order.

  3. Detection and Origin of Hydrocarbon Seepage Anomalies in the Barents Sea

    NASA Astrophysics Data System (ADS)

    Polteau, Stephane; Planke, Sverre; Stolze, Lina; Kjølhamar, Bent E.; Myklebust, Reidun

    2016-04-01

    We have collected more than 450 gravity cores in the Barents Sea to detect hydrocarbon seepage anomalies and for seismic-stratigraphic tie. The cores are from the Hoop Area (125 samples) and from the Barents Sea SE (293 samples). In addition, we have collected cores near seven exploration wells. The samples were analyzed using three different analytical methods; (1) the standard organic geochemical analyzes of Applied Petroleum Technologies (APT), (2) the Amplified Geochemical Imaging (AGI) method, and (3) the Microbial Prospecting for Oil and Gas (MPOG) method. These analytical approaches can detect trace amounts of thermogenic hydrocarbons in the sediment samples, and may provide additional information about the fluid phases and the depositional environment, maturation, and age of the source rocks. However, hydrocarbon anomalies in seabed sediments may also be related to shallow sources, such as biogenic gas or reworked source rocks in the sediments. To better understand the origin of the hydrocarbon anomalies in the Barents Sea we have studied 35 samples collected approximately 200 m away from seven exploration wells. The wells included three boreholes associated with oil discoveries, two with gas discoveries, one dry well with gas shows, and one dry well. In general, the results of this case study reveal that the oil wells have an oil signature, gas wells show a gas signature, and dry wells have a background signature. However, differences in results from the three methods may occur and have largely been explained in terms of analytical measurement ranges, method sensitivities, and bio-geochemical processes in the seabed sediments. The standard geochemical method applied by APT relies on measuring the abundance of compounds between C1 to C5 in the headspace gas and between C11 to C36 in the sediment extracts. The anomalies detected in the sediment samples from this study were in the C16 to C30 range. Since the organic matter yields were mostly very low, the

  4. OPAD through 1991 - Status report no. 2. [Optical Plume Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Powers, W. T.; Cooper, A. E.; Wallace, T. L.

    1991-01-01

    The Optical Plume Anomaly Detection (OPAD) experimental program has attempted to develop a rocket engine health monitor for the detection, and if possible the quantification, of anomalous atomic and molecular species in exhaust plumes. The test program has formulated instrument designs allowing both wide spectral range and high spectral resolution. Attention is presently given to OPAD data collected for the SSME at NASA-Marshall's technology test stand, with a view to spectral emissions at startup and variations in baseline plume emissions due to changes in rated power level.

  5. A multiscale hypothesis testing approach to anomaly detection and localization from noisy tomographic data.

    PubMed

    Frakt, A B; Karl, W C; Willsky, A S

    1998-01-01

    In this paper, we investigate the problems of anomaly detection and localization from noisy tomographic data. These are characteristic of a class of problems that cannot be optimally solved because they involve hypothesis testing over hypothesis spaces with extremely large cardinality. Our multiscale hypothesis testing approach addresses the key issues associated with this class of problems. A multiscale hypothesis test is a hierarchical sequence of composite hypothesis tests that discards large portions of the hypothesis space with minimal computational burden and zooms in on the likely true hypothesis. For the anomaly detection and localization problems, hypothesis zooming corresponds to spatial zooming - anomalies are successively localized to finer and finer spatial scales. The key challenges we address include how to hierarchically divide a large hypothesis space and how to process the data at each stage of the hierarchy to decide which parts of the hypothesis space deserve more attention. For the latter, we pose and solve a nonlinear optimization problem for a decision statistic that maximally disambiguates composite hypotheses. With no more computational complexity, our optimized statistic shows substantial improvement over conventional approaches. We provide examples that demonstrate this and quantify how much performance is sacrificed by the use of a suboptimal method as compared to that achievable if the optimal approach were computationally feasible.

  6. Wavelet-RX anomaly detection for dual-band forward-looking infrared imagery.

    PubMed

    Mehmood, Asif; Nasrabadi, Nasser M

    2010-08-20

    This paper describes a new wavelet-based anomaly detection technique for a dual-band forward-looking infrared (FLIR) sensor consisting of a coregistered longwave (LW) with a midwave (MW) sensor. The proposed approach, called the wavelet-RX (Reed-Xiaoli) algorithm, consists of a combination of a two-dimensional (2D) wavelet transform and a well-known multivariate anomaly detector called the RX algorithm. In our wavelet-RX algorithm, a 2D wavelet transform is first applied to decompose the input image into uniform subbands. A subband-image cube is formed by concatenating together a number of significant subbands (high-energy subbands). The RX algorithm is then applied to the subband-image cube obtained from a wavelet decomposition of the LW or MW sensor data. In the case of the dual band, the RX algorithm is applied to a subband-image cube constructed by concatenating together the high-energy subbands of the LW and MW subband-image cubes. Experimental results are presented for the proposed wavelet-RX and the classical constant false alarm rate (CFAR) algorithm for detecting anomalies (targets) in a single broadband FLIR (LW or MW) or in a coregistered dual-band FLIR sensor. The results show that the proposed wavelet-RX algorithm outperforms the classical CFAR detector for both single-band and dual-band FLIR sensors.
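
    A rough sketch of the wavelet-RX idea described above, for a single-band image: decompose with a 2D wavelet transform, stack the highest-energy detail subbands into a cube, and score each pixel with the RX (Mahalanobis) statistic. The wavelet family, decomposition level, number of retained subbands, and the nearest-neighbor upsampling used to align subbands are illustrative assumptions, not the authors' settings; NumPy and PyWavelets are assumed to be available.

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def wavelet_rx(image, wavelet="db2", level=2, n_bands=6):
    """Score each pixel of a single-band image by RX applied to a cube of
    high-energy wavelet detail subbands (a rough sketch of wavelet-RX)."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    subbands = []
    for detail in coeffs[1:]:                    # skip the approximation band
        for band in detail:                      # (cH, cV, cD) at each level
            # crude nearest-neighbor upsampling so all subbands share the image grid
            ry = image.shape[0] // band.shape[0] + 1
            rx = image.shape[1] // band.shape[1] + 1
            zoom = np.kron(band, np.ones((ry, rx)))
            subbands.append(zoom[:image.shape[0], :image.shape[1]])
    # keep the n_bands subbands with the largest energy
    subbands.sort(key=lambda b: -np.sum(b ** 2))
    cube = np.stack(subbands[:n_bands], axis=-1)          # H x W x n_bands

    x = cube.reshape(-1, cube.shape[-1])
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False) + 1e-6 * np.eye(x.shape[1])
    d = x - mu
    scores = np.einsum("ij,jk,ik->i", d, np.linalg.inv(cov), d)  # Mahalanobis distance
    return scores.reshape(image.shape)
```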

  7. Kernel wavelet-Reed-Xiaoli: an anomaly detection for forward-looking infrared imagery.

    PubMed

    Mehmood, Asif; Nasrabadi, Nasser M

    2011-06-10

    This paper describes a new kernel wavelet-based anomaly detection technique for long-wave (LW) forward-looking infrared imagery. The proposed approach called kernel wavelet-Reed-Xiaoli (wavelet-RX) algorithm is essentially an extension of the wavelet-RX algorithm (combination of wavelet transform and RX anomaly detector) to a high-dimensional feature space (possibly infinite) via a certain nonlinear mapping function of the input data. The wavelet-RX algorithm in this high-dimensional feature space can easily be implemented in terms of kernels that implicitly compute dot products in the feature space (kernelizing the wavelet-RX algorithm). In the proposed kernel wavelet-RX algorithm, a two-dimensional wavelet transform is first applied to decompose the input image into uniform subbands. A number of significant subbands (high-energy subbands) are concatenated together to form a subband-image cube. The kernel RX algorithm is then applied to this subband-image cube. Experimental results are presented for the proposed kernel wavelet-RX, wavelet-RX, and the classical constant false alarm rate (CFAR) algorithm for detecting anomalies (targets) in a large database of LW imagery. The receiver operating characteristic plots show that the proposed kernel wavelet-RX algorithm outperforms the wavelet-RX as well as the classical CFAR detector.

  8. Anomaly detection in hyperspectral imagery based on low-rank and sparse decomposition

    NASA Astrophysics Data System (ADS)

    Cui, Xiaoguang; Tian, Yuan; Weng, Lubin; Yang, Yiping

    2014-01-01

    This paper presents a novel low-rank and sparse decomposition (LSD) based model for anomaly detection in hyperspectral images. In our model, a local image region is represented as a low-rank matrix plus sparse noise in the spectral space, where the background can be explained by the low-rank matrix and the anomalies are indicated by the sparse noise. The detection of anomalies in local image regions is formulated as a constrained LSD problem, which can be solved efficiently and robustly with a modified "Go Decomposition" (GoDec) method. To enhance the validity of this model, we adopt a "simple linear iterative clustering" (SLIC) superpixel algorithm to efficiently generate homogeneous local image regions, i.e., superpixels, in hyperspectral imagery, thus ensuring that the background in local image regions satisfies the low-rank condition. Experimental results on real hyperspectral data demonstrate that, compared with several known local detectors including the RX detector, kernel RX detector, and SVDD detector, the proposed model achieves better performance with satisfactory computation time.
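
    The following is a minimal sketch of the low-rank plus sparse split described above, using a GoDec-style alternation between a truncated SVD and hard thresholding; the rank, sparsity cardinality, and iteration count are illustrative assumptions rather than the paper's tuned values, and the true GoDec uses bilateral random projections for speed.

```python
import numpy as np

def lsd_anomaly(X, rank=2, card=50, n_iter=20):
    """X: pixels x bands matrix for one local region (e.g., a superpixel).
    Returns per-pixel anomaly scores from the sparse term S of X ~ L + S."""
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    for _ in range(n_iter):
        # low-rank step: best rank-r approximation of X - S via truncated SVD
        U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
        # sparse step: keep only the 'card' largest-magnitude entries of X - L
        R = X - L
        thresh = np.sort(np.abs(R).ravel())[-card] if card < R.size else 0.0
        S = np.where(np.abs(R) >= thresh, R, 0.0)
    return np.linalg.norm(S, axis=1)   # large sparse energy -> likely anomaly
```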

  9. Interpretation of Magnetic Anomalies in Salihli (Turkey) Geothermal Area Using 3-D Inversion and Edge Detection Techniques

    NASA Astrophysics Data System (ADS)

    Timur, Emre

    2016-04-01

    There are numerous geophysical methods used to investigate geothermal areas. The major purpose of this magnetic survey is to locate the boundaries of the active hydrothermal system in the south of the Gediz Graben in Salihli (Manisa/Turkey). The presence of the hydrothermal system had already been inferred from surface evidence of hydrothermal activity and drillings. Firstly, 3-D prismatic models were theoretically investigated, and edge detection methods were utilized with an iterative inversion method to define the boundaries and the parameters of the structure. In the first step of the application, it was necessary to convert the total field anomaly into a pseudo-gravity anomaly map. Then the geometric boundaries of the structures were determined by applying MATLAB-based software with 3 different edge detection algorithms. The exact locations of the structures were obtained by using these boundary coordinates as initial geometric parameters in the inversion process. In addition to these methods, reduction to pole and horizontal gradient methods were applied to the data to achieve more information about the location and shape of the possible reservoir. As a result, the edge detection methods were found to be successful, both on the field data and on theoretical data sets, for delineating the boundaries of the possible geothermal reservoir structure. The depth of the geothermal reservoir was determined as 2.4 km from 3-D inversion and 2.1 km from power spectrum methods.

  10. Clairvoyant fusion detection of ocean anomalies in WorldView-2 spectral imagery

    NASA Astrophysics Data System (ADS)

    Schaum, Alan; Allman, Eric; Stites, Matthew

    2016-09-01

    For every possible mixture of clouds and ocean in WorldView-2 8-band data, we construct an anomaly detector (called a "clairvoyant" because we never know which mixture is appropriate in any given pixel). Then we combine these using a fusion technique. The usual method of deriving an analytic expression describing the envelope of all the clairvoyants' decision boundaries is not possible. Instead, we compute the intersections of infinitesimally close boundaries generated by differential changes in the mixing fraction. When glued together, these 6-dimensional hyperstrings constitute the desired 7-dimensional decision boundary of the fused anomaly detector. However, no closed-form solution exists for the fused result. Therefore, we construct an approximation to the fused detection boundary by first flattening the strings into 6-dimensional hyperplanes and then gluing them together à la 3D printing.

  11. Anomaly detection in radiographic images of composite materials via crosshatch regression

    NASA Astrophysics Data System (ADS)

    Lockard, Colin D.

    The development and testing of new composite materials is an important area of research supporting advances in aerospace engineering. Understanding the properties of these materials requires the analysis of material samples to identify damage. Given the significant time and effort required from human experts to analyze computed tomography (CT) scans related to the non-destructive evaluation of carbon fiber materials, it is advantageous to develop an automated system for identifying anomalies in these images. This thesis introduces a regression-based algorithm for identifying anomalies in grayscale images, with a particular focus on its application for the analysis of CT scan images of carbon fiber. The algorithm centers around a "crosshatch regression" approach in which each two-dimensional image is divided into a series of one-dimensional signals, each representing a single line of pixels. A robust multiple linear regression model is fitted to each signal and outliers are identified. Smoothing and quality control techniques help better define anomaly boundaries and remove noise, and multiple crosshatch regression runs are combined to generate the final result. A ground truth set was created and the algorithm was run against these images for testing. The experimental results support the efficacy of the technique, locating 92% of anomalies with an average recall of 88%, precision of 78%, and root mean square deviation of 11.2 pixels.
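
    A minimal sketch of the per-scanline regression idea: fit a line to each row and column of pixels, flag residuals that exceed a robust (MAD-based) threshold, and combine the horizontal and vertical passes. A plain least-squares fit stands in for the thesis's robust multiple regression, and the threshold factor and AND-combination rule are assumptions for illustration, not the exact procedure.

```python
import numpy as np

def scanline_outliers(signal, k=3.5):
    """Fit a line to one row/column of pixels and flag residual outliers."""
    x = np.arange(signal.size)
    slope, intercept = np.polyfit(x, signal, deg=1)
    resid = signal - (slope * x + intercept)
    mad = np.median(np.abs(resid - np.median(resid))) + 1e-9
    return np.abs(resid) > k * 1.4826 * mad          # robust z-score threshold

def crosshatch_mask(image, k=3.5):
    horiz = np.array([scanline_outliers(row, k) for row in image])
    vert = np.array([scanline_outliers(col, k) for col in image.T]).T
    return horiz & vert    # combine the two passes into one anomaly mask
```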

  12. Ultrasonographic detection of single umbilical artery: a simple marker of fetal anomaly.

    PubMed

    Sener, T; Ozalp, S; Hassa, H; Zeytinoglu, S; Basaran, N; Durak, B

    1997-08-01

    To determine the associated anomalies, karyotypes and perinatal prognosis of fetuses with a single umbilical artery. Fifteen fetuses with a single umbilical artery were evaluated in the obstetrical ultrasonography and medical genetics departments of Osmangazi University. Fifteen fetuses with a single umbilical artery were detected during antenatal ultrasonographic examinations. Associated sonographic abnormalities included oligohydramnios (two), intrauterine growth retardation (one), renal agenesis (one), fetal ascites (one), diaphragmatic hernia (one), hydrocephalus (two), and meningomyelocele (one). Complications related to the pregnancy were pre-eclampsia in one case and abruptio placentae in another. Karyotype analysis was available in 11 cases, and the only cytogenetic abnormality detected was trisomy 18 in one case. Two cases with hydrocephalus and a single umbilical artery were delivered by cesarean section at 34 and 38 weeks, but both died (on the first and fifth days after birth). Five pregnancies were terminated because of intrauterine death (one), severe pre-eclampsia (one), cytogenetic abnormality (one), and multiple congenital anomalies associated with single umbilical artery (two) at 36, 27, 22, 26 and 29 weeks, respectively. Eight of the neonates, who had no additional congenital or cytogenetic abnormality, were completely normal at birth and during the neonatal period. Diagnoses were confirmed pathologically in all cases. Scanning the umbilical cord should be one of the essential parts of the ultrasonographic examination. When a single umbilical artery is detected, a detailed ultrasonographic examination is necessary to rule out associated abnormalities. We advise fetal karyotyping even when no additional pathology can be detected on ultrasonographic examination.

  13. Development of an expert system for analysis of Shuttle atmospheric revitalization and pressure control subsystem anomalies

    NASA Technical Reports Server (NTRS)

    Lafuse, Sharon A.

    1991-01-01

    The paper describes the Shuttle Leak Management Expert System (SLMES), a preprototype expert system developed to enable the ECLSS subsystem manager to analyze subsystem anomalies and to formulate flight procedures based on flight data. The SLMES combines rule-based expert system technology with traditional FORTRAN-based software into an integrated system. SLMES analyzes the data using rules, and, when it detects a problem that requires simulation, it sets up the input for the FORTRAN-based simulation program ARPCS2AT2, which predicts the cabin total pressure and composition as a function of time. The program simulates the pressure control system, the crew oxygen masks, the airlock repress/depress valves, and the leakage. When the simulation has completed, other SLMES rules are triggered to compare the simulation results against the flight data and to suggest methods for correcting the problem. Results are then presented in the form of graphs and tables.

  14. Development of an expert system for analysis of Shuttle atmospheric revitalization and pressure control subsystem anomalies

    NASA Technical Reports Server (NTRS)

    Lafuse, Sharon A.

    1991-01-01

    The paper describes the Shuttle Leak Management Expert System (SLMES), a preprototype expert system developed to enable the ECLSS subsystem manager to analyze subsystem anomalies and to formulate flight procedures based on flight data. The SLMES combines rule-based expert system technology with traditional FORTRAN-based software into an integrated system. SLMES analyzes the data using rules, and, when it detects a problem that requires simulation, it sets up the input for the FORTRAN-based simulation program ARPCS2AT2, which predicts the cabin total pressure and composition as a function of time. The program simulates the pressure control system, the crew oxygen masks, the airlock repress/depress valves, and the leakage. When the simulation has completed, other SLMES rules are triggered to compare the simulation results against the flight data and to suggest methods for correcting the problem. Results are then presented in the form of graphs and tables.

  15. Systematic review of first-trimester ultrasound screening for detection of fetal structural anomalies and factors that affect screening performance.

    PubMed

    Karim, J N; Roberts, N W; Salomon, L J; Papageorghiou, A T

    2017-10-01

    To determine the sensitivity and specificity of first-trimester ultrasound for the detection of fetal abnormalities and to establish which factors might impact on screening performance. A systematic review and meta-analysis of all relevant publications was performed to assess the diagnostic accuracy of two-dimensional transabdominal and transvaginal ultrasound in the detection of congenital fetal anomalies prior to 14 weeks' gestation. The reference standard was detection of abnormalities at birth or postmortem. Factors that may impact on detection rates were evaluated, including population characteristics, gestational age, healthcare setting, ultrasound modality, use of an anatomical checklist for detection of first-trimester anomalies and type of malformation included in the study. In an effort to reduce the impact of study heterogeneity on the results of the meta-analysis, data from the studies were analyzed within subgroups of major anomalies vs all types of anomaly and low-risk/unselected populations vs high-risk populations. An electronic search (until 29 July 2015) identified 2225 relevant citations, from which a total of 30 studies, published between 1991 and 2014, were selected for inclusion. The pooled estimate for the detection of major abnormalities in low-risk or unselected populations (19 studies, 115 731 fetuses) was 46.10% (95% CI, 36.88-55.46%). The detection rate for all abnormalities in low-risk or unselected populations (14 studies, 97 976 fetuses) was 32.35% (95% CI, 22.45-43.12%), whereas in high-risk populations (six studies, 2841 fetuses) it was 61.18% (95% CI, 37.71-82.19%). Of the factors examined for their impact on detection rate, there was a statistically significant relationship (P < 0.0001) between the use of a standardized anatomical protocol during first-trimester anomaly screening and its sensitivity for the detection of fetal anomalies in all subgroups. Detection rates of first-trimester fetal anomalies ranged from 32

  16. VISAD: an interactive and visual analytical tool for the detection of behavioral anomalies in maritime traffic data

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran; Ziemke, Tom; Warston, Håkan

    2009-05-01

    Monitoring the surveillance of large sea areas normally involves the analysis of huge quantities of heterogeneous data from multiple sources (radars, cameras, automatic identification systems, reports, etc.). The rapid identification of anomalous behavior or any threat activity in the data is an important objective for enabling homeland security. While it is worth acknowledging that many existing mining applications support identification of anomalous behavior, autonomous anomaly detection systems are rarely used in the real world. There are two main reasons: (1) the detection of anomalous behavior is normally not a well-defined and structured problem and therefore, automatic data mining approaches do not work well and (2) the difficulties that these systems have regarding the representation and employment of the prior knowledge that the users bring to their tasks. In order to overcome these limitations, we believe that human involvement in the entire discovery process is crucial. Using a visual analytics process model as a framework, we present VISAD: an interactive, visual knowledge discovery tool for supporting the detection and identification of anomalous behavior in maritime traffic data. VISAD supports the insertion of human expert knowledge in (1) the preparation of the system, (2) the establishment of the normal picture and (3) in the actual detection of rare events. For each of these three modules, VISAD implements different layers of data mining, visualization and interaction techniques. Thus, the detection procedure becomes transparent to the user, which increases his/her confidence and trust in the system and overall, in the whole discovery process.

  17. Least Square Support Vector Machine for Detection of Ionospheric Anomalies Associated with the Powerful Nepal Earthquake (Mw = 7.5) of 25 April 2015

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2016-06-01

    Due to the irreparable devastation caused by strong earthquakes, accurate anomaly detection in time series of different precursors has become a new challenge for creating a trustworthy early warning system. In this paper the predictability of the Least Square Support Vector Machine (LSSVM) has been investigated by forecasting the GPS-TEC (Total Electron Content) variations around the time and location of the Nepal earthquake. A powerful earthquake of Mw = 7.8 took place 77 km NW of Kathmandu, Nepal (28.147° N, 84.708° E, depth = 15.0 km) at 06:11:26 UTC on April 25, 2015. For comparison, two other methods, Median and ANN (Artificial Neural Network), have also been implemented. All implemented algorithms indicate striking TEC anomalies 2 days prior to the main shock. Results reveal that the LSSVM method is promising for the detection of TEC seismo-ionospheric anomalies.
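
    A minimal sketch of the median-based bound test used here as a comparison method (not the LSSVM predictor itself): a TEC sample is flagged when it falls outside median ± k·IQR of a trailing reference window. The window length and k factor are assumptions for illustration.

```python
import numpy as np

def tec_anomalies(tec, window=15, k=1.5):
    """Flag TEC samples that fall outside median +/- k*IQR of a trailing window."""
    tec = np.asarray(tec, dtype=float)
    flags = np.zeros(tec.size, dtype=bool)
    for t in range(window, tec.size):
        ref = tec[t - window:t]
        q1, med, q3 = np.percentile(ref, [25, 50, 75])
        iqr = q3 - q1
        flags[t] = (tec[t] > med + k * iqr) or (tec[t] < med - k * iqr)
    return flags
```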

  18. Advanced Unsupervised Classification Methods to Detect Anomalies on Earthen Levees Using Polarimetric SAR Imagery.

    PubMed

    Marapareddy, Ramakalavathi; Aanstoos, James V; Younan, Nicolas H

    2016-06-16

    Fully polarimetric Synthetic Aperture Radar (polSAR) data analysis has wide applications for terrain and ground cover classification. The dynamics of surface and subsurface water events can lead to slope instability resulting in slough slides on earthen levees. Early detection of these anomalies by a remote sensing approach could save time versus direct assessment. We used L-band Synthetic Aperture Radar (SAR) to screen levees for anomalies. SAR technology, due to its high spatial resolution and soil penetration capability, is a good choice for identifying problematic areas on earthen levees. Using the parameters entropy (H), anisotropy (A), alpha (α), and eigenvalues (λ, λ₁, λ₂, and λ₃), we implemented several unsupervised classification algorithms for the identification of anomalies on the levee. The classification techniques applied are H/α, H/A, A/α, Wishart H/α, Wishart H/A/α, and H/α/λ classification algorithms. In this work, the effectiveness of the algorithms was demonstrated using quad-polarimetric L-band SAR imagery from the NASA Jet Propulsion Laboratory's (JPL's) Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR). The study area is a section of the lower Mississippi River valley in the Southern USA, where earthen flood control levees are maintained by the US Army Corps of Engineers.

  19. Advanced Unsupervised Classification Methods to Detect Anomalies on Earthen Levees Using Polarimetric SAR Imagery

    PubMed Central

    Marapareddy, Ramakalavathi; Aanstoos, James V.; Younan, Nicolas H.

    2016-01-01

    Fully polarimetric Synthetic Aperture Radar (polSAR) data analysis has wide applications for terrain and ground cover classification. The dynamics of surface and subsurface water events can lead to slope instability resulting in slough slides on earthen levees. Early detection of these anomalies by a remote sensing approach could save time versus direct assessment. We used L-band Synthetic Aperture Radar (SAR) to screen levees for anomalies. SAR technology, due to its high spatial resolution and soil penetration capability, is a good choice for identifying problematic areas on earthen levees. Using the parameters entropy (H), anisotropy (A), alpha (α), and eigenvalues (λ, λ1, λ2, and λ3), we implemented several unsupervised classification algorithms for the identification of anomalies on the levee. The classification techniques applied are H/α, H/A, A/α, Wishart H/α, Wishart H/A/α, and H/α/λ classification algorithms. In this work, the effectiveness of the algorithms was demonstrated using quad-polarimetric L-band SAR imagery from the NASA Jet Propulsion Laboratory’s (JPL’s) Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR). The study area is a section of the lower Mississippi River valley in the Southern USA, where earthen flood control levees are maintained by the US Army Corps of Engineers. PMID:27322270
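
    For context, the H, A, and α parameters that the classifiers above operate on can be computed per pixel from the eigen-decomposition of the 3×3 coherency matrix; the sketch below is a generic polSAR recipe under that assumption, not the authors' exact processing chain.

```python
import numpy as np

def h_a_alpha(T):
    """T: complex 3x3 Hermitian coherency matrix for one pixel.
    Returns entropy H, anisotropy A, and mean alpha angle (degrees)."""
    eigval, eigvec = np.linalg.eigh(T)                 # ascending order
    lam = np.clip(eigval[::-1].real, 1e-12, None)      # descending, non-negative
    vec = eigvec[:, ::-1]
    p = lam / lam.sum()
    H = -np.sum(p * np.log(p) / np.log(3))             # entropy, log base 3
    A = (lam[1] - lam[2]) / (lam[1] + lam[2])          # anisotropy
    alpha_i = np.arccos(np.clip(np.abs(vec[0, :]), 0.0, 1.0))
    alpha = np.degrees(np.sum(p * alpha_i))            # mean alpha angle
    return H, A, alpha
```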

  20. Multi-Observation Continuous Density Hidden Markov Models for Anomaly Detection in Full Motion Video

    DTIC Science & Technology

    2012-06-01

    Only the symbol list from the thesis front matter is preserved in this record: λ = (A, B, π), a hidden Markov model; a_ij, the probability of being in state j at time t + 1 given the process was in state i at t; b_i, the observation PDF for state i; angular deviation, a random variable giving the difference in heading (in degrees) from the overall direction of movement over the sequence; S (speed), a random variable giving the speed of the agent at a given time step.
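
    Although the record preserves only the notation, the title and symbols indicate continuous-density HMMs fitted to per-track observations such as angular deviation and speed, with low-likelihood tracks flagged as anomalous. The sketch below illustrates that idea under stated assumptions: the hmmlearn package is assumed available, and the state count, covariance type, and per-step likelihood score are illustrative choices, not the thesis's.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed available

def fit_normal_model(tracks, n_states=4):
    """tracks: list of (T_i, 2) arrays with [angular_deviation, speed] per time step."""
    X = np.vstack(tracks)
    lengths = [len(t) for t in tracks]
    model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

def track_anomaly_scores(model, tracks):
    # lower average log-likelihood per step -> more anomalous track
    return np.array([-model.score(t) / len(t) for t in tracks])
```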

  1. Building robust neighborhoods for manifold learning-based image classification and anomaly detection

    NASA Astrophysics Data System (ADS)

    Doster, Timothy; Olson, Colin C.

    2016-05-01

    We exploit manifold learning algorithms to perform image classification and anomaly detection in complex scenes involving hyperspectral land cover and broadband IR maritime data. The results of standard manifold learning techniques are improved by including spatial information. This is accomplished by creating super-pixels which are robust to affine transformations inherent in natural scenes. We utilize techniques from harmonic analysis and image processing, namely, rotation, skew, flip, and shift operators to develop a more representational graph structure which defines the data-dependent manifold.

  2. Utilization of Electrical Impedance Tomography to Detect Internal Anomalies in Southern Pine Logs

    NASA Astrophysics Data System (ADS)

    Steele, Philip; Cooper, Jerome

    2006-03-01

    A large body of research has shown that knowledge of internal defect location in logs prior to sawing has the potential to significantly increase lumber value yield. This paper describes a relatively low-capital log scanning technique based on Electrical Impedance Tomography (EIT) to image anomalies interior to sawlogs. Static testing results showed that knots, juvenile and compression wood internal to logs can be detected. Although resolution is lower than that of CT and NMR technologies, the low cost of this EIT application should render it competitive.

  3. Range-invariant anomaly detection applied to imaging Fourier transform spectrometry data

    NASA Astrophysics Data System (ADS)

    Borel, Christoph; Rosario, Dalton; Romano, Joao

    2012-09-01

    This paper describes the end-to-end processing of imaging Fourier transform spectrometry data taken of surrogate tank targets at Picatinny Arsenal in New Jersey with the long-wave hyperspectral camera HyperCam from Telops. The first part of the paper discusses the processing from raw data to calibrated radiance and emissivity data. The second part discusses the application of a range-invariant anomaly detection approach to calibrated radiance, emissivity and brightness temperature data at different spatial resolutions and compares it to the Reed-Xiaoli detector.

  4. RS-Forest: A Rapid Density Estimator for Streaming Anomaly Detection

    PubMed Central

    Wu, Ke; Zhang, Kun; Fan, Wei; Edwards, Andrea; Yu, Philip S.

    2015-01-01

    Anomaly detection in streaming data is of high interest in numerous application domains. In this paper, we propose a novel one-class semi-supervised algorithm to detect anomalies in streaming data. Underlying the algorithm is a fast and accurate density estimator implemented by multiple fully randomized space trees (RS-Trees), named RS-Forest. The piecewise constant density estimate of each RS-tree is defined on the tree node into which an instance falls. Each incoming instance in a data stream is scored by the density estimates averaged over all trees in the forest. Two strategies, statistical attribute range estimation of high probability guarantee and dual node profiles for rapid model update, are seamlessly integrated into RS-Forest to systematically address the ever-evolving nature of data streams. We derive the theoretical upper bound for the proposed algorithm and analyze its asymptotic properties via bias-variance decomposition. Empirical comparisons to the state-of-the-art methods on multiple benchmark datasets demonstrate that the proposed method features high detection rate, fast response, and insensitivity to most of the parameter settings. Algorithm implementations and datasets are available upon request. PMID:25685112
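
    A heavily simplified sketch in the spirit of RS-Forest: fully randomized trees partition the attribute ranges, per-leaf counts from a reference window act as a piecewise-constant density, and the score averages the leaf densities over trees. The depth, tree count, and scoring sign convention are assumptions; the actual algorithm adds streaming-specific machinery (attribute-range estimation with probability guarantees and dual node profiles) not shown here.

```python
import numpy as np

def build_tree(lo, hi, depth, rng):
    # leaves store a reference count and the volume of their hyper-rectangle
    if depth == 0:
        return {"count": 0, "volume": np.prod(hi - lo)}
    d = rng.integers(lo.size)                      # random split dimension
    split = rng.uniform(lo[d], hi[d])              # random split point
    left_hi, right_lo = hi.copy(), lo.copy()
    left_hi[d], right_lo[d] = split, split
    return {"dim": d, "split": split,
            "left": build_tree(lo, left_hi, depth - 1, rng),
            "right": build_tree(right_lo, hi, depth - 1, rng)}

def leaf(node, x):
    while "dim" in node:
        node = node["left"] if x[node["dim"]] <= node["split"] else node["right"]
    return node

def rs_forest_scores(train, test, n_trees=25, depth=8, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = train.min(axis=0), train.max(axis=0)
    trees = [build_tree(lo.copy(), hi.copy(), depth, rng) for _ in range(n_trees)]
    for tree in trees:                             # fill leaf counts from reference window
        for x in train:
            leaf(tree, x)["count"] += 1
    dens = np.zeros(len(test))
    for tree in trees:                             # density estimate averaged over trees
        for i, x in enumerate(test):
            nd = leaf(tree, x)
            dens[i] += nd["count"] / (nd["volume"] + 1e-12)
    return -dens / n_trees                         # low density -> high anomaly score
```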

  5. Fiber Optic Bragg Grating Sensors for Thermographic Detection of Subsurface Anomalies

    NASA Technical Reports Server (NTRS)

    Allison, Sidney G.; Winfree, William P.; Wu, Meng-Chou

    2009-01-01

    Conventional thermography with an infrared imager has been shown to be an extremely viable technique for nondestructively detecting subsurface anomalies such as thickness variations due to corrosion. A recently developed technique using fiber optic sensors to measure temperature holds potential for performing similar inspections without requiring an infrared imager. The structure is heated using a heat source such as a quartz lamp with fiber Bragg grating (FBG) sensors at the surface of the structure to detect temperature. Investigated structures include a stainless steel plate with thickness variations simulated by small platelets attached to the back side using thermal grease. A relationship is shown between the FBG sensor thermal response and variations in material thickness. For comparison, finite element modeling was performed and found to agree closely with the fiber optic thermography results. This technique shows potential for applications where FBG sensors are already bonded to structures for Integrated Vehicle Health Monitoring (IVHM) strain measurements and can serve dual-use by also performing thermographic detection of subsurface anomalies.

  6. Anomaly Detection Based on Local Nearest Neighbor Distance Descriptor in Crowded Scenes

    PubMed Central

    Hu, Shiqiang; Zhang, Huanlong; Luo, Lingkun

    2014-01-01

    We propose a novel local nearest neighbor distance (LNND) descriptor for anomaly detection in crowded scenes. Compared with the commonly used low-level feature descriptors in previous works, the LNND descriptor has two major advantages. First, the LNND descriptor efficiently incorporates spatial and temporal contextual information around the video event, which is important for detecting anomalous interactions among multiple events, while most existing feature descriptors only contain the information of a single event. Second, the LNND descriptor is a compact representation, and its dimensionality is typically much lower than that of low-level feature descriptors. Therefore, not only can the computation time and storage requirement be reduced by using the LNND descriptor for anomaly detection methods with an offline training fashion, but the negative aspects caused by using a high-dimensional feature descriptor can also be avoided. We validate the effectiveness of the LNND descriptor by conducting extensive experiments on different benchmark datasets. Experimental results show the promising performance of the LNND-based method against state-of-the-art methods. It is worthwhile to notice that the LNND-based approach requires fewer intermediate processing steps, without any subsequent processing such as smoothing, but achieves comparable or even better performance. PMID:25105164

  7. RS-Forest: A Rapid Density Estimator for Streaming Anomaly Detection.

    PubMed

    Wu, Ke; Zhang, Kun; Fan, Wei; Edwards, Andrea; Yu, Philip S

    Anomaly detection in streaming data is of high interest in numerous application domains. In this paper, we propose a novel one-class semi-supervised algorithm to detect anomalies in streaming data. Underlying the algorithm is a fast and accurate density estimator implemented by multiple fully randomized space trees (RS-Trees), named RS-Forest. The piecewise constant density estimate of each RS-tree is defined on the tree node into which an instance falls. Each incoming instance in a data stream is scored by the density estimates averaged over all trees in the forest. Two strategies, statistical attribute range estimation of high probability guarantee and dual node profiles for rapid model update, are seamlessly integrated into RS-Forest to systematically address the ever-evolving nature of data streams. We derive the theoretical upper bound for the proposed algorithm and analyze its asymptotic properties via bias-variance decomposition. Empirical comparisons to the state-of-the-art methods on multiple benchmark datasets demonstrate that the proposed method features high detection rate, fast response, and insensitivity to most of the parameter settings. Algorithm implementations and datasets are available upon request.

  8. Anomaly detection based on local nearest neighbor distance descriptor in crowded scenes.

    PubMed

    Hu, Xing; Hu, Shiqiang; Zhang, Xiaoyu; Zhang, Huanlong; Luo, Lingkun

    2014-01-01

    We propose a novel local nearest neighbor distance (LNND) descriptor for anomaly detection in crowded scenes. Compared with the commonly used low-level feature descriptors in previous works, the LNND descriptor has two major advantages. First, the LNND descriptor efficiently incorporates spatial and temporal contextual information around the video event, which is important for detecting anomalous interactions among multiple events, while most existing feature descriptors only contain the information of a single event. Second, the LNND descriptor is a compact representation, and its dimensionality is typically much lower than that of low-level feature descriptors. Therefore, not only can the computation time and storage requirement be reduced by using the LNND descriptor for anomaly detection methods with an offline training fashion, but the negative aspects caused by using a high-dimensional feature descriptor can also be avoided. We validate the effectiveness of the LNND descriptor by conducting extensive experiments on different benchmark datasets. Experimental results show the promising performance of the LNND-based method against state-of-the-art methods. It is worthwhile to notice that the LNND-based approach requires fewer intermediate processing steps, without any subsequent processing such as smoothing, but achieves comparable or even better performance.

  9. Molecular Detection of Human Cytomegalovirus (HCMV) Among Infants with Congenital Anomalies in Khartoum State, Sudan

    PubMed Central

    Ebrahim, Maha G.; Ali, Aisha S.; Mustafa, Mohamed O.; Musa, Dalal F.; El Hussein, Abdel Rahim M.; Elkhidir, Isam M.; Enan, Khalid A.

    2015-01-01

    Human Cytomegalovirus (HCMV) infection still represents the most common potentially serious viral complication in humans and is a major cause of congenital anomalies in infants. This study aimed to detect HCMV in infants with congenital anomalies. Study subjects consisted of infants born with neural tube defects, hydrocephalus and microcephaly. Fifty serum specimens (20 males, 30 females) were collected from different hospitals in Khartoum State. The sera were investigated for cytomegalovirus-specific immunoglobulin M (IgM) antibodies using enzyme-linked immunosorbent assay (ELISA), and for cytomegalovirus DNA using polymerase chain reaction (PCR). Out of the 50 sera tested, one patient's sample (2%) showed HCMV IgM but no detectable DNA, while 4 other sera (8.2%) were positive for HCMV DNA but with no detectable IgM. Various diagnostic techniques should be considered to evaluate HCMV disease, and routine screening for HCMV should be introduced for pregnant women in this setting. It is vital to initiate further research work with more samples from different areas to assess the prevalence and characterize HCMV and to evaluate its maternal health implications. PMID:26862356

  10. Application of the LMC algorithm to anomaly detection using the Wichmann/NIITEK ground-penetrating radar

    NASA Astrophysics Data System (ADS)

    Torrione, Peter A.; Collins, Leslie M.; Clodfelter, Fred; Frasier, Shane; Starnes, Ian

    2003-09-01

    This paper describes the application of a 2-dimensional (2-D) lattice LMS algorithm for anomaly detection using the Wichmann/Niitek ground penetrating radar (GPR) system. Sets of 3-dimensional (3-D) data are collected from the GPR system and these are processed in separate 2-D slices. Those 2-D slices that are spatially correlated in depth are combined into separate "depth segments" and these are processed independently. When target/no target declarations need to be made, the individual depth segments are combined to yield a 2-D confidence map. The 2-D confidence map is then thresholded and alarms are placed at the centroids of the remaining 8-connected data points. Calibration lane results are presented for data collected over several soil types under several weather conditions. Results show a false alarm rate improvement of at least an order of magnitude over other GPR systems, as well as significant improvement over other adaptive algorithms operating on the same data.
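
    The detector above alarms on large prediction errors from an adaptive 2-D lattice LMS background model. As a much-reduced illustration of that principle, the sketch below runs a plain 1-D LMS predictor along one signal and returns the per-sample prediction error; the filter order and step size are assumptions, and the 2-D lattice structure of the actual system is not reproduced.

```python
import numpy as np

def lms_prediction_error(signal, order=8, mu=0.01):
    """Predict each sample from the previous `order` samples; large errors
    mark samples the adaptive background model cannot explain."""
    signal = np.asarray(signal, dtype=float)
    w = np.zeros(order)
    err = np.zeros_like(signal)
    for n in range(order, signal.size):
        x = signal[n - order:n][::-1]          # most recent sample first
        y_hat = w @ x
        e = signal[n] - y_hat
        w += 2 * mu * e * x                    # LMS weight update
        err[n] = e
    return np.abs(err)
```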

  11. Portable modular detection system

    DOEpatents

    Brennan, James S.; Singh, Anup; Throckmorton, Daniel J.; Stamps, James F.

    2009-10-13

    Disclosed herein are portable and modular detection devices and systems for detecting electromagnetic radiation, such as fluorescence, from an analyte which comprises at least one optical element removably attached to at least one alignment rail. Also disclosed are modular detection devices and systems having an integrated lock-in amplifier and spatial filter and assay methods using the portable and modular detection devices.

  12. Sparsity divergence index based on locally linear embedding for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhang, Lili; Zhao, Chunhui

    2016-04-01

    Hyperspectral imagery (HSI) has high spectral and spatial resolutions, which are essential for anomaly detection (AD). Many anomaly detectors assume that the spectral signature of HSI pixels can be modeled with a Gaussian distribution, which is actually not accurate and often leads to many false alarms. Therefore, a sparsity model without any distribution hypothesis is usually employed. Dimensionality reduction (DR) as a preprocessing step for HSI is important. Principal component analysis as a conventional DR method is a linear projection and cannot exploit the nonlinear properties in hyperspectral data, whereas locally linear embedding (LLE) as a local, nonlinear manifold learning algorithm works well for DR of HSI. A modified sparsity divergence index algorithm based on locally linear embedding (SDI-LLE) is thus proposed. First, kernel collaborative representation detection is adopted to calculate the sparse dictionary matrix of local reconstruction weights in LLE. Then, SDI is obtained both in the spectral and spatial domains, where spatial SDI is computed after DR by LLE. Finally, joint SDI, combining spectral SDI and spatial SDI, is computed, and the optimal SDI is used for AD. Experimental results demonstrate that the proposed algorithm significantly improves the performance when compared with its counterparts.

  13. Anomaly detection using temporal data mining in a smart home environment.

    PubMed

    Jakkula, V; Cook, D J

    2008-01-01

    To many people, home is a sanctuary. With the maturing of smart home technologies, many people with cognitive and physical disabilities can lead independent lives in their own homes for extended periods of time. In this paper, we investigate the design of machine learning algorithms that support this goal. We hypothesize that machine learning algorithms can be designed to automatically learn models of resident behavior in a smart home, and that the results can be used to perform automated health monitoring and to detect anomalies. Specifically, our algorithms draw upon the temporal nature of sensor data collected in a smart home to build a model of expected activities and to detect unexpected, and possibly health-critical, events in the home. We validate our algorithms using synthetic data and real activity data collected from volunteers in an automated smart environment. The results from our experiments support our hypothesis that a model can be learned from observed smart home data and used to report anomalies, as they occur, in a smart home.
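
    A minimal sketch of the temporal idea described above: learn how often each (activity, hour-of-day) pair occurs in the resident's history and flag rare combinations. The event encoding and the probability threshold are assumptions for illustration; the paper's algorithms model richer temporal relations than this.

```python
from collections import Counter

def train_model(events):
    """events: list of (activity, hour) pairs from the smart home sensor log."""
    counts = Counter(events)
    total = sum(counts.values())
    return {pair: c / total for pair, c in counts.items()}

def flag_anomalies(model, events, threshold=0.001):
    """Return events whose learned probability falls below the threshold."""
    return [pair for pair in events if model.get(pair, 0.0) < threshold]

# Usage sketch: model = train_model(history); alerts = flag_anomalies(model, today_events)
```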

  14. Four miniature kidneys: supernumerary kidney and multiple organ system anomalies.

    PubMed

    Afrouzian, Marjan; Sonstein, Joseph; Dadfarnia, Tahereh; Sreshta, J Nicholas; Hawkins, Hal K

    2014-05-01

    More than 350 years after Martius's first reported case in 1656, supernumerary kidney (SNK) continues to fascinate the world of medicine, generating new ideas in the domain of embryogenesis. Association of a normal kidney with a second or third ipsilateral smaller kidney is an extremely rare anomaly with only a total of 81 cases reported until today. We are reporting a case of SNK, clinically diagnosed as right hydronephrosis, associated with an ipsilateral ectopic ureter, a contralateral partially duplicated ureter, and a multiseptate gallbladder. Pathologic examination of the nephrectomy revealed 4 miniature kidneys, joining a dilated ureter through 4 separate conduits. Our patient is the first reported case of SNK with absent ipsilateral normal kidney, presence of more than 3 kidneys on 1 side, and associated anomaly in the gallbladder. This case represents a unique combination of rarities, suggesting insights in the domain of molecular embryology.

  15. Anomaly detection in flexible mechanical couplings via symbolic time series analysis

    NASA Astrophysics Data System (ADS)

    Khatkhate, Amol; Gupta, Shalabh; Ray, Asok; Patankar, Ravi

    2008-04-01

    Critical components of a rotating machinery such as bearings and couplings are often subjected to unbalanced axial and radial loads due to excessive machine vibrations arising from shaft misalignment(s). The paper presents Symbolic Time Series Analysis (STSA) of bearing acceleration data for detection and estimation of gradually developing parametric changes in flexible disc/diaphragm couplings. The analytical method is built upon the principles of Symbolic Dynamics, Automata Theory, and Statistical Pattern Recognition. The anomaly detection methodology is validated on a real-time simulation test bed, where the dynamic model of a flexible mechanical coupling is subjected to angular misalignment(s) leading to coupling failure. Damage patterns are identified from STSA of multiple data sets generated for different input torques. Statistical estimates are obtained for small changes in the coupling stiffness based on the information derived from the ensemble of damage patterns.
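
    A hedged sketch of an STSA-style anomaly measure: quantize the acceleration signal into a small symbol alphabet, estimate the symbol-transition matrix, and compare it against the matrix of the nominal (healthy) condition. The alphabet size, quantile partitioning, and Frobenius-norm distance are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def symbolize(signal, n_symbols=8):
    """Quantile-based partition of the signal into symbols 0..n_symbols-1."""
    edges = np.quantile(signal, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(signal, edges)

def transition_matrix(symbols, n_symbols=8):
    P = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        P[a, b] += 1
    return P / np.maximum(P.sum(axis=1, keepdims=True), 1)   # row-stochastic

def anomaly_measure(nominal_signal, test_signal, n_symbols=8):
    P0 = transition_matrix(symbolize(nominal_signal, n_symbols), n_symbols)
    P1 = transition_matrix(symbolize(test_signal, n_symbols), n_symbols)
    return np.linalg.norm(P1 - P0)        # grows as the coupling departs from nominal
```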

  16. Development of newly designed VHF interferometer system for observing earthquake-related atmospheric anomalies.

    PubMed

    Yamamoto, Isao; Fujiwara, Hironobu; Kamogawa, Masashi; Iyono, Atsushi; Kroumov, Valeri; Azakami, Takashi

    2009-01-01

    Temporal correlation between atmospheric anomalies and earthquakes has recently been verified statistically through measuring VHF FM radio waves transmitted beyond the line-of-sight. In order to locate the sources of such atmospheric anomalies, we developed a VHF interferometer system (bistatic-radar type) capable of finding the arrival direction of FM radio waves scattered possibly by earthquake-related atmospheric anomalies. In general, frequency modulation of FM radio waves produces ambiguity of arrival direction. However, our system, employing high-sampling rates of the order of kHz, can precisely measure the arrival direction of FM radio waves by stacking received signals.

  17. Development of newly designed VHF interferometer system for observing earthquake-related atmospheric anomalies

    PubMed Central

    Yamamoto, Isao; Fujiwara, Hironobu; Kamogawa, Masashi; Iyono, Atsushi; Kroumov, Valeri; Azakami, Takashi

    2009-01-01

    Temporal correlation between atmospheric anomalies and earthquakes has recently been verified statistically through measuring VHF FM radio waves transmitted beyond the line-of-sight. In order to locate the sources of such atmospheric anomalies, we developed a VHF interferometer system (bistatic-radar type) capable of finding the arrival direction of FM radio waves scattered possibly by earthquake-related atmospheric anomalies. In general, frequency modulation of FM radio waves produces ambiguity of arrival direction. However, our system, employing high-sampling rates of the order of kHz, can precisely measure the arrival direction of FM radio waves by stacking received signals. PMID:20009381

  18. Structure and dynamics of decadal anomalies in the wintertime midlatitude North Pacific ocean-atmosphere system

    NASA Astrophysics Data System (ADS)

    Fang, Jiabei; Yang, Xiu-Qun

    2016-09-01

    The structure and dynamics of decadal anomalies in the wintertime midlatitude North Pacific ocean-atmosphere system are examined in this study, using the NCEP/NCAR atmospheric reanalysis, HadISST SST and Simple Ocean Data Assimilation data for 1960-2010. The midlatitude decadal anomalies associated with the Pacific Decadal Oscillation are identified, being characterized by an equivalent barotropic atmospheric low (high) pressure over a cold (warm) oceanic surface. Such a unique configuration of decadal anomalies can be maintained by an unstable ocean-atmosphere interaction mechanism in the midlatitudes, which is hypothesized as follows. Associated with a warm PDO phase, an initial midlatitude surface westerly anomaly accompanied with intensified Aleutian low tends to force a negative SST anomaly by increasing upward surface heat fluxes and driving southward Ekman current anomaly. The SST cooling tends to increase the meridional SST gradient, thus enhancing the subtropical oceanic front. As an adjustment of the atmospheric boundary layer to the enhanced oceanic front, the low-level atmospheric meridional temperature gradient and thus the low-level atmospheric baroclinicity tend to be strengthened, inducing more active transient eddy activities that increase transient eddy vorticity forcing. The vorticity forcing that dominates the total atmospheric forcing tends to produce an equivalent barotropic atmospheric low pressure north of the initial westerly anomaly, intensifying the initial anomalies of the midlatitude surface westerly and Aleutian low. Therefore, it is suggested that the midlatitude ocean-atmosphere interaction can provide a positive feedback mechanism for the development of initial anomaly, in which the oceanic front and the atmospheric transient eddy are the indispensable ingredients. Such a positive ocean-atmosphere feedback mechanism is fundamentally responsible for the observed decadal anomalies in the midlatitude North Pacific ocean

  19. Structure and dynamics of decadal anomalies in the wintertime midlatitude North Pacific ocean-atmosphere system

    NASA Astrophysics Data System (ADS)

    Fang, Jiabei; Yang, Xiu-qun

    2017-04-01

    The structure and dynamics of decadal anomalies in the wintertime midlatitude North Pacific ocean-atmosphere system are examined in this study, using the NCEP/NCAR atmospheric reanalysis, HadISST SST and Simple Ocean Data Assimilation data for 1960-2010. The midlatitude decadal anomalies associated with the Pacific Decadal Oscillation are identified, being characterized by an equivalent barotropic atmospheric low (high) pressure over a cold (warm) oceanic surface. Such a unique configuration of decadal anomalies can be maintained by an unstable ocean-atmosphere interaction mechanism in the midlatitudes, which is hypothesized as follows. Associated with a warm PDO phase, an initial midlatitude surface westerly anomaly accompanied with intensified Aleutian low tends to force a negative SST anomaly by increasing upward surface heat fluxes and driving southward Ekman current anomaly. The SST cooling tends to increase the meridional SST gradient, thus enhancing the subtropical oceanic front. As an adjustment of the atmospheric boundary layer to the enhanced oceanic front, the low-level atmospheric meridional temperature gradient and thus the low-level atmospheric baroclinicity tend to be strengthened, inducing more active transient eddy activities that increase transient eddy vorticity forcing. The vorticity forcing that dominates the total atmospheric forcing tends to produce an equivalent barotropic atmospheric low pressure north of the initial westerly anomaly, intensifying the initial anomalies of the midlatitude surface westerly and Aleutian low. Therefore, it is suggested that the midlatitude ocean-atmosphere interaction can provide a positive feedback mechanism for the development of initial anomaly, in which the oceanic front and the atmospheric transient eddy are the indispensable ingredients. Such a positive ocean-atmosphere feedback mechanism is fundamentally responsible for the observed decadal anomalies in the midlatitude North Pacific ocean

  20. Time series satellite and ground-based data for detecting earthquake anomalies

    NASA Astrophysics Data System (ADS)

    Zoran, M. A.; Savastru, R. S.; Savastru, D. M.

    2014-10-01

    Earthquake science has entered a new era with the development of space-based technologies to measure surface geophysical parameters and deformation at the boundaries of tectonic plates and large faults. Satellite time-series data, coupled with ground-based observations where available, can enable scientists to survey pre-earthquake signals in areas of strong tectonic activity. Cumulative stress energy in seismically active regions under tectonic forcing manifests as various earthquake precursors. Space-time anomalies of Earth's emitted radiation (thermal infrared measured from satellites months to weeks before the occurrence of earthquakes), radon anomalies in underground water and soil, and electromagnetic anomalies are considered pre-seismic signals. The Vrancea tectonically active zone in Romania is characterized by a high seismic hazard in the European-Mediterranean region, being responsible for the generation of intermediate-depth and normal earthquakes in a confined epicentral area. Anomaly detection is extremely important for forecasting the date, location and magnitude of an impending earthquake. This paper presents observations made using in-situ data and time series of MODIS and NOAA-AVHRR satellite data for derived geophysical parameters (land surface temperature (LST), outgoing long-wave radiation (OLR), net surface latent heat flux (LHF) and mean air temperature (AT)) for some seismic events recorded in the Vrancea region in Romania, which is one of the most active intracontinental seismic areas in Europe. Starting almost one week prior to a moderate or strong earthquake, a transient thermal infrared rise in LST of several degrees Celsius (°C) and OLR values higher than normal have been recorded around epicentral areas, as a function of the magnitude and focal depth, and these disappeared after the main shock.
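
    A minimal sketch of the 'departure from the normal' test implied above: compare each day's LST or OLR value for a pixel against a multi-year climatology for the same calendar day and flag unusually strong positive deviations. The z-score form and threshold are assumptions for illustration.

```python
import numpy as np

def thermal_anomaly(series_by_year, current, k=2.0):
    """series_by_year: (n_years, 365) LST or OLR values for one pixel;
    current: length-365 series for the year under test."""
    clim_mean = series_by_year.mean(axis=0)
    clim_std = series_by_year.std(axis=0) + 1e-6
    z = (current - clim_mean) / clim_std
    return z > k            # True on days with an unusually strong positive anomaly
```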

  1. Anomaly detection driven active learning for identifying suspicious tracks and events in WAMI video

    NASA Astrophysics Data System (ADS)

    Miller, David J.; Natraj, Aditya; Hockenbury, Ryler; Dunn, Katherine; Sheffler, Michael; Sullivan, Kevin

    2012-06-01

    We describe a comprehensive system for learning to identify suspicious vehicle tracks from wide-area motion (WAMI) video. First, since the road network for the scene of interest is assumed unknown, agglomerative hierarchical clustering is applied to all spatial vehicle measurements, resulting in spatial cells that largely capture individual road segments. Next, for each track, both at the cell (speed, acceleration, azimuth) and track (range, total distance, duration) levels, extreme value feature statistics are both computed and aggregated, to form summary (p-value based) anomaly statistics for each track. Here, to fairly evaluate tracks that travel across different numbers of spatial cells, for each cell-level feature type, a single (most extreme) statistic is chosen, over all cells traveled. Finally, a novel active learning paradigm, applied to a (logistic regression) track classifier, is invoked to learn to distinguish suspicious from merely anomalous tracks, starting from anomaly-ranked track prioritization, with ground-truth labeling by a human operator. This system has been applied to WAMI video data (ARGUS), with the tracks automatically extracted by a system developed in-house at Toyon Research Corporation. Our system gives promising preliminary results in highly ranking as suspicious aerial vehicles, dismounts, and traffic violators, and in learning which features are most indicative of suspicious tracks.

  2. Using anomaly detection method and multi-temporal Radarsat images for short-term land use/land cover change detection

    NASA Astrophysics Data System (ADS)

    Qian, JunPing; Chen, XiaoYue; Li, Xia; Yeh, Anthony Gar-On; Ai, Bin

    2008-10-01

    Multi-temporal Radarsat images were used, and object features including the mean value of the backscattering coefficient (Mean), the minimal value of backscattering (Min), the homogeneity of the gray-level co-occurrence matrix (GLCMhomo) and the dissimilarity of the gray-level co-occurrence matrix (GLCMdis) were extracted based on segmented image objects. After that, a change vector was constructed for each land object. In the third step, the DBAD algorithm was applied to the change-vector dataset to detect anomalous changes in the three scenes of images. Finally, field survey data plus manual interpretation were used for validation. Compared with the object-based image regression method, DBAD results in better accuracy. Data validation also shows that DBAD has better accuracy in both under-construction areas and newly built-up areas (error lower than 12%), while for built-up areas and some mixed-use areas it has relatively lower accuracy than for other land types (from 10% to 28.57%). To conclude, short-term land use change in time-series images can be defined as a spatial and temporal anomaly in remote sensing images. By extending traditional anomaly detection to spatial-temporal anomaly detection, land use change caused by human activity can be effectively detected over short time intervals. The DBAD algorithm focuses only on the density of change vectors in feature space, which is independent of the amplitude and direction of the change vectors. This enables DBAD to effectively discriminate temporal image variation caused by the observation system, the environment or seasonal land cover change, especially in vegetated and cultivated areas that changed markedly during the observation period, from land use change caused by human activities. This helps to decrease false alarms in short-term change detection.

  3. Detection of submicron scale cracks and other surface anomalies using positron emission tomography

    DOEpatents

    Cowan, Thomas E.; Howell, Richard H.; Colmenares, Carlos A.

    2004-02-17

    Detection of submicron-scale cracks and other mechanical and chemical surface anomalies using PET. This surface technique has sufficient sensitivity to detect single voids or pits of sub-millimeter size and single cracks or fissures of millimeter-scale length, micrometer-scale depth, and nanometer-scale width. This technique can also be applied to detect surface regions of differing chemical reactivity. It may be utilized in a scanning or survey mode to simultaneously detect such mechanical or chemical features over large interior or exterior surface areas of parts as large as about 50 cm in diameter. The technique involves exposing a surface to a short-lived radioactive gas for a time period, removing the excess gas to leave a partial monolayer, determining the location and shape of the cracks, voids, porous regions, etc., and calculating their width, depth, and length. Detection of 0.01 mm deep cracks using a 3 mm detector resolution has been accomplished using this technique.

  4. Unsupervised, low latency anomaly detection of algorithmically generated domain names by generative probabilistic modeling.

    PubMed

    Raghuram, Jayaram; Miller, David J; Kesidis, George

    2014-07-01

    We propose a method for detecting anomalous domain names, with focus on algorithmically generated domain names which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (or botnets), malware, and phishing. Our method is based on learning a (null hypothesis) probability model based on a large set of domain names that have been white listed by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, and tend to have a distribution of characters, words, word lengths, and number of words that are typical of some language (mostly English), and often consist of words drawn from a known lexicon. On the other hand, in the present day scenario, algorithmically generated domain names typically have distributions that are quite different from that of human-created domain names. We propose a fully generative model for the probability distribution of benign (white listed) domain names which can be used in an anomaly detection setting for identifying putative algorithmically generated domain names. Unlike other methods, our approach can make detections without considering any additional (latency producing) information sources, often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results, relative to several baseline methods, with higher detection rates and low false positive rates.
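
    As a small stand-in for the generative model described above, the sketch below trains a character-bigram model on whitelisted names and scores new names by per-character log-likelihood; the full model also covers words, word lengths, and word counts, so the smoothing constant, alphabet size, and thresholding rule here are illustrative assumptions.

```python
import math
from collections import defaultdict

def train_bigram(whitelist):
    """Estimate character-bigram counts from white-listed domain names."""
    counts, totals = defaultdict(float), defaultdict(float)
    for name in whitelist:
        s = "^" + name.lower() + "$"           # start/end markers
        for a, b in zip(s, s[1:]):
            counts[(a, b)] += 1
            totals[a] += 1
    return counts, totals

def log_likelihood_per_char(name, counts, totals, alpha=0.1, vocab=40):
    """Average log-likelihood per character under the smoothed bigram model;
    alpha and the assumed alphabet size `vocab` control the smoothing."""
    s = "^" + name.lower() + "$"
    ll = 0.0
    for a, b in zip(s, s[1:]):
        ll += math.log((counts[(a, b)] + alpha) / (totals[a] + alpha * vocab))
    return ll / (len(s) - 1)

# Names whose per-character log-likelihood falls below a threshold fitted on the
# whitelist (e.g., a low percentile) are flagged as putatively algorithmically generated.
```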

  5. Detection of subpixel anomalies in multispectral infrared imagery using an adaptive Bayesian classifier

    SciTech Connect

    Ashton, E.A.

    1998-03-01

    The detection of subpixel targets with unknown spectral signatures and cluttered backgrounds in multispectral imagery is a topic of great interest for remote surveillance applications. Because no knowledge of the target is assumed, the only way to accomplish such a detection is through a search for anomalous pixels. Two approaches to this problem are examined in this paper. The first is to separate the image into a number of statistical clusters by using an extension of the well-known K-means algorithm. Each bin of resultant residual vectors is then decorrelated, and the results are thresholded to provide detection. The second approach requires the formation of a probabilistic background model by using an adaptive Bayesian classification algorithm. This allows the calculation of a probability for each pixel, with respect to the model. These probabilities are then thresholded to provide detection. Both algorithms are shown to provide significant improvement over current filtering techniques for anomaly detection in experiments using multispectral IR imagery with both simulated and actual subpixel targets.
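
    A minimal sketch of the first (clustering) approach, assuming scikit-learn and synthetic pixel spectra rather than the paper's data: pixels are grouped with K-means, residuals within each cluster are decorrelated via the cluster covariance (a Mahalanobis distance), and large distances are thresholded as anomalies. Parameter values are illustrative only.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        pixels = rng.normal(size=(2000, 6))          # hypothetical 6-band spectra
        pixels[:5] += 6.0                            # a few anomalous (target-like) pixels

        k = 4
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pixels)

        scores = np.zeros(len(pixels))
        for c in range(k):
            members = pixels[labels == c]
            mu = members.mean(axis=0)
            cov = np.cov(members, rowvar=False) + 1e-6 * np.eye(pixels.shape[1])
            inv_cov = np.linalg.inv(cov)
            resid = members - mu
            # Squared Mahalanobis distance decorrelates the residuals within the cluster.
            scores[labels == c] = np.einsum("ij,jk,ik->i", resid, inv_cov, resid)

        threshold = np.percentile(scores, 99.5)      # illustrative threshold
        print("anomalous pixel indices:", np.where(scores > threshold)[0])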

  6. Unsupervised, low latency anomaly detection of algorithmically generated domain names by generative probabilistic modeling

    PubMed Central

    Raghuram, Jayaram; Miller, David J.; Kesidis, George

    2014-01-01

    We propose a method for detecting anomalous domain names, with focus on algorithmically generated domain names which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (or botnets), malware, and phishing. Our method is based on learning a (null hypothesis) probability model based on a large set of domain names that have been white listed by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, and tend to have a distribution of characters, words, word lengths, and number of words that are typical of some language (mostly English), and often consist of words drawn from a known lexicon. On the other hand, in the present day scenario, algorithmically generated domain names typically have distributions that are quite different from that of human-created domain names. We propose a fully generative model for the probability distribution of benign (white listed) domain names which can be used in an anomaly detection setting for identifying putative algorithmically generated domain names. Unlike other methods, our approach can make detections without considering any additional (latency producing) information sources, often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results, relative to several baseline methods, with higher detection rates and low false positive rates. PMID:25685511

  7. Classification of radar data by detecting and identifying spatial and temporal anomalies

    NASA Astrophysics Data System (ADS)

    Väilä, Minna; Venäläinen, Ilkka; Jylhä, Juha; Ruotsalainen, Marja; Perälä, Henna; Visa, Ari

    2010-04-01

    For some time, applying the theory of pattern recognition and classification to radar signal processing has been a topic of interest in the field of remote sensing. Efficient operation and target indication are often hindered by the signal background, which can have properties similar to those of the signal of interest. Because noise and clutter may constitute most of the response of a surveillance radar, aircraft and other targets of interest can be seen as anomalies in the data. We propose an algorithm for detecting these anomalies against a heterogeneous clutter background in each range-Doppler cell, the basic unit of radar data defined by the resolution in range, angle, and Doppler. The analysis is based on the time history of the response in a cell and its correlation with the spatial surroundings. If the newest time window of the response in a resolution cell differs statistically from the cell's time history, the cell is flagged as anomalous. Normal cells are classified as noise or as different types of clutter based on their strength in each Doppler band. Anomalous cells are analyzed using a longer time window, which emulates a longer coherent illumination. Based on the decorrelation behavior of the response in the long time window, anomalous cells are classified as clutter, an airplane, or a helicopter. The algorithm is tested with both experimental and simulated radar data. The experimental radar data were recorded in a forested landscape.
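
    As a toy illustration of the per-cell test (not the authors' exact statistic), the sketch below compares the newest time window of a cell's response with its longer history using a z-score on mean power; the window length and three-sigma threshold are assumptions.

        import numpy as np

        def cell_is_anomalous(power_history, recent_len=8, z_threshold=3.0):
            """Flag a range-Doppler cell whose newest window departs from its time history."""
            history, recent = power_history[:-recent_len], power_history[-recent_len:]
            mu, sigma = history.mean(), history.std(ddof=1) + 1e-12
            z = (recent.mean() - mu) / (sigma / np.sqrt(recent_len))
            return abs(z) > z_threshold

        rng = np.random.default_rng(2)
        clutter_cell = rng.exponential(1.0, size=128)   # stationary clutter response
        target_cell = clutter_cell.copy()
        target_cell[-8:] += 5.0                         # an aircraft enters the cell

        print("clutter cell anomalous:", cell_is_anomalous(clutter_cell))
        print("target  cell anomalous:", cell_is_anomalous(target_cell))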

  8. Comparative genomic hybridization array study and its utility in detection of constitutional and acquired anomalies.

    PubMed

    Andrieux, Joris; Sheth, Frenny

    2009-10-01

    The last decade has witnessed an upsurge in knowledge of cytogenetic disorders and a repackaging of the old technology together with molecular genetics. Because conventional cytogenetics can detect genetic alterations of only 10-15 Mb, many micro-deletions and micro-duplications are missed. With the advent of fluorescence in situ hybridization (FISH), the resolution for genetic aberrations improved to 3-5 Mb; anomalies smaller than this, however, require the further precision achieved with comparative genomic hybridization arrays (CGH-array). The introduction of array-CGH, with automated DNA fragment analyzers and DNA chips, has brought higher sensitivity for submicroscopic chromosomal anomalies that have so far been missed in many acquired and constitutional genetic disorders. The resolution of the technology varies from several kb to 1 Mb depending on the type of array selected. With recent improvements in array-CGH technology, a link between cytogenetics and molecular biology has been established without replacing conventional cytogenetic techniques. Wider accessibility of the technology should provide clues to many unidentified or unexplained genetic disorders and prove a boon to clinicians.

  9. Fuzzy Logic Based Anomaly Detection for Embedded Network Security Cyber Sensor

    SciTech Connect

    Ondrej Linda; Todd Vollmer; Jason Wright; Milos Manic

    2011-04-01

    Resiliency and security of critical infrastructure control systems are pressing concerns in the modern world of cyber terrorism. Developing a network security system specifically tailored to the requirements of such critical assets is of primary importance. This paper proposes a novel learning algorithm for an anomaly-based network security cyber sensor together with its hardware implementation. The presented learning algorithm constructs a fuzzy logic rule-based model of normal network behavior. Individual fuzzy rules are extracted directly from the stream of incoming packets using an online clustering algorithm. This learning algorithm was specifically developed to comply with the constrained computational requirements of low-cost embedded network security cyber sensors. The performance of the system was evaluated on a set of network data recorded from an experimental test-bed mimicking the environment of a critical infrastructure control system.
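
    The paper's exact learning algorithm is not reproduced here; the sketch below is a hypothetical, simplified online clustering of packet feature vectors into Gaussian membership functions, with new traffic scored by its maximum membership (low membership suggesting an anomaly). All parameters and feature choices are illustrative.

        import numpy as np

        class OnlineFuzzyModel:
            """Incrementally cluster feature vectors; each cluster acts as one fuzzy rule."""
            def __init__(self, radius=2.0, width=1.0):
                self.centers, self.radius, self.width = [], radius, width

            def update(self, x):
                # Create a new cluster (rule) when x is far from every existing center.
                if not self.centers or min(np.linalg.norm(x - c) for c in self.centers) > self.radius:
                    self.centers.append(np.array(x, dtype=float))

            def membership(self, x):
                # Gaussian membership of x in the closest rule; near 0 means anomalous.
                d = min(np.linalg.norm(x - c) for c in self.centers)
                return float(np.exp(-(d ** 2) / (2 * self.width ** 2)))

        rng = np.random.default_rng(3)
        model = OnlineFuzzyModel()
        for packet in rng.normal(size=(1000, 3)):    # hypothetical per-packet features
            model.update(packet)

        print("normal packet score :", model.membership(rng.normal(size=3)))
        print("attack packet score :", model.membership(np.array([9.0, 9.0, 9.0])))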

  10. Regional and residual anomaly separation in microgravity maps for cave detection: The case study of Gruta de las Maravillas (SW Spain)

    NASA Astrophysics Data System (ADS)

    Martínez-Moreno, F. J.; Galindo-Zaldívar, J.; Pedrera, A.; Teixidó, T.; Peña, J. A.; González-Castillo, L.

    2015-03-01

    Gravity can be considered an optimal geophysical method for cave detection, given the high density contrast between an empty cavity and the surrounding materials. A number of methods can be used for regional and residual gravity anomaly separation, although they have not been tested in natural scenarios. With the purpose of comparing the different methods, we calculate the residual anomalies associated with the karst system of Gruta de las Maravillas whose cave morphology and dimensions are well-known. A total of 1857 field measurements, mostly distributed in a regular grid of 10 × 10 m, cover the studied area. The microgravity data were acquired using a Scintrex CG5 gravimeter and topography control was carried out with a differential GPS. Regional anomaly maps were calculated by means of several algorithms to generate the corresponding residual gravimetric maps: polynomial first-order fitting, fast Fourier transformation with an upward continuation filter, moving average, minimum curvature and kriging methods. Results are analysed and discussed in terms of resolution, implying the capacity to detect shallow voids. We propose that polynomial fitting is the best technique when microgravity data are used to obtain the residual anomaly maps for cave detection.
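
    A minimal sketch of the first-order polynomial (planar) regional/residual separation described above, using NumPy and synthetic station coordinates rather than the survey data: the regional field is a least-squares plane fitted to the gravity values, and the residual (where a cavity appears as a gravity low) is what remains.

        import numpy as np

        def residual_anomaly(x, y, gravity):
            """Subtract a first-order (planar) regional trend fitted by least squares."""
            design = np.column_stack([np.ones_like(x), x, y])   # g ~ a + b*x + c*y
            coeffs, *_ = np.linalg.lstsq(design, gravity, rcond=None)
            regional = design @ coeffs
            return gravity - regional

        # Hypothetical 10 m grid with a regional gradient plus a negative anomaly (cavity).
        xx, yy = np.meshgrid(np.arange(0, 300, 10.0), np.arange(0, 300, 10.0))
        x, y = xx.ravel(), yy.ravel()
        trend = 0.002 * x - 0.001 * y
        cavity = -0.05 * np.exp(-((x - 150) ** 2 + (y - 150) ** 2) / (2 * 30.0 ** 2))
        observed = trend + cavity

        residual = residual_anomaly(x, y, observed)
        print(f"minimum residual near the cavity: {residual.min():.3f} (arbitrary units)")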

  11. Distribution water quality anomaly detection from UV optical sensor monitoring data by integrating principal component analysis with chi-square distribution.

    PubMed

    Hou, Dibo; Zhang, Jian; Yang, Zheling; Liu, Shu; Huang, Pingjie; Zhang, Guangxin

    2015-06-29

    Ensuring the security of drinking water distribution systems has recently attracted global attention because of the potential threat from harmful contaminants. Real-time monitoring based on ultraviolet optical sensors is a promising technique: it is reagent-free, low in maintenance cost, rapid in analysis, and offers wide coverage. However, ultraviolet absorption spectra are high-dimensional and easily affected by interference, and in on-site applications there is almost no prior knowledge, such as the spectral characteristics of potential contaminants, before they are identified. Meanwhile, what counts as normal water quality also varies with operating conditions. In this paper, a procedure based on multivariate statistical analysis is proposed to detect distribution water quality anomalies from ultraviolet optical sensor data. First, principal component analysis is employed to capture the main features of the spectral matrix and reduce its dimensionality. A new statistical variable is then constructed and used to evaluate the local outlying degree according to the chi-square distribution in the principal component subspace. The probability that the latest observation is anomalous is calculated by accumulating the outlying degrees of the adjacent previous observations. Several key parameters are discussed to make the detection procedure more reliable. Using the proposed methods, distribution water quality anomalies and abnormal optical changes can be detected. A contaminant intrusion experiment was conducted in a pilot-scale distribution system by injecting a phenol solution, and the effectiveness of the proposed procedure is verified on the resulting spectral data.
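
    As a simplified illustration of the PCA/chi-square idea (using scikit-learn and SciPy; the synthetic spectra, number of components, contamination magnitude, and confidence level are all assumptions), the sketch below scores each observation by a normalized distance in the principal component subspace and compares it with a chi-square quantile.

        import numpy as np
        from scipy.stats import chi2
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(4)
        baseline_spectra = rng.normal(size=(500, 80))   # hypothetical UV absorption spectra
        new_spectra = rng.normal(size=(50, 80))
        new_spectra[-3:] += 6.0                         # simulated gross contamination events

        k = 10
        pca = PCA(n_components=k).fit(baseline_spectra)

        def t2_score(X):
            # Hotelling-style statistic: squared scores normalized by component variances,
            # approximately chi-square with k degrees of freedom under normal conditions.
            scores = pca.transform(X)
            return np.sum(scores ** 2 / pca.explained_variance_, axis=1)

        limit = chi2.ppf(0.99, df=k)                    # 99% control limit
        flags = t2_score(new_spectra) > limit
        print("anomalous observations:", np.where(flags)[0])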

  12. The 2014-2015 warming anomaly in the Southern California Current System observed by underwater gliders

    NASA Astrophysics Data System (ADS)

    Zaba, Katherine D.; Rudnick, Daniel L.

    2016-02-01

    Large-scale patterns of positive temperature anomalies persisted throughout the surface waters of the North Pacific Ocean during 2014-2015. In the Southern California Current System, measurements by our sustained network of underwater gliders reveal the coastal effects of the recent warming. Regional upper ocean temperature anomalies were the largest observed since the initiation of the glider network in 2006. Additional observed physical anomalies included a depressed thermocline, high stratification, and freshening; induced biological consequences included changes in the vertical distribution of chlorophyll fluorescence. Contemporaneous perturbations in surface heat flux and wind strength suggest that local anomalous atmospheric forcing caused the unusual oceanic conditions.

  13. Adaptive hidden Markov model with anomaly States for price manipulation detection.

    PubMed

    Cao, Yi; Li, Yuhua; Coleman, Sonya; Belatreche, Ammar; McGinnity, Thomas Martin

    2015-02-01

    Price manipulation refers to the activities of traders who use carefully designed trading behaviors to manually push up or down the underlying equity prices for profit. With increasing volumes and frequency of trading, price manipulation can be extremely damaging to the proper functioning and integrity of capital markets. The existing literature focuses either on empirical studies of market abuse cases or on analysis of particular manipulation types under certain assumptions. Effective approaches for analyzing and detecting price manipulation in real time are yet to be developed. This paper proposes a novel approach, called the adaptive hidden Markov model with anomaly states (AHMMAS), for modeling and detecting price manipulation activities. Together with wavelet transformations and gradients as the feature extraction methods, the AHMMAS model supports price manipulation detection and basic manipulation type recognition. Evaluation experiments conducted on tick data for seven stocks from NASDAQ and the London Stock Exchange, and on 10 stock price series simulated by stochastic differential equations, show that the proposed AHMMAS model can effectively detect price manipulation patterns and outperforms the selected benchmark models.
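
    The AHMMAS model itself is more elaborate than a plain HMM; as a rough sketch of hidden-Markov-model likelihood scoring for anomaly detection (assuming the hmmlearn package as a dependency, synthetic per-tick features, and an illustrative three-sigma threshold), a window whose log-likelihood under a model fitted to normal trading is unusually low can be flagged for inspection.

        import numpy as np
        from hmmlearn.hmm import GaussianHMM

        rng = np.random.default_rng(5)
        # Hypothetical per-tick features (e.g., returns and gradients) from normal trading.
        normal_features = rng.normal(0.0, 1.0, size=(5000, 2))
        hmm = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50, random_state=0)
        hmm.fit(normal_features)

        def window_score(window):
            """Average per-sample log-likelihood of a feature window under the fitted HMM."""
            return hmm.score(window) / len(window)

        baseline = np.array([window_score(normal_features[i:i + 100])
                             for i in range(0, 4000, 100)])
        threshold = baseline.mean() - 3 * baseline.std()   # illustrative three-sigma cut

        suspicious = rng.normal(0.0, 1.0, size=(100, 2))
        suspicious[40:60] += np.array([4.0, -4.0])         # simulated manipulation burst
        print("flagged:", window_score(suspicious) < threshold)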

  14. Traffic Pattern Detection Using the Hough Transformation for Anomaly Detection to Improve Maritime Domain Awareness

    DTIC Science & Technology

    2013-12-01

    position reports alone in this thesis. Mechanisms for clustering, classifying, and detecting outliers in groups of vessels based on their behavior...the score-based approach affords more ability to infer meaning directly from the abnormal classification. In the recreational example, the score...region at indices (x, y). From the Hough space, co-linear regions are detected by identifying common (d, θ) pairs within each set S between different

  15. Transferring embryos with genetic anomalies detected in preimplantation testing: an Ethics Committee Opinion.

    PubMed

    2017-05-01

    Patient requests for transfer of embryos with genetic anomalies linked to serious health-affecting disorders detected in preimplantation testing are rare but do exist. This Opinion sets out the possible rationales for a provider's decision to assist or decline to assist in such transfers. The Committee concludes that in most clinical cases it is ethically permissible either to assist or to decline to assist in transferring such embryos. In circumstances in which a child is highly likely to be born with a life-threatening condition that causes severe and early debility with no possibility of reasonable function, provider transfer of such embryos is ethically problematic and highly discouraged. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  16. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Kumar, Sricharan; Srivistava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
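
    A minimal sketch of the bootstrap idea on synthetic data, assuming scikit-learn's k-nearest-neighbor regressor as the nonparametric model (the paper does not prescribe this choice): the regressor is refit on resampled training sets, the spread of predictions plus resampled residuals yields a prediction interval, and an observed output outside the interval is flagged as anomalous.

        import numpy as np
        from sklearn.neighbors import KNeighborsRegressor

        rng = np.random.default_rng(6)
        X = rng.uniform(-3, 3, size=(400, 1))
        y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=400)   # hypothetical training data

        def bootstrap_interval(X, y, x_new, n_boot=200, alpha=0.05):
            """Bootstrap prediction interval for a nonparametric regression output."""
            preds = []
            for _ in range(n_boot):
                idx = rng.integers(0, len(X), len(X))        # resample with replacement
                model = KNeighborsRegressor(n_neighbors=10).fit(X[idx], y[idx])
                resid = y[idx] - model.predict(X[idx])
                preds.append(model.predict(x_new)[0] + rng.choice(resid))
            return np.quantile(preds, [alpha / 2, 1 - alpha / 2])

        x_new = np.array([[1.0]])
        lo, hi = bootstrap_interval(X, y, x_new)
        observed = 2.5                                       # an observed output to check
        print(f"95% interval: [{lo:.2f}, {hi:.2f}]; anomalous: {not (lo <= observed <= hi)}")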

  17. Detection, identification and mapping of iron anomalies in brain tissue using X-ray absorption spectroscopy

    SciTech Connect

    Mikhaylova, A.; Davidson, M.; Toastmann, H.; Channell, J.E.T.; Guyodo, Y.; Batich, C.; Dobson, J.

    2008-06-16

    This work describes a novel method for the detection, identification and mapping of anomalous iron compounds in mammalian brain tissue using X-ray absorption spectroscopy. We have located and identified individual iron anomalies in an avian tissue model associated with ferritin, biogenic magnetite and haemoglobin with a pixel resolution of less than 5 µm. This technique represents a breakthrough in the study of both intra- and extra-cellular iron compounds in brain tissue. The potential for high-resolution iron mapping using microfocused X-ray beams has direct application to investigations of the location and structural form of iron compounds associated with human neurodegenerative disorders - a problem which has vexed researchers for 50 years.

  18. Cfetool: A General Purpose Tool for Anomaly Detection in Periodic Data

    SciTech Connect

    Wachsmann, Alf; Cassell, Elizabeth; /UC, Santa Barbara

    2007-03-06

    Cfengine's environment daemon "cfenvd" has a limited and fixed set of metrics it measures on a computer. The data are assumed to be periodic in nature, and cfenvd reports any data points that fall too far outside the pattern it has learned from past measurements. This is used to detect "anomalies" on computers. We introduce a new standalone tool, "cfetool", that allows arbitrary periodic data to be stored and evaluated. The user interface is modeled after rrdtool, another widely used tool for storing measured data. Because a standalone tool can be used for more than computer-related data, we have extended the built-in mathematics to apply to yearly data as well.
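
    A rough sketch of the underlying idea (not cfetool's actual interface or algorithm): measurements are bucketed by their phase within the period (here, hour of week), per-bucket mean and standard deviation form the learned pattern, and new values too many standard deviations away are reported. The data and threshold are hypothetical.

        import numpy as np

        PERIOD = 24 * 7   # hourly samples, weekly periodicity (hour-of-week buckets)

        def learn_pattern(values):
            """Per-phase mean and standard deviation learned from past periodic data."""
            buckets = [values[phase::PERIOD] for phase in range(PERIOD)]
            return (np.array([b.mean() for b in buckets]),
                    np.array([b.std(ddof=1) + 1e-9 for b in buckets]))

        def is_anomalous(value, hour_index, mean, std, n_sigma=3.0):
            phase = hour_index % PERIOD
            return abs(value - mean[phase]) > n_sigma * std[phase]

        rng = np.random.default_rng(7)
        weeks = 8
        hours = np.arange(weeks * PERIOD)
        history = 50 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, len(hours))

        mean, std = learn_pattern(history)
        print(is_anomalous(52.0, hour_index=len(hours), mean=mean, std=std))   # in pattern
        print(is_anomalous(95.0, hour_index=len(hours), mean=mean, std=std))   # anomaly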

  19. Seismological detection of low-velocity anomalies surrounding the mantle transition zone in Japan subduction zone

    NASA Astrophysics Data System (ADS)

    Liu, Zhen; Park, Jeffrey; Karato, Shun-ichiro

    2016-03-01

    In the Japan subduction zone, a locally depressed 660 discontinuity has been observed beneath northeast Asia, suggesting downwelling of materials from the mantle transition zone (MTZ). Vertical transport of water-rich MTZ materials across the major mineral phase changes could lead to water release and to partial melting in surrounding mantle regions, causing seismic low-velocity anomalies. Melt layers implied by low-velocity zones (LVZs) above the 410 discontinuity have been detected in many regions, but seismic evidence for partial melting below the 660 discontinuity has been limited. High-frequency migrated Ps receiver functions indicate LVZs below the depressed 660 discontinuity and above the 410 discontinuity in the deep Japan subduction zone, suggesting dehydration melting induced by water transport out of the MTZ. Our results provide insights into water circulation associated with dynamic interactions between the subducted slab and surrounding mantle.

  20. Correlation between prenatal ultrasound and fetal autopsy findings on urinary system anomalies terminated in the second trimester.

    PubMed

    Akgun, Hulya; Basbug, Mustafa; Ozgun, Mahmut Tuncay; Ozturk, Figen; Okten, Turhan

    2014-03-01

    This prospective study was designed to compare ultrasound and autopsy findings on fetal urinary system malformations in second trimester terminations of pregnancy to evaluate the degree of agreement of such findings. From January 2003 to October 2012, a total of 308 second trimester terminations of pregnancy were performed because of fetal malformation diagnosed through second trimester ultrasound examination at a tertiary referral center. Among 308 second trimester fetuses with congenital anomalies, 62 (20.1%) had urinary anomalies. Ultrasound and fetal autopsy findings were in full agreement for urinary system malformations in 45 (72.6%) of 62 cases. In six (9.7%), autopsy confirmed the malformations detected by ultrasound but showed additional lesser urinary anomalies. In 10 (16.1%) cases, autopsy revealed major urinary anomalies not determined by ultrasound. In one case (1.6%), ultrasound reported bilateral renal agenesis; however, autopsy revealed a horseshoe kidney. The ultrasound screening sensitivity was 83.8%, and specificity was 99.5%. The results showed that prenatal ultrasound achieved a high accuracy in diagnosing fetal urinary malformations. However, fetal autopsy occasionally adds valuable information to prenatal ultrasound findings. © 2014 John Wiley & Sons, Ltd.

  1. Interior intrusion detection systems

    SciTech Connect

    Rodriguez, J.R.; Matter, J.C. ); Dry, B. )

    1991-10-01

    The purpose of this NUREG is to present technical information that should be useful to NRC licensees in designing interior intrusion detection systems. Interior intrusion sensors are discussed according to their primary application: boundary-penetration detection, volumetric detection, and point protection. Information necessary for implementation of an effective interior intrusion detection system is presented, including principles of operation, performance characteristics and guidelines for design, procurement, installation, testing, and maintenance. A glossary of sensor data terms is included. 36 figs., 6 tabs.

  2. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, M. Clinton; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flow fields/plumes; the Optical Plume Anomaly Detection (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and analytical packages designed to utilize the information collected by OPAD is known as the Engine Diagnostic Filtering System (EDIFIS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Additionally, efforts are under way to implement components of EDIFIS in hardware in order to address real-time operational requirements for health monitoring and management. This paper addresses OPAD and its tool suite, and discusses what is considered a natural progression: a concept for migrating OPAD towards detection of high energy particles, including neutrons and gamma rays. The integration of these tools and capabilities will provide NASA with a systematic approach to monitoring the space vehicle's internal and external environment.

  3. Paternal psychological response after ultrasonographic detection of structural fetal anomalies with a comparison to maternal response: a cohort study

    PubMed Central

    2013-01-01

    Background In Norway almost all pregnant women attend one routine ultrasound examination. Detection of fetal structural anomalies triggers psychological stress responses in the women affected. Despite the frequent use of ultrasound examination in pregnancy, little attention has been devoted to the psychological response of the expectant father following the detection of fetal anomalies. This is important for later fatherhood and the psychological interaction within the couple. We aimed to describe paternal psychological responses shortly after detection of structural fetal anomalies by ultrasonography, and to compare paternal and maternal responses within the same couple. Methods A prospective observational study was performed at a tertiary referral centre for fetal medicine. Pregnant women with a structural fetal anomaly detected by ultrasound and their partners (study group, n=155) and 100 with normal ultrasound findings (comparison group) were included shortly after sonographic examination (inclusion period: May 2006-February 2009). Gestational age was >12 weeks. We used psychometric questionnaires to assess self-reported social dysfunction, health perception, and psychological distress (intrusion, avoidance, arousal, anxiety, and depression): Impact of Event Scale, General Health Questionnaire, and Edinburgh Postnatal Depression Scale. Fetal anomalies were classified according to severity and diagnostic or prognostic ambiguity at the time of assessment. Results Median (range) gestational age at inclusion in the study and comparison group was 19 (12–38) and 19 (13–22) weeks, respectively. Men and women in the study group had significantly higher levels of psychological distress than men and women in the comparison group on all psychometric endpoints. The lowest level of distress in the study group was associated with the least severe anomalies with no diagnostic or prognostic ambiguity (p < 0.033). Men had lower scores than women on all psychometric

  4. A parametric study of unsupervised anomaly detection performance in maritime imagery using manifold learning techniques

    NASA Astrophysics Data System (ADS)

    Olson, C. C.; Doster, T.

    2016-05-01

    We investigate the parameters that govern an unsupervised anomaly detection framework that uses nonlinear techniques to learn a better model of the non-anomalous data. A manifold or kernel-based model is learned from a small, uniformly sampled subset in order to reduce computational burden and under the assumption that anomalous data will have little effect on the learned model because their rarity reduces the likelihood of their inclusion in the subset. The remaining data are then projected into the learned space and their projection errors used as detection statistics. Here, kernel principal component analysis is considered for learning the background model. We consider spectral data from an 8-band multispectral sensor as well as panchromatic infrared images treated by building a data set composed of overlapping image patches. We consider detection performance as a function of patch neighborhood size as well as embedding parameters such as kernel bandwidth and dimension. ROC curves are generated over a range of parameters and compared to RX performance.
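
    A minimal sketch of the kernel-PCA background model using scikit-learn (the bandwidth, dimension, and synthetic data here are assumptions, not the paper's settings): the model is learned from a small uniformly sampled subset, the remaining samples are projected and reconstructed, and the reconstruction error serves as the detection statistic.

        import numpy as np
        from sklearn.decomposition import KernelPCA

        rng = np.random.default_rng(8)
        background = rng.normal(size=(3000, 8))      # hypothetical multispectral pixels
        background[:5] += 5.0                        # rare anomalous pixels

        # Learn the background manifold from a small uniform subset; anomalies are
        # unlikely to be sampled, so they influence the learned model little.
        subset = background[rng.choice(len(background), size=300, replace=False)]
        kpca = KernelPCA(n_components=10, kernel="rbf", gamma=0.05,
                         fit_inverse_transform=True).fit(subset)

        # Projection/reconstruction error in the learned space is the detection statistic.
        recon = kpca.inverse_transform(kpca.transform(background))
        errors = np.linalg.norm(background - recon, axis=1)
        threshold = np.percentile(errors, 99.5)      # illustrative threshold
        print("anomalous pixel indices:", np.where(errors > threshold)[0])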

  5. Anomaly and Signature Filtering Improve Classifier Performance For Detection Of Suspicious Access To EHRs

    PubMed Central

    Kim, Jihoon; Grillo, Janice M; Boxwala, Aziz A; Jiang, Xiaoqian; Mandelbaum, Rose B; Patel, Bhakti A; Mikels, Debra; Vinterbo, Staal A; Ohno-Machado, Lucila

    2011-01-01

    Our objective is to facilitate semi-automated detection of suspicious access to EHRs. Previously we have shown that a machine learning method can play a role in identifying potentially inappropriate access to EHRs. However, the problem of sampling informative instances to build a classifier still remained. We developed an integrated filtering method leveraging both anomaly detection based on symbolic clustering and signature detection, a rule-based technique. We applied the integrated filtering to 25.5 million access records in an intervention arm, and compared this with 8.6 million access records in a control arm where no filtering was applied. On the training set with cross-validation, the AUC was 0.960 in the control arm and 0.998 in the intervention arm. The difference in false negative rates on the independent test set was significant, P=1.6×10−6. Our study suggests that utilization of integrated filtering strategies to facilitate the construction of classifiers can be helpful. PMID:22195129

  6. Anomaly and signature filtering improve classifier performance for detection of suspicious access to EHRs.

    PubMed

    Kim, Jihoon; Grillo, Janice M; Boxwala, Aziz A; Jiang, Xiaoqian; Mandelbaum, Rose B; Patel, Bhakti A; Mikels, Debra; Vinterbo, Staal A; Ohno-Machado, Lucila

    2011-01-01

    Our objective is to facilitate semi-automated detection of suspicious access to EHRs. Previously we have shown that a machine learning method can play a role in identifying potentially inappropriate access to EHRs. However, the problem of sampling informative instances to build a classifier still remained. We developed an integrated filtering method leveraging both anomaly detection based on symbolic clustering and signature detection, a rule-based technique. We applied the integrated filtering to 25.5 million access records in an intervention arm, and compared this with 8.6 million access records in a control arm where no filtering was applied. On the training set with cross-validation, the AUC was 0.960 in the control arm and 0.998 in the intervention arm. The difference in false negative rates on the independent test set was significant, P=1.6×10(-6). Our study suggests that utilization of integrated filtering strategies to facilitate the construction of classifiers can be helpful.

  7. Reliable detection of fluence anomalies in EPID-based IMRT pretreatment quality assurance using pixel intensity deviations

    SciTech Connect

    Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.

    2012-08-15

    Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ~1 mm² areas and ≥2% in ~20 mm² areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified.
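
    A toy sketch of the detection statistic using NumPy/SciPy (the gradient-scaling constant, filter size, noise level, and threshold are assumptions, not the authors' values): the pixel intensity deviation between a repeat image and a reference is suppressed in high-gradient regions, median filtered to reduce noise, and thresholded.

        import numpy as np
        from scipy.ndimage import gaussian_gradient_magnitude, median_filter

        rng = np.random.default_rng(9)
        reference = np.ones((128, 128))
        reference[:, 64:] = 0.3                        # a simple field edge
        repeat = reference + rng.normal(0, 0.003, reference.shape)
        repeat[40:45, 20:25] *= 0.95                   # inserted 5% fluence anomaly

        # Pixel intensity deviation, scaled down where the reference gradient is large so
        # that small positional offsets at field edges do not dominate the statistic.
        pid = (repeat - reference) / reference
        grad = gaussian_gradient_magnitude(reference, sigma=1.0)
        pid_scaled = pid / (1.0 + 10.0 * grad)         # illustrative gradient scaling

        pid_filtered = median_filter(pid_scaled, size=3)   # suppress background noise
        detections = np.abs(pid_filtered) > 0.02           # flag deviations above 2%
        print("anomalous pixels detected:", int(detections.sum()))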

  8. Airborne detection of magnetic anomalies associated with soils on the Oak Ridge Reservation, Tennessee

    SciTech Connect

    Doll, W.E.; Beard, L.P.; Helm, J.M.

    1995-04-01

    Reconnaissance airborne geophysical data acquired over the 35,000-acre Oak Ridge Reservation (ORR), TN, show several magnetic anomalies over undisturbed areas mapped as Copper Ridge Dolomite (CRD). The anomalies of interest are most apparent in magnetic gradient maps, where they exceed 0.06 nT/m and in some cases exceed 0.5 nT/m. Anomalies as large as 25 nT are seen on the maps. Some of the anomalies correlate with known or suspected karst, or with apparent conductivity anomalies calculated from electromagnetic data acquired contemporaneously with the magnetic data. Some of the anomalies have a strong correlation with topographic lows or closed depressions. Surface magnetic data have been acquired over some of these sites and have confirmed the existence of the anomalies. Ground inspections in the vicinity of several of the anomalies have not led to any discoveries of manmade surface materials of sufficient size to generate the observed anomalies. One would expect an anomaly of approximately 1 nT for a pickup truck at 200 ft altitude. Typical residual magnetic anomalies have magnitudes of 5-10 nT, and some are as large as 25 nT. The absence of roads or other indications of culture (past or present) near the anomalies, and the modeling of anomalies in data acquired with surface instruments, indicate that man-made metallic objects are unlikely to be responsible. The authors show that the observed anomalies in the CRD can reasonably be associated with thickening of the soil layer. The occurrence of the anomalies in areas showing evidence of karstification would follow because sediment deposition occurs in topographic lows. Linear groups of anomalies on the maps may be associated with fracture zones that were eroded more than adjacent rocks and were subsequently covered with a thicker blanket of sediment. This study indicates that airborne magnetic data may be of use at other sites where fracture zones or buried collapse structures are of interest.

  9. Paternal psychological response after ultrasonographic detection of structural fetal anomalies with a comparison to maternal response: a cohort study.

    PubMed

    Kaasen, Anne; Helbig, Anne; Malt, Ulrik Fredrik; Naes, Tormod; Skari, Hans; Haugen, Guttorm Nils

    2013-07-12

    In Norway almost all pregnant women attend one routine ultrasound examination. Detection of fetal structural anomalies triggers psychological stress responses in the women affected. Despite the frequent use of ultrasound examination in pregnancy, little attention has been devoted to the psychological response of the expectant father following the detection of fetal anomalies. This is important for later fatherhood and the psychological interaction within the couple. We aimed to describe paternal psychological responses shortly after detection of structural fetal anomalies by ultrasonography, and to compare paternal and maternal responses within the same couple. A prospective observational study was performed at a tertiary referral centre for fetal medicine. Pregnant women with a structural fetal anomaly detected by ultrasound and their partners (study group, n=155) and 100 with normal ultrasound findings (comparison group) were included shortly after sonographic examination (inclusion period: May 2006-February 2009). Gestational age was >12 weeks. We used psychometric questionnaires to assess self-reported social dysfunction, health perception, and psychological distress (intrusion, avoidance, arousal, anxiety, and depression): Impact of Event Scale, General Health Questionnaire, and Edinburgh Postnatal Depression Scale. Fetal anomalies were classified according to severity and diagnostic or prognostic ambiguity at the time of assessment. Median (range) gestational age at inclusion in the study and comparison group was 19 (12-38) and 19 (13-22) weeks, respectively. Men and women in the study group had significantly higher levels of psychological distress than men and women in the comparison group on all psychometric endpoints. The lowest level of distress in the study group was associated with the least severe anomalies with no diagnostic or prognostic ambiguity (p < 0.033). Men had lower scores than women on all psychometric outcome variables. The correlation in

  10. Value of Ultrasound in Detecting Urinary Tract Anomalies After First Febrile Urinary Tract Infection in Children.

    PubMed

    Ghobrial, Emad E; Abdelaziz, Doaa M; Sheba, Maha F; Abdel-Azeem, Yasser S

    2016-05-01

    Background Urinary tract infection (UTI) is an infection that affects part of the urinary tract. Ultrasound is a noninvasive test that can demonstrate the size and shape of the kidneys, the presence of dilatation of the ureters, and the existence of anatomic abnormalities. The aim of the study is to estimate the value of ultrasound in detecting urinary tract anomalies after a first attack of UTI. Methods This study was conducted at the Nephrology Clinic, New Children's Hospital, Faculty of Medicine, Cairo University, from August 2012 to March 2013, and included 30 children who presented with a first attack of acute febrile UTI. All patients were subjected to urine analysis, urine culture and sensitivity, serum creatinine, complete blood count, and imaging in the form of renal ultrasound, voiding cysto-urethrography, and renal scan. Results All the patients had fever with a mean of 38.96°C ± 0.44°C, and the mean duration of illness was 6.23 ± 5.64 days. Nineteen patients (63.3%) had an ultrasound abnormality; the commonest abnormality was kidney stones (15.8%). Only 2 patients who had an abnormal ultrasound also had vesicoureteric reflux on cystourethrography. Sensitivity of ultrasound was 66.7%, specificity was 37.5%, positive predictive value was 21.1%, negative predictive value was 81.8%, and total accuracy was 43.33%. Conclusion We concluded that ultrasound alone was not of much value in diagnosing and planning management of a first attack of febrile UTI. It is recommended that combined investigations are the best way to confirm the diagnosis of urinary tract anomalies. © The Author(s) 2015.

  11. Is a "loss of balance" a control error signal anomaly? Evidence for three-sigma failure