Li, Gang; He, Bin; Huang, Hongwei; Tang, Limin
2016-01-01
The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most existing works based on the spatial–temporal correlation fall into two parts, redundancy reduction and anomaly detection, and these two parts have been pursued separately. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS reduces data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on the long, linear cluster structure of the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented cooperatively in the cluster. To maintain the precision of sensor data, spatial data-driven anomaly detection based on the spatial correlation and the Kriging method is used to generate an anomaly indicator. The experimental results show that cooperative TDSS can realize non-uniform sensing effectively to reduce energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data. PMID:27690035
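A minimal sketch of the spatial anomaly-indicator idea above, with inverse-distance weighting standing in as a simplified surrogate for the paper's Kriging predictor (sensor positions, readings, and the tolerance are all invented):

```python
import math

def idw_predict(neighbors, target_xy, power=2.0):
    """Predict the value at target_xy from neighboring sensors using
    inverse-distance weighting (a simplified surrogate for Kriging)."""
    num, den = 0.0, 0.0
    for (x, y), value in neighbors:
        d = math.hypot(x - target_xy[0], y - target_xy[1])
        if d == 0:
            return value          # coincident sensor: return its reading
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

def anomaly_indicator(measured, predicted, tolerance):
    """Flag a reading whose deviation from the spatial prediction
    exceeds a tolerance (which would come from historical residuals)."""
    return abs(measured - predicted) > tolerance

# Sensors in the same tunnel ring (positions and strain readings invented)
ring = [((0.0, 0.0), 10.1), ((1.0, 0.0), 10.3), ((0.0, 1.0), 9.9)]
pred = idw_predict(ring, (0.5, 0.5))
print(anomaly_indicator(12.5, pred, tolerance=1.0))  # large deviation -> True
```

A real Kriging interpolator would additionally fit a variogram to weight neighbors by spatial correlation rather than raw distance.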
Data-Driven Anomaly Detection Performance for the Ares I-X Ground Diagnostic Prototype
NASA Technical Reports Server (NTRS)
Martin, Rodney A.; Schwabacher, Mark A.; Matthews, Bryan L.
2010-01-01
In this paper, we will assess the performance of a data-driven anomaly detection algorithm, the Inductive Monitoring System (IMS), which can be used to detect simulated Thrust Vector Control (TVC) system failures. However, the ability of IMS to detect these failures in a true operational setting may be related to the realistic nature of how they are simulated. As such, we will investigate both a low fidelity and high fidelity approach to simulating such failures, with the latter based upon the underlying physics. Furthermore, the ability of IMS to detect anomalies that were previously unknown and not previously simulated will be studied in earnest, as well as apparent deficiencies or misapplications that result from using the data-driven paradigm. Our conclusions indicate that robust detection performance of simulated failures using IMS is not appreciably affected by the use of a high fidelity simulation. However, we have found that the inclusion of a data-driven algorithm such as IMS into a suite of deployable health management technologies does add significant value.
Active Learning with Rationales for Identifying Operationally Significant Anomalies in Aviation
NASA Technical Reports Server (NTRS)
Sharma, Manali; Das, Kamalika; Bilgic, Mustafa; Matthews, Bryan; Nielsen, David Lynn; Oza, Nikunj C.
2016-01-01
A major focus of the commercial aviation community is discovery of unknown safety events in flight operations data. Data-driven unsupervised anomaly detection methods are better at capturing unknown safety events compared to rule-based methods which only look for known violations. However, not all statistical anomalies that are discovered by these unsupervised anomaly detection methods are operationally significant (e.g., represent a safety concern). Subject Matter Experts (SMEs) have to spend significant time reviewing these statistical anomalies individually to identify a few operationally significant ones. In this paper we propose an active learning algorithm that incorporates SME feedback in the form of rationales to build a classifier that can distinguish between uninteresting and operationally significant anomalies. Experimental evaluation on real aviation data shows that our approach improves detection of operationally significant events by as much as 75% compared to the state-of-the-art. The learnt classifier also generalizes well to additional validation data sets.
Residual Error Based Anomaly Detection Using Auto-Encoder in SMD Machine Sound.
Oh, Dong Yul; Yun, Il Dong
2018-04-24
Detecting an anomaly or an abnormal situation from given noise is highly useful in an environment where constantly verifying and monitoring a machine is required. As deep learning algorithms are further developed, current studies have focused on this problem. However, there are too many variables to define anomalies, and the human annotation for a large collection of abnormal data labeled at the class-level is very labor-intensive. In this paper, we propose to detect abnormal operation sounds or outliers in a very complex machine along with reducing the data-driven annotation cost. The architecture of the proposed model is based on an auto-encoder, and it uses the residual error, which stands for its reconstruction quality, to identify the anomaly. We assess our model using Surface-Mounted Device (SMD) machine sound, which is very complex, as experimental data, and state-of-the-art performance is successfully achieved for anomaly detection.
Anomaly Detection for Next-Generation Space Launch Ground Operations
NASA Technical Reports Server (NTRS)
Spirkovska, Lilly; Iverson, David L.; Hall, David R.; Taylor, William M.; Patterson-Hine, Ann; Brown, Barbara; Ferrell, Bob A.; Waterman, Robert D.
2010-01-01
NASA is developing new capabilities that will enable future human exploration missions while reducing mission risk and cost. The Fault Detection, Isolation, and Recovery (FDIR) project aims to demonstrate the utility of integrated vehicle health management (IVHM) tools in the domain of ground support equipment (GSE) to be used for the next generation launch vehicles. In addition to demonstrating the utility of IVHM tools for GSE, FDIR aims to mature promising tools for use on future missions and document the level of effort - and hence cost - required to implement an application with each selected tool. One of the FDIR capabilities is anomaly detection, i.e., detecting off-nominal behavior. The tool we selected for this task uses a data-driven approach. Unlike rule-based and model-based systems that require manual extraction of system knowledge, data-driven systems take a radically different approach to reasoning. At the basic level, they start with data that represent nominal functioning of the system and automatically learn expected system behavior. The behavior is encoded in a knowledge base that represents "in-family" system operations. During real-time system monitoring or during post-flight analysis, incoming data is compared to that nominal system operating behavior knowledge base; a distance representing deviation from nominal is computed, providing a measure of how far "out of family" current behavior is. We describe the selected tool for FDIR anomaly detection - Inductive Monitoring System (IMS), how it fits into the FDIR architecture, the operations concept for the GSE anomaly monitoring, and some preliminary results of applying IMS to a Space Shuttle GSE anomaly.
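The "learn nominal clusters, then measure distance from them" behavior described above can be sketched roughly as follows. This is a toy re-implementation of the idea, not NASA's IMS; the clustering radius and the training vectors are invented:

```python
import math

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class SimpleIMS:
    """Toy IMS-style monitor: nominal training vectors are folded into
    clusters, and monitoring scores a new vector by its distance to the
    nearest cluster centroid (the "out of family" measure)."""

    def __init__(self, radius):
        self.radius = radius       # max distance to join an existing cluster
        self.clusters = []         # list of (centroid, member_count)

    def train(self, vector):
        for i, (c, n) in enumerate(self.clusters):
            if euclid(vector, c) <= self.radius:
                # fold the vector into the nearest cluster's centroid
                new_c = tuple((ci * n + vi) / (n + 1) for ci, vi in zip(c, vector))
                self.clusters[i] = (new_c, n + 1)
                return
        self.clusters.append((tuple(vector), 1))

    def deviation(self, vector):
        """Distance to the nearest nominal cluster."""
        return min(euclid(vector, c) for c, _ in self.clusters)

ims = SimpleIMS(radius=1.0)
for v in [(1.0, 2.0), (1.1, 2.1), (5.0, 5.0)]:
    ims.train(v)
print(ims.deviation((1.05, 2.05)) < 0.5)   # in family -> True
print(ims.deviation((9.0, 9.0)) > 3.0)     # out of family -> True
```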
Development of anomaly detection models for deep subsurface monitoring
NASA Astrophysics Data System (ADS)
Sun, A. Y.
2017-12-01
Deep subsurface repositories are used for waste disposal and carbon sequestration. Monitoring deep subsurface repositories for potential anomalies is challenging, not only because the number of sensor networks and the quality of data are often limited, but also because of the lack of labeled data needed to train and validate machine learning (ML) algorithms. Although physical simulation models may be applied to predict anomalies (or the system's nominal state for that sake), the accuracy of such predictions may be limited by inherent conceptual and parameter uncertainties. The main objective of this study was to demonstrate the potential of data-driven models for leakage detection in carbon sequestration repositories. Monitoring data collected during an artificial CO2 release test at a carbon sequestration repository were used, which include both scalar time series (pressure) and vector time series (distributed temperature sensing). For each type of data, separate online anomaly detection algorithms were developed using the baseline experiment data (no leak) and then tested on the leak experiment data. Performance of a number of different online algorithms was compared. Results show the importance of including contextual information in the dataset to mitigate the impact of reservoir noise and reduce false positive rate. The developed algorithms were integrated into a generic Web-based platform for real-time anomaly detection.
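One hedged way to picture "online detection with contextual information to mitigate reservoir noise" is to difference the monitored channel against a co-located reference channel and track the residual with an EWMA chart. This is a hypothetical scheme, not the study's actual algorithms; all parameters are invented:

```python
class ContextualEWMADetector:
    """Online residual detector: a monitored pressure signal is compared
    against a reference channel (the contextual information), and the
    residual is tracked with an exponentially weighted moving average."""

    def __init__(self, alpha=0.1, k=3.0):
        self.alpha, self.k = alpha, k
        self.mean, self.var, self.n = 0.0, 1.0, 0

    def update(self, monitored, reference):
        residual = monitored - reference    # context cancels shared noise
        if self.n < 10:                     # burn-in on baseline (no-leak) data
            self.n += 1
            self.mean += (residual - self.mean) / self.n
            return False
        z = (residual - self.mean) / (self.var ** 0.5)
        anomalous = abs(z) > self.k
        if not anomalous:                   # adapt only on nominal samples
            self.mean = (1 - self.alpha) * self.mean + self.alpha * residual
            self.var = (1 - self.alpha) * self.var + self.alpha * (residual - self.mean) ** 2
        return anomalous

det = ContextualEWMADetector()
flags = [det.update(10.0 + 0.01 * i, 10.0 + 0.01 * i) for i in range(30)]
print(any(flags))                 # shared drift is cancelled -> False
print(det.update(15.0, 10.0))     # residual jump (simulated leak) -> True
```

Because both channels drift together, the contextual subtraction keeps the false positive rate down; a detector on the raw monitored signal alone would have flagged the drift.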
Model-Biased, Data-Driven Adaptive Failure Prediction
NASA Technical Reports Server (NTRS)
Leen, Todd K.
2004-01-01
This final report, which contains a research summary and a viewgraph presentation, addresses clustering and data simulation techniques for failure prediction. The researchers applied their techniques to both helicopter gearbox anomaly detection and segmentation of Earth Observing System (EOS) satellite imagery.
NASA Astrophysics Data System (ADS)
Brax, Christoffer; Niklasson, Lars
2009-05-01
Maritime Domain Awareness (MDA) is important for both civilian and military applications. An important part of MDA is detection of unusual vessel activities such as piracy, smuggling, poaching, collisions, etc. Today's interconnected sensor systems provide huge amounts of information over large geographical areas, which can push operators past their cognitive capacity so that they start to miss important events. We propose an agent-based situation management system that automatically analyses sensor information to detect unusual activity and anomalies. The system combines knowledge-based detection with data-driven anomaly detection. The system is evaluated using information from both radar and AIS sensors.
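The hybrid idea of combining knowledge-based and data-driven detection can be sketched as below. The rule, the zone names, and the speed baseline are all invented for illustration:

```python
import statistics

# A knowledge-based rule catches a known violation, while a simple
# statistical model flags unusual behavior against a nominal baseline.
RESTRICTED_ZONES = {"harbour_approach"}

def rule_based(vessel):
    """Known-violation check, e.g. a fishing vessel in a restricted zone."""
    return vessel["zone"] in RESTRICTED_ZONES and vessel["type"] == "fishing"

def data_driven(speed, nominal_speeds, k=3.0):
    """Flag speeds more than k standard deviations from nominal traffic."""
    mu = statistics.mean(nominal_speeds)
    sigma = statistics.stdev(nominal_speeds)
    return abs(speed - mu) > k * sigma

nominal = [11.8, 12.1, 12.4, 11.9, 12.2, 12.0]   # knots, invented baseline
vessel = {"zone": "open_sea", "type": "cargo", "speed": 25.0}
print(rule_based(vessel) or data_driven(vessel["speed"], nominal))  # True
```

The point of the combination is coverage: the rule layer encodes what operators already know, while the statistical layer surfaces behaviors no rule anticipated.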
Evaluation of Anomaly Detection Capability for Ground-Based Pre-Launch Shuttle Operations. Chapter 8
NASA Technical Reports Server (NTRS)
Martin, Rodney Alexander
2010-01-01
This chapter provides a thorough end-to-end description of the process for evaluating three different data-driven algorithms for anomaly detection, in order to select the best candidate for deployment as part of a suite of IVHM (Integrated Vehicle Health Management) technologies. These algorithms were deemed sufficiently mature to be considered viable candidates for deployment in support of the maiden launch of Ares I-X, the successor to the Space Shuttle for NASA's Constellation program. Data-driven algorithms are just one of three types being deployed; the other two are a "rule-based" expert system and a "model-based" system. Within these two categories, the deployable candidates have already been selected based upon qualitative factors such as flight heritage. For the rule-based system, SHINE (Spacecraft High-speed Inference Engine) has been selected for deployment; it is a component of BEAM (Beacon-based Exception Analysis for Multimissions), a patented technology developed at NASA's JPL (Jet Propulsion Laboratory), and serves to aid in the management and identification of operational modes. For the "model-based" system, a commercially available package developed by QSI (Qualtech Systems, Inc.), TEAMS (Testability Engineering and Maintenance System), has been selected for deployment to aid in diagnosis. In the context of this particular deployment, distinctions among the use of the terms "data-driven," "rule-based," and "model-based" can be found in the cited literature. Although three different categories of algorithms have been selected for deployment, our main focus in this chapter is the evaluation of three candidates for data-driven anomaly detection. These algorithms are evaluated on their capability to robustly detect incipient faults or failures in the ground-based phase of pre-launch Space Shuttle operations, rather than on heritage as in previous studies.
Robust detection will allow pre-specified minimum false alarm and/or missed detection rates to be achieved in the selection of alert thresholds. All algorithms are also optimized with respect to an aggregation of these same criteria. Our study relies upon Shuttle data to act as a proxy for, and in preparation for application to, Ares I-X data, since Ares I-X uses a very similar hardware platform for the targeted subsystems (the TVC, or Thrust Vector Control, subsystem for the SRB (Solid Rocket Booster)).
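Selecting an alert threshold to meet a pre-specified false alarm rate, as described above, amounts to taking an empirical quantile of the anomaly scores produced on nominal data. A minimal sketch (the scores are synthetic):

```python
def alert_threshold(nominal_scores, max_false_alarm_rate):
    """Pick the smallest threshold whose empirical false-alarm rate on
    nominal data does not exceed the pre-specified maximum."""
    ranked = sorted(nominal_scores)
    # keep the bottom (1 - rate) fraction of nominal scores below threshold
    idx = int((1.0 - max_false_alarm_rate) * len(ranked))
    return ranked[min(idx, len(ranked) - 1)]

scores = [0.1 * i for i in range(100)]      # nominal anomaly scores 0.0..9.9
t = alert_threshold(scores, max_false_alarm_rate=0.05)
print(sum(s > t for s in scores) / len(scores) <= 0.05)   # True
```

Missed detection rates would be controlled the same way from scores on (simulated) failure data, and a combined objective can trade the two off.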
Asynchronous Data-Driven Classification of Weapon Systems
2009-10-01
Jin, Xin; Mukherjee, Kushal; Gupta, Shalabh; Ray, Asok; Phoha, Shashi; Damarla, Thyagaraju
Cited references include: [8] A. Ray, "Symbolic dynamic analysis of complex systems for anomaly detection," Signal Processing, vol. 84, no. 7, pp. 1115–1130, July 2004; [9] S. Gupta and A. Ray, "Symbolic dynamic filtering for data-driven pattern recognition," Pattern Recognition: Theory and Application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao
In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision making can benefit not only from known time-series relationships among measured signals but also from known event-sequence relationships among generated events. This available knowledge at both the time-series and discrete-event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing, and results are discussed.
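A hedged sketch of fusing a time-driven check (time-series residuals) with an event-driven check (legal event sequences): all signal values, event names, and the allowed-transition table are invented, and real event-driven analysis would use a formal discrete-event model rather than a pair set.

```python
# Allowed consecutive operation steps (a stand-in for an event model)
ALLOWED_TRANSITIONS = {("idle", "load"), ("load", "process"), ("process", "unload")}

def time_driven_ok(signal, expected, tol=0.5):
    """Time-series layer: measured values stay near expected values."""
    return all(abs(s - e) <= tol for s, e in zip(signal, expected))

def event_driven_ok(events):
    """Event layer: every consecutive pair of events is a legal transition."""
    return all(pair in ALLOWED_TRANSITIONS for pair in zip(events, events[1:]))

signal, expected = [1.0, 1.1, 1.2], [1.0, 1.05, 1.15]
events = ["idle", "load", "unload"]          # skips "process": illegal sequence
anomaly = not (time_driven_ok(signal, expected) and event_driven_ok(events))
print(anomaly)   # True: the event layer catches what the time layer misses
```

This illustrates why the integration helps: each layer has blind spots that the other covers.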
Conditional Outlier Detection for Clinical Alerting
Hauskrecht, Milos; Valko, Michal; Batal, Iyad; Clermont, Gilles; Visweswaran, Shyam; Cooper, Gregory F.
2010-01-01
We develop and evaluate a data-driven approach for detecting unusual (anomalous) patient-management actions using past patient cases stored in an electronic health record (EHR) system. Our hypothesis is that patient-management actions that are unusual with respect to past patients may be due to a potential error and that it is worthwhile to raise an alert if such a condition is encountered. We evaluate this hypothesis using data obtained from the electronic health records of 4,486 post-cardiac surgical patients. We base the evaluation on the opinions of a panel of experts. The results support that anomaly-based alerting can have reasonably low false alert rates and that stronger anomalies are correlated with higher alert rates. PMID:21346986
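A toy version of the conditional-outlier idea above estimates how common an action is for similar past patients and alerts when it is rare. This is a frequency-count caricature, not the paper's probabilistic model; the conditions, actions, and support threshold are invented:

```python
from collections import Counter, defaultdict

# Past EHR cases as (patient condition, management action) pairs (invented)
past_cases = [
    ("post_cardiac_surgery", "heparin"), ("post_cardiac_surgery", "heparin"),
    ("post_cardiac_surgery", "heparin"), ("post_cardiac_surgery", "aspirin"),
    ("post_cardiac_surgery", "heparin"), ("post_cardiac_surgery", "heparin"),
]

counts = defaultdict(Counter)
for condition, action in past_cases:
    counts[condition][action] += 1

def alert(condition, action, min_support=0.1):
    """Raise an alert when the action is unusual for this condition."""
    seen = counts[condition]
    total = sum(seen.values())
    return seen[action] / total < min_support

print(alert("post_cardiac_surgery", "warfarin"))   # never seen -> True
print(alert("post_cardiac_surgery", "heparin"))    # common -> False
```

Conditioning on the patient's context is the essential part: an action may be globally common yet anomalous for this kind of patient, or vice versa.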
Time-Frequency Methods for Structural Health Monitoring
Pyayt, Alexander L.; Kozionov, Alexey P.; Mokhov, Ilya I.; Lang, Bernhard; Meijer, Robert J.; Krzhizhanovskaya, Valeria V.; Sloot, Peter M. A.
2014-01-01
Detection of early warning signals for the imminent failure of large and complex engineered structures is a daunting challenge with many open research questions. In this paper we report on novel ways to perform Structural Health Monitoring (SHM) of flood protection systems (levees, earthen dikes and concrete dams) using sensor data. We present a robust data-driven anomaly detection method that combines time-frequency feature extraction, using wavelet analysis and phase shift, with one-sided classification techniques to identify the onset of failure anomalies in real-time sensor measurements. The methodology has been successfully tested at three operational sites. We detected leakage in a retaining dam (Germany) and "strange" behaviour of sensors installed in a Boston levee (UK) and a Rhine levee (Germany). PMID:24625740
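The pairing of wavelet features with one-sided (nominal-only) classification can be sketched with a single-level Haar transform, the simplest wavelet. All window values and the safety margin are invented, and a real SHM pipeline would use deeper decompositions and a trained one-class classifier:

```python
import math

def haar_level1(signal):
    """One level of the Haar wavelet transform: approximation and detail
    coefficients (the detail band carries abrupt changes)."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def detail_energy(signal):
    _, d = haar_level1(signal)
    return sum(x * x for x in d)

def one_sided_threshold(nominal_features, margin=1.5):
    """One-sided rule: anything above the worst nominal detail energy
    (times a safety margin) is labelled anomalous."""
    return margin * max(nominal_features)

nominal_windows = [[10.0, 10.1, 10.0, 9.9], [10.1, 10.0, 10.1, 10.0]]
threshold = one_sided_threshold([detail_energy(w) for w in nominal_windows])
leak_window = [10.0, 10.1, 10.0, 14.0]      # sudden jump, e.g. dam leakage
print(detail_energy(leak_window) > threshold)   # True
```

One-sided classification matters here because failure data for levees are essentially unobtainable; only normal behaviour can be modelled.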
NASA Astrophysics Data System (ADS)
Ichii, K.; Kondo, M.; Ueyama, M.; Kato, T.; Ito, A.; Sasai, T.; Sato, H.; Kobayashi, H.; Saigusa, N.
2014-12-01
Long-term records of satellite-based terrestrial vegetation observations are important for evaluating terrestrial carbon cycle models. In this study, we demonstrate how multiple satellite observations can be used to evaluate past changes in gross primary productivity (GPP) and to detect robust anomalies in the terrestrial carbon cycle in Asia through our model-data synthesis analysis, Asia-MIP. We focused on two different temporal coverages: long-term (30 years; 1982-2011) and decadal (10 years; 2001-2011; the data-intensive period) scales. We used a NOAA/AVHRR NDVI record for the long-term analysis, multiple satellite data and products (e.g. Terra-MODIS, SPOT-VEGETATION) as historical satellite data, and multiple terrestrial carbon cycle models (e.g. BEAMS, Biome-BGC, ORCHIDEE, SEIB-DGVM, and VISIT). As a result of the long-term (30-year) trend analysis, satellite-based time-series data showed that approximately 40% of the area has experienced a significant increase in the NDVI, while only a few areas have experienced a significant decreasing trend over the last 30 years. The increases in the NDVI were dominant in the sub-continental regions of Siberia, East Asia, and India. Simulations using the terrestrial biosphere models also showed significant increases in GPP, similar to the results for the NDVI, in boreal and temperate regions. A modeled sensitivity analysis showed that the increases in GPP are explained by increased temperature and precipitation in Siberia. Precipitation, solar radiation, CO2 fertilization and land cover changes are important factors in the tropical regions. However, the relative contributions of each factor to GPP changes differ among the models. Year-to-year variations of terrestrial GPP were overall consistently captured by the satellite data and terrestrial carbon cycle models when the anomalies were large (e.g. 2003 summer GPP anomalies in East Asia and 2002 spring GPP anomalies in mid to high latitudes).
The underlying mechanisms can be consistently explained by the models when the anomalies arise in low-temperature regions (e.g. spring in northern Asia). However, water-driven or radiation-driven GPP anomalies lack a consistent explanation among the models. Therefore, the terrestrial carbon cycle models require an improved representation of the sensitivity of the carbon cycle to climate anomalies.
Extending TOPS: Ontology-driven Anomaly Detection and Analysis System
NASA Astrophysics Data System (ADS)
Votava, P.; Nemani, R. R.; Michaelis, A.
2010-12-01
Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending the Terrestrial Observation and Prediction System (TOPS) to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. We can query the knowledge base and answer questions about dataset compatibilities, similarities and dependencies so that we can, for example, automatically analyze similar datasets in order to verify a given anomaly occurrence in multiple data sources. We are further extending the system to go beyond anomaly detection towards reasoning about possible causes of anomalies that are also encoded in the knowledge base as either learned or implied knowledge. This enables us to scale up the analysis by eliminating a large number of anomalies early on during the processing by either failure to verify them from other sources, or matching them directly with other observable events without having to perform an extensive and time-consuming exploration and analysis. The knowledge is captured using OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. The information is stored using Sesame server and is accessible through both Java API and web services using SeRQL and SPARQL query languages. Inference is provided using OWLIM component integrated with Sesame.
Automation for deep space vehicle monitoring
NASA Technical Reports Server (NTRS)
Schwuttke, Ursula M.
1991-01-01
Information on automation for deep space vehicle monitoring is given in viewgraph form. Information is given on automation goals and strategy; the Monitor Analyzer of Real-time Voyager Engineering Link (MARVEL); intelligent input data management; decision theory for making tradeoffs; dynamic tradeoff evaluation; evaluation of anomaly detection results; evaluation of data management methods; system level analysis with cooperating expert systems; the distributed architecture of multiple expert systems; and event driven response.
Event-driven simulation in SELMON: An overview of EDSE
NASA Technical Reports Server (NTRS)
Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.
1992-01-01
EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
Particle Filtering for Model-Based Anomaly Detection in Sensor Networks
NASA Technical Reports Server (NTRS)
Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon
2012-01-01
A novel technique has been developed for anomaly detection of rocket engine test stand (RETS) data. The objective was to develop a system that postprocesses a CSV file containing the sensor readings and activities (time-series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective and, in general, met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are iid, i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploits the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time; however, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data is a mixture of multi-variate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation Maximization (EM) algorithm.
The parameters thus learned are used for calculating the joint distribution of the observations. However, this GMM assumption is essentially an approximation and signals the potential viability of non-parametric density estimators. This is the key idea underlying the new approach.
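The GMM-fitting step via EM, mentioned above, can be illustrated in one dimension. This is a minimal sketch of the algorithm for a two-component mixture (real RETS data would be multi-variate and embedded in a linear dynamic system; the data here are synthetic):

```python
import math
import random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm_1d(data, iters=50):
    """Minimal EM for a two-component 1-D Gaussian mixture."""
    mu = [min(data), max(data)]          # crude initialization
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            w = [pi[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: re-estimate mixture weights, means and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return mu, var, pi

random.seed(0)
data = [random.gauss(0.0, 0.5) for _ in range(200)] + \
       [random.gauss(5.0, 0.5) for _ in range(200)]
mu, var, pi = em_gmm_1d(data)
print(sorted(round(m) for m in mu))   # component means near 0 and 5
```

Once the parameters are learned, the joint density of new observations under the mixture serves as the anomaly score, with low-density points flagged.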
Duarte, João V; Ribeiro, Maria J; Violante, Inês R; Cunha, Gil; Silva, Eduardo; Castelo-Branco, Miguel
2014-01-01
Neurofibromatosis Type 1 (NF1) is a common genetic condition associated with cognitive dysfunction. However, the pathophysiology of the NF1 cognitive deficits is not well understood. Abnormal brain structure, including increased total brain volume and white matter (WM) and grey matter (GM) abnormalities, has been reported in the NF1 brain. These previous studies employed univariate model-driven methods, preventing detection of subtle and spatially distributed differences in brain anatomy. Multivariate pattern analysis allows the combination of information from multiple spatial locations, yielding a discriminative power beyond that of single voxels. Here we investigated for the first time subtle anomalies in the NF1 brain using a multivariate data-driven classification approach. We used support vector machines (SVM) to classify whole-brain GM and WM segments of structural T1-weighted MRI scans from 39 participants with NF1 and 60 non-affected individuals, divided into children/adolescent and adult groups. We also employed voxel-based morphometry (VBM) as a univariate gold standard to study brain structural differences. SVM classifiers correctly classified 94% of cases (sensitivity 92%; specificity 96%), revealing the existence of brain structural anomalies that discriminate NF1 individuals from controls. Accordingly, VBM analysis revealed structural differences in agreement with the SVM weight maps representing the most relevant brain regions for group discrimination. These included the hippocampus, basal ganglia, thalamus, and visual cortex. This multivariate data-driven analysis thus identified subtle anomalies in brain structure in the absence of visible pathology. Our results provide further insight into the neuroanatomical correlates of known features of the cognitive phenotype of NF1. Copyright © 2012 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solaimani, Mohiuddin; Iftekhar, Mohammed; Khan, Latifur
Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint with higher accuracy by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
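The window-based statistical idea above, stripped to its core, compares each window of streaming metrics against baseline statistics. This is an illustrative caricature, not the paper's distributed Spark implementation; the metric values and the k parameter are invented:

```python
import statistics

def window_anomalous(window, baseline_mean, baseline_std, k=3.0):
    """Flag a whole window of streaming metrics whose mean drifts more
    than k standard errors from the baseline (a group anomaly)."""
    m = statistics.mean(window)
    stderr = baseline_std / (len(window) ** 0.5)
    return abs(m - baseline_mean) > k * stderr

baseline = [50 + (i % 5) for i in range(100)]    # nominal VM CPU usage, invented
mu, sd = statistics.mean(baseline), statistics.stdev(baseline)
stream = [[50, 52, 51, 53], [90, 92, 95, 91]]    # second window: anomalous group
print([window_anomalous(w, mu, sd) for w in stream])   # [False, True]
```

Testing whole windows rather than single samples is what lets group anomalies stand out even when each individual reading would pass a point-wise check.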
General Purpose Data-Driven Online System Health Monitoring with Applications to Space Operations
NASA Technical Reports Server (NTRS)
Iverson, David L.; Spirkovska, Lilly; Schwabacher, Mark
2010-01-01
Modern space transportation and ground support system designs are becoming increasingly sophisticated and complex. Determining the health state of these systems using traditional parameter limit checking, or model-based or rule-based methods is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults, failures, or precursors of significant failures. The Inductive Monitoring System (IMS) is a general purpose, data-driven system health monitoring software tool that has been successfully applied to several aerospace applications and is under evaluation for anomaly detection in vehicle and ground equipment for next generation launch systems. After an introduction to IMS application development, we discuss these NASA online monitoring applications, including the integration of IMS with complementary model-based and rule-based methods. Although the examples presented in this paper are from space operations applications, IMS is a general-purpose health-monitoring tool that is also applicable to power generation and transmission system monitoring.
The Condensate Database for Big Data Analysis
NASA Astrophysics Data System (ADS)
Gallaher, D. W.; Lv, Q.; Grant, G.; Campbell, G. G.; Liu, Q.
2014-12-01
Although massive amounts of cryospheric data have been and are being generated at an unprecedented rate, a vast majority of the otherwise valuable data have been "sitting in the dark", with very limited quality assurance or runtime access for higher-level data analytics such as anomaly detection. This has significantly hindered data-driven scientific discovery and advances in the polar research and Earth sciences community. In an effort to solve this problem, we have investigated and developed innovative techniques for the construction of a "condensate database", which is much smaller than the original data yet still captures the key characteristics (e.g., spatio-temporal norms and changes). In addition, we are taking advantage of parallel databases that make use of low-cost GPU processors. As a result, efficient anomaly detection and quality assurance can be achieved with in-memory data analysis or limited I/O requests. The challenges lie in the fact that cryospheric data are massive and diverse, with normal/abnormal patterns spanning a wide range of spatial and temporal scales. This project consists of investigations in three main areas: (1) adaptive neighborhood-based thresholding in both space and time; (2) compressive-domain pattern detection and change analysis; and (3) hybrid and adaptive condensation of multi-modal, multi-scale cryospheric data.
NASA Technical Reports Server (NTRS)
Park, Han G. (Inventor); Zak, Michail (Inventor); James, Mark L. (Inventor); Mackey, Ryan M. E. (Inventor)
2003-01-01
A general method of anomaly detection from time-correlated sensor data is disclosed. Multiple time-correlated signals are received. Their cross-signal behavior is compared against a fixed library of invariants. The library is constructed during a training process, which is itself data-driven using the same time-correlated signals. The method is applicable to a broad class of problems and is designed to respond to any departure from normal operation, including faults or events that lie outside the training envelope.
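One plausible toy reading of the invariant-library idea, and emphatically not the patented method itself: during training, record bounds on a simple cross-signal statistic, here the ratio of two time-correlated signals, then flag test samples whose statistic leaves the learned envelope.

```python
# Sketch: learn a cross-signal invariant on training data, flag departures.
def learn_invariant(sig_a, sig_b, margin=0.1):
    """Bound the ratio a/b observed during normal operation, plus a margin."""
    ratios = [a / b for a, b in zip(sig_a, sig_b)]
    return (min(ratios) - margin, max(ratios) + margin)

def departures(sig_a, sig_b, invariant):
    """Indices of samples whose cross-signal ratio violates the invariant."""
    lo, hi = invariant
    return [i for i, (a, b) in enumerate(zip(sig_a, sig_b))
            if not lo <= a / b <= hi]
```

A real library of invariants would hold many such learned relations and respond to any departure from the training envelope.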
FRaC: a feature-modeling approach for semi-supervised and unsupervised anomaly detection.
Noto, Keith; Brodley, Carla; Slonim, Donna
2012-01-01
Anomaly detection involves identifying rare data instances (anomalies) that come from a different class or distribution than the majority (which are simply called "normal" instances). Given a training set of only normal data, the semi-supervised anomaly detection task is to identify anomalies in the future. Good solutions to this task have applications in fraud and intrusion detection. The unsupervised anomaly detection task is different: Given unlabeled, mostly-normal data, identify the anomalies among them. Many real-world machine learning tasks, including many fraud and intrusion detection tasks, are unsupervised because it is impractical (or impossible) to verify all of the training data. We recently presented FRaC, a new approach for semi-supervised anomaly detection. FRaC is based on using normal instances to build an ensemble of feature models, and then identifying instances that disagree with those models as anomalous. In this paper, we investigate the behavior of FRaC experimentally and explain why FRaC is so successful. We also show that FRaC is a superior approach for the unsupervised as well as the semi-supervised anomaly detection task, compared to well-known state-of-the-art anomaly detection methods, LOF and one-class support vector machines, and to an existing feature-modeling approach.
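A heavily simplified FRaC-style sketch for two numeric features: each feature gets a least-squares linear predictor from the other, fitted on normal data, and an instance's anomaly score is its summed squared residual in units of each model's residual spread. The real FRaC builds an ensemble of learned feature models and uses a surprisal-based score; everything below is our illustrative reduction.

```python
# Sketch: score instances by how much they disagree with per-feature models.
from statistics import mean, stdev

def fit_feature_model(xs, ys):
    """Least-squares line predicting ys from xs, plus residual spread."""
    mx, my = mean(xs), mean(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    return slope, intercept, max(stdev(resid), 1e-9)

def frac_score(instance, models):
    """Sum of squared standardized residuals over all feature models."""
    score = 0.0
    for target, predictor, (slope, intercept, sd) in models:
        pred = slope * instance[predictor] + intercept
        score += ((instance[target] - pred) / sd) ** 2
    return score
```

Instances consistent with the normal-data feature models score near zero; instances that disagree with them score high and are ranked as anomalies.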
Machine learning for the automatic detection of anomalous events
NASA Astrophysics Data System (ADS)
Fisher, Wendy D.
In this dissertation, we describe our research contributions for a novel approach to the application of machine learning for the automatic detection of anomalous events. We work in two different domains to ensure a robust data-driven workflow that could be generalized for monitoring other systems. Specifically, in our first domain, we begin with the identification of internal erosion events in earth dams and levees (EDLs) using geophysical data collected from sensors located on the surface of the levee. As EDLs across the globe reach the end of their design lives, effectively monitoring their structural integrity is of critical importance. The second domain of interest is related to mobile telecommunications, where we investigate a system for automatically detecting non-commercial base station routers (BSRs) operating in protected frequency space. The presence of non-commercial BSRs can disrupt the connectivity of end users, cause service issues for the commercial providers, and introduce significant security concerns. We provide our motivation, experimentation, and results from investigating a generalized novel data-driven workflow using several machine learning techniques. In Chapter 2, we present results from our performance study that uses popular unsupervised clustering algorithms to gain insights to our real-world problems, and evaluate our results using internal and external validation techniques. Using EDL passive seismic data from an experimental laboratory earth embankment, results consistently show a clear separation of events from non-events in four of the five clustering algorithms applied. Chapter 3 uses a multivariate Gaussian machine learning model to identify anomalies in our experimental data sets. For the EDL work, we used experimental data from two different laboratory earth embankments. Additionally, we explore five wavelet transform methods for signal denoising. The best performance is achieved with the Haar wavelets. 
We achieve up to 97.3% overall accuracy and less than 1.4% false negatives in anomaly detection. In Chapter 4, we investigate using two-class and one-class support vector machines (SVMs) for an effective anomaly detection system. We again use the two different EDL data sets from experimental laboratory earth embankments (each having approximately 80% normal and 20% anomalies) to ensure our workflow is robust enough to work with multiple data sets and different types of anomalous events (e.g., cracks and piping). We apply Haar wavelet-denoising techniques and extract nine spectral features from decomposed segments of the time series data. The two-class SVM with 10-fold cross validation achieved over 94% overall accuracy and a 96% F1-score. Our approach provides a means for automatically identifying anomalous events using various machine learning techniques. Detecting internal erosion events in aging EDLs, earlier than is currently possible, can allow more time to prevent or mitigate catastrophic failures. Results show that we can successfully separate normal from anomalous data observations in passive seismic data, and provide a step towards techniques for continuous real-time monitoring of EDL health. Our lightweight non-commercial BSR detection system also shows promise in separating commercial from non-commercial BSR scans without the need for prior geographic location information, extensive time-lapse surveys, or a database of known commercial carriers. (Abstract shortened by ProQuest.)
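The Haar wavelet denoising step mentioned above can be illustrated with a one-level transform: average/difference adjacent samples, shrink the detail coefficients, and invert. The soft-thresholding rule and the threshold value are our choices; the dissertation's multi-level pipeline is more involved.

```python
# Sketch: one-level Haar wavelet denoising of an even-length signal.
def haar_denoise(signal, threshold):
    """One-level Haar transform, soft-threshold the details, invert."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    # Soft thresholding: shrink each detail coefficient toward zero.
    shrunk = [max(abs(d) - threshold, 0.0) * (1 if d >= 0 else -1)
              for d in detail]
    out = []
    for a, d in zip(approx, shrunk):
        out.extend([a + d, a - d])  # inverse Haar step
    return out
```

Small oscillations (details below the threshold) are flattened while large transient structure, the kind a downstream detector should see, is preserved.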
A robust background regression based score estimation algorithm for hyperspectral anomaly detection
NASA Astrophysics Data System (ADS)
Zhao, Rui; Du, Bo; Zhang, Liangpei; Zhang, Lefei
2016-12-01
Anomaly detection has become a hot topic in the hyperspectral image analysis and processing fields in recent years. The most important issue for hyperspectral anomaly detection is the background estimation and suppression. Unreasonable or non-robust background estimation usually leads to unsatisfactory anomaly detection results. Furthermore, the inherent nonlinearity of hyperspectral images may cover up the intrinsic data structure in the anomaly detection. In order to implement robust background estimation, as well as to explore the intrinsic data structure of the hyperspectral image, we propose a robust background regression based score estimation algorithm (RBRSE) for hyperspectral anomaly detection. The Robust Background Regression (RBR) is actually a label assignment procedure which segments the hyperspectral data into a robust background dataset and a potential anomaly dataset with an intersection boundary. In the RBR, a kernel expansion technique, which explores the nonlinear structure of the hyperspectral data in a reproducing kernel Hilbert space, is utilized to formulate the data as a density feature representation. A minimum squared loss relationship is constructed between the data density feature and the corresponding assigned labels of the hyperspectral data, to formulate the foundation of the regression. Furthermore, a manifold regularization term which explores the manifold smoothness of the hyperspectral data, and a maximization term of the robust background average density, which suppresses the bias caused by the potential anomalies, are jointly appended in the RBR procedure. After this, a paired-dataset based k-nn score estimation method is undertaken on the robust background and potential anomaly datasets, to implement the detection output. 
The experimental results show that RBRSE achieves better ROC curves, AUC values, and background-anomaly separation than some other state-of-the-art anomaly detection methods, and is easy to implement in practice.
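The AUC comparison used above has a direct interpretation that is easy to compute from raw detector scores: the probability that a randomly chosen anomaly outscores a randomly chosen background sample, with ties counted as one half. A minimal sketch:

```python
# Sketch: AUC as the probability an anomaly outscores a background sample.
def auc(background_scores, anomaly_scores):
    wins = sum((a > b) + 0.5 * (a == b)
               for a in anomaly_scores for b in background_scores)
    return wins / (len(anomaly_scores) * len(background_scores))
```

An AUC of 1.0 means perfect background-anomaly separation; 0.5 means the detector is no better than chance.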
NASA Astrophysics Data System (ADS)
Scozzari, Andrea; Doveri, Marco
2015-04-01
Knowledge of the physical and chemical processes involved in the exploitation of water bodies for human consumption is an essential tool for optimising the monitoring infrastructure. Due to their increasing importance in the context of human consumption (at least in the EU), this work focuses on groundwater resources. In the framework of drinkable water networks, physical and data-driven modelling of transport phenomena in groundwater can help optimise the sensor network and validate the acquired data. This work proposes the combined usage of physical and data-driven modelling to support the design of, and maximise the results from, a network of distributed sensors. In particular, the validation of physico-chemical measurements and the detection of possible anomalies by a set of continuous measurements benefit from knowledge of the domain from which water is abstracted and its expected characteristics. Change-detection techniques based on non-specific sensors (described in an extensive literature over the last two decades) have to deal with the classical issues of maximising correct detections and minimising false alarms, the latter being the most typical problem to be faced when designing truly applicable monitoring systems. In this context, defining an "anomaly" in terms of distance from an expected value or feature characterising the quality of water implies the definition of a suitable metric and knowledge of the physical and chemical peculiarities of the natural domain from which water is exploited, with its implications for the characteristics of the water resource.
Detecting Biosphere anomalies hotspots
NASA Astrophysics Data System (ADS)
Guanche-Garcia, Yanira; Mahecha, Miguel; Flach, Milan; Denzler, Joachim
2017-04-01
The current amount of satellite remote sensing measurements available allows for applying data-driven methods to investigate environmental processes. The detection of anomalies or abnormal events is crucial to monitoring the Earth system and to analyzing their impacts on ecosystems and society. By means of a combination of statistical methods, this study proposes an intuitive and efficient methodology to detect those areas that present hotspots of anomalies, i.e. higher levels of abnormal or extreme events or more severe phases during our historical records. Biosphere variables from a preliminary version of the Earth System Data Cube developed within the CAB-LAB project (http://earthsystemdatacube.net/) have been used in this study. This database comprises several atmosphere and biosphere variables spanning 11 years (2001-2011) with 8-day temporal resolution and 0.25° global spatial resolution. In this study, we have used 10 variables that measure the biosphere. The methodology applied to detect abnormal events follows the intuitive idea that anomalies are assumed to be time steps that are not well represented by a previously estimated statistical model [1]. We combine the use of Autoregressive Moving Average (ARMA) models with a distance metric such as the Mahalanobis distance to detect abnormal events in multiple biosphere variables. In a first step we pre-treat the variables by removing the seasonality and normalizing them locally (μ=0, σ=1). Additionally, we have regionalized the area of study into subregions of similar climate conditions, using the Köppen climate classification. For each climate region and variable we have selected the best ARMA parameters by means of a Bayesian criterion. We have then obtained the residuals by comparing the fitted models with the original data.
To detect the extreme residuals from the 10 variables, we have computed the Mahalanobis distance to the data's mean (Hotelling's T^2), which considers the covariance matrix of the joint distribution. The proposed methodology has been applied to different areas around the globe. The results show that the method is able to detect historic events and also provides a useful tool to define sensitive regions. This method and results have been developed within the framework of the project BACI (http://baci-h2020.eu/), which aims to integrate Earth Observation data to monitor the Earth system and to assess the impacts of terrestrial changes. [1] V. Chandola, A. Banerjee and V. Kumar. Anomaly detection: a survey. ACM Computing Surveys (CSUR), vol. 41, no. 3, 2009. [2] P. Mahalanobis. On the generalised distance in statistics. Proceedings of the National Institute of Sciences, vol. 2, pp. 49-55, 1936.
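The residual pipeline described above might be sketched, for two variables, as an AR(1) fit per variable followed by a bivariate Mahalanobis distance on the residuals. The authors fit full ARMA models selected per climate region; the AR(1) choice, the explicit 2x2 covariance inverse, and the regularisation constants are our simplifications.

```python
# Sketch: AR(1) residuals per variable, then bivariate Mahalanobis distance.
from statistics import mean

def ar1_residuals(x):
    """Fit x[t] ~ phi * x[t-1] by least squares; return the residuals."""
    num = sum(x[t - 1] * x[t] for t in range(1, len(x)))
    den = sum(v * v for v in x[:-1]) or 1e-9
    phi = num / den
    return [x[t] - phi * x[t - 1] for t in range(1, len(x))]

def mahalanobis2(r1, r2):
    """Mahalanobis distance of each residual pair from the joint mean."""
    m1, m2 = mean(r1), mean(r2)
    n = len(r1)
    s11 = sum((a - m1) ** 2 for a in r1) / n + 1e-9
    s22 = sum((b - m2) ** 2 for b in r2) / n + 1e-9
    s12 = sum((a - m1) * (b - m2) for a, b in zip(r1, r2)) / n
    det = s11 * s22 - s12 * s12  # 2x2 covariance determinant
    return [(((a - m1) ** 2 * s22 - 2 * (a - m1) * (b - m2) * s12
              + (b - m2) ** 2 * s11) / det) ** 0.5
            for a, b in zip(r1, r2)]
```

Time steps with the largest distances are the candidate abnormal events.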
Seismic data fusion anomaly detection
NASA Astrophysics Data System (ADS)
Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David
2014-06-01
Detecting anomalies in non-stationary signals has valuable applications in many fields, including medicine and meteorology, such as identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes via seismographic data. Given the many available anomaly detection algorithms, it is important to compare candidate methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach uses an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives", or transformations, of the observed signal for anomalies. Possible perspectives include wavelet de-noising, the Fourier transform, peak-filtering, etc. In order to evaluate these techniques via signal fusion metrics, we must apply signal preprocessing techniques such as de-noising methods to the original signal and then use a neural network to find anomalies in the generated signal. From this secondary result it is possible to use data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The result will show which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.
Conditional anomaly detection methods for patient-management alert systems
Valko, Michal; Cooper, Gregory; Seybert, Amy; Visweswaran, Shyam; Saul, Melissa; Hauskrecht, Milos
2010-01-01
Anomaly detection methods can be very useful in identifying unusual or interesting patterns in data. A recently proposed conditional anomaly detection framework extends anomaly detection to the problem of identifying anomalous patterns on a subset of attributes in the data. The anomaly always depends on (is conditioned by) the value of the remaining attributes. The work presented in this paper focuses on instance-based methods for detecting conditional anomalies. The methods rely on a distance metric to identify the examples in the dataset that are most critical for detecting the anomaly. We investigate various metrics and metric learning methods to optimize the performance of the instance-based anomaly detection methods. We show the benefits of the instance-based methods on two real-world detection problems: detection of unusual admission decisions for patients with community-acquired pneumonia, and detection of unusual orders of an HPF4 test that is used to confirm heparin-induced thrombocytopenia, a life-threatening condition caused by heparin therapy. PMID:25392850
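An instance-based conditional anomaly score of the kind discussed above might, in its simplest form, look like this: the score of a case's outcome is one minus the fraction of its k nearest neighbours (by distance on the conditioning attributes) that share that outcome. Euclidean distance and majority agreement are our stand-ins for the learned metrics the paper investigates.

```python
# Sketch: conditional anomaly score of an outcome given context attributes.
def conditional_anomaly_score(case_attrs, case_outcome, data, k=3):
    """data: list of (attributes_tuple, outcome) records."""
    dist = lambda u, v: sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
    neighbours = sorted(data, key=lambda rec: dist(case_attrs, rec[0]))[:k]
    agree = sum(1 for attrs, outcome in neighbours if outcome == case_outcome)
    return 1.0 - agree / k
```

A decision that contradicts all of its most similar past cases scores 1.0 and would trigger an alert for review.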
Quantum machine learning for quantum anomaly detection
NASA Astrophysics Data System (ADS)
Liu, Nana; Rebentrost, Patrick
2018-04-01
Anomaly detection is used for identifying data that deviate from "normal" data patterns. Its usage on classical data finds diverse applications in many important areas such as finance, fraud detection, medical diagnoses, data cleaning, and surveillance. With the advent of quantum technologies, anomaly detection of quantum data, in the form of quantum states, may become an important component of quantum applications. Machine-learning algorithms are playing pivotal roles in anomaly detection using classical data. Two widely used algorithms are the kernel principal component analysis and the one-class support vector machine. We find corresponding quantum algorithms to detect anomalies in quantum states. We show that these two quantum algorithms can be performed using resources that are logarithmic in the dimensionality of quantum states. For pure quantum states, these resources can also be logarithmic in the number of quantum states used for training the machine-learning algorithm. This makes these algorithms potentially applicable to big quantum data applications.
Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.
Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda
2014-05-01
We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggests effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus, which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of an anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability and underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is applied here to the problem of anomaly detection for a video annotation system.
22nd Annual Logistics Conference and Exhibition
2006-04-20
Prognostics & Health Management at GE. Dr. Piero P. Bonissone, Industrial AI Lab, GE Global Research. The recoverable slide fragments cover anomaly detection from event-log data, failure-mode histograms, and diagnostics, prognostics, and health management for tactical C4ISR sense-and-respond systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humberto E. Garcia
This paper illustrates the safeguards benefits that process monitoring (PM) can have as a diversion deterrent and as a complementary safeguards measure to nuclear material accountancy (NMA). Whereas the objective of NMA-based methods, in order to infer the possible existence of proliferation-driven activities, is often to statistically evaluate materials unaccounted for (MUF), computed by solving a mass balance equation for a material balance area (MBA) at every material balance period (MBP), a particular objective for a PM-based approach may be to statistically infer and evaluate anomalies unaccounted for (AUF) that may have occurred within an MBP. Although possibly indicative of proliferation-driven activities, the detection and tracking of anomaly patterns is not trivial because some executed events may be unobservable or observed less reliably than others. The proposed similarity between NMA- and PM-based approaches is important because performance metrics utilized for evaluating NMA-based methods, such as detection probability (DP) and false alarm probability (FAP), can also be applied to assess PM-based safeguards solutions. To this end, AUF count estimates can be translated into significant quantity (SQ) equivalents that may have been diverted within a given MBP. A diversion alarm is reported if this mass estimate is greater than or equal to the selected alarm level (AL), chosen to optimize DP and FAP based on the particular characteristics of the monitored MBA, the sensors utilized, and the data processing method employed for integrating and analyzing collected measurements. To illustrate the application of the proposed PM approach, a protracted diversion of Pu in a waste stream was selected, based on incomplete fuel dissolution in a dissolver unit operation, as this diversion scenario is considered problematic for detection using NMA-based methods alone.
Results demonstrate the benefits of conducting PM under a system-centric strategy that utilizes data collected from a system of sensors and that effectively exploits known characterizations of sensors and facility operations in order to significantly improve anomaly detection, reduce false alarms, and enhance assessment robustness under unreliable partial sensor information.
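The alarm logic described above reduces to a simple thresholding scheme that can be sketched as follows. All names, data shapes, and the per-run "alarm if any period trips" aggregation are illustrative assumptions, not the paper's implementation.

```python
# Sketch: per-period diversion alarms and DP/FAP estimation from labelled runs.
def alarms(mass_estimates, alarm_level):
    """One boolean per balance period: SQ-equivalent estimate >= AL?"""
    return [m >= alarm_level for m in mass_estimates]

def dp_fap(diversion_runs, clean_runs, alarm_level):
    """Detection probability over diversion runs, false alarm prob. over clean runs."""
    dp = (sum(any(alarms(r, alarm_level)) for r in diversion_runs)
          / len(diversion_runs))
    fap = (sum(any(alarms(r, alarm_level)) for r in clean_runs)
           / len(clean_runs))
    return dp, fap
```

Sweeping the alarm level then traces out the DP/FAP trade-off the abstract describes optimizing for a given MBA.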
Unsupervised Ensemble Anomaly Detection Using Time-Periodic Packet Sampling
NASA Astrophysics Data System (ADS)
Uchida, Masato; Nawata, Shuichi; Gu, Yu; Tsuru, Masato; Oie, Yuji
We propose an anomaly detection method for finding patterns in network traffic that do not conform to legitimate (i.e., normal) behavior. The proposed method trains a baseline model describing the normal behavior of network traffic without using manually labeled traffic data. The trained baseline model is used as the basis for comparison with the audit network traffic. This anomaly detection works in an unsupervised manner through the use of time-periodic packet sampling, which is used in a manner that differs from its intended purpose — the lossy nature of packet sampling is used to extract normal packets from the unlabeled original traffic data. Evaluation using actual traffic traces showed that the proposed method has false positive and false negative rates in the detection of anomalies regarding TCP SYN packets comparable to those of a conventional method that uses manually labeled traffic data to train the baseline model. Performance variation due to the probabilistic nature of sampled traffic data is mitigated by using ensemble anomaly detection that collectively exploits multiple baseline models in parallel. Alarm sensitivity is adjusted for the intended use by using maximum- and minimum-based anomaly detection that effectively take advantage of the performance variations among the multiple baseline models. Testing using actual traffic traces showed that the proposed anomaly detection method performs as well as one using manually labeled traffic data and better than one using randomly sampled (unlabeled) traffic data.
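A toy rendering of the ensemble idea above, under our simplifications: each baseline model is just a rate threshold learned from one time-periodic subsample of unlabeled traffic, and the max/min labels reflect one plausible reading of the conservative versus sensitive aggregation the abstract describes.

```python
# Sketch: ensemble of baseline thresholds from time-periodic subsamples.
def baselines(counts, n_models, period):
    """Train n_models thresholds from offset subsamples (n_models <= period)."""
    models = []
    for offset in range(n_models):
        sample = counts[offset::period]
        mu = sum(sample) / len(sample)
        models.append(mu * 2.0)  # threshold: twice the sampled mean rate
    return models

def max_based_alarm(observed, models):
    """Conservative: every baseline model must flag the observation."""
    return all(observed > t for t in models)

def min_based_alarm(observed, models):
    """Sensitive: a single baseline model flagging it suffices."""
    return any(observed > t for t in models)
```

Using several subsample-trained models in parallel smooths out the run-to-run variation that a single probabilistically sampled baseline would exhibit.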
On the detection and attribution of gravity waves generated by the 20 March 2015 solar eclipse
2016-01-01
Internal gravity waves are generated as adjustment radiation whenever a sudden change in forcing causes the atmosphere to depart from its large-scale balanced state. Such a forcing anomaly occurs during a solar eclipse, when the Moon’s shadow cools part of the Earth’s surface. The resulting atmospheric gravity waves are associated with pressure and temperature perturbations, which in principle are detectable both at the surface and aloft. In this study, surface pressure and temperature data from two UK sites at Reading and Lerwick are examined for eclipse-driven gravity wave perturbations during the 20 March 2015 solar eclipse over northwest Europe. Radiosonde wind data from the same two sites are also analysed using a moving parcel analysis method, to determine the periodicities of the waves aloft. On this occasion, the perturbations both at the surface and aloft are found not to be confidently attributable to eclipse-driven gravity waves. We conclude that the complex synoptic weather conditions over the UK at the time of this particular eclipse helped to mask any eclipse-driven gravity waves. This article is part of the themed issue ‘Atmospheric effects of solar eclipses stimulated by the 2015 UK eclipse’. PMID:27550763
A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data.
Goldstein, Markus; Uchida, Seiichi
2016-01-01
Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied to unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection and fraud detection, as well as in the life science and medical domains. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. In conclusion, we give advice on algorithm selection for typical real-world tasks.
Clustering and Recurring Anomaly Identification: Recurring Anomaly Detection System (ReADS)
NASA Technical Reports Server (NTRS)
McIntosh, Dawn
2006-01-01
This viewgraph presentation reviews the Recurring Anomaly Detection System (ReADS), a tool to analyze text reports such as aviation reports and maintenance records: (1) text clustering algorithms group large quantities of reports and documents, reducing human error and fatigue; (2) it identifies interconnected reports, automating the discovery of possible recurring anomalies; and (3) it provides a visualization of the clusters and recurring anomalies. We have illustrated our techniques on data from Shuttle and ISS discrepancy reports, as well as ASRS data. ReADS has been integrated with a secure online search
Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines
Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu
2016-01-01
In a complex system, condition monitoring (CM) can collect the system's working status. The condition is mainly sensed by the pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict upcoming anomalies, faults, or failures. There is also some research which focuses on the faults or anomalies of the sensing element (i.e., the sensor) to enhance system reliability. However, existing approaches ignore the correlation between the sensor selection strategy and data anomaly detection, which can also improve system reliability. To address this issue, we study a new scheme which includes a sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets utilized to carry out the evaluation were provided by the National Aeronautics and Space Administration (NASA) Ames Research Center and were used as the Prognostics and Health Management (PHM) challenge data in 2008. By comparing two different sensor selection strategies, the effectiveness of the selection method for data anomaly detection is demonstrated. PMID:27136561
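The mutual-information weighting step can be illustrated directly from joint histogram counts of two discretised sensor streams (the discretisation into bins and the log base are our choices; the paper pairs this with GPR, which is omitted here).

```python
# Sketch: mutual information between two discretised sensor streams,
# estimated from empirical joint and marginal histograms.
from math import log2
from collections import Counter

def mutual_information(xs, ys):
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum(c / n * log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())
```

Sensor pairs with high mutual information are strongly correlated, so a reading that breaks an otherwise high-MI relationship is a candidate data anomaly.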
2015-06-09
anomaly detection, which is generally considered part of high-level information fusion (HLIF) involving temporal-geospatial data as well as meta-data... Anomaly detection in the maritime defence and security domain typically focuses on trying to identify vessels that are behaving in an unusual... manner compared with lawful vessels operating in the area – an applied case of target detection among distractors. Anomaly detection is a complex problem
Medical sieve: a cognitive assistant for radiologists and cardiologists
NASA Astrophysics Data System (ADS)
Syeda-Mahmood, T.; Walach, E.; Beymer, D.; Gilboa-Solomon, F.; Moradi, M.; Kisilev, P.; Kakrania, D.; Compas, C.; Wang, H.; Negahdar, R.; Cao, Y.; Baldwin, T.; Guo, Y.; Gur, Y.; Rajan, D.; Zlotnick, A.; Rabinovici-Cohen, S.; Ben-Ari, R.; Guy, Amit; Prasanna, P.; Morey, J.; Boyko, O.; Hashoul, S.
2016-03-01
Radiologists and cardiologists today have to view large amounts of imaging data relatively quickly, leading to eye fatigue. Further, they have only limited access to clinical information, relying mostly on their visual interpretation of imaging studies for their diagnostic decisions. In this paper, we present Medical Sieve, an automated cognitive assistant for radiologists and cardiologists designed to help in their clinical decision-making. The sieve is a clinical informatics system that collects clinical, textual, and imaging data of patients from electronic health record systems. It then analyzes the multimodal content to detect any anomalies and summarizes the patient record, collecting all relevant information pertinent to a chief complaint. The results of anomaly detection are then fed into a reasoning engine, which uses evidence from both patient-independent clinical knowledge and large-scale patient-driven statistics of similar patients to arrive at a potential differential diagnosis to aid clinical decision making. While compactly summarizing all relevant information for the clinician per chief complaint, the system still retains links to the raw data for detailed review, providing holistic summaries of patient conditions. Results of clinical studies in the domains of cardiology and breast radiology have already shown the promise of the system in differential diagnosis and imaging study summarization.
A novel approach for detection of anomalies using measurement data of the Ironton-Russell bridge
NASA Astrophysics Data System (ADS)
Zhang, Fan; Norouzi, Mehdi; Hunt, Victor; Helmicki, Arthur
2015-04-01
Data models have been increasingly used in recent years for documenting the normal behavior of structures and hence detecting and classifying anomalies. A large number of machine learning algorithms have been proposed by various researchers to model operational and functional changes in structures; however, only a limited number of studies have been applied to actual measurement data, due to limited access to long-term measurement data of structures and lack of access to their damaged states. By monitoring the structure during construction and reviewing the effect of construction events on the measurement data, this study introduces a new approach to detect and eventually classify anomalies both during and after construction. First, the implementation procedure of the sensory network, which develops while the bridge is being built, and its current status will be detailed. Second, the proposed anomaly detection algorithm will be applied to the collected data, and finally, detected anomalies will be validated against the archived construction events.
Latent Space Tracking from Heterogeneous Data with an Application for Anomaly Detection
2015-11-01
specific, if the anomaly behaves as a sudden outlier after which the data stream goes back to normal state, then the anomalous data point should be... introduced three types of anomalies, all of them sudden outliers.
Huang, Jiaji; Ning, Xia
A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data
Goldstein, Markus; Uchida, Seiichi
2016-01-01
Anomaly detection is the process of identifying unexpected items or events in datasets which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied on unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection, as well as in the life science and medical domain. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, computational effort, the impact of parameter settings, as well as the global/local anomaly detection behavior are outlined. In conclusion, we offer advice on algorithm selection for typical real-world tasks. PMID:27093601
A Hybrid Semi-Supervised Anomaly Detection Model for High-Dimensional Data.
Song, Hongchao; Jiang, Zhuqing; Men, Aidong; Yang, Bo
2017-01-01
Anomaly detection, which aims to identify observations that deviate from a nominal sample, is a challenging task for high-dimensional data. Traditional distance-based anomaly detection methods compute the neighborhood distance between each observation and suffer from the curse of dimensionality in high-dimensional space; for example, the distances between any pair of samples are similar and each sample may perform like an outlier. In this paper, we propose a hybrid semi-supervised anomaly detection model for high-dimensional data that consists of two parts: a deep autoencoder (DAE) and an ensemble k-nearest neighbor graphs- (K-NNG-) based anomaly detector. Benefiting from the ability of nonlinear mapping, the DAE is first trained to learn the intrinsic features of a high-dimensional dataset to represent the high-dimensional data in a more compact subspace. Several nonparametric KNN-based anomaly detectors are then built from different subsets that are randomly sampled from the whole dataset. The final prediction is made by all the anomaly detectors. The performance of the proposed method is evaluated on several real-life datasets, and the results confirm that the proposed hybrid model improves the detection accuracy and reduces the computational complexity.
PMID:29270197
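The hybrid model's second stage, an ensemble of KNN-based detectors built on randomly sampled subsets, can be sketched as follows; the DAE feature-learning stage is omitted for brevity, and all sizes and data here are illustrative assumptions:

```python
import numpy as np

def knn_score(train, queries, k=5):
    """Anomaly score = distance to the k-th nearest training point."""
    d = np.linalg.norm(queries[:, None, :] - train[None, :, :], axis=2)
    return np.sort(d, axis=1)[:, k - 1]

def ensemble_knn_score(train, queries, n_detectors=10, subset=200, k=5, seed=0):
    """Average the k-NN score over detectors built on random subsets
    of the training data, mimicking the ensemble step described above."""
    rng = np.random.default_rng(seed)
    scores = [knn_score(train[rng.choice(len(train), subset, replace=False)],
                        queries, k)
              for _ in range(n_detectors)]
    return np.mean(scores, axis=0)

rng = np.random.default_rng(1)
normal = rng.normal(0, 1, size=(1000, 3))     # nominal sample
inlier = rng.normal(0, 1, size=(5, 3))        # drawn from the same cloud
outlier = np.full((5, 3), 8.0)                # far from the nominal cloud
s_in = ensemble_knn_score(normal, inlier)
s_out = ensemble_knn_score(normal, outlier)
```

Points far from the nominal sample receive much larger ensemble scores than points drawn from it, which is the signal the final prediction aggregates.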
Pre-seismic anomalies from optical satellite observations: a review
NASA Astrophysics Data System (ADS)
Jiao, Zhong-Hu; Zhao, Jing; Shan, Xinjian
2018-04-01
Detecting various anomalies using optical satellite data prior to strong earthquakes is key to understanding and forecasting earthquake activities because of its recognition of thermal-radiation-related phenomena in seismic preparation phases. Data from satellite observations serve as a powerful tool in monitoring earthquake preparation areas at a global scale and in a nearly real-time manner. Over the past several decades, many new different data sources have been utilized in this field, and progressive anomaly detection approaches have been developed. This paper reviews the progress and development of pre-seismic anomaly detection technology in this decade. First, precursor parameters, including parameters from the top of the atmosphere, in the atmosphere, and on the Earth's surface, are stated and discussed. Second, different anomaly detection methods, which are used to extract anomalous signals that probably indicate future seismic events, are presented. Finally, certain critical problems with the current research are highlighted, and new developing trends and perspectives for future work are discussed. The development of Earth observation satellites and anomaly detection algorithms can enrich available information sources, provide advanced tools for multilevel earthquake monitoring, and improve short- and medium-term forecasting, which play a large and growing role in pre-seismic anomaly detection research.
Causes of Upper-Ocean Temperature Anomalies in the Tropical North Atlantic
NASA Astrophysics Data System (ADS)
Rugg, A.; Foltz, G. R.; Perez, R. C.
2016-02-01
Hurricane activity and regional rainfall are strongly impacted by upper ocean conditions in the tropical North Atlantic, defined as the region between the equator and 20°N. A previous study analyzed a strong cold sea surface temperature (SST) anomaly that developed in this region during early 2009 and was recorded by the Pilot Research Array in the Tropical Atlantic (PIRATA) moored buoy at 4°N, 23°W (Foltz et al. 2012). The same mooring shows a similar cold anomaly in the spring of 2015 as well as a strong warm anomaly in 2010, offering the opportunity for a more comprehensive analysis of the causes of these events. In this study we examine the main causes of the observed temperature anomalies between 1998 and 2015. Basin-scale conditions during these events are analyzed using satellite SST, wind, and rain data, as well as temperature and salinity profiles from the NCEP Global Ocean Data Assimilation System. A more detailed analysis is conducted using ten years of direct measurements from the PIRATA mooring at 4°N, 23°W. Results show that the cooling and warming anomalies were caused primarily by wind-driven changes in surface evaporative cooling, mixed layer depth, and upper-ocean vertical velocity. Anomalies in surface solar radiation acted to damp the wind-driven SST anomalies in the latitude bands of the ITCZ (3°-8°N). Basin-scale analyses also suggest a strong connection between the observed SST anomalies and the Atlantic Meridional Mode, a well-known pattern of SST and surface wind anomalies spanning the tropical Atlantic.
NASA Technical Reports Server (NTRS)
Lo, C. F.; Wu, K.; Whitehead, B. A.
1993-01-01
Statistical and neural network methods have been applied to investigate the feasibility of detecting anomalies in turbopump vibration of the SSME. The anomalies are detected based on the amplitude of peaks of fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods are able to detect the vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank, and is applicable for on-line operation. The neural network method likewise needs enough data to train the networks. The testing procedure can be utilized at any time so long as the characteristics of the components remain unchanged.
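The peak-amplitude criterion described above can be sketched with a discrete Fourier transform plus a simple n-sigma band learned from healthy runs. The sampling rate, harmonic frequencies, and thresholds below are illustrative assumptions, not values from the SSME study:

```python
import numpy as np

def psd_peak_amplitudes(signal, fs, freqs, width=2.0):
    """Peak power-spectral-density value near each frequency of
    interest (e.g., the fundamental and its harmonics)."""
    spec = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    f = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return np.array([spec[(f >= fc - width) & (f <= fc + width)].max()
                     for fc in freqs])

def is_anomalous(peaks, baseline_mean, baseline_std, n_sigma=3.0):
    """Flag a reading whose peak at any harmonic leaves the n-sigma
    band established from healthy baseline runs."""
    return bool(np.any(np.abs(peaks - baseline_mean) > n_sigma * baseline_std))

# Build a baseline "distribution data bank" from noisy healthy runs
# (hypothetical 50 Hz fundamental with a 100 Hz harmonic).
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(2)
harmonics = [50.0, 100.0]
runs = [psd_peak_amplitudes(np.sin(2 * np.pi * 50 * t)
                            + 0.3 * np.sin(2 * np.pi * 100 * t)
                            + 0.05 * rng.normal(size=t.size), fs, harmonics)
        for _ in range(20)]
mu, sd = np.mean(runs, axis=0), np.std(runs, axis=0)

# A run with an abnormally large fundamental peak should be flagged.
bad = psd_peak_amplitudes(3.0 * np.sin(2 * np.pi * 50 * t)
                          + 0.3 * np.sin(2 * np.pi * 100 * t), fs, harmonics)
```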
Real-time Bayesian anomaly detection in streaming environmental data
NASA Astrophysics Data System (ADS)
Hill, David J.; Minsker, Barbara S.; Amir, Eyal
2009-04-01
With large volumes of data arriving in near real time from environmental sensors, there is a need for automated detection of anomalous data caused by sensor or transmission errors or by infrequent system behaviors. This study develops and evaluates three automated anomaly detection methods using dynamic Bayesian networks (DBNs), which perform fast, incremental evaluation of data as they become available, scale to large quantities of data, and require no a priori information regarding process variables or types of anomalies that may be encountered. This study investigates these methods' abilities to identify anomalies in eight meteorological data streams from Corpus Christi, Texas. The results indicate that DBN-based detectors, using either robust Kalman filtering or Rao-Blackwellized particle filtering, outperform a DBN-based detector using Kalman filtering, with the former having false positive/negative rates of less than 2%. These methods were successful at identifying data anomalies caused by two real events: a sensor failure and a large storm.
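A minimal one-dimensional sketch of the robust-Kalman-filter idea: flag any observation whose normalized innovation exceeds a gate, and skip the state update for flagged points so the anomaly does not corrupt the state. The noise levels and gate below are assumed values, not those of the Corpus Christi deployment:

```python
import numpy as np

def kalman_anomaly_scan(z, q=1e-3, r=0.5, gate=4.0):
    """Random-walk Kalman filter over a stream z. Each observation is
    compared with its one-step-ahead prediction; a normalized innovation
    beyond `gate` standard deviations is flagged as anomalous and is not
    used to update the state (a simple robustification)."""
    x, p = z[0], 1.0
    flags = [False]
    for obs in z[1:]:
        p_pred = p + q                       # predict state variance
        s = p_pred + r                       # innovation variance
        nu = obs - x                         # innovation (residual)
        if abs(nu) / np.sqrt(s) > gate:      # gated: treat as anomaly
            flags.append(True)
            p = p_pred                       # skip the measurement update
            continue
        k = p_pred / s                       # Kalman gain
        x = x + k * nu
        p = (1 - k) * p_pred
        flags.append(False)
    return np.array(flags)

# Slowly drifting signal with measurement noise and one injected spike.
rng = np.random.default_rng(3)
z = np.cumsum(0.01 * rng.normal(size=300)) + 0.1 * rng.normal(size=300)
z[150] += 5.0                                # simulated sensor error
flags = kalman_anomaly_scan(z)
```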
Rassam, Murad A.; Zainal, Anazida; Maarof, Mohd Aizaini
2013-01-01
Wireless Sensor Networks (WSNs) are important and necessary platforms for the future as the concept “Internet of Things” has emerged lately. They are used for monitoring, tracking, or controlling of many applications in industry, health care, habitat, and military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements to design efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches in five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future works. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed. PMID:23966182
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, X; Liu, S; Kalet, A
Purpose: The purpose of this work was to investigate the ability of a machine-learning-based probabilistic approach to detect radiotherapy treatment plan anomalies given initial disease class information. Methods: In total, we obtained 1112 unique treatment plans with five plan parameters and disease information from a Mosaiq treatment management system database for use in the study. The plan parameters include prescription dose, fractions, fields, modality, and technique. The disease information includes disease site and T, M, and N disease stages. A Bayesian network method was employed to model the probabilistic relationships between tumor disease information, plan parameters, and an anomaly flag. A Bayesian learning method with a Dirichlet prior was used to learn the joint probabilities between dependent variables in error-free plan data and in data with artificially induced anomalies. In the study, we randomly sampled data with anomalies in a specified anomaly space. We tested the approach with three groups of plan anomalies: improper concurrence of the values of all five plan parameters, improper concurrence of the values of any two of the five parameters, and all single-parameter value anomalies. In total, 16 types of plan anomalies were covered by the study. For each type, we trained an individual Bayesian network. Results: We found that the true positive rate (recall) and positive predictive value (precision) for detecting concurrence anomalies of five plan parameters in new patient cases were 94.45±0.26% and 93.76±0.39%, respectively. For the other 15 types of plan anomalies, the average recall and precision were 93.61±2.57% and 93.78±3.54%, respectively. The computation time to detect a plan anomaly of each type in a new plan is ∼0.08 seconds. Conclusion: The proposed method for treatment plan anomaly detection was found effective in the initial tests. The results suggest that this type of model could be applied to develop plan anomaly detection tools to assist manual and automated plan checks. The senior author received research grants from ViewRay Inc. and Varian Medical Systems.
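The Dirichlet-smoothed probability idea can be illustrated with a flat co-occurrence table over made-up plan parameters. This is a deliberate simplification: the study uses a full Bayesian network, and the parameter names, counts, and anomaly-space size below are hypothetical:

```python
import math
from collections import Counter

def anomaly_score(counts, plan, alpha=1.0, n_cells=1000):
    """Negative log-probability of a parameter combination under a
    Dirichlet-smoothed categorical model; rare or unseen combinations
    score high. `alpha` is the prior pseudo-count and `n_cells` the
    assumed size of the discrete parameter space (both hypothetical)."""
    total = sum(counts.values())
    p = (counts[plan] + alpha) / (total + alpha * n_cells)
    return -math.log(p)

# Hypothetical training history of error-free (site, modality, dose) plans.
history = [("prostate", "photon", "high")] * 40 + \
          [("breast", "photon", "medium")] * 60
counts = Counter(history)

common = anomaly_score(counts, ("breast", "photon", "medium"))
novel = anomaly_score(counts, ("breast", "electron", "high"))  # never seen
```

Thresholding such a score is one way an anomaly flag could be raised during a plan check; thanks to the smoothing, an unseen combination gets a finite but large score rather than an infinite one.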
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brett Emery Trabun; Gamage, Thoshitha Thanushka; Bakken, David Edward
This disclosure describes, in part, a system management component and a failure detection component for use in a power grid data network to identify anomalies within the network and systematically adjust the quality of service of data published by publishers and subscribed to by subscribers within the network. In one implementation, subscribers may identify a desired data rate, a minimum acceptable data rate, a desired latency, a minimum acceptable latency, and a priority for each subscription. The failure detection component may identify an anomaly within the network and the source of the anomaly. Based on the identified anomaly, data rates and/or data paths may be adjusted in real time to ensure that the power grid data network does not become overloaded and/or fail.
Deep learning on temporal-spectral data for anomaly detection
NASA Astrophysics Data System (ADS)
Ma, King; Leung, Henry; Jalilian, Ehsan; Huang, Daniel
2017-05-01
Detecting anomalies is important for continuous monitoring of sensor systems. One significant challenge is to use sensor data and autonomously detect changes that cause different conditions to occur. Using deep learning methods, we are able to monitor and detect changes as a result of some disturbance in the system. We utilize deep neural networks for sequence analysis of time series. We use a multi-step method for anomaly detection. We train the network to learn spectral and temporal features from the acoustic time series. We test our method using fiber-optic acoustic data from a pipeline.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Ning; Du, Pengwei; Greitzer, Frank L.
2012-12-31
This paper presents the multi-layer, data-driven advanced reasoning tool (M-DART), a proof-of-principle decision support tool for improved power system operation. M-DART will cross-correlate and examine different data sources to assess anomalies, infer root causes, and anneal data into actionable information. By performing higher-level reasoning “triage” of diverse data sources, M-DART focuses on early detection of emerging power system events and identifies highest priority actions for the human decision maker. M-DART represents a significant advancement over today’s grid monitoring technologies that apply offline analyses to derive model-based guidelines for online real-time operations and use isolated data processing mechanisms focusing on individual data domains. The development of the M-DART will bridge these gaps by reasoning about results obtained from multiple data sources that are enabled by the smart grid infrastructure. This hybrid approach integrates a knowledge base that is trained offline but tuned online to capture model-based relationships while revealing complex causal relationships among data from different domains.
Propulsion Health Monitoring of a Turbine Engine Disk Using Spin Test Data
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Woike, Mark R.; Oza, Nikunj; Matthews, Bryan; Baaklini, George Y.
2010-01-01
This paper considers data collected from an experimental study using high-frequency capacitive sensor technology to capture blade tip clearance and tip timing measurements in a rotating turbine engine-like disk, in order to predict disk faults and assess structural integrity. The experimental results, collected at a range of rotational speeds from tests conducted at the NASA Glenn Research Center's Rotordynamics Laboratory, are evaluated using multiple data-driven anomaly detection techniques to identify abnormalities in the disk. Further, this study presents a select evaluation of an online health monitoring scheme for a rotating disk using high-caliber sensors and tests the capability of the in-house spin system.
Identifying Threats Using Graph-based Anomaly Detection
NASA Astrophysics Data System (ADS)
Eberle, William; Holder, Lawrence; Cook, Diane
Much of the data collected during the monitoring of cyber and other infrastructures is structural in nature, consisting of various types of entities and relationships between them. The detection of threatening anomalies in such data is crucial to protecting these infrastructures. We present an approach to detecting anomalies in a graph-based representation of such data that explicitly represents these entities and relationships. The approach consists of first finding normative patterns in the data using graph-based data mining and then searching for small, unexpected deviations to these normative patterns, assuming illicit behavior tries to mimic legitimate, normative behavior. The approach is evaluated using several synthetic and real-world datasets. Results show that the approach has high true-positive rates, low false-positive rates, and is capable of detecting complex structural anomalies in real-world domains including email communications, cellphone calls and network traffic.
Evaluation of Anomaly Detection Method Based on Pattern Recognition
NASA Astrophysics Data System (ADS)
Fontugne, Romain; Himura, Yosuke; Fukuda, Kensuke
The number of threats on the Internet is rapidly increasing, and anomaly detection has become of increasing importance. High-speed backbone traffic is particularly degraded, but its analysis is a complicated task due to the amount of data, the lack of payload data, asymmetric routing, and the use of sampling techniques. Most anomaly detection schemes focus on the statistical properties of network traffic and highlight anomalous traffic through their singularities. In this paper, we concentrate on unusual traffic distributions, which are easily identifiable in temporal-spatial space (e.g., time/address or port). We present an anomaly detection method that uses a pattern recognition technique to identify anomalies in pictures representing traffic. The main advantage of this method is its ability to detect attacks involving mice flows. We evaluate the parameter set and the effectiveness of this approach by analyzing six years of Internet traffic collected from a trans-Pacific link. We show several examples of detected anomalies and compare our results with those of two other methods. The comparison indicates that the anomalies detected only by the pattern-recognition-based method are mainly malicious traffic comprising just a few packets.
Fuzzy Kernel k-Medoids algorithm for anomaly detection problems
NASA Astrophysics Data System (ADS)
Rustam, Z.; Talita, A. S.
2017-07-01
Intrusion Detection Systems (IDS) are an essential part of security systems used to strengthen the security of information systems. An IDS can be used to detect abuse by intruders who try to get into the network system in order to access and exploit the available data sources. There are two approaches to IDS: misuse detection and anomaly detection (behavior-based intrusion detection). Fuzzy clustering-based methods have been widely used to solve anomaly detection problems. Besides using the fuzzy membership concept to assign an object to a cluster, other approaches, such as combining fuzzy and possibilistic memberships or feature-weighted methods, are also used. We propose Fuzzy Kernel k-Medoids, which combines fuzzy and possibilistic memberships, as a powerful method for the anomaly detection problem, since in numerical experiments it is able to classify IDS benchmark data into five different classes simultaneously. We classify the KDDCup'99 IDS benchmark data set into five different classes simultaneously; the best performance was achieved using 30% of the data for training, with clustering accuracy reaching 90.28%.
NASA Astrophysics Data System (ADS)
Sun, Hao; Zou, Huanxin; Zhou, Shilin
2016-03-01
Detection of anomalous targets of various sizes in hyperspectral data has received a lot of attention in reconnaissance and surveillance applications. Many anomaly detectors have been proposed in the literature. However, current methods are susceptible to anomalies in the processing window range and often make critical assumptions about the distribution of the background data. Motivated by the fact that anomaly pixels are often distinctive from their local background, in this letter, we propose a novel hyperspectral anomaly detection framework for real-time remote sensing applications. The proposed framework consists of four major components: sparse feature learning, pyramid grid window selection, joint spatial-spectral collaborative coding, and multi-level divergence fusion. It exploits the collaborative representation difference in the feature space to locate potential anomalies and is totally unsupervised, without any prior assumptions. Experimental results on airborne recorded hyperspectral data demonstrate that the proposed method adapts to anomalies over a large range of sizes and is well suited for parallel processing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohrer, Brandon Robinson
2011-09-01
Events of interest to data analysts are sometimes difficult to characterize in detail. Rather, they consist of anomalies, events that are unpredicted, unusual, or otherwise incongruent. The purpose of this LDRD was to test the hypothesis that a biologically-inspired anomaly detection algorithm could be used to detect contextual, multi-modal anomalies. There currently is no other solution to this problem, but the existence of a solution would have a great national security impact. The technical focus of this research was the application of a brain-emulating cognition and control architecture (BECCA) to the problem of anomaly detection. One aspect of BECCA in particular was discovered to be critical to improved anomaly detection capabilities: its feature creator. During the course of this project the feature creator was developed and tested against multiple data types. Development direction was drawn from psychological and neurophysiological measurements. Major technical achievements include the creation of hierarchical feature sets from both audio and imagery data.
Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering
NASA Astrophysics Data System (ADS)
Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech
2015-03-01
We present a novel method for automated anomaly detection on autofluorescence data provided by the National Institutes of Health (NIH). This is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, track the progression over time, and test the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures such as wavelet analysis or dimensionality reduction algorithms followed by a classification algorithm, e.g., Support Vector Machine. The method that we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality-preserving properties. By applying LE, we are able to represent the data in the form of eigenimages, some of which accentuate the visibility of anomalies. We pick significant eigenimages and proceed with the VMF algorithm, which classifies anomalies across all of these eigenimages simultaneously. To evaluate our performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection on single images and a combination of PCA and VMF. LE combined with the VMF algorithm performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of using a nonlinear approach to represent the data and the effectiveness of VMF, which operates on the images as a data cube rather than on individual images.
NASA Astrophysics Data System (ADS)
Khazaeli, S.; Ravandi, A. G.; Banerji, S.; Bagchi, A.
2016-04-01
Recently, data-driven models for Structural Health Monitoring (SHM) have been of great interest among many researchers. In data-driven models, the sensed data are processed to determine the structural performance and evaluate the damages of an instrumented structure without necessitating the mathematical modeling of the structure. A framework of data-driven models for online assessment of the condition of a structure has been developed here. The developed framework is intended for automated evaluation of the monitoring data and structural performance by the Internet technology and resources. The main challenges in developing such framework include: (a) utilizing the sensor measurements to estimate and localize the induced damage in a structure by means of signal processing and data mining techniques, and (b) optimizing the computing and storage resources with the aid of cloud services. The main focus in this paper is to demonstrate the efficiency of the proposed framework for real-time damage detection of a multi-story shear-building structure in two damage scenarios (change in mass and stiffness) in various locations. Several features are extracted from the sensed data by signal processing techniques and statistical methods. Machine learning algorithms are deployed to select damage-sensitive features as well as classifying the data to trace the anomaly in the response of the structure. Here, the cloud computing resources from Amazon Web Services (AWS) have been used to implement the proposed framework.
Multi-Level Anomaly Detection on Time-Varying Graph Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bridges, Robert A; Collins, John P; Ferragut, Erik M
This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.
SU-G-JeP4-03: Anomaly Detection of Respiratory Motion by Use of Singular Spectrum Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kotoku, J; Kumagai, S; Nakabayashi, S
Purpose: The implementation and realization of automatic anomaly detection of respiratory motion is a very important technique to prevent accidental damage during radiation therapy. Here, we propose an automatic anomaly detection method using singular value decomposition analysis. Methods: The anomaly detection procedure consists of four parts: 1) measurement of normal respiratory motion data of a patient; 2) calculation of a trajectory matrix representing the normal time-series features; 3) real-time monitoring and calculation of a trajectory matrix of the real-time data; and 4) calculation of an anomaly score from the similarity of the two feature matrices. Patient motion was observed by a marker-less tracking system using a depth camera. Results: Two types of motion, e.g., coughing and a sudden stop of breathing, were successfully detected in our real-time application. Conclusion: Automatic anomaly detection of respiratory motion using singular spectrum analysis was successful for coughing and sudden stops of breathing. This algorithm shows promise for clinical use. This work was supported by JSPS KAKENHI Grant Number 15K08703.
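The four-part procedure maps naturally onto a short sketch: build a trajectory (Hankel) matrix from normal data, extract its dominant SVD subspace, and score new data by the energy that falls outside that subspace. The window length, rank, breathing surrogate, and score definition below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def trajectory_matrix(series, window):
    """Hankel (trajectory) matrix whose columns are lagged sub-series."""
    n = len(series) - window + 1
    return np.column_stack([series[i:i + window] for i in range(n)])

def normal_subspace(series, window, rank):
    """Left singular vectors spanning the normal-breathing feature subspace."""
    u, _, _ = np.linalg.svd(trajectory_matrix(series, window), full_matrices=False)
    return u[:, :rank]

def anomaly_score(series, basis):
    """Fraction of test-trajectory energy outside the normal subspace."""
    window = basis.shape[0]
    x = trajectory_matrix(series, window)
    residual = x - basis @ (basis.T @ x)
    return np.linalg.norm(residual) / np.linalg.norm(x)

t = np.arange(0, 60, 0.1)
normal = np.sin(2 * np.pi * 0.25 * t)      # regular-breathing surrogate signal
basis = normal_subspace(normal, window=40, rank=2)
regular = np.sin(2 * np.pi * 0.25 * t[:200])
stop = np.ones(200)                        # sudden stop of breathing (flat trace)
```

A pure sinusoid's trajectory matrix has rank two, so the learned subspace captures regular breathing almost exactly, while a flat trace projects poorly onto it.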
Modeling inter-signal arrival times for accurate detection of CAN bus signal injection attacks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Michael Roy; Bridges, Robert A; Combs, Frank L
Modern vehicles rely on hundreds of on-board electronic control units (ECUs) communicating over in-vehicle networks. As external interfaces to the car control networks (such as the on-board diagnostic (OBD) port, auxiliary media ports, etc.) become common, and vehicle-to-vehicle / vehicle-to-infrastructure technology arrives in the near future, the attack surface for vehicles grows, exposing control networks to potentially life-critical attacks. This paper addresses the need for securing the CAN bus by detecting anomalous traffic patterns via unusual refresh rates of certain commands. While previous works have identified signal frequency as an important feature for CAN bus intrusion detection, this paper provides the first such algorithm with experiments on five attack scenarios. Our data-driven anomaly detection algorithm requires only five seconds of training time (on normal data) and achieves true positive / false discovery rates of 0.9998/0.00298, respectively (micro-averaged across the five experimental tests).
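The frequency-based idea can be sketched as follows: learn each arbitration ID's normal inter-arrival statistics from a short window of clean traffic, then flag frames whose inter-arrival time deviates sharply. The arbitration ID, timing values, and deviation threshold are hypothetical; this is a simplification, not the paper's algorithm:

```python
from collections import defaultdict
from statistics import mean, stdev

def learn_rates(training):
    """Learn per-ID inter-arrival mean/std from (timestamp, arb_id) pairs
    captured during a short window of normal CAN traffic."""
    gaps, last = defaultdict(list), {}
    for ts, aid in training:
        if aid in last:
            gaps[aid].append(ts - last[aid])
        last[aid] = ts
    return {aid: (mean(g), stdev(g) if len(g) > 1 else 0.0) for aid, g in gaps.items()}

def flag_injections(traffic, model, k=4.0):
    """Flag frames whose inter-arrival time deviates from the learned rate."""
    alerts, last = [], {}
    for ts, aid in traffic:
        if aid in last and aid in model:
            mu, sd = model[aid]
            if abs((ts - last[aid]) - mu) > k * max(sd, 1e-6) + 1e-9:
                alerts.append((ts, aid))
        last[aid] = ts
    return alerts

# Hypothetical ID 0x102 normally refreshes every 100 ms.
train = [(0.1 * i, 0x102) for i in range(50)]
model = learn_rates(train)
# Injection scenario: a burst of spoofed 0x102 frames at 10x the normal rate.
attack = [(0.1 * i, 0x102) for i in range(50, 60)] \
       + [(5.9 + 0.01 * j, 0x102) for j in range(1, 6)]
```

Real detectors must also handle jitter, bus load, and IDs with aperiodic traffic; the fixed k-sigma rule above ignores all of that.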
Radiation anomaly detection algorithms for field-acquired gamma energy spectra
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen
2015-08-01
The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of the threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign radioactive materials in the form of naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work despite the strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and relative merits of these different algorithms will be discussed and demonstrated.
EMPACT 3D: an advanced EMI discrimination sensor for CONUS and OCONUS applications
NASA Astrophysics Data System (ADS)
Keranen, Joe; Miller, Jonathan S.; Schultz, Gregory; Sander-Olhoeft, Morgan; Laudato, Stephen
2018-04-01
We recently developed a new, man-portable, electromagnetic induction (EMI) sensor designed to detect and classify small, unexploded sub-munitions and discriminate them from non-hazardous debris. The ability to distinguish innocuous metal clutter from potentially hazardous unexploded ordnance (UXO) and other explosive remnants of war (ERW) before excavation can significantly accelerate land reclamation efforts by eliminating time spent removing harmless scrap metal. The EMI sensor employs a multi-axis transmitter and receiver configuration to produce data sufficient for anomaly discrimination. A real-time data inversion routine produces intrinsic and extrinsic anomaly features describing the polarizability, location, and orientation of the anomaly under test. We discuss data acquisition and post-processing software development, and results from laboratory and field tests demonstrating the discrimination capability of the system. Data acquisition and real-time processing emphasize ease-of-use, quality control (QC), and display of discrimination results. Integration of the QC and discrimination methods into the data acquisition software reduces the time required between sensor data collection and the final anomaly discrimination result. The system supports multiple concepts of operations (CONOPs) including: 1) a non-GPS cued configuration in which detected anomalies are discriminated and excavated immediately following the anomaly survey; 2) GPS integration to survey multiple anomalies to produce a prioritized dig list with global anomaly locations; and 3) a dynamic mapping configuration supporting detection followed by discrimination and excavation of targets of interest.
Integrated System Health Management (ISHM) for Test Stand and J-2X Engine: Core Implementation
NASA Technical Reports Server (NTRS)
Figueroa, Jorge F.; Schmalzel, John L.; Aguilar, Robert; Schwabacher, Mark; Morris, Jon
2008-01-01
ISHM capability enables a system to detect anomalies, determine causes and effects, predict future anomalies, and provide an integrated awareness of the health of the system to users (operators, customers, management, etc.). NASA Stennis Space Center, NASA Ames Research Center, and Pratt & Whitney Rocketdyne have implemented a core ISHM capability that encompasses the A1 Test Stand and the J-2X Engine. The implementation incorporates all aspects of ISHM: from anomaly detection (e.g. leaks), to root-cause analysis based on failure mode and effects analysis (FMEA), to a user interface for an integrated visualization of the health of the system (Test Stand and Engine). The implementation provides a low functional capability level (FCL) in that it is populated with few algorithms and approaches for anomaly detection, and root-cause trees from a limited FMEA effort. However, it is a demonstration of a credible ISHM capability, and it is inherently designed for continuous and systematic augmentation of the capability. The ISHM capability is grounded on an integrating software environment used to create an ISHM model of the system. The ISHM model follows an object-oriented approach: it includes all elements of the system (from schematics) and provides for compartmentalized storage of information associated with each element. For instance, a sensor object contains a transducer electronic data sheet (TEDS) with information that might be used by algorithms and approaches for anomaly detection, diagnostics, etc. Similarly, a component, such as a tank, contains a Component Electronic Data Sheet (CEDS). Each element also includes a Health Electronic Data Sheet (HEDS) that contains health-related information such as anomalies and health state.
Some practical aspects of the implementation include: (1) near real-time data flow from the test stand data acquisition system through the ISHM model, for near real-time detection of anomalies and diagnostics, (2) insertion of the J-2X predictive model providing predicted sensor values for comparison with measured values and use in anomaly detection and diagnostics, and (3) insertion of third-party anomaly detection algorithms into the integrated ISHM model.
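The object-oriented ISHM model, with electronic data sheets attached to each element, might be sketched as follows. The field names, the dataclass layout, and the out-of-range check are illustrative assumptions, not the actual Stennis implementation:

```python
from dataclasses import dataclass, field

@dataclass
class TEDS:
    """Transducer Electronic Data Sheet: calibration facts that anomaly
    detection algorithms might consult (illustrative fields only)."""
    units: str
    min_valid: float
    max_valid: float

@dataclass
class HEDS:
    """Health Electronic Data Sheet: health state plus logged anomalies."""
    state: str = "nominal"
    anomalies: list = field(default_factory=list)

@dataclass
class SensorObject:
    """A sensor element of the ISHM model with compartmentalized storage."""
    name: str
    teds: TEDS
    heds: HEDS = field(default_factory=HEDS)

    def ingest(self, value):
        """A minimal anomaly check: flag out-of-range readings on the HEDS."""
        if not (self.teds.min_valid <= value <= self.teds.max_valid):
            self.heds.state = "anomalous"
            self.heds.anomalies.append((self.name, value))

# Hypothetical tank-pressure sensor; names and ranges are invented.
tank_pressure = SensorObject("LOX-tank-P1", TEDS("psi", 0.0, 5000.0))
tank_pressure.ingest(3200.0)   # plausible reading: no change to health state
tank_pressure.ingest(-15.0)    # failed-transducer reading: logged on the HEDS
```

Components (CEDS) would be modeled the same way, which is what allows detection and diagnostic algorithms to be attached uniformly to any element.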
General Purpose Data-Driven Monitoring for Space Operations
NASA Technical Reports Server (NTRS)
Iverson, David L.; Martin, Rodney A.; Schwabacher, Mark A.; Spirkovska, Liljana; Taylor, William McCaa; Castle, Joseph P.; Mackey, Ryan M.
2009-01-01
As modern space propulsion and exploration systems improve in capability and efficiency, their designs are becoming increasingly sophisticated and complex. Determining the health state of these systems, using traditional parameter limit checking, model-based, or rule-based methods, is becoming more difficult as the number of sensors and component interactions grow. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults or failures. The Inductive Monitoring System (IMS) is a data-driven system health monitoring software tool that has been successfully applied to several aerospace applications. IMS uses a data mining technique called clustering to analyze archived system data and characterize normal interactions between parameters. The scope of IMS based data-driven monitoring applications continues to expand with current development activities. Successful IMS deployment in the International Space Station (ISS) flight control room to monitor ISS attitude control systems has led to applications in other ISS flight control disciplines, such as thermal control. It has also generated interest in data-driven monitoring capability for Constellation, NASA's program to replace the Space Shuttle with new launch vehicles and spacecraft capable of returning astronauts to the moon, and then on to Mars. Several projects are currently underway to evaluate and mature the IMS technology and complementary tools for use in the Constellation program. These include an experiment on board the Air Force TacSat-3 satellite, and ground systems monitoring for NASA's Ares I-X and Ares I launch vehicles. 
The TacSat-3 Vehicle System Management (TVSM) project is a software experiment to integrate fault and anomaly detection algorithms and diagnosis tools with executive and adaptive planning functions contained in the flight software on-board the Air Force Research Laboratory TacSat-3 satellite. The TVSM software package will be uploaded after launch to monitor spacecraft subsystems such as power and guidance, navigation, and control (GN&C). It will analyze data in real-time to demonstrate detection of faults and unusual conditions, diagnose problems, and react to threats to spacecraft health and mission goals. The experiment will demonstrate the feasibility and effectiveness of integrated system health management (ISHM) technologies with both ground and on-board experiments.
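IMS's clustering-based characterization of nominal behavior can be approximated in a few lines: cluster archived nominal data, then score real-time samples by their distance to the nearest cluster. The simple k-means stand-in and the two-parameter example below are simplifications for illustration, not the IMS algorithm itself:

```python
import numpy as np

def learn_clusters(nominal, k=5, iters=50, seed=0):
    """Characterize normal parameter interactions by clustering archived data
    (a basic Lloyd's k-means, standing in for the clustering used by IMS)."""
    rng = np.random.default_rng(seed)
    centers = nominal[rng.choice(len(nominal), size=k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((nominal[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            members = nominal[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers

def deviation(sample, centers):
    """Distance from a real-time sample to the nearest nominal cluster."""
    return float(np.min(np.linalg.norm(centers - sample, axis=1)))

rng = np.random.default_rng(1)
# Two correlated parameters, e.g. a command and the response that tracks it.
cmd = rng.uniform(0.0, 1.0, 500)
nominal = np.column_stack([cmd, 2.0 * cmd + rng.normal(0.0, 0.02, 500)])
centers = learn_clusters(nominal)
ok = np.array([0.5, 1.0])       # consistent with the learned relationship
fault = np.array([0.5, -0.5])   # response no longer tracks the command
```

The key property this illustrates is that neither parameter is individually out of range in the fault case; only the learned *interaction* between them is violated.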
Dictionary-Driven Ischemia Detection From Cardiac Phase-Resolved Myocardial BOLD MRI at Rest.
Bevilacqua, Marco; Dharmakumar, Rohan; Tsaftaris, Sotirios A
2016-01-01
Cardiac Phase-resolved Blood-Oxygen-Level Dependent (CP-BOLD) MRI provides a unique opportunity to image an ongoing ischemia at rest. However, it requires post-processing to evaluate the extent of ischemia. To address this, here we propose an unsupervised ischemia detection (UID) method which relies on the inherent spatio-temporal correlation between oxygenation and wall motion to formalize a joint learning and detection problem based on dictionary decomposition. Considering input data of a single subject, it treats ischemia as an anomaly and iteratively learns dictionaries to represent only normal observations (corresponding to myocardial territories remote to ischemia). Anomaly detection is based on a modified version of One-class Support Vector Machines (OCSVM) that regulates the margins directly by incorporating the dictionary-based representation errors. A measure of ischemic extent (IE) is estimated, reflecting the relative portion of the myocardium affected by ischemia. For visualization purposes, an ischemia likelihood map is created by estimating posterior probabilities from the OCSVM outputs, thus indicating how likely the classification is to be correct. UID is evaluated on synthetic data and on a 2D CP-BOLD data set from a canine experimental model emulating acute coronary syndromes. Comparing early ischemic territories identified with UID against infarct territories (after several hours of ischemia), we find that IE, as measured by UID, is highly correlated (Pearson's r=0.84) with infarct size. When advances in automated registration and segmentation of CP-BOLD images and full coverage 3D acquisitions become available, we hope that this method can enable pixel-level assessment of ischemia with this truly non-invasive imaging technique.
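The dictionary-based representation error at the heart of UID can be illustrated with a heavily simplified sketch: it learns atoms by SVD (in place of iterative dictionary learning) and scores observations by reconstruction error alone, omitting the modified OCSVM stage. All signals and parameters are synthetic assumptions:

```python
import numpy as np

def learn_dictionary(normal, n_atoms):
    """Learn atoms spanning the normal observations (SVD used here in place
    of iterative dictionary learning, purely for brevity)."""
    u, _, _ = np.linalg.svd(normal.T, full_matrices=False)
    return u[:, :n_atoms]

def representation_error(x, dictionary):
    """Error of the best dictionary-based representation of observation x."""
    coeffs, *_ = np.linalg.lstsq(dictionary, x, rcond=None)
    return float(np.linalg.norm(x - dictionary @ coeffs))

rng = np.random.default_rng(0)
phases = np.linspace(0.0, 2 * np.pi, 30)
# Normal (remote-territory) intensity profiles: a shared cyclic pattern
# with small phase jitter and noise, one row per observation.
normal = np.array([np.cos(phases + rng.normal(0, 0.05)) + rng.normal(0, 0.05, 30)
                   for _ in range(40)])
D = learn_dictionary(normal, n_atoms=3)
remote = np.cos(phases)        # behaves like the training data
ischemic = np.full(30, 0.2)    # flattened profile: poorly represented
```

In UID proper, these representation errors feed the margins of the modified OCSVM rather than being thresholded directly.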
A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.
This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.
A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization
Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.; ...
2016-01-01
Model selection for anomaly detection
NASA Astrophysics Data System (ADS)
Burnaev, E.; Erofeev, P.; Smolyakov, D.
2015-12-01
Anomaly detection based on one-class classification algorithms is broadly used in many applied domains, such as image processing (e.g. determining whether a patient is "cancerous" or "healthy" from a mammography image), network intrusion detection, etc. The performance of an anomaly detection algorithm crucially depends on the kernel used to measure similarity in the feature space. The standard approaches to kernel selection used in two-class classification problems (e.g. cross-validation) cannot be applied directly due to the specific nature of the data (the absence of data from a second, abnormal class). In this paper we generalize several kernel selection methods from the binary-class case to the case of one-class classification and perform an extensive comparison of these approaches using both synthetic and real-world data.
System and method for anomaly detection
Scherrer, Chad
2010-06-15
A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrences, as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.
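A minimal illustration of scoring observations against frequency tables under an independence model follows. The record fields, the Laplace smoothing, and the traffic mix are hypothetical and deliberately simple; they stand in for, rather than reproduce, the patented method:

```python
import math
from collections import Counter

def build_tables(observations):
    """Frequency tables, one per variable in (proto, port)-style records."""
    tables = [Counter(), Counter()]
    for rec in observations:
        for i, value in enumerate(rec):
            tables[i][value] += 1
    return tables, len(observations)

def score(record, tables, n):
    """Anomaly score under an independence model: -log of the product of
    per-variable probabilities (Laplace-smoothed so unseen values score high
    but finitely)."""
    logp = 0.0
    for i, value in enumerate(record):
        logp += math.log((tables[i][value] + 1) / (n + len(tables[i]) + 1))
    return -logp

# Hypothetical traffic: mostly HTTPS, some HTTP, a little DNS.
traffic = [("tcp", 443)] * 900 + [("tcp", 80)] * 90 + [("udp", 53)] * 10
tables, n = build_tables(traffic)
```

Because only per-variable counts are stored, the tables stay small no matter how many distinct records stream past, which is the storage-efficiency point the abstract makes.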
Real-time anomaly detection for very short-term load forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Jian; Hong, Tao; Yue, Meng
Although recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulty collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components: a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.
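The two components, a dynamic regression model and an adaptive anomaly threshold, can be sketched as follows. The lag-1 regression, the rolling-window threshold, and the synthetic daily load profile are simplifying assumptions rather than the proposed method:

```python
import numpy as np

def fit_dynamic_regression(load):
    """Least-squares fit of load[t] on load[t-1] plus an intercept."""
    X = np.column_stack([np.ones(len(load) - 1), load[:-1]])
    beta, *_ = np.linalg.lstsq(X, load[1:], rcond=None)
    return beta

def detect(load, beta, window=24, k=4.0):
    """Adaptive threshold: flag load values whose one-step-ahead residual
    exceeds k times the rolling residual spread."""
    resid = load[1:] - (beta[0] + beta[1] * load[:-1])
    flags = []
    for t in range(window, len(resid)):
        sigma = resid[t - window:t].std() + 1e-9
        if abs(resid[t]) > k * sigma:
            flags.append(t + 1)   # index into the original load series
    return flags

rng = np.random.default_rng(0)
t = np.arange(400)
# Synthetic hourly load with a 24-hour cycle (hypothetical numbers, in MW).
load = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, 400)
corrupted = load.copy()
corrupted[300] += 30.0            # e.g. a mis-collected telemetry value
```

The threshold adapts because sigma is recomputed over a rolling window, so the detector stays calibrated as the load's variability changes.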
Real-time anomaly detection for very short-term load forecasting
Luo, Jian; Hong, Tao; Yue, Meng
2018-01-06
The evens and odds of CMB anomalies
NASA Astrophysics Data System (ADS)
Gruppuso, A.; Kitazawa, N.; Lattanzi, M.; Mandolesi, N.; Natoli, P.; Sagnotti, A.
2018-06-01
The lack of power of large-angle CMB anisotropies is known to increase its statistical significance at higher Galactic latitudes, where a string-inspired pre-inflationary scale Δ can also be detected. Considering the Planck 2015 data, and relying largely on a Bayesian approach, we show that the effect is mostly driven by the even-ℓ harmonic multipoles with ℓ ≲ 20, which appear sizably suppressed in a way that is robust with respect to Galactic masking, along with the corresponding detections of Δ. On the other hand, the first odd-ℓ multipoles are only suppressed at high Galactic latitudes. We investigate this behavior in different sky masks, constraining Δ through even and odd multipoles, and we elaborate on possible implications. We include low-ℓ polarization data which, despite being noise-limited, help in attaining confidence levels of about 3σ in the detection of Δ. We also show by direct forecasts that a future all-sky E-mode cosmic-variance-limited polarization survey may push the constraining power for Δ beyond 5σ.
Propulsion health monitoring of a turbine engine disk using spin test data
NASA Astrophysics Data System (ADS)
Abdul-Aziz, Ali; Woike, Mark; Oza, Nikunj; Matthews, Bryan; Baakilini, George
2010-03-01
Online detection techniques to monitor the health of rotating engine components are becoming increasingly attractive options to aircraft engine companies in order to increase safety of operation and lower maintenance costs. Health monitoring remains a challenging feature to implement, especially in the presence of scattered loading conditions, crack size, component geometry, and material properties. The current trend, however, is to utilize noninvasive health monitoring or nondestructive techniques to detect hidden flaws and mini cracks before any catastrophic event occurs. These techniques go further to evaluate material discontinuities and other anomalies that have grown to the level of critical defects which can lead to failure. Generally, health monitoring is highly dependent on sensor systems that are capable of performing in various engine environmental conditions and able to transmit a signal upon a predetermined crack length, while having a neutral effect on the overall performance of the engine system. Efforts are under way at NASA Glenn Research Center, through support of the Intelligent Vehicle Health Management Project (IVHM), to develop and implement such sensor technology for a wide variety of applications. These efforts are focused on developing high temperature, wireless, low cost, and durable products. Therefore, in an effort to address the technical issues concerning health monitoring of a rotor disk, this paper considers data collected from an experimental study using high frequency capacitive sensor technology to capture blade tip clearance and tip timing measurements in a rotating engine-like disk to predict disk faults and assess its structural integrity. The experimental results, collected at a range of rotational speeds from tests conducted at the NASA Glenn Research Center's Rotordynamics Laboratory, will be evaluated using multiple data-driven anomaly detection techniques to identify anomalies in the disk.
This study is expected to present a select evaluation of online health monitoring of a rotating disk using these high caliber sensors and test the capability of the in-house spin system.
Analysis of SSEM Sensor Data Using BEAM
NASA Technical Reports Server (NTRS)
Zak, Michail; Park, Han; James, Mark
2004-01-01
A report describes analysis of space shuttle main engine (SSME) sensor data using Beacon-based Exception Analysis for Multimissions (BEAM) [NASA Tech Briefs articles, the two most relevant being Beacon-Based Exception Analysis for Multimissions (NPO-20827), Vol. 26, No. 9 (September 2002), page 32 and Integrated Formulation of Beacon-Based Exception Analysis for Multimissions (NPO-21126), Vol. 27, No. 3 (March 2003), page 74] for automated detection of anomalies. A specific implementation of BEAM, using the Dynamical Invariant Anomaly Detector (DIAD), is used to find anomalies commonly encountered during SSME ground test firings. The DIAD detects anomalies by computing coefficients of an autoregressive model and comparing them to expected values extracted from previous training data. The DIAD was trained using nominal SSME test-firing data. DIAD detected all the major anomalies including blade failures, frozen sense lines, and deactivated sensors. The DIAD was particularly sensitive to anomalies caused by faulty sensors and unexpected transients. The system offers a way to reduce SSME analysis time and cost by automatically indicating specific time periods, signals, and features contributing to each anomaly. The software described here executes on a standard workstation and delivers analyses in seconds, a computing time comparable to or faster than the test duration itself, offering potential for real-time analysis.
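The DIAD idea, treating autoregressive coefficients as dynamical invariants and comparing them to values learned from nominal data, can be sketched briefly. The AR order, the synthetic nominal process, and the "frozen sense line" stand-in are illustrative assumptions, not the BEAM implementation:

```python
import numpy as np

def ar_coefficients(signal, order=2):
    """Least-squares autoregressive coefficients (the dynamical invariants)."""
    X = np.column_stack([signal[order - 1 - i:len(signal) - 1 - i]
                         for i in range(order)])
    coeffs, *_ = np.linalg.lstsq(X, signal[order:], rcond=None)
    return coeffs

def diad_score(window, nominal_coeffs, order=2):
    """Distance between the window's AR coefficients and the trained ones."""
    return float(np.linalg.norm(ar_coefficients(window, order) - nominal_coeffs))

rng = np.random.default_rng(0)
n = 2000
nominal = np.zeros(n)
e = rng.normal(0, 1, n)
for t in range(2, n):   # a stable AR(2) process standing in for a nominal sensor
    nominal[t] = 1.5 * nominal[t - 1] - 0.7 * nominal[t - 2] + e[t]
trained = ar_coefficients(nominal)       # approximately [1.5, -0.7]

frozen = np.full(400, 3.0)               # frozen sense line: signal sticks
healthy = nominal[-400:]                 # a fresh nominal window
```

A frozen line has trivially different dynamics from the nominal process, so its fitted coefficients land far from the trained invariants even though its raw values may stay in range.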
A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Rinehart, Aidan W.
2014-01-01
This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
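The curve-fitting update of trim points and the residual monitoring built on it can be sketched as follows. The power-setting/EGT relation, the polynomial degree, and all numbers are hypothetical; this illustrates the idea, not the paper's piecewise linear model:

```python
import numpy as np

def fit_trim_curve(power_settings, steady_outputs, degree=2):
    """Update model trim points: fit a curve to steady-state data extracted
    from nominal engine measurements."""
    return np.polyfit(power_settings, steady_outputs, degree)

def residual(power, measured, trim_coeffs):
    """Residual between the sensed output and the model-predicted trim value."""
    return measured - np.polyval(trim_coeffs, power)

rng = np.random.default_rng(0)
power = rng.uniform(20, 100, 200)                    # percent power settings
# Hypothetical steady-state relation between power setting and a temperature.
egt = 400 + 3.0 * power + 0.01 * power ** 2 + rng.normal(0, 2.0, 200)
coeffs = fit_trim_curve(power, egt)

nominal_reading = 400 + 3.0 * 80 + 0.01 * 80 ** 2    # lies on the trim curve
faulty_reading = nominal_reading + 60.0              # seeded sensor bias
```

Updating the trim curve from fresh nominal data is what keeps the residuals centered on zero, which the paper reports directly improves detection performance.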
A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Rinehart, Aidan Walker
2015-01-01
Recent Results on "Approximations to Optimal Alarm Systems for Anomaly Detection"
NASA Technical Reports Server (NTRS)
Martin, Rodney Alexander
2009-01-01
An optimal alarm system and its approximations may use Kalman filtering for univariate linear dynamic systems driven by Gaussian noise to provide a layer of predictive capability. Predicted Kalman filter future process values and a fixed critical threshold can be used to construct a candidate level-crossing event over a predetermined prediction window. An optimal alarm system can be designed to elicit the fewest false alarms for a fixed detection probability in this particular scenario.
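The ingredients named here, a scalar Kalman filter, a prediction window, and a fixed critical threshold, can be combined in a toy sketch. Using only the end-of-window marginal crossing probability is a crude approximation to the optimal level-crossing event probability; all system parameters and measurements below are invented:

```python
import math

def kalman_update(x, p, z, a=0.95, q=0.01, r=0.04):
    """One scalar Kalman step for x[t+1] = a*x[t] + w,  z[t] = x[t] + v."""
    x_pred, p_pred = a * x, a * a * p + q
    k = p_pred / (p_pred + r)
    return x_pred + k * (z - x_pred), (1 - k) * p_pred

def crossing_probability(x, p, threshold, steps, a=0.95, q=0.01):
    """Probability that the predicted process value exceeds the critical
    threshold at the end of the prediction window (a simple approximation
    to the optimal level-crossing event probability)."""
    for _ in range(steps):
        x, p = a * x, a * a * p + q
    z = (threshold - x) / math.sqrt(p)
    return 0.5 * math.erfc(z / math.sqrt(2))

def alarm(x, p, threshold, steps, p_min=0.5):
    """Alarm when the predicted crossing probability exceeds a design level."""
    return crossing_probability(x, p, threshold, steps) >= p_min

# Track a drifting measurement sequence (hypothetical numbers).
x_est, p_est = 0.0, 1.0
for z in [0.1, 0.3, 0.6, 0.9, 1.2]:
    x_est, p_est = kalman_update(x_est, p_est, z)
```

Designing p_min to meet a fixed detection probability, while minimizing false alarms, is exactly the optimization the optimal alarm framework addresses.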
2015-09-21
this framework, MIT LL carried out a one-year proof-of-concept study to determine the capabilities and challenges in the detection of anomalies in extremely large graphs [5]. Under this effort, two real datasets were considered, and algorithms for data modeling and anomaly detection were developed. ... is required in a well-defined experimental framework for the detection of anomalies in very large graphs. This study is intended to inform future
Anomaly Detection Based on Sensor Data in Petroleum Industry Applications
Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra
2015-01-01
Anomaly detection is the problem of finding patterns in data that do not conform to an a priori expected behavior. This is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset, where these anomalous samples are indicated as outliers. Anomaly detection has recently attracted the attention of the research community because of its relevance in real-world applications, like intrusion detection, fraud detection, fault detection, and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation. However, in either case, it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one of the application contexts where these problems are present. The correct detection of such types of unusual information empowers the decision maker with the capacity to act on the system in order to correctly avoid, correct, or react to the situations associated with them. In this application context, heavy extraction machines for pumping and generation operations, like turbomachines, are each intensively monitored by hundreds of sensors that send high-frequency measurements for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel, fast, and high-quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal addresses this task while coping with the lack of labeled training data. We perform a series of empirical studies comparing our approach to other methods applied to benchmark problems and a real-life application related to oil platform turbomachinery anomaly detection. PMID:25633599
ISHM Anomaly Lexicon for Rocket Test
NASA Technical Reports Server (NTRS)
Schmalzel, John L.; Buchanan, Aubri; Hensarling, Paula L.; Morris, Jonathan; Turowski, Mark; Figueroa, Jorge F.
2007-01-01
Integrated Systems Health Management (ISHM) is a comprehensive capability. An ISHM system must detect anomalies, identify the causes of those anomalies, predict future anomalies, and help identify their consequences, for example by suggesting mitigation steps. The system should also provide users with appropriate navigation tools to facilitate the flow of information into and out of the ISHM system. Central to the ability of the ISHM system to detect anomalies is a clearly defined catalog of anomalies. Further, this lexicon of anomalies must be organized in ways that make it accessible to a suite of tools used to manage the data, information and knowledge (DIaK) associated with a system. In particular, it is critical to ensure that there is optimal mapping between target anomalies and the algorithms associated with their detection. During the early development of our ISHM architecture and approach, it became clear that a lexicon of anomalies would be important to the development of critical anomaly detection algorithms. In our work in the rocket engine test environment at John C. Stennis Space Center, we have access to a repository of discrepancy reports (DRs) that are generated in response to squawks identified during post-test data analysis. The DR is the tool used to document anomalies and the methods used to resolve them. These DRs have been generated for many different tests and for all test stands; as a result, they represent a comprehensive summary of the anomalies associated with rocket engine testing. Fig. 1 illustrates some of the data that can be extracted from a DR. Such information includes affected transducer channels, a narrative description of the observed anomaly, and the steps used to correct the problem. The primary goal of the anomaly lexicon development effort we have undertaken is to create a lexicon that can be used in support of an associated health assessment database system (HADS) co-development effort.
There are a number of significant byproducts of the anomaly lexicon compilation effort: (1) it allows determination of the frequency distribution of anomalies, helping to identify those with the potential for high return on investment if included in automated detection as part of an ISHM system; (2) availability of a regular lexicon could provide the base anomaly name choices, helping to maintain consistency in the DR collection process; and (3) although developed for the rocket engine test environment, most of the anomalies are not specific to rocket testing and thus can be reused in other applications.
Anomaly Detection in Power Quality at Data Centers
NASA Technical Reports Server (NTRS)
Grichine, Art; Solano, Wanda M.
2015-01-01
The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
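Under the Gaussian assumption stated above, set-points can be computed as the mean plus or minus k standard deviations; with k = 3, roughly 0.27% of normal readings fall outside, keeping the false-positive count low. A minimal sketch with invented voltage readings, not NCCIPS data:

```python
from statistics import mean, stdev

def control_limits(values, k=3.0):
    """Return (lower, upper) set-points at mean +/- k*sigma.
    For Gaussian data and k=3, about 0.27% of normal readings
    fall outside these limits."""
    mu, sigma = mean(values), stdev(values)
    return mu - k * sigma, mu + k * sigma

voltage = [229.8, 230.1, 230.0, 229.9, 230.2, 230.0, 229.95, 230.05]
lo, hi = control_limits(voltage)
print(round(lo, 2), round(hi, 2))  # expected: 229.63 230.37
```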
Data-driven Analysis and Prediction of Arctic Sea Ice
NASA Astrophysics Data System (ADS)
Kondrashov, D. A.; Chekroun, M.; Ghil, M.; Yuan, X.; Ting, M.
2015-12-01
We present results of data-driven predictive analyses of sea ice over the main Arctic regions. Our approach relies on the Multilayer Stochastic Modeling (MSM) framework of Kondrashov, Chekroun and Ghil [Physica D, 2015], and it leads to prognostic models of sea ice concentration (SIC) anomalies on seasonal time scales. This approach is applied to monthly time series of leading principal components from the multivariate Empirical Orthogonal Function decomposition of SIC and selected climate variables over the Arctic. We evaluate the predictive skill of MSM models by performing retrospective "no-look-ahead" forecasts for up to six months ahead. It will be shown in particular that the memory effects included in our non-Markovian linear MSM models improve predictions of large-amplitude SIC anomalies in certain Arctic regions. Further improvements within the MSM framework will adopt a nonlinear formulation, as well as alternative data-adaptive decompositions.
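The flavor of such prognostic anomaly models can be conveyed by a one-parameter autoregressive fit. This is a drastic simplification of the MSM framework, which adds hidden layers and non-Markovian memory terms; the series below is invented for illustration.

```python
def fit_ar1(series):
    """Least-squares fit of x[t+1] ~ a * x[t] -- a one-level, memoryless
    stand-in for multilayer stochastic models."""
    num = sum(x0 * x1 for x0, x1 in zip(series, series[1:]))
    den = sum(x0 * x0 for x0 in series[:-1])
    return num / den

def forecast(a, last, steps):
    """Iterate the fitted map to produce an n-step-ahead forecast."""
    out = []
    for _ in range(steps):
        last = a * last
        out.append(last)
    return out

sic_anomaly = [1.0, 0.8, 0.64, 0.512]   # toy decaying SIC anomaly series
a = fit_ar1(sic_anomaly)
print(round(a, 3))                       # expected: 0.8
print([round(v, 4) for v in forecast(a, sic_anomaly[-1], 2)])
```

A retrospective "no-look-ahead" evaluation would fit `a` only on data up to each forecast start time, then compare the forecast against the held-out continuation.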
Post-processing for improving hyperspectral anomaly detection accuracy
NASA Astrophysics Data System (ADS)
Wu, Jee-Cheng; Jiang, Chi-Ming; Huang, Chen-Liang
2015-10-01
Anomaly detection is an important topic in the exploitation of hyperspectral data. Based on the Reed-Xiaoli (RX) detector and a morphology operator, this research proposes a novel technique for improving the accuracy of hyperspectral anomaly detection. First, the RX-based detector is used to process a given input scene. Then, a post-processing scheme using a morphology operator is employed to detect those pixels around high-scoring anomaly pixels. Tests were conducted using two real hyperspectral images with ground-truth information, and the results, based on receiver operating characteristic curves, illustrate that the proposed method reduces the false alarm rates of the RX-based detector.
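A minimal numerical sketch of the two stages, a global RX score followed by a morphological dilation of the detection mask, reduced to one dimension for brevity and run on synthetic data rather than the imagery used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def rx_scores(pixels):
    """Global RX detector: squared Mahalanobis distance of each pixel
    spectrum (rows) from the scene mean, under the scene covariance."""
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    diff = pixels - mu
    return np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)

def dilate_flags(flags, radius=1):
    """1-D morphological dilation: also flag the neighbours of each
    high-scoring pixel (the post-processing idea in miniature)."""
    out = flags.copy()
    for i in np.flatnonzero(flags):
        out[max(0, i - radius):i + radius + 1] = True
    return out

scene = rng.normal(size=(200, 4))   # 200 pixels, 4 bands of background
scene[50] += 8.0                    # inject one anomalous spectrum
scores = rx_scores(scene)
flags = scores > np.percentile(scores, 99)
print(int(np.argmax(scores)))       # expected: 50
print(bool(dilate_flags(flags)[49:52].all()))  # neighbours of pixel 50 flagged
```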
Dictionary-driven Ischemia Detection from Cardiac Phase-Resolved Myocardial BOLD MRI at Rest
Bevilacqua, Marco; Dharmakumar, Rohan; Tsaftaris, Sotirios A.
2016-01-01
Cardiac Phase-resolved Blood-Oxygen-Level Dependent (CP–BOLD) MRI provides a unique opportunity to image ongoing ischemia at rest. However, it requires post-processing to evaluate the extent of ischemia. To address this, here we propose an unsupervised ischemia detection (UID) method which relies on the inherent spatio-temporal correlation between oxygenation and wall motion to formalize a joint learning and detection problem based on dictionary decomposition. Considering input data of a single subject, it treats ischemia as an anomaly and iteratively learns dictionaries to represent only normal observations (corresponding to myocardial territories remote to ischemia). Anomaly detection is based on a modified version of One-class Support Vector Machines (OCSVM) that regulates the margins directly by incorporating the dictionary-based representation errors. A measure of ischemic extent (IE) is estimated, reflecting the relative portion of the myocardium affected by ischemia. For visualization purposes, an ischemia likelihood map is created by estimating posterior probabilities from the OCSVM outputs, indicating how likely the classification is to be correct. UID is evaluated on synthetic data and in a 2D CP–BOLD data set from a canine experimental model emulating acute coronary syndromes. Comparing early ischemic territories identified with UID against infarct territories (after several hours of ischemia), we find that IE, as measured by UID, is highly correlated (Pearson's r = 0.84) with infarct size. When advances in automated registration and segmentation of CP–BOLD images and full-coverage 3D acquisitions become available, we hope that this method can enable pixel-level assessment of ischemia with this truly non-invasive imaging technique. PMID:26292338
Disparity: scalable anomaly detection for clusters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desai, N.; Bradshaw, R.; Lusk, E.
2008-01-01
In this paper, we describe disparity, a tool that does parallel, scalable anomaly detection for clusters. Disparity uses basic statistical methods and scalable reduction operations to perform data reduction on client nodes and uses these results to locate node anomalies. We discuss the implementation of disparity and present results of its use on a SiCortex SC5832 system.
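The statistical reduction that disparity performs can be imitated on a single metric: compute across-node statistics, then flag the nodes that deviate. A toy sketch with invented node names and load values, not disparity's actual implementation:

```python
from statistics import mean, stdev

def node_anomalies(metric_by_node, k=2.0):
    """Flag cluster nodes whose metric deviates from the across-node
    mean by more than k standard deviations -- a single-metric toy
    version of statistical data reduction over client nodes."""
    values = list(metric_by_node.values())
    mu, sigma = mean(values), stdev(values)
    return sorted(n for n, v in metric_by_node.items()
                  if sigma > 0 and abs(v - mu) > k * sigma)

load = {"n01": 0.42, "n02": 0.45, "n03": 0.43, "n04": 0.44, "n05": 0.97}
print(node_anomalies(load, k=1.5))  # expected: ['n05']
```

At scale, the mean and standard deviation would be computed with a parallel reduction (e.g. an MPI allreduce of sums and sums of squares) rather than on a single host.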
Visual analytics of anomaly detection in large data streams
NASA Astrophysics Data System (ADS)
Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.; Sharma, Ratnesh K.; Mehta, Abhay
2009-01-01
Most data streams are multi-dimensional, high-speed, and contain massive volumes of continuous information. They are seen in daily applications, such as telephone calls, retail sales, data center performance, and oil production operations. Many analysts want insight into the behavior of this data. They want to catch the exceptions in flight to reveal the causes of the anomalies and to take immediate action. To guide the user in finding the anomalies in the large data stream quickly, we derive a new automated neighborhood threshold marking technique, called AnomalyMarker. This technique is built on cell-based data streams and user-defined thresholds. We extend the scope of the data points around the threshold to include the surrounding areas. The idea is to define a focus area (marked area) which enables users to (1) visually group the interesting data points related to the anomalies (i.e., problems that occur persistently or occasionally) for observing their behavior; and (2) discover the factors related to the anomaly by visualizing the correlations between the problem attribute and the attributes of the nearby data items from the entire multi-dimensional data stream. Mining results are quickly presented in graphical representations (i.e., tooltips) for the user to zoom into the problem regions. Different algorithms are introduced that try to optimize the size and extent of the anomaly markers. We have successfully applied this technique to detect data stream anomalies in large real-world enterprise server performance and data center energy management.
Autonomous detection of crowd anomalies in multiple-camera surveillance feeds
NASA Astrophysics Data System (ADS)
Nordlöf, Jonas; Andersson, Maria
2016-10-01
A novel approach for autonomous detection of anomalies in crowded environments is presented in this paper. The proposed model uses a Gaussian mixture probability hypothesis density (GM-PHD) filter as a feature extractor in conjunction with different Gaussian mixture hidden Markov models (GM-HMMs). Results, based on both simulated and recorded data, indicate that this method can track and detect anomalies online in individual crowds through multiple camera feeds in a crowded environment.
Hyperspectral target detection using heavy-tailed distributions
NASA Astrophysics Data System (ADS)
Willis, Chris J.
2009-09-01
One promising approach to target detection in hyperspectral imagery exploits a statistical mixture model to represent scene content at a pixel level. The process then goes on to look for pixels which are rare, when judged against the model, and marks them as anomalies. It is assumed that military targets will themselves be rare and therefore likely to be detected amongst these anomalies. For the typical assumption of multivariate Gaussianity for the mixture components, the presence of the anomalous pixels within the training data will have a deleterious effect on the quality of the model. In particular, the derivation process itself is adversely affected by the attempt to accommodate the anomalies within the mixture components. This will bias the statistics of at least some of the components away from their true values and towards the anomalies. In many cases this will result in a reduction in the detection performance and an increased false alarm rate. This paper considers the use of heavy-tailed statistical distributions within the mixture model. Such distributions are better able to account for anomalies in the training data within the tails of their distributions, and the balance of the pixels within their central masses. This means that an improved model of the majority of the pixels in the scene may be produced, ultimately leading to a better anomaly detection result. The anomaly detection techniques are examined using both synthetic data and hyperspectral imagery with injected anomalous pixels. A range of results is presented for the baseline Gaussian mixture model and for models accommodating heavy-tailed distributions, for different parameterizations of the algorithms. These include scene understanding results, anomalous pixel maps at given significance levels and Receiver Operating Characteristic curves.
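The core idea, scoring pixels by their likelihood under a background model and flagging the rare ones, can be sketched for a single Gaussian component; the paper's mixture models and heavy-tailed variants generalize this. Values below are illustrative, not hyperspectral data.

```python
import math
from statistics import mean, stdev

def gaussian_nll(x, mu, sigma):
    """Negative log-likelihood of x under N(mu, sigma^2); rare
    (anomalous) samples receive a high score. Heavy-tailed models
    replace this density so that outliers in the training data
    inflate the fitted parameters less."""
    return 0.5 * math.log(2 * math.pi * sigma**2) + (x - mu)**2 / (2 * sigma**2)

background = [5.0, 5.1, 4.9, 5.05, 4.95, 5.0]
mu, sigma = mean(background), stdev(background)
scores = [gaussian_nll(x, mu, sigma) for x in [5.0, 9.0]]
print(scores[1] > scores[0])  # the 9.0 sample scores as far less likely
```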
Skillful Spring Forecasts of September Arctic Sea Ice Extent Using Passive Microwave Data
NASA Technical Reports Server (NTRS)
Petty, A. A.; Schroder, D.; Stroeve, J. C.; Markus, Thorsten; Miller, Jeffrey A.; Kurtz, Nathan Timothy; Feltham, D. L.; Flocco, D.
2017-01-01
In this study, we demonstrate skillful spring forecasts of detrended September Arctic sea ice extent using passive microwave observations of sea ice concentration (SIC) and melt onset (MO). We compare these to forecasts produced using data from a sophisticated melt pond model, and find similar or higher skill values, where the forecast skill is calculated relative to linear trend persistence. The MO forecasts show the highest skill in March-May, while the SIC forecasts produce the highest skill in June-August, especially when the forecasts are evaluated over recent years (since 2008). The high MO forecast skill in early spring appears to be driven primarily by the presence and timing of open water anomalies, while the high SIC forecast skill appears to be driven by both open water and surface melt processes. Spatial maps of detrended anomalies highlight the drivers of the different forecasts and enable us to understand regions of predictive importance. Correctly capturing sea ice state anomalies, along with changes in open water coverage, appears to be key to skillfully forecasting summer Arctic sea ice.
An Optimized Method to Detect BDS Satellites' Orbit Maneuvering and Anomalies in Real-Time.
Huang, Guanwen; Qin, Zhiwei; Zhang, Qin; Wang, Le; Yan, Xingyuan; Wang, Xiaolei
2018-02-28
The orbital maneuvers of Global Navigation Satellite System (GNSS) constellations decrease the performance and accuracy of positioning, navigation, and timing (PNT). Because satellites in the Chinese BeiDou Navigation Satellite System (BDS) are in Geostationary Orbit (GEO) and Inclined Geosynchronous Orbit (IGSO), maneuvers occur more frequently. Moreover, the precise start moment of a BDS satellite's orbit maneuvering is not available to common users. This paper presents an improved real-time method for detecting BDS satellites' orbit maneuvering and anomalies with higher timeliness and higher accuracy. The main contributions to this improvement are as follows: (1) instead of the previous two-step method, a new one-step method with higher accuracy is proposed to determine the start moment of the orbit maneuvering and the pseudo-random noise code (PRN) of the maneuvering satellite; (2) BDS Medium Earth Orbit (MEO) orbital maneuvers are detected for the first time, using the proposed station-selection strategy; and (3) classified non-maneuvering anomalies are detected by a new robust median method using weak and strong anomaly detection factors. Data from the Multi-GNSS Experiment (MGEX) in 2017 were used for experimental analysis. The results show that the start moment of orbital maneuvers and the period of non-maneuvering anomalies can be determined more accurately in real time. When orbital maneuvers and anomalies occurred, the proposed method improved data utilization by 91 and 95 min, respectively, over 2017. PMID:29495638
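A median-based robust detector of the general kind described above can be sketched with the median absolute deviation (MAD); unlike the mean and standard deviation, the median and MAD are barely perturbed by the anomalies themselves. The constants and series are illustrative, not the paper's weak/strong anomaly detection factors.

```python
from statistics import median

def mad_anomalies(series, k=3.5):
    """Flag values whose robust z-score, 0.6745 * |x - median| / MAD,
    exceeds k. 0.6745 rescales the MAD to match the standard
    deviation for Gaussian data."""
    med = median(series)
    mad = median(abs(x - med) for x in series)
    if mad == 0:
        return []
    return [i for i, x in enumerate(series)
            if 0.6745 * abs(x - med) / mad > k]

residuals = [1.0, 1.1, 0.9, 1.05, 0.95, 1.0, 8.0]  # last epoch is anomalous
print(mad_anomalies(residuals))  # expected: [6]
```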
Transient ice mass variations over Greenland detected by the combination of GPS and GRACE data
NASA Astrophysics Data System (ADS)
Zhang, B.; Liu, L.; Khan, S. A.; van Dam, T. M.; Zhang, E.
2017-12-01
Over the past decade, the Greenland Ice Sheet (GrIS) has been undergoing significant warming and ice mass loss. Such mass loss was not always a steady process but had substantial temporal and spatial variabilities. Here we apply multi-channel singular spectrum analysis to crustal deformation time series measured at about 50 Global Positioning System (GPS) stations mounted on bedrock around the Greenland coast, and to mass changes inferred from the Gravity Recovery and Climate Experiment (GRACE), to detect transient changes in ice mass balance over the GrIS. We detect two transient anomalies: one is a negative melting anomaly (Anomaly 1) that peaked around 2010; the other is a positive melting anomaly (Anomaly 2) that peaked between 2012 and 2013. The GRACE data show that both anomalies caused significant mass changes south of 74°N but negligible changes north of 74°N. Both anomalies caused the maximum mass change in southeast GrIS, followed by west GrIS near Jakobshavn. Our results also show that the mass change caused by Anomaly 1 first reached its maximum in late 2009 in the southeast GrIS and then migrated to west GrIS. In Anomaly 2, however, the southeast GrIS was the last place to reach the maximum mass change, in early 2013, and west GrIS near Jakobshavn was the second-to-last. Most of the GPS data show spatiotemporal patterns similar to those obtained from the GRACE data. However, some GPS time series show discrepancies in either space or time, because of data gaps and different sensitivities to mass loading change: loading deformation measured by GPS can be significantly affected by local dynamic mass changes, which have little impact on GRACE observations.
Detecting errors and anomalies in computerized materials control and accountability databases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiteson, R.; Hench, K.; Yarbro, T.
The Automated MC and A Database Assessment project is aimed at improving anomaly and error detection in materials control and accountability (MC and A) databases and increasing confidence in the data that they contain. Anomalous data resulting in poor categorization of nuclear material inventories greatly reduces the value of the database information to users. Therefore it is essential that MC and A data be assessed periodically for anomalies or errors. Anomaly detection can identify errors in databases and thus provide assurance of the integrity of data. An expert system has been developed at Los Alamos National Laboratory that examines these large databases for anomalous or erroneous data. For several years, MC and A subject matter experts at Los Alamos have been using this automated system to examine the large amounts of accountability data that the Los Alamos Plutonium Facility generates. These data are collected and managed by the Material Accountability and Safeguards System, a near-real-time computerized nuclear material accountability and safeguards system. This year they have expanded the user base, customizing the anomaly detector for the varying requirements of different groups of users. This paper describes the progress in customizing the expert systems to the needs of the users of the data and reports on their results.
On the representation of atmospheric blocking in EURO-CORDEX control runs
NASA Astrophysics Data System (ADS)
Jury, Martin W.; García, Sixto; Gutiérrez, José M.
2017-04-01
While regional climate models (RCMs) have been shown to yield improved projections, due to better representations of orography and higher resolved scales, impacts on mesoscale phenomena like atmospheric blocking have hardly been addressed. In this study we clarify whether the EURO-CORDEX domain is large enough to allow the RCMs to significantly improve the blocking representation relative to the underlying driving data. To this end, we analyzed blocking-accompanying anomalies in near-surface temperature (TAS) and precipitation rate (PR) for a set of RCMs. Five RCMs stem from the ensemble of EURO-CORDEX control runs, while three are WRF models with different nudging realizations; all of them are driven by ERA-Interim. The blocking detection method localizes high-pressure systems between 55°N and 65°N using geopotential height gradients at the 500 hPa level (Z500), and was applied to ERA-Interim and the mentioned RCM data between 1981 and 2010. Detected blocking centers were spatially attributed to three sectors, which have been shown to display distinctive impacts on TAS and PR during blocking episodes. As a reference for TAS and PR we used 86 weather stations across Europe from the ECA&D dataset. Our results indicate that little improvement can be expected in the representation of Z500 fields by the RCMs. Most of them show less blocking than the driving data, while blocking representation was most in agreement with the driving data for RCMs that were strongly conditioned to the driving data. Further, in our idealized setting the RCMs were not able to reproduce the TAS anomalies connected to blocking. Moreover, using the blocking index of the driving data could be considered adequate, because the representation of TAS and PR for falsely detected blocking and non-blocking days in the RCMs did not deviate strongly.
Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A
2018-02-01
Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data.
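The SPC side of such a comparison can be sketched with a Shewhart c-chart for event counts: flag periods above the upper control limit c-bar + 3*sqrt(c-bar). The counts below are invented; real HAI surveillance data would also need the seasonal-variation and autocorrelation handling discussed above.

```python
from statistics import mean

def c_chart_signals(counts):
    """Shewhart c-chart for event counts (e.g., monthly infections):
    under a Poisson assumption, flag periods whose count exceeds
    c_bar + 3 * sqrt(c_bar)."""
    c_bar = mean(counts)
    ucl = c_bar + 3 * c_bar ** 0.5
    return [i for i, c in enumerate(counts) if c > ucl]

monthly_infections = [2, 3, 1, 2, 4, 2, 3, 12]
print(c_chart_signals(monthly_infections))  # expected: [7]
```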
Spatially-Aware Temporal Anomaly Mapping of Gamma Spectra
NASA Astrophysics Data System (ADS)
Reinhart, Alex; Athey, Alex; Biegalski, Steven
2014-06-01
For security, environmental, and regulatory purposes it is useful to continuously monitor wide areas for unexpected changes in radioactivity. We report on a temporal anomaly detection algorithm which uses mobile detectors to build a spatial map of background spectra, allowing sensitive detection of any anomalies through many days or months of monitoring. We adapt previously-developed anomaly detection methods, which compare spectral shape rather than count rate, to function with limited background data, allowing sensitive detection of small changes in spectral shape from day to day. To demonstrate this technique we collected daily observations over the period of six weeks on a 0.33 square mile research campus and performed source injection simulations.
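Comparing spectral shape rather than count rate can be sketched by normalizing spectra to unit area before differencing; a toy example with five-bin spectra and invented numbers, not the detection statistic used in the study:

```python
def spectral_shape_distance(spec_a, spec_b):
    """Compare two gamma spectra by shape, not count rate: normalize
    each to unit area, then sum the squared bin differences. A spectrum
    with the same shape but a different total rate scores near zero."""
    total_a, total_b = sum(spec_a), sum(spec_b)
    return sum((a / total_a - b / total_b) ** 2
               for a, b in zip(spec_a, spec_b))

background    = [100, 80, 60, 40, 20]
same_shape    = [200, 160, 120, 80, 40]   # double the rate, same shape
extra_peak    = [100, 80, 60, 40, 120]    # anomalous peak in the last bin
print(spectral_shape_distance(background, same_shape) < 1e-12)
print(spectral_shape_distance(background, extra_peak) > 0.01)
```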
Dataset of anomalies and malicious acts in a cyber-physical subsystem.
Laso, Pedro Merino; Brosset, David; Puentes, John
2017-10-01
This article presents a dataset produced to investigate how data and information quality estimations make it possible to detect anomalies and malicious acts in cyber-physical systems. Data were acquired making use of a cyber-physical subsystem consisting of liquid containers for fuel or water, along with its automated control and data acquisition infrastructure. The described data consist of temporal series representing five operational scenarios - normal operation, anomalies, breakdown, sabotages, and cyber-attacks - corresponding to 15 different real situations. The dataset is publicly available in the .zip file published with the article, to investigate and compare faulty operation detection and characterization methods for cyber-physical systems.
Saliency U-Net: A regional saliency map-driven hybrid deep learning network for anomaly segmentation
NASA Astrophysics Data System (ADS)
Karargyros, Alex; Syeda-Mahmood, Tanveer
2018-02-01
Deep learning networks are gaining popularity in many medical image analysis tasks due to their generalized ability to automatically extract relevant features from raw images. However, this can make the learning problem unnecessarily hard, requiring network architectures of high complexity. In the case of anomaly detection, in particular, there is often sufficient regional difference between the anomaly and the surrounding parenchyma that could easily be highlighted through bottom-up saliency operators. In this paper we propose a new hybrid deep learning network that uses a combination of the raw image and such regional maps to learn anomalies more accurately with simpler network architectures. Specifically, we modify a deep learning network called U-Net, using both the raw and pre-segmented images as input to produce joint encoding (contraction) and expansion (decoding) paths in the U-Net. We present results of successfully delineating subdural and epidural hematomas in brain CT imaging and liver hemangioma in abdominal CT images using such a network.
Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets
Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin
2017-01-01
Road traffic anomaly denotes a road segment that is anomalous in terms of the traffic flow of vehicles. Detecting road traffic anomalies from GPS (Global Position System) snippet data is becoming critical in urban computing, since such anomalies often suggest underlying events. However, the noisy and sparse nature of GPS snippet data introduces multiple problems that make the detection of road traffic anomalies very challenging. To address these issues, we propose a two-stage solution which consists of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference, incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large GPS snippets via a tensor decomposition technique. RAT then calculates the anomalous degree of each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one month of GPS location data from more than eight thousand taxicabs in Beijing. The evaluation results show the advantages of our method over other baseline techniques. PMID:28282948
Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam
2016-01-01
Operational faults and behavioural anomalies associated with PLC control processes often occur in a manufacturing system, and their real-time identification is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT), that can detect them using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for this purpose. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data-logging system to identify operational faults and behavioural anomalies effectively. PMID:27974882
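The hash-table-based nominal-model idea can be sketched in a few lines: index every pattern observed during normal operation, then flag log entries whose pattern was never seen. The (state, signal) records are invented for illustration; PLAT's actual nominal model and indexing scheme are richer than this.

```python
def build_nominal_model(log_records):
    """Hash-indexed nominal model: the set of every (state, signal)
    pattern seen in normal-operation log data, giving O(1) lookups."""
    return {(state, signal) for state, signal in log_records}

def find_anomalies(model, new_records):
    """Flag log entries whose (state, signal) pattern never occurred
    in the nominal log data."""
    return [r for r in new_records if r not in model]

nominal = [("idle", "start"), ("running", "stop"), ("idle", "reset")]
model = build_nominal_model(nominal)
print(find_anomalies(model, [("running", "stop"), ("running", "eject")]))
# expected: [('running', 'eject')]
```

Because lookups are constant-time, such a detector can run in parallel with the logging system and keep up with high signal rates.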
Artificial intelligence techniques for ground test monitoring of rocket engines
NASA Technical Reports Server (NTRS)
Ali, Moonis; Gupta, U. K.
1990-01-01
An expert system is being developed which can detect anomalies in Space Shuttle Main Engine (SSME) sensor data significantly earlier than the redline algorithm currently in use. The training of such an expert system focuses on two approaches, based on low-frequency and high-frequency analyses of sensor data. Both approaches are being tested on data from SSME tests, and their results are compared with the findings of NASA and Rocketdyne experts. Prototype implementations have detected the presence of anomalies earlier than the redline algorithms currently in use. It therefore appears that these approaches have the potential of detecting anomalies early enough to shut down the engine or take other corrective action before severe damage to the engine occurs.
Implementing Classification on a Munitions Response Project
2011-12-01
Briefing excerpt (slide fragments): detection dig list; IVS/seed site planning decisions; dig-all-anomalies site characterization. Survey details: seed emplacement; EM61-MK2 detection survey with RTK GPS; anomalies selected for further investigation; cued data collected using the MetalMapper. With a threshold of 5.2 mV in channel 2, 938 anomalies were selected; all QC seeds were detected using this threshold, some just inside the 60-cm halo; IVS reproducibility was also assessed.
A scalable architecture for online anomaly detection of WLCG batch jobs
NASA Astrophysics Data System (ADS)
Kuehn, E.; Fischer, M.; Giffels, M.; Jung, C.; Petzold, A.
2016-10-01
For data centres it is increasingly important to monitor network usage and learn from network usage patterns. In particular, configuration issues or misbehaving batch jobs that prevent smooth operation need to be detected as early as possible. At the GridKa data and computing centre we therefore operate a tool, BPNetMon, for monitoring traffic data and characteristics of WLCG batch jobs and pilots locally on different worker nodes. On the one hand, local information by itself is not sufficient to detect anomalies, for several reasons: for example, the underlying job distribution on a single worker node might change, or there might be a local misconfiguration. On the other hand, a centralised anomaly detection approach does not scale with regard to network communication or computational cost. We therefore propose a scalable architecture based on concepts of a super-peer network.
Toward Continuous GPS Carrier-Phase Time Transfer: Eliminating the Time Discontinuity at an Anomaly
Yao, Jian; Levine, Judah; Weiss, Marc
2015-01-01
The wide application of Global Positioning System (GPS) carrier-phase (CP) time transfer is limited by the problem of boundary discontinuity (BD). The discontinuity has two categories. One is “day boundary discontinuity,” which has been studied extensively and can be solved by multiple methods [1–8]. The other category of discontinuity, called “anomaly boundary discontinuity (anomaly-BD),” comes from a GPS data anomaly. The anomaly can be a data gap (i.e., missing data), a GPS measurement error (i.e., bad data), or a cycle slip. An initial study of the anomaly-BD showed that we can fix the discontinuity if the anomaly lasts no more than 20 min, using a polynomial curve-fitting strategy to repair the anomaly [9]. However, sometimes the data anomaly lasts longer than 20 min, so a better curve-fitting strategy is needed. In addition, a cycle slip, as another type of data anomaly, can occur and lead to an anomaly-BD. To solve these problems, this paper proposes a new strategy: satellite-clock-aided curve fitting with cycle slip detection. Basically, this new strategy applies the satellite clock correction to the GPS data. After that, we do the polynomial curve fitting for the code and phase data, as before. Our study shows that the phase-data residual is only ~3 mm for all GPS satellites. The new strategy also detects and counts cycle slips by searching for the minimum curve-fitting residual. Extensive examples show that this new strategy enables us to repair up to a 40-min GPS data anomaly, regardless of whether the anomaly is due to a data gap, a cycle slip, or a combination of the two. We also find that interference with the GPS signal, known as “jamming”, can possibly lead to a time-transfer error, and that this new strategy can compensate for jamming outages. Thus, the new strategy can eliminate the impact of jamming on time transfer. As a whole, we greatly improve the robustness of GPS CP time transfer. PMID:26958451
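The polynomial curve-fitting repair described above can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' satellite-clock-aided implementation; the quadratic clock model, the fit order, and the 20-sample gap are assumptions of the example.

```python
import numpy as np

def bridge_gap(t, x, gap_mask, degree=2):
    """Fit a polynomial to the valid samples and fill the gap.

    t        -- sample times
    x        -- measurements (values inside the gap are ignored)
    gap_mask -- boolean array, True where data are missing or bad
    degree   -- polynomial order (an illustrative choice, not the
                paper's exact fitting order)
    """
    coeffs = np.polyfit(t[~gap_mask], x[~gap_mask], degree)
    filled = x.copy()
    filled[gap_mask] = np.polyval(coeffs, t[gap_mask])
    return filled

# Synthetic clock-like series with a quadratic drift and a 20-sample outage
t = np.arange(100, dtype=float)
x = 0.001 * t**2 + 0.05 * t + 3.0
gap = (t >= 40) & (t < 60)
x_obs = x.copy()
x_obs[gap] = np.nan                  # missing data inside the gap
repaired = bridge_gap(t, x_obs, gap, degree=2)
max_err = np.max(np.abs(repaired[gap] - x[gap]))
```

Because the synthetic drift is exactly quadratic, the repaired values match the truth to machine precision; real GPS data would of course leave a larger residual.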
A Doubly Stochastic Change Point Detection Algorithm for Noisy Biological Signals.
Gold, Nathan; Frasch, Martin G; Herry, Christophe L; Richardson, Bryan S; Wang, Xiaogang
2017-01-01
Experimentally and clinically collected time series data are often contaminated with significant confounding noise, creating short, noisy time series. This noise, due to natural variability and measurement error, poses a challenge to conventional change point detection methods. We propose a novel and robust statistical method for change point detection in noisy biological time series. Our method is a significant improvement over traditional change point detection methods, which only examine a potential anomaly at a single time point. In contrast, our method considers all suspected anomaly points together with the joint probability distribution of the number of change points and the elapsed time between two consecutive anomalies. We validate our method on three simulated time series, a widely accepted benchmark data set, two geological time series, a data set of ECG recordings, and a physiological data set of heart rate variability measurements of a fetal sheep model of human labor, comparing it to three existing methods. Our method demonstrates significantly improved performance over the existing point-wise detection methods.
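The single-time-point style of detector that the authors improve upon can be sketched as a simple two-window mean-shift test. All window sizes, thresholds, and data values here are illustrative assumptions, not anything from the paper.

```python
import statistics

def pointwise_changepoint(series, window=5, z=3.0):
    """Flag index t as a change point when the mean of the next
    `window` samples departs from the mean of the preceding `window`
    samples by more than z standard deviations of the 'before' window.
    This is the pointwise baseline style the paper improves upon."""
    hits = []
    for t in range(window, len(series) - window):
        before = series[t - window:t]
        after = series[t:t + window]
        sd = statistics.pstdev(before) or 1e-9
        if abs(statistics.fmean(after) - statistics.fmean(before)) > z * sd:
            hits.append(t)
    return hits

# Toy noisy signal with one true level shift at index 8
sig = [10, 10.2, 9.9, 10.1, 10.0, 10.1, 9.8,
       10.0, 15.0, 15.2, 14.9, 15.1, 15.0, 14.8]
hits = pointwise_changepoint(sig)
```

On this toy series the baseline flags several adjacent indices around the single true change at index 8, which is exactly the pointwise ambiguity a joint treatment of all suspected anomaly points is designed to resolve.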
Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach
NASA Astrophysics Data System (ADS)
Koeppen, W. C.; Pilger, E.; Wright, R.
2011-07-01
We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
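A per-pixel reference-image z-score in the spirit of the RST-style time series component described above might look like the sketch below. The reference stack, scene sizes, and temperature values are invented for illustration; this is not the MODVOLC or RST production code.

```python
import numpy as np

def rst_index(images, current):
    """Robust-Satellite-Technique-style index: per-pixel z-score of the
    current scene against a reference stack (e.g. the same calendar
    month across years). High values mark thermal anomalies."""
    stack = np.asarray(images, dtype=float)
    mean = stack.mean(axis=0)
    std = stack.std(axis=0)
    std[std == 0] = 1e-9            # guard against constant pixels
    return (np.asarray(current, dtype=float) - mean) / std

# Reference: eight quiet scenes; current: one hot pixel (an active vent, say)
ref = [np.full((4, 4), 300.0) + i for i in range(8)]   # 300..307 K
cur = np.full((4, 4), 303.0)
cur[2, 2] = 350.0
z = rst_index(ref, cur)
```

A detection rule would then threshold `z`; the abstract's note that at least 80 images per calendar month are needed reflects how sensitive the reference mean and standard deviation are to the stack size.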
Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.
2016-06-07
A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configurable to adjust the number of false alerts. The anomaly detector can be used for a variety of probability density functions, including normal Gaussian distributions and irregular distributions, as well as functions associated with continuous or discrete variables.
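One way to sketch the probabilistic scoring idea, for the Gaussian case only, is to score each event by its negative log-likelihood under a fitted model and then pick a score cut-off that yields a desired alert rate (the "regulatability" idea). The traffic numbers and helper names below are hypothetical.

```python
import math

def fit_gaussian(values):
    """Fit mean and standard deviation to historical event values."""
    mu = sum(values) / len(values)
    var = sum((v - mu) ** 2 for v in values) / len(values)
    return mu, math.sqrt(var)

def anomaly_score(x, mu, sigma):
    """Negative log-likelihood under the fitted model: rare events score
    high, which makes scores comparable across disparate data sources."""
    return 0.5 * ((x - mu) / sigma) ** 2 + math.log(sigma * math.sqrt(2 * math.pi))

def threshold_for_rate(scores, alerts_per_n):
    """Pick the score cut-off yielding a desired number of alerts:
    fewer allowed alerts means a higher bar."""
    return sorted(scores)[-alerts_per_n]

flows = [100, 102, 98, 101, 99, 100, 103, 97, 100, 500]  # bytes/s, one burst
mu, sigma = fit_gaussian(flows)
scores = [anomaly_score(f, mu, sigma) for f in flows]
cut = threshold_for_rate(scores, 1)   # alert only on the single most anomalous event
```

The patent's system covers irregular and discrete distributions as well; the same negative-log-probability score applies once a density estimate is available.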
Randomized subspace-based robust principal component analysis for hyperspectral anomaly detection
NASA Astrophysics Data System (ADS)
Sun, Weiwei; Yang, Gang; Li, Jialin; Zhang, Dianfa
2018-01-01
A randomized subspace-based robust principal component analysis (RSRPCA) method for anomaly detection in hyperspectral imagery (HSI) is proposed. RSRPCA combines the advantages of randomized column subspaces and robust principal component analysis (RPCA). It assumes that the background has low-rank properties and that the anomalies are sparse and do not lie in the column subspace of the background. First, RSRPCA implements random sampling to sketch the original HSI dataset from columns and to construct a randomized column subspace of the background. Structured random projections are also adopted to sketch the HSI dataset from rows. Sketching from columns and rows greatly reduces the computational requirements of RSRPCA. Second, RSRPCA adopts the columnwise RPCA (CWRPCA) to eliminate the negative effects of sampled anomaly pixels and to purify the previous randomized column subspace by removing sampled anomaly columns. The CWRPCA decomposes the submatrix of the HSI data into a low-rank matrix (i.e., background component), a noisy matrix (i.e., noise component), and a sparse anomaly matrix (i.e., anomaly component) with only a small proportion of nonzero columns. The inexact augmented Lagrange multiplier algorithm is utilized to optimize the CWRPCA problem and estimate the sparse matrix. Nonzero columns of the sparse anomaly matrix point to sampled anomaly columns in the submatrix. Third, all the pixels are projected onto the complementary subspace of the purified randomized column subspace of the background, and the anomaly pixels in the original HSI data are finally located exactly. Several experiments on three real hyperspectral images are carefully designed to investigate the detection performance of RSRPCA, and the results are compared with four state-of-the-art methods. Experimental results show that the proposed RSRPCA outperforms the four comparison methods both in detection performance and in computational time.
NASA Astrophysics Data System (ADS)
LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.
2016-12-01
Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessment and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., an expected normal pattern or known anomalies). As such, it works for diverse data sets and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
Firefly Algorithm in detection of TEC seismo-ionospheric anomalies
NASA Astrophysics Data System (ADS)
Akhoondzadeh, Mehdi
2015-07-01
Anomaly detection in time series of different earthquake precursors is an essential introduction to creating an early warning system with an allowable uncertainty. Since these time series are often nonlinear, complex, and massive, the predictor method applied should be able to detect discord patterns in large data sets in a short time. This study applies the Firefly Algorithm (FA) as a simple and robust predictor to detect TEC (Total Electron Content) seismo-ionospheric anomalies around the times of some powerful earthquakes, including Chile (27 February 2010), Varzeghan (11 August 2012), and Saravan (16 April 2013). Outstanding anomalies were observed 7 and 5 days before the Chile and Varzeghan earthquakes, respectively, and 3 and 8 days prior to the Saravan earthquake.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ali, Saqib; Wang, Guojun; Cottrell, Roger Leslie
2018-05-28
PingER (Ping End-to-End Reporting) is a worldwide end-to-end Internet performance measurement framework. It was developed by the SLAC National Accelerator Laboratory, Stanford, USA, and has been running for the last 20 years. It has more than 700 monitoring agents and remote sites which monitor the performance of Internet links in around 170 countries of the world. At present, the size of the compressed PingER data set is about 60 GB, comprising 100,000 flat files. The data are publicly available for valuable Internet performance analyses. However, the data sets suffer from missing values and anomalies due to congestion, bottleneck links, queuing overflow, network software misconfiguration, hardware failure, cable cuts, and social upheavals. Therefore, the objective of this paper is to detect such performance drops or spikes, labeled as anomalies or outliers, in the PingER data set. In the proposed approach, the raw text files of the data set are transformed into a PingER dimensional model. The missing values are imputed using the k-NN algorithm. The data are partitioned into similar instances using the k-means clustering algorithm. Afterward, clustering is integrated with the Local Outlier Factor (LOF) using the Cluster-Based Local Outlier Factor (CBLOF) algorithm to detect the anomalies or outliers in the PingER data. Lastly, anomalies are further analyzed to identify the time frames and locations of the hosts generating the major percentage of the anomalies in the PingER data set, which ranges from 1998 to 2016.
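The imputation, clustering, and CBLOF stages of the pipeline above can be sketched with small self-contained routines. This is a simplified reading of the method: the fixed k-means initialization, the toy CBLOF scoring rule, and the synthetic latency data are all assumptions of the example.

```python
import numpy as np

def knn_impute(X, k=3):
    """Fill NaNs with the mean of the k nearest complete rows
    (distances computed on the observed columns only)."""
    X = X.astype(float).copy()
    complete = X[~np.isnan(X).any(axis=1)]
    for i in range(len(X)):
        miss = np.isnan(X[i])
        if miss.any():
            d = np.linalg.norm(complete[:, ~miss] - X[i, ~miss], axis=1)
            nn = complete[np.argsort(d)[:k]]
            X[i, miss] = nn[:, miss].mean(axis=0)
    return X

def kmeans(X, centroids, iters=20):
    """Plain Lloyd iterations from fixed starting centroids."""
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        for j in range(len(centroids)):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

def cblof_scores(X, labels, centroids, large_frac=0.5):
    """Cluster-Based LOF sketch: distance to own centroid for members of
    large clusters, distance to the nearest large-cluster centroid for
    members of small (likely anomalous) clusters."""
    sizes = np.bincount(labels, minlength=len(centroids))
    large = np.where(sizes >= large_frac * sizes.max())[0]
    scores = np.empty(len(X))
    for i, x in enumerate(X):
        if labels[i] in large:
            scores[i] = np.linalg.norm(x - centroids[labels[i]])
        else:
            scores[i] = min(np.linalg.norm(x - centroids[j]) for j in large)
    return scores

# Toy RTT-like measurements: a dense normal cluster plus two spikes
rng = np.random.default_rng(0)
X = rng.normal([50.0, 5.0], 1.0, size=(40, 2))      # normal latency/loss
X = np.vstack([X, [[200.0, 40.0], [210.0, 45.0]]])  # congestion spikes
X[0, 1] = np.nan                                    # one missing value
X = knn_impute(X, k=3)
labels, cents = kmeans(X, X[[0, -1]].copy())
scores = cblof_scores(X, labels, cents)
```

The two congestion spikes land in a small cluster far from the large-cluster centroid and therefore receive much higher CBLOF scores than the normal measurements.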
Road Anomalies Detection System Evaluation.
Silva, Nuno; Shah, Vaibhav; Soares, João; Rodrigues, Helena
2018-06-21
Anomalies on road pavement cause discomfort to drivers and passengers, and may cause mechanical failure or even accidents. Governments spend millions of Euros every year on road maintenance, often causing traffic jams and congestion on urban roads on a daily basis. This paper analyses the difference between the deployment of a road anomaly detection and identification system in a “conditioned” setup and in a real-world setup, in which the system performed worse than in the “conditioned” one. It also presents a system performance analysis based on the training data sets; on the complexity of the attributes, through the application of PCA techniques; and on the attributes in the context of each anomaly type, using acceleration standard deviation attributes to observe how the different anomaly classes are distributed in the Cartesian coordinate system. Overall, we describe the main insights on road anomaly detection challenges to support the design and deployment of a new iteration of our system, towards a road anomaly detection service that provides information about road conditions to drivers and government entities.
NASA Astrophysics Data System (ADS)
Mori, Taketoshi; Ishino, Takahito; Noguchi, Hiroshi; Shimosaka, Masamichi; Sato, Tomomasa
2011-06-01
We propose a life pattern estimation method and an anomaly detection method for elderly people living alone. In our observation system, we deploy pyroelectric sensors in the house and measure the person's activities continuously in order to grasp the person's life pattern. The data are transferred successively to the operation center and displayed precisely to the nurses there, who then decide whether the data indicate an anomaly. In the system, people whose life features resemble each other are categorized into the same group. Anomalies that occurred in the past are shared within the group and utilized in the anomaly detection algorithm. This algorithm is based on an "anomaly score" computed from the activeness of the person, which is approximately proportional to the frequency of the sensor responses per minute. The "anomaly score" is calculated as the difference between the present activeness and the long-term average of past activeness. Thus, the score is positive if the present activeness is higher than the past average, and negative if it is lower. If the score exceeds a certain threshold, an anomaly event has occurred. Moreover, we developed an activity estimation algorithm that estimates the residents' basic activities, such as rising and going out. The estimates are shown to the nurses together with the residents' "anomaly scores." By combining these two pieces of information, the nurses can understand the residents' health conditions.
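The activeness-based score described above might be sketched as follows; the sensor counts and the 5.0 threshold are illustrative values, not ones from the paper.

```python
from statistics import fmean

def anomaly_score(current_activeness, past_activeness):
    """Score = present activeness minus its long-term past average.
    Positive means more active than usual, negative means less active."""
    return current_activeness - fmean(past_activeness)

def is_anomaly(score, threshold=5.0):
    """Flag when the score magnitude exceeds a threshold
    (the 5.0 here is an illustrative value)."""
    return abs(score) > threshold

# Hypothetical pyroelectric sensor responses per minute, averaged per day
past_days = [12, 10, 11, 13, 12, 11, 12, 13]
quiet_day_score = anomaly_score(2, past_days)    # unusually inactive day
normal_day_score = anomaly_score(12, past_days)
```

An unusually inactive day yields a large negative score and is flagged, while an ordinary day stays well inside the threshold.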
Min-max hyperellipsoidal clustering for anomaly detection in network security.
Sarasamma, Suseela T; Zhu, Qiuming A
2006-08-01
A novel hyperellipsoidal clustering technique is presented for an intrusion-detection system in network security. Hyperellipsoidal clusters with maximum intracluster similarity and minimum intercluster similarity are generated from training data sets. The novelty of the technique lies in the fact that the parameters needed to construct higher-order data models in general multivariate Gaussian functions are incrementally derived from the data sets using accretive processes. The technique is implemented in a feedforward neural network that uses a Gaussian radial basis function as the model generator. An evaluation based on the inclusiveness and exclusiveness of samples with respect to specific criteria is applied to accretively learn the output clusters of the neural network. One significant advantage of this is its ability to detect individual anomaly types that are hard to detect with other anomaly-detection schemes. Applying this technique, several feature subsets of the tcptrace network-connection records were identified that give above 95% detection at false-positive rates below 5%.
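A fitted multivariate Gaussian defines exactly the kind of hyperellipsoidal region described above. The sketch below uses a plain Mahalanobis distance to a single fitted cluster, which is a simplification of the paper's accretive RBF-network learning; the synthetic traffic features are invented.

```python
import numpy as np

def fit_gaussian(X):
    """Estimate the mean and inverse covariance of a hyperellipsoidal cluster."""
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    return mu, cov_inv

def mahalanobis(x, mu, cov_inv):
    """Distance in units of the cluster's own ellipsoidal shape."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(1)
# Elongated "normal traffic" cluster: correlated synthetic features
normal = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [2.0, 0.5]])
mu, cov_inv = fit_gaussian(normal)
d_in = mahalanobis(normal[0], mu, cov_inv)           # a training point
d_out = mahalanobis(np.array([0.0, 10.0]), mu, cov_inv)  # off-axis probe
```

The probe point is close to the cluster in Euclidean terms but far in Mahalanobis terms because it lies off the cluster's long axis, which is why ellipsoidal (rather than spherical) clusters help separate anomaly types.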
Anomaly detection driven active learning for identifying suspicious tracks and events in WAMI video
NASA Astrophysics Data System (ADS)
Miller, David J.; Natraj, Aditya; Hockenbury, Ryler; Dunn, Katherine; Sheffler, Michael; Sullivan, Kevin
2012-06-01
We describe a comprehensive system for learning to identify suspicious vehicle tracks from wide-area motion imagery (WAMI) video. First, since the road network for the scene of interest is assumed unknown, agglomerative hierarchical clustering is applied to all spatial vehicle measurements, resulting in spatial cells that largely capture individual road segments. Next, for each track, extreme value feature statistics are computed and aggregated at both the cell (speed, acceleration, azimuth) and track (range, total distance, duration) levels, to form summary (p-value based) anomaly statistics for each track. Here, to fairly evaluate tracks that travel across different numbers of spatial cells, a single (most extreme) statistic is chosen for each cell-level feature type, over all cells traveled. Finally, a novel active learning paradigm, applied to a (logistic regression) track classifier, is invoked to learn to distinguish suspicious from merely anomalous tracks, starting from anomaly-ranked track prioritization, with ground-truth labeling by a human operator. This system has been applied to WAMI video data (ARGUS), with the tracks automatically extracted by a system developed in-house at Toyon Research Corporation. Our system gives promising preliminary results in highly ranking as suspicious aerial vehicles, dismounts, and traffic violators, and in learning which features are most indicative of suspicious tracks.
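The p-value-based summary statistics can be sketched with empirical p-values, taking the single most extreme statistic over all features, in the spirit of the per-feature-type rule above. The feature values are invented for illustration.

```python
def empirical_p_value(value, reference):
    """Fraction of reference values at least as extreme (one-sided, high)."""
    return sum(1 for r in reference if r >= value) / len(reference)

def track_anomaly_stat(track_features, reference_features):
    """Summary statistic for a track: the smallest (most extreme)
    empirical p-value over all of its features."""
    ps = [empirical_p_value(v, ref)
          for v, ref in zip(track_features, reference_features)]
    return min(ps)

# Hypothetical per-cell reference distributions from other tracks
speeds_ref = [30, 32, 35, 28, 31, 33, 90, 29, 34, 30]   # mph
accel_ref = [1, 2, 1, 3, 2, 1, 2, 8, 1, 2]              # mph/s
suspicious = track_anomaly_stat([90, 2], [speeds_ref, accel_ref])
ordinary = track_anomaly_stat([30, 2], [speeds_ref, accel_ref])
```

A low summary p-value ranks a track high for operator review; the active learning stage then learns which of these anomalous tracks are actually suspicious.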
A novel approach for pilot error detection using Dynamic Bayesian Networks.
Saada, Mohamad; Meng, Qinggang; Huang, Tingwen
2014-06-01
In the last decade, Dynamic Bayesian Networks (DBNs) have become one of the most attractive probabilistic modelling frameworks, extending Bayesian Networks (BNs) to reasoning under uncertainty from a temporal perspective. Despite this popularity, not many researchers have attempted to study the use of these networks in anomaly detection, or the implications of data anomalies for the outcome of such models. An abnormal change in the modelled environment's data at a given time will cause a trailing chain effect on the data of all related environment variables in the current and consecutive time slices. Although this effect fades with time, it can still have an ill effect on the outcome of such models. In this paper we propose an algorithm for pilot error detection, using DBNs as the modelling framework for learning and detecting anomalous data. We base our experiments on the actions of an aircraft pilot, and a flight simulator was created for running the experiments. The proposed anomaly detection algorithm has achieved good results in detecting pilot errors and their effects on the whole system.
Novel Hyperspectral Anomaly Detection Methods Based on Unsupervised Nearest Regularized Subspace
NASA Astrophysics Data System (ADS)
Hou, Z.; Chen, Y.; Tan, K.; Du, P.
2018-04-01
Anomaly detection has been of great interest in hyperspectral imagery analysis. Most conventional anomaly detectors merely take advantage of spectral and spatial information within neighboring pixels. In this paper, two methods are proposed, the Unsupervised Nearest Regularized Subspace-based with Outlier Removal Anomaly Detector (UNRSORAD) and the Local Summation UNRSORAD (LSUNRSORAD), which are based on the concept that each pixel in the background can be approximately represented by its spatial neighborhood, while anomalies cannot. Using a dual window, each testing pixel is approximated by a linear combination of the surrounding data. The existence of outliers in the dual window will affect detection accuracy, so the proposed detectors remove outlier pixels that are significantly different from the majority of pixels. In order to make full use of the local spatial distribution information within the neighboring pixels of the pixel under test, we adopt a local summation dual-window sliding strategy. The residual image is constituted by subtracting the predicted background from the original hyperspectral imagery, and anomalies can be detected in the residual image. Experimental results show that the proposed methods greatly improve detection accuracy compared with other traditional detection methods.
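The dual-window residual idea can be sketched with a plain ring-mean background predictor in place of the paper's regularized linear combination; the window sizes and the single-band test image are illustrative assumptions.

```python
import numpy as np

def dual_window_residual(img, inner=1, outer=2):
    """For each pixel, predict the background as the mean of the ring
    between the inner and outer windows, and return |pixel - prediction|.
    (A ring mean is a simplification of UNRSORAD's regularized
    linear-combination representation.)"""
    h, w = img.shape
    res = np.zeros_like(img, dtype=float)
    for i in range(outer, h - outer):
        for j in range(outer, w - outer):
            block = img[i - outer:i + outer + 1, j - outer:j + outer + 1].copy()
            # Mask out the inner window so the pixel cannot predict itself
            block[outer - inner:outer + inner + 1,
                  outer - inner:outer + inner + 1] = np.nan
            res[i, j] = abs(img[i, j] - np.nanmean(block))
    return res

img = np.full((9, 9), 10.0)   # flat background, single band
img[4, 4] = 50.0              # a one-pixel anomaly
res = dual_window_residual(img)
```

The anomalous pixel cannot be represented by its ring of neighbors and so dominates the residual image, which is where detection happens.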
Application of data cubes for improving detection of water cycle extreme events
NASA Astrophysics Data System (ADS)
Teng, W. L.; Albayrak, A.
2015-12-01
As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time-series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on the fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are data, as archived, rearranged into spatio-temporal matrices, which allow for easy access to the data both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cube concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVM) for anomaly classification. We show an example of detection of WCE events using the Global Land Data Assimilation System (GLDAS) data set.
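A minimal SPRT for a Gaussian mean shift, one candidate form of the anomaly detection component mentioned above, could look like the sketch below. The error rates, means, and soil-moisture-like numbers are assumptions of the example, not values from the project.

```python
import math

def sprt(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for a mean shift
    mu0 -> mu1 in Gaussian data with known sigma. Returns ('H1', n)
    on detection after n samples, ('H0', n) on acceptance of the
    baseline, or ('continue', n) if still undecided."""
    lower = math.log(beta / (1 - alpha))      # accept-H0 boundary
    upper = math.log((1 - beta) / alpha)      # accept-H1 boundary
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # Log-likelihood ratio increment for one Gaussian observation
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return 'H1', n
        if llr <= lower:
            return 'H0', n
    return 'continue', len(samples)

# Soil-moisture-flavoured toy data: baseline 0.30, drought-like 0.10
dry = [0.11, 0.09, 0.12, 0.10, 0.08, 0.11]
wet = [0.31, 0.29, 0.30, 0.32, 0.28, 0.30]
drought = sprt(dry, mu0=0.30, mu1=0.10, sigma=0.05)
baseline = sprt(wet, mu0=0.30, mu1=0.10, sigma=0.05)
```

The sequential formulation is what makes data rods attractive here: the test consumes the time series sample by sample and stops as soon as either boundary is crossed.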
Detection and characterization of buried lunar craters with GRAIL data
NASA Astrophysics Data System (ADS)
Sood, Rohan; Chappaz, Loic; Melosh, Henry J.; Howell, Kathleen C.; Milbury, Colleen; Blair, David M.; Zuber, Maria T.
2017-06-01
We used gravity mapping observations from NASA's Gravity Recovery and Interior Laboratory (GRAIL) to detect, characterize, and validate the presence of large impact craters buried beneath the lunar maria. In this paper we focus on two prominent anomalies detected in the GRAIL data using the gravity gradiometry technique. Our detection strategy is applied to both free-air and Bouguer gravity field observations to identify gravitational signatures that are similar to those observed over buried craters. The presence of buried craters is further supported by individual analysis of regional free-air gravity anomalies, Bouguer gravity anomaly maps, and forward modeling. Our best candidate, for which we propose the informal name of Earhart Crater, is approximately 200 km in diameter and forms part of the northwestern rim of Lacus Somniorum. The other candidate, for which we propose the informal name of Ashoka Anomaly, is approximately 160 km in diameter and lies completely buried beneath Mare Tranquillitatis. Other large, still unrecognized craters undoubtedly underlie other portions of the Moon's vast mare lavas.
Spatial-temporal event detection in climate parameter imagery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKenna, Sean Andrew; Gutierrez, Karen A.
Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and Southern India.
Syed, Zeeshan; Saeed, Mohammed; Rubinfeld, Ilan
2010-01-01
For many clinical conditions, only a small number of patients experience adverse outcomes. Developing risk stratification algorithms for these conditions typically requires collecting large volumes of data to capture enough positive and negative examples for training. This process is slow, expensive, and may not be appropriate for new phenomena. In this paper, we explore different anomaly detection approaches to identify high-risk patients as cases that lie in sparse regions of the feature space. We study three broad categories of anomaly detection methods: classification-based, nearest neighbor-based, and clustering-based techniques. When evaluated on data from the National Surgical Quality Improvement Program (NSQIP), these methods were able to successfully identify patients at an elevated risk of mortality and rare morbidities following inpatient surgical procedures. PMID:21347083
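The nearest-neighbor-based category can be sketched by scoring each patient by the mean distance to the k nearest cohort members, so that cases in sparse regions of feature space score high. The two-feature cohort below is entirely synthetic.

```python
import math

def knn_score(point, data, k=3):
    """Anomaly score = mean distance to the k nearest neighbours;
    points in sparse regions of the feature space score high.
    (If `point` is itself in `data`, its zero self-distance counts
    as one of the k neighbours.)"""
    dists = sorted(math.dist(point, d) for d in data)
    return sum(dists[:k]) / k

# Hypothetical cohort over two features (age, BMI)
cohort = [(age, bmi) for age in (40, 45, 50, 55) for bmi in (22, 25, 28)]
typical = knn_score((50, 25), cohort)   # inside the dense region
outlier = knn_score((90, 45), cohort)   # sparse region: elevated risk?
```

In practice the features would be standardized first, since raw clinical variables live on very different scales and would otherwise dominate the distance.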
A Testbed for Data Fusion for Engine Diagnostics and Prognostics
2002-03-01
Report excerpt (fragments): ...detected too late to be useful for prognostics development. Table 1, a table of acronyms, defines, e.g., AD as anomaly detector. ...strictly defined points. Determining where we are on the engine health curve is the first step in prognostics. Fault detection / diagnostic reasoning... As described above, the ability of the monitoring system to detect an anomaly is especially important for knowledge-based systems.
AnRAD: A Neuromorphic Anomaly Detection Framework for Massive Concurrent Data Streams.
Chen, Qiuwen; Luley, Ryan; Wu, Qing; Bishop, Morgan; Linderman, Richard W; Qiu, Qinru
2018-05-01
The evolution of high performance computing technologies has enabled the large-scale implementation of neuromorphic models and pushed the research in computational intelligence into a new era. Among machine learning applications, unsupervised detection of anomalous streams is especially challenging due to the requirements of detection accuracy and real-time performance. Designing a computing framework that harnesses the growing computing power of multicore systems while maintaining high sensitivity and specificity to anomalies is an urgent research topic. In this paper, we propose anomaly recognition and detection (AnRAD), a bioinspired detection framework that performs probabilistic inferences. We analyze the feature dependency and develop a self-structuring method that learns an efficient confabulation network using unlabeled data. This network is capable of fast incremental learning, which continuously refines the knowledge base using streaming data. Compared with several existing anomaly detection approaches, our method provides competitive detection quality. Furthermore, we exploit the massively parallel structure of the AnRAD framework. Our implementations of the detection algorithm on the graphics processing unit and the Xeon Phi coprocessor both obtain substantial speedups over the sequential implementation on a general-purpose microprocessor. The framework provides real-time service to concurrent data streams within diversified knowledge contexts, and can be applied to large problems with multiple local patterns. Experimental results demonstrate high computing performance and memory efficiency. For vehicle behavior detection, the framework is able to monitor up to 16000 vehicles (data streams) and their interactions in real time with a single commodity coprocessor, and uses less than 0.2 ms per testing subject.
Finally, the detection network is ported to our spiking neural network simulator to show the potential of adapting to the emerging neuromorphic architectures.
OceanXtremes: Scalable Anomaly Detection in Oceanographic Time-Series
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Armstrong, E. M.; Chin, T. M.; Gill, K. M.; Greguska, F. R., III; Huang, T.; Jacob, J. C.; Quach, N.
2016-12-01
The oceanographic community must meet the challenge to rapidly identify features and anomalies in complex and voluminous observations to further science and improve decision support. Given this data-intensive reality, we are developing an anomaly detection system, called OceanXtremes, powered by an intelligent, elastic Cloud-based analytic service backend that enables execution of domain-specific, multi-scale anomaly and feature detection algorithms across the entire archive of 15- to 30-year ocean science datasets. Our parallel analytics engine extends the NEXUS system and exploits multiple open-source technologies: Apache Cassandra as a distributed spatial "tile" cache, Apache Spark for in-memory parallel computation, and Apache Solr for spatial search and for storing pre-computed tile statistics and other metadata. OceanXtremes provides these key capabilities: parallel generation (Spark on a compute cluster) of 15- to 30-year ocean climatologies (e.g. sea surface temperature, or SST) in hours or overnight, using simple pixel averages or customizable Gaussian-weighted "smoothing" over latitude, longitude, and time; parallel pre-computation, tiling, and caching of anomaly fields (daily variables minus a chosen climatology) with pre-computed tile statistics; parallel detection (over the time series of tiles) of anomalies or phenomena by regional area-averages exceeding a specified threshold (e.g. high SST in El Niño or SST "blob" regions), or by more complex, custom data mining algorithms; shared discovery and exploration of ocean phenomena and anomalies (facet search using Solr), along with unexpected correlations between key measured variables; and scalable execution of all capabilities on a hybrid Cloud, using our on-premise OpenStack cluster or at Amazon.
The key idea is that the parallel data-mining operations will be run "near" the ocean data archives (a local "network" hop) so that we can efficiently access the thousands of files making up a three decade time-series. The presentation will cover the architecture of OceanXtremes, parallelization of the climatology computation and anomaly detection algorithms using Spark, example results for SST and other time-series, and parallel performance metrics.
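The anomaly-field step described above (daily variable minus a chosen climatology, followed by a threshold test) can be sketched in miniature for a single pixel's time series. This is an illustrative reduction, not the NEXUS/OceanXtremes API; the function names, the median-based climatology, and the 3-sigma threshold are assumptions chosen for the toy data.

```python
# Hypothetical single-pixel sketch: build a day-of-year climatology, subtract it,
# and flag days whose anomaly exceeds k standard deviations of the residuals.
from statistics import median, stdev

def climatology(series, period=365):
    """Median value for each position in the annual cycle (robust to spikes)."""
    bins = [[] for _ in range(period)]
    for day, value in enumerate(series):
        bins[day % period].append(value)
    return [median(b) for b in bins]

def detect_anomalies(series, period=365, k=3.0):
    clim = climatology(series, period)
    residuals = [v - clim[i % period] for i, v in enumerate(series)]
    sigma = stdev(residuals)
    return [i for i, r in enumerate(residuals) if abs(r) > k * sigma]

sst = [20.0] * (3 * 365)   # three flat synthetic "years" of SST
sst[400] = 25.0            # one injected warm spike
print(detect_anomalies(sst))  # -> [400]
```

In the real system the same subtraction is pre-computed and tiled with Spark; the per-pixel logic stays this simple.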
A new approach for structural health monitoring by applying anomaly detection on strain sensor data
NASA Astrophysics Data System (ADS)
Trichias, Konstantinos; Pijpers, Richard; Meeuwissen, Erik
2014-03-01
Structural Health Monitoring (SHM) systems help to monitor critical infrastructures (bridges, tunnels, etc.) remotely and provide up-to-date information about their physical condition. In addition, they help to predict the structure's remaining life and required maintenance in a cost-efficient way. Typically, inspection data gives insight into the structural health. The global structural behavior, and predominantly the structural loading, is generally measured with vibration and strain sensors. Acoustic emission sensors are increasingly used for measuring global crack activity near critical locations. In this paper, we present a procedure for local structural health monitoring by applying Anomaly Detection (AD) on data from strain sensors applied along the expected crack path. Sensor data is analyzed by automatic anomaly detection in order to find crack activity at an early stage. This approach targets the monitoring of critical structural locations, such as welds, near which strain sensors can be applied during construction, and/or locations with limited inspection possibilities during structural operation. We investigate several anomaly detection techniques to detect changes in statistical properties, indicating structural degradation. The most effective one is a novel polynomial fitting technique, which tracks slow changes in sensor data. Our approach has been tested on a representative test structure (bridge deck) in a lab environment, under constant and variable amplitude fatigue loading. In both cases, the evolving cracks at the monitored locations were successfully detected, autonomously, by our AD monitoring tool.
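The general idea of fitting a polynomial to windows of strain readings and flagging changes in statistical properties can be illustrated roughly as follows. This is not the authors' algorithm: it uses a degree-1 fit, and the window size and residual threshold are invented for the synthetic data.

```python
# Illustrative sketch: fit a line to each sliding window of strain readings and
# flag windows whose residual spread jumps, suggesting a change in behavior.
from statistics import mean

def linear_fit(y):
    """Least-squares line through (0, y[0]), (1, y[1]), ...; returns (slope, intercept)."""
    n = len(y)
    xs = range(n)
    xm, ym = mean(xs), mean(y)
    slope = sum((x - xm) * (v - ym) for x, v in zip(xs, y)) / sum((x - xm) ** 2 for x in xs)
    return slope, ym - slope * xm

def residual_rms(y):
    slope, intercept = linear_fit(y)
    return (sum((v - (slope * i + intercept)) ** 2 for i, v in enumerate(y)) / len(y)) ** 0.5

def flag_windows(series, window=20, threshold=0.5):
    return [start for start in range(0, len(series) - window + 1, window)
            if residual_rms(series[start:start + window]) > threshold]

# Smooth ramp, then a noisy crack-like disturbance injected in the last window.
strain = [0.01 * i for i in range(60)] + [0.6 + ((-1) ** i) * 2.0 for i in range(20)]
print(flag_windows(strain))  # -> [60]
```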
Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets.
Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin
2017-03-09
A road traffic anomaly denotes a road segment that is anomalous in terms of the traffic flow of vehicles. Detecting road traffic anomalies from GPS (Global Positioning System) snippet data is becoming critical in urban computing since such anomalies often suggest underlying events. However, the noisy and sparse nature of GPS snippet data introduces multiple problems that make the detection of road traffic anomalies very challenging. To address these issues, we propose a two-stage solution which consists of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference by incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large collections of GPS snippets via a tensor decomposition technique. The RAT model then calculates the anomalous degree for each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one month of GPS location data from more than eight thousand taxi cabs in Beijing. The evaluation results show the advantages of our method over other baseline techniques.
2014-10-02
potential advantages of using multivariate classification/discrimination/anomaly detection methods on real world accelerometric condition monitoring ...case of false anomaly reports. A possible explanation of this phenomenon could be given [Annual Conference of the Prognostics and Health Management Society] ...of those helicopters. 1. Anomaly detection by means of a self-learning Shewhart control chart. A problem highlighted by the experts of AgustaWestland
An Investigation of State-Space Model Fidelity for SSME Data
NASA Technical Reports Server (NTRS)
Martin, Rodney Alexander
2008-01-01
In previous studies, a variety of unsupervised anomaly detection techniques were applied to SSME (Space Shuttle Main Engine) data. The observed results indicated that the identification of certain anomalies was specific to the algorithmic method under consideration. This is why one of the follow-on goals of these previous investigations was to build an architecture to support the best capabilities of all algorithms. We appeal to that goal here by investigating a cascade, serial architecture for the best-performing and most suitable candidates from previous studies. As a precursor to a formal ROC (Receiver Operating Characteristic) curve analysis for validation of the resulting anomaly detection algorithms, our primary focus here is to investigate model fidelity as measured by variants of the AIC (Akaike Information Criterion) for state-space-based models. We show that placing constraints on a state-space model during or after the training of the model introduces a modest level of suboptimality. Furthermore, we compare the fidelity of all candidate models, including those embodying the cascade, serial architecture. We make recommendations on the most suitable candidates for application to subsequent anomaly detection studies as measured by AIC-based criteria.
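The AIC comparison used above is standard: AIC = 2k - 2 ln L, so a constrained model (fewer free parameters k) can win despite a slightly worse likelihood. The sketch below assumes Gaussian residuals; the residual values and parameter counts are invented for illustration, not taken from the paper.

```python
# Toy AIC comparison for two Gaussian-residual models of differing complexity.
import math

def gaussian_log_likelihood(residuals, sigma):
    n = len(residuals)
    return (-0.5 * n * math.log(2 * math.pi * sigma ** 2)
            - sum(r ** 2 for r in residuals) / (2 * sigma ** 2))

def aic(num_params, log_likelihood):
    return 2 * num_params - 2 * log_likelihood

# The flexible model fits a little better but pays for its extra parameters.
resid_constrained = [0.12, -0.08, 0.10, -0.11, 0.09, -0.10]
resid_flexible    = [0.10, -0.07, 0.09, -0.10, 0.08, -0.09]
aic_c = aic(4,  gaussian_log_likelihood(resid_constrained, 0.1))
aic_f = aic(12, gaussian_log_likelihood(resid_flexible, 0.1))
print(aic_c < aic_f)  # -> True: the constrained model has the lower (better) AIC
```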
Semi-Supervised Novelty Detection with Adaptive Eigenbases, and Application to Radio Transients
NASA Technical Reports Server (NTRS)
Thompson, David R.; Majid, Walid A.; Reed, Colorado J.; Wagstaff, Kiri L.
2011-01-01
We present a semi-supervised online method for novelty detection and evaluate its performance for radio astronomy time series data. Our approach uses adaptive eigenbases to combine 1) prior knowledge about uninteresting signals with 2) online estimation of the current data properties to enable highly sensitive and precise detection of novel signals. We apply the method to the problem of detecting fast transient radio anomalies and compare it to current alternative algorithms. Tests based on observations from the Parkes Multibeam Survey show both effective detection of interesting rare events and robustness to known false alarm anomalies.
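The eigenbasis scoring idea can be sketched with a fixed, hand-picked orthonormal basis: a signal well explained by the basis of "uninteresting" patterns leaves a small orthogonal residual, while a novel transient leaves a large one. The paper adapts the basis online; everything below (basis choice, data, names) is an assumption for illustration.

```python
# Novelty score = norm of the signal component orthogonal to a known basis.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def novelty_score(signal, basis):
    """Project out each orthonormal basis vector; return the residual norm."""
    residual = list(signal)
    for b in basis:
        c = dot(residual, b)
        residual = [r - c * x for r, x in zip(residual, b)]
    return math.sqrt(dot(residual, residual))

constant = [0.5, 0.5, 0.5, 0.5]                               # normalized constant pattern
ramp = [x / math.sqrt(20.0) for x in (-3.0, -1.0, 1.0, 3.0)]  # normalized linear ramp
known = [1.0, 2.0, 3.0, 4.0]   # lies in span(constant, ramp): near-zero score
spike = [0.0, 0.0, 5.0, 0.0]   # transient outside the subspace: large score
print(novelty_score(known, [constant, ramp]) < novelty_score(spike, [constant, ramp]))  # -> True
```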
NASA Astrophysics Data System (ADS)
Pattisahusiwa, Asis; Houw Liong, The; Purqon, Acep
2016-08-01
In this study, we compare two learning mechanisms, outlier detection and novelty detection, for detecting the ionospheric TEC disturbances caused by the November 2004 geomagnetic storm and the January 2005 substorm. Both mechanisms are applied using the v-SVR learning algorithm, a regression version of SVM. Our results show that both mechanisms are quite accurate in learning the TEC data. However, novelty detection is more accurate than outlier detection in extracting anomalies related to geomagnetic events. The anomalies found by outlier detection are mostly related to trends in the data, while those found by novelty detection are associated with geomagnetic events. Novelty detection also shows evidence of LSTIDs during the geomagnetic events.
NASA Astrophysics Data System (ADS)
Zhao, Mingkang; Wi, Hun; Lee, Eun Jung; Woo, Eung Je; In Oh, Tong
2014-10-01
Electrical impedance imaging has the potential to detect an early stage of breast cancer due to the higher admittivity values of tumors compared with those of normal breast tissues. The tumor size and the extent of axillary lymph node involvement are important parameters for evaluating the breast cancer survival rate. Additionally, anomaly characterization is required to distinguish a malignant tumor from a benign tumor. In order to overcome the limitations of breast cancer detection using impedance measurement probes, we developed a high-density trans-admittance mammography (TAM) system with a 60 × 60 electrode array and produced trans-admittance maps obtained at several frequency pairs. We applied the anomaly detection algorithm to the high-density TAM system to estimate the volume and position of breast tumors. We tested four different sizes of anomaly with three different conductivity contrasts at four different depths. From multifrequency trans-admittance maps, we can readily observe the transversal position and estimate the volume and depth of an anomaly. Notably, the depth estimates were accurate and independent of anomaly size and conductivity contrast when applying the new formula using the Laplacian of the trans-admittance map. The volume estimation was dependent on the conductivity contrast between the anomaly and the background in the breast phantom. We characterized two test anomalies using frequency-difference trans-admittance data to eliminate the dependency on anomaly position and size. We confirmed the anomaly detection and characterization algorithm with the high-density TAM system on bovine breast tissue. Both results showed the feasibility of detecting the size and position of an anomaly and of tissue characterization for breast cancer screening.
Observed TEC Anomalies by GNSS Sites Preceding the Aegean Sea Earthquake of 2014
NASA Astrophysics Data System (ADS)
Ulukavak, Mustafa; Yalçınkaya, Mualla
2016-11-01
In recent years, Total Electron Content (TEC) data obtained from Global Navigation Satellite Systems (GNSS) receivers have been widely used to detect seismo-ionospheric anomalies. In this study, Global Positioning System Total Electron Content (GPS-TEC) data were used to investigate abnormal ionospheric behavior prior to the 2014 Aegean Sea earthquake (40.305°N, 25.453°E, 24 May 2014, 09:25:03 UT, Mw 6.9). Data obtained from three Continuously Operating Reference Stations in Turkey (CORS-TR) and two International GNSS Service (IGS) sites near the epicenter of the earthquake were used to detect ionospheric anomalies before the earthquake. The solar activity index (F10.7) and the geomagnetic activity index (Dst), which are both related to space weather conditions, were used to analyze these pre-earthquake ionospheric anomalies. An examination of these indices indicated high solar activity between May 8 and 15, 2014. The first significant increase (positive anomaly) in Vertical Total Electron Content (VTEC) was detected on May 14, 2014, 10 days before the earthquake. This positive anomaly can be attributed to the high solar activity. The indices do not imply high solar or geomagnetic activity after May 15, 2014. Abnormal ionospheric TEC changes (a negative anomaly) were observed at all stations one day before the earthquake. These changes were lower than the lower bound by approximately 10-20 TEC units (TECU), and may be considered an ionospheric precursor of the 2014 Aegean Sea earthquake.
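A common form of the bound test used in TEC precursor studies compares each day's VTEC against upper and lower bounds built from a trailing window, e.g. median ± 1.5 × interquartile range. The sketch below follows that generic recipe, not necessarily the authors' exact procedure; the window length, multiplier, crude quartile indexing, and synthetic values are all assumptions.

```python
# Flag VTEC values that cross sliding median +/- k*IQR bounds (units: TECU).
from statistics import median

def iqr_bounds(window, k=1.5):
    s = sorted(window)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]   # crude quartiles, fine for a sketch
    m = median(s)
    return m - k * (q3 - q1), m + k * (q3 - q1)

def tec_anomalies(vtec, window=15):
    hits = []
    for i in range(window, len(vtec)):
        lo, hi = iqr_bounds(vtec[i - window:i])
        if vtec[i] < lo:
            hits.append((i, 'negative'))
        elif vtec[i] > hi:
            hits.append((i, 'positive'))
    return hits

vtec = [30.0 + (i % 3) for i in range(30)]  # quiet background around 30-32 TECU
vtec[25] = 12.0                             # sharp depletion well below the lower bound
print(tec_anomalies(vtec))  # -> [(25, 'negative')]
```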
NASA Astrophysics Data System (ADS)
Koeppen, W. C.; Wright, R.; Pilger, E.
2009-12-01
We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT), to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, detecting pixels that fall outside the envelope of normal thermal behavior, and (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., the Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations.
We also found that MODVOLC2 achieved good results on multiple sensors (MODIS and GOES), which provides confidence that MODVOLC2 can be run on future instruments regardless of their spatial and temporal resolutions. The improved performance of MODVOLC2 over MODVOLC makes possible the detection of lower temperature thermal anomalies that will be useful in improving our ability to document Earth’s volcanic eruptions as well as detect possible low temperature thermal precursors to larger eruptions.
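The time-series core of the approach described above, reduced to one pixel, amounts to building a per-calendar-month reference and variability value and flagging observations whose normalized deviation leaves the envelope. The sketch below is that reduction only: real MODVOLC2 operates on whole images, first removes hot outliers with the original MODVOLC test, and its envelope width is not the 4-sigma value assumed here.

```python
# Per-month reference/variability for one pixel, plus an envelope test.
from statistics import mean, stdev

def monthly_reference(radiance_by_month):
    """(mean, std) of background radiance per calendar month for one pixel."""
    return {m: (mean(v), stdev(v)) for m, v in radiance_by_month.items()}

def is_thermal_anomaly(value, month, reference, n_sigma=4.0):
    ref, var = reference[month]
    return (value - ref) > n_sigma * var

history = {1: [300.1, 300.3, 299.9, 300.2, 299.8],   # January background (arbitrary units)
           7: [305.0, 305.4, 304.8, 305.2, 304.6]}   # July background
ref = monthly_reference(history)
print(is_thermal_anomaly(310.0, 1, ref))  # -> True  (hot pixel in January)
print(is_thermal_anomaly(300.2, 1, ref))  # -> False (normal January value)
```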
GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D
This paper introduces a novel graph-analytic approach for detecting anomalies in network flow data called GraphPrints. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets, small induced subgraphs that describe local topology. By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified, and furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level and 0.05% at the IP level, with 100% true positive rates at both.
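A rough sketch of that pipeline, under heavy simplification: summarize each traffic slice by a single graphlet count (triangles only, whereas GraphPrints counts many graphlet types) and run a median/MAD outlier test on the count sequence. All data and thresholds below are invented.

```python
# Triangle count per time slice, then a median/MAD outlier test on the sequence.
from statistics import median
from itertools import combinations

def triangle_count(edges):
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    return sum(1 for u, v, w in combinations(sorted(adj), 3)
               if v in adj[u] and w in adj[u] and w in adj[v])

def outlier_slices(counts, k=5.0):
    med = median(counts)
    mad = median(abs(c - med) for c in counts) or 1.0  # guard against MAD = 0
    return [i for i, c in enumerate(counts) if abs(c - med) > k * mad]

slices = [[(1, 2), (2, 3), (1, 3)],                        # one triangle
          [(1, 2), (2, 3), (1, 3), (3, 4)],                # one triangle
          [(1, 2), (2, 3), (1, 3), (2, 4)],                # one triangle
          [(a, b) for a, b in combinations(range(6), 2)]]  # K6 burst: 20 triangles
counts = [triangle_count(s) for s in slices]
print(counts, outlier_slices(counts))  # -> [1, 1, 1, 20] [3]
```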
Conditional Anomaly Detection with Soft Harmonic Functions
Valko, Michal; Kveton, Branislav; Valizadegan, Hamed; Cooper, Gregory F.; Hauskrecht, Milos
2012-01-01
In this paper, we consider the problem of conditional anomaly detection that aims to identify data instances with an unusual response or a class label. We develop a new non-parametric approach for conditional anomaly detection based on the soft harmonic solution, with which we estimate the confidence of the label to detect anomalous mislabeling. We further regularize the solution to avoid the detection of isolated examples and examples on the boundary of the distribution support. We demonstrate the efficacy of the proposed method on several synthetic and UCI ML datasets in detecting unusual labels when compared to several baseline approaches. We also evaluate the performance of our method on a real-world electronic health record dataset where we seek to identify unusual patient-management decisions. PMID:25309142
Lidar detection algorithm for time and range anomalies.
Ben-David, Avishai; Davidson, Charles E; Vanderbeek, Richard G
2007-10-10
A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection and is implemented for time anomaly, where the question "is a target (aerosol cloud) present at range R within time t1 to t2?" is addressed, and for range anomaly, where the question "is a target present at time t within ranges R1 and R2?" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one-over-range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO2 lidar measurements of the bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh), are shown and discussed.
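The final decision step of such a two-hypothesis Gaussian model can be sketched by assuming an EM fit has already produced the parameters for "target present" and "target absent"; a score is declared a detection when its log-likelihood ratio is positive. The parameter values below are invented for illustration, and the sketch omits the EM fitting, priors, and false-alarm calibration described in the abstract.

```python
# Log-likelihood-ratio decision between two fitted Gaussian hypotheses.
import math

def log_gauss(x, mu, sigma):
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def detect(score, absent=(0.0, 1.0), present=(5.0, 1.5)):
    """True when P(score | target present) exceeds P(score | target absent)."""
    return log_gauss(score, *present) > log_gauss(score, *absent)

print(detect(0.3), detect(4.2))  # -> False True
```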
Data-Driven Modeling and Prediction of Arctic Sea Ice
NASA Astrophysics Data System (ADS)
Kondrashov, Dmitri; Chekroun, Mickael; Ghil, Michael
2016-04-01
We present results of data-driven predictive analyses of sea ice over the main Arctic regions. Our approach relies on the Multilayer Stochastic Modeling (MSM) framework of Kondrashov, Chekroun and Ghil [Physica D, 2015] and leads to probabilistic prognostic models of sea ice concentration (SIC) anomalies on seasonal time scales. This approach is applied to monthly time series of state-of-the-art data-adaptive decompositions of SIC and selected climate variables over the Arctic. We evaluate the predictive skill of MSM models by performing retrospective forecasts with no look-ahead for up to 6 months. It will be shown in particular that the memory effects included intrinsically in the formulation of our non-Markovian MSM models allow for improvements in the prediction skill of large-amplitude SIC anomalies in certain Arctic regions, on the one hand, and of September Sea Ice Extent, on the other. Further improvements allowed by the MSM framework will adopt a nonlinear formulation and explore next-generation data-adaptive decompositions, namely modifications of Principal Oscillation Patterns (POPs) and rotated Multichannel Singular Spectrum Analysis (M-SSA).
NASA Astrophysics Data System (ADS)
Kinoshita, M.; Von Herzen, R. P.; Matsubayashi, O.; Fujioka, K.
1998-06-01
During Aug. 13-21, 1994, temperatures and current velocity were simultaneously monitored on the TAG hydrothermal mound. Three 'Giant Kelps' (GKs), vertical thermistor arrays of 50 m height, were moored on the periphery of the central black smoker complex (CBC). A 'Manatee', a multi-monitoring system including current velocity, was deployed 50 m east of the CBC. Four 'Daibutsu' geothermal probes penetrated the sediment south to west of the CBC. Compilation of all data revealed semi-diurnal variations in water temperatures and current velocity, and allowed us to discuss the source of these anomalies. Temperature anomalies of the GKs correlate well with current velocity, and are interpreted to be caused by the main plume from the CBC being bent over by the tidal current. We identified two types of asymmetric, periodic temperature variations at Daibutsu Probes 2 and 8, located 20 m to the south of the CBC. By comparing temperatures and current velocity, they are attributed to non-buoyant effluents laterally advected by the tidal current. The source of one variation is located east to ESE of the probes, and the source of the other is located to the north. On Aug. 31, a new periodic anomaly emerged on Probe 2 with an amplitude of up to 0.8°C. The 6-h offset between the new anomaly and the previous one suggests that the source of the new anomaly lies to the west of Probe 2. The heat flux of these non-buoyant effluents is estimated to range from 30 to 100 kW/m², which is of the same order as direct estimates of diffuse flow at the TAG mound. This suggests that a significant amount of diffuse effluent is laterally advected by the prevailing current near the seafloor.
Data-driven modeling of surface temperature anomaly and solar activity trends
Friedel, Michael J.
2012-01-01
A novel two-step modeling scheme is used to reconstruct and analyze surface temperature and solar activity data at global, hemispheric, and regional scales. First, the self-organizing map (SOM) technique is used to extend annual modern climate data from the century to the millennial scale. The SOM component planes are used to identify and quantify the strength of nonlinear relations among modern surface temperature anomalies (<150 years), tropical and extratropical teleconnections, and Palmer Drought Severity Indices (0-2000 years). Cross-validation of global sea and land surface temperature anomalies verifies that the SOM is an unbiased estimator with less uncertainty than the magnitude of the anomalies. Second, quantile modeling of the SOM reconstructions reveals trends and periods in surface temperature anomaly and solar activity whose timing agrees with published studies. Temporal features in surface temperature anomalies, such as the Medieval Warm Period, Little Ice Age, and Modern Warming Period, appear at all spatial scales, with magnitudes that increase when moving from ocean to land, from global to regional scales, and from southern to northern regions. Some caveats apply when interpreting these data: the high-frequency filtering of climate signals due to quantile model selection, and increased uncertainty where paleoclimatic data are limited. Even so, all models find the rate and magnitude of Modern Warming Period anomalies to be greater than those during the Medieval Warm Period. Lastly, quantile trends among reconstructed equatorial Pacific temperature profiles support the recent assertion of two primary El Niño Southern Oscillation types. These results demonstrate the efficacy of this alternative modeling approach for reconstructing and interpreting scale-dependent climate variables.
A Comparative Study of Unsupervised Anomaly Detection Techniques Using Honeypot Data
NASA Astrophysics Data System (ADS)
Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Inoue, Daisuke; Eto, Masashi; Nakao, Koji
Intrusion Detection Systems (IDSs) have received considerable attention among network security researchers as one of the most promising countermeasures to defend our crucial computer systems or networks against attackers on the Internet. Over the past few years, many machine learning techniques have been applied to IDSs so as to improve their performance and to construct them with low cost and effort. In particular, unsupervised anomaly detection techniques have a significant advantage in their capability to identify unforeseen attacks, i.e., 0-day attacks, and to build intrusion detection models without any labeled (i.e., pre-classified) training data in an automated manner. In this paper, we conduct a set of experiments to evaluate and analyze the performance of the major unsupervised anomaly detection techniques using real traffic data obtained from honeypots deployed inside and outside the campus network of Kyoto University, and using various evaluation criteria: performance evaluation by similarity measurements and the size of training data, overall performance, detection ability for unknown attacks, and time complexity. Our experimental results give some practical and useful guidelines to IDS researchers and operators, so that they can acquire insight into applying these techniques to the area of intrusion detection, and devise more effective intrusion detection models.
NASA Astrophysics Data System (ADS)
Jervis, John R.; Pringle, Jamie K.
2014-09-01
Electrical resistivity surveys have proven useful for locating clandestine graves in a number of forensic searches. However, some aspects of grave detection with resistivity surveys remain imperfectly understood. One such aspect is the effect of seasonal changes in climate on the resistivity response of graves. In this study, resistivity survey data collected over three years at three simulated graves were analysed in order to assess how the graves' resistivity anomalies varied seasonally and when they could most easily be detected. Thresholds were used to identify anomalies, and the 'residual volume' of grave-related anomalies was calculated as the area bounded by the relevant thresholds multiplied by the anomaly's average value above the threshold. The residual volume of a resistivity anomaly associated with a buried pig cadaver showed evidence of repeating annual patterns and was moderately correlated with the soil moisture budget. This anomaly was easiest to detect between January and April each year, after prolonged periods of high net gain in soil moisture. The resistivity response of a wrapped cadaver was more complex, although it also showed evidence of seasonal variation during the third year after burial. We suggest that the observed variation in the graves' resistivity anomalies was caused by seasonal change in survey data noise levels, which was in turn influenced by the soil moisture budget. It is possible that similar variations occur elsewhere at sites with seasonal climate variations, and this could affect the successful detection of other subsurface features. Further research to investigate how different climates and soil types affect seasonal variation in grave-related resistivity anomalies would be useful.
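The 'residual volume' measure defined above (thresholded area times average value above the threshold) is directly computable. The sketch below interprets "average value above the threshold" as the average exceedance, which is one reading of the definition; the grid values, threshold, and unit cell area are all invented.

```python
# Residual volume of a thresholded anomaly on a residual-resistivity grid.
def residual_volume(grid, threshold, cell_area=1.0):
    exceed = [v - threshold for row in grid for v in row if v > threshold]
    if not exceed:
        return 0.0
    area = len(exceed) * cell_area
    return area * (sum(exceed) / len(exceed))  # area x mean exceedance

grid = [[0.1, 0.2, 0.1],
        [0.2, 1.4, 1.0],
        [0.1, 0.8, 0.2]]
print(residual_volume(grid, threshold=0.5))  # ~1.7 (3 cells x mean exceedance ~0.567)
```

Tracking this scalar through the seasons is what lets the study compare the same anomaly across survey dates.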
NASA Astrophysics Data System (ADS)
Frolking, S. E.; Milliman, T.; Palace, M. W.; Wisser, D.; Lammers, R. B.; Fahnestock, M. A.
2010-12-01
A severe drought occurred in many portions of Amazonia in the dry season (June-September) of 2005. We analyzed ten years (7/99-10/09) of SeaWinds active microwave Ku-band backscatter data collected over the Amazon Basin, developing a monthly climatology and monthly anomalies from that climatology in an effort to detect landscape responses to this drought. We compared these to seasonal accumulating water deficit anomalies generated using Tropical Rainfall Monitoring Mission (TRMM) precipitation data (1999-2009) and 100 mm/mo evapotranspirative demand as a water deficit threshold. There was significant interannual variability in monthly mean backscatter only for ascending (early morning) overpass data, and little interannual variability in monthly mean backscatter for descending (late afternoon) overpass data. Strong negative anomalies in both ascending-overpass backscatter and accumulating water deficit developed during July-October 2005, centered on the southwestern Amazon Basin (Acre and western Amazonas states in Brazil; Madre de Dios state in Peru; Pando state in Bolivia). During the 2005 drought, there was a strong spatial correlation between morning overpass backscatter anomalies and water deficit anomalies. We hypothesize that as the drought persisted over several months, the forest canopy was increasingly unable to recover full leaf moisture content over night, and the early morning overpass backscatter data became anomalously low. This is the first reporting of tropical wet forest seasonal drought detection by active microwave scatterometry.
A function approximation approach to anomaly detection in propulsion system test data
NASA Technical Reports Server (NTRS)
Whitehead, Bruce A.; Hoyt, W. A.
1993-01-01
Ground test data from propulsion systems such as the Space Shuttle Main Engine (SSME) can be automatically screened for anomalies by a neural network. The neural network screens data after being trained with nominal data only. Given the values of 14 measurements reflecting external influences on the SSME at a given time, the neural network predicts the expected nominal value of a desired engine parameter at that time. We compared the ability of three different function-approximation techniques to perform this nominal value prediction: a novel neural network architecture based on Gaussian bar basis functions, a conventional backpropagation neural network, and linear regression. These three techniques were tested with real data from six SSME ground tests containing two anomalies. The basis function network trained more rapidly than backpropagation. It yielded nominal predictions with a tight enough confidence interval to distinguish anomalous deviations from the nominal fluctuations in an engine parameter. Since the function-approximation approach requires nominal training data only, it is capable of detecting unknown classes of anomalies for which training data are not available.
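The screening scheme above can be shown schematically with a one-input linear model standing in for the basis-function network: train on nominal data only, derive a band from the training residuals, and flag test points outside it. The data, band multiplier, and single-input simplification are all assumptions for illustration.

```python
# Train-on-nominal screening: flag points outside a residual-derived band.
from statistics import mean, stdev

def fit(xs, ys):
    """Least-squares line; returns (slope, intercept)."""
    xm, ym = mean(xs), mean(ys)
    slope = sum((x - xm) * (y - ym) for x, y in zip(xs, ys)) / sum((x - xm) ** 2 for x in xs)
    return slope, ym - slope * xm

def screen(xs, ys, model, band):
    slope, intercept = model
    return [i for i, (x, y) in enumerate(zip(xs, ys))
            if abs(y - (slope * x + intercept)) > band]

# Nominal training: the engine parameter tracks an external influence almost linearly.
train_x = [float(i) for i in range(10)]
train_y = [2.0 * x + 1.0 + (0.01 if i % 2 else -0.01) for i, x in enumerate(train_x)]
model = fit(train_x, train_y)
resid = [y - (model[0] * x + model[1]) for x, y in zip(train_x, train_y)]
band = 5.0 * stdev(resid)  # confidence band from nominal fluctuations only
print(screen([2.0, 5.0, 8.0], [5.01, 14.0, 17.0], model, band))  # -> [1]
```

Only the middle test point, far off the nominal prediction, is flagged; the others fall within normal fluctuation.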
Anomaly detection using temporal data mining in a smart home environment.
Jakkula, V; Cook, D J
2008-01-01
To many people, home is a sanctuary. With the maturing of smart home technologies, many people with cognitive and physical disabilities can lead independent lives in their own homes for extended periods of time. In this paper, we investigate the design of machine learning algorithms that support this goal. We hypothesize that machine learning algorithms can be designed to automatically learn models of resident behavior in a smart home, and that the results can be used to perform automated health monitoring and to detect anomalies. Specifically, our algorithms draw upon the temporal nature of sensor data collected in a smart home to build a model of expected activities and to detect unexpected, and possibly health-critical, events in the home. We validate our algorithms using synthetic data and real activity data collected from volunteers in an automated smart environment. The results from our experiments support our hypothesis that a model can be learned from observed smart home data and used to report anomalies, as they occur, in a smart home.
New sensors and techniques for the structural health monitoring of propulsion systems.
Woike, Mark; Abdul-Aziz, Ali; Oza, Nikunj; Matthews, Bryan
2013-01-01
The ability to monitor the structural health of rotating components, especially in the hot sections of turbine engines, is of major interest to the aero community for improving engine safety and reliability. The use of instrumentation for these applications remains very challenging. It requires sensors and techniques that are highly accurate, are able to operate in a high-temperature environment, and can detect minute changes and hidden flaws before catastrophic events occur. The National Aeronautics and Space Administration (NASA), through the Aviation Safety Program (AVSP), has taken a lead role in the development of new sensor technologies and techniques for the in situ structural health monitoring of gas turbine engines. This paper presents a summary of key results and findings obtained from three different structural health monitoring approaches that have been investigated: evaluating the performance of a novel microwave blade tip clearance sensor; a vibration-based crack detection technique using an externally mounted capacitive blade tip clearance sensor; and lastly the results of using data-driven anomaly detection algorithms for detecting cracks in a rotating disk. PMID:23935425
Anomaly Detection in Host Signaling Pathways for the Early Prognosis of Acute Infection.
Wang, Kun; Langevin, Stanley; O'Hern, Corey S; Shattuck, Mark D; Ogle, Serenity; Forero, Adriana; Morrison, Juliet; Slayden, Richard; Katze, Michael G; Kirby, Michael
2016-01-01
Clinical diagnosis of acute infectious diseases during the early stages of infection is critical to administering the appropriate treatment to improve the disease outcome. We present a data-driven analysis of the human cellular response to respiratory viruses, including influenza, respiratory syncytial virus, and human rhinovirus, and compare this with the response to the bacterial endotoxin lipopolysaccharide (LPS). Using an anomaly detection framework, we identified pathways that clearly distinguish between asymptomatic and symptomatic patients infected with the four different respiratory viruses and that accurately diagnosed patients exposed to a bacterial infection. Connectivity pathway analysis comparing the viral and bacterial diagnostic signatures identified host cellular pathways that were unique to patients exposed to LPS endotoxin, indicating this type of analysis could be used to identify host biomarkers that can differentiate clinical etiologies of acute infection. We applied the Multivariate State Estimation Technique (MSET) to two human influenza (H1N1 and H3N2) gene expression data sets to define host networks perturbed in the asymptomatic phase of infection. Our analysis identified pathways in the respiratory virus diagnostic signature as prognostic biomarkers that triggered prior to clinical presentation of acute symptoms. These early warning pathways correctly predicted that almost half of the subjects would become symptomatic in less than forty hours post-infection and that three of the 18 subjects would become symptomatic after only 8 hours. These results provide a proof of concept for the utility of anomaly detection algorithms to classify host pathway signatures that can identify presymptomatic signatures of acute diseases and differentiate between etiologies of infection. On a global scale, acute respiratory infections cause a significant proportion of human co-morbidities and account for 4.25 million deaths annually. 
The development of clinical diagnostic tools to distinguish between acute viral and bacterial respiratory infections is critical to improve patient care and limit the overuse of antibiotics in the medical community. The identification of prognostic respiratory virus biomarkers provides an early warning system that is capable of predicting which subjects will become symptomatic, expanding our medical diagnostic capabilities and treatment options for acute infectious diseases. The host response to acute infection may be viewed as a deterministic signaling network responsible for maintaining the health of the host organism. We identify pathway signatures that reflect the very earliest perturbations in the host response to acute infection. These pathways provide a means to monitor the health state of the host, using anomaly detection to quantify and predict health outcomes to pathogens. PMID:27532264
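MSET appears in several entries of this compendium. Implementations differ, but the core idea, reconstructing a query state from a memory matrix of nominal states and using the reconstruction residual as the anomaly signal, can be sketched as below. This is a minimal similarity-weighted variant on invented data, not the exact nonlinear MSET operator used in these papers.

```python
import numpy as np

def mset_estimate(D, x, h=1.0):
    """Similarity-weighted state estimate: reconstruct x as a weighted
    combination of the rows of the memory matrix D (nominal states)."""
    d = np.linalg.norm(D - x, axis=1)   # distance of x to each stored state
    w = np.exp(-(d / h) ** 2)           # Gaussian similarity weights
    w /= w.sum()
    return w @ D                        # weighted blend of nominal states

rng = np.random.default_rng(1)
D = rng.normal(size=(200, 3))           # memory of nominal 3-channel states

x_nominal = D[0] + 0.01 * rng.normal(size=3)
x_faulty = D[0] + np.array([5.0, 0.0, 0.0])   # large shift on channel 0

r_nom = np.linalg.norm(x_nominal - mset_estimate(D, x_nominal))
r_fault = np.linalg.norm(x_faulty - mset_estimate(D, x_faulty))
print(r_nom, r_fault)   # the fault leaves a much larger reconstruction residual
```

A downstream detector (a threshold, or the SPRT used in the satellite entry below) then decides whether the residual sequence indicates an anomaly.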
NASA Astrophysics Data System (ADS)
Nawir, Mukrimah; Amir, Amiza; Lynn, Ong Bi; Yaakob, Naimah; Badlishah Ahmad, R.
2018-05-01
The rapid growth of networked technologies exposes them to various network attacks, because devices frequently exchange data over the Internet and produce large-scale data that must be handled. Moreover, network anomaly detection using machine learning is hampered by the scarcity of publicly available labelled network datasets, which has led many researchers to keep using the most common dataset (KDDCup99) even though it is no longer well suited to evaluating machine learning (ML) algorithms for classification. Several issues regarding these available labelled network datasets are discussed in this paper. The aim of this paper is to build a network anomaly detection system using machine learning algorithms that is efficient, effective and fast. The findings showed that the AODE algorithm performed well in terms of accuracy and processing time for binary classification on the UNSW-NB15 dataset.
RS-Forest: A Rapid Density Estimator for Streaming Anomaly Detection
Wu, Ke; Zhang, Kun; Fan, Wei; Edwards, Andrea; Yu, Philip S.
2015-01-01
Anomaly detection in streaming data is of high interest in numerous application domains. In this paper, we propose a novel one-class semi-supervised algorithm to detect anomalies in streaming data. Underlying the algorithm is a fast and accurate density estimator implemented by multiple fully randomized space trees (RS-Trees), named RS-Forest. The piecewise constant density estimate of each RS-tree is defined on the tree node into which an instance falls. Each incoming instance in a data stream is scored by the density estimates averaged over all trees in the forest. Two strategies, statistical attribute range estimation of high probability guarantee and dual node profiles for rapid model update, are seamlessly integrated into RS-Forest to systematically address the ever-evolving nature of data streams. We derive the theoretical upper bound for the proposed algorithm and analyze its asymptotic properties via bias-variance decomposition. Empirical comparisons to the state-of-the-art methods on multiple benchmark datasets demonstrate that the proposed method features high detection rate, fast response, and insensitivity to most of the parameter settings. Algorithm implementations and datasets are available upon request. PMID:25685112
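The piecewise-constant density estimate underlying RS-Forest can be illustrated with a minimal static sketch: each fully randomized tree partitions the bounding box with random axis-aligned splits, the estimate at a query point is the fraction of training points in its leaf divided by the leaf volume, and the forest averages over trees. The streaming model updates, dual node profiles, and attribute-range estimation of the actual algorithm are omitted here, and all data are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def build_tree(lo, hi, depth):
    """One fully randomized space tree over the box [lo, hi]:
    random dimension, random split point, fixed depth."""
    if depth == 0:
        return {"lo": lo, "hi": hi}
    dim = int(rng.integers(len(lo)))
    split = float(rng.uniform(lo[dim], hi[dim]))
    left_hi, right_lo = hi.copy(), lo.copy()
    left_hi[dim] = right_lo[dim] = split
    return {"dim": dim, "split": split,
            "left": build_tree(lo, left_hi, depth - 1),
            "right": build_tree(right_lo, hi, depth - 1)}

def leaf(node, x):
    """Route x to its leaf box."""
    while "dim" in node:
        node = node["left"] if x[node["dim"]] <= node["split"] else node["right"]
    return node

def density(trees, X, x):
    """Piecewise-constant density estimate averaged over the forest."""
    est = []
    for t in trees:
        box = leaf(t, x)
        inside = np.all((X >= box["lo"]) & (X <= box["hi"]), axis=1)
        est.append(inside.sum() / (len(X) * np.prod(box["hi"] - box["lo"])))
    return float(np.mean(est))

lo, hi = np.array([-4.0, -4.0]), np.array([4.0, 4.0])
trees = [build_tree(lo, hi, depth=6) for _ in range(25)]
X = rng.normal(0, 0.5, size=(500, 2))            # dense cluster at the origin

d_center = density(trees, X, np.array([0.0, 0.0]))  # high density: normal
d_corner = density(trees, X, np.array([3.5, 3.5]))  # low density: anomalous
print(d_center, d_corner)
```

Scoring an incoming point by averaged leaf density is what lets a low score mark it as anomalous relative to the data seen so far.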
NASA Technical Reports Server (NTRS)
Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.
1975-01-01
The author has identified the following significant results. Ground truth data collection proves that significant anomalies exist at 13 volcanoes within the test site of Central America. The dimensions and temperature contrast of these ten anomalies are large enough to be detected by the Skylab 192 instrument. The dimensions and intensity of thermal anomalies have changed at most of these volcanoes during the Skylab mission.
Thinned crustal structure and tectonic boundary of the Nansha Block, southern South China Sea
NASA Astrophysics Data System (ADS)
Dong, Miao; Wu, Shi-Guo; Zhang, Jian
2016-12-01
The southern South China Sea margin consists of the thinned crustal Nansha Block and a compressional collision zone. The Nansha Block's deep structure and tectonic evolution contains critical information about the South China Sea's rifting. Multiple geophysical data sets, including regional magnetic, gravity and reflection seismic data, reveal the deep structure and rifting processes. Curie point depth (CPD), estimated from magnetic anomalies using a windowed wavenumber-domain algorithm, enables us to image thermal structures. To derive a 3D Moho topography and crustal thickness model, we apply the Oldenburg algorithm to the gravity anomaly, which was extracted from the observed free air gravity anomaly data after removing the gravity effect of density variations of sediments, and temperature and pressure variations of the lithospheric mantle. We found that the Moho depth (20 km) is shallower than the CPD (24 km) in the Northwest Borneo Trough, possibly caused by thinned crust, low heat flow and a low vertical geothermal gradient. The Nansha Block's northern boundary is a narrow continent-ocean transition zone constrained by magnetic anomalies, reflection seismic data, gravity anomalies and an interpretation of Moho depth (about 13 km). The block extends southward beneath a gravity-driven deformed sediment wedge caused by uplift on land after a collision, with a contribution from deep crustal flow. Its southwestern boundary is close to the Lupar Line defined by a significant negative reduction to the pole (RTP) of magnetic anomaly and short-length-scale variation in crustal thickness, increasing from 18 to 26 km.
Identification and detection of anomalies through SSME data analysis
NASA Technical Reports Server (NTRS)
Pereira, Lisa; Ali, Moonis
1990-01-01
The goal of the ongoing research described in this paper is to analyze real-time ground test data in order to identify patterns associated with anomalous engine behavior, and on the basis of this analysis to develop an expert system that detects anomalous engine behavior in the early stages of fault development. A prototype of the expert system has been developed and tested on the high-frequency data of two SSME tests, namely Test #901-0516 and Test #904-044. The comparison of our results with the post-test analyses indicates that the expert system detected the presence of the anomalies at a significantly early stage of fault development.
Spiechowicz, Jakub; Łuczka, Jerzy; Hänggi, Peter
2016-01-01
We study far from equilibrium transport of a periodically driven inertial Brownian particle moving in a periodic potential. As detected for a SQUID ratchet dynamics, the mean square deviation of the particle position from its average may involve three distinct intermediate, although extended diffusive regimes: initially as superdiffusion, followed by subdiffusion and finally, normal diffusion in the asymptotic long time limit. Even though these anomalies are transient effects, their lifetime can be many, many orders of magnitude longer than the characteristic time scale of the setup and turns out to be extraordinarily sensitive to the system parameters like temperature or the potential asymmetry. In the paper we reveal mechanisms of diffusion anomalies related to ergodicity of the system, symmetry breaking of the periodic potential and ultraslow relaxation of the particle velocity towards its steady state. Similar sequences of the diffusive behaviours could be detected in various systems including, among others, colloidal particles in random potentials, glass forming liquids and granular gases. PMID:27492219
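The diffusion regimes discussed above (superdiffusion, subdiffusion, normal diffusion) are conventionally classified by the scaling exponent of the mean square displacement, MSD(t) ∝ t^α. The sketch below estimates α from synthetic ensembles rather than from the driven SQUID ratchet dynamics itself, which would require integrating the underlying Langevin equation.

```python
import numpy as np

rng = np.random.default_rng(3)

def msd_exponent(trajs, lags):
    """Fit MSD(lag) ~ lag**alpha over an ensemble of trajectories:
    alpha < 1 subdiffusion, alpha = 1 normal, alpha > 1 superdiffusion."""
    msd = [np.mean((trajs[:, l:] - trajs[:, :-l]) ** 2) for l in lags]
    alpha, _ = np.polyfit(np.log(lags), np.log(msd), 1)
    return alpha

lags = np.arange(10, 200, 10)

# Free Brownian motion: normal diffusion, alpha ~ 1.
brownian = np.cumsum(rng.normal(size=(200, 5000)), axis=1)
# Ballistic motion x(t) = v * t: superdiffusive limit, alpha ~ 2.
ballistic = rng.normal(size=(200, 1)) * np.arange(5000)

a_normal = msd_exponent(brownian, lags)
a_super = msd_exponent(ballistic, lags)
print(a_normal, a_super)
```

Fitting α over different lag windows is how transient regimes like those in the abstract show up: the effective exponent changes as the window moves from short to long times.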
A new method of real-time detection of changes in periodic data stream
NASA Astrophysics Data System (ADS)
Lyu, Chen; Lu, Guoliang; Cheng, Bin; Zheng, Xiangwei
2017-07-01
Change point detection in periodic time series is highly desirable in many practical applications. We present a novel algorithm for this task, which comprises two phases: 1) anomaly measure: on the basis of a typical regression model, we propose a new computation method to measure anomalies in a time series that does not require any reference data from other measurements; 2) change detection: we introduce a new martingale test for detection that can be operated in an unsupervised and nonparametric way. We have conducted extensive experiments to test our algorithm systematically. The results make us believe that our algorithm can be directly applied in many real-world change-point-detection applications.
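The martingale test of phase 2 can be sketched with the standard randomized power-martingale construction over conformal p-values: on exchangeable data the martingale stays near 1, and after a change point small p-values make it grow until it crosses a threshold. The strangeness measure here (distance from the running mean) is a stand-in for the paper's regression-based measure, and all parameters and data are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

def martingale_detector(stream, eps=0.92, threshold=20.0):
    """Power martingale over randomized conformal p-values; an alarm is
    raised when the martingale exceeds the threshold."""
    history, M, alarms = [], 1.0, []
    for t, x in enumerate(stream):
        history.append(x)
        s = np.abs(np.array(history) - np.mean(history))   # strangeness scores
        # randomized conformal p-value of the newest point
        theta = rng.uniform()
        p = (np.sum(s > s[-1]) + theta * np.sum(s == s[-1])) / len(s)
        M *= eps * p ** (eps - 1)                          # betting step
        if M > threshold:
            alarms.append(t)
            history, M = [x], 1.0                          # restart after alarm
    return alarms

stream = np.concatenate([rng.normal(0, 1, 300), rng.normal(6, 1, 300)])
alarms = martingale_detector(stream)
print(alarms)   # an alarm soon after the change at t = 300
```

By Ville's inequality the false alarm probability on unchanged data is at most 1/threshold, which is what makes the test usable without distributional assumptions.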
D'Antonio, F; Khalil, A; Garel, C; Pilu, G; Rizzo, G; Lerman-Sagie, T; Bhide, A; Thilaganathan, B; Manzoli, L; Papageorghiou, A T
2016-06-01
To explore the outcome in fetuses with prenatal diagnosis of posterior fossa anomalies apparently isolated on ultrasound imaging. MEDLINE and EMBASE were searched electronically utilizing combinations of relevant medical subject headings for 'posterior fossa' and 'outcome'. The posterior fossa anomalies analyzed were Dandy-Walker malformation (DWM), mega cisterna magna (MCM), Blake's pouch cyst (BPC) and vermian hypoplasia (VH). The outcomes observed were rate of chromosomal abnormalities, additional anomalies detected at prenatal magnetic resonance imaging (MRI), additional anomalies detected at postnatal imaging and concordance between prenatal and postnatal diagnoses. Only isolated cases of posterior fossa anomalies - defined as having no cerebral or extracerebral additional anomalies detected on ultrasound examination - were included in the analysis. Quality assessment of the included studies was performed using the Newcastle-Ottawa Scale for cohort studies. We used meta-analyses of proportions to combine data and fixed- or random-effects models according to the heterogeneity of the results. Twenty-two studies including 531 fetuses with posterior fossa anomalies were included in this systematic review. The prevalence of chromosomal abnormalities in fetuses with isolated DWM was 16.3% (95% CI, 8.7-25.7%). The prevalence of additional central nervous system (CNS) abnormalities that were missed at ultrasound examination and detected only at prenatal MRI was 13.7% (95% CI, 0.2-42.6%), and the prevalence of additional CNS anomalies that were missed at prenatal imaging and detected only after birth was 18.2% (95% CI, 6.2-34.6%). Prenatal diagnosis was not confirmed after birth in 28.2% (95% CI, 8.5-53.9%) of cases. MCM was not significantly associated with additional anomalies detected at prenatal MRI or detected after birth. Prenatal diagnosis was not confirmed postnatally in 7.1% (95% CI, 2.3-14.5%) of cases. 
The rate of chromosomal anomalies in fetuses with isolated BPC was 5.2% (95% CI, 0.9-12.7%) and there was no associated CNS anomaly detected at prenatal MRI or only after birth. Prenatal diagnosis of BPC was not confirmed after birth in 9.8% (95% CI, 2.9-20.1%) of cases. The rate of chromosomal anomalies in fetuses with isolated VH was 6.5% (95% CI, 0.8-17.1%) and there were no additional anomalies detected at prenatal MRI (0% (95% CI, 0.0-45.9%)). The proportion of cerebral anomalies detected only after birth was 14.2% (95% CI, 2.9-31.9%). Prenatal diagnosis was not confirmed after birth in 32.4% (95% CI, 18.3-48.4%) of cases. DWM apparently isolated on ultrasound imaging is a condition with a high risk for chromosomal and associated structural anomalies. Isolated MCM and BPC have a low risk for aneuploidy or associated structural anomalies. The small number of cases with isolated VH prevents robust conclusions regarding their management from being drawn. Copyright © 2015 ISUOG. Published by John Wiley & Sons Ltd.
Inductive System Monitors Tasks
NASA Technical Reports Server (NTRS)
2008-01-01
The Inductive Monitoring System (IMS) software developed at Ames Research Center uses artificial intelligence and data mining techniques to build system-monitoring knowledge bases from archived or simulated sensor data. This information is then used to detect unusual or anomalous behavior that may indicate an impending system failure. IMS is currently helping analyze data from systems that help fly and maintain the space shuttle and the International Space Station (ISS). During training, IMS forms classes of expected sensor values from nominal data; these data classes are then used to build a monitoring knowledge base. In real time, IMS performs monitoring functions, determining and displaying the degree of deviation from nominal performance. IMS trend analyses can detect conditions that may indicate a failure or required system maintenance. The development of IMS was motivated by the difficulty of producing detailed diagnostic models of some system components due to complexity or unavailability of design information. Successful applications have ranged from real-time monitoring of aircraft engine and control systems to anomaly detection in space shuttle and ISS data. IMS was used on shuttle missions STS-121, STS-115, and STS-116 to search the Wing Leading Edge Impact Detection System (WLEIDS) data for signs of possibly damaging impacts during launch. It independently verified findings of the WLEIDS Mission Evaluation Room (MER) analysts and indicated additional points of interest that were subsequently investigated by the MER team. In support of the Exploration Systems Mission Directorate, IMS is being deployed as an anomaly detection tool on ISS mission control consoles in the Johnson Space Center Mission Operations Directorate. IMS has been trained to detect faults in the ISS Control Moment Gyroscope (CMG) systems. In laboratory tests, it has already detected several minor anomalies in real-time CMG data. 
When tested on archived data, IMS was able to detect precursors of the CMG1 failure nearly 15 hours in advance of the actual failure event. In the Aeronautics Research Mission Directorate, IMS successfully performed real-time engine health analysis. IMS was able to detect simulated failures and actual engine anomalies in an F/A-18 aircraft during the course of 25 test flights. IMS is also being used in colla
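IMS's training-then-monitoring cycle can be sketched roughly as follows: nominal vectors are grouped into classes stored as per-parameter min/max bounds, and the monitored quantity is the distance from an incoming vector to the nearest class. This is a simplified reading of the published IMS description, with invented data and a hypothetical clustering threshold, not NASA's implementation.

```python
import numpy as np

rng = np.random.default_rng(5)

def build_knowledge_base(nominal, max_dist=1.0):
    """Greedy IMS-style training: group nominal vectors into classes,
    each stored as per-parameter min/max bounds (a hyperbox)."""
    classes = []
    for x in nominal:
        for c in classes:
            center = (c["lo"] + c["hi"]) / 2
            if np.linalg.norm(x - center) <= max_dist:
                c["lo"] = np.minimum(c["lo"], x)   # grow the box to cover x
                c["hi"] = np.maximum(c["hi"], x)
                break
        else:
            classes.append({"lo": x.copy(), "hi": x.copy()})
    return classes

def deviation(classes, x):
    """Monitoring: distance from x to the nearest class hyperbox
    (zero means x looks like the training data)."""
    dists = []
    for c in classes:
        nearest = np.clip(x, c["lo"], c["hi"])  # closest point inside the box
        dists.append(np.linalg.norm(x - nearest))
    return min(dists)

# Two hypothetical nominal operating modes of a 3-sensor system.
mode_a = rng.normal([0, 0, 0], 0.1, size=(300, 3))
mode_b = rng.normal([5, 5, 5], 0.1, size=(300, 3))
kb = build_knowledge_base(np.vstack([mode_a, mode_b]))

print(deviation(kb, np.array([0.05, 0.0, -0.05])))  # near zero: nominal
print(deviation(kb, np.array([2.5, 2.5, 2.5])))     # large: anomalous
```

Trending this deviation score over time is what supports the failure-precursor detection described above: the score drifts upward before the threshold-style alarms of conventional monitoring would fire.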
Anomaly Monitoring Method for Key Components of Satellite
Fan, Linjun; Xiao, Weidong; Tang, Jun
2014-01-01
This paper presented a fault diagnosis method for key components of a satellite, called the Anomaly Monitoring Method (AMM), which is made up of state estimation based on Multivariate State Estimation Techniques (MSET) and anomaly detection based on the Sequential Probability Ratio Test (SPRT). On the basis of failure analysis of lithium-ion batteries (LIBs), we divided LIB failures into internal failure, external failure, and thermal runaway, and selected the electrolyte resistance (Re) and the charge transfer resistance (Rct) as the key parameters for state estimation. Then, from the actual in-orbit telemetry data of the key parameters of LIBs, we obtained the actual residual value (RX) and healthy residual value (RL) of the LIBs based on the MSET state estimation, and, using these residual values (RX and RL), we detected anomaly states with the SPRT-based anomaly detection. Lastly, we conducted an example of AMM for LIBs and, according to the results, validated the feasibility and effectiveness of AMM by comparing it with the results of the threshold detection method (TDM). PMID:24587703
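The SPRT stage can be sketched generically: given residuals from the state estimator, Wald's sequential test accumulates a log-likelihood ratio between a healthy and a faulty residual distribution until one of two thresholds is crossed. The Gaussian means, sigma, and error rates below are illustrative choices, not the paper's LIB values.

```python
import numpy as np

rng = np.random.default_rng(6)

def sprt(residuals, m0=0.0, m1=2.0, sigma=0.5, alpha=0.01, beta=0.01):
    """Wald's SPRT on a residual sequence: H0 = healthy (mean m0),
    H1 = faulty (mean m1), Gaussian noise with known sigma."""
    A, B = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    llr = 0.0
    for t, r in enumerate(residuals):
        # log-likelihood ratio increment of N(m1, sigma) vs N(m0, sigma)
        llr += ((r - m0) ** 2 - (r - m1) ** 2) / (2 * sigma ** 2)
        if llr >= A:
            return "fault", t      # accept H1
        if llr <= B:
            return "healthy", t    # accept H0
    return "undecided", len(residuals)

healthy = rng.normal(0.0, 0.5, 200)   # residuals of a healthy battery
faulty = rng.normal(2.0, 0.5, 200)    # residuals after a failure onset

print(sprt(healthy))
print(sprt(faulty))
```

Compared with a fixed threshold (the TDM baseline), the sequential test controls both false alarm and missed detection rates through alpha and beta rather than through a single hand-tuned limit.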
Method for Real-Time Model Based Structural Anomaly Detection
NASA Technical Reports Server (NTRS)
Urnes, James M., Sr. (Inventor); Smith, Timothy A. (Inventor); Reichenbach, Eric Y. (Inventor)
2015-01-01
A system and methods for real-time model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
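The claimed sequence (modeling error, significance test, persistence check) can be sketched as below. The noise level, z-score cutoff, persistence threshold, and synthetic structural shift are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def detect(measured, expected, sigma=0.1, z_crit=3.0, persistence_threshold=5):
    """Model-based anomaly logic: form the modeling error, test its
    statistical significance, and require the significance to persist."""
    persistence, flags = 0, []
    for m, e in zip(measured, expected):
        error = m - e                                # modeling error signal
        significant = abs(error) / sigma > z_crit    # error significance
        persistence = persistence + 1 if significant else 0
        flags.append(persistence > persistence_threshold)
    return flags

expected = np.sin(np.linspace(0, 10, 400))           # model prediction
measured = expected + rng.normal(0, 0.1, 400)
measured[250:] += 1.0                                # structural shift at t = 250

flags = detect(measured, expected)
print(flags.index(True))   # first alarm a few samples after t = 250
```

The persistence requirement is what suppresses isolated noise spikes: a single 3-sigma excursion resets nothing downstream, while a sustained shift alarms within a handful of samples.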
Spatiotemporal Detection of Unusual Human Population Behavior Using Mobile Phone Data
Dobra, Adrian; Williams, Nathalie E.; Eagle, Nathan
2015-01-01
With the aim to contribute to humanitarian response to disasters and violent events, scientists have proposed the development of analytical tools that could identify emergency events in real-time, using mobile phone data. The assumption is that dramatic and discrete changes in behavior, measured with mobile phone data, will indicate extreme events. In this study, we propose an efficient system for spatiotemporal detection of behavioral anomalies from mobile phone data and compare sites with behavioral anomalies to an extensive database of emergency and non-emergency events in Rwanda. Our methodology successfully captures anomalous behavioral patterns associated with a broad range of events, from religious and official holidays to earthquakes, floods, violence against civilians and protests. Our results suggest that human behavioral responses to extreme events are complex and multi-dimensional, including extreme increases and decreases in both calling and movement behaviors. We also find significant temporal and spatial variance in responses to extreme events. Our behavioral anomaly detection system and extensive discussion of results are a significant contribution to the long-term project of creating an effective real-time event detection system with mobile phone data and we discuss the implications of our findings for future research to this end. PMID:25806954
GBAS Ionospheric Anomaly Monitoring Based on a Two-Step Approach
Zhao, Lin; Yang, Fuxin; Li, Liang; Ding, Jicheng; Zhao, Yuxin
2016-01-01
As one significant component of space environmental weather, the ionosphere has to be monitored using Global Positioning System (GPS) receivers for the Ground-Based Augmentation System (GBAS), because an ionospheric anomaly can pose a potential threat to GBAS support of safety-critical services. The traditional code-carrier divergence (CCD) methods, which have been widely used to detect variations of the ionospheric gradient for GBAS, adopt a linear time-invariant low-pass filter to suppress the effect of high-frequency noise on the detection of the ionospheric anomaly. However, there is a trade-off between response time and estimation accuracy due to the fixed time constants. In order to relax this limitation, a two-step approach (TSA) is proposed by integrating the cascaded linear time-invariant low-pass filters with the adaptive Kalman filter to detect the ionospheric gradient anomaly. The performance of the proposed method is tested using simulated and real-world data, respectively. The simulation results show that the TSA can detect ionospheric gradient anomalies quickly, even when the noise is severe. Compared to the traditional CCD methods, the experiments on real-world GPS data indicate that the average estimation accuracy of the ionospheric gradient improves by more than 31.3%, and the average response time to an ionospheric gradient at a rate of 0.018 m/s improves by more than 59.3%, which demonstrates the ability of TSA to detect a small ionospheric gradient more rapidly. PMID:27240367
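The traditional CCD idea the paper starts from can be sketched as a first-order low-pass filter on the code-minus-carrier (CMC) rate, alarmed against a fixed limit. The sampling rate, time constant, noise level, and alarm limit below are illustrative; real GBAS monitors use calibrated thresholds and, as the abstract notes, cascaded filters.

```python
import numpy as np

rng = np.random.default_rng(8)

def ccd_monitor(cmc, dt=0.5, tau=100.0, limit=0.015):
    """Low-pass filter the CMC rate with a linear time-invariant
    first-order filter; alarm when the filtered rate exceeds limit (m/s)."""
    k = dt / tau
    rate_f, alarms = 0.0, []
    for i in range(1, len(cmc)):
        raw_rate = (cmc[i] - cmc[i - 1]) / dt
        rate_f += k * (raw_rate - rate_f)    # first-order low-pass filter
        alarms.append(abs(rate_f) > limit)
    return alarms

t = np.arange(0, 600, 0.5)                   # 10 minutes at 2 Hz
noise = rng.normal(0, 0.25, t.size)          # multipath + receiver noise (m)
quiet = noise                                # no ionospheric divergence
ramp = 0.018 * t + noise                     # divergence at 0.018 m/s

print(any(ccd_monitor(quiet)))   # no alarm on quiet data
print(any(ccd_monitor(ramp)))    # alarm on the ionospheric ramp
```

The fixed time constant tau is exactly the counterbalance the paper targets: a large tau rejects noise but delays the alarm on the ramp, which is what motivates adding the adaptive Kalman filter step.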
NASA Technical Reports Server (NTRS)
Vaughan, R. Greg; Hook, Simon J.
2006-01-01
ASTER thermal infrared data over Mt. St. Helens were used to characterize its thermal behavior from June 2000 to February 2006. Prior to the October 2004 eruption, the average crater temperature varied seasonally between -12 and 6 C. After the eruption, the maximum single-pixel temperature increased from 10 C (October 2004) to 96 C (August 2005), then decreased through February 2006. The initial increase in temperature was correlated with dome morphology and growth rate, and the subsequent decrease was interpreted to relate to both seasonal trends and a decreased growth rate/increased cooling rate, possibly suggesting a significant change in the volcanic system. A single-pixel ASTER thermal anomaly first appeared on October 1, 2004, eleven hours after the first eruption and 10 days before new lava was exposed at the surface. By contrast, an automated algorithm for detecting thermal anomalies in MODIS data did not trigger an alert until December 18. However, a single-pixel thermal anomaly first appeared in MODIS channel 23 (4 um) on October 13, 12 days after the first eruption and 2 days after lava was exposed. The earlier thermal anomaly detected with ASTER data is attributed to its higher spatial resolution (90 m) compared with MODIS (1 km), and the earlier visual observation of anomalous pixels compared to the automated detection method suggests that local spatial statistics and background radiance data could improve automated detection methods.
Yin, Shen; Gao, Huijun; Qiu, Jianbin; Kaynak, Okyay
2017-11-01
Data-driven fault detection plays an important role in industrial systems due to its applicability when physical models are unknown. In fault detection, disturbances must be taken into account as an inherent characteristic of processes. Nevertheless, fault detection for nonlinear processes with deterministic disturbances still receives little attention, especially in the data-driven field. To solve this problem, a just-in-time learning-based data-driven (JITL-DD) fault detection method for nonlinear processes with deterministic disturbances is proposed in this paper. JITL-DD employs a JITL scheme for process description with local model structures to cope with process dynamics and nonlinearity. The proposed method provides a data-driven fault detection solution for nonlinear processes with deterministic disturbances, and offers inherent online adaptation and high fault detection accuracy. Two nonlinear systems, i.e., a numerical example and a sewage treatment process benchmark, are employed to show the effectiveness of the proposed method.
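The JITL scheme (build a local model on demand from the most similar historical samples, then test the residual of the new measurement against that local prediction) can be sketched as below. The process, the neighborhood size k, the fault threshold, and all data are invented for illustration, not the paper's benchmark.

```python
import numpy as np

rng = np.random.default_rng(9)

def jitl_predict(X_hist, y_hist, x_query, k=30):
    """Just-in-time learning: fit a local linear model on the k nearest
    historical samples to the query and predict its output."""
    d = np.linalg.norm(X_hist - x_query, axis=1)
    idx = np.argsort(d)[:k]                      # k most similar samples
    A = np.c_[X_hist[idx], np.ones(k)]           # local design matrix + bias
    coef, *_ = np.linalg.lstsq(A, y_hist[idx], rcond=None)
    return float(np.append(x_query, 1.0) @ coef)

# Hypothetical nonlinear process: y = sin(u1) + u2**2 + noise
X_hist = rng.uniform(-2, 2, size=(2000, 2))
y_hist = np.sin(X_hist[:, 0]) + X_hist[:, 1] ** 2 + rng.normal(0, 0.02, 2000)

x_new = np.array([0.5, -1.0])
y_true = np.sin(0.5) + 1.0
y_pred = jitl_predict(X_hist, y_hist, x_new)

print(abs(y_true - y_pred))                # small residual: no fault
print(abs((y_true + 0.8) - y_pred) > 0.3)  # large residual: fault flagged
```

Because a fresh local model is fitted for every query, the scheme adapts online to operating-point changes without maintaining a single global nonlinear model.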
CSAX: Characterizing Systematic Anomalies in eXpression Data.
Noto, Keith; Majidi, Saeed; Edlow, Andrea G; Wick, Heather C; Bianchi, Diana W; Slonim, Donna K
2015-05-01
Methods for translating gene expression signatures into clinically relevant information have typically relied upon having many samples from patients with similar molecular phenotypes. Here, we address the question of what can be done when it is relatively easy to obtain healthy patient samples, but when abnormalities corresponding to disease states may be rare and one-of-a-kind. The associated computational challenge, anomaly detection, is a well-studied machine-learning problem. However, due to the dimensionality and variability of expression data, existing methods based on feature space analysis or individual anomalously expressed genes are insufficient. We present a novel approach, CSAX, that identifies pathways in an individual sample in which the normal expression relationships are disrupted. To evaluate our approach, we have compiled and released a compendium of public expression data sets, reformulated to create a test bed for anomaly detection. We demonstrate the accuracy of CSAX on the data sets in our compendium, compare it to other leading methods, and show that CSAX aids in both identifying anomalies and explaining their underlying biology. We describe an approach to characterizing the difficulty of specific expression anomaly detection tasks. We then illustrate CSAX's value in two developmental case studies. Confirming prior hypotheses, CSAX highlights disruption of platelet activation pathways in a neonate with retinopathy of prematurity and identifies, for the first time, dysregulated oxidative stress response in second trimester amniotic fluid of fetuses with obese mothers. Our approach provides an important step toward identification of individual disease patterns in the era of precision medicine.
Detecting an atomic clock frequency anomaly using an adaptive Kalman filter algorithm
NASA Astrophysics Data System (ADS)
Song, Huijie; Dong, Shaowu; Wu, Wenjun; Jiang, Meng; Wang, Weixiong
2018-06-01
The abnormal frequencies of an atomic clock mainly include frequency jumps and frequency drift jumps. Atomic clock frequency anomaly detection is a key technique in time-keeping. The Kalman filter algorithm, as a linear optimal algorithm, has been widely used in real-time detection of abnormal frequency. In order to obtain an optimal state estimate, the observation model and dynamic model of the Kalman filter algorithm should satisfy Gaussian white noise conditions; the detection performance is degraded if anomalies affect either model. The adaptive Kalman filter algorithm, applied here to clock frequency anomaly detection, uses the residuals given by the prediction to build an adaptive factor, and the predicted state covariance matrix is corrected in real time by this factor. The results show that the model error is reduced and the detection performance is improved. The effectiveness of the algorithm is verified on frequency jump simulations, frequency drift jump simulations and measured atomic clock data using the chi-square test.
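The paper's exact state model and adaptive factor are not reproduced here; the sketch below is a scalar random-walk Kalman filter with one common form of residual-based adaptation: when the normalized innovation exceeds a threshold, the predicted covariance is inflated so the filter re-converges quickly after a frequency jump. All noise levels and thresholds are illustrative.

```python
import numpy as np

def adaptive_kalman(z, q=1e-4, r=1e-2, c=3.0):
    """Scalar adaptive Kalman filter sketch: the predicted state covariance
    is inflated whenever the normalized innovation (residual) exceeds c."""
    x, P = z[0], 1.0
    est, flags = [], []
    for zk in z[1:]:
        P_pred = P + q                       # random-walk dynamic model
        v = zk - x                           # innovation (residual)
        t = abs(v) / np.sqrt(P_pred + r)     # normalized innovation
        alpha = 1.0 if t <= c else c / t     # adaptive factor
        P_pred = P_pred / alpha**2           # inflate covariance when anomalous
        K = P_pred / (P_pred + r)
        x = x + K * v
        P = (1 - K) * P_pred
        est.append(x)
        flags.append(t > c)                  # chi-square-like anomaly flag
    return np.array(est), np.array(flags)

rng = np.random.default_rng(1)
freq = np.zeros(200)
freq[100:] += 0.5                            # simulated frequency jump
z = freq + 0.05 * rng.standard_normal(200)
est, flags = adaptive_kalman(z, q=1e-4, r=0.05**2)
print(flags[99])                             # epoch of the jump is flagged
```

The flag sequence plays the role of the chi-square test mentioned in the abstract: under nominal Gaussian conditions the normalized innovation squared is chi-square distributed, so exceedances mark anomalies.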
Anomaly detection in hyperspectral imagery: statistics vs. graph-based algorithms
NASA Astrophysics Data System (ADS)
Berkson, Emily E.; Messinger, David W.
2016-05-01
Anomaly detection (AD) algorithms are frequently applied to hyperspectral imagery, but different algorithms produce different outlier results depending on the image scene content and the assumed background model. This work provides the first comparison of anomaly score distributions between common statistics-based anomaly detection algorithms (RX and subspace-RX) and the graph-based Topological Anomaly Detector (TAD). Anomaly scores in statistical AD algorithms should theoretically approximate a chi-squared distribution; however, this is rarely the case with real hyperspectral imagery. The expected distribution of scores found with graph-based methods remains unclear. We also look for general trends in algorithm performance with varied scene content. Three separate scenes were extracted from the hyperspectral MegaScene image taken over downtown Rochester, NY with the VIS-NIR-SWIR ProSpecTIR instrument. In order of most to least cluttered, we study an urban, suburban, and rural scene. The three AD algorithms were applied to each scene, and the distributions of the most anomalous 5% of pixels were compared. We find that subspace-RX performs better than RX, because the data becomes more normal when the highest variance principal components are removed. We also see that compared to statistical detectors, anomalies detected by TAD are easier to separate from the background. Due to their different underlying assumptions, the statistical and graph-based algorithms highlighted different anomalies within the urban scene. These results will lead to a deeper understanding of these algorithms and their applicability across different types of imagery.
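The RX score discussed above is the squared Mahalanobis distance of each pixel spectrum from the scene background; under the Gaussian assumption these scores follow a chi-squared distribution with as many degrees of freedom as bands. A minimal numpy sketch of the global RX detector on a synthetic scene (dimensions and values are illustrative):

```python
import numpy as np

def rx_scores(img):
    """Global RX sketch: squared Mahalanobis distance of every pixel
    spectrum from the scene mean, under a Gaussian background model."""
    h, w, b = img.shape
    X = img.reshape(-1, b)
    mu = X.mean(axis=0)
    icov = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mu
    return np.einsum('ij,jk,ik->i', d, icov, d).reshape(h, w)

# synthetic 3-band scene with one planted spectral outlier
rng = np.random.default_rng(0)
img = rng.multivariate_normal([0.2, 0.3, 0.4], 0.01 * np.eye(3), size=(64, 64))
img[10, 10] += 1.0                       # anomalous pixel
scores = rx_scores(img)
print(np.unravel_index(scores.argmax(), scores.shape))
```

Subspace-RX differs only in first projecting out the highest-variance principal components before computing the same distance, which (as the abstract notes) makes the residual background more nearly Gaussian.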
Gravity anomaly detection: Apollo/Soyuz
NASA Technical Reports Server (NTRS)
Vonbun, F. O.; Kahn, W. D.; Bryan, J. W.; Schmid, P. E.; Wells, W. T.; Conrad, D. T.
1976-01-01
The Goddard Apollo-Soyuz Geodynamics Experiment is described. It was performed to demonstrate the feasibility of tracking and recovering high-frequency components of the earth's gravity field by utilizing a synchronous orbiting tracking station such as ATS-6. Gravity anomalies of 5 mgal or larger, having wavelengths of 300 to 1000 kilometers on the earth's surface, are important for geologic studies of the upper layers of the earth's crust. Short-wavelength gravity anomalies were detected from space. Two prime areas of data collection were selected for the experiment: (1) the center of the African continent and (2) the Indian Ocean Depression centered at 5° north latitude and 75° east longitude. Preliminary results show that the detectability objective of the experiment was met in both areas, as well as at several additional anomalous areas around the globe. Gravity anomalies of the Karakoram and Himalayan mountain ranges, ocean trenches, as well as the Diamantina Depth, can be seen. Maps outlining the anomalies discovered are shown.
CHAMP: a locally adaptive unmixing-based hyperspectral anomaly detection algorithm
NASA Astrophysics Data System (ADS)
Crist, Eric P.; Thelen, Brian J.; Carrara, David A.
1998-10-01
Anomaly detection offers a means by which to identify potentially important objects in a scene without prior knowledge of their spectral signatures. As such, this approach is less sensitive to variations in target class composition, atmospheric and illumination conditions, and sensor gain settings than would be a spectral matched filter or similar algorithm. The best existing anomaly detectors generally fall into one of two categories: those based on local Gaussian statistics, and those based on linear mixing models. Unmixing-based approaches better represent the real distribution of data in a scene, but are typically derived and applied on a global or scene-wide basis. Locally adaptive approaches allow detection of more subtle anomalies by accommodating the spatial non-homogeneity of background classes in a typical scene, but provide a poorer representation of the true underlying background distribution. The CHAMP algorithm combines the best attributes of both approaches, applying a linear-mixing model approach in a spatially adaptive manner. The algorithm itself, and test results on simulated and actual hyperspectral image data, are presented in this paper.
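CHAMP itself is not reproduced here, but the core of any unmixing-based detector can be sketched in a few lines: fit each pixel as a linear combination of background endmembers and score it by the residual norm; pixels that cannot be explained as mixtures are anomalous. This sketch uses unconstrained least squares (real unmixing usually adds positivity and sum-to-one constraints) and hypothetical endmembers.

```python
import numpy as np

def unmixing_residual(pixels, endmembers):
    """Linear-mixing anomaly sketch: fit each pixel as a linear combination
    of background endmembers; the residual norm is the anomaly score."""
    # abundances via least squares (no positivity/sum-to-one constraints)
    A, *_ = np.linalg.lstsq(endmembers.T, pixels.T, rcond=None)
    resid = pixels.T - endmembers.T @ A
    return np.linalg.norm(resid, axis=0)

# two background endmembers in 4 bands, plus one planted anomaly
E = np.array([[0.1, 0.2, 0.3, 0.4],
              [0.4, 0.3, 0.2, 0.1]])
rng = np.random.default_rng(2)
ab = rng.uniform(0, 1, size=(100, 2))
pixels = ab @ E + 0.001 * rng.standard_normal((100, 4))
pixels[42] = np.array([0.9, 0.0, 0.9, 0.0])   # not a mixture of E
scores = unmixing_residual(pixels, E)
print(scores.argmax())
```

A locally adaptive variant in the spirit of CHAMP would re-derive the endmembers within a sliding window rather than once for the whole scene.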
Experiments on Adaptive Techniques for Host-Based Intrusion Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.
2001-09-01
This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their use of reinforcement learning, which allows learning of exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which is unable to detect novel exploits, and anomaly detection, which flags too many events, including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.
Mobile In Vivo Infrared Data Collection and Diagnoses Comparison System
NASA Technical Reports Server (NTRS)
Mintz, Frederick W. (Inventor); Gunapala, Sarath D. (Inventor); Moynihan, Philip I. (Inventor)
2013-01-01
Described is a mobile in vivo infrared brain scan and analysis system. The system includes a data collection subsystem and a data analysis subsystem. The data collection subsystem is a helmet with a plurality of infrared (IR) thermometer probes. Each of the IR thermometer probes includes an IR photodetector capable of detecting IR radiation generated by evoked potentials within a user's skull. The helmet is formed to collect brain data that is reflective of firing neurons in a mobile subject and transmit the brain data to the data analysis subsystem. The data analysis subsystem is configured to generate and display a three-dimensional image that depicts a location of the firing neurons. The data analysis subsystem is also configured to compare the brain data against a library of brain data to detect an anomaly in the brain data, and notify a user of any detected anomaly in the brain data.
NASA Astrophysics Data System (ADS)
Nagai, S.; Eto, S.; Tadokoro, K.; Watanabe, T.
2011-12-01
On-land geodetic observations are insufficient to monitor crustal activity in and around subduction zones, so seafloor geodetic observations are required. However, the present accuracy of seafloor geodetic observation is on the order of 1 cm or larger, which makes it difficult to detect deviations from plate motion over short time intervals, i.e., the plate coupling rate and its spatio-temporal variation. Our group has developed an observation system and methodology for seafloor geodesy that combines kinematic GPS and ocean acoustic ranging. One influential factor is acoustic velocity change in the ocean, due to changes in temperature, ocean currents of different scales, and so on. A typical perturbation of acoustic velocity produces on the order of a 1 ms difference in travel time, which corresponds to a 1 m difference in ray length. We have investigated this effect in seafloor geodesy using both observed and synthetic data, to reduce the estimation error of benchmarker (transponder) positions and to develop our strategy for observation and analysis. In this paper, we focus on forward modeling of travel times of acoustic ranging data and on recovery tests using synthetic data, in comparison with observed results [Eto et al., 2011; in this meeting]. The estimation procedure for benchmarker positions is similar to those used in earthquake location and seismic tomography, so we have applied methods from seismic studies, especially tomographic inversion. First, we use the one-dimensional velocity inversion with station corrections proposed by Kissling et al. [1994] to detect spatio-temporal change in ocean acoustic velocity from observed data in the Suruga-Nankai Trough, Japan. From these analyses, some important features have been clarified in the travel time data [Eto et al., 2011]. Most of them can be explained by a small velocity anomaly at a depth of 300 m or shallower, through forward modeling of travel time data using a simple velocity structure with a velocity anomaly.
However, due to the simple data acquisition procedure, we cannot precisely detect velocity anomalies in space and time, that is, the size of an anomaly and its movement. As a next step, we demonstrate recovery of benchmarker positions in a tomographic inversion using synthetic data that include anomalous travel times, to develop an approach for calculating benchmarker positions with high accuracy. In the tomographic inversion, we introduce some constraints corresponding to realistic conditions. This step yields a newly developed system for detecting crustal deformation through seafloor geodesy, and new findings for understanding deformation in and around plate boundaries.
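The abstract notes that benchmarker positioning is analogous to earthquake location. As a minimal illustration of that analogy, the sketch below solves for a seafloor transponder position from one-way acoustic travel times at known ship positions by Gauss-Newton iteration, assuming a uniform sound speed; the survey geometry and 1500 m/s velocity are hypothetical, and the real analysis must additionally estimate the velocity perturbations discussed above.

```python
import numpy as np

def locate_benchmarker(ship_pos, travel_times, v=1500.0, iters=15):
    """Gauss-Newton sketch (as in earthquake location): find the benchmarker
    position that best fits acoustic travel times from known ship positions,
    assuming a uniform sound speed v (m/s)."""
    x = np.array([0.0, 0.0, -2000.0])            # initial guess
    for _ in range(iters):
        d = np.linalg.norm(ship_pos - x, axis=1)
        r = travel_times - d / v                 # travel-time residuals
        J = (x - ship_pos) / (d[:, None] * v)    # d(traveltime)/d(position)
        dx, *_ = np.linalg.lstsq(J, r, rcond=None)
        x = x + dx
    return x

# synthetic survey: ship circles above a transponder (noise-free times)
true = np.array([100.0, -50.0, -2950.0])
ang = np.linspace(0, 2 * np.pi, 8, endpoint=False)
ship = np.column_stack([2000 * np.cos(ang), 2000 * np.sin(ang), np.zeros(8)])
tt = np.linalg.norm(ship - true, axis=1) / 1500.0
est = locate_benchmarker(ship, tt)
print(np.linalg.norm(est - true) < 1.0)
```

An unmodeled velocity anomaly of the kind described in the abstract would bias `tt` and hence the recovered position, which is why the authors pursue a joint tomographic treatment.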
Fault Detection and Diagnosis of Railway Point Machines by Sound Analysis
Lee, Jonguk; Choi, Heesu; Park, Daihee; Chung, Yongwha; Kim, Hee-Young; Yoon, Sukhan
2016-01-01
Railway point devices act as actuators that provide different routes to trains by driving switch blades from the current position to the opposite one. Point failure can significantly affect railway operations, with potentially disastrous consequences. Therefore, early detection of anomalies is critical for monitoring and managing the condition of rail infrastructure. We present a data mining solution that utilizes audio data to efficiently detect and diagnose faults in railway condition monitoring systems. The system extracts mel-frequency cepstral coefficients (MFCCs) from audio data, reduces the feature dimensionality using attribute subset selection, and employs support vector machines (SVMs) for early detection and classification of anomalies. Experimental results show that the system enables cost-effective detection and diagnosis of faults using a cheap microphone, with accuracy exceeding 94.1% whether used alone or in combination with other known methods. PMID:27092509
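A scikit-learn sketch of the classification stage described above, assuming MFCC feature vectors have already been extracted (synthetic stand-ins are used here): attribute subset selection is approximated with univariate feature selection, followed by an RBF-kernel SVM. The feature shapes, shift pattern and parameter k are all hypothetical.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# stand-ins for per-clip MFCC feature vectors (13 coefficients each)
rng = np.random.default_rng(11)
normal = rng.normal(0.0, 1.0, size=(80, 13))
faulty = rng.normal(0.0, 1.0, size=(80, 13))
faulty[:, :3] += 2.5          # pretend faults shift low-order coefficients
X = np.vstack([normal, faulty])
y = np.array([0] * 80 + [1] * 80)

# feature subset selection (keep 5 most discriminative coefficients)
# followed by an SVM classifier, mirroring the pipeline in the abstract
clf = make_pipeline(SelectKBest(f_classif, k=5), SVC(kernel='rbf'))
clf.fit(X, y)
acc = clf.score(X, y)
print(acc > 0.9)
```

In practice the MFCCs would come from the microphone recordings, and accuracy would be reported on held-out data rather than the training set used in this toy example.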
NASA Astrophysics Data System (ADS)
Freeman, Mervyn; Lam, Mai Mai; Chisham, Gareth
2017-04-01
We use National Centers for Environmental Prediction (NCEP)/National Center for Atmospheric Research (NCAR) reanalysis data to show that Antarctic surface air temperature anomalies result from differences in the daily-mean duskward component, By, of the interplanetary magnetic field (IMF). We find the anomalies have strong geographical and seasonal variations. Regional anomalies are evident poleward of 60˚ S and are of diminishing representative peak amplitude from autumn (3.2˚ C) to winter (2.4˚ C) to spring (1.6˚ C) to summer (0.9˚ C). We demonstrate that anomalies of statistically significant amplitude are due to geostrophic wind anomalies, resulting from the same By changes, moving air across large meridional gradients in zonal mean air temperature between 60 and 80˚ S. Additionally, we find that the mean tropospheric temperature anomaly for geographical latitudes ≤ -70˚ peaks at about 0.7 K and is statistically significant at the 1-5% level between air pressures of 1000 and 500 hPa (i.e., ~0.1 to 5.6 km altitude above sea level) and for time lags with respect to the IMF of up to 7 days. The signature propagates vertically between air pressure p ≥ 850 hPa (≤ 1.5 km) and p = 500 hPa (~5.6 km). The characteristics of prompt response and vertical propagation within the troposphere have previously been seen in the correlation between the IMF and high-latitude air pressure anomalies, known as the Mansurov effect, at higher statistical significances (1%). We conclude that we have identified the temperature signature of the Mansurov effect in the Antarctic troposphere. Since these tropospheric anomalies have been associated with By-driven anomalies in the electric potential of the ionosphere, we further conclude that they are caused by IMF-induced changes to the global atmospheric electric circuit (GEC).
Our results support the view that variations in the ionospheric potential act on the troposphere via the action of resulting variations in the downwards current of the GEC on tropospheric clouds.
Extending TOPS: A Prototype MODIS Anomaly Detection Architecture
NASA Astrophysics Data System (ADS)
Votava, P.; Nemani, R. R.; Srivastava, A. N.
2008-12-01
The management and processing of Earth science data has been gaining importance over the last decade due to higher data volumes generated by a larger number of instruments, and due to the increase in complexity of Earth science models that use this data. The volume of data itself is often a limiting factor in obtaining the information needed by scientists; without more sophisticated data volume reduction technologies, possible key information may not be discovered. We are especially interested in automatic identification of disturbances within ecosystems (e.g., wildfires, droughts, floods, insect/pest damage, wind damage, logging), and in focusing our analysis efforts on the identified areas. There are dozens of variables that define the health of an ecosystem, and both long-term and short-term changes in these variables can serve as early indicators of natural disasters and of shifts in climate and ecosystem health. These changes can have profound socio-economic impacts, and we need to develop capabilities for identification, analysis and response to these changes in a timely manner. Because the ecosystem consists of a large number of variables, a disturbance may be apparent only when we examine relationships among multiple variables, even though none of them is alarming by itself. We have to be able to extract information from multiple sensors and observations and discover these underlying relationships. As data volumes increase, there is also potential for a large number of anomalies to "flood" the system, so we need the ability to automatically select the most likely and most important anomalies and to analyze them with minimal involvement of scientists.
We describe a prototype architecture for anomaly driven data reduction for both near-real-time and archived surface reflectance data from the MODIS instrument collected over Central California and test it using Orca and One-Class Support Vector Machines algorithms. We demonstrate our efforts in the context of the Terrestrial Observation and Prediction System (TOPS), a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in a range of applications including natural resources management, public health and disaster management.
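Orca, one of the two algorithms tested above, is a distance-based anomaly detector: observations far from their nearest neighbours in feature space score highest. A minimal numpy sketch of that scoring idea on synthetic reflectance-like features (the brute-force distance matrix stands in for Orca's optimized pruning search):

```python
import numpy as np

def knn_anomaly_scores(X, k=5):
    """Orca-style sketch: anomaly score = mean distance to the k nearest
    neighbours; observations far from all others rank highest."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)              # exclude self-distance
    knn = np.sort(D, axis=1)[:, :k]
    return knn.mean(axis=1)

rng = np.random.default_rng(3)
X = rng.normal(0.0, 1.0, size=(300, 4))      # nominal observations
X[7] = 8.0                                   # planted anomaly
scores = knn_anomaly_scores(X)
print(scores.argmax())
```

The One-Class SVM mentioned alongside Orca instead learns a boundary around the nominal data and flags points outside it; both produce a ranked anomaly list of the kind the prototype architecture consumes.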
NASA Astrophysics Data System (ADS)
McCann, Cooper Patrick
Low-cost flight-based hyperspectral imaging systems have the potential to provide valuable information for ecosystem and environmental studies, as well as to aid in land management and land health monitoring. This thesis describes (1) a bootstrap method of producing mesoscale, radiometrically-referenced hyperspectral data using the Landsat surface reflectance (LaSRC) data product as a reference target, (2) biophysically relevant basis functions to model the reflectance spectra, (3) an unsupervised classification technique based on natural histogram splitting of these biophysically relevant parameters, and (4) local and multi-temporal anomaly detection. The bootstrap method extends standard processing techniques to remove uneven illumination conditions between flight passes, allowing the creation of radiometrically self-consistent data. Through selective spectral and spatial resampling, LaSRC data is used as a radiometric reference target. Advantages of the bootstrap method include the need for minimal site access, no ancillary instrumentation, and automated data processing. Data from a flight on 06/02/2016 are compared with concurrently collected ground-based reflectance spectra as a means of validation, achieving an average error of 2.74%. Fitting reflectance spectra using basis functions based on biophysically relevant spectral features allows both noise and data reduction while shifting information from spectral bands to biophysical features. Histogram splitting is used to determine a clustering based on natural splittings of these fit parameters. The Indian Pines reference data enabled comparisons of the efficacy of this technique to established techniques. The splitting technique is shown to be an improvement over the ISODATA clustering technique with an overall accuracy of 34.3/19.0% before merging and 40.9/39.2% after merging.
This improvement is also seen as an improvement of kappa before/after merging of 24.8/30.5 for the histogram splitting technique compared to 15.8/28.5 for ISODATA. Three hyperspectral flights over the Kevin Dome area, covering 1843 ha, acquired 06/21/2014, 06/24/2015 and 06/26/2016 are examined with different methods of anomaly detection. Detection of anomalies within a single data set is examined to determine, on a local scale, areas that are significantly different from the surrounding area. Additionally, the detection and identification of persistent anomalies and non-persistent anomalies was investigated across multiple data sets.
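The thesis does not spell out its splitting rule, so the following is a one-dimensional numpy sketch of the natural-histogram-splitting idea: locate the two dominant modes of a fit parameter and place the cluster boundary at the valley between them. The bin count, mode-separation rule and data are illustrative.

```python
import numpy as np

def natural_split(values, bins=64, min_sep=8):
    """Histogram-splitting sketch: find the two dominant modes of a fit
    parameter and place the cluster boundary at the valley between them."""
    counts, edges = np.histogram(values, bins=bins)
    p1 = counts.argmax()                           # dominant mode
    idx = np.arange(bins)
    far = idx[np.abs(idx - p1) >= min_sep]         # bins well away from it
    p2 = far[counts[far].argmax()]                 # second mode
    lo, hi = sorted((p1, p2))
    valley = lo + counts[lo:hi + 1].argmin()       # deepest point between
    return 0.5 * (edges[valley] + edges[valley + 1])

# bimodal distribution of a fit parameter -> one natural boundary
rng = np.random.default_rng(4)
v = np.concatenate([rng.normal(0.0, 0.1, 1000), rng.normal(1.0, 0.1, 1000)])
cut = natural_split(v)
labels = (v > cut).astype(int)
print(0.2 < cut < 0.8)
```

Applied recursively per parameter, splits like this yield the unsupervised clusters that the thesis then merges and compares against ISODATA.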
Data cleaning in the energy domain
NASA Astrophysics Data System (ADS)
Akouemo Kengmo Kenfack, Hermine N.
This dissertation addresses the problem of data cleaning in the energy domain, especially for natural gas and electric time series. The detection and imputation of anomalies improves the performance of forecasting models necessary to lower purchasing and storage costs for utilities and to plan for peak energy loads or distribution shortages. There are various types of anomalies, each induced by diverse causes and sources depending on the field of study. The definition of false positives also depends on the context. The analysis is focused on energy data because of the availability of data and information to make a theoretical and practical contribution to the field. A probabilistic approach based on hypothesis testing is developed to decide whether a data point is anomalous based on the level of significance. Furthermore, the probabilistic approach is combined with statistical regression models to handle time series data. Domain knowledge of energy data and the survey of causes and sources of anomalies in energy are incorporated into the data cleaning algorithm to improve the accuracy of the results. The data cleaning method is evaluated on simulated data sets in which anomalies were artificially inserted, and on natural gas and electric data sets. In the simulation study, the performance of the method is evaluated for both detection and imputation on all identified causes of anomalies in energy data. The testing on utilities' data evaluates the percentage of improvement brought to forecasting accuracy by data cleaning. A cross-validation study of the results is also performed to demonstrate the performance of the data cleaning algorithm on smaller data sets and to calculate a confidence interval for the results. The data cleaning algorithm is able to successfully identify energy time series anomalies. The replacement of those anomalies improves forecasting model accuracy.
The process is automatic, which is important because many data cleaning processes require human input and become impractical for very large data sets. The techniques are also applicable to other fields such as econometrics and finance, but the exogenous factors of the time series data need to be well defined.
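The dissertation's exact test statistic is not given here; the sketch below shows the general hypothesis-testing pattern it describes: fit a regression model to the load series, flag points whose standardized residual is significant, and impute them from the model. The seasonal model, threshold and injected anomaly are illustrative.

```python
import numpy as np

def clean_series(y, t, alpha=3.0):
    """Data-cleaning sketch: fit a seasonal regression, flag points whose
    standardized residual exceeds alpha, and impute them from the fit."""
    X = np.column_stack([np.ones_like(t),
                         np.sin(2 * np.pi * t / 365),
                         np.cos(2 * np.pi * t / 365)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    fit = X @ beta
    r = y - fit
    z = (r - r.mean()) / r.std()
    anomalous = np.abs(z) > alpha            # hypothesis test per point
    cleaned = np.where(anomalous, fit, y)    # impute from the model
    return cleaned, anomalous

t = np.arange(365.0)
rng = np.random.default_rng(5)
load = 100 + 20 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, 365)
load[50] += 30.0                             # injected anomaly
cleaned, flags = clean_series(load, t)
print(flags[50], int(flags.sum()))
```

Replacing the flagged point with the model value is what restores the series for downstream forecasting, the improvement the dissertation quantifies.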
Recombinant Temporal Aberration Detection Algorithms for Enhanced Biosurveillance
Murphy, Sean Patrick; Burkom, Howard
2008-01-01
Objective Broadly, this research aims to improve the outbreak detection performance and, therefore, the cost effectiveness of automated syndromic surveillance systems by building novel, recombinant temporal aberration detection algorithms from components of previously developed detectors. Methods This study decomposes existing temporal aberration detection algorithms into two sequential stages and investigates the individual impact of each stage on outbreak detection performance. The data forecasting stage (Stage 1) generates predictions of time series values a certain number of time steps in the future based on historical data. The anomaly measure stage (Stage 2) compares features of this prediction to corresponding features of the actual time series to compute a statistical anomaly measure. A Monte Carlo simulation procedure is then used to examine the recombinant algorithms’ ability to detect synthetic aberrations injected into authentic syndromic time series. Results New methods obtained with procedural components of published, sometimes widely used, algorithms were compared to the known methods using authentic datasets with plausible stochastic injected signals. Performance improvements were found for some of the recombinant methods, and these improvements were consistent over a range of data types, outbreak types, and outbreak sizes. For gradual outbreaks, the WEWD MovAvg7+WEWD Z-Score recombinant algorithm performed best; for sudden outbreaks, the HW+WEWD Z-Score performed best. Conclusion This decomposition was found not only to yield valuable insight into the effects of the aberration detection algorithms but also to produce novel combinations of data forecasters and anomaly measures with enhanced detection performance. PMID:17947614
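The two-stage decomposition above can be made concrete with one of the recombinant pairs the paper evaluates: a 7-day moving-average forecaster (Stage 1) feeding a Z-score anomaly measure (Stage 2). This numpy sketch uses synthetic Poisson counts with an injected outbreak; window length and signal size are illustrative.

```python
import numpy as np

def movavg_zscore(counts, window=7):
    """Recombinant sketch: Stage 1 forecasts each day as the mean of the
    previous `window` days; Stage 2 converts the forecast error into a
    Z-score using the same window's standard deviation."""
    scores = np.zeros(len(counts))
    for i in range(window, len(counts)):
        hist = counts[i - window:i]
        pred = hist.mean()                       # Stage 1: data forecast
        sd = hist.std(ddof=1) or 1.0
        scores[i] = (counts[i] - pred) / sd      # Stage 2: anomaly measure
    return scores

rng = np.random.default_rng(6)
counts = rng.poisson(20, size=60).astype(float)  # syndromic daily counts
counts[45:48] += 40                              # injected outbreak signal
scores = movavg_zscore(counts)
print(scores[45] > 4)
```

Swapping in a different Stage 1 (e.g. Holt-Winters, as in the HW+Z-Score pairing for sudden outbreaks) changes only `pred`, which is exactly the recombination the paper exploits.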
NASA Astrophysics Data System (ADS)
Bellaoui, Mebrouk; Hassini, Abdelatif; Bouchouicha, Kada
2017-05-01
Detection of thermal anomalies prior to earthquake events has been widely confirmed by researchers over the past decade. One of the popular approaches for anomaly detection is the Robust Satellite Technique (RST). In this paper, we apply this method to a collection of six years of MODIS satellite data, representing land surface temperature (LST) images, to predict the 21 May 2003 Boumerdes, Algeria earthquake. The thermal anomaly results were compared with the ambient temperature variation measured at three meteorological stations of the Algerian National Office of Meteorology (ONM) (DELLYS-AFIR, TIZI-OUZOU, and DAR-EL-BEIDA). The results confirm the importance of RST as a highly effective approach for monitoring earthquakes.
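At its core, RST standardizes the current observation against a multi-year stack of co-located, same-season scenes (an ALICE-like index), so that a pixel is anomalous only relative to its own history. A minimal numpy sketch on synthetic LST images; the 20-scene stack (used here for a stable baseline), grid size and anomaly amplitude are illustrative.

```python
import numpy as np

def rst_index(current_lst, historical_lst):
    """RST sketch: standardize the current LST image against the multi-year
    pixel-wise mean and std for the same place and season; index values
    of roughly 2-3 or more mark thermal anomalies."""
    mu = historical_lst.mean(axis=0)
    sd = historical_lst.std(axis=0)
    return (current_lst - mu) / sd

rng = np.random.default_rng(7)
hist = 290 + 2.0 * rng.standard_normal((20, 32, 32))  # historical LST stack
cur = 290 + 2.0 * rng.standard_normal((32, 32))       # current scene
cur[5, 5] += 20.0                                     # thermal anomaly
idx = rst_index(cur, hist)
print(np.unravel_index(idx.argmax(), idx.shape))
```

Because both the mean and the spread are pixel-specific, seasonal and site effects (the very signals the meteorological-station comparison controls for) are removed before thresholding.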
Hierarchical Kohonen net for anomaly detection in network security.
Sarasamma, Suseela T; Zhu, Qiuming A; Huff, Julie
2005-04-01
A novel multilevel hierarchical Kohonen Net (K-Map) for an intrusion detection system is presented. Each level of the hierarchical map is modeled as a simple winner-take-all K-Map. One significant advantage of this multilevel hierarchical K-Map is its computational efficiency. Unlike other statistical anomaly detection methods such as nearest neighbor approach, K-means clustering or probabilistic analysis that employ distance computation in the feature space to identify the outliers, our approach does not involve costly point-to-point computation in organizing the data into clusters. Another advantage is the reduced network size. We use the classification capability of the K-Map on selected dimensions of data set in detecting anomalies. Randomly selected subsets that contain both attacks and normal records from the KDD Cup 1999 benchmark data are used to train the hierarchical net. We use a confidence measure to label the clusters. Then we use the test set from the same KDD Cup 1999 benchmark to test the hierarchical net. We show that a hierarchical K-Map in which each layer operates on a small subset of the feature space is superior to a single-layer K-Map operating on the whole feature space in detecting a variety of attacks in terms of detection rate as well as false positive rate.
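A single winner-take-all layer of the kind stacked in the hierarchical K-Map can be sketched in a few lines of numpy: each unit's weight vector moves toward the samples it wins, and at test time the distance to the winning unit (quantization error) serves as the anomaly score. This is one layer only, on synthetic data; the paper's multilevel structure, subset-of-features routing and confidence labeling are not reproduced.

```python
import numpy as np

def train_kmap(X, n_units=8, epochs=20, lr=0.2, seed=0):
    """Winner-take-all K-Map layer sketch: only the winning unit's weight
    vector is moved toward each training sample."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_units, replace=False)].copy()
    for _ in range(epochs):
        for x in X:
            w = np.argmin(np.linalg.norm(W - x, axis=1))  # winner unit
            W[w] += lr * (x - W[w])                       # move winner only
    return W

def qe(x, W):
    """Quantization error: distance to the nearest (winning) unit."""
    return np.linalg.norm(W - x, axis=1).min()

rng = np.random.default_rng(8)
normal = rng.normal(0, 1, size=(400, 6))   # stand-in connection records
W = train_kmap(normal)
attack = np.full(6, 6.0)                   # record far from normal traffic
print(qe(attack, W) > qe(normal[0], W))
```

Note the absence of point-to-point distance computation between records, the efficiency property the abstract emphasizes: training touches only the unit weights.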
Using Physical Models for Anomaly Detection in Control Systems
NASA Astrophysics Data System (ADS)
Svendsen, Nils; Wolthusen, Stephen
Supervisory control and data acquisition (SCADA) systems are increasingly used to operate critical infrastructure assets. However, the inclusion of advanced information technology and communications components and elaborate control strategies in SCADA systems increase the threat surface for external and subversion-type attacks. The problems are exacerbated by site-specific properties of SCADA environments that make subversion detection impractical; and by sensor noise and feedback characteristics that degrade conventional anomaly detection systems. Moreover, potential attack mechanisms are ill-defined and may include both physical and logical aspects.
A 1985-2015 data-driven global reconstruction of GRACE total water storage
NASA Astrophysics Data System (ADS)
Humphrey, Vincent; Gudmundsson, Lukas; Isabelle Seneviratne, Sonia
2016-04-01
After thirteen years of measurements, the Gravity Recovery and Climate Experiment (GRACE) mission has enabled an unprecedented view of total water storage (TWS) variability. However, the relatively short record length, irregular time steps and multiple data gaps since 2011 still represent important limitations to a wider use of this dataset within the hydrological and climatological community, especially for applications such as model evaluation or assimilation of GRACE in land surface models. To address this issue, we make use of the available GRACE record (2002-2015) to infer local statistical relationships between detrended monthly TWS anomalies and the main controlling atmospheric drivers (e.g. daily precipitation and temperature) at 1 degree resolution (Humphrey et al., in revision). Long-term and homogeneous monthly time series of detrended anomalies in total water storage are then reconstructed for the period 1985-2015. The quality of this reconstruction is evaluated in two different ways. First, we perform a cross-validation experiment to assess the performance and robustness of the statistical model. Second, we compare with independent basin-scale estimates of TWS anomalies derived by means of a combined atmospheric and terrestrial water balance using atmospheric water vapor flux convergence and change in atmospheric water vapor content (Mueller et al. 2011). The reconstructed time series are shown to provide robust data-driven estimates of global variations in water storage over large regions of the world. Example applications are provided for illustration, including an analysis of selected major drought events which occurred before the GRACE era.
References: Humphrey V, Gudmundsson L, Seneviratne SI (in revision) Assessing global water storage variability from GRACE: trends, seasonal cycle, sub-seasonal anomalies and extremes. Surv Geophys. Mueller B, Hirschi M, Seneviratne SI (2011) New diagnostic estimates of variations in terrestrial water storage based on ERA-Interim data. Hydrol Process 25:996-1008.
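The statistical model behind such a reconstruction is not specified in the abstract; a simple stand-in that captures the idea is a per-grid-cell regression of monthly TWS anomalies on current and lagged precipitation and temperature anomalies, trained on the GRACE era and applied to the longer forcing record. The synthetic "grid cell" below, its lag structure and noise level are all hypothetical.

```python
import numpy as np

def tws_design(P, T, lags=3):
    """Design matrix: current and lagged precip/temp anomalies + intercept."""
    return np.array([np.concatenate([P[i - lags:i + 1],
                                     T[i - lags:i + 1], [1.0]])
                     for i in range(lags, len(P))])

# synthetic grid cell: storage integrates recent precipitation anomalies
rng = np.random.default_rng(9)
P = rng.standard_normal(372)                 # monthly anomalies, 1985-2015
T = rng.standard_normal(372)
tws = np.zeros(372)
for i in range(2, 372):
    tws[i] = 0.5 * P[i] + 0.3 * P[i - 1] + 0.2 * P[i - 2] - 0.1 * T[i]
tws += 0.02 * rng.standard_normal(372)       # observation noise

# fit on the "GRACE era" (last 172 months), reconstruct the full record
beta, *_ = np.linalg.lstsq(tws_design(P[200:], T[200:]),
                           tws[200:][3:], rcond=None)
recon = tws_design(P, T) @ beta
r = np.corrcoef(recon, tws[3:])[0, 1]
print(r > 0.95)
```

Cross-validation of such a model, as in the abstract, would hold out GRACE months rather than score the fit in-sample as this toy does.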
SSME Post Test Diagnostic System: Systems Section
NASA Technical Reports Server (NTRS)
Bickmore, Timothy
1995-01-01
An assessment of engine and component health is routinely made after each test firing or flight firing of a Space Shuttle Main Engine (SSME). Currently, this health assessment is done by teams of engineers who manually review sensor data, performance data, and engine and component operating histories. Based on review of information from these various sources, an evaluation is made as to the health of each component of the SSME and the preparedness of the engine for another test or flight. The objective of this project - the SSME Post Test Diagnostic System (PTDS) - is to develop a computer program which automates the analysis of test data from the SSME in order to detect and diagnose anomalies. This report primarily covers work on the Systems Section of the PTDS, which automates the analyses performed by the systems/performance group at the Propulsion Branch of NASA Marshall Space Flight Center (MSFC). This group is responsible for assessing the overall health and performance of the engine, and detecting and diagnosing anomalies which involve multiple components (other groups are responsible for analyzing the behavior of specific components). The PTDS utilizes several advanced software technologies to perform its analyses. Raw test data is analyzed using signal processing routines which detect features in the data, such as spikes, shifts, peaks, and drifts. Component analyses are performed by expert systems, which use 'rules-of-thumb' obtained from interviews with the MSFC data analysts to detect and diagnose anomalies. The systems analysis is performed using case-based reasoning. Results of all analyses are stored in a relational database and displayed via an X-window-based graphical user interface which provides ranked lists of anomalies and observations by engine component, along with supporting data plots for each.
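The signal-processing front end described above (detecting spikes, shifts, peaks, and drifts) can be sketched with simple sliding-window statistics; the PTDS's actual routines are not documented here, so thresholds, window lengths and the two feature types shown (a robust spike test and a two-sample shift test) are illustrative.

```python
import numpy as np

def detect_spikes_and_shifts(y, window=25, spike_z=6.0, shift_z=4.0):
    """Feature-detection sketch: a spike is one sample far from its local
    median (robust MAD scale); a shift is a sustained change in mean
    between the two halves of a sliding window."""
    spikes, shifts = [], []
    for i in range(window, len(y) - window):
        local = np.concatenate([y[i - window:i], y[i + 1:i + 1 + window]])
        mad = np.median(np.abs(local - np.median(local))) or 1e-9
        if abs(y[i] - np.median(local)) / (1.4826 * mad) > spike_z:
            spikes.append(i)
        a, b = y[i - window:i], y[i:i + window]
        pooled = np.sqrt((a.var() + b.var()) / 2) or 1e-9
        if abs(b.mean() - a.mean()) / pooled > shift_z:
            shifts.append(i)
    return spikes, shifts

rng = np.random.default_rng(10)
y = rng.normal(0, 0.1, 500)       # stand-in for a sensor channel
y[120] += 2.0                     # sensor spike
y[300:] += 1.0                    # level shift
spikes, shifts = detect_spikes_and_shifts(y)
print(120 in spikes, any(295 <= s <= 305 for s in shifts))
```

Features like these feed the expert-system rules and case-based reasoning layers that perform the actual anomaly diagnosis.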
Dispersive Phase in the L-band InSAR Image Associated with Heavy Rain Episodes
NASA Astrophysics Data System (ADS)
Furuya, M.; Kinoshita, Y.
2017-12-01
Interferometric synthetic aperture radar (InSAR) is a powerful geodetic technique that allows us to detect ground displacements with unprecedented spatial resolution, and has been used to detect displacements due to earthquakes, volcanic eruptions, and glacier motion. In the meantime, due to microwave propagation through the ionosphere and troposphere, we often encounter non-negligible phase anomalies in InSAR data. Correcting for the ionosphere and troposphere is therefore a long-standing issue for high-precision geodetic measurements. However, if ground displacements are negligible, the InSAR image can tell us the details of the atmosphere. Kinoshita and Furuya (2017, SOLA) detected a phase anomaly in ALOS/PALSAR InSAR data associated with heavy rain over the Niigata area, Japan, and performed a numerical weather model simulation to reproduce the anomaly; ALOS/PALSAR is a satellite-based L-band SAR sensor launched by JAXA in 2006 and terminated in 2011. The phase anomaly could be largely reproduced using the output data from the weather model. However, we should note that numerical weather model outputs can only account for the non-dispersive effect in the phase anomaly. In case of a severe weather event, we may expect a dispersive effect caused by the presence of free electrons. In Global Navigation Satellite System (GNSS) positioning, dual-frequency measurements allow us to separate the ionospheric dispersive component from the tropospheric non-dispersive component. In contrast, SAR imaging is based on a single carrier frequency, and thus no operational ionospheric corrections have been performed in InSAR data analyses. Recently, Gomba et al. (2016) detailed the processing strategy of the split spectrum method (SSM) for InSAR, which splits the finite bandwidth of the range spectrum and virtually allows for dual-frequency measurements. We apply the L-band InSAR SSM to heavy rain episodes in which precipitation rates of more than 50 mm/hour were reported.
We report the presence of phase anomaly in both dispersive and non-dispersive components. While the original phase anomaly turns out to be mostly due to the non-dispersive effect, we could recognize local anomalies in the dispersive component as well. We will discuss its geophysical implications, and may show several case studies.
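The split spectrum separation rests on the two-term phase model phi(f) = a*f + b/f, where the a*f term is non-dispersive (tropospheric) and the b/f term is dispersive (ionospheric). Solving the two sub-band equations for a and b gives a minimal sketch; the ALOS-like frequencies used in testing are illustrative assumptions:

```python
import numpy as np

def split_spectrum(phi_low, phi_high, f_low, f_high, f0):
    """Separate dispersive (ionospheric, ~1/f) and non-dispersive (tropospheric,
    ~f) phase from two sub-band interferograms, following the phase model
    phi(f) = a*f + b/f.  Frequencies in Hz, phases in radians."""
    denom = f_high**2 - f_low**2
    a = (f_high * phi_high - f_low * phi_low) / denom                    # non-dispersive slope
    b = f_low * f_high * (f_high * phi_low - f_low * phi_high) / denom  # dispersive term
    # Evaluate both components at the centre frequency f0.
    return a * f0, b / f0
```

Because the two sub-bands are close in frequency, the subtraction in the numerators amplifies noise, which is why operational SSM processing filters the sub-band interferograms heavily before inversion.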
Communications and tracking expert systems study
NASA Technical Reports Server (NTRS)
Leibfried, T. F.; Feagin, Terry; Overland, David
1987-01-01
The original objectives of the study consisted of five broad areas of investigation: criteria and issues for explanation of communication and tracking system anomaly detection, isolation, and recovery; data storage simplification issues for fault detection expert systems; data selection procedures for decision tree pruning and optimization to enhance the abstraction of pertinent information for clear explanation; criteria for establishing levels of explanation suited to needs; and analysis of expert system interaction and modularization. Progress was made in all areas, but to a lesser extent in the criteria for establishing levels of explanation suited to needs. Among the types of expert systems studied were those related to anomaly or fault detection, isolation, and recovery.
NASA Technical Reports Server (NTRS)
Phillips, R. J.
1986-01-01
Crustal anomaly detection with MAGSAT data is frustrated by the inherent resolving power of the data and by contamination from the external and core fields. The quality of the data might be tested by modeling specific tectonic features which produce anomalies that fall within the proposed resolution and crustal amplitude capabilities of the MAGSAT fields. To test this hypothesis, the north African hotspots associated with Ahaggar, Tibesti and Darfur have been modeled as magnetic induction anomalies due solely to a shallower depth to the Curie isotherm surface beneath these features. The MAGSAT data were reduced by subtracting the external and core fields to isolate the scalar and vertical component crustal signals. The predicted model magnetic signal arising from the surface topography of the uplift and the Curie isotherm surface was calculated at MAGSAT altitudes by the Fourier transform technique, modified to allow for variable magnetization. In summary, it is suggested that the region beneath Ahaggar is associated with a strong thermal anomaly, and the predicted anomaly best fits the associated MAGSAT anomaly if the African plate is moving in a northeasterly direction.
Unsupervised Anomaly Detection Based on Clustering and Multiple One-Class SVM
NASA Astrophysics Data System (ADS)
Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Kwon, Yongjin
Intrusion detection systems (IDS) have played an important role as devices to defend our networks from cyber attacks. However, since they are unable to detect unknown attacks, i.e., 0-day attacks, the ultimate challenge in the intrusion detection field is how to identify such attacks exactly and in an automated manner. Over the past few years, several studies addressing these problems have been made on anomaly detection using unsupervised learning techniques such as clustering, one-class support vector machines (SVM), etc. Although these enable one to construct intrusion detection models at low cost and effort, and have the capability to detect unforeseen attacks, they still suffer from two main problems in intrusion detection: a low detection rate and a high false positive rate. In this paper, we propose a new anomaly detection method based on clustering and multiple one-class SVMs in order to improve the detection rate while maintaining a low false positive rate. We evaluated our method using the KDD Cup 1999 data set. Evaluation results show that our approach outperforms the existing algorithms reported in the literature, especially in the detection of unknown attacks.
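The clustering-plus-multiple-one-class-SVM idea can be sketched with scikit-learn: cluster the unlabeled traffic, fit one one-class SVM per cluster, and judge a new point by the model of its nearest cluster. The cluster count, nu, and kernel settings below are illustrative, not the paper's tuned values:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import OneClassSVM

class ClusteredOCSVM:
    """Sketch of clustering + multiple one-class SVMs: fit one OC-SVM per
    k-means cluster; a point is anomalous if the OC-SVM of its nearest
    cluster rejects it.  Hyperparameters are illustrative."""

    def __init__(self, n_clusters=3, nu=0.05, gamma="scale"):
        self.km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
        self.nu, self.gamma = nu, gamma

    def fit(self, X):
        labels = self.km.fit_predict(X)
        self.svms = [
            OneClassSVM(nu=self.nu, gamma=self.gamma).fit(X[labels == k])
            for k in range(self.km.n_clusters)
        ]
        return self

    def predict(self, X):
        # +1 = normal, -1 = anomaly, judged by the nearest cluster's model.
        nearest = self.km.predict(X)
        return np.array([self.svms[k].predict(x[None, :])[0]
                         for k, x in zip(nearest, X)])
```

Fitting one model per cluster lets each SVM learn a tight local boundary, which is the mechanism the paper uses to raise the detection rate without inflating false positives.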
NASA Astrophysics Data System (ADS)
Greatbatch, Richard J.; Zhu, Xiaoting; Claus, Martin
2018-03-01
Monthly mean sea level anomalies in the tropical Pacific for the period 1961-2002 are reconstructed using a linear, multimode model driven by monthly mean wind stress anomalies from the NCEP/NCAR and ERA-40 reanalysis products. Overall, the sea level anomalies reconstructed by both wind stress products agree well with the available tide gauge data, although with poor performance at Kanton Island in the western-central equatorial Pacific and reduced amplitude at Christmas Island. The reduced performance is related to model error in locating the pivot point in sea level variability associated with the so-called "tilt" mode. We present evidence that the pivot point was further west during the period 1993-2014 than during the period 1961-2002 and attribute this to a persistent upward trend in the zonal wind stress variance along the equator west of 160° W throughout the period 1961-2014. Experiments driven by the zonal component of the wind stress alone reproduce much of the trend in sea level found in the experiments driven by both components of the wind stress. The experiments show an upward trend in sea level in the eastern tropical Pacific over the period 1961-2002, but with a much stronger upward trend when using the NCEP/NCAR product. We argue that the latter is related to an overly strong eastward trend in zonal wind stress in the eastern-central Pacific that is believed to be a spurious feature of the NCEP/NCAR product.
CSAX: Characterizing Systematic Anomalies in eXpression Data
Noto, Keith; Majidi, Saeed; Edlow, Andrea G.; Wick, Heather C.; Bianchi, Diana W.
2015-01-01
Methods for translating gene expression signatures into clinically relevant information have typically relied upon having many samples from patients with similar molecular phenotypes. Here, we address the question of what can be done when it is relatively easy to obtain healthy patient samples, but when abnormalities corresponding to disease states may be rare and one-of-a-kind. The associated computational challenge, anomaly detection, is a well-studied machine-learning problem. However, due to the dimensionality and variability of expression data, existing methods based on feature space analysis or individual anomalously expressed genes are insufficient. We present a novel approach, CSAX, that identifies pathways in an individual sample in which the normal expression relationships are disrupted. To evaluate our approach, we have compiled and released a compendium of public expression data sets, reformulated to create a test bed for anomaly detection. We demonstrate the accuracy of CSAX on the data sets in our compendium, compare it to other leading methods, and show that CSAX aids in both identifying anomalies and explaining their underlying biology. We describe an approach to characterizing the difficulty of specific expression anomaly detection tasks. We then illustrate CSAX's value in two developmental case studies. Confirming prior hypotheses, CSAX highlights disruption of platelet activation pathways in a neonate with retinopathy of prematurity and identifies, for the first time, dysregulated oxidative stress response in second trimester amniotic fluid of fetuses with obese mothers. Our approach provides an important step toward identification of individual disease patterns in the era of precision medicine. PMID:25651392
King, Trude V.V.; Johnson, Michaela R.; Hubbard, Bernard E.; Drenth, Benjamin J.
2011-01-01
During the independent analysis of the geophysical, ASTER, and imaging spectrometer (HyMap) data by USGS scientists, previously unrecognized targets of potential mineralization were identified using evaluation criteria most suitable to the individual dataset. These anomalous zones offer targets of opportunity that warrant additional field verification. This report describes the standards used to define the anomalies, summarizes the results of the evaluations for each type of data, and discusses the importance and implications of regions of anomaly overlap between two or three of the datasets.
Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data
NASA Technical Reports Server (NTRS)
Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney
2012-01-01
This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account fixed effects: flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, the multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and trends in flight actuators. These anomalies were missed by the existing airline monitoring of FOQA data exceedances.
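DFM fits a multi-level regression; as a much-simplified single-level stand-in, one can fit a fleet-wide least-squares performance model and flag flights whose standardized residual is large. The function name and threshold here are invented for illustration:

```python
import numpy as np

def flag_anomalous_flights(X, y, z_thresh=3.0):
    """Fit a fleet-wide least-squares performance model y ~ X (one row per
    flight) and flag flights whose standardized residual exceeds z_thresh.
    A single-level stand-in for DFM's multi-level regression."""
    A = np.column_stack([np.ones(len(X)), X])       # intercept + regressors
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    z = (resid - resid.mean()) / resid.std()
    return np.where(np.abs(z) > z_thresh)[0], beta
```

The fitted coefficients play the role of the aerodynamic/performance parameters the abstract mentions, and the residual outliers play the role of the individual-flight anomalies.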
Congenital aplastic-hypoplastic lumbar pedicle in infants and young children
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yousefzadeh, D.K.; El-Khoury, G.Y.; Lupetin, A.R.
1982-01-01
Nine cases of congenital aplastic-hypoplastic lumbar pedicle (mean age 27 months) are described. Their data are compared to those of 18 other reported cases (mean age 24.7 years) and the following conclusions are made: (1) Almost exclusively, the pedicular defect in infants and young children is due to a developmental anomaly rather than destruction by malignant or infectious processes. (2) This anomaly, we think, is more common than it is believed to be. (3) Unlike adults, infants and young children rarely develop hypertrophy and/or sclerosis of the contralateral pedicle. (4) Detection of a pedicular anomaly is more than satisfying a radiographic curiosity and may lead to discovery of other coexisting anomalies. (5) Ultrasonic screening of patients with congenital pedicular defects may detect the associated genitourinary anomalies, if present, and justify further studies in a selected group of patients.
Faust, Kevin; Xie, Quin; Han, Dominick; Goyle, Kartikay; Volynskaya, Zoya; Djuric, Ugljesa; Diamandis, Phedias
2018-05-16
There is growing interest in utilizing artificial intelligence, and particularly deep learning, for computer vision in histopathology. While accumulating studies highlight expert-level performance of convolutional neural networks (CNNs) on focused classification tasks, most studies rely on probability distribution scores with empirically defined cutoff values based on post-hoc analysis. More generalizable tools that allow humans to visualize histology-based deep learning inferences and decision making are scarce. Here, we leverage t-distributed Stochastic Neighbor Embedding (t-SNE) to reduce dimensionality and depict how CNNs organize histomorphologic information. Unique to our workflow, we develop a quantitative and transparent approach to visualizing classification decisions prior to softmax compression. By discretizing the relationships between classes on the t-SNE plot, we show we can super-impose randomly sampled regions of test images and use their distribution to render statistically-driven classifications. Therefore, in addition to providing intuitive outputs for human review, this visual approach can carry out automated and objective multi-class classifications similar to more traditional and less-transparent categorical probability distribution scores. Importantly, this novel classification approach is driven by a priori statistically defined cutoffs. It therefore serves as a generalizable classification and anomaly detection tool less reliant on post-hoc tuning. Routine incorporation of this convenient approach for quantitative visualization and error reduction in histopathology aims to accelerate early adoption of CNNs into generalized real-world applications where unanticipated and previously untrained classes are often encountered.
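The dimensionality-reduction step of this workflow can be sketched with scikit-learn's t-SNE; the CNN producing the penultimate-layer feature vectors is assumed and not shown, and the parameters below are illustrative rather than the paper's settings:

```python
import numpy as np
from sklearn.manifold import TSNE

def embed_features(features, perplexity=30, seed=0):
    """Project penultimate-layer CNN feature vectors to 2-D with t-SNE so
    class structure can be inspected visually (a sketch of the workflow
    described above; the CNN itself is assumed and not shown here)."""
    return TSNE(n_components=2, perplexity=perplexity,
                random_state=seed, init="pca").fit_transform(features)
```

In the paper's approach, randomly sampled test-image regions are then superimposed on this 2-D map and their distribution over class territories drives a statistically defined classification, rather than a softmax cutoff.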
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perigaud, C.; Dewitte, B.
The Zebiak and Cane model is used in its "uncoupled mode," meaning that the oceanic model component is driven by the Florida State University (FSU) wind stress anomalies over 1980-93 to simulate sea surface temperature anomalies, and these are used in the atmospheric model component to generate wind anomalies. Simulations are compared with data derived from FSU winds, International Satellite Cloud Climatology Project cloud convection, Advanced Very High Resolution Radiometer SST, Geosat sea level, 20°C isotherm depth derived from expendable bathythermographs, and current velocities estimated from drifters or current-meter moorings. Forced by the simulated SST, the atmospheric model is fairly successful in reproducing the observed westerlies during El Nino events. The model fails to simulate the easterlies during La Nina 1988. The simulated forcing of the atmosphere is in very poor agreement with the heating derived from cloud convection data. Similarly, the model is fairly successful in reproducing the warm anomalies during El Nino events. However, it fails to simulate the observed cold anomalies. Simulated variations of thermocline depth agree reasonably well with observations. The model simulates zonal current anomalies that reverse at a dominant 9-month frequency. Projecting altimetric observations on Kelvin and Rossby waves provides an estimate of zonal current anomalies which is consistent with the ones derived from drifters or from current-meter moorings. Unlike the simulated ones, the observed zonal current anomalies reverse from eastward during El Nino events to westward during La Nina events. The simulated 9-month oscillations correspond to a resonant mode of the basin. They can be suppressed by cancelling the wave reflection at the boundaries, or attenuated by increasing the friction in the ocean model. 58 refs., 14 figs., 6 tabs.
A Healthcare Utilization Analysis Framework for Hot Spotting and Contextual Anomaly Detection
Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram
2012-01-01
Patient medical records today contain vast amount of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify in a timely manner heavy users of the systems and their patterns of utilization so that targeted intervention programs can be instituted, and anomaly detection: the ability to identify anomalous utilization cases where the patients incurred levels of utilization that are unexpected given their clinical characteristics which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map patient’s clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. 
In future work we plan to incorporate additional sources of observational data including EMRs and disease registries, and develop analytics models to leverage temporal relationships among medical encounters to provide more in-depth insights. PMID:23304306
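The framework's two components might be sketched as follows, with invented feature layouts and thresholds (the paper does not specify its exact vector space model or regression family here):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

def utilization_groups(profiles, k=3):
    """Cluster vector-space utilization profiles (rows = patients, columns =
    encounter types) to isolate utilization groups, incl. high utilizers."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(profiles)
    return km.labels_, km.cluster_centers_

def contextual_anomalies(clinical, utilization, z_thresh=2.5):
    """Regress utilization level on clinical characteristics and flag
    patients whose actual utilization deviates strongly from the level
    expected for their clinical profile."""
    model = LinearRegression().fit(clinical, utilization)
    resid = utilization - model.predict(clinical)
    z = (resid - resid.mean()) / resid.std()
    return np.where(np.abs(z) > z_thresh)[0]
```

The first function supports hot spotting (which cluster a patient falls into and how heavy its centroid is); the second quantifies the expected-versus-actual gap that defines a contextual anomaly.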
Tectonic Interpretation of CHAMP Geopotential Data over the Northern Adriatic Sea.
NASA Astrophysics Data System (ADS)
Taylor, P. T.; Kim, H. R.; Mayer-Gürr, T.
2006-05-01
Recent aeromagnetic anomaly compilations (Chiappini et al., 2000 and Tontini et al., 2004) show a large positive (>700 nT) northwest-southeast trending magnetic anomaly off the Dalmatian coast. Unfortunately these aeromagnetic data cover only a part of this anomaly. We wanted to investigate whether this large magnetic anomaly could be detected at satellite altitude, and what the extent and source of this feature are. Therefore, magnetic and gravity anomaly maps were made from the CHAMP geopotential data, measured at the current low altitude of 345-350 km over the northern Adriatic Sea. We made the magnetic anomaly map over this relatively small region using 36 descending and 85 ascending orbits screened to be at the lowest altitude and magnetically quietest. We removed the main field component (i.e., IGRF-10 up to degree and order 13) and then demeaned individual tracks and subtracted a second order polynomial to remove regional and/or un-modeled external field features. The resulting map from these well-correlated anomalies revealed a positive magnetic anomaly (>2 nT). Reduction to the pole brought these CHAMP anomaly features into coincidence with the aeromagnetic data. Previously Cantini et al. (1999) compared the surface magnetic data with MAGSAT by continuing the former upward and the latter downward to 100 km and found a good correlation for wavelengths of 300-500 km. We also investigated the CHAMP gravity data. They were reduced using the kinematic short-arc integration method (Ilk et al., 2005 and Mayer-Gürr et al., 2005). However, no corresponding short-wavelength gravity anomaly was observed in our study area. This tectonically complex region is under horizontal stress, and the source of the large magnetic anomaly can be modelled by an associated ophiolite melange.
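The per-track reduction described above (demean each track, then subtract a second-order polynomial along track) is straightforward to sketch with NumPy; main-field (IGRF) removal is assumed to have been done upstream, and the sample spacing is treated as uniform for illustration:

```python
import numpy as np

def detrend_track(values, order=2):
    """Per-track reduction as described: remove the track mean, then subtract
    a fitted polynomial (order 2) to suppress regional and un-modelled
    external-field features.  IGRF main-field removal is assumed upstream."""
    x = np.arange(len(values), dtype=float)
    demeaned = values - values.mean()
    coeffs = np.polyfit(x, demeaned, order)
    return demeaned - np.polyval(coeffs, x)
```

A low-order polynomial absorbs the long-wavelength external-field trend while leaving a short-wavelength crustal anomaly (a few nT over a fraction of the track) essentially intact.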
NASA Technical Reports Server (NTRS)
Zeng, Fanwei; Collatz, George James; Pinzon, Jorge E.; Ivanoff, Alvaro
2013-01-01
Satellite observations of surface reflected solar radiation contain information about variability in the absorption of solar radiation by vegetation. Understanding the causes of variability is important for models that use these data to drive land surface fluxes or for benchmarking prognostic vegetation models. Here we evaluated the interannual variability in the new 30.5-year long global satellite-derived surface reflectance index data, the Global Inventory Modeling and Mapping Studies normalized difference vegetation index (GIMMS NDVI3g). Pearson's correlation and multiple linear stepwise regression analyses were applied to quantify the NDVI interannual variability driven by climate anomalies, and to evaluate the effects of potential interference (snow, aerosols and clouds) on the NDVI signal. We found ecologically plausible strong controls on NDVI variability by antecedent precipitation and current monthly temperature with distinct spatial patterns. Precipitation correlations were strongest for temperate to tropical water-limited herbaceous systems, where in some regions and seasons 40% of the NDVI variance could be explained by precipitation anomalies. Temperature correlations were strongest in northern mid- to high-latitudes in the spring and early summer, where up to 70% of the NDVI variance was explained by temperature anomalies. We find that, in western and central North America, winter-spring precipitation determines early summer growth, while more recent precipitation controls NDVI variability in late summer. In contrast, current or prior wet-season precipitation anomalies were correlated with all months of NDVI in sub-tropical herbaceous vegetation. Snow, aerosols and clouds, as well as unexplained phenomena, still account for part of the NDVI variance despite corrections. Nevertheless, this study demonstrates that GIMMS NDVI3g represents real responses of vegetation to climate variability that are useful for global models.
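The antecedent-precipitation analysis amounts to correlating an NDVI anomaly series with climate anomalies at a range of lags and reading r-squared as variance explained. A minimal sketch (the function name and lag range are illustrative assumptions):

```python
import numpy as np

def best_lag_correlation(ndvi_anom, precip_anom, max_lag=6):
    """Correlate an NDVI anomaly series with antecedent precipitation
    anomalies at lags 0..max_lag (months) and report the lag with the
    strongest Pearson r; r**2 is the variance explained."""
    best = (0, 0.0)
    for lag in range(max_lag + 1):
        if lag == 0:
            r = np.corrcoef(ndvi_anom, precip_anom)[0, 1]
        else:
            # ndvi at month t vs precipitation at month t - lag
            r = np.corrcoef(ndvi_anom[lag:], precip_anom[:-lag])[0, 1]
        if abs(r) > abs(best[1]):
            best = (lag, r)
    return best  # (lag, r)
```

Running this per pixel and season gives maps like those the study reports, with the strongest antecedent-precipitation control in water-limited herbaceous regions.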
Anomaly detection for machine learning redshifts applied to SDSS galaxies
NASA Astrophysics Data System (ADS)
Hoyle, Ben; Rau, Markus Michael; Paech, Kerstin; Bonnett, Christopher; Seitz, Stella; Weller, Jochen
2015-10-01
We present an analysis of anomaly detection for machine learning redshift estimation. Anomaly detection allows the removal of poor training examples, which can adversely influence redshift estimates. Anomalous training examples may be photometric galaxies with incorrect spectroscopic redshifts, or galaxies with one or more poorly measured photometric quantities. We select 2.5 million `clean' SDSS DR12 galaxies with reliable spectroscopic redshifts, and 6730 `anomalous' galaxies with spectroscopic redshift measurements which are flagged as unreliable. We contaminate the clean base galaxy sample with galaxies with unreliable redshifts and attempt to recover the contaminating galaxies using the Elliptical Envelope technique. We then train four machine learning architectures for redshift analysis on both the contaminated sample and on the preprocessed `anomaly-removed' sample and measure redshift statistics on a clean validation sample generated without any preprocessing. We find an improvement on all measured statistics of up to 80 per cent when training on the anomaly-removed sample as compared with training on the contaminated sample for each of the machine learning routines explored. We further describe a method to estimate the contamination fraction of a base data sample.
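The anomaly-removal preprocessing maps directly onto scikit-learn's EllipticEnvelope (a robust Gaussian fit that flags points with large Mahalanobis distance); the contamination fraction below is an illustrative assumption, not the paper's value:

```python
import numpy as np
from sklearn.covariance import EllipticEnvelope

def remove_anomalies(X, contamination=0.05, seed=0):
    """Flag likely-anomalous training examples with the Elliptical Envelope
    and return the cleaned sample plus the keep mask, mirroring the
    preprocessing step described above."""
    env = EllipticEnvelope(contamination=contamination, random_state=seed)
    keep = env.fit_predict(X) == 1          # +1 = inlier, -1 = outlier
    return X[keep], keep
```

The redshift estimators are then trained on the returned cleaned sample; in the paper this preprocessing improves every measured redshift statistic.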
Discovering System Health Anomalies Using Data Mining Techniques
NASA Technical Reports Server (NTRS)
Srivastava, Ashok N.
2005-01-01
We present a data mining framework for the analysis and discovery of anomalies in high-dimensional time series of sensor measurements that would be found in an Integrated System Health Monitoring system. We specifically treat the problem of discovering anomalous features in the time series that may be indicative of a system anomaly, or, in the case of a manned system, an anomaly due to the human. Identification of these anomalies is crucial to building stable, reusable, and cost-efficient systems. The framework consists of an analysis platform and new algorithms that can scale to thousands of sensor streams to discover temporal anomalies. We discuss the mathematical framework that underlies the system and also describe in detail how this framework is general enough to encompass both discrete and continuous sensor measurements. We also describe a new set of data mining algorithms based on kernel methods and hidden Markov models that allow for the rapid assimilation, analysis, and discovery of system anomalies. We then describe the performance of the system on a real-world problem in the aircraft domain where we analyze the cockpit data from aircraft as well as data from the aircraft propulsion, control, and guidance systems. These data are discrete and continuous sensor measurements and are dealt with seamlessly in order to discover anomalous flights. We conclude with recommendations that describe the tradeoffs in building an integrated scalable platform for robust anomaly detection in ISHM applications.
NASA Astrophysics Data System (ADS)
Riccardi, U.; Arnoso, J.; Benavent, M.; Vélez, E.; Tammaro, U.; Montesinos, F. G.
2018-05-01
We report on detailed continuous geodetic monitoring in the Timanfaya volcanic area (TVA), where the most intense geothermal anomalies of Lanzarote Island are located. We analyze about three years of GNSS data collected on a small network of five permanent stations deployed on the island, one of which is at TVA, and nearly 20 years of tiltmeter and strainmeter records acquired at the Los Camelleros site, settled in the facilities of the Geodynamics Laboratory of Lanzarote within TVA. This study is intended to contribute to understanding the active tectonics on Lanzarote Island and its origin, mainly in TVA. After characterizing and filtering out the seasonal periodicities related to "non-tectonic" sources from the geodetic records, a tentative ground deformation field is reconstructed through the analysis of the tilt and strain records and of the time evolution of the baselines ranging the GNSS stations. The joint interpretation of the collected geodetic data shows that the area of the strongest geothermal anomaly in TVA is currently undergoing a SE-trending relative displacement at a rate of about 3 mm/year. This area also experiences significant subsidence with a maximum rate of about 6 mm/year. Moreover, we examine the possible relation between the observed deformations and atmospheric effects by modelling the response functions of temperature and rain recorded in the laboratory. Finally, from the retrieval of the deformation patterns and the joint analysis of geodetic and environmental observations, we propose a qualitative model of the interplay between the hydrological systems and the geothermal anomalies. Namely, we explain the detected time correlation between rainfall and ground deformation by the enhancement of thermal transfer from the underground heat source driven by the infiltration of meteoric water.
Effectiveness of Natural Field Induced Polarization for Detecting Polymetallic Deposits
NASA Astrophysics Data System (ADS)
YANG, Jin; LIU, Zhaoping; WANG, Long
To validate the effectiveness of Natural Field Induced Polarization (NFIP), a polymetallic deposit was chosen as the test site, where Induced Polarization (IP) surveying with a gradient array and Magnetotelluric (MT) sounding were conducted simultaneously. Analysis and comparison of the data indicated that the Relative Percent Frequency Effect (RPFE) anomaly from the MT data and the IP anomaly coincided well in both extent and magnitude. The results show that NFIP is effective in the exploration of polymetallic deposits under certain conditions.
Structural Anomaly Detection Using Fiber Optic Sensors and Inverse Finite Element Method
NASA Technical Reports Server (NTRS)
Quach, Cuong C.; Vazquez, Sixto L.; Tessler, Alex; Moore, Jason P.; Cooper, Eric G.; Spangler, Jan. L.
2005-01-01
NASA Langley Research Center is investigating a variety of techniques for mitigating aircraft accidents due to structural component failure. One technique under consideration combines distributed fiber optic strain sensing with an inverse finite element method for detecting and characterizing structural anomalies that may provide early indication of airframe structure degradation. The technique identifies structural anomalies that result in observable changes in localized strain but do not impact the overall surface shape. Surface shape information is provided by an Inverse Finite Element Method that computes full-field displacements and internal loads using strain data from in-situ fiber optic sensors. This paper describes a prototype of such a system and reports results from a series of laboratory tests conducted on a test coupon subjected to increasing levels of damage.
Rule-based expert system for maritime anomaly detection
NASA Astrophysics Data System (ADS)
Roy, Jean
2010-04-01
Maritime domain operators/analysts have a mandate to be aware of all that is happening within their areas of responsibility. This mandate derives from the needs to defend sovereignty, protect infrastructures, counter terrorism, detect illegal activities, etc., and it has become more challenging in the past decade, as commercial shipping turned into a potential threat. In particular, a huge portion of the data and information made available to the operators/analysts is mundane, from maritime platforms going about normal, legitimate activities, and it is very challenging for them to detect and identify the non-mundane. To achieve such anomaly detection, they must establish numerous relevant situational facts from a variety of sensor data streams. Unfortunately, many of the facts of interest just cannot be observed; the operators/analysts thus use their knowledge of the maritime domain and their reasoning faculties to infer these facts. As they are often overwhelmed by the large amount of data and information, automated reasoning tools could be used to support them by inferring the necessary facts, ultimately providing indications and warning on a small number of anomalous events worthy of their attention. Along this line of thought, this paper describes a proof-of-concept prototype of a rule-based expert system implementing automated rule-based reasoning in support of maritime anomaly detection.
NASA Technical Reports Server (NTRS)
Das, Santanu; Srivastava, Ashok N.; Matthews, Bryan L.; Oza, Nikunj C.
2010-01-01
The world-wide aviation system is one of the most complex dynamical systems ever developed and is generating data at an extremely rapid rate. Most modern commercial aircraft record several hundred flight parameters including information from the guidance, navigation, and control systems, the avionics and propulsion systems, and the pilot inputs into the aircraft. These parameters may be continuous measurements or binary or categorical measurements recorded in one-second intervals for the duration of the flight. Currently, most approaches to aviation safety are reactive, meaning that they are designed to react to an aviation safety incident or accident. In this paper, we discuss a novel approach based on the theory of multiple kernel learning to detect potential safety anomalies in very large databases of discrete and continuous data from world-wide operations of commercial fleets. We pose a general anomaly detection problem which includes both discrete and continuous data streams, where we assume that the discrete streams have a causal influence on the continuous streams. We also assume that atypical sequences of events in the discrete streams can lead to off-nominal system performance. We discuss the application domain, novel algorithms, and results on real-world data sets. Our algorithm uncovers operationally significant events in high dimensional data streams in the aviation industry which are not detectable using state-of-the-art methods.
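The abstract does not give algorithmic details, but the core idea, combining a kernel over discrete event vectors with a kernel over continuous sensor values inside a one-class detector, can be sketched. Everything below is an illustrative assumption (synthetic data, kernel choices, and a fixed combination weight `eta`; true multiple kernel learning would learn the weights from data):

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)

# Continuous stream: two sensor channels per flight (synthetic).
cont_train = rng.normal(0.0, 1.0, size=(100, 2))
# Discrete stream: binary switch-event indicators per flight (synthetic).
disc_train = rng.integers(0, 2, size=(100, 5)).astype(float)

def rbf(A, B, gamma=0.5):
    """Gaussian kernel for the continuous measurements."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def match(A, B):
    """Toy kernel for binary event vectors: fraction of matching bits."""
    return (A[:, None, :] == B[None, :, :]).mean(-1)

def combined(Ac, Bc, Ad, Bd, eta=0.5):
    # Fixed convex combination of the two kernels (MKL would learn eta).
    return eta * rbf(Ac, Bc) + (1 - eta) * match(Ad, Bd)

K = combined(cont_train, cont_train, disc_train, disc_train)
ocsvm = OneClassSVM(kernel="precomputed", nu=0.1).fit(K)

# An off-nominal flight: extreme continuous values, flipped event pattern.
cont_test = np.array([[8.0, 8.0]])
disc_test = 1.0 - disc_train[:1]
K_out = combined(cont_test, cont_train, disc_test, disc_train)
d_out = ocsvm.decision_function(K_out)[0]       # low score = anomalous
d_in = ocsvm.decision_function(K[:1])[0]        # score of a training flight
```

The off-nominal flight scores lower than a nominal one under the combined kernel, which is the property a one-class detector exploits to rank candidate anomalies.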
FAST TRACK COMMUNICATION: Quantum anomalies and linear response theory
NASA Astrophysics Data System (ADS)
Sela, Itamar; Aisenberg, James; Kottos, Tsampikos; Cohen, Doron
2010-08-01
The analysis of diffusive energy spreading in quantized chaotic driven systems leads to a universal paradigm for the emergence of a quantum anomaly. In the classical approximation, a driven chaotic system exhibits stochastic-like diffusion in energy space with a coefficient D that is proportional to the intensity ε² of the driving. In the corresponding quantized problem, the coherent transitions are characterized by a generalized Wigner time t_ε, and a self-generated (intrinsic) dephasing process leads to a nonlinear dependence of D on ε².
IR Thermography of International Space Station Radiator Panels
NASA Technical Reports Server (NTRS)
Koshti, Ajay; Winfree, WIlliam; Morton, Richard; Howell, Patricia
2010-01-01
Several non-flight qualification test radiators were inspected using flash thermography. Data analysis used raw and second-derivative images (using EchoTherm and Mosaic) to detect anomalies. Simple contrast evolutions were plotted for the detected anomalies to help in anomaly characterization. Many out-of-family indications were noted. Some were classified as cold spot indications, due to additional adhesive or an adhesive layer behind the facesheet; others were classified as hot spot indications, due to a void, unbond, or lack of adhesive behind the facesheet. The IR inspection helped in assessing the expected manufacturing quality of the radiators.
Convectively Driven Tropopause-Level Cooling and Its Influences on Stratospheric Moisture
NASA Astrophysics Data System (ADS)
Kim, Joowan; Randel, William J.; Birner, Thomas
2018-01-01
Characteristics of the tropopause-level cooling associated with tropical deep convection are examined using CloudSat radar and Constellation Observing System for Meteorology, Ionosphere and Climate (COSMIC) GPS radio occultation measurements. Extreme deep convection is sampled based on the cloud top height (>17 km) from CloudSat, and colocated temperature profiles from COSMIC are composited around the deep convection. The response of moisture to the tropopause-level cooling is also examined in the upper troposphere and lower stratosphere using microwave limb sounder measurements. The composite temperature shows an anomalous warming in the troposphere and a significant cooling near the tropopause (at 16-19 km) when deep convection occurs over the western Pacific, particularly during periods with active Madden-Julian Oscillation (MJO). The composite tropopause cooling has a large horizontal scale (~6,000 km in longitude) with a minimum temperature anomaly of -2 K, and it lasts more than 2 weeks, supported by mesoscale convective clusters embedded within the envelope of the MJO. The water vapor anomalies show strong correlation with the temperature anomalies (i.e., a dry anomaly within the cold anomaly), showing that the convectively driven tropopause cooling actively dehydrates the lower stratosphere in the western Pacific region. The moisture is also affected by the anomalous Matsuno-Gill-type circulation associated with the cold anomaly, in which dry air spreads over a wide range in the tropical tropopause layer (TTL). These results suggest that convectively driven tropopause cooling and the associated transient circulation play an important role in the large-scale dehydration process in the TTL.
Roberts, T; Mugford, M; Piercy, J
1998-09-01
To compare the cost effectiveness of different programmes of routine antenatal ultrasound screening to detect four key fetal anomalies: serious cardiac anomalies, spina bifida, Down's syndrome and lethal anomalies, using existing evidence. Decision analysis was used, based on the best data currently available, including expert opinion from the Royal College of Obstetricians and Gynaecologists Working Party and secondary data from the literature, to predict the likely outcomes in terms of malformations detected by each screening programme. Results are applicable in clinics, hospitals or GP practices delivering antenatal screening. The outcome measure was the number of cases with a 'target' malformation correctly detected antenatally. There was substantial overlap between the cost ranges of each screening programme, demonstrating considerable uncertainty about the relative economic efficiency of alternative programmes for ultrasound screening. The cheapest, but not the most effective, screening programme consisted of one second-trimester ultrasound scan. The cost per target anomaly detected (cost effectiveness) for this programme was in the range £5,000-£109,000, but in any 1000 women it will also fail to detect between 3.6 and 4.7 target anomalies. The range of uncertainty in the costs did not allow selection of any one programme as a clear choice for NHS purchasers. The results suggested that the overall allocation of resources for routine ultrasound screening in the UK is not currently economically efficient, but that certain scenarios for ultrasound screening are potentially within the range of cost effectiveness reached by other, possibly competing, screening programmes. The model highlighted the weakness of available evidence and demonstrated the need for more information both about current practice and costs.
Topographically driven groundwater flow and the San Andreas heat flow paradox revisited
Saffer, D.M.; Bekins, B.A.; Hickman, S.
2003-01-01
Evidence for a weak San Andreas Fault includes (1) borehole heat flow measurements that show no evidence for a frictionally generated heat flow anomaly and (2) the inferred orientation of σ1 nearly perpendicular to the fault trace. Interpretations of the stress orientation data remain controversial, at least in close proximity to the fault, leading some researchers to hypothesize that the San Andreas Fault is, in fact, strong and that its thermal signature may be removed or redistributed by topographically driven groundwater flow in areas of rugged topography, such as typify the San Andreas Fault system. To evaluate this scenario, we use a steady state, two-dimensional model of coupled heat and fluid flow within cross sections oriented perpendicular to the fault and to the primary regional topography. Our results show that existing heat flow data near Parkfield, California, do not readily discriminate between the expected thermal signature of a strong fault and that of a weak fault. In contrast, for a wide range of groundwater flow scenarios in the Mojave Desert, models that include frictional heat generation along a strong fault are inconsistent with existing heat flow data, suggesting that the San Andreas Fault at this location is indeed weak. In both areas, comparison of modeling results and heat flow data suggests that advective redistribution of heat is minimal. The robust results for the Mojave region demonstrate that topographically driven groundwater flow, at least in two dimensions, is inadequate to obscure the frictionally generated heat flow anomaly from a strong fault. However, our results do not preclude the possibility of transient advective heat transport associated with earthquakes.
Tropical Pacific climate during the Medieval Climate Anomaly: progress and pitfalls
NASA Astrophysics Data System (ADS)
Cobb, K. M.; Westphal, N.; Charles, C.; Sayani, H. R.; Edwards, R. L.; Cheng, H.; Grothe, P. R.; Chen, T.; Hitt, N. T.; O'Connor, G.; Atwood, A. R.
2016-12-01
A vast trove of paleoclimate records indicates that the Medieval Climate Anomaly (MCA; 900-1200 AD) was characterized by relative warmth throughout the Northern Hemisphere and significant hydroclimate anomalies - particularly well-resolved over North America - that posed a challenge to human populations. The global-scale nature of the climate anomalies has driven speculation that the tropical Pacific, with its rich spectrum of natural variability and far-reaching impact, may have undergone a prolonged reorganization during the MCA. While some key records from across the tropical Pacific document significant changes in temperature and/or hydrology, a dynamically consistent picture of the MCA tropical Pacific climate state has proven elusive. In particular, there are few if any robust paleoclimate constraints from the central Pacific, where even modest changes in ocean temperature translate into distinct patterns of global atmospheric teleconnections. Here, we present a new collection of fossil coral multi-proxy records from Christmas Island (2N, 157W) that provide robust constraints on both temperature and hydrological changes during the MCA. We employ modern coral data, instrumental climate data, and climate model output in developing a framework for quantifying the uncertainties associated with the new fossil coral data. In doing so, we illustrate the clear benefits of modern environmental monitoring campaigns that inform the generation of paleoclimate pseudo-proxies.
A groundwater convection model for Rio Grande rift geothermal resources
NASA Technical Reports Server (NTRS)
Morgan, P.; Harder, V.; Daggett, P. H.; Swanberg, C. A.
1981-01-01
It has been proposed that forced convection, driven by normal groundwater flow through the interconnected basins of the Rio Grande rift, is the primary source mechanism for the numerous geothermal anomalies along the rift. A test of this concept using an analytical model indicates that significant forced convection must occur in the basins even if permeabilities are as low as 50-200 millidarcies at a depth of 2 km. Where groundwater flow is constricted at the discharge areas of the basins, forced convection can locally increase the gradient to a level where free convection also occurs, generating surface heat flow anomalies 5-15 times background. A compilation of groundwater data for the rift basins shows a strong correlation between constrictions in groundwater flow and hot springs and geothermal anomalies, giving strong circumstantial support to the convection model.
A model for anomaly classification in intrusion detection systems
NASA Astrophysics Data System (ADS)
Ferreira, V. O.; Galhardi, V. V.; Gonçalves, L. B. L.; Silva, R. C.; Cansian, A. M.
2015-09-01
Intrusion Detection Systems (IDS) are traditionally divided into two types according to the detection methods they employ, namely (i) misuse detection and (ii) anomaly detection. Anomaly detection has been widely used and its main advantage is the ability to detect new attacks. However, the analysis of the anomalies generated can become expensive, since they often carry no clear information about the malicious events they represent. In this context, this paper presents a model for automated classification of alerts generated by an anomaly-based IDS. The main goal is either to classify detected anomalies into well-defined attack taxonomies or to identify whether an alert is a false positive misclassified by the IDS. Some common attacks on computer networks were considered, and we achieved important results that can equip security analysts with better resources for their analyses.
An immunity-based anomaly detection system with sensor agents.
Okamoto, Takeshi; Ishida, Yoshiteru
2009-01-01
This paper proposes an immunity-based anomaly detection system with sensor agents based on the specificity and diversity of the immune system. Each agent is specialized to react to the behavior of a specific user. Multiple diverse agents decide whether the behavior is normal or abnormal. Conventional systems have used only a single sensor to detect anomalies, while the immunity-based system makes use of multiple sensors, which leads to improvements in detection accuracy. In addition, we propose an evaluation framework for the anomaly detection system, which is capable of evaluating the differences in detection accuracy between internal and external anomalies. This paper focuses on anomaly detection in users' command sequences on UNIX-like systems. In experiments, the immunity-based system outperformed some of the best conventional systems.
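The abstract leaves the agent internals unspecified; one minimal way to realize "multiple diverse agents voting on a user's command sequences" is to give each agent a different n-gram length over the user's normal command history and take a majority vote. This sketch is an illustrative assumption, not the authors' algorithm:

```python
def ngrams(seq, n):
    """All length-n windows of a command sequence, as a set."""
    return {tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)}

class Agent:
    """A sensor agent specialized to one user's normal behavior; the
    n-gram length varies across agents, providing population diversity."""
    def __init__(self, n, history):
        self.n = n
        self.known = ngrams(history, n)

    def votes_abnormal(self, seq, threshold=0.5):
        grams = ngrams(seq, self.n)
        unseen = len(grams - self.known)
        return unseen / max(len(grams), 1) > threshold

def detect(agents, seq):
    """Majority vote over the diverse agent population."""
    votes = sum(a.votes_abnormal(seq) for a in agents)
    return votes * 2 > len(agents)

# Toy normal history for one user, and two candidate sequences.
history = ["ls", "cd", "ls", "vi", "make", "ls", "cd", "make", "vi", "ls"] * 5
agents = [Agent(n, history) for n in (2, 3, 4)]
flag_normal = detect(agents, ["ls", "cd", "ls", "vi", "make"])
flag_attack = detect(agents, ["nc", "wget", "chmod", "sh", "rm"])
```

A familiar editing session raises no votes, while a sequence of commands the user has never issued is flagged by every agent; diversifying the agents is what lets borderline sequences be decided by consensus rather than a single sensor.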
NASA Astrophysics Data System (ADS)
Li, H.; Kusky, T. M.; Peng, S.; Zhu, M.
2012-12-01
Thermal infrared (TIR) remote sensing is an important technique in the exploration of geothermal resources. In this study, a geothermal survey is conducted in the Tengchong area of Yunnan province in China using multi-temporal MODIS LST (Land Surface Temperature) data. The monthly night MODIS LST data from Mar. 2000 to Mar. 2011 for the study area were collected and analyzed. The 132-month average LST map was derived and three geothermal anomalies were identified. The findings of this study agree well with the results from relative geothermal gradient measurements. Finally, we conclude that TIR remote sensing is a cost-effective technique to detect geothermal anomalies. Combining TIR remote sensing with geological analysis and an understanding of the geothermal mechanism is an accurate and efficient approach to geothermal area detection.
Anomaly-based intrusion detection for SCADA systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, D.; Usynin, A.; Hines, J. W.
2006-07-01
Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security and there are concerns that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimated that malicious online actions could cause $75 billion in damage in 2007. One of the interesting countermeasures for enhancing information system security is intrusion detection. This paper briefly discusses the history of research in intrusion detection techniques and introduces the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT), applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks.
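The AAKR-plus-SPRT pipeline named above can be sketched in a few lines: AAKR reconstructs each observation as a kernel-weighted average of archived normal data, and the SPRT accumulates a log-likelihood ratio over the reconstruction residuals until an alarm threshold is crossed. The bandwidth, mean-shift, error rates, and synthetic data below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def aakr_predict(memory, query, h=1.0):
    """Auto-associative kernel regression: reconstruct the query as a
    Gaussian-kernel-weighted average of archived normal observations."""
    d2 = np.sum((memory - query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * h ** 2))
    w /= w.sum()
    return w @ memory

def sprt(residuals, m=1.0, sigma=1.0, alpha=0.01, beta=0.1):
    """Sequential probability ratio test on one residual channel.
    H0: residual ~ N(0, sigma^2); H1: residual ~ N(m, sigma^2)."""
    upper = np.log((1 - beta) / alpha)   # cross: decide H1 (fault), reset
    lower = np.log(beta / (1 - alpha))   # cross: decide H0 (normal), reset
    llr, alarms = 0.0, []
    for i, r in enumerate(residuals):
        llr += m * (r - m / 2.0) / sigma ** 2
        if llr >= upper:
            alarms.append(i)
            llr = 0.0
        elif llr <= lower:
            llr = 0.0
    return alarms

rng = np.random.default_rng(0)
memory = rng.normal(0.0, 1.0, size=(200, 3))    # archived normal data
normal = rng.normal(0.0, 1.0, size=(50, 3))     # healthy test period
faulty = normal + np.array([3.0, 0.0, 0.0])     # drift on channel 0
res_normal = [(x - aakr_predict(memory, x))[0] for x in normal]
res_faulty = [(x - aakr_predict(memory, x))[0] for x in faulty]
alarms_normal = sprt(res_normal)
alarms_faulty = sprt(res_faulty)
```

On the drifted channel the residuals stay positive, so the likelihood ratio repeatedly reaches the alarm threshold, while the healthy channel's residuals keep resetting it; this is why the SPRT gives low false-alarm rates at a controllable detection delay.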
Model-Based Anomaly Detection for a Transparent Optical Transmission System
NASA Astrophysics Data System (ADS)
Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.
In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.
SSME HPOTP post-test diagnostic system enhancement project
NASA Technical Reports Server (NTRS)
Bickmore, Timothy W.
1995-01-01
An assessment of engine and component health is routinely made after each test or flight firing of a space shuttle main engine (SSME). Currently, this health assessment is done by teams of engineers who manually review sensor data, performance data, and engine and component operating histories. Based on review of information from these various sources, an evaluation is made as to the health of each component of the SSME and the preparedness of the engine for another test or flight. The objective of this project is to further develop a computer program which automates the analysis of test data from the SSME high-pressure oxidizer turbopump (HPOTP) in order to detect and diagnose anomalies. This program fits into a larger system, the SSME Post-Test Diagnostic System (PTDS), which will eventually be extended to assess the health and status of most SSME components on the basis of test data analysis. The HPOTP module is an expert system, which uses 'rules of thumb' obtained from interviews with experts from NASA Marshall Space Flight Center (MSFC) to detect and diagnose anomalies. Analyses of the raw test data are first performed using pattern recognition techniques, which result in features such as spikes, shifts, peaks, and drifts being detected and written to a database. The HPOTP module then looks for combinations of these features which are indicative of known anomalies, using the rules gathered from the turbomachinery experts. Results of this analysis are then displayed via a graphical user interface which provides ranked lists of anomalies and observations by engine component, along with supporting data plots for each.
An extreme anomaly in stratospheric ozone over Europe in 1940-1942
NASA Astrophysics Data System (ADS)
Brönnimann, S.; Luterbacher, J.; Staehelin, J.; Svendby, T. M.
2004-04-01
Reevaluated historical total ozone data reveal extraordinarily high values over several European sites in 1940-1942, concurrent with extreme climatic anomalies at the Earth's surface. Using historical radiosonde data, reconstructed upper-level fields, and total ozone data from Arosa (Switzerland), Dombås, and Tromsø (Norway), this unusual case of stratosphere-troposphere coupling is analyzed. At Arosa, numerous strong total ozone peaks in all seasons were due to unusually frequent upper troughs over central Europe and related ozone redistribution in the lower stratosphere. At the Norwegian sites, high winter total ozone was most likely caused by major stratospheric warmings in Jan./Feb. 1940, Feb./Mar. 1941, and Feb. 1942. Results demonstrate that the dynamically driven interannual variability of total ozone can be much larger than that estimated based on the past 25-40 years.
Analysis and interpretation of MAGSAT anomalies over north Africa
NASA Technical Reports Server (NTRS)
Phillips, R. J.
1985-01-01
Crustal anomaly detection with MAGSAT data is frustrated by the inherent resolving power of the data and by contamination from external and core fields. The quality of the data might be tested by modeling specific tectonic features which produce anomalies that fall within the proposed resolution and crustal amplitude capabilities of MAGSAT fields. To test this hypothesis, north African hotspots associated with Ahaggar, Tibesti and Darfur were modeled as magnetic induction anomalies. MAGSAT data were reduced by subtracting external and core fields to isolate scalar and vertical component crustal signals. Of the three volcanic areas, only the Ahaggar region had an associated anomaly of magnitude above the error limits of the data. The hotspot hypothesis was tested for Ahaggar by checking whether the predicted magnetic signal matched the MAGSAT anomaly. The predicted model magnetic signal arising from the surface topography of the uplift and the Curie isothermal surface was calculated at MAGSAT altitudes by a Fourier transform technique modified to allow for variable magnetization. The Curie isotherm surface was calculated using a method for the temperature distribution in a moving plate above a fixed hotspot. The magnetic signal was calculated for a fixed plate as well as for a number of plate velocities and directions.
Telling tails explain the discrepancy in sexual partner reports.
Morris, M
1993-09-30
An anomaly often noted in surveys of sexual behaviour is that the number of female sexual partners reported by men exceeds the number of male partners reported by women. This discrepancy is sometimes interpreted as evidence that surveys produce unreliable data due to sex-linked response and sampling bias. We report here that among the 90% of respondents reporting fewer than 20 lifetime partners, however, the ratio of male to female reports drops from 3.2:1 to 1.2:1. The anomaly thus appears to be driven by the upper tail of the contact distribution, an example of the general principle of outlier influence in data analysis. The implication is that sexual behaviour surveys provide reliable data in the main, and that simple improvements can increase precision in the upper tail to make these data more useful for modelling the spread of AIDS and other sexually transmitted diseases.
Big Data Analysis of Manufacturing Processes
NASA Astrophysics Data System (ADS)
Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert
2015-11-01
The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tardiff, Mark F.; Runkle, Robert C.; Anderson, K. K.
2006-01-23
The goal of primary radiation monitoring in support of routine screening and emergency response is to detect characteristics in vehicle radiation signatures that indicate the presence of potential threats. Two conceptual approaches to analyzing gamma-ray spectra for threat detection are isotope identification and anomaly detection. While isotope identification is the time-honored method, an emerging technique is anomaly detection, which uses benign vehicle gamma-ray signatures to define an expectation of the radiation signature for vehicles that do not pose a threat. Newly acquired spectra are then compared to this expectation using statistical criteria that reflect acceptable false alarm rates and probabilities of detection. The gamma-ray spectra analyzed here were collected at a U.S. land Port of Entry (POE) using a NaI-based radiation portal monitor (RPM). The raw data were analyzed to develop a benign vehicle expectation by decimating the original pulse-height channels to 35 energy bins, extracting composite variables via principal components analysis (PCA), and estimating statistically weighted distances from the mean vehicle spectrum with the Mahalanobis distance (MD) metric. This paper reviews the methods used to establish the anomaly identification criteria and presents a systematic analysis of the response of the combined PCA and MD algorithm to modeled mono-energetic gamma-ray sources.
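The combined PCA-and-MD screening step can be sketched as follows: fit principal components on benign binned spectra, project new spectra into score space, and flag those whose Mahalanobis distance from the mean is large. The simulated Poisson spectra, component count, and injected peak below are illustrative assumptions (only the 35-bin decimation is taken from the abstract):

```python
import numpy as np

def fit_pca_md(benign, n_components=5):
    """Fit PCA on benign spectra (rows = vehicles, cols = energy bins),
    then estimate the score-space mean and covariance for the MD metric."""
    mu = benign.mean(axis=0)
    X = benign - mu
    _, _, Vt = np.linalg.svd(X, full_matrices=False)  # principal directions
    P = Vt[:n_components].T
    scores = X @ P
    center = scores.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(scores, rowvar=False))
    return mu, P, center, cov_inv

def mahalanobis(spectrum, mu, P, center, cov_inv):
    """Statistically weighted distance of one spectrum from the mean."""
    s = (spectrum - mu) @ P - center
    return float(np.sqrt(s @ cov_inv @ s))

rng = np.random.default_rng(1)
benign = rng.poisson(100.0, size=(500, 35)).astype(float)  # 35 energy bins
model = fit_pca_md(benign)

typical = rng.poisson(100.0, size=35).astype(float)
source = typical.copy()
source[10:14] += 400.0                 # injected mono-energetic-like peak
d_typical = mahalanobis(typical, *model)
d_source = mahalanobis(source, *model)
```

A threshold on the MD (set from the benign population to meet the acceptable false-alarm rate) then separates spectra like `source` from ordinary traffic.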
Statistical Traffic Anomaly Detection in Time-Varying Communication Networks
2015-02-01
The proposed statistical anomaly detection methods are shown to perform better than their vanilla counterparts, which assume that normal traffic is stationary; the framework is useful not only for anomaly detection but also for understanding normal traffic in time-varying networks.
Intercomparison and Uncertainty Assessment of Nine Evapotranspiration Estimates Over South America
NASA Astrophysics Data System (ADS)
Sörensson, Anna A.; Ruscica, Romina C.
2018-04-01
This study examines the uncertainties and the representations of anomalies of a set of evapotranspiration products over climatologically distinct regions of South America. The products, coming from land surface models, reanalysis, and remote sensing, are chosen from sources that are readily available to the community of users. The results show that the spatial patterns of maximum uncertainty differ among metrics, with dry regions showing maximum relative uncertainties of annual mean evapotranspiration, while energy-limited regions present maximum uncertainties in the representation of the annual cycle and monsoon regions in the representation of anomalous conditions. Furthermore, it is found that land surface models driven by observed atmospheric fields detect meteorological and agricultural droughts in dry regions unequivocally. The remote sensing products employed do not distinguish all agricultural droughts and this could be attributed to the forcing net radiation. The study also highlights important characteristics of individual data sets and recommends users to include assessments of sensitivity to evapotranspiration data sets in their studies, depending on region and nature of study to be conducted.
A Survey on Anomaly Based Host Intrusion Detection System
NASA Astrophysics Data System (ADS)
Jose, Shijoe; Malathi, D.; Reddy, Bharath; Jayaseeli, Dorathi
2018-04-01
An intrusion detection system (IDS) is hardware, software, or a combination of the two, for monitoring network or system activities to detect malicious signs. In computer security, designing a robust intrusion detection system is one of the most fundamental and important problems. The primary function of the system is to detect intrusions and issue alerts in a timely manner; when the IDS detects an intrusion, it sends an alert message to the system administrator. Anomaly detection is an important problem that has been researched within diverse research areas and application domains. This survey tries to provide a structured and comprehensive overview of the research on anomaly detection. Each of the existing anomaly detection techniques has relative strengths and weaknesses. The current state of experimental practice in the field of anomaly-based intrusion detection is reviewed and recent studies are surveyed. This survey provides a study of existing anomaly detection techniques and of how the techniques used in one area can be applied in another application domain.
Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation
NASA Technical Reports Server (NTRS)
Koshti, Ajay
2014-01-01
The paper deals with infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast-versus-time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper provides an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique, and the IR Contrast feature imaging application, which are based on the models provided in this paper.
Papagiannopoulos, Dimitri; Gong, Edward
2017-03-01
This review article explores sports and recreational precautions in children with solitary kidneys. In 2001, the American Academy of Pediatrics published recommendations for activity in children with medical conditions. Those with solitary kidneys were graded a "qualified yes": no restriction in noncontact sports, and individual assessment for limited-contact, contact, and collision sports. Recent trauma data suggest that classification according to the degree of contact is inaccurate. We propose an updated, data-driven classification of sports or recreation according to the risk of high-grade renal trauma or loss of renal unit. Given the paucity of literature on the topic and lack of consensus, children with congenital renal anomalies should exercise caution in both sports and recreation. Copyright © 2016 Elsevier Inc. All rights reserved.
A lightweight network anomaly detection technique
Kim, Jinoh; Yoo, Wucherl; Sim, Alex; ...
2017-03-13
While network anomaly detection is essential in network operations and management, it becomes increasingly challenging to perform the first line of detection against the exponentially increasing volume of network traffic. In this paper, we develop a technique for the first line of online anomaly detection with two important considerations: (i) availability of traffic attributes during the monitoring time, and (ii) computational scalability for streaming data. The presented learning technique is lightweight and highly scalable, owing to an approximation based on grid partitioning of the given dimensional space. With the public traffic traces of KDD Cup 1999 and NSL-KDD, we show that our technique yields 98.5% and 83% detection accuracy, respectively, with only a couple of readily available traffic attributes that can be obtained without the help of post-processing. Finally, the results are at least comparable with those of classical learning methods including decision tree and random forest, with approximately two orders of magnitude faster learning performance.
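The grid-partitioning idea behind such a lightweight detector can be sketched in a few lines: bin training points into fixed-size cells of the attribute space, then flag test points that fall into cells rarely seen during training. This is a minimal illustration, assuming a made-up cell size, occupancy threshold and toy attribute values, not the paper's actual parameters.

```python
# Sketch of grid-partition anomaly detection over two traffic attributes.
# Cell size and min_count are illustrative assumptions, not the paper's values.

def cell(point, size=0.1):
    """Map a point to its grid cell by flooring each coordinate."""
    return tuple(int(v // size) for v in point)

def train(points, size=0.1):
    """Count how many training points land in each cell."""
    counts = {}
    for p in points:
        c = cell(p, size)
        counts[c] = counts.get(c, 0) + 1
    return counts

def is_anomaly(point, counts, size=0.1, min_count=2):
    """A point is anomalous if its cell was rarely (or never) seen in training."""
    return counts.get(cell(point, size), 0) < min_count

normal = [(0.1, 0.2), (0.12, 0.21), (0.11, 0.19), (0.13, 0.2)]
model = train(normal)
print(is_anomaly((0.11, 0.2), model))   # dense cell -> False
print(is_anomaly((0.9, 0.9), model))    # empty cell -> True
```

Training and scoring are both single hash-map lookups per point, which is what makes this kind of approximation attractive for streaming traffic.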
Practical method to identify orbital anomaly as spacecraft breakup in the geostationary region
NASA Astrophysics Data System (ADS)
Hanada, Toshiya; Uetsuhara, Masahiko; Nakaniwa, Yoshitaka
2012-07-01
Identifying a spacecraft breakup is essential to defining the current orbital debris environment. This paper proposes a practical method to identify an orbital anomaly, which appears as a significant discontinuity in the observation data, as a spacecraft breakup. The proposed method is applicable to orbital anomalies in the geostationary region. Long-term orbital evolutions of breakup fragments may conclude that their orbital planes will converge into several corresponding regions in inertial space even if the breakup epoch is not specified. This empirical method combines the aforementioned conclusion with the search strategy developed at Kyushu University, which can identify the origins of observed objects as fragments released from a specified spacecraft. This practical method starts with selecting a spacecraft that experienced an orbital anomaly, and formulates a hypothesis to generate fragments from the anomaly. Then, the search strategy is applied to predict the behavior of the groups of fragments hypothetically generated. The outcome of this predictive analysis specifies effectively when, where and how we should conduct optical measurements using ground-based telescopes. Objects detected based on the outcome are supposed to be from the anomaly, so that we can confirm the anomaly as a spacecraft breakup that released the detected objects. This paper also demonstrates observation planning for a spacecraft anomaly in the geostationary region.
Listening to Limericks: A Pupillometry Investigation of Perceivers’ Expectancy
Scheepers, Christoph; Mohr, Sibylle; Fischer, Martin H.; Roberts, Andrew M.
2013-01-01
What features of a poem make it captivating, and which cognitive mechanisms are sensitive to these features? We addressed these questions experimentally by measuring pupillary responses of 40 participants who listened to a series of Limericks. The Limericks ended with either a semantic, syntactic, rhyme or metric violation. Compared to a control condition without violations, only the rhyme violation condition induced a reliable pupillary response. An anomaly-rating study on the same stimuli showed that all violations were reliably detectable relative to the control condition, but the anomaly induced by rhyme violations was perceived as most severe. Together, our data suggest that rhyme violations in Limericks may induce an emotional response beyond mere anomaly detection. PMID:24086417
Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server
2016-09-01
ARL-TR-7798 ● SEP 2016. US Army Research Laboratory. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server, by Christian D Schlesiger, Computational and Information Sciences Directorate, ARL.
Routine screening for fetal anomalies: expectations.
Goldberg, James D
2004-03-01
Ultrasound has become a routine part of prenatal care. Despite this, the sensitivity and specificity of the procedure are unclear to many patients and healthcare providers. In a small study from Canada, 54.9% of women reported that they had received no information about ultrasound before their examination. In addition, 37.2% of women indicated that they were unaware of any fetal problems that ultrasound could not detect. Most centers that perform ultrasound do not have their own statistics regarding sensitivity and specificity; it is necessary to rely on large collaborative studies. Unfortunately, wide variations exist in these studies, with detection rates for fetal anomalies between 13.3% and 82.4%. The Eurofetus study is the largest prospective study performed to date and, because of the time and expense involved in this type of study, a similar study is not likely to be repeated. The overall detection rate for anomalous fetuses was 64.1%. It is important to note that in this study, ultrasounds were performed in tertiary centers with significant experience in detecting fetal malformations. The RADIUS study also demonstrated a significantly improved detection rate of anomalies before 24 weeks in tertiary versus community centers (35% versus 13%). Two concepts emerge from reviewing these data. First, patients must be made aware of the limitations of ultrasound in detecting fetal anomalies. This information is critical to allow them to make informed decisions about whether to undergo ultrasound examination and to prepare them for potential outcomes. Second, to achieve the detection rates reported in the Eurofetus study, ultrasound examination must be performed in centers that have extensive experience in the detection of fetal anomalies.
Adiabatic Quantum Anomaly Detection and Machine Learning
NASA Astrophysics Data System (ADS)
Pudenz, Kristen; Lidar, Daniel
2012-02-01
We present methods of anomaly detection and machine learning using adiabatic quantum computing. The machine learning algorithm is a boosting approach which seeks to optimally combine somewhat accurate classification functions to create a unified classifier which is much more accurate than its components. This algorithm then becomes the first part of the larger anomaly detection algorithm. In the anomaly detection routine, we first use adiabatic quantum computing to train two classifiers which detect two sets, the overlap of which forms the anomaly class. We call this the learning phase. Then, in the testing phase, the two learned classification functions are combined to form the final Hamiltonian for an adiabatic quantum computation, the low energy states of which represent the anomalies in a binary vector space.
A new prior for bayesian anomaly detection: application to biosurveillance.
Shen, Y; Cooper, G F
2010-01-01
Bayesian anomaly detection computes posterior probabilities of anomalous events by combining prior beliefs and evidence from data. However, the specification of prior probabilities can be challenging. This paper describes a Bayesian prior in the context of disease outbreak detection. The goal is to provide a meaningful, easy-to-use prior that yields a posterior probability of an outbreak that performs at least as well as a standard frequentist approach. If this goal is achieved, the resulting posterior could be usefully incorporated into a decision analysis about how to act in light of a possible disease outbreak. This paper describes a Bayesian method for anomaly detection that combines learning from data with a semi-informative prior probability over patterns of anomalous events. A univariate version of the algorithm is presented here for ease of illustration of the essential ideas. The paper describes the algorithm in the context of disease-outbreak detection, but it is general and can be used in other anomaly detection applications. For this application, the semi-informative prior specifies that an increased count over baseline is expected for the variable being monitored, such as the number of respiratory chief complaints per day at a given emergency department. The semi-informative prior is derived based on the baseline prior, which is estimated from historical data. The evaluation reported here used semi-synthetic data to assess the detection performance of the proposed Bayesian method and a control chart method, which is a standard frequentist algorithm that is closest to the Bayesian method in terms of the type of data it uses. The disease-outbreak detection performance of the Bayesian method was statistically significantly better than that of the control chart method when proper baseline periods were used to estimate the baseline behavior to avoid seasonal effects.
When using longer baseline periods, the Bayesian method performed as well as the control chart method. The time complexity of the Bayesian algorithm is linear in the number of the observed events being monitored, due to a novel, closed-form derivation that is introduced in the paper. This paper introduces a novel prior probability for Bayesian outbreak detection that is expressive, easy-to-apply, computationally efficient, and performs as well or better than a standard frequentist method.
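The core Bayesian computation described above, combining a prior belief in an outbreak with the likelihood of the observed count, can be illustrated with a deliberately simplified two-hypothesis Poisson model. The baseline rate, elevated rate and prior below are invented for illustration; the paper's semi-informative prior over patterns of anomalous events is richer than this sketch.

```python
import math

def poisson_pmf(k, lam):
    """Poisson probability of observing k events given rate lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def outbreak_posterior(count, baseline_rate, elevated_rate, prior=0.01):
    """Posterior P(outbreak | observed count) for two Poisson hypotheses:
    a 'no outbreak' baseline rate vs. a hypothetical elevated rate."""
    like_out = poisson_pmf(count, elevated_rate)
    like_base = poisson_pmf(count, baseline_rate)
    num = prior * like_out
    return num / (num + (1 - prior) * like_base)

# A daily count near baseline barely moves the prior;
# a sharp increase yields a high posterior probability of an outbreak.
print(outbreak_posterior(10, baseline_rate=10, elevated_rate=20))
print(outbreak_posterior(25, baseline_rate=10, elevated_rate=20))
```

A frequentist control chart would instead flag the count 25 because it exceeds fixed limits; the Bayesian version additionally quantifies how much the evidence shifts belief away from the prior.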
Effect of Varying Crustal Thickness on CHAMP Geopotential Data
NASA Technical Reports Server (NTRS)
Taylor, Patrick T.; Kis, Karoly I.; vonFrese, Ralph R. B.; Korhonen, Juha V.; Wittmann, Geza; Kim, Hyung Rae; Potts, Larmie V.
2003-01-01
To determine the effect of crustal thickness variation on satellite-altitude geopotential anomalies, we compared two regions of Europe with vastly different values, South and Central Finland and the Pannonian Basin. In our study regions, crustal thickness exceeds 44 km in Finland and is less than 26 km in the Pannonian Basin. Heat-flow data indicate that the thinner and more active crust of the Pannonian Basin has a value nearly three times that of the Finnish Svecofennian Province. An ovoid positive CHAMP gravity anomaly (~4 mGal) is quasi-coincident with the CHAMP magnetic anomaly that traverses the Pannonian Basin, while ground-based gravity mapping in Hungary shows that the free-air gravity anomalies across the Pannonian Basin are near 0 to +20 mGal, with shorter-wavelength anomalies from +40 to less than +60 mGal and some 0 to greater than -20 mGal. Larger anomalies are detected in the mountainous areas. The small anomaly values may indicate isostatic equilibrium for Hungary (the central part of the Pannonian Basin). Gravity data over Finland bear the overprint of deglaciation. CHAMP gravity data indicate a west-east positive gradient of less than 4 mGal across South and Central Finland. CHAMP magnetic data (400 km) reveal elongated semi-circular negative anomalies for both regions, with South-Central Finland having a larger amplitude (less than -6 nT) than that over the Pannonian Basin, Hungary (less than -5 nT). In the latter, subducted oceanic lithosphere has been proposed as the anomalous body.
Liu, Guanqun; Jia, Yonggang; Liu, Hongjun; Qiu, Hanxue; Qiu, Dongling; Shan, Hongxian
2002-03-01
The exploration and determination of leakage from underground pressureless nonmetallic pipes is difficult to deal with. A comprehensive method combining Ground Penetrating Radar (GPR), electric potential survey and geochemical survey is introduced in this paper for the leakage detection of an underground pressureless nonmetallic sewage pipe. Theoretically, within the influence zone of a leakage spot, the obvious changes in the electromagnetic properties and the physical-chemical properties of the underground media will be reflected as anomalies in GPR and electrical survey plots. The advantage of GPR and electrical surveys is that they are fast and accurate in delimiting the anomaly scope. In-situ analysis of the geophysical surveys can guide the geochemical survey. Water and soil sampling and analysis can then provide the evidence for judging whether or not an anomaly is caused by pipe leakage. On the basis of previous tests and practical surveys, the GPR waveforms, electric potential curves, contour maps, and chemical survey results are all classified into three types according to the extent or indexes of the anomalies, in order to find the leakage spots. When all three survey methods show a type I anomaly at the same spot, that spot is suspected as the most probable leakage location. Otherwise, it is downgraded as a suspect point. The suspect leakage spots should be confirmed by referring to the site conditions, because some anomalies are caused by other factors. Subsequent excavation proved that the method of determining the suspected location by anomaly type is effective and economic. The comprehensive method of GPR, electric potential survey, and geochemical survey is one of the effective methods for leakage detection of underground nonmetallic pressureless pipes, with the advantages of being fast and accurate.
Jacobsen, Matthew K.; Velisavljevic, Nenad; Kono, Yoshio; ...
2017-04-05
Evidence in support of a shear-driven anomaly in zirconium at elevated temperatures and pressures has been determined through the combined use of ultrasonic, diffractive, and radiographic techniques. Implications that these have for the phase diagram are explored through thermoacoustic parameters associated with the elasticity and thermal characteristics. In particular, our results illustrate a deviating phase boundary between the α and ω phases, referred to as a kink, at elevated temperatures and pressures. Furthermore, pair distribution studies of this material at more extreme temperatures and pressures illustrate the scale on which diffusion takes place in this material. Possible interpretation of these can be made through inspection of shear-driven anomalies in other systems.
Christiansen, Peter; Nielsen, Lars N; Steen, Kim A; Jørgensen, Rasmus N; Karstoft, Henrik
2016-11-11
Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained for detecting and classifying a predefined set of object types. These algorithms have difficulties in detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, like people, animals and other objects, occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks" (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45-90 m) than RCNN. RCNN has similar performance at short range (0-30 m). However, DeepAnomaly has much fewer model parameters and (182 ms/25 ms =) a 7.28-times faster processing time per image. Unlike most CNN-based methods, the high accuracy, the low computation time and the low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit).
Mapping Subsurface Structure at Guar Kepah by using Ground Penetrating Radar
NASA Astrophysics Data System (ADS)
Mansor, Hafizuddin; Rosli, Najmiah; Ismail, N. A.; Saidin, M.; Masnan, S. S. K.
2018-04-01
A Ground Penetrating Radar (GPR) survey was conducted at Guar Kepah to detect buried objects before commencement of archaeological gallery construction. The study area covered about 20 m in length and 14 m in width. Fifteen GPR lines, each 20 m long, spaced 1 m apart and parallel to each other, were laid out from north to south. A 500 MHz closed antenna was used in this study. Surface findings were noted before the GPR survey started. The data were analysed and interpreted using the Groundvision software, and several filters were applied to the radargrams to enhance the data. Based on the results, several anomalies were detected. The surface findings were also detected by GPR, producing hyperbolic curves in the radargrams. The subsurface layering was detected by the GPR survey. The anomalies are assigned to several classes based on the pattern of the signals obtained in the radargrams.
The use of Compton scattering in detecting anomaly in soil-possible use in pyromaterial detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abedin, Ahmad Firdaus Zainal; Ibrahim, Noorddin; Zabidi, Noriza Ahmad
Compton scattering is able to determine the signature of a landmine based on the dependency between a density anomaly and the energy change of scattered photons. In this study, 4.43 MeV gamma rays from an Am-Be source were used to perform Compton scattering. Two detectors, each with a radius of 1.9 cm, were placed at a distance of 8 cm from the source. Thallium-doped sodium iodide, NaI(Tl), detectors were used for detecting the gamma rays. Nine anomalies were used in this simulation. Each anomaly is a cylinder with a radius of 10 cm and a height of 8.9 cm, buried 5 cm deep in a soil bed measuring 80 cm in radius and 53.5 cm in height. Monte Carlo simulations indicated that the scattering of photons is directly proportional to the density of the anomalies. The difference between the detector response with and without an anomaly, namely the contrast ratio, is in a linear relationship with the density of the anomalies. Anomalies of air, wood and water give positive contrast ratio values, whereas explosive, sand, concrete, graphite, limestone and polyethylene give negative contrast ratio values. Overall, the contrast ratio values are greater than 2% for all anomalies. The strong contrast ratios result in a good detection capability and distinction between anomalies.
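The contrast ratio described above is simply the relative difference between the detector response with and without a buried anomaly, expressed as a percentage. The sketch below uses invented detector counts, not the study's simulated values, purely to show the sign convention: per the abstract, low-density anomalies such as air or wood give positive contrast relative to soil, while denser materials such as concrete give negative contrast.

```python
def contrast_ratio(with_anomaly, without_anomaly):
    """Percent difference of detector response with vs. without a buried anomaly."""
    return 100.0 * (with_anomaly - without_anomaly) / without_anomaly

# Illustrative (not simulated) detector counts over a soil bed.
soil_only = 1000.0
print(contrast_ratio(1080.0, soil_only))  # air-like anomaly: 8.0 (positive)
print(contrast_ratio(930.0, soil_only))   # concrete-like anomaly: -7.0 (negative)
```

A magnitude above the study's 2% threshold in either direction is what makes an anomaly distinguishable from the undisturbed soil response.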
Caldera unrest detected with seawater temperature anomalies at Deception Island, Antarctic Peninsula
NASA Astrophysics Data System (ADS)
Berrocoso, M.; Prates, G.; Fernández-Ros, A.; Peci, L. M.; de Gil, A.; Rosado, B.; Páez, R.; Jigena, B.
2018-04-01
Increased thermal activity was detected to coincide with the onset of volcano inflation in the seawater-filled caldera at Deception Island. This thermal activity was manifested in pulses of high water temperature that coincided with ocean tide cycles. The seawater temperature anomalies were detected by a thermometric sensor attached to the tide gauge (bottom pressure sensor). This was installed where the seawater circulation and the locations of known thermal anomalies, fumaroles and thermal springs, together favor the detection of water warmed within the caldera. Detection of the increased thermal activity was also possible because sea ice, which covers the entire caldera during the austral winter months, insulates the water and thus reduces temperature exchange between seawater and atmosphere. In these conditions, the water temperature data has been shown to provide significant information about Deception volcano activity. The detected seawater temperature increase, also observed in soil temperature readings, suggests rapid and near-simultaneous increase in geothermal activity with onset of caldera inflation and an increased number of seismic events observed in the following austral summer.
Multi-criteria anomaly detection in urban noise sensor networks.
Dauwe, Samuel; Oldoni, Damiano; De Baets, Bernard; Van Renterghem, Timothy; Botteldooren, Dick; Dhoedt, Bart
2014-01-01
The growing concern of citizens about the quality of their living environment and the emergence of low-cost microphones and data acquisition systems triggered the deployment of numerous noise monitoring networks spread over large geographical areas. Due to the local character of noise pollution in an urban environment, a dense measurement network is needed in order to accurately assess the spatial and temporal variations. The use of consumer grade microphones in this context appears to be very cost-efficient compared to the use of measurement microphones. However, the lower reliability of these sensing units requires a strong quality control of the measured data. To automatically validate sensor (microphone) data, prior to their use in further processing, a multi-criteria measurement quality assessment model for detecting anomalies such as microphone breakdowns, drifts and critical outliers was developed. Each of the criteria results in a quality score between 0 and 1. An ordered weighted average (OWA) operator combines these individual scores into a global quality score. The model is validated on datasets acquired from a real-world, extensive noise monitoring network consisting of more than 50 microphones. Over a period of more than a year, the proposed approach successfully detected several microphone faults and anomalies.
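An ordered weighted average differs from a plain weighted mean in that the weights attach to rank positions rather than to fixed criteria, so the same operator can be tuned from optimistic (best score dominates) to pessimistic (worst score dominates). The sketch below uses made-up quality scores and weights; the actual criteria and weighting of the microphone-network model are not reproduced here.

```python
def owa(scores, weights):
    """Ordered weighted average: weights apply to the sorted scores
    (rank positions), not to the criteria that produced them."""
    assert len(scores) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(weights, ordered))

# Hypothetical per-criterion quality scores in [0, 1]
# (e.g. breakdown, drift and outlier checks for one microphone).
scores = [0.9, 0.2, 0.8]
pessimistic = owa(scores, [0.0, 0.0, 1.0])   # worst criterion dominates
balanced = owa(scores, [1/3, 1/3, 1/3])      # reduces to the plain mean
print(pessimistic, balanced)
```

A pessimistic weighting is a natural choice for sensor validation: one failed criterion (here the 0.2 drift score) is enough to pull the global quality score down.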
Semi-supervised anomaly detection - towards model-independent searches of new physics
NASA Astrophysics Data System (ADS)
Kuusela, Mikael; Vatanen, Tommi; Malmi, Eric; Raiko, Tapani; Aaltonen, Timo; Nagai, Yoshikazu
2012-06-01
Most classification algorithms used in high energy physics fall under the category of supervised machine learning. Such methods require a training set containing both signal and background events and are prone to classification errors should this training data be systematically inaccurate for example due to the assumed MC model. To complement such model-dependent searches, we propose an algorithm based on semi-supervised anomaly detection techniques, which does not require a MC training sample for the signal data. We first model the background using a multivariate Gaussian mixture model. We then search for deviations from this model by fitting to the observations a mixture of the background model and a number of additional Gaussians. This allows us to perform pattern recognition of any anomalous excess over the background. We show by a comparison to neural network classifiers that such an approach is a lot more robust against misspecification of the signal MC than supervised classification. In cases where there is an unexpected signal, a neural network might fail to correctly identify it, while anomaly detection does not suffer from such a limitation. On the other hand, when there are no systematic errors in the training data, both methods perform comparably.
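The essence of the background-model approach, score observations by how surprised the fitted background density is, can be shown with a univariate, single-Gaussian simplification of the paper's multivariate Gaussian mixture. The background sample and threshold logic below are illustrative assumptions, not the paper's setup.

```python
import math
import statistics

def fit_background(sample):
    """Fit a single Gaussian to background-only events (a univariate
    stand-in for the paper's multivariate Gaussian mixture model)."""
    return statistics.fmean(sample), statistics.stdev(sample)

def anomaly_score(x, mu, sigma):
    """Negative log density under the background model:
    large where the background model is 'surprised' by the observation."""
    return 0.5 * ((x - mu) / sigma) ** 2 + math.log(sigma * math.sqrt(2 * math.pi))

# Hypothetical background measurements clustered around 10.0.
background = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0]
mu, sigma = fit_background(background)
print(anomaly_score(10.0, mu, sigma) < anomaly_score(13.0, mu, sigma))  # True
```

The semi-supervised advantage noted in the abstract follows directly: nothing in this scoring step depends on a signal model, so an unexpected excess is flagged even when no signal MC was ever simulated.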
Anomalous change detection in imagery
Theiler, James P [Los Alamos, NM; Perkins, Simon J [Santa Fe, NM
2011-05-31
A distribution-based anomaly detection platform is described that identifies a non-flat background that is specified in terms of the distribution of the data. A resampling approach is also disclosed employing scrambled resampling of the original data with one class specified by the data and the other by the explicit distribution, and solving using binary classification.
Anomaly Detection Using an Ensemble of Feature Models
Noto, Keith; Brodley, Carla; Slonim, Donna
2011-01-01
We present a new approach to semi-supervised anomaly detection. Given a set of training examples believed to come from the same distribution or class, the task is to learn a model that will be able to distinguish examples in the future that do not belong to the same class. Traditional approaches typically compare the position of a new data point to the set of “normal” training data points in a chosen representation of the feature space. For some data sets, the normal data may not have discernible positions in feature space, but do have consistent relationships among some features that fail to appear in the anomalous examples. Our approach learns to predict the values of training set features from the values of other features. After we have formed an ensemble of predictors, we apply this ensemble to new data points. To combine the contribution of each predictor in our ensemble, we have developed a novel, information-theoretic anomaly measure that our experimental results show selects against noisy and irrelevant features. Our results on 47 data sets show that for most data sets, this approach significantly improves performance over current state-of-the-art feature space distance and density-based approaches. PMID:22020249
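The idea of learning inter-feature relationships rather than feature-space positions can be sketched with a two-feature toy: fit a least-squares predictor for each feature from the other, then score new points by their total prediction error. This is only an illustration; the paper fits one predictor per feature over many features and combines them with an information-theoretic anomaly measure, which is replaced here by a plain squared-error sum.

```python
def fit_predictors(data):
    """For each of two features, fit a least-squares line predicting it
    from the other feature (a tiny sketch of the per-feature ensemble)."""
    models = []
    for target in (0, 1):
        src = 1 - target
        xs = [row[src] for row in data]
        ys = [row[target] for row in data]
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        models.append((src, slope, my - slope * mx))
    return models

def anomaly_score(row, models):
    """Total squared prediction error: normal points obey the learned
    inter-feature relationships, anomalies break them."""
    return sum((row[t] - (s * row[src] + b)) ** 2
               for t, (src, s, b) in enumerate(models))

train_rows = [(1, 2), (2, 4), (3, 6), (4, 8)]   # feature 1 ≈ 2 * feature 0
models = fit_predictors(train_rows)
print(anomaly_score((5, 10), models) < anomaly_score((5, 2), models))  # True
```

Note that (5, 10) lies outside the training range yet scores as normal because it preserves the relationship, which is exactly the behavior that distance- and density-based methods would miss.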
NASA Astrophysics Data System (ADS)
Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus
2016-04-01
The number of available Earth observations (EOs) is currently substantially increasing. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues might result in anomalous multivariate data constellations and have to be identified before corrupting subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to some statistical process control (SPC) algorithm. The basic idea is to raise an alarm when these sensor data depict some anomalous pattern according to the SPC, i.e. the production chain is considered 'out of control'. In fact, the industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of different algorithms typically applied by SPC systems and evaluate their capability to detect e.g. known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event and (2) analyzing the limits of the individual approaches with respect to the magnitude and spatio-temporal size of the event as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatiotemporal events and directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g. 
http://baci-h2020.eu/index.php/ and http://earthsystemdatacube.net/. Known anomalies such as the Russian heatwave are detected as well as anomalies which are not detectable with univariate methods.
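Among the algorithms typically applied by SPC systems is the classical Shewhart control chart: estimate the mean and spread of an in-control (baseline) period, then raise an alarm whenever the monitored series leaves a band of a few standard deviations. The baseline values and the 3-sigma band below are illustrative, not taken from the study.

```python
import statistics

def control_limits(in_control, k=3.0):
    """Shewhart-style control limits from an in-control (training) period."""
    mu = statistics.fmean(in_control)
    sigma = statistics.stdev(in_control)
    return mu - k * sigma, mu + k * sigma

def out_of_control(stream, low, high):
    """Indices where the monitored series leaves the control band."""
    return [i for i, x in enumerate(stream) if not low <= x <= high]

# Hypothetical in-control sensor readings, then a monitored stream
# containing one extreme value.
baseline = [20.0, 20.5, 19.8, 20.2, 20.1, 19.9, 20.3, 20.0]
low, high = control_limits(baseline)
print(out_of_control([20.1, 20.4, 27.5, 19.7], low, high))  # [2]
```

Applied per pixel or per region of an EO data cube, the same alarm logic flags spatio-temporal extremes, with the caveat noted in the abstract that univariate charts miss events only visible in the joint behavior of several variables.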
Urban Classification Techniques Using the Fusion of LiDAR and Spectral Data
2012-09-01
Thesis by Justin E. Mesina, September 2012; thesis advisor: Richard C. Olsen. The fused results, however, were not as accurate in differentiating trees from grasses as using only the spectral results.
NASA Technical Reports Server (NTRS)
Shukla, J.; Moura, A. D.
1980-01-01
The monthly mean sea surface temperature anomalies over the tropical Atlantic and rainfall anomalies over two selected stations for 25 years (1948-1972) were examined. It is found that the most severe drought events are associated with the simultaneous occurrence of warm sea surface temperature anomalies over the north and cold sea surface temperature anomalies over the south tropical Atlantic. Simultaneous occurrences of a warm sea surface temperature anomaly at 15 deg N, 45 deg W and a cold sea surface temperature anomaly at 15 deg S, 5 deg W were always associated with negative anomalies of rainfall, and vice versa. A simple primitive equation model is used to calculate the frictionally controlled and thermally driven circulation due to a prescribed heating function in a resting atmosphere.
Anomaly detection applied to a materials control and accounting database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiteson, R.; Spanks, L.; Yarbro, T.
An important component of the national mission of reducing the nuclear danger includes accurate recording of the processing and transportation of nuclear materials. Nuclear material storage facilities, nuclear chemical processing plants, and nuclear fuel fabrication facilities collect and store large amounts of data describing transactions that involve nuclear materials. To maintain confidence in the integrity of these data, it is essential to identify anomalies in the databases. Anomalous data could indicate error, theft, or diversion of material. Yet, because of the complex and diverse nature of the data, analysis and evaluation are extremely tedious. This paper describes the authors' work in the development of analysis tools to automate the anomaly detection process for the Material Accountability and Safeguards System (MASS) that tracks and records the activities associated with accountable quantities of nuclear material at Los Alamos National Laboratory. Using existing guidelines that describe valid transactions, the authors have created an expert system that identifies transactions that do not conform to the guidelines. Thus, this expert system can be used to focus the attention of the expert or inspector directly on significant phenomena.
Kinsland, G L; Hurtado, M; Pope, K O
2000-04-15
Small negative gravity anomalies are found in gravity data from along the northwestern shoreline of the Yucatan Peninsula. These anomalies are shown to be due to elongate, shallow anomalous porosity zones in the Tertiary carbonates. These zones are caused primarily by groundwater solution and are presently active conduits for groundwater flow. The association of these small gravity anomalies with known topographic and structural features of the area, which partially overlies the Chicxulub Impact crater, indicates their development was influenced by structures, faults and/or fractures, within the Tertiary and pre-Tertiary carbonates.
NASA Technical Reports Server (NTRS)
Kinsland, G. L.; Hurtado, M.; Pope, K. O.; Ocampo, A. C. (Principal Investigator)
2000-01-01
Small negative gravity anomalies are found in gravity data from along the northwestern shoreline of the Yucatan Peninsula. These anomalies are shown to be due to elongate, shallow anomalous porosity zones in the Tertiary carbonates. These zones are caused primarily by groundwater solution and are presently active conduits for groundwater flow. The association of these small gravity anomalies with known topographic and structural features of the area, which partially overlies the Chicxulub Impact crater, indicates their development was influenced by structures, faults and/or fractures, within the Tertiary and pre-Tertiary carbonates.
Analysis of NSWC Ocean EM Observatory Test Data
2016-09-01
Subject terms: magnetic anomaly detection (MAD), oceanographic magnetic fields, coherence, magnetic noise reduction. The report covers deployment locations, coherence analyses, analysis of magnetic data, and underwater magnetic data (Appendix B, Feb 11).
Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance
NASA Technical Reports Server (NTRS)
Viswanathan, Arun
2012-01-01
This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. 
Prior research has focused on understanding the impact of this variability in training data for anomaly detectors, but has ignored variability in the attack signal that will necessarily affect the evaluation results for such detectors. We posit that current evaluation strategies implicitly assume that attacks always manifest in a stable manner; we show that this assumption is wrong. We describe a simple experiment to demonstrate the effects of environmental noise on the manifestation of attacks in data and introduce the notion of attack manifestation stability. Finally, we argue that conclusions about detector performance will be unreliable and incomplete if the stability of attack manifestation is not accounted for in the evaluation strategy.
NASA Astrophysics Data System (ADS)
Tian, Qing; Yang, Dan; Zhang, Yuan; Qu, Hongquan
2018-04-01
This paper presents a detection and recognition method to locate and identify harmful intrusions in the optical fiber pre-warning system (OFPS). Inspired by visual attention architecture (VAA), the process flow is divided into two parts, i.e., a data-driven process and a task-driven process. First, the data-driven process takes all the measurements collected by the system as input signals, which are handled by the detection method to locate harmful intrusions in both the spatial domain and the time domain. Then, these detected intrusion signals are taken over by the task-driven process. Specifically, we use the pitch period (PP) and duty cycle (DC) of the intrusion signals to identify mechanical digging and manual digging (MD) intrusions, respectively. For passing vehicle (PV) intrusions, their strong low-frequency component can be used as a good feature. In general, since the harmful intrusion signals account for only a small part of the whole measurements, the data-driven process considerably reduces the amount of input data for the subsequent task-driven process. Furthermore, the task-driven process handles the harmful intrusions in order of their severity, which provides a priority mechanism for the system as well as targeted processing for each type of harmful intrusion. Finally, real experiments are performed to validate the effectiveness of this method.
Overton, Jr., William C.; Steyert, Jr., William A.
1984-01-01
A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.
Overton, W.C. Jr.; Steyert, W.A. Jr.
1981-05-22
A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.
Integrated System Health Management Development Toolkit
NASA Technical Reports Server (NTRS)
Figueroa, Jorge; Smith, Harvey; Morris, Jon
2009-01-01
This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.
NASA Astrophysics Data System (ADS)
Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.
2016-09-01
Operational analytics when combined with Big Data technologies and predictive techniques have been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.
NASA Astrophysics Data System (ADS)
Wang, Fei; Wang, Wenyu; Yang, Jin Min
2017-10-01
We propose to introduce general messenger-matter interactions in the deflected anomaly mediated supersymmetry (SUSY) breaking (AMSB) scenario to explain the gμ-2 anomaly. Scenarios with complete or incomplete grand unified theory (GUT) multiplet messengers are discussed, respectively. The introduction of incomplete GUT mulitiplets can be advantageous in various aspects. We found that the gμ-2 anomaly can be solved in both scenarios under current constraints including the gluino mass bounds, while the scenarios with incomplete GUT representation messengers are more favored by the gμ-2 data. We also found that the gluino is upper bounded by about 2.5 TeV (2.0 TeV) in scenario A and 3.0 TeV (2.7 TeV) in scenario B if the generalized deflected AMSB scenarios are used to fully account for the gμ-2 anomaly at 3 σ (2 σ ) level. Such a gluino should be accessible in the future LHC searches. Dark matter (DM) constraints, including DM relic density and direct detection bounds, favor scenario B with incomplete GUT multiplets. Much of the allowed parameter space for scenario B could be covered by the future DM direct detection experiments.
Stability Analysis of Radial Turning Process for Superalloys
NASA Astrophysics Data System (ADS)
Jiménez, Alberto; Boto, Fernando; Irigoien, Itziar; Sierra, Basilio; Suarez, Alfredo
2017-09-01
Stability detection in machining processes is an essential component for the design of efficient machining processes. Automatic methods are able to determine when instability is happening and prevent possible machine failures. In this work a variety of methods are proposed for detecting stability anomalies based on the measured forces in the radial turning process of superalloys. Two different methods are proposed to determine instabilities. Each one is tested on real data obtained in the machining of Waspalloy, Haynes 282 and Inconel 718. Experimental data, in both Conventional and High Pressure Coolant (HPC) environments, are set in four different states depending on the material's grain size and hardness (LGA, LGS, SGA and SGS). Results reveal that the PCA method is useful for visualizing the process and detecting anomalies in online processes.
Network Intrusion Detection and Visualization using Aggregations in a Cyber Security Data Warehouse
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czejdo, Bogdan; Ferragut, Erik M; Goodall, John R
2012-01-01
The challenge of achieving situational understanding is a limiting factor in effective, timely, and adaptive cyber-security analysis. Anomaly detection fills a critical role in network assessment and trend analysis, both of which underlie the establishment of comprehensive situational understanding. To that end, we propose a cyber security data warehouse implemented as a hierarchical graph of aggregations that captures anomalies at multiple scales. Each node of our proposed graph is a summarization table of cyber event aggregations, and the edges are aggregation operators. The cyber security data warehouse enables domain experts to quickly traverse a multi-scale aggregation space systematically. We describe the architecture of a test bed system and a summary of results on the IEEE VAST 2012 Cyber Forensics data.
Effect of Varying Crustal Thickness on CHAMP Geopotential Data
NASA Technical Reports Server (NTRS)
Taylor, P. T.; Kis, K. I.; vonFrese, R. R. B.; Korhonen, J. V.; Wittmann, G.; Kim, H. R.; Potts, L. V.
2003-01-01
To determine the effect of crustal thickness variation on satellite-altitude geopotential anomalies we compared two regions of Europe with vastly different values, Central/Southern Finland and the Pannonian Basin. Crustal thickness exceeds 62 km in Finland and is less than 26 km in the Pannonian Basin. Heat-flow maps indicate that the thinner and more active crust of the Pannonian Basin has a value nearly three times that of the Finnish Svecofennian Province. Ground based gravity mapping in Hungary shows that the free-air gravity anomalies across the Pannonian Basin are near 0 to +20 mGal, with shorter wavelength anomalies from +40 to less than +60 mGal and some 0 to greater than -20 mGal. Larger anomalies are detected in the mountainous areas. The small anomaly values may indicate isostatic equilibrium for Hungary (the central part of the Pannonian Basin). Gravity data over Finland are complicated by de-glaciation. CHAMP gravity data (400 km) indicate a west-east positive gradient of greater than 4 mGal across Central/Southern Finland and an ovoid positive anomaly (approximately 4 mGal) quasi-coincidental with the magnetic anomaly traversing the Pannonian Basin. CHAMP magnetic data (425 km) reveal elongated semicircular negative anomalies for both regions, with South-Central Finland having larger amplitude (less than -6 nT) than the Pannonian Basin, Hungary (less than -5 nT). In both regions subducted oceanic lithosphere has been proposed as the anomalous body.
Continental and oceanic magnetic anomalies: Enhancement through GRM
NASA Technical Reports Server (NTRS)
Vonfrese, R. R. B.; Hinze, W. J.
1985-01-01
In contrast to the POGO and MAGSAT satellites, the Geopotential Research Mission (GRM) satellite system will orbit at a minimum elevation to provide significantly better resolved lithospheric magnetic anomalies for more detailed and improved geologic analysis. In addition, GRM will measure corresponding gravity anomalies to enhance our understanding of the gravity field for vast regions of the Earth which are largely inaccessible to more conventional surface mapping. Crustal studies will greatly benefit from the dual data sets as modeling has shown that lithospheric sources of long wavelength magnetic anomalies frequently involve density variations which may produce detectable gravity anomalies at satellite elevations. Furthermore, GRM will provide an important replication of lithospheric magnetic anomalies as an aid to identifying and extracting these anomalies from satellite magnetic measurements. The potential benefits to the study of the origin and characterization of the continents and oceans, that may result from the increased GRM resolution are examined.
Network anomaly detection system with optimized DS evidence theory.
Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu
2014-01-01
Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have applied fusion methods and DS evidence theory to network anomaly detection but achieved low performance, and they did not account for the complicated and varied nature of network features. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs each sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that the RBPA and ODS optimization methods can improve system performance significantly.
Network Anomaly Detection System with Optimized DS Evidence Theory
Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu
2014-01-01
Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have applied fusion methods and DS evidence theory to network anomaly detection but achieved low performance, and they did not account for the complicated and varied nature of network features. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs each sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that the RBPA and ODS optimization methods can improve system performance significantly. PMID:25254258
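The Dempster-Shafer fusion at the core of this system can be illustrated with the plain (unweighted) combination rule; the ODS per-sensor weighting and the RBPA function are omitted here, and the two-hypothesis frame and the mass values below are invented for illustration:

```python
# Minimal sketch of Dempster's rule of combination for two sensors.

def dempster_combine(m1, m2):
    """Combine two basic probability assignments keyed by frozenset focal elements."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass falling on the empty set
    k = 1.0 - conflict                       # normalization constant
    return {s: v / k for s, v in combined.items()}

N, A = frozenset({"normal"}), frozenset({"attack"})
theta = N | A                                 # total ignorance
sensor1 = {A: 0.6, N: 0.1, theta: 0.3}        # sensor 1 leans toward "attack"
sensor2 = {A: 0.5, N: 0.2, theta: 0.3}
fused = dempster_combine(sensor1, sensor2)
print(fused[A] > fused[N])                    # fused belief favors "attack"
```

The ODS refinement described above would discount each sensor's masses by a weight derived from its past prediction accuracy before this combination step.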
Surveying the South Pole-Aitken basin magnetic anomaly for remnant impactor metallic iron
Cahill, Joshua T.S.; Hagerty, Justin J.; Lawrence, David M.; Klima, Rachel L.; Blewett, David T.
2014-01-01
The Moon has areas of magnetized crust ("magnetic anomalies"), the origins of which are poorly constrained. A magnetic anomaly near the northern rim of South Pole-Aitken (SPA) basin was recently postulated to originate from remnant metallic iron emplaced by the SPA basin-forming impactor. Here, we remotely examine the regolith of this SPA magnetic anomaly with a combination of Clementine and Lunar Prospector derived iron maps for any evidence of enhanced metallic iron content. We find that these data sets do not definitively detect the hypothesized remnant metallic iron within the upper tens of centimeters of the lunar regolith.
Detection of Anomalies in Hydrometric Data Using Artificial Intelligence Techniques
NASA Astrophysics Data System (ADS)
Lauzon, N.; Lence, B. J.
2002-12-01
This work focuses on the detection of anomalies in hydrometric data sequences, such as 1) outliers, which are individual data having statistical properties that differ from those of the overall population; 2) shifts, which are sudden changes over time in the statistical properties of the historical records of data; and 3) trends, which are systematic changes over time in the statistical properties. For the purpose of the design and management of water resources systems, it is important to be aware of these anomalies in hydrometric data, for they can induce a bias in the estimation of water quantity and quality parameters. These anomalies may be viewed as specific patterns affecting the data, and therefore pattern recognition techniques can be used for identifying them. However, the number of possible patterns is very large for each type of anomaly and consequently large computing capacities are required to account for all possibilities using the standard statistical techniques, such as cluster analysis. Artificial intelligence techniques, such as the Kohonen neural network and fuzzy c-means, are clustering techniques commonly used for pattern recognition in several areas of engineering and have recently begun to be used for the analysis of natural systems. They require much less computing capacity than the standard statistical techniques, and therefore are well suited for the identification of outliers, shifts and trends in hydrometric data. This work constitutes a preliminary study, using synthetic data representing hydrometric data that can be found in Canada. The analysis of the results obtained shows that the Kohonen neural network and fuzzy c-means are reasonably successful in identifying anomalies. This work also addresses the problem of uncertainties inherent to the calibration procedures that fit the clusters to the possible patterns for both the Kohonen neural network and fuzzy c-means. 
Indeed, for the same database, different sets of clusters can be established with these calibration procedures. A simple method for analyzing uncertainties associated with the Kohonen neural network and fuzzy c-means is developed here. The method combines the results from several sets of clusters, either from the Kohonen neural network or fuzzy c-means, so as to provide an overall diagnosis as to the identification of outliers, shifts and trends. The results indicate an improvement in the performance for identifying anomalies when the method of combining cluster sets is used, compared with when only one cluster set is used.
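The cluster-distance idea behind this diagnosis can be sketched with plain k-means standing in for the Kohonen network and fuzzy c-means used above (the clustering algorithm differs, but the outlier test is the same: flag points that sit far from every learned cluster center). The 1-D data, the initial centers, and the 2.0 threshold are all invented:

```python
# Toy k-means on 1-D data, then flag points far from every center.

def kmeans_1d(data, centers, iters=20):
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for x in data:
            i = min(range(len(centers)), key=lambda k: abs(x - centers[k]))
            clusters[i].append(x)
        # Move each center to the mean of its cluster (keep it if empty).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

data = [1.0, 1.1, 0.9, 1.05, 5.0, 5.1, 4.9, 5.05, 12.0]  # 12.0 is isolated
centers = kmeans_1d(data, centers=[1.0, 5.0])
outliers = [x for x in data if min(abs(x - c) for c in centers) > 2.0]
print(outliers)  # -> [12.0]
```

Shifts and trends would need the time dimension encoded in the pattern vectors, as in the work above, but the distance-to-cluster test is unchanged.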
Autonomous software: Myth or magic?
NASA Astrophysics Data System (ADS)
Allan, A.; Naylor, T.; Saunders, E. S.
2008-03-01
We discuss work by the eSTAR project which demonstrates a fully closed loop autonomous system for the follow up of possible micro-lensing anomalies. Not only are the initial micro-lensing detections followed up in real time, but ongoing events are prioritised and continually monitored, with the returned data being analysed automatically. If the "smart software" running the observing campaign detects a planet-like anomaly, further follow-up will be scheduled autonomously and other telescopes and telescope networks alerted to the possible planetary detection. We further discuss the implications of this, and how such projects can be used to build more general autonomous observing and control systems.
NASA Astrophysics Data System (ADS)
Chen, S.; Tao, C.; Li, H.; Zhou, J.; Deng, X.; Tao, W.; Zhang, G.; Liu, W.; He, Y.
2014-12-01
The Precious Stone Mountain hydrothermal field (PSMHF) is located on the southern rim of the Galapagos Microplate. It was found during the 3rd leg of the 2009 Chinese DY115-21 expedition on board R/V Dayangyihao. Detecting turbidity and temperature anomalies is an efficient way to map the distribution of hydrothermal plumes and locate hydrothermal vents. A method for detecting seawater turbidity with MAPRs on deep-tow surveys was established and improved during our cruises. We collected data recorded by MAPR and information from geological sampling, yielding the following results: (1) Strong hydrothermal turbidity and temperature anomalies were recorded at 1.23°N, southeast and northwest of the PSMHF. According to the CTD data on the mooring system, significant temperature anomalies were observed over the PSMHF at a depth of 1,470 m, ranging from 0.2°C to 0.4°C, which gave further evidence of the existence of a hydrothermal plume. (2) At 1.23°N (101.4802°W/1.2305°N), the nose-shaped particle plume was concentrated at a depth interval of 1,400-1,600 m, with 200 m thickness and an east-west diffusion range of 500 m. The maximum turbidity anomaly (0.045 ΔNTU) was recorded at a depth of 1,500 m, while the background anomaly was about 0.01 ΔNTU. A distinct temperature anomaly was also detected at the seafloor near 1.23°N. Deep-tow camera footage showed the area was piled with hydrothermal sulfide sediments. (3) In the southeast (101.49°W/1.21°N), the hydrothermal plume was 300 m thick and spreading laterally at a depth of 1,500-1,800 m, for a distance of about 800 m. The maximum turbidity anomaly of the nose-shaped plume is about 0.04 ΔNTU at a depth of 1,600 m. A distinct temperature anomaly was also detected in the northwest (101.515°W/1.235°N). (4) Terrain and bottom currents were the main factors controlling the distribution of the hydrothermal plume. 
Unlike the distribution of hydrothermal plumes on mid-ocean ridges, which is mostly affected by seafloor topography, the terrain of the PSMHF was relatively flat, so its impact was negligible. A southwestward bottom current of 0.05 m/s in the PSMHF had a great influence on the distribution and spreading direction of the hydrothermal plume. Keywords: hydrothermal plume, Precious Stone Mountain hydrothermal field, turbidity
NASA Astrophysics Data System (ADS)
Akhoondzadeh, M.
2013-04-01
In this paper, a number of classical and intelligent methods, including interquartile, autoregressive integrated moving average (ARIMA), artificial neural network (ANN) and support vector machine (SVM), have been proposed to quantify potential thermal anomalies around the time of the 11 August 2012 Varzeghan, Iran, earthquake (Mw = 6.4). The duration of the data set, which is comprised of Aqua-MODIS land surface temperature (LST) night-time snapshot images, is 62 days. In order to quantify variations of LST data obtained from satellite images, the air temperature (AT) data derived from the meteorological station close to the earthquake epicenter has been taken into account. For the models examined here, results indicate the following: (i) ARIMA models, which are the most widely used in the time series community for short-term forecasting, are quickly and easily implemented, and can efficiently act through linear solutions. (ii) A multilayer perceptron (MLP) feed-forward neural network can be a suitable non-parametric method to detect the anomalous changes of a non-linear time series such as variations of LST. (iii) Since SVMs are often used due to their many advantages for classification and regression tasks, it can be shown that, if the difference between the predicted value using the SVM method and the observed value exceeds the pre-defined threshold value, then the observed value could be regarded as an anomaly. (iv) ANN and SVM methods could be powerful tools in modeling complex phenomena such as earthquake precursor time series where we may not know what the underlying data generating process is. There is good agreement in the results obtained from the different methods for quantifying potential anomalies in a given LST time series. This paper indicates that the detection of the potential thermal anomalies derive credibility from the overall efficiencies and potentialities of the four integrated methods.
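The residual-threshold rule in (iii) can be sketched directly: predict each value from recent history and flag the observation when |predicted - observed| exceeds a preset threshold. A moving-average predictor stands in here for the paper's ARIMA/ANN/SVM models, and the toy temperature series and threshold are invented:

```python
# Flag time steps whose observation deviates from a simple
# moving-average prediction by more than a fixed threshold.

def residual_anomalies(series, window=3, threshold=3.0):
    flagged = []
    for t in range(window, len(series)):
        predicted = sum(series[t - window:t]) / window
        if abs(series[t] - predicted) > threshold:
            flagged.append(t)
    return flagged

lst = [20.1, 20.3, 20.0, 20.2, 20.4, 26.5, 20.3, 20.1]  # index 5 spikes
print(residual_anomalies(lst))  # -> [5]
```

Any of the four predictors compared in the study can be dropped into the `predicted` step; the anomaly decision itself stays the same.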
NASA Astrophysics Data System (ADS)
Sorge, J.; Williams-Jones, G.; Wright, R.; Varley, N. R.
2010-12-01
Satellite imagery is playing an increasingly prominent role in volcanology as it allows for consistent monitoring of remote, dangerous, and/or under-monitored volcanoes. One such system is Volcán de Colima (Mexico), a persistently active andesitic stratovolcano. Its characteristic and hazardous activity includes lava dome growth, pyroclastic flows, explosions, and Plinian to Subplinian eruptions, which have historically occurred at the end of Volcán de Colima’s eruptive cycle. Despite the availability of large amounts of historical satellite imagery, methods to process and interpret these images over long time periods are limited. Furthermore, while time-series InSAR data from a previous study (December 2002 to August 2006) detected an overall subsidence between 1 and 3 km from the summit, there is insufficient temporal resolution to unambiguously constrain the source processes. To address this issue, a semi-automated process for time-based characterization of persistent volcanic activity at Volcán de Colima has been developed using a combination of MODIS and GOES satellite imagery to identify thermal anomalies on the volcano edifice. This satellite time-series data is then combined with available geodetic data, a detailed eruption history, and other geophysical time-series data (e.g., seismicity, explosions/day, effusion rate, environmental data, etc.) and examined for possible correlations and recurring patterns in the multiple data sets to investigate potential trigger mechanisms responsible for the changes in volcanic activity. GOES and MODIS images are available from 2000 to present at a temporal resolution of one image every 30 minutes and up to four images per day, respectively, creating a data set of approximately 180,000 images. Thermal anomalies over Volcán de Colima are identified in both night- and day-time images by applying a time-series approach to the analysis of MODIS data. 
Detection of false anomalies, caused by non-volcanic heat sources such as fires or solar heating (in the daytime images), is mitigated by adjusting the MODIS detection thresholds, through comparison of daytime versus nighttime results, and by observing the spatial distribution of the anomalies on the edifice. Conversely, anomalies may not be detected due to cloud cover; clouds absorb thermal radiation limiting or preventing the ability of the satellite to measure thermal events; therefore, the anomaly data is supplemented with a cloud cover time-series data set. Fast Fourier and Wavelet transforms are then applied to the continuous, uninterrupted intervals of satellite observation to compare and correlate with the multiple time-series data sets. The result is the characterization of the behavior of an individual volcano, based on an extended time period. This volcano specific, comprehensive characterization can then be used as a predictive tool in the real-time monitoring of volcanic activity.
NASA Astrophysics Data System (ADS)
Benedetto, J.; Cloninger, A.; Czaja, W.; Doster, T.; Kochersberger, K.; Manning, B.; McCullough, T.; McLane, M.
2014-05-01
Successful performance of a radiological search mission depends on effective utilization of a mixture of signals. Example modalities include EO imagery and gamma radiation data, or radiation data collected during multiple events. In addition, elevation data or spatial proximity can be used to enhance the performance of acquisition systems. State-of-the-art techniques for processing and exploiting complex information manifolds rely on diffusion operators. Our approach involves machine learning techniques based on analysis of joint data-dependent graphs and their associated diffusion kernels. The significant eigenvectors of the derived fused graph Laplace and Schroedinger operators then form the new representation, which provides integrated features from the heterogeneous input data. The families of data-dependent Laplace and Schroedinger operators on joint data graphs are integrated by means of appropriately designed fusion metrics. These fused representations are used for target and anomaly detection.
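As a hedged sketch of the fusion idea (the paper's actual fusion metrics and Schroedinger potentials are not specified here), one can build a k-nearest-neighbour affinity graph per modality, blend the two graphs with a convex weight, and use the low-order Laplacian eigenvectors as fused features. The weight `alpha` and all function names below are illustrative assumptions.

```python
import numpy as np

def knn_affinity(X, k=3, sigma=1.0):
    """Gaussian affinity restricted to the k strongest links per node, symmetrised."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    W = np.exp(-D ** 2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    keep = np.argsort(W, axis=1)[:, -k:]     # indices of the k largest affinities
    M = np.zeros_like(W)
    rows = np.arange(len(X))[:, None]
    M[rows, keep] = W[rows, keep]
    return np.maximum(M, M.T)                # symmetrise

def fused_laplacian_embedding(X1, X2, k=3, dim=2, alpha=0.5):
    """Eigenvectors of the Laplacian of a convex combination of two
    modality-specific affinity graphs (a simple stand-in fusion metric)."""
    W = alpha * knn_affinity(X1, k) + (1 - alpha) * knn_affinity(X2, k)
    L = np.diag(W.sum(axis=1)) - W           # unnormalised graph Laplacian
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:dim + 1]                # skip the trivial constant eigenvector
```

Swapping the plain Laplacian for a Schroedinger operator amounts to adding a diagonal potential term to L before the eigendecomposition.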
Application of Data Cubes for Improving Detection of Water Cycle Extreme Events
NASA Technical Reports Server (NTRS)
Albayrak, Arif; Teng, William
2015-01-01
As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are data as archived rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case of our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection, requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events, using data from the Tropical Rainfall Measuring Mission (TRMM).
NASA Astrophysics Data System (ADS)
Kim, J.; Lin, S. Y.; Tsai, Y.; Singh, S.; Singh, T.
2017-12-01
A large ground deformation, likely caused by significant groundwater depletion of the Northwest India Aquifer, has been successfully observed through space geodesy techniques (Tsai et al., 2016). Employing advanced time-series ScanSAR InSAR analysis and Gravity Recovery and Climate Experiment (GRACE) satellite data, that study revealed a huge, 400-km-wide ground deformation in and around Haryana. It further noted that the city of Ambala, in the northern Haryana district, showed the most significant ground subsidence, with maximum cumulative deformation of up to 0.2 meters within 3 years, in contrast to nearby cities such as Patiala and Chandigarh, which did not present similar subsidence. In this study, we investigated the details of this "Ambala Anomaly" employing advanced time-series InSAR and spatial analyses, together with the local geological and anthropogenic contexts, and tried to identify the factors causing such a highly unique ground deformation pattern. To explore the pattern and trend of Ambala's subsidence, we integrated the time-series deformation results of both ascending L-band PALSAR-1 (Phased Array type L-band Synthetic Aperture Radar) data from 2007/1 to 2011/1 and descending C-band ASAR (Advanced Synthetic Aperture Radar) data from 2008/9 to 2010/8 in a 3D decomposition, expecting to reveal the asymmetric movement of the surface. In addition, spatial analyses incorporating the detected ground deformations and local economic/social factors were applied to the interpretation of the "Ambala Anomaly". The detailed interrelationship of the driving factors of the "Ambala Anomaly" and the spatial pattern of the corresponding ground subsidence will be further demonstrated. Ultimately, we determined that the unique Ambala subsidence is possibly driven both by anthropogenic factors, including rapid population growth and the construction of industrial centers, and by the natural geological characteristics and sediment deposition.
Using statistical anomaly detection models to find clinical decision support malfunctions.
Ray, Soumi; McEvoy, Dustin S; Aaron, Skye; Hickman, Thu-Trang; Wright, Adam
2018-05-11
Malfunctions in Clinical Decision Support (CDS) systems occur due to a multitude of reasons, and often go unnoticed, leading to potentially poor outcomes. Our goal was to identify malfunctions within CDS systems. We evaluated 6 anomaly detection models: (1) Poisson Changepoint Model, (2) Autoregressive Integrated Moving Average (ARIMA) Model, (3) Hierarchical Divisive Changepoint (HDC) Model, (4) Bayesian Changepoint Model, (5) Seasonal Hybrid Extreme Studentized Deviate (SHESD) Model, and (6) E-Divisive with Median (EDM) Model and characterized their ability to find known anomalies. We analyzed 4 CDS alerts with known malfunctions from the Longitudinal Medical Record (LMR) and Epic® (Epic Systems Corporation, Madison, WI, USA) at Brigham and Women's Hospital, Boston, MA. The 4 rules recommend lead testing in children, aspirin therapy in patients with coronary artery disease, pneumococcal vaccination in immunocompromised adults and thyroid testing in patients taking amiodarone. Poisson changepoint, ARIMA, HDC, Bayesian changepoint and the SHESD model were able to detect anomalies in an alert for lead screening in children and in an alert for pneumococcal conjugate vaccine in immunocompromised adults. EDM was able to detect anomalies in an alert for monitoring thyroid function in patients on amiodarone. Malfunctions/anomalies occur frequently in CDS alert systems. It is important to be able to detect such anomalies promptly. Anomaly detection models are useful tools to aid such detections.
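Among the six models, the Poisson changepoint idea is the easiest to sketch: choose the split of a daily alert-count series that maximises the two-segment Poisson profile likelihood, and report the log-likelihood ratio against the no-change model. This single-changepoint toy (constant terms cancel in the ratio) is a stand-in for, not a reproduction of, the evaluated implementation.

```python
import math

def poisson_changepoint(counts):
    """Maximum-likelihood single changepoint for Poisson counts.
    Returns (tau, llr): counts[:tau] | counts[tau:] is the best split and
    llr is the log-likelihood ratio versus the no-change model."""
    n = len(counts)

    def seg_ll(c):               # profile log-likelihood, MLE rate = sum/len
        s = sum(c)
        return s * math.log(s / len(c)) - s if s > 0 else 0.0

    null = seg_ll(counts)
    best_tau, best_ll = None, -float("inf")
    for tau in range(1, n):
        ll = seg_ll(counts[:tau]) + seg_ll(counts[tau:])
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau, best_ll - null
```

A large ratio at the best split flags a sudden change in alert firing rate, the signature of many CDS malfunctions.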
Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.
Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen
2015-11-01
Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented a variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is embedded in the EUROCAT Central Database for annual application to centrally held registry data. The methodology was incrementally adapted to improve its utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating date of conception as the unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting of multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies.
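A toy version of the case-defined scan window reads as follows: slide a window of k consecutive cases along the time axis (estimated dates of conception, in EUROCAT's case), take the shortest span, and estimate significance by scattering the same number of cases uniformly under the null. The lookup-table replacement for Monte Carlo and the other adaptations described above are deliberately omitted; this is an assumption-laden sketch.

```python
import random

def min_window_span(times, k):
    """Shortest time span covering k consecutive cases; the number of
    cases, not a fixed duration, defines the scan window."""
    ts = sorted(times)
    return min(ts[i + k - 1] - ts[i] for i in range(len(ts) - k + 1))

def scan_p_value(times, k, t_max, n_sim=2000, seed=1):
    """Monte Carlo p-value: how often do k of n uniformly scattered cases
    fall in a span as short as the one observed?"""
    obs = min_window_span(times, k)
    rng = random.Random(seed)
    n, hits = len(times), 0
    for _ in range(n_sim):
        sim = [rng.uniform(0, t_max) for _ in range(n)]
        if min_window_span(sim, k) <= obs:
            hits += 1
    return (hits + 1) / (n_sim + 1)       # add-one to avoid a zero p-value
```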
NASA Astrophysics Data System (ADS)
Polonsky, Alexander B.; Basharin, Dmitry V.
2017-04-01
The aim of this paper is to study the interannual climate variability over the Mediterranean region related to the Indian Ocean Dipole (IOD), using re-analysis data, archival data and specialized numerical experiments. It is shown that the IOD does not substantially affect the anomalies of surface air temperature (SAT) and sea level pressure (SLP) in the Mediterranean region. On average, the IOD-induced share of the SAT/SLP variance in the total variance of these fields in the Mediterranean region is smaller than 10%, even in summer when it is at a maximum. However, statistically significant IOD-induced SAT/SLP anomalies in the Mediterranean region are detectable. For particular IOD events the associated Mediterranean SAT anomalies can reach about 1 °C.
NASA Technical Reports Server (NTRS)
Francis, P. W.; Rothery, D. A.
1987-01-01
The Landsat Thematic Mapper (TM) offers a means of detecting and monitoring thermal features of active volcanoes. Using the TM, a prominent thermal anomaly has been discovered on Lascar volcano, northern Chile. Data from two short-wavelength infrared channels of the TM show that material within a 300-m-diameter pit crater was at a temperature of at least 380 C on two dates in 1985. The thermal anomaly closely resembles in size and radiant temperature the anomaly over the active lava lake at Erta'ale in Ethiopia. An eruption took place at Lascar on Sept. 16, 1986. TM data acquired on Oct. 27, 1986, revealed significant changes within the crater area. Lascar is in a much more active state than any other volcano in the central Andes, and for this reason it merits further careful monitoring. Studies show that the TM is capable of confidently identifying thermal anomalies less than 100 m in size, at temperatures of above 150 C, and thus it offers a valuable means of monitoring the conditions of active or potentially active volcanoes, particularly those in remote regions.
Linear dynamical modes as new variables for data-driven ENSO forecast
NASA Astrophysics Data System (ADS)
Gavrilov, Andrey; Seleznev, Aleksei; Mukhin, Dmitry; Loskutov, Evgeny; Feigin, Alexander; Kurths, Juergen
2018-05-01
A new data-driven model for analysis and prediction of spatially distributed time series is proposed. The model is based on a linear dynamical mode (LDM) decomposition of the observed data which is derived from a recently developed nonlinear dimensionality reduction approach. The key point of this approach is its ability to take into account simple dynamical properties of the observed system by means of revealing the system's dominant time scales. The LDMs are used as new variables for empirical construction of a nonlinear stochastic evolution operator. The method is applied to the sea surface temperature anomaly field in the tropical belt where the El Nino Southern Oscillation (ENSO) is the main mode of variability. The advantage of LDMs versus traditionally used empirical orthogonal function decomposition is demonstrated for this data. Specifically, it is shown that the new model has a competitive ENSO forecast skill in comparison with the other existing ENSO models.
Stratification on the Skagit Bay Tidal Flats
2012-09-01
...and wind-driven currents can affect the potential energy anomaly balance in estuaries and ROFIs during storms (Yang and Khangaonkar, 2009)... The Potential Energy Anomaly Balance... turbulent energy is dissipated by destabilizing the fluid rather than by slowing the upper water column (Turner, 1973). Overall, stratification tends to...
NASA Technical Reports Server (NTRS)
Hegyi, Bradley M.; Taylor, Patrick C.
2017-01-01
The impact of the Arctic Oscillation (AO) and Arctic Dipole (AD) on the radiative flux into the Arctic mean atmospheric column is quantified. 3-month-averaged AO and AD indices are regressed with corresponding surface and top-of-atmosphere (TOA) fluxes from the CERES-SFC and CERES-TOA EBAF datasets over the period 2000-2014. An increase in clear-sky fluxes into the Arctic mean atmospheric column during fall is the largest net flux anomaly associated with AO, primarily driven by a positive net longwave flux anomaly (i.e. increase of net flux into the atmospheric column) at the surface. A decrease in the Arctic mean atmospheric column cloud radiative effect during winter and spring is the largest flux anomaly associated with AD, primarily driven by a change in the longwave cloud radiative effect at the surface. These prominent responses to AO and AD are widely distributed across the ice-covered Arctic, suggesting that the physical process or processes that bring about the flux change associated with AO and AD are distributed throughout the Arctic.
Liu, Datong; Peng, Yu; Peng, Xiyuan
2018-01-01
Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide an uncertainty representation, probability prediction methods (e.g., Gaussian process regression (GPR) and the relevance vector machine (RVM)) are especially adaptable to anomaly detection for sensing series. Generally, one key parameter of prediction models is the coverage probability (CP), which controls the judging threshold of the testing sample and is usually set to a default value (e.g., 90% or 95%). There are few criteria for determining the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator, the receiver operating characteristic curve of the prediction interval (ROC-PI), based on the definition of the ROC curve, which depicts the trade-off between PI width and PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs; minimizing it with the simulated annealing (SA) algorithm yields the optimal CP. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. In particular, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application. PMID:29587372
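The CP-selection idea can be sketched with the classical (unmodified) Youden index on standardised prediction residuals: for each candidate coverage probability, flag points falling outside the central Gaussian interval, then score sensitivity and specificity. A simple grid search stands in for the paper's simulated annealing optimiser, and the textbook index replaces the modified one; both substitutions are assumptions.

```python
from statistics import NormalDist

def youden_for_cp(residuals, labels, cp):
    """residuals: standardised prediction errors; labels: 1 = anomaly.
    A point is flagged when it falls outside the central cp interval."""
    z = NormalDist().inv_cdf(0.5 + cp / 2)   # two-sided Gaussian quantile
    flags = [abs(r) > z for r in residuals]
    tp = sum(f and l for f, l in zip(flags, labels))
    fp = sum(f and not l for f, l in zip(flags, labels))
    pos = sum(labels)
    neg = len(labels) - pos
    sens = tp / pos if pos else 0.0
    spec = 1 - fp / neg if neg else 0.0
    return sens + spec - 1                   # Youden's J

def best_cp(residuals, labels, grid=(0.80, 0.85, 0.90, 0.95, 0.99)):
    """Pick the coverage probability with the best sensitivity/specificity trade-off."""
    return max(grid, key=lambda cp: youden_for_cp(residuals, labels, cp))
```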
Meridional circulation and CNO anomalies in red giant stars
NASA Technical Reports Server (NTRS)
Sweigart, A. V.; Mengel, J. G.
1979-01-01
The possibility is investigated that meridional circulation driven by internal rotation might lead to the mixing of CNO-processed material from the vicinity of the hydrogen shell into the envelope of a red giant star. This theory of meridional mixing is found to be generally consistent with available data and to be capable of explaining a number of observational results without invoking a radical departure from the standard physics of stellar interiors. It is suggested that meridional circulation must be a normal characteristic of a rotating star and that meridional mixing provides a reasonable framework for understanding many of the CNO anomalies exhibited by weak-G-band and CN-strong stars as well as the low C-12/C-13 ratios measured among field red giants.
Improving detection of low SNR targets using moment-based detection
NASA Astrophysics Data System (ADS)
Young, Shannon R.; Steward, Bryan J.; Hawks, Michael; Gross, Kevin C.
2016-05-01
Increases in the number of cameras deployed, frame rates, and detector array sizes have led to a dramatic increase in the volume of motion imagery data that is collected. Without a corresponding increase in analytical manpower, much of the data is not analyzed to full potential. This creates a need for fast, automated, and robust methods for detecting signals of interest. Current approaches fall into two categories: detect-before-track (DBT) methods, which are fast but often poor at detecting dim targets, and track-before-detect (TBD) methods, which can offer better performance but are typically much slower. This research seeks to contribute to the near-real-time detection of low SNR, unresolved moving targets through an extension of earlier work on higher-order moments anomaly detection, a method that exploits both spatial and temporal information but is still computationally efficient and massively parallelizable. It was found that intelligent selection of parameters can improve probability of detection by as much as 25% compared to earlier work with higher-order moments. The present method can reduce detection thresholds by 40% compared to the Reed-Xiaoli anomaly detector for low SNR targets (for a given probability of detection and false alarm).
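A minimal form of higher-order moments detection computes, for each pixel, a standardised temporal moment over a stack of frames; a brief transient target inflates the kurtosis even when it barely rises above the per-frame noise. Array shapes, the default threshold, and function names are illustrative assumptions, not the paper's parameterisation.

```python
import numpy as np

def moment_anomaly_map(cube, order=4):
    """Per-pixel standardised central moment over time for a frame stack
    `cube` of shape (T, H, W); order=3 gives skewness, order=4 kurtosis."""
    mu = cube.mean(axis=0)
    sd = cube.std(axis=0) + 1e-12            # guard against constant pixels
    z = (cube - mu) / sd
    return (z ** order).mean(axis=0)

def detect(cube, order=4, thresh=8.0):
    """Flag pixels whose temporal moment exceeds the threshold.
    (Gaussian noise gives kurtosis near 3, so 8 is a loose cut.)"""
    return moment_anomaly_map(cube, order) > thresh
```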
Drivers of Arctic Ocean warming in CMIP5 models
NASA Astrophysics Data System (ADS)
Burgard, Clara; Notz, Dirk
2017-05-01
We investigate changes in the Arctic Ocean energy budget simulated by 26 general circulation models from the Coupled Model Intercomparison Project Phase 5 framework. Our goal is to understand whether the Arctic Ocean warming between 1961 and 2099 is primarily driven by changes in the net atmospheric surface flux or by changes in the meridional oceanic heat flux. We find that the simulated Arctic Ocean warming is driven by positive anomalies in the net atmospheric surface flux in 11 models, by positive anomalies in the meridional oceanic heat flux in 11 models, and by positive anomalies in both energy fluxes in four models. The different behaviors are mainly characterized by the different changes in meridional oceanic heat flux that lead to different changes in the turbulent heat loss to the atmosphere. The multimodel ensemble mean is hence not representative of a consensus across the models in Arctic climate projections.
A dilation-driven vortex flow in sheared granular materials explains a rheometric anomaly.
Krishnaraj, K P; Nott, Prabhu R
2016-02-11
Granular flows occur widely in nature and industry, yet a continuum description that captures their important features is not yet at hand. Recent experiments on granular materials sheared in a cylindrical Couette device revealed a puzzling anomaly, wherein all components of the stress rise nearly exponentially with depth. Here we show, using particle dynamics simulations and imaging experiments, that the stress anomaly arises from a remarkable vortex flow. For the entire range of fill heights explored, we observe a single toroidal vortex that spans the entire Couette cell and whose sense is opposite to that of the uppermost Taylor vortex in a fluid. We show that the vortex is driven by a combination of shear-induced dilation, a phenomenon that has no analogue in fluids, and gravity flow. Dilatancy is an important feature of granular mechanics, but is not adequately incorporated in existing models.
Automated Network Anomaly Detection with Learning, Control and Mitigation
ERIC Educational Resources Information Center
Ippoliti, Dennis
2014-01-01
Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…
Systematic Screening for Subtelomeric Anomalies in a Clinical Sample of Autism
ERIC Educational Resources Information Center
Wassink, Thomas H.; Losh, Molly; Piven, Joseph; Sheffield, Val C.; Ashley, Elizabeth; Westin, Erik R.; Patil, Shivanand R.
2007-01-01
High-resolution karyotyping detects cytogenetic anomalies in 5-10% of cases of autism. Karyotyping, however, may fail to detect abnormalities of chromosome subtelomeres, which are gene rich regions prone to anomalies. We assessed whether panels of FISH probes targeted for subtelomeres could detect abnormalities beyond those identified by…
The Low-Frequency Variability of the Tropical Atlantic Ocean
NASA Technical Reports Server (NTRS)
Haekkinen, Sirpa; Mo, Kingtse C.; Koblinsky, Chester J. (Technical Monitor)
2001-01-01
Upper ocean temperature variability in the tropical Atlantic is examined from the Comprehensive Ocean Atmosphere Data Set (COADS) as well as from an ocean model simulation forced by COADS anomalies appended to a monthly climatology. Our findings are as follows: Only the sea surface temperatures (SST) in the northern tropics are driven by heat fluxes, while the southern tropical variability arises from wind-driven ocean circulation changes. The subsurface temperatures in the northern and southern tropics are found to have a strong linkage to buoyancy forcing changes in the northern North Atlantic. Evidence for Kelvin-like boundary wave propagation from the high latitudes is presented from the model simulation. This extratropical influence is associated with wintertime North Atlantic Oscillation (NAO) forcing and manifests itself in northern and southern tropical temperature anomalies of the same sign at depths of 100-200 meters, as a result of a Rossby wave propagating away from the eastern boundary in the wake of the boundary wave passage. The most apparent association of the southern tropical sea surface temperature anomalies (STA) is with the anomalous cross-equatorial winds, which can be related to both NAO and the remote influence from the Pacific equatorial region. These teleconnections are seasonal, so that the NAO impact on the tropical SST is the largest in mid-winter, but in spring and early summer the Pacific remote influence competes with NAO. However, NAO appears to have a more substantial role than the Pacific influence at low frequencies during the last 50 years. The dynamic origin of STA is indirectly confirmed from the SST-heat flux relationship using ocean model experiments which remove either anomalous wind stress forcing or atmospheric forcing anomalies contributing to heat exchange.
NASA Astrophysics Data System (ADS)
Dutheil, J. Ph.; Langel, G.
2003-08-01
ARIANE 5 experienced a flight anomaly on its 10th mission (flight 510), placing both of its satellites in a lower orbit than the planned GTO. Only one satellite (Artemis) could be recovered, thanks to its own propulsion system. Arianespace, CNES and Astrium-GmbH (formerly DaimlerChrysler Aerospace Dasa) immediately set up a recovery team, combining forces to carry out deep, schedule-driven investigations and later to qualify recovery measures. A failure in such an important program immediately triggers a large "post-shock" reaction from the ARIANE community involved in the relevant business and technology. The investigation fields are summarised in the following chapters, showing how failure analysis, engineering investigations and basic research were combined into a methodical and schedule-efficient approach. All available European resources in space vehicle design were brought together, involving industry, agency technical centers and research laboratories. The investigation methodology was driven by the particular situation of a flight anomaly investigation, which has to take into account the reduced amount of measurement available in flight and its necessary combination with ground test data to build a strategy for identifying possible failure scenarios. From these investigations and from extensive sensitivity characterisation tests of EPS engine (AESTUS) ignition transients, stability margins were investigated in depth and introduced into the post-anomaly upgraded stage design.
The identification and implementation of recovery measures, extended as well to potential ignition-margin reduction factors even beyond the observed flight anomaly, allowed a robust complementary qualification status to be established, and thus operational flights to resume, starting with the valuable Envisat payload of the European Space Agency, dedicated to Earth and climate monitoring, on flight 511, on 28/02/2002, from the Kourou Spaceport.
Early dynamics of white matter deficits in children developing dyslexia.
Vanderauwera, Jolijn; Wouters, Jan; Vandermosten, Maaike; Ghesquière, Pol
2017-10-01
Neural anomalies have been demonstrated in dyslexia. Recent studies in pre-readers at risk for dyslexia and in pre-readers developing poor reading suggest that these anomalies might be a cause of their reading impairment. Our study goes one step further by exploring the neurodevelopmental trajectory of white matter anomalies in pre-readers with and without a familial risk for dyslexia (n=61), of whom a strictly selected sample developed dyslexia later on (n=15). We collected longitudinal diffusion MRI and behavioural data until grade 3. The results provide evidence that children with dyslexia exhibit pre-reading white matter anomalies in the left and right long segments of the arcuate fasciculus (AF), with predictive power of the left segment above traditional cognitive measures and familial risk. Whereas white matter differences in the left AF seem most strongly related to the development of dyslexia, differences in the left IFOF and in the right AF seem driven by both familial risk and later reading ability. Moreover, differences in the left AF appeared to be dynamic. This study supports and expands recent insights into the neural basis of dyslexia, pointing towards pre-reading anomalies related to dyslexia, as well as underpinning the dynamic character of white matter. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Buntine, Wray L.; Kraft, Richard; Whitaker, Kevin; Cooper, Anita E.; Powers, W. T.; Wallace, Tim L.
1993-06-01
Data obtained in the framework of an Optical Plume Anomaly Detection (OPAD) program intended to create a rocket engine health monitor based on spectrometric detections of anomalous atomic and molecular species in the exhaust plume are analyzed. The major results include techniques for handling data noise, methods for registration of spectra to wavelength, and a simple automatic process for estimating the metallic component of a spectrum.
NASA Technical Reports Server (NTRS)
Gage, Mark; Dehoff, Ronald
1991-01-01
This system architecture task (1) analyzed the current process used to make an assessment of engine and component health after each test or flight firing of an SSME, (2) developed an approach and a specific set of objectives and requirements for automated diagnostics during post fire health assessment, and (3) listed and described the software applications required to implement this system. The diagnostic system described is a distributed system with a database management system to store diagnostic information and test data, a CAE package for visual data analysis and preparation of plots of hot-fire data, a set of procedural applications for routine anomaly detection, and an expert system for the advanced anomaly detection and evaluation.
GDPC: Gravitation-based Density Peaks Clustering algorithm
NASA Astrophysics Data System (ADS)
Jiang, Jianhua; Hao, Dehao; Chen, Yujun; Parmar, Milan; Li, Keqin
2018-07-01
The Density Peaks Clustering algorithm, which we refer to as DPC, is a novel and efficient density-based clustering approach published in Science in 2014. DPC has the advantages of discovering clusters with varying sizes and varying densities, but has limitations in detecting the number of clusters and in identifying anomalies. We develop an enhanced algorithm with an alternative decision graph, based on gravitation theory and nearby distance, to identify centroids and anomalies accurately. We apply our method to several UCI and synthetic data sets. We report comparative clustering performances using F-Measure and 2-dimensional visualization. We also compare our method to other clustering algorithms, such as K-Means, Affinity Propagation (AP) and DPC. We present F-Measure scores and clustering accuracies of our GDPC algorithm compared to K-Means, AP and DPC on different data sets. We show that GDPC has superior performance in its capability to: (1) reliably detect the number of clusters; (2) efficiently aggregate clusters with varying sizes and densities; (3) accurately identify anomalies.
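The decision graph at the heart of DPC (and of the GDPC variant) plots, for every point, a local density rho against the distance delta to the nearest denser point: cluster centres show large rho and large delta, while anomalies show small rho but large delta. A minimal sketch with a cut-off kernel follows; the gravitation-based modification is deliberately left out, and the cut-off distance `dc` is an illustrative assumption.

```python
import numpy as np

def decision_graph(X, dc=1.0):
    """Compute the two DPC decision-graph axes: local density rho
    (cut-off kernel of radius dc) and delta, the distance to the
    nearest point of higher density."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    rho = (D < dc).sum(axis=1) - 1          # exclude the point itself
    order = np.argsort(-rho)                # densest first; ties broken by index
    delta = np.empty(len(X))
    delta[order[0]] = D[order[0]].max()     # convention for the global density peak
    for j in range(1, len(order)):
        i = order[j]
        delta[i] = D[i, order[:j]].min()    # nearest already-placed (denser) point
    return rho, delta
```

Points with near-zero rho but large delta sit alone in the decision graph, which is exactly how anomalies are read off.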
Precision global health in the digital age.
Flahault, Antoine; Geissbuhler, Antoine; Guessous, Idris; Guérin, Philippe; Bolon, Isabelle; Salathé, Marcel; Escher, Gérard
2017-04-19
Precision global health is an approach similar to precision medicine, which facilitates, through innovation and technology, better targeting of public health interventions on a global scale, for the purpose of maximising their effectiveness and relevance. Illustrative examples include: the use of remote sensing data to fight vector-borne diseases; large databases of genomic sequences of foodborne pathogens helping to identify origins of outbreaks; social networks and internet search engines for tracking communicable diseases; cell phone data in humanitarian actions; drones to deliver healthcare services in remote and secluded areas. Open science and data sharing platforms are proposed for fostering international research programmes under fair, ethical and respectful conditions. Innovative education, such as massive open online courses or serious games, can promote wider access to training in public health and improving health literacy. The world is moving towards learning healthcare systems. Professionals are equipped with data collection and decision support devices. They share information, which are complemented by external sources, and analysed in real time using machine learning techniques. They allow for the early detection of anomalies, and eventually guide appropriate public health interventions. This article shows how information-driven approaches, enabled by digital technologies, can help improving global health with greater equity.
Remote Structural Health Monitoring and Advanced Prognostics of Wind Turbines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Douglas Brown; Bernard Laskowski
The prospect of substantial investment in wind energy generation represents a significant capital investment strategy. In order to maximize the life-cycle of wind turbines, associated rotors, gears, and structural towers, a capability to detect and predict (prognostics) the onset of mechanical faults at a sufficiently early stage for maintenance actions to be planned would significantly reduce both maintenance and operational costs. Advancement towards this effort has been made through the development of anomaly detection, fault detection and fault diagnosis routines to identify selected fault modes of a wind turbine based on available sensor data preceding an unscheduled emergency shutdown. The anomaly detection approach employs spectral techniques to find an approximation of the data using a combination of attributes that capture the bulk of variability in the data. Fault detection and diagnosis (FDD) is performed using a neural network-based classifier trained from baseline and fault data recorded during known failure conditions. The approach has been evaluated for known baseline conditions and three selected failure modes: pitch rate failure, low oil pressure failure and a gearbox gear-tooth failure. Experimental results demonstrate the approach can distinguish between these failure modes and normal baseline behavior within a specified statistical accuracy.
Wulsin, D. F.; Gupta, J. R.; Mani, R.; Blanco, J. A.; Litt, B.
2011-01-01
Clinical electroencephalography (EEG) records vast amounts of complex human data yet is still reviewed primarily by human readers. Deep Belief Nets (DBNs) are a relatively new type of multi-layer neural network commonly tested on two-dimensional image data, but are rarely applied to time-series data such as EEG. We apply DBNs in a semi-supervised paradigm to model EEG waveforms for classification and anomaly detection. DBN performance was comparable to standard classifiers on our EEG dataset, and classification time was found to be 1.7 to 103.7 times faster than the other high-performing classifiers. We demonstrate how the unsupervised step of DBN learning produces an autoencoder that can naturally be used in anomaly measurement. We compare the use of raw, unprocessed data—a rarity in automated physiological waveform analysis—to hand-chosen features and find that raw data produces comparable classification and better anomaly measurement performance. These results indicate that DBNs and raw data inputs may be more effective for online automated EEG waveform recognition than other common techniques. PMID:21525569
A Load-Based Temperature Prediction Model for Anomaly Detection
NASA Astrophysics Data System (ADS)
Sobhani, Masoud
Electric load forecasting, as a basic requirement for decision-making in power utilities, has been improved in various aspects over the past decades. Many factors may affect the accuracy of the load forecasts, such as data quality, goodness of the underlying model and load composition. Due to the strong correlation between the input variables (e.g., weather and calendar variables) and the load, the quality of input data plays a vital role in forecasting practice. Even if the forecasting model were able to capture most of the salient features of the load, low-quality input data may result in inaccurate forecasts. Most of the data cleansing efforts in the load forecasting literature have been devoted to the load data; few studies have focused on weather data cleansing for load forecasting. This research proposes an anomaly detection method for temperature data. The method consists of two components: a load-based temperature prediction model and a detection technique. The effectiveness of the proposed method is demonstrated through two case studies: one based on the data from the Global Energy Forecasting Competition 2014, and the other based on the data published by ISO New England. The results show that by removing the detected observations from the original input data, the final load forecast accuracy is enhanced.
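The two-component idea (a load-based temperature prediction model plus a detection rule) can be sketched minimally. The linear temperature-versus-load regression and 3-sigma residual gate below are illustrative stand-ins, not the paper's actual model or threshold:

```python
import numpy as np

def detect_temperature_anomalies(load, temp, k=3.0):
    """Fit a simple linear model temp ~ load (a stand-in for a load-based
    temperature prediction model) and flag observations whose residual
    exceeds k standard deviations of the residuals."""
    A = np.column_stack([np.ones_like(load), load])
    coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
    resid = temp - A @ coef
    return np.abs(resid) > k * resid.std()

rng = np.random.default_rng(1)
load = rng.uniform(50, 150, size=500)              # MW, synthetic
temp = 0.2 * load + rng.normal(scale=1.0, size=500)  # synthetic relation
temp[100] += 25.0                                  # corrupted sensor reading
flags = detect_temperature_anomalies(load, temp)
print(flags[100], int(flags.sum()))
```

Observations flagged this way would then be removed (or corrected) before the temperature series is fed back into the load forecasting model.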
Topological anomaly detection performance with multispectral polarimetric imagery
NASA Astrophysics Data System (ADS)
Gartley, M. G.; Basener, W.
2009-05-01
Polarimetric imaging has demonstrated utility for increasing the contrast of manmade targets above natural background clutter. Manual detection of manmade targets in multispectral polarimetric imagery can be a challenging and subjective process for large datasets. Analyst exploitation may be improved by utilizing conventional anomaly detection algorithms such as RX. In this paper we examine the performance of a relatively new approach to anomaly detection, which leverages topology theory, applied to spectral polarimetric imagery. Detection results for manmade targets embedded in a complex natural background are presented for both the RX and Topological Anomaly Detection (TAD) approaches. We also present detailed results examining detection sensitivities relative to: (1) the number of spectral bands, (2) utilization of Stokes images versus intensity images, and (3) airborne versus spaceborne measurements.
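The RX baseline mentioned above scores each pixel by the Mahalanobis distance of its spectral vector from the scene's background statistics. A minimal global-RX sketch follows; the synthetic scene, band count, and regularization constant are assumptions for illustration:

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly detector: Mahalanobis distance of each pixel's
    spectral vector from the scene mean. `cube` is (rows, cols, bands)."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(bands))  # regularize
    d = X - mu
    # Quadratic form d_i^T C^{-1} d_i for every pixel i at once.
    scores = np.einsum('ij,jk,ik->i', d, cov_inv, d)
    return scores.reshape(rows, cols)

rng = np.random.default_rng(2)
scene = rng.normal(size=(32, 32, 6))   # background clutter, 6 bands
scene[10, 10] += 8.0                   # spectral outlier ("manmade target")
scores = rx_scores(scene)
print(np.unravel_index(scores.argmax(), scores.shape))
```

TAD replaces this covariance-based distance with a graph/topology-based background model, which is harder to compress into a few lines; the RX sketch is the point of comparison.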
Wylie, B.K.; Zhang, L.; Bliss, Norman B.; Ji, Lei; Tieszen, Larry L.; Jolly, W. M.
2008-01-01
High-latitude ecosystems are exposed to more pronounced warming effects than other parts of the globe. We develop a technique to monitor ecological changes in a way that distinguishes climate influences from disturbances. In this study, we account for climatic influences on Alaskan boreal forest performance with a data-driven model. We defined ecosystem performance anomalies (EPA) using the residuals of the model and made annual maps of EPA. Most areas (88%) did not have anomalous ecosystem performance for at least 6 of 8 years between 1996 and 2004. Areas with underperforming EPA (10%) often indicate areas associated with recent fires and areas of possible insect infestation or drying soil related to permafrost degradation. Overperforming areas (2%) occurred in older fire recovery areas where increased deciduous vegetation components are expected. The EPA measure was validated with composite burn index data and Landsat vegetation indices near and within burned areas.
Detecting anomalies in CMB maps: a new method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neelakanta, Jayanth T., E-mail: jayanthtn@gmail.com
2015-10-01
Ever since WMAP announced its first results, different analyses have shown weak evidence for several large-scale anomalies in the CMB data. While the evidence for each individual anomaly appears to be weak, the fact that there are multiple seemingly unrelated anomalies makes it difficult to account for them via a single statistical fluke, so one is led to consider a combination of these anomalies. But if we "hand-pick" the anomalies (test statistics) to consider, we are making an a posteriori choice. In this article, we propose two statistics that do not suffer from this problem. The statistics are linear and quadratic combinations of the a_{ℓm}'s with random coefficients, and they test the null hypothesis that the a_{ℓm}'s are independent, normally-distributed, zero-mean random variables with an m-independent variance. The motivation for considering multiple modes is this: because most physical models that lead to large-scale anomalies couple multiple ℓ and m modes, the "coherence" of this coupling should be enhanced when a combination of different modes is considered. In this sense, the statistics are much more generic than those hitherto considered in the literature. Using fiducial data, we demonstrate that the method works, and we discuss how it can be used with actual CMB data to make quite general statements about the incompatibility of the data with the null hypothesis.
NASA Astrophysics Data System (ADS)
Barbarella, M.; De Giglio, M.; Galeandro, A.; Mancini, F.
2012-04-01
The modification of some atmospheric physical properties prior to a high-magnitude earthquake has recently been debated within the Lithosphere-Atmosphere-Ionosphere (LAI) Coupling model. Among this variety of phenomena, the ionization of air in the upper level of the atmosphere, the ionosphere, is investigated in this work. Such ionization could be caused by gases leaking from the Earth's crust, and its presence has been detected around the time of high-magnitude earthquakes by several authors. However, the spatial scale and temporal domain over which such disturbances become evident are still controversial. Even though ionospheric activity can be investigated by different methodologies (satellite or terrestrial measurements), we selected the production of ionospheric maps from the analysis of GNSS (Global Navigation Satellite System) data as a possible way to detect anomalies prior to a seismic event over a wide area around the epicentre. It is well known in the GNSS sciences that ionospheric activity can be probed by analyzing the refraction experienced by the dual-frequency signals along the satellite-to-receiver path. The analysis of refraction phenomena affecting data acquired by GNSS permanent trackers can produce daily to hourly maps representing the spatial distribution of the ionospheric Total Electron Content (TEC) as an index of the degree of ionization in the upper atmosphere. The presence of large ionospheric anomalies could therefore be interpreted, within the LAI Coupling model, as a precursor signal of a strong earthquake, especially when other precursors (thermal anomalies and/or gas fluxes) are also detected.
In this work, a six-month series of ionospheric maps produced from GNSS data collected by a network of 49 GPS permanent stations distributed around the city of L'Aquila (Abruzzi, Italy), where an earthquake (M = 6.3) occurred on April 6, 2009, was investigated. The proposed methodology performs a time series analysis of the TEC maps to define the spatial and temporal domains of ionospheric disturbances. This goal was achieved by comparing the local pattern of ionospheric activity with its historical mean value and detecting areas where the TEC exhibits anomalous values. This data processing reveals some 1- to 2-day-long anomalies about 20 days before the seismic event (also confirming results obtained in recent studies by means of ionospheric soundings).
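The map-level procedure (compare each cell of the current TEC map against its historical per-cell statistics and flag large departures) can be sketched as a per-cell z-score test. The 3-sigma threshold and synthetic map dimensions below are assumptions, not the authors' actual settings:

```python
import numpy as np

def tec_anomaly_map(maps, k=3.0):
    """Flag grid cells in the latest TEC map whose value deviates from the
    historical per-cell mean by more than k historical standard deviations.
    `maps` is (time, lat, lon); the last slice is the map under test."""
    history, latest = maps[:-1], maps[-1]
    mu = history.mean(axis=0)
    sigma = history.std(axis=0)
    z = (latest - mu) / np.where(sigma > 0, sigma, 1.0)
    return np.abs(z) > k

rng = np.random.default_rng(3)
# Six months of daily synthetic TEC maps on a coarse lat/lon grid.
maps = 20 + rng.normal(scale=0.5, size=(180, 24, 36))
maps[-1, 5, 7] += 6.0          # localized TEC enhancement in the last map
flags = tec_anomaly_map(maps)
print(flags[5, 7])
```

Running the same test over each day in the series, rather than just the last map, yields the kind of space-time anomaly domains discussed in the abstract.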
Algorithms for Spectral Decomposition with Applications to Optical Plume Anomaly Detection
NASA Technical Reports Server (NTRS)
Srivastava, Ashok N.; Matthews, Bryan; Das, Santanu
2008-01-01
The analysis of spectral signals for features that represent physical phenomena is ubiquitous in the science and engineering communities. There are two main approaches to extracting relevant features from these high-dimensional data streams. The first relies on a physics-based paradigm, in which the underlying physical mechanism that generates the spectra is used to infer the most important features in the data stream. We focus on a complementary methodology: a data-driven technique that is informed by the underlying physics but also has the ability to adapt to unmodeled system attributes and dynamics. We discuss four algorithms: the Spectral Decomposition Algorithm (SDA), Non-Negative Matrix Factorization (NMF), Independent Component Analysis (ICA) and Principal Components Analysis (PCA), and compare their performance on a spectral emulator, which we use to generate artificial data with known statistical properties. The emulator mimics the real-world phenomena arising from the plume of the Space Shuttle Main Engine; it can be used to validate the results of various spectral decomposition algorithms and is very useful for situations where real-world systems have very low probabilities of fault or failure. Our results indicate that methods like SDA and NMF provide a straightforward way of incorporating prior physical knowledge, while NMF with a tuning mechanism can give superior performance on some tests. We demonstrate these algorithms by detecting potential system-health issues in data from a spectral emulator with tunable health parameters.
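Of the four algorithms compared, NMF is compact enough to sketch from scratch. The classic Lee-Seung multiplicative updates below are a generic formulation of V ≈ WH with non-negative factors, not the tuned variant the paper evaluates, and the synthetic "spectra" are invented for illustration:

```python
import numpy as np

def nmf(V, rank, iters=500, seed=0):
    """Non-negative matrix factorization via Lee-Seung multiplicative
    updates: V ≈ W @ H with W, H >= 0, minimizing the Frobenius error."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.uniform(0.1, 1.0, size=(n, rank))
    H = rng.uniform(0.1, 1.0, size=(rank, m))
    eps = 1e-9  # avoid division by zero
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Synthetic "spectra": every row is a non-negative mix of 3 basis spectra,
# mimicking emission lines contributed by distinct physical sources.
rng = np.random.default_rng(4)
basis = rng.uniform(size=(3, 50))
V = rng.uniform(size=(40, 3)) @ basis
W, H = nmf(V, rank=3)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(rel_err)
```

The non-negativity of W and H is what makes the factors physically interpretable for spectra (intensities and mixing weights cannot be negative), which is the property the abstract highlights for incorporating prior physical knowledge.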
Inflight and Preflight Detection of Pitot Tube Anomalies
NASA Technical Reports Server (NTRS)
Mitchell, Darrell W.
2014-01-01
The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates sensors' response time to the dynamic component (noise) found in the signal of those same sensors. This analysis technique uses the existing electrical signals of Pitot tube sensors, resulting from measured processes in flight and/or induced signals in preflight conditions, to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants; its application to the detection of aircraft anomalies is innovative. Instead of determining the health of process monitoring at a steady-state condition, this technology will be used to quickly inform the pilot when an air-speed indication becomes faulty under any flight condition, as well as during preflight preparation.
Ranking Causal Anomalies via Temporal and Dynamical Analysis on Vanishing Correlations.
Cheng, Wei; Zhang, Kai; Chen, Haifeng; Jiang, Guofei; Chen, Zhengzhang; Wang, Wei
2016-08-01
The modern world has witnessed a dramatic increase in our ability to collect, transmit and distribute real-time monitoring and surveillance data from large-scale information systems and cyber-physical systems. Detecting system anomalies thus attracts significant interest in fields such as security, fault management, and industrial optimization. Recently, the invariant network has been shown to be a powerful way of characterizing complex system behaviours. In an invariant network, a node represents a system component and an edge indicates a stable, significant interaction between two components. The structure and evolution of the invariant network, in particular the vanishing correlations, can shed important light on locating causal anomalies and performing diagnosis. However, existing approaches to detecting causal anomalies with the invariant network often use the percentage of vanishing correlations to rank possible causal components, which has several limitations: 1) fault propagation in the network is ignored; 2) the root causal anomalies may not always be the nodes with a high percentage of vanishing correlations; 3) temporal patterns of vanishing correlations are not exploited for robust detection. To address these limitations, we propose a network-diffusion-based framework to identify significant causal anomalies and rank them. Our approach can effectively model fault propagation over the entire invariant network and can perform joint inference on both the structural and the time-evolving broken invariance patterns. As a result, it can locate high-confidence anomalies that are truly responsible for the vanishing correlations and can compensate for unstructured measurement noise in the system. Extensive experiments on synthetic datasets, bank information system datasets, and coal plant cyber-physical system datasets demonstrate the effectiveness of our approach.
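The diffusion idea (rank root causes by propagating the broken-invariance signal over the network rather than by raw vanishing-correlation percentages) can be illustrated with a random-walk-with-restart sketch on a toy invariant network. This is a simplified analogue of the approach, not the paper's joint-inference model, and the graph and scores are invented:

```python
import numpy as np

def rank_by_propagation(adj, broken, restart=0.5, iters=100):
    """Rank candidate root-cause nodes by diffusing the observed
    'broken invariance' signal over the network. `adj` is a symmetric
    adjacency matrix; `broken` is the fraction of vanishing correlations
    observed at each node."""
    deg = adj.sum(axis=1)
    P = adj / np.where(deg > 0, deg, 1.0)[:, None]   # row-stochastic walk
    seed = broken / broken.sum()
    r = seed.copy()
    for _ in range(iters):
        # Restart at the observed signal, otherwise follow the network.
        r = restart * seed + (1 - restart) * P.T @ r
    return np.argsort(-r)   # node indices, most likely root cause first

# Toy invariant network: node 0 is the faulty component; its broken
# correlations propagate to neighbours 1 and 2 and weakly to node 3.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], float)
broken = np.array([0.9, 0.4, 0.4, 0.1])
print(rank_by_propagation(adj, broken))
```

The point of the propagation step is that a node with a moderate vanishing-correlation percentage can still outrank one with a higher percentage if the network structure explains the latter's breakage as downstream effect.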
Ground Penetrating Radar Survey at Yoros Fortress, Istanbul
NASA Astrophysics Data System (ADS)
Kucukdemirci, M.; Yalçın, A. B.
2016-12-01
Geophysical methods are effective tools for detecting archaeological remains and materials hidden under the ground. One of the most frequently used methods for archaeological prospection is Ground Penetrating Radar (GPR). This paper describes a small-scale GPR survey, carried out during archaeological excavations, to locate buried archaeological features around the Yoros Fortress, on the shores of the Bosporus strait in Istanbul. The survey used a GSSI SIR 3000 system with a 400 MHz center-frequency bistatic antenna, configured with a 16-bit dynamic range and 512 samples per scan. The data were collected along parallel profiles at 0.50 m intervals in a zigzag configuration on the survey grids, and were processed with GPR-Slice V.7 (Ground Penetrating Radar Imaging Software). At shallow depths, some scattered anomalies were detected; these may relate to a small portion of archaeological ruins close to the surface. At deeper levels, the geometry of the anomalies related to possible archaeological ruins looks clearer: two horizontal, parallel anomalies oriented N-S were detected at a depth of 1.45 meters, possibly related to ancient channels.
Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie
2009-01-01
Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
A Comparison of the Age-Spectra from Data Assimilation Models
NASA Technical Reports Server (NTRS)
Schoeberl, Mark R.; Douglass, Anne R.; Zhu, Zheng-Xin; Pawson, Steven; Einaudi, Franco (Technical Monitor)
2002-01-01
We use kinematic and diabatic back trajectory calculations, driven by winds from a general circulation model (GCM) and two different data assimilation systems (DAS), to compute the age spectrum at three latitudes in the lower stratosphere. The age spectra are compared to chemical transport model (CTM) calculations, and the mean ages from all of these studies are compared to observations. The age spectra computed using the GCM winds show a reasonably well-isolated tropics, in good agreement with observations; however, the age spectra determined from the DAS differ from the GCM spectra. For the diabatic trajectory calculations, the age spectrum is too broad as a result of too much exchange between the tropics and mid-latitudes. The age spectrum determined using the kinematic trajectory calculation is less broad and lacks an age offset; both of these features are due to excessive vertical dispersion of parcels. The tropical and mid-latitude mean age difference between the diabatically and kinematically determined age spectra is about one year, the former being older. The CTM calculation of the age spectrum using the DAS winds shows the same dispersive characteristics as the kinematic trajectory calculation. These results suggest that the current DAS products will not give realistic trace gas distributions for long integrations; they also help explain why the mean ages determined in a number of previous DAS-driven CTMs are too young compared with observations. Finally, we note that trajectory-generated age spectra show significant age anomalies correlated with the seasonal cycles, and these anomalies can be linked to year-to-year variations in the tropical heating rate. These anomalies are suppressed in the CTM spectra, suggesting that the CTM transport is too diffusive.
Hu, Erzhong; Nosato, Hirokazu; Sakanashi, Hidenori; Murakawa, Masahiro
2013-01-01
Capsule endoscopy is a patient-friendly endoscopic technique widely used in gastrointestinal examination. However, the efficacy of diagnosis is restricted by the large quantity of images. This paper presents a modified anomaly detection method intended to detect both known and unknown anomalies in capsule endoscopy images of the small intestine. To achieve this goal, the paper introduces feature extraction using a non-linear color conversion and Higher-order Local Auto-Correlation (HLAC) features, and makes use of image partitioning and a subspace method for anomaly detection. Experiments were conducted on several major anomalies with combinations of the proposed techniques. As a result, the proposed method achieved 91.7% and 100% detection accuracy for swelling and bleeding respectively, demonstrating the effectiveness of the proposed method.
Flash Infrared Thermography Contrast Data Analysis Technique
NASA Technical Reports Server (NTRS)
Koshti, Ajay
2014-01-01
This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies; the thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width, corresponding to the depth and diameter of the equivalent flat-bottom hole (EFBH) used as input to the simulation. A similar analysis is also provided in terms of the diameter and depth of an equivalent uniform gap (EUG) giving the best match with the measured contrast evolution. An edge detection technique called the half-max is used to measure the width and length of the anomaly, and the half-max width is compared with the EFBH/EUG diameter to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
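The normalized-contrast quantity at the heart of the technique can be sketched as follows. The (pixel - reference)/reference normalization and the idealized cooling curves are assumptions for illustration; the paper's exact contrast definition is not reproduced here:

```python
import numpy as np

def normalized_contrast(pixel_ts, reference_ts):
    """Normalized flash-thermography contrast evolution: the difference
    between a pixel's cooling curve and a nearby defect-free reference,
    scaled by the reference (one common normalization)."""
    reference_ts = np.asarray(reference_ts, float)
    return (np.asarray(pixel_ts, float) - reference_ts) / reference_ts

# Synthetic cooling curves: a delamination traps heat, so the defect
# pixel cools more slowly than sound material after the flash.
t = np.linspace(0.1, 5.0, 50)              # seconds after flash
sound = 1.0 / np.sqrt(t)                   # idealized 1D cooling decay
defect = sound * (1 + 0.3 * (1 - np.exp(-t)))
c = normalized_contrast(defect, sound)
print(c.max(), c[0], c[-1])
```

Features of this contrast curve, such as its peak value and the time at which it develops, are the kind of thermal measurement features that get matched against calibrated flat-bottom-hole simulations to estimate anomaly depth and width.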
NASA Astrophysics Data System (ADS)
Gittinger, Jaxon M.; Jimenez, Edward S.; Holswade, Erica A.; Nunna, Rahul S.
2017-02-01
This work demonstrates a traditional and a non-traditional visualization of x-ray images for aviation security applications, made feasible by open system architecture initiatives such as the Open Threat Assessment Platform (OTAP). Anomalies of interest to aviation security are fluid: their characteristic signals can evolve rapidly. OTAP is a limited-scope, open-architecture baggage screening prototype intended to let third-party vendors develop and easily implement, integrate, and deploy detection algorithms and specialized hardware on a field-deployable screening technology [13]. In this study, stereoscopic images were created using an unmodified, field-deployed system and rendered on the Oculus Rift, a commercial virtual reality headset. The example described in this work does not depend on the Oculus Rift and is possible with any comparable hardware capable of rendering stereoscopic images. The depth information gained from viewing the images will aid in the detection of characteristic signals from anomalies of interest. If successful, this approach could help aviation security adapt more fluidly to the evolution of anomalies of interest. This work demonstrates one example, easily implemented on the OTAP platform, that could lead to a future generation of ATR algorithms and data visualization approaches.
NASA Astrophysics Data System (ADS)
Levine, P. A.; Xu, M.; Chen, Y.; Randerson, J. T.; Hoffman, F. M.
2017-12-01
Interannual variability of climatic conditions in the Amazon rainforest is associated with the El Niño-Southern Oscillation (ENSO) and with ocean-atmosphere interactions in the North Atlantic. Sea surface temperature (SST) anomalies in these remote ocean regions drive teleconnections with Amazonian surface air temperature (T), precipitation (P), and net ecosystem production (NEP). While SST-driven NEP anomalies have been primarily linked to T anomalies, it is unclear how much the T anomalies result directly from SST forcing of atmospheric circulation, and how much results indirectly from decreases in precipitation that, in turn, influence surface energy fluxes. Interannual variability of P associated with SST anomalies leads to variability in soil moisture (SM), which indirectly affects T via the partitioning of turbulent heat fluxes between the land surface and the atmosphere. To separate the direct and indirect influence of the SST signal on T and NEP, we performed a mechanism-denial experiment to decouple SST and SM anomalies. We used the Accelerated Climate Modeling for Energy model (ACMEv0.3), with version 5 of the Community Atmosphere Model and version 4.5 of the Community Land Model, forced with observed SSTs from 1982-2016. We found that SST and SM variability both contribute to T and NEP anomalies in the Amazon, with relative contributions depending on lag time and location within the Amazon basin. SST anomalies associated with ENSO drive most of the T variability at shorter lag times, while ENSO-driven SM anomalies contribute more to T variability at longer lag times. SM variability and its influence on T anomalies are much stronger in the eastern Amazon than in the west. Comparing modeled T with observations demonstrates that SST alone is sufficient for simulating the correct timing of T variability, but SM anomalies are necessary for simulating the correct magnitude of the T variability.
Modeled NEP indicated that variability in carbon fluxes results from both SST and SM anomalies. As with T, SM anomalies affect NEP at a much longer lag time than SST anomalies. These results highlight the role of land-atmosphere coupling in driving climate variability within the Amazon, and suggest that land-atmosphere coupling may amplify and delay carbon cycle responses to ocean-atmosphere teleconnections.
Automated Detection of Events of Scientific Interest
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
A report presents a slightly different perspective on the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the cited companion NASA Tech Briefs article, the contemplated events to be detected are primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the instant report, the events to be detected could also include natural phenomena of scientific interest. Hence, use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.
Non-invasive flow path characterization in a mining-impacted wetland
Bethune, James; Randell, Jackie; Runkel, Robert L.; Singha, Kamini
2015-01-01
Time-lapse electrical resistivity (ER) was used to capture the dilution of a seasonal pulse of acid mine drainage (AMD) contamination in the subsurface of a wetland downgradient of the abandoned Pennsylvania Mine workings in central Colorado. Data were collected monthly from mid-July to late October of 2013, with an additional dataset collected in June of 2014. Inversion of the ER data shows the development through time of multiple resistive anomalies in the subsurface, which corroborating data suggest are driven by changes in total dissolved solids (TDS) localized in preferential flow pathways. Sensitivity analyses on a synthetic model of the site suggest that the anomalies would need to be at least several meters in diameter to be adequately resolved by the inversions. The existence of preferential flow paths would have a critical impact on the extent of attenuation mechanisms at the site, and their further characterization could be used to parameterize reactive transport models in developing quantitative predictions of remediation strategies.
A novel interacting multiple model based network intrusion detection scheme
NASA Astrophysics Data System (ADS)
Xin, Ruichi; Venkatasubramanian, Vijay; Leung, Henry
2006-04-01
In today's information age, information and network security are of primary importance to any organization. Network intrusion is a serious threat to the security of computers and data networks. In internet protocol (IP) based networks, intrusions originate in different kinds of packets/messages contained in the open system interconnection (OSI) layer 3 or higher layers. Network intrusion detection and prevention systems observe the layer 3 packets (or layer 4 to 7 messages) to screen for intrusions and security threats. Signature-based methods use a pre-existing database that documents intrusion patterns as perceived in the layer 3 to 7 protocol traffic, and match incoming traffic against it to flag potential intrusion attacks. Alternatively, network traffic data can be modeled, and any large deviation from the established traffic pattern can be detected as a network intrusion. The latter method, known as anomaly-based detection, is gaining popularity for its versatility in learning new patterns and discovering new attacks. It is apparent that for reliable performance, an accurate model of the network data must be established. In this paper, we use collected data to illustrate that network traffic is seldom stationary. We propose the use of multiple models to accurately represent the traffic data. The improvement in reliability of the proposed model is verified by measuring the detection and false alarm rates on several datasets.
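The multiple-model idea (represent non-stationary traffic with several regime models and flag only observations far from all of them) can be sketched with per-regime Gaussians. The regimes, traffic rates, and 4-sigma gate below are invented for illustration and are much simpler than an interacting-multiple-model filter:

```python
import numpy as np

def fit_models(segments):
    """Fit one Gaussian model per traffic regime (a simple stand-in for
    a bank of interacting traffic models)."""
    return [(s.mean(), s.std()) for s in segments]

def is_anomaly(x, models, k=4.0):
    """Flag x only if it is far from *every* regime model, so normal
    regime switches (day/night) are not mistaken for intrusions."""
    return all(abs(x - mu) > k * sigma for mu, sigma in models)

rng = np.random.default_rng(5)
day = rng.normal(1000, 50, size=500)    # packets/s, busy-hours regime
night = rng.normal(200, 20, size=500)   # packets/s, quiet-hours regime
models = fit_models([day, night])
print(is_anomaly(210, models), is_anomaly(5000, models))
```

A single-model detector trained on the pooled data would either flag every nightly lull or miss moderate attacks; keeping one model per regime is the reliability gain the abstract argues for.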
Indo-Pacific sea level variability during recent decades
NASA Astrophysics Data System (ADS)
Yamanaka, G.; Tsujino, H.; Nakano, H.; Urakawa, S. L.; Sakamoto, K.
2016-12-01
Decadal variability of sea level in the Indo-Pacific region is investigated using a historical OGCM simulation. The OGCM, driven by atmospheric forcing with long-term trends removed, clearly exhibits decadal sea level variability in the Pacific Ocean that is associated with eastern tropical Pacific thermal anomalies. During 1977-1987, the sea level anomalies are positive in the eastern equatorial Pacific and deviate from a north-south symmetric distribution, with strongly negative anomalies in the western tropical South Pacific. During 1996-2006, in contrast, the sea level anomalies are negative in the eastern equatorial Pacific and show a nearly north-south symmetric pattern, with positive anomalies in both hemispheres. Concurrently, sea level anomalies in the south-eastern Indian Ocean vary with those in the western tropical Pacific. These sea level variations are closely related to large-scale wind fields: Indo-Pacific sea level distributions are basically determined by wind anomalies over the equatorial region as well as wind stress curl anomalies over the off-equatorial region.
Manevich-Mazor, Mirra; Weissmann-Brenner, Alina; Bar Yosef, Omer; Hoffmann, Chen; Mazor, Roei David; Mosheva, Mariela; Achiron, Reuven Ryszard; Katorza, Eldad
2018-06-07
To evaluate the added value of fetal MRI over ultrasound in detecting and specifying callosal anomalies, and its impact on clinical decision making. Fetuses with a sonographic diagnosis of an anomalous corpus callosum (CC) who underwent a subsequent fetal brain MRI between 2010 and 2015 were retrospectively evaluated and classified according to the severity of the findings. The findings detected on ultrasound were compared to those detected on MRI. An analysis was performed to assess whether fetal MRI altered the group classification, and thus the management of these pregnancies. 78 women were recruited following sonographic diagnoses of complete or partial callosal agenesis or a short, thin or thick CC. Normal MRI studies were obtained in 19 cases (24 %). Among these, all children available for follow-up received an adequate adaptive score on their Vineland II adaptive behavior scale assessment. Analysis of the concordance between US and MRI demonstrated a substantial level of agreement for complete callosal agenesis (kappa: 0.742), moderate agreement for thin CC (kappa: 0.418) and fair agreement for all other callosal anomalies. Comparison between US- and MRI-based mild/severe classifications revealed that MRI contributed to a change in management for 28 fetuses (35.9 %), mostly (25 fetuses, 32.1 %) in favor of pregnancy preservation. Fetal MRI effectively detects callosal anomalies, enables satisfactory validation of the presence or absence of callosal anomalies identified by ultrasound, and adds valuable data that improve clinical decision making. © Georg Thieme Verlag KG Stuttgart · New York.
Remanent magnetization and 3-dimensional density model of the Kentucky anomaly region
NASA Technical Reports Server (NTRS)
Mayhew, M. A.; Estes, R. H.; Myers, D. M.
1984-01-01
A three-dimensional model of the Kentucky body was developed to fit surface gravity and long wavelength aeromagnetic data. Magnetization and density parameters for the model are much like those of Mayhew et al. (1982). The magnetic anomaly due to the model at satellite altitude is shown to be much too small by itself to account for the anomaly measured by Magsat. It is demonstrated that the source region for the satellite anomaly is considerably more extensive than the Kentucky body sensu stricto. The extended source region is modeled first using prismatic model sources and then using dipole array sources. Magnetization directions for the source region found by inversion of various combinations of scalar and vector data are found to be close to the main field direction, implying the lack of a strong remanent component. It is shown by simulation that in a case (such as this) where the geometry of the source is known, if a strong remanent component is present, its direction is readily detectable by scalar data as readily as by vector data.
Long-term Trends and Variability of Eddy Activities in the South China Sea
NASA Astrophysics Data System (ADS)
Zhang, M.; von Storch, H.
2017-12-01
For constructing empirical downscaling models and projecting possible future states of eddy activities in the South China Sea (SCS), long-term statistical characteristics of SCS eddies are needed. We use a daily global eddy-resolving model product named STORM covering the period 1950-2010. This simulation employed the MPI-OM model with a mean horizontal resolution of 10 km and was driven by the NCEP Reanalysis-1 data set. An eddy detection and tracking algorithm operating on the gridded sea surface height anomaly (SSHA) fields was developed, and a set of parameters for the criteria in the SCS was determined through sensitivity tests. Our method detected more than 6000 eddy tracks in the South China Sea. For all of them, eddy diameters, track length, eddy intensity, eddy lifetime and eddy frequency were determined, and the long-term trends and variability of those properties have been derived. Most of the eddies propagate westward. Nearly 100 eddies travel longer than 1000 km, and over 800 eddies have a lifespan of more than 2 months. Furthermore, for building the statistical empirical model, the relationship between the SCS eddy statistics and large-scale atmospheric and oceanic phenomena has been investigated.
78 FR 77024 - Telemarketing Sales Rule; Notice of Termination of Caller ID Rulemaking
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-20
..., data mining and anomaly detection, and call-blocking technology). \\19\\ AT&T Servs., Inc., No. 00040, at... technically feasible, by looking at the signaling data . . . to distinguish between a CPN [calling party...
Radar and infrared remote sensing of geothermal features at Pilgrim Springs, Alaska
NASA Technical Reports Server (NTRS)
Dean, K. G.; Forbes, R. B.; Turner, D. L.; Eaton, F. D.; Sullivan, K. D.
1982-01-01
High-altitude radar and thermal imagery collected by the NASA research aircraft WB57F were used to examine the structural setting and distribution of radiant temperatures of geothermal anomalies in the Pilgrim Springs, Alaska area. Like-polarized radar imagery with perpendicular look directions provides the best structural data for lineament analysis, although more than half the mapped lineaments are easily detectable on conventional aerial photography. Radiometer data and imagery from a thermal scanner were used to evaluate radiant surface temperatures, which ranged from 3 to 17 C. The evening imagery, which utilized density-slicing techniques, detected thermal anomalies associated with geothermal heat sources. The study indicates that high-altitude predawn thermal imagery may be able to locate relatively large areas of hot ground in site-specific studies in the vegetated Alaskan terrain. This imagery will probably not detect gentle lateral gradients.
NASA Astrophysics Data System (ADS)
Simmons, Gary G.; Howett, Carly J. A.; Young, Leslie A.; Spencer, John R.
2015-11-01
In the last few decades, thermal data from the Galileo and Cassini spacecraft have detected various anomalies on Jovian and Saturnian satellites, including the thermally anomalous “PacMan” regions on Mimas and Tethys and the Pwyll anomaly on Europa (Howett et al. 2011, Howett et al. 2012, Spencer et al. 1999). Yet the peculiarities of some of these anomalies, like the weak detection of the “PacMan” anomalies on Rhea and Dione and the low thermal inertia values of the widespread anomalies on equatorial Europa, are subjects of on-going research (Howett et al. 2014, Rathbun et al. 2010). Further, analysis and review of all the data both Galileo and Cassini took of these worlds will provide information on the thermal inertia and albedos of their surfaces, perhaps highlighting potential targets of interest for future Jovian and Saturnian system missions. Many previous works have used a thermophysical model for airless planets developed by Spencer (1990). However, the Three Dimensional Volatile-Transport (VT3D) model proposed by Young (2012) is able to predict surface temperatures in significantly faster computation time, incorporating seasonal and diurnal insolation variations. This work is the first step in an ongoing investigation, which will use VT3D's capabilities to reanalyze Galileo and Cassini data. VT3D, which has already been used to analyze volatile transport on Pluto, is validated by comparing its results to those of the Spencer thermal model. We will also present our initial results using VT3D to reanalyze the thermophysical properties of the PacMan anomaly previously discovered on Mimas by Howett et al. (2011), using temperature constraints from diurnal Cassini/CIRS data. VT3D is expected to be an efficient tool for identifying new thermal anomalies in future Saturnian and Jovian missions. Bibliography: C.J.A. Howett et al. (2011), Icarus 216, 221. C.J.A. Howett et al. (2012), Icarus 221, 1084. C.J.A. Howett et al. (2014), Icarus 241, 239. J.A. Rathbun et al. (2010), Icarus 210, 763. J.R. Spencer (1990), Icarus 83, 27. J.R. Spencer et al. (1999), Science 284, 1514. L.A. Young (2012), Icarus 221, 80.
Discovering anomalous events from urban informatics data
NASA Astrophysics Data System (ADS)
Jayarajah, Kasthuri; Subbaraju, Vigneshwaran; Weerakoon, Dulanga; Misra, Archan; Tam, La Thanh; Athaide, Noel
2017-05-01
Singapore's "smart city" agenda is driving the government to provide public access to a broader variety of urban informatics sources, such as images from traffic cameras and information about buses servicing different bus stops. Such informatics data serves as probes of evolving conditions at different spatiotemporal scales. This paper explores how such multi-modal informatics data can be used to establish the normal operating conditions at different city locations, and then apply appropriate outlier-based analysis techniques to identify anomalous events at these selected locations. We will introduce the overall architecture of sociophysical analytics, where such infrastructural data sources can be combined with social media analytics to not only detect such anomalous events, but also localize and explain them. Using the annual Formula-1 race as our candidate event, we demonstrate a key difference between the discriminative capabilities of different sensing modes: while social media streams provide discriminative signals during or prior to the occurrence of such an event, urban informatics data can often reveal patterns that have higher persistence, including before and after the event. In particular, we shall demonstrate how combining data from (i) publicly available Tweets, (ii) crowd levels aboard buses, and (iii) traffic cameras can help identify the Formula-1 driven anomalies, across different spatiotemporal boundaries.
Retrieving Temperature Anomaly in the Global Subsurface and Deeper Ocean From Satellite Observations
NASA Astrophysics Data System (ADS)
Su, Hua; Li, Wene; Yan, Xiao-Hai
2018-01-01
Retrieving the subsurface and deeper ocean (SDO) dynamic parameters from satellite observations is crucial for effectively understanding ocean interior anomalies and dynamic processes, but it is challenging to accurately estimate the subsurface thermal structure over the global scale from sea surface parameters. This study proposes a new approach based on Random Forest (RF) machine learning to retrieve the subsurface temperature anomaly (STA) in the global ocean from multisource satellite observations, including sea surface height anomaly (SSHA), sea surface temperature anomaly (SSTA), sea surface salinity anomaly (SSSA), and sea surface wind anomaly (SSWA), with in situ Argo data used for RF training and testing. The RF approach can accurately retrieve the STA in the global ocean from these satellite-observed sea surface parameters, and the Argo STA data were used to validate the accuracy and reliability of the results from the RF model. The results indicated that SSHA, SSTA, SSSA, and SSWA together are useful parameters for detecting SDO thermal information and obtaining accurate STA estimations. The proposed method also outperformed support vector regression (SVR) in global STA estimation. It will be a useful technique for studying SDO thermal variability and its role in the global climate system from global-scale satellite observations.
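A minimal sketch of the regression setup this abstract describes, sea surface predictors mapped to a subsurface temperature anomaly with a random forest, using purely synthetic stand-in data (the values, target function, and model settings below are illustrative assumptions, not the study's):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for the four satellite predictors in the abstract:
# SSHA, SSTA, SSSA, SSWA at N grid points (hypothetical values, not real data).
N = 2000
X = rng.normal(size=(N, 4))
# Hypothetical "subsurface temperature anomaly" depending nonlinearly on the
# surface fields, plus noise -- a stand-in for Argo-derived STA labels.
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] ** 2 - 0.2 * X[:, 2] * X[:, 3] \
    + 0.05 * rng.normal(size=N)

# Train/test split and RF regression, mirroring the train/validate setup.
X_tr, X_te, y_tr, y_te = X[:1500], X[1500:], y[:1500], y[1500:]
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X_tr, y_tr)
score = rf.score(X_te, y_te)  # coefficient of determination R^2 on held-out data
```

With the real data, X would hold the collocated SSHA/SSTA/SSSA/SSWA fields and y the Argo STA values at each depth level; the same fit/score pattern applies per level.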
Detection of Low Temperature Volcanogenic Thermal Anomalies with ASTER
NASA Astrophysics Data System (ADS)
Pieri, D. C.; Baxter, S.
2009-12-01
Predicting volcanic eruptions is a thorny problem, as volcanoes typically exhibit idiosyncratic waxing and/or waning pre-eruption emission, geodetic, and seismic behavior. It is no surprise that increasing our accuracy and precision in eruption prediction depends on assessing the time-progressions of all relevant precursor geophysical, geochemical, and geological phenomena, and on more frequently observing volcanoes when they become restless. The ASTER instrument on the NASA Terra Earth Observing System satellite in low earth orbit provides important capabilities in the area of detection of volcanogenic anomalies such as thermal precursors and increased passive gas emissions. Its unique high spatial resolution multi-spectral thermal IR imaging data (90m/pixel; 5 bands in the 8-12um region), bore-sighted with visible and near-IR imaging data, and combined with off-nadir pointing and stereo-photogrammetric capabilities make ASTER a potentially important volcanic precursor detection tool. We are utilizing the JPL ASTER Volcano Archive (http://ava.jpl.nasa.gov) to systematically examine 80,000+ ASTER volcano images to analyze (a) thermal emission baseline behavior for over 1500 volcanoes worldwide, (b) the form and magnitude of time-dependent thermal emission variability for these volcanoes, and (c) the spatio-temporal limits of detection of pre-eruption temporal changes in thermal emission in the context of eruption precursor behavior. We are creating and analyzing a catalog of the magnitude, frequency, and distribution of volcano thermal signatures worldwide as observed from ASTER since 2000 at 90m/pixel. 
Of particular interest as eruption precursors are small, low-contrast thermal anomalies of low apparent absolute temperature (e.g., melt-water lakes, fumaroles, geysers, grossly sub-pixel hotspots), for which the signal-to-noise ratio may be marginal (e.g., scene confusion due to clouds, water and water vapor, fumarolic emissions, variegated ground emissivity, and their combinations). To systematically detect such intrinsically difficult anomalies within our large archive, we are exploring a four-step approach: (a) the recursive application of a GPU-accelerated, edge-preserving bilateral filter prepares a thermal image by removing noise and fine detail; (b) the resulting stylized filtered image is segmented by a path-independent region-growing algorithm; (c) the resulting segments are fused based on thermal affinity; and (d) fused segments are subjected to thermal and geographical tests for hotspot detection and classification, to eliminate false alarms or non-volcanogenic anomalies. We will discuss our progress in creating the general thermal anomaly catalog as well as our algorithmic approach and results. This work was carried out at the Jet Propulsion Laboratory of the California Institute of Technology under contract to NASA.
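A much-simplified sketch of the four-step detection idea on synthetic data. A median filter and a robust threshold stand in for the bilateral filter and the region-growing/fusion stages described above; all values are assumptions:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)

# Synthetic "thermal image": cool background with noise plus one small,
# low-contrast warm patch (a hypothetical stand-in for a fumarole field).
img = 270.0 + rng.normal(0.0, 0.5, size=(128, 128))
img[60:64, 60:64] += 3.0  # weak hotspot, ~3 K above background

# (a) Denoise. A median filter stands in for the GPU bilateral filter;
# both suppress noise while roughly preserving edges.
smooth = ndimage.median_filter(img, size=3)

# (b, c) Segment candidate warm regions: a robust z-score threshold plus
# connected-component labelling stands in for region growing and fusion.
med = np.median(smooth)
mad = np.median(np.abs(smooth - med))
mask = smooth > med + 5.0 * 1.4826 * mad
labels, n = ndimage.label(mask)

# (d) Classify: keep only segments larger than a single noisy pixel,
# discarding isolated false alarms.
sizes = ndimage.sum(np.ones_like(img), labels, index=range(1, n + 1))
hotspots = [i + 1 for i, s in enumerate(sizes) if s >= 4]
```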
Deolia, Shravani Govind; Chhabra, Chaya; Chhabra, Kumar Gaurav; Kalghatgi, Shrivardhan; Khandelwal, Naresh
2015-01-01
Anomalies and enamel hypoplasia of the deciduous dentition are routinely encountered by dental professionals, and early detection and careful management of such conditions may help customary occlusal development. The aim of this study was to determine the prevalence of hypodontia, microdontia, double teeth, and hyperdontia of deciduous teeth among Indian children. The study group comprised 1,398 children (735 boys, 633 girls). The children were examined in the department of Pedodontics and Preventive Dentistry at Jodhpur Dental College General Hospital, Jodhpur, Rajasthan, India. Clinical data were collected by a single dentist according to the Kreiborg criteria, which include double teeth, hypodontia, microdontia, and supernumerary teeth. Statistical analysis of the data was performed using descriptive analysis and the chi-square test. Dental anomalies were found in 4% of children and were significantly more frequent (P = 0.001) in girls (5.8%, n = 38) than in boys (2.7%, n = 18). Regarding anomaly frequencies at different ages, a significant difference was found between 2 and 3 years (P = 0.001). Double teeth were the most frequently observed anomaly (2.3%), followed by supernumerary teeth (0.3%), microdontia (0.6%) and hypodontia (0.6%). Identification of dental anomalies at an early age is of great importance as it prevents malocclusions and functional and certain psychological problems.
Fermi Large Area Telescope as a Galactic Supernovae Axionscope
Meyer, M.; Giannotti, M.; Mirizzi, A.; ...
2017-01-06
In a Galactic core-collapse supernova (SN), axionlike particles (ALPs) could be emitted via the Primakoff process and eventually convert into γ rays in the magnetic field of the Milky Way. From a data-driven sensitivity estimate, we find that, for a SN exploding in our Galaxy, the Fermi Large Area Telescope (LAT) would be able to explore the photon-ALP coupling down to g_aγ ≃ 2 × 10^-13 GeV^-1 for an ALP mass m_a ≲ 10^-9 eV. These values are out of reach of next generation laboratory experiments. In this event, the Fermi LAT would probe large regions of the ALP parameter space invoked to explain the anomalous transparency of the Universe to γ rays, stellar cooling anomalies, and cold dark matter. Lastly, if no γ-ray emission were to be detected, Fermi-LAT observations would improve current bounds derived from SN 1987A by more than 1 order of magnitude.
A CAMAC based real-time noise analysis system for nuclear reactors
NASA Astrophysics Data System (ADS)
Ciftcioglu, Özer
1987-05-01
A CAMAC based real-time noise analysis system was designed for the TRIGA MARK II nuclear reactor at the Institute for Nuclear Energy, Istanbul. The input analog signals obtained from the radiation detectors are introduced to the system through a CAMAC interface. The signals, converted into digital form, are processed by a PDP-11 computer. The fast data processing, based on auto/cross power spectral density computations, is carried out in real time by means of assembly-written FFT algorithms, and the spectra obtained are displayed on a CAMAC driven display system as an additional monitoring device. The system has the advantage of being software programmable and controlled by a CAMAC system, so that it is operated under program control for reactor surveillance, anomaly detection and diagnosis. The system can also be used for the long-term identification of nonstationary operational characteristics of the reactor by comparing the noise power spectra with corresponding reference noise patterns prepared in advance.
Barisic, Ingeborg; Odak, Ljubica; Loane, Maria; Garne, Ester; Wellesley, Diana; Calzolari, Elisa; Dolk, Helen; Addor, Marie-Claude; Arriola, Larraitz; Bergman, Jorieke; Bianca, Sebastiano; Doray, Berenice; Khoshnood, Babak; Klungsoyr, Kari; McDonnell, Bob; Pierini, Anna; Rankin, Judith; Rissmann, Anke; Rounding, Catherine; Queisser-Luft, Annette; Scarano, Gioacchino; Tucker, David
2014-01-01
Oculo-auriculo-vertebral spectrum is a complex developmental disorder characterised mainly by anomalies of the ear, hemifacial microsomia, epibulbar dermoids and vertebral anomalies. The aetiology is largely unknown, and the epidemiological data are limited and inconsistent. We present the largest population-based epidemiological study to date, using data provided by the large network of congenital anomalies registries in Europe. The study population included infants diagnosed with oculo-auriculo-vertebral spectrum during the 1990–2009 period from 34 registries active in 16 European countries. Of the 355 infants diagnosed with oculo-auriculo-vertebral spectrum, there were 95.8% (340/355) live born, 0.8% (3/355) fetal deaths, 3.4% (12/355) terminations of pregnancy for fetal anomaly and 1.5% (5/340) neonatal deaths. In 18.9%, there was prenatal detection of anomaly/anomalies associated with oculo-auriculo-vertebral spectrum, 69.7% were diagnosed at birth, 3.9% in the first week of life and 6.1% within 1 year of life. Microtia (88.8%), hemifacial microsomia (49.0%) and ear tags (44.4%) were the most frequent anomalies, followed by atresia/stenosis of external auditory canal (25.1%), diverse vertebral (24.3%) and eye (24.3%) anomalies. There was a high rate (69.5%) of associated anomalies of other organs/systems. The most common were congenital heart defects present in 27.8% of patients. The prevalence of oculo-auriculo-vertebral spectrum, defined as microtia/ear anomalies and at least one major characteristic anomaly, was 3.8 per 100 000 births. Twinning, assisted reproductive techniques and maternal pre-pregnancy diabetes were confirmed as risk factors. The high rate of different associated anomalies points to the need of performing an early ultrasound screening in all infants born with this disorder. PMID:24398798
Richard Tran Mills; Jitendra Kumar; Forrest M. Hoffman; William W. Hargrove; Joseph P. Spruce; Steven P. Norman
2013-01-01
We investigated the use of principal components analysis (PCA) to visualize dominant patterns and identify anomalies in a multi-year land surface phenology data set (231 m × 231 m normalized difference vegetation index (NDVI) values derived from the Moderate Resolution Imaging Spectroradiometer (MODIS)) used for detecting threats to forest health in the conterminous...
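A hedged sketch of the PCA-based anomaly idea on synthetic "phenology curves" (the data, dimensions, and disturbance below are invented stand-ins for the MODIS NDVI stack): pixels poorly reconstructed from the dominant components are flagged as potential anomalies.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Synthetic stand-in for multi-year phenology: 500 "pixels", each a 46-step
# annual NDVI-like curve with small phase and amplitude variations.
t = np.linspace(0, 2 * np.pi, 46)
pixels = np.array([np.sin(t - rng.normal(0, 0.1)) * rng.uniform(0.8, 1.2)
                   for _ in range(500)])
pixels[0] = 0.1 * rng.normal(size=46)  # one "disturbed" pixel: no seasonal cycle

# Project onto the dominant components; a pixel poorly explained by them
# (large reconstruction error) departs from the dominant phenology pattern.
pca = PCA(n_components=3).fit(pixels)
recon = pca.inverse_transform(pca.transform(pixels))
err = np.linalg.norm(pixels - recon, axis=1)
anomaly_idx = int(np.argmax(err))  # index of the most anomalous pixel
```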
Anomaly detection of flight routes through optimal waypoint
NASA Astrophysics Data System (ADS)
Pusadan, M. Y.; Buliali, J. L.; Ginardi, R. V. H.
2017-01-01
One deciding factor of a flight is its route. A flight route is defined by a sequence of coordinates (latitude and longitude) known as waypoints; an anomaly occurs when the aircraft flies outside the specified waypoint area. In the case of flight data, anomalies are identified from problems in the flight route based on ADS-B data. This study aims to determine the optimal waypoints of the flight route. The proposed methods are: i) Agglomerative Hierarchical Clustering (AHC) over several segments based on the range of area coordinates (latitude and longitude) at every waypoint; ii) the cophenetic correlation coefficient (c) to determine the correlation between the members of each cluster; iii) cubic spline interpolation as a graphic representation of the connections between the coordinates at every waypoint; and iv) Euclidean distance to measure distances between waypoints and the 2 centroids resulting from the AHC clustering. The experimental results are: a cophenetic correlation coefficient of 0.691 ≤ c ≤ 0.974, five segments generated from the range of waypoint area coordinates, and shortest and longest centroid-to-waypoint distances of 0.46 and 2.18. We conclude that the shortest distance serves as the reference coordinate of the optimal waypoint, while the farthest distance indicates a potentially detected anomaly.
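Steps (i), (ii), and (iv) above can be sketched as follows; step (iii), the cubic spline display, is omitted, and the coordinates and the off-route point are invented stand-ins for ADS-B data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet, fcluster
from scipy.spatial.distance import pdist, euclidean

rng = np.random.default_rng(3)

# Hypothetical (latitude, longitude) samples around two waypoint areas.
wp_a = rng.normal([1.0, 104.0], 0.05, size=(20, 2))
wp_b = rng.normal([2.0, 105.0], 0.05, size=(20, 2))
route = np.vstack([wp_a, wp_b])

# (i) Agglomerative hierarchical clustering of the waypoint coordinates.
Z = linkage(route, method="average")
# (ii) Cophenetic correlation coefficient: how faithfully the dendrogram
# preserves the pairwise distances (the abstract reports 0.691-0.974).
c, _ = cophenet(Z, pdist(route))
# (iv) Distances to the 2 centroids obtained from the AHC clustering; a
# track point far from every centroid is flagged as a potential anomaly.
members = fcluster(Z, t=2, criterion="maxclust")
cents = np.array([route[members == k].mean(axis=0) for k in (1, 2)])
track = np.vstack([route, [[4.0, 108.0]]])  # one off-route fix appended
d_min = np.array([min(euclidean(p, c0) for c0 in cents) for p in track])
anomaly = int(np.argmax(d_min))  # index of the farthest (suspect) point
```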
2004-01-01
login identity to the one under which the system call is executed, the parameters of the system call execution - file names including full path...Anomaly detection COAST-EIMDT Distributed on target hosts EMERALD Distributed on target hosts and security servers Signature recognition Anomaly...uses a centralized architecture, and employs an anomaly detection technique for intrusion detection. The EMERALD project [80] proposes a
Adaptive Impact-Driven Detection of Silent Data Corruption for HPC Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di, Sheng; Cappello, Franck
For exascale HPC applications, silent data corruption (SDC) is one of the most dangerous problems because there is no indication that errors occurred during the execution. We propose an adaptive impact-driven method that can detect SDCs dynamically. The key contributions are threefold. (1) We carefully characterize 18 real-world HPC applications and discuss the runtime data features, as well as the impact of SDCs on their execution results. (2) We propose an impact-driven detection model that does not blindly improve the prediction accuracy, but instead detects only influential SDCs to guarantee user-acceptable execution results. (3) Our solution can adapt to dynamic prediction errors based on local runtime data and can automatically tune detection ranges to guarantee low false alarms. Experiments show that our detector can detect 80-99.99% of SDCs with a false alarm rate of less than 1% of iterations in most cases. The memory cost and detection overhead are reduced to 15% and 6.3%, respectively, for a large majority of applications.
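A toy sketch of the prediction-plus-adaptive-range idea (not the authors' detector: the linear extrapolation, window size, and threshold factor are all assumptions):

```python
import numpy as np

def detect_sdc(series, k=3.0, window=32):
    """Flag samples that fall outside an adaptive range around a one-step
    linear prediction; the range width tracks recent prediction errors."""
    flags, errs = [], []
    for i in range(2, len(series)):
        pred = 2 * series[i - 1] - series[i - 2]  # linear extrapolation
        err = abs(series[i] - pred)
        recent = errs[-window:] if errs else [err]
        if err > k * (np.mean(recent) + 1e-12):   # adaptive detection range
            flags.append(i)
        errs.append(err)
    return flags

# Smooth synthetic state variable with one injected, bit-flip-like spike.
t = np.linspace(0, 4 * np.pi, 400)
data = np.sin(t)
data[200] += 5.0  # silent corruption: large isolated deviation
corrupted = detect_sdc(data)
```

Because the threshold follows the local error history, smooth dynamics are tolerated while the isolated spike (and the prediction errors it induces at the next couple of steps) is flagged.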
NASA Astrophysics Data System (ADS)
Ho, Yi-Ying; Jhuang, Hau-Kun; Su, Yung-Chih; Liu, Jann-Yenq
2013-06-01
In this paper we examine the pre-earthquake ionospheric anomalies by the total electron content (TEC) extracted from GIM (global ionospheric map) and the electron density (Ne) observed by the DEMETER (Detection of Electro-Magnetic Emissions Transmitted from Earthquake Regions) satellite during the 2010 M8.8 Chile earthquake. Temporal variations show the nighttime TEC and Ne simultaneously increase 9-19 days before the earthquake. A cross-comparison of data recorded during the period of 1 February to 3 March in 2006-2010 confirms the above temporal anomalies specifically appear in 2010. The spatial analyses show that the anomalies tend to appear over the epicenter.
NTilt as an improved enhanced tilt derivative filter for edge detection of potential field anomalies
NASA Astrophysics Data System (ADS)
Nasuti, Yasin; Nasuti, Aziz
2018-07-01
We develop a new phase-based filter, called NTilt, to enhance the edges of geological sources in potential-field data; it introduces the vertical derivative of the analytic signal, in different orders, into the tilt derivative equation. This equalizes signals from sources buried at different depths. To evaluate the designed filter, we compared its results with those of recently applied methods, testing against both synthetic data and measured data from the Finnmark region of northern Norway. The results demonstrate that the new filter permits better definition of the edges of causative anomalies and better highlights several anomalies that are either not shown by the tilt derivative and other methods or not well defined. The proposed technique also improves the delineation of the actual edges of deep-seated anomalies compared with the tilt derivative and other methods. The NTilt filter provides more accurate and sharper edges, makes nearby anomalies more distinguishable, and avoids introducing additional false edges, reducing ambiguity in potential-field interpretation. This filter thus appears promising for a better qualitative interpretation of gravity and magnetic data in comparison with the more commonly used filters.
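For orientation, the conventional tilt derivative that NTilt builds on, TDR = arctan((dT/dz) / sqrt((dT/dx)^2 + (dT/dy)^2)), can be sketched as follows. The NTilt modification itself (higher-order vertical derivatives of the analytic signal) is not reproduced here, and the input field is a synthetic assumption:

```python
import numpy as np

def tilt_derivative(field, dx=1.0, dy=1.0):
    """Conventional tilt derivative: horizontal derivatives by finite
    differences, vertical derivative via the Fourier-domain |k| filter."""
    ny, nx = field.shape
    gy, gx = np.gradient(field, dy, dx)
    kx = np.fft.fftfreq(nx, d=dx) * 2 * np.pi
    ky = np.fft.fftfreq(ny, d=dy) * 2 * np.pi
    KX, KY = np.meshgrid(kx, ky)
    k = np.sqrt(KX**2 + KY**2)
    dz = np.real(np.fft.ifft2(np.fft.fft2(field) * k))  # dT/dz
    h = np.hypot(gx, gy) + 1e-12                        # |grad_h T|, kept > 0
    return np.arctan2(dz, h)

# Synthetic magnetic-like anomaly: a smooth Gaussian "source" response.
y, x = np.mgrid[-32:32, -32:32].astype(float)
field = np.exp(-(x**2 + y**2) / 100.0)
tdr = tilt_derivative(field)
```

The arctangent bounds the output to (-π/2, π/2) regardless of source depth, which is exactly the amplitude-equalizing property the tilt-derivative family exploits.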
Bayesian anomaly detection in monitoring data applying relevance vector machine
NASA Astrophysics Data System (ADS)
Saito, Tomoo
2011-04-01
A method is developed for automatically classifying monitoring data into two categories, normal and anomalous, in order to remove anomalous data from the enormous volume of monitoring data. It applies the relevance vector machine (RVM) to a probabilistic discriminative model with basis functions and weight parameters whose posterior PDF (probability density function), conditional on the learning data set, is given by Bayes' theorem. The proposed framework is applied to actual monitoring data sets containing some anomalous data, collected at two buildings in Tokyo, Japan. The trained models discriminate anomalous data from normal data very clearly, assigning high probabilities of being normal to normal data and low probabilities of being normal to anomalous data.
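A rough sketch of a probabilistic discriminative classifier with basis functions, on invented monitoring data. Note this substitutes ordinary logistic regression over RBF basis functions for the RVM (scikit-learn ships no RVM; an RVM would additionally learn sparse Bayesian weights over the bases):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)

# Synthetic monitoring feature (e.g. a response amplitude): normal records
# cluster near 1.0, anomalous records scatter far away (hypothetical values).
normal = rng.normal(1.0, 0.1, size=(200, 1))
anom = np.vstack([rng.normal(0.2, 0.1, size=(20, 1)),
                  rng.normal(2.0, 0.1, size=(20, 1))])
X = np.vstack([normal, anom])
y = np.r_[np.ones(200), np.zeros(40)]  # 1 = normal, 0 = anomaly

# RBF basis expansion around fixed centres, then a probabilistic linear
# discriminative model over the basis outputs.
centres = np.linspace(0.0, 2.5, 12)
Phi = np.exp(-(X - centres) ** 2 / (2 * 0.2 ** 2))   # (240, 12) design matrix
clf = LogisticRegression(max_iter=1000).fit(Phi, y)
p_normal = clf.predict_proba(Phi)[:, 1]              # P(record is normal)
```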
An incremental anomaly detection model for virtual machines
Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu
2017-01-01
The Self-Organizing Map (SOM) algorithm, an unsupervised learning method, has been applied to anomaly detection owing to its capabilities of self-organization and automatic anomaly prediction. However, because the algorithm is initialized randomly, training a detection model takes a long time. Moreover, cloud platforms with large numbers of virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing nature, which leaves the algorithm with low accuracy and poor scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate detection by taking into account the large scale and highly dynamic features of virtual machines on cloud platforms. To demonstrate the effectiveness, experiments have been performed on the common KDD Cup benchmark dataset and a real dataset. Results suggest that IISOM offers advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms. PMID:29117245
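A compact SOM sketch in the spirit of the abstract, with a principal-direction initialization standing in for the unspecified heuristic initializer and an assumed feature weighting for the WED (all data and parameters are invented):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic VM metrics (cpu, mem): a tight cloud of normal observations.
normal = rng.normal([0.3, 0.4], 0.05, size=(300, 2))

# Heuristic initialization: spread the map nodes along the data's principal
# direction instead of starting from random weights.
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean)
nodes = mean + np.linspace(-0.15, 0.15, 8)[:, None] * vt[0]

w = np.array([0.7, 0.3])  # assumed feature weights for the WED

def wed(a, b):
    """Weighted Euclidean Distance between node(s) a and sample b."""
    return np.sqrt(np.sum(w * (a - b) ** 2, axis=-1))

# Online SOM training with a shrinking neighbourhood and learning rate.
for epoch in range(20):
    lr = 0.5 * (1 - epoch / 20)
    sigma = max(2.0 * (1 - epoch / 20), 0.5)
    for x in normal:
        bmu = int(np.argmin(wed(nodes, x)))  # best-matching unit
        infl = np.exp(-((np.arange(8) - bmu) ** 2) / (2 * sigma ** 2))
        nodes += lr * infl[:, None] * (x - nodes)

# Quantization error (distance to nearest node) as the anomaly score.
def score(x):
    return float(np.min(wed(nodes, x)))

normal_score = np.mean([score(x) for x in normal[:50]])
anom_score = score(np.array([0.95, 0.9]))  # a drifted, anomalous observation
```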
Visualization of hyperspectral imagery
NASA Astrophysics Data System (ADS)
Hogervorst, Maarten A.; Bijl, Piet; Toet, Alexander
2007-04-01
We developed four new techniques to visualize hyperspectral image data for man-in-the-loop target detection. The methods respectively: (1) display the subsequent bands as a movie ("movie"), (2) map the data onto three channels and display these as a colour image ("colour"), (3) display the correlation between the pixel signatures and a known target signature ("match") and (4) display the output of a standard anomaly detector ("anomaly"). The movie technique requires no assumptions about the target signature and involves no information loss. The colour technique produces a single image that can be displayed in real time; a disadvantage of this technique is loss of information. A display of the match between a target signature and the pixel signatures can be interpreted easily and quickly, but this technique relies on precise knowledge of the target signature. The anomaly detector signifies pixels with signatures that deviate from the (local) background. We performed a target detection experiment with human observers to determine their relative performance with the four techniques. The results show that the "match" presentation yields the best performance, followed by "movie" and "anomaly", while performance with the "colour" presentation was the poorest. Each scheme has its advantages and disadvantages and is more or less suited to real-time or post-hoc processing. The rationale is that the final interpretation is best done by a human observer: in contrast to automatic target recognition systems, the interpretation of hyperspectral imagery by the human visual system is robust to noise and image transformations and requires a minimal number of assumptions (about the signatures of target and background, target shape, etc.). When more knowledge about target and background is available, this may be used to help the observer interpret the data (aided target detection).
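The "match" display (technique 3) reduces to a per-pixel correlation map; a sketch on a synthetic cube (all shapes, signatures, and values here are assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic hyperspectral cube: 32x32 pixels, 50 bands (hypothetical data).
cube = rng.normal(size=(32, 32, 50))
target_sig = np.sin(np.linspace(0, 3, 50))  # assumed known target signature
cube[10, 20] = target_sig + 0.1 * rng.normal(size=50)  # implant one target pixel

# "Match" display: per-pixel Pearson correlation with the target signature,
# rendered as a grey-level image for the human observer.
pix = cube.reshape(-1, 50)
pix_c = pix - pix.mean(axis=1, keepdims=True)
sig_c = target_sig - target_sig.mean()
corr = (pix_c @ sig_c) / (np.linalg.norm(pix_c, axis=1) * np.linalg.norm(sig_c))
match_img = corr.reshape(32, 32)
```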
TargetVue: Visual Analysis of Anomalous User Behaviors in Online Communication Systems.
Cao, Nan; Shi, Conglei; Lin, Sabrina; Lu, Jie; Lin, Yu-Ru; Lin, Ching-Yung
2016-01-01
Users with anomalous behaviors in online communication systems (e.g. email and social media platforms) are potential threats to society. Automated anomaly detection based on advanced machine learning techniques has been developed to combat this issue; challenges remain, though, due to the difficulty of obtaining proper ground truth for model training and evaluation. Therefore, substantial human judgment on the automated analysis results is often required to better adjust the performance of anomaly detection. Unfortunately, techniques that allow users to understand the analysis results more efficiently, to make a confident judgment about anomalies, and to explore data in their context, are still lacking. In this paper, we propose a novel visual analysis system, TargetVue, which detects anomalous users via an unsupervised learning model and visualizes the behaviors of suspicious users in behavior-rich context through novel visualization designs and multiple coordinated contextual views. Particularly, TargetVue incorporates three new ego-centric glyphs to visually summarize a user's behaviors which effectively present the user's communication activities, features, and social interactions. An efficient layout method is proposed to place these glyphs on a triangle grid, which captures similarities among users and facilitates comparisons of behaviors of different users. We demonstrate the power of TargetVue through its application in a social bot detection challenge using Twitter data, a case study based on email records, and an interview with expert users. Our evaluation shows that TargetVue is beneficial to the detection of users with anomalous communication behaviors.
Automatic Assessment of Acquisition and Transmission Losses in Indian Remote Sensing Satellite Data
NASA Astrophysics Data System (ADS)
Roy, D.; Purna Kumari, B.; Manju Sarma, M.; Aparna, N.; Gopal Krishna, B.
2016-06-01
The quality of remote sensing data is an important parameter that defines the extent of its usability in various applications. The data from remote sensing satellites is received as raw data frames at the ground station. This data may be corrupted with losses due to interference during data transmission, data acquisition and sensor anomalies. It is therefore important to assess the quality of the raw data before product generation, for early anomaly detection, faster corrective actions and minimization of product rejection. Manual screening of raw images is time consuming and not very accurate. In this paper, an automated process for identification and quantification of losses in raw data, such as pixel dropout, line loss and data loss due to sensor anomalies, is discussed. Quality assessment of raw scenes based on these losses is also explained. This process is introduced in the data pre-processing stage and gives crucial data quality information to users at the time of browsing data for product ordering. It has also improved the product generation workflow by enabling faster and more accurate quality estimation.
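A loss check of this kind can be automated with simple per-row statistics over each raw frame. A hypothetical sketch, not the paper's actual criteria: the zero dropout value and the 90% row threshold are illustrative assumptions.

```python
import numpy as np

def line_loss_rows(frame, drop_value=0, frac=0.9):
    # flag rows where at least `frac` of the pixels carry the dropout value
    return np.where((frame == drop_value).mean(axis=1) >= frac)[0]

rng = np.random.default_rng(2)
frame = rng.integers(10, 255, (100, 100))   # synthetic raw image frame
frame[40, :] = 0                            # simulate a full line loss
frame[41, :50] = 0                          # partial loss, below the threshold
print(line_loss_rows(frame))                # only the fully lost line is flagged
```

The fraction of flagged rows per frame could then feed a scene-level quality score of the kind the paper describes.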
NASA Astrophysics Data System (ADS)
Sharma, S. P.; Biswas, A.
2012-12-01
The South Purulia Shear Zone (SPSZ) is an important region for uranium prospecting. Geological studies and a hydro-uranium anomaly suggest the presence of a uranium deposit around Raghunathpur village, which lies about 8 km north of the SPSZ. However, detailed geophysical investigations have not been carried out in this region to investigate uranium mineralization. Since no surface signature of uranium mineralization is observed near the location, a deeper subsurface source is expected for the hydro-uranium anomaly. To delineate the subsurface structure and to investigate the origin of the hydro-uranium anomaly present in the area, Vertical Electrical Sounding (VES) using a Schlumberger array and Gradient Resistivity Profiling (GRP) were performed at different locations along a profile perpendicular to the South Purulia Shear Zone. Apparent resistivity computed from the measured sounding data at various locations shows a continuously increasing trend; as a result, conventional apparent resistivity data cannot detect the possible source of the hydro-uranium anomaly. An innovative approach that depicts the apparent conductivity in the subsurface reveals a possible connection from the SPSZ to Raghunathpur. On the other hand, the resistivity profiling data suggest a low-resistivity zone that is also characterized by a low Self-Potential (SP) anomaly. Since the SPSZ is the likely source of uranium mineralization, the hydro-uranium anomaly at Raghunathpur appears connected with the SPSZ. The conducting zone has been delineated from the SPSZ to Raghunathpur at depth and could be uranium bearing. Since the location is also characterized by a low-gravity and high-magnetic anomaly zone, this conducting zone is likely a mineralized zone. Keywords: apparent resistivity; apparent conductivity; Self Potential; uranium mineralization; shear zone; hydro-uranium anomaly.
A Testbed for Data Fusion for Helicopter Diagnostics and Prognostics
2003-03-01
…and algorithm design and tuning in order to develop advanced diagnostic and prognostic techniques for aircraft health monitoring… development of models for diagnostics, prognostics, and anomaly detection… detections, and prognostic prediction time horizons. The VMEP system, and in particular the web component, are ideal for performing data collection…
Personal Electronic Devices and Their Interference with Aircraft Systems
NASA Technical Reports Server (NTRS)
Ross, Elden; Ely, Jay J. (Technical Monitor)
2001-01-01
This report compiles data on personal electronic devices (PEDs) attributed to having created anomalies with aircraft systems. Charts and tables display 14 years of incidents reported by pilots to the Aviation Safety Reporting System (ASRS). Affected systems, incident severity, sources of anomaly detection, and the most frequently identified PEDs are among the more significant data. Several reports contain incidents of aircraft off course when all systems indicated on course, and of critical events that occurred during landings and takeoffs. Additionally, PEDs that should receive priority in testing are identified.
Sea surface temperature anomalies driven by oceanic local forcing in the Brazil-Malvinas Confluence
NASA Astrophysics Data System (ADS)
da Silveira, Isabel Porto; Pezzi, Luciano Ponzi
2014-03-01
Sea surface temperature (SST) anomaly events in the Brazil-Malvinas Confluence (BMC) were investigated through wavelet analysis and numerical modeling. Wavelet analysis was applied to recognize the main spectral signals of SST anomaly events in the BMC and in the Drake Passage as a first attempt to link middle and high latitudes. The numerical modeling approach was used to clarify the local oceanic dynamics that drive these anomalies. Wavelet analysis pointed to the 8-12-year band as the most energetic band representing remote forcing between high to middle latitudes. Other frequencies observed in the BMC wavelet analysis indicate that part of its variability could also be forced by low-latitude events, such as El Niño. Numerical experiments carried out for the years of 1964 and 1992 (cold and warm El Niño-Southern Oscillation (ENSO) phases) revealed two distinct behaviors that produced negative and positive sea surface temperature anomalies on the BMC region. The first behavior is caused by northward cold flow, Río de la Plata runoff, and upwelling processes. The second behavior is driven by a southward excursion of the Brazil Current (BC) front, alterations in Río de la Plata discharge rates, and most likely by air-sea interactions. Both episodes are characterized by uncoupled behavior between the surface and deeper layers.
Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing
NASA Technical Reports Server (NTRS)
Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)
2001-01-01
The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is essentially a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes the uncertainty and vagueness typically found in image analysis; ANN provides learning capabilities; and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along the slide basket cables used by astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another potentially viable application at NASA is in detecting anomalies on the NASA Space Shuttle Orbiter's radiator panels.
NASA Astrophysics Data System (ADS)
Akhoondzadeh, Mehdi; De Santis, Angelo; Marchetti, Dedalo; Piscini, Alessandro; Cianchini, Gianfranco
2018-01-01
After the DEMETER satellite mission (2004-2010), the launch of the Swarm satellites (Alpha (A), Bravo (B) and Charlie (C)) created a new opportunity for the study of earthquake ionospheric precursors. Nowadays, there is no doubt that multi-precursor analysis is a necessary step to better understand the LAIC (Lithosphere-Atmosphere-Ionosphere Coupling) mechanism before large earthquakes. In this study, using the absolute scalar magnetometer, vector field magnetometer and electric field instrument on board the Swarm satellites, GPS (Global Positioning System) measurements, MODIS-Aqua satellite data and ECMWF (European Centre for Medium-Range Weather Forecasts) data, the variations of electron density and temperature, magnetic field, TEC (Total Electron Content), LST (Land Surface Temperature), AOD (Aerosol Optical Depth) and SKT (SKin Temperature) were surveyed to find potential seismic anomalies around the strong Ecuador (Mw = 7.8) earthquake of 16 April 2016. Four solar and geomagnetic indices (F10.7, Dst, Kp and ap) were investigated to distinguish whether the preliminarily detected anomalies might be associated with solar-geomagnetic activity rather than being seismo-ionospheric anomalies. The Swarm (A, B and C) data analysis indicates anomalies in the time series of electron density variations 7, 11 and 12 days before the event, and unusual variations in the time series of electron temperature 8 days preceding the earthquake; analysis of the scalar and vector magnetic field data shows considerable anomalies 52, 48, 23, 16, 11, 9 and 7 days before the main shock. A striking anomaly is detected in the TEC variations 1 day before the earthquake at 9:00 UTC. Analysis of MODIS-Aqua night-time images shows that the LST increased unusually 11 days prior to the main shock. In addition, the AOD variations obtained from MODIS measurements reach their maximum value 10 days before the earthquake. The SKT around the epicentral region presents an anomalously high value about 40 days before the earthquake. It should be noted that the different lead times of the observed anomalies can be explained by a reasonable LAIC earthquake mechanism. Our results emphasize that Swarm satellite measurements play an undeniable role in advancing the study of ionospheric precursors.
Prevalence and distribution of selected dental anomalies among saudi children in Abha, Saudi Arabia.
Yassin, Syed M
2016-12-01
Dental anomalies are not an unusual finding in routine dental examination. Dental anomalies can lead to functional, esthetic and occlusal problems. The purpose of the study was to determine the prevalence and distribution of selected developmental dental anomalies in Saudi children. The study was based on clinical examination and panoramic radiographs of children who visited the pediatric dentistry clinics at King Khalid University College of Dentistry, Saudi Arabia. These patients were examined for dental anomalies in size, shape, number, structure and position. Data collected were entered and analyzed using the Statistical Package for the Social Sciences (SPSS). Of the 1252 children (638 boys, 614 girls) examined, 318 subjects (25.39%) presented with selected dental anomalies. The distribution by gender was 175 boys (27.42%) and 143 girls (23.28%). On intergroup comparison, number anomalies were the most common, with hypodontia (9.7%) being the most common anomaly in Saudi children, followed by hyperdontia (3.5%). The prevalences of size anomalies were microdontia (2.6%) and macrodontia (1.8%). The prevalences of shape anomalies were talon cusp (1.4%), taurodontism (1.4%) and fusion (0.8%). The prevalences of positional anomalies were ectopic eruption (2.3%) and rotation (0.4%). The prevalences of structural anomalies were amelogenesis imperfecta (0.3%) and dentinogenesis imperfecta (0.1%). A significant number of children had a dental anomaly, with hypodontia being the most common and dentinogenesis imperfecta the rarest in the study. Early detection and management of these anomalies can avoid potential orthodontic and esthetic problems in a child. Key words: Dental anomalies, children, Saudi Arabia.
NASA Astrophysics Data System (ADS)
Akhoondzadeh, M.
2013-09-01
Anomaly detection is extremely important for forecasting the date, location and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS) is proposed to detect the thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake that jolted NW Iran on 11 August 2012. ANFIS is a well-known hybrid neuro-fuzzy network for modeling non-linear complex systems. In this study, the thermal and TEC anomalies detected using the proposed method are also compared to the anomalies observed by applying classical and intelligent methods, including the Interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN) and Support Vector Machine (SVM) methods. The dataset, which comprises Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), spans 62 days. If the difference between the value predicted by the ANFIS method and the observed value exceeds a pre-defined threshold, then the observed value, in the absence of non-seismic effective parameters, can be regarded as a precursory anomaly. For the two precursors, LST and TEC, the ANFIS method shows very good agreement with the other implemented classical and intelligent methods, which indicates that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. The detection of the thermal and TEC anomalies derives its credibility from the overall efficiency and potential of the five integrated methods.
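Of the classical methods compared here, the interquartile approach is the simplest to illustrate: a value is flagged when it leaves a band built from the quartiles of the series. A minimal sketch on synthetic data, not the paper's pipeline; the 62-day series, the band factor, and the injected anomaly are illustrative assumptions.

```python
import numpy as np

def iqr_anomalies(series, k=1.5):
    # flag indices where the value leaves the interquartile band [q1-k*IQR, q3+k*IQR]
    q1, q3 = np.percentile(series, [25, 75])
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return np.where((series < lo) | (series > hi))[0]

rng = np.random.default_rng(1)
tec = rng.normal(20, 1, 62)   # 62 days of synthetic TEC-like values
tec[60] = 27                  # pronounced anomaly two days before the "event"
print(iqr_anomalies(tec))     # the injected day is among the flagged indices
```

The ANFIS variant described in the abstract replaces the quartile band with a learned prediction, flagging days where |observed - predicted| exceeds the threshold.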
Ocean Chlorophyll as a Precursor of ENSO: An Earth System Modeling Study
NASA Astrophysics Data System (ADS)
Park, Jong-Yeon; Dunne, John P.; Stock, Charles A.
2018-02-01
Ocean chlorophyll concentration, a proxy for phytoplankton, is strongly influenced by internal ocean dynamics such as those associated with El Niño-Southern Oscillation (ENSO). Observations show that ocean chlorophyll responses to ENSO generally lead sea surface temperature (SST) responses in the equatorial Pacific. A long-term global Earth system model simulation incorporating marine biogeochemical processes also exhibits a preceding chlorophyll response. In contrast to simulated SST anomalies, which significantly lag the wind-driven subsurface heat response to ENSO, chlorophyll anomalies respond rapidly. Iron was found to be the key factor connecting the simulated surface chlorophyll anomalies to the subsurface ocean response. Westerly wind bursts decrease central Pacific chlorophyll by reducing iron supply through wind-driven thermocline deepening but increase western Pacific chlorophyll by enhancing the influx of coastal iron from the maritime continent. Our results mechanistically support the potential for chlorophyll-based indices to inform seasonal ENSO forecasts beyond previously identified SST-based indices.
Variable Discretisation for Anomaly Detection using Bayesian Networks
2017-01-01
Jonathan Legg, National Security and ISR Division, Defence Science and Technology Group, DST-Group-TR-3328. ABSTRACT: Anomaly detection is the process by which low probability events are automatically found against a… Introduction: Bayesian network implementations usually require each variable to take on a finite number of mutually…
Enhanced detection and visualization of anomalies in spectral imagery
NASA Astrophysics Data System (ADS)
Basener, William F.; Messinger, David W.
2009-05-01
Anomaly detection algorithms applied to hyperspectral imagery are able to reliably identify man-made objects in a natural environment based on statistical/geometric likelihood. The process is more robust than target identification, which requires precise prior knowledge of the object of interest, but has an inherently higher false alarm rate. Standard anomaly detection algorithms measure the deviation of pixel spectra from a parametric model (either statistical or linear mixing) estimating the image background. The topological anomaly detector (TAD) creates a fully non-parametric, graph theory-based, topological model of the image background and measures deviation from this background using codensity. In this paper we present a large-scale comparative test of TAD against 80+ targets in four full HYDICE images, using the entire canonical target set for generation of ROC curves. TAD is compared against several statistics-based detectors, including local RX and subspace RX. Even a perfect anomaly detection algorithm would have a high practical false alarm rate in most scenes simply because the user/analyst is not interested in every anomalous object. To assist the analyst in identifying and sorting objects of interest, we investigate coloring the anomalies with principal components projections using statistics computed from the anomalies. This gives a very useful colorization in which objects of similar material tend to have the same color, enabling an analyst to quickly sort and identify the anomalies of highest interest.
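The codensity idea behind TAD can be approximated with a simple non-parametric score: the distance from each pixel's spectrum to its k-th nearest neighbour, which is large in sparse (anomalous) regions of spectral space and small in the dense background. A rough sketch on synthetic data, not the TAD implementation; the dimensions, cluster parameters, and k are illustrative assumptions.

```python
import numpy as np

def codensity_scores(pixels, k=10):
    # codensity proxy: distance to the k-th nearest neighbour in spectral space;
    # pixels in sparse regions (likely anomalies) receive high scores
    d = np.linalg.norm(pixels[:, None, :] - pixels[None, :, :], axis=2)
    d.sort(axis=1)      # row-wise ascending; column 0 is the zero self-distance
    return d[:, k]

rng = np.random.default_rng(0)
background = rng.normal(0, 1, (500, 5))   # dense natural background spectra
targets = rng.normal(8, 0.2, (5, 5))      # a few spectrally distinct pixels
scene = np.vstack([background, targets])
scores = codensity_scores(scene)
top = np.argsort(scores)[-5:]             # highest-codensity pixels
print(sorted(int(i) for i in top))        # the five planted targets
```

TAD proper builds an explicit background graph and measures deviation from it; this k-th-neighbour distance is only the simplest density-based stand-in for that measure.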
Modeling of electrical impedance tomography to detect breast cancer by finite volume methods
NASA Astrophysics Data System (ADS)
Ain, K.; Wibowo, R. A.; Soelistiono, S.
2017-05-01
The electrical impedance properties of tissue are an interesting subject of study, because changes in the electrical impedance of organs are related to physiological and pathological conditions, both of which are strongly associated with disease information. Several experiments have shown that breast cancer has a lower impedance than normal breast tissue. Thus, impedance-based imaging can be used as alternative equipment to detect breast cancer. This research is carried out by modelling Electrical Impedance Tomography to detect breast cancer with finite volume methods. The research includes development of a mathematical model of the electric potential field by the 2D Finite Volume Method, and solving the forward problem and the inverse problem by a linear reconstruction method. The scanning is done by a 16-channel electrode system with the neighbouring method to collect data, and is performed at frequencies of 10 kHz and 100 kHz with three numerical objects: an anomaly at the surface, an anomaly at depth, and anomalies at both the surface and at depth. The simulation successfully reconstructed images of functional anomalies of breast cancer at the surface position, at the depth position, and at a combination of surface and depth.
Pediatric tinnitus: Incidence of imaging anomalies and the impact of hearing loss.
Kerr, Rhorie; Kang, Elise; Hopkins, Brandon; Anne, Samantha
2017-12-01
Guidelines exist for the evaluation and management of tinnitus in adults; however, lack of evidence in children limits the applicability of these guidelines to pediatric patients. The objective of this study is to determine the incidence of inner ear anomalies detected on imaging studies within the pediatric population with tinnitus, and to evaluate whether the presence of hearing loss increases the rate of detection of anomalies in comparison to normal-hearing patients. Retrospective review of all children with a diagnosis of tinnitus from 2010 to 2015 at a tertiary care academic center. 102 pediatric patients with tinnitus were identified. Overall, 53 patients had imaging studies, with 6 abnormal findings (11.3%). 51/102 patients had hearing loss, of whom 33 had imaging studies, demonstrating 6 inner ear anomalies; this is an incidence of 18.2% for inner ear anomalies identified in patients with hearing loss (95% confidence interval (CI) 7.0-35.5%). 4 of these 6 inner ear anomalies were vestibular aqueduct abnormalities; the other two were cochlear hypoplasia and bilateral semicircular canal dysmorphism. 51 patients had no hearing loss, and of these patients, 20 had imaging studies, with no inner ear abnormalities detected. There was no statistical difference in the incidence of abnormal imaging findings between patients with and without hearing loss (Fisher's exact test, p = 0.072). CONCLUSION: There is a high incidence of anomalies detected in imaging studies done in pediatric patients with tinnitus, especially in the presence of hearing loss. Copyright © 2017 Elsevier B.V. All rights reserved.
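The reported p-value follows from the 2x2 table of imaged patients (6/33 abnormal with hearing loss vs 0/20 without). A stdlib sketch of the two-sided Fisher's exact test, offered only to show where the figure comes from:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    # two-sided Fisher's exact test on the 2x2 table [[a, b], [c, d]]:
    # sum the hypergeometric probabilities of all tables with the same margins
    # that are no more likely than the observed table
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d
    def p(k):  # probability of k successes in the top-left cell
        return comb(c1, k) * comb(n - c1, r1 - k) / comb(n, r1)
    p_obs = p(a)
    return sum(p(k) for k in range(max(0, c1 - r2), min(r1, c1) + 1)
               if p(k) <= p_obs * (1 + 1e-9))

# 6 of 33 imaged hearing-loss patients vs 0 of 20 imaged normal-hearing patients
print(round(fisher_exact_two_sided(6, 27, 0, 20), 3))   # → 0.072
```

The result matches the p = 0.072 reported in the abstract, just above the conventional 0.05 threshold.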
Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection
NASA Technical Reports Server (NTRS)
Kumar, Sricharan; Srivistava, Ashok N.
2012-01-01
Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
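The general recipe (bootstrap a non-parametric regressor, add resampled residuals, take percentiles, then flag observations outside the interval) can be sketched as follows. This is not the authors' algorithm: the k-NN smoother, the pairs-plus-residuals bootstrap scheme, and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_regress(x_train, y_train, x_query, k=15):
    # simple non-parametric regressor: mean of the k nearest training neighbours
    d = np.abs(x_train[None, :] - x_query[:, None])
    idx = np.argsort(d, axis=1)[:, :k]
    return y_train[idx].mean(axis=1)

def bootstrap_pi(x, y, x_query, n_boot=200, alpha=0.05, k=15):
    # bootstrap pairs, refit, and add resampled residuals to form the
    # predictive distribution at each query point; take its percentiles
    resid = y - knn_regress(x, y, x, k)
    preds = np.empty((n_boot, len(x_query)))
    for b in range(n_boot):
        i = rng.integers(0, len(x), len(x))
        preds[b] = knn_regress(x[i], y[i], x_query, k) + rng.choice(resid, len(x_query))
    lo = np.percentile(preds, 100 * alpha / 2, axis=0)
    hi = np.percentile(preds, 100 * (1 - alpha / 2), axis=0)
    return lo, hi

# synthetic data: smooth signal plus noise, with one injected anomaly
x = np.linspace(0, 10, 300)
y = np.sin(x) + rng.normal(0, 0.1, 300)
lo, hi = bootstrap_pi(x, y, x)
y_obs = y.copy()
y_obs[150] += 1.5                         # anomalous observation
anomalous = (y_obs < lo) | (y_obs > hi)
print(bool(anomalous[150]))               # the injected point leaves its interval
```

An observation is declared anomalous exactly when it falls outside its conditional prediction interval, which is the detection rule the abstract describes.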
Estimation of anomaly location and size using electrical impedance tomography.
Kwon, Ohin; Yoon, Jeong Rock; Seo, Jin Keun; Woo, Eung Je; Cho, Young Gu
2003-01-01
We developed a new algorithm that estimates the locations and sizes of anomalies in an electrically conducting medium based on the electrical impedance tomography (EIT) technique. When only boundary current and voltage measurements are available, it is not practically feasible to reconstruct accurate high-resolution cross-sectional conductivity or resistivity images of a subject. In this paper, we focus our attention on the estimation of the locations and sizes of anomalies with conductivity values different from those of the background tissues. We showed the performance of the algorithm in experimental results using a 32-channel EIT system and a saline phantom. With about 1.73% measurement error in the boundary current-voltage data, we found that the minimal size (area) of a detectable anomaly is about 0.72% of the size (area) of the phantom. Potential applications include the monitoring of impedance-related physiological events and bubble detection in two-phase flow. Since this new algorithm requires neither a forward solver nor a time-consuming minimization process, it is fast enough for various real-time applications in medicine and nondestructive testing.
Microgravity and Electrical Resistivity Techniques for Detection of Caves and Clandestine Tunnels
NASA Astrophysics Data System (ADS)
Crawford, N. C.; Croft, L. A.; Cesin, G. L.; Wilson, S.
2006-05-01
The Center for Cave and Karst Studies (CCKS) has been using microgravity to locate caves from the ground surface since 1985. The geophysical subsurface investigations began during a period when explosive and toxic vapors were rising from the karst aquifer under Bowling Green into homes, businesses, and schools. The USEPA provided the funding for this Superfund emergency, and the CCKS was able to drill numerous wells into low-gravity anomalies to confirm and even map the route of caves in the underlying limestone bedrock. In every case, a low-gravity anomaly indicated a bedrock cave, a cave with a collapsed roof, or locations where a bedrock cave had collapsed and filled with alluvium. At numerous locations, several wells were cored into microgravity anomalies, and in every case additional wells were drilled on both sides of the anomalies to confirm that the technique was in fact reliable. The wells cored on both sides of the anomalies did not intersect caves but instead intersected virtually solid limestone. Microgravity also easily detected storm sewers and even sanitary sewers, sometimes six meters (twenty feet) beneath the surface. Microgravity has also been used on many occasions to investigate sinkhole collapses. It identified potential collapse areas by detecting voids in the unconsolidated material above bedrock. The CCKS has experimented with other geophysical techniques, particularly ground penetrating radar, seismic and electrical resistivity. In the late 1990s the CCKS started using the Swift/Sting resistivity meter to perform karst geophysical subsurface investigations. The system provides good depth-to-bedrock data, but it is often difficult to interpret bedrock caves from the modeled data.
The system typically used now by the CCKS to perform karst subsurface investigations is electrical resistivity traverses followed by microgravity over suspect areas identified on the modeled resistivity data. Some areas of high resistivity indicate caves, but others simply indicate pockets of dry limestone, and the signatures look virtually identical. Therefore, the CCKS performs microgravity over all suspect areas along the resistivity traverses. A low-gravity anomaly that corresponds with a high-resistivity anomaly indicates a cave location; a high-resistivity anomaly that does not also have a low-gravity anomaly indicates a pocket of dry limestone. Numerous cored wells have been drilled both into the anomalies and on both sides to confirm the cave locations and to establish that the technique is accurate. The September 11, 2001 World Trade Center catastrophe was the catalyst for the formation of a program within the CCKS to use these techniques for locating bedrock caves and voids in unconsolidated materials, for search and rescue, and for locating clandestine tunnels. We are now into our third year of a grant from the Kentucky Science and Technology Center to develop a robot that will measure microgravity and apply other geophysical techniques. The robot has potential for detecting clandestine tunnels under the U.S. border as well as military applications. The system will soon be tested over known tunnels and then during a blind test along a section of the U.S. border at Nogales, Arizona.
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Morris, Jon; Turowski, Mark; Franzl, Richard; Walker, Mark; Kapadia, Ravi; Venkatesh, Meera; Schmalzel, John
2010-01-01
Severe weather events are likely occurrences on the Mississippi Gulf Coast. It is important to rapidly diagnose and mitigate the effects of storms on Stennis Space Center's rocket engine test complex to avoid delays to critical test article programs, reduce costs, and maintain safety. An Integrated Systems Health Management (ISHM) approach and technologies are employed to integrate environmental (weather) monitoring, structural modeling, and the suite of available facility instrumentation to provide information for readiness before storms, rapid initial damage assessment to guide mitigation planning, on-going assurance as repairs are effected, and finally support for recertification. The system is named the Katrina Storm Monitoring System (KStorMS). ISHM describes a comprehensive set of capabilities that provide insight into the behavior and health of a system. Knowing the status of a system allows decision makers to effectively plan and execute their mission. For example, early insight into component degradation and impending failures provides more time to develop workaround strategies and more effectively plan for maintenance. Failures of system elements generally occur over time. Information extracted from sensor data, combined with system-wide knowledge bases and methods for information extraction and fusion, inference, and decision making, can be used to detect incipient failures. If failures do occur, it is critical to detect and isolate them, and to suggest an appropriate course of action. ISHM enables determining the condition (health) of every element in a complex system-of-systems (SoS) by detecting anomalies, diagnosing causes, and predicting future anomalies, and provides data, information, and knowledge (DIaK) to control systems for safe and effective operation.
ISHM capability is achieved by using a wide range of technologies that enable anomaly detection, diagnostics, prognostics, and advice for control: (1) anomaly detection algorithms and strategies, (2) fusion of DIaK for anomaly detection (model-based, numerical, statistical, empirical, expert-based, qualitative, etc.), (3) diagnostic/prognostic strategies and methods, (4) user interfaces, (5) advanced control strategies, (6) integration architectures/frameworks, and (7) embedding of intelligence. Many of these technologies are mature, and they are being used in the KStorMS. The paper describes the design, implementation, and operation of the KStorMS, and discusses further evolution to support other needs such as condition-based maintenance (CBM).
NASA Technical Reports Server (NTRS)
Labrecque, J. L.; Cande, S. C.; Jarrard, R. D. (Principal Investigator)
1983-01-01
A technique that eliminates external field sources and the effects of strike aliasing was used to extract from marine survey data the intermediate wavelength magnetic anomaly field (B) in the North Pacific. A strong correlation exists between this field and the MAGSAT field, although a directional sensitivity in the MAGSAT field can be detected. The intermediate wavelength field is correlated with tectonic features. Island arcs appear as positive anomalies of induced origin, likely due to variations in crustal thickness. Seamount chains and oceanic plateaus are also manifested by strong anomalies, and the primary contribution to many of these anomalies appears to be a remanent magnetization. The source parameters for the remainder of these features remain ambiguous. The results indicate that the sea surface field is a valuable source of information for secular variation analysis and the resolution of intermediate wavelength source parameters.
Convectively-driven cold layer and its influences on moisture in the UTLS
NASA Astrophysics Data System (ADS)
Kim, J.; Randel, W. J.; Birner, T.
2016-12-01
Characteristics of the cold anomaly in the tropical tropopause layer (TTL) that is commonly observed with deep convection are examined using CloudSat and Constellation Observing System for Meteorology, Ionosphere and Climate (COSMIC) GPS radio occultation measurements. Deep convection is sampled based on cloud top height (>17 km) from CloudSat 2B-CLDCLASS, and temperature profiles from COSMIC are then composited around the deep convection. The composite temperature shows an anomalously warm troposphere (up to 14 km) and a significantly cold layer near the tropopause (at 16-18 km) in regions of deep convection. Generally in the tropics, the cold layer has a very large horizontal scale (2,000-6,000 km) compared to that of a mesoscale convective cluster, and it lasts one or two weeks with a minimum temperature anomaly of -2 K. The cold layer shows a slight but clear eastward-tilted vertical structure in the deep tropics, indicating a large-scale Kelvin wave response. Further analyses of circulation patterns suggest that the anomaly can be explained as part of a Gill-type response in the TTL to deep convective heating in the troposphere. The response of moisture to the cold layer is also examined in the upper troposphere and lower stratosphere using microwave limb sounder (MLS) measurements. The water vapor anomalies show coherent structures with the temperature and circulation anomalies. A clear dry anomaly is found in the cold layer and its outflow region, implying a large-scale dehydration process due to the convectively driven cold layer in the upper TTL.
Riboni, Daniele; Bettini, Claudio; Civitarese, Gabriele; Janjua, Zaffar Haider; Helaoui, Rim
2016-02-01
In an ageing world population more citizens are at risk of cognitive impairment, with negative consequences for their ability to live independently, their quality of life, and the sustainability of healthcare systems. Cognitive neuroscience researchers have identified behavioral anomalies that are significant indicators of cognitive decline. A general goal is the design of innovative methods and tools for continuously monitoring the functional abilities of the seniors at risk and reporting the behavioral anomalies to the clinicians. SmartFABER is a pervasive system targeting this objective. A non-intrusive sensor network continuously acquires data about the interaction of the senior with the home environment during daily activities. A novel hybrid statistical and knowledge-based technique is used to analyze these data and detect the behavioral anomalies, whose history is presented through a dashboard to the clinicians. Differently from related works, SmartFABER can detect abnormal behaviors at a fine-grained level. We have fully implemented the system and evaluated it using real datasets, partly generated by performing activities in a smart home laboratory, and partly acquired during several months of monitoring of the instrumented home of a senior diagnosed with MCI. Experimental results, including comparisons with other activity recognition techniques, show the effectiveness of SmartFABER in terms of recognition rates. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ziemann, Amanda K.; Messinger, David W.; Albano, James A.; Basener, William F.
2012-06-01
Anomaly detection algorithms have historically been applied to hyperspectral imagery in order to identify pixels whose material content is incongruous with the background material in the scene. Typically, the application involves extracting man-made objects from natural and agricultural surroundings. A large challenge in designing these algorithms is determining which pixels initially constitute the background material within an image. The topological anomaly detection (TAD) algorithm constructs a graph theory-based, fully non-parametric topological model of the background in the image scene, and uses codensity to measure deviation from this background. In TAD, the initial graph theory structure of the image data is created by connecting an edge between any two pixel vertices x and y if the Euclidean distance between them is less than some resolution r. While this type of proximity graph is among the most well-known approaches to building a geometric graph based on a given set of data, there is a wide variety of different geometrically-based techniques. In this paper, we present a comparative test of the performance of TAD across four different constructs of the initial graph: mutual k-nearest neighbor graph, sigma-local graph for two different values of σ > 1, and the proximity graph originally implemented in TAD.
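Two of the graph constructions compared above can be sketched in a few lines. In this illustrative sketch, `proximity_graph` and `mutual_knn_graph` are hypothetical helper names, and pixel spectra are modeled as plain coordinate tuples rather than real hyperspectral vectors:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two pixel spectra (tuples of floats)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def proximity_graph(points, r):
    """Edge (i, j) whenever the distance between points i and j is below r,
    as in the proximity graph originally used by TAD."""
    edges = set()
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if euclidean(points[i], points[j]) < r:
                edges.add((i, j))
    return edges

def mutual_knn_graph(points, k):
    """Edge (i, j) only when i is among j's k nearest neighbours AND vice versa."""
    def knn(i):
        order = sorted((euclidean(points[i], points[j]), j)
                       for j in range(len(points)) if j != i)
        return {j for _, j in order[:k]}
    neigh = [knn(i) for i in range(len(points))]
    return {(i, j) for i in range(len(points))
            for j in neigh[i] if i < j and i in neigh[j]}
```

The mutual k-NN construction is typically sparser than the proximity graph, since an edge requires the neighbour relation to hold in both directions.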
Hot spots of multivariate extreme anomalies in Earth observations
NASA Astrophysics Data System (ADS)
Flach, M.; Sippel, S.; Bodesheim, P.; Brenning, A.; Denzler, J.; Gans, F.; Guanche, Y.; Reichstein, M.; Rodner, E.; Mahecha, M. D.
2016-12-01
Anomalies in Earth observations might indicate data quality issues, extremes or the change of underlying processes within a highly multivariate system. Thus, considering the multivariate constellation of variables for extreme detection yields crucial additional information over conventional univariate approaches. We highlight areas in which multivariate extreme anomalies are more likely to occur, i.e. hot spots of extremes in global atmospheric Earth observations that impact the Biosphere. In addition, we present the year of the most unusual multivariate extreme between 2001 and 2013 and show that these coincide with well-known high-impact extremes. Technically speaking, we account for multivariate extremes by using three sophisticated algorithms adapted from computer science applications: an ensemble of the k-nearest-neighbours mean distance, a kernel density estimation, and an approach based on recurrences. However, the impact of atmosphere extremes on the Biosphere might largely depend on what is considered to be normal, i.e. the shape of the mean seasonal cycle and its inter-annual variability. We identify regions with similar mean seasonality by means of dimensionality reduction in order to estimate in each region both the `normal' variance and robust thresholds for detecting the extremes. In addition, we account for challenges like heteroscedasticity in Northern latitudes. Apart from hot spot areas, those anomalies in the atmosphere time series are of particular interest which can only be detected by a multivariate approach and not by a simple univariate approach. Such an anomalous constellation of atmosphere variables is of interest if it impacts the Biosphere. The multivariate constellation of such an anomalous part of a time series is shown in one case study, indicating that multivariate anomaly detection can provide novel insights into Earth observations.
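Of the three detectors mentioned, the k-nearest-neighbours mean distance is the simplest to illustrate. The sketch below scores a multivariate observation by its mean distance to the k closest points in the reference data; the function name and brute-force search are illustrative, not the authors' implementation:

```python
import math

def knn_mean_distance(data, query, k):
    """Anomaly score: mean Euclidean distance from `query` to its k nearest
    points in `data`; large scores flag multivariate extremes."""
    dists = sorted(math.dist(query, p) for p in data)
    return sum(dists[:k]) / k
```

An observation far from the bulk of the multivariate point cloud receives a much larger score than one inside it, which is the property the ensemble exploits.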
Data-driven region-of-interest selection without inflating Type I error rate.
Brooks, Joseph L; Zoumpoulaki, Alexia; Bowman, Howard
2017-01-01
In ERP and other large multidimensional neuroscience data sets, researchers often select regions of interest (ROIs) for analysis. The method of ROI selection can critically affect the conclusions of a study by causing the researcher to miss effects in the data or to detect spurious effects. In practice, to avoid inflating Type I error rate (i.e., false positives), ROIs are often based on a priori hypotheses or independent information. However, this can be insensitive to experiment-specific variations in effect location (e.g., latency shifts) reducing power to detect effects. Data-driven ROI selection, in contrast, is nonindependent and uses the data under analysis to determine ROI positions. Therefore, it has potential to select ROIs based on experiment-specific information and increase power for detecting effects. However, data-driven methods have been criticized because they can substantially inflate Type I error rate. Here, we demonstrate, using simulations of simple ERP experiments, that data-driven ROI selection can indeed be more powerful than a priori hypotheses or independent information. Furthermore, we show that data-driven ROI selection using the aggregate grand average from trials (AGAT), despite being based on the data at hand, can be safely used for ROI selection under many circumstances. However, when there is a noise difference between conditions, using the AGAT can inflate Type I error and should be avoided. We identify critical assumptions for use of the AGAT and provide a basis for researchers to use, and reviewers to assess, data-driven methods of ROI localization in ERP and other studies. © 2016 Society for Psychophysiological Research.
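A minimal sketch of the AGAT idea: trials from both conditions are pooled into one aggregate waveform, the ROI is chosen from that aggregate, and only then are the conditions compared inside it. Function names and the peak-window selection criterion are illustrative assumptions:

```python
def aggregate_grand_average(cond_a, cond_b):
    """Average the trials of BOTH conditions together, so the aggregate
    itself carries no information about the condition difference."""
    trials = cond_a + cond_b
    n = len(trials)
    return [sum(t[i] for t in trials) / n for i in range(len(trials[0]))]

def select_roi(waveform, width):
    """Pick the window of `width` samples with the largest mean amplitude."""
    best = max(range(len(waveform) - width + 1),
               key=lambda s: sum(waveform[s:s + width]))
    return best, best + width

def roi_mean(trials, roi):
    """Mean amplitude of each trial inside the ROI, averaged over trials."""
    s, e = roi
    return sum(sum(t[s:e]) / (e - s) for t in trials) / len(trials)
```

Because the ROI is selected from the pooled aggregate rather than from the condition difference, it adapts to experiment-specific effect locations (e.g., latency shifts) while, under the assumptions identified in the paper, limiting Type I error inflation.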
NASA Astrophysics Data System (ADS)
Koshti, Ajay M.
2015-04-01
The paper provides information on a new infrared (IR) image contrast data post-processing method that involves converting raw data to normalized contrast versus time evolutions from the flash infrared thermography inspection video data. Thermal measurement features such as peak contrast, peak contrast time, persistence time, and persistence energy are calculated from the contrast evolutions. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat bottom holes in a test plate of the subject material. The measurement features are used to monitor growth of anomalies and to characterize the void-like anomalies. The method was developed to monitor and analyze void-like anomalies in reinforced carbon-carbon (RCC) materials used on the wing leading edge of the NASA Space Shuttle Orbiters, but the method is equally applicable to other materials. The thermal measurement features relate to the anomaly characteristics such as depth and size. Calibration of the contrast is used to provide an assessment of the anomaly depth and width, which correspond to the depth and diameter of the equivalent flat bottom hole (EFBH) from the calibration data. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH diameter are compared with actual widths to evaluate the utility of the IR Contrast method. Some thermal measurements relate to gap thickness of the delaminations. Results of the IR Contrast method on RCC hardware are provided. Keywords: normalized contrast, flash infrared thermography.
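The contrast-evolution features can be illustrated with a toy implementation. The normalization shown and the half-peak definitions of persistence time and persistence energy are assumptions for illustration, not the paper's calibrated procedure:

```python
def normalized_contrast(pixel, reference):
    """One plausible per-frame normalization: (T_pixel - T_ref) / T_ref.
    The exact normalization used in the method may differ."""
    return [(p - r) / r for p, r in zip(pixel, reference)]

def contrast_features(contrast, dt=1.0):
    """Extract peak and persistence features from one contrast evolution.
    Persistence is defined here, illustratively, relative to half the peak."""
    peak = max(contrast)
    peak_time = contrast.index(peak) * dt
    above = [c for c in contrast if c >= 0.5 * peak]
    persistence_time = len(above) * dt       # time spent above half peak
    persistence_energy = sum(above) * dt     # area under that portion
    return {"peak": peak, "peak_time": peak_time,
            "persistence_time": persistence_time,
            "persistence_energy": persistence_energy}
```

Deeper or larger voids would shift these features (e.g., later peak times for deeper anomalies), which is what makes them usable for characterization after calibration against flat bottom holes.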
Hyperspectral anomaly detection using Sony PlayStation 3
NASA Astrophysics Data System (ADS)
Rosario, Dalton; Romano, João; Sepulveda, Rene
2009-05-01
We present a proof-of-principle demonstration using Sony's IBM Cell processor-based PlayStation 3 (PS3) to run, in near real time, a hyperspectral anomaly detection algorithm (HADA) on real hyperspectral (HS) long-wave infrared imagery. The PS3 console proved to be ideal for doing precisely the kind of heavy computational lifting HS-based algorithms require, and the fact that it is a relatively open platform makes programming scientific applications feasible. The PS3 HADA is a unique parallel random-sampling-based anomaly detection approach that does not require prior spectra of the clutter background. The PS3 HADA is designed to handle known underlying difficulties (e.g., target shape/scale uncertainties) often ignored in the development of autonomous anomaly detection algorithms. The effort is part of an ongoing cooperative contribution between the Army Research Laboratory and the Army's Armament, Research, Development and Engineering Center, which aims at demonstrating performance of innovative algorithmic approaches for applications requiring autonomous anomaly detection using passive sensors.
Detection of anomalies in radio tomography of asteroids: Source count and forward errors
NASA Astrophysics Data System (ADS)
Pursiainen, S.; Kaasalainen, M.
2014-09-01
The purpose of this study was to advance numerical methods for radio tomography in which an asteroid's internal electric permittivity distribution is to be recovered from radio frequency data gathered by an orbiter. The focus was on signal generation via multiple sources (transponders) providing one potential, or even essential, scenario to be implemented in a challenging in situ measurement environment and within tight payload limits. As a novel feature, the effects of forward errors including noise and a priori uncertainty of the forward (data) simulation were examined through a combination of the iterative alternating sequential (IAS) inverse algorithm and finite-difference time-domain (FDTD) simulation of time evolution data. Single and multiple source scenarios were compared in two-dimensional localization of permittivity anomalies. Three different anomaly strengths and four levels of total noise were tested. Results suggest, among other things, that multiple sources can be necessary to obtain appropriate results, for example, to distinguish three separate anomalies with permittivity less than or equal to half of the background value, relevant in recovery of internal cavities.
A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring
NASA Technical Reports Server (NTRS)
Doyle, R.
1994-01-01
Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. Previous results on extending traditional anomaly detection techniques are summarized. The focus of this paper is a new technique for attention focusing.
Jang, J; Seo, J K
2015-06-01
This paper describes a multiple background subtraction method in frequency difference electrical impedance tomography (fdEIT) to detect an admittivity anomaly from a high-contrast background conductivity distribution. The proposed method expands the use of the conventional weighted frequency difference EIT method, whose use has so far been limited to detecting admittivity anomalies in a roughly homogeneous background. The proposed method can be viewed as multiple weighted difference imaging in fdEIT. Although the spatial resolutions of the output images by fdEIT are very low due to the inherent ill-posedness, numerical simulations and phantom experiments of the proposed method demonstrate its feasibility to detect anomalies. It has potential application in stroke detection in a head model, which is highly heterogeneous due to the skull.
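The conventional weighted frequency-difference step that the method builds on can be sketched as follows. `weighted_difference` is an illustrative name; alpha is the least-squares weight chosen so that a pure background component common to both frequencies cancels:

```python
def weighted_difference(v_high, v_low):
    """Weighted fdEIT difference: v_high - alpha * v_low, with alpha the
    least-squares projection coefficient of v_high onto v_low, so that a
    background voltage pattern shared by both frequencies is subtracted out."""
    alpha = (sum(a * b for a, b in zip(v_high, v_low)) /
             sum(b * b for b in v_low))
    return [a - alpha * b for a, b in zip(v_high, v_low)]
```

When the two frequency measurements differ only by a scale factor (pure background), the weighted difference vanishes; an admittivity anomaly with frequency-dependent response leaves a nonzero residual, which is what the images display.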
Multi-Level Modeling of Complex Socio-Technical Systems - Phase 1
2013-06-06
is to detect anomalous organizational outcomes, diagnose the causes of these anomalies, and decide upon appropriate compensation schemes. All of...monitor process outcomes. The purpose of this monitoring is to detect anomalous process outcomes, diagnose the causes of these anomalies, and decide upon...monitor work outcomes in terms of performance. The purpose of this monitoring is to detect anomalous work outcomes, diagnose the causes of these anomalies
Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ondrej Linda; Todd Vollmer; Milos Manic
The planned large scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks. The implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures multiple communications streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require specification of a sensitivity threshold, which inevitably leads to a tradeoff between false positives and false negatives rates. In order to alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams allowing for learning accurate normal network behavior models. Furthermore, the developed system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.
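The dynamic-threshold idea can be caricatured without a full Interval Type-2 fuzzy engine: crisp rules stand in for the fuzzy inference, and the anomaly threshold is lowered as the inferred attack possibility grows. All names, the max-aggregation, and the linear adjustment are illustrative assumptions, not the paper's method:

```python
def fuzzy_possibility(rules, features):
    """Each rule is (predicate, possibility); fired rules are combined with
    max, a crude stand-in for the paper's Interval Type-2 fuzzy inference."""
    fired = [poss for pred, poss in rules if pred(features)]
    return max(fired, default=0.0)

def is_anomalous(score, base_threshold, possibility):
    """Dynamic sensitivity: a higher attack possibility lowers the effective
    threshold, trading more false positives for fewer missed attacks."""
    return score > base_threshold * (1.0 - 0.5 * possibility)
```

This captures the stated trade-off: with no contextual evidence of attack the detector stays conservative, and rule-derived suspicion makes it progressively more sensitive.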
Analysis of LANDSAT-4 TM Data for Lithologic and Image Mapping Purpose
NASA Technical Reports Server (NTRS)
Podwysocki, M. H.; Salisbury, J. W.; Bender, L. V.; Jones, O. D.; Mimms, D. L.
1984-01-01
Lithologic mapping techniques using the near infrared bands of the Thematic Mapper onboard the LANDSAT 4 satellite are investigated. These methods are coupled with digital masking to test the capability of mapping geologic materials. Data are examined under medium to low Sun angle illumination conditions to determine the detection limits of materials with absorption features. Several detection anomalies are observed and explained.
First trimester PAPP-A in the detection of non-Down syndrome aneuploidy.
Ochshorn, Y; Kupferminc, M J; Wolman, I; Orr-Urtreger, A; Jaffa, A J; Yaron, Y
2001-07-01
Combined first trimester screening using pregnancy associated plasma protein-A (PAPP-A), free beta-human chorionic gonadotrophin, and nuchal translucency (NT), is currently accepted as probably the best combination for the detection of Down syndrome (DS). Current first trimester algorithms provide computed risks only for DS. However, low PAPP-A is also associated with other chromosome anomalies such as trisomy 13, 18, and sex chromosome aneuploidy. Thus, using currently available algorithms, some chromosome anomalies may not be detected. The purpose of the present study was to establish a low-end cut-off value for PAPP-A that would increase the detection rates for non-DS chromosome anomalies. The study included 1408 patients who underwent combined first trimester screening. To determine a low-end cut-off value for PAPP-A, a Receiver-Operator Characteristic (ROC) curve analysis was performed. In the entire study group there were 18 cases of chromosome anomalies (trisomy 21, 13, 18, sex chromosome anomalies), 14 of which were among screen-positive patients, a detection rate of 77.7% for all chromosome anomalies (95% CI: 55.7-99.7%). ROC curve analysis detected a statistically significant cut-off for PAPP-A at 0.25 MoM. If the definition of screen-positive were to also include patients with PAPP-A<0.25 MoM, the detection rate would increase to 88.8% for all chromosome anomalies (95% CI: 71.6-106%). This low cut-off value may be used until specific algorithms are implemented for non-Down syndrome aneuploidy. Copyright 2001 John Wiley & Sons, Ltd.
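Choosing a low-end marker cut-off from data can be sketched with a brute-force ROC scan that maximizes Youden's J statistic. This is an illustration of the general technique on a "low values are abnormal" marker such as PAPP-A MoM, not the study's actual analysis:

```python
def roc_cutoff(values, labels):
    """Scan candidate cut-offs on a marker where LOW values indicate the
    condition; return the cut-off maximizing Youden's J = sens + spec - 1."""
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v < cut and y)
        fn = sum(1 for v, y in zip(values, labels) if v >= cut and y)
        fp = sum(1 for v, y in zip(values, labels) if v < cut and not y)
        tn = sum(1 for v, y in zip(values, labels) if v >= cut and not y)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j
```

In the study's terms, adding such a data-derived cut-off (0.25 MoM) to the screen-positive definition is what raised the detection rate for non-DS aneuploidies.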
Evaluation schemes for video and image anomaly detection algorithms
NASA Astrophysics Data System (ADS)
Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael
2016-05-01
Video anomaly detection is a critical research area in computer vision. It is a natural first step before applying object recognition algorithms. There are many algorithms that detect anomalies (outliers) in videos and images that have been introduced in recent years. However, these algorithms behave and perform differently based on differences in domains and tasks to which they are subjected. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability in a particular domain/task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. There are many evaluation metrics that have been used in the literature such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these different metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is very critical since the choice can introduce positive or negative bias in the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies. We analyze the biases introduced by these choices by measuring the performance of an existing anomaly detection algorithm.
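One common evaluation scheme decides true versus false detections by spatial overlap. The sketch below uses an intersection-over-union (IoU) threshold with one-to-one matching; this is only one of the scheme choices the paper reviews, and the 0.5 threshold is an illustrative convention:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda b: (b[2] - b[0]) * (b[3] - b[1])
    union = area(box_a) + area(box_b) - inter
    return inter / union if union else 0.0

def precision_recall(detections, truths, thresh=0.5):
    """A detection counts as true if it overlaps an unmatched ground-truth
    anomaly with IoU >= thresh; each truth may be matched at most once."""
    matched, tp = set(), 0
    for d in detections:
        for i, t in enumerate(truths):
            if i not in matched and iou(d, t) >= thresh:
                matched.add(i)
                tp += 1
                break
    prec = tp / len(detections) if detections else 0.0
    rec = tp / len(truths) if truths else 0.0
    return prec, rec
```

Loosening or tightening the matching rule (pixel-level vs. frame-level, one-to-one vs. many-to-one) changes precision and recall for the same detector output, which is exactly the bias the paper warns about.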
Security inspection in ports by anomaly detection using hyperspectral imaging technology
NASA Astrophysics Data System (ADS)
Rivera, Javier; Valverde, Fernando; Saldaña, Manuel; Manian, Vidya
2013-05-01
Applying hyperspectral imaging technology in port security is crucial for the detection of possible threats or illegal activities. One of the most common problems that cargo suffers is tampering. This represents a danger to society because it creates a channel to smuggle illegal and hazardous products. If a cargo is altered, security inspections on that cargo should contain anomalies that reveal the nature of the tampering. Hyperspectral images can detect anomalies by gathering information through multiple electromagnetic bands. The spectrums extracted from these bands can be used to detect surface anomalies from different materials. Based on this technology, a scenario was built in which a hyperspectral camera was used to inspect the cargo for any surface anomalies and a user interface shows the results. The spectrum of items, altered by different materials that can be used to conceal illegal products, is analyzed and classified in order to provide information about the tampered cargo. The image is analyzed with a variety of techniques such as multiple features extracting algorithms, autonomous anomaly detection, and target spectrum detection. The results will be exported to a workstation or mobile device in order to show them in an easy-to-use interface. This process could enhance the current capabilities of security systems that are already implemented, providing a more complete approach to detect threats and illegal cargo.
A new comparison of hyperspectral anomaly detection algorithms for real-time applications
NASA Astrophysics Data System (ADS)
Díaz, María; López, Sebastián; Sarmiento, Roberto
2016-10-01
Due to the high spectral resolution that remotely sensed hyperspectral images provide, there has been an increasing interest in anomaly detection. The aim of anomaly detection is to single out pixels whose spectral signature differs significantly from the background spectra. Basically, anomaly detectors mark pixels with a certain score, considering as anomalies those whose scores are higher than a threshold. Receiver Operating Characteristic (ROC) curves have been widely used as an assessment measure in order to compare the performance of different algorithms. ROC curves are graphical plots which illustrate the trade-off between false positive and true positive rates. However, they are limited for making deep comparisons because they discard relevant factors required in real-time applications such as run times, costs of misclassification and the ability to mark anomalies with high scores. This last fact is fundamental in anomaly detection in order to distinguish them easily from the background without any posterior processing. An extensive set of simulations has been made using different anomaly detection algorithms, comparing their performances and efficiencies using several extra metrics in order to complement ROC curve analysis. Results support our proposal and demonstrate that ROC curves do not by themselves provide a good visualization of detection performance. Moreover, a figure of merit has been proposed in this paper which encompasses in a single global metric all the measures yielded by the proposed additional metrics. This figure, named Detection Efficiency (DE), takes into account several crucial types of performance assessment that ROC curves do not consider. Results demonstrate that algorithms with the best detection performances according to ROC curves do not have the highest DE values. Consequently, the recommendation to use extra measures to properly evaluate performance has been supported and justified by the conclusions drawn from the simulations.
NASA Astrophysics Data System (ADS)
Zhang, Xing; Wen, Gongjian
2015-10-01
Anomaly detection (AD) becomes increasingly important in hyperspectral imagery analysis with many practical applications. The local orthogonal subspace projection (LOSP) detector is a popular anomaly detector which exploits local endmembers/eigenvectors around the pixel under test (PUT) to construct a background subspace. However, this subspace only takes advantage of the spectral information, while the spatial correlation of the background clutter is neglected, which leaves the anomaly detection result sensitive to the accuracy of the estimated subspace. In this paper, a local three-dimensional orthogonal subspace projection (3D-LOSP) algorithm is proposed. Firstly, with the joint use of both spectral and spatial information, three directional background subspaces are created along the image height direction, the image width direction and the spectral direction, respectively. Then, the three corresponding orthogonal subspaces are calculated. After that, each vector along the three directions of the local cube is projected onto the corresponding orthogonal subspace. Finally, a composite score is given through the three directional operators. In 3D-LOSP, anomalies are redefined as targets that are not only spectrally different from the background, but also spatially distinct. Thanks to the addition of the spatial information, the robustness of the anomaly detection result is greatly improved by the proposed 3D-LOSP algorithm. It is noteworthy that the proposed algorithm is an expansion of LOSP, and this ideology can inspire many other spectral-based anomaly detection methods. Experiments with real hyperspectral images have proved the stability of the detection result.
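The core orthogonal-subspace-projection score underlying LOSP-style detectors can be sketched as follows. This uses pure-Python Gram-Schmidt for clarity (real implementations use matrix decompositions) and omits the local-window and three-directional machinery of 3D-LOSP:

```python
def gram_schmidt(vectors):
    """Orthonormal basis for the span of the background vectors."""
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            dot = sum(x * y for x, y in zip(w, b))
            w = [x - dot * y for x, y in zip(w, b)]
        norm = sum(x * x for x in w) ** 0.5
        if norm > 1e-10:
            basis.append([x / norm for x in w])
    return basis

def osp_score(pixel, background):
    """Norm of the residual after projecting the pixel spectrum out of the
    background subspace; a large residual flags a spectral anomaly."""
    basis = gram_schmidt(background)
    r = list(pixel)
    for b in basis:
        dot = sum(x * y for x, y in zip(r, b))
        r = [x - dot * y for x, y in zip(r, b)]
    return sum(x * x for x in r) ** 0.5
```

3D-LOSP computes a score of this kind along each of the three directions of the local cube and combines them, so that a detection requires the pixel to deviate both spectrally and spatially.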
Rotor Smoothing and Vibration Monitoring Results for the US Army VMEP
2009-06-01
individual component CI detection thresholds, and development of models for diagnostics, prognostics, and anomaly detection. Figure 16 VMEP Server...and prognostics are of current interest. Development of those systems requires large amounts of data (collection, monitoring, manipulation) to capture...development of automated systems and for continuous updating of algorithms to improve detection, classification, and prognostic performance. A test
Sabokrou, Mohammad; Fayyaz, Mohsen; Fathy, Mahmood; Klette, Reinhard
2017-02-17
This paper proposes a fast and reliable method for anomaly detection and localization in video data showing crowded scenes. Time-efficient anomaly localization is an ongoing challenge and the subject of this paper. We propose a cubic-patch-based method, characterised by a cascade of classifiers, which makes use of an advanced feature-learning approach. Our cascade of classifiers has two main stages. First, a light but deep 3D auto-encoder is used for early identification of "many" normal cubic patches. This deep network operates on small cubic patches at the first stage, before carefully resizing the remaining candidates of interest and evaluating those at the second stage using a more complex and deeper 3D convolutional neural network (CNN). We divide the deep auto-encoder and the CNN into multiple sub-stages which operate as cascaded classifiers. Shallow layers of the cascaded deep networks (designed as Gaussian classifiers, acting as weak single-class classifiers) detect "simple" normal patches such as background patches, and more complex normal patches are detected at deeper layers. It is shown that the proposed novel technique (a cascade of two cascaded classifiers) performs comparably to current top-performing detection and localization methods on standard benchmarks, but outperforms those in general with respect to required computation time.
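The cascade logic, independent of the deep networks themselves, reduces to filtering patches with successively costlier classifiers. `cascade_classify` and the stage scores below are illustrative stand-ins for the auto-encoder and CNN stages:

```python
def cascade_classify(patches, stages):
    """Each stage is (score_fn, threshold): patches scoring below the
    threshold are accepted as normal and dropped early; survivors are passed
    to the next, costlier stage. Whatever survives all stages is flagged
    anomalous."""
    survivors = list(patches)
    for score, thresh in stages:
        survivors = [p for p in survivors if score(p) >= thresh]
    return survivors
```

Because most patches in a typical scene are "simple" normal background, the cheap first stage discards the bulk of the input, and the expensive second stage only ever sees a small candidate set; this is what makes the overall method time-efficient.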
Automated Propulsion Data Screening demonstration system
NASA Technical Reports Server (NTRS)
Hoyt, W. Andes; Choate, Timothy D.; Whitehead, Bruce A.
1995-01-01
A fully-instrumented firing of a propulsion system typically generates a very large quantity of data. In the case of the Space Shuttle Main Engine (SSME), data analysis from ground tests and flights is currently a labor-intensive process. Human experts spend a great deal of time examining the large volume of sensor data generated by each engine firing. These experts look for any anomalies in the data which might indicate engine conditions warranting further investigation. The contract effort was to develop a 'first-cut' screening system for application to SSME engine firings that would identify the relatively small volume of data which is unusual or anomalous in some way. With such a system, limited and expensive human resources could focus on this small volume of unusual data for thorough analysis. The overall project objective was to develop a fully operational Automated Propulsion Data Screening (APDS) system with the capability of detecting significant trends and anomalies in transient and steady-state data. However, the effort limited screening of transient data to ground test data for throttle-down cases typical of the 3-g acceleration, and for engine throttling required to reach the maximum dynamic pressure limits imposed on the Space Shuttle. This APDS is based on neural networks designed to detect anomalies in propulsion system data that are not part of the data used for neural network training. The delivered system allows engineers to build their own screening sets for application to completed or planned firings of the SSME. ERC developers also built some generic screening sets that NASA engineers could apply immediately to their data analysis efforts.
An Analysis of Drop Outs and Unusual Behavior from Primary and Secondary Radar
NASA Astrophysics Data System (ADS)
Allen, Nicholas J.
An evaluation of the radar systems in the Red River Valley of North Dakota (ND) and its surrounding areas for their ability to provide Detect and Avoid (DAA) capabilities for manned and unmanned aircraft systems (UAS) was performed. Additionally, the data was analyzed for its feasibility for use in autonomous Air Traffic Control (ATC) systems in the future. With the almost certain increase in airspace congestion over the coming years, the need for a robust and accurate radar system is crucial. This study focused on the Airport Surveillance Radar (ASR) at Fargo, ND and the Air Route Surveillance Radar at Finley, ND. Each of these radar sites contains primary and secondary radars. It was found that both locations exhibit data anomalies such as drop outs, altitude outliers, prolonged altitude failures, repeated data, and multiple aircraft with the same identification (ID) number. Four weeks of data provided by Harris Corporation throughout the year were analyzed using a MATLAB algorithm developed to identify the data anomalies. The results showed that Fargo intercepts on average 450 aircraft, while Finley intercepts 1274. Of these aircraft, an average of 34% experienced drop outs at Fargo and 69% at Finley. With average drop outs of 23.58 seconds at Fargo and 42.45 seconds at Finley, and several lasting more than several minutes, these data anomalies can persist for extended periods of time. Between 1% and 26% of aircraft experienced the other data anomalies, depending on the type of data anomaly and location. The largest proportion of data anomalies was experienced when aircraft were near airports or the edge of the effective radar radius. It was also discovered that drop outs, altitude outliers, and repeated data are radar-induced errors, while prolonged altitude failures and multiple aircraft with the same ID are transponder-induced errors.
A risk assessment for each data anomaly, based on the severity and occurrence of the event, was also produced. The findings from this report will provide meaningful data and likely influence the development of UAS DAA logic and the logic behind autonomous ATC systems.
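Gap-based drop-out detection of the kind described can be sketched directly: for each aircraft ID, flag any interval between consecutive radar returns longer than a threshold. The function names and the 12-second default gap are illustrative assumptions, not the study's criterion:

```python
def find_dropouts(timestamps, min_gap=12.0):
    """Return (start, end, duration) for every gap between consecutive
    radar returns longer than `min_gap` seconds."""
    ts = sorted(timestamps)
    return [(a, b, b - a) for a, b in zip(ts, ts[1:]) if b - a > min_gap]

def dropouts_by_aircraft(tracks, min_gap=12.0):
    """`tracks` maps aircraft ID -> list of return timestamps; only aircraft
    with at least one drop out appear in the result."""
    return {aid: gaps for aid, ts in tracks.items()
            if (gaps := find_dropouts(ts, min_gap))}
```

Per-aircraft gap lists of this form are enough to reproduce summary statistics such as the share of aircraft affected and the average drop-out duration at each site.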
Prevalence and distribution of selected dental anomalies among saudi children in Abha, Saudi Arabia
2016-01-01
Background Dental anomalies are not an unusual finding in routine dental examination. The effect of dental anomalies can lead to functional, esthetic and occlusal problems. The purpose of the study was to determine the prevalence and distribution of selected developmental dental anomalies in Saudi children. Material and Methods The study was based on clinical examination and panoramic radiographs of children who visited the Pediatric Dentistry clinics at King Khalid University College of Dentistry, Saudi Arabia. These patients were examined for dental anomalies in size, shape, number, structure and position. Data collected were entered and analyzed using the Statistical Package for the Social Sciences. Results Of the 1252 children (638 boys, 614 girls) examined, 318 subjects (25.39%) presented with selected dental anomalies. The distribution by gender was 175 boys (27.42%) and 143 girls (23.28%). On intergroup comparison, number anomalies were the most common category, with hypodontia (9.7%) being the most common anomaly in Saudi children, followed by hyperdontia (3.5%). The prevalence of size anomalies was microdontia (2.6%) and macrodontia (1.8%). The prevalence of shape anomalies was talon cusp (1.4%), taurodontism (1.4%), and fusion (0.8%). The prevalence of positional anomalies was ectopic eruption (2.3%) and rotation (0.4%). The prevalence of structural anomalies was amelogenesis imperfecta (0.3%) and dentinogenesis imperfecta (0.1%). Conclusions A significant number of children had a dental anomaly, with hypodontia being the most common anomaly and dentinogenesis imperfecta the rarest in the study. Early detection and management of these anomalies can avoid potential orthodontic and esthetic problems in a child. Key words: Dental anomalies, children, Saudi Arabia. PMID:27957258
Stratospheric column NO2 anomalies over Russia related to the 2011 Arctic ozone hole
NASA Astrophysics Data System (ADS)
Aheyeva, Viktoryia; Gruzdev, Aleksandr; Elokhov, Aleksandr; Grishaev, Mikhail; Salnikova, Natalia
2013-04-01
We analyze data of spectrometric measurements of stratospheric column NO2 contents at the mid- and high-latitude stations of Zvenigorod (55.7°N, Moscow region), Tomsk (56.5°N, West Siberia), and Zhigansk (66.8°N, East Siberia). Measurements are made in the visible spectral range with zenith-viewing spectrometers during morning and evening twilight. Alongside column NO2 contents, vertical profiles of NO2 are retrieved at the Zvenigorod station. Zvenigorod and Zhigansk are measurement stations within the Network for the Detection of Atmospheric Composition Change (NDACC). To interpret the NO2 results, data from Ozone Monitoring Instrument measurements of total column ozone and rawinsonde data are also analyzed, and back trajectories calculated with the HYSPLIT trajectory model are used. Significant negative anomalies in stratospheric NO2 columns, accompanied by episodes of significant cooling of the stratosphere and decreases in total ozone, were observed at the three stations in the winter-spring period of 2011. Trajectory analysis shows that the anomalies were caused by the transport of stratospheric air from the region of the ozone hole observed that season in the Arctic. Although negative NO2 anomalies due to transport from the Arctic were also observed in some other years, the 2011 anomalies had record magnitudes. Analysis of NO2 vertical profiles at Zvenigorod shows that the 2011 NO2 anomaly, unlike those of other years, was additionally driven by denitrification of the Arctic lower stratosphere. The NO2 profiles suggest that some degree of denitrification probably persisted even after the ozone hole disappeared.
Engine Data Interpretation System (EDIS), phase 2
NASA Technical Reports Server (NTRS)
Cost, Thomas L.; Hofmann, Martin O.
1991-01-01
A prototype of an expert system was developed which applies qualitative constraint-based reasoning to the task of post-test analysis of data resulting from a rocket engine firing. Data anomalies are detected and corresponding faults are diagnosed. Engine behavior is reconstructed using measured data and knowledge about engine behavior. Knowledge about common faults guides but does not restrict the search for the best explanation in terms of hypothesized faults. The system contains domain knowledge about the behavior of common rocket engine components and was configured for use with the Space Shuttle Main Engine (SSME). A graphical user interface allows an expert user to intimately interact with the system during diagnosis. The system was applied to data taken during actual SSME tests where data anomalies were observed.
Privacy-preserving outlier detection through random nonlinear data distortion.
Bhaduri, Kanishka; Stefanski, Mark D; Srivastava, Ashok N
2011-02-01
Consider a scenario in which the data owner has some private or sensitive data and wants a data miner to access them for studying important patterns without revealing the sensitive information. Privacy-preserving data mining aims to solve this problem by randomly transforming the data prior to their release to the data miners. Previous works only considered the case of linear data perturbations--additive, multiplicative, or a combination of both--for studying the usefulness of the perturbed output. In this paper, we discuss nonlinear data distortion using potentially nonlinear random data transformation and show how it can be useful for privacy-preserving anomaly detection from sensitive data sets. We develop bounds on the expected accuracy of the nonlinear distortion and also quantify privacy by using standard definitions. The highlight of this approach is to allow a user to control the amount of privacy by varying the degree of nonlinearity. We show how our general transformation can be used for anomaly detection in practice for two specific problem instances: a linear model and a popular nonlinear model using the sigmoid function. We also analyze the proposed nonlinear transformation in full generality and then show that, for specific cases, it is distance preserving. A main contribution of this paper is the discussion between the invertibility of a transformation and privacy preservation and the application of these techniques to outlier detection. The experiments conducted on real-life data sets demonstrate the effectiveness of the approach.
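As a rough illustration of the distortion idea described above, the sketch below applies a private random linear map followed by an elementwise sigmoid, with a parameter `c` controlling the degree of nonlinearity (and hence, in the paper's framing, the amount of privacy). This is a minimal sketch of the general approach under assumed conventions, not the paper's actual algorithm; all names and values are hypothetical.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def random_distortion(data, c=1.0, seed=0):
    """Distort each record with a random linear map followed by an
    elementwise sigmoid. Larger c -> stronger nonlinearity -> more privacy,
    at the cost of fidelity for downstream outlier detection."""
    rng = random.Random(seed)
    dim = len(data[0])
    # random projection matrix, kept private by the data owner
    R = [[rng.gauss(0, 1) / math.sqrt(dim) for _ in range(dim)] for _ in range(dim)]
    out = []
    for x in data:
        proj = [sum(R[i][j] * x[j] for j in range(dim)) for i in range(dim)]
        out.append([sigmoid(c * p) for p in proj])
    return out

records = [[0.1, 0.2], [0.1, 0.3], [5.0, 5.0]]   # last record is an outlier
distorted = random_distortion(records, c=0.5)
```

For small `c` the sigmoid is nearly linear, so pairwise distances (and hence outlier structure) are approximately preserved; as `c` grows, the transform saturates and becomes harder to invert.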
Data based abnormality detection
NASA Astrophysics Data System (ADS)
Purwar, Yashasvi
Data-based abnormality detection is a growing research field focused on extracting information from feature-rich data. Such methods are considered non-intrusive and non-destructive, which gives them a clear advantage over conventional methods. In this study, we explore different streams of data-based anomaly detection. We propose extensions and revisions to an existing valve-stiction detection algorithm, supported by an industrial case study. We also explore the area of image analysis and propose a complete solution for malaria diagnosis. The proposed method is tested on images provided by the pathology laboratory at Alberta Health Service. We also address the robustness and practicality of the proposed solution.
2013-09-26
vehicle-lengths between frames. The low specificity of object detectors in WAMI means all vehicle detections are treated equally. Motion clutter...timing of the anomaly. If an anomaly was detected, recent activity would have priority over older activity. This is due to the reasoning that if the...this could be a potential anomaly detected. Other baseline activities include normal work hours, religious observance times and interactions between
Critical Infrastructure Protection and Resilience Literature Survey: Modeling and Simulation
2014-11-01
Below the yellow set is a purple cluster bringing together detection, anomaly, intrusion, sensors, monitoring and alerting (early...hazards and threats to security56 Water ADWICE, PSS®SINCAL ADWICE for real-time anomaly detection in water management systems57 One tool that...Systems. Cybernetics and Information Technologies. 2008;8(4):57-68. 57. Raciti M, Cucurull J, Nadjm-Tehrani S. Anomaly detection in water management
Symbolic Time-Series Analysis for Anomaly Detection in Mechanical Systems
2006-08-01
Amol Khatkhate, Asok Ray, Fellow, IEEE, Eric Keller, Shalabh Gupta, and Shin C. Chin. Abstract—This paper examines the efficacy of a novel method for...recognition. Asok Ray (F'02) received graduate degrees in electrical...anomaly detection has been proposed by Ray [6], where the underlying information on the dynamical behavior of complex systems is derived based on
Chemical Compositions and Abundance Anomalies in Stellar Coronae ADP 99
NASA Technical Reports Server (NTRS)
Oliversen, Ronald J. (Technical Monitor); Drake, Jeremy
2004-01-01
New atomic data for tackling some of our spectra have been investigated by co-I Laming (NRL), including the effects of recombination on spectral line fluxes that are not included in, for example, the CHIANTI database models. Promising new progress has been made with modelling some of the recent abundance anomaly results in terms of Alfvén wave-driven separation of neutrals and ions in the upper chromosphere. The problem with existing models is that they cannot simultaneously explain the low-FIP enhanced solar-like coronae and the high-FIP rich active coronae of RS CVn-like stars. The Alfvén wave model shows promise in both of these scenarios, with the fractionation or suppression of low-FIP ions depending on the characteristics of the chromosphere. This work is currently being written up. In summary, the work to date is making good progress in mapping abundance anomalies as a function of spectral type and activity level. We are also making good progress with modelling that we will be able to test against our observational results. With one more year of effort, we anticipate that the bulk of the work described above can be published, together with outstanding key studies on anomalies among the different active binaries.
NASA Astrophysics Data System (ADS)
Eto, S.; Nagai, S.; Tadokoro, K.
2011-12-01
Our group has developed a system for observing seafloor crustal deformation that combines acoustic ranging with kinematic GPS positioning. One effective way to reduce the estimation error of submarine benchmark positions in our system is to model the variation of ocean acoustic velocity. We estimated various one-dimensional velocity-depth models under constraints, because our simple acquisition procedure for acoustic ranging data makes it difficult to estimate a three-dimensional acoustic velocity structure including its temporal change. We then applied the joint hypocenter determination method from seismology [Kissling et al., 1994] to the acoustic ranging data, with two constraints in the inversion procedure: 1) the acoustic velocity in the deeper part is fixed, because it is usually stable in both space and time, and 2) each inverted velocity model must decrease with depth. We found two remarkable spatio-temporal changes of acoustic velocity: 1) variations of travel-time residuals at the same points within a short time, and 2) larger differences between residuals at neighboring points, derived from travel times from different benchmarks. The first result cannot be explained solely by changes in atmospheric conditions, including heating by sunlight. To verify the residual variations of the second result, we performed forward modeling of acoustic ranging data using velocity models with added velocity anomalies, calculating travel times with a pseudo-bending ray tracing method [Um and Thurber, 1987] to examine the effect of a velocity anomaly on travel-time differences. Comparison between the observed residuals and the travel-time differences from the forward modeling shows that velocity anomaly bodies at shallow depth can produce the anomalous residuals, which may indicate moving water bodies.
We need to apply an acoustic velocity structure model with velocity anomalies in acoustic ranging data analysis, and/or to develop a new system with a large number of sea-surface stations to detect such anomalies, which may reduce the error of the seafloor benchmark position.
What controls the variability of oxygen in the subpolar North Pacific?
NASA Astrophysics Data System (ADS)
Takano, Yohei
Dissolved oxygen is a widely observed chemical quantity in the oceans, along with temperature and salinity. Changes in dissolved oxygen have been observed over the world oceans. Observed oxygen at Ocean Station Papa (OSP, 50°N, 145°W) in the Gulf of Alaska exhibits strong variability over interannual and decadal timescales; however, the mechanisms driving the observed variability are not yet fully understood. Furthermore, irregular sampling frequency and a relatively short record length make it difficult to detect low-frequency variability. Motivated by these observations, we investigate the mechanisms driving the low-frequency variability of oxygen in the subpolar North Pacific. The specific purposes of this study are (1) to evaluate the robustness of the observed low-frequency variability of dissolved oxygen and (2) to determine the mechanisms driving the observed variability using statistical data analysis and numerical simulations. To evaluate the robustness of the low-frequency variability, we conducted spectral analyses of the observed oxygen at OSP. To address the irregular sampling frequency, we randomly sub-sampled the raw data to form 500 ensemble members with a regular time interval, and then performed spectral analyses. The resulting power spectrum of oxygen exhibits robust low-frequency variability, and a statistically significant spectral peak is identified at a timescale of 15-20 years. The wintertime oceanic barotropic streamfunction is significantly correlated with the observed oxygen anomaly at OSP, with a north-south dipole structure over the North Pacific. We hypothesize that the observed low-frequency variability is primarily driven by the variability of large-scale ocean circulation in the North Pacific. To test this hypothesis, we simulate the three-dimensional distribution of the oxygen anomaly from 1952 to 2001 using data-constrained circulation fields.
The simulated oxygen anomaly shows outstanding variability in the Gulf of Alaska, showing that this region is a hotspot of oxygen fluctuation. Anomalous advection acting on the climatological mean oxygen gradient is the source of oxygen variability in this simulation. Empirical Orthogonal Function (EOF) analyses of the simulated oxygen show that the two dominant modes of the oxygen anomaly explain more than 50% of the oxygen variance over the North Pacific and are closely related to the dominant modes of climate variability in the North Pacific (the Pacific Decadal Oscillation and the North Pacific Oscillation). Our results imply an important link between large-scale climate fluctuations, ocean circulation and biogeochemical tracers in the North Pacific.
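The ensemble subsampling strategy described in this abstract (randomly subsampling irregular observations onto a regular grid many times, then averaging the spectra) can be sketched roughly as below. The synthetic ~16.7-year oscillation, the grid resolution, and the function names are illustrative assumptions, not the study's actual data or code.

```python
import math
import random

def periodogram(x):
    """Naive DFT power spectrum of a regularly sampled, demeaned series."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]
    power = []
    for k in range(1, n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power.append((re * re + im * im) / n)
    return power          # power[k-1] is the power at k cycles per record

def ensemble_spectrum(times, values, n_grid, n_members=30, seed=2):
    """Repeatedly map irregular observations onto a regular grid (picking a
    random observation inside each grid cell) and average the periodograms:
    one simple way to handle irregular sampling."""
    rng = random.Random(seed)
    t0, t1 = min(times), max(times)
    width = (t1 - t0) / n_grid
    cells = [[] for _ in range(n_grid)]
    for t, v in zip(times, values):
        i = min(int((t - t0) / width), n_grid - 1)
        cells[i].append(v)
    acc = [0.0] * (n_grid // 2)
    for _ in range(n_members):
        grid = [rng.choice(c) if c else 0.0 for c in cells]
        for j, p in enumerate(periodogram(grid)):
            acc[j] += p
    return [(k + 1, a / n_members) for k, a in enumerate(acc)]

# synthetic demo: a ~16.7-year oscillation observed at 250 irregular times
rng = random.Random(0)
times = sorted(rng.uniform(0.0, 50.0) for _ in range(250))
values = [math.sin(2 * math.pi * t / (50.0 / 3.0)) + rng.gauss(0, 0.3) for t in times]
spectrum = ensemble_spectrum(times, values, n_grid=50)
peak_k = max(spectrum, key=lambda kp: kp[1])[0]   # cycles per 50-year record
```

With the synthetic series above, the averaged spectrum peaks near 3 cycles per 50 years, i.e. the injected ~16.7-year period, despite the irregular sampling.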
Autonomous Soil Assessment System: A Data-Driven Approach to Planetary Mobility Hazard Detection
NASA Astrophysics Data System (ADS)
Raimalwala, K.; Faragalli, M.; Reid, E.
2018-04-01
The Autonomous Soil Assessment System predicts mobility hazards for rovers. Its development and performance are presented, with focus on its data-driven models, machine learning algorithms, and real-time sensor data fusion for predictive analytics.
ERIC Educational Resources Information Center
Hutner, Todd L.; Markman, Arthur B.
2017-01-01
Two anomalies continue to confound researchers and science teacher educators. First, new science teachers are quick to discard the pedagogy and practices that they learn in their teacher education programs in favor of a traditional, didactic approach to teaching science. Second, a discrepancy exists at all stages of science teachers' careers…
NASA Astrophysics Data System (ADS)
Garcia, Xavier; Monteys, Xavier; Evans, Rob L.; Szpak, Michal
2014-04-01
During the Irish National Seabed Survey (INSS) in 2003, a gas-related pockmark field was discovered and extensively mapped in the Malin Shelf region (NW Ireland). In summer 2006, additional complementary data involving core sample analysis, multibeam and single-beam backscatter classification, and a marine controlled-source electromagnetic survey were obtained at specific locations. This multidisciplinary approach allowed us to map the upper 20 m of the seabed in unprecedented detail and to correlate the main geophysical parameters with the geological properties of the seabed. The EM data provide information about sediment conductivity, which can be used as a proxy for porosity and to identify the presence of fluid and fluid migration pathways. We conclude that, as a whole, the central part of the Malin basin is characterized by higher conductivities, which we interpret as a lithological change. Within the basin, several areas are characterized by conductive anomalies associated with fluid flow processes and potentially the presence of microbial activity, as suggested by previous work. Pockmark structures show a characteristic electrical signature, with high-conductivity anomalies on the edges and less conductive, homogeneous interiors with several high-conductivity anomalies, potentially associated with gas-driven microbial activity.
Surface ocean carbon isotope anomalies on glacial terminations: An alternative view
NASA Astrophysics Data System (ADS)
Lund, D. C.; Cote, M.; Schmittner, A.
2016-12-01
Late Pleistocene glacial terminations are characterized by surface ocean carbon isotope minima on a global scale. During the last deglaciation (i.e. Termination 1), planktonic foraminiferal δ13C anomalies occurred in the Atlantic, Indian, Pacific, and Southern Oceans. Despite the apparently ubiquitous nature of δ13C anomalies on glacial terminations, their cause remains a matter of ongoing debate. The prevailing view is that isotopically light carbon from the abyss was upwelled in the Southern Ocean, resulting in outgassing of 13C-depleted carbon to the atmosphere and its advection to lower latitudes via mode and intermediate waters (Spero and Lea, 2002). Alternatively, carbon isotope minima may be driven by weakening of the biological pump related to circulation-driven changes in the oceanic preformed nutrient budget (Schmittner and Lund, 2015). Here we assess the deep upwelling and biological pump hypotheses using a new compilation of 70 globally-distributed planktonic δ13C records from the published literature. We find that 1) the mean deglacial δ13C anomaly is similar in all ocean basins, 2) the eastern tropical Pacific yields smaller mean δ13C anomalies than the western tropical Pacific, and 3) δ13C anomalies in the Southern Ocean decrease with increasing latitude. Our results are generally inconsistent with the deep upwelling hypothesis, which predicts that the δ13C signal should be largest in the Southern Ocean and upwelling regions. Instead, the spatial pattern in δ13C anomalies supports the biological pump hypothesis, which predicts that reduced export of light carbon from the euphotic zone triggers negative carbon isotope anomalies in the surface ocean and positive anomalies at intermediate depths. Upwelling of relatively 13C-enriched intermediate waters tends to moderate carbon isotope minima in upwelling regions. 
Our results suggest that the initial rise in atmospheric CO2 during Termination 1 was likely due to weakening of the biological pump associated with a reduction in the Atlantic Meridional Overturning Circulation, consistent with model results (Schmittner and Lund, 2015). Spero, H., and D. Lea (2002) Science 296, 522-525. Schmittner, A., and D. Lund (2015) Climate of the Past 11, 135-152.
NASA Astrophysics Data System (ADS)
Jay, J.; Pritchard, M. E.; Mares, P. J.; Mnich, M. E.; Welch, M. D.; Melkonian, A. K.; Aguilera, F.; Naranjo, J.; Sunagua, M.; Clavero, J. E.
2011-12-01
We examine 153 volcanoes and geothermal areas in the central, southern, and austral Andes for temperature anomalies between 2000 and 2011 from two different spaceborne sensors: 1) anomalies automatically detected by the MODVOLC algorithm (Wright et al., 2004) applied to MODIS data, and 2) hotspots manually identified in nighttime ASTER images. Based on previous work, we expected to find 8 thermal anomalies (volcanoes: Ubinas, Villarrica, Copahue, Láscar, Llaima, Chaitén, Puyehue-Cordón Caulle, Chiliques). We document 31 volcanic areas with pixel-integrated temperatures 4 to more than 100 K above background in at least two images, and another 29 areas with questionable hotspots showing either smaller anomalies or a hotspot in only one image. Most of the thermal anomalies are related to known activity (lava and pyroclastic flows, growing lava domes, fumaroles, and lakes), while others are of unknown origin or reflect activity at volcanoes that were not thought to be active. A handful of volcanoes exhibit temporal variations in the magnitude and location of their temperature anomaly that can be related to both documented and undocumented pulses of activity. Our survey reveals that low-amplitude volcanic hotspots detectable from space are more common than expected (based on lower-resolution data) and that these features could be more widely used to monitor changes in the activity of remote volcanoes. We find that the shape, size, magnitude, and location on the volcano of the thermal anomaly vary significantly from volcano to volcano, and these variations should be considered when developing algorithms for hotspot identification and detection. We compare our thermal results to satellite InSAR measurements of volcanic deformation and find that there is no simple relationship between deformation and thermal anomalies: while 31 volcanoes have continuous hotspots, at least 17 volcanoes in the same area have exhibited deformation, and these lists do not completely overlap.
In order to investigate the relationship between seismic and thermal volcanic activity, we examine seismic data for 5 of the volcanoes (Uturuncu, Olca-Paruma, Ollague, Irruputuncu, and Sol de Mañana) as well as seismological reports from the Chilean geological survey SERNAGEOMIN for 11 additional volcanoes. Although there were 7 earthquakes with Mw > 7 in our study area from 2000-2010, there is essentially no evidence from ASTER or MODVOLC that the thermal anomalies were affected by seismic shaking.
NASA Astrophysics Data System (ADS)
Hood, L. L.; Spudis, P. D.
2016-11-01
Approximate maps of the lunar crustal magnetic field at low altitudes in the vicinities of the three Imbrian-aged impact basins, Orientale, Schrödinger, and Imbrium, have been constructed using Lunar Prospector and Kaguya orbital magnetometer data. Detectable anomalies are confirmed to be present well within the rims of Imbrium and Schrödinger. Anomalies in Schrödinger are asymmetrically distributed about the basin center, while a single isolated anomaly is most clearly detected within Imbrium northwest of Timocharis crater. The subsurface within these basins was heated to high temperatures at the time of impact and required long time periods (up to 1 Myr) to cool below the Curie temperature for metallic iron remanence carriers (1043 K). Therefore, consistent with laboratory analyses of returned samples, a steady, long-lived magnetizing field, i.e., a former core dynamo, is inferred to have existed when these basins formed. The asymmetrical distribution within Schrödinger suggests partial demagnetization by later volcanic activity when the dynamo field was much weaker or nonexistent. However, it remains true that anomalies within Imbrian-aged basins are much weaker than those within most Nectarian-aged basins. The virtual absence of anomalies within Orientale where impact melt rocks (the Maunder Formation) are exposed at the surface is difficult to explain unless the dynamo field was much weaker during the Imbrian period.
Learning to Classify with Possible Sensor Failures
2014-05-04
SVMs), have demonstrated good classification performance when the training data is representative of the test data [1, 2, 3]. However, in many real...Detection of people and animals using non-imaging sensors," Information Fusion (FUSION), 2011 Proceedings of the 14th International Conference on, pp...classification methods in terms of both classification accuracy and anomaly detection rate using
Detection and Classification of Network Intrusions Using Hidden Markov Models
2002-01-01
2.2.3 High-level state machines for misuse detection...2.2.4 EMERALD...Solaris host audit data to detect Solaris R2L (Remote-to-Local) and U2R (User-to-Root) attacks. ...login as a legitimate user on a local system and use a...as suspicious rather than the entire login session, and it can detect some anomalies that are difficult to detect with traditional approaches.
Survey of Machine Learning Methods for Database Security
NASA Astrophysics Data System (ADS)
Kamra, Ashish; Ber, Elisa
Application of machine learning techniques to database security is an emerging area of research. In this chapter, we present a survey of various approaches that use machine learning/data mining techniques to enhance the traditional security mechanisms of databases. There are two key database security areas in which these techniques have found applications, namely, detection of SQL Injection attacks and anomaly detection for defending against insider threats. Apart from the research prototypes and tools, various third-party commercial products are also available that provide database activity monitoring solutions by profiling database users and applications. We present a survey of such products. We end the chapter with a primer on mechanisms for responding to database anomalies.
Relationships between Rwandan seasonal rainfall anomalies and ENSO events
NASA Astrophysics Data System (ADS)
Muhire, I.; Ahmed, F.; Abutaleb, K.
2015-10-01
This study aims primarily at investigating the relationships between Rwandan seasonal rainfall anomalies and El Niño-Southern Oscillation (ENSO) events. The study is useful for early warning of the negative effects associated with extreme rainfall anomalies across the country. It covers the period 1935-1992, using long-rains and short-rains data from 28 weather stations in Rwanda and ENSO events sourced from Glantz (2001). Mean standardized anomaly indices were calculated to investigate their associations with ENSO events. One-way analysis of variance was applied to the mean standardized anomaly index values per ENSO event to explore the spatial correlation of rainfall anomalies per ENSO event. A geographical information system was used to present spatially the variations in mean standardized anomaly indices per ENSO event. The results showed approximately three climatic periods, namely a dry period (1935-1960), a semi-humid period (1961-1976) and a wet period (1977-1992). Though both positive and negative correlations were detected between extreme short-rains anomalies and El Niño events, La Niña events were mostly linked to negative rainfall anomalies, while El Niño events were associated with positive rainfall anomalies. The occurrence of El Niño and La Niña in the same year does not show any clear association with rainfall anomalies; however, the phenomenon was more linked with positive long-rains anomalies and negative short-rains anomalies. Normal years were largely linked with negative long-rains anomalies and positive short-rains anomalies, which points to the influence of factors other than ENSO events. This makes it difficult to project seasonal rainfall anomalies in the country by merely predicting ENSO events.
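A mean standardized anomaly index of the kind used in this study is straightforward to compute: z-score each station's seasonal record against its own climatology, then average across stations year by year. The sketch below uses invented toy rainfall numbers purely for illustration.

```python
import statistics

def standardized_anomalies(series):
    """z-score one station's seasonal rainfall record against its own climatology."""
    mu = statistics.mean(series)
    sigma = statistics.stdev(series)
    return [(x - mu) / sigma for x in series]

def mean_anomaly_index(station_records):
    """Average the standardized anomalies across stations, year by year,
    to obtain a single regional index per year."""
    z = [standardized_anomalies(s) for s in station_records]
    n_years = len(station_records[0])
    return [sum(zs[y] for zs in z) / len(z) for y in range(n_years)]

# toy seasonal rainfall totals (mm): three stations, five seasons
stations = [
    [900, 1100, 1000, 700, 1300],
    [400, 600, 500, 300, 700],
    [1200, 1400, 1300, 900, 1700],
]
index = mean_anomaly_index(stations)   # season 4 is dry, season 5 is wet everywhere
```

Standardizing per station before averaging keeps wet highland stations from dominating dry lowland ones, so the index reflects how anomalous each season was relative to each station's own climate.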
Robust and efficient anomaly detection using heterogeneous representations
NASA Astrophysics Data System (ADS)
Hu, Xing; Hu, Shiqiang; Xie, Jinhua; Zheng, Shiyou
2015-05-01
Various approaches have been proposed for video anomaly detection, yet they typically suffer from one or more limitations: they often characterize a pattern using its internal information but ignore its external relationships, which are important for local anomaly detection. Moreover, the high dimensionality and lack of robustness of the pattern representation may lead to problems including overfitting, increased computational cost and memory requirements, and a high false alarm rate. We propose a video anomaly detection framework that relies on a heterogeneous representation to account for both a pattern's internal information and its external relationships. The internal information is characterized by slow features learned by slow feature analysis from low-level representations, and the external relationships are characterized by spatial contextual distances. The heterogeneous representation is compact, robust, efficient, and discriminative for anomaly detection, and both kinds of information can be taken into account in the proposed framework. Extensive experiments demonstrate the robustness and efficiency of our approach by comparison with state-of-the-art approaches on widely used benchmark datasets.
Verification of Minimum Detectable Activity for Radiological Threat Source Search
NASA Astrophysics Data System (ADS)
Gardiner, Hannah; Myjak, Mitchell; Baciak, James; Detwiler, Rebecca; Seifert, Carolyn
2015-10-01
The Department of Homeland Security's Domestic Nuclear Detection Office is working to develop advanced technologies that will improve the ability to detect, localize, and identify radiological and nuclear sources from airborne platforms. The Airborne Radiological Enhanced-sensor System (ARES) program is developing advanced data fusion algorithms for analyzing data from a helicopter-mounted radiation detector. This detector platform provides a rapid, wide-area assessment of radiological conditions at ground level. The NSCRAD (Nuisance-rejection Spectral Comparison Ratios for Anomaly Detection) algorithm was developed to distinguish low-count sources of interest from benign naturally occurring radiation and irrelevant nuisance sources. It uses a number of broad, overlapping regions of interest to statistically compare each newly measured spectrum with the current estimate for the background to identify anomalies. We recently developed a method to estimate the minimum detectable activity (MDA) of NSCRAD in real time. We present this method here and report on the MDA verification using both laboratory measurements and simulated injects on measured backgrounds at or near the detection limits. This work is supported by the US Department of Homeland Security, Domestic Nuclear Detection Office, under competitively awarded contract/IAA HSHQDC-12-X-00376. This support does not constitute an express or implied endorsement on the part of the Gov't.
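The idea of comparing counts in broad, overlapping regions of interest against a background estimate can be illustrated with the much-simplified stand-in below. The channel windows, the Poisson-like variance floor, and the test spectra are all illustrative assumptions; this is not the fielded NSCRAD algorithm.

```python
import math

# overlapping channel windows (illustrative, not NSCRAD's actual regions)
REGIONS = [(0, 40), (20, 60), (40, 80), (60, 100)]

def roi_counts(spectrum):
    return [sum(spectrum[lo:hi]) for lo, hi in REGIONS]

def comparison_statistic(measured, background):
    """Compare each region's share of total counts in the measured spectrum
    against the background's share; a large summed deviation flags a spectral
    anomaly. A simplified stand-in for the NSCRAD statistic."""
    m, b = roi_counts(measured), roi_counts(background)
    mt, bt = sum(m), sum(b)
    stat = 0.0
    for mi, bi in zip(m, b):
        expected = mt * bi / bt        # counts this ROI would hold if the
        var = max(expected, 1.0)       # spectral shape matched background
        stat += (mi - expected) ** 2 / var
    return stat

background = [10.0] * 100                                             # flat background
benign = [10.0 + 0.5 * math.sin(i / 7.0) for i in range(100)]         # mild drift
source = [10.0 + (8.0 if 45 <= i < 55 else 0.0) for i in range(100)]  # peak-like bump
```

Because the statistic compares ROI *shares* rather than raw counts, an overall gain or background-rate change leaves it near zero, while a localized photopeak-like excess inflates it; a detection threshold on this statistic would set the minimum detectable activity.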
Chemical Compositions and Anomalies in Stellar Coronae
NASA Technical Reports Server (NTRS)
Drake, Jeremy; Oliversen, Ronald J. (Technical Monitor)
2005-01-01
In summary, as the papers cited here and in earlier reports demonstrate, this award has enabled us to obtain a fairly good picture of the abundance anomalies in stellar coronae. The "inverse FIP" effect in very active stars has now been fleshed out as a more complex anomaly depending on FIP, whereas before it appeared only in terms of a general metal paucity. The recent solar abundance assessment of Asplund et al. will, if correct, challenge some of the older interpretations of coronal abundance anomalies, since the new values imply quite different relative abundances of CNO compared with Fe, Mg and Si. Further investigations have been made into the possibility of modeling some of the recent coronal abundance anomaly results in terms of Alfvén wave-driven separation of neutrals and ions in the upper chromosphere. This work remains in the seed stage, and future funding from a different program will be requested to pursue it further.
Rocket Engine Oscillation Diagnostics
NASA Technical Reports Server (NTRS)
Nesman, Tom; Turner, James E. (Technical Monitor)
2002-01-01
Rocket engine oscillation data can reveal many physical phenomena, ranging from unsteady flow and acoustics to rotordynamics and structural dynamics. Because of this, engine diagnostics based on oscillation data should employ both signal analysis and physical modeling. This paper describes an approach to rocket engine oscillation diagnostics, the types of problems encountered, and example problems solved. Determination of design guidelines and environments (or loads) from oscillating phenomena is required during the initial stages of rocket engine design, while the additional tasks of health monitoring, incipient failure detection, and anomaly diagnostics occur during engine development and operation. Oscillations in rocket engines are typically related to flow-driven acoustics, flow-excited structures, or rotational forces; additional sources of oscillatory energy are combustion and cavitation. Included in the example problems is a sampling of signal analysis tools employed in diagnostics. The rocket engine hardware includes combustion devices, valves, turbopumps, and ducts. Simple models of an oscillating fluid system or structure can be constructed to estimate the pertinent dynamic parameters governing the unsteady behavior of engine systems or components. The example problems show that simple physical modeling, when combined with signal analysis, can be successfully employed to diagnose complex rocket engine oscillatory phenomena.
Ionospheric anomalies detected by ionosonde and possibly related to crustal earthquakes in Greece
NASA Astrophysics Data System (ADS)
Perrone, Loredana; De Santis, Angelo; Abbattista, Cristoforo; Alfonsi, Lucilla; Amoruso, Leonardo; Carbone, Marianna; Cesaroni, Claudio; Cianchini, Gianfranco; De Franceschi, Giorgiana; De Santis, Anna; Di Giovambattista, Rita; Marchetti, Dedalo; Pavòn-Carrasco, Francisco J.; Piscini, Alessandro; Spogli, Luca; Santoro, Francesca
2018-03-01
Ionosonde data and crustal earthquakes with magnitude M ≥ 6.0 observed in Greece during the 2003-2015 period were examined to check whether the relationships obtained earlier between precursory ionospheric anomalies and earthquakes in Japan and central Italy are also valid for Greek earthquakes. The ionospheric anomalies are identified in the observed variations of the sporadic E-layer parameters (h'Es, foEs) and foF2 at the ionospheric station of Athens. The corresponding empirical relationships between the seismo-ionospheric disturbances and the earthquake magnitude and epicentral distance are obtained and found to be similar to those previously published for other case studies. The large lead times found for the occurrence of the ionospheric anomalies may confirm a rather long earthquake preparation period. The possibility of using the relationships obtained for earthquake prediction is finally discussed.
A Data-Driven Assessment of the Sensitivity of Global Ecosystems to Climate Anomalies
NASA Astrophysics Data System (ADS)
Miralles, D. G.; Papagiannopoulou, C.; Demuzere, M.; Decubber, S.; Waegeman, W.; Verhoest, N.; Dorigo, W.
2017-12-01
Vegetation is a central player in the climate system, constraining atmospheric conditions through a series of feedbacks. This fundamental role highlights the importance of understanding regional drivers of ecological sensitivity and the response of vegetation to climatic changes. While nutrient availability and short-term disturbances can be crucial for vegetation at various spatiotemporal scales, natural vegetation dynamics are overall driven by climate. At monthly scales, the interactions between vegetation and climate become complex: some vegetation types react preferentially to specific climatic changes, with different levels of intensity, resilience and lagged response. For our current Earth System Models (ESMs), being able to capture this complexity is crucial but extremely challenging. This adds uncertainty to our projections of future climate and the fate of global ecosystems. Here, following a Granger causality framework based on a non-linear random forest predictive model, we exploit the current wealth of satellite data records to uncover the main climatic drivers of monthly vegetation variability globally. Results based on three decades of satellite data indicate that water availability is the dominant factor driving vegetation in over 60% of the vegetated land. This overall dependency of ecosystems on water availability is larger than previously reported, partly owing to the ability of our machine-learning framework to disentangle the co-linearities between climatic drivers and to quantify non-linear impacts of climate on vegetation. Our observation-based results are then used to benchmark ESMs on their representation of vegetation sensitivity to climate and climatic extremes. Our findings indicate that the sensitivity of vegetation to climatic anomalies is ill-reproduced by some widely used ESMs.
NASA Astrophysics Data System (ADS)
Chen, Jingbo; Wang, Chengyi; Yue, Anzhi; Chen, Jiansheng; He, Dongxu; Zhang, Xiuyan
2017-10-01
The tremendous success of deep learning models such as convolutional neural networks (CNNs) in computer vision provides a method for similar problems in the field of remote sensing. Although research on repurposing pretrained CNN to remote sensing tasks is emerging, the scarcity of labeled samples and the complexity of remote sensing imagery still pose challenges. We developed a knowledge-guided golf course detection approach using a CNN fine-tuned on temporally augmented data. The proposed approach is a combination of knowledge-driven region proposal, data-driven detection based on CNN, and knowledge-driven postprocessing. To confront data complexity, knowledge-derived cooccurrence, composition, and area-based rules are applied sequentially to propose candidate golf regions. To confront sample scarcity, we employed data augmentation in the temporal domain, which extracts samples from multitemporal images. The augmented samples were then used to fine-tune a pretrained CNN for golf detection. Finally, commission error was further suppressed by postprocessing. Experiments conducted on GF-1 imagery prove the effectiveness of the proposed approach.
NASA Astrophysics Data System (ADS)
El Houda Thabet, Rihab; Combastel, Christophe; Raïssi, Tarek; Zolghadri, Ali
2015-09-01
The paper develops a set membership detection methodology which is applied to the detection of abnormal positions of aircraft control surfaces. Robust and early detection of such abnormal positions is an important issue for early system reconfiguration and overall optimisation of aircraft design. In order to improve fault sensitivity while ensuring a high level of robustness, the method combines a data-driven characterisation of noise and a model-driven approach based on interval prediction. The efficiency of the proposed methodology is illustrated through simulation results obtained based on data recorded in several flight scenarios of a highly representative aircraft benchmark.
Robust and Accurate Anomaly Detection in ECG Artifacts Using Time Series Motif Discovery
Sivaraks, Haemwaan
2015-01-01
Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats which helps identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because they are unable to differentiate ECG artifacts from real ECG signals, especially artifacts that are similar to ECG signals in terms of shape and/or frequency. The problem leads to high vigilance for physicians and misinterpretation risk for nonspecialists. Therefore, this work proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts and can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design. In addition, every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experiment results on real ECG datasets are interpreted and evaluated by cardiologists. Our proposed algorithm can mostly achieve 100% accuracy of detection (AoD), sensitivity, specificity, and positive predictive value with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts, compared with competitive anomaly detection methods. PMID:25688284
Data Mining for Anomaly Detection
NASA Technical Reports Server (NTRS)
Biswas, Gautam; Mack, Daniel; Mylaraswamy, Dinkar; Bharadwaj, Raj
2013-01-01
The Vehicle Integrated Prognostics Reasoner (VIPR) program describes methods for enhanced diagnostics as well as a prognostic extension to the current state-of-the-art Aircraft Diagnostic and Maintenance System (ADMS). VIPR introduced a new anomaly detection function for discovering previously undetected and undocumented situations, where there are clear deviations from nominal behavior. Once a baseline (nominal model of operations) is established, the detection and analysis is split between on-aircraft outlier generation and off-aircraft expert analysis to characterize and classify events that may not have been anticipated by individual system providers. Offline expert analysis is supported by data curation and data mining algorithms that can be applied in the contexts of both supervised and unsupervised learning. In this report, we discuss efficient methods to implement the Kolmogorov complexity measure using compression algorithms, and run a systematic empirical analysis to determine the best compression measure. Our experiments established that the combination of the DZIP compression algorithm and the CiDM distance measure provides the best results for capturing relevant properties of time series data encountered in aircraft operations. This combination was used as the basis for developing an unsupervised learning algorithm to define "nominal" flight segments using historical flight segments.
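The compression-based approximation of Kolmogorov complexity described above can be illustrated with a normalized compression distance (NCD); this is a minimal sketch using zlib as a stand-in for the DZIP/CiDM combination the report actually evaluated.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: approximates the (uncomputable)
    Kolmogorov complexity K(.) with the compressed length C(.)."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    # Near 0 when one string "explains" the other; near 1 when unrelated.
    return (cxy - min(cx, cy)) / max(cx, cy)
```

In the spirit of the report, flight segments whose NCD to a library of historical "nominal" segments is consistently large would be routed to the off-aircraft expert analysis.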
Teaching Network Security with IP Darkspace Data
ERIC Educational Resources Information Center
Zseby, Tanja; Iglesias Vázquez, Félix; King, Alistair; Claffy, K. C.
2016-01-01
This paper presents a network security laboratory project for teaching network traffic anomaly detection methods to electrical engineering students. The project design follows a research-oriented teaching principle, enabling students to make their own discoveries in real network traffic, using data captured from a large IP darkspace monitor…
ERIC Educational Resources Information Center
Lu, Hsin-Min
2010-01-01
Deep penetration of personal computers, data communication networks, and the Internet has created a massive platform for data collection, dissemination, storage, and retrieval. Large amounts of textual data are now available at a very low cost. Valuable information, such as consumer preferences, new product developments, trends, and opportunities,…
An Investigation of Techniques for Detecting Data Anomalies in Earned Value Management Data
2011-12-01
[Extraction residue of a tool listing; recoverable entries: Harte Hanks Trillium Software (Trillium Software System); IBM InfoSphere Foundation Tools; Informatica (Data Explorer, Analyst, Developer, Administrator; http://www.informatica.com/products_services/Pages/index.aspx); Pitney Bowes Business Insight (Spectrum); SAP BusinessObjects Data Quality Management; DataFlux. The tools support documenting quality monitoring efforts and tracking data quality improvements.]
Flight-Tested Prototype of BEAM Software
NASA Technical Reports Server (NTRS)
Mackey, Ryan; Tikidjian, Raffi; James, Mark; Wang, David
2006-01-01
Researchers at JPL have completed a software prototype of BEAM (Beacon-based Exception Analysis for Multi-missions) and successfully tested its operation in flight onboard a NASA research aircraft. BEAM (see NASA Tech Briefs, Vol. 26, No. 9; and Vol. 27, No. 3) is an ISHM (Integrated Systems Health Management) technology that automatically analyzes sensor data and classifies system behavior as either nominal or anomalous, and further characterizes anomalies according to strength, duration, and affected signals. BEAM (see figure) can be used to monitor a wide variety of physical systems and sensor types in real time. In this series of tests, BEAM monitored the engines of a Dryden Flight Research Center F-18 aircraft, and performed onboard, unattended analysis of 26 engine sensors from engine startup to shutdown. The BEAM algorithm can detect anomalies based solely on the sensor data, which includes but is not limited to sensor failure, performance degradation, incorrect operation such as unplanned engine shutdown or flameout in this example, and major system faults. BEAM was tested on an F-18 simulator, static engine tests, and 25 individual flights totaling approximately 60 hours of flight time. During these tests, BEAM successfully identified planned anomalies (in-flight shutdowns of one engine) as well as minor unplanned anomalies (e.g., transient oil- and fuel-pressure drops), with no false alarms or suspected false-negative results for the period tested. BEAM also detected previously unknown behavior in the F- 18 compressor section during several flights. This result, confirmed by direct analysis of the raw data, serves as a significant test of BEAM's capability.
Garne, E; Vinkel Hansen, A; Morris, J; Jordan, S; Klungsøyr, K; Engeland, A; Tucker, D; Thayer, D S; Davies, G I; Nybo Andersen, A-M; Dolk, H
2016-09-01
To examine the effect of maternal exposure to asthma medications on the risk of congenital anomalies. Meta-analysis of aggregated data from three cohort studies. Linkage between healthcare databases and EUROCAT congenital anomaly registries. 519 242 pregnancies in Norway (2004-2010), Wales (2000-2010) and Funen, Denmark (2000-2010). Exposure defined as having at least one prescription for asthma medications issued (Wales) or dispensed (Norway, Denmark) from 91 days before to 91 days after the pregnancy start date. Odds ratios (ORs) were estimated separately for each register and combined in meta-analyses. ORs for all congenital anomalies and specific congenital anomalies. Overall exposure prevalence was 3.76%. For exposure to asthma medication in general, the adjusted OR (adjOR) for a major congenital anomaly was 1.21 (99% CI 1.09-1.34) after adjustment for maternal age and socioeconomic position. The OR of anal atresia was significantly increased in pregnancies exposed to inhaled corticosteroids (3.40; 99% CI 1.15-10.04). For severe congenital heart defects, an increased OR (1.97; 1.12-3.49) was associated with exposure to combination treatment with inhaled corticosteroids and long-acting beta-2-agonists. Associations with renal dysplasia were driven by exposure to short-acting beta-2-agonists (2.37; 1.20-4.67). The increased risk of congenital anomalies for women taking asthma medication is small, with little confounding by maternal age or socioeconomic status. The study confirmed the association of inhaled corticosteroids with anal atresia found in earlier research and found potential new associations with combination treatment. The potential new associations should be interpreted with caution due to the large number of comparisons undertaken. © 2016 The Authors.
BJOG An International Journal of Obstetrics and Gynaecology published by John Wiley & Sons Ltd on behalf of Royal College of Obstetricians and Gynaecologists.
Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.
2012-01-01
Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ∼1 mm2 areas and ≥2% in ∼20 mm2 areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified. PMID:22894421
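A minimal sketch of the pixel-intensity-deviation (PID) strategy with median filtering for background-noise suppression might look as follows; the array shapes and the 2% tolerance are illustrative defaults, not the paper's exact implementation:

```python
import numpy as np

def pid_flags(ref, test, tol_pct=2.0):
    """Flag pixels whose intensity deviates from the reference EPID image
    by more than tol_pct percent, after a 3x3 median filter suppresses
    isolated noise."""
    pid = 100.0 * (test - ref) / np.maximum(ref, 1e-9)
    p = np.pad(pid, 1, mode="edge")
    h, w = pid.shape
    # Stack the nine shifted copies that form each pixel's 3x3 neighborhood.
    neighborhoods = np.stack([p[i:i + h, j:j + w]
                              for i in range(3) for j in range(3)])
    return np.abs(np.median(neighborhoods, axis=0)) > tol_pct
```

As in the paper, reliable registration of the repeat image to the reference is assumed before the PID map is computed; residual gradient effects would additionally need the gradient scaling described above.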
An impact-driven dynamo for the early Moon.
Le Bars, M; Wieczorek, M A; Karatekin, O; Cébron, D; Laneuville, M
2011-11-09
The origin of lunar magnetic anomalies remains unresolved after their discovery more than four decades ago. A commonly invoked hypothesis is that the Moon might once have possessed a thermally driven core dynamo, but this theory is problematical given the small size of the core and the required surface magnetic field strengths. An alternative hypothesis is that impact events might have amplified ambient fields near the antipodes of the largest basins, but many magnetic anomalies exist that are not associated with basin antipodes. Here we propose a new model for magnetic field generation, in which dynamo action comes from impact-induced changes in the Moon's rotation rate. Basin-forming impact events are energetic enough to have unlocked the Moon from synchronous rotation, and we demonstrate that the subsequent large-scale fluid flows in the core, excited by the tidal distortion of the core-mantle boundary, could have powered a lunar dynamo. Predicted surface magnetic field strengths are on the order of several microteslas, consistent with palaeomagnetic measurements, and the duration of these fields is sufficient to explain the central magnetic anomalies associated with several large impact basins.
Temilola, Dada Oluwaseyi; Folayan, Morenike Oluwatoyin; Fatusi, Olawunmi; Chukwumah, Nneka Maureen; Onyejaka, Nneka; Oziegbe, Elizabeth; Oyedele, Titus; Kolawole, Kikelomo Adebanke; Agbaje, Hakeem
2014-10-16
The study of dental anomalies is important because it generates information that is important for both the anthropological and clinical management of patients. The objective of this study is to determine the prevalence and pattern of presentation of dental hard-tissue developmental anomalies in the mixed dentition of children residing in Ile-Ife, a suburban region of Nigeria. Information on age, sex and socioeconomic status was collected from 1,036 children aged four months to 12 years through a household survey. Clinical examination was conducted to assess the presence of dental anomalies. Associations between age, sex, socioeconomic status, prevalence, and pattern of presentation of the developmental hard-tissue dental anomalies were determined. Two hundred and seventy-six (26.6%) children had dental anomalies. Of these, 23.8% had one anomaly, 2.5% had two anomalies, and 0.3% had more than two anomalies. Of the children with anomalies, 49.3% were male, 50.7% were female, and 47.8%, 28.6% and 23.6% were children from low, middle and high socioeconomic classes, respectively. More anomalies were seen in permanent than primary dentition. Anomalies of tooth structure were most prevalent (16.1%); anomalies which affect tooth number were least prevalent (1.3%). Dens evaginatus, peg-shaped lateral, macrodontia, and talon cusp were more prevalent in the permanent dentition, and dens evaginatus, peg-shaped lateral and macrodontia were more prevalent in the maxilla. There were significantly more macrodontia anomalies in males and in children of high socioeconomic status. This large survey of dental hard-tissue anomalies found in the primary dentition and mixed dentition of children in Nigeria provides anthropological and clinical data that may aid the detection and management of dental problems of children in Nigeria.
Aeromagnetic anomalies over faulted strata
Grauch, V.J.S.; Hudson, Mark R.
2011-01-01
High-resolution aeromagnetic surveys are now an industry standard and they commonly detect anomalies that are attributed to faults within sedimentary basins. However, detailed studies identifying geologic sources of magnetic anomalies in sedimentary environments are rare in the literature. Opportunities to study these sources have come from well-exposed sedimentary basins of the Rio Grande rift in New Mexico and Colorado. High-resolution aeromagnetic data from these areas reveal numerous, curvilinear, low-amplitude (2–15 nT at 100-m terrain clearance) anomalies that consistently correspond to intrasedimentary normal faults (Figure 1). Detailed geophysical and rock-property studies provide evidence for the magnetic sources at several exposures of these faults in the central Rio Grande rift (summarized in Grauch and Hudson, 2007, and Hudson et al., 2008). A key result is that the aeromagnetic anomalies arise from the juxtaposition of magnetically differing strata at the faults as opposed to chemical processes acting at the fault zone. The studies also provide (1) guidelines for understanding and estimating the geophysical parameters controlling aeromagnetic anomalies at faulted strata (Grauch and Hudson), and (2) observations on key geologic factors that are favorable for developing similar sedimentary sources of aeromagnetic anomalies elsewhere (Hudson et al.).
Lunar physical properties from analysis of magnetometer data
NASA Technical Reports Server (NTRS)
Daily, W. D.
1979-01-01
The electromagnetic properties of the lunar interior are discussed with emphasis on (1) bulk, crustal, and local anomalous conductivity; (2) bulk magnetic permeability measurements, iron abundance estimates, and core size limits; (3) lunar ionosphere and atmosphere; and (4) crustal magnetic remanence: scale size measurements and constraints on remanence origin. Appendices treat the phase relationship between the energetic particle flux modulation and current disc penetrations in the Jovian magnetosphere (Pioneer 10 inbound); theories for the origin of lunar magnetism; electrical conductivity anomalies associated with circular lunar maria; electromagnetic properties of the Moon; the Mare Serenitatis conductivity anomaly detected by Apollo 16 and Lunokhod 2 magnetometers; and lunar properties from magnetometer data: effects of data errors.
Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity.
Napoletano, Paolo; Piccoli, Flavio; Schettini, Raimondo
2018-01-12
Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM) imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs) and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art.
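The dictionary-comparison step can be sketched as a nearest-neighbour search in feature space; in this illustrative sketch, plain vectors stand in for the CNN-derived features the paper actually computes for each SEM subregion:

```python
import numpy as np

def abnormality_scores(patch_feats, dictionary):
    """Score each test patch by its distance to the closest anomaly-free
    dictionary patch; a higher score means less self-similar, i.e. more
    likely anomalous."""
    # Pairwise distances: (n_patches, n_dictionary)
    d = np.linalg.norm(patch_feats[:, None, :] - dictionary[None, :, :],
                       axis=-1)
    return d.min(axis=1)
```

Thresholding these scores per subregion would yield the localization map; the paper's method differs in using CNN-based visual similarity rather than raw Euclidean distance.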
Amendt, Peter; Landen, O L; Robey, H F; Li, C K; Petrasso, R D
2010-09-10
The observation of large, self-generated electric fields (≥10(9) V/m) in imploding capsules using proton radiography has been reported [C. K. Li, Phys. Rev. Lett. 100, 225001 (2008)]. A model of pressure gradient-driven diffusion in a plasma with self-generated electric fields is developed and applied to reported neutron yield deficits for equimolar D3He [J. R. Rygg, Phys. Plasmas 13, 052702 (2006)] and (DT)3He [H. W. Herrmann, Phys. Plasmas 16, 056312 (2009)] fuel mixtures and Ar-doped deuterium fuels [J. D. Lindl, Phys. Plasmas 11, 339 (2004)]. The observed anomalies are explained as a mild loss of deuterium nuclei near capsule center arising from shock-driven diffusion in the high-field limit.
Modeling And Detecting Anomalies In Scada Systems
NASA Astrophysics Data System (ADS)
Svendsen, Nils; Wolthusen, Stephen
The detection of attacks and intrusions based on anomalies is hampered by the limits of specificity underlying the detection techniques. However, in the case of many critical infrastructure systems, domain-specific knowledge and models can impose constraints that potentially reduce error rates. At the same time, attackers can use their knowledge of system behavior to mask their manipulations, causing adverse effects to be observed only after a significant period of time. This paper describes elementary statistical techniques that can be applied to detect anomalies in critical infrastructure networks. A SCADA system employed in liquefied natural gas (LNG) production is used as a case study.
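As one example of the kind of elementary statistical technique the paper has in mind (not its specific method), an EWMA control chart can flag slow manipulations of a process variable; the smoothing factor, control limit and warm-up length below are arbitrary illustrative choices:

```python
import math
import statistics

def ewma_alarms(series, lam=0.3, k=3.0, warmup=20):
    """Return indices where an exponentially weighted moving average of
    the signal drifts outside k-sigma control limits estimated from the
    first `warmup` samples."""
    mu = statistics.fmean(series[:warmup])
    sigma = statistics.stdev(series[:warmup])
    # Steady-state standard deviation of the EWMA statistic.
    limit = k * sigma * math.sqrt(lam / (2.0 - lam))
    z, alarms = mu, []
    for i, x in enumerate(series):
        z = lam * x + (1.0 - lam) * z
        if i >= warmup and abs(z - mu) > limit:
            alarms.append(i)
    return alarms
```

Because the EWMA accumulates evidence over time, it is better suited than a per-sample threshold to the slow, masked manipulations the paper warns about.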
First and second trimester screening for fetal structural anomalies.
Edwards, Lindsay; Hui, Lisa
2018-04-01
Fetal structural anomalies are found in up to 3% of all pregnancies and ultrasound-based screening has been an integral part of routine prenatal care for decades. The prenatal detection of fetal anomalies allows for optimal perinatal management, providing expectant parents with opportunities for additional imaging, genetic testing, and the provision of information regarding prognosis and management options. Approximately one-half of all major structural anomalies can now be detected in the first trimester, including acrania/anencephaly, abdominal wall defects, holoprosencephaly and cystic hygromata. Due to the ongoing development of some organ systems, however, some anomalies will not be evident until later in the pregnancy. To this end, the second trimester anatomy scan is recommended by professional societies as the standard investigation for the detection of fetal structural anomalies. The reported detection rates of structural anomalies vary according to the organ system being examined, and are also dependent upon factors such as the equipment settings and sonographer experience. Technological advances over the past two decades continue to support the role of ultrasound as the primary imaging modality in pregnancy, and the safety of ultrasound for the developing fetus is well established. With increasing capabilities and experience, detailed examination of the central nervous system and cardiovascular system is possible, with dedicated examinations such as the fetal neurosonogram and the fetal echocardiogram now widely performed in tertiary centers. Magnetic resonance imaging (MRI) is well recognized for its role in the assessment of fetal brain anomalies; other potential indications for fetal MRI include lung volume measurement (in cases of congenital diaphragmatic hernia), and pre-surgical planning prior to fetal spina bifida repair.
When a major structural abnormality is detected prenatally, genetic testing with chromosomal microarray is recommended over routine karyotype due to its higher genomic resolution. Copyright © 2017 Elsevier Ltd. All rights reserved.
Anomaly Detection of Electromyographic Signals.
Ijaz, Ahsan; Choi, Jongeun
2018-04-01
In this paper, we provide a robust framework to detect anomalous electromyographic (EMG) signals and identify contamination types. As a first step for feature selection, an optimally selected Lawton wavelet transform is applied. Robust principal component analysis (rPCA) is then performed on these wavelet coefficients to obtain features in a lower dimension. The rPCA-based features are used for constructing a self-organizing map (SOM). Finally, hierarchical clustering is applied to the SOM, which separates anomalous signals residing in the smaller clusters and breaks them into logical units for contamination identification. The proposed methodology is tested using synthetic and real-world EMG signals. The synthetic EMG signals are generated using a heteroscedastic process mimicking desired experimental setups. A sub-part of these synthetic signals is introduced with anomalies. These results are followed by real EMG signals introduced with synthetic anomalies. Finally, a heterogeneous real-world data set is used with known quality issues under an unsupervised setting. The framework provides recall of 90% (±3.3) and precision of 99% (±0.4).
From Signature-Based Towards Behaviour-Based Anomaly Detection (Extended Abstract)
2010-11-01
data acquisition can serve as sensors. De facto standard for IP flow monitoring is the NetFlow format. Although NetFlow was originally developed by Cisco...packets with some common properties that pass through a network device. These collected flows are exported to an external device, the NetFlow ...Thanks to the network-based approach using NetFlow data, the detection algorithm is host independent and highly scalable. Deep Packet Inspection
Target detection using the background model from the topological anomaly detection algorithm
NASA Astrophysics Data System (ADS)
Dorado Munoz, Leidy P.; Messinger, David W.; Ziemann, Amanda K.
2013-05-01
The Topological Anomaly Detection (TAD) algorithm has been used as an anomaly detector in hyperspectral and multispectral images. TAD is an algorithm based on graph theory that constructs a topological model of the background in a scene, and computes an anomalousness ranking for all of the pixels in the image with respect to the background in order to identify pixels with uncommon or strange spectral signatures. The pixels that are modeled as background are clustered into groups or connected components, which could be representative of spectral signatures of materials present in the background. Therefore, the idea of using the background components given by TAD in target detection is explored in this paper. These connected components are characterized in three different approaches: the mean signature and endmembers for each component are calculated and used as background basis vectors in Orthogonal Subspace Projection (OSP) and the Adaptive Subspace Detector (ASD), and the covariance matrix of those connected components is estimated and used in the detectors Constrained Energy Minimization (CEM) and the Adaptive Coherence Estimator (ACE). The performance of these approaches and the different detectors is compared with a global approach, where the background characterization is derived directly from the image. Experiments and results using the self-test data set provided as part of the RIT blind test target detection project are shown.
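For reference, the ACE detector mentioned above reduces to a squared cosine between the whitened, mean-centred pixel and target spectra. A minimal numpy sketch follows; the background mean and covariance here come from synthetic data, not from TAD components or the RIT imagery:

```python
import numpy as np

def ace_score(x, s, mu, cov_inv):
    """Adaptive Coherence Estimator statistic for pixel spectrum x and
    target signature s, given background mean mu and inverse covariance.
    Returns a value in [0, 1]; 1 means x is parallel to s after whitening."""
    xc, sc = x - mu, s - mu
    num = (sc @ cov_inv @ xc) ** 2
    den = (sc @ cov_inv @ sc) * (xc @ cov_inv @ xc)
    return num / den
```

In the component-wise approach described above, each TAD background component would supply its own mu and cov_inv, rather than the global image statistics.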
Cellular telephone-based radiation sensor and wide-area detection network
Craig, William W [Pittsburg, CA; Labov, Simon E [Berkeley, CA
2006-12-12
A network of radiation detection instruments, each having a small solid state radiation sensor module integrated into a cellular phone for providing radiation detection data and analysis directly to a user. The sensor module includes a solid-state crystal bonded to an ASIC readout providing a low cost, low power, light weight compact instrument to detect and measure radiation energies in the local ambient radiation field. In particular, the photon energy, time of event, and location of the detection instrument at the time of detection is recorded for real time transmission to a central data collection/analysis system. The collected data from the entire network of radiation detection instruments are combined by intelligent correlation/analysis algorithms which map the background radiation and detect, identify and track radiation anomalies in the region.
Cellular telephone-based radiation detection instrument
Craig, William W [Pittsburg, CA; Labov, Simon E [Berkeley, CA
2011-06-14
Cellular telephone-based wide-area radiation detection network
Craig, William W [Pittsburg, CA; Labov, Simon E [Berkeley, CA
2009-06-09
A network of radiation detection instruments, each having a small solid-state radiation sensor module integrated into a cellular phone for providing radiation detection data and analysis directly to a user. The sensor module includes a solid-state crystal bonded to an ASIC readout, providing a low-cost, low-power, lightweight, compact instrument to detect and measure radiation energies in the local ambient radiation field. In particular, the photon energy, time of event, and location of the detection instrument at the time of detection are recorded for real-time transmission to a central data collection/analysis system. The collected data from the entire network of radiation detection instruments are combined by intelligent correlation/analysis algorithms which map the background radiation and detect, identify, and track radiation anomalies in the region.
A sonographic approach to prenatal classification of congenital spine anomalies
Robertson, Meiri; Sia, Sock Bee
2015-01-01
Abstract Objective: To develop a classification system for congenital spine anomalies detected by prenatal ultrasound. Methods: Data were collected from fetuses with spine abnormalities diagnosed in our institution over a five-year period between June 2005 and June 2010. The ultrasound images were analysed to determine which features were associated with different congenital spine anomalies. Findings of the prenatal ultrasound images were correlated with other prenatal imaging, post mortem findings, post mortem imaging, neonatal imaging, karyotype, and other genetic workup. Data from published case reports of prenatal diagnosis of rare congenital spine anomalies were analysed to provide a comprehensive overview. Results: During the study period, eighteen cases of spine abnormalities were diagnosed in 7819 women. The mean gestational age at diagnosis was 18.8 ± 2.2 (SD) weeks. While most cases represented open neural tube defects (NTDs), a spectrum of vertebral abnormalities was diagnosed prenatally. These included hemivertebrae, block vertebrae, cleft or butterfly vertebrae, sacral agenesis, and a lipomeningocele. The most sensitive features for diagnosis of a spine abnormality included flaring of the vertebral arch ossification centres, abnormal spine curvature, and short spine length. While reported findings at the time of diagnosis were often conservative, retrospective analysis revealed good correlation with radiographic imaging. 3D imaging was found to be a valuable tool in many settings. Conclusions: Analysis of the study findings showed that prenatal ultrasound allowed detection of disruption to the normal appearances of the fetal spine. Using the three features of flaring of the vertebral arch ossification centres, abnormal spine curvature, and short spine length, an algorithm was devised to aid the diagnosis of spine anomalies for those who perform and report prenatal ultrasound. PMID:28191204
NASA Astrophysics Data System (ADS)
Rossi, Alessandro; Acito, Nicola; Diani, Marco; Corsini, Giovanni; De Ceglie, Sergio Ugo; Riccobono, Aldo; Chiarantini, Leandro
2014-10-01
Airborne hyperspectral imagery is valuable for military and civilian applications, such as target identification and detection of anomalies and changes within multiple acquisitions. In target detection (TD) applications, the performance assessment of different algorithms is an important and critical issue. In this context, the small number of publicly available hyperspectral data sets motivated us to perform an extensive measurement campaign covering various operating scenarios. The campaign was organized by CISAM in cooperation with the University of Pisa, Selex ES, and CSSN-ITE, and it was conducted in Viareggio, Italy in May 2013. The Selex ES airborne hyperspectral sensor SIM.GA was mounted on board an airplane to collect images over different sites in the morning and afternoon of two subsequent days. This paper describes the hyperspectral data collection of the trial. Four different sites were set up, representing a complex urban scenario, two parking lots, and a rural area. Targets with dimensions comparable to the sensor ground resolution were deployed in the sites to reproduce different operating situations. An extensive ground truth documentation completes the data collection. Experiments to test anomalous change detection techniques were set up by changing the position of the deployed targets. Search-and-rescue scenarios were simulated to evaluate the performance of anomaly detection algorithms. Moreover, the reflectance signatures of the targets were measured on the ground to perform spectral matching in varying atmospheric and illumination conditions. The paper presents some preliminary results that show the effectiveness of hyperspectral data exploitation for the object detection tasks of interest in this work.
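Spectral matching of ground-measured target signatures against image pixels, as mentioned in the abstract, is commonly done with the spectral angle mapper (SAM). The sketch below is only an illustration of that matcher under assumed, hypothetical 4-band reflectance values; it is not the campaign's actual detection chain.

```python
import math

# Hypothetical 4-band reflectance spectra (all values are assumptions).
target_signature = [0.10, 0.30, 0.50, 0.40]   # measured on the ground
pixel_match = [0.12, 0.33, 0.55, 0.44]        # same shape, brighter illumination
pixel_background = [0.40, 0.35, 0.20, 0.10]   # spectrally different surface

def spectral_angle(a, b):
    # Angle between two spectra; insensitive to overall brightness scaling.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(min(1.0, dot / (na * nb)))  # radians; 0 = identical shape

angle_match = spectral_angle(target_signature, pixel_match)
angle_bg = spectral_angle(target_signature, pixel_background)
```

Because SAM compares spectral shape rather than magnitude, the matching pixel scores a much smaller angle than the background pixel even under different illumination.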
A hybrid approach for efficient anomaly detection using metaheuristic methods
Ghanem, Tamer F.; Elkilani, Wail S.; Abdul-kader, Hatem M.
2014-01-01
Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated with the multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative selection-based detector generation. The evaluation of this approach is performed using the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1% compared to competing machine learning algorithms. PMID:26199752
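The negative-selection idea the detector generation draws on can be sketched in a few lines: random candidate detectors are kept only if they avoid all "self" (normal) samples, and a new point is flagged when any detector covers it. Everything below (the 2-D feature space, radii, and sample values) is a toy assumption, not the paper's multi-start/GA procedure.

```python
import random

# "Self" samples: normal traffic points in a 2-D unit feature space (toy data).
self_samples = [(0.2, 0.2), (0.25, 0.3), (0.3, 0.25), (0.8, 0.8)]
SELF_RADIUS = 0.15      # a candidate this close to a self sample is discarded
DETECTOR_RADIUS = 0.15  # a point this close to a detector is flagged

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

# Negative selection: generate random candidates, keep only non-self ones.
rng = random.Random(42)
detectors = []
while len(detectors) < 20:
    candidate = (rng.random(), rng.random())
    if all(dist(candidate, s) > SELF_RADIUS for s in self_samples):
        detectors.append(candidate)

def is_anomalous(point):
    # Any detector covering the point means it lies in non-self space.
    return any(dist(point, d) <= DETECTOR_RADIUS for d in detectors)
```

The metaheuristic part of the paper replaces the blind random generation above with guided search so that fewer detectors cover the non-self space more completely.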
NASA Astrophysics Data System (ADS)
Akhoondzadeh, M.
2014-02-01
A powerful earthquake of Mw = 7.7 struck the Saravan region (28.107° N, 62.053° E) in Iran on 16 April 2013. Devising an automated anomaly detection method for the nonlinear time series of earthquake precursors remains an attractive and challenging task. Artificial Neural Networks (ANN) and Particle Swarm Optimization (PSO) have shown strong potential for accurate time series prediction. This paper presents the first study to integrate the ANN and PSO methods in earthquake precursor research, detecting the unusual variations of the thermal and total electron content (TEC) seismo-ionospheric anomalies induced by the strong Saravan earthquake. In this study, to overcome stagnation in local minima during ANN training, PSO is used as the optimization method instead of traditional training algorithms. The proposed hybrid method detected a considerable number of anomalies 4 and 8 days preceding the earthquake. Since, in this case study, ionospheric TEC anomalies induced by seismic activity can be confounded with background fluctuations due to solar activity, a multi-resolution time series processing technique based on the wavelet transform was applied to the TEC signal variations. Because agreement among the final results of several robust methods is a convincing indication of a method's efficiency, the thermal and TEC anomalies detected using the ANN + PSO method were compared with the anomalies observed by the mean, median, wavelet, Kalman filter, Auto-Regressive Integrated Moving Average (ARIMA), Support Vector Machine (SVM), and Genetic Algorithm (GA) methods. The results indicate that the ANN + PSO method is quite promising and deserves serious attention as a new tool for detecting thermal and TEC seismo-ionospheric anomalies.
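The core trick of replacing gradient training with a swarm search can be shown on a deliberately tiny problem: PSO fits the two weights of a one-neuron next-value predictor to a toy, noise-free series. The series, swarm sizes, and coefficients are all illustrative assumptions, far smaller than the paper's actual ANN + PSO setup.

```python
import random

rng = random.Random(0)
series = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]  # hypothetical precursor values

def mse(weights):
    # Predict x[t+1] from x[t] with a linear neuron: w * x + b.
    w, b = weights
    errs = [(w * series[t] + b - series[t + 1]) ** 2
            for t in range(len(series) - 1)]
    return sum(errs) / len(errs)

N, DIM, STEPS = 30, 2, 200
pos = [[rng.uniform(-1, 1) for _ in range(DIM)] for _ in range(N)]
vel = [[0.0] * DIM for _ in range(N)]
pbest = [p[:] for p in pos]            # each particle's best position so far
gbest = min(pbest, key=mse)            # swarm-wide best position

for _ in range(STEPS):
    for i in range(N):
        for d in range(DIM):
            r1, r2 = rng.random(), rng.random()
            # Inertia + pull toward personal best + pull toward global best.
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if mse(pos[i]) < mse(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=mse)
```

The swarm converges near the exact solution (w = 1, b = 0.1) without any gradient information, which is precisely why PSO can sidestep the local-minimum stagnation of backpropagation-style training.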
NASA Technical Reports Server (NTRS)
Acker, James G.; Uz, Stephanie Schollaert; Shen, Suhung; Leptoukh, Gregory G.
2010-01-01
Application of appropriate spatial averaging techniques is crucial to correct evaluation of ocean color radiometric data, due to the common log-normal or mixed log-normal distribution of these data. The averaging method is particularly crucial for data acquired in coastal regions. The effect of averaging method was markedly demonstrated for a precipitation-driven event on the U.S. Northeast coast in October-November 2005, which resulted in export of high concentrations of riverine colored dissolved organic matter (CDOM) to New York and New Jersey coastal waters over a period of several days. Use of the arithmetic mean averaging method created an inaccurate representation of the magnitude of this event in SeaWiFS global mapped chl a data, causing it to be visualized as a very large chl a anomaly. The apparent chl a anomaly was enhanced by the known incomplete discrimination of CDOM and phytoplankton chlorophyll in SeaWiFS data; other data sources enable an improved characterization. Analysis using the geometric mean averaging method did not indicate this event to be statistically anomalous. Our results demonstrate the need to provide the geometric mean averaging method for ocean color radiometric data in the Goddard Earth Sciences DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni).
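Why the averaging method matters for log-normally distributed data can be seen with a handful of numbers: one high coastal value dominates the arithmetic mean but barely moves the geometric mean. The chl a values below are hypothetical, chosen only to illustrate the effect.

```python
import math

# Hypothetical chl a values (mg m^-3); the last one is a CDOM-inflated outlier.
chl = [0.1, 0.12, 0.15, 0.11, 0.13, 9.0]

arith_mean = sum(chl) / len(chl)
# Geometric mean = exponential of the mean of the logs; appropriate for
# log-normally distributed quantities.
geo_mean = math.exp(sum(math.log(v) for v in chl) / len(chl))
```

Here the arithmetic mean (about 1.6) would paint the whole cell as a large anomaly, while the geometric mean (about 0.25) stays representative of the typical background value.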
NASA Astrophysics Data System (ADS)
Asada, M.
2017-12-01
Mud volcanoes (MVs) are geological features observed all over the world, especially along convergent plate margins. MVs bring fluid and sediment to the surface from depth, and MVs around Japan are expected to transport information from the shallow portions of the seismogenic zone. The Kumano forearc basin (FAB) in the Nankai region is the most studied area in Japan. It is bounded by a shelf on the north and the Kumano Basin edge fault zone (KBEFZ) on the south. The Kumano FAB has 1-2 km of sediment and overlies the accretionary prism. There are at least 14 MVs in the Kumano Basin. Most of them are found over the northern basin floor, and at least one MV is at the KBEFZ. The MV at the KBEFZ is imaged on a 3D seismic data set as a small topographic feature on the seafloor with a disrupted BSR below it. On high-resolution acoustic imagery, it is an 80-100 m high hill with a crater-like depression, characterized by a negative pH anomaly detected just above it. High-backscatter seafloor recognized around the MV suggests that harder seafloor exists in that area. To determine whether large subseafloor diapirs exist below active MVs, we try to detect the gravity contrast between the allochthonous materials and the basin sediment. Gravity data were collected by research vessels over the area in 2012-2017. After corrections for drift and the Eötvös effect, absolute gravity, free-air, and Bouguer gravity anomalies were calculated. The gravity data do not always show anomalies directly on MVs over the northern basin, suggesting that larger diapirs with a gravity contrast of more than a few milligals do not exist below most of the MVs in this basin. Instead, a large negative gravity anomaly is found at the northeastern end of the Kumano Basin. Localized positive anomalies exist along the KBEFZ in the area of the MV. The positive anomaly may suggest that an allochthonous high-density sediment body intrudes along the highly deformed, weak fault zone.
Networked gamma radiation detection system for tactical deployment
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ronald; Smith, Ethan; Guss, Paul; Mitchell, Stephen
2015-08-01
A networked gamma radiation detection system with directional sensitivity and energy spectral data acquisition capability is being developed by the National Security Technologies, LLC, Remote Sensing Laboratory to support the close and intense tactical engagement of law enforcement who carry out counterterrorism missions. In the proposed design, three clusters of 2″ × 4″ × 16″ sodium iodide crystals (4 each) with digiBASE-E (for list mode data collection) would be placed on the passenger side of a minivan. To enhance localization and facilitate rapid identification of isotopes, advanced smart real-time localization and radioisotope identification algorithms like WAVRAD (wavelet-assisted variance reduction for anomaly detection) and NSCRAD (nuisance-rejection spectral comparison ratio anomaly detection) will be incorporated. We will test a collection of algorithms and analysis that centers on the problem of radiation detection with a distributed sensor network. We will study the basic characteristics of a radiation sensor network and focus on the trade-offs between false positive alarm rates, true positive alarm rates, and time to detect multiple radiation sources in a large area. Empirical and simulation analyses of critical system parameters, such as number of sensors, sensor placement, and sensor response functions, will be examined. This networked system will provide an integrated radiation detection architecture and framework with (i) a large nationally recognized search database equivalent that would help generate a common operational picture in a major radiological crisis; (ii) a robust reach back connectivity for search data to be evaluated by home teams; and, finally, (iii) a possibility of integrating search data from multi-agency responders.
Analysis of the variability of the North Atlantic eddy-driven jet stream in CMIP5
NASA Astrophysics Data System (ADS)
Iqbal, Waheed; Leung, Wai-Nang; Hannachi, Abdel
2017-09-01
The North Atlantic eddy-driven jet is a dominant feature of extratropical climate and its variability is associated with large-scale changes in the surface climate of the midlatitudes. Variability of this jet is analysed in a set of General Circulation Models (GCMs) from the Coupled Model Inter-comparison Project phase 5 (CMIP5) over the North Atlantic region. The CMIP5 simulations for the 20th-century climate (Historical) are compared with the ERA40 reanalysis data. The jet latitude index, wind speed, and jet persistence are analysed in order to evaluate 11 CMIP5 GCMs and to compare them with those from CMIP3 integrations. The phase of the mean seasonal cycle of jet latitude and wind speed from historical runs of the CMIP5 GCMs is comparable to ERA40, although the CMIP5 GCMs overestimate the wind speed mean seasonal cycle in winter months. A positive (negative) jet latitude anomaly in historical simulations relative to ERA40 is observed in summer (winter). The ensemble means of the jet latitude biases in historical simulations of CMIP3 and CMIP5 with respect to ERA40 are -2.43° and -1.79° respectively, indicating improvement in CMIP5 relative to the CMIP3 GCMs. The comparison of historical and future simulations of CMIP5 under RCP4.5 and RCP8.5 for the period 2076-2099 shows positive anomalies in the jet latitude, implying a poleward-shifted jet. The results from the analysed models offer no specific improvements in simulating the trimodality of the eddy-driven jet.
NASA Astrophysics Data System (ADS)
Fan, L. F.; Lien, K. L.; Hsieh, I. C.; Lin, S.
2017-12-01
Methane seeps in deep-sea environments can lead to the build-up of chemosynthetic communities and a number of geological and biological anomalies compared to the surrounding area. In order to examine the linkage between seep anomalies and the background conditions in the vicinity, and to map those spatial variations in detail, we used a deep-towed camera system (TowCam) to survey the seafloor on the Tainan Ridge, northeastern South China Sea (SCS). The underwater seafloor pictures provide better spatial resolution for demonstrating the impact of methane seeps on the seafloor. Water-column variations in salinity, temperature, and dissolved oxygen were used to delineate fine-scale variations in the study area. In addition, sediment cores were collected for chemical analyses to confirm the existence of local spatial variations. Our results show that large spatial variations exist as a result of differences in methane flux. In fact, methane is the driving force for the observed biogeochemical variations in the water column, on the seafloor, and in the sediment. Of the area we surveyed, approximately 7% of the total TowCam survey data show abnormal water properties. Corresponding to the water-column anomalies, underwater seafloor pictures taken at those places showed identifiable chemosynthetic clams and mussels, together with authigenic carbonate build-ups and bacterial mats. Moreover, sediment cores with chemical anomalies also matched those in the water column and on the seafloor. These anomalies, however, represent only a small portion of the area surveyed and could not be identified with a typical (random) coring method. Understanding the scale and magnitude of methane seeps and the biogeochemical anomalies driven by gas migration therefore requires tedious, multiple types of surveys.
NASA Technical Reports Server (NTRS)
Susskind, Joel
2008-01-01
AIRS/AMSU is the advanced IR/MW atmospheric sounding system launched on EOS Aqua in May 2002. Products derived from AIRS/AMSU by the AIRS Science Team include surface skin temperature and atmospheric temperature profiles; atmospheric humidity profiles; fractional cloud cover and cloud top pressure; and OLR. Products covering the period September 2002 through the present have been derived from AIRS/AMSU using the AIRS Science Team Version 5 retrieval algorithm. In this paper, we will show results covering the time period September 2006 - November 2008. This time period is marked by a substantial warming trend of Northern Hemisphere extratropical land surface skin temperatures, as well as pronounced El Nino - La Nina episodes. Both influence the spatial and temporal anomaly patterns of atmospheric temperature and moisture profiles, as well as of cloud cover and Clear Sky and All Sky OLR. The relationships between temporal and spatial anomalies of these parameters over this time period, as determined from AIRS/AMSU observations, will be shown, with particular emphasis on which parameters contribute significantly to OLR anomalies in the tropics and extra-tropics. Results will also be shown to validate the anomalies and trends of temperature profiles and OLR as determined from analysis of AIRS/AMSU data. Global and regional trends during the 6 1/3 year period are not necessarily indicative of what has happened in the past, or what may happen in the future. Nevertheless, the inter-relationships of spatial and temporal anomalies of atmospheric geophysical parameters with those of surface skin temperature are indicative of climate processes, and can be used to test the performance of climate models when driven by changes in surface temperatures.
NASA Technical Reports Server (NTRS)
Susskind, Joel; Molnar, Gyula
2009-01-01
AIRS/AMSU is the advanced IR/MW atmospheric sounding system launched on EOS Aqua in May 2002. Products derived from AIRS/AMSU by the AIRS Science Team include surface skin temperature and atmospheric temperature profiles; atmospheric humidity profiles; fractional cloud cover and cloud top pressure; and OLR. Products covering the period September 2002 through the present have been derived from AIRS/AMSU using the AIRS Science Team Version 5 retrieval algorithm. In this paper, we will show results covering the time period September 2006 - November 2008. This time period is marked by a substantial warming trend of Northern Hemisphere extratropical land surface skin temperatures, as well as pronounced El Nino - La Nina episodes. Both influence the spatial and temporal anomaly patterns of atmospheric temperature and moisture profiles, as well as of cloud cover and Clear Sky and All Sky OLR. The relationships between temporal and spatial anomalies of these parameters over this time period, as determined from AIRS/AMSU observations, will be shown, with particular emphasis on which parameters contribute significantly to OLR anomalies in the tropics and extra-tropics. Results will also be shown to evaluate the anomalies and trends of temperature profiles and OLR as determined from analysis of AIRS/AMSU data. Global and regional trends during the 6 1/3 year time period are not necessarily indicative of what has happened in the past, or what may happen in the future. Nevertheless, the inter-relationships of spatial and temporal anomalies of atmospheric geophysical parameters with those of surface skin temperature are indicative of climate processes, and can be used to test the performance of climate models when driven by changes in surface temperatures.
Method for localizing and isolating an errant process step
Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Ferrell, Regina K.
2003-01-01
A method for localizing and isolating an errant process step includes the steps of retrieving, from a defect image database, a selection of images, each having image content similar to image content extracted from a query image depicting a defect and each having corresponding defect characterization data. A conditional probability distribution of the defect having occurred in a particular process step is derived from the defect characterization data. The process step that is the most probable source of the defect according to the derived conditional probability distribution is then identified. A method for process-step defect identification includes the steps of characterizing anomalies in a product, the anomalies detected by an imaging system. A query image of a product defect is then acquired. A particular characterized anomaly is then correlated with the query image. An errant process step is then associated with the correlated image.
Koopman Operator Framework for Time Series Modeling and Analysis
NASA Astrophysics Data System (ADS)
Surana, Amit
2018-01-01
We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations, or model forms, based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be identified directly from data using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which are essential for model comparison and clustering. We employ the space of Koopman model forms equipped with these distances, in conjunction with classical machine learning techniques, to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework on human activity classification and on time series forecasting/anomaly detection in a power grid application.
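Estimating Koopman spectral properties directly from data, as the framework requires, is commonly done with dynamic mode decomposition (DMD). The sketch below recovers the dynamics and eigenvalues of a known linear system purely from snapshot pairs; the paper's model forms and distances are far richer than this minimal example.

```python
import numpy as np

# Generate snapshots of a known linear system x_{k+1} = A_true @ x_k
# (a decaying rotation). Only the snapshots are used for identification.
A_true = np.array([[0.9, -0.2], [0.2, 0.9]])
x = np.array([1.0, 0.0])
snapshots = [x]
for _ in range(20):
    x = A_true @ x
    snapshots.append(x)

X = np.column_stack(snapshots[:-1])  # states at time k
Y = np.column_stack(snapshots[1:])   # states at time k+1

# DMD: best linear map Y ≈ A_dmd @ X, built without knowledge of A_true.
A_dmd = Y @ np.linalg.pinv(X)
eigvals = np.linalg.eigvals(A_dmd)   # data-driven Koopman eigenvalue estimates
```

For forecasting one simply iterates `A_dmd`; for anomaly detection, the residual between a new observation and the model's one-step prediction serves as the anomaly score.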
Anomaly Detection in Moving-Camera Video Sequences Using Principal Subspace Analysis
Thomaz, Lucas A.; Jardim, Eric; da Silva, Allan F.; ...
2017-10-16
This study presents a family of algorithms based on sparse decompositions that detect anomalies in video sequences obtained from slow-moving cameras. These algorithms start by computing the union of subspaces that best represents all the frames from a reference (anomaly-free) video as a low-rank projection plus a sparse residue. Then, they perform a low-rank representation of a target (possibly anomalous) video by taking advantage of both the union of subspaces and the sparse residue computed from the reference video. Such algorithms provide good detection results while at the same time obviating the need for previous video synchronization. However, this is obtained at the cost of a large computational complexity, which hinders their applicability. Another contribution of this paper approaches this problem by using intrinsic properties of the obtained data representation to restrict the search space to the most relevant subspaces, providing computational complexity gains of up to two orders of magnitude. The developed algorithms are shown to cope well with videos acquired in challenging scenarios, as verified by the analysis of 59 videos from the VDAO database, which comprises videos with abandoned objects in a cluttered industrial scenario.
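The low-rank-plus-sparse split at the heart of these algorithms can be illustrated with a bare-bones stand-in: a truncated SVD models the static background of a stack of vectorized frames, and whatever it cannot explain (the sparse residue) localizes the anomalous frame. The frame data and the rank-1 background assumption are toy simplifications of the paper's sparse-decomposition machinery.

```python
import numpy as np

rng = np.random.default_rng(0)
background = rng.uniform(0.0, 1.0, size=16)   # one vectorized 4x4 frame
frames = np.column_stack([background] * 10)   # 10 frames of static background
frames[5, 7] += 5.0                           # "abandoned object": pixel 5, frame 7

# Rank-1 truncated SVD = low-rank background model of the frame stack.
U, s, Vt = np.linalg.svd(frames, full_matrices=False)
low_rank = s[0] * np.outer(U[:, 0], Vt[0])

residue = frames - low_rank                   # sparse part: what the model misses
anomalous_frame = int(np.argmax(np.abs(residue).sum(axis=0)))
```

The residue energy concentrates in the frame containing the object, which is exactly the signal the detection stage thresholds; the real algorithms replace the SVD with a union-of-subspaces model robust to camera motion.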
Kessler, Richard; Strain, R.E.; Marlowe, J. I.; Currin, K.B.
1996-01-01
A ground-penetrating radar survey was conducted at the Monroe Crossroads Battlefield site at Fort Bragg, North Carolina, to determine possible locations of subsurface archaeological features. An electromagnetic survey also was conducted at the site to verify and augment the ground-penetrating radar data. The surveys were conducted over a 67,200-square-foot grid with a grid point spacing of 20 feet. During the ground-penetrating radar survey, 87 subsurface anomalies were detected based on visual inspection of the field records. These anomalies were flagged in the field as they appeared on the ground-penetrating radar records and were located by a land survey. The electromagnetic survey produced two significant readings at ground-penetrating radar anomaly locations. The National Park Service excavated 44 of the 87 anomaly locations at the Civil War battlefield site. Four of these excavations produced significant archaeological features, including one at an abandoned well.
NASA Astrophysics Data System (ADS)
Gutiérrez, Francisco J.; Lemus, Martín; Parada, Miguel A.; Benavente, Oscar M.; Aguilera, Felipe A.
2012-09-01
Detection of thermal anomalies in volcanic-geothermal areas using remote sensing methodologies requires the subtraction of temperatures not produced by geothermal manifestations (e.g. hot springs, fumaroles, active craters) from the satellite image kinetic temperature, which is assumed to correspond to the ground surface temperature. Temperatures that have been subtracted in current models include those derived from the atmospheric transmittance, reflectance of the Earth's surface (albedo), topography effect, thermal inertia, and geographic position effect. We propose a model that includes a new parameter (K) accounting for the variation of temperature with ground surface altitude difference in areas of steep relief. The proposed model was developed and applied, using ASTER satellite images, in two Andean volcanic/geothermal complexes (Descabezado Grande-Cerro Azul Volcanic Complex and Planchón-Peteroa-Azufre Volcanic Complex), where field data of atmosphere and ground surface temperature as well as radiation for albedo calibration were obtained at 10 selected sites. The study area was divided into three zones (Northern, Central, and Southern) where the thermal anomalies were obtained independently. The K values calculated for night images of the three zones are better constrained and turned out to be very similar to the Environmental Lapse Rate (ELR) determined for a stable atmosphere (ELR > 7 °C/km). Using the proposed model, numerous thermal anomalies in areas of ≥ 90 m × 90 m were identified and successfully cross-checked in the field. Night images provide more reliable information for thermal anomaly detection than day images because they record higher temperature contrast between geothermal areas and their surroundings and correspond to more stable atmospheric conditions at the time of image acquisition.
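The role of the altitude parameter K can be sketched numerically: every pixel temperature is reduced to a common reference altitude with a lapse-rate term before thresholding, so cold high-altitude pixels are not mistaken for cold ground and warm low-altitude pixels are not flagged as geothermal. The pixel values, reference altitude, and 3 °C threshold below are hypothetical, not the paper's calibration.

```python
K = 7.5  # degC/km; close to the stable-atmosphere ELR the authors report

# Hypothetical night-scene pixels: (kinetic temperature degC, altitude km).
pixels = [(15.0, 1.0), (8.0, 2.0), (21.0, 1.0), (1.5, 3.0)]
ref_altitude = 1.0  # km, reference level chosen for the zone

# Reduce every pixel to the reference altitude, then flag warm outliers.
corrected = [t + K * (alt - ref_altitude) for t, alt in pixels]
mean_t = sum(corrected) / len(corrected)
anomalies = [i for i, t in enumerate(corrected) if t - mean_t > 3.0]
```

Without the correction the 8 °C pixel at 2 km would depress the zone mean and the 1.5 °C pixel at 3 km would look anomalously cold; after correction only the genuinely warm pixel stands out.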
On ɛ-mechanism driven pulsations in VV 47
NASA Astrophysics Data System (ADS)
Sowicka, Paulina; Handler, Gerald; Jones, David
2018-06-01
We report new observations of the central star of the planetary nebula VV 47 carried out to verify earlier assertions that the short-period pulsation modes detected in the star are driven by the ɛ mechanism. In our data, VV 47 was not variable up to a limit of 0.52 mmag in the Fourier amplitude spectrum up to the Nyquist frequency of 21.7 mHz. Given this null result we re-analyzed the data set in which oscillations were claimed. After careful data reduction, photometry, extinction correction, and analysis with a conservative criterion of S/N ≥ 4 in the Fourier amplitude spectrum, we found that the star was not variable during the original observations. The oscillations reported earlier were due to an over-optimistic detection criterion. We conclude that VV 47 did not pulsate during any measurements at hand; the observational detection of ɛ-driven pulsations remains arduous.
Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity
Schettini, Raimondo
2018-01-01
Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM) imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs) and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art. PMID:29329268
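The scoring rule behind the method can be shown decoupled from the CNN: a subregion's abnormality is low when its feature vector is similar to something in the anomaly-free dictionary, and high otherwise. The feature vectors below are hypothetical stand-ins for CNN activations; the real method extracts them from SEM image patches.

```python
import math

# Dictionary of feature vectors from anomaly-free training subregions
# (hypothetical 3-D features; real CNN features are much higher-dimensional).
dictionary = [[1.0, 0.0, 0.2], [0.9, 0.1, 0.3], [0.8, 0.0, 0.1]]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def abnormality(region_features):
    # 1 minus the best similarity to any anomaly-free subregion:
    # self-similar regions score near 0, unfamiliar regions score near 1.
    return 1.0 - max(cosine(region_features, d) for d in dictionary)

normal_score = abnormality([0.95, 0.05, 0.25])  # resembles the dictionary
defect_score = abnormality([0.0, 1.0, 0.0])     # unlike anything normal
```

Thresholding this score per subregion yields both detection (any region above threshold) and localization (which regions) in one pass.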
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dumidu Wijayasekara; Ondrej Linda; Milos Manic
Building Energy Management Systems (BEMSs) are essential components of modern buildings that utilize digital control technologies to minimize energy consumption while maintaining high levels of occupant comfort. However, BEMSs can only achieve these energy savings when properly tuned and controlled. Since the indoor environment depends on uncertain criteria such as weather, occupancy, and thermal state, the performance of a BEMS can be sub-optimal at times. Unfortunately, the complexity of the BEMS control mechanism, the large amount of data available and the inter-relations between the data can make identifying these sub-optimal behaviors difficult. This paper proposes a novel Fuzzy Anomaly Detection and Linguistic Description (Fuzzy-ADLD) based method for improving the understandability of BEMS behavior for improved state-awareness. The presented method is composed of two main parts: 1) detection of anomalous BEMS behavior and 2) linguistic representation of BEMS behavior. The first part utilizes a modified nearest-neighbor clustering algorithm and a fuzzy logic rule extraction technique to build a model of normal BEMS behavior. The second part computes the most relevant linguistic description of the identified anomalies. The presented Fuzzy-ADLD method was applied to a real-world BEMS and compared against a traditional alarm-based BEMS. In six different scenarios, the Fuzzy-ADLD method identified anomalous behavior either as fast as or faster (by an hour or more) than the alarm-based BEMS. In addition, the Fuzzy-ADLD method identified cases that were missed by the alarm-based system, demonstrating potential for increased state-awareness of abnormal building behavior.
Kaasen, Anne; Helbig, Anne; Malt, Ulrik Fredrik; Naes, Tormod; Skari, Hans; Haugen, Guttorm Nils
2013-07-12
In Norway almost all pregnant women attend one routine ultrasound examination. Detection of fetal structural anomalies triggers psychological stress responses in the women affected. Despite the frequent use of ultrasound examination in pregnancy, little attention has been devoted to the psychological response of the expectant father following the detection of fetal anomalies. This is important for later fatherhood and the psychological interaction within the couple. We aimed to describe paternal psychological responses shortly after detection of structural fetal anomalies by ultrasonography, and to compare paternal and maternal responses within the same couple. A prospective observational study was performed at a tertiary referral centre for fetal medicine. Pregnant women with a structural fetal anomaly detected by ultrasound and their partners (study group, n=155) and 100 with normal ultrasound findings (comparison group) were included shortly after sonographic examination (inclusion period: May 2006-February 2009). Gestational age was >12 weeks. We used psychometric questionnaires to assess self-reported social dysfunction, health perception, and psychological distress (intrusion, avoidance, arousal, anxiety, and depression): the Impact of Event Scale, the General Health Questionnaire and the Edinburgh Postnatal Depression Scale. Fetal anomalies were classified according to severity and diagnostic or prognostic ambiguity at the time of assessment. Median (range) gestational age at inclusion in the study and comparison group was 19 (12-38) and 19 (13-22) weeks, respectively. Men and women in the study group had significantly higher levels of psychological distress than men and women in the comparison group on all psychometric endpoints. The lowest level of distress in the study group was associated with the least severe anomalies with no diagnostic or prognostic ambiguity (p < 0.033). Men had lower scores than women on all psychometric outcome variables. 
The correlation in distress scores between men and women was high in the fetal anomaly group (p < 0.001), but non-significant in the comparison group. Severity of the anomaly including ambiguity significantly influenced paternal response. Men reported lower scores on all psychometric outcomes than women. This knowledge may facilitate support for both expectant parents to reduce strain within the family after detection of a fetal anomaly.
Apollo-Soyuz pamphlet no. 4: Gravitational field. [experimental design
NASA Technical Reports Server (NTRS)
Page, L. W.; From, T. P.
1977-01-01
Two Apollo Soyuz experiments designed to detect gravity anomalies from spacecraft motion are described. The geodynamics experiment (MA-128) measured large-scale gravity anomalies by detecting small accelerations of Apollo in the 222 km orbit, using Doppler tracking from the ATS-6 satellite. Experiment MA-089 measured 300 km anomalies on the earth's surface by detecting minute changes in the separation between Apollo and the docking module. Topics discussed in relation to these experiments include the Doppler effect, gravimeters, and the discovery of mascons on the moon.
Thermal wake/vessel detection technique
Roskovensky, John K [Albuquerque, NM; Nandy, Prabal [Albuquerque, NM; Post, Brian N [Albuquerque, NM
2012-01-10
A computer-automated method for detecting a vessel in water based on an image of a portion of Earth includes generating a thermal anomaly mask. The thermal anomaly mask flags each pixel of the image initially deemed to be a wake pixel based on a comparison of a thermal value of each pixel against other thermal values of other pixels localized about each pixel. Contiguous pixels flagged by the thermal anomaly mask are grouped into pixel clusters. A shape of each of the pixel clusters is analyzed to determine whether each of the pixel clusters represents a possible vessel detection event. The possible vessel detection events are represented visually within the image.
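The mask-then-cluster step described above, grouping contiguous flagged pixels before shape analysis, can be sketched as a standard connected-component pass (a simplified illustration of the idea, not the patented implementation; function names are illustrative):

```python
def group_flagged_pixels(mask):
    # Group contiguous flagged pixels (4-connectivity) into
    # clusters, as in the wake-detection step that precedes
    # shape analysis.  `mask` is a 2-D list of booleans.
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    clusters = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Depth-first flood fill from this unvisited flagged pixel.
                stack, cluster = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    cluster.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                clusters.append(cluster)
    return clusters
```

Each returned cluster would then be passed to a shape test (e.g. elongation) to decide whether it plausibly represents a vessel wake.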
Ionospheric earthquake effects detection based on Total Electron Content (TEC) GPS Correlation
NASA Astrophysics Data System (ADS)
Sunardi, Bambang; Muslim, Buldan; Eka Sakya, Andi; Rohadi, Supriyanto; Sulastri; Murjaya, Jaya
2018-03-01
Advances in science and technology have shown that ground-based GPS receivers are able to detect ionospheric Total Electron Content (TEC) disturbances caused by various natural phenomena such as earthquakes. A study of the Tohoku (Japan) earthquake of March 11, 2011 (magnitude M 9.0) showed TEC fluctuations observed from the GPS observation network spread around the disaster area. This paper discusses ionospheric earthquake-effect detection using TEC GPS data. The case studies taken were the Kebumen earthquake, January 25, 2014, magnitude M 6.2; the Sumba earthquake, February 12, 2016, M 6.2; and the Halmahera earthquake, February 17, 2016, M 6.1. A 31-day TEC-GIM (Global Ionosphere Map) correlation method was used to monitor TEC anomalies in the ionosphere. To rule out geomagnetic disturbances due to solar activity, we also compared the results with the Dst index in the same time window. The results showed an anomalous ratio of the correlation coefficient deviation to its standard deviation upon the occurrences of the Kebumen and Sumba earthquakes, but no similar anomaly was detected for the Halmahera earthquake. Continuous monitoring of TEC GPS data is needed to detect earthquake effects in the ionosphere. This study offers promise for strengthening earthquake-effect early warning systems using TEC GPS data. Development of continuous TEC observation methods based on the GPS network that already exists in Indonesia is needed to support such early warning systems.
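The test described above, flagging days whose TEC-GIM correlation coefficient deviates from the window mean by more than a chosen multiple of the standard deviation, can be sketched as follows (a schematic reconstruction; the threshold k and the window handling are assumptions, not the authors' exact procedure):

```python
def pearson(x, y):
    # Pearson correlation coefficient between two equal-length series,
    # e.g. daily observed TEC vs. the GIM reference at one station.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def correlation_anomaly(r_daily, k=2.0):
    # Flag days whose correlation coefficient deviates from the
    # window (e.g. 31-day) mean by more than k standard deviations.
    n = len(r_daily)
    mean_r = sum(r_daily) / n
    std_r = (sum((r - mean_r) ** 2 for r in r_daily) / n) ** 0.5
    return [abs(r - mean_r) / std_r > k for r in r_daily]
```

Days flagged here would then be cross-checked against the Dst index to exclude solar-driven geomagnetic disturbances, as the abstract describes.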
Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences
NASA Technical Reports Server (NTRS)
Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene
2006-01-01
This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
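The normalized LCS similarity underlying the clustering can be sketched with the textbook dynamic program (the paper's faster hybrid algorithm is not reproduced here; normalizing by the mean sequence length is one common convention and an assumption on our part):

```python
def lcs_length(a, b):
    # Classic O(len(a) * len(b)) dynamic program for the length of
    # the longest common subsequence of two symbol sequences.
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

def normalized_lcs(a, b):
    # Length-normalized LCS similarity in [0, 1]: dividing by the
    # mean of the two lengths makes sequences of different lengths
    # comparable, which is what clustering requires.
    if not a and not b:
        return 1.0
    return 2.0 * lcs_length(a, b) / (len(a) + len(b))
```

Identical sequences score 1.0 and disjoint ones 0.0, so 1 minus this value can serve directly as a distance for the outlier analysis.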
Research on Abnormal Detection Based on Improved Combination of K-means and SVDD
NASA Astrophysics Data System (ADS)
Hao, Xiaohong; Zhang, Xiaofeng
2018-01-01
In order to improve the efficiency of network intrusion detection and reduce the false alarm rate, this paper proposes an anomaly detection algorithm based on improved K-means and SVDD. The algorithm first uses the improved K-means algorithm to cluster the training samples of each class, so that each class is independent and internally compact. Then, from the training samples, the SVDD algorithm is used to construct a minimum enclosing hypersphere for each class. Class membership is determined by computing the distance from a sample to the center of each hypersphere constructed by SVDD: if this distance is less than the hypersphere's radius, the test sample belongs to the corresponding class, otherwise it does not; after comparing against every class, the final classification of the test sample is obtained. In this paper, we use the KDD CUP99 data set to evaluate the proposed anomaly detection algorithm. The results show that the algorithm has a high detection rate and a low false alarm rate, making it an effective network security protection method.
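The hypersphere membership rule can be illustrated with a toy stand-in (a real SVDD finds the minimum enclosing soft sphere by solving a quadratic program over kernel evaluations; here the center and radius are taken crudely from the centroid and the farthest training point, purely to show the decision rule):

```python
import math

def fit_sphere(samples):
    # Toy stand-in for SVDD: use the centroid as the sphere center
    # and the farthest training point as the radius.  (Real SVDD
    # solves a QP for the *minimum* enclosing soft sphere.)
    dim = len(samples[0])
    center = tuple(sum(s[d] for s in samples) / len(samples)
                   for d in range(dim))
    radius = max(math.dist(s, center) for s in samples)
    return center, radius

def belongs(x, center, radius):
    # The decision rule from the paper: a test sample is assigned to
    # a class when its distance to that class's sphere center is
    # within the sphere's radius.
    return math.dist(x, center) <= radius
```

In the full scheme one sphere is fitted per K-means-refined class, and a test sample is checked against each sphere in turn.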
NASA Astrophysics Data System (ADS)
Martin-Del Pozzo, A. L.; Cifuentes-Nava, G.; Cabral-Cano, E.; Sánchez-Rubio, G.; Reyes, M.; Martínez-Bringas, Alicia; Garcia, E.; Arango-Galvan, C.
2002-03-01
An interdisciplinary approach correlating magnetic anomalies with the composition of the ejecta in each eruption, as well as with seismicity, was used to study the effect of magmatic activity on the local magnetic record at Popocatépetl Volcano, located 65 km southeast of México City. Eruptions began in December 1994 and have continued with dome growth and ash emissions since then. The Tlamacas (TLA) geomagnetic total field monitoring station, located 5 km from Popocatépetl's crater, was installed in December 1997 in order to detect magnetic anomalies induced by this activity. Spatial correlation and weighted difference methods were applied to detect temporal geomagnetic anomalies using TLA's record and the Teoloyucan Magnetic Observatory as a reference station. Weighted differences were applied to cancel the effects of non-volcanogenic external field variations. Magnetic anomalies over a 2-year time span were classified into four types by correlating them with geochemical, seismic and visual monitoring of the volcanic activity. Magnetic anomalies are believed to be caused by magma injection and gas pressure build-up, which is sensitive to vent morphology and to clearing of the vent during eruption; some anomalies appear to be thermally related, and changes in the stress field are also very important. Most magnetic anomalies are short-duration signals that revert to the baseline level. Decreasing anomalies (-0.5 to -6.8 nT) precede eruptions by 1-8 days. The presence of a mafic magmatic component was determined by mineral examination and silica and magnesium analyses of the ejecta from the 1997-1999 eruptions. Whole-rock analyses ranged from dacitic (65% SiO2) to andesitic (57% SiO2) with 2-6.6% MgO. The higher-MgO, lower-silica samples contain forsteritic olivine (Fo90). SiO2 and MgO do not increase with time, suggesting ascent of small magma pulses, which is consistent with the magnetic data.
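The weighted-difference idea, subtracting a scaled reference-observatory record so that external field variations common to both sites cancel, can be sketched as follows (how the weight is estimated is an assumption here; a least-squares fit over a quiet period is one plausible choice, not necessarily the authors'):

```python
def estimate_weight(station, reference):
    # Least-squares scale factor relating the reference observatory
    # record to the volcano station record, ideally fitted over a
    # magnetically quiet interval.
    return (sum(s * r for s, r in zip(station, reference))
            / sum(r * r for r in reference))

def weighted_difference(station, reference, weight):
    # Residual at the volcano station (e.g. TLA) after removing the
    # scaled reference (e.g. Teoloyucan): external, non-volcanogenic
    # variations common to both sites largely cancel, leaving
    # candidate volcanomagnetic signals.
    return [s - weight * r for s, r in zip(station, reference)]
```

On real data the residual would be inspected for the short-duration decreases of a few nT that the abstract reports as eruption precursors.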
Prevalence and distribution of dental anomalies in orthodontic patients.
Montasser, Mona A; Taha, Mahasen
2012-01-01
To study the prevalence and distribution of dental anomalies in a sample of orthodontic patients. The dental casts, intraoral photographs, and panoramic and lateral cephalometric radiographs of 509 Egyptian orthodontic patients were studied. Patients were examined for dental anomalies in number, size, shape, position, and structure. The prevalence of each dental anomaly was calculated and compared between sexes. Of the total study sample, 32.6% of the patients had at least one dental anomaly other than agenesis of third molars; 32.1% of females and 33.5% of males had at least one dental anomaly other than agenesis of third molars. The most commonly detected dental anomalies were impaction (12.8%) and ectopic eruption (10.8%). The total prevalence of hypodontia (excluding third molars) and hyperdontia was 2.4% and 2.8%, respectively, with similar distributions in females and males. Gemination and accessory roots were reported in this study; each of these anomalies was detected in 0.2% of patients. In addition to genetic and racial factors, environmental factors could have a more important influence on the prevalence of dental anomalies in every population. Impaction, ectopic eruption, hyperdontia, hypodontia, and microdontia were the most common dental anomalies, while fusion and dentinogenesis imperfecta were absent.
Nonlinear Classification of AVO Attributes Using SVM
NASA Astrophysics Data System (ADS)
Zhao, B.; Zhou, H.
2005-05-01
A key research topic in reservoir characterization is the detection of the presence of fluids using seismic and well-log data. In particular, partial gas discrimination is very challenging because low and high gas saturations can produce similar anomalies in terms of Amplitude Variation with Offset (AVO), bright spots, and velocity sag. Hence, successful fluid detection requires a good understanding of the seismic signatures of the fluids, high-quality data, and a good detection methodology. Traditional attempts at partial gas discrimination employ neural network algorithms. A newer approach is to use the Support Vector Machine (SVM) (Vapnik, 1995; Liu and Sacchi, 2003). While the potential of the SVM has not been fully explored for reservoir fluid detection, current nonlinear methods classify seismic attributes without the use of rock physics constraints. The objective of this study is to improve the capability of distinguishing a fizz-water reservoir from a commercial gas reservoir by developing a new detection method using AVO attributes and rock physics constraints. This study will first test the SVM classification with synthetic data, and then apply the algorithm to field data from the King-Kong and Lisa-Anne fields in the Gulf of Mexico. While both field areas have high-amplitude seismic anomalies, the King-Kong field produces commercial gas but the Lisa-Anne field does not. We expect that the new SVM-based nonlinear classification of AVO attributes may be able to separate commercial gas from fizz-water in these two fields.
Forward Modelling of Long-wavelength Magnetic Anomaly Contributions from the Upper Mantle
NASA Astrophysics Data System (ADS)
Idoko, C. M.; Conder, J. A.; Ferre, E. C.; Friedman, S. A.
2016-12-01
Towards the interpretation of the upcoming results from the SWARM satellite survey, we develop a MATLAB-based geophysical forward model of magnetic anomalies from tectonic regions with different upper mantle geotherms, including subduction zones (Kamchatka island arc), cratons (Siberian craton), and hotspots (Hawaii hotspot and Massif Central plumes). We constrain the modeling using magnetic data measured on xenoliths collected across these regions. Over the years, the potency of the upper mantle in contributing to long-wavelength magnetic anomalies has been a topic of debate among geoscientists. However, recent work shows that some low-geotherm tectonic environments such as forearcs and cratons contain mantle xenoliths that are below the Curie temperature of magnetite and could potentially contribute to long-wavelength magnetic anomalies. The modeling pursued here holds the prospect of better understanding the magnetism of the upper mantle, and of resolving the mismatch between observed long-wavelength anomalies and the surface field anomaly upward-continued to satellite altitude. The SWARM satellite survey provides a unique opportunity due to its capacity to detect more accurately the depth of magnetic sources. A preliminary model of a hypothetical craton of 2000 km by 1000 km by 500 km, discretized into 32 equal and uniformly distributed prism blocks, uses magnetic data from the Siberian craton with an average natural remanent magnetization of 0.0829 A/m (randomly oriented) for a magnetized mantle thickness of 75 km, and an induced magnetization varying according to the Curie-Weiss law from the surface to 500 km depth with an average magnetization of 0.02 A/m. This model shows that the contributions of the induced and remanent phases of magnetization, with a total-field anomaly amplitude of 3 nT, may impart a measurable signal to the observed long-wavelength magnetic anomalies in low-geotherm tectonic environments.
Automated synthesis, insertion and detection of polyps for CT colonography
NASA Astrophysics Data System (ADS)
Sezille, Nicolas; Sadleir, Robert J. T.; Whelan, Paul F.
2003-03-01
CT Colonography (CTC) is a new non-invasive colon imaging technique which has the potential to replace conventional colonoscopy for colorectal cancer screening. A novel system which facilitates automated detection of colorectal polyps at CTC is introduced. As exhaustive testing of such a system using real patient data is not feasible, more complete testing is achieved through synthesis of artificial polyps and their insertion into real datasets. The polyp insertion is semi-automatic: candidate points are manually selected using a custom GUI, and suitable points are determined automatically from an analysis of the local neighborhood surrounding each of the candidate points. Local density and orientation information are used to generate polyps based on an elliptical model. Anomalies are identified from the modified dataset by analyzing the axial images. Detected anomalies are classified as potential polyps or natural features using 3D morphological techniques. The final results are flagged for review. The system was evaluated using 15 scenarios. The sensitivity of the system was found to be 65%, with 34% false positive detections. Automated diagnosis at CTC is possible, and thorough testing is facilitated by augmenting real patient data with computer-generated polyps. Ultimately, automated diagnosis will enhance standard CTC and increase performance.
Detailed Vibration Analysis of Pinion Gear with Time-Frequency Methods
NASA Technical Reports Server (NTRS)
Mosher, Marianne; Pryor, Anna H.; Lewicki, David G.
2003-01-01
In this paper, the authors show a detailed analysis of the vibration signal from the destructive testing of a spiral bevel gear and pinion pair containing seeded faults. The vibration signal is analyzed in the time domain, frequency domain and with four time-frequency transforms: the Short Time Frequency Transform (STFT), the Wigner-Ville Distribution with the Choi-Williams kernel (WV-CW), the Continuous Wavelet Transform (CWT) and the Discrete Wavelet Transform (DWT). Vibration data of bevel gear tooth fatigue cracks, under a variety of operating load levels and damage conditions, are analyzed using these methods. A new metric for automatic anomaly detection is developed and can be produced from any systematic numerical representation of the vibration signals. On this data set, the new metric reveals indications of gear damage with all of the time-frequency transforms, as well as with the time and frequency representations. Analysis with the CWT detects changes in the signal at low torque levels not found with the other transforms. The WV-CW and CWT use considerably more resources than the STFT and the DWT. More testing of the new metric is needed to determine its value for automatic anomaly detection and to develop fault detection methods based on the metric.
NASA Astrophysics Data System (ADS)
Chowdhury, Debanjan; Skinner, Brian; Lee, Patrick A.
2018-05-01
Electron tunneling into a system with strong interactions is known to exhibit an anomaly, in which the tunneling conductance vanishes continuously at low energy due to many-body interactions. Recent measurements have probed this anomaly in a quantum Hall bilayer of the half-filled Landau level, and shown that the anomaly apparently gets stronger as the half-filled Landau level is increasingly spin polarized. Motivated by this result, we construct a semiclassical hydrodynamic theory of the tunneling anomaly in terms of the charge-spreading action associated with tunneling between two copies of the Halperin-Lee-Read state with partial spin polarization. This theory is complementary to our recent work (D. Chowdhury, B. Skinner, and P. A. Lee, arXiv:1709.06091) where the electron spectral function was computed directly using an instanton-based approach. Our results show that the experimental observation cannot be understood within conventional theories of the tunneling anomaly, in which the spreading of the injected charge is driven by the mean-field Coulomb energy. However, we identify a qualitatively new regime, in which the mean-field Coulomb energy is effectively quenched and the tunneling anomaly is dominated by the finite compressibility of the composite Fermion liquid.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matzke, Brett D.; Wilson, John E.; Hathaway, J.
2008-02-12
Statistically defensible methods are presented for developing geophysical detector sampling plans and analyzing data for munitions response sites where unexploded ordnance (UXO) may exist. Detection methods for identifying areas of elevated anomaly density against background density are shown. Additionally, methods are described which aid in the choice of transect pattern and spacing to assure, with a specified degree of confidence, that a target area (TA) of specific size, shape, and anomaly density will be identified using the detection methods. Methods for evaluating the sensitivity of designs to variation in certain parameters are also discussed. The methods presented have been incorporated into the Visual Sample Plan (VSP) software (free at http://dqo.pnl.gov/vsp) and demonstrated at multiple sites in the United States. Application examples from actual transect designs and surveys from the previous two years are demonstrated.
Anomaly Detection Techniques with Real Test Data from a Spinning Turbine Engine-Like Rotor
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Woike, Mark R.; Oza, Nikunj C.; Matthews, Bryan L.
2012-01-01
Online detection techniques to monitor the health of rotating engine components are becoming increasingly attractive to aircraft engine manufacturers in order to increase safety of operation and lower maintenance costs. Health monitoring remains a challenge to easily implement, especially in the presence of scattered loading conditions, crack size, component geometry, and materials properties. The current trend, however, is to utilize noninvasive types of health monitoring or nondestructive techniques to detect hidden flaws and mini-cracks before any catastrophic event occurs. These techniques go further to evaluate material discontinuities and other anomalies that have grown to the level of critical defects that can lead to failure. Generally, health monitoring is highly dependent on sensor systems capable of performing in various engine environmental conditions and able to transmit a signal upon a predetermined crack length, while acting in a neutral form upon the overall performance of the engine system.
NASA Astrophysics Data System (ADS)
Di Filippo, Michele; Di Nezza, Maria
2016-04-01
Several factors were taken into consideration in order to appropriately tailor the geophysical exploration to the cultural heritage sites. Given that each site had been neglected for a long time and in recent times used as an illegal dumping area, we thoroughly evaluated for this investigation the advantages and limitations of each specific technique, and the general condition and history of the site. We took into account the extension of the areas to be investigated and the need for rapid data acquisition and processing. Furthermore, the survey required instrumentation sensitive to small background contrasts and as little as possible affected by background noise sources. In order to ascertain the existence and location of underground buried walls, a magnetic gradiometer survey (MAG) was planned. The map of the magnetic anomalies is computed not with reduction to the pole (RTP) but with a magnetic horizontal gradient operator (MHGO). The MHGO generates, from a grid of vertical gradients, a grid of steepest slopes (i.e. the magnitude of the gradient) at any point on the surface. The MHGO is reported as a number (rise over run) rather than in degrees, and its direction is opposite to that of the slope. It is zero for a horizontal surface and approaches infinity as the slope approaches the vertical. The gradient data are especially useful for detecting objects buried at shallow depth. The map reveals some details of the anomalies of the geomagnetic field: magnetic anomalies due to walls are more evident than in the total intensity map, whereas anomalies due to concentrations of debris are very weak. In this work we describe the results obtained with magnetometry for two archaeological sites: "Villa degli Antonini" (Genzano, Rome) and Rota Ria (Mugnano in Teverina, Viterbo). 
Since the main goal of the investigation was to understand the nature of the magnetic anomalies with a cost-effective method, detection and location of underground buried structures were also carried out with different geophysical instruments and techniques (EMI, GPR and microgravity), and so far only a targeted sector of the labeled anomaly area has been excavated in order to test the validity of the geophysical survey.
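The horizontal-gradient magnitude that underlies an operator like the MHGO can be sketched with central differences on a gridded field (a generic illustration, not the specific operator implementation used in the survey):

```python
def horizontal_gradient_magnitude(grid, dx=1.0, dy=1.0):
    # Magnitude of the horizontal gradient of a gridded field:
    # sqrt((dF/dx)^2 + (dF/dy)^2), using central differences in the
    # interior and one-sided differences at the grid edges.
    rows, cols = len(grid), len(grid[0])

    def deriv(vals, i, step):
        if i == 0:
            return (vals[1] - vals[0]) / step
        if i == len(vals) - 1:
            return (vals[-1] - vals[-2]) / step
        return (vals[i + 1] - vals[i - 1]) / (2 * step)

    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            gx = deriv([grid[r][k] for k in range(cols)], c, dx)
            gy = deriv([grid[k][c] for k in range(rows)], r, dy)
            row.append((gx * gx + gy * gy) ** 0.5)
        out.append(row)
    return out
```

Sharp lateral contrasts, such as a buried wall edge, produce high values in this map while smooth regional trends produce low ones, which is why gradient maps favor shallow targets.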
Thermal remote sensing as a part of Exupéry volcano fast response system
NASA Astrophysics Data System (ADS)
Zakšek, Klemen; Hort, Matthias
2010-05-01
In order to understand the eruptive potential of a volcanic system one has to characterize its actual state of stress, which requires proper monitoring strategies. As several volcanoes in highly populated areas, especially in south-east Asia, are still nearly unmonitored, a mobile volcano monitoring system is currently being developed in Germany. One of the major novelties of this mobile volcano fast response system, called Exupéry, is the direct inclusion of satellite-based observations. Remote sensing data are introduced together with ground-based field measurements into the GIS database, where the statistical properties of all recorded data are estimated. Using physical modelling and statistical methods we hope to constrain the probability of future eruptions. The emphasis of this contribution is on using thermal remote sensing as a tool for monitoring active volcanoes. One can detect thermal anomalies originating from a volcano by comparing signals in the mid- and thermal-infrared spectra. A reliable and effective thermal anomaly detection algorithm, based on thresholding the so-called normalized thermal index (NTI), was developed by Wright (2002) for the MODIS sensor. This is the method we use in Exupéry, where we characterize each detected thermal anomaly by temperature, area, heat flux and effusion rate. Recent work has shown that radiant flux is the most robust parameter for this characterization. Its derivation depends on the atmosphere, the satellite viewing angle and sensor characteristics. Some of these influences are easy to correct using standard remote sensing pre-processing techniques; however, some noise still remains in the data. In addition, satellites in polar orbits have long revisit times and thus might fail to follow a fast-evolving volcanic crisis. We are therefore currently testing a Kalman filter on the simultaneous use of MODIS and AVHRR data to improve the thermal anomaly characterization. 
The advantage of this technique is that it increases the temporal resolution by using images from different satellites having different resolution and sensitivity. The algorithm has been tested on an eruption at Mt. Etna (2002) and successfully captures more details of the eruption evolution than would be seen using only one satellite source. At the moment, only MODIS (a sensor aboard NASA's Terra and Aqua satellites) data are used operationally in Exupéry. As MODIS is a meteorological sensor, it is also suitable for producing general overview images of the crisis area. Therefore, for each processed MODIS image we also produce an RGB image in which some basic meteorological features are classified, e.g. clouds, volcanic ash plumes, ocean, etc. In the case of a detected hotspot an additional image is created; it contains the original measured radiances of the selected channels for the crisis area. All anomaly and processing parameters are additionally written into an XML file. The results are available in the web GIS at worst two hours after NASA provides level 1b data online.
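Wright's NTI threshold test can be sketched as follows (the threshold of about -0.8 is the value published for the MODVOLC algorithm; the band choice and units here are simplified assumptions):

```python
def nti(radiance_mir, radiance_tir):
    # Normalized Thermal Index: (R_MIR - R_TIR) / (R_MIR + R_TIR),
    # comparing mid-infrared (~3.9 um) and thermal-infrared (~12 um)
    # spectral radiances.  Hot volcanic surfaces boost the MIR
    # radiance disproportionately, raising the index.
    return (radiance_mir - radiance_tir) / (radiance_mir + radiance_tir)

def is_hotspot(radiance_mir, radiance_tir, threshold=-0.8):
    # MODVOLC-style detection: flag a pixel as a thermal anomaly
    # when its NTI exceeds a fixed threshold (about -0.8 in the
    # published algorithm).
    return nti(radiance_mir, radiance_tir) > threshold
```

At ambient temperatures the TIR radiance dominates and the NTI sits well below -0.8, so only pixels with a strong sub-pixel hot component cross the threshold.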
Cole, Janine; Finn, Carol A.; Webb, Susan J.
2013-01-01
Aeromagnetic data clearly delineate the mafic rocks of the economically significant Bushveld Igneous Complex. This is mainly due to the abundance of magnetite in the Upper Zone of the Rustenburg Layered Suite of the Bushveld, but strongly remanently magnetised rocks in the Main Zone also contribute significantly in places. In addition to delineating the extent of the magnetic rocks in the complex, the magnetic anomalies also provide information about the dip and depth of these units. The presence of varying degrees of remanent magnetisation in most of the magnetic lithologies of the Rustenburg Layered Suite complicates the interpretation of the data. The combination of available regional and high-resolution airborne magnetic data with published palaeomagnetic data reveals characteristic magnetic signatures associated with the different magnetic lithologies in the Rustenburg Layered Suite. As expected, the ferrogabbros of the Upper Zone cause the highest-amplitude magnetic anomalies, but in places subtle features within the Main Zone can also be detected. A marker with strong remanent magnetisation located in the Main Zone close to the contact with the Upper Zone is responsible for very high amplitude negative anomalies in the southern parts of both the eastern and western lobes of the Bushveld Complex. Prominent anomalies are not necessarily related to a specific lithology, but can result from the interaction between anomalies caused by differently magnetised bodies. The magnetic data provided substantial information at different levels of detail, ranging from contacts between zones, and layering within zones, to magnetite pipes, dykes and faults that can have an impact on mine planning. Finally, simple modelling of the magnetic data supports the concept of continuous mafic rocks between the western and eastern lobes.
Global Anomaly Detection in Two-Dimensional Symmetry-Protected Topological Phases
NASA Astrophysics Data System (ADS)
Bultinck, Nick; Vanhove, Robijn; Haegeman, Jutho; Verstraete, Frank
2018-04-01
Edge theories of symmetry-protected topological phases are well known to possess global symmetry anomalies. In this Letter we focus on two-dimensional bosonic phases protected by an on-site symmetry and analyze the corresponding edge anomalies in more detail. Physical interpretations of the anomaly in terms of an obstruction to orbifolding and constructing symmetry-preserving boundaries are connected to the cohomology classification of symmetry-protected phases in two dimensions. Using the tensor network and matrix product state formalism we numerically illustrate our arguments and discuss computational detection schemes to identify symmetry-protected order in a ground state wave function.
NASA Astrophysics Data System (ADS)
Berenter, J. S.; Mueller, J. M.; Morrison, I.
2016-12-01
Annual forest fires are a source of great economic and environmental cost in the Maya Biosphere Reserve (MBR), a region of high ecological and historical value in Guatemala's department of Petén. Scarce institutional resources, limited local response capacity, and difficult terrain place a premium on the use of Earth observation data for forest fire management in the MBR, but also present significant institutional barriers to optimizing the value of this data. Drawing upon key informant interviews and a contingent valuation survey of national and local actors conducted during a three-year performance evaluation of the USAID/NASA Regional Visualization and Monitoring System (SERVIR), this paper traces the flow of SERVIR data from acquisition to decision in order to assess the institutional and contextual factors affecting the value of Earth observation data for forest fire management in the MBR. Findings indicate that the use of satellite data for forest fire management in the MBR is widespread and multi-dimensional: historical assessments of land use and land cover, fire scarring, and climate data help central-level fire management agencies identify and regulate fire-sensitive areas; regular monitoring and dissemination of climate data enables coordination between agricultural burning activities and fire early warning systems; and daily satellite detection of thermal anomalies in land surface temperature permits first responders to monitor and react to "hotspot" activity. Findings also suggest, however, that while the decentralized operations of Petén's fire management systems foster the use of Earth observation data, systemic bottlenecks, including budgetary constraints, inadequate data infrastructure and interpretation capacity, and obstacles to regulatory enforcement, impede the flow of information and use of technology and thus impact the value of that data, particularly in remote and under-resourced areas of the MBR. 
A geographic expansion and fortification of support systems for use of Earth observation data is thus required to maximize the value of data-driven forest fire management in the MBR. Findings further validate a need for continued cooperation between scientific and governance institutions to disseminate and integrate geospatial data into environmental decision-making.
Toward the detection of abnormal chest radiographs the way radiologists do it
NASA Astrophysics Data System (ADS)
Alzubaidi, Mohammad; Patel, Ameet; Panchanathan, Sethuraman; Black, John A., Jr.
2011-03-01
Computer Aided Detection (CADe) and Computer Aided Diagnosis (CADx) are relatively recent areas of research that attempt to employ feature extraction, pattern recognition, and machine learning algorithms to aid radiologists in detecting and diagnosing abnormalities in medical images. However, these computational methods are based on the assumption that there are distinct classes of abnormalities, and that each class has some distinguishing features that set it apart from other classes. In practice, abnormalities in chest radiographs tend to be very heterogeneous. The literature suggests that thoracic (chest) radiologists develop their ability to detect abnormalities by developing a sense of what is normal, so that anything abnormal attracts their attention. This paper discusses an approach to CADe based on a technique called anomaly detection (which aims to detect outliers in data sets) for the purpose of detecting atypical regions in chest radiographs. Applying anomaly detection to chest radiographs, however, requires a basis for extracting features from corresponding anatomical locations in different chest radiographs. This paper proposes a method for doing this, and describes how it can be used to support CADe.
Data integrity systems for organ contours in radiation therapy planning.
Shah, Veeraj P; Lakshminarayanan, Pranav; Moore, Joseph; Tran, Phuoc T; Quon, Harry; Deville, Curtiland; McNutt, Todd R
2018-06-12
The purpose of this research is to develop effective data integrity models for contoured anatomy in a radiotherapy workflow, for both real-time and retrospective analysis. Within this study, two classes of contour integrity models were developed: data-driven models and contiguousness models. The data-driven models aim to highlight contours which deviate from a gross set of contours from similar disease sites, and encompass the following regions of interest (ROIs): bladder, femoral heads, spinal cord, and rectum. The contiguousness models, which individually analyze the geometry of contours to detect possible errors, are applied across many different ROIs and are divided into two metrics: Extent and Region Growing over volume. After analysis, we found that 70% of detected bladder contours were verified as suspicious. The spinal cord and rectum models verified that 73% and 80% of contours were suspicious, respectively. The contiguousness models were the most accurate models, and the Region Growing model was the most accurate submodel. 100% of the detected noncontiguous contours were verified as suspicious, but in the cases of spinal cord, femoral heads, bladder, and rectum, the Region Growing model detected an additional two to five suspicious contours that the Extent model failed to detect. When conducting a blind review to detect false negatives, it was found that none of the data-driven models detected every suspicious contour. The Region Growing contiguousness model produced zero false negatives in all regions of interest other than prostate. With regard to runtime, the contiguousness via Extent model took an average of 0.2 s per contour. On the other hand, the Region Growing method had a longer runtime, which was dependent on the number of voxels in the contour. Both contiguousness models have potential for real-time use in clinical radiotherapy, while the data-driven models are better suited for retrospective use. © 2018 The Authors.
Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
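The paper does not reproduce its Region Growing algorithm, but the underlying contiguousness idea can be sketched as a connected-component count over a rasterized contour slice: a contour whose voxels split into more than one island is flagged as suspicious. This is a minimal sketch under assumptions, not the authors' implementation; the function names, the 2-D slice representation, and the choice of 4-connectivity are all illustrative.

```python
from collections import deque

def connected_components(mask):
    """Count 4-connected components of truthy cells in a 2-D grid."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    components = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                components += 1
                # Grow the region outward from this seed cell (BFS).
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return components

def is_suspicious(mask):
    # A contour slice that splits into more than one island is flagged.
    return connected_components(mask) > 1
```

Growing regions from each unvisited voxel is what makes the runtime proportional to the number of voxels in the contour, consistent with the runtime behavior the abstract reports.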
Hypergraph-based anomaly detection of high-dimensional co-occurrences.
Silva, Jorge; Willett, Rebecca
2009-03-01
This paper addresses the problem of detecting anomalous multivariate co-occurrences using a limited number of unlabeled training observations. A novel method based on a hypergraph representation of the data is proposed to deal with this very high-dimensional problem. Hypergraphs constitute an important extension of graphs, allowing edges to connect more than two vertices simultaneously. A variational Expectation-Maximization algorithm for detecting anomalies directly on the hypergraph domain, without any feature selection or dimensionality reduction, is presented. The resulting estimate can be used to calculate a measure of anomalousness based on the False Discovery Rate. The algorithm has O(np) computational complexity, where n is the number of training observations and p is the number of potential participants in each co-occurrence event. This efficiency makes the method ideally suited for very high-dimensional settings; moreover, it requires no tuning, bandwidth, or regularization parameters. The proposed approach is validated on both high-dimensional synthetic data and the Enron email database, where p > 75,000, and it is shown that it can outperform other state-of-the-art methods.
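The variational EM model itself is beyond an abstract-level sketch, but the final step the abstract describes, turning per-observation scores into an anomaly list while controlling the False Discovery Rate, can be illustrated with the standard Benjamini-Hochberg procedure. This is an assumption for illustration; the paper's exact FDR-based measure may differ.

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Indices of observations declared anomalous while controlling
    the expected False Discovery Rate at level alpha."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    # Find the largest rank k with p_(k) <= alpha * k / m,
    # then reject (flag) the k smallest p-values.
    k = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= alpha * rank / m:
            k = rank
    return sorted(order[:k])
```

Unlike a fixed per-observation threshold, the cutoff adapts to how many small p-values are present, which is what keeps the expected fraction of false alarms among the flagged events at or below alpha.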
Sidibé, Désiré; Sankar, Shrinivasan; Lemaître, Guillaume; Rastgoo, Mojdeh; Massich, Joan; Cheung, Carol Y; Tan, Gavin S W; Milea, Dan; Lamoureux, Ecosse; Wong, Tien Y; Mériaudeau, Fabrice
2017-02-01
This paper proposes a method for automatic classification of spectral domain OCT data for the identification of patients with retinal diseases such as Diabetic Macular Edema (DME). We address this issue as an anomaly detection problem and propose a method that not only allows the classification of the OCT volume, but also allows the identification of the individual diseased B-scans inside the volume. Our approach is based on modeling the appearance of normal OCT images with a Gaussian Mixture Model (GMM) and detecting abnormal OCT images as outliers. The classification of an OCT volume is based on the number of detected outliers. Experimental results with two different datasets show that the proposed method achieves a sensitivity and a specificity of 80% and 93% on the first dataset, and 100% and 80% on the second one. Moreover, the experiments show that the proposed method achieves better classification performance than other recently published works. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
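The outlier-counting scheme in this entry can be sketched with a single multivariate Gaussian in place of the paper's GMM (a deliberate simplification), scoring each B-scan by Mahalanobis distance from the normal model and calling the volume abnormal when too many B-scans are outliers. Feature extraction from B-scans is not shown, and all names and thresholds here are illustrative assumptions.

```python
import numpy as np

def fit_normal_model(features):
    """Fit a multivariate Gaussian to feature vectors of normal B-scans.
    (Single-Gaussian simplification of the paper's GMM.)"""
    mu = features.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(features, rowvar=False))
    return mu, inv_cov

def outlier_scores(features, mu, inv_cov):
    """Squared Mahalanobis distance of each B-scan from the normal model."""
    d = features - mu
    return np.einsum('ij,jk,ik->i', d, inv_cov, d)

def classify_volume(scan_features, mu, inv_cov, score_thresh, max_outliers):
    """An OCT volume is called abnormal if it contains too many outlier B-scans."""
    scores = outlier_scores(scan_features, mu, inv_cov)
    n_outliers = int((scores > score_thresh).sum())
    return n_outliers > max_outliers, n_outliers
```

Returning the per-scan outlier flags alongside the volume-level decision is what lets this style of classifier also point at the individual diseased B-scans, as the abstract emphasizes.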
Detection of sinkholes or anomalies using full seismic wave fields.
DOT National Transportation Integrated Search
2013-04-01
This research presents an application of two-dimensional (2-D) time-domain waveform tomography for detection of embedded sinkholes and anomalies. The measured seismic surface wave fields were inverted using a full waveform inversion (FWI) technique, ...
SmartMal: a service-oriented behavioral malware detection framework for mobile devices.
Wang, Chao; Wu, Zhizhong; Li, Xi; Zhou, Xuehai; Wang, Aili; Hung, Patrick C K
2014-01-01
This paper presents SmartMal--a novel service-oriented behavioral malware detection framework for vehicular and mobile devices. The highlight of SmartMal is to introduce service-oriented architecture (SOA) concepts and behavior analysis into malware detection paradigms. The proposed framework relies on a client-server architecture: the client continuously extracts various features and transfers them to the server, and the server's main task is to detect anomalies using state-of-the-art detection algorithms. Multiple distributed servers simultaneously analyze the feature vector using various detectors, and information fusion is used to concatenate the results of the detectors. We also propose a cycle-based statistical approach for mobile device anomaly detection, accomplished by analyzing users' regular usage patterns. Empirical results suggest that the proposed framework and the novel anomaly detection algorithm are highly effective in detecting malware on Android devices.
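The abstract does not specify the cycle-based statistic. One plausible minimal reading, sketched here purely as an assumption, is a per-hour usage profile over the daily cycle with a z-score test against it; the function names and the 3-sigma default are illustrative, not SmartMal's actual algorithm.

```python
from statistics import mean, stdev

def build_profile(history):
    """history maps each hour of the daily cycle to a list of past usage
    samples (e.g. CPU time, bytes sent) from the user's regular behavior."""
    return {hour: (mean(v), stdev(v)) for hour, v in history.items()}

def is_anomalous(profile, hour, usage, z_thresh=3.0):
    """Flag usage that deviates strongly from the norm for this point
    in the user's cycle."""
    mu, sigma = profile[hour]
    if sigma == 0:
        return usage != mu
    return abs(usage - mu) / sigma > z_thresh
```

Keying the profile on the position within the cycle, rather than on a single global mean, is what lets heavy daytime use and near-idle nights each count as normal in their own time slots.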
Atmospheric forcing of sea ice anomalies in the Ross Sea Polynya region
NASA Astrophysics Data System (ADS)
Dale, Ethan; McDonald, Adrian; Rack, Wolfgang
2016-04-01
Despite warming trends in global temperatures, sea ice extent in the southern hemisphere has shown an increasing trend over recent decades. Wind-driven sea ice export from coastal polynyas is an important source of sea ice production. Areas of major polynyas in the Ross Sea, the region with the largest increase in sea ice extent, have been suggested to produce the vast majority of the sea ice in the region. We investigate the impacts of strong wind events on polynyas and the subsequent sea ice production. We utilize Bootstrap sea ice concentration (SIC) measurements derived from satellite-based Special Sensor Microwave Imager (SSM/I) brightness temperature images. These are compared with surface wind measurements made by automatic weather stations of the University of Wisconsin-Madison Antarctic Meteorology Program. Our analysis focuses on the winter period, defined in this study as 1st April to 1st November. Wind data were used to classify each day into characteristic regimes based on the change of wind speed. For each regime, a composite of SIC anomaly was formed for the Ross Sea region. We found that persistent weak winds near the edge of the Ross Ice Shelf are generally associated with positive SIC anomalies in the Ross Sea polynya area (RSP). Conversely, we found negative SIC anomalies in this area during persistent strong winds. By analyzing sea ice motion vectors derived from SSM/I brightness temperatures, we find significant sea ice motion anomalies throughout the Ross Sea during strong wind events. These anomalies persist for several days after the strong wind event. Strong negative correlations are found between SIC within the RSP and wind speed, indicating that strong winds cause significant advection of sea ice in the RSP. This rapid decrease in SIC is followed by a more gradual recovery in SIC.
This increase occurs on a time scale greater than the average persistence of strong wind events and the resulting sea ice motion anomalies, highlighting the production of new sea ice through thermodynamic processes.
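The regime compositing described above can be sketched as follows. This is a hedged illustration: the anomaly here is taken against the all-days mean at each grid cell, a crude stand-in for the proper daily climatology a real analysis would use, and the function name and label scheme are assumptions.

```python
import numpy as np

def composite_sic_anomaly(sic, regime_labels, regime):
    """Mean SIC anomaly over all days assigned to one wind regime.

    sic: array of shape (days, ny, nx) of sea ice concentration;
    regime_labels: one wind-regime label per day.
    """
    anomaly = sic - sic.mean(axis=0)          # per-cell departure from the mean
    mask = np.asarray(regime_labels) == regime
    return anomaly[mask].mean(axis=0)         # average map over the regime's days
```

Averaging the anomaly maps over all days in a regime suppresses day-to-day noise, so a persistent negative composite in the RSP during strong-wind days reflects a systematic regime effect rather than individual weather events.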