DOE Office of Scientific and Technical Information (OSTI.GOV)
Solaimani, Mohiuddin; Iftekhar, Mohammed; Khan, Latifur
Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviating patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint with higher accuracy by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
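The window-based statistical technique described above is not given in code in the abstract; the sketch below illustrates the general idea (a rolling z-score over a sliding window of recent samples), not the authors' Spark implementation, and the window size and threshold are assumptions.

```python
from collections import deque
import math

def window_zscore_stream(stream, window=100, threshold=3.0):
    """Flag samples whose z-score against a sliding window of recent history
    exceeds a threshold. Generic sketch, not the paper's Spark pipeline."""
    buf = deque(maxlen=window)
    for i, x in enumerate(stream):
        if len(buf) == window:
            mean = sum(buf) / window
            var = sum((v - mean) ** 2 for v in buf) / (window - 1)
            std = math.sqrt(var) or 1e-9
            if abs(x - mean) / std > threshold:
                yield i, x  # anomalous sample
        buf.append(x)

# usage (hypothetical VM metric stream): anomalies = list(window_zscore_stream(cpu_usage_series))
```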
2004-01-01
login identity to the one under which the system call is executed, the parameters of the system call execution - file names including full path...Anomaly detection COAST-EIMDT Distributed on target hosts EMERALD Distributed on target hosts and security servers Signature recognition Anomaly...uses a centralized architecture, and employs an anomaly detection technique for intrusion detection. The EMERALD project [80] proposes a
Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.
Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda
2014-05-01
We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is here applied to the problem of anomaly detection for a video annotation system.
Hyperspectral target detection using heavy-tailed distributions
NASA Astrophysics Data System (ADS)
Willis, Chris J.
2009-09-01
One promising approach to target detection in hyperspectral imagery exploits a statistical mixture model to represent scene content at a pixel level. The process then goes on to look for pixels which are rare, when judged against the model, and marks them as anomalies. It is assumed that military targets will themselves be rare and therefore likely to be detected amongst these anomalies. For the typical assumption of multivariate Gaussianity for the mixture components, the presence of the anomalous pixels within the training data will have a deleterious effect on the quality of the model. In particular, the derivation process itself is adversely affected by the attempt to accommodate the anomalies within the mixture components. This will bias the statistics of at least some of the components away from their true values and towards the anomalies. In many cases this will result in a reduction in the detection performance and an increased false alarm rate. This paper considers the use of heavy-tailed statistical distributions within the mixture model. Such distributions are better able to account for anomalies in the training data within the tails of their distributions, and the balance of the pixels within their central masses. This means that an improved model of the majority of the pixels in the scene may be produced, ultimately leading to a better anomaly detection result. The anomaly detection techniques are examined using both synthetic data and hyperspectral imagery with injected anomalous pixels. A range of results is presented for the baseline Gaussian mixture model and for models accommodating heavy-tailed distributions, for different parameterizations of the algorithms. These include scene understanding results, anomalous pixel maps at given significance levels and Receiver Operating Characteristic curves.
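As a point of reference for the mixture-model approach discussed above, here is a minimal Gaussian-mixture anomaly scorer; the heavy-tailed variant advocated in the paper would replace the Gaussian components with multivariate Student-t components, which scikit-learn does not provide out of the box. All parameter values are illustrative.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def mixture_anomaly_scores(pixels, n_components=5, random_state=0):
    """pixels: (n_pixels, n_bands) array of spectra.
    Returns the negative log-likelihood under a Gaussian mixture; high values
    are candidate anomalies. A heavy-tailed version would swap the Gaussian
    components for multivariate Student-t components."""
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="full",
                          random_state=random_state).fit(pixels)
    return -gmm.score_samples(pixels)

# usage (synthetic stand-in for a hyperspectral cube, rows x cols x bands):
cube = np.random.rand(100, 100, 50)
scores = mixture_anomaly_scores(cube.reshape(-1, 50))
anomaly_mask = (scores > np.percentile(scores, 99)).reshape(100, 100)
```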
Anomaly detection in hyperspectral imagery: statistics vs. graph-based algorithms
NASA Astrophysics Data System (ADS)
Berkson, Emily E.; Messinger, David W.
2016-05-01
Anomaly detection (AD) algorithms are frequently applied to hyperspectral imagery, but different algorithms produce different outlier results depending on the image scene content and the assumed background model. This work provides the first comparison of anomaly score distributions between common statistics-based anomaly detection algorithms (RX and subspace-RX) and the graph-based Topological Anomaly Detector (TAD). Anomaly scores in statistical AD algorithms should theoretically approximate a chi-squared distribution; however, this is rarely the case with real hyperspectral imagery. The expected distribution of scores found with graph-based methods remains unclear. We also look for general trends in algorithm performance with varied scene content. Three separate scenes were extracted from the hyperspectral MegaScene image taken over downtown Rochester, NY with the VIS-NIR-SWIR ProSpecTIR instrument. In order of most to least cluttered, we study an urban, suburban, and rural scene. The three AD algorithms were applied to each scene, and the distributions of the most anomalous 5% of pixels were compared. We find that subspace-RX performs better than RX, because the data becomes more normal when the highest variance principal components are removed. We also see that compared to statistical detectors, anomalies detected by TAD are easier to separate from the background. Due to their different underlying assumptions, the statistical and graph-based algorithms highlighted different anomalies within the urban scene. These results will lead to a deeper understanding of these algorithms and their applicability across different types of imagery.
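For readers unfamiliar with the statistics-based detectors compared above, a minimal global RX detector (Mahalanobis distance of each pixel spectrum from the scene mean) can be sketched as follows; subspace-RX and the graph-based TAD are not reproduced here.

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly detector. cube: (rows, cols, bands) array."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mu = X.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))  # pseudo-inverse for stability
    centered = X - mu
    d2 = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)  # Mahalanobis distance squared
    return d2.reshape(rows, cols)
```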
Prevalence and distribution of dental anomalies in orthodontic patients.
Montasser, Mona A; Taha, Mahasen
2012-01-01
To study the prevalence and distribution of dental anomalies in a sample of orthodontic patients. The dental casts, intraoral photographs, and lateral panoramic and cephalometric radiographs of 509 Egyptian orthodontic patients were studied. Patients were examined for dental anomalies in number, size, shape, position, and structure. The prevalence of each dental anomaly was calculated and compared between sexes. Of the total study sample, 32.6% of the patients had at least one dental anomaly other than agenesis of third molars; 32.1% of females and 33.5% of males had at least one dental anomaly other than agenesis of third molars. The most commonly detected dental anomalies were impaction (12.8%) and ectopic eruption (10.8%). The total prevalence of hypodontia (excluding third molars) and hyperdontia was 2.4% and 2.8%, respectively, with similar distributions in females and males. Gemination and accessory roots were reported in this study; each of these anomalies was detected in 0.2% of patients. In addition to genetic and racial factors, environmental factors could have a more important influence on the prevalence of dental anomalies in every population. Impaction, ectopic eruption, hyperdontia, hypodontia, and microdontia were the most common dental anomalies, while fusion and dentinogenesis imperfecta were absent.
FRaC: a feature-modeling approach for semi-supervised and unsupervised anomaly detection.
Noto, Keith; Brodley, Carla; Slonim, Donna
2012-01-01
Anomaly detection involves identifying rare data instances (anomalies) that come from a different class or distribution than the majority (which are simply called "normal" instances). Given a training set of only normal data, the semi-supervised anomaly detection task is to identify anomalies in the future. Good solutions to this task have applications in fraud and intrusion detection. The unsupervised anomaly detection task is different: Given unlabeled, mostly-normal data, identify the anomalies among them. Many real-world machine learning tasks, including many fraud and intrusion detection tasks, are unsupervised because it is impractical (or impossible) to verify all of the training data. We recently presented FRaC, a new approach for semi-supervised anomaly detection. FRaC is based on using normal instances to build an ensemble of feature models, and then identifying instances that disagree with those models as anomalous. In this paper, we investigate the behavior of FRaC experimentally and explain why FRaC is so successful. We also show that FRaC is a superior approach for the unsupervised as well as the semi-supervised anomaly detection task, compared to well-known state-of-the-art anomaly detection methods, LOF and one-class support vector machines, and to an existing feature-modeling approach.
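A much-simplified sketch of the feature-modeling idea behind FRaC follows: one predictor per feature, trained on normal data to predict that feature from the others, with disagreement aggregated into an anomaly score. FRaC itself uses ensembles of learners and an information-theoretic surprisal score, so this is only an illustration; the model choice and the residual-based score are assumptions.

```python
import numpy as np
from sklearn.linear_model import Ridge

def feature_model_scores(train_normal, test):
    """train_normal, test: (n_samples, n_features) arrays.
    Anomaly score = sum of squared standardized residuals of per-feature models."""
    n_features = train_normal.shape[1]
    scores = np.zeros(len(test))
    for j in range(n_features):
        others = [k for k in range(n_features) if k != j]
        model = Ridge(alpha=1.0).fit(train_normal[:, others], train_normal[:, j])
        resid_train = train_normal[:, j] - model.predict(train_normal[:, others])
        sigma = resid_train.std() + 1e-9          # scale learned on normal data
        resid_test = test[:, j] - model.predict(test[:, others])
        scores += (resid_test / sigma) ** 2
    return scores
```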
Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.
2016-06-07
A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configurable to adjust the number of false alerts. The anomaly detector can be used for a variety of probability density functions, including normal Gaussian distributions, irregular distributions, as well as functions associated with continuous or discrete variables.
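A generic illustration of probability-based scoring with a user-configurable alert rate, in the spirit of the system described above but not its actual implementation; the density estimator and the target rate are assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def build_scorer(baseline, target_alert_rate=0.001):
    """Score events by -log estimated probability density; the alert threshold
    is chosen so that roughly target_alert_rate of baseline-like events alert."""
    kde = gaussian_kde(baseline)                      # baseline: 1-D array of event values
    baseline_scores = -np.log(kde(baseline) + 1e-300)
    threshold = np.quantile(baseline_scores, 1.0 - target_alert_rate)
    def score(event_values):
        s = -np.log(kde(event_values) + 1e-300)
        return s, s > threshold                       # (scores, alert flags)
    return score

# usage: scorer = build_scorer(np.random.lognormal(size=10_000))
#        scores, alerts = scorer(np.array([0.5, 50.0]))
```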
Anomaly Detection in Power Quality at Data Centers
NASA Technical Reports Server (NTRS)
Grichine, Art; Solano, Wanda M.
2015-01-01
The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
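A minimal sketch of the Gaussian set-point computation described above; the multiplier k and the use of sample statistics are assumptions, not the NCCIPS configuration.

```python
import numpy as np

def gaussian_setpoints(history, k=3.0):
    """Alarm set-points as mean +/- k standard deviations for a channel
    (voltage, current, or THD) assumed to be Gaussian; k = 3 keeps the
    nominal false-positive rate near 0.3%."""
    mu, sigma = float(np.mean(history)), float(np.std(history, ddof=1))
    return mu - k * sigma, mu + k * sigma

# usage: low, high = gaussian_setpoints(voltage_samples); alarm = (v < low) or (v > high)
```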
Shaikh, Riaz Ahmed; Jameel, Hassan; d'Auriol, Brian J; Lee, Heejo; Lee, Sungyoung; Song, Young-Jae
2009-01-01
Existing anomaly and intrusion detection schemes for wireless sensor networks have mainly focused on the detection of intrusions. Once an intrusion is detected, alerts or claims are generated. However, any unidentified malicious nodes in the network could send faulty anomaly and intrusion claims about legitimate nodes to the other nodes. Verifying the validity of such claims is a critical and challenging issue that is not considered in the existing cooperative-based distributed anomaly and intrusion detection schemes for wireless sensor networks. In this paper, we propose a validation algorithm that addresses this problem. This algorithm utilizes the concept of intrusion-aware reliability, which helps to provide adequate reliability at a modest communication cost. In this paper, we also provide a security resiliency analysis of the proposed intrusion-aware alert validation algorithm.
Evaluation of Anomaly Detection Method Based on Pattern Recognition
NASA Astrophysics Data System (ADS)
Fontugne, Romain; Himura, Yosuke; Fukuda, Kensuke
The number of threats on the Internet is rapidly increasing, and anomaly detection has become of increasing importance. High-speed backbone links are particularly affected, but analyzing their traffic is a complicated task due to the amount of data, the lack of payload data, asymmetric routing, and the use of sampling techniques. Most anomaly detection schemes focus on the statistical properties of network traffic and highlight anomalous traffic through their singularities. In this paper, we concentrate on unusual traffic distributions, which are easily identifiable in temporal-spatial space (e.g., time/address or port). We present an anomaly detection method that uses a pattern recognition technique to identify anomalies in pictures representing traffic. The main advantage of this method is its ability to detect attacks involving mice flows. We evaluate the parameter set and the effectiveness of this approach by analyzing six years of Internet traffic collected from a trans-Pacific link. We show several examples of detected anomalies and compare our results with those of two other methods. The comparison indicates that the anomalies detected only by the pattern-recognition-based method are mainly malicious traffic with a few packets.
Bio-Inspired Distributed Decision Algorithms for Anomaly Detection
2017-03-01
Keywords: DIAMoND, Local Anomaly Detector, Total Impact Estimation, Threat Level Estimator.
Jang, J; Seo, J K
2015-06-01
This paper describes a multiple background subtraction method in frequency difference electrical impedance tomography (fdEIT) to detect an admittivity anomaly from a high-contrast background conductivity distribution. The proposed method expands the use of the conventional weighted frequency-difference EIT method, which has so far been limited to detecting admittivity anomalies in a roughly homogeneous background. The proposed method can be viewed as multiple weighted difference imaging in fdEIT. Although the spatial resolution of the output images of fdEIT is very low due to the inherent ill-posedness, numerical simulations and phantom experiments demonstrate the feasibility of the proposed method for detecting anomalies. It has potential application in stroke detection in a head model, which is highly heterogeneous due to the skull.
Anomaly-based intrusion detection for SCADA systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, D.; Usynin, A.; Hines, J. W.
2006-07-01
Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security attention, and there are concerns that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimate that malicious online actions may have caused $75 billion in damage in 2007. One of the interesting countermeasures for enhancing information system security is called intrusion detection. This paper will briefly discuss the history of research in intrusion detection techniques and introduce the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly-based intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT) and is applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)
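The AAKR-plus-SPRT combination can be sketched as follows; the kernel bandwidth, error probabilities, and assumed mean shift are illustrative values, not those used in the cited study.

```python
import numpy as np

def aakr_reconstruct(memory, query, bandwidth=1.0):
    """Auto-associative kernel regression: reconstruct each query vector as a
    kernel-weighted average of stored normal exemplars.
    memory: (M, d) normal observations; query: (Q, d) new observations."""
    d = np.linalg.norm(memory[None, :, :] - query[:, None, :], axis=2)
    w = np.exp(-(d / bandwidth) ** 2)
    w /= w.sum(axis=1, keepdims=True)
    return w @ memory

def sprt_alarm(residuals, sigma, m, alpha=0.01, beta=0.1):
    """Sequential probability ratio test on a residual sequence: H0 mean 0 vs
    H1 mean m, both with variance sigma^2. Returns True when the cumulative
    log-likelihood ratio crosses the upper (degraded) boundary."""
    upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    llr = 0.0
    for r in residuals:
        llr += (m / sigma**2) * (r - m / 2.0)
        if llr >= upper:
            return True        # decide anomalous
        if llr <= lower:
            llr = 0.0          # decide normal, restart the test
    return False
```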
Effects of Sampling and Spatio/Temporal Granularity in Traffic Monitoring on Anomaly Detectability
NASA Astrophysics Data System (ADS)
Ishibashi, Keisuke; Kawahara, Ryoichi; Mori, Tatsuya; Kondoh, Tsuyoshi; Asano, Shoichiro
We quantitatively evaluate how sampling and spatio/temporal granularity in traffic monitoring affect the detectability of anomalous traffic. These parameters also affect the monitoring burden, so network operators face a trade-off between monitoring burden and detectability and need to know the optimal parameter values. We derive equations to calculate the false positive ratio and false negative ratio for given values of the sampling rate, granularity, statistics of normal traffic, and volume of anomalies to be detected. Specifically, assuming that the normal traffic has a Gaussian distribution, which is parameterized by its mean and standard deviation, we analyze how sampling and monitoring granularity change these distribution parameters. This analysis is based on observation of backbone traffic, which exhibits spatially uncorrelated and temporally long-range dependence. We then derive the equations for detectability. With these equations, we can answer practical questions that arise in actual network operations: what sampling rate to set in order to find a given volume of anomaly, or, if that sampling rate is too high for actual operation, what granularity is optimal to find the anomaly for a given lower limit on the sampling rate.
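The paper's detectability equations are not reproduced in the abstract; the snippet below is a simplified stand-in that shows how false positive and false negative ratios follow from a Gaussian model of sampled traffic, under the assumption that sampling scales the observable anomaly volume linearly.

```python
from scipy.stats import norm

def detectability(mu, sigma, anomaly_volume, sampling_rate, threshold):
    """Back-of-envelope FPR/FNR for a threshold detector on sampled counts,
    assuming sampled normal traffic ~ Gaussian(mu, sigma) and that an anomaly
    adds anomaly_volume * sampling_rate to the sampled count."""
    fpr = norm.sf(threshold, loc=mu, scale=sigma)
    fnr = norm.cdf(threshold, loc=mu + anomaly_volume * sampling_rate, scale=sigma)
    return fpr, fnr

# usage: fpr, fnr = detectability(mu=1000, sigma=50, anomaly_volume=10_000,
#                                 sampling_rate=1/100, threshold=1150)
```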
Road Anomalies Detection System Evaluation.
Silva, Nuno; Shah, Vaibhav; Soares, João; Rodrigues, Helena
2018-06-21
Anomalies on road pavement cause discomfort to drivers and passengers, and may cause mechanical failure or even accidents. Governments spend millions of Euros every year on road maintenance, often causing traffic jams and congestion on urban roads on a daily basis. This paper analyses the difference between the deployment of a road anomalies detection and identification system in a “conditioned” setup and in a real-world setup, where the system performed worse than in the “conditioned” setup. It also presents a system performance analysis based on the analysis of the training data sets; on the analysis of attribute complexity, through the application of PCA techniques; and on the analysis of the attributes in the context of each anomaly type, using acceleration standard deviation attributes to observe how the different anomaly classes are distributed in the Cartesian coordinate system. Overall, in this paper, we describe the main insights on road anomaly detection challenges to support the design and deployment of a new iteration of our system towards a road anomaly detection service that provides information about road conditions to drivers and government entities.
NASA Technical Reports Server (NTRS)
Lo, C. F.; Wu, K.; Whitehead, B. A.
1993-01-01
Statistical and neural network methods have been applied to investigate the feasibility of detecting anomalies in turbopump vibration of the SSME. The anomalies are detected based on the amplitude of peaks of fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods are feasible for detecting the vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank; this method is applicable for on-line operation. The neural network method likewise needs enough data to train the networks. The testing procedure can be utilized at any time so long as the characteristics of the components remain unchanged.
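A sketch of the kind of feature extraction implied above, computing peak amplitudes near the fundamental frequency and its harmonics from a Welch power spectral density estimate; the sampling rate, fundamental frequency, tolerance, and Welch settings are placeholders.

```python
import numpy as np
from scipy.signal import welch

def harmonic_peak_amplitudes(signal, fs, fundamental, n_harmonics=4, tol=2.0):
    """Return the PSD peak amplitude within +/- tol Hz of the fundamental
    frequency and its first harmonics; these amplitudes are the features the
    statistical/neural methods screen for anomalies."""
    freqs, psd = welch(signal, fs=fs, nperseg=4096)
    amps = []
    for k in range(1, n_harmonics + 1):
        band = (freqs > k * fundamental - tol) & (freqs < k * fundamental + tol)
        amps.append(float(psd[band].max()) if band.any() else 0.0)
    return np.array(amps)

# usage: amps = harmonic_peak_amplitudes(accel_trace, fs=10_240, fundamental=600.0)
# flag a test if any amplitude falls outside the distribution learned from nominal firings
```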
NASA Astrophysics Data System (ADS)
Sun, Hao; Zou, Huanxin; Zhou, Shilin
2016-03-01
Detection of anomalous targets of various sizes in hyperspectral data has received a lot of attention in reconnaissance and surveillance applications. Many anomaly detectors have been proposed in the literature. However, current methods are susceptible to anomalies within the processing window range and often make critical assumptions about the distribution of the background data. Motivated by the fact that anomalous pixels are often distinctive from their local background, in this letter we propose a novel hyperspectral anomaly detection framework for real-time remote sensing applications. The proposed framework consists of four major components: sparse feature learning, pyramid grid window selection, joint spatial-spectral collaborative coding, and multi-level divergence fusion. It exploits the collaborative representation difference in the feature space to locate potential anomalies and is totally unsupervised, without any prior assumptions. Experimental results on airborne hyperspectral data demonstrate that the proposed method adapts to anomalies over a large range of sizes and is well suited for parallel processing.
Conditional Anomaly Detection with Soft Harmonic Functions
Valko, Michal; Kveton, Branislav; Valizadegan, Hamed; Cooper, Gregory F.; Hauskrecht, Milos
2012-01-01
In this paper, we consider the problem of conditional anomaly detection that aims to identify data instances with an unusual response or a class label. We develop a new non-parametric approach for conditional anomaly detection based on the soft harmonic solution, with which we estimate the confidence of the label to detect anomalous mislabeling. We further regularize the solution to avoid the detection of isolated examples and examples on the boundary of the distribution support. We demonstrate the efficacy of the proposed method on several synthetic and UCI ML datasets in detecting unusual labels when compared to several baseline approaches. We also evaluate the performance of our method on a real-world electronic health record dataset where we seek to identify unusual patient-management decisions. PMID:25309142
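A minimal sketch of the graph-based idea, assuming labels encoded as +/-1 and a kNN similarity graph: solve a regularized harmonic system and flag instances whose given label disagrees with the smoothed score. The exact regularization and backbone graph of the cited method differ, so this is an illustration only.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

def soft_label_propagation_scores(X, y, gamma=1.0, k=10):
    """X: (n, d) features; y: (n,) labels in {-1, +1}.
    Solve (L + gamma*I) f = y on a kNN graph and return a disagreement score
    per instance (large = given label conflicts with its neighborhood)."""
    W = kneighbors_graph(X, n_neighbors=k, mode="connectivity", include_self=False)
    W = 0.5 * (W + W.T)                      # symmetrize the adjacency
    W = W.toarray()
    L = np.diag(W.sum(axis=1)) - W           # combinatorial graph Laplacian
    f = np.linalg.solve(L + gamma * np.eye(len(y)), y.astype(float))
    return -y * f                            # high score = candidate label anomaly

# usage: scores = soft_label_propagation_scores(X, y_pm1); suspects = np.argsort(scores)[-10:]
```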
Novel Hyperspectral Anomaly Detection Methods Based on Unsupervised Nearest Regularized Subspace
NASA Astrophysics Data System (ADS)
Hou, Z.; Chen, Y.; Tan, K.; Du, P.
2018-04-01
Anomaly detection has been of great interest in hyperspectral imagery analysis. Most conventional anomaly detectors merely take advantage of spectral and spatial information within neighboring pixels. In this paper, two methods, the Unsupervised Nearest Regularized Subspace-based with Outlier Removal Anomaly Detector (UNRSORAD) and the Local Summation UNRSORAD (LSUNRSORAD), are proposed, based on the concept that each pixel in the background can be approximately represented by its spatial neighborhood, while anomalies cannot. Using a dual window, an approximation of each testing pixel is obtained as a linear combination of the surrounding data. The existence of outliers in the dual window will affect detection accuracy, so the proposed detectors remove outlier pixels that are significantly different from the majority of pixels. In order to make full use of the local spatial distribution information in the pixels neighboring the pixel under test, we adopt a local summation dual-window sliding strategy. The residual image is constituted by subtracting the predicted background from the original hyperspectral imagery, and anomalies can be detected in the residual image. Experimental results show that the proposed methods greatly improve detection accuracy compared with other traditional detection methods.
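The dual-window reconstruction idea can be sketched for a single pixel as follows; the outlier-removal step and the local summation strategy are omitted, and the window sizes and ridge parameter are assumptions.

```python
import numpy as np

def dual_window_residual(cube, r, c, inner=2, outer=5, lam=1e-2):
    """Score pixel (r, c) of a (rows, cols, bands) cube by reconstructing its
    spectrum from the outer ring of a dual window via ridge-regularized least
    squares; the residual norm is the anomaly score."""
    rows, cols, bands = cube.shape
    r0, r1 = max(0, r - outer), min(rows, r + outer + 1)
    c0, c1 = max(0, c - outer), min(cols, c + outer + 1)
    neigh = []
    for i in range(r0, r1):
        for j in range(c0, c1):
            if abs(i - r) > inner or abs(j - c) > inner:   # outside the guard window
                neigh.append(cube[i, j, :])
    B = np.array(neigh).T                                  # bands x n_neighbors
    y = cube[r, c, :].astype(float)
    alpha = np.linalg.solve(B.T @ B + lam * np.eye(B.shape[1]), B.T @ y)
    return float(np.linalg.norm(y - B @ alpha))
```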
Sun, Minglei; Yang, Shaobao; Jiang, Jinling; Wang, Qiwei
2015-01-01
Pelger-Huet anomaly (PHA) and Pseudo Pelger-Huet anomaly (PPHA) are neutrophils with abnormal morphology. They have a bilobed or unilobed nucleus and excessively clumped chromatin. Currently, detection of this kind of cell mainly depends on manual microscopic examination by a clinician; thus, the quality of detection is limited by the efficiency and a degree of subjectivity of the clinician. In this paper, a detection method for PHA and PPHA is proposed based on karyomorphism and chromatin distribution features. Firstly, the skeleton of the nucleus is extracted using an augmented Fast Marching Method (AFMM) and the width distribution is obtained through a distance transform. Then, caryoplastin in the nucleus is extracted based on Speeded Up Robust Features (SURF) and a K-nearest-neighbor (KNN) classifier is constructed to analyze the features. Experiments show that the sensitivity and specificity of this method reached 87.5% and 83.33%, which means that the detection accuracy of PHA is acceptable. Meanwhile, the detection method should be helpful for the automatic morphological classification of blood cells.
CHAMP: a locally adaptive unmixing-based hyperspectral anomaly detection algorithm
NASA Astrophysics Data System (ADS)
Crist, Eric P.; Thelen, Brian J.; Carrara, David A.
1998-10-01
Anomaly detection offers a means by which to identify potentially important objects in a scene without prior knowledge of their spectral signatures. As such, this approach is less sensitive to variations in target class composition, atmospheric and illumination conditions, and sensor gain settings than would be a spectral matched filter or similar algorithm. The best existing anomaly detectors generally fall into one of two categories: those based on local Gaussian statistics, and those based on linear mixing models. Unmixing-based approaches better represent the real distribution of data in a scene, but are typically derived and applied on a global or scene-wide basis. Locally adaptive approaches allow detection of more subtle anomalies by accommodating the spatial non-homogeneity of background classes in a typical scene, but provide a poorer representation of the true underlying background distribution. The CHAMP algorithm combines the best attributes of both approaches, applying a linear-mixing model approach in a spatially adaptive manner. The algorithm itself, and test results on simulated and actual hyperspectral image data, are presented in this paper.
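A global (non-adaptive) version of the unmixing-residual idea can be sketched as follows, assuming a known endmember matrix; CHAMP's locally adaptive model selection is not reproduced.

```python
import numpy as np
from scipy.optimize import nnls

def unmixing_residuals(cube, endmembers):
    """Score each pixel by the residual of nonnegative linear unmixing against
    a fixed endmember matrix (bands x n_endmembers); pixels the mixture model
    cannot explain receive high scores."""
    rows, cols, bands = cube.shape
    scores = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            _, resid = nnls(endmembers, cube[i, j, :].astype(float))
            scores[i, j] = resid
    return scores
```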
Prevalence and distribution of selected dental anomalies among saudi children in Abha, Saudi Arabia.
Yassin, Syed M
2016-12-01
Dental anomalies are not an unusual finding in routine dental examination. The effects of dental anomalies can lead to functional, esthetic, and occlusal problems. The purpose of the study was to determine the prevalence and distribution of selected developmental dental anomalies in Saudi children. The study was based on clinical examination and panoramic radiographs of children who visited the pediatric dentistry clinics at King Khalid University College of Dentistry, Saudi Arabia. These patients were examined for dental anomalies in size, shape, number, structure, and position. Data collected were entered and analyzed using the Statistical Package for the Social Sciences. Of the 1252 children (638 boys, 614 girls) examined, 318 subjects (25.39%) presented with selected dental anomalies. The distribution by gender was 175 boys (27.42%) and 143 girls (23.28%). On intergroup comparison, anomalies of number were the most common, with hypodontia (9.7%) being the most frequent anomaly in Saudi children, followed by hyperdontia (3.5%). The prevalences of the size anomalies microdontia and macrodontia were 2.6% and 1.8%, respectively. The prevalences of the shape anomalies were talon cusp (1.4%), taurodontism (1.4%), and fusion (0.8%). The prevalences of the positional anomalies were ectopic eruption (2.3%) and rotation (0.4%). The prevalences of the structural anomalies were amelogenesis imperfecta (0.3%) and dentinogenesis imperfecta (0.1%). A significant number of children had a dental anomaly, with hypodontia being the most common anomaly and dentinogenesis imperfecta the rarest anomaly in the study. Early detection and management of these anomalies can avoid potential orthodontic and esthetic problems in a child. Key words: Dental anomalies, children, Saudi Arabia.
NASA Astrophysics Data System (ADS)
Chen, S.; Tao, C.; Li, H.; Zhou, J.; Deng, X.; Tao, W.; Zhang, G.; Liu, W.; He, Y.
2014-12-01
The Precious Stone Mountain hydrothermal field (PSMHF) is located on the southern rim of the Galapagos Microplate. It was found during the 3rd leg of the 2009 Chinese DY115-21 expedition on board R/V Dayangyihao. Detecting turbidity and temperature anomalies is an efficient way to map the distribution of hydrothermal plumes and locate hydrothermal vents. A method for detecting seawater turbidity with MAPRs based on deep-tow technology was established and improved during our cruises. We collected data recorded by MAPRs together with information from geological sampling, yielding the following results: (1) Strong hydrothermal turbidity and temperature anomalies were recorded at 1.23°N, southeast and northwest of the PSMHF. According to CTD data from the mooring system, significant temperature anomalies were observed over the PSMHF at a depth of 1,470 m, with anomalies ranging from 0.2°C to 0.4°C, giving further evidence of the existence of a hydrothermal plume. (2) At 1.23°N (101.4802°W/1.2305°N), the nose-shaped particle plume was concentrated at a depth interval of 1,400-1,600 m, with a thickness of 200 m and an east-west diffusion range of 500 m. The maximum turbidity anomaly (0.045 ΔNTU) was recorded at a depth of 1,500 m, while the background anomaly was about 0.01 ΔNTU. A distinct temperature anomaly was also detected at the seafloor near 1.23°N. Deep-tow camera footage showed the area was piled with hydrothermal sulfide sediments. (3) In the southeast (101.49°W/1.21°N), the hydrothermal plume was 300 m thick and spread laterally at a depth of 1,500-1,800 m for a distance of about 800 m. The maximum turbidity anomaly of the nose-shaped plume was about 0.04 ΔNTU at a depth of 1,600 m. A distinct temperature anomaly was also detected in the northwest (101.515°W/1.235°N). (4) Terrain and bottom currents were the main factors controlling the distribution of the hydrothermal plume. Unlike the distribution of hydrothermal plumes on mid-ocean ridges, which is mostly affected by seafloor topography, the terrain of the PSMHF is relatively flat, so its impact was negligible. A southwest-flowing bottom current with a speed of 0.05 m/s in the PSMHF had a strong influence on the distribution and spreading direction of the hydrothermal plume. Keywords: hydrothermal plume, Precious Stone Mountain hydrothermal field, turbidity
Anomalous change detection in imagery
Theiler, James P [Los Alamos, NM; Perkins, Simon J [Santa Fe, NM
2011-05-31
A distribution-based anomaly detection platform is described that identifies a non-flat background specified in terms of the distribution of the data. A resampling approach is also disclosed that employs scrambled resampling of the original data, with one class specified by the data and the other by the explicit distribution, and solves the problem using binary classification.
Multi-Level Anomaly Detection on Time-Varying Graph Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bridges, Robert A; Collins, John P; Ferragut, Erik M
This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.
An Unsupervised Deep Hyperspectral Anomaly Detector
Ma, Ning; Peng, Yu; Wang, Shaojun
2018-01-01
Hyperspectral image (HSI) based detection has attracted considerable attention recently in agriculture, environmental protection and military applications, as different wavelengths of light can be advantageously used to discriminate different types of objects. Unfortunately, estimating the background distribution and detecting interesting local objects is not straightforward, and anomaly detectors may give false alarms. In this paper, a Deep Belief Network (DBN) based anomaly detector is proposed. The high-level features and reconstruction errors are learned through the network in a manner that is not affected by prior assumptions about the background distribution. To reduce contamination by local anomalies, adaptive weights are constructed from reconstruction errors and statistical information. By using the code image generated during inference of the DBN and modified by adaptively updated weights, a local Euclidean distance between pixels under test and their neighboring pixels is used to determine the anomaly targets. Experimental results on synthetic and recorded HSI datasets show that the proposed method outperforms the classic global Reed-Xiaoli detector (RXD), the local RX detector (LRXD) and the state-of-the-art Collaborative Representation detector (CRD). PMID:29495410
SmartMal: a service-oriented behavioral malware detection framework for mobile devices.
Wang, Chao; Wu, Zhizhong; Li, Xi; Zhou, Xuehai; Wang, Aili; Hung, Patrick C K
2014-01-01
This paper presents SmartMal--a novel service-oriented behavioral malware detection framework for vehicular and mobile devices. The highlight of SmartMal is to introduce service-oriented architecture (SOA) concepts and behavior analysis into the malware detection paradigms. The proposed framework relies on a client-server architecture: the client continuously extracts various features and transfers them to the server, and the server's main task is to detect anomalies using state-of-the-art detection algorithms. Multiple distributed servers simultaneously analyze the feature vector using various detectors, and information fusion is used to concatenate the results of the detectors. We also propose a cycle-based statistical approach for mobile device anomaly detection. We accomplish this by analyzing the users' regular usage patterns. Empirical results suggest that the proposed framework and novel anomaly detection algorithm are highly effective in detecting malware on Android devices.
Survey of Anomaly Detection Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ng, B
This survey defines the problem of anomaly detection and provides an overview of existing methods. The methods are categorized into two general classes: generative and discriminative. A generative approach involves building a model that represents the joint distribution of the input features and the output labels of system behavior (e.g., normal or anomalous) then applies the model to formulate a decision rule for detecting anomalies. On the other hand, a discriminative approach aims directly to find the decision rule, with the smallest error rate, that distinguishes between normal and anomalous behavior. For each approach, we will give an overview of popular techniques and provide references to state-of-the-art applications.
A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.
This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.
Attention focusing and anomaly detection in systems monitoring
NASA Technical Reports Server (NTRS)
Doyle, Richard J.
1994-01-01
Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. The focus of this paper is a new technique for attention focusing. The technique involves reasoning about the distance between two frequency distributions, and is used to detect both anomalous system parameters and 'broken' causal dependencies. These two forms of information together isolate the locus of anomalous behavior in the system being monitored.
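The "distance between two frequency distributions" is left abstract above; the Jensen-Shannon divergence shown below is one concrete, symmetric choice for comparing a nominal and a current sensor-value histogram, not necessarily the measure used in the paper.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two frequency distributions
    (histograms); larger values indicate a stronger behavioral shift."""
    p = np.asarray(p, float) + eps; p /= p.sum()
    q = np.asarray(q, float) + eps; q /= q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a * np.log(a / b)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# usage: drift = js_divergence(hist_nominal, hist_current); flag the parameter if drift > threshold
```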
Remote detection of geobotanical anomalies associated with hydrocarbon microseepage
NASA Technical Reports Server (NTRS)
Rock, B. N.
1985-01-01
As part of the continuing study of the Lost River, West Virginia NASA/Geosat Test Case Site, an extensive soil gas survey of the site was conducted during the summer of 1983. This soil gas survey has identified an order of magnitude methane, ethane, propane, and butane anomaly that is precisely coincident with the linear maple anomaly reported previously. This and other maple anomalies were previously suggested to be indicative of anaerobic soil conditions associated with hydrocarbon microseepage. In vitro studies support the view that anomalous distributions of native tree species tolerant of anaerobic soil conditions may be useful indicators of methane microseepage in heavily vegetated areas of the United States characterized by deciduous forest cover. Remote sensing systems which allow discrimination and mapping of native tree species and/or species associations will provide the exploration community with a means of identifying vegetation distributional anomalies indicative of microseepage.
Visual saliency detection based on modeling the spatial Gaussianity
NASA Astrophysics Data System (ADS)
Ju, Hongbin
2015-04-01
In this paper, a novel salient object detection method based on modeling spatial anomalies is presented. The proposed framework is inspired by the biological mechanism that human eyes are sensitive to unusual and anomalous objects among a complex background. It is supposed that a natural image can be seen as a combination of similar or dissimilar basic patches, and that there is a direct relationship between saliency and anomaly. Some patches share a high degree of similarity and occur in vast numbers; they usually make up the background of an image. On the other hand, some patches present strong rarity and specificity; we name these patches "anomalies". Generally, an anomalous patch reflects an edge or some special colors and textures in an image, and these patterns cannot be well "explained" by their surroundings. Human eyes show great interest in these anomalous patterns, and will automatically pick out the anomalous parts of an image as the salient regions. To better evaluate the anomaly degree of the basic patches and exploit their nonlinear statistical characteristics, a multivariate Gaussian distribution saliency evaluation model is proposed. In this way, objects with anomalous patterns usually appear as outliers in the Gaussian distribution, and we identify these anomalous objects as salient ones. Experiments are conducted on the well-known MSRA saliency detection dataset. Compared with other recently developed visual saliency detection methods, our method shows significant advantages.
A scalable architecture for online anomaly detection of WLCG batch jobs
NASA Astrophysics Data System (ADS)
Kuehn, E.; Fischer, M.; Giffels, M.; Jung, C.; Petzold, A.
2016-10-01
For data centres it is increasingly important to monitor network usage and learn from network usage patterns. In particular, configuration issues or misbehaving batch jobs that prevent smooth operation need to be detected as early as possible. At the GridKa data and computing centre we therefore operate a tool, BPNetMon, for monitoring traffic data and characteristics of WLCG batch jobs and pilots locally on different worker nodes. On the one hand, local information alone is not sufficient to detect anomalies, for several reasons: for example, the underlying job distribution on a single worker node might change, or there might be a local misconfiguration. On the other hand, a centralised anomaly detection approach does not scale with regard to network communication or computational cost. We therefore propose a scalable architecture based on concepts of a super-peer network.
Structural Anomaly Detection Using Fiber Optic Sensors and Inverse Finite Element Method
NASA Technical Reports Server (NTRS)
Quach, Cuong C.; Vazquez, Sixto L.; Tessler, Alex; Moore, Jason P.; Cooper, Eric G.; Spangler, Jan. L.
2005-01-01
NASA Langley Research Center is investigating a variety of techniques for mitigating aircraft accidents due to structural component failure. One technique under consideration combines distributed fiber optic strain sensing with an inverse finite element method for detecting and characterizing structural anomalies that may provide early indication of airframe structure degradation. The technique identifies structural anomalies that result in observable changes in localized strain but do not impact the overall surface shape. Surface shape information is provided by an inverse finite element method that computes full-field displacements and internal loads using strain data from in-situ fiber optic sensors. This paper describes a prototype of such a system and reports results from a series of laboratory tests conducted on a test coupon subjected to increasing levels of damage.
ISHM Anomaly Lexicon for Rocket Test
NASA Technical Reports Server (NTRS)
Schmalzel, John L.; Buchanan, Aubri; Hensarling, Paula L.; Morris, Jonathan; Turowski, Mark; Figueroa, Jorge F.
2007-01-01
Integrated Systems Health Management (ISHM) is a comprehensive capability. An ISHM system must detect anomalies, identify causes of such anomalies, predict future anomalies, and help identify consequences of anomalies, for example, suggested mitigation steps. The system should also provide users with appropriate navigation tools to facilitate the flow of information into and out of the ISHM system. Central to the ability of the ISHM to detect anomalies is a clearly defined catalog of anomalies. Further, this lexicon of anomalies must be organized in ways that make it accessible to a suite of tools used to manage the data, information and knowledge (DIaK) associated with a system. In particular, it is critical to ensure that there is optimal mapping between target anomalies and the algorithms associated with their detection. During the early development of our ISHM architecture and approach, it became clear that a lexicon of anomalies would be important to the development of critical anomaly detection algorithms. In our work in the rocket engine test environment at John C. Stennis Space Center, we have access to a repository of discrepancy reports (DRs) that are generated in response to squawks identified during post-test data analysis. The DR is the tool used to document anomalies and the methods used to resolve the issue. These DRs have been generated for many different tests and for all test stands. The result is that they represent a comprehensive summary of the anomalies associated with rocket engine testing. Fig. 1 illustrates some of the data that can be extracted from a DR. Such information includes affected transducer channels, a narrative description of the observed anomaly, and the steps used to correct the problem. The primary goal of the anomaly lexicon development effort we have undertaken is to create a lexicon that could be used in support of an associated health assessment database system (HADS) co-development effort. There are a number of significant byproducts of the anomaly lexicon compilation effort: (1) it allows determination of the frequency distribution of anomalies, to help identify those with the potential for high return on investment if included in automated detection as part of an ISHM system; (2) the availability of a regular lexicon can provide the base anomaly name choices to help maintain consistency in the DR collection process; and (3) although developed for the rocket engine test environment, most of the anomalies are not specific to rocket testing and thus can be reused in other applications.
NASA Astrophysics Data System (ADS)
Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.
2015-10-01
We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is also confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.
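The closed-form distribution derived in the paper is not reproduced in the abstract; the sketch below estimates the same null distribution numerically by Monte Carlo and sets an adaptive acceptance threshold from it, under the stated additive-white-noise assumption.

```python
import numpy as np

def cosine_null_threshold(reference, noise_std, alpha=0.01, n_sim=10_000, rng=None):
    """Estimate the distribution of the cosine similarity between a reference
    shape and the same shape corrupted by additive white Gaussian noise, and
    return its alpha-quantile as an adaptive acceptance threshold."""
    rng = np.random.default_rng(rng)
    ref = np.asarray(reference, float)
    ref_unit = ref / np.linalg.norm(ref)
    noisy = ref + rng.normal(0.0, noise_std, size=(n_sim, ref.size))
    sims = (noisy @ ref_unit) / np.linalg.norm(noisy, axis=1)
    return float(np.quantile(sims, alpha))

# usage: thr = cosine_null_threshold(template_ecg_cycle, noise_std=0.2)
# a new cycle whose cosine similarity to the template falls below thr is flagged as shape-anomalous
```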
A Doubly Stochastic Change Point Detection Algorithm for Noisy Biological Signals.
Gold, Nathan; Frasch, Martin G; Herry, Christophe L; Richardson, Bryan S; Wang, Xiaogang
2017-01-01
Experimentally and clinically collected time series data are often contaminated with significant confounding noise, creating short, noisy time series. This noise, due to natural variability and measurement error, poses a challenge to conventional change point detection methods. We propose a novel and robust statistical method for change point detection in noisy biological time series. Our method is a significant improvement over traditional change point detection methods, which only examine a potential anomaly at a single time point. In contrast, our method considers all suspected anomaly points and models the joint probability distribution of the number of change points and the elapsed time between two consecutive anomalies. We validate our method with three simulated time series, a widely accepted benchmark data set, two geological time series, a data set of ECG recordings, and a physiological data set of heart rate variability measurements from a fetal sheep model of human labor, comparing it to three existing methods. Our method demonstrates significantly improved performance over the existing point-wise detection methods.
Development of anomaly detection models for deep subsurface monitoring
NASA Astrophysics Data System (ADS)
Sun, A. Y.
2017-12-01
Deep subsurface repositories are used for waste disposal and carbon sequestration. Monitoring deep subsurface repositories for potential anomalies is challenging, not only because the number of sensor networks and the quality of data are often limited, but also because of the lack of labeled data needed to train and validate machine learning (ML) algorithms. Although physical simulation models may be applied to predict anomalies (or the system's nominal state, for that matter), the accuracy of such predictions may be limited by inherent conceptual and parameter uncertainties. The main objective of this study was to demonstrate the potential of data-driven models for leakage detection in carbon sequestration repositories. Monitoring data collected during an artificial CO2 release test at a carbon sequestration repository were used, which include both scalar time series (pressure) and vector time series (distributed temperature sensing). For each type of data, separate online anomaly detection algorithms were developed using the baseline experiment data (no leak) and then tested on the leak experiment data. The performance of a number of different online algorithms was compared. Results show the importance of including contextual information in the dataset to mitigate the impact of reservoir noise and reduce the false positive rate. The developed algorithms were integrated into a generic Web-based platform for real-time anomaly detection.
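The baseline-then-test workflow described here can be illustrated with a very simple online detector. The sketch below is only a generic stand-in (a streaming z-score rule fit on hypothetical no-leak pressure data), not one of the specific online algorithms compared in the study.

```python
# Minimal illustration of the baseline-then-test workflow described above.
# Statistics are fit on baseline (no-leak) data and then applied to the
# leak-experiment stream; the data and threshold are invented.
import numpy as np

class BaselineZScoreDetector:
    def __init__(self, threshold=4.0):
        self.threshold = threshold
        self.mean = None
        self.std = None

    def fit(self, baseline):
        baseline = np.asarray(baseline, dtype=float)
        self.mean = baseline.mean()
        self.std = baseline.std() + 1e-12   # guard against zero variance

    def is_anomaly(self, x):
        return abs(x - self.mean) / self.std > self.threshold

rng = np.random.default_rng(0)
baseline_pressure = 100.0 + rng.normal(0, 0.5, size=5000)      # no-leak experiment
detector = BaselineZScoreDetector(threshold=4.0)
detector.fit(baseline_pressure)

test_stream = np.concatenate([100.0 + rng.normal(0, 0.5, 200),
                              97.0 + rng.normal(0, 0.5, 50)])   # pressure drop ~ leak
flags = [detector.is_anomaly(x) for x in test_stream]
print("first anomaly at stream index:", flags.index(True))
```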
NASA Astrophysics Data System (ADS)
Hood, L. L.; Spudis, P. D.
2016-11-01
Approximate maps of the lunar crustal magnetic field at low altitudes in the vicinities of the three Imbrian-aged impact basins, Orientale, Schrödinger, and Imbrium, have been constructed using Lunar Prospector and Kaguya orbital magnetometer data. Detectable anomalies are confirmed to be present well within the rims of Imbrium and Schrödinger. Anomalies in Schrödinger are asymmetrically distributed about the basin center, while a single isolated anomaly is most clearly detected within Imbrium northwest of Timocharis crater. The subsurface within these basins was heated to high temperatures at the time of impact and required long time periods (up to 1 Myr) to cool below the Curie temperature for metallic iron remanence carriers (1043 K). Therefore, consistent with laboratory analyses of returned samples, a steady, long-lived magnetizing field, i.e., a former core dynamo, is inferred to have existed when these basins formed. The asymmetrical distribution within Schrödinger suggests partial demagnetization by later volcanic activity when the dynamo field was much weaker or nonexistent. However, it remains true that anomalies within Imbrian-aged basins are much weaker than those within most Nectarian-aged basins. The virtual absence of anomalies within Orientale where impact melt rocks (the Maunder Formation) are exposed at the surface is difficult to explain unless the dynamo field was much weaker during the Imbrian period.
2011-12-18
Proceedings of the SIGMETRICS Symposium on Parallel and Distributed Tools, pages 48–59, 1998. [8] A. Dinning and E. Schonberg. Detecting access...multi-threaded programs. ACM Trans. Comput. Syst., 15(4):391–411, 1997. [38] E. Schonberg. On-the-fly detection of access anomalies. In Proceedings
2008-10-01
AD); Aeolos, a distributed intrusion detection and event correlation infrastructure; STAND, a training-set sanitization technique applicable to ADs...Summary of findings: (a) Automatic Patch Generation, (b) Better Patch Management, (c) Artificial Diversity, (d) Distributed Anomaly Detection
A Dictionary Approach to Electron Backscatter Diffraction Indexing.
Chen, Yu H; Park, Se Un; Wei, Dennis; Newstadt, Greg; Jackson, Michael A; Simmons, Jeff P; De Graef, Marc; Hero, Alfred O
2015-06-01
We propose a framework for indexing of grain and subgrain structures in electron backscatter diffraction patterns of polycrystalline materials. We discretize the domain of a dynamical forward model onto a dense grid of orientations, producing a dictionary of patterns. For each measured pattern, we identify the most similar patterns in the dictionary, and identify boundaries, detect anomalies, and index crystal orientations. The statistical distribution of these closest matches is used in an unsupervised binary decision tree (DT) classifier to identify grain boundaries and anomalous regions. The DT classifies a pattern as an anomaly if it has an abnormally low similarity to any pattern in the dictionary. It classifies a pixel as being near a grain boundary if the highly ranked patterns in the dictionary differ significantly over the pixel's neighborhood. Indexing is accomplished by computing the mean orientation of the closest matches to each pattern. The mean orientation is estimated using a maximum likelihood approach that models the orientation distribution as a mixture of Von Mises-Fisher distributions over the quaternionic three sphere. The proposed dictionary matching approach permits segmentation, anomaly detection, and indexing to be performed in a unified manner with the additional benefit of uncertainty quantification.
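The core dictionary-matching step can be pictured with a small numerical sketch. The code below is an illustration on invented data (random "dictionary" patterns and dummy quaternions), showing only the nearest-neighbor similarity search and the low-best-similarity anomaly rule; the Von Mises-Fisher orientation averaging and the decision-tree boundary test from the paper are omitted.

```python
# Sketch of the dictionary-matching idea: compare each measured pattern to a
# dictionary of simulated patterns via normalized inner products, flag a pixel
# as anomalous when even its best match is poor, and "index" it by the
# orientation of the top match.
import numpy as np

def best_matches(pattern, dictionary, k=5):
    """Return indices and similarities of the k most similar dictionary patterns."""
    p = pattern / np.linalg.norm(pattern)
    d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    sims = d @ p                          # cosine similarity to every dictionary entry
    top = np.argsort(sims)[::-1][:k]
    return top, sims[top]

rng = np.random.default_rng(0)
dictionary = rng.random((1000, 64 * 64))          # 1000 simulated patterns (flattened)
orientations = rng.random((1000, 4))              # dummy quaternion per dictionary entry

measured = dictionary[123] + 0.05 * rng.random(64 * 64)   # a "real" pattern
top, sims = best_matches(measured, dictionary)
is_anomaly = sims[0] < 0.8                        # abnormally low best similarity
indexed_orientation = orientations[top[0]]
print(top[0], round(float(sims[0]), 3), is_anomaly)
```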
Data Analytics in Procurement Fraud Prevention
2014-05-30
Certified Fraud Examiners; CAC, common access card; COR, contracting officer’s representative; CPAR, Contractor Performance Assessment Reporting System; DCAA...using analytics to predict patterns occurring in known credit card fraud investigations to prevent future schemes before they happen. The goal of...or iTunes. 4. Distributional Analytics: Distributional analytics are used to detect anomalies within data. Through the use of distributional
Ellipsoids for anomaly detection in remote sensing imagery
NASA Astrophysics Data System (ADS)
Grosklos, Guenchik; Theiler, James
2015-05-01
For many target and anomaly detection algorithms, a key step is the estimation of a centroid (relatively easy) and a covariance matrix (somewhat harder) that characterize the background clutter. For a background that can be modeled as a multivariate Gaussian, the centroid and covariance lead to an explicit probability density function that can be used in likelihood ratio tests for optimal detection statistics. But ellipsoidal contours can characterize a much larger class of multivariate density functions, and the ellipsoids that characterize the outer periphery of the distribution are most appropriate for detection in the low false alarm rate regime. Traditionally, the sample mean and sample covariance are used to estimate ellipsoid location and shape, but these quantities are confounded both by large lever-arm outliers and by non-Gaussian distributions within the ellipsoid of interest. This paper compares a variety of centroid and covariance estimation schemes with the aim of characterizing the periphery of the background distribution. In particular, we consider a robust variant of the Khachiyan algorithm for the minimum-volume enclosing ellipsoid. The performance of these different approaches is evaluated on multispectral and hyperspectral remote sensing imagery using coverage plots of ellipsoid volume versus false alarm rate.
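To make the contrast between classical and robust ellipsoid estimates concrete, the sketch below compares Mahalanobis-distance contours from the plain sample covariance with those from a robust Minimum Covariance Determinant fit on synthetic two-band "pixels". This is a generic illustration of the outlier-resistance issue, not the robust Khachiyan minimum-volume-enclosing-ellipsoid variant studied in the paper.

```python
# Sketch contrasting a plain sample-covariance ellipsoid with a robust
# Minimum Covariance Determinant fit for low-false-alarm-rate detection.
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import EmpiricalCovariance, MinCovDet

rng = np.random.default_rng(0)
background = rng.multivariate_normal([0, 0], [[2.0, 0.8], [0.8, 1.0]], size=2000)
outliers = rng.uniform(-15, 15, size=(20, 2))            # large lever-arm outliers
pixels = np.vstack([background, outliers])

threshold = chi2.ppf(0.995, df=2)                        # fixed low-FAR contour
for name, est in [("sample", EmpiricalCovariance()),
                  ("robust MCD", MinCovDet(random_state=0))]:
    est.fit(pixels)
    d2 = est.mahalanobis(pixels)                         # squared Mahalanobis distances
    print(f"{name:>10}: ellipsoid scale ~ {np.sqrt(np.linalg.det(est.covariance_)):.2f}, "
          f"flags {int((d2 > threshold).sum())} pixels")
```

The sample covariance is inflated by the injected outliers, so its ellipsoid is larger and masks part of the periphery, which is the effect the robust estimators in the paper are meant to avoid.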
Investigation of Axial Electric Field Measurements with Grounded-Wire TEM Surveys
NASA Astrophysics Data System (ADS)
Zhou, Nan-nan; Xue, Guo-qiang; Li, Hai; Hou, Dong-yang
2018-01-01
Grounded-wire transient electromagnetic (TEM) surveying is often performed along the equatorial direction, with observation lines parallel to the transmitting wire at a certain transmitter-receiver distance. However, this method takes into account only the equatorial component of the electromagnetic field, and little effort has been made to incorporate the other major component along the transmitting wire, denoted here as the axial field. To obtain a comprehensive understanding of its fundamental characteristics and to guide the design of a corresponding observation system for reliable anomaly detection, this study investigates the axial electric field for the first time from three crucial aspects, namely its decay curve, plane distribution, and anomaly sensitivity, through both synthetic modeling and a real application to a major coal field in China. The results demonstrate a higher sensitivity of the axial electric field to both high- and low-resistivity anomalies and confirm its great potential for robust anomaly detection in the subsurface.
Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems
Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda
2015-01-01
In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of the problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. There are two characteristics common to all types of shilling attacks: 1) item abnormality: the rating of target items is always maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need further improvement. To address these problems, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first dynamically partitions item-rating time series based on important points. Then, the chi-square distribution (χ2) is used to detect abnormal intervals. The experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes. PMID:26267477
Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems.
Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda
2015-01-01
In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of the problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. There are two characteristics common to all types of shilling attacks: 1) item abnormality: the rating of target items is always maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need further improvement. To address these problems, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first dynamically partitions item-rating time series based on important points. Then, the chi-square distribution (χ2) is used to detect abnormal intervals. The experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes.
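The chi-square interval test at the heart of the method can be sketched as follows. The code is a simplified stand-in: the dynamic partitioning on important points is replaced by fixed windows, the rating scale is assumed to be 1-5, and the `abnormal_intervals` helper is hypothetical, but it shows how a burst of extreme ratings produces a statistic exceeding the chi-square critical value.

```python
# Sketch of the interval-testing step: compare the rating histogram inside a
# time interval with the item's long-run rating distribution using a
# chi-square statistic, and flag intervals exceeding the critical value.
import numpy as np
from scipy.stats import chi2

def abnormal_intervals(ratings, window=50, alpha=0.01):
    ratings = np.asarray(ratings)
    levels = np.arange(1, 6)                                   # 1..5 stars
    overall = np.array([(ratings == r).mean() for r in levels])
    critical = chi2.ppf(1 - alpha, df=len(levels) - 1)
    flagged = []
    for start in range(0, len(ratings) - window + 1, window):
        chunk = ratings[start:start + window]
        observed = np.array([(chunk == r).sum() for r in levels])
        expected = overall * window + 1e-9                      # avoid zero division
        stat = ((observed - expected) ** 2 / expected).sum()
        if stat > critical:
            flagged.append((start, start + window, round(float(stat), 1)))
    return flagged

rng = np.random.default_rng(0)
normal = rng.integers(1, 6, size=500)            # organic ratings
attack = np.full(50, 5)                          # push attack: burst of 5-star ratings
print(abnormal_intervals(np.concatenate([normal, attack])))
```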
Raghuram, Jayaram; Miller, David J; Kesidis, George
2014-07-01
We propose a method for detecting anomalous domain names, with focus on algorithmically generated domain names which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (or botnets), malware, and phishing. Our method is based on learning a (null hypothesis) probability model based on a large set of domain names that have been white listed by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, and tend to have a distribution of characters, words, word lengths, and number of words that are typical of some language (mostly English), and often consist of words drawn from a known lexicon. On the other hand, in the present day scenario, algorithmically generated domain names typically have distributions that are quite different from that of human-created domain names. We propose a fully generative model for the probability distribution of benign (white listed) domain names which can be used in an anomaly detection setting for identifying putative algorithmically generated domain names. Unlike other methods, our approach can make detections without considering any additional (latency producing) information sources, often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results, relative to several baseline methods, with higher detection rates and low false positive rates.
Raghuram, Jayaram; Miller, David J.; Kesidis, George
2014-01-01
We propose a method for detecting anomalous domain names, with focus on algorithmically generated domain names which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (or botnets), malware, and phishing. Our method is based on learning a (null hypothesis) probability model based on a large set of domain names that have been white listed by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, and tend to have a distribution of characters, words, word lengths, and number of words that are typical of some language (mostly English), and often consist of words drawn from a known lexicon. On the other hand, in the present day scenario, algorithmically generated domain names typically have distributions that are quite different from that of human-created domain names. We propose a fully generative model for the probability distribution of benign (white listed) domain names which can be used in an anomaly detection setting for identifying putative algorithmically generated domain names. Unlike other methods, our approach can make detections without considering any additional (latency producing) information sources, often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results, relative to several baseline methods, with higher detection rates and low false positive rates. PMID:25685511
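As a rough illustration of scoring domain names against a model learned from whitelisted names, the sketch below trains a character-bigram Markov model and reports a per-character log-likelihood. This is only a proxy for the paper's fuller generative model (which also covers words, word lengths, and lexicon membership); the word lists and helper names are invented for the example.

```python
# Simplified sketch: learn a character-level bigram model from whitelisted
# domain names and score new names by average log-likelihood per character;
# algorithmically generated names tend to score much lower.
import math
from collections import defaultdict

def train_bigram_model(names, alphabet="abcdefghijklmnopqrstuvwxyz0123456789-"):
    counts = defaultdict(lambda: defaultdict(lambda: 1.0))     # crude add-one smoothing
    for name in names:
        padded = "^" + name + "$"
        for a, b in zip(padded, padded[1:]):
            counts[a][b] += 1.0
    model = {}
    for a, nxt in counts.items():
        total = sum(nxt.values()) + len(alphabet) + 2
        model[a] = {b: c / total for b, c in nxt.items()}
    return model

def log_likelihood_per_char(name, model, floor=1e-6):
    padded = "^" + name + "$"
    ll = sum(math.log(model.get(a, {}).get(b, floor)) for a, b in zip(padded, padded[1:]))
    return ll / (len(padded) - 1)

whitelist = ["google", "wikipedia", "weather", "university", "openstreetmap",
             "research", "laboratory", "science", "technology", "information"]
model = train_bigram_model(whitelist)
for candidate in ["researchgate", "xk3q9zt1vb"]:     # familiar-looking vs DGA-looking
    print(candidate, round(log_likelihood_per_char(candidate, model), 2))
```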
A system for learning statistical motion patterns.
Hu, Weiming; Xiao, Xuejuan; Fu, Zhouyu; Xie, Dan; Tan, Tieniu; Maybank, Steve
2006-09-01
Analysis of motion patterns is an effective approach for anomaly detection and behavior prediction. Current approaches for the analysis of motion patterns depend on known scenes, where objects move in predefined ways. It is highly desirable to automatically construct object motion patterns which reflect the knowledge of the scene. In this paper, we present a system for automatically learning motion patterns for anomaly detection and behavior prediction based on a proposed algorithm for robustly tracking multiple objects. In the tracking algorithm, foreground pixels are clustered using a fast accurate fuzzy K-means algorithm. Growing and prediction of the cluster centroids of foreground pixels ensure that each cluster centroid is associated with a moving object in the scene. In the algorithm for learning motion patterns, trajectories are clustered hierarchically using spatial and temporal information and then each motion pattern is represented with a chain of Gaussian distributions. Based on the learned statistical motion patterns, statistical methods are used to detect anomalies and predict behaviors. Our system is tested using image sequences acquired, respectively, from a crowded real traffic scene and a model traffic scene. Experimental results show the robustness of the tracking algorithm, the efficiency of the algorithm for learning motion patterns, and the encouraging performance of algorithms for anomaly detection and behavior prediction.
Anomaly clustering in hyperspectral images
NASA Astrophysics Data System (ADS)
Doster, Timothy J.; Ross, David S.; Messinger, David W.; Basener, William F.
2009-05-01
The topological anomaly detection algorithm (TAD) differs from other anomaly detection algorithms in that it uses a topological/graph-theoretic model for the image background instead of modeling the image with a Gaussian normal distribution. In the construction of the model, TAD produces a hard threshold separating anomalous pixels from background in the image. We build on this feature of TAD by extending the algorithm so that it gives a measure of the number of anomalous objects, rather than the number of anomalous pixels, in a hyperspectral image. This is done by identifying, and integrating, clusters of anomalous pixels via a graph theoretical method combining spatial and spectral information. The method is applied to a cluttered HyMap image and combines small groups of pixels containing like materials, such as those corresponding to rooftops and cars, into individual clusters. This improves visualization and interpretation of objects.
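The step from anomalous pixels to anomalous objects can be approximated by grouping spatially connected flagged pixels. The sketch below applies plain connected-component labeling to a synthetic anomaly mask; the paper's graph-theoretic merging additionally uses spectral similarity, which is omitted here.

```python
# Sketch of turning per-pixel anomaly flags into object-level clusters by
# spatially grouping connected anomalous pixels.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
anomaly_mask = np.zeros((60, 60), dtype=bool)
anomaly_mask[10:13, 10:14] = True        # a "rooftop"
anomaly_mask[40:42, 30:32] = True        # a "car"
anomaly_mask[5, 50] = True               # an isolated pixel

labels, n_objects = ndimage.label(anomaly_mask)   # groups 4-connected anomalous pixels
sizes = ndimage.sum(anomaly_mask, labels, index=range(1, n_objects + 1))
print(f"{int(anomaly_mask.sum())} anomalous pixels -> {n_objects} anomalous objects")
print("object sizes (pixels):", np.asarray(sizes).astype(int))
```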
OceanXtremes: Scalable Anomaly Detection in Oceanographic Time-Series
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Armstrong, E. M.; Chin, T. M.; Gill, K. M.; Greguska, F. R., III; Huang, T.; Jacob, J. C.; Quach, N.
2016-12-01
The oceanographic community must meet the challenge to rapidly identify features and anomalies in complex and voluminous observations to further science and improve decision support. Given this data-intensive reality, we are developing an anomaly detection system, called OceanXtremes, powered by an intelligent, elastic Cloud-based analytic service backend that enables execution of domain-specific, multi-scale anomaly and feature detection algorithms across the entire archive of 15 to 30-year ocean science datasets. Our parallel analytics engine extends the NEXUS system and exploits multiple open-source technologies: Apache Cassandra as a distributed spatial "tile" cache, Apache Spark for in-memory parallel computation, and Apache Solr for spatial search and storing pre-computed tile statistics and other metadata. OceanXtremes provides these key capabilities: Parallel generation (Spark on a compute cluster) of 15 to 30-year Ocean Climatologies (e.g. sea surface temperature or SST) in hours or overnight, using simple pixel averages or customizable Gaussian-weighted "smoothing" over latitude, longitude, and time; Parallel pre-computation, tiling, and caching of anomaly fields (daily variables minus a chosen climatology) with pre-computed tile statistics; Parallel detection (over the time-series of tiles) of anomalies or phenomena by regional area-averages exceeding a specified threshold (e.g. high SST in El Nino or SST "blob" regions), or more complex, custom data mining algorithms; Shared discovery and exploration of ocean phenomena and anomalies (facet search using Solr), along with unexpected correlations between key measured variables; Scalable execution for all capabilities on a hybrid Cloud, using our on-premise OpenStack Cloud cluster or at Amazon. The key idea is that the parallel data-mining operations will be run "near" the ocean data archives (a local "network" hop) so that we can efficiently access the thousands of files making up a three-decade time series. The presentation will cover the architecture of OceanXtremes, parallelization of the climatology computation and anomaly detection algorithms using Spark, example results for SST and other time-series, and parallel performance metrics.
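The anomaly definition itself (a daily field minus a long-term climatology, flagged when a regional average exceeds a threshold) is easy to prototype on a single node. The sketch below does this with synthetic SST values in NumPy; the Spark/NEXUS tiling and parallelization of the real system are not represented.

```python
# Minimal single-node sketch of the anomaly definition used above: a daily
# value minus a long-term climatology, flagged when it exceeds a threshold.
# A small NumPy array of synthetic regional-average SST stands in for a tile.
import numpy as np

rng = np.random.default_rng(0)
years, days = 20, 365
seasonal = 15 + 5 * np.sin(2 * np.pi * np.arange(days) / 365)
sst = seasonal + rng.normal(0, 0.4, size=(years, days))     # regional-average SST
sst[-1, 200:230] += 2.5                                     # inject a warm "blob" event

climatology = sst[:-1].mean(axis=0)                         # mean over earlier years
anomaly = sst[-1] - climatology                             # latest year minus climatology
threshold = 1.5                                             # degrees C above climatology
events = np.flatnonzero(anomaly > threshold)
print("anomalous days in latest year:", events.min(), "to", events.max())
```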
Method for localizing and isolating an errant process step
Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Ferrell, Regina K.
2003-01-01
A method for localizing and isolating an errant process includes the steps of retrieving, from a defect image database, a selection of images each having image content similar to the image content extracted from a query image depicting a defect, each image in the selection having corresponding defect characterization data. A conditional probability distribution of the defect having occurred in a particular process step is derived from the defect characterization data. A process step is then identified as the highest probable source of the defect according to the derived conditional probability distribution. A method for process step defect identification includes the steps of characterizing anomalies in a product, the anomalies detected by an imaging system. A query image of a product defect is then acquired. A particular characterized anomaly is then correlated with the query image. An errant process step is then associated with the correlated image.
A mathematical model of extremely low frequency ocean induced electromagnetic noise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dautta, Manik, E-mail: manik.dautta@anyeshan.com; Faruque, Rumana Binte, E-mail: rumana.faruque@anyeshan.com; Islam, Rakibul, E-mail: rakibul.islam@anyeshan.com
2016-07-12
Magnetic Anomaly Detection (MAD) systems use the principle that ferromagnetic objects disturb the magnetic lines of force of the earth. These lines of force are able to pass through both water and air in similar manners. A MAD system, usually mounted on an aerial vehicle, is thus often employed to confirm the detection and accomplish localization of large ferromagnetic objects submerged in a sea-water environment. However, the total magnetic signal encountered by a MAD system includes contributions from a myriad of low to Extremely Low Frequency (ELF) sources. The goal of the MAD system is to detect small anomaly signals in the midst of these low-frequency interfering signals. Both the Range of Detection (R_d) and the Probability of Detection (P_d) are limited by the ratio of anomaly signal strength to the interfering magnetic noise. In this paper, we report a generic mathematical model to estimate the signal-to-noise ratio or SNR. Since time-variant electromagnetic signals are affected by conduction losses due to sea-water conductivity and the presence of the air-water interface, we employ the general formulation of dipole-induced electromagnetic field propagation in stratified media [1]. As a first step we employ a volumetric distribution of isolated elementary magnetic dipoles, each having its own dipole strength and orientation, to estimate the magnetic noise observed by a MAD system. Numerical results are presented for a few realizations out of an ensemble of possible realizations of elementary dipole source distributions.
NASA Technical Reports Server (NTRS)
Labrecque, J. L.; Cande, S. C.; Jarrard, R. D. (Principal Investigator)
1983-01-01
A technique that eliminates external field sources and the effects of strike aliasing was used to extract from marine survey data the intermediate wavelength magnetic anomaly field for (B) in the North Pacific. A strong correlation exists between this field and the MAGSAT field, although a directional sensitivity in the MAGSAT field can be detected. The intermediate wavelength field is correlated with tectonic features. Island arcs appear as positive anomalies of induced origin, likely due to variations in crustal thickness. Seamount chains and oceanic plateaus are also manifested by strong anomalies. The primary contribution to many of these anomalies appears to be due to remanent magnetization. The source parameters for the remainder of these features are presently unidentified or ambiguous. Results indicate that the sea surface field is a valuable source of information for secular variation analysis and the resolution of intermediate wavelength source parameters.
Ranking Causal Anomalies via Temporal and Dynamical Analysis on Vanishing Correlations.
Cheng, Wei; Zhang, Kai; Chen, Haifeng; Jiang, Guofei; Chen, Zhengzhang; Wang, Wei
2016-08-01
The modern world has witnessed a dramatic increase in our ability to collect, transmit and distribute real-time monitoring and surveillance data from large-scale information systems and cyber-physical systems. Detecting system anomalies thus attracts a significant amount of interest in many fields such as security, fault management, and industrial optimization. Recently, the invariant network has been shown to be a powerful way of characterizing complex system behaviours. In the invariant network, a node represents a system component and an edge indicates a stable, significant interaction between two components. Structures and evolutions of the invariant network, in particular the vanishing correlations, can shed important light on locating causal anomalies and performing diagnosis. However, existing approaches to detect causal anomalies with the invariant network often use the percentage of vanishing correlations to rank possible causal components, which has several limitations: 1) fault propagation in the network is ignored; 2) the root causal anomalies may not always be the nodes with a high percentage of vanishing correlations; 3) temporal patterns of vanishing correlations are not exploited for robust detection. To address these limitations, in this paper we propose a network-diffusion-based framework to identify significant causal anomalies and rank them. Our approach can effectively model fault propagation over the entire invariant network, and can perform joint inference on both the structural and the time-evolving broken invariance patterns. As a result, it can locate high-confidence anomalies that are truly responsible for the vanishing correlations, and can compensate for unstructured measurement noise in the system. Extensive experiments on synthetic datasets, bank information system datasets, and coal plant cyber-physical system datasets demonstrate the effectiveness of our approach.
SOME GEOCHEMICAL METHODS OF URANIUM EXPLORATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Illsley, C.T.; Bills, C.W.; Pollock, J.W.
Geochemical research and development projects were carried on to provide basic information which may be applied to exploration or general studies of uranium geology. The applications and limitations of various aspects of geochemistry to uranium geological problems are considered. Modifications of existing analytical techniques were made and tested in the laboratory and in the field. These include rapid quantitative determination of uranium in water, soil and peat, and of trace amounts of sulfate and phosphate in water. A "geochemical anomaly" has been defined as a significant departure from the average abundance background of an element where the distribution has not been disturbed by mineralization. The detection and significance of geochemical anomalies are directly related to the mobility of the element being sought in the zone of weathering. Mobility of uranium is governed by complex physical, chemical, and biological factors. For uranium anomalies in surface materials, the chemical factors affecting mobility are the most significant. The effects of pH, solubility, coprecipitation, adsorption, and complex-ion or compound formation are discussed in relation to anomalies detected in water, soil, and stream sediments. (auth)
Conditional anomaly detection methods for patient-management alert systems
Valko, Michal; Cooper, Gregory; Seybert, Amy; Visweswaran, Shyam; Saul, Melissa; Hauskrecht, Milos
2010-01-01
Anomaly detection methods can be very useful in identifying unusual or interesting patterns in data. A recently proposed conditional anomaly detection framework extends anomaly detection to the problem of identifying anomalous patterns on a subset of attributes in the data. The anomaly always depends (is conditioned) on the value of the remaining attributes. The work presented in this paper focuses on instance-based methods for detecting conditional anomalies. The methods rely on the distance metric to identify examples in the dataset that are most critical for detecting the anomaly. We investigate various metrics and metric learning methods to optimize the performance of the instance-based anomaly detection methods. We show the benefits of the instance-based methods on two real-world detection problems: detection of unusual admission decisions for patients with community-acquired pneumonia and detection of unusual orders of an HPF4 test that is used to confirm Heparin-induced thrombocytopenia, a life-threatening condition caused by Heparin therapy. PMID:25392850
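An instance-based conditional anomaly score of this kind can be sketched with a plain k-nearest-neighbor search: how unusual is the observed decision among the most similar past cases? The code below uses invented context features and decisions and a hypothetical `conditional_anomaly_score` helper; the metric-learning refinements investigated in the paper are not included.

```python
# Sketch of an instance-based conditional anomaly score: for a new case,
# find the k most similar past cases in the context attributes and measure
# how unusual the observed decision (target attribute) is among them.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def conditional_anomaly_score(context, decision, past_contexts, past_decisions, k=25):
    nn = NearestNeighbors(n_neighbors=k).fit(past_contexts)
    _, idx = nn.kneighbors(context.reshape(1, -1))
    neighbor_decisions = past_decisions[idx[0]]
    p = (neighbor_decisions == decision).mean()    # empirical P(decision | context)
    return 1.0 - p                                 # high score = unusual decision

rng = np.random.default_rng(0)
past_contexts = rng.normal(size=(1000, 5))                   # e.g., labs, vitals
past_decisions = (past_contexts[:, 0] > 0).astype(int)       # decision driven by feature 0

case = np.array([1.5, 0.0, 0.0, 0.0, 0.0])
print("expected decision:", conditional_anomaly_score(case, 1, past_contexts, past_decisions))
print("unusual decision: ", conditional_anomaly_score(case, 0, past_contexts, past_decisions))
```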
Anomaly Detection for Beam Loss Maps in the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Valentino, Gianluca; Bruce, Roderik; Redaelli, Stefano; Rossi, Roberto; Theodoropoulos, Panagiotis; Jaster-Merz, Sonja
2017-07-01
In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss map campaigns, as well as in standard operation. Due to the complexity of the system, consisting of more than 50 collimators per beam, it is difficult with such methods to identify small changes in the collimation hierarchy, which may be due to setting errors or beam orbit drifts. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy.
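The PCA-plus-Local-Outlier-Factor combination is straightforward to prototype with scikit-learn. The sketch below runs it on synthetic stand-ins for loss-map vectors (random "normal" maps plus one with a localized shift); it illustrates the pipeline shape only, not the LHC data handling or the tuning used for the collimation system.

```python
# Sketch of the pipeline named above: project loss-map vectors onto a few
# principal components and apply Local Outlier Factor to flag maps that
# deviate from the usual pattern. Data are synthetic stand-ins.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
normal_maps = rng.normal(0, 1, size=(300, 500))      # 300 loss maps, 500 monitors
drifted_map = rng.normal(0, 1, size=(1, 500))
drifted_map[:, 40:60] += 4.0                         # localized change in the hierarchy

X = np.vstack([normal_maps, drifted_map])
Z = PCA(n_components=10, random_state=0).fit_transform(X)

lof = LocalOutlierFactor(n_neighbors=20)
labels = lof.fit_predict(Z)                          # -1 marks outliers
print("flagged map indices:", np.flatnonzero(labels == -1))
```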
Particle Filtering for Model-Based Anomaly Detection in Sensor Networks
NASA Technical Reports Server (NTRS)
Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon
2012-01-01
A novel technique has been developed for anomaly detection of rocket engine test stand (RETS) data. The objective was to develop a system that postprocesses a csv file containing the sensor readings and activities (time-series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective; in general, they have met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are iid, i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploit the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time. However, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data is a mixture of multi-variate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation Maximization (EM) algorithm. The parameters thus learned are used for calculating the joint distribution of the observations. However, this GMM assumption is essentially an approximation and signals the potential viability of non-parametric density estimators. This is the key idea underlying the new approach.
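As a point of reference for the GMM baseline discussed above, the sketch below fits a plain (non-temporal) Gaussian mixture with EM to nominal multi-channel data and flags test samples whose log-likelihood falls below a low quantile of the nominal scores. The data and threshold are invented; the temporal and non-parametric extensions motivating the new approach are not shown.

```python
# Sketch of the GMM baseline: fit a mixture of multivariate Gaussians to
# nominal sensor data with EM, then flag samples with unusually low
# likelihood under the learned model.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
nominal = np.column_stack([rng.normal(0, 1, 2000),
                           rng.normal(5, 2, 2000)])          # two sensor channels
gmm = GaussianMixture(n_components=3, random_state=0).fit(nominal)

test = np.vstack([nominal[:200], np.array([[8.0, -5.0]])])   # last sample is anomalous
log_lik = gmm.score_samples(test)                            # per-sample log-likelihood
threshold = np.quantile(gmm.score_samples(nominal), 0.001)
print("anomalous test indices:", np.flatnonzero(log_lik < threshold))
```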
Distribution of leached radioactive material in the Legin Group Area, San Miguel County, Colorado
Rogers, Allen S.
1950-01-01
Radioactivity anomalies, which are small in magnitude and probably are not caused by extensions of known uranium-vanadium ore bodies, were detected during the gamma-ray logging of diamond-drill holes in the Legin group of claims, southwest San Miguel County, Colo. The positions of these anomalies are at the top surfaces of mudstone strata within, and at the base of, the ore-bearing sandstone of the Salt Wash member of the Morrison formation. The distribution of these anomalies suggests that ground water has leached radioactive material from the ore bodies and has carried it down dip and laterally along the top surfaces of underlying impermeable mudstone strata for distances as great as 300 feet. The anomalies are probably caused by radon and its daughter elements. Preliminary tests indicate that radon in quantities up to 10^-7 curies per liter may be present in ground water flowing along sandstone-mudstone contacts under carnotite ore bodies. In comparison, the radium content of the same water is less than 10^-10 curies per liter. Further substantiation of the relationship between ore bodies, the movement of water, and the radon-caused anomalies may greatly increase the scope of gamma-ray logs of drill holes as an aid to prospecting.
Sherwin, Jason; Sajda, Paul
2013-01-01
Humans are extremely good at detecting anomalies in sensory input. For example, while listening to a piece of Western-style music, an anomalous key change or an out-of-key pitch is readily apparent, even to the non-musician. In this paper we investigate differences between musical experts and non-experts during musical anomaly detection. Specifically, we analyzed the electroencephalograms (EEG) of five expert cello players and five non-musicians while they listened to excerpts of J.S. Bach’s Prelude from Cello Suite No.1. All subjects were familiar with the piece, though experts also had extensive experience playing the piece. Subjects were told that anomalous musical events (AMEs) could occur at random within the excerpts of the piece and were told to report the number of AMEs after each excerpt. Furthermore, subjects were instructed to remain still while listening to the excerpts and their lack of movement was verified via visual and EEG monitoring. Experts had significantly better behavioral performance (i.e. correctly reporting AME counts) than non-experts, though both groups had mean accuracies greater than 80%. These group differences were also reflected in the EEG correlates of key-change detection post-stimulus, with experts showing more significant, greater magnitude, longer periods of and earlier peaks in condition-discriminating EEG activity than novices. Using the timing of the maximum discriminating neural correlates, we performed source reconstruction and compared significant differences between cellists and non-musicians. We found significant differences that included a slightly right lateralized motor and frontal source distribution. The right lateralized motor activation is consistent with the cortical representation of the left hand – i.e. the hand a cellist would use, while playing, to generate the anomalous key-changes. In general, these results suggest that sensory anomalies detected by experts may in fact be partially a result of an embodied cognition, with a model of the action for generating the anomaly playing a role in its detection. PMID:24056235
Data Analytics in Procurement Fraud Prevention
2014-06-01
access card; COR, contracting officer’s representative; CPAR, Contractor Performance Assessment Reporting System; DCAA, Defense Contract Audit Agency; DOD...of this can be seen in a company using analytics to predict patterns occurring in known credit card fraud investigations to prevent future schemes...a website such as Amazon or iTunes. 4. Distributional Analytics: Distributional analytics are used to detect anomalies within data. Through the use of distributional
A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data.
Goldstein, Markus; Uchida, Seiichi
2016-01-01
Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied on unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection as well as in the life science and medical domain. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks.
Mining IP to Domain Name Interactions to Detect DNS Flood Attacks on Recursive DNS Servers.
Alonso, Roberto; Monroy, Raúl; Trejo, Luis A
2016-08-17
The Domain Name System (DNS) is a critical infrastructure of any network, and, not surprisingly a common target of cybercrime. There are numerous works that analyse higher level DNS traffic to detect anomalies in the DNS or any other network service. By contrast, few efforts have been made to study and protect the recursive DNS level. In this paper, we introduce a novel abstraction of the recursive DNS traffic to detect a flooding attack, a kind of Distributed Denial of Service (DDoS). The crux of our abstraction lies on a simple observation: Recursive DNS queries, from IP addresses to domain names, form social groups; hence, a DDoS attack should result in drastic changes on DNS social structure. We have built an anomaly-based detection mechanism, which, given a time window of DNS usage, makes use of features that attempt to capture the DNS social structure, including a heuristic that estimates group composition. Our detection mechanism has been successfully validated (in a simulated and controlled setting) and with it the suitability of our abstraction to detect flooding attacks. To the best of our knowledge, this is the first time that work is successful in using this abstraction to detect these kinds of attacks at the recursive level. Before concluding the paper, we motivate further research directions considering this new abstraction, so we have designed and tested two additional experiments which exhibit promising results to detect other types of anomalies in recursive DNS servers.
Mining IP to Domain Name Interactions to Detect DNS Flood Attacks on Recursive DNS Servers
Alonso, Roberto; Monroy, Raúl; Trejo, Luis A.
2016-01-01
The Domain Name System (DNS) is a critical infrastructure of any network, and, not surprisingly a common target of cybercrime. There are numerous works that analyse higher level DNS traffic to detect anomalies in the DNS or any other network service. By contrast, few efforts have been made to study and protect the recursive DNS level. In this paper, we introduce a novel abstraction of the recursive DNS traffic to detect a flooding attack, a kind of Distributed Denial of Service (DDoS). The crux of our abstraction lies on a simple observation: Recursive DNS queries, from IP addresses to domain names, form social groups; hence, a DDoS attack should result in drastic changes on DNS social structure. We have built an anomaly-based detection mechanism, which, given a time window of DNS usage, makes use of features that attempt to capture the DNS social structure, including a heuristic that estimates group composition. Our detection mechanism has been successfully validated (in a simulated and controlled setting) and with it the suitability of our abstraction to detect flooding attacks. To the best of our knowledge, this is the first time that work is successful in using this abstraction to detect these kinds of attacks at the recursive level. Before concluding the paper, we motivate further research directions considering this new abstraction, so we have designed and tested two additional experiments which exhibit promising results to detect other types of anomalies in recursive DNS servers. PMID:27548169
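A flavor of the "DNS social structure" idea can be given with a few lines of code. The sketch below computes some simple per-window statistics from (client IP, domain) pairs and shows how a flooding window looks structurally different from a baseline window; the actual feature set and group-composition heuristic in the paper are richer, and the data and helper names here are invented.

```python
# Illustrative sketch only: derive a few per-window features from recursive
# DNS (client IP, queried domain) pairs that roughly capture group structure.
from collections import defaultdict
import statistics

def window_features(queries):
    """queries: list of (client_ip, domain) pairs seen in one time window."""
    per_ip = defaultdict(set)
    for ip, domain in queries:
        per_ip[ip].add(domain)
    domains_per_ip = [len(s) for s in per_ip.values()]
    return {
        "n_queries": len(queries),
        "n_ips": len(per_ip),
        "n_domains": len({d for _, d in queries}),
        "mean_domains_per_ip": statistics.mean(domains_per_ip),
    }

baseline = window_features([("10.0.0.1", "example.com"), ("10.0.0.2", "example.com"),
                            ("10.0.0.1", "mail.local"), ("10.0.0.3", "intranet")])
attack = window_features([("10.0.0.9", f"rnd{i}.victim.com") for i in range(500)])
print(baseline)
print(attack)   # one IP hammering many unique names -> very different structure
```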
CPAD: Cyber-Physical Attack Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferragut, Erik M; Laska, Jason A
The CPAD technology relates to anomaly detection and more specifically to cyber physical attack detection. It infers underlying physical relationships between components by analyzing the sensor measurements of a system. It then uses these measurements to detect signs of a non-physically realizable state, which is indicative of an integrity attack on the system. CPAD can be used on any highly-instrumented cyber-physical system to detect integrity attacks and identify the component or components compromised. It has applications to power transmission and distribution, nuclear and industrial plants, and complex vehicles.
A model for anomaly classification in intrusion detection systems
NASA Astrophysics Data System (ADS)
Ferreira, V. O.; Galhardi, V. V.; Gonçalves, L. B. L.; Silva, R. C.; Cansian, A. M.
2015-09-01
Intrusion Detection Systems (IDS) are traditionally divided into two types according to the detection methods they employ, namely (i) misuse detection and (ii) anomaly detection. Anomaly detection has been widely used, and its main advantage is the ability to detect new attacks. However, the analysis of the anomalies generated can become expensive, since they often carry no clear information about the malicious events they represent. In this context, this paper presents a model for automated classification of alerts generated by an anomaly-based IDS. The main goal is either to classify the detected anomalies into well-defined taxonomies of attacks or to identify whether an alert is a false positive misclassified by the IDS. Some common attacks on computer networks were considered, and we achieved important results that can equip security analysts with better resources for their analyses.
An immunity-based anomaly detection system with sensor agents.
Okamoto, Takeshi; Ishida, Yoshiteru
2009-01-01
This paper proposes an immunity-based anomaly detection system with sensor agents based on the specificity and diversity of the immune system. Each agent is specialized to react to the behavior of a specific user. Multiple diverse agents decide whether the behavior is normal or abnormal. Conventional systems have used only a single sensor to detect anomalies, while the immunity-based system makes use of multiple sensors, which leads to improvements in detection accuracy. In addition, we propose an evaluation framework for the anomaly detection system, which is capable of evaluating the differences in detection accuracy between internal and external anomalies. This paper focuses on anomaly detection in user's command sequences on UNIX-like systems. In experiments, the immunity-based system outperformed some of the best conventional systems.
Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data
NASA Technical Reports Server (NTRS)
Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney
2012-01-01
This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account fixed effects: flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, the multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and failures, biases, and trends in flight actuators. These anomalies were missed by the existing airline monitoring of FOQA data exceedances.
NASA Astrophysics Data System (ADS)
Gurk, M.; Bosch, F. P.; Tougiannidis, N.
2013-04-01
Common studies on the static electric field distribution over a conductivity anomaly use the self-potential method. However, this method is time consuming and requires nonpolarizable electrodes to be placed in the ground. Moreover, the information gained by this method is restricted to the horizontal variations of the electric field. To overcome the limitation in the self-potential technique, we conducted a field experiment using a non conventional technique to assess the static electric field over a conductivity anomaly. We use two metallic potential probes arranged on an insulated boom with a separation of 126 cm. When placed into the electric field of the free air, a surface charge will be induced on each probe trying to equalize with the potential of the surrounding atmosphere. The use of a plasma source at both probes facilitated continuous and quicker measurement of the electric field in the air. The present study shows first experimental measurements with a modified potential probe technique (MPP) along a 600-meter-long transect to demonstrate the general feasibility of this method for studying the static electric field distribution over shallow conductivity anomalies. Field measurements were carried out on a test site on top of the Bramsche Massif near Osnabrück (Northwest Germany) to benefit from a variety of available near surface data over an almost vertical conductivity anomaly. High resolution self-potential data served in a numerical analysis to estimate the expected individual components of the electric field vector. During the experiment we found more anomalies in the vertical and horizontal components of the electric field than self-potential anomalies. These contrasting findings are successfully cross-validated with conventional near surface geophysical methods. Among these methods, we used self-potential, radiomagnetotelluric, electric resistivity tomography and induced polarization data to derive 2D conductivity models of the subsurface in order to infer the geometrical properties and the origin of the conductivity anomaly in the survey area. The presented study demonstrates the feasibility of electric field measurements in free air to detect and study near surface conductivity anomalies. Variations in Ez correlate well with the conductivity distribution obtained from resistivity methods. Compared to the self-potential technique, continuously free air measurements of the electric field are more rapid and of better lateral resolution combined with the unique ability to analyze vertical components of the electric field which are of particular importance to detect lateral conductivity contrasts. Mapping Ez in free air is a good tool to precisely map lateral changes of the electric field distribution in areas where SP generation fails. MPP offers interesting application in other geophysical techniques e.g. in time domain electromagnetics, DC and IP. With this method we were able to reveal a ca. 150 m broad zone of enhanced electric field strength.
A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data
Goldstein, Markus; Uchida, Seiichi
2016-01-01
Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied on unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection as well as in the life science and medical domain. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks. PMID:27093601
Statistical Traffic Anomaly Detection in Time-Varying Communication Networks
2015-02-01
methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. Statistical Traffic Anomaly Detection in Time...our methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. Index Terms—Statistical anomaly detection...anomaly detection but also for understanding the normal traffic in time-varying networks. C. Comparison with vanilla stochastic methods For both types
Statistical Traffic Anomaly Detection in Time Varying Communication Networks
2015-02-01
methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. Statistical Traffic Anomaly Detection in Time...our methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. Index Terms—Statistical anomaly detection...anomaly detection but also for understanding the normal traffic in time-varying networks. C. Comparison with vanilla stochastic methods For both types
Radar and infrared remote sensing of geothermal features at Pilgrim Springs, Alaska
NASA Technical Reports Server (NTRS)
Dean, K. G.; Forbes, R. B.; Turner, D. L.; Eaton, F. D.; Sullivan, K. D.
1982-01-01
High-altitude radar and thermal imagery collected by the NASA research aircraft WB57F were used to examine the structural setting and distribution of radiant temperatures of geothermal anomalies in the Pilgrim Springs, Alaska area. Like-polarized radar imagery with perpendicular look directions provides the best structural data for lineament analysis, although more than half the mapped lineaments are easily detectable on conventional aerial photography. Radiometer data and imagery from a thermal scanner were used to evaluate radiant surface temperatures, which ranged from 3 to 17 C. The evening imagery, which utilized density-slicing techniques, detected thermal anomalies associated with geothermal heat sources. The study indicates that high-altitude predawn thermal imagery may be able to locate relatively large areas of hot ground in site-specific studies in the vegetated Alaskan terrain. This imagery will probably not detect gentle lateral gradients.
A Survey on Anomaly Based Host Intrusion Detection System
NASA Astrophysics Data System (ADS)
Jose, Shijoe; Malathi, D.; Reddy, Bharath; Jayaseeli, Dorathi
2018-04-01
An intrusion detection system (IDS) is hardware, software, or a combination of the two, for monitoring network or system activities to detect malicious signs. In computer security, designing a robust intrusion detection system is one of the most fundamental and important problems. The primary function of the system is to detect intrusions and to raise alerts in a timely manner; when the IDS finds an intrusion, it sends an alert message to the system administrator. Anomaly detection is an important problem that has been researched within diverse research areas and application domains. This survey tries to provide a structured and comprehensive overview of the research on anomaly detection. Each of the existing anomaly detection techniques has relative strengths and weaknesses. The current state of experimental practice in the field of anomaly-based intrusion detection is reviewed, and recent studies are surveyed. This survey provides a study of existing anomaly detection techniques and of how the techniques used in one area can be applied in another application domain.
Theory and experiments in model-based space system anomaly management
NASA Astrophysics Data System (ADS)
Kitts, Christopher Adam
This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms were developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.
Anomaly Detection in Test Equipment via Sliding Mode Observers
NASA Technical Reports Server (NTRS)
Solano, Wanda M.; Drakunov, Sergey V.
2012-01-01
Nonlinear observers were originally developed based on the ideas of variable structure control, and for the purpose of detecting disturbances in complex systems. In this anomaly detection application, these observers were designed for estimating the distributed state of fluid flow in a pipe described by a class of advection equations. The observer algorithm uses collected data in a piping system to estimate the distributed system state (pressure and velocity along a pipe containing liquid gas propellant flow) using only boundary measurements. These estimates are then used to further estimate and localize possible anomalies such as leaks or foreign objects, and instrumentation metering problems such as incorrect flow meter orifice plate size. The observer algorithm has the following parts: a mathematical model of the fluid flow, observer control algorithm, and an anomaly identification algorithm. The main functional operation of the algorithm is in creating the sliding mode in the observer system implemented as software. Once the sliding mode starts in the system, the equivalent value of the discontinuous function in sliding mode can be obtained by filtering out the high-frequency chattering component. In control theory, "observers" are dynamic algorithms for the online estimation of the current state of a dynamic system by measurements of an output of the system. Classical linear observers can provide optimal estimates of a system state in case of uncertainty modeled by white noise. For nonlinear cases, the theory of nonlinear observers has been developed and its success is mainly due to the sliding mode approach. Using the mathematical theory of variable structure systems with sliding modes, the observer algorithm is designed in such a way that it steers the output of the model to the output of the system obtained via a variety of sensors, in spite of possible mismatches between the assumed model and actual system. The unique properties of sliding mode control allow not only control of the model internal states to the states of the real-life system, but also identification of the disturbance or anomaly that may occur.
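The core sliding-mode mechanism (a discontinuous injection driving the model state to the measurements, with the disturbance recovered by low-pass filtering the injection's equivalent value) can be shown on a scalar toy system. The sketch below is only that toy illustration, with invented plant, gain, and filter values; the distributed observer for advection-type pipe-flow equations described above is substantially more involved.

```python
# Scalar illustration of the sliding-mode observer idea: drive the model state
# to the measured state with a discontinuous injection, then low-pass filter
# the injection ("equivalent value") to reconstruct an unknown disturbance.
import numpy as np

dt, T = 1e-3, 5.0
steps = int(T / dt)
a, gain, tau = -1.0, 5.0, 0.05            # plant pole, observer gain, filter constant

x, x_hat, d_filtered = 0.0, 0.5, 0.0
history = []
for k in range(steps):
    t = k * dt
    d = 1.0 if 2.0 <= t <= 3.5 else 0.0                 # unknown "leak" disturbance
    u = np.sin(t)                                       # known input
    x += dt * (a * x + u + d)                           # true plant
    v = gain * np.sign(x - x_hat)                       # discontinuous injection
    x_hat += dt * (a * x_hat + u + v)                   # sliding-mode observer
    d_filtered += dt / tau * (v - d_filtered)           # filtered equivalent value ~ d(t)
    history.append((t, d, d_filtered))

t, d, d_est = history[int(2.8 / dt)]
print(f"t={t:.2f}s  true disturbance={d:.2f}  estimated={d_est:.2f}")
```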
Wormlike Chain Theory and Bending of Short DNA
NASA Astrophysics Data System (ADS)
Mazur, Alexey K.
2007-05-01
The probability distributions for bending angles in double-helical DNA obtained in all-atom molecular dynamics simulations are compared with theoretical predictions. The computed distributions remarkably agree with the wormlike chain theory and qualitatively differ from predictions of the subelastic chain model. The computed data exhibit only small anomalies in the apparent flexibility of short DNA and cannot account for the recently reported AFM data. It is possible that the current atomistic DNA models miss some essential mechanisms of DNA bending on intermediate length scales. Analysis of bent DNA structures reveals, however, that the bending motion is structurally heterogeneous and directionally anisotropic on the length scales where the experimental anomalies were detected. These effects are essential for interpretation of the experimental data and can also be responsible for the apparent discrepancy.
Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server
2016-09-01
ARL-TR-7798, September 2016, US Army Research Laboratory. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server, by Christian D Schlesiger, Computational and Information Sciences Directorate, ARL.
Seismic data fusion anomaly detection
NASA Astrophysics Data System (ADS)
Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David
2014-06-01
Detecting anomalies in non-stationary signals has valuable applications in many fields including medicine and meteorology. These include uses such as identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes via seismographic data. Given the many choices of anomaly detection algorithms, it is important to compare possible methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach involves using an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other method uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives" or transformations of the observed signal for anomalies. Possible perspectives may include wavelet de-noising, the Fourier transform, peak filtering, etc. In order to evaluate these techniques via signal fusion metrics, we must apply signal preprocessing techniques such as de-noising methods to the original signal and then use a neural network to find anomalies in the generated signal. From this secondary result it is possible to use data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The result will show which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.
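As a rough illustration of the first approach (de-noise, then flag deviations), the following hedged sketch uses PyWavelets for wavelet de-noising and a simple residual z-score in place of the trained ANN; the wavelet choice, threshold rule and test signal are assumptions, not taken from the paper.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(x, wavelet="db4", level=4):
    """Soft-threshold wavelet de-noising with the universal threshold."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # robust noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

def flag_anomalies(x, k=3.0):
    """Flag samples whose residual from the de-noised signal is unusually large."""
    resid = x - wavelet_denoise(x)
    z = (resid - resid.mean()) / (resid.std() + 1e-12)
    return np.abs(z) > k

# toy non-stationary signal with an injected spike
t = np.linspace(0, 10, 2000)
sig = np.sin(2 * np.pi * 1.5 * t) + 0.1 * np.random.randn(t.size)
sig[1200] += 4.0
print("anomalous indices:", np.where(flag_anomalies(sig))[0])
```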
Adiabatic Quantum Anomaly Detection and Machine Learning
NASA Astrophysics Data System (ADS)
Pudenz, Kristen; Lidar, Daniel
2012-02-01
We present methods of anomaly detection and machine learning using adiabatic quantum computing. The machine learning algorithm is a boosting approach which seeks to optimally combine somewhat accurate classification functions to create a unified classifier which is much more accurate than its components. This algorithm then becomes the first part of the larger anomaly detection algorithm. In the anomaly detection routine, we first use adiabatic quantum computing to train two classifiers which detect two sets, the overlap of which forms the anomaly class. We call this the learning phase. Then, in the testing phase, the two learned classification functions are combined to form the final Hamiltonian for an adiabatic quantum computation, the low energy states of which represent the anomalies in a binary vector space.
Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection
NASA Technical Reports Server (NTRS)
Kumar, Sricharan; Srivistava, Ashok N.
2012-01-01
Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
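A minimal sketch of residual-bootstrap prediction intervals around a non-parametric (kernel) regression, with observations falling outside the interval flagged as anomalous. The smoother, bandwidth and toy data are assumptions; the paper's exact procedure and asymptotic analysis are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel_smoother(x_train, y_train, x_eval, h=0.3):
    """Nadaraya-Watson regression: a simple non-parametric model."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

def bootstrap_pi(x, y, x_eval, n_boot=500, alpha=0.05):
    """Residual-bootstrap prediction intervals for the kernel fit."""
    fit = kernel_smoother(x, y, x)
    resid = y - fit
    preds = np.empty((n_boot, x_eval.size))
    for b in range(n_boot):
        y_b = fit + rng.choice(resid, size=resid.size, replace=True)   # refit on resampled data
        preds[b] = kernel_smoother(x, y_b, x_eval) + rng.choice(resid, size=x_eval.size)
    lo, hi = np.percentile(preds, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)
    return lo, hi

# toy data; new observations outside the interval are flagged as anomalous
x = rng.uniform(0, 6, 200)
y = np.sin(x) + 0.2 * rng.normal(size=x.size)
x_new = np.array([1.0, 3.0, 5.0])
y_new = np.array([0.9, 0.3, -1.9])              # the last value is an outlier
lo, hi = bootstrap_pi(x, y, x_new)
print("anomalous:", (y_new < lo) | (y_new > hi))
```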
NASA Technical Reports Server (NTRS)
Gage, Mark; Dehoff, Ronald
1991-01-01
This system architecture task (1) analyzed the current process used to make an assessment of engine and component health after each test or flight firing of an SSME, (2) developed an approach and a specific set of objectives and requirements for automated diagnostics during post fire health assessment, and (3) listed and described the software applications required to implement this system. The diagnostic system described is a distributed system with a database management system to store diagnostic information and test data, a CAE package for visual data analysis and preparation of plots of hot-fire data, a set of procedural applications for routine anomaly detection, and an expert system for the advanced anomaly detection and evaluation.
NASA Astrophysics Data System (ADS)
Barbarella, M.; De Giglio, M.; Galeandro, A.; Mancini, F.
2012-04-01
The modification of some atmospheric physical properties prior to a high-magnitude earthquake has recently been debated within the Lithosphere-Atmosphere-Ionosphere (LAI) Coupling model. Among this variety of phenomena, the ionization of air in the upper layer of the atmosphere, the ionosphere, is investigated in this work. Such ionization events could be caused by possible leaking of gases from the Earth's crust, and their presence has been detected around the time of high-magnitude earthquakes by several authors. However, the spatial scale and temporal domain over which such disturbances become evident are still controversial. Although ionospheric activity can be investigated by different methodologies (satellite or terrestrial measurements), we selected the production of ionospheric maps from the analysis of GNSS (Global Navigation Satellite System) data as a possible way to detect anomalies prior to a seismic event over a wide area around the epicentre. It is well known that, in the GNSS sciences, ionospheric activity can be probed by analyzing the refraction phenomena affecting the dual-frequency signals along the satellite-to-receiver path. The analysis of refraction phenomena affecting data acquired by GNSS permanent trackers can produce daily to hourly maps representing the spatial distribution of the ionospheric Total Electron Content (TEC) as an index of the degree of ionization in the upper atmosphere. The presence of large ionospheric anomalies could therefore be interpreted within the LAI Coupling model as a precursor signal of a strong earthquake, especially when other precursors (thermal anomalies and/or gas fluxes) are also detected. In this work, a six-month-long series of ionospheric maps produced from GNSS data collected by a network of 49 GPS permanent stations distributed within an area around the city of L'Aquila (Abruzzi, Italy), where an earthquake (M = 6.3) occurred on April 6, 2009, was investigated. The proposed methodology performs a time-series analysis of the TEC maps and defines the spatial and temporal domains of ionospheric disturbances. This goal was achieved by a time-series analysis of the spatial dataset that compares a local pattern of ionospheric activity with its historical mean value and detects areas where the TEC exhibits anomalous values. This data processing shows some anomalies lasting 1 to 2 days about 20 days before the seismic event (also confirming results provided in recent studies by means of ionospheric soundings).
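The map-versus-historical-mean comparison described above can be illustrated with a simple sketch: each day's TEC map is compared with the mean of a trailing window of maps, and cells deviating by more than a few standard deviations are flagged. The window length, threshold and toy maps are assumptions, not the authors' settings.

```python
import numpy as np

def tec_anomaly_map(tec_maps, window=15, k=2.0):
    """
    tec_maps: array of shape (n_days, n_lat, n_lon) holding daily TEC maps.
    Each day is compared with the mean of the preceding `window` days;
    cells deviating by more than k standard deviations are flagged.
    The first `window` days are never flagged.
    """
    flags = np.zeros_like(tec_maps, dtype=bool)
    for d in range(window, tec_maps.shape[0]):
        hist = tec_maps[d - window:d]
        mu, sigma = hist.mean(axis=0), hist.std(axis=0) + 1e-9
        flags[d] = np.abs(tec_maps[d] - mu) > k * sigma
    return flags

# toy example: 60 daily 5x5 TEC maps with a localized disturbance on day 40
rng = np.random.default_rng(1)
maps = 20 + rng.normal(0, 1, size=(60, 5, 5))
maps[40, 2, 2] += 8.0
print("flagged cells on day 40:", np.argwhere(tec_anomaly_map(maps)[40]))
```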
A robust background regression based score estimation algorithm for hyperspectral anomaly detection
NASA Astrophysics Data System (ADS)
Zhao, Rui; Du, Bo; Zhang, Liangpei; Zhang, Lefei
2016-12-01
Anomaly detection has become a hot topic in the hyperspectral image analysis and processing fields in recent years. The most important issue for hyperspectral anomaly detection is background estimation and suppression. Unreasonable or non-robust background estimation usually leads to unsatisfactory anomaly detection results. Furthermore, the inherent nonlinearity of hyperspectral images may cover up the intrinsic data structure in the anomaly detection. In order to implement robust background estimation, as well as to explore the intrinsic data structure of the hyperspectral image, we propose a robust background regression based score estimation algorithm (RBRSE) for hyperspectral anomaly detection. The Robust Background Regression (RBR) is actually a label assignment procedure which segments the hyperspectral data into a robust background dataset and a potential anomaly dataset with an intersection boundary. In the RBR, a kernel expansion technique, which explores the nonlinear structure of the hyperspectral data in a reproducing kernel Hilbert space, is utilized to formulate the data as a density feature representation. A minimum squared loss relationship is constructed between the data density feature and the corresponding assigned labels of the hyperspectral data, to form the foundation of the regression. Furthermore, a manifold regularization term which explores the manifold smoothness of the hyperspectral data, and a maximization term of the robust background average density, which suppresses the bias caused by the potential anomalies, are jointly appended in the RBR procedure. After this, a paired-dataset based k-nn score estimation method is applied to the robust background and potential anomaly datasets to produce the detection output. The experimental results show that RBRSE achieves better ROC curves, AUC values, and background-anomaly separation than several other state-of-the-art anomaly detection methods, and is easy to implement in practice.
Christiansen, Peter; Nielsen, Lars N; Steen, Kim A; Jørgensen, Rasmus N; Karstoft, Henrik
2016-11-11
Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained for detecting and classifying a predefined set of object types. These algorithms have difficulties in detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, such as people and animals, occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks" (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45-90 m) than RCNN. RCNN has a similar performance at short range (0-30 m). However, DeepAnomaly has much fewer model parameters and (182 ms/25 ms =) a 7.28-times faster processing time per image. Unlike most CNN-based methods, the high accuracy, the low computation time and the low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit).
The use of Compton scattering in detecting anomaly in soil-possible use in pyromaterial detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abedin, Ahmad Firdaus Zainal; Ibrahim, Noorddin; Zabidi, Noriza Ahmad
Compton scattering can provide a signature for landmine detection based on the dependence between the density anomaly and the energy change of the scattered photons. In this study, 4.43 MeV gamma rays from an Am-Be source were used to perform Compton scattering. Two detectors of 1.9 cm radius were placed at a distance of 8 cm from the source. Thallium-doped sodium iodide NaI(Tl) detectors were used for detecting the gamma rays. Nine anomalies were used in this simulation. Each anomaly is a cylinder with a radius of 10 cm and a height of 8.9 cm, buried 5 cm deep in a soil bed measuring 80 cm in radius and 53.5 cm in height. Monte Carlo methods indicated that the scattering of photons is directly proportional to the density of the anomalies. The difference between the detector response with and without an anomaly, namely the contrast ratio, is in a linear relationship with the density of the anomalies. Anomalies of air, wood and water give positive contrast ratio values, whereas explosive, sand, concrete, graphite, limestone and polyethylene give negative contrast ratio values. Overall, the contrast ratio values are greater than 2% for all anomalies. The strong contrast ratios result in good detection capability and distinction between anomalies.
Analysis and interpretation of MAGSAT anomalies over north Africa
NASA Technical Reports Server (NTRS)
Phillips, R. J.
1985-01-01
Crustal anomaly detection with MAGSAT data is frustrated by the inherent resolving power of the data and by contamination from external and core fields. The quality of the data might be tested by modeling specific tectonic features which produce anomalies that fall within the proposed resolution and crustal amplitude capabilities of MAGSAT fields. To test this hypothesis, north African hotspots associated with Ahaggar, Tibesti and Darfur were modeled as magnetic induction anomalies. MAGSAT data were reduced by subtracting external and core fields to isolate scalar and vertical-component crustal signals. Of the three volcanic areas, only the Ahaggar region had an associated anomaly of magnitude above the error limits of the data. The hotspot hypothesis was tested for Ahaggar by seeing whether the predicted magnetic signal matched the MAGSAT anomaly. The predicted model magnetic signal arising from the surface topography of the uplift and the Curie isothermal surface was calculated at MAGSAT altitudes by a Fourier transform technique modified to allow for variable magnetization. The Curie isotherm surface was calculated using a method for the temperature distribution in a moving plate above a fixed hotspot. The magnetic signal was calculated for a fixed plate as well as for a number of plate velocities and directions.
NASA Astrophysics Data System (ADS)
Zhang, Haiyan; Wen, Zhiping; Wu, Renguang; Li, Xiuzhen; Chen, Ruidan
2018-05-01
The East Asian summer monsoon is affected by processes in the mid-high latitudes in addition to various tropical and subtropical systems. The present study investigates the summer sea level pressure (SLP) variability over northern East Asia (NEA) and emphasizes the closed active center over the Mongolian region. It is found that the seasonal mean Mongolian SLP (MSLP) anomaly is closely connected with the variability of summertime regional synoptic extra-tropical cyclones on longer time scales. A significant inter-decadal increase in the MSLP around the early 1990s has been detected, which is accompanied by a weakening in the activity of regional extra-tropical cyclones. Recent warming over NEA may have contributed to the inter-decadal change, which features evident meridional inhomogeneity around 45°N. The inhomogeneous air temperature anomaly distribution results in decreased vertical wind shear, reduced atmospheric baroclinicity over the Mongolian region, and thus inactive regional cyclones and increased MSLP in the latter decade. The associated temperature anomaly distribution may be partly attributed to regional inhomogeneity in cloud and radiation anomalies, and it is further maintained by two positive feedback mechanisms associated with atmospheric internal processes: one via adiabatic heating and the other via horizontal temperature advection.
Smart Grid Educational Series | Energy Systems Integration Facility | NREL
Presentations in this series cover smart grid topics from generation through transmission, all the way to the distribution infrastructure, including "Using MultiSpeak Data Model Standard & Essence Anomaly Detection for ICS".
Clustering and Recurring Anomaly Identification: Recurring Anomaly Detection System (ReADS)
NASA Technical Reports Server (NTRS)
McIntosh, Dawn
2006-01-01
This viewgraph presentation reviews the Recurring Anomaly Detection System (ReADS), a tool to analyze text reports, such as aviation reports and maintenance records: (1) text clustering algorithms group large quantities of reports and documents, reducing human error and fatigue; (2) interconnected reports are identified, automating the discovery of possible recurring anomalies; and (3) the clusters and recurring anomalies are visualized. We have illustrated our techniques on data from Shuttle and ISS discrepancy reports, as well as ASRS data. ReADS has been integrated with a secure online search
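A toy sketch of the text-clustering step (TF-IDF vectors grouped with k-means, clusters with multiple members flagged as possible recurring anomalies). ReADS's actual algorithms and report corpora are not reproduced, and the example reports are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

reports = [
    "hydraulic pump pressure dropped during ascent",
    "pressure drop observed in hydraulic pump line",
    "cabin temperature sensor reading intermittent",
    "intermittent reading from cabin temperature sensor",
    "loose fastener found on access panel",
]

# TF-IDF turns free-text reports into vectors; clustering groups similar
# reports, so clusters with several members hint at a recurring anomaly.
vecs = TfidfVectorizer(stop_words="english").fit_transform(reports)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vecs)

for c in set(labels):
    members = [r for r, l in zip(reports, labels) if l == c]
    tag = "possible recurring anomaly" if len(members) > 1 else "singleton"
    print(f"cluster {c} ({tag}): {members}")
```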
Deolia, Shravani Govind; Chhabra, Chaya; Chhabra, Kumar Gaurav; Kalghatgi, Shrivardhan; Khandelwal, Naresh
2015-01-01
Anomalies and enamel hypoplasia of the deciduous dentition are routinely encountered by dental professionals, and early detection and careful management of such conditions may help ensure normal occlusal development. The aim of this study was to determine the prevalence of hypodontia, microdontia, double teeth, and hyperdontia of deciduous teeth among Indian children. The study group comprised 1,398 children (735 boys, 633 girls). The children were examined in the Department of Pedodontics and Preventive Dentistry at Jodhpur Dental College General Hospital, Jodhpur, Rajasthan, India. Clinical data were collected by a single dentist according to the Kreiborg criteria, which include double teeth, hypodontia, microdontia, and supernumerary teeth. Statistical analysis of the data was performed using descriptive analysis and the chi-square test. Dental anomalies were found in 4% of the children. Dental anomalies were significantly more frequent (P = 0.001) in girls (5.8%, n = 38) than in boys (2.7%, n = 18). In relation to anomaly frequencies at different ages, a significant difference was found between 2 and 3 years of age (P = 0.001). Double teeth were the most frequently observed anomaly (2.3%). The other anomalies were supernumerary teeth (0.3%), microdontia (0.6%), and hypodontia (0.6%). Identification of dental anomalies at an early age is of great importance as it helps prevent malocclusion and functional and certain psychological problems.
RIDES: Robust Intrusion Detection System for IP-Based Ubiquitous Sensor Networks
Amin, Syed Obaid; Siddiqui, Muhammad Shoaib; Hong, Choong Seon; Lee, Sungwon
2009-01-01
The IP-based Ubiquitous Sensor Network (IP-USN) is an effort to build the “Internet of things”. By utilizing IP for low power networks, we can benefit from existing well established tools and technologies of IP networks. Along with many other unresolved issues, securing IP-USN is of great concern for researchers so that future market satisfaction and demands can be met. Without proper security measures, both reactive and proactive, it is hard to envisage an IP-USN realm. In this paper we present a design of an IDS (Intrusion Detection System) called RIDES (Robust Intrusion DEtection System) for IP-USN. RIDES is a hybrid intrusion detection system, which incorporates both Signature and Anomaly based intrusion detection components. For signature based intrusion detection this paper only discusses the implementation of distributed pattern matching algorithm with the help of signature-code, a dynamically created attack-signature identifier. Other aspects, such as creation of rules are not discussed. On the other hand, for anomaly based detection we propose a scoring classifier based on the SPC (Statistical Process Control) technique called CUSUM charts. We also investigate the settings and their effects on the performance of related parameters for both of the components. PMID:22412321
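The anomaly-based component above scores traffic with CUSUM control charts (an SPC technique); a minimal one-sided CUSUM sketch follows. The in-control parameters, allowance and decision threshold are illustrative assumptions, not values from RIDES.

```python
import numpy as np

def cusum_scores(x, mu0, sigma, k=0.5, h=5.0):
    """
    One-sided CUSUM chart on a per-node traffic statistic (e.g. packets/s).
    mu0, sigma: in-control mean and standard deviation;
    k: allowance (in sigmas), h: decision threshold (in sigmas).
    Returns the CUSUM statistic and a boolean alarm array.
    """
    z = (np.asarray(x, dtype=float) - mu0) / sigma
    s = np.zeros(len(z))
    for t in range(1, len(z)):
        s[t] = max(0.0, s[t - 1] + z[t] - k)
    return s, s > h

# toy stream: normal rate around 50 pkt/s, then a sustained upward shift
rng = np.random.default_rng(2)
traffic = np.concatenate([rng.normal(50, 5, 200), rng.normal(65, 5, 50)])
s, alarm = cusum_scores(traffic, mu0=50.0, sigma=5.0)
print("first alarm at sample:", int(np.argmax(alarm)) if alarm.any() else None)
```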
Overton, Jr., William C.; Steyert, Jr., William A.
1984-01-01
A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.
Network anomaly detection system with optimized DS evidence theory.
Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu
2014-01-01
Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have applied fusion methods and DS evidence theory to network anomaly detection, but with low performance, and they did not account for the complicated and varied nature of network features. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add weights for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs each sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that RBPA as well as the ODS optimization method can improve system performance significantly.
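The core fusion step, Dempster's rule of combination with reliability-weighted (discounted) basic probability assignments over the two-element frame {normal, anomaly}, can be sketched as follows. The discounting scheme and the numbers are illustrative assumptions rather than the paper's exact ODS/RBPA formulation.

```python
def discount(m, w):
    """Shafer discounting: scale focal masses by the sensor reliability w,
    moving the remainder to total ignorance (the whole frame {N, A})."""
    return {"N": w * m["N"], "A": w * m["A"],
            "NA": 1.0 - w + w * m["NA"]}

def combine(m1, m2):
    """Dempster's rule of combination on the frame {Normal, Anomaly}."""
    conflict = m1["N"] * m2["A"] + m1["A"] * m2["N"]
    norm = 1.0 - conflict
    return {
        "N":  (m1["N"] * m2["N"] + m1["N"] * m2["NA"] + m1["NA"] * m2["N"]) / norm,
        "A":  (m1["A"] * m2["A"] + m1["A"] * m2["NA"] + m1["NA"] * m2["A"]) / norm,
        "NA": (m1["NA"] * m2["NA"]) / norm,
    }

# two detectors report basic probability assignments; the weights reflect
# each detector's past accuracy (the optimization idea described above)
sensor1 = discount({"N": 0.2, "A": 0.7, "NA": 0.1}, w=0.9)   # historically accurate
sensor2 = discount({"N": 0.6, "A": 0.3, "NA": 0.1}, w=0.5)   # less reliable
fused = combine(sensor1, sensor2)
print(fused, "-> anomaly" if fused["A"] > fused["N"] else "-> normal")
```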
Distribution of the Crustal Magnetic Field in Sichuan-Yunnan Region, Southwest China
Bai, Chunhua; Kang, Guofa; Gao, Guoming
2014-01-01
Based on the new and higher-degree geomagnetic model NGDC-720-V3, we have investigated the spatial distribution and altitude decay characteristics of the crustal magnetic anomaly, the contributions from different wavelength bands to the anomaly, and the relationship among the anomaly, the geological structure, and the geophysical field in the Sichuan-Yunnan region of China. The most outstanding feature in this area is the strong positive magnetic anomaly in the Sichuan Basin, a geologically stable block. Contrasting with this feature, a strong negative anomaly can be seen nearby in the Longmen Mountain block, an active block. This contrast implies a possible relationship between the magnetic field and geological activity. A completely different feature of the magnetic field distribution is seen in the central Yunnan block, another active region, where positive and negative anomalies alternate, producing a complex magnetic anomaly map. Some fault belts, such as the Longmen Mountain fault, the Lijiang-Xiaojinhe fault, and the Red River fault, are transitional zones between strong and weak or negative and positive anomalies. The corresponding relationship between the magnetic anomaly and the geophysical fields was confirmed. PMID:25243232
Turtle Carapace Anomalies: The Roles of Genetic Diversity and Environment
Velo-Antón, Guillermo; Becker, C. Guilherme; Cordero-Rivera, Adolfo
2011-01-01
Background: Phenotypic anomalies are common in wild populations and multiple genetic, biotic and abiotic factors might contribute to their formation. Turtles are excellent models for the study of developmental instability because anomalies are easily detected in the form of malformations, additions, or reductions in the number of scutes or scales. Methodology/Principal Findings: In this study, we integrated field observations, manipulative experiments, and climatic and genetic approaches to investigate the origin of carapace scute anomalies across Iberian populations of the European pond turtle, Emys orbicularis. The proportion of anomalous individuals varied from 3% to 69% in local populations, with increasing frequency of anomalies in northern regions. We found no significant effect of climatic or soil moisture, or of climatic temperature, on the occurrence of anomalies. However, lower genetic diversity and inbreeding were good predictors of the prevalence of scute anomalies among populations. Both the decreasing genetic diversity and the increasing proportion of anomalous individuals in northern parts of the Iberian distribution may be linked to recolonization events from the Southern Pleistocene refugium. Conclusions/Significance: Overall, our results suggest that developmental instability in turtle carapace formation might be caused, at least in part, by genetic factors, although the influence of environmental factors affecting the developmental stability of the turtle carapace cannot be ruled out. Further studies of the effects of environmental factors, pollutants and the heritability of anomalies would be useful to better understand the complex origin of anomalies in natural populations. PMID:21533278
NASA Astrophysics Data System (ADS)
Husin, Shuib; Afiq Pauzi, Ahmad; Yunus, Salmi Mohd; Ghafar, Mohd Hafiz Abdul; Adilin Sekari, Saiful
2017-10-01
This technical paper demonstrates the successful application of the self-magnetic leakage field (SMLF) technique in detecting anomalies in the weldment of a thick (1 inch) P91 joint. Boiler components such as boiler tubes and penthouse stubs, and energy piping such as the hot reheat pipe (HRP) and the H-balance piping to the turbine, are made of P91 material. P91 is a ferromagnetic material; therefore, the SMLF technique is applicable for detecting anomalies within the material (internal defects). The technique is categorized as a non-destructive technique (NDT). It is the second passive method after acoustic emission (AE), in which information radiated by the structure (magnetic fields and energy waves) is used. The measured magnetic leakage field of a product or component is a magnetic leakage field occurring on the component's surface in the zone of stable dislocation slip bands under the influence of operational (in-service) or residual stresses, or in zones of maximum inhomogeneity of the metal structure in new products or components. Inter-granular and trans-granular cracks, inclusions, voids, cavities and corrosion are types of inhomogeneity and discontinuity in the material for which a magnetic leakage field output will be shown when using this technique. The technique does not require surface preparation of the component to be inspected. This is a contact-type inspection, which means the sensor has to touch or be in contact with the component's surface during inspection. The results of applying the SMLF technique to the developed P91 reference blocks demonstrate that the technique is practical for anomaly inspection and detection as well as for identifying anomaly locations. The evaluation of this passive SMLF technique has been verified by other conventional non-destructive tests (NDTs) on the reference blocks, in which simulated defects/anomalies had been developed inside the weldment. The results from the inspection test showed that the magnetic leakage field gradient distribution peaks at the location of the defect/anomaly in the reference block, in agreement with the evidence of the anomaly seen in the radiography test (RT) film.
Communication: On the isotope anomaly of nuclear quadrupole coupling in molecules
NASA Astrophysics Data System (ADS)
Filatov, Michael; Zou, Wenli; Cremer, Dieter
2012-10-01
The dependence of the nuclear quadrupole coupling constants (NQCC) on the interaction between electrons and a nucleus of finite size is theoretically analyzed. A deviation of the ratio of the NQCCs obtained from two different isotopomers of a molecule from the ratio of the corresponding bare nuclear electric quadrupole moments, known as quadrupole anomaly, is interpreted in terms of the logarithmic derivatives of the electric field gradient at the nuclear site with respect to the nuclear charge radius. Quantum chemical calculations based on a Dirac-exact relativistic methodology suggest that the effect of the changing size of the Au nucleus in different isotopomers can be observed for Au-containing molecules, for which the predicted quadrupole anomaly reaches values of the order of 0.1%. This is experimentally detectable and provides an insight into the charge distribution of non-spherical nuclei.
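For orientation, the quadrupole anomaly discussed above is commonly written in a form analogous to the hyperfine anomaly. The notation below is an illustrative sketch consistent with the abstract's description, not an equation quoted from the paper.

```latex
% Illustrative notation (assumed, not taken verbatim from the paper):
% for two isotopes 1 and 2 of the same element in the same molecule, the
% quadrupole anomaly {}^{1}\Delta^{2} measures how far the ratio of measured
% NQCCs departs from the ratio of the bare nuclear quadrupole moments.
\[
\frac{(e^{2} q Q)_{1}}{(e^{2} q Q)_{2}}
  = \frac{Q_{1}}{Q_{2}}\,\bigl(1 + {}^{1}\Delta^{2}\bigr),
\qquad
{}^{1}\Delta^{2} \approx \frac{\partial \ln q}{\partial R}\,(R_{1} - R_{2}),
\]
% where e^{2}qQ denotes the NQCC, q the electric field gradient at the
% nuclear site, and R the nuclear charge radius, reflecting the logarithmic
% derivative of the EFG with respect to the nuclear charge radius mentioned
% in the abstract.
```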
Real-time anomaly detection for very short-term load forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Jian; Hong, Tao; Yue, Meng
Although the recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and timely for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. This paper proposes a model-based anomaly detection method that consists of two components, a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.
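A hedged sketch of the two ingredients named above, a dynamic regression on recent lags plus an adaptive (rolling, robust) residual threshold; the lag set, window length and threshold multiplier are assumptions, and the authors' exact model specification is not reproduced.

```python
import numpy as np

def detect_load_anomalies(load, window=96, lags=(1, 2, 3), k=4.0):
    """
    Rolling dynamic regression of the current load on a few recent lags,
    with an adaptive threshold of k robust standard deviations of the
    recent one-step residuals. Returns the indices flagged as anomalous.
    Assumes `load` is a 1-D array sampled at a fixed interval.
    """
    load = np.asarray(load, dtype=float)
    p = max(lags)
    flagged = []
    for t in range(p + window, len(load)):
        rows = range(t - window, t)
        X = np.column_stack([[load[i - l] for i in rows] for l in lags] + [np.ones(window)])
        y = load[list(rows)]
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        thr = k * 1.4826 * np.median(np.abs(resid - np.median(resid)))  # adaptive threshold
        pred = np.array([load[t - l] for l in lags] + [1.0]) @ beta
        if abs(load[t] - pred) > thr:
            flagged.append(t)
    return flagged

# toy load series with one corrupted telemetry value
rng = np.random.default_rng(3)
t = np.arange(600)
load = 100 + 20 * np.sin(2 * np.pi * t / 96) + rng.normal(0, 1, t.size)
load[400] = 40.0
print("flagged samples:", detect_load_anomalies(load))
```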
Using statistical anomaly detection models to find clinical decision support malfunctions.
Ray, Soumi; McEvoy, Dustin S; Aaron, Skye; Hickman, Thu-Trang; Wright, Adam
2018-05-11
Malfunctions in Clinical Decision Support (CDS) systems occur due to a multitude of reasons, and often go unnoticed, leading to potentially poor outcomes. Our goal was to identify malfunctions within CDS systems. We evaluated 6 anomaly detection models: (1) Poisson Changepoint Model, (2) Autoregressive Integrated Moving Average (ARIMA) Model, (3) Hierarchical Divisive Changepoint (HDC) Model, (4) Bayesian Changepoint Model, (5) Seasonal Hybrid Extreme Studentized Deviate (SHESD) Model, and (6) E-Divisive with Median (EDM) Model and characterized their ability to find known anomalies. We analyzed 4 CDS alerts with known malfunctions from the Longitudinal Medical Record (LMR) and Epic® (Epic Systems Corporation, Madison, WI, USA) at Brigham and Women's Hospital, Boston, MA. The 4 rules recommend lead testing in children, aspirin therapy in patients with coronary artery disease, pneumococcal vaccination in immunocompromised adults and thyroid testing in patients taking amiodarone. Poisson changepoint, ARIMA, HDC, Bayesian changepoint and the SHESD model were able to detect anomalies in an alert for lead screening in children and in an alert for pneumococcal conjugate vaccine in immunocompromised adults. EDM was able to detect anomalies in an alert for monitoring thyroid function in patients on amiodarone. Malfunctions/anomalies occur frequently in CDS alert systems. It is important to be able to detect such anomalies promptly. Anomaly detection models are useful tools to aid such detections.
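As an example of the simplest of the listed models, a single Poisson changepoint search over daily alert-firing counts might look like the following sketch. It is illustrative only; the study's implementations and data are not reproduced.

```python
import numpy as np
from scipy.stats import poisson

def poisson_changepoint(counts):
    """
    Single-changepoint search over daily CDS alert-firing counts: pick the
    split maximizing the two-segment Poisson log-likelihood and compare it
    with the no-change model via a likelihood-ratio statistic.
    """
    counts = np.asarray(counts)
    n = len(counts)
    ll0 = poisson.logpmf(counts, counts.mean()).sum()
    best_tau, best_ll = None, -np.inf
    for tau in range(2, n - 2):
        left, right = counts[:tau], counts[tau:]
        ll = (poisson.logpmf(left, left.mean()).sum()
              + poisson.logpmf(right, right.mean()).sum())
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau, 2 * (best_ll - ll0)   # changepoint index, LR statistic

# toy alert counts: the rule silently breaks around day 60 and fires far less often
rng = np.random.default_rng(4)
counts = np.concatenate([rng.poisson(30, 60), rng.poisson(8, 40)])
tau, lr = poisson_changepoint(counts)
print(f"changepoint near day {tau}, likelihood-ratio statistic {lr:.1f}")
```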
22nd Annual Logistics Conference and Exhibition
2006-04-20
Includes the presentation "Prognostics & Health Management at GE" by Dr. Piero P. Bonissone, Industrial AI Lab, GE Global Research, covering anomaly detection from event-log data, failure-mode histograms, failure monitoring and assessment, and diagnostics, prognostics and health management for tactical C4ISR.
Li, Gang; He, Bin; Huang, Hongwei; Tang, Limin
2016-01-01
The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most of the existing works based on the spatial–temporal correlation can be divided into two parts: redundancy reduction and anomaly detection. These two parts are pursued separately in existing works. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS can reduce data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on long and linear cluster structure in the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented in a cooperative way in the cluster. To keep the precision of sensor data, spatial data-driven anomaly detection based on the spatial correlation and Kriging method is realized to generate an anomaly indicator. The experiment results show that cooperative TDSS can realize non-uniform sensing effectively to reduce the energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data. PMID:27690035
Pre-seismic anomalies from optical satellite observations: a review
NASA Astrophysics Data System (ADS)
Jiao, Zhong-Hu; Zhao, Jing; Shan, Xinjian
2018-04-01
Detecting various anomalies using optical satellite data prior to strong earthquakes is key to understanding and forecasting earthquake activities because of its recognition of thermal-radiation-related phenomena in seismic preparation phases. Data from satellite observations serve as a powerful tool in monitoring earthquake preparation areas at a global scale and in a nearly real-time manner. Over the past several decades, many new different data sources have been utilized in this field, and progressive anomaly detection approaches have been developed. This paper reviews the progress and development of pre-seismic anomaly detection technology in this decade. First, precursor parameters, including parameters from the top of the atmosphere, in the atmosphere, and on the Earth's surface, are stated and discussed. Second, different anomaly detection methods, which are used to extract anomalous signals that probably indicate future seismic events, are presented. Finally, certain critical problems with the current research are highlighted, and new developing trends and perspectives for future work are discussed. The development of Earth observation satellites and anomaly detection algorithms can enrich available information sources, provide advanced tools for multilevel earthquake monitoring, and improve short- and medium-term forecasting, which play a large and growing role in pre-seismic anomaly detection research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, X; Liu, S; Kalet, A
Purpose: The purpose of this work was to investigate the ability of a machine-learning based probabilistic approach to detect radiotherapy treatment plan anomalies given initial disease class information. Methods: In total, we obtained 1112 unique treatment plans with five plan parameters and disease information from a Mosaiq treatment management system database for use in the study. The plan parameters include prescription dose, fractions, fields, modality and technique. The disease information includes disease site, and T, M and N disease stages. A Bayesian network method was employed to model the probabilistic relationships between tumor disease information, plan parameters and an anomaly flag. A Bayesian learning method with a Dirichlet prior was used to learn the joint probabilities between dependent variables in error-free plan data and data with artificially induced anomalies. In the study, we randomly sampled data with anomalies in a specified anomaly space. We tested the approach with three groups of plan anomalies: improper concurrence of values of all five plan parameters, improper concurrence of values of any two out of five parameters, and all single plan parameter value anomalies. In total, 16 types of plan anomalies were covered by the study. For each type, we trained an individual Bayesian network. Results: We found that the true positive rate (recall) and positive predictive value (precision) to detect concurrence anomalies of five plan parameters in new patient cases were 94.45±0.26% and 93.76±0.39%, respectively. To detect the other 15 types of plan anomalies, the average recall and precision were 93.61±2.57% and 93.78±3.54%, respectively. The computation time to detect the plan anomaly of each type in a new plan is ∼0.08 seconds. Conclusion: The proposed method for treatment plan anomaly detection was found effective in the initial tests. The results suggest that this type of model could be applied to develop plan anomaly detection tools to assist manual and automated plan checks. The senior author received research grants from ViewRay Inc. and Varian Medical System.
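The full Bayesian-network construction is not reproduced here. As a drastically simplified stand-in capturing the Dirichlet-smoothing idea, one can score how probable a plan-parameter combination is given the disease site and flag low-probability combinations. All names and dose/fractionation values below are illustrative assumptions, not clinical guidance.

```python
from collections import Counter, defaultdict

def train(plans, alpha=1.0):
    """
    plans: list of (disease_site, plan_combo) pairs, where plan_combo is a
    tuple such as (prescription_dose, fractions, technique). Returns a scorer
    giving the Dirichlet-smoothed conditional probability
    P(plan_combo | disease_site); a low score suggests a possible anomaly.
    """
    by_site = defaultdict(Counter)
    combos = set()
    for site, combo in plans:
        by_site[site][combo] += 1
        combos.add(combo)

    def score(site, combo):
        counts = by_site[site]
        total = sum(counts.values())
        k = len(combos) + 1                 # +1 leaves room for unseen combos
        return (counts[combo] + alpha) / (total + alpha * k)

    return score

# invented historical plans (values are illustrative only)
history = [("prostate", (78.0, 39, "IMRT"))] * 40 + \
          [("prostate", (36.25, 5, "SBRT"))] * 10 + \
          [("lung",     (60.0, 30, "IMRT"))] * 30
score = train(history)
print("typical prostate plan:   ", score("prostate", (78.0, 39, "IMRT")))
print("suspicious prostate plan:", score("prostate", (60.0, 30, "IMRT")))
```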
Automated Network Anomaly Detection with Learning, Control and Mitigation
ERIC Educational Resources Information Center
Ippoliti, Dennis
2014-01-01
Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…
Systematic Screening for Subtelomeric Anomalies in a Clinical Sample of Autism
ERIC Educational Resources Information Center
Wassink, Thomas H.; Losh, Molly; Piven, Joseph; Sheffield, Val C.; Ashley, Elizabeth; Westin, Erik R.; Patil, Shivanand R.
2007-01-01
High-resolution karyotyping detects cytogenetic anomalies in 5-10% of cases of autism. Karyotyping, however, may fail to detect abnormalities of chromosome subtelomeres, which are gene rich regions prone to anomalies. We assessed whether panels of FISH probes targeted for subtelomeres could detect abnormalities beyond those identified by…
Topological anomaly detection performance with multispectral polarimetric imagery
NASA Astrophysics Data System (ADS)
Gartley, M. G.; Basener, W.
2009-05-01
Polarimetric imaging has demonstrated utility for increasing contrast of manmade targets above natural background clutter. Manual detection of manmade targets in multispectral polarimetric imagery can be challenging and a subjective process for large datasets. Analyst exploitation may be improved utilizing conventional anomaly detection algorithms such as RX. In this paper we examine the performance of a relatively new approach to anomaly detection, which leverages topology theory, applied to spectral polarimetric imagery. Detection results for manmade targets embedded in a complex natural background will be presented for both the RX and Topological Anomaly Detection (TAD) approaches. We will also present detailed results examining detection sensitivities relative to: (1) the number of spectral bands, (2) utilization of Stoke's images versus intensity images, and (3) airborne versus spaceborne measurements.
Quantum machine learning for quantum anomaly detection
NASA Astrophysics Data System (ADS)
Liu, Nana; Rebentrost, Patrick
2018-04-01
Anomaly detection is used for identifying data that deviate from "normal" data patterns. Its usage on classical data finds diverse applications in many important areas such as finance, fraud detection, medical diagnoses, data cleaning, and surveillance. With the advent of quantum technologies, anomaly detection of quantum data, in the form of quantum states, may become an important component of quantum applications. Machine-learning algorithms are playing pivotal roles in anomaly detection using classical data. Two widely used algorithms are the kernel principal component analysis and the one-class support vector machine. We find corresponding quantum algorithms to detect anomalies in quantum states. We show that these two quantum algorithms can be performed using resources that are logarithmic in the dimensionality of quantum states. For pure quantum states, these resources can also be logarithmic in the number of quantum states used for training the machine-learning algorithm. This makes these algorithms potentially applicable to big quantum data applications.
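For intuition about one of the two classical counterparts mentioned above, here is a scikit-learn one-class SVM sketch on toy data; it illustrates only the classical algorithm and says nothing about the quantum resource analysis in the paper.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(5)

# "normal" training data: samples from a single distribution
X_train = rng.normal(0, 1, size=(500, 4))

# test data: mostly normal, plus a few points far from the training cloud
X_test = np.vstack([rng.normal(0, 1, size=(10, 4)),
                    rng.normal(6, 1, size=(3, 4))])

clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(X_train)
print(clf.predict(X_test))   # +1 = consistent with training data, -1 = anomaly
```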
Detection of Spiroplasma and Wolbachia in the bacterial gonad community of Chorthippus parallelus.
Martínez-Rodríguez, P; Hernández-Pérez, M; Bella, J L
2013-07-01
We have recently detected the endosymbiont Wolbachia in multiple individuals and populations of the grasshopper Chorthippus parallelus (Orthoptera: acrididae). This bacterium induces reproductive anomalies, including cytoplasmic incompatibility. Such incompatibilities may help explain the maintenance of two distinct subspecies of this grasshopper, C. parallelus parallelus and C. parallelus erythropus, which are involved in a Pyrenean hybrid zone that has been extensively studied for the past 20 years, becoming a model system for the study of genetic divergence and speciation. To evaluate whether Wolbachia is the sole bacterial infection that might induce reproductive anomalies, the gonadal bacterial community of individuals from 13 distinct populations of C. parallelus was determined by denaturing gradient gel electrophoresis analysis of bacterial 16S rRNA gene fragments and sequencing. The study revealed low bacterial diversity in the gonads: a persistent bacterial trio consistent with Spiroplasma sp. and the two previously described supergroups of Wolbachia (B and F) dominated the gonad microbiota. A further evaluation of the composition of the gonad bacterial communities was carried out by whole cell hybridization. Our results confirm previous studies of the cytological distribution of Wolbachia in C. parallelus gonads and show a homogeneous infection by Spiroplasma. Spiroplasma and Wolbachia cooccurred in some individuals, but there was no significant association of Spiroplasma with a grasshopper's sex or with Wolbachia infection, although subtle trends might be detected with a larger sample size. This information, together with previous experimental crosses of this grasshopper, suggests that Spiroplasma is unlikely to contribute to sex-specific reproductive anomalies; instead, they implicate Wolbachia as the agent of the observed anomalies in C. parallelus.
Detection of Low Temperature Volcanogenic Thermal Anomalies with ASTER
NASA Astrophysics Data System (ADS)
Pieri, D. C.; Baxter, S.
2009-12-01
Predicting volcanic eruptions is a thorny problem, as volcanoes typically exhibit idiosyncratic waxing and/or waning pre-eruption emission, geodetic, and seismic behavior. It is no surprise that increasing our accuracy and precision in eruption prediction depends on assessing the time-progressions of all relevant precursor geophysical, geochemical, and geological phenomena, and on more frequently observing volcanoes when they become restless. The ASTER instrument on the NASA Terra Earth Observing System satellite in low earth orbit provides important capabilities in the area of detection of volcanogenic anomalies such as thermal precursors and increased passive gas emissions. Its unique high spatial resolution multi-spectral thermal IR imaging data (90m/pixel; 5 bands in the 8-12um region), bore-sighted with visible and near-IR imaging data, and combined with off-nadir pointing and stereo-photogrammetric capabilities make ASTER a potentially important volcanic precursor detection tool. We are utilizing the JPL ASTER Volcano Archive (http://ava.jpl.nasa.gov) to systematically examine 80,000+ ASTER volcano images to analyze (a) thermal emission baseline behavior for over 1500 volcanoes worldwide, (b) the form and magnitude of time-dependent thermal emission variability for these volcanoes, and (c) the spatio-temporal limits of detection of pre-eruption temporal changes in thermal emission in the context of eruption precursor behavior. We are creating and analyzing a catalog of the magnitude, frequency, and distribution of volcano thermal signatures worldwide as observed from ASTER since 2000 at 90m/pixel. Of particular interest as eruption precursors are small low contrast thermal anomalies of low apparent absolute temperature (e.g., melt-water lakes, fumaroles, geysers, grossly sub-pixel hotspots), for which the signal-to-noise ratio may be marginal (e.g., scene confusion due to clouds, water and water vapor, fumarolic emissions, variegated ground emissivity, and their combinations). To systematically detect such intrinsically difficult anomalies within our large archive, we are exploring a four step approach: (a) the recursive application of a GPU-accelerated, edge-preserving bilateral filter prepares a thermal image by removing noise and fine detail; (b) the resulting stylized filtered image is segmented by a path-independent region-growing algorithm, (c) the resulting segments are fused based on thermal affinity, and (d) fused segments are subjected to thermal and geographical tests for hotspot detection and classification, to eliminate false alarms or non-volcanogenic anomalies. We will discuss our progress in creating the general thermal anomaly catalog as well as algorithm approach and results. This work was carried out at the Jet Propulsion Laboratory of the California Institute of Technology under contract to NASA.
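The four-step pipeline above (bilateral filtering, region growing, segment fusion, thresholded tests) is not reproduced here. As a much simpler stand-in for low-contrast hotspot screening, one can subtract a robust local background and group the exceeding pixels into connected components. Window size, threshold and the toy scene are assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter, label

def detect_hotspots(bt, size=15, k=5.0):
    """
    bt: 2-D brightness-temperature image (e.g. a thermal-IR band, in kelvin).
    Estimate the local background with a median filter, flag pixels exceeding
    it by k robust standard deviations, and group flagged pixels into
    connected components (candidate thermal anomalies).
    """
    background = median_filter(bt, size=size)
    resid = bt - background
    mad = np.median(np.abs(resid - np.median(resid))) + 1e-9
    mask = resid > k * 1.4826 * mad
    labels, n = label(mask)
    return labels, n

# toy scene: 280 K background with noise and a warm 3x3 patch
rng = np.random.default_rng(6)
scene = 280 + rng.normal(0, 0.5, size=(100, 100))
scene[40:43, 60:63] += 15.0
labels, n = detect_hotspots(scene)
print("candidate hotspots:", n)
```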
D'Antonio, F; Khalil, A; Garel, C; Pilu, G; Rizzo, G; Lerman-Sagie, T; Bhide, A; Thilaganathan, B; Manzoli, L; Papageorghiou, A T
2016-06-01
To explore the outcome in fetuses with prenatal diagnosis of posterior fossa anomalies apparently isolated on ultrasound imaging. MEDLINE and EMBASE were searched electronically utilizing combinations of relevant medical subject headings for 'posterior fossa' and 'outcome'. The posterior fossa anomalies analyzed were Dandy-Walker malformation (DWM), mega cisterna magna (MCM), Blake's pouch cyst (BPC) and vermian hypoplasia (VH). The outcomes observed were rate of chromosomal abnormalities, additional anomalies detected at prenatal magnetic resonance imaging (MRI), additional anomalies detected at postnatal imaging and concordance between prenatal and postnatal diagnoses. Only isolated cases of posterior fossa anomalies - defined as having no cerebral or extracerebral additional anomalies detected on ultrasound examination - were included in the analysis. Quality assessment of the included studies was performed using the Newcastle-Ottawa Scale for cohort studies. We used meta-analyses of proportions to combine data and fixed- or random-effects models according to the heterogeneity of the results. Twenty-two studies including 531 fetuses with posterior fossa anomalies were included in this systematic review. The prevalence of chromosomal abnormalities in fetuses with isolated DWM was 16.3% (95% CI, 8.7-25.7%). The prevalence of additional central nervous system (CNS) abnormalities that were missed at ultrasound examination and detected only at prenatal MRI was 13.7% (95% CI, 0.2-42.6%), and the prevalence of additional CNS anomalies that were missed at prenatal imaging and detected only after birth was 18.2% (95% CI, 6.2-34.6%). Prenatal diagnosis was not confirmed after birth in 28.2% (95% CI, 8.5-53.9%) of cases. MCM was not significantly associated with additional anomalies detected at prenatal MRI or after birth. Prenatal diagnosis was not confirmed postnatally in 7.1% (95% CI, 2.3-14.5%) of cases. The rate of chromosomal anomalies in fetuses with isolated BPC was 5.2% (95% CI, 0.9-12.7%) and there was no associated CNS anomaly detected at prenatal MRI or only after birth. Prenatal diagnosis of BPC was not confirmed after birth in 9.8% (95% CI, 2.9-20.1%) of cases. The rate of chromosomal anomalies in fetuses with isolated VH was 6.5% (95% CI, 0.8-17.1%) and there were no additional anomalies detected at prenatal MRI (0% (95% CI, 0.0-45.9%)). The proportion of cerebral anomalies detected only after birth was 14.2% (95% CI, 2.9-31.9%). Prenatal diagnosis was not confirmed after birth in 32.4% (95% CI, 18.3-48.4%) of cases. DWM apparently isolated on ultrasound imaging is a condition with a high risk for chromosomal and associated structural anomalies. Isolated MCM and BPC have a low risk for aneuploidy or associated structural anomalies. The small number of cases with isolated VH prevents robust conclusions regarding their management from being drawn. Copyright © 2015 ISUOG. Published by John Wiley & Sons Ltd.
Ristivojević, Andjelka; Djokić, Petra Lukić; Katanić, Dragan; Dobanovacki, Dušanka; Privrodski, Jadranka Jovanović
2016-05-01
According to the World Health Organization (WHO) definition, congenital anomalies are all disorders of the organs or tissues, regardless of whether they are visible at birth or manifest later in life, and are registered in the International Classification of Diseases. The aim of this study was to compare the incidence and structure of prenatally detected and clinically manifested congenital anomalies in newborns in the region of Novi Sad (Province of Vojvodina, Serbia) in two distant years (1996 and 2006). This retrospective cohort study included all the children born at the Clinic for Gynecology and Obstetrics (Clinical Center of Vojvodina) in Novi Sad during 1996 and 2006. The incidence and the structure of congenital anomalies were analyzed. During 1996 there were 6,099 births and major congenital anomalies were found in 215 infants, representing 3.5%. In 2006 there were 6,628 births and major congenital anomalies were noted in 201 newborns, which is 3%. During 1996 there were more children with anomalies of the musculoskeletal system, the urogenital tract and the central nervous system, as well as chromosomal abnormalities. During 2006 there were more children with cardiovascular anomalies, followed by urogenital anomalies, with a significant decline in musculoskeletal anomalies. The distribution of the newborns with major congenital anomalies regarding perinatal outcome showed a difference between the studied years. In 2006 an increasing number of children required further investigation and treatment. There is no national registry of congenital anomalies in Serbia, so a further aim of this study was to shed light on this topic. Over the ten-year span, covering the period of the NATO campaign in Novi Sad and Serbia, the frequency of major congenital anomalies in newborns did not increase. The most frequent anomalies observed in both years involved the musculoskeletal, cardiovascular, urogenital and central nervous systems. In 2006 there was a significant increase in cardiovascular anomalies and a significant decrease in musculoskeletal anomalies, chromosomal abnormalities and central nervous system anomalies, while the number of urogenital anomalies declined compared to 1996.
SU-G-JeP4-03: Anomaly Detection of Respiratory Motion by Use of Singular Spectrum Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kotoku, J; Kumagai, S; Nakabayashi, S
Purpose: The implementation and realization of automatic anomaly detection of respiratory motion is a very important technique to prevent accidental damage during radiation therapy. Here, we propose an automatic anomaly detection method using singular spectrum analysis. Methods: The anomaly detection procedure consists of four parts: 1) measurement of normal respiratory motion data of a patient; 2) calculation of a trajectory matrix representing the normal time-series features; 3) real-time monitoring and calculation of a trajectory matrix of the real-time data; and 4) calculation of an anomaly score from the similarity of the two feature matrices. Patient motion was observed by a marker-less tracking system using a depth camera. Results: Two types of motion, e.g., coughing and a sudden stop of breathing, were successfully detected in our real-time application. Conclusion: Automatic anomaly detection of respiratory motion using singular spectrum analysis was successful for coughing and sudden stops of breathing. The clinical use of this algorithm is promising. This work was supported by JSPS KAKENHI Grant Number 15K08703.
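As a rough illustration of the trajectory-matrix idea described above, the sketch below scores a test window of a respiratory trace by the principal angles between its singular-spectrum subspace and that of normal breathing. The window length, subspace rank, score definition and the toy signals are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch: singular-spectrum-analysis style anomaly scoring for a
# 1-D respiratory trace. Window length L and rank r are illustrative.
import numpy as np

def trajectory_matrix(x, L):
    """Hankel (trajectory) matrix with L-sample columns."""
    K = len(x) - L + 1
    return np.column_stack([x[i:i + L] for i in range(K)])

def normal_subspace(x_normal, L=50, r=3):
    """Leading left singular vectors of the normal-breathing trajectory matrix."""
    U, _, _ = np.linalg.svd(trajectory_matrix(x_normal, L), full_matrices=False)
    return U[:, :r]

def anomaly_score(x_test, U_normal, L=50, r=3):
    """1 minus the cosine of the largest principal angle between the subspaces."""
    Ut, _, _ = np.linalg.svd(trajectory_matrix(x_test, L), full_matrices=False)
    s = np.linalg.svd(U_normal.T @ Ut[:, :r], compute_uv=False)
    return 1.0 - s.min()          # near 0 for normal motion, larger for anomalies

# toy usage: regular breathing vs. a sudden stop of breathing
t = np.arange(0, 60, 0.1)
normal = np.sin(2 * np.pi * 0.25 * t)
U = normal_subspace(normal)
stopped = np.r_[np.sin(2 * np.pi * 0.25 * t[:300]), np.zeros(300)]
print(anomaly_score(normal[-200:], U), anomaly_score(stopped[-200:], U))
```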
More About Software for No-Loss Computing
NASA Technical Reports Server (NTRS)
Edmonds, Iarina
2007-01-01
A document presents some additional information on the subject matter of "Integrated Hardware and Software for No-Loss Computing" (NPO-42554), which appears elsewhere in this issue of NASA Tech Briefs. To recapitulate: The hardware and software designs of a developmental parallel computing system are integrated to effectuate a concept of no-loss computing (NLC). The system is designed to reconfigure an application program such that it can be monitored in real time and further reconfigured to continue a computation in the event of failure of one of the computers. The design provides for (1) a distributed class of NLC computation agents, denoted introspection agents, that effects hierarchical detection of anomalies; (2) enhancement of the compiler of the parallel computing system to cause generation of state vectors that can be used to continue a computation in the event of a failure; and (3) activation of a recovery component when an anomaly is detected.
Hu, Erzhong; Nosato, Hirokazu; Sakanashi, Hidenori; Murakawa, Masahiro
2013-01-01
Capsule endoscopy is a patient-friendly form of endoscopy widely used in gastrointestinal examination. However, the efficacy of diagnosis is restricted by the large quantity of images. This paper presents a modified anomaly detection method, by which both known and unknown anomalies in capsule endoscopy images of the small intestine are expected to be detected. To achieve this goal, this paper introduces feature extraction using a non-linear color conversion and Higher-order Local Auto-Correlation (HLAC) features, and makes use of image partitioning and a subspace method for anomaly detection. Experiments were carried out on several major anomalies with combinations of the proposed techniques. As a result, the proposed method achieved 91.7% and 100% detection accuracy for swelling and bleeding, respectively, demonstrating its effectiveness.
Unsupervised Ensemble Anomaly Detection Using Time-Periodic Packet Sampling
NASA Astrophysics Data System (ADS)
Uchida, Masato; Nawata, Shuichi; Gu, Yu; Tsuru, Masato; Oie, Yuji
We propose an anomaly detection method for finding patterns in network traffic that do not conform to legitimate (i.e., normal) behavior. The proposed method trains a baseline model describing the normal behavior of network traffic without using manually labeled traffic data. The trained baseline model is used as the basis for comparison with the audit network traffic. This anomaly detection works in an unsupervised manner through the use of time-periodic packet sampling, which is used in a manner that differs from its intended purpose — the lossy nature of packet sampling is used to extract normal packets from the unlabeled original traffic data. Evaluation using actual traffic traces showed that the proposed method has false positive and false negative rates in the detection of anomalies regarding TCP SYN packets comparable to those of a conventional method that uses manually labeled traffic data to train the baseline model. Performance variation due to the probabilistic nature of sampled traffic data is mitigated by using ensemble anomaly detection that collectively exploits multiple baseline models in parallel. Alarm sensitivity is adjusted for the intended use by using maximum- and minimum-based anomaly detection that effectively take advantage of the performance variations among the multiple baseline models. Testing using actual traffic traces showed that the proposed anomaly detection method performs as well as one using manually labeled traffic data and better than one using randomly sampled (unlabeled) traffic data.
NASA Astrophysics Data System (ADS)
Zhao, Mingkang; Wi, Hun; Lee, Eun Jung; Woo, Eung Je; In Oh, Tong
2014-10-01
Electrical impedance imaging has the potential to detect an early stage of breast cancer due to higher admittivity values compared with those of normal breast tissues. The tumor size and extent of axillary lymph node involvement are important parameters to evaluate the breast cancer survival rate. Additionally, anomaly characterization is required to distinguish a malignant tumor from a benign tumor. In order to overcome the limitation of breast cancer detection using impedance measurement probes, we developed a high density trans-admittance mammography (TAM) system with a 60 × 60 electrode array and produced trans-admittance maps obtained at several frequency pairs. We applied the anomaly detection algorithm to the high density TAM system for estimating the volume and position of breast tumors. We tested four different sizes of anomaly with three different conductivity contrasts at four different depths. From multifrequency trans-admittance maps, we can readily observe the transversal position and estimate the volume and depth of an anomaly. In particular, the estimated depth values were accurate and independent of the size and conductivity contrast when applying the new formula using the Laplacian of the trans-admittance map. The volume estimation was dependent on the conductivity contrast between the anomaly and the background in the breast phantom. We characterized two test anomalies using frequency-difference trans-admittance data to eliminate the dependency on anomaly position and size. We confirmed the anomaly detection and characterization algorithm with the high density TAM system on bovine breast tissue. Both results showed the feasibility of detecting the size and position of an anomaly and of tissue characterization for screening breast cancer.
NASA Astrophysics Data System (ADS)
McEvoy, Thomas Richard; Wolthusen, Stephen D.
Recent research on intrusion detection in supervisory control and data acquisition (SCADA) and DCS systems has focused on anomaly detection at the protocol level, based on the well-defined nature of traffic on such networks. Here, we consider attacks which compromise sensors or actuators (including physical manipulation), where intrusion may not be readily apparent as data and computational states can be controlled to give an appearance of normality, and sensor and control systems have limited accuracy. To counter these, we propose to consider indirect relations between sensor readings to detect such attacks through concurrent observations as determined by control laws and constraints.
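To make the idea of indirect relations concrete, here is a small, hedged illustration (not the authors' method): a tank's level sensor is cross-checked against the level implied by its flow sensors through a simple mass-balance relation, and a persistent discrepancy flags a possibly compromised sensor. The tank area, sample period, tolerance and the spoofing scenario are all assumed for the example.

```python
# Hedged illustration of checking an indirect relation between sensor readings:
# the measured tank level should agree with the level implied by the
# integrated inflow/outflow. Assumed area, time step and tolerance.
import numpy as np

def level_flow_residual(level, inflow, outflow, area=2.0, dt=1.0):
    """Residual between measured level change and the change implied by flows."""
    predicted = level[0] + np.cumsum((inflow - outflow) * dt / area)
    return level[1:] - predicted[:-1]

def flag_inconsistency(level, inflow, outflow, tol=0.05):
    r = level_flow_residual(level, inflow, outflow)
    return np.abs(r) > tol          # True where the sensors disagree beyond tolerance

# toy usage: a spoofed level sensor that freezes while flow continues
t = np.arange(200)
inflow = np.full_like(t, 0.10, dtype=float)
outflow = np.full_like(t, 0.08, dtype=float)
true_level = 1.0 + np.cumsum((inflow - outflow) / 2.0)
spoofed = true_level.copy()
spoofed[100:] = spoofed[100]        # frozen (manipulated) reading
print(flag_inconsistency(spoofed, inflow, outflow)[95:110])
```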
2009-03-01
viii 3.2.3 Sub7 ...from TaskInfo in Excel Format. 3.2.3 Sub7 Also known as SubSeven, this is one of the best known, most widely distributed backdoor programs on the...engineering the spread of viruses, worms, backdoors and other malware. The Sub7 Trojan establishes a server on the victim computer that
Anomaly Detection Using an Ensemble of Feature Models
Noto, Keith; Brodley, Carla; Slonim, Donna
2011-01-01
We present a new approach to semi-supervised anomaly detection. Given a set of training examples believed to come from the same distribution or class, the task is to learn a model that will be able to distinguish examples in the future that do not belong to the same class. Traditional approaches typically compare the position of a new data point to the set of “normal” training data points in a chosen representation of the feature space. For some data sets, the normal data may not have discernible positions in feature space, but do have consistent relationships among some features that fail to appear in the anomalous examples. Our approach learns to predict the values of training set features from the values of other features. After we have formed an ensemble of predictors, we apply this ensemble to new data points. To combine the contribution of each predictor in our ensemble, we have developed a novel, information-theoretic anomaly measure that our experimental results show selects against noisy and irrelevant features. Our results on 47 data sets show that for most data sets, this approach significantly improves performance over current state-of-the-art feature space distance and density-based approaches. PMID:22020249
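A minimal sketch of this feature-prediction idea follows. It trains one regressor per feature on normal data to predict that feature from the others and scores new points by their scaled prediction errors; the random-forest regressors and the simple error averaging stand in for the paper's ensemble and its information-theoretic anomaly measure, which are not reproduced here.

```python
# Rough sketch, not the authors' implementation: predict each feature from
# the remaining features on normal data, then score new points by their
# combined, per-feature-scaled prediction error.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def fit_feature_models(X_train):
    models, scales = [], []
    for j in range(X_train.shape[1]):
        others = np.delete(X_train, j, axis=1)
        m = RandomForestRegressor(n_estimators=50, random_state=0).fit(others, X_train[:, j])
        err = np.abs(m.predict(others) - X_train[:, j])
        models.append(m)
        scales.append(err.std() + 1e-9)       # per-feature error scale on normal data
    return models, np.array(scales)

def anomaly_score(X, models, scales):
    errs = np.column_stack([
        np.abs(m.predict(np.delete(X, j, axis=1)) - X[:, j])
        for j, m in enumerate(models)
    ])
    return (errs / scales).mean(axis=1)       # high when feature relations break

# toy usage: two correlated features; the second test point breaks the relation
rng = np.random.default_rng(0)
a = rng.normal(size=(500, 1))
X_train = np.hstack([a, 2 * a + 0.1 * rng.normal(size=(500, 1))])
models, scales = fit_feature_models(X_train)
X_test = np.array([[0.5, 1.0], [0.5, -3.0]])
print(anomaly_score(X_test, models, scales))
```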
Faust, Kevin; Xie, Quin; Han, Dominick; Goyle, Kartikay; Volynskaya, Zoya; Djuric, Ugljesa; Diamandis, Phedias
2018-05-16
There is growing interest in utilizing artificial intelligence, and particularly deep learning, for computer vision in histopathology. While accumulating studies highlight expert-level performance of convolutional neural networks (CNNs) on focused classification tasks, most studies rely on probability distribution scores with empirically defined cutoff values based on post-hoc analysis. More generalizable tools that allow humans to visualize histology-based deep learning inferences and decision making are scarce. Here, we leverage t-distributed Stochastic Neighbor Embedding (t-SNE) to reduce dimensionality and depict how CNNs organize histomorphologic information. Unique to our workflow, we develop a quantitative and transparent approach to visualizing classification decisions prior to softmax compression. By discretizing the relationships between classes on the t-SNE plot, we show we can super-impose randomly sampled regions of test images and use their distribution to render statistically-driven classifications. Therefore, in addition to providing intuitive outputs for human review, this visual approach can carry out automated and objective multi-class classifications similar to more traditional and less-transparent categorical probability distribution scores. Importantly, this novel classification approach is driven by a priori statistically defined cutoffs. It therefore serves as a generalizable classification and anomaly detection tool less reliant on post-hoc tuning. Routine incorporation of this convenient approach for quantitative visualization and error reduction in histopathology aims to accelerate early adoption of CNNs into generalized real-world applications where unanticipated and previously untrained classes are often encountered.
A Hybrid Semi-Supervised Anomaly Detection Model for High-Dimensional Data.
Song, Hongchao; Jiang, Zhuqing; Men, Aidong; Yang, Bo
2017-01-01
Anomaly detection, which aims to identify observations that deviate from a nominal sample, is a challenging task for high-dimensional data. Traditional distance-based anomaly detection methods compute the neighborhood distance between each observation and suffer from the curse of dimensionality in high-dimensional space; for example, the distances between any pair of samples are similar and each sample may look like an outlier. In this paper, we propose a hybrid semi-supervised anomaly detection model for high-dimensional data that consists of two parts: a deep autoencoder (DAE) and an ensemble k-nearest neighbor graph- (K-NNG-) based anomaly detector. Benefiting from the ability of nonlinear mapping, the DAE is first trained to learn the intrinsic features of a high-dimensional dataset to represent the high-dimensional data in a more compact subspace. Several nonparametric KNN-based anomaly detectors are then built from different subsets that are randomly sampled from the whole dataset. The final prediction is made by all the anomaly detectors. The performance of the proposed method is evaluated on several real-life datasets, and the results confirm that the proposed hybrid model improves the detection accuracy and reduces the computational complexity.
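The sketch below mirrors this two-stage structure under stated assumptions: PCA stands in for the deep autoencoder as the compression step, and the ensemble is a set of k-NN distance detectors built on random subsets of the compressed training data whose scores are averaged at prediction time.

```python
# Minimal sketch of the two-stage idea. Assumptions: PCA replaces the DAE,
# and k-NN distance detectors on random subsets replace the K-NNG ensemble.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestNeighbors

class CompressedKnnEnsemble:
    def __init__(self, n_components=5, n_detectors=10, subset_frac=0.5, k=5, seed=0):
        self.pca = PCA(n_components=n_components)
        self.n_detectors, self.subset_frac, self.k = n_detectors, subset_frac, k
        self.rng = np.random.default_rng(seed)

    def fit(self, X_normal):
        Z = self.pca.fit_transform(X_normal)
        n = len(Z)
        self.detectors = []
        for _ in range(self.n_detectors):
            idx = self.rng.choice(n, size=int(self.subset_frac * n), replace=False)
            self.detectors.append(NearestNeighbors(n_neighbors=self.k).fit(Z[idx]))
        return self

    def score(self, X):
        Z = self.pca.transform(X)
        # average k-NN distance over all detectors; larger means more anomalous
        d = [det.kneighbors(Z)[0].mean(axis=1) for det in self.detectors]
        return np.mean(d, axis=0)

# toy usage on 50-dimensional data with one obvious outlier
rng = np.random.default_rng(1)
X_train = rng.normal(size=(300, 50))
X_test = np.vstack([rng.normal(size=(3, 50)), 6 * np.ones((1, 50))])
print(CompressedKnnEnsemble().fit(X_train).score(X_test))
```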
Distribution of female genital tract anomalies in two classifications.
Heinonen, Pentti K
2016-11-01
This study assessed the distribution of Müllerian duct anomalies in two verified classifications of female genital tract malformations, and the presence of associated renal defects. 621 women with confirmed female genital tract anomalies were retrospectively grouped under the European (ESHRE/ESGE) and the American (AFS) classification. The diagnosis of uterine malformation was based on findings in hysterosalpingography, two-dimensional ultrasonography, endoscopies, laparotomy, cesarean section and magnetic resonance imaging in 97.3% of cases. Renal status was determined in 378 patients, including 5 with normal uterus and vagina. The European classification covered all 621 women studied. Uterine anomalies without cervical or vaginal anomaly were found in 302 (48.6%) patients. Uterine anomaly was associated with vaginal anomaly in 45.2%, and vaginal anomaly alone was found in 26 (4.2%) cases. Septate uterus was the most common (49.1%) of all genital tract anomalies, followed by bicorporeal uteri (18.2%). The American classification covered 590 (95%) out of the 621 women with genital tract anomalies. The American system did not take into account vaginal anomalies in 170 (34.7%) and cervical anomalies in 174 (35.5%) out of 490 cases with uterine malformations. Renal abnormalities were found in 71 (18.8%) out of 378 women, unilateral renal agenesis being the most common defect (12.2%), also found in 4 women without Müllerian duct anomaly. The European classification sufficiently covered uterine and vaginal abnormalities. The distribution of the main uterine anomalies was equal in both classifications. The American system missed cervical and vaginal anomalies associated with uterine anomalies. Evaluation of renal system is recommended for all patients with genital tract anomalies. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Detecting Biosphere anomalies hotspots
NASA Astrophysics Data System (ADS)
Guanche-Garcia, Yanira; Mahecha, Miguel; Flach, Milan; Denzler, Joachim
2017-04-01
The current amount of satellite remote sensing measurements available allows for applying data-driven methods to investigate environmental processes. The detection of anomalies or abnormal events is crucial to monitor the Earth system and to analyze their impacts on ecosystems and society. By means of a combination of statistical methods, this study proposes an intuitive and efficient methodology to detect those areas that present hotspots of anomalies, i.e. higher levels of abnormal or extreme events or more severe phases during our historical records. Biosphere variables from a preliminary version of the Earth System Data Cube developed within the CAB-LAB project (http://earthsystemdatacube.net/) have been used in this study. This database comprises several atmosphere and biosphere variables spanning 11 years (2001-2011) with 8-day temporal resolution and 0.25° global spatial resolution. In this study, we have used 10 variables that measure the biosphere. The methodology applied to detect abnormal events follows the intuitive idea that anomalies are assumed to be time steps that are not well represented by a previously estimated statistical model [1]. We combine the use of Autoregressive Moving Average (ARMA) models with a distance metric like the Mahalanobis distance to detect abnormal events in multiple biosphere variables. In a first step we pre-treat the variables by removing the seasonality and normalizing them locally (μ=0, σ=1). Additionally, we have regionalized the area of study into subregions of similar climate conditions, by using the Köppen climate classification. For each climate region and variable we have selected the best ARMA parameters by means of a Bayesian criterion. Then we have obtained the residuals by comparing the fitted models with the original data. To detect the extreme residuals from the 10 variables, we have computed the Mahalanobis distance to the data's mean (Hotelling's T^2), which considers the covariance matrix of the joint distribution. The proposed methodology has been applied to different areas around the globe. The results show that the method is able to detect historic events and also provides a useful tool to define sensitive regions. This method and results have been developed within the framework of the project BACI (http://baci-h2020.eu/), which aims to integrate Earth Observation data to monitor the Earth system and assess the impacts of terrestrial changes. [1] V. Chandola, A. Banerjee and V. Kumar. Anomaly detection: a survey. ACM Computing Surveys (CSUR), vol. 41, no. 3, 2009. [2] P. Mahalanobis. On the generalised distance in statistics. Proceedings of the National Institute of Sciences, vol. 2, pp. 49-55, 1936.
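A compact sketch of the ARMA-plus-Mahalanobis step is given below, under assumptions: fixed ARMA orders rather than the study's information-criterion selection, synthetic series in place of the Earth System Data Cube variables, and no deseasonalization since the toy data have none.

```python
# Illustrative sketch: fit per-variable ARMA models, stack the residuals,
# and score each time step with a Mahalanobis (Hotelling-T2-style) distance.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def joint_residuals(variables, order=(2, 0, 1)):
    """variables: (n_time, n_vars) array; residuals of per-variable ARMA fits."""
    res = [ARIMA(variables[:, j], order=order).fit().resid for j in range(variables.shape[1])]
    return np.column_stack(res)

def mahalanobis_scores(residuals):
    mu = residuals.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(residuals, rowvar=False))
    diff = residuals - mu
    return np.einsum('ij,jk,ik->i', diff, cov_inv, diff)   # quadratic-form score

# toy usage: two AR(1)-like series with a joint shock injected at t = 150
rng = np.random.default_rng(0)
n = 300
e = rng.normal(size=(n, 2))
x = np.zeros((n, 2))
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + e[t]
x[150] += 6.0
scores = mahalanobis_scores(joint_residuals(x))
print(scores.argmax())   # expected near 150
```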
Detection of anomalies in radio tomography of asteroids: Source count and forward errors
NASA Astrophysics Data System (ADS)
Pursiainen, S.; Kaasalainen, M.
2014-09-01
The purpose of this study was to advance numerical methods for radio tomography in which an asteroid's internal electric permittivity distribution is to be recovered from radio frequency data gathered by an orbiter. The focus was on signal generation via multiple sources (transponders) providing one potential, or even essential, scenario to be implemented in a challenging in situ measurement environment and within tight payload limits. As a novel feature, the effects of forward errors including noise and a priori uncertainty of the forward (data) simulation were examined through a combination of the iterative alternating sequential (IAS) inverse algorithm and finite-difference time-domain (FDTD) simulation of time evolution data. Single and multiple source scenarios were compared in two-dimensional localization of permittivity anomalies. Three different anomaly strengths and four levels of total noise were tested. Results suggest, among other things, that multiple sources can be necessary to obtain appropriate results, for example, to distinguish three separate anomalies with permittivity less than or equal to half of the background value, relevant in the recovery of internal cavities.
2015-06-09
anomaly detection , which is generally considered part of high level information fusion (HLIF) involving temporal-geospatial data as well as meta-data... Anomaly detection in the Maritime defence and security domain typically focusses on trying to identify vessels that are behaving in an unusual...manner compared with lawful vessels operating in the area – an applied case of target detection among distractors. Anomaly detection is a complex problem
Post-processing for improving hyperspectral anomaly detection accuracy
NASA Astrophysics Data System (ADS)
Wu, Jee-Cheng; Jiang, Chi-Ming; Huang, Chen-Liang
2015-10-01
Anomaly detection is an important topic in the exploitation of hyperspectral data. Based on the Reed-Xiaoli (RX) detector and a morphology operator, this research proposes a novel technique for improving the accuracy of hyperspectral anomaly detection. Firstly, the RX-based detector is used to process a given input scene. Then, a post-processing scheme using a morphology operator is employed to detect those pixels around high-scoring anomaly pixels. Tests were conducted using two real hyperspectral images with ground truth information, and the results, based on receiver operating characteristic curves, illustrated that the proposed method reduced the false alarm rates of the RX-based detector.
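A hedged sketch of this two-step scheme is given below: a global RX score (Mahalanobis distance of each pixel spectrum to the scene statistics) followed by a morphological dilation of the thresholded score map, so that pixels adjacent to strong anomalies are also flagged. The cube, threshold and structuring element are illustrative choices, not the paper's settings.

```python
# Sketch under assumptions (not the paper's code): global RX detector on a
# hyperspectral cube, then a morphological dilation of the high-score mask.
import numpy as np
from scipy.ndimage import binary_dilation

def rx_scores(cube):
    """cube: (rows, cols, bands); per-pixel Mahalanobis distance to the scene mean."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands)
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    d = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return d.reshape(rows, cols)

def postprocess(scores, threshold, structure=np.ones((3, 3), dtype=bool)):
    """Flag high-scoring pixels, then grow the mask with a dilation."""
    seed = scores > threshold
    return binary_dilation(seed, structure=structure)

# toy usage: a 2-pixel 'target' with a shifted spectrum in a 40x40x20 cube
rng = np.random.default_rng(0)
cube = rng.normal(size=(40, 40, 20))
cube[10, 10] += 5.0
cube[10, 11] += 5.0
scores = rx_scores(cube)
mask = postprocess(scores, threshold=np.percentile(scores, 99.5))
print(mask[9:13, 9:13])
```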
An incremental anomaly detection model for virtual machines.
Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu
2017-01-01
Self-Organizing Map (SOM) algorithm as an unsupervised learning method has been applied in anomaly detection due to its capabilities of self-organizing and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Besides, cloud platforms with large-scale virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing characteristics, which makes the algorithm exhibit low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate the detection time by taking into account the large-scale and highly dynamic features of virtual machines on cloud platforms. To demonstrate the effectiveness, experiments on a common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms.
Rassam, Murad A.; Zainal, Anazida; Maarof, Mohd Aizaini
2013-01-01
Wireless Sensor Networks (WSNs) are important and necessary platforms for the future as the concept “Internet of Things” has emerged lately. They are used for monitoring, tracking, or controlling of many applications in industry, health care, habitat, and military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements to design efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches in five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future works. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed. PMID:23966182
Detecting anomalies in CMB maps: a new method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neelakanta, Jayanth T., E-mail: jayanthtn@gmail.com
2015-10-01
Ever since WMAP announced its first results, different analyses have shown that there is weak evidence for several large-scale anomalies in the CMB data. While the evidence for each anomaly appears to be weak, the fact that there are multiple seemingly unrelated anomalies makes it difficult to account for them via a single statistical fluke. So, one is led to considering a combination of these anomalies. But, if we 'hand-pick' the anomalies (test statistics) to consider, we are making an a posteriori choice. In this article, we propose two statistics that do not suffer from this problem. The statistics are linear and quadratic combinations of the a_{ℓm}'s with random coefficients, and they test the null hypothesis that the a_{ℓm}'s are independent, normally-distributed, zero-mean random variables with an m-independent variance. The motivation for considering multiple modes is this: because most physical models that lead to large-scale anomalies result in coupling multiple ℓ and m modes, the 'coherence' of this coupling should get enhanced if a combination of different modes is considered. In this sense, the statistics are thus much more generic than those that have been hitherto considered in the literature. Using fiducial data, we demonstrate that the method works and discuss how it can be used with actual CMB data to make quite general statements about the incompatibility of the data with the null hypothesis.
Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing
NASA Technical Reports Server (NTRS)
Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)
2001-01-01
The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is basically a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes uncertainty and vagueness typically found in image analysis. ANN provides learning capabilities, and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along slide basket cables utilized by the astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another current, potentially viable application at NASA is in detecting anomalies of the NASA Space Shuttle Orbiter's radiator panels.
Prevalence and distribution of selected developmental dental anomalies in an Indian population.
Gupta, Saurabh K; Saxena, Payal; Jain, Sandhya; Jain, Deshraj
2011-06-01
The purpose of this study was to determine the prevalence of developmental dental anomalies in an Indian population and to statistically analyze the distribution of these anomalies. The study was based on clinical examination, evaluation of dental casts, and panoramic radiographs of 1123 Indian subjects (572 males, 551 females), who visited the outpatient clinic at Government Dental College, Indore between November 2009 and September 2010, after obtaining their informed consent. These patients were examined for the following developmental dental anomalies: shape anomalies (microdontia, talon cusp, dens evaginatus, fusion, taurodontism), number anomalies (hypodontia, oligodontia, anodontia), structural anomalies (amelogenesis imperfecta, dentinogenesis imperfecta) and positional anomalies (ectopic eruption, rotation, impaction). The percentages of these anomalies were assessed for the whole group and compared using statistical analysis. Among the 1123 subjects, a total of 385 individuals (34.28%) presented with the selected developmental dental anomalies. The distribution by sex was 197 males (34.44%) and 188 females (34.06%). Out of the total 1123 individuals, 351 (31.26%) exhibited at least one anomaly, 28 (2.49%) showed two anomalies and 6 (0.53%) displayed more than two anomalies. P values indicated that the dental anomalies were statistically independent of sex. On intergroup comparison, positional anomalies were significantly the most prevalent (P < 0.05) in the Indian population. The most common developmental dental anomaly was rotation (10.24%), followed by ectopic eruption (7.93%). The next most common group was number anomalies. The most common number anomaly was hypodontia (4.19%), which had a higher frequency than hyperdontia (2.40%). Among the next most prevalent group, shape anomalies, microdontia (2.58%) was found to be the most common, followed by taurodontism (2.49%), dens evaginatus (2.40%) and talon cusp (0.97%). Dentinogenesis imperfecta (0.09%) was the rarest, followed by amelogenesis imperfecta (0.27%) and fusion (0.27%).
2015-09-21
this framework, MIT LL carried out a one-year proof- of-concept study to determine the capabilities and challenges in the detection of anomalies in...extremely large graphs [5]. Under this effort, two real datasets were considered, and algorithms for data modeling and anomaly detection were developed...is required in a well-defined experimental framework for the detection of anomalies in very large graphs. This study is intended to inform future
Lidar detection algorithm for time and range anomalies.
Ben-David, Avishai; Davidson, Charles E; Vanderbeek, Richard G
2007-10-10
A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly, where the question "is a target (aerosol cloud) present at range R within time t1 to t2" is addressed, and for range anomaly, where the question "is a target present at time t within ranges R1 and R2" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one over range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO2 lidar measurements of bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed.
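The thresholding step described above can be sketched as follows: a two-component Gaussian mixture is fitted to scalar detection scores with EM, the threshold is placed where the two components' posterior probabilities cross, and detection and false-alarm probabilities are read from the component Gaussians. The simulated scores and the crossing-point rule are assumptions for illustration, not the authors' parameterization.

```python
# Hedged sketch of the two-hypothesis thresholding step on simulated scores.
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
scores = np.r_[rng.normal(0.0, 1.0, 5000),      # background scores
               rng.normal(4.0, 1.2, 300)]       # scores when a cloud is present

gmm = GaussianMixture(n_components=2, random_state=0).fit(scores.reshape(-1, 1))
means = gmm.means_.ravel()
sds = np.sqrt(gmm.covariances_.ravel())
bg, tg = np.argsort(means)                      # lower-mean component = background

# threshold: point between the means where the posterior probabilities cross
grid = np.linspace(means[bg], means[tg], 2000)
post = gmm.predict_proba(grid.reshape(-1, 1))
threshold = grid[np.argmin(np.abs(post[:, bg] - post[:, tg]))]

p_fa = 1.0 - norm.cdf(threshold, means[bg], sds[bg])   # false alarm probability
p_d = 1.0 - norm.cdf(threshold, means[tg], sds[tg])    # detection probability
print(round(threshold, 2), round(p_fa, 4), round(p_d, 4))
```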
NASA Astrophysics Data System (ADS)
Akhoondzadeh, M.
2013-09-01
Anomaly detection is extremely important for forecasting the date, location and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS) has been proposed to detect the thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake that struck NW Iran on 11 August 2012. ANFIS is a well-known hybrid neuro-fuzzy network for modeling non-linear complex systems. In this study, the thermal and TEC anomalies detected using the proposed method are also compared with those obtained by applying classical and intelligent methods, including Interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN) and Support Vector Machine (SVM) methods. The dataset, which comprises Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), spans 62 days. If the difference between the value predicted by the ANFIS method and the observed value exceeds the pre-defined threshold, then, in the absence of non-seismic effective parameters, the observed precursor value could be regarded as a precursory anomaly. For the two precursors, LST and TEC, the ANFIS method shows very good agreement with the other implemented classical and intelligent methods, which indicates that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. This paper indicates that the detection of the thermal and TEC anomalies derives its credibility from the overall efficiencies and potentialities of the five integrated methods.
Model-based approach for cyber-physical attack detection in water distribution systems.
Housh, Mashor; Ohar, Ziv
2018-08-01
Modern Water Distribution Systems (WDSs) are often controlled by Supervisory Control and Data Acquisition (SCADA) systems and Programmable Logic Controllers (PLCs) which manage their operation and maintain a reliable water supply. As such, and with the cyber layer becoming a central component of WDS operations, these systems are at a greater risk of being subjected to cyberattacks. This paper offers a model-based methodology based on a detailed hydraulic understanding of WDSs combined with an anomaly detection algorithm for the identification of complex cyberattacks that cannot be fully identified by hydraulically based rules alone. The results show that the proposed algorithm is capable of achieving the best-known performance when tested on the data published in the BATtle of the Attack Detection ALgorithms (BATADAL) competition (http://www.batadal.net). Copyright © 2018. Published by Elsevier Ltd.
Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines
Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu
2016-01-01
In a complex system, condition monitoring (CM) can collect the system working status. The condition is mainly sensed by the pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict upcoming anomalies, faults, or failures. There is also some research which focuses on the faults or anomalies of the sensing element (i.e., sensor) to enhance the system reliability. However, existing approaches ignore the correlation between the sensor selection strategy and data anomaly detection, which can also improve the system reliability. To address this issue, we study a new scheme which includes a sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets that are utilized to carry out the evaluation are provided by the National Aeronautics and Space Administration (NASA) Ames Research Center and have been used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing the two different sensor selection strategies, the effectiveness of the selection method for data anomaly detection is demonstrated.
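As a hedged, simplified illustration of the GPR step (not the paper's pipeline on the NASA engine data), the sketch below learns one sensor from two correlated sensors on healthy data and flags a test reading that falls outside the model's predictive confidence band. The synthetic relation, kernel and 3-sigma band are assumptions.

```python
# Illustrative sketch: GPR learns a target sensor from correlated sensors on
# healthy data; readings outside the predictive band are flagged as anomalies.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
# healthy data: target sensor is a smooth function of two correlated sensors
X_train = rng.uniform(0, 1, size=(200, 2))
y_train = np.sin(3 * X_train[:, 0]) + 0.5 * X_train[:, 1] + 0.02 * rng.normal(size=200)

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True).fit(X_train, y_train)

def is_anomalous(x_other, y_measured, n_sigma=3.0):
    mean, std = gpr.predict(np.atleast_2d(x_other), return_std=True)
    return abs(y_measured - mean[0]) > n_sigma * std[0]

x = np.array([0.4, 0.6])
print(is_anomalous(x, np.sin(3 * 0.4) + 0.5 * 0.6))   # consistent reading -> False
print(is_anomalous(x, 5.0))                           # inconsistent reading -> True
```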
Variable Discretisation for Anomaly Detection using Bayesian Networks
2017-01-01
UNCLASSIFIED DST- Group –TR–3328 1 Introduction Bayesian network implementations usually require each variable to take on a finite number of mutually...UNCLASSIFIED Variable Discretisation for Anomaly Detection using Bayesian Networks Jonathan Legg National Security and ISR Division Defence Science...and Technology Group DST- Group –TR–3328 ABSTRACT Anomaly detection is the process by which low probability events are automatically found against a
Enhanced detection and visualization of anomalies in spectral imagery
NASA Astrophysics Data System (ADS)
Basener, William F.; Messinger, David W.
2009-05-01
Anomaly detection algorithms applied to hyperspectral imagery are able to reliably identify man-made objects from a natural environment based on statistical/geometric likelihood. The process is more robust than target identification, which requires precise prior knowledge of the object of interest, but has an inherently higher false alarm rate. Standard anomaly detection algorithms measure deviation of pixel spectra from a parametric model (either statistical or linear mixing) estimating the image background. The topological anomaly detector (TAD) creates a fully non-parametric, graph theory-based, topological model of the image background and measures deviation from this background using codensity. In this paper we present a large-scale comparative test of TAD against 80+ targets in four full HYDICE images using the entire canonical target set for generation of ROC curves. TAD will be compared against several statistics-based detectors including local RX and subspace RX. Even a perfect anomaly detection algorithm would have a high practical false alarm rate in most scenes simply because the user/analyst is not interested in every anomalous object. To assist the analyst in identifying and sorting objects of interest, we investigate coloring of the anomalies with principal component projections using statistics computed from the anomalies. This gives a very useful colorization of anomalies in which objects of similar material tend to have the same color, enabling an analyst to quickly sort and identify anomalies of highest interest.
Pediatric tinnitus: Incidence of imaging anomalies and the impact of hearing loss.
Kerr, Rhorie; Kang, Elise; Hopkins, Brandon; Anne, Samantha
2017-12-01
Guidelines exist for evaluation and management of tinnitus in adults; however, lack of evidence in children limits the applicability of these guidelines to pediatric patients. The objective of this study is to determine the incidence of inner ear anomalies detected on imaging studies within the pediatric population with tinnitus and to evaluate whether the presence of hearing loss increases the rate of detection of anomalies in comparison to normal hearing patients. Retrospective review of all children with a diagnosis of tinnitus from 2010 to 2015 at a tertiary care academic center. 102 pediatric patients with tinnitus were identified. Overall, 53 patients had imaging studies with 6 abnormal findings (11.3%). 51/102 patients had hearing loss, of which 33 had imaging studies demonstrating 6 inner ear anomalies detected. This is an incidence of 18.2% for inner ear anomalies identified in patients with hearing loss (95% confidence interval (CI) of 7.0-35.5%). 4 of these 6 inner ear anomalies detected were vestibular aqueduct abnormalities. The other two anomalies were cochlear hypoplasia and bilateral semicircular canal dysmorphism. 51 patients had no hearing loss and of these patients, 20 had imaging studies with no inner ear abnormalities detected. There was no statistical difference in the incidence of abnormal imaging findings in patients with and without hearing loss (Fisher's exact test, p = 0.072). CONCLUSION: There is a high incidence of anomalies detected in imaging studies done in pediatric patients with tinnitus, especially in the presence of hearing loss. Copyright © 2017 Elsevier B.V. All rights reserved.
2008-08-28
for aircraft pitch measurement Fluxgate magnetometer 10 RS232- ASCII SerialDevice.fluxgate Provides redundant aircraft attitude measurement...Figure 28. Filtered, ’final’ magnetometer data taken at high altitude. ......................................................... 43 LIST OF TABLES...flight. The magnetometer data can be analyzed to extract either distributions of magnetic anomalies (which can be further used to locate and bound
Detailed Vibration Analysis of Pinion Gear with Time-Frequency Methods
NASA Technical Reports Server (NTRS)
Mosher, Marianne; Pryor, Anna H.; Lewicki, David G.
2003-01-01
In this paper, the authors show a detailed analysis of the vibration signal from the destructive testing of a spiral bevel gear and pinion pair containing seeded faults. The vibration signal is analyzed in the time domain, frequency domain and with four time-frequency transforms: the Short Time Frequency Transform (STFT), the Wigner-Ville Distribution with the Choi-Williams kernel (WV-CW), the Continuous Wavelet Transform (CWT) and the Discrete Wavelet Transform (DWT). Vibration data of bevel gear tooth fatigue cracks, under a variety of operating load levels and damage conditions, are analyzed using these methods. A new metric for automatic anomaly detection is developed and can be produced from any systematic numerical representation of the vibration signals. This new metric reveals indications of gear damage with all of the time-frequency transforms, as well as time and frequency representations, on this data set. Analysis with the CWT detects changes in the signal at low torque levels not found with the other transforms. The WV-CW and CWT use considerably more resources than the STFT and the DWT. More testing of the new metric is needed to determine its value for automatic anomaly detection and to develop fault detection methods for the metric.
Spatially-Aware Temporal Anomaly Mapping of Gamma Spectra
NASA Astrophysics Data System (ADS)
Reinhart, Alex; Athey, Alex; Biegalski, Steven
2014-06-01
For security, environmental, and regulatory purposes it is useful to continuously monitor wide areas for unexpected changes in radioactivity. We report on a temporal anomaly detection algorithm which uses mobile detectors to build a spatial map of background spectra, allowing sensitive detection of any anomalies through many days or months of monitoring. We adapt previously-developed anomaly detection methods, which compare spectral shape rather than count rate, to function with limited background data, allowing sensitive detection of small changes in spectral shape from day to day. To demonstrate this technique we collected daily observations over the period of six weeks on a 0.33 square mile research campus and performed source injection simulations.
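One simple, hedged way to realize a shape-based (rather than count-rate-based) comparison is sketched below: each spatial cell stores a normalized background spectrum, and new observations are scored with a chi-square-style distance between normalized shapes, so a pure rate change scores low while an added photopeak scores high. The bin layout, toy spectra and distance definition are illustrative assumptions, not the authors' algorithm.

```python
# Hedged sketch of a spectral-shape comparison for gamma spectra.
import numpy as np

def shape(counts):
    return counts / counts.sum()

def shape_distance(observed, background):
    """Chi-square-like distance between two normalized spectra."""
    p, q = shape(observed), shape(background)
    return np.sum((p - q) ** 2 / (p + q + 1e-12))

rng = np.random.default_rng(0)
bins = 64
bg_rate = np.exp(-np.linspace(0, 4, bins))            # smooth background shape
background = rng.poisson(2000 * shape(bg_rate))       # stored background spectrum

same_shape = rng.poisson(500 * shape(bg_rate))        # lower rate, same shape
peaked = bg_rate.copy()
peaked[40] += 0.5                                     # added photopeak
anomalous = rng.poisson(500 * shape(peaked))

print(shape_distance(same_shape, background))         # small: rate change only
print(shape_distance(anomalous, background))          # larger: shape change
```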
NASA Astrophysics Data System (ADS)
Hortos, William S.
2009-05-01
In previous work by the author, parameters across network protocol layers were selected as features in supervised algorithms that detect and identify certain intrusion attacks on wireless ad hoc sensor networks (WSNs) carrying multisensor data. The algorithms improved the residual performance of the intrusion prevention measures provided by any dynamic key-management schemes and trust models implemented among network nodes. The approach of this paper does not train algorithms on the signature of known attack traffic, but, instead, the approach is based on unsupervised anomaly detection techniques that learn the signature of normal network traffic. Unsupervised learning does not require the data to be labeled or to be purely of one type, i.e., normal or attack traffic. The approach can be augmented to add any security attributes and quantified trust levels, established during data exchanges among nodes, to the set of cross-layer features from the WSN protocols. A two-stage framework is introduced for the security algorithms to overcome the problems of input size and resource constraints. The first stage is an unsupervised clustering algorithm which reduces the payload of network data packets to a tractable size. The second stage is a traditional anomaly detection algorithm based on a variation of support vector machines (SVMs), whose efficiency is improved by the availability of data in the packet payload. In the first stage, selected algorithms are adapted to WSN platforms to meet system requirements for simple parallel distributed computation, distributed storage and data robustness. A set of mobile software agents, acting like an ant colony in securing the WSN, are distributed at the nodes to implement the algorithms. The agents move among the layers involved in the network response to the intrusions at each active node and trustworthy neighborhood, collecting parametric values and executing assigned decision tasks. This minimizes the need to move large amounts of audit-log data through resource-limited nodes and locates routines closer to that data. Performance of the unsupervised algorithms is evaluated against the network intrusions of black hole, flooding, Sybil and other denial-of-service attacks in simulations of published scenarios. Results for scenarios with intentionally malfunctioning sensors show the robustness of the two-stage approach to intrusion anomalies.
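The two-stage framework lends itself to a small sketch, shown below under stated assumptions: random vectors stand in for packet-payload features, k-means distances to cluster centroids play the role of the first-stage compression, and a one-class SVM trained only on normal traffic plays the role of the second-stage detector. Parameters and data are illustrative only.

```python
# Rough two-stage sketch: k-means compression followed by a one-class SVM.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(1000, 30))           # stand-in payload features
attack = rng.normal(4, 1, size=(20, 30))             # e.g. flooding-like records

km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(normal)

def compress(X):
    return km.transform(X)                            # distances to the 8 centroids

ocsvm = OneClassSVM(kernel='rbf', nu=0.02, gamma='scale').fit(compress(normal))

print(ocsvm.predict(compress(normal[:5])))            # mostly +1 (normal)
print(ocsvm.predict(compress(attack[:5])))            # -1 flags the anomalies
```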
Hyperspectral anomaly detection using Sony PlayStation 3
NASA Astrophysics Data System (ADS)
Rosario, Dalton; Romano, João; Sepulveda, Rene
2009-05-01
We present a proof-of-principle demonstration using Sony's IBM Cell processor-based PlayStation 3 (PS3) to run, in near real-time, a hyperspectral anomaly detection algorithm (HADA) on real hyperspectral (HS) long-wave infrared imagery. The PS3 console proved to be ideal for doing precisely the kind of heavy computational lifting HS-based algorithms require, and the fact that it is a relatively open platform makes programming scientific applications feasible. The PS3 HADA is a unique parallel-random sampling based anomaly detection approach that does not require prior spectra of the clutter background. The PS3 HADA is designed to handle known underlying difficulties (e.g., target shape/scale uncertainties) often ignored in the development of autonomous anomaly detection algorithms. The effort is part of an ongoing cooperative contribution between the Army Research Laboratory and the Army's Armament, Research, Development and Engineering Center, which aims at demonstrating performance of innovative algorithmic approaches for applications requiring autonomous anomaly detection using passive sensors.
A novel approach for detection of anomalies using measurement data of the Ironton-Russell bridge
NASA Astrophysics Data System (ADS)
Zhang, Fan; Norouzi, Mehdi; Hunt, Victor; Helmicki, Arthur
2015-04-01
Data models have been increasingly used in recent years for documenting the normal behavior of structures and hence detecting and classifying anomalies. A large number of machine learning algorithms have been proposed by various researchers to model operational and functional changes in structures; however, only a limited number of studies have been applied to actual measurement data, due to limited access to long-term measurement data of structures and lack of access to the damaged states of structures. By monitoring the structure during construction and reviewing the effect of construction events on the measurement data, this study introduces a new approach to detect and eventually classify anomalies during construction and after construction. First, the implementation procedure of the sensor network, which evolves while the bridge is being built, and its current status are detailed. Second, the proposed anomaly detection algorithm is applied to the collected data and, finally, the detected anomalies are validated against archived construction events.
A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring
NASA Technical Reports Server (NTRS)
Doyle, R.
1994-01-01
Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. Previous results on extending traditional anomaly detection techniques are summarized. The focus of this paper is a new technique for attention focusing.
Magnetic and gravity anomalies in the Americas
NASA Technical Reports Server (NTRS)
Braile, L. W.; Hinze, W. J.; Vonfrese, R. R. B. (Principal Investigator)
1981-01-01
The cleaning and magnetic tape storage of spherical Earth processing programs are reported. These programs include: NVERTSM which inverts total or vector magnetic anomaly data on a distribution of point dipoles in spherical coordinates; SMFLD which utilizes output from NVERTSM to compute total or vector magnetic anomaly fields for a distribution of point dipoles in spherical coordinates; NVERTG; and GFLD. Abstracts are presented for papers dealing with the mapping and modeling of magnetic and gravity anomalies, and with the verification of crustal components in satellite data.
Multi-Level Modeling of Complex Socio-Technical Systems - Phase 1
2013-06-06
is to detect anomalous organizational outcomes, diagnose the causes of these anomalies , and decide upon appropriate compensation schemes. All of...monitor process outcomes. The purpose of this monitoring is to detect anomalous process outcomes, diagnose the causes of these anomalies , and decide upon...monitor work outcomes in terms of performance. The purpose of this monitoring is to detect anomalous work outcomes, diagnose the causes of these anomalies
Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ondrej Linda; Todd Vollmer; Milos Manic
The planned large scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks. The implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures multiple communication streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require specification of a sensitivity threshold, which inevitably leads to a tradeoff between false positive and false negative rates. In order to alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams, allowing for learning accurate normal network behavior models. Furthermore, the developed system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.
Residual Error Based Anomaly Detection Using Auto-Encoder in SMD Machine Sound.
Oh, Dong Yul; Yun, Il Dong
2018-04-24
Detecting an anomaly or an abnormal situation from given noise is highly useful in an environment where constantly verifying and monitoring a machine is required. As deep learning algorithms are further developed, current studies have focused on this problem. However, there are too many variables to define anomalies, and the human annotation for a large collection of abnormal data labeled at the class-level is very labor-intensive. In this paper, we propose to detect abnormal operation sounds or outliers in a very complex machine along with reducing the data-driven annotation cost. The architecture of the proposed model is based on an auto-encoder, and it uses the residual error, which stands for its reconstruction quality, to identify the anomaly. We assess our model using Surface-Mounted Device (SMD) machine sound, which is very complex, as experimental data, and state-of-the-art performance is successfully achieved for anomaly detection.
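A minimal sketch of the residual-error idea follows, with stated simplifications: a shallow multilayer-perceptron reconstructor stands in for the paper's deep auto-encoder, and generic low-dimensional feature vectors stand in for SMD machine-sound spectrograms; the bottleneck size and the 99th-percentile threshold are assumptions.

```python
# Minimal sketch: reconstruction (residual) error of a shallow autoencoder-like
# model, trained only on normal data, used as an anomaly score.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# normal training data: 20-D vectors that live near a 4-D subspace
latent = rng.normal(size=(2000, 4))
W = rng.normal(size=(4, 20))
X_normal = latent @ W + 0.1 * rng.normal(size=(2000, 20))

# shallow reconstructor trained to reproduce its own (normal) input
ae = MLPRegressor(hidden_layer_sizes=(16, 4, 16), max_iter=2000, random_state=0)
ae.fit(X_normal, X_normal)

def residual_error(X):
    """Mean squared reconstruction error per sample: the anomaly score."""
    return np.mean((ae.predict(X) - X) ** 2, axis=1)

threshold = np.percentile(residual_error(X_normal), 99)
ok = rng.normal(size=(3, 4)) @ W                      # consistent with training structure
odd = 3.0 * rng.normal(size=(1, 20))                  # breaks the learned structure
X_test = np.vstack([ok, odd])
print(residual_error(X_test) > threshold)             # expect the last sample flagged
```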
Geoid Anomalies and the Near-Surface Dipole Distribution of Mass
NASA Technical Reports Server (NTRS)
Turcotte, D. L.; Ockendon, J. R.
1978-01-01
Although geoid or surface gravity anomalies cannot be uniquely related to an interior distribution of mass, they can be related to a surface mass distribution. However, over horizontal distances greater than about 100 km, the condition of isostatic equilibrium above the asthenosphere is a good approximation and the total mass per unit column is zero. Thus the surface distribution of mass is also zero. For this case we show that the surface gravitational potential anomaly can be uniquely related to a surface dipole distribution of mass. Variations in the thickness of the crust and lithosphere can be expected to produce undulations in the geoid.
First trimester PAPP-A in the detection of non-Down syndrome aneuploidy.
Ochshorn, Y; Kupferminc, M J; Wolman, I; Orr-Urtreger, A; Jaffa, A J; Yaron, Y
2001-07-01
Combined first trimester screening using pregnancy associated plasma protein-A (PAPP-A), free beta-human chorionic gonadotrophin, and nuchal translucency (NT), is currently accepted as probably the best combination for the detection of Down syndrome (DS). Current first trimester algorithms provide computed risks only for DS. However, low PAPP-A is also associated with other chromosome anomalies such as trisomy 13, 18, and sex chromosome aneuploidy. Thus, using currently available algorithms, some chromosome anomalies may not be detected. The purpose of the present study was to establish a low-end cut-off value for PAPP-A that would increase the detection rates for non-DS chromosome anomalies. The study included 1408 patients who underwent combined first trimester screening. To determine a low-end cut-off value for PAPP-A, a Receiver-Operator Characteristic (ROC) curve analysis was performed. In the entire study group there were 18 cases of chromosome anomalies (trisomy 21, 13, 18, sex chromosome anomalies), 14 of which were among screen-positive patients, a detection rate of 77.7% for all chromosome anomalies (95% CI: 55.7-99.7%). ROC curve analysis detected a statistically significant cut-off for PAPP-A at 0.25 MoM. If the definition of screen-positive were to also include patients with PAPP-A<0.25 MoM, the detection rate would increase to 88.8% for all chromosome anomalies (95% CI: 71.6-106%). This low cut-off value may be used until specific algorithms are implemented for non-Down syndrome aneuploidy. Copyright 2001 John Wiley & Sons, Ltd.
Toward Baseline Software Anomalies in NASA Missions
NASA Technical Reports Server (NTRS)
Layman, Lucas; Zelkowitz, Marvin; Basili, Victor; Nikora, Allen P.
2012-01-01
In this fast abstract, we provide preliminary findings from an analysis of 14,500 spacecraft anomalies from unmanned NASA missions. We provide baselines for the distributions of software vs. non-software anomalies in spaceflight systems, the risk ratings of software anomalies, and the corrective actions associated with software anomalies.
Evaluation schemes for video and image anomaly detection algorithms
NASA Astrophysics Data System (ADS)
Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael
2016-05-01
Video anomaly detection is a critical research area in computer vision. It is a natural first step before applying object recognition algorithms. Many algorithms that detect anomalies (outliers) in videos and images have been introduced in recent years. However, these algorithms behave and perform differently based on differences in the domains and tasks to which they are subjected. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability in a particular domain/task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. Many evaluation metrics have been used in the literature, such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these different metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is very critical since the choice can introduce positive or negative bias in the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies. We analyze the biases introduced by these choices by measuring the performance of an existing anomaly detection algorithm.
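As a small illustration of the metrics discussed in the abstract above, the sketch below computes an ROC curve and its AUC from per-frame anomaly scores with scikit-learn; the scores, labels, and frame-level matching scheme are assumptions for the example, not the paper's protocol.

```python
# Minimal sketch of one evaluation metric discussed above: an ROC curve and its AUC
# computed from per-frame anomaly scores; scores and labels are synthetic placeholders.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
labels = np.concatenate([np.zeros(200), np.ones(20)])       # 0 = normal frame, 1 = anomalous frame
scores = np.concatenate([rng.normal(0.0, 1.0, 200),         # detector scores for normal frames
                         rng.normal(2.0, 1.0, 20)])         # higher scores for anomalous frames

fpr, tpr, thresholds = roc_curve(labels, scores)
print("AUC =", auc(fpr, tpr))
# Note: how a detection is matched to a ground-truth anomaly (pixel, region or frame level)
# is the "evaluation scheme" the paper cautions about; this sketch assumes frame-level labels.
```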
Security inspection in ports by anomaly detection using hyperspectral imaging technology
NASA Astrophysics Data System (ADS)
Rivera, Javier; Valverde, Fernando; Saldaña, Manuel; Manian, Vidya
2013-05-01
Applying hyperspectral imaging technology in port security is crucial for the detection of possible threats or illegal activities. One of the most common problems that cargo suffers is tampering. This represents a danger to society because it creates a channel to smuggle illegal and hazardous products. If a cargo is altered, security inspections of that cargo should reveal anomalies that indicate the nature of the tampering. Hyperspectral images can detect anomalies by gathering information through multiple electromagnetic bands. The spectra extracted from these bands can be used to detect surface anomalies from different materials. Based on this technology, a scenario was built in which a hyperspectral camera was used to inspect the cargo for any surface anomalies and a user interface shows the results. The spectrum of items, altered by different materials that can be used to conceal illegal products, is analyzed and classified in order to provide information about the tampered cargo. The image is analyzed with a variety of techniques such as multiple feature extraction algorithms, autonomous anomaly detection, and target spectrum detection. The results are exported to a workstation or mobile device and presented in an easy-to-use interface. This process could enhance the current capabilities of security systems that are already implemented, providing a more complete approach to detecting threats and illegal cargo.
Distribution of branchial anomalies in a paediatric Asian population.
Teo, Neville Wei Yang; Ibrahim, Shahrul Izham; Tan, Kun Kiaang Henry
2015-04-01
The objective of the present study was to review the distribution and incidence of branchial anomalies in an Asian paediatric population and highlight the challenges involved in the diagnosis of branchial anomalies. This was a retrospective chart review of all paediatric patients who underwent surgery for branchial anomalies in a tertiary paediatric hospital from August 2007 to November 2012. The clinical notes were correlated with preoperative radiological investigations, intraoperative findings and histology results. Branchial anomalies were classified based on the results of the review. A total of 28 children underwent surgery for 30 branchial anomalies during the review period. Two children had bilateral branchial anomalies requiring excision. Of the 30 branchial anomalies, 7 (23.3%) were first branchial anomalies, 5 (16.7%) were second branchial anomalies, 3 (10.0%) were third branchial anomalies, and 4 (13.3%) were fourth branchial anomalies (one of the four patients with fourth branchial anomalies had bilateral branchial anomalies). In addition, seven children had 8 (26.7%) branchial anomalies that were thought to originate from the pyriform sinus; however, we were unable to determine if these anomalies were from the third or fourth branchial arches. There was inadequate information on the remaining 3 (10.0%) branchial anomalies for classification. The incidence of second branchial anomalies appears to be lower in our Asian paediatric population, while that of third and fourth branchial anomalies was higher. Knowledge of embryology and the related anatomy of the branchial apparatus is crucial in the identification of the type of branchial anomaly.
Alekseeva, N P; Alekseev, A O; Vakhtin, Iu B; Kravtsov, V Iu; Kuzovatov, S N; Skorikova, T I
2008-01-01
Distributions of nuclear morphology anomalies in transplantable rhabdomyosarcoma RA-23 cell populations were investigated under the effect of ionizing radiation from 0 to 45 Gy. Internuclear bridges, nuclear protrusions and dumbbell-shaped nuclei were accepted as morphological anomalies. Empirical distributions of the number of anomalies per 100 nuclei were used. An adequate model, the reentrant binomial distribution, was found: the sum of binomial random variables with a binomial number of summands has this distribution. The averages of these random variables were named, accordingly, the internal and external average reentrant components, and their maximum likelihood estimates were obtained. Statistical properties of these estimates were investigated by means of statistical modeling. It was found that, while the radiation dose correlates equally significantly with the average number of nuclear anomalies, in cell populations two to three cell cycles after irradiation in vivo the dose correlates significantly with the internal average reentrant component, whereas in remote descendants of cell transplants irradiated in vitro it correlates with the external one.
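The reentrant binomial construction described above (a sum of binomial variables with a binomially distributed number of summands) can be simulated directly; the sketch below uses an assumed parameterisation and illustrative parameter names, since the paper's notation is not reproduced here.

```python
# Minimal simulation of a "reentrant binomial" variable under an assumed parameterisation:
# the number of summands M is Binomial(n_ext, p_ext) (external component) and each summand
# is Binomial(n_int, p_int) (internal component). All parameter names are illustrative.
import numpy as np

rng = np.random.default_rng(2)

def reentrant_binomial(n_ext, p_ext, n_int, p_int, size):
    m = rng.binomial(n_ext, p_ext, size=size)                   # binomial number of summands
    return np.array([rng.binomial(n_int, p_int, size=k).sum() for k in m])

counts = reentrant_binomial(n_ext=100, p_ext=0.1, n_int=10, p_int=0.05, size=10_000)
print("mean anomalies per 100 nuclei:", counts.mean())
# External average ~ n_ext*p_ext, internal average ~ n_int*p_int; their product gives the overall mean.
```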
A new comparison of hyperspectral anomaly detection algorithms for real-time applications
NASA Astrophysics Data System (ADS)
Díaz, María.; López, Sebastián.; Sarmiento, Roberto
2016-10-01
Due to the high spectral resolution that remotely sensed hyperspectral images provide, there has been an increasing interest in anomaly detection. The aim of anomaly detection is to single out pixels whose spectral signature differs significantly from the background spectra. Basically, anomaly detectors mark pixels with a certain score, considering as anomalies those whose scores are higher than a threshold. Receiver Operating Characteristic (ROC) curves have been widely used as an assessment measure in order to compare the performance of different algorithms. ROC curves are graphical plots which illustrate the trade-off between false positive and true positive rates. However, they are limited for making deep comparisons because they discard relevant factors required in real-time applications such as run times, costs of misclassification and the competence to mark anomalies with high scores. This last factor is fundamental in anomaly detection in order to distinguish anomalies easily from the background without any posterior processing. An extensive set of simulations has been carried out using different anomaly detection algorithms, comparing their performances and efficiencies using several extra metrics in order to complement the ROC curve analysis. Results support our proposal and demonstrate that ROC curves do not provide a good visualization of detection performance by themselves. Moreover, a figure of merit is proposed in this paper which encompasses in a single global metric all the measures yielded by the proposed additional metrics. This figure, named Detection Efficiency (DE), takes into account several crucial types of performance assessment that ROC curves do not consider. Results demonstrate that algorithms with the best detection performances according to ROC curves do not have the highest DE values. Consequently, the recommendation to use extra measures to properly evaluate performance is supported and justified by the conclusions drawn from the simulations.
NASA Astrophysics Data System (ADS)
Zhang, Xing; Wen, Gongjian
2015-10-01
Anomaly detection (AD) becomes increasingly important in hyperspectral imagery analysis, with many practical applications. The local orthogonal subspace projection (LOSP) detector is a popular anomaly detector which exploits local endmembers/eigenvectors around the pixel under test (PUT) to construct a background subspace. However, this subspace only takes advantage of the spectral information, while the spatial correlation of the background clutter is neglected, which makes the anomaly detection result sensitive to the accuracy of the estimated subspace. In this paper, a local three-dimensional orthogonal subspace projection (3D-LOSP) algorithm is proposed. Firstly, by jointly using both spectral and spatial information, three directional background subspaces are created along the image height direction, the image width direction and the spectral direction, respectively. Then, the three corresponding orthogonal subspaces are calculated. After that, each vector of the local cube along the three directions is projected onto the corresponding orthogonal subspace. Finally, a composite score is given by combining the three directional operators. In 3D-LOSP, anomalies are redefined as targets that are not only spectrally different from the background but also spatially distinct. Thanks to the addition of the spatial information, the robustness of the anomaly detection result is greatly improved by the proposed 3D-LOSP algorithm. It is noteworthy that the proposed algorithm is an expansion of LOSP, and this idea can inspire many other spectral-based anomaly detection methods. Experiments with real hyperspectral images have demonstrated the stability of the detection result.
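The core scoring step that 3D-LOSP applies along each of its three directions is an orthogonal subspace projection. The sketch below shows that step for the spectral direction only, with a synthetic local background and an assumed five-dimensional background subspace; it is an illustration of the projection, not the authors' full algorithm.

```python
# Minimal sketch of the orthogonal-subspace-projection score used along each direction of
# 3D-LOSP; only the spectral direction is shown, with synthetic local background spectra.
import numpy as np

rng = np.random.default_rng(3)
bands, n_bg = 50, 200
endmembers = rng.normal(size=(5, bands))                     # 5 hypothetical background endmembers
abundances = rng.dirichlet(np.ones(5), size=n_bg)
background = abundances @ endmembers + 0.01 * rng.normal(size=(n_bg, bands))   # local background spectra

# Background subspace from the leading right singular vectors of the local spectra.
_, _, vt = np.linalg.svd(background, full_matrices=False)
U = vt[:5].T                                                 # bands x 5 background basis
P_perp = np.eye(bands) - U @ np.linalg.pinv(U)               # projector onto the orthogonal complement

pixel_bg = background[0]                                     # a background-like pixel under test
pixel_anom = background[0] + 0.5 * rng.normal(size=bands)    # a spectrally distinct pixel under test
for name, x in [("background-like", pixel_bg), ("anomalous", pixel_anom)]:
    print(name, "OSP score:", round(float(np.linalg.norm(P_perp @ x)), 4))
```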
An Optimized Method to Detect BDS Satellites' Orbit Maneuvering and Anomalies in Real-Time.
Huang, Guanwen; Qin, Zhiwei; Zhang, Qin; Wang, Le; Yan, Xingyuan; Wang, Xiaolei
2018-02-28
The orbital maneuvers of Global Navigation Satellite System (GNSS) constellations will decrease the performance and accuracy of positioning, navigation, and timing (PNT). Because satellites in the Chinese BeiDou Navigation Satellite System (BDS) are in Geostationary Orbit (GEO) and Inclined Geosynchronous Orbit (IGSO), maneuvers occur more frequently. Also, the precise start moment of the BDS satellites' orbit maneuvering cannot be obtained by common users. This paper presents an improved real-time detection method for BDS satellites' orbit maneuvering and anomalies with higher timeliness and higher accuracy. The main contributions to this improvement are as follows: (1) instead of the previous two-step method, a new one-step method with higher accuracy is proposed to determine the start moment and the pseudo random noise code (PRN) of the satellite orbit maneuvering at that time; (2) BDS Medium Earth Orbit (MEO) orbital maneuvers are detected for the first time according to the proposed selection strategy for the stations; and (3) the classified non-maneuvering anomalies are detected by a new robust median method using the weak anomaly detection factor and the strong anomaly detection factor. Data from the Multi-GNSS Experiment (MGEX) in 2017 were used for experimental analysis. The experimental results and analysis showed that the start moment of orbital maneuvers and the period of non-maneuver anomalies can be determined more accurately in real time. When orbital maneuvers and anomalies occur, the proposed method improved the data utilization by 91 and 95 min in 2017.
Active Learning with Rationales for Identifying Operationally Significant Anomalies in Aviation
NASA Technical Reports Server (NTRS)
Sharma, Manali; Das, Kamalika; Bilgic, Mustafa; Matthews, Bryan; Nielsen, David Lynn; Oza, Nikunj C.
2016-01-01
A major focus of the commercial aviation community is discovery of unknown safety events in flight operations data. Data-driven unsupervised anomaly detection methods are better at capturing unknown safety events compared to rule-based methods which only look for known violations. However, not all statistical anomalies that are discovered by these unsupervised anomaly detection methods are operationally significant (e.g., represent a safety concern). Subject Matter Experts (SMEs) have to spend significant time reviewing these statistical anomalies individually to identify a few operationally significant ones. In this paper we propose an active learning algorithm that incorporates SME feedback in the form of rationales to build a classifier that can distinguish between uninteresting and operationally significant anomalies. Experimental evaluation on real aviation data shows that our approach improves detection of operationally significant events by as much as 75% compared to the state-of-the-art. The learnt classifier also generalizes well to additional validation data sets.
Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering
NASA Astrophysics Data System (ADS)
Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech
2015-03-01
We present a novel method for automated anomaly detection on autofluorescence data provided by the National Institutes of Health (NIH). This is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, track the progression over time, and test the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures such as wavelet analysis or dimensionality reduction algorithms followed by a classification algorithm, e.g., Support Vector Machine. The method that we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality preserving properties. By applying LE, we are able to represent the data in the form of eigenimages, some of which accentuate the visibility of anomalies. We pick significant eigenimages and proceed with the VMF algorithm that classifies anomalies across all of these eigenimages simultaneously. To evaluate our performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection on single images and a combination of PCA and VMF. LE combined with the VMF algorithm performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of using a nonlinear approach to represent the data and the effectiveness of VMF, which operates on the images as a data cube rather than as individual images.
2013-09-26
Only keyword-matched fragments are indexed for this record: the report concerns anomaly detection in wide-area motion imagery (WAMI), where the low specificity of object detectors means all vehicle detections are treated equally, recently detected activity is given priority over older activity when timing an anomaly, and baseline activities include normal work hours and religious observance times.
Critical Infrastructure Protection and Resilience Literature Survey: Modeling and Simulation
2014-11-01
Only fragments of this survey are indexed: a topic cluster brings together detection, anomaly, intrusion, sensors, monitoring and alerting for hazards and threats to security; in the water sector the tools ADWICE and PSS®SINCAL are cited, with ADWICE used for real-time anomaly detection in water management systems (Raciti M, Cucurull J, Nadjm-Tehrani S, Anomaly detection in water management).
Symbolic Time-Series Analysis for Anomaly Detection in Mechanical Systems
2006-08-01
Amol Khatkhate, Asok Ray, Eric Keller, Shalabh Gupta, and Shin C. Chin. Only abstract fragments are indexed: the paper examines the efficacy of a novel method for anomaly detection, building on the approach proposed by Ray in which the underlying information on the dynamical behavior of complex systems is derived via symbolic time-series analysis.
Autonomous detection of crowd anomalies in multiple-camera surveillance feeds
NASA Astrophysics Data System (ADS)
Nordlöf, Jonas; Andersson, Maria
2016-10-01
A novel approach for autonomous detection of anomalies in crowded environments is presented in this paper. The proposed model uses a Gaussian mixture probability hypothesis density (GM-PHD) filter as a feature extractor in conjunction with different Gaussian mixture hidden Markov models (GM-HMMs). Results, based on both simulated and recorded data, indicate that this method can track and detect anomalies on-line in individual crowds through multiple camera feeds in a crowded environment.
Deep learning on temporal-spectral data for anomaly detection
NASA Astrophysics Data System (ADS)
Ma, King; Leung, Henry; Jalilian, Ehsan; Huang, Daniel
2017-05-01
Detecting anomalies is important for continuous monitoring of sensor systems. One significant challenge is to use sensor data and autonomously detect changes that cause different conditions to occur. Using deep learning methods, we are able to monitor and detect changes as a result of some disturbance in the system. We utilize deep neural networks for sequence analysis of time series. We use a multi-step method for anomaly detection. We train the network to learn spectral and temporal features from the acoustic time series. We test our method using fiber-optic acoustic data from a pipeline.
Firefly Algorithm in detection of TEC seismo-ionospheric anomalies
NASA Astrophysics Data System (ADS)
Akhoondzadeh, Mehdi
2015-07-01
Anomaly detection in time series of different earthquake precursors is an essential step toward creating an early warning system with an acceptable uncertainty. Since these time series are often nonlinear, complex and massive, the applied predictor method should be able to detect discordant patterns in a large amount of data in a short time. This study considers the Firefly Algorithm (FA) as a simple and robust predictor to detect TEC (Total Electron Content) seismo-ionospheric anomalies around the time of some powerful earthquakes, including Chile (27 February 2010), Varzeghan (11 August 2012) and Saravan (16 April 2013). Outstanding anomalies were observed 7 and 5 days before the Chile and Varzeghan earthquakes, respectively, and 3 and 8 days prior to the Saravan earthquake.
Latent Space Tracking from Heterogeneous Data with an Application for Anomaly Detection
2015-11-01
Jiaji Huang and Xia Ning. Only keyword-matched fragments of this report are indexed: if an anomaly behaves as a sudden outlier after which the data stream goes back to its normal state, then the anomalous data point should be …; the synthetic experiments introduce three types of anomalies, all of them sudden outliers, with AUC and parameter settings reported for the synthetic dataset.
Automation for deep space vehicle monitoring
NASA Technical Reports Server (NTRS)
Schwuttke, Ursula M.
1991-01-01
Information on automation for deep space vehicle monitoring is given in viewgraph form. Information is given on automation goals and strategy; the Monitor Analyzer of Real-time Voyager Engineering Link (MARVEL); intelligent input data management; decision theory for making tradeoffs; dynamic tradeoff evaluation; evaluation of anomaly detection results; evaluation of data management methods; system level analysis with cooperating expert systems; the distributed architecture of multiple expert systems; and event driven response.
Forward Modelling of Long-wavelength Magnetic Anomaly Contributions from the Upper Mantle
NASA Astrophysics Data System (ADS)
Idoko, C. M.; Conder, J. A.; Ferre, E. C.; Friedman, S. A.
2016-12-01
Towards the interpretation of the upcoming results from the SWARM satellite survey, we develop MATLAB-based geophysical forward modeling of magnetic anomalies from tectonic regions with different upper mantle geotherms, including subduction zones (Kamchatka island arcs), cratons (Siberian craton), and hotspots (Hawaii hotspots and Massif Central plumes). We constrain the modeling using magnetic data measured from xenoliths collected across these regions. Over the years, the potential of the upper mantle to contribute to long-wavelength magnetic anomalies has been a topic of debate among geoscientists. However, recent work shows that some low-geotherm tectonic environments, such as forearcs and cratons, contain mantle xenoliths which are below the Curie temperature of magnetite and could potentially contribute to long-wavelength magnetic anomalies. The modeling pursued here holds the prospect of better understanding the magnetism of the upper mantle and of resolving the mismatch between observed long-wavelength anomalies and surface field anomalies upward continued to satellite altitude. The SWARM satellite survey provides a unique opportunity due to its capacity to detect more accurately the depth of magnetic sources. A preliminary model of a hypothetical craton of size 2000 km by 1000 km by 500 km, discretized into 32 equal and uniformly distributed prism blocks, uses magnetic data from the Siberian craton: an average natural remanent magnetization of 0.0829 A/m (randomly oriented) for a magnetized mantle thickness of 75 km, and induced magnetization varying according to the Curie-Weiss law from the surface to 500 km depth with an average of 0.02 A/m. The model shows that the contributions of the induced and remanent phases of magnetization, with a total-field anomaly amplitude of 3 nT, may impart a measurable signal to the observed long-wavelength magnetic anomalies in low-geotherm tectonic environments.
Efficient dynamic events discrimination technique for fiber distributed Brillouin sensors.
Galindez, Carlos A; Madruga, Francisco J; Lopez-Higuera, Jose M
2011-09-26
A technique to detect real-time variations of temperature or strain in Brillouin-based distributed fiber sensors is proposed and investigated in this paper. The technique is based on anomaly detection methods such as the RX algorithm. Detection and isolation of dynamic events from static ones is demonstrated by proper processing of the Brillouin gain values obtained using a standard BOTDA system. Results also suggest that better signal-to-noise ratio, dynamic range and spatial resolution can be obtained. For a pump pulse of 5 ns the spatial resolution is enhanced (from 0.541 m obtained by direct gain measurement to 0.418 m obtained with the technique presented here), since the analysis concentrates on the variation of the Brillouin gain and not only on averaging of the signal over time. © 2011 Optical Society of America
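Since the abstract names the RX algorithm as the anomaly detection method, the sketch below illustrates an RX-style score, namely the Mahalanobis distance of a Brillouin-gain trace to background statistics estimated from static traces; the synthetic traces, fibre discretisation, and regularisation term are assumptions of the example.

```python
# Minimal sketch of an RX-style detector: the Mahalanobis distance of each Brillouin-gain
# trace to the static background statistics; data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(4)
# rows: time samples of the Brillouin gain profile over, e.g., 100 fibre positions
static = rng.normal(1.0, 0.05, size=(400, 100))
dynamic = static[:5] + 0.0                                  # copy a few traces
dynamic[:, 40:45] += 0.3                                    # localised dynamic strain/temperature event

mu = static.mean(axis=0)
cov = np.cov(static, rowvar=False) + 1e-6 * np.eye(100)     # regularised background covariance
cov_inv = np.linalg.inv(cov)

def rx_score(x):
    d = x - mu
    return float(d @ cov_inv @ d)

print("static trace score :", rx_score(static[0]))
print("dynamic trace score:", rx_score(dynamic[0]))         # much larger => flagged as a dynamic event
```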
Data cleaning in the energy domain
NASA Astrophysics Data System (ADS)
Akouemo Kengmo Kenfack, Hermine N.
This dissertation addresses the problem of data cleaning in the energy domain, especially for natural gas and electric time series. The detection and imputation of anomalies improves the performance of forecasting models necessary to lower purchasing and storage costs for utilities and plan for peak energy loads or distribution shortages. There are various types of anomalies, each induced by diverse causes and sources depending on the field of study. The definition of false positives also depends on the context. The analysis is focused on energy data because of the availability of data and information to make a theoretical and practical contribution to the field. A probabilistic approach based on hypothesis testing is developed to decide if a data point is anomalous based on the level of significance. Furthermore, the probabilistic approach is combined with statistical regression models to handle time series data. Domain knowledge of energy data and the survey of causes and sources of anomalies in energy are incorporated into the data cleaning algorithm to improve the accuracy of the results. The data cleaning method is evaluated on simulated data sets in which anomalies were artificially inserted and on natural gas and electric data sets. In the simulation study, the performance of the method is evaluated for both detection and imputation on all identified causes of anomalies in energy data. The testing on utilities' data evaluates the percentage of improvement brought to forecasting accuracy by data cleaning. A cross-validation study of the results is also performed to demonstrate the performance of the data cleaning algorithm on smaller data sets and to calculate an interval of confidence for the results. The data cleaning algorithm is able to successfully identify energy time series anomalies. The replacement of those anomalies provides improvement to forecasting models accuracy. The process is automatic, which is important because many data cleaning processes require human input and become impractical for very large data sets. The techniques are also applicable to other fields such as econometrics and finance, but the exogenous factors of the time series data need to be well defined.
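A minimal sketch of the hypothesis-testing idea in this abstract is given below: fit a simple regression model to a load series, test the standardised residuals at a chosen significance level, and impute flagged points with the model prediction. The harmonic model, the injected anomaly, and the significance level are illustrative assumptions, not the dissertation's choices.

```python
# Minimal sketch of hypothesis-test-based data cleaning on an energy load series:
# fit a regression, flag residuals that fail a significance test, impute with the fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
t = np.arange(24 * 30)                                      # hourly load, 30 days (synthetic)
load = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 2, t.size)
load[100] += 40                                             # injected anomaly

# Simple harmonic regression as the statistical model
X = np.column_stack([np.ones_like(t), np.sin(2 * np.pi * t / 24), np.cos(2 * np.pi * t / 24)])
beta, *_ = np.linalg.lstsq(X, load, rcond=None)
resid = load - X @ beta
z = (resid - resid.mean()) / resid.std(ddof=1)

alpha = 0.001
flagged = np.abs(z) > stats.norm.ppf(1 - alpha / 2)         # two-sided test at significance alpha
cleaned = np.where(flagged, X @ beta, load)                 # impute anomalies with the fitted value
print("flagged indices:", np.where(flagged)[0])
```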
Anomaly Detection Based on Sensor Data in Petroleum Industry Applications
Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra
2015-01-01
Anomaly detection is the problem of finding patterns in data that do not conform to an a priori expected behavior. This is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset, where these anomalous samples are indicated as outliers. Anomaly detection has recently attracted the attention of the research community, because of its relevance in real-world applications, like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation. However, in either case, it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one of the application contexts where these problems are present. The correct detection of such types of unusual information empowers the decision maker with the capacity to act on the system in order to correctly avoid, correct or react to the situations associated with them. In that application context, heavy extraction machines for pumping and generation operations, like turbomachines, are intensively monitored by hundreds of sensors each that send measurements with a high frequency for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel fast and high quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal is meant for dealing with the aforementioned task and to cope with the lack of labeled training data. As a result, we perform a series of empirical studies comparing our approach to other methods applied to benchmark problems and a real-life application related to oil platform turbomachinery anomaly detection. PMID:25633599
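To make the second stage of the pipeline concrete, the sketch below applies scikit-learn's one-class SVM to per-segment features; the fixed-length segmentation stands in for YASA, and the synthetic signal, feature choice, and nu parameter are assumptions of the example.

```python
# Minimal sketch of the one-class SVM stage described above, applied to per-segment features.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(6)
signal = rng.normal(0.0, 1.0, size=10_000)                  # synthetic turbomachine sensor stream
signal[7000:7200] += 4.0                                    # injected fault-like excursion

segments = signal.reshape(-1, 200)                          # naive fixed-length segmentation (stand-in for YASA)
features = np.column_stack([segments.mean(axis=1), segments.std(axis=1)])

ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(features)
labels = ocsvm.predict(features)                            # -1 = anomalous segment, +1 = normal
print("anomalous segments:", np.where(labels == -1)[0])
```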
Disparity : scalable anomaly detection for clusters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desai, N.; Bradshaw, R.; Lusk, E.
2008-01-01
In this paper, we describe disparity, a tool that does parallel, scalable anomaly detection for clusters. Disparity uses basic statistical methods and scalable reduction operations to perform data reduction on client nodes and uses these results to locate node anomalies. We discuss the implementation of disparity and present results of its use on a SiCortex SC5832 system.
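The statistical idea behind disparity can be illustrated with a plain in-memory version of the reduction: each node contributes a summary metric, and nodes whose metrics deviate strongly from the cluster-wide statistics are flagged. The metric, the 3-sigma rule, and the serial aggregation below are assumptions of this sketch, not the tool's parallel implementation.

```python
# Minimal sketch of node-anomaly location from reduced per-node summaries.
import numpy as np

rng = np.random.default_rng(7)
n_nodes = 256
node_load = rng.normal(1.0, 0.05, size=n_nodes)             # e.g., per-node load averages
node_load[42] = 2.5                                          # one misbehaving node

mu, sigma = node_load.mean(), node_load.std(ddof=1)
z = (node_load - mu) / sigma
print("anomalous nodes:", np.where(np.abs(z) > 3.0)[0])      # simple 3-sigma rule per metric
```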
Integrated System Health Management (ISHM) for Test Stand and J-2X Engine: Core Implementation
NASA Technical Reports Server (NTRS)
Figueroa, Jorge F.; Schmalzel, John L.; Aguilar, Robert; Shwabacher, Mark; Morris, Jon
2008-01-01
ISHM capability enables a system to detect anomalies, determine causes and effects, predict future anomalies, and provides an integrated awareness of the health of the system to users (operators, customers, management, etc.). NASA Stennis Space Center, NASA Ames Research Center, and Pratt & Whitney Rocketdyne have implemented a core ISHM capability that encompasses the A1 Test Stand and the J-2X Engine. The implementation incorporates all aspects of ISHM; from anomaly detection (e.g. leaks) to root-cause-analysis based on failure mode and effects analysis (FMEA), to a user interface for an integrated visualization of the health of the system (Test Stand and Engine). The implementation provides a low functional capability level (FCL) in that it is populated with few algorithms and approaches for anomaly detection, and root-cause trees from a limited FMEA effort. However, it is a demonstration of a credible ISHM capability, and it is inherently designed for continuous and systematic augmentation of the capability. The ISHM capability is grounded on an integrating software environment used to create an ISHM model of the system. The ISHM model follows an object-oriented approach: includes all elements of the system (from schematics) and provides for compartmentalized storage of information associated with each element. For instance, a sensor object contains a transducer electronic data sheet (TEDS) with information that might be used by algorithms and approaches for anomaly detection, diagnostics, etc. Similarly, a component, such as a tank, contains a Component Electronic Data Sheet (CEDS). Each element also includes a Health Electronic Data Sheet (HEDS) that contains health-related information such as anomalies and health state. Some practical aspects of the implementation include: (1) near real-time data flow from the test stand data acquisition system through the ISHM model, for near real-time detection of anomalies and diagnostics, (2) insertion of the J-2X predictive model providing predicted sensor values for comparison with measured values and use in anomaly detection and diagnostics, and (3) insertion of third-party anomaly detection algorithms into the integrated ISHM model.
Robust and efficient anomaly detection using heterogeneous representations
NASA Astrophysics Data System (ADS)
Hu, Xing; Hu, Shiqiang; Xie, Jinhua; Zheng, Shiyou
2015-05-01
Various approaches have been proposed for video anomaly detection. Yet these approaches typically suffer from one or more limitations: they often characterize the pattern using its internal information, but ignore its external relationship which is important for local anomaly detection. Moreover, the high-dimensionality and the lack of robustness of pattern representation may lead to problems, including overfitting, increased computational cost and memory requirements, and high false alarm rate. We propose a video anomaly detection framework which relies on a heterogeneous representation to account for both the pattern's internal information and external relationship. The internal information is characterized by slow features learned by slow feature analysis from low-level representations, and the external relationship is characterized by the spatial contextual distances. The heterogeneous representation is compact, robust, efficient, and discriminative for anomaly detection. Moreover, both the pattern's internal information and external relationship can be taken into account in the proposed framework. Extensive experiments demonstrate the robustness and efficiency of our approach by comparison with the state-of-the-art approaches on the widely used benchmark datasets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohrer, Brandon Robinson
2011-09-01
Events of interest to data analysts are sometimes difficult to characterize in detail. Rather, they consist of anomalies, events that are unpredicted, unusual, or otherwise incongruent. The purpose of this LDRD was to test the hypothesis that a biologically-inspired anomaly detection algorithm could be used to detect contextual, multi-modal anomalies. There currently is no other solution to this problem, but the existence of a solution would have a great national security impact. The technical focus of this research was the application of a brain-emulating cognition and control architecture (BECCA) to the problem of anomaly detection. One aspect of BECCA in particular was discovered to be critical to improved anomaly detection capabilities: its feature creator. During the course of this project the feature creator was developed and tested against multiple data types. Development direction was drawn from psychological and neurophysiological measurements. Major technical achievements include the creation of hierarchical feature sets from both audio and imagery data.
Randomized subspace-based robust principal component analysis for hyperspectral anomaly detection
NASA Astrophysics Data System (ADS)
Sun, Weiwei; Yang, Gang; Li, Jialin; Zhang, Dianfa
2018-01-01
A randomized subspace-based robust principal component analysis (RSRPCA) method for anomaly detection in hyperspectral imagery (HSI) is proposed. The RSRPCA combines advantages of randomized column subspace and robust principal component analysis (RPCA). It assumes that the background has low-rank properties, and the anomalies are sparse and do not lie in the column subspace of the background. First, RSRPCA implements random sampling to sketch the original HSI dataset from columns and to construct a randomized column subspace of the background. Structured random projections are also adopted to sketch the HSI dataset from rows. Sketching from columns and rows could greatly reduce the computational requirements of RSRPCA. Second, the RSRPCA adopts the columnwise RPCA (CWRPCA) to eliminate negative effects of sampled anomaly pixels and that purifies the previous randomized column subspace by removing sampled anomaly columns. The CWRPCA decomposes the submatrix of the HSI data into a low-rank matrix (i.e., background component), a noisy matrix (i.e., noise component), and a sparse anomaly matrix (i.e., anomaly component) with only a small proportion of nonzero columns. The algorithm of inexact augmented Lagrange multiplier is utilized to optimize the CWRPCA problem and estimate the sparse matrix. Nonzero columns of the sparse anomaly matrix point to sampled anomaly columns in the submatrix. Third, all the pixels are projected onto the complemental subspace of the purified randomized column subspace of the background and the anomaly pixels in the original HSI data are finally exactly located. Several experiments on three real hyperspectral images are carefully designed to investigate the detection performance of RSRPCA, and the results are compared with four state-of-the-art methods. Experimental results show that the proposed RSRPCA outperforms four comparison methods both in detection performance and in computational time.
Indo-Pacific sea level variability during recent decades
NASA Astrophysics Data System (ADS)
Yamanaka, G.; Tsujino, H.; Nakano, H.; Urakawa, S. L.; Sakamoto, K.
2016-12-01
Decadal variability of sea level in the Indo-Pacific region is investigated using a historical OGCM simulation. The OGCM driven by the atmospheric forcing removing long-term trends clearly exhibits decadal sea level variability in the Pacific Ocean, which is associated with eastern tropical Pacific thermal anomalies. During the period of 1977-1987, the sea level anomalies are positive in the eastern equatorial Pacific and show deviations from a north-south symmetric distribution, with strongly negative anomalies in the western tropical South Pacific. During the period of 1996-2006, in contrast, the sea level anomalies are negative in the eastern equatorial Pacific and show a nearly north-south symmetric pattern, with positive anomalies in both hemispheres. Concurrently, sea level anomalies in the south-eastern Indian Ocean vary with those in the western tropical Pacific. These sea level variations are closely related to large-scale wind fields. Indo-Pacific sea level distributions are basically determined by wind anomalies over the equatorial region as well as wind stress curl anomalies over the off-equatorial region.
Monitoring of Thermal Protection Systems Using Robust Self-Organizing Optical Fiber Sensing Networks
NASA Technical Reports Server (NTRS)
Richards, Lance
2013-01-01
The general aim of this work is to develop and demonstrate a prototype structural health monitoring system for thermal protection systems that incorporates piezoelectric acoustic emission (AE) sensors to detect the occurrence and location of damaging impacts, and an optical fiber Bragg grating (FBG) sensor network to evaluate the effect of detected damage on the thermal conductivity of the TPS material. Following detection of an impact, the TPS would be exposed to a heat source, possibly the sun, and the temperature distribution on the inner surface in the vicinity of the impact measured by the FBG network. A similar procedure could also be carried out as a screening test immediately prior to re-entry. The implications of any detected anomalies in the measured temperature distribution will be evaluated for their significance in relation to the performance of the TPS during re-entry. Such a robust TPS health monitoring system would ensure overall crew safety throughout the mission, especially during re-entry.
Robust and Accurate Anomaly Detection in ECG Artifacts Using Time Series Motif Discovery
Sivaraks, Haemwaan
2015-01-01
Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats which helps identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because these methods are not able to differentiate ECG artifacts from the real ECG signal, especially ECG artifacts that are similar to ECG signals in terms of shape and/or frequency. The problem leads to high vigilance for physicians and misinterpretation risk for nonspecialists. Therefore, this work proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts, which can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design. In addition, every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multi-lead ECGs. Our experimental results on real ECG datasets are interpreted and evaluated by cardiologists. Our proposed algorithm can mostly achieve 100% accuracy of detection (AoD), sensitivity, specificity, and positive predictive value with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts, compared with competitive anomaly detection methods. PMID:25688284
Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.
2012-01-01
Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ∼1 mm2 areas and ≥2% in ∼20 mm2 areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified. PMID:22894421
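A simplified version of the pixel-intensity-deviation (PID) strategy described above is sketched below: subtract a registered reference image, median-filter the percentage deviations to suppress noise, and threshold. The synthetic images, 3x3 filter, and 2% threshold are assumptions for illustration, not the paper's exact parameters.

```python
# Minimal sketch of PID-based detection: percent deviation from a registered reference
# image, median filtering to suppress background noise, and a percentage threshold.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(8)
reference = 100.0 + rng.normal(0.0, 0.5, size=(256, 256))    # reference EPID image (synthetic)
repeat = reference + rng.normal(0.0, 0.5, size=(256, 256))   # repeat acquisition (synthetic)
repeat[100:105, 100:105] *= 1.05                             # inserted 5% fluence anomaly

pid = 100.0 * (repeat - reference) / reference               # percent pixel intensity deviation
pid_smooth = median_filter(pid, size=3)                      # suppress isolated noise spikes
detected = np.abs(pid_smooth) > 2.0                          # 2% detection threshold
print("anomalous pixels found:", int(detected.sum()))
```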
Implementing Classification on a Munitions Response Project
2011-12-01
Only briefing-slide fragments are indexed for this record; slide topics include the detection dig list, IVS/seed, site planning decisions, dig-all-anomalies, and site characterization. Details cover seed emplacement, an EM61-MK2 detection survey with RTK GPS, selection of anomalies for further investigation, and collection of cued data using the MetalMapper; with a threshold of 5.2 mV in channel 2, 938 anomalies were selected and all QC seeds were detected using this threshold, some just inside the 60-cm halo, along with IVS reproducibility checks.
Fuzzy Kernel k-Medoids algorithm for anomaly detection problems
NASA Astrophysics Data System (ADS)
Rustam, Z.; Talita, A. S.
2017-07-01
An Intrusion Detection System (IDS) is an essential part of security systems to strengthen the security of information systems. IDS can be used to detect abuse by intruders who try to get into the network system in order to access and utilize the available data sources in the system. There are two approaches to IDS, Misuse Detection and Anomaly Detection (behavior-based intrusion detection). Fuzzy clustering-based methods have been widely used to solve Anomaly Detection problems. Besides using the fuzzy membership concept to assign objects to clusters, other approaches, such as combining fuzzy and possibilistic memberships or feature-weighted methods, are also used. We propose Fuzzy Kernel k-Medoids, which combines fuzzy and possibilistic membership, as a powerful method for solving the anomaly detection problem, since in numerical experiments it is able to classify IDS benchmark data into five different classes simultaneously. We classify the KDDCup'99 IDS benchmark data set into five different classes simultaneously; the best performance was achieved using 30% of the data for training, with clustering accuracy reaching 90.28 percent.
Oztarhan, Kazim; Gedikbasi, Ali; Yildirim, Dogukan; Arslan, Oguz; Adal, Erdal; Kavuncuoglu, Sultan; Ozbek, Sibel; Ceylan, Yavuz
2010-12-01
The aim of this study was to determine the distribution of cases associated with congenital abnormalities during the following three periods: pregnancy, birth, and the neonatal period. This was a retrospective study of cases between 2002 and 2006. All abnormal pregnancies, elective terminations of pregnancies, stillbirths, and births with congenital abnormalities managed in the Neonatology Unit were classified based on the above distribution scheme. During the 5-year study period, 1906 cases with congenital abnormalities were recruited, as follows: 640 prenatally detected and terminated cases, with most abnormalities related to the central nervous system, chromosomes, and urogenital system (56.7%, 12.7%, and 8.9%, respectively); 712 neonates with congenital abnormalities (congenital heart disease [49.2%], central nervous system abnormalities [14.7%], and urogenital system abnormalities [12.9%]); and hospital stillbirths, of which 34.2% had malformations (220 prenatal cases [34.4%] had multiple abnormalities, whereas 188 liveborn cases [26.4%] had multiple abnormalities). The congenital abnormalities rate between 2002 and 2006 was 2.07%. Systematic screening for fetal anomalies is the primary means for identification of affected pregnancies. © 2010 The Authors. Congenital Anomalies © 2010 Japanese Teratology Society.
Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity.
Napoletano, Paolo; Piccoli, Flavio; Schettini, Raimondo
2018-01-12
Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM) imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs) and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art.
NASA Astrophysics Data System (ADS)
Li, Duo; Liu, Yajing
2017-04-01
Along-strike segmentation of slow-slip events (SSEs) and nonvolcanic tremors in Cascadia may reflect heterogeneities of the subducting slab or overlying continental lithosphere. However, the nature behind this segmentation is not fully understood. We develop a 3-D model for episodic SSEs in northern and central Cascadia, incorporating both seismological and gravitational observations to constrain the heterogeneities in the megathrust fault properties. Six years of automatically detected tremors are used to constrain the rate-state friction parameters. The effective normal stress at SSE depths is constrained by along-margin free-air and Bouguer gravity anomalies. The along-strike variation in the long-term plate convergence rate is also taken into consideration. Simulation results show five segments of ~Mw 6.0 SSEs spontaneously appearing along strike, correlated with the distribution of tremor epicenters. Modeled SSE recurrence intervals are equally comparable to GPS observations using both types of gravity anomaly constraints. However, the model constrained by the free-air anomaly does a better job of reproducing the cumulative slip as well as surface displacements more consistent with GPS observations. The modeled along-strike segmentation represents the averaged slip release over many SSE cycles, rather than permanent barriers. Individual slow-slip events can still propagate across the boundaries, which may cause interactions between adjacent SSEs, as observed in time-dependent GPS inversions. In addition, the moment-duration scaling is sensitive to the selection of the velocity criteria for determining when SSEs occur. Hence, the detection ability of the current GPS network should be considered in the interpretation of slow earthquake source parameter scaling relations.
Modeling And Detecting Anomalies In Scada Systems
NASA Astrophysics Data System (ADS)
Svendsen, Nils; Wolthusen, Stephen
The detection of attacks and intrusions based on anomalies is hampered by the limits of specificity underlying the detection techniques. However, in the case of many critical infrastructure systems, domain-specific knowledge and models can impose constraints that potentially reduce error rates. At the same time, attackers can use their knowledge of system behavior to mask their manipulations, causing adverse effects to be observed only after a significant period of time. This paper describes elementary statistical techniques that can be applied to detect anomalies in critical infrastructure networks. A SCADA system employed in liquefied natural gas (LNG) production is used as a case study.
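As one example of the elementary statistical techniques the paper refers to, the sketch below runs a one-sided CUSUM control chart on a single SCADA process variable to catch a slow, manipulation-like drift; the variable, baseline window, and CUSUM parameters are illustrative assumptions rather than the paper's case-study settings.

```python
# Minimal sketch of an elementary statistical check for SCADA data: a one-sided CUSUM
# chart on a single process variable, with synthetic readings and an assumed slow drift.
import numpy as np

rng = np.random.default_rng(9)
flow = rng.normal(50.0, 1.0, size=500)                      # normal LNG flow readings (synthetic)
flow[300:] += np.linspace(0.0, 3.0, 200)                    # slow manipulation-like drift

mu, sigma = flow[:200].mean(), flow[:200].std(ddof=1)       # statistics from a trusted baseline window
k, h = 0.5 * sigma, 5.0 * sigma                             # reference value and decision interval
cusum = 0.0
for i, x in enumerate(flow):
    cusum = max(0.0, cusum + (x - mu - k))                  # one-sided (upward) CUSUM
    if cusum > h:
        print("drift alarm at sample", i)
        break
```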
First and second trimester screening for fetal structural anomalies.
Edwards, Lindsay; Hui, Lisa
2018-04-01
Fetal structural anomalies are found in up to 3% of all pregnancies and ultrasound-based screening has been an integral part of routine prenatal care for decades. The prenatal detection of fetal anomalies allows for optimal perinatal management, providing expectant parents with opportunities for additional imaging, genetic testing, and the provision of information regarding prognosis and management options. Approximately one-half of all major structural anomalies can now be detected in the first trimester, including acrania/anencephaly, abdominal wall defects, holoprosencephaly and cystic hygromata. Due to the ongoing development of some organ systems, however, some anomalies will not be evident until later in the pregnancy. For this reason, the second trimester anatomy scan is recommended by professional societies as the standard investigation for the detection of fetal structural anomalies. The reported detection rates of structural anomalies vary according to the organ system being examined, and are also dependent upon factors such as the equipment settings and sonographer experience. Technological advances over the past two decades continue to support the role of ultrasound as the primary imaging modality in pregnancy, and the safety of ultrasound for the developing fetus is well established. With increasing capabilities and experience, detailed examination of the central nervous system and cardiovascular system is possible, with dedicated examinations such as the fetal neurosonogram and the fetal echocardiogram now widely performed in tertiary centers. Magnetic resonance imaging (MRI) is well recognized for its role in the assessment of fetal brain anomalies; other potential indications for fetal MRI include lung volume measurement (in cases of congenital diaphragmatic hernia) and pre-surgical planning prior to fetal spina bifida repair. When a major structural abnormality is detected prenatally, genetic testing with chromosomal microarray is recommended over routine karyotype due to its higher genomic resolution. Copyright © 2017 Elsevier Ltd. All rights reserved.
Climate anomalies associated with the occurrence of rockfalls at high-elevation in the Italian Alps
NASA Astrophysics Data System (ADS)
Paranunzio, Roberta; Laio, Francesco; Chiarle, Marta; Nigrelli, Guido; Guzzetti, Fausto
2016-09-01
Climate change is seriously affecting the cryosphere in terms, for example, of permafrost thaw, alteration of rain / snow ratio, and glacier shrinkage. There is concern about the increasing number of rockfalls at high elevation in the last decades. Nevertheless, the exact role of climate parameters in slope instability at high elevation has not been fully explored yet. In this paper, we investigate 41 rockfalls listed in different sources (newspapers, technical reports, and CNR IRPI archive) in the elevation range 1500-4200 m a.s.l. in the Italian Alps between 1997 and 2013 in the absence of an evident trigger. We apply and improve an existing data-based statistical approach to detect the anomalies of climate parameters (temperature and precipitation) associated with rockfall occurrences. The identified climate anomalies have been related to the spatiotemporal distribution of the events. Rockfalls occurred in association with significant temperature anomalies in 83 % of our case studies. Temperature represents a key factor contributing to slope failure occurrence in different ways. As expected, warm temperatures accelerate snowmelt and permafrost thaw; however, surprisingly, negative anomalies are also often associated with slope failures. Interestingly, different regional patterns emerge from the data: higher-than-average temperatures are often associated with rockfalls in the Western Alps, while in the Eastern Alps slope failures are mainly associated with colder-than-average temperatures.
NASA Astrophysics Data System (ADS)
Qin, Qiming; Zhang, Ning; Nan, Peng; Chai, Leilei
2011-08-01
Thermal infrared (TIR) remote sensing is an important technique in the exploration of geothermal resources. In this study, a geothermal survey is conducted in the Tengchong area of Yunnan province, China, using TIR data from the Landsat-7 Enhanced Thematic Mapper Plus (ETM+) sensor. Based on radiometric calibration, atmospheric correction and emissivity calculation, a simple but efficient single-channel algorithm with acceptable precision is applied to retrieve the land surface temperature (LST) of the study area. Anomalous LST areas with temperatures about 4-10 K higher than the background are identified. Four geothermal areas are delineated through discussion of the geothermal mechanism and further analysis of the regional geologic structure. The research reveals that the distribution of geothermal areas is consistent with fault development in the study area. Magmatism contributes an abundant heat source to the study area, and the faults provide channels for heat transfer from the Earth's interior to the land surface, facilitating the presence of geothermal anomalies. Finally, we conclude that TIR remote sensing is a cost-effective technique to detect LST anomalies. Combining TIR remote sensing with geological analysis and an understanding of the geothermal mechanism is an accurate and efficient approach to geothermal area detection.
Scan statistics with local vote for target detection in distributed system
NASA Astrophysics Data System (ADS)
Luo, Junhai; Wu, Qi
2017-12-01
Target detection occupies a pivotal position in distributed systems. Scan statistics, one of the most efficient detection methods, has been applied to a variety of anomaly detection problems and significantly improves the probability of detection. However, scan statistics cannot achieve the expected performance when the noise intensity is strong or the signal emitted by the target is weak. The local vote algorithm can also achieve a high target detection rate. After the local vote, the counting rule is usually adopted for decision fusion. The counting rule does not use information about the contiguity of sensors but takes all sensors' data into consideration, which makes the results undesirable. In this paper, we propose a scan statistics with local vote (SSLV) method. This method combines scan statistics with a local vote decision. Before the scan statistics, each sensor executes a local vote decision according to its own data and that of its neighbors. By combining the advantages of both, our method can obtain a higher detection rate in low signal-to-noise ratio environments than scan statistics alone. After the local vote decision, the distribution of sensors which have detected the target becomes more concentrated. To make full use of the local vote decision, we introduce a variable step parameter for the SSLV. It significantly shortens the scan period, especially when the target is absent. Analysis and simulations are presented to demonstrate the performance of our method.
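As a rough illustration of the two stages described above (a neighbourhood majority vote at each sensor followed by a scanning-window count), the following sketch uses a 1-D line of sensors with assumed parameters for the vote radius, vote threshold, and window length; it is a simplified toy, not the authors' SSLV implementation or its variable-step-parameter version.

```python
import numpy as np

def local_vote(decisions, radius=1, vote_threshold=2):
    """Each sensor revises its binary decision by majority vote
    over itself and its neighbours within `radius`."""
    n = len(decisions)
    revised = np.zeros(n, dtype=int)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        revised[i] = int(decisions[lo:hi].sum() >= vote_threshold)
    return revised

def scan_statistic(decisions, window=5):
    """Maximum number of positive decisions in any contiguous window."""
    return max(decisions[i:i + window].sum()
               for i in range(len(decisions) - window + 1))

# Toy example: 50 sensors, a weak target around sensors 20-25, noisy decisions.
rng = np.random.default_rng(0)
raw = (rng.random(50) < 0.15).astype(int)        # false alarms
raw[20:26] |= (rng.random(6) < 0.6).astype(int)  # target region

voted = local_vote(raw, radius=1, vote_threshold=2)
print("scan statistic (raw):  ", scan_statistic(raw))
print("scan statistic (voted):", scan_statistic(voted))
# A detection is declared when the scan statistic exceeds a preset threshold.
```

The local vote suppresses isolated false alarms while reinforcing contiguous detections, so the subsequent scan statistic separates the target region from noise more cleanly.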
A hybrid approach for efficient anomaly detection using metaheuristic methods
Ghanem, Tamer F.; Elkilani, Wail S.; Abdul-kader, Hatem M.
2014-01-01
Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large scale datasets using detectors generated based on the multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative selection-based detector generation. The evaluation of this approach is performed using the NSL-KDD dataset, which is a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1% compared to other competing machine learning algorithms. PMID:26199752
Identifying Threats Using Graph-based Anomaly Detection
NASA Astrophysics Data System (ADS)
Eberle, William; Holder, Lawrence; Cook, Diane
Much of the data collected during the monitoring of cyber and other infrastructures is structural in nature, consisting of various types of entities and relationships between them. The detection of threatening anomalies in such data is crucial to protecting these infrastructures. We present an approach to detecting anomalies in a graph-based representation of such data that explicitly represents these entities and relationships. The approach consists of first finding normative patterns in the data using graph-based data mining and then searching for small, unexpected deviations to these normative patterns, assuming illicit behavior tries to mimic legitimate, normative behavior. The approach is evaluated using several synthetic and real-world datasets. Results show that the approach has high true-positive rates, low false-positive rates, and is capable of detecting complex structural anomalies in real-world domains including email communications, cellphone calls and network traffic.
NASA Astrophysics Data System (ADS)
Akhoondzadeh, M.
2014-02-01
A powerful earthquake of Mw = 7.7 struck the Saravan region (28.107° N, 62.053° E) in Iran on 16 April 2013. To date, the selection of an automated anomaly detection method for nonlinear earthquake-precursor time series has remained an attractive and challenging task. Artificial Neural Networks (ANN) and Particle Swarm Optimization (PSO) have shown strong potential for accurate time series prediction. This paper presents the first study integrating the ANN and PSO methods in earthquake precursor research to detect the unusual variations of the thermal and total electron content (TEC) seismo-ionospheric anomalies induced by the strong Saravan earthquake. In this study, to overcome stagnation in local minima during ANN training, PSO is used as the optimization method instead of traditional algorithms for training the ANN. The proposed hybrid method detected a considerable number of anomalies 4 and 8 days preceding the earthquake. Since, in this case study, ionospheric TEC anomalies induced by seismic activity are confounded with background fluctuations due to solar activity, a multi-resolution time series processing technique based on the wavelet transform has been applied to the TEC signal variations. Because agreement among the final results of several robust methods is a convincing indication of a method's efficiency, the thermal and TEC anomalies detected using the ANN + PSO method were compared with the anomalies observed by applying the mean, median, wavelet, Kalman filter, Auto-Regressive Integrated Moving Average (ARIMA), Support Vector Machine (SVM) and Genetic Algorithm (GA) methods. The results indicate that the ANN + PSO method is quite promising and deserves serious attention as a new tool for detecting thermal and TEC seismo-ionospheric anomalies.
A Stochastic-entropic Approach to Detect Persistent Low-temperature Volcanogenic Thermal Anomalies
NASA Astrophysics Data System (ADS)
Pieri, D. C.; Baxter, S.
2011-12-01
Eruption prediction is a chancy, idiosyncratic affair, as volcanoes often manifest waxing and/or waning pre-eruption emission, geodetic, and seismic behavior that is unsystematic. Thus, fundamental to increased prediction accuracy and precision are good and frequent assessments of the time-series behavior of relevant precursor geophysical, geochemical, and geological phenomena, especially when volcanoes become restless. The Advanced Spaceborne Thermal Emission and Reflection radiometer (ASTER), in orbit since 1999 on the NASA Terra Earth Observing System satellite, is an important capability for detection of thermal eruption precursors (even subtle ones) and increased passive gas emissions. The combination of ASTER high spatial resolution multi-spectral thermal IR imaging data (90 m/pixel; 5 bands in the 8-12 um region) with simultaneous visible and near-IR imaging data and stereo-photogrammetric capabilities makes it a useful precursor detection tool, especially for thermal precursors. The JPL ASTER Volcano Archive, consisting of 80,000+ ASTER volcano images, allows systematic analysis of (a) baseline thermal emissions for 1550+ volcanoes, (b) important aspects of the time-dependent thermal variability, and (c) the limits of detection of temporal dynamics of eruption precursors. We are analyzing a catalog of the magnitude, frequency, and distribution of ASTER-documented volcano thermal signatures, compiled from 2000 onward, at 90 m/pixel. Low contrast thermal anomalies of relatively low apparent absolute temperature (e.g., summit lakes, fumarolically altered areas, geysers, very small sub-pixel hotspots), for which the signal-to-noise ratio may be marginal (e.g., scene confusion due to clouds, water and water vapor, fumarolic emissions, variegated ground emissivity, and their combinations), are particularly important to discern and monitor. We have developed a technique to detect persistent hotspots that takes into account in-scene observed pixel joint frequency distributions over time, temperature contrast, and Shannon entropy. Preliminary analyses of Fogo Volcano and Yellowstone hotspots, among others, indicate that this is a very sensitive technique with good potential to be applied over the entire ASTER global night-time archive. We will discuss our progress in creating the global thermal anomaly catalog as well as the algorithm approach and results. This work was carried out at the Jet Propulsion Laboratory of the California Institute of Technology under contract to NASA.
Real-time Bayesian anomaly detection in streaming environmental data
NASA Astrophysics Data System (ADS)
Hill, David J.; Minsker, Barbara S.; Amir, Eyal
2009-04-01
With large volumes of data arriving in near real time from environmental sensors, there is a need for automated detection of anomalous data caused by sensor or transmission errors or by infrequent system behaviors. This study develops and evaluates three automated anomaly detection methods using dynamic Bayesian networks (DBNs), which perform fast, incremental evaluation of data as they become available, scale to large quantities of data, and require no a priori information regarding process variables or types of anomalies that may be encountered. This study investigates these methods' abilities to identify anomalies in eight meteorological data streams from Corpus Christi, Texas. The results indicate that DBN-based detectors, using either robust Kalman filtering or Rao-Blackwellized particle filtering, outperform a DBN-based detector using Kalman filtering, with the former having false positive/negative rates of less than 2%. These methods were successful at identifying data anomalies caused by two real events: a sensor failure and a large storm.
Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach
NASA Astrophysics Data System (ADS)
Koeppen, W. C.; Pilger, E.; Wright, R.
2011-07-01
We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
Statistical Analysis of TEC Anomalies Prior to M6.0+ Earthquakes During 2003-2014
NASA Astrophysics Data System (ADS)
Zhu, Fuying; Su, Fanfan; Lin, Jian
2018-04-01
There are many studies on the anomalous variations of the ionospheric TEC prior to large earthquakes. However, whether or not the morphological characteristics of the TEC anomalies in the daytime and at night are different is rarely studied. In the present paper, based on the total electron content (TEC) data from the global ionosphere map (GIM), we carry out a statistical survey on the spatial-temporal distribution of TEC anomalies before 1339 global M6.0+ earthquakes during 2003-2014. After excluding the interference of geomagnetic disturbance, the temporal and spatial distributions of ionospheric TEC anomalies prior to the earthquakes in the daytime and at night are investigated and compared. Except that the nighttime occurrence rates of the pre-earthquake ionospheric anomalies (PEIAs) are higher than those in the daytime, our analysis has not found any statistically significant difference in the spatial-temporal distribution of PEIAs in the daytime and at night. Moreover, the occurrence rates of pre-earthquake ionospheric TEC both positive anomalies and negative anomalies at night tend to increase slightly with the earthquake magnitude. Thus, we suggest that monitoring the ionospheric TEC changes at night might be a clue to reveal the relation between ionospheric disturbances and seismic activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brett Emery Trabun; Gamage, Thoshitha Thanushka; Bakken, David Edward
This disclosure describes, in part, a system management component and failure detection component for use in a power grid data network to identify anomalies within the network and systematically adjust the quality of service of data published by publishers and subscribed to by subscribers within the network. In one implementation, subscribers may identify a desired data rate, a minimum acceptable data rate, desired latency, minimum acceptable latency and a priority for each subscription. The failure detection component may identify an anomaly within the network and a source of the anomaly. Based on the identified anomaly, data rates and/or data paths may be adjusted in real time to ensure that the power grid data network does not become overloaded and/or fail.
Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity
Schettini, Raimondo
2018-01-01
Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM) imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs) and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art. PMID:29329268
Kaasen, Anne; Helbig, Anne; Malt, Ulrik Fredrik; Naes, Tormod; Skari, Hans; Haugen, Guttorm Nils
2013-07-12
In Norway almost all pregnant women attend one routine ultrasound examination. Detection of fetal structural anomalies triggers psychological stress responses in the women affected. Despite the frequent use of ultrasound examination in pregnancy, little attention has been devoted to the psychological response of the expectant father following the detection of fetal anomalies. This is important for later fatherhood and the psychological interaction within the couple. We aimed to describe paternal psychological responses shortly after detection of structural fetal anomalies by ultrasonography, and to compare paternal and maternal responses within the same couple. A prospective observational study was performed at a tertiary referral centre for fetal medicine. Pregnant women with a structural fetal anomaly detected by ultrasound and their partners (study group, n = 155) and 100 women with normal ultrasound findings (comparison group) were included shortly after the sonographic examination (inclusion period: May 2006-February 2009). Gestational age was >12 weeks. We used psychometric questionnaires to assess self-reported social dysfunction, health perception, and psychological distress (intrusion, avoidance, arousal, anxiety, and depression): the Impact of Event Scale, the General Health Questionnaire, and the Edinburgh Postnatal Depression Scale. Fetal anomalies were classified according to severity and diagnostic or prognostic ambiguity at the time of assessment. Median (range) gestational age at inclusion in the study and comparison group was 19 (12-38) and 19 (13-22) weeks, respectively. Men and women in the study group had significantly higher levels of psychological distress than men and women in the comparison group on all psychometric endpoints. The lowest level of distress in the study group was associated with the least severe anomalies with no diagnostic or prognostic ambiguity (p < 0.033). Men had lower scores than women on all psychometric outcome variables. The correlation in distress scores between men and women was high in the fetal anomaly group (p < 0.001), but non-significant in the comparison group. Severity of the anomaly, including ambiguity, significantly influenced the paternal response. Men reported lower scores on all psychometric outcomes than women. This knowledge may facilitate support for both expectant parents to reduce strain within the family after detection of a fetal anomaly.
Apollo-Soyuz pamphlet no. 4: Gravitational field. [experimental design
NASA Technical Reports Server (NTRS)
Page, L. W.; From, T. P.
1977-01-01
Two Apollo Soyuz experiments designed to detect gravity anomalies from spacecraft motion are described. The geodynamics experiment (MA-128) measured large-scale gravity anomalies by detecting small accelerations of Apollo in the 222 km orbit, using Doppler tracking from the ATS-6 satellite. Experiment MA-089 measured 300 km anomalies on the earth's surface by detecting minute changes in the separation between Apollo and the docking module. Topics discussed in relation to these experiments include the Doppler effect, gravimeters, and the discovery of mascons on the moon.
Radon anomalies: When are they possible to be detected?
NASA Astrophysics Data System (ADS)
Passarelli, Luigi; Woith, Heiko; Seyis, Cemil; Nikkhoo, Mehdi; Donner, Reik
2017-04-01
Records of the radon noble gas in different environments like soil, air, groundwater, rock, caves, and tunnels typically display cyclic variations, including diurnal (S1), semidiurnal (S2) and seasonal components, but there are also cases where these cycles are absent. Interestingly, radon emission can also be affected by transient processes, which inhibit or enhance the radon carrying process at the surface. This results in transient changes in the radon emission rate, which are superimposed on the low- and high-frequency cycles. The complexity of the spectral content of the radon time series makes any statistical analysis aimed at understanding the physical driving processes a challenging task. In the past decades there have been several attempts to relate changes in radon emission rate to physical triggering processes such as earthquake occurrence. One of the problems in this type of investigation is to objectively detect anomalies in the radon time series. In the present work, we propose a simple and objective statistical method for detecting changes in the radon emission rate time series. The method uses non-parametric statistical tests (e.g., Kolmogorov-Smirnov) to compare empirical distributions of the radon emission rate by sequentially applying various time windows to the time series. The statistical test indicates whether two empirical distributions of data originate from the same distribution at a desired significance level. We test the algorithm on synthetic data in order to explore the sensitivity of the statistical test to the sample size. We then apply the test to six radon emission rate recordings from stations located around the Marmara Sea obtained within the MARsite project (MARsite has received funding from the European Union's Seventh Programme for research, technological development and demonstration under grant agreement No 308417). We conclude that the test performs relatively well at identifying transient changes in the radon emission rate, but the results are strongly dependent on the length of the time window and/or the type of frequency filtering. More importantly, when raw time series contain cyclic components (e.g. seasonal or diurnal variations), the search for anomalies related to transients becomes meaningless. We conclude that an objective identification of transient changes can be performed only after filtering the raw time series for the physically meaningful frequency content.
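The window-based, non-parametric comparison described above can be sketched with the two-sample Kolmogorov-Smirnov test from SciPy. The window length, significance level, and the synthetic emission-rate series with an injected transient are illustrative assumptions, not values from the MARsite recordings.

```python
import numpy as np
from scipy.stats import ks_2samp

def ks_change_points(series, window=100, alpha=0.01):
    """Compare adjacent windows of the series with a two-sample KS test
    and return the indices where the two empirical distributions differ."""
    flags = []
    for start in range(0, len(series) - 2 * window, window):
        a = series[start:start + window]
        b = series[start + window:start + 2 * window]
        stat, p_value = ks_2samp(a, b)
        if p_value < alpha:
            flags.append(start + window)  # boundary between the two windows
    return flags

# Synthetic radon-like series: stationary noise with a transient shift.
rng = np.random.default_rng(1)
series = rng.normal(100.0, 5.0, 1000)
series[600:700] += 25.0  # assumed transient increase in emission rate

print("candidate change points:", ks_change_points(series, window=100))
```

As the abstract notes, the outcome of such a test depends strongly on the chosen window length and on any prior filtering of cyclic components.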
Thermal wake/vessel detection technique
Roskovensky, John K [Albuquerque, NM]; Nandy, Prabal [Albuquerque, NM]; Post, Brian N [Albuquerque, NM]
2012-01-10
A computer-automated method for detecting a vessel in water based on an image of a portion of Earth includes generating a thermal anomaly mask. The thermal anomaly mask flags each pixel of the image initially deemed to be a wake pixel based on a comparison of a thermal value of each pixel against other thermal values of other pixels localized about each pixel. Contiguous pixels flagged by the thermal anomaly mask are grouped into pixel clusters. A shape of each of the pixel clusters is analyzed to determine whether each of the pixel clusters represents a possible vessel detection event. The possible vessel detection events are represented visually within the image.
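The two operations described in this patent abstract, flagging pixels whose thermal value stands out against a local neighbourhood and then grouping contiguous flagged pixels into clusters for shape analysis, can be sketched as follows. The neighbourhood size, contrast threshold, minimum cluster size, and synthetic scene are illustrative assumptions rather than the patented algorithm's actual parameters.

```python
import numpy as np
from scipy import ndimage

def thermal_anomaly_mask(image, size=15, contrast=2.0):
    """Flag pixels whose thermal value deviates from the mean of a
    size x size neighbourhood by more than `contrast`."""
    local_mean = ndimage.uniform_filter(image.astype(float), size=size)
    return np.abs(image - local_mean) > contrast

def cluster_candidates(mask, min_pixels=5):
    """Group contiguous flagged pixels and keep clusters large enough
    to represent a possible wake/vessel detection event."""
    labels, n = ndimage.label(mask)
    clusters = []
    for k in range(1, n + 1):
        ys, xs = np.nonzero(labels == k)
        if len(ys) >= min_pixels:
            clusters.append({"pixels": len(ys),
                             "bbox": (ys.min(), xs.min(), ys.max(), xs.max())})
    return clusters

# Synthetic thermal scene: sea background plus a thermally distinct wake.
rng = np.random.default_rng(2)
sea = rng.normal(290.0, 0.5, (128, 128))
sea[60:64, 30:90] -= 4.0  # assumed wake signature

mask = thermal_anomaly_mask(sea)
print(cluster_candidates(mask))
```

The shape of each reported bounding box (e.g. elongation) is what a subsequent step would analyze to decide whether the cluster represents a possible vessel detection event.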
Integrated System Health Management: Foundational Concepts, Approach, and Implementation.
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Schmalzel, John; Walker, Mark; Venkatesh, Meera; Kapadia, Ravi; Morris, Jon; Turowski, Mark; Smith, Harvey
2009-01-01
Implementation of integrated system health management (ISHM) capability is fundamentally linked to the management of data, information, and knowledge (DIaK) with the purposeful objective of determining the health of a system. It is akin to having a team of experts who are all individually and collectively observing and analyzing a complex system, and communicating effectively with each other in order to arrive at an accurate and reliable assessment of its health. We present concepts, procedures, and a specific approach as a foundation for implementing a credible ISHM capability. The capability stresses integration of DIaK from all elements of a system. The intent is also to make possible implementation of on-board ISHM capability, in contrast to a remote capability. The information presented is the result of many years of research, development, and maturation of technologies, and of prototype implementations in operational systems (rocket engine test facilities). The paper addresses the following topics: (1) the ISHM model of a system; (2) detection of anomaly indicators; (3) determination and confirmation of anomalies; (4) diagnosis of causes and determination of effects; (5) the consistency checking cycle; (6) management of health information; (7) user interfaces; and (8) an example implementation. ISHM has been defined from many perspectives. We define it as a capability that might be achieved by various approaches. We describe a specific approach that has been matured throughout many years of development and pilot implementations. ISHM is a capability that is achieved by integrating data, information, and knowledge (DIaK) that might be distributed throughout the system elements (which inherently implies the capability to manage DIaK associated with distributed sub-systems). DIaK must be available to any element of a system at the right time and in accordance with a meaningful context. The ISHM Functional Capability Level (FCL) is measured by how well a system performs the following functions: (1) detect anomalies, (2) diagnose causes, (3) predict future anomalies/failures, and (4) provide the user with an integrated awareness about the condition of every element in the system and guide user decisions.
Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences
NASA Technical Reports Server (NTRS)
Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene
2006-01-01
This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
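For reference, the similarity measure at the core of the clustering step is the normalized longest common subsequence; the sketch below shows the textbook O(mn) dynamic program rather than the faster hybrid algorithm the paper develops, and the normalization by the longer sequence length is one common convention assumed here.

```python
def lcs_length(a, b):
    """Classic O(len(a)*len(b)) dynamic program for the LCS length."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

def normalized_lcs(a, b):
    """Similarity in [0, 1]: LCS length normalized by the longer sequence."""
    if not a or not b:
        return 0.0
    return lcs_length(a, b) / max(len(a), len(b))

# Symbol sequences (e.g. discretized categorical parameter values).
print(normalized_lcs("ABCBDAB", "BDCABA"))   # LCS length 4 -> 4/7
```

Sequences whose normalized LCS similarity to every cluster is low are the outliers passed on to the detailed anomaly analysis.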
Anomaly Monitoring Method for Key Components of Satellite
Fan, Linjun; Xiao, Weidong; Tang, Jun
2014-01-01
This paper presents a fault diagnosis method for key components of a satellite, called the Anomaly Monitoring Method (AMM), which is made up of state estimation based on Multivariate State Estimation Techniques (MSET) and anomaly detection based on the Sequential Probability Ratio Test (SPRT). On the basis of failure analysis of lithium-ion batteries (LIBs), we divided the failures of LIBs into internal failure, external failure, and thermal runaway, and selected the electrolyte resistance (Re) and the charge transfer resistance (Rct) as the key parameters for state estimation. Then, using the actual in-orbit telemetry data of the key parameters of LIBs, we obtained the actual residual value (RX) and healthy residual value (RL) of LIBs based on the state estimation of MSET, and from these residual values (RX and RL) we detected anomalous states based on the anomaly detection of SPRT. Lastly, we conducted an example of AMM for LIBs and, according to the results, validated the feasibility and effectiveness of AMM by comparing it with the results of the threshold detection method (TDM). PMID:24587703
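The SPRT stage can be illustrated with a short sketch: Wald's sequential log-likelihood-ratio test between a healthy-residual hypothesis and a shifted-mean fault hypothesis. The Gaussian residual model, the assumed mean shift, and the error rates below are illustrative assumptions, not the parameters used for the LIB residuals in the paper.

```python
import numpy as np

def sprt(residuals, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test on a residual stream.
    H0: residuals ~ N(mu0, sigma^2) (healthy); H1: N(mu1, sigma^2) (faulty)."""
    upper = np.log((1 - beta) / alpha)   # accept H1 (anomaly) above this
    lower = np.log(beta / (1 - alpha))   # accept H0 (healthy) below this
    llr = 0.0
    for k, r in enumerate(residuals):
        # Gaussian log-likelihood ratio increment for a mean shift.
        llr += (mu1 - mu0) * (r - (mu0 + mu1) / 2.0) / sigma**2
        if llr >= upper:
            return "anomaly", k
        if llr <= lower:
            return "healthy", k
    return "undecided", len(residuals) - 1

# Residuals between estimated and observed key parameters (synthetic).
rng = np.random.default_rng(3)
healthy = rng.normal(0.0, 1.0, 200)
faulty = rng.normal(1.2, 1.0, 200)   # assumed mean shift after a failure onset
print(sprt(healthy))
print(sprt(faulty))
```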
Anomaly Detection Using Optimally-Placed μPMU Sensors in Distribution Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jamei, Mahdi; Scaglione, Anna; Roberts, Ciaran
As the distribution grid moves toward a tightly monitored network, it is important to automate the analysis of the enormous amount of data produced by the sensors to increase the operators' situational awareness about the system. Here, focusing on Micro-Phasor Measurement Unit (μPMU) data, we propose a hierarchical architecture for monitoring the grid and establish a set of analytics and sensor fusion primitives for the detection of abnormal behavior in the control perimeter. Due to the key role of the μPMU devices in our architecture, a source-constrained optimal μPMU placement is also described that finds the best location of the devices with respect to our rules. The effectiveness of the proposed methods is tested through synthetic and real μPMU data.
A rapid local singularity analysis algorithm with applications
NASA Astrophysics Data System (ADS)
Chen, Zhijun; Cheng, Qiuming; Agterberg, Frits
2015-04-01
The local singularity model developed by Cheng is fast gaining popularity in characterizing mineralization and detecting anomalies in geochemical, geophysical and remote sensing data. However, one of the conventional algorithms, which involves computing moving-average values at different scales, is time-consuming, especially when analyzing a large dataset. The summed area table (SAT), also called an integral image, is a fast algorithm used within the Viola-Jones object detection framework in computer vision. Historically, the principle of the SAT is well known in the study of multi-dimensional probability distribution functions, namely in computing 2D (or ND) probabilities (area under the probability distribution) from the respective cumulative distribution functions. In this study we introduce the SAT and its variation, the Rotated Summed Area Table, into isotropic, anisotropic or directional local singularity mapping. Once the SAT has been computed, any rectangular sum can be obtained at any scale or location in constant time. The sum over any rectangular region in the image can be computed using only 4 array accesses, independently of the size of the region, effectively reducing the time complexity from O(n) to O(1). New programs in Python, Julia, Matlab and C++ have been implemented to serve different applications, especially big data analysis. Several large geochemical and remote sensing datasets are tested. A wide variety of scale changes (linear spacing or log spacing) for non-iterative or iterative approaches are adopted to calculate the singularity index values and compare the results. The results indicate that local singularity analysis with the SAT is more robust than, and superior to, the traditional approach in identifying anomalies.
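A minimal sketch of the summed-area-table idea follows: after one cumulative pass over the image, any rectangular window sum costs four array accesses, which is what turns the multi-scale window averaging in local singularity analysis into a constant-time lookup. The NumPy implementation and the lognormal test image below are illustrative, not the authors' released programs.

```python
import numpy as np

def summed_area_table(img):
    """SAT with a zero-padded first row/column so lookups need no branching."""
    sat = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=float)
    sat[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return sat

def rect_sum(sat, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] using exactly four accesses to the SAT."""
    return sat[r1, c1] - sat[r0, c1] - sat[r1, c0] + sat[r0, c0]

# Example: window means of increasing size around one location, the kind of
# multi-scale averages used when estimating a local singularity index.
rng = np.random.default_rng(4)
img = rng.lognormal(mean=0.0, sigma=1.0, size=(512, 512))
sat = summed_area_table(img)
r, c = 256, 256
for half in (2, 4, 8, 16, 32):
    s = rect_sum(sat, r - half, c - half, r + half, c + half)
    print(f"window {2*half}x{2*half}: mean = {s / (2*half)**2:.4f}")
```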
A DBN based anomaly targets detector for HSI
NASA Astrophysics Data System (ADS)
Ma, Ning; Wang, Shaojun; Yu, Jinxiang; Peng, Yu
2017-10-01
Due to the assumption that a hyperspectral image (HSI) should conform to a Gaussian distribution, traditional Mahalanobis distance-based anomaly target detectors perform poorly because the assumption may not always hold. In order to solve this problem, a deep learning based detector, the Deep Belief Network (DBN) anomaly detector (DBN-AD), is proposed to fit the unknown distribution of the HSI by energy modeling; the reconstruction errors of this encode-decode processing are used to discriminate the anomaly targets. Experiments are implemented on real and synthesized HSI datasets collected by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). Compared to classic anomaly detectors, the proposed method shows better performance: its Area Under the ROC Curve (AUC) is about 0.17 higher than those of the Reed-Xiaoli detector (RXD) and Kernel-RXD (K-RXD).
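For reference, the Mahalanobis-distance baseline the paper compares against (a global Reed-Xiaoli detector) can be sketched in a few lines of NumPy; the cube size, the injected anomalous spectrum, and the use of a pseudo-inverse for the covariance are illustrative assumptions rather than the paper's experimental setup.

```python
import numpy as np

def rx_detector(cube):
    """Global RX: Mahalanobis distance of every pixel spectrum from the
    scene mean, using the scene covariance. cube has shape (rows, cols, bands)."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.pinv(cov)           # pseudo-inverse for stability
    diff = X - mu
    scores = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return scores.reshape(rows, cols)

# Synthetic hyperspectral cube with one anomalous pixel.
rng = np.random.default_rng(5)
cube = rng.normal(0.0, 1.0, (64, 64, 50))
cube[32, 32, :] += 3.0                       # assumed anomalous spectrum
scores = rx_detector(cube)
print("max RX score at:", np.unravel_index(scores.argmax(), scores.shape))
```

When the background departs from the Gaussian assumption, these scores lose their calibration, which is the motivation the abstract gives for a learned, reconstruction-error-based detector.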
Detecting an atomic clock frequency anomaly using an adaptive Kalman filter algorithm
NASA Astrophysics Data System (ADS)
Song, Huijie; Dong, Shaowu; Wu, Wenjun; Jiang, Meng; Wang, Weixiong
2018-06-01
The abnormal frequencies of an atomic clock mainly include frequency jumps and frequency drift jumps. Atomic clock frequency anomaly detection is a key technique in time-keeping. The Kalman filter algorithm, as a linear optimal algorithm, has been widely used in the real-time detection of abnormal frequency. In order to obtain an optimal state estimate, the observation model and dynamic model of the Kalman filter algorithm should satisfy Gaussian white noise conditions. The detection performance is degraded if anomalies affect the observation model or dynamic model. The adaptive Kalman filter algorithm, applied to clock frequency anomaly detection, uses the residuals given by the prediction to build an adaptive factor; the predicted state covariance matrix is corrected in real time by this adaptive factor. The results show that the model error is reduced and the detection performance is improved. The effectiveness of the algorithm is verified with a frequency jump simulation, a frequency drift jump simulation and measured atomic clock data using the chi-square test.
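A scalar sketch of the residual-driven adaptation described above: a one-state Kalman filter tracks the clock frequency, and when the normalized innovation exceeds a gate the predicted covariance is inflated by an adaptive factor so the filter does not absorb the anomaly into its state estimate. The noise levels, the inflation rule, the gate, and the simulated jump are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def adaptive_kalman(z, q=1e-4, r=1e-2, gate=4.0):
    """One-state Kalman filter with innovation-based covariance inflation.
    Returns state estimates and a boolean anomaly flag for each sample."""
    x, p = z[0], 1.0
    estimates, flags = [], []
    for zk in z:
        # Prediction step (random-walk model for the clock frequency).
        x_pred, p_pred = x, p + q
        # Innovation (residual) and its variance.
        nu = zk - x_pred
        s = p_pred + r
        ratio = abs(nu) / np.sqrt(s)
        anomalous = ratio > gate
        if anomalous:
            # Adaptive factor: inflate the predicted covariance so the filter
            # trusts its model less while the measurements misbehave.
            p_pred *= (ratio / gate) ** 2
            s = p_pred + r
        # Update step.
        k = p_pred / s
        x = x_pred + k * nu
        p = (1.0 - k) * p_pred
        estimates.append(x)
        flags.append(anomalous)
    return np.array(estimates), np.array(flags)

# Synthetic frequency series with a jump at sample 300 (assumed values).
rng = np.random.default_rng(6)
freq = rng.normal(0.0, 0.1, 600)
freq[300:] += 1.0
_, flags = adaptive_kalman(freq, q=1e-4, r=0.01)
print("first flagged sample:", int(np.argmax(flags)))
```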
Method and apparatus for detecting external cracks from within a metal tube
Caffey, Thurlow W. H.
2001-08-07
A method and tool using a continuous electromagnetic wave from a transverse magnetic-dipole source with a coaxial electric-dipole receiver is described for the detection of external sidewall cracks and other anomalies in boiler tubes and other enclosures. The invention utilizes the concept of radar backscatter rather than eddy-currents or ultrasound, which are sometimes used in prior art crack-detection methods. A numerical study of the distribution of the fields shows that the direct transmission from the source to the receiver is reduced from that in free space. Further, if the diameter of the receiver dipole is made sufficiently small, it should be possible to detect cracks with a scattering loss of up to -40 dB in thin-walled boiler tubes.
Global Anomaly Detection in Two-Dimensional Symmetry-Protected Topological Phases
NASA Astrophysics Data System (ADS)
Bultinck, Nick; Vanhove, Robijn; Haegeman, Jutho; Verstraete, Frank
2018-04-01
Edge theories of symmetry-protected topological phases are well known to possess global symmetry anomalies. In this Letter we focus on two-dimensional bosonic phases protected by an on-site symmetry and analyze the corresponding edge anomalies in more detail. Physical interpretations of the anomaly in terms of an obstruction to orbifolding and constructing symmetry-preserving boundaries are connected to the cohomology classification of symmetry-protected phases in two dimensions. Using the tensor network and matrix product state formalism we numerically illustrate our arguments and discuss computational detection schemes to identify symmetry-protected order in a ground state wave function.
Model selection for anomaly detection
NASA Astrophysics Data System (ADS)
Burnaev, E.; Erofeev, P.; Smolyakov, D.
2015-12-01
Anomaly detection based on one-class classification algorithms is broadly used in many applied domains like image processing (e.g. detection of whether a patient is "cancerous" or "healthy" from a mammography image), network intrusion detection, etc. The performance of an anomaly detection algorithm crucially depends on the kernel used to measure similarity in a feature space. The standard approaches (e.g. cross-validation) for kernel selection used in two-class classification problems cannot be applied directly due to the specific nature of the data (the absence of a second, abnormal class). In this paper we generalize several kernel selection methods from the binary-class case to the case of one-class classification and perform an extensive comparison of these approaches using both synthetic and real-world data.
NASA Technical Reports Server (NTRS)
Liberman, Eugene M.; Manner, David B.; Dolce, James L.; Mellor, Pamela A.
1993-01-01
A user interface to the power distribution expert system for Space Station Freedom is discussed. The importance of features which simplify assessing system status and which minimize navigating through layers of information are examined. Design rationale and implementation choices are also presented. The amalgamation of such design features as message linking arrows, reduced information content screens, high salience anomaly icons, and color choices with failure detection and diagnostic explanation from an expert system is shown to provide an effective status-at-a-glance monitoring system for power distribution. This user interface design offers diagnostic reasoning without compromising the monitoring of current events. The display can convey complex concepts in terms that are clear to its users.
EMPACT 3D: an advanced EMI discrimination sensor for CONUS and OCONUS applications
NASA Astrophysics Data System (ADS)
Keranen, Joe; Miller, Jonathan S.; Schultz, Gregory; Sander-Olhoeft, Morgan; Laudato, Stephen
2018-04-01
We recently developed a new, man-portable, electromagnetic induction (EMI) sensor designed to detect and classify small, unexploded sub-munitions and discriminate them from non-hazardous debris. The ability to distinguish innocuous metal clutter from potentially hazardous unexploded ordnance (UXO) and other explosive remnants of war (ERW) before excavation can significantly accelerate land reclamation efforts by eliminating time spent removing harmless scrap metal. The EMI sensor employs a multi-axis transmitter and receiver configuration to produce data sufficient for anomaly discrimination. A real-time data inversion routine produces intrinsic and extrinsic anomaly features describing the polarizability, location, and orientation of the anomaly under test. We discuss data acquisition and post-processing software development, and results from laboratory and field tests demonstrating the discrimination capability of the system. Data acquisition and real-time processing emphasize ease-of-use, quality control (QC), and display of discrimination results. Integration of the QC and discrimination methods into the data acquisition software reduces the time required between sensor data collection and the final anomaly discrimination result. The system supports multiple concepts of operations (CONOPs) including: 1) a non-GPS cued configuration in which detected anomalies are discriminated and excavated immediately following the anomaly survey; 2) GPS integration to survey multiple anomalies to produce a prioritized dig list with global anomaly locations; and 3) a dynamic mapping configuration supporting detection followed by discrimination and excavation of targets of interest.
Hydrothermal circulation at Mount St. Helens determined by self-potential measurements
Bedrosian, P.A.; Unsworth, M.J.; Johnston, M.J.S.
2007-01-01
The distribution of hydrothermal circulation within active volcanoes is of importance in identifying regions of hydrothermal alteration which may in turn control explosivity, slope stability and sector collapse. Self-potential measurements, indicative of fluid circulation, were made within the crater of Mount St. Helens in 2000 and 2001. A strong dipolar anomaly in the self-potential field was detected on the north face of the 1980-86 lava dome. This anomaly reaches a value of negative one volt on the lower flanks of the dome and reverses sign toward the dome summit. The anomaly pattern is believed to result from a combination of thermoelectric, electrokinetic, and fluid disruption effects within and surrounding the dome. Heat supplied from a cooling dacite magma very likely drives a shallow hydrothermal convection cell within the dome. The temporal stability of the SP field, low surface recharge rate, and magmatic component to fumarole condensates and thermal waters suggest the hydrothermal system is maintained by water vapor exsolved from the magma and modulated on short time scales by surface recharge.
A Spatial-Spectral Approach for Visualization of Vegetation Stress Resulting from Pipeline Leakage.
Van der Werff, Harald; Van der Meijde, Mark; Jansma, Fokke; Van der Meer, Freek; Groothuis, Gert Jan
2008-06-04
Hydrocarbon leakage into the environment has large economic and environmental impact. Traditional methods for investigating seepages and their resulting pollution, such as drilling, are destructive, time consuming and expensive. Remote sensing is an efficient tool that offers a non-destructive investigation method. Optical remote sensing has been extensively tested for exploration of onshore hydrocarbon reservoirs and detection of hydrocarbons at the Earth's surface. In this research, we investigate indirect manifestations of pipeline leakage by way of visualizing vegetation anomalies in airborne hyperspectral imagery. Agricultural land-use causes a heterogeneous land cover; variation in red edge position between fields was much larger than infield red edge position variation that could be related to hydrocarbon pollution. A moving and growing kernel procedure was developed to normalize red edge values relative to values of neighbouring pixels to enhance pollution-related anomalies in the image. Comparison of the spatial distribution of anomalies with geochemical data obtained by drilling showed that 8 out of 10 polluted sites were predicted correctly, while 2 out of 30 sites that were predicted clean were actually polluted.
Detection of sinkholes or anomalies using full seismic wave fields.
DOT National Transportation Integrated Search
2013-04-01
This research presents an application of two-dimensional (2-D) time-domain waveform tomography for detection of embedded sinkholes and anomalies. The measured seismic surface wave fields were inverted using a full waveform inversion (FWI) technique, ...
A Testbed for Data Fusion for Engine Diagnostics and Prognostics
2002-03-01
detected; too late to be useful for prognostics development. Table 1. Table of acronyms ACRONYM MEANING AD Anomaly detector...strictly defined points. Determining where we are on the engine health curve is the first step in prognostics. Fault detection/diagnostic reasoning... Detection As described above, the ability of the monitoring system to detect an anomaly is especially important for knowledge-based systems, i.e.,
Automated novelty detection in the WISE survey with one-class support vector machines
NASA Astrophysics Data System (ADS)
Solarz, A.; Bilicki, M.; Gromadzki, M.; Pollo, A.; Durkalec, A.; Wypych, M.
2017-10-01
Wide-angle photometric surveys of previously uncharted sky areas or wavelength regimes will always bring in unexpected sources - novelties or even anomalies - whose existence and properties cannot be easily predicted from earlier observations. Such objects can be efficiently located with novelty detection algorithms. Here we present an application of such a method, called one-class support vector machines (OCSVM), to search for anomalous patterns among sources preselected from the mid-infrared AllWISE catalogue covering the whole sky. To create a model of expected data we train the algorithm on a set of objects with spectroscopic identifications from the SDSS DR13 database, present also in AllWISE. The OCSVM method detects as anomalous those sources whose patterns - WISE photometric measurements in this case - are inconsistent with the model. Among the detected anomalies we find artefacts, such as objects with spurious photometry due to blending, but more importantly also real sources of genuine astrophysical interest. Among the latter, OCSVM has identified a sample of heavily reddened AGN/quasar candidates distributed uniformly over the sky and in a large part absent from other WISE-based AGN catalogues. It also allowed us to find a specific group of sources of mixed types, mostly stars and compact galaxies. By combining the semi-supervised OCSVM algorithm with standard classification methods it will be possible to improve the latter by accounting for sources which are not present in the training sample, but are otherwise well-represented in the target set. Anomaly detection adds flexibility to automated source separation procedures and helps verify the reliability and representativeness of the training samples. It should be thus considered as an essential step in supervised classification schemes to ensure completeness and purity of produced catalogues. The catalogues of outlier data are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/606/A39
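The one-class SVM step can be sketched with scikit-learn's OneClassSVM: train on features of objects with known identifications, then score the target set, with a predicted label of -1 marking candidate anomalies. The feature columns, the nu and gamma values, and the synthetic data below are illustrative assumptions, not the paper's actual AllWISE/SDSS setup.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(7)

# Stand-ins for photometric features (e.g. magnitudes and colours) of
# spectroscopically identified training objects and of the target catalogue.
train = rng.normal(0.0, 1.0, (2000, 4))
target = np.vstack([rng.normal(0.0, 1.0, (500, 4)),
                    rng.normal(5.0, 1.0, (20, 4))])   # injected novelties

scaler = StandardScaler().fit(train)
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
ocsvm.fit(scaler.transform(train))

scores = ocsvm.decision_function(scaler.transform(target))
labels = ocsvm.predict(scaler.transform(target))       # -1 = anomalous pattern

print("flagged as anomalous:", int((labels == -1).sum()), "of", len(target))
```

The nu parameter bounds the fraction of training objects treated as outliers, so it effectively sets how tolerant the learned model of "expected" sources is.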
2014-10-02
potential advantages of using multivariate classification/discrimination/anomaly detection methods on real world accelerometric condition monitoring ...case of false anomaly reports. A possible explanation of this phenomenon could be given ...of those helicopters. 1. Anomaly detection by means of a self-learning Shewhart control chart. A problem highlighted by the experts of AgustaWestland
Detecting ship targets in spaceborne infrared image based on modeling radiation anomalies
NASA Astrophysics Data System (ADS)
Wang, Haibo; Zou, Zhengxia; Shi, Zhenwei; Li, Bo
2017-09-01
Using infrared imaging sensors to detect ship targets in the ocean environment has many advantages compared to other sensor modalities, such as better thermal sensitivity and all-weather detection capability. We propose a new ship detection method for spaceborne infrared images based on modeling radiation anomalies. The proposed method can be decomposed into two stages. In the first stage, a test infrared image is densely divided into a set of image patches and the radiation anomaly of each patch is estimated by a Gaussian Mixture Model (GMM), thereby obtaining target candidates from the anomalous image patches. In the second stage, target candidates are further checked by a more discriminative criterion to obtain the final detection result. The main innovation of the proposed method is inspired by the biological mechanism that human eyes are sensitive to unusual and anomalous patches among complex backgrounds. The experimental results on the short wavelength infrared band (1.560-2.300 μm) and long wavelength infrared band (10.30-12.50 μm) of the Landsat-8 satellite show that the proposed method achieves the desired ship detection accuracy with higher recall than other classical ship detection methods.
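The first stage, scoring image patches against a Gaussian Mixture Model and keeping the low-likelihood patches as target candidates, can be sketched as follows; the patch features, number of mixture components, percentile threshold, and synthetic scene are illustrative assumptions rather than the paper's configuration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def patch_features(image, size=8):
    """Split the image into non-overlapping patches and describe each by
    simple statistics (mean, std, max) of its radiance values."""
    rows, cols = image.shape[0] // size, image.shape[1] // size
    feats, coords = [], []
    for i in range(rows):
        for j in range(cols):
            p = image[i*size:(i+1)*size, j*size:(j+1)*size]
            feats.append([p.mean(), p.std(), p.max()])
            coords.append((i, j))
    return np.array(feats), coords

# Synthetic infrared scene: sea clutter plus one bright ship-like patch.
rng = np.random.default_rng(8)
scene = rng.normal(100.0, 2.0, (256, 256))
scene[120:128, 64:72] += 15.0                   # assumed ship signature

feats, coords = patch_features(scene)
gmm = GaussianMixture(n_components=3, random_state=0).fit(feats)
loglik = gmm.score_samples(feats)               # low log-likelihood = anomalous
threshold = np.percentile(loglik, 0.5)          # keep the rarest 0.5% of patches
candidates = [coords[i] for i in np.nonzero(loglik <= threshold)[0]]
print("candidate patches:", candidates)
```

In the method described above, these candidates would then pass to a second, more discriminative check before a detection is declared.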
Analysis of SSEM Sensor Data Using BEAM
NASA Technical Reports Server (NTRS)
Zak, Michail; Park, Han; James, Mark
2004-01-01
A report describes analysis of space shuttle main engine (SSME) sensor data using Beacon-based Exception Analysis for Multimissions (BEAM) [NASA Tech Briefs articles, the two most relevant being Beacon-Based Exception Analysis for Multimissions (NPO- 20827), Vol. 26, No.9 (September 2002), page 32 and Integrated Formulation of Beacon-Based Exception Analysis for Multimissions (NPO- 21126), Vol. 27, No. 3 (March 2003), page 74] for automated detection of anomalies. A specific implementation of BEAM, using the Dynamical Invariant Anomaly Detector (DIAD), is used to find anomalies commonly encountered during SSME ground test firings. The DIAD detects anomalies by computing coefficients of an autoregressive model and comparing them to expected values extracted from previous training data. The DIAD was trained using nominal SSME test-firing data. DIAD detected all the major anomalies including blade failures, frozen sense lines, and deactivated sensors. The DIAD was particularly sensitive to anomalies caused by faulty sensors and unexpected transients. The system offers a way to reduce SSME analysis time and cost by automatically indicating specific time periods, signals, and features contributing to each anomaly. The software described here executes on a standard workstation and delivers analyses in seconds, a computing time comparable to or faster than the test duration itself, offering potential for real-time analysis.
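The core DIAD idea, fitting an autoregressive model to a signal window and comparing its coefficients with coefficients learned from nominal data, can be illustrated with a least-squares AR fit; the model order, window length, and synthetic signals below are illustrative assumptions rather than BEAM's actual configuration.

```python
import numpy as np

def ar_coefficients(x, order=4):
    """Least-squares fit of an AR(order) model x[t] = sum_i a_i * x[t-i] + e[t]."""
    X = np.column_stack([x[order - i - 1:len(x) - i - 1] for i in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def anomaly_score(window, nominal_coeffs, order=4):
    """Distance between the window's AR coefficients and the coefficients
    extracted from nominal training data."""
    return np.linalg.norm(ar_coefficients(window, order) - nominal_coeffs)

# Nominal sensor signal: a stable oscillation plus noise.
rng = np.random.default_rng(9)
t = np.arange(4000)
nominal = np.sin(0.2 * t) + 0.1 * rng.normal(size=t.size)
nominal_coeffs = ar_coefficients(nominal, order=4)

# Test signal whose dynamics change halfway through (e.g. an unexpected transient).
test = np.concatenate([np.sin(0.2 * t[:2000]),
                       np.sin(0.45 * t[2000:])]) + 0.1 * rng.normal(size=t.size)
for start in range(0, len(test) - 500, 500):
    score = anomaly_score(test[start:start + 500], nominal_coeffs)
    print(f"window starting at {start}: score = {score:.3f}")
```

Windows whose coefficient distance exceeds what is seen in training data would be reported, along with the contributing signals, in the spirit of the automated indications described above.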
Intelligent system for a remote diagnosis of a photovoltaic solar power plant
NASA Astrophysics Data System (ADS)
Sanz-Bobi, M. A.; Muñoz San Roque, A.; de Marcos, A.; Bada, M.
2012-05-01
Usually small and mid-sized photovoltaic solar power plants are located in rural areas and typically they operate unattended. Some technicians are in charge of the supervision of these plants and, if an alarm is automatically issued, they try to investigate the problem and correct it. Sometimes these anomalies are detected some hours or days after they begin. Also the analysis of the causes once the anomaly is detected can take some additional time. All these factors motivated the development of a methodology able to perform continuous and automatic monitoring of the basic parameters of a photovoltaic solar power plant in order to detect anomalies as soon as possible, to diagnose their causes, and to immediately inform the personnel in charge of the plant. The methodology proposed starts from the study of the most significant failure modes of a photovoltaic plant through a FMEA and using this information, its typical performance is characterized by the creation of its normal behaviour models. They are used to detect the presence of a failure in an incipient or current form. Once an anomaly is detected, an automatic and intelligent diagnosis process is started in order to investigate the possible causes. The paper will describe the main features of a software tool able to detect anomalies and to diagnose them in a photovoltaic solar power plant.
Method and system for monitoring environmental conditions
Kulesz, James J [Oak Ridge, TN]; Lee, Ronald W [Oak Ridge, TN]
2010-11-16
A system for detecting the occurrence of anomalies includes a plurality of spaced apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. At least one software agent is capable of changing the operation of at least one of the controllers in response to the detection of an anomaly by a sensor.
Lytle, R. Jeffrey; Lager, Darrel L.; Laine, Edwin F.; Davis, Donald T.
1979-01-01
Underground anomalies or discontinuities, such as holes, tunnels, and caverns, are located by lowering an electromagnetic signal transmitting antenna down one borehole and a receiving antenna down another, the ground to be surveyed for anomalies being situated between the boreholes. Electronic transmitting and receiving equipment associated with the antennas is activated and the antennas are lowered in unison at the same rate down their respective boreholes a plurality of times, each time with the receiving antenna at a different level with respect to the transmitting antenna. The transmitted electromagnetic waves diffract at each edge of an anomaly. This causes minimal signal reception at the receiving antenna. Triangulation of the straight lines between the antennas for the depths at which the signal minimums are detected precisely locates the anomaly. Alternatively, phase shifts of the transmitted waves may be detected to locate an anomaly, the phase shift being distinctive for the waves directed at the anomaly.
NASA Astrophysics Data System (ADS)
Akhoondzadeh, M.
2013-08-01
On 6 February 2013, at 12:12:27 local time (01:12:27 UTC), a seismic event registering Mw 8.0 struck the Solomon Islands, located at the boundary of the Australian and Pacific tectonic plates. Time series prediction is an important and widely interesting topic in the research of earthquake precursors. This paper describes a new computational intelligence approach, based on a genetic algorithm (GA), to detect the unusual variations of the total electron content (TEC) seismo-ionospheric anomalies induced by the powerful Solomon earthquake. The GA detected a considerable number of anomalous occurrences on the earthquake day and also 7 and 8 days prior to the earthquake, in a period of high geomagnetic activity. In this study, the TEC anomalies detected using the proposed method are also compared with the anomalies observed by applying the mean, median, wavelet, Kalman filter, ARIMA, neural network and support vector machine methods. The agreement among the final results of all eight methods is a convincing indication of the efficiency of the GA method. It indicates that the GA can be an appropriate non-parametric tool for anomaly detection in nonlinear time series showing seismo-ionospheric precursor variations.
Intelligent agent-based intrusion detection system using enhanced multiclass SVM.
Ganapathy, S; Yogesh, P; Kannan, A
2012-01-01
Intrusion detection systems have been used in the past, along with various techniques, to detect intrusions in networks effectively. However, most of these systems are able to detect intruders only with a high false alarm rate. In this paper, we propose a new intelligent agent-based intrusion detection model for mobile ad hoc networks using a combination of attribute selection, outlier detection, and enhanced multiclass SVM classification methods. For this purpose, an effective preprocessing technique is proposed that improves the detection accuracy and reduces the processing time. Moreover, two new algorithms, namely an Intelligent Agent Weighted Distance Outlier Detection algorithm and an Intelligent Agent-based Enhanced Multiclass Support Vector Machine algorithm, are proposed for detecting intruders in a distributed database environment that uses intelligent agents for trust management and coordination in transaction processing. The experimental results of the proposed model show that this system detects anomalies with a low false alarm rate and a high detection rate when tested with the KDD Cup 99 data set.
Detection of abnormal item based on time intervals for recommender systems.
Gao, Min; Yuan, Quan; Ling, Bin; Xiong, Qingyu
2014-01-01
With the rapid development of e-business, personalized recommendation has become a core competence for enterprises to gain profits and improve customer satisfaction. Although collaborative filtering is the most successful approach for building a recommender system, it suffers from "shilling" attacks. In recent years, research on shilling attacks has greatly improved. However, existing approaches suffer from serious problems of attack-model dependency and high computational cost. To solve these problems, an approach for the detection of abnormal items is proposed in this paper. First, two common features of all attack models are analyzed. A revised bottom-up discretization approach, based on time intervals and these features, is then proposed for the detection. The distributions of ratings in different time intervals are compared to detect anomalies based on the chi-square statistic (χ²). We evaluated our approach on four types of items, defined according to their life cycles. The experimental results show that the proposed approach achieves a high detection rate with low computational cost when the number of attack profiles is more than 15. It improves the efficiency of shilling attack detection by narrowing down the suspicious users.
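A minimal sketch of the chi-square comparison described above: build rating-count vectors (e.g. 1 to 5 stars) for an item in two adjacent time intervals and test whether the distributions differ significantly. The significance level, the two-interval setup and the example counts are assumptions for illustration.

```python
import numpy as np
from scipy.stats import chi2_contingency

def interval_rating_shift(counts_a, counts_b, alpha=0.01):
    """Chi-square test on rating-count distributions from two adjacent time intervals."""
    chi2, p, dof, _ = chi2_contingency(np.array([counts_a, counts_b]))
    return p < alpha, chi2, p

# Hypothetical item: a sudden burst of 5-star ratings in the second interval.
suspicious, chi2, p = interval_rating_shift([3, 5, 10, 8, 4], [1, 2, 3, 6, 40])
```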
NASA Astrophysics Data System (ADS)
Jervis, John R.; Pringle, Jamie K.
2014-09-01
Electrical resistivity surveys have proven useful for locating clandestine graves in a number of forensic searches. However, some aspects of grave detection with resistivity surveys remain imperfectly understood. One such aspect is the effect of seasonal changes in climate on the resistivity response of graves. In this study, resistivity survey data collected over three years over three simulated graves were analysed in order to assess how the graves' resistivity anomalies varied seasonally and when they could most easily be detected. Thresholds were used to identify anomalies, and the 'residual volume' of grave-related anomalies was calculated as the area bounded by the relevant thresholds multiplied by the anomaly's average value above the threshold. The residual volume of a resistivity anomaly associated with a buried pig cadaver showed evidence of repeating annual patterns and was moderately correlated with the soil moisture budget. This anomaly was easiest to detect between January and April each year, after prolonged periods of high net gain in soil moisture. The resistivity response of a wrapped cadaver was more complex, although it also showed evidence of seasonal variation during the third year after burial. We suggest that the observed variation in the graves' resistivity anomalies was caused by seasonal change in survey data noise levels, which was in turn influenced by the soil moisture budget. It is possible that similar variations occur elsewhere for sites with seasonal climate variations and this could affect successful detection of other subsurface features. Further research to investigate how different climates and soil types affect seasonal variation in grave-related resistivity anomalies would be useful.
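The 'residual volume' metric defined above translates directly into a small computation on a gridded, de-trended resistivity map: area enclosed by the threshold times the anomaly's mean value above it. The grid, threshold and cell size below are hypothetical; this is only a sketch of the stated definition.

```python
import numpy as np

def residual_volume(grid, threshold, cell_area=1.0):
    """Area bounded by the threshold multiplied by the anomaly's mean value above it.
    grid: 2-D array of de-trended resistivity values; cell_area: area of one grid cell (m^2)."""
    mask = grid > threshold
    if not mask.any():
        return 0.0
    area = mask.sum() * cell_area
    return area * float((grid[mask] - threshold).mean())
```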
Gravity anomaly detection: Apollo/Soyuz
NASA Technical Reports Server (NTRS)
Vonbun, F. O.; Kahn, W. D.; Bryan, J. W.; Schmid, P. E.; Wells, W. T.; Conrad, D. T.
1976-01-01
The Goddard Apollo-Soyuz Geodynamics Experiment is described. It was performed to demonstrate the feasibility of tracking and recovering high-frequency components of the earth's gravity field by utilizing a synchronous orbiting tracking station such as ATS-6. Gravity anomalies of 5 mGal or larger, having wavelengths of 300 to 1000 kilometers on the earth's surface, are important for geologic studies of the upper layers of the earth's crust. Short-wavelength gravity anomalies of the Earth were detected from space. Two prime areas of data collection were selected for the experiment: (1) the center of the African continent and (2) the Indian Ocean Depression centered at 5° north latitude and 75° east longitude. Preliminary results show that the detectability objective of the experiment was met in both areas as well as at several additional anomalous areas around the globe. Gravity anomalies of the Karakoram and Himalayan mountain ranges, ocean trenches, as well as the Diamantina Depth, can be seen. Maps outlining the anomalies discovered are shown.
Network Anomaly Detection Based on Wavelet Analysis
NASA Astrophysics Data System (ADS)
Lu, Wei; Ghorbani, Ali A.
2008-12-01
Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.
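A minimal sketch of the wavelet-approximation idea: decompose one traffic feature (e.g. flows per second) with a discrete wavelet transform, keep only the coarse approximation as the "normal" baseline, and inspect the residual for anomalies. The PyWavelets dependency, the db4 wavelet and the decomposition level are assumptions; the paper's full system also includes system identification, which is not shown.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_residual(feature_series, wavelet="db4", level=3):
    """Residual of a traffic feature after removing its coarse wavelet approximation."""
    x = np.asarray(feature_series, dtype=float)
    coeffs = pywt.wavedec(x, wavelet, level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]  # zero the detail bands
    approx = pywt.waverec(coeffs, wavelet)[: len(x)]
    return x - approx  # large residuals are candidate anomalies
```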
NASA Astrophysics Data System (ADS)
Mori, Taketoshi; Ishino, Takahito; Noguchi, Hiroshi; Shimosaka, Masamichi; Sato, Tomomasa
2011-06-01
We propose a life pattern estimation method and an anomaly detection method for elderly people living alone. In our observation system for such people, we deploy pyroelectric sensors in the house and measure the person's activities continuously in order to capture the person's life pattern. The data are transferred continuously to the operation center and displayed to the nurses there in a precise way; the nurses then decide whether the data indicate an anomaly. In the system, people whose life features resemble each other are categorized into the same group. Anomalies that occurred in the past are shared within the group and utilized in the anomaly detection algorithm. This algorithm is based on an "anomaly score." The anomaly score is computed from the activeness of the person, which is approximately proportional to the frequency of sensor responses per minute. The anomaly score is calculated as the difference between the present activeness and its long-term average: the score is positive if the present activeness is higher than the past average, and negative if it is lower. If the score exceeds a certain threshold, an anomaly event has occurred. Moreover, we developed an activity estimation algorithm that estimates the residents' basic activities such as getting up, going out, and so on. The estimates are shown to the nurses together with the residents' anomaly scores, so the nurses can understand the residents' health conditions by combining these two sources of information.
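The anomaly score described above reduces to a few lines of arithmetic: present activeness minus its long-term average, compared against a threshold. The threshold value and the magnitude-based trigger below are placeholders and assumed interpretations, not values from the paper.

```python
import numpy as np

def anomaly_score(activeness_now, activeness_history):
    """Difference between present activeness (sensor responses per minute) and its long-term average."""
    return activeness_now - float(np.mean(activeness_history))

def anomaly_event(score, threshold=10.0):
    """Declare an anomaly event when the score exceeds the threshold in magnitude (assumed interpretation)."""
    return abs(score) > threshold
```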
NASA Astrophysics Data System (ADS)
Tang, Jun; Yuan, Yunbin
2017-10-01
Ionospheric anomalies possibly associated with large earthquakes, particularly coseismic ionospheric disturbances, have been detected by the global positioning system (GPS). A large Nepal earthquake with magnitude Mw 7.8 occurred on April 25, 2015. In this paper, we investigate the multi-dimensional distribution of near-field coseismic ionospheric disturbances (CIDs) using total electron content (TEC) and computerized ionospheric tomography (CIT) from regional GPS observational data. The results show significant ionospheric TEC disturbances and interesting multi-dimensional structures around the main shock. Regarding the TEC changes, coseismic ionospheric disturbances occur approximately 10-20 min after the earthquake, northeast and northwest of the epicentre. The maximum ridge-to-trough amplitude of the CIDs is up to approximately 0.90 TECU/min. Propagation velocities of the TEC disturbances are 1.27 ± 0.06 km/s and 1.91 ± 0.38 km/s. It is believed that the ionospheric disturbances are triggered by acoustic and Rayleigh waves. Tomographic results show that the ionospheric disturbance clearly intensifies at an altitude of about 300 km above and around the epicentre, predominantly within the region between 200 km and 400 km. Significant ionospheric disturbances appear at 06:30 UT in the tomographic images. This study reveals characteristics of an ionospheric anomaly caused by the Nepal earthquake.
NASA Astrophysics Data System (ADS)
Bersan, Silvia; Koelewijn, André R.; Simonini, Paolo
2018-02-01
Internal erosion is the cause of a significant percentage of failure and incidents involving both dams and river embankments in many countries. In the past 20 years the use of fibre-optic Distributed Temperature Sensing (DTS) in dams has proved to be an effective tool for the detection of leakages and internal erosion. This work investigates the effectiveness of DTS for dike monitoring, focusing on the early detection of backward erosion piping, a mechanism that affects the foundation layer of structures resting on permeable, sandy soils. The paper presents data from a piping test performed on a large-scale experimental dike equipped with a DTS system together with a large number of accompanying sensors. The effect of seepage and piping on the temperature field is analysed, eventually identifying the processes that cause the onset of thermal anomalies around piping channels and thus enable their early detection. Making use of dimensional analysis, the factors that influence this thermal response of a dike foundation are identified. Finally some tools are provided that can be helpful for the design of monitoring systems and for the interpretation of temperature data.
Evolution and Advances in Satellite Analysis of Volcanoes
NASA Astrophysics Data System (ADS)
Dean, K. G.; Dehn, J.; Webley, P.; Bailey, J.
2008-12-01
Over the past 20 years, satellite data used for monitoring and analysis of volcanic eruptions have evolved in terms of timeliness, access, distribution, resolution, and understanding of volcanic processes. Initially satellite data were used for retrospective analysis, but this has evolved into proactive monitoring systems. Timely acquisition of data and the capability to distribute large data files paralleled advances in computer technology and were critical components for near real-time monitoring. The sharing of these data and the resulting discussions have improved our understanding of eruption processes and, even more importantly, their impact on society. To illustrate this evolution, critical scientific discoveries will be highlighted, including detection of airborne ash and sulfur dioxide, cloud-height estimates, prediction of ash cloud movement, and detection of thermal anomalies as precursor signals to eruptions. AVO has been a leader in implementing many of these advances in an operational setting, such as automated eruption detection, database analysis systems, and remotely accessible web-based analysis systems. Finally, limitations resulting from resolution trade-offs, and how they introduce weaknesses in detection techniques and hazard assessments, will be presented.
NASA Technical Reports Server (NTRS)
Richards, Lance
2014-01-01
The general aim of this work is to develop and demonstrate a prototype structural health monitoring system for thermal protection systems that incorporates piezoelectric acoustic emission (AE) sensors to detect the occurrence and location of damaging impacts, such as those from Micrometeoroid Orbital Debris (MMOD). The approach uses an optical fiber Bragg grating (FBG) sensor network to evaluate the effect of detected damage on the thermal conductivity of the TPS material. Following detection of an impact, the TPS would be exposed to a heat source, possibly the sun, and the temperature distribution on the inner surface in the vicinity of the impact measured by the FBG network. A similar procedure could also be carried out as a screening test immediately prior to re-entry. The implications of any detected anomalies in the measured temperature distribution will be evaluated for their significance in relation to the performance of the TPS during reentry. Such a robust TPS health monitoring system would ensure overall crew safety throughout the mission, especially during reentry.
Networked gamma radiation detection system for tactical deployment
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ronald; Smith, Ethan; Guss, Paul; Mitchell, Stephen
2015-08-01
A networked gamma radiation detection system with directional sensitivity and energy spectral data acquisition capability is being developed by the National Security Technologies, LLC, Remote Sensing Laboratory to support the close and intense tactical engagement of law enforcement who carry out counterterrorism missions. In the proposed design, three clusters of 2″ × 4″ × 16″ sodium iodide crystals (4 each) with digiBASE-E (for list mode data collection) would be placed on the passenger side of a minivan. To enhance localization and facilitate rapid identification of isotopes, advanced smart real-time localization and radioisotope identification algorithms like WAVRAD (wavelet-assisted variance reduction for anomaly detection) and NSCRAD (nuisance-rejection spectral comparison ratio anomaly detection) will be incorporated. We will test a collection of algorithms and analysis that centers on the problem of radiation detection with a distributed sensor network. We will study the basic characteristics of a radiation sensor network and focus on the trade-offs between false positive alarm rates, true positive alarm rates, and time to detect multiple radiation sources in a large area. Empirical and simulation analyses of critical system parameters, such as number of sensors, sensor placement, and sensor response functions, will be examined. This networked system will provide an integrated radiation detection architecture and framework with (i) a large nationally recognized search database equivalent that would help generate a common operational picture in a major radiological crisis; (ii) a robust reach back connectivity for search data to be evaluated by home teams; and, finally, (iii) a possibility of integrating search data from multi-agency responders.
An Investigation of State-Space Model Fidelity for SSME Data
NASA Technical Reports Server (NTRS)
Martin, Rodney Alexander
2008-01-01
In previous studies, a variety of unsupervised anomaly detection techniques were applied to SSME (Space Shuttle Main Engine) data. The observed results indicated that the identification of certain anomalies was specific to the algorithmic method under consideration. For this reason, one of the follow-on goals of these previous investigations was to build an architecture to support the best capabilities of all algorithms. We appeal to that goal here by investigating a cascade, serial architecture for the best-performing and most suitable candidates from previous studies. As a precursor to a formal ROC (Receiver Operating Characteristic) curve analysis for validation of the resulting anomaly detection algorithms, our primary focus here is to investigate model fidelity as measured by variants of the AIC (Akaike Information Criterion) for state-space based models. We show that placing constraints on a state-space model during or after the training of the model introduces a modest level of suboptimality. Furthermore, we compare the fidelity of all candidate models, including those embodying the cascade, serial architecture. We make recommendations on the most suitable candidates for application to subsequent anomaly detection studies as measured by AIC-based criteria.
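For readers unfamiliar with AIC-based model-fidelity comparison, a small sketch is shown below: fit a state-space model to a series and compute the classic AIC and the small-sample AICc from the log-likelihood and parameter count. The statsmodels SARIMAX model and the synthetic series are stand-ins; they are not the SSME data or the models used in the study.

```python
import numpy as np
import statsmodels.api as sm

# Placeholder series standing in for an SSME sensor channel.
y = np.random.default_rng(0).standard_normal(500)

# A small ARMA model estimated in state-space form.
res = sm.tsa.SARIMAX(y, order=(2, 0, 1)).fit(disp=False)

k = len(res.params)                              # number of estimated parameters
aic = 2 * k - 2 * res.llf                        # classic AIC
aicc = aic + 2 * k * (k + 1) / (len(y) - k - 1)  # small-sample corrected variant
print(res.aic, aic, aicc)                        # res.aic should agree with the manual AIC
```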
Patterns of reflected radiance associated with geobotanical anomalies
NASA Technical Reports Server (NTRS)
Birnie, R. W.; Stone, T. A.; Francica, J. R.
1985-01-01
This paper summarizes three remote sensing experiments in which changes in remotely measured reflected radiance patterns of vegetation correlated with changes in geology. In two cases using airborne spectroradiometer data, changes in the physical properties of a uniform species correlated with zones of porphyry copper mineralization. In another case using Landsat digital data, changes were detected in the distribution and density of a number of species and combined with soil brightness data to produce a composite index useful for distinguishing lithologies.
Hydrological simulations in the Rhine basin.
van den Hurk, B; Beersma, J; Lenderink, G
2005-01-01
Simulations with regional climate models (RCMs), carried out for the Rhine basin, have been analyzed in the context of implications of the possible future discharge of the Rhine river. In a first analysis, the runoff generated by the RCMs is compared to observations, in order to detect the way the RCMs treat anomalies in precipitation in their land surface component. A second analysis is devoted to the frequency distribution of area averaged precipitation, and the impact of selection of various driving global climate models.
NASA Astrophysics Data System (ADS)
Tominaga, Masako; Tivey, Maurice A.; MacLeod, Christopher J.; Morris, Antony; Lissenberg, C. Johan; Shillington, Donna J.; Ferrini, Vicki
2016-06-01
Marine magnetic anomalies are a powerful tool for detecting geomagnetic polarity reversals, lithological boundaries, topographic contrasts, and alteration fronts in the oceanic lithosphere. Our aim here is to detect lithological contacts in fast-spreading lower crust and shallow mantle by characterizing magnetic anomalies and investigating their origins. We conducted a high-resolution, near-bottom, vector magnetic survey of crust exposed in the Hess Deep "tectonic window" using the remotely operated vehicle (ROV) Isis during RRS James Cook cruise JC21 in 2008. Hess Deep is located at the western tip of the propagating rift of the Cocos-Nazca plate boundary near the East Pacific Rise (EPR) (2°15'N, 101°30'W). ROV Isis collected high-resolution bathymetry and near-bottom magnetic data as well as seafloor samples to determine the in situ lithostratigraphy and internal structure of a section of EPR lower crust and mantle exposed on the steep (~20° dipping) south-facing slope just north of the Hess Deep nadir. Ten magnetic profiles were collected up the slope using a three-axis fluxgate magnetometer mounted on ROV Isis. We develop and extend the vertical magnetic profile (VMP) approach of Tivey (1996) by incorporating, for the first time, a three-dimensional vector analysis, leading to what we term the "vector vertical magnetic profiling" approach. We calculate the source magnetization distribution, the deviation from two-dimensionality, and the strike of magnetic boundaries using both the total field Fourier-transform inversion approach and a modified differential vector magnetic analysis. Overall, coherent, long-wavelength total field anomalies are present, with a strong magnetization contrast between the upper and lower parts of the slope. The total field anomalies indicate a coherently magnetized source at depth. The upper part of the slope is weakly magnetized, and its magnetic structure follows the underlying slope morphology, including a "bench" and lobe-shaped steps imaged by microbathymetry. The lower part of the slope is strongly magnetized, with a gradual reduction in amplitude from east to west across the slope. Surface morphology and recent drilling results indicate that the slope has been affected by mass wasting, but the observation of internally coherent magnetization distributions within the upper and lower slopes suggests that the disturbance is surficial. We attribute the spatial differences in magnetization distribution to the combination of changes in in situ lithology and depth to the source. These survey lines document the first magnetic profiles that capture the gabbro-ultramafic and possibly dike-gabbro boundaries in fast-spreading lower crust.
An Integrated Intrusion Detection Model of Cluster-Based Wireless Sensor Network
Sun, Xuemei; Yan, Bo; Zhang, Xinzhong; Rong, Chuitian
2015-01-01
Considering wireless sensor network characteristics, this paper combines anomaly and misuse detection and proposes an integrated detection model for cluster-based wireless sensor networks, aiming to enhance the detection rate and reduce the false alarm rate. An Adaboost algorithm with hierarchical structures is used for anomaly detection at sensor nodes, cluster-head nodes and Sink nodes. A Cultural Algorithm and Artificial Fish Swarm Algorithm optimized Back Propagation network is applied to misuse detection at the Sink node. Extensive simulations demonstrate that this integrated model achieves strong intrusion detection performance. PMID:26447696
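The per-tier boosting idea can be sketched as training one AdaBoost classifier for each network tier (sensor node, cluster head, sink) on labelled traffic features. This is only a loose illustration of the hierarchical arrangement; the feature sets, labels and hyper-parameters are assumptions, and the paper's Cultural/Fish-Swarm-optimized back-propagation stage is not shown.

```python
from sklearn.ensemble import AdaBoostClassifier

def train_tier_detectors(tier_datasets):
    """Train one AdaBoost anomaly classifier per tier.
    tier_datasets: dict mapping tier name -> (X, y), with y = 1 for anomalous samples."""
    return {tier: AdaBoostClassifier(n_estimators=100).fit(X, y)
            for tier, (X, y) in tier_datasets.items()}

# Usage with hypothetical data:
# detectors = train_tier_detectors({"sensor": (Xs, ys), "cluster_head": (Xc, yc), "sink": (Xk, yk)})
```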
A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring
NASA Technical Reports Server (NTRS)
Doyle, R. J.
1994-01-01
Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by.
Detection and characterization of buried lunar craters with GRAIL data
NASA Astrophysics Data System (ADS)
Sood, Rohan; Chappaz, Loic; Melosh, Henry J.; Howell, Kathleen C.; Milbury, Colleen; Blair, David M.; Zuber, Maria T.
2017-06-01
We used gravity mapping observations from NASA's Gravity Recovery and Interior Laboratory (GRAIL) to detect, characterize and validate the presence of large impact craters buried beneath the lunar maria. In this paper we focus on two prominent anomalies detected in the GRAIL data using the gravity gradiometry technique. Our detection strategy is applied to both free-air and Bouguer gravity field observations to identify gravitational signatures that are similar to those observed over buried craters. The presence of buried craters is further supported by individual analysis of regional free-air gravity anomalies, Bouguer gravity anomaly maps, and forward modeling. Our best candidate, for which we propose the informal name of Earhart Crater, is approximately 200 km in diameter and forms part of the northwestern rim of Lacus Somniorum. The other candidate, for which we propose the informal name of Ashoka Anomaly, is approximately 160 km in diameter and lies completely buried beneath Mare Tranquillitatis. Other large, still unrecognized craters undoubtedly underlie other portions of the Moon's vast mare lavas.
Anomaly Detection In Additively Manufactured Parts Using Laser Doppler Vibrometery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hernandez, Carlos A.
Additively manufactured parts are susceptible to non-uniform structure caused by the unique manufacturing process. This can lead to structural weakness or catastrophic failure. Using laser Doppler vibrometry and frequency response analysis, non-contact detection of anomalies in additively manufactured parts may be possible. Preliminary tests show promise for small-scale detection, but further work is necessary.
Unsupervised Anomaly Detection Based on Clustering and Multiple One-Class SVM
NASA Astrophysics Data System (ADS)
Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Kwon, Yongjin
Intrusion detection systems (IDSs) have played an important role in defending our networks from cyber attacks. However, since they are unable to detect unknown attacks, i.e., 0-day attacks, the ultimate challenge in the intrusion detection field is how to identify such attacks exactly and in an automated manner. Over the past few years, several studies on solving these problems have been made on anomaly detection using unsupervised learning techniques such as clustering, one-class support vector machines (SVMs), etc. Although they enable one to construct intrusion detection models at low cost and effort, and have the capability to detect unforeseen attacks, they still suffer from two main problems in intrusion detection: a low detection rate and a high false positive rate. In this paper, we propose a new anomaly detection method based on clustering and multiple one-class SVMs in order to improve the detection rate while maintaining a low false positive rate. We evaluated our method using the KDD Cup 1999 data set. Evaluation results show that our approach outperforms the existing algorithms reported in the literature, especially in the detection of unknown attacks.
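A minimal sketch of the cluster-then-one-class-SVM arrangement: cluster the (mostly normal) training traffic, fit one one-class SVM per cluster, and score a new sample with the SVM of its nearest cluster. The number of clusters, the nu parameter and the routing rule are assumptions and will not reproduce the paper's exact method.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import OneClassSVM

def fit_cluster_ocsvms(X_train, n_clusters=10, nu=0.05):
    """Cluster training traffic, then fit one one-class SVM on each cluster's members."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X_train)
    svms = {c: OneClassSVM(nu=nu, gamma="scale").fit(X_train[km.labels_ == c])
            for c in range(n_clusters)}
    return km, svms

def predict(km, svms, X):
    """Label each sample +1 (normal) or -1 (anomalous) using its nearest cluster's SVM."""
    clusters = km.predict(X)
    return np.array([svms[c].predict(x.reshape(1, -1))[0] for c, x in zip(clusters, X)])
```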
On Predictability of System Anomalies in Real World
2011-08-01
Different from prior work on availability in the SETI@home distributed system, this work focuses on quantifying the predictability of real-world system anomalies. (Cited work: "Mining for statistical models of availability in large-scale distributed systems: An empirical study of SETI@home," Proc. of MASCOTS, Sept. 2009.)
2015-06-01
The anomaly recognition and detection (AnRAD) system was built as a cogent confabulation network representing road traffic over square-kilometer areas, and was generalized, via a self-structuring technique, to the additional application of network intrusion detection. (Related work cited: host-based intrusion detection systems using contiguous and discontiguous system call patterns, IEEE Transactions on Computers, 63(4).)
2004-02-01
Conducted experiments to determine the usability of general-purpose anomaly detection algorithms for monitoring a large, complex military network; developed reaction and detection modules to perform tailored analysis sequences that monitor environmental conditions, health hazards and physiological states; and assessed the scalability of lab-proven anomaly detection techniques for intrusion detection in real-world, high-volume environments (FY 2003).
Stress distribution and topography of Tellus Regio, Venus
NASA Technical Reports Server (NTRS)
Williams, David R.; Greeley, Ronald
1989-01-01
The Tellus Regio area of Venus represents a subset of a narrow latitude band where Pioneer Venus Orbiter (PVO) altimetry data, line-of-sight (LOS) gravity data, and Venera 15/16 radar images have all been obtained with good resolution. Tellus Regio also has a wide variety of surface morphologic features, elevations ranging up to 2.5 km, and a relatively low LOS gravity anomaly. This area was therefore chosen in order to examine the theoretical stress distributions resulting from various models of compensation of the observed topography. These surface stress distributions are then compared with the surface morphology revealed in the Venera 15/16 radar images. Conclusions drawn from these comparisons will enable constraints to be put on various tectonic parameters relevant to Tellus Regio. The stress distribution is calculated as a function of the topography, the equipotential anomaly, and the assumed model parameters. The topography data is obtained from the PVO altimetry. The equipotential anomaly is estimated from the PVO LOS gravity data. The PVO LOS gravity represents the spacecraft accelerations due to mass anomalies within the planet. These accelerations are measured at various altitudes and angles to the local vertical and therefore do not lend themselves to a straightforward conversion. A minimum variance estimator of the LOS gravity data is calculated, taking into account the various spacecraft altitudes and LOS angles and using the measured PVO topography as an a priori constraint. This results in an estimated equivalent surface mass distribution, from which the equipotential anomaly is determined.
The characteristics and distribution of dental anomalies in patients with cleft.
Wu, Ting-Ting; Chen, Philip K T; Lo, Lun-Jou; Cheng, Min-Chi; Ko, Ellen Wen-Ching
2011-01-01
Dental anomalies associated with different severities of cleft lip and palate have been rarely reported. This retrospective study investigates the characteristics of dental anomalies associated with different types of cleft, and compares the dental anomaly traits based on sex and severity of cleft. Cleft patients born in 1995 with qualified diagnostic records from 7 to 11 years were included for evaluation. Records were retrieved from database of Chang Gung Craniofacial Center, including panoramic radiographs and intraoral photographs. In total, 196 patients with complete records were included in the evaluation. This study compares the dental anomalies associated with each type of cleft. The frequency of dental anomalies in the maxillary incisor area in the cleft palate (CP) group (20%) was significantly lower than that in other groups. The frequency of missing maxillary lateral incisors (MLIs) increased as the cleft severity increased. Supernumerary teeth and missing lower incisors exhibited the opposite trend. No sexual dimorphism appeared in terms of the frequencies of peg laterals and missing MLIs. The distribution patterns of missing MLIs and peg laterals in males, but not in females, were consistent for the three types of unilateral clefts. Regarding the characteristics of dental anomalies among the three unilateral clefts, missing MLIs, supernumerary teeth, and missing lower incisors were found to be related to cleft severity. The maxillary lateral incisor was the most affected tooth in the cleft area. The frequency of missing MLIs and peg laterals was not sexual dimorphic, but the distribution pattern was different between the sexes.
Mangione, Francesca; Nguyen, Laure; Foumou, Nathalie; Bocquet, Emmanuelle; Dursun, Elisabeth
2018-03-01
Prevalence of dental anomalies in cleft patients is higher than in the general population. The objectives of this study were to assess the prevalence of dental anomalies and their coexistence in French children with clefts and to investigate the relation between the dental anomalies and the cleft type. Seventy-four non-syndromic cleft patients (6-16 years old) from Lille Regional University and Mondor-Chenevier Hospitals (France) were included. Clefts were classified as right/left unilateral cleft lip and palate (UCLP), bilateral cleft lip and palate (BCLP) and cleft palate (CP). Dental anomalies were investigated on panoramic radiographs and categorized as agenesis, supernumerary teeth, incisor rotations, impacted canines and shape anomalies. Prevalence and gender distribution of dental anomalies, mean number of affected teeth per patient, agenesis occurrence and location, and coexistence of dental anomalies were analysed by cleft type. 96.0% of patients presented at least one dental anomaly (agenesis 83.8%, incisor rotations 25.7%, shape anomalies 21.6%, impacted canines 18.9%, supernumerary teeth 8.1%). BCLP patients had the highest number of affected teeth, and left UCLP patients had more affected teeth than right UCLP patients. The distribution of agenesis inside (45.3%) and outside (54.7%) the cleft region was similar. Combined dental anomalies, both adjacent (31.8%) and non-adjacent (33.3%), were often encountered. Dental anomalies were localized inside as well as outside the cleft region and were often associated with each other. BCLP patients were more affected. Early radiographic evaluation allows a comprehensive diagnosis of anomalies inside and outside the cleft region, as required for multidisciplinary dental treatment.
NASA Astrophysics Data System (ADS)
Girolami, C.; Barchi, M. R.; Heyde, I.; Pauselli, C.; Vetere, F.; Cannata, A.
2017-11-01
In this work, the gravity anomaly signal beneath Mount Amiata and its surroundings has been analysed to reconstruct the subsurface setting. In particular, the work focuses on the investigation of the geological bodies responsible for the Bouguer gravity minimum observed in this area.
Routine screening for fetal anomalies: expectations.
Goldberg, James D
2004-03-01
Ultrasound has become a routine part of prenatal care. Despite this, the sensitivity and specificity of the procedure are unclear to many patients and healthcare providers. In a small study from Canada, 54.9% of women reported that they had received no information about ultrasound before their examination. In addition, 37.2% of women indicated that they were unaware of any fetal problems that ultrasound could not detect. Most centers that perform ultrasound do not have their own statistics regarding sensitivity and specificity; it is necessary to rely on large collaborative studies. Unfortunately, wide variations exist in these studies, with detection rates for fetal anomalies between 13.3% and 82.4%. The Eurofetus study is the largest prospective study performed to date, and because of the time and expense involved in this type of study, a similar study is not likely to be repeated. The overall detection rate for anomalous fetuses was 64.1%. It is important to note that in this study, ultrasounds were performed in tertiary centers with significant experience in detecting fetal malformations. The RADIUS study also demonstrated a significantly improved detection rate of anomalies before 24 weeks in tertiary versus community centers (35% versus 13%). Two concepts emerge from reviewing these data. First, patients must be made aware of the limitations of ultrasound in detecting fetal anomalies. This information is critical to allow them to make informed decisions about whether to undergo ultrasound examination and to prepare them for potential outcomes. Second, to achieve the detection rates reported in the Eurofetus study, ultrasound examination must be performed in centers that have extensive experience in the detection of fetal anomalies.
Transient ice mass variations over Greenland detected by the combination of GPS and GRACE data
NASA Astrophysics Data System (ADS)
Zhang, B.; Liu, L.; Khan, S. A.; van Dam, T. M.; Zhang, E.
2017-12-01
Over the past decade, the Greenland Ice Sheet (GrIS) has been undergoing significant warming and ice mass loss. This mass loss was not a steady process but had substantial temporal and spatial variability. Here we apply multi-channel singular spectrum analysis to crustal deformation time series measured at about 50 Global Positioning System (GPS) stations mounted on bedrock around the Greenland coast, and to mass changes inferred from the Gravity Recovery and Climate Experiment (GRACE), to detect transient changes in ice mass balance over the GrIS. We detect two transient anomalies: one is a negative melting anomaly (Anomaly 1) that peaked around 2010; the other is a positive melting anomaly (Anomaly 2) that peaked between 2012 and 2013. The GRACE data show that both anomalies caused significant mass changes south of 74°N but negligible changes north of 74°N. Both anomalies caused the maximum mass change in the southeast GrIS, followed by the west GrIS near Jakobshavn. Our results also show that the mass change caused by Anomaly 1 first reached its maximum in late 2009 in the southeast GrIS and then migrated to the west GrIS. In Anomaly 2, however, the southeast GrIS was the last place to reach the maximum mass change, in early 2013, and the west GrIS near Jakobshavn was the second-to-last. Most of the GPS data show spatiotemporal patterns similar to those obtained from the GRACE data. However, some GPS time series show discrepancies in either space or time because of data gaps and different sensitivities to mass loading changes: loading deformation measured by GPS can be significantly affected by local dynamic mass changes, which have little impact on GRACE observations.
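The core of singular spectrum analysis, the single-channel case, can be sketched in a few lines: embed the series into a trajectory matrix, take its SVD, and reconstruct the leading components as a smooth transient signal. The window length and component count below are arbitrary; the study itself uses the multi-channel variant across many stations, which this sketch does not attempt.

```python
import numpy as np

def ssa_reconstruct(x, window=60, n_components=2):
    """Reconstruct the leading SSA components of a 1-D series (e.g. a GPS uplift record)."""
    x = np.asarray(x, dtype=float)
    N, L = len(x), window
    K = N - L + 1
    traj = np.column_stack([x[i:i + L] for i in range(K)])        # L x K trajectory matrix
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    recon = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    # Diagonal averaging back to a series of length N
    out, counts = np.zeros(N), np.zeros(N)
    for j in range(K):
        out[j:j + L] += recon[:, j]
        counts[j:j + L] += 1
    return out / counts
```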
Metal enrichments in solid bitumens: A review
NASA Astrophysics Data System (ADS)
Parnell, J.
1988-07-01
The association of oils and solid bitumens with ore deposits is widely recorded. The oils and bitumens may actually be enriched with metals. Unlike oils, metal enrichments within bitumens do not reflect the role of petroleum as a transporting agent for metals. By contrast, they may be a result of the reduction of metal ions on contact with bitumen, and may reach levels so high that ore mineral inclusions are precipitated. Metal determinations of British bitumens suggest that new metal anomalies can be detected by this approach, that some metal anomalies within bitumens may be related to ore mineralization, and that bitumens from different sources may be distinguished by their metal contents. The potential use of bitumen distribution and/or metal enrichment within bitumen for ore exploration is dependent on the metal concerned, and in particular whether the metal is transported by association with organic materials or reduced in the presence of organic materials.
Locality-constrained anomaly detection for hyperspectral imagery
NASA Astrophysics Data System (ADS)
Liu, Jiabin; Li, Wei; Du, Qian; Liu, Kui
2015-12-01
Detecting a target with low occurrence probability against an unknown background in a hyperspectral image, namely anomaly detection, is of practical significance. The Reed-Xiaoli (RX) algorithm is considered a classic anomaly detector; it calculates the Mahalanobis distance between the local background and the pixel under test. Local RX, an adaptive RX detector, employs a dual-window strategy and treats the pixels within the frame between the inner and outer windows as the local background. However, the detector is sensitive when this local region contains anomalous pixels (i.e., outliers). In this paper, a locality-constrained anomaly detector is proposed to remove outliers from the local background region before applying the RX algorithm. Specifically, a local linear representation is designed to exploit the internal relationship between linearly correlated pixels in the local background region and the pixel under test and its neighbors. Experimental results demonstrate that the proposed detector improves on the original local RX algorithm.
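The dual-window local RX score described above can be sketched directly: gather the pixels in the frame between an inner (guard) and an outer window, and compute the Mahalanobis distance of the centre pixel to that background. Window sizes and the covariance regularization term are assumptions; the paper's locality-constrained outlier removal step is not included.

```python
import numpy as np

def local_rx(cube, row, col, inner=3, outer=9, eps=1e-6):
    """Dual-window local RX score for one pixel of an H x W x B hyperspectral cube."""
    H, W, B = cube.shape
    ro, ri = outer // 2, inner // 2
    rows = np.arange(max(0, row - ro), min(H, row + ro + 1))
    cols = np.arange(max(0, col - ro), min(W, col + ro + 1))
    rr, cc = np.meshgrid(rows, cols, indexing="ij")
    guard = (np.abs(rr - row) <= ri) & (np.abs(cc - col) <= ri)
    bg = cube[rr[~guard], cc[~guard]]                 # background = frame between windows
    mu = bg.mean(axis=0)
    cov = np.cov(bg, rowvar=False) + eps * np.eye(B)  # regularized background covariance
    d = cube[row, col] - mu
    return float(d @ np.linalg.solve(cov, d))         # squared Mahalanobis distance
```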
NASA Astrophysics Data System (ADS)
Raeesi, M.
2009-05-01
During the 1970s, some researchers noticed that large earthquakes occur repeatedly at the same locations. These observations led to the asperity hypothesis. At the same time, some researchers noticed that there was a relationship between the location of great interplate earthquakes and submarine structures, basins in particular, over the rupture area in the forearc regions. Despite these observations there was no comprehensive and reliable hypothesis explaining the relationship, and there were numerous pros and cons to the various hypotheses given in this regard. In their pioneering study, Song and Simons (2003) approached the problem using gravity data. This was a turning point in seismology. Although their approach was correct, an appropriate gravity anomaly had to be used in order to reveal the location and extent of the asperities. Following the method of Song and Simons (2003) but using the Bouguer gravity anomaly, which we call the "Trench Parallel Bouguer Anomaly" (TPBA), we found a strong, logical, and convincing relation between the TPBA-derived asperities and the slip distribution as well as the earthquake distribution, foreshocks and aftershocks in particular. Various parameters with different levels of importance are known to affect the contact between the subducting and overriding plates. We found that the TPBA can show which of these factors are important. Because the TPBA-derived asperities are based on static physical properties (gravity and elevation), they do not suffer from instabilities due to trade-offs, as happens for asperities derived in dynamic studies such as waveform inversion. Comparison of the TPBA-derived asperities with the rupture processes of well-studied great earthquakes reveals the high level of accuracy of the TPBA. This new measure opens a forensic viewpoint on the rupture process along subduction zones. The TPBA reveals the reason behind magnitude 9+ earthquakes and explains where and why they occur. It reveals the areas that can generate tsunami earthquakes, and it gives a logical dimension to the foreshock and aftershock distributions. Using the TPBA, we can derive scenarios for the early 20th century great earthquakes for which limited data are available. We present cases from the Aleutian and South American subduction zones. The TPBA explains why there should be no great earthquake in the down-dip part of the Shumagin segment, but that a major tsunami earthquake should occur in its up-dip part; our evidence suggests that this process has already started. We give numerous examples for the South America, Aleutian-Alaska, and Kurile-Kamchatka subduction zones, and we also look at Cascadia. Despite the various possible applications of the new measure, here we draw attention to its most important application - the detection of critical asperities. Supplied with this new measure, in addition to the available seismological data, seismologists should be able to detect the critical asperities and follow the evolving rupture process. This paves the way for systematically revealing the great interplate earthquakes.
Quantifying Performance Bias in Label Fusion
2012-08-21
Fusing legacy classification systems may provide the end-user with the means to appropriately adjust performance and select optimal decision thresholds. (Cited works include: "Boolean combination of classification systems in ROC space: An application to anomaly detection with HMMs," Pattern Recognition, 43(8), 2732-2752; and an overview of neural network use in anomaly intrusion detection systems, presented at the Research and Development (SCOReD) conference, 2009.)
Anomaly detection for machine learning redshifts applied to SDSS galaxies
NASA Astrophysics Data System (ADS)
Hoyle, Ben; Rau, Markus Michael; Paech, Kerstin; Bonnett, Christopher; Seitz, Stella; Weller, Jochen
2015-10-01
We present an analysis of anomaly detection for machine learning redshift estimation. Anomaly detection allows the removal of poor training examples, which can adversely influence redshift estimates. Anomalous training examples may be photometric galaxies with incorrect spectroscopic redshifts, or galaxies with one or more poorly measured photometric quantities. We select 2.5 million `clean' SDSS DR12 galaxies with reliable spectroscopic redshifts, and 6730 `anomalous' galaxies with spectroscopic redshift measurements which are flagged as unreliable. We contaminate the clean base galaxy sample with galaxies with unreliable redshifts and attempt to recover the contaminating galaxies using the Elliptical Envelope technique. We then train four machine learning architectures for redshift analysis on both the contaminated sample and on the preprocessed `anomaly-removed' sample, and measure redshift statistics on a clean validation sample generated without any preprocessing. We find an improvement of up to 80 per cent on all measured statistics when training on the anomaly-removed sample as compared with training on the contaminated sample, for each of the machine learning routines explored. We further describe a method to estimate the contamination fraction of a base data sample.
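The Elliptical Envelope step, as implemented in scikit-learn, fits a robust Gaussian estimate of the data and labels points far from it as outliers; a minimal sketch of using it to clean a training set is shown below. The contamination fraction and the choice to stack photometric features with the redshift are assumptions, not the paper's exact configuration.

```python
import numpy as np
from sklearn.covariance import EllipticEnvelope

def remove_anomalous_training_examples(X_phot, z_spec, contamination=0.01):
    """Drop training galaxies flagged as outliers by a robust elliptical envelope fit."""
    Xy = np.column_stack([X_phot, z_spec])          # photometric features + spectroscopic z
    keep = EllipticEnvelope(contamination=contamination).fit_predict(Xy) == 1
    return X_phot[keep], z_spec[keep]
```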
Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets
Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin
2017-01-01
A road traffic anomaly denotes a road segment that is anomalous in terms of vehicle traffic flow. Detecting road traffic anomalies from GPS (Global Positioning System) snippet data is becoming critical in urban computing since such anomalies often suggest underlying events. However, the noisy and sparse nature of GPS snippet data introduces multiple problems that make the detection of road traffic anomalies very challenging. To address these issues, we propose a two-stage solution which consists of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference, incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large GPS snippet collections via a tensor decomposition technique. The RAT model then calculates the anomaly degree for each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one month of GPS location data from more than eight thousand taxicabs in Beijing. The evaluation results show the advantages of our method over other baseline techniques. PMID:28282948
Anomaly Detection in the Right Hemisphere: The Influence of Visuospatial Factors
ERIC Educational Resources Information Center
Smith, Stephen D.; Dixon, Michael J.; Tays, William J.; Bulman-Fleming, M. Barbara
2004-01-01
Previous research with both brain-damaged and neurologically intact populations has demonstrated that the right cerebral hemisphere (RH) is superior to the left cerebral hemisphere (LH) at detecting anomalies (or incongruities) in objects (Ramachandran, 1995; Smith, Tays, Dixon, & Bulman-Fleming, 2002). The current research assesses whether the RH…
A Semiparametric Model for Hyperspectral Anomaly Detection
2012-01-01
Targets (vehicles near the treeline in Cube 1, Figure 1) appear in the presence of natural background clutter (e.g., trees, dirt roads, grasses); each target consists of about 7 × 4 pixels. These vehicles constitute the target set, although anomaly detectors are not designed to detect a particular target.
Liu, Guanqun; Jia, Yonggang; Liu, Hongjun; Qiu, Hanxue; Qiu, Dongling; Shan, Hongxian
2002-03-01
The exploration and determination of leakage from underground pressureless nonmetallic pipes is difficult. A comprehensive method combining Ground Penetrating Radar (GPR), electric potential surveying and geochemical surveying is introduced in this paper for leakage detection of an underground pressureless nonmetallic sewage pipe. Theoretically, within the influence zone of a leakage spot, the marked changes in the electromagnetic properties and the physical-chemical properties of the underground media will be reflected as anomalies in the GPR and electrical survey plots. The advantage of GPR and electrical surveying is fast and accurate delineation of the anomaly scope, and in-situ analysis of the geophysical surveys can guide the geochemical survey. Water and soil sampling and analysis then provide the evidence for judging whether an anomaly is caused by pipe leakage. On the basis of previous tests and practical surveys, the GPR waveforms, electric potential curves, contour maps, and chemical survey results are all classified into three types according to the extent or indexes of the anomalies in order to find the leakage spots. When all three survey methods show type I anomalies at an anomalous spot, this spot is suspected as the most probable leakage location; otherwise, it is downgraded to a less suspected point. The suspected leakage spots should be confirmed by reference to site conditions, because some anomalies are caused by other factors. Subsequent excavation proved that the method of determining suspected locations by anomaly type is effective and economical. The comprehensive method of GPR, electric potential surveying, and geochemical surveying is thus an effective, fast and accurate approach to leakage detection for underground nonmetallic pressureless pipes.
Cost Analysis of Following Up Incomplete Low-Risk Fetal Anatomy Ultrasounds.
O'Brien, Karen; Shainker, Scott A; Modest, Anna M; Spiel, Melissa H; Resetkova, Nina; Shah, Neel; Hacker, Michele R
2017-03-01
To examine the clinical utility and cost of follow-up ultrasounds performed as a result of suboptimal views at the time of initial second-trimester ultrasound in a cohort of low-risk pregnant women. We conducted a retrospective cohort study of women at low risk for fetal structural anomalies who had second-trimester ultrasounds at 16 to less than 24 weeks of gestation from 2011 to 2013. We determined the probability of women having follow-up ultrasounds as a result of suboptimal views at the time of the initial second-trimester ultrasound, and calculated the probability of detecting an anomaly on follow-up ultrasound. These probabilities were used to estimate the national cost of our current ultrasound practice, and the cost to identify one fetal anomaly on follow-up ultrasound. During the study period, 1,752 women met inclusion criteria. Four fetuses (0.23% [95% CI 0.06-0.58]) were found to have anomalies at the initial ultrasound. Because of suboptimal views, 205 women (11.7%) returned for a follow-up ultrasound, and one (0.49% [95% CI 0.01-2.7]) anomaly was detected. Two women (0.11%) still had suboptimal views and returned for an additional follow-up ultrasound, with no anomalies detected. When the incidence of incomplete ultrasounds was applied to a similar low-risk national cohort, the annual cost of these follow-up scans was estimated at $85,457,160. In our cohort, the cost to detect an anomaly on follow-up ultrasound was approximately $55,000. The clinical yield of performing follow-up ultrasounds because of suboptimal views on low-risk second-trimester ultrasounds is low. Since so few fetal abnormalities were identified on follow-up scans, this added cost and patient burden may not be warranted. © 2016 Wiley Periodicals, Inc.
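As a back-of-envelope check of the cost figures above: with 205 follow-up scans (plus two second follow-ups) yielding one detected anomaly, the reported ~$55,000 cost per detected anomaly implies a per-scan cost on the order of a few hundred dollars. The per-scan figure below is back-calculated, not a value stated in the paper.

```python
# Back-of-envelope reproduction of the reported cost-per-detection (illustrative only).
followup_scans = 205 + 2                 # first follow-ups plus two additional follow-ups
anomalies_found_on_followup = 1
cost_to_detect_one = 55_000              # dollars, as reported in the abstract

implied_cost_per_scan = cost_to_detect_one * anomalies_found_on_followup / followup_scans
print(f"implied cost per follow-up scan: ~${implied_cost_per_scan:,.0f}")   # roughly $266
```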
Machine intelligence-based decision-making (MIND) for automatic anomaly detection
NASA Astrophysics Data System (ADS)
Prasad, Nadipuram R.; King, Jason C.; Lu, Thomas
2007-04-01
Any event deemed out of the ordinary may be called an anomaly. Anomalies, by virtue of their definition, are events that occur spontaneously with no prior indication of their existence or appearance. The effects of anomalies are typically unknown until they actually occur, and these effects aggregate in time to show a noticeable change from the original behavior. An evolved behavior would in general be very difficult to correct unless the anomalous event that caused it can be detected early and any consequence attributed to that specific anomaly. Substantial time and effort are required to backtrack the cause of abnormal behavior and to recreate the event sequence leading to it. There is therefore a critical need to automatically detect anomalous behavior as and when it occurs, and to do so with the operator in the loop. Human-machine interaction results in better machine learning and a better decision-support mechanism. This is the fundamental concept of intelligent control, where machine learning is enhanced by interaction with human operators, and vice versa. The paper discusses a revolutionary framework for the characterization, detection, identification, learning, and modeling of anomalous behavior in observed phenomena arising from a large class of unknown and uncertain dynamical systems.
Tune, Sarah; Schlesewsky, Matthias; Small, Steven L.; Sanford, Anthony J.; Bohan, Jason; Sassenhagen, Jona; Bornkessel-Schlesewsky, Ina
2014-01-01
The N400 event-related brain potential (ERP) has played a major role in the examination of how the human brain processes meaning. For current theories of the N400, classes of semantic inconsistencies which do not elicit N400 effects have proven particularly influential. Semantic anomalies that are difficult to detect are a case in point (“borderline anomalies”, e.g. “After an air crash, where should the survivors be buried?”), engendering a late positive ERP response but no N400 effect in English (Sanford, Leuthold, Bohan, & Sanford, 2011). In three auditory ERP experiments, we demonstrate that this result is subject to cross-linguistic variation. In a German version of Sanford and colleagues' experiment (Experiment 1), detected borderline anomalies elicited both N400 and late positivity effects compared to control stimuli or to missed borderline anomalies. Classic easy-to-detect semantic (non-borderline) anomalies showed the same pattern as in English (N400 plus late positivity). The cross-linguistic difference in the response to borderline anomalies was replicated in two additional studies with a slightly modified task (Experiment 2a: German; Experiment 2b: English), with a reliable LANGUAGE × ANOMALY interaction for the borderline anomalies confirming that the N400 effect is subject to systematic cross-linguistic variation. We argue that this variation results from differences in the language-specific default weighting of top-down and bottom-up information, concluding that N400 amplitude reflects the interaction between the two information sources in the form-to-meaning mapping. PMID:24447768
Inversion of Magnetic Measurements of the CHAMP Satellite Over the Pannonian Basin
NASA Technical Reports Server (NTRS)
Kis, K. I.; Taylor, P. T.; Wittmann, G.; Toronyi, B.; Puszta, S.
2011-01-01
The Pannonian Basin is a deep intra-continental basin that formed as part of the Alpine orogeny. In order to study the nature of the crustal basement we used the long-wavelength magnetic anomalies acquired by the CHAMP satellite. The anomalies were distributed in a spherical shell: some 107,927 data points recorded between January 1 and December 31 of 2008, covering the Pannonian Basin and its vicinity. These anomaly data were interpolated onto a spherical grid of 0.5° x 0.5° at an elevation of 324 km using a Gaussian weight function. The vertical gradient of these total magnetic anomalies was also computed and mapped onto the surface of a sphere at 324 km elevation. The earlier spherical anomaly data at 425 km altitude were downward continued to 324 km. To interpret these data at the elevation of 324 km we used an inversion method with a polygonal prism forward model. The minimization problem was solved numerically by the Simplex and Simulated Annealing methods, with an L2 norm used in the case of Gaussian-distributed parameters and an L1 norm in the case of Laplace-distributed parameters. We interpret the magnetic anomaly as being produced by several sources, together with the effect of the stable magnetization of hemo-ilmenite exsolution minerals in the upper crustal metamorphic rocks.
Discrepancy of cytogenetic analysis in Western and eastern Taiwan.
Chang, Yu-Hsun; Chen, Pui-Yi; Li, Tzu-Ying; Yeh, Chung-Nan; Li, Yi-Shian; Chu, Shao-Yin; Lee, Ming-Liang
2013-06-01
This study aimed at investigating the results of second-trimester amniocyte karyotyping in western and eastern Taiwan, and identifying any regional differences in the prevalence of fetal chromosomal anomalies. From 2004 to 2009, pregnant women who underwent amniocentesis in their second trimester at three hospitals in western Taiwan and at four hospitals in eastern Taiwan were included. All the cytogenetic analyses of cultured amniocytes were performed in the cytogenetics laboratory of the Genetic Counseling Center of Hualien Buddhist Tzu Chi General Hospital. We used the chi-square test, Student t test, and Mann-Whitney U test to evaluate the variants of clinical indications, amniocyte karyotyping results, and prevalence and types of chromosomal anomalies in western and eastern Taiwan. During the study period, 3573 samples, 1990 (55.7%) from western Taiwan and 1583 (44.3%) from eastern Taiwan, were collected and analyzed. The main indication for amniocyte karyotyping was advanced maternal age (69.0% in western Taiwan, 67.1% in eastern Taiwan). The detection rates of chromosomal anomalies by amniocyte karyotyping in eastern Taiwan (45/1582, 2.8%) did not differ significantly from that in western Taiwan (42/1989, 2.1%) (p = 1.58). Mothers who had abnormal ultrasound findings and histories of familial hereditary diseases or chromosomal anomalies had higher detection rates of chromosomal anomalies (9.3% and 7.2%, respectively). The detection rate of autosomal anomalies was higher in eastern Taiwan (93.3% vs. 78.6%, p = 0.046), but the detection rate of sex-linked chromosomal anomalies was higher in western Taiwan (21.4% vs. 6.7%, p = 0.046). We demonstrated regional differences in second-trimester amniocyte karyotyping results and established a database of common chromosomal anomalies that could be useful for genetic counseling, especially in eastern Taiwan. Copyright © 2012. Published by Elsevier B.V.
NASA Technical Reports Server (NTRS)
Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.
1975-01-01
The author has identified the following significant results. Ground truth data collection proves that significant anomalies exist at 13 volcanoes within the test site of Central America. The dimensions and temperature contrast of these ten anomalies are large enough to be detected by the Skylab 192 instrument. The dimensions and intensity of thermal anomalies have changed at most of these volcanoes during the Skylab mission.
System and method for anomaly detection
Scherrer, Chad
2010-06-15
A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrence as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.
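As a rough illustration of the frequency-table idea described in this record, the following Python sketch scores categorical observations by their negative log-likelihood under a simple independence assumption. It is not the patented algorithm; the fields, smoothing, and variable names are assumptions introduced for illustration only.

# Hedged illustration (not the patented method): score categorical network
# observations against frequency tables built from a training stream.
from collections import Counter
import math

def build_tables(observations, fields):
    """Count how often each value of each field occurs in the training stream."""
    tables = {f: Counter(obs[f] for obs in observations) for f in fields}
    totals = {f: sum(tables[f].values()) for f in fields}
    return tables, totals

def score(obs, tables, totals, fields, alpha=1.0):
    """Negative log-likelihood under independent per-field frequencies.
    Higher scores indicate rarer (more anomalous) observations."""
    s = 0.0
    for f in fields:
        count = tables[f].get(obs[f], 0)
        # Laplace smoothing so unseen values receive a finite, large score.
        p = (count + alpha) / (totals[f] + alpha * (len(tables[f]) + 1))
        s -= math.log(p)
    return s

# Toy usage with hypothetical flow records.
fields = ["proto", "dst_port"]
train = [{"proto": "tcp", "dst_port": 80}, {"proto": "tcp", "dst_port": 443},
         {"proto": "udp", "dst_port": 53}] * 100
tables, totals = build_tables(train, fields)
print(score({"proto": "tcp", "dst_port": 80}, tables, totals, fields))
print(score({"proto": "icmp", "dst_port": 0}, tables, totals, fields))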
1988-04-01
anomalies (including duplication of the ureters, hypospadias and ectopic kidney) in 6%. (34) I. Pottern reports on 73 testicular cancer patients seen... urogenital anomalies such as hypospadias and ureteral duplication. (52) Among 100 consecutive urograms performed on cryptorchid boys, anomalies were
A primitive study on unsupervised anomaly detection with an autoencoder in emergency head CT volumes
NASA Astrophysics Data System (ADS)
Sato, Daisuke; Hanaoka, Shouhei; Nomura, Yukihiro; Takenaga, Tomomi; Miki, Soichiro; Yoshikawa, Takeharu; Hayashi, Naoto; Abe, Osamu
2018-02-01
Purpose: The target disorders of emergency head CT are wide-ranging. Therefore, people working in an emergency department desire a computer-aided detection system for general disorders. In this study, we proposed an unsupervised anomaly detection method in emergency head CT using an autoencoder and evaluated the anomaly detection performance of our method in emergency head CT. Methods: We used a 3D convolutional autoencoder (3D-CAE), which contains 11 layers in the convolution block and 6 layers in the deconvolution block. In the training phase, we trained the 3D-CAE using 10,000 3D patches extracted from 50 normal cases. In the test phase, we calculated the abnormality of each voxel in 38 emergency head CT volumes (22 abnormal cases and 16 normal cases) and evaluated the likelihood of lesion existence. Results: Our method achieved a sensitivity of 68% and a specificity of 88%, with an area under the receiver operating characteristic curve of 0.87. This shows that the method has moderate accuracy in distinguishing normal CT cases from abnormal ones. Conclusion: Our method has potential for anomaly detection in emergency head CT.
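The sketch below shows the general shape of such an approach: a small 3D convolutional autoencoder trained on normal patches, with the per-voxel reconstruction error used as the abnormality map. It is far shallower than the 11+6-layer network described in the record, and the patch size, filter counts, and training settings are assumptions, not the authors' configuration.

# Minimal 3D convolutional autoencoder sketch (much shallower than the 3D-CAE
# described above); patch size and filters are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

patch = (32, 32, 32, 1)  # hypothetical 3D patch size
inputs = keras.Input(shape=patch)
x = layers.Conv3D(16, 3, strides=2, padding="same", activation="relu")(inputs)
x = layers.Conv3D(32, 3, strides=2, padding="same", activation="relu")(x)
x = layers.Conv3DTranspose(32, 3, strides=2, padding="same", activation="relu")(x)
x = layers.Conv3DTranspose(16, 3, strides=2, padding="same", activation="relu")(x)
outputs = layers.Conv3D(1, 3, padding="same", activation="sigmoid")(x)
cae = keras.Model(inputs, outputs)
cae.compile(optimizer="adam", loss="mse")

# Train on normal patches only; at test time the per-voxel reconstruction
# error |x - cae(x)| serves as the abnormality map.
normal_patches = np.random.rand(8, *patch).astype("float32")  # placeholder data
cae.fit(normal_patches, normal_patches, epochs=1, batch_size=4, verbose=0)
recon_error = np.abs(normal_patches - cae.predict(normal_patches, verbose=0))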
Europium anomaly in plagioclase feldspar - Experimental results and semiquantitative model.
NASA Technical Reports Server (NTRS)
Weill, D. F.; Drake, M. J.
1973-01-01
The partition of europium between plagioclase feldspar and magmatic liquid is considered in terms of the distribution coefficients for divalent and trivalent europium. A model equation is derived giving the europium anomaly in plagioclase as a function of temperature and oxygen fugacity. The model explains europium anomalies in plagioclase synthesized under controlled laboratory conditions as well as the variations of the anomaly observed in natural terrestrial and extraterrestrial igneous rocks.
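As a hedged sketch of the general form such a model can take (the symbols and the Arrhenius factor below are assumptions, not necessarily the exact equation of the paper), the bulk partition coefficient can be written as a mixture of the divalent and trivalent coefficients weighted by the redox ratio:

% Hedged sketch; r is the Eu3+/Eu2+ ratio, controlled by oxygen fugacity and temperature.
\[
  D_{\mathrm{Eu}} \;=\; \frac{D_{\mathrm{Eu}^{2+}} + r\,D_{\mathrm{Eu}^{3+}}}{1 + r},
  \qquad
  r \equiv \frac{[\mathrm{Eu}^{3+}]}{[\mathrm{Eu}^{2+}]} \;\propto\; f_{\mathrm{O}_2}^{1/4}\,
  \exp\!\left(-\frac{\Delta H}{RT}\right),
\]
% so the Eu anomaly in plagioclase grows as oxygen fugacity falls (more Eu2+, which
% substitutes readily for Ca) and varies with temperature.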
Europium anomaly in plagioclase feldspar: experimental results and semiquantitative model.
Weill, D F; Drake, M J
1973-06-08
The partition of europium between plagioclase feldspar and magmatic liquid is considered in terms of the distribution coefficients for divalent and trivalent europium. A model equation is derived giving the europium anomaly in plagioclase as a function of temperature and oxygen fugacity. The model explains europium anomalies in plagioclase synthesized under controlled laboratory conditions as well as the variations of the anomaly observed in natural terrestrial and extraterrestrial igneous rocks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tardiff, Mark F.; Runkle, Robert C.; Anderson, K. K.
2006-01-23
The goal of primary radiation monitoring in support of routine screening and emergency response is to detect characteristics in vehicle radiation signatures that indicate the presence of potential threats. Two conceptual approaches to analyzing gamma-ray spectra for threat detection are isotope identification and anomaly detection. While isotope identification is the time-honored method, an emerging technique is anomaly detection, which uses benign vehicle gamma-ray signatures to define an expectation of the radiation signature for vehicles that do not pose a threat. Newly acquired spectra are then compared to this expectation using statistical criteria that reflect acceptable false alarm rates and probabilities of detection. The gamma-ray spectra analyzed here were collected at a U.S. land Port of Entry (POE) using a NaI-based radiation portal monitor (RPM). The raw data were analyzed to develop a benign vehicle expectation by decimating the original pulse-height channels to 35 energy bins, extracting composite variables via principal components analysis (PCA), and estimating statistically weighted distances from the mean vehicle spectrum with the Mahalanobis distance (MD) metric. This paper reviews the methods used to establish the anomaly identification criteria and presents a systematic analysis of the response of the combined PCA and MD algorithm to modeled mono-energetic gamma-ray sources.
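The PCA plus Mahalanobis-distance screening described here can be sketched in a few lines of Python. The component count, the 99.9th-percentile threshold, and the synthetic Poisson spectra below are assumptions for illustration, not the deployed RPM settings.

# Hedged sketch of PCA + Mahalanobis-distance screening of binned spectra.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
benign = rng.poisson(50, size=(1000, 35)).astype(float)   # 35-bin benign spectra
new = rng.poisson(50, size=(20, 35)).astype(float)        # newly acquired spectra

pca = PCA(n_components=5).fit(benign)                     # composite variables
scores_ref = pca.transform(benign)
mean = scores_ref.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(scores_ref, rowvar=False))

def mahalanobis(spectrum):
    d = pca.transform(spectrum.reshape(1, -1))[0] - mean
    return float(np.sqrt(d @ cov_inv @ d))

# Threshold chosen from the benign population to reflect an acceptable false alarm rate.
threshold = np.percentile([mahalanobis(s) for s in benign], 99.9)
alarms = [i for i, s in enumerate(new) if mahalanobis(s) > threshold]
print(alarms)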
Method for Real-Time Model Based Structural Anomaly Detection
NASA Technical Reports Server (NTRS)
Urnes, James M., Sr. (Inventor); Smith, Timothy A. (Inventor); Reichenbach, Eric Y. (Inventor)
2015-01-01
A system and methods for real-time model based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during an operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. A statistical significance of the modeling error signal to provide an error significance is calculated, and a persistence of the error significance is determined. A structural anomaly is indicated, if the persistence exceeds a persistence threshold value.
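The residual / significance / persistence logic claimed here can be illustrated with a short sketch. The model, noise level, and both thresholds are assumptions, not values from the patent.

# Hedged sketch of residual monitoring with a persistence test.
import numpy as np

def detect(measured, expected, sigma, z_thresh=3.0, persistence_thresh=5):
    """Flag a structural anomaly when the modeling error stays statistically
    significant for more than persistence_thresh consecutive samples."""
    persistence = 0
    for m, e in zip(measured, expected):
        error = m - e                       # modeling error signal
        significant = abs(error) / sigma > z_thresh
        persistence = persistence + 1 if significant else 0
        if persistence > persistence_thresh:
            return True
    return False

expected = np.zeros(100)
measured = np.concatenate([np.random.normal(0, 1, 80), np.random.normal(6, 1, 20)])
print(detect(measured, expected, sigma=1.0))   # True: sustained deviation at the end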
Spectral anomaly methods for aerial detection using KUT nuisance rejection
NASA Astrophysics Data System (ADS)
Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.
2015-06-01
This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical due to higher speeds, larger field of view, and geographically induced background changes. As well, the possible large altitude or stand-off distance variations cause significant steps in background count rate as well as spectral changes due to increased gamma-ray scatter with detection at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems; a NaI(Tl)-detector-based system and a CsI detector array. The optimization here details the adaptation of the spectral windows for a particular set of target sources to aerial detection and the tailoring for the specific detectors. As well, the methodology and results for background rejection methods optimized for the aerial gamma-ray detection using Potassium, Uranium and Thorium (KUT) nuisance rejection are shown. Results indicate that use of a realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection due to altitude changes and geographically induced steps such as at land-water interfaces.
2003-11-01
Lafayette, IN 47907. [Lane et al-97b] T. Lane and C. E. Brodley. Sequence matching and learning in anomaly detection for computer security. Proceedings of... Mining, pp 259-263. 1998. [Lane et al-98b] T. Lane and C. E. Brodley. Temporal sequence learning and data reduction for anomaly detection... W. Lee, C. Park, and S. Stolfo. Towards Automatic Intrusion Detection using NFR. 1st USENIX Workshop on Intrusion Detection and Network Monitoring
Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam
2016-01-01
Operational faults and behavioural anomalies associated with PLC control processes take place often in a manufacturing system. Real time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT) that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme to satisfy those purposes. Our experiments show that PLAT is significantly fast, provides real time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively.
Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun
2016-01-01
Operational faults and behavioural anomalies associated with PLC control processes take place often in a manufacturing system. Real time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT) that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme to satisfy those purposes. Our experiments show that PLAT is significantly fast, provides real time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively. PMID:27974882
Congenital aplastic-hypoplastic lumbar pedicle in infants and young children
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yousefzadeh, D.K.; El-Khoury, G.Y.; Lupetin, A.R.
1982-01-01
Nine cases of congenital aplastic-hypoplastic lumbar pedicle (mean age 27 months) are described. Their data are compared to those of 18 other reported cases (mean age 24.7 years) and the following conclusions are made: (1) Almost exclusively, the pedicular defect in infants and young children is due to developmental anomaly rather than destruction by malignancy or infectious processes. (2) This anomaly, we think, is more common than it is believed to be. (3) Unlike adults, infants and young children rarely develop hypertrophy and/or sclerosis of the contralateral pedicle. (4) Detection of pedicular anomaly is more than satisfying a radiographic curiosity and may lead to discovery of other coexisting anomalies. (5) Ultrasonic screening of the patients with congenital pedicular defects may detect the associated genitourinary anomalies, if present, and justify further studies in a selected group of patients.
Machine Learning in Intrusion Detection
2005-07-01
machine learning tasks. Anomaly detection provides the core technology for a broad spectrum of security-centric applications. In this dissertation, we examine various aspects of anomaly based intrusion detection in computer security. First, we present a new approach to learn program behavior for intrusion detection. Text categorization techniques are adopted to convert each process to a vector and calculate the similarity between two program activities. Then the k-nearest neighbor classifier is employed to classify program behavior as normal or intrusive. We demonstrate
Observed TEC Anomalies by GNSS Sites Preceding the Aegean Sea Earthquake of 2014
NASA Astrophysics Data System (ADS)
Ulukavak, Mustafa; Yalçınkaya, Mualla
2016-11-01
In recent years, Total Electron Content (TEC) data, obtained from Global Navigation Satellite Systems (GNSS) receivers, have been widely used to detect seismo-ionospheric anomalies. In this study, Global Positioning System - Total Electron Content (GPS-TEC) data were used to investigate ionospheric abnormal behaviors prior to the 2014 Aegean Sea earthquake (40.305°N 25.453°E, 24 May 2014, 09:25:03 UT, Mw:6.9). The data obtained from three Continuously Operating Reference Stations in Turkey (CORS-TR) and two International GNSS Service (IGS) sites near the epicenter of the earthquake were used to detect ionospheric anomalies before the earthquake. The solar activity index (F10.7) and the geomagnetic activity index (Dst), which are both related to space weather conditions, were used to analyze these pre-earthquake ionospheric anomalies. An examination of these indices indicated high solar activity between May 8 and 15, 2014. The first significant increase (positive anomaly) in Vertical Total Electron Content (VTEC) was detected on May 14, 2014, or 10 days before the earthquake. This positive anomaly can be attributed to the high solar activity. The indices do not imply high solar or geomagnetic activity after May 15, 2014. Abnormal ionospheric TEC changes (negative anomaly) were observed at all stations one day before the earthquake. These changes were lower than the lower bound by approximately 10-20 TEC units (TECU), and may be considered the ionospheric precursor of the 2014 Aegean Sea earthquake.
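A sliding-window bound test of the kind commonly used in TEC precursor studies is sketched below. The window length, the k factor, and the synthetic series are assumptions, not the values used in this record.

# Hedged sketch of a sliding-window upper/lower bound test on VTEC.
import numpy as np

def tec_anomalies(vtec, window=15, k=1.34):
    """Flag epochs where VTEC leaves [median - k*IQR, median + k*IQR]
    computed over the preceding window days."""
    flags = []
    for i in range(window, len(vtec)):
        ref = vtec[i - window:i]
        med = np.median(ref)
        iqr = np.subtract(*np.percentile(ref, [75, 25]))
        flags.append(vtec[i] < med - k * iqr or vtec[i] > med + k * iqr)
    return np.array(flags)

vtec = np.concatenate([np.random.normal(30, 2, 60), [12.0]])  # TECU, synthetic
print(tec_anomalies(vtec)[-1])   # True: a negative anomaly on the final day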
Eddy-Current Inspection of Ball Bearings
NASA Technical Reports Server (NTRS)
Bankston, B.
1985-01-01
Custom eddy-current probe locates surface anomalies. Low friction air cushion within cone allows ball to roll easily. Eddy current probe reliably detects surface and near-surface cracks, voids, and material anomalies in bearing balls or other spherical objects. Defects in ball surface detected by probe displayed on CRT and recorded on strip-chart recorder.
Anomaly Detection Techniques for Ad Hoc Networks
ERIC Educational Resources Information Center
Cai, Chaoli
2009-01-01
Anomaly detection is an important and indispensable aspect of any computer security mechanism. Ad hoc and mobile networks consist of a number of peer mobile nodes that are capable of communicating with each other absent a fixed infrastructure. Arbitrary node movements and lack of centralized control make them vulnerable to a wide variety of…
A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Rinehart, Aidan W.
2014-01-01
This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
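A minimal sketch of the trim-point update idea is shown below: fit a simple curve to nominal steady-state points and use it as the model prediction when forming residuals. The polynomial order, variable names, and synthetic data are assumptions, not the NASA implementation.

# Hedged sketch of curve-fitting steady-state trim points and computing a residual.
import numpy as np

power_setting = np.linspace(0.2, 1.0, 40)                       # nominal operating points
sensed_output = 500 + 900 * power_setting ** 1.5 \
                + np.random.normal(0, 5, power_setting.size)    # e.g. a temperature

coeffs = np.polyfit(power_setting, sensed_output, deg=3)        # updated trim curve
trim_model = np.poly1d(coeffs)

# Residual monitoring against the refreshed trim points.
new_power, new_measurement = 0.8, 1180.0
residual = new_measurement - trim_model(new_power)
print(f"residual = {residual:.1f}")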
A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Rinehart, Aidan Walker
2015-01-01
This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
Archean Isotope Anomalies as a Window into the Differentiation History of the Earth
NASA Astrophysics Data System (ADS)
Wainwright, A. N.; Debaille, V.; Zincone, S. A.
2018-05-01
No resolvable µ142Nd anomaly was detected in Paleo- Mesoarchean rocks of São Francisco and West African cratons. The lack of µ142Nd anomalies outside of North America and Greenland implies the Earth differentiated into at least two distinct domains.
Richards, K K; Hazelton, M L; Stevenson, M A; Lockhart, C Y; Pinto, J; Nguyen, L
2014-10-01
The widespread availability of computer hardware and software for recording and storing disease event information means that, in theory, we have the necessary information to carry out detailed analyses of factors influencing the spatial distribution of disease in animal populations. However, the reliability of such analyses depends on data quality, with anomalous records having the potential to introduce significant bias and lead to inappropriate decision making. In this paper we promote the use of exceedance probabilities as a tool for detecting anomalies when applying hierarchical spatio-temporal models to animal health data. We illustrate this methodology through a case study of data on outbreaks of foot-and-mouth disease (FMD) in Viet Nam for the period 2006-2008. A flexible binomial logistic regression was employed to model the number of FMD-infected communes within each province of the country. Standard analyses of the residuals from this model failed to identify problems, but exceedance probabilities identified provinces in which the number of reported FMD outbreaks was unexpectedly low. This finding is interesting given that these provinces are on major cattle movement pathways through Viet Nam. Copyright © 2014 Elsevier Ltd. All rights reserved.
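One common way to compute an exceedance probability from posterior-predictive samples is sketched below; it is an illustration of the general idea, not necessarily the exact definition used by the authors, and the data are invented.

# Hedged sketch of tail probabilities from posterior-predictive draws.
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical posterior-predictive draws of infected-commune counts per province.
post_pred = rng.binomial(n=200, p=0.05, size=(4000, 3))   # 3 provinces, 4000 draws
observed = np.array([11, 9, 1])                           # observed counts

# P(y_rep <= y_obs): values near 0 flag provinces reporting fewer outbreaks than
# the model expects; values near 1 flag unexpectedly many.
exceedance = (post_pred <= observed).mean(axis=0)
print(exceedance)   # a very small value for the third province flags under-reporting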
Magnetic anomalies on Io and their relationship to the spatial distribution of volcanic centers
NASA Astrophysics Data System (ADS)
Knicely, J.; Everett, M. E.; Sparks, D. W.
2014-12-01
The analysis of terrestrial magnetic anomalies has long proved useful for constraining crustal structure and dynamics. Here, we study Jupiter's moon, Io, using magnetics. We conduct forward modeling to make predictions of the crustal magnetic anomaly distribution on Io. Io is the most volcanic body in the solar system due to tidal heating from its Laplace resonance with Europa and Ganymede, causing extensive sulfur and silicate volcanism. We assume the magnetic susceptibility, which controls the measured magnetic signal, is controlled by temperature. Continuous overturn of the crust controls the vertical temperature profile, and local volcanic centers give the lateral temperature structure. As non-magnetic sulfur volcanism occurs at cool temperatures beneath the Curie point, it should not greatly affect the planetary magnetism and consequently is ignored in this paper. We assume that the average crustal temperatures are determined by a model of continuous burial by newly erupted material (O'Reilly and Davies 1981, Geophysical Research Letters), which put the Curie isotherm at great depth. We use a cylindrically symmetric model of the thermal evolution of the crust around an isolated volcanic center to obtain the local deviations in the thickness of the magnetizable layer. The crustal rocks are presumed to be mafic or ultramafic in composition, based on their spectral signatures, the temperature of the silicate volcanic eruptions, and their rheology as inferred from flow structures. Analysis of the 1997 Pillan eruption suggests a composition similar to lunar mare basalt or komatiite. The magnetic and thermal properties of lunar mare basalt have been well studied since the Apollo missions. Unaltered terrestrial ultramafics have been studied sufficiently to constrain their properties. A common technique of discretizing the magnetized material into prisms and summing the magnetic field of each prism as per Blakely (1995) was used to obtain an estimate of the crustal magnetic anomalies of Io as they would be measured by a satellite. The mapping is displayed as zonal bands so that a Cartesian geometry may be used. Early results indicated an accuracy better than 2 nT is required to detect the magnetic anomalies generated by volcanic activity.
NASA Astrophysics Data System (ADS)
Sorge, J.; Williams-Jones, G.; Wright, R.; Varley, N. R.
2010-12-01
Satellite imagery is playing an increasingly prominent role in volcanology as it allows for consistent monitoring of remote, dangerous, and/or under-monitored volcanoes. One such system is Volcán de Colima (Mexico), a persistently active andesitic stratovolcano. Its characteristic and hazardous activity includes lava dome growth, pyroclastic flows, explosions, and Plinian to Subplinian eruptions, which have historically occurred at the end of Volcán de Colima’s eruptive cycle. Despite the availability of large amounts of historical satellite imagery, methods to process and interpret these images over long time periods are limited. Furthermore, while time-series InSAR data from a previous study (December 2002 to August 2006) detected an overall subsidence between 1 and 3 km from the summit, there is insufficient temporal resolution to unambiguously constrain the source processes. To address this issue, a semi-automated process for time-based characterization of persistent volcanic activity at Volcán de Colima has been developed using a combination of MODIS and GOES satellite imagery to identify thermal anomalies on the volcano edifice. This satellite time-series data is then combined with available geodetic data, a detailed eruption history, and other geophysical time-series data (e.g., seismicity, explosions/day, effusion rate, environmental data, etc.) and examined for possible correlations and recurring patterns in the multiple data sets to investigate potential trigger mechanisms responsible for the changes in volcanic activity. GOES and MODIS images are available from 2000 to present at a temporal resolution of one image every 30 minutes and up to four images per day, respectively, creating a data set of approximately 180,000 images. Thermal anomalies over Volcán de Colima are identified in both night- and day-time images by applying a time-series approach to the analysis of MODIS data. Detection of false anomalies, caused by non-volcanic heat sources such as fires or solar heating (in the daytime images), is mitigated by adjusting the MODIS detection thresholds, through comparison of daytime versus nighttime results, and by observing the spatial distribution of the anomalies on the edifice. Conversely, anomalies may not be detected due to cloud cover; clouds absorb thermal radiation limiting or preventing the ability of the satellite to measure thermal events; therefore, the anomaly data is supplemented with a cloud cover time-series data set. Fast Fourier and Wavelet transforms are then applied to the continuous, uninterrupted intervals of satellite observation to compare and correlate with the multiple time-series data sets. The result is the characterization of the behavior of an individual volcano, based on an extended time period. This volcano specific, comprehensive characterization can then be used as a predictive tool in the real-time monitoring of volcanic activity.
GBAS Ionospheric Anomaly Monitoring Based on a Two-Step Approach
Zhao, Lin; Yang, Fuxin; Li, Liang; Ding, Jicheng; Zhao, Yuxin
2016-01-01
As one significant component of space environmental weather, the ionosphere has to be monitored using Global Positioning System (GPS) receivers for the Ground-Based Augmentation System (GBAS). This is because an ionospheric anomaly can pose a potential threat to GBAS support of safety-critical services. The traditional code-carrier divergence (CCD) methods, which have been widely used to detect variations of the ionospheric gradient for GBAS, adopt a linear time-invariant low-pass filter to suppress the effect of high-frequency noise on the detection of the ionospheric anomaly. However, there is a trade-off between response time and estimation accuracy due to the fixed time constants. To relax this limitation, a two-step approach (TSA) is proposed by integrating the cascaded linear time-invariant low-pass filters with an adaptive Kalman filter to detect ionospheric gradient anomalies. The performance of the proposed method is tested by using simulated and real-world data, respectively. The simulation results show that the TSA can detect ionospheric gradient anomalies quickly, even when the noise is more severe. Compared to the traditional CCD methods, the experiments with real-world GPS data indicate that the average estimation accuracy of the ionospheric gradient improves by more than 31.3%, and the average response time to an ionospheric gradient at a rate of 0.018 m/s improves by more than 59.3%, which demonstrates the ability of TSA to detect a small ionospheric gradient more rapidly. PMID:27240367
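The basic CCD building block that the TSA improves upon can be sketched as a low-pass filter on the code-minus-carrier divergence rate. The time constant, sample interval, and idealized noise levels below are assumptions; the cascaded-filter / adaptive Kalman combination of the TSA itself is not shown.

# Hedged sketch of low-pass filtering the code-minus-carrier divergence rate.
import numpy as np

def ccd_rate(code, carrier, dt=0.5, tau=100.0):
    """First-order low-pass estimate of d(code - carrier)/dt in m/s."""
    z = code - carrier                     # divergence, roughly 2x the ionospheric delay
    raw_rate = np.diff(z) / dt
    alpha = dt / (tau + dt)
    est, out = 0.0, []
    for r in raw_rate:
        est += alpha * (r - est)           # exponential smoothing
        out.append(est)
    return np.array(out)

# Synthetic, idealized example: a delay ramp of 0.01 m/s appears as a
# code-minus-carrier divergence rate of about 0.02 m/s.
t = np.arange(0, 1200, 0.5)
iono = 0.01 * t
code = 2.0e7 + iono + np.random.normal(0, 0.05, t.size)
carrier = 2.0e7 - iono + np.random.normal(0, 0.003, t.size)
print(ccd_rate(code, carrier)[-1])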
NASA Astrophysics Data System (ADS)
McCarthy, J. Howard, Jr.; Reimer, G. Michael
1986-11-01
Field studies have demonstrated that gas anomalies are found over buried mineral deposits. Abnormally high concentrations of sulfur gases and carbon dioxide and abnormally low concentrations of oxygen are commonly found over sulfide ore deposits. Helium anomalies are commonly associated with uranium deposits and geothermal areas. Helium and hydrocarbon gas anomalies have been detected over oil and gas deposits. Gases are sampled by extracting them from the pore space of soil, by degassing soil or rock, or by adsorbing them on artificial collectors. The two most widely used techniques for gas analysis are gas chromatography and mass spectrometry. The detection of gas anomalies at or near the surface may be an effective method to locate buried mineral deposits.
NASA Astrophysics Data System (ADS)
Bellaoui, Mebrouk; Hassini, Abdelatif; Bouchouicha, Kada
2017-05-01
Detection of thermal anomalies prior to earthquake events has been widely confirmed by researchers over the past decade. One of the popular approaches for anomaly detection is the Robust Satellite Technique (RST). In this paper, we use this method on a collection of six years of MODIS satellite data, representing land surface temperature (LST) images, to predict the 21 May 2003 Boumerdès, Algeria earthquake. The thermal anomaly results were compared with the ambient temperature variation measured at three meteorological stations of the Algerian National Office of Meteorology (ONM) (DELLYS-AFIR, TIZI-OUZOU, and DAR-EL-BEIDA). The results confirm the importance of RST as a highly effective approach for monitoring earthquakes.
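The core of an RST-style analysis is comparing each pixel's LST with its multi-year reference for the same place and calendar slot, in units of the reference standard deviation. The simplified sketch below illustrates that idea; the threshold, array names, and synthetic data are assumptions, not the configuration used in this study.

# Hedged, simplified sketch of an RST-style normalized index on LST images.
import numpy as np

def rst_index(lst_stack, lst_current):
    """lst_stack: (years, rows, cols) reference LST images for this calendar slot.
    Returns a signed index; values above roughly 2-3 are usually treated as anomalous."""
    mu = np.nanmean(lst_stack, axis=0)
    sigma = np.nanstd(lst_stack, axis=0)
    return (lst_current - mu) / np.where(sigma > 0, sigma, np.nan)

reference = 290 + np.random.normal(0, 1.5, size=(6, 50, 50))    # six years of LST (K)
current = reference.mean(axis=0).copy()
current[20:25, 20:25] += 6.0                                    # synthetic thermal anomaly
index = rst_index(reference, current)
print((index > 3).sum(), "anomalous pixels")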
Practical method to identify orbital anomaly as spacecraft breakup in the geostationary region
NASA Astrophysics Data System (ADS)
Hanada, Toshiya; Uetsuhara, Masahiko; Nakaniwa, Yoshitaka
2012-07-01
Identifying a spacecraft breakup is an essential issue to define the current orbital debris environment. This paper proposes a practical method to identify an orbital anomaly, which appears as a significant discontinuity in the observation data, as a spacecraft breakup. The proposed method is applicable to orbital anomalies in the geostationary region. Long-term orbital evolutions of breakup fragments may conclude that their orbital planes will converge into several corresponding regions in inertial space even if the breakup epoch is not specified. This empirical method combines the aforementioned conclusion with the search strategy developed at Kyushu University, which can identify origins of observed objects as fragments released from a specified spacecraft. This practical method starts with selecting a spacecraft that experienced an orbital anomaly, and formulates a hypothesis to generate fragments from the anomaly. Then, the search strategy is applied to predict the behavior of groups of fragments hypothetically generated. Outcome of this predictive analysis specifies effectively when, where and how we should conduct optical measurements using ground-based telescopes. Objects detected based on the outcome are supposed to be from the anomaly, so that we can confirm the anomaly as a spacecraft breakup to release the detected objects. This paper also demonstrates observation planning for a spacecraft anomaly in the geostationary region.
Systems Modeling to Implement Integrated System Health Management Capability
NASA Technical Reports Server (NTRS)
Figueroa, Jorge F.; Walker, Mark; Morris, Jonathan; Smith, Harvey; Schmalzel, John
2007-01-01
ISHM capability includes: detection of anomalies, diagnosis of causes of anomalies, prediction of future anomalies, and user interfaces that enable integrated awareness (past, present, and future) by users. This is achieved by focused management of data, information and knowledge (DIaK) that will likely be distributed across networks. Management of DIaK implies storage, sharing (timely availability), maintaining, evolving, and processing. Processing of DIaK encapsulates strategies, methodologies, algorithms, etc. focused on achieving high ISHM Functional Capability Level (FCL). High FCL means a high degree of success in detecting anomalies, diagnosing causes, predicting future anomalies, and enabling health integrated awareness by the user. A model that enables ISHM capability, and hence, DIaK management, is denominated the ISHM Model of the System (IMS). We describe aspects of the IMS that focus on processing of DIaK. Strategies, methodologies, and algorithms require proper context. We describe an approach to define and use contexts, implementation in an object-oriented software environment (G2), and validation using actual test data from a methane thruster test program at NASA SSC. Context is linked to existence of relationships among elements of a system. For example, the context to use a strategy to detect leak is to identify closed subsystems (e.g. bounded by closed valves and by tanks) that include pressure sensors, and check if the pressure is changing. We call these subsystems Pressurizable Subsystems. If pressure changes are detected, then all members of the closed subsystem become suspect of leakage. In this case, the context is defined by identifying a subsystem that is suitable for applying a strategy. Contexts are defined in many ways. Often, a context is defined by relationships of function (e.g. liquid flow, maintaining pressure, etc.), form (e.g. part of the same component, connected to other components, etc.), or space (e.g. physically close, touching the same common element, etc.). The context might be defined dynamically (if conditions for the context appear and disappear dynamically) or statically. Although this approach is akin to case-based reasoning, we are implementing it using a software environment that embodies tools to define and manage relationships (of any nature) among objects in a very intuitive manner. Context for higher level inferences (that use detected anomalies or events), primarily for diagnosis and prognosis, are related to causal relationships. This is useful to develop root-cause analysis trees showing an event linked to its possible causes and effects. The innovation pertaining to RCA trees encompasses use of previously defined subsystems as well as individual elements in the tree. This approach allows more powerful implementations of RCA capability in object-oriented environments. For example, if a pressurizable subsystem is leaking, its root-cause representation within an RCA tree will show that the cause is that all elements of that subsystem are suspect of leak. Such a tree would apply to all instances of leak-events detected and all elements in all pressurizable subsystems in the system. Example subsystems in our environment to build IMS include: Pressurizable Subsystem, Fluid-Fill Subsystem, Flow-Thru-Valve Subsystem, and Fluid Supply Subsystem. The software environment for IMS is designed to potentially allow definition of any relationship suitable to create a context to achieve ISHM capability.
Min-max hyperellipsoidal clustering for anomaly detection in network security.
Sarasamma, Suseela T; Zhu, Qiuming A
2006-08-01
A novel hyperellipsoidal clustering technique is presented for an intrusion-detection system in network security. Hyperellipsoidal clusters toward maximum intracluster similarity and minimum intercluster similarity are generated from training data sets. The novelty of the technique lies in the fact that the parameters needed to construct higher order data models in general multivariate Gaussian functions are incrementally derived from the data sets using accretive processes. The technique is implemented in a feedforward neural network that uses a Gaussian radial basis function as the model generator. An evaluation based on the inclusiveness and exclusiveness of samples with respect to specific criteria is applied to accretively learn the output clusters of the neural network. One significant advantage of this is its ability to detect individual anomaly types that are hard to detect with other anomaly-detection schemes. Applying this technique, several feature subsets of the tcptrace network-connection records that give above 95% detection at false-positive rates below 5% were identified.
Detecting Pulsing Denial-of-Service Attacks with Nondeterministic Attack Intervals
NASA Astrophysics Data System (ADS)
Luo, Xiapu; Chan, Edmond W. W.; Chang, Rocky K. C.
2009-12-01
This paper addresses the important problem of detecting pulsing denial of service (PDoS) attacks which send a sequence of attack pulses to reduce TCP throughput. Unlike previous works which focused on a restricted form of attacks, we consider a very broad class of attacks. In particular, our attack model admits any attack interval between two adjacent pulses, whether deterministic or not. It also includes the traditional flooding-based attacks as a limiting case (i.e., zero attack interval). Our main contribution is Vanguard, a new anomaly-based detection scheme for this class of PDoS attacks. The Vanguard detection is based on three traffic anomalies induced by the attacks, and it detects them using a CUSUM algorithm. We have prototyped Vanguard and evaluated it on a testbed. The experiment results show that Vanguard is more effective than the previous methods that are based on other traffic anomalies (after a transformation using wavelet transform, Fourier transform, and autocorrelation) and detection algorithms (e.g., dynamic time warping).
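A one-sided CUSUM change detector of the kind such schemes build on is sketched below. The drift and threshold values and the synthetic traffic feature are assumptions, not the Vanguard settings.

# Hedged one-sided CUSUM sketch for detecting a shift in a traffic feature.
import numpy as np

def cusum(x, target, drift=0.5, threshold=8.0):
    """Return the first index at which the upper CUSUM statistic exceeds
    threshold, or -1 if no change is detected."""
    s = 0.0
    for i, xi in enumerate(x):
        s = max(0.0, s + (xi - target - drift))
        if s > threshold:
            return i
    return -1

# Synthetic feature: nominal mean 10, shifted under attack starting at t=200.
x = np.concatenate([np.random.normal(10, 1, 200), np.random.normal(13, 1, 100)])
print(cusum(x, target=10.0))   # detection index shortly after 200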
NASA Astrophysics Data System (ADS)
Hood, Lon L.
2011-02-01
A re-examination of all available low-altitude LP magnetometer data confirms that magnetic anomalies are present in at least four Nectarian-aged lunar basins: Moscoviense, Mendel-Rydberg, Humboldtianum, and Crisium. In three of the four cases, a single main anomaly is present near the basin center while, in the case of Crisium, anomalies are distributed in a semi-circular arc about the basin center. These distributions, together with a lack of other anomalies near the basins, indicate that the sources of the anomalies are genetically associated with the respective basin-forming events. These central basin anomalies are difficult to attribute to shock remanent magnetization of a shocked central uplift and most probably imply thermoremanent magnetization of impact melt rocks in a steady magnetizing field. Iterative forward modeling of the single strongest and most isolated anomaly, the northern Crisium anomaly, yields a paleomagnetic pole position at 81° ± 19°N, 143° ± 31°E, not far from the present rotational pole. Assuming no significant true polar wander since the Crisium impact, this position is consistent with that expected for a core dynamo magnetizing field. Further iterative forward modeling demonstrates that the remaining Crisium anomalies can be approximately simulated assuming a multiple source model with a single magnetization direction equal to that inferred for the northernmost anomaly. This result is most consistent with a steady, large-scale magnetizing field. The inferred mean magnetization intensity within the strongest basin sources is ˜1 A/m assuming a 1-km thickness for the source layer. Future low-altitude orbital and surface magnetometer measurements will more strongly constrain the depth and/or thicknesses of the sources.
Detection of emerging sunspot regions in the solar interior.
Ilonidis, Stathis; Zhao, Junwei; Kosovichev, Alexander
2011-08-19
Sunspots are regions where strong magnetic fields emerge from the solar interior and where major eruptive events occur. These energetic events can cause power outages, interrupt telecommunication and navigation services, and pose hazards to astronauts. We detected subsurface signatures of emerging sunspot regions before they appeared on the solar disc. Strong acoustic travel-time anomalies on the order of 12 to 16 seconds were detected as deep as 65,000 kilometers. These anomalies were associated with magnetic structures that emerged with an average speed of 0.3 to 0.6 kilometer per second and caused high peaks in the photospheric magnetic flux rate 1 to 2 days after the detection of the anomalies. Thus, synoptic imaging of subsurface magnetic activity may allow anticipation of large sunspot regions before they become visible, improving space weather forecasts.
Johnson, Carole D.; Dawson, C.B.; Belaval, Marcel; Lane, John W.
2002-01-01
A surface-geophysical investigation to characterize the hydrogeology and contaminant distribution of the former landfill area at the University of Connecticut in Storrs, Connecticut, was conducted in 2000 to supplement the preliminary hydrogeologic assessment of the contamination of soil, surface water, and ground water at the site. A geophysical-toolbox approach was used to characterize the hydrogeology and contaminant distribution of the former landfill. Two-dimensional direct-current resistivity, inductive terrain-conductivity, and seismic-refraction surface-geophysical data were collected and interpreted in an iterative manner with exploratory drilling, borehole geophysics, and hydraulic testing. In this investigation, a geophysical-toolbox approach was used to 1) further define previously identified conductive anomalies and leachate plumes; 2) identify additional leachate plumes, possible fracture zones, and (or) conductive lithologic layers in the bedrock; and 3) delineate bedrock-surface topography in the drainage valleys north and south of the landfill. Resistivity and terrain-conductivity surveys were used to further delineate previously identified geophysical anomalies to the north and southwest of the landfill. A conductive anomaly identified in the terrain-conductivity survey to the north of the landfill in 2000 had a similar location and magnitude as an anomaly identified in terrain-conductivity surveys conducted in 1998 and 1999. Collectively, these surveys indicated that the magnitude of the conductive anomaly decreased with depth and with distance from the landfill. These anomalies indicated landfill leachate in the overburden and shallow bedrock. Results of previous surface-geophysical investigations southwest of the landfill indicated a shallow conductive anomaly in the overburden that extended into the fractured-bedrock aquifer. This conductive anomaly had a sheet-like geometry that had a north-south strike, dipped to the west, and terminated abruptly about 450 feet southwest of the landfill. The sheet-like conductive anomaly was interpreted as a fractured, conductive lithologic feature filled with conductive fluids. To further delineate this anomaly, two two-dimensional resistivity profiles were collected west of the sheet-like conductive anomaly to assess the possibility that the sheet-like conductive anomaly continued to the west in its down-dip direction. Each of the north-south oriented resistivity profiles showed bullet-shaped rather than linear-shaped anomalies, with a relatively smaller magnitude of conductivity than the sheet-like conductive anomaly to the east. If these bullet-like features are spatially connected, they may represent a linear, or pipe-like, conductive anomaly in the bedrock with a trend of N290°E and a plunge of 12°. Additional surveys were conducted to assess the apparent southern termination of the sheet-like conductive feature. Terrain-conductivity surveys indicated the sheet-like feature was not continuous to the south. A two-dimensional resistivity line and a coincident terrain-conductivity profile indicated the presence of a steep, eastward dipping, low magnitude, electrically conductive anomaly on the eastern end of the profile. Although the sheet-like conductive anomaly apparently did not continue to the south, the survey conducted in 2000 identified an isolated, weak conductive anomaly south of the previously identified anomaly.
Inductive terrain-conductivity surveys performed north of the sheet-like conductive anomaly and west of the landfill indicated the anomaly did not extend to the north into the area of the former chemical-waste disposal pits. No conductive plumes or conductive features were observed in the subsurface bedrock west of the landfill. A conductive anomaly was identified in the southern section of the new terrain-conductivity grid. The magnitude and distribution of the apparent conductivity of this anomaly was identified as a nearly vertica
NASA Astrophysics Data System (ADS)
Pattisahusiwa, Asis; Houw Liong, The; Purqon, Acep
2016-08-01
In this study, we compare two learning mechanisms, outlier detection and novelty detection, for detecting the ionospheric TEC disturbances caused by the November 2004 geomagnetic storm and the January 2005 substorm. The mechanisms are applied using the v-SVR learning algorithm, a regression version of SVM. Our results show that both mechanisms are quite accurate in learning the TEC data. However, novelty detection is more accurate than outlier detection in extracting anomalies related to geomagnetic events. The anomalies detected by outlier detection are mostly related to trends in the data, while those detected by novelty detection are associated with geomagnetic events. Novelty detection also shows evidence of LSTID during geomagnetic events.
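The novelty-detection mechanism can be illustrated with a scikit-learn stand-in, training only on quiet-time windows and scoring the disturbed series; the study itself uses a v-SVR formulation, and the nu, gamma, window construction, and synthetic TEC-like series below are assumptions.

# Hedged sketch of novelty detection on a TEC-like series (scikit-learn stand-in).
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)
quiet = 20 + 5 * np.sin(np.linspace(0, 20 * np.pi, 2000)) + rng.normal(0, 0.5, 2000)
storm = quiet.copy()
storm[1500:1520] += 15                      # synthetic storm-time TEC enhancement

def windows(x, w=24):
    return np.lib.stride_tricks.sliding_window_view(x, w)

# Novelty detection: train only on quiet-time windows, then score the storm series.
model = OneClassSVM(nu=0.01, gamma="scale").fit(windows(quiet))
novel = model.predict(windows(storm)) == -1
print(novel.sum(), "novel windows flagged")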
Application of isostatic gravity anomaly in the Yellow Sea area
NASA Astrophysics Data System (ADS)
Hao, Z.; Qin, J.; Huang, W.; Wu, X.
2017-12-01
In order to study the deep crustal structure of the Yellow Sea area, we used the Airy-Heiskanen model to calculate the isostatic gravity anomaly of this area. Based on the Bouguer gravity anomaly and water depth data of this area, we chose as calculation parameters a standard crustal thickness of 30 km, a crust-mantle density difference of 0.6 g/cm³, and a grid spacing of 0.1° × 0.1°. This study reveals that there are six faults and four isostatic negative anomalies in the study area. The isostatic anomalies over much of the Yellow Sea are predominantly positive. The isostatic anomalies in the North Yellow Sea are higher than in the South Yellow Sea, with the Jiashan-Xiangshui fault as the boundary. In the north of the study area, isostatic anomalies are characterized by large areas of positive anomaly; the change is relatively slow, and the trends are mainly NE or NEE. In the middle of the North Yellow Sea basin, there is a local negative anomaly, arranged discontinuously as a string of beads trending NE. The extent of the negative anomaly is small and basically corresponds to the position of the region's former Cenozoic sedimentary basin. To the south of the Jiashan-Xiangshui fault and west of the Yellow Sea eastern margin fault, including most of the South Yellow Sea and Jiangsu province, the isostatic anomalies are lower, positive and negative anomalies alternate, and negative anomaly traps are extensively developed; the trends are mainly NE and NEE, with some NW. On the basis of the characteristics of the isostatic gravity anomalies, it is concluded that the Yellow Sea belongs to a continental-crust isostatic area whose isostatic anomalies are smooth and gentle. References: Heiskanen, W. A., F. A. V. Meinesz, and S. A. Korff (1958), The Earth and Its Gravity Field, McGraw-Hill, New York. Meng, X. J., X. H. Zhang, and J. Y. Yang (2014), Geophysical survey in eastern China seas and the characteristics of gravity and magnetic fields, Marine Geology & Quaternary Geology, 34(6), 127-134.
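The Airy-Heiskanen root computation that underlies such an isostatic correction is sketched below. The 30 km reference thickness and 0.6 g/cm³ density contrast follow the parameters quoted in the abstract; the crustal and water densities are common textbook assumptions, not values from this study.

# Hedged sketch of Airy-Heiskanen roots and anti-roots.
import numpy as np

RHO_CRUST, RHO_WATER, DELTA_RHO, T0 = 2670.0, 1030.0, 600.0, 30e3  # kg/m^3, m

def airy_root(elevation_m):
    """Root (positive, added below the 30 km reference) for land topography,
    anti-root (negative) for oceanic water depth given as negative elevation."""
    h = np.asarray(elevation_m, dtype=float)
    return np.where(h >= 0,
                    h * RHO_CRUST / DELTA_RHO,                  # topography pushes a root down
                    h * (RHO_CRUST - RHO_WATER) / DELTA_RHO)    # the water column thins the crust

print(airy_root([1000.0, -80.0]))   # ~4450 m root; ~-219 m anti-root for 80 m of water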
2013-01-01
The extra-cranial venous system is complex and not well studied in comparison to the peripheral venous system. A newly proposed vascular condition, named chronic cerebrospinal venous insufficiency (CCSVI), described initially in patients with multiple sclerosis (MS) has triggered intense interest in better understanding of the role of extra-cranial venous anomalies and developmental variants. So far, there is no established diagnostic imaging modality, non-invasive or invasive, that can serve as the “gold standard” for detection of these venous anomalies. However, consensus guidelines and standardized imaging protocols are emerging. Most likely, a multimodal imaging approach will ultimately be the most comprehensive means for screening, diagnostic and monitoring purposes. Further research is needed to determine the spectrum of extra-cranial venous pathology and to compare the imaging findings with pathological examinations. The ability to define and reliably detect noninvasively these anomalies is an essential step toward establishing their incidence and prevalence. The role for these anomalies in causing significant hemodynamic consequences for the intra-cranial venous drainage in MS patients and other neurologic disorders, and in aging, remains unproven. PMID:23806142
Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets.
Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin
2017-03-09
Road traffic anomaly denotes a road segment that is anomalous in terms of the traffic flow of vehicles. Detecting road traffic anomalies from GPS (Global Positioning System) snippet data is becoming critical in urban computing since they often suggest underlying events. However, the noisy and sparse nature of GPS snippet data gives rise to multiple problems, which make the detection of road traffic anomalies very challenging. To address these issues, we propose a two-stage solution which consists of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large GPS snippets via a tensor decomposition technique. The RAT model then calculates the anomalous degree for each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one month of GPS location data from more than eight thousand taxi cabs in Beijing. The evaluation results show the advantages of our method beyond other baseline techniques.
GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D
This paper introduces a novel graph-analytic approach for detecting anomalies in network flow data called GraphPrints. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets, small induced subgraphs that describe local topology. By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified, and furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level, and 0.05% at the IP level, with 100% true positive rates at both.
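The sketch below illustrates the general idea using two very small graphlets (triangles and open wedges) as stand-ins for a full graphlet census; the time slicing, feature choice, and z-score rule are assumptions, not the GraphPrints method itself.

# Hedged sketch: per-slice graphlet-style counts, then simple outlier flagging.
import numpy as np
import networkx as nx

def graphlet_features(edges):
    """Count triangles and open wedges (2-paths) in one traffic slice."""
    g = nx.Graph()
    g.add_edges_from((a, b) for a, b in edges if a != b)   # drop self-loops
    triangles = sum(nx.triangles(g).values()) // 3
    wedges = sum(d * (d - 1) // 2 for _, d in g.degree()) - 3 * triangles
    return np.array([triangles, wedges], dtype=float)

def flag_anomalous_slices(slices, z=3.0):
    feats = np.array([graphlet_features(e) for e in slices])
    mu, sigma = feats.mean(axis=0), feats.std(axis=0) + 1e-9
    return np.any(np.abs(feats - mu) / sigma > z, axis=1)

# Toy usage: sparse random slices plus one densely connected slice, which is flagged.
rng = np.random.default_rng(3)
normal = [list(map(tuple, rng.integers(0, 30, (40, 2)))) for _ in range(20)]
dense = [(i, j) for i in range(12) for j in range(i + 1, 12)]
print(flag_anomalous_slices(normal + [dense]))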
Capacitance probe for detection of anomalies in non-metallic plastic pipe
Mathur, Mahendra P.; Spenik, James L.; Condon, Christopher M.; Anderson, Rodney; Driscoll, Daniel J.; Fincham, Jr., William L.; Monazam, Esmail R.
2010-11-23
The disclosure relates to analysis of materials using a capacitive sensor to detect anomalies through comparison of measured capacitances. The capacitive sensor is used in conjunction with a capacitance measurement device, a location device, and a processor in order to generate a capacitance versus location output which may be inspected for the detection and localization of anomalies within the material under test. The components may be carried as payload on an inspection vehicle which may traverse through a pipe interior, allowing evaluation of nonmetallic or plastic pipes when the piping exterior is not accessible. In an embodiment, supporting components are solid-state devices powered by a low voltage on-board power supply, providing for use in environments where voltage levels may be restricted.
Artificial intelligence techniques for ground test monitoring of rocket engines
NASA Technical Reports Server (NTRS)
Ali, Moonis; Gupta, U. K.
1990-01-01
An expert system is being developed which can detect anomalies in Space Shuttle Main Engine (SSME) sensor data significantly earlier than the redline algorithm currently in use. The training of such an expert system focuses on two approaches which are based on low frequency and high frequency analyses of sensor data. Both approaches are being tested on data from SSME tests and their results compared with the findings of NASA and Rocketdyne experts. Prototype implementations have detected the presence of anomalies earlier than the redline algorithms that are in use currently. It therefore appears that these approaches have the potential of detecting anomalies early eneough to shut down the engine or take other corrective action before severe damage to the engine occurs.
NASA Astrophysics Data System (ADS)
Honsho, Chie; Yamazaki, Toshitsugu; Ura, Tamaki; Okino, Kyoko; Morozumi, Haruhisa; Ueda, Satoshi
2016-11-01
We report here results from a deep-sea magnetic survey using an autonomous underwater vehicle over the Hakurei hydrothermal site, in the middle Okinawa Trough. Magnetic inversion revealed that the Hakurei site is associated with well-defined high-magnetization zones distributed within a broad low-magnetization zone. Results from rock magnetic measurements, performed on sulfide ore samples obtained by drilling, showed that some samples possessed extremely high natural remanent magnetization (NRM) (as much as 6.8-953.0 A/m), although most of the measured samples had much lower NRM. These high-NRM samples were characterized by high Königsberger ratios (10^1-10^3), indicating much larger NRM than induced magnetization, and contained pyrrhotite as the only magnetic mineral. This suggests that NRM carried by pyrrhotite is the source of the observed magnetic anomalies. The wide range of NRM intensity was considered to be due to a highly heterogeneous distribution of pyrrhotite, because pyrrhotite was commonly identified in both the high-NRM and low-NRM samples. Pyrrhotite production may have been occasionally drastically increased, with highly magnetic ores formed as a result. Rapid burial of active vents may result in the creation of an extensive reducing environment under the seafloor, which is favorable to pyrrhotite production, and may also prevent oxidation of pyrrhotite by isolating it from seawater. Because the magnetization intensity of sulfide ores was highly variable, it would not be straightforward to estimate the quantity of ore deposits from the magnetic anomalies. Nevertheless, this study demonstrates the usefulness of magnetic surveys in detecting hydrothermal deposits.
NASA Astrophysics Data System (ADS)
Appiah, Isaac; Wemegah, David Dotse; Asare, Van-Dycke Sarpong; Danuor, Sylvester K.; Forson, Eric Dominic
2018-06-01
Non-invasive geophysical investigation using magnetic gradiometry, magnetic susceptibility survey and electrical resistivity tomography (ERT) was carried out on the Sunyani Municipal Assembly (SMA) solid waste disposal (SWD) site. The study was aimed at delineating the physical boundaries and the area extent of the waste deposit, mapping the distribution of the waste at the site, detecting and delineating zones of leachate contamination and its preferential migration pathways beneath the waste deposit and its surroundings. The results of both magnetic susceptibility and gradiometric methods displayed in anomaly maps clearly delineated the physical boundaries of the waste deposit with an approximate area extent of 82,650 m2 that are characterised by high magnetic susceptibilities between 426 × 10-5 SI and 9890 × 10-5 SI. They also revealed high magnetic anomalies erratically distributed within the waste deposit attributable to its heterogeneous and uncontrolled nature. The high magnetic anomalies outside the designated waste boundaries were also attributed to indiscriminate deposition of the waste. Similarly, the ERT sections delineated and characterised zones of leachate contamination beneath the waste body and its close surroundings as well as pathways for leachate migration with low resistivity signatures up to 43.9 Ωm. In spite of the successes reported herein using the ERT, this research also revealed that the ERT is less effective in estimating the thickness of the waste deposit in unlined SWD sites due to leachate infiltration into the ground beneath it that masks the resistivities of the top level ground and makes it indistinguishable from the waste body.
Model-Based Anomaly Detection for a Transparent Optical Transmission System
NASA Astrophysics Data System (ADS)
Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.
In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.
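The chapter's specific decision rules are not reproduced in the abstract; the following is only a minimal sketch of the general pattern it describes, comparing a physics-model prediction with a measurement and alarming when the residual exceeds an uncertainty-scaled threshold. All signal names and numbers are illustrative assumptions:

```python
import numpy as np

def alarm_decision(measured_gain_db, predicted_gain_db, sigma_model_db, sigma_meas_db, k=3.0):
    """Flag an anomaly when the model-measurement residual exceeds k combined standard deviations."""
    residual = measured_gain_db - predicted_gain_db
    sigma = np.hypot(sigma_model_db, sigma_meas_db)  # combined model + measurement uncertainty
    return abs(residual) > k * sigma, residual

# Illustrative amplified span: gain predicted by a physics model vs. a noisy measurement.
flag, r = alarm_decision(measured_gain_db=9.1, predicted_gain_db=10.0,
                         sigma_model_db=0.15, sigma_meas_db=0.2)
print(f"residual = {r:.2f} dB, alarm = {flag}")
```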
Creating a Team Archive During Fast-Paced Anomaly Response Activities in Space Missions
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Hicks, LaDessa; Overland, David; Thronesbery, Carroll; Christofferesen, Klaus; Chow, Renee
2002-01-01
This paper describes a Web-based system to support the temporary Anomaly Response Team formed from distributed subteams in Space Shuttle and International Space Station missions. The system was designed for easy and flexible creation of small collections of files and links associated with work on a particular anomaly. The system supports privacy and levels of formality for the subteams. First we describe the supported groups and an anomaly response scenario. Then we describe the support system prototype, the Anomaly Response Tracking and Integration System (ARTIS). Finally, we describe our evaluation approach and the results of the evaluation.
Spherical earth gravity and magnetic anomaly analysis by equivalent point source inversion
NASA Technical Reports Server (NTRS)
Von Frese, R. R. B.; Hinze, W. J.; Braile, L. W.
1981-01-01
To facilitate geologic interpretation of satellite elevation potential field data, analysis techniques are developed and verified in the spherical domain that are commensurate with conventional flat earth methods of potential field interpretation. A powerful approach to the spherical earth problem relates potential field anomalies to a distribution of equivalent point sources by least squares matrix inversion. Linear transformations of the equivalent source field lead to corresponding geoidal anomalies, pseudo-anomalies, vector anomaly components, spatial derivatives, continuations, and differential magnetic pole reductions. A number of examples using 1 deg-averaged surface free-air gravity anomalies and POGO satellite magnetometer data for the United States, Mexico, and Central America illustrate the capabilities of the method.
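A hedged, flat-geometry sketch of the equivalent-source idea follows (fit point-source magnitudes to observed anomalies by least squares, after which linear transformations of the fitted sources can predict related fields); the spherical-domain formulation and 1 deg averaging of the study are omitted and all coordinates are synthetic:

```python
import numpy as np

def greens_matrix(obs_xyz, src_xyz):
    """Point-source kernel ~1/r^2; physical constants are absorbed into the source magnitudes."""
    d = obs_xyz[:, None, :] - src_xyz[None, :, :]
    r = np.linalg.norm(d, axis=2)
    return 1.0 / r**2

rng = np.random.default_rng(0)
obs = np.column_stack([rng.uniform(0, 100, 200), rng.uniform(0, 100, 200), np.zeros(200)])
src = np.column_stack([rng.uniform(0, 100, 25), rng.uniform(0, 100, 25), np.full(25, -10.0)])

true_m = rng.normal(size=25)
A = greens_matrix(obs, src)
data = A @ true_m + 1e-4 * rng.normal(size=200)   # synthetic anomaly observations

m_hat, *_ = np.linalg.lstsq(A, data, rcond=None)  # least-squares equivalent point sources
print("rms source recovery error:", float(np.sqrt(np.mean((m_hat - true_m) ** 2))))
```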
Characterization of normality of chaotic systems including prediction and detection of anomalies
NASA Astrophysics Data System (ADS)
Engler, Joseph John
Accurate prediction and control pervade domains such as engineering, physics, chemistry, and biology. Often, it is discovered that the systems under consideration cannot be well represented as linear, periodic, or random data. It has been shown that these systems exhibit deterministic chaos behavior. Deterministic chaos describes systems which are governed by deterministic rules but whose data appear to be random or quasi-periodic distributions. Deterministically chaotic systems characteristically exhibit sensitive dependence upon initial conditions manifested through rapid divergence of states initially close to one another. Due to this characterization, it has been deemed impossible to accurately predict future states of these systems for longer time scales. Fortunately, the deterministic nature of these systems allows for accurate short term predictions, given the dynamics of the system are well understood. This fact has been exploited in the research community and has resulted in various algorithms for short term predictions. Detection of normality in deterministically chaotic systems is critical in understanding the system sufficiently to be able to predict future states. Due to the sensitivity to initial conditions, the detection of normal operational states for a deterministically chaotic system can be challenging. The addition of small perturbations to the system, which may result in bifurcation of the normal states, further complicates the problem. The detection of anomalies and prediction of future states of the chaotic system allows for greater understanding of these systems. The goal of this research is to produce methodologies for determining states of normality for deterministically chaotic systems, detection of anomalous behavior, and the more accurate prediction of future states of the system. Additionally, the ability to detect subtle system state changes is discussed. The dissertation addresses these goals by proposing new representational techniques and novel prediction methodologies. The value and efficiency of these methods are explored in various case studies. Presented is an overview of chaotic systems with examples taken from the real world. A representation schema for rapid understanding of the various states of deterministically chaotic systems is presented. This schema is then used to detect anomalies and system state changes. Additionally, a novel prediction methodology which utilizes Lyapunov exponents to facilitate longer term prediction accuracy is presented and compared with other nonlinear prediction methodologies. These novel methodologies are then demonstrated on applications such as wind energy, cyber security and classification of social networks.
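As a hedged illustration of the kind of quantity such work relies on (not the dissertation's own algorithm), the largest Lyapunov exponent of a scalar series can be estimated from the average divergence rate of initially close states, here on the logistic map:

```python
import numpy as np

def logistic_series(n, r=3.9, x0=0.4):
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def largest_lyapunov(x, horizon=10, eps=1e-3):
    """Average log-divergence over 'horizon' steps of pairs of states that start within eps."""
    n = len(x) - horizon
    rates = []
    for i in range(n):
        # nearest neighbour in value, excluding temporally adjacent points
        penalty = 1e9 * (np.abs(np.arange(n) - i) < 5)
        j = int(np.argmin(np.abs(x[:n] - x[i]) + penalty))
        d0, d1 = abs(x[i] - x[j]), abs(x[i + horizon] - x[j + horizon])
        if 0.0 < d0 < eps and d1 > 0.0:
            rates.append(np.log(d1 / d0) / horizon)
    return float(np.mean(rates))

x = logistic_series(5000)
print("estimated largest Lyapunov exponent:", round(largest_lyapunov(x), 2))
# A clearly positive estimate (the known value for r = 3.9 is roughly 0.5) signals deterministic chaos.
```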
Cybersecurity Intrusion Detection and Monitoring for Field Area Network: Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pietrowicz, Stanley
This report summarizes the key technical accomplishments, industry impact and performance of the I2-CEDS grant entitled “Cybersecurity Intrusion Detection and Monitoring for Field Area Network”. Led by Applied Communication Sciences (ACS/Vencore Labs) in conjunction with its utility partner Sacramento Municipal Utility District (SMUD), the project accelerated research on a first-of-its-kind cybersecurity monitoring solution for Advanced Meter Infrastructure and Distribution Automation field networks. It advanced the technology to a validated, full-scale solution that detects anomalies, intrusion events and improves utility situational awareness and visibility. The solution was successfully transitioned and commercialized for production use as SecureSmart™ Continuous Monitoring. Discoveries made with SecureSmart™ Continuous Monitoring led to tangible and demonstrable improvements in the security posture of the US national electric infrastructure.
Semi-Supervised Novelty Detection with Adaptive Eigenbases, and Application to Radio Transients
NASA Technical Reports Server (NTRS)
Thompson, David R.; Majid, Walid A.; Reed, Colorado J.; Wagstaff, Kiri L.
2011-01-01
We present a semi-supervised online method for novelty detection and evaluate its performance for radio astronomy time series data. Our approach uses adaptive eigenbases to combine 1) prior knowledge about uninteresting signals with 2) online estimation of the current data properties to enable highly sensitive and precise detection of novel signals. We apply the method to the problem of detecting fast transient radio anomalies and compare it to current alternative algorithms. Tests based on observations from the Parkes Multibeam Survey show both effective detection of interesting rare events and robustness to known false alarm anomalies.
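A minimal sketch of the underlying eigenbasis idea (score each sample by the energy left after projection onto a basis learned from known, uninteresting signals); the online basis adaptation and radio-specific processing of the paper are not reproduced, and the signals below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# "Uninteresting" training signals: noisy sinusoids standing in for known background.
t = np.linspace(0, 1, 128)
train = np.array([np.sin(2 * np.pi * f * t) + 0.1 * rng.normal(size=t.size)
                  for f in rng.uniform(3, 6, 200)])

mean = train.mean(axis=0)
U, S, Vt = np.linalg.svd(train - mean, full_matrices=False)
basis = Vt[:10]                       # top-10 eigenbasis of the background

def novelty_score(x):
    """Energy of the sample not explained by the background eigenbasis."""
    r = (x - mean) - basis.T @ (basis @ (x - mean))
    return float(np.dot(r, r))

background = np.sin(2 * np.pi * 4.2 * t) + 0.1 * rng.normal(size=t.size)
transient = background + np.exp(-((t - 0.5) ** 2) / 1e-4)   # injected narrow pulse
print("background score:", round(novelty_score(background), 2))
print("transient score :", round(novelty_score(transient), 2))
```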
Bronshtein, Moshe; Solt, Ido; Blumenfeld, Zeev
2014-06-01
Despite more than three decades of universal popularity of fetal sonography as an integral part of pregnancy evaluation, there is still no unequivocal agreement regarding the optimal dating of fetal sonographic screening and the type of ultrasound (transvaginal vs abdominal). Transvaginal systematic sonography at 14-17 weeks for fetal organ screening. The evaluation of over 72,000 early (14-17 weeks) and late (18-24 weeks) fetal ultrasonographic systematic organ screenings revealed that 96% of the malformations are detectable in the early screening with an incidence of 1:50 gestations. Only 4% of the fetal anomalies are diagnosed later in pregnancy. Over 99% of the fetal cardiac anomalies are detectable in the early screening and most of them appear in low risk gestations. Therefore, we suggest a new platform of fetal sonographic evaluation and follow-up: The extensive systematic fetal organ screening should be performed by an expert sonographer who has been trained in the detection of fetal malformations, at 14-17 weeks gestation. This examination should also include fetal cardiac echography. Three additional ultrasound examinations are suggested during pregnancy: the first, performed by the patient's obstetrician at 6-7 weeks for the exclusion of ectopic pregnancy, confirmation of fetal viability, dating, assessment of chorionicity in multiple gestations, and visualization of maternal adnexae. The other two, at 22-26 and 32-34 weeks, require less training and should be performed by an obstetrician who has been qualified in the sonographic detection of fetal anomalies. The advantages of early midtrimester targeted fetal systematic organ screening for the detection of fetal anomalies may dictate a global change.
NASA Astrophysics Data System (ADS)
Park, Won-Kwang; Kim, Hwa Pyung; Lee, Kwang-Jae; Son, Seong-Ho
2017-11-01
Motivated by the biomedical engineering used in early-stage breast cancer detection, we investigated the use of MUltiple SIgnal Classification (MUSIC) algorithm for location searching of small anomalies using S-parameters. We considered the application of MUSIC to functional imaging where a small number of dipole antennas are used. Our approach is based on the application of Born approximation or physical factorization. We analyzed cases in which the anomaly is respectively small and large in relation to the wavelength, and the structure of the left-singular vectors is linked to the nonzero singular values of a Multi-Static Response (MSR) matrix whose elements are the S-parameters. Using simulations, we demonstrated the strengths and weaknesses of the MUSIC algorithm in detecting both small and extended anomalies.
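A minimal sketch of the MUSIC imaging step described above (singular value decomposition of a multi-static response matrix, then an imaging function that peaks where the test steering vector is orthogonal to the noise subspace); the array geometry, wavelength, and Born-type rank-one MSR model below are simplified assumptions, not the paper's antenna configuration:

```python
import numpy as np

rng = np.random.default_rng(2)
wavelength = 0.03
k = 2 * np.pi / wavelength

# Small circular array of antennas (assumed geometry, units of meters).
angles = np.linspace(0, 2 * np.pi, 12, endpoint=False)
antennas = 0.1 * np.column_stack([np.cos(angles), np.sin(angles)])

def steering(point):
    """Free-space Green's-function-like steering vector from each antenna to a point."""
    r = np.linalg.norm(antennas - point, axis=1)
    return np.exp(1j * k * r) / r

# Multi-static response matrix for one small anomaly (Born-type, rank-one) plus noise.
target = np.array([0.02, -0.01])
g = steering(target)
msr = np.outer(g, g) + 0.001 * (rng.normal(size=(12, 12)) + 1j * rng.normal(size=(12, 12)))

U, S, Vh = np.linalg.svd(msr)
noise = U[:, 1:]                      # noise subspace (one significant singular value)

def music_image(point):
    v = steering(point)
    v = v / np.linalg.norm(v)
    return 1.0 / np.linalg.norm(noise.conj().T @ v) ** 2

xs = np.linspace(-0.05, 0.05, 41)
image = np.array([[music_image(np.array([x, y])) for x in xs] for y in xs])
iy, ix = np.unravel_index(np.argmax(image), image.shape)
print("estimated anomaly location:", xs[ix], xs[iy])
```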
Integrity Verification for SCADA Devices Using Bloom Filters and Deep Packet Inspection
2014-03-27
prevent intrusions in smart grids [PK12]. Parthasarathy proposed an anomaly detection based IDS that takes into account system state. In his implementation...Security, 25(7):498–506, 10 2006. [LMV12] O. Linda, M. Manic, and T. Vollmer. Improving cyber-security of smart grid systems via anomaly detection and...6 2012. 114 [PK12] S. Parthasarathy and D. Kundur. Bloom filter based intrusion detection for smart grid SCADA. In Electrical & Computer Engineering
Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A
2018-02-01
Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
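A minimal sketch of the SPC side of such a comparison, a Shewhart-style u-chart with 3-sigma limits on monthly infection counts per device-day (the Twitter anomaly/breakout detectors are separate open-source packages and are not reproduced here); all counts below are invented:

```python
import numpy as np

counts = np.array([4, 6, 5, 3, 7, 5, 4, 6, 5, 14, 6, 5])        # monthly infection counts (made up)
device_days = np.array([900, 950, 920, 880, 940, 910, 905, 930, 915, 925, 935, 900])

rates = counts / device_days
u_bar = counts.sum() / device_days.sum()                          # pooled rate
ucl = u_bar + 3 * np.sqrt(u_bar / device_days)                    # 3-sigma upper control limit
lcl = np.maximum(0.0, u_bar - 3 * np.sqrt(u_bar / device_days))   # lower limit, floored at zero

for month, (r, hi, lo) in enumerate(zip(rates, ucl, lcl), start=1):
    if r > hi or r < lo:
        print(f"month {month}: rate {r:.4f} outside control limits [{lo:.4f}, {hi:.4f}]")
```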
Anomaly Detection for Next-Generation Space Launch Ground Operations
NASA Technical Reports Server (NTRS)
Spirkovska, Lilly; Iverson, David L.; Hall, David R.; Taylor, William M.; Patterson-Hine, Ann; Brown, Barbara; Ferrell, Bob A.; Waterman, Robert D.
2010-01-01
NASA is developing new capabilities that will enable future human exploration missions while reducing mission risk and cost. The Fault Detection, Isolation, and Recovery (FDIR) project aims to demonstrate the utility of integrated vehicle health management (IVHM) tools in the domain of ground support equipment (GSE) to be used for the next generation launch vehicles. In addition to demonstrating the utility of IVHM tools for GSE, FDIR aims to mature promising tools for use on future missions and document the level of effort - and hence cost - required to implement an application with each selected tool. One of the FDIR capabilities is anomaly detection, i.e., detecting off-nominal behavior. The tool we selected for this task uses a data-driven approach. Unlike rule-based and model-based systems that require manual extraction of system knowledge, data-driven systems take a radically different approach to reasoning. At the basic level, they start with data that represent nominal functioning of the system and automatically learn expected system behavior. The behavior is encoded in a knowledge base that represents "in-family" system operations. During real-time system monitoring or during post-flight analysis, incoming data is compared to that nominal system operating behavior knowledge base; a distance representing deviation from nominal is computed, providing a measure of how far "out of family" current behavior is. We describe the selected tool for FDIR anomaly detection - Inductive Monitoring System (IMS), how it fits into the FDIR architecture, the operations concept for the GSE anomaly monitoring, and some preliminary results of applying IMS to a Space Shuttle GSE anomaly.
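IMS itself is a NASA tool whose internals are not given in the abstract; the following is only a simplified sketch of the data-driven idea described, learning "in-family" envelopes from nominal data and scoring new vectors by their distance outside the nearest envelope, using synthetic sensor channels:

```python
import numpy as np

rng = np.random.default_rng(3)

# Nominal GSE-like training data: two correlated "sensor" channels (synthetic).
nominal = rng.normal([50.0, 10.0], [2.0, 0.5], size=(2000, 2))

# Build simple "in-family" envelopes as per-channel min/max boxes over random chunks of the data.
chunks = np.array_split(rng.permutation(nominal), 20)
boxes = [(c.min(axis=0), c.max(axis=0)) for c in chunks]

def out_of_family_distance(x):
    """Distance from x to the nearest nominal box; 0 means in-family."""
    dists = []
    for lo, hi in boxes:
        d = np.maximum(0.0, np.maximum(lo - x, x - hi))   # per-channel envelope violation
        dists.append(np.linalg.norm(d))
    return min(dists)

print("nominal sample :", round(out_of_family_distance(np.array([51.0, 10.2])), 3))
print("anomalous input:", round(out_of_family_distance(np.array([58.0, 13.0])), 3))
```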
Statistical Approach to Detection of Strombolian Activity in Satellite Data
NASA Astrophysics Data System (ADS)
Worden, A. K.; Dehn, J.; Ripepe, M.; Harris, A. J.
2010-12-01
Strombolian activity across the remote volcanoes of the Aleutian Islands and Kamchatka Peninsula cannot be monitored easily or safely by direct methods. Satellite remote sensing offers a useful means to routinely monitor these volcanoes. To model the expected time-dependent thermal signal recorded by the satellite sensor, we carried out laboratory-based experiments and collected field data for cooling spatter and bomb fields. Preliminary laboratory work focused on finding an acceptable lava analog, as well as scaled pressures and vent sizes. Honey emitted from 0.5-3.8 cm diameter vents by explosions with pressures of around 0.05 MPa seemed to work the best. Scaled explosions were recorded with a FLIR thermal camera and a digital video camera. Explosions at Stromboli Volcano in Italy were also recorded with the same thermal camera over a period of days in May and June, 2010, and were compared to the scaled explosions. In both the modeled and actual explosions, vent diameter directly dictates the type of explosion and deposit distribution, ranging from intense jetting from small vents to diffuse spattering from larger vents. The style of emission controls the area of, and distribution of bombs within, the resulting bomb field. This, in turn, influences the cooling rate of the bomb field. The cooling rate of spatter and bomb fields (most likely the source of thermal anomalies in satellite data) for both modeled and actual explosions compared well, and is on the order of seconds to minutes. For a single explosion of average size, the thermal signal is detectable by satellite for a period on the order of tens of seconds. Thus, in order to see a thermal signature related to a strombolian explosion, a satellite must pass over the volcano (with acceptable geometries) within about a minute of an explosion. A volcano with 70 explosions per day would produce roughly an hour of detectable thermal anomalies. With about a dozen possible NOAA and NASA satellite overpasses daily, dependent on weather and viewing geometry, an anomaly would be seen every couple of days and almost certainly once a week. By calibrating events observed by satellite with events recorded in infrasonic, seismic, and FLIR data, a tool can be developed to gauge increasing or decreasing strombolian activity at remote volcanoes.
Occurrence and Detectability of Thermal Anomalies on Europa
NASA Astrophysics Data System (ADS)
Hayne, Paul O.; Christensen, Philip R.; Spencer, John R.; Abramov, Oleg; Howett, Carly; Mellon, Michael; Nimmo, Francis; Piqueux, Sylvain; Rathbun, Julie A.
2017-10-01
Endogenic activity is likely on Europa, given its young surface age and ongoing tidal heating by Jupiter. Temperature is a fundamental signature of activity, as witnessed on Enceladus, where plumes emanate from vents with strongly elevated temperatures. Recent observations suggest the presence of similar water plumes at Europa. Even if plumes are uncommon, resurfacing may produce elevated surface temperatures, perhaps due to near-surface liquid water. Detecting endogenic activity on Europa is one of the primary mission objectives of NASA’s planned Europa Clipper flyby mission. Here, we use a probabilistic model to assess the likelihood of detectable thermal anomalies on the surface of Europa. The Europa Thermal Emission Imaging System (E-THEMIS) investigation is designed to characterize Europa’s thermal behavior and identify any thermal anomalies due to recent or ongoing activity. We define “detectability” on the basis of expected E-THEMIS measurements, which include multi-spectral infrared emission, both day and night. Thermal anomalies on Europa may take a variety of forms, depending on the resurfacing style, frequency, and duration of events: 1) subsurface melting due to hot spots, 2) shear heating on faults, and 3) eruptions of liquid water or warm ice on the surface. We use numerical and analytical models to estimate temperatures for these features. Once activity ceases, lifetimes of thermal anomalies are estimated to be 100 - 1000 yr. On average, Europa’s 10 - 100 Myr surface age implies a resurfacing rate of ~3 - 30 km²/yr. The typical size of resurfacing features determines their frequency of occurrence. For example, if ~100 km² chaos features dominate recent resurfacing, we expect one event every few years to decades. Smaller features, such as double-ridges, may be active much more frequently. We model each feature type as a statistically independent event, with probabilities weighted by their observed coverage of Europa’s surface. Our results show that if Europa is resurfaced continuously by the processes considered, there is a >99% chance that E-THEMIS will detect a thermal anomaly due to endogenic activity. Therefore, if no anomalies are detected, these models can be ruled out, or revised.
Tactile sensor of hardness recognition based on magnetic anomaly detection
NASA Astrophysics Data System (ADS)
Xue, Lingyun; Zhang, Dongfang; Chen, Qingguang; Rao, Huanle; Xu, Ping
2018-03-01
Hardness, as one kind of tactile sensing, plays an important role in the field of intelligent robot applications such as gripping, agricultural harvesting, prosthetic hands and so on. Recently, with the rapid development of high-performance magnetic field sensing technology, a number of magnetic sensors have been developed for intelligent applications. A tunnel magnetoresistance (TMR) element, based on the magnetoresistance principle, works as the sensitive element to detect the magnetic field, and it has proven its excellent ability for weak magnetic detection. In this paper, a new method based on magnetic anomaly detection is proposed to detect hardness in a tactile way. The sensor is composed of an elastic body, a ferrous probe, a TMR element and a permanent magnet. When the elastic body embedded with the ferrous probe touches an object under a certain force, the elastic body deforms. Correspondingly, the ferrous probe is displaced and the background magnetic field is distorted. The distorted magnetic field is detected by the TMR element and the output signal at different times can be sampled. The slope of the magnetic signal versus sampling time differs for objects of different hardness. The results indicate that the magnetic anomaly sensor can recognize hardness rapidly, within 150 ms of the tactile moment. The hardness sensor based on the magnetic anomaly detection principle proposed in this paper has the advantages of simple structure, low cost and rapid response, and it shows great application potential in the field of intelligent robots.
Acharya, Sujeet S; Gundeti, Mohan S; Zagaja, Gregory P; Shalhav, Arieh L; Zorn, Kevin C
2009-04-01
Although malformations of the genitourinary tract are typically identified during childhood, they can remain silent until incidental detection in evaluation and treatment of other pathologies during adulthood. The advent of the minimally invasive era in urologic surgery has given rise to unique challenges in the surgical management of anomalies of the genitourinary tract. This article reviews the embryology of anomalies of Wolffian duct (WD) derivatives with specific attention to the seminal vesicles, vas deferens, ureter, and kidneys. This is followed by a discussion of the history of the laparoscopic approach to WD derivative anomalies. Finally, we present two cases to describe technical considerations when managing these anomalies when encountered during robotic-assisted radical prostatectomy. The University of Chicago Robotic Laparoscopic Radical Prostatectomy (RLRP) database was reviewed for cases where anomalies of WD derivatives were encountered. We describe how modifications in technique allowed for completion of the procedure without difficulty. Of the 1230 RLRP procedures performed at our institution by three surgeons, only two cases (0.16%) have been noted to have a WD anomaly. These cases were able to be completed without difficulty by making simple modifications in technique. Although uncommon, it is important for the urologist to be familiar with the origin and surgical management of WD anomalies, particularly when detected incidentally during surgery. Simple modifications in technique allow for completion of RLRP without difficulty.
A lightweight network anomaly detection technique
Kim, Jinoh; Yoo, Wucherl; Sim, Alex; ...
2017-03-13
While network anomaly detection is essential in network operations and management, it becomes further challenging to perform the first line of detection against the exponentially increasing volume of network traffic. In this paper, we develop a technique for the first line of online anomaly detection with two important considerations: (i) availability of traffic attributes during the monitoring time, and (ii) computational scalability for streaming data. The presented learning technique is lightweight and highly scalable, relying on approximation based on the grid partitioning of the given dimensional space. With the public traffic traces of KDD Cup 1999 and NSL-KDD, we show that our technique yields 98.5% and 83% detection accuracy, respectively, with only a couple of readily available traffic attributes that can be obtained without the help of post-processing. Finally, the results are at least comparable with the classical learning methods including decision tree and random forest, with approximately two orders of magnitude faster learning performance.
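A minimal sketch of the grid-partitioning idea (discretize a couple of readily available traffic attributes, learn how densely normal traffic populates each cell, and flag records that fall in empty or rare cells); the attribute choices, bin ranges, and threshold below are placeholders, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two readily available traffic attributes per record (synthetic stand-ins).
normal = np.column_stack([rng.gamma(2.0, 200.0, 5000),     # bytes transferred
                          rng.gamma(2.0, 0.05, 5000)])     # connection duration (s)

bins = [np.linspace(0, 4000, 41), np.linspace(0, 1.0, 41)]

def cell_index(x):
    return tuple(np.clip(np.digitize(x[i], bins[i]) - 1, 0, 39) for i in range(2))

# Training: count how often normal traffic lands in each grid cell.
counts = np.zeros((40, 40))
for row in normal:
    counts[cell_index(row)] += 1

def is_anomalous(record, min_count=3):
    """Flag records that land in cells rarely (or never) seen during training."""
    return counts[cell_index(record)] < min_count

print(is_anomalous(np.array([300.0, 0.08])))    # typical record  -> likely False
print(is_anomalous(np.array([3900.0, 0.95])))   # extreme record  -> likely True
```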
AnRAD: A Neuromorphic Anomaly Detection Framework for Massive Concurrent Data Streams.
Chen, Qiuwen; Luley, Ryan; Wu, Qing; Bishop, Morgan; Linderman, Richard W; Qiu, Qinru
2018-05-01
The evolution of high performance computing technologies has enabled the large-scale implementation of neuromorphic models and pushed the research in computational intelligence into a new era. Among the machine learning applications, unsupervised detection of anomalous streams is especially challenging due to the requirements of detection accuracy and real-time performance. Designing a computing framework that harnesses the growing computing power of the multicore systems while maintaining high sensitivity and specificity to the anomalies is an urgent research topic. In this paper, we propose anomaly recognition and detection (AnRAD), a bioinspired detection framework that performs probabilistic inferences. We analyze the feature dependency and develop a self-structuring method that learns an efficient confabulation network using unlabeled data. This network is capable of fast incremental learning, which continuously refines the knowledge base using streaming data. Compared with several existing anomaly detection approaches, our method provides competitive detection quality. Furthermore, we exploit the massive parallel structure of the AnRAD framework. Our implementations of the detection algorithm on the graphics processing unit and the Xeon Phi coprocessor both obtain substantial speedups over the sequential implementation on a general-purpose microprocessor. The framework provides real-time service to concurrent data streams within diversified knowledge contexts, and can be applied to large problems with multiple local patterns. Experimental results demonstrate high computing performance and memory efficiency. For vehicle behavior detection, the framework is able to monitor up to 16000 vehicles (data streams) and their interactions in real time with a single commodity coprocessor, and uses less than 0.2 ms for one testing subject. Finally, the detection network is ported to our spiking neural network simulator to show the potential of adapting to the emerging neuromorphic architectures.
Distributed Health Monitoring System for Reusable Liquid Rocket Engines
NASA Technical Reports Server (NTRS)
Lin, C. F.; Figueroa, F.; Politopoulos, T.; Oonk, S.
2009-01-01
The ability to correctly detect and identify any possible failure in the systems, subsystems, or sensors within a reusable liquid rocket engine is a major goal at NASA John C. Stennis Space Center (SSC). A health management (HM) system is required to provide an on-ground operation crew with an integrated awareness of the condition of every element of interest by determining anomalies, examining their causes, and making predictive statements. However, the complexity associated with relevant systems, and the large amount of data typically necessary for proper interpretation and analysis, presents difficulties in implementing complete failure detection, identification, and prognostics (FDI&P). As such, this paper presents a Distributed Health Monitoring System for Reusable Liquid Rocket Engines as a solution to these problems through the use of highly intelligent algorithms for real-time FDI&P, and efficient and embedded processing at multiple levels. The end result is the ability to successfully incorporate a comprehensive HM platform despite the complexity of the systems under consideration.
Sleep Deprivation Attack Detection in Wireless Sensor Network
NASA Astrophysics Data System (ADS)
Bhattasali, Tapalina; Chaki, Rituparna; Sanyal, Sugata
2012-02-01
Deployment of sensor networks in hostile environments makes them mainly vulnerable to battery drainage attacks because it is impossible to recharge or replace the battery power of sensor nodes. Among different types of security threats, low-power sensor nodes are immensely affected by attacks which cause random drainage of the energy level of sensors, leading to death of the nodes. The most dangerous type of attack in this category is sleep deprivation, where the target of the intruder is to maximize the power consumption of sensor nodes so that their lifetime is minimized. Most of the existing works on sleep deprivation attack detection involve a lot of overhead, leading to poor throughput. The need of the day is to design a model for detecting intrusions accurately in an energy-efficient manner. This paper proposes a hierarchical framework based on a distributed collaborative mechanism for detecting sleep deprivation torture in wireless sensor networks efficiently. The proposed model uses an anomaly detection technique in two steps to reduce the probability of false intrusion.
Spatial-temporal event detection in climate parameter imagery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKenna, Sean Andrew; Gutierrez, Karen A.
Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely-sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and in Southern India.
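A minimal sketch of the regression-based idea for a single pixel (fit a seasonal harmonic regression to the NDVI history and flag weeks whose residuals are far outside the noise level); the statistical parametric mapping machinery of the report is not reproduced and the series below is synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)
weeks = np.arange(520)                                     # ten years of weekly samples

# Synthetic NDVI series: seasonal cycle + noise + an injected anomaly (drought-like dip).
ndvi = 0.5 + 0.2 * np.sin(2 * np.pi * weeks / 52.0) + 0.02 * rng.normal(size=weeks.size)
ndvi[400:410] -= 0.15

# Seasonal harmonic regression: y ~ 1 + sin + cos.
X = np.column_stack([np.ones_like(weeks, dtype=float),
                     np.sin(2 * np.pi * weeks / 52.0),
                     np.cos(2 * np.pi * weeks / 52.0)])
beta, *_ = np.linalg.lstsq(X, ndvi, rcond=None)
residual = ndvi - X @ beta
sigma = np.std(residual)

anomalous_weeks = np.where(np.abs(residual) > 4 * sigma)[0]
print("weeks flagged as anomalous:", anomalous_weeks)
```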
Syed, Zeeshan; Saeed, Mohammed; Rubinfeld, Ilan
2010-01-01
For many clinical conditions, only a small number of patients experience adverse outcomes. Developing risk stratification algorithms for these conditions typically requires collecting large volumes of data to capture enough positive and negative examples for training. This process is slow, expensive, and may not be appropriate for new phenomena. In this paper, we explore different anomaly detection approaches to identify high-risk patients as cases that lie in sparse regions of the feature space. We study three broad categories of anomaly detection methods: classification-based, nearest neighbor-based, and clustering-based techniques. When evaluated on data from the National Surgical Quality Improvement Program (NSQIP), these methods were able to successfully identify patients at an elevated risk of mortality and rare morbidities following inpatient surgical procedures. PMID:21347083
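A minimal sketch of the nearest-neighbor-based category mentioned above, scoring each patient by the distance to the k-th nearest neighbor so that cases in sparse regions of feature space receive high scores; the features below are synthetic stand-ins, not NSQIP variables:

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic standardized patient features (e.g., age, labs, comorbidity indices).
patients = rng.normal(size=(1000, 5))
patients[:10] += 4.0                      # a few patients far from the bulk of the data

def knn_anomaly_scores(X, k=10):
    """Distance to the k-th nearest neighbor for every row of X."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    d.sort(axis=1)
    return d[:, k]                        # column 0 is the zero self-distance

scores = knn_anomaly_scores(patients)
flagged = np.argsort(scores)[-10:]
print("highest-scoring patients:", sorted(flagged.tolist()))
```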
NASA Astrophysics Data System (ADS)
Li, H.; Kusky, T. M.; Peng, S.; Zhu, M.
2012-12-01
Thermal infrared (TIR) remote sensing is an important technique in the exploration of geothermal resources. In this study, a geothermal survey was conducted in the Tengchong area of Yunnan province in China using multi-temporal MODIS LST (Land Surface Temperature) data. Monthly nighttime MODIS LST data for the study area from Mar. 2000 to Mar. 2011 were collected and analyzed. The 132-month average LST map was derived and three geothermal anomalies were identified. The findings of this study agree well with the results from relative geothermal gradient measurements. Finally, we conclude that TIR remote sensing is a cost-effective technique to detect geothermal anomalies. Combining TIR remote sensing with geological analysis and an understanding of the geothermal mechanism is an accurate and efficient approach to geothermal area detection.
3D P-Wave Velocity Structure of the Crust and Relocation of Earthquakes in the Lushan Source Area
NASA Astrophysics Data System (ADS)
Yu, X.; Wang, X.; Zhang, W.
2014-12-01
The double difference seismic tomography method is applied to the absolute first-arrival P wave arrival times and high-quality relative P arrival times of the Lushan seismic sequence to determine the detailed crustal 3D P wave velocity structure and the hypocenter parameters in the Lushan seismic area. The results show that the Lushan mainshock is located at 30.28°N, 103.98°E, at a depth of 16.38 km. Northeast of the mainshock, the leading edge of the aftershock zone presents a spade-like shape with a steep dip angle, and the aftershocks extend about 12 km. Southwest of the mainshock, the leading edge of the aftershock zone in the low-velocity zone slopes gently, and the aftershocks extend about 23 km. The P wave velocity structure of the Lushan seismic area shows obvious lateral heterogeneity. The P wave velocity anomalies show a close relationship with topographic relief and geological structure. In the Baoxing area the complex rocks correspond to obvious high-velocity anomalies extending down to 15 km depth, while the Cenozoic rocks are correlated with low-velocity anomalies. Our high-resolution tomographic model not only displays the general features contained in the previous models, but also reveals some new features. An obvious high-velocity anomaly is visible in the Daxing area. The high-velocity anomalies beneath Baoxing and Daxing connect with each other at 10 km depth, which makes the contrast between high- and low-velocity anomalies sharper. Above 20 km depth the velocity structure in the southwest and northeast segments of the mainshock shows a big difference: low-velocity anomalies dominate the southwest segment, while high-velocity anomalies dominate the northeast segment. The Lushan mainshock is located at the leading edge of a low-velocity anomaly surrounded by the Baoxing and Daxing high-velocity anomalies. The Lushan aftershocks in the southwest are distributed in low-velocity anomalies or the transition belt: the footwall shows low-velocity anomalies, while the hanging wall shows high-velocity anomalies. The northeastern aftershocks are distributed at the boundary between the high-velocity anomalies in the Baoxing and Daxing areas. The main seismogenic layer dips to the northwest.
NASA Technical Reports Server (NTRS)
Shrestha, S.; Kharkovsky, S.; Zoughi, R.; Hepburn, F
2005-01-01
The Space Shuttle Columbia's catastrophic failure has been attributed to a piece of external fuel tank insulating SOFI (Spray On Foam Insulation) foam striking the leading edge of the left wing of the orbiter causing significant damage to some of the protecting heat tiles. The accident emphasizes the growing need to develop effective, robust and life-cycle oriented methods of nondestructive testing and evaluation (NDT&E) of complex conductor-backed insulating foam and protective acreage heat tiles used in the space shuttle fleet and in future multi-launch space vehicles. The insulating SOFI foam is constructed from closed-cell foam. In the microwave regime this foam is in the family of low permittivity and low loss dielectric materials. Near-field microwave and millimeter wave NDT methods were one of the techniques chosen for this purpose. To this end several flat and thick SOFI foam panels, two structurally complex panels similar to the external fuel tank and a "blind" panel were used in this investigation. Several anomalies such as voids and disbonds were embedded in these panels at various locations. The location and properties of the embedded anomalies in the "blind" panel were not disclosed to the investigating team prior to the investigation. Three frequency bands were used in this investigation covering a frequency range of 8-75 GHz. Moreover, the influence of signal polarization was also investigated. Overall the results of this investigation were very promising for detecting the presence of anomalies in different panels covered with relatively thick insulating SOFI foam. Different types of anomalies were detected in foam up to 9 in. thick. Many of the anomalies in the more complex panels were also detected. When investigating the blind panel no false positives were detected. Anomalies in between and underneath bolt heads were not easily detected. This paper presents the results of this investigation along with a discussion of the capabilities of the method used.
2014-02-26
set of anomaly detection rules 62 I.-R. Chen et al. / Ad Hoc Networks 19 (2014) 59–74 Author’s personal copy including the interval rule (for...deficiencies in anomaly detection (e.g., imperfection of rules) by a false negative probability (PHfn) of misidentifying an unhealthy node as a...multimedia servers, Multimedia Syst. 8 (2) (2000) 83–91. [53] R. Mitchell, I.R. Chen, Adaptive intrusion detection for unmanned aircraft systems based on
Using Physical Models for Anomaly Detection in Control Systems
NASA Astrophysics Data System (ADS)
Svendsen, Nils; Wolthusen, Stephen
Supervisory control and data acquisition (SCADA) systems are increasingly used to operate critical infrastructure assets. However, the inclusion of advanced information technology and communications components and elaborate control strategies in SCADA systems increase the threat surface for external and subversion-type attacks. The problems are exacerbated by site-specific properties of SCADA environments that make subversion detection impractical; and by sensor noise and feedback characteristics that degrade conventional anomaly detection systems. Moreover, potential attack mechanisms are ill-defined and may include both physical and logical aspects.
Constraining Mass Anomalies Using Trans-dimensional Gravity Inversions
NASA Astrophysics Data System (ADS)
Izquierdo, K.; Montesi, L.; Lekic, V.
2016-12-01
The density structure of planetary interiors constitutes a key constraint on their composition, temperature, and dynamics. This has motivated the development of non-invasive methods to infer the 3D distribution of density anomalies within a planet's interior using gravity observations made from the surface or orbit. On Earth, this information can be supplemented by seismic and electromagnetic observations, but such data are generally not available on other planets and inferences must be made from gravity observations alone. Unfortunately, inferences of density anomalies from gravity are non-unique and even the dimensionality of the problem - i.e., the number of density anomalies detectable in the planetary interior - is unknown. In this project, we use the Reversible Jump Markov chain Monte Carlo (RJMCMC) algorithm to approach gravity inversions in a trans-dimensional way, that is, considering the magnitude of the mass, the latitude, longitude, depth and number of anomalies itself as unknowns to be constrained by the observed gravity field at the surface of a planet. Our approach builds upon previous work using trans-dimensional gravity inversions in which the density contrast between the anomaly and the surrounding material is known. We validate the algorithm by analyzing a synthetic gravity field produced by a known density structure and comparing the retrieved and input density structures. We find excellent agreement between the input and retrieved structure when working in 1D and 2D domains. However, in 3D domains, comprehensive exploration of the much larger space of possible models makes search efficiency a key ingredient in successful gravity inversion. We find that with a sufficiently long RJMCMC run, it is possible to use statistical information to recover a predicted model that matches the real model. We argue that even for more complex problems, such as those constrained by real gravity acceleration data for a planet, our trans-dimensional gravity inversion algorithm provides a good option for overcoming the problem of non-uniqueness while achieving parsimony in gravity inversions.
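A hedged sketch of the forward model and misfit that a trans-dimensional sampler such as RJMCMC repeatedly evaluates (vertical gravity anomaly of buried point masses at surface stations); the reversible-jump bookkeeping itself is omitted and all values are synthetic:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravity_anomaly(stations_xy, masses, positions_xyz):
    """Vertical (downward-positive) gravity anomaly at surface stations from buried point masses."""
    sx = np.column_stack([stations_xy, np.zeros(len(stations_xy))])
    d = sx[:, None, :] - positions_xyz[None, :, :]        # station minus source
    r = np.linalg.norm(d, axis=2)
    return (G * masses * d[:, :, 2] / r**3).sum(axis=1)   # d_z equals the source depth

def misfit(observed, predicted, sigma=1e-6):
    """Half chi-square misfit used to score a proposed model."""
    return 0.5 * float(np.sum(((observed - predicted) / sigma) ** 2))

rng = np.random.default_rng(6)
stations = np.column_stack([rng.uniform(-5e3, 5e3, 100), rng.uniform(-5e3, 5e3, 100)])
true_pos = np.array([[1e3, -2e3, -1.5e3], [-2e3, 2e3, -2.5e3]])   # two buried anomalies
true_mass = np.array([5e11, 8e11])                                # excess masses, kg

obs = gravity_anomaly(stations, true_mass, true_pos) + 1e-6 * rng.normal(size=100)

# A candidate model with a different number of anomalies, as a trans-dimensional sampler might propose.
cand_pos = np.array([[0.0, 0.0, -2.0e3]])
cand_mass = np.array([1.2e12])
print("candidate misfit:", round(misfit(obs, gravity_anomaly(stations, cand_mass, cand_pos)), 1))
```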
Caldera unrest detected with seawater temperature anomalies at Deception Island, Antarctic Peninsula
NASA Astrophysics Data System (ADS)
Berrocoso, M.; Prates, G.; Fernández-Ros, A.; Peci, L. M.; de Gil, A.; Rosado, B.; Páez, R.; Jigena, B.
2018-04-01
Increased thermal activity was detected to coincide with the onset of volcano inflation in the seawater-filled caldera at Deception Island. This thermal activity was manifested in pulses of high water temperature that coincided with ocean tide cycles. The seawater temperature anomalies were detected by a thermometric sensor attached to the tide gauge (bottom pressure sensor). This was installed where the seawater circulation and the locations of known thermal anomalies, fumaroles and thermal springs, together favor the detection of water warmed within the caldera. Detection of the increased thermal activity was also possible because sea ice, which covers the entire caldera during the austral winter months, insulates the water and thus reduces temperature exchange between seawater and atmosphere. In these conditions, the water temperature data has been shown to provide significant information about Deception volcano activity. The detected seawater temperature increase, also observed in soil temperature readings, suggests rapid and near-simultaneous increase in geothermal activity with onset of caldera inflation and an increased number of seismic events observed in the following austral summer.
Fiedler, Klaus; Kareev, Yaakov; Avrahami, Judith; Beier, Susanne; Kutzner, Florian; Hütter, Mandy
2016-01-01
Detecting changes, in performance, sales, markets, risks, social relations, or public opinions, constitutes an important adaptive function. In a sequential paradigm devised to investigate detection of change, every trial provides a sample of binary outcomes (e.g., correct vs. incorrect student responses). Participants have to decide whether the proportion of a focal feature (e.g., correct responses) in the population from which the sample is drawn has decreased, remained constant, or increased. Strong and persistent anomalies in change detection arise when changes in proportional quantities vary orthogonally to changes in absolute sample size. Proportional increases are readily detected and nonchanges are erroneously perceived as increases when absolute sample size increases. Conversely, decreasing sample size facilitates the correct detection of proportional decreases and the erroneous perception of nonchanges as decreases. These anomalies are however confined to experienced samples of elementary raw events from which proportions have to be inferred inductively. They disappear when sample proportions are described as percentages in a normalized probability format. To explain these challenging findings, it is essential to understand the inductive-learning constraints imposed on decisions from experience.
A novel approach for pilot error detection using Dynamic Bayesian Networks.
Saada, Mohamad; Meng, Qinggang; Huang, Tingwen
2014-06-01
In the last decade Dynamic Bayesian Networks (DBNs) have become one of the most attractive probabilistic modelling extensions of Bayesian Networks (BNs) for working under uncertainty from a temporal perspective. Despite this popularity, not many researchers have attempted to study the use of these networks in anomaly detection or the implications of data anomalies on the outcome of such models. An abnormal change in the modelled environment's data at a given time will cause a trailing chain effect on the data of all related environment variables in current and consecutive time slices. Although this effect fades with time, it can still have an ill effect on the outcome of such models. In this paper we propose an algorithm for pilot error detection, using DBNs as the modelling framework for learning and detecting anomalous data. We base our experiments on the actions of an aircraft pilot, and a flight simulator is created for running the experiments. The proposed anomaly detection algorithm has achieved good results in detecting pilot errors and their effects on the whole system.
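A minimal sketch of likelihood-based anomaly scoring under a learned temporal model, here a plain first-order Markov chain over discretized pilot actions rather than a full DBN; the states, transition matrix, and threshold are invented for illustration:

```python
import numpy as np

states = ["climb", "cruise", "descend", "flare"]

# Transition probabilities assumed to have been estimated from "normal flight" sequences.
normal_transitions = np.array([[0.80, 0.20, 0.00, 0.00],
                               [0.05, 0.85, 0.10, 0.00],
                               [0.00, 0.05, 0.85, 0.10],
                               [0.00, 0.00, 0.10, 0.90]])

def sequence_log_likelihood(seq, T, eps=1e-6):
    """Average log-probability of the observed transitions under the learned model."""
    idx = [states.index(s) for s in seq]
    logp = [np.log(T[a, b] + eps) for a, b in zip(idx[:-1], idx[1:])]
    return float(np.mean(logp))

normal_seq = ["climb", "climb", "cruise", "cruise", "cruise", "descend", "descend", "flare"]
odd_seq = ["climb", "flare", "climb", "descend", "climb", "flare", "cruise", "climb"]

threshold = -3.0   # assumed decision threshold
for name, seq in [("normal", normal_seq), ("odd", odd_seq)]:
    ll = sequence_log_likelihood(seq, normal_transitions)
    print(f"{name}: mean log-likelihood {ll:.2f}, anomaly = {ll < threshold}")
```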
Data-Driven Anomaly Detection Performance for the Ares I-X Ground Diagnostic Prototype
NASA Technical Reports Server (NTRS)
Martin, Rodney A.; Schwabacher, Mark A.; Matthews, Bryan L.
2010-01-01
In this paper, we will assess the performance of a data-driven anomaly detection algorithm, the Inductive Monitoring System (IMS), which can be used to detect simulated Thrust Vector Control (TVC) system failures. However, the ability of IMS to detect these failures in a true operational setting may be related to the realistic nature of how they are simulated. As such, we will investigate both a low fidelity and high fidelity approach to simulating such failures, with the latter based upon the underlying physics. Furthermore, the ability of IMS to detect anomalies that were previously unknown and not previously simulated will be studied in earnest, as well as apparent deficiencies or misapplications that result from using the data-driven paradigm. Our conclusions indicate that robust detection performance of simulated failures using IMS is not appreciably affected by the use of a high fidelity simulation. However, we have found that the inclusion of a data-driven algorithm such as IMS into a suite of deployable health management technologies does add significant value.
Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.
Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen
2015-11-01
Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented a variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is embedded in the EUROCAT Central Database for annual application to centrally held registry data. The methodology was incrementally adapted to improve its utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating date of conception as the unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting of multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies.
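A minimal sketch of a case-count-based variable-window scan (windows defined by numbers of consecutive cases ordered by estimated conception date, scored by how tightly packed they are relative to the overall rate); the lookup-table significance testing and registry-specific adjustments used by EUROCAT are omitted and the dates are simulated:

```python
import numpy as np

rng = np.random.default_rng(8)

# Simulated estimated dates of conception (in days over ten years) with an injected 60-day cluster.
background = np.sort(rng.uniform(0, 3650, 300))
cluster = rng.uniform(2000, 2060, 15)
dates = np.sort(np.concatenate([background, cluster]))

overall_rate = len(dates) / (dates[-1] - dates[0])           # cases per day

best = None
for k in range(5, 31):                                       # window sizes defined by case counts
    spans = dates[k - 1:] - dates[:len(dates) - k + 1]       # elapsed time covered by k consecutive cases
    i = int(np.argmin(spans))
    ratio = (k / overall_rate) / spans[i]                    # >1: cases packed tighter than expected
    if best is None or ratio > best[0]:
        best = (ratio, k, dates[i], dates[i + k - 1])

ratio, k, start, end = best
print(f"most unusual window: {k} cases in {end - start:.0f} days, "
      f"about {ratio:.1f}x tighter than the overall rate (days {start:.0f}-{end:.0f})")
```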
HPNAIDM: The High-Performance Network Anomaly/Intrusion Detection and Mitigation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yan
Identifying traffic anomalies and attacks rapidly and accurately is critical for large network operators. With the rapid growth of network bandwidth, such as the next generation DOE UltraScience Network, and the fast emergence of new attacks/viruses/worms, existing network intrusion detection systems (IDS) are insufficient because they: • Are mostly host-based and not scalable to high-performance networks; • Are mostly signature-based and unable to adaptively recognize flow-level unknown attacks; • Cannot differentiate malicious events from unintentional anomalies. To address these challenges, we proposed and developed a new paradigm called the high-performance network anomaly/intrusion detection and mitigation (HPNAIDM) system. The new paradigm is significantly different from existing IDSes with the following features (research thrusts): • Online traffic recording and analysis on high-speed networks; • Online adaptive flow-level anomaly/intrusion detection and mitigation; • Integrated approach for false positive reduction. Our research prototype and evaluation demonstrate that the HPNAIDM system is highly effective and economically feasible. Beyond satisfying the pre-set goals, we even exceeded them significantly (see more details in the next section). Overall, our project harvested 23 publications (2 book chapters, 6 journal papers and 15 peer-reviewed conference/workshop papers). In addition, we built a website for technique dissemination, which hosts two system prototype releases to the research community. We also filed a patent application and developed strong international and domestic collaborations which span both academia and industry.
Discrimination between pre-seismic electromagnetic anomalies and solar activity effects
NASA Astrophysics Data System (ADS)
Koulouras, G.; Balasis, G.; Kiourktsidis, I.; Nannos, E.; Kontakos, K.; Stonham, J.; Ruzhin, Y.; Eftaxias, K.; Cavouras, D.; Nomicos, C.
2009-04-01
Laboratory studies suggest that electromagnetic emissions in a wide frequency spectrum ranging from kilohertz (kHz) to very high megahertz (MHz) frequencies are produced by the opening of microcracks, with the MHz radiation appearing earlier than the kHz radiation. Earthquakes are large-scale fracture phenomena in the Earth's heterogeneous crust. Thus, the radiated kHz-MHz electromagnetic emissions are detectable not only in the laboratory but also at a geological scale. Clear MHz-to-kHz electromagnetic anomalies have been systematically detected over periods ranging from a few days to a few hours prior to recent destructive earthquakes in Greece. We should bear in mind that whether electromagnetic precursors to earthquakes exist is an important question not only for earthquake prediction but mainly for understanding the physical processes of earthquake generation. An open question in this field of research is the classification of a detected electromagnetic anomaly as a pre-seismic signal associated with earthquake occurrence. Indeed, electromagnetic fluctuations in the frequency range of MHz are known to be related to a few sources, including atmospheric noise (due to lightning), man-made composite noise, solar-terrestrial noise (resulting from the Sun-solar wind-magnetosphere-ionosphere-Earth's surface chain) or cosmic noise, and finally, the lithospheric effect, namely pre-seismic activity. We focus on this point in this paper. We suggest that if a combination of detected kHz and MHz electromagnetic anomalies satisfies the set of criteria presented herein, these anomalies could be considered as candidate precursory phenomena of an impending earthquake.
Discrimination between preseismic electromagnetic anomalies and solar activity effects
NASA Astrophysics Data System (ADS)
Koulouras, Gr; Balasis, G.; Kontakos, K.; Ruzhin, Y.; Avgoustis, G.; Kavouras, D.; Nomicos, C.
2009-04-01
Laboratory studies suggest that electromagnetic emissions in a wide frequency spectrum ranging from kHz to very high MHz frequencies are produced by the opening of microcracks, with the MHz radiation appearing earlier than the kHz radiation. Earthquakes are large-scale fracture phenomena in the Earth's heterogeneous crust. Thus, the radiated kHz-MHz electromagnetic emissions are detectable not only at the laboratory scale but also at the geological scale. Clear MHz-to-kHz electromagnetic anomalies have been systematically detected over periods ranging from a few days to a few hours prior to recent destructive earthquakes in Greece. We bear in mind that whether electromagnetic precursors to earthquakes exist is an important question not only for earthquake prediction but mainly for understanding the physical processes of earthquake generation. An open question in this field of research is the classification of a detected electromagnetic anomaly as a pre-seismic signal associated with earthquake occurrence. Indeed, electromagnetic fluctuations in the frequency range of MHz are known to be related to a few sources, i.e., they might be atmospheric noise (due to lightning), man-made composite noise, solar-terrestrial noise (resulting from the Sun-solar wind-magnetosphere-ionosphere-Earth's surface chain) or cosmic noise, and finally, a lithospheric effect, namely pre-seismic activity. We focus on this point. We suggest that if a combination of detected kHz and MHz electromagnetic anomalies satisfies the set of criteria presented herein, these anomalies could be considered as candidate precursory phenomena of an impending earthquake.
NASA Astrophysics Data System (ADS)
Yang, G.; Shen, C.; Wang, J.
2017-12-01
We calculated the Bouguer gravity anomaly and the Airy-Heiskanen isostatic anomaly in the New Britain trench and its surrounding areas of Papua New Guinea using the topography model and the gravity anomaly model from the Scripps Institution of Oceanography, and analyzed the characteristics of the isostatic anomaly and the earthquake dynamic environment of this region. The results show that there are obvious differences in the isostatic state between the blocks in the region, and the crustal tectonic movement is very intense in the regions with high positive or negative isostatic gravity anomalies. A number of sub-plates in this area are driven by external tectonic actions such as the subduction and thrusting of the Pacific plate, the Indian-Australian plate and the Eurasian plate. From the distribution of the isostatic gravity anomaly, the tectonic action of anti-isostatic movement in this region is the main source of power; from the isostatic gravity anomalies and the spatial distribution of earthquakes, with the further contraction of the Indian-Australian plate, the southwestern part of the Solomon Haiya plate will become part of the Owen Stanley fold belt, the northern part will enter beneath the Bismarck plate, the eastern part will enter the front of the Pacific plate, and large earthquakes will migrate to the north and east of the Solomon Haiya plate.
Kaasen, A; Helbig, A; Malt, U F; Naes, T; Skari, H; Haugen, G
2010-08-01
To predict acute psychological distress in pregnant women following detection of a fetal structural anomaly by ultrasonography, and to relate these findings to a comparison group. A prospective, observational study. Tertiary referral centre for fetal medicine. One hundred and eighty pregnant women with a fetal structural anomaly detected by ultrasound (study group) and 111 with normal ultrasound findings (comparison group) were included within a week following sonographic examination after gestational age 12 weeks (inclusion period: May 2006 to February 2009). Social dysfunction and health perception were assessed by the corresponding subscales of the General Health Questionnaire (GHQ-28). Psychological distress was assessed using the Impact of Events Scale (IES-22), Edinburgh Postnatal Depression Scale (EPDS) and the anxiety and depression subscales of the GHQ-28. Fetal anomalies were classified according to severity and diagnostic or prognostic ambiguity at the time of assessment. Social dysfunction, health perception and psychological distress (intrusion, avoidance, arousal, anxiety, depression). The least severe anomalies with no diagnostic or prognostic ambiguity induced the lowest levels of IES intrusive distress (P = 0.025). Women included after 22 weeks of gestation (24%) reported significantly higher GHQ distress than women included earlier in pregnancy (P = 0.003). The study group had significantly higher levels of psychosocial distress than the comparison group on all psychometric endpoints. Psychological distress was predicted by gestational age at the time of assessment, severity of the fetal anomaly, and ambiguity concerning diagnosis or prognosis.
Detecting errors and anomalies in computerized materials control and accountability databases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiteson, R.; Hench, K.; Yarbro, T.
The Automated MC and A Database Assessment project is aimed at improving anomaly and error detection in materials control and accountability (MC and A) databases and increasing confidence in the data that they contain. Anomalous data resulting in poor categorization of nuclear material inventories greatly reduces the value of the database information to users. Therefore it is essential that MC and A data be assessed periodically for anomalies or errors. Anomaly detection can identify errors in databases and thus provide assurance of the integrity of data. An expert system has been developed at Los Alamos National Laboratory that examines these large databases for anomalous or erroneous data. For several years, MC and A subject matter experts at Los Alamos have been using this automated system to examine the large amounts of accountability data that the Los Alamos Plutonium Facility generates. These data are collected and managed by the Material Accountability and Safeguards System, a near-real-time computerized nuclear material accountability and safeguards system. This year they have expanded the user base, customizing the anomaly detector for the varying requirements of different groups of users. This paper describes the progress in customizing the expert systems to the needs of the users of the data and reports on their results.
Value of brain MRI when sonography raises suspicion of agenesis of the corpus callosum in fetuses.
Jarre, A; Llorens Salvador, R; Montoliu Fornas, G; Montoya Filardi, A
To evaluate the role of magnetic resonance imaging (MRI) in fetuses with a previous sonographic suspicion of agenesis of the corpus callosum (ACC) to confirm the diagnosis and to detect associated intracranial anomalies. Single-center retrospective and descriptive observational study of the brain MRI performed in 78 fetuses with sonographic suspicion of ACC between January 2006 and December 2015. Two experts in fetal imaging reviewed the MRI findings to evaluate the presence and morphology of the corpus callosum. When ACC was detected, the whole fetal brain anatomy was thoroughly studied to determine the presence of associated anomalies. Prenatal MR imaging findings were compared to postnatal brain MRI or necropsy findings when available. Fetal MRI diagnosed 45 cases of ACC, 12 partial (26.7%) and 33 complete (73.3%). In 28 cases (62.2%), associated intracranial anomalies were identified. The most frequent abnormality was ventriculomegaly (78.6%), followed by cortical malformations (53.6%), posterior fossa anomalies (25%) and midline anomalies (10.7%). Fetal brain MRI has an important role in the diagnosis of ACC and the detection of associated anomalies. Performing fetal brain MRI is important in fetuses with sonographic suspicion of ACC. Copyright © 2017 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
Consistent description of kinetic equation with triangle anomaly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pu Shi; Gao Jianhua; Wang Qun
2011-05-01
We provide a consistent description of the kinetic equation with a triangle anomaly which is compatible with the entropy principle of the second law of thermodynamics and the charge/energy-momentum conservation equations. In general an anomalous source term is necessary to ensure that the equations for the charge and energy-momentum conservation are satisfied and that the correction terms of distribution functions are compatible to these equations. The constraining equations from the entropy principle are derived for the anomaly-induced leading order corrections to the particle distribution functions. The correction terms can be determined for the minimum number of unknown coefficients in one-charge and two-charge cases by solving the constraining equations.
IR Thermography of International Space Station Radiator Panels
NASA Technical Reports Server (NTRS)
Koshti, Ajay; Winfree, William; Morton, Richard; Howell, Patricia
2010-01-01
Several non-flight qualification test radiators were inspected using flash thermography. Flash thermography data analysis used raw and second derivative images to detect anomalies (Echotherm and Mosaic). Simple contrast evolutions were plotted for the detected anomalies to help in anomaly characterization. Many out-of-family indications were noted. Some out-of-family indications were classified as cold spot indications and are due to additional adhesive or adhesive layer behind the facesheet. Some out-of-family indications were classified as hot spot indications and are due to void, unbond or lack of adhesive behind the facesheet. The IR inspection helped in assessing expected manufacturing quality of the radiators.
How much does the MSW effect contribute to the reactor antineutrino anomaly?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valdiviesso, G. A.
2015-05-15
It has been pointed out that there is a 5.7 ± 2.3% discrepancy between the predicted and the observed reactor antineutrino flux in very short baseline experiments. Several causes for this anomaly have been discussed, including a possible non-standard fourth, sterile neutrino. In order to quantify how non-standard this anomaly really is, the standard MSW effect is reviewed. Knowing that reactor antineutrinos are produced in a dense medium (the nuclear fuel) and are usually detected in a less dense one (water or scintillator), non-adiabatic effects are expected to occur, creating a difference between the creation and detection mixing angles.
Radioactive anomaly discrimination from spectral ratios
Maniscalco, James; Sjoden, Glenn; Chapman, Mac Clements
2013-08-20
A method for discriminating a radioactive anomaly from naturally occurring radioactive materials includes detecting a first number of gamma photons having energies in a first range of energy values within a predetermined period of time and detecting a second number of gamma photons having energies in a second range of energy values within the predetermined period of time. The method further includes determining, in a controller, a ratio of the first number of gamma photons having energies in the first range and the second number of gamma photons having energies in the second range, and determining that a radioactive anomaly is present when the ratio exceeds a threshold value.
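The ratio test described in this patent abstract is simple enough to sketch in a few lines. In the Python fragment below (assumed, for illustration only: the energy windows, the synthetic 186 keV line, and the 0.5 alarm threshold are not taken from the patent), photons are counted in two energy ranges over a fixed acquisition period and an alarm is raised when the low-to-high ratio exceeds the threshold.

    # Sketch of a two-window spectral-ratio screen for gamma spectra.
    # The energy windows, counting period and alarm threshold are illustrative.
    import numpy as np

    def spectral_ratio_alarm(energies_keV, lo=(50.0, 400.0), hi=(400.0, 3000.0),
                             threshold=0.5):
        """Return (ratio, alarm) for a set of detected photon energies in keV."""
        e = np.asarray(energies_keV, dtype=float)
        n_lo = np.count_nonzero((e >= lo[0]) & (e < lo[1]))
        n_hi = np.count_nonzero((e >= hi[0]) & (e < hi[1]))
        ratio = n_lo / max(n_hi, 1)          # guard against an empty high window
        return ratio, ratio > threshold

    # Example: flat background versus background plus a low-energy gamma line.
    rng = np.random.default_rng(0)
    background = rng.uniform(50.0, 3000.0, size=500)
    source = rng.normal(186.0, 5.0, size=200)       # hypothetical 186 keV line
    print(spectral_ratio_alarm(background))
    print(spectral_ratio_alarm(np.concatenate([background, source])))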
Dataset of anomalies and malicious acts in a cyber-physical subsystem.
Laso, Pedro Merino; Brosset, David; Puentes, John
2017-10-01
This article presents a dataset produced to investigate how data and information quality estimations enable the detection of anomalies and malicious acts in cyber-physical systems. Data were acquired making use of a cyber-physical subsystem consisting of liquid containers for fuel or water, along with its automated control and data acquisition infrastructure. The described data consist of temporal series representing five operational scenarios - normal, anomalies, breakdown, sabotages, and cyber-attacks - corresponding to 15 different real situations. The dataset is publicly available in the .zip file published with the article, to investigate and compare faulty operation detection and characterization methods for cyber-physical systems.
System and Method for Monitoring Distributed Asset Data
NASA Technical Reports Server (NTRS)
Gorinevsky, Dimitry (Inventor)
2015-01-01
A computer-based monitoring system and monitoring method implemented in computer software for detecting, estimating, and reporting the condition states, their changes, and anomalies for many assets. The assets are of the same type, are operated over a period of time, and are outfitted with data collection systems. The proposed monitoring method accounts for the variability of working conditions for each asset by using a regression model that characterizes asset performance. The assets are of the same type but not identical. The proposed monitoring method accounts for asset-to-asset variability; it also accounts for drifts and trends in the asset condition and data. The proposed monitoring system can perform distributed processing of massive amounts of historical data without discarding any useful information where moving all the asset data into one central computing system might be infeasible. The overall processing includes distributed preprocessing of data records from each asset to produce compressed data.
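A minimal sketch of the regression idea under stated assumptions (a single working-condition variable, a fleet-wide linear model, and a 3-sigma residual rule, none of which are specified by the patent): the fleet regression removes the effect of operating conditions, and each asset is then scored by how far its recent residuals drift from the fleet baseline.

    # Sketch: condition-adjusted monitoring of many similar assets.
    # A fleet-wide linear regression removes the effect of working conditions;
    # per-asset residual means then expose assets drifting away from the fleet.
    import numpy as np

    rng = np.random.default_rng(1)
    n_assets, n_obs = 20, 200
    load = rng.uniform(0.2, 1.0, size=(n_assets, n_obs))          # working condition
    perf = 10.0 + 5.0 * load + rng.normal(0.0, 0.3, size=load.shape)
    perf[7] -= np.linspace(0.0, 2.0, n_obs)                       # asset 7 degrades

    # Fit one fleet-wide model perf ~ a + b*load by least squares.
    X = np.column_stack([np.ones(load.size), load.ravel()])
    coef, *_ = np.linalg.lstsq(X, perf.ravel(), rcond=None)

    # Residual mean per asset over the most recent window; large magnitudes
    # indicate performance the working conditions do not explain.
    resid = perf - (coef[0] + coef[1] * load)
    recent = resid[:, -50:].mean(axis=1)
    threshold = 3.0 * resid[:, :100].std()                        # illustrative rule
    print("flagged assets:", np.where(np.abs(recent) > threshold)[0])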
Dual Use Corrosion Inhibitor and Penetrant for Anomaly Detection in Neutron/X Radiography
NASA Technical Reports Server (NTRS)
Hall, Phillip B. (Inventor); Novak, Howard L. (Inventor)
2004-01-01
A dual purpose corrosion inhibitor and penetrant composition sensitive to radiography interrogation is provided. The corrosion inhibitor mitigates or eliminates corrosion on the surface of a substrate upon which the corrosion inhibitor is applied. In addition, the corrosion inhibitor provides for the attenuation of a signal used during radiography interrogation thereby providing for detection of anomalies on the surface of the substrate.
A mixed model framework for teratology studies.
Braeken, Johan; Tuerlinckx, Francis
2009-10-01
A mixed model framework is presented to model the characteristic multivariate binary anomaly data as provided in some teratology studies. The key features of the model are the incorporation of covariate effects, a flexible random effects distribution by means of a finite mixture, and the application of copula functions to better account for the relation structure of the anomalies. The framework is motivated by data of the Boston Anticonvulsant Teratogenesis study and offers an integrated approach to investigate substantive questions, concerning general and anomaly-specific exposure effects of covariates, interrelations between anomalies, and objective diagnostic measurement.
Sentence understanding depends on contextual use of semantic and real world knowledge.
Tune, Sarah; Schlesewsky, Matthias; Nagels, Arne; Small, Steven L; Bornkessel-Schlesewsky, Ina
2016-08-01
Human language allows us to express our thoughts and ideas by combining entities, concepts and actions into multi-event episodes. Yet, the functional neuroanatomy engaged in interpretation of such high-level linguistic input remains poorly understood. Here, we used easy to detect and more subtle "borderline" anomalies to investigate the brain regions and mechanistic principles involved in the use of real-world event knowledge in language comprehension. Overall, the results showed that the processing of sentences in context engages a complex set of bilateral brain regions in the frontal, temporal and inferior parietal lobes. Easy anomalies preferentially engaged lower-order cortical areas adjacent to the primary auditory cortex. In addition, the left supramarginal gyrus and anterior temporal sulcus as well as the right posterior middle temporal gyrus contributed to the processing of easy and borderline anomalies. The observed pattern of results is explained in terms of (i) hierarchical processing along a dorsal-ventral axis and (ii) the assumption of high-order association areas serving as cortical hubs in the convergence of information in a distributed network. Finally, the observed modulation of BOLD signal in prefrontal areas provides support for their role in the implementation of executive control processes. Copyright © 2016 Elsevier Inc. All rights reserved.
Extending TOPS: Ontology-driven Anomaly Detection and Analysis System
NASA Astrophysics Data System (ADS)
Votava, P.; Nemani, R. R.; Michaelis, A.
2010-12-01
Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending the Terrestrial Observation and Prediction System (TOPS) to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. We can query the knowledge base and answer questions about dataset compatibilities, similarities and dependencies so that we can, for example, automatically analyze similar datasets in order to verify a given anomaly occurrence in multiple data sources. We are further extending the system to go beyond anomaly detection towards reasoning about possible causes of anomalies that are also encoded in the knowledge base as either learned or implied knowledge. This enables us to scale up the analysis by eliminating a large number of anomalies early on during the processing, either through failure to verify them from other sources or by matching them directly with other observable events, without having to perform an extensive and time-consuming exploration and analysis. The knowledge is captured using the OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. The information is stored using a Sesame server and is accessible through both a Java API and web services using the SeRQL and SPARQL query languages. Inference is provided using the OWLIM component integrated with Sesame.
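The abstract mentions storing the ontology with a Sesame server and querying it via SPARQL. As a rough, hypothetical illustration only (the namespace, class and property names below are invented and are not from the TOPS ontology), a similarity query of this kind could be prototyped locally with rdflib:

    # Illustrative only: querying a small, made-up ontology for datasets that are
    # declared similar to the one in which an anomaly was detected.
    from rdflib import Graph, Namespace, Literal, RDF

    EX = Namespace("http://example.org/tops#")     # hypothetical namespace
    g = Graph()
    g.add((EX.NDVI_MODIS, RDF.type, EX.Dataset))
    g.add((EX.NDVI_AVHRR, RDF.type, EX.Dataset))
    g.add((EX.NDVI_MODIS, EX.similarTo, EX.NDVI_AVHRR))
    g.add((EX.NDVI_AVHRR, EX.measures, Literal("vegetation index")))

    query = """
    PREFIX ex: <http://example.org/tops#>
    SELECT ?other WHERE {
        ex:NDVI_MODIS ex:similarTo ?other .
        ?other a ex:Dataset .
    }"""
    for row in g.query(query):
        print("verify anomaly against:", row.other)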
Anomaly Detection in Dynamic Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turcotte, Melissa
2014-10-14
Anomaly detection in dynamic communication networks has many important security applications. These networks can be extremely large and so detecting any changes in their structure can be computationally challenging; hence, computationally fast, parallelisable methods for monitoring the network are paramount. For this reason the methods presented here use independent node and edge based models to detect locally anomalous substructures within communication networks. As a first stage, the aim is to detect changes in the data streams arising from node or edge communications. Throughout the thesis simple, conjugate Bayesian models for counting processes are used to model these data streams. A second stage of analysis can then be performed on a much reduced subset of the network comprising nodes and edges which have been identified as potentially anomalous in the first stage. The first method assumes communications in a network arise from an inhomogeneous Poisson process with piecewise constant intensity. Anomaly detection is then treated as a changepoint problem on the intensities. The changepoint model is extended to incorporate seasonal behavior inherent in communication networks. This seasonal behavior is also viewed as a changepoint problem acting on a piecewise constant Poisson process. In a static time frame, inference is made on this extended model via a Gibbs sampling strategy. In a sequential time frame, where the data arrive as a stream, a novel, fast Sequential Monte Carlo (SMC) algorithm is introduced to sample from the sequence of posterior distributions of the change points over time. A second method is considered for monitoring communications in a large scale computer network. The usage patterns in these types of networks are very bursty in nature and don’t fit a Poisson process model. For tractable inference, discrete time models are considered, where the data are aggregated into discrete time periods and probability models are fitted to the communication counts. In a sequential analysis, anomalous behavior is then identified from outlying behavior with respect to the fitted predictive probability models. Seasonality is again incorporated into the model and is treated as a changepoint model on the transition probabilities of a discrete time Markov process. Second stage analytics are then developed which combine anomalous edges to identify anomalous substructures in the network.
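As a rough illustration of the first-stage modelling described above, the sketch below monitors a single edge's per-interval message counts with a conjugate Gamma-Poisson model, whose posterior predictive is negative binomial; a one-sided predictive p-value flags unusually high counts. The prior hyperparameters, the alarm level, and the toy count stream are illustrative assumptions, not values from the thesis.

    # Sketch: conjugate Gamma-Poisson monitoring of one edge's message counts.
    # The posterior predictive for the next count is negative binomial, so a
    # one-sided predictive p-value can flag unusually high activity.
    from scipy.stats import nbinom

    def monitor_counts(counts, a0=1.0, b0=1.0, alarm_p=1e-3):
        """Yield (count, p_value, alarm) while updating a Gamma(a, b) posterior."""
        a, b = a0, b0                       # illustrative prior hyperparameters
        for x in counts:
            # Posterior predictive: NegBin(r=a, p=b/(b+1)) for one unit interval.
            p_val = nbinom.sf(x - 1, a, b / (b + 1.0))   # P(X >= x)
            yield x, p_val, p_val < alarm_p
            a, b = a + x, b + 1.0           # conjugate update with the new count

    stream = [3, 2, 4, 3, 2, 3, 5, 2, 3, 40]            # last count is anomalous
    for x, p, alarm in monitor_counts(stream):
        if alarm:
            print(f"count {x} flagged, predictive p-value {p:.2e}")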
NASA Astrophysics Data System (ADS)
Wang, Fei; Wang, Wenyu; Yang, Jin Min
2017-10-01
We propose to introduce general messenger-matter interactions in the deflected anomaly mediated supersymmetry (SUSY) breaking (AMSB) scenario to explain the gμ-2 anomaly. Scenarios with complete or incomplete grand unified theory (GUT) multiplet messengers are discussed, respectively. The introduction of incomplete GUT multiplets can be advantageous in various aspects. We found that the gμ-2 anomaly can be solved in both scenarios under current constraints including the gluino mass bounds, while the scenarios with incomplete GUT representation messengers are more favored by the gμ-2 data. We also found that the gluino is upper bounded by about 2.5 TeV (2.0 TeV) in scenario A and 3.0 TeV (2.7 TeV) in scenario B if the generalized deflected AMSB scenarios are used to fully account for the gμ-2 anomaly at the 3σ (2σ) level. Such a gluino should be accessible in future LHC searches. Dark matter (DM) constraints, including the DM relic density and direct detection bounds, favor scenario B with incomplete GUT multiplets. Much of the allowed parameter space for scenario B could be covered by future DM direct detection experiments.
Congenital anomalies of the left brachiocephalic vein detected in adults on computed tomography.
Yamamuro, Hiroshi; Ichikawa, Tamaki; Hashimoto, Jun; Ono, Shun; Nagata, Yoshimi; Kawada, Shuichi; Kobayashi, Makiko; Koizumi, Jun; Shibata, Takeo; Imai, Yutaka
2017-10-01
An anomalous left brachiocephalic vein (BCV) is a rare and lesser-known systemic venous anomaly. We evaluated congenital anomalies of the left BCV in adults detected during computed tomography (CT) examinations. This retrospective study included 81,425 patients without congenital heart disease who underwent chest CT. We reviewed the recorded reports and CT images for congenital anomalies of the left BCV, including aberrant and supernumerary BCVs. The associated congenital aortic anomalies were assessed. Among 73,407 cases at a university hospital, 22 (16 males, 6 females; mean age, 59 years) with aberrant left BCVs were found using a keyword search of the recorded reports (0.03%). Among 8018 cases at the branch hospital, 5 (4 males, 1 female; mean age, 67 years) with aberrant left BCVs were found using CT image review (0.062%). There were no significant differences in the incidences of aberrant left BCV between the two groups. Two cases had double left BCVs. Eleven cases showed high aortic arches. Two cases had a right aortic arch, one case had an incomplete double aortic arch, and one case was associated with coarctation. Aberrant left BCV on CT examination in adults was extremely rare. Some cases were associated with aortic arch anomalies.
An application of a zero-inflated lifetime distribution with multiple and incomplete data sources
Hamada, M. S.; Margevicius, K. J.
2016-02-11
In this study, we analyze data sampled from a population of parts in which an associated anomaly can occur at assembly or after assembly. Using a zero-inflated lifetime distribution to fit left-censored and right-censored data as well as data from a supplementary sample, we make predictions about the proportion of the population with anomalies today and in the future. Goodness-of-fit is also addressed.
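A minimal sketch of the modelling idea, under assumptions the abstract does not fix: an exponential lifetime, right-censoring only, and synthetic data. The likelihood mixes a point mass at time zero (anomaly present at assembly) with a lifetime law for anomalies that develop after assembly, and both the inflation probability and the rate are estimated by maximum likelihood.

    # Sketch: ML fit of a zero-inflated exponential lifetime with right-censoring.
    # pi  = probability the anomaly is present at assembly (time zero),
    # lam = rate of anomalies that develop after assembly.
    import numpy as np
    from scipy.optimize import minimize

    def neg_log_lik(params, t, zero, censored):
        pi, lam = params
        obs = (~zero) & (~censored)          # exact anomaly times after assembly
        cen = (~zero) & censored             # right-censored: no anomaly seen yet
        ll = (np.sum(zero) * np.log(pi)
              + np.sum(np.log1p(-pi) + np.log(lam) - lam * t[obs])
              + np.sum(np.log1p(-pi) - lam * t[cen]))
        return -ll

    # Synthetic data: 10% zero-inflation, exponential(0.2) lifetimes, censor at t=8.
    rng = np.random.default_rng(2)
    n = 2000
    zero = rng.random(n) < 0.10
    life = rng.exponential(1 / 0.2, size=n)
    t = np.where(zero, 0.0, np.minimum(life, 8.0))
    censored = (~zero) & (life > 8.0)

    res = minimize(neg_log_lik, x0=[0.5, 1.0], args=(t, zero, censored),
                   bounds=[(1e-6, 1 - 1e-6), (1e-6, None)])
    print("estimated pi, lambda:", res.x)     # should land near (0.10, 0.20)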
Sparse source configurations in radio tomography of asteroids
NASA Astrophysics Data System (ADS)
Pursiainen, S.; Kaasalainen, M.
2014-07-01
Our research targets progress in non-invasive imaging of asteroids to support future planetary research and extra-terrestrial mining activities. This presentation concerns principally radio tomography, in which the permittivity distribution inside an asteroid is to be recovered based on the radio frequency signal transmitted from the asteroid's surface and gathered by an orbiter. The focus will be on a sparse distribution (Pursiainen and Kaasalainen, 2013) of signal sources that can be necessary in the challenging in situ environment and within tight payload limits. The general goal in our recent research has been to approximate the minimal number of source positions needed for robust localization of anomalies caused, for example, by an internal void. Characteristic of the localization problem are the large relative changes in signal speed caused by the high permittivity of typical asteroid minerals (e.g. basalt), meaning that a signal path can include strong refractions and reflections. This presentation introduces results of a laboratory experiment in which real travel time data were inverted using a hierarchical Bayesian approach combined with the iterative alternating sequential (IAS) posterior exploration algorithm. Special interest was paid to the robustness of the inversion results with respect to changes of the prior model and source positioning. According to our results, strongly refractive anomalies can be detected with three or four sources independently of their positioning.
On-road anomaly detection by multimodal sensor analysis and multimedia processing
NASA Astrophysics Data System (ADS)
Orhan, Fatih; Eren, P. E.
2014-03-01
The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a multimodal sensor analysis framework which enables the analysis of sensor data across multiple modalities. It also provides plugin-based analysis interfaces to develop sensor and image processing based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. Using this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard braking, pothole crossings, and speed bump crossings. Upon such detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.
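A toy sketch of the sensor-side detection, assuming the framework delivers windows of 3-axis accelerometer samples (the sampling rate, window length, axis convention and thresholds below are illustrative, not the authors' settings): a short, sharp vertical jolt suggests a pothole or speed-bump crossing, while a sustained strong deceleration suggests a hard brake.

    # Toy sketch: classify accelerometer windows as pothole-like or hard-brake-like.
    # Sampling rate, window length and the two thresholds are illustrative values.
    import numpy as np

    FS = 50                      # samples per second (assumed)
    WIN = FS                     # 1-second analysis window

    def classify_window(acc_xyz):
        """acc_xyz: (WIN, 3) array of [forward, lateral, vertical] acceleration in m/s^2."""
        forward, vertical = acc_xyz[:, 0], acc_xyz[:, 2]
        if np.ptp(vertical) > 8.0:                 # short, sharp vertical jolt
            return "pothole/speed-bump candidate"
        if np.mean(forward) < -3.0:                # sustained deceleration
            return "hard-brake candidate"
        return "normal"

    # Example: a window with a simulated vertical jolt halfway through.
    rng = np.random.default_rng(3)
    window = rng.normal(0.0, 0.3, size=(WIN, 3))
    window[25:28, 2] += [6.0, -6.0, 4.0]
    print(classify_window(window))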
A new approach for structural health monitoring by applying anomaly detection on strain sensor data
NASA Astrophysics Data System (ADS)
Trichias, Konstantinos; Pijpers, Richard; Meeuwissen, Erik
2014-03-01
Structural Health Monitoring (SHM) systems help to monitor critical infrastructures (bridges, tunnels, etc.) remotely and provide up-to-date information about their physical condition. In addition, they help to predict the structure's life and required maintenance in a cost-efficient way. Typically, inspection data gives insight into structural health. The global structural behavior, and predominantly the structural loading, is generally measured with vibration and strain sensors. Acoustic emission sensors are more and more used for measuring global crack activity near critical locations. In this paper, we present a procedure for local structural health monitoring by applying Anomaly Detection (AD) on strain sensor data for sensors that are applied along the expected crack path. Sensor data is analyzed by automatic anomaly detection in order to find crack activity at an early stage. This approach targets the monitoring of critical structural locations, such as welds, near which strain sensors can be applied during construction, and/or locations with limited inspection possibilities during structural operation. We investigate several anomaly detection techniques to detect changes in statistical properties, indicating structural degradation. The most effective one is a novel polynomial fitting technique, which tracks slow changes in sensor data. Our approach has been tested on a representative test structure (bridge deck) in a lab environment, under constant and variable amplitude fatigue loading. In both cases, the evolving cracks at the monitored locations were successfully detected, autonomously, by our AD monitoring tool.
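A minimal sketch of the polynomial-fitting idea (the window length, polynomial degree, synthetic drift and 3-sigma alarm rule are illustrative assumptions, not the authors' settings): fit a low-order polynomial over successive windows of strain samples and alarm when the fitted trend drifts beyond what the early, presumed-healthy windows exhibited.

    # Sketch: track slow changes in strain data with window-by-window polynomial fits.
    # Window size, polynomial degree, and the 3-sigma alarm rule are illustrative.
    import numpy as np

    def window_slopes(signal, win=200, degree=1):
        """Return the leading polynomial coefficient (trend) of each window."""
        slopes = []
        for start in range(0, len(signal) - win + 1, win):
            seg = signal[start:start + win]
            coeffs = np.polyfit(np.arange(win), seg, degree)
            slopes.append(coeffs[0])           # slope, for degree-1 fits
        return np.array(slopes)

    # Synthetic strain: noise plus a slow drift that starts at sample 3000,
    # loosely standing in for stiffness loss near a growing crack.
    rng = np.random.default_rng(4)
    strain = rng.normal(0.0, 1.0, size=6000)
    strain[3000:] += np.linspace(0.0, 30.0, 3000)

    slopes = window_slopes(strain)
    baseline = slopes[:10]                     # assume the first windows are healthy
    alarm = np.abs(slopes - baseline.mean()) > 3 * baseline.std()
    print("first window flagged:", int(np.argmax(alarm)))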
Radiation anomaly detection algorithms for field-acquired gamma energy spectra
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen
2015-08-01
The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of the threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign radioactive materials in the form of naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work in spite of a strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and relative merits of these different algorithms will be discussed and demonstrated.
Emy Dorfman, Luiza; Leite, Júlio César L; Giugliani, Roberto; Riegel, Mariluce
2015-01-01
To identify chromosomal imbalances by whole-genome microarray-based comparative genomic hybridization (array-CGH) in DNA samples of neonates with congenital anomalies of unknown cause from a birth defects monitoring program at a public maternity hospital. A blind genomic analysis was performed retrospectively in 35 stored DNA samples of neonates born between July of 2011 and December of 2012. All potential DNA copy number variations detected (CNVs) were matched with those reported in public genomic databases, and their clinical significance was evaluated. Out of a total of 35 samples tested, 13 genomic imbalances were detected in 12/35 cases (34.3%). In 4/35 cases (11.4%), chromosomal imbalances could be defined as pathogenic; in 5/35 (14.3%) cases, DNA CNVs of uncertain clinical significance were identified; and in 4/35 cases (11.4%), normal variants were detected. Among the four cases with results considered causally related to the clinical findings, two of the four (50%) showed causative alterations already associated with well-defined microdeletion syndromes. In two of the four samples (50%), the chromosomal imbalances found, although predicted as pathogenic, had not been previously associated with recognized clinical entities. Array-CGH analysis allowed for a higher rate of detection of chromosomal anomalies, and this determination is especially valuable in neonates with congenital anomalies of unknown etiology, or in cases in which karyotype results cannot be obtained. Moreover, although the interpretation of the results must be refined, this method is a robust and precise tool that can be used in the first-line investigation of congenital anomalies, and should be considered for prospective/retrospective analyses of DNA samples by birth defect monitoring programs. Copyright © 2014 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
Roberts, T; Mugford, M; Piercy, J
1998-09-01
To compare the cost effectiveness of different programmes of routine antenatal ultrasound screening to detect four key fetal anomalies: serious cardiac anomalies, spina bifida, Down's syndrome and lethal anomalies, using existing evidence. Decision analysis was used, based on the best data currently available, including expert opinion from the Royal College of Obstetricians and Gynaecologists Working Party and secondary data from the literature, to predict the likely outcomes in terms of malformations detected by each screening programme. Results are applicable in clinics, hospitals or GP practices delivering antenatal screening. The main outcome measure was the number of cases with a 'target' malformation correctly detected antenatally. There was substantial overlap between the cost ranges of each screening programme, demonstrating considerable uncertainty about the relative economic efficiency of alternative programmes for ultrasound screening. The cheapest, but not the most effective, screening programme consisted of one second-trimester ultrasound scan. The cost per target anomaly detected (cost effectiveness) for this programme was in the range £5000-£109,000, but in any 1000 women it will also fail to detect between 3.6 and 4.7 target anomalies. The range of uncertainty in the costs did not allow selection of any one programme as a clear choice for NHS purchasers. The results suggested that the overall allocation of resources for routine ultrasound screening in the UK is not currently economically efficient, but that certain scenarios for ultrasound screening are potentially within the range of cost effectiveness reached by other, possibly competing, screening programmes. The model highlighted the weakness of available evidence and demonstrated the need for more information both about current practice and costs.
NASA Technical Reports Server (NTRS)
Bowin, C.
1982-01-01
A negative free-air gravity anomaly which occurs in the central part of the Philippine Sea was examined to determine the distribution and nature of possible regional mass excesses or deficiencies. Geoid anomalies from GEOS-3 observations were positive. A negative residual geoid anomaly consistent with the area of negative free-air gravity anomalies was found. Theoretical gravity-topography and geoid-topography admittance functions indicated that high-density mantle at about 60 km depth could account for the magnitudes of the gravity and residual geoid anomalies and the 1 km residual water depth anomaly in the Philippine Sea. The negative residual depth anomaly may be compensated for by excess density in the uppermost mantle, but the residual geoid and regional free-air gravity anomalies and a slow surface wave velocity structure might result from low-density warm upper mantle material lying beneath the zone of high-density uppermost mantle. From a horizontal disk approximation, the depth of the low-density warm mantle was estimated to be on the order of 200 km.
Change point detection of the Persian Gulf sea surface temperature
NASA Astrophysics Data System (ADS)
Shirvani, A.
2017-01-01
In this study, the Student's t parametric and Mann-Whitney nonparametric change point models (CPMs) were applied to detect change point in the annual Persian Gulf sea surface temperature anomalies (PGSSTA) time series for the period 1951-2013. The PGSSTA time series, which were serially correlated, were transformed to produce an uncorrelated pre-whitened time series. The pre-whitened PGSSTA time series were utilized as the input file of change point models. Both the applied parametric and nonparametric CPMs estimated the change point in the PGSSTA in 1992. The PGSSTA follow the normal distribution up to 1992 and thereafter, but with a different mean value after year 1992. The estimated slope of linear trend in PGSSTA time series for the period 1951-1992 was negative; however, that was positive after the detected change point. Unlike the PGSSTA, the applied CPMs suggested no change point in the Niño3.4SSTA time series.
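An illustrative sketch of the nonparametric variant (the lag-1 pre-whitening step and the exhaustive Mann-Whitney scan below are generic choices and not necessarily the CPM implementation used in the study; the synthetic series only loosely mimics a 63-year anomaly record with a shift two-thirds of the way through):

    # Sketch: pre-whiten an annual anomaly series, then scan for a single change
    # point with the Mann-Whitney U test at every candidate split.
    import numpy as np
    from scipy.stats import mannwhitneyu

    def prewhiten(x):
        """Remove lag-1 autocorrelation: w[t] = x[t] - r1 * x[t-1]."""
        r1 = np.corrcoef(x[:-1], x[1:])[0, 1]
        return x[1:] - r1 * x[:-1]

    def scan_change_point(x, min_seg=5):
        best = (None, 1.0)
        for k in range(min_seg, len(x) - min_seg):
            p = mannwhitneyu(x[:k], x[k:], alternative="two-sided").pvalue
            if p < best[1]:
                best = (k, p)
        return best

    # Synthetic series loosely mimicking annual anomalies with a shift after year 42.
    rng = np.random.default_rng(5)
    series = np.concatenate([rng.normal(-0.2, 0.3, 42), rng.normal(0.3, 0.3, 21)])
    k, p = scan_change_point(prewhiten(series))
    print(f"change point after pre-whitened index {k}, p = {p:.3g}")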
Using new edges for anomaly detection in computer networks
Neil, Joshua Charles
2017-07-04
Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data of a frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.
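A rough sketch of the scoring idea, not the patented method itself: each source node gets a smoothed historical probability of initiating a previously unseen edge per window, observed new and existing edges along a path are converted to a combined negative-log-probability score, and in practice the alarm threshold would come from the empirical distribution of such scores over historical traffic, as the abstract describes. The node histories, smoothing constants and example paths below are hypothetical.

    # Sketch: score a path of edges by how surprising its new-edge creations are,
    # given each source node's historical rate of creating previously unseen edges.
    import numpy as np

    def new_edge_prob(history, node, alpha=1.0, beta=10.0):
        """Smoothed P(node initiates a new edge in a window); alpha/beta illustrative."""
        new, windows = history.get(node, (0, 0))
        return (new + alpha) / (windows + alpha + beta)

    def path_score(path_edges, known_edges, history):
        """Negative log-probability of the observed new/existing edges along a path."""
        score = 0.0
        for src, dst in path_edges:
            p_new = new_edge_prob(history, src)
            p = p_new if (src, dst) not in known_edges else 1.0 - p_new
            score += -np.log(p)
        return score

    # Hypothetical history: (windows in which a new edge was created, windows observed).
    history = {"A": (1, 100), "B": (0, 100), "C": (2, 100)}
    known_edges = {("A", "B"), ("B", "C")}
    benign_path = [("A", "B"), ("B", "C")]
    suspect_path = [("A", "B"), ("B", "X"), ("X", "C")]   # two brand-new edges
    print("benign score :", round(path_score(benign_path, known_edges, history), 2))
    print("suspect score:", round(path_score(suspect_path, known_edges, history), 2))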
Using new edges for anomaly detection in computer networks
Neil, Joshua Charles
2015-05-19
Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data of a frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.
Temporal Methods to Detect Content-Based Anomalies in Social Media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skryzalin, Jacek; Field, Jr., Richard; Fisher, Andrew N.
Here, we develop a method for time-dependent topic tracking and meme trending in social media. Our objective is to identify time periods whose content differs significantly from normal, and we utilize two techniques to do so. The first is an information-theoretic analysis of the distributions of terms emitted during different periods of time. In the second, we cluster documents from each time period and analyze the tightness of each clustering. We also discuss a method of combining the scores created by each technique, and we provide ample empirical analysis of our methodology on various Twitter datasets.
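A small sketch of the information-theoretic step, assuming tokenized documents bucketed by time period and using Jensen-Shannon divergence as one reasonable distance between term distributions (the paper may use a different measure): periods whose term distribution diverges strongly from the pooled background are candidates for anomalous content.

    # Sketch: compare each time period's term distribution with a background
    # distribution using Jensen-Shannon divergence; large values suggest anomalies.
    from collections import Counter
    import numpy as np

    def term_distribution(tokens, vocab):
        counts = Counter(tokens)
        vec = np.array([counts[w] for w in vocab], dtype=float) + 1e-9  # smoothing
        return vec / vec.sum()

    def js_divergence(p, q):
        m = 0.5 * (p + q)
        kl = lambda a, b: np.sum(a * np.log2(a / b))
        return 0.5 * kl(p, m) + 0.5 * kl(q, m)

    periods = {
        "t1": "storm rain flood weather rain storm".split(),
        "t2": "rain weather storm flood storm rain".split(),
        "t3": "election vote poll vote election vote".split(),   # topical shift
    }
    vocab = sorted({w for toks in periods.values() for w in toks})
    background = term_distribution(sum(periods.values(), []), vocab)
    for name, toks in periods.items():
        d = js_divergence(term_distribution(toks, vocab), background)
        print(name, round(d, 3))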
Visual analytics of anomaly detection in large data streams
NASA Astrophysics Data System (ADS)
Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.; Sharma, Ratnesh K.; Mehta, Abhay
2009-01-01
Most data streams are multi-dimensional, high-speed, and contain massive volumes of continuous information. They are seen in daily applications, such as telephone calls, retail sales, data center performance, and oil production operations. Many analysts want insight into the behavior of this data. They want to catch exceptions in flight to reveal the causes of the anomalies and to take immediate action. To guide the user in finding anomalies in a large data stream quickly, we derive a new automated neighborhood threshold marking technique, called AnomalyMarker. This technique is built on cell-based data streams and user-defined thresholds. We extend the scope of the data points around the threshold to include the surrounding areas. The idea is to define a focus area (marked area) which enables users to (1) visually group the interesting data points related to the anomalies (i.e., problems that occur persistently or occasionally) to observe their behavior; (2) discover the factors related to the anomaly by visualizing the correlations between the problem attribute and the attributes of the nearby data items from the entire multi-dimensional data stream. Mining results are quickly presented in graphical representations (i.e., tooltips) for the user to zoom into the problem regions. Different algorithms are introduced which try to optimize the size and extent of the anomaly markers. We have successfully applied this technique to detect data stream anomalies in large real-world enterprise server performance and data center energy management.
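A toy sketch of the neighborhood-marking idea on a one-dimensional cell-based stream (the tolerance band and neighbor radius are illustrative; the actual technique operates on multi-dimensional streams and drives a visualization): cells over the user threshold are violations, and cells within a tolerance band or adjacent in time are added to the marked focus area so their behavior can be inspected together.

    # Toy sketch: mark threshold violations in a cell-based stream plus a
    # surrounding "focus area" of nearby cells (tolerance band + time neighbors).
    import numpy as np

    def anomaly_marker(values, threshold, band=0.1, radius=2):
        values = np.asarray(values, dtype=float)
        core = values > threshold                          # hard violations
        marked = values > threshold * (1.0 - band)         # within tolerance band
        for i in np.flatnonzero(core):                     # widen around violations
            marked[max(0, i - radius):i + radius + 1] = True
        return core, marked

    vals = [3, 4, 5, 4, 9, 4, 3, 8.6, 4, 3]
    core, marked = anomaly_marker(vals, threshold=8.0)
    print("violations :", list(np.flatnonzero(core)))
    print("focus area :", list(np.flatnonzero(marked)))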
On Curating Multimodal Sensory Data for Health and Wellness Platforms
Amin, Muhammad Bilal; Banos, Oresti; Khan, Wajahat Ali; Muhammad Bilal, Hafiz Syed; Gong, Jinhyuk; Bui, Dinh-Mao; Cho, Soung Ho; Hussain, Shujaat; Ali, Taqdir; Akhtar, Usman; Chung, Tae Choong; Lee, Sungyoung
2016-01-01
In recent years, the focus of healthcare and wellness technologies has shown a significant shift towards personal vital signs devices. The technology has evolved from smartphone-based wellness applications to fitness bands and smartwatches. The novelty of these devices is the accumulation of activity data as their users go about their daily life routine. However, these implementations are device specific and lack the ability to incorporate multimodal data sources. Data accumulated in their usage does not offer rich contextual information that is adequate for providing a holistic view of a user’s lifelog. As a result, making decisions and generating recommendations based on this data are single dimensional. In this paper, we present our Data Curation Framework (DCF) which is device independent and accumulates a user’s sensory data from multimodal data sources in real time. DCF curates the context of this accumulated data over the user’s lifelog. DCF provides rule-based anomaly detection over this context-rich lifelog in real time. To provide computation and persistence over the large volume of sensory data, DCF utilizes the distributed and ubiquitous environment of the cloud platform. DCF has been evaluated for its performance, correctness, ability to detect complex anomalies, and management support for a large volume of sensory data. PMID:27355955
NASA Astrophysics Data System (ADS)
Koeppen, W. C.; Wright, R.; Pilger, E.
2009-12-01
We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT) to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, and it detects pixels that fall outside the envelope of normal thermal behavior, (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than using MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for the MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. We also found that MODVOLC2 achieved good results on multiple sensors (MODIS and GOES), which provides confidence that MODVOLC2 can be run on future instruments regardless of their spatial and temporal resolutions. The improved performance of MODVOLC2 over MODVOLC makes possible the detection of lower temperature thermal anomalies that will be useful in improving our ability to document Earth’s volcanic eruptions as well as detect possible low temperature thermal precursors to larger eruptions.
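A compact sketch of steps 2 and 3 of the algorithm as described above, on synthetic data (the image stack, the 3-sigma envelope and the single-band treatment are illustrative simplifications; the real algorithm also applies the original MODVOLC screening first and works per calendar month on multispectral radiance):

    # Sketch of the reference/variability stage: per-pixel statistics from a stack
    # of thermal images for one calendar month, then flagging of a new scene.
    import numpy as np

    rng = np.random.default_rng(6)
    n_scenes, ny, nx = 90, 50, 50                      # one calendar month's stack
    stack = rng.normal(290.0, 1.5, size=(n_scenes, ny, nx))   # brightness temps (K)

    reference = stack.mean(axis=0)                     # per-pixel reference image
    variability = stack.std(axis=0)                    # per-pixel variability image

    new_scene = rng.normal(290.0, 1.5, size=(ny, nx))
    new_scene[20:23, 30:33] += 12.0                    # simulated hot anomaly

    z = (new_scene - reference) / variability
    detected = np.argwhere(z > 3.0)                    # outside the normal envelope
    print(f"{len(detected)} anomalous pixels, e.g. {detected[:3].tolist()}")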
Methods for Finding Legacy Wells in Residential and Commercial Areas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammack, Richard W.; Veloski, Garret A.
In 1919, the enthusiasm surrounding a short-lived gas play in Versailles Borough, Pennsylvania resulted in the drilling of many needless wells. The legacy of this activity exists today in the form of abandoned, unplugged gas wells that are a continuing source of fugitive methane in the midst of a residential and commercial area. Flammable concentrations of methane have been detected near building foundations, which have forced people from their homes and businesses until methane concentrations decreased. Despite mitigation efforts, methane problems persist and have caused some buildings to be permanently abandoned and demolished. This paper describes the use of magnetic and methane sensing methods by the National Energy Technology Laboratory (NETL) to locate abandoned gas wells in Versailles Borough where site access is limited and existing infrastructure can interfere. Here, wells are located between closely spaced houses and beneath buildings and parking lots. Wells are seldom visible, often because wellheads and internal casing strings have been removed, and external casing has been cut off below ground level. The magnetic survey of Versailles Borough identified 53 strong, monopole magnetic anomalies that are presumed to indicate the locations of steel-cased wells. This hypothesis was tested by excavating the location of one strong, monopole magnetic anomaly that was within an area of anomalous methane concentrations. The excavation uncovered an unplugged gas well that was within 0.2 m of the location of the maximum magnetic signal. Truck-mounted methane surveys of Versailles Borough detected numerous methane anomalies that were useful for narrowing search areas. Methane sources identified during truck-mounted surveys included strong methane sources such as sewers and methane mitigation vents. However, inconsistent wind direction and speed, especially between buildings, made locating weaker methane sources (such as leaking wells) difficult. Walking surveys with the methane detector mounted on a cart or wagon were more effective for detecting leaking wells because the instrument’s air inlet was near the ground where: 1) the methane concentration from subsurface sources (including wells) was a maximum, and 2) there was less displacement of methane anomalies from methane sources by air currents. The Versailles Borough survey found 15 methane anomalies that coincided with the location of well-type magnetic anomalies; the methane sources for these anomalies were assumed to be leaking wells. For abandoned well locations where the wellhead and all casing strings have been removed and there is no magnetic anomaly, leaking wellbores can sometimes be detected by methane surveys. Unlike magnetic anomalies, methane anomalies can be: 1) ephemeral, 2) significantly displaced from the well location, and 3) from non-well sources that cannot be discriminated without isotopic analysis. If methane surveys are used for well location, the air inlet to the instrument should be kept as close to the ground as possible to minimize the likelihood of detecting methane from distant, wind-blown sources.
Neri, Marco; Giammanco, Salvatore; Ferrera, Elisabetta; Patanè, Giuseppe; Zanon, Vittorio
2011-09-01
This study concerns measurements of radon and thoron emissions from soil carried out in 2004 on the eastern flank of Mt. Etna, in a zone characterized by the presence of numerous seismogenic and aseismic faults. The statistical treatment of the geochemical data allowed the recognition of anomaly thresholds for both parameters and the production of distribution maps that highlighted a significant spatial correlation between soil gas anomalies and tectonic lineaments. The seismic activity occurring in and around the study area during 2004 was analyzed, producing maps of hypocentral depth and released seismic energy. Both radon and thoron anomalies were located in areas affected by relatively deep (5-10 km depth) seismic activity, while a less evident correlation was found between soil gas anomalies and the released seismic energy. This study confirms that mapping the distribution of radon and thoron in soil gas can reveal hidden faults buried by recent soil cover or faults that are not clearly visible at the surface. The correlation between soil gas data and earthquake depth and intensity can give some hints on the source of the gas and/or on fault dynamics. Copyright © 2011 Elsevier Ltd. All rights reserved.
Gravity anomaly and crustal structure characteristics in North-South Seismic Belt of China
NASA Astrophysics Data System (ADS)
Shen, Chongyang; Xuan, Songtbai; Yang, Guangliang; Wu, Guiju
2017-04-01
The North-South Seismic Belt (NSSB) is the boundary of a binary system formed by the subduction push of the Indian plate to the west and the rising of the western Pacific asthenosphere to the east. It is one of the three major seismic belts of China (Tianshan, Taiwan and the NSSB) and is mainly located between E102° and E107°. It is mainly composed of topographic gradient zones, faults, Cenozoic basins and strong earthquake zones, which form two distinct domains of tectonic and physical features in the west and east. Geophysical studies of the deep tectonic setting of the NSSB show that it is not only a gravity anomaly gradient zone but also a belt in which crustal thickness increases sharply and abruptly toward the west. Seismic tomography results show that the anomaly zone extends to depths of hundreds of kilometers in the NSSB, and that the composition and structure of the crust there are more complex. We deployed multiple gravity and GNSS synchronous detection profiles in the NSSB; these profiles crossed the main fault structures and yielded data at thousands of points. In this research, source analysis, density structure inversion, residual gravity correlation imaging and normalized full gradient methods were used to analyze the gravity field, density and their structural features at different positions, and finally to obtain the crustal density structure along the sections and the differences in deep structure. The results show that the Bouguer gravity anomaly is similar to existing large-scale results. The Bouguer anomaly rises significantly from west to east, and its trend coincides well with the trend of Moho depth, which agrees with material flowing toward the periphery of the Tibetan plateau. The marked variations of the residual anomaly are related to the boundaries of structures or main tectonic units, and are also connected with the degree to which the Eurasian plate blocks the surrounding material migration. The density structure of the gravity profiles mainly reflects the basic framework of the regional crustal structure. The crust basically presents a three-layer structure, nearly horizontally distributed, with obvious undulation of the Moho, which is consistent with the results of seismic sounding and seismic array detection. Locally, lower-density layers are zonally distributed in the crust, which accelerates lateral movement in the upper and middle crust; as the material of the Tibetan plateau spreads outward, the integrity of the upper and middle crust is good and they basically move as a coupled unit; in the lower crust, the thickness of the Tibetan plateau thins gradually outward, and there is a decoupling phenomenon between crust and mantle. The results of the gravity and crustal density structure show that the research area can be divided into several parts such as the Qinghai-Tibet Plateau, the Sichuan-Yunnan block, the Ordos block and the Alxa block; the transitional zones between the Qinghai-Tibet Plateau and the Sichuan basin, and between Alxa and Ordos, are complex, with a steeper Moho slope, and these are the parts with strong tectonic activity where strong earthquakes occur easily. The research is of great significance for studying the deep crustal structure, geodynamic evolution process and earthquake-gestation environment of the NSSB region.
Liu, Datong; Peng, Yu; Peng, Xiyuan
2018-01-01
Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide uncertainty presentation, the probability prediction methods (e.g., Gaussian process regression (GPR) and relevance vector machine (RVM)) are especially adaptable to perform anomaly detection for sensing series. Generally, one key parameter of prediction models is coverage probability (CP), which controls the judging threshold of the testing sample and is generally set to a default value (e.g., 90% or 95%). There are few criteria to determine the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator of the receiver operating characteristic curve of prediction interval (ROC-PI) based on the definition of the ROC curve which can depict the trade-off between the PI width and PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs, by the minimization of which the optimal CP is derived by the simulated annealing (SA) algorithm. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. Especially, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application. PMID:29587372
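A simplified sketch of the CP-selection idea, using a Gaussian predictive model, a plain Youden index and a grid sweep instead of the authors' modified index and simulated-annealing search (all data, labels and parameter values below are synthetic assumptions): each candidate coverage probability defines a prediction interval, detections are scored against labels, and the CP with the best sensitivity/specificity trade-off is kept.

    # Sketch: choose the coverage probability (CP) of a Gaussian prediction interval
    # by sweeping candidates and maximizing the Youden index (sensitivity+specificity-1).
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(7)
    n = 500
    mu = np.zeros(n)                 # predictive mean from some model (assumed)
    sigma = np.ones(n)               # predictive std from the same model
    y = rng.normal(mu, sigma)
    labels = np.zeros(n, dtype=bool)
    labels[::25] = True              # inject anomalies every 25th sample
    y[labels] += rng.choice([-1, 1], labels.sum()) * 4.0

    best = (None, -np.inf)
    for cp in np.arange(0.80, 0.999, 0.005):
        half_width = norm.ppf(0.5 + cp / 2.0) * sigma
        flagged = np.abs(y - mu) > half_width          # outside the CP interval
        tpr = flagged[labels].mean()
        fpr = flagged[~labels].mean()
        youden = tpr - fpr
        if youden > best[1]:
            best = (cp, youden)
    print(f"selected CP = {best[0]:.3f}, Youden index = {best[1]:.3f}")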
The nature of subslab slow velocity anomalies beneath South America
NASA Astrophysics Data System (ADS)
Portner, Daniel Evan; Beck, Susan; Zandt, George; Scire, Alissa
2017-05-01
Slow seismic velocity anomalies are commonly imaged beneath subducting slabs in tomographic studies, yet a unifying explanation for their distribution has not been agreed upon. In South America two such anomalies have been imaged associated with subduction of the Nazca Ridge in Peru and the Juan Fernández Ridge in Chile. Here we present new seismic images of the subslab slow velocity anomaly beneath Chile, which give a unique view of the nature of such anomalies. Slow seismic velocities within a large hole in the subducted Nazca slab connect with a subslab slow anomaly that appears correlated with the extent of the subducted Juan Fernández Ridge. The hole in the slab may allow the subslab material to rise into the mantle wedge, revealing the positive buoyancy of the slow material. We propose a new model for subslab slow velocity anomalies beneath the Nazca slab related to the entrainment of hot spot material.
Accurate mobile malware detection and classification in the cloud.
Wang, Xiaolei; Yang, Yuexiang; Zeng, Yingzhi
2015-01-01
As the dominant player in the smartphone operating system market, Android has attracted the attention of malware authors and researchers alike. The number of types of Android malware is increasing rapidly despite the considerable number of proposed malware analysis systems. In this paper, by taking advantage of the low false-positive rate of misuse detection and the ability of anomaly detection to detect zero-day malware, we propose a novel hybrid detection system based on a new open-source framework, CuckooDroid, which enables the use of Cuckoo Sandbox's features to analyze Android malware through dynamic and static analysis. Our proposed system mainly consists of two parts: an anomaly detection engine performing abnormal app detection through dynamic analysis, and a signature detection engine performing known malware detection and classification through a combination of static and dynamic analysis. We evaluate our system using 5560 malware samples and 6000 benign samples. Experiments show that our anomaly detection engine with dynamic analysis is capable of detecting zero-day malware with a low false negative rate (1.16%) and an acceptable false positive rate (1.30%); it is worth noting that our signature detection engine with hybrid analysis can accurately classify malware samples with an average positive rate of 98.94%. Considering the intensive computing resources required by static and dynamic analysis, our proposed detection system should be deployed off-device, such as in the cloud. App store markets and ordinary users can access our detection system for malware detection through a cloud service.
Utilization of humus-rich forest soil (mull) in geochemical exploration for gold
Curtin, Gary C.; Lakin, H.W.; Neuerburg, G.J.; Hubert, A.E.
1968-01-01
Distribution of gold in humus-rich forest soil (mull) reflects the known distribution of gold deposits in bedrock in the Empire district, Colorado. Gold from the bedrock is accumulated by pine and aspen trees and is concentrated in the mull by the decay of organic litter from the trees. Anomalies in mull which do not coincide with known gold deposits merit further exploration. The gold anomalies in soil (6- to 12-inch depth) and in float pebbles and cobbles poorly reflect the known distribution of gold deposits in bedrock beneath the extensive cover of colluvium and glacial drift.
Dental and oral anomalies in incontinentia pigmenti: a systematic review.
Minić, Snežana; Trpinac, Dušan; Gabriel, Heinz; Gencik, Martin; Obradović, Miljana
2013-01-01
Incontinentia pigmenti (IP) is an X-linked genodermatosis caused by a mutation of the IKBKG gene. The objective of this study was to present a systematic review of the dental and oral types of anomalies, to determine the total number and sex distribution of the anomalies, and to analyze possible therapies. We analyzed the literature data from 1,286 IP cases from the period 1993-2010. Dental and/or oral anomalies were diagnosed for 54.38% of the investigated IP patients. Most of the anomaly types were dental, and the most frequent of these were dental shape anomalies, hypodontia, and delayed dentition. The most frequent oral anomaly types were cleft palate and high arched palate. IKBKG exon 4-10 deletion was present in 86.36% of genetically confirmed IP patients. According to the frequency, dental and/or oral anomalies represent the most frequent and important IP minor criteria. The most frequent mutation was IKBKG exon 4-10 deletion. The majority of dental anomalies and some of the oral anomalies could be corrected. Because of the presence of cleft palate and high arched palate in IP patients, these two anomalies may be considered as diagnostic IP minor criteria as well.
Fiber Optic Bragg Grating Sensors for Thermographic Detection of Subsurface Anomalies
NASA Technical Reports Server (NTRS)
Allison, Sidney G.; Winfree, William P.; Wu, Meng-Chou
2009-01-01
Conventional thermography with an infrared imager has been shown to be an extremely viable technique for nondestructively detecting subsurface anomalies such as thickness variations due to corrosion. A recently developed technique using fiber optic sensors to measure temperature holds potential for performing similar inspections without requiring an infrared imager. The structure is heated using a heat source such as a quartz lamp with fiber Bragg grating (FBG) sensors at the surface of the structure to detect temperature. Investigated structures include a stainless steel plate with thickness variations simulated by small platelets attached to the back side using thermal grease. A relationship is shown between the FBG sensor thermal response and variations in material thickness. For comparison, finite element modeling was performed and found to agree closely with the fiber optic thermography results. This technique shows potential for applications where FBG sensors are already bonded to structures for Integrated Vehicle Health Monitoring (IVHM) strain measurements and can serve dual-use by also performing thermographic detection of subsurface anomalies.
Detailed gravity anomalies from Geos 3 satellite altimetry data
NASA Technical Reports Server (NTRS)
Gopalapillai, G. S.; Mourad, A. G.
1979-01-01
Detailed gravity anomalies are computed from a combination of Geos 3 satellite altimeter and terrestrial gravity data using least-squares principles. The mathematical model used is based on Stokes' equation modified for a nonglobal solution. Using Geos 3 data in the calibration area, the effects of several anomaly parameter configurations and data densities/distributions on the anomalies and their accuracy estimates are studied. The accuracy estimates for 1 deg x 1 deg mean anomalies from low-density altimetry data are of the order of 4 mgal. Comparison of these anomalies with the terrestrial data and also with Rapp's data derived using collocation techniques shows rms differences of 7.2 and 4.9 mgal, respectively. Indications are that the anomaly accuracies can be improved to about 2 mgal with high-density data. Estimation of 30 arcmin x 30 arcmin mean anomalies indicates accuracies of the order of 5 mgal. Proper verification of these results will be possible only when accurate ground truth data become available.
The incidence of coronary anomalies on routine coronary computed tomography scans
Karabay, Kanber Ocal; Yildiz, Abdulmelik; Bagirtan, Bayram; Geceer, Gurkan; Uysal, Ender
2013-01-01
Summary Objective This study aimed to assess the incidence of coronary anomalies using 64-multi-slice coronary computed tomography (MSCT). Methods The diagnostic MSCT scans of 745 consecutive patients were reviewed. Results The incidence of coronary anomalies was 4.96%. The detected coronary anomalies included the conus artery originating separately from the right coronary sinus (RCS) (n = 8, 1.07%), absence of the left main artery (n = 7, 0.93%), a superior right coronary artery (RCA) (n = 7, 0.93%), the circumflex artery (CFX) arising from the RCS (n = 4, 0.53%), the CFX originating from the RCA (n = 2, 0.26%), a posterior RCA (n = 1, 0.13%), a coronary fistula from the left anterior descending artery and RCA to the pulmonary artery (n = 1, 0.13%), and a coronary aneurysm (n = 1, 0.13%). Conclusions This study indicated that MSCT can be used to detect common coronary anomalies, and shows it has the potential to aid cardiologists and cardiac surgeons by revealing the origin and course of the coronary vessels. PMID:24042853
The Compact Environmental Anomaly Sensor (CEASE) III
NASA Astrophysics Data System (ADS)
Roddy, P.; Hilmer, R. V.; Ballenthin, J.; Lindstrom, C. D.; Barton, D. A.; Ignazio, J. M.; Coombs, J. M.; Johnston, W. R.; Wheelock, A. T.; Quigley, S.
2016-12-01
The Air Force Research Laboratory's Energetic Charged Particle (ECP) sensor project is a comprehensive effort to measure the charged particle environment that causes satellite anomalies. The project includes the Compact Environmental Anomaly Sensor (CEASE) III, building on the flight heritage of prior CEASE designs. CEASE III consists of multiple sensor modules. High energy particles are observed using independent unique silicon detector stacks. In addition CEASE III includes an electrostatic analyzer (ESA) assembly which uses charge multiplication for particle detection. The sensors cover a wide range of proton and electron energies that contribute to satellite anomalies.
A Comparative Study of Unsupervised Anomaly Detection Techniques Using Honeypot Data
NASA Astrophysics Data System (ADS)
Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Inoue, Daisuke; Eto, Masashi; Nakao, Koji
Intrusion Detection Systems (IDS) have received considerable attention among network security researchers as one of the most promising countermeasures to defend our crucial computer systems or networks against attackers on the Internet. Over the past few years, many machine learning techniques have been applied to IDSs so as to improve their performance and to construct them with low cost and effort. In particular, unsupervised anomaly detection techniques have a significant advantage in their capability to identify unforeseen attacks, i.e., 0-day attacks, and to build intrusion detection models without any labeled (i.e., pre-classified) training data in an automated manner. In this paper, we conduct a set of experiments to evaluate and analyze the performance of the major unsupervised anomaly detection techniques using real traffic data obtained at our honeypots deployed inside and outside the campus network of Kyoto University, and using various evaluation criteria, i.e., performance evaluation by similarity measurements and the size of training data, overall performance, detection ability for unknown attacks, and time complexity. Our experimental results give some practical and useful guidelines to IDS researchers and operators, so that they can acquire insight into applying these techniques to the area of intrusion detection and devise more effective intrusion detection models.
A new method of real-time detection of changes in periodic data stream
NASA Astrophysics Data System (ADS)
Lyu, Chen; Lu, Guoliang; Cheng, Bin; Zheng, Xiangwei
2017-07-01
Change point detection in periodic time series is highly desirable in many practical applications. We present a novel algorithm for this task, which includes two phases: (1) anomaly measurement: on the basis of a typical regression model, we propose a new computation method to measure anomalies in time series that does not require any reference data from other measurements; (2) change detection: we introduce a new martingale test for detection that can be operated in an unsupervised and nonparametric way. We have conducted extensive experiments to systematically test our algorithm. The results make us believe that our algorithm can be directly applied in many real-world change-point-detection applications.
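As a rough illustration of the second phase, the sketch below implements a randomized power martingale over a stream of anomaly ("strangeness") scores, declaring a change when the martingale exceeds a threshold; the strangeness input could be, for example, the absolute residual of a periodic regression fit, echoing the first phase. The parameter values and exact martingale form are assumptions, not necessarily the authors' formulation.

```python
# Hedged sketch of a randomized-power-martingale change test over anomaly
# ("strangeness") scores; eps and threshold are assumed values.
import numpy as np

def martingale_change_test(strangeness, eps=0.92, threshold=20.0, rng=None):
    rng = rng or np.random.default_rng(0)
    log_m, history = 0.0, []
    for n, s in enumerate(strangeness, start=1):
        history.append(s)
        theta = rng.uniform()
        # p-value of the newest score relative to all scores seen so far
        greater = sum(1 for x in history if x > s)
        equal = sum(1 for x in history if x == s)
        p = (greater + theta * equal) / n
        log_m += np.log(eps) + (eps - 1.0) * np.log(max(p, 1e-12))
        if np.exp(log_m) > threshold:        # change point declared at sample n
            return n
    return None                              # no change detected
```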
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ali, Saqib; Wang, Guojun; Cottrell, Roger Leslie
2018-05-28
PingER (Ping End-to-End Reporting) is a worldwide end-to-end Internet performance measurement framework. It was developed by the SLAC National Accelerator Laboratory, Stanford, USA, and has been running for the last 20 years. It has more than 700 monitoring agents and remote sites which monitor the performance of Internet links in around 170 countries of the world. At present, the size of the compressed PingER data set is about 60 GB, comprising about 100,000 flat files. The data are publicly available for valuable Internet performance analyses. However, the data sets suffer from missing values and anomalies due to congestion, bottleneck links, queuing overflow, network software misconfiguration, hardware failure, cable cuts, and social upheavals. Therefore, the objective of this paper is to detect such performance drops or spikes, labeled as anomalies or outliers, in the PingER data set. In the proposed approach, the raw text files of the data set are transformed into a PingER dimensional model. The missing values are imputed using the k-NN algorithm. The data are partitioned into similar instances using the k-means clustering algorithm. Afterward, clustering is integrated with the Local Outlier Factor (LOF) using the Cluster-Based Local Outlier Factor (CBLOF) algorithm to detect the anomalies or outliers in the PingER data. Lastly, anomalies are further analyzed to identify the time frames and the locations of the hosts generating the major percentage of the anomalies in the PingER data set, which ranges from 1998 to 2016.
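A hedged sketch of the pipeline described above (k-NN imputation, k-means partitioning, then CBLOF-style scoring) is given below; the cluster count, the large/small cluster split, and the scoring rule follow a common reading of CBLOF and are assumptions rather than the paper's exact configuration.

```python
# Hedged sketch: impute missing measurements, cluster, then assign CBLOF-style
# outlier scores. Parameters (k, n_clusters, alpha) are illustrative assumptions.
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.cluster import KMeans

def cblof_scores(X, n_clusters=8, alpha=0.9, random_state=0):
    X = KNNImputer(n_neighbors=5).fit_transform(X)        # fill missing values
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=random_state).fit(X)
    labels, centers = km.labels_, km.cluster_centers_
    sizes = np.bincount(labels, minlength=n_clusters)

    # "Large" clusters are the biggest ones that together hold `alpha` of the points.
    order = np.argsort(sizes)[::-1]
    cum = np.cumsum(sizes[order]) / len(X)
    large = set(order[:np.searchsorted(cum, alpha) + 1])

    scores = np.empty(len(X))
    for i, (x, c) in enumerate(zip(X, labels)):
        if c in large:
            scores[i] = sizes[c] * np.linalg.norm(x - centers[c])
        else:
            # Small cluster: distance to the nearest large-cluster centroid.
            scores[i] = sizes[c] * min(np.linalg.norm(x - centers[j]) for j in large)
    return scores                                          # higher = more anomalous
```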
Manevich-Mazor, Mirra; Weissmann-Brenner, Alina; Bar Yosef, Omer; Hoffmann, Chen; Mazor, Roei David; Mosheva, Mariela; Achiron, Reuven Ryszard; Katorza, Eldad
2018-06-07
To evaluate the added value of fetal MRI to ultrasound in detecting and specifying callosal anomalies, and its impact on clinical decision making. Fetuses with a sonographic diagnosis of an anomalous corpus callosum (CC) who underwent a subsequent fetal brain MRI between 2010 and 2015 were retrospectively evaluated and classified according to the severity of the findings. The findings detected on ultrasound were compared to those detected on MRI. An analysis was performed to assess whether fetal MRI altered the group classification, and thus the management of these pregnancies. 78 women were recruited following sonographic diagnoses of either complete or partial callosal agenesis, short, thin or thick CC. Normal MRI studies were obtained in 19 cases (24 %). Among these, all children available for follow-up received an adequate adaptive score in their Vineland II adaptive behavior scale assessment. Analysis of the concordance between US and MRI demonstrated a substantial level of agreement for complete callosal agenesis (kappa: 0.742), moderate agreement for thin CC (kappa: 0.418) and fair agreement for all other callosal anomalies. Comparison between US and MRI-based mild/severe findings classifications revealed that MRI contributed to a change in the management for 28 fetuses (35.9 %), mostly (25 fetuses, 32.1 %) in favor of pregnancy preservation. Fetal MRI effectively detects callosal anomalies and enables satisfactory validation of the presence or absence of callosal anomalies identified by ultrasound and adds valuable data that improves clinical decision making. © Georg Thieme Verlag KG Stuttgart · New York.
Laganà, G; Venza, N; Borzabadi-Farahani, A; Fabi, F; Danesi, C; Cozza, P
2017-03-11
To analyze the prevalence and associations between dental anomalies detectable on panoramic radiographs in a sample of non-orthodontic growing subjects. For this cross-sectional study, digital panoramic radiographs of 5005 subjects were initially screened from a single radiographic center in Rome. Inclusion criteria were: subjects who were aged 8-12 years, Caucasian, and had good diagnostic quality radiographs. Syndromic subjects, those with craniofacial malformations, and orthodontic patients were excluded, which led to a sample of 4706 subjects [mean (SD) age = 9.6 (1.2) years, 2366 males and 2340 females]. The sample was subsequently divided into four subgroups (8, 9, 10, and 11-12 year-old groups). Two operators examined the panoramic radiographs for the presence of common dental anomalies. The prevalence and associations between dental anomalies were also investigated. The overall prevalence of dental anomalies was 20.9%. Approximately 17.9% showed only one anomaly, 2.7% two anomalies, while only 0.3% had more than two anomalies. The most frequent anomalies were displacement of the maxillary canine (7.5%), hypodontia (7.1%), impacted teeth (3.9%), tooth ankylosis (2.8%), and tooth transposition (1.4%). The lower right second premolar was the most frequently missing tooth; 3.7% had only one tooth agenesis, and 0.08% had six or more missing teeth (oligodontia). Mesiodens was the most common type of supernumerary tooth (0.66%). Two subjects had a taurodontic tooth (0.04%). Tooth transpositions and displacement of the maxillary canine were seen in 1.4 and 7.5%, respectively (approximately 69 and 58% were in the 8 and 9 year-old groups, respectively). Significant associations were detected between the different dental anomalies (P < .05). The results of our study revealed significant associations among different dental anomalies and provide further evidence to support common etiological factors.
Listening to Limericks: A Pupillometry Investigation of Perceivers’ Expectancy
Scheepers, Christoph; Mohr, Sibylle; Fischer, Martin H.; Roberts, Andrew M.
2013-01-01
What features of a poem make it captivating, and which cognitive mechanisms are sensitive to these features? We addressed these questions experimentally by measuring pupillary responses of 40 participants who listened to a series of Limericks. The Limericks ended with either a semantic, syntactic, rhyme or metric violation. Compared to a control condition without violations, only the rhyme violation condition induced a reliable pupillary response. An anomaly-rating study on the same stimuli showed that all violations were reliably detectable relative to the control condition, but the anomaly induced by rhyme violations was perceived as most severe. Together, our data suggest that rhyme violations in Limericks may induce an emotional response beyond mere anomaly detection. PMID:24086417
Jain, Shikha; Shetty, K Sadashiva; Jain, Shweta; Jain, Sachin; Prakash, A T; Agrawal, Mamta
2015-07-01
To assess the null hypothesis that there is no difference in the rate of dental development and the occurrence of selected developmental anomalies related to shape, number, structure, and position of teeth between subjects with impacted mandibular canines and those with normally erupted canines. Pretreatment records of 42 subjects diagnosed with mandibular canine impaction (impaction group: IG) were compared with those of 84 subjects serving as a control reference sample (control group: CG). Independent t-tests were used to compare mean dental ages between the groups. Intergroup differences in the distribution of subjects based on the rate of dental development and the occurrence of selected dental anomalies were assessed using χ(2) tests. Odds of late, normal, and early developers and various categories of developmental anomalies between the IG and the CG were evaluated in terms of odds ratios. Mean dental age for the IG was lower than that for the CG in general. Specifically, this was true for girls (P < .05). Differences in the distribution of the subjects based on the rate of dental development and the occurrence of positional anomalies also reached statistical significance (P < .05). The IG showed a higher frequency of late developers and positional anomalies compared with controls (odds ratios 3.00 and 2.82, respectively; P < .05). The null hypothesis was rejected. We identified a close association between female subjects in the IG and delayed dental development compared with female orthodontic patients. An increased frequency of positional developmental anomalies was also notable in the IG.
A Real-Time Earthquake Precursor Detection Technique Using TEC from a GPS Network
NASA Astrophysics Data System (ADS)
Alp Akyol, Ali; Arikan, Feza; Arikan, Orhan
2016-07-01
Anomalies have been observed in the ionospheric electron density distribution prior to strong earthquakes. However, most of the reported results were obtained retrospectively, by analysis after the earthquakes, so their implementation in practice is highly problematic. Recently, a novel earthquake precursor detection technique based on spatio-temporal analysis of Total Electron Content (TEC) data obtained from the Turkish National Permanent GPS Network (TNPGN) was developed by the IONOLAB group (www.ionolab.org). In the present study, the developed detection technique is implemented in a causal setup over the available data set, in a test phase that enables real-time implementation. The performance of the developed earthquake prediction technique is evaluated using 10-fold cross validation over the data obtained in 2011. Among the 23 earthquakes with magnitudes higher than 5, the developed technique can detect precursors of 14 earthquakes while producing 8 false alarms. This study is supported by TUBITAK 115E915 and Joint TUBITAK 114E092 and AS CR 14/001 projects.
SiC: An Agent Based Architecture for Preventing and Detecting Attacks to Ubiquitous Databases
NASA Astrophysics Data System (ADS)
Pinzón, Cristian; de Paz, Yanira; Bajo, Javier; Abraham, Ajith; Corchado, Juan M.
One of the main attacks on ubiquitous databases is the structured query language (SQL) injection attack, which causes severe damage both commercially and to users' confidence. This chapter proposes the SiC architecture as a solution to the SQL injection attack problem. This is a hierarchical distributed multiagent architecture, which involves an entirely new approach with respect to existing architectures for the prevention and detection of SQL injections. SiC incorporates a kind of intelligent agent, which integrates a case-based reasoning system. This agent, which is the core of the architecture, allows the application of detection techniques based on anomalies as well as those based on patterns, providing a great degree of autonomy, flexibility, robustness and dynamic scalability. The characteristics of the multiagent system allow the architecture to detect attacks from different types of devices, regardless of their physical location. The architecture has been tested on a medical database, guaranteeing safe access from various devices such as PDAs and notebook computers.
Toward Continuous GPS Carrier-Phase Time Transfer: Eliminating the Time Discontinuity at an Anomaly
Yao, Jian; Levine, Judah; Weiss, Marc
2015-01-01
The wide application of Global Positioning System (GPS) carrier-phase (CP) time transfer is limited by the problem of boundary discontinuity (BD). The discontinuity has two categories. One is "day boundary discontinuity," which has been studied extensively and can be solved by multiple methods [1–8]. The other category of discontinuity, called "anomaly boundary discontinuity (anomaly-BD)," comes from a GPS data anomaly. The anomaly can be a data gap (i.e., missing data), a GPS measurement error (i.e., bad data), or a cycle slip. An initial study of the anomaly-BD shows that we can fix the discontinuity if the anomaly lasts no more than 20 min, using a polynomial curve-fitting strategy to repair the anomaly [9]. However, sometimes the data anomaly lasts longer than 20 min, so a better curve-fitting strategy is needed. In addition, a cycle slip, as another type of data anomaly, can occur and lead to an anomaly-BD. To solve these problems, this paper proposes a new strategy, i.e., satellite-clock-aided curve fitting with cycle slip detection. Basically, this new strategy applies the satellite clock correction to the GPS data. After that, we do the polynomial curve fitting for the code and phase data, as before. Our study shows that the phase-data residual is only ~3 mm for all GPS satellites. The new strategy also detects and finds the number of cycle slips by searching for the minimum curve-fitting residual. Extensive examples show that this new strategy enables us to repair up to a 40-min GPS data anomaly, regardless of whether the anomaly is due to a data gap, a cycle slip, or a combination of the two. We also find that interference with the GPS signal, known as "jamming," can possibly lead to a time-transfer error, and that this new strategy can compensate for jamming outages. Thus, the new strategy can eliminate the impact of jamming on time transfer. As a whole, we greatly improve the robustness of GPS CP time transfer. PMID:26958451
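The curve-fitting idea can be illustrated with a small sketch: bridge the data anomaly by fitting a single polynomial to the pre- and post-anomaly phase data while searching over candidate integer cycle slips and keeping the one with the smallest fitting residual. The wavelength, polynomial degree, and search range below are placeholders, and the satellite-clock correction step described in the paper is omitted.

```python
# Hedged sketch: repair a phase-data gap by polynomial fitting while searching for
# the integer cycle slip that minimizes the fit residual. Units and parameters are
# illustrative assumptions, not the paper's values.
import numpy as np

def repair_gap(t_before, phase_before, t_after, phase_after,
               wavelength=0.19, degree=3, max_slip=50):
    best = None
    for slip in range(-max_slip, max_slip + 1):            # candidate integer cycle slips
        corrected_after = phase_after + slip * wavelength  # shift post-gap segment
        t = np.concatenate([t_before, t_after])
        y = np.concatenate([phase_before, corrected_after])
        coeffs = np.polyfit(t, y, degree)
        residual = np.sqrt(np.mean((np.polyval(coeffs, t) - y) ** 2))
        if best is None or residual < best[0]:
            best = (residual, slip, coeffs)
    residual, slip, coeffs = best
    return slip, coeffs, residual                          # detected slip and bridging polynomial
```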
MX Siting Investigation. Gravity Survey - Sevier Desert Valley, Utah.
1981-01-24
Fragmentary report text: DMAHTC reduced the data to Simple Bouguer Anomaly values (see Section A1.4, Appendix A1.0); gravity stations were distributed throughout the Sevier Desert Valley, Utah, at an approximate interval of 1.4 miles (2.3 km); results are presented in a Complete Bouguer Anomaly drawing and a topographic-setting figure.
Castells-Sarret, Neus; Cueto-González, Anna M; Borregan, Mar; López-Grondona, Fermina; Miró, Rosa; Tizzano, Eduardo; Plaja, Alberto
2017-09-25
Conventional cytogenetics diagnoses 3-5% of patients with unexplained developmental delay/intellectual disability and/or multiple congenital anomalies. Multiplex Ligation-dependent Probe Amplification increases diagnostic rates by between 2.4% and 5.8%. Currently, the comparative genomic hybridisation array (aCGH) is the highest-performing diagnostic tool in patients with developmental delay/intellectual disability, congenital anomalies and autism spectrum disorders. Our aim is to evaluate the efficiency of the use of aCGH as a first-line test in these and other indications (epilepsy, short stature). A total of 1000 patients referred due to one or more of the abovementioned disorders were analysed by aCGH. Pathogenic genomic imbalances were detected in 14% of the cases, with a variable distribution of diagnoses according to phenotype: 18.9% of patients with developmental delay/intellectual disability; 13.7% with multiple congenital anomalies; 9.76% with psychiatric pathologies; 7.02% with epilepsy; and 13.3% with short stature. Within the multiple congenital anomalies, central nervous system abnormalities and congenital heart diseases accounted for 14.9% and 10.6% of diagnoses, respectively. Among the psychiatric disorders, patients with autism spectrum disorders accounted for 8.9% of the diagnoses. Our results demonstrate the effectiveness and efficiency of the use of aCGH as the first-line test in the genetic diagnosis of patients suspected of genomic imbalances, supporting its inclusion within the National Health System. Copyright © 2017. Published by Elsevier España, S.L.U.
Celikoglu, Mevlut; Miloglu, Ozkan; Oztek, Ozkan
2010-09-01
The aims of this study were to investigate the frequency and characteristics of dental transpositions and to evaluate associated dental anomalies in a large sample of the Turkish Anatolian population. A retrospective study was performed using panoramic radiographs of 6983 patients (4092 females and 2891 males), ranging in age from 12 to 27 years, referred to the Faculty of Dentistry at Ataturk University (Erzurum, Turkey) between 2005 and 2008. For each patient with tooth transposition we recorded the demographic variables (including age and sex), history of trauma, type, classification, and location of tooth transpositions, and associated dental anomalies. The Pearson chi-squared test was used to determine potential differences in the distribution of tooth transposition when stratified by gender. Tooth transposition was detected in 19 subjects (0.27%), with a 2.2:1 female-to-male ratio (P=0.38). The most commonly observed transposition was maxillary canine-lateral incisor (60%). The frequencies of complete and incomplete transpositions were equal (10/10), and transposition was more common on the left side than on the right side (11/9). Of the 19 subjects, 10.5% had a peg-shaped lateral incisor and 21.1% had one congenitally missing tooth, excluding third molars. Supernumerary teeth, impacted teeth excluding third molars, transmigrated teeth, and dilacerations were also observed. The frequency of tooth transposition was 0.27% in a Turkish Anatolian population, and maxillary canine-lateral incisor was the most frequently observed transposition. Retained primary teeth were the most frequently observed dental anomaly in all types of tooth transposition.
Diagnostic value of perinatal autopsies: analysis of 486 cases.
Neşe, Nalan; Bülbül, Yeşim
2018-02-23
Autopsy is a beneficial procedure for determining the cause of death and the frequency of anomalies in perinatal losses. Even when an autopsy does not provide any additional information, completing the procedure and confirming the clinical diagnoses gives reassurance to both clinicians and parents. Here we present a 15-year archival study based on the findings of perinatal autopsies. Four hundred and eighty-six cases from our archive were reviewed and, according to the findings, divided into three subcategories: (1) miscarriages (MCF); (2) fetuses terminated for vital anomalies detected by prenatal ultrasonography (FTA); (3) premature or term newborns who died within the first month of life (neonates: NN). Autopsies were documented and classified according to the week/age of the cases, anomalies, and causes of abortion or death. Two hundred and twenty-six of the 486 cases (46.5%) were in the MCF group, while 227 (46.7%) and 33 (6.8%) were in the FTA and NN groups, respectively. In the FTA group, the most frequently detected anomaly was neural tube defects. In the NN group, prematurity-related complications were the most common cause of death. The autopsy process was found valuable in 39.7% of all cases. We suggest that the autopsy procedure is diagnostically valuable even when ultrasonographic findings confirm the indication for termination or when no major fetal or placental anomaly is detected in miscarriages.
Yesildag, Ebru
2015-01-01
Objective: Circumcision is one of the most commonly performed operations during childhood. The procedure is often underestimated in areas where it is frequently performed for social and religion-based indications. In fact, it can be an opportunity to detect and correct any existing penile anomaly. The aim of the study was to retrospectively evaluate the boys who were admitted to a hospital for circumcision and the outcome of the procedure. Methods: Boys who were brought to the outpatient clinics for circumcision between 2009 and 2015 were retrospectively evaluated. The indications for hospital admission and the presence of associated penile anomalies were investigated. All the boys were examined and operated on by a single surgeon of the institution. Results: Nine hundred forty-four boys were brought to the pediatric surgery outpatient clinics to be circumcised. The operation was performed in 318 of them. The physical examination revealed penile anomalies in 29 of the 318 cases. The detected anomalies were webbed penis, penile torsion, hypospadias, chordee without hypospadias, and meatal stenosis. Conclusions: Proper examination of boys by a physician prior to circumcision enables the detection of penile anomalies, which can be corrected in the same session. Arrangements for performing circumcision in hospitals by medical staff should be favored, and the misleading perception that the procedure is trivial where it is ritually performed should be corrected. PMID:26430441
NASA Astrophysics Data System (ADS)
Brax, Christoffer; Niklasson, Lars
2009-05-01
Maritime Domain Awareness (MDA) is important for both civilian and military applications. An important part of MDA is the detection of unusual vessel activities such as piracy, smuggling, poaching, and collisions. Today's interconnected sensor systems provide huge amounts of information over large geographical areas, which can push operators to their cognitive capacity and cause them to miss important events. We propose an agent-based situation management system that automatically analyses sensor information to detect unusual activity and anomalies. The system combines knowledge-based detection with data-driven anomaly detection. The system is evaluated using information from both radar and AIS sensors.
Magnetic Anomaly Detection by Remote Means
2016-09-21
Fragmentary report text: authors Miles, Richard and Dogariu (Grant N00014-13-1-0282). Abstract fragment: research on the possibility of detecting magnetic anomalies remotely using laser excitation of a [text truncated]. Cited reference: W. Happer, "Laser Remote Sensing of Magnetic Fields in the Atmosphere by Two-Photon Optical Pumping of Xe 129," NADC Report N62269-78-M.
Detecting Anomalies in Process Control Networks
NASA Astrophysics Data System (ADS)
Rrushi, Julian; Kang, Kyoung-Don
This paper presents the estimation-inspection algorithm, a statistical algorithm for anomaly detection in process control networks. The algorithm determines if the payload of a network packet that is about to be processed by a control system is normal or abnormal based on the effect that the packet will have on a variable stored in control system memory. The estimation part of the algorithm uses logistic regression integrated with maximum likelihood estimation in an inductive machine learning process to estimate a series of statistical parameters; these parameters are used in conjunction with logistic regression formulas to form a probability mass function for each variable stored in control system memory. The inspection part of the algorithm uses the probability mass functions to estimate the normalcy probability of a specific value that a network packet writes to a variable. Experimental results demonstrate that the algorithm is very effective at detecting anomalies in process control networks.
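A minimal sketch of the estimation/inspection split might look like the following: a multinomial logistic model, fitted by maximum likelihood on normal traffic, yields a probability mass function over the discrete values a memory variable can take, and a packet is flagged when the value it writes has too low a normalcy probability. The context features and threshold are illustrative assumptions, not the paper's parameters.

```python
# Hedged sketch of the estimation/inspection idea for one control-system variable.
import numpy as np
from sklearn.linear_model import LogisticRegression

class VariableInspector:
    def __init__(self, threshold=1e-3):
        self.model = LogisticRegression(max_iter=1000)   # fitted by maximum likelihood
        self.threshold = threshold

    def estimate(self, context_features, observed_values):
        # Estimation phase: learn P(value | context) from normal traffic.
        self.model.fit(context_features, observed_values)

    def inspect(self, context_feature, written_value):
        # Inspection phase: normalcy probability of the value the packet writes.
        probs = self.model.predict_proba(np.atleast_2d(context_feature))[0]
        classes = list(self.model.classes_)
        p = probs[classes.index(written_value)] if written_value in classes else 0.0
        return p >= self.threshold, p                    # (is_normal, probability)
```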
NASA Astrophysics Data System (ADS)
Nawir, Mukrimah; Amir, Amiza; Lynn, Ong Bi; Yaakob, Naimah; Badlishah Ahmad, R.
2018-05-01
The rapid growth of networked technologies exposes them to various network attacks, because large volumes of data are constantly exchanged over the Internet and must be handled at scale. Moreover, network anomaly detection using machine learning is hampered by the scarcity of publicly available labelled network datasets, which has led many researchers to keep using the most common dataset (KDDCup99) even though it is no longer well suited for evaluating machine learning (ML) classifiers. Several issues regarding the available labelled network datasets are discussed in this paper. The aim of this paper is to build a network anomaly detection system using machine learning algorithms that is efficient, effective, and fast. The findings show that the AODE algorithm performs well in terms of accuracy and processing time for binary classification on the UNSW-NB15 dataset.
NASA Astrophysics Data System (ADS)
Daud, Yunus; Rosid, Syamsu; Fahmi, Fikri; Yunus, Faris Maulana; Muflihendri, Reza
2018-02-01
The Ijen geothermal area is a high-temperature geothermal system located in Bondowoso regency, East Java. It is categorized as a caldera-hosted geothermal system, covered by Quaternary andesitic volcanic rocks with steep surrounding topography. Several surface thermal manifestations are found, such as altered rocks near Mt. Kukusan and a group of Blawan hot springs in the northern part of the caldera. A geomagnetic survey was conducted at 72 stations distributed inside the caldera to delineate the extent of hydrothermal activity. The magnetic anomaly was obtained by correcting the total magnetic field measured in the field for the IGRF and the diurnal variation. The reduction-to-pole (RTP) method was applied with a geomagnetic inclination of about -32°. In general, the result shows that high magnetic anomalies are distributed at the boundary of the study area, while low magnetic anomalies are observed in the centre. The low anomaly indicates demagnetized rock, probably caused by hydrothermal activity. It correlates well with the surface alteration observed close to Mt. Kukusan as well as with the high-temperature reservoir drilled in the centre of the caldera. Accordingly, the low magnetic anomaly also indicates the possibility of a geothermal reservoir in the Ijen geothermal area.
Thermal surveillance of active volcanoes
NASA Technical Reports Server (NTRS)
Friedman, J. D. (Principal Investigator)
1973-01-01
The author has identified the following significant results. There are three significant scientific results of the discovery of 48 pinpoint anomalies on the upper flanks of Mt. Rainier: (1) Many of these points may actually be the location of fumarolic vapor emission or warm ground considerably below the summit crater. (2) Discovery of these small anomalies required specific V/H scanner settings for precise elevation on Mt. Rainier's flank, to avoid smearing the anomalies to the point of nonrecognition. Several past missions flown to map the thermal anomalies of the summit area did not detect the flank anomalies. (3) This illustrates the value of the aerial IR scanner as a geophysical tool suited to specific problem-oriented missions, in contrast to its more general value in a regional or reconnaissance anomaly-mapping role.
Monteiro, Fabíola P; Vieira, Társis P; Sgardioli, Ilária C; Molck, Miriam C; Damiano, Ana Paula; Souza, Josiane; Monlleó, Isabella L; Fontes, Marshall I B; Fett-Conte, Agnes C; Félix, Têmis M; Leal, Gabriela F; Ribeiro, Erlane M; Banzato, Claudio E M; Dantas, Clarissa de R; Lopes-Cendes, Iscia; Gil-da-Silva-Lopes, Vera Lúcia
2013-07-01
The 22q11.2 deletion is the most frequent interstitial deletion in humans and presents a wide phenotypic spectrum, with over 180 clinical manifestations described. Distinct studies have detected frequencies of the deletion ranging from 0% to 75%, depending on the studied population and the selection criteria adopted. Due to the lack of consensus in this matter, several studies have been conducted aiming to define which patients would be eligible for screening; however, the issue is still up for debate. In order to contribute to the delineation of possible clinical and dysmorphologic guidelines to optimize decision making in the clinical setting, 194 individuals with variable features of the 22q11.2 deletion syndrome (22q11.2DS) were evaluated and divided into four groups: Group I, clinical suspicion of 22q11.2DS with palatal anomalies; Group II, clinical suspicion without palatal anomalies; Group III, cardiac malformations associated with the 22q11.2DS; and Group IV, juvenile-onset schizophrenia. Multiplex ligation-dependent probe amplification was used for screening the 22q11.2 deletion, which was detected in 45 patients (23.2%), distributed as follows: Group I, 35/101 (34.7%); Group II, 4/18 (22.2%); Group III, 6/52 (11.5%); and Group IV, 0/23 (0%). Clinical data were analyzed by frequency distribution and statistically. Based on the present results and on a review of the literature, we propose a set of guidelines for screening patients with distinct manifestations of the 22q11.2DS in order to maximize resources. In addition, we report the dysmorphic features which we found to be statistically correlated with the presence of the 22q11.2DS.
NASA Astrophysics Data System (ADS)
Giusti, M.; Dziak, R. P.; Maia, M.; Perrot, J.; Sukhovich, A.
2017-12-01
In August of 2010, an unusually large earthquake sequence of >700 events occurred at the Famous and North Famous segments (36.5-37°N) of the Mid-Atlantic Ridge (MAR), recorded by an array of five hydrophones moored on the MAR flanks. The swarm extended spatially >70 km across the two segments. The non-transform offset (NTO) separating the two segments, which is thought to act as a structural barrier, did not appear to impede or block the spatial distribution of the earthquakes. Broadband acoustic energy (1-30 Hz) was also observed and accompanied the onset of the swarm, lasting >20 hours. A total of 18 earthquakes from the swarm were detected teleseismically, and four had Centroid-Moment Tensor (CMT) solutions derived. The CMT solutions indicated three normal faulting events and one non-double-couple (explosion) event. The spatio-temporal distribution of the seismicity and broadband energy shows evidence of two magma dike intrusions at the North Famous segment, with one intrusion crossing the NTO. This is the first evidence for an intrusion event detected on the MAR south of the Azores since the 2001 Lucky Strike intrusion. Gravimetric data were required to identify whether or not the Famous area is indeed comprised of two segments down to the level of the upper mantle. A high-resolution gravity anomaly map of the two segments has been produced, based on a two-dimensional polygon model (Chapman, 1979), and will be compared to gravimetric data originating from the SUDACORES experiment (1998, Atalante ship, IFREMER research team). Combined with the earthquake observations, this gravity anomaly map should provide a better understanding of the geodynamic processes of this non-transform offset and of the deep magmatic system driving the August 2010 swarm.
Continental and oceanic magnetic anomalies: Enhancement through GRM
NASA Technical Reports Server (NTRS)
Vonfrese, R. R. B.; Hinze, W. J.
1985-01-01
In contrast to the POGO and MAGSAT satellites, the Geopotential Research Mission (GRM) satellite system will orbit at a minimum elevation to provide significantly better resolved lithospheric magnetic anomalies for more detailed and improved geologic analysis. In addition, GRM will measure corresponding gravity anomalies to enhance our understanding of the gravity field for vast regions of the Earth which are largely inaccessible to more conventional surface mapping. Crustal studies will greatly benefit from the dual data sets as modeling has shown that lithospheric sources of long wavelength magnetic anomalies frequently involve density variations which may produce detectable gravity anomalies at satellite elevations. Furthermore, GRM will provide an important replication of lithospheric magnetic anomalies as an aid to identifying and extracting these anomalies from satellite magnetic measurements. The potential benefits to the study of the origin and characterization of the continents and oceans, that may result from the increased GRM resolution are examined.
Kaasen, Anne; Helbig, Anne; Malt, Ulrik F.; Næs, Tormod; Skari, Hans; Haugen, Guttorm
2017-01-01
In this longitudinal prospective observational study performed at a tertiary perinatal referral centre, we aimed to assess maternal distress in pregnancy in women with ultrasound findings of fetal anomaly and compare this with distress in pregnant women with normal ultrasound findings. Pregnant women with a structural fetal anomaly (n = 48) and normal ultrasound (n = 105) were included. We administered self-report questionnaires (General Health Questionnaire-28, Impact of Event Scale-22 [IES], and Edinburgh Postnatal Depression Scale) a few days following ultrasound detection of a fetal anomaly or a normal ultrasound (T1), 3 weeks post-ultrasound (T2), and at 30 (T3) and 36 weeks gestation (T4). Social dysfunction, health perception, and psychological distress (intrusion, avoidance, arousal, anxiety, and depression) were the main outcome measures. The median gestational age at T1 was 20 and 19 weeks in the group with and without fetal anomaly, respectively. In the fetal anomaly group, all psychological distress scores were highest at T1. In the group with a normal scan, distress scores were stable throughout pregnancy. At all assessments, the fetal anomaly group scored significantly higher (especially on depression-related questions) compared to the normal scan group, except on the IES Intrusion and Arousal subscales at T4, although with large individual differences. In conclusion, women with a known fetal anomaly initially had high stress scores, which gradually decreased, resembling those in women with a normal pregnancy. Psychological stress levels were stable and low during the latter half of gestation in women with a normal pregnancy. PMID:28350879
Prevalence of dental anomalies in Saudi orthodontic patients.
Al-Jabaa, Aljazi H; Aldrees, Abdullah M
2013-07-01
This study aimed to investigate the prevalence of dental anomalies and to study the association of these anomalies with different types of malocclusion in a random sample of Saudi orthodontic patients. Six hundred and two randomly selected pretreatment records, including orthopantomographs (OPG) and study models, were evaluated. The molar relationship was determined using pretreatment study models, and OPGs were examined to investigate the prevalence of dental anomalies among the sample. The most common types of the investigated anomalies were impaction, followed by hypodontia, microdontia, macrodontia, ectopic eruption and supernumerary teeth. No statistically significant correlations were observed between sex and dental anomalies. Dental anomalies were most commonly found in class I, followed by asymmetric molar relation, then class II and finally class III molar relation. No malocclusion group had a statistically significant relation with any individual dental anomaly. The prevalence of dental anomalies among Saudi orthodontic patients was higher than in the general population. Although orthodontic patients have been reported to have high rates of dental anomalies, orthodontists often fail to consider this. If not detected, dental anomalies can complicate dental and orthodontic treatment; therefore, their presence should be carefully investigated during orthodontic diagnosis and considered during treatment planning.
Detection of Anomalous Insiders in Collaborative Environments via Relational Analysis of Access Logs
Chen, You; Malin, Bradley
2014-01-01
Collaborative information systems (CIS) are deployed within a diverse array of environments, ranging from the Internet to intelligence agencies to healthcare. It is increasingly the case that such systems are applied to manage sensitive information, making them targets for malicious insiders. While sophisticated security mechanisms have been developed to detect insider threats in various file systems, they are neither designed to model nor to monitor collaborative environments in which users function in dynamic teams with complex behavior. In this paper, we introduce a community-based anomaly detection system (CADS), an unsupervised learning framework to detect insider threats based on information recorded in the access logs of collaborative environments. CADS is based on the observation that typical users tend to form community structures, such that users with low affinity to such communities are indicative of anomalous and potentially illicit behavior. The model consists of two primary components: relational pattern extraction and anomaly detection. For relational pattern extraction, CADS infers community structures from CIS access logs, and subsequently derives communities, which serve as the CADS pattern core. CADS then uses a formal statistical model to measure the deviation of users from the inferred communities to predict which users are anomalies. To empirically evaluate the threat detection model, we perform an analysis with six months of access logs from a real electronic health record system in a large medical center, as well as a publicly-available dataset for replication purposes. The results illustrate that CADS can distinguish simulated anomalous users in the context of real user behavior with a high degree of certainty and with significant performance gains in comparison to several competing anomaly detection models. PMID:25485309
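The community-based idea can be sketched as follows, under the assumption of a simple user-by-record access-count matrix: compute user-to-user affinities, infer communities, and score each user by how weakly they attach to their own community. The spectral clustering and cosine affinity used here are stand-ins for CADS's relational pattern extraction, not the published model.

```python
# Hedged sketch: users with low affinity to their inferred community get high scores.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics.pairwise import cosine_similarity

def community_deviation_scores(access_matrix, n_communities=10, random_state=0):
    sim = cosine_similarity(access_matrix)                 # user-to-user affinity
    labels = SpectralClustering(n_clusters=n_communities, affinity="precomputed",
                                random_state=random_state).fit_predict(sim)
    scores = np.empty(len(access_matrix))
    for i, c in enumerate(labels):
        members = np.flatnonzero(labels == c)
        others = members[members != i]
        # Low mean affinity to one's own community => high anomaly score.
        scores[i] = 1.0 - (sim[i, others].mean() if len(others) else 0.0)
    return scores
```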
Gould, Sharon W; Epelman, Monica
2015-08-01
Developmental anomalies of the uterus and the vagina are associated with infertility and miscarriage and are most commonly detected in the postpubertal age-group. These conditions may also present in younger patients as a mass or pain owing to obstruction of the uterus or the vagina. Associated urinary tract anomalies are common, as well. Accurate diagnosis and thorough description of these anomalies is essential for appropriate management; however, evaluation may be difficult in an immature reproductive tract. Magnetic resonance imaging technique pertinent to imaging of the pediatric female reproductive tract is presented and illustrated along with the findings associated with various anomalies. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.
2016-12-01
Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessment and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
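A hedged sketch of such a clustering-based event detector is shown below: individual outliers are flagged by their distance to the nearest cluster centre, nearby outliers in space-time are grouped into candidate events, and events are ranked by size as a simple proxy for "interestingness". The algorithms and parameters are illustrative, not the project's actual implementation.

```python
# Hedged sketch: cluster-distance outliers grouped into spatio-temporal events.
# `features` and `coords_time` are assumed to be 2-D NumPy arrays with one row per sample.
import numpy as np
from sklearn.cluster import KMeans, DBSCAN

def detect_events(features, coords_time, n_clusters=10, outlier_quantile=0.99,
                  eps=1.5, min_samples=5):
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(features)
    dist = np.linalg.norm(features - km.cluster_centers_[km.labels_], axis=1)
    outliers = np.flatnonzero(dist > np.quantile(dist, outlier_quantile))

    # Group outliers that are close in (lat, lon, time) into candidate events.
    ev_labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(coords_time[outliers])
    events = [outliers[ev_labels == e] for e in set(ev_labels) if e != -1]
    return sorted(events, key=len, reverse=True)           # largest events ranked first
```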
NASA Technical Reports Server (NTRS)
Kis, K. I.; Taylor, Patrick T.; Wittmann, G.; Toronyi, B.; Puszta, S.
2012-01-01
In this study we interpret the magnetic anomalies at satellite altitude over a part of Europe and the Pannonian Basin. These anomalies are derived from the total magnetic measurements of the CHAMP satellite. The anomalies were reduced to an elevation of 324 km. An inversion method is used to interpret the total magnetic anomalies over the Pannonian Basin. A three-dimensional triangular model is used in the inversion. Two parameter distributions, Laplacian and Gaussian, are investigated. The regularized inversion is computed numerically with the Simplex and Simulated Annealing methods, and the anomalous source is located in the upper crust. A probable source of the magnetization is exsolution of the hematite-ilmenite minerals.
Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010
NASA Astrophysics Data System (ADS)
Yao, Y. B.; Chen, P.; Zhang, S.; Chen, J. J.; Yan, F.; Peng, W. F.
2012-03-01
The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0-2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake, and the directions of deviation from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugate region. However, the probability of appearance and the extent of the anomalies in the magnetically conjugate region are smaller than those of the anomalies near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.
Equilibrium Atmospheric Response to North Atlantic SST Anomalies.
NASA Astrophysics Data System (ADS)
Kushnir, Yochanan; Held, Isaac M.
1996-06-01
The equilibrium general circulation model (GCM) response to sea surface temperature (SST) anomalies in the western North Atlantic region is studied. A coarse resolution GCM, with realistic lower boundary conditions including topography and climatological SST distribution, is integrated in perpetual January and perpetual October modes, distinguished from one another by the strength of the midlatitude westerlies. An SST anomaly with a maximum of 4°C is added to the climatological SST distribution of the model with both positive and negative polarity. These anomaly runs are compared to one another, and to a control integration, to determine the atmospheric response. In all cases warming (cooling) of the midlatitude ocean surface yields a warming (cooling) of the atmosphere over and to the east of the SST anomaly center. The atmospheric temperature change is largest near the surface and decreases upward. Consistent with this simple thermal response, the geopotential height field displays a baroclinic response with a shallow anomalous low somewhat downstream from the warm SST anomaly. The equivalent barotropic, downstream response is weak and not robust. To help interpret the results, the realistic GCM integrations are compared with parallel idealized model runs. The idealized model has full physics and a similar horizontal and vertical resolution, but an all-ocean surface with a single, permanent zonal asymmetry. The idealized and realistic versions of the GCM display compatible response patterns that are qualitatively consistent with stationary, linear, quasigeostrophic theory. However, the idealized model response is stronger and more coherent. The differences between the two model response patterns can be reconciled based on the size of the anomaly, the model treatment of cloud-radiation interaction, and the static stability of the model atmosphere in the vicinity of the SST anomaly. Model results are contrasted with other GCM studies and observations.
Classification of SD-OCT volumes for DME detection: an anomaly detection approach
NASA Astrophysics Data System (ADS)
Sankar, S.; Sidibé, D.; Cheung, Y.; Wong, T. Y.; Lamoureux, E.; Milea, D.; Meriaudeau, F.
2016-03-01
Diabetic Macular Edema (DME) is the leading cause of blindness amongst diabetic patients worldwide. It is characterized by the accumulation of water molecules in the macula, leading to swelling. Early detection of the disease helps prevent further loss of vision. Naturally, automated detection of DME from Optical Coherence Tomography (OCT) volumes plays a key role. To this end, a pipeline for detecting DME in OCT volumes is proposed in this paper. The method is based on anomaly detection using a Gaussian Mixture Model (GMM). It starts with pre-processing the B-scans by resizing, flattening, filtering and extracting features from them. Both intensity and Local Binary Pattern (LBP) features are considered. The dimensionality of the extracted features is reduced using PCA. As the last stage, a GMM is fitted with features from normal volumes. During testing, features extracted from the test volume are evaluated against the fitted model for anomaly, and classification is made based on the number of B-scans detected as outliers. The proposed method is tested on two OCT datasets, achieving a sensitivity and a specificity of 80% and 93% on the first dataset, and 100% and 80% on the second one. Moreover, experiments show that the proposed method achieves better classification performance than other recently published works.
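The pipeline lends itself to a compact sketch: PCA-reduced B-scan features, a GMM fitted on features from normal volumes only, and a volume-level decision based on how many B-scans fall below a likelihood threshold. The component counts, the outlier threshold, and the B-scan count cut-off below are assumptions rather than the paper's tuned values.

```python
# Hedged sketch of the PCA + GMM anomaly-detection pipeline for OCT volumes.
# Assumes each B-scan is already summarized as a feature vector of dimension > 20.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

class DMEVolumeClassifier:
    def __init__(self, n_components=8, score_quantile=0.05, outlier_bscans=10):
        self.pca = PCA(n_components=20)
        self.gmm = GaussianMixture(n_components=n_components, random_state=0)
        self.score_quantile = score_quantile
        self.outlier_bscans = outlier_bscans

    def fit(self, normal_bscan_features):
        z = self.pca.fit_transform(normal_bscan_features)
        self.gmm.fit(z)
        # Log-likelihood threshold below which a B-scan counts as an outlier.
        self.threshold = np.quantile(self.gmm.score_samples(z), self.score_quantile)

    def predict_volume(self, volume_bscan_features):
        z = self.pca.transform(volume_bscan_features)
        n_outliers = int(np.sum(self.gmm.score_samples(z) < self.threshold))
        return "DME" if n_outliers >= self.outlier_bscans else "normal"
```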
NASA Astrophysics Data System (ADS)
Ross, Robin M.; Quetin, Langdon B.; Martinson, Douglas G.; Iannuzzi, Rich A.; Stammerjohn, Sharon E.; Smith, Raymond C.
2008-09-01
Variability in the temporal-spatial distribution and abundance of zooplankton was documented each summer on the Palmer Long-Term Ecological Research (LTER) grid west of the Antarctic Peninsula between Anvers and Adelaide Islands during a 12-yr time series. Oblique tows to 120 m with a 2×2 m fixed-frame net were made at about 50 stations each January/February between 1993 and 2004. The numerically dominant macro- and mesozooplanktonic species >2 mm included three species of euphausiids (Euphausia superba, Antarctic krill; Thysanoëssa macrura; Euphausia crystallorophias, ice krill), a shelled pteropod (Limacina helicina), and a salp (Salpa thompsoni). Life cycles, life spans, and habitat varied among these species. Abundance data from each year were allocated to 100 km by 20 km (alongshore by on/offshore) grid cells centered on cardinal transect lines and stations within the Palmer LTER grid. The long-term mean or climatology and means for each year were used to calculate annual anomalies across the grid. Principal components analysis (PCA) was used to analyze for patterns and trends in the temporal-spatial variability of the five species. Questions included whether there are groups of species with similar patterns, and whether population cycles, species interactions or seasonal sea-ice parameters were correlated with detected patterns. Patterns in the climatology were distinct, and matched those of physical parameters. Common features included higher abundance in the north than in the south, independent of the cross-shelf gradients, and cross-shelf gradients with higher abundance either inshore (E. crystallorophias) or offshore (S. thompsoni). Anomalies revealed either cycles in the population, such as episodic recruitment in Antarctic krill, or changes in anomaly pattern between the first and second half of the sampling period. The year 1998, which coincided with a rapid change from a negative to a positive phase of the SOI, emerged either as a year with significant anomalies or as one that marked a change in anomaly patterns, depending on the species. PCA analysis showed that the pattern of cumulative variance with increasing number of modes was distinctly different for shorter-lived versus longer-lived species; the first mode accounted for nearly 50% of the variance in the shorter-lived species and less than 25% in the longer-lived species. This suggests that the mechanisms driving variability in the temporal-spatial distribution of the shorter-lived, more oceanic species were less complex and more direct than those for the longer-lived euphausiids. Evidence from both the anomaly plots and the trend analysis suggested that salps have been more consistently present across the shelf from 1999 to the present, and that the range of L. helicina has been expanding. With shorter life spans, these two species can respond more quickly to the increasing heat content on the shelf in this region. The cross-correlation analysis illustrated the negative correlation between salps and the day of ice retreat and the number of ice days, and the positive correlation between the presence of ice krill and the day of ice retreat. These results suggest that, for these species, several environmental controls on distribution and abundance are linked to seasonal sea-ice dynamics.
Inflight and Preflight Detection of Pitot Tube Anomalies
NASA Technical Reports Server (NTRS)
Mitchell, Darrell W.
2014-01-01
The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates sensors' response time to the dynamic component (noise) found in the signal of these same sensors. This analysis technique has used existing electrical signals of Pitot tube sensors that result from measured processes during inflight conditions and/or induced signals in preflight conditions to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants. The application of this technology for the detection of aircraft anomalies is innovative. Instead of determining the health of process monitoring at a steady-state condition, this technology will be used to quickly inform the pilot when an air-speed indication becomes faulty under any flight condition as well as during preflight preparation.
Infrasonic Influences of Tornados and Cyclonic Weather Systems
NASA Astrophysics Data System (ADS)
Cook, Tessa
2014-03-01
Infrasound waves travel through the air at approximately 340 m/s at sea level while experiencing low levels of friction, allowing the waves to travel over large distances. When seismic waves travel through unconsolidated soil, they slow down to approximately 340 m/s. Because the speeds of waves in the air and in the ground are similar, a more effective transfer of energy from the atmosphere to the ground can occur. Large ring lasers can be utilized for detecting sources of infrasound traveling through the ground by measuring anomalies in the frequency difference between their two counter-rotating beams. Sources of infrasound include tornados and other cyclonic weather systems. The way these systems create waves that transfer to the ground is not yet understood and will be examined in further research; the present research has focused on isolating the times at which the ring laser detected anomalies in order to investigate whether these anomalies may be attributed to isolatable weather systems. Furthermore, this research analyzed the frequencies detected in each of the anomalies and compared the frequencies with various characteristics of each weather system, such as tornado width, wind speeds, and system development. This research may be beneficial for monitoring gravity waves and weather systems.
NASA Technical Reports Server (NTRS)
Frey, H. V.
2004-01-01
A comparison of the distribution of visible and buried impact basins (Quasi-Circular Depressions or QCDs) on Mars > 200 km in diameter with free air gravity, crustal thickness and magnetization models shows some QCDs have coincident gravity anomalies but most do not. Very few QCDs have closely coincident magnetization anomalies, and only the oldest of the very large impact basins have strong magnetic anomalies within their main rings. Crustal thickness data show a large number of Circular Thinned Areas (CTAs). Some of these correspond to known impact basins, while others may represent buried impact basins not always recognized as QCDs in topography data alone. If true, the buried lowlands may be even older than we have previously estimated.
Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation
NASA Technical Reports Server (NTRS)
Koshti, Ajay
2014-01-01
The paper deals with infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the predicted contrast evolution is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper presents an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique and the IR Contrast feature imaging application, which are based on the models provided in this paper.
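The half-max idea, estimating an indication's width from where a spatial contrast profile falls to half of its peak, can be sketched generically; the pixel pitch and the synthetic profile below are assumptions, and the actual IR Contrast implementation details are not reproduced.

```python
# Generic half-max width estimate on a 1-D contrast profile across an
# indication: width = span between the first and last samples whose contrast
# is at least half of the peak contrast. Pixel pitch and profile are made up.
import numpy as np

def half_max_width(profile, pixel_pitch_mm=0.5):
    peak = profile.max()
    above = np.where(profile >= 0.5 * peak)[0]
    if above.size == 0:
        return 0.0
    return (above[-1] - above[0]) * pixel_pitch_mm

x = np.linspace(-10, 10, 81)
contrast = np.exp(-x**2 / (2 * 2.0**2))          # synthetic bell-shaped indication
print("half-max width ~ %.1f mm" % half_max_width(contrast))
```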
NASA Astrophysics Data System (ADS)
Yakunin, A. G.; Hussein, H. M.
2018-01-01
The article shows how well-known statistical methods, widely used for financial problems and in a number of other fields of science and technology, can be applied effectively, after minor modification, to such problems in climate and environmental monitoring systems as the detection of anomalies in the form of abrupt changes in signal levels, the occurrence of positive and negative outliers, and violations of the cyclic form of periodic processes.
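The abstract only names the class of statistical methods; a minimal sketch of the general idea is a rolling robust z-score that flags both single outliers and abrupt level shifts in a monitoring series. The window length and threshold are assumptions, not the authors' settings.

```python
# Minimal sketch (not the authors' specific method): flag point outliers and
# abrupt level shifts in a monitoring series with a robust rolling z-score.
import numpy as np

def robust_z_flags(series, window=48, z_max=4.0):
    flags = np.zeros(series.size, dtype=bool)
    for i in range(window, series.size):
        ref = series[i - window:i]
        med = np.median(ref)
        mad = np.median(np.abs(ref - med)) + 1e-9
        flags[i] = abs(series[i] - med) / (1.4826 * mad) > z_max
    return flags

rng = np.random.default_rng(2)
x = rng.normal(20.0, 0.5, 400)
x[250:] += 5.0        # abrupt level shift
x[100] += 8.0         # single outlier
print(np.where(robust_z_flags(x))[0][:5])   # first flagged indices, near 100 and 250
```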
NASA Astrophysics Data System (ADS)
Kita, S.; Hasegawa, A.; Okada, T.; Nakajima, J.; Matsuzawa, T.; Katsumata, K.
2010-12-01
1. Introduction In south-eastern Hokkaido, the Kuril forearc sliver is colliding with the northeastern Japan arc due to the oblique subduction of the Pacific plate. This collision has caused the formation of the Hidaka mountain range since the late Miocene (Kimura, 1986) and delamination of the lower-crustal materials of the Kuril forearc sliver, which would be expected to descend into the mantle wedge below (e.g., Ito, 2000; Ito and Iwasaki, 2002). In this study, we precisely investigated the three-dimensional seismic velocity structure beneath the Hokkaido corner to examine the collision of the two forearcs in this area, using both data from a dense temporary seismic network deployed in this area (Katsumata et al. [2006]) and data from the Kiban observation network, which covers the entire Japanese Islands with a station separation of 15-20 km. 2. Data and method The double-difference tomography method (Zhang and Thurber, 2003; 2006) was applied to a large arrival-time data set of 201,527 P-wave and 150,963 S-wave readings recorded at 125 stations from 10,971 earthquakes that occurred from 1999 to 2010. Grid intervals were set at 10 km in the along-arc direction, 12.5 km perpendicular to it, and 5-10 km in the vertical direction. 3. Results and discussion An inhomogeneous seismic velocity structure was clearly imaged in the Hokkaido corner at depths of 0-120 km. A high-velocity anomaly of P- and S-waves with a volume of 20 km x 90 km x 35 km was detected just beneath the main zone of the Hidaka metamorphic belt at depths of 0-35 km. This high-velocity anomaly is continuously distributed from mantle-wedge depths to the surface. The western edge of the anomaly corresponds exactly to the Hidaka main thrust (HMT) at the surface. The highest velocity values in the anomaly correspond to those of uppermost-mantle material (e.g., peridotite). Their location at depths of 0-35 km is also consistent with that of the Horoman peridotite belt, which is located at the southwestern edge of the main zone of the Hidaka metamorphic belt. On the other hand, a low-velocity anomaly of P- and S-waves with a volume of 80 km x 100 km x 50 km is distributed to the west of the Hidaka metamorphic belt at depths of 35-90 km. This low-velocity anomaly appears to be continuously distributed from the continental crust of the NE Japan forearc. The velocity values of this low-velocity anomaly correspond to those of crustal materials, which is consistent with the results of the tomographic studies of Kita et al. [2010, EPSL] and Takanami et al. [1982]. Comparison with the results of the seismic reflection surveys of Ito [2000] shows that the boundary obtained here between the anomalous high-velocity mantle materials and the low-velocity continental crustal materials just beneath the Hidaka main thrust (HMT) is exactly consistent with the locations of the reflection planes in that study. Moreover, our results suggest that the anomalous low-velocity crustal material at mantle-wedge depths belongs to the NE Japan forearc crust, which does not support the expectation of previous studies that the delaminated lower-crustal material of the Kuril forearc sliver descends into the mantle wedge due to the collision.
Hot spots of multivariate extreme anomalies in Earth observations
NASA Astrophysics Data System (ADS)
Flach, M.; Sippel, S.; Bodesheim, P.; Brenning, A.; Denzler, J.; Gans, F.; Guanche, Y.; Reichstein, M.; Rodner, E.; Mahecha, M. D.
2016-12-01
Anomalies in Earth observations might indicate data quality issues, extremes, or a change of the underlying processes within a highly multivariate system. Thus, considering the multivariate constellation of variables for extreme detection yields crucial additional information over conventional univariate approaches. We highlight areas in which multivariate extreme anomalies are more likely to occur, i.e. hot spots of extremes in global atmospheric Earth observations that impact the Biosphere. In addition, we present the year of the most unusual multivariate extreme between 2001 and 2013 and show that these coincide with well-known high-impact extremes. Technically speaking, we account for multivariate extremes by using three sophisticated algorithms adapted from computer science applications, namely an ensemble of the k-nearest-neighbours mean distance, a kernel density estimation, and an approach based on recurrences. However, the impact of atmospheric extremes on the Biosphere may largely depend on what is considered to be normal, i.e. the shape of the mean seasonal cycle and its inter-annual variability. We identify regions with similar mean seasonality by means of dimensionality reduction in order to estimate in each region both the 'normal' variance and robust thresholds for detecting the extremes. In addition, we account for challenges like heteroscedasticity in Northern latitudes. Apart from hot spot areas, anomalies in the atmospheric time series that can only be detected by a multivariate approach, and not by a simple univariate approach, are of particular interest. Such an anomalous constellation of atmospheric variables is of interest if it impacts the Biosphere. The multivariate constellation of such an anomalous part of a time series is shown in one case study, indicating that multivariate anomaly detection can provide novel insights into Earth observations.
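One of the three detectors named above, the mean distance to the k nearest neighbours, is straightforward to sketch; the synthetic data, the choice of k and the 99th-percentile threshold below are assumptions for illustration only.

```python
# Sketch of a k-nearest-neighbours mean-distance score: observations whose
# multivariate constellation is isolated receive high scores. Data synthetic.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 5))                 # e.g. standardized atmospheric variables
X[::200] += 6.0                                # a few joint (multivariate) extremes

nn = NearestNeighbors(n_neighbors=11).fit(X)   # 10 neighbours + the point itself
dist, _ = nn.kneighbors(X)
score = dist[:, 1:].mean(axis=1)               # drop self-distance, average the rest
threshold = np.quantile(score, 0.99)
print(np.where(score > threshold)[0])          # top 1% of scores (includes the shifted points)
```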
A Healthcare Utilization Analysis Framework for Hot Spotting and Contextual Anomaly Detection
Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram
2012-01-01
Patient medical records today contain a vast amount of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting, the ability to identify in a timely manner heavy users of the system and their patterns of utilization so that targeted intervention programs can be instituted, and anomaly detection, the ability to identify anomalous utilization cases in which patients incur levels of utilization that are unexpected given their clinical characteristics and that may require corrective actions. Past work on medical utilization pattern analysis has focused on disease-specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map patients' clinical characteristics to their utilization levels are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. In future work we plan to incorporate additional sources of observational data, including EMRs and disease registries, and develop analytics models that leverage temporal relationships among medical encounters to provide more in-depth insights. PMID:23304306
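The contextual component can be sketched as follows, under assumptions: a model predicts utilization from clinical features, and patients whose actual utilization deviates strongly from the prediction are flagged. The model class, features and threshold below are illustrative, not the authors' exact choices.

```python
# Sketch of the contextual idea (not the authors' exact model): predict each
# patient's utilization from clinical features, then flag large deviations
# between actual and expected utilization. All data here are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
clinical = rng.normal(size=(2000, 10))                 # e.g. comorbidity/severity features
utilization = 5 + 2 * clinical[:, 0] + rng.normal(0, 1, 2000)
utilization[:20] += 15                                 # unexpectedly heavy users

model = LinearRegression().fit(clinical, utilization)
residual = utilization - model.predict(clinical)       # actual minus expected utilization
z = (residual - residual.mean()) / residual.std()
print("contextual anomalies:", np.where(z > 3)[0][:10])
```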
Syntax does not necessarily precede semantics in sentence processing: ERP evidence from Chinese.
Zhang, Yaxu; Li, Ping; Piao, Qiuhong; Liu, Youyi; Huang, Yongjing; Shu, Hua
2013-07-01
Two event-related potential experiments were conducted to examine whether the processing of syntactic category or syntactic subcategorization frame always needs to temporally precede semantic processing during the reading of Chinese sentences of object-subject-verb construction. The sentences contained (a) no anomalies, (b) semantic only anomalies, (c) syntactic category plus semantic anomalies, or (d) transitivity plus semantic anomalies. In both experiments, all three types of anomalies elicited a broad negativity between 300 and 500 ms. This negativity included an N400 effect, given its distribution. Moreover, syntactic category plus semantic anomalies elicited a P600 response, whereas the other two types of anomalies did not. The finding of N400 effects suggests that semantic integration can be attempted even when the processing of syntactic category or syntactic subcategorization frame is unsuccessful. Thus, syntactic processing is not a necessary prerequisite for the initiation of semantic integration in Chinese. Copyright © 2013 Elsevier Inc. All rights reserved.
The Widespread Distribution of Swirls in Lunar Reconnaissance Orbiter Camera Images
NASA Astrophysics Data System (ADS)
Denevi, B. W.; Robinson, M. S.; Boyd, A. K.; Blewett, D. T.
2015-10-01
Lunar swirls, the sinuous high- and low-reflectance features that cannot be mentioned without the associated adjective "enigmatic," are of interest because of their link to crustal magnetic anomalies [1,2]. These localized magnetic anomalies create mini-magnetospheres [3,4] and may alter the typical surface modification processes or result in altogether distinct processes that form the swirls. One hypothesis is that magnetic anomalies may provide some degree of shielding from the solar wind [1,2], which could impede space weathering due to solar wind sputtering. In this case, swirls would serve as a way to compare areas affected by typical lunar space weathering (solar wind plus micrometeoroid bombardment) to those where space weathering is dominated by micrometeoroid bombardment alone, providing a natural means to assess the relative contributions of these two processes to the alteration of fresh regolith. Alternately, magnetic anomalies may play a role in the sorting of soil grains, such that the high-reflectance portion of swirls may preferentially accumulate feldspar-rich dust [5] or soils with a lower component of nanophase iron [6]. Each of these scenarios presumes a pre-existing magnetic anomaly; swirls have also been suggested to be the result of recent cometary impacts in which the remanent magnetic field is generated by the impact event [7]. Here we map the distribution of swirls using ultraviolet and visible images from the Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) [8,9]. We explore the relationship of the swirls to crustal magnetic anomalies [10], and examine regions with magnetic anomalies and no swirls.
NASA Astrophysics Data System (ADS)
Zaba, Katherine D.; Rudnick, Daniel L.
2016-02-01
Large-scale patterns of positive temperature anomalies persisted throughout the surface waters of the North Pacific Ocean during 2014-2015. In the Southern California Current System, measurements by our sustained network of underwater gliders reveal the coastal effects of the recent warming. Regional upper ocean temperature anomalies were greatest since the initiation of the glider network in 2006. Additional observed physical anomalies included a depressed thermocline, high stratification, and freshening; induced biological consequences included changes in the vertical distribution of chlorophyll fluorescence. Contemporaneous surface heat flux and wind strength perturbations suggest that local anomalous atmospheric forcing caused the unusual oceanic conditions.
Knee point search using cascading top-k sorting with minimized time complexity.
Wang, Zheng; Tseng, Shian-Shyong
2013-01-01
Anomaly detection systems and many other applications are frequently confronted with the problem of finding the largest knee point in the sorted curve for a set of unsorted points. This paper proposes an efficient knee point search algorithm with minimized time complexity using cascading top-k sorting when an a priori probability distribution of the knee point is known. First, a top-k sort algorithm is proposed based on a quicksort variation. We divide the knee point search problem into multiple steps, and in each step an optimization problem for the selection number k is solved, where the objective function is defined as the expected time cost. Because the expected time cost in one step depends on that of the subsequent steps, we simplify the optimization problem by minimizing the maximum expected time cost. The posterior probability of the largest knee point distribution and the other parameters are updated before solving the optimization problem in each step. An example of source detection of DNS DoS flooding attacks is provided to illustrate the applications of the proposed algorithm.
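The paper's cascading scheme with a prior over the knee position is not reproduced here; the sketch below only illustrates the underlying primitive, retrieving the top-k values with a partial sort and locating a knee on the descending curve with the common maximum-distance-to-chord heuristic (an assumption, not necessarily the authors' knee criterion).

```python
# Simplified sketch: partial sort of only the k largest values with a heap,
# then a knee estimate via the max-distance-to-chord heuristic.
import heapq
import numpy as np

def knee_index(sorted_desc):
    """Index of the point farthest from the chord joining the curve's ends."""
    y = np.asarray(sorted_desc, dtype=float)
    x = np.arange(y.size, dtype=float)
    x0, y0, x1, y1 = x[0], y[0], x[-1], y[-1]
    # perpendicular distance from each point to the line through the end points
    num = np.abs((y1 - y0) * x - (x1 - x0) * y + x1 * y0 - y1 * x0)
    return int(np.argmax(num / np.hypot(y1 - y0, x1 - x0)))

rng = np.random.default_rng(5)
values = np.concatenate([rng.uniform(50, 100, 30), rng.uniform(0, 5, 10000)])
k = 200
top_k = heapq.nlargest(k, values)      # partial sort: only the k largest values
print("knee at rank", knee_index(top_k))   # near the 30 injected large values
```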
Detecting Abnormal Machine Characteristics in Cloud Infrastructures
NASA Technical Reports Server (NTRS)
Bhaduri, Kanishka; Das, Kamalika; Matthews, Bryan L.
2011-01-01
In the cloud computing environment resources are accessed as services rather than as a product. Monitoring this system for performance is crucial because of the typical pay-per-use packages bought by users for their jobs. With the huge number of machines currently in the cloud system, it is often extremely difficult for system administrators to keep track of all machines using distributed monitoring programs such as Ganglia, which lack system health assessment and summarization capabilities. To overcome this problem, we propose a technique for automated anomaly detection using machine performance data in the cloud. Our algorithm is entirely distributed and runs locally on each computing machine in the cloud in order to rank the machines by their anomalous behavior for given jobs. There is no need to centralize any of the performance data for the analysis, and at the end of the analysis our algorithm generates error reports, thereby allowing the system administrators to take corrective actions. Experiments performed on real data sets collected for different jobs validate the fact that our algorithm has a low overhead for tracking anomalous machines in a cloud infrastructure.
Dispersive Phase in the L-band InSAR Image Associated with Heavy Rain Episodes
NASA Astrophysics Data System (ADS)
Furuya, M.; Kinoshita, Y.
2017-12-01
Interferometric synthetic aperture radar (InSAR) is a powerful geodetic technique that allows us to detect ground displacements with unprecedented spatial resolution, and has been used to detect displacements due to earthquakes, volcanic eruptions, and glacier motion. In the meantime, due to microwave propagation through the ionosphere and troposphere, we often encounter non-negligible phase anomalies in InSAR data. Correcting for the ionosphere and troposphere is therefore a long-standing issue for high-precision geodetic measurements. However, if ground displacements are negligible, the InSAR image can tell us the details of the atmosphere. Kinoshita and Furuya (2017, SOLA) detected a phase anomaly in ALOS/PALSAR InSAR data associated with heavy rain over the Niigata area, Japan, and performed a numerical weather model simulation to reproduce the anomaly; ALOS/PALSAR is a satellite-based L-band SAR sensor launched by JAXA in 2006 and terminated in 2011. The phase anomaly could be largely reproduced using the output data from the weather model. However, we should note that numerical weather model outputs can only account for the non-dispersive effect in the phase anomaly. In the case of a severe weather event, we may expect a dispersive effect caused by the presence of free electrons. In Global Navigation Satellite System (GNSS) positioning, dual-frequency measurements allow us to separate the ionospheric dispersive component from the tropospheric non-dispersive component. In contrast, SAR imaging is based on a single carrier frequency, and thus no operational ionospheric corrections have been performed in InSAR data analyses. Recently, Gomba et al. (2016) detailed the processing strategy of the split spectrum method (SSM) for InSAR, which splits the finite bandwidth of the range spectrum and virtually allows for dual-frequency measurements. We apply the L-band InSAR SSM to heavy rain episodes in which precipitation of more than 50 mm/hour was reported. We report the presence of phase anomalies in both the dispersive and non-dispersive components. While the original phase anomaly turns out to be mostly due to the non-dispersive effect, we could recognize local anomalies in the dispersive component as well. We will discuss the geophysical implications and may show several case studies.
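The split-spectrum separation rests on the fact that the ionospheric phase scales as 1/f while the non-dispersive (tropospheric/deformation) phase scales as f, so phases from a low and a high sub-band give a 2x2 system per pixel. The sketch below shows only that algebra; phase unwrapping, filtering and the full processing chain of Gomba et al. (2016) are omitted, and the frequencies and phase values are made up.

```python
# Sketch of the split-spectrum algebra only: with unwrapped phases phi_L, phi_H
# at sub-band centre frequencies f_L, f_H, solve phi(f) = a*f + b/f per pixel,
# where a*f0 is the non-dispersive part and b/f0 the dispersive (ionospheric)
# part at the full-band centre f0.
import numpy as np

def split_spectrum(phi_L, phi_H, f_L, f_H, f0):
    non_disp = f0 * (f_H * phi_H - f_L * phi_L) / (f_H**2 - f_L**2)
    disp = f_L * f_H * (f_H * phi_L - f_L * phi_H) / (f0 * (f_H**2 - f_L**2))
    return non_disp, disp

# Synthetic check: build phases from known components and recover them.
f0, f_L, f_H = 1.27e9, 1.25e9, 1.29e9        # L-band centre and sub-bands (Hz), illustrative
non_disp_true, disp_true = 3.0, -1.5         # radians at f0 (made-up values)
phi_L = non_disp_true * f_L / f0 + disp_true * f0 / f_L
phi_H = non_disp_true * f_H / f0 + disp_true * f0 / f_H
print(split_spectrum(phi_L, phi_H, f_L, f_H, f0))   # ~ (3.0, -1.5)
```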
Magnetic anomalies in the Cosmonauts Sea, off East Antarctica
NASA Astrophysics Data System (ADS)
Nogi, Y.; Hanyu, T.; Fujii, M.
2017-12-01
Identification of magnetic anomaly lineations and fracture zone trends in the Southern Indian Ocean is vital to understanding the breakup of Gondwana. However, the magnetic spreading anomalies and fracture zones are not clear in the Southern Indian Ocean. Magnetic anomaly lineations in the Cosmonauts Sea, off East Antarctica, are key to elucidating the separation between Sri Lanka/India and Antarctica. No obvious magnetic anomaly lineations are observed in a Japanese/German aerogeophysical survey of the Cosmonauts Sea, and this area is considered to have been created by seafloor spreading during the Cretaceous Normal Superchron. Vector magnetic anomaly measurements have been conducted on board the Icebreaker Shirase, mainly to understand the process of Gondwana fragmentation in the Indian Ocean. Magnetic boundary strikes are derived from the vector magnetic anomalies obtained in the Cosmonauts Sea. NE-SW trending magnetic boundary strikes are mainly observed along the several NW-SE oriented observation lines, with magnetic anomaly amplitudes of about 200 nT. These NE-SW trending magnetic boundary strikes possibly indicate M-series magnetic anomalies that cannot be detected from the aerogeophysical survey with its nearly N-S observation lines. We will discuss the magnetic spreading anomalies and the breakup process between Sri Lanka/India and Antarctica in the Cosmonauts Sea.
1986-01-24
at F7PTA, 3) old AVGAS distribution system at LFSA, and 4) southwest drainage system. Magnetic anomalies (buried drums) were identified at the...identified magnetic anomalies (buried metals) and determine whether any are pesticide drums or cans - Dispose of excavated material in an appropriate...intervals. The survey was hindered by the presence of three large iron warning signs at the site. These signs created a large magnetic anomaly in the
A method of inversion of satellite magnetic anomaly data
NASA Technical Reports Server (NTRS)
Mayhew, M. A.
1977-01-01
A method of finding a first approximation to a crustal magnetization distribution from inversion of satellite magnetic anomaly data is described. Magnetization is expressed as a Fourier Series in a segment of spherical shell. Input to this procedure is an equivalent source representation of the observed anomaly field. Instability of the inversion occurs when high frequency noise is present in the input data, or when the series is carried to an excessively high wave number. Preliminary results are given for the United States and adjacent areas.
Detection of sinkholes or anomalies using full seismic wave fields : phase II.
DOT National Transportation Integrated Search
2016-08-01
A new 2-D Full Waveform Inversion (FWI) software code was developed to characterize layering and anomalies beneath the ground surface using seismic testing. The software is capable of assessing the shear and compression wave velocities (Vs and Vp) fo...
Smart sensor technology for advanced launch vehicles
NASA Astrophysics Data System (ADS)
Schoess, Jeff
1989-07-01
Next-generation advanced launch vehicles will require improved use of sensor data and the management of multisensor resources to achieve automated preflight checkout, prelaunch readiness assessment and vehicle inflight condition monitoring. Smart sensor technology is a key component in meeting these needs. This paper describes the development of a smart sensor-based condition monitoring system concept referred to as the Distributed Sensor Architecture. A significant event and anomaly detection scheme that provides real-time condition assessment and fault diagnosis of advanced launch system rocket engines is described. The design and flight test of a smart autonomous sensor for Space Shuttle structural integrity health monitoring is presented.
Shi, Hyejin; Sohn, Sungmin; Wang, SungHo; Park, Sungrock; Lee, SangKi; Kim, Song Yi; Jeong, Sun Young; Kim, Changhwan
2017-12-01
Congenital cardiovascular anomalies, such as dextrocardia, persistent left superior vena cava (SVC), and pulmonary artery (PA) sling, are rare disorders. These congenital anomalies can occur alone, or coincide with other congenital malformations. In the majority of cases, congenital anomalies are detected early in life by certain signs and symptoms. A 56-year-old man with no previous medical history was admitted due to recurrent wide QRS complex tachycardia with hemodynamic collapse. A chest radiograph showed dextrocardia. After synchronized cardioversion, an electrocardiogram revealed Wolff-Parkinson-White (WPW) syndrome. Persistent left SVC, PA sling, and right tracheal bronchus were also detected by a chest computed tomography (CT) scan. He was diagnosed with paroxysmal supraventricular tachycardia (PSVT) associated with WPW syndrome, and underwent radiofrequency ablation. We reported the first case of situs solitus dextrocardia coexisting with persistent left SVC, PA sling and right tracheal bronchus presented with WPW and PSVT in a middle-aged adult. In patients with a cardiovascular anomaly, clinicians should consider thorough evaluation of possibly combined cardiovascular and airway malformations and cardiac dysrhythmia. © 2017 The Korean Academy of Medical Sciences.
CSAX: Characterizing Systematic Anomalies in eXpression Data.
Noto, Keith; Majidi, Saeed; Edlow, Andrea G; Wick, Heather C; Bianchi, Diana W; Slonim, Donna K
2015-05-01
Methods for translating gene expression signatures into clinically relevant information have typically relied upon having many samples from patients with similar molecular phenotypes. Here, we address the question of what can be done when it is relatively easy to obtain healthy patient samples, but when abnormalities corresponding to disease states may be rare and one-of-a-kind. The associated computational challenge, anomaly detection, is a well-studied machine-learning problem. However, due to the dimensionality and variability of expression data, existing methods based on feature space analysis or individual anomalously expressed genes are insufficient. We present a novel approach, CSAX, that identifies pathways in an individual sample in which the normal expression relationships are disrupted. To evaluate our approach, we have compiled and released a compendium of public expression data sets, reformulated to create a test bed for anomaly detection. We demonstrate the accuracy of CSAX on the data sets in our compendium, compare it to other leading methods, and show that CSAX aids in both identifying anomalies and explaining their underlying biology. We describe an approach to characterizing the difficulty of specific expression anomaly detection tasks. We then illustrate CSAX's value in two developmental case studies. Confirming prior hypotheses, CSAX highlights disruption of platelet activation pathways in a neonate with retinopathy of prematurity and identifies, for the first time, dysregulated oxidative stress response in second trimester amniotic fluid of fetuses with obese mothers. Our approach provides an important step toward identification of individual disease patterns in the era of precision medicine.
Defect characterization by inductive heated thermography
NASA Astrophysics Data System (ADS)
Noethen, Matthias; Meyendorf, Norbert
2012-05-01
During inductive-thermographic inspection, an eddy current of high intensity is induced into the inspected material and the thermal response is detected by an infrared camera. Anomalies in the surface temperature during and after inductive heating correspond to inhomogeneities in the material. A finite element simulation of the surface crack detection process using active thermography with inductive heating has been developed. The simulation model is based on the finite element software ANSYS. The simulation tool was tested and used for investigations on steel components with different longitudinally oriented cracks, varying in shape, width and height. This paper focuses on surface-connected, longitudinally oriented cracks in austenitic steel. The results show that the temperature distribution of the material under test differs depending on the excitation frequency, and a possible way to measure the depth of a crack is discussed.
De Marchi, S; Proto, G; Jengo, A; Collinassi, P; Basile, A
1983-02-25
Assessment of the pedigree of 7 persons in 3 generations showed that interpretation of the transmission modality of renal glycosuria may be influenced by the diagnostic criteria employed. Analysis of renal glucose curves and evaluation of glycosuria after an oral glucose tolerance test made it clear that defects, albeit slight, could be detected in family members who would be regarded as healthy according to the criteria of Marble. Distribution of the character pointed to dominant transmission, as opposed to the recessive autosomal transmission favoured in the literature. Variations in the clinical gravity of the tubular defect may be ascribable to a difference in the expressiveness of the abnormal gene or to genetic heterogeneity. Persons homozygous and heterozygous for the gene were present in the pedigree concerned.
Upper Lithospheric Sources of Magnetic and Gravity Anomalies of The Fennoscandian Shield
NASA Astrophysics Data System (ADS)
Korhonen, J. V.; Koistinen, T.; Working GroupFennoscandian Geophysical Maps
Magnetic total intensity anomalies (DGRF-65), Bouguer anomalies (d=2670 kg/m3) and geological units from 3400 Ma to present of the Fennoscandian Shield have been digitally compiled and printed as maps 1:2 000 000. Insert maps 1:15,000,000 compare anomaly components in different source scales: pseudogravimetric anomaly versus Bouguer anomaly, DGRF-65 anomaly versus pseudomagnetic anomaly, magnetic vertical derivative versus second derivative of Bouguer anomaly. Data on bulk density, total magnetisation and lithology of samples have been presented as scatter diagrams and distribution maps of the average petrophysical properties in space and time. At the sample level, the bulk density correlates with the lithology and, together with magnetisation, establishes four principal populations of petrophysical properties. The average properties, calculated for 5 km x 5 km cells, correlate only weakly with the average Bouguer anomaly and magnetic anomaly, revealing major deep-seated sources of anomalies. Pseudogravimetric and Bouguer anomalies correlate only locally with each other. The correlation is negative in the area of felsic Palaeoproterozoic rocks in the W and NW parts of the Shield. In 2D models the sources of gravity anomalies are explained by lateral variation of density in the upper and lower crust. Smoothly varying regional components are explained by boundaries of the lower crust, the upper mantle and the asthenosphere. Magnetic anomalies are explained by lateral variation of magnetisation in the upper crust. Regional components are due to the lateral variation of magnetisation in the lower crust and the boundaries of the lower crust and mantle and the Curie isotherm of magnetite.
RS-Forest: A Rapid Density Estimator for Streaming Anomaly Detection
Wu, Ke; Zhang, Kun; Fan, Wei; Edwards, Andrea; Yu, Philip S.
2015-01-01
Anomaly detection in streaming data is of high interest in numerous application domains. In this paper, we propose a novel one-class semi-supervised algorithm to detect anomalies in streaming data. Underlying the algorithm is a fast and accurate density estimator implemented by multiple fully randomized space trees (RS-Trees), named RS-Forest. The piecewise constant density estimate of each RS-tree is defined on the tree node into which an instance falls. Each incoming instance in a data stream is scored by the density estimates averaged over all trees in the forest. Two strategies, statistical attribute range estimation of high probability guarantee and dual node profiles for rapid model update, are seamlessly integrated into RS-Forest to systematically address the ever-evolving nature of data streams. We derive the theoretical upper bound for the proposed algorithm and analyze its asymptotic properties via bias-variance decomposition. Empirical comparisons to the state-of-the-art methods on multiple benchmark datasets demonstrate that the proposed method features high detection rate, fast response, and insensitivity to most of the parameter settings. Algorithm implementations and datasets are available upon request. PMID:25685112
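A simplified, non-streaming sketch of the core idea is given below: random axis-aligned splits over a bounded attribute range define piecewise-constant density estimates (points per leaf divided by leaf volume), and instances are scored by the log-density averaged over an ensemble of such trees. The paper's streaming machinery (dual node profiles, attribute-range guarantees, model updates) is not reproduced, and all parameters are illustrative.

```python
# Simplified randomized-space-tree density scoring (low average log-density
# suggests an anomaly). Batch-mode sketch only; not the full RS-Forest.
import numpy as np

def build_tree(lo, hi, depth, rng):
    if depth == 0:
        return {"lo": lo, "hi": hi, "count": 0}
    dim = rng.integers(lo.size)
    cut = rng.uniform(lo[dim], hi[dim])
    left_hi, right_lo = hi.copy(), lo.copy()
    left_hi[dim], right_lo[dim] = cut, cut
    return {"dim": dim, "cut": cut,
            "left": build_tree(lo, left_hi, depth - 1, rng),
            "right": build_tree(right_lo, hi, depth - 1, rng)}

def leaf(node, x):
    while "dim" in node:
        node = node["left"] if x[node["dim"]] <= node["cut"] else node["right"]
    return node

def fit_score(X_train, X_test, n_trees=25, depth=8, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = X_train.min(axis=0) - 1e-6, X_train.max(axis=0) + 1e-6
    trees = [build_tree(lo, hi, depth, rng) for _ in range(n_trees)]
    for t in trees:                      # fill leaf counts with the training data
        for x in X_train:
            leaf(t, x)["count"] += 1
    scores = np.zeros(len(X_test))
    for t in trees:                      # piecewise-constant density per leaf
        for i, x in enumerate(X_test):
            node = leaf(t, x)
            vol = np.prod(node["hi"] - node["lo"])
            scores[i] += np.log((node["count"] + 1e-9) / (len(X_train) * vol))
    return scores / n_trees

rng = np.random.default_rng(6)
normal = rng.normal(0, 1, size=(2000, 3))
odd = rng.uniform(5, 6, size=(5, 3))
print(fit_score(normal, np.vstack([normal[:5], odd])).round(1))   # last five scores are much lower
```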
Prevalence of dental developmental anomalies: a radiographic study.
Ezoddini, Ardakani F; Sheikhha, M H; Ahmadi, H
2007-09-01
To determine the prevalence of developmental dental anomalies in patients attending the Dental Faculty of Medical University of Yazd, Iran, and the gender differences in these anomalies. A retrospective study based on the panoramic radiographs of 480 patients. Patients referred for panoramic radiographs were clinically examined, a detailed family history of any dental anomalies in their first- and second-degree relatives was obtained, and finally their radiographs were studied in detail for the presence of dental anomalies. 40.8% of the patients had dental anomalies. The more common anomalies were dilaceration (15%), impacted teeth (8.3%), taurodontism (7.5%) and supernumerary teeth (3.5%). Macrodontia and fusion were detected in only a few radiographs (0.2%). 49.1% of male patients had dental anomalies compared to 33.8% of females. Dilaceration, taurodontism and supernumerary teeth were found to be more prevalent in men than women, whereas impacted teeth, microdontia and gemination were more frequent in women. Family history of dental anomalies was positive in 34% of the cases. Taurodontism, gemination, dens in dente and talon cusp were specifically limited to patients under 20 years old, while the prevalence of other anomalies was almost the same in all age groups. Dilaceration, impaction and taurodontism were relatively common in the studied population. A family history of dental anomalies was positive in a third of cases.
Total electron content anomalies associated with global VEI4 + volcanic eruptions during 2002-2015
NASA Astrophysics Data System (ADS)
Li, Wang; Guo, Jinyun; Yue, Jianping; Shen, Yi; Yang, Yang
2016-10-01
In previous studies, little attention has been paid to the total electron content (TEC) anomalies preceding volcanic eruptions. We analyze the coupling relationship between volcanic eruptions and TEC anomalies, and discuss the spatial distribution of TEC anomalies in relation to the volcanoes' geographical locations. We utilize the global ionosphere map (GIM) data from the Center for Orbit Determination in Europe (CODE) to analyze TEC variations before global volcanic eruptions of VEI (Volcanic Explosivity Index) 4+ from 2002 to 2015 with the sliding interquartile range method. The results indicate that the occurrence rate of TEC anomalies before great volcanic eruptions is related to the volcano type and geographical position. The occurrence rate of TEC anomalies before stratovolcano and caldera eruptions is higher than that before shield and pyroclastic shield eruptions, and the occurrence rate of TEC anomalies has a descending trend from low latitudes to high latitudes. The TEC anomalies before eruptions in low to mid latitudes lie within the volcanically affected areas but do not coincide with the volcanic foci. Corresponding TEC anomalies can be observed in the conjugated region, and all the TEC anomalies in the volcanically affected areas are usually close to the bounds of the equatorial anomaly zones. However, the TEC anomalies preceding eruptions in high latitudes usually surround the volcano, and no TEC anomalies appear in the conjugated region. These conclusions have potential applications to the prediction of great volcanic eruptions in the future.
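A minimal sketch of a sliding interquartile-range test is given below: a sample is flagged when it leaves [Q1 - k·IQR, Q3 + k·IQR] computed over the preceding window. The window length, the factor k = 1.5 and the synthetic series are assumptions, not the paper's exact settings.

```python
# Minimal sliding-IQR anomaly test on a synthetic TEC-like series.
import numpy as np

def sliding_iqr_flags(tec, window=15, k=1.5):
    flags = np.zeros(tec.size, dtype=bool)
    for i in range(window, tec.size):
        q1, q3 = np.percentile(tec[i - window:i], [25, 75])
        iqr = q3 - q1
        flags[i] = (tec[i] < q1 - k * iqr) or (tec[i] > q3 + k * iqr)
    return flags

rng = np.random.default_rng(7)
tec = 30 + 3 * np.sin(np.arange(120) * 2 * np.pi / 27) + rng.normal(0, 1, 120)
tec[90] += 12                      # injected positive TEC anomaly
print(np.where(sliding_iqr_flags(tec))[0])   # the spiked sample at index 90 is flagged
```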
Hierarchical Kohonenen net for anomaly detection in network security.
Sarasamma, Suseela T; Zhu, Qiuming A; Huff, Julie
2005-04-01
A novel multilevel hierarchical Kohonen Net (K-Map) for an intrusion detection system is presented. Each level of the hierarchical map is modeled as a simple winner-take-all K-Map. One significant advantage of this multilevel hierarchical K-Map is its computational efficiency. Unlike other statistical anomaly detection methods such as nearest neighbor approach, K-means clustering or probabilistic analysis that employ distance computation in the feature space to identify the outliers, our approach does not involve costly point-to-point computation in organizing the data into clusters. Another advantage is the reduced network size. We use the classification capability of the K-Map on selected dimensions of data set in detecting anomalies. Randomly selected subsets that contain both attacks and normal records from the KDD Cup 1999 benchmark data are used to train the hierarchical net. We use a confidence measure to label the clusters. Then we use the test set from the same KDD Cup 1999 benchmark to test the hierarchical net. We show that a hierarchical K-Map in which each layer operates on a small subset of the feature space is superior to a single-layer K-Map operating on the whole feature space in detecting a variety of attacks in terms of detection rate as well as false positive rate.
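A minimal single-layer winner-take-all map in the spirit of one level of such a hierarchy is sketched below: units compete for each record, the winner moves toward it, and units are afterwards labeled by the dominant class of the records they capture. The paper's confidence measure, feature-subset selection and multilevel structure are not reproduced, and the synthetic data stand in for network records.

```python
# Minimal winner-take-all K-Map sketch (single layer, synthetic data).
import numpy as np

def train_kmap(X, n_units=8, epochs=20, lr=0.3, seed=0):
    rng = np.random.default_rng(seed)
    units = X[rng.choice(len(X), n_units, replace=False)].astype(float)
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            w = np.argmin(((units - x) ** 2).sum(axis=1))   # winner takes all
            units[w] += lr * (x - units[w])
        lr *= 0.9
    return units

def assign(units, X):
    return np.argmin(((X[:, None, :] - units[None, :, :]) ** 2).sum(axis=2), axis=1)

rng = np.random.default_rng(8)
X = np.vstack([rng.normal(0, 1, (500, 4)), rng.normal(6, 1, (50, 4))])  # normal + attack-like
labels = np.array([0] * 500 + [1] * 50)
units = train_kmap(X)
cluster = assign(units, X)
for u in range(len(units)):                    # label each unit by majority vote
    members = labels[cluster == u]
    if members.size:
        print(u, "attack" if members.mean() > 0.5 else "normal", members.size)
```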
Investigation of prevalence of dental anomalies by using digital panoramic radiographs.
Bilge, Nebiha Hilal; Yeşiltepe, Selin; Törenek Ağırman, Kübra; Çağlayan, Fatma; Bilge, Osman Murat
2017-09-21
This study was performed to evaluate the prevalence of all types and subtypes of dental anomalies among 6- to 40-year-old patients by using panoramic radiographs. This cross-sectional study was conducted by analyzing digital panoramic radiographs of 1200 patients admitted to our clinic in 2014. Dental anomalies were examined under 5 types and 16 subtypes: (a) number (including hypodontia, oligodontia and hyperdontia); (b) size (including microdontia and macrodontia); (c) structure (including amelogenesis imperfecta, dentinogenesis imperfecta and dentin dysplasia); (d) position (including transposition, ectopia, displacement, impaction and inversion); and (e) shape (including fusion-gemination, dilaceration and taurodontism). The prevalence of dental anomalies diagnosed by panoramic radiographs was 39.2% (men 46%, women 54%). Anomalies of position (60.8%) and shape (27.8%) were the most common types of abnormality, and anomalies of size (8.2%), structure (0.2%) and number (17%) were the least common in both genders. Impaction (45.5%), dilaceration (16.3%), hypodontia (13.8%) and taurodontism (11.2%) were the most common subtypes of dental anomalies. Taurodontism was more common in the 13-19 years age group; the age range of the most frequent occurrence of all other anomalies was 20-29 years. Anomalies of tooth position were the most common type of dental anomaly and structure anomalies were the least common in this Turkish dental population. The frequency and type of dental anomalies vary within and between populations, confirming the role of racial factors in the prevalence of dental anomalies. Digital panoramic radiography is a very useful method for the detection of dental anomalies.
NASA Astrophysics Data System (ADS)
Frolking, S. E.; Milliman, T.; Palace, M. W.; Wisser, D.; Lammers, R. B.; Fahnestock, M. A.
2010-12-01
A severe drought occurred in many portions of Amazonia in the dry season (June-September) of 2005. We analyzed ten years (7/99-10/09) of SeaWinds active microwave Ku-band backscatter data collected over the Amazon Basin, developing a monthly climatology and monthly anomalies from that climatology in an effort to detect landscape responses to this drought. We compared these to seasonal accumulating water deficit anomalies generated using Tropical Rainfall Monitoring Mission (TRMM) precipitation data (1999-2009) and 100 mm/mo evapotranspirative demand as a water deficit threshold. There was significant interannual variability in monthly mean backscatter only for ascending (early morning) overpass data, and little interannual variability in monthly mean backscatter for descending (late afternoon) overpass data. Strong negative anomalies in both ascending-overpass backscatter and accumulating water deficit developed during July-October 2005, centered on the southwestern Amazon Basin (Acre and western Amazonas states in Brazil; Madre de Dios state in Peru; Pando state in Bolivia). During the 2005 drought, there was a strong spatial correlation between morning overpass backscatter anomalies and water deficit anomalies. We hypothesize that as the drought persisted over several months, the forest canopy was increasingly unable to recover full leaf moisture content over night, and the early morning overpass backscatter data became anomalously low. This is the first reporting of tropical wet forest seasonal drought detection by active microwave scatterometry.
Tectonically Induced Anomalies Without Large Earthquake Occurrences
NASA Astrophysics Data System (ADS)
Shi, Zheming; Wang, Guangcai; Liu, Chenglong; Che, Yongtai
2017-06-01
In this study, we documented a case involving large-scale macroscopic anomalies in the Xichang area, southwestern Sichuan Province, China, from May to June of 2002, after which no major earthquake occurred. During our field survey in 2002, we found that the timing of the high-frequency occurrence of groundwater anomalies was in good agreement with those of animal anomalies. Spatially, the groundwater and animal anomalies were distributed along the Anninghe-Zemuhe fault zone. Furthermore, the groundwater level was elevated in the northwest part of the Zemuhe fault and depressed in the southeast part of the Zemuhe fault zone, with a border somewhere between Puge and Ningnan Counties. Combined with microscopic groundwater, geodetic and seismic activity data, we infer that the anomalies in the Xichang area were the result of increasing tectonic activity in the Sichuan-Yunnan block. In addition, groundwater data may be used as a good indicator of tectonic activity. This case tells us that there is no direct relationship between an earthquake and these anomalies. In most cases, the vast majority of the anomalies, including microscopic and macroscopic anomalies, are caused by tectonic activity. That is, these anomalies could occur under the effects of tectonic activity, but they do not necessarily relate to the occurrence of earthquakes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rostron, B.; Toth, J.
Lenticular reservoirs are accompanied by diagnostic pore-pressure anomalies when situated in a field of formation-fluid flow. Computer simulations have shown that these anomalies depend on the size and shape of the lens, the direction and intensity of flow, and the hydraulic conductivity contrast between the lens and the surrounding rock. Furthermore, the anomalies reflect the position of the petroleum-saturated portion of a lens since hydraulic conductivity is related to hydrocarbon content. Studies to date have shown that for an oil-free lens a pair of oppositely directed, symmetrical pressure anomalies exists. Each pair consists of a positive and a negative anomaly, respectively, at the downstream and upstream ends of the lens. A 2000-m long lens could generate a 200-kPa anomaly in a commonly occurring gravity-flow field. A lens that is filled with hydrocarbons will create a lower conductivity reservoir thus causing negative anomalies at the downstream and positive anomalies at the upstream ends of the lens. The paired anomaly for a partially full lens falls in between these two end members. Pore-pressure distributions from drill-stem tests in mature, well-explored regions can be compared to computer-simulated pore-pressure anomaly patterns. Results can be interpreted in terms of the lens geometry and degree of hydrocarbon saturation.
Incremental classification learning for anomaly detection in medical images
NASA Astrophysics Data System (ADS)
Giritharan, Balathasan; Yuan, Xiaohui; Liu, Jianguo
2009-02-01
Computer-aided diagnosis usually screens thousands of instances to find only a few positive cases that indicate the probable presence of disease. The amount of patient data increases constantly. In the diagnosis of new instances, disagreement can occur between a CAD system and physicians, which suggests inaccurate classifiers. Intuitively, misclassified instances and the previously acquired data should be used to retrain the classifier. This, however, is very time consuming and, in cases where the dataset is too large, becomes infeasible. In addition, only a small percentage of the patient data shows a positive sign, a situation known as imbalanced data. We present an incremental Support Vector Machine (SVM) as a solution to the class imbalance problem in the classification of anomalies in medical images. The support vectors provide a concise representation of the distribution of the training data. Here we use bootstrapping to identify potential candidate support vectors for future iterations. Experiments were conducted using images from endoscopy videos, and the sensitivity and specificity were close to those of an SVM trained using all samples available at a given incremental step, with significantly improved efficiency in training the classifier.
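The incremental idea can be sketched under assumptions: only the current support vectors are carried forward, the newly arrived batch is appended, and the model is retrained, with class weighting standing in for the imbalance handling. The paper's bootstrapping of candidate support vectors is not reproduced, and all data and parameters below are synthetic.

```python
# Hedged sketch: retrain on retained support vectors plus each new batch.
import numpy as np
from sklearn.svm import SVC

def incremental_svm(batches, labels):
    clf, X_keep, y_keep = None, None, None
    for X_new, y_new in zip(batches, labels):
        if X_keep is None:
            X_train, y_train = X_new, y_new
        else:
            X_train = np.vstack([X_keep, X_new])
            y_train = np.concatenate([y_keep, y_new])
        clf = SVC(kernel="rbf", class_weight="balanced").fit(X_train, y_train)
        X_keep, y_keep = clf.support_vectors_, y_train[clf.support_]   # compact summary
    return clf

rng = np.random.default_rng(9)
def make_batch(n=300, pos_frac=0.05):          # imbalanced batch: few "anomaly" frames
    n_pos = max(1, int(n * pos_frac))
    X = np.vstack([rng.normal(0, 1, (n - n_pos, 6)), rng.normal(2.5, 1, (n_pos, 6))])
    y = np.array([0] * (n - n_pos) + [1] * n_pos)
    return X, y

batches, labels = zip(*[make_batch() for _ in range(4)])
model = incremental_svm(batches, labels)
X_test, y_test = make_batch(1000)
print("sensitivity:", (model.predict(X_test)[y_test == 1] == 1).mean().round(2))
```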
Steganography anomaly detection using simple one-class classification
NASA Astrophysics Data System (ADS)
Rodriguez, Benjamin M.; Peterson, Gilbert L.; Agaian, Sos S.
2007-04-01
There are several security issues tied to multimedia when implementing the various applications in the cellular phone and wireless industry. One primary concern is the potential ease of implementing a steganography system. Traditionally, the only mechanism to embed information into a media file has been with a desktop computer. However, as the cellular phone and wireless industry matures, it becomes much simpler for the same techniques to be performed using a cell phone. In this paper, two methods are compared that classify cell phone images as either an anomaly or clean, where a clean image is one in which no alterations have been made and an anomalous image is one in which information has been hidden within the image. An image in which information has been hidden is known as a stego image. The main concern in detecting steganographic content with machine learning using cell phone images is in training specific embedding procedures to determine if the method has been used to generate a stego image. This leads to a possible flaw in the system when the learned model of stego is faced with a new stego method which doesn't match the existing model. The proposed solution to this problem is to develop systems that detect steganography as anomalies, making the embedding method irrelevant in detection. Two applicable classification methods for solving the anomaly detection of steganographic content problem are single class support vector machines (SVM) and Parzen-window. Empirical comparison of the two approaches shows that Parzen-window outperforms the single class SVM most likely due to the fact that Parzen-window generalizes less.
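The two detectors compared in the abstract can be sketched on synthetic feature vectors standing in for image features (feature extraction from cell phone images is not shown): a one-class SVM trained on clean images only, and a Parzen-window (Gaussian kernel density) estimate with a low-density threshold. The bandwidth, nu and threshold values are assumptions.

```python
# Sketch: one-class SVM vs Parzen-window (KDE) anomaly detection on synthetic
# feature vectors; "stego" features are shifted to mimic embedding artifacts.
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(10)
clean_train = rng.normal(0, 1, (800, 8))             # features of known-clean images
clean_test = rng.normal(0, 1, (200, 8))
stego_test = rng.normal(0.8, 1.2, (200, 8))          # embedding shifts the features

ocsvm = OneClassSVM(nu=0.05, gamma="scale").fit(clean_train)
kde = KernelDensity(bandwidth=0.7).fit(clean_train)
thresh = np.quantile(kde.score_samples(clean_train), 0.05)   # flag lowest 5% density

for name, flag_clean, flag_stego in [
    ("one-class SVM", ocsvm.predict(clean_test) == -1, ocsvm.predict(stego_test) == -1),
    ("Parzen window", kde.score_samples(clean_test) < thresh,
                      kde.score_samples(stego_test) < thresh)]:
    print(name, "false alarms:", flag_clean.mean(), "detections:", flag_stego.mean())
```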
NASA Astrophysics Data System (ADS)
Chemura, Abel; Mutanga, Onisimo; Dube, Timothy
2017-05-01
The development of cost-effective, reliable and easy-to-implement crop condition monitoring methods is urgently required for perennial tree crops such as coffee (Coffea arabica), as they are grown over large areas and represent long-term, higher levels of investment. Such monitoring methods are useful for identifying farm areas that experience poor crop growth, pest infestation or disease outbreaks, and for monitoring the response to management interventions. This study compares field-level coffee mean NDVI and LSWI anomalies with age-adjusted coffee mean NDVI and LSWI anomalies for identifying and mapping incongruous patches across perennial coffee plantations. To achieve this objective, we first derived the deviation of coffee pixels from the global coffee mean NDVI and LSWI values for nine sequential Landsat 8 OLI scenes. We then evaluated the influence of coffee age class (young, mature and old) on Landsat-scale NDVI and LSWI values using a one-way ANOVA and, since the results showed significant differences, adjusted the NDVI and LSWI anomalies for age class. We then used the cumulative inverse distribution function (α ≤ 0.05) to identify fields and within-field areas with excessive deviation of NDVI and LSWI from the global and age-expected means for each of the Landsat 8 OLI scene dates spanning three seasons. Results from the accuracy assessment indicated that it was possible to separate incongruous and healthy patches using these anomalies, and that NDVI performed better than LSWI for both the global and the age-adjusted mean anomalies. The age-adjusted anomalies separated incongruous and healthy patches better than the global mean anomalies for both NDVI (overall accuracy 80.9% versus 68.1%) and LSWI (overall accuracy 68.1% versus 48.9%). When the approach was applied to other Landsat 8 OLI scenes, the proportion of coffee fields modelled as incongruent decreased over time for the young age category and increased over time for the mature and old age classes. We conclude that the method could be useful for identifying anomalous patches from Landsat-scale time series data in order to monitor large coffee plantations and to indicate areas requiring particular field attention.
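The age-adjustment idea can be sketched under assumptions: subtract the mean NDVI of each field's age class rather than the global mean, then flag pixels whose anomaly falls in the lower α = 0.05 tail of the class distribution (here with an empirical quantile; whether the study's inverse CDF is empirical or parametric is not specified in the abstract). All data and values below are synthetic.

```python
# Sketch of age-adjusted NDVI anomalies with an alpha = 0.05 lower-tail cut.
import numpy as np

rng = np.random.default_rng(11)
age_class = rng.integers(0, 3, 5000)                    # 0=young, 1=mature, 2=old
ndvi = 0.5 + 0.08 * age_class + rng.normal(0, 0.05, 5000)
ndvi[:40] -= 0.2                                        # injected incongruous (stressed) pixels

flags = np.zeros(ndvi.size, dtype=bool)
for c in range(3):
    sel = age_class == c
    anom = ndvi[sel] - ndvi[sel].mean()                 # age-adjusted anomaly
    cutoff = np.quantile(anom, 0.05)                    # alpha = 0.05 lower tail
    flags[sel] = anom < cutoff
print("flagged pixels:", int(flags.sum()), "injected stressed pixels flagged:", int(flags[:40].sum()))
```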
Analysis of a SCADA System Anomaly Detection Model Based on Information Entropy
2014-03-27
...literature concerning the focus areas of this research. The focus areas include SCADA vulnerabilities, information theory, and intrusion detection
Galileo spacecraft power management and distribution system
NASA Technical Reports Server (NTRS)
Detwiler, R. C.; Smith, R. L.
1990-01-01
The Galileo PMAD (power management and distribution system) is described, and the design drivers that established the final as-built hardware are discussed. The spacecraft is powered by two general-purpose heat-source-radioisotope thermoelectric generators. Power bus regulation is provided by a shunt regulator. Galileo PMAD distributes a 570-W beginning of mission (BOM) power source to a user complement of some 137 load elements. Extensive use of pyrotechnics requires two pyro switching subassemblies. They initiate 148 squibs which operate the 47 pyro devices on the spacecraft. Detection and correction of faults in the Galileo PMAD is an autonomous feature dictated by requirements for long life and reliability in the absence of ground-based support. Volatile computer memories in the spacecraft command and data system and attitude control system require a continuous source of backup power during all anticipated power bus fault scenarios. Power for the Jupiter Probe is conditioned, isolated, and controlled by a Probe interface subassembly. Flight performance of the spacecraft and the PMAD has been successful to date, with no major anomalies.
NASA Astrophysics Data System (ADS)
Scozzari, Andrea; Doveri, Marco
2015-04-01
Knowledge of the physical and chemical processes involved in the exploitation of water bodies for human consumption is an essential tool for optimising the monitoring infrastructure. Given their increasing importance for human consumption (at least in the EU), this work focuses on groundwater resources. In the framework of drinking water networks, physical and data-driven modelling of transport phenomena in groundwater can help optimise the sensor network and validate the acquired data. This work proposes the combined use of physical and data-driven modelling to support the design of, and maximise the results from, a network of distributed sensors. In particular, the validation of physico-chemical measurements and the detection of possible anomalies in a set of continuous measurements benefit from knowledge of the domain from which water is abstracted and of its expected characteristics. Change-detection techniques based on non-specific sensors (covered by quite a large body of literature over the last two decades) have to deal with the classical issues of maximising correct detections and minimising false alarms, the latter being the most typical problem to face when designing truly applicable monitoring systems. In this context, defining an "anomaly" in terms of distance from an expected value or feature characterising water quality requires the definition of a suitable metric and knowledge of the physical and chemical peculiarities of the natural domain from which the water is exploited, with its implications for the characteristics of the water resource.
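The abstract defines an anomaly as a distance between continuous physico-chemical measurements and their expected values under a suitable metric. One minimal way to make that concrete, sketched below, is the Mahalanobis distance of each multi-sensor reading (e.g. conductivity, temperature, pH) from a baseline period; the metric, the variable names and the quantile threshold are our assumptions rather than the authors' design.

import numpy as np

def mahalanobis_scores(X, baseline):
    """Distance of each row of X (one multi-sensor reading per time step)
    from the baseline period, under the Mahalanobis metric."""
    mu = baseline.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(baseline, rowvar=False))  # pseudo-inverse for stability
    diff = X - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))

# Flag readings whose distance exceeds, say, the 99th percentile of the
# baseline's own scores -- one simple way to trade detections against false alarms:
# threshold = np.quantile(mahalanobis_scores(baseline, baseline), 0.99)
# alarms = mahalanobis_scores(X, baseline) > threshold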
Duarte, João V; Ribeiro, Maria J; Violante, Inês R; Cunha, Gil; Silva, Eduardo; Castelo-Branco, Miguel
2014-01-01
Neurofibromatosis Type 1 (NF1) is a common genetic condition associated with cognitive dysfunction. However, the pathophysiology of the NF1 cognitive deficits is not well understood. Abnormal brain structure, including increased total brain volume and white matter (WM) and grey matter (GM) abnormalities, has been reported in the NF1 brain. These previous studies employed univariate, model-driven methods, preventing the detection of subtle and spatially distributed differences in brain anatomy. Multivariate pattern analysis allows the combination of information from multiple spatial locations, yielding a discriminative power beyond that of single voxels. Here we investigated for the first time subtle anomalies in the NF1 brain using a multivariate, data-driven classification approach. We used support vector machines (SVM) to classify whole-brain GM and WM segments of structural T1-weighted MRI scans from 39 participants with NF1 and 60 non-affected individuals, divided into children/adolescent and adult groups. We also employed voxel-based morphometry (VBM) as a univariate gold standard to study brain structural differences. SVM classifiers correctly classified 94% of cases (sensitivity 92%; specificity 96%), revealing the existence of brain structural anomalies that discriminate NF1 individuals from controls. Accordingly, VBM analysis revealed structural differences in agreement with the SVM weight maps representing the most relevant brain regions for group discrimination. These included the hippocampus, basal ganglia, thalamus, and visual cortex. This multivariate, data-driven analysis thus identified subtle anomalies in brain structure in the absence of visible pathology. Our results provide further insight into the neuroanatomical correlates of known features of the cognitive phenotype of NF1. Copyright © 2012 Wiley Periodicals, Inc.
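A minimal sketch of the kind of whole-brain classification reported above is shown below, assuming each participant's GM or WM segment has already been vectorised into one feature row. scikit-learn, the linear kernel and 5-fold cross-validation are our own choices for illustration; the abstract does not state the toolchain or validation scheme used.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def classify_segments(X, y):
    """X: (n_subjects, n_voxels) vectorised GM or WM segments.
    y: 1 for NF1, 0 for non-affected controls.
    Returns mean cross-validated accuracy, a rough analogue of the
    94% correct classification reported in the abstract."""
    clf = make_pipeline(StandardScaler(), SVC(kernel='linear'))
    return cross_val_score(clf, X, y, cv=5).mean()

With a linear kernel, the fitted classifier's coefficients can be reshaped back to voxel space, giving a weight map of the kind the abstract compares against the VBM results.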
Shah, Anjana K.; Harris, M. Scott
2012-01-01
Magnetic field data are traditionally used to analyze igneous and metamorphic rocks, but recent efforts have shown that magnetic sources within sediments may be detectable, suggesting new applications for high-resolution magnetic field surveys. Candidates for sedimentary sources include heavy mineral sand concentrations rich in magnetite or hematite, alteration-induced glauconite, or biogenic magnetite. Magnetic field surveys can be used to map the distributions of such sources with much denser and more widespread coverage than is possible by sampling. These data can then provide constraints on the compositional history of local sediments. Mapping such sediments requires the sensor to be relatively close to the source, and filtering approaches may be needed to distinguish the signals from both system noise and deeper basement features. Marine geophysical surveys conducted in July 2010 over the Stono and North Edisto River inlets and their riverine inputs south of Charleston, South Carolina, showed 10- to 40-m-wide, 1- to 6-nT magnetic anomalies associated with shallow, sand-covered seabed. These anomalies are distinct from system noise but are too narrow to represent basement features. The anomalies are present mostly in shallow areas where river sediments originating from upland areas enter the inlets. Surface grab samples from the North Edisto River contain trace amounts of heavy mineral sediments including hematite, maghemite, ilmenite, and magnetite, as well as garnet, epidote, zircon, and rutile. Previous stream sediment analyses show enhanced titanium over much of the Atlantic Coastal Plain. The combined data suggest that the anomalies are generated by titanium- and iron-rich heavy mineral sands ultimately originating from the Piedmont and Blue Ridge provinces, which are then reworked and concentrated by tidal currents.
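The survey description above notes that filtering is needed to separate the 10- to 40-m-wide sedimentary anomalies from short-wavelength system noise and longer-wavelength basement signal. The sketch below applies a simple spatial band-pass to an along-track total-field profile for that purpose; the SciPy Butterworth filter, its order and the cut-off wavelengths are illustrative assumptions, not the authors' processing chain.

import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_profile(total_field, dx, short_wl=10.0, long_wl=200.0):
    """Band-pass an along-track magnetic profile so that wavelengths
    shorter than short_wl (system noise) and longer than long_wl
    (deeper basement signal) are suppressed.

    total_field : 1-D array of total-field readings (nT)
    dx          : along-track sample spacing (m), assumed well under short_wl/2
    """
    nyquist = 0.5 / dx                      # cycles per metre
    low = (1.0 / long_wl) / nyquist         # normalised pass-band edges
    high = (1.0 / short_wl) / nyquist
    b, a = butter(4, [low, high], btype='band')
    return filtfilt(b, a, total_field)      # zero-phase filtering preserves anomaly positions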
Frequency of developmental dental anomalies in the Indian population.
Guttal, Kruthika S; Naikmasur, Venkatesh G; Bhargava, Puneet; Bathi, Renuka J
2010-07-01
To evaluate the frequency of developmental dental anomalies in the Indian population. This prospective study was conducted over a period of 1 year and comprised both clinical and radiographic examinations in an oral medicine and radiology outpatient department. Adult patients were screened for the presence of dental anomalies using appropriate radiographs. A comprehensive clinical examination was performed to detect hyperdontia, talon cusp, fused teeth, gemination, concrescence, hypodontia, dens invaginatus, dens evaginatus, macro- and microdontia, and taurodontism. Patients with syndromes were not included in the study. Of the 20,182 patients screened, 350 had dental anomalies. Of these, 57.43% of the anomalies occurred in male patients and 42.57% in female patients. Hyperdontia, root dilaceration, peg-shaped laterals (microdontia), and hypodontia were more frequent than other dental anomalies of size and shape. Dental anomalies are clinically evident abnormalities that may be the cause of various dental problems. Careful observation and appropriate investigations are required to diagnose the condition and institute treatment.
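The abstract reports raw counts but not an overall frequency; the short calculation below simply rearranges the reported figures to make the implied prevalence explicit.

screened, affected = 20182, 350
prevalence = affected / screened        # ~0.0173, i.e. about 1.7% of screened patients
males = round(0.5743 * affected)        # ~201 male patients
females = affected - males              # ~149 female patients (42.57%)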