Science.gov

Sample records for sequential anomaly detection

  1. Anomaly Detection in Dynamic Networks

    SciTech Connect

    Turcotte, Melissa

    2014-10-14

    Anomaly detection in dynamic communication networks has many important security applications. These networks can be extremely large and so detecting any changes in their structure can be computationally challenging; hence, computationally fast, parallelisable methods for monitoring the network are paramount. For this reason the methods presented here use independent node and edge based models to detect locally anomalous substructures within communication networks. As a first stage, the aim is to detect changes in the data streams arising from node or edge communications. Throughout the thesis simple, conjugate Bayesian models for counting processes are used to model these data streams. A second stage of analysis can then be performed on a much reduced subset of the network comprising nodes and edges which have been identified as potentially anomalous in the first stage. The first method assumes communications in a network arise from an inhomogeneous Poisson process with piecewise constant intensity. Anomaly detection is then treated as a changepoint problem on the intensities. The changepoint model is extended to incorporate seasonal behavior inherent in communication networks. This seasonal behavior is also viewed as a changepoint problem acting on a piecewise constant Poisson process. In a static time frame, inference is made on this extended model via a Gibbs sampling strategy. In a sequential time frame, where the data arrive as a stream, a novel, fast Sequential Monte Carlo (SMC) algorithm is introduced to sample from the sequence of posterior distributions of the change points over time. A second method is considered for monitoring communications in a large scale computer network. The usage patterns in these types of networks are very bursty in nature and don’t fit a Poisson process model. For tractable inference, discrete time models are considered, where the data are aggregated into discrete time periods and probability models are fitted to the
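    A minimal sketch of the first-stage screening idea described above, assuming a conjugate Gamma prior on each node's or edge's Poisson rate and flagging counts whose posterior-predictive (negative binomial) tail probability is small. The prior parameters and alarm level are illustrative assumptions, not the thesis's settings:

```python
import math

def nb_tail(k, r, p):
    """P(X >= k) for a negative binomial posterior predictive:
    X ~ NB(r, p) with pmf C(k+r-1, k) * p**r * (1-p)**k."""
    total, pmf = 0.0, p ** r
    for i in range(k):
        total += pmf
        pmf *= (i + r) / (i + 1) * (1 - p)  # pmf recurrence
    return max(0.0, 1.0 - total)

def monitor(counts, a=1.0, b=1.0, alpha=0.01):
    """Sequentially score a stream of event counts under a
    conjugate Gamma(a, b) prior on the Poisson rate."""
    flags = []
    for x in counts:
        # posterior predictive of the next count is NB(a, b/(b+1))
        p_tail = nb_tail(x, a, b / (b + 1.0))
        flags.append(p_tail < alpha)   # surprisingly large count
        a, b = a + x, b + 1.0          # conjugate update
    return flags
```

    Each stream is scored independently, so the loop parallelises trivially across nodes and edges, matching the abstract's emphasis on fast, parallelisable first-stage screening.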

  2. Seismic data fusion anomaly detection

    NASA Astrophysics Data System (ADS)

    Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David

    2014-06-01

    Detecting anomalies in non-stationary signals has valuable applications in many fields, including medicine and meteorology, such as identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes via seismographic data. Given the many available anomaly detection algorithms, it is important to compare candidate methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach uses an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other method uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives", or transformations, of the observed signal for anomalies. Possible perspectives include wavelet de-noising, the Fourier transform, peak filtering, etc. In order to evaluate these techniques via signal fusion metrics, we must apply signal preprocessing techniques such as de-noising methods to the original signal and then use a neural network to find anomalies in the generated signal. From this secondary result it is possible to use data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The result shows which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.

  3. Survey of Anomaly Detection Methods

    SciTech Connect

    Ng, B

    2006-10-12

    This survey defines the problem of anomaly detection and provides an overview of existing methods. The methods are categorized into two general classes: generative and discriminative. A generative approach involves building a model that represents the joint distribution of the input features and the output labels of system behavior (e.g., normal or anomalous), and then applies the model to formulate a decision rule for detecting anomalies. A discriminative approach, on the other hand, aims directly to find the decision rule with the smallest error rate that distinguishes between normal and anomalous behavior. For each approach, we give an overview of popular techniques and provide references to state-of-the-art applications.
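    As a hedged illustration of the generative class (a common textbook instance, not a method taken from the survey itself), one can fit a multivariate Gaussian to nominal training data and derive the decision rule by thresholding the log-likelihood:

```python
import numpy as np

def fit_gaussian(X):
    """Fit a multivariate Gaussian to 'normal' training data (a
    simple generative model of nominal behaviour)."""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
    return mu, cov

def log_likelihood(X, mu, cov):
    """Gaussian log-density of each row of X."""
    d = X.shape[1]
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    diff = X - mu
    maha = np.einsum('ij,jk,ik->i', diff, inv, diff)
    return -0.5 * (maha + logdet + d * np.log(2 * np.pi))

# Decision rule: flag points whose likelihood under the nominal
# model falls below a threshold chosen as a low training quantile.
rng = np.random.default_rng(0)
normal = rng.normal(size=(500, 2))
mu, cov = fit_gaussian(normal)
tau = np.quantile(log_likelihood(normal, mu, cov), 0.01)
is_anomaly = lambda x: log_likelihood(np.atleast_2d(x), mu, cov)[0] < tau
```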

  4. Sequential detection of web defects

    DOEpatents

    Eichel, Paul H.; Sleefe, Gerard E.; Stalker, K. Terry; Yee, Amy A.

    2001-01-01

    A system for detecting defects on a moving web having a sequential series of identical frames uses an imaging device to form a real-time camera image of a frame and a comparator to compare elements of the camera image with corresponding elements of an image of an exemplar frame. The comparator provides an acceptable indication if the pair of elements is determined to be statistically identical, and a defective indication if the pair is determined to be statistically not identical. If an element is neither acceptable nor defective, the comparator recursively compares the element of the exemplar frame with corresponding elements of other frames on the web until either an acceptable or a defective indication occurs.
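    The three-way decision logic can be sketched roughly as follows; the z-score bands here are illustrative assumptions standing in for the patent's statistical-identity test:

```python
def classify(x, ref, sigma, k_ok=1.0, k_bad=3.0):
    """Three-way decision for one image element: 'ok' if within
    k_ok*sigma of the exemplar element, 'defect' beyond
    k_bad*sigma, otherwise undecided."""
    z = abs(x - ref) / sigma
    if z <= k_ok:
        return 'ok'
    if z >= k_bad:
        return 'defect'
    return None  # undecided -> compare against further frames

def sequential_decision(x, refs, sigma):
    """Recursively compare an undecided element against the same
    element in other frames until a decision is reached."""
    for ref in refs:
        verdict = classify(x, ref, sigma)
        if verdict is not None:
            return verdict
    return 'defect'  # never matched any frame: treat as defective
```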

  5. Efficient Computer Network Anomaly Detection by Changepoint Detection Methods

    NASA Astrophysics Data System (ADS)

    Tartakovsky, Alexander G.; Polunchenko, Aleksey S.; Sokolov, Grigory

    2013-02-01

    We consider the problem of efficient on-line anomaly detection in computer network traffic. The problem is approached statistically, as one of sequential (quickest) changepoint detection. A multi-cyclic setting of quickest change detection is a natural fit for this problem. We propose a novel score-based multi-cyclic detection algorithm. The algorithm is based on the so-called Shiryaev-Roberts procedure. This procedure is as easy to employ in practice and as computationally inexpensive as the popular Cumulative Sum chart and the Exponentially Weighted Moving Average scheme. The likelihood-ratio-based Shiryaev-Roberts procedure has appealing optimality properties; in particular, it is exactly optimal in a multi-cyclic setting geared to detect a change occurring at a far time horizon. It is therefore expected that an intrusion detection algorithm based on the Shiryaev-Roberts procedure will perform better than other detection schemes. This is confirmed experimentally for real traces. We also discuss the possibility of complementing our anomaly detection algorithm with a spectral-signature intrusion detection system with false alarm filtering and true attack confirmation capability, so as to obtain a synergistic system.
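    A minimal sketch of the Shiryaev-Roberts recursion run alongside CUSUM for a known Gaussian mean shift; the parameters and alarm thresholds are illustrative, not those of the paper:

```python
import math

def sr_cusum(xs, mu0=0.0, mu1=1.0, sigma=1.0, A=1000.0, h=5.0):
    """Run the Shiryaev-Roberts and CUSUM procedures side by side
    on a stream, for known pre- and post-change Gaussian means.
    Returns the first alarm time of each (or None)."""
    R, W = 0.0, 0.0
    sr_alarm = cusum_alarm = None
    for n, x in enumerate(xs, 1):
        # log-likelihood ratio of one observation
        llr = (mu1 - mu0) / sigma**2 * (x - (mu0 + mu1) / 2.0)
        R = (1.0 + R) * math.exp(llr)   # SR: R_n = (1 + R_{n-1}) * LR_n
        W = max(0.0, W + llr)           # CUSUM recursion
        if sr_alarm is None and R >= A:
            sr_alarm = n
        if cusum_alarm is None and W >= h:
            cusum_alarm = n
    return sr_alarm, cusum_alarm
```

    Both statistics cost O(1) per observation, which is the computational inexpensiveness the abstract refers to.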

  6. Data Mining for Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Biswas, Gautam; Mack, Daniel; Mylaraswamy, Dinkar; Bharadwaj, Raj

    2013-01-01

    The Vehicle Integrated Prognostics Reasoner (VIPR) program describes methods for enhanced diagnostics as well as a prognostic extension to the current state-of-the-art Aircraft Diagnostic and Maintenance System (ADMS). VIPR introduced a new anomaly detection function for discovering previously undetected and undocumented situations where there are clear deviations from nominal behavior. Once a baseline (nominal model of operations) is established, the detection and analysis is split between on-aircraft outlier generation and off-aircraft expert analysis to characterize and classify events that may not have been anticipated by individual system providers. Offline expert analysis is supported by data curation and data mining algorithms that can be applied in both supervised and unsupervised learning contexts. In this report, we discuss efficient methods to implement the Kolmogorov complexity measure using compression algorithms, and run a systematic empirical analysis to determine the best compression measure. Our experiments established that the combination of the DZIP compression algorithm and the CiDM distance measure provides the best results for capturing relevant properties of time series data encountered in aircraft operations. This combination was used as the basis for developing an unsupervised learning algorithm to define "nominal" flight segments using historical flight segments.
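    Kolmogorov complexity is uncomputable, so in practice it is approximated by compressor output length. A hedged sketch of this idea, using Python's zlib as a stand-in for the DZIP compressor named in the report and the classic normalized compression distance in place of the report's CiDM measure:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: a compressor-based proxy
    for the (uncomputable) Kolmogorov-complexity distance."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Similar byte streams compress well together, so their NCD is
# smaller than that of unrelated streams.
a = b"abcabcabc" * 50
b_ = b"abcabcabc" * 50 + b"abcxyzabc"
c = bytes(range(256)) * 4
```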

  7. A New, Principled Approach to Anomaly Detection

    SciTech Connect

    Ferragut, Erik M; Laska, Jason A; Bridges, Robert A

    2012-01-01

    Intrusion detection is often described as having two main approaches: signature-based and anomaly-based. We argue that only unsupervised methods are suitable for detecting anomalies. However, there has been a tendency in the literature to conflate the notion of an anomaly with the notion of a malicious event. As a result, the methods used to discover anomalies have typically been ad hoc, making it nearly impossible to systematically compare between models or regulate the number of alerts. We propose a new, principled approach to anomaly detection that addresses the main shortcomings of ad hoc approaches. We provide both theoretical and cyber-specific examples to demonstrate the benefits of our more principled approach.

  8. Anomaly detection for internet surveillance

    NASA Astrophysics Data System (ADS)

    Bouma, Henri; Raaijmakers, Stephan; Halma, Arvid; Wedemeijer, Harry

    2012-06-01

    Many threats in the real world can be related to activity of persons on the internet. Internet surveillance aims to predict and prevent attacks and to assist in finding suspects based on information from the web. However, the amount of data on the internet rapidly increases and it is time consuming to monitor many websites. In this paper, we present a novel method to automatically monitor trends and find anomalies on the internet. The system was tested on Twitter data. The results showed that it can successfully recognize abnormal changes in activity or emotion.

  9. Analytic sequential methods for detecting network intrusions

    NASA Astrophysics Data System (ADS)

    Chen, Xinjia; Walker, Ernest

    2014-05-01

    In this paper, we propose analytic sequential methods for detecting port-scan attackers, which routinely perform random "portscans" of IP addresses to find vulnerable servers to compromise. In addition to rigorously controlling the probability of falsely implicating benign remote hosts as malicious, our method performs significantly faster than other current solutions. We have developed explicit formulae for quick determination of the parameters of the new detection algorithm.
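    The abstract does not reproduce the authors' formulae; as context, a minimal Wald SPRT on per-host connection outcomes, in the spirit of sequential port-scan detection (the benign/scanner failure rates and error targets below are illustrative assumptions):

```python
import math

def sprt(outcomes, p0=0.2, p1=0.8, alpha=0.01, beta=0.01):
    """Wald's SPRT on a host's connection outcomes (1 = failed
    probe). p0/p1: failure rates for benign vs. scanning hosts;
    alpha/beta: target false-implication and miss probabilities."""
    upper = math.log((1 - beta) / alpha)    # accept H1: scanner
    lower = math.log(beta / (1 - alpha))    # accept H0: benign
    s = 0.0
    for n, y in enumerate(outcomes, 1):
        s += math.log(p1 / p0) if y else math.log((1 - p1) / (1 - p0))
        if s >= upper:
            return 'scanner', n
        if s <= lower:
            return 'benign', n
    return 'undecided', len(outcomes)
```

    With these settings a run of four consecutive failed probes is enough to cross the upper boundary, so most hosts are classified after only a handful of observations.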

  10. Hyperspectral target detection using sequential approach

    NASA Astrophysics Data System (ADS)

    Haskett, Hanna T.; Sood, Arun K.; Habib, Mohammad K.

    1999-08-01

    This paper describes an automatic target detection algorithm based on the sequential multi-stage approach. Each stage of the algorithm uses more spectral bands than the previous stage. To ensure high probability of detection and low false alarm rate, Chebyshev's inequality test is applied. The sequential approach enables a significant reduction in computational time of a hyperspectral detection system. The Forest Radiance I database collected with the HYDICE hyperspectral sensor at the U.S. Army Proving Ground in Aberdeen, Maryland is utilized. Scenarios include targets in the open, with footprints of 1 m and different times of day. The total area coverage and the number of targets used in this evaluation are approximately 6 km2 and 126, respectively.
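    A hedged sketch of a Chebyshev-style first stage: the inequality P(|X − μ| ≥ kσ) ≤ 1/k² bounds the per-band false-alarm rate with no distributional assumptions, so strongly deviating pixels can be passed cheaply to later stages that use more spectral bands. The constant k is an illustrative choice:

```python
import numpy as np

def chebyshev_screen(pixels, k=4.0):
    """First-stage screen: keep pixels whose deviation from the
    scene mean exceeds k standard deviations in any band.
    Chebyshev's inequality guarantees a per-band false-alarm
    rate <= 1/k**2 with no distributional assumptions.
    pixels: (n_pixels, n_bands) array."""
    mu = pixels.mean(axis=0)
    sd = pixels.std(axis=0) + 1e-12
    z = np.abs(pixels - mu) / sd
    return np.where((z > k).any(axis=1))[0]  # candidate target indices
```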

  11. Anomaly Detection for Discrete Sequences: A Survey

    SciTech Connect

    Chandola, Varun; Banerjee, Arindam; Kumar, Vipin

    2012-01-01

    This survey attempts to provide a comprehensive and structured overview of the existing research for the problem of detecting anomalies in discrete/symbolic sequences. The objective is to provide a global understanding of the sequence anomaly detection problem and how existing techniques relate to each other. The key contribution of this survey is the classification of the existing research into three distinct categories, based on the problem formulation that they are trying to solve. These problem formulations are: 1) identifying anomalous sequences with respect to a database of normal sequences; 2) identifying an anomalous subsequence within a long sequence; and 3) identifying a pattern in a sequence whose frequency of occurrence is anomalous. We show how each of these problem formulations is characteristically distinct from each other and discuss their relevance in various application domains. We review techniques from many disparate and disconnected application domains that address each of these formulations. Within each problem formulation, we group techniques into categories based on the nature of the underlying algorithm. For each category, we provide a basic anomaly detection technique, and show how the existing techniques are variants of the basic technique. This approach shows how different techniques within a category are related or different from each other. Our categorization reveals new variants and combinations that have not been investigated before for anomaly detection. We also provide a discussion of relative strengths and weaknesses of different techniques. We show how techniques developed for one problem formulation can be adapted to solve a different formulation, thereby providing several novel adaptations to solve the different problem formulations. We also highlight the applicability of the techniques that handle discrete sequences to other related areas such as online anomaly detection and time series anomaly detection.

  12. Anomaly Detection Using Behavioral Approaches

    NASA Astrophysics Data System (ADS)

    Benferhat, Salem; Tabia, Karim

    Behavioral approaches, which represent normal/abnormal activities, have been widely used in recent years in intrusion detection and computer security. Nevertheless, most works have shown that they are ineffective for detecting novel attacks involving new behaviors. In this paper, we first study this recurring problem, due on the one hand to inadequate handling of anomalous and unusual audit events, and on the other to insufficient decision rules that do not meet behavioral approach objectives. We then propose to enhance the standard decision rules in order to fit behavioral approach requirements and better detect novel attacks. Experimental studies carried out on real and simulated http traffic show that these enhanced decision rules improve the detection of most novel attacks without triggering higher false alarm rates.

  13. Hyperspectral Anomaly Detection in Urban Scenarios

    NASA Astrophysics Data System (ADS)

    Rejas Ayuga, J. G.; Martínez Marín, R.; Marchamalo Sacristán, M.; Bonatti, J.; Ojeda, J. C.

    2016-06-01

    We have studied the spectral features of reflectance and emissivity in the pattern recognition of urban materials in several single hyperspectral scenes, through a comparative analysis of anomaly detection methods and their relationship with city surfaces, with the aim of improving information extraction processes. Spectral ranges of the visible-near infrared (VNIR), shortwave infrared (SWIR) and thermal infrared (TIR) from hyperspectral data cubes of the AHS sensor and of HyMAP and MASTER, for the two cities of Alcalá de Henares (Spain) and San José (Costa Rica) respectively, have been used. This research assumes no prior knowledge of the targets; thus, the pixels are automatically separated according to their spectral information, significantly differentiated with respect to a background, either globally for the full scene or locally by image segmentation. Several experiments on urban and semi-urban scenarios have been designed, analyzing the behaviour of the standard RX anomaly detector and of different methods based on subspace analysis, image projection and segmentation-based anomaly detection. A new technique for anomaly detection in hyperspectral data called DATB (Detector of Anomalies from Thermal Background), based on dimensionality reduction by projecting targets with unknown spectral signatures onto a background calculated from thermal spectrum wavelengths, is presented. First results and their consequences for unsupervised classification and information extraction processes are discussed.
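    For reference, the standard global RX detector used as the baseline above scores each pixel by the Mahalanobis distance of its spectrum to the scene background statistics; a compact NumPy version:

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly detector: Mahalanobis distance of every
    pixel spectrum to the scene background mean and covariance.
    cube: (rows, cols, bands) hyperspectral image."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(b)  # regularized
    inv = np.linalg.inv(cov)
    diff = X - mu
    d = np.einsum('ij,jk,ik->i', diff, inv, diff)
    return d.reshape(h, w)
```

    The local variants mentioned in the abstract replace the global mean and covariance with statistics estimated from a window or segment around each pixel.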

  14. Detecting data anomalies methods in distributed systems

    NASA Astrophysics Data System (ADS)

    Mosiej, Lukasz

    2009-06-01

    Distributed systems have become the most popular systems in large companies. Nowadays many telecommunications companies want to hold large volumes of data about all their customers. Those data cannot be stored in a single database because of many technical difficulties, such as data access efficiency and security. On the other hand, there is no need to hold all data in one place, because companies already have dedicated systems to perform specific tasks. In distributed systems there is redundancy of data, and each system holds only the data of interest to it, in an appropriate form. Data updated in one system should also be updated in the other systems that hold those data, but there are technical problems with updating those data in all systems transactionally. This article is about data anomalies in distributed systems. Available data anomaly detection methods are reviewed, and an initial concept for new data anomaly detection methods is described in the last section.

  15. Fusion and normalization to enhance anomaly detection

    NASA Astrophysics Data System (ADS)

    Mayer, R.; Atkinson, G.; Antoniades, J.; Baumback, M.; Chester, D.; Edwards, J.; Goldstein, A.; Haas, D.; Henderson, S.; Liu, L.

    2009-05-01

    This study examines normalizing the imagery and the optimization metrics to enhance anomaly and change detection, respectively. The RX algorithm, the standard anomaly detector for hyperspectral imagery, more successfully extracts bright rather than dark man-made objects when applied to visible hyperspectral imagery. Normalizing the imagery prior to applying the anomaly detector can help detect some of the problematic dark objects, but can also miss some bright objects. This study therefore jointly fuses images of RX applied to normalized and unnormalized imagery into a single decision surface. The technique was tested using imagery of commercial vehicles in an urban environment gathered by a hyperspectral visible/near-IR sensor mounted on an airborne platform. Combining detections first requires converting the detector output to a target probability. The observed anomaly detections were fitted with a linear combination of chi-square distributions, and these weights were used to help compute the target probability. Receiver Operating Characteristic (ROC) curves quantitatively assessed the target detection performance. The target detection performance is highly variable, depending on the relative numbers of candidate bright and dark targets and of false alarms, and is controlled in this study by using vegetation and street-line masks. The joint Boolean OR and AND operations likewise give variable performance depending on the scene. The joint SUM operation provides a reasonable compromise between the OR and AND operations and has good target detection performance. In addition, new transforms based on the normalized correlation coefficient and least squares, related to canonical correlation analysis (CCA) and normalized image regression (NIR), were derived; transforms based on CCA and NIR performed better than the standard approaches. Only RX detection applied to the unnormalized difference imagery provides adequate change detection performance.

  16. Hyperspectral anomaly detection using enhanced global factors

    NASA Astrophysics Data System (ADS)

    Paciencia, Todd J.; Bauer, Kenneth W.

    2016-05-01

    Dimension reduction techniques have become a popular unsupervised approach for detecting anomalies in hyperspectral imagery. Although they show promising results in the literature on specific images, these methods can be difficult to interpret directly and often require tuning of their parameters to achieve high performance on a specific set of images. This lack of generality is compounded by the need to remove noise and atmospheric-absorption spectral bands from the image prior to detection. Without a process for this band selection, and without making the methods adaptable to different image compositions, performance becomes difficult to maintain across a wider variety of images. Here, we present a framework that uses factor analysis to provide robust band selection and a more meaningful dimension reduction with which to detect anomalies in the imagery. Measurable characteristics of the image are used to create an automated decision process that allows the algorithm to adjust to a particular image while maintaining high detection performance. The framework and its algorithms are detailed, and results are shown for forest, desert, sea, rural, urban, anomaly-sparse, and anomaly-dense imagery types from different sensors. Additionally, the method is compared to current state-of-the-art methods and is shown to be computationally efficient.

  17. Anomaly Detection Techniques for Ad Hoc Networks

    ERIC Educational Resources Information Center

    Cai, Chaoli

    2009-01-01

    Anomaly detection is an important and indispensable aspect of any computer security mechanism. Ad hoc and mobile networks consist of a number of peer mobile nodes that are capable of communicating with each other absent a fixed infrastructure. Arbitrary node movements and lack of centralized control make them vulnerable to a wide variety of…

  18. Automatic detection of anomalies in Space Shuttle Main Engine turbopumps

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.; Whitehead, B. A.; Wu, Kewei

    1992-01-01

    A prototype expert system (developed on both PC and Symbolics 3670 lisp machine) for detecting anomalies in turbopump vibration data has been tested with data from ground tests 902-473, 902-501, 902-519, and 904-097 of the Space Shuttle Main Engine (SSME). The expert system has been utilized to analyze vibration data from each of the following SSME components: high-pressure oxidizer turbopump, high-pressure fuel turbopump, low-pressure fuel turbopump, and preburner boost pump. The expert system locates and classifies peaks in the power spectral density of each 0.4-sec window of steady-state data. Peaks representing the fundamental and harmonic frequencies of both shaft rotation and bearing cage rotation are identified by the expert system. Anomalies are then detected on the basis of sequential criteria and two threshold criteria set individually for the amplitude of each of these peaks: a prior threshold used during the first few windows of data in a test, and a posterior threshold used thereafter. In most cases the anomalies detected by the expert system agree with those reported by NASA. The two cases where there is significant disagreement will be further studied and the system design refined accordingly.

  19. Automatic detection of anomalies in Space Shuttle Main Engine turbopumps

    NASA Astrophysics Data System (ADS)

    Lo, Ching F.; Whitehead, B. A.; Wu, Kewei

    1992-07-01

    A prototype expert system (developed on both PC and Symbolics 3670 lisp machine) for detecting anomalies in turbopump vibration data has been tested with data from ground tests 902-473, 902-501, 902-519, and 904-097 of the Space Shuttle Main Engine (SSME). The expert system has been utilized to analyze vibration data from each of the following SSME components: high-pressure oxidizer turbopump, high-pressure fuel turbopump, low-pressure fuel turbopump, and preburner boost pump. The expert system locates and classifies peaks in the power spectral density of each 0.4-sec window of steady-state data. Peaks representing the fundamental and harmonic frequencies of both shaft rotation and bearing cage rotation are identified by the expert system. Anomalies are then detected on the basis of sequential criteria and two threshold criteria set individually for the amplitude of each of these peaks: a prior threshold used during the first few windows of data in a test, and a posterior threshold used thereafter. In most cases the anomalies detected by the expert system agree with those reported by NASA. The two cases where there is significant disagreement will be further studied and the system design refined accordingly.
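    A simplified, NumPy-only sketch of the windowed PSD-and-threshold idea described above; the window length, peak tolerance and thresholds are illustrative, and the expert system's actual sequential and prior/posterior threshold criteria are richer:

```python
import numpy as np

def psd(x, fs):
    """One-sided periodogram of a window of vibration data."""
    X = np.fft.rfft(x * np.hanning(len(x)))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, (np.abs(X) ** 2) / len(x)

def peak_amplitude(freqs, power, f0, tol=2.0):
    """Largest PSD value within +/- tol Hz of an expected line
    (e.g. the shaft-rotation fundamental or a harmonic)."""
    band = (freqs >= f0 - tol) & (freqs <= f0 + tol)
    return power[band].max()

def anomalous(freqs, power, f0, threshold):
    """Threshold criterion: the spectral line is anomalous when
    its amplitude exceeds the (prior or posterior) threshold."""
    return peak_amplitude(freqs, power, f0) > threshold
```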

  20. Sequential decision rules for failure detection

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.; Willsky, A. S.

    1981-01-01

    The formulation of the decision making of a failure detection process as a Bayes sequential decision problem (BSDP) provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is a useful one.

  1. Adaptive sequential methods for detecting network intrusions

    NASA Astrophysics Data System (ADS)

    Chen, Xinjia; Walker, Ernest

    2013-06-01

    In this paper, we propose new sequential methods for detecting port-scan attackers, which routinely perform random "portscans" of IP addresses to find vulnerable servers to compromise. In addition to rigorously controlling the probability of falsely implicating benign remote hosts as malicious, our method performs significantly faster than other current solutions. Moreover, our method guarantees that the maximum amount of observational time is bounded. In contrast to the previously most effective method, the Threshold Random Walk algorithm, which is explicit and analytical in nature, our proposed algorithm involves parameters to be determined by numerical methods. We have introduced computational techniques, such as iterative minimax optimization, for quick determination of the parameters of the new detection algorithm. A framework of multi-valued decisions for detecting portscanners and DoS attacks is also proposed.

  2. System and method for anomaly detection

    DOEpatents

    Scherrer, Chad

    2010-06-15

    A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrence as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.
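    A hedged sketch of scoring with frequency tables under an independence model, in the spirit of the patent's description; the record format, smoothing constant and score definition are illustrative assumptions, not the patented method:

```python
import math
from collections import Counter

def fit_tables(records):
    """Build per-variable frequency tables from categorical
    observations (each record is a tuple of discrete values)."""
    n = len(records)
    tables = [Counter(col) for col in zip(*records)]
    return tables, n

def score(record, tables, n, eps=0.5):
    """Anomaly score = -log p(record) under an independence
    model, with additive smoothing for unseen values."""
    logp = 0.0
    for value, table in zip(record, tables):
        p = (table.get(value, 0) + eps) / (n + eps * (len(table) + 1))
        logp += math.log(p)
    return -logp
```

    The counts in the tables can be updated incrementally as observations stream in, which is what makes this style of model practical for real-time network traffic.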

  3. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has a long history: it evolved from Wald's work in the late 1940s, was followed by Middleton's classic exposition in the 1960s, and was propelled by the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  4. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Candy, J V

    2008-12-08

    Sequential detection theory has a long history: it evolved from Wald's work in the late 1940s, was followed by Middleton's classic exposition in the 1960s, and was propelled by the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  5. Sequential detection of learning in cognitive diagnosis.

    PubMed

    Ye, Sangbeak; Fellouris, Georgios; Culpepper, Steven; Douglas, Jeff

    2016-05-01

    In order to look more closely at the many particular skills examinees utilize to answer items, cognitive diagnosis models have received much attention, and perhaps are preferable to item response models that ordinarily involve just one or a few broadly defined skills, when the objective is to hasten learning. If these fine-grained skills can be identified, a sharpened focus on learning and remediation can be achieved. The focus here is on how to detect when learning has taken place for a particular attribute and efficiently guide a student through a sequence of items to ultimately attain mastery of all attributes while administering as few items as possible. This can be seen as a problem in sequential change-point detection for which there is a long history and a well-developed literature. Though some ad hoc rules for determining learning may be used, such as stopping after M consecutive items have been successfully answered, more efficient methods that are optimal under various conditions are available. The CUSUM, Shiryaev-Roberts and Shiryaev procedures can dramatically reduce the time required to detect learning while maintaining rigorous Type I error control, and they are studied in this context through simulation. Future directions for modelling and detection of learning are discussed.
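    A minimal Bernoulli CUSUM for detecting the change from pre-learning to post-mastery accuracy on a single attribute; the accuracies and threshold below are illustrative, and a real calibration would set h for the desired Type I error:

```python
import math

def bernoulli_cusum(responses, p0=0.3, p1=0.8, h=3.0):
    """CUSUM on item responses (1 = correct) to detect the change
    from pre-learning accuracy p0 to post-mastery accuracy p1.
    Returns the first item index at which learning is declared,
    or None if the statistic never crosses the threshold h."""
    up = math.log(p1 / p0)                # LLR of a correct answer
    down = math.log((1 - p1) / (1 - p0))  # LLR of an incorrect one
    s = 0.0
    for n, y in enumerate(responses, 1):
        s = max(0.0, s + (up if y else down))
        if s >= h:
            return n
    return None
```

    Unlike the ad hoc "M consecutive correct" rule mentioned in the abstract, the CUSUM statistic accumulates evidence and resets at zero, so occasional slips before mastery do not force the count back to the start.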

  6. A model for anomaly classification in intrusion detection systems

    NASA Astrophysics Data System (ADS)

    Ferreira, V. O.; Galhardi, V. V.; Gonçalves, L. B. L.; Silva, R. C.; Cansian, A. M.

    2015-09-01

    Intrusion Detection Systems (IDS) are traditionally divided into two types according to the detection methods they employ, namely (i) misuse detection and (ii) anomaly detection. Anomaly detection has been widely used, and its main advantage is the ability to detect new attacks. However, the analysis of the anomalies generated can become expensive, since they often carry no clear information about the malicious events they represent. In this context, this paper presents a model for automated classification of alerts generated by an anomaly-based IDS. The main goal is either to classify the detected anomalies into well-defined attack taxonomies or to identify false positives misclassified by the IDS. Some common attacks on computer networks were considered, and we achieved important results that can equip security analysts with better resources for their analyses.

  7. Statistical Anomaly Detection for Monitoring of Human Dynamics

    NASA Astrophysics Data System (ADS)

    Kamiya, K.; Fuse, T.

    2015-05-01

    Understanding of human dynamics has drawn attention in various areas. Owing to the widespread availability of positioning technologies based on GPS or public Wi-Fi, location information can be obtained with high spatio-temporal resolution at low cost. By collecting sets of individual location information in real time, monitoring of human dynamics has recently become possible and is expected to lead to dynamic traffic control in the future. Although such monitoring focuses on detecting anomalous states of human dynamics, anomaly detection methods have been developed ad hoc and are not fully systematized. This research aims to define the anomaly detection problem of human dynamics monitoring with gridded population data and to develop an anomaly detection method based on that definition. Based on a comprehensive review, we discuss the characteristics of anomaly detection for human dynamics monitoring and categorize our problem as a semi-supervised anomaly detection problem of detecting contextual anomalies behind time-series data. We developed an anomaly detection method based on a sticky HDP-HMM, which is able to estimate the number of hidden states from the input data. Results of an experiment with synthetic data showed that the proposed method has good fundamental performance with respect to detection rate. In an experiment with real gridded population data, an anomaly was detected when and where an actual social event had occurred.

  8. Anomaly detection enhanced classification in computer intrusion detection

    SciTech Connect

    Fugate, M. L.; Gattiker, J. R.

    2002-01-01

    This report describes work with the goal of enhancing capabilities in computer intrusion detection. The work builds upon a study of classification performance that compared various methods of classifying information derived from computer network packets into attack versus normal categories, based on a labeled training dataset. This previous work validates our classification methods and clears the ground for studying whether and how anomaly detection can be used to enhance this performance. The DARPA project that initiated the dataset used here concluded that anomaly detection should be examined to boost the performance of machine learning in the computer intrusion detection task. This report investigates the data set for aspects that will be valuable for anomaly detection applications, and supports these results with models constructed from the data. In this report, the term anomaly detection means learning a model from unlabeled data and using this to make some inference about future data. Our data is a feature vector derived from network packets: an 'example' or 'sample'. Classification, on the other hand, means building a model from labeled data and using that model to classify unlabeled (future) examples. There is some precedent in the literature for combining these methods. One approach is to stage the two techniques, using anomaly detection to segment data into two sets for classification. One interpretation of this is as a method to combat nonstationarity in the data. In our previous work, we demonstrated that the data has substantial temporal nonstationarity. With classification methods that can be thought of as learning a decision surface between two statistical distributions, performance is expected to degrade significantly when classifying examples that are from regions not well represented in the training set. Anomaly detection can be seen as a problem of learning the density (landscape) or the support (boundary) of a statistical distribution so that

  9. An Immunity-Based Anomaly Detection System with Sensor Agents

    PubMed Central

    Okamoto, Takeshi; Ishida, Yoshiteru

    2009-01-01

    This paper proposes an immunity-based anomaly detection system with sensor agents based on the specificity and diversity of the immune system. Each agent is specialized to react to the behavior of a specific user. Multiple diverse agents decide whether the behavior is normal or abnormal. Conventional systems have used only a single sensor to detect anomalies, while the immunity-based system makes use of multiple sensors, which leads to improvements in detection accuracy. In addition, we propose an evaluation framework for the anomaly detection system, which is capable of evaluating the differences in detection accuracy between internal and external anomalies. This paper focuses on anomaly detection in users' command sequences on UNIX-like systems. In experiments, the immunity-based system outperformed some of the best conventional systems. PMID:22291560

  10. Automated Network Anomaly Detection with Learning, Control and Mitigation

    ERIC Educational Resources Information Center

    Ippoliti, Dennis

    2014-01-01

    Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…

  11. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    PubMed

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibit anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest-neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing with a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run the algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria and shows superior performance in experiments with synthetic and real data sets.
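    The Pareto-front peeling that underlies Pareto depth can be sketched as follows. This is a minimal illustration of non-dominated sorting applied to multicriteria dissimilarity vectors, not the authors' full PDA algorithm, and the dyad values are made up for the example:

```python
import numpy as np

def pareto_depths(points):
    """Assign each point the 1-based index of the Pareto front it lies on,
    by repeatedly peeling off non-dominated points (minimization in all
    criteria). Deeper fronts indicate larger multicriteria dissimilarity."""
    pts = np.asarray(points, dtype=float)
    depth = np.zeros(len(pts), dtype=int)
    remaining = np.arange(len(pts))
    d = 0
    while remaining.size:
        d += 1
        sub = pts[remaining]
        # A point is dominated if another point is <= in every criterion
        # and strictly < in at least one.
        dominated = np.array([
            any(np.all(q <= p) and np.any(q < p) for q in sub)
            for p in sub
        ])
        depth[remaining[~dominated]] = d
        remaining = remaining[dominated]
    return depth

# Two dissimilarity criteria for six dyads; anomalous samples tend to
# produce dyads on deeper fronts.
dyads = [(0.1, 0.9), (0.9, 0.1), (0.4, 0.4), (0.6, 0.5), (0.9, 0.9), (1.0, 1.0)]
print(pareto_depths(dyads))  # → [1 1 1 2 3 4]
```

    Scoring a test sample by the mean depth of its dyads avoids ever choosing weights for a linear scalarization, which is the core advantage the abstract claims for PDA.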

  12. Recent Advances in Ionospheric Anomalies detection

    NASA Astrophysics Data System (ADS)

    Titov, Anton; Khattatov, Vyacheslav

    2016-07-01

    The variability of ionospheric parameters and ionospheric anomalies is the subject of intensive research. Ionospheric disturbances caused by solar activity, the passage of the terminator, artificial heating of the high-latitude ionosphere, and seismic events are widely known and studied in the literature. Each of these types of anomalies is a subject of study and analysis in its own right. Analysis of these anomalies will provide an opportunity to improve our understanding of the mechanisms of ionospheric disturbances. To address this problem, we propose to develop a method of modeling the ionosphere based on the assimilation of large amounts of observational data.

  13. Credibility of anomaly detection in nuclear reactors using neural networks

    SciTech Connect

    Kozma, R.; Kitamura, Masaharu; Sakuma, M.; Hoogenboom, J.E.

    1994-12-31

    The detection of anomalies in nuclear reactors in an incipient phase is an important safety issue. Artificial neural network (ANN) models can be trained to identify various anomaly types and to associate the actual system state with one of the anomaly classes. ANNs have a clear advantage over the usual statistical methods in detecting anomalies at an early stage. This advantage becomes apparent in the case of short-term analysis, when the uncertainty of the statistical discriminators is large. In this paper, the signals generated by ANNs are analyzed from the viewpoint of the credibility of the network's judgement about the presence or absence of an anomaly in the system. The results have been applied to analyze the boiling anomaly induced in the Interfaculty Reactor Institute (IRI) research reactor in the Netherlands.

  14. Network Anomaly Detection System with Optimized DS Evidence Theory

    PubMed Central

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have applied fusion methods and Dempster-Shafer (DS) evidence theory to network anomaly detection, but with low performance, and without accounting for the complicated and varied nature of network features. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs each sensor's regression ability to address complex networks. Across four kinds of experiments, we find that our network anomaly detection model has a better detection rate, and that the RBPA and ODS optimization methods can improve system performance significantly. PMID:25254258

  15. Taming anomaly detection for industrial applications through spatial ponderation

    NASA Astrophysics Data System (ADS)

    Feller, Sebastian; Todorov, Yavor; Jaroszewski, Daniel; Chevalier, Roger

    2013-10-01

    In recent years, an abundance of applications have been developed for anomaly detection methods. Anomaly detection algorithms offer an easy and interpretable way to monitor the health state of virtually any technical system or industrial process that can be described by periodic measurements. But one major caveat remains: like all state-space methods, anomaly detection algorithms rely on measures of distance, and these distances are distorted by any kind of irregularity in the data. The introduction of a spatial ponderation promises to cure this illness, but no mathematical foundation has yet been built to support this intuition. In this paper, first steps towards a rigorous description of this approach are introduced.

  16. Fluorescence sensor for sequential detection of zinc and phosphate ions

    NASA Astrophysics Data System (ADS)

    An, Miran; Kim, Bo-Yeon; Seo, Hansol; Helal, Aasif; Kim, Hong-Seok

    2016-12-01

    A new, highly selective turn-on fluorescent chemosensor based on 2-(2′-tosylamidophenyl)thiazole (1) for the detection of zinc and phosphate ions in ethanol was synthesized and characterized. Sensor 1 showed a high selectivity for zinc over other cations and sequentially detected hydrogen pyrophosphate and hydrogen phosphate. The fluorescence response can be explained by two different mechanisms: (i) the inhibition of excited-state intramolecular proton transfer (ESIPT) and (ii) chelation-induced enhanced fluorescence upon binding with Zn2+. The sequential detection of phosphate anions was achieved by the quenching and subsequent revival of ESIPT.

  17. An enhanced stream mining approach for network anomaly detection

    NASA Astrophysics Data System (ADS)

    Bellaachia, Abdelghani; Bhatt, Rajat

    2005-03-01

    Network anomaly detection is one of the hot topics in the market today. Researchers are currently trying to find ways in which machines can automatically learn both normal and anomalous behavior and thus detect anomalies if and when they occur. The most important applications that could spring from such systems are intrusion detection and spam mail detection. In this paper, the primary focus is on the problem and solution of "real time" network intrusion detection, although the underlying theory discussed may be applied to other anomaly detection problems (such as spam or spyware detection) as well. Since a machine needs a learning process of its own, data mining has been chosen as the preferred technique. The object of this paper is to present a real-time clustering system, which we call Enhanced Stream Mining (ESM), that can analyze packet information (headers and data) to determine intrusions.

  18. Embedded GPU implementation of anomaly detection for hyperspectral images

    NASA Astrophysics Data System (ADS)

    Wu, Yuanfeng; Gao, Lianru; Zhang, Bing; Yang, Bin; Chen, Zhengchao

    2015-10-01

    Anomaly detection is one of the most important techniques for remotely sensed hyperspectral data interpretation. Developing fast processing techniques for anomaly detection has received considerable attention in recent years, especially in analysis scenarios with real-time constraints. In this paper, we develop an embedded GPU-based parallel implementation of a streaming background statistics anomaly detection algorithm. The streaming background statistics method can simulate real-time anomaly detection, in which processing is performed at the same time as the data are collected. The algorithm is implemented on an NVIDIA Jetson TK1 development kit. Experiments conducted with real hyperspectral data indicate the effectiveness of the proposed implementation. This work shows that embedded GPUs offer a promising solution for high-performance, low-power hyperspectral image applications.

  19. Statistical Studies on Sequential Probability Ratio Test for Radiation Detection

    SciTech Connect

    Warnick Kernan, Ding Yuan, et al.

    2007-07-01

    A Sequential Probability Ratio Test (SPRT) algorithm helps to increase the reliability and speed of radiation detection. The algorithm is further improved to reduce spatial gaps and false alarms. SPRT, using a Last-in-First-Elected-Last-Out (LIFELO) technique, reduces the error between the measured radiation and the resulting alarm. Statistical analysis determines the reduction in spatial error and false alarms.
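    A Wald SPRT for Poisson count data, of the kind used in radiation portal monitoring, can be sketched as follows. The background rate, source rate, and error targets are hypothetical, and this is the textbook test rather than the LIFELO-enhanced variant studied in the report:

```python
import math

def sprt_poisson(counts, lam0, lam1, alpha=0.01, beta=0.01):
    """Wald SPRT deciding between background rate lam0 and source rate
    lam1 (lam1 > lam0) from a stream of Poisson counts per interval.

    Returns ('source' | 'background' | 'undecided', intervals_used)."""
    a = math.log((1 - beta) / alpha)   # upper (alarm) boundary
    b = math.log(beta / (1 - alpha))   # lower (dismiss) boundary
    llr = 0.0
    for n, k in enumerate(counts, start=1):
        # Poisson log-likelihood ratio for one interval; the k! terms cancel.
        llr += k * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= a:
            return "source", n
        if llr <= b:
            return "background", n
    return "undecided", len(counts)

# Background of 2 counts/interval vs. a source raising it to 6.
print(sprt_poisson([7, 5, 8, 6], lam0=2.0, lam1=6.0))  # → ('source', 2)
```

    Unlike a fixed-sample test, the SPRT stops as soon as the evidence is decisive, which is what yields the speed gains the abstract describes.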

  20. Lidar detection algorithm for time and range anomalies.

    PubMed

    Ben-David, Avishai; Davidson, Charles E; Vanderbeek, Richard G

    2007-10-10

    A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly where the question "is a target (aerosol cloud) present at range R within time t(1) to t(2)" is addressed, and for range anomaly where the question "is a target present at time t within ranges R(1) and R(2)" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one over range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO(2) lidar measurements of bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed. PMID:17932542
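    The Gaussian-mixture step can be illustrated with a toy two-component, one-dimensional EM fit to detection scores. The data, seed, and initialization are illustrative stand-ins for the paper's lidar detection scores, not its actual measurements:

```python
import numpy as np

def fit_gmm2_em(x, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM, a toy stand-in for
    the paper's anomaly-present / anomaly-absent hypothesis model."""
    x = np.asarray(x, dtype=float)
    # Crude initialisation from the data's extremes and overall spread.
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each sample.
        pdf = w / np.sqrt(2 * np.pi * var) * np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return w, mu, var

rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0, 1, 500),   # background detection scores
                         rng.normal(6, 1, 50)])   # anomaly (target) scores
w, mu, var = fit_gmm2_em(scores)
print(sorted(mu))  # component means near 0 and 6
```

    A detection threshold can then be placed where the two weighted component densities cross, from which probabilities of detection and false alarm follow directly.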

  2. Lidar detection algorithm for time and range anomalies

    NASA Astrophysics Data System (ADS)

    Ben-David, Avishai; Davidson, Charles E.; Vanderbeek, Richard G.

    2007-10-01

    A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly where the question "is a target (aerosol cloud) present at range R within time t1 to t2" is addressed, and for range anomaly where the question "is a target present at time t within ranges R1 and R2" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one over range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO2 lidar measurements of bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed.

  3. Evaluation schemes for video and image anomaly detection algorithms

    NASA Astrophysics Data System (ADS)

    Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael

    2016-05-01

    Video anomaly detection is a critical research area in computer vision. It is a natural first step before applying object recognition algorithms. Many algorithms that detect anomalies (outliers) in videos and images have been introduced in recent years. However, these algorithms behave and perform differently depending on the domains and tasks to which they are applied. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability to a particular domain or task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. Many evaluation metrics have been used in the literature, such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is critical, since the choice can introduce positive or negative bias in the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies. We analyze the biases introduced by these choices by measuring the performance of an existing anomaly detection algorithm.
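    One metric from the list above, the area under the ROC curve, can be computed directly from anomaly scores via its rank-sum interpretation. The scores and labels below are made up for illustration:

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) form:
    the probability that a randomly chosen anomaly scores higher than a
    randomly chosen normal sample, counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Anomaly scores for 4 anomalous (1) and 4 normal (0) frames.
scores = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2, 0.6, 0.1]
labels = [1,   1,   1,   1,   0,   0,   0,   0]
print(roc_auc(scores, labels))  # → 0.9375
```

    Note that the AUC is computed from scores alone; the separate question of *when* a localized detection counts as a true positive is exactly the evaluation-scheme choice the paper examines.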

  4. Multiple-Instance Learning for Anomaly Detection in Digital Mammography.

    PubMed

    Quellec, Gwenole; Lamard, Mathieu; Cozic, Michel; Coatrieux, Gouenou; Cazuguel, Guy

    2016-07-01

    This paper describes a computer-aided detection and diagnosis system for breast cancer, the most common form of cancer among women, using mammography. The system relies on the Multiple-Instance Learning (MIL) paradigm, which has proven useful for medical decision support in previous works from our team. In the proposed framework, breasts are first partitioned adaptively into regions. Then, features derived from the detection of lesions (masses and microcalcifications) as well as textural features, are extracted from each region and combined in order to classify mammography examinations as "normal" or "abnormal". Whenever an abnormal examination record is detected, the regions that induced that automated diagnosis can be highlighted. Two strategies are evaluated to define this anomaly detector. In a first scenario, manual segmentations of lesions are used to train an SVM that assigns an anomaly index to each region; local anomaly indices are then combined into a global anomaly index. In a second scenario, the local and global anomaly detectors are trained simultaneously, without manual segmentations, using various MIL algorithms (DD, APR, mi-SVM, MI-SVM and MILBoost). Experiments on the DDSM dataset show that the second approach, which is only weakly-supervised, surprisingly outperforms the first approach, even though it is strongly-supervised. This suggests that anomaly detectors can be advantageously trained on large medical image archives, without the need for manual segmentation. PMID:26829783

  5. Anomaly Detection in Power Quality at Data Centers

    NASA Technical Reports Server (NTRS)

    Grichine, Art; Solano, Wanda M.

    2015-01-01

    The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
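    A minimal sketch of the Gaussian set-point idea, assuming the monitored quantity is approximately normal. The voltage readings and the k = 3 multiplier are hypothetical, not NCCIPS data:

```python
import statistics

def gaussian_setpoints(samples, k=3.0):
    """Compute lower/upper alarm set-points as mean ± k·sigma, assuming the
    monitored quantity (voltage, current, THD) is approximately Gaussian."""
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return mu - k * sigma, mu + k * sigma

def is_anomalous(value, setpoints):
    lo, hi = setpoints
    return value < lo or value > hi

# Nominal 120 V readings with small fluctuations (hypothetical data).
readings = [119.8, 120.1, 120.0, 119.9, 120.2, 120.0, 119.7, 120.3]
sp = gaussian_setpoints(readings, k=3.0)
print(is_anomalous(118.0, sp), is_anomalous(120.1, sp))  # → True False
```

    Under the Gaussian assumption, k = 3 keeps the expected false-positive rate near 0.3%, which is how such set-points trade sensitivity against nuisance alarms.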

  6. Applications of TOPS Anomaly Detection Framework to Amazon Drought Analysis

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Ganguly, S.; Michaelis, A.; Hashimoto, H.

    2011-12-01

    Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending TOPS to include capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. While there are large numbers of anomaly detection algorithms for multivariate datasets, we are extending this capability beyond anomaly detection itself towards an automated analysis that would discover the possible causes of the anomalies. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. The knowledge is captured using the OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. We have integrated this knowledge base with a framework for deploying an ensemble of anomaly detection algorithms on large volumes of Earth science datasets and applied it to specific scientific applications that support research conducted by our group. In one early application, we were able to process a large number of MODIS, TRMM, and CERES data sets along with ground-based weather and river-flow observations to detect the evolution of the 2010 drought in the Amazon, identify the affected area, and publish the results in three weeks. A similar analysis of the 2005 drought using the same data sets took nearly 2 years, highlighting the potential contribution of our anomaly framework in accelerating scientific discoveries.

  7. Sequential Detection of Fission Processes for Harbor Defense

    SciTech Connect

    Candy, J V; Walston, S E; Chambers, D H

    2015-02-12

    With the large increase in terrorist activities throughout the world, the timely and accurate detection of special nuclear material (SNM) has become an extremely high priority for many countries concerned with national security. The detection of radionuclide contraband based on γ-ray emissions has been attacked vigorously with some interesting and feasible results; however, the fission process of SNM has not received as much attention due to its inherent complexity and required predictive nature. In this paper, on-line sequential Bayesian techniques for detection and parameter estimation are developed to rapidly and reliably detect unknown fissioning sources with high statistical confidence.

  8. Visual analytics of anomaly detection in large data streams

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.; Sharma, Ratnesh K.; Mehta, Abhay

    2009-01-01

    Most data streams are multi-dimensional, high-speed, and contain massive volumes of continuous information. They are seen in daily applications, such as telephone calls, retail sales, data center performance, and oil production operations. Many analysts want insight into the behavior of this data. They want to catch exceptions in flight to reveal the causes of the anomalies and to take immediate action. To guide the user in finding anomalies in a large data stream quickly, we derive a new automated neighborhood threshold marking technique, called AnomalyMarker. This technique is built on cell-based data streams and user-defined thresholds. We extend the scope of the data points around the threshold to include the surrounding areas. The idea is to define a focus area (marked area) that enables users to (1) visually group the interesting data points related to the anomalies (i.e., problems that occur persistently or occasionally) in order to observe their behavior; and (2) discover the factors related to an anomaly by visualizing the correlations between the problem attribute and the attributes of nearby data items from the entire multi-dimensional data stream. Mining results are quickly presented in graphical representations (i.e., tooltips) for the user to zoom into the problem regions. Different algorithms are introduced that try to optimize the size and extent of the anomaly markers. We have successfully applied this technique to detect data stream anomalies in large real-world enterprise server performance and data center energy management.
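    The neighborhood (focus-area) marking idea can be sketched on a cell-based grid. The grid values, threshold, and radius are illustrative, and this is a toy sketch rather than the authors' AnomalyMarker implementation:

```python
def mark_anomalies(grid, threshold, radius=1):
    """Mark cells whose value exceeds `threshold`, plus the surrounding
    neighbourhood within `radius` cells, forming a focus (marked) area
    so nearby behavior can be inspected alongside the anomaly itself."""
    rows, cols = len(grid), len(grid[0])
    marked = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold:
                for dr in range(-radius, radius + 1):
                    for dc in range(-radius, radius + 1):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols:
                            marked[rr][cc] = True
    return marked

grid = [
    [1, 1, 1, 1],
    [1, 9, 1, 1],   # one cell far above the threshold
    [1, 1, 1, 1],
]
m = mark_anomalies(grid, threshold=5)
print(sum(row.count(True) for row in m))  # → 9 cells in the focus area
```

    Marking the neighborhood rather than the single offending cell is what lets an analyst see correlated attributes around the anomaly, as the abstract describes.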

  9. Anomaly Detection In Additively Manufactured Parts Using Laser Doppler Vibrometery

    SciTech Connect

    Hernandez, Carlos A.

    2015-09-29

    Additively manufactured parts are susceptible to non-uniform structure caused by the unique manufacturing process. This can lead to structural weakness or catastrophic failure. Using laser Doppler vibrometry and frequency response analysis, non-contact detection of anomalies in additively manufactured parts may be possible. Preliminary tests show promise for small scale detection, but more future work is necessary.

  10. Anomalies.

    ERIC Educational Resources Information Center

    Online-Offline, 1999

    1999-01-01

    This theme issue on anomalies includes Web sites, CD-ROMs and software, videos, books, and additional resources for elementary and junior high school students. Pertinent activities are suggested, and sidebars discuss UFOs, animal anomalies, and anomalies from nature; and resources covering unexplained phenomena like crop circles, Easter Island,…

  11. A hybrid approach for efficient anomaly detection using metaheuristic methods

    PubMed Central

    Ghanem, Tamer F.; Elkilani, Wail S.; Abdul-kader, Hatem M.

    2014-01-01

    Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation, yet the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated by a multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative selection-based detector generation. The evaluation of this approach is performed on the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors, with an accuracy of 96.1% compared with competing machine learning algorithms. PMID:26199752
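    The negative-selection inspiration can be sketched as follows: random candidate detectors are kept only if they fail to match any 'self' (normal) sample. This toy version uses Euclidean matching in a 2-D feature space and is not the paper's multi-start metaheuristic; the cluster layout and radii are made up:

```python
import random

def generate_detectors(self_set, n_detectors, r, dim, rng, lo=0.0, hi=1.0):
    """Negative-selection detector generation: random candidate points are
    kept only if they lie farther than radius r (Euclidean) from every
    'self' sample, so detectors cover only non-self space."""
    detectors = []
    while len(detectors) < n_detectors:
        cand = [rng.uniform(lo, hi) for _ in range(dim)]
        if all(sum((a - b) ** 2 for a, b in zip(cand, s)) ** 0.5 > r
               for s in self_set):
            detectors.append(cand)
    return detectors

def is_anomaly(x, detectors, r):
    """A sample is anomalous if any detector matches it within radius r."""
    return any(sum((a - b) ** 2 for a, b in zip(x, d)) ** 0.5 <= r
               for d in detectors)

rng = random.Random(42)
# 'Self' (normal) traffic clusters near the origin of a 2-D feature space.
self_set = [[rng.uniform(0.0, 0.3), rng.uniform(0.0, 0.3)] for _ in range(50)]
detectors = generate_detectors(self_set, n_detectors=100, r=0.15, dim=2, rng=rng)
print(is_anomaly(self_set[0], detectors, r=0.15))  # → False: self never matches
```

    By construction no self sample can ever match a detector, so false positives on the training self-set are zero; the detector count and radius govern how much non-self space is covered.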

  12. Identifying Threats Using Graph-based Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Eberle, William; Holder, Lawrence; Cook, Diane

    Much of the data collected during the monitoring of cyber and other infrastructures is structural in nature, consisting of various types of entities and relationships between them. The detection of threatening anomalies in such data is crucial to protecting these infrastructures. We present an approach to detecting anomalies in a graph-based representation of such data that explicitly represents these entities and relationships. The approach consists of first finding normative patterns in the data using graph-based data mining and then searching for small, unexpected deviations from these normative patterns, assuming illicit behavior tries to mimic legitimate, normative behavior. The approach is evaluated using several synthetic and real-world datasets. Results show that the approach has high true-positive rates and low false-positive rates, and is capable of detecting complex structural anomalies in real-world domains including email communications, cell phone calls, and network traffic.

  13. A hybrid approach for efficient anomaly detection using metaheuristic methods.

    PubMed

    Ghanem, Tamer F; Elkilani, Wail S; Abdul-Kader, Hatem M

    2015-07-01

    Network intrusion detection based on anomaly detection techniques plays a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation, yet the reported literature has not studied the use of the multi-start metaheuristic method for this purpose. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated with the multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative selection-based detector generation. The approach is evaluated using the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1%, compared to competing machine learning algorithms. PMID:26199752
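
    The negative-selection idea this abstract draws on can be sketched in a few lines: generate random candidate detectors and keep only those that do not match any "self" (normal) sample; a new observation matched by any surviving detector is flagged as non-self. The data, radii and counts below are illustrative toys, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Self" samples: normal traffic features, scaled to [0, 1] (toy data).
self_samples = rng.uniform(0.4, 0.6, size=(200, 2))

def generate_detectors(self_samples, n_detectors=50, self_radius=0.1, max_tries=10000):
    """Negative selection: keep random candidates far from all self samples."""
    detectors = []
    for _ in range(max_tries):
        if len(detectors) >= n_detectors:
            break
        candidate = rng.uniform(0.0, 1.0, size=self_samples.shape[1])
        # Reject candidates that match (lie near) any self sample.
        if np.min(np.linalg.norm(self_samples - candidate, axis=1)) > self_radius:
            detectors.append(candidate)
    return np.array(detectors)

def is_anomalous(x, detectors, match_radius=0.1):
    """A sample matched by any detector is flagged as non-self."""
    return bool(np.min(np.linalg.norm(detectors - x, axis=1)) <= match_radius)

detectors = generate_detectors(self_samples)
print(is_anomalous(np.array([0.5, 0.5]), detectors))    # centre of the self region
print(is_anomalous(np.array([0.05, 0.95]), detectors))  # far corner of feature space
```

The paper's contribution is in how the detectors are generated (multi-start search plus genetic algorithms rather than pure rejection sampling); the matching logic stays the same.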

  14. Firefly Algorithm in detection of TEC seismo-ionospheric anomalies

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, Mehdi

    2015-07-01

    Anomaly detection in time series of different earthquake precursors is an essential step toward creating an early warning system with acceptable uncertainty. Since these time series are often nonlinear, complex and massive, the applied predictor method should be able to detect discord patterns in a large dataset in a short time. This study presents the Firefly Algorithm (FA) as a simple and robust predictor for detecting TEC (Total Electron Content) seismo-ionospheric anomalies around the times of some powerful earthquakes, including Chile (27 February 2010), Varzeghan (11 August 2012) and Saravan (16 April 2013). Outstanding anomalies were observed 7 and 5 days before the Chile and Varzeghan earthquakes, respectively, and 3 and 8 days prior to the Saravan earthquake.

  15. Anomaly Detection Based on Sensor Data in Petroleum Industry Applications

    PubMed Central

    Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra

    2015-01-01

    Anomaly detection is the problem of finding patterns in data that do not conform to a priori expected behavior. It is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset; such anomalous samples are called outliers. Anomaly detection has recently attracted the attention of the research community because of its relevance in real-world applications like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation, but in either case it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one application context where these problems are present. The correct detection of such unusual information empowers the decision maker to act on the system in order to avoid, correct or react to the associated situations. In this context, heavy extraction machines for pumping and generation operations, such as turbomachines, are each intensively monitored by hundreds of sensors that send measurements at high frequency for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel fast and high-quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal addresses this task while coping with the lack of labeled training data. We perform a series of empirical studies comparing our approach to other methods on benchmark problems and on a real-life application involving oil platform turbomachinery anomaly detection. PMID:25633599

  16. Anomaly detection based on sensor data in petroleum industry applications.

    PubMed

    Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra

    2015-01-27

    Anomaly detection is the problem of finding patterns in data that do not conform to a priori expected behavior. It is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset; such anomalous samples are called outliers. Anomaly detection has recently attracted the attention of the research community because of its relevance in real-world applications like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation, but in either case it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one application context where these problems are present. The correct detection of such unusual information empowers the decision maker to act on the system in order to avoid, correct or react to the associated situations. In this context, heavy extraction machines for pumping and generation operations, such as turbomachines, are each intensively monitored by hundreds of sensors that send measurements at high frequency for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel fast and high-quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal addresses this task while coping with the lack of labeled training data. We perform a series of empirical studies comparing our approach to other methods on benchmark problems and on a real-life application involving oil platform turbomachinery anomaly detection.
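
    The segment-then-score pipeline described here can be illustrated on a toy sensor trace. The sketch below uses fixed-length segmentation and a robust z-score in place of YASA and the one-class SVM (both stand-ins are labeled as such); the signal, fault and parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 8 * np.pi, 400)
y = np.sin(t) + 0.05 * rng.standard_normal(t.size)  # toy turbomachine sensor
y[300:320] += 5.0                                   # injected step fault

def segment_features(y, seg_len=40):
    """Fixed-length segmentation (a crude stand-in for YASA) followed by
    per-segment features that a one-class model would then score."""
    segs = y.reshape(-1, seg_len)
    return np.c_[segs.mean(axis=1), segs.std(axis=1)]

feats = segment_features(y)
# One-class scoring stand-in: robust z-score of each segment's features
# against the median behaviour of all segments.
med = np.median(feats, axis=0)
mad = np.median(np.abs(feats - med), axis=0) + 1e-9
score = np.abs((feats - med) / mad).max(axis=1)
print(int(np.argmax(score)))  # index of the flagged segment
```

Segment 7 (samples 280-319) contains the injected fault and receives by far the highest score; the paper's actual pipeline replaces both stand-ins with adaptive segmentation and a trained one-class SVM.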

  17. Anomaly detection using classified eigenblocks in GPR image

    NASA Astrophysics Data System (ADS)

    Kim, Min Ju; Kim, Seong Dae; Lee, Seung-eui

    2016-05-01

    Automatic landmine detection systems using ground penetrating radar have been widely researched. For such systems, processing speed is an important factor. Many detection techniques have been developed on statistical foundations; among them, techniques employing Principal Component Analysis (PCA) have been used for clutter reduction and anomaly detection. However, the PCA technique can slow the entire process because of the large basis dimension and the large number of inner product operations required. To overcome this problem, we propose a fast anomaly detection system using the 2D DCT and PCA. Our experiments use a set of data obtained from a test site where anti-tank and anti-personnel mines are buried. We evaluate the proposed system in terms of the ROC curve. The results show that the proposed system performs much better than conventional PCA systems in terms of speed and false alarm rate.
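
    The speed-up idea (compress each patch with a 2D DCT before PCA) can be sketched on synthetic patches. Everything below is illustrative: random noise stands in for GPR clutter, a bright blob stands in for a mine response, and the anomaly score is PCA reconstruction error in the DCT feature space.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    d = np.cos(np.pi * (2 * i + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    d[0] /= np.sqrt(2.0)
    return d

def patch_features(patches, keep=4):
    """2-D DCT of each patch, keeping only the top-left (low-frequency)
    keep x keep coefficients -- the dimensionality-reduction step."""
    n = patches.shape[1]
    d = dct_matrix(n)
    coeffs = np.einsum('ij,pjk,lk->pil', d, patches, d)  # D X D^T per patch
    return coeffs[:, :keep, :keep].reshape(len(patches), -1)

def pca_scores(train, test, n_components=3):
    """Anomaly score = reconstruction error in a PCA subspace fitted
    on clutter-only (background) features."""
    mu = train.mean(axis=0)
    u, s, vt = np.linalg.svd(train - mu, full_matrices=False)
    basis = vt[:n_components]
    resid = (test - mu) - ((test - mu) @ basis.T) @ basis
    return np.linalg.norm(resid, axis=1)

rng = np.random.default_rng(2)
background = rng.normal(0, 1, size=(100, 8, 8))  # clutter patches
target = background[:1].copy()
target[0, 2:6, 2:6] += 6.0                       # buried-object-like blob
scores = pca_scores(patch_features(background),
                    patch_features(np.concatenate([background[:5], target])))
print(scores.round(2))
```

With 8x8 patches, keeping a 4x4 DCT corner shrinks each feature vector from 64 to 16 values, which is where the inner-product savings over full-dimensional PCA come from.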

  18. Profile-based adaptive anomaly detection for network security.

    SciTech Connect

    Zhang, Pengchu C. (Sandia National Laboratories, Albuquerque, NM); Durgin, Nancy Ann

    2005-11-01

    As information systems become increasingly complex and pervasive, they become inextricably intertwined with the critical infrastructure of national, public, and private organizations. The problem of recognizing and evaluating threats against these complex, heterogeneous networks of cyber and physical components is a difficult one, yet a solution is vital to ensuring security. In this paper we investigate profile-based anomaly detection techniques that can be used to address this problem. We focus primarily on the area of network anomaly detection, but the approach could be extended to other problem domains. We investigate using several data analysis techniques to create profiles of network hosts and perform anomaly detection using those profiles. The "profiles" reduce multi-dimensional vectors representing "normal behavior" into fewer dimensions, thus allowing pattern and cluster discovery. New events are compared against the profiles, producing a quantitative measure of how "anomalous" the event is. Most network intrusion detection systems (IDSs) detect malicious behavior by searching for known patterns in the network traffic. This approach suffers from several weaknesses, including a lack of generalizability, an inability to detect stealthy or novel attacks, and a lack of flexibility regarding alarm thresholds. Our research focuses on enhancing current IDS capabilities by addressing some of these shortcomings. We identify and evaluate promising techniques for data mining and machine learning. The algorithms are "trained" by providing them with a series of data points from "normal" network traffic. A successful algorithm can be trained automatically and efficiently, will have a low error rate (low false alarm and miss rates), and will be able to identify anomalies in "pseudo real-time" (i.e., while the intrusion is still in progress, rather than after the fact). We also build a prototype anomaly detection tool that demonstrates how the techniques might

  19. Attention focusing and anomaly detection in systems monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, Richard J.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. The focus of this paper is a new technique for attention focusing. The technique involves reasoning about the distance between two frequency distributions, and is used to detect both anomalous system parameters and 'broken' causal dependencies. These two forms of information together isolate the locus of anomalous behavior in the system being monitored.
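
    The core operation here, measuring the distance between two frequency distributions of a sensor's readings, can be sketched with histograms and a symmetrised KL divergence. The choice of divergence, bin count and toy data are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def hist_distance(a, b, bins=20, value_range=(-5, 5)):
    """Symmetrised KL divergence between the empirical frequency
    distributions of two observation windows."""
    pa, _ = np.histogram(a, bins=bins, range=value_range)
    pb, _ = np.histogram(b, bins=bins, range=value_range)
    pa = pa / pa.sum() + 1e-9  # normalise; epsilon avoids log(0)
    pb = pb / pb.sum() + 1e-9
    return float(np.sum((pa - pb) * np.log(pa / pb)))

rng = np.random.default_rng(3)
nominal = rng.normal(0.0, 1.0, 5000)   # reference window for a parameter
same = rng.normal(0.0, 1.0, 5000)      # later window, still nominal
shifted = rng.normal(2.0, 1.0, 5000)   # later window, anomalous behaviour

print(hist_distance(nominal, same) < hist_distance(nominal, shifted))
```

A parameter whose current-window distribution is far from its reference distribution is flagged; comparing distributions across causally linked parameters gives the "broken dependency" signal the abstract mentions.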

  20. Locality-constrained anomaly detection for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Liu, Jiabin; Li, Wei; Du, Qian; Liu, Kui

    2015-12-01

    Detecting a target with low occurrence probability against an unknown background in a hyperspectral image, namely anomaly detection, is of practical significance. The Reed-Xiaoli (RX) algorithm is considered a classic anomaly detector; it calculates the Mahalanobis distance between the local background and the pixel under test. Local RX, an adaptive RX detector, employs a dual-window strategy, treating the pixels within the frame between the inner and outer windows as the local background. However, the detector is sensitive when this local region contains anomalous pixels (i.e., outliers). In this paper, a locality-constrained anomaly detector is proposed to remove outliers from the local background region before employing the RX algorithm. Specifically, a local linear representation is designed to exploit the internal relationship between linearly correlated pixels in the local background region and the pixel under test and its neighbors. Experimental results demonstrate that the proposed detector improves on the original local RX algorithm.
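
    The problem and the fix can be sketched numerically: an RX score (Mahalanobis distance to the background statistics) drops when outliers contaminate the background window, and recovers after purging them. Here a simple distance-based trim stands in for the paper's local-linear-representation step, and the spectra are synthetic.

```python
import numpy as np

def rx_scores(pixels, background):
    """Mahalanobis distance of each pixel from the background statistics
    (the core of the RX detector)."""
    mu = background.mean(axis=0)
    cov = np.cov(background, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))
    d = pixels - mu
    return np.einsum('ij,jk,ik->i', d, cov_inv, d)

def trimmed_background(background, trim=0.1):
    """Crude outlier purging before RX: drop the fraction of background
    pixels farthest from the spectral mean (a stand-in for the paper's
    locality-constrained step)."""
    d = np.linalg.norm(background - background.mean(axis=0), axis=1)
    keep = d <= np.quantile(d, 1 - trim)
    return background[keep]

rng = np.random.default_rng(4)
bg = rng.normal(0, 1, size=(500, 10))  # local background spectra (10 bands)
bg[:5] += 8.0                          # outliers contaminating the window
anomaly = np.full((1, 10), 8.0)        # pixel under test

contaminated = rx_scores(anomaly, bg)[0]
purged = rx_scores(anomaly, trimmed_background(bg))[0]
print(purged > contaminated)  # purging restores the anomaly's high score
```

The contaminating outliers inflate the background covariance along the anomaly's own spectral direction, which is exactly why local RX under-scores anomalies until the window is cleaned.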

  1. The use of Compton scattering in detecting anomaly in soil-possible use in pyromaterial detection

    NASA Astrophysics Data System (ADS)

    Abedin, Ahmad Firdaus Zainal; Ibrahim, Noorddin; Zabidi, Noriza Ahmad; Demon, Siti Zulaikha Ngah

    2016-01-01

    Compton scattering can provide a signature for landmine detection based on the dependence of the scattered photon flux on density anomalies and on the energy change of the scattered photons. In this study, 4.43 MeV gammas from an Am-Be source were used to produce Compton scattering. Two thallium-doped sodium iodide NaI(Tl) detectors, each with a radius of 1.9 cm, were placed 8 cm from the source to detect the gamma rays. Nine anomalies were used in this simulation, each a cylinder with a radius of 10 cm and a height of 8.9 cm, buried 5 cm deep in a soil bed measuring 80 cm in radius and 53.5 cm in height. Monte Carlo simulations indicated that the photon scattering is directly proportional to the density of the anomalies. The difference between the detector response with and without an anomaly, namely the contrast ratio, is linearly related to the anomaly density. Anomalies of air, wood and water give positive contrast ratio values, whereas explosive, sand, concrete, graphite, limestone and polyethylene give negative values. Overall, the contrast ratio magnitudes are greater than 2% for all anomalies. The strong contrast ratios result in good detection capability and good discrimination between anomalies.

  2. Solar cell anomaly detection method and apparatus

    NASA Technical Reports Server (NTRS)

    Miller, Emmett L. (Inventor); Shumka, Alex (Inventor); Gauthier, Michael K. (Inventor)

    1981-01-01

    A method is provided for detecting cracks and other imperfections in a solar cell, which includes scanning a narrow light beam back and forth across the cell in a raster pattern, while monitoring the electrical output of the cell to find locations where the electrical output varies significantly. The electrical output can be monitored on a television type screen containing a raster pattern with each point on the screen corresponding to a point on the solar cell surface, and with the brightness of each point on the screen corresponding to the electrical output from the cell which was produced when the light beam was at the corresponding point on the cell. The technique can be utilized to scan a large array of interconnected solar cells, to determine which ones are defective.

  3. A spring window for geobotanical anomaly detection

    NASA Technical Reports Server (NTRS)

    Bell, R.; Labovitz, M. L.; Masuoka, E. J.

    1985-01-01

    The observation of senescence of deciduous vegetation to detect soil heavy-metal mineralization is discussed. A gridded sampling of two sites of Quercus alba L. in south-central Virginia in 1982 is studied. The data reveal that smaller leaf blade lengths are observed at the site whose soil has elevated copper, lead, and zinc concentrations. A 1983 random-sampling study of the red and white oaks Q. rubra L. and Q. prinus L. and of Acer rubrum L., undertaken to confirm the previous results, is described. The observations of blade length and bud break show a 7-10 day lag in growth at the mineralized site for the oak trees; the maple trees, however, are not influenced by the mineralization.

  4. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time, model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the measurement is compared to expected operation data for that location to produce a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of that error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
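
    The three-stage pipeline this patent describes (modeling error, then significance, then persistence) reduces to a short loop. The sketch below is a hedged reading of that pipeline on toy data; the threshold values and the sigma are invented for illustration.

```python
import numpy as np

def detect_anomaly(measured, expected, sigma=0.5, z_thresh=3.0, persistence=5):
    """Modeling error -> statistical significance -> persistence check."""
    error = measured - expected            # modeling error signal
    z = np.abs(error) / sigma              # error significance (z-score)
    run = 0
    for zi in z:                           # persistence of the significance
        run = run + 1 if zi > z_thresh else 0
        if run >= persistence:
            return True                    # structural anomaly indicated
    return False

expected = np.zeros(100)                   # model-predicted response
nominal = 0.3 * np.random.default_rng(5).standard_normal(100)
faulted = nominal.copy()
faulted[60:80] += 4.0                      # sustained structural deviation

print(detect_anomaly(nominal, expected), detect_anomaly(faulted, expected))
```

The persistence requirement is what separates a sustained structural change from a single noisy spike: one large sample resets to zero at the next in-bounds reading, so only runs of significant errors trigger the indication.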

  5. Extending TOPS: Knowledge Management System for Anomaly Detection and Analysis

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Michaelis, A.

    2009-12-01

    Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending TOPS to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. While there are large numbers of anomaly detection algorithms for multivariate datasets, we are extending this capability beyond anomaly detection itself toward an automated analysis that discovers the possible causes of the anomalies. There are often indirect connections between datasets that manifest themselves during external events, and rather than searching exhaustively throughout all the datasets, our goal is to capture this knowledge and provide it to the system during automated analysis. This results in more efficient processing, since we do not need to run the often compute-intensive anomaly detection algorithms over every dataset; it also yields data reduction, since instead of storing all the datasets while searching for possible connections, we can download selected data on demand based on our analysis. For example, an anomaly observed in vegetation Net Primary Production (NPP) can relate to an anomaly in vegetation Leaf Area Index (LAI), which is a fairly direct connection, as LAI is one of the inputs to NPP; however, the change in LAI could be caused by a fire event, which is not directly connected with NPP. Because we are able to capture this knowledge, we can analyze fire datasets and, if there is a match with the NPP anomaly, infer that a fire is a likely cause. The knowledge is captured using the OWL ontology language, where connections are defined in a schema

  6. Maintaining defender's reputation in anomaly detection against insider attacks.

    PubMed

    Zhang, Nan; Yu, Wei; Fu, Xinwen; Das, Sajal K

    2010-06-01

    We address issues related to establishing a defender's reputation in anomaly detection against two types of attackers: 1) smart insiders, who learn from historic attacks and adapt their strategies to avoid detection/punishment, and 2) naïve attackers, who blindly launch their attacks without knowledge of the history. In this paper, we propose two novel algorithms for reputation establishment--one for systems solely consisting of smart insiders and the other for systems in which both smart insiders and naïve attackers are present. The theoretical analysis and performance evaluation show that our reputation-establishment algorithms can significantly improve the performance of anomaly detection against insider attacks in terms of the tradeoff between detection and false positives.

  7. Gaussian Process for Activity Modeling and Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Liao, W.; Rosenhahn, B.; Yang, M. Ying

    2015-08-01

    Complex activity modeling and identification of anomalies is one of the most interesting and desired capabilities for automated video behavior analysis. A number of different approaches have been proposed in the past to tackle this problem. There are two main challenges for activity modeling and anomaly detection: 1) most existing approaches require sufficient data and supervision for learning; 2) the most interesting abnormal activities arise rarely and are ambiguous among typical activities, i.e., hard to define precisely. In this paper, we propose a novel approach to model complex activities and detect anomalies using non-parametric Gaussian Process (GP) models in a crowded and complicated traffic scene. In comparison with parametric models such as HMMs, GP models are non-parametric and exploit the implicit spatial-temporal dependence among local activity patterns. The learned GP regression models give a probabilistic prediction of regional activities at the next time interval based on present observations. An anomaly is detected by comparing the actual observations with the prediction in real time. We verify the effectiveness and robustness of the proposed model on the QMUL Junction Dataset. Furthermore, we provide a publicly available, manually labeled ground truth for this dataset.

  8. Anomalies

    NASA Astrophysics Data System (ADS)

    Deo, Nivedita

    1988-12-01

    This thesis studies the structure of local and global anomalies in certain systems and examines the conditions for their cancellation. Gauge anomalies (abelian and non-abelian), antisymmetric tensor anomalies, and gravitational anomalies in simple spinor theories with background fields have been analyzed by perturbative methods, and local counterterms have been constructed to cancel the anomalies wherever possible. Anomalies occurring in supersymmetric theories in (2 + 1) dimensions have also been calculated using both perturbative and heat kernel techniques; here again counterterms have been constructed to cancel these parity-violating anomalies for certain gauge field configurations. (i) For gauge theories in four dimensions which contain couplings of fermions to a non-abelian antisymmetric tensor field, the contribution of the latter to anomalies in the non-abelian chiral Ward identity is computed. It is shown by explicit construction of suitable counterterms that these anomalies can all be cancelled. (ii) The gauge anomalies associated with the gravitational fields in abelian gauge theories can be completely removed provided torsion is nonzero. This is shown by constructing a counterterm associated with the gravitational Goldstone-Wilczek current which cancels the anomalous gravitational contribution to the chiral Ward identity without introducing anomalies in the Lorentz or Einstein Ward identities. (iii) Using perturbative BPHZ renormalization techniques, the parity-odd part of the effective action has been extracted and explicitly determined for arbitrary non-abelian gauge superfields in odd dimensions and shown to be the supersymmetric Chern-Simons secondary topological invariant. (iv) Schwinger's proper time technique is generalized to supersymmetric theories in odd dimensions. The effective action for supersymmetric QED is found exactly for a space-time constant superfield. The parity-violating anomaly induced in the effective action can be cancelled by adding a local

  9. Research on immune storage anomaly detection via user access behavior

    NASA Astrophysics Data System (ADS)

    Huang, Jianzhong; Chen, Yunliang; Fang, Yunfu

    2008-12-01

    If an intruder uses a stolen account, the authentication sub-system will regard the intruder as a legitimate user. In order to filter out such illegal users, the storage system should be capable of diagnosing user activity. This paper presents a novel anomaly detection scheme that monitors user access activities using artificial immune techniques. When an access request violates the access control rules, it is regarded as non-self, providing early warnings to the storage security sub-system. Compared with a NIDS, the proposed scheme targets anomaly detection at the storage level and focuses on read/write data requests. In a preliminary simulation phase, a set of optimal algorithm parameters is fitted according to mean convergence speed and detection efficiency. The simulation shows the proposed scheme can reach a rather high detection rate and a low false alarm rate, validating its feasibility. Such storage-level anomaly detection would strengthen storage early warning and improve storage security.

  10. Sparsity-driven anomaly detection for ship detection and tracking in maritime video

    NASA Astrophysics Data System (ADS)

    Shafer, Scott; Harguess, Josh; Forero, Pedro A.

    2015-05-01

    This work examines joint anomaly detection and dictionary learning approaches for identifying anomalies in persistent surveillance applications that require data compression. We have developed a sparsity-driven anomaly detector that can be used for learning dictionaries to address these challenges. In our approach, each training datum is modeled as a sparse linear combination of dictionary atoms in the presence of noise. The noise term is modeled as additive Gaussian noise and a deterministic term models the anomalies. However, no model for the statistical distribution of the anomalies is made. An estimator is postulated for a dictionary that exploits the fact that since anomalies by definition are rare, only a few anomalies will be present when considering the entire dataset. From this vantage point, we endow the deterministic noise term (anomaly-related) with a group-sparsity property. A robust dictionary learning problem is postulated where a group-lasso penalty is used to encourage most anomaly-related noise components to be zero. The proposed estimator achieves robustness by both identifying the anomalies and removing their effect from the dictionary estimate. Our approach is applied to the problem of ship detection and tracking from full-motion video with promising results.

  11. Spectral anomaly methods for aerial detection using KUT nuisance rejection

    NASA Astrophysics Data System (ADS)

    Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.

    2015-06-01

    This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical, due to higher speeds, a larger field of view, and geographically induced background changes. In addition, large variations in altitude or stand-off distance cause significant steps in background count rate as well as spectral changes, due to increased gamma-ray scatter when detecting at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems: a NaI(Tl)-detector-based system and a CsI detector array. The optimization adapts the spectral windows for a particular set of target sources to aerial detection and tailors them to the specific detectors. The methodology and results for background rejection optimized for aerial gamma-ray detection using potassium, uranium and thorium (KUT) nuisance rejection are also shown. Results indicate that a realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection from altitude changes and geographically induced steps, such as at land-water interfaces.

  12. Anomaly detection in clutter using spectrally enhanced LADAR

    NASA Astrophysics Data System (ADS)

    Chhabra, Puneet S.; Wallace, Andrew M.; Hopgood, James R.

    2015-05-01

    Discrete return (DR) Laser Detection and Ranging (Ladar) systems provide a series of echoes that reflect from objects in a scene. These can be first, last or multi-echo returns. In contrast, Full-Waveform (FW) Ladar systems measure the intensity of light reflected from objects continuously over a period of time. In a camouflaged scenario, e.g., objects hidden behind dense foliage, a FW-Ladar penetrates the foliage and returns a sequence of echoes, including buried faint echoes. The aim of this paper is to learn local patterns of co-occurring echoes characterised by their measured spectra; a deviation from such patterns defines an abnormal event in a forest/tree depth profile. As far as the authors know, neither DR- nor FW-Ladar data combined with several spectral measurements has previously been applied to anomaly detection. This work presents an algorithm that detects spectral and temporal anomalies in FW Multi-Spectral Ladar (FW-MSL) data samples. An anomaly is defined as a full-waveform temporal and spectral signature that does not conform to a prior expectation, represented using a learnt subspace (dictionary) and a set of coefficients that capture co-occurring local patterns using an overlapping temporal window. A modified optimization scheme is proposed for subspace learning based on stochastic approximations. The objective function is augmented with a discriminative term that represents the subspace's separability properties and supports anomaly characterisation. The algorithm detects several man-made objects and anomalous spectra hidden in dense vegetation clutter and also allows tree species classification.

  13. Anomaly Detection for Next-Generation Space Launch Ground Operations

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Iverson, David L.; Hall, David R.; Taylor, William M.; Patterson-Hine, Ann; Brown, Barbara; Ferrell, Bob A.; Waterman, Robert D.

    2010-01-01

    NASA is developing new capabilities that will enable future human exploration missions while reducing mission risk and cost. The Fault Detection, Isolation, and Recovery (FDIR) project aims to demonstrate the utility of integrated vehicle health management (IVHM) tools in the domain of ground support equipment (GSE) to be used for the next generation launch vehicles. In addition to demonstrating the utility of IVHM tools for GSE, FDIR aims to mature promising tools for use on future missions and document the level of effort - and hence cost - required to implement an application with each selected tool. One of the FDIR capabilities is anomaly detection, i.e., detecting off-nominal behavior. The tool we selected for this task uses a data-driven approach. Unlike rule-based and model-based systems that require manual extraction of system knowledge, data-driven systems take a radically different approach to reasoning. At the basic level, they start with data that represent nominal functioning of the system and automatically learn expected system behavior. The behavior is encoded in a knowledge base that represents "in-family" system operations. During real-time system monitoring or during post-flight analysis, incoming data is compared to that nominal system operating behavior knowledge base; a distance representing deviation from nominal is computed, providing a measure of how far "out of family" current behavior is. We describe the selected tool for FDIR anomaly detection - Inductive Monitoring System (IMS), how it fits into the FDIR architecture, the operations concept for the GSE anomaly monitoring, and some preliminary results of applying IMS to a Space Shuttle GSE anomaly.
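
    The data-driven idea behind IMS, score new telemetry by its distance to a knowledge base of nominal system vectors, can be sketched in a few lines. The sensor names, units and thresholds below are invented; IMS itself learns clusters of nominal behavior rather than storing raw vectors, so this nearest-neighbor version is only a hedged approximation.

```python
import numpy as np

def nominal_distance(x, nominal):
    """Distance from the current sensor vector to the nearest vector in
    the nominal-behaviour knowledge base ("out of family" measure)."""
    return float(np.min(np.linalg.norm(nominal - x, axis=1)))

rng = np.random.default_rng(7)
# Knowledge base learned from nominal GSE telemetry (toy: 3 sensors).
nominal = np.c_[rng.normal(50, 1, 1000),    # e.g. tank pressure
                rng.normal(20, 0.5, 1000),  # e.g. flow rate
                rng.normal(300, 2, 1000)]   # e.g. temperature

in_family = np.array([50.2, 20.1, 301.0])
out_of_family = np.array([50.2, 5.0, 301.0])   # stuck flow sensor

print(nominal_distance(in_family, nominal) <
      nominal_distance(out_of_family, nominal))
```

The appeal documented in the abstract holds even in this toy: no rules or physics model were written down, only nominal data collected, yet the stuck sensor produces a clearly larger deviation score.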

  14. Anomaly detection based on the statistics of hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Catterall, Stephen P.

    2004-10-01

    The purpose of this paper is to introduce a new anomaly detection algorithm for application to hyperspectral imaging (HSI) data. The algorithm uses characterisations of the joint (among wavebands) probability density function (pdf) of HSI data. Traditionally, the pdf has been assumed to be multivariate Gaussian or a mixture of multivariate Gaussians. Other distributions have been considered by previous authors, in particular Elliptically Contoured Distributions (ECDs). In this paper we focus on another distribution, which has only recently been defined and studied. This distribution has a more flexible and extensive set of parameters than the multivariate Gaussian does, yet the pdf takes on a relatively simple mathematical form. The result of all this is a model for the pdf of a hyperspectral image, consisting of a mixture of these distributions. Once a model for the pdf of a hyperspectral image has been obtained, it can be incorporated into an anomaly detector. The new anomaly detector is implemented and applied to some medium wave infra-red (MWIR) hyperspectral imagery. Comparison is made with a well-known anomaly detector, and it will be seen that the results are promising.

  15. Anomaly-based intrusion detection for SCADA systems

    SciTech Connect

    Yang, D.; Usynin, A.; Hines, J. W.

    2006-07-01

    Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security attention, and there are concerns that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimated that malicious online actions may have caused as much as $75 billion in damage in 2007. One of the interesting countermeasures for enhancing information system security is intrusion detection. This paper briefly discusses the history of research in intrusion detection techniques and introduces the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly-based intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT), applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)
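    The AAKR-plus-SPRT pipeline can be sketched roughly as follows. This is not the authors' code: the Gaussian kernel bandwidth `h`, the assumed fault magnitude `m`, and the error rates `alpha`/`beta` are illustrative choices. AAKR predicts what a "nominal" version of each observation should look like; the SPRT then decides sequentially whether the residual stream is drifting away from zero.

```python
import numpy as np

def aakr_predict(memory, x, h=1.0):
    """Auto-associative kernel regression: the nominal estimate of x is a
    Gaussian-kernel-weighted average of stored nominal memory vectors."""
    d = np.linalg.norm(memory - x, axis=1)
    w = np.exp(-d**2 / (2 * h**2))
    w /= w.sum()
    return w @ memory

def sprt(residuals, m=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Sequential probability ratio test on a residual stream:
    H0 ~ N(0, sigma^2) vs H1 ~ N(m, sigma^2). Returns the index at
    which H1 (fault) is accepted, or None if it never is."""
    a = np.log(beta / (1 - alpha))       # accept-H0 threshold
    b = np.log((1 - beta) / alpha)       # accept-H1 threshold
    llr = 0.0
    for i, r in enumerate(residuals):
        llr += (m * r - m**2 / 2) / sigma**2   # Gaussian log-likelihood ratio
        if llr >= b:
            return i                           # anomaly declared
        if llr <= a:
            llr = 0.0                          # accept H0, restart the test
    return None
```

    Restarting after each H0 acceptance turns Wald's one-shot test into a repeated test suited to continuous monitoring.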

  16. Claycap anomaly detection using hyperspectral remote sensing and lidargrammetric techniques

    NASA Astrophysics Data System (ADS)

    Garcia Quijano, Maria Jose

    Clay capped waste sites are a common method to dispose of the more than 40 million tons of hazardous waste produced in the United States every year (EPA, 2003). Due to the potential threat that hazardous waste poses, it is essential to monitor closely the performance of these facilities. Development of a monitoring system that exploits spectral and topographic changes over hazardous waste sites is presented. Spectral anomaly detection is based upon the observed changes in absolute reflectance and spectral derivatives in centipede grass (Eremochloa ophiuroides) under different irrigation levels. The spectral features that provide the best separability among irrigation levels were identified using Stepwise Discriminant Analyses. The Red Edge Position was selected as a suitable discriminant variable to compare the performance of a global and a local anomaly detection algorithm using a DAIS 3715 hyperspectral image. Topographical anomaly detection is assessed by evaluating the vertical accuracy of two LIDAR datasets acquired from two different altitudes (700 m and 1,200 m AGL) over a clay-capped hazardous site at the Savannah River National Laboratory, SC using the same Optech ALTM 2050 and Cessna 337 platform. Additionally, a quantitative comparison is performed to determine the effect that decreasing platform altitude and increasing posting density have on the vertical accuracy of the LIDAR data collected.

  17. GPR anomaly detection with robust principal component analysis

    NASA Astrophysics Data System (ADS)

    Masarik, Matthew P.; Burns, Joseph; Thelen, Brian T.; Kelly, Jack; Havens, Timothy C.

    2015-05-01

    This paper investigates the application of Robust Principal Component Analysis (RPCA) to ground penetrating radar as a means to improve GPR anomaly detection. The method consists of a preprocessing routine to smoothly align the ground and remove the ground response (haircut), followed by mapping to the frequency domain, applying RPCA, and then mapping the sparse component of the RPCA decomposition back to the time domain. A prescreener is then applied to the time-domain sparse component to perform anomaly detection. The emphasis of the RPCA algorithm on sparsity has the effect of significantly increasing the apparent signal-to-clutter ratio (SCR) as compared to the original data, thereby enabling improved anomaly detection. This method is compared to detrending (spatial-mean removal) and classical principal component analysis (PCA), and the RPCA-based processing is seen to provide substantial improvements in the apparent SCR over both of these alternative processing schemes. In particular, the algorithm has been applied to field-collected impulse GPR data and has shown significant improvement in terms of the ROC curve relative to detrending and PCA.
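    The RPCA decomposition at the heart of this approach splits a data matrix D into a low-rank part L (background clutter) and a sparse part S (candidate anomalies). A minimal sketch using the standard inexact-ALM principal component pursuit iteration follows; the parameter defaults mirror common practice and are assumptions here, not the paper's settings.

```python
import numpy as np

def shrink(X, tau):
    """Elementwise soft-thresholding operator."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0)

def svd_threshold(X, tau):
    """Singular-value thresholding: soft-threshold the spectrum of X."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(shrink(s, tau)) @ Vt

def rpca(D, lam=None, mu=None, iters=200):
    """Robust PCA by principal component pursuit (inexact ALM):
    D ~ L (low-rank clutter) + S (sparse anomalies)."""
    m, n = D.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = mu or m * n / (4 * np.abs(D).sum())
    L = np.zeros_like(D); S = np.zeros_like(D); Y = np.zeros_like(D)
    for _ in range(iters):
        L = svd_threshold(D - S + Y / mu, 1 / mu)   # low-rank update
        S = shrink(D - L + Y / mu, lam / mu)        # sparse update
        Y += mu * (D - L - S)                       # dual ascent
        if np.linalg.norm(D - L - S) <= 1e-7 * np.linalg.norm(D):
            break
    return L, S
```

    In the GPR setting, detection then runs on S, where isolated target responses stand out once the correlated clutter has been absorbed into L.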

  18. Parameter estimation for support vector anomaly detection in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Meth, Reuven; Ahn, James; Banerjee, Amit; Juang, Radford; Burlina, Philippe

    2012-06-01

    Hyperspectral Image (HSI) anomaly detectors typically employ local background modeling techniques to facilitate target detection from surrounding clutter. Global background modeling has been challenging due to the multi-modal content that must be automatically modeled to enable target/background separation. We have previously developed a support vector based anomaly detector that does not impose an a priori parametric model on the data and enables multi-modal modeling of large background regions with inhomogeneous content. Effective application of this support vector approach requires the setting of a kernel parameter that controls the tightness of the model fit to the background data. Estimation of the kernel parameter has typically considered Type I / false-positive error optimization due to the availability of background samples, but this approach has not proven effective for general application since these methods only control the false alarm level, without any optimization for maximizing detection. Parameter optimization with respect to Type II / false-negative error has remained elusive due to the lack of sufficient target training exemplars. We present an approach that optimizes parameter selection based on both Type I and Type II error criteria by introducing outliers based on existing hypercube content to guide parameter estimation. The approach has been applied to hyperspectral imagery and has demonstrated automatic estimation of parameters consistent with those that were found to be optimal, thereby providing an automated method for general anomaly detection applications.

  19. Progressive anomaly detection in medical data using vital sign signals

    NASA Astrophysics Data System (ADS)

    Gao, Cheng; Lee, Li-Chien; Li, Yao; Chang, Chein-I.; Hu, Peter; Mackenzie, Colin

    2016-05-01

    Vital Sign Signals (VSSs) have been widely used for medical data analysis. One classic approach is to use a Logistic Regression Model (LRM) to describe the data to be analyzed. This approach raises two challenging issues. One is how many VSSs should be used in the model, since many VSSs are available for this purpose. The other is that, once the number of VSSs is determined, the follow-up issue is which VSSs to select. To date, both issues have been resolved by empirical selection. This paper addresses them from a hyperspectral imaging perspective. If we view a patient, together with the various vital sign signals collected, as a pixel vector in a hyperspectral image, then each vital sign signal can be considered a particular band. In light of this interpretation, each VSS can be ranked by the band prioritization commonly used for band selection in hyperspectral imaging. To resolve the issue of how many VSSs should be used for data analysis, we further develop Progressive Band Processing of Anomaly Detection (PBPAD), which allows users to detect anomalies in medical data using the prioritized VSSs one after another, so that data changes between bands can be tracked through the profiles provided by PBPAD. As a result, there is no need to determine either the number of VSSs or which VSSs to use, because all VSSs are used in their prioritized order. To demonstrate the utility of PBPAD in medical data analysis, anomaly detection is implemented as progressive band processing to find anomalies, which correspond to abnormal patients. The data used for the experiments were collected at the University of Maryland School of Medicine Shock Trauma Center (STC). The results are evaluated against those obtained by the Logistic Regression Model (LRM).
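    The prioritize-then-process-progressively idea can be sketched generically. This is an illustrative stand-in, not the authors' PBPAD: variance serves as a toy prioritization criterion, and the per-sample anomaly score is a simple z-score norm recomputed after each added band.

```python
import numpy as np

def prioritize_bands(X):
    """Rank 'bands' (one vital-sign signal per column) by variance — a
    simple stand-in for hyperspectral band-prioritization criteria."""
    return np.argsort(X.var(axis=0))[::-1]

def progressive_scores(X, order):
    """Progressive band processing: recompute a per-sample anomaly score
    after each additional prioritized band, so detections can be watched
    evolve instead of fixing the number of bands in advance."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    profiles = []
    for k in range(1, len(order) + 1):
        sel = order[:k]
        profiles.append(np.linalg.norm(Z[:, sel], axis=1))
    return profiles
```

    The list of score profiles is exactly the band-by-band "change profile" an analyst would inspect; samples (patients) whose scores stay extreme across the prioritized bands are the anomalies.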

  20. Detection of Low Temperature Volcanogenic Thermal Anomalies with ASTER

    NASA Astrophysics Data System (ADS)

    Pieri, D. C.; Baxter, S.

    2009-12-01

    Predicting volcanic eruptions is a thorny problem, as volcanoes typically exhibit idiosyncratic waxing and/or waning pre-eruption emission, geodetic, and seismic behavior. It is no surprise that increasing our accuracy and precision in eruption prediction depends on assessing the time-progressions of all relevant precursor geophysical, geochemical, and geological phenomena, and on more frequently observing volcanoes when they become restless. The ASTER instrument on the NASA Terra Earth Observing System satellite in low earth orbit provides important capabilities in the area of detection of volcanogenic anomalies such as thermal precursors and increased passive gas emissions. Its unique high spatial resolution multi-spectral thermal IR imaging data (90m/pixel; 5 bands in the 8-12um region), bore-sighted with visible and near-IR imaging data, and combined with off-nadir pointing and stereo-photogrammetric capabilities make ASTER a potentially important volcanic precursor detection tool. We are utilizing the JPL ASTER Volcano Archive (http://ava.jpl.nasa.gov) to systematically examine 80,000+ ASTER volcano images to analyze (a) thermal emission baseline behavior for over 1500 volcanoes worldwide, (b) the form and magnitude of time-dependent thermal emission variability for these volcanoes, and (c) the spatio-temporal limits of detection of pre-eruption temporal changes in thermal emission in the context of eruption precursor behavior. We are creating and analyzing a catalog of the magnitude, frequency, and distribution of volcano thermal signatures worldwide as observed from ASTER since 2000 at 90m/pixel. Of particular interest as eruption precursors are small low contrast thermal anomalies of low apparent absolute temperature (e.g., melt-water lakes, fumaroles, geysers, grossly sub-pixel hotspots), for which the signal-to-noise ratio may be marginal (e.g., scene confusion due to clouds, water and water vapor, fumarolic emissions, variegated ground emissivity, and

  1. Inflight and Preflight Detection of Pitot Tube Anomalies

    NASA Technical Reports Server (NTRS)

    Mitchell, Darrell W.

    2014-01-01

    The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates sensors' response time to the dynamic component (noise) found in the signal of these same sensors. This analysis technique has used existing electrical signals of Pitot tube sensors that result from measured processes during inflight conditions and/or induced signals in preflight conditions to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants. The application of this technology for the detection of aircraft anomalies is innovative. Instead of determining the health of process monitoring at a steady-state condition, this technology will be used to quickly inform the pilot when an air-speed indication becomes faulty under any flight condition as well as during preflight preparation.

  2. On-line Flagging of Anomalies and Adaptive Sequential Hypothesis Testing for Fine-feature Characterization of Geosynchronous Satellites

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Payne, T.; Kinateder, K.; Dao, P.; Beecher, E.; Boone, D.; Elliott, B.

    The objective of on-line flagging in this paper is to perform interactive assessment of geosynchronous satellite anomalies such as cross-tagging of satellites in a cluster, a solar panel offset change, etc. This assessment will utilize a Bayesian belief propagation procedure and will include automated updates of the baseline signature data for the satellite, while accounting for seasonal changes. Its purpose is to enable an ongoing, automated assessment of satellite behavior through the satellite's life cycle, using the photometry data collected during the synoptic search performed by a ground- or space-based sensor as part of its metrics mission. Changes in the satellite's features will be reported along with the probabilities of Type I and Type II errors. The objective of adaptive sequential hypothesis testing in this paper is to define future sensor tasking for the purpose of characterizing fine features of the satellite. The tasking is designed to maximize new information with the fewest photometry data points collected during the synoptic search by a ground- or space-based sensor. Its calculation is based on information entropy techniques. The tasking is defined by considering a sequence of hypotheses regarding the fine features of the satellite. The optimal observation conditions are then ordered so as to maximize new information about a chosen fine feature. The combined objective of on-line flagging and adaptive sequential hypothesis testing is to progressively discover new information about the features of a geosynchronous satellite by leveraging the regular but sparse cadence of data collection during the synoptic search performed by a ground- or space-based sensor.

    Automated Algorithm to Detect Changes in Geostationary Satellite's Configuration and Cross-Tagging

    Dao, Phan (Air Force Research Laboratory/RVB)

    By characterizing geostationary satellites based on photometry and color photometry, analysts can

  3. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    NASA Technical Reports Server (NTRS)

    Viswanathan, Arun

    2012-01-01

    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California / Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber-physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a significantly poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this

  4. Hierarchical Kohonen net for anomaly detection in network security.

    PubMed

    Sarasamma, Suseela T; Zhu, Qiuming A; Huff, Julie

    2005-04-01

    A novel multilevel hierarchical Kohonen Net (K-Map) for an intrusion detection system is presented. Each level of the hierarchical map is modeled as a simple winner-take-all K-Map. One significant advantage of this multilevel hierarchical K-Map is its computational efficiency. Unlike other statistical anomaly detection methods such as nearest neighbor approach, K-means clustering or probabilistic analysis that employ distance computation in the feature space to identify the outliers, our approach does not involve costly point-to-point computation in organizing the data into clusters. Another advantage is the reduced network size. We use the classification capability of the K-Map on selected dimensions of data set in detecting anomalies. Randomly selected subsets that contain both attacks and normal records from the KDD Cup 1999 benchmark data are used to train the hierarchical net. We use a confidence measure to label the clusters. Then we use the test set from the same KDD Cup 1999 benchmark to test the hierarchical net. We show that a hierarchical K-Map in which each layer operates on a small subset of the feature space is superior to a single-layer K-Map operating on the whole feature space in detecting a variety of attacks in terms of detection rate as well as false positive rate.
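    A single winner-take-all layer of the kind stacked in the hierarchical K-Map can be sketched with plain competitive learning. This is an illustration under stated assumptions, not the authors' implementation: the unit count, learning-rate schedule, and seed are arbitrary, and each input updates only its winning unit (no neighborhood function).

```python
import numpy as np

def train_kmap(X, n_units=4, epochs=20, lr=0.5, seed=0):
    """Winner-take-all K-Map layer: each input moves only its nearest
    unit (competitive learning), with a linearly decaying learning rate."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_units, replace=False)].astype(float)
    for e in range(epochs):
        eta = lr * (1 - e / epochs)
        for x in X:
            j = np.argmin(np.linalg.norm(W - x, axis=1))  # winning unit
            W[j] += eta * (x - W[j])                       # move winner only
    return W

def assign(X, W):
    """Cluster label = index of the nearest trained unit."""
    return np.array([np.argmin(np.linalg.norm(W - x, axis=1)) for x in X])
```

    In the paper's setting, each trained cluster would then be labeled (attack vs. normal) via a confidence measure over the training records it captures, and test records inherit the label of their winning unit.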

  5. Computationally efficient strategies to perform anomaly detection in hyperspectral images

    NASA Astrophysics Data System (ADS)

    Rossi, Alessandro; Acito, Nicola; Diani, Marco; Corsini, Giovanni

    2012-11-01

    In remote sensing, hyperspectral sensors are effectively used for target detection and recognition because of their high spectral resolution, which allows discrimination of different materials in the sensed scene. When a priori information about the spectrum of the targets of interest is not available, target detection turns into anomaly detection (AD), i.e. searching for objects that are anomalous with respect to the scene background. In the field of AD, anomalies can generally be associated with observations that statistically depart from the background clutter, the latter being intended either as a local neighborhood surrounding the observed pixel or as a large part of the image. In this context, much effort has been devoted to reducing the computational load of AD algorithms so as to furnish information for real-time decision making. In this work, a sub-class of AD methods is considered that aims at detecting small rare objects that are anomalous with respect to their local background. Such techniques are not only characterized by mathematical tractability but also allow the design of real-time strategies for AD. Among these methods, one of the most established anomaly detectors is the RX algorithm, which is based on a local Gaussian model of the background. In the literature, the RX decision rule has been employed to develop computationally efficient algorithms implemented in real-time systems. In this work, a survey of computationally efficient methods to implement the RX detector is presented, in which advanced algebraic strategies are exploited to speed up the estimation of the covariance matrix and of its inverse. The overall number of operations required by the different implementations of the RX algorithm is compared and discussed for varying RX parameters, in order to show the computational improvements achieved with the introduced algebraic strategies.
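    The RX decision rule itself is just the Mahalanobis distance of each pixel spectrum from the background statistics. A minimal global-RX sketch is below; the efficient implementations the paper surveys replace this single covariance estimate and explicit inverse with fast, locally updated algebraic schemes. The small diagonal regularizer is an assumption added for numerical stability.

```python
import numpy as np

def rx(cube):
    """Global RX anomaly detector: Mahalanobis distance of each pixel's
    spectrum from the scene mean, using the sample covariance."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b)
    mu = X.mean(axis=0)
    C = np.cov(X, rowvar=False)
    Ci = np.linalg.inv(C + 1e-6 * np.eye(b))   # regularized inverse
    d = X - mu
    scores = np.einsum('ij,jk,ik->i', d, Ci, d)  # d^T C^{-1} d per pixel
    return scores.reshape(h, w)
```

    Thresholding the score map yields the anomaly mask; the local variant recomputes mu and C over a sliding window around each pixel instead of the whole scene.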

  6. Detection of chiral anomaly and valley transport in Dirac semimetals

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng; Zhang, Enze; Liu, Yanwen; Chen, Zhigang; Liang, Sihang; Cao, Junzhi; Yuan, Xiang; Tang, Lei; Li, Qian; Gu, Teng; Wu, Yizheng; Zou, Jin; Xiu, Faxian

    The chiral anomaly is a non-conservation of chiral charge pumped by a topologically nontrivial gauge field, which has been predicted to exist in the emergent quasiparticle excitations in Dirac and Weyl semimetals. So far, however, such a pumping process has not been clearly demonstrated and lacks convincing experimental identification. Here, we report the detection of the charge pumping effect and the related valley transport in Cd3As2 driven by external electric and magnetic fields (EB). We find that the chiral imbalance leads to a non-zero gyrotropic coefficient, which can be confirmed by the EB-generated Kerr effect. By applying B along the current direction, we observe a negative magnetoresistance despite the giant positive one in other directions, a clear indication of the chiral anomaly. Remarkably, a robust nonlocal response in valley diffusion, originating from the chiral anomaly, persists up to room temperature when B is parallel to E. The ability to manipulate the valley polarization in a Dirac semimetal opens up a brand-new route to understanding its fundamental properties through external fields and to utilizing the chiral fermions in valleytronic applications.

  7. Segmentation of laser range image for pipe anomaly detection

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Krys, Dennis

    2010-04-01

    Laser-based scanning can provide a precise surface profile. It has been widely applied to the inspection of pipe inner walls and is often used along with other types of sensors, such as sonar and closed-circuit television (CCTV). These measurements can be used for pipe deterioration modeling and condition assessment. Geometric information needs to be extracted to characterize anomalies in the pipe profile. Since laser scanning measures distance, segmentation with a threshold is a straightforward way to isolate the anomalies. However, thresholding with a fixed distance value does not work well for the laser range image because of intensity inhomogeneity, which is caused by uncontrollable factors during the inspection. Thus, a local binary fitting (LBF) active contour model is employed in this work to process the laser range image, and an image phase congruency algorithm is adopted to provide the initial contour required by the LBF method. The combination of these two approaches can successfully detect the anomalies in a laser range image.

  8. Anomaly depth detection in trans-admittance mammography: a formula independent of anomaly size or admittivity contrast

    NASA Astrophysics Data System (ADS)

    Zhang, Tingting; Lee, Eunjung; Seo, Jin Keun

    2014-04-01

    Trans-admittance mammography (TAM) is a bioimpedance technique for breast cancer detection. It is based on the comparison of tissue conductivity: cancerous tissue is identified by its higher conductivity in comparison with the surrounding normal tissue. In TAM, the breast is compressed between two electrical plates (in a similar architecture to x-ray mammography). The bottom plate has many sensing point electrodes that provide two-dimensional images (trans-admittance maps) that are induced by voltage differences between the two plates. Multi-frequency admittance data (Neumann data) are measured over the range 50 Hz-500 kHz. TAM aims to determine the location and size of any anomaly from the multi-frequency admittance data. Various anomaly detection algorithms can be used to process TAM data to determine the transverse positions of anomalies. However, existing methods cannot reliably determine the depth or size of an anomaly. Breast cancer detection using TAM would be improved if the depth or size of an anomaly could also be estimated, properties that are independent of the admittivity contrast. A formula is proposed here that can estimate the depth of an anomaly independent of its size and the admittivity contrast. This depth estimation can also be used to derive an estimation of the size of the anomaly. The proposed estimations are verified rigorously under a simplified model. Numerical simulation shows that the proposed method also works well in general settings.

  9. New models for hyperspectral anomaly detection and un-mixing

    NASA Astrophysics Data System (ADS)

    Bernhardt, M.; Heather, J. P.; Smith, M. I.

    2005-06-01

    It is now established that hyperspectral images of many natural backgrounds have statistics with fat-tails. In spite of this, many of the algorithms that are used to process them appeal to the multivariate Gaussian model. In this paper we consider biologically motivated generative models that might explain observed mixtures of vegetation in natural backgrounds. The degree to which these models match the observed fat-tailed distributions is investigated. Having shown how fat-tailed statistics arise naturally from the generative process, the models are put to work in new anomaly detection and un-mixing algorithms. The performance of these algorithms is compared with more traditional approaches.

  10. System for Anomaly and Failure Detection (SAFD) system development

    NASA Technical Reports Server (NTRS)

    Oreilly, D.

    1993-01-01

    The System for Anomaly and Failure Detection (SAFD) algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failures as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions. This task assignment originally specified developing a platform for executing the algorithm during hot fire tests at Technology Test Bed (TTB) and installing the SAFD algorithm on that platform. Two units were built and installed in the Hardware Simulation Lab and at the TTB in December 1991. Since that time, the task primarily entailed improvement and maintenance of the systems, additional testing to prove the feasibility of the algorithm, and support of hot fire testing. This document addresses the work done since the last report of June 1992. The work on the System for Anomaly and Failure Detection during this period included improving the platform and the algorithm, testing the algorithm against previous test data and in the Hardware Simulation Lab, installing other algorithms on the system, providing support for operations at the Technology Test Bed, and providing routine maintenance.

  11. Anomalies detection in hyperspectral imagery using projection pursuit algorithm

    NASA Astrophysics Data System (ADS)

    Achard, Veronique; Landrevie, Anthony; Fort, Jean Claude

    2004-11-01

    Hyperspectral imagery provides detailed spectral information on the observed scene, which enhances detection possibilities, in particular for subpixel targets. In this context, we have developed and compared several anomaly detection algorithms based on a projection pursuit approach. The projection pursuit is performed either on the PCA or on the MNF (Minimum Noise Fraction) components. Depending on the method, the best axes of the eigenvector basis are selected directly, or a genetic algorithm is used to optimize the projections. Two projection indices (PIs) have been tested: the kurtosis and the skewness. These approaches have been tested on AVIRIS and HyMap hyperspectral images, in which subpixel targets have been included by simulation. The proportion of target in a pixel varies from 50% to 10% of the surface. The results are presented and discussed. The performance of our detection algorithm remains very satisfactory for target surfaces down to 10% of the pixel.
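    The projection pursuit idea can be sketched with a random search standing in for the genetic algorithm (an assumption for brevity): draw candidate unit vectors, project the dimension-reduced pixel data onto each, and keep the direction that maximizes the kurtosis index, which rare anomalous pixels inflate.

```python
import numpy as np

def kurtosis(y):
    """Excess kurtosis of a 1-D projection (the projection index)."""
    y = (y - y.mean()) / y.std()
    return np.mean(y**4) - 3.0

def best_projection(X, n_trials=500, seed=0):
    """Projection pursuit by random search: among random unit vectors,
    keep the one whose data projection maximizes the kurtosis index."""
    rng = np.random.default_rng(seed)
    best_v, best_pi = None, -np.inf
    for _ in range(n_trials):
        v = rng.standard_normal(X.shape[1])
        v /= np.linalg.norm(v)
        pi = kurtosis(X @ v)
        if pi > best_pi:
            best_v, best_pi = v, pi
    return best_v, best_pi
```

    Pixels with extreme values along the retained direction are then flagged as anomalies; in practice X would hold the PCA or MNF components rather than raw spectra.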

  12. Visual Mismatch Negativity Reveals Automatic Detection of Sequential Regularity Violation

    PubMed Central

    Stefanics, Gábor; Kimura, Motohiro; Czigler, István

    2011-01-01

    Sequential regularities are abstract rules based on repeating sequences of environmental events, which are useful for making predictions about future events. Here, we tested whether the visual system is capable of detecting sequential regularity in unattended stimulus sequences. The visual mismatch negativity (vMMN) component of the event-related potentials is sensitive to the violation of complex regularities (e.g., object-related characteristics, temporal patterns). We used the vMMN component as an index of violation of conditional (if, then) regularities. In the first experiment, to investigate the emergence of vMMN and other change-related activity to the violation of conditional rules, red and green disk patterns were delivered in pairs. The majority of pairs comprised disk patterns with identical colors, whereas in deviant pairs the colors differed. The probabilities of the two colors were equal. The second member of the deviant pairs elicited a vMMN, with longer latency and more extended spatial distribution for deviants with lower probability (10 vs. 30%). In the second (control) experiment, the emergence of vMMN to the violation of a simple, feature-related rule was studied using oddball sequences of stimulus pairs in which deviant colors were presented with 20% probability. Deviant colored patterns elicited a vMMN, and this component was larger for the second member of the pair, i.e., after a shorter inter-stimulus interval. This result corresponds to the SOA/(v)MMN relationship expected on the basis of a memory-mismatch process. Our results show that the system underlying vMMN is sensitive to abstract, conditional rules. Representation of such rules implies the expectation of a subsequent event; therefore, vMMN can be considered a correlate of violated predictions about the characteristics of environmental events. PMID:21629766

  13. Anomaly Detection in Test Equipment via Sliding Mode Observers

    NASA Technical Reports Server (NTRS)

    Solano, Wanda M.; Drakunov, Sergey V.

    2012-01-01

    Nonlinear observers were originally developed based on the ideas of variable structure control, and for the purpose of detecting disturbances in complex systems. In this anomaly detection application, these observers were designed for estimating the distributed state of fluid flow in a pipe described by a class of advection equations. The observer algorithm uses collected data in a piping system to estimate the distributed system state (pressure and velocity along a pipe containing liquid gas propellant flow) using only boundary measurements. These estimates are then used to further estimate and localize possible anomalies such as leaks or foreign objects, and instrumentation metering problems such as incorrect flow meter orifice plate size. The observer algorithm has the following parts: a mathematical model of the fluid flow, observer control algorithm, and an anomaly identification algorithm. The main functional operation of the algorithm is in creating the sliding mode in the observer system implemented as software. Once the sliding mode starts in the system, the equivalent value of the discontinuous function in sliding mode can be obtained by filtering out the high-frequency chattering component. In control theory, "observers" are dynamic algorithms for the online estimation of the current state of a dynamic system by measurements of an output of the system. Classical linear observers can provide optimal estimates of a system state in case of uncertainty modeled by white noise. For nonlinear cases, the theory of nonlinear observers has been developed and its success is mainly due to the sliding mode approach. Using the mathematical theory of variable structure systems with sliding modes, the observer algorithm is designed in such a way that it steers the output of the model to the output of the system obtained via a variety of sensors, in spite of possible mismatches between the assumed model and actual system. The unique properties of sliding mode control

  14. Log Summarization and Anomaly Detection for Troubleshooting Distributed Systems

    SciTech Connect

    Gunter, Dan; Tierney, Brian L.; Brown, Aaron; Swany, Martin; Bresnahan, John; Schopf, Jennifer M.

    2007-08-01

    Today's system monitoring tools are capable of detecting system failures such as host failures, OS errors, and network partitions in near-real time. Unfortunately, the same cannot yet be said of the end-to-end distributed software stack. Any given action, for example, reliably transferring a directory of files, can involve a wide range of complex and interrelated actions across multiple pieces of software: checking user certificates and permissions, getting details for all files, performing third-party transfers, understanding re-try policy decisions, etc. We present an infrastructure for troubleshooting complex middleware, a general purpose technique for configurable log summarization, and an anomaly detection technique that works in near-real time on running Grid middleware. We present results gathered using this infrastructure from instrumented Grid middleware and applications running on the Emulab testbed. From these results, we analyze the effectiveness of several algorithms at accurately detecting a variety of performance anomalies.

  15. System for Anomaly and Failure Detection (SAFD) system development

    NASA Astrophysics Data System (ADS)

    Oreilly, D.

    1992-07-01

    This task specified developing the hardware and software necessary to implement the System for Anomaly and Failure Detection (SAFD) algorithm, developed under Technology Test Bed (TTB) Task 21, on the TTB engine stand. This effort involved building two units; one unit to be installed in the Block II Space Shuttle Main Engine (SSME) Hardware Simulation Lab (HSL) at Marshall Space Flight Center (MSFC), and one unit to be installed at the TTB engine stand. Rocketdyne personnel from the HSL performed the task. The SAFD algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failure as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions.

  16. System for Anomaly and Failure Detection (SAFD) system development

    NASA Technical Reports Server (NTRS)

    Oreilly, D.

    1992-01-01

    This task specified developing the hardware and software necessary to implement the System for Anomaly and Failure Detection (SAFD) algorithm, developed under Technology Test Bed (TTB) Task 21, on the TTB engine stand. This effort involved building two units; one unit to be installed in the Block II Space Shuttle Main Engine (SSME) Hardware Simulation Lab (HSL) at Marshall Space Flight Center (MSFC), and one unit to be installed at the TTB engine stand. Rocketdyne personnel from the HSL performed the task. The SAFD algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failure as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions.

  17. Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.

    PubMed

    Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda

    2014-05-01

    We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, here the domain anomaly detection methodology is applied to the problem of anomaly detection for a video annotation system.

  18. Sequential Model-Based Detection in a Shallow Ocean Acoustic Environment

    SciTech Connect

    Candy, J V

    2002-03-26

    A model-based detection scheme is developed to passively monitor an ocean acoustic environment along with its associated variations. The technique employs an embedded model-based processor and a reference model in a sequential likelihood detection scheme. The monitor is therefore called a sequential reference detector. The underlying theory for the design is developed and discussed in detail.
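
    The sequential likelihood detection idea above can be illustrated with Wald's sequential probability ratio test (a hedged stand-in; Candy's detector embeds a full model-based processor, which is not reproduced here): log-likelihood ratios are accumulated sample by sample and the monitor stops at the first threshold crossing.

```python
import math

# Sketch of a sequential reference detector as an SPRT deciding between
# "reference model holds" (Gaussian samples, mean 0) and "environment has
# changed" (mean mu1), with variance sigma2. The thresholds come from Wald's
# approximations for false-alarm rate alpha and miss rate beta.

def sprt(samples, mu1=1.0, sigma2=1.0, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)    # decide "changed"
    lower = math.log(beta / (1 - alpha))    # decide "reference"
    llr = 0.0
    for k, y in enumerate(samples):
        llr += (mu1 * y - 0.5 * mu1 ** 2) / sigma2  # Gaussian LLR increment
        if llr >= upper:
            return "changed", k
        if llr <= lower:
            return "reference", k
    return "undecided", len(samples) - 1
```

Unlike a fixed-sample test, the decision time adapts to the data: strong evidence ends the test early in either direction.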

  19. Anomaly detection of microstructural defects in continuous fiber reinforced composites

    NASA Astrophysics Data System (ADS)

    Bricker, Stephen; Simmons, J. P.; Przybyla, Craig; Hardie, Russell

    2015-03-01

    Ceramic matrix composites (CMC) with continuous fiber reinforcements have the potential to enable the next generation of high speed hypersonic vehicles and/or significant improvements in gas turbine engine performance due to their exhibited toughness when subjected to high mechanical loads at extreme temperatures (2200F+). Reinforced fiber composites (RFC) provide increased fracture toughness, crack growth resistance, and strength, though little is known about how stochastic variation and imperfections in the material affect material properties. In this work, tools are developed for quantifying anomalies within the microstructure at several scales. The detection and characterization of anomalous microstructure is a critical step in linking production techniques to properties, as well as in accurate material simulation and property prediction for the integrated computation materials engineering (ICME) of RFC based components. It is desired to find statistical outliers for any number of material characteristics such as fibers, fiber coatings, and pores. Here, fiber orientation, or `velocity', and `velocity' gradient are developed and examined for anomalous behavior. Categorizing anomalous behavior in the CMC is approached by multivariate Gaussian mixture modeling. A Gaussian mixture is employed to estimate the probability density function (PDF) of the features in question, and anomalies are classified by their likelihood of belonging to the statistical normal behavior for that feature.
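
    The mixture-model scoring step can be sketched in one dimension (an illustrative toy with invented data, not the paper's multivariate microstructure features): fit a two-component Gaussian mixture by EM, then score values by their log-density under the fitted mixture; low-density values are the statistical outliers.

```python
import math, random

# Toy sketch: EM for a two-component 1D Gaussian mixture, then anomaly
# scoring by log-density under the fitted model.

def fit_gmm(data, iters=50):
    data = sorted(data)
    mu = [data[len(data) // 4], data[3 * len(data) // 4]]  # quantile init
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibilities of each component for each point
        resp = []
        for x in data:
            p = [w[j] * math.exp(-(x - mu[j]) ** 2 / (2 * var[j]))
                 / math.sqrt(2 * math.pi * var[j]) for j in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: update weights, means, variances
        for j in range(2):
            nj = sum(r[j] for r in resp)
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = max(1e-6, sum(r[j] * (x - mu[j]) ** 2
                                   for r, x in zip(resp, data)) / nj)
            w[j] = nj / len(data)
    return w, mu, var

def log_density(x, w, mu, var):
    return math.log(sum(w[j] * math.exp(-(x - mu[j]) ** 2 / (2 * var[j]))
                        / math.sqrt(2 * math.pi * var[j]) for j in range(2)))

random.seed(1)
data = ([random.gauss(0.0, 0.5) for _ in range(200)]
        + [random.gauss(5.0, 0.5) for _ in range(200)])
w, mu, var = fit_gmm(data)
```

A value lying between the two fitted modes receives a much lower log-density than a value near a mode, and would be flagged as anomalous under a density threshold.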

  20. A High-Order Statistical Tensor Based Algorithm for Anomaly Detection in Hyperspectral Imagery

    NASA Astrophysics Data System (ADS)

    Geng, Xiurui; Sun, Kang; Ji, Luyan; Zhao, Yongchao

    2014-11-01

    Recently, high-order statistics have received increasing interest in the field of hyperspectral anomaly detection. However, most of the existing high-order statistics based anomaly detection methods require stepwise iterations since they are direct applications of blind source separation. Moreover, these methods usually produce multiple detection maps rather than a single anomaly distribution image. In this study, we exploit the concept of the coskewness tensor and propose a new anomaly detection method, called COSD (coskewness detector). COSD does not need iteration and can produce a single detection map. Experiments based on both simulated and real hyperspectral data sets verify the effectiveness of our algorithm.

  1. A high-order statistical tensor based algorithm for anomaly detection in hyperspectral imagery.

    PubMed

    Geng, Xiurui; Sun, Kang; Ji, Luyan; Zhao, Yongchao

    2014-01-01

    Recently, high-order statistics have received increasing interest in the field of hyperspectral anomaly detection. However, most of the existing high-order statistics based anomaly detection methods require stepwise iterations since they are direct applications of blind source separation. Moreover, these methods usually produce multiple detection maps rather than a single anomaly distribution image. In this study, we exploit the concept of the coskewness tensor and propose a new anomaly detection method, called COSD (coskewness detector). COSD does not need iteration and can produce a single detection map. Experiments based on both simulated and real hyperspectral data sets verify the effectiveness of our algorithm. PMID:25366706

  2. Apparatus for detecting a magnetic anomaly contiguous to remote location by SQUID gradiometer and magnetometer systems

    SciTech Connect

    Overton, W.C. Jr.; Steyert, W.A. Jr.

    1984-03-13

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  3. Apparatus for detecting a magnetic anomaly contiguous to remote location by squid gradiometer and magnetometer systems

    DOEpatents

    Overton, Jr., William C.; Steyert, Jr., William A.

    1984-01-01

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  4. Low-rank decomposition-based anomaly detection

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Yu; Yang, Shiming; Kalpakis, Konstantinos; Chang, Chein-I.

    2013-05-01

    With high spectral resolution, hyperspectral imaging is capable of uncovering many subtle signal sources which cannot be known a priori or visually inspected. Such signal sources generally appear as anomalies in the data. Due to high correlation among spectral bands and sparsity of anomalies, a hyperspectral image can be decomposed into two subspaces: a background subspace specified by a matrix with low rank dimensionality and an anomaly subspace specified by a sparse matrix with high rank dimensionality. This paper develops an approach to finding such a low-high rank decomposition to identify the anomaly subspace. The idea is to formulate a convex constrained optimization problem that minimizes the nuclear norm of the background subspace and the ℓ1 norm of the anomaly subspace subject to a decomposition of the data space into background and anomaly subspaces. By virtue of such a background-anomaly decomposition the commonly used RX detector can be implemented in the sense that anomalies can be separated in the anomaly subspace specified by a sparse matrix. Experimental results demonstrate that the background-anomaly subspace decomposition can actually improve and enhance RXD performance.
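
    A crude version of the low-rank-plus-sparse separation can be sketched as follows (a toy alternating scheme with a rank-1 background, not the paper's nuclear-norm convex program; the data and threshold are invented for illustration): alternately fit a rank-1 background by power iteration and soft-threshold the residual, so large entries of the sparse part mark anomalous pixels.

```python
import math

# Toy low-rank + sparse decomposition: rank-1 background via power
# iteration, sparse anomaly part via soft-thresholding of the residual.

def soft(x, lam):
    return math.copysign(max(abs(x) - lam, 0.0), x)

def rank1(M, iters=50):
    rows, cols = len(M), len(M[0])
    v = [1.0] * cols
    u = [0.0] * rows
    for _ in range(iters):
        u = [sum(M[i][j] * v[j] for j in range(cols)) for i in range(rows)]
        norm = sum(x * x for x in u) ** 0.5 or 1.0
        u = [x / norm for x in u]
        v = [sum(M[i][j] * u[i] for i in range(rows)) for j in range(cols)]
    return [[u[i] * v[j] for j in range(cols)] for i in range(rows)]

def decompose(M, lam=1.0, rounds=5):
    rows, cols = len(M), len(M[0])
    S = [[0.0] * cols for _ in range(rows)]
    for _ in range(rounds):
        L = rank1([[M[i][j] - S[i][j] for j in range(cols)] for i in range(rows)])
        S = [[soft(M[i][j] - L[i][j], lam) for j in range(cols)] for i in range(rows)]
    return L, S

# Synthetic scene: rank-1 background (band signature x pixel gains)
# plus one anomalous pixel injected at band 2, pixel 3.
bands, pixels = [1.0, 2.0, 3.0, 4.0], [2.0, 1.0, 3.0, 2.0, 1.0]
X = [[b * p for p in pixels] for b in bands]
X[2][3] += 10.0
L, S = decompose(X)
```

The sparse part S concentrates its energy on the injected pixel, which is the sense in which an RX-style detector can then operate on the anomaly subspace alone.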

  5. Sensor Anomaly Detection in Wireless Sensor Networks for Healthcare

    PubMed Central

    Haque, Shah Ahsanul; Rahman, Mustafizur; Aziz, Syed Mahfuzul

    2015-01-01

    Wireless Sensor Networks (WSN) are vulnerable to various sensor faults and faulty measurements. This vulnerability hinders efficient and timely response in various WSN applications, such as healthcare. For example, faulty measurements can create false alarms which may require unnecessary intervention from healthcare personnel. Therefore, an approach to differentiate between real medical conditions and false alarms will improve remote patient monitoring systems and quality of healthcare service afforded by WSN. In this paper, a novel approach is proposed to detect sensor anomaly by analyzing collected physiological data from medical sensors. The objective of this method is to effectively distinguish false alarms from true alarms. It predicts a sensor value from historic values and compares it with the actual sensed value for a particular instance. The difference is compared against a threshold value, which is dynamically adjusted, to ascertain whether the sensor value is anomalous. The proposed approach has been applied to real healthcare datasets and compared with existing approaches. Experimental results demonstrate the effectiveness of the proposed system, providing high Detection Rate (DR) and low False Positive Rate (FPR). PMID:25884786
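
    The predict-compare-threshold loop described above can be sketched as follows (a hedged toy with invented numbers, not the paper's clinical rules): each reading is forecast from recent history, the residual is compared against a dynamically adjusted threshold, and the threshold statistics are updated only on readings judged normal.

```python
import math

# Sketch: persistence forecast plus a dynamic threshold (a multiple of an
# EWMA of recent residual magnitudes). Readings whose residual exceeds the
# threshold are flagged as sensor anomalies / candidate false alarms.

def detect(readings, k=4.0, floor=0.05, alpha=0.1):
    flags = []
    ewma = floor
    for t in range(1, len(readings)):
        residual = abs(readings[t] - readings[t - 1])  # persistence prediction
        threshold = k * ewma + floor                   # dynamically adjusted
        if residual > threshold:
            flags.append(t)
        else:
            # update statistics only on normal data, so an anomalous
            # reading does not inflate its own threshold
            ewma = (1 - alpha) * ewma + alpha * residual
    return flags

# Simulated physiological signal with one injected faulty measurement.
signal = [math.sin(2 * math.pi * t / 50) for t in range(200)]
signal[120] += 5.0
flags = detect(signal)
```

The spike is flagged while the smoothly varying signal is not, which is the distinction between a true alarm candidate and routine variation.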

  6. Real-time anomaly detection in full motion video

    NASA Astrophysics Data System (ADS)

    Konowicz, Glenn; Li, Jiang

    2012-06-01

    Improvement in sensor technology such as charge-coupled devices (CCD), as well as constant incremental improvements in storage space, has made the recording and storage of video more prevalent and lower cost than ever before. However, the improvements in the ability to capture and store a wide array of video have required additional manpower to translate these raw data sources into useful information. We propose an algorithm for automatically detecting anomalous movement patterns within full motion video, thus reducing the amount of human intervention required to make use of these new data sources. The proposed algorithm tracks all of the objects within a video sequence and attempts to cluster each object's trajectory into a database of existing trajectories. Objects are tracked by first differentiating them from a Gaussian background model and then tracked over subsequent frames based on a combination of size and color. Once an object is tracked over several frames, its trajectory is calculated and compared with other trajectories earlier in the video sequence. Anomalous trajectories are differentiated by their failure to cluster with other well-known movement patterns. Adding the proposed algorithm to an existing surveillance system could increase the likelihood of identifying an anomaly and allow for more efficient collection of intelligence data. Additionally, by operating in real-time, our algorithm allows for the reallocation of sensing equipment to those areas most likely to contain movement that is valuable for situational awareness.
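
    The clustering stage can be sketched as follows (a toy with hypothetical distances and radii, not the paper's tracker): equal-length trajectories are grouped online by mean pointwise distance to cluster centroids, and a trajectory that fails to join any well-supported cluster is reported as anomalous.

```python
# Toy online trajectory clusterer: a trajectory is anomalous if it starts a
# new cluster or joins one that has not yet accumulated enough members.

def traj_dist(a, b):
    return sum(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
               for (ax, ay), (bx, by) in zip(a, b)) / len(a)

class TrajectoryClusterer:
    def __init__(self, radius=1.0, min_support=3):
        self.radius, self.min_support = radius, min_support
        self.clusters = []  # list of [centroid, member_count]

    def observe(self, traj):
        best = min(self.clusters, key=lambda c: traj_dist(traj, c[0]), default=None)
        if best is not None and traj_dist(traj, best[0]) <= self.radius:
            best[1] += 1
            # drift the centroid toward the new member
            best[0] = [((cx * (best[1] - 1) + x) / best[1],
                        (cy * (best[1] - 1) + y) / best[1])
                       for (cx, cy), (x, y) in zip(best[0], traj)]
            return best[1] < self.min_support  # anomalous until well supported
        self.clusters.append([list(traj), 1])
        return True

clusterer = TrajectoryClusterer()
normals = [[(t, dy) for t in range(10)] for dy in (0.0, 0.1, -0.1, 0.05, -0.05)]
results = [clusterer.observe(tr) for tr in normals]
odd = clusterer.observe([(t, float(t)) for t in range(10)])  # diagonal path
```

Early members of a new movement pattern are reported until the pattern accumulates support, mirroring the unsupervised, online learning described in the abstract.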

  7. SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) 2013

    SciTech Connect

    Gordon Rueff; Lyle Roybal; Denis Vollmer

    2013-01-01

    There is a significant need to protect the nation’s energy infrastructures from malicious actors using cyber methods. Supervisory, Control, and Data Acquisition (SCADA) systems may be vulnerable due to the insufficient security implemented during the design and deployment of these control systems. This is particularly true in older legacy SCADA systems that are still commonly in use. The purpose of INL’s research on the SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) project was to determine if and how data compression techniques could be used to identify and protect SCADA systems from cyber attacks. Initially, the concept was centered on how to train a compression algorithm to recognize normal control system traffic versus hostile network traffic. Because large portions of the TCP/IP message traffic (called packets) are repetitive, the concept of using compression techniques to differentiate “non-normal” traffic was proposed. In this manner, malicious SCADA traffic could be identified at the packet level prior to completing its payload. Previous research has shown that SCADA network traffic has traits desirable for compression analysis. This work investigated three different approaches to identify malicious SCADA network traffic using compression techniques. The preliminary analyses and results presented herein are clearly able to differentiate normal from malicious network traffic at the packet level at a very high confidence level for the conditions tested. Additionally, the master dictionary approach used in this research appears to initially provide a meaningful way to categorize and compare packets within a communication channel.
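
    The master-dictionary idea can be illustrated with zlib's preset-dictionary support (a hedged toy with our own made-up protocol strings, not INL's SCADA captures): packets are scored by how well they compress against a dictionary built from normal traffic, so repetitive normal packets compress well and unfamiliar payloads do not.

```python
import zlib, hashlib

# Sketch: build a "master dictionary" from normal traffic, then score each
# packet by its compression ratio against that dictionary.

NORMAL = [f"READ HOLDING UNIT 01 ADDR {a:04d} COUNT 0008".encode()
          for a in range(0, 320, 8)]
DICTIONARY = b"".join(NORMAL)

def compression_ratio(packet: bytes) -> float:
    comp = zlib.compressobj(level=9, zdict=DICTIONARY)
    out = comp.compress(packet) + comp.flush()
    return len(out) / len(packet)  # low ratio = familiar, high = suspicious

normal_score = compression_ratio(b"READ HOLDING UNIT 01 ADDR 0048 COUNT 0008")
# deterministic high-entropy bytes as a stand-in for a malicious payload
malicious = b"".join(hashlib.sha256(bytes([i])).digest() for i in range(4))
malicious_score = compression_ratio(malicious)
```

Thresholding the ratio gives a packet-level classifier in the spirit the abstract describes, without inspecting the payload semantics.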

  8. Learning patterns of human activity for anomaly detection

    NASA Astrophysics Data System (ADS)

    Gutchess, Daniel; Checka, Neal; Snorrason, Magnús S.

    2007-04-01

    Commercial security and surveillance systems offer advanced sensors, optics, and display capabilities but lack intelligent processing. This necessitates human operators who must closely monitor video for situational awareness and threat assessment. For instance, urban environments are typically in a state of constant activity, which generates numerous visual cues, each of which must be examined so that potential security breaches do not go unnoticed. We are building a prototype system called BALDUR (Behavior Adaptive Learning during Urban Reconnaissance) that learns probabilistic models of activity for a given site using online and unsupervised training techniques. Once a camera system is set up, no operator intervention is required for the system to begin learning patterns of activity. Anomalies corresponding to unusual or suspicious behavior are automatically detected in real time. All moving object tracks (pedestrians, vehicles, etc.) are efficiently stored in a relational database for use in training. The database is also well suited for answering human- initiated queries. An example of such a query is, "Display all pedestrians who approached the door of the building between the hours of 9:00pm and 11:00pm." This forensic analysis tool complements the system's real-time situational awareness capabilities. Several large datasets have been collected for the evaluation of the system, including one database containing an entire month of activity from a commercial parking lot.

  9. Hyperspectral anomaly detection using sparse kernel-based ensemble learning

    NASA Astrophysics Data System (ADS)

    Gurram, Prudhvi; Han, Timothy; Kwon, Heesung

    2011-06-01

    In this paper, sparse kernel-based ensemble learning for hyperspectral anomaly detection is proposed. The proposed technique is aimed to optimize an ensemble of kernel-based one class classifiers, such as Support Vector Data Description (SVDD) classifiers, by estimating optimal sparse weights. In this method, hyperspectral signatures are first randomly sub-sampled into a large number of spectral feature subspaces. An enclosing hypersphere that defines the support of spectral data, corresponding to the normalcy/background data, in the Reproducing Kernel Hilbert Space (RKHS) of each respective feature subspace is then estimated using regular SVDD. The enclosing hypersphere basically represents the spectral characteristics of the background data in the respective feature subspace. The joint hypersphere is learned by optimally combining the hyperspheres from the individual RKHS, while imposing the l1 constraint on the combining weights. The joint hypersphere representing the most optimal compact support of the local hyperspectral data in the joint feature subspaces is then used to test each pixel in hyperspectral image data to determine if it belongs to the local background data or not. The outliers are considered to be targets. The performance comparison between the proposed technique and the regular SVDD is provided using the HYDICE hyperspectral images.

  10. Fault detection on a sewer network by a combination of a Kalman filter and a binary sequential probability ratio test

    NASA Astrophysics Data System (ADS)

    Piatyszek, E.; Voignier, P.; Graillot, D.

    2000-05-01

    One of the aims of sewer networks is the protection of the population against floods and the reduction of pollution discharged into the receiving water during rainy events. To meet these goals, managers have to instrument the sewer networks and set up real-time control systems. Unfortunately, a component fault (leading to intolerable behaviour of the system) or sensor fault (deteriorating the process view and disturbing the local automatism) makes sewer network supervision delicate. In order to ensure adequate flow management during rainy events it is essential to set up procedures capable of detecting and diagnosing these anomalies. This article introduces a real-time fault detection method, applicable to sewer networks, for the follow-up of rainy events. The method consists of comparing the sensor response with a forecast of this response. This forecast is provided by a model, more precisely by a state estimator: a Kalman filter. The Kalman filter provides not only a flow estimate but also an entity called the 'innovation'. In order to detect abnormal operations within the network, this innovation is analysed with Wald's binary sequential probability ratio test. Moreover, by crossing available information from several nodes of the network, a diagnosis of the detected anomalies is carried out. This method provided encouraging results during the analysis of several rain events on the sewer network of Seine-Saint-Denis County, France.
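
    The Kalman-filter-plus-SPRT combination can be sketched in scalar form (a hedged stand-in with invented noise levels, not the sewer model): a Kalman filter tracks the flow level, and Wald's sequential probability ratio test is run on the filter innovations, so a sustained innovation shift signals a sensor or component fault.

```python
import math, random

# Sketch: scalar random-walk Kalman filter; a one-sided SPRT (with a
# CUSUM-style reset at zero) monitors the innovations for a mean shift.

def monitor(measurements, q=0.01, r=0.25, mu1=2.0, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)
    x, p, llr, alarms = 0.0, 1.0, 0.0, []
    for k, y in enumerate(measurements):
        p += q                               # predict (random-walk state)
        s = p + r                            # innovation variance
        nu = y - x                           # innovation
        llr += (mu1 * nu - 0.5 * mu1 ** 2) / s
        llr = max(llr, 0.0)                  # one-sided, reset at zero
        if llr >= upper:
            alarms.append(k)
            llr = 0.0
        gain = p / s                         # Kalman update
        x += gain * nu
        p *= (1 - gain)
    return alarms

random.seed(7)
fault_k = 100
data = [random.gauss(0.0, 0.5) for _ in range(200)]
for k in range(fault_k, 200):
    data[k] += 3.0                           # injected sensor bias
alarms = monitor(data)
```

Because the random-walk model eventually absorbs a constant bias, the innovations show a transient shift; the sequential test is what turns that transient into a timely, statistically controlled alarm.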

  11. Sequential detection of a weak target in a hostile ocean environment

    SciTech Connect

    Candy, J V; Sullivan, E J

    2005-03-14

    When the underlying physical phenomenology (medium, sediment, bottom, etc.) is space-time varying along with corresponding nonstationary statistics characterizing noise and uncertainties, then sequential methods must be applied to capture the underlying processes. Sequential detection and estimation techniques offer distinct advantages over batch methods. A reasonable signal processing approach to solve this class of problem is to employ adaptive or parametrically adaptive signal and noise models to capture these phenomena. In this paper, we develop a sequential approach to solve the signal detection problem in a nonstationary environment.

  12. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data.

    PubMed

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied on unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection, as well as in the life science and medical domain. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, the computational effort, the impact of parameter settings, as well as the global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks.

  13. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data

    PubMed Central

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied on unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection, as well as in the life science and medical domain. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, the computational effort, the impact of parameter settings, as well as the global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks. PMID:27093601

  14. Remote detection of geobotanical anomalies associated with hydrocarbon microseepage

    NASA Technical Reports Server (NTRS)

    Rock, B. N.

    1985-01-01

    As part of the continuing study of the Lost River, West Virginia NASA/Geosat Test Case Site, an extensive soil gas survey of the site was conducted during the summer of 1983. This soil gas survey has identified an order of magnitude methane, ethane, propane, and butane anomaly that is precisely coincident with the linear maple anomaly reported previously. This and other maple anomalies were previously suggested to be indicative of anaerobic soil conditions associated with hydrocarbon microseepage. In vitro studies support the view that anomalous distributions of native tree species tolerant of anaerobic soil conditions may be useful indicators of methane microseepage in heavily vegetated areas of the United States characterized by deciduous forest cover. Remote sensing systems which allow discrimination and mapping of native tree species and/or species associations will provide the exploration community with a means of identifying vegetation distributional anomalies indicative of microseepage.

  15. Accumulating pyramid spatial-spectral collaborative coding divergence for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Zou, Huanxin; Zhou, Shilin

    2016-03-01

    Detection of anomalous targets of various sizes in hyperspectral data has received a lot of attention in reconnaissance and surveillance applications. Many anomaly detectors have been proposed in the literature. However, current methods are susceptible to anomalies in the processing window range and often make critical assumptions about the distribution of the background data. Motivated by the fact that anomaly pixels are often distinctive from their local background, in this letter we propose a novel hyperspectral anomaly detection framework for real-time remote sensing applications. The proposed framework consists of four major components: sparse feature learning, pyramid grid window selection, joint spatial-spectral collaborative coding and multi-level divergence fusion. It exploits the collaborative representation difference in the feature space to locate potential anomalies and is totally unsupervised without any prior assumptions. Experimental results on airborne recorded hyperspectral data demonstrate that the proposed method adapts to anomalies across a large range of sizes and is well suited for parallel processing.

  16. Detecting anomalies in CMB maps: a new method

    SciTech Connect

    Neelakanta, Jayanth T.

    2015-10-01

    Ever since WMAP announced its first results, different analyses have shown that there is weak evidence for several large-scale anomalies in the CMB data. While the evidence for each anomaly appears to be weak, the fact that there are multiple seemingly unrelated anomalies makes it difficult to account for them via a single statistical fluke. So, one is led to considering a combination of these anomalies. But, if we "hand-pick" the anomalies (test statistics) to consider, we are making an a posteriori choice. In this article, we propose two statistics that do not suffer from this problem. The statistics are linear and quadratic combinations of the a_{ℓm}'s with random coefficients, and they test the null hypothesis that the a_{ℓm}'s are independent, normally-distributed, zero-mean random variables with an m-independent variance. The motivation for considering multiple modes is this: because most physical models that lead to large-scale anomalies result in coupling multiple ℓ and m modes, the "coherence" of this coupling should get enhanced if a combination of different modes is considered. In this sense, the statistics are thus much more generic than those that have been hitherto considered in the literature. Using fiducial data, we demonstrate that the method works and discuss how it can be used with actual CMB data to make quite general statements about the incompatibility of the data with the null hypothesis.

  17. Structural chromosomal anomalies detected by prenatal genetic diagnosis: our experience.

    PubMed

    Farcaş, Simona; Crişan, C D; Andreescu, Nicoleta; Stoian, Monica; Motoc, A G M

    2013-01-01

    Prenatal diagnosis is now widely available and provides important genetic information about the fetus at an unprecedented pace. In this paper, we present our experience with the genetic diagnosis and counseling offered for pregnancies in which a structural chromosomal aberration was found. The study group comprises 528 prenatal samples of amniotic fluid and chorionic villi received by our laboratory from 2006 through October 2012 for cytogenetic diagnosis. The appropriate genetic investigation was selected based on the indications for prenatal diagnosis. The cases with structural chromosomal anomalies and polymorphic variants were analyzed with regard to maternal age, gestational age, referral indications and type of chromosomal anomaly found. A total of 21 structural chromosomal anomalies and polymorphic variants were identified in the study group: six deletions and microdeletions, four cases with abnormally long p arms of acrocentric chromosomes, two duplications, two reciprocal translocations, two inversions, two additions, one Robertsonian translocation associated with trisomy 13, one 9q heteromorphism and one complex chromosome rearrangement. To the best of our knowledge, this is the first Romanian study in which the diagnostic strategies and the management of prenatal cases with structural rearrangements are presented. The data provided on the diagnostic strategy and the management of prenatal cases with structural chromosomal anomalies represent a useful tool in genetic counseling of pregnancies diagnosed with rare structural chromosomal anomalies. PMID:23771085

  18. Gaussian Process Regression-Based Video Anomaly Detection and Localization With Hierarchical Feature Representation.

    PubMed

    Cheng, Kai-Wen; Chen, Yie-Tarng; Fang, Wen-Hsien

    2015-12-01

    This paper presents a hierarchical framework for detecting local and global anomalies via hierarchical feature representation and Gaussian process regression (GPR), which is fully non-parametric, robust to noisy training data, and supports sparse features. While most research on anomaly detection has focused on local anomalies, we are more interested in global anomalies that involve multiple normal events interacting in an unusual manner, such as car accidents. To detect local and global anomalies simultaneously, we cast the extraction of normal interactions from the training videos as a problem of finding the frequent geometric relations of nearby sparse spatio-temporal interest points (STIPs). A codebook of interaction templates is then constructed and modeled using GPR, based on which a novel inference method for computing the likelihood of an observed interaction is developed. These local likelihood scores are then integrated into globally consistent anomaly masks, from which anomalies can be succinctly identified. To the best of our knowledge, this is the first time GPR has been employed to model the relationships of nearby STIPs for anomaly detection. Simulations based on four widely used datasets show that the new method outperforms the main state-of-the-art methods with lower computational burden. PMID:26394423
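
    The abstract describes scoring an observed interaction by its likelihood under a fitted GPR model. The paper's feature pipeline is not reproduced here; the sketch below is only a minimal, hypothetical illustration (1-D toy features, illustrative kernel choices) of computing the log-likelihood of an observation under a GPR predictive Gaussian with scikit-learn:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_template_model(x_train, y_train):
    """Fit a GPR over interaction-template features (toy 1-D stand-in)."""
    kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
    return GaussianProcessRegressor(kernel=kernel, random_state=0).fit(x_train, y_train)

def log_likelihood(gpr, x, y):
    """Log-likelihood of observed value y under the GPR predictive Gaussian at x."""
    mu, sd = gpr.predict(x, return_std=True)
    mu, sd = float(mu[0]), float(sd[0])
    return -0.5 * ((y - mu) / sd) ** 2 - np.log(sd) - 0.5 * np.log(2 * np.pi)
```

    An observation far from the model's prediction receives a low log-likelihood and can be flagged; in the paper, such scores are aggregated into anomaly masks.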

  19. Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data

    NASA Technical Reports Server (NTRS)

    Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney

    2012-01-01

    This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and accounts for flight-to-flight and vehicle-to-vehicle variability as fixed effects. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, a multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and failures, biases, and trends in flight actuators. These anomalies were missed by the existing airline exceedance monitoring of FOQA data.

  20. Detecting Anomaly Regions in Satellite Image Time Series Based on Seasonal Autocorrelation Analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Anomaly regions in satellite images can reflect unexpected changes of land cover caused by floods, fires, landslides, etc. Detecting anomaly regions in satellite image time series is important for studying the dynamic processes of land cover change and for disaster monitoring. Although several methods have been developed to detect land cover changes using satellite image time series, they are generally designed for inter-annual or abrupt changes and do not focus on spatial-temporal changes across continuous images. To identify the spatial-temporal dynamics of unexpected land cover changes, this study proposes a method for detecting anomaly regions in each image of a satellite image time series based on seasonal autocorrelation analysis. The method was validated with a case study detecting the spatial-temporal progression of a severe flood using Terra/MODIS image time series. Experiments demonstrated three advantages of the method: (1) it can effectively detect anomaly regions in each image of a satellite image time series, showing the spatially and temporally varying extent of the anomaly regions; (2) it can flexibly meet different detection-accuracy requirements (e.g., via the z-value or significance level), with overall accuracy up to 89% and precision above 90%; and (3) it requires no time series smoothing and can detect anomaly regions in noisy satellite images with high reliability.
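
    The abstract mentions a z-value test against each pixel's seasonal behavior but gives no algorithmic detail. The following is a generic sketch (function name and defaults are hypothetical, not from the paper) of flagging observations that deviate from their own seasonal history:

```python
import numpy as np

def seasonal_zscore_anomalies(series, period=12, z_thresh=3.0):
    """Flag observations deviating from their seasonal history.

    series: 1-D time series for one pixel (e.g., monthly composites).
    period: length of the seasonal cycle (12 for monthly data).
    Returns a boolean mask of anomalous time steps.
    """
    series = np.asarray(series, dtype=float)
    mask = np.zeros(series.size, dtype=bool)
    for phase in range(period):
        # All observations at the same point in the seasonal cycle.
        idx = np.arange(phase, series.size, period)
        vals = series[idx]
        mu, sigma = vals.mean(), vals.std(ddof=1)
        if sigma == 0:
            continue  # constant seasonal history: nothing to flag
        mask[idx] = np.abs(vals - mu) > z_thresh * sigma
    return mask
```

    Applied per pixel, connected groups of flagged pixels form candidate anomaly regions.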

  1. NADIR (Network Anomaly Detection and Intrusion Reporter): A prototype network intrusion detection system

    SciTech Connect

    Jackson, K.A.; DuBois, D.H.; Stallings, C.A.

    1990-01-01

    The Network Anomaly Detection and Intrusion Reporter (NADIR) is an expert system intended to provide real-time security auditing for intrusion and misuse detection on Los Alamos National Laboratory's Integrated Computing Network (ICN). It is based on three basic assumptions: first, that statistical analysis of computer system and user activities may be used to characterize normal system and user behavior, and that, given the resulting statistical profiles, behavior which deviates beyond certain bounds can be detected; second, that expert system techniques can be applied to security auditing and intrusion detection; and third, that successful intrusion detection may take place while monitoring only a limited set of network activities, such as user authentication and access control, file movement and storage, and job scheduling. NADIR has been developed to employ these concepts while monitoring the audited activities of more than 8000 ICN users.
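
    The first assumption (statistical profiles with deviation bounds) can be sketched generically; the code below is an illustration of that idea only, not NADIR's actual rules, and all names and activity categories are hypothetical:

```python
import numpy as np

def profile_flags(history, current, n_sigma=3.0):
    """Flag audited activity counts that deviate from a user's statistical profile.

    history: (n_periods, n_activities) past per-period counts
             (e.g., authentications, file movements, job submissions).
    current: (n_activities,) counts for the period under audit.
    Returns indices of activities outside mean +/- n_sigma * std.
    """
    mu = history.mean(axis=0)
    sigma = history.std(axis=0, ddof=1)
    sigma = np.where(sigma == 0, 1.0, sigma)  # avoid zero-variance division
    z = np.abs(current - mu) / sigma
    return np.flatnonzero(z > n_sigma)
```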

  2. Classification of SD-OCT volumes for DME detection: an anomaly detection approach

    NASA Astrophysics Data System (ADS)

    Sankar, S.; Sidibé, D.; Cheung, Y.; Wong, T. Y.; Lamoureux, E.; Milea, D.; Meriaudeau, F.

    2016-03-01

    Diabetic Macular Edema (DME) is the leading cause of blindness among diabetic patients worldwide. It is characterized by an accumulation of fluid in the macula, leading to swelling. Early detection of the disease helps prevent further loss of vision, so automated detection of DME from Optical Coherence Tomography (OCT) volumes plays a key role. To this end, a pipeline for detecting DME in OCT volumes is proposed in this paper. The method is based on anomaly detection using a Gaussian Mixture Model (GMM). It starts by pre-processing the B-scans: resizing, flattening, filtering, and extracting features from them. Both intensity and Local Binary Pattern (LBP) features are considered. The dimensionality of the extracted features is reduced using PCA. In the last stage, a GMM is fitted to features from normal volumes. During testing, features extracted from the test volume are evaluated against the fitted model, and classification is made based on the number of B-scans detected as outliers. The proposed method is tested on two OCT datasets, achieving a sensitivity and specificity of 80% and 93% on the first dataset, and 100% and 80% on the second. Moreover, experiments show that the proposed method achieves better classification performance than other recently published works.
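
    The PCA + GMM stage of this pipeline can be sketched with scikit-learn. This is a simplified illustration, not the authors' implementation: feature extraction is replaced by plain arrays, and the percentile-based outlier threshold is an assumption of this sketch:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

def fit_normal_model(train_feats, n_components=3, dim=10, seed=0):
    """Fit PCA + GMM on per-B-scan features from healthy volumes."""
    pca = PCA(n_components=dim, random_state=seed)
    z = pca.fit_transform(train_feats)
    gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(z)
    # Threshold at the 1st percentile of normal log-likelihoods (assumed rule).
    thresh = np.percentile(gmm.score_samples(z), 1)
    return pca, gmm, thresh

def count_outlier_bscans(pca, gmm, thresh, volume_feats):
    """Number of B-scans in a volume whose likelihood falls below the threshold."""
    scores = gmm.score_samples(pca.transform(volume_feats))
    return int((scores < thresh).sum())
```

    A volume would then be classified as DME when its outlier B-scan count exceeds some cutoff.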

  3. Energy Detection Based on Undecimated Discrete Wavelet Transform and Its Application in Magnetic Anomaly Detection

    PubMed Central

    Nie, Xinhua; Pan, Zhongming; Zhang, Dasha; Zhou, Han; Chen, Min; Zhang, Wenna

    2014-01-01

    Magnetic anomaly detection (MAD) is a passive approach for detecting a ferromagnetic target, and its performance is often limited by external noise. Since one major noise source is fractal noise (also called 1/f noise) with a power spectral density of 1/f^α (0 < α < 2, where α is the spectral parameter), an energy detection method based on the undecimated discrete wavelet transform (UDWT) is proposed in this paper. First, the foundations of magnetic anomaly detection and the UDWT are briefly introduced, and a possible detection system based on a giant magneto-impedance (GMI) magnetic sensor is presented. Our proposed UDWT-based energy detection is then described in detail, and the theoretical probabilities of false alarm and detection for a given detection threshold are derived. Notably, no a priori assumptions about the ferromagnetic target or the magnetic noise probability are necessary for our method, and, unlike the discrete wavelet transform (DWT), the UDWT is shift invariant. Finally, simulations show that the detection performance of the proposed detector is better than that of the conventional energy detector, even in Gaussian white noise, and especially when the spectral parameter α is less than 1.0. In addition, a real-world experiment demonstrates the advantages of the proposed method. PMID:25343484
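
    The core idea (shift-invariant wavelet details, then windowed energy against a threshold) can be sketched in a few lines. This toy uses an undecimated Haar transform and a crude threshold, neither of which is claimed to match the paper's wavelet choice, decomposition depth, or theoretical threshold derivation:

```python
import numpy as np

def udwt_haar_details(x, levels=3):
    """Undecimated (shift-invariant) Haar detail coefficients, a-trous style."""
    x = np.asarray(x, dtype=float)
    approx = x.copy()
    details = []
    for j in range(levels):
        step = 2 ** j                       # filters are upsampled, never decimated
        shifted = np.roll(approx, step)     # circular boundary handling
        details.append((approx - shifted) / 2.0)
        approx = (approx + shifted) / 2.0
    return details

def energy_detect(x, window=32, levels=3, threshold=None):
    """Sliding-window energy of detail coefficients; flag windows above threshold."""
    d = np.vstack(udwt_haar_details(x, levels))
    energy = np.convolve((d ** 2).sum(axis=0), np.ones(window), mode="same")
    if threshold is None:
        threshold = np.median(energy) * 5.0  # crude data-driven fallback
    return energy, energy > threshold
```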

  4. Numerical study on the sequential Bayesian approach for radioactive materials detection

    NASA Astrophysics Data System (ADS)

    Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng

    2013-01-01

    A new detection method based on the sequential Bayesian approach proposed by Candy et al. offers new horizons for radiation detection research. Compared with commonly adopted detection methods based on classical statistical theory, the sequential Bayesian approach offers the advantage of shorter verification times when analyzing spectra with low total counts, especially for complex radionuclide compositions. In this paper, a simulation experiment platform implementing the methodology of the sequential Bayesian approach was developed. To study the performance of the approach, event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator using Monte Carlo sampling theory. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented by the expected detection rate (Am) and the tested detection rate (Gm) parameters, respectively, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
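
    Candy et al.'s full processor works on individual photon arrivals; as a much-simplified illustration of the sequential Bayesian idea, the sketch below updates the posterior probability of "source present" after each counting interval under a Poisson model. All names and rates are hypothetical:

```python
import math

def sequential_bayes_detect(counts, background_rate, source_rate, prior=0.5,
                            decide_at=0.99):
    """Sequentially update P(source present) from Poisson interval counts.

    H1: rate = background_rate + source_rate;  H0: rate = background_rate.
    Returns (posterior trace, index of first decision or None).
    """
    def poisson_logpmf(k, lam):
        return k * math.log(lam) - lam - math.lgamma(k + 1)

    log_odds = math.log(prior / (1 - prior))
    trace, decision = [], None
    for i, k in enumerate(counts):
        # Accumulate the log-likelihood ratio of this observation.
        log_odds += (poisson_logpmf(k, background_rate + source_rate)
                     - poisson_logpmf(k, background_rate))
        p = 1 / (1 + math.exp(-log_odds))
        trace.append(p)
        if decision is None and p >= decide_at:
            decision = i
    return trace, decision
```

    The sequential form is what shortens verification time: a decision is declared as soon as the posterior crosses the confidence level, rather than after a fixed counting period.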

  5. Anomaly Detection in Multiple Scale for Insider Threat Analysis

    SciTech Connect

    Kim, Yoohwan; Sheldon, Frederick T; Hively, Lee M

    2012-01-01

    We propose a method to quantify malicious insider activity with statistical and graph-based analysis aided by semantic scoring rules. Different types of personal activities or interactions are monitored to form a set of directed weighted graphs. The semantic scoring rules assign higher scores to more significant and suspicious events. We then build personal activity profiles in the form of score tables. Profiles are created at multiple scales, where low-level profiles are aggregated into more stable higher-level profiles within the subject or object hierarchy. The profiles are also created at different time scales, such as day, week, or month. During operation, the insider's current activity profile is compared to the historical profiles to produce an anomaly score. For each subject with a high anomaly score, a subgraph of connected subjects is extracted to look for any related score movement. Finally, the subjects are ranked by their anomaly scores to help analysts focus on high-scoring subjects. The threat-ranking component supports the interaction between the User Dashboard and the Insider Threat Knowledge Base portal. The portal includes a repository for historical results, i.e., adjudicated cases containing all of the information first presented to the user along with any additional insights to help the analysts. In this paper we present the framework of the proposed system and its operational algorithms.
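
    The scoring-and-comparison step can be sketched as follows. The rule table, event names, and z-score comparison are all hypothetical stand-ins for the paper's semantic rules and profile machinery:

```python
import numpy as np

# Hypothetical semantic scoring rules: more suspicious events score higher.
SCORES = {"login": 1.0, "file_copy": 3.0, "usb_insert": 8.0, "after_hours_access": 5.0}

def daily_profile(events):
    """Aggregate a day's events into a semantic score total (unknown events score 0.5)."""
    return sum(SCORES.get(e, 0.5) for e in events)

def anomaly_score(history_scores, today_score):
    """Compare today's profile against the subject's historical daily profiles."""
    mu = np.mean(history_scores)
    sigma = np.std(history_scores, ddof=1)
    return (today_score - mu) / max(sigma, 1e-9)
```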

  6. Lunar magnetic anomalies detected by the Apollo substatellite magnetometers

    USGS Publications Warehouse

    Hood, L.L.; Coleman, P.J.; Russell, C.T.; Wilhelms, D.E.

    1979-01-01

    Properties of lunar crustal magnetization thus far deduced from Apollo subsatellite magnetometer data are reviewed using two of the most accurate presently available magnetic anomaly maps - one covering a portion of the lunar near side and the other a part of the far side. The largest single anomaly found within the region of coverage on the near-side map correlates exactly with a conspicuous, light-colored marking in western Oceanus Procellarum called Reiner Gamma. This feature is interpreted as an unusual deposit of ejecta from secondary craters of the large nearby primary impact crater Cavalerius. An age for Cavalerius (and, by implication, for Reiner Gamma) of 3.2 ± 0.2 × 10^9 y is estimated. The main (30 × 60 km) Reiner Gamma deposit is nearly uniformly magnetized in a single direction, with a minimum mean magnetization intensity of ~7 × 10^-2 G cm^3/g (assuming a density of 3 g/cm^3), or about 700 times the stable magnetization component of the most magnetic returned samples. Additional medium-amplitude anomalies exist over the Fra Mauro Formation (Imbrium basin ejecta emplaced ~3.9 × 10^9 y ago) where it has not been flooded by mare basalt flows, but are nearly absent over the maria and over the craters Copernicus, Kepler, and Reiner and their encircling ejecta mantles. The mean altitude of the far-side anomaly map is much higher than that of the near-side map and the surface geology is more complex, so individual anomaly sources have not yet been identified. However, it is clear that a concentration of especially strong sources exists in the vicinity of the craters Van de Graaff and Aitken. Numerical modeling of the associated fields reveals that the source locations do not correspond with the larger primary impact craters of the region and, by analogy with Reiner Gamma, may be less conspicuous secondary crater ejecta deposits.
The reason for a special concentration of strong sources in the Van de Graaff-Aitken region is unknown, but may be indirectly

  7. Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)

    2001-01-01

    The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is essentially a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes the uncertainty and vagueness typically found in image analysis; ANN provides learning capabilities; and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along the slide basket cables used by astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another potentially viable application at NASA is in detecting anomalies on the NASA Space Shuttle Orbiter's radiator panels.

  8. A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization

    DOE PAGES

    Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.; Laska, Jason A.; Sullivan, Blair D.

    2016-10-20

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  9. A hyperspectral imagery anomaly detection algorithm based on local three-dimensional orthogonal subspace projection

    NASA Astrophysics Data System (ADS)

    Zhang, Xing; Wen, Gongjian

    2015-10-01

    Anomaly detection (AD) is increasingly important in hyperspectral imagery analysis, with many practical applications. The local orthogonal subspace projection (LOSP) detector is a popular anomaly detector which exploits local endmembers/eigenvectors around the pixel under test (PUT) to construct a background subspace. However, this subspace exploits only the spectral information and neglects the spatial correlation of the background clutter, which makes the anomaly detection result sensitive to the accuracy of the estimated subspace. In this paper, a local three-dimensional orthogonal subspace projection (3D-LOSP) algorithm is proposed. First, using spectral and spatial information jointly, three directional background subspaces are created along the image height direction, the image width direction, and the spectral direction, respectively. Then, the three corresponding orthogonal subspaces are calculated. After that, each vector along the three directions of the local cube is projected onto the corresponding orthogonal subspace. Finally, a composite score is formed from the three directional operators. In 3D-LOSP, anomalies are redefined as targets that are not only spectrally different from the background but also spatially distinct. Thanks to the added spatial information, the robustness of the anomaly detection result is greatly improved by the proposed 3D-LOSP algorithm. Notably, the proposed algorithm is an extension of LOSP, and the same idea can inspire many other spectral-based anomaly detection methods. Experiments with real hyperspectral images have demonstrated the stability of the detection results.
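
    The single-direction building block (an orthogonal subspace projection score against a locally estimated background subspace) can be sketched as follows; the full three-directional 3D-LOSP machinery is not reproduced, and the function name and basis size are illustrative:

```python
import numpy as np

def osp_anomaly_score(pixel, background, n_basis=3):
    """Energy left in a pixel after projecting out the local background subspace.

    pixel: (bands,) spectrum under test.
    background: (n_pixels, bands) spectra of local neighbors.
    """
    bg = background - background.mean(axis=0)
    # Leading right singular vectors span the local background subspace.
    _, _, vt = np.linalg.svd(bg, full_matrices=False)
    basis = vt[:n_basis].T                        # (bands, n_basis)
    residual = pixel - basis @ (basis.T @ pixel)  # orthogonal-complement projection
    return float(np.linalg.norm(residual))
```

    A pixel well explained by the local background yields a near-zero score; a spectrally anomalous pixel retains energy outside the subspace.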

  10. Multi-Level Anomaly Detection on Time-Varying Graph Data

    SciTech Connect

    Bridges, Robert A; Collins, John P; Ferragut, Erik M; Laska, Jason A; Sullivan, Blair D

    2015-01-01

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  11. Detection of nucleic acids by multiple sequential invasive cleavages

    SciTech Connect

    Hall, Jeff G; Lyamichev, Victor I; Mast, Andrea L; Brow, Mary Ann D

    2012-10-16

    The present invention relates to means for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The structure-specific nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations thereof. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge. The present invention also provides methods for the detection of non-target cleavage products via the formation of a complete and activated protein binding region. The invention further provides sensitive and specific methods for the detection of human cytomegalovirus nucleic acid in a sample.

  12. Detection of nucleic acids by multiple sequential invasive cleavages

    SciTech Connect

    Hall, J.G.; Lyamichev, V.I.; Mast, A.L.; Brow, M.A.D.

    1999-11-30

    The present invention relates to methods for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The structure-specific nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge. The present invention also provides methods for the detection of non-target cleavage products via the formation of a complete and activated protein binding region. The invention further provides sensitive and specific methods for the detection of human cytomegalovirus nucleic acid in a sample.

  13. Detection of nucleic acids by multiple sequential invasive cleavages

    DOEpatents

    Hall, Jeff G.; Lyamichev, Victor I.; Mast, Andrea L.; Brow, Mary Ann D.

    1999-01-01

    The present invention relates to means for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The structure-specific nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations thereof. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge. The present invention also provides methods for the detection of non-target cleavage products via the formation of a complete and activated protein binding region. The invention further provides sensitive and specific methods for the detection of human cytomegalovirus nucleic acid in a sample.

  14. Detection of nucleic acids by multiple sequential invasive cleavages 02

    DOEpatents

    Hall, Jeff G.; Lyamichev, Victor I.; Mast, Andrea L.; Brow, Mary Ann D.

    2002-01-01

    The present invention relates to means for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The structure-specific nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations thereof. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge. The present invention also provides methods for the detection of non-target cleavage products via the formation of a complete and activated protein binding region. The invention further provides sensitive and specific methods for the detection of human cytomegalovirus nucleic acid in a sample.

  15. Analyzing Global Climate System Using Graph Based Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Das, K.; Agrawal, S.; Atluri, G.; Liess, S.; Steinbach, M.; Kumar, V.

    2014-12-01

    Climate networks have been studied to understand complex relationships between different spatial locations, such as community structures and teleconnections. Analysis of time-evolving climate networks reveals changes in those relationships over time and can provide insights for discovering new and complex climate phenomena. We have recently developed a novel data mining technique to discover anomalous relationships in dynamic climate networks. The algorithm efficiently identifies anomalous changes in relationships that cause significant structural changes in the climate network from one time instance to the next. Using this technique we investigated the presence of anomalies in precipitation networks constructed from monthly precipitation averages recorded at 0.5-degree resolution over the period 1982 to 2002. The precipitation network consisted of a 10-nearest-neighbor graph for each month's data. Preliminary results on this data set indicate that we were able to discover several anomalies that have been verified as related to, or the outcome of, well-known climate phenomena. For instance, one such set of anomalies corresponds to the transition from January 1994 (normal conditions) to January 1995 (El Niño conditions) and includes events such as the worst droughts of the 20th century in the Australian plains, very high rainfall in southeast Asian islands, and drought-like conditions in Peru, Chile, and eastern equatorial Africa during that period. We plan to apply our technique to networks constructed from other climate variables, such as sea-level pressure, surface air temperature, wind velocity, and 500 hPa geopotential height, at different resolutions. Using this method we hope to develop deeper insights into the interactions of multiple climate variables globally over time, which might lead to the discovery of previously unknown climate phenomena involving heterogeneous data sources.
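
    The basic pipeline (a per-month k-nearest-neighbor graph over location time series, then a change measure between consecutive graphs) can be sketched generically. The similarity measure, k, and the edge-change score below are assumptions of this sketch, not the authors' algorithm:

```python
import numpy as np

def knn_graph(series, k=10):
    """k-nearest-neighbour graph from Pearson correlation of location time series.

    series: (n_locations, n_timesteps). Returns a boolean adjacency matrix
    with a directed edge from each location to its k most correlated peers.
    """
    z = (series - series.mean(axis=1, keepdims=True)) / series.std(axis=1, keepdims=True)
    corr = (z @ z.T) / series.shape[1]
    np.fill_diagonal(corr, -np.inf)              # exclude self-edges
    nbrs = np.argsort(-corr, axis=1)[:, :k]
    adj = np.zeros(corr.shape, dtype=bool)
    rows = np.repeat(np.arange(corr.shape[0]), k)
    adj[rows, nbrs.ravel()] = True
    return adj

def edge_change_score(adj_prev, adj_next):
    """Edges added or removed between consecutive networks, per existing edge."""
    return float(np.logical_xor(adj_prev, adj_next).sum()) / adj_prev.sum()
```

    Months whose network differs sharply from the previous one (e.g., the 1994-to-1995 El Niño transition) would stand out under such a change score.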

  16. Off-line experiments on radionuclide detection based on the sequential Bayesian approach

    NASA Astrophysics Data System (ADS)

    Qingpei, Xiang; Dongfeng, Tian; Fanhua, Hao; Ge, Ding; Jun, Zeng; Fei, Luo

    2013-11-01

    The sequential Bayesian approach proposed by Candy et al. for radioactive materials detection has aroused increasing interest in radiation detection research and is potentially a useful tool for preventing the transportation of radioactive materials by terrorists. In our previous work, the performance of the sequential Bayesian approach was studied numerically on a simulation experiment platform. In this paper, a sequential Bayesian processor incorporating a LaBr3(Ce) detector, and using the energy, decay rate, and emission probability of the radionuclide, is fully developed. Off-line experiments on the performance of the sequential Bayesian approach in radionuclide detection were conducted by placing 60Co, 137Cs, 133Ba and 152Eu sources at various distances from the front face of the LaBr3(Ce) detector. The off-line experimental results agree well with those of the previous numerical experiments. The maximum detection distance is introduced to evaluate the processor's ability to detect radionuclides with a specific level of activity.

  17. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines

    PubMed Central

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-01-01

    In a complex system, condition monitoring (CM) collects the system's working status. The condition is mainly sensed by sensors pre-deployed in or on the system. Most existing work studies how to utilize the condition information to predict upcoming anomalies, faults, or failures. Some research also focuses on faults or anomalies of the sensing elements (i.e., the sensors) to enhance system reliability. However, existing approaches ignore the correlation between the sensor selection strategy and data anomaly detection, which can also improve system reliability. To address this issue, we study a new scheme that combines a sensor selection strategy with data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors most appropriate for system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. Anomaly detection is carried out using the correlation of sensor data. The sensor data sets used in the evaluation are provided by the National Aeronautics and Space Administration (NASA) Ames Research Center and were used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing two different sensor selection strategies, the effect of the selection method on data anomaly detection is demonstrated. PMID:27136561
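
    The mutual-information weighting step can be sketched with a simple histogram estimator. This is a generic illustration (bin count and function names are assumptions, and the paper's GPR stage is omitted):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of mutual information (in nats) between two sensor streams."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def correlation_weights(readings):
    """Pairwise mutual-information matrix weighting sensor correlations.

    readings: (n_samples, n_sensors) array of sensor streams.
    """
    n = readings.shape[1]
    w = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            w[i, j] = w[j, i] = mutual_information(readings[:, i], readings[:, j])
    return w
```

    Strongly coupled sensor pairs get large weights, so a sensor whose readings break from its highly weighted partners becomes a candidate anomaly.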

  18. Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering

    NASA Astrophysics Data System (ADS)

    Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech

    2015-03-01

    We present a novel method for automated anomaly detection on autofluorescence data provided by the National Institutes of Health (NIH). This work is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, tracking its progression over time, and testing the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures such as wavelet analysis, or dimensionality reduction algorithms followed by a classification algorithm, e.g., a Support Vector Machine. The method we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality-preserving properties. By applying LE, we represent the data as eigenimages, some of which accentuate the visibility of anomalies. We select significant eigenimages and apply the VMF algorithm, which classifies anomalies across all of these eigenimages simultaneously. To evaluate performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection in single images, and a combination of PCA and VMF. LE combined with VMF performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of a nonlinear representation of the data and the effectiveness of VMF, which operates on the eigenimages as a data cube rather than as individual images.
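
    The "data cube rather than individual images" idea can be sketched as a matched filter that correlates a template jointly across a stack of eigenimages. This is a toy stand-in (zero-mean template, plain sliding-window correlation), not the authors' VMF implementation:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def vmf_response(cube, template):
    """Joint matched-filter response of a template over an eigenimage stack.

    cube: (n_imgs, H, W) stack of eigenimages treated as one data cube.
    template: (n_imgs, h, w) anomaly signature spanning all eigenimages.
    Returns a (H-h+1, W-w+1) response map; peaks mark candidate anomalies.
    """
    n, h, w = template.shape
    # All (n, h, w) windows of the cube, indexed by spatial offset.
    windows = sliding_window_view(cube, (n, h, w))[0]
    t = template - template.mean()  # simple zero-mean normalization
    return np.tensordot(windows, t, axes=([2, 3, 4], [0, 1, 2]))
```

    Because every spatial window is correlated against the template across all eigenimages at once, evidence that is faint in any single eigenimage can still accumulate into a clear peak.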

  19. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines.

    PubMed

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-04-29

    In a complex system, condition monitoring (CM) collects information about the system's working status. The condition is mainly sensed by sensors pre-deployed in or on the system. Most existing work studies how to use this condition information to predict upcoming anomalies, faults, or failures. Some research also focuses on faults or anomalies of the sensing elements (i.e., the sensors) to enhance system reliability. However, existing approaches ignore the correlation between the sensor selection strategy and data anomaly detection, which can also improve system reliability. To address this issue, we study a new scheme that combines a sensor selection strategy with data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors most appropriate for system CM are first selected. Then, mutual information is used to weight the correlation among different sensors, and anomaly detection is carried out using the correlation of the sensor data. The sensor data sets used for the evaluation were provided by the National Aeronautics and Space Administration (NASA) Ames Research Center and served as the Prognostics and Health Management (PHM) challenge data in 2008. By comparing two different sensor selection strategies, the effectiveness of the selection method for data anomaly detection is demonstrated.
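    The mutual-information weighting between sensor streams can be sketched with a simple histogram estimator. The estimator, the 16-bin choice, and the synthetic "sensor" signals are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of the mutual information I(X;Y), in nats,
    between two sensor streams sampled at the same instants."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                                 # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px * py)[nz])).sum())

# A sensor tracking the same underlying condition shares far more information
# with a reference sensor than an unrelated sensor does, so its correlation
# receives a larger weight in the detection stage.
rng = np.random.default_rng(0)
ref = rng.normal(size=5000)
related = ref + 0.1 * rng.normal(size=5000)
unrelated = rng.normal(size=5000)
mi_related = mutual_information(ref, related)
mi_unrelated = mutual_information(ref, unrelated)
```

    A break in a normally high-MI pairing (e.g., a GPR prediction from one sensor no longer matching the other) is then the kind of signal the data anomaly detection stage looks for.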

  20. Addressing the Challenges of Anomaly Detection for Cyber Physical Energy Grid Systems

    SciTech Connect

    Ferragut, Erik M; Laska, Jason A; Melin, Alexander M; Czejdo, Bogdan

    2013-01-01

    The consolidation of cyber communications networks and physical control systems within the energy smart grid introduces a number of new risks. Unfortunately, these risks are largely unknown and poorly understood, yet include very high impact losses from attack and component failures. One important aspect of risk management is the detection of anomalies and changes. However, anomaly detection within cyber security remains a difficult, open problem, with special challenges in dealing with false alert rates and heterogeneous data. Furthermore, the integration of cyber and physical dynamics is often intractable. And, because of their broad scope, energy grid cyber-physical systems must be analyzed at multiple scales, from individual components, up to network level dynamics. We describe an improved approach to anomaly detection that combines three important aspects. First, system dynamics are modeled using a reduced order model for greater computational tractability. Second, a probabilistic and principled approach to anomaly detection is adopted that allows for regulation of false alerts and comparison of anomalies across heterogeneous data sources. Third, a hierarchy of aggregations are constructed to support interactive and automated analyses of anomalies at multiple scales.

  1. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines.

    PubMed

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-01-01

    In a complex system, condition monitoring (CM) can collect the system working status. The condition is mainly sensed by the pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict the upcoming anomalies, faults, or failures. There is also some research which focuses on the faults or anomalies of the sensing element (i.e., sensor) to enhance the system reliability. However, existing approaches ignore the correlation between sensor selecting strategy and data anomaly detection, which can also improve the system reliability. To address this issue, we study a new scheme which includes sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets that are utilized to carry out the evaluation are provided by National Aeronautics and Space Administration (NASA) Ames Research Center and have been used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing the two different sensor selection strategies, the effectiveness of selection method on data anomaly detection is proved. PMID:27136561

  2. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. Previous results on extending traditional anomaly detection techniques are summarized. The focus of this paper is a new technique for attention focusing.

  3. Extending TOPS: A Prototype MODIS Anomaly Detection Architecture

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Srivastava, A. N.

    2008-12-01

    The management and processing of Earth science data have been gaining importance over the last decade due to higher data volumes generated by a larger number of instruments, and due to the increasing complexity of the Earth science models that use these data. The volume of data itself is often a limiting factor in obtaining the information needed by scientists; without more sophisticated data volume reduction technologies, possible key information may not be discovered. We are especially interested in the automatic identification of disturbances within ecosystems (e.g., wildfires, droughts, floods, insect/pest damage, wind damage, logging), so that analysis efforts can focus on the identified areas. There are dozens of variables that define the health of an ecosystem, and both long-term and short-term changes in these variables can serve as early indicators of natural disasters and of shifts in climate and ecosystem health. These changes can have profound socio-economic impacts, and we need to develop capabilities for identifying, analyzing, and responding to them in a timely manner. Because the ecosystem consists of a large number of variables, a disturbance may only become apparent when we examine relationships among multiple variables, even though none of them is alarming by itself. We have to be able to extract information from multiple sensors and observations and discover these underlying relationships. As data volumes increase, there is also the potential for a large number of anomalies to "flood" the system, so we need to provide the ability to automatically select the most likely and most important ones, and to analyze each anomaly with minimal involvement of scientists. We describe a prototype architecture for anomaly-driven data reduction for both near-real-time and archived surface reflectance data from the MODIS instrument collected over Central California, and test it using Orca and One-Class Support Vector Machines.

  4. Advancements of Data Anomaly Detection Research in Wireless Sensor Networks: A Survey and Open Issues

    PubMed Central

    Rassam, Murad A.; Zainal, Anazida; Maarof, Mohd Aizaini

    2013-01-01

    Wireless Sensor Networks (WSNs) are important and necessary platforms for the future, as the concept of the "Internet of Things" has emerged lately. They are used for monitoring, tracking, or controlling many applications in industry, health care, habitat monitoring, and the military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is used for decision making. In this review, we present the challenges of anomaly detection in WSNs and state the requirements for designing efficient and effective anomaly detection models. We then review the latest advances in data anomaly detection research in WSNs and classify current detection approaches into five main classes based on the detection methods used to design them. A variety of state-of-the-art models for each class are covered and their limitations highlighted to provide ideas for potential future work. Furthermore, the reviewed approaches are compared and evaluated on how well they meet the stated requirements. Finally, the general limitations of current approaches are discussed and further research opportunities are suggested. PMID:23966182

  5. Sequential detection of temporal communities by estrangement confinement.

    PubMed

    Kawadia, Vikas; Sreenivasan, Sameet

    2012-01-01

    Temporal communities are the result of a consistent partitioning of nodes across multiple snapshots of an evolving network, and they provide insights into how dense clusters in a network emerge, combine, split and decay over time. To reliably detect temporal communities we need to not only find a good community partition in a given snapshot but also ensure that it bears some similarity to the partition(s) found in the previous snapshot(s), a particularly difficult task given the extreme sensitivity of community structure yielded by current methods to changes in the network structure. Here, motivated by the inertia of inter-node relationships, we present a new measure of partition distance called estrangement, and show that constraining estrangement enables one to find meaningful temporal communities at various degrees of temporal smoothness in diverse real-world datasets. Estrangement confinement thus provides a principled approach to uncovering temporal communities in evolving networks. PMID:23145317

  6. Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge

    SciTech Connect

    Ondrej Linda; Todd Vollmer; Milos Manic

    2012-08-01

    The planned large-scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks, and the implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures, multiple communication streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require the specification of a sensitivity threshold, which inevitably leads to a trade-off between the false positive and false negative rates. To alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams, allowing accurate models of normal network behavior to be learned. Furthermore, the system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.

  7. Physics-based, Bayesian sequential detection method and system for radioactive contraband

    DOEpatents

    Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E

    2014-03-18

    A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy), low-count radionuclide measurements, i.e. an event mode sequence (EMS), using a statistical approach based on Bayesian inference and physics-model-based signal processing that represents a radionuclide as a decomposition of monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence-interval discriminator on the energy amplitude and interarrival time, and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test then determines which of two threshold conditions is met, signifying that the EMS either is identified as the target radionuclide or is not; if neither, the process is repeated for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
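    The sequential likelihood ratio test at the core of this approach can be illustrated with Wald's classic SPRT. The Gaussian hypotheses below stand in for the patent's radionuclide probability models, which are not given here; the thresholds come from Wald's standard error-rate approximations.

```python
import math
import random

def sprt(samples, logpdf0, logpdf1, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test: accumulate the
    log-likelihood ratio sample by sample until it crosses a threshold."""
    upper = math.log((1 - beta) / alpha)    # cross: decide H1 (target present)
    lower = math.log(beta / (1 - alpha))    # cross: decide H0 (target absent)
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += logpdf1(x) - logpdf0(x)
        if llr >= upper:
            return 'H1', n
        if llr <= lower:
            return 'H0', n
    return 'undecided', len(samples)

def logpdf_normal(x, mu):
    # Unit-variance Gaussian log-density, standing in for the event models.
    return -0.5 * (x - mu) ** 2 - 0.5 * math.log(2.0 * math.pi)

# Feed the test data drawn from H1: N(2,1) versus H0: N(0,1).
random.seed(7)
data = [random.gauss(2.0, 1.0) for _ in range(200)]
decision, n_used = sprt(data, lambda x: logpdf_normal(x, 0.0),
                        lambda x: logpdf_normal(x, 2.0))
```

    The appeal of the sequential form is visible in `n_used`: the test typically stops after a handful of photon events rather than a fixed batch.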

  8. Fuzzy neural networks for classification and detection of anomalies.

    PubMed

    Meneganti, M; Saviello, F S; Tagliaferri, R

    1998-01-01

    In this paper, a new learning algorithm for Simpson's fuzzy min-max neural network is presented. It overcomes some undesired properties of Simpson's model: specifically, there are neither thresholds that bound the dimension of the hyperboxes nor sensitivity parameters. The new algorithm improves the network performance: the classification result does not depend on the presentation order of the patterns in the training set, and at each step the classification error on the training set cannot increase. The new neural model is particularly useful in classification problems, as shown by comparison with several fuzzy neural nets from the literature (Simpson's min-max model, the fuzzy ARTMAP proposed by Carpenter, Grossberg et al. in 1992, and the adaptive fuzzy systems introduced by Wang in his book) and with the classical multilayer perceptron trained by backpropagation. The tests were executed on three different classification problems: the first with two-dimensional synthetic data, the second with realistic data generated by a simulator for finding anomalies in the cooling system of a blast furnace, and the third with real data for industrial diagnosis. The experiments followed evaluation criteria recently established in the literature and were run in the Microsoft Visual C++ development environment on personal computers.

  9. Low-rank and sparse matrix decomposition-based anomaly detection for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Sun, Weiwei; Liu, Chun; Li, Jialin; Lai, Yenming Mark; Li, Weiyue

    2014-01-01

    A low-rank and sparse matrix decomposition (LRaSMD) detector is proposed to detect anomalies in hyperspectral imagery (HSI). The detector assumes that background images are low-rank, while anomalies are gross errors sparsely distributed throughout the image scene. By solving a constrained convex optimization problem, the LRaSMD detector separates the anomalies from the background, which protects the background model from corruption. An anomaly value for each pixel is calculated using the Euclidean distance, and anomalies are determined by thresholding this value. Four groups of experiments on three widely used HSI datasets are designed to thoroughly analyze the performance of the new detector. Experimental results show that the LRaSMD detector outperforms the global Reed-Xiaoli (GRX), orthogonal subspace projection-GRX, and cluster-based detectors. Moreover, the results show that LRaSMD achieves equal or better detection performance than the local support vector data description detector within a shorter computation time.
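    The low-rank-plus-sparse idea can be sketched with a GoDec-style alternating scheme. This is a simplification, not the paper's constrained convex program: the rank and cardinality are fixed by hand, and the toy matrix stands in for a flattened hyperspectral scene.

```python
import numpy as np

def low_rank_sparse(M, rank=1, card=5, iters=30):
    """GoDec-style alternation: M ~ L (rank-r background) + S (sparse part).
    An illustrative stand-in for the convex LRaSMD decomposition."""
    S = np.zeros_like(M)
    for _ in range(iters):
        # Low-rank update: best rank-r approximation of M - S via the SVD.
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Sparse update: keep only the `card` largest-magnitude residuals.
        R = M - L
        S = np.zeros_like(M)
        top = np.unravel_index(np.argsort(np.abs(R), axis=None)[-card:], M.shape)
        S[top] = R[top]
    return L, S

# Rank-1 "background" plus a few large spikes standing in for anomalous pixels.
rng = np.random.default_rng(3)
M = np.outer(rng.normal(size=40), rng.normal(size=30))
spikes = [(5, 7), (12, 3), (20, 25), (33, 10), (8, 18)]
for i, j in spikes:
    M[i, j] += 10.0
L, S = low_rank_sparse(M, rank=1, card=5)
```

    The magnitude of each entry of `S` then plays the role of the per-pixel anomaly value that the detector thresholds.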

  10. Stillbirth Risk Among Fetuses With Ultrasound-Detected Isolated Congenital Anomalies

    PubMed Central

    Frey, Heather A.; Odibo, Anthony O.; Dicke, Jeffrey M.; Shanks, Anthony L.; Macones, George A.; Cahill, Alison G.

    2014-01-01

    Objective To estimate the risk of stillbirth among pregnancies complicated by a major isolated congenital anomaly detected by antenatal ultrasound, and the influence of incidental growth restriction. Methods A retrospective cohort study of all consecutive singleton pregnancies undergoing routine anatomic survey between 1990 and 2009 was performed. Stillbirth rates among fetuses with an ultrasound-detected isolated major congenital anomaly were compared to fetuses without major anomalies. Stillbirth rates were calculated per 1,000 ongoing pregnancies. Exclusion criteria included delivery prior to 24 weeks of gestation, multiple fetal anomalies, minor anomalies and chromosomal abnormalities. Analyses were stratified by gestational age at delivery (prior to 32 weeks vs. 32 weeks of gestation or after) and birth weight less than the 10th percentile. We adjusted for confounders using logistic regression. Results Among 65,308 singleton pregnancies delivered at 24 weeks of gestation or after, 873 pregnancies with an isolated major congenital anomaly (1.3%) were identified. The overall stillbirth rate among fetuses with a major anomaly was 55/1,000 compared to 4/1,000 in nonanomalous fetuses (aOR 15.17, 95% CI 11.03–20.86). Stillbirth risk in anomalous fetuses was similar prior to 32 weeks of gestation (26/1,000) and 32 weeks of gestation or after (31/1,000). Among growth-restricted fetuses, the stillbirth rate increased among anomalous (127/1,000) and nonanomalous fetuses (18/1,000), and congenital anomalies remained associated with higher rates of stillbirth (aOR 8.20, 95% CI 5.27–12.74). Conclusion The stillbirth rate is increased in anomalous fetuses regardless of incidental growth restriction. These risks can assist practitioners designing care plans for anomalous fetuses who have elevated and competing risks of stillbirth and neonatal death. PMID:24901272

  11. Apparatus and method for detecting a magnetic anomaly contiguous to remote location by SQUID gradiometer and magnetometer systems

    DOEpatents

    Overton, W.C. Jr.; Steyert, W.A. Jr.

    1981-05-22

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and unambiguously locate magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  12. Robust and accurate anomaly detection in ECG artifacts using time series motif discovery.

    PubMed

    Sivaraks, Haemwaan; Ratanamahatana, Chotirat Ann

    2015-01-01

    Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats, which helps identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because they are unable to differentiate ECG artifacts from real ECG signals, especially artifacts that resemble ECG signals in shape and/or frequency. The problem leads to high vigilance demands on physicians and a risk of misinterpretation by nonspecialists. Therefore, this work proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts and can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design, and every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experimental results on real ECG datasets were interpreted and evaluated by cardiologists. The proposed algorithm can mostly achieve 100% accuracy of detection (AoD), sensitivity, specificity, and positive predictive value, with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts compared with competitive anomaly detection methods.

  13. A novel anomaly detection approach based on clustering and decision-level fusion

    NASA Astrophysics Data System (ADS)

    Zhong, Shengwei; Zhang, Ye

    2015-09-01

    In hyperspectral image processing, anomaly detection is a valuable way of searching for targets whose spectral characteristics are not known, and the estimation of background signals is the key procedure. On account of the high dimensionality and complexity of hyperspectral images, dimensionality reduction and background suppression are necessary. In addition, the complementarity of different anomaly detection algorithms can be exploited to improve detection effectiveness. In this paper, we propose a novel anomaly detection method based on optimized k-means clustering and decision-level fusion. In our proposed method, pixels with similar features are first clustered using an optimized k-means method. Secondly, dimensionality reduction is conducted using principal component analysis to reduce the amount of computation. Then, to increase detection accuracy and decrease the false-alarm rate, both the Reed-Xiaoli (RX) and kernel RX algorithms are applied to the processed image. Lastly, decision-level fusion is performed on the detection results. A simulated hyperspectral image and a real one are both used to evaluate the performance of our proposed method. Visual analysis and quantitative analysis of receiver operating characteristic (ROC) curves show that our algorithm achieves better performance than other classic and state-of-the-art approaches.
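    The global RX stage used above is a standard Mahalanobis-distance detector and can be sketched directly; the synthetic 8-band scene and the small ridge term added for numerical stability are illustrative choices.

```python
import numpy as np

def rx_scores(pixels):
    """Global Reed-Xiaoli (RX) detector: squared Mahalanobis distance of
    each pixel spectrum to the scene mean (rows = pixels, cols = bands)."""
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-8 * np.eye(pixels.shape[1]))
    centered = pixels - mu
    # Quadratic form d_i^T C^{-1} d_i for every pixel at once.
    return np.einsum('ij,jk,ik->i', centered, cov_inv, centered)

# 500 background spectra plus one implanted anomalous spectrum.
rng = np.random.default_rng(2)
scene = rng.normal(0.0, 1.0, size=(500, 8))
scene[123] += 6.0                    # anomaly shifted in every band
scores = rx_scores(scene)
```

    Kernel RX replaces the quadratic form with one computed in a kernel-induced feature space; the decision-level fusion step then combines thresholded score maps from both detectors.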

  14. Sequential Model Selection based Segmentation to Detect DNA Copy Number Variation

    PubMed Central

    Hu, Jianhua; Zhang, Liwen; Wang, Huixia Judy

    2016-01-01

    Array-based CGH experiments are designed to detect genomic aberrations or regions of DNA copy-number variation that are associated with an outcome, typically a disease state. Most existing statistical methods focus on detecting DNA copy-number variations in a single sample or array. We focus on the detection of group-effect variation through the simultaneous study of multiple samples from multiple groups. Rather than using direct segmentation or smoothing techniques, as commonly seen in existing detection methods, we develop a sequential model selection procedure guided by a modified Bayesian information criterion. This approach improves detection accuracy by cumulatively utilizing information across contiguous clones, and it has a computational advantage over popular existing detection methods. Our empirical investigation suggests that the proposed method is superior to existing detection methods, in particular in detecting small segments or in separating neighboring segments with differing degrees of copy-number variation. PMID:26954760
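    The flavor of BIC-guided sequential segmentation can be sketched with a greedy mean-shift changepoint search on a single sequence. The paper's modified BIC and multi-group model are more elaborate; the Gaussian constant-mean cost, the per-changepoint penalty of 3·log n, and the toy signal here are all illustrative assumptions.

```python
import numpy as np

def _cost(seg):
    # Gaussian log-likelihood cost (up to constants) of one constant-mean segment.
    return len(seg) * np.log(((seg - seg.mean()) ** 2).mean() + 1e-12)

def bic_segmentation(y, min_size=5):
    """Greedy sequential model selection: in each round, add the single
    changepoint that improves a BIC-style score the most; stop when no
    candidate beats the penalty for an extra changepoint."""
    n = len(y)
    penalty = 3.0 * np.log(n)        # conservative, illustrative BIC penalty
    bounds = [0, n]
    while True:
        best_gain, best_t = 0.0, None
        for a, b in zip(bounds, bounds[1:]):
            whole = _cost(y[a:b])
            for t in range(a + min_size, b - min_size + 1):
                gain = whole - _cost(y[a:t]) - _cost(y[t:b]) - penalty
                if gain > best_gain:
                    best_gain, best_t = gain, t
        if best_t is None:
            return bounds[1:-1]      # interior changepoints found so far
        bounds = sorted(bounds + [best_t])

# Piecewise-constant signal with two true changepoints at 50 and 100.
rng = np.random.default_rng(4)
y = np.concatenate([rng.normal(0, 0.3, 50),
                    rng.normal(5, 0.3, 50),
                    rng.normal(0, 0.3, 50)])
breaks = bic_segmentation(y)
```

    In the aCGH setting, `y` would be log-ratio intensities along contiguous clones and the segments would correspond to copy-number states.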

  15. Temporal Characteristics of Radiologists' and Novices' Lesion Detection in Viewing Medical Images Presented Rapidly and Sequentially

    PubMed Central

    Nakashima, Ryoichi; Komori, Yuya; Maeda, Eriko; Yoshikawa, Takeharu; Yokosawa, Kazuhiko

    2016-01-01

    Although viewing multiple stacks of medical images presented on a display is a relatively new and useful medical task, little is known about it. In particular, it is unclear how radiologists search for lesions in this type of image reading. When viewing cluttered and dynamic displays, continuous motion by itself does not capture attention; instead, target detection is aided when observers' attention is captured by the onset signal of a suddenly appearing target among continuously moving distractors (i.e., a passive viewing strategy). This applies to stack viewing tasks, because lesions often show up as transient signals in medical images that are presented sequentially, simulating a dynamic, smoothly transforming progression of organ images. However, it is unclear whether observers can detect a target when it appears at the beginning of a sequential presentation, where the global apparent-motion onset signal (i.e., the signal marking the initiation of apparent motion by sequential presentation) occurs. We investigated the ability of radiologists to detect lesions during such tasks by comparing the performance of radiologists and novices. Results show that the overall performance of radiologists is better than that of novices. Furthermore, the temporal location of a lesion in a CT image sequence, i.e., when the lesion appears in the sequence, does not affect the performance of radiologists, whereas it does affect the performance of novices: novices have greater difficulty detecting a lesion that appears early rather than late in the image sequence. We suggest that radiologists possess mechanisms, which novices lack, for detecting lesions in medical images with little attention. This ability is critically important when viewing rapid sequential presentations of multiple CT images, such as stack viewing tasks. PMID:27774080

  16. Improvements in the method of radiation anomaly detection by spectral comparison ratios.

    PubMed

    Pfund, D M; Anderson, K K; Detwiler, R S; Jarman, K D; McDonald, B S; Milbrath, B D; Myjak, M J; Paradis, N C; Robinson, S M; Woodring, M L

    2016-04-01

    We present a new procedure for configuring the Nuisance-rejection Spectral Comparison Ratio Anomaly Detection (N-SCRAD) method. The procedure uses simulated annealing to minimize the detectable count rates of source spectra at a specified false positive rate. We also present a new method for correcting the estimates of background variability used in N-SCRAD to the current total count rate conditions. The correction lowers detection thresholds at a specified false positive rate, enabling greater sensitivity to targets. PMID:26807839
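    The spectral-comparison-ratio idea, flagging a spectrum whose shape deviates from background even when total counts are similar, can be sketched with a toy statistic. This is explicitly not the N-SCRAD configuration: the window shares, the Poisson error approximation, and the four-window spectra below are all invented for illustration.

```python
import numpy as np

def scr_anomaly_stat(counts, bkg_counts):
    """Toy shape-comparison statistic: compare each energy window's share of
    the total counts against the background's share, normalized by an
    approximate Poisson standard error, and sum the squared deviations."""
    counts = np.asarray(counts, dtype=float)
    bkg = np.asarray(bkg_counts, dtype=float)
    share = counts / counts.sum()
    bkg_share = bkg / bkg.sum()
    # Rough Poisson error on each observed window share.
    err = np.sqrt(np.maximum(counts, 1.0)) / counts.sum()
    z = (share - bkg_share) / err
    return float((z ** 2).sum())

# Background shape vs. the same shape with an injected photopeak in window 3.
bkg = np.array([400, 300, 200, 100])
flat = np.array([410, 295, 205, 98])        # background-consistent spectrum
peak = np.array([400, 300, 500, 100])       # excess counts in one window
stat_flat = scr_anomaly_stat(flat, bkg)
stat_peak = scr_anomaly_stat(peak, bkg)
```

    Thresholding such a statistic at a chosen false-positive rate is the kind of operating point the paper's simulated-annealing configuration procedure optimizes.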

  17. Low frequency of Y anomaly detected in Australian Brahman cow-herds.

    PubMed

    de Camargo, Gregório M F; Porto-Neto, Laercio R; Fortes, Marina R S; Bunch, Rowan J; Tonhati, Humberto; Reverter, Antonio; Moore, Stephen S; Lehnert, Sigrid A

    2015-02-01

    Indicine cattle have lower reproductive performance in comparison to taurine cattle. A chromosomal anomaly characterized by the presence of Y markers in females has been reported and associated with infertility in cattle. The aim of this study was to investigate the occurrence of this anomaly in Brahman cows. Brahman cows (n = 929) were genotyped for a Y-chromosome-specific region using real-time PCR. Only six of the 929 cows carried the anomaly (0.6%). The anomaly frequency was much lower in Brahman cows than in the crossbred population in which it was first detected, and the anomaly does not appear to affect pregnancy in this population. Due to the low frequency, association analyses could not be performed. Further, the SNP signal of the pseudoautosomal boundary region of the Y chromosome was investigated using an HD SNP chip. Pooled DNA of "non-pregnant" and "pregnant" cows was compared, and no difference in SNP allele frequency was observed. The results suggest that the anomaly has a very low frequency in this Australian Brahman population and no effect on reproduction. Further studies comparing pregnant cows and cows that failed to conceive should be carried out once a better assembly and annotation of the cattle Y chromosome is available. PMID:25750859

  18. Low frequency of Y anomaly detected in Australian Brahman cow-herds

    PubMed Central

    de Camargo, Gregório M.F.; Porto-Neto, Laercio R.; Fortes, Marina R.S.; Bunch, Rowan J.; Tonhati, Humberto; Reverter, Antonio; Moore, Stephen S.; Lehnert, Sigrid A.

    2015-01-01

    Indicine cattle have lower reproductive performance in comparison to taurine cattle. A chromosomal anomaly characterized by the presence of Y markers in females has been reported and associated with infertility in cattle. The aim of this study was to investigate the occurrence of this anomaly in Brahman cows. Brahman cows (n = 929) were genotyped for a Y-chromosome-specific region using real-time PCR. Only six of the 929 cows carried the anomaly (0.6%). The anomaly frequency was much lower in Brahman cows than in the crossbred population in which it was first detected, and the anomaly does not appear to affect pregnancy in this population. Due to the low frequency, association analyses could not be performed. Further, the SNP signal of the pseudoautosomal boundary region of the Y chromosome was investigated using an HD SNP chip. Pooled DNA of “non-pregnant” and “pregnant” cows was compared, and no difference in SNP allele frequency was observed. The results suggest that the anomaly has a very low frequency in this Australian Brahman population and no effect on reproduction. Further studies comparing pregnant cows and cows that failed to conceive should be carried out once a better assembly and annotation of the cattle Y chromosome is available. PMID:25750859

  19. Adaptive sequential algorithms for detecting targets in a heavy IR clutter

    NASA Astrophysics Data System (ADS)

    Tartakovsky, Alexander G.; Kligys, Skirmantas; Petrov, Anton

    1999-10-01

    Cruise missiles over land and sea clutter backgrounds are serious threats to search and track systems. In general, these threats are stealthy in both the IR and radio frequency bands: their thermal IR signature and their radar cross section can be quite small. This paper discusses adaptive sequential detection methods which exploit 'track-before-detect' technology for detecting low-SNR targets in IR search and track (IRST) systems. Although we focus on an IRST directed against cruise missiles over land and sea clutter backgrounds, the results are applicable to other sensors and other kinds of targets.

  20. Detection of anomalies in radio tomography of asteroids: Source count and forward errors

    NASA Astrophysics Data System (ADS)

    Pursiainen, S.; Kaasalainen, M.

    2014-09-01

    The purpose of this study was to advance numerical methods for radio tomography, in which an asteroid's internal electric permittivity distribution is to be recovered from radio frequency data gathered by an orbiter. The focus was on signal generation via multiple sources (transponders), which provides one potential, or even essential, scenario to be implemented in a challenging in situ measurement environment and within tight payload limits. As a novel feature, the effects of forward errors, including noise and a priori uncertainty of the forward (data) simulation, were examined through a combination of the iterative alternating sequential (IAS) inverse algorithm and finite-difference time-domain (FDTD) simulation of time evolution data. Single and multiple source scenarios were compared in two-dimensional localization of permittivity anomalies. Three different anomaly strengths and four levels of total noise were tested. The results suggest, among other things, that multiple sources can be necessary to obtain appropriate results, for example, to distinguish three separate anomalies with permittivity less than or equal to half of the background value, a case relevant to the recovery of internal cavities.

  1. [A Hyperspectral Imagery Anomaly Detection Algorithm Based on Gauss-Markov Model].

    PubMed

    Gao, Kun; Liu, Ying; Wang, Li-jing; Zhu, Zhen-yu; Cheng, Hao-bo

    2015-10-01

    With the development of spectral imaging technology, hyperspectral anomaly detection is increasingly used in remote sensing imagery processing. The traditional RX anomaly detection algorithm neglects the spatial correlation of images. Moreover, it does not effectively reduce the data dimension, so it costs too much processing time and shows low validity on hyperspectral data. Hyperspectral images follow a Gauss-Markov Random Field (GMRF) in the spatial and spectral dimensions. The inverse of the covariance matrix can be calculated directly from the Gauss-Markov parameters, which avoids the huge calculations over hyperspectral data. This paper proposes an improved RX anomaly detection algorithm based on a three-dimensional GMRF. The hyperspectral imagery data are simulated with the GMRF model, and the GMRF parameters are estimated with the approximated maximum likelihood method. The detection operator is constructed from the GMRF parameter estimates. The pixel under test is taken as the centre of a local optimization window, called the GMRF detection window. The degree of abnormality is calculated from the mean vector and the inverse covariance matrix, both computed within the window. The image is processed pixel by pixel as the GMRF window moves. The traditional RX detection algorithm, the regional hypothesis detection algorithm based on GMRF and the proposed algorithm are evaluated on AVIRIS hyperspectral data. Simulation results show that the proposed anomaly detection method improves detection efficiency and reduces the false alarm rate. Operation time statistics for the three algorithms in the same computing environment show that the proposed algorithm improves the operation time by 45.2%, demonstrating good computing efficiency. PMID:26904830
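
    The global RX score that this work builds on is the squared Mahalanobis distance of each pixel spectrum from the scene mean. Below is a minimal two-band sketch on synthetic data; this is the classical detector the paper improves upon, not its GMRF variant, and all names and values are illustrative:

```python
import random
from statistics import mean

def rx_scores(pixels):
    """Global RX anomaly scores for two-band pixels: squared Mahalanobis
    distance from the scene mean under the sample covariance."""
    n = len(pixels)
    m0 = mean(p[0] for p in pixels)
    m1 = mean(p[1] for p in pixels)
    # Sample covariance entries.
    c00 = sum((p[0] - m0) ** 2 for p in pixels) / (n - 1)
    c11 = sum((p[1] - m1) ** 2 for p in pixels) / (n - 1)
    c01 = sum((p[0] - m0) * (p[1] - m1) for p in pixels) / (n - 1)
    # Explicit inverse of the 2x2 covariance matrix.
    det = c00 * c11 - c01 * c01
    i00, i11, i01 = c11 / det, c00 / det, -c01 / det
    scores = []
    for p in pixels:
        d0, d1 = p[0] - m0, p[1] - m1
        scores.append(d0 * d0 * i00 + 2 * d0 * d1 * i01 + d1 * d1 * i11)
    return scores

random.seed(0)
background = [(random.gauss(10, 1), random.gauss(20, 1)) for _ in range(200)]
pixels = background + [(16.0, 14.0)]      # one planted spectral anomaly
s = rx_scores(pixels)
assert s[-1] == max(s)                    # the planted pixel scores highest
```

    The GMRF formulation in the paper replaces the explicit covariance inversion above with a parametric form, which is where the reported savings in computation come from.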

  3. Using new edges for anomaly detection in computer networks

    DOEpatents

    Neil, Joshua Charles

    2015-05-19

    Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data on the frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.
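
    The path-scoring step can be sketched as follows; the per-node rate model, the default probability for unseen nodes, and all values are hypothetical illustrations rather than the patented baseline models:

```python
import math

# Hypothetical historical rates: fraction of time windows in which each
# node initiated at least one new edge.
rates = {"A": 0.5, "B": 0.4, "C": 0.001}

def edge_logscore(edge, rates, p_unseen=0.01):
    """-log probability that the edge's source node creates a new edge."""
    src, _dst = edge
    return -math.log(rates.get(src, p_unseen))

def path_score(path, rates):
    # Multiplying edge probabilities = summing -log terms; a higher
    # score marks a rarer, hence more suspicious, path.
    return sum(edge_logscore(e, rates) for e in path)

normal = path_score([("A", "B"), ("B", "A")], rates)
odd = path_score([("C", "A"), ("C", "B")], rates)
assert odd > normal    # edges from a historically quiet node stand out
```

    A detection threshold would then be set from the empirical distribution of such scores under historical conditions, as the patent describes.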

  4. Multivariate diagnostics and anomaly detection for nuclear safeguards

    SciTech Connect

    Burr, T.; Jones, J.; Wangen, L.

    1994-08-01

    For process control and other reasons, new and future nuclear reprocessing plants are expected to be increasingly more automated than older plants. As a consequence of this automation, the quantity of data potentially available for safeguards may be much greater in future reprocessing plants than in current plants. The authors first review recent literature that applies multivariate Shewhart and multivariate cumulative sum (Cusum) tests to detect anomalous data. These tests are used to evaluate residuals obtained from a simulated three-tank problem in which five variables (volume, density, and concentrations of uranium, plutonium, and nitric acid) in each tank are modeled and measured. They then present results from several simulations involving transfers between the tanks and between the tanks and the environment. Residuals from a no-fault problem in which the measurements and model predictions are both correct are used to develop Cusum test parameters which are then used to test for faults for several simulated anomalous situations, such as an unknown leak or diversion of material from one of the tanks. The leak can be detected by comparing measurements, which estimate the true state of the tank system, with the model predictions, which estimate the state of the tank system as it "should" be. The no-fault simulation compares false alarm behavior for the various tests, whereas the anomalous problems allow one to compare the power of the various tests to detect faults under possible diversion scenarios. For comparison with the multivariate tests, univariate tests are also applied to the residuals.
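
    The univariate tabular CUSUM applied to each residual stream can be sketched in a few lines; the reference value k and decision interval h below are illustrative choices, not the tuned parameters from the paper's simulations:

```python
def cusum(residuals, k=0.5, h=5.0):
    """Two-sided tabular CUSUM: return the first index at which either
    the upper or lower cumulative sum exceeds the decision interval h."""
    hi = lo = 0.0
    for t, r in enumerate(residuals):
        hi = max(0.0, hi + r - k)   # accumulates positive shifts
        lo = max(0.0, lo - r - k)   # accumulates negative shifts
        if hi > h or lo > h:
            return t
    return None

# A leak shifts the measurement-minus-model residuals by +1.5 at t = 20.
res = [0.0] * 20 + [1.5] * 10
t = cusum(res)
assert t is not None and 20 <= t < 30
```

    The multivariate Shewhart and Cusum tests in the paper generalize this idea to vectors of residuals across the five measured variables.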

  5. Some practical issues in anomaly detection and exploitation of regions of interest in hyperspectral images

    NASA Astrophysics Data System (ADS)

    Goudail, François; Roux, Nicolas; Baarstad, Ivar; Løke, Trond; Kaspersen, Peter; Alouini, Mehdi; Normandin, Xavier

    2006-07-01

    We address a method for detecting anomalies in hyperspectral images, that is, performing detection when the spectral signatures of the targets are unknown. We show that, in real hyperspectral images, use of the full spectral resolution may not be necessary for detection, but that the correlation properties of spectral fluctuations have to be taken into account in the design of the detection algorithm. Anomaly detectors are useful for detecting regions of interest (ROIs), but, as they are prone to false alarms, one must further analyze the ROIs obtained to decide whether they correspond to real targets. We propose a method for exploiting these ROIs that generates a single image in which the contrast of the ROI is optimized.

  6. Some practical issues in anomaly detection and exploitation of regions of interest in hyperspectral images.

    PubMed

    Goudail, François; Roux, Nicolas; Baarstad, Ivar; Løke, Trond; Kaspersen, Peter; Alouini, Mehdi; Normandin, Xavier

    2006-07-20

    We address a method for detecting anomalies in hyperspectral images, that is, performing detection when the spectral signatures of the targets are unknown. We show that, in real hyperspectral images, use of the full spectral resolution may not be necessary for detection, but that the correlation properties of spectral fluctuations have to be taken into account in the design of the detection algorithm. Anomaly detectors are useful for detecting regions of interest (ROIs), but, as they are prone to false alarms, one must further analyze the ROIs obtained to decide whether they correspond to real targets. We propose a method for exploiting these ROIs that generates a single image in which the contrast of the ROI is optimized. PMID:16826261

  7. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    SciTech Connect

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.

  8. Anomaly Detection in the Right Hemisphere: The Influence of Visuospatial Factors

    ERIC Educational Resources Information Center

    Smith, Stephen D.; Dixon, Michael J.; Tays, William J.; Bulman-Fleming, M. Barbara

    2004-01-01

    Previous research with both brain-damaged and neurologically intact populations has demonstrated that the right cerebral hemisphere (RH) is superior to the left cerebral hemisphere (LH) at detecting anomalies (or incongruities) in objects (Ramachandran, 1995; Smith, Tays, Dixon, & Bulman-Fleming, 2002). The current research assesses whether the RH…

  9. Dual Use Corrosion Inhibitor and Penetrant for Anomaly Detection in Neutron/X Radiography

    NASA Technical Reports Server (NTRS)

    Hall, Phillip B. (Inventor); Novak, Howard L. (Inventor)

    2004-01-01

    A dual purpose corrosion inhibitor and penetrant composition sensitive to radiography interrogation is provided. The corrosion inhibitor mitigates or eliminates corrosion on the surface of a substrate upon which the corrosion inhibitor is applied. In addition, the corrosion inhibitor provides for the attenuation of a signal used during radiography interrogation thereby providing for detection of anomalies on the surface of the substrate.

  10. A comparison of algorithms for anomaly detection in safeguards and computer security systems using neural networks

    SciTech Connect

    Howell, J.A.; Whiteson, R.

    1992-08-01

    Detection of anomalies in nuclear safeguards and computer security systems is a tedious and time-consuming task. It typically requires the examination of large amounts of data for unusual patterns of activity. Neural networks provide a flexible pattern-recognition capability that can easily be adapted for these purposes. In this paper, we discuss architectures for accomplishing this task.

  11. A comparison of algorithms for anomaly detection in safeguards and computer security systems using neural networks

    SciTech Connect

    Howell, J.A.; Whiteson, R.

    1992-01-01

    Detection of anomalies in nuclear safeguards and computer security systems is a tedious and time-consuming task. It typically requires the examination of large amounts of data for unusual patterns of activity. Neural networks provide a flexible pattern-recognition capability that can easily be adapted for these purposes. In this paper, we discuss architectures for accomplishing this task.

  12. Radio signal anomalies detected with MEXART in 2012 during the recovery phase of geomagnetic storms

    NASA Astrophysics Data System (ADS)

    Carrillo-Vargas, Armando; Pérez-Enríquez, Román; López-Montes, Rebeca; Rodríguez-Martínez, Mario; Ugalde-Calvillo, Luis Gerardo

    2016-11-01

    In this work we present MEXART observations in 2012 of 17 radio sources in which we detected anomalies in the radio signal of these sources occurring during the recovery phase of some geomagnetic storms. We performed FFT and wavelet analyses of the radio signals during these periods and found that the anomalies seem to originate in the ionosphere rather than from interplanetary scintillation (IPS), especially because of the frequencies at which they are observed. We discuss these results in view of the fact that the source of the geomagnetic storm is no longer in the interplanetary medium.

  13. Radiation detection method and system using the sequential probability ratio test

    DOEpatents

    Nelson, Karl E.; Valentine, John D.; Beauchamp, Brock R.

    2007-07-17

    A method and system using the Sequential Probability Ratio Test (SPRT) to enhance the detection of an elevated level of radiation by determining whether a set of observations is consistent with a specified model within given bounds of statistical significance. In particular, the SPRT is used in the present invention to maximize the range of detection by providing processing mechanisms for estimating the dynamic background radiation, adjusting the models to reflect the amount of background knowledge at the current point in time, analyzing the current sample using the models to determine statistical significance, and determining when the sample has returned to the expected background conditions.
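
    The core decision rule can be sketched as a Wald SPRT on Poisson counts, testing a known background rate against an elevated rate; the rates and error probabilities below are illustrative, and the patented system additionally estimates the dynamic background rather than assuming it known:

```python
import math

def sprt_poisson(counts, b, s, alpha=0.01, beta=0.01):
    """Wald SPRT on Poisson counts: background rate b versus elevated
    rate b + s. Returns 'signal', 'background', or None (keep observing)."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for n in counts:
        # Log-likelihood ratio of a single Poisson observation.
        llr += n * math.log((b + s) / b) - s
        if llr >= upper:
            return "signal"
        if llr <= lower:
            return "background"
    return None

assert sprt_poisson([12, 14, 13, 15, 12], b=5.0, s=8.0) == "signal"
assert sprt_poisson([5, 4, 6, 5, 5, 4], b=5.0, s=8.0) == "background"
```

    Because the test accumulates evidence sequentially, it typically decides after only a few samples when the data are clearly above or below background, which is what extends the detection range.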

  14. HPNAIDM: The High-Performance Network Anomaly/Intrusion Detection and Mitigation System

    SciTech Connect

    Chen, Yan

    2013-12-05

    Identifying traffic anomalies and attacks rapidly and accurately is critical for large network operators. With the rapid growth of network bandwidth, such as the next-generation DOE UltraScience Network, and the fast emergence of new attacks/viruses/worms, existing network intrusion detection systems (IDS) are insufficient because they: • Are mostly host-based and not scalable to high-performance networks; • Are mostly signature-based and unable to adaptively recognize flow-level unknown attacks; • Cannot differentiate malicious events from unintentional anomalies. To address these challenges, we proposed and developed a new paradigm called the high-performance network anomaly/intrusion detection and mitigation (HPNAIDM) system. The new paradigm is significantly different from existing IDSes, with the following features (research thrusts): • Online traffic recording and analysis on high-speed networks; • Online adaptive flow-level anomaly/intrusion detection and mitigation; • An integrated approach for false positive reduction. Our research prototype and evaluation demonstrate that the HPNAIDM system is highly effective and economically feasible. Beyond satisfying the pre-set goals, we even exceeded them significantly (see more details in the next section). Overall, our project harvested 23 publications (2 book chapters, 6 journal papers and 15 peer-reviewed conference/workshop papers). Besides, we built a website for technique dissemination, which hosts two system prototype releases for the research community. We also filed a patent application and developed strong international and domestic collaborations spanning both academia and industry.

  15. Towards spatial localisation of harmful algal blooms; statistics-based spatial anomaly detection

    NASA Astrophysics Data System (ADS)

    Shutler, J. D.; Grant, M. G.; Miller, P. I.

    2005-10-01

    Harmful algal blooms are believed to be increasing in occurrence and their toxins can be concentrated by filter-feeding shellfish and cause amnesia or paralysis when ingested. As a result, fisheries and beaches in the vicinity of blooms may need to be closed and the local population informed. For this avoidance planning, timely information on the existence of a bloom, its species, and an accurate map of its extent would be prudent. Current research on detecting these blooms from space has mainly concentrated on spectral approaches to determining species. We present a novel statistics-based background-subtraction technique that produces improved descriptions of an anomaly's extent from remotely sensed ocean colour data. This is achieved by extracting bulk information from a background model; this is complemented by a computer-vision ramp filtering technique to specifically detect the perimeter of the anomaly. The complete extraction technique uses temporal-variance estimates which control the subtraction of the scene of interest from the time-weighted background estimate, producing confidence maps of anomaly extent. Through the variance estimates the method learns the noise present in the data sequence, providing robustness and allowing generic application. Further, the use of the median for the background model reduces the effects of anomalies that appear within the time sequence used to generate it, allowing seasonal variations in the background levels to be closely followed. To illustrate the detection algorithm's application, it has been applied to two spectrally different oceanic regions.
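
    The per-pixel core of such a scheme, a median background plus temporal-variance scaling, can be sketched as below; the tiny synthetic "frames" are invented, and the ramp-filter perimeter stage is omitted:

```python
from statistics import median, pvariance

def anomaly_confidence(history, scene):
    """Confidence map: subtract each pixel's temporal median (a robust
    background estimate) and scale by its temporal standard deviation."""
    out = []
    for i, value in enumerate(scene):
        series = [frame[i] for frame in history]
        bg = median(series)
        sd = pvariance(series) ** 0.5 or 1e-9   # guard flat pixels
        out.append((value - bg) / sd)
    return out

# Nine past two-pixel "frames" with slow opposite trends in each pixel.
history = [[1.0 + 0.1 * t, 2.0 - 0.1 * t] for t in range(9)]
conf = anomaly_confidence(history, scene=[5.0, 1.6])
assert conf[0] > 3.0            # pixel 0 sits far above its background
assert abs(conf[1]) < 3.0       # pixel 1 stays within normal variation
```

    Using the median rather than the mean keeps transient anomalies inside the history window from contaminating the background estimate, as the abstract notes.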

  16. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
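
    The normalized LCS similarity itself is easy to state with the textbook dynamic program (the paper's hybrid algorithm is a faster drop-in for this step); normalizing by the geometric mean of the two sequence lengths is one common convention:

```python
def lcs_length(a, b):
    """Classic O(len(a) * len(b)) dynamic program for the length of the
    longest common subsequence."""
    prev = [0] * (len(b) + 1)
    for x in a:
        cur = [0]
        for j, y in enumerate(b, 1):
            cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[j - 1]))
        prev = cur
    return prev[-1]

def normalized_lcs(a, b):
    # Similarity in [0, 1]; identical sequences score exactly 1.
    return lcs_length(a, b) / (len(a) * len(b)) ** 0.5

assert lcs_length("ABCBDAB", "BDCABA") == 4
assert normalized_lcs("ABAB", "ABAB") == 1.0
```

    Clustering then groups sequences with high pairwise similarity, and sequences far from every cluster become the outliers handed to the analysis stage.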

  17. Structural Anomaly Detection Using Fiber Optic Sensors and Inverse Finite Element Method

    NASA Technical Reports Server (NTRS)

    Quach, Cuong C.; Vazquez, Sixto L.; Tessler, Alex; Moore, Jason P.; Cooper, Eric G.; Spangler, Jan. L.

    2005-01-01

    NASA Langley Research Center is investigating a variety of techniques for mitigating aircraft accidents due to structural component failure. One technique under consideration combines distributed fiber optic strain sensing with an inverse finite element method for detecting and characterizing structural anomalies anomalies that may provide early indication of airframe structure degradation. The technique identifies structural anomalies that result in observable changes in localized strain but do not impact the overall surface shape. Surface shape information is provided by an Inverse Finite Element Method that computes full-field displacements and internal loads using strain data from in-situ fiberoptic sensors. This paper describes a prototype of such a system and reports results from a series of laboratory tests conducted on a test coupon subjected to increasing levels of damage.

  18. Unsupervised Anomaly Detection Based on Clustering and Multiple One-Class SVM

    NASA Astrophysics Data System (ADS)

    Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Kwon, Yongjin

    Intrusion detection systems (IDS) have played an important role as devices to defend our networks from cyber attacks. However, since they are unable to detect unknown attacks, i.e., 0-day attacks, the ultimate challenge in the intrusion detection field is how to identify such attacks exactly in an automated manner. Over the past few years, several studies on solving these problems have been made on anomaly detection using unsupervised learning techniques such as clustering, one-class support vector machines (SVM), etc. Although they enable one to construct intrusion detection models at low cost and effort, and have the capability to detect unforeseen attacks, they still suffer from two main problems in intrusion detection: a low detection rate and a high false positive rate. In this paper, we propose a new anomaly detection method based on clustering and multiple one-class SVMs in order to improve the detection rate while maintaining a low false positive rate. We evaluated our method using the KDD Cup 1999 data set. Evaluation results show that our approach outperforms the existing algorithms reported in the literature, especially in the detection of unknown attacks.
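
    A rough sketch of the two-stage idea, with a per-cluster distance quantile standing in for the one-class SVM boundary trained on each cluster (the actual method fits an SVM per cluster; the centroids and data below are invented):

```python
def dist(p, c):
    return sum((a - b) ** 2 for a, b in zip(p, c)) ** 0.5

def fit_radii(points, centroids, q=0.95):
    """Per-cluster normality radius: the q-quantile of training-point
    distances to the nearest centroid."""
    buckets = [[] for _ in centroids]
    for p in points:
        i = min(range(len(centroids)), key=lambda k: dist(p, centroids[k]))
        buckets[i].append(dist(p, centroids[i]))
    return [sorted(b)[int(q * (len(b) - 1))] for b in buckets]

def is_anomaly(p, centroids, radii):
    i = min(range(len(centroids)), key=lambda k: dist(p, centroids[k]))
    return dist(p, centroids[i]) > radii[i]

centroids = [(0.0, 0.0), (10.0, 10.0)]         # two "normal" traffic modes
train = [(0.1 * i % 1, 0.07 * i % 1) for i in range(50)] + \
        [(10 + 0.1 * i % 1, 10 + 0.07 * i % 1) for i in range(50)]
radii = fit_radii(train, centroids)
assert not is_anomaly((0.5, 0.5), centroids, radii)   # near a normal mode
assert is_anomaly((5.0, 5.0), centroids, radii)       # between the modes
```

    Training one boundary per cluster lets each mode of normal traffic be modelled tightly, which is the mechanism behind the improved detection rate the paper reports.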

  19. Automated Detection of Volcanic Thermal Anomalies: Detailed Analysis of the 2004 - 2005 Mt. Etna, Italy Eruption

    NASA Astrophysics Data System (ADS)

    Steffke, A. M.; Harris, A.; Garbeil, H.; Wright, R.; Dehn, J.

    2007-05-01

    Use of thermal infrared satellite data to detect, characterize and track volcanic thermal emissions is an appealing method for monitoring volcanoes for a number of reasons. It provides a synoptic perspective, with satellite sensors such as AVHRR and MODIS allowing global coverage at least 4 times/day. At the same time, direct reception of calibrated digital data in a standard and stable format allows automation, enabling near-real-time analysis of many volcanoes over large regions, including volcanoes where other geophysical instruments are not deployed. In addition, extracted thermal data can be used to derive heat and volume flux estimates and time series. The development of an automated algorithm to detect volcanic thermal anomalies using thermal satellite data was first attempted over a decade ago (VAST). Subsequently, several attempts have been made to create an effective way to automatically detect thermal anomalies at volcanoes using such high-temporal-resolution satellite data (e.g. Okmok, MODVOLC and RAT). The underlying motivation has been to allow automated, routine and timely hot spot detection for volcanic monitoring purposes. In this study we review four algorithms that have been implemented to date, specifically: VAST, Okmok, MODVOLC and RAT. To assess how VAST and MODVOLC performed, we tested them on the 2004 - 2005 effusive eruption of Mount Etna (Sicily, Italy). These results were then compared with manually detected and picked thermal anomalies. Each algorithm is designed for different purposes, and thus they perform differently. MODVOLC, for example, must run efficiently, up to 4 times a day, on a full global data set. Thus the number of algorithm steps is minimal and the detection threshold is high, meaning that the incidence of false positives is low, but so too is its sensitivity. In contrast, VAST is designed to run on a single volcano and has the added advantage of some user input. Thus, a greater incidence of false positives occurs, but more
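
    Of these, MODVOLC's published detection rule is a fixed threshold on the Normalized Thermal Index (NTI) formed from MODIS middle- and thermal-infrared radiances; a sketch with illustrative radiance values (the band numbers and threshold follow the published algorithm, the pixel values are invented):

```python
def nti(band22, band32):
    """Normalized Thermal Index from MODIS band 22 (~3.96 um) and
    band 32 (~12 um) spectral radiances."""
    return (band22 - band32) / (band22 + band32)

def modvolc_alert(band22, band32, threshold=-0.80):
    # A fixed global threshold keeps the algorithm cheap and the false
    # positive rate low, at the cost of sensitivity to subtle anomalies.
    return nti(band22, band32) > threshold

assert modvolc_alert(band22=2.5, band32=8.0)       # strongly heated pixel
assert not modvolc_alert(band22=0.4, band32=9.0)   # ordinary warm surface
```

    The single-threshold design illustrates the trade-off discussed above: very few algorithm steps and few false positives, but limited sensitivity.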

  20. Application of Artificial Bee Colony algorithm in TEC seismo-ionospheric anomalies detection

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2015-09-01

    In this study, the efficiency of the Artificial Bee Colony (ABC) algorithm is investigated for detecting TEC (Total Electron Content) seismo-ionospheric anomalies around the time of some strong earthquakes, including Chile (27 February 2010; 01 April 2014), Varzeghan (11 August 2012), Saravan (16 April 2013) and Papua New Guinea (29 March 2015). In comparison with other anomaly detection algorithms, ABC has a number of advantages, which can be enumerated as (1) detection of discord patterns in large nonlinear data sets in a short time, (2) simplicity, (3) fewer control parameters and (4) efficiency in solving multimodal and multidimensional optimization problems. The results of this study also support the TEC time series as a robust earthquake precursor.

  1. A new approach for structural health monitoring by applying anomaly detection on strain sensor data

    NASA Astrophysics Data System (ADS)

    Trichias, Konstantinos; Pijpers, Richard; Meeuwissen, Erik

    2014-03-01

    Structural Health Monitoring (SHM) systems help to monitor critical infrastructures (bridges, tunnels, etc.) remotely and provide up-to-date information about their physical condition. In addition, they help to predict the structure's life and required maintenance in a cost-efficient way. Typically, inspection data give insight into the structural health. The global structural behavior, and predominantly the structural loading, is generally measured with vibration and strain sensors. Acoustic emission sensors are increasingly used for measuring global crack activity near critical locations. In this paper, we present a procedure for local structural health monitoring by applying Anomaly Detection (AD) on strain sensor data from sensors applied along the expected crack path. Sensor data are analyzed by automatic anomaly detection in order to find crack activity at an early stage. This approach targets the monitoring of critical structural locations, such as welds, near which strain sensors can be applied during construction, and/or locations with limited inspection possibilities during structural operation. We investigate several anomaly detection techniques to detect changes in statistical properties that indicate structural degradation. The most effective one is a novel polynomial fitting technique, which tracks slow changes in sensor data. Our approach has been tested on a representative test structure (bridge deck) in a lab environment, under constant and variable amplitude fatigue loading. In both cases, the evolving cracks at the monitored locations were successfully detected, autonomously, by our AD monitoring tool.
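
    The abstract does not specify the polynomial fitting technique beyond "tracking slow changes", so the sketch below uses the simplest case: a degree-one fit over a sliding window that flags departures from the extrapolated trend. Window length, tolerance and data are invented:

```python
def linear_fit(ys):
    """Least-squares line through the points (0, ys[0]) .. (n-1, ys[-1]);
    returns (intercept, slope)."""
    n = len(ys)
    mx, my = (n - 1) / 2, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in enumerate(ys)) / \
            sum((x - mx) ** 2 for x in range(n))
    return my - slope * mx, slope

def detect_drift(stream, window=10, tol=1.0):
    """Flag the first sample departing from the trend extrapolated from
    the preceding window of strain readings."""
    for t in range(window, len(stream)):
        a, b = linear_fit(stream[t - window:t])
        if abs(stream[t] - (a + b * window)) > tol:
            return t
    return None

# Slowly increasing strain with a jump at t = 40 when a crack opens.
strain = [0.01 * t for t in range(40)] + [2.0 + 0.01 * t for t in range(40, 60)]
assert detect_drift(strain) == 40
```

    Because the fit follows slow drift, normal seasonal or loading trends do not raise alarms; only departures from the local trend do.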

  2. Millimeter Wave Detection of Localized Anomalies in the Space Shuttle External Fuel Tank Insulating Foam

    NASA Technical Reports Server (NTRS)

    Kharkovsky, S.; Case, J. T.; Abou-Khousa, M. A.; Zoughi, R.; Hepburn, F.

    2006-01-01

    The Space Shuttle Columbia's catastrophic accident emphasizes the growing need for developing and applying effective, robust and life-cycle oriented nondestructive testing (NDT) methods for inspecting the shuttle external fuel tank spray-on foam insulation (SOFI). Millimeter wave NDT techniques were one of the methods chosen for evaluation of their potential for inspecting these structures. Several panels with embedded anomalies (mainly voids) were produced and tested for this purpose. Near-field and far-field millimeter wave NDT methods were used to produce images of the anomalies in these panels. This paper presents the results of an investigation for the purpose of detecting localized anomalies in several SOFI panels. To this end, reflectometers at a relatively wide range of frequencies (Ka-band (26.5 - 40 GHz) to W-band (75 - 110 GHz)) and utilizing different types of radiators were employed. The resulting raw images revealed a significant amount of information about the interior of these panels. However, using simple image processing techniques, the results were improved, in particular as they relate to detecting the smaller anomalies. This paper presents the results of this investigation and a discussion of these results.

  3. Multiple Regression Model Based Sequential Probability Ratio Test for Structural Change Detection of Time Series

    NASA Astrophysics Data System (ADS)

    Takeda, Katsunori; Hattori, Tetsuo; Kawano, Hiromichi

    In real-time analysis and forecasting of time series data, it is important to detect structural change as promptly, correctly, and simply as possible, and it is then necessary to rebuild the next prediction model after the change point as soon as possible. For this kind of time series analysis, multiple linear regression models are generally used. In this paper, we present two methods, the Sequential Probability Ratio Test (SPRT) and the Chow test that is well known in economics, and describe experimental evaluations of their effectiveness in change detection using multiple regression models. Moreover, we extend the definition of the detected change point in the SPRT method and show the improvement in change detection accuracy.
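
    The Chow test side of the comparison pools one regression over the whole sample and compares its residual sum of squares against separate fits before and after a candidate break; a minimal single-regressor sketch (synthetic data with an alternating ±0.1 disturbance so the segment fits are not exact):

```python
def rss_linear(xs, ys):
    """Residual sum of squares of the least-squares line y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def chow_statistic(xs, ys, split, k=2):
    """Chow F-statistic for a structural break at index `split`
    (k = 2 regression parameters: intercept and slope)."""
    rss_all = rss_linear(xs, ys)
    rss_1 = rss_linear(xs[:split], ys[:split])
    rss_2 = rss_linear(xs[split:], ys[split:])
    n = len(xs)
    return ((rss_all - rss_1 - rss_2) / k) / ((rss_1 + rss_2) / (n - 2 * k))

xs = list(range(40))
ys = [0.5 * x + 0.1 * (-1) ** x for x in xs[:20]] + \
     [0.5 * x + 8 + 0.1 * (-1) ** x for x in xs[20:]]   # level shift at 20
assert chow_statistic(xs, ys, 20) > 10.0   # break is strongly indicated
```

    Unlike the SPRT, the Chow test evaluates one candidate break point at a time, which is why the paper treats the two methods as complementary.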

  4. Sequential feature selection for detecting buried objects using forward looking ground penetrating radar

    NASA Astrophysics Data System (ADS)

    Shaw, Darren; Stone, Kevin; Ho, K. C.; Keller, James M.; Luke, Robert H.; Burns, Brian P.

    2016-05-01

    Forward looking ground penetrating radar (FLGPR) has the benefit of detecting objects at a significant standoff distance. The FLGPR signal is radiated over a large surface area and the radar signal return is often weak. Improving detection, especially for buried in-road targets, while maintaining an acceptable false alarm rate remains a challenging task. Various kinds of features have been developed over the years to increase FLGPR detection performance. This paper focuses on investigating the use of as many features as possible for detecting buried targets and uses the sequential feature selection technique to automatically choose the features that contribute most to improved performance. Experimental results using data collected at a government test site are presented.
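
    Sequential forward selection itself is a simple greedy loop; the toy scoring function below (a vote over made-up binary features) merely stands in for whatever detector-performance measure the real system optimizes:

```python
def forward_select(features, score, max_features=None):
    """Greedy sequential forward selection: repeatedly add the feature
    that most improves the score; stop when no candidate helps."""
    selected, best = [], score([])
    remaining = list(features)
    while remaining and (max_features is None or len(selected) < max_features):
        cand = max(remaining, key=lambda f: score(selected + [f]))
        gain = score(selected + [cand]) - best
        if gain <= 0:
            break
        selected.append(cand)
        best += gain
        remaining.remove(cand)
    return selected

# Hypothetical binary feature responses for four labelled alarms.
feats = {"f1": [0, 0, 1, 1], "f2": [0, 1, 0, 1], "noise": [1, 1, 1, 1]}
labels = [0, 0, 1, 1]

def score(subset):
    if not subset:
        return 0.0
    votes = [sum(feats[f][i] for f in subset) / len(subset) for i in range(4)]
    return sum((v > 0.5) == bool(y) for v, y in zip(votes, labels)) / 4

picked = forward_select(feats, score)
assert "f1" in picked and "noise" not in picked
```

    The greedy loop stops as soon as no remaining feature improves the score, which is how uninformative features are automatically excluded.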

  5. A New Curb Detection Method for Unmanned Ground Vehicles Using 2D Sequential Laser Data

    PubMed Central

    Liu, Zhao; Wang, Jinling; Liu, Daxue

    2013-01-01

    Curb detection is an important research topic in environment perception, which is an essential part of unmanned ground vehicle (UGV) operations. In this paper, a new curb detection method using a 2D laser range finder in a semi-structured environment is presented. In the proposed method, firstly, a local Digital Elevation Map (DEM) is built using 2D sequential laser rangefinder data and vehicle state data in a dynamic environment and a probabilistic moving object deletion approach is proposed to cope with the effect of moving objects. Secondly, the curb candidate points are extracted based on the moving direction of the vehicle in the local DEM. Finally, the straight and curved curbs are detected by the Hough transform and the multi-model RANSAC algorithm, respectively. The proposed method can detect the curbs robustly in both static and typical dynamic environments. The proposed method has been verified in real vehicle experiments. PMID:23325170

  6. A new curb detection method for unmanned ground vehicles using 2D sequential laser data.

    PubMed

    Liu, Zhao; Wang, Jinling; Liu, Daxue

    2013-01-01

    Curb detection is an important research topic in environment perception, which is an essential part of unmanned ground vehicle (UGV) operations. In this paper, a new curb detection method using a 2D laser range finder in a semi-structured environment is presented. In the proposed method, firstly, a local Digital Elevation Map (DEM) is built using 2D sequential laser rangefinder data and vehicle state data in a dynamic environment and a probabilistic moving object deletion approach is proposed to cope with the effect of moving objects. Secondly, the curb candidate points are extracted based on the moving direction of the vehicle in the local DEM. Finally, the straight and curved curbs are detected by the Hough transform and the multi-model RANSAC algorithm, respectively. The proposed method can detect the curbs robustly in both static and typical dynamic environments. The proposed method has been verified in real vehicle experiments. PMID:23325170

  7. Sequential detection of influenza epidemics by the Kolmogorov-Smirnov test

    PubMed Central

    2012-01-01

    Background Influenza is a well-known and common human respiratory infection, causing significant morbidity and mortality every year. Despite influenza variability, fast and reliable outbreak detection is required for health resource planning. Clinical health records, as published by the Diagnosticat database in Catalonia, host useful data for probabilistic detection of influenza outbreaks. Methods This paper proposes a statistical method to detect influenza epidemic activity. Non-epidemic incidence rates are modeled against the exponential distribution, and the maximum likelihood estimate for the decaying factor λ is calculated. The sequential detection algorithm updates the parameter as new data become available. Binary epidemic detection of weekly incidence rates is assessed by the Kolmogorov-Smirnov test on the absolute difference between the empirical and the cumulative distribution function of the estimated exponential distribution with significance level 0 ≤ α ≤ 1. Results The main advantage with respect to other approaches is the adoption of a statistically meaningful test, which provides an indicator of epidemic activity with an associated probability. The detection algorithm was initiated with parameter λ0 = 3.8617 estimated from the training sequence (corresponding to non-epidemic incidence rates of the 2008-2009 influenza season) and sequentially updated. The Kolmogorov-Smirnov test detected the following weeks as epidemic for each influenza season: weeks 50-10 (2008-2009 season), weeks 38-50 (2009-2010 season), weeks 50-9 (2010-2011 season) and weeks 3 to 12 for the current 2011-2012 season. Conclusions Real medical data was used to assess the validity of the approach, as well as to construct a realistic statistical model of weekly influenza incidence rates in non-epidemic periods. For the tested data, the results confirmed the ability of the algorithm to detect the start and the end of epidemic periods. In general, the proposed test could be applied to other data
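
    The scheme above pairs an exponential model of non-epidemic rates with a KS goodness-of-fit test. A loose sketch of that pairing, using `scipy.stats.kstest` on synthetic incidence rates (the exact sequential update and data differ from the paper's):

```python
import numpy as np
from scipy import stats

def update_lambda(rates):
    """MLE of the exponential rate parameter from non-epidemic weekly
    incidence rates: lambda = 1 / mean."""
    return 1.0 / np.mean(rates)

def is_epidemic(week_rates, lam, alpha=0.05):
    """KS test of the observed rates against Exponential(lam);
    rejection flags epidemic activity."""
    stat, p = stats.kstest(week_rates, 'expon', args=(0, 1.0 / lam))
    return p < alpha

rng = np.random.default_rng(0)
baseline = rng.exponential(scale=1 / 3.86, size=52)  # non-epidemic training year
lam = update_lambda(baseline)

calm = rng.exponential(scale=1 / 3.86, size=20)      # further non-epidemic weeks
surge = rng.normal(loc=5.0, scale=0.5, size=20)      # epidemic-level rates

print(is_epidemic(calm, lam), is_epidemic(surge, lam))
```

    The p-value returned by the KS test supplies exactly the "indicator of epidemic activity with an associated probability" that the abstract emphasizes.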

  8. On selecting reference image models for anomaly detection in industrial systems

    NASA Astrophysics Data System (ADS)

    Xiao, Xinhua; Quan, Jin; Ferro, Andrew; Han, Chia Y.; Zhou, Xuefu; Wee, William G.

    2013-09-01

    Automatic X-ray inspection of industrial parts usually uses reference-based methods, in which a set of model images or statistics extracted from the model image set are selected as the benchmark. Based on these methods, many systems have been developed and are used extensively for anomaly detection. However, the performance of these systems relies heavily on the model image set, so the selection of the model images is very important. This paper presents an approach for automatically selecting a set of model images to be used in a reference-based assisted defect recognition (ADR) system for anomaly detection of turbine blades of jet engines. The proposed approach to generating a model image set is based on feature extraction. Features are extracted from callout images of ADR, including potential defect indication type, size and location. Experimental results show that the proposed approach is fast and ensures a low false alarm rate with an acceptable detection rate. Moreover, the approach is applicable to different blade types and varied views of the blade. Further validation shows that the approach can be applied to the update of the model image set, when more images are generated from new blades and the model becomes inaccurate for anomaly detection in the new images.

  9. Automatic metal parts inspection: Use of thermographic images and anomaly detection algorithms

    NASA Astrophysics Data System (ADS)

    Benmoussat, M. S.; Guillaume, M.; Caulier, Y.; Spinnler, K.

    2013-11-01

    A fully automatic approach based on the use of induction thermography and detection algorithms is proposed to inspect industrial metallic parts containing different surface and sub-surface anomalies such as open cracks, and open and closed notches with different sizes and depths. A practical experimental setup is developed, where lock-in and pulsed thermography (LT and PT, respectively) techniques are used to establish a dataset of thermal images for three different mockups. Data cubes are constructed by stacking up the temporal sequence of thermogram images. After the reduction of the data space dimension by means of denoising and dimensionality reduction methods, anomaly detection algorithms are applied to the reduced data cubes. The dimensions of the reduced data spaces are automatically calculated using an arbitrary criterion. The results show that, when reduced data cubes are used, the anomaly detection algorithms originally developed for hyperspectral data, the well-known Reed and Xiaoli Yu detector (RX) and the regularized adaptive RX (RARX), give good detection performance for both surface and sub-surface defects in a non-supervised way.

  10. Capacitance probe for detection of anomalies in non-metallic plastic pipe

    DOEpatents

    Mathur, Mahendra P.; Spenik, James L.; Condon, Christopher M.; Anderson, Rodney; Driscoll, Daniel J.; Fincham, Jr., William L.; Monazam, Esmail R.

    2010-11-23

    The disclosure relates to analysis of materials using a capacitive sensor to detect anomalies through comparison of measured capacitances. The capacitive sensor is used in conjunction with a capacitance measurement device, a location device, and a processor in order to generate a capacitance versus location output which may be inspected for the detection and localization of anomalies within the material under test. The components may be carried as payload on an inspection vehicle which may traverse through a pipe interior, allowing evaluation of nonmetallic or plastic pipes when the piping exterior is not accessible. In an embodiment, supporting components are solid-state devices powered by a low voltage on-board power supply, providing for use in environments where voltage levels may be restricted.
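
    The patent's "capacitance versus location output which may be inspected for the detection and localization of anomalies" amounts to flagging deviations from an expected trace. A minimal sketch of one way to do that, using a rolling-median baseline and a robust (MAD-based) threshold on synthetic data; the baseline model and threshold are assumptions, not taken from the patent:

```python
import numpy as np

def flag_anomalies(location, capacitance, window=11, k=5.0):
    """Return locations where capacitance deviates from a rolling-median
    baseline by more than k robust standard deviations (MAD-based)."""
    pad = window // 2
    padded = np.pad(capacitance, pad, mode='edge')
    baseline = np.array([np.median(padded[i:i + window])
                         for i in range(len(capacitance))])
    resid = capacitance - baseline
    mad = np.median(np.abs(resid - np.median(resid)))
    return location[np.abs(resid) > k * 1.4826 * mad]

# Synthetic inspection run: noisy trend with a localized defect near x = 6 m.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 501)                 # position along the pipe (m)
c = 100 + 0.2 * x + rng.normal(0, 0.05, x.size)   # capacitance trace (pF)
c[300:305] += 3.0                           # anomalous capacitance jump

hits = flag_anomalies(x, c)
print(hits.min(), hits.max())               # anomaly localized near x = 6 m
```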

  11. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.

    2015-10-01

    We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is also confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.
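
    The cosine similarity score at the heart of this approach is simple to compute; the analytical thresholding distribution is the paper's contribution and is not reproduced here. A toy illustration with a sinusoidal "reference cycle" under strong additive noise (shapes and noise level are made up for illustration):

```python
import numpy as np

def cosine_score(reference, observed):
    """Cosine similarity between a reference shape and an observed one."""
    return float(observed @ reference /
                 (np.linalg.norm(observed) * np.linalg.norm(reference)))

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
ref = np.sin(t)                                      # reference cycle shape
normal = ref + rng.normal(0, 0.5, t.size)            # same shape, strong noise
anomal = np.sin(2 * t) + rng.normal(0, 0.5, t.size)  # different shape, same noise

print(cosine_score(ref, normal), cosine_score(ref, anomal))
```

    Even under heavy noise the matching shape keeps a clearly higher score, which is what makes a noise-level-dependent (adaptive) threshold workable.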

  12. Epsilon-optimal non-Bayesian anomaly detection for parametric tomography.

    PubMed

    Fillatre, Lionel; Nikiforov, Igor; Retraint, Florent

    2008-11-01

    The non-Bayesian detection of an anomaly from a single or a few noisy tomographic projections is considered as a statistical hypotheses testing problem. It is supposed that a radiograph is composed of an imaged nonanomalous background medium, considered as a deterministic nuisance parameter, with a possibly hidden anomaly. Because the full voxel-by-voxel reconstruction is impossible, an original tomographic method based on the parametric models of the nonanomalous background medium and radiographic process is proposed to compensate for the missing data. Exploiting this "parametric tomography," a new detection scheme with a limited loss of optimality is proposed as an alternative to the nonlinear generalized likelihood ratio test, which is intractable in the context of nondestructive testing of objects with uncertainties in their physical/geometrical properties. The theoretical results are illustrated by the processing of real radiographs for nuclear fuel rod inspection.

  13. GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection

    SciTech Connect

    Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D; Reed, Joel W; Goodall, John R

    2016-01-01

    This paper introduces a novel graph-analytic approach for detecting anomalies in network flow data called GraphPrints. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets, small induced subgraphs that describe local topology. By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified, and furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level and 0.05% at the IP level, with 100% true positive rates at both.
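
    The pipeline of "count small subgraphs per time slice, then flag outlying count vectors" can be sketched with a tiny feature set. This is not GraphPrints itself: real graphlet enumeration covers many more subgraph types, and the outlier detector here is a plain z-score on synthetic graphs:

```python
import numpy as np

def graphlet_counts(A):
    """Tiny 'graphlet' feature vector for an undirected adjacency matrix:
    edges, wedges (paths of length two) and triangles."""
    deg = A.sum(1)
    edges = A.sum() / 2
    triangles = np.trace(A @ A @ A) / 6
    wedges = (deg * (deg - 1) / 2).sum()
    return np.array([edges, wedges, triangles])

rng = np.random.default_rng(0)

def random_graph(n, p):
    """Erdos-Renyi adjacency matrix standing in for one traffic time slice."""
    A = (rng.random((n, n)) < p).astype(float)
    A = np.triu(A, 1)
    return A + A.T

# Stream of 30 time slices; slice 20 carries an implanted density anomaly.
counts = np.array([graphlet_counts(random_graph(40, 0.1)) for _ in range(30)])
counts[20] = graphlet_counts(random_graph(40, 0.4))

mu, sd = counts.mean(0), counts.std(0)
z = np.abs((counts - mu) / sd).max(1)   # max z-score across graphlet features
print(int(np.argmax(z)))  # → 20
```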

  14. A new morphological anomaly detection algorithm for hyperspectral images and its GPU implementation

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2011-10-01

    Anomaly detection is considered a very important task for hyperspectral data exploitation. It is now routinely applied in many application domains, including defence and intelligence, public safety, precision agriculture, geology, and forestry. Many of these applications require timely responses for swift decisions, which depend upon the high computing performance of the analysis algorithms. However, with the recent explosion in the amount and dimensionality of hyperspectral imagery, this problem calls for the incorporation of parallel computing techniques. In the past, clusters of computers have offered an attractive solution for fast anomaly detection in hyperspectral data sets already transmitted to Earth. However, these systems are expensive and difficult to adapt to on-board data processing scenarios, in which low-weight and low-power integrated components are essential to reduce mission payload and obtain analysis results in (near) real-time, i.e., at the same time as the data is collected by the sensor. An exciting development in the field of commodity computing is the emergence of commodity graphics processing units (GPUs), which can now bridge the gap towards on-board processing of remotely sensed hyperspectral data. In this paper, we develop a new morphological algorithm for anomaly detection in hyperspectral images along with an efficient GPU implementation of the algorithm. The algorithm is implemented on latest-generation GPU architectures, and evaluated in comparison with other anomaly detection algorithms using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex. The proposed GPU implementation achieves real-time performance in the considered case study.

  15. Parallel implementation of RX anomaly detection on multi-core processors: impact of data partitioning strategies

    NASA Astrophysics Data System (ADS)

    Molero, Jose M.; Garzón, Ester M.; García, Inmaculada; Plaza, Antonio

    2011-11-01

    Anomaly detection is an important task for remotely sensed hyperspectral data exploitation. One of the most widely used and successful algorithms for anomaly detection in hyperspectral images is the Reed-Xiaoli (RX) algorithm. Despite its wide acceptance and its high computational complexity when applied to real hyperspectral scenes, few documented parallel implementations of this algorithm exist, in particular for multi-core processors. The advantage of multi-core platforms over other specialized parallel architectures is that they are a low-power, inexpensive, widely available and well-known technology. A critical issue in the parallel implementation of RX is the sample covariance matrix calculation, which can be approached in global or local fashion. This aspect is crucial for the RX implementation since the consideration of a local or global strategy for the computation of the sample covariance matrix is expected to affect both the scalability of the parallel solution and the anomaly detection results. In this paper, we develop new parallel implementations of RX on multi-core processors and specifically investigate the impact of different data partitioning strategies when parallelizing its computations. For this purpose, we consider both global and local data partitioning strategies in the spatial domain of the scene, and further analyze their scalability in different multi-core platforms. The numerical effectiveness of the considered solutions is evaluated using receiver operating characteristic (ROC) curves, analyzing their capacity to detect thermal hot spots (anomalies) in hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer system over the World Trade Center in New York, five days after the terrorist attacks of September 11th, 2001.
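
    The global-covariance variant of the RX algorithm discussed above scores each pixel by its Mahalanobis distance from the scene mean. A minimal serial NumPy sketch on a synthetic three-band scene (the parallel partitioning strategies of the paper are not shown):

```python
import numpy as np

def rx_scores(cube):
    """Global RX: Mahalanobis distance of each pixel spectrum from the
    scene mean, using the sample covariance of all pixels."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b)
    mu = X.mean(0)
    Xc = X - mu
    cov = Xc.T @ Xc / (len(X) - 1)
    inv = np.linalg.inv(cov + 1e-6 * np.eye(b))   # small ridge for stability
    return np.einsum('ij,jk,ik->i', Xc, inv, Xc).reshape(h, w)

# Synthetic 3-band scene: correlated background plus one anomalous pixel.
rng = np.random.default_rng(0)
bg = rng.multivariate_normal([0, 0, 0],
                             [[1, .8, .6], [.8, 1, .8], [.6, .8, 1]],
                             size=(64, 64))
bg[10, 20] = [5.0, -5.0, 5.0]          # spectrum unlike the background

scores = rx_scores(bg)
loc = tuple(int(i) for i in np.unravel_index(np.argmax(scores), scores.shape))
print(loc)  # → (10, 20)
```

    The local variant replaces the global mean and covariance with statistics from a window around each pixel, which is precisely what makes the data partitioning strategy matter for parallel scalability.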

  16. Fetal Central Nervous System Anomalies Detected by Magnetic Resonance Imaging: A Two-Year Experience

    PubMed Central

    Sefidbakht, Sepideh; Dehghani, Sakineh; Safari, Maryam; Vafaei, Homeira; Kasraeian, Maryam

    2016-01-01

    Background Magnetic resonance imaging (MRI) is gradually becoming more common for thorough visualization of the fetus than ultrasound (US), especially for neurological anomalies, which are the most common indications for fetal MRI and are a matter of concern for both families and society. Objectives We investigated fetal MRIs carried out in our center for frequency of central nervous system anomalies. This is the first such report in southern Iran. Materials and Methods One hundred and seven (107) pregnant women with suspicious fetal anomalies in prenatal ultrasound entered a cross-sectional retrospective study from 2011 to 2013. A 1.5 T Siemens Avanto scanner was employed for sequences, including T2 HASTE and Trufisp images in axial, coronal, and sagittal planes to mother’s body, T2 HASTE and Trufisp relative to the specific fetal body part being evaluated, and T1 flash images in at least one plane based on clinical indication. We investigated any abnormality in the central nervous system and performed descriptive analysis to achieve index of frequency. Results Mean gestational age ± standard deviation (SD) for fetuses was 25.54 ± 5.22 weeks, and mean maternal age ± SD was 28.38 ± 5.80 years. Eighty out of 107 (74.7%) patients were referred with an initial impression of borderline ventriculomegaly. A total of 18 out of 107 (16.82%) patients were found to have fetuses with CNS anomalies and the remainder were neurologically normal. Detected anomalies were as follows: 3 (16.6%) fetuses had the Dandy-Walker variant and 3 (16.6%) had Arnold-Chiari II (with myelomeningocele). Complete agenesis of corpus callosum, partial agenesis of corpus callosum, and aqueductal stenosis were each seen in 2 (11.1%) fetuses. Arnold-Chiari II without myelomeningocele, anterior spina bifida associated with neurenteric cyst, arachnoid cyst, lissencephaly, and isolated enlarged cisterna magna each presented in one (5.5%) fetus. One fetus had concomitant schizencephaly and complete agenesis of

  17. Developing a new, passive diffusion sampling array to detect helium anomalies associated with volcanic unrest

    USGS Publications Warehouse

    Dame, Brittany E; Solomon, D Kip; Evans, William C.; Ingebritsen, Steven E.

    2015-01-01

    Helium (He) concentration and 3He/4He anomalies in soil gas and spring water are potentially powerful tools for investigating hydrothermal circulation associated with volcanism and could perhaps serve as part of a hazards warning system. However, in operational practice, He and other gases are often sampled only after volcanic unrest is detected by other means. A new passive diffusion sampler suite, intended to be collected after the onset of unrest, has been developed and tested as a relatively low-cost method of determining He-isotope composition pre- and post-unrest. The samplers, each with a distinct equilibration time, passively record He concentration and isotope ratio in springs and soil gas. Once collected and analyzed, the He concentrations in the samplers are used to deconvolve the time history of the He concentration and the 3He/4He ratio at the collection site. The current suite consisting of three samplers is sufficient to deconvolve both the magnitude and the timing of a step change in in situ concentration if the suite is collected within 100 h of the change. The effects of temperature and prolonged deployment on the suite's capability of recording He anomalies have also been evaluated. The suite has captured a significant 3He/4He soil gas anomaly at Horseshoe Lake near Mammoth Lakes, California. The passive diffusion sampler suite appears to be an accurate and affordable alternative for determining He anomalies associated with volcanic unrest.

  18. A Comparative Study of Unsupervised Anomaly Detection Techniques Using Honeypot Data

    NASA Astrophysics Data System (ADS)

    Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Inoue, Daisuke; Eto, Masashi; Nakao, Koji

    Intrusion Detection Systems (IDSs) have received considerable attention among network security researchers as one of the most promising countermeasures to defend crucial computer systems or networks against attackers on the Internet. Over the past few years, many machine learning techniques have been applied to IDSs so as to improve their performance and to construct them with low cost and effort. In particular, unsupervised anomaly detection techniques have a significant advantage in their capability to identify unforeseen attacks, i.e., 0-day attacks, and to build intrusion detection models without any labeled (i.e., pre-classified) training data in an automated manner. In this paper, we conduct a set of experiments to evaluate and analyze the performance of the major unsupervised anomaly detection techniques using real traffic data obtained at our honeypots deployed inside and outside the campus network of Kyoto University, and using various evaluation criteria: performance evaluation by similarity measurements and the size of training data, overall performance, detection ability for unknown attacks, and time complexity. Our experimental results give practical and useful guidelines to IDS researchers and operators, so that they can acquire insight into applying these techniques to the area of intrusion detection and devise more effective intrusion detection models.

  19. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
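
    The residual-monitoring core of such an architecture is easy to sketch: compare the sensed output against the model prediction and flag samples whose residual exceeds a threshold. The engine response, noise level, seeded fault and 3-sigma threshold below are all hypothetical, not the NASA test data:

```python
import numpy as np

def residual_monitor(sensed, predicted, threshold):
    """Flag samples where |sensed - model prediction| exceeds a threshold
    (here a fixed multiple of the nominal residual spread)."""
    return np.abs(sensed - predicted) > threshold

# Hypothetical engine output: the model tracks the nominal transient response;
# a bias fault is seeded from sample 60 onward.
rng = np.random.default_rng(0)
t = np.arange(100)
predicted = 500 + 20 * np.tanh(t / 30)       # model-predicted output
sensed = predicted + rng.normal(0, 0.5, 100) # nominal sensor noise, sigma = 0.5
sensed[60:] += 5.0                           # seeded sensor bias fault

flags = residual_monitor(sensed, predicted, threshold=3 * 0.5)
print(int(flags[:60].sum()), int(flags[60:].sum()))
```

    In the paper's architecture the quality of `predicted` (the piecewise linear model with updated trim points) is exactly what determines how small this threshold can be made without false alarms.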

  20. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  1. Detection and Origin of Hydrocarbon Seepage Anomalies in the Barents Sea

    NASA Astrophysics Data System (ADS)

    Polteau, Stephane; Planke, Sverre; Stolze, Lina; Kjølhamar, Bent E.; Myklebust, Reidun

    2016-04-01

    We have collected more than 450 gravity cores in the Barents Sea to detect hydrocarbon seepage anomalies and for seismic-stratigraphic ties. The cores are from the Hoop Area (125 samples) and from the Barents Sea SE (293 samples). In addition, we have collected cores near seven exploration wells. The samples were analyzed using three different analytical methods: (1) the standard organic geochemical analyses of Applied Petroleum Technologies (APT), (2) the Amplified Geochemical Imaging (AGI) method, and (3) the Microbial Prospecting for Oil and Gas (MPOG) method. These analytical approaches can detect trace amounts of thermogenic hydrocarbons in the sediment samples, and may provide additional information about the fluid phases and the depositional environment, maturation, and age of the source rocks. However, hydrocarbon anomalies in seabed sediments may also be related to shallow sources, such as biogenic gas or reworked source rocks in the sediments. To better understand the origin of the hydrocarbon anomalies in the Barents Sea we have studied 35 samples collected approximately 200 m away from seven exploration wells. The wells included three boreholes associated with oil discoveries, two with gas discoveries, one dry well with gas shows, and one dry well. In general, the results of this case study reveal that the oil wells have an oil signature, the gas wells a gas signature, and the dry wells a background signature. However, differences in results from the three methods may occur and have largely been explained in terms of analytical measurement ranges, method sensitivities, and bio-geochemical processes in the seabed sediments. The standard geochemical method applied by APT relies on measuring the abundance of compounds between C1 to C5 in the headspace gas and between C11 to C36 in the sediment extracts. The anomalies detected in the sediment samples from this study were in the C16 to C30 range. Since the organic matter yields were mostly very low, the

  2. A healthcare utilization analysis framework for hot spotting and contextual anomaly detection.

    PubMed

    Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram

    2012-01-01

    Patient medical records today contain a vast amount of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify, in a timely manner, heavy users of the system and their patterns of utilization so that targeted intervention programs can be instituted; and anomaly detection: the ability to identify anomalous utilization cases, where patients incurred levels of utilization that are unexpected given their clinical characteristics, which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease-specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map a patient's clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. In future work we plan to incorporate additional sources of observational data including EMRs and disease registries, and develop analytics models to leverage temporal relationships among
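
    The contextual component, mapping clinical characteristics to an expected utilization level and flagging large deviations, can be sketched with a least-squares fit. The single "severity" covariate, the visit counts, and the z-score flagging rule below are illustrative assumptions, not the paper's models or claims data:

```python
import numpy as np

# Contextual anomaly detection sketch: regress utilization level on a
# clinical severity score, then flag patients whose actual utilization
# deviates strongly from the level expected for their characteristics.
rng = np.random.default_rng(0)
severity = rng.uniform(0, 10, 300)                    # clinical characteristic
visits = 2 + 1.5 * severity + rng.normal(0, 1, 300)   # expected relationship
visits[7] = 40.0                                      # anomalous high utilizer

A = np.c_[np.ones_like(severity), severity]
coef, *_ = np.linalg.lstsq(A, visits, rcond=None)     # fitted expected level
resid = visits - A @ coef
z = (resid - resid.mean()) / resid.std()

print(int(np.argmax(np.abs(z))))  # → 7
```

    Note the contextual point: patient 7's raw visit count alone would flag any heavy user, whereas the residual flags utilization that is unexpected *given* the clinical characteristics.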

  3. A Healthcare Utilization Analysis Framework for Hot Spotting and Contextual Anomaly Detection

    PubMed Central

    Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram

    2012-01-01

    Patient medical records today contain a vast amount of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify, in a timely manner, heavy users of the system and their patterns of utilization so that targeted intervention programs can be instituted; and anomaly detection: the ability to identify anomalous utilization cases, where patients incurred levels of utilization that are unexpected given their clinical characteristics, which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease-specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map a patient’s clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. In future work we plan to incorporate additional sources of observational data including EMRs and disease registries, and develop analytics models to leverage temporal relationships among

  4. A healthcare utilization analysis framework for hot spotting and contextual anomaly detection.

    PubMed

    Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram

    2012-01-01

    Patient medical records today contain a vast amount of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify, in a timely manner, heavy users of the system and their patterns of utilization so that targeted intervention programs can be instituted; and anomaly detection: the ability to identify anomalous utilization cases, where patients incurred levels of utilization that are unexpected given their clinical characteristics, which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease-specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map a patient's clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. In future work we plan to incorporate additional sources of observational data including EMRs and disease registries, and develop analytics models to leverage temporal relationships among

  5. Gaussian mixture model based approach to anomaly detection in multi/hyperspectral images

    NASA Astrophysics Data System (ADS)

    Acito, N.; Diani, M.; Corsini, G.

    2005-10-01

    Anomaly detectors reveal the presence of objects/materials in a multi/hyperspectral image by simply searching for those pixels whose spectrum differs from the background one (anomalies). This procedure can be applied directly to the radiance at the sensor level and has the great advantage of avoiding the difficult step of atmospheric correction. The most popular anomaly detector is the RX algorithm derived by Yu and Reed. It is based on the assumption that the pixels, in a region around the one under test, follow a single multivariate Gaussian distribution. Unfortunately, such a hypothesis is generally not met in actual scenarios, and a large number of false alarms is usually experienced when the RX algorithm is applied in practice. In this paper, a more general approach to anomaly detection is considered, based on the assumption that the background contains different terrain types (clusters), each of them Gaussian distributed. In this approach the parameters of each cluster are estimated and used in the detection process. Two detectors are considered: the SEM-RX and the K-means RX. Both algorithms follow two steps: 1) the parameters of the background clusters are estimated; 2) a detection rule based on the RX test is applied. The SEM-RX stems from the Gaussian mixture model (GMM) and employs the SEM algorithm to estimate the clusters' parameters; the K-means RX instead resorts to the well-known K-means algorithm to obtain the background clusters. An automatic procedure is defined, for both detectors, to select the number of clusters, and a novel criterion is proposed to set the test threshold. The performance of the two detectors is evaluated on an experimental data set and compared with that of the RX algorithm. The comparative analysis is carried out in terms of experimental receiver operating characteristics.
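    The RX test underlying both variants reduces to a Mahalanobis distance between each pixel's spectrum and background statistics. Below is a minimal global-RX sketch on a synthetic cube; the clustered SEM-RX/K-means RX variants would compute the same distance against each cluster's mean and covariance and take the minimum. The synthetic data and the small ridge term are illustrative assumptions.

```python
import numpy as np

def rx_scores(cube):
    """Global RX: squared Mahalanobis distance of each pixel to the background."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    inv = np.linalg.inv(cov + 1e-6 * np.eye(b))   # small ridge for numerical stability
    d = X - mu
    return np.einsum("ij,jk,ik->i", d, inv, d).reshape(h, w)

# Synthetic scene: Gaussian background with one spectrally different pixel.
rng = np.random.default_rng(1)
cube = rng.normal(size=(20, 20, 5))
cube[10, 10] += 8.0
scores = rx_scores(cube)
```

    Thresholding `scores` (e.g., at a chi-square quantile for the number of bands) yields the detection map; the clustered variants trade this single Gaussian for one per terrain type.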

  6. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    NASA Astrophysics Data System (ADS)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

    The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a multimodal sensor analysis framework that enables sensors to be analyzed jointly across modalities. The framework also provides plugin-based analysis interfaces for developing applications based on sensor and image processing, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. Using this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard braking, pothole crossing, and speed bump crossing. Upon detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.

  7. A divide and conquer approach to anomaly detection, localization and diagnosis

    NASA Astrophysics Data System (ADS)

    Liu, Jianbo; Djurdjanovic, Dragan; Marko, Kenneth A.; Ni, Jun

    2009-11-01

    With the growing complexity of dynamic control systems, the effective diagnosis of all possible failures has become increasingly difficult and time consuming. The virtually infinite variety of behavior patterns of such systems due to control inputs and environmental influences further complicates system characterization and fault diagnosis. To circumvent these difficulties, we propose a new diagnostic method consisting of three elements: the first, based on anomaly detection, identifies any performance deviation from normal operation; the second, based on anomaly/fault localization, localizes the problem, as closely as possible, to the specific component or subsystem that does not operate properly; and the third, fault diagnosis, discriminates between known and unknown faults and identifies the type of the fault if it is previously known. Our prescriptive method for diagnostic design relies on the use of self-organizing maps (SOMs) for regionalization of the system operating conditions, followed by a performance assessment module based on time-frequency distributions (TFDs) and principal component analysis (PCA) for anomaly detection and fault diagnosis. The complete procedure is described in detail and demonstrated with an example of an automotive engine control system.
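    One ingredient of such a scheme, PCA-based anomaly detection, can be sketched as a residual (Q statistic) test against a principal subspace fitted to nominal data. The synthetic features below simply stand in for quantities like per-band time-frequency energies; the dimensions and thresholds are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(6)

# Nominal operating data lying near a 3-dimensional subspace of an
# 8-dimensional feature space, plus small measurement noise.
train = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 8))
train += rng.normal(scale=0.01, size=train.shape)

mu = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mu, full_matrices=False)
V = Vt[:3].T                              # retained principal subspace

def spe(x):
    """Squared prediction error (Q statistic): energy outside the PCA subspace."""
    r = (x - mu) - V @ (V.T @ (x - mu))
    return float(r @ r)
```

    A new sample with large `spe` does not fit the correlation structure of normal operation, which is the anomaly-detection signal that the localization and diagnosis stages would then act on.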

  8. Unmixing and anomaly detection in hyperspectral data due to cluster variation and local information

    NASA Astrophysics Data System (ADS)

    Maerker, Jochen M.; Huber, Johannes; Middelmann, Wolfgang

    2010-04-01

    This paper presents a novel method for anomaly detection based on a cluster unmixing approach. Several algorithms for endmember extraction and unmixing have been reported in the literature. Endmember extraction algorithms search for pure materials which constitute the significant structure of the environment. For abundance estimation in hyperspectral imagery, various physically motivated least-squares methods are considered. In real hyperspectral data, the signatures of each pure material vary with physical texture and perspective. In this work, clustering of the data is performed and normal distributions - instead of constant signatures - are used to represent the endmembers. This representation allows determination of class membership by means of unmixing. Furthermore, a parameter optimization is performed. Using only endmembers in a focal window around each pixel better fits the physical model. As a result of this local approach, the residual of the reconstruction indicates the magnitude of anomalies. The new approach is called 'Cluster Mixing' (CM). The performance of Cluster Mixing is illustrated by a comparison with other anomaly detection algorithms.
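    The residual-driven detection idea can be sketched with plain least-squares unmixing against a fixed endmember set: a pixel that is well explained as a mixture reconstructs with a small residual, while a pixel outside the endmember span does not. This omits the paper's per-cluster normal distributions and local focal window; the synthetic endmembers and noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Endmember signatures as columns: 6 spectral bands, 3 pure materials.
E = rng.uniform(0.1, 1.0, size=(6, 3))
# Mixed pixels: non-negative abundances summing to one, plus small noise.
A = rng.dirichlet(np.ones(3), size=100)
pixels = A @ E.T + rng.normal(scale=0.005, size=(100, 6))
pixels[0] = rng.uniform(0.1, 1.0, size=6)   # pixel 0 is not a mixture of the endmembers

# Least-squares unmixing; the reconstruction residual scores each pixel.
coef, *_ = np.linalg.lstsq(E, pixels.T, rcond=None)
recon = (E @ coef).T
residual = np.linalg.norm(pixels - recon, axis=1)
```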

  9. Small sample training and test selection method for optimized anomaly detection algorithms in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Mindrup, Frank M.; Friend, Mark A.; Bauer, Kenneth W.

    2012-01-01

    There are numerous anomaly detection algorithms proposed for hyperspectral imagery. Robust parameter design (RPD) techniques provide an avenue to select robust settings capable of operating consistently across a large variety of image scenes. Many researchers in this area are faced with a paucity of data. Unfortunately, there are no data splitting methods for model validation of datasets with small sample sizes; typically, training and test sets of hyperspectral images are chosen randomly. Previous research has developed a framework for optimizing anomaly detection in hyperspectral imagery (HSI) by considering specific image characteristics as noise variables within the context of RPD; these characteristics include the Fisher score, the ratio of target pixels, and the number of clusters. We have developed a method for selecting hyperspectral image training and test subsets that yields consistent RPD results based on these noise features. These subsets are not necessarily orthogonal, but still provide improvements over random training and test subset assignments by maximizing the volume and average distance between image noise characteristics. The small sample training and test selection method is contrasted with randomly selected training sets as well as training sets chosen by the CADEX and DUPLEX algorithms for the well-known Reed-Xiaoli anomaly detector.

  10. Anomaly detection in hyperspectral imagery based on low-rank and sparse decomposition

    NASA Astrophysics Data System (ADS)

    Cui, Xiaoguang; Tian, Yuan; Weng, Lubin; Yang, Yiping

    2014-01-01

    This paper presents a novel low-rank and sparse decomposition (LSD) based model for anomaly detection in hyperspectral images. In our model, a local image region is represented as a low-rank matrix plus sparse noise in the spectral space, where the background is explained by the low-rank matrix and the anomalies are indicated by the sparse noise. The detection of anomalies in local image regions is formulated as a constrained LSD problem, which can be solved efficiently and robustly with a modified "Go Decomposition" (GoDec) method. To enhance the validity of this model, we adapt a "simple linear iterative clustering" (SLIC) superpixel algorithm to efficiently generate homogeneous local image regions, i.e., superpixels, in hyperspectral imagery, thus ensuring that the background in local image regions satisfies the low-rank condition. Experimental results on real hyperspectral data demonstrate that, compared with several known local detectors including the RX detector, the kernel RX detector, and the SVDD detector, the proposed model achieves better performance with satisfactory computation time.
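    The core decomposition can be illustrated with a minimal GoDec-style alternation: truncate to a low-rank matrix by SVD, then hard-threshold the residual to a fixed cardinality to get the sparse part. This is a generic sketch of the technique, not the authors' modified solver, and it omits the superpixel segmentation; the rank, cardinality, and synthetic data are illustrative.

```python
import numpy as np

def godec(M, rank, card, iters=20):
    """Alternate low-rank SVD truncation and sparse hard-thresholding (GoDec-style)."""
    S = np.zeros_like(M)
    for _ in range(iters):
        # Low-rank step: best rank-`rank` approximation of M - S.
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Sparse step: keep only the `card` largest-magnitude residual entries.
        R = M - L
        thresh = np.partition(np.abs(R).ravel(), -card)[-card]
        S = np.where(np.abs(R) >= thresh, R, 0.0)
    return L, S

rng = np.random.default_rng(3)
# Rank-2 background with three large sparse anomalies injected.
M = rng.normal(size=(30, 2)) @ rng.normal(size=(2, 20))
spikes = [(3, 4), (10, 11), (25, 7)]
for i, j in spikes:
    M[i, j] += 10.0
L, S = godec(M, rank=2, card=3)
```

    In the detection setting, the rows of `M` would be pixel spectra from one homogeneous region, and the support of `S` marks the anomalous entries.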

  11. Radiation anomaly detection algorithms for field-acquired gamma energy spectra

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen

    2015-08-01

    The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of the threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign radioactive materials in the form of naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work despite a strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and relative merits of these different algorithms will be discussed and demonstrated.

  12. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems.

    PubMed

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to deal with information overload. However, recommendation technology still suffers from many problems, one of which is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. Two characteristics are common to all types of shilling attacks: 1) item abnormality: the rating of target items is always the maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need to be further improved. To address these problems, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first dynamically partitions item-rating time series based on important points; we then use the chi-square (χ²) distribution to detect abnormal intervals. Experimental results on the MovieLens 100K and 1M datasets indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes.
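    The chi-square interval test can be illustrated on toy counts: for each time interval, compare the observed number of extreme (maximum/minimum) ratings against the count expected from the item's overall extreme-rating share. This is a simplification of the paper's method (the dynamic important-point partitioning is not shown), and all numbers below are made up for illustration.

```python
import numpy as np

# Per-interval counts of extreme (maximum/minimum) ratings for one item,
# against the total number of ratings in each interval. Interval 3 has
# been injected with an attack burst; all numbers are illustrative.
extreme = np.array([8, 9, 7, 40, 8, 8])
total = np.array([50, 55, 48, 52, 50, 49])

p = extreme.sum() / total.sum()      # item's overall share of extreme ratings
exp_ext = p * total
exp_non = (1 - p) * total

# Two-cell chi-square statistic per interval (df = 1); 3.84 is the 95% critical value.
stat = (extreme - exp_ext) ** 2 / exp_ext + (total - extreme - exp_non) ** 2 / exp_non
suspicious = np.flatnonzero(stat > 3.84)
```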

  14. Min-max hyperellipsoidal clustering for anomaly detection in network security.

    PubMed

    Sarasamma, Suseela T; Zhu, Qiuming A

    2006-08-01

    A novel hyperellipsoidal clustering technique is presented for an intrusion-detection system in network security. Hyperellipsoidal clusters that maximize intracluster similarity and minimize intercluster similarity are generated from training data sets. The novelty of the technique lies in the fact that the parameters needed to construct higher order data models in general multivariate Gaussian functions are incrementally derived from the data sets using accretive processes. The technique is implemented in a feedforward neural network that uses a Gaussian radial basis function as the model generator. An evaluation based on the inclusiveness and exclusiveness of samples with respect to specific criteria is applied to accretively learn the output clusters of the neural network. One significant advantage of this technique is its ability to detect individual anomaly types that are hard to detect with other anomaly-detection schemes. Applying this technique, several feature subsets of the tcptrace network-connection records were identified that give above 95% detection at false-positive rates below 5%.
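    The geometric idea of hyperellipsoidal membership can be sketched without the accretive learning machinery: fit a mean and covariance per cluster, and score a new record by its squared Mahalanobis distance to the nearest cluster ellipsoid. The two synthetic 2-D clusters below stand in for normal connection records; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two elongated Gaussian clusters standing in for normal connection records.
c1 = rng.multivariate_normal([0, 0], [[4.0, 1.5], [1.5, 1.0]], size=300)
c2 = rng.multivariate_normal([10, 5], [[1.0, -0.8], [-0.8, 2.0]], size=300)
train = np.vstack([c1, c2])
labels = np.repeat([0, 1], 300)

# One hyperellipsoid (mean plus inverse covariance) per cluster.
params = []
for k in (0, 1):
    pts = train[labels == k]
    params.append((pts.mean(axis=0), np.linalg.inv(np.cov(pts, rowvar=False))))

def mahal2(x):
    """Squared Mahalanobis distance to the nearest cluster ellipsoid."""
    return min(float((x - mu) @ inv @ (x - mu)) for mu, inv in params)
```

    A record outside every ellipsoid (large `mahal2`) is anomalous; because each cluster keeps its own covariance, elongated traffic patterns are not penalized the way they would be under a single spherical distance.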

  16. Data-Driven Anomaly Detection Performance for the Ares I-X Ground Diagnostic Prototype

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Schwabacher, Mark A.; Matthews, Bryan L.

    2010-01-01

    In this paper, we will assess the performance of a data-driven anomaly detection algorithm, the Inductive Monitoring System (IMS), which can be used to detect simulated Thrust Vector Control (TVC) system failures. However, the ability of IMS to detect these failures in a true operational setting may be related to the realistic nature of how they are simulated. As such, we will investigate both a low fidelity and high fidelity approach to simulating such failures, with the latter based upon the underlying physics. Furthermore, the ability of IMS to detect anomalies that were previously unknown and not previously simulated will be studied in earnest, as well as apparent deficiencies or misapplications that result from using the data-driven paradigm. Our conclusions indicate that robust detection performance of simulated failures using IMS is not appreciably affected by the use of a high fidelity simulation. However, we have found that the inclusion of a data-driven algorithm such as IMS into a suite of deployable health management technologies does add significant value.

  17. Advanced Unsupervised Classification Methods to Detect Anomalies on Earthen Levees Using Polarimetric SAR Imagery

    PubMed Central

    Marapareddy, Ramakalavathi; Aanstoos, James V.; Younan, Nicolas H.

    2016-01-01

    Fully polarimetric Synthetic Aperture Radar (polSAR) data analysis has wide applications for terrain and ground cover classification. The dynamics of surface and subsurface water events can lead to slope instability resulting in slough slides on earthen levees. Early detection of these anomalies by a remote sensing approach could save time versus direct assessment. We used L-band Synthetic Aperture Radar (SAR) to screen levees for anomalies. SAR technology, due to its high spatial resolution and soil penetration capability, is a good choice for identifying problematic areas on earthen levees. Using the parameters entropy (H), anisotropy (A), alpha (α), and eigenvalues (λ, λ1, λ2, and λ3), we implemented several unsupervised classification algorithms for the identification of anomalies on the levee. The classification techniques applied are H/α, H/A, A/α, Wishart H/α, Wishart H/A/α, and H/α/λ classification algorithms. In this work, the effectiveness of the algorithms was demonstrated using quad-polarimetric L-band SAR imagery from the NASA Jet Propulsion Laboratory’s (JPL’s) Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR). The study area is a section of the lower Mississippi River valley in the Southern USA, where earthen flood control levees are maintained by the US Army Corps of Engineers. PMID:27322270
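    The H/α features used by these classifiers come from the Cloude-Pottier eigen-decomposition of each pixel's coherency matrix. A minimal sketch of that feature computation follows; it assumes a 3x3 Hermitian coherency matrix as input, and the diagonal example matrix is illustrative.

```python
import numpy as np

def h_alpha(T):
    """Cloude-Pottier entropy (H) and mean alpha angle from a 3x3 coherency matrix."""
    eigval, eigvec = np.linalg.eigh(T)
    eigval = np.clip(eigval[::-1], 0.0, None)    # descending, non-negative
    eigvec = eigvec[:, ::-1]
    p = eigval / eigval.sum()                    # pseudo-probabilities
    H = -np.sum(p * np.log(p + 1e-12)) / np.log(3)   # entropy, normalized to [0, 1]
    alpha = np.degrees(np.sum(p * np.arccos(np.abs(eigvec[0, :]))))
    return H, alpha
```

    Pixels are then placed in the H/α plane (optionally with anisotropy A from the second and third eigenvalues) and partitioned into scattering-mechanism classes, which is what the Wishart-refined variants iterate on.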

  18. Automatic, Real-Time Algorithms for Anomaly Detection in High Resolution Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Srivastava, A. N.; Nemani, R. R.; Votava, P.

    2008-12-01

    Earth observing satellites are generating data at an unprecedented rate, surpassing almost all other data-intensive applications. However, most of the data that arrive from the satellites are not analyzed directly; rather, multiple scientific teams analyze only a small fraction of the total data available in the data stream. Although there are many reasons for this situation, one paramount concern is developing algorithms and methods that can analyze the vast, high-dimensional, streaming satellite images. This paper describes a new set of methods that are among the fastest available algorithms for real-time anomaly detection. These algorithms were built to maximize accuracy and speed for a variety of applications in fields outside of the earth sciences. However, our studies indicate that with appropriate modifications, these algorithms can be extremely valuable for identifying anomalies rapidly using only modest computational power. We review two algorithms used as benchmarks in the field, Orca and One-Class Support Vector Machines, and discuss the anomalies that are discovered in MODIS data taken over the Central California region. We are especially interested in automatic identification of disturbances within ecosystems (e.g., wildfires, droughts, floods, insect/pest damage, wind damage, logging). We show the scalability of the algorithms and demonstrate that with appropriately adapted technology, the dream of real-time analysis can be made a reality.
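    The first benchmark, Orca, scores points by their distance to their nearest neighbors (the production implementation adds pruning for speed). A brute-force sketch of that scoring rule, on synthetic data standing in for per-pixel feature vectors, looks like this; the data and `k` are illustrative.

```python
import numpy as np

def knn_scores(X, k=5):
    """Orca-style anomaly score: mean distance to the k nearest neighbors."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude each point from its own neighbors
    return np.sort(d, axis=1)[:, :k].mean(axis=1)

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 4))
X[0] = 6.0                               # one point far from the bulk of the data
scores = knn_scores(X)
```

    The O(n²) distance matrix here is what Orca's pruning avoids: it maintains a cutoff equal to the weakest score among the current top anomalies and abandons points early once they cannot exceed it.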

  19. Volcanic activity and satellite-detected thermal anomalies at Central American volcanoes

    NASA Technical Reports Server (NTRS)

    Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.

    1973-01-01

    The author has identified the following significant results. A large nuee ardente eruption occurred at Santiaguito volcano, within the test area, on 16 September 1973. Through a system of local observers, the eruption has been described, reported to the international scientific community, the extent of the affected area mapped, and the new ash sampled. A more extensive report on this event will be prepared. The eruption is an excellent example of the kind of volcanic situation in which satellite thermal imagery might be useful. The Santiaguito dome is a complex mass with a whole series of historically active vents. Its location makes access difficult, yet its activity is of great concern to the large agricultural populations who live downslope. Santiaguito has produced a number of large eruptions with little apparent warning. In the earlier ground survey, large thermal anomalies were identified at Santiaguito. There is no way of knowing whether satellite monitoring could have detected changes in thermal anomaly patterns related to this recent event, but the position of thermal anomalies on Santiaguito and any changes in their character would be relevant information.

  2. GPU implementation of target and anomaly detection algorithms for remotely sensed hyperspectral image analysis

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2010-08-01

    Automatic target and anomaly detection are considered very important tasks for hyperspectral data exploitation. These techniques are now routinely applied in many application domains, including defence and intelligence, public safety, precision agriculture, geology, and forestry. Many of these applications require timely responses, and the swift decisions they support depend on high-performance algorithm implementations. However, with the recent explosion in the amount and dimensionality of hyperspectral imagery, this problem calls for the incorporation of parallel computing techniques. In the past, clusters of computers have offered an attractive solution for fast anomaly and target detection in hyperspectral data sets already transmitted to Earth. However, these systems are expensive and difficult to adapt to on-board data processing scenarios, in which low-weight and low-power integrated components are essential to reduce mission payload and obtain analysis results in (near) real-time, i.e., at the same time as the data are collected by the sensor. An exciting development in the field of commodity computing is the emergence of commodity graphics processing units (GPUs), which can now bridge the gap towards on-board processing of remotely sensed hyperspectral data. In this paper, we describe several new GPU-based implementations of target and anomaly detection algorithms for hyperspectral data exploitation. The parallel algorithms are implemented on latest-generation Tesla C1060 GPU architectures, and quantitatively evaluated using hyperspectral data collected by NASA's AVIRIS system over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex.

  3. Binding at birth: the newborn brain detects identity relations and sequential position in speech.

    PubMed

    Gervain, Judit; Berent, Iris; Werker, Janet F

    2012-03-01

    Breaking the linguistic code requires the extraction of at least two types of information from the speech signal: the relations between linguistic units and their sequential position. Furthermore, these different types of information need to be integrated into a coherent representation of language structure. The brain networks responsible for these abilities are well known in adults, but not in young infants. Our results show that the neural architecture underlying these abilities is operational at birth. In three optical imaging studies, we found that the newborn brain detects identity relations, as evidenced by enhanced activation in the bilateral superior temporal and left inferior frontal regions. More importantly, the newborn brain can also determine whether such identity relations hold for the initial or final positions of speech sequences, as indicated by increased activity in the inferior frontal regions, possibly Broca's area. This implies that the neural foundations of language acquisition are in place from birth.

  4. Using the sequential regression (SER) algorithm for long-term signal processing. [Intrusion detection

    SciTech Connect

    Soldan, D. L.; Ahmed, N.; Stearns, S. D.

    1980-01-01

    The use of the sequential regression (SER) algorithm (Electron. Lett., 14, 118(1978); 13, 446(1977)) for long-term processing applications is limited by two problems that can occur when an SER predictor has more weights than required to predict the input signal. First, computational difficulties related to updating the autocorrelation matrix inverse could arise, since no unique least-squares solution exists. Second, the predictor strives to remove very low-level components in the input, and hence could implement a gain function that is essentially zero over the entire passband. The predictor would then tend to become a no-pass filter which is undesirable in certain applications, e.g., intrusion detection (SAND--78-1032). Modifications to the SER algorithm that overcome the above problems are presented, which enable its use for long-term signal processing applications. 3 figures.
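    The kind of adaptive linear predictor at issue can be illustrated with a standard recursive least-squares (RLS) one-step predictor; this is a generic RLS sketch, not the SER algorithm itself, and the signal, forgetting factor, and initialization are illustrative. On a predictable narrowband input, the prediction error decays toward zero, which is the behavior the modifications discussed in the abstract aim to keep well-conditioned over long runs.

```python
import numpy as np

def rls_errors(x, order=4, lam=0.999):
    """One-step RLS linear predictor; returns the prediction-error sequence."""
    w = np.zeros(order)
    P = np.eye(order) * 1000.0            # large P(0): weak initial regularization
    errs = np.zeros(len(x))
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]          # most recent samples first
        e = x[n] - w @ u                  # a priori prediction error
        k = P @ u / (lam + u @ P @ u)     # gain vector
        w = w + k * e
        P = (P - np.outer(k, u @ P)) / lam
        errs[n] = e
    return errs

t = np.arange(2000)
x = np.sin(2 * np.pi * t / 50)            # predictable narrowband signal
errs = rls_errors(x)
```

    Note that a sinusoid excites only a two-dimensional subspace of the four regressor directions, so the inverse-correlation estimate `P` grows unchecked in the unexcited directions, which is precisely the excess-weight ill-conditioning the abstract describes.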

  5. Automated determinations of selenium in thermal power plant wastewater by sequential hydride generation and chemiluminescence detection.

    PubMed

    Ezoe, Kentaro; Ohyama, Seiichi; Hashem, Md Abul; Ohira, Shin-Ichi; Toda, Kei

    2016-02-01

    After the Fukushima disaster, power generation from nuclear power plants in Japan was completely stopped and old coal-based power plants were re-commissioned to compensate for the decrease in power generation capacity. Although coal is a relatively inexpensive fuel for power generation, it contains high levels (mg kg(-1)) of selenium, which can contaminate the wastewater from thermal power plants. In this work, an automated selenium monitoring system was developed based on sequential hydride generation and chemiluminescence detection. This method could be applied to the control of wastewater contamination. In this method, selenium is vaporized as H2Se, which reacts with ozone to produce chemiluminescence. However, interference from arsenic is of concern because the ozone-induced chemiluminescence intensity of H2Se is much lower than that of AsH3. This problem was successfully addressed by vaporizing arsenic and selenium individually in a sequential procedure using a syringe pump equipped with an eight-port selection valve and hot and cold reactors. Oxidative decomposition of organoselenium compounds and pre-reduction of the selenium were performed in the hot reactor, and vapor generation of arsenic and selenium was performed separately in the cold reactor. Sample transfers between the reactors were carried out pneumatically by switching three-way solenoid valves. The detection limit for selenium was 0.008 mg L(-1) and the calibration curve was linear up to 1.0 mg L(-1), which provides suitable performance for controlling selenium in wastewater around the allowable limit (0.1 mg L(-1)). The system consumes few chemicals and is stable for more than a month without any maintenance. Wastewater samples from thermal power plants were collected, and data obtained by the proposed method were compared with those from batchwise water treatment followed by hydride generation-atomic fluorescence spectrometry.

  6. Automated determinations of selenium in thermal power plant wastewater by sequential hydride generation and chemiluminescence detection.

    PubMed

    Ezoe, Kentaro; Ohyama, Seiichi; Hashem, Md Abul; Ohira, Shin-Ichi; Toda, Kei

    2016-02-01

After the Fukushima disaster, power generation from nuclear power plants in Japan was completely stopped and old coal-based power plants were re-commissioned to compensate for the decrease in power generation capacity. Although coal is a relatively inexpensive fuel for power generation, it contains high levels (mg kg(-1)) of selenium, which could contaminate the wastewater from thermal power plants. In this work, an automated selenium monitoring system was developed based on sequential hydride generation and chemiluminescence detection. This method could be applied to the control of wastewater contamination. In this method, selenium is vaporized as H2Se, which reacts with ozone to produce chemiluminescence. However, interference from arsenic is of concern because the ozone-induced chemiluminescence intensity of H2Se is much lower than that of AsH3. This problem was successfully addressed by vaporizing arsenic and selenium individually in a sequential procedure using a syringe pump equipped with an eight-port selection valve and hot and cold reactors. Oxidative decomposition of organoselenium compounds and pre-reduction of the selenium were performed in the hot reactor, and vapor generation of arsenic and selenium was performed separately in the cold reactor. Sample transfers between the reactors were carried out by pneumatic air operation, switching with three-way solenoid valves. The detection limit for selenium was 0.008 mg L(-1) and the calibration curve was linear up to 1.0 mg L(-1), which provided suitable performance for controlling selenium in wastewater to around the allowable limit (0.1 mg L(-1)). This system consumes few chemicals and is stable for more than a month without any maintenance. Wastewater samples from thermal power plants were collected, and data obtained by the proposed method were compared with those from batchwise water treatment followed by hydride generation-atomic fluorescence spectrometry. PMID:26653491

  7. Fiber Optic Bragg Grating Sensors for Thermographic Detection of Subsurface Anomalies

    NASA Technical Reports Server (NTRS)

    Allison, Sidney G.; Winfree, William P.; Wu, Meng-Chou

    2009-01-01

    Conventional thermography with an infrared imager has been shown to be an extremely viable technique for nondestructively detecting subsurface anomalies such as thickness variations due to corrosion. A recently developed technique using fiber optic sensors to measure temperature holds potential for performing similar inspections without requiring an infrared imager. The structure is heated using a heat source such as a quartz lamp with fiber Bragg grating (FBG) sensors at the surface of the structure to detect temperature. Investigated structures include a stainless steel plate with thickness variations simulated by small platelets attached to the back side using thermal grease. A relationship is shown between the FBG sensor thermal response and variations in material thickness. For comparison, finite element modeling was performed and found to agree closely with the fiber optic thermography results. This technique shows potential for applications where FBG sensors are already bonded to structures for Integrated Vehicle Health Monitoring (IVHM) strain measurements and can serve dual-use by also performing thermographic detection of subsurface anomalies.

  8. Multiple Kernel Learning for Heterogeneous Anomaly Detection: Algorithm and Aviation Safety Case Study

    NASA Technical Reports Server (NTRS)

    Das, Santanu; Srivastava, Ashok N.; Matthews, Bryan L.; Oza, Nikunj C.

    2010-01-01

The world-wide aviation system is one of the most complex dynamical systems ever developed and is generating data at an extremely rapid rate. Most modern commercial aircraft record several hundred flight parameters including information from the guidance, navigation, and control systems, the avionics and propulsion systems, and the pilot inputs into the aircraft. These parameters may be continuous measurements or binary or categorical measurements recorded at one-second intervals for the duration of the flight. Currently, most approaches to aviation safety are reactive, meaning that they are designed to react to an aviation safety incident or accident. In this paper, we discuss a novel approach based on the theory of multiple kernel learning to detect potential safety anomalies in very large databases of discrete and continuous data from world-wide operations of commercial fleets. We pose a general anomaly detection problem which includes both discrete and continuous data streams, where we assume that the discrete streams have a causal influence on the continuous streams. We also assume that atypical sequences of events in the discrete streams can lead to off-nominal system performance. We discuss the application domain, novel algorithms, and results on real-world data sets. Our algorithm uncovers operationally significant events in high dimensional data streams in the aviation industry which are not detectable using state-of-the-art methods.

  9. Molecular Detection of Human Cytomegalovirus (HCMV) Among Infants with Congenital Anomalies in Khartoum State, Sudan

    PubMed Central

    Ebrahim, Maha G.; Ali, Aisha S.; Mustafa, Mohamed O.; Musa, Dalal F.; El Hussein, Abdel Rahim M.; Elkhidir, Isam M.; Enan, Khalid A.

    2015-01-01

Human Cytomegalovirus (HCMV) infection still represents the most common potentially serious viral complication in humans and is a major cause of congenital anomalies in infants. This study aimed to detect HCMV in infants with congenital anomalies. Study subjects consisted of infants born with neural tube defects, hydrocephalus and microcephaly. Fifty serum specimens (20 males, 30 females) were collected from different hospitals in Khartoum State. The sera were investigated for cytomegalovirus-specific immunoglobulin M (IgM) antibodies using enzyme-linked immunosorbent assay (ELISA), and for cytomegalovirus DNA using polymerase chain reaction (PCR). Of the 50 sera tested, one patient's sample (2%) showed HCMV IgM but no detectable DNA, while another 4 (8.2%) sera were positive for HCMV DNA but had no detectable IgM. Various diagnostic techniques should be considered to evaluate HCMV disease, and routine screening for HCMV should be introduced for pregnant women in this setting. It is vital to initiate further research with larger numbers of samples from different areas to assess the prevalence of HCMV, characterize the virus, and evaluate its maternal health implications. PMID:26862356

  10. Subspace based non-parametric approach for hyperspectral anomaly detection in complex scenarios

    NASA Astrophysics Data System (ADS)

    Matteoli, Stefania; Acito, Nicola; Diani, Marco; Corsini, Giovanni

    2014-10-01

Recent studies on global anomaly detection (AD) in hyperspectral images have focused on non-parametric approaches that seem particularly suitable to detect anomalies in complex backgrounds without the need of assuming any specific model for the background distribution. Among these, AD algorithms based on the kernel density estimator (KDE) benefit from the flexibility provided by KDE, which attempts to estimate the background probability density function (PDF) regardless of its specific form. The high computational burden associated with KDE requires that KDE-based AD algorithms be preceded by a suitable dimensionality reduction (DR) procedure aimed at identifying the subspace where most of the useful signal lies. In most cases, this may lead to a degradation of the detection performance due to the leakage of some anomalous target components into the subspace orthogonal to the one identified by the DR procedure. This work presents a novel subspace-based AD strategy that combines the use of KDE with a simple parametric detector performed on the orthogonal complement of the signal subspace, in order to benefit from the non-parametric nature of KDE and, at the same time, avoid the performance loss that may occur due to the DR procedure. Experimental results indicate that the proposed AD strategy is promising and deserves further investigation.
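
The KDE scoring idea described in this abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' algorithm: it uses a fixed Gaussian kernel, a hand-picked bandwidth, and an arbitrary density threshold on synthetic "spectra".

```python
import numpy as np

def kde_density(background, pixels, h=1.0):
    """Parzen/KDE estimate of each pixel's density under the background
    sample (Gaussian kernel, bandwidth h). Low density = anomalous."""
    diff = pixels[:, None, :] - background[None, :, :]   # (n, m, bands)
    sq = (diff ** 2).sum(axis=2)
    return np.exp(-sq / (2.0 * h ** 2)).mean(axis=1)

rng = np.random.default_rng(0)
bg = rng.normal(0.0, 1.0, size=(200, 3))                 # background spectra
pixels = np.vstack([rng.normal(0.0, 1.0, size=(5, 3)),   # nominal pixels
                    np.full((1, 3), 6.0)])               # injected anomaly
dens = kde_density(bg, pixels)
is_anomaly = dens < 1e-3 * dens.max()                    # simple threshold
```

In practice the quadratic cost of the double loop over pixels and background samples is exactly the computational burden the abstract says motivates the preceding DR step.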

  11. Native fluorescent detection with sequential injection chromatography for doping control analysis

    PubMed Central

    2013-01-01

    Background Sequential injection chromatography (SIC) is a young, ten years old, separation technique. It was proposed with the benefits of reagent-saving, rapid analysis, system miniaturization and simplicity. SIC with UV detection has proven to be efficient mostly for pharmaceutical analysis. In the current study, a stand-alone multi-wavelength fluorescence (FL) detector was coupled to an SIC system. The hyphenation was exploited for developing an SIC-FL method for the separation and quantification of amiloride (AML) and furosemide (FSM) in human urine and tablet formulation. Results AML and FSM were detected using excitation maxima at 380 and 270 nm, respectively, and emission maxima at 413 and 470 nm, respectively. The separation was accomplished in less than 2.0 min into a C18 monolithic column (50 × 4.6 nm) with a mobile phase containing 25 mmol/L phosphate buffer (pH 4.0): acetonitrile: (35:65, v/v). The detection limits were found to be 12 and 470 ng/mL for AML and FSM, respectively. Conclusions The proposed SIC-FL method features satisfactory sensitivity for AML and FSM in urine samples for the minimum required performance limits recommended by the World Anti-Doping Agency, besides a downscaled consumption of reagents and high rapidity for industrial-scale analysis of pharmaceutical preparations. PMID:23985079

  12. Particle Filtering for Model-Based Anomaly Detection in Sensor Networks

    NASA Technical Reports Server (NTRS)

    Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon

    2012-01-01

A novel technique has been developed for anomaly detection of rocket engine test stand (RETS) data. The objective was to develop a system that postprocesses a CSV file containing the sensor readings and activities (time series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective; in general, they have met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are iid, i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploit the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time. However, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data is a mixture of multi-variate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation Maximization (EM) algorithm. The
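
The likelihood-based flagging at the heart of such approaches can be sketched without the full GMM machinery. This illustrative example fits a single multivariate Gaussian (a one-component "mixture") to nominal sensor data and flags stream samples whose log-likelihood falls below a quantile of the nominal scores; the data, dimensions, and threshold are invented for the sketch.

```python
import numpy as np

def fit_gaussian(X, reg=1e-6):
    """Fit a single multivariate Gaussian (a 1-component 'GMM')."""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + reg * np.eye(X.shape[1])
    return mu, cov

def log_likelihood(X, mu, cov):
    """Per-sample Gaussian log-density."""
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    d = X - mu
    maha = np.einsum('ij,jk,ik->i', d, inv, d)   # Mahalanobis distances
    return -0.5 * (maha + logdet + X.shape[1] * np.log(2 * np.pi))

rng = np.random.default_rng(1)
nominal = rng.normal(size=(500, 4))          # nominal multi-sensor readings
mu, cov = fit_gaussian(nominal)
thresh = np.quantile(log_likelihood(nominal, mu, cov), 0.001)

stream = rng.normal(size=(100, 4))
stream[40:45] += 8.0                         # injected sensor fault
flags = log_likelihood(stream, mu, cov) < thresh
```

A real GMM replaces `fit_gaussian` with EM over several components; the flagging logic is unchanged. Note that, as the abstract points out, this treats samples as iid and ignores temporal correlation.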

  13. Feasibility of anomaly detection and characterization using trans-admittance mammography with 60 × 60 electrode array

    NASA Astrophysics Data System (ADS)

    Zhao, Mingkang; Wi, Hun; Lee, Eun Jung; Woo, Eung Je; In Oh, Tong

    2014-10-01

Electrical impedance imaging has the potential to detect an early stage of breast cancer due to higher admittivity values compared with those of normal breast tissues. The tumor size and extent of axillary lymph node involvement are important parameters to evaluate the breast cancer survival rate. Additionally, anomaly characterization is required to distinguish a malignant tumor from a benign tumor. In order to overcome the limitation of breast cancer detection using impedance measurement probes, we developed the high density trans-admittance mammography (TAM) system with a 60 × 60 electrode array and produced trans-admittance maps obtained at several frequency pairs. We applied the anomaly detection algorithm to the high density TAM system to estimate the volume and position of breast tumors. We tested four different sizes of anomaly with three different conductivity contrasts at four different depths. From multifrequency trans-admittance maps, we can readily observe the transversal position and estimate the volume and depth. In particular, the depth estimates were accurate and independent of the size and conductivity contrast when applying the new formula using the Laplacian of the trans-admittance map. The volume estimation was dependent on the conductivity contrast between the anomaly and the background in the breast phantom. We characterized two test anomalies using frequency-difference trans-admittance data to eliminate the dependency on anomaly position and size. We confirmed the anomaly detection and characterization algorithm with the high density TAM system on bovine breast tissue. Both results showed the feasibility of detecting the size and position of an anomaly and of tissue characterization for breast cancer screening.
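
The localization step above rests on the Laplacian of the trans-admittance map. As a generic sketch (not the authors' depth formula), a discrete 5-point Laplacian applied to a synthetic map locates an embedded anomaly as the extremum of the filtered image; the smooth-background-plus-Gaussian-bump map below is purely illustrative.

```python
import numpy as np

def laplacian(m):
    """5-point discrete Laplacian of a 2-D map (interior points only)."""
    return (m[:-2, 1:-1] + m[2:, 1:-1] + m[1:-1, :-2] + m[1:-1, 2:]
            - 4.0 * m[1:-1, 1:-1])

# synthetic trans-admittance map: smooth background + Gaussian bump (anomaly)
y, x = np.mgrid[0:41, 0:41]
tmap = 0.01 * x + np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 18.0)

lap = laplacian(tmap)
row, col = np.unravel_index(np.argmax(-lap), lap.shape)
# (row + 1, col + 1) recovers the bump centre on the original grid
```

The linear background term has zero Laplacian, so the filter responds only to the localized anomaly, which is why a Laplacian-based formula can be insensitive to smooth background variation.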

  14. Sparsity divergence index based on locally linear embedding for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhang, Lili; Zhao, Chunhui

    2016-04-01

Hyperspectral imagery (HSI) has high spectral and spatial resolutions, which are essential for anomaly detection (AD). Many anomaly detectors assume that the spectral signature of HSI pixels can be modeled with a Gaussian distribution, which is not accurate in practice and often leads to many false alarms. Therefore, a sparsity model without any distribution hypothesis is usually employed. Dimensionality reduction (DR) as a preprocessing step for HSI is important. Principal component analysis as a conventional DR method is a linear projection and cannot exploit the nonlinear properties in hyperspectral data, whereas locally linear embedding (LLE), as a local, nonlinear manifold learning algorithm, works well for DR of HSI. A modified sparsity divergence index algorithm based on locally linear embedding (SDI-LLE) is thus proposed. First, kernel collaborative representation detection is adopted to calculate the sparse dictionary matrix of local reconstruction weights in LLE. Then, SDI is obtained in both the spectral and spatial domains, where spatial SDI is computed after DR by LLE. Finally, joint SDI, combining spectral SDI and spatial SDI, is computed, and the optimal SDI is selected for AD. Experimental results demonstrate that the proposed algorithm significantly improves detection performance when compared with its counterparts.
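
The local reconstruction weights that LLE computes can themselves serve as an anomaly score: a pixel that cannot be reconstructed from the affine hull of its nearest neighbours is suspicious. The following is an illustrative sketch of that idea only (not the SDI-LLE algorithm); the synthetic manifold, neighbourhood size, and regularizer are assumptions.

```python
import numpy as np

def lle_residuals(X, k=3, reg=1e-3):
    """LLE-style reconstruction error: distance of each sample from the
    affine hull of its k nearest neighbours (sum-to-one weights)."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    errs = np.empty(n)
    for i in range(n):
        nbr = np.argsort(d2[i])[1:k + 1]        # k nearest neighbours
        Z = X[nbr] - X[i]                       # neighbour offsets
        G = Z @ Z.T                             # local Gram matrix
        w = np.linalg.solve(G + reg * np.trace(G) * np.eye(k), np.ones(k))
        w /= w.sum()                            # enforce sum-to-one weights
        errs[i] = np.sqrt(max(w @ G @ w, 0.0))  # ||x - sum_j w_j x_j||
    return errs

rng = np.random.default_rng(2)
t = rng.uniform(-1, 1, size=(100, 2))           # 2-D manifold coordinates
X = np.column_stack([t[:, 0], t[:, 1], t[:, 0] + t[:, 1],
                     t[:, 0] - t[:, 1], 0.01 * rng.normal(size=100)])
X = np.vstack([X, [0, 0, 0, 0, 3.0]])           # off-manifold anomaly
errs = lle_residuals(X)
```

On-manifold pixels reconstruct almost exactly from their neighbours, while the injected off-manifold point keeps a residual close to its 3.0 offset.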

  15. A MLP neural network as an investigator of TEC time series to detect seismo-ionospheric anomalies

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-06-01

Anomaly detection is extremely important for earthquake parameter estimation. In this paper, an application of Artificial Neural Networks (ANNs) in the earthquake precursors domain has been developed. This study investigates the Total Electron Content (TEC) time series using a Multi-Layer Perceptron (MLP) neural network to detect seismo-ionospheric anomalous variations induced by the powerful Tohoku earthquake of March 11, 2011. The TEC time series dataset covers 120 days at a time resolution of 2 h. The results show that the MLP detects anomalies better than conventional reference methods such as the Auto-Regressive Integrated Moving Average (ARIMA) technique. The TEC anomalies detected using the proposed method are also compared to previous results (Akhoondzadeh, 2012) obtained by applying the mean, median, wavelet and Kalman filter methods to the same case study; the MLP-detected anomalies are similar to those detected using the previous methods. The results indicate that an MLP feed-forward neural network can be a suitable non-parametric method to detect changes in a nonlinear time series such as variations of earthquake precursors.
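
The mean/median reference methods mentioned above reduce to a simple residual test: flag a sample that deviates from a trailing-window statistic by more than k standard deviations. The sketch below is that baseline, not the paper's MLP (which replaces the median with a learned predictor); the window length, k, the noise floor, and the synthetic TEC curve are all assumptions.

```python
import numpy as np

def median_anomalies(x, w=12, k=3.0, floor=0.1):
    """Flag x[i] when it deviates from the trailing-window median by more
    than k * max(window std, floor). 'floor' guards against near-zero std."""
    flags = np.zeros(len(x), dtype=bool)
    for i in range(w, len(x)):
        win = x[i - w:i]
        dev = abs(x[i] - np.median(win))
        flags[i] = dev > k * max(win.std(), floor)
    return flags

rng = np.random.default_rng(3)
t = np.linspace(0.0, np.pi, 200)
tec = np.sin(t) + 0.05 * rng.normal(size=200)   # synthetic diurnal TEC arc
tec[50] += 2.5                                   # injected anomaly
flags = median_anomalies(tec)
```

Using the median rather than the mean keeps the reference robust once the spike itself enters the trailing window.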

  16. System and method for the detection of anomalies in an image

    DOEpatents

    Prasad, Lakshman; Swaminarayan, Sriram

    2013-09-03

    Preferred aspects of the present invention can include receiving a digital image at a processor; segmenting the digital image into a hierarchy of feature layers comprising one or more fine-scale features defining a foreground object embedded in one or more coarser-scale features defining a background to the one or more fine-scale features in the segmentation hierarchy; detecting a first fine-scale foreground feature as an anomaly with respect to a first background feature within which it is embedded; and constructing an anomalous feature layer by synthesizing spatially contiguous anomalous fine-scale features. Additional preferred aspects of the present invention can include detecting non-pervasive changes between sets of images in response at least in part to one or more difference images between the sets of images.

  17. Anomaly Detection Techniques with Real Test Data from a Spinning Turbine Engine-Like Rotor

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Woike, Mark R.; Oza, Nikunj C.; Matthews, Bryan L.

    2012-01-01

Online detection techniques to monitor the health of rotating engine components are becoming increasingly attractive to aircraft engine manufacturers in order to increase safety of operation and lower maintenance costs. Health monitoring remains challenging to implement, especially in the presence of scattered loading conditions, varying crack sizes, component geometries, and material properties. The current trend, however, is to utilize noninvasive health monitoring or nondestructive techniques to detect hidden flaws and mini-cracks before any catastrophic event occurs. These techniques go further to evaluate material discontinuities and other anomalies that have grown to the level of critical defects that can lead to failure. Generally, health monitoring is highly dependent on sensor systems capable of performing in various engine environmental conditions and able to transmit a signal upon a predetermined crack length, while acting in a neutral form upon the overall performance of the engine system.

  18. MedMon: securing medical devices through wireless monitoring and anomaly detection.

    PubMed

    Zhang, Meng; Raghunathan, Anand; Jha, Niraj K

    2013-12-01

    Rapid advances in personal healthcare systems based on implantable and wearable medical devices promise to greatly improve the quality of diagnosis and treatment for a range of medical conditions. However, the increasing programmability and wireless connectivity of medical devices also open up opportunities for malicious attackers. Unfortunately, implantable/wearable medical devices come with extreme size and power constraints, and unique usage models, making it infeasible to simply borrow conventional security solutions such as cryptography. We propose a general framework for securing medical devices based on wireless channel monitoring and anomaly detection. Our proposal is based on a medical security monitor (MedMon) that snoops on all the radio-frequency wireless communications to/from medical devices and uses multi-layered anomaly detection to identify potentially malicious transactions. Upon detection of a malicious transaction, MedMon takes appropriate response actions, which could range from passive (notifying the user) to active (jamming the packets so that they do not reach the medical device). A key benefit of MedMon is that it is applicable to existing medical devices that are in use by patients, with no hardware or software modifications to them. Consequently, it also leads to zero power overheads on these devices. We demonstrate the feasibility of our proposal by developing a prototype implementation for an insulin delivery system using off-the-shelf components (USRP software-defined radio). We evaluate its effectiveness under several attack scenarios. Our results show that MedMon can detect virtually all naive attacks and a large fraction of more sophisticated attacks, suggesting that it is an effective approach to enhancing the security of medical devices. PMID:24473551

  19. Visual detection and sequential injection determination of aluminium using a cinnamoyl derivative.

    PubMed

    Elečková, Lenka; Alexovič, Michal; Kuchár, Juraj; Balogh, Ioseph S; Andruch, Vasil

    2015-02-01

A cinnamoyl derivative, 3-[4-(dimethylamino)cinnamoyl]-4-hydroxy-6-methyl-3,4-2H-pyran-2-one, was used as a ligand for the determination of aluminium. Upon the addition of an acetonitrile solution of the ligand to an aqueous solution containing Al(III) and a buffer solution at pH 8, a marked change in colour from yellow to orange is observed. The colour intensity is proportional to the concentration of Al(III); thus, the 'naked-eye' detection of aluminium is possible. The reaction is also applied for sequential injection determination of aluminium. Beer's law is obeyed in the range from 0.055 to 0.66 mg L(-1) of Al(III). The limit of detection, calculated as three times the standard deviation of the blank test (n=10), was found to be 4 μg L(-1) for Al(III). The method was applied for the determination of aluminium in spiked water samples and pharmaceutical preparations.
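
The 3σ limit-of-detection calculation used above is straightforward to reproduce. The calibration points and blank standard deviation below are hypothetical, chosen only to mirror a linear Beer's-law response and an LOD near the reported 4 μg L(-1).

```python
import numpy as np

# hypothetical absorbance readings across the linear range (conc in mg/L)
conc = np.array([0.0, 0.10, 0.20, 0.40, 0.66])
resp = np.array([0.001, 0.051, 0.101, 0.200, 0.331])

slope, intercept = np.polyfit(conc, resp, 1)   # least-squares calibration

blank_sd = 0.0007                       # std of 10 blank readings (assumed)
lod_mg_per_l = 3.0 * blank_sd / slope   # 3-sigma definition of the LOD
```

With these numbers the fitted slope is about 0.5 absorbance units per mg/L, giving an LOD of roughly 0.004 mg L(-1), i.e. 4 μg L(-1).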

  20. Identification of inorganic improvised explosive devices using sequential injection capillary electrophoresis and contactless conductivity detection.

    PubMed

    Blanco, Gustavo A; Nai, Yi H; Hilder, Emily F; Shellie, Robert A; Dicinoski, Greg W; Haddad, Paul R; Breadmore, Michael C

    2011-12-01

    A simple sequential injection capillary electrophoresis (SI-CE) instrument with capacitively coupled contactless conductivity detection (C(4)D) has been developed for the rapid separation of anions relevant to the identification of inorganic improvised explosive devices (IEDs). Four of the most common explosive tracer ions, nitrate, perchlorate, chlorate, and azide, and the most common background ions, chloride, sulfate, thiocyanate, fluoride, phosphate, and carbonate, were chosen for investigation. Using a separation electrolyte comprising 50 mM tris(hydroxymethyl)aminomethane, 50 mM cyclohexyl-2-aminoethanesulfonic acid, pH 8.9 and 0.05% poly(ethyleneimine) (PEI) in a hexadimethrine bromide (HDMB)-coated capillary it was possible to partially separate all 10 ions within 90 s. The combination of two cationic polymer additives (PEI and HDMB) was necessary to achieve adequate selectivity with a sufficiently stable electroosmotic flow (EOF), which was not possible with only one polymer. Careful optimization of variables affecting the speed of separation and injection timing allowed a further reduction of separation time to 55 s while maintaining adequate efficiency and resolution. Software control makes high sample throughput possible (60 samples/h), with very high repeatability of migration times [0.63-2.07% relative standard deviation (RSD) for 240 injections]. The separation speed does not compromise sensitivity, with limits of detection ranging from 23 to 50 μg·L(-1) for all the explosive residues considered, which is 10× lower than those achieved by indirect absorbance detection and 2× lower than those achieved by C(4)D using portable benchtop instrumentation. The combination of automation, high sample throughput, high confidence of peak identification, and low limits of detection makes this methodology ideal for the rapid identification of inorganic IED residues.

  1. Unsupervised, low latency anomaly detection of algorithmically generated domain names by generative probabilistic modeling.

    PubMed

    Raghuram, Jayaram; Miller, David J; Kesidis, George

    2014-07-01

We propose a method for detecting anomalous domain names, with focus on algorithmically generated domain names which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (or botnets), malware, and phishing. Our method is based on learning a (null hypothesis) probability model from a large set of domain names that have been whitelisted by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, and tend to have a distribution of characters, words, word lengths, and number of words that are typical of some language (mostly English), and often consist of words drawn from a known lexicon. At present, by contrast, algorithmically generated domain names typically have distributions that are quite different from those of human-created domain names. We propose a fully generative model for the probability distribution of benign (whitelisted) domain names which can be used in an anomaly detection setting for identifying putative algorithmically generated domain names. Unlike other methods, our approach can make detections without considering any additional (latency producing) information sources, often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results, relative to several baseline methods, with higher detection rates and low false positive rates. PMID:25685511
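
A toy version of such a generative (null-hypothesis) model is a Laplace-smoothed character-bigram likelihood trained on whitelisted names; names whose per-bigram log-probability falls below a threshold are flagged as likely machine-generated. This is an illustrative stand-in, not the paper's model, and the tiny whitelist is invented.

```python
import math
from collections import Counter

def train_bigram_scorer(names, alpha=1.0, vocab=28):
    """Average per-bigram log-probability under a Laplace-smoothed
    character-bigram model ('^' and '$' mark start/end of a name)."""
    counts, totals = Counter(), Counter()
    for name in names:
        s = "^" + name + "$"
        for a, b in zip(s, s[1:]):
            counts[a, b] += 1
            totals[a] += 1

    def score(name):
        s = "^" + name + "$"
        lp = sum(math.log((counts[a, b] + alpha) / (totals[a] + alpha * vocab))
                 for a, b in zip(s, s[1:]))
        return lp / (len(s) - 1)     # length-normalised log-likelihood

    return score

whitelist = ["google", "amazon", "wikipedia", "github", "python", "reddit",
             "weather", "translate", "maps", "mail", "news", "cloud"]
score = train_bigram_scorer(whitelist)
# pronounceable names score higher than algorithmically generated strings
```

Because scoring needs only the name string itself, such a model adds none of the latency of auxiliary lookups (e.g. DNS or IP reputation queries).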

  2. Detection of submicron scale cracks and other surface anomalies using positron emission tomography

    DOEpatents

    Cowan, Thomas E.; Howell, Richard H.; Colmenares, Carlos A.

    2004-02-17

Detection of submicron scale cracks and other mechanical and chemical surface anomalies using PET. This surface technique has sufficient sensitivity to detect single voids or pits of sub-millimeter size, and single cracks or fissures of millimeter-scale length, micrometer-scale depth, and nanometer-scale width. This technique can also be applied to detect surface regions of differing chemical reactivity. It may be utilized in a scanning or survey mode to simultaneously detect such mechanical or chemical features over large interior or exterior surface areas of parts as large as about 50 cm in diameter. The technique involves exposing a surface to a short-lived radioactive gas for a time period, removing the excess gas to leave a partial monolayer, determining the location and shape of the cracks, voids, porous regions, etc., and calculating their width, depth, and length. Detection of 0.01 mm deep cracks using a 3 mm detector resolution has been accomplished using this technique.

  6. Anomaly Detection in Host Signaling Pathways for the Early Prognosis of Acute Infection.

    PubMed

    Wang, Kun; Langevin, Stanley; O'Hern, Corey S; Shattuck, Mark D; Ogle, Serenity; Forero, Adriana; Morrison, Juliet; Slayden, Richard; Katze, Michael G; Kirby, Michael

    2016-01-01

    Clinical diagnosis of acute infectious diseases during the early stages of infection is critical to administering the appropriate treatment to improve the disease outcome. We present a data-driven analysis of the human cellular response to respiratory viruses including influenza, respiratory syncytial virus, and human rhinovirus, and compare it with the response to the bacterial endotoxin, lipopolysaccharide (LPS). Using an anomaly detection framework, we identified pathways that clearly distinguish between asymptomatic and symptomatic patients infected with the four different respiratory viruses and that accurately diagnosed patients exposed to a bacterial infection. Connectivity pathway analysis comparing the viral and bacterial diagnostic signatures identified host cellular pathways that were unique to patients exposed to LPS endotoxin, indicating that this type of analysis could be used to identify host biomarkers that can differentiate clinical etiologies of acute infection. We applied the Multivariate State Estimation Technique (MSET) to two human influenza (H1N1 and H3N2) gene expression data sets to define host networks perturbed in the asymptomatic phase of infection. Our analysis identified pathways in the respiratory virus diagnostic signature as prognostic biomarkers that triggered prior to clinical presentation of acute symptoms. These early warning pathways correctly predicted that almost half of the subjects would become symptomatic in less than forty hours post-infection and that three of the 18 subjects would become symptomatic after only 8 hours. These results provide a proof of concept for the utility of anomaly detection algorithms to classify host pathway signatures that can identify presymptomatic signatures of acute diseases and differentiate between etiologies of infection. On a global scale, acute respiratory infections cause a significant proportion of human co-morbidities and account for 4.25 million deaths annually.
The development of clinical
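
    MSET's core operation — reconstructing an observed state from a memory matrix of archived "normal" states and flagging large residuals — can be illustrated with a toy sketch. This is a simplified stand-in on synthetic data: true MSET forms the estimate with a nonlinear similarity operator and the pseudo-inverse of the memory matrix's Gram matrix, whereas here a kernel-weighted average plays that role.

```python
import numpy as np

rng = np.random.default_rng(0)

# memory matrix: archived "normal" states of a 2-channel system
t = np.linspace(0, 2 * np.pi, 40)
D = np.column_stack([np.sin(t), np.cos(t)]) + 0.01 * rng.normal(size=(40, 2))

def reconstruct(x, h=0.1):
    # similarity-weighted reconstruction from the memory matrix
    # (a kernel-average stand-in for MSET's similarity operator)
    w = np.exp(-((D - x) ** 2).sum(1) / h)
    return (w[:, None] * D).sum(0) / w.sum()

def residual(x):
    # anomaly score: how far the observation sits from its reconstruction
    return np.linalg.norm(x - reconstruct(x))

r_normal = residual(np.array([np.sin(0.5), np.cos(0.5)]))  # on the normal manifold
r_anom = residual(np.array([2.5, -2.0]))                   # far from any archived state
```

    A threshold on the residual, calibrated on held-out normal data, turns the score into a detector.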

  7. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    /index.php/ and http://earthsystemdatacube.net/. Known anomalies such as the Russian heatwave are detected as well as anomalies which are not detectable with univariate methods.

  8. Characterization of normality of chaotic systems including prediction and detection of anomalies

    NASA Astrophysics Data System (ADS)

    Engler, Joseph John

    Accurate prediction and control pervade domains such as engineering, physics, chemistry, and biology. Often, it is discovered that the systems under consideration cannot be well represented by linear, periodic, or random data. It has been shown that these systems exhibit deterministically chaotic behavior. Deterministic chaos describes systems which are governed by deterministic rules but whose data appear random or quasi-periodic. Deterministically chaotic systems characteristically exhibit sensitive dependence upon initial conditions, manifested through rapid divergence of states initially close to one another. Due to this characterization, it has been deemed impossible to accurately predict future states of these systems for longer time scales. Fortunately, the deterministic nature of these systems allows for accurate short term predictions, provided the dynamics of the system are well understood. This fact has been exploited in the research community and has resulted in various algorithms for short term predictions. Detection of normality in deterministically chaotic systems is critical to understanding the system sufficiently to be able to predict future states. Due to the sensitivity to initial conditions, the detection of normal operational states for a deterministically chaotic system can be challenging. The addition of small perturbations to the system, which may result in bifurcation of the normal states, further complicates the problem. The detection of anomalies and prediction of future states of the chaotic system allows for greater understanding of these systems. The goal of this research is to produce methodologies for determining states of normality for deterministically chaotic systems, detecting anomalous behavior, and predicting future states of the system more accurately. Additionally, the ability to detect subtle system state changes is discussed. The dissertation addresses these goals by proposing new representational
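
    The short-term predictability of deterministically chaotic systems can be demonstrated with a nearest-neighbour ("analog") forecaster on the logistic map. This is a generic illustration of the principle, not the dissertation's method; all data are synthetic.

```python
import numpy as np

# generate a chaotic series from the logistic map x_{n+1} = 4 x (1 - x)
x = np.empty(2000)
x[0] = 0.3
for n in range(1999):
    x[n + 1] = 4.0 * x[n] * (1.0 - x[n])

train, test = x[:1500], x[1500:]

def analog_forecast(history, x0, steps):
    """Nearest-neighbour forecast: at each step, find the closest
    historical state and reuse its observed successor."""
    preds, cur = [], x0
    for _ in range(steps):
        i = int(np.argmin(np.abs(history[:-1] - cur)))
        cur = history[i + 1]
        preds.append(cur)
    return np.array(preds)

pred = analog_forecast(train, test[0], 5)
err_short = abs(pred[0] - test[1])   # one-step error is tiny; errors then
                                     # grow at the Lyapunov rate
```

    The one-step forecast is accurate because nearby states diverge slowly at first; the same divergence makes long-horizon forecasts impossible.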

  9. The Detection of Abundance Anomalies in the Infrared Spectra of Cataclysmic Variables: Shorter Period Systems

    NASA Astrophysics Data System (ADS)

    Harrison, Thomas E.; Osborne, Heather L.; Howell, Steve B.

    2005-05-01

    We present K-band spectra for 12 cataclysmic variables (CVs) with orbital periods under 6 hr. We confidently detect the secondary stars in nine of these systems and may have detected them in the other three. Nine of the 12 CVs clearly have CO first-overtone absorption features that are weaker than they should be for the derived spectral type. We demonstrate that, in general, the weak CO features are due to a carbon deficiency in the secondary star. In the case of U Gem, UU Aql, and TW Vir, the carbon abundance in the secondary star appears to be very low, likely only a few percent of the solar value. Deficits of carbon, when combined with the detection of 13CO and the ultraviolet detections of enhanced levels of nitrogen in other CV systems, imply that material that has been processed through the CNO cycle is finding its way into the photospheres of CV secondary stars. While several plausible models exist to explain unusual levels of CNO species in CV secondary stars, they do not detail how such species as aluminum, magnesium, or silicon (elements that show abundance anomalies in our spectra) will behave. It appears that the standard model for the formation and evolution of CVs needs substantial revision.

  10. Sequential detection and robust estimation of vapor concentration using frequency-agile lidar time series data

    NASA Astrophysics Data System (ADS)

    Warren, Russell E.; Vanderbeek, Richard G.; D'Amico, Francis M.; Ben-David, Avishai

    1999-01-01

    This paper extends an earlier optimal approach for frequency-agile lidar using fixed-size samples of data to include the time series aspect of data collection. The likelihood ratio test methodology for deterministic but unknown vapor concentration is replaced by a Bayesian formalism in which the path integral of vapor concentration CL evolves in time through a random walk model. The fixed-sample maximum likelihood estimates of CL derived earlier are replaced by Kalman filter estimates, and the log-likelihood ratio is generalized to a sequential test statistic written in terms of the Kalman estimates. In addition to the time series aspect, the earlier approach is generalized by (1) including the transmitted energy on a shot-by-shot basis in a statistically optimum manner, (2) adding a linear slope component to the transmitted and received data models, and (3) replacing the nominal multivariate normal statistical assumption by a robust model in the Huber sense for mitigating the effects of occasional data spikes caused by laser misfiring or EMI. The estimation and detection algorithms are compared with fixed-sample processing by the DIAL method on FAL data collected by ERDEC during vapor chamber testing at Dugway, Utah.
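
    The structure described — a random-walk model for the path-integrated concentration CL, a Kalman filter for its estimate, and a sequential test statistic built from the estimates — can be sketched in a scalar toy version. The data, noise levels, and threshold below are synthetic, and the paper's robust Huber weighting and shot-by-shot energy normalisation are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# simulated path-integrated concentration CL: absent, then a random-walk
# plume appearing at `onset`
T, onset = 200, 100
cl = np.zeros(T)
cl[onset:] = 1.0 + np.cumsum(0.05 * rng.normal(size=T - onset))
y = cl + 0.5 * rng.normal(size=T)        # noisy lidar-derived measurements

q, r = 0.05 ** 2, 0.5 ** 2               # process and measurement variances
m, p = 0.0, 1.0                          # Kalman state mean and variance
est = np.empty(T)
llr = np.empty(T)                        # sequential log-likelihood ratio
acc = 0.0
for t in range(T):
    p += q                               # predict (random-walk state model)
    k = p / (p + r)                      # Kalman gain
    m += k * (y[t] - m)                  # measurement update
    p *= 1 - k
    est[t] = m
    # accumulate log N(y; m, r) - log N(y; 0, r): evidence for vapor present
    acc += (y[t] * m - 0.5 * m * m) / r
    llr[t] = acc

detect = int(np.argmax(llr > 30.0))      # first crossing of the threshold
```

    The detector fires shortly after the plume appears while the filter keeps tracking CL; in the paper the same roles are played by the Kalman estimates and the generalized sequential test statistic.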

  11. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO).

    PubMed

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-07-13

    The ability to identify hazardous traffic events is considered one of the most effective solutions for reducing the occurrence of crashes. Previous studies have examined only certain hazardous traffic events, mainly on the basis of dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of prediction accuracy. The conclusions can provide reference evidence for the development of dangerous situation warning products and the design of intelligent vehicles.

  13. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO)

    PubMed Central

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-01-01

    The ability to identify hazardous traffic events is considered one of the most effective solutions for reducing the occurrence of crashes. Previous studies have examined only certain hazardous traffic events, mainly on the basis of dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of prediction accuracy. The conclusions can provide reference evidence for the development of dangerous situation warning products and the design of intelligent vehicles. PMID:27420073

  14. Scalable Algorithms for Unsupervised Classification and Anomaly Detection in Large Geospatiotemporal Data Sets

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Hoffman, F. M.; Kumar, J.

    2015-12-01

    The increasing availability of high-resolution geospatiotemporal datasets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery and mining of ecological data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe some unsupervised knowledge discovery and anomaly detection approaches based on highly scalable parallel algorithms for k-means clustering and singular value decomposition, consider a few practical applications thereof to the analysis of climatic and remotely-sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
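
    The clustering side of this approach — partition the data with k-means, then score each record by its distance to its own centroid — scales down to a compact serial sketch. The data below are synthetic two-cluster points; the paper's contribution is the scalable parallel implementation, not this logic.

```python
import numpy as np

rng = np.random.default_rng(2)

# two dense "ecoregion-like" clusters plus two anomalous records
a = rng.normal([0.0, 0.0], 0.3, size=(300, 2))
b = rng.normal([4.0, 4.0], 0.3, size=(300, 2))
X = np.vstack([a, b, [[2.0, -3.0], [8.0, 1.0]]])

def kmeans(X, k, iters=25):
    # farthest-point initialisation avoids empty clusters on this data
    C = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in C], axis=0)
        C.append(X[int(np.argmax(d))])
    C = np.array(C)
    for _ in range(iters):
        lab = ((X[:, None] - C[None]) ** 2).sum(-1).argmin(1)
        C = np.array([X[lab == j].mean(0) if np.any(lab == j) else C[j]
                      for j in range(k)])
    return C, lab

C, lab = kmeans(X, k=2)
dist = np.sqrt(((X - C[lab]) ** 2).sum(1))     # distance to own centroid
anomalous = np.where(dist > np.quantile(dist, 0.99))[0]
```

    Records far from every cluster centroid (here the two planted outliers, indices 600 and 601) surface in the top percentile of distances.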

  15. Cfetool: A General Purpose Tool for Anomaly Detection in Periodic Data

    SciTech Connect

    Wachsmann, Alf; Cassell, Elizabeth; /UC, Santa Barbara

    2007-03-06

    Cfengine's environment daemon ''cfenvd'' has a limited and fixed set of metrics it measures on a computer. The data is assumed to be periodic in nature, and cfenvd reports any data points that fall too far out of the pattern it has learned from past measurements. This is used to detect ''anomalies'' on computers. We introduce a new standalone tool, ''cfetool'', that allows arbitrary periodic data to be stored and evaluated. The user interface is modeled after rrdtool, another widely used tool for storing measured data. Because a standalone tool can be used for more than computer-related data, we have extended the built-in mathematics to apply to yearly data as well.
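
    A minimal version of this style of periodic anomaly detection — learn a per-phase mean and spread from history, then flag points that fall too far outside the learned band — might look like the following. The hourly data, binning, and threshold are illustrative, not cfetool's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

# four weeks of hourly measurements with a daily cycle plus noise
hours = np.arange(24 * 28)
phase = hours % 24
series = 10 + 5 * np.sin(2 * np.pi * phase / 24) + rng.normal(0, 0.5, len(hours))

# learn a per-phase baseline: mean and spread for each hour of the day
mean = np.array([series[phase == h].mean() for h in range(24)])
std = np.array([series[phase == h].std(ddof=1) for h in range(24)])

def is_anomalous(hour_of_day, value, k=3.0):
    # flag values more than k standard deviations from the learned pattern
    return abs(value - mean[hour_of_day]) > k * std[hour_of_day]

normal_flag = is_anomalous(6, mean[6] + 0.5)   # within the learned band
anomaly_flag = is_anomalous(6, mean[6] + 5.0)  # far outside it
```

    Swapping the 24-hour phase for a day-of-year phase gives the yearly-data variant the abstract mentions.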

  16. Realization and detection of Weyl semimetals and the chiral anomaly in cold atomic systems

    NASA Astrophysics Data System (ADS)

    He, Wen-Yu; Zhang, Shizhong; Law, K. T.

    2016-07-01

    In this work, we describe a method to realize a three-dimensional Weyl semimetal by coupling multilayers of a honeycomb optical lattice in the presence of a pair of Raman lasers. The Raman lasers render each isolated honeycomb layer a Chern insulator. With finite interlayer coupling, the bulk gap of the system closes at certain out-of-plane momenta due to Raman assisted tunneling and results in the Weyl semimetal phase. Using experimentally relevant parameters, we show that both one pair and two pairs of Weyl points can be realized by tuning the interlayer coupling strength. We suggest that Landau-Zener tunneling can be used to detect Weyl points and show that the transition probability increases dramatically when the Weyl point emerges. The realization of the chiral anomaly by using a magnetic-field gradient is also discussed.

  17. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Kumar, Sricharan; Srivastava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
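
    The residual-bootstrap recipe for prediction intervals around a non-parametric fit can be sketched as follows. A Nadaraya-Watson kernel smoother on synthetic data stands in for the regression model; the bandwidth, sample sizes, and bootstrap count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + 0.2 * rng.normal(size=200)

def ksmooth(xq, x, y, h=0.3):
    # Nadaraya-Watson kernel smoother (non-parametric regression)
    w = np.exp(-0.5 * ((np.atleast_1d(xq)[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(1) / w.sum(1)

fit = ksmooth(x, x, y)
resid = y - fit

# residual bootstrap: refit on resampled pseudo-data, add a fresh noise draw
B, xq = 500, 5.0
boot = np.empty(B)
for b in range(B):
    ystar = fit + rng.choice(resid, size=len(resid), replace=True)
    boot[b] = ksmooth(xq, x, ystar)[0] + rng.choice(resid)

lo, hi = np.quantile(boot, [0.025, 0.975])   # ~95% prediction interval at x=5
inside = lo <= np.sin(5.0) <= hi             # the true signal falls inside
outlier_flagged = not (lo <= np.sin(5.0) + 1.5 <= hi)
```

    An observation falling outside the interval is declared anomalous conditioned on the input, exactly the usage described in the abstract.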

  18. Stochastic anomaly detection in eye-tracking data for quantification of motor symptoms in Parkinson's disease

    NASA Astrophysics Data System (ADS)

    Jansson, Daniel; Medvedev, Alexander; Axelson, Hans; Nyholm, Dag

    2013-10-01

    Two methods for distinguishing between healthy controls and patients diagnosed with Parkinson's disease by means of recorded smooth pursuit eye movements are presented and evaluated. Both methods are based on the principles of stochastic anomaly detection and make use of orthogonal series approximation for probability distribution estimation. The first method relies on the identification of a Wiener-type model of the smooth pursuit system and attempts to find statistically significant differences between the estimated parameters in healthy controls and patients with Parkinson's disease. The second method applies the same statistical method to distinguish between the gaze trajectories of healthy and Parkinson subjects attempting to track visual stimuli. Both methods show promising results, where healthy controls and patients with Parkinson's disease are effectively separated in terms of the considered metric. The results are preliminary because of the small number of participating test subjects, but they are indicative of the potential of the presented methods as diagnosing or staging tools for Parkinson's disease.
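
    Orthogonal series approximation of a probability density — the estimation device both methods rely on — works by averaging basis functions over the sample. Below is a sketch with a cosine basis on [0, 1] and synthetic data; truncated series estimates can dip below zero, which practical implementations clip or smooth.

```python
import numpy as np

rng = np.random.default_rng(5)

# "normal" measurements rescaled to [0, 1], concentrated near 0.3
data = np.clip(rng.normal(0.3, 0.08, 1000), 0.0, 1.0)

J = 12                                    # truncation order of the series

def phi(j, x):
    # orthonormal cosine basis on [0, 1]
    return np.ones_like(x) if j == 0 else np.sqrt(2.0) * np.cos(j * np.pi * x)

# series coefficients are simply sample means of the basis functions
coef = np.array([phi(j, data).mean() for j in range(J + 1)])

def density(x):
    x = np.asarray(x, dtype=float)
    return sum(c * phi(j, x) for j, c in enumerate(coef))

score_typical = float(density(np.array([0.3]))[0])   # high estimated density
score_outlier = float(density(np.array([0.9]))[0])   # near-zero density
```

    A new observation landing where the estimated density is low is then treated as a stochastic anomaly.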

  19. Detection, identification and mapping of iron anomalies in brain tissue using X-ray absorption spectroscopy

    SciTech Connect

    Mikhaylova, A.; Davidson, M.; Toastmann, H.; Channell, J.E.T.; Guyodo, Y.; Batich, C.; Dobson, J.

    2008-06-16

    This work describes a novel method for the detection, identification and mapping of anomalous iron compounds in mammalian brain tissue using X-ray absorption spectroscopy. We have located and identified individual iron anomalies in an avian tissue model associated with ferritin, biogenic magnetite and haemoglobin with a pixel resolution of less than 5 µm. This technique represents a breakthrough in the study of both intra- and extra-cellular iron compounds in brain tissue. The potential for high-resolution iron mapping using microfocused X-ray beams has direct application to investigations of the location and structural form of iron compounds associated with human neurodegenerative disorders - a problem which has vexed researchers for 50 years.

  20. Mining Building Energy Management System Data Using Fuzzy Anomaly Detection and Linguistic Descriptions

    SciTech Connect

    Dumidu Wijayasekara; Ondrej Linda; Milos Manic; Craig Rieger

    2014-08-01

    Building Energy Management Systems (BEMSs) are essential components of modern buildings that utilize digital control technologies to minimize energy consumption while maintaining high levels of occupant comfort. However, BEMSs can only achieve these energy savings when properly tuned and controlled. Since the indoor environment depends on uncertain factors such as weather, occupancy, and thermal state, BEMS performance can be sub-optimal at times. Unfortunately, the complexity of the BEMS control mechanism, the large amount of available data, and the inter-relations between the data can make identifying these sub-optimal behaviors difficult. This paper proposes a novel Fuzzy Anomaly Detection and Linguistic Description (Fuzzy-ADLD) based method for improving the understandability of BEMS behavior for improved state-awareness. The presented method is composed of two main parts: 1) detection of anomalous BEMS behavior and 2) linguistic representation of BEMS behavior. The first part utilizes a modified nearest-neighbor clustering algorithm and a fuzzy logic rule extraction technique to build a model of normal BEMS behavior. The second part of the presented method computes the most relevant linguistic description of the identified anomalies. The presented Fuzzy-ADLD method was applied to a real-world BEMS and compared against a traditional alarm-based BEMS. In six different scenarios, the Fuzzy-ADLD method identified anomalous behavior either as fast as or faster (by an hour or more) than the alarm-based BEMS. In addition, the Fuzzy-ADLD method identified cases that were missed by the alarm-based system, demonstrating potential for increased state-awareness of abnormal building behavior.
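
    A one-pass nearest-neighbour clustering model of "normal" behaviour, of the kind the first part of such a method builds on, can be sketched as follows. The two-regime data, features, and radius are synthetic and illustrative, and the fuzzy rule extraction and linguistic description steps are omitted.

```python
import numpy as np

rng = np.random.default_rng(6)

# normal BEMS-like operating states: (temperature, power) pairs in two regimes
normal = np.vstack([rng.normal([21, 40], [0.5, 2], size=(200, 2)),
                    rng.normal([23, 55], [0.5, 2], size=(200, 2))])

def nn_cluster(X, radius):
    """One-pass nearest-neighbour clustering: a point joins the closest
    prototype within `radius`, otherwise it founds a new prototype."""
    protos, counts = [X[0]], [1]
    for x in X[1:]:
        d = [np.linalg.norm(x - p) for p in protos]
        i = int(np.argmin(d))
        if d[i] <= radius:
            n = counts[i]
            protos[i] = (protos[i] * n + x) / (n + 1)   # running mean
            counts[i] += 1
        else:
            protos.append(x)
            counts.append(1)
    return protos, counts

protos, counts = nn_cluster(normal, radius=6.0)

def is_anomalous(x):
    # a new observation far from every learned prototype is anomalous
    return min(np.linalg.norm(x - p) for p in protos) > 6.0

ok = is_anomalous(np.array([21.2, 41.0]))      # near a learned prototype
bad = is_anomalous(np.array([35.0, 120.0]))    # far from any normal state
```

    In the full method, the prototypes would additionally be converted into fuzzy rules so that each detection comes with a linguistic explanation.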

  1. Evaluation of Anomaly Detection Capability for Ground-Based Pre-Launch Shuttle Operations. Chapter 8

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2010-01-01

    This chapter will provide a thorough end-to-end description of the process for evaluation of three different data-driven algorithms for anomaly detection to select the best candidate for deployment as part of a suite of IVHM (Integrated Vehicle Health Management) technologies. These algorithms were deemed sufficiently mature to be considered viable candidates for deployment in support of the maiden launch of Ares I-X, the successor to the Space Shuttle for NASA's Constellation program. Data-driven algorithms are just one of three different types being deployed. The other two types of algorithms being deployed include a "rule-based" expert system and a "model-based" system. Within these two categories, the deployable candidates have already been selected based upon qualitative factors such as flight heritage. For the rule-based system, SHINE (Spacecraft High-speed Inference Engine) has been selected for deployment; it is a component of BEAM (Beacon-based Exception Analysis for Multimissions), a patented technology developed at NASA's JPL (Jet Propulsion Laboratory), and serves to aid in the management and identification of operational modes. For the "model-based" system, a commercially available package developed by QSI (Qualtech Systems, Inc.), TEAMS (Testability Engineering and Maintenance System), has been selected for deployment to aid in diagnosis. In the context of this particular deployment, distinctions among the use of the terms "data-driven," "rule-based," and "model-based" can be found in. Although there are three different categories of algorithms that have been selected for deployment, our main focus in this chapter will be on the evaluation of three candidates for data-driven anomaly detection. These algorithms will be evaluated upon their capability for robustly detecting incipient faults or failures in the ground-based phase of pre-launch space shuttle operations, rather than based on heritage as in previous studies. 
Robust

  2. Detection of Local Anomalies in High Resolution Hyperspectral Imagery Using Geostatistical Filtering and Local Spatial Statistics

    NASA Astrophysics Data System (ADS)

    Goovaerts, P.; Jacquez, G. M.; Marcus, A. W.

    2004-12-01

    finally the computation of a local indicator of spatial autocorrelation to detect local clusters of high or low reflectance values as well as anomalies. The approach is illustrated using one-meter-resolution data collected in Yellowstone National Park. Ground validation data demonstrate the ability of the filtering procedure to reduce the proportion of false alarms, and its robustness under low signal-to-noise ratios. In almost all scenarios, the proposed approach outperforms traditional anomaly detectors (e.g., RXD), and fewer false alarms were obtained when using statistic S2 (average absolute deviation of p-values from 0.5 through all spectral bands) to summarize information across bands. Image degradation through addition of noise or reduction of spectral resolution tends to blur the detection of anomalies, leading to more false alarms, in particular for the identification of the least pure pixels. Results from the tailings site demonstrated that the approach still performs reasonably well for a highly complex landscape with multiple targets of various sizes and shapes. By leveraging both spectral and spatial information, the technique requires little or no input from the user, and hence can be readily automated.
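
    A standard local indicator of spatial autocorrelation is local Moran's I, where a strongly negative value marks a pixel unlike its neighbourhood. The sketch below uses a synthetic single-band image, rook (4-neighbour) weights, and ignores edge pixels; it illustrates the indicator generically rather than reproducing the paper's full geostatistical filtering pipeline.

```python
import numpy as np

rng = np.random.default_rng(7)

# smooth 30x30 "reflectance" field; a low spike inside a bright region
n = 30
g = np.add.outer(np.sin(np.linspace(0, 3, n)), np.cos(np.linspace(0, 3, n)))
img = g + 0.05 * rng.normal(size=(n, n))
img[15, 2] -= 3.0                        # the planted local anomaly

z = (img - img.mean()) / img.std()       # standardised values

# local Moran's I with a rook (4-neighbour) weight scheme
I = np.zeros((n, n))
for i in range(1, n - 1):
    for j in range(1, n - 1):
        nbr = (z[i - 1, j] + z[i + 1, j] + z[i, j - 1] + z[i, j + 1]) / 4.0
        I[i, j] = z[i, j] * nbr

# a strongly negative I marks a value dissimilar to its neighbourhood
flag = np.unravel_index(np.argmin(I), I.shape)
```

    Smooth regions score positive (value and neighbourhood agree), while the planted spike scores strongly negative and is flagged.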

  3. Paternal psychological response after ultrasonographic detection of structural fetal anomalies with a comparison to maternal response: a cohort study

    PubMed Central

    2013-01-01

    Background In Norway almost all pregnant women attend one routine ultrasound examination. Detection of fetal structural anomalies triggers psychological stress responses in the women affected. Despite the frequent use of ultrasound examination in pregnancy, little attention has been devoted to the psychological response of the expectant father following the detection of fetal anomalies. This is important for later fatherhood and the psychological interaction within the couple. We aimed to describe paternal psychological responses shortly after detection of structural fetal anomalies by ultrasonography, and to compare paternal and maternal responses within the same couple. Methods A prospective observational study was performed at a tertiary referral centre for fetal medicine. Pregnant women with a structural fetal anomaly detected by ultrasound and their partners (study group, n=155) and 100 with normal ultrasound findings (comparison group) were included shortly after sonographic examination (inclusion period: May 2006-February 2009). Gestational age was >12 weeks. We used psychometric questionnaires to assess self-reported social dysfunction, health perception, and psychological distress (intrusion, avoidance, arousal, anxiety, and depression): the Impact of Event Scale, General Health Questionnaire, and Edinburgh Postnatal Depression Scale. Fetal anomalies were classified according to severity and diagnostic or prognostic ambiguity at the time of assessment. Results Median (range) gestational age at inclusion in the study and comparison group was 19 (12–38) and 19 (13–22) weeks, respectively. Men and women in the study group had significantly higher levels of psychological distress than men and women in the comparison group on all psychometric endpoints. The lowest level of distress in the study group was associated with the least severe anomalies with no diagnostic or prognostic ambiguity (p < 0.033). Men had lower scores than women on all psychometric

  4. Airborne detection of magnetic anomalies associated with soils on the Oak Ridge Reservation, Tennessee

    SciTech Connect

    Doll, W.E.; Beard, L.P.; Helm, J.M.

    1995-04-01

    Reconnaissance airborne geophysical data acquired over the 35,000-acre Oak Ridge Reservation (ORR), TN, show several magnetic anomalies over undisturbed areas mapped as Copper Ridge Dolomite (CRD). The anomalies of interest are most apparent in magnetic gradient maps, where they exceed 0.06 nT/m and in some cases exceed 0.5 nT/m. Anomalies as large as 25 nT are seen on maps. Some of the anomalies correlate with known or suspected karst, or with apparent conductivity anomalies calculated from electromagnetic data acquired contemporaneously with the magnetic data. Some of the anomalies have a strong correlation with topographic lows or closed depressions. Surface magnetic data have been acquired over some of these sites and have confirmed the existence of the anomalies. Ground inspections in the vicinity of several of the anomalies have not led to any discoveries of man-made surface materials of sufficient size to generate the observed anomalies. One would expect an anomaly of approximately 1 nT for a pickup truck at 200 ft altitude. Typical residual magnetic anomalies have magnitudes of 5--10 nT, and some are as large as 25 nT. The absence of roads or other indications of culture (past or present) near the anomalies and the modeling of anomalies in data acquired with surface instruments indicate that man-made metallic objects are unlikely to be responsible for the anomalies. The authors show that the observed anomalies in the CRD can reasonably be associated with thickening of the soil layer. The occurrence of the anomalies in areas where evidence of karstification is seen would follow because sediment deposition would occur in topographic lows. Linear groups of anomalies on the maps may be associated with fracture zones which were eroded more than adjacent rocks and were subsequently covered with a thicker blanket of sediment. 
This study indicates that airborne magnetic data may be of use in other sites where fracture zones or buried collapse structures are of interest.

  5. Detection of inhomogeneities in precipitation time series in Portugal using direct sequential simulation

    NASA Astrophysics Data System (ADS)

    Ribeiro, Sara; Caineta, Júlio; Costa, Ana Cristina; Henriques, Roberto; Soares, Amílcar

    2016-05-01

    Climate data homogenisation is of major importance in climate change monitoring, validation of weather forecasting, general circulation and regional atmospheric models, modelling of erosion, drought monitoring, among other studies of hydrological and environmental impacts. The reason is that non-climate factors can cause time series discontinuities which may hide the true climatic signal and patterns, thus potentially biasing the conclusions of those studies. In the last two decades, many methods have been developed to identify and remove these inhomogeneities. One of those is based on a geostatistical simulation technique (DSS - direct sequential simulation), where local probability density functions (pdfs) are calculated at candidate monitoring stations using spatial and temporal neighbouring observations, which are then used for the detection of inhomogeneities. Such an approach has previously been applied to detect inhomogeneities in four precipitation series (wet day count) from a network with 66 monitoring stations located in the southern region of Portugal (1980-2001). That study revealed promising results and the potential advantages of geostatistical techniques for inhomogeneity detection in climate time series. This work extends the case study presented before and investigates the application of the geostatistical stochastic approach to ten precipitation series that were previously classified as inhomogeneous by one of six absolute homogeneity tests (Mann-Kendall, Wald-Wolfowitz runs, Von Neumann ratio, Pettitt, Buishand range test, and standard normal homogeneity test (SNHT) for a single break). Moreover, a sensitivity analysis is performed to investigate the number of simulated realisations which should be used to infer the local pdfs with more accuracy. Accordingly, the number of simulations per iteration was increased from 50 to 500, which resulted in a more representative local pdf. As in the previous study, the results are compared with those from the
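
    The detection logic — build a local pdf at the candidate station from its spatial neighbours and flag observations that fall in its tails — can be caricatured without the geostatistical machinery. In the sketch below, resampling neighbour values plus noise stands in for direct sequential simulation (which properly honours the spatial covariance model); all station data and the planted break are synthetic.

```python
import numpy as np

rng = np.random.default_rng(8)

# annual wet-day counts: a regional signal seen by 8 neighbour stations
years, n_nbr = 20, 8
regional = rng.normal(100, 10, size=years)
nbrs = regional[:, None] + rng.normal(0, 5, size=(years, n_nbr))

# candidate series with an artificial break (+25 wet days) from year 12,
# e.g. a relocated rain gauge
cand = regional + rng.normal(0, 5, size=years)
cand[12:] += 25.0

flags = []
for t in range(years):
    # local pdf for year t from simulated realisations built out of the
    # neighbours' values and their spread (a crude stand-in for DSS)
    sims = rng.choice(nbrs[t], size=500) + rng.normal(0, 5, size=500)
    lo, hi = np.quantile(sims, [0.025, 0.975])
    flags.append(not (lo <= cand[t] <= hi))

n_pre, n_post = sum(flags[:12]), sum(flags[12:])   # flags before/after break
```

    Years after the planted break fall outside the local 95% interval almost every time, while homogeneous years rarely do; a run of consecutive flags then localises the breakpoint.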

  6. Inhomogeneities detection in annual precipitation time series in Portugal using direct sequential simulation

    NASA Astrophysics Data System (ADS)

    Caineta, Júlio; Ribeiro, Sara; Costa, Ana Cristina; Henriques, Roberto; Soares, Amílcar

    2014-05-01

    Climate data homogenisation is of major importance in monitoring climate change, the validation of weather forecasting, general circulation and regional atmospheric models, modelling of erosion, drought monitoring, and other studies of hydrological and environmental impacts. This is because non-climate factors can cause time series discontinuities which may hide the true climatic signal and patterns, thus potentially biasing the conclusions of those studies. In the last two decades, many methods have been developed to identify and remove these inhomogeneities. One of these is based on geostatistical simulation (DSS - direct sequential simulation), where local probability density functions (pdfs) are calculated at candidate monitoring stations using spatial and temporal neighbouring observations, and are then used for the detection of inhomogeneities. This approach has previously been applied to detect inhomogeneities in four precipitation series (wet day count) from a network with 66 monitoring stations located in the southern region of Portugal (1980-2001). That study revealed promising results and the potential advantages of geostatistical techniques for inhomogeneity detection in climate time series. This work extends the case study presented before and investigates the application of the geostatistical stochastic approach to ten precipitation series that were previously classified as inhomogeneous by one of six absolute homogeneity tests (Mann-Kendall test, Wald-Wolfowitz runs test, Von Neumann ratio test, standard normal homogeneity test (SNHT) for a single break, Pettitt test, and Buishand range test). Moreover, a sensitivity analysis is implemented to investigate the number of simulated realisations that should be used to accurately infer the local pdfs. Accordingly, the number of simulations per iteration is increased from 50 to 500, which results in a more representative local pdf. A set of default and recommended settings is provided, which will help
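    The detection step described above can be illustrated with a small sketch: given the ensemble of DSS realisations at a candidate station, an observation is flagged when it falls outside a central probability interval of the local pdf. This is a hypothetical simplification only; the `flag_inhomogeneity` helper, the Gaussian test data, and the 95% interval are illustrative assumptions, not the paper's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(42)

def flag_inhomogeneity(observed, simulated, prob=0.95):
    """Flag an observation as potentially inhomogeneous when it falls
    outside the central `prob` interval of the local pdf estimated
    from simulated realisations at the candidate station."""
    lo, hi = np.percentile(simulated, [(1 - prob) / 2 * 100,
                                       (1 + prob) / 2 * 100])
    return bool(observed < lo or observed > hi)

# Local pdf from 500 simulated realisations (more stable than 50)
sims = rng.normal(loc=100.0, scale=10.0, size=500)
print(flag_inhomogeneity(98.0, sims))   # consistent with neighbours
print(flag_inhomogeneity(160.0, sims))  # likely break / inhomogeneity
```

    Increasing the ensemble size mainly stabilises the percentile bounds, which is why the study compares 50 against 500 realisations.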

  7. GNSS reflectometry aboard the International Space Station: phase-altimetry simulation to detect ocean topography anomalies

    NASA Astrophysics Data System (ADS)

    Semmling, Maximilian; Leister, Vera; Saynisch, Jan; Zus, Florian; Wickert, Jens

    2016-04-01

    An ocean altimetry experiment using Earth-reflected GNSS signals has been proposed to the European Space Agency (ESA). It is part of the GNSS Reflectometry Radio Occultation Scatterometry (GEROS) mission that is planned aboard the International Space Station (ISS). Altimetric simulations are presented that examine the detection of ocean topography anomalies assuming GNSS phase delay observations. Such delay measurements are well established for positioning and are possible due to the sufficient synchronization of GNSS receivers and transmitters. For altimetric purposes, delays of Earth-reflected GNSS signals can be observed similarly to radar altimeter signals. The advantage of GNSS is the synchronized separation of transmitter and receiver, which allows a significantly increased number of observations per receiver thanks to the more than 70 GNSS transmitters currently in orbit. The altimetric concept has already been applied successfully to flight data recorded over the Mediterranean Sea. The presented altimetric simulation considers anomalies in the Agulhas current region which are obtained from the Regional Ocean Modeling System (ROMS). Suitable reflection events in an elevation range between 3° and 30° last about 10 min, with ground track lengths >3000 km. Typical along-track footprints (1 s signal integration time) have a length of about 5 km. The reflection's Fresnel zone limits the footprint of coherent observations to a major axis extension between 1 and 6 km, depending on elevation. The altimetric performance depends on the signal-to-noise ratio (SNR) of the reflection. Simulation results show that precision is better than 10 cm for an SNR of 30 dB, whereas it is worse than 0.5 m if the SNR drops to 10 dB. Precision, in general, improves towards higher elevation angles. Critical biases are introduced by atmospheric and ionospheric refraction. Corresponding correction strategies are still under investigation.

  8. Para-GMRF: parallel algorithm for anomaly detection of hyperspectral image

    NASA Astrophysics Data System (ADS)

    Dong, Chao; Zhao, Huijie; Li, Na; Wang, Wei

    2007-12-01

    A hyperspectral imager is capable of simultaneously collecting hundreds of images corresponding to different wavelength channels for the observed area, which makes it possible to discriminate man-made objects from natural background. However, the price paid for this wealth of information is an enormous amount of data, usually hundreds of gigabytes per day. Turning this huge volume of data into useful information and knowledge in real time is critical for geoscientists. In this paper, the proposed parallel Gaussian-Markov random field (Para-GMRF) anomaly detection algorithm is an attempt to apply parallel computing technology to solve the problem. Based on the locality of the GMRF algorithm, we partition the 3-D hyperspectral image cube in the spatial domain and distribute data blocks to multiple computers for concurrent detection. Meanwhile, to achieve load balance, a work pool scheduler is designed for task assignment. The Para-GMRF algorithm is organized in a master-slave architecture, coded in the C programming language using the message passing interface (MPI) library, and tested on a Beowulf cluster. Experimental results show that the Para-GMRF algorithm successfully meets the challenge and can be used in time-sensitive areas, such as environmental monitoring and battlefield reconnaissance.
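    The spatial-partitioning idea can be sketched in a few lines. The sketch below is a loose analogue only: it replaces MPI on a Beowulf cluster with a local thread pool, and the per-block GMRF likelihood test with a simple robust-threshold detector (the `detect_block` helper and the 4-sigma threshold are illustrative assumptions):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(0)

def detect_block(block, k=4.0):
    """Toy per-block detector: flag pixels more than k robust standard
    deviations from the block median (a stand-in for the GMRF test,
    which this sketch does not implement)."""
    med = np.median(block)
    mad = np.median(np.abs(block - med)) + 1e-9
    return np.abs(block - med) > k * 1.4826 * mad

# Hyperspectral band simplified to a single 2-D image; partition it in
# the spatial domain into row blocks, as Para-GMRF partitions the cube.
image = rng.normal(0.0, 1.0, size=(512, 512))
image[100, 200] += 50.0  # implanted anomaly

blocks = np.array_split(image, 8, axis=0)
with ThreadPoolExecutor(max_workers=4) as pool:  # work pool of "slaves"
    results = list(pool.map(detect_block, blocks))
mask = np.vstack(results)
print(bool(mask[100, 200]))
```

    Because each block is processed independently, a dynamic work pool keeps all workers busy even when blocks take different amounts of time, which is the load-balancing argument made in the paper.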

  9. Possibility of detecting triple gluon coupling and Adler-Bell-Jackiw anomaly in polarized deep inelastic scattering

    SciTech Connect

    Lam, C.S.; Li, B.A.

    1980-05-01

    A way to detect experimentally the existence of triple gluon coupling and the Adler-Bell-Jackiw anomaly is to measure the Q²-dependence of polarized deep inelastic scattering. These effects lead to a ln ln Q² term, which we calculate by introducing a new gluon operator in the Wilson expansion.

  10. Rapid detection and classification of airborne time-domain electromagnetic anomalies using weighted multi-linear regression

    NASA Astrophysics Data System (ADS)

    Claprood, Maxime; Chouteau, Michel; Cheng, Li Zhen

    2008-10-01

    We propose a rapid and efficient methodology for the detection and interpretation of airborne time-domain electromagnetic anomalies generated by thin sheet-like volcanogenic massive sulphide (VMS) deposits in a resistive environment, which are representative of VMS deposits in the Canadian Shield.

  11. Insider threat detection enabled by converting user applications into fractal fingerprints and autonomously detecting anomalies

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger M.; Handley, James

    2012-06-01

    We demonstrate insider threat detection for determining when the behavior of a computer user is suspicious or different from his or her normal behavior. This is accomplished by combining features extracted from text, emails, and blogs that are associated with the user. These sources can be characterized using QUEST, DANCER, and MenTat to extract features; however, some of these features are still in text form. We show how to convert these features into numerical form and characterize them using parametric and non-parametric statistics. These features are then used as input into a Random Forest classifier that is trained to recognize whenever the user's behavior is suspicious or different from normal (off-nominal). Active authentication (user identification) is also demonstrated using the features and classifiers derived in this work. We also introduce a novel concept for remotely monitoring user behavior indicator patterns displayed as an infrared overlay on the computer monitor, which the user is unaware of, but a narrow pass-band filtered webcam can clearly distinguish. The results of our analysis are presented.

  12. Anomaly Identification from Super-Low Frequency Electromagnetic Data for the Coalbed Methane Detection

    NASA Astrophysics Data System (ADS)

    Zhao, S. S.; Wang, N.; Hui, J.; Ye, X.; Qin, Q.

    2016-06-01

    Natural-source super-low-frequency (SLF) electromagnetic prospecting methods have become an increasingly promising tool in resource detection. Estimating the capacity of reservoirs is of great importance for evaluating their exploitation potential. In this paper, we built a signal-estimation model for the SLF electromagnetic signal and processed the monitored data with an adaptive filter. A non-normality test showed that the distribution of the signal differs markedly from a Gaussian probability distribution, and that the Class B instantaneous amplitude probability model describes the statistical properties of SLF electromagnetic data well. Estimating the Class B model parameters is complicated because its kernel function is a confluent hypergeometric function. The parameters of the model were therefore estimated from the characteristic spectral function using the Least Squares Gradient Method (LSGM). A simulation of this estimation method was carried out, and its results demonstrated that the LSGM estimates capture important information in the Class B signal model: the Gaussian component was attributed to systematic and random noise, and the intermediate event component to background and human-activity noise. The observation data were then processed with an adaptive noise cancellation filter. With the noise components subtracted out adaptively, the remaining part is the signal of interest, i.e., the anomaly information. It was considered to be relevant to the reservoir position of the coalbed methane stratum.
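    The adaptive noise cancellation step can be illustrated with a minimal LMS filter sketch. The filter order, step size, and synthetic data below are illustrative assumptions; the record does not specify the actual filter design:

```python
import numpy as np

rng = np.random.default_rng(1)

def lms_cancel(primary, reference, order=8, mu=0.01):
    """LMS adaptive noise canceller: subtract the component of `primary`
    that is predictable from `reference`; the residual carries the
    signal (anomaly) of interest."""
    w = np.zeros(order)
    out = np.zeros_like(primary)
    for n in range(order - 1, len(primary)):
        x = reference[n - order + 1:n + 1][::-1]  # most recent samples
        e = primary[n] - w @ x                    # cancellation error
        w += 2 * mu * e * x                       # LMS weight update
        out[n] = e
    return out

t = np.arange(5000)
signal = 0.5 * np.sin(2 * np.pi * 0.001 * t)       # slow signal of interest
src = rng.normal(0.0, 1.0, size=t.size)            # noise reference channel
noise = 0.8 * src + 0.4 * np.roll(src, 1) + 0.2 * np.roll(src, 2)
residual = lms_cancel(signal + noise, src)
# After convergence, the residual variance approaches the signal variance
print(np.var(signal + noise), np.var(residual[2000:]))
```

    The canceller only removes what is correlated with the reference channel, so the uncorrelated anomaly signal survives in the residual.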

  13. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks

    PubMed Central

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-01-01

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens’ quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and at least 26% higher in a scenario with a false positive rate of 15%. PMID:27304957
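    As a rough illustration of the one-class SVM approach the study favours, the sketch below trains scikit-learn's `OneClassSVM` on synthetic "normal" sensor readings and flags injected attack points. The feature values, `nu` setting, and synthetic data are illustrative assumptions, not the Barcelona data set:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(7)

# Normal sensor readings (e.g. two aggregated WSN features)
normal = rng.normal(loc=[20.0, 50.0], scale=[2.0, 5.0], size=(500, 2))
# Attack traffic injected far from the normal operating region
attack = rng.normal(loc=[60.0, 5.0], scale=[2.0, 2.0], size=(25, 2))

# nu upper-bounds the fraction of training points treated as outliers,
# which loosely controls the false positive rate (~5% here)
clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(normal)

pred_attack = clf.predict(attack)        # -1 = anomaly, +1 = normal
pred_normal = clf.predict(normal)
tpr = np.mean(pred_attack == -1)
fpr = np.mean(pred_normal == -1)
print(f"TPR={tpr:.2f}  FPR={fpr:.2f}")
```

    A key practical property, and one reason one-class methods suit this setting, is that training needs only normal traffic; no labelled attacks are required.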

  14. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks.

    PubMed

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-06-13

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens' quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and at least 26% higher in a scenario with a false positive rate of 15%.

  16. An Approach to Detecting Crowd Anomalies for Entrance and Checkpoint Security

    NASA Astrophysics Data System (ADS)

    Zelnio, Holly

    This thesis develops an approach for detecting behavioral anomalies using tracks of pedestrians, including specified threat tracks. The application area is installation security, with a focus on monitoring the entrances of these installations. The approach specifically allows operator interaction to specify threats and to interactively adjust the system parameters depending on the context of the situation. This research has identified physically meaningful features that are developed and organized so that features can be systematically added or deleted depending on the situation and operator preference. The features can be used with standard classifiers such as the one-class support vector machine used in this research. The one-class support vector machine is very stable for this application and provides significant insight into the nature of its decision boundary. Its stability and ease of use stem from a unique automatic tuning approach that is computationally efficient and compares favorably with competing approaches. This automatic tuning approach is believed to be novel and was developed as part of this research. Results are provided using both measured and synthetic data.

  17. Experiments to Detect Clandestine Graves from Interpreted High Resolution Geophysical Anomalies

    NASA Astrophysics Data System (ADS)

    Molina, C. M.; Hernandez, O.; Pringle, J.

    2013-05-01

    This project concerns the search for clandestine sites where missing people may have been buried, based on interpreted near-surface high-resolution geophysical anomalies. Nowadays, there are thousands of missing people around the world who could have been tortured, killed, and buried in clandestine graves. This is a huge problem for their families and for governments, which are responsible for guaranteeing human rights for everybody. These people need to be found and the related crime cases need to be resolved. This work proposes to construct a series of graves where all the conditions of the grave, human remains, and related objects are known. It is expected that contrasting physical properties of the soil will allow the known human remains and objects to be detected. The proposed geophysical methods will include electrical tomography, magnetics, and ground penetrating radar, among others. Two geographical sites with contrasting weather, soil, vegetation, geographic, and geologic conditions will be selected, where standard graves will be located and built. Forward and inverse modeling will be applied to locate and enhance the geophysical response of the known graves and to validate the methodology. As a result, an integrated geophysical program will be provided to support the search for clandestine graves, helping to find missing people who have been illegally buried. Optionally, the methodology will be tested in the search for real clandestine graves.

  18. Sequential strand displacement beacon for detection of DNA coverage on functionalized gold nanoparticles.

    PubMed

    Paliwoda, Rebecca E; Li, Feng; Reid, Michael S; Lin, Yanwen; Le, X Chris

    2014-06-17

    Functionalizing nanomaterials for diverse analytical, biomedical, and therapeutic applications requires determination of surface coverage (or density) of DNA on nanomaterials. We describe a sequential strand displacement beacon assay that is able to quantify specific DNA sequences conjugated or coconjugated onto gold nanoparticles (AuNPs). Unlike the conventional fluorescence assay that requires the target DNA to be fluorescently labeled, the sequential strand displacement beacon method is able to quantify multiple unlabeled DNA oligonucleotides using a single (universal) strand displacement beacon. This unique feature is achieved by introducing two short unlabeled DNA probes for each specific DNA sequence and by performing sequential DNA strand displacement reactions. Varying the relative amounts of the specific DNA sequences and spacing DNA sequences during their coconjugation onto AuNPs results in different densities of the specific DNA on AuNP, ranging from 90 to 230 DNA molecules per AuNP. Results obtained from our sequential strand displacement beacon assay are consistent with those obtained from the conventional fluorescence assays. However, labeling of DNA with some fluorescent dyes, e.g., tetramethylrhodamine, alters DNA density on AuNP. The strand displacement strategy overcomes this problem by obviating direct labeling of the target DNA. This method has broad potential to facilitate more efficient design and characterization of novel multifunctional materials for diverse applications. PMID:24848126

  20. Extraction of oil slicks on the sea surface from optical satellite images by using an anomaly detection technique

    NASA Astrophysics Data System (ADS)

    Chen, Chi-Farn; Chang, Li-Yu

    2010-12-01

    Many methods for the detection of oil pollution on the sea surface from remotely sensed images have been developed in recent years. However, because of the diverse physical properties of oil on the sea surface in the visible wavelengths, such images are easily affected by the surrounding environment. This is a common difficulty encountered when optical satellite images are used as data sources for observing oil slicks on the sea surface. However, provided the spectral interference generated by the surrounding environment can be regarded as noise and properly modeled, the spectral anomalies caused by an oil slick on normal sea water may be observed after the suppression of this noise. In this study, sea surface oil slicks are extracted by detecting spectral anomalies in multispectral optical satellite images. First, assuming that the sea water and oil slick comprise the dominant background and target anomaly, respectively, an RX algorithm is used to enhance the oil slick anomaly. The oil slick can be distinguished from the sea water background after modeling and suppression of inherent noise. Next, a Gaussian mixture model is used to characterize the statistical distributions of the background and anomaly, respectively. The expectation maximization (EM) algorithm is used to obtain the parameters needed for the Gaussian mixture model. Finally, according to the Bayesian decision rule of minimum error, an optimized threshold can be obtained to extract the oil slick areas from the source image. Furthermore, with the obtained Gaussian distributions and optimized threshold, a theoretical false alarm level can be established to evaluate the quality of the extracted oil slicks. Experimental results show that the proposed method can not only successfully detect oil slicks from multispectral optical satellite images, but also provide a quantitative accuracy evaluation of the detected image.
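    The pipeline this abstract describes (RX enhancement, a two-component Gaussian mixture fitted by EM, then a maximum-posterior decision) can be sketched as follows. The synthetic 4-band cube and implanted slick are illustrative assumptions, and scikit-learn's `GaussianMixture.predict` plays the role of the Bayes minimum-error rule on the fitted mixture:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)

def rx_scores(cube):
    """Global RX detector: Mahalanobis distance of each pixel spectrum
    from the scene-wide background mean and covariance."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b)
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mu
    return np.einsum("ij,jk,ik->i", d, cov_inv, d).reshape(h, w)

# Synthetic 4-band scene: sea-water background plus a small slick patch
cube = rng.normal(0.2, 0.02, size=(100, 100, 4))
cube[40:45, 60:70] += 0.15  # spectral anomaly (oil slick)

scores = rx_scores(cube).ravel()
# Two-component mixture (background vs anomaly) fitted by EM; the
# maximum-posterior component label is the Bayes minimum-error decision
gm = GaussianMixture(n_components=2, random_state=0).fit(scores[:, None])
anomaly_comp = np.argmax(gm.means_.ravel())
mask = (gm.predict(scores[:, None]) == anomaly_comp).reshape(100, 100)
print(mask[40:45, 60:70].mean(), mask.mean())
```

    The fitted mixture also supplies what the paper uses for quality control: given both component densities and the decision boundary, a theoretical false alarm level follows directly from the background component's tail mass.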

  1. A Diagnoser Algorithm for Anomaly Detection in DEDS under Partial Unreliable Observations: Characterization and Inclusion in Sensor Configuration Optimization

    SciTech Connect

    Wen-Chiao Lin; Humberto Garcia; Tae-Sic Yoo

    2013-03-01

    Complex engineering systems have to be carefully monitored to meet demanding performance requirements, including detecting anomalies in their operations. There are two major monitoring challenges for these systems. The first challenge is that information collected from the monitored system is often partial and/or unreliable, in the sense that some events that occur may not be reported and/or may be reported incorrectly (e.g., reported as another event). The second is that anomalies often consist of sequences of event patterns separated in space and time. This paper introduces and analyzes a diagnoser algorithm that meets these challenges for detecting and counting occurrences of anomalies in engineering systems. The proposed diagnoser algorithm assumes that models are available for characterizing plant operations (via stochastic automata) and sensors (via probabilistic mappings) used for reporting partial and unreliable information. Methods for analyzing the effects of model uncertainties on the diagnoser performance are also discussed. In order to select configurations that reduce sensor costs, while satisfying diagnoser performance requirements, a sensor configuration selection algorithm developed in previous work is then extended for the proposed diagnoser algorithm. The proposed algorithms and methods are then applied to a multi-unit-operation system, which is derived from an actual facility application. Results show that the proposed diagnoser algorithm is able to detect and count occurrences of anomalies accurately and that its performance is robust to model uncertainties. Furthermore, the sensor configuration selection algorithm is able to suggest optimal sensor configurations with significantly reduced costs, while still yielding acceptable performance for counting the occurrences of anomalies.
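    A toy numeric analogue of the counting problem (not the paper's automata-based diagnoser) is to correct observed event counts through a known probabilistic sensor model, where reports can be misclassified or missed entirely. The confusion matrix `C`, the event types, and the counts below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical sensor model: P(report j | true event i); the last
# column is a missed report. Rows: true events A and B.
C = np.array([[0.85, 0.10, 0.05],   # A reported as A / as B / missed
              [0.05, 0.80, 0.15]])  # B reported as A / as B / missed

true_counts = np.array([400, 100])
# Simulate the unreliable report generated by each true occurrence
reports = np.concatenate([rng.choice(3, size=n, p=C[i])
                          for i, n in enumerate(true_counts)])
observed = np.bincount(reports, minlength=3)[:2]  # missed reports unseen

# Correction: E[observed] = C[:, :2].T @ true, so invert the sensor model
est = np.linalg.solve(C[:, :2].T, observed)
print(observed, est.round())
```

    The estimate recovers the true counts up to sampling noise, which is the intuition behind counting anomaly occurrences accurately despite partial, unreliable observations.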

  2. MODVOLC2: A Hybrid Time Series Analysis for Detecting Thermal Anomalies Applied to Thermal Infrared Satellite Data

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Wright, R.; Pilger, E.

    2009-12-01

    We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT) to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, and it detects pixels that fall outside the envelope of normal thermal behavior, (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than using MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for the MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. We also found
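    Steps (2)-(3) of the algorithm reduce to a per-pixel z-score against monthly reference and variability images. A minimal sketch, assuming Gaussian-behaved synthetic radiances and a hypothetical detection threshold of 4 standard deviations:

```python
import numpy as np

rng = np.random.default_rng(9)

# Stack of cleaned images for one calendar month (>= ~80 advised above)
stack = rng.normal(300.0, 3.0, size=(90, 64, 64))

# Step 2: per-pixel reference and variability images for the month
reference = stack.mean(axis=0)
variability = stack.std(axis=0)

# Step 3: compare a newly acquired image to the normal-behavior envelope
new_image = rng.normal(300.0, 3.0, size=(64, 64))
new_image[10, 20] = 340.0  # thermal anomaly (e.g. active lava)

z = (new_image - reference) / variability
detections = z > 4.0
print(bool(detections[10, 20]), int(detections.sum()))
```

    The per-pixel normalisation is what lets one global threshold work across pixels with very different baseline temperatures and seasonal variability.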

  3. Discrete shearlet transform on GPU with applications in anomaly detection and denoising

    NASA Astrophysics Data System (ADS)

    Gibert, Xavier; Patel, Vishal M.; Labate, Demetrio; Chellappa, Rama

    2014-12-01

    Shearlets have emerged in recent years as one of the most successful methods for the multiscale analysis of multidimensional signals. Unlike wavelets, shearlets form a pyramid of well-localized functions defined not only over a range of scales and locations, but also over a range of orientations and with highly anisotropic supports. As a result, shearlets are much more effective than traditional wavelets in handling the geometry of multidimensional data, and this has been exploited in a wide range of image and signal processing applications. However, despite their desirable properties, the wider applicability of shearlets is limited by the computational complexity of current software implementations. For example, denoising a single 512 × 512 image using a current implementation of the shearlet-based shrinkage algorithm can take between 10 s and 2 min, depending on the number of CPU cores, and much longer processing times are required for video denoising. On the other hand, due to the parallel nature of the shearlet transform, it is possible to use graphics processing units (GPU) to accelerate its implementation. In this paper, we present an open source stand-alone implementation of the 2D discrete shearlet transform using CUDA C++ as well as GPU-accelerated MATLAB implementations of the 2D and 3D shearlet transforms. We have instrumented the code so that we can analyze the running time of each kernel under different GPU hardware. In addition to denoising, we describe a novel application of shearlets for detecting anomalies in textured images. In this application, computation times can be reduced by a factor of 50 or more, compared to multicore CPU implementations.

  4. Detecting low Velocity Anomalies Combining Seismic Reflection With First Arrival Seismic Tomography

    NASA Astrophysics Data System (ADS)

    Flecha, I.; Marti, D.; Carbonell, R.

    2002-12-01

    In the present study, seismic reflection techniques and high-resolution seismic tomography are combined to determine the location and geometry of shallow low-velocity anomalies. Underground cavities (mines), water flows (formations with loose sand), etc., are geologic features characterized by slow seismic velocities and are targets of considerable social interest. Theoretical considerations (Snell's law) suggest that low-velocity anomalies are undersampled, and therefore badly resolved, by ray tracing methods. A series of synthetic simulations has been carried out to assess the resolving power of the different methodologies. A 400 m x 50 m two-dimensional velocity model was built, consisting of a background velocity gradient in depth from 3000 to 4000 m/s and including a rectangular low-velocity anomaly (300 m/s). This anomaly was placed between 10 m and 30 m in depth and between 180 m and 220 m in length. The synthetic data calculation and the tomographic inversion were done with completely independent programs. The data were created using a 2D finite-difference acoustic wave propagation algorithm. The tomographic inversion was performed using two different software packages. The first uses a combination of ray tracing and finite-difference schemes to estimate the forward problem and an iterative conjugate gradient matrix solver to calculate the inverse. The second software package uses a modified Vidale scheme (eikonal equation) to solve the forward problem and LSQR to solve the inverse problem. The synthetic data were used for the inversions and for the generation of a conventional stacked section simulating a high-resolution seismic reflection transect along the velocity model. The conventional stack images the diffractions caused by the velocity anomaly, which provided the location and extent of the low-velocity anomaly. The inversion schemes provided estimates of the velocities; however, the tomograms and the ray tracing diagrams indicated a low resolution for
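    The LSQR-based inversion mentioned above can be sketched with a toy straight-ray traveltime tomography. The grid size, ray geometry, and velocities here are illustrative assumptions; a real implementation traces curved rays or solves the eikonal equation for the forward problem:

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import lsqr

n = 10
cell = 10.0                                  # cell size in metres
slowness = np.full((n, n), 1.0 / 3500.0)     # background velocity ~3500 m/s
slowness[3:5, 4:6] = 1.0 / 300.0             # low-velocity anomaly (300 m/s)

# Path-length matrix: one horizontal ray per row, one vertical per column
A = lil_matrix((2 * n, n * n))
for i in range(n):
    for k in range(n):
        A[i, i * n + k] = cell               # ray along row i
        A[n + i, k * n + i] = cell           # ray along column i
A = A.tocsr()

t = A @ slowness.ravel()                     # synthetic travel times, t = A s

# LSQR returns the minimum-norm least-squares slowness model
est = lsqr(A, t)[0].reshape(n, n)
# Rays crossing the anomaly raise the average slowness in their rows
print(est[3].mean() > est[0].mean())
```

    The sketch also shows why low-velocity bodies are hard targets: with sparse ray coverage the anomaly's slowness is smeared along whole ray paths rather than localized, mirroring the low resolution reported in the abstract.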

  5. DNA sequencing by a single molecule detection of labeled nucleotides sequentially cleaved from a single strand of DNA

    SciTech Connect

    Goodwin, P.M.; Schecker, J.A.; Wilkerson, C.W.; Hammond, M.L.; Ambrose, W.P.; Jett, J.H.; Martin, J.C.; Marrone, B.L.; Keller, R.A.; Haces, A.; Shih, P.J.; Harding, J.D.

    1993-01-01

    We are developing a laser-based technique for the rapid sequencing of large DNA fragments (several kb in size) at a rate of 100 to 1000 bases per second. Our approach relies on fluorescent labeling of the bases in a single fragment of DNA, attachment of this labeled DNA fragment to a support, movement of the supported DNA into a flowing sample stream, sequential cleavage of the end nucleotide from the DNA fragment with an exonuclease, and detection of the individual fluorescently labeled bases by laser-induced fluorescence.

  7. Finding Cardinality Heavy-Hitters in Massive Traffic Data and Its Application to Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Ishibashi, Keisuke; Mori, Tatsuya; Kawahara, Ryoichi; Hirokawa, Yutaka; Kobayashi, Atsushi; Yamamoto, Kimihiro; Sakamoto, Hitoaki; Asano, Shoichiro

    introduce an application of our algorithm to anomaly detection. With actual traffic data, our method could successfully detect a sudden network scan.

  8. Digital speckle pattern interferometry based anomaly detection in breast mimicking phantoms: a pilot study

    NASA Astrophysics Data System (ADS)

    Udayakumar, K.; Sujatha, N.; Ganesan, A. R.

    2015-03-01

    Early screening for subsurface anomalies in the breast can improve the patient survival rate. Clinically approved breast screening modalities may involve ionizing radiation, cause pain or discomfort, require body contact, or incur increased cost. In this paper, non-invasive, whole-field Digital Speckle Pattern Interferometry (DSPI) is used to study normal and abnormal breast-mimicking tissue phantoms. While uniform fringes were obtained for a normal phantom in the out-of-plane speckle pattern interferometry configuration, the non-uniformity in the observed fringes clearly showed the anomaly location in the abnormal phantom. The results are compared with deformation profiles from finite element analysis of the sample under similar loading conditions.

  9. An earthquake from space: detection of precursory magnetic anomalies from Swarm satellites before the 2015 M8 Nepal Earthquake

    NASA Astrophysics Data System (ADS)

    De Santis, A.; Balasis, G.; Pavón-Carrasco, F. J.; Cianchini, G.; Mandea, M.

    2015-12-01

    A large earthquake of around magnitude 8 occurred on 25 April 2015, 06:26 UTC, with its epicenter in Nepal, causing more than 9000 fatalities and devastating destruction. The three ESA Swarm satellites, orbiting in the topside ionosphere at the time, make it possible to look for pre-earthquake magnetic anomalous signals, likely due to some lithosphere-atmosphere-ionosphere (LAI) coupling. First, a wavelet analysis was performed for the day of the earthquake (from the external magnetic point of view, an exceptionally quiet day), with the result that an anomalous and persistent ULF signal (from around 3 to 6 UTC) is clearly detected before the earthquake. After this single-spot analysis, we performed a more extensive analysis over the two months around the earthquake occurrence, to confirm or refute the cause-effect relationship. From the series of magnetic anomalies detected by the Swarm satellites (during night-time and magnetically quiet times), we show that the cumulative number of anomalies follows the same typical power-law behavior of a critical system approaching its critical time, in our case the large seismic event of 25 April 2015, and then recovers in the typical recovery phase after a large earthquake. The impressive similarity of this behavior to the analogous behavior found in seismic data analysis provides strong support for the lithospheric origin of the satellite magnetic anomalies, as due to LAI coupling during the preparation phase of the Nepal earthquake.

  10. Anomaly detection in radiographic images of composite materials via crosshatch regression

    NASA Astrophysics Data System (ADS)

    Lockard, Colin D.

    The development and testing of new composite materials is an important area of research supporting advances in aerospace engineering. Understanding the properties of these materials requires the analysis of material samples to identify damage. Given the significant time and effort required from human experts to analyze computed tomography (CT) scans related to the non-destructive evaluation of carbon fiber materials, it is advantageous to develop an automated system for identifying anomalies in these images. This thesis introduces a regression-based algorithm for identifying anomalies in grayscale images, with a particular focus on its application for the analysis of CT scan images of carbon fiber. The algorithm centers around a "crosshatch regression" approach in which each two-dimensional image is divided into a series of one-dimensional signals, each representing a single line of pixels. A robust multiple linear regression model is fitted to each signal and outliers are identified. Smoothing and quality control techniques help better define anomaly boundaries and remove noise, and multiple crosshatch regression runs are combined to generate the final result. A ground truth set was created and the algorithm was run against these images for testing. The experimental results support the efficacy of the technique, locating 92% of anomalies with an average recall of 88%, precision of 78%, and root mean square deviation of 11.2 pixels.
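The crosshatch idea above — slicing the image into row and column signals, fitting a robust line to each, and flagging residual outliers — can be sketched as follows. This is a hypothetical simplification: the thesis uses robust multiple linear regression plus smoothing and quality-control steps, whereas here an ordinary least-squares fit with a MAD-based outlier rule stands in for the robust fit, and agreement of the row and column passes stands in for combining multiple crosshatch runs.

```python
import numpy as np

def robust_line_outliers(signal, threshold=3.0):
    """Fit a line to a 1-D pixel signal and flag residual outliers.

    Simplified stand-in for the robust regression step: ordinary
    least squares, then flag points whose residuals exceed
    `threshold` times a MAD-based scale estimate.
    """
    x = np.arange(len(signal), dtype=float)
    coeffs = np.polyfit(x, signal, deg=1)      # linear trend fit
    residuals = signal - np.polyval(coeffs, x)
    # Robust scale from the median absolute deviation (MAD).
    mad = np.median(np.abs(residuals - np.median(residuals)))
    scale = 1.4826 * mad if mad > 0 else 1e-12
    return np.abs(residuals) > threshold * scale

def crosshatch_anomaly_map(image, threshold=3.0):
    """Combine row-wise and column-wise outlier flags (the 'crosshatch')."""
    rows = np.array([robust_line_outliers(r, threshold) for r in image])
    cols = np.array([robust_line_outliers(c, threshold) for c in image.T]).T
    return rows & cols  # anomaly where both passes agree
```

On a noisy gradient image with a small bright defect, the defect pixels stand out in both the row and column passes while the smooth background fits the line model.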

  11. A reversible fluorescence "off-on-off" sensor for sequential detection of aluminum and acetate/fluoride ions.

    PubMed

    Gupta, Vinod Kumar; Mergu, Naveen; Kumawat, Lokesh Kumar; Singh, Ashok Kumar

    2015-11-01

    A new rhodamine-functionalized fluorogenic Schiff base, CS, was synthesized and its colorimetric and fluorescence responses toward various metal ions were explored. The sensor exhibited a highly selective and sensitive colorimetric and "off-on" fluorescence response towards Al(3+) in the presence of other competing metal ions. These spectral changes are large enough in the visible region of the spectrum to enable naked-eye detection. Studies proved that the formation of the CS-Al(3+) complex is fully reversible and that the complex can sense AcO(-)/F(-) via dissociation. The results revealed that the sensor provides a fluorescence "off-on-off" strategy for the sequential detection of Al(3+) and AcO(-)/F(-). PMID:26452794

  12. Millimeter Wave Detection of Localized Anomalies in the Space Shuttle External Fuel Tank Insulating Foam and Acreage Heat Tiles

    NASA Technical Reports Server (NTRS)

    Kharkovsky, S.; Case, J. T.; Zoughi, R.; Hepburn, F.

    2005-01-01

    The Space Shuttle Columbia's catastrophic accident emphasizes the growing need for developing and applying effective, robust and life-cycle-oriented nondestructive testing (NDT) methods for inspecting the shuttle external fuel tank spray-on foam insulation (SOFI) and its protective acreage heat tiles. Millimeter-wave NDT techniques were among the methods chosen for evaluation of their potential for inspecting these structures. Several panels with embedded anomalies (mainly voids) were produced and tested for this purpose. Near-field and far-field millimeter-wave NDT methods were used to produce millimeter-wave images of the anomalies in the SOFI panels and heat tiles. This paper presents the results of an investigation aimed at detecting localized anomalies in two SOFI panels and a set of heat tiles. To this end, reflectometers over a relatively wide range of frequencies (Ka-band (26.5-40 GHz) to W-band (75-110 GHz)) and utilizing different types of radiators were employed. The results clearly illustrate the utility of these methods for this purpose.

  13. Detection of Characteristic Precipitation Anomaly Patterns of El Nino / La Nina in Time- variable Gravity Fields by GRACE

    NASA Astrophysics Data System (ADS)

    Heki, K.; Morishita, Y.

    2007-12-01

    GRACE (Gravity Recovery and Climate Experiment) satellites, launched in March 2002, have been mapping monthly gravity fields of the Earth, allowing us to infer changes in surface mass, e.g. water and ice. Past findings include the ice mass loss in southern Greenland (Luthcke et al., 2006) and its acceleration in 2004 (Velicogna and Wahr, 2006), crustal dilatation by the 2004 Sumatra Earthquake (Han et al., 2006) and the postseismic movement of water in the mantle (Ogawa and Heki, 2007). ENSO (El Nino-Southern Oscillation) brings about global climate impacts, together with its opposite phenomenon, La Nina. Ropelewski and Halpert (1987) showed typical precipitation patterns in ENSO years; characteristic regional-scale precipitation anomalies occur in India, tropical and southern Africa and South America. Nearly opposite precipitation anomalies are shown to occur in La Nina years (Ropelewski and Halpert, 1988). Here we report the detection of such precipitation anomaly patterns in the GRACE monthly gravity data 2002 - 2007, which includes both La Nina (2005 fall - 2006 spring) and El Nino (2006 fall - 2007 spring) periods. We modeled the worldwide gravity time series with constant trends and seasonal changes, and extracted deviations of gravity values at two time epochs, i.e. February 2006 and 2007, and converted them into changes in equivalent surface water mass. East Africa showed a negative gravity deviation (-20.5 cm in water) in February 2006 (La Nina), which reversed to positive (18.7 cm) in February 2007 (El Nino). Northern and southern parts of South America also showed similar see-saw patterns. Such patterns closely resemble those found meteorologically (Ropelewski and Halpert, 1987; 1988), suggesting the potential of GRACE as a sensor of inter-annual precipitation anomalies through changes in continental water storage.
We performed numerical simulations of soil moisture changes at grid points in land area incorporating the CMAP precipitation data, NCEP
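The modeling step described above — fitting each gravity time series with a constant trend plus seasonal changes and extracting the deviations at given epochs — can be sketched with a simple least-squares harmonic fit. The function below is an illustrative assumption (offset, linear trend, annual and semiannual sinusoids); the study's actual GRACE processing works on spherical-harmonic fields and is more involved.

```python
import numpy as np

def seasonal_trend_deviation(t_years, series):
    """Fit offset + linear trend + annual and semiannual harmonics
    by least squares and return the residual (deviation) series.
    """
    w = 2.0 * np.pi  # one cycle per year, t in years
    A = np.column_stack([
        np.ones_like(t_years), t_years,          # offset and trend
        np.cos(w * t_years), np.sin(w * t_years),        # annual
        np.cos(2 * w * t_years), np.sin(2 * w * t_years),  # semiannual
    ])
    coeffs, *_ = np.linalg.lstsq(A, series, rcond=None)
    return series - A @ coeffs  # deviations from the fitted model
```

An anomalous month then shows up as a large residual while the regular trend and seasonal cycle are absorbed by the fit.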

  14. A Hybrid Positive-and-Negative Curvature Approach for Detection of the Edges of Magnetic Anomalies, and Its Application in the South China Sea

    NASA Astrophysics Data System (ADS)

    Guo, Lianghui; Gao, Rui; Meng, Xiaohong; Zhang, Guoli

    2015-10-01

    In the work discussed in this paper, the characteristics of both the most positive and most negative curvatures of a magnetic anomaly were analyzed, and a new approach for detecting the edges of magnetic anomalies is proposed. The new approach, called the hybrid positive-and-negative curvature approach, combines the most positive and most negative curvatures into one curvature by formula adjustments and weighted summation, combining the advantages of the two curvatures to improve edge detection. The approach is suitable for vertically magnetized or reduced-to-pole anomalies, which avoids the complexity of magnetic anomalies caused by oblique magnetization. Testing on synthetic vertically magnetized magnetic anomaly data demonstrated that the hybrid approach traces the edges of magnetic source bodies effectively, discriminates between high and low magnetism intuitively, and outperforms approaches based solely on the most positive or most negative curvature. Testing on reduced-to-pole magnetic anomaly data around the ocean basin of the South China Sea showed that the hybrid approach enables better edge detection than the most positive or most negative curvature alone. On the basis of the features of the reduced-to-pole magnetic anomalies and their hybrid curvature, we suggest that the tectonic boundary between the southwestern subbasin and the eastern subbasin of the South China Sea extends southeastward from the northeastern edge of the Zhongsha Islands to the northeastern edge of the Reed Bank.
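Curvature-based edge detection of this kind can be illustrated on a gridded anomaly. The sketch below computes the most positive and most negative curvatures as the eigenvalues of the 2-D Hessian of the field (a standard construction for curvature attributes) and combines them by a plain weighted sum; the paper's actual adjustment and weighting formulas are not reproduced here, so the combination is an assumption.

```python
import numpy as np

def hybrid_curvature(field, w=0.5):
    """Weighted combination of most positive / most negative curvature
    of a gridded anomaly (Hessian-eigenvalue construction)."""
    gy, gx = np.gradient(field)          # first derivatives (rows=y)
    gxy, gxx = np.gradient(gx)           # d(gx)/dy, d(gx)/dx
    gyy, _ = np.gradient(gy)             # d(gy)/dy
    mean = 0.5 * (gxx + gyy)
    root = np.sqrt((0.5 * (gxx - gyy)) ** 2 + gxy ** 2)
    k_pos = mean + root                  # most positive curvature
    k_neg = mean - root                  # most negative curvature
    # Simple weighted sum of the two curvature responses (assumed form).
    return w * k_pos + (1.0 - w) * (-k_neg)
```

Over a block-shaped source the combined response vanishes in the flat interior and peaks along the edges, which is the behavior edge detection relies on.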

  15. Interpretation of Magnetic Anomalies in Salihli (Turkey) Geothermal Area Using 3-D Inversion and Edge Detection Techniques

    NASA Astrophysics Data System (ADS)

    Timur, Emre

    2016-04-01

    There are numerous geophysical methods used to investigate geothermal areas. The major purpose of this magnetic survey is to locate the boundaries of the active hydrothermal system in the south of the Gediz Graben in Salihli (Manisa/Turkey). The presence of the hydrothermal system had already been inferred from surface evidence of hydrothermal activity and from drillings. Firstly, 3-D prismatic models were investigated theoretically, and edge detection methods were utilized with an iterative inversion method to define the boundaries and the parameters of the structure. In the first step of the application, it was necessary to convert the total field anomaly into a pseudo-gravity anomaly map. Then the geometric boundaries of the structures were determined by applying MATLAB-based software with 3 different edge detection algorithms. The exact locations of the structures were obtained by using these boundary coordinates as initial geometric parameters in the inversion process. In addition to these methods, reduction-to-pole and horizontal gradient methods were applied to the data to obtain more information about the location and shape of the possible reservoir. As a result, the edge detection methods were found to be successful, both on field and on theoretical data sets, for delineating the boundaries of the possible geothermal reservoir structure. The depth of the geothermal reservoir was determined as 2.4 km from 3-D inversion and 2.1 km from power spectrum methods.

  16. Sequential detection and concentration estimation of chemical vapors using range-resolved lidar with frequency-agile lasers

    NASA Astrophysics Data System (ADS)

    Warren, Russell E.; Vanderbeek, Richard G.; D'Amico, Francis M.

    2000-07-01

    This paper extends our earlier work in developing statistically optimal algorithms for estimating the range- dependent concentration of multiple vapor materials using multiwavelength frequency-agile lidar with a fixed set of wavelength bursts to the case of a time series processor that recursively updates the estimates as new data become available. The concentration estimates are used to detect the presence of one or more vapor materials by a sequential approach that accumulates likelihood in time for each range cell. A Bayesian methodology is used to construct the concentration estimates with a prior concentration smoothness constraint chosen to produce numerically stable results at longer ranges having weak signal return. The approach is illustrated on synthetic and actual field test data collected by SBCCOM.
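The per-range-cell sequential detection described above — accumulating likelihood in time until a decision can be made — is in the spirit of Wald's sequential probability ratio test (SPRT). The sketch below is a generic Gaussian SPRT, not the authors' actual processor; the means mu0/mu1, the noise sigma, and the error rates alpha/beta are illustrative assumptions.

```python
import math

def sprt_update(llr, x, mu0=0.0, mu1=1.0, sigma=1.0,
                alpha=0.01, beta=0.01):
    """One SPRT step for a single range cell.

    Accumulates the Gaussian log-likelihood ratio of 'vapor present'
    (mean mu1) vs 'absent' (mean mu0) and compares it to the Wald
    thresholds. Returns (updated llr, decision) where decision is
    'present', 'absent', or None (keep sampling).
    """
    llr += (x - 0.5 * (mu0 + mu1)) * (mu1 - mu0) / sigma ** 2
    upper = math.log((1 - beta) / alpha)   # accept 'present'
    lower = math.log(beta / (1 - alpha))   # accept 'absent'
    if llr >= upper:
        return llr, "present"
    if llr <= lower:
        return llr, "absent"
    return llr, None
```

Each range cell carries its own accumulated log-likelihood ratio, so strong returns decide quickly while weak ones simply keep integrating evidence.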

  17. Network Event Recording Device: An automated system for Network anomaly detection, and notification. Draft

    SciTech Connect

    Simmons, D.G.; Wilkins, R.

    1994-09-01

    The goal of the Network Event Recording Device (NERD) is to provide a flexible autonomous system for network logging and notification when significant network anomalies occur. The NERD is also charged with increasing the efficiency and effectiveness of currently implemented network security procedures. While it has always been possible for network and security managers to review log files for evidence of network irregularities, the NERD provides real-time display of network activity, as well as constant monitoring and notification services for managers. Similarly, real-time display and notification of possible security breaches will provide improved effectiveness in combating resource infiltration from both inside and outside the immediate network environment.

  18. Quantitative Integration of Multiple Geophysical Techniques for Reducing Uncertainty in Discrete Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Carr, M. C.; Baker, G. S.; Herrmann, N.; Yerka, S.; Angst, M.

    2008-12-01

    The objectives of this project are to (1) utilize quantitative integration of multiple geophysical techniques, (2) determine geophysical anomalies that may indicate locations of various archaeological structures, and (3) develop techniques of quantifying causes of uncertainty. Two sites are used to satisfy these objectives. The first, representing a site with unknown target features, is an archaeological site on the Tennessee River floodplain. The area is divided into 437 (20 x 20 m) plots with 0.5 m spacing where magnetic gradiometry profiles were collected in a zig-zag pattern, resulting in 350 km of line data. Once anomalies are identified in the magnetics data, potential excavation sites for archeological features are determined and other geophysical techniques are utilized to gain confidence in choosing which anomalies to excavate. Several grids are resurveyed using Ground Penetrating Radar (GPR) and EM-31 with a 0.25 m spacing in a grid pattern. A quantitative method of integrating data into one comprehensive set is developed, enhancing interpretation because each geophysical technique utilized within this study produced a unique response to noise and the targets. Spatial visualization software is used to interpolate irregularly spaced XYZ data into a regularly spaced grid and display the geophysical data in 3D representations. Once all data are exported from each individual instrument, grid files are created for quantitative merging of the data and to create grid-based maps including contour, image, shaded relief, and surface maps. Statistics were calculated from anomaly classification in the data and excavated features present. To study this methodology in a more controlled setting, a second site is used. This site is analogous to the first in that it is along the Tennessee River floodplain on the same bedrock units. However, this analog site contains known targets (previously buried and accurately located) including size, shape, and orientation. Four

  19. Two sequential processes of change detection in hierarchically ordered areas of the human auditory cortex.

    PubMed

    Recasens, Marc; Grimm, Sabine; Capilla, Almudena; Nowak, Rafal; Escera, Carles

    2014-01-01

    Auditory deviance detection occurs around 150 ms after the onset of a deviant sound. Recent studies in animals and humans have described change-related processes occurring during the first 50 ms after sound onset. However, it still remains an open question whether these early and late processes of deviance detection are organized hierarchically in the human auditory cortex. We applied a beamforming source reconstruction approach in order to estimate brain sources associated with 2 temporally distinct markers of deviance detection. Results showed that rare frequency changes elicit an enhancement of the Nbm component of the middle latency response (MLR) peaking at 43 ms, in addition to the magnetic mismatch negativity (MMNm) peaking at 115 ms. Sources of MMNm, located in the right superior temporal gyrus, were lateral and posterior to the deviance-related MLR activity being generated in the right primary auditory cortex. Source reconstruction analyses revealed that detection of changes in the acoustic environment is a process accomplished in 2 different time ranges, by spatially separated auditory regions. Paralleling animal studies, our findings suggest that primary and secondary areas are involved in successive stages of deviance detection and support the existence of a hierarchical network devoted to auditory change detection.

  20. Structural features of sequential weak measurements

    NASA Astrophysics Data System (ADS)

    Diósi, Lajos

    2016-07-01

    We discuss the abstract structure of sequential weak measurement (WM) of general observables. In all orders, the sequential WM correlations without postselection yield the corresponding correlations of the Wigner function, offering direct quantum tomography through the moments of the canonical variables. Correlations in spin-1/2 sequential weak measurements coincide with those in strong measurements; they are constrained kinematically, and they are equivalent to single measurements. In sequential WMs with postselection, an anomaly occurs, different from the weak value anomaly of single WMs. In particular, the spread of polarization σ̂ as measured in double WMs of σ̂ will diverge for certain orthogonal pre- and postselected states.

  1. Response-time evidence for mixed memory states in a sequential-presentation change-detection task.

    PubMed

    Nosofsky, Robert M; Donkin, Chris

    2016-02-01

    Response-time (RT) and choice-probability data were obtained in a rapid visual sequential-presentation change-detection task in which memory set size, study-test lag, and objective change probabilities were manipulated. False "change" judgments increased dramatically with increasing lag, consistent with the idea that study items with long lags were ejected from a discrete-slots buffer. Error RTs were nearly invariant with set size and lag, consistent with the idea that the errors were produced by a stimulus-independent guessing process. The patterns of error and RT data could not be explained in terms of encoding limitations, but were consistent with the hypothesis that long retention lags produced a zero-stimulus-information state that required guessing. Formal modeling of the change-detection RT and error data pointed toward a hybrid model of visual working memory. The hybrid model assumed mixed states involving a combination of memory and guessing, but with higher memory resolution for items with shorter retention lags. The work raises new questions concerning the nature of the memory representations that are produced across the closely related tasks of change detection and visual memory search.

  2. Decreased Perifoveal Sensitivity Detected by Microperimetry in Patients Using Hydroxychloroquine and without Visual Field and Fundoscopic Anomalies

    PubMed Central

    Molina-Martín, A.; Piñero, D. P.; Pérez-Cambrodí, R. J.

    2015-01-01

    Purpose. To evaluate the usefulness of microperimetry in the early detection of the ocular anomalies associated with the use of hydroxychloroquine. Methods. Prospective comparative case series study comprising 14 healthy eyes of 7 patients (group A) and 14 eyes of 7 patients under treatment with hydroxychloroquine for the treatment of rheumatologic diseases and without fundoscopic or perimetric anomalies (group B). A comprehensive ophthalmological examination including microperimetry (MP) and spectral-domain optical coherence tomography was performed in both groups. Results. No significant differences were found in mean MP foveal sensitivity between groups (P = 0.18). However, mean MP overall sensitivity was significantly higher in group A (29.05 ± 0.57 dB versus group B, 26.05 ± 2.75 dB; P < 0.001). Significantly higher sensitivity values were obtained in group A in comparison to group B for the three eccentric loci evaluated (P < 0.001). Conclusion. Microperimetry seems to be a useful tool for the early detection of retinal damage in patients treated with hydroxychloroquine. PMID:25861463

  3. Selecting training and test images for optimized anomaly detection algorithms in hyperspectral imagery through robust parameter design

    NASA Astrophysics Data System (ADS)

    Mindrup, Frank M.; Friend, Mark A.; Bauer, Kenneth W.

    2011-06-01

    There are numerous anomaly detection algorithms proposed for hyperspectral imagery. Robust parameter design (RPD) techniques have been applied to some of these algorithms in an attempt to choose robust settings capable of operating consistently across a large variety of image scenes. Typically, training and test sets of hyperspectral images are chosen randomly. Previous research developed a framework for optimizing anomaly detection in HSI by considering specific image characteristics as noise variables within the context of RPD; these characteristics include the Fisher score, the ratio of target pixels and the number of clusters. This paper describes a method for selecting hyperspectral image training and test subsets yielding consistent RPD results based on these noise features. These subsets are not necessarily orthogonal, but still provide improvements over random training and test subset assignments by maximizing the volume and average distance between image noise characteristics. Several different mathematical models representing the value of a training and test set, based on such measures as the D-optimal score and various distance norms, are tested in a simulation experiment.
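Subset selection of the flavor described — picking training and test images whose noise-characteristic vectors are spread apart rather than chosen randomly — can be approximated by a greedy farthest-point (max-min) rule. This is only a simplified stand-in for the paper's D-optimal and distance-norm models, offered as an assumption about the style of selection.

```python
import numpy as np

def greedy_maxmin_subset(features, k):
    """Select k rows of `features` by farthest-point greedy selection.

    Start from the point farthest from the centroid, then repeatedly
    add the point whose minimum distance to the chosen set is largest.
    Returns the list of selected row indices.
    """
    X = np.asarray(features, dtype=float)
    centroid = X.mean(axis=0)
    chosen = [int(np.argmax(np.linalg.norm(X - centroid, axis=1)))]
    while len(chosen) < k:
        # Minimum distance of every point to the already-chosen set.
        d = np.min(
            [np.linalg.norm(X - X[c], axis=1) for c in chosen], axis=0)
        d[chosen] = -1.0  # exclude already-selected points
        chosen.append(int(np.argmax(d)))
    return chosen
```

Applied to image noise features (e.g. Fisher score, target-pixel ratio, cluster count), this keeps the selected training and test images maximally dissimilar in those characteristics.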

  4. Fuzzy Logic Based Anomaly Detection for Embedded Network Security Cyber Sensor

    SciTech Connect

    Ondrej Linda; Todd Vollmer; Jason Wright; Milos Manic

    2011-04-01

    Resiliency and security in critical infrastructure control systems in the modern world of cyber terrorism constitute a relevant concern. Developing a network security system specifically tailored to the requirements of such critical assets is of a primary importance. This paper proposes a novel learning algorithm for anomaly based network security cyber sensor together with its hardware implementation. The presented learning algorithm constructs a fuzzy logic rule based model of normal network behavior. Individual fuzzy rules are extracted directly from the stream of incoming packets using an online clustering algorithm. This learning algorithm was specifically developed to comply with the constrained computational requirements of low-cost embedded network security cyber sensors. The performance of the system was evaluated on a set of network data recorded from an experimental test-bed mimicking the environment of a critical infrastructure control system.

  5. Space Shuttle Main Engine Propellant Path Leak Detection Using Sequential Image Processing

    NASA Technical Reports Server (NTRS)

    Smith, L. Montgomery; Malone, Jo Anne; Crawford, Roger A.

    1995-01-01

    Initial research in this study using theoretical radiation transport models established that the occurrence of a leak is accompanied by a sudden but sustained change in intensity in a given region of an image. In this phase, temporal processing of video images on a frame-by-frame basis was used to detect leaks within a given field of view. The leak detection algorithm developed in this study consists of a digital highpass filter cascaded with a moving average filter. The absolute value of the resulting discrete sequence is then taken and compared to a threshold value to produce the binary leak/no-leak decision at each point in the image. Alternatively, averaging over the full frame of the output image produces a single time-varying mean value estimate that is indicative of the intensity and extent of a leak. Laboratory experiments were conducted in which artificially created leaks on a simulated SSME background were produced and recorded with a visible-wavelength video camera. These data were processed frame-by-frame over the time interval of interest using an image processor implementation of the leak detection algorithm. In addition, a 20-second video sequence of an actual SSME failure was analyzed using this technique. The resulting output image sequences and plots of the full-frame mean value versus time verify the effectiveness of the system.
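The processing chain described above — a digital highpass filter cascaded with a moving average filter, absolute value, threshold, and the alternative full-frame mean indicator — can be sketched per pixel as below. The specific filter choices (first difference as the highpass, a 5-tap moving average) and the threshold value are illustrative assumptions, not the study's actual parameters.

```python
import numpy as np

def leak_detect(frames, ma_len=5, threshold=0.1):
    """Frame-by-frame leak detector on a (time, rows, cols) stack.

    Per-pixel first-difference highpass, cascaded causal moving
    average, absolute value, and threshold give the binary
    leak/no-leak map; averaging |output| over each frame gives the
    scalar time-varying indicator.
    """
    frames = np.asarray(frames, dtype=float)
    hp = np.diff(frames, axis=0)               # highpass: frame difference
    kernel = np.ones(ma_len) / ma_len
    ma = np.apply_along_axis(                  # causal moving average in time
        lambda s: np.convolve(s, kernel, mode="full")[:len(s)], 0, hp)
    detection = np.abs(ma) > threshold         # binary leak/no-leak map
    mean_value = np.abs(ma).mean(axis=(1, 2))  # full-frame mean indicator
    return detection, mean_value
```

A sudden, sustained intensity step in a small region then triggers the detection map only at the pixels and frames of the change, and lifts the full-frame mean at the same time.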

  6. VISAD: an interactive and visual analytical tool for the detection of behavioral anomalies in maritime traffic data

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran; Ziemke, Tom; Warston, Håkan

    2009-05-01

    Monitoring the surveillance of large sea areas normally involves the analysis of huge quantities of heterogeneous data from multiple sources (radars, cameras, automatic identification systems, reports, etc.). The rapid identification of anomalous behavior or any threat activity in the data is an important objective for enabling homeland security. While it is worth acknowledging that many existing mining applications support identification of anomalous behavior, autonomous anomaly detection systems are rarely used in the real world. There are two main reasons: (1) the detection of anomalous behavior is normally not a well-defined and structured problem and therefore, automatic data mining approaches do not work well and (2) the difficulties that these systems have regarding the representation and employment of the prior knowledge that the users bring to their tasks. In order to overcome these limitations, we believe that human involvement in the entire discovery process is crucial. Using a visual analytics process model as a framework, we present VISAD: an interactive, visual knowledge discovery tool for supporting the detection and identification of anomalous behavior in maritime traffic data. VISAD supports the insertion of human expert knowledge in (1) the preparation of the system, (2) the establishment of the normal picture and (3) in the actual detection of rare events. For each of these three modules, VISAD implements different layers of data mining, visualization and interaction techniques. Thus, the detection procedure becomes transparent to the user, which increases his/her confidence and trust in the system and overall, in the whole discovery process.

  7. Sequential Filtering Processes Shape Feature Detection in Crickets: A Framework for Song Pattern Recognition

    PubMed Central

    Hedwig, Berthold G.

    2016-01-01

    Intraspecific acoustic communication requires filtering processes and feature detectors in the auditory pathway of the receiver for the recognition of species-specific signals. Insects like acoustically communicating crickets allow describing and analysing the mechanisms underlying auditory processing at the behavioral and neural level. Female crickets approach male calling song, their phonotactic behavior is tuned to the characteristic features of the song, such as the carrier frequency and the temporal pattern of sound pulses. Data from behavioral experiments and from neural recordings at different stages of processing in the auditory pathway lead to a concept of serially arranged filtering mechanisms. These encompass a filter for the carrier frequency at the level of the hearing organ, and the pulse duration through phasic onset responses of afferents and reciprocal inhibition of thoracic interneurons. Further, processing by a delay line and coincidence detector circuit in the brain leads to feature detecting neurons that specifically respond to the species-specific pulse rate, and match the characteristics of the phonotactic response. This same circuit may also control the response to the species-specific chirp pattern. Based on these serial filters and the feature detecting mechanism, female phonotactic behavior is shaped and tuned to the characteristic properties of male calling song. PMID:26941647

  9. Sequential detection of temporal communities in evolving networks by estrangement confinement

    NASA Astrophysics Data System (ADS)

    Sreenivasan, Sameet; Kawadia, Vikas

    2013-03-01

    Temporal communities are the result of a consistent partitioning of nodes across multiple snapshots of an evolving network, and they provide insights into how dense clusters in a network emerge, combine, split and decay over time. Reliable detection of temporal communities requires finding a good community partition in a given snapshot while simultaneously ensuring that it bears some similarity to the partition(s) found in the previous snapshot(s). This is a particularly difficult task given the extreme sensitivity of community structure yielded by current methods to changes in the network structure. Motivated by the inertia of inter-node relationships, we present a new measure of partition distance called estrangement, and show that constraining estrangement enables the detection of meaningful temporal communities at various degrees of temporal smoothness in diverse real-world datasets. Estrangement confinement consequently provides a principled approach to uncovering temporal communities in evolving networks. (V. Kawadia and S. Sreenivasan, http://arxiv.org/abs/1203.5126) Supported in part by ARL NS-CTA
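
    The estrangement idea can be illustrated with a toy partition-distance computation. The sketch below implements one plausible reading of the measure (the fraction of persisting edges whose endpoints shared a community in the previous snapshot but are separated in the current one); the exact definition is in the cited paper, and all names and data here are illustrative.

```python
def estrangement(edges_prev, edges_curr, part_prev, part_curr):
    """Fraction of persisting edges that were intra-community under the
    previous partition but are split across communities now (illustrative)."""
    persisting = set(edges_prev) & set(edges_curr)
    linked = [e for e in persisting if part_prev[e[0]] == part_prev[e[1]]]
    if not linked:
        return 0.0
    broken = [e for e in linked if part_curr[e[0]] != part_curr[e[1]]]
    return len(broken) / len(linked)

edges = [(1, 2), (2, 3), (3, 4)]
p0 = {1: 'a', 2: 'a', 3: 'a', 4: 'b'}   # partition at snapshot t-1
p1 = {1: 'a', 2: 'b', 3: 'b', 4: 'b'}   # partition at snapshot t
print(estrangement(edges, edges, p0, p1))
```

Constraining this quantity while maximizing modularity in each snapshot is what yields temporally smooth communities.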

  10. Thermal and TEC anomalies detection using an intelligent hybrid system around the time of the Saravan, Iran, (Mw = 7.7) earthquake of 16 April 2013

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2014-02-01

    A powerful earthquake of Mw = 7.7 struck the Saravan region (28.107° N, 62.053° E) in Iran on 16 April 2013. Devising an automated anomaly detection method for the nonlinear time series of earthquake precursors remains an attractive and challenging task. Artificial Neural Networks (ANN) and Particle Swarm Optimization (PSO) have shown strong potential for accurate time series prediction. This paper presents the first study integrating the ANN and PSO methods in earthquake-precursor research to detect the unusual variations of the thermal and total electron content (TEC) seismo-ionospheric anomalies induced by the strong Saravan earthquake. In this study, to overcome stagnation in local minima during ANN training, PSO is used as the optimization method in place of traditional training algorithms. The proposed hybrid method detected a considerable number of anomalies 4 and 8 days preceding the earthquake. Since, in this case study, ionospheric TEC anomalies induced by seismic activity are confounded with background fluctuations due to solar activity, a multi-resolution time series processing technique based on the wavelet transform was applied to the TEC signal variations. Because agreement among the final results of several robust methods is a convincing indication of a method's efficiency, the thermal and TEC anomalies detected by the ANN + PSO method were compared with the anomalies observed by implementing the mean, median, wavelet, Kalman filter, Auto-Regressive Integrated Moving Average (ARIMA), Support Vector Machine (SVM) and Genetic Algorithm (GA) methods. The results indicate that the ANN + PSO method is quite promising and deserves serious attention as a new tool for detecting thermal and TEC seismo-ionospheric anomalies.
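
    The ANN + PSO idea can be sketched on synthetic data: a particle swarm searches the weight space of a tiny one-step-ahead predictor, and points with large prediction residuals are flagged as anomalies. Everything below (series, network size, swarm constants, the residual threshold) is an illustrative assumption, not the paper's actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "TEC-like" series: a daily cycle plus noise, with one
# injected anomaly.
t = np.arange(200)
series = np.sin(2 * np.pi * t / 24) + 0.05 * rng.standard_normal(200)
series[150] += 1.5                            # injected anomaly

X, y = series[:-1], series[1:]                # one-step-ahead prediction

def predict(p, x):
    """Tiny 1-input, 3-hidden-unit, 1-output tanh network; p holds all
    10 parameters (hidden weights, hidden biases, output weights, bias)."""
    return np.tanh(np.outer(x, p[0:3]) + p[3:6]) @ p[6:9] + p[9]

def loss(p):
    return np.mean((predict(p, X) - y) ** 2)

# Plain global-best PSO over the network parameters (no gradients).
n, dim = 30, 10
pos = rng.standard_normal((n, dim))
vel = np.zeros((n, dim))
pbest, pbest_f = pos.copy(), np.array([loss(p) for p in pos])
for _ in range(200):
    gbest = pbest[pbest_f.argmin()]
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([loss(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]

best = pbest[pbest_f.argmin()]
resid = y - predict(best, X)
print(np.abs(resid).argmax())                 # index of the strongest residual
```

The swarm replaces gradient descent, which is exactly the role PSO plays in the hybrid method: it avoids the local-minimum stagnation of backpropagation-style training.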

  11. Principle of indirect comparison (PIC): simulation and analysis of PIC-based anomaly detection in multispectral data

    NASA Astrophysics Data System (ADS)

    Rosario, Dalton

    2006-05-01

    The Army has gained a renewed interest in hyperspectral (HS) imagery for military surveillance. As a result, an HS research team has been established at the Army Research Lab (ARL) to focus exclusively on the design of innovative algorithms for target detection in natural clutter. In 2005 at this symposium, we presented performance comparisons between a proposed anomaly detector and existing ones on real HS data. Herein, we present some insightful results on our general approach, using analyses of the statistical performance of an additional ARL anomaly detector on 1500 simulated realizations of model-specific data to shed some light on its effectiveness. Simulated data of increasing background complexity will be used for the analysis, where highly correlated multivariate Gaussian random samples will model homogeneous backgrounds and mixtures of Gaussians will model non-homogeneous backgrounds. Distinct multivariate random samples will model targets, and targets will be added to backgrounds. The principle that led to the design of our detectors employs an indirect sample comparison to test the likelihood that local HS random samples belong to the same population. Let X and Y denote two random samples, and let Z = X U Y, where U denotes the union. We showed that X can be indirectly compared to Y by comparing, instead, Z to Y (or to X). Mathematical implementations of this simple idea have shown a remarkable ability to preserve performance on meaningful detections (e.g., full-pixel targets), while significantly reducing the number of meaningless detections (e.g., transitions between background regions in the scene).
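
    The indirect-comparison principle (compare Z = X U Y to Y rather than X to Y) can be sketched with one-dimensional samples and a simple two-sample statistic. The statistic and all sample parameters below are illustrative assumptions, not the detector from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def t_stat(a, b):
    """Simple two-sample statistic: standardized difference of means."""
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return abs(a.mean() - b.mean()) / se

X = rng.normal(0.0, 1.0, 200)          # local background sample
Y_same = rng.normal(0.0, 1.0, 50)      # drawn from the same population
Y_diff = rng.normal(1.5, 1.0, 50)      # target-like shifted sample

Z_same = np.concatenate([X, Y_same])   # Z = X U Y
Z_diff = np.concatenate([X, Y_diff])

# Indirect comparison: test Z against Y instead of X against Y.
print(t_stat(Z_same, Y_same))          # small: Y blends into the pool
print(t_stat(Z_diff, Y_diff))          # large: Y stands out from the pool
```

When Y comes from the same population as X, pooling dilutes nothing and the statistic stays small; a target-like Y shifts itself away from the pooled sample, so the indirect statistic grows.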

  12. Selecting Observation Platforms for Optimized Anomaly Detectability under Unreliable Partial Observations

    SciTech Connect

    Wen-Chiao Lin; Humberto E. Garcia; Tae-Sic Yoo

    2011-06-01

    Diagnosers for tracking the occurrences of special events in the framework of unreliable, partially observed discrete-event dynamical systems were developed in previous work. This paper considers observation platforms consisting of sensors that provide partial and unreliable observations and of diagnosers that analyze them. Diagnosers in observation platforms typically perform better as the sensors providing the observations become more costly or increase in number. This paper proposes a methodology for finding an observation platform that achieves an optimal balance between cost and performance while satisfying given observability requirements and constraints. Since this problem is in general computationally hard in the framework considered, an observation platform optimization algorithm is utilized that uses two greedy heuristics, one myopic and the other based on projected performance. These heuristics are executed sequentially in order to find the best observation platforms. The developed algorithm is then applied to an observation platform optimization problem for a multi-unit-operation system. Results show that improved observation platforms can be found that significantly reduce the observation platform cost while still yielding acceptable performance for correctly inferring the occurrences of special events.
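
    The myopic greedy heuristic can be sketched as budgeted sensor selection: repeatedly add the sensor with the best marginal observability gain per unit cost. The sensors, costs, and event sets below are made up for illustration; the paper's actual performance measure is diagnoser-based, not simple coverage.

```python
# Illustrative sensors: name -> (cost, set of event types it can observe)
sensors = {
    "s1": (3.0, {"e1", "e2"}),
    "s2": (1.0, {"e2"}),
    "s3": (2.0, {"e3", "e4"}),
    "s4": (4.0, {"e1", "e2", "e3", "e4"}),
}

def greedy_platform(sensors, budget):
    """Myopic greedy: repeatedly add the sensor with the best marginal
    coverage per unit cost that still fits within the budget."""
    chosen, covered, spent = [], set(), 0.0
    while True:
        best, best_ratio = None, 0.0
        for name, (cost, events) in sensors.items():
            if name in chosen or spent + cost > budget:
                continue
            gain = len(events - covered)
            if gain and gain / cost > best_ratio:
                best, best_ratio = name, gain / cost
        if best is None:
            return chosen, covered, spent
        cost, events = sensors[best]
        chosen.append(best)
        covered |= events
        spent += cost

print(greedy_platform(sensors, budget=6.0))
```

Note how the cheap specialist sensors win over the expensive all-in-one unit here: full event coverage is reached within budget without ever selecting "s4".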

  13. Detection of oxygen isotopic anomaly in terrestrial atmospheric carbonates and its implications to Mars

    PubMed Central

    Shaheen, R.; Abramian, A.; Horn, J.; Dominguez, G.; Sullivan, R.; Thiemens, Mark H.

    2010-01-01

    The debate of life on Mars centers around the source of the globular, micrometer-sized mineral carbonates in the ALH84001 meteorite; consequently, the identification of Martian processes that form carbonates is critical. This paper reports a previously undescribed carbonate formation process that occurs on Earth and, likely, on Mars. We identified micrometer-sized carbonates in terrestrial aerosols that possess excess 17O (0.4–3.9‰). The unique O-isotopic composition mechanistically describes the atmospheric heterogeneous chemical reaction on aerosol surfaces. Concomitant laboratory experiments define the transfer of ozone isotopic anomaly to carbonates via hydrogen peroxide formation when O3 reacts with surface adsorbed water. This previously unidentified chemical reaction scenario provides an explanation for production of the isotopically anomalous carbonates found in the SNC (shergottites, nakhlaites, chassignites) Martian meteorites and terrestrial atmospheric carbonates. The anomalous hydrogen peroxide formed on the aerosol surfaces may transfer its O-isotopic signature to the water reservoir, thus producing mass independently fractionated secondary mineral evaporites. The formation of peroxide via heterogeneous chemistry on aerosol surfaces also reveals a previously undescribed oxidative process of utility in understanding ozone and oxygen chemistry, both on Mars and Earth. PMID:21059939
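
    The "excess 17O" quoted above is conventionally reported as the deviation from the terrestrial mass-dependent fractionation line, Δ17O ≈ δ17O − 0.52·δ18O. A minimal sketch of that arithmetic (the δ values below are illustrative, not measurements from the paper):

```python
def cap_delta_17O(delta17, delta18, slope=0.52):
    """Mass-independent 17O excess (per mil), linear approximation."""
    return delta17 - slope * delta18

# Illustrative values (per mil); a mass-dependently fractionated sample
# falls on the slope-0.52 line and shows no anomaly.
print(cap_delta_17O(10.4, 20.0))   # on the line -> no anomaly
print(cap_delta_17O(12.0, 20.0))   # ~1.6 per mil excess, anomalous
```

A nonzero Δ17O of the kind reported (0.4–3.9‰) cannot be produced by ordinary mass-dependent chemistry, which is why it traces the ozone-derived peroxide pathway.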

  14. Subsurface faults detection based on magnetic anomalies investigation: A field example at Taba protectorate, South Sinai

    NASA Astrophysics Data System (ADS)

    Khalil, Mohamed H.

    2016-08-01

    Quantitative interpretation of magnetic data, particularly over a complex dissected structure, necessitates the use of filtering techniques. In the Taba protectorate, Sinai, a synthesis of different filtering algorithms was carried out to distinguish and verify the subsurface structure and to estimate the depth of the causative magnetic sources. To separate the shallow-seated structure, filters of the vertical derivative (VDR), Butterworth high-pass (BWHP), analytic signal (AS) amplitude, and total horizontal derivative of the tilt derivative (TDR_THDR) were applied, while the apparent susceptibility and Butterworth low-pass (BWLP) filters were used to identify the deep-seated structure. The depths of the geological contacts and faults were calculated by 3D Euler deconvolution. Notably, TDR_THDR was independent of geomagnetic inclination, significantly less susceptible to noise, and more sensitive to the details of the shallow superimposed structures, whereas the BWLP filter showed high resolution in attenuating the shorter wavelengths of near-surface anomalies and emphasizing the longer wavelengths derived from the deeper causative structure. 3D Euler deconvolution (SI = 0) was quite amenable to estimating the depths of the superimposed subsurface structure. The pattern, location, and trend of the deduced shallow and deep faults conformed remarkably to the known fault system.
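
    The tilt derivative behind the TDR_THDR filter is TDR = arctan(VDR / THDR), where VDR is the vertical derivative and THDR the total horizontal derivative; TDR_THDR is then the total horizontal derivative of that angle. A sketch on a synthetic grid (the vertical derivative is a stand-in here; real workflows compute it spectrally):

```python
import numpy as np

def tilt_derivative(dT_dx, dT_dy, dT_dz):
    """TDR = arctan(vertical derivative / total horizontal derivative)."""
    thdr = np.hypot(dT_dx, dT_dy)
    return np.arctan2(dT_dz, thdr)

def tdr_thdr(tdr, dx=1.0, dy=1.0):
    """Total horizontal derivative of the tilt derivative (edge detector)."""
    gy, gx = np.gradient(tdr, dy, dx)
    return np.hypot(gx, gy)

# Synthetic anomaly: a single smooth bump. The TDR is bounded in
# (-pi/2, pi/2] regardless of source amplitude, which is why the tilt
# derivative is so insensitive to anomaly strength.
x = np.linspace(-10, 10, 101)
X, Y = np.meshgrid(x, x)
T = np.exp(-(X**2 + Y**2) / 8)            # stand-in field
gy, gx = np.gradient(T, x, x)
gz = T                                     # illustrative vertical derivative
tdr = tilt_derivative(gx, gy, gz)
edges = tdr_thdr(tdr, dx=x[1] - x[0], dy=x[1] - x[0])
print(tdr.min(), tdr.max(), edges.max())
```

Because the arctangent normalizes amplitude away, TDR_THDR highlights the edges of shallow sources regardless of their magnetization strength, matching the behavior described above.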

  15. Detection of oxygen isotopic anomaly in terrestrial atmospheric carbonates and its implications to Mars.

    PubMed

    Shaheen, R; Abramian, A; Horn, J; Dominguez, G; Sullivan, R; Thiemens, Mark H

    2010-11-23

    The debate of life on Mars centers around the source of the globular, micrometer-sized mineral carbonates in the ALH84001 meteorite; consequently, the identification of Martian processes that form carbonates is critical. This paper reports a previously undescribed carbonate formation process that occurs on Earth and, likely, on Mars. We identified micrometer-sized carbonates in terrestrial aerosols that possess excess (17)O (0.4-3.9‰). The unique O-isotopic composition mechanistically describes the atmospheric heterogeneous chemical reaction on aerosol surfaces. Concomitant laboratory experiments define the transfer of ozone isotopic anomaly to carbonates via hydrogen peroxide formation when O(3) reacts with surface adsorbed water. This previously unidentified chemical reaction scenario provides an explanation for production of the isotopically anomalous carbonates found in the SNC (shergottites, nakhlaites, chassignites) Martian meteorites and terrestrial atmospheric carbonates. The anomalous hydrogen peroxide formed on the aerosol surfaces may transfer its O-isotopic signature to the water reservoir, thus producing mass independently fractionated secondary mineral evaporites. The formation of peroxide via heterogeneous chemistry on aerosol surfaces also reveals a previously undescribed oxidative process of utility in understanding ozone and oxygen chemistry, both on Mars and Earth. PMID:21059939

  17. Coronary artery anomalies.

    PubMed

    Earls, James P

    2006-12-01

    Coronary artery anomalies are uncommon findings but can be of significant clinical importance in a small number of individuals. Clinical presentation depends on the specific anomaly. Most coronary artery anomalies are benign and clinically insignificant, however, some anomalies are potentially significant and can lead to heart failure and even death. Noninvasive imaging has emerged as the preferred way to image coronary anomalies. Both electron beam computed tomography (EBCT) and magnetic resonance angiography (MRA) are useful for the diagnosis of anomalous coronary arteries. Recently, MDCT has also proven to be very useful in the detection and characterization of anomalous coronary arteries. This chapter will review the appearance of the most commonly encountered coronary anomalies on MDCT. PMID:17709086

  18. DEVELOPMENT AND TESTING OF PROCEDURES FOR CARRYING OUT EMERGENCY PHYSICAL INVENTORY TAKING AFTER DETECTING ANOMALY EVENTS CONCERNING NM SECURITY.

    SciTech Connect

    VALENTE,J.FISHBONE,L.ET AL.

    2003-07-13

    In the State Scientific Center of the Russian Federation - Institute of Physics and Power Engineering (SSC RF-IPPE, Obninsk), which is under Minatom jurisdiction, procedures for carrying out emergency physical inventory taking (EPIT) were developed and tested in cooperation with Brookhaven National Laboratory (USA). Here, emergency physical inventory taking means a PIT carried out when there are symptoms indicating a possible loss (theft) of NM. Such a PIT often requires verification of the attributes and quantitative characteristics of all the NM items located in a specific Material Balance Area (MBA). In order to carry out the exercise, an MBA was selected in which many thousands of NM items containing highly enriched uranium are used. Three clients of the computerized material accounting system (CMAS) are installed in this MBA. Labels with identification numbers that are unique within the IPPE site, in the form of digit combinations and an appropriate bar code, have been applied to the NM items, containers and authorized locations. All the data to be checked during the EPIT are stored in the CMAS database. Five variants of anomalies initiating an EPIT, each requiring a different type of EPIT organization, are considered. Automated working places (AWPs) were created on the basis of the client computers in order to carry out a large number of measurements within a reasonable time. In addition to a CMAS client computer, the main components of an AWP include a bar-code reader, an electronic scale and an enrichment meter with a NaI detector (the MCA-based Inspector, manufactured by Canberra). All these devices work together with a client computer in on-line mode. Special computer code (Emergency Inventory Software, EIS) was developed. All the algorithms for interaction between the operator and the system, as well as the algorithms for data exchange during the measurements and for data comparison, are implemented in this software. Registration of detected

  19. A major Sm epitope anchored to sequential oligopeptide carriers is a suitable antigenic substrate to detect anti-Sm antibodies.

    PubMed

    Petrovas, C J; Vlachoyiannopoulos, P G; Tzioufas, A G; Alexopoulos, C; Tsikaris, V; Sakarellos-Daitsiotis, M; Sakarellos, C; Moutsopoulos, H M

    1998-11-01

    A sensitive, highly reproducible solid-phase enzyme immunoassay (ELISA) was developed in order to investigate whether the synthetic heptapeptide PPGMRPP - a major epitope of the Sm autoantigen - anchored in five copies to a sequential oligopeptide carrier (SOC), [(PPGMRPP)5-SOC5], is a suitable antigenic substrate for identifying anti-Sm antibodies. Sera with different autoantibody specificities [45 anti-Sm, 40 anti-U1RNP, 40 anti-Ro(SSA)/La(SSB) positive, 21 antinuclear antibody positive but negative for antibodies to extractable nuclear antigens (ANA+/ENA-), and 75 normal human sera, ANA negative] and 75 sera from patients with rheumatoid arthritis (RA) were tested for anti-(PPGMRPP)5-(SOC)5 reactivity in order to evaluate the specificity and sensitivity of the method for detecting anti-Sm antibodies. RNA immunoprecipitation assays for the detection of anti-Sm and anti-U1RNP antibodies and counterimmunoelectrophoresis (CIE) for the detection of anti-Ro(SSA) and anti-La(SSB) antibodies were used as reference techniques. The sensitivity of the method was 98% and the specificity 68% for the determination of anti-Sm antibodies, while for the determination of anti-Sm and/or anti-U1RNP reactivity (antibodies to snRNPs) the corresponding values were 82% and 86%, respectively. A comparison of the above assay with an ELISA using purified Sm/U1RNP complex as the immobilized antigen showed that the sensitivity of the anti-Sm/U1RNP ELISA in detecting anti-snRNPs was 74%; in addition, sera with anti-Sm antibodies gave higher binding in the anti-(PPGMRPP)5-(SOC)5 ELISA than in the anti-Sm/U1RNP ELISA. Intra- and inter-assay precision, measured on four sera with reactivities spanning a wide range of absorbance values, showed that the intra-assay coefficient of variation (CV%) ranged from 2.7 to 6 and the inter-assay CV% from 9 to 14.5. These results indicate that the PPGMRPP peptide anchored to a pentameric SOC as a carrier is a suitable antigen for
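
    The sensitivity and specificity figures quoted above are simple count ratios from a confusion matrix. A minimal sketch; the counts below are illustrative values chosen to reproduce the reported 98%/68%, not the paper's raw data:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for the anti-Sm determination:
# 44 of 45 anti-Sm sera positive; 120 of 176 control sera negative.
sens, spec = sens_spec(tp=44, fn=1, tn=120, fp=56)
print(round(sens, 2), round(spec, 2))
```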

  20. Evolutionary neural networks for anomaly detection based on the behavior of a program.

    PubMed

    Han, Sang-Jun; Cho, Sung-Bae

    2006-06-01

    The process of learning the behavior of a given program by using machine-learning techniques (based on system-call audit data) is effective for detecting intrusions. Rule learning, neural networks, statistics, and hidden Markov models (HMMs) are representative methods for intrusion detection. Among them, neural networks are known for good performance in learning system-call sequences. In order to apply this knowledge to real-world problems successfully, it is important to determine appropriate structures and weights for the neural networks. However, finding the appropriate structures takes very long because no suitable analytical solutions exist. In this paper, a novel intrusion-detection technique based on evolutionary neural networks (ENNs) is proposed. One advantage of using ENNs is that it takes less time to obtain superior neural networks than with conventional approaches, because the structures and weights of the networks are discovered simultaneously. Experimental results with the 1999 Defense Advanced Research Projects Agency (DARPA) Intrusion Detection Evaluation (IDEVAL) data confirm that ENNs are promising tools for intrusion detection.
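
    The core idea of evolving a network rather than training it by gradient descent can be sketched with a tiny elitist evolution strategy on a toy task (XOR stands in for system-call sequence learning). The network size, population sizes, and mutation scale are all illustrative assumptions, not the paper's configuration.

```python
import math
import random

random.seed(0)

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(w, x):
    """A 2-2-1 tanh network; w is a flat list of 9 weights and biases."""
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

def fitness(w):
    """Negative squared error over the XOR table (higher is better)."""
    return -sum((forward(w, x) - y) ** 2 for x, y in XOR)

# Elitist evolution strategy: keep the 10 best, mutate each 3 times.
pop = [[random.gauss(0, 1) for _ in range(9)] for _ in range(40)]
for _ in range(300):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [[g + random.gauss(0, 0.3) for g in p]
                     for p in parents for _ in range(3)]

best = max(pop, key=fitness)
print([round(forward(best, x), 2) for x, _ in XOR])
```

The full ENN approach additionally mutates the connectivity structure, not just the weights, which is what lets it discover architecture and parameters simultaneously.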

  2. Isolation and detection of single molecules on paramagnetic beads using sequential fluid flows in microfabricated polymer array assemblies.

    PubMed

    Kan, Cheuk W; Rivnak, Andrew J; Campbell, Todd G; Piech, Tomasz; Rissin, David M; Mösl, Matthias; Peterça, Andrej; Niederberger, Hans-Peter; Minnehan, Kaitlin A; Patel, Purvish P; Ferrell, Evan P; Meyer, Raymond E; Chang, Lei; Wilson, David H; Fournier, David R; Duffy, David C

    2012-03-01

    We report a method for isolating individual paramagnetic beads in arrays of femtolitre-sized wells and detecting single enzyme-labeled proteins on these beads using sequential fluid flows in microfabricated polymer array assemblies. Arrays of femtolitre-sized wells were fabricated in cyclic olefin polymer (COP) using injection moulding based on DVD manufacturing. These arrays were bonded to a complementary fluidic structure that was also moulded in COP to create an enclosed device to allow delivery of liquids to the arrays. Enzyme-associated, paramagnetic beads suspended in aqueous solutions of enzyme substrate were delivered fluidically to the array such that one bead per well was loaded by gravity. A fluorocarbon oil was then flowed into the device to remove excess beads from the surface of the array, and to seal and isolate the femtolitre-sized wells containing beads and enzyme substrate. The device was then imaged using standard fluorescence imaging to determine which wells contained single enzyme molecules. The analytical performance of this device as the detector for digital ELISA compared favourably to the standard method, i.e., glass arrays mechanically sealed against a silicone gasket; prostate specific antigen (PSA) could be detected from 0.011 pg mL(-1) up to 100 pg mL(-1). The use of an enclosed fluidic device to isolate beads in single-molecule arrays offers a multitude of advantages for low-cost manufacturing, ease of automation, and instrument development to enable applications in biomarker validation and medical diagnosis. PMID:22179487
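
    Digital ELISA of this kind relies on Poisson statistics: if enzymes are distributed randomly over beads, the fraction of "on" wells determines the mean number of enzymes per bead. A short sketch of that back-calculation (the 9.5% figure is illustrative):

```python
import math

def enzymes_per_bead(f_on):
    """Poisson estimate of mean enzyme labels per bead from the
    fraction of bead-containing wells that turn fluorescent."""
    return -math.log(1.0 - f_on)

# If 9.5% of bead-containing wells are "on", most active wells hold
# exactly one enzyme, validating the single-molecule counting regime.
lam = enzymes_per_bead(0.095)
p_multi = 1 - math.exp(-lam) - lam * math.exp(-lam)   # P(k >= 2)
print(lam, p_multi)
```

Because P(k ≥ 2) stays below about 1% at this occupancy, counting "on" wells is effectively counting single molecules, which is what makes the sub-pg/mL PSA sensitivity possible.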

  3. Spatial scanning for anomaly detection in acoustic emission testing of an aerospace structure

    NASA Astrophysics Data System (ADS)

    Hensman, James; Worden, Keith; Eaton, Mark; Pullin, Rhys; Holford, Karen; Evans, Sam

    2011-10-01

    Acoustic emission (AE) monitoring of engineering structures potentially provides a convenient, cost-effective means of performing structural health monitoring. Networks of AE sensors can be easily and unobtrusively installed upon structures, giving the ability to detect and locate damage-related strain releases ('events') in the structure. Use of the technique is not widespread due to the lack of a simple and effective method for detecting abnormal activity levels: the sensitivity of AE sensor networks is such that events unrelated to damage are prevalent in most applications. In this publication, we propose to monitor AE activity in a structure using a spatial scanning statistic, developed and used effectively in the field of epidemiology. The technique is demonstrated on an aerospace structure - an Airbus A320 main landing gear fitting - undergoing fatigue loading, and the method is compared to existing techniques. Despite its simplicity, the scanning statistic proves to be an extremely effective tool in detecting the onset of damage in the structure: it requires little to no user intervention or expertise, is inexpensive to compute and has an easily interpretable output. Furthermore, the generic nature of the method allows the technique to be used in a variety of monitoring scenarios, to detect damage in a wide range of structures.
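
    The spatial scanning statistic from epidemiology can be sketched as a Kulldorff-style Poisson scan: slide a window over per-location event counts and score each window by a log-likelihood ratio against a uniform baseline. The counts, window width, and baseline below are illustrative, not the landing-gear data.

```python
import math

def scan_llr(c, e, C, E):
    """Kulldorff-style Poisson log-likelihood ratio for one window with
    observed count c and expected count e (C, E are the totals)."""
    if c <= e * C / E:
        return 0.0
    if c == C:
        return c * math.log(c / e)
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (E - e))

def best_window(counts, width):
    """Slide a window across per-location counts; return (max LLR, start)."""
    C = sum(counts)
    E = float(C)          # uniform baseline: equal expected share per cell
    best_llr, best_i = -1.0, -1
    for i in range(len(counts) - width + 1):
        c = sum(counts[i:i + width])
        e = width * C / len(counts)
        llr = scan_llr(c, e, C, E)
        if llr > best_llr:
            best_llr, best_i = llr, i
    return best_llr, best_i

# Hypothetical AE hit counts at 10 sensor locations; hot spot at 5-6.
counts = [2, 3, 2, 3, 2, 14, 15, 3, 2, 2]
print(best_window(counts, width=2))
```

The maximum-LLR window localizes the cluster of AE activity; in practice its significance is assessed by Monte Carlo permutation of the baseline, which keeps the method nearly parameter-free for the operator.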

  4. Behavioral Anomaly Detection: A Socio-Technical Study of Trustworthiness in Virtual Organizations

    ERIC Educational Resources Information Center

    Ho, Shuyuan Mary

    2009-01-01

    This study examines perceptions of human "trustworthiness" as a key component in countering insider threats. The term "insider threat" refers to situations where a critical member of an organization behaves against the interests of the organization, in an illegal and/or unethical manner. Identifying and detecting how an individual's behavior…

  5. Recent Results on "Approximations to Optimal Alarm Systems for Anomaly Detection"

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2009-01-01

    An optimal alarm system and its approximations may use Kalman filtering for univariate linear dynamic systems driven by Gaussian noise to provide a layer of predictive capability. Predicted Kalman filter future process values and a fixed critical threshold can be used to construct a candidate level-crossing event over a predetermined prediction window. An optimal alarm system can be designed to elicit the fewest false alarms for a fixed detection probability in this particular scenario.
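
    The predict-then-threshold idea can be sketched for a scalar linear-Gaussian model: propagate the Kalman state forecast over the prediction window and alarm when the probability of exceeding the critical threshold at some predicted step gets too high. This per-step bound is a simplification of the true level-crossing event probability, and all numbers are illustrative.

```python
import math

def norm_sf(z):
    """P(Z > z) for a standard normal variable."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def alarm(x_hat, P, a, q, L, horizon, p_alarm):
    """Propagate the scalar Kalman forecast x_{t+1} = a*x_t + w,
    w ~ N(0, q), over `horizon` steps; raise an alarm if the chance of
    crossing threshold L at any single step exceeds p_alarm."""
    worst = 0.0
    for _ in range(horizon):
        x_hat, P = a * x_hat, a * a * P + q
        worst = max(worst, norm_sf((L - x_hat) / math.sqrt(P)))
    return worst > p_alarm, worst

# A state drifting near the threshold trips the alarm; a quiet one does not.
print(alarm(x_hat=0.9, P=0.05, a=0.95, q=0.01, L=1.0, horizon=5, p_alarm=0.2))
print(alarm(x_hat=0.0, P=0.05, a=0.95, q=0.01, L=1.0, horizon=5, p_alarm=0.2))
```

Tuning p_alarm trades false alarms against missed detections, which is the design axis the optimal alarm system optimizes for a fixed detection probability.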

  6. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, M. Clinton; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many konts to make possible high speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flow fields/plumes; the Optical Plume Anomaly Detection (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and analytical package designed to utilize information collected by OPAD is known as the Engine Diagnostic Filtering System (EDIFIS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Additionally, efforts are being advanced to hardware encode components of the EDIFIS in order to address real-time operational requirements for health monitoring and management. This paper addresses the OPAD with its tool suite, and discusses what is considered a natural progression: a concept for migrating OPAD towards detection of high energy particles, including neutrons and gamma rays. The integration of these tools and capabilities will provide NASA with a systematic approach to monitor space vehicle internal and external environment.

  7. Remote detection of metal anomalies on Pilot Mountain, Randolph County, North Carolina

    USGS Publications Warehouse

    Milton, N.M.; Collins, William; Chang, S.-H.; Schmidt, R.G.

    1982-01-01

    A biogeophysical technique used successfully to delineate mineralized zones under coniferous forests has been extended to a deciduous region in the Piedmont physiographic province of North Carolina. Pilot Mountain, a hydrothermally altered monadnock within the Carolina slate belt, contains areas of anomalously high amounts of Cu, Mo, and Sn in the soils. Leaves of canopy trees in the mineralized zone also contain significant amounts of Cu. Spectral data acquired from a high-resolution airborne spectroradiometer were processed using a waveform analysis technique to minimize background noise caused by canopy variations and slope effects. Areas containing anomalous metals were detected by spectral changes in the chlorophyll absorption region.

  8. Cyber-Critical Infrastructure Protection Using Real-Time Payload-Based Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Düssel, Patrick; Gehl, Christian; Laskov, Pavel; Bußer, Jens-Uwe; Störmann, Christof; Kästner, Jan

    With an increasing demand for inter-connectivity and protocol standardization, modern cyber-critical infrastructures are exposed to a multitude of serious threats that may give rise to severe damage to life and assets without the implementation of proper safeguards. Thus, we propose a method that is capable of reliably detecting unknown, exploit-based attacks on cyber-critical infrastructures carried out over the network. We illustrate the effectiveness of the proposed method by conducting experiments on network traffic of the kind found in modern industrial control systems. Moreover, we provide results of throughput measurements which demonstrate the real-time capabilities of our system.
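
    Payload-based anomaly detection of this kind is commonly built on byte n-gram profiles: learn a centroid profile from normal payloads and score new payloads by their distance from it. The sketch below uses bigrams and L1 distance as illustrative choices; the payloads are made up and this is not the paper's exact feature map.

```python
from collections import Counter

def ngrams(payload: bytes, n: int = 2):
    return Counter(payload[i:i + n] for i in range(len(payload) - n + 1))

def normalize(c):
    total = sum(c.values()) or 1
    return {k: v / total for k, v in c.items()}

def centroid(payloads):
    """Mean normalized n-gram frequency over the training payloads."""
    acc = Counter()
    for p in payloads:
        for k, v in normalize(ngrams(p)).items():
            acc[k] += v / len(payloads)
    return acc

def score(payload, cent):
    """L1 distance between a payload's n-gram profile and the centroid."""
    prof = normalize(ngrams(payload))
    keys = set(prof) | set(cent)
    return sum(abs(prof.get(k, 0.0) - cent.get(k, 0.0)) for k in keys)

normal = [b"READ TEMP SENSOR 1", b"READ TEMP SENSOR 2", b"READ FLOW SENSOR 1"]
cent = centroid(normal)
print(score(b"READ TEMP SENSOR 3", cent))       # low: resembles training data
print(score(b"\x90\x90\x90\x90/bin/sh", cent))  # high: shellcode-like bytes
```

Because industrial-control traffic is highly regular, even this simple profile separates exploit payloads sharply from legitimate commands, which is what makes payload-level detection attractive in that setting.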

  9. Genetic algorithm for TEC seismo-ionospheric anomalies detection around the time of the Solomon (Mw = 8.0) earthquake of 06 February 2013

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-08-01

    On 6 February 2013, at 12:12:27 local time (01:12:27 UTC), a seismic event registering Mw 8.0 struck the Solomon Islands, located at the boundary of the Australian and Pacific tectonic plates. Time series prediction is an important and widely studied topic in earthquake-precursor research. This paper describes a new computational intelligence approach, based on a genetic algorithm (GA), to detect the unusual variations of the total electron content (TEC) seismo-ionospheric anomalies induced by the powerful Solomon earthquake. The GA detected a considerable number of anomalous occurrences on the earthquake day and also 7 and 8 days prior to the earthquake, in a period of high geomagnetic activity. In this study, the TEC anomalies detected by the proposed method are also compared to the anomalies observed by applying the mean, median, wavelet, Kalman filter, ARIMA, neural network and support vector machine methods. The accordance in the final results of all eight methods is a convincing indication of the efficiency of the GA method, and indicates that the GA can be an appropriate non-parametric tool for anomaly detection in nonlinear time series exhibiting seismo-ionospheric precursor variations.
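
    One of the baseline detectors the GA is compared against, the median method, flags samples outside a median ± k·IQR band. A minimal sketch with crude quartiles and made-up TEC values (k = 1.5 is a common but illustrative choice):

```python
import statistics

def iqr_anomalies(series, k=1.5):
    """Flag indices outside [median - k*IQR, median + k*IQR]."""
    s = sorted(series)
    q1, q3 = s[len(s) // 4], s[3 * len(s) // 4]   # crude quartiles
    med, iqr = statistics.median(series), q3 - q1
    lo, hi = med - k * iqr, med + k * iqr
    return [i for i, v in enumerate(series) if v < lo or v > hi]

# Hypothetical TEC samples (TECU) with one anomalous spike at index 6.
tec = [12.1, 12.4, 11.9, 12.2, 12.0, 12.3, 18.7, 12.1, 11.8, 12.2]
print(iqr_anomalies(tec))
```

Agreement between such simple bounds and the GA output is precisely the cross-validation argument the abstract makes.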

  10. Multi-scale structure and topological anomaly detection via a new network statistic: The onion decomposition

    PubMed Central

    Hébert-Dufresne, Laurent; Grochow, Joshua A.; Allard, Antoine

    2016-01-01

    We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks. PMID:27535466
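
    The onion decomposition described above can be implemented directly from the k-core peeling algorithm: repeatedly remove, in one wave, every vertex whose remaining degree does not exceed the current core index; each wave is one onion layer. An illustrative implementation on an adjacency-list toy graph:

```python
def onion_decomposition(adj):
    """Return (core, layer) for each vertex: peel all vertices at or
    below the current core index at once; each wave is one layer."""
    deg = {v: len(ns) for v, ns in adj.items()}
    alive = set(adj)
    core, layer, out = 0, 0, {}
    while alive:
        core = max(core, min(deg[v] for v in alive))
        layer += 1
        wave = [v for v in alive if deg[v] <= core]
        for v in wave:
            out[v] = (core, layer)
            alive.discard(v)
        for v in wave:
            for u in adj[v]:
                if u in alive:
                    deg[u] -= 1
    return out

# A triangle with a pendant vertex: the pendant peels off as layer 1
# (the 1-core), then the whole triangle peels together as the 2-core.
adj = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3]}
print(onion_decomposition(adj))
```

The per-vertex (core, layer) pairs are exactly what the onion spectrum aggregates, at no extra asymptotic cost over the plain k-core computation.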

  12. Anomaly detection using simulated MTI data cubes derived from HYDICE data

    SciTech Connect

    Moya, M.M.; Taylor, J.G.; Stallard, B.R.; Motomatsu, S.E.

    1998-07-01

    The US Department of Energy is funding the development of the Multispectral Thermal Imager (MTI), a satellite-based multispectral thermal imaging sensor scheduled for launch in October 1999. MTI is a research and development (R&D) platform to test the applicability of multispectral and thermal imaging technology for detecting and monitoring signs of proliferation of weapons of mass destruction. During its three-year mission, MTI will periodically record images of participating government, industrial and natural sites in fifteen visible and infrared spectral bands to provide a variety of image data associated with weapons production activities. The MTI satellite will have spatial resolution in the visible bands five times better, in each dimension, than that of Landsat TM, and will have five thermal bands. In this work, the authors quantify the separability between specific materials and the natural background by applying Receiver Operating Characteristic (ROC) analysis to the residual errors from a linear unmixing, and use this analysis to quantify the performance of the MTI. They describe the MTI imager and simulate its data by filtering HYDICE hyperspectral imagery both spatially and spectrally and by introducing atmospheric effects corresponding to the MTI satellite altitude. They compare and contrast the individual effects on performance of spectral resolution, spatial resolution, atmospheric corrections, and varying atmospheric conditions.

  15. Algorithms for Spectral Decomposition with Applications to Optical Plume Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.; Matthews, Bryan; Das, Santanu

    2008-01-01

    The analysis of spectral signals for features that represent physical phenomena is ubiquitous in the science and engineering communities. There are two main approaches to extracting relevant features from these high-dimensional data streams. The first relies on extracting features using a physics-based paradigm, in which the underlying physical mechanism that generates the spectra is used to infer the most important features in the data stream. We focus on a complementary methodology: a data-driven technique that is informed by the underlying physics but also has the ability to adapt to unmodeled system attributes and dynamics. We discuss four algorithms, the Spectral Decomposition Algorithm (SDA), Non-Negative Matrix Factorization (NMF), Independent Component Analysis (ICA) and Principal Components Analysis (PCA), and compare their performance on a spectral emulator which we use to generate artificial data with known statistical properties. This spectral emulator mimics the real-world phenomena arising from the plume of the Space Shuttle Main Engine; it can be used to validate the results of various spectral decomposition algorithms, and is particularly useful because real-world systems have very low probabilities of fault or failure. Our results indicate that methods like SDA and NMF provide a straightforward way of incorporating prior physical knowledge, while NMF with a tuning mechanism can give superior performance on some tests. We demonstrate these algorithms by detecting potential system-health issues in data from a spectral emulator with tunable health parameters.
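    As a toy illustration of the data-driven comparison described, the sketch below generates synthetic two-component spectra and factorizes them with NMF (Lee-Seung multiplicative updates) and PCA (SVD of centred data). The Gaussian "lines", mixing weights and noise model are stand-ins for the paper's physics-based spectral emulator, and SDA/ICA are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(0.0, 1.0, 200)

# Two synthetic emission "lines" standing in for plume species signatures;
# a real spectral emulator would encode the engine physics instead.
basis = np.vstack([
    np.exp(-((wavelengths - 0.3) / 0.05) ** 2),
    np.exp(-((wavelengths - 0.7) / 0.05) ** 2),
])
abundances = rng.uniform(0.1, 1.0, size=(100, 2))       # nonnegative mixing weights
X = abundances @ basis + 0.01 * rng.random((100, 200))  # noisy nonnegative spectra

# --- NMF via multiplicative updates: X ~= W H with W, H >= 0 ---
W = rng.random((100, 2)) + 0.1
H = rng.random((2, 200)) + 0.1
for _ in range(300):
    H *= (W.T @ X) / (W.T @ W @ H + 1e-12)
    W *= (X @ H.T) / (W @ H @ H.T + 1e-12)
nmf_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)

# --- PCA via SVD of the mean-centred data ---
Xc = X - X.mean(axis=0)
s = np.linalg.svd(Xc, compute_uv=False)
pca_var_2 = (s[:2] ** 2).sum() / (s ** 2).sum()   # variance captured by 2 PCs

print(f"NMF relative error: {nmf_err:.3f}, PCA 2-component variance: {pca_var_2:.3f}")
```

    With well-separated components like these, both factorizations recover the two underlying sources: the NMF residual shrinks to roughly the noise level, and two principal components capture nearly all the variance.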

  16. Structural Anomalies Detected in Ceramic Matrix Composites Using Combined Nondestructive Evaluation and Finite Element Analysis (NDE and FEA)

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Baaklini, George Y.; Bhatt, Ramakrishna T.

    2003-01-01

    and the experimental data. Furthermore, modeling of the voids collected via NDE offered an analytical advantage that resulted in more accurate assessments of the material's structural strength. The top figure shows a CT scan image of the specimen test section, illustrating various hidden structural entities in the material, and an optical image of the test specimen considered in this study. The bottom figure represents the stress response predicted from the finite element analyses (ref. 3) for a selected CT slice, clearly illustrating that the high stress risers due to voids in the material correspond with those predicted by the NDE. This study is continuing, and efforts are concentrated on improving the modeling capabilities to represent the structural anomalies as detected.

  17. Bangui Anomaly

    NASA Technical Reports Server (NTRS)

    Taylor, Patrick T.

    2004-01-01

    Bangui anomaly is the name given to one of the Earth's largest crustal magnetic anomalies, and the largest over the African continent. It covers two-thirds of the Central African Republic, and the name therefore derives from the capital city, Bangui, which is near the center of the feature. From surface magnetic survey data, Godivier and Le Donche (1962) were the first to describe this anomaly. Subsequently, high-altitude world magnetic surveying by the U.S. Naval Oceanographic Office (Project Magnet) recorded a greater than 1000 nT dipolar, peak-to-trough anomaly, the major portion being negative (figure 1). Satellite observations (Cosmos 49) were first reported in 1964; these revealed a 40 nT anomaly at 350 km altitude. The higher altitude (417-499 km) POGO (Polar Orbiting Geomagnetic Observatory) satellite data subsequently recorded peak-to-trough anomalies of 20 nT; these data were combined with the Cosmos 49 measurements by Regan et al. (1975) for a regional satellite-altitude map. In October 1979, with the launch of Magsat, a satellite designed to measure crustal magnetic anomalies, a more uniform satellite-altitude magnetic map was obtained. These data, computed at 375 km altitude, recorded a -22 nT anomaly (figure 2). This elliptically shaped anomaly is approximately 760 by 1000 km and is centered at 6° N, 18° E. The Bangui anomaly is composed of three segments: two positive lobes north and south of a large central negative field. This displays the classic pattern of an anomalous body magnetized by induction in a zero-inclination field, which is not surprising since the magnetic equator passes near the center of the body.

  18. Anomaly Monitoring Method for Key Components of Satellite

    PubMed Central

    Fan, Linjun; Xiao, Weidong; Tang, Jun

    2014-01-01

    This paper presents a fault diagnosis method for key components of a satellite, called the Anomaly Monitoring Method (AMM), which is made up of state estimation based on Multivariate State Estimation Techniques (MSET) and anomaly detection based on the Sequential Probability Ratio Test (SPRT). On the basis of failure analysis of lithium-ion batteries (LIBs), we divided LIB failures into internal failure, external failure, and thermal runaway, and selected the electrolyte resistance (Re) and the charge transfer resistance (Rct) as the key parameters for state estimation. Then, from actual in-orbit telemetry data for the key parameters of the LIBs, we obtained the actual residual value (RX) and healthy residual value (RL) of the LIBs through MSET state estimation, and detected anomalous states from these residual values using SPRT anomaly detection. Lastly, we conducted an example of AMM for LIBs and, according to the results, validated the feasibility and effectiveness of AMM by comparing it with the results of the threshold detection method (TDM). PMID:24587703
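    The SPRT stage of such a pipeline is classical Wald sequential testing on the estimation residuals. A minimal sketch for Gaussian residuals follows; all parameter values (mu0, mu1, sigma, alpha, beta) are chosen for illustration and are not taken from the paper:

```python
import math

def sprt(residuals, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test on Gaussian residuals.

    H0: residual mean mu0 (healthy); H1: residual mean mu1 (degraded).
    alpha/beta are the target false-alarm and missed-detection rates.
    Returns ("H0" | "H1", index of the deciding sample), or
    ("undecided", len(residuals)) if no boundary is crossed.
    """
    lower = math.log(beta / (1.0 - alpha))   # accept-H0 boundary
    upper = math.log((1.0 - beta) / alpha)   # accept-H1 boundary
    llr = 0.0
    for i, r in enumerate(residuals):
        # log-likelihood-ratio increment for one N(mu, sigma^2) sample
        llr += (mu1 - mu0) * (r - (mu0 + mu1) / 2.0) / sigma**2
        if llr >= upper:
            return "H1", i
        if llr <= lower:
            return "H0", i
    return "undecided", len(residuals)

print(sprt([1.2] * 20))   # persistently biased residuals
print(sprt([0.0] * 20))   # healthy residuals
```

    Unlike a fixed threshold on each sample, the SPRT accumulates evidence across samples, which is why it tolerates noisy residuals better than the threshold method the paper compares against.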

  19. Space-borne detection of volcanic carbon dioxide anomalies: The importance of ground-based validation networks

    NASA Astrophysics Data System (ADS)

    Schwandner, F. M.; Carn, S. A.; Corradini, S.; Merucci, L.; Salerno, G.; La Spina, A.

    2012-04-01

    We have investigated the feasibility of space-borne detection of volcanic carbon dioxide (CO2) anomalies, and their integration with ground-based observations. Three goals motivate this integration: (a) development of new volcano monitoring techniques with better spatial and temporal coverage, because pre-eruptive volcanic CO2 emissions are potentially the earliest available indicators of volcanic unrest; (b) improvement of the currently very poor global CO2 source-strength inventory for volcanoes; and (c) use of volcanic CO2 emissions for high-altitude strong point-source emission and dispersion studies. (1) Feasibility of space-borne detection of volcanic CO2 anomalies. Volcanoes are highly variable but continuous CO2 emitters, distributed globally, and emissions often occur at high altitudes. To detect strong point sources of CO2 from space, several hurdles have to be overcome: orographic clouds, unknown dispersion behavior, a high CO2 background in the troposphere, and sparse data coverage from existing satellite sensors. These obstacles can be overcome by a small field of view, enhanced spectral resolving power, and by employing repeat target-mode observation strategies. The Japanese GOSAT instrument has been operational since January 2009, producing CO2 total column measurements with a repeat cycle of 3 days and a field of view of 10 km. GOSAT thus has the potential to provide spatially integrated data for entire volcanic edifices, especially in target mode. Since summer 2010 we have conducted repeated target-mode observations of over 20 persistently active volcanoes worldwide, including Etna (Italy), Erta Ale (Ethiopia), and Ambrym (Vanuatu), using L2 GOSAT FTS SWIR data. One of our best-studied test cases is Mt. Etna on Sicily (Italy), which reawakened in 2011 after a period of quiescence and produced a sequence of eruptive activities including lava-fountaining events, coinciding with target-mode GOSAT observations conducted there since 2010.

  20. SADM potentiometer anomaly investigations

    NASA Astrophysics Data System (ADS)

    Wood, Brian; Mussett, David; Cattaldo, Olivier; Rohr, Thomas

    2005-07-01

    During the last 3 years, Contraves Space has been developing a low-power (1-2 kW) Solar Array Drive Mechanism (SADM) aimed at small-series production. The mechanism was subjected to two test programmes in order to qualify the SADM to acceptable levels. During the two test programmes, anomalies were experienced with the potentiometers provided by Eurofarad SA, and joint investigations were undertaken to resolve why these anomalies had occurred. This paper deals with the lessons learnt from the failure investigation of the two Eurofarad (rotary) potentiometer anomalies. The rotary potentiometers used were fully redundant, using two back-to-back mounted "plastic tracks". The unit is of pancake configuration, mounted directly to the shaft of the slip-ring assembly at the extreme inboard end of the SADM, and has no internal bearings. The anomaly initially manifested itself as a loss of performance in terms of linearity, first detected during thermal-vacuum testing. A subsequent anomaly manifested itself as the complete failure of the redundant potentiometer, again during thermal-vacuum testing. This paper follows and details the chain of events following this anomaly and identifies corrective measures to be applied to the potentiometer design and assembly process.

  1. Detection of occult infection following total joint arthroplasty using sequential technetium-99m HDP bone scintigraphy and indium-111 WBC imaging

    SciTech Connect

    Johnson, J.A.; Christie, M.J.; Sandler, M.P.; Parks, P.F. Jr.; Homra, L.; Kaye, J.J.

    1988-08-01

    Preoperative exclusion or confirmation of periprosthetic infection is essential for correct surgical management of patients with suspected infected joint prostheses. The sensitivity and specificity of 111In-WBC imaging in the diagnosis of infected total joint prostheses was examined in 28 patients and compared with sequential 99mTc-HDP/111In-WBC scintigraphy and aspiration arthrography. The sensitivity of preoperative aspiration cultures was 12%, with a specificity of 81% and an accuracy of 58%. The sensitivity of 111In-WBC imaging alone was 100%, with a specificity of 50% and an accuracy of 65%. When correlated with the bone scintigraphy and read as sequential 99mTc-HDP/111In-WBC imaging, the sensitivity was 88%, specificity 95%, and accuracy 93%. This study demonstrates that 111In-WBC imaging is an extremely sensitive imaging modality for the detection of occult infection of joint prostheses. It also demonstrates the necessity of correlating 111In-WBC images with 99mTc-HDP skeletal scintigraphy in the detection of occult periprosthetic infection.
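    The sensitivity/specificity/accuracy figures quoted in such studies follow mechanically from the four confusion-matrix counts. A sketch is below; the per-cell counts are chosen to reproduce the reported 88%/95%/93% over 28 patients and are an assumption, since the abstract does not give the actual breakdown:

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and accuracy from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, accuracy

# Hypothetical breakdown of 28 patients (8 infected, 20 not infected):
sens, spec, acc = screening_metrics(tp=7, fp=1, fn=1, tn=19)
print(f"sensitivity={sens:.0%} specificity={spec:.0%} accuracy={acc:.0%}")
```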

  2. A comparison of classical and intelligent methods to detect potential thermal anomalies before the 11 August 2012 Varzeghan, Iran, earthquake (Mw = 6.4)

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-04-01

    In this paper, a number of classical and intelligent methods, including interquartile, autoregressive integrated moving average (ARIMA), artificial neural network (ANN) and support vector machine (SVM), have been proposed to quantify potential thermal anomalies around the time of the 11 August 2012 Varzeghan, Iran, earthquake (Mw = 6.4). The data set, which comprises Aqua-MODIS land surface temperature (LST) night-time snapshot images, covers 62 days. In order to quantify variations of the LST data obtained from satellite images, air temperature (AT) data from the meteorological station close to the earthquake epicenter have been taken into account. For the models examined here, results indicate the following: (i) ARIMA models, the most widely used in the time series community for short-term forecasting, are quickly and easily implemented, and can act efficiently through linear solutions. (ii) A multilayer perceptron (MLP) feed-forward neural network can be a suitable non-parametric method to detect anomalous changes in a non-linear time series such as LST variations. (iii) Since SVMs are often used for their many advantages in classification and regression tasks, it can be shown that, if the difference between the value predicted by the SVM method and the observed value exceeds a pre-defined threshold, the observed value can be regarded as an anomaly. (iv) ANN and SVM methods can be powerful tools for modeling complex phenomena, such as earthquake-precursor time series, where the underlying data-generating process is unknown. There is good agreement among the results obtained from the different methods for quantifying potential anomalies in a given LST time series; this paper indicates that the detection of potential thermal anomalies derives credibility from the overall efficiencies and potentialities of the four integrated methods.
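    The decision rule in point (iii) — flag an observation when |observed - predicted| exceeds a pre-defined threshold — is easy to sketch. Here a sliding-window moving-average forecaster is a deliberately simple stand-in for the paper's trained ARIMA/ANN/SVM predictors, and the window length and k-sigma threshold are illustrative choices:

```python
import numpy as np

def detect_anomalies(series, window=7, k=2.0):
    """Flag observations whose deviation from a sliding-window forecast
    exceeds k standard deviations of the recent history, mirroring the
    rule: |observed - predicted| > threshold  =>  anomaly.
    """
    series = np.asarray(series, dtype=float)
    flags = np.zeros(len(series), dtype=bool)
    for t in range(window, len(series)):
        history = series[t - window:t]
        predicted = history.mean()           # stand-in for SVM/ANN forecast
        threshold = k * history.std(ddof=1)  # pre-defined threshold
        flags[t] = abs(series[t] - predicted) > threshold
    return flags

lst = [20.0] * 30
lst[25] = 28.0   # injected spike, e.g. a sudden LST jump
print(np.where(detect_anomalies(lst))[0])
```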

  3. AB086. Chromosomal microarray analysis—detection of both duplication and deletion in patients with multiple congenital anomalies and/or developmental delay

    PubMed Central

    Ee, Hui Jing; Yon, Hui Yi; Tan, Mui Li; Roch, Robin; Brett, Maggie; Yong, Min Hwee; Law, Hai Yang; Lai, Angeline

    2015-01-01

    Background and objective: Chromosomal microarray analysis (CMA) is recommended as first-tier genetic testing for patients with multiple congenital anomalies, developmental delay/intellectual disability and/or autism spectrum disorder. It detects chromosomal imbalance at a higher resolution than conventional chromosomal analysis. A CMA diagnostic service was launched in our hospital in February 2014. The aim of this report is to review the incidence of detecting both duplication and deletion in patients referred for this test. Methods: DNA was extracted using the Gentra Puregene Blood Kit. CMA was performed using the Agilent 4×180K CGH + SNP array and analysed with Agilent CytoGenomics. G-banding analysis was carried out on stimulated lymphocyte cultures. Targeted fluorescence in-situ hybridization (FISH) was performed using locus-specific probes. Results: From 1 February 2014 to 31 May 2015, a total of 205 patients were tested. Seven (3.4%) were identified as having both duplication and deletion of chromosomal segments that were pathogenic (five patients) or of uncertain clinical significance (two patients). We present the case of a 1-day-old Chinese girl with oligohydramnios, prematurity (35+5 weeks) and multiple congenital anomalies including heart defect, cleft palate, ear anomalies, microcephaly, vaginal skin tag, bilateral clinodactyly and wide anterior fontanelle. Karyotyping and FISH analysis for 22q11 deletion were normal. CMA revealed a pathogenic gain of 2.143 Mb at 16p13.3 and a pathogenic loss of 0.271 Mb at 16q24.2q24.3. The gain at 16p13.3 affects 67 genes including CREBBP. The 16p13.3 duplication syndrome is a contiguous gene syndrome characterized by normal to moderate intellectual disability, normal growth, mild arthrogryposis, frequently small and proximally implanted thumbs, characteristic facial features and, occasionally, developmental defects of the heart, genitalia, palate or eyes. The 0.271 Mb deletion at 16q24.3 affects four genes including ANKRD11 and CDH15. The clinical

  4. Enzyme leaching of surficial geochemical samples for detecting hydromorphic trace-element anomalies associated with precious-metal mineralized bedrock buried beneath glacial overburden in northern Minnesota

    USGS Publications Warehouse

    Clark, Robert J.; Meier, A.L.; Riddle, G.; ,

    1990-01-01

    One objective of the International Falls and Roseau, Minnesota, CUSMAP projects was to develop a means of conducting regional-scale geochemical surveys in areas where bedrock is buried beneath complex glacially derived overburden. Partial analysis of B-horizon soils offered hope for detecting subtle hydromorphic trace-element dispersion patterns. An enzyme-based partial leach selectively removes metals from oxide coatings on the surfaces of soil materials without attacking their matrix. Most trace-element concentrations in the resulting solutions are in the part-per-trillion to low part-per-billion range, necessitating determinations by inductively coupled plasma/mass spectrometry. The resulting data show greater contrasts for many trace elements than other techniques tested. Spatially, many trace-metal anomalies are locally discontinuous, but anomalous trends within larger areas are apparent. In many instances, the source for an anomaly seems to be either basal till or bedrock. Ground-water flow is probably the most important mechanism for transporting metals toward the surface, although ionic diffusion, electrochemical gradients, and capillary action may play a role in anomaly dispersal. Sample sites near the Rainy Lake-Seine River fault zone, a regional shear zone, often have anomalous concentrations of a variety of metals, commonly including Zn and/or one or more metals that substitute for Zn in sphalerite (Cd, Ge, Ga, and Sn). Shifts in background concentrations of Bi, Sb, and As show a trend across the area indicating a possible regional zoning of lode-Au mineralization. Soil anomalies of Ag, Co, and Tl parallel basement structures, suggesting areas that may have potential for Cobalt/Thunder Bay-type silver veins. An area around Baudette, Minnesota, which is underlain by quartz-chlorite-carbonate-altered shear zones, is anomalous in Ag, As, Bi, Co, Mo, Te, Tl, and W.
Anomalies of Ag, As, Bi, Te, and W tend to follow the fault zones, suggesting potential

  5. DOWN'S ANOMALY.

    ERIC Educational Resources Information Center

    PENROSE, L.S.; SMITH, G.F.

    BOTH CLINICAL AND PATHOLOGICAL ASPECTS AND MATHEMATICAL ELABORATIONS OF DOWN'S ANOMALY, KNOWN ALSO AS MONGOLISM, ARE PRESENTED IN THIS REFERENCE MANUAL FOR PROFESSIONAL PERSONNEL. INFORMATION PROVIDED CONCERNS (1) HISTORICAL STUDIES, (2) PHYSICAL SIGNS, (3) BONES AND MUSCLES, (4) MENTAL DEVELOPMENT, (5) DERMATOGLYPHS, (6) HEMATOLOGY, (7)…

  6. Uhl's anomaly.

    PubMed Central

    Vecht, R J; Carmichael, D J; Gopal, R; Philip, G

    1979-01-01

    Uhl's anomaly of the heart is a rare condition. Another well-documented case is presented with a review of the published reports outlining the main clinical features and the bad overall prognosis. Right atriotomy should be avoided if closure of the atrial septal defect is attempted. PMID:465242

  7. A colorimetric sensor for the sequential detection of Cu(2+) and CN(-) in fully aqueous media: practical performance of Cu(2+).

    PubMed

    You, Ga Rim; Park, Gyeong Jin; Lee, Jae Jun; Kim, Cheal

    2015-05-21

    A new highly selective colorimetric chemosensor 1, (E)-9-(((5-mercapto-1,3,4-thiadiazol-2-yl)imino)methyl)-2,3,6,7-tetrahydro-1H,5H-pyrido[3,2,1-ij]quinolin-8-ol, was designed and synthesized for the sequential detection of Cu(2+) and CN(-). Sensor 1 exhibited an obvious color change from yellow to orange in the presence of Cu(2+) in fully aqueous solution. The detection limit (0.9 μM) of 1 for Cu(2+) is far lower than the WHO limit (31.5 μM) for drinking water. In addition, the resulting Cu(2+)-2·1 complex can be further used to detect toxic cyanide through a color change from orange to yellow, indicating the recovery of 1 from Cu(2+)-2·1. Importantly, chemosensor 1 could be used to detect and quantify Cu(2+) in water samples, and a colorimetric test strip of 1 for the detection of Cu(2+) could be useful in practical applications. PMID:25900000

  8. SplicePie: a novel analytical approach for the detection of alternative, non-sequential and recursive splicing.

    PubMed

    Pulyakhina, Irina; Gazzoli, Isabella; 't Hoen, Peter A C; Verwey, Nisha; den Dunnen, Johan T; Aartsma-Rus, Annemieke; Laros, Jeroen F J

    2015-07-13

    Alternative splicing is a powerful mechanism present in eukaryotic cells to obtain a wide range of transcripts and protein isoforms from a relatively small number of genes. The mechanisms regulating (alternative) splicing and the paradigm of consecutive splicing have recently been challenged, especially for genes with a large number of introns. RNA-Seq, a powerful technology using deep sequencing to determine transcript structure and expression levels, is usually performed on mature mRNA and therefore does not allow detailed analysis of splicing progression. Sequencing pre-mRNA at different stages of splicing potentially provides insight into mRNA maturation. Although the number of tools that analyze total and cytoplasmic RNA in order to elucidate the transcriptome composition is rapidly growing, there are no tools specifically designed for the analysis of nuclear RNA (which contains mixtures of pre- and mature mRNA). We developed dedicated algorithms to investigate the splicing process. In this paper, we present a new classification of RNA-Seq reads based on three major stages of splicing: pre-, intermediate- and post-splicing. Applying this novel classification, we demonstrate the possibility of analyzing the order of splicing. Furthermore, we uncover the potential to investigate the multi-step nature of splicing, assessing various types of recursive splicing events. We provide data that give biological insight into the order of splicing and show that non-sequential splicing of certain introns is reproducible and coincides across multiple cell lines. We validated our observations with independent experimental technologies and showed the reliability of our method. The pipeline, named SplicePie, is freely available at: https://github.com/pulyakhina/splicing_analysis_pipeline. The example data can be found at: https://barmsijs.lumc.nl/HG/irina/example_data.tar.gz. PMID:25800735

  10. Using a combination of MLPA kits to detect chromosomal imbalances in patients with multiple congenital anomalies and mental retardation is a valuable choice for developing countries.

    PubMed

    Jehee, Fernanda Sarquis; Takamori, Jean Tetsuo; Medeiros, Paula F Vasconcelos; Pordeus, Ana Carolina B; Latini, Flavia Roche M; Bertola, Débora Romeo; Kim, Chong Ae; Passos-Bueno, Maria Rita

    2011-01-01

    Conventional karyotyping detects anomalies in 3-15% of patients with multiple congenital anomalies and mental retardation (MCA/MR). Whole-genome array screening (WGAS) has been consistently suggested as the first-choice diagnostic test for this group of patients, but it is very costly for large-scale use in developing countries. We evaluated the use of a combination of Multiplex Ligation-dependent Probe Amplification (MLPA) kits to increase the detection rate of chromosomal abnormalities in MCA/MR patients. We screened 261 MCA/MR patients with two subtelomeric kits and one microdeletion kit, which would theoretically detect up to 70% of all submicroscopic abnormalities. Additionally, we scored the de Vries score for 209 patients in an effort to find a suitable cut-off for MLPA screening. Our results reveal that chromosomal abnormalities were present in 87 (33.3%) patients, but only 57 (21.8%) were considered causative. Karyotyping detected 15 abnormalities (6.9%), while MLPA identified 54 (20.7%). Our combined MLPA screening thus increased the number of pathogenic imbalances detected more than threefold compared with conventional karyotyping. We also show that using the de Vries score as a cut-off for this screening would only be suitable under financial restrictions. A decision analytic model was constructed with three possible strategies: karyotype, karyotype + MLPA and karyotype + WGAS. The karyotype + MLPA strategy detected anomalies in 19.8% of cases, which accounts for 76.45% of the expected yield for karyotype + WGAS. The Incremental Cost-Effectiveness Ratio (ICER) of MLPA is three times lower than that of WGAS, which means that, for the same costs, we have three additional diagnoses with MLPA but only one with WGAS. We list all causative alterations found, including rare findings such as reciprocal duplications of regions deleted in Sotos and Williams-Beuren syndromes.
We also describe imbalances that were considered polymorphisms or rare variants, such as the new SNP
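The cost-effectiveness comparison in the abstract above can be sketched numerically. A minimal sketch, assuming hypothetical per-patient costs (the abstract gives yields, not costs); the detection yields are taken from the abstract (6.9% for karyotype, 19.8% for karyotype + MLPA, and ~25.9% for karyotype + WGAS, since 19.8% is stated to be 76.45% of the expected WGAS yield):

```python
# ICER = incremental cost / incremental diagnostic yield, versus karyotype alone.
# Costs are hypothetical placeholders; yields are from the abstract.

def icer(cost, diag_yield, base_cost, base_yield):
    """Incremental cost-effectiveness ratio relative to a baseline strategy."""
    return (cost - base_cost) / (diag_yield - base_yield)

karyotype      = {"cost": 100.0, "yield": 0.069}   # 6.9% detection
karyotype_mlpa = {"cost": 250.0, "yield": 0.198}   # 19.8% detection
karyotype_wgas = {"cost": 700.0, "yield": 0.259}   # ~76.45% ratio implies ~25.9%

icer_mlpa = icer(karyotype_mlpa["cost"], karyotype_mlpa["yield"],
                 karyotype["cost"], karyotype["yield"])
icer_wgas = icer(karyotype_wgas["cost"], karyotype_wgas["yield"],
                 karyotype["cost"], karyotype["yield"])
```

With these illustrative costs the MLPA strategy has the lower ICER, i.e., each additional diagnosis costs less, which is the comparison the abstract reports.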

  11. Diagnostic accuracy of VIA and HPV detection as primary and sequential screening tests in a cervical cancer screening demonstration project in India.

    PubMed

    Basu, Partha; Mittal, Srabani; Banerjee, Dipanwita; Singh, Priyanka; Panda, Chinmay; Dutta, Sankhadeep; Mandal, Ranajit; Das, Pradip; Biswas, Jaydip; Muwonge, Richard; Sankaranarayanan, Rengaswamy

    2015-08-15

    Visual inspection after acetic acid application (VIA) and human papillomavirus (HPV) detection tests have been recommended for cervical cancer screening in low- and middle-income countries. A demonstration project in rural India screened 39,740 women with both tests to compare their accuracies in a real population setting. The project also evaluated a model of screening women in existing primary health care facilities, evaluating screen-positive women with colposcopy (and biopsy) in the same setup, and recalling women diagnosed with disease for treatment at a tertiary center. The accuracy of VIA and the HPV test used sequentially was also studied. VIA was performed by trained health workers, and the Hybrid Capture II (HC II) assay was used for oncogenic HPV detection. Test positivity was 7.1% for VIA and 4.7% for HC II. The detection rate of CIN 3+ disease was significantly higher with HC II than with VIA. The sensitivities of VIA and HC II to detect 162 histology-proven CIN 3+ lesions were 67.9 and 91.2%, respectively, after adjusting for verification bias. Specificity for the same disease outcome, with verification-bias correction, was 93.2% for VIA and 96.9% for HC II. Triaging VIA-positive women with the HPV test would have considerably improved the positive predictive value (from 4.0 to 37.5% for detecting CIN 3+) without a significant drop in sensitivity. All VIA-positive women and 74.0% of HC II-positive women had colposcopy. Compliance with treatment was high, and there was a significant stage shift of the screen-detected cancers towards earlier stages.
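The accuracy measures quoted above follow directly from a 2x2 table of test result versus true disease status. A minimal sketch with hypothetical counts chosen to roughly reproduce the VIA figures in the abstract (110 of 162 CIN 3+ lesions detected, ~93% specificity, ~4% PPV); the triage test's sensitivity and specificity are likewise invented:

```python
# Sensitivity, specificity, and PPV from a 2x2 screening table (invented counts).

def accuracy_measures(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return sensitivity, specificity, ppv

sens, spec, ppv = accuracy_measures(tp=110, fp=2700, fn=52, tn=37000)

# Sequential triage: a second, more specific test applied only to the positives
# raises PPV because it operates on a disease-enriched subgroup.
triage_tp = 110 * 0.95            # hypothetical triage sensitivity 95%
triage_fp = 2700 * (1 - 0.935)    # hypothetical triage specificity 93.5%
triage_ppv = triage_tp / (triage_tp + triage_fp)
```

In a low-prevalence population even a specific primary test yields a low PPV, which is why the abstract's sequential VIA-then-HPV strategy improves PPV so sharply.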

  12. Synthesis and Application of an Aldazine-Based Fluorescence Chemosensor for the Sequential Detection of Cu2+ and Biological Thiols in Aqueous Solution and Living Cells

    PubMed Central

    Jia, Hongmin; Yang, Ming; Meng, Qingtao; He, Guangjie; Wang, Yue; Hu, Zhizhi; Zhang, Run; Zhang, Zhiqiang

    2016-01-01

    A fluorescence chemosensor, 2-hydroxy-1-naphthaldehyde azine (HNA), was designed and synthesized for sequential detection of Cu2+ and biothiols. It was found that HNA can specifically bind Cu2+ with 1:1 stoichiometry, accompanied by dramatic fluorescence quenching and a remarkable bathochromic shift of the absorbance peak in HEPES buffer. The generated HNA-Cu2+ ensemble displayed a “turn-on” fluorescent response specific for biothiols (Hcy, Cys and GSH) based on the displacement approach, giving a remarkable recovery of the fluorescence and UV-Vis spectra. The detection limits of HNA-Cu2+ for Hcy, Cys and GSH were estimated to be 1.5 μM, 1.0 μM and 0.8 μM, respectively, suggesting that HNA-Cu2+ is sensitive enough for the determination of thiols in biological systems. The biocompatibility of HNA towards A549 human lung carcinoma cells was evaluated by an MTT assay. The capability of HNA-Cu2+ to detect biothiols in live A549 cells was then demonstrated by a fluorescence microscopy imaging assay. PMID:26761012

  13. An Adaptive Network-based Fuzzy Inference System for the detection of thermal and TEC anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake of 11 August 2012

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-09-01

    Anomaly detection is extremely important for forecasting the date, location and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS) is proposed to detect the thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake of 11 August 2012 in NW Iran. ANFIS is a well-known hybrid neuro-fuzzy network for modeling non-linear complex systems. The thermal and TEC anomalies detected using the proposed method are also compared with the anomalies observed by applying classical and intelligent methods, including Interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN) and Support Vector Machine (SVM) methods. The dataset, comprising Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), spans 62 days. If the difference between the value predicted by the ANFIS method and the observed value exceeds a pre-defined threshold, then, in the absence of non-seismic effective parameters, the observed value can be regarded as a precursory anomaly. For the two precursors, LST and TEC, the ANFIS method shows very good agreement with the other implemented classical and intelligent methods, indicating that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. The detection of the thermal and TEC anomalies derives its credibility from the overall efficiency and potential of the five integrated methods.
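The residual-thresholding rule described above (flag an observation when the gap between prediction and observation exceeds a pre-defined threshold) can be sketched in a few lines. This is not the ANFIS predictor itself; a trivial moving-average forecaster stands in for it, and the series and threshold multiplier are invented:

```python
import numpy as np

# Flag time steps where |observed - predicted| exceeds k times the recent scatter.
# A moving average is a stand-in predictor; ANFIS would supply better forecasts.

def detect_anomalies(series, window=5, k=2.0):
    series = np.asarray(series, dtype=float)
    flags = []
    for t in range(window, len(series)):
        history = series[t - window:t]
        predicted = history.mean()                 # stand-in one-step prediction
        threshold = k * history.std(ddof=1)        # pre-defined threshold
        flags.append(abs(series[t] - predicted) > threshold)
    return np.array(flags)

values = [10, 11, 10, 12, 11, 10, 11, 25, 10, 11]  # synthetic series, spike at index 7
flags = detect_anomalies(values)
```

Only the spike exceeds the threshold; the ordinary day-to-day variation does not, which is the behavior the precursor-detection rule relies on.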

  14. Investigation of the collision line broadening problem as applicable to the NASA Optical Plume Anomaly Detection (OPAD) system, phase 1

    NASA Technical Reports Server (NTRS)

    Dean, Timothy C.; Ventrice, Carl A.

    1995-01-01

    As a final report for phase 1 of the project, the researchers are submitting to the Tennessee Tech Office of Research the following two papers (reprinted in this report): 'Collision Line Broadening Effects on Spectrometric Data from the Optical Plume Anomaly System (OPAD),' presented at the 30th AIAA/ASME/SAE/ASEE Joint Propulsion Conference, 27-29 June 1994, and 'Calculation of Collision Cross Sections for Atomic Line Broadening in the Plume of the Space Shuttle Main Engine (SSME),' presented at the IEEE Southeastcon '95, 26-29 March 1995. These papers fully state the problem and the progress made up to the end of NASA Fiscal Year 1994. The NASA OPAD system was devised to predict concentrations of anomalous species in the plume of the Space Shuttle Main Engine (SSME) through analysis of spectrometric data. The self absorption of the radiation of these plume anomalies is highly dependent on the line shape of the atomic transition of interest. The Collision Line Broadening paper discusses the methods used to predict line shapes of atomic transitions in the environment of a rocket plume. The Voigt profile is used as the line shape factor since both Doppler and collisional line broadening are significant. Methods used to determine the collisional cross sections are discussed and the results are given and compared with experimental data. These collisional cross sections are then incorporated into the current self absorbing radiative model and the predicted spectrum is compared to actual spectral data collected from the Stennis Space Center Diagnostic Test Facility rocket engine. The second paper included in this report investigates an analytical method for determining the cross sections for collision line broadening by molecular perturbers, using effective central force interaction potentials. These cross sections are determined for several atomic species with H2, one of the principal constituents of the SSME plume environment, and compared with experimental data.
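The Voigt profile mentioned above is the convolution of a Gaussian (Doppler broadening) with a Lorentzian (collisional broadening). A minimal numerical sketch on a discrete grid, with illustrative width parameters (sigma and gamma are not taken from the papers):

```python
import numpy as np

# Voigt line shape as a discrete Gaussian-Lorentzian convolution (illustrative widths).

def voigt_profile(x, sigma, gamma):
    dx = x[1] - x[0]
    gauss = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    lorentz = (gamma / np.pi) / (x**2 + gamma**2)
    v = np.convolve(gauss, lorentz, mode="same") * dx   # discrete convolution
    return v / (v.sum() * dx)                            # renormalize to unit area

x = np.linspace(-20.0, 20.0, 4001)
v = voigt_profile(x, sigma=1.0, gamma=0.5)
```

Production line-shape codes typically evaluate the Voigt function analytically via the complex error function rather than by brute-force convolution, but the convolution form makes the physical origin explicit.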

  15. New Peak Temperature Constraints Using RSCM Geothermometry on Lucia Subterrane in Franciscan Complex (California, USA): Detection of Thermal Anomalies in Gold-Bearing Quartz Veins Surrounding.

    NASA Astrophysics Data System (ADS)

    Lahfid, A.; Delchini, S.; Lacroix, B.

    2015-12-01

    Deposits hosted by metasediments rich in carbonaceous material are widespread. In this study we therefore investigate the potential of Raman Spectroscopy of Carbonaceous Material (RSCM) geothermometry to detect thermal anomalies in hydrothermal ore-deposit environments, and demonstrate the ability of warm fluids migrating through a sedimentary sequence to locally disturb the thermal gradient and the associated peak temperatures. For this purpose we chose the Lucia subterrane of the Franciscan Complex (California, USA), which includes gold-bearing quartz veins that record a hydrothermal overprint (Underwood et al., 1995). The sediments in this zone essentially comprise greywacke and shale-matrix mélange (e.g. Frey and Robinson, 1999), which have undergone high-pressure, low-temperature metamorphism. The thermal history of the Lucia subterrane was previously characterized by Underwood et al. (1995), essentially using the vitrinite reflectance method (Rm). Rm values increase from south to north, varying between 0.9 and 3.7% (~150-280°C). These results suggest that the Lucia subterrane underwent a regional increase of thermal gradient toward the north. Anomalous Rm values from 4.5% to 4.9% (~305-315°C) are recorded near Cape San Martin; these highest estimated temperatures are likely associated with a late hydrothermal event (Underwood et al., 1995). The estimated Raman temperatures (1) confirm the northward increase in metamorphic grade already shown by Underwood et al. (1995) using classical methods such as mineralogy and vitrinite reflectance, and (2) exhibit anomalous values (temperatures reaching 350°C), probably due to the later hydrothermal event. This result suggests that RSCM can be used as a reliable tool to detect thermal anomalies caused by hot fluid flow.

  16. Sequential interval motif search: unrestricted database surveys of global MS/MS data sets for detection of putative post-translational modifications.

    PubMed

    Liu, Jian; Erassov, Alexandre; Halina, Patrick; Canete, Myra; Nguyen, Dinh Vo; Chung, Clement; Cagney, Gerard; Ignatchenko, Alexandr; Fong, Vincent; Emili, Andrew

    2008-10-15

    Tandem mass spectrometry is the prevailing approach for large-scale peptide sequencing in high-throughput proteomic profiling studies. Effective database search engines have been developed to identify peptide sequences from MS/MS fragmentation spectra. Since proteins are polymorphic and subject to post-translational modifications (PTM), however, computational methods for detecting unanticipated variants are also needed to achieve true proteome-wide coverage. In contrast to existing "unrestrictive" search tools, the novel algorithm we present, termed SIMS (for Sequential Motif Interval Search), interprets pairs of product ion peaks, representing potential amino acid residues or "intervals", as a means of mapping PTMs or substitutions in a blind database search mode. An effective heuristic software program was likewise developed to evaluate, rank, and filter optimal combinations of relevant intervals to identify candidate sequences, and any associated PTM or polymorphism, from large collections of MS/MS spectra. The prediction performance of SIMS was benchmarked extensively against annotated reference spectral data sets and compared favorably with, and was complementary to, current state-of-the-art methods. An exhaustive discovery screen using SIMS also revealed thousands of previously overlooked putative PTMs in a compendium of yeast protein complexes and in a proteome-wide map of adult mouse cardiomyocytes. We demonstrate that SIMS, freely accessible for academic research use, addresses gaps in current proteomic data interpretation pipelines, improving overall detection coverage, and facilitating comprehensive investigations of the fundamental multiplicity of the expressed proteome.
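The "interval" idea above can be sketched concretely: the m/z difference between two product-ion peaks may match the monoisotopic mass of an amino acid residue, while an unexplained difference hints at a modification. The residue masses below are standard monoisotopic values; the peak list is invented (the final gap equals a serine residue plus the 79.96633 Da mass of a phosphate modification):

```python
# Label adjacent peak gaps with residues; unmatched gaps suggest a PTM.

RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "L": 113.08406, "D": 115.02694,
}

def match_intervals(peaks, tol=0.01):
    """Label each adjacent peak gap with a residue, or 'mod?' if none fits."""
    labels = []
    for lo, hi in zip(peaks, peaks[1:]):
        gap = hi - lo
        hit = next((aa for aa, m in RESIDUE_MASS.items()
                    if abs(gap - m) < tol), None)
        labels.append(hit or "mod?")
    return labels

# Hypothetical ion ladder: G, A, then a ~167.0 Da gap (phospho-serine).
peaks = [200.0, 257.02146, 328.05857, 495.05693]
labels = match_intervals(peaks)
```

The real algorithm must additionally rank and filter combinations of such intervals across noisy spectra, which is where the heuristics described in the abstract come in.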

  17. A sequential method for passive detection, characterization, and localization of multiple low probability of intercept LFMCW signals

    NASA Astrophysics Data System (ADS)

    Hamschin, Brandon M.

    A method for passive Detection, Characterization, and Localization (DCL) of multiple low power, Linear Frequency Modulated Continuous Wave (LFMCW) (i.e., Low Probability of Intercept (LPI)) signals is proposed. We demonstrate, via simulation, laboratory, and outdoor experiments, that the method is able to detect and correctly characterize the parameters that define two simultaneous LFMCW signals with probability greater than 90% when the signal to noise ratio is -10 dB or greater. While this performance is compelling, it is far from the Cramer-Rao Lower Bound (CRLB), which we derive, and the performance of the Maximum Likelihood Estimator (MLE), whose performance we simulate. The loss in performance relative to the CRLB and the MLE is the price paid for computational tractability. The LFMCW signal is the focus of this work because of its common use in modern, low-cost radar systems. In contrast to other detection and characterization approaches, such as the MLE and those based on the Wigner-Ville Transform (WVT) or the Wigner-Ville Hough Transform (WVHT), our approach does not begin with a parametric model of the received signal that is specified directly in terms of its LFMCW constituents. Rather, we analyze the signal over time intervals that are short, non-overlapping, and contiguous by modeling it within these intervals as a sum of a small number of sinusoidal (i.e., harmonic) components with unknown frequencies, deterministic but unknown amplitudes, unknown order (i.e., number of harmonic components), and unknown noise autocorrelation function. It is this model of the data that makes the solution computationally feasible, but also what leads to a degradation in performance since estimates are not based on the full time series. By modeling the signal in this way, we reliably detect the presence of multiple LFMCW signals in colored noise without the need for prewhitening, efficiently estimate (i.e., characterize) their parameters, provide estimation error
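The short-interval idea above can be illustrated in a simplified, noise-free form (not the paper's estimator): split the record into short, contiguous, non-overlapping blocks, take the dominant FFT frequency in each, and fit a line to the frequency track. For an LFMCW signal the slope of that line estimates the chirp rate. All parameters here are invented for illustration:

```python
import numpy as np

# Estimate the chirp rate of a synthetic LFMCW signal from per-block FFT peaks.

fs = 8000.0                        # sample rate (Hz)
f0, rate = 500.0, 1000.0           # start frequency (Hz) and chirp rate (Hz/s)
t = np.arange(int(fs * 1.0)) / fs  # 1 s record
x = np.cos(2 * np.pi * (f0 * t + 0.5 * rate * t**2))

block = 256
freqs, times = [], []
for k in range(len(x) // block):
    seg = x[k * block:(k + 1) * block]
    spec = np.abs(np.fft.rfft(seg * np.hanning(block)))
    freqs.append(np.argmax(spec) * fs / block)     # dominant frequency in the block
    times.append((k + 0.5) * block / fs)           # block center time

slope, intercept = np.polyfit(times, freqs, 1)     # slope approximates the chirp rate
```

The frequency estimates are quantized to the FFT bin width (fs/block = 31.25 Hz here), which is one concrete face of the tractability-versus-accuracy trade-off the abstract describes.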

  18. Sequential monitoring of chimerism and detection of minimal residual disease after allogeneic blood stem cell transplantation (BSCT) using multiplex PCR amplification of short tandem repeat-markers.

    PubMed

    Thiede, C; Bornhäuser, M; Oelschlägel, U; Brendel, C; Leo, R; Daxberger, H; Mohr, B; Florek, M; Kroschinsky, F; Geissler, G; Naumann, R; Ritter, M; Prange-Krex, G; Lion, T; Neubauer, A; Ehninger, G

    2001-02-01

    Sequential analysis of chimerism after allogeneic blood stem cell transplantation (BSCT) has been shown to be predictive for graft failure and relapse. We have explored the impact of a novel approach for the quantitative determination of chimerism using a commercial PCR assay with multiplex amplification of nine STR loci and fluorescence detection. The feasibility was studied in 121 patients transplanted from related or unrelated donors. Follow-up investigation was performed in 88 patients. Twenty-eight of these patients had received a transplantation after dose-reduced conditioning therapy. Results were compared to data obtained by FISH analysis in a subgroup of patients receiving grafts from sex-mismatched donors. The analysis was possible in all patients; the median number of informative alleles was 4 (range 1-8) in the related and 7 (range 1-9) in the unrelated setting. A good correlation was seen in 84 samples from 14 patients analyzed in parallel with STR-PCR and FISH. Decreasing values of donor chimerism were detected prior to or concomitantly with the occurrence of graft failure and relapse of disease in all patients investigated prospectively. Using FACS-sorted material, e.g. peripheral blood CD34+ cells, the assay permitted the detection of residual recipient cells with high sensitivity (down to one CD34+ Kasumi cell in 40,000 normal WBC). Evaluation of the inter-laboratory reproducibility revealed that in 20 samples analyzed in three different centers, the median coefficient of variation was 2.1% (range 0.7-9.6%). Taken together, the results support the use of the test as a valuable tool in the follow-up of patients undergoing allogeneic BSCT. In cases lacking PCR-detectable disease-specific gene products, this assay may represent an alternative to recently established real-time PCR methods. PMID:11236950
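The quantitative chimerism determination described above reduces, at each informative STR locus, to comparing the fluorescence signal of donor- and recipient-specific alleles. A minimal sketch with invented peak areas (real assays also correct for amplification bias and stutter, which this ignores):

```python
# Percent donor chimerism from donor/recipient allele peak areas (invented data).

def donor_chimerism(donor_area, recipient_area):
    """Percent donor chimerism at one informative STR locus."""
    return 100.0 * donor_area / (donor_area + recipient_area)

# (donor, recipient) peak areas at three hypothetical informative loci
loci = [(9500.0, 480.0), (12200.0, 650.0), (8800.0, 400.0)]
estimates = [donor_chimerism(d, r) for d, r in loci]
mean_chimerism = sum(estimates) / len(estimates)
```

A falling donor fraction across sequential samples is the signal the abstract associates with impending graft failure or relapse.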

  19. Detection of a tropospheric ozone anomaly using a newly developed ozone retrieval algorithm for an up-looking infrared interferometer

    NASA Astrophysics Data System (ADS)

    Lightner, K. J.; McMillan, W. W.; McCann, K. J.; Hoff, R. M.; Newchurch, M. J.; Hintsa, E. J.; Barnet, C. D.

    2009-03-01

    On 2 June 2003, the Baltimore Bomem Atmospheric Emitted Radiance Interferometer (BBAERI) recorded an infrared spectral time series indicating the presence of a tropospheric ozone anomaly. The measurements were collected during an Atmospheric Infrared Sounder (AIRS) validation campaign called the 2003 AIRS BBAERI Ocean Validation Experiment (ABOVE03) conducted at the United States Coast Guard Chesapeake Light station located 14 miles due east of Virginia Beach, Virginia (36.91°N, 75.71°W). Ozone retrievals were performed with the Kurt Lightner Ozone BBAERI Retrieval (KLOBBER) algorithm, which retrieves tropospheric column ozone, surface to 300 mbar, from zenith-viewing atmospheric thermal emission spectra. KLOBBER is modeled after the AIRS retrieval algorithm consisting of a synthetic statistical regression followed by a physical retrieval. The physical retrieval is implemented using the k-Compressed Atmospheric Radiative Transfer Algorithm (kCARTA) to compute spectra. The time series of retrieved integrated ozone column on 2 June 2003 displays spikes of about 10 Dobson units, well above the error of the KLOBBER algorithm. Using instrumentation at Chesapeake Light, satellite imaging, trace gas retrievals from satellites, and Potential Vorticity (PV) computations, it was determined that these sudden increases in column ozone likely were caused by a combination of midtropospheric biomass burning products from forest fires in Siberia, Russia, and stratospheric intrusion by a tropopause fold occurring over central Canada and the midwestern United States.

  20. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, Marshall Clint; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented such a system for rocket engine flowfields/plumes. The Optical Plume Anomaly Detector (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and analytical packages designed to utilize information collected by OPAD is known as the Engine Diagnostic Filtering System (EDiFiS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Capabilities for real-time processing are being advanced on several fronts, including an effort to encode components of the EDiFiS in hardware for health monitoring and management. This paper addresses OPAD and its tool suites, and discusses what is considered a natural progression: a concept for taking OPAD to the next logical level of high-energy physics, incorporating fermion and boson particle analyses in the measurement of neutron flux.

  1. How Well Can We Detect Lineage-Specific Diversification-Rate Shifts? A Simulation Study of Sequential AIC Methods

    PubMed Central

    May, Michael R.; Moore, Brian R.

    2016-01-01

    Evolutionary biologists have long been fascinated by the extreme differences in species numbers across branches of the Tree of Life. This has motivated the development of statistical methods for detecting shifts in the rate of lineage diversification across the branches of phylogenetic trees. One of the most frequently used methods, MEDUSA, explores a set of diversification-rate models, where each model assigns branches of the phylogeny to a set of diversification-rate categories. Each model is first fit to the data, and the Akaike information criterion (AIC) is then used to identify the optimal diversification model. Surprisingly, the statistical behavior of this popular method is uncharacterized, which is a concern in light of: (1) the poor performance of the AIC as a means of choosing among models in other phylogenetic contexts; (2) the ad hoc algorithm used to visit diversification models; and (3) errors that we reveal in the likelihood function used to fit diversification models to the phylogenetic data. Here, we perform an extensive simulation study demonstrating that MEDUSA (1) has a high false-discovery rate (on average, spurious diversification-rate shifts are identified ≈30% of the time), and (2) provides biased estimates of diversification-rate parameters. Understanding the statistical behavior of MEDUSA is critical both to empirical researchers—in order to clarify whether these methods can make reliable inferences from empirical datasets—and to theoretical biologists—in order to clarify the specific problems that need to be solved in order to develop more reliable approaches for detecting shifts in the rate of lineage diversification. [Akaike information criterion; extinction; lineage-specific diversification rates; phylogenetic model selection; speciation.] PMID:27037081
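The AIC-based model choice described above takes, for each candidate diversification model, its maximized log-likelihood and parameter count, and prefers the smallest AIC = 2k - 2 ln L. A minimal sketch with invented log-likelihoods (not values from the study):

```python
# Compare candidate models by AIC; the extra-parameter penalty guards against
# overfitting, though the abstract shows it is not sufficient for MEDUSA.

def aic(log_likelihood, n_params):
    return 2 * n_params - 2 * log_likelihood

models = {
    "single-rate": {"lnL": -1520.4, "k": 2},   # one speciation/extinction pair
    "one-shift":   {"lnL": -1511.0, "k": 5},   # adds a rate shift (invented values)
    "two-shifts":  {"lnL": -1509.8, "k": 8},
}
scores = {name: aic(m["lnL"], m["k"]) for name, m in models.items()}
best = min(scores, key=scores.get)
```

In this toy example the two-shift model fits slightly better but loses on AIC to the one-shift model; the simulation study above asks how often that penalty fails to suppress spurious shifts.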

  2. Rapid automated assay of anti-oxidation/radical-scavenging activity of natural substances by sequential injection technique (SIA) using spectrophotometric detection.

    PubMed

    Polásek, Miroslav; Skála, Petr; Opletal, Lubomír; Jahodár, Ludek

    2004-07-01

    A PC-controlled sequential injection analysis (SIA) system equipped with a spectrophotometric diode-array detector is used for rapid monitoring and evaluation of antioxidation/radical scavenging activity of biological samples. The automated method is based on the known reaction of stable 2,2'-diphenyl-1-picrylhydrazyl radical (DPPH) with antioxidants in organic or aqueous-organic media resulting in bleaching of DPPH due to its "quenching" by the interaction with the analytes. The decrease of the absorbance of DPPH (compared to blank experiment carried out with water-ethanol 1:1 instead of the test solution) measured at 525 nm is related to concentration of an antioxidant in the test solution. With the optimised SIA procedure it is possible to detect down to micromolar concentrations of model antioxidants such as ascorbic acid, caffeic acid, (+)-catechin, (-)-epicatechin and rutin and to evaluate the concentration of these antioxidants in the micromolar to millimolar range. The sample throughput is 45 h(-1). Thanks to its rapidity and sensitivity, the proposed SIA method is suitable for performing routine screening tests for the presence of various antioxidants in large series of lyophilised herbal or mushroom extracts (the amount of sample needed for the analysis is several milligrams).
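The DPPH evaluation described above reduces to a simple ratio: the radical-scavenging activity is the relative decrease in DPPH absorbance at 525 nm versus a blank. A minimal sketch with invented absorbance values:

```python
# DPPH radical-scavenging activity from 525 nm absorbances (invented readings).

def scavenging_percent(a_blank, a_sample):
    """Percent DPPH 'quenched' by the antioxidant in the test solution."""
    return 100.0 * (a_blank - a_sample) / a_blank

a_blank = 0.820    # DPPH + water-ethanol blank
a_sample = 0.312   # DPPH + hypothetical antioxidant solution
activity = scavenging_percent(a_blank, a_sample)
```

Stronger bleaching of the DPPH radical gives a larger percentage, which is how the SIA system ranks the extracts it screens.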

  3. Gauge anomalies, gravitational anomalies, and superstrings

    SciTech Connect

    Bardeen, W.A.

    1985-08-01

    The structure of gauge and gravitational anomalies will be reviewed. The impact of these anomalies on the construction, consistency, and application of the new superstring theories will be discussed. 25 refs.

  4. A sensitive sequential 'on/off' SERS assay for heparin with wider detection window and higher reliability based on the reversed surface charge changes of functionalized Au@Ag nanoparticles.

    PubMed

    Zeng, Yi; Pei, Jin-Ju; Wang, Li-Hua; Shen, Ai-Guo; Hu, Ji-Ming

    2015-04-15

    A sequential 'on/off' dual-mode SERS assay platform for heparin, with a wider detection window and higher reliability, is constructed based on electrostatic forces: highly protonated chitosan-encapsulated, p-mercaptobenzoic acid-coated Au@Ag core-shell nanoparticles undergo sequential aggregation/segregation upon addition of heparin. The platform achieves a limit of detection of 43.74 ng/mL (5.69 U/mL) over a continuous concentration range of 50-800 ng/mL (6.5-104 U/mL), a lower detection limit and a wider detection window than most reported heparin assays. Remarkably, the declining-response window over the 350-800 ng/mL range, not reported before, is extremely important for reliable and practical assay of heparin.
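A common way to arrive at a limit of detection like the one quoted above is the 3-sigma convention: LOD = 3 x (standard deviation of the blank signal) / (calibration slope). A minimal sketch; the blank SD and slope below are invented so the result matches the abstract's 43.74 ng/mL figure, not taken from the paper:

```python
# 3-sigma limit-of-detection estimate (hypothetical blank SD and slope).

def limit_of_detection(blank_sd, slope):
    return 3.0 * blank_sd / slope

blank_sd = 14.58   # SD of blank SERS intensity (hypothetical units)
slope = 1.0        # calibration slope, intensity per ng/mL (hypothetical)
lod = limit_of_detection(blank_sd, slope)
```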

  5. ANOMALY STRUCTURE OF SUPERGRAVITY AND ANOMALY CANCELLATION

    SciTech Connect

    Butter, Daniel; Gaillard, Mary K.

    2009-06-10

    We display the full anomaly structure of supergravity, including new D-term contributions to the conformal anomaly. This expression has the super-Weyl and chiral U(1)_K transformation properties that are required for implementation of the Green-Schwarz mechanism for anomaly cancellation. We outline the procedure for full anomaly cancellation. Our results have implications for effective supergravity theories from the weakly coupled heterotic string theory.

  6. The elliptic anomaly

    NASA Technical Reports Server (NTRS)

    Janin, G.; Bond, V. R.

    1980-01-01

    An independent variable different from the time is used for elliptic orbit integration. Such a time transformation provides analytical step-size regulation along the orbit. An intermediate anomaly (an anomaly intermediate between the eccentric and the true anomaly) is suggested for optimum performance. A particular case of an intermediate anomaly (the elliptic anomaly) is defined, and its relation to the other anomalies is developed.
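The anomalies the abstract relates are the standard orbital ones: the mean anomaly M, the eccentric anomaly E obtained from Kepler's equation M = E - e sin E, and the true anomaly. A minimal sketch of these standard relations (the eccentricity and mean anomaly are illustrative; the paper's intermediate/elliptic anomaly is not reproduced here):

```python
import math

# Solve Kepler's equation and convert eccentric to true anomaly.

def eccentric_anomaly(M, e, tol=1e-12):
    """Solve M = E - e*sin(E) for E by Newton iteration."""
    E = M if e < 0.8 else math.pi
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def true_anomaly(E, e):
    """Half-angle formula relating eccentric and true anomaly."""
    return 2 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                          math.sqrt(1 - e) * math.cos(E / 2))

e, M = 0.3, 1.0          # illustrative eccentricity and mean anomaly (radians)
E = eccentric_anomaly(M, e)
nu = true_anomaly(E, e)
```

For 0 < M < pi the anomalies satisfy M < E < nu, which is why an anomaly intermediate between E and nu can serve as a compromise integration variable.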

  7. Persistent left superior vena cava, absence of the innominate vein, and upper sinus venosus defect : a rare anomaly detected using bubbles.

    PubMed

    Akpinar, I; Sayin, M R; Karabag, T; Dogan, S M; Sen, S T; Gudul, N E; Aydin, M

    2013-05-01

    Superior vena cava anomalies are rare malformations that are typically seen with other congenital cardiac defects. Although a persistent left superior vena cava is the most common anomaly of the systemic venous return in the thorax, its combination with an upper sinus venosus defect and absence of the innominate vein is extremely rare. Here, we report a patient diagnosed with these anomalies based on a bubble study and confirmed with magnetic resonance imaging.

  8. Prenatal Detection of Cardiac Anomalies in Fetuses with Single Umbilical Artery: Diagnostic Accuracy Comparison of Maternal-Fetal-Medicine and Pediatric Cardiologist

    PubMed Central

    Tasha, Ilir; Brook, Rachel; Frasure, Heidi

    2014-01-01

    Aim. To determine agreement of cardiac anomalies between maternal fetal medicine (MFM) physicians and pediatric cardiologists (PC) in fetuses with single umbilical artery (SUA). Methods. A retrospective review of all fetuses with SUA between 1999 and 2008. Subjects were studied by MFM and PC, delivered at our institution, and had confirmation of SUA and cardiac anomaly by antenatal and neonatal PC follow-up. Subjects were divided into four groups: isolated SUA, SUA and isolated cardiac anomaly, SUA and multiple anomalies without heart anomalies, and SUA and multiple malformations including cardiac anomaly. Results. 39,942 cases were studied between 1999 and 2008. In 376 of 39,942 cases (0.94%), SUA was diagnosed. Only 182 (48.4%) met inclusion criteria. Cardiac anomalies were found in 21% (38/182). Agreement between MFM physicians and PC in all groups combined was 94% (171/182) (95% CI [89.2, 96.8]). MFM physicians overdiagnosed cardiac anomalies in 4.4% (8/182). MFM physicians and PC failed to antenatally diagnose cardiac anomaly in the same two cases. Conclusions. Good agreement was noted between MFM physicians and PC in our institution. Studies performed antenatally by MFM physicians and PC are less likely to uncover the entire spectrum of cardiac abnormalities and thus neonatal follow-up is suggested. PMID:24719766

  9. ISHM Anomaly Lexicon for Rocket Test

    NASA Technical Reports Server (NTRS)

    Schmalzel, John L.; Buchanan, Aubri; Hensarling, Paula L.; Morris, Jonathan; Turowski, Mark; Figueroa, Jorge F.

    2007-01-01

    Integrated Systems Health Management (ISHM) is a comprehensive capability. An ISHM system must detect anomalies, identify their causes, predict future anomalies, and help identify the consequences of anomalies, for example by suggesting mitigation steps. The system should also provide users with appropriate navigation tools to facilitate the flow of information into and out of the ISHM system. Central to the ability of an ISHM system to detect anomalies is a clearly defined catalog of anomalies. Further, this lexicon of anomalies must be organized in ways that make it accessible to a suite of tools used to manage the data, information and knowledge (DIaK) associated with a system. In particular, it is critical to ensure optimal mapping between target anomalies and the algorithms associated with their detection. During the early development of our ISHM architecture and approach, it became clear that a lexicon of anomalies would be important to the development of critical anomaly detection algorithms. In our work in the rocket engine test environment at John C. Stennis Space Center, we have access to a repository of discrepancy reports (DRs) that are generated in response to squawks identified during post-test data analysis. The DR is the tool used to document anomalies and the methods used to resolve the issue. These DRs have been generated for many different tests and for all test stands; as a result, they represent a comprehensive summary of the anomalies associated with rocket engine testing. Fig. 1 illustrates some of the data that can be extracted from a DR. Such information includes affected transducer channels, a narrative description of the observed anomaly, and the steps used to correct the problem. The primary goal of the anomaly lexicon development efforts we have undertaken is to create a lexicon that could be used in support of an associated health assessment database system (HADS) co-development effort. There are a number of significant

  10. Rapid determination of plutonium isotopes in environmental samples using sequential injection extraction chromatography and detection by inductively coupled plasma mass spectrometry.

    PubMed

    Qiao, Jixin; Hou, Xiaolin; Roos, Per; Miró, Manuel

    2009-10-01

    This article presents an automated method for the rapid determination of 239Pu and 240Pu in various environmental samples. The analytical method involves the in-line separation of Pu isotopes using extraction chromatography (TEVA) implemented in a sequential injection (SI) network followed by detection of isolated analytes with inductively coupled plasma mass spectrometry (ICP-MS). The method has been devised for the determination of Pu isotopes at environmentally relevant concentrations, whereby it has been successfully applied to the analyses of large volumes/amounts of samples, for example, 100-200 g of soil and sediment, 20 g of seaweed, and 200 L of seawater following analyte preconcentration. The investigation of the separation capability of the assembled SI system revealed that up to 200 g of soil or sediment can be treated using a column containing about 0.70 g of TEVA resin. The analytical results of Pu isotopes in the reference materials showed good agreement with the certified or reference values at the 0.05 significance level. Chemical yields of Pu ranged from 80 to 105%, and the decontamination factors for uranium, thorium, mercury and lead were all above 10^4. The duration of the in-line extraction chromatographic run was <1.5 h, and the proposed setup was able to handle up to 20 samples (14 mL each) in a fully automated mode using a single chromatographic column. The SI manifold is thus suitable for rapid and automated determination of Pu isotopes in environmental risk assessment and emergency preparedness scenarios. PMID:19722516
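Two figures of merit quoted above are simple ratios: the chemical yield (fraction of a known Pu tracer recovered through the separation) and the decontamination factor (ratio of an interferent, e.g. uranium, before and after separation). A minimal sketch with invented amounts:

```python
# Chemical yield and decontamination factor for a radiochemical separation
# (all amounts are hypothetical).

def chemical_yield(recovered, added):
    """Percent of the added tracer recovered after separation."""
    return 100.0 * recovered / added

def decontamination_factor(before, after):
    """Ratio of interferent amount before vs. after separation."""
    return before / after

yield_pct = chemical_yield(recovered=0.92, added=1.0)              # Bq of tracer
df_uranium = decontamination_factor(before=5.0e-3, after=4.0e-8)   # g of uranium
```

A decontamination factor above 10^4, as reported, means less than 0.01% of the original uranium survives into the ICP-MS measurement.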

  11. Mobile gamma-ray scanning system for detecting radiation anomalies associated with ²²⁶Ra-bearing materials

    SciTech Connect

    Myrick, T.E.; Blair, M.S.; Doane, R.W.; Goldsmith, W.A.

    1982-11-01

    A mobile gamma-ray scanning system has been developed by Oak Ridge National Laboratory for use in the Department of Energy's remedial action survey programs. The unit consists of a NaI(Tl) detection system housed in a specially-equipped van. The system is operator controlled through an on-board mini-computer, with data output provided on the computer video screen, strip chart recorders, and an on-line printer. Data storage is provided by a floppy disk system. Multichannel analysis capabilities are included for qualitative radionuclide identification. A ²²⁶Ra-specific algorithm is employed to identify locations containing residual radium-bearing materials. This report presents the details of the system description, software development, and scanning methods utilized with the ORNL system. Laboratory calibration and field testing have established the system sensitivity, field of view, and other performance characteristics, the results of which are also presented. Documentation of the instrumentation and computer programs is included.

  12. Experimental evidence for spring and autumn windows for the detection of geobotanical anomalies through the remote sensing of overlying vegetation

    NASA Technical Reports Server (NTRS)

    Labovitz, M. L.; Masuoka, E. J.; Bell, R.; Nelson, R. F.; Larsen, C. A.; Hooker, L. K.; Troensegaard, K. W.

    1985-01-01

    It is pointed out that in many regions of the world, vegetation is the predominant factor influencing variation in reflected energy in the 0.4-2.5 micron region of the spectrum. Studies have, therefore, been conducted regarding the utility of remote sensing for detecting changes in vegetation which could be related to the presence of mineralization. The present paper primarily reports the results of the second year of a multiyear study of geobotanical-remote-sensing relationships as developed over areas of sulfide mineralization. The field study has a strong experimental design basis. It proceeded by first delineating the boundaries of a large geographic region which satisfied a set of previously enumerated field-site criteria. Within this region, carefully selected pairs of mineralized and nonmineralized test sites were examined over the growing season. The experiment is intended to provide information about the spectral and temporal resolutions required for remote-sensing geobotanical exploration. The obtained results are evaluated.

  13. Sequentially Executed Model Evaluation Framework

    SciTech Connect

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.
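
    The driver/controller pattern described above can be sketched as follows. The class and method names are illustrative assumptions, not the actual SeMe or CANARY-EDS API: an input driver feeds a model one step at a time, the model combines its prior state with the new data, and an output driver receives each result.

```python
from typing import Iterator, List, Optional

class ListInputDriver:
    """Input driver that replays a stored time series one step at a time."""
    def __init__(self, series: List[float]):
        self._it: Iterator[float] = iter(series)

    def next_step(self) -> Optional[float]:
        return next(self._it, None)

class EwmaAnomalyModel:
    """Model whose evaluation combines prior results with new data:
    an exponentially weighted moving average with a residual threshold."""
    def __init__(self, alpha: float = 0.3, threshold: float = 5.0):
        self.alpha, self.threshold = alpha, threshold
        self.level: Optional[float] = None

    def step(self, x: float) -> bool:
        if self.level is None:         # first observation seeds the state
            self.level = x
            return False
        anomalous = abs(x - self.level) > self.threshold
        self.level = self.alpha * x + (1 - self.alpha) * self.level
        return anomalous

class ListOutputDriver:
    """Output driver that records each step's result."""
    def __init__(self) -> None:
        self.flags: List[bool] = []

    def write(self, flag: bool) -> None:
        self.flags.append(flag)

def run_batch(inp: ListInputDriver, model: EwmaAnomalyModel,
              out: ListOutputDriver) -> None:
    """Batch controller: steps input, model and output through the series."""
    while (x := inp.next_step()) is not None:
        out.write(model.step(x))

data = [10.0] * 5 + [30.0] + [10.0] * 4   # one obvious spike at index 5
out = ListOutputDriver()
run_batch(ListInputDriver(data), EwmaAnomalyModel(), out)
```

    A real-time controller would differ only in where `next_step` gets its data (a sensor feed rather than a stored list); the model and output drivers are unchanged.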

  14. Sequentially Executed Model Evaluation Framework

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.

  15. Aeromagnetic anomalies over faulted strata

    USGS Publications Warehouse

    Grauch, V.J.S.; Hudson, Mark R.

    2011-01-01

    High-resolution aeromagnetic surveys are now an industry standard and they commonly detect anomalies that are attributed to faults within sedimentary basins. However, detailed studies identifying geologic sources of magnetic anomalies in sedimentary environments are rare in the literature. Opportunities to study these sources have come from well-exposed sedimentary basins of the Rio Grande rift in New Mexico and Colorado. High-resolution aeromagnetic data from these areas reveal numerous, curvilinear, low-amplitude (2–15 nT at 100-m terrain clearance) anomalies that consistently correspond to intrasedimentary normal faults (Figure 1). Detailed geophysical and rock-property studies provide evidence for the magnetic sources at several exposures of these faults in the central Rio Grande rift (summarized in Grauch and Hudson, 2007, and Hudson et al., 2008). A key result is that the aeromagnetic anomalies arise from the juxtaposition of magnetically differing strata at the faults as opposed to chemical processes acting at the fault zone. The studies also provide (1) guidelines for understanding and estimating the geophysical parameters controlling aeromagnetic anomalies at faulted strata (Grauch and Hudson), and (2) observations on key geologic factors that are favorable for developing similar sedimentary sources of aeromagnetic anomalies elsewhere (Hudson et al.).

  16. Chiral anomalies and differential geometry

    SciTech Connect

    Zumino, B.

    1983-10-01

    Some properties of chiral anomalies are described from a geometric point of view. Topics include chiral anomalies and differential forms, transformation properties of the anomalies, identification and use of the anomalies, and normalization of the anomalies. 22 references. (WHK)

  17. Graph anomalies in cyber communications

    SciTech Connect

    Vander Wiel, Scott A; Storlie, Curtis B; Sandine, Gary; Hagberg, Aric A; Fisk, Michael

    2011-01-11

    Enterprises monitor cyber traffic for viruses, intruders and stolen information. Detection methods look for known signatures of malicious traffic or search for anomalies with respect to a nominal reference model. Traditional anomaly detection focuses on aggregate traffic at central nodes or on user-level monitoring. More recently, however, traffic is being viewed more holistically as a dynamic communication graph. Attention to the graph nature of the traffic has expanded the types of anomalies that are being sought. We give an overview of several cyber data streams collected at Los Alamos National Laboratory and discuss current work in modeling the graph dynamics of traffic over the network. We consider global properties and local properties within the communication graph. A method for monitoring relative entropy on multiple correlated properties is discussed in detail.
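
    The relative-entropy monitoring mentioned above can be illustrated with a minimal sketch: compute the Kullback-Leibler divergence between a nominal reference distribution of some graph property (here, out-degree) and the distribution observed in the current window, and flag large divergences. The choice of property, the smoothing, and the alarm threshold are assumptions for illustration, not the paper's method.

```python
import math
from collections import Counter

def degree_distribution(edges, smoothing=1e-6):
    """Empirical out-degree distribution of a list of (src, dst) edges,
    lightly smoothed so the KL divergence stays finite."""
    degree = Counter(src for src, _ in edges)
    counts = Counter(degree.values())
    total = sum(counts.values())
    support = range(1, max(counts) + 1)
    return {d: (counts.get(d, 0) + smoothing) / (total + smoothing * len(support))
            for d in support}

def kl_divergence(p, q, floor=1e-6):
    """D(p || q) over the union of supports; missing mass gets a floor."""
    keys = set(p) | set(q)
    return sum(p.get(k, floor) * math.log(p.get(k, floor) / q.get(k, floor))
               for k in keys)

# Baseline window: traffic spread over several hosts.
baseline = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "a"), ("d", "a")]
# Current window: one host suddenly talks to everything (scan-like).
scan = [("a", "b"), ("a", "c"), ("a", "d"), ("a", "e"), ("a", "f")]

d_same = kl_divergence(degree_distribution(baseline), degree_distribution(baseline))
d_shift = kl_divergence(degree_distribution(scan), degree_distribution(baseline))
```

    Monitoring several correlated properties, as the paper describes, would track one such divergence per property and combine them into a single alarm statistic.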

  18. Deployment of a sequential two-photon laser-induced fluorescence sensor for the detection of gaseous elemental mercury at ambient levels: fast, specific, ultrasensitive detection with parts-per-quadrillion sensitivity

    NASA Astrophysics Data System (ADS)

    Bauer, D.; Everhart, S.; Remeika, J.; Tatum Ernest, C.; Hynes, A. J.

    2014-12-01

    The operation of a laser-based sensor for gas-phase elemental mercury, Hg(0), is described. It utilizes sequential two-photon laser excitation with detection of blue-shifted laser-induced fluorescence (LIF) to provide a highly specific detection scheme that precludes detection of anything other than atomic mercury. It has high sensitivity, fast temporal resolution, and can be deployed for in situ measurements in the open atmosphere with essentially no perturbation of the environment. An ambient sample can also be pulled through a fluorescence cell, allowing for standard addition calibrations of the concentration. No type of preconcentration is required and there appears to be no significant interferences from other atmospheric constituents, including gas-phase oxidized mercury species. As a consequence, it is not necessary to remove oxidized mercury, commonly referred to as reactive gaseous mercury (RGM), from the air sample. The instrument has been deployed as part of an instrument intercomparison and compares well with conventional instrumentation that utilizes preconcentration on gold followed by analysis using cold-vapor atomic fluorescence spectroscopy (CVAFS). Currently, the achievable detection sensitivity is ~15 pg m⁻³ (~5 × 10⁴ atoms cm⁻³, ~2 ppq) at a sampling rate of 0.1 Hz, i.e., averaging 100 shots with a 10 Hz laser system. Preliminary results are described for a 50 Hz instrument that utilizes a modified excitation sequence and has monitored ambient elemental mercury with an effective sampling rate of 10 Hz. Additional work is required to produce the precision necessary to perform eddy correlation measurements. Addition of a pyrolysis channel should allow for the measurement of total gaseous mercury (TGM) and hence RGM (by difference) with good sensitivity and time resolution.

  19. Lymphatic Anomalies Registry

    ClinicalTrials.gov

    2016-07-26

    Lymphatic Malformation; Generalized Lymphatic Anomaly (GLA); Central Conducting Lymphatic Anomaly; CLOVES Syndrome; Gorham-Stout Disease ("Disappearing Bone Disease"); Blue Rubber Bleb Nevus Syndrome; Kaposiform Lymphangiomatosis; Kaposiform Hemangioendothelioma/Tufted Angioma; Klippel-Trenaunay Syndrome; Lymphangiomatosis

  20. Alberta Congenital Anomalies Surveillance System.

    PubMed Central

    Lowry, R B; Thunem, N Y; Anderson-Redick, S

    1989-01-01

    The Alberta Congenital Anomalies Surveillance System was started in 1966 in response to the thalidomide tragedy earlier in the decade. It was one of four provincial surveillance systems on which the federal government relied for baseline statistics of congenital anomalies. The government now collects data from six provinces and one territory. The Alberta Congenital Anomaly Surveillance System originally depended on three types of notification to the Division of Vital Statistics, Department of Health, Government of Alberta: birth notice and certificates of death and stillbirth; increased sources of ascertainment have greatly improved data quality. We present the data for 1980-86 and compare the prevalence rates of selected anomalies with the rates from three other surveillance systems. Surveillance systems do not guarantee that a new teratogen will be detected, but they are extremely valuable for testing hypotheses regarding causation. At the very least they provide baseline data with which to compare any deviation or trend. For many, if not most, congenital anomalies total prevention is not possible; however, surveillance systems can be used to measure progress in prevention. PMID:2819634

  1. INVESTIGATION OF ARSENIC SPECIATION ON DRINKING WATER TREATMENT MEDIA UTILIZING AUTOMATED SEQUENTIAL CONTINUOUS FLOW EXTRACTION WITH IC-ICP-MS DETECTION

    EPA Science Inventory

    Three treatment media, used for the removal of arsenic from drinking water, were sequentially extracted using 10 mM MgCl₂ (pH 8), 10 mM NaH₂PO₄ (pH 7), followed by 10 mM (NH₄)₂C₂O₄ (pH 3). The media were extracted using an on-line automated continuous extraction system which allowed...

  2. Magnetic Anomalies over Iceland.

    PubMed

    Serson, P H; Hannaford, W; Haines, G V

    1968-10-18

    An aeromagnetic survey of Iceland reveals broad anomalies of large amplitude over zones of recent volcanic activity. The source of the anomalies is ascribed to large masses of basalt that have been coherently remagnetized by intrusive heating. A simple correlation of the Icelandic anomalies with those of the ocean floor therefore appears unjustified.

  3. Eyewitness confidence in simultaneous and sequential lineups: a criterion shift account for sequential mistaken identification overconfidence.

    PubMed

    Dobolyi, David G; Dodson, Chad S

    2013-12-01

    Confidence judgments for eyewitness identifications play an integral role in determining guilt during legal proceedings. Past research has shown that confidence in positive identifications is strongly associated with accuracy. Using a standard lineup recognition paradigm, we investigated accuracy using signal detection and ROC analyses, along with the tendency to choose a face with both simultaneous and sequential lineups. We replicated past findings of reduced rates of choosing with sequential as compared to simultaneous lineups, but notably found an accuracy advantage in favor of simultaneous lineups. Moreover, our analysis of the confidence-accuracy relationship revealed two key findings. First, we observed a sequential mistaken identification overconfidence effect: despite an overall reduction in false alarms, confidence for false alarms that did occur was higher with sequential lineups than with simultaneous lineups, with no differences in confidence for correct identifications. This sequential mistaken identification overconfidence effect is an expected byproduct of the use of a more conservative identification criterion with sequential than with simultaneous lineups. Second, we found a steady drop in confidence for mistaken identifications (i.e., foil identifications and false alarms) from the first to the last face in sequential lineups, whereas confidence in and accuracy of correct identifications remained relatively stable. Overall, we observed that sequential lineups are both less accurate and produce higher confidence false identifications than do simultaneous lineups. Given the increasing prominence of sequential lineups in our legal system, our data argue for increased scrutiny and possibly a wholesale reevaluation of this lineup format.
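
    The signal-detection quantities behind analyses like the one above are discriminability (d') and response criterion (c), computed from hit and false-alarm rates. The sketch below uses made-up rates, not the study's data, to show how a more conservative criterion (larger c) lowers both hits and false alarms.

```python
from statistics import NormalDist

def dprime_criterion(hit_rate: float, fa_rate: float):
    """Standard equal-variance signal-detection measures:
    d' = z(H) - z(F), c = -(z(H) + z(F)) / 2."""
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Illustrative rates only: a liberal, simultaneous-style responder
# versus a conservative, sequential-style responder.
d_sim, c_sim = dprime_criterion(hit_rate=0.75, fa_rate=0.30)
d_seq, c_seq = dprime_criterion(hit_rate=0.55, fa_rate=0.15)
```

    With these rates the sequential-style responder shows the criterion shift the paper describes: fewer false alarms, but driven by a stricter criterion (c_seq > c_sim) rather than by better discriminability.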

  4. Eyewitness confidence in simultaneous and sequential lineups: a criterion shift account for sequential mistaken identification overconfidence.

    PubMed

    Dobolyi, David G; Dodson, Chad S

    2013-12-01

    Confidence judgments for eyewitness identifications play an integral role in determining guilt during legal proceedings. Past research has shown that confidence in positive identifications is strongly associated with accuracy. Using a standard lineup recognition paradigm, we investigated accuracy using signal detection and ROC analyses, along with the tendency to choose a face with both simultaneous and sequential lineups. We replicated past findings of reduced rates of choosing with sequential as compared to simultaneous lineups, but notably found an accuracy advantage in favor of simultaneous lineups. Moreover, our analysis of the confidence-accuracy relationship revealed two key findings. First, we observed a sequential mistaken identification overconfidence effect: despite an overall reduction in false alarms, confidence for false alarms that did occur was higher with sequential lineups than with simultaneous lineups, with no differences in confidence for correct identifications. This sequential mistaken identification overconfidence effect is an expected byproduct of the use of a more conservative identification criterion with sequential than with simultaneous lineups. Second, we found a steady drop in confidence for mistaken identifications (i.e., foil identifications and false alarms) from the first to the last face in sequential lineups, whereas confidence in and accuracy of correct identifications remained relatively stable. Overall, we observed that sequential lineups are both less accurate and produce higher confidence false identifications than do simultaneous lineups. Given the increasing prominence of sequential lineups in our legal system, our data argue for increased scrutiny and possibly a wholesale reevaluation of this lineup format. PMID:24188335

  5. Adaptive sequential controller

    DOEpatents

    El-Sharkawi, Mohamed A.; Xing, Jian; Butler, Nicholas G.; Rodriguez, Alonso

    1994-01-01

    An adaptive sequential controller (50/50') for controlling a circuit breaker (52) or other switching device to substantially eliminate transients on a distribution line caused by closing and opening the circuit breaker. The device adaptively compensates for changes in the response time of the circuit breaker due to aging and environmental effects. A potential transformer (70) provides a reference signal corresponding to the zero crossing of the voltage waveform, and a phase shift comparator circuit (96) compares the reference signal to the time at which any transient was produced when the circuit breaker closed, producing a signal indicative of the adaptive adjustment that should be made. Similarly, in controlling the opening of the circuit breaker, a current transformer (88) provides a reference signal that is compared against the time at which any transient is detected when the circuit breaker last opened. An adaptive adjustment circuit (102) produces a compensation time that is appropriately modified to account for changes in the circuit breaker response, including the effect of ambient conditions and aging. When next opened or closed, the circuit breaker is activated at an appropriately compensated time, so that it closes when the voltage crosses zero and opens when the current crosses zero, minimizing any transients on the distribution line. Phase angle can be used to control the opening of the circuit breaker relative to the reference signal provided by the potential transformer.
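
    The adaptive compensation loop described in this patent abstract can be sketched as follows. The update rule (proportional correction of a stored compensation delay) and the gain value are assumptions for illustration, not the patent's circuit.

```python
class AdaptiveSequentialController:
    """Commands the breaker early by a learned compensation delay, then
    corrects that delay from the observed transient timing."""
    def __init__(self, gain: float = 0.5):
        self.compensation = 0.0   # ms to fire ahead of the zero crossing
        self.gain = gain

    def command_time(self, zero_crossing: float) -> float:
        return zero_crossing - self.compensation

    def observe(self, zero_crossing: float, actual_switch_time: float) -> None:
        # Positive error: contacts closed after the zero crossing,
        # so fire earlier on the next operation.
        error = actual_switch_time - zero_crossing
        self.compensation += self.gain * error

def simulate(true_latency_ms: float, cycles: int) -> float:
    """Toy plant: the breaker always responds true_latency_ms after the
    command; the controller should learn that latency."""
    ctrl = AdaptiveSequentialController()
    for _ in range(cycles):
        zc = 0.0                              # target zero crossing
        t_cmd = ctrl.command_time(zc)
        actual = t_cmd + true_latency_ms      # breaker responds late
        ctrl.observe(zc, actual)
    return ctrl.compensation

residual = abs(simulate(true_latency_ms=8.0, cycles=20) - 8.0)
```

    Because the correction is proportional, the stored compensation converges geometrically to the breaker's true response time, which is how the device can track drift due to aging and ambient conditions.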

  6. Analysis of spacecraft anomalies

    NASA Technical Reports Server (NTRS)

    Bloomquist, C. E.; Graham, W. C.

    1976-01-01

    The anomalies from 316 spacecraft covering the entire U.S. space program were analyzed to determine if there were any experimental or technological programs which could be implemented to remove the anomalies from future space activity. Thirty specific categories of anomalies were found to cover nearly 85 percent of all observed anomalies. Thirteen experiments were defined to deal with 17 of these categories; nine additional experiments were identified to deal with other classes of observed and anticipated anomalies. Preliminary analyses indicate that all 22 experimental programs are both technically feasible and economically viable.

  7. Identification of mineral resources in Afghanistan-Detecting and mapping resource anomalies in prioritized areas using geophysical and remote sensing (ASTER and HyMap) data

    USGS Publications Warehouse

    King, Trude V. V.; Johnson, Michaela R.; Hubbard, Bernard E.; Drenth, Benjamin J.

    2011-01-01

    During the independent analysis of the geophysical, ASTER, and imaging spectrometer (HyMap) data by USGS scientists, previously unrecognized targets of potential mineralization were identified using evaluation criteria most suitable to the individual dataset. These anomalous zones offer targets of opportunity that warrant additional field verification. This report describes the standards used to define the anomalies, summarizes the results of the evaluations for each type of data, and discusses the importance and implications of regions of anomaly overlap between two or three of the datasets.

  8. Dual left anterior descending artery with anomalous origin of long LAD from pulmonary artery - rare coronary anomaly detected on computed tomography coronary angiography

    PubMed Central

    Vohra, Aditi; Narula, Harneet

    2016-01-01

    Dual left anterior descending artery is a rare coronary artery anomaly showing two left anterior descending arteries. Short anterior descending artery usually arises from the left coronary artery, while long anterior descending artery has anomalous origin and course. Dual left anterior descending artery with origin of long anterior descending artery from the pulmonary artery (ALCAPA) is a very rare coronary artery anomaly which has not been reported previously in the literature. We present the computed tomography coronary angiographic findings of this rare case in a young female patient who presented with atypical chest pain. PMID:27413266

  9. Lifshitz scale anomalies

    NASA Astrophysics Data System (ADS)

    Arav, Igal; Chapman, Shira; Oz, Yaron

    2015-02-01

    We analyse scale anomalies in Lifshitz field theories, formulated as the relative cohomology of the scaling operator with respect to foliation preserving diffeomorphisms. We construct a detailed framework that enables us to calculate the anomalies for any number of spatial dimensions, and for any value of the dynamical exponent. We derive selection rules, and establish the anomaly structure in diverse universal sectors. We present the complete cohomologies for various examples in one, two and three space dimensions for several values of the dynamical exponent. Our calculations indicate that all the Lifshitz scale anomalies are trivial descents, called B-type in the terminology of conformal anomalies. However, not all the trivial descents are cohomologically non-trivial. We compare the conformal anomalies to Lifshitz scale anomalies with a dynamical exponent equal to one.

  10. Morning glory disc anomaly with Chiari type I malformation.

    PubMed

    Arlow, Tim; Arepalli, Sruthi; Flanders, Adam E; Shields, Carol L

    2014-04-30

    Morning glory disc anomaly is a rare optic nerve dysplasia associated with various neovascular abnormalities. Due to these associations, children with morning glory disc anomaly undergo brain imaging and angiography to detect other congenital defects. The authors report the case of an infant with morning glory disc anomaly and coexisting Chiari type I malformation.

  11. Sequential probability ratio controllers for safeguards radiation monitors

    SciTech Connect

    Fehlau, P.E.; Coop, K.L.; Nixon, K.V.

    1984-01-01

    Sequential hypothesis tests applied to nuclear safeguards accounting methods make the methods more sensitive to detecting diversion. The sequential tests also improve transient signal detection in safeguards radiation monitors. This paper describes three microprocessor control units with sequential probability-ratio tests for detecting transient increases in radiation intensity. The control units are designed for three specific applications: low-intensity monitoring with Poisson probability ratios, higher intensity gamma-ray monitoring where fixed counting intervals are shortened by sequential testing, and monitoring moving traffic where the sequential technique responds to variable-duration signals. The fixed-interval controller shortens a customary 50-s monitoring time to an average of 18 s, making the monitoring delay less bothersome. The controller for monitoring moving vehicles benefits from the sequential technique by maintaining more than half its sensitivity when the normal passage speed doubles.
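
    The low-intensity, Poisson-ratio case above can be sketched with a standard Wald sequential probability ratio test: after each counting interval the log likelihood ratio of an elevated rate versus background is accumulated and compared with the decision thresholds, so clear signals terminate the test early. The parameter values are illustrative.

```python
import math

def sprt_poisson(counts, lam0, lam1, alpha=0.01, beta=0.01):
    """Wald SPRT for Poisson counts.
    lam0: background rate per interval; lam1 > lam0: elevated rate.
    Returns ('alarm' | 'background' | 'undecided', intervals used)."""
    upper = math.log((1 - beta) / alpha)    # accept H1: alarm
    lower = math.log(beta / (1 - alpha))    # accept H0: background
    llr = 0.0
    for i, x in enumerate(counts, start=1):
        # Poisson log likelihood ratio for one interval with count x.
        llr += x * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= upper:
            return "alarm", i
        if llr <= lower:
            return "background", i
    return "undecided", len(counts)

# Counts near the elevated rate trigger an alarm quickly...
decision_hot, n_hot = sprt_poisson([11, 9, 12, 10, 11], lam0=5.0, lam1=10.0)
# ...while background-level counts are dismissed quickly too.
decision_bg, n_bg = sprt_poisson([5, 4, 5, 4, 5], lam0=5.0, lam1=10.0)
```

    Early termination on clear evidence is exactly what shortens the fixed 50-s monitoring time to an average of 18 s in the fixed-interval controller described above.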

  12. WF4 Anomaly Characterization

    NASA Astrophysics Data System (ADS)

    Biretta, John

    2005-07-01

    A serious anomaly has been found in images from the WF4 CCD in WFPC2. The WF4 CCD bias level appears to have become unstable, resulting in sporadic images with either low or zero bias level. The severity and frequency of the problem is rapidly increasing, and it is possible that WF4 will soon become unusable if no work-around is found. The other three CCDs (PC1, WF2, and WF3) appear to be unaffected and continue to operate properly. The impacts from "low" and "zero" bias are somewhat different, but in both cases the effects are immediately obvious. Images with low bias will tend to have horizontal (x-direction) streaks and stripes with an amplitude of about 0.5 DN in WF4. We believe these data should be mostly recoverable with some effort, though at a loss in the detectability of faint targets. "Zero bias" is a much more serious problem and is evidenced by images which are blank in WF4, except for showing occasional cosmic rays, bright targets, and negative pixels from dark subtraction. These images with zero bias are probably unusable for most purposes. Both the CCD gain settings of 7 and 14 are affected. The frequency of the anomaly is rapidly increasing. The first significant instances of low bias appear to have been in late 2004 when a few images were impacted. However, within the last few weeks over half the images are beginning to show the low bias problem. The more serious "zero bias" problem appears to have first occurred in Feb. 2005, but it is also increasing and now impacts 10% to 20% of WFPC2 images. At present there are still many images which appear fine and unaffected, but the situation is quickly evolving. We believe the science impact for most observers will be minimal. Targets are by default placed on either PC1 or WF3 which continue to operate properly. However, observers requiring the full field of view (survey projects, large targets, etc.) will potentially lose one-third of their imaging area. Our understanding of this anomaly is still

  13. Familial Ebstein's anomaly.

    PubMed Central

    Rosenmann, A; Arad, I; Simcha, A; Schaap, T

    1976-01-01

    A family is described in which both a father and son are affected with Ebstein's anomaly, while several other family members manifest different cardiac malformations. Five additional instances of familial Ebstein's anomaly were found in the literature and compared with our family. Inspection of possible modes of inheritance in this group of families suggests that Ebstein's anomaly is probably inherited as a polygenic character with a threshold phenomenon. PMID:1018315

  14. Sequential ranging: How it works

    NASA Technical Reports Server (NTRS)

    Baugh, Harold W.

    1993-01-01

    This publication is directed to the users of data from the Sequential Ranging Assembly (SRA), and to others who have a general interest in range measurements. It covers the hardware, the software, and the processes used in acquiring range data; it does not cover analytical aspects such as the theory of modulation, detection, noise spectral density, and other highly technical subjects. In other words, it covers how ranging is done, but not the details of why it works. The publication also includes an appendix that gives a brief discussion of PN ranging, a capability now under development.

  15. Taussig-Bing Anomaly

    PubMed Central

    Konstantinov, Igor E.

    2009-01-01

    Taussig-Bing anomaly is a rare congenital heart malformation that was first described in 1949 by Helen B. Taussig (1898–1986) and Richard J. Bing (1909–). Although substantial improvement has since been achieved in surgical results of the repair of the anomaly, management of the Taussig-Bing anomaly remains challenging. A history of the original description of the anomaly, the life stories of the individuals who first described it, and the current outcomes of its surgical management are reviewed herein. PMID:20069085

  16. Glassy carbon electrodes sequentially modified by cysteamine-capped gold nanoparticles and poly(amidoamine) dendrimers generation 4.5 for detecting uric acid in human serum without ascorbic acid interference.

    PubMed

    Ramírez-Segovia, A S; Banda-Alemán, J A; Gutiérrez-Granados, S; Rodríguez, A; Rodríguez, F J; Godínez, Luis A; Bustos, E; Manríquez, J

    2014-02-17

    Glassy carbon electrodes (GCE) were sequentially modified by cysteamine-capped gold nanoparticles (AuNp@cysteamine) and PAMAM dendrimers generation 4.5 bearing 128-COOH peripheral groups (GCE/AuNp@cysteamine/PAMAM), in order to explore their capabilities as electrochemical detectors of uric acid (UA) in human serum samples at pH 2. The results showed that concentrations of UA detected by cyclic voltammetry with GCE/AuNp@cysteamine/PAMAM were comparable (deviation <±10%; limits of detection (LOD) and quantification (LOQ) were 1.7×10⁻⁴ and 5.8×10⁻⁴ mg dL⁻¹, respectively) to those concentrations obtained using the uricase-based enzymatic-colorimetric method. It was also observed that the presence of dendrimers in the GCE/AuNp@cysteamine/PAMAM system minimizes ascorbic acid (AA) interference during UA oxidation, thus improving the electrocatalytic activity of the gold nanoparticles. PMID:24491759

  17. Sequential inductive learning

    SciTech Connect

    Gratch, J.

    1996-12-31

    This article advocates a new model for inductive learning. Called sequential induction, it helps bridge classical fixed-sample learning techniques (which are efficient but difficult to formally characterize) and worst-case approaches (which provide strong statistical guarantees but are too inefficient for practical use). Learning proceeds as a sequence of decisions which are informed by training data. By analyzing induction at the level of these decisions, and by utilizing only enough data to make each decision, sequential induction provides statistical guarantees but with substantially less data than worst-case methods require. The sequential inductive model is also useful as a method for determining a sufficient sample size for inductive learning and as such, is relevant to learning problems where the preponderance of data or the cost of gathering data precludes the use of traditional methods.
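
    The "only enough data per decision" idea can be illustrated with a generic sequential selection sketch: draw labelled examples for two candidate hypotheses and stop as soon as a Hoeffding bound guarantees, with confidence 1 - delta, which one is better. This is a standard illustration of sequential decision-making with statistical guarantees, not the paper's exact algorithm.

```python
import math

def hoeffding_radius(n: int, delta: float) -> float:
    # With probability >= 1 - delta, an empirical mean of n values in
    # [0, 1] lies within this radius of its true mean.
    return math.sqrt(math.log(2.0 / delta) / (2.0 * n))

def sequential_select(stream_a, stream_b, delta=0.05, max_n=10_000):
    """Sample both accuracy streams until one is provably better."""
    sum_a = sum_b = 0.0
    n = 0
    for n, (a, b) in enumerate(zip(stream_a, stream_b), start=1):
        sum_a += a
        sum_b += b
        eps = hoeffding_radius(n, delta)
        # Stop when the confidence intervals around the two means separate.
        if abs(sum_a - sum_b) / n > 2.0 * eps:
            return ("a" if sum_a > sum_b else "b"), n
        if n >= max_n:
            break
    return "undecided", n

# Toy deterministic streams: hypothesis a scores 0.9 per example, b scores 0.1.
winner, used = sequential_select([0.9] * 100, [0.1] * 100)
```

    The stopping index doubles as a data-driven sufficient sample size: widely separated hypotheses are resolved after a handful of examples, while close ones automatically consume more data.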

  18. Competing Orders and Anomalies

    NASA Astrophysics Data System (ADS)

    Moon, Eun-Gook

    2016-08-01

    A conservation law is one of the most fundamental properties in nature, but a certain class of conservation “laws” could be spoiled by intrinsic quantum mechanical effects, so-called quantum anomalies. Profound properties of the anomalies have deepened our understanding of quantum many-body systems. Here, we investigate quantum anomaly effects in quantum phase transitions between competing orders and striking consequences of their presence. We explicitly calculate the topological nature of anomalies of non-linear sigma models (NLSMs) with the Wess-Zumino-Witten (WZW) terms. The non-perturbative nature is directly related with the ’t Hooft anomaly matching condition: anomalies are conserved in renormalization group flow. By applying the matching condition, we show massless excitations are enforced by the anomalies in a whole phase diagram in sharp contrast to the case of the Landau-Ginzburg-Wilson theory which only has massive excitations in symmetric phases. Furthermore, we find non-perturbative criteria to characterize quantum phase transitions between competing orders. For example, in 4D, we show the two competing order parameter theories, CP(1) and the NLSM with WZW, describe different universality classes. Physical realizations and experimental implications of the anomalies are also discussed.

  19. Competing Orders and Anomalies.

    PubMed

    Moon, Eun-Gook

    2016-08-08

    A conservation law is one of the most fundamental properties in nature, but a certain class of conservation "laws" could be spoiled by intrinsic quantum mechanical effects, so-called quantum anomalies. Profound properties of the anomalies have deepened our understanding of quantum many-body systems. Here, we investigate quantum anomaly effects in quantum phase transitions between competing orders and striking consequences of their presence. We explicitly calculate the topological nature of anomalies of non-linear sigma models (NLSMs) with the Wess-Zumino-Witten (WZW) terms. The non-perturbative nature is directly related with the 't Hooft anomaly matching condition: anomalies are conserved in renormalization group flow. By applying the matching condition, we show massless excitations are enforced by the anomalies in a whole phase diagram in sharp contrast to the case of the Landau-Ginzburg-Wilson theory which only has massive excitations in symmetric phases. Furthermore, we find non-perturbative criteria to characterize quantum phase transitions between competing orders. For example, in 4D, we show the two competing order parameter theories, CP(1) and the NLSM with WZW, describe different universality classes. Physical realizations and experimental implications of the anomalies are also discussed.

  1. Competing Orders and Anomalies

    PubMed Central

    Moon, Eun-Gook

    2016-01-01

    A conservation law is one of the most fundamental properties in nature, but a certain class of conservation “laws” could be spoiled by intrinsic quantum mechanical effects, so-called quantum anomalies. Profound properties of the anomalies have deepened our understanding in quantum many body systems. Here, we investigate quantum anomaly effects in quantum phase transitions between competing orders and striking consequences of their presence. We explicitly calculate topological nature of anomalies of non-linear sigma models (NLSMs) with the Wess-Zumino-Witten (WZW) terms. The non-perturbative nature is directly related with the ’t Hooft anomaly matching condition: anomalies are conserved in renormalization group flow. By applying the matching condition, we show massless excitations are enforced by the anomalies in a whole phase diagram in sharp contrast to the case of the Landau-Ginzburg-Wilson theory which only has massive excitations in symmetric phases. Furthermore, we find non-perturbative criteria to characterize quantum phase transitions between competing orders. For example, in 4D, we show the two competing order parameter theories, CP(1) and the NLSM with WZW, describe different universality class. Physical realizations and experimental implication of the anomalies are also discussed. PMID:27499184

  2. Sequential elution process

    DOEpatents

    Kingsley, I.S.

    1987-01-06

    A process and apparatus are disclosed for the separation of complex mixtures of carbonaceous material by sequential elution with successively stronger solvents. In the process, a column containing glass beads is maintained in a fluidized state by a rapidly flowing stream of a weak solvent, and the sample is injected into this flowing stream such that a portion of the sample is dissolved therein and the remainder of the sample is precipitated therein and collected as a uniform deposit on the glass beads. Successively stronger solvents are then passed through the column to sequentially elute less soluble materials. 1 fig.

  3. The resolution of a magnetic anomaly map expected from GRM data

    NASA Technical Reports Server (NTRS)

    Strangway, D. W.; Arkani-Hamed, J.; Teskey, D. J.; Hood, P. J.

    1985-01-01

    Data from the MAGSAT mission were used to derive a global scalar magnetic anomaly map at an average altitude of about 400 km. It was possible to work with two data sets, corresponding to dawn and dusk. The anomalies which were repeatable at both dawn and dusk were identified, and the error limits of these anomalies were estimated. The repeatable anomalies were downward continued to about 10 km altitude. The anomalies over Canada were correlated quantitatively with bandpass-filtered magnetic anomalies derived from aeromagnetic surveys. The close correlation indicates that the repeatable anomalies detected from orbit are due to geological causes. This correlation supports the geological significance of the global anomaly map.

  4. Sequential memory: Binding dynamics

    NASA Astrophysics Data System (ADS)

    Afraimovich, Valentin; Gong, Xue; Rabinovich, Mikhail

    2015-10-01

    Temporal order memories are critical for everyday animal and human functioning. Experiments and our own experience show that the binding or association of various features of an event together and the maintaining of multimodality events in sequential order are the key components of any sequential memories—episodic, semantic, working, etc. We study the robustness of binding sequential dynamics based on our previously introduced model in the form of generalized Lotka-Volterra equations. In the phase space of the model, there exists a multi-dimensional binding heteroclinic network consisting of saddle equilibrium points and heteroclinic trajectories joining them. We prove here the robustness of the binding sequential dynamics, i.e., the feasibility phenomenon for coupled heteroclinic networks: for each collection of successive heteroclinic trajectories inside the unified networks, there is an open set of initial points such that the trajectory going through each of them follows the prescribed collection staying in a small neighborhood of it. We also show that the symbolic complexity function of the system restricted to this neighborhood is a polynomial of degree L - 1, where L is the number of modalities.
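    The generalized Lotka-Volterra dynamics described in this abstract can be sketched with a simple Euler integration. The growth rates, inhibition matrix, and step size below are illustrative assumptions, not the paper's parameters; asymmetric inhibition of this kind is one standard way to obtain sequential winner switching among saddle states.

```python
def glv_step(x, sigma, rho, dt=0.01):
    """One Euler step of the generalized Lotka-Volterra system
    dx_i/dt = x_i * (sigma_i - sum_j rho_ij * x_j).
    All parameter values here are illustrative assumptions."""
    n = len(x)
    return [
        max(0.0, x[i] + dt * x[i] * (sigma[i] - sum(rho[i][j] * x[j] for j in range(n))))
        for i in range(n)
    ]

# Three competing modes with asymmetric (rock-paper-scissors style)
# inhibition, which tends to produce sequential switching of the
# dominant mode, a heteroclinic-like sequence.
sigma = [1.0, 1.0, 1.0]
rho = [[1.0, 0.5, 2.0],
       [2.0, 1.0, 0.5],
       [0.5, 2.0, 1.0]]
x = [0.9, 0.1, 0.05]
for _ in range(2000):
    x = glv_step(x, sigma, rho)
print([round(v, 3) for v in x])  # current mode activities after t = 20
```

    In the full model the saddle equilibria and the heteroclinic trajectories joining them form the binding network; this sketch only integrates one trajectory through that structure.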

  5. Sequential Dependencies in Driving

    ERIC Educational Resources Information Center

    Doshi, Anup; Tran, Cuong; Wilder, Matthew H.; Mozer, Michael C.; Trivedi, Mohan M.

    2012-01-01

    The effect of recent experience on current behavior has been studied extensively in simple laboratory tasks. We explore the nature of sequential effects in the more naturalistic setting of automobile driving. Driving is a safety-critical task in which delayed response times may have severe consequences. Using a realistic driving simulator, we find…

  6. Sequential memory: Binding dynamics.

    PubMed

    Afraimovich, Valentin; Gong, Xue; Rabinovich, Mikhail

    2015-10-01

    Temporal order memories are critical for everyday animal and human functioning. Experiments and our own experience show that the binding or association of various features of an event together and the maintaining of multimodality events in sequential order are the key components of any sequential memories: episodic, semantic, working, etc. We study the robustness of binding sequential dynamics based on our previously introduced model in the form of generalized Lotka-Volterra equations. In the phase space of the model, there exists a multi-dimensional binding heteroclinic network consisting of saddle equilibrium points and heteroclinic trajectories joining them. We prove here the robustness of the binding sequential dynamics, i.e., the feasibility phenomenon for coupled heteroclinic networks: for each collection of successive heteroclinic trajectories inside the unified networks, there is an open set of initial points such that the trajectory going through each of them follows the prescribed collection staying in a small neighborhood of it. We also show that the symbolic complexity function of the system restricted to this neighborhood is a polynomial of degree L - 1, where L is the number of modalities. PMID:26520084

  7. Detecting ecosystem performance anomalies for land management in the upper colorado river basin using satellite observations, climate data, and ecosystem models

    USGS Publications Warehouse

    Gu, Y.; Wylie, B.K.

    2010-01-01

    This study identifies areas with ecosystem performance anomalies (EPA) within the Upper Colorado River Basin (UCRB) during 2005-2007 using satellite observations, climate data, and ecosystem models. The final EPA maps with 250-m spatial resolution were categorized as normal performance, underperformance, and overperformance (observed performance relative to weather-based predictions) at the 90% level of confidence. The EPA maps were validated using "percentage of bare soil" ground observations. The validation results at locations with comparable site potential showed that regions identified as persistently underperforming (overperforming) tended to have a higher (lower) percentage of bare soil, suggesting that our preliminary EPA maps are reliable and agree with ground-based observations. The 3-year (2005-2007) persistent EPA map from this study provides the first quantitative evaluation of ecosystem performance anomalies within the UCRB and will help the Bureau of Land Management (BLM) identify potentially degraded lands. Results from this study can be used as a prototype by BLM and other land managers for making optimal land management decisions. © 2010 by the authors.
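    The three-way classification described in this abstract (observed performance against a weather-based prediction, at a confidence level) can be sketched as a simple interval test. The confidence-interval half-width is supplied as an input here; the study derives it from its regression models, and all numbers below are illustrative.

```python
def classify_performance(observed, predicted, half_width_90):
    """Label a pixel relative to its weather-based prediction:
    within the 90% confidence band -> normal performance,
    below the band -> underperformance, above -> overperformance.
    The interval half-width is a supplied assumption, not the
    study's actual model output."""
    if observed < predicted - half_width_90:
        return "underperformance"
    if observed > predicted + half_width_90:
        return "overperformance"
    return "normal performance"

# Hypothetical NDVI-like performance values for three pixels:
print(classify_performance(0.42, 0.60, 0.10))  # underperformance
print(classify_performance(0.78, 0.60, 0.10))  # overperformance
print(classify_performance(0.55, 0.60, 0.10))  # normal performance
```

    Pixels flagged the same way across all three years would then form the "persistent" anomaly map.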

  9. Behavioral economics without anomalies.

    PubMed Central

    Rachlin, H

    1995-01-01

    Behavioral economics is often conceived as the study of anomalies superimposed on a rational system. As research has progressed, anomalies have multiplied until little is left of rationality. Another conception of behavioral economics is based on the axiom that value is always maximized. It incorporates so-called anomalies either as conflicts between temporal patterns of behavior and the individual acts comprising those patterns or as outcomes of nonexponential time discounting. This second conception of behavioral economics is both empirically based and internally consistent. PMID:8551195

  10. Imaging of facial anomalies.

    PubMed

    Castillo, M; Mukherji, S K

    1995-01-01

    Anomalies of the face may occur in its lower or middle segments. Anomalies of the lower face generally involve the derivatives of the branchial apparatus and therefore manifest as defects in the mandible, pinnae, external auditory canals, and portions of the middle ears. These anomalies are occasionally isolated, but most of them occur in combination with systemic syndromes. These anomalies generally are not associated with respiratory compromise. Anomalies of the midface may extend from the upper lip to the forehead, reflecting the complex embryology of this region. Most of these deformities are isolated, but some patients with facial clefts, notably the midline cleft syndrome and holoprosencephaly, have anomalies in other sites. This is important because these patients will require detailed imaging of the face and brain. Anomalies of the midface tend to involve the nose and its air-conducting passages. We prefer to divide these anomalies into those with and without respiratory obstruction. The most common anomalies that result in airway compromise include posterior choanal stenoses and atresias, bilateral cysts (mucoceles) of the distal lacrimal ducts, and stenosis of the pyriform (anterior) nasal aperture. These may be optimally evaluated with computed tomography (CT) and generally require immediate treatment to ensure adequate ventilation. Rare nasal anomalies that also result in airway obstruction are agenesis of the pharynx, agenesis of the nose, and hypoplasia of the nasal alae. Agenesis of the nasopharynx and nose are complex anomalies that require both CT and magnetic resonance imaging (MRI). The diagnosis of hypoplasia of the nasal alae is a clinical one; these anomalies do not require imaging studies. Besides facial clefts, anomalies of the nose without respiratory obstruction tend to be centered around the nasofrontal region. This is the site of the most common sincipital encephaloceles. Patients with frontonasal and nasoethmoidal encephaloceles require both

  11. Nitrogen isotope anomalies in primitive ordinary chondrites

    NASA Astrophysics Data System (ADS)

    Sugiura, Naoji; Hashizume, Ko

    1992-07-01

    Large anomalies in nitrogen isotopic composition were found in two type-L3 ordinary chondrites. One of them is isotopically heavy, and the other is isotopically light. The carriers of anomalous nitrogen are partly soluble in HCl. Thus, the anomalies are probably due to new types of presolar grains, although they have not been identified yet. Trapped Ar-36 in these chondrites seems to be associated with this anomalous nitrogen, and may be presolar in origin. The presence of two different nitrogen isotopic anomalies suggests that the parent body of L chondrites, and also the primitive solar nebula, were not homogeneous. Nitrogen isotope anomalies seem to be useful in detecting subdivisions of chemical groups of chondrites.

  12. Design and Implementation of an Anomaly Detector

    SciTech Connect

    Bagherjeiran, A; Cantu-Paz, E; Kamath, C

    2005-07-11

    This paper describes the design and implementation of a general-purpose anomaly detector for streaming data. Based on a survey of similar work from the literature, a basic anomaly detector builds a model on normal data, compares this model to incoming data, and uses a threshold to determine when the incoming data represent an anomaly. Models compactly represent the data but still allow for effective comparison. Comparison methods determine the distance between two models of data or the distance between a model and a point. Threshold selection is a largely neglected problem in the literature, but the current implementation includes two methods to estimate thresholds from normal data. With these components, a user can construct a variety of anomaly detection schemes. The implementation contains several methods from the literature. Three separate experiments tested the performance of the components on two well-known datasets and one completely artificial dataset. The results indicate that the implementation works and can reproduce results from previous experiments.
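    The generic scheme this abstract describes (model on normal data, comparison method, threshold estimated from normal data) can be sketched as follows. The Gaussian model, z-score comparison, and max-score threshold rule are illustrative assumptions, not the paper's actual implementation.

```python
import math

class GaussianAnomalyDetector:
    """Minimal anomaly detector for a univariate stream:
    fit a model on normal data, score incoming points by their
    distance to the model, and flag scores above a threshold
    estimated from the normal data themselves."""

    def fit(self, normal_data):
        n = len(normal_data)
        self.mean = sum(normal_data) / n
        var = sum((x - self.mean) ** 2 for x in normal_data) / n
        self.std = math.sqrt(var) or 1.0  # guard against zero variance
        # Threshold from normal data: largest training z-score plus a margin.
        scores = sorted(self.score(x) for x in normal_data)
        self.threshold = scores[-1] * 1.1
        return self

    def score(self, x):
        # Distance between the model and a point: absolute z-score.
        return abs(x - self.mean) / self.std

    def is_anomaly(self, x):
        return self.score(x) > self.threshold

detector = GaussianAnomalyDetector().fit([10.0, 10.5, 9.8, 10.2, 9.9, 10.1])
print(detector.is_anomaly(10.05))  # False: near the normal range
print(detector.is_anomaly(25.0))   # True: far outside it
```

    Swapping in a different model, distance, or threshold estimator reproduces the component structure the paper describes.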

  13. Dual diaphragmatic anomalies.

    PubMed

    Padmanabhan, Arjun; Thomas, Abin Varghese

    2016-01-01

    Although diaphragmatic anomalies such as eventration and hiatus hernia are commonly encountered on incidental chest X-ray imaging, the presence of concomitant multiple anomalies is extremely rare. This is all the more true in adults. Herein, we present the case of a 75-year-old female who, while undergoing routine chest X-ray imaging, was found to have eventration of the right hemidiaphragm along with a hiatus hernia. PMID:27625457

  14. Dual diaphragmatic anomalies

    PubMed Central

    Padmanabhan, Arjun; Thomas, Abin Varghese

    2016-01-01

    Although diaphragmatic anomalies such as eventration and hiatus hernia are commonly encountered on incidental chest X-ray imaging, the presence of concomitant multiple anomalies is extremely rare. This is all the more true in adults. Herein, we present the case of a 75-year-old female who, while undergoing routine chest X-ray imaging, was found to have eventration of the right hemidiaphragm along with a hiatus hernia.

  16. Sequential cloning of chromosomes

    SciTech Connect

    Lacks, S.A.

    1991-12-31

    A method for sequential cloning of chromosomal DNA and chromosomal DNA cloned by this method are disclosed. The method includes the selection of a target organism having a segment of chromosomal DNA to be sequentially cloned. A first DNA segment, having a first restriction enzyme site on either side and homologous to the chromosomal DNA to be sequentially cloned, is isolated. A first vector product is formed by ligating the homologous segment into a suitably designed vector. The first vector product is circularly integrated into the target organism's chromosomal DNA. The resulting integrated chromosomal DNA segment includes the homologous DNA segment at either end of the integrated vector segment. The integrated chromosomal DNA is cleaved with a second restriction enzyme and ligated to form a vector-containing plasmid, which is replicated in a host organism. The replicated plasmid is then cleaved with the first restriction enzyme. Next, a DNA segment containing the vector and a segment of DNA homologous to a distal portion of the previously isolated DNA segment is isolated. This segment is then ligated to form a plasmid which is replicated within a suitable host. This plasmid is then circularly integrated into the target chromosomal DNA. The chromosomal DNA containing the circularly integrated vector is treated with a third, retrorestriction enzyme. The cleaved DNA is ligated to give a plasmid that is used to transform a host permissive for replication of its vector. The sequential cloning process continues by repeated cycles of circular integration and excision. The excision is carried out alternately with the second and third enzymes.

  17. Splenic Anomalies of Shape, Size, and Location: Pictorial Essay

    PubMed Central

    Yildiz, Adalet Elcin; Ariyurek, Macit Orhan; Karcaaltincaba, Musturay

    2013-01-01

    The spleen can have a wide range of anomalies, including those of shape, location, number, and size. Although most of these anomalies are congenital, there are also acquired types. Congenital anomalies affecting the shape of the spleen are lobulations, notches, and clefts; the fusion and location anomalies of the spleen are accessory spleen, splenopancreatic fusion, and wandering spleen; polysplenia can be associated with a syndrome. Splenosis and small spleen are acquired anomalies, caused by trauma and sickle cell disease, respectively. These anomalies can be detected easily by using different imaging modalities, including ultrasonography, computed tomography, magnetic resonance imaging, and Tc-99m scintigraphy. In this pictorial essay, we review the imaging findings of these anomalies, which can cause diagnostic pitfalls and be interpreted as pathologic processes. PMID:23710135

  18. Synthesis and Application of an Aldazine-Based Fluorescence Chemosensor for the Sequential Detection of Cu²⁺ and Biological Thiols in Aqueous Solution and Living Cells.

    PubMed

    Jia, Hongmin; Yang, Ming; Meng, Qingtao; He, Guangjie; Wang, Yue; Hu, Zhizhi; Zhang, Run; Zhang, Zhiqiang

    2016-01-01

    A fluorescence chemosensor, 2-hydroxy-1-naphthaldehyde azine (HNA), was designed and synthesized for sequential detection of Cu²⁺ and biothiols. It was found that HNA can specifically bind to Cu²⁺ with 1:1 stoichiometry, accompanied by a dramatic fluorescence quenching and a remarkable bathochromic shift of the absorbance peak in HEPES buffer. The generated HNA-Cu²⁺ ensemble displayed a "turn-on" fluorescent response specific for biothiols (Hcy, Cys and GSH) based on the displacement approach, giving a remarkable recovery of fluorescence and UV-Vis spectra. The detection limits of HNA-Cu²⁺ to Hcy, Cys and GSH were estimated to be 1.5 μM, 1.0 μM and 0.8 μM, respectively, suggesting that HNA-Cu²⁺ is sensitive enough for the determination of thiols in biological systems. The biocompatibility of HNA towards A549 human lung carcinoma cells was evaluated by an MTT assay. The capability of HNA-Cu²⁺ to detect biothiols in live A549 cells was then demonstrated by a microscopy fluorescence imaging assay. PMID:26761012

  19. Sequential cloning of chromosomes

    DOEpatents

    Lacks, Sanford A.

    1995-07-18

    A method for sequential cloning of chromosomal DNA of a target organism is disclosed. A first DNA segment homologous to the chromosomal DNA to be sequentially cloned is isolated. The first segment has a first restriction enzyme site on either side. A first vector product is formed by ligating the homologous segment into a suitably designed vector. The first vector product is circularly integrated into the target organism's chromosomal DNA. The resulting integrated chromosomal DNA segment includes the homologous DNA segment at either end of the integrated vector segment. The integrated chromosomal DNA is cleaved with a second restriction enzyme and ligated to form a vector-containing plasmid, which is replicated in a host organism. The replicated plasmid is then cleaved with the first restriction enzyme. Next, a DNA segment containing the vector and a segment of DNA homologous to a distal portion of the previously isolated DNA segment is isolated. This segment is then ligated to form a plasmid which is replicated within a suitable host. This plasmid is then circularly integrated into the target chromosomal DNA. The chromosomal DNA containing the circularly integrated vector is treated with a third, retrorestriction (class IIS) enzyme. The cleaved DNA is ligated to give a plasmid that is used to transform a host permissive for replication of its vector. The sequential cloning process continues by repeated cycles of circular integration and excision. The excision is carried out alternately with the second and third enzymes.

  1. Radioactive anomaly discrimination from spectral ratios

    DOEpatents

    Maniscalco, James; Sjoden, Glenn; Chapman, Mac Clements

    2013-08-20

    A method for discriminating a radioactive anomaly from naturally occurring radioactive materials includes detecting a first number of gamma photons having energies in a first range of energy values within a predetermined period of time and detecting a second number of gamma photons having energies in a second range of energy values within the predetermined period of time. The method further includes determining, in a controller, a ratio of the first number of gamma photons having energies in the first range and the second number of gamma photons having energies in the second range, and determining that a radioactive anomaly is present when the ratio exceeds a threshold value.
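    The ratio-and-threshold logic this patent abstract describes can be sketched directly. The energy windows, simulated spectra, and threshold value below are illustrative assumptions; a real deployment would calibrate them against the naturally occurring radioactive material (NORM) background.

```python
def spectral_ratio_alarm(photon_energies_keV, low_window, high_window, threshold):
    """Flag a radioactive anomaly when the ratio of gamma counts in a
    first energy window to counts in a second window, over one
    measurement period, exceeds a threshold. Window bounds and the
    threshold are illustrative, not the patent's calibrated values."""
    low = sum(1 for e in photon_energies_keV if low_window[0] <= e < low_window[1])
    high = sum(1 for e in photon_energies_keV if high_window[0] <= e < high_window[1])
    if high == 0:
        return False  # no reference counts; cannot form the ratio
    return (low / high) > threshold

# Simulated photon energies (keV) detected in one measurement period.
background = [1460.0] * 50 + [600.0] * 40                   # NORM-like spectrum
source     = [1460.0] * 50 + [600.0] * 40 + [662.0] * 120   # plus a Cs-137-like peak

low_win, high_win = (500.0, 700.0), (1400.0, 1500.0)
print(spectral_ratio_alarm(background, low_win, high_win, threshold=2.0))  # False
print(spectral_ratio_alarm(source, low_win, high_win, threshold=2.0))      # True
```

    Because NORM contributes to both windows in a roughly fixed proportion, the ratio stays near its background value until an anomalous source distorts one window.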

  2. Relationships between Rwandan seasonal rainfall anomalies and ENSO events

    NASA Astrophysics Data System (ADS)

    Muhire, I.; Ahmed, F.; Abutaleb, K.

    2015-10-01

    This study aims primarily at investigating the relationships between Rwandan seasonal rainfall anomalies and El Niño-Southern Oscillation (ENSO) events. The study is useful for early warning of negative effects associated with extreme rainfall anomalies across the country. It covers the period 1935-1992, using long- and short-rains data from 28 weather stations in Rwanda and ENSO events sourced from Glantz (2001). The mean standardized anomaly indices were calculated to investigate their associations with ENSO events. One-way analysis of variance was applied to the mean standardized anomaly index values per ENSO event to explore the spatial correlation of rainfall anomalies per ENSO event. A geographical information system was used to present spatially the variations in mean standardized anomaly indices per ENSO event. The results showed three broad climatic periods, namely a dry period (1935-1960), a semi-humid period (1961-1976) and a wet period (1977-1992). Though positive and negative correlations were detected between extreme short-rains anomalies and El Niño events, La Niña events were mostly linked to negative rainfall anomalies while El Niño events were associated with positive rainfall anomalies. The occurrence of El Niño and La Niña in the same year does not show any clear association with rainfall anomalies; however, the phenomenon was more linked with positive long-rains anomalies and negative short-rains anomalies. The normal years were largely linked with negative long-rains anomalies and positive short-rains anomalies, which points to the influence of factors other than ENSO events. This makes it difficult to project seasonal rainfall anomalies in the country merely by predicting ENSO events.
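    The mean standardized anomaly index used in this study can be sketched as a per-station z-score averaged across stations. The toy rainfall numbers below are illustrative, not the study's data from the 28 Rwandan stations.

```python
import statistics

def standardized_anomalies(series):
    """Standardized anomaly index for one station: (x - mean) / std."""
    mu = statistics.mean(series)
    sigma = statistics.pstdev(series)
    return [(x - mu) / sigma for x in series]

def mean_standardized_anomaly_index(stations):
    """Average the per-station standardized anomalies season by season,
    as done in essence across the study's station network."""
    per_station = [standardized_anomalies(s) for s in stations]
    n_seasons = len(stations[0])
    return [sum(s[i] for s in per_station) / len(stations) for i in range(n_seasons)]

# Three hypothetical stations, five seasons of rainfall (mm):
stations = [
    [900, 950, 700, 1100, 850],
    [400, 420, 310, 480, 390],
    [1200, 1250, 980, 1400, 1150],
]
msai = mean_standardized_anomaly_index(stations)
print([round(v, 2) for v in msai])  # negative values flag dry seasons
```

    Standardizing each station before averaging keeps wet highland stations from dominating the national index, so the third season (dry at every station) comes out clearly negative.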

  3. Astrometric solar system anomalies

    SciTech Connect

    Nieto, Michael Martin; Anderson, John D

    2009-01-01

    There are at least four unexplained anomalies connected with astrometric data. Perhaps the most disturbing is the fact that when a spacecraft on a flyby trajectory approaches the Earth within 2000 km or less, it often experiences a change in total orbital energy per unit mass. Next, a secular change in the astronomical unit AU is definitely a concern. It is increasing by about 15 cm/yr. The other two anomalies are perhaps less disturbing because of known sources of nongravitational acceleration. The first is an apparent slowing of the two Pioneer spacecraft as they exit the solar system in opposite directions. Some astronomers and physicists are convinced this effect is of concern, but many others are convinced it is produced by a nearly identical thermal emission from both spacecraft, in a direction away from the Sun, thereby producing acceleration toward the Sun. The fourth anomaly is a measured increase in the eccentricity of the Moon's orbit. Here again, an increase is expected from tidal friction in both the Earth and Moon. However, there is a reported unexplained increase that is significant at the three-sigma level. It is prudent to suspect that all four anomalies have mundane explanations, or that one or more anomalies are a result of systematic error. Yet they might eventually be explained by new physics. For example, a slightly modified theory of gravitation is not ruled out, perhaps analogous to Einstein's 1916 explanation for the excess precession of Mercury's perihelion.

  4. Congenital Vascular Anomalies.

    PubMed

    Gravereaux, Edwin C.; Nguyen, Louis L.; Cunningham, Leslie D.

    2004-04-01

    Congenital vascular anomalies are rare. The cardiovascular specialist should nevertheless be aware of the more common types of vascular anomalies and understand the implications for patient treatment and the likelihood of associated morbidity. The presentation of congenital arteriovenous malformations can range from asymptomatic or cosmetic lesions, to those causing ischemia, ulceration, hemorrhage, or high-output congestive heart failure. Treatment of large, symptomatic arteriovenous malformations often requires catheter-directed embolization prior to the attempt at complete surgical excision. Later recurrence, due to collateral recruitment, is frequent. Graded compression stockings and leg elevation are the mainstays of treatment for the predominantly venous congenital vascular anomalies. Most congenital central venous disorders are clinically silent. An exception is the retrocaval ureter. Retroaortic left renal vein, circumaortic venous ring, and absent, left-sided or duplicated inferior vena cava are relevant when aortic or inferior vena cava procedures are planned. The treatment of the venous disorders is directed at prevention or management of symptoms. Persistent sciatic artery, popliteal entrapment syndrome, and aberrant right subclavian artery origin are congenital anomalies that are typically symptomatic at presentation. Because they mimic more common diseases, diagnosis is frequently delayed. Delay can result in significant morbidity for the patient. Failure to make the diagnosis of persistent sciatic artery and popliteal entrapment can result in critical limb ischemia and subsequent amputation. Unrecognized aberrant right subclavian artery origin associated with aneurysmal degeneration can rupture and result in death. The treatment options for large-vessel arterial anomalies are surgical, sometimes in combination with endovascular techniques.

  5. Magnetic anomalies. [Magsat studies

    NASA Technical Reports Server (NTRS)

    Harrison, C. G. A.

    1983-01-01

    The implications and accuracy of anomaly maps produced using Magsat data on the scalar and vector magnetic field of the Earth are discussed. Comparisons between the satellite maps and aeromagnetic survey maps show smoother data in the satellite maps and larger anomalies in the aircraft data. The maps are being applied to characterize the structure and tectonics of the underlying regions. Investigations are still needed regarding the directions of magnetization within the crust and to generate further correlations between anomaly features and large-scale geological structures. Furthermore, an increased data base is recommended for the Pacific Ocean basin in order to develop a better starting model for Pacific tectonic movements. The Pacific basin was larger farther back in time, and subduction zones surround the basin, complicating the description of the complex break-up scenario for Gondwanaland.

  6. QCD trace anomaly

    SciTech Connect

    Andersen, Jens O.; Leganger, Lars E.; Strickland, Michael; Su, Nan

    2011-10-15

    In this brief report we compare the predictions of a recent next-to-next-to-leading-order hard-thermal-loop perturbation theory (HTLpt) calculation of the QCD trace anomaly to available lattice data. We focus on the trace anomaly scaled by T² in two cases: Nf = 0 and Nf = 3. When using the canonical value of μ = 2πT for the renormalization scale, we find that for Yang-Mills theory (Nf = 0) agreement between HTLpt and lattice data for the T²-scaled trace anomaly begins at temperatures on the order of 8 Tc, while for Nf = 3, treating the subtracted piece as an interaction term, agreement begins already at temperatures above 2 Tc. In both cases we find that at very high temperatures the T²-scaled trace anomaly increases with temperature in accordance with the predictions of HTLpt.

  7. The structure of sequential effects.

    PubMed

    Gökaydin, Dinis; Navarro, Daniel J; Ma-Wyatt, Anna; Perfors, Amy

    2016-01-01

    There is a long history of research into sequential effects, extending more than one hundred years. The pattern of sequential effects varies widely both with experimental conditions and across individuals performing the same experiment. Yet this great diversity of results is poorly understood, particularly with respect to individual variation, which, save for some passing mentions, has largely gone unreported in the literature. Here we seek to understand the way in which sequential effects vary by identifying the causes underlying the differences observed in sequential effects. In order to achieve this goal we perform principal component analysis on a dataset of 158 individual results from participants performing different experiments with the aim of identifying hidden variables responsible for sequential effects. We find a latent structure consisting of 3 components related to sequential effects: 2 main and 1 minor. A relationship between the 2 main components and the separate processing of stimuli and of responses is proposed on the basis of previous empirical evidence. It is further speculated that the minor component of sequential effects arises as the consequence of processing delays. Independently of the explanation for the latent variables encountered, this work provides a unified descriptive model for a wide range of different types of sequential effects previously identified in the literature. In addition to explaining individual differences themselves, it is demonstrated how the latent structure uncovered here is useful in understanding the classical problem of the dependence of sequential effects on the interval between successive stimuli. PMID:26523425
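
The latent-structure analysis described above can be sketched as a standard principal component analysis. The data below are random stand-ins for the paper's 158 individual sequential-effect profiles; the matrix shape and variable names are assumptions for illustration only.

```python
import numpy as np

# Hypothetical stand-in for the dataset: one row per participant, one column
# per sequential-effect measure (e.g. mean response time conditioned on each
# possible recent stimulus history). Random data for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(158, 16))

# PCA via SVD of the mean-centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s**2 / np.sum(s**2)   # fraction of variance per component
scores = U * s                    # each participant's loading on each component
components = Vt                   # the latent sequential-effect patterns

# In the study, 3 components (2 main, 1 minor) captured the structure.
print(explained[:3])
```

On real data, the individual differences the paper discusses would appear as the spread of `scores` along the leading components.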

  8. QUASI-PERIODIC FAST-MODE WAVE TRAINS WITHIN A GLOBAL EUV WAVE AND SEQUENTIAL TRANSVERSE OSCILLATIONS DETECTED BY SDO/AIA

    SciTech Connect

    Liu Wei; Nitta, Nariaki V.; Aschwanden, Markus J.; Schrijver, Carolus J.; Title, Alan M.; Tarbell, Theodore D.; Ofman, Leon

    2012-07-01

    We present the first unambiguous detection of quasi-periodic wave trains within the broad pulse of a global EUV wave (so-called EIT wave) occurring on the limb. These wave trains, running ahead of the lateral coronal mass ejection (CME) front, which is 2-4 times slower, coherently travel to distances ≳ R_Sun/2 along the solar surface, with initial velocities up to 1400 km s⁻¹ decelerating to ≈650 km s⁻¹. The rapid expansion of the CME initiated at an elevated height of 110 Mm produces a strong downward and lateral compression, which may play an important role in driving the primary EUV wave and shaping its front forwardly inclined toward the solar surface. The wave trains have a dominant 2 minute periodicity that matches the X-ray flare pulsations, suggesting a causal connection. The arrival of the leading EUV wave front at increasing distances produces an uninterrupted chain sequence of deflections and/or transverse (likely fast kink mode) oscillations of local structures, including a flux-rope coronal cavity and its embedded filament, with delayed onsets consistent with the wave travel time at an elevated (by ≈50%) velocity within it. This suggests that the EUV wave penetrates through a topological separatrix surface into the cavity, unexpected from CME-caused magnetic reconfiguration. These observations, when taken together, provide compelling evidence of the fast-mode MHD wave nature of the primary (outer) fast component of a global EUV wave, running ahead of the secondary (inner) slow component of CME-caused restructuring.

  9. Quasi-periodic Fast-mode Wave Trains Within a Global EUV Wave and Sequential Transverse Oscillations Detected by SDO-AIA

    NASA Technical Reports Server (NTRS)

    Liu, Wei; Ofman, Leon; Nitta, Nariaki; Aschwanden, Markus J.; Schrijver, Carolus J.; Title, Alan M.; Tarbell, Theodore D.

    2012-01-01

    We present the first unambiguous detection of quasi-periodic wave trains within the broad pulse of a global EUV wave (so-called EIT wave) occurring on the limb. These wave trains, running ahead of the lateral coronal mass ejection (CME) front, which is 2-4 times slower, coherently travel to distances greater than approximately half a solar radius along the solar surface, with initial velocities up to 1400 kilometers per second decelerating to approximately 650 kilometers per second. The rapid expansion of the CME initiated at an elevated height of 110 Mm produces a strong downward and lateral compression, which may play an important role in driving the primary EUV wave and shaping its front forwardly inclined toward the solar surface. The wave trains have a dominant 2 minute periodicity that matches the X-ray flare pulsations, suggesting a causal connection. The arrival of the leading EUV wave front at increasing distances produces an uninterrupted chain sequence of deflections and/or transverse (likely fast kink mode) oscillations of local structures, including a flux-rope coronal cavity and its embedded filament, with delayed onsets consistent with the wave travel time at an elevated (by approximately 50%) velocity within it. This suggests that the EUV wave penetrates through a topological separatrix surface into the cavity, unexpected from CME-caused magnetic reconfiguration. These observations, when taken together, provide compelling evidence of the fast-mode MHD wave nature of the primary (outer) fast component of a global EUV wave, running ahead of the secondary (inner) slow component of CME-caused restructuring.

  10. Continental and oceanic magnetic anomalies: Enhancement through GRM

    NASA Technical Reports Server (NTRS)

    Vonfrese, R. R. B.; Hinze, W. J.

    1985-01-01

    In contrast to the POGO and MAGSAT satellites, the Geopotential Research Mission (GRM) satellite system will orbit at a minimum elevation to provide significantly better resolved lithospheric magnetic anomalies for more detailed and improved geologic analysis. In addition, GRM will measure corresponding gravity anomalies to enhance our understanding of the gravity field for vast regions of the Earth which are largely inaccessible to more conventional surface mapping. Crustal studies will greatly benefit from the dual data sets as modeling has shown that lithospheric sources of long wavelength magnetic anomalies frequently involve density variations which may produce detectable gravity anomalies at satellite elevations. Furthermore, GRM will provide an important replication of lithospheric magnetic anomalies as an aid to identifying and extracting these anomalies from satellite magnetic measurements. The potential benefits to the study of the origin and characterization of the continents and oceans, that may result from the increased GRM resolution are examined.

  11. Magnetic Anomalies in the Enderby Basin, the Southern Indian Ocean

    NASA Astrophysics Data System (ADS)

    Nogi, Y.; Sato, T.; Hanyu, T.

    2013-12-01

    Magnetic anomalies in the Southern Indian Ocean are vital to understanding the initial breakup process of Gondwana. However, the seafloor ages estimated from magnetic anomalies remain poorly defined because of the sparse observations in this area. To understand the seafloor spreading history related to the initial breakup of Gondwana, vector magnetic anomaly data as well as total-intensity magnetic anomaly data obtained by the R/V Hakuho-maru and the icebreaker Shirase in the Enderby Basin, Southern Indian Ocean, are used. The strikes of magnetic structures are deduced from the vector magnetic anomalies. Magnetic anomaly signals, most likely indicating the Mesozoic magnetic anomaly sequence, are obtained almost parallel to the west of WNW-ESE trending lineaments just to the south of the Conrad Rise inferred from satellite gravity anomalies. Most of the strikes of magnetic structures indicate NNE-SSW trends and are almost perpendicular to the WNW-ESE trending lineaments. Mesozoic-sequence magnetic anomalies with mostly WNW-ESE strikes are also observed along the NNE-SSW trending lineaments between the south of the Conrad Rise and Gunnerus Ridge. Magnetic anomalies originating from the Cretaceous normal polarity superchron are found in these profiles, although magnetic anomaly C34 has been identified just to the north of the Conrad Rise. However, Mesozoic-sequence magnetic anomalies are only observed on the west side of the WNW-ESE trending lineaments just to the south of the Conrad Rise and are not detected to the east of the Cretaceous normal superchron signals. These results show that the counterpart of the Mesozoic-sequence magnetic anomalies in the south of the Conrad Rise would be found in the East Enderby Basin, off East Antarctica. NNE-SSW trending magnetic structures, similar to those obtained just to the south of the Conrad Rise, are found off East Antarctica in the East Enderby Basin. However, some of the strikes show almost E-W orientations. These suggest complicated ridge

  12. Multi-Attribute Sequential Search

    ERIC Educational Resources Information Center

    Bearden, J. Neil; Connolly, Terry

    2007-01-01

    This article describes empirical and theoretical results from two multi-attribute sequential search tasks. In both tasks, the DM sequentially encounters options described by two attributes and must pay to learn the values of the attributes. In the "continuous" version of the task the DM learns the precise numerical value of an attribute when she…

  13. Feedback in sequential machine realizations.

    NASA Technical Reports Server (NTRS)

    Harlow, C. A.; Coates, C. L., Jr.

    1972-01-01

    A method is described for determining the realizability of a sequential machine with trigger or set-reset flip-flop memory elements when the feedback of the machine is given by a Boolean function. Feedbacks in several types of sequential machines with different memory elements are compared, showing the memory specifications allowing the realization of such machines.

  14. Vascular Anomalies and Airway Concerns

    PubMed Central

    Clarke, Caroline; Lee, Edward I.; Edmonds, Joseph

    2014-01-01

    Vascular anomalies, both tumors and malformations, can occur anywhere in the body, including the airway, often without any external manifestations. However, vascular anomalies involving the airway deserve special consideration, as proper recognition and management can be lifesaving. In this article, the authors discuss vascular anomalies as they pertain to the airway, focusing on proper diagnosis, diagnostic modalities, and therapeutic options. PMID:25045336

  15. Mass Anomalies on Ganymede

    NASA Technical Reports Server (NTRS)

    Schubert, G.; Anderson, J. D.; Jacobson, R. A.; Lau, E. L.; Moore, W. B.; Palguta, J.

    2004-01-01

    Radio Doppler data from two Ganymede encounters (G1 and G2) on the first two orbits in the Galileo mission have been analyzed previously for gravity information. For a satellite in hydrostatic equilibrium, its gravitational field can be modeled adequately by a truncated spherical harmonic series of degree two. However, a fourth-degree field is required in order to fit the second Galileo flyby (G2). This need for a higher-degree field strongly suggests that Ganymede's gravitational field is perturbed by a gravity anomaly near the G2 closest-approach point (79.29 latitude, 123.68 west longitude). In fact, a plot of the Doppler residuals, after removal of the best-fit model for the zero-degree term (GM) and the second-degree moments (J2 and C22), suggests that if an anomaly exists, it is located downtrack of the closest-approach point, closer to the equator.
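
A degree-two field of the kind fitted here can be evaluated directly from GM, J2, and C22. The sketch below uses illustrative constants of roughly Ganymede's published order of magnitude, not the authoritative Galileo fit values, and the function name is my own.

```python
import math

def degree2_potential(r_km, lat_rad, lon_rad, gm=9887.8, R=2634.1,
                      j2=1.28e-4, c22=3.83e-5):
    """Degree-two gravitational potential in km^2/s^2. The default constants
    are illustrative values of roughly Ganymede's published order, not the
    authoritative fit."""
    s = math.sin(lat_rad)
    p20 = 0.5 * (3.0 * s * s - 1.0)        # Legendre polynomial P2(sin lat)
    p22 = 3.0 * (1.0 - s * s)              # unnormalized P22(sin lat)
    rr = (R / r_km) ** 2
    return (gm / r_km) * (1.0 - j2 * rr * p20
                          + c22 * rr * p22 * math.cos(2.0 * lon_rad))

# Potential near the G2 closest-approach point; a localized mass anomaly
# would appear as a residual after subtracting this hydrostatic model.
v = degree2_potential(3000.0, math.radians(79.29), math.radians(-123.68))
print(v)
```

The degree-two correction is small relative to the monopole term GM/r, which is why a localized anomaly only becomes visible in the Doppler residuals after the best-fit low-degree model is removed.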

  16. Multiprobe in-situ measurement of magnetic field in a minefield via a distributed network of miniaturized low-power integrated sensor systems for detection of magnetic field anomalies

    NASA Astrophysics Data System (ADS)

    Javadi, Hamid H. S.; Bendrihem, David; Blaes, B.; Boykins, Kobe; Cardone, John; Cruzan, C.; Gibbs, J.; Goodman, W.; Lieneweg, U.; Michalik, H.; Narvaez, P.; Perrone, D.; Rademacher, Joel D.; Snare, R.; Spencer, Howard; Sue, Miles; Weese, J.

    1998-09-01

    Based on technologies developed for the Jet Propulsion Laboratory (JPL) Free-Flying-Magnetometer (FFM) concept, we propose to modify the present design of FFMs for detection of mines and arsenals with large magnetic signature. The result will be an integrated miniature sensor system capable of identifying a local magnetic-field anomaly caused by a magnetic dipole moment. The proposed integrated sensor system is in line with the JPL technology road-map for development of autonomous, intelligent, networked, integrated systems with a broad range of applications. In addition, advanced sensitive magnetic sensors (e.g., silicon micromachined magnetometer, laser-pumped helium magnetometer) are being developed for future NASA space plasma probes. It is envisioned that a fleet of these Integrated Sensor System (ISS) units will be dispersed on a minefield via an aerial vehicle (a low-flying airplane or helicopter). The number of such sensor systems in each fleet, and the corresponding in-situ probe-grid cell size, is based on the strength of the magnetic anomaly of the target and the ISS measurement resolution of the magnetic field vector. After a specified time, the ISS units will transmit the measured magnetic field and attitude data to an airborne platform for further data processing. The cycle of data acquisition and transmission will be continued until the batteries run out. Data analysis will allow a local deformation of the Earth's magnetic field vector by a magnetic dipole moment to be detected. Each ISS unit consists of a miniaturized sensitive 3-axis magnetometer, a high-resolution analog-to-digital converter (ADC), a Field Programmable Gate Array (FPGA)-based data subsystem, Li-batteries and power regulation circuitry, memory, an S-band transmitter, a single-patch antenna, and a sun angle sensor. Each ISS unit is packaged with non-magnetic components, and the electronic design implements low-magnetic-signature circuits. Care is undertaken to guarantee no corruption of magnetometer sensitivity as a result

  17. Sequential biases in accumulating evidence

    PubMed Central

    Huggins, Richard; Dogo, Samson Henry

    2015-01-01

    Whilst it is common in clinical trials to use the results of tests at one phase to decide whether to continue to the next phase and to subsequently design the next phase, we show that this can lead to biased results in evidence synthesis. Two new kinds of bias associated with accumulating evidence, termed ‘sequential decision bias’ and ‘sequential design bias’, are identified. Both kinds of bias are the result of making decisions on the usefulness of a new study, or its design, based on the previous studies. Sequential decision bias is determined by the correlation between the value of the current estimated effect and the probability of conducting an additional study. Sequential design bias arises from using the estimated value instead of the clinically relevant value of an effect in sample size calculations. We considered both the fixed‐effect and the random‐effects models of meta‐analysis and demonstrated analytically and by simulations that in both settings the problems due to sequential biases are apparent. According to our simulations, the sequential biases increase with increased heterogeneity. Minimisation of sequential biases arises as a new and important research area necessary for successful evidence‐based approaches to the development of science. © 2015 The Authors. Research Synthesis Methods Published by John Wiley & Sons Ltd. PMID:26626562
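
Sequential decision bias, as defined above, can be illustrated with a toy Monte Carlo in which the probability of running a second study depends on the first estimate. All numbers below are hypothetical and the pooling rule is the simplest fixed-effect case, not the paper's full simulation design.

```python
import numpy as np

rng = np.random.default_rng(1)
true_effect, se, n_sims = 0.2, 0.5, 20000

pooled = []
for _ in range(n_sims):
    est1 = rng.normal(true_effect, se)
    if est1 > 0.0:
        # The decision to conduct a second study is correlated with est1.
        est2 = rng.normal(true_effect, se)
        pooled.append(0.5 * (est1 + est2))   # fixed-effect pooling, equal SEs
    else:
        pooled.append(est1)

bias = float(np.mean(pooled)) - true_effect
print(bias)   # nonzero: the continuation rule distorts the pooled estimate
```

In this particular configuration the distortion is downward (promising first studies are diluted by an unbiased second study, while unpromising ones are not), but the direction and size of the bias depend on the decision rule and the pooling model, which is the paper's point.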

  18. High-throughput sequential injection method for simultaneous determination of plutonium and neptunium in environmental solids using macroporous anion-exchange chromatography, followed by inductively coupled plasma mass spectrometric detection.

    PubMed

    Qiao, Jixin; Hou, Xiaolin; Roos, Per; Miró, Manuel

    2011-01-01

    This paper reports an automated analytical method for rapid and simultaneous determination of plutonium and neptunium in soil, sediment, and seaweed, with detection via inductively coupled plasma mass spectrometry (ICP-MS). A chromatographic column packed with a macroporous anion exchanger (AG MP-1 M) was incorporated in a sequential injection (SI) system for the efficient retrieval of plutonium, along with neptunium, from matrix elements and potential interfering nuclides. The sorption and elution behavior of plutonium and neptunium on AG MP-1 M resin was compared with that on a commonly utilized AG 1 gel-type anion exchanger. Experimental results reveal that the pore structure of the anion exchanger plays a pivotal role in ensuring similar separation behavior of plutonium and neptunium along the separation protocol. It is proven that plutonium-242 ((242)Pu) performs well as a tracer for monitoring the chemical yield of neptunium when using AG MP-1 M resin, whereby the difficulties in obtaining a reliable and practicable isotopic neptunium tracer are overcome. An important asset of the SI setup is the feasibility of processing up to 100 g of solid substrates using a small-sized (ca. 2 mL) column, with chemical yields of neptunium and plutonium being ≥79%. Analytical results for three certified/standard reference materials and two solid samples from intercomparison exercises are in good agreement with the reference values at the 0.05 significance level. The overall on-column separation can be completed within 3.5 h for 10 g soil samples. Most importantly, the anion-exchange mini-column can be reused up to 10 times with satisfactory chemical yields (>70%), as demanded in environmental monitoring and emergency scenarios, making the proposed automated assembly well-suited for unattended and high-throughput analysis.

  19. [Fetal ocular anomalies: the advantages of prenatal magnetic resonance imaging].

    PubMed

    Brémond-Gignac, D; Copin, H; Elmaleh, M; Milazzo, S

    2010-05-01

    Congenital ocular malformations are uncommon and require prenatal diagnosis. Severe anomalies are more often detected by trained teams, while minor anomalies are more difficult to identify and must be systematically sought, particularly when multiple malformations or a family or maternal history is known. The prenatal diagnostic-imaging tool most commonly used is ultrasound, but it can be complemented by magnetic resonance imaging (MRI), which contributes crucial information. Fetal dysmorphism can occur in various types of dysfunction, and prenatal diagnosis must recognize fetal ocular anomalies. After systematic morphologic ultrasound imaging, different abnormalities detected by MRI are studied. Classical parameters such as binocular and interorbital measurements are used to detect hypotelorism and hypertelorism. Prenatal ocular anomalies such as cataract, microphthalmia, anophthalmia, and coloboma have been described. Fetal MRI added to prenatal sonography is essential in detecting cerebral and general anomalies and can give more information on the size and morphology of the eyeball. Fetal abnormality detection includes a detailed family and maternal history, an amniotic fluid sample for karyotype, and other analyses for a better understanding of the images. Each pregnancy must be discussed with all specialists for genetic counseling. With severe malformations, termination of pregnancy is proposed because of the risk of blindness and associated cerebral or systemic anomalies. Early prenatal diagnosis of ocular malformations can also detect associated abnormalities, allowing congenital cataracts that need surgical treatment to be managed as early as possible. Finally, various associated syndromes need a pediatric check-up that could lead to emergency treatment.

  20. Physicochemical isotope anomalies

    SciTech Connect

    Esat, T.M.

    1988-06-01

    The isotopic composition of refractory elements can be modified, by physical processes such as distillation and sputtering, in unexpected patterns. Distillation enriches the heavy isotopes in the residue and the light isotopes in the vapor. However, current models appear to be inadequate to describe the detailed mass dependence, in particular for large fractionations. Coarse- and fine-grained inclusions from the Allende meteorite exhibit correlated isotope effects in Mg, both as mass-dependent fractionation and as residual anomalies. This isotope pattern can be duplicated by high-temperature distillation in the laboratory. A ubiquitous property of meteoritic inclusions for Mg, as well as for most of the other elements where measurements exist, is mass-dependent fractionation. In contrast, terrestrial materials such as microtektites and tektite buttons, as well as lunar orange and green glass spheres, have normal Mg isotopic composition. A subset of interplanetary dust particles labelled as chondritic aggregates exhibits excesses in ²⁶Mg and deuterium anomalies. Sputtering is expected to be a dominant mechanism in the destruction of grains within interstellar dust clouds. An active proto-sun as well as the present solar-wind and solar-flare flux are of sufficient intensity to sputter significant amounts of material. Laboratory experiments in Mg show widespread isotope effects, including residual ²⁶Mg excesses and mass-dependent fractionation. It is possible that the ²⁶Mg excesses in interplanetary dust are related to sputtering by energetic solar-wind particles. The implications of the laboratory distillation and sputtering effects are discussed and contrasted with the anomalies in meteoritic inclusions and in the other extraterrestrial materials to which the authors have access.

  1. [First branchial cleft anomalies].

    PubMed

    Nikoghosyan, Gohar; Krogdahl, Annelise; Godballe, Christian

    2008-05-12

    First branchial cleft anomalies are congenital rare lesions that can sometimes be difficult to diagnose. During the normal embryonic development the outer ear canal derives from the first branchial cleft. Abnormal development can result in production of a cyst, sinus or fistula with recurring infections. Early and correct diagnosis is necessary for the correct choice of surgical set-up in which identification and preservation of the facial nerve is an important step. A case of first branchial cleft sinus is presented with further discussion of classification, diagnostics and treatment. PMID:18489895

  2. When do anomalies begin?

    NASA Astrophysics Data System (ADS)

    Lightman, Alan; Gingerich, Owen

    1992-02-01

    The present historical and methodological consideration of scientific anomalies notes that some of these are recognized as such, after long neglect, only after the emergence of compelling explanations for their presence in the given theory in view of an alternative conceptual framework. These cases of 'retrorecognition' are indicative not merely of a significant characteristic of the process of conceptual development and scientific discovery, but of the bases for such process in human psychology. Attention is given to the illustrative cases of the 'flatness problem' in big bang theory, the perigee-opposition problem in Ptolemaic astronomy, the continental-fit problem in geology, and the equality of inertial and gravitational mass.

  3. Sequentially pulsed traveling wave accelerator

    DOEpatents

    Caporaso, George J.; Nelson, Scott D.; Poole, Brian R.

    2009-08-18

    A sequentially pulsed traveling wave compact accelerator having two or more pulse forming lines each with a switch for producing a short acceleration pulse along a short length of a beam tube, and a trigger mechanism for sequentially triggering the switches so that a traveling axial electric field is produced along the beam tube in synchronism with an axially traversing pulsed beam of charged particles to serially impart energy to the particle beam.
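
The synchronism requirement described above reduces to a simple timing calculation: each switch fires when the bunch reaches its segment. The segment length, number of lines, and beam velocity below are assumptions for illustration, not values from the patent.

```python
# Hypothetical geometry: each pulse-forming line energizes one short segment
# of the beam tube, and its switch must fire as the beam reaches that segment
# so the axial electric field travels in synchronism with the bunch.
beam_velocity = 2.9e8        # m/s, assumed near-relativistic bunch
segment_length = 0.05        # m per pulse-forming line (assumed)
n_segments = 10

trigger_times = [i * segment_length / beam_velocity for i in range(n_segments)]
for i, t in enumerate(trigger_times):
    print(f"switch {i}: fire at {t * 1e9:.3f} ns")
```

Sub-nanosecond spacing between triggers is what makes a dedicated trigger mechanism, rather than independent timers, necessary for such a device.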

  4. Augment railgun and sequential discharge

    NASA Astrophysics Data System (ADS)

    Kobayashi, K.

    1993-01-01

    Proprietary R&D efforts toward the creation of railguns applicable to tactical weapon systems are presented. Attention is given to measures taken to maximize projectile velocity and to enable sequential-discharge operation, and to an augmented railgun which has demonstrated a 66-percent efficiency improvement over the two-rail baseline railgun system. This device is characterized by strong interaction between capacitor bank submodules during sequential discharge.

  5. Einstein, Entropy and Anomalies

    NASA Astrophysics Data System (ADS)

    Sirtes, Daniel; Oberheim, Eric

    2006-11-01

    This paper strengthens and defends the pluralistic implications of Einstein's successful, quantitative predictions of Brownian motion for a philosophical dispute about the nature of scientific advance that began between two prominent philosophers of science in the second half of the twentieth century (Thomas Kuhn and Paul Feyerabend). Kuhn promoted a monistic phase-model of scientific advance, according to which a paradigm driven `normal science' gives rise to its own anomalies, which then lead to a crisis and eventually a scientific revolution. Feyerabend stressed the importance of pluralism for scientific progress. He rejected Kuhn's model arguing that it fails to recognize the role that alternative theories can play in identifying exactly which phenomena are anomalous in the first place. On Feyerabend's account, Einstein's predictions allow for a crucial experiment between two incommensurable theories, and are an example of an anomaly that could refute the reigning paradigm only after the development of a competitor. Using Kuhn's specification of a disciplinary matrix to illustrate the incommensurability between the two paradigms, we examine the different research strategies available in this peculiar case. On the basis of our reconstruction, we conclude by rebutting some critics of Feyerabend's argument.

  6. Statistical significance of the gallium anomaly

    SciTech Connect

    Giunti, Carlo; Laveder, Marco

    2011-06-15

    We calculate the statistical significance of the anomalous deficit of electron neutrinos measured in the radioactive source experiments of the GALLEX and SAGE solar neutrino detectors, taking into account the uncertainty of the detection cross section. We found that the statistical significance of the anomaly is ≈3.0σ. A fit of the data in terms of neutrino oscillations favors at ≈2.7σ short-baseline electron neutrino disappearance with respect to the null hypothesis of no oscillations.

  7. Systematic Screening for Subtelomeric Anomalies in a Clinical Sample of Autism

    ERIC Educational Resources Information Center

    Wassink, Thomas H.; Losh, Molly; Piven, Joseph; Sheffield, Val C.; Ashley, Elizabeth; Westin, Erik R.; Patil, Shivanand R.

    2007-01-01

    High-resolution karyotyping detects cytogenetic anomalies in 5-10% of cases of autism. Karyotyping, however, may fail to detect abnormalities of chromosome subtelomeres, which are gene rich regions prone to anomalies. We assessed whether panels of FISH probes targeted for subtelomeres could detect abnormalities beyond those identified by…

  8. A Bayesian sequential processor approach to spectroscopic portal system decisions

    SciTech Connect

    Sale, K; Candy, J; Breitfeller, E; Guidry, B; Manatt, D; Gosnell, T; Chambers, D

    2007-07-31

    The development of faster more reliable techniques to detect radioactive contraband in a portal type scenario is an extremely important problem especially in this era of constant terrorist threats. Towards this goal the development of a model-based, Bayesian sequential data processor for the detection problem is discussed. In the sequential processor each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The nature of the sequential processor approach is that a detection is produced as soon as it is statistically justified by the data rather than waiting for a fixed counting interval before any analysis is performed. In this paper the Bayesian model-based approach, physics and signal processing models and decision functions are discussed along with the first results of our research.
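
The datum-by-datum updating idea can be sketched with a two-hypothesis Poisson arrival model (background-only versus source-plus-background), where each arrival time updates the posterior odds and a detection is declared as soon as the data justify it. The rates, threshold, and simulated arrivals below are illustrative assumptions, not the paper's actual processor, which includes detailed physics and signal models.

```python
import math
import random

# Two-hypothesis sequential Bayes: background-only rate b versus
# source-plus-background rate b + s (counts per second). These rates, the
# threshold, and the simulated arrivals are illustrative assumptions.
random.seed(3)
b, s = 5.0, 10.0
log_odds = 0.0                       # log P(source)/P(background), flat prior
threshold = math.log(99.0)           # declare detection at 99:1 posterior odds

t = 0.0
detected_at = None
while detected_at is None:
    dt = random.expovariate(b + s)   # simulate arrivals from a true source
    t += dt
    # Each datum (one arrival after waiting dt) updates the posterior odds.
    log_like_src = math.log(b + s) - (b + s) * dt
    log_like_bkg = math.log(b) - b * dt
    log_odds += log_like_src - log_like_bkg
    if log_odds > threshold:
        detected_at = t              # signal as soon as the data justify it

print(f"detection declared after {detected_at:.3f} s")
```

The contrast with a fixed counting interval is the key point: here the stopping time adapts to the evidence, so strong sources are flagged quickly while weak ones simply take longer.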

  9. Turtle Carapace Anomalies: The Roles of Genetic Diversity and Environment

    PubMed Central

    Velo-Antón, Guillermo; Becker, C. Guilherme; Cordero-Rivera, Adolfo

    2011-01-01

    Background Phenotypic anomalies are common in wild populations and multiple genetic, biotic and abiotic factors might contribute to their formation. Turtles are excellent models for the study of developmental instability because anomalies are easily detected in the form of malformations, additions, or reductions in the number of scutes or scales. Methodology/Principal Findings In this study, we integrated field observations, manipulative experiments, and climatic and genetic approaches to investigate the origin of carapace scute anomalies across Iberian populations of the European pond turtle, Emys orbicularis. The proportion of anomalous individuals varied from 3% to 69% in local populations, with increasing frequency of anomalies in northern regions. We found no significant effect of climatic and soil moisture, or climatic temperature on the occurrence of anomalies. However, lower genetic diversity and inbreeding were good predictors of the prevalence of scute anomalies among populations. Both decreasing genetic diversity and increasing proportion of anomalous individuals in northern parts of the Iberian distribution may be linked to recolonization events from the Southern Pleistocene refugium. Conclusions/Significance Overall, our results suggest that developmental instability in turtle carapace formation might be caused, at least in part, by genetic factors, although the influence of environmental factors affecting the developmental stability of turtle carapace cannot be ruled out. Further studies of the effects of environmental factors, pollutants and heritability of anomalies would be useful to better understand the complex origin of anomalies in natural populations. PMID:21533278

  10. Prevalence of Associated Anomalies in Cleft Lip and/or Palate Patients

    PubMed Central

    Abdollahi Fakhim, Shahin; Shahidi, Nikzad; Lotfi, Alireza

    2016-01-01

    Introduction: Orofacial clefts are among the most common congenital anomalies. Patients presenting with orofacial clefts often require surgery or other complex procedures. A cleft lip or palate can be a single anomaly or a part of multiple congenital anomalies. The reported prevalence of cleft disease and associated anomalies varies widely across the literature, and is dependent on the diagnostic procedure used. In this study we determined the prevalence of associated anomalies in patients with a cleft lip and/or palate, with a specific focus on cardiac anomalies. Materials and Methods: In this cross-sectional study, 526 patients with a cleft lip and /or palate admitted to the children’s referral hospital between 2006 and 2011 were evaluated. All associated anomalies were detected and recorded. Patient information collected included age, gender, type and side of cleft, craniofacial anomalies and presence of other anomalies, including cardiac anomalies. Data were analyzed using SPSS version 16. Results: Of the 526 patients enrolled in the study, 58% (305) were male and 42% (221) were female. In total, 75% of patients (396) were aged between 4 and 8 years and 25% (130) were aged less than 4 years. The most common cleft type in our study was bilateral cleft palate. The most commonly associated anomaly among cleft patients, in 12% of cleft patients, was a cardiac anomaly. The most common cardiac anomaly was atrial septal defect (ASD). Conclusion: The prevalence of associated anomalies among orofacial cleft patients is high. The most common associated anomaly is cardiac anomaly, with ASD being the most common cardiac anomaly. There are no significant relationships between type of cleft and associated cardiac anomalies. PMID:27280100

  11. Sequential Syndrome Decoding of Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Reed, I. S.; Truong, T. K.

    1984-01-01

    The algebraic structure of convolutional codes is reviewed and sequential syndrome decoding is applied to those codes. These concepts are then used to realize, by example, actual sequential decoding using the stack algorithm. The Fano metric for use in sequential decoding is modified so that it can be utilized to sequentially find the minimum weight error sequence.
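    The stack algorithm named in the abstract can be illustrated with a small hard-decision sketch. The (7,5) rate-1/2 code, the BSC crossover probability, and the two tail bits below are illustrative assumptions for this sketch, not the paper's syndrome-based formulation:

```python
import heapq
from math import log2

P = 0.05          # assumed BSC crossover probability (illustrative)
RATE = 0.5        # rate-1/2 code
M_MATCH = log2(2 * (1 - P)) - RATE   # per-bit Fano metric, bit agrees
M_MISS = log2(2 * P) - RATE          # per-bit Fano metric, bit differs

def encode_step(bit, state):
    """One step of the (7,5) rate-1/2 convolutional encoder."""
    s1, s0 = state
    c1 = bit ^ s1 ^ s0      # generator 111 (octal 7)
    c2 = bit ^ s0           # generator 101 (octal 5)
    return (c1, c2), (bit, s1)

def encode(bits):
    state, out = (0, 0), []
    for b in bits:
        sym, state = encode_step(b, state)
        out.extend(sym)
    return out

def stack_decode(received, n_bits):
    """Stack-algorithm sequential decoding with the Fano metric."""
    # Heap entries: (-metric, path, state); heapq pops the smallest
    # key, so metrics are negated to extend the best path first.
    heap = [(0.0, (), (0, 0))]
    while heap:
        neg_m, path, state = heapq.heappop(heap)
        if len(path) == n_bits:
            return list(path)           # best full-length path found
        for b in (0, 1):
            sym, nstate = encode_step(b, state)
            i = 2 * len(path)
            bm = sum(M_MATCH if sym[j] == received[i + j] else M_MISS
                     for j in range(2))
            heapq.heappush(heap, (neg_m - bm, path + (b,), nstate))
    return None
```

    With two tail zeros flushing the encoder, a single channel error leaves the transmitted path several mismatches ahead of any competing codeword, so the best-first search recovers the message.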

  12. Continuous sequential boundaries for vaccine safety surveillance.

    PubMed

    Li, Rongxia; Stewart, Brock; Weintraub, Eric; McNeil, Michael M

    2014-08-30

    Various recently developed sequential methods have been used to detect signals for post-marketing surveillance in drug and vaccine safety. Among these, the maximized sequential probability ratio test (MaxSPRT) has been used to detect elevated risks of adverse events following vaccination using large healthcare databases. However, a limitation of MaxSPRT is that it only provides a time-invariant flat boundary. In this study, we propose the use of time-varying boundaries for controlling how type I error is distributed throughout the surveillance period. This is especially useful in two scenarios: (i) when we desire generally larger sample sizes before a signal is generated, for example, when early adopters are not representative of the larger population; and (ii) when it is desired for a signal to be generated as early as possible, for example, when the adverse event is considered rare but serious. We consider four specific time-varying boundaries (which we call critical value functions), and we study their statistical power and average time to signal detection. The methodology we present here can be viewed as a generalization or flexible extension of MaxSPRT. PMID:24691986
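    The generalization described here can be sketched by treating the signaling boundary as an arbitrary function of the look index. The Poisson log-likelihood ratio below is the standard MaxSPRT test statistic; the specific boundary functions are illustrative, not the paper's four critical value functions:

```python
from math import log

def poisson_llr(k, mu):
    """MaxSPRT log-likelihood ratio for k observed vs mu expected events."""
    if k <= mu:
        return 0.0
    return k * log(k / mu) - (k - mu)

def surveil(events, expected, critical_value):
    """Sequential surveillance: critical_value(t) returns the boundary
    at look t, so a flat MaxSPRT boundary is just lambda t: c."""
    cum_k = cum_mu = 0.0
    for t, (k, mu) in enumerate(zip(events, expected), start=1):
        cum_k += k
        cum_mu += mu
        if poisson_llr(cum_k, cum_mu) >= critical_value(t):
            return t        # signal generated at look t
    return None             # surveillance ended without a signal
```

    Lowering the boundary at early looks (scenario ii) trades type I error spent early for faster detection of rare-but-serious adverse events; raising it delays signals until more data accrue (scenario i).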

  13. Nolen-Schiffer anomaly

    SciTech Connect

    Pieper, S.C.; Wiringa, R.B.

    1995-08-01

    The Argonne v₁₈ potential contains a detailed treatment of the pp, pn and nn electromagnetic potential, including Coulomb, vacuum polarization, Darwin-Foldy and magnetic moment terms, all with suitable form factors, and was fit to pp and pn data using the appropriate nuclear masses. In addition, it contains a nuclear charge-symmetry breaking (CSB) term adjusted to reproduce the difference in the experimental pp and nn scattering lengths. We have used these potential terms to compute differences in the binding energies of mirror isospin-1/2 nuclei (Nolen-Schiffer [NS] anomaly). Variational Monte Carlo calculations for the ³He-³H system and cluster variational Monte Carlo for the ¹⁵O-¹⁵N and ¹⁷F-¹⁷O systems were made. In the first case, the best variational wave function for the A = 3 nuclei was used. However, because our ¹⁶O wave function does not reproduce accurately the ¹⁶O rms radius, to which the NS anomaly is very sensitive, we adjusted the A = 15 and A = 17 wave functions to reproduce the experimental density profiles. Our computed energy differences for these three systems are 0.757 ± 0.001, 3.544 ± 0.018 and 3.458 ± 0.040 MeV respectively, which are to be compared with the experimental differences of 0.764, 3.537, and 3.544 MeV. Most of the theoretical uncertainties are due to uncertainties in the experimental rms radii. The nuclear CSB potential contributes 0.066, 0.188, and 0.090 MeV to these totals. We also attempted calculations for A = 39 and A = 41. However, in these cases, the experimental uncertainties in the rms radius make it impossible to extract useful information about the contribution of the nuclear CSB potential.

  14. Ebstein's anomaly in neonates.

    PubMed

    Moura, C; Guimarães, H; Areias, J C; Moreira, J

    2001-09-01

    Ebstein's anomaly is a rare congenital heart abnormality in which the tricuspid valve leaflets do not attach normally to the tricuspid valve annulus. The effective tricuspid valve orifice is displaced apically into the right ventricle (RV), near the junction of the inlet and the trabecular parts of the RV. The authors present a retrospective study of the patients with Ebstein's anomaly admitted to a neonatal intensive care unit, in the period between January 1993 and March 2000. There were ten patients, representing 0.24% of total neonates and 1.99% of all congenital heart disease cases admitted to the institution in the same period. Fifty per cent were male and only one case had prenatal diagnosis. Holosystolic murmur (100%) from tricuspid regurgitation and cyanosis (80%) were the most frequent clinical findings. Chest X-ray was abnormal in 90% of the neonates, with a "balloon-shaped" enlarged heart. The main electrocardiographic findings were right atrial enlargement (70%) and arrhythmias (40%). Apical displacement of the septal leaflet of the tricuspid valve, to a maximum of 20 mm, and leaflet tethering to the underlying RV myocardium were found in all patients. Tricuspid valve regurgitation was found in 90% (severe form in four cases). An atrial intracardiac shunt, mostly right-to-left, was also found in 50%. Digoxin was used (40%) to restore sinus rhythm. Fifty per cent of the neonates received intravenous prostaglandins. Two patients required a surgical procedure. Two patients died in the neonatal period. During the follow-up period (range 0.3-74.6 months), only one episode of supraventricular tachycardia was recorded. At present seven patients are clinically stable, three of them on medication.

  15. Medical management of vascular anomalies.

    PubMed

    Trenor, Cameron C

    2016-03-01

    We have entered an exciting era in the care of patients with vascular anomalies. These disorders require multidisciplinary care and coordination and dedicated centers have emerged to address this need. Vascular tumors have been treated with medical therapies for many years, while malformations have been historically treated with endovascular and operative procedures. The recent serendipitous discoveries of propranolol and sirolimus for vascular anomalies have revolutionized this field. In particular, sirolimus responses are challenging the dogma that vascular malformations are not biologically active. While initially explored for lymphatic anomalies, sirolimus is now being used broadly throughout the spectrum of vascular anomalies. Whether medical therapies are reserved for refractory patients or used first line is currently dependent on the experience and availability of alternative therapies at each institution. On the horizon, we anticipate new drugs targeting genes and pathways involved in vascular anomalies to be developed. Also, combinations of medications and protocols combining medical and procedural approaches are in development for refractory patients. PMID:27607327

  16. A bit serial sequential circuit

    NASA Technical Reports Server (NTRS)

    Hu, S.; Whitaker, S.

    1990-01-01

    Normally a sequential circuit with n state variables consists of n unique hardware realizations, one for each state variable. All variables are processed in parallel. This paper introduces a new sequential circuit architecture that allows the state variables to be realized in a serial manner using only one next state logic circuit. The action of processing the state variables in a serial manner has never been addressed before. This paper presents a general design procedure for circuit construction and initialization. Utilizing pass transistors to form the combinational next state forming logic in synchronous sequential machines, a bit serial state machine can be realized with a single NMOS pass transistor network connected to shift registers. The bit serial state machine occupies less area than other realizations which perform parallel operations. Moreover, the logical circuit of the bit serial state machine can be modified by simply changing the circuit input matrix to develop an adaptive state machine.

  17. System for closure of a physical anomaly

    DOEpatents

    Bearinger, Jane P; Maitland, Duncan J; Schumann, Daniel L; Wilson, Thomas S

    2014-11-11

    Systems for closure of a physical anomaly. Closure is accomplished by a closure body with an exterior surface. The exterior surface contacts the opening of the anomaly and closes the anomaly. The closure body has a primary shape for closing the anomaly and a secondary shape for being positioned in the physical anomaly. The closure body preferably comprises a shape memory polymer.

  18. Reliability of CHAMP Anomaly Continuations

    NASA Technical Reports Server (NTRS)

    vonFrese, Ralph R. B.; Kim, Hyung Rae; Taylor, Patrick T.; Asgharzadeh, Mohammad F.

    2003-01-01

    CHAMP is recording state-of-the-art magnetic and gravity field observations at altitudes ranging over roughly 300 - 550 km. However, anomaly continuation is severely limited by the non-uniqueness of the process and satellite anomaly errors. Indeed, our numerical anomaly simulations from satellite to airborne altitudes show that effective downward continuations of the CHAMP data are restricted to within approximately 50 km of the observation altitudes, while upward continuations can be effective over a somewhat larger altitude range. The great unreliability of downward continuation requires that the satellite geopotential observations must be analyzed at satellite altitudes if the anomaly details are to be exploited most fully. Given current anomaly error levels, joint inversion of satellite and near-surface anomalies is the best approach for implementing satellite geopotential observations for subsurface studies. We demonstrate the power of this approach using a crustal model constrained by joint inversions of near-surface and satellite magnetic and gravity observations for Maude Rise, Antarctica, in the southwestern Indian Ocean. Our modeling suggests that the dominant satellite altitude magnetic anomalies are produced by crustal thickness variations and remanent magnetization of the normal polarity Cretaceous Quiet Zone.

  19. An investigation of thermal anomalies in the Central American volcanic chain and evaluation of the utility of thermal anomaly monitoring in the prediction of volcanic eruptions. [Central America

    NASA Technical Reports Server (NTRS)

    Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.

    1975-01-01

    The author has identified the following significant results. Ground truth data collection proves that significant thermal anomalies exist at 13 volcanoes within the test site of Central America. The dimensions and temperature contrast of these anomalies are large enough to be detected by the Skylab S-192 instrument. The dimensions and intensity of thermal anomalies have changed at most of these volcanoes during the Skylab mission.

  20. Discovering System Health Anomalies Using Data Mining Techniques

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.

    2005-01-01

    We present a data mining framework for the analysis and discovery of anomalies in high-dimensional time series of sensor measurements that would be found in an Integrated System Health Monitoring system. We specifically treat the problem of discovering anomalous features in the time series that may be indicative of a system anomaly, or in the case of a manned system, an anomaly due to the human. Identification of these anomalies is crucial to building stable, reusable, and cost-efficient systems. The framework consists of an analysis platform and new algorithms that can scale to thousands of sensor streams to discover temporal anomalies. We discuss the mathematical framework that underlies the system and also describe in detail how this framework is general enough to encompass both discrete and continuous sensor measurements. We also describe a new set of data mining algorithms based on kernel methods and hidden Markov models that allow for the rapid assimilation, analysis, and discovery of system anomalies. We then describe the performance of the system on a real-world problem in the aircraft domain where we analyze the cockpit data from aircraft as well as data from the aircraft propulsion, control, and guidance systems. These data are discrete and continuous sensor measurements and are dealt with seamlessly in order to discover anomalous flights. We conclude with recommendations that describe the tradeoffs in building an integrated scalable platform for robust anomaly detection in ISHM applications.
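    The paper's kernel and hidden Markov model algorithms are not reproduced here, but the basic notion of scoring sensor streams against a nominal baseline can be sketched with a simple (assumed) per-sensor z-score detector:

```python
import statistics

def fit_baseline(train):
    """Per-sensor mean/stdev from nominal data; train is a list of
    equal-length sensor tuples (one tuple per time step)."""
    cols = list(zip(*train))
    return [(statistics.mean(c), statistics.stdev(c)) for c in cols]

def anomaly_scores(baseline, stream):
    """Score each time step by its largest per-sensor |z|-score."""
    return [max(abs(x - m) / s for x, (m, s) in zip(row, baseline))
            for row in stream]

def flag_anomalies(baseline, stream, threshold=4.0):
    """Indices of time steps whose score exceeds the threshold."""
    return [i for i, sc in enumerate(anomaly_scores(baseline, stream))
            if sc > threshold]
```

    Unlike this sketch, the methods described in the abstract model temporal structure, which lets them flag sequences of individually unremarkable readings.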

  1. Time Data Sequential Processor /TDSP/

    NASA Technical Reports Server (NTRS)

    Joseph, A. E.; Pavlovitch, T.; Roth, R. Y.; Sturms, F. M., Jr.

    1970-01-01

    Time Data Sequential Processor /TDSP/ computer program provides preflight predictions for lunar trajectories from injection to impact, and for planetary escape trajectories for up to 100 hours from launch. One of the major options TDSP performs is the determination of tracking station view periods.

  2. Sequential Effects in Essay Ratings

    ERIC Educational Resources Information Center

    Attali, Yigal

    2011-01-01

    Contrary to previous research on sequential ratings of student performance, this study found that professional essay raters of a large-scale standardized testing program produced ratings that were drawn toward previous ratings, creating an assimilation effect. Longer intervals between the two adjacent ratings and higher degree of agreement with…

  3. Sequential triangulation of orbital photography

    NASA Technical Reports Server (NTRS)

    Rajan, M.; Junkins, J. L.; Turner, J. D.

    1979-01-01

    The feasibility of structuring the satellite photogrammetric triangulation as an iterative Extended Kalman estimation algorithm is demonstrated. Comparative numerical results of the sequential versus batch estimation algorithms are presented. The difficulty of accurately modeling the attitude motion is overcome by utilizing the on-board angular rate measurements. Solutions of the differential equations and the evaluation of the state transition matrix are carried out numerically.

  4. Manifestations of sequential electron transfer

    SciTech Connect

    Thurnauer, M.C.; Tang, J.

    1996-05-01

    An essential feature of efficient photo-initiated charge separation is sequential electron transfer. Charge separation is initiated by photoexcitation of an electron donor followed by rapid electron transfer steps from the excited donor through a series of electron acceptors, so that, after one or two successive steps, charge separation is stabilized by the physical separation between the oxidized donor and reduced acceptor. The prime example of this process is the sequential electron transfer that takes place in the purple photosynthetic bacterial reaction center, resulting in the charge separation between P{sup +} and Q{sub A}{sup -} across a biological membrane. We have developed magnetic resonance tools to monitor sequential electron transfer. We are applying these techniques to study charge separation in natural photo-synthetic systems in order to gain insights into the features of the reaction center proteins that promote efficient charge separation. As we establish what some of these factors are, we are beginning to design artificial photosynthetic systems that undergo photoinduced sequential electron transfer steps.

  5. Congenital uterine anomalies affecting reproduction.

    PubMed

    Reichman, David E; Laufer, Marc R

    2010-04-01

    The following review seeks to summarise the current data regarding reproductive outcomes associated with congenital uterine anomalies. Such malformations originate from adverse embryologic events ranging from agenesis to lateral and vertical fusion defects. Associated renal anomalies are common both for the symmetric and asymmetric malformations. While fertility is minimally impacted upon by müllerian anomalies in most cases, such malformations have historically been associated with poor obstetric outcomes such as recurrent miscarriage, second trimester loss, preterm delivery, malpresentation and intrauterine foetal demise (IUFD). The following review delineates the existing literature regarding such outcomes and indicates therapies, where applicable, to optimise the care of such patients.

  6. South Atlantic Anomaly

    Atmospheric Science Data Center

    2013-04-19

    ... to detect visible light, are also sensitive to energetic protons at high altitudes. With the cover closed, background levels of protons stand out. This map was created by specially processing MISR "dark" ...

  7. Enceladus’ Emission at 2cm: an Anomaly Near the Equator?

    NASA Astrophysics Data System (ADS)

    Ries, Paul; Janssen, M.

    2013-10-01

    The Cassini spacecraft flew by Enceladus on 5 November 2011, configured to acquire SAR imaging of most of the surface with the RADAR instrument. While not optimized for acquiring radiometer data, the pass nonetheless recorded thermal emission from most of the surface. We report on global patterns of thermal emission. The thermal emission is consistent with dielectric constants of pure water or methane ice, but cannot discriminate between the two. The emissivity is low (≈0.5), consistent with substantial volume scattering. The most intriguing result, however, is an anomaly in the thermal emission on Enceladus’ leading hemisphere. This anomaly is at a similar location to anomalies previously detected with the CIRS instrument on Mimas and Tethys (Howett et al, 2011, 2012, Schenk et al 2011). Whether or not the anomaly on Enceladus is the same phenomenon remains to be seen. Possible origins and implications are discussed.

  8. Brain anomalies in velo-cardio-facial syndrome

    SciTech Connect

    Mitnick, R.J.; Bello, J.A.; Shprintzen, R.J.

    1994-06-15

    Magnetic resonance imaging of the brain in 11 consecutively referred patients with velo-cardio-facial syndrome (VCF) showed anomalies in nine cases including small vermis, cysts adjacent to the frontal horns, and small posterior fossa. Focal signal hyperintensities in the white matter on long TR images were also noted. The nine patients showed a variety of behavioral abnormalities including mild developmental delay, learning disabilities, and characteristic personality traits typical of this common multiple anomaly syndrome, which has been related to a microdeletion at 22q11. Analysis of the behavioral findings showed no specific pattern related to the brain anomalies, and the patients with VCF who did not have detectable brain lesions also had behavioral abnormalities consistent with VCF. The significance of the lesions is not yet known, but the high prevalence of anomalies in this sample suggests that structural brain abnormalities are probably common in VCF. 25 refs.

  9. Aeromagnetic anomalies and perspective oil traps in China

    SciTech Connect

    Zhang, Y.X.

    1994-10-01

    Based on analyses of aeromagnetic data from known oil and gas fields in China, aeromagnetic anomalies have been classified according to their genesis into three types: (1) structure-associated anomalies related to volcanic rocks, (2) anomalies related to magnetic basement fault blocks, and (3) structure-associated anomalies related to weakly magnetic sedimentary strata. The most successful applications of aeromagnetic data for locating favorable oil and gas structures are in the following kinds of areas: (1) areas where basement fault blocks of inhomogeneous lithology and magnetization are developed; (2) areas of weakly magnetic layered strata with considerable thickness, either effusive or clastic deposits; and (3) areas where magnetic layers have undergone tectonic deformation with faulting and dip angles larger than 30 degrees. For reliable detection of such structures in sedimentary rocks and associated oil and gas traps, an integrated interpretation of geological and geophysical data is necessary.

  10. Genetics Home Reference: Peters anomaly

    MedlinePlus

    ... the anterior segment is abnormal, leading to incomplete separation of the cornea from the iris or the ... anomaly type I is characterized by an incomplete separation of the cornea and iris and mild to ...

  11. Classifying sex biased congenital anomalies

    SciTech Connect

    Lubinsky, M.S.

    1997-03-31

    The reasons for sex biases in congenital anomalies that arise before structural or hormonal dimorphisms are established have long been unclear. A review of such disorders shows that patterning and tissue anomalies are female biased, and structural findings are more common in males. This suggests different gender dependent susceptibilities to developmental disturbances, with female vulnerabilities focused on early blastogenesis/determination, while males are more likely to involve later organogenesis/morphogenesis. A dual origin for some anomalies explains paradoxical reductions of sex biases with greater severity (i.e., multiple rather than single malformations), presumably as more severe events increase the involvement of an otherwise minor process with opposite biases to those of the primary mechanism. The cause for these sex differences is unknown, but early dimorphisms, such as differences in growth or presence of H-Y antigen, may be responsible. This model provides a useful rationale for understanding and classifying sex-biased congenital anomalies. 42 refs., 7 tabs.

  12. Congenital Anomalies of the Limbs

    PubMed Central

    Gingras, G.; Mongeau, M.; Moreault, P.; Dupuis, M.; Hebert, B.; Corriveau, C.

    1964-01-01

    As a preparatory step towards the development of a complete habilitation program for children with congenital limb anomalies associated with maternal ingestion of thalidomide, the medical records of all patients with congenital limb anomalies referred to the Rehabilitation Institute of Montreal in the past decade were studied, and an examination and a thorough reassessment were made of 41 patients (21 males and 20 females). In this paper, Part I, the medical and prosthetic aspects are dealt with and a form of management is described for each type of anomaly. The conclusions are reached that prosthetic fitting and training should be initiated very early in life and that co-operation of the parent is essential to successful habilitation of a child with congenital limb anomalies. PMID:14154297

  13. Revisiting Gravitational Anomalies and a Potential Solution

    NASA Astrophysics Data System (ADS)

    Murad, P. A.

    2009-03-01

    Gravitational anomalies require investigation and resolution to understand the space environment if man is to travel beyond the trans-lunar or trans-Mars regions. This paper will provide a framework for further and more detailed evaluations. These anomalies include a slight change in the Sun's gravitational attraction observed by two Pioneer probes, based upon trajectory deviations detected after more than a decade in flight, and several events where other long-range spacecraft undergoing flybys of the Earth experienced increases in velocity that could not be predicted by Newtonian gravitation. Moreover, the assumption of dark energy and dark matter supposedly explains some astronomical observations, including expansion of the cosmos on a scale of the order of galaxies, galaxy clusters and other celestial bodies at considerable distances from the Earth. If, however, gravitational waves exist, then gravity should obey a wavelike partial differential equation implying that gravity is a function of both spatial and temporal dimensions. If true, then gravity may grow or decay as a function of time in contrast to Newtonian gravitation, which has propulsion implications and may also provide a partial explanation for some of these anomalies.

  14. Overgrowth syndromes with vascular anomalies.

    PubMed

    Blei, Francine

    2015-04-01

    Overgrowth syndromes with vascular anomalies encompass entities with a vascular anomaly as the predominant feature vs those syndromes with predominant somatic overgrowth and a vascular anomaly as a more minor component. The focus of this article is to categorize these syndromes phenotypically, including updated clinical criteria, radiologic features, evaluation, management issues, pathophysiology, and genetic information. A literature review was conducted in PubMed using key words "overgrowth syndromes and vascular anomalies" as well as specific literature reviews for each entity and supportive genetic information (e.g., somatic mosaicism). Additional searches in OMIM and Gene Reviews were conducted for each syndrome. Disease entities were categorized by predominant clinical features, known genetic information, and putative affected signaling pathway. Overgrowth syndromes with vascular anomalies are a heterogeneous group of disorders, often with variable clinical expression, due to germline or somatic mutations. Overgrowth can be focal (e.g., macrocephaly) or generalized, often asymmetrically (and/or mosaically) distributed. All germ layers may be affected, and the abnormalities may be progressive. Patients with overgrowth syndromes may be at an increased risk for malignancies. Practitioners should be attentive to patients having syndromes with overgrowth and vascular defects. These patients require proactive evaluation, referral to appropriate specialists, and in some cases, early monitoring for potential malignancies. Progress in identifying vascular anomaly-related overgrowth syndromes and their genetic etiology has been robust in the past decade and is contributing to genetically based prenatal diagnosis and new therapies targeting the putative causative genetic mutations. PMID:25937473

  16. Sequential estimation of surface water mass changes from daily satellite gravimetry data

    NASA Astrophysics Data System (ADS)

    Ramillien, G. L.; Frappart, F.; Gratton, S.; Vasseur, X.

    2015-03-01

    We propose a recursive Kalman filtering approach to map regional spatio-temporal variations of terrestrial water mass over large continental areas, such as South America. Instead of correcting hydrology model outputs by the GRACE observations using a Kalman filter estimation strategy, regional 2-by-2 degree water mass solutions are constructed by integration of daily potential differences deduced from GRACE K-band range rate (KBRR) measurements. Recovery of regional water mass anomaly averages obtained by accumulation of information of daily noise-free simulated GRACE data shows that convergence is relatively fast and yields accurate solutions. In the case of cumulating real GRACE KBRR data contaminated by observational noise, the sequential method of step-by-step integration provides estimates of water mass variation for the period 2004-2011 by considering a set of suitable a priori error uncertainty parameters to stabilize the inversion. Spatial and temporal averages of the Kalman filter solutions over river basin surfaces are consistent with the ones computed using global monthly/10-day GRACE solutions from official providers CSR, GFZ and JPL. They are also highly correlated to in situ records of river discharges (70-95 %), especially for the Obidos station where the total outflow of the Amazon River is measured. The sparse daily coverage of the GRACE satellite tracks limits the time resolution of the regional Kalman filter solutions, and thus the detection of short-term hydrological events.
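    The flavor of the recursive Kalman estimation can be conveyed by a toy scalar filter assimilating daily observations under a random-walk model; the state, noise variances, and observation model here are illustrative stand-ins for the regional 2-by-2 degree inversion described in the abstract:

```python
def kalman_step(x, P, z, q, r):
    """One predict/update cycle for a scalar random-walk state.
    x, P: prior state estimate and variance; z: new observation;
    q: process-noise variance; r: observation-noise variance."""
    # Predict: random-walk model leaves the state unchanged but
    # grows its uncertainty by the process noise.
    P = P + q
    # Update: blend prediction and observation by the Kalman gain.
    K = P / (P + r)
    x = x + K * (z - x)
    P = (1 - K) * P
    return x, P

def assimilate(observations, x0=0.0, P0=1e6, q=0.01, r=1.0):
    """Sequentially accumulate daily observations into a running
    estimate, mirroring the step-by-step integration of daily data."""
    x, P = x0, P0
    for z in observations:
        x, P = kalman_step(x, P, z, q, r)
    return x, P
```

    As in the abstract, each day's data tightens the posterior a little, so convergence comes from accumulation rather than from any single well-sampled epoch; the a priori uncertainty parameters (q, r here) stabilize the estimate.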

  17. Multiplexed protein profiling by sequential affinity capture

    PubMed Central

    Ayoglu, Burcu; Birgersson, Elin; Mezger, Anja; Nilsson, Mats; Uhlén, Mathias; Nilsson, Peter

    2016-01-01

    Antibody microarrays enable parallelized and miniaturized analysis of clinical samples, and have proven to provide novel insights for the analysis of different proteomes. However, there are concerns that the performance of such direct labeling and single antibody assays is prone to off-target binding due to the sample context. To improve selectivity and sensitivity while maintaining the possibility to conduct multiplexed protein profiling, we developed a multiplexed and semi-automated sequential capture assay. This novel bead-based procedure encompasses a first antigen capture, labeling of captured protein targets on magnetic particles, combinatorial target elution and a read-out by a secondary capture bead array. We demonstrate in a proof-of-concept setting that target detection via two sequential affinity interactions reduced off-target contribution, lowered background and noise levels, and improved correlation to clinical values compared to single binder assays. We also compared sensitivity levels with single binder and classical sandwich assays, explored the possibility for DNA-based signal amplification, and demonstrate the applicability of the dual capture bead-based antibody microarray for biomarker analysis. Hence, the described concept enhances the possibilities for antibody array assays to be utilized for protein profiling in body fluids and beyond. PMID:26935855

  18. Nonequilibrium structure in sequential assembly

    NASA Astrophysics Data System (ADS)

    Popov, Alexander V.; Craven, Galen T.; Hernandez, Rigoberto

    2015-11-01

    The assembly of monomeric constituents into molecular superstructures through sequential-arrival processes has been simulated and theoretically characterized. When the energetic interactions allow for complete overlap of the particles, the model is equivalent to that of the sequential absorption of soft particles on a surface. In the present work, we consider more general cases by including arbitrary aggregating geometries and varying prescriptions of the connectivity network. The resulting theory accounts for the evolution and final-state configurations through a system of equations governing structural generation. We find that particle geometries differ significantly from those in equilibrium. In particular, variations of structural rigidity and morphology tune particle energetics and result in significant variation in the nonequilibrium distributions of the assembly in comparison to the corresponding equilibrium case.
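    The limiting case where arriving particles exclude one another completely corresponds to classical random sequential adsorption, which a short one-dimensional sketch can simulate (the soft-particle energetics and aggregation geometries of the paper are not modeled here):

```python
import random

def rsa_1d(length, diameter, attempts, seed=0):
    """1-D random sequential adsorption of hard rods: each arrival
    sticks only if it overlaps no earlier particle, so the final
    configuration is a jammed, nonequilibrium structure."""
    rng = random.Random(seed)
    placed = []
    for _ in range(attempts):
        x = rng.uniform(0, length - diameter)  # candidate left edge
        if all(abs(x - y) >= diameter for y in placed):
            placed.append(x)                   # irreversible deposition
    return placed
```

    Because depositions are irreversible, the jammed coverage stays below the equilibrium close-packed density, a simple instance of the sequential-arrival distributions differing from the corresponding equilibrium case.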

  19. Robustness of the Sequential Lineup Advantage

    ERIC Educational Resources Information Center

    Gronlund, Scott D.; Carlson, Curt A.; Dailey, Sarah B.; Goodsell, Charles A.

    2009-01-01

    A growing movement in the United States and around the world involves promoting the advantages of conducting an eyewitness lineup in a sequential manner. We conducted a large study (N = 2,529) that included 24 comparisons of sequential versus simultaneous lineups. A liberal statistical criterion revealed only 2 significant sequential lineup…

  20. Deterministic sequential isolation of floating cancer cells under continuous flow.

    PubMed

    Tran, Quang D; Kong, Tian Fook; Hu, Dinglong; Marcos; Lam, Raymond H W

    2016-08-01

    Isolation of rare cells, such as circulating tumor cells, has been challenging because of their low abundance and the limited timeframes in which relevant cell characteristics are expressed. In this work, we devise a novel hydrodynamic mechanism to sequentially trap and isolate floating cells in biosamples. We develop a microfluidic device for the sequential isolation of floating cancer cells through a series of microsieves to obtain up to 100% trapping yield and >95% sequential isolation efficiency. We optimize the trappers' dimensions and locations through both computational and experimental analyses using microbeads and cells. Furthermore, we investigate the functional range of flow rates for effective sequential cell isolation by taking cell deformability into account. We verify the cell isolation ability using the human breast cancer cell line MDA-MB-231, with perfect agreement with the microbead results. The viability of the isolated cells can be maintained for direct identification of any cell characteristics within the device. We further demonstrate that this device can be applied to isolate the largest particles from a sample containing multiple sizes of particles, revealing its possible applicability in the isolation of circulating tumor cells in cancer patients' blood. Our study provides a promising sequential cell isolation strategy with high potential for rapid detection and analysis of general floating cells, including circulating tumor cells and other rare cell types. PMID:27387093

  1. Blocking for Sequential Political Experiments

    PubMed Central

    Moore, Sally A.

    2013-01-01

    In typical political experiments, researchers randomize a set of households, precincts, or individuals to treatments all at once, and characteristics of all units are known at the time of randomization. However, in many other experiments, subjects “trickle in” to be randomized to treatment conditions, usually via complete randomization. To take advantage of the rich background data that researchers often have (but underutilize) in these experiments, we develop methods that use continuous covariates to assign treatments sequentially. We build on biased coin and minimization procedures for discrete covariates and demonstrate that our methods outperform complete randomization, producing better covariate balance in simulated data. We then describe how we selected and deployed a sequential blocking method in a clinical trial and demonstrate the advantages of our having done so. Further, we show how that method would have performed in two larger sequential political trials. Finally, we compare causal effect estimates from differences in means, augmented inverse propensity weighted estimators, and randomization test inversion. PMID:24143061
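
The minimization-style idea described in this abstract can be sketched concretely: each arriving subject is provisionally placed in each arm, and the arm that would leave the arms' covariate means closest together is favored with a biased coin. The sketch below is a minimal illustration for two arms and one continuous covariate; the function name, probability `p = 0.8`, and tie-breaking rule are my assumptions, not the authors' actual procedure.

```python
import random

def assign_arm(x, groups, p=0.8, rng=random.Random(0)):
    """Assign an arriving subject with covariate value x to one of two arms.

    `groups` maps arm name -> covariate values of subjects already assigned.
    The arm that would leave the two arms' covariate means closest together
    is favored with probability p (a biased coin); ties use a fair coin.
    """
    arms = list(groups)

    def imbalance(arm):
        # mean difference if x were hypothetically placed into `arm`
        means = []
        for a in arms:
            vals = groups[a] + ([x] if a == arm else [])
            means.append(sum(vals) / len(vals) if vals else 0.0)
        return abs(means[0] - means[1])

    d = {a: imbalance(a) for a in arms}
    if d[arms[0]] == d[arms[1]]:
        chosen = arms[rng.random() < 0.5]
    else:
        better = min(arms, key=d.get)
        other = arms[1] if better == arms[0] else arms[0]
        chosen = better if rng.random() < p else other
    groups[chosen].append(x)
    return chosen
```

Unlike complete randomization, this uses the covariate value at the moment of arrival, which is why such schemes tend to produce better balance in trickle-in experiments.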

  2. Method for locating underground anomalies by diffraction of electromagnetic waves passing between spaced boreholes

    DOEpatents

    Lytle, R. Jeffrey; Lager, Darrel L.; Laine, Edwin F.; Davis, Donald T.

    1979-01-01

    Underground anomalies or discontinuities, such as holes, tunnels, and caverns, are located by lowering an electromagnetic signal transmitting antenna down one borehole and a receiving antenna down another, the ground to be surveyed for anomalies being situated between the boreholes. Electronic transmitting and receiving equipment associated with the antennas is activated and the antennas are lowered in unison at the same rate down their respective boreholes a plurality of times, each time with the receiving antenna at a different level with respect to the transmitting antenna. The transmitted electromagnetic waves diffract at each edge of an anomaly. This causes minimal signal reception at the receiving antenna. Triangulation of the straight lines between the antennas for the depths at which the signal minimums are detected precisely locates the anomaly. Alternatively, phase shifts of the transmitted waves may be detected to locate an anomaly, the phase shift being distinctive for the waves directed at the anomaly.
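
The triangulation step described above reduces to plane geometry: each detected signal minimum defines a straight ray from the transmitter depth in one borehole to the receiver depth in the other, and two such rays intersect at the anomaly. A sketch, assuming two vertical, parallel boreholes; the coordinate setup and function names are illustrative, not taken from the patent.

```python
def intersect(p1, p2, p3, p4):
    """Intersection of line p1-p2 with line p3-p4; points are (x, depth)."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    return ((a * (x3 - x4) - (x1 - x2) * b) / den,
            (a * (y3 - y4) - (y1 - y2) * b) / den)

def locate_anomaly(tx_a, rx_a, tx_b, rx_b, spacing):
    """Locate an anomaly from two transmitter/receiver depth pairs at which
    signal minima were observed; boreholes at x = 0 and x = spacing."""
    return intersect((0, tx_a), (spacing, rx_a), (0, tx_b), (spacing, rx_b))

# Two minima: tx at 20 m depth with rx at 40 m, and tx at 40 m with rx at
# 20 m, boreholes 10 m apart -> anomaly midway between them at 30 m depth.
print(locate_anomaly(20, 40, 40, 20, 10))  # → (5.0, 30.0)
```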

  3. MAGSAT anomaly map and continental drift

    NASA Technical Reports Server (NTRS)

    Lemouel, J. L. (Principal Investigator); Galdeano, A.; Ducruix, J.

    1981-01-01

    Anomaly maps of high quality are needed to display unambiguously the so-called long-wavelength anomalies. The anomalies were analyzed in terms of continental drift and the nature of their sources is discussed. The map presented confirms the thinness of the oceanic magnetized layer. Continental magnetic anomalies are characterized by elongated structures, generally of east-west trend. Paleomagnetic reconstruction shows that the anomalies found in India, Australia, and Antarctica exhibit a fair consistency with the African anomalies. It is also shown that anomalies are locked under the continents and have a fixed geometry.

  4. Lipid peroxidation in experimental uveitis: sequential studies.

    PubMed

    Goto, H; Wu, G S; Chen, F; Kristeva, M; Sevanian, A; Rao, N A

    1992-06-01

    Previously we have detected the occurrence of retinal lipid peroxidation initiated by phagocyte-derived oxygen radicals in experimental autoimmune uveitis (EAU). In the current studies, the confirmation of inflammation-mediated lipid peroxidation was extended to include measurement of multiple parameters: conjugated dienes, ketodienes, thiobarbituric acid reactive substances and fluorescent chromolipids. An assay for myeloperoxidase, a measure of the number of polymorphonuclear leukocytes at the inflammatory sites, was also carried out. The levels of all these parameters were followed through the course of EAU development. Sequential evaluation of histologic changes using both light and electron microscopy was also carried out and the results were correlated with the lipid peroxidation indices. These data suggest that retinal lipid peroxidation plays a causative role in the subsequent retinal degeneration.

  5. Giant intracranial aneurysms: rapid sequential computed tomography

    SciTech Connect

    Pinto, R.S.; Cohen, W.A.; Kricheff, I.I.; Redington, R.W.; Berninger, W.H.

    1982-11-01

    Giant intracranial aneurysms often present as mass lesions rather than with subarachnoid hemorrhage. Routine computed tomographic (CT) scans with contrast material will generally detect them, but erroneous diagnosis of basal meningioma is possible. Rapid sequential scanning (dynamic CT) after bolus injection of 40 ml of Renografin-76 can conclusively demonstrate an intracranial aneurysm, differentiating it from other lesions by transit-time analysis of the passage of contrast medium. In five patients, the dynamics of contrast bolus transit in aneurysms were consistently different from the dynamics in pituitary tumors, craniopharyngiomas, and meningiomas, thereby allowing a specific diagnosis. Dynamic CT was also useful after treatment of the aneurysms by carotid artery ligation and may be used as an alternative to angiographic evaluation in determining luminal patency or thrombosis.

  6. Sequential scintigraphic staging of small cell carcinoma

    SciTech Connect

    Bitran, J.D.; Bekerman, C.; Pinsky, S.

    1981-04-15

    Thirty patients with small cell carcinoma (SCC) of the lung were sequentially staged following a history and physical exam with liver, brain, bone, and gallium-67 citrate scans. Scintigraphic evaluation disclosed 7 of 30 patients (23%) with advanced disease, stage IIIM1. When gallium-67 scans were used as the sole criterion for staging, they proved to be accurate and identified six of the seven patients with occult metastatic disease. Gallium-67 scans proved to be accurate in detecting thoracic and extrathoracic metastases in the 30 patients with SCC, especially within the liver and lymph node-bearing areas. The diagnostic accuracy of gallium-67 fell in regions such as bone or brain. Despite the limitations of gallium-67 scanning, the authors conclude that these scans are useful in staging patients with SCC and should be the initial scans used in staging such patients.

  7. Experimental Anomalies in Neutrino Physics

    NASA Astrophysics Data System (ADS)

    Palamara, Ornella

    2014-03-01

    In recent years, experimental anomalies ranging in significance (2.8-3.8 σ) have been reported from a variety of experiments studying neutrinos over baselines less than 1 km. Results from the LSND and MiniBooNE short-baseline νe /νe appearance experiments show anomalies which cannot be described by oscillations between the three standard model neutrinos (the ``LSND anomaly''). In addition, a re-analysis of the anti-neutrino flux produced by nuclear power reactors has led to an apparent deficit in νe event rates in a number of reactor experiments (the ``reactor anomaly''). Similarly, calibration runs using 51Cr and 37Ar radioactive sources in the Gallium solar neutrino experiments GALLEX and SAGE have shown an unexplained deficit in the electron neutrino event rate over very short distances (the ``Gallium anomaly''). The puzzling results from these experiments, which together may suggest the existence of physics beyond the Standard Model and hint at exciting new physics, including the possibility of additional low-mass sterile neutrino states, have raised the interest in the community for new experimental efforts that could eventually solve this puzzle. Definitive evidence for sterile neutrinos would be a revolutionary discovery, with implications for particle physics as well as cosmology. Proposals to address these signals by employing accelerator, reactor and radioactive source experiments are in the planning stages or underway worldwide. In this talk some of these will be reviewed, with emphasis on the accelerator programs.

  8. Non-relativistic scale anomalies

    NASA Astrophysics Data System (ADS)

    Arav, Igal; Chapman, Shira; Oz, Yaron

    2016-06-01

    We extend the cohomological analysis in arXiv:1410.5831 of anisotropic Lifshitz scale anomalies. We consider non-relativistic theories with a dynamical critical exponent z = 2 with or without non-relativistic boosts and a particle number symmetry. We distinguish between cases depending on whether the time direction does or does not induce a foliation structure. We analyse both 1 + 1 and 2 + 1 spacetime dimensions. In 1 + 1 dimensions we find no scale anomalies with Galilean boost symmetries. The anomalies in 2 + 1 dimensions with Galilean boosts and a foliation structure are all B-type and are identical to the Lifshitz case in the purely spatial sector. With Galilean boosts and without a foliation structure we find also an A-type scale anomaly. There is an infinite ladder of B-type anomalies in the absence of a foliation structure with or without Galilean boosts. We discuss the relation between the existence of a foliation structure and the causality of the field theory.

  9. Astrometric solar-system anomalies

    NASA Astrophysics Data System (ADS)

    Anderson, John D.; Nieto, Michael Martin

    2010-01-01

    There are at least four unexplained anomalies connected with astrometric data. Perhaps the most disturbing is the fact that when a spacecraft on a flyby trajectory approaches the Earth within 2000 km or less, it often experiences a change in total orbital energy per unit mass. Next, a secular change in the astronomical unit AU is definitely a concern. It is reportedly increasing by about 15 cm yr⁻¹. The other two anomalies are perhaps less disturbing because of known sources of nongravitational acceleration. The first is an apparent slowing of the two Pioneer spacecraft as they exit the solar system in opposite directions. Some astronomers and physicists, including us, are convinced this effect is of concern, but many others are convinced it is produced by a nearly identical thermal emission from both spacecraft, in a direction away from the Sun, thereby producing acceleration toward the Sun. The fourth anomaly is a measured increase in the eccentricity of the Moon's orbit. Here again, an increase is expected from tidal friction in both the Earth and Moon. However, there is a reported unexplained increase that is significant at the three-sigma level. It is prudent to suspect that all four anomalies have mundane explanations, or that one or more anomalies are a result of systematic error. Yet they might eventually be explained by new physics. For example, a slightly modified theory of gravitation is not ruled out, perhaps analogous to Einstein's 1916 explanation for the excess precession of Mercury's perihelion.

  10. A Pulmonary Sequestered Segment with an Aberrant Pulmonary Arterial Supply: A Case of Unique Anomaly

    PubMed Central

    Kim, Minchul; An, Jin Kyung; Jung, Yoon Young; Choi, Yun Sun

    2016-01-01

    We presented a rare case of a 64-year-old man with a combined anomaly of the bronchus and pulmonary artery that was detected incidentally. Computed tomography showed a hyperlucent, aerated sequestered segment of the right lower lung with an independent ectopic bronchus, which had no connection to the other airway. The affected segment was supplied by its own aberrant pulmonary artery branch from the right pulmonary trunk. This anomaly cannot be classified with any of the previously reported anomalies. PMID:26957918

  11. Ant colony optimization-based firewall anomaly mitigation engine.

    PubMed

    Penmatsa, Ravi Kiran Varma; Vatsavayi, Valli Kumari; Samayamantula, Srinivas Kumar

    2016-01-01

    A firewall is the most essential component of network perimeter security. Due to human error and the involvement of multiple administrators in configuring firewall rules, there exist common anomalies in firewall rulesets such as Shadowing, Generalization, Correlation, and Redundancy. There is a need for research on efficient ways of resolving such anomalies. The challenge is also to see that the reordered or resolved ruleset conforms to the organization's framed security policy. This study proposes an ant colony optimization (ACO)-based anomaly resolution and reordering of firewall rules called ACO-based firewall anomaly mitigation engine. Modified strategies are also introduced to automatically detect these anomalies and to minimize manual intervention of the administrator. Furthermore, an adaptive reordering strategy is proposed to aid faster reordering when a new rule is appended. The proposed approach was tested with different firewall policy sets. The results were found to be promising in terms of the number of conflicts resolved, with minimal availability loss and marginal security risk. This work demonstrated the application of a metaheuristic search technique, ACO, in improving the performance of a packet-filter firewall with respect to mitigating anomalies in the rules, and at the same time demonstrated conformance to the security policy. PMID:27441151
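
To make one of the listed anomaly types concrete: a rule is *shadowed* when an earlier rule matches every packet it matches but takes a different action, so the later rule can never fire. The sketch below detects that case over interval-based rules; the rule representation is a deliberate simplification for illustration, not the paper's ACO engine.

```python
def covers(outer, inner):
    """True if range outer = (lo, hi) fully contains range inner."""
    return outer[0] <= inner[0] and inner[1] <= outer[1]

def shadowed_rules(rules):
    """Return (i, j) pairs where earlier rule i shadows later rule j.

    Each rule is (src_range, dst_range, action); rule j is shadowed when
    some rule i < j matches all of j's packets with a different action.
    """
    found = []
    for j, (src_j, dst_j, act_j) in enumerate(rules):
        for i, (src_i, dst_i, act_i) in enumerate(rules[:j]):
            if covers(src_i, src_j) and covers(dst_i, dst_j) and act_i != act_j:
                found.append((i, j))
                break
    return found

ruleset = [
    ((0, 255), (0, 255), "deny"),   # catch-all deny
    ((10, 20), (0, 255), "allow"),  # shadowed: can never fire
]
print(shadowed_rules(ruleset))  # → [(0, 1)]
```

Resolving the anomaly then amounts to reordering (moving the specific rule first) or deleting it, which is the search space the ACO engine in the paper explores.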

  12. Branchial Anomalies: Diagnosis and Management

    PubMed Central

    Azeez, Arun; Thada, Nikhil Dinaker; Rao, Pallavi; Prasad, Kishore Chandra

    2014-01-01

    Objective. To find out the incidence of involvement of individual arches, anatomical types of lesions, the age and sex incidence, the site and side of predilection, the common clinical features, the common investigations, treatment, and complications of the different anomalies. Setting. Academic Department of Otolaryngology, Head and Neck Surgery. Design. A 10 year retrospective study. Participants. 30 patients with clinically proven branchial anomalies including patients with bilateral disease totaling 34 lesions. Main Outcome Measures. The demographical data, clinical features, type of branchial anomalies, and the management details were recorded and analyzed. Results and Observations. The mean age of presentation was 18.67 years. Male to female sex ratio was 1.27 : 1 with a male preponderance. Of the 34 lesions, maximum incidence was of second arch anomalies (50%) followed by first arch. We had two cases each of third and fourth arch anomalies. Only 1 of the 30 patients (3.3%) presented with a lesion at birth. The most common pathological type of lesion was fistula (58.82%) followed by cyst. 41.18% of the lesions occurred on the right side. All the patients underwent surgical excision. None of our patients had involvement of the facial nerve in first branchial anomaly. All patients had tracts going superficial to the facial nerve. Conclusion. Confirming the extent of the tract is mandatory before any surgery as these lesions pass in relation to some of the most vital structures of the neck. Surgery should always be the treatment option. Injection of dye, microscopic removal, and inclusion of surrounding tissue while excising the tract lead to a decreased incidence of recurrence. PMID:24772172

  13. Boundary anomalies and correlation functions

    NASA Astrophysics Data System (ADS)

    Huang, Kuo-Wei

    2016-08-01

    It was shown recently that boundary terms of conformal anomalies recover the universal contribution to the entanglement entropy and also play an important role in the boundary monotonicity theorem of odd-dimensional quantum field theories. Motivated by these results, we investigate relationships between boundary anomalies and the stress tensor correlation functions in conformal field theories. In particular, we focus on how the conformal Ward identity and the renormalization group equation are modified by boundary central charges. Renormalized stress tensors induced by boundary Weyl invariants are also discussed, with examples in spherical and cylindrical geometries.

  14. Genetic basis for vascular anomalies.

    PubMed

    Kirkorian, A Yasmine; Grossberg, Anna L; Püttgen, Katherine B

    2016-03-01

    The fundamental genetics of many isolated vascular anomalies and syndromes associated with vascular anomalies have been elucidated. The rate of discovery continues to increase, expanding our understanding of the underlying interconnected molecular pathways. This review summarizes genetic and clinical information on the following diagnoses: capillary malformation, venous malformation, lymphatic malformation, arteriovenous malformation, PIK3CA-related overgrowth spectrum (PROS), Proteus syndrome, SOLAMEN syndrome, Sturge-Weber syndrome, phakomatosis pigmentovascularis, congenital hemangioma, verrucous venous malformation, cutaneomucosal venous malformation, blue rubber bleb nevus syndrome, capillary malformation-arteriovenous malformation syndrome, Parkes-Weber syndrome, and Maffucci syndrome. PMID:27607321

  15. Analysis of DSN software anomalies

    NASA Technical Reports Server (NTRS)

    Galorath, D. D.; Hecht, H.; Hecht, M.; Reifer, D. J.

    1981-01-01

    A categorized data base of software errors discovered during the various stages of development and operational use of the Deep Space Network DSN/Mark 3 System was developed. A study team identified several existing error classification schemes (taxonomies), prepared a detailed annotated bibliography of the error taxonomy literature, and produced a new classification scheme which was tuned to the DSN anomaly reporting system and encapsulated the work of others. Based upon the DSN/RCI error taxonomy, error data on approximately 1000 reported DSN/Mark 3 anomalies were analyzed, interpreted and classified. The error data were then summarized and histograms produced highlighting key tendencies.

  16. Review on possible gravitational anomalies

    NASA Astrophysics Data System (ADS)

    Amador, Xavier E.

    2005-01-01

    This is an updated introductory review of two possible gravitational anomalies that have attracted the attention of part of the scientific community: the Allais effect, which occurs during solar eclipses, and the Pioneer 10 spacecraft anomaly, experienced also by the Pioneer 11 and Ulysses spacecraft. It seems that, to date, no satisfactory conventional explanation exists for these phenomena, and this suggests that new physics may be needed to account for them. The main purpose of this review is to announce three new measurements that will be carried out during the 2005 solar eclipses in Panama and Colombia (Apr. 8) and in Portugal (Oct. 15).

  17. Random sequential adsorption on fractals.

    PubMed

    Ciesla, Michal; Barbasz, Jakub

    2012-07-28

    Irreversible adsorption of spheres on flat collectors having dimension d < 2 is studied. Molecules are adsorbed on Sierpinski's triangle and carpet-like fractals (1 < d < 2), and on a general Cantor set (d < 1). The adsorption process is modeled numerically using the random sequential adsorption (RSA) algorithm. The paper concentrates on measuring fundamental properties of the coverages, i.e., the maximal random coverage ratio and the density autocorrelation function, as well as the RSA kinetics. The obtained results allow us to improve the phenomenological relation between the maximal random coverage ratio and the collector dimension. Moreover, the simulations show that, in general, most of the known dimensional properties of adsorbed monolayers remain valid for non-integer dimensions.
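
The RSA algorithm itself is simple to state: particles arrive one at a time at uniformly random positions and stick only if they do not overlap anything already adsorbed, so the coverage saturates at a jammed, non-equilibrium value. A minimal sketch on a flat unit square rather than a fractal collector (boundary effects are ignored, and the finite number of attempts only approaches, never reaches, the true jamming limit):

```python
import math
import random

def rsa_disks(radius=0.05, attempts=20000, rng=random.Random(1)):
    """Random sequential adsorption of equal disks on the unit square.

    Each attempt drops a disk center at a uniform random position; it is
    accepted only if it lies at least one diameter from every accepted
    center. Returns the accepted centers and the covered area fraction.
    """
    placed = []
    min_sq = (2 * radius) ** 2  # squared minimum center-to-center distance
    for _ in range(attempts):
        x, y = rng.random(), rng.random()
        if all((x - px) ** 2 + (y - py) ** 2 >= min_sq for px, py in placed):
            placed.append((x, y))
    coverage = len(placed) * math.pi * radius ** 2
    return placed, coverage
```

For disks on an infinite plane the jamming coverage is known to be about 0.547; the paper's contribution is how this quantity behaves when the collector dimension d is non-integer.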

  18. Sequential extrusion-microwave pretreatment of switchgrass and big bluestem.

    PubMed

    Karunanithy, C; Muthukumarappan, K; Gibbons, W R

    2014-02-01

    Developing an effective and economical biomass pretreatment method is a significant roadblock to meeting the ever growing demand for transportation fuels. Earlier studies with different feedstocks revealed that in the absence of chemicals, neither extrusion nor microwave could be a standalone pretreatment. However, there is potential that the advantages of these individual methods can be harnessed in a sequential pretreatment process. Accordingly, switchgrass and big bluestem were extruded and then subjected to microwave pretreatment under optimal conditions that had been separately determined in prior studies. Pretreated biomass was then subjected to enzymatic hydrolysis to understand the effectiveness of the sequential pretreatment on sugar recovery and generation of fermentation inhibitors. Statistical analysis confirmed that moisture content, microwave power level, and exposure time (and their interactions) had a significant influence on sugar recovery. Sequential pretreatment of switchgrass (25% moisture, 450 W and 2.5 min) resulted in a maximum glucose, xylose, and total sugar recovery of 52.6%, 75.5%, and 59.2%, respectively. This was higher by 1.27 and 2.71, 1.21 and 4.60, and 1.25 and 2.87 times compared to extrusion alone and the unpretreated control, respectively. The same sequential pretreatment conditions achieved maximum glucose, xylose, and total sugar recovery of 83.2%, 92.1%, and 68.1%, respectively, for big bluestem. This was 1.14 and 4.1, 1.18 and 2.7, and 1.20 and 3.0 times higher than extrusion alone and the unpretreated control, respectively. This sequential pretreatment process did not aggravate acetic acid formation over levels observed with the individual pretreatments. Furthermore, furfural, HMF, and formic acid were not detected in any of the treatments. Although the sequential pretreatment process enhanced sugar recovery without increasing the levels of potential fermentation inhibitors, the increased energy input for the microwave treatment may

  19. CMB anomalies after Planck

    NASA Astrophysics Data System (ADS)

    Schwarz, Dominik J.; Copi, Craig J.; Huterer, Dragan; Starkman, Glenn D.

    2016-09-01

    Several unexpected features have been observed in the microwave sky at large angular scales, both by WMAP and by Planck. Among those features is a lack of both variance and correlation on the largest angular scales, alignment of the lowest multipole moments with one another and with the motion and geometry of the solar system, a hemispherical power asymmetry or dipolar power modulation, a preference for odd parity modes and an unexpectedly large cold spot in the Southern hemisphere. The individual p-values of the significance of these features are in the per mille to per cent level, when compared to the expectations of the best-fit inflationary ΛCDM model. Some pairs of those features are demonstrably uncorrelated, increasing their combined statistical significance and indicating a significant detection of CMB features at angular scales larger than a few degrees on top of the standard model. Despite numerous detailed investigations, we still lack a clear understanding of these large-scale features, which seem to imply a violation of statistical isotropy and scale invariance of inflationary perturbations. In this contribution we present a critical analysis of our current understanding and discuss several ideas of how to make further progress.

  20. Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.

    PubMed

    Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen

    2015-11-01

    Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented a variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is embedded in the EUROCAT Central Database for annual application to centrally held registry data. The methodology was incrementally adapted to improve its utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating the date of conception as the unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting of multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised-risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies. PMID:26026722
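
The case-count-based window idea can be sketched as follows: the scan statistic is the shortest time interval containing k consecutive cases, and its significance is judged against the same statistic recomputed on cases scattered uniformly over the study period. This minimal sketch uses Monte Carlo simulation (which the abstract notes EUROCAT later replaced with a lookup table for speed); the function names are illustrative, not EUROCAT's.

```python
import random

def min_span(case_times, k):
    """Shortest time interval containing k consecutive cases."""
    t = sorted(case_times)
    return min(t[i + k - 1] - t[i] for i in range(len(t) - k + 1))

def scan_p_value(case_times, k, period, sims=2000, rng=random.Random(0)):
    """Monte Carlo p-value for the k-case scan statistic.

    Under the null hypothesis the same number of cases falls uniformly
    over `period`; the p-value is the share of simulations whose shortest
    k-case window is no wider than the observed one.
    """
    observed = min_span(case_times, k)
    n = len(case_times)
    hits = sum(
        min_span([rng.uniform(0, period) for _ in range(n)], k) <= observed
        for _ in range(sims)
    )
    return hits / sims
```

Basing the window on a case count rather than a fixed duration is what makes the window width "variable": the same k cases define a narrow window during a cluster and a wide one during quiet periods.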