Science.gov

Sample records for sequential anomaly detection

  1. An anomaly detection and isolation scheme with instance-based learning and sequential analysis

    SciTech Connect

    Yoo, T. S.; Garcia, H. E.

    2006-07-01

    This paper presents an online anomaly detection and isolation (FDI) technique using an instance-based learning method combined with a sequential change detection and isolation algorithm. The proposed method uses kernel density estimation techniques to build statistical models of the given empirical data (null hypothesis). The null hypothesis is associated with a set of alternative hypotheses modeling the abnormalities of the system. A decision procedure involves a sequential change detection and isolation algorithm. Notably, the proposed method enjoys asymptotic optimality, as the applied change detection and isolation algorithm is optimal in minimizing the worst mean detection/isolation delay for a given mean time before a false alarm or a false isolation. The applicability and performance of this methodology are illustrated with a redundant sensor data set. (authors)
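    A minimal sketch of the combination described above (a kernel density estimate of the null hypothesis feeding a sequential change detector) might look as follows. This is an illustration, not the authors' algorithm: it assumes a one-dimensional Gaussian KDE with Silverman's bandwidth, a one-sided CUSUM on the negative log-density, and synthetic data.

```python
import numpy as np

def kde_logpdf(x, data, h):
    # log-density of a Gaussian kernel density estimate at x (log-sum-exp)
    z = -0.5 * ((x - data) / h) ** 2
    m = z.max()
    return (m + np.log(np.exp(z - m).sum())
            - np.log(len(data)) - 0.5 * np.log(2 * np.pi) - np.log(h))

def kde_cusum(train, stream, slack=0.5, threshold=8.0):
    """One-sided CUSUM on the 'surprise' (negative log-density) of each new
    point under the KDE null model; returns the first index at which the
    statistic crosses `threshold`, or -1 if no change is flagged."""
    h = 1.06 * train.std() * len(train) ** -0.2        # Silverman bandwidth
    drift = -np.mean([kde_logpdf(x, train, h) for x in train]) + slack
    s = 0.0
    for i, x in enumerate(stream):
        s = max(0.0, s - kde_logpdf(x, train, h) - drift)
        if s > threshold:
            return i
    return -1

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, 500)                    # empirical null data
stream = np.concatenate([rng.normal(0.0, 1.0, 50),    # in-control segment
                         rng.normal(6.0, 1.0, 20)])   # faulted sensor
change_at = kde_cusum(normal, stream)
print(change_at)
```

The `slack` term plays the role of the reference value in a classical CUSUM: in-control points then push the statistic down on average, while surprising points accumulate.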

  2. Anomaly Detection in Dynamic Networks

    SciTech Connect

    Turcotte, Melissa

    2014-10-14

    Anomaly detection in dynamic communication networks has many important security applications. These networks can be extremely large and so detecting any changes in their structure can be computationally challenging; hence, computationally fast, parallelisable methods for monitoring the network are paramount. For this reason the methods presented here use independent node and edge based models to detect locally anomalous substructures within communication networks. As a first stage, the aim is to detect changes in the data streams arising from node or edge communications. Throughout the thesis simple, conjugate Bayesian models for counting processes are used to model these data streams. A second stage of analysis can then be performed on a much reduced subset of the network comprising nodes and edges which have been identified as potentially anomalous in the first stage. The first method assumes communications in a network arise from an inhomogeneous Poisson process with piecewise constant intensity. Anomaly detection is then treated as a changepoint problem on the intensities. The changepoint model is extended to incorporate seasonal behavior inherent in communication networks. This seasonal behavior is also viewed as a changepoint problem acting on a piecewise constant Poisson process. In a static time frame, inference is made on this extended model via a Gibbs sampling strategy. In a sequential time frame, where the data arrive as a stream, a novel, fast Sequential Monte Carlo (SMC) algorithm is introduced to sample from the sequence of posterior distributions of the change points over time. A second method is considered for monitoring communications in a large scale computer network. The usage patterns in these types of networks are very bursty in nature and don’t fit a Poisson process model. For tractable inference, discrete time models are considered, where the data are aggregated into discrete time periods and probability models are fitted to the
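    The conjugate Bayesian counting models referred to above can be illustrated in their simplest form: Poisson counts on a single edge with a Gamma prior, whose posterior predictive is negative binomial. The sketch below is a toy under stated assumptions (a Gamma(1, 1) prior and a fixed tail threshold), not the thesis's models: it flags a count when its upper-tail predictive probability is small, and otherwise absorbs the count into the posterior.

```python
import math

def nb_logpmf(k, a, b):
    # posterior predictive of a Poisson rate with Gamma(shape=a, rate=b)
    # prior: a negative binomial distribution over the next count k
    return (math.lgamma(k + a) - math.lgamma(a) - math.lgamma(k + 1)
            + a * math.log(b / (b + 1.0)) - k * math.log(b + 1.0))

def monitor_counts(counts, a=1.0, b=1.0, alpha=1e-3):
    """Flag indices whose upper-tail predictive probability P(K >= k) falls
    below alpha; conjugately update (a += k, b += 1) on unflagged counts."""
    flagged = []
    for i, k in enumerate(counts):
        tail = 1.0 - sum(math.exp(nb_logpmf(j, a, b)) for j in range(k))
        if tail < alpha:
            flagged.append(i)          # anomalously large count
        else:
            a, b = a + k, b + 1.0      # learn from normal traffic
    return flagged

print(monitor_counts([2, 3, 1, 2, 4, 2, 3, 50, 2]))
```

Running one such monitor independently per node or edge is what makes this first-stage screen embarrassingly parallel.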

  3. Detecting Patterns of Anomalies

    DTIC Science & Technology

    2009-03-01

    …ct)P(bt|ct), where A, B and C are mutually exclusive subsets of attributes with at most k elements. This ratio is similar to the previous formula, but… …taken to be dependent if µ(A,B) ≥ βµ (2.1), where βµ is a threshold parameter, set to a low value of 0.1 (empirically) in our experiments. Thus, for a

  4. Seismic data fusion anomaly detection

    NASA Astrophysics Data System (ADS)

    Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David

    2014-06-01

    Detecting anomalies in non-stationary signals has valuable applications in many fields, including medicine and meteorology. Examples include identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes via seismographic data. Given the many available anomaly detection algorithms, it is important to compare candidate methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach uses an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives", or transformations, of the observed signal for anomalies. Possible perspectives include wavelet de-noising, the Fourier transform, peak-filtering, etc. To evaluate these techniques via signal fusion metrics, we apply signal preprocessing techniques such as de-noising to the original signal and then use a neural network to find anomalies in the generated signal. From this secondary result it is possible to use data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The results show which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.

  5. Anomaly detection on cup anemometers

    NASA Astrophysics Data System (ADS)

    Vega, Enrique; Pindado, Santiago; Martínez, Alejandro; Meseguer, Encarnación; García, Luis

    2014-12-01

    The performances of two rotor-damaged commercial anemometers (Vector Instruments A100 LK) were studied. The calibration results (i.e. the transfer function) were very linear, the aerodynamic behavior being more efficient than the one shown by both anemometers equipped with undamaged rotors. No detection of the anomaly (the rotors’ damage) was possible based on the calibration results. However, the Fourier analysis clearly revealed this anomaly.

  6. Seismic Anomaly Detection Using Symbolic Representation Methods

    NASA Astrophysics Data System (ADS)

    Christodoulou, Vyron; Bi, Yaxin; Wilkie, George; Zhao, Guoze

    2016-08-01

    In this work we investigate the use of symbolic representation methods for anomaly detection in different electromagnetic sequential time series datasets. An issue that is often overlooked regarding symbolic representation and its performance in anomaly detection is the use of a quantitative accuracy metric. Until recently, only visual representations have been used to show the efficiency of an algorithm in detecting anomalies. In this respect we propose a novel accuracy metric that takes into account the length of the sliding window of such symbolic representation algorithms, and we demonstrate its utility. For the evaluation of the accuracy metric, HOT-SAX is used, a method that aggregates data points by use of sliding windows. A HOT-SAX variant using overlapping windows is also introduced that achieves better results under the newly defined accuracy metric. Both methods are evaluated on ten different benchmark datasets; based on the empirical evidence, we then apply them to Earth's geomagnetic data gathered by the SWARM satellites and terrestrial sources around the epicenters of two seismic events in the Yunnan region of China.
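    The aggregation step underlying HOT-SAX, the SAX discretization itself, is compact enough to sketch. The snippet below is a generic SAX illustration (z-normalization, piecewise aggregate approximation, Gaussian breakpoints), not the paper's implementation, and assumes the window length is divisible by the number of segments.

```python
import numpy as np

# N(0,1) quantile breakpoints for alphabet sizes 3 and 4
BREAKPOINTS = {3: [-0.43, 0.43], 4: [-0.67, 0.0, 0.67]}

def sax_word(window, n_segments=4, alphabet="abcd"):
    """Convert one sliding window to a SAX word: z-normalize, average into
    n_segments frames (PAA), then bin each frame mean by the breakpoints."""
    w = np.asarray(window, dtype=float)
    w = (w - w.mean()) / (w.std() + 1e-12)
    paa = w.reshape(n_segments, -1).mean(axis=1)
    cuts = BREAKPOINTS[len(alphabet)]
    return "".join(alphabet[int(np.searchsorted(cuts, v))] for v in paa)

print(sax_word([0, 1, 2, 3, 4, 5, 6, 7]))   # a rising ramp
```

A discord-style detector then counts how often each word occurs across all windows and investigates the rarest words first.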

  7. Survey of Anomaly Detection Methods

    SciTech Connect

    Ng, B

    2006-10-12

    This survey defines the problem of anomaly detection and provides an overview of existing methods. The methods are categorized into two general classes: generative and discriminative. A generative approach involves building a model that represents the joint distribution of the input features and the output labels of system behavior (e.g., normal or anomalous) and then applying the model to formulate a decision rule for detecting anomalies. A discriminative approach, on the other hand, aims directly to find the decision rule with the smallest error rate that distinguishes between normal and anomalous behavior. For each approach, we give an overview of popular techniques and provide references to state-of-the-art applications.
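    As a toy contrast between the two classes, the generative route can be sketched in a few lines: fit class-conditional densities (one-dimensional Gaussians here, purely for illustration) and threshold the posterior log-odds. A discriminative method would instead search directly for the decision boundary.

```python
import numpy as np

def fit_gaussian(x):
    # class-conditional density estimate (illustrative 1-D Gaussian)
    return float(np.mean(x)), float(np.std(x) + 1e-12)

def log_lik(x, mu, sd):
    return -0.5 * np.log(2 * np.pi * sd ** 2) - (x - mu) ** 2 / (2 * sd ** 2)

def generative_rule(normal, anomalous, prior_anom=0.01):
    """Bayes decision rule from the joint model: label x anomalous when the
    posterior log-odds of the anomalous class are positive."""
    mu0, sd0 = fit_gaussian(normal)
    mu1, sd1 = fit_gaussian(anomalous)
    log_prior = np.log(prior_anom / (1.0 - prior_anom))
    return lambda x: log_lik(x, mu1, sd1) - log_lik(x, mu0, sd0) + log_prior > 0

rng = np.random.default_rng(3)
is_anom = generative_rule(rng.normal(0, 1, 200), rng.normal(5, 1, 200))
print(is_anom(5.0), is_anom(0.0))
```

Note how the class prior enters the rule explicitly; in a discriminative method the prior is absorbed into the learned boundary.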

  8. Sequential detection of web defects

    DOEpatents

    Eichel, Paul H.; Sleefe, Gerard E.; Stalker, K. Terry; Yee, Amy A.

    2001-01-01

    A system for detecting defects on a moving web having a sequential series of identical frames uses an imaging device to form a real-time camera image of a frame and a comparator to compare elements of the camera image with corresponding elements of an image of an exemplar frame. The comparator provides an acceptable indication if the pair of elements are determined to be statistically identical, and a defective indication if the pair of elements are determined to be statistically not identical. If the pair of elements is neither acceptable nor defective, the comparator recursively compares the element of said exemplar frame with corresponding elements of other frames on said web until either an acceptable or a defective indication occurs.

  9. Data Mining for Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Biswas, Gautam; Mack, Daniel; Mylaraswamy, Dinkar; Bharadwaj, Raj

    2013-01-01

    The Vehicle Integrated Prognostics Reasoner (VIPR) program describes methods for enhanced diagnostics as well as a prognostic extension to the current state-of-the-art Aircraft Diagnostic and Maintenance System (ADMS). VIPR introduced a new anomaly detection function for discovering previously undetected and undocumented situations, where there are clear deviations from nominal behavior. Once a baseline (nominal model of operations) is established, the detection and analysis is split between on-aircraft outlier generation and off-aircraft expert analysis to characterize and classify events that may not have been anticipated by individual system providers. Offline expert analysis is supported by data curation and data mining algorithms that can be applied in both supervised and unsupervised learning contexts. In this report, we discuss efficient methods to implement the Kolmogorov complexity measure using compression algorithms, and run a systematic empirical analysis to determine the best compression measure. Our experiments established that the combination of the DZIP compression algorithm and the CiDM distance measure provides the best results for capturing relevant properties of time series data encountered in aircraft operations. This combination was used as the basis for developing an unsupervised learning algorithm to define "nominal" flight segments using historical flight segments.
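    The compression-based route to Kolmogorov complexity mentioned above is commonly realized as the normalized compression distance (NCD). The sketch below substitutes stock zlib for the report's DZIP/CiDM combination (whose details are not given here); it only illustrates the general idea of comparing sequences by how well they compress together.

```python
import zlib

def c(data):
    # compressed length as a computable stand-in for Kolmogorov complexity
    return len(zlib.compress(data, 9))

def ncd(x, y):
    """Normalized Compression Distance: near 0 for near-identical byte
    strings, approaching 1 for unrelated ones."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

periodic = b"abcabcabc" * 50
same = b"abcabcabc" * 50
jumbled = bytes((i * 97 + 13) % 251 for i in range(450))
print(round(ncd(periodic, same), 2), round(ncd(periodic, jumbled), 2))
```

Because the concatenation of two similar sequences compresses almost as well as either alone, the numerator (and hence the distance) stays small for related data.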

  10. Symbolic Representation of Electromagnetic Data for Seismic Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Christodoulou, Vyron; Bi, Yaxin; Wilkie, George; Zhao, Guoze

    2016-08-01

    In this work we investigate the use of symbolic representation methods for anomaly detection in different electromagnetic sequential time series datasets. An issue that is often overlooked regarding symbolic representation and its performance in anomaly detection is the use of a quantitative accuracy metric. Until recently, only visual representations have been used to show the efficiency of an algorithm in detecting anomalies. In this respect we propose a novel accuracy metric that takes into account the length of the sliding window of such symbolic representation algorithms, and we demonstrate its utility. For the evaluation of the accuracy metric, HOT-SAX is used, a method that aggregates data points by use of sliding windows. A HOT-SAX variant using overlapping windows is also introduced that achieves better results under the newly defined accuracy metric. Both algorithms are evaluated on ten benchmark datasets and on real terrestrial and satellite data.

  11. Congenital renal anomalies detected in adulthood

    PubMed Central

    Muttarak, M; Sriburi, T

    2012-01-01

    Objective To document the types of congenital renal anomalies detected in adulthood, the clinical presentation and complications of these renal anomalies, and the most useful imaging modality in detecting a renal anomaly. Materials and methods This study was approved by the institutional review board and informed consent was waived. Between January 2007 and January 2011, the clinical data and imaging studies of 28 patients older than 18 years diagnosed with renal anomaly at the authors’ institution were retrospectively reviewed. Renal anomalies in this study included only those with abnormality in position and in form. Results Of these 28 patients, 22 underwent imaging studies and their results constituted the material of this study. Of the 22 patients, 14 had horseshoe kidneys (HSK), four had crossed renal ectopia and four had malrotation. Sixteen patients were men and six were women. The patients ranged in age from 19 to 74 years (mean age 51.1 years). Clinical presentations were abdominal pain (13), fever (13), haematuria (4), palpable mass (2), asymptomatic (2), polyuria (1), dysuria (1), blurred vision (1), and headache with weakness of the left extremities (1). Imaging studies included abdominal radiograph (15), intravenous pyelography (IVP) (8), retrograde pyelography (RP) (4), ultrasonography (US) (7), and computed tomography (CT) (9). Associated complications included urinary tract stones (17), urinary tract infection (16), hydronephrosis (12), and tumours (2). Abdominal radiograph suggested renal anomalies in nine out of 15 studies. IVP, RP, US and CT suggested anomalies in all patients who had these studies performed. However, CT was the best imaging modality to evaluate anatomy, function and complications in patients with renal anomalies. Conclusion HSK was the most common renal anomaly, with abdominal pain and fever being the most common presentations. UTI and stones were the most common complications. IVP, RP, US and CT can be used to diagnose renal

  12. Contextual Detection of Anomalies within Hyperspectral Images

    DTIC Science & Technology

    2011-03-01

    Hyperspectral Imagery (HSI), Unsupervised Target Detection, Target Identification, Contextual Anomaly Detection. …processing. Hyperspectral imaging has a wide range of applications within remote sensing, not limited to terrain classification, environmental monitoring… Johnson, R. J. (2008). Improved feature extraction, feature selection, and identification techniques that create a fast unsupervised hyperspectral

  13. Anomaly Detection for Discrete Sequences: A Survey

    SciTech Connect

    Chandola, Varun; Banerjee, Arindam; Kumar, Vipin

    2012-01-01

    This survey attempts to provide a comprehensive and structured overview of the existing research for the problem of detecting anomalies in discrete/symbolic sequences. The objective is to provide a global understanding of the sequence anomaly detection problem and how existing techniques relate to each other. The key contribution of this survey is the classification of the existing research into three distinct categories, based on the problem formulation that they are trying to solve. These problem formulations are: 1) identifying anomalous sequences with respect to a database of normal sequences; 2) identifying an anomalous subsequence within a long sequence; and 3) identifying a pattern in a sequence whose frequency of occurrence is anomalous. We show how these problem formulations are characteristically distinct from one another and discuss their relevance in various application domains. We review techniques from many disparate and disconnected application domains that address each of these formulations. Within each problem formulation, we group techniques into categories based on the nature of the underlying algorithm. For each category, we provide a basic anomaly detection technique, and show how the existing techniques are variants of the basic technique. This approach shows how different techniques within a category are related or different from each other. Our categorization reveals new variants and combinations that have not been investigated before for anomaly detection. We also provide a discussion of relative strengths and weaknesses of different techniques. We show how techniques developed for one problem formulation can be adapted to solve a different formulation, thereby providing several novel adaptations to solve the different problem formulations. We also highlight the applicability of the techniques that handle discrete sequences to other related areas such as online anomaly detection and time series anomaly detection.
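    The third formulation, a pattern whose frequency of occurrence is anomalous, has a particularly compact baseline: count every length-k window of the sequence and report the patterns that occur suspiciously rarely. This is a minimal sketch of the window-counting family the survey categorizes, not any specific technique from it.

```python
from collections import Counter

def rare_patterns(seq, k=3, max_count=1):
    """Slide a length-k window over a symbol sequence, count each pattern,
    and return (sorted) the patterns seen at most `max_count` times."""
    grams = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    freq = Counter(grams)
    return sorted({g for g in grams if freq[g] <= max_count})

# the lone 'x' disrupts the otherwise periodic abc pattern
print(rare_patterns("abcabcabcabcabxabcabc"))
```

More elaborate members of this family replace the raw count with a probability under a Markov or suffix-tree model, but the sliding-window skeleton is the same.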

  14. Hyperspectral Anomaly Detection in Urban Scenarios

    NASA Astrophysics Data System (ADS)

    Rejas Ayuga, J. G.; Martínez Marín, R.; Marchamalo Sacristán, M.; Bonatti, J.; Ojeda, J. C.

    2016-06-01

    We have studied the spectral features of reflectance and emissivity in the pattern recognition of urban materials in several single hyperspectral scenes, through a comparative analysis of anomaly detection methods and their relationship with city surfaces, with the aim of improving information extraction processes. Spectral ranges of the visible-near infrared (VNIR), shortwave infrared (SWIR) and thermal infrared (TIR) from hyperspectral data cubes of the AHS sensor and of HyMAP and MASTER for two cities, Alcalá de Henares (Spain) and San José (Costa Rica) respectively, have been used. In this research no prior knowledge of the targets is assumed; thus, the pixels are automatically separated according to their spectral information, significantly differentiated with respect to a background, either globally for the full scene, or locally by image segmentation. Several experiments on urban and semi-urban scenarios have been designed, analyzing the behaviour of the standard RX anomaly detector and different methods based on subspace, image projection and segmentation-based anomaly detection. A new technique for anomaly detection in hyperspectral data called DATB (Detector of Anomalies from Thermal Background), based on dimensionality reduction by projecting targets with unknown spectral signatures onto a background calculated from thermal spectrum wavelengths, is presented. First results and their consequences for non-supervised classification and information extraction processes are discussed.

  15. Anomaly Detection Techniques for Ad Hoc Networks

    ERIC Educational Resources Information Center

    Cai, Chaoli

    2009-01-01

    Anomaly detection is an important and indispensable aspect of any computer security mechanism. Ad hoc and mobile networks consist of a number of peer mobile nodes that are capable of communicating with each other absent a fixed infrastructure. Arbitrary node movements and lack of centralized control make them vulnerable to a wide variety of…

  16. Hyperspectral anomaly detection using enhanced global factors

    NASA Astrophysics Data System (ADS)

    Paciencia, Todd J.; Bauer, Kenneth W.

    2016-05-01

    Dimension reduction techniques have become a popular unsupervised approach to detecting anomalies in hyperspectral imagery. Although demonstrating promising results in the literature on specific images, these methods can become difficult to directly interpret and often require tuning of their parameters to achieve high performance on a specific set of images. This lack of generality is also compounded by the need to remove noise and atmospheric absorption spectral bands from the image prior to detection. Without a process for this band selection and to make the methods adaptable to different image compositions, performance becomes difficult to maintain across a wider variety of images. Here, we present a framework that uses factor analysis to provide a robust band selection and more meaningful dimension reduction with which to detect anomalies in the imagery. Measurable characteristics of the image are used to create an automated decision process that allows the algorithm to adjust to a particular image, while maintaining high detection performance. The framework and its algorithms are detailed, and results are shown for forest, desert, sea, rural, urban, anomaly-sparse, and anomaly-dense imagery types from different sensors. Additionally, the method is compared to current state-of-the-art methods and is shown to be computationally efficient.

  17. Sequential decision rules for failure detection

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.; Willsky, A. S.

    1981-01-01

    The formulation of the decision making of a failure detection process as a Bayes sequential decision problem (BSDP) provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology based on the Bayesian approach and aimed at reduced computational requirements is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is a useful one.

  18. Hyperspectral Anomaly Detection by Graph Pixel Selection.

    PubMed

    Yuan, Yuan; Ma, Dandan; Wang, Qi

    2016-12-01

    Hyperspectral anomaly detection (AD) is an important problem in the remote sensing field. It can make full use of spectral differences to discover potentially interesting regions without any target priors. Traditional Mahalanobis-distance-based anomaly detectors assume that the background spectrum distribution conforms to a Gaussian distribution. However, this and similar distributions may not hold for real hyperspectral images. Moreover, the background statistics are susceptible to contamination by anomaly targets, which leads to a high false-positive rate. To address these intrinsic problems, this paper proposes a novel AD method based on graph theory. We first construct a vertex- and edge-weighted graph and then utilize a pixel selection process to locate the anomaly targets. Two contributions are claimed in this paper: 1) no background distributions are required, which makes the method more adaptive, and 2) both the vertex and edge weights are considered, which enables more accurate detection performance and better robustness to noise. Intensive experiments on simulated and real hyperspectral images demonstrate that the proposed method outperforms other benchmark competitors. In addition, the robustness of the proposed method has been validated using various window sizes. This experimental result also demonstrates the valuable characteristics of lower computational complexity and less parameter tuning for real applications.
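    The Mahalanobis-distance baseline that the paper argues against is the global RX detector, which can be sketched in a few lines of NumPy. The implanted anomaly and the small diagonal loading term are illustrative choices, not part of the paper.

```python
import numpy as np

def rx_scores(pixels):
    """Global RX: Mahalanobis distance of each pixel spectrum from the
    scene mean/covariance.  `pixels` has shape (n_pixels, n_bands)."""
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False) + 1e-6 * np.eye(pixels.shape[1])
    inv = np.linalg.inv(cov)
    d = pixels - mu
    return np.einsum("ij,jk,ik->i", d, inv, d)   # per-pixel quadratic form

rng = np.random.default_rng(1)
scene = rng.normal(0.0, 1.0, (500, 5))   # Gaussian background, 5 bands
scene[250] += 8.0                        # implant one anomalous spectrum
print(int(rx_scores(scene).argmax()))
```

The two weaknesses the paper targets are visible even here: the Gaussian-background assumption is baked into the covariance, and the implanted anomaly itself contaminates the statistics it is scored against.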

  19. System and method for anomaly detection

    DOEpatents

    Scherrer, Chad

    2010-06-15

    A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrence as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.
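    The frequency-table scoring step described above can be caricatured with the simplest possible model: one frequency table per field and a full independence assumption (the patented conditional-independence model is more general). The example records below are made up; records whose field values were rarely seen in training receive low log-probability scores.

```python
from collections import Counter
import math

def fit_tables(records):
    # one frequency table per field, built from 'normal' observations
    n_fields = len(records[0])
    return [Counter(r[i] for r in records) for i in range(n_fields)], len(records)

def score(record, tables, n, eps=0.5):
    """Log-probability of the record under per-field independence;
    unseen values get a small smoothed count eps."""
    return sum(math.log((t.get(v, 0) + eps) / (n + eps))
               for v, t in zip(record, tables))

obs = [("10.0.0.1", "dns", 53)] * 10 + [("10.0.0.2", "web", 443)] * 10
tables, n = fit_tables(obs)
print(score(("10.0.0.1", "dns", 53), tables, n) >
      score(("10.0.0.1", "web", 9999), tables, n))
```

Keeping only counts (rather than raw observations) is what gives this style of model its storage and query efficiency on streaming data.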

  20. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has a long history: it evolved in the late 1940s with Wald, was followed by Middleton's classic exposition in the 1960s, and was coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
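    Wald's test, the starting point of the lineage sketched above, is short enough to state in code. This is the textbook SPRT for a known mean shift in Gaussian noise, not the chapter's model-based formulation: the log-likelihood ratio is accumulated until it leaves the interval defined by the target error rates.

```python
import math

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test for H0: N(mu0, sigma^2)
    versus H1: N(mu1, sigma^2).  Returns (decision, samples used)."""
    lower = math.log(beta / (1.0 - alpha))     # accept-H0 boundary
    upper = math.log((1.0 - beta) / alpha)     # accept-H1 boundary
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += (mu1 - mu0) * (x - 0.5 * (mu0 + mu1)) / sigma ** 2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

print(sprt([1.0] * 20, mu0=0.0, mu1=1.0, sigma=1.0))
print(sprt([0.0] * 20, mu0=0.0, mu1=1.0, sigma=1.0))
```

A model-based sequential detector replaces the raw samples with innovations from a state-space (e.g. Kalman) filter and accumulates the same kind of ratio.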

  1. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Candy, J V

    2008-12-08

    Sequential detection theory has a long history: it evolved in the late 1940s with Wald, was followed by Middleton's classic exposition in the 1960s, and was coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  2. Algorithm development for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Rosario, Dalton S.

    2008-10-01

    This dissertation proposes and evaluates a novel anomaly detection algorithm suite for ground-to-ground, or air-to-ground, applications requiring automatic target detection using hyperspectral (HS) data. Targets are manmade objects in natural background clutter under unknown illumination and atmospheric conditions. The use of statistical models herein is purely for motivation of particular formulas for calculating anomaly output surfaces. In particular, formulas from semiparametrics are utilized to obtain novel forms for output surfaces, and alternative scoring algorithms are proposed to calculate output surfaces that are comparable to those of semiparametrics. Evaluation uses both simulated data and real HS data from a joint data collection effort between the Army Research Laboratory and the Army Armament Research Development & Engineering Center. A data transformation method is presented for use by the two-sample data structure univariate semiparametric and nonparametric scoring algorithms, such that the two-sample data are mapped from their original multivariate space to a univariate domain, where the statistical power of the univariate scoring algorithms is shown to be improved relative to existing multivariate scoring algorithms testing the same two-sample data. An exhaustive simulation experimental study is conducted to assess the performance of different HS anomaly detection techniques, where the null and alternative hypotheses are completely specified, including all parameters, using multivariate normal and mixtures of multivariate normal distributions. Finally, for ground-to-ground anomaly detection applications, where the unknown scales of targets add to the problem complexity, a novel global anomaly detection algorithm suite is introduced, featuring autonomous partial random sampling (PRS) of the data cube. The PRS method is proposed to automatically sample the unknown background clutter in the test HS imagery, and by repeating multiple times this

  3. Sequential detection of learning in cognitive diagnosis.

    PubMed

    Ye, Sangbeak; Fellouris, Georgios; Culpepper, Steven; Douglas, Jeff

    2016-05-01

    In order to look more closely at the many particular skills examinees utilize to answer items, cognitive diagnosis models have received much attention, and perhaps are preferable to item response models that ordinarily involve just one or a few broadly defined skills, when the objective is to hasten learning. If these fine-grained skills can be identified, a sharpened focus on learning and remediation can be achieved. The focus here is on how to detect when learning has taken place for a particular attribute and efficiently guide a student through a sequence of items to ultimately attain mastery of all attributes while administering as few items as possible. This can be seen as a problem in sequential change-point detection for which there is a long history and a well-developed literature. Though some ad hoc rules for determining learning may be used, such as stopping after M consecutive items have been successfully answered, more efficient methods that are optimal under various conditions are available. The CUSUM, Shiryaev-Roberts and Shiryaev procedures can dramatically reduce the time required to detect learning while maintaining rigorous Type I error control, and they are studied in this context through simulation. Future directions for modelling and detection of learning are discussed.
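    Of the procedures named above, the CUSUM is the simplest to sketch for this setting: accumulate the log-likelihood ratio of each correct/incorrect response under post-learning versus pre-learning success probabilities, and declare learning when the statistic crosses a threshold. The probabilities and threshold below are illustrative, not values from the paper.

```python
import math

def bernoulli_cusum(responses, p0=0.4, p1=0.8, h=3.0):
    """CUSUM for a shift in success probability from p0 (pre-learning) to
    p1 (post-learning); returns the first item index at which the statistic
    crosses h, or -1 if learning is never declared."""
    s = 0.0
    for i, y in enumerate(responses):
        llr = (y * math.log(p1 / p0)
               + (1 - y) * math.log((1.0 - p1) / (1.0 - p0)))
        s = max(0.0, s + llr)         # reset-at-zero CUSUM recursion
        if s >= h:
            return i
    return -1

# mostly wrong answers, then a run of correct ones once the skill is learned
print(bernoulli_cusum([0, 1, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1]))
```

Raising h trades slower detection for tighter Type I error control, the balance the paper studies for CUSUM, Shiryaev-Roberts and Shiryaev alike.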

  4. Detecting syntactic and semantic anomalies in schizophrenia.

    PubMed

    Moro, Andrea; Bambini, Valentina; Bosia, Marta; Anselmetti, Simona; Riccaboni, Roberta; Cappa, Stefano F; Smeraldi, Enrico; Cavallaro, Roberto

    2015-12-01

    One of the major challenges in the study of language in schizophrenia is to identify specific levels of the linguistic structure that might be selectively impaired. While historically a main semantic deficit has been widely claimed, results are mixed, with evidence also of syntactic impairment. This might be due to heterogeneity in materials and paradigms across studies, which often do not allow one to tap into single linguistic components. Moreover, the interaction between linguistic and neurocognitive deficits is still unclear. In this study, we concentrated on syntactic and semantic knowledge. We employed an anomaly detection task including short and long sentences with either syntactic errors violating the principles of Universal Grammar, or a novel form of semantic errors, resulting from a contradiction in the computation of the whole sentence meaning. Fifty-eight patients with a diagnosis of schizophrenia were compared to 30 healthy subjects. Results showed that, in patients, only the ability to identify syntactic anomalies, in both short and long sentences, was impaired. This result cannot be explained by working memory abilities or psychopathological features. These findings suggest the presence of an impairment of syntactic knowledge in schizophrenia, at least partially independent of the cognitive and psychopathological profile. On the contrary, we cannot conclude that there is a semantic impairment, at least in terms of compositional semantics abilities.

  5. Clustering and Recurring Anomaly Identification: Recurring Anomaly Detection System (ReADS)

    NASA Technical Reports Server (NTRS)

    McIntosh, Dawn

    2006-01-01

    This viewgraph presentation reviews the Recurring Anomaly Detection System (ReADS), a tool to analyze text reports such as aviation reports and maintenance records: (1) text clustering algorithms group large quantities of reports and documents, reducing human error and fatigue; (2) it identifies interconnected reports, automating the discovery of possible recurring anomalies; and (3) it provides a visualization of the clusters and recurring anomalies. We have illustrated our techniques on data from Shuttle and ISS discrepancy reports, as well as ASRS data. ReADS has been integrated with a secure online search

  6. Statistical Anomaly Detection for Monitoring of Human Dynamics

    NASA Astrophysics Data System (ADS)

    Kamiya, K.; Fuse, T.

    2015-05-01

    Understanding of human dynamics has drawn attention in various areas. Owing to the wide spread of positioning technologies that use GPS or public Wi-Fi, location information can be obtained with high spatial-temporal resolution at low cost. By collecting sets of individual location information in real time, monitoring of human dynamics is now considered possible and is expected to lead to dynamic traffic control in the future. Although such monitoring focuses on detecting anomalous states of human dynamics, anomaly detection methods have been developed ad hoc and are not fully systematized. This research aims to define an anomaly detection problem for human dynamics monitoring with gridded population data and to develop an anomaly detection method based on that definition. Based on a comprehensive review, we discuss the characteristics of anomaly detection for human dynamics monitoring and categorize our problem as a semi-supervised anomaly detection problem that detects contextual anomalies behind time-series data. We developed an anomaly detection method based on a sticky HDP-HMM, which is able to estimate the number of hidden states according to the input data. Results of an experiment with synthetic data showed that our proposed method has good fundamental performance with respect to the detection rate. In an experiment with real gridded population data, an anomaly was detected when and where an actual social event had occurred.
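The sticky HDP-HMM in this abstract is Bayesian nonparametric, but the underlying detection idea, flagging windows whose likelihood under the learned state dynamics is low, can be illustrated with an ordinary fixed-size discrete HMM. The transition matrix `A`, emission matrix `B`, and threshold below are illustrative values, not parameters from the paper:

```python
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    z = sum(alpha)
    loglik = math.log(z)
    alpha = [a / z for a in alpha]
    for o in obs[1:]:
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][o]
                 for i in range(n)]
        z = sum(alpha)
        loglik += math.log(z)
        alpha = [a / z for a in alpha]
    return loglik

def window_is_anomalous(obs, pi, A, B, threshold):
    """Flag a window whose average per-step log-likelihood falls below
    threshold (an illustrative stand-in for the paper's detector)."""
    return forward_loglik(obs, pi, A, B) / len(obs) < threshold
```

A window that alternates rapidly between symbols scores much lower under "sticky" (persistent-state) dynamics than a steady window, which is the contextual-anomaly signal this family of methods exploits.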

  7. Anomaly detection enhanced classification in computer intrusion detection

    SciTech Connect

    Fugate, M. L.; Gattiker, J. R.

    2002-01-01

    This report describes work with the goal of enhancing capabilities in computer intrusion detection. The work builds upon a study of classification performance that compared various methods of classifying information derived from computer network packets into attack versus normal categories, based on a labeled training dataset. This previous work validates our classification methods and clears the ground for studying whether and how anomaly detection can be used to enhance this performance. The DARPA project that initiated the dataset used here concluded that anomaly detection should be examined to boost the performance of machine learning in the computer intrusion detection task. This report investigates the dataset for aspects that will be valuable for anomaly detection application, and supports these results with models constructed from the data. In this report, the term anomaly detection means learning a model from unlabeled data and using it to make some inference about future data; our data is a feature vector derived from network packets: an 'example' or 'sample'. Classification, on the other hand, means building a model from labeled data and using that model to classify unlabeled (future) examples. There is some precedent in the literature for combining these methods. One approach is to stage the two techniques, using anomaly detection to segment data into two sets for classification; an interpretation of this is as a method to combat nonstationarity in the data. In our previous work, we demonstrated that the data has substantial temporal nonstationarity. With classification methods that can be thought of as learning a decision surface between two statistical distributions, performance is expected to degrade significantly when classifying examples from regions not well represented in the training set. Anomaly detection can be seen as a problem of learning the density (landscape) or the support (boundary) of a statistical distribution so that

  8. APHID: Anomaly Processor in Hardware for Intrusion Detection

    DTIC Science & Technology

    2007-03-01

    APHID: Anomaly Processor in Hardware for Intrusion Detection. Thesis presented to the Faculty by Samuel Hart, Captain, USAF, AFIT/GCE/ENG/07-04, Department of the Air Force, March 2007. Approved for public release; distribution unlimited.

  9. Automated Network Anomaly Detection with Learning, Control and Mitigation

    ERIC Educational Resources Information Center

    Ippoliti, Dennis

    2014-01-01

    Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly-based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…

  10. Recent Advances in Ionospheric Anomalies detection

    NASA Astrophysics Data System (ADS)

    Titov, Anton; Vyacheslav, Khattatov

    2016-07-01

    The variability of ionospheric parameters and ionospheric anomalies are the subject of intensive research. Ionospheric disturbances caused by solar activity, the passage of the terminator, artificial heating of the high-latitude ionosphere, and seismic events are widely known and studied in the literature. Each of the above types of anomalies is a subject of study and analysis, and analyzing them will provide an opportunity to improve our understanding of the mechanisms of ionospheric disturbances. To address this problem, we propose to develop a method of modeling the ionosphere based on the assimilation of large amounts of observational data.

  11. Network Anomaly Detection System with Optimized DS Evidence Theory

    PubMed Central

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has attracted increasing attention with the fast development of computer networks. Some researchers have applied fusion methods and DS evidence theory to network anomaly detection, but with low performance, and they did not consider the complicated and varied nature of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs each sensor's regression ability to address complex networks. Across four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that the RBPA and ODS optimization methods can improve system performance significantly. PMID:25254258
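The weighting scheme and regression BPA are this paper's contributions and are not reproduced here, but the Dempster-Shafer fusion step they build on is standard. A minimal sketch of Dempster's rule of combination, with hypothetical "attack"/"normal" hypotheses represented as frozensets:

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments (BPAs) with Dempster's rule.

    m1 and m2 map hypothesis sets (frozensets) to masses summing to 1.
    Intersecting hypotheses reinforce each other; disjoint ones contribute
    to the conflict mass, which is normalized away.
    """
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict; evidence cannot be combined")
    norm = 1.0 - conflict
    return {h: m / norm for h, m in combined.items()}
```

For example, two sensors that each assign partial mass to "attack" and the rest to the full frame of discernment yield a fused mass on "attack" larger than either sensor's alone, which is the reinforcement effect the system exploits.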

  12. Adaptive Anomaly Detection using Isolation Forest

    DTIC Science & Technology

    2009-12-20

    detectors such as ORCA [8] and one-class SVM [31], and density-based anomaly detector LOF [9]. The rest of the paper is organised as follows. Section 2...that a value can be computed using this measure. We use k = 5 in our experiments. The second experiment compares HS*-Trees with ORCA [8], one-class...SVM (first mentioned in [31]) and LOF [9]. ORCA employs distance-based definition (ii), stated in sec- tion 3.1, to rank anomalies; LOF is the state-of
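For context on the isolation-based family this report builds on: an Isolation Forest isolates points with random axis-parallel splits and scores them by average path length, since anomalies are easier to isolate and end up with shorter paths. The following is an illustrative from-scratch sketch of that baseline idea, not the streaming HS*-Trees variant the report proposes; all parameters are illustrative:

```python
import math
import random

def c(n):
    """Average path length of an unsuccessful BST search; normalizes scores."""
    if n <= 1:
        return 0.0
    return 2.0 * (math.log(n - 1) + 0.5772156649) - 2.0 * (n - 1) / n

def build_tree(points, depth, max_depth, rng):
    """Recursively partition points with random axis-parallel splits."""
    if depth >= max_depth or len(points) <= 1:
        return ("leaf", len(points))
    dim = rng.randrange(len(points[0]))
    lo = min(p[dim] for p in points)
    hi = max(p[dim] for p in points)
    if lo == hi:
        return ("leaf", len(points))
    split = rng.uniform(lo, hi)
    left = [p for p in points if p[dim] < split]
    right = [p for p in points if p[dim] >= split]
    return ("node", dim, split,
            build_tree(left, depth + 1, max_depth, rng),
            build_tree(right, depth + 1, max_depth, rng))

def path_length(tree, p, depth=0):
    """Depth at which p lands, adjusted by c() for unsplit leaves."""
    if tree[0] == "leaf":
        return depth + c(tree[1])
    _, dim, split, left, right = tree
    return path_length(left if p[dim] < split else right, p, depth + 1)

def iforest_score(data, point, n_trees=100, seed=0):
    """Anomaly score in (0, 1); values near 1 indicate likely anomalies."""
    rng = random.Random(seed)
    max_depth = math.ceil(math.log2(max(2, len(data))))
    avg = sum(path_length(build_tree(data, 0, max_depth, rng), point)
              for _ in range(n_trees)) / n_trees
    return 2.0 ** (-avg / c(len(data)))
```

A point far from the bulk of the data is separated after only a few splits, so its averaged path length is short and its score is high relative to an inlier's.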

  13. Post-processing for improving hyperspectral anomaly detection accuracy

    NASA Astrophysics Data System (ADS)

    Wu, Jee-Cheng; Jiang, Chi-Ming; Huang, Chen-Liang

    2015-10-01

    Anomaly detection is an important topic in the exploitation of hyperspectral data. Based on the Reed-Xiaoli (RX) detector and a morphology operator, this research proposes a novel technique for improving the accuracy of hyperspectral anomaly detection. First, the RX-based detector is used to process a given input scene. Then, a post-processing scheme using a morphology operator is employed to detect those pixels around high-scoring anomaly pixels. Tests were conducted using two real hyperspectral images with ground truth information, and the results, based on receiver operating characteristic curves, illustrated that the proposed method reduced the false alarm rates of the RX-based detector.
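The global RX detector scores each pixel by its Mahalanobis distance from the scene's background mean and covariance, and the morphology post-processing then grows regions around high-scoring pixels. A minimal two-band sketch, where the band count, the 3x3 dilation window, and a nonsingular covariance are all simplifying assumptions:

```python
def rx_scores(pixels):
    """Global RX: Mahalanobis distance of each 2-band pixel from the
    scene mean, using the scene's 2x2 covariance (assumed nonsingular)."""
    n = len(pixels)
    m0 = sum(p[0] for p in pixels) / n
    m1 = sum(p[1] for p in pixels) / n
    c00 = sum((p[0] - m0) ** 2 for p in pixels) / n
    c11 = sum((p[1] - m1) ** 2 for p in pixels) / n
    c01 = sum((p[0] - m0) * (p[1] - m1) for p in pixels) / n
    det = c00 * c11 - c01 * c01
    i00, i11, i01 = c11 / det, c00 / det, -c01 / det  # 2x2 inverse
    scores = []
    for p in pixels:
        d0, d1 = p[0] - m0, p[1] - m1
        scores.append(d0 * d0 * i00 + 2 * d0 * d1 * i01 + d1 * d1 * i11)
    return scores

def dilate(mask, rows, cols):
    """3x3 morphological dilation of a row-major boolean mask, standing in
    for the paper's post-processing around high-scoring pixels."""
    out = [False] * len(mask)
    for r in range(rows):
        for col in range(cols):
            if mask[r * cols + col]:
                for dr in (-1, 0, 1):
                    for dc in (-1, 0, 1):
                        rr, cc = r + dr, col + dc
                        if 0 <= rr < rows and 0 <= cc < cols:
                            out[rr * cols + cc] = True
    return out
```

Thresholding the RX scores gives the anomaly mask; dilating that mask picks up the neighboring pixels that the RX test alone would miss.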

  14. An enhanced stream mining approach for network anomaly detection

    NASA Astrophysics Data System (ADS)

    Bellaachia, Abdelghani; Bhatt, Rajat

    2005-03-01

    Network anomaly detection is one of the hot topics in the market today. Currently, researchers are trying to find ways in which machines could automatically learn both normal and anomalous behavior and thus detect anomalies if and when they occur. The most important applications that could spring out of these systems are intrusion detection and spam mail detection. In this paper, the primary focus is on the problem and solution of "real time" network intrusion detection, although the underlying theory discussed may be used for other applications of anomaly detection (like spam detection or spy-ware detection) too. Since a machine needs a learning process of its own, data mining has been chosen as the preferred technique. The object of this paper is to present a real-time clustering system, which we call Enhanced Stream Mining (ESM), that analyzes packet information (headers and data) to determine intrusions.

  15. Sequential Analysis: Hypothesis Testing and Changepoint Detection

    DTIC Science & Technology

    2014-07-11

    markets, detection of signals with unknown arrival time in seismology, navigation, radar and sonar signal processing, speech segmentation, and the...discussed in [10]. Other examples of events in seismology are mentioned in the previous subsection. Changepoint detection methods are also efficient

  16. Evaluation of Anomaly Detection Method Based on Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Fontugne, Romain; Himura, Yosuke; Fukuda, Kensuke

    The number of threats on the Internet is rapidly increasing, and anomaly detection has become of increasing importance. High-speed backbone traffic is particularly affected, but its analysis is a complicated task due to the amount of data, the lack of payload data, asymmetric routing, and the use of sampling techniques. Most anomaly detection schemes focus on the statistical properties of network traffic and highlight anomalous traffic through its singularities. In this paper, we concentrate on unusual traffic distributions, which are easily identifiable in temporal-spatial space (e.g., time/address or port). We present an anomaly detection method that uses a pattern recognition technique to identify anomalies in pictures representing traffic. The main advantage of this method is its ability to detect attacks involving mice flows. We evaluate the parameter set and the effectiveness of this approach by analyzing six years of Internet traffic collected from a trans-Pacific link. We show several examples of detected anomalies and compare our results with those of two other methods. The comparison indicates that the anomalies detected only by the pattern-recognition-based method are mainly malicious traffic with a few packets.

  17. Lidar detection algorithm for time and range anomalies

    NASA Astrophysics Data System (ADS)

    Ben-David, Avishai; Davidson, Charles E.; Vanderbeek, Richard G.

    2007-10-01

    A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly where the question "is a target (aerosol cloud) present at range R within time t1 to t2" is addressed, and for range anomaly where the question "is a target present at time t within ranges R1 and R2" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one over range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO2 lidar measurements of bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed.

  18. Mixtures of Probabilistic Principal Component Analyzers for Anomaly Detection

    SciTech Connect

    Fang, Yi; Ganguly, Auroop R

    2007-01-01

    Anomaly detection tools have been increasingly used in recent years to generate predictive insights on rare events. The typical challenges encountered in such applications include a large number of data dimensions and the absence of labeled data. An anomaly detection strategy for these scenarios is dimensionality reduction followed by clustering in the reduced space, with the degree of anomaly of an event or observation quantified by statistical distance from the clusters. However, most research efforts so far have focused on single abrupt anomalies, while the correlation between observations is completely ignored. In this paper, we address the problem of detection of both abrupt and sustained anomalies in high dimensions. The task becomes more challenging than only detecting abrupt outliers because of the gradual and indiscriminate changes in sustained anomalies. We utilize a mixture model of probabilistic principal component analyzers to quantify each observation by probabilistic measures. A statistical process control method is then used to monitor both abrupt and gradual changes. In addition, the mixture model can be regarded as a trade-off strategy between linear and nonlinear dimensionality reduction in terms of computational efficiency. This compromise is particularly important in real-time deployment. The proposed method is evaluated on simulated and benchmark data, as well as on data from wide-area sensors at a truck weigh station test-bed.

  19. A New Methodology for Early Anomaly Detection of BWR Instabilities

    SciTech Connect

    Ivanov, K. N.

    2005-11-27

    The objective of the performed research is to develop an early anomaly detection methodology so as to enhance safety, availability, and operational flexibility of Boiling Water Reactor (BWR) nuclear power plants. The technical approach relies on suppression of potential power oscillations in BWRs by detecting small anomalies at an early stage and taking appropriate prognostic actions based on an anticipated operation schedule. The research utilizes a model of coupled (two-phase) thermal-hydraulic and neutron flux dynamics, which is used as a generator of time series data for anomaly detection at an early stage. The model captures critical nonlinear features of coupled thermal-hydraulic and nuclear reactor dynamics and (slow time-scale) evolution of the anomalies as non-stationary parameters. The time series data derived from this nonlinear non-stationary model serves as the source of information for generating the symbolic dynamics for characterization of model parameter changes that quantitatively represent small anomalies. The major focus of the presented research activity was on developing and qualifying algorithms of pattern recognition for power instability based on anomaly detection from time series data, which later can be used to formulate real-time decision and control algorithms for suppression of power oscillations for a variety of anticipated operating conditions. The research being performed in the framework of this project is essential to make significant improvement in the capability of thermal instability analyses for enhancing safety, availability, and operational flexibility of currently operating and next generation BWRs.

  20. Statistical Studies on Sequential Probability Ratio Test for Radiation Detection

    SciTech Connect

    Warnick Kernan, Ding Yuan, et al.

    2007-07-01

    A Sequential Probability Ratio Test (SPRT) algorithm helps to increase the reliability and speed of radiation detection. The algorithm is further improved to reduce spatial gaps and false alarms. SPRT, using a Last-in-First-Elected-Last-Out (LIFELO) technique, reduces the error between the radiation measured and the resulting alarm. Statistical analysis quantifies the reduction of spatial error and false alarms.
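Wald's SPRT accumulates a log-likelihood ratio over successive measurements and stops as soon as either error-controlled boundary is crossed, which is what makes it fast for radiation portals. A minimal sketch for a Gaussian mean-shift test; the LIFELO refinement from the abstract is not reproduced, and the means, sigma, and error rates are illustrative:

```python
import math

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Wald's SPRT for a Gaussian mean shift.

    H0: mean = mu0 (background only); H1: mean = mu1 (source present).
    alpha and beta are the target false-alarm and miss probabilities.
    Returns (decision, number_of_samples_used).
    """
    upper = math.log((1 - beta) / alpha)   # crossing -> accept H1
    lower = math.log(beta / (1 - alpha))   # crossing -> accept H0
    llr, n = 0.0, 0
    for n, x in enumerate(samples, start=1):
        # log-likelihood-ratio increment of one Gaussian observation
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", n
```

Because the boundaries depend only on alpha and beta, the test reaches a decision with far fewer samples on average than a fixed-sample-size test of the same error rates.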

  1. Evaluation schemes for video and image anomaly detection algorithms

    NASA Astrophysics Data System (ADS)

    Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael

    2016-05-01

    Video anomaly detection is a critical research area in computer vision. It is a natural first step before applying object recognition algorithms. Many algorithms that detect anomalies (outliers) in videos and images have been introduced in recent years. However, these algorithms behave and perform differently based on differences in the domains and tasks to which they are applied. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability in a particular domain/task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. Many evaluation metrics have been used in the literature, such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these different metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is critical, since the choice can introduce positive or negative bias in the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies. We analyze the biases introduced by these choices by measuring the performance of an existing anomaly detection algorithm.

  2. Anomalies.

    ERIC Educational Resources Information Center

    Online-Offline, 1999

    1999-01-01

    This theme issue on anomalies includes Web sites, CD-ROMs and software, videos, books, and additional resources for elementary and junior high school students. Pertinent activities are suggested, and sidebars discuss UFOs, animal anomalies, and anomalies from nature; and resources covering unexplained phenomena like crop circles, Easter Island,…

  3. Anomaly Detection in Power Quality at Data Centers

    NASA Technical Reports Server (NTRS)

    Grichine, Art; Solano, Wanda M.

    2015-01-01

    The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
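Since the abstract models voltage, current, and total harmonic distortion as Gaussian, alarm set-points can be placed a fixed number of standard deviations from the mean; k = 3 keeps the false-positive rate near 0.3% under that model. A minimal sketch, where the k value and the history window are illustrative choices, not NCCIPS settings:

```python
import statistics

def gaussian_setpoints(history, k=3.0):
    """Alarm set-points at mean +/- k standard deviations; with Gaussian
    readings, k = 3 flags roughly 0.3% of normal samples."""
    mu = statistics.fmean(history)
    sigma = statistics.stdev(history)
    return mu - k * sigma, mu + k * sigma

def out_of_limits(reading, lo, hi):
    """True if a reading falls outside the computed set-points."""
    return not (lo <= reading <= hi)
```

In a SCADA deployment the set-points would be recomputed periodically from recent normal history so that slow seasonal drift does not trigger alarms.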

  4. Visual analytics of anomaly detection in large data streams

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.; Sharma, Ratnesh K.; Mehta, Abhay

    2009-01-01

    Most data streams are multi-dimensional, high-speed, and contain massive volumes of continuous information. They are seen in daily applications such as telephone calls, retail sales, data center performance, and oil production operations. Many analysts want insight into the behavior of this data. They want to catch exceptions in flight to reveal the causes of the anomalies and to take immediate action. To guide the user in finding the anomalies in a large data stream quickly, we derive a new automated neighborhood threshold marking technique, called AnomalyMarker. This technique is built on cell-based data streams and user-defined thresholds. We extend the scope of the data points around the threshold to include the surrounding areas. The idea is to define a focus area (marked area) which enables users to (1) visually group the interesting data points related to the anomalies (i.e., problems that occur persistently or occasionally) to observe their behavior; and (2) discover the factors related to an anomaly by visualizing the correlations between the problem attribute and the attributes of nearby data items from the entire multi-dimensional data stream. Mining results are quickly presented in graphical representations (i.e., tooltips) for the user to zoom into the problem regions. Different algorithms are introduced which try to optimize the size and extent of the anomaly markers. We have successfully applied this technique to detect data stream anomalies in large real-world enterprise server performance and data center energy management.

  5. Anomaly Detection In Additively Manufactured Parts Using Laser Doppler Vibrometery

    SciTech Connect

    Hernandez, Carlos A.

    2015-09-29

    Additively manufactured parts are susceptible to non-uniform structure caused by the unique manufacturing process. This can lead to structural weakness or catastrophic failure. Using laser Doppler vibrometry and frequency response analysis, non-contact detection of anomalies in additively manufactured parts may be possible. Preliminary tests show promise for small scale detection, but more future work is necessary.

  6. Anomaly detection of blast furnace condition using tuyere cameras

    NASA Astrophysics Data System (ADS)

    Yamahira, Naoshi; Hirata, Takehide; Tsuda, Kazuro; Morikawa, Yasuyuki; Takata, Yousuke

    2016-09-01

    We present a method of anomaly detection using multivariate statistical process control (MSPC) to detect the abnormal behavior of a blast furnace. Tuyere cameras attached circumferentially at the lower side of the blast furnace are used to monitor the inside of the furnace, and the method extracts abnormal behavior of the image intensities. We confirmed that our method detects anomalies earlier than operators notice them. Moreover, misalignment of the cameras does not affect detection performance, which is an important property in actual use.

  7. Sequential Detection of Fission Processes for Harbor Defense

    SciTech Connect

    Candy, J V; Walston, S E; Chambers, D H

    2015-02-12

    With the large increase in terrorist activities throughout the world, the timely and accurate detection of special nuclear material (SNM) has become an extremely high priority for many countries concerned with national security. The detection of radionuclide contraband based on its γ-ray emissions has been attacked vigorously, with some interesting and feasible results; however, the fission process of SNM has not received as much attention due to its inherent complexity and required predictive nature. In this paper, on-line sequential Bayesian detection and parameter estimation techniques are developed to rapidly and reliably detect unknown fissioning sources with high statistical confidence.

  8. Dependence-Based Anomaly Detection Methodologies

    DTIC Science & Technology

    2012-08-16

    tricks the user to enter their Netflix login. Detecting it is out of our scope and requires site authentication (i.e., certification verification) and user education. The preliminary

  9. The role of visualization and interaction in maritime anomaly detection

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran

    2011-01-01

    The surveillance of large sea, air or land areas normally involves the analysis of large volumes of heterogeneous data from multiple sources. Timely detection and identification of anomalous behavior or any threat activity is an important objective for enabling homeland security. While it is worth acknowledging that many existing mining applications support identification of anomalous behavior, autonomous anomaly detection systems for area surveillance are rarely used in the real world. We argue that such capabilities and applications present two critical challenges: (1) they need to provide adequate user support and (2) they need to involve the user in the underlying detection process. In order to encourage the use of anomaly detection capabilities in surveillance systems, this paper analyzes the challenges that existing anomaly detection and behavioral analysis approaches present regarding their use and maintenance by users. We analyze input parameters, detection process, model representation and outcomes. We discuss the role of visualization and interaction in the anomaly detection process. Practical examples from our current research within the maritime domain illustrate key aspects presented.

  10. Identifying Threats Using Graph-based Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Eberle, William; Holder, Lawrence; Cook, Diane

    Much of the data collected during the monitoring of cyber and other infrastructures is structural in nature, consisting of various types of entities and relationships between them. The detection of threatening anomalies in such data is crucial to protecting these infrastructures. We present an approach to detecting anomalies in a graph-based representation of such data that explicitly represents these entities and relationships. The approach consists of first finding normative patterns in the data using graph-based data mining and then searching for small, unexpected deviations from these normative patterns, assuming illicit behavior tries to mimic legitimate, normative behavior. The approach is evaluated using several synthetic and real-world datasets. Results show that the approach has high true-positive rates, low false-positive rates, and is capable of detecting complex structural anomalies in real-world domains including email communications, cell phone calls and network traffic.

  11. [Anomaly Detection of Multivariate Time Series Based on Riemannian Manifolds].

    PubMed

    Xu, Yonghong; Hou, Xiaoying; Li, Shuting; Cui, Jie

    2015-06-01

    Multivariate time series problems exist widely in production and in social life. Anomaly detection has provided a great deal of valuable information in finance, hydrology and meteorology, and in research areas such as earthquake monitoring, video surveillance and medicine. In order to find exceptions in a time sequence quickly and efficiently, and to present them in an intuitive way, in this study we combined Riemannian manifolds with statistical process control charts, describing each sliding window of the time sequence by its covariance matrix, to achieve anomaly detection and visualization for multivariate time series. We used a simulated MA data stream and abnormal electrocardiogram data from MIT-BIH as experimental objects and verified the anomaly detection method. The results showed that the method is reasonable and effective.
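The method above summarizes each sliding window by its covariance matrix and charts a distance between matrices. As a simplified stand-in, the sketch below uses the Euclidean (Frobenius) distance between 2x2 window covariances rather than the Riemannian geodesic distance of the paper; the window width and threshold are illustrative:

```python
import math

def window_cov(window):
    """2x2 covariance of a window of 2-D samples, as (cxx, cxy, cyy)."""
    n = len(window)
    mx = sum(p[0] for p in window) / n
    my = sum(p[1] for p in window) / n
    cxx = sum((p[0] - mx) ** 2 for p in window) / n
    cyy = sum((p[1] - my) ** 2 for p in window) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in window) / n
    return (cxx, cxy, cyy)

def cov_distance(c1, c2):
    """Frobenius distance between two symmetric 2x2 matrices (a Euclidean
    stand-in for the Riemannian geodesic distance used in the paper)."""
    return math.sqrt((c1[0] - c2[0]) ** 2
                     + 2 * (c1[1] - c2[1]) ** 2
                     + (c1[2] - c2[2]) ** 2)

def monitor(series, width, threshold):
    """Slide a window over a 2-D series and flag windows whose covariance
    is far from the first (reference) window, control-chart style."""
    ref = window_cov(series[:width])
    return [cov_distance(window_cov(series[s:s + width]), ref) > threshold
            for s in range(len(series) - width + 1)]
```

A window whose variance or cross-correlation structure departs from the reference produces a large distance and is flagged, which is the covariance-as-feature idea the abstract describes.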

  12. Anomaly detection based on sensor data in petroleum industry applications.

    PubMed

    Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra

    2015-01-27

    Anomaly detection is the problem of finding patterns in data that do not conform to an a priori expected behavior. This is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset, where these anomalous samples are indicated as outliers. Anomaly detection has recently attracted the attention of the research community, because of its relevance in real-world applications, like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation. However, in either case, it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one of the application contexts where these problems are present. The correct detection of such types of unusual information empowers the decision maker with the capacity to act on the system in order to correctly avoid, correct or react to the situations associated with them. In that application context, heavy extraction machines for pumping and generation operations, like turbomachines, are intensively monitored by hundreds of sensors each that send measurements with a high frequency for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel fast and high quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal is meant for dealing with the aforementioned task and to cope with the lack of labeled training data. As a result, we perform a series of empirical studies comparing our approach to other methods applied to benchmark problems and a real-life application related to oil platform turbomachinery anomaly detection.

  13. Anomaly Detection Based on Sensor Data in Petroleum Industry Applications

    PubMed Central

    Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra

    2015-01-01

    Anomaly detection is the problem of finding patterns in data that do not conform to an a priori expected behavior. This is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset, where these anomalous samples are indicated as outliers. Anomaly detection has recently attracted the attention of the research community, because of its relevance in real-world applications, like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation. However, in either case, it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one of the application contexts where these problems are present. The correct detection of such types of unusual information empowers the decision maker with the capacity to act on the system in order to correctly avoid, correct or react to the situations associated with them. In that application context, heavy extraction machines for pumping and generation operations, like turbomachines, are intensively monitored by hundreds of sensors each that send measurements with a high frequency for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel fast and high quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal is meant for dealing with the aforementioned task and to cope with the lack of labeled training data. As a result, we perform a series of empirical studies comparing our approach to other methods applied to benchmark problems and a real-life application related to oil platform turbomachinery anomaly detection. PMID:25633599

  14. Anomaly detection using classified eigenblocks in GPR image

    NASA Astrophysics Data System (ADS)

    Kim, Min Ju; Kim, Seong Dae; Lee, Seung-eui

    2016-05-01

Automatic landmine detection using ground penetrating radar has been widely researched, and for an automatic mine detection system, speed is an important factor. Many mine detection techniques have been developed on a statistical basis. Among them, a detection technique employing Principal Component Analysis (PCA) has been used for clutter reduction and anomaly detection. However, the PCA technique can retard the entire process because of its large basis dimension and the large number of inner product operations required. To overcome this problem, we propose a fast anomaly detection system using the 2D DCT and PCA. Our experiments use a set of data obtained from a test site where anti-tank and anti-personnel mines are buried. We evaluate the proposed system in terms of the ROC curve. The results show that the proposed system performs much better than conventional PCA systems in terms of both speed and false alarm rate.
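The general idea of combining a DCT with PCA can be sketched as follows: low-frequency 2D-DCT coefficients give a small feature vector per patch (reducing the basis dimension and inner-product cost), and the reconstruction error in a PCA basis fit to background patches serves as the anomaly score. This is a generic sketch on synthetic patches, not the authors' system; the patch size, number of coefficients, and number of components are all illustrative assumptions:

```python
import numpy as np
from scipy.fft import dctn

rng = np.random.default_rng(1)

def dct_features(patch, k=4):
    """Keep only the k x k low-frequency 2D-DCT coefficients of a patch."""
    return dctn(patch, norm="ortho")[:k, :k].ravel()

# Background clutter patches.
normal = [rng.normal(0.0, 1.0, (8, 8)) for _ in range(300)]
X = np.array([dct_features(p) for p in normal])

# PCA basis of the background features (mean-centred SVD).
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
basis = Vt[:5]                       # top 5 principal directions

def anomaly_score(patch):
    """Reconstruction error of the patch's DCT features in the background basis."""
    f = dct_features(patch) - mu
    return float(np.linalg.norm(f - basis.T @ (basis @ f)))

# A patch containing a strong localized response (e.g. a buried object).
target = rng.normal(0.0, 1.0, (8, 8))
target[2:6, 2:6] += 8.0

scores_bg = [anomaly_score(p) for p in normal]
print(anomaly_score(target), max(scores_bg))
```

Because each patch is reduced to 16 DCT coefficients before PCA, the per-patch cost is a handful of small inner products rather than products against a full image-sized basis.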

  15. Profile-based adaptive anomaly detection for network security.

    SciTech Connect

    Zhang, Pengchu C. (Sandia National Laboratories, Albuquerque, NM); Durgin, Nancy Ann

    2005-11-01

As information systems become increasingly complex and pervasive, they become inextricably intertwined with the critical infrastructure of national, public, and private organizations. The problem of recognizing and evaluating threats against these complex, heterogeneous networks of cyber and physical components is a difficult one, yet a solution is vital to ensuring security. In this paper we investigate profile-based anomaly detection techniques that can be used to address this problem. We focus primarily on network anomaly detection, but the approach could be extended to other problem domains. We investigate using several data analysis techniques to create profiles of network hosts and perform anomaly detection using those profiles. The "profiles" reduce multi-dimensional vectors representing "normal behavior" into fewer dimensions, thus allowing pattern and cluster discovery. New events are compared against the profiles, producing a quantitative measure of how "anomalous" the event is. Most network intrusion detection systems (IDSs) detect malicious behavior by searching for known patterns in the network traffic. This approach suffers from several weaknesses, including a lack of generalizability, an inability to detect stealthy or novel attacks, and a lack of flexibility regarding alarm thresholds. Our research focuses on enhancing current IDS capabilities by addressing some of these shortcomings. We identify and evaluate promising techniques for data mining and machine learning. The algorithms are "trained" by providing them with a series of data points from "normal" network traffic. A successful algorithm can be trained automatically and efficiently, will have a low error rate (low false alarm and miss rates), and will be able to identify anomalies in "pseudo real-time" (i.e., while the intrusion is still in progress, rather than after the fact).
We also build a prototype anomaly detection tool that demonstrates how the techniques might

  16. Attention focusing and anomaly detection in systems monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, Richard J.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. The focus of this paper is a new technique for attention focusing. The technique involves reasoning about the distance between two frequency distributions, and is used to detect both anomalous system parameters and 'broken' causal dependencies. These two forms of information together isolate the locus of anomalous behavior in the system being monitored.
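The core operation described above, reasoning about the distance between two frequency distributions, can be sketched with a standard distribution distance. The Jensen-Shannon distance used here is an illustrative stand-in (the paper does not specify this particular metric), and the sensor data are synthetic:

```python
import numpy as np

def js_distance(p, q, eps=1e-12):
    """Jensen-Shannon distance between two frequency distributions."""
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return float(np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m)))

rng = np.random.default_rng(2)
bins = np.linspace(-5, 15, 41)

# Nominal behaviour of a sensor, and two new observation windows.
nominal = np.histogram(rng.normal(0, 1, 5000), bins)[0].astype(float)
still_ok = np.histogram(rng.normal(0, 1, 1000), bins)[0].astype(float)
shifted = np.histogram(rng.normal(6, 1, 1000), bins)[0].astype(float)

d_ok = js_distance(nominal, still_ok)       # small: same underlying behaviour
d_shift = js_distance(nominal, shifted)     # large: anomalous parameter
print(d_ok, d_shift)
```

A sensor whose current-window histogram drifts far from its nominal histogram is flagged as an anomalous parameter; comparing distributions across causally linked sensors would, in the same spirit, reveal "broken" dependencies.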

  17. Anomaly Detection and Modeling of Trajectories

    DTIC Science & Technology

    2012-08-01

This thesis develops a technique for detecting anomalous trajectories in a dataset in an unsupervised fashion, using support vector machines (SVMs) and various spatial representations of trajectories. A trajectory is judged anomalous based on the rest of the dataset, and the approach is evaluated empirically to provide a rich analysis of trajectory datasets.

  18. Compressive Hyperspectral Imaging and Anomaly Detection

    DTIC Science & Technology

    2013-03-01

A simple yet effective method uses spatial information to increase the accuracy of target detection. The idea is to apply TV denoising [4] to the detection image: most background pixels take a near-zero value, and isolated false-alarm pixels are usually eliminated by the TV denoising algorithm. In the total variation denoising model [4], given an image I ∈ R², an L1 minimization problem is solved to denoise the image.
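The effect can be sketched with a simplified smoothed-TV variant (gradient descent on a quadratic-fidelity objective, not the exact L1 model of [4]; the image, weights, and step size are illustrative assumptions):

```python
import numpy as np

def tv(u, eps=1e-6):
    """(Smoothed) total variation of a 2-D image."""
    gx = np.diff(u, axis=1, append=u[:, -1:])
    gy = np.diff(u, axis=0, append=u[-1:, :])
    return float(np.sum(np.sqrt(gx**2 + gy**2 + eps)))

def tv_denoise(f, lam=0.2, tau=0.05, iters=200, eps=1e-6):
    """Gradient descent on 0.5*||u - f||^2 + lam * TV_eps(u)."""
    u = f.copy()
    for _ in range(iters):
        gx = np.diff(u, axis=1, append=u[:, -1:])
        gy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(gx**2 + gy**2 + eps)
        px, py = gx / mag, gy / mag
        # Approximate divergence of the normalized gradient field.
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u -= tau * ((u - f) - lam * div)
    return u

rng = np.random.default_rng(3)
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 1.0                       # a genuine extended target
noisy = clean + 0.2 * rng.normal(size=(32, 32))
noisy[4, 4] += 3.0                            # isolated false-alarm pixel
den = tv_denoise(noisy)
print(tv(den) < tv(noisy), den[4, 4] < noisy[4, 4])
```

The isolated spike is pulled down toward its neighbors while the extended block, whose boundary cost is small relative to its area, survives, which is exactly why TV denoising suppresses isolated false alarms.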

  19. The use of Compton scattering in detecting anomaly in soil-possible use in pyromaterial detection

    NASA Astrophysics Data System (ADS)

    Abedin, Ahmad Firdaus Zainal; Ibrahim, Noorddin; Zabidi, Noriza Ahmad; Demon, Siti Zulaikha Ngah

    2016-01-01

Compton scattering can provide a signature for landmine detection, based on the dependence of the scattered-photon energy change on density anomalies. In this study, 4.43 MeV gammas from an Am-Be source were used to induce Compton scattering. Two thallium-doped sodium iodide NaI(Tl) detectors, each of radius 1.9 cm, were placed 8 cm from the source to detect the gamma rays. Nine anomalies were used in the simulation, each a cylinder of radius 10 cm and height 8.9 cm buried 5 cm deep in a soil bed of radius 80 cm and height 53.5 cm. Monte Carlo simulations indicated that the photon scattering is directly proportional to the density of the anomalies. The contrast ratio, the difference between the detector response with and without an anomaly, varies linearly with anomaly density. Anomalies of air, wood and water give positive contrast ratios, whereas explosive, sand, concrete, graphite, limestone and polyethylene give negative contrast ratios. Overall, the contrast ratios are greater than 2% for all anomalies. These strong contrast ratios result in good detection capability and good discrimination between anomalies.

  20. The use of Compton scattering in detecting anomaly in soil-possible use in pyromaterial detection

    SciTech Connect

    Abedin, Ahmad Firdaus Zainal; Ibrahim, Noorddin; Zabidi, Noriza Ahmad; Demon, Siti Zulaikha Ngah

    2016-01-22

Compton scattering can provide a signature for landmine detection, based on the dependence of the scattered-photon energy change on density anomalies. In this study, 4.43 MeV gammas from an Am-Be source were used to induce Compton scattering. Two thallium-doped sodium iodide NaI(Tl) detectors, each of radius 1.9 cm, were placed 8 cm from the source to detect the gamma rays. Nine anomalies were used in the simulation, each a cylinder of radius 10 cm and height 8.9 cm buried 5 cm deep in a soil bed of radius 80 cm and height 53.5 cm. Monte Carlo simulations indicated that the photon scattering is directly proportional to the density of the anomalies. The contrast ratio, the difference between the detector response with and without an anomaly, varies linearly with anomaly density. Anomalies of air, wood and water give positive contrast ratios, whereas explosive, sand, concrete, graphite, limestone and polyethylene give negative contrast ratios. Overall, the contrast ratios are greater than 2% for all anomalies. These strong contrast ratios result in good detection capability and good discrimination between anomalies.

  1. Robust and efficient anomaly detection using heterogeneous representations

    NASA Astrophysics Data System (ADS)

    Hu, Xing; Hu, Shiqiang; Xie, Jinhua; Zheng, Shiyou

    2015-05-01

    Various approaches have been proposed for video anomaly detection. Yet these approaches typically suffer from one or more limitations: they often characterize the pattern using its internal information, but ignore its external relationship which is important for local anomaly detection. Moreover, the high-dimensionality and the lack of robustness of pattern representation may lead to problems, including overfitting, increased computational cost and memory requirements, and high false alarm rate. We propose a video anomaly detection framework which relies on a heterogeneous representation to account for both the pattern's internal information and external relationship. The internal information is characterized by slow features learned by slow feature analysis from low-level representations, and the external relationship is characterized by the spatial contextual distances. The heterogeneous representation is compact, robust, efficient, and discriminative for anomaly detection. Moreover, both the pattern's internal information and external relationship can be taken into account in the proposed framework. Extensive experiments demonstrate the robustness and efficiency of our approach by comparison with the state-of-the-art approaches on the widely used benchmark datasets.

  2. Radio Frequency Based Programmable Logic Controller Anomaly Detection

    DTIC Science & Technology

    2013-09-01

Only front-matter fragments are available for this record; they reference the RF-DNA transform, region of interest selection, CBAD, software anomaly detection on RF-DNA sequences (including single-device results, NB=60, NOp=5), RF-DNA fingerprint diagrams, and a representative collected scan waveform.

  3. Hyperspectral anomaly detection using Sony PlayStation 3

    NASA Astrophysics Data System (ADS)

    Rosario, Dalton; Romano, João; Sepulveda, Rene

    2009-05-01

We present a proof-of-principle demonstration using Sony's IBM Cell processor-based PlayStation 3 (PS3) to run, in near real time, a hyperspectral anomaly detection algorithm (HADA) on real hyperspectral (HS) long-wave infrared imagery. The PS3 console proved to be ideal for doing precisely the kind of heavy computational lifting HS-based algorithms require, and the fact that it is a relatively open platform makes programming scientific applications feasible. The PS3 HADA is a unique parallel, random-sampling-based anomaly detection approach that does not require prior spectra of the clutter background. The PS3 HADA is designed to handle known underlying difficulties (e.g., target shape/scale uncertainties) often ignored in the development of autonomous anomaly detection algorithms. The effort is part of an ongoing cooperative contribution between the Army Research Laboratory and the Army's Armament Research, Development and Engineering Center, which aims at demonstrating the performance of innovative algorithmic approaches for applications requiring autonomous anomaly detection using passive sensors.

  4. A spring window for geobotanical anomaly detection

    NASA Technical Reports Server (NTRS)

    Bell, R.; Labovitz, M. L.; Masuoka, E. J.

    1985-01-01

The observation of senescence in deciduous vegetation to detect soil heavy-metal mineralization is discussed. A gridded sampling of two sites of Quercus alba L. in south-central Virginia in 1982 is studied. The data reveal smaller leaf blade lengths at the site whose soil carries copper, lead, and zinc mineralization. A random study in 1983 of Q. rubra L., Q. prinus L., and Acer rubrum L. to confirm the previous results is described. The observations of blade length and bud break show a 7-10 day lag in growth at the mineralized site for the oak trees; the maple trees, however, are not influenced by the mineralization.

  5. Solar cell anomaly detection method and apparatus

    NASA Technical Reports Server (NTRS)

    Miller, Emmett L. (Inventor); Shumka, Alex (Inventor); Gauthier, Michael K. (Inventor)

    1981-01-01

    A method is provided for detecting cracks and other imperfections in a solar cell, which includes scanning a narrow light beam back and forth across the cell in a raster pattern, while monitoring the electrical output of the cell to find locations where the electrical output varies significantly. The electrical output can be monitored on a television type screen containing a raster pattern with each point on the screen corresponding to a point on the solar cell surface, and with the brightness of each point on the screen corresponding to the electrical output from the cell which was produced when the light beam was at the corresponding point on the cell. The technique can be utilized to scan a large array of interconnected solar cells, to determine which ones are defective.

  6. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

A system and methods for real-time, model-based detection of structural anomalies in a vehicle are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the measurement is compared to expected operation data for that location to produce a modeling error signal. The statistical significance of the modeling error signal is then calculated, and the persistence of that error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
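The measurement-to-model comparison with a persistence check can be sketched as follows. This is a generic illustration on synthetic data, not the patented implementation; the significance test, threshold values, and fault model are all illustrative assumptions:

```python
import numpy as np

def detect_structural_anomaly(measured, expected, sigma,
                              z_crit=3.0, persistence=5):
    """Flag an anomaly only when the modeling error stays statistically
    significant for `persistence` consecutive samples, suppressing
    single-sample false alarms. Returns the confirming index, or -1."""
    error = measured - expected                    # modeling error signal
    significant = np.abs(error) > z_crit * sigma   # error significance
    run = 0
    for t, s in enumerate(significant):
        run = run + 1 if s else 0
        if run >= persistence:
            return t
    return -1

rng = np.random.default_rng(3)
expected = np.sin(np.linspace(0, 6, 100))          # model output for a location
measured = expected + rng.normal(0, 0.1, 100)      # sensor noise, sigma = 0.1
measured[60:] += 1.0                               # structural change at sample 60

idx = detect_structural_anomaly(measured, expected, sigma=0.1)
print(idx)  # fault injected at sample 60 is confirmed a few samples later
```

Requiring persistence trades a few samples of detection latency for robustness: a lone noise spike that crosses the significance threshold resets to zero on the next nominal sample.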

  7. An Anomaly Clock Detection Algorithm for a Robust Clock Ensemble

    DTIC Science & Technology

    2009-11-01

Presented at the 41st Annual Precise Time and Time Interval (PTTI) Meeting. The anomaly clock detection algorithm keeps the ensemble clocks in phase and on frequency at all times, and is relatively simple, robust, fully redundant, and of improved performance. Algorithm parameters, such as the sliding window width as a function of the time constant and the minimum detectable levels, have been optimized.

  8. Gaussian Process for Activity Modeling and Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Liao, W.; Rosenhahn, B.; Yang, M. Ying

    2015-08-01

Modeling complex activities and identifying anomalies are among the most interesting and desired capabilities for automated video behavior analysis. A number of different approaches have been proposed in the past to tackle this problem. There are two main challenges for activity modeling and anomaly detection: 1) most existing approaches require sufficient data and supervision for learning; 2) the most interesting abnormal activities arise rarely and are ambiguous among typical activities, i.e. they are hard to define precisely. In this paper, we propose a novel approach to model complex activities and detect anomalies by using non-parametric Gaussian Process (GP) models in a crowded and complicated traffic scene. In comparison with parametric models such as the HMM, GP models are non-parametric and make fewer structural assumptions about the underlying process. Our GP models exploit the implicit spatial-temporal dependence among local activity patterns. The learned GP regression models give a probabilistic prediction of regional activities at the next time interval based on present observations. An anomaly is detected by comparing the actual observations with the prediction in real time. We verify the effectiveness and robustness of the proposed model on the QMUL Junction Dataset. Furthermore, we provide a publicly available, manually labeled ground truth for this dataset.
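The detection principle, comparing an actual observation against the GP's probabilistic prediction, can be sketched in a few lines. This uses a generic 1-D activity signal and standard GP regression, not the paper's spatio-temporal models; the kernel and the 3-sigma rule are illustrative assumptions:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)

# Regional activity level observed over time (training: normal behaviour).
t_train = np.linspace(0, 10, 100)[:, None]
y_train = np.sin(t_train).ravel() + rng.normal(0, 0.1, 100)

gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.01),
                              normalize_y=True).fit(t_train, y_train)

# Probabilistic prediction for the next time interval.
t_new = np.array([[10.1]])
mean, std = gp.predict(t_new, return_std=True)

# Compare actual observations with the prediction.
normal_obs = np.sin(10.1)            # consistent with learned behaviour
abnormal_obs = np.sin(10.1) + 3.0    # unusual activity level
z_normal = abs(normal_obs - mean[0]) / std[0]
z_abnormal = abs(abnormal_obs - mean[0]) / std[0]
print(z_normal, z_abnormal)
```

Because the GP returns a predictive standard deviation as well as a mean, the anomaly threshold adapts automatically: observations in regions where the model is uncertain are penalized less than the same deviation in a well-learned region.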

  9. Automated anomaly detection for Orbiter High Temperature Reusable Surface Insulation

    NASA Astrophysics Data System (ADS)

    Cooper, Eric G.; Jones, Sharon M.; Goode, Plesent W.; Vazquez, Sixto L.

    1992-11-01

    The description, analysis, and experimental results of a method for identifying possible defects on High Temperature Reusable Surface Insulation (HRSI) of the Orbiter Thermal Protection System (TPS) is presented. Currently, a visual postflight inspection of Orbiter TPS is conducted to detect and classify defects as part of the Orbiter maintenance flow. The objective of the method is to automate the detection of defects by identifying anomalies between preflight and postflight images of TPS components. The initial version is intended to detect and label gross (greater than 0.1 inches in the smallest dimension) anomalies on HRSI components for subsequent classification by a human inspector. The approach is a modified Golden Template technique where the preflight image of a tile serves as the template against which the postflight image of the tile is compared. Candidate anomalies are selected as a result of the comparison and processed to identify true anomalies. The processing methods are developed and discussed, and the results of testing on actual and simulated tile images are presented. Solutions to the problems of brightness and spatial normalization, timely execution, and minimization of false positives are also discussed.
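The golden-template comparison with brightness normalization and false-positive suppression can be sketched as follows. This is a simplified illustration on synthetic images (global mean/std matching and a minimum-area filter), not the Orbiter processing pipeline; the thresholds and defect sizes are assumptions:

```python
import numpy as np
from scipy import ndimage

def find_anomalies(pre, post, z=4.0, min_area=4):
    """Golden-template comparison: normalize brightness, difference the
    images, and keep only flagged regions larger than a minimum area."""
    # Brightness normalization: match the postflight image's statistics
    # to the preflight template.
    post_n = (post - post.mean()) / (post.std() + 1e-9) * pre.std() + pre.mean()
    diff = np.abs(post_n - pre)
    mask = diff > diff.mean() + z * diff.std()
    # Minimize false positives: drop isolated flagged pixels.
    labels, n = ndimage.label(mask)
    keep = [i for i in range(1, n + 1) if (labels == i).sum() >= min_area]
    return np.isin(labels, keep)

rng = np.random.default_rng(5)
pre = rng.normal(100.0, 10.0, (64, 64))      # preflight tile texture
post = 1.25 * pre + 10.0                     # brightness/contrast change only
post[20:23, 30:33] -= 60.0                   # true defect (gouge)
post[5, 5] += 50.0                           # isolated sensor artifact

out = find_anomalies(pre, post)
print(out.sum())
```

The normalization step absorbs uniform lighting differences between preflight and postflight imagery, while the area filter discards the single-pixel artifact that survives thresholding.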

  10. Anomaly detection for machine learning redshifts applied to SDSS galaxies

    NASA Astrophysics Data System (ADS)

    Hoyle, Ben; Rau, Markus Michael; Paech, Kerstin; Bonnett, Christopher; Seitz, Stella; Weller, Jochen

    2015-10-01

We present an analysis of anomaly detection for machine learning redshift estimation. Anomaly detection allows the removal of poor training examples, which can adversely influence redshift estimates. Anomalous training examples may be photometric galaxies with incorrect spectroscopic redshifts, or galaxies with one or more poorly measured photometric quantities. We select 2.5 million `clean' SDSS DR12 galaxies with reliable spectroscopic redshifts, and 6730 `anomalous' galaxies with spectroscopic redshift measurements which are flagged as unreliable. We contaminate the clean base galaxy sample with galaxies with unreliable redshifts and attempt to recover the contaminating galaxies using the Elliptical Envelope technique. We then train four machine learning architectures for redshift analysis on both the contaminated sample and on the preprocessed `anomaly-removed' sample, and measure redshift statistics on a clean validation sample generated without any preprocessing. We find an improvement on all measured statistics of up to 80 per cent when training on the anomaly-removed sample as compared with training on the contaminated sample for each of the machine learning routines explored. We further describe a method to estimate the contamination fraction of a base data sample.
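The Elliptical Envelope preprocessing step can be sketched with scikit-learn's implementation. The data here are a synthetic stand-in for the photometry/redshift features (the SDSS sample itself is not reproduced), and the contamination fraction is an illustrative assumption:

```python
import numpy as np
from sklearn.covariance import EllipticEnvelope

rng = np.random.default_rng(5)

# Clean training sample: a photometric feature that tracks redshift tightly.
x_clean = rng.uniform(0, 2, 1000)
y_clean = 2.0 * x_clean + rng.normal(0, 0.1, 1000)

# Contaminants: the same feature paired with unreliable (random) redshifts.
x_bad = rng.uniform(0, 2, 50)
y_bad = rng.uniform(-4, 8, 50)

X = np.column_stack([np.r_[x_clean, x_bad], np.r_[y_clean, y_bad]])

# Fit a robust elliptical envelope and flag ~5% of points as anomalous.
flags = EllipticEnvelope(contamination=0.05, random_state=0).fit_predict(X)
kept = X[flags == 1]
print(kept.shape)   # the anomaly-removed sample used for training
```

Because the envelope is fit robustly (via the minimum covariance determinant), the contaminants far from the correlated locus get large Mahalanobis distances and are removed before the redshift regressor ever sees them.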

  11. Clutter and anomaly removal for enhanced target detection

    NASA Astrophysics Data System (ADS)

    Basener, William F.

    2010-04-01

    In this paper we investigate the use of anomaly detection to identify pixels to be removed prior to covariance computation. The resulting covariance matrix provides a better model of the image background and is less likely to be tainted by target spectra. In our tests, this method results in robust improvement in target detection performance for quadratic detection algorithms. Tests are conducted using imagery and targets freely available online. The imagery was acquired over Cooke City, Montana, a small town near Yellowstone Park, using the HyMap V/NIR/SWIR sensor with 126 spectral bands. There are three vehicle and four fabric targets located in the town and surrounding area.
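The two-pass idea, flag anomalous pixels first, then recompute the background covariance without them, can be sketched on synthetic multiband pixels. The percentile cut and squared-Mahalanobis detector here are generic illustrative choices, not the paper's specific anomaly detector or the HyMap data:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic "image": 2000 background pixels (5 bands) plus 50 target pixels.
bg = rng.multivariate_normal(np.zeros(5), np.eye(5) * 0.1, 2000)
targets = rng.normal(3.0, 0.1, (50, 5))
pixels = np.vstack([bg, targets])

def mahalanobis_sq(X, mu, cov):
    d = X - mu
    return np.einsum("ij,jk,ik->i", d, np.linalg.inv(cov), d)

# Pass 1: naive background model over ALL pixels (tainted by target spectra).
m_naive = mahalanobis_sq(pixels, pixels.mean(0), np.cov(pixels.T))

# Pass 2: remove the most anomalous pixels, then refit the background model.
keep = m_naive < np.percentile(m_naive, 95.0)
mu, cov = pixels[keep].mean(0), np.cov(pixels[keep].T)
m_clean = mahalanobis_sq(pixels, mu, cov)

# Targets stand out more once they no longer inflate the covariance.
print(m_clean[-50:].min() / m_naive[-50:].min())
```

When target spectra leak into the covariance estimate, variance is inflated along the target direction and detection scores are suppressed; excluding flagged pixels restores a tighter background model.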

  12. Spectral anomaly methods for aerial detection using KUT nuisance rejection

    NASA Astrophysics Data System (ADS)

    Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.

    2015-06-01

This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical, due to higher speeds, a larger field of view, and geographically induced background changes. In addition, large variations in altitude or stand-off distance cause significant steps in the background count rate, as well as spectral changes due to increased gamma-ray scatter when detecting at higher altitudes. This work details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems: a NaI(Tl)-detector-based system and a CsI detector array. The optimization adapts the spectral windows for a particular set of target sources to aerial detection and tailors them to the specific detectors. The methodology and results for background rejection optimized for aerial gamma-ray detection using Potassium, Uranium and Thorium (KUT) nuisance rejection are also shown. Results indicate that realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection from altitude changes and geographically induced steps, such as at land-water interfaces.

  13. Anomaly Detection for Next-Generation Space Launch Ground Operations

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Iverson, David L.; Hall, David R.; Taylor, William M.; Patterson-Hine, Ann; Brown, Barbara; Ferrell, Bob A.; Waterman, Robert D.

    2010-01-01

    NASA is developing new capabilities that will enable future human exploration missions while reducing mission risk and cost. The Fault Detection, Isolation, and Recovery (FDIR) project aims to demonstrate the utility of integrated vehicle health management (IVHM) tools in the domain of ground support equipment (GSE) to be used for the next generation launch vehicles. In addition to demonstrating the utility of IVHM tools for GSE, FDIR aims to mature promising tools for use on future missions and document the level of effort - and hence cost - required to implement an application with each selected tool. One of the FDIR capabilities is anomaly detection, i.e., detecting off-nominal behavior. The tool we selected for this task uses a data-driven approach. Unlike rule-based and model-based systems that require manual extraction of system knowledge, data-driven systems take a radically different approach to reasoning. At the basic level, they start with data that represent nominal functioning of the system and automatically learn expected system behavior. The behavior is encoded in a knowledge base that represents "in-family" system operations. During real-time system monitoring or during post-flight analysis, incoming data is compared to that nominal system operating behavior knowledge base; a distance representing deviation from nominal is computed, providing a measure of how far "out of family" current behavior is. We describe the selected tool for FDIR anomaly detection - Inductive Monitoring System (IMS), how it fits into the FDIR architecture, the operations concept for the GSE anomaly monitoring, and some preliminary results of applying IMS to a Space Shuttle GSE anomaly.
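The data-driven monitoring idea described above can be sketched in the spirit of IMS: learn clusters of nominal telemetry, then score new data by its distance to the nearest learned cluster. This is a simplified centroid-based variant (IMS itself uses learned parameter ranges/hyperboxes), on synthetic GSE-like telemetry:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)

# Nominal telemetry (e.g. pressure, temperature) in two operating modes.
mode_a = rng.normal([100.0, 20.0], [2.0, 0.5], size=(300, 2))
mode_b = rng.normal([250.0, 35.0], [3.0, 0.5], size=(300, 2))
nominal = np.vstack([mode_a, mode_b])

# "Learn" the in-family knowledge base as cluster centres of nominal data.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(nominal)
# Worst deviation seen in training defines the in-family boundary.
threshold = km.transform(nominal).min(axis=1).max()

def deviation(sample):
    """Distance from nominal behaviour = distance to nearest cluster centre."""
    return km.transform(np.atleast_2d(sample)).min()

print(deviation([101.0, 20.2]) <= threshold)   # in-family reading
print(deviation([175.0, 27.0]) > threshold)    # "out of family" reading
```

No rules or physics model are hand-written: the knowledge base is built entirely from archived nominal data, and the deviation score gives the "how far out of family" measure used for monitoring.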

  14. Claycap anomaly detection using hyperspectral remote sensing and lidargrammetric techniques

    NASA Astrophysics Data System (ADS)

    Garcia Quijano, Maria Jose

Clay-capped waste sites are a common means of disposing of the more than 40 million tons of hazardous waste produced in the United States every year (EPA, 2003). Due to the potential threat that hazardous waste poses, it is essential to monitor the performance of these facilities closely. The development of a monitoring system that exploits spectral and topographic changes over hazardous waste sites is presented. Spectral anomaly detection is based upon the observed changes in absolute reflectance and spectral derivatives of centipede grass (Eremochloa ophiuroides) under different irrigation levels. The spectral features that provide the best separability among irrigation levels were identified using Stepwise Discriminant Analyses. The Red Edge Position was selected as a suitable discriminant variable to compare the performance of a global and a local anomaly detection algorithm on a DAIS 3715 hyperspectral image. Topographic anomaly detection is assessed by evaluating the vertical accuracy of two LIDAR datasets acquired from two different altitudes (700 m and 1,200 m AGL) over a clay-capped hazardous site at the Savannah River National Laboratory, SC, using the same Optech ALTM 2050 sensor and Cessna 337 platform. Additionally, a quantitative comparison is performed to determine the effect that decreasing platform altitude and increasing posting density have on the vertical accuracy of the LIDAR data.

  15. GPR anomaly detection with robust principal component analysis

    NASA Astrophysics Data System (ADS)

    Masarik, Matthew P.; Burns, Joseph; Thelen, Brian T.; Kelly, Jack; Havens, Timothy C.

    2015-05-01

    This paper investigates the application of Robust Principal Component Analysis (RPCA) to ground penetrating radar as a means to improve GPR anomaly detection. The method consists of a preprocessing routine to smoothly align the ground and remove the ground response (haircut), followed by mapping to the frequency domain, applying RPCA, and then mapping the sparse component of the RPCA decomposition back to the time domain. A prescreener is then applied to the time-domain sparse component to perform anomaly detection. The emphasis of the RPCA algorithm on sparsity has the effect of significantly increasing the apparent signal-to-clutter ratio (SCR) as compared to the original data, thereby enabling improved anomaly detection. This method is compared to detrending (spatial-mean removal) and classical principal component analysis (PCA), and the RPCA-based processing is seen to provide substantial improvements in the apparent SCR over both of these alternative processing schemes. In particular, the algorithm has been applied to both field collected impulse GPR data and has shown significant improvement in terms of the ROC curve relative to detrending and PCA.
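The RPCA decomposition at the heart of the method, splitting the data matrix into a low-rank component (background/clutter) plus a sparse component (anomalies), can be sketched with a standard inexact augmented Lagrangian solver for principal component pursuit. This is a generic textbook implementation on synthetic data, not the paper's GPR processing chain; the parameter defaults follow common practice:

```python
import numpy as np

def rpca(M, lam=None, mu=None, rho=1.05, iters=500, tol=1e-7):
    """Principal Component Pursuit: decompose M into low-rank L + sparse S
    via an inexact augmented Lagrangian method."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * m * n / (np.abs(M).sum() + 1e-12)
    Y = np.zeros_like(M)
    S = np.zeros_like(M)
    shrink = lambda X, t: np.sign(X) * np.maximum(np.abs(X) - t, 0.0)
    for _ in range(iters):
        # Low-rank update: singular value thresholding.
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * shrink(sig, 1.0 / mu)) @ Vt
        # Sparse update: entrywise soft thresholding.
        S = shrink(M - L + Y / mu, lam / mu)
        resid = M - L - S
        Y += mu * resid
        mu *= rho
        if np.linalg.norm(resid) <= tol * np.linalg.norm(M):
            break
    return L, S

rng = np.random.default_rng(9)
L0 = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 50))   # low-rank clutter
S0 = np.zeros((50, 50))
mask = rng.random((50, 50)) < 0.05
S0[mask] = rng.choice([-5.0, 5.0], size=mask.sum())        # sparse anomalies
M = L0 + S0

L, S = rpca(M)
rel = np.linalg.norm(L - L0) / np.linalg.norm(L0)
print(rel)
```

The sparse component S is where anomaly detection happens: because the clutter is absorbed into L, the apparent signal-to-clutter ratio of whatever remains in S is greatly increased, mirroring the prescreener step described above.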

  16. Anomaly-based intrusion detection for SCADA systems

    SciTech Connect

    Yang, D.; Usynin, A.; Hines, J. W.

    2006-07-01

Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution networks, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security, and there are concerns that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimated that malicious online actions might cause $75 billion in damage in 2007. One interesting countermeasure for enhancing information system security is intrusion detection. This paper briefly discusses the history of research in intrusion detection techniques and introduces the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly-based intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT), applied to a simulated SCADA system. The results show that these methods can be used to detect a variety of common attacks. (authors)
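The AAKR-plus-SPRT pipeline can be sketched on a toy two-signal system: AAKR predicts each incoming sensor vector as a kernel-weighted average of nominal training vectors, and an SPRT watches the residual for a sustained mean shift. This is a simplified illustration with synthetic data, not the paper's simulated SCADA testbed; the bandwidth, shift size, and error rates are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(8)

# Nominal training data: two correlated SCADA signals (e.g. flow, pressure).
t = rng.uniform(0, 1, 400)
X_train = np.column_stack([t, 2.0 * t]) + rng.normal(0, 0.02, (400, 2))

def aakr_predict(x, X, h=0.05):
    """Auto-associative kernel regression: predict the 'correct' sensor
    vector as a kernel-weighted average of nominal training vectors."""
    w = np.exp(-np.sum((X - x) ** 2, axis=1) / (2 * h * h))
    return (w @ X) / (w.sum() + 1e-12)

def sprt_alarm(residuals, sigma, m1, alpha=0.001, beta=0.001):
    """One-sided SPRT for a mean shift of m1 in a Gaussian residual.
    Returns the alarm index, or -1 if no alarm."""
    upper = np.log((1 - beta) / alpha)
    llr = 0.0
    for k, r in enumerate(residuals):
        llr = max(llr + m1 * (r - m1 / 2.0) / sigma**2, 0.0)
        if llr > upper:
            return k
    return -1

# Monitored stream: sensor 2 drifts off the correlation manifold at sample 30.
stream = np.tile([0.5, 1.0], (60, 1)) + rng.normal(0, 0.02, (60, 2))
stream[30:, 1] += 0.5

resid = np.array([(s - aakr_predict(s, X_train))[1] for s in stream])
alarm = sprt_alarm(resid, sigma=0.02, m1=0.1)
print(alarm)  # alarms shortly after the fault at sample 30
```

Note the residual the SPRT sees is the off-manifold component of the fault (AAKR pulls its prediction toward the nearest nominal states), which is why a drift that breaks the learned correlation is what gets detected.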

  17. Progressive anomaly detection in medical data using vital sign signals

    NASA Astrophysics Data System (ADS)

    Gao, Cheng; Lee, Li-Chien; Li, Yao; Chang, Chein-I.; Hu, Peter; Mackenzie, Colin

    2016-05-01

Vital sign signals (VSSs) have been widely used for medical data analysis. One classic approach uses a Logistic Regression Model (LRM) to describe the data to be analyzed. This approach raises two challenging issues. One is how many VSSs should be used in the model, since many VSSs are available for this purpose. The other is, once the number of VSSs is determined, which VSSs they should be. To date, these two issues have been resolved by empirical selection. This paper addresses both issues from a hyperspectral imaging perspective. If we view a patient with a collection of different vital sign signals as a pixel vector in a hyperspectral image, then each vital sign signal can be considered a particular band. In light of this interpretation, each VSS can be ranked by the band prioritization commonly used for band selection in hyperspectral imaging. To resolve the issue of how many VSSs should be used for data analysis, we further develop Progressive Band Processing of Anomaly Detection (PBPAD), which allows users to detect anomalies in medical data using prioritized VSSs one after another, so that data changes between bands can be dictated by the profiles provided by PBPAD. As a result, there is no need to determine the number of VSSs, or which VSSs should be used, because all VSSs are used in their prioritized order. To demonstrate the utility of PBPAD in medical data analysis, anomaly detection is implemented as PBP to find anomalies corresponding to abnormal patients. The experiments use data collected at the University of Maryland School of Medicine Shock Trauma Center (STC). The results are evaluated against those obtained with the Logistic Regression Model (LRM).

  18. On-line Flagging of Anomalies and Adaptive Sequential Hypothesis Testing for Fine-feature Characterization of Geosynchronous Satellites

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Payne, T.; Kinateder, K.; Dao, P.; Beecher, E.; Boone, D.; Elliott, B.

The objective of on-line flagging in this paper is to perform interactive assessment of geosynchronous satellite anomalies, such as cross-tagging of a satellite in a cluster, a solar panel offset change, etc. This assessment utilizes a Bayesian belief propagation procedure and includes automated updating of the satellite's baseline signature data while accounting for seasonal changes. Its purpose is to enable an ongoing, automated assessment of satellite behavior through the satellite's life cycle, using the photometry data collected during the synoptic search performed by a ground- or space-based sensor as part of its metrics mission. Changes in the satellite's features are reported along with the probabilities of Type I and Type II errors. The objective of adaptive sequential hypothesis testing in this paper is to define future sensor tasking for the purpose of characterizing fine features of the satellite. The tasking is designed to maximize new information with the fewest photometry data points collected during the synoptic search by a ground- or space-based sensor. Its calculation is based on information entropy techniques. The tasking is defined by considering a sequence of hypotheses regarding the fine features of the satellite; the optimal observation conditions are then ordered so as to maximize new information about a chosen fine feature. The combined objective of on-line flagging and adaptive sequential hypothesis testing is to progressively discover new information about the features of a geosynchronous satellite by leveraging the regular but sparse cadence of data collection during the synoptic search performed by a ground- or space-based sensor.

Automated Algorithm to Detect Changes in Geostationary Satellite's Configuration and Cross-Tagging. Phan Dao, Air Force Research Laboratory/RVB. By characterizing geostationary satellites based on photometry and color photometry, analysts can

  19. Inflight and Preflight Detection of Pitot Tube Anomalies

    NASA Technical Reports Server (NTRS)

    Mitchell, Darrell W.

    2014-01-01

    The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates sensors' response time to the dynamic component (noise) found in the signal of these same sensors. This analysis technique has used existing electrical signals of Pitot tube sensors that result from measured processes during inflight conditions and/or induced signals in preflight conditions to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants. The application of this technology for the detection of aircraft anomalies is innovative. Instead of determining the health of process monitoring at a steady-state condition, this technology will be used to quickly inform the pilot when an air-speed indication becomes faulty under any flight condition as well as during preflight preparation.

  20. Value Focused Thinking Applications to Supervised Pattern Classification With Extensions to Hyperspectral Anomaly Detection Algorithms

    DTIC Science & Technology

    2015-03-26

    Value Focused Thinking Applications to Supervised Pattern Classification with Extensions to Hyperspectral Anomaly Detection Algorithms. Thesis, March 2015. David E. Scanland, Captain, USAF. AFIT-ENS-MS-15-M-121, Department of the Air Force ... thesis presented to the faculty, Department of ... David E. Scanland, MS, Captain, USAF

  1. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    NASA Technical Reports Server (NTRS)

    Viswanathan, Arun

    2012-01-01

    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this

  2. Hierarchical Kohonenen net for anomaly detection in network security.

    PubMed

    Sarasamma, Suseela T; Zhu, Qiuming A; Huff, Julie

    2005-04-01

    A novel multilevel hierarchical Kohonen Net (K-Map) for an intrusion detection system is presented. Each level of the hierarchical map is modeled as a simple winner-take-all K-Map. One significant advantage of this multilevel hierarchical K-Map is its computational efficiency. Unlike other statistical anomaly detection methods such as nearest neighbor approach, K-means clustering or probabilistic analysis that employ distance computation in the feature space to identify the outliers, our approach does not involve costly point-to-point computation in organizing the data into clusters. Another advantage is the reduced network size. We use the classification capability of the K-Map on selected dimensions of data set in detecting anomalies. Randomly selected subsets that contain both attacks and normal records from the KDD Cup 1999 benchmark data are used to train the hierarchical net. We use a confidence measure to label the clusters. Then we use the test set from the same KDD Cup 1999 benchmark to test the hierarchical net. We show that a hierarchical K-Map in which each layer operates on a small subset of the feature space is superior to a single-layer K-Map operating on the whole feature space in detecting a variety of attacks in terms of detection rate as well as false positive rate.
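    A minimal sketch of a single winner-take-all K-Map layer of the kind described (illustrative only; the feature subspace, unit count, and quantization-error anomaly score below are assumptions, not the paper's configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_kmap(X, n_units, epochs=20, lr0=0.5):
    """One winner-take-all K-Map layer: only the winning unit's
    weight vector is pulled toward each presented input."""
    W = X[rng.choice(len(X), size=n_units, replace=False)].copy()
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)                  # decaying learning rate
        for x in rng.permutation(X):
            w = np.argmin(np.linalg.norm(W - x, axis=1))   # winner unit
            W[w] += lr * (x - W[w])
    return W

def quantization_error(W, x):
    """Distance from x to its nearest unit; large values flag anomalies."""
    return np.min(np.linalg.norm(W - x, axis=1))

# Toy stand-in for KDD-style records: two selected features per
# connection, with normal traffic forming a dense cluster.
normal = rng.normal(0.0, 0.3, size=(200, 2))
W = train_kmap(normal, n_units=4)

q_typical = quantization_error(W, normal[0])            # small: near a unit
q_novel = quantization_error(W, np.array([4.0, 4.0]))   # large: outlier
```

Stacking several such layers, each operating on a different small subset of features, gives the hierarchical structure the abstract describes.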

  3. Automated detection of changes in sequential color ocular fundus images

    NASA Astrophysics Data System (ADS)

    Sakuma, Satoshi; Nakanishi, Tadashi; Takahashi, Yasuko; Fujino, Yuichi; Tsubouchi, Tetsuro; Nakanishi, Norimasa

    1998-06-01

    A recent trend is the automatic screening of color ocular fundus images. The examination of such images is used in the early detection of several adult diseases such as hypertension and diabetes. Since this type of examination is easier than CT, costs less, and has no harmful side effects, it will become a routine medical examination. Normal ocular fundus images are found in more than 90% of all people. To deal with the increasing number of such images, this paper proposes a new approach to process them automatically and accurately. Our approach, based on individual comparison, identifies changes in sequential images: a previously diagnosed normal reference image is compared to a non-diagnosed image.
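    The individual-comparison idea, comparing a diagnosed-normal reference image against a newer image of the same eye, can be sketched with simple per-pixel differencing (a hypothetical illustration; the paper's actual registration and comparison pipeline is more involved):

```python
import numpy as np

def change_mask(reference, current, threshold=0.2):
    """Flag pixels whose normalized intensity differs markedly from a
    previously diagnosed normal reference image of the same eye."""
    ref = reference.astype(float) / 255.0
    cur = current.astype(float) / 255.0
    return np.abs(cur - ref) > threshold

# Synthetic 8-bit grayscale "fundus" images: the current image contains
# a small bright region absent from the reference.
rng = np.random.default_rng(1)
reference = rng.integers(90, 110, size=(64, 64)).astype(np.uint8)
current = reference.copy()
current[20:24, 30:34] = 220                # simulated 4x4 new lesion

mask = change_mask(reference, current)
n_changed = int(mask.sum())                # 16 lesion pixels flagged
```

In practice the two images would first be spatially registered; only then does per-pixel comparison isolate genuine retinal changes.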

  4. Detection of chiral anomaly and valley transport in Dirac semimetals

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng; Zhang, Enze; Liu, Yanwen; Chen, Zhigang; Liang, Sihang; Cao, Junzhi; Yuan, Xiang; Tang, Lei; Li, Qian; Gu, Teng; Wu, Yizheng; Zou, Jin; Xiu, Faxian

    Chiral anomaly is a non-conservation of chiral charge pumped by a topologically nontrivial gauge field, which has been predicted to exist in the emergent quasiparticle excitations in Dirac and Weyl semimetals. However, so far, such a pumping process has not been clearly demonstrated and lacks a convincing experimental identification. Here, we report the detection of the charge pumping effect and the related valley transport in Cd3As2 driven by external electric and magnetic fields (EB). We find that the chiral imbalance leads to a non-zero gyrotropic coefficient, which can be confirmed by the EB-generated Kerr effect. By applying B along the current direction, we observe a negative magnetoresistance despite the giant positive one at other directions, a clear indication of the chiral anomaly. Remarkably, a robust nonlocal response in valley diffusion originating from the chiral anomaly persists up to room temperature when B is parallel to E. The ability to manipulate the valley polarization in a Dirac semimetal opens up a brand-new route to understand its fundamental properties through external fields and to utilize the chiral fermions in valleytronic applications.

  5. Anomaly detection of flight routes through optimal waypoint

    NASA Astrophysics Data System (ADS)

    Pusadan, M. Y.; Buliali, J. L.; Ginardi, R. V. H.

    2017-01-01

    The flight route is a deciding factor of a flight. A route is defined by waypoints, each given by coordinates (latitude and longitude); an anomaly occurs when the aircraft flies outside the specified waypoint area. In the case of flight data, anomalies are identified from problems in the flight route based on ADS-B data. This study aims to determine the optimal waypoints of the flight route. The proposed methods are: i) Agglomerative Hierarchical Clustering (AHC) on several segments based on the range of area coordinates (latitude and longitude) at every waypoint; ii) the cophenetic correlation coefficient (c) to determine the correlation between the members in each cluster; iii) cubic spline interpolation as a graphic representation of the connections between the coordinates at every waypoint; and iv) Euclidean distance to measure distances between waypoints and the two centroids resulting from AHC clustering. The experimental results are: a cophenetic correlation coefficient of 0.691 ≤ c ≤ 0.974; five segments generated from the range of waypoint coordinates; and shortest and longest centroid-to-waypoint distances of 0.46 and 2.18. It is concluded that the shortest distance serves as the reference coordinate of the optimal waypoint, while the farthest distance indicates a potentially detected anomaly.
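    Steps i), ii), and iv) can be sketched with SciPy (illustrative only; the coordinates and cluster count below are made up, not the paper's ADS-B data):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet, fcluster
from scipy.spatial.distance import pdist, euclidean

# Hypothetical ADS-B position fixes (latitude, longitude) scattered
# around two waypoints.
rng = np.random.default_rng(0)
wp1 = rng.normal([-5.06, 119.55], 0.02, size=(15, 2))
wp2 = rng.normal([-6.90, 112.79], 0.02, size=(15, 2))
coords = np.vstack([wp1, wp2])

# i) Agglomerative hierarchical clustering on the coordinate set.
Z = linkage(coords, method="average")

# ii) The cophenetic correlation coefficient measures how faithfully
# the dendrogram preserves the original pairwise distances.
c, _ = cophenet(Z, pdist(coords))

# iv) Distance of each fix to its cluster centroid; unusually large
# distances indicate a potential route anomaly.
labels = fcluster(Z, t=2, criterion="maxclust")
centroids = {k: coords[labels == k].mean(axis=0) for k in (1, 2)}
dists = np.array([euclidean(p, centroids[k]) for p, k in zip(coords, labels)])
```

Fixes whose centroid distance greatly exceeds the typical within-cluster spread would be the candidates flagged as off-route.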

  6. System for Anomaly and Failure Detection (SAFD) system development

    NASA Technical Reports Server (NTRS)

    Oreilly, D.

    1993-01-01

    The System for Anomaly and Failure Detection (SAFD) algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failures as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions. This task assignment originally specified developing a platform for executing the algorithm during hot fire tests at Technology Test Bed (TTB) and installing the SAFD algorithm on that platform. Two units were built and installed in the Hardware Simulation Lab and at the TTB in December 1991. Since that time, the task primarily entailed improvement and maintenance of the systems, additional testing to prove the feasibility of the algorithm, and support of hot fire testing. This document addresses the work done since the last report of June 1992. The work on the System for Anomaly and Failure Detection during this period included improving the platform and the algorithm, testing the algorithm against previous test data and in the Hardware Simulation Lab, installing other algorithms on the system, providing support for operations at the Technology Test Bed, and providing routine maintenance.

  7. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field

    PubMed Central

    Christiansen, Peter; Nielsen, Lars N.; Steen, Kim A.; Jørgensen, Rasmus N.; Karstoft, Henrik

    2016-01-01

    Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained for detecting and classifying a predefined set of object types. These algorithms have difficulties in detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, such as people and animals, occur rarely and are distinct in appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including “Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks” (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45–90 m) than RCNN. RCNN has a similar performance at a short range (0–30 m). However, DeepAnomaly has much fewer model parameters and (182 ms/25 ms =) a 7.28-times faster processing time per image. Unlike most CNN-based methods, the high accuracy, the low computation time and the low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit). PMID:27845717

  8. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field.

    PubMed

    Christiansen, Peter; Nielsen, Lars N; Steen, Kim A; Jørgensen, Rasmus N; Karstoft, Henrik

    2016-11-11

    Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained for detecting and classifying a predefined set of object types. These algorithms have difficulties in detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, such as people and animals, occur rarely and are distinct in appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks" (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45-90 m) than RCNN. RCNN has a similar performance at a short range (0-30 m). However, DeepAnomaly has much fewer model parameters and (182 ms/25 ms =) a 7.28-times faster processing time per image. Unlike most CNN-based methods, the high accuracy, the low computation time and the low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit).

  9. The Frequencies of the Urinary Anomalies which were Detected in a Foetal Autopsy Study

    PubMed Central

    Gupta, Tulika; Kapoor, Kanchan; Sharma, A.; Huria, A.

    2012-01-01

    Aim The detection of foetal urinary abnormalities in the antenatal period will help in adequate postnatal management and will also have a bearing on the decision of the termination of the pregnancy. The purpose of the present study was to detect urinary anomalies in the antenatal period by performing autopsies of aborted foetuses. Settings and Design A cross-sectional study. Methods and Material A total of 226 aborted foetuses were autopsied. The urinary anomalies which were related to the renal parenchyma, the pelvi-ureteral system and the urinary bladder were recorded. The associated anomalies of the other organ systems were also noted. The incidences of the different urinary anomalies among the aborted foetuses were calculated. The gestational ages at which the various anomalies were detected were also studied. Results Twenty-nine of the 226 foetuses were detected to have 34 urinary anomalies. Renal agenesis was the single most common anomaly. Overall, the anomalies which were related to the renal parenchyma accounted for 67.65% of all the urinary anomalies, while the anomalies of the pelvi-ureteral system and the bladder constituted 20.59% of the detected urinary anomalies. The anomalies of the renal parenchyma (renal agenesis and horseshoe and polycystic kidneys) were more frequently seen in the foetuses with a shorter gestational age as compared to the gestational ages of the foetuses which showed pelvi-ureteral anomalies. The cumulative incidence of the foetuses with urinary anomalies by 30 weeks of gestation was 12.83%. Conclusions A significant proportion of the aborted foetuses were detected to have urinary anomalies. An early antenatal detection of these and associated anomalies has significance, as this may help in an early postnatal diagnosis and management. The degree and the extent of the detected anomalies could also help in the decision making regarding the therapeutic abortions and the future pregnancies. PMID:23373012

  10. Application of Improved SOM Neural Network in Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Jiang, Xueying; Liu, Kean; Yan, Jiegou; Chen, Wenhui

    To address the false alarm rate, false negative rate, long training time, and other issues of the SOM neural network algorithm, the authors give an improved anomaly detection SOM algorithm, FPSOM, which introduces an adaptive learning rate so that the network can adaptively learn the original sample space and better reflect the state of the original data. Combined with artificial neural networks, the authors also give an intelligent detection model and a training module for it, design the main implementation of the FPSOM neural network algorithm, and finally carry out simulation experiments on the KDD Cup data sets. The experiments show that the new algorithm is better than SOM: it greatly shortens training time, effectively improves the detection rate, and reduces the false positive rate.

  11. Detection of fluid density anomalies using remote imaging techniques

    NASA Astrophysics Data System (ADS)

    Smart, Clara J.

    Systematic and remote imaging techniques capable of detecting fluid density anomalies will allow for effective scientific sampling, improved geologic and biologic spatial understanding, and analysis of temporal changes. This work presents algorithms for detecting anomalous fluids using an ROV-mounted high-resolution imaging suite, specifically the structured light laser sensor and 1350 kHz multibeam sonar system. As the ROV-mounted structured light laser sensor passes over areas of active flow, the turbulent nature of the density anomaly causes the projected laser line, imaged at the seafloor, to blur and distort. Detection of this phenomenon was initially presented in 2013, with significant limitations including false positive results for active venting. Advancements to the detection algorithm presented in this work include intensity normalization algorithms and the implementation of a support vector machine classification algorithm. Results showing clear differentiation between areas of plain seafloor, bacteria or biology, and active venting are presented for multiple hydrothermal vent fields. Survey altitude and direction of travel affect laser data gathered over active vent sites. To determine the implications of these survey parameters, data collected over a single hydrothermal vent at three altitudes, with four headings per altitude, are analyzed. Changing survey geometry impacts the resolution and intensity of the laser line images; therefore, normalization and processing considerations are presented to maintain signal quality. The spatial distribution of the detected density anomaly is also discussed, as it is impacted by survey range and vehicle heading. While surveying hypersaline brine pools, the observed acoustic responses from the 1350 kHz high-frequency multibeam sonar system indicate sensitivity to changes in acoustic impedance and therefore the density of a fluid. Internal density stratification was detected acoustically, appearing as multiple
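    The support vector machine classification step can be sketched as follows (a hypothetical illustration: the two "blur" features and their values are assumptions, not the features extracted in this work):

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical per-frame features from structured-light laser images:
# (line-intensity variance, line-width spread). Active venting blurs
# and distorts the projected line, inflating both features.
rng = np.random.default_rng(0)
seafloor = rng.normal([0.2, 0.1], 0.05, size=(100, 2))
venting = rng.normal([0.8, 0.6], 0.05, size=(100, 2))
X = np.vstack([seafloor, venting])
y = np.array([0] * 100 + [1] * 100)     # 0 = plain seafloor, 1 = venting

clf = SVC(kernel="rbf").fit(X, y)
pred_vent = clf.predict([[0.75, 0.55]])[0]   # blurred line -> venting
pred_floor = clf.predict([[0.20, 0.10]])[0]  # crisp line -> seafloor
```

A third class (bacteria/biology) would be handled the same way; SVC supports multi-class problems via one-vs-one voting.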

  12. Anomaly Detection in Test Equipment via Sliding Mode Observers

    NASA Technical Reports Server (NTRS)

    Solano, Wanda M.; Drakunov, Sergey V.

    2012-01-01

    Nonlinear observers were originally developed based on the ideas of variable structure control, and for the purpose of detecting disturbances in complex systems. In this anomaly detection application, these observers were designed for estimating the distributed state of fluid flow in a pipe described by a class of advection equations. The observer algorithm uses collected data in a piping system to estimate the distributed system state (pressure and velocity along a pipe containing liquid gas propellant flow) using only boundary measurements. These estimates are then used to further estimate and localize possible anomalies such as leaks or foreign objects, and instrumentation metering problems such as incorrect flow meter orifice plate size. The observer algorithm has the following parts: a mathematical model of the fluid flow, observer control algorithm, and an anomaly identification algorithm. The main functional operation of the algorithm is in creating the sliding mode in the observer system implemented as software. Once the sliding mode starts in the system, the equivalent value of the discontinuous function in sliding mode can be obtained by filtering out the high-frequency chattering component. In control theory, "observers" are dynamic algorithms for the online estimation of the current state of a dynamic system by measurements of an output of the system. Classical linear observers can provide optimal estimates of a system state in case of uncertainty modeled by white noise. For nonlinear cases, the theory of nonlinear observers has been developed and its success is mainly due to the sliding mode approach. Using the mathematical theory of variable structure systems with sliding modes, the observer algorithm is designed in such a way that it steers the output of the model to the output of the system obtained via a variety of sensors, in spite of possible mismatches between the assumed model and actual system. The unique properties of sliding mode control
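    A scalar toy example illustrates the core mechanism described: the observer steers its output to the measurement with a discontinuous injection, and once sliding begins, low-pass filtering the switching term recovers its equivalent value, here an unknown disturbance (a first-order sketch, not the distributed pipe-flow model of the article):

```python
import numpy as np

# First-order plant with an unknown constant disturbance d (the "anomaly"):
#   x' = -a*x + d,   measured output y = x.
# Sliding mode observer:  xh' = -a*xh + L*sign(y - xh),  with L > |d|.
# In sliding mode (xh tracks y), the low-pass-filtered switching term
# L*sign(y - xh) equals the disturbance, so filtering estimates d.
a, d, L = 1.0, 0.5, 2.0
dt, T = 1e-3, 5.0
tau = 0.05                                   # filter time constant
x, xh, d_est = 0.0, 0.2, 0.0
for _ in range(int(T / dt)):
    u = L * np.sign(x - xh)                  # discontinuous injection
    x += dt * (-a * x + d)                   # plant (Euler step)
    xh += dt * (-a * xh + u)                 # observer
    d_est += dt / tau * (u - d_est)          # low-pass: equivalent control
# d_est converges near the true disturbance d = 0.5
```

The chattering of the sign term is filtered out, leaving the smooth equivalent value; in the article the same idea localizes leaks and metering faults along a pipe.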

  13. System for Anomaly and Failure Detection (SAFD) system development

    NASA Astrophysics Data System (ADS)

    Oreilly, D.

    1992-07-01

    This task specified developing the hardware and software necessary to implement the System for Anomaly and Failure Detection (SAFD) algorithm, developed under Technology Test Bed (TTB) Task 21, on the TTB engine stand. This effort involved building two units; one unit to be installed in the Block II Space Shuttle Main Engine (SSME) Hardware Simulation Lab (HSL) at Marshall Space Flight Center (MSFC), and one unit to be installed at the TTB engine stand. Rocketdyne personnel from the HSL performed the task. The SAFD algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failure as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions.

  14. Log Summarization and Anomaly Detection for TroubleshootingDistributed Systems

    SciTech Connect

    Gunter, Dan; Tierney, Brian L.; Brown, Aaron; Swany, Martin; Bresnahan, John; Schopf, Jennifer M.

    2007-08-01

    Today's system monitoring tools are capable of detecting system failures such as host failures, OS errors, and network partitions in near-real time. Unfortunately, the same cannot yet be said of the end-to-end distributed software stack. Any given action, for example, reliably transferring a directory of files, can involve a wide range of complex and interrelated actions across multiple pieces of software: checking user certificates and permissions, getting details for all files, performing third-party transfers, understanding re-try policy decisions, etc. We present an infrastructure for troubleshooting complex middleware, a general purpose technique for configurable log summarization, and an anomaly detection technique that works in near-real time on running Grid middleware. We present results gathered using this infrastructure from instrumented Grid middleware and applications running on the Emulab testbed. From these results, we analyze the effectiveness of several algorithms at accurately detecting a variety of performance anomalies.

  15. Identification and detection of anomalies through SSME data analysis

    NASA Technical Reports Server (NTRS)

    Pereira, Lisa; Ali, Moonis

    1990-01-01

    The goal of the ongoing research described in this paper is to analyze real-time ground test data in order to identify patterns associated with the anomalous engine behavior, and on the basis of this analysis to develop an expert system which detects anomalous engine behavior in the early stages of fault development. A prototype of the expert system has been developed and tested on the high frequency data of two SSME tests, namely Test #901-0516 and Test #904-044. The comparison of our results with the post-test analyses indicates that the expert system detected the presence of the anomalies in a significantly early stage of fault development.

  16. System for Anomaly and Failure Detection (SAFD) system development

    NASA Technical Reports Server (NTRS)

    Oreilly, D.

    1992-01-01

    This task specified developing the hardware and software necessary to implement the System for Anomaly and Failure Detection (SAFD) algorithm, developed under Technology Test Bed (TTB) Task 21, on the TTB engine stand. This effort involved building two units; one unit to be installed in the Block II Space Shuttle Main Engine (SSME) Hardware Simulation Lab (HSL) at Marshall Space Flight Center (MSFC), and one unit to be installed at the TTB engine stand. Rocketdyne personnel from the HSL performed the task. The SAFD algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failure as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions.

  17. Anomaly detection of microstructural defects in continuous fiber reinforced composites

    NASA Astrophysics Data System (ADS)

    Bricker, Stephen; Simmons, J. P.; Przybyla, Craig; Hardie, Russell

    2015-03-01

    Ceramic matrix composites (CMC) with continuous fiber reinforcements have the potential to enable the next generation of high speed hypersonic vehicles and/or significant improvements in gas turbine engine performance due to their exhibited toughness when subjected to high mechanical loads at extreme temperatures (2200F+). Reinforced fiber composites (RFC) provide increased fracture toughness, crack growth resistance, and strength, though little is known about how stochastic variation and imperfections in the material affect material properties. In this work, tools are developed for quantifying anomalies within the microstructure at several scales. The detection and characterization of anomalous microstructure is a critical step in linking production techniques to properties, as well as in accurate material simulation and property prediction for the integrated computational materials engineering (ICME) of RFC based components. It is desired to find statistical outliers for any number of material characteristics such as fibers, fiber coatings, and pores. Here, fiber orientation, or `velocity', and `velocity' gradient are developed and examined for anomalous behavior. Categorizing anomalous behavior in the CMC is approached by multivariate Gaussian mixture modeling. A Gaussian mixture is employed to estimate the probability density function (PDF) of the features in question, and anomalies are classified by their likelihood of belonging to the statistical normal behavior for that feature.
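    The Gaussian-mixture anomaly classification can be sketched with scikit-learn (illustrative: the two `velocity'-like features, their values, and the 1st-percentile threshold are assumptions, not the study's data):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical per-fiber features, e.g. local orientation ("velocity")
# and its gradient. Normal microstructure is modeled by a Gaussian
# mixture; low-likelihood samples are flagged as anomalous.
rng = np.random.default_rng(0)
normal = np.vstack([rng.normal([0.0, 0.0], 0.1, size=(300, 2)),
                    rng.normal([1.0, 0.5], 0.1, size=(300, 2))])

gmm = GaussianMixture(n_components=2, random_state=0).fit(normal)

# Threshold at, e.g., the 1st percentile of training log-likelihoods.
threshold = np.percentile(gmm.score_samples(normal), 1)
test = np.array([[0.05, -0.02],    # typical fiber
                 [3.0, 3.0]])      # statistical outlier
is_anomaly = gmm.score_samples(test) < threshold
```

The number of mixture components and the percentile threshold are modeling choices; in practice they would be tuned against known-good microstructure.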

  18. Anomaly detection applied to a materials control and accounting database

    SciTech Connect

    Whiteson, R.; Spanks, L.; Yarbro, T.

    1995-09-01

    An important component of the national mission of reducing the nuclear danger includes accurate recording of the processing and transportation of nuclear materials. Nuclear material storage facilities, nuclear chemical processing plants, and nuclear fuel fabrication facilities collect and store large amounts of data describing transactions that involve nuclear materials. To maintain confidence in the integrity of these data, it is essential to identify anomalies in the databases. Anomalous data could indicate error, theft, or diversion of material. Yet, because of the complex and diverse nature of the data, analysis and evaluation are extremely tedious. This paper describes the authors' work on the development of analysis tools to automate the anomaly detection process for the Material Accountability and Safeguards System (MASS) that tracks and records the activities associated with accountable quantities of nuclear material at Los Alamos National Laboratory. Using existing guidelines that describe valid transactions, the authors have created an expert system that identifies transactions that do not conform to the guidelines. Thus, this expert system can be used to focus the attention of the expert or inspector directly on significant phenomena.

  19. Sequential Model-Based Detection in a Shallow Ocean Acoustic Environment

    SciTech Connect

    Candy, J V

    2002-03-26

    A model-based detection scheme is developed to passively monitor an ocean acoustic environment along with its associated variations. The technique employs an embedded model-based processor and a reference model in a sequential likelihood detection scheme. The monitor is therefore called a sequential reference detector. The underlying theory for the design is developed and discussed in detail.
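    A sequential likelihood detection scheme of the general kind described can be sketched as Wald's sequential probability ratio test between two simple Gaussian hypotheses (illustrative only; the paper's processor embeds a model-based reference, not i.i.d. Gaussians):

```python
import numpy as np
from scipy.stats import norm

# Sequential (Wald) likelihood-ratio test on a measurement stream:
# H0 ~ N(0, 1) (ambient reference) vs H1 ~ N(1, 1) (target present).
alpha, beta = 0.01, 0.01                        # false alarm / miss targets
A = np.log((1 - beta) / alpha)                  # upper (detect) threshold
B = np.log(beta / (1 - alpha))                  # lower (dismiss) threshold

def sprt(samples):
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += norm.logpdf(x, 1, 1) - norm.logpdf(x, 0, 1)
        if llr >= A:
            return "H1", n                      # declare detection
        if llr <= B:
            return "H0", n                      # declare ambient only
    return "undecided", len(samples)

# A stream sitting at the H1 mean: each sample adds exactly 0.5 to the
# log-likelihood ratio, so the test decides "H1" after 10 samples.
decision, n_used = sprt(np.full(50, 1.0))
```

The appeal of the sequential form is exactly the monitoring use case here: it keeps listening and commits to a decision as soon as the evidence crosses a threshold, rather than after a fixed block of data.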

  20. A High-Order Statistical Tensor Based Algorithm for Anomaly Detection in Hyperspectral Imagery

    PubMed Central

    Geng, Xiurui; Sun, Kang; Ji, Luyan; Zhao, Yongchao

    2014-01-01

    Recently, high-order statistics have received more and more interest in the field of hyperspectral anomaly detection. However, most of the existing high-order statistics based anomaly detection methods require stepwise iterations since they are direct applications of blind source separation. Moreover, these methods usually produce multiple detection maps rather than a single anomaly distribution image. In this study, we exploit the concept of the coskewness tensor and propose a new anomaly detection method, called COSD (coskewness detector). COSD does not need iteration and can produce a single detection map. The experiments based on both simulated and real hyperspectral data sets verify the effectiveness of our algorithm. PMID:25366706
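    The coskewness tensor at the heart of COSD can be computed directly for standardized data (a sketch of the statistic itself, not of the COSD detector):

```python
import numpy as np

def coskewness_tensor(X):
    """Third-order (coskewness) tensor of data X with shape (n, p):
    S[i, j, k] = E[z_i * z_j * z_k] on standardized variables z."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    return np.einsum("ni,nj,nk->ijk", Z, Z, Z) / len(Z)

# Symmetric data has (near-)zero coskewness; a skewed band does not.
rng = np.random.default_rng(0)
sym = rng.normal(size=(5000, 3))           # three symmetric "bands"
skewed = sym.copy()
skewed[:, 0] = rng.exponential(size=5000)  # one right-skewed band

S_sym = coskewness_tensor(sym)
S_skw = coskewness_tensor(skewed)
# S_skw[0, 0, 0] approximates the exponential's skewness (about 2).
```

Projecting pixels against this tensor highlights samples that drive the third-order statistics, which is the sense in which rare anomalous spectra stand out.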

  1. Near-Real Time Anomaly Detection for Scientific Sensor Data

    NASA Astrophysics Data System (ADS)

    Gallegos, I.; Gates, A.; Tweedie, C. E.; Goswami, S.; Jaimes, A.; Gamon, J. A.

    2011-12-01

    Verification (SDVe) prototype tool identified anomalies detected by the expert-specified data properties over the EC data. Scientists using DaProS and SDVe were able to detect environmental variability, instrument malfunctioning, and seasonal and diurnal variability in EC and hyperspectral datasets. The results of the experiment also yielded insights regarding the practices followed by scientists to specify data properties, and it exposed new data properties challenges and a potential method for capturing data quality confidence levels.

  2. Apparatus for detecting a magnetic anomaly contiguous to remote location by SQUID gradiometer and magnetometer systems

    SciTech Connect

    Overton, W.C. Jr.; Steyert, W.A. Jr.

    1984-03-13

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  2. Apparatus for detecting a magnetic anomaly contiguous to remote location by SQUID gradiometer and magnetometer systems

    DOEpatents

    Overton, Jr., William C.; Steyert, Jr., William A.

    1984-01-01

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  4. FRaC: a feature-modeling approach for semi-supervised and unsupervised anomaly detection

    PubMed Central

    Brodley, Carla; Slonim, Donna

    2011-01-01

    Anomaly detection involves identifying rare data instances (anomalies) that come from a different class or distribution than the majority (which are simply called “normal” instances). Given a training set of only normal data, the semi-supervised anomaly detection task is to identify anomalies in the future. Good solutions to this task have applications in fraud and intrusion detection. The unsupervised anomaly detection task is different: Given unlabeled, mostly-normal data, identify the anomalies among them. Many real-world machine learning tasks, including many fraud and intrusion detection tasks, are unsupervised because it is impractical (or impossible) to verify all of the training data. We recently presented FRaC, a new approach for semi-supervised anomaly detection. FRaC is based on using normal instances to build an ensemble of feature models, and then identifying instances that disagree with those models as anomalous. In this paper, we investigate the behavior of FRaC experimentally and explain why FRaC is so successful. We also show that FRaC is a superior approach for the unsupervised as well as the semi-supervised anomaly detection task, compared to well-known state-of-the-art anomaly detection methods, LOF and one-class support vector machines, and to an existing feature-modeling approach. PMID:22639542

  5. Sensor Anomaly Detection in Wireless Sensor Networks for Healthcare

    PubMed Central

    Haque, Shah Ahsanul; Rahman, Mustafizur; Aziz, Syed Mahfuzul

    2015-01-01

    Wireless Sensor Networks (WSN) are vulnerable to various sensor faults and faulty measurements. This vulnerability hinders efficient and timely response in various WSN applications, such as healthcare. For example, faulty measurements can create false alarms which may require unnecessary intervention from healthcare personnel. Therefore, an approach to differentiate between real medical conditions and false alarms will improve remote patient monitoring systems and the quality of healthcare services afforded by WSN. In this paper, a novel approach is proposed to detect sensor anomalies by analyzing collected physiological data from medical sensors. The objective of this method is to effectively distinguish false alarms from true alarms. It predicts a sensor value from historic values and compares it with the actual sensed value for a particular instance. The difference is compared against a threshold value, which is dynamically adjusted, to ascertain whether the sensor value is anomalous. The proposed approach has been applied to real healthcare datasets and compared with existing approaches. Experimental results demonstrate the effectiveness of the proposed system, providing high Detection Rate (DR) and low False Positive Rate (FPR). PMID:25884786
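
    The predict-and-compare scheme described above (predict a sensor value from history, compare the prediction error against a dynamically adjusted threshold) can be sketched as follows. The moving-average predictor, window size, and multiplier k are illustrative assumptions, not the paper's exact method.

```python
def detect_anomalies(series, window=5, k=3.0):
    """Flag readings whose error against a moving-average prediction exceeds
    k times the mean of past prediction errors (a dynamic threshold)."""
    clean = list(series[:window])          # history with anomalies replaced
    anomalies, errors = [], []
    for i in range(window, len(series)):
        predicted = sum(clean[-window:]) / window
        error = abs(series[i] - predicted)
        threshold = k * (sum(errors) / len(errors)) if errors else float("inf")
        if error > threshold:
            anomalies.append(i)
            clean.append(predicted)        # keep the spike out of the baseline
        else:
            errors.append(error)
            clean.append(series[i])
    return anomalies

heart_rate = [72, 71, 73, 72, 74, 73, 72, 140, 73, 72, 74]   # 140: faulty reading
print(detect_anomalies(heart_rate))   # → [7]
```

    Replacing a flagged reading with its prediction, rather than the raw value, keeps a single faulty measurement from contaminating the baseline and triggering a run of spurious alarms on the readings that follow.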

  6. Online anomaly detection in crowd scenes via structure analysis.

    PubMed

    Yuan, Yuan; Fang, Jianwu; Wang, Qi

    2015-03-01

    Abnormal behavior detection in crowd scenes remains a challenge in the field of computer vision. To tackle this problem, this paper starts from a novel structural modeling of crowd behavior. We first propose an informative structural context descriptor (SCD) for describing a crowd individual, which introduces the potential energy function of particle interaction forces from solid-state physics to conduct visual contextual cueing. To compute the crowd SCD variation effectively, we then design a robust multi-object tracker to associate the targets in different frames, which employs the incremental analytical ability of the 3-D discrete cosine transform (DCT). By analyzing the spatial-temporal SCD variation of the crowd online, the abnormality is finally localized. Our contribution mainly lies in three aspects: 1) the new exploration of abnormality detection through structure modeling, where the motion difference between individuals is computed by a novel selective histogram of optical flow, enabling the proposed method to deal with more kinds of anomalies; 2) the SCD description, which can effectively represent the relationship among individuals; and 3) the 3-D DCT multi-object tracker, which can robustly associate a limited number of (instead of all) targets, making tracking analysis feasible in high-density crowd situations. Experimental results on several publicly available crowd video datasets verify the effectiveness of the proposed method.

  7. Traffic Pattern Detection Using the Hough Transformation for Anomaly Detection to Improve Maritime Domain Awareness

    DTIC Science & Technology

    2013-12-01


  8. Efficient Mining and Detection of Sequential Intrusion Patterns for Network Intrusion Detection Systems

    NASA Astrophysics Data System (ADS)

    Shyu, Mei-Ling; Huang, Zifang; Luo, Hongli

    In recent years, pervasive computing infrastructures have greatly improved the interaction between humans and systems. As we put more reliance on these computing infrastructures, we also face threats of network intrusion and other new forms of undesirable IT-based activities. Hence, network security has become an extremely important issue, closely connected with homeland security, business transactions, and people's daily lives. Accurate and efficient intrusion detection technologies are required to safeguard network systems and the critical information transmitted through them. In this chapter, a novel network intrusion detection framework for mining and detecting sequential intrusion patterns is proposed. The proposed framework consists of a Collateral Representative Subspace Projection Modeling (C-RSPM) component for supervised classification, and an inter-transactional association rule mining method based on Layer Divided Modeling (LDM) for temporal pattern analysis. Experiments on the KDD99 data set and the traffic data set generated by a private LAN testbed show promising results, with high detection rates, low processing time, and low false alarm rates in mining and detecting sequential intrusion patterns.

  9. Detection of Lexical and Morphological Anomalies by Children with and without Language Impairment

    ERIC Educational Resources Information Center

    Pawlowska, Monika; Robinson, Sarah; Seddoh, Amebu

    2014-01-01

    Purpose: The abilities of 5-year-old children with and without language impairment (LI) to detect anomalies involving lexical items and grammatical morphemes in stories were compared. The influence of sentence versus discourse context on lexical anomaly detection rates was explored. Method: The participants were read 3 story scripts and asked to…

  10. SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) 2013

    SciTech Connect

    Gordon Rueff; Lyle Roybal; Denis Vollmer

    2013-01-01

    There is a significant need to protect the nation’s energy infrastructures from malicious actors using cyber methods. Supervisory, Control, and Data Acquisition (SCADA) systems may be vulnerable due to the insufficient security implemented during the design and deployment of these control systems. This is particularly true in older legacy SCADA systems that are still commonly in use. The purpose of INL’s research on the SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) project was to determine if and how data compression techniques could be used to identify and protect SCADA systems from cyber attacks. Initially, the concept was centered on how to train a compression algorithm to recognize normal control system traffic versus hostile network traffic. Because large portions of the TCP/IP message traffic (called packets) are repetitive, the concept of using compression techniques to differentiate “non-normal” traffic was proposed. In this manner, malicious SCADA traffic could be identified at the packet level prior to completing its payload. Previous research has shown that SCADA network traffic has traits desirable for compression analysis. This work investigated three different approaches to identifying malicious SCADA network traffic using compression techniques. The preliminary analyses and results presented herein are clearly able to differentiate normal from malicious network traffic at the packet level at a very high confidence level for the conditions tested. Additionally, the master dictionary approach used in this research appears to initially provide a meaningful way to categorize and compare packets within a communication channel.
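
    The core idea, that normal traffic compresses well against a dictionary of previously seen traffic while anomalous traffic does not, can be sketched with zlib's preset-dictionary support. The payloads below are hypothetical stand-ins, not real SCADA traffic, and the dictionary construction is an assumption rather than the project's master-dictionary method.

```python
import zlib

def compressed_size(packet: bytes, dictionary: bytes) -> int:
    """Deflate-compressed size of a packet, with the compressor primed by a
    dictionary of previously seen normal traffic."""
    c = zlib.compressobj(level=9, zdict=dictionary)
    return len(c.compress(packet) + c.flush())

# Hypothetical "master dictionary" built from normal SCADA-like polling traffic.
dictionary = b"READ coil 01;READ coil 02;READ reg 10;WRITE reg 10 0042;" * 8

ok_packet = b"READ coil 01;READ reg 10;"                  # resembles past traffic
odd_packet = b"\x90\x90\x90\xcc\xeb\x1f\x8b arbitrary shellcode-ish bytes"

r_ok = compressed_size(ok_packet, dictionary) / len(ok_packet)
r_odd = compressed_size(odd_packet, dictionary) / len(odd_packet)
print(round(r_ok, 2), round(r_odd, 2))
assert r_ok < r_odd   # normal traffic compresses far better against the dictionary
```

    A per-packet compression ratio like this can be thresholded online, flagging packets that fail to compress against the channel's history without ever inspecting the payload semantics.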

  11. Change and Anomaly Detection in Real-Time GPS Data

    NASA Astrophysics Data System (ADS)

    Granat, R.; Pierce, M.; Gao, X.; Bock, Y.

    2008-12-01

    The California Real-Time Network (CRTN) is currently generating real-time GPS position data at a rate of 1-2 Hz at over 80 locations. The CRTN data presents the possibility of studying dynamical solid earth processes in a way that complements existing seismic networks. To realize this possibility we have developed a prototype system for detecting changes and anomalies in the real-time data. Through this system, we can correlate changes in multiple stations in order to detect signals with geographical extent. Our approach involves developing a statistical model for each GPS station in the network, and then using those models to segment the time series into a number of discrete states described by the model. We use a hidden Markov model (HMM) to describe the behavior of each station; fitting the model to the data requires neither labeled training examples nor a priori information about the system. As such, HMMs are well suited to this problem domain, in which the data remains largely uncharacterized. There are two main components to our approach. The first is the model fitting algorithm, regularized deterministic annealing expectation-maximization (RDAEM), which provides robust, high-quality results. The second is a web service infrastructure that connects the data to the statistical modeling analysis and allows us to easily present the results of that analysis through a web portal interface. This web service approach facilitates the automatic updating of station models to keep pace with dynamical changes in the data. Our web portal interface is critical to the process of interpreting the data. A Google Maps interface allows users to visually interpret state changes not only on individual stations but across the entire network. Users can drill down from the map interface to inspect detailed results for individual stations, download the time series data, and inspect fitted models.
Alternatively, users can use the web portal to look at the evolution of changes on the
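
    The HMM segmentation step described above can be illustrated with a minimal log-space Viterbi decoder for a two-state Gaussian-emission HMM. This is a sketch of state segmentation only, with hand-picked parameters; it is not the RDAEM fitting algorithm, which learns the model from data.

```python
import math

def viterbi_gaussian(obs, means, var, trans, init):
    """Most likely state path for an HMM with Gaussian emissions (log-space Viterbi)."""
    n_states = len(means)
    def log_emit(x, s):
        return -0.5 * math.log(2 * math.pi * var) - (x - means[s]) ** 2 / (2 * var)
    V = [[math.log(init[s]) + log_emit(obs[0], s) for s in range(n_states)]]
    back = []
    for x in obs[1:]:
        row, ptr = [], []
        for s in range(n_states):
            prev = max(range(n_states), key=lambda p: V[-1][p] + math.log(trans[p][s]))
            row.append(V[-1][prev] + math.log(trans[prev][s]) + log_emit(x, s))
            ptr.append(prev)
        V.append(row)
        back.append(ptr)
    state = max(range(n_states), key=lambda s: V[-1][s])
    path = [state]
    for ptr in reversed(back):      # backtrack through the stored pointers
        state = ptr[state]
        path.append(state)
    return path[::-1]

# Toy displacement series (mm): quiet around 0, then a step offset near 5.
series = [0.1, -0.2, 0.0, 0.3, -0.1, 4.8, 5.2, 5.0, 4.9, 5.1]
states = viterbi_gaussian(series, means=[0.0, 5.0], var=1.0,
                          trans=[[0.95, 0.05], [0.05, 0.95]], init=[0.5, 0.5])
print(states)   # → [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
```

    The sticky transition matrix (0.95 on the diagonal) is what keeps single noisy samples from flipping the state, so only a sustained offset registers as a change.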

  12. Sequential detection of a weak target in a hostile ocean environment

    SciTech Connect

    Candy, J V; Sullivan, E J

    2005-03-14

    When the underlying physical phenomenology (medium, sediment, bottom, etc.) is space-time varying, with corresponding nonstationary statistics characterizing the noise and uncertainties, sequential methods must be applied to capture the underlying processes. Sequential detection and estimation techniques offer distinct advantages over batch methods. A reasonable signal processing approach to this class of problem is to employ adaptive or parametrically adaptive models of the signal and noise to capture these phenomena. In this paper, we develop a sequential approach to solve the signal detection problem in a nonstationary environment.

  13. A Bayesian Hidden Markov Model-based approach for anomaly detection in electronic systems

    NASA Astrophysics Data System (ADS)

    Dorj, E.; Chen, C.; Pecht, M.

    Early detection of anomalies in any system or component prevents impending failures and enhances performance and availability. The complex architecture of electronics, the interdependency of component functionalities, and the miniaturization of most electronic systems make it difficult to detect and analyze anomalous behaviors. A Hidden Markov Model-based classification technique determines unobservable hidden behaviors of complex and remotely inaccessible electronic systems using observable signals. This paper presents a data-driven approach for anomaly detection in electronic systems based on a Bayesian Hidden Markov Model classification technique. The posterior parameters of the Hidden Markov Models are estimated using the conjugate prior method. An application of the developed Bayesian Hidden Markov Model-based anomaly detection approach is presented for detecting anomalous behavior in Insulated Gate Bipolar Transistors using experimental data. The detection results illustrate that the developed anomaly detection approach can help detect anomalous behaviors in electronic systems, which can help prevent system downtime and catastrophic failures.

  14. Resampling approach for anomaly detection in multispectral images

    SciTech Connect

    Theiler, J. P.; Cai, D.

    2003-01-01

    We propose a novel approach for identifying the 'most unusual' samples in a data set, based on a resampling of data attributes. The resampling produces a 'background class' and then binary classification is used to distinguish the original training set from the background. Those in the training set that are most like the background (i.e., most unlike the rest of the training set) are considered anomalous. Although by their nature, anomalies do not permit a positive definition (if I knew what they were, I wouldn't call them anomalies), one can make 'negative definitions' (I can say what does not qualify as an interesting anomaly). By choosing different resampling schemes, one can identify different kinds of anomalies. For multispectral images, anomalous pixels correspond to locations on the ground with unusual spectral signatures or, depending on how feature sets are constructed, unusual spatial textures.
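
    A minimal sketch of the resampling idea: shuffling each attribute independently produces a 'background class' with the same marginals but no inter-attribute structure, and a simple k-nearest-neighbour vote stands in for the binary classifier (the abstract does not prescribe this particular classifier; the data and the planted anomaly are synthetic).

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated two-attribute data, plus one anomaly that breaks the correlation:
# each of its coordinates is plausible alone, only the pairing is unusual.
n = 200
x = rng.normal(size=n)
data = np.column_stack([x, x + 0.1 * rng.normal(size=n)])
data[0] = [2.0, -2.0]

# "Background class": shuffle each attribute independently, keeping the
# marginal distributions but destroying the inter-attribute structure.
background = np.column_stack(
    [rng.permutation(data[:, j]) for j in range(data.shape[1])])

# Score each original sample by the fraction of background points among its
# k nearest neighbours in the pooled set (a stand-in for a trained classifier).
combined = np.vstack([data, background])
labels = np.array([0] * n + [1] * n)          # 1 = background
k = 10
scores = []
for p in data:
    dist = np.linalg.norm(combined - p, axis=1)
    nearest = np.argsort(dist)[1:k + 1]       # skip the point itself
    scores.append(labels[nearest].mean())     # high = background-like = anomalous
print(int(np.argmax(scores)))                 # index of the planted anomaly
```

    Note that a per-attribute outlier test would miss this point entirely; only the resampled background, which fills the full product of the marginals, exposes the broken correlation.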

  15. Remote detection of geobotanical anomalies associated with hydrocarbon microseepage

    NASA Technical Reports Server (NTRS)

    Rock, B. N.

    1985-01-01

    As part of the continuing study of the Lost River, West Virginia NASA/Geosat Test Case Site, an extensive soil gas survey of the site was conducted during the summer of 1983. This soil gas survey has identified an order of magnitude methane, ethane, propane, and butane anomaly that is precisely coincident with the linear maple anomaly reported previously. This and other maple anomalies were previously suggested to be indicative of anaerobic soil conditions associated with hydrocarbon microseepage. In vitro studies support the view that anomalous distributions of native tree species tolerant of anaerobic soil conditions may be useful indicators of methane microseepage in heavily vegetated areas of the United States characterized by deciduous forest cover. Remote sensing systems which allow discrimination and mapping of native tree species and/or species associations will provide the exploration community with a means of identifying vegetation distributional anomalies indicative of microseepage.

  16. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data

    PubMed Central

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied on unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection as well as in the life science and medical domain. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, computational effort, the impact of parameter settings as well as the global/local anomaly detection behavior is outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks. PMID:27093601

  17. Detecting anomalies in CMB maps: a new method

    SciTech Connect

    Neelakanta, Jayanth T.

    2015-10-01

    Ever since WMAP announced its first results, different analyses have shown that there is weak evidence for several large-scale anomalies in the CMB data. While the evidence for each anomaly appears to be weak, the fact that there are multiple seemingly unrelated anomalies makes it difficult to account for them via a single statistical fluke. So, one is led to considering a combination of these anomalies. But, if we "hand-pick" the anomalies (test statistics) to consider, we are making an a posteriori choice. In this article, we propose two statistics that do not suffer from this problem. The statistics are linear and quadratic combinations of the a_{ℓm}'s with random coefficients, and they test the null hypothesis that the a_{ℓm}'s are independent, normally-distributed, zero-mean random variables with an m-independent variance. The motivation for considering multiple modes is this: because most physical models that lead to large-scale anomalies result in coupling multiple ℓ and m modes, the "coherence" of this coupling should get enhanced if a combination of different modes is considered. In this sense, the statistics are much more generic than those that have been hitherto considered in the literature. Using fiducial data, we demonstrate that the method works and discuss how it can be used with actual CMB data to make quite general statements about the incompatibility of the data with the null hypothesis.

  18. Anomalies in the detection of change: When changes in sample size are mistaken for changes in proportions.

    PubMed

    Fiedler, Klaus; Kareev, Yaakov; Avrahami, Judith; Beier, Susanne; Kutzner, Florian; Hütter, Mandy

    2016-01-01

    Detecting changes in performance, sales, markets, risks, social relations, or public opinions constitutes an important adaptive function. In a sequential paradigm devised to investigate detection of change, every trial provides a sample of binary outcomes (e.g., correct vs. incorrect student responses). Participants have to decide whether the proportion of a focal feature (e.g., correct responses) in the population from which the sample is drawn has decreased, remained constant, or increased. Strong and persistent anomalies in change detection arise when changes in proportional quantities vary orthogonally to changes in absolute sample size. Proportional increases are readily detected and nonchanges are erroneously perceived as increases when absolute sample size increases. Conversely, decreasing sample size facilitates the correct detection of proportional decreases and the erroneous perception of nonchanges as decreases. These anomalies are, however, confined to experienced samples of elementary raw events from which proportions have to be inferred inductively. They disappear when sample proportions are described as percentages in a normalized probability format. To explain these challenging findings, it is essential to understand the inductive-learning constraints imposed on decisions from experience.

  19. A robust background regression based score estimation algorithm for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhao, Rui; Du, Bo; Zhang, Liangpei; Zhang, Lefei

    2016-12-01

    Anomaly detection has become a hot topic in the hyperspectral image analysis and processing fields in recent years. The most important issue for hyperspectral anomaly detection is the background estimation and suppression. Unreasonable or non-robust background estimation usually leads to unsatisfactory anomaly detection results. Furthermore, the inherent nonlinearity of hyperspectral images may cover up the intrinsic data structure in the anomaly detection. In order to implement robust background estimation, as well as to explore the intrinsic data structure of the hyperspectral image, we propose a robust background regression based score estimation algorithm (RBRSE) for hyperspectral anomaly detection. The Robust Background Regression (RBR) is actually a label assignment procedure which segments the hyperspectral data into a robust background dataset and a potential anomaly dataset with an intersection boundary. In the RBR, a kernel expansion technique, which explores the nonlinear structure of the hyperspectral data in a reproducing kernel Hilbert space, is utilized to formulate the data as a density feature representation. A minimum squared loss relationship is constructed between the data density feature and the corresponding assigned labels of the hyperspectral data, to formulate the foundation of the regression. Furthermore, a manifold regularization term which explores the manifold smoothness of the hyperspectral data, and a maximization term of the robust background average density, which suppresses the bias caused by the potential anomalies, are jointly appended in the RBR procedure. After this, a paired-dataset based k-nn score estimation method is undertaken on the robust background and potential anomaly datasets, to implement the detection output. 
The experimental results show that RBRSE achieves better ROC curves, AUC values, and background-anomaly separation than some other state-of-the-art anomaly detection methods, and is easy to implement.

  20. A Multi-Agent Framework for Anomalies Detection on Distributed Firewalls Using Data Mining Techniques

    NASA Astrophysics Data System (ADS)

    Karoui, Kamel; Ftima, Fakher Ben; Ghezala, Henda Ben

    The integration of agents and data mining has emerged as a promising approach to distributed problem solving. Applying this integration to distributed firewalls facilitates the anomaly detection process. In this chapter, we present a set of algorithms and mining techniques to analyse, manage and detect anomalies in distributed firewalls' policy rules using the multi-agent approach. First, for each firewall, a static agent executes a set of data mining techniques to generate a new set of efficient firewall policy rules. Then, a mobile agent exploits these sets of optimized rules to detect possible anomalies on a specific firewall (intra-firewall anomalies) or between firewalls (inter-firewall anomalies). An experimental case study is presented to demonstrate the usefulness of our approach.

  1. Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data

    NASA Technical Reports Server (NTRS)

    Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney

    2012-01-01

    This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account fixed effects: flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, the multi-terabyte FOQA data set with a half-million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and failures, biases, and trends in flight actuators. These anomalies were missed by the existing airline monitoring of FOQA data exceedances.

  2. Reliability of Physical Systems: Detection of Malicious Subcircuits (Trojan Circuits) in Sequential Circuits

    NASA Astrophysics Data System (ADS)

    Matrosova, A. Yu.; Kirienko, I. E.; Tomkov, V. V.; Miryutov, A. A.

    2016-12-01

    Reliability of physical systems is provided by the reliability of their parts, including logical ones. The insertion of malicious subcircuits that can destroy a logical circuit or cause leakage of confidential information from a system necessitates the detection of such subcircuits, followed by their masking if possible. We suggest a method of finding the set of sequential circuit nodes in which Trojan Circuits can be inserted. The method is based on random estimations of the controllability and observability of combinational nodes, calculated using a description of the sequential circuit working area and evidence of the existence of a transfer sequence for the proper set of internal states without finding the sequence itself. The method allows reducing computation by using operations on Reduced Ordered Binary Decision Diagrams (ROBDDs) that can depend only on the state variables of the circuit. The approach, unlike traditional ones, does not require preliminary sequential circuit simulation but can use its results. It can be used when malicious circuits cannot be detected during sequential circuit verification.

  3. Detecting Anomaly Regions in Satellite Image Time Series Based on Seasonal Autocorrelation Analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Anomaly regions in satellite images can reflect unexpected changes of land cover caused by flood, fire, landslide, etc. Detecting anomaly regions in satellite image time series is important for studying the dynamic processes of land cover changes as well as for disaster monitoring. Although several methods have been developed to detect land cover changes using satellite image time series, they are generally designed for detecting inter-annual or abrupt land cover changes, and do not focus on detecting spatial-temporal changes in continuous images. In order to identify the spatial-temporal dynamic processes of unexpected land cover changes, this study proposes a method for detecting anomaly regions in each image of a satellite image time series based on seasonal autocorrelation analysis. The method was validated with a case study detecting the spatial-temporal process of a severe flood using Terra/MODIS image time series. Experiments demonstrated the advantages of the method: (1) it can effectively detect anomaly regions in each image of a satellite image time series, showing the spatially and temporally varying process of anomaly regions; (2) it can flexibly meet detection-accuracy requirements (e.g., z-value or significance level), with overall accuracy up to 89% and precision above 90%; and (3) it does not need time series smoothing and can detect anomaly regions in noisy satellite images with high reliability.
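
    The seasonal-baseline idea can be sketched as a per-phase z-score test: each observation is compared against the same season in other cycles, using a leave-one-out baseline so an anomaly does not mask itself. The toy series and threshold are illustrative assumptions; the paper's seasonal autocorrelation analysis is more elaborate.

```python
import statistics

def seasonal_anomalies(series, period, z_thresh=3.0):
    """Flag values that deviate strongly from the mean of the same season
    in the other cycles (leave-one-out seasonal baseline)."""
    anomalies = []
    for phase in range(period):
        vals = series[phase::period]
        for cycle, v in enumerate(vals):
            others = vals[:cycle] + vals[cycle + 1:]   # exclude the value itself
            mu = statistics.mean(others)
            sd = statistics.stdev(others) or 1e-9      # guard a zero spread
            if abs(v - mu) / sd > z_thresh:
                anomalies.append(cycle * period + phase)
    return sorted(anomalies)

# Toy NDVI-like series: 3 "years" of 4 "seasons", with a flood-like collapse.
ndvi = [0.2, 0.5, 0.8, 0.4,
        0.2, 0.5, 0.1, 0.4,   # index 6: growing-season value collapses
        0.2, 0.5, 0.8, 0.4]
print(seasonal_anomalies(ndvi, period=4))   # → [6]
```

    Comparing each pixel only against its own seasonal phase is what lets the detector separate a genuine anomaly from ordinary seasonal and diurnal variation.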

  4. Inverse sequential detection of parameter changes in developing time series

    NASA Technical Reports Server (NTRS)

    Radok, Uwe; Brown, Timothy J.

    1992-01-01

    Progressive values of two probabilities are obtained for parameter estimates derived from an existing set of values and from the same set enlarged by one or more new values, respectively. One probability is that of erroneously preferring the second of these estimates for the existing data ('type 1 error'), while the second probability is that of erroneously accepting their estimates for the enlarged set ('type 2 error'). A more stable combined 'no change' probability, which always falls between 0.5 and 0, is derived from the (logarithmic) width of the uncertainty region of an equivalent 'inverted' sequential probability ratio test (SPRT, Wald 1945), in which the error probabilities are calculated rather than prescribed. A parameter change is indicated when the compound probability undergoes a progressive decrease. The test is explicitly formulated and exemplified for Gaussian samples.

  5. Classification of SD-OCT volumes for DME detection: an anomaly detection approach

    NASA Astrophysics Data System (ADS)

    Sankar, S.; Sidibé, D.; Cheung, Y.; Wong, T. Y.; Lamoureux, E.; Milea, D.; Meriaudeau, F.

    2016-03-01

    Diabetic Macular Edema (DME) is the leading cause of blindness amongst diabetic patients worldwide. It is characterized by accumulation of water molecules in the macula leading to swelling. Early detection of the disease helps prevent further loss of vision. Naturally, automated detection of DME from Optical Coherence Tomography (OCT) volumes plays a key role. To this end, a pipeline for detecting DME diseases in OCT volumes is proposed in this paper. The method is based on anomaly detection using Gaussian Mixture Model (GMM). It starts with pre-processing the B-scans by resizing, flattening, filtering and extracting features from them. Both intensity and Local Binary Pattern (LBP) features are considered. The dimensionality of the extracted features is reduced using PCA. As the last stage, a GMM is fitted with features from normal volumes. During testing, features extracted from the test volume are evaluated with the fitted model for anomaly and classification is made based on the number of B-scans detected as outliers. The proposed method is tested on two OCT datasets achieving a sensitivity and a specificity of 80% and 93% on the first dataset, and 100% and 80% on the second one. Moreover, experiments show that the proposed method achieves better classification performances than other recently published works.
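
    The anomaly detection stage, fitting a GMM to features from normal volumes and flagging low-likelihood test features as outliers, can be sketched as follows. The EM implementation, diagonal covariances, synthetic stand-in features, and margin-based threshold are all illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

def fit_gmm(X, k=2, iters=50, seed=0):
    """EM for a diagonal-covariance Gaussian mixture; returns (weights, means, vars)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, size=k, replace=False)]
    variances = np.tile(X.var(axis=0), (k, 1))
    weights = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities, computed in log space for stability
        log_p = np.stack([np.log(weights[j])
                          - 0.5 * (np.log(2 * np.pi * variances[j]).sum()
                                   + ((X - means[j]) ** 2 / variances[j]).sum(axis=1))
                          for j in range(k)], axis=1)
        log_p -= log_p.max(axis=1, keepdims=True)
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: reweight, recenter, rescale each component
        nk = resp.sum(axis=0)
        weights = nk / n
        means = (resp.T @ X) / nk[:, None]
        variances = np.stack([(resp[:, j, None] * (X - means[j]) ** 2).sum(axis=0)
                              / nk[j] + 1e-6 for j in range(k)])
    return weights, means, variances

def log_likelihood(X, weights, means, variances):
    log_p = np.stack([np.log(weights[j])
                      - 0.5 * (np.log(2 * np.pi * variances[j]).sum()
                               + ((X - means[j]) ** 2 / variances[j]).sum(axis=1))
                      for j in range(len(weights))], axis=1)
    m = log_p.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(log_p - m).sum(axis=1, keepdims=True))).ravel()

rng = np.random.default_rng(1)
# Synthetic stand-in for features of normal B-scans (two modes, 3-D features).
normal_feats = np.vstack([rng.normal(0, 1, (100, 3)), rng.normal(4, 1, (100, 3))])
w, mu, v = fit_gmm(normal_feats, k=2)
threshold = log_likelihood(normal_feats, w, mu, v).min() - 5.0  # generous margin

test_feats = np.vstack([rng.normal(0, 1, (5, 3)), [[10.0, -10.0, 10.0]]])
flags = log_likelihood(test_feats, w, mu, v) < threshold
print(flags)   # only the last (planted) feature vector should be flagged
```

    In the paper's setting the per-B-scan outlier flags would then be counted per volume to classify the whole SD-OCT volume.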

  6. Energy Detection Based on Undecimated Discrete Wavelet Transform and Its Application in Magnetic Anomaly Detection

    PubMed Central

    Nie, Xinhua; Pan, Zhongming; Zhang, Dasha; Zhou, Han; Chen, Min; Zhang, Wenna

    2014-01-01

    Magnetic anomaly detection (MAD) is a passive approach for detecting a ferromagnetic target, and its performance is often limited by external noise. Considering that one major noise source is fractal noise (also called 1/f noise) with a power spectral density of 1/f^α (0 < α < 2), an energy detection method based on the undecimated discrete wavelet transform (UDWT) is proposed in this paper. Firstly, the foundations of magnetic anomaly detection and the UDWT are briefly introduced, and a possible detection system based on a giant magneto-impedance (GMI) magnetic sensor is also presented. Then the proposed UDWT-based energy detection is described in detail, and theoretical probabilities of false alarm and detection for a given detection threshold are derived. Notably, no a priori assumptions about the ferromagnetic target or the magnetic noise distribution are necessary for our method, and unlike the discrete wavelet transform (DWT), the UDWT is shift invariant. Finally, simulations show that the detection performance of the proposed detector is better than that of the conventional energy detector, even in Gaussian white noise, especially when the spectral parameter α is less than 1.0. In addition, a real-world experiment was conducted to demonstrate the advantages of the proposed method. PMID:25343484
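    The energy-detection idea can be sketched in NumPy with a simplified, Haar-style stand-in for one UDWT detail band (the à trous scheme: no subsampling, hence shift invariance). The simulated stream, pulse shape, and 5-sigma threshold are all illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated magnetometer stream: random-walk (1/f-like) noise plus a
# short smooth pulse standing in for a target signature (both invented).
n = 2048
noise = np.cumsum(rng.normal(0, 0.02, n))
signal = noise.copy()
signal[1000:1100] += np.hanning(100) * 2.0      # hypothetical target pulse

def udwt_detail(x, level):
    """One Haar-style detail band without subsampling (à trous scheme):
    output has the same length as the input, so the transform is
    shift invariant, as the abstract emphasizes."""
    gap = 2 ** (level - 1)
    d = np.zeros_like(x)
    d[gap:] = (x[gap:] - x[:-gap]) / np.sqrt(2)
    return d

# Sliding-window energy of the level-3 detail coefficients.
detail = udwt_detail(signal, level=3)
win = 64
energy = np.convolve(detail ** 2, np.ones(win), mode="same")

# Calibrate a threshold on target-free noise and declare detections.
noise_energy = np.convolve(udwt_detail(noise, 3) ** 2, np.ones(win), mode="same")
threshold = noise_energy.mean() + 5 * noise_energy.std()
detections = np.flatnonzero(energy > threshold)
```

    The differencing in the detail band whitens the slowly varying fractal noise, which is why the wavelet-domain energy detector copes better with 1/f-type noise than a raw energy detector would.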

  7. Association of Copy Number Variants With Specific Ultrasonographically Detected Fetal Anomalies

    PubMed Central

    Donnelly, Jennifer C; Platt, Lawrence D; Rebarber, Andrei; Zachary, Julia; Grobman, William A; Wapner, Ronald J

    2014-01-01

    Objective To evaluate the association of other-than-common benign copy number variants with specific fetal abnormalities detected by ultrasonogram. Methods Fetuses with structural anomalies were compared to fetuses without detected abnormalities for the frequency of other-than-common benign copy number variants. This is a secondary analysis from the previously published National Institute of Child Health and Human Development microarray trial. Ultrasound reports were reviewed and details of structural anomalies were entered into a nonhierarchical web-based database. The frequency of other-than-common benign copy number variants (i.e., either pathogenic or variants of uncertain significance) not detected by karyotype was calculated for each anomaly in isolation and in the presence of other anomalies and compared to the frequency in fetuses without detected abnormalities. Results Of 1,082 fetuses with anomalies detected on ultrasound, 752 had a normal karyotype. Other-than-common benign copy number variants were present in 61 (8.1%) of these euploid fetuses. Fetuses with anomalies in more than one system had a 13.0% frequency of other-than-common benign copy number variants, which was significantly higher (p < 0.001) than the frequency (3.6%) in fetuses without anomalies (n = 1,966). Specific organ systems in which isolated anomalies were nominally significantly associated with other-than-common benign copy number variants were the renal (p = 0.036) and cardiac (p = 0.012) systems, but these associations did not survive adjustment for multiple comparisons. Conclusions When a fetal anomaly is detected on ultrasonogram, chromosomal microarray offers additional information over karyotype, the degree of which depends on the organ system involved. PMID:24901266

  8. Anomaly Detection Techniques for the Condition Monitoring of Tidal Turbines

    DTIC Science & Technology

    2014-09-29

    live turbine data, with anomalies indicating the possible onset of a fault within the system. 1. INTRODUCTION Tidal power has great potential...live turbine data, alerting the operator to the possible onset of a fault. The implementation of an intelligent condition monitoring system is also...indicate a change in the response of the system, indicating the possible onset of a fault. 1.2.1. CRISP-DM The CRISP-DM (Cross-Industry Standard

  9. Autonomous detection of crowd anomalies in multiple-camera surveillance feeds

    NASA Astrophysics Data System (ADS)

    Nordlöf, Jonas; Andersson, Maria

    2016-10-01

    A novel approach for autonomous detection of anomalies in crowded environments is presented in this paper. The proposed model uses a Gaussian mixture probability hypothesis density (GM-PHD) filter as a feature extractor in conjunction with different Gaussian mixture hidden Markov models (GM-HMMs). Results, based on both simulated and recorded data, indicate that this method can track and detect anomalies on-line in individual crowds through multiple camera feeds in a crowded environment.

  10. Anomaly Detection in Multiple Scale for Insider Threat Analysis

    SciTech Connect

    Kim, Yoohwan; Sheldon, Frederick T; Hively, Lee M

    2012-01-01

    We propose a method to quantify malicious insider activity with statistical and graph-based analysis aided by semantic scoring rules. Different types of personal activities or interactions are monitored to form a set of directed weighted graphs. The semantic scoring rules assign higher scores to events that are more significant and suspicious. Then we build personal activity profiles in the form of score tables. Profiles are created at multiple scales, where the low-level profiles are aggregated toward more stable higher-level profiles within the subject or object hierarchy. Further, the profiles are created at different time scales such as day, week, or month. During operation, the insider's current activity profile is compared to the historical profiles to produce an anomaly score. For each subject with a high anomaly score, a subgraph of connected subjects is extracted to look for any related score movement. Finally, the subjects are ranked by their anomaly scores to help the analysts focus on high-scored subjects. The threat-ranking component supports the interaction between the User Dashboard and the Insider Threat Knowledge Base portal. The portal includes a repository for historical results, i.e., adjudicated cases containing all of the information first presented to the user and including any additional insights to help the analysts. In this paper we show the framework of the proposed system and the operational algorithms.

  11. Lunar magnetic anomalies detected by the Apollo subsatellite magnetometers

    USGS Publications Warehouse

    Hood, L.L.; Coleman, P.J.; Russell, C.T.; Wilhelms, D.E.

    1979-01-01

    Properties of lunar crustal magnetization thus far deduced from Apollo subsatellite magnetometer data are reviewed using two of the most accurate presently available magnetic anomaly maps - one covering a portion of the lunar near side and the other a part of the far side. The largest single anomaly found within the region of coverage on the near-side map correlates exactly with a conspicuous, light-colored marking in western Oceanus Procellarum called Reiner Gamma. This feature is interpreted as an unusual deposit of ejecta from secondary craters of the large nearby primary impact crater Cavalerius. An age for Cavalerius (and, by implication, for Reiner Gamma) of 3.2 ± 0.2 × 10^9 y is estimated. The main (30 × 60 km) Reiner Gamma deposit is nearly uniformly magnetized in a single direction, with a minimum mean magnetization intensity of ≈7 × 10^-2 G cm^3/g (assuming a density of 3 g/cm^3), or about 700 times the stable magnetization component of the most magnetic returned samples. Additional medium-amplitude anomalies exist over the Fra Mauro Formation (Imbrium basin ejecta emplaced ≈3.9 × 10^9 y ago) where it has not been flooded by mare basalt flows, but are nearly absent over the maria and over the craters Copernicus, Kepler, and Reiner and their encircling ejecta mantles. The mean altitude of the far-side map is much higher than that of the near-side map and the surface geology is more complex, so individual anomaly sources have not yet been identified. However, it is clear that a concentration of especially strong sources exists in the vicinity of the craters Van de Graaff and Aitken. Numerical modeling of the associated fields reveals that the source locations do not correspond with the larger primary impact craters of the region and, by analogy with Reiner Gamma, may be less conspicuous secondary crater ejecta deposits.
The reason for a special concentration of strong sources in the Van de Graaff-Aitken region is unknown, but may be indirectly

  12. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

    Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process in near real time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists, mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system’s source code; however, anomaly detection properties, which often are embedded or hard-coded in the source code, are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly detection property or a modification to an existing property is needed. This poster describes the tool support that has been developed, based on software engineering techniques, to address these challenges. The overall tool support allows scientists to specify and reuse anomaly detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection in near real time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. With guidance provided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that otherwise would have limited availability to the scientific community.

  13. Target detection using the background model from the topological anomaly detection algorithm

    NASA Astrophysics Data System (ADS)

    Dorado Munoz, Leidy P.; Messinger, David W.; Ziemann, Amanda K.

    2013-05-01

    The Topological Anomaly Detection (TAD) algorithm has been used as an anomaly detector in hyperspectral and multispectral images. TAD is an algorithm based on graph theory that constructs a topological model of the background in a scene, and computes an anomalousness ranking for all of the pixels in the image with respect to the background in order to identify pixels with uncommon or strange spectral signatures. The pixels that are modeled as background are clustered into groups or connected components, which could be representative of spectral signatures of materials present in the background. Therefore, the idea of using the background components given by TAD in target detection is explored in this paper. These connected components are characterized in three different approaches: the mean signature and endmembers for each component are calculated and used as background basis vectors in Orthogonal Subspace Projection (OSP) and Adaptive Subspace Detector (ASD), and the covariance matrix of those connected components is estimated and used in the Constrained Energy Minimization (CEM) and Adaptive Coherence Estimator (ACE) detectors. The performance of these approaches and the different detectors is compared with a global approach, where the background characterization is derived directly from the image. Experiments and results using the self-test data set provided as part of the RIT blind test target detection project are shown.
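    The background-basis-vector idea can be sketched with a plain OSP anomaly score in NumPy. The basis vectors here are random stand-ins for the TAD component mean signatures described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)

# Random stand-ins for background basis vectors (e.g., component mean
# signatures); in the paper these come from the TAD background model.
bands = 50
B = rng.normal(size=(bands, 4))

# Projector onto the orthogonal complement of the background subspace.
P_perp = np.eye(bands) - B @ np.linalg.pinv(B)

def osp_anomaly_score(x):
    """Residual energy after projecting out the background subspace:
    large when the pixel is not explained by the background basis."""
    r = P_perp @ x
    return float(r @ r)

in_background = B @ rng.normal(size=4)            # lies in span(B): score ~ 0
anomalous = in_background + rng.normal(size=bands)  # adds out-of-subspace energy
```

    Any pixel that is a linear mixture of the background basis vectors scores near zero, while a pixel with spectral content outside that span retains residual energy and ranks as anomalous.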

  14. Anomaly detection in hyperspectral imagery: statistics vs. graph-based algorithms

    NASA Astrophysics Data System (ADS)

    Berkson, Emily E.; Messinger, David W.

    2016-05-01

    Anomaly detection (AD) algorithms are frequently applied to hyperspectral imagery, but different algorithms produce different outlier results depending on the image scene content and the assumed background model. This work provides the first comparison of anomaly score distributions between common statistics-based anomaly detection algorithms (RX and subspace-RX) and the graph-based Topological Anomaly Detector (TAD). Anomaly scores in statistical AD algorithms should theoretically approximate a chi-squared distribution; however, this is rarely the case with real hyperspectral imagery. The expected distribution of scores found with graph-based methods remains unclear. We also look for general trends in algorithm performance with varied scene content. Three separate scenes were extracted from the hyperspectral MegaScene image taken over downtown Rochester, NY with the VIS-NIR-SWIR ProSpecTIR instrument. In order of most to least cluttered, we study an urban, suburban, and rural scene. The three AD algorithms were applied to each scene, and the distributions of the most anomalous 5% of pixels were compared. We find that subspace-RX performs better than RX, because the data becomes more normal when the highest variance principal components are removed. We also see that compared to statistical detectors, anomalies detected by TAD are easier to separate from the background. Due to their different underlying assumptions, the statistical and graph-based algorithms highlighted different anomalies within the urban scene. These results will lead to a deeper understanding of these algorithms and their applicability across different types of imagery.
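    The statistical baseline in this comparison, the RX detector, is simply a squared Mahalanobis distance to the scene mean. A minimal global-RX sketch on a synthetic scene (the correlated background and implanted anomaly are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "scene": correlated Gaussian background pixels plus one outlier.
bands, n_pix = 20, 5000
A = rng.normal(size=(bands, bands))               # induces band correlation
pixels = rng.normal(size=(n_pix, bands)) @ A.T
pixels[0] += 25.0                                 # implanted anomaly

# Global RX score: squared Mahalanobis distance to the scene mean.
mu = pixels.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
centered = pixels - mu
rx = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)

# Under an ideal Gaussian background, rx ~ chi-squared with `bands` dof;
# the abstract notes real imagery rarely matches this ideal.
top = np.argsort(rx)[::-1][:int(0.05 * n_pix)]    # most anomalous 5% of pixels
```

    Subspace-RX differs only in first removing the highest-variance principal components before computing the same distance, which is why it behaves better when a few background directions dominate the covariance.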

  15. A hyperspectral imagery anomaly detection algorithm based on local three-dimensional orthogonal subspace projection

    NASA Astrophysics Data System (ADS)

    Zhang, Xing; Wen, Gongjian

    2015-10-01

    Anomaly detection (AD) is increasingly important in hyperspectral imagery analysis, with many practical applications. The local orthogonal subspace projection (LOSP) detector is a popular anomaly detector which exploits local endmembers/eigenvectors around the pixel under test (PUT) to construct a background subspace. However, this subspace uses only spectral information and neglects the spatial correlation of the background clutter, which makes the detection result sensitive to the accuracy of the estimated subspace. In this paper, a local three-dimensional orthogonal subspace projection (3D-LOSP) algorithm is proposed. Firstly, using both spectral and spatial information jointly, three directional background subspaces are created along the image height direction, the image width direction and the spectral direction, respectively. Then, the three corresponding orthogonal subspaces are calculated. After that, each vector of the local cube along the three directions is projected onto the corresponding orthogonal subspace. Finally, a composite score is computed from the three directional operators. In 3D-LOSP, anomalies are redefined as targets that are not only spectrally different from the background but also spatially distinct. Thanks to the added spatial information, the proposed 3D-LOSP algorithm greatly improves the robustness of the detection result. It is noteworthy that the proposed algorithm is an extension of LOSP, and the same idea can be applied to many other spectral-based anomaly detection methods. Experiments with real hyperspectral images demonstrate the stability of the detection result.

  16. Multi-Level Anomaly Detection on Time-Varying Graph Data

    SciTech Connect

    Bridges, Robert A; Collins, John P; Ferragut, Erik M; Laska, Jason A; Sullivan, Blair D

    2015-01-01

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  17. A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization

    DOE PAGES

    Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.; ...

    2016-01-01

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  18. A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization

    SciTech Connect

    Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.; Laska, Jason A.; Sullivan, Blair D.

    2016-01-01

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  19. Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)

    2001-01-01

    The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is basically a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes uncertainty and vagueness typically found in image analysis. ANN provides learning capabilities, and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along slide basket cables utilized by the astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another current, potentially viable application at NASA is in detecting anomalies of the NASA Space Shuttle Orbiter's radiator panels.

  20. Overlapping image segmentation for context-dependent anomaly detection

    NASA Astrophysics Data System (ADS)

    Theiler, James; Prasad, Lakshman

    2011-06-01

    The challenge of finding small targets in big images lies in the characterization of the background clutter. The more homogeneous the background, the more distinguishable a typical target will be from its background. One way to homogenize the background is to segment the image into distinct regions, each of which is individually homogeneous, and then to treat each region separately. In this paper we will report on experiments in which the target is unspecified (it is an anomaly), and various segmentation strategies are employed, including an adaptive hierarchical tree-based scheme. We find that segmentations that employ overlap achieve better performance in the low false alarm rate regime.
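    The benefit of overlap can be illustrated with a toy sliding-tile detector: each pixel keeps its minimum z-score over all tiles containing it, so a pixel near a region boundary is judged against a homogeneous background in at least one tile. The tile sizes, threshold, and scene below are illustrative assumptions, not the paper's segmentation scheme.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy image: two background regions with different means, plus one
# anomalous pixel placed near the region boundary (all invented).
img = np.hstack([rng.normal(0, 1, (64, 64)), rng.normal(10, 1, (64, 64))])
img[32, 63] = 30.0

def tile_scores(img, tile=32, step=16):
    """Score each pixel within every overlapping tile by its z-score
    against that tile's own statistics, keeping the per-pixel minimum.
    Overlap ensures boundary pixels are scored against a homogeneous
    local background in at least one tile, reducing false alarms."""
    best = np.full(img.shape, np.inf)
    h, w = img.shape
    for r in range(0, h - tile + 1, step):
        for c in range(0, w - tile + 1, step):
            t = img[r:r + tile, c:c + tile]
            z = np.abs((t - t.mean()) / (t.std() + 1e-9))
            best[r:r + tile, c:c + tile] = np.minimum(
                best[r:r + tile, c:c + tile], z)
    return best

scores = tile_scores(img)
```

    Without overlap, pixels in tiles straddling the boundary would be scored against a mixed, high-variance background; the minimum over overlapping tiles suppresses that clutter while the true anomaly stays high in every tile.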

  1. Detection of nucleic acids by multiple sequential invasive cleavages

    SciTech Connect

    Hall, Jeff G; Lyamichev, Victor I; Mast, Andrea L; Brow, Mary Ann D

    2012-10-16

    The present invention relates to means for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The structure-specific nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations thereof. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge. The present invention also provides methods for the detection of non-target cleavage products via the formation of a complete and activated protein binding region. The invention further provides sensitive and specific methods for the detection of human cytomegalovirus nucleic acid in a sample.

  2. Detection of nucleic acids by multiple sequential invasive cleavages 02

    DOEpatents

    Hall, Jeff G.; Lyamichev, Victor I.; Mast, Andrea L.; Brow, Mary Ann D.

    2002-01-01

    The present invention relates to means for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The structure-specific nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations thereof. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge. The present invention also provides methods for the detection of non-target cleavage products via the formation of a complete and activated protein binding region. The invention further provides sensitive and specific methods for the detection of human cytomegalovirus nucleic acid in a sample.

  3. Detection of nucleic acids by multiple sequential invasive cleavages

    DOEpatents

    Hall, Jeff G.; Lyamichev, Victor I.; Mast, Andrea L.; Brow, Mary Ann D.

    1999-01-01

    The present invention relates to means for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The structure-specific nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations thereof. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge. The present invention also provides methods for the detection of non-target cleavage products via the formation of a complete and activated protein binding region. The invention further provides sensitive and specific methods for the detection of human cytomegalovirus nucleic acid in a sample.

  4. Addressing the Challenges of Anomaly Detection for Cyber Physical Energy Grid Systems

    SciTech Connect

    Ferragut, Erik M; Laska, Jason A; Melin, Alexander M; Czejdo, Bogdan

    2013-01-01

    The consolidation of cyber communications networks and physical control systems within the energy smart grid introduces a number of new risks. Unfortunately, these risks are largely unknown and poorly understood, yet include very high impact losses from attacks and component failures. One important aspect of risk management is the detection of anomalies and changes. However, anomaly detection within cyber security remains a difficult, open problem, with special challenges in dealing with false alert rates and heterogeneous data. Furthermore, the integration of cyber and physical dynamics is often intractable, and, because of their broad scope, energy grid cyber-physical systems must be analyzed at multiple scales, from individual components up to network-level dynamics. We describe an improved approach to anomaly detection that combines three important aspects. First, system dynamics are modeled using a reduced order model for greater computational tractability. Second, a probabilistic and principled approach to anomaly detection is adopted that allows for regulation of false alerts and comparison of anomalies across heterogeneous data sources. Third, a hierarchy of aggregations is constructed to support interactive and automated analyses of anomalies at multiple scales.
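    The second aspect, regulating false alerts while comparing anomalies across heterogeneous sources, is commonly realized by mapping each raw score to a tail probability under its own baseline. A minimal sketch with two invented sources of incomparable scale:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two heterogeneous sources with incomparable raw scales (both invented
# for illustration): a Gaussian sensor residual and an exponential
# event-interval statistic.
gauss_hist = rng.normal(0.0, 2.0, 10_000)       # historical baseline, source A
expo_hist = rng.exponential(5.0, 10_000)        # historical baseline, source B

def tail_p(history, value):
    """Empirical upper-tail p-value with add-one smoothing: the fraction
    of baseline observations at least as extreme as `value`. Mapping
    every source to a p-value makes scores comparable across sources and
    lets a single alpha regulate the overall false-alert rate."""
    return (np.sum(history >= value) + 1) / (len(history) + 1)

alpha = 1e-3                                    # one false-alert budget for all sources
alert_a = tail_p(gauss_hist, 12.0) < alpha      # a ~6-sigma residual: alert
alert_b = tail_p(expo_hist, 6.0) < alpha        # an ordinary interval: no alert
```

    Because both sources now emit probabilities rather than raw scores, their anomalies can be ranked together and the expected false-alert rate is bounded by alpha regardless of each source's native distribution.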

  5. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines.

    PubMed

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-04-29

    In a complex system, condition monitoring (CM) can collect the system's working status. The condition is mainly sensed by sensors pre-deployed in or on the system. Most existing works study how to utilize the condition information to predict upcoming anomalies, faults, or failures. There is also some research which focuses on faults or anomalies of the sensing elements (i.e., the sensors) to enhance system reliability. However, existing approaches ignore the correlation between the sensor selection strategy and data anomaly detection, which can also improve system reliability. To address this issue, we study a new scheme which includes a sensor selection strategy and data anomaly detection utilizing information theory and Gaussian Process Regression (GPR). The sensors that are most appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets used to carry out the evaluation are provided by the National Aeronautics and Space Administration (NASA) Ames Research Center and were used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing two different sensor selection strategies, the effectiveness of the selection method for data anomaly detection is demonstrated.
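    The two ingredients, mutual-information weighting between sensors and GPR-based residual checking, can be roughly sketched with scikit-learn on invented sensor streams. The residual threshold and the injected fault are assumptions for illustration, not the paper's setup or the NASA data.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(5)

# Invented CM streams: s2 tracks s1 (an informative peer), s3 is unrelated.
n = 300
s1 = np.sin(np.linspace(0, 12, n)) + rng.normal(0, 0.05, n)
s2 = 0.8 * s1 + rng.normal(0, 0.05, n)
s3 = rng.normal(0, 1, n)

# Weight inter-sensor relationships by mutual information: s2 should
# carry far more information about s1 than the unrelated s3 does.
mi = mutual_info_regression(np.column_stack([s2, s3]), s1, random_state=0)

# Fit a GPR on the informative pair and flag large prediction residuals
# as data anomalies (alpha adds observation noise to the GP model).
gpr = GaussianProcessRegressor(alpha=0.01).fit(s1[:200, None], s2[:200])
pred = gpr.predict(s1[200:, None])
observed = s2[200:].copy()
observed[50:] += 1.0                            # injected sensor bias (assumed fault)
anomaly = np.abs(observed - pred) > 0.5         # illustrative residual threshold
```

    The design point matches the abstract: detection exploits the correlation between sensors, so a biased sensor is exposed by disagreement with its informative peer rather than by its own marginal statistics.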

  6. Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering

    NASA Astrophysics Data System (ADS)

    Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech

    2015-03-01

    We present a novel method for automated anomaly detection on autofluorescence data provided by the National Institutes of Health (NIH). This is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, tracking the progression over time, and testing the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures such as wavelet analysis, or through dimensionality reduction algorithms followed by a classification algorithm, e.g., Support Vector Machine. The method that we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality-preserving properties. By applying LE, we are able to represent the data in the form of eigenimages, some of which accentuate the visibility of anomalies. We pick significant eigenimages and proceed with the VMF algorithm, which classifies anomalies across all of these eigenimages simultaneously. To evaluate our performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection on single images, and a combination of PCA and VMF. LE combined with VMF performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of using a nonlinear approach to represent the data and the effectiveness of VMF, which operates on the images as a data cube rather than as individual images.
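    The locality-preserving embedding step can be sketched with scikit-learn's SpectralEmbedding, an implementation of Laplacian Eigenmaps, on invented patch features; the VMF stage is not reproduced here.

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding

rng = np.random.default_rng(6)

# Invented stand-ins for image patches: a large normal cluster and a
# small, well-separated cluster playing the role of anomalous patches.
normal = rng.normal(0.0, 0.3, size=(180, 16))
lesion = rng.normal(2.0, 0.3, size=(20, 16))
patches = np.vstack([normal, lesion])

# Laplacian Eigenmaps: nonlinear, locality-preserving dimensionality
# reduction; each output coordinate plays the role of one "eigenimage".
emb = SpectralEmbedding(n_components=2, n_neighbors=10,
                        random_state=0).fit_transform(patches)

# In a well-separated toy case the anomalous patches stand apart in the
# leading coordinate, which is what makes a later matched-filtering
# stage easier than working on the raw pixels.
gap = abs(emb[-20:, 0].mean() - emb[:180, 0].mean())
```

    Because the neighborhood graph connects only nearby patches, the embedding preserves local structure, the property the abstract credits for accentuating anomalies relative to a linear method such as PCA.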

  7. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines

    PubMed Central

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-01-01

    In a complex system, condition monitoring (CM) collects information about the system's working status, mainly through sensors pre-deployed in or on the system. Most existing work studies how to use this condition information to predict upcoming anomalies, faults, or failures. Some research also focuses on faults or anomalies of the sensing elements themselves (i.e., the sensors) to enhance system reliability. However, existing approaches ignore the correlation between the sensor selection strategy and data anomaly detection, which can also improve system reliability. To address this issue, we study a new scheme that combines a sensor selection strategy with data anomaly detection using information theory and Gaussian Process Regression (GPR). The sensors most appropriate for the system CM are selected first. Mutual information is then used to weight the correlation among the different sensors, and anomaly detection is carried out using the correlation of the sensor data. The sensor data sets used for the evaluation are provided by the National Aeronautics and Space Administration (NASA) Ames Research Center and were used as the Prognostics and Health Management (PHM) challenge data in 2008. A comparison of two different sensor selection strategies demonstrates the effect of the selection method on data anomaly detection. PMID:27136561
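
The mutual-information weighting step described in this record can be illustrated with a small sketch. This is a simplified histogram-based estimator over two sensor streams; the binning scheme is an illustrative assumption, and the paper's GPR stage is not shown:

```python
import math
from collections import Counter

def mutual_information(x, y, bins=4):
    """Histogram-based mutual information (in nats) between two sensor
    streams -- a simplified stand-in for the MI weighting described above."""
    def discretize(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0  # guard against a constant stream
        return [min(int((s - lo) / w), bins - 1) for s in v]
    xd, yd = discretize(x), discretize(y)
    n = len(xd)
    px, py = Counter(xd), Counter(yd)
    pxy = Counter(zip(xd, yd))
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        mi += p_ab * math.log(p_ab / ((px[a] / n) * (py[b] / n)))
    return mi
```

Streams that carry redundant information about each other score high, while a stream that tells us nothing about another scores near zero, which is the property the sensor-correlation weighting relies on.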

  8. Anomaly detection based on PCA and local RXOSP in hyperspectral image

    NASA Astrophysics Data System (ADS)

    Lin, Juan; Gao, Kun; Wang, Lijing; Gong, Xuemei

    2016-10-01

    To address the noise sensitivity and low detection performance of the classical RX algorithm under complex backgrounds, an improved RX-OSP hyperspectral anomaly detection method is proposed. First, PCA dimension reduction is applied to suppress the background of the hyperspectral image. Second, the RX operator is used to detect the most prominently anomalous pixels, and these pixels are projected onto their orthogonal complement subspaces. The RX-OSP processing is then repeated according to the foregoing steps until no obvious anomaly remains. During detection, the covariance matrix is calculated locally instead of with the traditional global approach, which effectively reduces false detections. Finally, the ROC curve is adopted as the evaluation index for the experimental results, which show that the improved RX-OSP algorithm is superior to the RX, PCA-RX and RX-OSP algorithms.
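
The RX operator at the core of this method scores each pixel by its Mahalanobis distance from the background statistics. A minimal two-band sketch with global statistics only (the paper's local covariance estimation, PCA step, and OSP projections are not shown):

```python
def rx_scores(pixels):
    """Global RX anomaly scores for 2-band pixel spectra (toy sketch).
    Score = Mahalanobis distance of each pixel from the background
    mean and covariance, both estimated from the image itself."""
    n = len(pixels)
    mu = [sum(p[k] for p in pixels) / n for k in (0, 1)]
    # 2x2 sample covariance, written out explicitly for the two-band case
    c00 = sum((p[0] - mu[0]) ** 2 for p in pixels) / n
    c11 = sum((p[1] - mu[1]) ** 2 for p in pixels) / n
    c01 = sum((p[0] - mu[0]) * (p[1] - mu[1]) for p in pixels) / n
    det = c00 * c11 - c01 * c01
    i00, i01, i11 = c11 / det, -c01 / det, c00 / det  # inverse covariance
    scores = []
    for p in pixels:
        d0, d1 = p[0] - mu[0], p[1] - mu[1]
        scores.append(d0 * (i00 * d0 + i01 * d1) + d1 * (i01 * d0 + i11 * d1))
    return scores
```

A pixel far from the background distribution receives the largest score; the local variant replaces the global mean and covariance with statistics from a window around each pixel.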

  9. A new comparison of hyperspectral anomaly detection algorithms for real-time applications

    NASA Astrophysics Data System (ADS)

    Díaz, María.; López, Sebastián.; Sarmiento, Roberto

    2016-10-01

    Due to the high spectral resolution that remotely sensed hyperspectral images provide, there has been increasing interest in anomaly detection. The aim of anomaly detection is to single out pixels whose spectral signature differs significantly from the background spectra. Basically, anomaly detectors mark pixels with a certain score, considering as anomalies those whose scores exceed a threshold. Receiver Operating Characteristic (ROC) curves have been widely used as an assessment measure to compare the performance of different algorithms. ROC curves are graphical plots that illustrate the trade-off between false positive and true positive rates. However, they are of limited use for deeper comparisons because they discard factors relevant to real-time applications, such as run times, the costs of misclassification, and the ability to mark anomalies with high scores. This last factor is fundamental in anomaly detection in order to distinguish anomalies easily from the background without any posterior processing. An extensive set of simulations has been run using different anomaly detection algorithms, comparing their performance and efficiency using several extra metrics that complement ROC curve analysis. The results support our proposal and demonstrate that ROC curves alone do not provide a good visualization of detection performance. Moreover, a figure of merit is proposed in this paper that encompasses in a single global metric all the measures yielded by the proposed additional metrics. This figure, named Detection Efficiency (DE), takes into account several crucial types of performance assessment that ROC curves do not consider. The results demonstrate that the algorithms with the best detection performance according to ROC curves do not have the highest DE values.
    Consequently, the recommendation to use extra measures to properly evaluate performance has been supported and justified by
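
The ROC analysis that the paper argues is insufficient on its own is easy to reproduce for any detector's scores. A minimal sketch (tied scores are not handled specially, and the paper's Detection Efficiency metric itself is not reproduced here):

```python
def roc_points(scores, labels):
    """(FPR, TPR) pairs swept over all thresholds; higher score = more
    anomalous, labels are 1 for anomaly and 0 for background."""
    pairs = sorted(zip(scores, labels), reverse=True)
    P = sum(labels)
    N = len(labels) - P
    tp = fp = 0
    pts = [(0.0, 0.0)]
    for _, y in pairs:
        if y:
            tp += 1
        else:
            fp += 1
        pts.append((fp / N, tp / P))
    return pts

def auc(pts):
    """Trapezoidal area under the ROC curve."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
```

A detector that ranks every anomaly above every background pixel reaches an AUC of 1.0, yet, as the paper points out, this says nothing about run time or how widely the anomaly scores are separated from the background scores.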

  10. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. Previous results on extending traditional anomaly detection techniques are summarized. The focus of this paper is a new technique for attention focusing.

  11. Extending TOPS: A Prototype MODIS Anomaly Detection Architecture

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Srivastava, A. N.

    2008-12-01

    The management and processing of Earth science data has been gaining importance over the last decade due to higher data volumes generated by a larger number of instruments, and due to the increase in complexity of the Earth science models that use these data. The volume of data itself is often a limiting factor in obtaining the information needed by scientists; without more sophisticated data volume reduction technologies, possible key information may not be discovered. We are especially interested in the automatic identification of disturbances within ecosystems (e.g., wildfires, droughts, floods, insect/pest damage, wind damage, logging), and in focusing our analysis efforts on the identified areas. There are dozens of variables that define the health of an ecosystem, and both long-term and short-term changes in these variables can serve as early indicators of natural disasters and of shifts in climate and ecosystem health. These changes can have profound socio-economic impacts, and we need to develop capabilities for identification, analysis, and response to these changes in a timely manner. Because the ecosystem consists of a large number of variables, a disturbance may only become apparent when we examine relationships among multiple variables, even though none of them is alarming by itself. We have to be able to extract information from multiple sensors and observations and discover these underlying relationships. As data volumes increase, there is also the potential for a large number of anomalies to "flood" the system, so we need to provide the ability to automatically select the most likely and most important anomalies, and the ability to analyze an anomaly with minimal involvement of scientists. We describe a prototype architecture for anomaly-driven data reduction for both near-real-time and archived surface reflectance data from the MODIS instrument collected over Central California and test it using Orca and One-Class Support Vector Machines

  12. Sequential capillary electrophoresis analysis using optically gated sample injection and UV/vis detection.

    PubMed

    Liu, Xiaoxia; Tian, Miaomiao; Camara, Mohamed Amara; Guo, Liping; Yang, Li

    2015-10-01

    We present sequential CE analysis of amino acids and an L-asparaginase-catalyzed enzyme reaction, combining on-line derivatization, optically gated (OG) injection, and commercially available UV/Vis detection. Various experimental conditions for sequential OG-UV/Vis CE analysis were investigated and optimized by analyzing a standard mixture of amino acids. High reproducibility of the sequential CE analysis was demonstrated, with RSD values (n = 20) of 2.23, 2.57, and 0.70% for peak heights, peak areas, and migration times, respectively, and LODs of 5.0 μM (for asparagine) and 2.0 μM (for aspartic acid) were obtained. Applying the OG-UV/Vis CE analysis, a sequential online CE enzyme assay of the L-asparaginase-catalyzed reaction was carried out by automatically and continuously monitoring the substrate consumption and the product formation every 12 s from the beginning to the end of the reaction. The Michaelis constants for the reaction were obtained and found to be in good agreement with the results of traditional off-line enzyme assays. The study demonstrates the feasibility and reliability of integrating OG injection with UV/Vis detection for sequential online CE analysis, which could be of potential value for online monitoring of various chemical reactions and bioprocesses.
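
The Michaelis constants mentioned above come from fitting the Michaelis-Menten model v = Vmax·S/(Km + S) to the monitored substrate/product data. One classical way to estimate them is sketched below (a Lineweaver-Burk linearization; this is a generic illustration, not the authors' fitting procedure, which the abstract does not specify):

```python
def michaelis_menten_fit(S, v):
    """Estimate Vmax and Km via the Lineweaver-Burk linearization:
    1/v = (Km/Vmax)(1/S) + 1/Vmax, fitted by ordinary least squares
    on the reciprocal data."""
    xs = [1.0 / s for s in S]
    ys = [1.0 / vi for vi in v]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    Vmax = 1.0 / intercept
    Km = slope * Vmax
    return Vmax, Km
```

On noiseless data the reciprocal plot is exactly linear, so the fit recovers Vmax and Km exactly; with real measurements a direct nonlinear fit is usually preferred, because the reciprocal transform amplifies noise at low concentrations.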

  13. Security inspection in ports by anomaly detection using hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Rivera, Javier; Valverde, Fernando; Saldaña, Manuel; Manian, Vidya

    2013-05-01

    Applying hyperspectral imaging technology to port security is crucial for the detection of possible threats and illegal activities. One of the most common problems that cargo suffers is tampering, which represents a danger to society because it creates a channel for smuggling illegal and hazardous products. If a cargo is altered, security inspections of that cargo should reveal anomalies that indicate the nature of the tampering. Hyperspectral images can detect anomalies by gathering information through multiple electromagnetic bands. The spectra extracted from these bands can be used to detect surface anomalies caused by different materials. Based on this technology, a scenario was built in which a hyperspectral camera is used to inspect cargo for surface anomalies and a user interface shows the results. The spectra of items altered by different materials that can be used to conceal illegal products are analyzed and classified in order to provide information about the tampered cargo. The image is analyzed with a variety of techniques, such as multiple feature extraction algorithms, autonomous anomaly detection, and target spectrum detection. The results are exported to a workstation or mobile device and shown in an easy-to-use interface. This process could enhance the current capabilities of security systems that are already implemented, providing a more complete approach to detecting threats and illegal cargo.

  14. Advancements of data anomaly detection research in wireless sensor networks: a survey and open issues.

    PubMed

    Rassam, Murad A; Zainal, Anazida; Maarof, Mohd Aizaini

    2013-08-07

    Wireless Sensor Networks (WSNs) are important and necessary platforms for the future, as the concept of the "Internet of Things" has emerged lately. They are used for monitoring, tracking, and control in many applications in industry, health care, habitat monitoring, and the military. However, the quality of data collected by sensor nodes is affected by anomalies that occur for various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements for designing efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches into five main classes based on the detection methods used to design them. A variety of state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future work. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed.

  15. Advancements of Data Anomaly Detection Research in Wireless Sensor Networks: A Survey and Open Issues

    PubMed Central

    Rassam, Murad A.; Zainal, Anazida; Maarof, Mohd Aizaini

    2013-01-01

    Wireless Sensor Networks (WSNs) are important and necessary platforms for the future, as the concept of the “Internet of Things” has emerged lately. They are used for monitoring, tracking, and control in many applications in industry, health care, habitat monitoring, and the military. However, the quality of data collected by sensor nodes is affected by anomalies that occur for various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements for designing efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches into five main classes based on the detection methods used to design them. A variety of state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future work. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed. PMID:23966182

  16. A Statistical Detection of an Anomaly from a Few Noisy Tomographic Projections

    NASA Astrophysics Data System (ADS)

    Fillatre, Lionel; Nikiforov, Igor

    2005-12-01

    The problem of detecting an anomaly/target from a very limited number of noisy tomographic projections is addressed from the statistical point of view. The imaged object is composed of an environment, considered as a nuisance parameter, with a possibly hidden anomaly/target. The GLR test is used to solve the problem. When the projection linearly depends on the nuisance parameters, the GLR test coincides with an optimal statistical invariant test.
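
When the nuisance parameter enters linearly, the invariant GLR statistic reduces to the energy of the data after the nuisance subspace is projected out. A toy sketch for the simplest such case, an unknown constant background level as the nuisance (the tomographic projection geometry and the paper's specific parameterization are omitted):

```python
def glr_statistic(y, sigma=1.0):
    """GLR-style invariant statistic for an additive anomaly on top of an
    unknown constant background (the nuisance): the energy of the residual
    after projecting the data onto the complement of the all-ones vector."""
    n = len(y)
    mean = sum(y) / n  # ML estimate of the nuisance background level
    return sum((v - mean) ** 2 for v in y) / sigma ** 2
```

Under the no-anomaly hypothesis this statistic follows a chi-square distribution with n-1 degrees of freedom, which is what fixes the detection threshold for a chosen false-alarm rate.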

  17. Anomaly Detection and Life Pattern Estimation for the Elderly Based on Categorization of Accumulated Data

    NASA Astrophysics Data System (ADS)

    Mori, Taketoshi; Ishino, Takahito; Noguchi, Hiroshi; Shimosaka, Masamichi; Sato, Tomomasa

    2011-06-01

    We propose a life pattern estimation method and an anomaly detection method for elderly people living alone. In our observation system, pyroelectric sensors are deployed in the house and measure the person's activities continuously in order to capture the person's life pattern. The data are transferred successively to the operation center and displayed precisely to the nurses there, who decide whether the data are anomalous or not. In the system, people whose life features resemble each other are categorized into the same group. Anomalies that occurred in the past are shared within the group and utilized in the anomaly detection algorithm. This algorithm is based on an "anomaly score," which is computed from the activeness of the person; this activeness is approximately proportional to the frequency of sensor responses per minute. The "anomaly score" is calculated as the difference between the present activeness and its long-term past average. Thus, the score is positive if the present activeness is higher than the past average, and negative if it is lower. If the score exceeds a certain threshold, an anomaly event has occurred. Moreover, we developed an activity estimation algorithm that estimates residents' basic activities, such as rising and going out. The estimates are shown to the nurses together with the residents' "anomaly scores," so the nurses can understand the residents' health conditions by combining these two pieces of information.

  18. Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge

    SciTech Connect

    Ondrej Linda; Todd Vollmer; Milos Manic

    2012-08-01

    The planned large-scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks. The implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures, multiple communication streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require specification of a sensitivity threshold, which inevitably leads to a tradeoff between false positive and false negative rates. In order to alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams, allowing accurate models of normal network behavior to be learned. Furthermore, the developed system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.

  19. Physics-based, Bayesian sequential detection method and system for radioactive contraband

    DOEpatents

    Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E

    2014-03-18

    A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy), low-count radionuclide measurements, i.e., an event mode sequence (EMS), using a statistical approach based on Bayesian inference and physics-model-based signal processing, with each radionuclide represented as a decomposition of monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence-interval discriminator on the energy amplitude and interarrival time, and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test then determines one of two threshold conditions, signifying that the EMS either is or is not identified as the target radionuclide; if not, the process is repeated for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
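
The sequential likelihood ratio test in this patent follows the classical Wald scheme: accumulate the log-likelihood ratio event by event and stop at the first threshold crossing. A generic sketch (the patent's radionuclide-specific likelihood models are replaced here by caller-supplied log-densities):

```python
import math

def sprt(samples, logpdf_target, logpdf_other, alpha=0.01, beta=0.01):
    """Wald sequential probability ratio test sketch: accumulate the
    log-likelihood ratio per event until one of the two thresholds,
    set by the desired error rates alpha and beta, is crossed."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for i, x in enumerate(samples, 1):
        llr += logpdf_target(x) - logpdf_other(x)
        if llr >= upper:
            return "target", i
        if llr <= lower:
            return "not target", i
    return "undecided", len(samples)
```

With exponential interarrival-time models, for example, a run of short interarrival times drives the statistic to the upper threshold after only a few photon events, which is what makes the scheme attractive for low-count measurements.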

  20. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems

    PubMed Central

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often take place in a manufacturing system, and their real-time identification is necessary in the manufacturing industry. In this paper, we present an automated tool, called the PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for these purposes. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and executes within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel with the data-logging system to identify operational faults and behavioural anomalies effectively. PMID:27974882

  1. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems.

    PubMed

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often take place in a manufacturing system, and their real-time identification is necessary in the manufacturing industry. In this paper, we present an automated tool, called the PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for these purposes. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and executes within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel with the data-logging system to identify operational faults and behavioural anomalies effectively.
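
The hash-table indexing idea behind these two PLAT records can be sketched in a few lines: hash every signal pattern observed during normal operation into a set (the nominal model), then flag any runtime record whose pattern is absent. This is only a schematic reading of the abstract, not PLAT's actual data layout:

```python
def build_nominal_model(training_logs):
    """Hash-table 'nominal model' sketch: record every PLC signal
    pattern seen during known-normal operation."""
    return {tuple(rec) for rec in training_logs}

def detect(log_stream, nominal):
    """Return the indices of log records whose signal pattern was never
    seen in training -- candidate faults or behavioural anomalies."""
    return [i for i, rec in enumerate(log_stream) if tuple(rec) not in nominal]
```

Because membership tests on a hash set are constant-time on average, the check keeps up with a live log stream, which is consistent with the real-time claim in the abstract.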

  2. Apparatus and method for detecting a magnetic anomaly contiguous to remote location by SQUID gradiometer and magnetometer systems

    DOEpatents

    Overton, W.C. Jr.; Steyert, W.A. Jr.

    1981-05-22

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  3. Robust and Accurate Anomaly Detection in ECG Artifacts Using Time Series Motif Discovery

    PubMed Central

    Sivaraks, Haemwaan

    2015-01-01

    Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats, helping to identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because they cannot differentiate ECG artifacts from real ECG signals, especially artifacts that resemble ECG signals in shape and/or frequency. The problem leads to high vigilance demands on physicians and a risk of misinterpretation for nonspecialists. This work therefore proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts and can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design, and every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experimental results on real ECG datasets are interpreted and evaluated by cardiologists. The proposed algorithm mostly achieves 100% accuracy of detection (AoD), sensitivity, specificity, and positive predictive value, with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts compared with competitive anomaly detection methods. PMID:25688284

  4. Robust and accurate anomaly detection in ECG artifacts using time series motif discovery.

    PubMed

    Sivaraks, Haemwaan; Ratanamahatana, Chotirat Ann

    2015-01-01

    Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats, helping to identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because they cannot differentiate ECG artifacts from real ECG signals, especially artifacts that resemble ECG signals in shape and/or frequency. The problem leads to high vigilance demands on physicians and a risk of misinterpretation for nonspecialists. This work therefore proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts and can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design, and every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experimental results on real ECG datasets are interpreted and evaluated by cardiologists. The proposed algorithm mostly achieves 100% accuracy of detection (AoD), sensitivity, specificity, and positive predictive value, with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts compared with competitive anomaly detection methods.
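
Time series motif discovery, the core primitive these two records rely on, can be sketched as a brute-force search for the closest pair of non-overlapping subsequences. Real implementations use much faster algorithms (e.g., matrix-profile methods), and none of the ECG-specific, cardiologist-guided steps are shown here:

```python
def motif_pair(series, m):
    """Brute-force motif discovery sketch: find the pair of length-m
    subsequences with the smallest Euclidean distance, skipping trivial
    (overlapping) matches. Returns (distance, (start_i, start_j))."""
    n = len(series) - m + 1
    best = (float("inf"), None)
    for i in range(n):
        for j in range(i + m, n):  # non-overlapping pairs only
            d = sum((series[i + k] - series[j + k]) ** 2
                    for k in range(m)) ** 0.5
            best = min(best, (d, (i, j)))
    return best
```

The repeated pattern (the motif) corresponds to normal beats; subsequences far from every motif are candidate anomalies, which is the intuition that lets such methods separate genuine abnormal beats from artifacts.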

  5. Sequential Model Selection based Segmentation to Detect DNA Copy Number Variation

    PubMed Central

    Hu, Jianhua; Zhang, Liwen; Wang, Huixia Judy

    2016-01-01

    Array-based CGH experiments are designed to detect genomic aberrations or regions of DNA copy-number variation that are associated with an outcome, typically a disease state. Most existing statistical methods focus on detecting DNA copy number variations in a single sample or array. We focus on detecting group-effect variation through the simultaneous study of multiple samples from multiple groups. Rather than using direct segmentation or smoothing techniques, as is common in existing detection methods, we develop a sequential model selection procedure guided by a modified Bayesian information criterion. This approach improves detection accuracy by cumulatively utilizing information across contiguous clones, and it has a computational advantage over popular existing detection methods. Our empirical investigation suggests that the proposed method outperforms existing detection methods, in particular in detecting small segments or separating neighboring segments with differential degrees of copy-number variation. PMID:26954760
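
The criterion-guided sequential selection idea can be illustrated with a greedy sketch for a single series: keep adding whichever changepoint most reduces a BIC-style score until no candidate improves it. This uses a piecewise-constant toy model with a standard BIC penalty, not the authors' modified criterion or their multi-sample, multi-group model:

```python
import math

def bic(series, cps):
    """BIC-style score for a piecewise-constant mean model with the
    given changepoints: fit term (log mean squared residual) plus a
    log(n) penalty per segment."""
    n = len(series)
    bounds = [0] + sorted(cps) + [n]
    rss = 0.0
    for a, b in zip(bounds, bounds[1:]):
        seg = series[a:b]
        m = sum(seg) / len(seg)
        rss += sum((v - m) ** 2 for v in seg)
    return n * math.log(rss / n + 1e-12) + (len(cps) + 1) * math.log(n)

def sequential_segmentation(series):
    """Greedy sequential model selection: repeatedly add the candidate
    changepoint that most reduces the criterion; stop when none helps."""
    cps = []
    while True:
        best_score, best_t = min(
            (bic(series, cps + [t]), t)
            for t in range(1, len(series)) if t not in cps
        )
        if best_score < bic(series, cps):
            cps.append(best_t)
        else:
            return sorted(cps)
```

Each pass is O(n) criterion evaluations, so the greedy search stays cheap compared with exhaustive segmentation, which mirrors the computational advantage the abstract claims for the sequential approach.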

  6. Statistical Inference for Detecting Structures and Anomalies in Networks

    DTIC Science & Technology

    2015-08-27

    community structure in dynamic networks, along with the discovery of a detectability phase transition as a function of the rate of change and the...local information about the known nodes and their neighbors. But when this fraction crosses a critical threshold, our knowledge becomes global

  7. Countering Botnets: Anomaly-Based Detection, Comprehensive Analysis, and Efficient Mitigation

    DTIC Science & Technology

    2011-05-01

    COUNTERING BOTNETS: ANOMALY-BASED DETECTION, COMPREHENSIVE ANALYSIS, AND EFFICIENT MITIGATION. Georgia Tech Research Corporation, May 2011, final report (grant number FA8750-08-2-0141). ...cover five general areas: (1) botnet detection, (2) botnet analysis, (3) botnet mitigation, (4) add-on tasks to the original contract, including the

  8. Low frequency of Y anomaly detected in Australian Brahman cow-herds

    PubMed Central

    de Camargo, Gregório M.F.; Porto-Neto, Laercio R.; Fortes, Marina R.S.; Bunch, Rowan J.; Tonhati, Humberto; Reverter, Antonio; Moore, Stephen S.; Lehnert, Sigrid A.

    2015-01-01

    Indicine cattle have lower reproductive performance than taurine cattle. A chromosomal anomaly characterized by the presence of Y markers in females has been reported and associated with infertility in cattle. The aim of this study was to investigate the occurrence of this anomaly in Brahman cows. Brahman cows (n = 929) were genotyped for a Y chromosome-specific region using real-time PCR. Only six of the 929 cows carried the anomaly (0.6%). The anomaly frequency was much lower in Brahman cows than in the crossbred population in which it was first detected, and the anomaly does not appear to affect pregnancy in this population. Due to the low frequency, association analyses could not be performed. Further, the SNP signal of the pseudoautosomal boundary region of the Y chromosome was investigated using an HD SNP chip. Pooled DNA of “non-pregnant” and “pregnant” cows was compared, and no difference in SNP allele frequency was observed. The results suggest that the anomaly has a very low frequency in this Australian Brahman population and has no effect on reproduction. Further studies comparing pregnant cows and cows that failed to conceive should be carried out once the Y chromosome in cattle is better assembled and annotated. PMID:25750859

  9. Temporal Characteristics of Radiologists' and Novices' Lesion Detection in Viewing Medical Images Presented Rapidly and Sequentially

    PubMed Central

    Nakashima, Ryoichi; Komori, Yuya; Maeda, Eriko; Yoshikawa, Takeharu; Yokosawa, Kazuhiko

    2016-01-01

    Although viewing multiple stacks of medical images presented on a display is a relatively new but useful medical task, little is known about it. In particular, it is unclear how radiologists search for lesions in this type of image reading. When viewing cluttered and dynamic displays, continuous motion itself does not capture attention. Target detection is therefore aided when observers' attention is captured by the onset signal of a suddenly appearing target among continuously moving distractors (i.e., a passive viewing strategy). This can be applied to stack viewing tasks, because lesions often show up as transient signals in medical images that are presented sequentially, simulating a dynamic, smoothly transforming image progression of organs. However, it is unclear whether observers can detect a target when the target appears at the beginning of a sequential presentation, where the global apparent-motion onset signal (i.e., the signal of the initiation of apparent motion by sequential presentation) occurs. We investigated the ability of radiologists to detect lesions during such tasks by comparing the performance of radiologists and novices. Results show that the overall performance of radiologists is better than that of novices. Furthermore, the temporal location of lesions in CT image sequences, i.e., when a lesion appears in an image sequence, does not affect the performance of radiologists, whereas it does affect the performance of novices. The results indicate that novices have greater difficulty detecting a lesion that appears early rather than late in the image sequence. We suggest that radiologists have mechanisms, which novices do not, for detecting lesions in medical images with little attention. This ability is critically important when viewing rapid sequential presentations of multiple CT images, such as in stack viewing tasks. PMID:27774080

  10. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
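
    The time-series component of such a hybrid detector can be sketched as a per-pixel reference test. The following is a minimal illustration, not the paper's algorithm: the function name, array shapes, and the z-score threshold are assumptions. Each pixel of a new scene is compared against the mean and standard deviation of a stack of reference images for the same calendar month.

```python
import numpy as np

def rst_anomaly_map(scene, reference_stack, z_threshold=3.0):
    """Flag pixels whose radiance deviates from a per-pixel reference.

    scene           : 2-D array, the image under test
    reference_stack : 3-D array (n_images, H, W) of historical images
                      for the same calendar month
    Returns a boolean anomaly map (RST-style local z-score test).
    """
    mu = reference_stack.mean(axis=0)           # per-pixel reference mean
    sigma = reference_stack.std(axis=0) + 1e-9  # avoid division by zero
    z = (scene - mu) / sigma                    # standardised deviation
    return z > z_threshold
```

    The abstract's estimate that at least 80 images per calendar month are needed corresponds to making `mu` and `sigma` stable enough for this kind of test.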

  11. Using new edges for anomaly detection in computer networks

    DOEpatents

    Neil, Joshua Charles

    2015-05-19

    Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data of a frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.
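
    The path-scoring idea in the abstract, combining per-edge probabilities into a subgraph score and thresholding it against an empirical distribution of historical scores, can be sketched as follows. This is a hedged illustration; the function names and the negative-log-sum combination rule are assumptions, not taken from the patent.

```python
import math

def path_score(edge_probs):
    """Combine per-edge probabilities into one path score.

    Lower probabilities (rare or new edges) yield higher scores;
    summing negative log-probabilities is one simple way to combine
    independent edge probabilities into a subgraph score.
    """
    return sum(-math.log(p) for p in edge_probs)

def empirical_threshold(historical_scores, false_alarm_rate=0.01):
    """Pick the alert threshold as a high quantile of scores observed
    under historical (benign) conditions."""
    ranked = sorted(historical_scores)
    cut = int((1.0 - false_alarm_rate) * (len(ranked) - 1))
    return ranked[cut]
```

    A path is then flagged when `path_score(...)` exceeds the threshold calibrated on historical data.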

  12. Compendium of Anomaly Detection and Reaction Tools and Projects

    DTIC Science & Technology

    2000-05-17

    Trak Vendor Internet Tools, Inc. Type of Tool Network Monitor Description ID-Trak is an advanced network-based intrusion detection system developed to...download (or receive in e-mail) an individual attack signature that can be imported into the system and activated in real time. This does not require...Audit logs from monitored systems Network packets Reactions Alerts: at console (Director), e-mail, pager (from STVDB) Responses: disable user's

  13. [A Hyperspectral Imagery Anomaly Detection Algorithm Based on Gauss-Markov Model].

    PubMed

    Gao, Kun; Liu, Ying; Wang, Li-jing; Zhu, Zhen-yu; Cheng, Hao-bo

    2015-10-01

    With the development of spectral imaging technology, hyperspectral anomaly detection is increasingly widely used in remote sensing imagery processing. The traditional RX anomaly detection algorithm neglects the spatial correlation of images. It also fails to reduce the data dimensionality effectively, which incurs long processing times and poor performance on hyperspectral data. Hyperspectral images follow a Gauss-Markov Random Field (GMRF) model in the spatial and spectral dimensions. The inverse of the covariance matrix can be calculated directly from the Gauss-Markov parameters, which avoids a huge computation over the hyperspectral data. This paper proposes an improved RX anomaly detection algorithm based on a three-dimensional GMRF. The hyperspectral imagery data is modeled with the GMRF, and the GMRF parameters are estimated with the Approximated Maximum Likelihood method. The detection operator is constructed from the estimated GMRF parameters. The pixel under test is taken as the centre of a local optimization window, called the GMRF detection window. The anomaly score is calculated from the mean vector and inverse covariance matrix estimated within the window, and the image is processed pixel by pixel as the GMRF window moves. The traditional RX detection algorithm, a regional hypothesis detection algorithm based on GMRF, and the algorithm proposed in this paper are tested with AVIRIS hyperspectral data. Simulation results show that the proposed anomaly detection method improves detection efficiency and reduces the false alarm rate. We recorded the operation times of the three algorithms in the same computing environment; the results show that the proposed algorithm improves operation time by 45.2%, demonstrating good computational efficiency.
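
    For reference, the traditional RX operator that the paper improves upon scores each pixel by its Mahalanobis distance from the background. Below is a minimal global-RX sketch (illustrative only; the paper's GMRF-based detector replaces the explicit covariance inversion with a parametric construction from the Gauss-Markov parameters, and uses a local moving window rather than global statistics):

```python
import numpy as np

def rx_scores(cube):
    """Classic global RX anomaly scores for a hyperspectral cube.

    cube : array of shape (H, W, bands)
    Returns an (H, W) array of Mahalanobis distances from the global
    background mean; high scores mark spectral anomalies.
    """
    H, W, B = cube.shape
    X = cube.reshape(-1, B).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.pinv(cov)  # pseudo-inverse for numerical stability
    diff = X - mu
    # quadratic form (x - mu)^T Sigma^{-1} (x - mu) per pixel
    d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return d2.reshape(H, W)
```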

  14. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    SciTech Connect

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated into that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.

  15. Sequential rank law of signal detection on a background of Markov noise

    NASA Astrophysics Data System (ADS)

    Akimov, P. S.; Nedoluzhko, V. I.

    1985-04-01

    The paper examines a binary sequential truncated rank procedure for signal detection based on lower and upper bounds on the decision statistic. A method is presented for calculating the distribution of the number of observations for single-channel and multichannel detectors in the presence of Markov noise. The advantages of the proposed procedure over a single-threshold Neyman-Pearson procedure are indicated.

  16. Anomaly Detection in the Right Hemisphere: The Influence of Visuospatial Factors

    ERIC Educational Resources Information Center

    Smith, Stephen D.; Dixon, Michael J.; Tays, William J.; Bulman-Fleming, M. Barbara

    2004-01-01

    Previous research with both brain-damaged and neurologically intact populations has demonstrated that the right cerebral hemisphere (RH) is superior to the left cerebral hemisphere (LH) at detecting anomalies (or incongruities) in objects (Ramachandran, 1995; Smith, Tays, Dixon, & Bulman-Fleming, 2002). The current research assesses whether the RH…

  17. Dual Use Corrosion Inhibitor and Penetrant for Anomaly Detection in Neutron/X Radiography

    NASA Technical Reports Server (NTRS)

    Hall, Phillip B. (Inventor); Novak, Howard L. (Inventor)

    2004-01-01

    A dual purpose corrosion inhibitor and penetrant composition sensitive to radiography interrogation is provided. The corrosion inhibitor mitigates or eliminates corrosion on the surface of a substrate upon which the corrosion inhibitor is applied. In addition, the corrosion inhibitor provides for the attenuation of a signal used during radiography interrogation thereby providing for detection of anomalies on the surface of the substrate.

  18. A non-parametric approach to anomaly detection in hyperspectral images

    NASA Astrophysics Data System (ADS)

    Veracini, Tiziana; Matteoli, Stefania; Diani, Marco; Corsini, Giovanni; de Ceglie, Sergio U.

    2010-10-01

    In the past few years, spectral analysis of data collected by hyperspectral sensors aimed at automatic anomaly detection has become an interesting area of research. In this paper, we are interested in an Anomaly Detection (AD) scheme for hyperspectral images in which spectral anomalies are defined with respect to a statistical model of the background Probability Density Function (PDF). The characterization of the PDF of hyperspectral imagery is not trivial. We approach the background PDF estimation through the Parzen Windowing PDF estimator (PW). PW is a flexible and valuable tool for accurately modeling unknown PDFs in a non-parametric fashion. Although such an approach is well known and has been widely employed, its use within an AD scheme has not yet been investigated. In practice, the ability of PW to estimate PDFs is strongly influenced by the choice of the bandwidth matrix, which controls the degree of smoothing of the resulting PDF approximation. Here, a Bayesian approach is employed to carry out the bandwidth selection. The resulting estimated background PDF is then used to detect spectral anomalies within a detection scheme based on the Neyman-Pearson approach. Real hyperspectral imagery is used for an experimental evaluation of the proposed strategy.
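
    The scheme can be sketched in a few lines: estimate the background PDF with a Gaussian Parzen window, then flag test samples whose background likelihood falls below a threshold chosen for a target false-alarm rate, in the spirit of the Neyman-Pearson approach. This is a hedged illustration: the fixed scalar bandwidth below stands in for the paper's Bayesian bandwidth-matrix selection, and all names are assumptions.

```python
import numpy as np

def parzen_pdf(train, h):
    """Return a Parzen-window (Gaussian kernel) PDF estimate.

    train : (n, d) background samples; h : scalar bandwidth.
    """
    train = np.atleast_2d(train).astype(float)
    n, d = train.shape
    norm = n * (h * np.sqrt(2 * np.pi)) ** d

    def pdf(x):
        # pairwise differences between query points and training samples
        diff = (np.atleast_2d(x)[:, None, :] - train[None, :, :]) / h
        k = np.exp(-0.5 * (diff ** 2).sum(axis=2))
        return k.sum(axis=1) / norm
    return pdf

def detect_anomalies(pdf, samples, background, false_alarm_rate=0.05):
    """Neyman-Pearson-style test: flag samples whose background
    likelihood falls below a quantile of the background's own scores."""
    thr = np.quantile(pdf(background), false_alarm_rate)
    return pdf(samples) < thr
```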

  19. Using Machine Learning for Advanced Anomaly Detection and Classification

    NASA Astrophysics Data System (ADS)

    Lane, B.; Poole, M.; Camp, M.; Murray-Krezan, J.

    2016-09-01

    Machine Learning (ML) techniques have successfully been used in a wide variety of applications to automatically detect, and potentially classify, changes in activity or a series of activities by utilizing large amounts of data, sometimes even seemingly unrelated data. The amount of data being collected, processed, and stored in the Space Situational Awareness (SSA) domain has grown at an exponential rate and is now better suited for ML. This paper describes the development of advanced algorithms to deliver significant improvements in characterization of deep space objects and indication and warning (I&W) using a global network of telescopes that collect photometric data on a multitude of space-based objects. The Phase II Air Force Research Laboratory (AFRL) Small Business Innovative Research (SBIR) project Autonomous Characterization Algorithms for Change Detection and Characterization (ACDC), contracted to ExoAnalytic Solutions Inc., provides the ability to detect and identify photometric signature changes due to potential space object changes (e.g. stability, tumble rate, aspect ratio) and to correlate observed changes with potential behavioral changes using a variety of techniques, including supervised learning. Furthermore, these algorithms run in real time on data being collected and processed by the ExoAnalytic Space Operations Center (EspOC), providing timely alerts and warnings while dynamically creating collection requirements for the EspOC for the algorithms that generate higher fidelity I&W. This paper discusses the recently implemented ACDC algorithms, including the general design approach and results to date. The usage of supervised algorithms, such as Support Vector Machines, Neural Networks, k-Nearest Neighbors, etc., and unsupervised algorithms, such as k-means, Principal Component Analysis, Hierarchical Clustering, etc., and the implementations of these algorithms are explored. Results of applying these algorithms to EspOC data both in an off

  20. Radio signal anomalies detected with MEXART in 2012 during the recovery phase of geomagnetic storms

    NASA Astrophysics Data System (ADS)

    Carrillo-Vargas, Armando; Pérez-Enríquez, Román; López-Montes, Rebeca; Rodríguez-Martínez, Mario; Ugalde-Calvillo, Luis Gerardo

    2016-11-01

    In this work we present MEXART observations in 2012 of 17 radio sources in which we detected anomalies in the radio signals occurring during the recovery phase of some geomagnetic storms. We performed FFT and wavelet analyses of the radio signals during these periods and found that, rather than interplanetary scintillation (IPS), the anomalies seem to originate in the ionosphere, especially because of the frequencies at which they are observed. We discuss these results in light of the view that the source of the geomagnetic storm is no longer in the interplanetary medium.

  1. Magnetic anomaly detection (MAD) of ferromagnetic pipelines using principal component analysis (PCA)

    NASA Astrophysics Data System (ADS)

    Sheinker, Arie; Moldwin, Mark B.

    2016-04-01

    The magnetic anomaly detection (MAD) method is used for detection of visually obscured ferromagnetic objects. The method exploits the magnetic field originating from the ferromagnetic object, which constitutes an anomaly in the ambient earth’s magnetic field. Traditionally, MAD is used to detect objects with a magnetic field of a dipole structure, where far from the object it can be considered as a point source. In the present work, we expand MAD to the case of a non-dipole source, i.e. a ferromagnetic pipeline. We use principal component analysis (PCA) to calculate the principal components, which are then employed to construct an effective detector. Experiments conducted in our lab with real-world data validate the above analysis. The simplicity, low computational complexity, and the high detection rate make the proposed detector attractive for real-time, low power applications.

  2. Graph Learning for Anomaly Detection using Psychological Context GLAD-PC

    DTIC Science & Technology

    2015-08-03

    Juan Liu, Bob Price, Jianqiang Shen, Akshay Patil, Richard Chow, Eugene Bart, Nicolas Ducheneaut. Proactive insider threat detection through graph...Price, Oliver Brdiczka, Eugene Bart. Multi-source fusion for anomaly detection: using across-domain and across-time peer-group consistency checks...0.25 Mudita Singhal 0.10 Eugene Bart 0.10 Bob Price 0.10 2.25 7 ...... ...... Sub Contractors (DD882) Inventions (DD882) Sub Contractor Numbers (c

  3. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
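
    As a baseline for the similarity measure discussed above, the LCS and its normalized form can be computed with the standard dynamic program. This is a sketch of the textbook O(mn) algorithm, not the paper's faster hybrid; the 2·LCS/(m+n) normalization is one common convention and an assumption here.

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of sequences a and b,
    via the standard O(len(a)*len(b)) dynamic program, keeping only
    two rows of the table."""
    prev = [0] * (len(b) + 1)
    for x in a:
        cur = [0]
        for j, y in enumerate(b, 1):
            cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[-1]))
        prev = cur
    return prev[-1]

def normalized_lcs(a, b):
    """Normalized LCS similarity in [0, 1]: identical sequences score
    1.0, sequences with no common symbols score 0.0."""
    if not a and not b:
        return 1.0
    return 2.0 * lcs_length(a, b) / (len(a) + len(b))
```

    Sequences whose normalized similarity to every cluster is low are the outlier candidates passed to the second-stage analysis.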

  4. HPNAIDM: The High-Performance Network Anomaly/Intrusion Detection and Mitigation System

    SciTech Connect

    Chen, Yan

    2013-12-05

    Identifying traffic anomalies and attacks rapidly and accurately is critical for large network operators. With the rapid growth of network bandwidth, such as the next generation DOE UltraScience Network, and the fast emergence of new attacks/viruses/worms, existing network intrusion detection systems (IDS) are insufficient because they: • Are mostly host-based and not scalable to high-performance networks; • Are mostly signature-based and unable to adaptively recognize flow-level unknown attacks; • Cannot differentiate malicious events from unintentional anomalies. To address these challenges, we proposed and developed a new paradigm called the high-performance network anomaly/intrusion detection and mitigation (HPNAIDM) system. The new paradigm is significantly different from existing IDSes, with the following features (research thrusts): • Online traffic recording and analysis on high-speed networks; • Online adaptive flow-level anomaly/intrusion detection and mitigation; • An integrated approach for false positive reduction. Our research prototype and evaluation demonstrate that the HPNAIDM system is highly effective and economically feasible; it not only met but significantly exceeded the pre-set goals (see details in the next section). Overall, our project produced 23 publications (2 book chapters, 6 journal papers and 15 peer-reviewed conference/workshop papers). In addition, we built a website for technique dissemination, which hosts two system prototype releases for the research community. We also filed a patent application and developed strong international and domestic collaborations spanning both academia and industry.

  5. Towards spatial localisation of harmful algal blooms; statistics-based spatial anomaly detection

    NASA Astrophysics Data System (ADS)

    Shutler, J. D.; Grant, M. G.; Miller, P. I.

    2005-10-01

    Harmful algal blooms are believed to be increasing in occurrence, and their toxins can be concentrated by filter-feeding shellfish and cause amnesia or paralysis when ingested. As a result, fisheries and beaches in the vicinity of blooms may need to be closed and the local population informed. For such avoidance planning, timely information on the existence of a bloom, its species, and an accurate map of its extent is needed. Current research to detect these blooms from space has mainly concentrated on spectral approaches to determining species. We present a novel statistics-based background-subtraction technique that produces improved descriptions of an anomaly's extent from remotely-sensed ocean colour data. This is achieved by extracting bulk information from a background model, complemented by a computer-vision ramp-filtering technique to specifically detect the perimeter of the anomaly. The complete extraction technique uses temporal-variance estimates which control the subtraction of the scene of interest from the time-weighted background estimate, producing confidence maps of anomaly extent. Through the variance estimates the method learns the noise present in the data sequence, providing robustness and allowing generic application. Further, the use of the median for the background model reduces the effect of anomalies that appear within the time sequence used to generate it, allowing seasonal variations in the background levels to be closely followed. To illustrate the detection algorithm's application, it has been applied to two spectrally different oceanic regions.
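
    The core of the extraction technique, a temporal-median background model with variance-scaled subtraction producing a confidence map, can be sketched as follows. This is a minimal illustration: the function name, the scale factor k, the equal (rather than time-weighted) treatment of the history, and the omission of the ramp-filtering stage are all assumptions.

```python
import numpy as np

def anomaly_confidence(scene, history, k=3.0):
    """Statistics-based background subtraction for a scene time series.

    history : (n, H, W) stack of earlier scenes
    The temporal median serves as the background model (robust to
    anomalies inside the history), and the temporal spread scales the
    subtraction, yielding a per-pixel confidence map of anomaly extent
    (values above ~1 indicate a likely anomaly).
    """
    background = np.median(history, axis=0)
    sigma = history.std(axis=0) + 1e-9  # learned noise level per pixel
    return (scene - background) / (k * sigma)
```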

  6. Structural Anomaly Detection Using Fiber Optic Sensors and Inverse Finite Element Method

    NASA Technical Reports Server (NTRS)

    Quach, Cuong C.; Vazquez, Sixto L.; Tessler, Alex; Moore, Jason P.; Cooper, Eric G.; Spangler, Jan. L.

    2005-01-01

    NASA Langley Research Center is investigating a variety of techniques for mitigating aircraft accidents due to structural component failure. One technique under consideration combines distributed fiber optic strain sensing with an inverse finite element method for detecting and characterizing structural anomalies that may provide early indication of airframe structure degradation. The technique identifies structural anomalies that result in observable changes in localized strain but do not impact the overall surface shape. Surface shape information is provided by an Inverse Finite Element Method that computes full-field displacements and internal loads using strain data from in-situ fiber optic sensors. This paper describes a prototype of such a system and reports results from a series of laboratory tests conducted on a test coupon subjected to increasing levels of damage.

  7. Radiation detection method and system using the sequential probability ratio test

    DOEpatents

    Nelson, Karl E.; Valentine, John D.; Beauchamp, Brock R.

    2007-07-17

    A method and system using the Sequential Probability Ratio Test to enhance the detection of an elevated level of radiation, by determining whether a set of observations is consistent with a specified model within given bounds of statistical significance. In particular, the SPRT is used in the present invention to maximize the range of detection by providing processing mechanisms for estimating the dynamic background radiation, adjusting the models to reflect the amount of background knowledge at the current point in time, analyzing the current sample using the models to determine statistical significance, and determining when the sample has returned to the expected background conditions.
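
    The SPRT itself is compact. Below is a hedged sketch for a stream of Poisson counts; the invention's dynamic background estimation and model adjustment are omitted, and both rates are assumed known here.

```python
import math

def sprt_poisson(counts, bg_rate, src_rate, alpha=0.001, beta=0.001):
    """Sequential probability ratio test on a stream of Poisson counts.

    H0: counts ~ Poisson(bg_rate); H1: counts ~ Poisson(src_rate).
    alpha, beta are the target false-alarm and miss probabilities.
    Returns ('H1', n) or ('H0', n) as soon as the cumulative
    log-likelihood ratio crosses Wald's boundaries, else ('continue', n).
    """
    upper = math.log((1 - beta) / alpha)   # accept H1 above this
    lower = math.log(beta / (1 - alpha))   # accept H0 below this
    llr = 0.0
    for n, c in enumerate(counts, 1):
        # Poisson log-likelihood ratio for one observation
        llr += c * math.log(src_rate / bg_rate) - (src_rate - bg_rate)
        if llr >= upper:
            return 'H1', n
        if llr <= lower:
            return 'H0', n
    return 'continue', len(counts)
```

    Unlike a fixed-sample Neyman-Pearson test, the SPRT stops as soon as the evidence is decisive, which is what maximizes detection range for a given false alarm rate.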

  8. Damage diagnosis algorithm using a sequential change point detection method with an unknown distribution for damage

    NASA Astrophysics Data System (ADS)

    Noh, Hae Young; Rajagopal, Ram; Kiremidjian, Anne S.

    2012-04-01

    This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method for the cases where the post-damage feature distribution is unknown a priori. This algorithm extracts features from structural vibration data using time-series analysis and then declares damage using the change point detection method. The change point detection method asymptotically minimizes detection delay for a given false alarm rate. The conventional method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori. Therefore, our algorithm estimates and updates this distribution as data are collected using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using multiple sets of simulated data and a set of experimental data collected from a four-story steel special moment-resisting frame. Our algorithm was able to estimate the post-damage distribution consistently and resulted in detection delays only a few seconds longer than the delays from the conventional method that assumes we know the post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
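
    The flavor of such a sequential change point test can be conveyed with a one-sided CUSUM sketch for an upward mean shift. This is illustrative only: the paper's algorithm estimates the unknown post-damage distribution online (by maximum likelihood or Bayesian updating), whereas this sketch assumes a known target shift.

```python
def cusum_detect(data, mu0, sigma, shift=1.0, threshold=5.0):
    """One-sided CUSUM test for an upward mean shift.

    data      : stream of feature values
    mu0, sigma: known pre-change mean and standard deviation
    shift     : assumed post-change shift in units of sigma
    Returns the index at which the statistic first crosses the
    threshold (declare damage), or None if no change is detected.
    """
    k = shift / 2.0  # reference value for the assumed shift
    s = 0.0
    for i, x in enumerate(data):
        z = (x - mu0) / sigma
        s = max(0.0, s + z - k)  # cumulative evidence of an upward shift
        if s > threshold:
            return i
    return None
```

    The threshold trades detection delay against false alarm rate, the same trade-off the paper's asymptotically optimal procedure minimizes.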

  9. A scalable architecture for online anomaly detection of WLCG batch jobs

    NASA Astrophysics Data System (ADS)

    Kuehn, E.; Fischer, M.; Giffels, M.; Jung, C.; Petzold, A.

    2016-10-01

    For data centres it is increasingly important to monitor network usage and learn from network usage patterns. In particular, configuration issues or misbehaving batch jobs that prevent smooth operation need to be detected as early as possible. At the GridKa data and computing centre we therefore operate a tool, BPNetMon, for monitoring traffic data and characteristics of WLCG batch jobs and pilots locally on different worker nodes. On the one hand, local information by itself is not sufficient to detect anomalies, for several reasons: e.g. the underlying job distribution on a single worker node might change, or there might be a local misconfiguration. On the other hand, a centralised anomaly detection approach does not scale with regard to network communication or computational cost. We therefore propose a scalable architecture based on concepts of a super-peer network.

  10. Application of Artificial Bee Colony algorithm in TEC seismo-ionospheric anomalies detection

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2015-09-01

    In this study, the efficiency of the Artificial Bee Colony (ABC) algorithm is investigated for detecting TEC (Total Electron Content) seismo-ionospheric anomalies around the time of several strong earthquakes, including Chile (27 February 2010; 01 April 2014), Varzeghan (11 August 2012), Saravan (16 April 2013) and Papua New Guinea (29 March 2015). In comparison with other anomaly detection algorithms, ABC has a number of advantages: (1) detection of discordant patterns in large nonlinear data sets in a short time, (2) simplicity, (3) fewer control parameters, and (4) efficiency in solving multimodal and multidimensional optimization problems. The results of this study also support the TEC time series as a robust earthquake precursor.

  11. Beyond Trisomy 21: Additional Chromosomal Anomalies Detected through Routine Aneuploidy Screening

    PubMed Central

    Metcalfe, Amy; Hippman, Catriona; Pastuck, Melanie; Johnson, Jo-Ann

    2014-01-01

    Prenatal screening is often misconstrued by patients as screening for trisomy 21 alone; however, other chromosomal anomalies are often detected. This study aimed to systematically review the literature and use diagnostic meta-analysis to derive pooled detection and false positive rates for aneuploidies other than trisomy 21 with different prenatal screening tests. Non-invasive prenatal testing had the highest detection (DR) and lowest false positive (FPR) rates for trisomy 13 (DR: 90.3%; FPR: 0.2%), trisomy 18 (DR: 98.1%; FPR: 0.2%), and 45,X (DR: 92.2%; FPR: 0.1%); however, most estimates came from high-risk samples. The first trimester combined test also had high DRs for all conditions studied (trisomy 13 DR: 83.1%; FPR: 4.4%; trisomy 18 DR: 91.9%; FPR: 3.5%; 45,X DR: 70.1%; FPR: 5.4%; triploidy DR: 100%; FPR: 6.3%). Second trimester triple screening had the lowest DRs and highest FPRs for all conditions (trisomy 13 DR: 43.9%; FPR: 8.1%; trisomy 18 DR: 70.5%; FPR: 3.3%; 45,X DR: 77.2%; FPR: 9.3%). Prenatal screening tests differ in their ability to accurately detect chromosomal anomalies. Patients should be counseled about the ability of prenatal screening to detect anomalies other than trisomy 21 prior to undergoing screening. PMID:26237381

  12. Anomaly Detection in Gamma-Ray Vehicle Spectra with Principal Components Analysis and Mahalanobis Distances

    SciTech Connect

    Tardiff, Mark F.; Runkle, Robert C.; Anderson, K. K.; Smith, L. E.

    2006-01-23

    The goal of primary radiation monitoring in support of routine screening and emergency response is to detect characteristics in vehicle radiation signatures that indicate the presence of potential threats. Two conceptual approaches to analyzing gamma-ray spectra for threat detection are isotope identification and anomaly detection. While isotope identification is the time-honored method, an emerging technique is anomaly detection, which uses benign vehicle gamma-ray signatures to define an expectation of the radiation signature for vehicles that do not pose a threat. Newly acquired spectra are then compared to this expectation using statistical criteria that reflect acceptable false alarm rates and probabilities of detection. The gamma-ray spectra analyzed here were collected at a U.S. land Port of Entry (POE) using a NaI-based radiation portal monitor (RPM). The raw data were analyzed to develop a benign vehicle expectation by decimating the original pulse-height channels to 35 energy bins, extracting composite variables via principal components analysis (PCA), and estimating statistically weighted distances from the mean vehicle spectrum with the Mahalanobis distance (MD) metric. This paper reviews the methods used to establish the anomaly identification criteria and presents a systematic analysis of the response of the combined PCA and MD algorithm to modeled mono-energetic gamma-ray sources.
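
    The combined PCA-and-MD pipeline described above can be sketched directly: fit principal components on benign spectra, project new spectra into the reduced space, and score them by Mahalanobis distance from the benign mean. This is a hedged illustration; the number of components and the simple SVD-based fit are assumptions, not the paper's settings.

```python
import numpy as np

def fit_pca_md(benign, n_components=5):
    """Fit PCA on benign spectra and return a scorer that computes the
    Mahalanobis distance of new spectra in the reduced space.

    benign : (n_vehicles, n_bins) array of benign spectra.
    """
    X = np.asarray(benign, float)
    mu = X.mean(axis=0)
    Xc = X - mu
    # principal components via SVD of the centred benign data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:n_components].T                 # (n_bins, k) projection
    scores = Xc @ W
    cov_inv = np.linalg.pinv(np.cov(scores, rowvar=False))

    def mahalanobis(spectrum):
        # project onto the benign components, then take the weighted distance
        z = (np.asarray(spectrum, float) - mu) @ W
        return float(np.sqrt(z @ cov_inv @ z))
    return mahalanobis
```

    A new spectrum is flagged as anomalous when its score exceeds a threshold set from the distribution of benign-vehicle scores at the chosen false alarm rate.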

  13. A new approach for structural health monitoring by applying anomaly detection on strain sensor data

    NASA Astrophysics Data System (ADS)

    Trichias, Konstantinos; Pijpers, Richard; Meeuwissen, Erik

    2014-03-01

    Structural Health Monitoring (SHM) systems help to monitor critical infrastructures (bridges, tunnels, etc.) remotely and provide up-to-date information about their physical condition. In addition, they help to predict a structure's life and required maintenance in a cost-efficient way. Typically, inspection data gives insight into structural health. The global structural behavior, and predominantly the structural loading, is generally measured with vibration and strain sensors. Acoustic emission sensors are increasingly used for measuring global crack activity near critical locations. In this paper, we present a procedure for local structural health monitoring by applying Anomaly Detection (AD) to strain sensor data from sensors applied in the expected crack path. Sensor data is analyzed by automatic anomaly detection in order to find crack activity at an early stage. This approach targets the monitoring of critical structural locations, such as welds, near which strain sensors can be applied during construction, and/or locations with limited inspection possibilities during structural operation. We investigate several anomaly detection techniques to detect changes in statistical properties indicating structural degradation. The most effective one is a novel polynomial-fitting technique, which tracks slow changes in sensor data. Our approach has been tested on a representative test structure (bridge deck) in a lab environment, under constant and variable amplitude fatigue loading. In both cases, the evolving cracks at the monitored locations were successfully detected, autonomously, by our AD monitoring tool.

  14. Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets

    PubMed Central

    Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin

    2017-01-01

    A road traffic anomaly denotes a road segment that is anomalous in terms of vehicle traffic flow. Detecting road traffic anomalies from GPS (Global Positioning System) snippet data is becoming critical in urban computing since such anomalies often suggest underlying events. However, the noisy and sparse nature of GPS snippet data introduces multiple problems that make detecting road traffic anomalies very challenging. To address these issues, we propose a two-stage solution which consists of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference, incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large GPS snippets via a tensor decomposition technique. The RAT model then calculates the anomalous degree for each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one month of GPS location data from more than eight thousand taxicabs in Beijing. The evaluation results show the advantages of our method over other baseline techniques. PMID:28282948

  15. Millimeter Wave Detection of Localized Anomalies in the Space Shuttle External Fuel Tank Insulating Foam

    NASA Technical Reports Server (NTRS)

    Kharkovsky, S.; Case, J. T.; Abou-Khousa, M. A.; Zoughi, R.; Hepburn, F.

    2006-01-01

    The Space Shuttle Columbia's catastrophic accident emphasizes the growing need for developing and applying effective, robust and life-cycle-oriented nondestructive testing (NDT) methods for inspecting the shuttle external fuel tank spray-on foam insulation (SOFI). Millimeter wave NDT techniques were among the methods chosen for evaluation of their potential for inspecting these structures. Several panels with embedded anomalies (mainly voids) were produced and tested for this purpose. Near-field and far-field millimeter wave NDT methods were used for producing images of the anomalies in these panels. This paper presents the results of an investigation for the purpose of detecting localized anomalies in several SOFI panels. To this end, reflectometers at a relatively wide range of frequencies (Ka-band (26.5-40 GHz) to W-band (75-110 GHz)) and utilizing different types of radiators were employed. The resulting raw images revealed a significant amount of information about the interior of these panels. However, using simple image processing techniques, the results were further improved, in particular as relates to detecting the smaller anomalies. This paper presents the results of this investigation and a discussion of these results.

  16. Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets.

    PubMed

    Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin

    2017-03-09

    Road traffic anomaly denotes a road segment that is anomalous in terms of the traffic flow of vehicles. Detecting road traffic anomalies from GPS (Global Positioning System) snippet data is becoming critical in urban computing, since such anomalies often suggest underlying events. However, the noisy and sparse nature of GPS snippet data introduces multiple problems that make the detection of road traffic anomalies very challenging. To address these issues, we propose a two-stage solution consisting of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference, incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large volumes of GPS snippets via a tensor decomposition technique. The RAT model then calculates the anomalous degree of each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one month of GPS location data from more than eight thousand taxicabs in Beijing. The evaluation results show the advantages of our method over other baseline techniques.

  17. Detecting errors and anomalies in computerized materials control and accountability databases

    SciTech Connect

    Whiteson, R.; Hench, K.; Yarbro, T.; Baumgart, C.

    1998-12-31

    The Automated MC and A Database Assessment project is aimed at improving anomaly and error detection in materials control and accountability (MC and A) databases and increasing confidence in the data that they contain. Anomalous data resulting in poor categorization of nuclear material inventories greatly reduces the value of the database information to users. Therefore it is essential that MC and A data be assessed periodically for anomalies or errors. Anomaly detection can identify errors in databases and thus provide assurance of the integrity of data. An expert system has been developed at Los Alamos National Laboratory that examines these large databases for anomalous or erroneous data. For several years, MC and A subject matter experts at Los Alamos have been using this automated system to examine the large amounts of accountability data that the Los Alamos Plutonium Facility generates. These data are collected and managed by the Material Accountability and Safeguards System, a near-real-time computerized nuclear material accountability and safeguards system. This year they have expanded the user base, customizing the anomaly detector for the varying requirements of different groups of users. This paper describes the progress in customizing the expert systems to the needs of the users of the data and reports on their results.

  18. Conformal prediction for anomaly detection and collision alert in space surveillance

    NASA Astrophysics Data System (ADS)

    Chen, Huimin; Chen, Genshe; Blasch, Erik; Pham, Khanh

    2013-05-01

    Anomaly detection has been considered as an important technique for detecting critical events in a wide range of data rich applications where a majority of the data is inconsequential and/or uninteresting. We study the detection of anomalous behaviors among space objects using the theory of conformal prediction for distribution-independent on-line learning to provide collision alerts with a desirable confidence level. We exploit the fact that conformal predictors provide valid forecasted sets at specified confidence levels under the relatively weak assumption that the normal training data, together with the normal testing data, are generated from the same distribution. If the actual observation is not included in the conformal prediction set, it is classified as anomalous at the corresponding significance level. Interpreting the significance level as an upper bound of the probability that a normal observation is mistakenly classified as anomalous, we can conveniently adjust the sensitivity to anomalies while controlling the false alarm rate without having to find the application specific threshold. The proposed conformal prediction method was evaluated for a space surveillance application using the open source North American Aerospace Defense Command (NORAD) catalog data. The validity of the prediction sets is justified by the empirical error rate that matches the significance level. In addition, experiments with simulated anomalous data indicate that anomaly detection sensitivity with conformal prediction is superior to that of the existing methods in declaring potential collision events.
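    The conformal mechanism described above can be sketched in a few lines: compute a nonconformity score for a new observation, compare it with the scores of the training data, and flag the observation if its conformal p-value falls at or below the chosen significance level. The distance-to-mean score and one-dimensional synthetic data below are illustrative stand-ins, not the paper's space-surveillance setup.

```python
import numpy as np

def conformal_p_value(train, x):
    """Conformal p-value of a new observation x, using distance to the
    training mean as a simple nonconformity measure. A small p-value
    means x conforms poorly to the training data."""
    mu = train.mean()
    scores = np.abs(train - mu)        # nonconformity of training points
    new_score = abs(x - mu)
    # fraction of scores at least as extreme as the new one (x included)
    return (np.sum(scores >= new_score) + 1) / (len(train) + 1)

rng = np.random.default_rng(1)
train = rng.normal(0.0, 1.0, 999)
alpha = 0.05  # significance level = upper bound on the false alarm rate
print(conformal_p_value(train, 0.1))   # typical point: large p-value
print(conformal_p_value(train, 6.0))   # far outlier: flagged at alpha
```

    As the abstract notes, interpreting alpha as the probability of mistakenly flagging a normal observation lets the analyst tune sensitivity directly, without hunting for an application-specific score threshold.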

  19. [Multi-DSP parallel processing technique of hyperspectral RX anomaly detection].

    PubMed

    Guo, Wen-Ji; Zeng, Xiao-Ru; Zhao, Bao-Wei; Ming, Xing; Zhang, Gui-Feng; Lü, Qun-Bo

    2014-05-01

    To satisfy the requirements of high speed, real-time operation, and mass data storage for RX anomaly detection on hyperspectral image data, the present paper proposes a multi-DSP parallel processing system for hyperspectral imagery based on the CPCI Express standard bus architecture. The hardware topology of the system combines the tight coupling of four DSPs sharing a data bus and memory unit with the interconnection of link ports. On this hardware platform, by assigning a parallel processing task to each DSP in consideration of the spectral RX anomaly detection algorithm and the 3D structure of hyperspectral image data, a four-DSP parallel processing technique is proposed that computes the mean vector and covariance matrix of the whole image by spatially partitioning it. The experimental results show that, with equivalent detection performance, the proposed four-DSP parallel RX technique achieves roughly four times the time efficiency of a single-DSP implementation, overcoming the constraint that a DSP's internal storage capacity places on processing huge image data while meeting the demands of real-time processing of spectral data.
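    Independently of the DSP hardware, the core RX computation that gets parallelized (an image-wide mean and covariance, then a Mahalanobis distance per pixel spectrum) can be sketched serially as follows; the cube dimensions and implanted anomaly are illustrative.

```python
import numpy as np

def rx_scores(cube):
    """Global RX detector: squared Mahalanobis distance of each pixel
    spectrum from the image-wide mean under the sample covariance.
    `cube` has shape (rows, cols, bands)."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    d = X - X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    scores = np.einsum('ij,jk,ik->i', d, cov_inv, d)
    return scores.reshape(rows, cols)

rng = np.random.default_rng(2)
cube = rng.normal(0.0, 1.0, (32, 32, 8))
cube[10, 10] += 8.0                     # implant a spectral anomaly
scores = rx_scores(cube)
r, c = np.unravel_index(scores.argmax(), scores.shape)
print(r, c)                             # location of the implanted anomaly
```

    The mean and covariance are the global reductions that a multi-DSP design computes over spatial partitions before the per-pixel scoring, which is embarrassingly parallel.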

  20. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.

    2015-10-01

    We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is also confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.
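    The similarity score at the heart of this approach is simple to compute; a small sketch with a noisy sine cycle as the "normal" shape and an orthogonal shape as the anomaly, both stand-ins for the paper's electrocardiogram cycles, with an illustrative noise level.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two shape vectors."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

rng = np.random.default_rng(3)
x = np.linspace(0.0, 2.0 * np.pi, 256)
reference = np.sin(x)                       # reference "shape"
sigma = 0.5                                 # measurement noise level

# Under noise, the score of a normal cycle concentrates around a value set
# by the signal-to-noise ratio, which is why the detection threshold should
# adapt to sigma rather than stay fixed.
normal = reference + rng.normal(0.0, sigma, 256)
anomalous = np.cos(x) + rng.normal(0.0, sigma, 256)  # different shape
sim_normal = cosine_similarity(reference, normal)
sim_anomalous = cosine_similarity(reference, anomalous)
print(sim_normal, sim_anomalous)
```

    The paper's contribution is the analytical form of the score distribution as a function of the noise level alone, from which the adaptive threshold follows directly.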

  1. Capacitance probe for detection of anomalies in non-metallic plastic pipe

    DOEpatents

    Mathur, Mahendra P.; Spenik, James L.; Condon, Christopher M.; Anderson, Rodney; Driscoll, Daniel J.; Fincham, Jr., William L.; Monazam, Esmail R.

    2010-11-23

    The disclosure relates to analysis of materials using a capacitive sensor to detect anomalies through comparison of measured capacitances. The capacitive sensor is used in conjunction with a capacitance measurement device, a location device, and a processor in order to generate a capacitance versus location output which may be inspected for the detection and localization of anomalies within the material under test. The components may be carried as payload on an inspection vehicle which may traverse through a pipe interior, allowing evaluation of nonmetallic or plastic pipes when the piping exterior is not accessible. In an embodiment, supporting components are solid-state devices powered by a low voltage on-board power supply, providing for use in environments where voltage levels may be restricted.

  2. GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection

    SciTech Connect

    Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D; Reed, Joel W; Goodall, John R

    2016-01-01

    This paper introduces a novel graph-analytic approach for detecting anomalies in network flow data called GraphPrints. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets (small induced subgraphs that describe local topology). By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified, and furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level and 0.05% at the IP level, with 100% true positive rates at both.
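    The graphlet-counting step can be illustrated with two of the smallest graphlets, wedges and triangles; a pure-Python sketch (GraphPrints itself counts a richer family of induced subgraphs):

```python
from itertools import combinations

def graphlet_counts(edges, nodes):
    """Count two small graphlets, wedges (paths of length 2) and
    triangles, in an undirected graph given as an edge list. One such
    count vector per time slice feeds the outlier detection stage."""
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    wedges = sum(len(adj[n]) * (len(adj[n]) - 1) // 2 for n in nodes)
    triangles = sum(1 for u, v, w in combinations(sorted(nodes), 3)
                    if v in adj[u] and w in adj[u] and w in adj[v])
    return wedges, triangles

# A "normal" traffic slice (a path) vs. an anomalous one (a clique forms).
nodes = list(range(5))
path = [(0, 1), (1, 2), (2, 3), (3, 4)]
clique = list(combinations(nodes, 2))
print(graphlet_counts(path, nodes))    # -> (3, 0)
print(graphlet_counts(clique, nodes))  # -> (30, 10)
```

    A sudden jump in such a count vector, relative to the counts of preceding time slices, is the kind of signal the outlier-detection stage looks for.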

  3. Sequential structural damage diagnosis algorithm using a change point detection method

    NASA Astrophysics Data System (ADS)

    Noh, H.; Rajagopal, R.; Kiremidjian, A. S.

    2013-11-01

    This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method. The general change point detection method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori, unless we are looking for a known specific type of damage. Therefore, we introduce an additional algorithm that estimates and updates this distribution as data are collected, using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computational load and memory requirements associated with the estimation. The algorithm is validated using a set of experimental data collected from a four-story steel special moment-resisting frame and multiple sets of simulated data. Various features of different dimensions have been explored, and the algorithm was able to identify damage with low false alarm rates, particularly when using multidimensional damage-sensitive features and a known post-damage feature distribution. For cases with an unknown feature distribution, the post-damage distribution was consistently estimated, and the detection delays were only a few time steps longer than the delays of the general method that assumes the post-damage feature distribution is known. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirements, while the maximum likelihood method provides an insightful heuristic approach.
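    When both the pre- and post-damage feature distributions are known, the sequential hypothesis test can be realized, for example, as a CUSUM on the log-likelihood ratio; a sketch for a Gaussian mean shift, with illustrative distributions and threshold rather than the paper's actual features.

```python
import random

def cusum_detect(stream, mu0, mu1, sigma, threshold=10.0):
    """Sequential change detection for a Gaussian mean shift with known
    pre-damage N(mu0, sigma) and post-damage N(mu1, sigma) feature
    distributions. Returns the first index at which the cumulative
    log-likelihood ratio statistic crosses the threshold, else None."""
    s = 0.0
    for i, x in enumerate(stream):
        # log-likelihood ratio of one observation: post- vs. pre-damage
        llr = ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        s = max(0.0, s + llr)
        if s > threshold:
            return i
    return None

random.seed(4)
pre = [random.gauss(0.0, 1.0) for _ in range(200)]   # undamaged features
post = [random.gauss(2.0, 1.0) for _ in range(50)]   # damaged features
alarm = cusum_detect(pre + post, mu0=0.0, mu1=2.0, sigma=1.0)
print(alarm)  # alarm shortly after the change at index 200
```

    The paper's extension replaces the fixed post-damage parameters with estimates that are updated (by maximum likelihood or Bayesian updating) as data arrive.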

  4. Improving Non-Linear Approaches to Anomaly Detection, Class Separation, and Visualization

    DTIC Science & Technology

    2014-12-26

    Dissertation presented to the Faculty, Graduate School of Engineering and Management, Air Force Institute of... Accepted 4 Dec 2014: Adedeji B. Badiru, Ph.D., Dean, Graduate School of Engineering and Management. Report no. AFIT-ENS-DS-14-D-15. ...existing non-linear techniques are investigated for the purposes of providing better, timely class separation and improved anomaly detection on various

  5. Anomaly Detection for Data Reduction in an Unattended Ground Sensor (UGS) Field

    DTIC Science & Technology

    2014-09-01

    This report describes the design and implementation of a data reduction technique for video sensors that are part of a larger unattended ground sensor (UGS) network. The data reduction technique is based on anomaly detection in full-motion video and subsequent statistical analysis techniques that allow the... The framework integrates super-resolution, contrast, and deblur research algorithms as well as the Force Protection Surveillance System (FPSS), a full-motion...

  6. Can we detect regional methane anomalies? A comparison between three observing systems

    NASA Astrophysics Data System (ADS)

    Cressot, Cindy; Pison, Isabelle; Rayner, Peter J.; Bousquet, Philippe; Fortems-Cheiney, Audrey; Chevallier, Frédéric

    2016-07-01

    A Bayesian inversion system is used to evaluate the capability of the current global surface network and of the space-borne GOSAT/TANSO-FTS and IASI instruments to quantify surface flux anomalies of methane at various spatial (global, semi-hemispheric and regional) and time (seasonal, yearly, 3-yearly) scales. The evaluation is based on a signal-to-noise ratio analysis, the signal being the methane fluxes inferred from the surface-based inversion from 2000 to 2011 and the noise (i.e., precision) of each of the three observing systems being computed from the Bayesian equation. At the global and semi-hemispheric scales, all observing systems detect flux anomalies at most of the tested timescales. At the regional scale, some seasonal flux anomalies are detected by the three observing systems, but year-to-year anomalies and longer-term trends are only poorly detected. Moreover, the reliably detected regions depend on the reference surface-based inversion used as the signal. Indeed, tropical flux inter-annual variability, for instance, can be attributed mostly to Africa in the reference inversion or spread between tropical regions in Africa and America. Our results show that inter-annual analyses of methane emissions inferred by atmospheric inversions should always include an uncertainty assessment, and that attributing current trends in atmospheric methane to particular regions needs increased effort, for instance, gathering more observations (in the future) and improving transport models. At all scales, GOSAT generally shows the best performance of the three observing systems.

  7. A new morphological anomaly detection algorithm for hyperspectral images and its GPU implementation

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2011-10-01

    Anomaly detection is considered a very important task for hyperspectral data exploitation. It is now routinely applied in many application domains, including defence and intelligence, public safety, precision agriculture, geology, or forestry. Many of these applications require timely responses for swift decisions that depend upon high-performance algorithm analysis. However, with the recent explosion in the amount and dimensionality of hyperspectral imagery, this problem calls for the incorporation of parallel computing techniques. In the past, clusters of computers have offered an attractive solution for fast anomaly detection in hyperspectral data sets already transmitted to Earth. However, these systems are expensive and difficult to adapt to on-board data processing scenarios, in which low-weight and low-power integrated components are essential to reduce mission payload and obtain analysis results in (near) real-time, i.e., at the same time as the data is collected by the sensor. An exciting new development in the field of commodity computing is the emergence of commodity graphics processing units (GPUs), which can now bridge the gap towards on-board processing of remotely sensed hyperspectral data. In this paper, we develop a new morphological algorithm for anomaly detection in hyperspectral images along with an efficient GPU implementation of the algorithm. The algorithm is implemented on latest-generation GPU architectures and evaluated against other anomaly detection algorithms using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex. The proposed GPU implementation achieves real-time performance in the considered case study.

  8. Parallel implementation of RX anomaly detection on multi-core processors: impact of data partitioning strategies

    NASA Astrophysics Data System (ADS)

    Molero, Jose M.; Garzón, Ester M.; García, Inmaculada; Plaza, Antonio

    2011-11-01

    Anomaly detection is an important task for remotely sensed hyperspectral data exploitation. One of the most widely used and successful algorithms for anomaly detection in hyperspectral images is the Reed-Xiaoli (RX) algorithm. Despite its wide acceptance, and despite the high computational complexity of applying it to real hyperspectral scenes, few documented parallel implementations of this algorithm exist, in particular for multi-core processors. The advantage of multi-core platforms over other specialized parallel architectures is that they are a low-power, inexpensive, widely available and well-known technology. A critical issue in the parallel implementation of RX is the sample covariance matrix calculation, which can be approached in a global or local fashion. This aspect is crucial for the RX implementation, since the choice of a local or global strategy for the computation of the sample covariance matrix is expected to affect both the scalability of the parallel solution and the anomaly detection results. In this paper, we develop new parallel implementations of RX for multi-core processors and specifically investigate the impact of different data partitioning strategies when parallelizing its computations. For this purpose, we consider both global and local data partitioning strategies in the spatial domain of the scene, and further analyze their scalability on different multi-core platforms. The numerical effectiveness of the considered solutions is evaluated using receiver operating characteristic (ROC) curves, analyzing their capacity to detect thermal hot spots (anomalies) in hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer system over the World Trade Center in New York, five days after the terrorist attacks of September 11th, 2001.
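    The global vs. local covariance strategies at issue here can be mimicked in a few lines, serially, with row blocks standing in for the per-core data partitions; the synthetic scene and block count are illustrative assumptions.

```python
import numpy as np

def rx(X, mu, cov_inv):
    """Squared Mahalanobis RX score for each row (pixel spectrum) of X."""
    d = X - mu
    return np.einsum('ij,jk,ik->i', d, cov_inv, d)

rng = np.random.default_rng(7)
scene = rng.normal(0.0, 1.0, (64, 64, 6)).reshape(-1, 6)

# Global strategy: one mean/covariance estimated from the whole scene.
g_scores = rx(scene, scene.mean(axis=0),
              np.linalg.inv(np.cov(scene, rowvar=False)))

# Local strategy: independent statistics per spatial partition
# (4 row blocks), as each worker would compute in a parallel version.
l_scores = np.concatenate([
    rx(block, block.mean(axis=0),
       np.linalg.inv(np.cov(block, rowvar=False)))
    for block in np.array_split(scene, 4)])

print(g_scores.shape, l_scores.shape)  # same pixels, different statistics
```

    The local strategy avoids a scene-wide reduction (better scalability) but scores each pixel against partition-level statistics, which is why the two strategies can yield different detection results, as the abstract notes.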

  9. Fetal Central Nervous System Anomalies Detected by Magnetic Resonance Imaging: A Two-Year Experience

    PubMed Central

    Sefidbakht, Sepideh; Dehghani, Sakineh; Safari, Maryam; Vafaei, Homeira; Kasraeian, Maryam

    2016-01-01

    Background Magnetic resonance imaging (MRI) is gradually becoming more common for thorough visualization of the fetus than ultrasound (US), especially for neurological anomalies, which are the most common indications for fetal MRI and are a matter of concern for both families and society. Objectives We investigated fetal MRIs carried out in our center for the frequency of central nervous system anomalies. This is the first such report in southern Iran. Materials and Methods One hundred and seven (107) pregnant women with suspected fetal anomalies on prenatal ultrasound entered a cross-sectional retrospective study from 2011 to 2013. A 1.5 T Siemens Avanto scanner was employed for sequences including T2 HASTE and Trufisp images in axial, coronal, and sagittal planes relative to the mother's body, T2 HASTE and Trufisp images relative to the specific fetal body part being evaluated, and T1 flash images in at least one plane based on clinical indication. We investigated any abnormality in the central nervous system and performed descriptive analysis to obtain frequencies. Results Mean gestational age ± standard deviation (SD) for fetuses was 25.54 ± 5.22 weeks, and mean maternal age ± SD was 28.38 ± 5.80 years. Eighty out of 107 (74.7%) patients were referred with an initial impression of borderline ventriculomegaly. A total of 18 out of 107 (16.82%) patients were found to have fetuses with CNS anomalies, and the remainder were neurologically normal. Detected anomalies were as follows: 3 (16.6%) fetuses each had the Dandy-Walker variant and Arnold-Chiari II (with myelomeningocele). Complete agenesis of the corpus callosum, partial agenesis of the corpus callosum, and aqueductal stenosis were each seen in 2 (11.1%) fetuses. Arnold-Chiari II without myelomeningocele, anterior spina bifida associated with neurenteric cyst, arachnoid cyst, lissencephaly, and isolated enlarged cisterna magna each presented in one (5.5%) fetus. One fetus had concomitant schizencephaly and complete agenesis of

  10. Developing a new, passive diffusion sampling array to detect helium anomalies associated with volcanic unrest

    USGS Publications Warehouse

    Dame, Brittany E; Solomon, D Kip; Evans, William C.; Ingebritsen, Steven E.

    2015-01-01

    Helium (He) concentration and 3He/4He anomalies in soil gas and spring water are potentially powerful tools for investigating hydrothermal circulation associated with volcanism and could perhaps serve as part of a hazards warning system. However, in operational practice, He and other gases are often sampled only after volcanic unrest is detected by other means. A new passive diffusion sampler suite, intended to be collected after the onset of unrest, has been developed and tested as a relatively low-cost method of determining He-isotope composition pre- and post-unrest. The samplers, each with a distinct equilibration time, passively record He concentration and isotope ratio in springs and soil gas. Once collected and analyzed, the He concentrations in the samplers are used to deconvolve the time history of the He concentration and the 3He/4He ratio at the collection site. The current suite consisting of three samplers is sufficient to deconvolve both the magnitude and the timing of a step change in in situ concentration if the suite is collected within 100 h of the change. The effects of temperature and prolonged deployment on the suite's capability of recording He anomalies have also been evaluated. The suite has captured a significant 3He/4He soil gas anomaly at Horseshoe Lake near Mammoth Lakes, California. The passive diffusion sampler suite appears to be an accurate and affordable alternative for determining He anomalies associated with volcanic unrest.

  11. Developing a new, passive diffusion sampler suite to detect helium anomalies associated with volcanic unrest

    NASA Astrophysics Data System (ADS)

    Dame, Brittany E.; Solomon, D. Kip; Evans, William C.; Ingebritsen, Steven E.

    2015-03-01

    Helium (He) concentration and 3He/4He anomalies in soil gas and spring water are potentially powerful tools for investigating hydrothermal circulation associated with volcanism and could perhaps serve as part of a hazards warning system. However, in operational practice, He and other gases are often sampled only after volcanic unrest is detected by other means. A new passive diffusion sampler suite, intended to be collected after the onset of unrest, has been developed and tested as a relatively low-cost method of determining He-isotope composition pre- and post-unrest. The samplers, each with a distinct equilibration time, passively record He concentration and isotope ratio in springs and soil gas. Once collected and analyzed, the He concentrations in the samplers are used to deconvolve the time history of the He concentration and the 3He/4He ratio at the collection site. The current suite consisting of three samplers is sufficient to deconvolve both the magnitude and the timing of a step change in in situ concentration if the suite is collected within 100 h of the change. The effects of temperature and prolonged deployment on the suite's capability of recording He anomalies have also been evaluated. The suite has captured a significant 3He/4He soil gas anomaly at Horseshoe Lake near Mammoth Lakes, California. The passive diffusion sampler suite appears to be an accurate and affordable alternative for determining He anomalies associated with volcanic unrest.

  12. Clusters versus GPUs for Parallel Target and Anomaly Detection in Hyperspectral Images

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2010-12-01

    Remotely sensed hyperspectral sensors provide image data containing rich information in both the spatial and the spectral domain, and this information can be used to address detection tasks in many applications. In many surveillance applications, the size of the objects (targets) searched for constitutes a very small fraction of the total search area and the spectral signatures associated to the targets are generally different from those of the background, hence the targets can be seen as anomalies. In hyperspectral imaging, many algorithms have been proposed for automatic target and anomaly detection. Given the dimensionality of hyperspectral scenes, these techniques can be time-consuming and difficult to apply in applications requiring real-time performance. In this paper, we develop several new parallel implementations of automatic target and anomaly detection algorithms. The proposed parallel algorithms are quantitatively evaluated using hyperspectral data collected by the NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) system over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex.

  13. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
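    The residual-monitoring idea reduces to comparing sensed and model-predicted outputs and thresholding the difference; a toy sketch with a hypothetical model output and a seeded sensor bias (the signals, threshold, and calibration span are illustrative, not the engine model of the paper).

```python
import numpy as np

def residual_flags(sensed, predicted, k=4.0, calib=100):
    """Flag samples whose residual (sensed minus model-predicted output)
    exceeds k standard deviations of the nominal residual, estimated
    from an initial fault-free calibration span."""
    residuals = sensed - predicted
    sigma = residuals[:calib].std()      # nominal residual spread
    return np.abs(residuals) > k * sigma

rng = np.random.default_rng(5)
t = np.linspace(0.0, 10.0, 500)
predicted = np.sin(t)                             # model-predicted output
sensed = predicted + rng.normal(0.0, 0.02, 500)   # nominal sensor noise
sensed[400:] += 0.5                               # seeded fault: sensor bias
flags = residual_flags(sensed, predicted)
print(flags[:400].sum(), flags[400:].sum())       # few vs. many alarms
```

    The sketch makes the paper's central point concrete: detection quality hinges on how accurately the model predicts nominal behavior, since any model error inflates the nominal residual spread and dulls the threshold.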

  14. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  15. Detection and Origin of Hydrocarbon Seepage Anomalies in the Barents Sea

    NASA Astrophysics Data System (ADS)

    Polteau, Stephane; Planke, Sverre; Stolze, Lina; Kjølhamar, Bent E.; Myklebust, Reidun

    2016-04-01

    We have collected more than 450 gravity cores in the Barents Sea to detect hydrocarbon seepage anomalies and for seismic-stratigraphic tie. The cores are from the Hoop Area (125 samples) and from the Barents Sea SE (293 samples). In addition, we have collected cores near seven exploration wells. The samples were analyzed using three different analytical methods; (1) the standard organic geochemical analyzes of Applied Petroleum Technologies (APT), (2) the Amplified Geochemical Imaging (AGI) method, and (3) the Microbial Prospecting for Oil and Gas (MPOG) method. These analytical approaches can detect trace amounts of thermogenic hydrocarbons in the sediment samples, and may provide additional information about the fluid phases and the depositional environment, maturation, and age of the source rocks. However, hydrocarbon anomalies in seabed sediments may also be related to shallow sources, such as biogenic gas or reworked source rocks in the sediments. To better understand the origin of the hydrocarbon anomalies in the Barents Sea we have studied 35 samples collected approximately 200 m away from seven exploration wells. The wells included three boreholes associated with oil discoveries, two with gas discoveries, one dry well with gas shows, and one dry well. In general, the results of this case study reveal that the oil wells have an oil signature, gas wells show a gas signature, and dry wells have a background signature. However, differences in results from the three methods may occur and have largely been explained in terms of analytical measurement ranges, method sensitivities, and bio-geochemical processes in the seabed sediments. The standard geochemical method applied by APT relies on measuring the abundance of compounds between C1 to C5 in the headspace gas and between C11 to C36 in the sediment extracts. The anomalies detected in the sediment samples from this study were in the C16 to C30 range. Since the organic matter yields were mostly very low, the

  16. A healthcare utilization analysis framework for hot spotting and contextual anomaly detection.

    PubMed

    Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram

    2012-01-01

    Patient medical records today contain vast amounts of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify, in a timely manner, heavy users of the system and their patterns of utilization so that targeted intervention programs can be instituted; and anomaly detection: the ability to identify anomalous utilization cases, where patients incurred levels of utilization that are unexpected given their clinical characteristics, which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease-specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map a patient's clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. In future work we plan to incorporate additional sources of observational data, including EMRs and disease registries, and develop analytics models to leverage temporal relationships among
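    The contextual anomaly detection component described above can be sketched in a few lines: fit a model mapping a clinical characteristic to expected utilization, then flag patients whose actual utilization deviates strongly from the prediction. The single-predictor least-squares model, the 2-sigma residual threshold, and the toy data below are illustrative assumptions, not the authors' actual models:

    ```python
    def fit_line(xs, ys):
        """Ordinary least squares for one clinical characteristic (e.g. a comorbidity score)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
        return my - b * mx, b          # intercept, slope

    def contextual_anomalies(xs, ys, z_thresh=2.0):
        """Flag cases whose actual utilization deviates from the expected level
        by more than z_thresh residual standard deviations."""
        a, b = fit_line(xs, ys)
        resid = [y - (a + b * x) for x, y in zip(xs, ys)]
        sd = (sum(r * r for r in resid) / (len(resid) - 1)) ** 0.5
        return [abs(r) / sd > z_thresh for r in resid]
    ```

    A patient whose visit count far exceeds the level predicted from their characteristics is flagged, while high utilizers whose usage is explained by those characteristics are not.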

  17. A Healthcare Utilization Analysis Framework for Hot Spotting and Contextual Anomaly Detection

    PubMed Central

    Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram

    2012-01-01

    Patient medical records today contain vast amounts of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify, in a timely manner, heavy users of the system and their patterns of utilization so that targeted intervention programs can be instituted; and anomaly detection: the ability to identify anomalous utilization cases, where patients incurred levels of utilization that are unexpected given their clinical characteristics, which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease-specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map a patient’s clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. In future work we plan to incorporate additional sources of observational data, including EMRs and disease registries, and develop analytics models to leverage temporal relationships among

  18. A sequential enzymatic microreactor system for ethanol detection of gasohol mixtures.

    PubMed

    Alhadeff, Eliana M; Salgado, Andréa M; Pereira, Nei; Valdman, Belkis

    2005-01-01

    A sequential enzymatic double-microreactor system with a dilution line was developed for quantifying ethanol in gasohol mixtures using a colorimetric detection method, as a new proposal relative to the single-microreactor system used in previous work. Alcohol oxidase (AOD) and horseradish peroxidase (HRP) immobilized on glass beads, one in each microreactor, were used with phenol and 4-aminophenazone, and the red-colored product was detected with a spectrophotometer at 555 nm. Good results were obtained with the immobilization technique used for both the AOD and HRP enzymes, with best retention efficiencies of 95.3 ± 2.3% and 63.2 ± 7.0%, respectively. The two microreactors were used to analyze extracted ethanol from gasohol blends in the range 1-30% v/v (10.0-238.9 g ethanol/L), with and without an on-line dilution sampling line. A calibration curve was obtained in the range 0.0034-0.087 g ethanol/L working with the on-line dilution integrated into the proposed biosensor FIA system. The diluted sample concentrations were also determined by gas chromatography (GC) and high-pressure liquid chromatography (HPLC), and the results were compared with the proposed sequential system's measurements. The effect of the number of analyses performed with the same system was also investigated.

  19. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    NASA Astrophysics Data System (ADS)

    Chen, Szi-Wen

    2006-12-01

    A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for the analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For each recording, a number of overlapping windowed ECG data segments were obtained by shifting a 5-second window by a step of 1 second. During the windowing process, the complexity measure (CM) value was calculated for each windowed segment, and the task of pattern recognition was then performed sequentially by the SHT procedure. A preliminary test conducted using the database produced an optimal overall predictive accuracy of [InlineEquation not available: see fulltext]. The algorithm was also implemented on a commercial embedded DSP controller, permitting a hardware realization of real-time ventricular arrhythmia detection.
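    Sequential hypothesis testing of the kind applied to the complexity-measure sequence can be illustrated with a classical Wald SPRT between two Gaussian hypotheses; the means, standard deviation, and error rates below are illustrative assumptions, not the paper's parameters:

    ```python
    import math

    def sprt(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
        """Wald sequential probability ratio test between Gaussian means
        mu0 (H0, e.g. normal rhythm) and mu1 (H1, e.g. VF/VT).
        Returns the decision and the number of samples consumed."""
        upper = math.log((1 - beta) / alpha)     # accept H1 above this
        lower = math.log(beta / (1 - alpha))     # accept H0 below this
        llr = 0.0
        for i, x in enumerate(samples, 1):
            # Gaussian log-likelihood ratio increment log f1(x)/f0(x)
            llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
            if llr >= upper:
                return "H1", i
            if llr <= lower:
                return "H0", i
        return "undecided", len(samples)
    ```

    The two thresholds approximately guarantee the stated false-alarm and miss rates, and the test typically decides after far fewer samples than a fixed-size test would need.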

  20. Anomaly detection in hyperspectral imagery based on low-rank and sparse decomposition

    NASA Astrophysics Data System (ADS)

    Cui, Xiaoguang; Tian, Yuan; Weng, Lubin; Yang, Yiping

    2014-01-01

    This paper presents a novel low-rank and sparse decomposition (LSD) based model for anomaly detection in hyperspectral images. In our model, a local image region is represented as a low-rank matrix plus sparse noise in the spectral space, where the background is explained by the low-rank matrix and the anomalies are indicated by the sparse noise. The detection of anomalies in local image regions is formulated as a constrained LSD problem, which can be solved efficiently and robustly with a modified "Go Decomposition" (GoDec) method. To enhance the validity of this model, we adapt a "simple linear iterative clustering" (SLIC) superpixel algorithm to efficiently generate homogeneous local image regions, i.e., superpixels, in hyperspectral imagery, thus ensuring that the background in local image regions satisfies the low-rank condition. Experimental results on real hyperspectral data demonstrate that, compared with several known local detectors, including the RX detector, kernel RX detector, and SVDD detector, the proposed model achieves better performance in satisfactory computation time.
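    A minimal sketch of GoDec-style alternation between a low-rank fit and sparse hard thresholding is shown below; the rank, cardinality, iteration count, and synthetic data are illustrative assumptions, and the paper's constrained formulation and superpixel segmentation are omitted:

    ```python
    import numpy as np

    def lsd_godec(X, rank=1, card=5, iters=20):
        """Alternate a truncated-SVD low-rank fit (background) with hard
        thresholding that keeps the `card` largest residuals (anomalies)."""
        L = np.zeros_like(X)
        S = np.zeros_like(X)
        for _ in range(iters):
            # low-rank step: best rank-`rank` fit to the sparse-corrected data
            U, sv, Vt = np.linalg.svd(X - S, full_matrices=False)
            L = (U[:, :rank] * sv[:rank]) @ Vt[:rank]
            # sparse step: keep only the largest-magnitude residual entries
            resid = X - L
            S = np.zeros_like(X)
            idx = np.unravel_index(np.argsort(np.abs(resid), axis=None)[-card:], X.shape)
            S[idx] = resid[idx]
        return L, S
    ```

    With a background that is genuinely low-rank, the residual mass concentrates on the injected anomalies after a few alternations, so S localizes them while L recovers the background.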

  1. Model-Based Anomaly Detection for a Transparent Optical Transmission System

    NASA Astrophysics Data System (ADS)

    Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.

    In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.

  2. Clairvoyant fusion detection of ocean anomalies in WorldView-2 spectral imagery

    NASA Astrophysics Data System (ADS)

    Schaum, Alan; Allman, Eric; Stites, Matthew

    2016-09-01

    For every possible mixture of clouds and ocean in WorldView-2 8-band data, we construct an anomaly detector (called a "clairvoyant" because we never know which mixture is appropriate in any given pixel). Then we combine these using a fusion technique. The usual method of deriving an analytic expression describing the envelope of all the clairvoyants' decision boundaries is not possible. Instead, we compute the intersections of infinitesimally close boundaries generated by differential changes in the mixing fraction. When glued together, these 6-dimensional hyperstrings constitute the desired 7-dimensional decision boundary of the fused anomaly detector. However, no closed-form solution exists for the fused result. Therefore, we construct an approximation to the fused detection boundary by first flattening the strings into 6-dimensional hyperplanes and then gluing them together à la 3D printing.

  3. Radiation anomaly detection algorithms for field-acquired gamma energy spectra

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen

    2015-08-01

    The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of the threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign radioactive materials in the form of naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work despite a strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and relative merits of these different algorithms will be discussed and demonstrated.

  4. Small-scale anomaly detection in panoramic imaging using neural models of low-level vision

    NASA Astrophysics Data System (ADS)

    Casey, Matthew C.; Hickman, Duncan L.; Pavlou, Athanasios; Sadler, James R. E.

    2011-06-01

    Our understanding of sensory processing in animals has reached the stage where we can exploit neurobiological principles in commercial systems. In human vision, one brain structure that offers insight into how we might detect anomalies in real-time imaging is the superior colliculus (SC). The SC is a small structure that rapidly orients our eyes to a movement, sound or touch that it detects, even when the stimulus is small in scale; think of a camouflaged movement or the rustle of leaves. This automatic orientation allows us to prioritize the use of our eyes to raise awareness of a potential threat, such as a predator approaching stealthily. In this paper we describe the application of a neural network model of the SC to the detection of anomalies in panoramic imaging. The neural approach consists of a mosaic of topographic maps that are each trained using competitive Hebbian learning to rapidly detect image features of a pre-defined shape and scale. What makes this approach interesting is the ability of the competition between neurons to automatically filter noise, yet with the capability of generalizing the desired shape and scale. We present the results of this technique applied to the real-time detection of obscured targets in visible-band panoramic CCTV images. Using background subtraction to highlight potential movement, the technique is able to correctly identify targets as little as 3 pixels wide while filtering small-scale noise.

  5. Data-Driven Anomaly Detection Performance for the Ares I-X Ground Diagnostic Prototype

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Schwabacher, Mark A.; Matthews, Bryan L.

    2010-01-01

    In this paper, we will assess the performance of a data-driven anomaly detection algorithm, the Inductive Monitoring System (IMS), which can be used to detect simulated Thrust Vector Control (TVC) system failures. However, the ability of IMS to detect these failures in a true operational setting may be related to the realistic nature of how they are simulated. As such, we will investigate both a low fidelity and high fidelity approach to simulating such failures, with the latter based upon the underlying physics. Furthermore, the ability of IMS to detect anomalies that were previously unknown and not previously simulated will be studied in earnest, as well as apparent deficiencies or misapplications that result from using the data-driven paradigm. Our conclusions indicate that robust detection performance of simulated failures using IMS is not appreciably affected by the use of a high fidelity simulation. However, we have found that the inclusion of a data-driven algorithm such as IMS into a suite of deployable health management technologies does add significant value.

  6. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems.

    PubMed

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks, in which attackers inject spam user profiles to disturb the list of recommended items. Two characteristics are common to all types of shilling attacks: 1) item abnormality: the rating of target items is always the maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need to be further improved. To solve these problems, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first dynamically partitions item-rating time series based on important points. Then, the chi-square distribution (χ²) is used to detect abnormal intervals. Experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes.
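    The interval test described above can be sketched with a plain Pearson chi-square statistic over per-interval rating histograms; the 5-star bins, the baseline proportions, and the df = 4 critical value (9.488 at the 0.05 level) are illustrative assumptions, not the authors' exact procedure:

    ```python
    def chi_square_stat(observed, expected):
        """Pearson chi-square statistic over rating-count bins."""
        return sum((o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0)

    def flag_abnormal_intervals(interval_hists, baseline, critical=9.488):
        """Flag intervals whose 1..5-star rating histogram deviates from a
        benign baseline distribution (critical value: chi-square, df=4, p=0.05)."""
        flags = []
        for hist in interval_hists:
            n = sum(hist)
            expected = [p * n for p in baseline]
            flags.append(chi_square_stat(hist, expected) > critical)
        return flags
    ```

    An interval that roughly matches the baseline passes, while an attack interval consisting almost entirely of maximum (or minimum) ratings produces a large statistic and is flagged.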

  7. Real Time Detection of Anomalies in Streaming Radar and Rain Gauge Data

    NASA Astrophysics Data System (ADS)

    Hill, D. J.; Minsker, B.; Amir, E.; Choi, J.

    2008-12-01

    Radar-rainfall data are being used in an increasing number of real-time applications because of their wide spatial and temporal coverage. Because of uncertainties in radar measurements and in the relationship between radar measurements and rainfall on the ground, radar-rainfall data are often combined with rain gauge data to improve their accuracy. While rain gauges can provide accurate estimates of rainfall, their data are sometimes subject to a number of errors caused by the environment in which the gauges are deployed. This study develops a method for automatically detecting anomalies (i.e., data that deviate markedly from historical patterns) in both radar and rain gauge data through integration and modeling of data from these two different sources. These anomalous data can be caused by sensor or data transmission errors or by infrequent system behaviors that may be of interest to the scientific or public safety communities. The method employs a Dynamic Bayesian Network to assimilate data from multiple rain gauges and weather radar (NEXRAD) into an uncertain model of the current rainfall. Filtering (e.g., Kalman filtering) can then be used to infer the likelihood that a particular gauge measurement is anomalous; measurements with a high likelihood of being anomalous are classified as such. The method performs fast, incremental evaluation of data as they become available, scales to large quantities of data, and requires no a priori information regarding process variables or types of anomalies that may be encountered. The performance of the anomaly detector is demonstrated using a precipitation sensor network composed of a NEXRAD weather radar and several near-real-time telemetered rain gauges deployed by the USGS in Chicago. The results indicate that the method performs well at identifying anomalous data caused by a real sensor failure.
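    The filtering step can be illustrated with a one-dimensional random-walk Kalman filter that flags measurements whose innovation is improbably large; the process and measurement variances and the 3-sigma threshold below are illustrative assumptions, not the study's Dynamic Bayesian Network:

    ```python
    def kalman_anomaly_flags(measurements, q=0.01, r=0.1, z_thresh=3.0):
        """1-D random-walk Kalman filter; flag measurements whose innovation
        exceeds z_thresh standard deviations of the innovation variance.

        q: process-noise variance, r: measurement-noise variance (assumed values).
        """
        x, p = measurements[0], 1.0       # initialize state at the first reading
        flags = [False]
        for z in measurements[1:]:
            p_pred = p + q                # predict: state carries over, variance grows
            s = p_pred + r                # innovation variance
            innov = z - x
            if abs(innov) / s ** 0.5 > z_thresh:
                flags.append(True)
                p = p_pred                # skip the update for anomalous readings
                continue
            flags.append(False)
            k = p_pred / s                # Kalman gain
            x += k * innov
            p = (1.0 - k) * p_pred
        return flags
    ```

    Flagged measurements are excluded from the state update, so a single spurious spike does not drag the rainfall estimate and cause the following good readings to be flagged as well.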

  8. Sequential change detection and monitoring of temporal trends in random-effects meta-analysis.

    PubMed

    Dogo, Samson Henry; Clark, Allan; Kulinskaya, Elena

    2016-12-08

    Temporal changes in the magnitude of effect sizes reported in many areas of research are a threat to the credibility of the results and conclusions of meta-analysis. Numerous sequential methods for meta-analysis have been proposed to detect changes and monitor trends in effect sizes so that a meta-analysis can be updated when necessary and interpreted based on the time it was conducted. The difficulties of sequential meta-analysis under the random-effects model are caused by dependencies in increments introduced by the estimation of the heterogeneity parameter τ². In this paper, we propose the use of a retrospective cumulative sum (CUSUM)-type test with bootstrap critical values. This method allows retrospective analysis of the past trajectory of cumulative effects in random-effects meta-analysis and its visualization on a chart similar to a CUSUM chart. Simulation results show that the new method demonstrates good control of Type I error regardless of the number or size of the studies and the amount of heterogeneity. Application of the new method is illustrated on two examples of medical meta-analyses. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
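    The retrospective CUSUM idea can be sketched as follows; this toy version standardizes the maximum cumulative deviation of effect sizes from their mean, and omits the bootstrap critical values and the τ² estimation that the paper addresses:

    ```python
    def cusum_statistic(effects):
        """Standardized maximum cumulative deviation of effect sizes from their mean.

        Returns (statistic, path); the path is the CUSUM trajectory that would be
        plotted on a CUSUM-style chart.
        """
        n = len(effects)
        mean = sum(effects) / n
        sd = (sum((x - mean) ** 2 for x in effects) / (n - 1)) ** 0.5
        s, path = 0.0, []
        for x in effects:
            s += x - mean                 # cumulative sum of residuals
            path.append(s)
        return max(abs(v) for v in path) / (sd * n ** 0.5), path
    ```

    A stable sequence of effect sizes keeps the trajectory near zero, while a shift in effect magnitude part-way through the sequence produces a large excursion and hence a large statistic; in the paper, bootstrap critical values calibrate what counts as "large".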

  9. Advanced Unsupervised Classification Methods to Detect Anomalies on Earthen Levees Using Polarimetric SAR Imagery.

    PubMed

    Marapareddy, Ramakalavathi; Aanstoos, James V; Younan, Nicolas H

    2016-06-16

    Fully polarimetric Synthetic Aperture Radar (polSAR) data analysis has wide applications for terrain and ground cover classification. The dynamics of surface and subsurface water events can lead to slope instability resulting in slough slides on earthen levees. Early detection of these anomalies by a remote sensing approach could save time versus direct assessment. We used L-band Synthetic Aperture Radar (SAR) to screen levees for anomalies. SAR technology, due to its high spatial resolution and soil penetration capability, is a good choice for identifying problematic areas on earthen levees. Using the parameters entropy (H), anisotropy (A), alpha (α), and eigenvalues (λ, λ₁, λ₂, and λ₃), we implemented several unsupervised classification algorithms for the identification of anomalies on the levee. The classification techniques applied are H/α, H/A, A/α, Wishart H/α, Wishart H/A/α, and H/α/λ classification algorithms. In this work, the effectiveness of the algorithms was demonstrated using quad-polarimetric L-band SAR imagery from the NASA Jet Propulsion Laboratory's (JPL's) Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR). The study area is a section of the lower Mississippi River valley in the Southern USA, where earthen flood control levees are maintained by the US Army Corps of Engineers.

  10. Advanced Unsupervised Classification Methods to Detect Anomalies on Earthen Levees Using Polarimetric SAR Imagery

    PubMed Central

    Marapareddy, Ramakalavathi; Aanstoos, James V.; Younan, Nicolas H.

    2016-01-01

    Fully polarimetric Synthetic Aperture Radar (polSAR) data analysis has wide applications for terrain and ground cover classification. The dynamics of surface and subsurface water events can lead to slope instability resulting in slough slides on earthen levees. Early detection of these anomalies by a remote sensing approach could save time versus direct assessment. We used L-band Synthetic Aperture Radar (SAR) to screen levees for anomalies. SAR technology, due to its high spatial resolution and soil penetration capability, is a good choice for identifying problematic areas on earthen levees. Using the parameters entropy (H), anisotropy (A), alpha (α), and eigenvalues (λ, λ1, λ2, and λ3), we implemented several unsupervised classification algorithms for the identification of anomalies on the levee. The classification techniques applied are H/α, H/A, A/α, Wishart H/α, Wishart H/A/α, and H/α/λ classification algorithms. In this work, the effectiveness of the algorithms was demonstrated using quad-polarimetric L-band SAR imagery from the NASA Jet Propulsion Laboratory’s (JPL’s) Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR). The study area is a section of the lower Mississippi River valley in the Southern USA, where earthen flood control levees are maintained by the US Army Corps of Engineers. PMID:27322270

  11. GPU implementation of target and anomaly detection algorithms for remotely sensed hyperspectral image analysis

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2010-08-01

    Automatic target and anomaly detection are considered very important tasks for hyperspectral data exploitation. These techniques are now routinely applied in many application domains, including defence and intelligence, public safety, precision agriculture, geology, or forestry. Many of these applications require timely responses for swift decisions which depend upon high computing performance of algorithm analysis. However, with the recent explosion in the amount and dimensionality of hyperspectral imagery, this problem calls for the incorporation of parallel computing techniques. In the past, clusters of computers have offered an attractive solution for fast anomaly and target detection in hyperspectral data sets already transmitted to Earth. However, these systems are expensive and difficult to adapt to on-board data processing scenarios, in which low-weight and low-power integrated components are essential to reduce mission payload and obtain analysis results in (near) real-time, i.e., at the same time as the data is collected by the sensor. An exciting new development in the field of commodity computing is the emergence of commodity graphics processing units (GPUs), which can now bridge the gap towards on-board processing of remotely sensed hyperspectral data. In this paper, we describe several new GPU-based implementations of target and anomaly detection algorithms for hyperspectral data exploitation. The parallel algorithms are implemented on latest-generation Tesla C1060 GPU architectures, and quantitatively evaluated using hyperspectral data collected by NASA's AVIRIS system over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex.

  12. Binding at birth: the newborn brain detects identity relations and sequential position in speech.

    PubMed

    Gervain, Judit; Berent, Iris; Werker, Janet F

    2012-03-01

    Breaking the linguistic code requires the extraction of at least two types of information from the speech signal: the relations between linguistic units and their sequential position. Furthermore, these different types of information need to be integrated into a coherent representation of language structure. The brain networks responsible for these abilities are well known in adults, but not in young infants. Our results show that the neural architecture underlying these abilities is operational at birth. In three optical imaging studies, we found that the newborn brain detects identity relations, as evidenced by enhanced activation in the bilateral superior temporal and left inferior frontal regions. More importantly, the newborn brain can also determine whether such identity relations hold for the initial or final positions of speech sequences, as indicated by increased activity in the inferior frontal regions, possibly Broca's area. This implies that the neural foundations of language acquisition are in place from birth.

  13. Sequential injection setup for capillary isoelectric focusing combined with MS detection.

    PubMed

    Páger, Csilla; Dörnyei, Agnes; Kilár, Ferenc

    2011-07-01

    Capillary isoelectric focusing in the presence of electroosmosis, with sequential injection of carrier ampholytes and sample, was found to be suitable for MS detection. The separate injection of the sample and the ampholytes provides good conditions to suppress and overcome the undesirable effect of the presence of ampholytes in MS. By appropriate selection of ampholyte solutions, whose pH range does not necessarily cover the pI values of the analytes, the migration of the components can be controlled and the impact of the ampholytes on MS detection decreased. The unique applicability of this setup is shown by testing several parameters, such as the application of volatile electrolyte solutions, the type of the ampholytes, and the order and number of the ampholyte and sample zones. Broad and narrow pH range ampholytes were applied in experiments using uncoated capillaries of different lengths for the analyses of substituted nitrophenol dyes to achieve optimal conditions for MS detection. Although the sample components do not leave the pH gradient, the ionisation is more effective for MS detection, due to the decrease in the ampholyte concentration at the position of the components and because the sample components migrate in a charged state.

  14. Building robust neighborhoods for manifold learning-based image classification and anomaly detection

    NASA Astrophysics Data System (ADS)

    Doster, Timothy; Olson, Colin C.

    2016-05-01

    We exploit manifold learning algorithms to perform image classification and anomaly detection in complex scenes involving hyperspectral land cover and broadband IR maritime data. The results of standard manifold learning techniques are improved by including spatial information. This is accomplished by creating super-pixels which are robust to affine transformations inherent in natural scenes. We utilize techniques from harmonic analysis and image processing, namely, rotation, skew, flip, and shift operators to develop a more representational graph structure which defines the data-dependent manifold.

  15. Range-invariant anomaly detection applied to imaging Fourier transform spectrometry data

    NASA Astrophysics Data System (ADS)

    Borel, Christoph; Rosario, Dalton; Romano, Joao

    2012-09-01

    This paper describes the end-to-end processing of image Fourier transform spectrometry data taken of surrogate tank targets at Picatinny Arsenal in New Jersey with the long-wave hyper-spectral camera HyperCam from Telops. The first part of the paper discusses the processing from raw data to calibrated radiance and emissivity data. The second part discusses the application of a range-invariant anomaly detection approach to calibrated radiance, emissivity and brightness temperature data for different spatial resolutions and compares it to the Reed-Xiaoli detector.

  16. Utilization of Electrical Impedance Tomography to Detect Internal Anomalies in Southern Pine Logs

    NASA Astrophysics Data System (ADS)

    Steele, Philip; Cooper, Jerome

    2006-03-01

    A large body of research has shown that knowledge of internal defect location in logs prior to sawing has the potential to significantly increase lumber value yield. This paper describes a relatively low-capital log scanning technique based on Electrical Impedance Tomography (EIT) to image anomalies interior to sawlogs. Static testing results showed that knots, juvenile and compression wood internal to logs can be detected. Although resolution is lower than that of CT and NMR technologies, the low cost of this EIT application should render it competitive.

  17. Molecular Detection of Human Cytomegalovirus (HCMV) Among Infants with Congenital Anomalies in Khartoum State, Sudan

    PubMed Central

    Ebrahim, Maha G.; Ali, Aisha S.; Mustafa, Mohamed O.; Musa, Dalal F.; El Hussein, Abdel Rahim M.; Elkhidir, Isam M.; Enan, Khalid A.

    2015-01-01

    Human Cytomegalovirus (HCMV) infection still represents the most common potentially serious viral complication in humans and is a major cause of congenital anomalies in infants. This study aimed to detect HCMV in infants with congenital anomalies. Study subjects consisted of infants born with neural tube defects, hydrocephalus, and microcephaly. Fifty serum specimens (20 males, 30 females) were collected from different hospitals in Khartoum State. The sera were investigated for cytomegalovirus-specific immunoglobulin M (IgM) antibodies using enzyme-linked immunosorbent assay (ELISA) and for cytomegalovirus DNA using polymerase chain reaction (PCR). Of the 50 sera tested, one patient’s sample (2%) showed HCMV IgM but no detectable DNA; another 4 (8.2%) were positive for HCMV DNA but had no detectable IgM. Various diagnostic techniques should be considered to evaluate HCMV disease, and routine screening for HCMV should be introduced for pregnant women in this setting. It is vital to initiate further research with more samples from different areas to assess the prevalence of HCMV, characterize it, and evaluate its maternal health implications. PMID:26862356

  18. Fiber Optic Bragg Grating Sensors for Thermographic Detection of Subsurface Anomalies

    NASA Technical Reports Server (NTRS)

    Allison, Sidney G.; Winfree, William P.; Wu, Meng-Chou

    2009-01-01

    Conventional thermography with an infrared imager has been shown to be an extremely viable technique for nondestructively detecting subsurface anomalies such as thickness variations due to corrosion. A recently developed technique using fiber optic sensors to measure temperature holds potential for performing similar inspections without requiring an infrared imager. The structure is heated using a heat source such as a quartz lamp with fiber Bragg grating (FBG) sensors at the surface of the structure to detect temperature. Investigated structures include a stainless steel plate with thickness variations simulated by small platelets attached to the back side using thermal grease. A relationship is shown between the FBG sensor thermal response and variations in material thickness. For comparison, finite element modeling was performed and found to agree closely with the fiber optic thermography results. This technique shows potential for applications where FBG sensors are already bonded to structures for Integrated Vehicle Health Monitoring (IVHM) strain measurements and can serve dual-use by also performing thermographic detection of subsurface anomalies.

  19. RS-Forest: A Rapid Density Estimator for Streaming Anomaly Detection

    PubMed Central

    Wu, Ke; Zhang, Kun; Fan, Wei; Edwards, Andrea; Yu, Philip S.

    2015-01-01

    Anomaly detection in streaming data is of high interest in numerous application domains. In this paper, we propose a novel one-class semi-supervised algorithm to detect anomalies in streaming data. Underlying the algorithm is a fast and accurate density estimator implemented by multiple fully randomized space trees (RS-Trees), named RS-Forest. The piecewise constant density estimate of each RS-tree is defined on the tree node into which an instance falls. Each incoming instance in a data stream is scored by the density estimates averaged over all trees in the forest. Two strategies, statistical attribute range estimation with a high probability guarantee and dual node profiles for rapid model update, are seamlessly integrated into RS-Forest to systematically address the ever-evolving nature of data streams. We derive the theoretical upper bound for the proposed algorithm and analyze its asymptotic properties via bias-variance decomposition. Empirical comparisons to the state-of-the-art methods on multiple benchmark datasets demonstrate that the proposed method features a high detection rate, a fast response, and insensitivity to most parameter settings. Algorithm implementations and datasets are available upon request. PMID:25685112
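    The tree construction described above (fully randomized splits, a piecewise-constant density per node, forest averaging) can be sketched compactly. The following is a minimal illustration assuming a fixed bounded feature space and a batch of training instances; the streaming-specific strategies (attribute range estimation, dual node profiles) are omitted, and all parameters are illustrative:

```python
import random

def volume(bounds):
    """Volume of an axis-aligned box given as a list of (lo, hi) pairs."""
    v = 1.0
    for lo, hi in bounds:
        v *= (hi - lo)
    return v

def build_tree(bounds, depth, rng):
    """Fully randomized space tree: split a random dimension at a random cut."""
    if depth == 0:
        return {"leaf": True, "count": 0, "volume": volume(bounds)}
    dim = rng.randrange(len(bounds))
    lo, hi = bounds[dim]
    cut = rng.uniform(lo, hi)
    left = [b if i != dim else (lo, cut) for i, b in enumerate(bounds)]
    right = [b if i != dim else (cut, hi) for i, b in enumerate(bounds)]
    return {"leaf": False, "dim": dim, "cut": cut,
            "left": build_tree(left, depth - 1, rng),
            "right": build_tree(right, depth - 1, rng)}

def leaf_for(node, x):
    while not node["leaf"]:
        node = node["left"] if x[node["dim"]] < node["cut"] else node["right"]
    return node

def score(forest, x, n):
    """Piecewise-constant density estimate averaged over all trees."""
    total = 0.0
    for t in forest:
        leaf = leaf_for(t, x)
        total += leaf["count"] / (leaf["volume"] * n)
    return total / len(forest)

rng = random.Random(0)
bounds = [(0.0, 1.0), (0.0, 1.0)]
forest = [build_tree(bounds, 6, rng) for _ in range(25)]

# "stream": a dense cluster of normal instances near (0.2, 0.2)
train = [(min(max(rng.gauss(0.2, 0.03), 0.0), 1.0),
          min(max(rng.gauss(0.2, 0.03), 0.0), 1.0)) for _ in range(500)]
for x in train:
    for t in forest:
        leaf_for(t, x)["count"] += 1

# a point inside the cluster gets a much higher density than an empty region
print(score(forest, (0.2, 0.2), len(train)), score(forest, (0.9, 0.9), len(train)))
```

    A low averaged density marks an instance as anomalous; the actual RS-Forest additionally updates node profiles online as the stream evolves.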

  20. Multiple Kernel Learning for Heterogeneous Anomaly Detection: Algorithm and Aviation Safety Case Study

    NASA Technical Reports Server (NTRS)

    Das, Santanu; Srivastava, Ashok N.; Matthews, Bryan L.; Oza, Nikunj C.

    2010-01-01

    The world-wide aviation system is one of the most complex dynamical systems ever developed and is generating data at an extremely rapid rate. Most modern commercial aircraft record several hundred flight parameters including information from the guidance, navigation, and control systems, the avionics and propulsion systems, and the pilot inputs into the aircraft. These parameters may be continuous measurements or binary or categorical measurements recorded at one-second intervals for the duration of the flight. Currently, most approaches to aviation safety are reactive, meaning that they are designed to react to an aviation safety incident or accident. In this paper, we discuss a novel approach based on the theory of multiple kernel learning to detect potential safety anomalies in very large databases of discrete and continuous data from world-wide operations of commercial fleets. We pose a general anomaly detection problem which includes both discrete and continuous data streams, where we assume that the discrete streams have a causal influence on the continuous streams. We also assume that an atypical sequence of events in the discrete streams can lead to off-nominal system performance. We discuss the application domain and novel algorithms, and present results on real-world data sets. Our algorithm uncovers operationally significant events in high-dimensional data streams in the aviation industry that are not detectable using state-of-the-art methods.
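    The kernel-combination idea can be illustrated with a hedged toy sketch: one kernel over discrete switch-event sequences, one RBF kernel over continuous features, mixed with a fixed weight, and a flight scored by its average combined similarity to nominal flights. The real method learns the kernel combination and uses one-class scoring; the kernels, weights, and flight data below are invented for illustration:

```python
import math

def rbf(a, b, gamma=0.5):
    """RBF kernel on continuous feature vectors."""
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def seq_kernel(a, b):
    """Toy kernel on discrete event sequences: fraction of aligned matches."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

def combined(f1, f2, w=0.5):
    """Fixed-weight combination of the discrete and continuous kernels."""
    return w * seq_kernel(f1[0], f2[0]) + (1 - w) * rbf(f1[1], f2[1])

def anomaly_score(flight, nominal):
    """1 minus the average combined similarity to the nominal set."""
    return 1.0 - sum(combined(flight, g) for g in nominal) / len(nominal)

# each flight = (discrete switch sequence, continuous feature vector); all invented
nominal = [(["flaps_up", "gear_up", "ap_on"], [0.82, 0.10]),
           (["flaps_up", "gear_up", "ap_on"], [0.80, 0.12]),
           (["flaps_up", "ap_on", "gear_up"], [0.78, 0.11])]
typical = (["flaps_up", "gear_up", "ap_on"], [0.81, 0.11])
odd = (["gear_up", "ap_off", "flaps_dn"], [0.20, 0.90])

print(anomaly_score(typical, nominal), anomaly_score(odd, nominal))
```

    The typical flight scores well below the atypical one; in the paper the discrete/continuous coupling and kernel weights are learned rather than fixed.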

  1. Automated determinations of selenium in thermal power plant wastewater by sequential hydride generation and chemiluminescence detection.

    PubMed

    Ezoe, Kentaro; Ohyama, Seiichi; Hashem, Md Abul; Ohira, Shin-Ichi; Toda, Kei

    2016-02-01

    After the Fukushima disaster, power generation from nuclear power plants in Japan was completely stopped and old coal-based power plants were re-commissioned to compensate for the decrease in power generation capacity. Although coal is a relatively inexpensive fuel for power generation, it contains high levels (mg kg(-1)) of selenium, which could contaminate the wastewater from thermal power plants. In this work, an automated selenium monitoring system was developed based on sequential hydride generation and chemiluminescence detection. This method could be applied to the control of wastewater contamination. In this method, selenium is vaporized as H2Se, which reacts with ozone to produce chemiluminescence. However, interference from arsenic is of concern because the ozone-induced chemiluminescence intensity of H2Se is much lower than that of AsH3. This problem was successfully addressed by vaporizing arsenic and selenium individually in a sequential procedure using a syringe pump equipped with an eight-port selection valve and hot and cold reactors. Oxidative decomposition of organoselenium compounds and pre-reduction of the selenium were performed in the hot reactor, and vapor generation of arsenic and selenium were performed separately in the cold reactor. Sample transfers between the reactors were carried out by a pneumatic air operation by switching with three-way solenoid valves. The detection limit for selenium was 0.008 mg L(-1) and the calibration curve was linear up to 1.0 mg L(-1), which provided suitable performance for controlling selenium in wastewater to around the allowable limit (0.1 mg L(-1)). This system consumes few chemicals and is stable for more than a month without any maintenance. Wastewater samples from thermal power plants were collected, and data obtained by the proposed method were compared with those from batchwise water treatment followed by hydride generation-atomic fluorescence spectrometry.

  2. Temporal Data-Driven Sleep Scheduling and Spatial Data-Driven Anomaly Detection for Clustered Wireless Sensor Networks

    PubMed Central

    Li, Gang; He, Bin; Huang, Hongwei; Tang, Limin

    2016-01-01

    The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most of the existing works based on the spatial–temporal correlation can be divided into two parts: redundancy reduction and anomaly detection. These two parts are pursued separately in existing works. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS can reduce data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on the long and linear cluster structure in the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented in a cooperative way in the cluster. To keep the precision of sensor data, spatial data-driven anomaly detection based on the spatial correlation and the Kriging method is realized to generate an anomaly indicator. The experimental results show that cooperative TDSS can realize non-uniform sensing effectively to reduce the energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data. PMID:27690035
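    The Kriging-based indicator described above can be sketched as: predict each sensor's reading from its neighbours along the ring, and flag a reading whose residual exceeds a multiple of the kriging standard deviation. The exponential covariance model, the threshold k = 3, and the sensor values below are illustrative assumptions, not the paper's configuration:

```python
import math

def cov(h, sill=1.0, rng_=3.0):
    """Exponential covariance model (assumed) as a function of separation h."""
    return sill * math.exp(-abs(h) / rng_)

def solve(A, b):
    """Gaussian elimination with partial pivoting (stdlib only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def kriging_anomaly(pos, val, target_pos, observed, k=3.0):
    """Simple kriging about the sample mean; flag residuals beyond k sigmas."""
    A = [[cov(pos[i] - pos[j]) for j in range(len(pos))] for i in range(len(pos))]
    b = [cov(p - target_pos) for p in pos]
    w = solve(A, b)
    mean = sum(val) / len(val)
    pred = mean + sum(wi * (vi - mean) for wi, vi in zip(w, val))
    var = cov(0) - sum(wi * bi for wi, bi in zip(w, b))  # kriging variance
    return abs(observed - pred) > k * max(var, 1e-9) ** 0.5, pred

# sensors along a tunnel ring at 1-D positions, reading a smooth field (invented)
pos, val = [0.0, 1.0, 2.0, 4.0], [10.0, 10.2, 10.4, 10.8]
is_anom_typ, _ = kriging_anomaly(pos, val, 3.0, observed=10.6)  # consistent
is_anom_out, _ = kriging_anomaly(pos, val, 3.0, observed=25.0)  # inconsistent
print(is_anom_typ, is_anom_out)
```

    The consistent reading is accepted while the outlier trips the indicator; in the paper this spatial check is what preserves data precision under non-uniform sensing.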

  4. Sparsity divergence index based on locally linear embedding for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhang, Lili; Zhao, Chunhui

    2016-04-01

    Hyperspectral imagery (HSI) has high spectral and spatial resolutions, which are essential for anomaly detection (AD). Many anomaly detectors assume that the spectrum signature of HSI pixels can be modeled with a Gaussian distribution, which is actually not accurate and often leads to many false alarms. Therefore, a sparsity model without any distribution hypothesis is usually employed. Dimensionality reduction (DR) as a preprocessing step for HSI is important. Principal component analysis as a conventional DR method is a linear projection and cannot exploit the nonlinear properties in hyperspectral data, whereas locally linear embedding (LLE) as a local, nonlinear manifold learning algorithm works well for DR of HSI. A modified algorithm of sparsity divergence index based on locally linear embedding (SDI-LLE) is thus proposed. First, kernel collaborative representation detection is adopted to calculate the sparse dictionary matrix of local reconstruction weights in LLE. Then, SDI is obtained both in the spectral and spatial domains, where spatial SDI is computed after DR by LLE. Finally, joint SDI, combining spectral SDI and spatial SDI, is computed, and the optimal SDI is selected for AD. Experimental results demonstrate that the proposed algorithm significantly improves detection performance compared with its counterparts.

  5. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated, and we use Big Data technologies, predictive algorithms, and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a Big Data architecture that has previously proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available Big Data sets and determining practical analytic, visualization, and predictive technologies.

  6. Fully automated analytical procedure for propofol determination by sequential injection technique with spectrophotometric and fluorimetric detections.

    PubMed

    Šrámková, Ivana; Amorim, Célia G; Sklenářová, Hana; Montenegro, Maria C B M; Horstkotte, Burkhard; Araújo, Alberto N; Solich, Petr

    2014-01-01

    In this work, an application of an enzymatic reaction for the determination of the highly hydrophobic drug propofol in emulsion dosage form is presented. Emulsions represent a complex and therefore challenging matrix for analysis. Ethanol was used for breakage of the lipid emulsion, which enabled optical detection. A fully automated method based on Sequential Injection Analysis was developed, allowing propofol determination without the requirement of tedious sample pre-treatment. The method was based on spectrophotometric detection after the enzymatic oxidation catalysed by horseradish peroxidase and subsequent coupling with 4-aminoantipyrine, leading to a coloured product with an absorbance maximum at 485 nm. This procedure was compared with a simple fluorimetric method, which was based on the direct selective fluorescence emission of propofol in ethanol at 347 nm. Both methods provide comparable validation parameters, with linear working ranges of 0.005-0.100 mg mL(-1) and 0.004-0.243 mg mL(-1) for the spectrophotometric and fluorimetric methods, respectively. The detection and quantitation limits achieved with the spectrophotometric method were 0.0016 and 0.0053 mg mL(-1), respectively. The fluorimetric method provided a detection limit of 0.0013 mg mL(-1) and a limit of quantitation of 0.0043 mg mL(-1). The RSD did not exceed 5% and 2% (n=10), respectively. A sample throughput of approx. 14 h(-1) for the spectrophotometric and 68 h(-1) for the fluorimetric detection was achieved. Both methods proved to be suitable for the determination of propofol in pharmaceutical formulation, with average recovery values of 98.1% and 98.5%, respectively.

  7. Data mining method for anomaly detection in the supercomputer task flow

    NASA Astrophysics Data System (ADS)

    Voevodin, Vadim; Voevodin, Vladimir; Shaikhislamov, Denis; Nikitenko, Dmitry

    2016-10-01

    The efficiency of most supercomputer applications is extremely low. At the same time, the user rarely even suspects that their applications may be wasting computing resources. Software tools need to be developed to help detect inefficient applications and report them to the users. We suggest an algorithm for detecting anomalies in the supercomputer's task flow, based on data mining methods. System monitoring is used to calculate integral characteristics for every job executed, and these data are used as input for our classification method based on the Random Forest algorithm. The proposed approach can currently classify an application into one of three classes: normal, suspicious, and definitely anomalous. The proposed approach has been demonstrated on actual applications running on the "Lomonosov" supercomputer.
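    As a hedged stand-in for the classification step, the sketch below bags decision stumps over two per-job monitoring features (names invented) and bands the vote fraction into the three classes named above. This illustrates the ensemble-voting idea only; it is not the authors' Random Forest implementation:

```python
import random

def best_stump(X, y):
    """Pick the (feature, threshold, polarity) with fewest training errors."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            for pol in (1, -1):
                pred = [1 if pol * (row[f] - t) > 0 else 0 for row in X]
                err = sum(p != yi for p, yi in zip(pred, y))
                if best is None or err < best[0]:
                    best = (err, f, t, pol)
    return best[1:]

def fit_forest(X, y, n_trees=15, rng=random.Random(0)):
    """Bagging: one stump per bootstrap resample of the jobs."""
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in X]
        forest.append(best_stump([X[i] for i in idx], [y[i] for i in idx]))
    return forest

def classify(forest, x):
    """Band the anomaly-vote fraction into three classes."""
    votes = sum(1 if pol * (x[f] - t) > 0 else 0 for f, t, pol in forest)
    frac = votes / len(forest)
    if frac > 0.7:
        return "anomalous"
    if frac > 0.3:
        return "suspicious"
    return "normal"

# toy per-job features: [cpu_load, mpi_wait_fraction] (invented names/values)
X = [[0.90, 0.10], [0.80, 0.20], [0.85, 0.15], [0.90, 0.05],   # efficient jobs
     [0.10, 0.80], [0.20, 0.90], [0.15, 0.85], [0.05, 0.90]]   # anomalous jobs
y = [0, 0, 0, 0, 1, 1, 1, 1]

forest = fit_forest(X, y)
print(classify(forest, [0.9, 0.1]), classify(forest, [0.1, 0.9]))
```

    Jobs with split votes would land in the "suspicious" band, mirroring the three-way output of the monitoring tool.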

  8. Anomaly Detection Techniques with Real Test Data from a Spinning Turbine Engine-Like Rotor

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Woike, Mark R.; Oza, Nikunj C.; Matthews, Bryan L.

    2012-01-01

    Online detection techniques to monitor the health of rotating engine components are becoming increasingly attractive to aircraft engine manufacturers in order to increase safety of operation and lower maintenance costs. Health monitoring remains challenging to implement, especially in the presence of scattered loading conditions and variations in crack size, component geometry, and material properties. The current trend, however, is to utilize noninvasive types of health monitoring or nondestructive techniques to detect hidden flaws and mini-cracks before any catastrophic event occurs. These techniques go further to evaluate material discontinuities and other anomalies that have grown to the level of critical defects that can lead to failure. Generally, health monitoring is highly dependent on sensor systems capable of performing in various engine environmental conditions and able to transmit a signal upon a predetermined crack length, while remaining neutral with respect to the overall performance of the engine system.

  9. System and method for the detection of anomalies in an image

    DOEpatents

    Prasad, Lakshman; Swaminarayan, Sriram

    2013-09-03

    Preferred aspects of the present invention can include receiving a digital image at a processor; segmenting the digital image into a hierarchy of feature layers comprising one or more fine-scale features defining a foreground object embedded in one or more coarser-scale features defining a background to the one or more fine-scale features in the segmentation hierarchy; detecting a first fine-scale foreground feature as an anomaly with respect to a first background feature within which it is embedded; and constructing an anomalous feature layer by synthesizing spatially contiguous anomalous fine-scale features. Additional preferred aspects of the present invention can include detecting non-pervasive changes between sets of images in response at least in part to one or more difference images between the sets of images.

  10. MedMon: securing medical devices through wireless monitoring and anomaly detection.

    PubMed

    Zhang, Meng; Raghunathan, Anand; Jha, Niraj K

    2013-12-01

    Rapid advances in personal healthcare systems based on implantable and wearable medical devices promise to greatly improve the quality of diagnosis and treatment for a range of medical conditions. However, the increasing programmability and wireless connectivity of medical devices also open up opportunities for malicious attackers. Unfortunately, implantable/wearable medical devices come with extreme size and power constraints, and unique usage models, making it infeasible to simply borrow conventional security solutions such as cryptography. We propose a general framework for securing medical devices based on wireless channel monitoring and anomaly detection. Our proposal is based on a medical security monitor (MedMon) that snoops on all the radio-frequency wireless communications to/from medical devices and uses multi-layered anomaly detection to identify potentially malicious transactions. Upon detection of a malicious transaction, MedMon takes appropriate response actions, which could range from passive (notifying the user) to active (jamming the packets so that they do not reach the medical device). A key benefit of MedMon is that it is applicable to existing medical devices that are in use by patients, with no hardware or software modifications to them. Consequently, it also leads to zero power overheads on these devices. We demonstrate the feasibility of our proposal by developing a prototype implementation for an insulin delivery system using off-the-shelf components (USRP software-defined radio). We evaluate its effectiveness under several attack scenarios. Our results show that MedMon can detect virtually all naive attacks and a large fraction of more sophisticated attacks, suggesting that it is an effective approach to enhancing the security of medical devices.

  11. Visual detection and sequential injection determination of aluminium using a cinnamoyl derivative.

    PubMed

    Elečková, Lenka; Alexovič, Michal; Kuchár, Juraj; Balogh, Ioseph S; Andruch, Vasil

    2015-02-01

    A cinnamoyl derivative, 3-[4-(dimethylamino)cinnamoyl]-4-hydroxy-6-methyl-3,4-2H-pyran-2-one, was used as a ligand for the determination of aluminium. Upon the addition of an acetonitrile solution of the ligand to an aqueous solution containing Al(III) and a buffer solution at pH 8, a marked change in colour from yellow to orange is observed. The colour intensity is proportional to the concentration of Al(III); thus, the 'naked-eye' detection of aluminium is possible. The reaction is also applied for sequential injection determination of aluminium. Beer's law is obeyed in the range from 0.055 to 0.66 mg L(-1) of Al(III). The limit of detection, calculated as three times the standard deviation of the blank test (n=10), was found to be 4 μg L(-1) for Al(III). The method was applied for the determination of aluminium in spiked water samples and pharmaceutical preparations.
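    The quoted detection-limit rule (three times the standard deviation of the blank, converted to concentration via the calibration slope) is easy to reproduce numerically. The blank readings and slope below are made up for illustration; only the 3-sigma rule itself comes from the abstract:

```python
import statistics

# Hypothetical replicate blank readings (n = 10), in absorbance units
blank_signals = [0.012, 0.011, 0.013, 0.012, 0.010,
                 0.012, 0.014, 0.011, 0.012, 0.013]

# Hypothetical calibration slope, absorbance per (mg L^-1) of Al(III)
slope = 0.90

# LOD = 3 * SD(blank) / slope, expressed in concentration units
lod = 3 * statistics.stdev(blank_signals) / slope
print(lod * 1000, "ug L^-1")
```

    With these invented numbers the LOD lands in the low-microgram-per-litre range, the same order as the 4 μg L(-1) reported above.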

  12. Detection of subpixel anomalies in multispectral infrared imagery using an adaptive Bayesian classifier

    SciTech Connect

    Ashton, E.A.

    1998-03-01

    The detection of subpixel targets with unknown spectral signatures and cluttered backgrounds in multispectral imagery is a topic of great interest for remote surveillance applications. Because no knowledge of the target is assumed, the only way to accomplish such a detection is through a search for anomalous pixels. Two approaches to this problem are examined in this paper. The first is to separate the image into a number of statistical clusters by using an extension of the well-known K-means algorithm. Each bin of resultant residual vectors is then decorrelated, and the results are thresholded to provide detection. The second approach requires the formation of a probabilistic background model by using an adaptive Bayesian classification algorithm. This allows the calculation of a probability for each pixel, with respect to the model. These probabilities are then thresholded to provide detection. Both algorithms are shown to provide significant improvement over current filtering techniques for anomaly detection in experiments using multispectral IR imagery with both simulated and actual subpixel targets.
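    A minimal sketch of the first, clustering-based approach: group pixel spectra with K-means, then score each pixel by its residual from the nearest cluster centre, scaled by that cluster's spread. The full decorrelation of residual vectors is reduced to a scalar scaling here, and the two-band data are synthetic:

```python
import math
import random

def dist2(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

def mean(g):
    return [sum(col) / len(g) for col in zip(*g)]

def kmeans(X, k, iters=20, rng=random.Random(1)):
    """Plain Lloyd iterations from a random initialisation."""
    centres = rng.sample(X, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in X:
            groups[min(range(k), key=lambda j: dist2(x, centres[j]))].append(x)
        centres = [mean(g) if g else centres[j] for j, g in enumerate(groups)]
    return centres

def anomaly_scores(X, centres):
    """Residual to the nearest centre, scaled by that cluster's RMS residual."""
    assign = [min(range(len(centres)), key=lambda j: dist2(x, centres[j]))
              for x in X]
    rms = []
    for j in range(len(centres)):
        members = [dist2(x, centres[j]) for x, a in zip(X, assign) if a == j]
        rms.append(math.sqrt(sum(members) / len(members)) if members else 1.0)
    return [math.sqrt(dist2(x, centres[a])) / rms[a] for x, a in zip(X, assign)]

rng = random.Random(2)
# two background "materials" in a 2-band feature space, plus one anomalous pixel
X = [[rng.gauss(0, 0.1), rng.gauss(0, 0.1)] for _ in range(100)]
X += [[rng.gauss(5, 0.1), rng.gauss(5, 0.1)] for _ in range(100)]
X.append([2.5, 2.5])  # spectrum belonging to neither background cluster

scores = anomaly_scores(X, kmeans(X, 2))
print(scores[-1], max(scores[:-1]))
```

    Thresholding these scores yields the detection map; the paper's version decorrelates the residual vectors per cluster rather than using a single RMS scale.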

  13. Detection of submicron scale cracks and other surface anomalies using positron emission tomography

    DOEpatents

    Cowan, Thomas E.; Howell, Richard H.; Colmenares, Carlos A.

    2004-02-17

    Detection of submicron scale cracks and other mechanical and chemical surface anomalies using PET. This surface technique has sufficient sensitivity to detect single voids or pits of sub-millimeter size, and single cracks or fissures of millimeter-scale length, micrometer-scale depth, and nanometer-scale width. This technique can also be applied to detect surface regions of differing chemical reactivity. It may be utilized in a scanning or survey mode to simultaneously detect such mechanical or chemical features over large interior or exterior surface areas of parts as large as about 50 cm in diameter. The technique involves exposing a surface to short-lived radioactive gas for a time period, removing the excess gas to leave a partial monolayer, determining the location and shape of the cracks, voids, porous regions, etc., and calculating the width, depth, and length thereof. Detection of 0.01 mm deep cracks using a 3 mm detector resolution has been accomplished using this technique.

  14. Unsupervised, low latency anomaly detection of algorithmically generated domain names by generative probabilistic modeling

    PubMed Central

    Raghuram, Jayaram; Miller, David J.; Kesidis, George

    2014-01-01

    We propose a method for detecting anomalous domain names, with a focus on algorithmically generated domain names, which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (or botnets), malware, and phishing. Our method is based on learning a (null hypothesis) probability model from a large set of domain names that have been whitelisted by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, tend to have a distribution of characters, words, word lengths, and number of words that is typical of some language (mostly English), and often consist of words drawn from a known lexicon. Algorithmically generated domain names, by contrast, typically have distributions that are quite different from those of human-created domain names. We propose a fully generative model for the probability distribution of benign (whitelisted) domain names which can be used in an anomaly detection setting for identifying putative algorithmically generated domain names. Unlike other methods, our approach can make detections without considering any additional (latency-producing) information sources often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results relative to several baseline methods, with higher detection rates and low false-positive rates. PMID:25685511
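    To make the null-model idea concrete, here is a deliberately tiny stand-in: a smoothed character-frequency model trained on a short whitelist, scoring a name by its average per-character log-likelihood. The paper's generative model is much richer (words, word lengths, lexicon membership), and the whitelist, alphabet size, and test strings below are illustrative:

```python
import math
from collections import Counter

ALPHABET = 40  # rough alphabet size (letters, digits, hyphen) for smoothing

def train(names, alpha=0.5):
    """Additively smoothed character-frequency null model over a whitelist."""
    counts = Counter(c for name in names for c in name)
    return counts, sum(counts.values()), alpha

def avg_loglik(model, name):
    """Average per-character log-likelihood under the null model."""
    counts, total, alpha = model
    return sum(math.log((counts[c] + alpha) / (total + alpha * ALPHABET))
               for c in name) / len(name)

whitelist = ["google", "facebook", "wikipedia", "youtube", "twitter",
             "amazon", "reddit", "netflix", "github", "linkedin",
             "instagram", "microsoft", "apple", "stackoverflow", "dropbox"]
model = train(whitelist)

human_like = avg_loglik(model, "weather")      # common, pronounceable characters
dga_like = avg_loglik(model, "xq7vz2kttqp")    # rare characters and digits
print(human_like, dga_like)
```

    Thresholding the log-likelihood flags the low-scoring name as putatively algorithmically generated; the real model additionally exploits word structure rather than raw character frequencies.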

  15. Anomaly Detection in Host Signaling Pathways for the Early Prognosis of Acute Infection

    PubMed Central

    O’Hern, Corey S.; Shattuck, Mark D.; Ogle, Serenity; Forero, Adriana; Morrison, Juliet; Slayden, Richard; Katze, Michael G.

    2016-01-01

    Clinical diagnosis of acute infectious diseases during the early stages of infection is critical to administering the appropriate treatment to improve the disease outcome. We present a data-driven analysis of the human cellular response to respiratory viruses, including influenza, respiratory syncytial virus, and human rhinovirus, and compare this with the response to the bacterial endotoxin lipopolysaccharide (LPS). Using an anomaly detection framework, we identified pathways that clearly distinguish between asymptomatic and symptomatic patients infected with the four different respiratory viruses and that accurately diagnosed patients exposed to a bacterial infection. Connectivity pathway analysis comparing the viral and bacterial diagnostic signatures identified host cellular pathways that were unique to patients exposed to LPS endotoxin, indicating this type of analysis could be used to identify host biomarkers that can differentiate clinical etiologies of acute infection. We applied the Multivariate State Estimation Technique (MSET) on two human influenza (H1N1 and H3N2) gene expression data sets to define host networks perturbed in the asymptomatic phase of infection. Our analysis identified pathways in the respiratory virus diagnostic signature as prognostic biomarkers that triggered prior to clinical presentation of acute symptoms. These early warning pathways correctly predicted that almost half of the subjects would become symptomatic in less than forty hours post-infection and that three of the 18 subjects would become symptomatic after only 8 hours. These results provide a proof-of-concept for the utility of anomaly detection algorithms to classify host pathway signatures that can identify presymptomatic signatures of acute diseases and differentiate between etiologies of infection. On a global scale, acute respiratory infections cause a significant proportion of human co-morbidities and account for 4.25 million deaths annually. 
The development of clinical

  16. Automatic detection and measurement of structures in fetal head ultrasound volumes using sequential estimation and Integrated Detection Network (IDN).

    PubMed

    Sofka, Michal; Zhang, Jingdan; Good, Sara; Zhou, S Kevin; Comaniciu, Dorin

    2014-05-01

    Routine ultrasound exam in the second and third trimesters of pregnancy involves manually measuring fetal head and brain structures in 2-D scans. The procedure requires a sonographer to find the standardized visualization planes with a probe and manually place measurement calipers on the structures of interest. The process is tedious, time consuming, and introduces user variability into the measurements. This paper proposes an automatic fetal head and brain (AFHB) system for automatically measuring anatomical structures from 3-D ultrasound volumes. The system searches the 3-D volume in a hierarchy of resolutions and by focusing on regions that are likely to be the measured anatomy. The output is a standardized visualization of the plane with correct orientation and centering as well as the biometric measurement of the anatomy. The system is based on a novel framework for detecting multiple structures in 3-D volumes. Since a joint model is difficult to obtain in most practical situations, the structures are detected in a sequence, one-by-one. The detection relies on Sequential Estimation techniques, frequently applied to visual tracking. The interdependence of structure poses and strong prior information embedded in our domain yields faster and more accurate results than detecting the objects individually. The posterior distribution of the structure pose is approximated at each step by sequential Monte Carlo. The samples are propagated within the sequence across multiple structures and hierarchical levels. The probabilistic model helps solve many challenges present in the ultrasound images of the fetus such as speckle noise, signal drop-out, shadows caused by bones, and appearance variations caused by the differences in the fetus gestational age. This is possible by discriminative learning on an extensive database of scans comprising more than two thousand volumes and more than thirteen thousand annotations. 
The average difference between ground truth and automatic

  17. Characterization of normality of chaotic systems including prediction and detection of anomalies

    NASA Astrophysics Data System (ADS)

    Engler, Joseph John

    Accurate prediction and control pervade domains such as engineering, physics, chemistry, and biology. Often, it is discovered that the systems under consideration cannot be well represented by linear, periodic, or random data. It has been shown that these systems exhibit deterministic chaos behavior. Deterministic chaos describes systems which are governed by deterministic rules but whose data appear to follow random or quasi-periodic distributions. Deterministically chaotic systems characteristically exhibit sensitive dependence upon initial conditions, manifested through rapid divergence of states initially close to one another. Due to this characterization, it has been deemed impossible to accurately predict future states of these systems for longer time scales. Fortunately, the deterministic nature of these systems allows for accurate short term predictions, given the dynamics of the system are well understood. This fact has been exploited in the research community and has resulted in various algorithms for short term predictions. Detection of normality in deterministically chaotic systems is critical to understanding the system sufficiently to be able to predict future states. Due to the sensitivity to initial conditions, the detection of normal operational states for a deterministically chaotic system can be challenging. The addition of small perturbations to the system, which may result in bifurcation of the normal states, further complicates the problem. The detection of anomalies and prediction of future states of the chaotic system allows for greater understanding of these systems. The goal of this research is to produce methodologies for determining states of normality for deterministically chaotic systems, detection of anomalous behavior, and the more accurate prediction of future states of the system. Additionally, the ability to detect subtle system state changes is discussed. The dissertation addresses these goals by proposing new representational

  18. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    /index.php/ and http://earthsystemdatacube.net/. Known anomalies such as the Russian heatwave are detected as well as anomalies which are not detectable with univariate methods.

  19. Anomaly Detection using Multi-channel FLAC for Supporting Diagnosis of ECG

    NASA Astrophysics Data System (ADS)

    Ye, Jiaxing; Kobayashi, Takumi; Murakawa, Masahiro; Higuchi, Tetsuya; Otsu, Nobuyuki

    In this paper, we propose an approach for abnormality detection in multi-channel ECG signals. The system serves as a front end that detects the irregular sections of ECG signals in which symptoms may be observed. The doctor can thereby focus only on the detected sections of suspected symptoms, ignoring the disease-free parts; the doctors' inspection workload is significantly reduced and diagnostic efficiency is sharply improved. To extract the predominant characteristics of multi-channel ECG signals, we propose multi-channel Fourier local auto-correlation (m-FLAC) features computed on multi-channel complex spectrograms. The method characterizes the amplitude and phase information as well as the temporal dynamics of the multi-channel ECG signal. At the anomaly detection stage, we employ the complex subspace method to statistically model the normal (healthy) ECG patterns, as in one-class learning. We then investigate input ECG signals by measuring their deviation distance from the trained subspace. ECG sections with disordered spectral distributions can be effectively discerned using this distance metric. To validate the proposed approach, we conducted experiments on an ECG dataset. The experimental results demonstrated the effectiveness of the proposed approach, including promising performance and high efficiency compared to conventional methods.
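
    A rough sketch of the subspace idea: the paper's complex subspace method on m-FLAC features is replaced here by ordinary real-valued PCA on synthetic "heartbeat" patterns, so only the deviation-distance principle carries over; all data and parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_subspace(X, n_components):
    """Learn a linear subspace of the normal training patterns via PCA
    (a real-valued stand-in for the paper's complex subspace method)."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def deviation(x, mean, basis):
    """Deviation distance of a sample from the trained subspace."""
    r = x - mean
    return np.linalg.norm(r - basis.T @ (basis @ r))

# "Healthy" training patterns: noisy fixed-frequency sinusoids standing in
# for feature vectors of normal heartbeats
t = np.linspace(0.0, 1.0, 64)
phases = rng.uniform(0.0, 2.0 * np.pi, 200)
normal = np.array([np.sin(2 * np.pi * 5 * t + p) for p in phases])
normal += 0.05 * rng.standard_normal(normal.shape)

mean, basis = fit_subspace(normal, n_components=2)

healthy = np.sin(2 * np.pi * 5 * t + 1.0)       # same rhythm, new phase
disordered = np.sin(2 * np.pi * 11 * t)         # off-frequency "symptom"
```

    A healthy pattern lies close to the learned subspace while a spectrally disordered section does not, which is exactly the statistic used to flag suspect sections.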

  20. Adaptive hidden Markov model with anomaly states for price manipulation detection.

    PubMed

    Cao, Yi; Li, Yuhua; Coleman, Sonya; Belatreche, Ammar; McGinnity, Thomas Martin

    2015-02-01

    Price manipulation refers to the activities of traders who use carefully designed trading behaviors to artificially push up or down the underlying equity prices for profit. With increasing volumes and frequency of trading, price manipulation can be extremely damaging to the proper functioning and integrity of capital markets. The existing literature focuses on either empirical studies of market abuse cases or analysis of particular manipulation types under certain assumptions; effective approaches for analyzing and detecting price manipulation in real time have yet to be developed. This paper proposes a novel approach, called the adaptive hidden Markov model with anomaly states (AHMMAS), for modeling and detecting price manipulation activities. Together with wavelet transformations and gradients as feature extraction methods, the AHMMAS model supports price manipulation detection and basic manipulation type recognition. Evaluation experiments conducted on tick data for seven stocks from NASDAQ and the London Stock Exchange, and on ten stock price series simulated by stochastic differential equations, show that the proposed AHMMAS model can effectively detect price manipulation patterns and outperforms the selected benchmark models.
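
    The likelihood-scoring idea behind an HMM-based detector can be sketched as follows. Unlike AHMMAS, nothing here is adaptive and there are no explicit anomaly states: a fixed, illustrative two-regime Gaussian HMM scores windows of synthetic returns with the forward algorithm, and the lowest-likelihood window is flagged:

```python
import numpy as np

rng = np.random.default_rng(7)

def forward_loglik(obs, pi, A, means, stds):
    """Log-likelihood of a 1-D observation sequence under a Gaussian HMM
    (scaled forward algorithm)."""
    emis = (np.exp(-0.5 * ((obs[:, None] - means) / stds) ** 2)
            / (stds * np.sqrt(2.0 * np.pi)))
    alpha = pi * emis[0]
    s = alpha.sum()
    ll = np.log(s)
    alpha /= s
    for t in range(1, len(obs)):
        alpha = (alpha @ A) * emis[t]
        s = alpha.sum()
        ll += np.log(s)
        alpha /= s
    return ll

# Two latent volatility regimes; parameters are illustrative, not learned
pi = np.array([0.5, 0.5])
A = np.array([[0.95, 0.05], [0.05, 0.95]])
means = np.array([0.0, 0.0])
stds = np.array([0.5, 1.5])

# Synthetic tick returns with a sustained manipulation-like push in one window
returns = rng.normal(0.0, 0.5, 500)
returns[300:310] = 2.5

win = 10
scores = np.array([forward_loglik(returns[i:i + win], pi, A, means, stds)
                   for i in range(0, len(returns), win)])
flagged = int(scores.argmin())            # lowest-likelihood window
```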

  1. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Kumar, Sricharan; Srivistava, Ashok N.

    2012-01-01

    Prediction intervals provide the probable range in which the output of a regression model can be expected to occur. These prediction intervals can then be used to determine whether an observed output is anomalous, conditioned on the input. In this paper, a procedure for determining prediction intervals for the outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals, with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is proved theoretically. The validity of the bootstrap-based prediction intervals is then illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
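
    A minimal residual-bootstrap sketch of the idea, using Nadaraya-Watson kernel regression as the nonparametric model (the abstract does not prescribe this particular regressor; the bandwidth, bootstrap size, and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def nw_predict(x_train, y_train, x_query, bandwidth=0.2):
    """Nadaraya-Watson kernel regression (a nonparametric regressor)."""
    w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

# Noisy nonlinear training data
x = rng.uniform(0.0, 3.0, 300)
y = np.sin(2 * x) + 0.1 * rng.standard_normal(300)

xq = np.linspace(0.2, 2.8, 50)
fitted = nw_predict(x, y, x)              # in-sample fit
resid = y - fitted                        # residuals to resample

# Residual bootstrap: refit on pseudo-data, add resampled noise at the query
B = 200
preds = np.empty((B, len(xq)))
for b in range(B):
    y_star = fitted + rng.choice(resid, size=len(x), replace=True)
    preds[b] = nw_predict(x, y_star, xq) + rng.choice(resid, size=len(xq), replace=True)

lo, hi = np.percentile(preds, [2.5, 97.5], axis=0)

# Observations outside their prediction interval are flagged as anomalous
y_new = np.sin(2 * xq)                    # nominal (non-anomalous) outputs
flags = (y_new < lo) | (y_new > hi)
```

    Nominal outputs mostly fall inside the intervals, while an output shifted well away from the regression surface falls outside and would be flagged.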

  2. Cfetool: A General Purpose Tool for Anomaly Detection in Periodic Data

    SciTech Connect

    Wachsmann, Alf; Cassell, Elizabeth

    2007-03-06

    Cfengine's environment daemon, cfenvd, has a limited and fixed set of metrics it measures on a computer. The data are assumed to be periodic in nature, and cfenvd reports any data points that fall too far outside the pattern it has learned from past measurements. This is used to detect "anomalies" on computers. We introduce a new standalone tool, cfetool, that allows arbitrary periodic data to be stored and evaluated. The user interface is modeled after rrdtool, another widely used tool for storing measured data. Because a standalone tool can be used for more than computer-related data, we have extended the built-in mathematics to apply to yearly data as well.

  3. Detection, identification and mapping of iron anomalies in brain tissue using X-ray absorption spectroscopy

    SciTech Connect

    Mikhaylova, A.; Davidson, M.; Toastmann, H.; Channell, J.E.T.; Guyodo, Y.; Batich, C.; Dobson, J.

    2008-06-16

    This work describes a novel method for the detection, identification and mapping of anomalous iron compounds in mammalian brain tissue using X-ray absorption spectroscopy. We have located and identified individual iron anomalies in an avian tissue model associated with ferritin, biogenic magnetite and haemoglobin with a pixel resolution of less than 5 µm. This technique represents a breakthrough in the study of both intra- and extra-cellular iron compounds in brain tissue. The potential for high-resolution iron mapping using microfocused X-ray beams has direct application to investigations of the location and structural form of iron compounds associated with human neurodegenerative disorders - a problem which has vexed researchers for 50 years.

  4. Realization and detection of Weyl semimetals and the chiral anomaly in cold atomic systems

    NASA Astrophysics Data System (ADS)

    He, Wen-Yu; Zhang, Shizhong; Law, K. T.

    2016-07-01

    In this work, we describe a method to realize a three-dimensional Weyl semimetal by coupling multilayers of a honeycomb optical lattice in the presence of a pair of Raman lasers. The Raman lasers render each isolated honeycomb layer a Chern insulator. With finite interlayer coupling, the bulk gap of the system closes at certain out-of-plane momenta due to Raman assisted tunneling and results in the Weyl semimetal phase. Using experimentally relevant parameters, we show that both one pair and two pairs of Weyl points can be realized by tuning the interlayer coupling strength. We suggest that Landau-Zener tunneling can be used to detect Weyl points and show that the transition probability increases dramatically when the Weyl point emerges. The realization of chiral anomaly by using a magnetic-field gradient is also discussed.

  5. Seismological detection of low-velocity anomalies surrounding the mantle transition zone in Japan subduction zone

    NASA Astrophysics Data System (ADS)

    Liu, Zhen; Park, Jeffrey; Karato, Shun-ichiro

    2016-03-01

    In the Japan subduction zone, a locally depressed 660 discontinuity has been observed beneath northeast Asia, suggesting downwelling of materials from the mantle transition zone (MTZ). Vertical transport of water-rich MTZ materials across the major mineral phase changes could lead to water release and to partial melting in surrounding mantle regions, causing seismic low-velocity anomalies. Melt layers implied by low-velocity zones (LVZs) above the 410 discontinuity have been detected in many regions, but seismic evidence for partial melting below the 660 discontinuity has been limited. High-frequency migrated Ps receiver functions indicate LVZs below the depressed 660 discontinuity and above the 410 discontinuity in the deep Japan subduction zone, suggesting dehydration melting induced by water transport out of the MTZ. Our results provide insights into water circulation associated with dynamic interactions between the subducted slab and surrounding mantle.

  6. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO)

    PubMed Central

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-01-01

    The ability to identify hazardous traffic events is considered one of the most effective means of reducing the occurrence of crashes. Previous studies have examined only certain hazardous traffic events, mainly on the basis of dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; and (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a naturalistic driving experiment in Wuhan, China, and multi-sensor information was collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal use, steering acceleration, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the prediction accuracy was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm ranked best in terms of prediction accuracy. These conclusions can provide reference evidence for the development of dangerous-situation warning products and the design of intelligent vehicles. PMID:27420073

  7. Sequential injection titration with spectrophotometric detection for the assay of acidity in fruit juices.

    PubMed

    Jakmunee, Jaroon; Rujiralai, Thitima; Grudpan, Kate

    2006-01-01

    A simple sequential injection analysis (SIA) system with spectrophotometric detection for assaying acidity in fruit juice was investigated. An alkaline reagent (sodium hydroxide), a sample, and an indicator (phenolphthalein) were first aspirated and stacked as adjacent zones in a holding coil. On flow reversal through a reaction coil to the detector, zone penetration occurred, leading to a neutralization reaction that decreased the color intensity of the indicator, monitored as absorbance at 552 nm. The effects of various parameters were studied. Linear calibration graphs for acidities of 0.2 - 1.0 and 0.5 - 2.5% w/v, with citric acid as the standard, were achieved, with a relative standard deviation of 1% (acidity of 0.3 - 0.6% w/v as citric acid, n = 11) and a sample throughput of 30 samples h(-1). The developed method was validated against a standard titrimetric method by assaying the acidity of fruit juice samples.
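
    The calibration step can be shown with a small worked example; the absorbance values below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical calibration: decrease in absorbance at 552 nm vs % w/v citric acid
acidity = np.array([0.2, 0.4, 0.6, 0.8, 1.0])      # standards, % w/v
signal = np.array([0.11, 0.21, 0.30, 0.41, 0.50])  # measured dA (illustrative)

slope, intercept = np.polyfit(acidity, signal, 1)  # linear calibration graph

def acidity_from_signal(dA):
    """Invert the linear calibration to report acidity as citric acid."""
    return (dA - intercept) / slope

sample = acidity_from_signal(0.35)                 # unknown juice sample
```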

  8. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO).

    PubMed

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-07-13

    The ability to identify hazardous traffic events is considered one of the most effective means of reducing the occurrence of crashes. Previous studies have examined only certain hazardous traffic events, mainly on the basis of dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; and (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a naturalistic driving experiment in Wuhan, China, and multi-sensor information was collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal use, steering acceleration, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the prediction accuracy was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm ranked best in terms of prediction accuracy. These conclusions can provide reference evidence for the development of dangerous-situation warning products and the design of intelligent vehicles.

  9. Mining Building Energy Management System Data Using Fuzzy Anomaly Detection and Linguistic Descriptions

    SciTech Connect

    Dumidu Wijayasekara; Ondrej Linda; Milos Manic; Craig Rieger

    2014-08-01

    Building Energy Management Systems (BEMSs) are essential components of modern buildings that utilize digital control technologies to minimize energy consumption while maintaining high levels of occupant comfort. However, BEMSs can only achieve these energy savings when properly tuned and controlled. Since the indoor environment depends on uncertain factors such as weather, occupancy, and thermal state, the performance of a BEMS can be sub-optimal at times. Unfortunately, the complexity of the BEMS control mechanism, the large amount of available data, and the inter-relations between those data can make identifying these sub-optimal behaviors difficult. This paper proposes a novel Fuzzy Anomaly Detection and Linguistic Description (Fuzzy-ADLD) method for improving the understandability of BEMS behavior for improved state-awareness. The presented method is composed of two main parts: 1) detection of anomalous BEMS behavior and 2) linguistic representation of BEMS behavior. The first part utilizes a modified nearest-neighbor clustering algorithm and a fuzzy logic rule extraction technique to build a model of normal BEMS behavior. The second part computes the most relevant linguistic description of the identified anomalies. The Fuzzy-ADLD method was applied to a real-world BEMS and compared against a traditional alarm-based BEMS. In six different scenarios, the Fuzzy-ADLD method identified anomalous behavior as fast as or faster (by an hour or more) than the alarm-based BEMS. In addition, the Fuzzy-ADLD method identified cases that were missed by the alarm-based system, demonstrating its potential for increased state-awareness of abnormal building behavior.
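
    A simplified stand-in for the first stage (the fuzzy rule extraction is omitted): a nearest-neighbour clustering pass builds cluster centres of normal operation, and states far from every centre are flagged. The radius and the (temperature, power) features are assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(2)

def nn_cluster(data, radius):
    """Modified nearest-neighbour clustering: a point joins the closest
    existing cluster within `radius`, otherwise it seeds a new cluster."""
    centers, counts = [], []
    for x in data:
        if centers:
            d = np.linalg.norm(np.array(centers) - x, axis=1)
            i = int(d.argmin())
            if d[i] <= radius:
                counts[i] += 1
                # incremental update of the running cluster mean
                centers[i] = centers[i] + (x - centers[i]) / counts[i]
                continue
        centers.append(x.astype(float))
        counts.append(1)
    return np.array(centers)

def is_anomalous(x, centers, radius):
    """A BEMS state is anomalous if it is far from every normal cluster."""
    return np.linalg.norm(centers - x, axis=1).min() > radius

# Normal operation: two typical (temperature, power) operating regimes
normal = np.vstack([rng.normal([21.0, 3.0], 0.3, (100, 2)),
                    rng.normal([24.0, 5.0], 0.3, (100, 2))])
centers = nn_cluster(normal, radius=1.5)

ok = is_anomalous(np.array([21.2, 3.1]), centers, 1.5)   # near a regime
bad = is_anomalous(np.array([30.0, 9.0]), centers, 1.5)  # far from both
```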

  10. Evaluation of Anomaly Detection Capability for Ground-Based Pre-Launch Shuttle Operations. Chapter 8

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2010-01-01

    This chapter provides a thorough end-to-end description of the process of evaluating three different data-driven algorithms for anomaly detection in order to select the best candidate for deployment as part of a suite of IVHM (Integrated Vehicle Health Management) technologies. These algorithms were deemed sufficiently mature to be considered viable candidates for deployment in support of the maiden launch of Ares I-X, the successor to the Space Shuttle for NASA's Constellation program. Data-driven algorithms are just one of three types being deployed; the other two are a "rule-based" expert system and a "model-based" system. Within these two categories, the deployable candidates have already been selected based upon qualitative factors such as flight heritage. For the rule-based system, SHINE (Spacecraft High-speed Inference Engine) has been selected for deployment; it is a component of BEAM (Beacon-based Exception Analysis for Multimissions), a patented technology developed at NASA's JPL (Jet Propulsion Laboratory), and serves to aid in the management and identification of operational modes. For the "model-based" system, a commercially available package developed by QSI (Qualtech Systems, Inc.), TEAMS (Testability Engineering and Maintenance System), has been selected for deployment to aid in diagnosis. In the context of this particular deployment, distinctions among the use of the terms "data-driven," "rule-based," and "model-based" can be found in. Although three different categories of algorithms have been selected for deployment, our main focus in this chapter is the evaluation of the three candidates for data-driven anomaly detection. These algorithms are evaluated on their capability for robustly detecting incipient faults or failures in the ground-based phase of pre-launch Space Shuttle operations, rather than on heritage as in previous studies.

  11. A Feasibility Study on the Application of the ScriptGenE Framework as an Anomaly Detection System in Industrial Control Systems

    DTIC Science & Technology

    2015-09-17

    Hines. Anomaly-based intrusion detection for SCADA systems. In 5th International Topical Meeting on Nuclear Plant Instrumentation, Control, and Human... A Feasibility Study on the Application of the ScriptGenE Framework as an Anomaly Detection System in Industrial Control Systems. Thesis, Charito M...

  12. Anomaly and signature filtering improve classifier performance for detection of suspicious access to EHRs.

    PubMed

    Kim, Jihoon; Grillo, Janice M; Boxwala, Aziz A; Jiang, Xiaoqian; Mandelbaum, Rose B; Patel, Bhakti A; Mikels, Debra; Vinterbo, Staal A; Ohno-Machado, Lucila

    2011-01-01

    Our objective is to facilitate semi-automated detection of suspicious access to EHRs. Previously we have shown that a machine learning method can play a role in identifying potentially inappropriate access to EHRs. However, the problem of sampling informative instances to build a classifier still remained. We developed an integrated filtering method leveraging both anomaly detection based on symbolic clustering and signature detection, a rule-based technique. We applied the integrated filtering to 25.5 million access records in an intervention arm, and compared this with 8.6 million access records in a control arm where no filtering was applied. On the training set with cross-validation, the AUC was 0.960 in the control arm and 0.998 in the intervention arm. The difference in false negative rates on the independent test set was significant, P = 1.6×10^-6. Our study suggests that utilization of integrated filtering strategies to facilitate the construction of classifiers can be helpful.

  13. A parametric study of unsupervised anomaly detection performance in maritime imagery using manifold learning techniques

    NASA Astrophysics Data System (ADS)

    Olson, C. C.; Doster, T.

    2016-05-01

    We investigate the parameters that govern an unsupervised anomaly detection framework that uses nonlinear techniques to learn a better model of the non-anomalous data. A manifold or kernel-based model is learned from a small, uniformly sampled subset in order to reduce computational burden and under the assumption that anomalous data will have little effect on the learned model because their rarity reduces the likelihood of their inclusion in the subset. The remaining data are then projected into the learned space and their projection errors used as detection statistics. Here, kernel principal component analysis is considered for learning the background model. We consider spectral data from an 8-band multispectral sensor as well as panchromatic infrared images treated by building a data set composed of overlapping image patches. We consider detection performance as a function of patch neighborhood size as well as embedding parameters such as kernel bandwidth and dimension. ROC curves are generated over a range of parameters and compared to RX performance.
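
    The background-model-and-projection-error scheme can be sketched with kernel PCA, as considered in the abstract. The ring-shaped toy background, RBF bandwidth, and component count are illustrative, not the paper's imagery or settings:

```python
import numpy as np

rng = np.random.default_rng(3)

def rbf(A, B, gamma):
    """Gaussian (RBF) kernel matrix between two sets of row vectors."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KPCADetector:
    """Kernel PCA background model; the anomaly score of a sample is its
    squared projection error in the learned feature space."""
    def __init__(self, gamma=2.0, n_components=8):
        self.gamma, self.q = gamma, n_components

    def fit(self, X):
        self.X = X
        n = len(X)
        K = rbf(X, X, self.gamma)
        self.K_mean, self.K_col = K.mean(), K.mean(axis=0)
        H = np.eye(n) - np.full((n, n), 1.0 / n)
        lam, V = np.linalg.eigh(H @ K @ H)           # centred kernel matrix
        lam, V = lam[::-1][:self.q], V[:, ::-1][:, :self.q]
        self.alpha = V / np.sqrt(lam)                # unit feature-space axes
        return self

    def score(self, Z):
        k = rbf(Z, self.X, self.gamma)
        row = k.mean(axis=1, keepdims=True)
        kc = k - row - self.K_col[None, :] + self.K_mean   # centred test kernel
        proj = kc @ self.alpha                       # coordinates in learned space
        norm2 = 1.0 - 2.0 * row[:, 0] + self.K_mean  # ||centred phi(z)||^2
        return norm2 - (proj ** 2).sum(axis=1)       # projection error

# Background manifold: a noisy ring; an anomaly sits at its centre
theta = rng.uniform(0.0, 2.0 * np.pi, 200)
ring = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((200, 2))
det = KPCADetector().fit(ring)

bg_scores = det.score(ring[:20])
anom_score = det.score(np.array([[0.0, 0.0]]))[0]
```

    The rarity argument from the abstract applies here too: fitting on a uniform subsample would leave the learned model essentially unchanged, while off-manifold points still project poorly.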

  14. Detection of inhomogeneities in precipitation time series in Portugal using direct sequential simulation

    NASA Astrophysics Data System (ADS)

    Ribeiro, Sara; Caineta, Júlio; Costa, Ana Cristina; Henriques, Roberto; Soares, Amílcar

    2016-05-01

    Climate data homogenisation is of major importance in climate change monitoring, validation of weather forecasting, general circulation and regional atmospheric models, modelling of erosion, drought monitoring, among other studies of hydrological and environmental impacts. The reason is that non-climate factors can cause time series discontinuities which may hide the true climatic signal and patterns, thus potentially biasing the conclusions of those studies. In the last two decades, many methods have been developed to identify and remove these inhomogeneities. One of those is based on a geostatistical simulation technique (DSS - direct sequential simulation), where local probability density functions (pdfs) are calculated at candidate monitoring stations using spatial and temporal neighbouring observations, which are then used for the detection of inhomogeneities. Such an approach has been previously applied to detect inhomogeneities in four precipitation series (wet day count) from a network with 66 monitoring stations located in the southern region of Portugal (1980-2001). That study revealed promising results and the potential advantages of geostatistical techniques for inhomogeneity detection in climate time series. This work extends the case study presented before and investigates the application of the geostatistical stochastic approach to ten precipitation series that were previously classified as inhomogeneous by one of six absolute homogeneity tests (Mann-Kendall, Wald-Wolfowitz runs, Von Neumann ratio, Pettitt, Buishand range test, and standard normal homogeneity test (SNHT) for a single break). Moreover, a sensitivity analysis is performed to investigate the number of simulated realisations which should be used to infer the local pdfs with more accuracy. Accordingly, the number of simulations per iteration was increased from 50 to 500, which resulted in a more representative local pdf. As in the previous study, the results are compared with those from the

  15. Inhomogeneities detection in annual precipitation time series in Portugal using direct sequential simulation

    NASA Astrophysics Data System (ADS)

    Caineta, Júlio; Ribeiro, Sara; Costa, Ana Cristina; Henriques, Roberto; Soares, Amílcar

    2014-05-01

    Climate data homogenisation is of major importance in monitoring climate change, the validation of weather forecasting, general circulation and regional atmospheric models, modelling of erosion, drought monitoring, among other studies of hydrological and environmental impacts. This is because non-climate factors can cause time series discontinuities which may hide the true climatic signal and patterns, thus potentially biasing the conclusions of those studies. In the last two decades, many methods have been developed to identify and remove these inhomogeneities. One of those is based on geostatistical simulation (DSS - direct sequential simulation), where local probability density functions (pdfs) are calculated at candidate monitoring stations, using spatial and temporal neighbouring observations, and are then used for the detection of inhomogeneities. This approach has been previously applied to detect inhomogeneities in four precipitation series (wet day count) from a network with 66 monitoring stations located in the southern region of Portugal (1980-2001). That study revealed promising results and the potential advantages of geostatistical techniques for inhomogeneity detection in climate time series. This work extends the case study presented before and investigates the application of the geostatistical stochastic approach to ten precipitation series that were previously classified as inhomogeneous by one of six absolute homogeneity tests (Mann-Kendall test, Wald-Wolfowitz runs test, Von Neumann ratio test, standard normal homogeneity test (SNHT) for a single break, Pettitt test, and Buishand range test). Moreover, a sensitivity analysis is implemented to investigate the number of simulated realisations that should be used to accurately infer the local pdfs. Accordingly, the number of simulations per iteration is increased from 50 to 500, which resulted in a more representative local pdf. A set of default and recommended settings is provided, which will help
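
    A very rough sketch of the detection logic: the DSS realisations are replaced by simple bootstrap resamples of neighbouring stations, so only the idea of comparing the candidate with a local pdf of simulated values survives; all numbers are synthetic:

```python
import numpy as np

rng = np.random.default_rng(8)

def detect_inhomogeneity(candidate, neighbours, n_real=500, level=0.025):
    """Flag years where the candidate series falls outside the central
    (1 - 2*level) interval of a local pdf built from simulated
    realisations (bootstrap stand-in for DSS realisations)."""
    years, n_nb = neighbours.shape
    flags = np.zeros(years, dtype=bool)
    for t in range(years):
        # each realisation: a resampled combination of neighbour values
        real = rng.choice(neighbours[t], size=(n_real, n_nb),
                          replace=True).mean(axis=1)
        lo, hi = np.quantile(real, [level, 1.0 - level])
        flags[t] = not (lo <= candidate[t] <= hi)
    return flags

# Synthetic wet-day counts: 8 neighbour stations; the candidate series
# acquires an artificial break (e.g. a station relocation) in year 10
years = 22
neighbours = rng.normal(100.0, 8.0, (years, 8))
candidate = neighbours.mean(axis=1) + rng.normal(0.0, 2.0, years)
candidate[10:] += 25.0

flags = detect_inhomogeneity(candidate, neighbours)
```

    Increasing `n_real` plays the same role as increasing the number of DSS simulations per iteration: the local pdf, and hence its quantiles, become more stable.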

  16. A sequential Monte Carlo probability hypothesis density algorithm for multitarget track-before-detect

    NASA Astrophysics Data System (ADS)

    Punithakumar, K.; Kirubarajan, T.; Sinha, A.

    2005-09-01

    In this paper, we present a recursive track-before-detect (TBD) algorithm based on the probability hypothesis density (PHD) filter for multitarget tracking. TBD algorithms are better suited than standard target tracking methods to tracking dim targets in heavy clutter and noise. In classical target tracking, pre-processing the measurements at each time step before passing them to the tracking filter results in information loss, which is very damaging if the target signal-to-noise ratio is low. In TBD, by contrast, the tracking filter operates directly on the raw measurements, at the expense of added computational burden. A recursive TBD algorithm reduces the computational burden relative to conventional TBD methods such as the Hough transform and dynamic programming. TBD is a hard nonlinear, non-Gaussian problem even for single-target scenarios. Recent advances in Sequential Monte Carlo (SMC) based nonlinear filtering make multitarget TBD feasible. However, current implementations accommodate the varying number of targets with a modeling setup in which a multiple-model SMC-based TBD approach solves the problem conditioned on the model, i.e., the number of targets. The PHD filter, which propagates only the first-order statistical moment (the PHD) of the full target posterior, has been shown to be a computationally efficient solution to multitarget tracking problems with a varying number of targets. We propose a PHD filter based TBD so that no assumption need be made about the number of targets. Simulation results are presented to show the effectiveness of the proposed filter in tracking multiple weak targets.
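
    A single-target bootstrap SMC sketch of track-before-detect (not the proposed PHD filter): the particles are weighted by the likelihood of each raw frame rather than by thresholded detections, which is why dim targets survive. The scenario parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Raw sensor frames: a dim Gaussian blob drifting through unit-variance noise
n_cells, T, amp = 50, 30, 2.0
cells = np.arange(n_cells)
truth = 10.0 + 0.8 * np.arange(T)                   # true target track
frames = rng.standard_normal((T, n_cells))
for t in range(T):
    frames[t] += amp * np.exp(-0.5 * (cells - truth[t]) ** 2)

# Bootstrap particle filter operating directly on the raw frames (TBD)
N = 2000
pos = rng.uniform(0.0, n_cells, N)                  # particle positions
vel = rng.normal(0.0, 1.0, N)                       # particle velocities
for t in range(T):
    pos = pos + vel + rng.normal(0.0, 0.1, N)       # near-constant-velocity motion
    vel = vel + rng.normal(0.0, 0.05, N)
    # log-likelihood ratio of the raw frame given a target at each particle
    template = amp * np.exp(-0.5 * (cells[None, :] - pos[:, None]) ** 2)
    loglik = (frames[t][None, :] * template - 0.5 * template ** 2).sum(axis=1)
    w = np.exp(loglik - loglik.max())
    w /= w.sum()
    idx = rng.choice(N, size=N, p=w)                # multinomial resampling
    pos, vel = pos[idx], vel[idx]

estimate = pos.mean()                               # posterior mean position
```

    Noise peaks do not persist at consistent positions across frames, so repeated resampling concentrates the particles on the true, persistent track.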

  17. Hypergraph-based anomaly detection of high-dimensional co-occurrences.

    PubMed

    Silva, Jorge; Willett, Rebecca

    2009-03-01

    This paper addresses the problem of detecting anomalous multivariate co-occurrences using a limited number of unlabeled training observations. A novel method based on using a hypergraph representation of the data is proposed to deal with this very high-dimensional problem. Hypergraphs constitute an important extension of graphs which allow edges to connect more than two vertices simultaneously. A variational Expectation-Maximization algorithm for detecting anomalies directly on the hypergraph domain without any feature selection or dimensionality reduction is presented. The resulting estimate can be used to calculate a measure of anomalousness based on the False Discovery Rate. The algorithm has O(np) computational complexity, where n is the number of training observations and p is the number of potential participants in each co-occurrence event. This efficiency makes the method ideally suited for very high-dimensional settings, and requires no tuning, bandwidth or regularization parameters. The proposed approach is validated on both high-dimensional synthetic data and the Enron email database, where p > 75,000, and it is shown that it can outperform other state-of-the-art methods.

  18. Para-GMRF: parallel algorithm for anomaly detection of hyperspectral image

    NASA Astrophysics Data System (ADS)

    Dong, Chao; Zhao, Huijie; Li, Na; Wang, Wei

    2007-12-01

    A hyperspectral imager collects hundreds of images, corresponding to different wavelength channels, for the observed area simultaneously, which makes it possible to discriminate man-made objects from natural background. However, the price paid for this wealth of information is an enormous amount of data, usually hundreds of gigabytes per day. Turning this huge volume of data into useful information and knowledge in real time is critical for geoscientists. The parallel Gaussian-Markov random field (Para-GMRF) anomaly detection algorithm proposed in this paper is an attempt to apply parallel computing technology to this problem. Exploiting the locality of the GMRF algorithm, we partition the 3-D hyperspectral image cube in the spatial domain and distribute data blocks to multiple computers for concurrent detection. Meanwhile, to achieve load balance, a work pool scheduler is designed for task assignment. The Para-GMRF algorithm is organized in a master-slave architecture, coded in the C programming language using the message passing interface (MPI) library, and tested on a Beowulf cluster. Experimental results show that Para-GMRF meets the challenge and can be used in time-sensitive areas such as environmental monitoring and battlefield reconnaissance.
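
    The spatial-partitioning idea can be sketched in Python, with a thread-pool "master-slave" scheduler standing in for MPI and a global RX (Mahalanobis) detector standing in for the GMRF model; block size, cube dimensions, and the implanted target are all illustrative:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(5)

# Synthetic hyperspectral cube: 60x60 pixels, 20 bands, one implanted target
cube = rng.normal(0.0, 1.0, (60, 60, 20)) + np.linspace(0.0, 1.0, 20)
cube[45, 12] += 6.0                        # anomalous spectrum

# Global background statistics (RX detector stand-in for the GMRF model)
flat = cube.reshape(-1, 20)
mu = flat.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(flat, rowvar=False))

def detect_block(args):
    """Slave task: RX anomaly score for every pixel of one spatial block."""
    r0, r1 = args
    d = cube[r0:r1].reshape(-1, 20) - mu
    return r0, (d @ cov_inv * d).sum(axis=1).reshape(r1 - r0, 60)

# Master: partition the cube in the spatial domain and farm blocks out
blocks = [(r, min(r + 15, 60)) for r in range(0, 60, 15)]
scores = np.empty((60, 60))
with ThreadPoolExecutor(max_workers=4) as pool:
    for r0, block_scores in pool.map(detect_block, blocks):
        scores[r0:r0 + block_scores.shape[0]] = block_scores

peak = np.unravel_index(scores.argmax(), scores.shape)
```

    Because the per-pixel statistic depends only on local data plus shared global statistics, the blocks are independent tasks, which is exactly what makes the work-pool scheduling and MPI distribution in the paper effective.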

  19. Insider threat detection enabled by converting user applications into fractal fingerprints and autonomously detecting anomalies

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger M.; Handley, James

    2012-06-01

    We demonstrate insider threat detection for determining when the behavior of a computer user is suspicious or different from his or her normal behavior. This is accomplished by combining features extracted from text, emails, and blogs that are associated with the user. These sources can be characterized using QUEST, DANCER, and MenTat to extract features; however, some of these features are still in text form. We show how to convert these features into numerical form and characterize them using parametric and non-parametric statistics. These features are then used as input into a Random Forest classifier that is trained to recognize whenever the user's behavior is suspicious or different from normal (off-nominal). Active authentication (user identification) is also demonstrated using the features and classifiers derived in this work. We also introduce a novel concept for remotely monitoring user behavior indicator patterns displayed as an infrared overlay on the computer monitor, which the user is unaware of, but a narrow pass-band filtered webcam can clearly distinguish. The results of our analysis are presented.

  20. Sequential acquisition of multi-dimensional heteronuclear chemical shift correlation spectra with 1H detection

    NASA Astrophysics Data System (ADS)

    Bellstedt, Peter; Ihle, Yvonne; Wiedemann, Christoph; Kirschstein, Anika; Herbst, Christian; Görlach, Matthias; Ramachandran, Ramadurai

    2014-03-01

    RF pulse schemes for the simultaneous acquisition of heteronuclear multi-dimensional chemical shift correlation spectra, such as {HA(CA)NH & HA(CACO)NH}, {HA(CA)NH & H(N)CAHA} and {H(N)CAHA & H(CC)NH}, that are commonly employed in the study of moderately-sized protein molecules, have been implemented using dual sequential 1H acquisitions in the direct dimension. Such an approach is not only beneficial in terms of the reduction of experimental time as compared to data collection via two separate experiments but also facilitates the unambiguous sequential linking of the backbone amino acid residues. The potential of the sequential 1H data acquisition procedure in the study of RNA is also demonstrated here.

  1. Sequential acquisition of multi-dimensional heteronuclear chemical shift correlation spectra with 1H detection

    PubMed Central

    Bellstedt, Peter; Ihle, Yvonne; Wiedemann, Christoph; Kirschstein, Anika; Herbst, Christian; Görlach, Matthias; Ramachandran, Ramadurai

    2014-01-01

    RF pulse schemes for the simultaneous acquisition of heteronuclear multi-dimensional chemical shift correlation spectra, such as {HA(CA)NH & HA(CACO)NH}, {HA(CA)NH & H(N)CAHA} and {H(N)CAHA & H(CC)NH}, that are commonly employed in the study of moderately-sized protein molecules, have been implemented using dual sequential 1H acquisitions in the direct dimension. Such an approach is not only beneficial in terms of the reduction of experimental time as compared to data collection via two separate experiments but also facilitates the unambiguous sequential linking of the backbone amino acid residues. The potential of the sequential 1H data acquisition procedure in the study of RNA is also demonstrated here. PMID:24671105

  2. One sliding PCA method to detect ionospheric anomalies before strong earthquakes: Case studies of the Qinghai, Honshu, Hotan and Nepal earthquakes

    NASA Astrophysics Data System (ADS)

    Chang, Xiaotao; Zou, Bin; Guo, Jinyun; Zhu, Guangbin; Li, Wang; Li, Wudong

    2017-04-01

    A sliding principal component analysis (PCA) method is proposed to detect pre-earthquake ionospheric anomalies. We analyzed the precision of this new method with different lengths of the time window in detecting the reference background total electron content (TEC). The results showed that the most suitable window length is 27 days, which is consistent with the solar rotation period. We compared the precision of this new method with those of the sliding interquartile range (IQR) method and the sliding average method in detecting the background TECs, and found that the precision of sliding PCA is better than those of the traditional methods in the middle and low latitudes, because the background TEC residual errors detected by the sliding PCA method are smaller than those of the traditional methods. We adopted a more reasonable method to calculate the upper and lower bounds of the background value and then took four strong earthquakes (the Qinghai, Honshu, Hotan and Nepal earthquakes) as examples to prove the effectiveness of the sliding PCA method. From the detection results of the sliding PCA we found that obvious ionospheric anomalies appeared on April 1, 2010, March 8, 2011, February 2, 2014, and April 11 and 23, 2015. After further analysis of the solar-terrestrial environment and the distribution of the TEC anomaly area, it can be considered that the pre-earthquake anomalies on these days may be strongly correlated with the following earthquakes. In addition, the TEC anomaly area spans widely in the longitude direction along the boundary of the equatorial anomaly zone, and the anomalous morphology has a conjugate structure. This regularity can provide a valuable reference for short-impending earthquake prediction in the future.
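
The sliding-window background estimation described above can be sketched in NumPy. This is a hedged illustration, not the authors' exact scheme: the synthetic TEC series, the rank-1 PCA reconstruction, and the IQR-style upper bound are all assumptions chosen to show the mechanics of a 27-day sliding window.

```python
# Minimal sketch of sliding-window PCA background TEC estimation with an
# IQR-based anomaly bound. Data and thresholds are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_days, n_epochs = 60, 12            # 60 days, 12 TEC samples per day
base = 20 + 5 * np.sin(np.linspace(0, 2 * np.pi, n_epochs))
tec = base + rng.normal(0, 0.5, size=(n_days, n_epochs))
tec[45] += 6.0                       # inject an "anomalous" day

window = 27                          # matches the solar rotation period

def day_is_anomalous(day):
    past = tec[day - window:day]                 # 27-day training window
    mean = past.mean(axis=0)
    _, _, Vt = np.linalg.svd(past - mean, full_matrices=False)
    pc1 = Vt[0]                                  # first principal direction
    proj = (tec[day] - mean) @ pc1
    background = mean + proj * pc1               # rank-1 reconstruction
    resid = tec[day] - background
    # Residuals of the window days around their own rank-1 background:
    past_resid = past - (mean + ((past - mean) @ pc1)[:, None] * pc1)
    q1, q3 = np.percentile(past_resid, [25, 75])
    upper = q3 + 1.5 * (q3 - q1)                 # IQR-style upper bound
    return bool(np.any(resid > upper))

flags = [day_is_anomalous(d) for d in range(window, n_days)]
```

The injected day stands well outside the bound, while most ordinary days do not; in practice the bound multiplier and window handling would be tuned against real TEC data.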

  3. Maternal psychological responses during pregnancy after ultrasonographic detection of structural fetal anomalies: A prospective longitudinal observational study

    PubMed Central

    Kaasen, Anne; Helbig, Anne; Malt, Ulrik F.; Næs, Tormod; Skari, Hans; Haugen, Guttorm

    2017-01-01

    In this longitudinal prospective observational study performed at a tertiary perinatal referral centre, we aimed to assess maternal distress in pregnancy in women with ultrasound findings of fetal anomaly and compare this with distress in pregnant women with normal ultrasound findings. Pregnant women with a structural fetal anomaly (n = 48) and normal ultrasound (n = 105) were included. We administered self-report questionnaires (General Health Questionnaire-28, Impact of Event Scale-22 [IES], and Edinburgh Postnatal Depression Scale) a few days following ultrasound detection of a fetal anomaly or a normal ultrasound (T1), 3 weeks post-ultrasound (T2), and at 30 (T3) and 36 weeks gestation (T4). Social dysfunction, health perception, and psychological distress (intrusion, avoidance, arousal, anxiety, and depression) were the main outcome measures. The median gestational age at T1 was 20 and 19 weeks in the group with and without fetal anomaly, respectively. In the fetal anomaly group, all psychological distress scores were highest at T1. In the group with a normal scan, distress scores were stable throughout pregnancy. At all assessments, the fetal anomaly group scored significantly higher (especially on depression-related questions) compared to the normal scan group, except on the IES Intrusion and Arousal subscales at T4, although with large individual differences. In conclusion, women with a known fetal anomaly initially had high stress scores, which gradually decreased, resembling those in women with a normal pregnancy. Psychological stress levels were stable and low during the latter half of gestation in women with a normal pregnancy. PMID:28350879

  4. Supervised Classification Method with Efficient Filter Techniques to Detect Anomalies on Earthen Levees Using Synthetic Aperture Radar Imagery

    NASA Astrophysics Data System (ADS)

    Marapareddy, Ramakalavathi; Aanstoos, James V.; Younan, Nicolas H.

    2016-08-01

    The dynamics of surface and subsurface water events can lead to slope instability, resulting in slough slides or other anomalies on earthen levees. These slough slides are the primary cause of levee areas that are vulnerable to seepage and failure during high-water events. Early detection of these anomalies by a remote sensing approach could save time versus direct assessment. In this paper, we implemented a supervised classification algorithm, the minimum distance classifier, with a majority filter and a morphology filter for the identification of anomalies on levees using polarimetric Synthetic Aperture Radar (polSAR) data. This study employed remote sensing data from the NASA Jet Propulsion Laboratory's (JPL's) Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) instrument, using its fully quad-polarimetric L-band polSAR data. The study area is a section of the lower Mississippi River in the southern USA.
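
The two stages named above, minimum-distance classification followed by a majority filter, can be sketched as follows. This is a hedged toy version: the synthetic three-band "image", the class means, and the 3x3 majority window are assumptions, not the UAVSAR processing chain.

```python
# Sketch: per-pixel minimum-distance classification, then a 3x3 majority
# filter to remove isolated misclassifications. All data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
H, W, F = 40, 40, 3                       # image size, feature bands

# Two classes: healthy levee (0) vs. slide anomaly (1) in the right half.
means = np.array([[0.0, 0.0, 0.0], [2.0, 2.0, 2.0]])
truth = np.zeros((H, W), dtype=int)
truth[:, W // 2:] = 1
img = means[truth] + rng.normal(0, 0.8, size=(H, W, F))

# Minimum-distance classification: assign each pixel to the nearest mean.
dists = np.linalg.norm(img[..., None, :] - means, axis=-1)   # (H, W, 2)
labels = dists.argmin(axis=-1)

# 3x3 majority filter over the binary label map.
padded = np.pad(labels, 1, mode='edge')
votes = sum(padded[i:i + H, j:j + W] for i in range(3) for j in range(3))
filtered = (votes >= 5).astype(int)       # majority of the 9 neighbors

acc_raw = (labels == truth).mean()
acc_filtered = (filtered == truth).mean()
```

The majority filter cleans up isolated noise-driven errors without disturbing the straight class boundary, so the filtered accuracy is at least as high as the raw one on this toy scene.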

  5. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks.

    PubMed

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-06-13

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens' quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and 26% higher in a scenario with a false positive rate of 15%.
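
The technique the study favors, a one-class SVM trained only on normal traffic, can be sketched with scikit-learn. This is an illustration under stated assumptions: the synthetic sensor features, attack offset, and the `nu`/`gamma` values are stand-ins, not the Barcelona data or the paper's tuning.

```python
# Sketch: one-class SVM anomaly detection for WSN readings.
# Train on normal data only; predict() returns -1 for outliers.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(3)
normal_train = rng.normal(0, 1, size=(500, 4))    # normal feature vectors
normal_test = rng.normal(0, 1, size=(200, 4))
attack_test = rng.normal(4, 1, size=(50, 4))      # tampered readings

ocsvm = OneClassSVM(kernel='rbf', nu=0.05, gamma='scale').fit(normal_train)
tpr = (ocsvm.predict(attack_test) == -1).mean()   # true positive rate
fpr = (ocsvm.predict(normal_test) == -1).mean()   # false positive rate
```

`nu` upper-bounds the fraction of training points treated as outliers, so it roughly sets the operating false-positive rate, which is how scenarios like the 5% and 15% bounds above are configured.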

  6. Anomaly Identification from Super-Low Frequency Electromagnetic Data for the Coalbed Methane Detection

    NASA Astrophysics Data System (ADS)

    Zhao, S. S.; Wang, N.; Hui, J.; Ye, X.; Qin, Q.

    2016-06-01

    Natural-source Super Low Frequency (SLF) electromagnetic prospecting methods have become an increasingly promising tool in resource detection. The capacity estimation of the reservoirs is of great importance to evaluate their exploitation potential. In this paper, we built a signal-estimation model for the SLF electromagnetic signal and processed the monitored data with an adaptive filter. The non-normality test showed that the distribution of the signal was obviously different from a Gaussian probability distribution, and the Class B instantaneous amplitude probability model can well describe the statistical properties of SLF electromagnetic data. Class B model parameter estimation is very complicated because its kernel function is a confluent hypergeometric function. The parameters of the model were estimated from the property spectral function using the Least Square Gradient Method (LSGM). A simulation of this estimation method was carried out, and the results demonstrated that the LSGM can recover important information about the Class B signal model, of which the Gaussian component was considered to be the systematic and random noise, and the Intermediate Event Component was considered to be the background ground noise and human-activity noise. The observation data were then processed using an adaptive noise-cancellation filter. With the noise components subtracted out adaptively, the remaining part is the signal of interest, i.e., the anomaly information, which was considered to be relevant to the reservoir position of the coalbed methane stratum.
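
The adaptive noise-cancellation stage mentioned above can be sketched with a basic LMS filter: a reference noise channel is adaptively filtered and subtracted from the primary channel, leaving the signal of interest. This is a generic textbook sketch, not the paper's filter; the signal shapes, filter length, and step size are illustrative assumptions.

```python
# Sketch: LMS adaptive noise cancellation with a reference noise channel.
# All signals are synthetic.
import numpy as np

rng = np.random.default_rng(4)
n = 4000
t = np.arange(n)
signal = 0.5 * np.sin(2 * np.pi * t / 400)        # "anomaly" component
noise_ref = rng.normal(0, 1, size=n)              # reference noise channel

# Primary channel: signal + a causally filtered copy of the reference noise.
corrupt = 0.8 * noise_ref
corrupt[1:] += -0.3 * noise_ref[:-1]
corrupt[2:] += 0.1 * noise_ref[:-2]
primary = signal + corrupt

taps, mu = 8, 0.01
w = np.zeros(taps)
out = np.zeros(n)
for i in range(taps, n):
    x = noise_ref[i - taps + 1:i + 1][::-1]       # current + past reference
    y = w @ x                                     # estimated noise in primary
    e = primary[i] - y                            # error = cleaned output
    w += 2 * mu * e * x                           # LMS weight update
    out[i] = e

# After convergence the output should track the clean signal.
err_before = np.mean((primary[2000:] - signal[2000:]) ** 2)
err_after = np.mean((out[2000:] - signal[2000:]) ** 2)
```

Because the sinusoid is uncorrelated with the reference channel, the filter converges toward cancelling only the correlated noise, so the residual error drops well below the uncorrected error.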

  7. Symmetry fractionalization and anomaly detection in three-dimensional topological phases

    NASA Astrophysics Data System (ADS)

    Chen, Xie; Hermele, Michael

    2016-11-01

    In a phase with fractional excitations, topological properties are enriched in the presence of global symmetry. In particular, fractional excitations can transform under symmetry in a fractionalized manner, resulting in different symmetry enriched topological (SET) phases. While a good deal is now understood in 2D regarding what symmetry fractionalization patterns are possible, the situation in 3D is much more open. A new feature in 3D is the existence of loop excitations, so to study 3D SET phases, first we need to understand how to properly describe the fractionalized action of symmetry on loops. Using a dimensional reduction procedure, we show that these loop excitations exist as the boundary between two 2D SET phases, and the symmetry action is characterized by the corresponding difference in SET orders. Moreover, similar to the 2D case, we find that some seemingly possible symmetry fractionalization patterns are actually anomalous and cannot be realized strictly in 3D. We detect such anomalies using the flux fusion method we introduced previously in 2D. To illustrate these ideas, we use the 3D Z2 gauge theory with Z2 global symmetry as an example, and enumerate and describe the corresponding SET phases. In particular, we find four nonanomalous SET phases and one anomalous SET phase, which we show can be realized as the surface of a 4D system with symmetry protected topological order.

  8. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks

    PubMed Central

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-01-01

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens’ quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and 26% higher in a scenario with a false positive rate of 15%. PMID:27304957

  9. Experiments to Detect Clandestine Graves from Interpreted High Resolution Geophysical Anomalies

    NASA Astrophysics Data System (ADS)

    Molina, C. M.; Hernandez, O.; Pringle, J.

    2013-05-01

    This project concerns the search for clandestine sites where missing people may have been buried, based on interpreted near-surface high-resolution geophysical anomalies. Nowadays, there are thousands of missing people around the world who may have been tortured, killed, and buried in clandestine graves. This is a huge problem for their families and for governments, which are responsible for guaranteeing human rights for everybody. These people need to be found and the related crime cases need to be resolved. This work proposes to construct a series of graves where all the conditions of the grave, human remains, and related objects are known. We expect to detect contrasting physical properties of the soil that identify the known human remains and objects. The proposed geophysical methods will include electrical tomography, magnetics, and ground-penetrating radar, among others. Two geographical sites with contrasting weather, soil, vegetation, geographic, and geologic conditions will be selected in which to locate and build standard graves. Forward and inverse modeling will be applied to locate and enhance the geophysical response of the known graves and to validate the methodology. As a result, an integrated geophysical program will be provided to support the search for clandestine graves, helping to find missing people who have been illegally buried. Optionally, the methodology will be tested in the search for real clandestine graves.

  10. Sequential strand displacement beacon for detection of DNA coverage on functionalized gold nanoparticles.

    PubMed

    Paliwoda, Rebecca E; Li, Feng; Reid, Michael S; Lin, Yanwen; Le, X Chris

    2014-06-17

    Functionalizing nanomaterials for diverse analytical, biomedical, and therapeutic applications requires determination of surface coverage (or density) of DNA on nanomaterials. We describe a sequential strand displacement beacon assay that is able to quantify specific DNA sequences conjugated or coconjugated onto gold nanoparticles (AuNPs). Unlike the conventional fluorescence assay that requires the target DNA to be fluorescently labeled, the sequential strand displacement beacon method is able to quantify multiple unlabeled DNA oligonucleotides using a single (universal) strand displacement beacon. This unique feature is achieved by introducing two short unlabeled DNA probes for each specific DNA sequence and by performing sequential DNA strand displacement reactions. Varying the relative amounts of the specific DNA sequences and spacing DNA sequences during their coconjugation onto AuNPs results in different densities of the specific DNA on AuNP, ranging from 90 to 230 DNA molecules per AuNP. Results obtained from our sequential strand displacement beacon assay are consistent with those obtained from the conventional fluorescence assays. However, labeling of DNA with some fluorescent dyes, e.g., tetramethylrhodamine, alters DNA density on AuNP. The strand displacement strategy overcomes this problem by obviating direct labeling of the target DNA. This method has broad potential to facilitate more efficient design and characterization of novel multifunctional materials for diverse applications.

  11. Detection of electromagnetic anomalies related to volcanic eruptions by DEMETER micro-satellite: August 2004 - December 2010

    NASA Astrophysics Data System (ADS)

    Zlotnicki, J.; Li, F.; Parrot, M.

    2012-04-01

    More than 1500 volcanoes on the Earth can potentially enter into eruption, but only some tens of them are equipped with dense and complex monitoring networks. In the electromagnetic (EM) field, a long history of ground observations, data processing, and analysis shows that EM signals often appear before volcanic eruptions. Their characteristics vary widely from one type of volcano to another, ranging from smooth, continuous, and slow changes over years to rapid signals of large amplitude during the hours preceding the eruptive phases. The possibility that volcanic eruptions may also be preceded by transient electromagnetic anomalies in the ionosphere can be analyzed through DEMETER, a micro-satellite launched by the French National Space Agency (CNES) and devoted to the detection of ionospheric disturbances generated by natural hazards and human activity. EM studies can be performed on the records corresponding to the lifetime of the satellite: August 2004 to December 2010. The first study focuses on the identification of ionospheric anomalies above erupting volcanoes within a time window starting 60 days before the surface activity and ending 15 days after. A threshold distance between the footprint of the satellite and the volcano was fixed at 500 or 900 km depending on the Volcanic Explosivity Index (VEI ≤ 1 or VEI > 1). Five types of ionospheric anomalies were detected, which may involve electric and/or magnetic anomalies, ionic or electronic densities, and temperatures. 136 eruptions located within latitudes [50°S, 50°N], where large natural magnetic activity does not arise too frequently, occurred during the 6.5 years of records; 89 of them were accompanied by ionospheric anomalies, and 269 anomalies were recorded in total. The peak of the number of anomalies appears between -30 days and -15 days. The second study is related to ionospheric disturbances detected by the DEMETER satellite over active volcanoes subject to volcanic lightning.

  12. Reasoning about anomalies: a study of the analytical process of detecting and identifying anomalous behavior in maritime traffic data

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran; Ziemke, Tom; Kronhamn, Thomas

    2009-05-01

    The goal of visual analytics tools is to support the analytical reasoning process, maximizing human perceptual, understanding, and reasoning capabilities in complex and dynamic situations. Visual analytics software must be built upon an understanding of the reasoning process, since it must provide appropriate interactions that allow a true discourse with the information. In order to deepen our understanding of the human analytical process and guide developers in the creation of more efficient anomaly detection systems, this paper investigates the human analytical process of detecting and identifying anomalous behavior in maritime traffic data. The main focus of this work is to capture the entire analysis process that an analyst goes through, from the raw data to the detection and identification of anomalous behavior. Three different sources are used in this study: a literature survey of the science of analytical reasoning, requirements specified by experts from organizations with an interest in port security, and user field studies conducted in different marine surveillance control centers. Furthermore, this study elaborates on how to support the human analytical process using data mining, visualization, and interaction methods. The contribution of this paper is twofold: (1) within visual analytics, it contributes to the science of analytical reasoning with a practical understanding of users' tasks in order to develop a taxonomy of interactions that support the analytical reasoning process, and (2) within anomaly detection, it facilitates the design of future anomaly detection systems where fully automatic approaches are not viable and human participation is needed.

  13. MODVOLC2: A Hybrid Time Series Analysis for Detecting Thermal Anomalies Applied to Thermal Infrared Satellite Data

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Wright, R.; Pilger, E.

    2009-12-01

    We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT) to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, and it detects pixels that fall outside the envelope of normal thermal behavior, (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than using MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for the MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. 
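
Steps (2) and (3) of the algorithm summarized above, building per-pixel reference and variability images and flagging departures from the normal envelope, can be sketched in NumPy. This is a hedged toy version: the synthetic brightness-temperature series and the z-score threshold are illustrative assumptions, not the MODVOLC2 parameters.

```python
# Sketch: reference/variability images from a thermal time series, then
# detection of pixels outside the normal envelope. Data are synthetic.
import numpy as np

rng = np.random.default_rng(5)
n_images, H, W = 80, 16, 16                 # ~80 images per calendar month
series = rng.normal(290.0, 1.0, size=(n_images, H, W))   # brightness temps (K)

reference = series.mean(axis=0)             # per-pixel reference image
variability = series.std(axis=0)            # per-pixel variability image

new_image = rng.normal(290.0, 1.0, size=(H, W))
new_image[8, 8] += 12.0                     # a hot, anomalous pixel

z = (new_image - reference) / variability   # normalized departure
anomalies = z > 4.0                         # pixels outside the envelope
```

The hot pixel stands many standard deviations above its reference while ordinary pixels stay inside the envelope, which is why the abstract stresses having enough images (~80 per month) to make the reference and variability statistics stable.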

  14. Discrete shearlet transform on GPU with applications in anomaly detection and denoising

    NASA Astrophysics Data System (ADS)

    Gibert, Xavier; Patel, Vishal M.; Labate, Demetrio; Chellappa, Rama

    2014-12-01

    Shearlets have emerged in recent years as one of the most successful methods for the multiscale analysis of multidimensional signals. Unlike wavelets, shearlets form a pyramid of well-localized functions defined not only over a range of scales and locations, but also over a range of orientations and with highly anisotropic supports. As a result, shearlets are much more effective than traditional wavelets in handling the geometry of multidimensional data, and this has been exploited in a wide range of image and signal processing applications. However, despite their desirable properties, the wider applicability of shearlets is limited by the computational complexity of current software implementations. For example, denoising a single 512 × 512 image using a current implementation of the shearlet-based shrinkage algorithm can take between 10 s and 2 min, depending on the number of CPU cores, and much longer processing times are required for video denoising. On the other hand, due to the parallel nature of the shearlet transform, it is possible to use graphics processing units (GPU) to accelerate its implementation. In this paper, we present an open source stand-alone implementation of the 2D discrete shearlet transform using CUDA C++ as well as GPU-accelerated MATLAB implementations of the 2D and 3D shearlet transforms. We have instrumented the code so that we can analyze the running time of each kernel under different GPU hardware. In addition to denoising, we describe a novel application of shearlets for detecting anomalies in textured images. In this application, computation times can be reduced by a factor of 50 or more, compared to multicore CPU implementations.

  15. Finding Cardinality Heavy-Hitters in Massive Traffic Data and Its Application to Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Ishibashi, Keisuke; Mori, Tatsuya; Kawahara, Ryoichi; Hirokawa, Yutaka; Kobayashi, Atsushi; Yamamoto, Kimihiro; Sakamoto, Hitoaki; Asano, Shoichiro

    introduce an application of our algorithm to anomaly detection. With actual traffic data, our method could successfully detect a sudden network scan.

  16. DNA sequencing by a single molecule detection of labeled nucleotides sequentially cleaved from a single strand of DNA

    SciTech Connect

    Goodwin, P.M.; Schecker, J.A.; Wilkerson, C.W.; Hammond, M.L.; Ambrose, W.P.; Jett, J.H.; Martin, J.C.; Marrone, B.L.; Keller, R.A.; Haces, A.; Shih, P.J.; Harding, J.D.

    1993-01-01

    We are developing a laser-based technique for the rapid sequencing of large DNA fragments (several kb in size) at a rate of 100 to 1000 bases per second. Our approach relies on fluorescent labeling of the bases in a single fragment of DNA, attachment of this labeled DNA fragment to a support, movement of the supported DNA into a flowing sample stream, sequential cleavage of the end nucleotide from the DNA fragment with an exonuclease, and detection of the individual fluorescently labeled bases by laser-induced fluorescence.

  17. DNA sequencing by a single molecule detection of labeled nucleotides sequentially cleaved from a single strand of DNA

    SciTech Connect

    Goodwin, P.M.; Schecker, J.A.; Wilkerson, C.W.; Hammond, M.L.; Ambrose, W.P.; Jett, J.H.; Martin, J.C.; Marrone, B.L.; Keller, R.A.; Haces, A.; Shih, P.J.; Harding, J.D.

    1993-02-01

    We are developing a laser-based technique for the rapid sequencing of large DNA fragments (several kb in size) at a rate of 100 to 1000 bases per second. Our approach relies on fluorescent labeling of the bases in a single fragment of DNA, attachment of this labeled DNA fragment to a support, movement of the supported DNA into a flowing sample stream, sequential cleavage of the end nucleotide from the DNA fragment with an exonuclease, and detection of the individual fluorescently labeled bases by laser-induced fluorescence.

  18. Early India-Australia Spreading History Revealed by Newly Detected Magnetic Anomalies

    NASA Astrophysics Data System (ADS)

    Williams, S.; Whittaker, J. M.; Granot, R.; Müller, D.

    2013-12-01

    The seafloor within the Perth Abyssal Plain (PAP), offshore Western Australia, is the only section of crust that directly records the early spreading history between India and Australia during the Mesozoic breakup of Gondwana. However, this early spreading has been poorly constrained due to an absence of data, including marine magnetic anomalies and data constraining the crustal nature of key tectonic features. Here, we present new magnetic anomaly data from the PAP that shows that the crust in the western part of the basin was part of the Indian Plate - the conjugate flank to the oceanic crust immediately offshore the Perth margin, Australia. We identify a sequence of M2 and older anomalies in the west PAP within crust that initially moved with the Indian Plate, formed at intermediate half-spreading rates (35 mm/yr) consistent with the conjugate sequence on the Australian Plate. More speculatively, we reinterpret the youngest anomalies in the east PAP, finding that the M0-age crust initially formed on the Indian Plate was transferred to the Australian Plate by a westward jump or propagation of the spreading ridge shortly after M0 time. Samples dredged from the Gulden Draak and Batavia Knolls (at the western edge of the PAP) reveal that these bathymetric features are continental fragments rather than igneous plateaus related to Broken Ridge. These microcontinents rifted away from Australia with Greater India during initial breakup at ~130 Ma, then rifted from India following the cessation of spreading in the PAP (~101-103 Ma).

  19. Anomaly detection in radiographic images of composite materials via crosshatch regression

    NASA Astrophysics Data System (ADS)

    Lockard, Colin D.

    The development and testing of new composite materials is an important area of research supporting advances in aerospace engineering. Understanding the properties of these materials requires the analysis of material samples to identify damage. Given the significant time and effort required from human experts to analyze computed tomography (CT) scans related to the non-destructive evaluation of carbon fiber materials, it is advantageous to develop an automated system for identifying anomalies in these images. This thesis introduces a regression-based algorithm for identifying anomalies in grayscale images, with a particular focus on its application for the analysis of CT scan images of carbon fiber. The algorithm centers around a "crosshatch regression" approach in which each two-dimensional image is divided into a series of one-dimensional signals, each representing a single line of pixels. A robust multiple linear regression model is fitted to each signal and outliers are identified. Smoothing and quality control techniques help better define anomaly boundaries and remove noise, and multiple crosshatch regression runs are combined to generate the final result. A ground truth set was created and the algorithm was run against these images for testing. The experimental results support the efficacy of the technique, locating 92% of anomalies with an average recall of 88%, precision of 78%, and root mean square deviation of 11.2 pixels.
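
The per-line regression idea described above can be sketched as follows. This is a hedged stand-in for the thesis's method: the synthetic grayscale image is invented, and a simple MAD-based reweighting substitutes for its robust multiple linear regression.

```python
# Sketch: treat each image row as a 1-D signal, fit a robust linear trend,
# and flag pixels with large residuals as anomalies. Data are synthetic.
import numpy as np

rng = np.random.default_rng(6)
H, W = 32, 64
x = np.arange(W)
# Synthetic grayscale image: smooth horizontal gradient plus noise.
img = 100 + 0.5 * x[None, :] + rng.normal(0, 2, size=(H, W))
img[10, 30:34] += 25          # small bright anomaly

flags = np.zeros((H, W), dtype=bool)
for r in range(H):
    row = img[r]
    coef = np.polyfit(x, row, 1)                 # initial least-squares line
    for _ in range(3):                           # crude robust reweighting
        resid = row - np.polyval(coef, x)
        mad = np.median(np.abs(resid - np.median(resid)))
        keep = np.abs(resid) < 4 * 1.4826 * mad  # drop gross outliers
        coef = np.polyfit(x[keep], row[keep], 1)
    resid = row - np.polyval(coef, x)
    mad = np.median(np.abs(resid - np.median(resid)))
    flags[r] = np.abs(resid) > 4 * 1.4826 * mad

detected = bool(flags[10, 30:34].all())
```

Running the same procedure down the columns and combining the two passes would give the "crosshatch" of the thesis's name; smoothing the flag map then tightens anomaly boundaries.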

  20. An earthquake from space: detection of precursory magnetic anomalies from Swarm satellites before the 2015 M8 Nepal Earthquake

    NASA Astrophysics Data System (ADS)

    De Santis, A.; Balasis, G.; Pavón-Carrasco, F. J.; Cianchini, G.; Mandea, M.

    2015-12-01

    A large earthquake of magnitude around 8 occurred on 25 April 2015, 06:26 UTC, with its epicenter in Nepal, causing more than 9000 fatalities and devastating destruction. The simultaneous orbiting in the topside ionosphere of ESA's three Swarm satellites makes it possible to look for possible pre-earthquake magnetic anomalous signals, likely due to some lithosphere-atmosphere-ionosphere (LAI) coupling. First, a wavelet analysis was performed for the day of the earthquake (from the external magnetic point of view, an exceptionally quiet day), with the result that an anomalous and persistent ULF signal (from around 3 to 6 UTC) is clearly detected before the earthquake. After this single-spot analysis, we performed a more extensive analysis over the two months around the earthquake occurrence to confirm or refute the cause-effect relationship. From the series of magnetic anomalies detected (during night-time and magnetically quiet times) by the Swarm satellites, we show that the cumulative number of anomalies follows the same typical power-law behavior of a critical system approaching its critical time, in our case the large seismic event of 25 April 2015, and then recovers in the typical recovery phase after a large earthquake. The impressive similarity of this behavior to the analogous behavior found in seismic data analysis provides strong support for the lithospheric origin of the satellite magnetic anomalies, as due to LAI coupling during the preparation phase of the Nepal earthquake.
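
The power-law acceleration test mentioned above can be sketched numerically: model the cumulative anomaly count as N(t) = C - B(tc - t)^m approaching a critical time tc, and recover the exponent by a log-log fit. All parameter values below are illustrative assumptions, and C and tc are taken as known only to keep the sketch linear.

```python
# Sketch: fitting a power-law acceleration of cumulative anomaly counts
# toward a critical time tc. Synthetic data, illustrative parameters.
import numpy as np

rng = np.random.default_rng(7)
tc, C, B, m_true = 60.0, 60.0, 7.0, 0.5
t = np.arange(0, 60)
N = C - B * (tc - t) ** m_true + rng.normal(0, 0.3, size=t.size)

# Linearize: log(C - N) = log(B) + m * log(tc - t).
y = np.log(C - N)
x = np.log(tc - t)
m_fit, logB_fit = np.polyfit(x, y, 1)
```

In a real analysis neither C nor tc is known in advance; they would be estimated jointly by nonlinear least squares, with the fitted tc compared against the actual event time.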

  1. Rapid Anomaly Detection and Tracking via Compressive Time-Spectra Measurement

    DTIC Science & Technology

    2016-02-12

    Each of the single-block cases shows nulls, represented by dark-colored lines in the patterns. When the patterns are averaged together all of the...red square of the example frame for four selected blocks of the Hadamard spectrum. Dark areas corresponding to low anomaly energy are eliminated when...raster shifts; we also see a significant amount of background noise, which was found to be due to detector dark current.

  2. Millimeter Wave Detection of Localized Anomalies in the Space Shuttle External Fuel Tank Insulating Foam and Acreage Heat Tiles

    NASA Technical Reports Server (NTRS)

    Kharkovsky, S.; Case, J. T.; Zoughi, R.; Hepburn, F.

    2005-01-01

    The Space Shuttle Columbia's catastrophic accident emphasizes the growing need for developing and applying effective, robust and life-cycle oriented nondestructive testing (NDT) methods for inspecting the shuttle external fuel tank spray-on foam insulation (SOFI) and its protective acreage heat tiles. Millimeter wave NDT techniques were among the methods chosen to be evaluated for their potential for inspecting these structures. Several panels with embedded anomalies (mainly voids) were produced and tested for this purpose. Near-field and far-field millimeter wave NDT methods were used for producing millimeter wave images of the anomalies in SOFI panels and heat tiles. This paper presents the results of an investigation for the purpose of detecting localized anomalies in two SOFI panels and a set of heat tiles. To this end, reflectometers at a relatively wide range of frequencies (Ka-band (26.5 - 40 GHz) to W-band (75 - 110 GHz)) and utilizing different types of radiators were employed. The results clearly illustrate the utility of these methods for this purpose.

  3. Interpretation of Magnetic Anomalies in Salihli (Turkey) Geothermal Area Using 3-D Inversion and Edge Detection Techniques

    NASA Astrophysics Data System (ADS)

    Timur, Emre

    2016-04-01

    There are numerous geophysical methods used to investigate geothermal areas. The major purpose of this magnetic survey is to locate the boundaries of the active hydrothermal system in the south of the Gediz Graben in Salihli (Manisa/Turkey). The presence of the hydrothermal system had already been inferred from surface evidence of hydrothermal activity and drillings. Firstly, 3-D prismatic models were theoretically investigated and edge detection methods were utilized with an iterative inversion method to define the boundaries and the parameters of the structure. In the first step of the application, it was necessary to convert the total field anomaly into a pseudo-gravity anomaly map. Then the geometric boundaries of the structures were determined by applying MATLAB-based software with three different edge detection algorithms. The exact locations of the structures were obtained by using these boundary coordinates as initial geometric parameters in the inversion process. In addition to these methods, reduction to pole and horizontal gradient methods were applied to the data to obtain more information about the location and shape of the possible reservoir. As a result, the edge detection methods were found to be successful, both on field and theoretical data sets, for delineating the boundaries of the possible geothermal reservoir structure. The depth of the geothermal reservoir was determined as 2.4 km from 3-D inversion and 2.1 km from power spectrum methods.

  4. An anomaly detection approach for the identification of DME patients using spectral domain optical coherence tomography images.

    PubMed

    Sidibé, Désiré; Sankar, Shrinivasan; Lemaître, Guillaume; Rastgoo, Mojdeh; Massich, Joan; Cheung, Carol Y; Tan, Gavin S W; Milea, Dan; Lamoureux, Ecosse; Wong, Tien Y; Mériaudeau, Fabrice

    2017-02-01

    This paper proposes a method for automatic classification of spectral domain OCT data for the identification of patients with retinal diseases such as Diabetic Macular Edema (DME). We address this issue as an anomaly detection problem and propose a method that not only allows the classification of the OCT volume, but also allows the identification of the individual diseased B-scans inside the volume. Our approach is based on modeling the appearance of normal OCT images with a Gaussian Mixture Model (GMM) and detecting abnormal OCT images as outliers. The classification of an OCT volume is based on the number of detected outliers. Experimental results with two different datasets show that the proposed method achieves a sensitivity and a specificity of 80% and 93% on the first dataset, and 100% and 80% on the second one. Moreover, the experiments show that the proposed method achieves better classification performance than other recently published works.
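
    The outlier-count idea can be sketched numerically. This is a hedged toy: it substitutes a single multivariate Gaussian for the paper's GMM, and the 4-dimensional features, threshold quantile, and outlier budget are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical features for "normal" B-scans (e.g. texture descriptors);
# the paper fits a GMM, a single Gaussian keeps the sketch short.
normal = rng.normal(0.0, 1.0, size=(500, 4))

mu = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def mahalanobis2(x):
    """Squared Mahalanobis distance to the normal-appearance model."""
    d = x - mu
    return float(d @ cov_inv @ d)

# Outlier threshold chosen from the training data (99th percentile)
train_scores = np.array([mahalanobis2(x) for x in normal])
thresh = np.quantile(train_scores, 0.99)

def classify_volume(scans, max_outliers=2):
    """Flag a volume as diseased when too many B-scans are outliers."""
    n_out = sum(mahalanobis2(s) > thresh for s in scans)
    return n_out > max_outliers

healthy_vol = rng.normal(0.0, 1.0, size=(32, 4))
diseased_vol = np.vstack([rng.normal(0.0, 1.0, size=(24, 4)),
                          rng.normal(6.0, 1.0, size=(8, 4))])
```

    Classifying the whole volume by counting anomalous B-scans, rather than scoring the volume directly, is what lets the method also point at the individual diseased slices.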

  5. 13C-direct detected NMR experiments for the sequential J-based resonance assignment of RNA oligonucleotides

    PubMed Central

    Richter, Christian; Kovacs, Helena; Buck, Janina; Wacker, Anna; Fürtig, Boris; Bermel, Wolfgang

    2010-01-01

    We present here a set of 13C-direct detected NMR experiments to facilitate the resonance assignment of RNA oligonucleotides. Three experiments have been developed: (1) the (H)CC-TOCSY-experiment utilizing a virtual decoupling scheme to assign the intraresidual ribose 13C-spins, (2) the (H)CPC-experiment that correlates each phosphorus with the C4′ nuclei of adjacent nucleotides via J(C,P) couplings and (3) the (H)CPC-CCH-TOCSY-experiment that correlates the phosphorus nuclei with the respective C1′,H1′ ribose signals. The experiments were applied to two RNA hairpin structures. The current set of 13C-direct detected experiments allows direct and unambiguous assignment of the majority of the hetero nuclei and the identification of the individual ribose moieties following their sequential assignment. Thus, 13C-direct detected NMR methods constitute useful complements to the conventional 1H-detected approach for the resonance assignment of oligonucleotides that is often hindered by the limited chemical shift dispersion. The developed methods can also be applied to large deuterated RNAs. Electronic supplementary material The online version of this article (doi:10.1007/s10858-010-9429-5) contains supplementary material, which is available to authorized users. PMID:20544375

  6. Quantitative Integration of Multiple Geophysical Techniques for Reducing Uncertainty in Discrete Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Carr, M. C.; Baker, G. S.; Herrmann, N.; Yerka, S.; Angst, M.

    2008-12-01

    The objectives of this project are to (1) utilize quantitative integration of multiple geophysical techniques, (2) determine geophysical anomalies that may indicate locations of various archaeological structures, and (3) develop techniques of quantifying causes of uncertainty. Two sites are used to satisfy these objectives. The first, representing a site with unknown target features, is an archaeological site on the Tennessee River floodplain. The area is divided into 437 (20 x 20 m) plots with 0.5 m spacing where magnetic gradiometry profiles were collected in a zig-zag pattern, resulting in 350 km of line data. Once anomalies are identified in the magnetics data, potential excavation sites for archaeological features are determined and other geophysical techniques are utilized to gain confidence in choosing which anomalies to excavate. Several grids are resurveyed using Ground Penetrating Radar (GPR) and EM-31 with a 0.25 m spacing in a grid pattern. A quantitative method of integrating data into one comprehensive set is developed, enhancing interpretation because each geophysical technique utilized within this study produced a unique response to noise and the targets. Spatial visualization software is used to interpolate irregularly spaced XYZ data into a regularly spaced grid and display the geophysical data in 3D representations. Once all data are exported from each individual instrument, grid files are created for quantitative merging of the data and to create grid-based maps including contour, image, shaded relief, and surface maps. Statistics were calculated from anomaly classification in the data and excavated features present. To study this methodology in a more controlled setting, a second site is used. This site is analogous to the first in that it is along the Tennessee River floodplain on the same bedrock units. However, this analog site contains known targets (previously buried and accurately located) including size, shape, and orientation. Four

  7. Realtime hand detection system using convex shape detector in sequential depth images

    NASA Astrophysics Data System (ADS)

    Tai, Chung-Li; Li, Chia-Chang; Liao, Duan-Li

    2013-12-01

    In this paper, a real-time hand detection and tracking system is proposed. A calibrated stereo vision system is used to obtain disparity images, and real-world coordinates are available through geometric transformation. Unlike other pixel-based shape detectors, which require edge information, the proposed convex shape detector, based on real-world coordinates, is applied directly to depth images to detect hands regardless of distance. Hand-waving gesture recognition and simple hand tracking are also implemented in this work. The acceptable accuracy of the proposed system is examined in a verification process. Experimental results of hand detection and tracking demonstrate the robustness and feasibility of the proposed method.

  8. Anomaly detection driven active learning for identifying suspicious tracks and events in WAMI video

    NASA Astrophysics Data System (ADS)

    Miller, David J.; Natraj, Aditya; Hockenbury, Ryler; Dunn, Katherine; Sheffler, Michael; Sullivan, Kevin

    2012-06-01

    We describe a comprehensive system for learning to identify suspicious vehicle tracks from wide-area motion imagery (WAMI) video. First, since the road network for the scene of interest is assumed unknown, agglomerative hierarchical clustering is applied to all spatial vehicle measurements, resulting in spatial cells that largely capture individual road segments. Next, for each track, both at the cell (speed, acceleration, azimuth) and track (range, total distance, duration) levels, extreme value feature statistics are computed and aggregated to form summary (p-value based) anomaly statistics for each track. Here, to fairly evaluate tracks that travel across different numbers of spatial cells, for each cell-level feature type, a single (most extreme) statistic is chosen over all cells traveled. Finally, a novel active learning paradigm, applied to a (logistic regression) track classifier, is invoked to learn to distinguish suspicious from merely anomalous tracks, starting from anomaly-ranked track prioritization, with ground-truth labeling by a human operator. This system has been applied to WAMI video data (ARGUS), with the tracks automatically extracted by a system developed in-house at Toyon Research Corporation. Our system gives promising preliminary results in highly ranking as suspicious aerial vehicles, dismounts, and traffic violators, and in learning which features are most indicative of suspicious tracks.
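
    The "single most extreme statistic per track" idea can be sketched with empirical p-values. This is a hedged toy: the reference population, the feature values, and the upper-tail-only convention are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical reference population for one cell-level feature (speed)
reference = rng.normal(10.0, 2.0, size=5000)

def empirical_pvalue(value, population):
    """Upper-tail empirical p-value of a feature value (add-one smoothed)."""
    return (np.sum(population >= value) + 1) / (len(population) + 1)

def track_anomaly_stat(cell_values, population):
    """Score each cell the track crossed and keep only the single most
    extreme (smallest) p-value, so tracks spanning different numbers of
    cells remain comparable."""
    return min(empirical_pvalue(v, population) for v in cell_values)

normal_track = [9.8, 10.5, 11.0]       # speeds in the typical range
speeding_track = [10.2, 11.1, 19.5]    # one extreme cell

s_normal = track_anomaly_stat(normal_track, reference)
s_anom = track_anomaly_stat(speeding_track, reference)
```

    Ranking tracks by this summary p-value gives the anomaly-ranked prioritization that seeds the active-learning loop described above.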

  9. Decreased Perifoveal Sensitivity Detected by Microperimetry in Patients Using Hydroxychloroquine and without Visual Field and Fundoscopic Anomalies

    PubMed Central

    Molina-Martín, A.; Piñero, D. P.; Pérez-Cambrodí, R. J.

    2015-01-01

    Purpose. To evaluate the usefulness of microperimetry in the early detection of the ocular anomalies associated with the use of hydroxychloroquine. Methods. Prospective comparative case series study comprising 14 healthy eyes of 7 patients (group A) and 14 eyes of 7 patients under treatment with hydroxychloroquine for the treatment of rheumatologic diseases and without fundoscopic or perimetric anomalies (group B). A comprehensive ophthalmological examination including microperimetry (MP) and spectral-domain optical coherence tomography was performed in both groups. Results. No significant differences were found in mean MP foveal sensitivity between groups (P = 0.18). However, mean MP overall sensitivity was significantly higher in group A (29.05 ± 0.57 dB versus group B, 26.05 ± 2.75 dB; P < 0.001). Significantly higher sensitivity values were obtained in group A in comparison to group B for the three eccentric loci evaluated (P < 0.001). Conclusion. Microperimetry seems to be a useful tool for the early detection of retinal damage in patients treated with hydroxychloroquine. PMID:25861463

  10. Fuzzy Logic Based Anomaly Detection for Embedded Network Security Cyber Sensor

    SciTech Connect

    Ondrej Linda; Todd Vollmer; Jason Wright; Milos Manic

    2011-04-01

    Resiliency and security of critical infrastructure control systems in the modern world of cyber terrorism constitute a relevant concern. Developing a network security system specifically tailored to the requirements of such critical assets is of primary importance. This paper proposes a novel learning algorithm for an anomaly-based network security cyber sensor together with its hardware implementation. The presented learning algorithm constructs a fuzzy logic rule-based model of normal network behavior. Individual fuzzy rules are extracted directly from the stream of incoming packets using an online clustering algorithm. This learning algorithm was specifically developed to comply with the constrained computational requirements of low-cost embedded network security cyber sensors. The performance of the system was evaluated on a set of network data recorded from an experimental test-bed mimicking the environment of a critical infrastructure control system.

  11. Data Integration and Anomaly Detection for Decision Support in Protected Area Management

    NASA Astrophysics Data System (ADS)

    Melton, F.; Votava, P.; Michaelis, A.; Kuhn, B.; Milesi, C.; Tague, C.; Nemani, R.

    2006-12-01

    We present a case study in the use of cyberinfrastructure to identify anomalies in ecosystem conditions to support decision making for protected area management. U.S. National Parks and other protected areas internationally are subject to increasing pressure from environmental change within and adjacent to park boundaries. Despite great interest in these areas and the fact that some U.S. parks receive as many as 3.5 million visitors per year, protected areas are often sparsely instrumented, making it difficult for resource managers to quickly identify trends and changes in park conditions. Remote sensing and ecosystem models offer protected area managers important tools for comprehensive monitoring of ecosystem conditions and scientifically based decision-making. These tools, however, can generate tremendous data volumes. New techniques are required to identify and present key data features to decision makers. The Terrestrial Observation and Prediction System (TOPS) is currently being applied to automate the production, analysis, and delivery of a suite of data products from NASA satellites and ecosystem models to assist managers of U.S. and international protected areas. TOPS uses ecosystem models to combine satellite data with ground-based observations to produce nowcasts and forecasts of ecosystem conditions. We are utilizing TOPS to deliver data products to NPS resource managers in near-real-time for use in operational decision-making. Current products include estimates of vegetation condition, ecosystem productivity, soil moisture, snow cover, fire occurrence, and others. In addition, the use of TOPS to automate the identification and display of trends and anomalies in ecosystem conditions enables protected area managers to track park-wide conditions daily, identify changes, focus monitoring efforts, and improve decision making through infusion of NASA data.

  12. Analysis of a SCADA System Anomaly Detection Model Based on Information Entropy

    DTIC Science & Technology

    2014-03-27

    ...literature concerning the focus areas of this research. The focus areas include SCADA vulnerabilities, information theory, and intrusion detection

  13. Space Shuttle Main Engine Propellant Path Leak Detection Using Sequential Image Processing

    NASA Technical Reports Server (NTRS)

    Smith, L. Montgomery; Malone, Jo Anne; Crawford, Roger A.

    1995-01-01

    Initial research in this study using theoretical radiation transport models established that the occurrence of a leak is accompanied by a sudden but sustained change in intensity in a given region of an image. In this phase, temporal processing of video images on a frame-by-frame basis was used to detect leaks within a given field of view. The leak detection algorithm developed in this study consists of a digital highpass filter cascaded with a moving average filter. The absolute value of the resulting discrete sequence is then taken and compared to a threshold value to produce the binary leak/no leak decision at each point in the image. Alternatively, averaging over the full frame of the output image produces a single time-varying mean value estimate that is indicative of the intensity and extent of a leak. Laboratory experiments were conducted in which artificially created leaks on a simulated SSME background were produced and recorded from a visible wavelength video camera. These data were processed frame-by-frame over the time interval of interest using an image processor implementation of the leak detection algorithm. In addition, a 20 second video sequence of an actual SSME failure was analyzed using this technique. The resulting output image sequences and plots of the full frame mean value versus time verify the effectiveness of the system.
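
    The described per-pixel pipeline (highpass filter, moving-average filter, absolute value, threshold) can be sketched for a single pixel's intensity time series. The filter coefficient, window length, and threshold here are illustrative assumptions, not the study's values.

```python
import numpy as np

def leak_detector(signal, alpha=0.9, window=5, thresh=0.5):
    """Highpass filter cascaded with a moving-average filter; the
    absolute output is thresholded for the binary leak/no-leak decision."""
    # First-order digital highpass: y[n] = alpha*(y[n-1] + x[n] - x[n-1])
    hp = np.zeros_like(signal, dtype=float)
    for n in range(1, len(signal)):
        hp[n] = alpha * (hp[n - 1] + signal[n] - signal[n - 1])
    # Moving-average smoothing of the highpass output
    ma = np.convolve(hp, np.ones(window) / window, mode="same")
    # Absolute value compared with a threshold at each time step
    return np.abs(ma) > thresh

# One pixel's intensity over time: a sudden, sustained step at frame 50
x = np.concatenate([np.zeros(50), 3.0 * np.ones(50)])
decision = leak_detector(x)
```

    The highpass stage rejects the static background, so only the sudden sustained change around the step survives to trip the threshold; averaging the decision map over a full frame would give the single mean-value estimate mentioned in the abstract.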

  14. VISAD: an interactive and visual analytical tool for the detection of behavioral anomalies in maritime traffic data

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran; Ziemke, Tom; Warston, Håkan

    2009-05-01

    Monitoring the surveillance of large sea areas normally involves the analysis of huge quantities of heterogeneous data from multiple sources (radars, cameras, automatic identification systems, reports, etc.). The rapid identification of anomalous behavior or any threat activity in the data is an important objective for enabling homeland security. While it is worth acknowledging that many existing mining applications support identification of anomalous behavior, autonomous anomaly detection systems are rarely used in the real world. There are two main reasons: (1) the detection of anomalous behavior is normally not a well-defined and structured problem and therefore, automatic data mining approaches do not work well and (2) the difficulties that these systems have regarding the representation and employment of the prior knowledge that the users bring to their tasks. In order to overcome these limitations, we believe that human involvement in the entire discovery process is crucial. Using a visual analytics process model as a framework, we present VISAD: an interactive, visual knowledge discovery tool for supporting the detection and identification of anomalous behavior in maritime traffic data. VISAD supports the insertion of human expert knowledge in (1) the preparation of the system, (2) the establishment of the normal picture and (3) in the actual detection of rare events. For each of these three modules, VISAD implements different layers of data mining, visualization and interaction techniques. Thus, the detection procedure becomes transparent to the user, which increases his/her confidence and trust in the system and overall, in the whole discovery process.

  15. Application of Qualitative and Quantitative Analyses of Self-Potential Anomaly in Caves Detection in Djuanda Forest Park, Bandung

    NASA Astrophysics Data System (ADS)

    Srigutomo, Wahyu; Arkanuddin, Muhammad R.; Pratomo, Prihandhanu M.; Novana, Eka C.; Agustina, Rena D.

    2010-12-01

    Self-potential (SP) is a naturally occurring electric potential difference observed at the surface. In the vicinity of a cave, the SP anomaly is dominantly generated by the resistivity contrast of the cave with its environment and by the current source associated with the streaming potential generated by fluid flow through the cave. In this study we applied a simple qualitative analysis to distinguish the SP values caused by streaming potential from those due to the presence of caves. Further, we conducted two-dimensional continuous SP modeling by first solving for the fluid velocity vector in the modeling domain. The current source distribution, and hence the SP value, is obtained by incorporating the resistivity of the subsurface and calculating the divergence of the velocity vector. For validation, this scheme was applied to the detection of caves dug by the Japanese army during WWII at Djuanda Forest Park, Bandung. The results can be used to understand the characteristics of fluid flow and the current source distribution around cavities that are responsible for the observed SP anomaly at the surface.

  16. Sequential Filtering Processes Shape Feature Detection in Crickets: A Framework for Song Pattern Recognition

    PubMed Central

    Hedwig, Berthold G.

    2016-01-01

    Intraspecific acoustic communication requires filtering processes and feature detectors in the auditory pathway of the receiver for the recognition of species-specific signals. Insects like acoustically communicating crickets allow describing and analysing the mechanisms underlying auditory processing at the behavioral and neural level. Female crickets approach male calling song, their phonotactic behavior is tuned to the characteristic features of the song, such as the carrier frequency and the temporal pattern of sound pulses. Data from behavioral experiments and from neural recordings at different stages of processing in the auditory pathway lead to a concept of serially arranged filtering mechanisms. These encompass a filter for the carrier frequency at the level of the hearing organ, and the pulse duration through phasic onset responses of afferents and reciprocal inhibition of thoracic interneurons. Further, processing by a delay line and coincidence detector circuit in the brain leads to feature detecting neurons that specifically respond to the species-specific pulse rate, and match the characteristics of the phonotactic response. This same circuit may also control the response to the species-specific chirp pattern. Based on these serial filters and the feature detecting mechanism, female phonotactic behavior is shaped and tuned to the characteristic properties of male calling song. PMID:26941647

  17. CHIRP-Like Signals: Estimation, Detection and Processing A Sequential Model-Based Approach

    SciTech Connect

    Candy, J. V.

    2016-08-04

    Chirp signals have evolved primarily from radar/sonar signal processing applications specifically attempting to estimate the location of a target in surveillance/tracking volume. The chirp, which is essentially a sinusoidal signal whose phase changes instantaneously at each time sample, has an interesting property in that its correlation approximates an impulse function. It is well-known that a matched-filter detector in radar/sonar estimates the target range by cross-correlating a replicant of the transmitted chirp with the measurement data reflected from the target back to the radar/sonar receiver yielding a maximum peak corresponding to the echo time and therefore enabling the desired range estimate. In this application, we perform the same operation as a radar or sonar system, that is, we transmit a “chirp-like pulse” into the target medium and attempt to first detect its presence and second estimate its location or range. Our problem is complicated by the presence of disturbance signals from surrounding broadcast stations as well as extraneous sources of interference in our frequency bands and of course the ever present random noise from instrumentation. First, we discuss the chirp signal itself and illustrate its inherent properties and then develop a model-based processing scheme enabling both the detection and estimation of the signal from noisy measurement data.
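
    The matched-filter range estimate described above can be sketched directly: correlate a replica of the transmitted chirp with the received data and read the echo delay off the correlation peak. The sample rate, sweep, delay, and noise level below are invented for illustration.

```python
import numpy as np

fs = 1000.0                       # sample rate (Hz), illustrative
t = np.arange(0, 0.2, 1 / fs)     # 200 ms chirp replica
f0, f1 = 50.0, 200.0              # linear frequency sweep 50 -> 200 Hz
k = (f1 - f0) / t[-1]
chirp = np.cos(2 * np.pi * (f0 * t + 0.5 * k * t ** 2))

# Received data: an attenuated, delayed echo of the chirp plus noise
delay = 300                       # true echo delay in samples
rng = np.random.default_rng(1)
rx = np.concatenate([np.zeros(delay), 0.5 * chirp, np.zeros(100)])
rx += 0.1 * rng.standard_normal(rx.size)

# Matched filter: cross-correlate the replica with the measurement;
# the peak lag is the echo time, hence the range estimate
corr = np.correlate(rx, chirp, mode="valid")
lag = int(np.argmax(np.abs(corr)))
```

    The sharp correlation peak reflects the chirp's near-impulsive autocorrelation, which is exactly the property the abstract highlights.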

  18. Selecting Observation Platforms for Optimized Anomaly Detectability under Unreliable Partial Observations

    SciTech Connect

    Wen-Chiao Lin; Humberto E. Garcia; Tae-Sic Yoo

    2011-06-01

    Diagnosers for keeping track of the occurrences of special events in the framework of unreliable, partially observed discrete-event dynamical systems were developed in previous work. This paper considers observation platforms consisting of sensors that provide partial and unreliable observations and of diagnosers that analyze them. Diagnosers in observation platforms typically perform better as the sensors providing the observations become more costly or increase in number. This paper proposes a methodology for finding an observation platform that achieves an optimal balance between cost and performance, while satisfying given observability requirements and constraints. Since this problem is generally computationally hard in the framework considered, an observation platform optimization algorithm is utilized that uses two greedy heuristics, one myopic and another based on projected performances. These heuristics are sequentially executed in order to find the best observation platforms. The developed algorithm is then applied to an observation platform optimization problem for a multi-unit-operation system. Results show that improved observation platforms can be found that may significantly reduce the observation platform cost but still yield acceptable performance for correctly inferring the occurrences of special events.

  19. Detection of oxygen isotopic anomaly in terrestrial atmospheric carbonates and its implications to Mars

    PubMed Central

    Shaheen, R.; Abramian, A.; Horn, J.; Dominguez, G.; Sullivan, R.; Thiemens, Mark H.

    2010-01-01

    The debate of life on Mars centers around the source of the globular, micrometer-sized mineral carbonates in the ALH84001 meteorite; consequently, the identification of Martian processes that form carbonates is critical. This paper reports a previously undescribed carbonate formation process that occurs on Earth and, likely, on Mars. We identified micrometer-sized carbonates in terrestrial aerosols that possess excess 17O (0.4–3.9‰). The unique O-isotopic composition mechanistically describes the atmospheric heterogeneous chemical reaction on aerosol surfaces. Concomitant laboratory experiments define the transfer of ozone isotopic anomaly to carbonates via hydrogen peroxide formation when O3 reacts with surface adsorbed water. This previously unidentified chemical reaction scenario provides an explanation for production of the isotopically anomalous carbonates found in the SNC (shergottites, nakhlaites, chassignites) Martian meteorites and terrestrial atmospheric carbonates. The anomalous hydrogen peroxide formed on the aerosol surfaces may transfer its O-isotopic signature to the water reservoir, thus producing mass independently fractionated secondary mineral evaporites. The formation of peroxide via heterogeneous chemistry on aerosol surfaces also reveals a previously undescribed oxidative process of utility in understanding ozone and oxygen chemistry, both on Mars and Earth. PMID:21059939

  20. Detection of oxygen isotopic anomaly in terrestrial atmospheric carbonates and its implications to Mars.

    PubMed

    Shaheen, R; Abramian, A; Horn, J; Dominguez, G; Sullivan, R; Thiemens, Mark H

    2010-11-23

    The debate of life on Mars centers around the source of the globular, micrometer-sized mineral carbonates in the ALH84001 meteorite; consequently, the identification of Martian processes that form carbonates is critical. This paper reports a previously undescribed carbonate formation process that occurs on Earth and, likely, on Mars. We identified micrometer-sized carbonates in terrestrial aerosols that possess excess (17)O (0.4-3.9‰). The unique O-isotopic composition mechanistically describes the atmospheric heterogeneous chemical reaction on aerosol surfaces. Concomitant laboratory experiments define the transfer of ozone isotopic anomaly to carbonates via hydrogen peroxide formation when O(3) reacts with surface adsorbed water. This previously unidentified chemical reaction scenario provides an explanation for production of the isotopically anomalous carbonates found in the SNC (shergottites, nakhlaites, chassignites) Martian meteorites and terrestrial atmospheric carbonates. The anomalous hydrogen peroxide formed on the aerosol surfaces may transfer its O-isotopic signature to the water reservoir, thus producing mass independently fractionated secondary mineral evaporites. The formation of peroxide via heterogeneous chemistry on aerosol surfaces also reveals a previously undescribed oxidative process of utility in understanding ozone and oxygen chemistry, both on Mars and Earth.

  1. Bayesian signal processing techniques for the detection of highly localised gravity anomalies using quantum interferometry technology

    NASA Astrophysics Data System (ADS)

    Brown, Gareth; Ridley, Kevin; Rodgers, Anthony; de Villiers, Geoffrey

    2016-10-01

    Recent advances in the field of quantum technology offer the exciting possibility of gravimeters and gravity gradiometers capable of performing rapid surveys with unprecedented precision and accuracy. Measurements with sub nano-g (a billionth of the acceleration due to gravity) precision should enable the resolution of underground structures on metre length scales. However, deducing the exact dimensions of the structure producing the measured gravity anomaly is known to be an ill-posed inversion problem. Furthermore, the measurement process will be affected by multiple sources of uncertainty that increase the range of plausible solutions that fit the measured data. Bayesian inference is the natural framework for accommodating these uncertainties and providing a fully probabilistic assessment of possible structures producing inhomogeneities in the gravitational field. Previous work introduced the probability of excavation map as a means to convert the high-dimensional space belonging to the posterior distribution to an easily interpretable map. We now report on the development of the inference model to account for spatial correlations in the gravitational field induced by variations in soil density.

  2. Subsurface faults detection based on magnetic anomalies investigation: A field example at Taba protectorate, South Sinai

    NASA Astrophysics Data System (ADS)

    Khalil, Mohamed H.

    2016-08-01

    Quantitative interpretation of magnetic data, particularly over a complex dissected structure, necessitates the use of filtering techniques. In the Taba protectorate, Sinai, a synthesis of different filtering algorithms was carried out to distinguish and verify the subsurface structure and to estimate the depth of the causative magnetic sources. In order to separate the shallow-seated structure, filters of the vertical derivative (VDR), Butterworth high-pass (BWHP), analytic signal (AS) amplitude, and total horizontal derivative of the tilt derivative (TDR_THDR) were applied, while the apparent susceptibility and Butterworth low-pass (BWLP) filters were used to identify the deep-seated structure. The depths of the geological contacts and faults were calculated by 3D Euler deconvolution. Notably, TDR_THDR was independent of geomagnetic inclination, significantly less susceptible to noise, and more sensitive to the details of the shallow superimposed structures, whereas the BWLP proved highly capable of attenuating the shorter wavelengths of the near-surface anomalies and emphasizing the longer wavelengths derived from the deeper causative structure. 3D Euler deconvolution (SI = 0) was quite amenable to estimating the depths of the superimposed subsurface structure. The pattern, location, and trend of the deduced shallow and deep faults conformed remarkably to the documented fault system.
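
    Of the filters named above, the tilt derivative is simple enough to sketch. A hedged numpy version follows: the wavenumber-domain |k| operator for the vertical derivative is the standard potential-field approximation, and the synthetic Gaussian anomaly is invented for illustration, not the paper's data or exact processing chain.

```python
import numpy as np

def tilt_derivative(F, dx=1.0):
    """TDR = arctan(vertical derivative / total horizontal derivative),
    with the vertical derivative computed via the wavenumber-domain |k|
    operator for potential fields."""
    ny, nx = F.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    KX, KY = np.meshgrid(kx, ky)
    K = np.hypot(KX, KY)
    Fz = np.real(np.fft.ifft2(K * np.fft.fft2(F)))   # dF/dz
    Fy, Fx = np.gradient(F, dx)                      # horizontal gradients
    thdr = np.hypot(Fx, Fy)                          # total horizontal deriv.
    return np.arctan2(Fz, thdr)

# Synthetic magnetic-like anomaly: a Gaussian bump on a 64 x 64 grid
n = 64
y, x = np.mgrid[:n, :n]
F = np.exp(-((x - n // 2) ** 2 + (y - n // 2) ** 2) / 40.0)
tdr = tilt_derivative(F)
```

    Because the arctangent bounds the TDR to [-pi/2, pi/2] regardless of anomaly amplitude, taking its total horizontal derivative (the TDR_THDR filter above) equalizes strong and weak sources when delineating edges.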

  3. Application of the LMC algorithm to anomaly detection using the Wichmann/NIITEK ground-penetrating radar

    NASA Astrophysics Data System (ADS)

    Torrione, Peter A.; Collins, Leslie M.; Clodfelter, Fred; Frasier, Shane; Starnes, Ian

    2003-09-01

    This paper describes the application of a 2-dimensional (2-D) lattice LMS algorithm to anomaly detection using the Wichmann/NIITEK ground-penetrating radar (GPR) system. Sets of 3-dimensional (3-D) data are collected from the GPR system and processed in separate 2-D slices. The 2-D slices that are spatially correlated in depth are combined into separate "depth segments", and these are processed independently. When target/no-target declarations need to be made, the individual depth segments are combined to yield a 2-D confidence map. The confidence map is then thresholded, and alarms are placed at the centroids of the remaining 8-connected data points. Calibration-lane results are presented for data collected over several soil types under several weather conditions. The results show a false-alarm-rate improvement of at least an order of magnitude over other GPR systems, as well as significant improvement over other adaptive algorithms operating on the same data.
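The 2-D lattice LMS detector itself is not spelled out in this summary, but the underlying idea, an adaptive linear predictor whose prediction error flags returns that the background model cannot explain, can be sketched in one dimension. This is a much-simplified, hypothetical analogue using a normalised LMS update; the filter order, step size, and test signal are all invented. Note that samples immediately after an anomaly also show elevated error, since the anomaly corrupts the predictor's input window.

```python
import numpy as np

def lms_residuals(signal, order=4, mu=0.05):
    """Predict each sample from its `order` predecessors with a normalised
    LMS adaptive filter; large residuals mark samples the adapted background
    model cannot explain (candidate anomalies)."""
    w = np.zeros(order)
    residuals = np.zeros(len(signal))
    for n in range(order, len(signal)):
        x = signal[n - order:n][::-1]          # most recent sample first
        e = signal[n] - w @ x                  # prediction error
        w += 2 * mu * e * x / (x @ x + 1e-12)  # normalised LMS weight update
        residuals[n] = e
    return residuals

rng = np.random.default_rng(0)
bg = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.05 * rng.standard_normal(1000)
bg[700] += 2.0                                 # injected point anomaly
res = lms_residuals(bg)
```

Thresholding `res` (e.g. at a few standard deviations of its recent history) then yields the alarm declarations.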

  4. Anomaly detection in reconstructed quantum states using a machine-learning technique

    NASA Astrophysics Data System (ADS)

    Hara, Satoshi; Ono, Takafumi; Okamoto, Ryo; Washio, Takashi; Takeuchi, Shigeki

    2014-02-01

    The accurate detection of small deviations in given density matrices is important for quantum information processing. Here we propose a method based on the concept of data mining. We demonstrate that the proposed method can more accurately detect small erroneous deviations in reconstructed density matrices, which contain intrinsic fluctuations due to the limited number of samples, than a naive method of checking the trace distance from the average of the given density matrices. This method has the potential to be a key tool in broad areas of physics where the detection of small deviations of quantum states reconstructed using a limited number of samples is essential.
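The baseline the authors compare against, scoring each reconstructed state by its trace distance from the ensemble average, is easy to make concrete. The toy single-qubit ensemble below is invented for illustration; only the trace-distance formula itself is standard.

```python
import numpy as np

def trace_distance(rho, sigma):
    """D(rho, sigma) = 0.5 * ||rho - sigma||_1, computed from the absolute
    eigenvalues of the Hermitian difference."""
    eigvals = np.linalg.eigvalsh(rho - sigma)
    return 0.5 * np.sum(np.abs(eigvals))

def naive_outlier_scores(rhos):
    """Score each state by its trace distance to the ensemble average."""
    avg = np.mean(rhos, axis=0)
    return np.array([trace_distance(r, avg) for r in rhos])

# toy ensemble: nine noisy copies of |0><0| plus one clearly deviant state
rng = np.random.default_rng(1)
states = []
for _ in range(9):
    eps = 0.02 * rng.random()   # small statistical fluctuation
    states.append(np.array([[1 - eps, 0], [0, eps]], dtype=complex))
states.append(np.array([[0.2, 0], [0, 0.8]], dtype=complex))  # deviant
scores = naive_outlier_scores(states)
```

The paper's point is that when the deviation is comparable in size to the reconstruction fluctuations, this score no longer separates the two populations, which motivates the data-mining approach.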

  5. Evolutionary neural networks for anomaly detection based on the behavior of a program.

    PubMed

    Han, Sang-Jun; Cho, Sung-Bae

    2006-06-01

    Learning the behavior of a given program by using machine-learning techniques (based on system-call audit data) is an effective approach to detecting intrusions. Rule learning, neural networks, statistics, and hidden Markov models (HMMs) are representative methods for intrusion detection. Among them, neural networks are known for good performance in learning system-call sequences. To apply them to real-world problems successfully, however, it is important to determine the structures and weight values of the neural networks, and finding appropriate structures takes a very long time because no suitable analytical solutions exist. In this paper, a novel intrusion-detection technique based on evolutionary neural networks (ENNs) is proposed. One advantage of ENNs is that obtaining superior neural networks takes less time than with conventional approaches, because the structures and weights of the networks are discovered simultaneously. Experimental results with the 1999 Defense Advanced Research Projects Agency (DARPA) Intrusion Detection Evaluation (IDEVAL) data confirm that ENNs are promising tools for intrusion detection.

  6. Investigation of a Neural Network Implementation of a TCP Packet Anomaly Detection System

    DTIC Science & Technology

    2004-05-01

    …as the range 1024–65535. Some types of port scans may be detected through this attribute, as well as trojans and distributed denial of service (DDoS) attacks. … Summary of DARPA 1999 week 5 detects (table fragment; columns Date / Attack / Classifier / Details): 04/05/99, Portsweep: Flags, lone FIN packets; Sequence #, SEQ=ACK=0. 04/05/99, Neptune DoS: IP, …port (final stage of attack). 04/06/99, Neptune DoS: IP, private IP address 10.20.30.40; Flags, SYN packets with low source ports; Ports, low ports to low…

  7. Information-theoretic analysis of x-ray scatter and phase architectures for anomaly detection

    NASA Astrophysics Data System (ADS)

    Coccarelli, David; Gong, Qian; Stoian, Razvan-Ionut; Greenberg, Joel A.; Gehm, Michael E.; Lin, Yuzhang; Huang, Liang-Chih; Ashok, Amit

    2016-05-01

    Conventional performance analysis of detection systems confounds the effects of the system architecture (sources, detectors, system geometry, etc.) with the effects of the detection algorithm. Previously, we introduced an information-theoretic approach to this problem by formulating a performance metric, based on Cauchy-Schwarz mutual information, that is analogous to the channel capacity concept from communications engineering. In this work, we discuss the application of this metric to study novel screening systems based on x-ray scatter or phase. Our results show how effective use of this metric can impact design decisions for x-ray scatter and phase systems.

  8. Enhanced magnetic anomaly detection using a nitrogen-cooled superconducting gradiometer

    NASA Astrophysics Data System (ADS)

    Clem, Ted R.; Overway, David J.; Purpura, John W.; Bono, John T.; Carroll, Paul J.; Koch, Roger H.; Rozen, James R.; Keefe, George A.; Willen, Scott; Mohling, Robert A.

    2000-07-01

    During the 1980s the Superconducting Gradiometer/Magnetometer Sensor was demonstrated in the Magnetic and Acoustic Detection of Mines Advanced Technology Demonstration to provide effective mine detection, localization, and classification capabilities, especially against buried mines, and to significantly reduce acoustic false alarms arising from bottom clutter. This sensor utilized Superconducting Quantum Interference Devices manufactured using the low-critical-temperature (low-Tc) superconductor niobium, with liquid helium for sensor cooling. The sensor has most recently been integrated into the Mobile Underwater Debris Survey System and has been demonstrated successfully in a survey to locate unexploded ordnance in coastal waters.

  9. Behavioral Anomaly Detection: A Socio-Technical Study of Trustworthiness in Virtual Organizations

    ERIC Educational Resources Information Center

    Ho, Shuyuan Mary

    2009-01-01

    This study examines perceptions of human "trustworthiness" as a key component in countering insider threats. The term "insider threat" refers to situations where a critical member of an organization behaves against the interests of the organization, in an illegal and/or unethical manner. Identifying and detecting how an individual's behavior…

  10. Least Square Support Vector Machine for Detection of - Ionospheric Anomalies Associated with the Powerful Nepal Earthquake (Mw = 7.5) of 25 April 2015

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2016-06-01

    Due to the irreparable devastation caused by strong earthquakes, accurate anomaly detection in time series of different precursors, with the goal of creating a trustworthy early warning system, has brought new challenges. In this paper the predictability of the Least Square Support Vector Machine (LSSVM) has been investigated by forecasting the GPS-TEC (Total Electron Content) variations around the time and location of the Nepal earthquake. A powerful earthquake of Mw = 7.8 took place at 06:11:26 UTC on April 25, 2015, 77 km NW of Kathmandu, Nepal (28.147° N, 84.708° E, depth = 15.0 km). For comparison, two other methods, Median and ANN (Artificial Neural Network), have been implemented. All implemented algorithms indicate striking TEC anomalies 2 days prior to the main shock. Results reveal that the LSSVM method is promising for the detection of TEC seismo-ionospheric anomalies.
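Of the three detectors compared, the median-based one is the simplest to illustrate. A common formulation flags samples falling outside median ± k·IQR; the sketch below is that generic version with invented numbers, not the paper's actual TEC data or its exact bounds.

```python
import numpy as np

def median_iqr_anomalies(tec, k=1.5):
    """Indices of samples outside [M - k*IQR, M + k*IQR], where M is the
    series median and IQR the interquartile range."""
    m = np.median(tec)
    q1, q3 = np.percentile(tec, [25, 75])
    iqr = q3 - q1
    return np.where((tec < m - k * iqr) | (tec > m + k * iqr))[0]

# toy daily TEC series (TECU) with one positive excursion at index 6
tec = np.array([18.2, 17.9, 18.5, 18.1, 17.8, 18.3, 24.6, 18.0, 18.2, 18.4])
anoms = median_iqr_anomalies(tec)
```

Because the median and IQR are robust statistics, the excursion itself barely shifts the detection bounds, which is why this simple rule is a common reference method.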

  11. Quantum-state anomaly detection for arbitrary errors using a machine-learning technique

    NASA Astrophysics Data System (ADS)

    Hara, Satoshi; Ono, Takafumi; Okamoto, Ryo; Washio, Takashi; Takeuchi, Shigeki

    2016-10-01

    The accurate detection of small deviations in given density matrices is important for quantum information processing, which is a difficult task because of the intrinsic fluctuations in density matrices reconstructed using a limited number of experiments. We previously proposed a method for decoherence error detection using a machine-learning technique [S. Hara, T. Ono, R. Okamoto, T. Washio, and S. Takeuchi, Phys. Rev. A 89, 022104 (2014), 10.1103/PhysRevA.89.022104]. However, the previous method is not valid when the errors are just changes in phase. Here, we propose a method that is valid for arbitrary errors in density matrices. The performance of the proposed method is verified using both numerical simulation data and real experimental data.

  12. Recent Results on "Approximations to Optimal Alarm Systems for Anomaly Detection"

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2009-01-01

    An optimal alarm system and its approximations may use Kalman filtering for univariate linear dynamic systems driven by Gaussian noise to provide a layer of predictive capability. Predicted Kalman filter future process values and a fixed critical threshold can be used to construct a candidate level-crossing event over a predetermined prediction window. An optimal alarm system can be designed to elicit the fewest false alarms for a fixed detection probability in this particular scenario.
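A minimal sketch of the predictive layer described here, assuming a scalar AR(1) state model and reducing the alarm criterion to a single-step exceedance probability. The full optimal alarm system evaluates a level-crossing event over an entire prediction window; the model and all parameter values below are invented for illustration.

```python
import math

def alarm_probability(x_hat, p_hat, a, q, level, horizon):
    """Propagate a scalar Kalman state estimate `horizon` steps ahead under
    the AR(1) model x[k+1] = a*x[k] + w, w ~ N(0, q), and return the
    probability that the predicted value exceeds `level` (a one-step proxy
    for the level-crossing event of an optimal alarm system)."""
    for _ in range(horizon):
        x_hat = a * x_hat                 # predicted mean
        p_hat = a * a * p_hat + q         # predicted variance
    z = (level - x_hat) / math.sqrt(p_hat)
    return 0.5 * math.erfc(z / math.sqrt(2.0))  # Gaussian tail P(X > level)
```

An alarm is raised when this probability exceeds a design threshold; tuning that threshold is what trades false alarms against detection probability in the abstract's sense.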

  13. Detecting anomalies in astronomical signals using machine learning algorithms embedded in an FPGA

    NASA Astrophysics Data System (ADS)

    Saez, Alejandro F.; Herrera, Daniel E.

    2016-07-01

    Consider a large radio-astronomy interferometer such as the ALMA telescope, where the number of stations (50 in the case of ALMA's main array, which can extend to 64 antennas) produces an enormous amount of data in a short period of time: visibilities can be produced every 16 ms, or total-power information every 1 ms, for up to 2016 baselines. It is therefore becoming more difficult to detect problems in the signal produced by each antenna in a timely manner (one antenna produces 4 × 2 GHz spectral windows × 2 polarizations, i.e., a 16 GHz bandwidth signal that is digitized using 3-bit samplers). This work presents an approach based on machine-learning algorithms for detecting problems in the already digitized signal produced by the active antennas (the set of antennas being used in an observation). The aim of this work is to detect unsuitable, or totally corrupted, signals. In addition, this development provides an almost real-time warning, which helps stop an observation and investigate the problem in order to avoid collecting useless information.

  14. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, M. Clinton; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flow fields/plumes; the Optical Plume Anomaly Detection (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and an analytical package designed to utilize the information collected by OPAD is known as the Engine Diagnostic Filtering System (EDIFIS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Additionally, efforts are underway to encode components of the EDIFIS in hardware in order to address real-time operational requirements for health monitoring and management. This paper addresses OPAD and its tool suite, and discusses what is considered a natural progression: a concept for migrating OPAD toward the detection of high-energy particles, including neutrons and gamma rays. The integration of these tools and capabilities will provide NASA with a systematic approach to monitoring a space vehicle's internal and external environment.

  15. Cyber-Critical Infrastructure Protection Using Real-Time Payload-Based Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Düssel, Patrick; Gehl, Christian; Laskov, Pavel; Bußer, Jens-Uwe; Störmann, Christof; Kästner, Jan

    With an increasing demand for inter-connectivity and protocol standardization, modern cyber-critical infrastructures are exposed to a multitude of serious threats that can cause severe damage to life and assets if proper safeguards are not implemented. We therefore propose a method that is capable of reliably detecting unknown, exploit-based attacks carried out over the network against cyber-critical infrastructures. We illustrate the effectiveness of the proposed method by conducting experiments on network traffic typical of modern industrial control systems. Moreover, we provide throughput measurements which demonstrate the real-time capabilities of our system.

  16. Sequential Two-Dimensional Partial Response Maximum Likelihood Detection Scheme with Constant-Weight Constraint Code for Holographic Data Storage Systems

    NASA Astrophysics Data System (ADS)

    Kong, Gyuyeol; Choi, Sooyong

    2012-08-01

    A sequential two-dimensional (2D) partial response maximum likelihood (PRML) detection scheme for holographic data storage (HDS) systems is proposed. We use two complexity-reduction schemes: a reduced-state trellis and a constant-weight (CW) constraint. In the reduced-state trellis, only a limited set of candidate bits surrounding the target bit is considered by the 2D PRML detector. In the CW constraint, trellis transitions that violate the CW condition, namely that each codeword block has exactly one white bit, are eliminated. However, the 2D PRML detector using these complexity-reduction schemes, which operates on 47 states and 169 branches, suffers a performance degradation. To overcome this degradation, a sequential detection algorithm uses the estimated a priori probability. Through this sequential procedure, we mitigate 2D intersymbol interference with enhanced reliability of the branch metric. Simulation results show that the proposed 2D PRML detection scheme yields about 3 dB gain over the one-dimensional PRML detection scheme.

  17. Holonomy anomalies

    SciTech Connect

    Bagger, J.; Nemeschansky, D.; Yankielowicz, S.

    1985-05-01

    A new type of anomaly is discussed that afflicts certain non-linear sigma models with fermions. This anomaly is similar to the ordinary gauge and gravitational anomalies in that it reflects a topological obstruction to the reparametrization invariance of the quantum effective action. Nonlinear sigma models are constructed based on homogeneous spaces G/H. Anomalies arising when the fermions are chiral are shown to be cancelled in some cases by Chern-Simons terms. Nonlinear sigma models based on general Riemannian manifolds are also considered. 9 refs.

  18. Multi-scale structure and topological anomaly detection via a new network statistic: The onion decomposition

    PubMed Central

    Hébert-Dufresne, Laurent; Grochow, Joshua A.; Allard, Antoine

    2016-01-01

    We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks. PMID:27535466

  19. Multi-scale structure and topological anomaly detection via a new network statistic: The onion decomposition

    NASA Astrophysics Data System (ADS)

    Hébert-Dufresne, Laurent; Grochow, Joshua A.; Allard, Antoine

    2016-08-01

    We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks.
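The peeling procedure described above is short enough to write out in full. Below is a plain-Python sketch of the stated algorithm (my own implementation, using a dict-of-sets graph representation): each pass of the standard k-core algorithm removes every vertex whose remaining degree is at most the current core value, and the pass number is the vertex's onion layer.

```python
def onion_decomposition(adj):
    """Coreness and onion layer for each vertex of an undirected graph given
    as {vertex: set_of_neighbours}.  The multiset of layers is the onion
    spectrum; coreness alone is the usual k-core fingerprint."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}   # defensive copy
    degree = {v: len(n) for v, n in adj.items()}
    core, layer = {}, {}
    k, stage = 0, 0
    while degree:
        k = max(k, min(degree.values()))   # core value never decreases
        stage += 1                         # one removal pass = one layer
        peel = {v for v, d in degree.items() if d <= k}
        for v in peel:
            core[v], layer[v] = k, stage
            del degree[v]
        for v in peel:                     # update surviving degrees
            for u in adj[v]:
                if u in degree:
                    degree[u] -= 1
                    adj[u].discard(v)
    return core, layer

# 4-clique with a 2-vertex tail: both tail vertices have coreness 1, but
# their onion layers still distinguish them (the refinement of the k-core)
graph = {0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {0, 1, 2, 4},
         4: {3, 5}, 5: {4}}
core, layer = onion_decomposition(graph)
```

In the toy graph, vertices 4 and 5 share coreness 1 yet fall in different layers, which is exactly the extra information the onion spectrum provides over the k-core decomposition.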

  20. Anomaly detection using simulated MTI data cubes derived from HYDICE data

    SciTech Connect

    Moya, M.M.; Taylor, J.G.; Stallard, B.R.; Motomatsu, S.E.

    1998-07-01

    The US Department of Energy is funding the development of the Multi-spectral Thermal Imager (MTI), a satellite-based multi-spectral (MS) thermal imaging sensor scheduled for launch in October 1999. MTI is a research and development (R and D) platform to test the applicability of multispectral and thermal imaging technology for detecting and monitoring signs of proliferation of weapons of mass destruction. During its three-year mission, MTI will periodically record images of participating government, industrial and natural sites in fifteen visible and infrared spectral bands to provide a variety of image data associated with weapons production activities. The MTI satellite will have spatial resolution in the visible bands that is five times better than LANDSAT TM in each dimension and will have five thermal bands. In this work, the authors quantify the separability between specific materials and the natural background by applying Receiver Operating Curve (ROC) analysis to the residual errors from a linear unmixing. The authors apply the ROC analysis to quantify performance of the MTI. They describe the MTI imager and simulate its data by filtering HYDICE hyperspectral imagery both spatially and spectrally and by introducing atmospheric effects corresponding to the MTI satellite altitude. They compare and contrast the individual effects on performance of spectral resolution, spatial resolution, atmospheric corrections, and varying atmospheric conditions.

  1. Is a "loss of balance" a control error signal anomaly? Evidence for three-sigma failure detection in young adults.

    PubMed

    Ahmed, Alaa A; Ashton-Miller, James A

    2004-06-01

    Given that a physical definition for a loss of balance (LOB) is lacking, the hypothesis was tested that a LOB is actually a loss of effective control, as evidenced by a control error signal anomaly (CEA). A model-reference adaptive controller and failure-detection algorithm were used to represent central nervous system decision-making based on input and output signals obtained during a challenging whole-body planar balancing task. Control error was defined as the residual generated when the actual system output is compared with the predicted output of a simple first-order polynomial system model. A CEA was hypothesized to occur when the model-generated control error signal exceeded three standard deviations (3σ) beyond the mean calculated across a 2-s trailing window. The primary hypothesis tested was that a CEA is indeed observable in 20 healthy young adults (ten women) performing the following experiment. Seated subjects were asked to balance a high-backed chair for as long as possible over its rear legs. Each subject performed ten trials. The ground reaction force under the dominant foot, which constituted the sole input to the system, was measured using a two-axis load cell. Angular acceleration of the chair represented the one-degree-of-freedom system output. The results showed that the 3σ algorithm detected a CEA in 94% of 197 trials. A secondary hypothesis was supported in that a CEA was followed in 93% of the trials by an observable compensatory response, occurring at least 100 ms later and an average of 479 ms later. Longer reaction times were associated with low velocities at CEA, and vice versa. It is noteworthy that this method of detecting CEA does not rely on an external positional or angular reference, or on knowledge of the location of the system's center of mass.
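The 3σ failure-detection rule is easy to state in code. In the sketch below the trailing window is given in samples (in the study it corresponded to 2 s of data); the error signal and the injected anomaly are synthetic stand-ins, not the experimental data.

```python
import numpy as np

def three_sigma_onsets(error, window):
    """Indices n where |error[n] - mean| > 3*std, with mean and std computed
    over the `window` samples immediately preceding n (trailing window)."""
    onsets = []
    for n in range(window, len(error)):
        hist = error[n - window:n]
        if abs(error[n] - hist.mean()) > 3.0 * hist.std():
            onsets.append(n)
    return onsets

rng = np.random.default_rng(2)
err_signal = rng.standard_normal(1000)   # nominal control error
err_signal[300] += 10.0                  # injected control-error anomaly
onsets = three_sigma_onsets(err_signal, window=100)
```

Because the window trails the current sample, the detector needs no external reference signal, matching the point made at the end of the abstract.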

  2. Algorithms for Spectral Decomposition with Applications to Optical Plume Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.; Matthews, Bryan; Das, Santanu

    2008-01-01

    The analysis of spectral signals for features that represent physical phenomena is ubiquitous in the science and engineering communities. There are two main approaches to extracting relevant features from these high-dimensional data streams. The first relies on extracting features using a physics-based paradigm, where the underlying physical mechanism that generates the spectra is used to infer the most important features in the data stream. We focus on a complementary methodology that uses a data-driven technique which is informed by the underlying physics but also has the ability to adapt to unmodeled system attributes and dynamics. We discuss four algorithms: the Spectral Decomposition Algorithm (SDA), Non-Negative Matrix Factorization (NMF), Independent Component Analysis (ICA), and Principal Components Analysis (PCA), and compare their performance on a spectral emulator that we use to generate artificial data with known statistical properties. This spectral emulator mimics the real-world phenomena arising from the plume of the Space Shuttle main engine; it can be used to validate the results of various spectral decomposition algorithms and is very useful for situations where real-world systems have very low probabilities of fault or failure. Our results indicate that methods like SDA and NMF provide a straightforward way of incorporating prior physical knowledge, while NMF with a tuning mechanism can give superior performance on some tests. We demonstrate these algorithms by detecting potential system-health issues in data from a spectral emulator with tunable health parameters.
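Of the four algorithms, NMF is the one whose update rule fits in a few lines. The sketch below uses the standard Lee-Seung multiplicative updates on a toy two-line emission spectrum; the emulator details, spectra, and mixing weights are invented, not the paper's data.

```python
import numpy as np

def nmf(V, rank, iters=500, seed=0):
    """Factor a non-negative matrix V ~ W @ H with Lee-Seung multiplicative
    updates (Frobenius objective).  Rows of H act as non-negative basis
    spectra; W holds their per-observation weights."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 1e-3
    H = rng.random((rank, m)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)   # update basis spectra
        W *= (V @ H.T) / (W @ (H @ H.T) + 1e-12) # update mixing weights
    return W, H

# two well-separated Gaussian "emission lines" mixed with random intensities
x = np.linspace(0, 1, 200)
basis = np.vstack([np.exp(-((x - c) ** 2) / 0.005) for c in (0.3, 0.7)])
rng = np.random.default_rng(1)
V = rng.random((50, 2)) @ basis
W, H = nmf(V, rank=2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The non-negativity of W and H is what makes the factors interpretable as physical spectra and abundances, which is the "straightforward way of incorporating prior physical knowledge" the abstract refers to.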

  3. Regional and residual anomaly separation in microgravity maps for cave detection: The case study of Gruta de las Maravillas (SW Spain)

    NASA Astrophysics Data System (ADS)

    Martínez-Moreno, F. J.; Galindo-Zaldívar, J.; Pedrera, A.; Teixidó, T.; Peña, J. A.; González-Castillo, L.

    2015-03-01

    Gravity can be considered an optimal geophysical method for cave detection, given the high density contrast between an empty cavity and the surrounding materials. A number of methods can be used for regional and residual gravity anomaly separation, although they have not been tested in natural scenarios. With the purpose of comparing the different methods, we calculate the residual anomalies associated with the karst system of Gruta de las Maravillas whose cave morphology and dimensions are well-known. A total of 1857 field measurements, mostly distributed in a regular grid of 10 × 10 m, cover the studied area. The microgravity data were acquired using a Scintrex CG5 gravimeter and topography control was carried out with a differential GPS. Regional anomaly maps were calculated by means of several algorithms to generate the corresponding residual gravimetric maps: polynomial first-order fitting, fast Fourier transformation with an upward continuation filter, moving average, minimum curvature and kriging methods. Results are analysed and discussed in terms of resolution, implying the capacity to detect shallow voids. We propose that polynomial fitting is the best technique when microgravity data are used to obtain the residual anomaly maps for cave detection.
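The first of the listed separation techniques, first-order polynomial fitting, amounts to removing a least-squares plane from the survey and interpreting what remains as the local signal. The sketch below is a synthetic illustration; the survey layout, regional trend, and cavity signal are all invented, not the Gruta de las Maravillas data.

```python
import numpy as np

def residual_anomaly(x, y, g):
    """Subtract a first-order polynomial (planar) regional trend, fitted by
    least squares, from gravity values g observed at stations (x, y)."""
    A = np.column_stack([np.ones_like(x), x, y])
    coeff, *_ = np.linalg.lstsq(A, g, rcond=None)
    return g - A @ coeff

# synthetic survey: planar regional field plus a negative cavity anomaly
n = 400
rng = np.random.default_rng(3)
x, y = rng.uniform(0, 100, n), rng.uniform(0, 100, n)
regional = 5.0 + 0.02 * x - 0.01 * y
cavity = -0.4 * np.exp(-((x - 50) ** 2 + (y - 50) ** 2) / 200.0)
res = residual_anomaly(x, y, regional + cavity)
```

The residual map retains the localized negative anomaly of the cavity while the broad regional gradient is removed; higher-order trends or the other listed methods (upward continuation, moving average, kriging) differ only in how the regional surface is estimated.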

  4. Bangui Anomaly

    NASA Technical Reports Server (NTRS)

    Taylor, Patrick T.

    2004-01-01

    Bangui anomaly is the name given to one of the Earth's largest crustal magnetic anomalies and the largest over the African continent. It covers two-thirds of the Central African Republic, and the name therefore derives from the capital city, Bangui, which is near the center of this feature. From surface magnetic survey data, Godivier and Le Donche (1962) were the first to describe this anomaly. Subsequently, high-altitude world magnetic surveying by the U.S. Naval Oceanographic Office (Project Magnet) recorded a greater than 1000 nT dipolar, peak-to-trough anomaly, with the major portion being negative (figure 1). Satellite observations (Cosmos 49) were first reported in 1964; these revealed a 40 nT anomaly at 350 km altitude. The higher-altitude (417-499 km) POGO (Polar Orbiting Geomagnetic Observatory) satellite data subsequently recorded peak-to-trough anomalies of 20 nT; these data were added to the Cosmos 49 measurements by Regan et al. (1975) for a regional satellite-altitude map. In October 1979, with the launch of Magsat, a satellite designed to measure crustal magnetic anomalies, a more uniform satellite-altitude magnetic map was obtained. These data, computed at 375 km altitude, recorded a -22 nT anomaly (figure 2). This elliptically shaped anomaly is approximately 760 by 1000 km and is centered at 6°N, 18°E. The Bangui anomaly is composed of three segments: two positive anomaly lobes north and south of a large central negative field. This displays the classic pattern of an anomalous magnetic body magnetized by induction in a zero-inclination field, which is not surprising since the magnetic equator passes near the center of this body.

  5. Structural Anomalies Detected in Ceramic Matrix Composites Using Combined Nondestructive Evaluation and Finite Element Analysis (NDE and FEA)

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Baaklini, George Y.; Bhatt, Ramakrishna T.

    2003-01-01

    …and the experimental data. Furthermore, modeling of the voids collected via NDE offered an analytical advantage that resulted in more accurate assessments of the material's structural strength. The top figure shows a CT scan image of the specimen test section illustrating various hidden structural entities in the material, together with an optical image of the test specimen considered in this study. The bottom figure represents the stress response predicted from the finite element analyses (ref. 3) for a selected CT slice; it clearly illustrates the correspondence of the high stress risers due to voids in the material with those predicted by the NDE. This study is continuing, and efforts are concentrated on improving the modeling capabilities to imitate the structural anomalies as detected.

  6. Anomaly Monitoring Method for Key Components of Satellite

    PubMed Central

    Fan, Linjun; Xiao, Weidong; Tang, Jun

    2014-01-01

    This paper presents a fault diagnosis method for key components of a satellite, called the Anomaly Monitoring Method (AMM), which is made up of state estimation based on Multivariate State Estimation Techniques (MSET) and anomaly detection based on the Sequential Probability Ratio Test (SPRT). On the basis of a failure analysis of lithium-ion batteries (LIBs), we divided LIB failures into internal failure, external failure, and thermal runaway, and selected the electrolyte resistance (Re) and the charge transfer resistance (Rct) as the key parameters for state estimation. Then, from the actual in-orbit telemetry data of the key parameters of the LIBs, we obtained the actual residual value (RX) and healthy residual value (RL) of the LIBs based on the MSET state estimation, and from these residual values we detected anomaly states using SPRT anomaly detection. Lastly, we conducted an example application of AMM for LIBs and, according to the results, validated the feasibility and effectiveness of AMM by comparing it with the results of the threshold detection method (TDM). PMID:24587703

  7. Anomaly monitoring method for key components of satellite.

    PubMed

    Peng, Jian; Fan, Linjun; Xiao, Weidong; Tang, Jun

    2014-01-01

    This paper presents a fault diagnosis method for key components of a satellite, called the Anomaly Monitoring Method (AMM), which is made up of state estimation based on Multivariate State Estimation Techniques (MSET) and anomaly detection based on the Sequential Probability Ratio Test (SPRT). On the basis of a failure analysis of lithium-ion batteries (LIBs), we divided LIB failures into internal failure, external failure, and thermal runaway, and selected the electrolyte resistance (Re) and the charge transfer resistance (Rct) as the key parameters for state estimation. Then, from the actual in-orbit telemetry data of the key parameters of the LIBs, we obtained the actual residual value (RX) and healthy residual value (RL) of the LIBs based on the MSET state estimation, and from these residual values we detected anomaly states using SPRT anomaly detection. Lastly, we conducted an example application of AMM for LIBs and, according to the results, validated the feasibility and effectiveness of AMM by comparing it with the results of the threshold detection method (TDM).
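The SPRT decision stage can be sketched generically for Gaussian residuals. This is Wald's textbook formulation for a mean shift, with thresholds derived from the desired error rates α and β; it is not the paper's exact parametrisation, and all numbers below are illustrative.

```python
import math

def sprt(residuals, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald sequential probability ratio test for a shift in the mean of
    Gaussian residuals: H0 mean mu0 vs H1 mean mu1.  Returns ('H0'|'H1', n)
    at the first boundary crossing, or ('undecided', n) if data run out."""
    upper = math.log((1 - beta) / alpha)    # accept H1 (anomaly) above this
    lower = math.log(beta / (1 - alpha))    # accept H0 (healthy) below this
    llr = 0.0
    for n, r in enumerate(residuals, start=1):
        # per-sample Gaussian log-likelihood ratio increment
        llr += (mu1 - mu0) * (r - 0.5 * (mu0 + mu1)) / sigma**2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(residuals)
```

Fed with MSET-style residuals, the test keeps accumulating evidence until one boundary is crossed, which is what gives sequential tests their advantage in mean decision time over a fixed threshold.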

  8. The role of noninvasive and invasive diagnostic imaging techniques for detection of extra-cranial venous system anomalies and developmental variants

    PubMed Central

    2013-01-01

    The extra-cranial venous system is complex and not well studied in comparison to the peripheral venous system. A newly proposed vascular condition, named chronic cerebrospinal venous insufficiency (CCSVI), described initially in patients with multiple sclerosis (MS) has triggered intense interest in better understanding of the role of extra-cranial venous anomalies and developmental variants. So far, there is no established diagnostic imaging modality, non-invasive or invasive, that can serve as the “gold standard” for detection of these venous anomalies. However, consensus guidelines and standardized imaging protocols are emerging. Most likely, a multimodal imaging approach will ultimately be the most comprehensive means for screening, diagnostic and monitoring purposes. Further research is needed to determine the spectrum of extra-cranial venous pathology and to compare the imaging findings with pathological examinations. The ability to define and reliably detect noninvasively these anomalies is an essential step toward establishing their incidence and prevalence. The role for these anomalies in causing significant hemodynamic consequences for the intra-cranial venous drainage in MS patients and other neurologic disorders, and in aging, remains unproven. PMID:23806142

  9. The role of noninvasive and invasive diagnostic imaging techniques for detection of extra-cranial venous system anomalies and developmental variants.

    PubMed

    Dolic, Kresimir; Siddiqui, Adnan H; Karmon, Yuval; Marr, Karen; Zivadinov, Robert

    2013-06-27

    The extra-cranial venous system is complex and not well studied in comparison to the peripheral venous system. A newly proposed vascular condition, named chronic cerebrospinal venous insufficiency (CCSVI), described initially in patients with multiple sclerosis (MS) has triggered intense interest in better understanding of the role of extra-cranial venous anomalies and developmental variants. So far, there is no established diagnostic imaging modality, non-invasive or invasive, that can serve as the "gold standard" for detection of these venous anomalies. However, consensus guidelines and standardized imaging protocols are emerging. Most likely, a multimodal imaging approach will ultimately be the most comprehensive means for screening, diagnostic and monitoring purposes. Further research is needed to determine the spectrum of extra-cranial venous pathology and to compare the imaging findings with pathological examinations. The ability to define and reliably detect noninvasively these anomalies is an essential step toward establishing their incidence and prevalence. The role for these anomalies in causing significant hemodynamic consequences for the intra-cranial venous drainage in MS patients and other neurologic disorders, and in aging, remains unproven.

  10. Risk of developing palatally displaced canines in patients with early detectable dental anomalies: a retrospective cohort study

    PubMed Central

    GARIB, Daniela Gamba; LANCIA, Melissa; KATO, Renata Mayumi; OLIVEIRA, Thais Marchini; NEVES, Lucimara Teixeira das

    2016-01-01

    The early recognition of risk factors for the occurrence of palatally displaced canines (PDC) can increase the possibility of impaction prevention. Objective To estimate the risk of PDC occurrence in children with dental anomalies identified early during mixed dentition. Material and Methods The sample comprised 730 longitudinal orthodontic records from children (448 females and 282 males) with an initial mean age of 8.3 years (SD=1.36). The dental anomaly group (DA) included 263 records of patients with at least one dental anomaly identified in the initial or middle mixed dentition. The non-dental anomaly group (NDA) was composed of 467 records of patients with no dental anomalies. The occurrence of PDC in both groups was diagnosed using panoramic and periapical radiographs taken in the late mixed dentition or early permanent dentition. The prevalence of PDC in patients with and without early diagnosed dental anomalies was compared using the chi-square test (p<0.01), relative risk assessments (RR), and positive and negative predictive values (PPV and NPV). Results PDC frequency was 16.35% and 6.2% in DA and NDA groups, respectively. A statistically significant difference was observed between groups (p<0.01), with greater risk of PDC development in the DA group (RR=2.63). The PPV and NPV were 16% and 93%, respectively. Small maxillary lateral incisors, deciduous molar infraocclusion, and mandibular second premolar distoangulation were associated with PDC. Conclusion Children with dental anomalies diagnosed during early mixed dentition have an approximately two-and-a-half-fold increased risk of developing PDC during late mixed dentition compared with children without dental anomalies. PMID:28076458

  11. Sequential hypothesis testing for automatic detection of task-related changes in cerebral perfusion in a brain-computer interface.

    PubMed

    Faulkner, Hayley G; Myrden, Andrew; Li, Michael; Mamun, Khondaker; Chau, Tom

    2015-11-01

    Evidence suggests that the cerebral blood flow patterns accompanying cognitive activity are retained in many locked-in patients. These patterns can be monitored using transcranial Doppler ultrasound (TCD), a medical imaging technique that measures bilateral cerebral blood flow velocities. Recently, TCD has been proposed as an alternative imaging modality for brain-computer interfaces (BCIs). However, most previous TCD-BCI studies have performed offline analyses with impractically lengthy tasks. In this study, we designed a BCI that automatically differentiates between counting and verbal fluency tasks using sequential hypothesis testing to make decisions as quickly as possible. Ten able-bodied participants silently alternated between counting and verbal fluency tasks within the paradigm of a simulated on-screen keyboard. During this experiment, blood flow velocities were recorded within the left and right middle cerebral arteries using bilateral TCD. Twelve features were used to characterize TCD signals. In a simulated online analysis, sequential hypothesis testing was used to update estimates of class probability every 250 ms as TCD data were processed. Classification was terminated once a threshold level of certainty was reached. Mean classification accuracy across all participants was 72% after an average of 23 s, compared to an offline analysis, which obtained a classification accuracy of 80% after 45 s. This represents a substantial gain in data transmission rate, while maintaining classification accuracies exceeding 70%. Furthermore, a range of decision times between 19 and 28 s was observed, suggesting that the ability of sequential hypothesis testing to adapt the task duration for each individual participant is critical to achieving consistent performance across participants. These results indicate that sequential hypothesis testing is a promising alternative for online TCD-BCIs.
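
    The decision rule described above, updating a class-probability estimate as each feature window arrives and stopping once a certainty threshold is crossed, can be sketched as follows; the Gaussian per-class feature model and all parameter values are illustrative assumptions, not from the study:

```python
import math

def classify_sequential(windows, mu_a=0.0, mu_b=1.0, sigma=1.0, threshold=0.95):
    """Update P(class A | data) after each feature window and stop as
    soon as either class exceeds `threshold` certainty."""
    log_odds = 0.0  # equal priors over the two tasks
    for n, x in enumerate(windows, start=1):
        # Bayesian log-odds update under the assumed Gaussian class models.
        log_odds += (mu_a - mu_b) * (x - (mu_a + mu_b) / 2.0) / sigma ** 2
        p_a = 1.0 / (1.0 + math.exp(-log_odds))
        if p_a >= threshold:
            return "A", n
        if p_a <= 1.0 - threshold:
            return "B", n
    return "undecided", len(windows)
```

    Because the loop stops as soon as the evidence is decisive, easy trials terminate early, which is exactly the per-participant adaptation of task duration the abstract credits for consistent performance.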

  12. A sequential sampling scheme for detecting infestation levels of tracheal mites (Heterostigmata: Tarsonemidae) in honey bee (Hymenoptera: Apidae) colonies.

    PubMed

    Frazier, M T; Finley, J; Harkness, W; Rajotte, E G

    2000-06-01

    The introduction of parasitic honey bee mites, the tracheal mite, Acarapis woodi (Rennie) in 1984 and the Varroa mite, Varroa jacobsoni, in 1987, has dramatically increased the winter mortality of honey bee, Apis mellifera L., colonies in many areas of the United States. Some beekeepers have minimized their losses by routinely treating their colonies with menthol, currently the only Environmental Protection Agency-approved and available chemical for tracheal mite control. Menthol is also expensive and can interfere with honey harvesting. Because of inadequate sampling techniques and a lack of information concerning treatment, this routine treatment strategy has increased the possibility that tracheal mites will develop resistance to menthol. It is important to establish economic thresholds and treat colonies with menthol only when treatment is warranted rather than treating all colonies regardless of infestation level. The use of sequential sampling may reduce the amount of time and effort expended in examining individual colonies and determining if treatment is necessary. Sequential sampling also allows statistically based estimates of the percentage of bees in standard Langstroth hives infested with mites while controlling for the possibility of incorrectly assessing the amount of infestation. On average, sequential sampling plans require fewer observations (bees) to reach a decision for specified probabilities of type I and type II errors than are required for fixed sampling plans, especially when the proportion of infested bees is either very low or very high. We developed a sequential sampling decision plan that allows the user to choose specific economic injury levels and the probabilities of making type I and type II errors, which can result in considerable savings in time, labor, and expense.
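
    A minimal sketch of the kind of Wald-style sequential decision boundaries such a plan rests on; the infestation levels and error rates below are hypothetical, not values from the study:

```python
import math

def sampling_plan(p0, p1, alpha=0.05, beta=0.05):
    """Wald sequential sampling boundaries for an infestation proportion.
    After examining n bees, treat if the infested count >= s*n + h1,
    stop sampling (no treatment) if it is <= s*n + h0, and otherwise
    examine another bee."""
    g1 = math.log(p1 / p0)
    g2 = math.log((1 - p0) / (1 - p1))
    s = g2 / (g1 + g2)                              # common slope of both lines
    h0 = math.log(beta / (1 - alpha)) / (g1 + g2)   # lower (no-treatment) intercept
    h1 = math.log((1 - beta) / alpha) / (g1 + g2)   # upper (treatment) intercept
    return s, h0, h1
```

    With a hypothetical acceptable level p0 = 5% and action level p1 = 20%, the slope falls between the two levels and the intercepts straddle zero, giving the familiar pair of parallel decision lines.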

  13. Early India-Australia spreading history revealed by newly detected Mesozoic magnetic anomalies in the Perth Abyssal Plain

    NASA Astrophysics Data System (ADS)

    Williams, Simon E.; Whittaker, Joanne M.; Granot, Roi; Müller, Dietmar R.

    2013-07-01

    Seafloor within the Perth Abyssal Plain (PAP), offshore Western Australia, is the only section of crust that directly records the early spreading history between India and Australia during the Mesozoic breakup of Gondwana. However, this early spreading has been poorly constrained due to an absence of data, including marine magnetic anomalies and data constraining the crustal nature of key tectonic features. Here, we present new magnetic anomaly data from the PAP that show that the crust in the western part of the basin was part of the Indian Plate—the conjugate flank to the oceanic crust immediately offshore the Perth margin, Australia. We identify a sequence of M2 and older anomalies in the west PAP within crust that initially moved with the Indian Plate, formed at intermediate half-spreading rates (35 mm/yr) consistent with the conjugate sequence on the Australian Plate. More speculatively, we reinterpret the youngest anomalies in the east PAP, finding that the M0-age crust initially formed on the Indian Plate was transferred to the Australian Plate by a westward jump or propagation of the spreading ridge shortly after M0 time. Samples dredged from the Gulden Draak and Batavia Knolls (at the western edge of the PAP) reveal that these bathymetric features are continental fragments rather than igneous plateaus related to Broken Ridge. These microcontinents rifted away from Australia with Greater India during initial breakup at ~130 Ma, then rifted from India following the cessation of spreading in the PAP (~101-103 Ma).

  14. Detection of right ventricle thrombosis in patient with Ebstein anomaly of tricuspid valve after Fontan procedure by CT.

    PubMed

    Kardos, Marek

    2014-01-01

    A case of a 9-year-old boy with a severe form of Ebstein anomaly who underwent a fenestrated Fontan procedure and exclusion of the tricuspid valve is reported. CT demonstrated the presence of the right ventricular thrombus which was first found on echocardiography and confirmed perioperatively.

  15. DOWN'S ANOMALY.

    ERIC Educational Resources Information Center

    PENROSE, L.S.; SMITH, G.F.

    BOTH CLINICAL AND PATHOLOGICAL ASPECTS AND MATHEMATICAL ELABORATIONS OF DOWN'S ANOMALY, KNOWN ALSO AS MONGOLISM, ARE PRESENTED IN THIS REFERENCE MANUAL FOR PROFESSIONAL PERSONNEL. INFORMATION PROVIDED CONCERNS (1) HISTORICAL STUDIES, (2) PHYSICAL SIGNS, (3) BONES AND MUSCLES, (4) MENTAL DEVELOPMENT, (5) DERMATOGLYPHS, (6) HEMATOLOGY, (7)…

  16. Low-Complexity Methods for Provably Good Information Transmission and Network Anomaly Detection via Packet Timings In Networks

    DTIC Science & Technology

    2011-03-30

    shown to be philosophically consistent with "Granger causality", in that it measures directionality of causality (e.g., X causing Y) by assessing...coding with feedforward, has led to an optimal, low-complexity recursive scheme for source coding with feedforward causal side information [6]. By...feedback – I have become generally interested in high-level principles that underlie optimal structures of sequential, causal, information-theoretic

  17. Kinetic enzymatic determination of glycerol in wine and beer using a sequential injection system with spectrophotometric detection.

    PubMed

    Oliveira, Hugo M; Segundo, Marcela A; Lima, José L F C; Grassi, Viviane; Zagatto, Elias A G

    2006-06-14

    A sequential injection system for the automatic determination of glycerol in wine and beer was developed. The method is based on the rate of formation of NADH from the reaction of glycerol and NAD+ catalyzed by the enzyme glycerol dehydrogenase in solution. The determination of glycerol was performed between 0.3 and 3.0 mmol L(-1) (0.028 and 0.276 g L(-1)), and good repeatability was attained (rsd < 3.6%, n = 5) for all samples tested. The determination rate was 54 h(-1), the reagent consumption was only 0.75 micromol of NAD+ and 5.4 ng of enzyme per assay, and the waste production was 2.12 mL per assay. Results obtained for samples were in agreement with those obtained with the batch enzymatic method.
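
    The kinetic principle described above, quantifying glycerol from the rate of NADH formation, amounts to fitting the initial slope of the detector signal against time and reading concentration off a calibration; a minimal sketch (the times and signal values are hypothetical):

```python
def initial_rate(times, signals):
    """Least-squares slope of detector signal vs. time: the initial-rate
    measure from which analyte concentration is read off a calibration
    curve in a kinetic enzymatic assay."""
    n = len(times)
    mt = sum(times) / n
    ms = sum(signals) / n
    num = sum((t - mt) * (s - ms) for t, s in zip(times, signals))
    den = sum((t - mt) ** 2 for t in times)
    return num / den
```

    For a linearly rising NADH signal the fitted slope is the reaction rate; a calibration of rate against glycerol standards then converts that slope into mmol L(-1).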

  18. Enzyme leaching of surficial geochemical samples for detecting hydromorphic trace-element anomalies associated with precious-metal mineralized bedrock buried beneath glacial overburden in northern Minnesota

    USGS Publications Warehouse

    Clark, Robert J.; Meier, A.L.; Riddle, G.; ,

    1990-01-01

    One objective of the International Falls and Roseau, Minnesota, CUSMAP projects was to develop a means of conducting regional-scale geochemical surveys in areas where bedrock is buried beneath complex glacially derived overburden. Partial analysis of B-horizon soils offered hope for detecting subtle hydromorphic trace-element dispersion patterns. An enzyme-based partial leach selectively removes metals from oxide coatings on the surfaces of soil materials without attacking their matrix. Most trace-element concentrations in the resulting solutions are in the part-per-trillion to low part-per-billion range, necessitating determinations by inductively coupled plasma/mass spectrometry. The resulting data show greater contrasts for many trace elements than with other techniques tested. Spatially, many trace metal anomalies are locally discontinuous, but anomalous trends within larger areas are apparent. In many instances, the source for an anomaly seems to be either basal till or bedrock. Ground water flow is probably the most important mechanism for transporting metals toward the surface, although ionic diffusion, electrochemical gradients, and capillary action may play a role in anomaly dispersal. Sample sites near the Rainy Lake-Seine River fault zone, a regional shear zone, often have anomalous concentrations of a variety of metals, commonly including Zn and/or one or more metals which substitute for Zn in sphalerite (Cd, Ge, Ga, and Sn). Shifts in background concentrations of Bi, Sb, and As show a trend across the area indicating a possible regional zoning of lode-Au mineralization. Soil anomalies of Ag, Co, and Tl parallel basement structures, suggesting areas that may have potential for Cobalt/Thunder Bay-type silver veins. An area around Baudette, Minnesota, which is underlain by quartz-chlorite-carbonate-altered shear zones, is anomalous in Ag, As, Bi, Co, Mo, Te, Tl, and W. 
Anomalies of Ag, As, Bi, Te, and W tend to follow the fault zones, suggesting potential

  19. Sequential on-line C-terminal sequencing of peptides based on carboxypeptidase Y digestion and optically gated capillary electrophoresis with laser-induced fluorescence detection.

    PubMed

    Tian, Miaomiao; Zhang, Ning; Liu, Xiaoxia; Guo, Liping; Yang, Li

    2016-08-12

    We report a novel method for sequential on-line C-terminal sequencing of peptides, which combines carboxypeptidase Y (CPY) digestion with on-line derivatization and optically gated capillary electrophoresis with laser-induced fluorescence detection (OGCE-LIF). Various factors that may affect the C-terminal sequencing were investigated and optimized. High repeatability of on-line derivatization and the sequential OGCE-LIF assay of amino acids (AAs) was achieved, with relative standard deviation (RSD) (n=20) less than 1.5% and 3.2% for migration time and peak height, respectively. A total of 13 AAs were efficiently separated in the present study, indicating that the method can be used for sequencing of peptides consisting of the 13 AAs studied. Using two synthesized N-terminally blocked peptides as test examples, we show that the present method can on-line monitor the released AAs with a temporal resolution of 50 s during the entire CPY digestion process. The rates of AA release as a function of digestion time were easily measured; thus, the AA sequence of the peptide was determined with just one OGCE assay. Our study indicates that the present approach is an effective, reliable, and convenient method for rapid analysis of the C-terminal sequence of peptides, with potential application in peptide analysis and proteome research.

  20. Using a combination of MLPA kits to detect chromosomal imbalances in patients with multiple congenital anomalies and mental retardation is a valuable choice for developing countries.

    PubMed

    Jehee, Fernanda Sarquis; Takamori, Jean Tetsuo; Medeiros, Paula F Vasconcelos; Pordeus, Ana Carolina B; Latini, Flavia Roche M; Bertola, Débora Romeo; Kim, Chong Ae; Passos-Bueno, Maria Rita

    2011-01-01

    Conventional karyotyping detects anomalies in 3-15% of patients with multiple congenital anomalies and mental retardation (MCA/MR). Whole-genome array screening (WGAS) has been consistently suggested as the first-choice diagnostic test for this group of patients, but it is very costly for large-scale use in developing countries. We evaluated the use of a combination of Multiplex Ligation-dependent Probe Amplification (MLPA) kits to increase the detection rate of chromosomal abnormalities in MCA/MR patients. We screened 261 MCA/MR patients with two subtelomeric and one microdeletion kits. This would theoretically detect up to 70% of all submicroscopic abnormalities. Additionally we scored the de Vries score for 209 patients in an effort to find a suitable cut-off for MLPA screening. Our results reveal that chromosomal abnormalities were present in 87 (33.3%) patients, but only 57 (21.8%) were considered causative. Karyotyping detected 15 abnormalities (6.9%), while MLPA identified 54 (20.7%). Our combined MLPA screening more than tripled the number of pathogenic imbalances detected compared to conventional karyotyping. We also show that using the de Vries score as a cut-off for this screening would only be suitable under financial restrictions. A decision analytic model was constructed with three possible strategies: karyotype, karyotype + MLPA and karyotype + WGAS. The karyotype + MLPA strategy detected anomalies in 19.8% of cases, which accounts for 76.45% of the expected yield for karyotype + WGAS. The Incremental Cost Effectiveness Ratio (ICER) of MLPA is three times lower than that of WGAS, which means that, for the same costs, we have three additional diagnoses with MLPA but only one with WGAS. We list all causative alterations found, including rare findings, such as reciprocal duplications of regions deleted in Sotos and Williams-Beuren syndromes. 
We also describe imbalances that were considered polymorphisms or rare variants, such as the new SNP
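
    The incremental cost-effectiveness ratio underlying the decision model is simply the extra cost per additional diagnosis when switching strategies; a minimal sketch with hypothetical numbers (the paper's actual costs and yields are not reproduced):

```python
def icer(cost_new, diagnoses_new, cost_old, diagnoses_old):
    """Incremental cost-effectiveness ratio: extra cost incurred per
    additional diagnosis when moving from the old strategy to the new."""
    return (cost_new - cost_old) / (diagnoses_new - diagnoses_old)

# Hypothetical illustration: a strategy costing 200 per patient that yields
# 20 diagnoses per 100 patients, versus 100 and 10 for the baseline, gives
# an ICER of 10 per extra diagnosis.
```

    Under this metric, the paper's finding that MLPA's ICER is a third of WGAS's means three extra diagnoses per unit of additional spending instead of one.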

  1. SplicePie: a novel analytical approach for the detection of alternative, non-sequential and recursive splicing.

    PubMed

    Pulyakhina, Irina; Gazzoli, Isabella; 't Hoen, Peter A C; Verwey, Nisha; den Dunnen, Johan T; Aartsma-Rus, Annemieke; Laros, Jeroen F J

    2015-07-13

    Alternative splicing is a powerful mechanism present in eukaryotic cells to obtain a wide range of transcripts and protein isoforms from a relatively small number of genes. The mechanisms regulating (alternative) splicing and the paradigm of consecutive splicing have recently been challenged, especially for genes with a large number of introns. RNA-Seq, a powerful technology using deep sequencing in order to determine transcript structure and expression levels, is usually performed on mature mRNA, therefore not allowing detailed analysis of splicing progression. Sequencing pre-mRNA at different stages of splicing potentially provides insight into mRNA maturation. Although the number of tools that analyze total and cytoplasmic RNA in order to elucidate the transcriptome composition is rapidly growing, there are no tools specifically designed for the analysis of nuclear RNA (which contains mixtures of pre- and mature mRNA). We developed dedicated algorithms to investigate the splicing process. In this paper, we present a new classification of RNA-Seq reads based on three major stages of splicing: pre-, intermediate- and post-splicing. Applying this novel classification we demonstrate the possibility to analyze the order of splicing. Furthermore, we uncover the potential to investigate the multi-step nature of splicing, assessing various types of recursive splicing events. We provide data that give biological insight into the order of splicing and show that non-sequential splicing of certain introns is reproducible and coincides across multiple cell lines. We validated our observations with independent experimental technologies and showed the reliability of our method. The pipeline, named SplicePie, is freely available at: https://github.com/pulyakhina/splicing_analysis_pipeline. The example data can be found at: https://barmsijs.lumc.nl/HG/irina/example_data.tar.gz.

  2. Synthesis and Application of an Aldazine-Based Fluorescence Chemosensor for the Sequential Detection of Cu2+ and Biological Thiols in Aqueous Solution and Living Cells

    PubMed Central

    Jia, Hongmin; Yang, Ming; Meng, Qingtao; He, Guangjie; Wang, Yue; Hu, Zhizhi; Zhang, Run; Zhang, Zhiqiang

    2016-01-01

    A fluorescence chemosensor, 2-hydroxy-1-naphthaldehyde azine (HNA) was designed and synthesized for sequential detection of Cu2+ and biothiols. It was found that HNA can specifically bind to Cu2+ with 1:1 stoichiometry, accompanied with a dramatic fluorescence quenching and a remarkable bathochromic-shift of the absorbance peak in HEPES buffer. The generated HNA-Cu2+ ensemble displayed a “turn-on” fluorescent response specific for biothiols (Hcy, Cys and GSH) based on the displacement approach, giving a remarkable recovery of fluorescence and UV-Vis spectra. The detection limits of HNA-Cu2+ to Hcy, Cys and GSH were estimated to be 1.5 μM, 1.0 μM and 0.8 μM, respectively, suggesting that HNA-Cu2+ is sensitive enough for the determination of thiols in biological systems. The biocompatibility of HNA towards A549 human lung carcinoma cell, was evaluated by an MTT assay. The capability of HNA-Cu2+ to detect biothiols in live A549 cells was then demonstrated by a microscopy fluorescence imaging assay. PMID:26761012

  3. Sequential (step-by-step) detection, identification and quantitation of extra virgin olive oil adulteration by chemometric treatment of chromatographic profiles.

    PubMed

    Capote, F Priego; Jiménez, J Ruiz; de Castro, M D Luque

    2007-08-01

    An analytical method for the sequential detection, identification and quantitation of extra virgin olive oil adulteration with four edible vegetable oils--sunflower, corn, peanut and coconut oils--is proposed. The only data required for this method are the results obtained from an analysis of the lipid fraction by gas chromatography-mass spectrometry. A total number of 566 samples (pure oils and samples of adulterated olive oil) were used to develop the chemometric models, which were designed to accomplish, step-by-step, the three aims of the method: to detect whether an olive oil sample is adulterated, to identify the type of adulterant used in the fraud, and to determine how much adulterant is in the sample. Qualitative analysis was carried out via two chemometric approaches--soft independent modelling of class analogy (SIMCA) and K nearest neighbours (KNN)--both approaches exhibited prediction abilities that were always higher than 91% for adulterant detection and 88% for type of adulterant identification. Quantitative analysis was based on partial least squares regression (PLSR), which yielded R2 values of >0.90 for calibration and validation sets and thus made it possible to determine adulteration with excellent precision according to the Shenk criteria.
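
    Of the two qualitative chemometric approaches named above, KNN is the simpler to sketch; the feature vectors and labels below are hypothetical stand-ins for chromatographic profiles, not data from the paper:

```python
from collections import Counter

def knn_classify(train, labels, sample, k=3):
    """Majority vote among the k nearest training profiles, using squared
    Euclidean distance on the feature vectors."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, sample)), y)
        for x, y in zip(train, labels)
    )
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]
```

    In the sequential scheme, one such classifier first answers "adulterated or not", and a second identifies the adulterant type before quantitation by PLSR.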

  4. Congenital anomalies

    PubMed Central

    Kunisaki, Shaun M.

    2012-01-01

    Over the past decade, amniotic fluid-derived stem cells have emerged as a novel, experimental approach for the treatment of a wide variety of congenital anomalies diagnosed either in utero or postnatally. There are a number of unique properties of amniotic fluid stem cells that have allowed it to become a major research focus. These include the relative ease of accessing amniotic fluid cells in a minimally invasive fashion by amniocentesis as well as the relatively rich population of progenitor cells obtained from a small aliquot of fluid. Mesenchymal stem cells, c-kit positive stem cells, as well as induced pluripotent stem cells have all been derived from human amniotic fluid in recent years. This article gives a pediatric surgeon’s perspective on amniotic fluid stem cell therapy for the management of congenital anomalies. The current status in the use of amniotic fluid-derived stem cells, particularly as they relate as substrates in tissue engineering-based applications, is described in various animal models. A roadmap for further study and eventual clinical application is also proposed. PMID:22986340

  5. Investigation of the collision line broadening problem as applicable to the NASA Optical Plume Anomaly Detection (OPAD) system, phase 1

    NASA Technical Reports Server (NTRS)

    Dean, Timothy C.; Ventrice, Carl A.

    1995-01-01

    As a final report for phase 1 of the project, the researchers are submitting to the Tennessee Tech Office of Research the following two papers (reprinted in this report): 'Collision Line Broadening Effects on Spectrometric Data from the Optical Plume Anomaly System (OPAD),' presented at the 30th AIAA/ASME/SAE/ASEE Joint Propulsion Conference, 27-29 June 1994, and 'Calculation of Collision Cross Sections for Atomic Line Broadening in the Plume of the Space Shuttle Main Engine (SSME),' presented at the IEEE Southeastcon '95, 26-29 March 1995. These papers fully state the problem and the progress made up to the end of NASA Fiscal Year 1994. The NASA OPAD system was devised to predict concentrations of anomalous species in the plume of the Space Shuttle Main Engine (SSME) through analysis of spectrometric data. The self absorption of the radiation of these plume anomalies is highly dependent on the line shape of the atomic transition of interest. The Collision Line Broadening paper discusses the methods used to predict line shapes of atomic transitions in the environment of a rocket plume. The Voigt profile is used as the line shape factor since both Doppler and collisional line broadening are significant. Methods used to determine the collisional cross sections are discussed and the results are given and compared with experimental data. These collisional cross sections are then incorporated into the current self absorbing radiative model and the predicted spectrum is compared to actual spectral data collected from the Stennis Space Center Diagnostic Test Facility rocket engine. The second paper included in this report investigates an analytical method for determining the cross sections for collision line broadening by molecular perturbers, using effective central force interaction potentials. These cross sections are determined for several atomic species with H2, one of the principal constituents of the SSME plume environment, and compared with experimental data.

  6. Investigation of the collision line broadening problem as applicable to the NASA Optical Plume Anomaly Detection (OPAD) system, phase 1

    NASA Astrophysics Data System (ADS)

    Dean, Timothy C.; Ventrice, Carl A.

    1995-05-01

    As a final report for phase 1 of the project, the researchers are submitting to the Tennessee Tech Office of Research the following two papers (reprinted in this report): 'Collision Line Broadening Effects on Spectrometric Data from the Optical Plume Anomaly System (OPAD),' presented at the 30th AIAA/ASME/SAE/ASEE Joint Propulsion Conference, 27-29 June 1994, and 'Calculation of Collision Cross Sections for Atomic Line Broadening in the Plume of the Space Shuttle Main Engine (SSME),' presented at the IEEE Southeastcon '95, 26-29 March 1995. These papers fully state the problem and the progress made up to the end of NASA Fiscal Year 1994. The NASA OPAD system was devised to predict concentrations of anomalous species in the plume of the Space Shuttle Main Engine (SSME) through analysis of spectrometric data. The self absorption of the radiation of these plume anomalies is highly dependent on the line shape of the atomic transition of interest. The Collision Line Broadening paper discusses the methods used to predict line shapes of atomic transitions in the environment of a rocket plume. The Voigt profile is used as the line shape factor since both Doppler and collisional line broadening are significant. Methods used to determine the collisional cross sections are discussed and the results are given and compared with experimental data. These collisional cross sections are then incorporated into the current self absorbing radiative model and the predicted spectrum is compared to actual spectral data collected from the Stennis Space Center Diagnostic Test Facility rocket engine. The second paper included in this report investigates an analytical method for determining the cross sections for collision line broadening by molecular perturbers, using effective central force interaction potentials. These cross sections are determined for several atomic species with H2, one of the principal constituents of the SSME plume environment, and compared with experimental data.

  7. Anomaly Detection and Comparative Analysis of Hydrothermal Alteration Materials Trough Hyperspectral Multisensor Data in the Turrialba Volcano

    NASA Astrophysics Data System (ADS)

    Rejas, J. G.; Martínez-Frías, J.; Bonatti, J.; Martínez, R.; Marchamalo, M.

    2012-07-01

    The aim of this work is a comparative study of hydrothermal alteration materials at the Turrialba volcano (Costa Rica) in relation to spectral anomalies computed from multitemporal and multisensor data acquired in the visible (VIS), short-wave infrared (SWIR) and thermal infrared (TIR) ranges. For this purpose we used hyperspectral and multispectral images from the HyMAP and MASTER airborne sensors, together with ASTER and Hyperion scenes acquired between 2002 and 2010. Field radiometry was applied to remove the atmospheric contribution using an empirical line method. HyMAP and MASTER images were georeferenced directly using positioning and orientation data measured during the acquisition campaign by a GPS/IMU-based inertial system. These two steps allowed the identification of diagnostic spectral bands of hydrothermal alteration minerals and accurate spatial correlation. The environmental impact of the volcanic activity was studied through different vegetation indexes and soil patterns. Hydrothermal materials were mapped in the currently active crater of the volcano and its surroundings by carrying out a principal components analysis, differentiated for high- and low-absorption bands, to characterize accumulations of kaolinite, illite, alunite and kaolinite+smectite, delimiting zones where these minerals are present. Spectral anomalies were calculated in a comparative study of pixel and subpixel methods focused on thermal bands fused with high-resolution images. Results are presented as an expert-based approach whose main interest lies in the automated identification of patterns of hydrothermally altered materials with little or no prior knowledge of the area.

  8. An Adaptive Network-based Fuzzy Inference System for the detection of thermal and TEC anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake of 11 August 2012

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-09-01

    Anomaly detection is extremely important for forecasting the date, location and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS) is proposed to detect thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake that jolted NW Iran on 11 August 2012. ANFIS is a well-known hybrid neuro-fuzzy network for modeling non-linear complex systems. The thermal and TEC anomalies detected using the proposed method are also compared to the anomalies observed by applying classical and intelligent methods, including Interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN) and Support Vector Machine (SVM) methods. The dataset, which comprises Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), covers 62 days. If the difference between the value predicted by the ANFIS method and the observed value exceeds a pre-defined threshold, the observed value, in the absence of non-seismic effective parameters, can be regarded as a precursory anomaly. For the two precursors, LST and TEC, the ANFIS method shows very good agreement with the other implemented classical and intelligent methods, indicating that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. The detection of the thermal and TEC anomalies thus derives its credibility from the overall efficiency of the five integrated methods.
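
    The threshold decision rule described above can be sketched independently of the ANFIS predictor itself (which is not reimplemented here); the series and threshold are hypothetical:

```python
def flag_anomalies(predicted, observed, threshold):
    """Apply the abstract's decision rule: a day is flagged as a possible
    precursory anomaly when |observed - predicted| exceeds the threshold."""
    return [abs(o - p) > threshold for p, o in zip(predicted, observed)]
```

    Any of the compared predictors (ANFIS, ARIMA, ANN, SVM) can supply the `predicted` series; the flagging step is the same for all of them.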

  9. New Peak Temperature Constraints Using RSCM Geothermometry on Lucia Subterrane in Franciscan Complex (California, USA): Detection of Thermal Anomalies in Gold-Bearing Quartz Veins Surrounding.

    NASA Astrophysics Data System (ADS)

    Lahfid, A.; Delchini, S.; Lacroix, B.

    2015-12-01

    The occurrence of deposits hosted by metasediments rich in carbonaceous material is widespread. In this study we therefore investigate the potential of Raman Spectroscopy of Carbonaceous Material (RSCM) geothermometry to detect thermal anomalies in hydrothermal ore-deposit environments, and demonstrate the ability of warm fluids migrating through the sedimentary sequence to locally disturb the thermal gradient and the associated peak temperatures. For this purpose we chose the Lucia subterrane of the Franciscan Complex (California, USA), which includes gold-bearing quartz veins that witness a hydrothermal overprint (Underwood et al., 1995). The sediments in this zone essentially comprise greywacke and shale-matrix mélange (e.g. Frey and Robinson, 1999), which have undergone high-pressure, low-temperature metamorphism. The thermal history of the Lucia subterrane was previously established by Underwood et al. (1995), essentially using the vitrinite reflectance method (Rm). Rm values increase from south to north, varying between 0.9 and 3.7% (~150-280°C). These results suggest that the Lucia subterrane underwent a regional increase in thermal gradient toward the north. Anomalous Rm values from 4.5% to 4.9% (~305-315°C) are recorded near Cape San Martin; these highest estimated temperatures are likely associated with a late hydrothermal event (Underwood et al., 1995). The estimated Raman temperatures (1) confirm the increase in metamorphic grade towards the north already shown by Underwood et al. (1995) using classical methods such as mineralogy and vitrinite reflectance, and (2) exhibit anomalous values (temperatures reaching 350°C), probably due to the later hydrothermal event. This result suggests that RSCM can be used as a reliable tool to detect thermal anomalies caused by hot fluid flow.

  10. Reliable VLSI sequential controllers

    NASA Technical Reports Server (NTRS)

    Whitaker, S.; Maki, G.; Shamanna, M.

    1990-01-01

    A VLSI architecture for synchronous sequential controllers is presented that has attractive qualities for producing reliable circuits. In these circuits, one hardware implementation can realize any flow table with a maximum of 2(exp n) internal states and m inputs, and all design equations are identical. A real-time fault detection scheme is presented, along with a strategy for verifying the correctness of the checking hardware; this self-checking feature can be employed with no increase in hardware. The architecture can be modified to achieve fail-safe designs. Also with no increase in hardware, an adaptable circuit can be realized that allows replacement of faulty transitions with fault-free transitions.

  11. Sequential detection of alphafetoprotein-bearing cells in blood stem cell fraction of germ cell tumour patients

    PubMed Central

    Kasahara, T; Hara, N; Bilim, V; Tomita, Y; Saito, K; Obara, K; Takahashi, K

    2001-01-01

    High-dose chemotherapy with peripheral blood stem cell (PBSC) transplantation is widely applied in advanced germ cell tumour (GCT) patients. The aims of this study were: (1) to examine the presence of alphafetoprotein (AFP) bearing tumour cells in PBSC harvests from advanced GCT patients obtained after multiple cycles of induction chemotherapy; and (2) to determine whether induction chemotherapy contributed to in vivo purging of the tumour. We evaluated cryopreserved PBSC samples from 5 patients with advanced stage II/III AFP-producing GCT. PBSC were separated after the first, second and third cycles of induction chemotherapy. These samples were analysed using the nested reverse transcription polymerase chain reaction (RT-PCR) method to detect AFP mRNA. In all patients, AFP mRNA was detected in PBSC samples after the first or second cycle of induction chemotherapy, but it was not detected in 3 of 4 samples after the third cycle. Although it is not clear whether tumour cells contaminating the PBSC fraction contribute to disease relapse, harvesting PBSC after at least 3 cycles of induction chemotherapy might be recommended to avoid this possibility. © 2001 Cancer Research Campaign http://www.bjcancer.com PMID:11710823

  12. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, Marshall Clint; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented such a system for rocket engine flowfields/plumes. The Optical Plume Anomaly Detector (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and analytical packages designed to utilize the information collected by OPAD is known as the Engine Diagnostic Filtering System (EDiFiS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Capabilities for real-time processing are being advanced on several fronts, including an effort to hardware-encode components of the EDiFiS for health monitoring and management. This paper addresses OPAD and its tool suites, and discusses what is considered a natural progression: a concept for taking OPAD to the next logical level of high energy physics, incorporating fermion and boson particle analyses in the measurement of neutron flux.

  13. Gauge anomalies, gravitational anomalies, and superstrings

    SciTech Connect

    Bardeen, W.A.

    1985-08-01

    The structure of gauge and gravitational anomalies will be reviewed. The impact of these anomalies on the construction, consistency, and application of the new superstring theories will be discussed. 25 refs.

  14. Highly reproducible SERS detection in sequential injection analysis: real time preparation and application of photo-reduced silver substrate in a moving flow-cell.

    PubMed

    El-Zahry, Marwa R; Genner, Andreas; Refaat, Ibrahim H; Mohamed, Horria A; Lendl, Bernhard

    2013-11-15

    This paper reports an improved way of performing highly reproducible surface enhanced Raman scattering (SERS) of different analytes using an automated flow system. The method uses a confocal Raman microscope to prepare SERS-active silver spots on the window of a flow cell by photo-reduction of silver nitrate in the presence of citrate. Placement of the flow cell on the automated x and y stages of the Raman microscope allows a fresh spot to be prepared for every new measurement. This procedure efficiently avoids any carry-over effects which might result from adsorption of the analyte on the SERS-active material and enables highly reproducible SERS measurements. For reproducible liquid handling, the sequential injection analysis system as well as the Raman microscope were operated by the flexible LabVIEW-based software ATLAS developed in our group. Quantitative aspects were investigated using Cu(PAR)2 as a model analyte. Concentrations down to 5×10⁻⁶ M provided clear SERS spectra; a linear concentration dependence of the SERS intensities at 1333 cm⁻¹ was obtained from 5×10⁻⁵ to 1×10⁻³ M with a correlation coefficient r = 0.999. The coefficient of variation of the method (Vxo) was found to be 5.6% and the calculated limit of detection 1.7×10⁻⁵ M. The results demonstrate the potential of SERS spectroscopy to be used as a molecule-specific detector in aqueous flow systems.
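The quantitative workflow summarized above (a linear fit of SERS intensity against concentration, a correlation coefficient, and a derived limit of detection) follows standard calibration arithmetic. The sketch below uses invented concentration/intensity values and the common 3.3·s/slope LOD convention; neither the data nor the exact LOD formula are from the paper:

```python
import numpy as np

# Hypothetical calibration data: concentration (M) vs. SERS intensity
# at 1333 cm^-1. Values are illustrative only.
conc = np.array([5e-5, 1e-4, 2.5e-4, 5e-4, 1e-3])
intensity = np.array([120.0, 240.0, 610.0, 1190.0, 2400.0])

# Least-squares calibration line and correlation coefficient.
slope, intercept = np.polyfit(conc, intensity, 1)
r = np.corrcoef(conc, intensity)[0, 1]

# Residual standard deviation of the fit, feeding a 3.3*s/slope
# limit-of-detection estimate (one common convention).
residuals = intensity - (slope * conc + intercept)
s_y = residuals.std(ddof=2)
lod = 3.3 * s_y / slope
```

An unknown sample's concentration would then be estimated as `(signal - intercept) / slope`, valid only within the linear range of the calibration.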

  15. The elliptic anomaly

    NASA Technical Reports Server (NTRS)

    Janin, G.; Bond, V. R.

    1980-01-01

    An independent variable different from the time for elliptic orbit integration is used. Such a time transformation provides an analytical step-size regulation along the orbit. An intermediate anomaly (an anomaly intermediate between the eccentric and the true anomaly) is suggested for optimum performances. A particular case of an intermediate anomaly (the elliptic anomaly) is defined, and its relation with the other anomalies is developed.

  16. Stacked Sequential Learning

    DTIC Science & Technology

    2005-07-01

    a constant factor of K + 2. (To see this, note sequential stacking requires training K+2 classifiers: the classifiers f1, . . . , fK used in cross...on the non-sequential learners (ME and VP) but improves performance of the sequential learners (CRFs and VP-HMMs) less consistently. This pattern

  17. ISHM Anomaly Lexicon for Rocket Test

    NASA Technical Reports Server (NTRS)

    Schmalzel, John L.; Buchanan, Aubri; Hensarling, Paula L.; Morris, Jonathan; Turowski, Mark; Figueroa, Jorge F.

    2007-01-01

    Integrated Systems Health Management (ISHM) is a comprehensive capability. An ISHM system must detect anomalies, identify their causes, predict future anomalies, and help identify the consequences of anomalies, for example by suggesting mitigation steps. The system should also provide users with appropriate navigation tools to facilitate the flow of information into and out of the ISHM system. Central to the ability of an ISHM system to detect anomalies is a clearly defined catalog of anomalies. Further, this lexicon of anomalies must be organized in ways that make it accessible to a suite of tools used to manage the data, information and knowledge (DIaK) associated with a system. In particular, it is critical to ensure optimal mapping between target anomalies and the algorithms associated with their detection. During the early development of our ISHM architecture and approach, it became clear that a lexicon of anomalies would be important to the development of critical anomaly detection algorithms. In our work in the rocket engine test environment at John C. Stennis Space Center, we have access to a repository of discrepancy reports (DRs) that are generated in response to squawks identified during post-test data analysis. The DR is the tool used to document an anomaly and the methods used to resolve the issue. These DRs have been generated for many different tests and for all test stands; as a result, they represent a comprehensive summary of the anomalies associated with rocket engine testing. Fig. 1 illustrates some of the data that can be extracted from a DR, including affected transducer channels, a narrative description of the observed anomaly, and the steps used to correct the problem. The primary goal of our anomaly lexicon development efforts is to create a lexicon that could be used in support of an associated health assessment database system (HADS) co-development effort. There are a number of significant

  18. Rapid determination of plutonium isotopes in environmental samples using sequential injection extraction chromatography and detection by inductively coupled plasma mass spectrometry.

    PubMed

    Qiao, Jixin; Hou, Xiaolin; Roos, Per; Miró, Manuel

    2009-10-01

    This article presents an automated method for the rapid determination of ²³⁹Pu and ²⁴⁰Pu in various environmental samples. The analytical method involves the in-line separation of Pu isotopes using extraction chromatography (TEVA) implemented in a sequential injection (SI) network, followed by detection of the isolated analytes with inductively coupled plasma mass spectrometry (ICP-MS). The method has been devised for the determination of Pu isotopes at environmentally relevant concentrations, whereby it has been successfully applied to the analyses of large volumes/amounts of samples, for example, 100-200 g of soil and sediment, 20 g of seaweed, and 200 L of seawater following analyte preconcentration. The investigation of the separation capability of the assembled SI system revealed that up to 200 g of soil or sediment can be treated using a column containing about 0.70 g of TEVA resin. The analytical results of Pu isotopes in the reference materials showed good agreement with the certified or reference values at the 0.05 significance level. Chemical yields of Pu ranged from 80 to 105%, and the decontamination factors for uranium, thorium, mercury and lead were all above 10⁴. The duration of the in-line extraction chromatographic run was <1.5 h, and the proposed setup was able to handle up to 20 samples (14 mL each) in a fully automated mode using a single chromatographic column. The SI manifold is thus suitable for rapid and automated determination of Pu isotopes in environmental risk assessment and emergency preparedness scenarios.

  19. Mobile gamma-ray scanning system for detecting radiation anomalies associated with ²²⁶Ra-bearing materials

    SciTech Connect

    Myrick, T.E.; Blair, M.S.; Doane, R.W.; Goldsmith, W.A.

    1982-11-01

    A mobile gamma-ray scanning system has been developed by Oak Ridge National Laboratory for use in the Department of Energy's remedial action survey programs. The unit consists of a NaI(Tl) detection system housed in a specially-equipped van. The system is operator controlled through an on-board mini-computer, with data output provided on the computer video screen, strip chart recorders, and an on-line printer. Data storage is provided by a floppy disk system. Multichannel analysis capabilities are included for qualitative radionuclide identification. A ²²⁶Ra-specific algorithm is employed to identify locations containing residual radium-bearing materials. This report presents the details of the system description, software development, and scanning methods utilized with the ORNL system. Laboratory calibration and field testing have established the system sensitivity, field of view, and other performance characteristics, the results of which are also presented. Documentation of the instrumentation and computer programs is included.

  20. Experimental evidence for spring and autumn windows for the detection of geobotanical anomalies through the remote sensing of overlying vegetation

    NASA Technical Reports Server (NTRS)

    Labovitz, M. L.; Masuoka, E. J.; Bell, R.; Nelson, R. F.; Larsen, C. A.; Hooker, L. K.; Troensegaard, K. W.

    1985-01-01

    It is pointed out that in many regions of the world, vegetation is the predominant factor influencing variation in reflected energy in the 0.4-2.5 micron region of the spectrum. Studies have therefore been conducted regarding the utility of remote sensing for detecting changes in vegetation which could be related to the presence of mineralization. The present paper primarily reports the results of the second year of a multiyear study of geobotanical remote-sensing relationships as developed over areas of sulfide mineralization. The field study has a strong experimental-design basis: first, the boundaries of a large geographic region satisfying a set of previously enumerated field-site criteria were delineated; within this region, carefully selected pairs of mineralized and nonmineralized test sites were then examined over the growing season. The experiment provides information about the spectral and temporal resolutions required for remote-sensing geobotanical exploration, and the obtained results are evaluated.

  1. Sequentially Executed Model Evaluation Framework

    SciTech Connect

    2015-10-20

    Provides a message-passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), together with sample I/O drivers. This is a library framework and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in the development of models which operate on sequential information, such as time series, where evaluation is based on prior results combined with new data for the current iteration. It has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data are analyzed for anomalies.
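The pattern the abstract describes, a controller stepping a model through a discrete domain so that each evaluation combines prior state with the newly arrived sample, might look roughly like this minimal sketch. The class and function names are hypothetical and are not the framework's actual API:

```python
from typing import Iterable, List

class MeanModel:
    """Toy sequential model: each step folds the new sample into
    prior state (here, a running mean), as in the SeMe pattern."""
    def __init__(self) -> None:
        self.count = 0
        self.mean = 0.0

    def step(self, value: float) -> float:
        # Incremental update: prior result combined with new data.
        self.count += 1
        self.mean += (value - self.mean) / self.count
        return self.mean

def run_batch(samples: Iterable[float], model: MeanModel) -> List[float]:
    """Batch controller: steps the model through the discrete domain
    and collects each step's output."""
    return [model.step(v) for v in samples]
```

A real-time controller would differ only in where the samples come from (a live stream rather than a stored sequence); the model-stepping API stays the same.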

  2. Aeromagnetic anomalies over faulted strata

    USGS Publications Warehouse

    Grauch, V.J.S.; Hudson, Mark R.

    2011-01-01

    High-resolution aeromagnetic surveys are now an industry standard and they commonly detect anomalies that are attributed to faults within sedimentary basins. However, detailed studies identifying geologic sources of magnetic anomalies in sedimentary environments are rare in the literature. Opportunities to study these sources have come from well-exposed sedimentary basins of the Rio Grande rift in New Mexico and Colorado. High-resolution aeromagnetic data from these areas reveal numerous, curvilinear, low-amplitude (2–15 nT at 100-m terrain clearance) anomalies that consistently correspond to intrasedimentary normal faults (Figure 1). Detailed geophysical and rock-property studies provide evidence for the magnetic sources at several exposures of these faults in the central Rio Grande rift (summarized in Grauch and Hudson, 2007, and Hudson et al., 2008). A key result is that the aeromagnetic anomalies arise from the juxtaposition of magnetically differing strata at the faults as opposed to chemical processes acting at the fault zone. The studies also provide (1) guidelines for understanding and estimating the geophysical parameters controlling aeromagnetic anomalies at faulted strata (Grauch and Hudson), and (2) observations on key geologic factors that are favorable for developing similar sedimentary sources of aeromagnetic anomalies elsewhere (Hudson et al.).

  3. Chiral anomalies and differential geometry

    SciTech Connect

    Zumino, B.

    1983-10-01

    Some properties of chiral anomalies are described from a geometric point of view. Topics include chiral anomalies and differential forms, transformation properties of the anomalies, identification and use of the anomalies, and normalization of the anomalies. 22 references. (WHK)

  4. Graph anomalies in cyber communications

    SciTech Connect

    Vander Wiel, Scott A; Storlie, Curtis B; Sandine, Gary; Hagberg, Aric A; Fisk, Michael

    2011-01-11

    Enterprises monitor cyber traffic for viruses, intruders and stolen information. Detection methods look for known signatures of malicious traffic or search for anomalies with respect to a nominal reference model. Traditional anomaly detection focuses on aggregate traffic at central nodes or on user-level monitoring. More recently, however, traffic is being viewed more holistically as a dynamic communication graph. Attention to the graph nature of the traffic has expanded the types of anomalies that are being sought. We give an overview of several cyber data streams collected at Los Alamos National Laboratory and discuss current work in modeling the graph dynamics of traffic over the network. We consider global properties and local properties within the communication graph. A method for monitoring relative entropy on multiple correlated properties is discussed in detail.
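The relative-entropy monitoring mentioned above compares an observed distribution of some graph property against a nominal reference. A minimal sketch of the underlying quantity, the KL divergence between two empirical count distributions, is below; the smoothing constant `eps` is an illustrative choice, not a detail from the paper:

```python
import math
from collections import Counter

def relative_entropy(p_counts, q_counts, eps=1e-9):
    """KL divergence D(P || Q) between two empirical distributions
    over the same event space, smoothed to avoid log of zero."""
    keys = set(p_counts) | set(q_counts)
    p_tot = sum(p_counts.values()) + eps * len(keys)
    q_tot = sum(q_counts.values()) + eps * len(keys)
    kl = 0.0
    for k in keys:
        p = (p_counts.get(k, 0) + eps) / p_tot
        q = (q_counts.get(k, 0) + eps) / q_tot
        kl += p * math.log(p / q)
    return kl
```

In a monitoring setting, `q_counts` would hold the nominal reference distribution (e.g., of degrees or edge types) and `p_counts` the current window; a large divergence signals departure from the reference model.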

  5. [Redistribution of 201 Tl after myocardial scintigraphy with dipyridamole: value in the detection of coronary stenosis and ventricular kinetic anomalies].

    PubMed

    Demangeat, J L; Wolff, F

    1985-12-01

    One hundred and eighty-four patients suspected of having coronary artery disease underwent coronary and left ventricular angiography and Tl 201 myocardial scintigraphy with dipyridamole, including redistribution images after 3-4 hours. The results of scintigraphy were assessed visually in all cases and by quantitative analysis in 91 patients. Comparison of early (DIP) and late (REDIS) images showed three types of response: 1) no hypofixation on either (10 patients); 2) a constant defect (59 patients); 3) a reversible defect (115 patients, including 21 cases of "paradoxical" redistribution). The value of the redistribution images was assessed for the diagnosis of coronary stenosis and for the evaluation of ventricular wall function in post-stenotic zones. The following results were obtained. Visual analysis of the DIP scintigraphy alone gave 17 false positive and 8 false negative results (sens: 95%, spec: 41%); the false negative results were all observed in patients at high risk. The DIP/REDIS scintigraphy (considered normal if both images were normal) gave 20 false positive but only 1 false negative result (sens: 99%, spec: 32%). In addition, the negative predictive value increased from 60 to 90%. The considerable reduction in the number of false negative results was due to the detection of "paradoxical" redistribution; this finding indicates that late films must be taken systematically even if the early scintigraphy is normal. Quantitative analysis of DIP scintigraphy was less sensitive and more specific than visual analysis (sens: 82.7%, spec: 68.7%; NPV: 46%). The same was observed when the redistribution films were processed (DIP/REDIS): significantly increased sensitivity and negative predictive value at the cost of a lower specificity (sens: 96%, spec: 41%; NPV: 70%). No significant differences were observed between the type of scintigraphic defect (constant or reversible) and the probability of coronary stenosis (positive predictive value 93 and 86% respectively
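The sensitivity, specificity and negative-predictive-value figures quoted above come from standard confusion-matrix arithmetic, which can be sketched as follows (the function name is illustrative):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and negative predictive value from
    confusion-matrix counts (true/false positives and negatives)."""
    sens = tp / (tp + fn)   # fraction of diseased correctly detected
    spec = tn / (tn + fp)   # fraction of healthy correctly cleared
    npv = tn / (tn + fn)    # probability a negative test is truly negative
    return sens, spec, npv
```

For example, the abstract's 8 false negatives against a high-prevalence cohort explain why adding redistribution images, which cut false negatives to 1, raised both sensitivity and negative predictive value.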

  6. Sequential document visualization.

    PubMed

    Mao, Yi; Dillon, Joshua; Lebanon, Guy

    2007-01-01

    Documents and other categorical-valued time series are often characterized by the frequencies of short-range sequential patterns such as n-grams. This representation converts sequential data of varying lengths to high-dimensional histogram vectors which are easily modeled by standard statistical models. Unfortunately, the histogram representation ignores most of the medium- and long-range sequential dependencies, making it unsuitable for visualizing sequential data. We present a novel framework for sequential visualization of discrete categorical time series based on the idea of local statistical modeling. The framework embeds categorical time series as smooth curves in the multinomial simplex, summarizing the progression of sequential trends. We discuss several visualization techniques based on this framework and demonstrate their usefulness for document visualization.
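The n-gram histogram representation described above maps a variable-length sequence to counts over a fixed vocabulary of short patterns. A minimal sketch (the function name is an assumption):

```python
from collections import Counter
from typing import List

def ngram_histogram(tokens: List[str], n: int = 2) -> Counter:
    """Frequencies of length-n sequential patterns (n-grams): the
    fixed-dimensional histogram representation of a sequence."""
    return Counter(tuple(tokens[i:i + n])
                   for i in range(len(tokens) - n + 1))
```

Two documents of different lengths yield vectors over the same n-gram space, which is what makes the representation convenient for standard models, while discarding any dependency longer than n tokens.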

  7. Event-Driven Collaboration through Publish/Subscribe Messaging Services for Near-Real-Time Environmental Sensor Anomaly Detection and Management

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Downey, S.; Minsker, B.; Myers, J. D.; Wentling, T.; Marini, L.

    2006-12-01

    One of the challenges in designing cyberinfrastructure for national environmental observatories is how to provide an integrated cyberenvironment which not only provides a standardized pipeline for streaming data from sensors into the observatory for archiving and distribution, but also makes raw data and identified events available in real time for use in individual and group research efforts. This aspect of observatories is critical for promoting efficient collaboration and innovation among scientists and engineers and enabling observatories to serve as a focus that directly supports the broad community. The National Center for Supercomputing Applications' Environmental Cyberinfrastructure Demo (ECID) project has adopted an event-driven architecture and developed a CyberCollaboratory to facilitate event-driven, near-real-time collaboration and management of sensors and workflows for bringing data from environmental observatories into local research contexts. The CyberCollaboratory's event broker uses a publish-subscribe service powered by JMS (Java Messaging Service), with semantics-enhanced messages using RDF (Resource Description Framework) triples to allow exchange of contextual information about an event between event generators and event consumers. Non-scheduled, event-driven collaboration effectively reduces the barrier to collaboration for scientists and engineers and promotes much faster turn-around times for critical environmental events. This is especially useful for real-time adaptive monitoring and modeling of sensor data in environmental observatories. In this presentation, we illustrate our system using a sensor anomaly detection event as an example, where near-real-time data streams from field sensors in Corpus Christi Bay, Texas, trigger monitoring/anomaly alerts in the CyberCollaboratory's CyberDashboard and collaborative activities in the CyberCollaboratory. The CyberDashboard is a Java application where users can monitor various events

  8. Path scanning for the detection of anomalous subgraphs and use of DNS requests and host agents for anomaly/change detection and network situational awareness

    DOEpatents

    Neil, Joshua Charles; Fisk, Michael Edward; Brugh, Alexander William; Hash, Jr., Curtis Lee; Storlie, Curtis Byron; Uphoff, Benjamin; Kent, Alexander

    2017-01-31

    A system, apparatus, computer-readable medium, and computer-implemented method are provided for detecting anomalous behavior in a network. Historical parameters of the network are determined in order to determine normal activity levels. A plurality of paths in the network are enumerated as part of a graph representing the network, where each computing system in the network may be a node in the graph and the sequence of connections between two computing systems may be a directed edge in the graph. A statistical model is applied to the plurality of paths in the graph on a sliding window basis to detect anomalous behavior. Data collected by a Unified Host Collection Agent ("UHCA") may also be used to detect anomalous behavior.
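The sliding-window statistical scoring described in the abstract can be illustrated with a much-simplified stand-in: counting events per edge within a fixed time window and flagging counts that are improbable under a per-edge historical Poisson rate. This is an illustrative sketch under those assumptions, not the patented method itself; all names and the alpha threshold are hypothetical:

```python
import math
from collections import Counter, deque

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam): an upper-tail surprise score."""
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i)
                     for i in range(k))

def anomalous_edges(events, rates, window=5, alpha=0.01):
    """Slide a fixed-size time window over a stream of (time, edge)
    events, ordered by time, and flag edges whose in-window count is
    improbably high under the edge's historical Poisson rate."""
    win = deque()
    flagged = set()
    for t, edge in events:
        win.append((t, edge))
        while win and win[0][0] <= t - window:   # evict stale events
            win.popleft()
        counts = Counter(e for _, e in win)
        for e, c in counts.items():
            if poisson_sf(c, rates.get(e, 1.0)) < alpha:
                flagged.add(e)
    return flagged
```

A path-based detector composes such per-edge statistics along enumerated paths through the graph, so that a chain of individually mild deviations can still score as anomalous.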

  9. On Disturbed Sequential Hypothesis Testing

    DTIC Science & Technology

    1991-06-01

    procedures over the classical single sensor schemes. SUBJECT TERMS: Sensor Fusion, Signal Processing, Detection, Communication...matrix. Proakis [28] described the random walk formulation approach and derived the exact distribution of the test length T and the ASN for quantized...Trans. Inform. Theory vol. IT-26, No. 2, pp. 255-259, March 1980. [28] J. G. Proakis, "Exact distribution functions of test length for sequential

  10. Lymphatic Anomalies Registry

    ClinicalTrials.gov

    2016-07-26

    Lymphatic Malformation; Generalized Lymphatic Anomaly (GLA); Central Conducting Lymphatic Anomaly; CLOVES Syndrome; Gorham-Stout Disease ("Disappearing Bone Disease"); Blue Rubber Bleb Nevus Syndrome; Kaposiform Lymphangiomatosis; Kaposiform Hemangioendothelioma/Tufted Angioma; Klippel-Trenaunay Syndrome; Lymphangiomatosis

  11. Ebstein anomaly: a review.

    PubMed

    Galea, Joseph; Ellul, Sarah; Schembri, Aaron; Schembri-Wismayer, Pierre; Calleja-Agius, Jean

    2014-01-01

    Congenital cardiac abnormalities are a leading cause of neonatal mortality, occurring in up to 1 in 200 live births. Ebstein anomaly, also known as Kassamali anomaly, accounts for 1 percent of all congenital cardiac anomalies. This congenital abnormality involves malformation of the tricuspid valve and of the right ventricle. In this review, the causes of the anomaly are outlined and the pathophysiology is discussed, with a focus on the symptoms, management, and treatments available to date.

  12. Spacecraft Environmental Anomalies Handbook

    DTIC Science & Technology

    1989-08-01

    engineering solutions for mitigating the effects of environmental anomalies have been developed. Among the causes of spacecraft anomalies are surface...have been discovered after years of investigation, and engineering solutions for mitigating the effects of environmental anomalies have been developed...6.4.3 Fault Tolerant Solutions...6.4.4 Methods

  13. South Atlantic Anomaly

    Atmospheric Science Data Center

    2013-04-19

    The South Atlantic Anomaly (SAA). Even before the cover opened, the Multi-angle Imaging ... Location: Atlantic Ocean.

  14. Partition algebraic design of asynchronous sequential circuits

    NASA Technical Reports Server (NTRS)

    Maki, Gary K.; Chen, Kristen Q.; Gopalakrishnan, Suresh K.

    1993-01-01

    Tracey's Theorem has long been recognized as essential in generating state assignments for asynchronous sequential circuits. This paper shows that partitioning variables derived from Tracey's Theorem also have a significant impact in generating the design equations, and that the theorem is important to a fundamental understanding of asynchronous sequential operation. The results of this work simplify asynchronous logic design and make the detection of safe circuits easier.

  15. INVESTIGATION OF ARSENIC SPECIATION ON DRINKING WATER TREATMENT MEDIA UTILIZING AUTOMATED SEQUENTIAL CONTINUOUS FLOW EXTRACTION WITH IC-ICP-MS DETECTION

    EPA Science Inventory

    Three treatment media, used for the removal of arsenic from drinking water, were sequentially extracted using 10mM MgCl2 (pH 8), 10mM NaH2PO4 (pH 7) followed by 10mM (NH4)2C2O4 (pH 3). The media were extracted using an on-line automated continuous extraction system which allowed...

  16. Analysis of spacecraft anomalies

    NASA Technical Reports Server (NTRS)

    Bloomquist, C. E.; Graham, W. C.

    1976-01-01

    The anomalies from 316 spacecraft covering the entire U.S. space program were analyzed to determine if there were any experimental or technological programs which could be implemented to remove the anomalies from future space activity. Thirty specific categories of anomalies were found to cover nearly 85 percent of all observed anomalies. Thirteen experiments were defined to deal with 17 of these categories; nine additional experiments were identified to deal with other classes of observed and anticipated anomalies. Preliminary analyses indicate that all 22 experimental programs are both technically feasible and economically viable.

  17. Adaptive sequential controller

    DOEpatents

    El-Sharkawi, Mohamed A.; Xing, Jian; Butler, Nicholas G.; Rodriguez, Alonso

    1994-01-01

    An adaptive sequential controller (50/50') for controlling a circuit breaker (52) or other switching device to substantially eliminate transients on a distribution line caused by closing and opening the circuit breaker. The device adaptively compensates for changes in the response time of the circuit breaker due to aging and environmental effects. A potential transformer (70) provides a reference signal corresponding to the zero crossing of the voltage waveform, and a phase shift comparator circuit (96) compares the reference signal to the time at which any transient was produced when the circuit breaker closed, producing a signal indicative of the adaptive adjustment that should be made. Similarly, in controlling the opening of the circuit breaker, a current transformer (88) provides a reference signal that is compared against the time at which any transient is detected when the circuit breaker last opened. An adaptive adjustment circuit (102) produces a compensation time that is appropriately modified to account for changes in the circuit breaker response, including the effect of ambient conditions and aging. When next opened or closed, the circuit breaker is activated at an appropriately compensated time, so that it closes when the voltage crosses zero and opens when the current crosses zero, minimizing any transients on the distribution line. Phase angle can be used to control the opening of the circuit breaker relative to the reference signal provided by the potential transformer.
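The adaptive adjustment the patent abstract describes, where each observed timing error between the zero crossing and the actual switching instant feeds back into the compensation time, can be illustrated with a toy convergence loop. The function name, the gain value, and the simple proportional update are hypothetical stand-ins for the patent's adaptive adjustment circuit:

```python
def adapt_compensation(true_delay_ms, cycles=20, gain=0.5):
    """Toy model of the adaptive loop: the breaker has an unknown
    response delay; each operation, the measured offset between the
    commanded and actual switching instant nudges the compensation
    time, which converges toward the true delay."""
    comp = 0.0
    for _ in range(cycles):
        offset = true_delay_ms - comp   # observed transient timing error
        comp += gain * offset           # adaptive adjustment
    return comp
```

Because the update is proportional to the remaining error, the compensation tracks slow drift in the breaker's response (aging, ambient conditions) while smoothing out one-off measurement noise.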

  18. Identification of mineral resources in Afghanistan-Detecting and mapping resource anomalies in prioritized areas using geophysical and remote sensing (ASTER and HyMap) data

    USGS Publications Warehouse

    King, Trude V. V.; Johnson, Michaela R.; Hubbard, Bernard E.; Drenth, Benjamin J.

    2011-01-01

    During the independent analysis of the geophysical, ASTER, and imaging spectrometer (HyMap) data by USGS scientists, previously unrecognized targets of potential mineralization were identified using evaluation criteria most suitable to the individual dataset. These anomalous zones offer targets of opportunity that warrant additional field verification. This report describes the standards used to define the anomalies, summarizes the results of the evaluations for each type of data, and discusses the importance and implications of regions of anomaly overlap between two or three of the datasets.

  19. Taussig-Bing Anomaly

    PubMed Central

    Konstantinov, Igor E.

    2009-01-01

    Taussig-Bing anomaly is a rare congenital heart malformation that was first described in 1949 by Helen B. Taussig (1898–1986) and Richard J. Bing (1909–). Although substantial improvement has since been achieved in surgical results of the repair of the anomaly, management of the Taussig-Bing anomaly remains challenging. A history of the original description of the anomaly, the life stories of the individuals who first described it, and the current outcomes of its surgical management are reviewed herein. PMID:20069085

  20. Sequential ranging: How it works

    NASA Technical Reports Server (NTRS)

    Baugh, Harold W.

    1993-01-01

    This publication is directed to the users of data from the Sequential Ranging Assembly (SRA), and to others who have a general interest in range measurements. It covers the hardware, the software, and the processes used in acquiring range data; it does not cover analytical aspects such as the theory of modulation, detection, noise spectral density, and other highly technical subjects. In other words, it covers how ranging is done, but not the details of why it works. The publication also includes an appendix that gives a brief discussion of PN ranging, a capability now under development.

  1. Glassy carbon electrodes sequentially modified by cysteamine-capped gold nanoparticles and poly(amidoamine) dendrimers generation 4.5 for detecting uric acid in human serum without ascorbic acid interference.

    PubMed

    Ramírez-Segovia, A S; Banda-Alemán, J A; Gutiérrez-Granados, S; Rodríguez, A; Rodríguez, F J; Godínez, Luis A; Bustos, E; Manríquez, J

    2014-02-17

Glassy carbon electrodes (GCE) were sequentially modified by cysteamine-capped gold nanoparticles (AuNp@cysteamine) and PAMAM dendrimers generation 4.5 bearing 128 peripheral –COOH groups (GCE/AuNp@cysteamine/PAMAM), in order to explore their capabilities as electrochemical detectors of uric acid (UA) in human serum samples at pH 2. The results showed that concentrations of UA detected by cyclic voltammetry with GCE/AuNp@cysteamine/PAMAM were comparable (deviation <±10%; limits of detection (LOD) and quantification (LOQ) were 1.7×10⁻⁴ and 5.8×10⁻⁴ mg dL⁻¹, respectively) to those obtained using the uricase-based enzymatic-colorimetric method. It was also observed that the presence of dendrimers in the GCE/AuNp@cysteamine/PAMAM system minimizes ascorbic acid (AA) interference during UA oxidation, thus improving the electrocatalytic activity of the gold nanoparticles.
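    The reported LOD and LOQ invite a note on how such figures are commonly derived. The ICH-style formulas below are an assumption for illustration; the abstract does not state the authors' method, and the input numbers are made up.

```python
# Common (ICH-style) estimate of detection/quantification limits from a
# calibration line: LOD = 3.3*s/m and LOQ = 10*s/m, where s is the standard
# deviation of the blank response and m the calibration slope. The inputs
# below are illustrative, not the paper's data.

def lod_loq(blank_sd, slope):
    """Return (LOD, LOQ) in the concentration units of the calibration."""
    return 3.3 * blank_sd / slope, 10.0 * blank_sd / slope

lod, loq = lod_loq(blank_sd=0.05, slope=1000.0)  # hypothetical values
```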

  2. Multiple and sequential data acquisition method: an improved method for fragmentation and detection of cross-linked peptides on a hybrid linear trap quadrupole Orbitrap Velos mass spectrometer.

    PubMed

    Rudashevskaya, Elena L; Breitwieser, Florian P; Huber, Marie L; Colinge, Jacques; Müller, André C; Bennett, Keiryn L

    2013-02-05

The identification and validation of cross-linked peptides by mass spectrometry remains a daunting challenge for protein-protein cross-linking approaches when investigating protein interactions. This includes the fragmentation of cross-linked peptides in the mass spectrometer per se and, following database searching, the matching of the molecular masses of the fragment ions to the correct cross-linked peptides. The hybrid linear trap quadrupole (LTQ) Orbitrap Velos combines the speed of the tandem mass spectrometry (MS/MS) duty cycle with high mass accuracy, and these features were utilized in the current study to substantially improve the confidence in the identification of cross-linked peptides. An MS/MS method termed multiple and sequential data acquisition method (MSDAM) was developed. Preliminary optimization of the MS/MS settings was performed with a synthetic peptide (TP1) cross-linked with bis[sulfosuccinimidyl] suberate (BS3). On the basis of these results, MSDAM was created and assessed on the BS3-cross-linked bovine serum albumin (BSA) homodimer. MSDAM applies a series of multiple sequential fragmentation events with a range of different normalized collision energies (NCE) to the same precursor ion. The combination of a series of NCE enabled a considerable improvement in the quality of the fragmentation spectra for cross-linked peptides, and ultimately aided in the identification of the sequences of the cross-linked peptides. Concurrently, MSDAM provides confirmatory evidence from the formation of reporter ion fragments, which reduces the false positive rate of incorrectly assigned cross-linked peptides.
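    The core of MSDAM, multiple sequential fragmentation events on the same precursor over a range of NCE values, can be sketched as a schedule builder. The function name and NCE ladder here are assumptions for illustration, not the published instrument settings.

```python
# Hypothetical sketch of an MSDAM-style acquisition list: each precursor m/z
# is fragmented several times in sequence, once per normalized collision
# energy (NCE). The NCE values below are illustrative only.

def msdam_schedule(precursors_mz, nce_values=(20, 25, 30, 35, 40)):
    """One fragmentation event per (precursor, NCE) pair, in order."""
    return [(mz, nce) for mz in precursors_mz for nce in nce_values]

schedule = msdam_schedule([500.2, 600.3])
```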

  3. Detection of TEM-induced reciprocal translocations in F₁ sons of CD-1 male mice: comparison of sequential fertility evaluation and cytogenetic analysis

    SciTech Connect

    Morris, S.M.; Kodell, R.L.; Domon, O.E.; Bishop, J.B.

    1988-01-01

To determine the positive and negative classification error rates associated with the HTA in our laboratory, F₁ sons of TEM-exposed CD-1 male mice were evaluated by the sequential fertility method with subsequent cytogenetic analysis. Males who sired three litters of size 10 or less when mated to primiparous females from either the B6C3F₁ or the BCF₁ strain were classified as partial steriles. When meiotic chromosome analyses revealed the presence of at least two cells containing multivalent figures, males were classified as translocation heterozygotes. When the fertility evaluation and the cytogenetic analysis were compared, normal fertility was observed in 5 of 83 (6.02%) translocation-bearing F₁ males mated to B6C3F₁ tester females and in 3 of 83 (3.61%) F₁ males mated to BCF₁ tester females. Thus, the false-negative error rates were 6.02% and 3.61% with these two tester strains. Multivalent figures were not observed in the meiotic chromosomes of 410 F₁ males. The false-positive error rates with these two tester strains were 2.93% for the B6C3F₁ strain and 1.71% for the BCF₁ strain. The results indicate that nonzero error rates, both false-positive and false-negative, are associated with the sequential mating method HTA. In addition, the magnitude of these error rates was influenced not only by the tester female strain but also by the genotype of the F₁ male.

  4. Competing Orders and Anomalies

    PubMed Central

    Moon, Eun-Gook

    2016-01-01

A conservation law is one of the most fundamental properties in nature, but a certain class of conservation “laws” could be spoiled by intrinsic quantum mechanical effects, so-called quantum anomalies. Profound properties of the anomalies have deepened our understanding of quantum many-body systems. Here, we investigate quantum anomaly effects in quantum phase transitions between competing orders and striking consequences of their presence. We explicitly calculate the topological nature of anomalies of non-linear sigma models (NLSMs) with the Wess-Zumino-Witten (WZW) terms. The non-perturbative nature is directly related to the ’t Hooft anomaly matching condition: anomalies are conserved in renormalization group flow. By applying the matching condition, we show massless excitations are enforced by the anomalies in a whole phase diagram, in sharp contrast to the case of the Landau-Ginzburg-Wilson theory, which only has massive excitations in symmetric phases. Furthermore, we find non-perturbative criteria to characterize quantum phase transitions between competing orders. For example, in 4D, we show the two competing order parameter theories, CP(1) and the NLSM with WZW, describe different universality classes. Physical realizations and experimental implications of the anomalies are also discussed. PMID:27499184

  5. Competing Orders and Anomalies

    NASA Astrophysics Data System (ADS)

    Moon, Eun-Gook

    2016-08-01

A conservation law is one of the most fundamental properties in nature, but a certain class of conservation “laws” could be spoiled by intrinsic quantum mechanical effects, so-called quantum anomalies. Profound properties of the anomalies have deepened our understanding of quantum many-body systems. Here, we investigate quantum anomaly effects in quantum phase transitions between competing orders and striking consequences of their presence. We explicitly calculate the topological nature of anomalies of non-linear sigma models (NLSMs) with the Wess-Zumino-Witten (WZW) terms. The non-perturbative nature is directly related to the ’t Hooft anomaly matching condition: anomalies are conserved in renormalization group flow. By applying the matching condition, we show massless excitations are enforced by the anomalies in a whole phase diagram, in sharp contrast to the case of the Landau-Ginzburg-Wilson theory, which only has massive excitations in symmetric phases. Furthermore, we find non-perturbative criteria to characterize quantum phase transitions between competing orders. For example, in 4D, we show the two competing order parameter theories, CP(1) and the NLSM with WZW, describe different universality classes. Physical realizations and experimental implications of the anomalies are also discussed.

  6. Competing Orders and Anomalies.

    PubMed

    Moon, Eun-Gook

    2016-08-08

A conservation law is one of the most fundamental properties in nature, but a certain class of conservation "laws" could be spoiled by intrinsic quantum mechanical effects, so-called quantum anomalies. Profound properties of the anomalies have deepened our understanding of quantum many-body systems. Here, we investigate quantum anomaly effects in quantum phase transitions between competing orders and striking consequences of their presence. We explicitly calculate the topological nature of anomalies of non-linear sigma models (NLSMs) with the Wess-Zumino-Witten (WZW) terms. The non-perturbative nature is directly related to the 't Hooft anomaly matching condition: anomalies are conserved in renormalization group flow. By applying the matching condition, we show massless excitations are enforced by the anomalies in a whole phase diagram, in sharp contrast to the case of the Landau-Ginzburg-Wilson theory, which only has massive excitations in symmetric phases. Furthermore, we find non-perturbative criteria to characterize quantum phase transitions between competing orders. For example, in 4D, we show the two competing order parameter theories, CP(1) and the NLSM with WZW, describe different universality classes. Physical realizations and experimental implications of the anomalies are also discussed.

  7. Sequential inductive learning

    SciTech Connect

    Gratch, J.

    1996-12-31

This article advocates a new model for inductive learning. Called sequential induction, it helps bridge classical fixed-sample learning techniques (which are efficient but difficult to formally characterize) and worst-case approaches (which provide strong statistical guarantees but are too inefficient for practical use). Learning proceeds as a sequence of decisions which are informed by training data. By analyzing induction at the level of these decisions, and by utilizing only enough data to make each decision, sequential induction provides statistical guarantees but with substantially less data than worst-case methods require. The sequential inductive model is also useful as a method for determining a sufficient sample size for inductive learning and, as such, is relevant to learning problems where the preponderance of data or the cost of gathering data precludes the use of traditional methods.
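    A worked example of the data-efficiency argument: a distribution-free (Hoeffding) bound gives the worst-case sample size for a single decision, which a sequential scheme undercuts by stopping as soon as the bound certifies the decision. The bound itself is standard; its use here is an illustration, not Gratch's algorithm.

```python
import math

# Hoeffding bound (standard result, used here only to illustrate worst-case
# sample sizes): n observations of a [0, 1]-bounded variable guarantee
# P(|empirical mean - true mean| > epsilon) <= delta.

def samples_needed(epsilon, delta):
    """Smallest n satisfying the Hoeffding guarantee."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))

n = samples_needed(0.1, 0.05)  # one decision at accuracy 0.1, confidence 95%
```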

  8. Sequential elution process

    DOEpatents

    Kingsley, I.S.

    1987-01-06

    A process and apparatus are disclosed for the separation of complex mixtures of carbonaceous material by sequential elution with successively stronger solvents. In the process, a column containing glass beads is maintained in a fluidized state by a rapidly flowing stream of a weak solvent, and the sample is injected into this flowing stream such that a portion of the sample is dissolved therein and the remainder of the sample is precipitated therein and collected as a uniform deposit on the glass beads. Successively stronger solvents are then passed through the column to sequentially elute less soluble materials. 1 fig.

  9. Sequential Path Entanglement for Quantum Metrology

    PubMed Central

    Jin, Xian-Min; Peng, Cheng-Zhi; Deng, Youjin; Barbieri, Marco; Nunn, Joshua; Walmsley, Ian A.

    2013-01-01

Path entanglement is a key resource for quantum metrology. Using path-entangled states, the standard quantum limit can be beaten, and the Heisenberg limit can be achieved. However, the preparation and detection of such states scale unfavourably with the number of photons. Here we introduce sequential path entanglement, in which photons are distributed across distinct time bins with arbitrary separation, as a resource for quantum metrology. We demonstrate a scheme for converting polarization Greenberger-Horne-Zeilinger entanglement into sequential path entanglement. We observe the same enhanced phase resolution expected for conventional path entanglement, independent of the delay between consecutive photons. Sequential path entanglement can be prepared from polarization entanglement with comparative ease, can be detected without photon-number-resolving detectors, and enables novel applications.

  10. Sequential Dependencies in Driving

    ERIC Educational Resources Information Center

    Doshi, Anup; Tran, Cuong; Wilder, Matthew H.; Mozer, Michael C.; Trivedi, Mohan M.

    2012-01-01

    The effect of recent experience on current behavior has been studied extensively in simple laboratory tasks. We explore the nature of sequential effects in the more naturalistic setting of automobile driving. Driving is a safety-critical task in which delayed response times may have severe consequences. Using a realistic driving simulator, we find…

  11. Sequential Reliability Tests.

    ERIC Educational Resources Information Center

    Eiting, Mindert H.

    1991-01-01

    A method is proposed for sequential evaluation of reliability of psychometric instruments. Sample size is unfixed; a test statistic is computed after each person is sampled and a decision is made in each stage of the sampling process. Results from a series of Monte-Carlo experiments establish the method's efficiency. (SLD)
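    The stage-by-stage decision scheme can be illustrated with a Wald sequential probability ratio test; the Bernoulli hypotheses below are an illustrative stand-in, not Eiting's reliability statistic.

```python
import math

# Wald-style SPRT on a Bernoulli parameter: after each observation the
# log-likelihood ratio is updated and compared with two thresholds; sampling
# stops as soon as either hypothesis can be accepted. An illustrative
# stand-in for the article's sequential reliability test.

def sprt_bernoulli(samples, p0=0.5, p1=0.8, alpha=0.05, beta=0.05):
    """Return ('accept_H0' | 'accept_H1' | 'continue', observations used)."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept_H1", n
        if llr <= lower:
            return "accept_H0", n
    return "continue", len(samples)

decision_hi = sprt_bernoulli([1] * 10)  # strong evidence for p1
decision_lo = sprt_bernoulli([0] * 10)  # strong evidence for p0
```

    The sample size is unfixed, exactly as in the abstract: the loop terminates after however many observations the evidence requires.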

  12. Sequential memory: Binding dynamics

    NASA Astrophysics Data System (ADS)

    Afraimovich, Valentin; Gong, Xue; Rabinovich, Mikhail

    2015-10-01

Temporal order memories are critical for everyday animal and human functioning. Experiments and our own experience show that the binding or association of various features of an event together and the maintaining of multimodality events in sequential order are the key components of any sequential memories—episodic, semantic, working, etc. We study the robustness of binding sequential dynamics based on our previously introduced model in the form of generalized Lotka-Volterra equations. In the phase space of the model, there exists a multi-dimensional binding heteroclinic network consisting of saddle equilibrium points and heteroclinic trajectories joining them. We prove here the robustness of the binding sequential dynamics, i.e., the feasibility phenomenon for coupled heteroclinic networks: for each collection of successive heteroclinic trajectories inside the unified networks, there is an open set of initial points such that the trajectory going through each of them follows the prescribed collection staying in a small neighborhood of it. We show also that the symbolic complexity function of the system restricted to this neighborhood is a polynomial of degree L - 1, where L is the number of modalities.

  13. Vascular anomalies in children.

    PubMed

    Weibel, L

    2011-11-01

Vascular anomalies are divided into two major categories: tumours (such as infantile hemangiomas) and malformations. Hemangiomas are common benign neoplasms that undergo a proliferative phase followed by stabilization and eventual spontaneous involution, whereas vascular malformations are rare structural anomalies representing morphogenetic errors of developing blood vessels and lymphatics. It is important to properly diagnose vascular anomalies early in childhood because of their distinct differences in morbidity, prognosis, and need for multidisciplinary management. We discuss a number of characteristic clinical features as clues for early diagnosis and identification of associated syndromes.

  14. Congenital basis of posterior fossa anomalies

    PubMed Central

    Cotes, Claudia; Bonfante, Eliana; Lazor, Jillian; Jadhav, Siddharth; Caldas, Maria; Swischuk, Leonard

    2015-01-01

    The classification of posterior fossa congenital anomalies has been a controversial topic. Advances in genetics and imaging have allowed a better understanding of the embryologic development of these abnormalities. A new classification schema correlates the embryologic, morphologic, and genetic bases of these anomalies in order to better distinguish and describe them. Although they provide a better understanding of the clinical aspects and genetics of these disorders, it is crucial for the radiologist to be able to diagnose the congenital posterior fossa anomalies based on their morphology, since neuroimaging is usually the initial step when these disorders are suspected. We divide the most common posterior fossa congenital anomalies into two groups: 1) hindbrain malformations, including diseases with cerebellar or vermian agenesis, aplasia or hypoplasia and cystic posterior fossa anomalies; and 2) cranial vault malformations. In addition, we will review the embryologic development of the posterior fossa and, from the perspective of embryonic development, will describe the imaging appearance of congenital posterior fossa anomalies. Knowledge of the developmental bases of these malformations facilitates detection of the morphological changes identified on imaging, allowing accurate differentiation and diagnosis of congenital posterior fossa anomalies. PMID:26246090

  15. Detecting ecosystem performance anomalies for land management in the upper colorado river basin using satellite observations, climate data, and ecosystem models

    USGS Publications Warehouse

    Gu, Y.; Wylie, B.K.

    2010-01-01

This study identifies areas with ecosystem performance anomalies (EPA) within the Upper Colorado River Basin (UCRB) during 2005-2007 using satellite observations, climate data, and ecosystem models. The final EPA maps with 250-m spatial resolution were categorized as normal performance, underperformance, and overperformance (observed performance relative to weather-based predictions) at the 90% level of confidence. The EPA maps were validated using "percentage of bare soil" ground observations. The validation results at locations with comparable site potential showed that regions identified as persistently underperforming (overperforming) tended to have a higher (lower) percentage of bare soil, suggesting that our preliminary EPA maps are reliable and agree with ground-based observations. The 3-year (2005-2007) persistent EPA map from this study provides the first quantitative evaluation of ecosystem performance anomalies within the UCRB and will help the Bureau of Land Management (BLM) identify potentially degraded lands. Results from this study can be used as a prototype by BLM and other land managers for making optimal land management decisions. © 2010 by the authors.
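    The three-way labelling at the 90% confidence level can be sketched per pixel as a z-test on the residual between observed and weather-predicted performance. The helper names, the 1.645 cutoff, and the way the prediction error enters are assumptions for the sketch; the abstract does not give the authors' exact procedure.

```python
# Hedged per-pixel sketch: compare observed performance with its weather-based
# prediction and label the residual at 90% confidence. All names and numbers
# here are illustrative assumptions.

Z90 = 1.645  # two-sided 90% cutoff for a standard normal

def classify_pixel(observed, predicted, stderr, z=Z90):
    """Label one pixel: under-, over-, or normal performance."""
    score = (observed - predicted) / stderr
    if score < -z:
        return "underperforming"
    if score > z:
        return "overperforming"
    return "normal"

labels = [classify_pixel(o, p, s) for o, p, s in
          [(0.30, 0.50, 0.05),   # well below prediction
           (0.52, 0.50, 0.05),   # within noise
           (0.70, 0.50, 0.05)]]  # well above prediction
```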

  16. Detecting Ecosystem Performance Anomalies for Land Management in the Upper Colorado River Basin Using Satellite Observations, Climate Data, and Ecosystem Models

    USGS Publications Warehouse

    Gu, Yingxin; Wylie, Bruce K.

    2010-01-01

    This study identifies areas with ecosystem performance anomalies (EPA) within the Upper Colorado River Basin (UCRB) during 2005–2007 using satellite observations, climate data, and ecosystem models. The final EPA maps with 250-m spatial resolution were categorized as normal performance, underperformance, and overperformance (observed performance relative to weather-based predictions) at the 90% level of confidence. The EPA maps were validated using “percentage of bare soil” ground observations. The validation results at locations with comparable site potential showed that regions identified as persistently underperforming (overperforming) tended to have a higher (lower) percentage of bare soil, suggesting that our preliminary EPA maps are reliable and agree with ground-based observations. The 3-year (2005–2007) persistent EPA map from this study provides the first quantitative evaluation of ecosystem performance anomalies within the UCRB and will help the Bureau of Land Management (BLM) identify potentially degraded lands. Results from this study can be used as a prototype by BLM and other land managers for making optimal land management decisions.

  17. Test pattern generation for ILA sequential circuits

    NASA Technical Reports Server (NTRS)

    Feng, YU; Frenzel, James F.; Maki, Gary K.

    1993-01-01

    An efficient method of generating test patterns for sequential machines implemented using one-dimensional, unilateral, iterative logic arrays (ILA's) of BTS pass transistor networks is presented. Based on a transistor level fault model, the method affords a unique opportunity for real-time fault detection with improved fault coverage. The resulting test sets are shown to be equivalent to those obtained using conventional gate level models, thus eliminating the need for additional test patterns. The proposed method advances the simplicity and ease of the test pattern generation for a special class of sequential circuitry.

  18. Dual diaphragmatic anomalies

    PubMed Central

    Padmanabhan, Arjun; Thomas, Abin Varghese

    2016-01-01

Although diaphragmatic anomalies such as eventration and hiatus hernia are commonly encountered on incidental chest X-ray imaging, the presence of concomitant multiple anomalies is extremely rare. This is all the more true in adults. Herein, we present the case of a 75-year-old female who, while undergoing routine chest X-ray imaging, was found to have eventration of the right hemidiaphragm along with a hiatus hernia. PMID:27625457

  19. Sequential measurements of conjugate observables

    NASA Astrophysics Data System (ADS)

    Carmeli, Claudio; Heinosaari, Teiko; Toigo, Alessandro

    2011-07-01

    We present a unified treatment of sequential measurements of two conjugate observables. Our approach is to derive a mathematical structure theorem for all the relevant covariant instruments. As a consequence of this result, we show that every Weyl-Heisenberg covariant observable can be implemented as a sequential measurement of two conjugate observables. This method is applicable both in finite- and infinite-dimensional Hilbert spaces, therefore covering sequential spin component measurements as well as position-momentum sequential measurements.

  20. Sequential cloning of chromosomes

    SciTech Connect

    Lacks, S.A.

    1991-12-31

A method for sequential cloning of chromosomal DNA and chromosomal DNA cloned by this method are disclosed. The method includes the selection of a target organism having a segment of chromosomal DNA to be sequentially cloned. A first DNA segment, having a first restriction enzyme site on either side and homologous to the chromosomal DNA to be sequentially cloned, is isolated. A first vector product is formed by ligating the homologous segment into a suitably designed vector. The first vector product is circularly integrated into the target organism's chromosomal DNA. The resulting integrated chromosomal DNA segment includes the homologous DNA segment at either end of the integrated vector segment. The integrated chromosomal DNA is cleaved with a second restriction enzyme and ligated to form a vector-containing plasmid, which is replicated in a host organism. The replicated plasmid is then cleaved with the first restriction enzyme. Next, a DNA segment containing the vector and a segment of DNA homologous to a distal portion of the previously isolated DNA segment is isolated. This segment is then ligated to form a plasmid which is replicated within a suitable host. This plasmid is then circularly integrated into the target chromosomal DNA. The chromosomal DNA containing the circularly integrated vector is treated with a third, retrorestriction enzyme. The cleaved DNA is ligated to give a plasmid that is used to transform a host permissive for replication of its vector. The sequential cloning process continues by repeated cycles of circular integration and excision. The excision is carried out alternately with the second and third enzymes.

  1. Sequential cloning of chromosomes

    DOEpatents

    Lacks, Sanford A.

    1995-07-18

    A method for sequential cloning of chromosomal DNA of a target organism is disclosed. A first DNA segment homologous to the chromosomal DNA to be sequentially cloned is isolated. The first segment has a first restriction enzyme site on either side. A first vector product is formed by ligating the homologous segment into a suitably designed vector. The first vector product is circularly integrated into the target organism's chromosomal DNA. The resulting integrated chromosomal DNA segment includes the homologous DNA segment at either end of the integrated vector segment. The integrated chromosomal DNA is cleaved with a second restriction enzyme and ligated to form a vector-containing plasmid, which is replicated in a host organism. The replicated plasmid is then cleaved with the first restriction enzyme. Next, a DNA segment containing the vector and a segment of DNA homologous to a distal portion of the previously isolated DNA segment is isolated. This segment is then ligated to form a plasmid which is replicated within a suitable host. This plasmid is then circularly integrated into the target chromosomal DNA. The chromosomal DNA containing the circularly integrated vector is treated with a third, retrorestriction (class IIS) enzyme. The cleaved DNA is ligated to give a plasmid that is used to transform a host permissive for replication of its vector. The sequential cloning process continues by repeated cycles of circular integration and excision. The excision is carried out alternately with the second and third enzymes.

  2. Sequential cloning of chromosomes

    DOEpatents

    Lacks, S.A.

    1995-07-18

A method for sequential cloning of chromosomal DNA of a target organism is disclosed. A first DNA segment homologous to the chromosomal DNA to be sequentially cloned is isolated. The first segment has a first restriction enzyme site on either side. A first vector product is formed by ligating the homologous segment into a suitably designed vector. The first vector product is circularly integrated into the target organism's chromosomal DNA. The resulting integrated chromosomal DNA segment includes the homologous DNA segment at either end of the integrated vector segment. The integrated chromosomal DNA is cleaved with a second restriction enzyme and ligated to form a vector-containing plasmid, which is replicated in a host organism. The replicated plasmid is then cleaved with the first restriction enzyme. Next, a DNA segment containing the vector and a segment of DNA homologous to a distal portion of the previously isolated DNA segment is isolated. This segment is then ligated to form a plasmid which is replicated within a suitable host. This plasmid is then circularly integrated into the target chromosomal DNA. The chromosomal DNA containing the circularly integrated vector is treated with a third, retrorestriction (class IIS) enzyme. The cleaved DNA is ligated to give a plasmid that is used to transform a host permissive for replication of its vector. The sequential cloning process continues by repeated cycles of circular integration and excision. The excision is carried out alternately with the second and third enzymes. 9 figs.

  3. Radioactive anomaly discrimination from spectral ratios

    DOEpatents

    Maniscalco, James; Sjoden, Glenn; Chapman, Mac Clements

    2013-08-20

    A method for discriminating a radioactive anomaly from naturally occurring radioactive materials includes detecting a first number of gamma photons having energies in a first range of energy values within a predetermined period of time and detecting a second number of gamma photons having energies in a second range of energy values within the predetermined period of time. The method further includes determining, in a controller, a ratio of the first number of gamma photons having energies in the first range and the second number of gamma photons having energies in the second range, and determining that a radioactive anomaly is present when the ratio exceeds a threshold value.
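    The ratio test the abstract describes reduces to a few lines. The window edges and the threshold below are illustrative assumptions, not values from the patent.

```python
# Count gamma photons in two energy windows over the same interval, form the
# low/high count ratio, and flag an anomaly when it exceeds a threshold.
# Window edges (keV) and the threshold are illustrative assumptions.

def anomaly_present(energies_kev, lo_window=(50, 400), hi_window=(400, 3000),
                    threshold=2.0):
    """True if the low/high window count ratio exceeds the threshold."""
    lo = sum(lo_window[0] <= e < lo_window[1] for e in energies_kev)
    hi = sum(hi_window[0] <= e < hi_window[1] for e in energies_kev)
    if hi == 0:
        return lo > 0  # every count in the low window: treat as anomalous
    return lo / hi > threshold
```

    The point of the ratio is that it rises for shielded or low-energy-skewed sources while staying near a baseline value for naturally occurring radioactive materials.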

  4. Relationships between Rwandan seasonal rainfall anomalies and ENSO events

    NASA Astrophysics Data System (ADS)

    Muhire, I.; Ahmed, F.; Abutaleb, K.

    2015-10-01

This study aims primarily at investigating the relationships between Rwandan seasonal rainfall anomalies and El Niño-Southern Oscillation (ENSO) events. The study is useful for early warning of negative effects associated with extreme rainfall anomalies across the country. It covers the period 1935-1992, using long and short rains data from 28 weather stations in Rwanda and ENSO events sourced from Glantz (2001). The mean standardized anomaly indices were calculated to investigate their associations with ENSO events. One-way analysis of variance was applied on the mean standardized anomaly index values per ENSO event to explore the spatial correlation of rainfall anomalies per ENSO event. A geographical information system was used to present spatially the variations in mean standardized anomaly indices per ENSO event. The results showed approximately three climatic periods, namely, dry period (1935-1960), semi-humid period (1961-1976) and wet period (1977-1992). Though positive and negative correlations were detected between extreme short rains anomalies and El Niño events, La Niña events were mostly linked to negative rainfall anomalies while El Niño events were associated with positive rainfall anomalies. The occurrence of El Niño and La Niña in the same year does not show any clear association with rainfall anomalies. However, the phenomenon was more linked with positive long rains anomalies and negative short rains anomalies. The normal years were largely linked with negative long rains anomalies and positive short rains anomalies, which is a pointer to the influence of other factors other than ENSO events. This makes projection of seasonal rainfall anomalies in the country by merely predicting ENSO events difficult.
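    A standardized anomaly index is conventionally the seasonal total z-scored against the long-term record (then averaged over stations). The sketch below follows that convention as an assumption, since the abstract does not spell out the formula, and uses made-up rainfall totals.

```python
import statistics

# z-score each season's rainfall total against the full record (one station).
# This is the common convention, not necessarily the authors' exact formula.

def standardized_anomalies(seasonal_totals):
    """(x - long-term mean) / long-term standard deviation, per season."""
    mu = statistics.mean(seasonal_totals)
    sigma = statistics.pstdev(seasonal_totals)
    return [(x - mu) / sigma for x in seasonal_totals]

rain_mm = [80.0, 120.0, 100.0, 60.0, 140.0]  # illustrative seasonal totals
indices = standardized_anomalies(rain_mm)
```

    Positive indices mark anomalously wet seasons and negative indices anomalously dry ones, which is the sign convention the abstract's El Niño / La Niña associations rely on.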

  5. Astrometric solar system anomalies

    SciTech Connect

    Nieto, Michael Martin; Anderson, John D

    2009-01-01

There are at least four unexplained anomalies connected with astrometric data. Perhaps the most disturbing is the fact that when a spacecraft on a flyby trajectory approaches the Earth within 2000 km or less, it often experiences a change in total orbital energy per unit mass. Next, a secular change in the astronomical unit AU is definitely a concern. It is increasing by about 15 cm yr⁻¹. The other two anomalies are perhaps less disturbing because of known sources of nongravitational acceleration. The first is an apparent slowing of the two Pioneer spacecraft as they exit the solar system in opposite directions. Some astronomers and physicists are convinced this effect is of concern, but many others are convinced it is produced by a nearly identical thermal emission from both spacecraft, in a direction away from the Sun, thereby producing acceleration toward the Sun. The fourth anomaly is a measured increase in the eccentricity of the Moon's orbit. Here again, an increase is expected from tidal friction in both the Earth and Moon. However, there is a reported unexplained increase that is significant at the three-sigma level. It is prudent to suspect that all four anomalies have mundane explanations, or that one or more anomalies are a result of systematic error. Yet they might eventually be explained by new physics. For example, a slightly modified theory of gravitation is not ruled out, perhaps analogous to Einstein's 1916 explanation for the excess precession of Mercury's perihelion.

  6. Magnetic anomalies. [Magsat studies

    NASA Technical Reports Server (NTRS)

    Harrison, C. G. A.

    1983-01-01

The implications and accuracy of anomaly maps produced using Magsat data on the scalar and vector magnetic field of the earth are discussed. Comparisons have been made between the satellite maps and aeromagnetic survey maps, showing smoother data from the satellite maps and larger anomalies in the aircraft data. The maps are being applied to characterize the structure and tectonics of the underlying regions. Investigations are still needed regarding the directions of magnetization within the crust and to generate further correlations between anomaly features and large scale geological structures. Furthermore, an increased data base is recommended for the Pacific Ocean basin in order to develop a better starting model for Pacific tectonic movements. The Pacific basin was larger farther back in time, and subduction zones surround the basin, which complicates the description of the complex break-up scenario for Gondwanaland.

  7. La detection des cyanobacteries en milieu lacustre par l'etude des anomalies des spectres de reflectance de l'eau

    NASA Astrophysics Data System (ADS)

    Constantin, Gabriel

Proliferation of cyanobacteria is a growing problem in lacustrine environments that results in rapid degradation of water quality. Moreover, certain cyanobacteria species produce harmful toxins. Phycocyanin (PC) is a photosynthetic pigment typical of cyanobacteria that affects the water color: it is therefore possible to study cyanobacteria using remote sensing. At least three algorithms to estimate PC concentration ([PC]) have been published, but their relative errors are important, especially at lower concentrations. In this study, we present the results of a new algorithm that uses the second order variability (anomalies) of the water's reflectance spectrum to estimate [PC]. This method has never been used in lacustrine environments. The dataset used to develop and validate the algorithm was obtained between 2001 and 2005 in 57 different lakes and reservoirs of the Netherlands and Spain. The performance of the second order algorithm is equivalent to or better than the three previously published algorithms. For the subset where [PC] > 32 mg m-3, the contribution of the second order term (R2=0.68 and RMSE=0.25) seems to improve considerably on the first order algorithm (R2=0.50 and RMSE=0.35). The accuracy of the second order algorithm for [PC] > 32 mg m-3 is superior to that calculated for the whole dataset (R2=0.69 and RMSE=0.44). The algorithm can also be adapted to the bands of the satellite sensor MERIS for the study of cyanobacteria. The application of this algorithm to a MERIS image acquired on 29 August 2010 over Missisquoi Bay (Quebec, Canada) demonstrates the potential of this new algorithm for a future cyanobacteria monitoring system. Note that all the statistical results presented above are for the logarithm of [PC] and the units of the RMSE are log(mg m-3).

  8. Continental and oceanic magnetic anomalies: Enhancement through GRM

    NASA Technical Reports Server (NTRS)

    Vonfrese, R. R. B.; Hinze, W. J.

    1985-01-01

In contrast to the POGO and MAGSAT satellites, the Geopotential Research Mission (GRM) satellite system will orbit at a minimum elevation to provide significantly better resolved lithospheric magnetic anomalies for more detailed and improved geologic analysis. In addition, GRM will measure corresponding gravity anomalies to enhance our understanding of the gravity field for vast regions of the Earth which are largely inaccessible to more conventional surface mapping. Crustal studies will greatly benefit from the dual data sets, as modeling has shown that lithospheric sources of long wavelength magnetic anomalies frequently involve density variations which may produce detectable gravity anomalies at satellite elevations. Furthermore, GRM will provide an important replication of lithospheric magnetic anomalies as an aid to identifying and extracting these anomalies from satellite magnetic measurements. The potential benefits to the study of the origin and characterization of the continents and oceans that may result from the increased GRM resolution are examined.

  9. An isolated single L-II type coronary artery anomaly: A rare coronary anomaly

    PubMed Central

    Ermis, Emrah; Demirelli, Selami; Korkmaz, Ali Fuat; Sahin, Bingul Dilekci; Kantarci, Abdulmecit

    2015-01-01

Summary The incidence of congenital coronary artery anomalies is 0.2–1.4%, and most are benign. Single coronary artery (SCA) anomalies are very rare. The right coronary artery (RCA) originating from the left coronary system is one such SCA anomaly, and the risk of sudden cardiac death (SCD) increases if it courses between the pulmonary artery and aorta and coexists with other congenital heart diseases. Additionally, coursing of the RCA between the great vessels increases the risk of atherosclerosis. We herein present the case of a 57-year-old man who was admitted to our cardiology outpatient clinic and diagnosed with an SCA anomaly in which the RCA arose from the left main coronary artery (LMCA) and coursed between the pulmonary artery and aorta. However, no critical stenosis was detected by imaging, and myocardial perfusion scintigraphy showed evidence of ischaemia only in a small area. Therefore, he was managed with conservative medical therapy. PMID:26668781

  10. Generalized random sequential adsorption

    NASA Astrophysics Data System (ADS)

    Tarjus, G.; Schaaf, P.; Talbot, J.

    1990-12-01

    Adsorption of hard spherical particles onto a flat uniform surface is analyzed by using generalized random sequential adsorption (RSA) models. These models are defined by releasing the condition of immobility present in the usual RSA rules to allow for desorption or surface diffusion. Contrary to the simple RSA case, generalized RSA processes are no longer irreversible and the system formed by the adsorbed particles on the surface may reach an equilibrium state. We show by using a distribution function approach that the kinetics of such processes can be described by means of an exact infinite hierarchy of equations reminiscent of the Kirkwood-Salsburg hierarchy for systems at equilibrium. We illustrate the way in which the systems produced by adsorption/desorption and by adsorption/diffusion evolve between the two limits represented by ``simple RSA'' and ``equilibrium'' by considering approximate solutions in terms of truncated density expansions.
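
    The simple, irreversible RSA limit discussed above can be sketched with a short simulation. This is an illustrative toy, not the authors' formalism: disk radius, box size, and attempt count are arbitrary choices, and edge effects are ignored (no periodic boundaries). The generalized (desorption or diffusion) variants would add a particle-removal or particle-displacement step to the same loop.

```python
import math
import random

def rsa_disks(radius, n_attempts, seed=0, box=1.0):
    """Simple RSA: sequentially try random centers in a box x box square;
    accept a disk only if it overlaps no previously adsorbed disk."""
    rng = random.Random(seed)
    centers = []
    min_sep2 = (2 * radius) ** 2          # squared center-to-center exclusion
    for _ in range(n_attempts):
        x, y = rng.uniform(0, box), rng.uniform(0, box)
        if all((x - cx) ** 2 + (y - cy) ** 2 >= min_sep2 for cx, cy in centers):
            centers.append((x, y))        # irreversible: never removed or moved
    return centers

disks = rsa_disks(radius=0.03, n_attempts=20000)
coverage = len(disks) * math.pi * 0.03 ** 2
print(len(disks), round(coverage, 3))
```

    As the number of attempts grows, the coverage approaches the jamming limit of simple RSA from below; an equilibrium hard-disk configuration at the same density would only be reached by adding the desorption or diffusion moves described above.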

  11. QUASI-PERIODIC FAST-MODE WAVE TRAINS WITHIN A GLOBAL EUV WAVE AND SEQUENTIAL TRANSVERSE OSCILLATIONS DETECTED BY SDO/AIA

    SciTech Connect

    Liu Wei; Nitta, Nariaki V.; Aschwanden, Markus J.; Schrijver, Carolus J.; Title, Alan M.; Tarbell, Theodore D.; Ofman, Leon

    2012-07-01

We present the first unambiguous detection of quasi-periodic wave trains within the broad pulse of a global EUV wave (so-called EIT wave) occurring on the limb. These wave trains, running ahead of the lateral coronal mass ejection (CME) front, which is 2-4 times slower, coherently travel to distances ≳ R☉/2 along the solar surface, with initial velocities up to 1400 km s⁻¹ decelerating to ~650 km s⁻¹. The rapid expansion of the CME initiated at an elevated height of 110 Mm produces a strong downward and lateral compression, which may play an important role in driving the primary EUV wave and shaping its front forwardly inclined toward the solar surface. The wave trains have a dominant 2 minute periodicity that matches the X-ray flare pulsations, suggesting a causal connection. The arrival of the leading EUV wave front at increasing distances produces an uninterrupted chain sequence of deflections and/or transverse (likely fast kink mode) oscillations of local structures, including a flux-rope coronal cavity and its embedded filament with delayed onsets consistent with the wave travel time at an elevated (by ~50%) velocity within it. This suggests that the EUV wave penetrates through a topological separatrix surface into the cavity, unexpected from CME-caused magnetic reconfiguration. These observations, when taken together, provide compelling evidence of the fast-mode MHD wave nature of the primary (outer) fast component of a global EUV wave, running ahead of the secondary (inner) slow component of CME-caused restructuring.

  12. Quasi-periodic Fast-mode Wave Trains within a Global EUV Wave and Sequential Transverse Oscillations Detected by SDO/AIA

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Ofman, Leon; Nitta, Nariaki V.; Aschwanden, Markus J.; Schrijver, Carolus J.; Title, Alan M.; Tarbell, Theodore D.

    2012-07-01

We present the first unambiguous detection of quasi-periodic wave trains within the broad pulse of a global EUV wave (so-called EIT wave) occurring on the limb. These wave trains, running ahead of the lateral coronal mass ejection (CME) front, which is 2-4 times slower, coherently travel to distances >~ R ⊙/2 along the solar surface, with initial velocities up to 1400 km s-1 decelerating to ~650 km s-1. The rapid expansion of the CME initiated at an elevated height of 110 Mm produces a strong downward and lateral compression, which may play an important role in driving the primary EUV wave and shaping its front forwardly inclined toward the solar surface. The wave trains have a dominant 2 minute periodicity that matches the X-ray flare pulsations, suggesting a causal connection. The arrival of the leading EUV wave front at increasing distances produces an uninterrupted chain sequence of deflections and/or transverse (likely fast kink mode) oscillations of local structures, including a flux-rope coronal cavity and its embedded filament with delayed onsets consistent with the wave travel time at an elevated (by ~50%) velocity within it. This suggests that the EUV wave penetrates through a topological separatrix surface into the cavity, unexpected from CME-caused magnetic reconfiguration. These observations, when taken together, provide compelling evidence of the fast-mode MHD wave nature of the primary (outer) fast component of a global EUV wave, running ahead of the secondary (inner) slow component of CME-caused restructuring.

  13. Quasi-periodic Fast-mode Wave Trains Within a Global EUV Wave and Sequential Transverse Oscillations Detected by SDO-AIA

    NASA Technical Reports Server (NTRS)

    Liu, Wei; Ofman, Leon; Nitta, Nariaki; Aschwanden, Markus J.; Schrijver, Carolus J.; Title, Alan M.; Tarbell, Theodore D.

    2012-01-01

We present the first unambiguous detection of quasi-periodic wave trains within the broad pulse of a global EUV wave (so-called EIT wave) occurring on the limb. These wave trains, running ahead of the lateral coronal mass ejection (CME) front, which is 2-4 times slower, coherently travel to distances greater than approximately solar radius/2 along the solar surface, with initial velocities up to 1400 kilometers per second decelerating to approximately 650 kilometers per second. The rapid expansion of the CME initiated at an elevated height of 110 Mm produces a strong downward and lateral compression, which may play an important role in driving the primary EUV wave and shaping its front forwardly inclined toward the solar surface. The wave trains have a dominant 2 minute periodicity that matches the X-ray flare pulsations, suggesting a causal connection. The arrival of the leading EUV wave front at increasing distances produces an uninterrupted chain sequence of deflections and/or transverse (likely fast kink mode) oscillations of local structures, including a flux-rope coronal cavity and its embedded filament with delayed onsets consistent with the wave travel time at an elevated (by approximately 50%) velocity within it. This suggests that the EUV wave penetrates through a topological separatrix surface into the cavity, unexpected from CME-caused magnetic reconfiguration. These observations, when taken together, provide compelling evidence of the fast-mode MHD wave nature of the primary (outer) fast component of a global EUV wave, running ahead of the secondary (inner) slow component of CME-caused restructuring.

  14. Hawking radiation and covariant anomalies

    SciTech Connect

    Banerjee, Rabin; Kulkarni, Shailesh

    2008-01-15

    Generalizing the method of Wilczek and collaborators we provide a derivation of Hawking radiation from charged black holes using only covariant gauge and gravitational anomalies. The reliability and universality of the anomaly cancellation approach to Hawking radiation is also discussed.

  15. XYY chromosome anomaly and schizophrenia.

    PubMed

    Rajagopalan, M; MacBeth, R; Varma, S L

    1998-02-07

    Sex chromosome anomalies have been associated with psychoses, and most of the evidence is linked to the presence of an additional X chromosome. We report a patient with XYY chromosome anomaly who developed schizophrenia.

  16. Magnetic Anomalies in the Enderby Basin, the Southern Indian Ocean

    NASA Astrophysics Data System (ADS)

    Nogi, Y.; Sato, T.; Hanyu, T.

    2013-12-01

Magnetic anomalies in the Southern Indian Ocean are vital to understanding the initial breakup process of Gondwana. However, seafloor ages estimated from magnetic anomalies remain poorly defined because of the sparse observations in this area. To understand the seafloor spreading history related to the initial breakup of Gondwana, vector magnetic anomaly data as well as total intensity magnetic anomaly data obtained by the R/V Hakuho-maru and the icebreaker Shirase in the Enderby Basin, Southern Indian Ocean, are used. The strikes of magnetic structures are deduced from the vector magnetic anomalies. Magnetic anomaly signals, most likely indicating the Mesozoic magnetic anomaly sequence, are obtained almost parallel to the west of WNW-ESE trending lineaments just to the south of the Conrad Rise inferred from satellite gravity anomalies. Most of the strikes of magnetic structures indicate NNE-SSW trends and are almost perpendicular to the WNW-ESE trending lineaments. Mesozoic sequence magnetic anomalies with mostly WNW-ESE strikes are also observed along the NNE-SSW trending lineaments between the south of the Conrad Rise and the Gunnerus Ridge. Magnetic anomalies originating from the Cretaceous normal polarity superchron are found in these profiles, although magnetic anomaly C34 has been identified just to the north of the Conrad Rise. However, Mesozoic sequence magnetic anomalies are only observed on the west side of the WNW-ESE trending lineaments just to the south of the Conrad Rise and are not detected to the east of the Cretaceous normal superchron signals. These results show that the counterpart of the Mesozoic sequence magnetic anomalies south of the Conrad Rise would be found in the East Enderby Basin, off East Antarctica. NNE-SSW trending magnetic structures, similar to those obtained just to the south of the Conrad Rise, are found off East Antarctica in the East Enderby Basin. However, some of the strikes show almost E-W orientations. These suggest complicated ridge

  17. Creating chiral anomalies

    NASA Astrophysics Data System (ADS)

    Bradlyn, Barry; Cano, Jennifer; Wang, Zhijun; Hirschberger, Max; Ong, N. Phuan; Bernevig, B. Andrei

    Materials with intrinsic Weyl points should present exotic magnetotransport phenomena due to spectral flow between Weyl nodes of opposite chirality - the so-called ``chiral anomaly''. However, to date, the most definitive transport data showing the presence of a chiral anomaly comes from Dirac (not Weyl) materials. These semimetals develop Weyl fermions only in the presence of an externally applied magnetic field, when the four-fold degeneracy is lifted. In this talk we examine Berry phase effects on transport due to the emergence of these field-induced Weyl point and (in some cases) line nodes. We pay particular attention to the differences between intrinsic and field-induced Weyl fermions, from the point of view of kinetic theory. Finally, we apply our analysis to a particular material relevant to current experiments performed at Princeton.

  18. Ebstein Anomaly in Pregnancy.

    PubMed

    Rusdi, Lusiani; Azizi, Syahrir; Suwita, Christopher; Karina, Astrid; Nasution, Sally A

    2016-10-01

A 27-year-old primiparous woman at 28 weeks of gestational age was admitted to our hospital with worsening shortness of breath. She had been diagnosed with Ebstein's anomaly three years earlier, but had preferred to be left untreated. The patient was not cyanotic and her vital signs were stable. Her ECG showed incomplete RBBB and a prolonged PR-interval. Blood tests revealed mild anemia. Two-dimensional echocardiography with color flow Doppler study showed Ebstein's anomaly with a PFO as an additional defect, an EF of 57%, LV and LA dilatation, RV atrialization, severe TR, and moderate PH with an RVSP of 44.3 mmHg. The patient then underwent an elective caesarean section at 30 weeks of gestational age; both the mother and her baby survived and were in good condition.

  19. Multi-Attribute Sequential Search

    ERIC Educational Resources Information Center

    Bearden, J. Neil; Connolly, Terry

    2007-01-01

    This article describes empirical and theoretical results from two multi-attribute sequential search tasks. In both tasks, the DM sequentially encounters options described by two attributes and must pay to learn the values of the attributes. In the "continuous" version of the task the DM learns the precise numerical value of an attribute when she…

  20. Pathogenesis of Vascular Anomalies

    PubMed Central

    Boon, Laurence M.; Ballieux, Fanny; Vikkula, Miikka

    2010-01-01

    Vascular anomalies are localized defects of vascular development. Most of them occur sporadically, i.e. there is no familial history of lesions, yet in a few cases clear inheritance is observed. These inherited forms are often characterized by multifocal lesions that are mainly small in size and increase in number with patient’s age. On the basis of these inherited forms, molecular genetic studies have unraveled a number of inherited mutations giving direct insight into the pathophysiological cause and the molecular pathways that are implicated. Genetic defects have been identified for hereditary haemorrhagic telangiectasia (HHT), inherited cutaneomucosal venous malformation (VMCM), glomuvenous malformation (GVM), capillary malformation - arteriovenous malformation (CM-AVM), cerebral cavernous malformation (CCM) and some isolated and syndromic forms of primary lymphedema. We focus on these disorders, the implicated mutated genes and the underlying pathogenic mechanisms. We also call attention to the concept of Knudson’s double-hit mechanism to explain incomplete penetrance and the large clinical variation in expressivity of inherited vascular anomalies. This variability renders the making of correct diagnosis of the rare inherited forms difficult. Yet, the identification of the pathophysiological causes and pathways involved in them has had an unprecedented impact on our thinking of their etiopathogenesis, and has opened the doors towards a more refined classification of vascular anomalies. It has also made it possible to develop animal models that can be tested for specific molecular therapies, aimed at alleviating the dysfunctions caused by the aberrant genes and proteins. PMID:21095468

  1. Detection of Anorectal and Cervicovaginal Chlamydia Trachomatis Infections following Azithromycin Treatment: Prospective Cohort Study with Multiple Time-Sequential Measures of rRNA, DNA, Quantitative Load and Symptoms

    PubMed Central

    Dukers-Muijrers, Nicole H. T. M.; Speksnijder, Arjen G. C. L.; Morré, Servaas A.; Wolffs, Petra F. G.; van der Sande, Marianne A. B.; Brink, Antoinette A. T. P.; van den Broek, Ingrid V. F.; Werner, Marita I. L. S.; Hoebe, Christian J. P. A.

    2013-01-01

    Background Determination of Chlamydia trachomatis (Ct) treatment success is hampered by current assessment methods, which involve a single post-treatment measurement only. Therefore, we evaluated Ct detection by applying multiple laboratory measures on time-sequential post-treatment samples. Methods A prospective cohort study was established with azithromycin-treated (1000 mg) Ct patients (44 cervicovaginal and 15 anorectal cases). Each patient provided 18 self-taken samples pre-treatment and for 8 weeks post-treatment (response: 96%; 1,016 samples). Samples were tested for 16S rRNA (TMA), bacterial load (quantitative PCR; Chlamydia plasmid DNA) and type (serovar and multilocus sequence typing). Covariates (including behavior, pre-treatment load, anatomic site, symptoms, age, and menstruation) were tested for their potential association with positivity and load at 3–8 weeks using regression analyses controlling for repeated measures. Findings By day 9, Ct positivity decreased to 20% and the median load to 0.3 inclusion-forming units (IFU) per ml (pre-treatment: 170 IFU/ml). Of the 35 cases who reported no sex, sex with a treated partner or safe sex with a new partner, 40% had detection, i.e. one or more positive samples from 3–8 weeks (same Ct type over time), indicating possible antimicrobial treatment failure. Cases showed intermittent positive detection and the number of positive samples was higher in anorectal cases than in cervicovaginal cases. The highest observed bacterial load between 3–8 weeks post-treatment was 313 IFU/ml, yet the majority (65%) of positive samples showed a load of ≤2 IFU/ml. Pre-treatment load was found to be associated with later load in anorectal cases. Conclusions A single test at 3–8 weeks post-treatment frequently misses Ct. Detection reveals intermittent low loads, with an unknown risk of later complications or transmission. These findings warrant critical re-evaluation of the clinical management of single dose

  2. Sequential biases in accumulating evidence

    PubMed Central

    Huggins, Richard; Dogo, Samson Henry

    2015-01-01

Whilst it is common in clinical trials to use the results of tests at one phase to decide whether to continue to the next phase and to subsequently design the next phase, we show that this can lead to biased results in evidence synthesis. Two new kinds of bias associated with accumulating evidence, termed 'sequential decision bias' and 'sequential design bias', are identified. Both kinds of bias are the result of making decisions on the usefulness of a new study, or its design, based on the previous studies. Sequential decision bias is determined by the correlation between the value of the current estimated effect and the probability of conducting an additional study. Sequential design bias arises from using the estimated value instead of the clinically relevant value of an effect in sample size calculations. We considered both the fixed-effect and the random-effects models of meta-analysis and demonstrated analytically and by simulations that in both settings the problems due to sequential biases are apparent. According to our simulations, the sequential biases increase with increased heterogeneity. Minimisation of sequential biases arises as a new and important research area necessary for successful evidence-based approaches to the development of science. © 2015 The Authors. Research Synthesis Methods Published by John Wiley & Sons Ltd. PMID:26626562
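
    The mechanism behind sequential decision bias can be illustrated with a toy two-stage simulation (our construction, not the authors' model): each study estimate is unbiased on its own, but because the decision to conduct a second study depends on the first estimate, the equal-weight pooled estimate is biased even when the true effect is zero.

```python
import random
import statistics

def one_run(rng, theta=0.0, se=1.0):
    """Two-stage evidence accumulation with a data-dependent continuation rule:
    a second study is conducted only if the first estimate looks promising (> 0).
    Returns the equal-weight (fixed-effect-style) pooled estimate."""
    x1 = rng.gauss(theta, se)
    if x1 > 0:                       # decision to continue depends on x1
        x2 = rng.gauss(theta, se)
        return (x1 + x2) / 2
    return x1                        # stopped early: x1 keeps full weight

rng = random.Random(42)
runs = [one_run(rng) for _ in range(20000)]
print(round(statistics.fmean(runs), 3))  # ≈ -0.2 rather than 0: sequential decision bias
```

    Here the bias is downward (analytically -1/(2√(2π)) ≈ -0.2 for a true effect of zero), because unpromising first estimates are never diluted by a second study; the sign and size of the bias depend on the stopping rule, which is the correlation effect described above.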

  3. On Sequential Detection in Dependent Noise.

    DTIC Science & Technology

    1980-06-01

the test statistics leads to an optimum nonlinearity which seems to contradict the well-known Wald-Wolfowitz theorem. Moreover, this situation can... Wolfowitz theorem (see A. Wald 1947), which proves that Log(L) is the optimum nonlinearity function that minimizes the average sample size under both H and...defined (A. Wald, 1947) by testing the probability ratio against two thresholds A and B, i.e., we continue sampling as long as B < f1(x1, . . ., xn

  4. Physicochemical isotope anomalies

    SciTech Connect

    Esat, T.M.

    1988-06-01

Isotopic composition of refractory elements can be modified, by physical processes such as distillation and sputtering, in unexpected patterns. Distillation enriches the heavy isotopes in the residue and the light isotopes in the vapor. However, current models appear to be inadequate to describe the detailed mass dependence, in particular for large fractionations. Coarse- and fine-grained inclusions from the Allende meteorite exhibit correlated isotope effects in Mg, both as mass-dependent fractionation and residual anomalies. This isotope pattern can be duplicated by high temperature distillation in the laboratory. A ubiquitous property of meteoritic inclusions for Mg, as well as for most of the other elements where measurements exist, is mass-dependent fractionation. In contrast, terrestrial materials such as microtektites and tektite buttons, as well as lunar orange and green glass spheres, have normal Mg isotopic composition. A subset of interplanetary dust particles labelled as chondritic aggregates exhibit excesses in ²⁶Mg and deuterium anomalies. Sputtering is expected to be a dominant mechanism in the destruction of grains within interstellar dust clouds. An active proto-sun as well as the present solar-wind and solar-flare flux are of sufficient intensity to sputter significant amounts of material. Laboratory experiments in Mg show widespread isotope effects including residual ²⁶Mg excesses and mass dependent fractionation. It is possible that the ²⁶Mg excesses in interplanetary dust are related to sputtering by energetic solar-wind particles. The implications of the laboratory distillation and sputtering effects are discussed and contrasted with the anomalies in meteoritic inclusions and the other extraterrestrial materials to which the authors have access.

  5. Multiprobe in-situ measurement of magnetic field in a minefield via a distributed network of miniaturized low-power integrated sensor systems for detection of magnetic field anomalies

    NASA Astrophysics Data System (ADS)

    Javadi, Hamid H. S.; Bendrihem, David; Blaes, B.; Boykins, Kobe; Cardone, John; Cruzan, C.; Gibbs, J.; Goodman, W.; Lieneweg, U.; Michalik, H.; Narvaez, P.; Perrone, D.; Rademacher, Joel D.; Snare, R.; Spencer, Howard; Sue, Miles; Weese, J.

    1998-09-01

Based on technologies developed for the Jet Propulsion Laboratory (JPL) Free-Flying-Magnetometer (FFM) concept, we propose to modify the present design of FFMs for detection of mines and arsenals with a large magnetic signature. The result will be an integrated miniature sensor system capable of identifying a local magnetic field anomaly caused by a magnetic dipole moment. The proposed integrated sensor system is in line with the JPL technology road-map for development of autonomous, intelligent, networked, integrated systems with a broad range of applications. In addition, advanced sensitive magnetic sensors (e.g., silicon micromachined magnetometer, laser pumped helium magnetometer) are being developed for future NASA space plasma probes. It is envisioned that a fleet of these Integrated Sensor Systems (ISS) units will be dispersed on a minefield via an aerial vehicle (a low-flying airplane or helicopter). The number of such sensor systems in each fleet and the corresponding in-situ probe-grid cell size are based on the strength of the magnetic anomaly of the target and the ISS measurement resolution of the magnetic field vector. After a specified time, ISS units will transmit the measured magnetic field and attitude data to an air-borne platform for further data processing. The cycle of data acquisition and transmission will be continued until the batteries run out. Data analysis will allow a local deformation of the Earth's magnetic field vector by a magnetic dipole moment to be detected. Each ISS unit consists of a miniaturized sensitive 3-axis magnetometer, high resolution analog-to-digital converter (ADC), Field Programmable Gate Array (FPGA)-based data subsystem, Li-batteries and power regulation circuitry, memory, S-band transmitter, single-patch antenna, and a sun angle sensor. The ISS unit is packaged with non-magnetic components and the electronic design implements low-magnetic-signature circuits. Care is undertaken to guarantee no corruption of magnetometer sensitivity as a result

  6. Sequential Testing: Basics and Benefits

    DTIC Science & Technology

    1978-03-01

103-109 44. A. Wald, Sequential Analysis, John Wiley and Sons, 1947 45. A. Wald and J. Wolfowitz, "Optimum Character of The Sequential Probability Ratio...work done by A. Wald [44]. Wald's work on sequential analysis can be used virtually without modification in a situation where decisions are made... Wald can be used. The decision to accept, reject, or continue the test depends on: B < (θ0/θ1)^r exp[-(1/θ1 - 1/θ0)V(t)] < A (1) where B and A are
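
    Wald's probability-ratio test against two thresholds, referenced in the excerpt above, can be sketched for the Gaussian-mean case (the exponential life-test form in the excerpt follows the same accept/reject/continue logic). The hypotheses, thresholds, and parameter values here are illustrative, not taken from the report.

```python
import math
import random

def sprt(samples, mu0, mu1, sigma=1.0, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: mean = mu0 vs H1: mean = mu1, known sigma.
    Accumulate the log-likelihood ratio; accept H1 at A = log((1-beta)/alpha),
    accept H0 at B = log(beta/(1-alpha)), otherwise keep sampling."""
    A = math.log((1 - beta) / alpha)
    B = math.log(beta / (1 - alpha))
    llr, n = 0.0, 0
    for x in samples:
        n += 1
        # Gaussian log-likelihood-ratio increment for one observation
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
        if llr >= A:
            return "accept H1", n
        if llr <= B:
            return "accept H0", n
    return "undecided", n

rng = random.Random(7)
decision, n = sprt((rng.gauss(1.0, 1.0) for _ in range(1000)), mu0=0.0, mu1=1.0)
print(decision, n)
```

    The thresholds A and B correspond to the bounds in equation (1) of the excerpt after taking logarithms; sampling continues only while the accumulated ratio stays strictly between them, which is what gives the SPRT its minimal average sample size.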

  7. Satellite magnetic anomalies over subduction zones - The Aleutian Arc anomaly

    NASA Technical Reports Server (NTRS)

    Clark, S. C.; Frey, H.; Thomas, H. H.

    1985-01-01

    Positive magnetic anomalies seen in MAGSAT average scalar anomaly data overlying some subduction zones can be explained in terms of the magnetization contrast between the cold subducted oceanic slab and the surrounding hotter, nonmagnetic mantle. Three-dimensional modeling studies show that peak anomaly amplitude and location depend on slab length and dip. A model for the Aleutian Arc anomaly matches the general trend of the observed MAGSAT anomaly if a slab thickness of 7 km and a relatively high (induced plus viscous) magnetization contrast of 4 A/m are used. A second source body along the present day continental margin is required to match the observed anomaly in detail, and may be modeled as a relic slab from subduction prior to 60 m.y. ago.

  8. High-throughput sequential injection method for simultaneous determination of plutonium and neptunium in environmental solids using macroporous anion-exchange chromatography, followed by inductively coupled plasma mass spectrometric detection.

    PubMed

    Qiao, Jixin; Hou, Xiaolin; Roos, Per; Miró, Manuel

    2011-01-01

This paper reports an automated analytical method for rapid and simultaneous determination of plutonium and neptunium in soil, sediment, and seaweed, with detection via inductively coupled plasma mass spectrometry (ICP-MS). A chromatographic column packed with a macroporous anion exchanger (AG MP-1 M) was incorporated in a sequential injection (SI) system for the efficient retrieval of plutonium, along with neptunium, from matrix elements and potential interfering nuclides. The sorption and elution behavior of plutonium and neptunium onto AG MP-1 M resin was compared with a commonly utilized AG 1-gel-type anion exchanger. Experimental results reveal that the pore structure of the anion exchanger plays a pivotal role in ensuring similar separation behavior of plutonium and neptunium along the separation protocol. It is proven that plutonium-242 ((242)Pu) performs well as a tracer for monitoring the chemical yield of neptunium when using AG MP-1 M resin, whereby the difficulties in obtaining a reliable and practicable isotopic neptunium tracer are overcome. An important asset of the SI setup is the feasibility of processing up to 100 g of solid substrates using a small-sized (ca. 2 mL) column with chemical yields of neptunium and plutonium being ≥79%. Analytical results of three certified/standard reference materials and two solid samples from intercomparison exercises are in good agreement with the reference values at the 0.05 significance level. The overall on-column separation can be completed within 3.5 h for 10 g of soil samples. Most importantly, the anion-exchange mini-column can be reused up to 10 times with satisfactory chemical yields (>70%), as demanded in environmental monitoring and emergency scenarios, making the proposed automated assembly well-suited for unattended and high-throughput analysis.

  9. Einstein, Entropy and Anomalies

    NASA Astrophysics Data System (ADS)

    Sirtes, Daniel; Oberheim, Eric

    2006-11-01

    This paper strengthens and defends the pluralistic implications of Einstein's successful, quantitative predictions of Brownian motion for a philosophical dispute about the nature of scientific advance that began between two prominent philosophers of science in the second half of the twentieth century (Thomas Kuhn and Paul Feyerabend). Kuhn promoted a monistic phase-model of scientific advance, according to which a paradigm-driven `normal science' gives rise to its own anomalies, which then lead to a crisis and eventually a scientific revolution. Feyerabend stressed the importance of pluralism for scientific progress. He rejected Kuhn's model, arguing that it fails to recognize the role that alternative theories can play in identifying exactly which phenomena are anomalous in the first place. On Feyerabend's account, Einstein's predictions allow for a crucial experiment between two incommensurable theories, and are an example of an anomaly that could refute the reigning paradigm only after the development of a competitor. Using Kuhn's specification of a disciplinary matrix to illustrate the incommensurability between the two paradigms, we examine the different research strategies available in this peculiar case. On the basis of our reconstruction, we conclude by rebutting some critics of Feyerabend's argument.

  10. The XXXXY Chromosome Anomaly

    PubMed Central

    Zaleski, Witold A.; Houston, C. Stuart; Pozsonyi, J.; Ying, K. L.

    1966-01-01

    The majority of abnormal sex chromosome complexes in the male have been considered to be variants of Klinefelter's syndrome, but an exception should probably be made in the case of the XXXXY individual, who has distinctive phenotypic features. Clinical, radiological and cytological data on three new cases of XXXXY syndrome are presented and 30 cases from the literature are reviewed. In many cases the published clinical and radiological data were supplemented and re-evaluated. Mental retardation, usually severe, was present in all cases. Typical facies was observed in many; clinodactyly of the fifth finger was seen in nearly all. Radiological examination revealed abnormalities in the elbows and wrists in all 19 personally evaluated cases, and other skeletal anomalies were very frequent. Cryptorchism is very common, and absence of Leydig's cells may differentiate the XXXXY chromosome anomaly from polysomic variants of Klinefelter's syndrome. The relationship of this syndrome to Klinefelter's syndrome and to Down's syndrome is discussed. PMID:4222822

  11. Augmented railgun and sequential discharge

    NASA Astrophysics Data System (ADS)

    Kobayashi, K.

    1993-01-01

    Proprietary R&D efforts toward the creation of railguns applicable to tactical weapon systems are presented. Attention is given to measures taken to maximize projectile velocity and to achieve sequential-discharge operation, and to an augmented railgun which has demonstrated a 66-percent efficiency improvement over the two-rail baseline railgun system. This device is characterized by strong interaction between capacitor-bank submodules during sequential discharge.

  12. Sequentially pulsed traveling wave accelerator

    DOEpatents

    Caporaso, George J.; Nelson, Scott D.; Poole, Brian R.

    2009-08-18

    A sequentially pulsed traveling wave compact accelerator having two or more pulse forming lines each with a switch for producing a short acceleration pulse along a short length of a beam tube, and a trigger mechanism for sequentially triggering the switches so that a traveling axial electric field is produced along the beam tube in synchronism with an axially traversing pulsed beam of charged particles to serially impart energy to the particle beam.

  13. Sequential spatial frequency discrimination is consistently impaired among adult dyslexics.

    PubMed

    Ben-Yehudah, Gal; Ahissar, Merav

    2004-05-01

    The degree and nature of dyslexics' difficulties in performing basic visual tasks have been debated for more than thirty years. We recently found that dyslexics' difficulties in detecting temporally modulated gratings are specific to conditions that require accurate comparisons between sequentially presented stimuli [Brain 124 (2001) 1381]. We now examine dyslexics' spatial frequency discrimination (rather than detection) under simultaneous (spatial forced choice) and sequential (temporal forced choice) presentations. Sequential presentation (at SOAs of 0.5, 0.75 and 2.25 s) yielded better discrimination thresholds among the majority of controls (around a 0.5 c/deg reference), but not among dyslexics. Consequently, there was a (large and significant) group effect only for the sequential conditions. Within the same dyslexic group, performance on a sequential auditory task, two-tone frequency discrimination, was impaired in a smaller proportion of the participants. Taken together, our findings indicate that visual paradigms requiring sequential comparisons are difficult for the majority of dyslexic individuals, perhaps because deficits in either visual perception or visual memory can lead to difficulties on these paradigms.

  14. Statistical significance of the gallium anomaly

    SciTech Connect

    Giunti, Carlo; Laveder, Marco

    2011-06-15

    We calculate the statistical significance of the anomalous deficit of electron neutrinos measured in the radioactive source experiments of the GALLEX and SAGE solar neutrino detectors, taking into account the uncertainty of the detection cross section. We found that the statistical significance of the anomaly is ≈3.0σ. A fit of the data in terms of neutrino oscillations favors at ≈2.7σ short-baseline electron neutrino disappearance with respect to the null hypothesis of no oscillations.

  15. A Bayesian sequential processor approach to spectroscopic portal system decisions

    SciTech Connect

    Sale, K; Candy, J; Breitfeller, E; Guidry, B; Manatt, D; Gosnell, T; Chambers, D

    2007-07-31

    The development of faster, more reliable techniques to detect radioactive contraband in a portal-type scenario is an extremely important problem, especially in this era of constant terrorist threats. Toward this goal, the development of a model-based, Bayesian sequential data processor for the detection problem is discussed. In the sequential processor, each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The advantage of the sequential approach is that a detection is declared as soon as it is statistically justified by the data, rather than after a fixed counting interval. In this paper the Bayesian model-based approach, the physics and signal-processing models, and the decision functions are discussed, along with the first results of our research.
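    The per-datum update can be sketched with a simple two-hypothesis Poisson model (a hypothetical simplification of the processor described above: a single background rate versus a single background-plus-source rate, with counts aggregated into fixed intervals; the function name and parameters are illustrative, not from the paper):

```python
import math

def sequential_poisson_detector(counts, bg_rate, src_rate, threshold=0.99, prior=0.5):
    """Update P(source present) after each count interval and stop early.

    counts   : non-empty list of per-interval detector counts (hypothetical data)
    bg_rate  : expected background counts per interval
    src_rate : expected background-plus-source counts per interval
    Returns (decision_index, posterior): the first interval at which the
    posterior exceeds `threshold`, or (None, final posterior) if it never does.
    """
    log_odds = math.log(prior / (1 - prior))
    for i, n in enumerate(counts):
        # Log-likelihood ratio of one Poisson observation (factorials cancel).
        log_odds += n * math.log(src_rate / bg_rate) - (src_rate - bg_rate)
        posterior = 1 / (1 + math.exp(-log_odds))
        if posterior > threshold:
            return i, posterior
    return None, posterior
```

    With a strong source the decision is typically reached after very few intervals, which is exactly the advantage over a fixed counting window.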

  16. Genetics of lymphatic anomalies

    PubMed Central

    Brouillard, Pascal; Boon, Laurence; Vikkula, Miikka

    2014-01-01

    Lymphatic anomalies include a variety of developmental and/or functional defects affecting the lymphatic vessels: sporadic and familial forms of primary lymphedema, secondary lymphedema, chylothorax and chylous ascites, lymphatic malformations, and overgrowth syndromes with a lymphatic component. Germline mutations have been identified in at least 20 genes that encode proteins acting around VEGFR-3 signaling but also downstream of other tyrosine kinase receptors. These mutations exert their effects via the RAS/MAPK and the PI3K/AKT pathways and explain more than a quarter of the incidence of primary lymphedema, mostly of inherited forms. More common forms may also result from multigenic effects or post-zygotic mutations. Most of the corresponding murine knockouts are homozygous lethal, while heterozygotes are healthy, which suggests differences in human and murine physiology and the influence of other factors. PMID:24590274

  17. Systematic Screening for Subtelomeric Anomalies in a Clinical Sample of Autism

    ERIC Educational Resources Information Center

    Wassink, Thomas H.; Losh, Molly; Piven, Joseph; Sheffield, Val C.; Ashley, Elizabeth; Westin, Erik R.; Patil, Shivanand R.

    2007-01-01

    High-resolution karyotyping detects cytogenetic anomalies in 5-10% of cases of autism. Karyotyping, however, may fail to detect abnormalities of chromosome subtelomeres, which are gene-rich regions prone to anomalies. We assessed whether panels of FISH probes targeted to subtelomeres could detect abnormalities beyond those identified by…

  18. Nolen-Schiffer anomaly

    SciTech Connect

    Pieper, S.C.; Wiringa, R.B.

    1995-08-01

    The Argonne v₁₈ potential contains a detailed treatment of the pp, pn and nn electromagnetic potential, including Coulomb, vacuum-polarization, Darwin-Foldy and magnetic-moment terms, all with suitable form factors, and was fit to pp and pn data using the appropriate nuclear masses. In addition, it contains a nuclear charge-symmetry-breaking (CSB) term adjusted to reproduce the difference in the experimental pp and nn scattering lengths. We have used these potential terms to compute differences in the binding energies of mirror isospin-1/2 nuclei (the Nolen-Schiffer [NS] anomaly). Variational Monte Carlo calculations for the ³He-³H system and cluster variational Monte Carlo calculations for the ¹⁵O-¹⁵N and ¹⁷F-¹⁷O systems were made. In the first case, the best variational wave function for the A = 3 nuclei was used. However, because our ¹⁶O wave function does not accurately reproduce the ¹⁶O rms radius, to which the NS anomaly is very sensitive, we adjusted the A = 15 and A = 17 wave functions to reproduce the experimental density profiles. Our computed energy differences for these three systems are 0.757 ± 0.001, 3.544 ± 0.018 and 3.458 ± 0.040 MeV, respectively, which are to be compared with the experimental differences of 0.764, 3.537, and 3.544 MeV. Most of the theoretical uncertainties are due to uncertainties in the experimental rms radii. The nuclear CSB potential contributes 0.066, 0.188, and 0.090 MeV to these totals. We also attempted calculations for A = 39 and A = 41. However, in these cases, the experimental uncertainties in the rms radius make it impossible to extract useful information about the contribution of the nuclear CSB potential.

  19. Turtle Carapace Anomalies: The Roles of Genetic Diversity and Environment

    PubMed Central

    Velo-Antón, Guillermo; Becker, C. Guilherme; Cordero-Rivera, Adolfo

    2011-01-01

    Background Phenotypic anomalies are common in wild populations and multiple genetic, biotic and abiotic factors might contribute to their formation. Turtles are excellent models for the study of developmental instability because anomalies are easily detected in the form of malformations, additions, or reductions in the number of scutes or scales. Methodology/Principal Findings In this study, we integrated field observations, manipulative experiments, and climatic and genetic approaches to investigate the origin of carapace scute anomalies across Iberian populations of the European pond turtle, Emys orbicularis. The proportion of anomalous individuals varied from 3% to 69% in local populations, with increasing frequency of anomalies in northern regions. We found no significant effect of climatic or soil moisture, or of temperature, on the occurrence of anomalies. However, lower genetic diversity and inbreeding were good predictors of the prevalence of scute anomalies among populations. Both decreasing genetic diversity and increasing proportion of anomalous individuals in northern parts of the Iberian distribution may be linked to recolonization events from the Southern Pleistocene refugium. Conclusions/Significance Overall, our results suggest that developmental instability in turtle carapace formation might be caused, at least in part, by genetic factors, although the influence of environmental factors affecting the developmental stability of the turtle carapace cannot be ruled out. Further studies of the effects of environmental factors, pollutants and heritability of anomalies would be useful to better understand the complex origin of anomalies in natural populations. PMID:21533278

  20. Sequential Syndrome Decoding of Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Reed, I. S.; Truong, T. K.

    1984-01-01

    The algebraic structure of convolutional codes is reviewed, and sequential syndrome decoding is applied to these codes. These concepts are then used to demonstrate actual sequential decoding by example, using the stack algorithm. The Fano metric for use in sequential decoding is modified so that it can be used to sequentially find the minimum-weight error sequence.
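    A toy version of stack decoding can illustrate the idea. The sketch below is hypothetical (not from the paper): it decodes a rate-1/2 convolutional code with generators (7, 5) octal, and for brevity uses accumulated Hamming distance as the path metric in place of the Fano metric:

```python
import heapq

# Rate-1/2 convolutional code, generators (7, 5) octal, constraint length 3.
G = [0b111, 0b101]

def encode(bits, state=0):
    """Encode an input bit sequence (no tail bits, for brevity)."""
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111
        out += [bin(state & g).count("1") % 2 for g in G]
    return out

def stack_decode(received, n_bits):
    """Stack-algorithm sketch: always extend the best partial path first.

    The metric is accumulated Hamming distance to the received symbols, a
    stand-in for the Fano metric of true sequential decoding.  With this
    metric and nonnegative per-branch costs, the first full-length path
    popped from the stack has minimum distance (uniform-cost search).
    """
    heap = [(0, 0, 0, [])]  # (distance, depth, encoder state, input path)
    while heap:
        dist, depth, state, path = heapq.heappop(heap)
        if depth == n_bits:
            return path, dist
        for b in (0, 1):
            ns = ((state << 1) | b) & 0b111
            sym = [bin(ns & g).count("1") % 2 for g in G]
            rx = received[2 * depth:2 * depth + 2]
            d = sum(a != c for a, c in zip(sym, rx))
            heapq.heappush(heap, (dist + d, depth + 1, ns, path + [b]))
```

    A real sequential decoder replaces the Hamming metric with the Fano metric, which rewards path length so the search does not dwell on short paths.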

  1. Sequential algorithm for fast clique percolation.

    PubMed

    Kumpula, Jussi M; Kivelä, Mikko; Kaski, Kimmo; Saramäki, Jari

    2008-08-01

    In complex network research, clique percolation, introduced by Palla, Derényi, and Vicsek [Nature (London) 435, 814 (2005)], is a deterministic community detection method which allows for overlapping communities and is based purely on local topological properties of a network. Here we present a sequential clique percolation algorithm (SCP) for fast community detection in weighted and unweighted networks, for cliques of a chosen size. The method is based on sequentially inserting the constituent links into the network while keeping track of the emerging community structure. Unlike existing algorithms, the SCP method allows for detecting k-clique communities at multiple weight thresholds in a single run, and can simultaneously produce a dendrogram representation of the hierarchical community structure. In sparse weighted networks, the SCP algorithm can also be used to implement the weighted clique percolation method recently introduced by Farkas [New J. Phys. 9, 180 (2007)]. The computational time of the SCP algorithm scales linearly with the number of k-cliques in the network. As an example, the method is applied to a product association network, revealing its nested community structure.
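    The community definition that SCP computes incrementally can be shown with a brute-force sketch of basic k-clique percolation (illustrative only; this is the Palla et al. definition, not the sequential algorithm of the paper):

```python
from itertools import combinations

def k_clique_communities(edges, k=3):
    """Basic k-clique percolation: two k-cliques are adjacent if they share
    k-1 nodes; a community is the union of a connected set of k-cliques.
    Brute-force enumeration, fine only for small illustrative graphs.
    """
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    nodes = sorted(adj)
    # Enumerate all k-cliques.
    cliques = [c for c in combinations(nodes, k)
               if all(b in adj[a] for a, b in combinations(c, 2))]
    # Union-find over cliques overlapping in k-1 nodes.
    parent = list(range(len(cliques)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i, j in combinations(range(len(cliques)), 2):
        if len(set(cliques[i]) & set(cliques[j])) == k - 1:
            parent[find(i)] = find(j)
    comms = {}
    for i, c in enumerate(cliques):
        comms.setdefault(find(i), set()).update(c)
    return list(comms.values())
```

    Two triangles sharing only one node stay in separate communities (overlap < k-1), while triangles sharing an edge merge; SCP reaches the same result while inserting links one at a time.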

  2. Limb anomalies in DiGeorge and CHARGE syndromes

    SciTech Connect

    Prasad, C.; Quackenbush, E.J.; Whiteman, D.; Korf, B.

    1997-01-20

    Limb anomalies are not common in the DiGeorge or CHARGE syndromes. We describe limb anomalies in two children, one with DiGeorge and the other with CHARGE syndrome. Our first patient had a bifid left thumb, Tetralogy of Fallot, absent thymus, right facial palsy, and a reduced number of T-cells. A deletion of 22q11 was detected by fluorescence in situ hybridization (FISH). The second patient, with CHARGE syndrome, had asymmetric findings that included right fifth finger clinodactyly, camptodactyly, tibial hemimelia and dimpling, and severe club-foot. The expanded spectrum of the DiGeorge and CHARGE syndromes includes limb anomalies. 14 refs., 4 figs.

  3. A Semiparametric Model for Hyperspectral Anomaly Detection

    DTIC Science & Technology

    2012-01-01

    known that the performance of kernel methods crucially depends on the kernel function and its parameter(s) [11]. More recently, Gurram and Kwon in [12...700 VNIR HS spectral imager, which is commercially available off the shelf. The system produces HS data cubes of fixed dimensions R = 640 by C = 640...window in X (a data cube). The data format of X is shown in (1), where r (r = 1, ..., R) and c (c = 1, ..., C) index pixels x_rc in the R × C spatial

  4. Compressive Hyperspectral Imaging and Anomaly Detection

    DTIC Science & Technology

    2010-02-01

    the desired jointly sparse a's, one shall adjust a and b. 4.4 Hyperspectral Image Reconstruction and Denoising We apply the model x* = Da' + e to...iteration for compressive sensing and sparse denoising," Communications in Mathematical Sciences, 2008. W. Yin, "Analysis and generalizations of...Aharon, M. Elad, and A. Bruckstein, "K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation," IEEE Transactions on Signal

  5. Anomaly Detection at Multiple Scales (ADAMS)

    DTIC Science & Technology

    2011-11-09

    card numbers that have black market value [15, 26]. However, enticement depends upon the attacker's intent or preference. We define enticing...Analysis," Phrack 11, 61-9, 2003. [7] Friess, N., and Aycock, J., "Black Market Botnets," Department of Computer Science, University of Calgary, TR...Introduction to Modern Cryptography, Chapman and Hall/CRC Press, 2007. [14] Kravets, D., "From Riches to Prison: Hackers Rig Stock Prices," Wired

  6. Unsupervised Topic Discovery by Anomaly Detection

    DTIC Science & Technology

    2013-09-01

    plans to achieve a sustainable population for the future. The idea of the whitepaper was first mooted due to the declining birth rate in Singapore...It talks about the importance of marriage and parenthood and the measures taken by the government to encourage parenthood . It addresses unpopular...Marriage and Parenthood Integration and Identity Immigrants Cost of Living Economy and Workforce Livability Others Table 2. Seven categories of

  7. Anomaly Detection and Attribution Using Bayesian Networks

    DTIC Science & Technology

    2014-06-01

    Introduction Underlying every issue in statistical reasoning is the assumption that the data being examined was generated by the same underlying process which our statistical models are designed to represent. The accuracy of a model compared to the process it represents is irrelevant unless we are considering data which was indeed generated by the same process. When this assumption fails for a given piece of data, that data is called an outlier

  8. System for closure of a physical anomaly

    DOEpatents

    Bearinger, Jane P; Maitland, Duncan J; Schumann, Daniel L; Wilson, Thomas S

    2014-11-11

    Systems for closure of a physical anomaly. Closure is accomplished by a closure body with an exterior surface. The exterior surface contacts the opening of the anomaly and closes the anomaly. The closure body has a primary shape for closing the anomaly and a secondary shape for being positioned in the physical anomaly. The closure body preferably comprises a shape memory polymer.

  9. Blocking for Sequential Political Experiments.

    PubMed

    Moore, Ryan T; Moore, Sally A

    2013-10-01

    In typical political experiments, researchers randomize a set of households, precincts, or individuals to treatments all at once, and characteristics of all units are known at the time of randomization. However, in many other experiments, subjects "trickle in" to be randomized to treatment conditions, usually via complete randomization. To take advantage of the rich background data that researchers often have (but underutilize) in these experiments, we develop methods that use continuous covariates to assign treatments sequentially. We build on biased coin and minimization procedures for discrete covariates and demonstrate that our methods outperform complete randomization, producing better covariate balance in simulated data. We then describe how we selected and deployed a sequential blocking method in a clinical trial and demonstrate the advantages of our having done so. Further, we show how that method would have performed in two larger sequential political trials. Finally, we compare causal effect estimates from differences in means, augmented inverse propensity weighted estimators, and randomization test inversion.
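    As an illustration of the general idea (not the authors' specific procedure), a covariate-balancing biased coin for units that "trickle in" can be sketched as follows; the function name and the p_favor parameter are hypothetical:

```python
import random

def biased_coin_assign(new_x, assigned, p_favor=0.8, rng=random):
    """Sequentially assign a unit to treatment (1) or control (0).

    Sketch of a biased coin on one continuous covariate: find which arm
    would leave the arm means of the covariate closer together if it
    received the new unit, then choose that arm with probability p_favor.
    `assigned` is a list of (covariate, arm) pairs for earlier units.
    """
    if not assigned:
        return rng.randint(0, 1)  # first unit: fair coin
    def imbalance(arm):
        xs = {0: [], 1: []}
        for x, a in assigned:
            xs[a].append(x)
        xs[arm].append(new_x)  # hypothetically place the new unit
        means = [sum(v) / len(v) for v in xs.values() if v]
        return max(means) - min(means)
    favored = 0 if imbalance(0) < imbalance(1) else 1
    return favored if rng.random() < p_favor else 1 - favored
```

    Keeping p_favor below 1 preserves randomization (and hence randomization inference) while still steering assignments toward balance.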

  10. A bit serial sequential circuit

    NASA Technical Reports Server (NTRS)

    Hu, S.; Whitaker, S.

    1990-01-01

    Normally a sequential circuit with n state variables consists of n unique hardware realizations, one for each state variable. All variables are processed in parallel. This paper introduces a new sequential circuit architecture that allows the state variables to be realized in a serial manner using only one next state logic circuit. The action of processing the state variables in a serial manner has never been addressed before. This paper presents a general design procedure for circuit construction and initialization. Utilizing pass transistors to form the combinational next state forming logic in synchronous sequential machines, a bit serial state machine can be realized with a single NMOS pass transistor network connected to shift registers. The bit serial state machine occupies less area than other realizations which perform parallel operations. Moreover, the logical circuit of the bit serial state machine can be modified by simply changing the circuit input matrix to develop an adaptive state machine.

  11. Complementary sequential measurements generate entanglement

    NASA Astrophysics Data System (ADS)

    Coles, Patrick J.; Piani, Marco

    2014-01-01

    We present a paradigm for capturing the complementarity of two observables. It is based on the entanglement created by the interaction between the system observed and the two measurement devices used to measure the observables sequentially. Our main result is a lower bound on this entanglement and resembles well-known entropic uncertainty relations. Besides its fundamental interest, this result directly bounds the effectiveness of sequential bipartite operations—corresponding to the measurement interactions—for entanglement generation. We further discuss the intimate connection of our result with two primitives of information processing, namely, decoupling and coherent teleportation.

  12. Prevalence and distribution of selected dental anomalies among saudi children in Abha, Saudi Arabia

    PubMed Central

    2016-01-01

    Background Dental anomalies are not an unusual finding in routine dental examination, and they can lead to functional, esthetic and occlusal problems. The purpose of the study was to determine the prevalence and distribution of selected developmental dental anomalies in Saudi children. Material and Methods The study was based on clinical examination and panoramic radiographs of children who visited the pediatric dentistry clinics at King Khalid University College of Dentistry, Saudi Arabia. These patients were examined for dental anomalies of size, shape, number, structure and position. Data collected were entered and analyzed using the Statistical Package for the Social Sciences (SPSS). Results Of the 1252 children (638 boys, 614 girls) examined, 318 (25.39%) presented with the selected dental anomalies: 175 boys (27.42%) and 143 girls (23.28%). On intergroup comparison, number anomalies were the most common, with hypodontia (9.7%) the most frequent anomaly in Saudi children, followed by hyperdontia (3.5%). The size anomalies were microdontia (2.6%) and macrodontia (1.8%); the shape anomalies were talon cusp (1.4%), taurodontism (1.4%) and fusion (0.8%); the positional anomalies were ectopic eruption (2.3%) and rotation (0.4%); and the structural anomalies were amelogenesis imperfecta (0.3%) and dentinogenesis imperfecta (0.1%). Conclusions A significant number of children had a dental anomaly, with hypodontia the most common and dentinogenesis imperfecta the rarest in this study. Early detection and management of these anomalies can avoid potential orthodontic and esthetic problems in a child. Key words: Dental anomalies, children, Saudi Arabia. PMID:27957258

  13. The flyby anomaly: a multivariate analysis approach

    NASA Astrophysics Data System (ADS)

    Acedo, L.

    2017-02-01

    The flyby anomaly is the unexpected variation of the asymptotic post-encounter velocity of a spacecraft with respect to the pre-encounter velocity as it performs a slingshot manoeuvre. This effect has been detected in at least six flybys of the Earth but has not appeared in other recent flybys. In order to find a pattern in these apparently contradictory data, several phenomenological formulas have been proposed, but all have failed to predict a new result in agreement with the observations. In this paper we use a multivariate dimensional analysis approach to propose a fitting of the data in terms of the local parameters at perigee, as would occur if this anomaly comes from an unknown fifth force with latitude dependence. Under this assumption, we estimate the range of this force to be around 300 km.

  14. Data Stream Mining Based Dynamic Link Anomaly Analysis Using Paired Sliding Time Window Data

    DTIC Science & Technology

    2014-11-01

    Final technical report, November 2014 (dates covered: April 2011 – April 2014). ...for data stream mining in order to incorporate link anomaly detection into the dynamic network analysis. The proposed dynamic link anomaly detection

  15. Reliability of CHAMP Anomaly Continuations

    NASA Technical Reports Server (NTRS)

    von Frese, Ralph R. B.; Kim, Hyung Rae; Taylor, Patrick T.; Asgharzadeh, Mohammad F.

    2003-01-01

    CHAMP is recording state-of-the-art magnetic and gravity field observations at altitudes ranging over roughly 300 - 550 km. However, anomaly continuation is severely limited by the non-uniqueness of the process and by satellite anomaly errors. Indeed, our numerical anomaly simulations from satellite to airborne altitudes show that effective downward continuations of the CHAMP data are restricted to within approximately 50 km of the observation altitudes, while upward continuations can be effective over a somewhat larger altitude range. The great unreliability of downward continuation requires that the satellite geopotential observations be analyzed at satellite altitudes if the anomaly details are to be exploited most fully. Given current anomaly error levels, joint inversion of satellite and near-surface anomalies is the best approach for implementing satellite geopotential observations in subsurface studies. We demonstrate the power of this approach using a crustal model constrained by joint inversions of near-surface and satellite magnetic and gravity observations for Maude Rise, Antarctica, in the southwestern Indian Ocean. Our modeling suggests that the dominant satellite-altitude magnetic anomalies are produced by crustal thickness variations and remanent magnetization of the normal-polarity Cretaceous Quiet Zone.
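    The altitude dependence at issue here is the textbook potential-field continuation operator. A minimal Fourier-domain sketch (illustrative only, not the joint-inversion procedure of the abstract; square grid spacing assumed) shows why upward continuation is stable while downward continuation, which divides by the same exponential, amplifies short-wavelength noise:

```python
import numpy as np

def upward_continue(grid, dx, dz):
    """Upward-continue a gridded anomaly field by dz > 0.

    Applies the standard continuation operator exp(-|k| * dz) in the
    wavenumber domain; `dx` is the (square) grid spacing. Downward
    continuation would multiply by exp(+|k| * dz) instead, exponentially
    amplifying high-wavenumber noise, which is why it is only reliable
    over short distances.
    """
    ny, nx = grid.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    kk = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)  # |k| on the grid
    return np.real(np.fft.ifft2(np.fft.fft2(grid) * np.exp(-kk * dz)))
```

    The mean (k = 0) component passes unchanged while short wavelengths are smoothed away, mirroring the attenuation of crustal anomalies seen at satellite altitude.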

  16. Anomalies and graded coisotropic branes

    NASA Astrophysics Data System (ADS)

    Li, Yi

    2006-03-01

    We compute the anomaly of the axial U(1) current in the A-model on a Calabi-Yau manifold, in the presence of coisotropic branes discovered by Kapustin and Orlov. Our results relate the anomaly-free condition to a recently proposed definition of graded coisotropic branes in Calabi-Yau manifolds. More specifically, we find that a coisotropic brane is anomaly-free if and only if it is gradable. We also comment on a different grading for coisotropic submanifolds introduced recently by Oh.

  17. Evaluation Using Sequential Trials Methods.

    ERIC Educational Resources Information Center

    Cohen, Mark E.; Ralls, Stephen A.

    1986-01-01

    Although dental school faculty as well as practitioners are interested in evaluating products and procedures used in clinical practice, research design and statistical analysis can sometimes pose problems. Sequential trials methods provide an analytical structure that is both easy to use and statistically valid. (Author/MLW)

  18. Sequential Effects in Essay Ratings

    ERIC Educational Resources Information Center

    Attali, Yigal

    2011-01-01

    Contrary to previous research on sequential ratings of student performance, this study found that professional essay raters of a large-scale standardized testing program produced ratings that were drawn toward previous ratings, creating an assimilation effect. Longer intervals between the two adjacent ratings and higher degree of agreement with…

  19. Discovering System Health Anomalies Using Data Mining Techniques

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.

    2005-01-01

    We present a data mining framework for the analysis and discovery of anomalies in high-dimensional time series of sensor measurements that would be found in an Integrated System Health Monitoring system. We specifically treat the problem of discovering anomalous features in the time series that may be indicative of a system anomaly, or in the case of a manned system, an anomaly due to the human. Identification of these anomalies is crucial to building stable, reusable, and cost-efficient systems. The framework consists of an analysis platform and new algorithms that can scale to thousands of sensor streams to discover temporal anomalies. We discuss the mathematical framework that underlies the system and also describe in detail how this framework is general enough to encompass both discrete and continuous sensor measurements. We also describe a new set of data mining algorithms based on kernel methods and hidden Markov models that allow for the rapid assimilation, analysis, and discovery of system anomalies. We then describe the performance of the system on a real-world problem in the aircraft domain, where we analyze the cockpit data from aircraft as well as data from the aircraft propulsion, control, and guidance systems. These data are discrete and continuous sensor measurements and are dealt with seamlessly in order to discover anomalous flights. We conclude with recommendations that describe the tradeoffs in building an integrated scalable platform for robust anomaly detection in ISHM applications.
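    The streaming structure described here (update a model on past samples, score each new sample) can be illustrated with a much simpler detector than the kernel and hidden Markov model methods of the abstract; the sketch below is hypothetical and handles a single continuous sensor channel:

```python
from collections import deque
from math import sqrt

def stream_anomalies(stream, window=20, z_thresh=3.0):
    """Flag samples whose z-score against a trailing window exceeds z_thresh.

    stream   : iterable of sensor readings (one channel)
    window   : number of past samples used as the running model
    Returns the indices of flagged samples. The trailing window plays the
    role of the continuously updated model; each new point is scored
    before being absorbed into it.
    """
    buf = deque(maxlen=window)
    flags = []
    for t, x in enumerate(stream):
        if len(buf) == window:
            m = sum(buf) / window
            var = sum((v - m) ** 2 for v in buf) / window
            sd = sqrt(var) or 1e-12  # guard against a zero-variance window
            if abs(x - m) / sd > z_thresh:
                flags.append(t)
        buf.append(x)
    return flags
```

    Real ISHM pipelines replace the window statistics with a learned model per stream, but the score-then-update loop is the same.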

  20. An investigation of thermal anomalies in the Central American volcanic chain and evaluation of the utility of thermal anomaly monitoring in the prediction of volcanic eruptions. [Central America

    NASA Technical Reports Server (NTRS)

    Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.

    1975-01-01

    The author has identified the following significant results. Ground truth data collection proves that significant anomalies exist at 13 volcanoes within the test site of Central America. The dimensions and temperature contrast of these ten anomalies are large enough to be detected by the Skylab S-192 instrument. The dimensions and intensity of thermal anomalies have changed at most of these volcanoes during the Skylab mission.

  1. Genetics Home Reference: Peters anomaly

    MedlinePlus

    ... the anterior segment is abnormal, leading to incomplete separation of the cornea from the iris or the ... anomaly type I is characterized by an incomplete separation of the cornea and iris and mild to ...

  2. Classifying sex biased congenital anomalies

    SciTech Connect

    Lubinsky, M.S.

    1997-03-31

    The reasons for sex biases in congenital anomalies that arise before structural or hormonal dimorphisms are established have long been unclear. A review of such disorders shows that patterning and tissue anomalies are female-biased, while structural findings are more common in males. This suggests different gender-dependent susceptibilities to developmental disturbances, with female vulnerabilities focused on early blastogenesis/determination, while males are more likely to be affected in later organogenesis/morphogenesis. A dual origin for some anomalies explains paradoxical reductions of sex biases with greater severity (i.e., multiple rather than single malformations), presumably as more severe events increase the involvement of an otherwise minor process with biases opposite to those of the primary mechanism. The cause of these sex differences is unknown, but early dimorphisms, such as differences in growth or the presence of H-Y antigen, may be responsible. This model provides a useful rationale for understanding and classifying sex-biased congenital anomalies. 42 refs., 7 tabs.

  3. Spinal anomalies in Pfeiffer syndrome.

    PubMed

    Moore, M H; Lodge, M L; Clark, B E

    1995-05-01

Review of the spinal radiographs of a consecutive series of 11 patients with Pfeiffer syndrome presenting to the Australian Craniofacial Unit was performed. The prevalence of cervical spine fusions was high, and the pattern of fusion was complex. Isolated anomalies were evident at lower levels, including two cases of sacrococcygeal eversion. Spinal anomalies occur more frequently in the more severely involved cases of Pfeiffer syndrome, emphasizing the generalized dysostotic nature of this condition.

  4. Brain anomalies in velo-cardio-facial syndrome

    SciTech Connect

    Mitnick, R.J.; Bello, J.A.; Shprintzen, R.J.

    1994-06-15

Magnetic resonance imaging of the brain in 11 consecutively referred patients with velo-cardio-facial syndrome (VCF) showed anomalies in nine cases including small vermis, cysts adjacent to the frontal horns, and small posterior fossa. Focal signal hyperintensities in the white matter on long TR images were also noted. The nine patients showed a variety of behavioral abnormalities including mild developmental delay, learning disabilities, and characteristic personality traits typical of this common multiple-anomaly syndrome, which has been related to a microdeletion at 22q11. Analysis of the behavioral findings showed no specific pattern related to the brain anomalies, and the patients with VCF who did not have detectable brain lesions also had behavioral abnormalities consistent with VCF. The significance of the lesions is not yet known, but the high prevalence of anomalies in this sample suggests that structural brain abnormalities are probably common in VCF. 25 refs.

  5. Diabetic Ketoacidosis with Ebstein's Anomaly in an Adult.

    PubMed

    Patra, Soumya; Beeresha P, Nagamani A C; B, Ramesh; C N, Manjunath

    2016-03-01

Ebstein's anomaly is a rare form of congenital malformation of the heart, characterized by apical displacement of the septal and posterior tricuspid valve leaflets, leading to atrialisation of the right ventricle with a variable degree of malformation and displacement of the anterior leaflet. It may not be detected until late in adolescence or adulthood. The clinical manifestations of Ebstein's anomaly vary greatly. We report the case of a 35-year-old male who presented with generalized fatigue, palpitation and effort intolerance. Laboratory investigations confirmed the diagnosis of diabetic ketosis. Transthoracic echocardiography showed severe Ebstein's anomaly with severe tricuspid regurgitation, no residual atrial septal defect, but with severe right ventricular dysfunction. Although a few studies have shown a high prevalence of abnormal glucose metabolism in young adult patients with complex congenital heart disease, Ebstein's anomaly presenting with diabetic ketosis has not previously been reported.

  6. Multiplexed protein profiling by sequential affinity capture

    PubMed Central

    Ayoglu, Burcu; Birgersson, Elin; Mezger, Anja; Nilsson, Mats; Uhlén, Mathias; Nilsson, Peter

    2016-01-01

Antibody microarrays enable parallelized and miniaturized analysis of clinical samples, and have proven to provide novel insights for the analysis of different proteomes. However, there are concerns that the performance of such direct labeling, single-antibody assays is prone to off-target binding due to the sample context. To improve selectivity and sensitivity while maintaining the possibility to conduct multiplexed protein profiling, we developed a multiplexed and semi-automated sequential capture assay. This novel bead-based procedure encompasses a first antigen capture, labeling of captured protein targets on magnetic particles, combinatorial target elution and a read-out by a secondary capture bead array. We demonstrate in a proof-of-concept setting that target detection via two sequential affinity interactions reduced off-target contribution, lowered background and noise levels, and improved correlation to clinical values compared to single-binder assays. We also compared sensitivity levels with single-binder and classical sandwich assays, explored the possibility for DNA-based signal amplification, and demonstrate the applicability of the dual-capture bead-based antibody microarray for biomarker analysis. Hence, the described concept enhances the possibilities for antibody array assays to be utilized for protein profiling in body fluids and beyond. PMID:26935855

  7. Sequential estimation of surface water mass changes from daily satellite gravimetry data

    NASA Astrophysics Data System (ADS)

    Ramillien, G. L.; Frappart, F.; Gratton, S.; Vasseur, X.

    2015-03-01

    We propose a recursive Kalman filtering approach to map regional spatio-temporal variations of terrestrial water mass over large continental areas, such as South America. Instead of correcting hydrology model outputs by the GRACE observations using a Kalman filter estimation strategy, regional 2-by-2 degree water mass solutions are constructed by integration of daily potential differences deduced from GRACE K-band range rate (KBRR) measurements. Recovery of regional water mass anomaly averages obtained by accumulation of information of daily noise-free simulated GRACE data shows that convergence is relatively fast and yields accurate solutions. In the case of cumulating real GRACE KBRR data contaminated by observational noise, the sequential method of step-by-step integration provides estimates of water mass variation for the period 2004-2011 by considering a set of suitable a priori error uncertainty parameters to stabilize the inversion. Spatial and temporal averages of the Kalman filter solutions over river basin surfaces are consistent with the ones computed using global monthly/10-day GRACE solutions from official providers CSR, GFZ and JPL. They are also highly correlated to in situ records of river discharges (70-95 %), especially for the Obidos station where the total outflow of the Amazon River is measured. The sparse daily coverage of the GRACE satellite tracks limits the time resolution of the regional Kalman filter solutions, and thus the detection of short-term hydrological events.
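
    The recursive estimation step at the heart of this approach can be sketched with a scalar toy problem. This is a minimal illustration with made-up numbers, not the paper's regional 2-by-2 degree formulation: a single water-mass value is refined by one noisy "daily" observation at a time, and the posterior variance shrinks as information accumulates.

    ```python
    # Minimal sketch of sequential Kalman accumulation of daily observations.
    # All numbers (truth, noise levels, prior) are hypothetical.
    import random

    def kalman_step(x, P, z, R, Q=0.0):
        """One predict/update cycle for a scalar random-walk state.
        x, P : prior state estimate and its variance
        z, R : observation and its noise variance
        Q    : process noise added in the prediction step
        """
        P = P + Q                    # predict
        K = P / (P + R)              # Kalman gain
        x = x + K * (z - x)          # update with the innovation z - x
        P = (1.0 - K) * P            # posterior variance shrinks
        return x, P

    random.seed(0)
    truth = 5.0                      # "true" water-mass anomaly (arbitrary units)
    x, P = 0.0, 100.0                # diffuse prior
    for _ in range(200):             # one noisy observation per day
        z = truth + random.gauss(0.0, 2.0)
        x, P = kalman_step(x, P, z, R=4.0)

    print(round(x, 2), round(P, 4))  # estimate converges toward truth
    ```

    The a priori error parameters mentioned in the abstract play the role of `R` and `Q` here: inflating them stabilizes the inversion at the cost of slower convergence.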

  8. Dynamics of Sequential Decision Making

    NASA Astrophysics Data System (ADS)

    Rabinovich, Mikhail I.; Huerta, Ramón; Afraimovich, Valentin

    2006-11-01

We suggest a new paradigm for intelligent decision-making suitable for the dynamical sequential activity of animals or artificial autonomous devices, which depends on the characteristics of the internal and external world. To do this, we introduce a new class of dynamical models that are described by ordinary differential equations with a finite number of possibilities at the decision points, and that also include rules resolving this uncertainty. Our approach is based on the competition between possible cognitive states, using their stable transient dynamics. The model controls the order of choosing successive steps of a sequential activity according to the environment and the decision-making criteria. Two strategies (high-risk and risk-aversion conditions) that move the system out of an erratic environment are analyzed.

  9. Relationship Between Seismic Velocity Anomalies and Rheological Anomalies

    NASA Astrophysics Data System (ADS)

    Karato, S.

    2001-05-01

One of the ultimate goals of high-resolution Earth models is to reveal anomalies (lateral variations) in thermal and rheological structures. Although such a relationship has been well known at a qualitative level, no quantitative relationship has been established to allow estimation of anomalies in viscosity from seismological data. In this presentation, I formulate such a relationship for Earth's upper mantle, based on the latest mineral physics observations. The key in doing this is the quantitative analysis of the effects of water on seismic wave velocities. Earlier analysis indicated the importance of water on seismic wave velocities through enhanced attenuation (Karato, 1995). I have quantified this notion by combining laboratory observations on attenuation at limited conditions (Jackson et al., 1992) with the recent quantitative data on the effects of water on rheology at wider conditions (Karato and Jung, 2001). I show that both seismic wave velocities and rheology (viscosity) of Earth materials are controlled by a "rheologically effective temperature (Teff)" that depends on temperature as well as water content. Such an analysis allows us to define the relationships between velocity anomalies and anomalies in Teff, and hence anomalies in viscosity. The present formulation has been applied to the upper mantle beneath northeastern Japan, where high-resolution tomographic images are available. The results show that anomalies in effective temperature of ~+400 K occur in these regions, indicating that viscosity there could be lower than the average values by a factor of ~10 to ~1000. References Jackson, I. et al. (1992), Geophys. J. Int., 108: 517-534. Karato, S. (1995), Proc. Japan Academy, B71: 61-66. Karato, S. and Jung, H. (2001), submitted to Philos. Mag.

  10. Automatic Construction of Anomaly Detectors from Graphical Models

    SciTech Connect

    Ferragut, Erik M; Darmon, David M; Shue, Craig A; Kelley, Stephen

    2011-01-01

    Detection of rare or previously unseen attacks in cyber security presents a central challenge: how does one search for a sufficiently wide variety of types of anomalies and yet allow the process to scale to increasingly complex data? In particular, creating each anomaly detector manually and training each one separately presents untenable strains on both human and computer resources. In this paper we propose a systematic method for constructing a potentially very large number of complementary anomaly detectors from a single probabilistic model of the data. Only one model needs to be trained, but numerous detectors can then be implemented. This approach promises to scale better than manual methods to the complex heterogeneity of real-life data. As an example, we develop a Latent Dirichlet Allocation probability model of TCP connections entering Oak Ridge National Laboratory. We show that several detectors can be automatically constructed from the model and will provide anomaly detection at flow, sub-flow, and host (both server and client) levels. This demonstrates how the fundamental connection between anomaly detection and probabilistic modeling can be exploited to develop more robust operational solutions.
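
    The core idea, that a single trained probabilistic model yields many detectors by scoring different marginals with the negative log-likelihood, can be sketched compactly. The per-feature Gaussian model, thresholds, and traffic features below are illustrative stand-ins for the paper's Latent Dirichlet Allocation model of TCP connections.

    ```python
    # Sketch: fit ONE probability model of "normal" data, then derive several
    # complementary anomaly detectors from it. Features and units are hypothetical.
    import math, random

    def fit_gaussian(xs):
        mu = sum(xs) / len(xs)
        var = sum((x - mu) ** 2 for x in xs) / len(xs)
        return mu, max(var, 1e-9)

    def nll(x, mu, var):
        # negative log-likelihood under N(mu, var); higher = more anomalous
        return 0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

    random.seed(1)
    # two features of "normal" traffic, e.g. bytes and duration
    normal = [(random.gauss(100, 10), random.gauss(5, 1)) for _ in range(1000)]
    models = [fit_gaussian([row[i] for row in normal]) for i in range(2)]

    def detectors(row):
        # one trained model yields per-feature AND joint detectors "for free"
        per_feature = [nll(x, *m) for x, m in zip(row, models)]
        joint = sum(per_feature)          # independence assumption
        return per_feature, joint

    _, joint_ok = detectors((102.0, 5.2))   # typical point
    _, joint_bad = detectors((100.0, 25.0)) # anomalous duration only
    print(joint_ok < joint_bad)
    ```

    The flow-, sub-flow-, and host-level detectors in the paper are analogous: each scores a different conditional or marginal of the same trained model.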

  11. MAGSAT anomaly map and continental drift

    NASA Technical Reports Server (NTRS)

    Lemouel, J. L. (Principal Investigator); Galdeano, A.; Ducruix, J.

    1981-01-01

    Anomaly maps of high quality are needed to display unambiguously the so called long wave length anomalies. The anomalies were analyzed in terms of continental drift and the nature of their sources is discussed. The map presented confirms the thinness of the oceanic magnetized layer. Continental magnetic anomalies are characterized by elongated structures generally of east-west trend. Paleomagnetic reconstruction shows that the anomalies found in India, Australia, and Antarctic exhibit a fair consistency with the African anomalies. It is also shown that anomalies are locked under the continents and have a fixed geometry.

  12. Method for locating underground anomalies by diffraction of electromagnetic waves passing between spaced boreholes

    DOEpatents

    Lytle, R. Jeffrey; Lager, Darrel L.; Laine, Edwin F.; Davis, Donald T.

    1979-01-01

Underground anomalies or discontinuities, such as holes, tunnels, and caverns, are located by lowering an electromagnetic signal transmitting antenna down one borehole and a receiving antenna down another, the ground to be surveyed for anomalies being situated between the boreholes. Electronic transmitting and receiving equipment associated with the antennas is activated and the antennas are lowered in unison at the same rate down their respective boreholes a plurality of times, each time with the receiving antenna at a different level with respect to the transmitting antenna. The transmitted electromagnetic waves diffract at each edge of an anomaly. This produces a minimum in the signal received at the receiving antenna. Triangulation of the straight lines between the antennas, for the depths at which the signal minima are detected, precisely locates the anomaly. Alternatively, phase shifts of the transmitted waves may be detected to locate an anomaly, the phase shift being distinctive for the waves directed at the anomaly.
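
    The triangulation step reduces to intersecting straight transmitter-receiver lines. A sketch with made-up geometry (borehole spacing and antenna depths are hypothetical):

    ```python
    # Each signal minimum defines a straight line from the transmitter at x = 0
    # to the receiver at x = D; the anomaly lies where two such lines cross.
    def intersect(z_tx1, z_rx1, z_tx2, z_rx2, D):
        """Intersection of two transmitter-receiver lines (boreholes D apart)."""
        s1 = (z_rx1 - z_tx1) / D          # slope of line 1
        s2 = (z_rx2 - z_tx2) / D          # slope of line 2
        x = (z_tx2 - z_tx1) / (s1 - s2)   # solve z1(x) = z2(x)
        return x, z_tx1 + s1 * x

    # two antenna configurations that both showed a signal minimum
    x, z = intersect(20.0, 40.0, 40.0, 20.0, D=100.0)
    print(round(x, 6), round(z, 6))       # anomaly located near (50, 30)
    ```

    Lowering both antennas in unison at several fixed offsets, as the patent describes, is what supplies the two (or more) crossing lines.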

  13. Ant colony optimization-based firewall anomaly mitigation engine.

    PubMed

    Penmatsa, Ravi Kiran Varma; Vatsavayi, Valli Kumari; Samayamantula, Srinivas Kumar

    2016-01-01

    A firewall is the most essential component of network perimeter security. Due to human error and the involvement of multiple administrators in configuring firewall rules, there exist common anomalies in firewall rulesets such as Shadowing, Generalization, Correlation, and Redundancy. There is a need for research on efficient ways of resolving such anomalies. The challenge is also to see that the reordered or resolved ruleset conforms to the organization's framed security policy. This study proposes an ant colony optimization (ACO)-based anomaly resolution and reordering of firewall rules called ACO-based firewall anomaly mitigation engine. Modified strategies are also introduced to automatically detect these anomalies and to minimize manual intervention of the administrator. Furthermore, an adaptive reordering strategy is proposed to aid faster reordering when a new rule is appended. The proposed approach was tested with different firewall policy sets. The results were found to be promising in terms of the number of conflicts resolved, with minimal availability loss and marginal security risk. This work demonstrated the application of a metaheuristic search technique, ACO, in improving the performance of a packet-filter firewall with respect to mitigating anomalies in the rules, and at the same time demonstrated conformance to the security policy.
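
    Two of the four ruleset anomalies named above can be sketched with a toy matcher. The rule format is hypothetical and far simpler than real packet filters; this is not the paper's ACO engine, only the anomaly definitions it resolves: a later rule is shadowed if an earlier rule matches a superset of its packets with a different action, and redundant if the actions agree.

    ```python
    # Toy shadowing/redundancy audit over an ordered firewall ruleset.
    def covers(a, b):
        """True if rule a's address/port ranges contain rule b's."""
        return all(a[f][0] <= b[f][0] and b[f][1] <= a[f][1]
                   for f in ("src", "dst", "port"))

    def audit(rules):
        findings = []
        for j, rj in enumerate(rules):
            for i in range(j):                 # only earlier rules can shadow
                ri = rules[i]
                if covers(ri, rj):
                    kind = ("redundancy" if ri["action"] == rj["action"]
                            else "shadowing")
                    findings.append((kind, i, j))
        return findings

    rules = [
        {"src": (0, 255), "dst": (0, 255), "port": (0, 65535), "action": "deny"},
        {"src": (10, 20), "dst": (0, 255), "port": (80, 80),   "action": "allow"},
        {"src": (0, 255), "dst": (0, 255), "port": (0, 65535), "action": "deny"},
    ]
    print(audit(rules))   # rule 1 is shadowed by rule 0; rule 2 is redundant
    ```

    Generalization and correlation require partial-overlap tests rather than full containment, which is where reordering (the ACO search space in the paper) becomes non-trivial.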

  14. Shortening anomalies in supersymmetric theories

    NASA Astrophysics Data System (ADS)

    Gomis, Jaume; Komargodski, Zohar; Ooguri, Hirosi; Seiberg, Nathan; Wang, Yifan

    2017-01-01

We present new anomalies in two-dimensional N=(2,2) superconformal theories. They obstruct the shortening conditions of chiral and twisted chiral multiplets at coincident points. This implies that marginal couplings cannot be promoted to background superfields in short representations. Therefore, standard results that follow from N=(2,2) spurion analysis are invalidated. These anomalies appear only if supersymmetry is enhanced beyond N=(2,2). These anomalies explain why the conformal manifolds of the K3 and T4 sigma models are not Kähler and do not factorize into chiral and twisted chiral moduli spaces, and why there are no N=(2,2) gauged linear sigma models that cover these conformal manifolds. We also present these results from the point of view of the Riemann curvature of conformal manifolds.

  15. Spacecraft environmental anomalies expert system

    NASA Technical Reports Server (NTRS)

    Koons, H. C.; Gorney, D. J.

    1988-01-01

    A microcomputer-based expert system is being developed at the Aerospace Corporation Space Sciences Laboratory to assist in the diagnosis of satellite anomalies caused by the space environment. The expert system is designed to address anomalies caused by surface charging, bulk charging, single event effects and total radiation dose. These effects depend on the orbit of the satellite, the local environment (which is highly variable), the satellite exposure time and the hardness of the circuits and components of the satellite. The expert system is a rule-based system that uses the Texas Instruments Personal Consultant Plus expert system shell. The completed expert system knowledge base will include 150 to 200 rules, as well as a spacecraft attributes database, an historical spacecraft anomalies database, and a space environment database which is updated in near real-time. Currently, the expert system is undergoing development and testing within the Aerospace Corporation Space Sciences Laboratory.

  16. The anomalies associated with congenital solitary functioning kidney in children.

    PubMed

    Akl, Kamal

    2011-01-01

The aim of this study was to determine the incidence of associated urological and non-urological anomalies as well as the renal outcome in patients with a congenital solitary functioning kidney (CSFK). A retrospective review of 30 consecutive cases of CSFK seen at the pediatric renal service at the Jordan University Hospital between 2004 and 2008 was performed. There were 20 males and 10 females, whose ages ranged from five days to 14 years. In 20 patients (67%), the left kidney was absent. Associated anomalies were detected in 23 (77%) of the 30 patients; urological anomalies accounted for 47% (14/30) and non-urological anomalies were found in 19/30 (53%) patients. The latter included anomalies of the ear, nose and throat (ENT) in 9/30 (30%), musculoskeletal system (one with hypermobile joints) in 8/30 (27%), gastrointestinal (GI) in 7/30 (23%), cardiovascular (CV) in 4/30 (13%) and dermatological with epidermolysis bullosa, endocrine (euthyroid goiter) and gynecological (cervical cyst) in one patient each (3%). Proteinuria was seen in 6/30 (20%) and hypertension in 2/30 (7%) patients. Chronic renal failure (CRF) was seen in 6/30 (20%) patients, of whom three had end-stage renal failure (ESRF). CRF was seen mainly in patients with more than two associated urological anomalies. Idiopathic hyperuricosuria was found in five of the six tested patients (83%). In our study, the most common associated anomalies with CSFK were urological. The presence of more than two associated urological anomalies increased the risk of CRF.

  17. Random sequential adsorption on fractals.

    PubMed

    Ciesla, Michal; Barbasz, Jakub

    2012-07-28

Irreversible adsorption of spheres on flat collectors having dimension d < 2 is studied. Molecules are adsorbed on Sierpinski triangle and carpet-like fractals (1 < d < 2), and on a generalized Cantor set (d < 1). The adsorption process is modeled numerically using the random sequential adsorption (RSA) algorithm. The paper concentrates on measuring fundamental properties of the coverages, i.e., the maximal random coverage ratio and the density autocorrelation function, as well as RSA kinetics. The obtained results allow us to improve the phenomenological relation between the maximal random coverage ratio and the collector dimension. Moreover, simulations show that, in general, most known dimensional properties of adsorbed monolayers remain valid for non-integer dimensions.
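
    The RSA rule itself is simple to illustrate on the most elementary collector, dimers on a 1-D lattice rather than the fractals studied above: random deposition attempts are accepted only if they do not overlap previously adsorbed particles, until the surface jams.

    ```python
    # RSA of dimers on a 1-D lattice; the jamming coverage approaches
    # Flory's 1 - e^-2 ≈ 0.8647 for large lattices.
    import random

    def rsa_dimers(n_sites, seed=0):
        rng = random.Random(seed)
        occupied = [False] * n_sites
        # Visiting each candidate position once in uniformly random order is
        # equivalent to attempting with replacement, since a failed attempt
        # never changes the surface.
        order = list(range(n_sites - 1))
        rng.shuffle(order)
        placed = 0
        for i in order:
            if not occupied[i] and not occupied[i + 1]:
                occupied[i] = occupied[i + 1] = True
                placed += 1
        return 2 * placed / n_sites       # jamming coverage ratio

    theta = rsa_dimers(100_000)
    print(round(theta, 3))                # close to 0.865
    ```

    On a fractal collector the only change is the geometry test: an attempt is accepted if the sphere's footprint lies on the fractal and clears all previously adsorbed spheres.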

  18. A high speed sequential decoder

    NASA Technical Reports Server (NTRS)

    Lum, H., Jr.

    1972-01-01

The performance and theory of operation for the High Speed Hard Decision Sequential Decoder are delineated. The decoder is a forward error correction system which is capable of accepting data from binary-phase-shift-keyed and quadriphase-shift-keyed modems at input data rates up to 30 megabits per second. Test results show that the decoder is capable of maintaining a composite error rate of 0.00001 at an input Eb/N0 of 5.6 dB. This performance has been obtained with minimum circuit complexity.
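
    For context, a quick back-of-envelope calculation (the textbook BPSK-over-AWGN formula, not figures from the report) shows what the same channel would deliver without coding, which is the gap the sequential decoder closes:

    ```python
    # Uncoded coherent BPSK bit error rate at the quoted Eb/N0 of 5.6 dB.
    import math

    def q(x):
        """Gaussian tail probability Q(x)."""
        return 0.5 * math.erfc(x / math.sqrt(2.0))

    ebn0_db = 5.6
    ebn0 = 10 ** (ebn0_db / 10)           # dB -> linear
    ber_uncoded = q(math.sqrt(2 * ebn0))  # BER = Q(sqrt(2 Eb/N0))
    print(f"{ber_uncoded:.1e}")           # roughly 3.5e-3, vs 1e-5 with coding
    ```

    The two-orders-of-magnitude improvement at the same Eb/N0 is the coding gain the hard-decision sequential decoder provides.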

  19. The Sequential Parameter Optimization Toolbox

    NASA Astrophysics Data System (ADS)

    Bartz-Beielstein, Thomas; Lasarczyk, Christian; Preuss, Mike

The sequential parameter optimization toolbox (SPOT) is one possible implementation of the SPO framework introduced in Chap. 2. It has been successfully applied to numerous heuristics for practical and theoretical optimization problems. We describe the mechanics and interfaces employed by SPOT to enable users to plug in their own algorithms. Furthermore, two case studies are presented to demonstrate how SPOT can be applied in practice, followed by a discussion of alternative metamodels to be plugged into it. We conclude with some general guidelines.
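
    The loop SPOT implements can be caricatured in a few lines: evaluate an initial design, fit a cheap metamodel, evaluate the model's proposed optimum, and repeat. The sketch below is a deliberately tiny stand-in, a parabola through the three best points instead of SPOT's richer metamodels, and the objective is a hypothetical tuning curve for an expensive heuristic.

    ```python
    # Toy sequential model-based parameter optimization (1-D, parabolic metamodel).
    import math

    def objective(p):                       # stands in for an expensive heuristic run
        return math.cosh(p - 1.7)           # unknown tuning curve, minimum at p = 1.7

    def propose(pts):
        """Vertex of the parabola through three (p, y) design points."""
        (x1, y1), (x2, y2), (x3, y3) = pts
        num = (x2 - x1) ** 2 * (y2 - y3) - (x2 - x3) ** 2 * (y2 - y1)
        den = (x2 - x1) * (y2 - y3) - (x2 - x3) * (y2 - y1)
        return x2 - 0.5 * num / den

    design = [(p, objective(p)) for p in (0.0, 1.0, 3.0)]  # initial design
    for _ in range(8):                                     # sequential refinement
        design.sort(key=lambda t: t[1])                    # three best points first
        try:
            p_new = propose(design[:3])
        except ZeroDivisionError:                          # degenerate metamodel:
            break                                          # design has converged
        p_new = max(-5.0, min(5.0, p_new))                 # stay in search region
        design.append((p_new, objective(p_new)))

    best_p, _ = min(design, key=lambda t: t[1])
    print(round(best_p, 3))                                # near 1.7
    ```

    SPOT's plug-in interface generalizes exactly the two functions here: the user supplies `objective` (their algorithm run) and SPOT supplies the metamodel and proposal step.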

  20. Thermal anomalies in stressed Teflon.

    NASA Technical Reports Server (NTRS)

    Lee, S. H.; Wulff, C. A.

    1972-01-01

In the course of testing polytetrafluoroethylene (Teflon) as a calorimetric gasketing material, serendipity revealed a thermal anomaly in stressed film that occurs concomitantly with the well-documented 25 °C transition. The magnitude of the excess energy absorption, about 35 cal/g, is suggested to be related to the restricted thermal expansion of the film.