Sample records for sequential anomaly detection

  1. Anomaly Detection in Dynamic Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turcotte, Melissa

    2014-10-14

    Anomaly detection in dynamic communication networks has many important security applications. These networks can be extremely large and so detecting any changes in their structure can be computationally challenging; hence, computationally fast, parallelisable methods for monitoring the network are paramount. For this reason the methods presented here use independent node and edge based models to detect locally anomalous substructures within communication networks. As a first stage, the aim is to detect changes in the data streams arising from node or edge communications. Throughout the thesis simple, conjugate Bayesian models for counting processes are used to model these data streams. A second stage of analysis can then be performed on a much reduced subset of the network comprising nodes and edges which have been identified as potentially anomalous in the first stage. The first method assumes communications in a network arise from an inhomogeneous Poisson process with piecewise constant intensity. Anomaly detection is then treated as a changepoint problem on the intensities. The changepoint model is extended to incorporate seasonal behavior inherent in communication networks. This seasonal behavior is also viewed as a changepoint problem acting on a piecewise constant Poisson process. In a static time frame, inference is made on this extended model via a Gibbs sampling strategy. In a sequential time frame, where the data arrive as a stream, a novel, fast Sequential Monte Carlo (SMC) algorithm is introduced to sample from the sequence of posterior distributions of the change points over time. A second method is considered for monitoring communications in a large scale computer network. The usage patterns in these types of networks are very bursty in nature and do not fit a Poisson process model. For tractable inference, discrete time models are considered, where the data are aggregated into discrete time periods and probability models are fitted to the
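    A minimal sketch of the first-stage monitoring idea — a simple, conjugate Bayesian model for a counting process — is shown below. Because the Gamma-Poisson pair is conjugate, the posterior predictive for the next period's count is negative binomial, and its upper tail gives a surprise score for each new observation. The `EdgeMonitor` class and the traffic numbers are illustrative assumptions, not the thesis's exact model.

```python
import math

def negbin_pmf(k, r, p):
    # Posterior predictive P(count = k) under a Gamma(r, beta) posterior over
    # the Poisson rate: negative binomial with success probability p.
    return math.exp(math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
                    + r * math.log(p) + k * math.log(1.0 - p))

class EdgeMonitor:
    """Conjugate Gamma-Poisson model for the count stream on one edge."""

    def __init__(self, alpha=1.0, beta=1.0):
        self.alpha, self.beta = alpha, beta   # Gamma prior shape and rate

    def surprise(self, count):
        # Upper-tail p-value of the posterior predictive; small = anomalous.
        p = self.beta / (self.beta + 1.0)
        return 1.0 - sum(negbin_pmf(k, self.alpha, p) for k in range(count))

    def update(self, count):
        # Conjugate update after observing one time period's count.
        self.alpha += count
        self.beta += 1.0

mon = EdgeMonitor()
for c in [2, 3, 2, 1]:        # typical traffic on this edge
    mon.update(c)
score = mon.surprise(20)      # a sudden burst of 20 messages scores as surprising
```

    In the two-stage scheme described above, edges whose surprise falls below a small threshold would be passed to the second, more expensive stage of analysis.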

  2. Unsupervised Sequential Outlier Detection With Deep Architectures.

    PubMed

    Lu, Weining; Cheng, Yu; Xiao, Cao; Chang, Shiyu; Huang, Shuai; Liang, Bin; Huang, Thomas

    2017-09-01

    Unsupervised outlier detection is a vital task with high impact on a wide variety of application domains, such as image analysis and video surveillance. It has also received long-standing attention and has been extensively studied in multiple research areas. Detecting and acting on outliers as quickly as possible is imperative in order to protect networks and related stakeholders or to maintain the reliability of critical systems. However, outlier detection is difficult due to its one-class nature and the challenges in feature construction. Sequential anomaly detection is even harder, with additional challenges from temporal correlation in the data, as well as the presence of noise and high dimensionality. In this paper, we introduce a novel deep structured framework to solve the challenging sequential outlier detection problem. We use autoencoder models to capture the intrinsic difference between outliers and normal instances, and integrate these models with recurrent neural networks, which allows the learning to make use of previous context and makes the learners more robust to warping along the time axis. Furthermore, we propose a layerwise training procedure, which significantly simplifies training and hence helps achieve efficient and scalable training. In addition, we investigate a fine-tuning step to update the full parameter set by incorporating the temporal correlation in the sequence. We further apply our proposed models to conduct systematic experiments on five real-world benchmark data sets. Experimental results demonstrate the effectiveness of our model compared with other state-of-the-art approaches.
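    The core scoring principle here — train a model on normal data and flag instances it reconstructs poorly — can be illustrated far below the paper's scale with a one-latent-dimension, tied-weight linear autoencoder trained by plain gradient descent. This is a deliberately minimal stand-in for the deep recurrent architecture; the data and dimensions are made up.

```python
# Normal points lie near the line y = x (with small deterministic jitter);
# the autoencoder learns this subspace and reconstructs off-subspace outliers badly.
normal = [(t / 10.0, t / 10.0 + (0.05 if t % 2 else -0.05)) for t in range(-10, 11)]

w = [1.0, 0.0]                 # tied encoder/decoder weights
lr, epochs = 0.01, 300
for _ in range(epochs):
    for x in normal:
        z = w[0] * x[0] + w[1] * x[1]            # encode: z = w . x
        r = [x[0] - z * w[0], x[1] - z * w[1]]   # reconstruction residual
        rw = r[0] * w[0] + r[1] * w[1]
        # gradient of ||x - (w.x) w||^2 with respect to w
        w[0] -= lr * (-2 * rw * x[0] - 2 * z * r[0])
        w[1] -= lr * (-2 * rw * x[1] - 2 * z * r[1])

def recon_error(x):
    # squared reconstruction error: the anomaly score
    z = w[0] * x[0] + w[1] * x[1]
    return (x[0] - z * w[0]) ** 2 + (x[1] - z * w[1]) ** 2

normal_err = recon_error((1.0, 1.0))    # lies on the learned subspace: small
outlier_err = recon_error((1.0, -1.0))  # off-subspace: large
```

    The paper's contribution is, in effect, replacing this linear map with stacked autoencoders wired into a recurrent network so that the score at each step also depends on temporal context.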

  3. Seismic data fusion anomaly detection

    NASA Astrophysics Data System (ADS)

    Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David

    2014-06-01

    Detecting anomalies in non-stationary signals has valuable applications in many fields, including medicine and meteorology. Examples include identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes via seismographic data. Given the many available anomaly detection algorithms, it is important to compare candidate methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach uses an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other method uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives", or transformations, of the observed signal for anomalies. Possible perspectives include wavelet de-noising, the Fourier transform, peak-filtering, etc. In order to evaluate these techniques via signal fusion metrics, we must apply signal preprocessing techniques such as de-noising to the original signal and then use a neural network to find anomalies in the resulting signal. From this secondary result it is possible to use data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The result shows which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.

  4. Road Anomalies Detection System Evaluation.

    PubMed

    Silva, Nuno; Shah, Vaibhav; Soares, João; Rodrigues, Helena

    2018-06-21

    Anomalies on road pavement cause discomfort to drivers and passengers, and may cause mechanical failure or even accidents. Governments spend millions of Euros every year on road maintenance, often causing traffic jams and congestion on urban roads on a daily basis. This paper analyses the difference between the deployment of a road anomaly detection and identification system in a "conditioned" setup and in a real-world setup, where the system performed worse than in the "conditioned" setup. It also presents a system performance analysis based on the training data sets; on the complexity of the attributes, through the application of PCA techniques; and on the attributes in the context of each anomaly type, using acceleration standard deviation attributes to observe how the different anomaly classes are distributed in the Cartesian coordinate system. Overall, we describe the main insights on road anomaly detection challenges to support the design and deployment of a new iteration of our system, towards a road anomaly detection service that provides information about road conditions to drivers and government entities.

  5. Clustering and Recurring Anomaly Identification: Recurring Anomaly Detection System (ReADS)

    NASA Technical Reports Server (NTRS)

    McIntosh, Dawn

    2006-01-01

    This viewgraph presentation reviews the Recurring Anomaly Detection System (ReADS), a tool to analyze text reports, such as aviation reports and maintenance records: (1) text clustering algorithms group large quantities of reports and documents, reducing human error and fatigue; (2) it identifies interconnected reports, automating the discovery of possible recurring anomalies; and (3) it provides a visualization of the clusters and recurring anomalies. We have illustrated our techniques on data from Shuttle and ISS discrepancy reports, as well as ASRS data. ReADS has been integrated with a secure online search

  6. Adiabatic Quantum Anomaly Detection and Machine Learning

    NASA Astrophysics Data System (ADS)

    Pudenz, Kristen; Lidar, Daniel

    2012-02-01

    We present methods of anomaly detection and machine learning using adiabatic quantum computing. The machine learning algorithm is a boosting approach which seeks to optimally combine somewhat accurate classification functions to create a unified classifier which is much more accurate than its components. This algorithm then becomes the first part of the larger anomaly detection algorithm. In the anomaly detection routine, we first use adiabatic quantum computing to train two classifiers which detect two sets, the overlap of which forms the anomaly class. We call this the learning phase. Then, in the testing phase, the two learned classification functions are combined to form the final Hamiltonian for an adiabatic quantum computation, the low energy states of which represent the anomalies in a binary vector space.

  7. AnRAD: A Neuromorphic Anomaly Detection Framework for Massive Concurrent Data Streams.

    PubMed

    Chen, Qiuwen; Luley, Ryan; Wu, Qing; Bishop, Morgan; Linderman, Richard W; Qiu, Qinru

    2018-05-01

    The evolution of high performance computing technologies has enabled the large-scale implementation of neuromorphic models and pushed the research in computational intelligence into a new era. Among machine learning applications, unsupervised detection of anomalous streams is especially challenging due to the requirements of detection accuracy and real-time performance. Designing a computing framework that harnesses the growing computing power of multicore systems while maintaining high sensitivity and specificity to anomalies is an urgent research topic. In this paper, we propose anomaly recognition and detection (AnRAD), a bioinspired detection framework that performs probabilistic inferences. We analyze the feature dependency and develop a self-structuring method that learns an efficient confabulation network using unlabeled data. This network is capable of fast incremental learning, which continuously refines the knowledge base using streaming data. Compared with several existing anomaly detection approaches, our method provides competitive detection quality. Furthermore, we exploit the massively parallel structure of the AnRAD framework. Our implementations of the detection algorithm on the graphics processing unit and the Xeon Phi coprocessor both obtain substantial speedups over the sequential implementation on a general-purpose microprocessor. The framework provides real-time service to concurrent data streams within diversified knowledge contexts, and can be applied to large problems with multiple local patterns. Experimental results demonstrate high computing performance and memory efficiency. For vehicle behavior detection, the framework is able to monitor up to 16000 vehicles (data streams) and their interactions in real time with a single commodity coprocessor, and uses less than 0.2 ms for one testing subject. Finally, the detection network is ported to our spiking neural network simulator to show the potential of adapting to the emerging

  8. Conditional anomaly detection methods for patient–management alert systems

    PubMed Central

    Valko, Michal; Cooper, Gregory; Seybert, Amy; Visweswaran, Shyam; Saul, Melissa; Hauskrecht, Milos

    2010-01-01

    Anomaly detection methods can be very useful in identifying unusual or interesting patterns in data. A recently proposed conditional anomaly detection framework extends anomaly detection to the problem of identifying anomalous patterns on a subset of attributes in the data. The anomaly always depends on (is conditioned on) the values of the remaining attributes. The work presented in this paper focuses on instance-based methods for detecting conditional anomalies. The methods rely on a distance metric to identify the examples in the dataset that are most critical for detecting the anomaly. We investigate various metrics and metric learning methods to optimize the performance of the instance-based anomaly detection methods. We show the benefits of the instance-based methods on two real-world detection problems: detection of unusual admission decisions for patients with community-acquired pneumonia, and detection of unusual orders of an HPF4 test that is used to confirm heparin-induced thrombocytopenia, a life-threatening condition caused by heparin therapy. PMID:25392850
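    The instance-based idea can be sketched in a few lines: find the k training cases closest in the context attributes, then score how unusual the target attribute (e.g., the admission decision) is among those neighbours. Everything below — the plain Euclidean distance, the toy records, and the function names — is an illustrative assumption, not the paper's learned metric.

```python
def euclid2(a, b):
    # plain squared Euclidean distance on the context attributes
    return sum((x - y) ** 2 for x, y in zip(a, b))

def conditional_anomaly_score(context, target, data, k=5):
    """Fraction of the k nearest context-neighbours whose target disagrees."""
    neighbours = sorted(data, key=lambda d: euclid2(context, d[0]))[:k]
    agree = sum(1 for ctx, t in neighbours if t == target)
    return 1.0 - agree / k

# Toy records: context is a single feature, target is a binary decision
# (negative contexts historically get decision 0, positive ones decision 1).
history = [((x / 10.0,), 0) for x in range(-20, 0)] + \
          [((x / 10.0,), 1) for x in range(1, 21)]

usual = conditional_anomaly_score((-1.5,), 0, history)    # matches similar cases
unusual = conditional_anomaly_score((-1.5,), 1, history)  # contradicts them
```

    The metric-learning part of the paper amounts to replacing `euclid2` with a distance tuned so that "closest" means clinically most comparable.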

  9. Gravity anomaly detection: Apollo/Soyuz

    NASA Technical Reports Server (NTRS)

    Vonbun, F. O.; Kahn, W. D.; Bryan, J. W.; Schmid, P. E.; Wells, W. T.; Conrad, D. T.

    1976-01-01

    The Goddard Apollo-Soyuz Geodynamics Experiment is described. It was performed to demonstrate the feasibility of tracking and recovering high frequency components of the earth's gravity field by utilizing a synchronous orbiting tracking station such as ATS-6. Gravity anomalies of 5 mgal or larger having wavelengths of 300 to 1000 kilometers on the earth's surface are important for geologic studies of the upper layers of the earth's crust. Short-wavelength gravity anomalies were detected from space. Two prime areas of data collection were selected for the experiment: (1) the center of the African continent and (2) the Indian Ocean Depression centered at 5° north latitude and 75° east longitude. Preliminary results show that the detectability objective of the experiment was met in both areas as well as at several additional anomalous areas around the globe. Gravity anomalies of the Karakoram and Himalayan mountain ranges, ocean trenches, as well as the Diamantina Depth, can be seen. Maps outlining the anomalies discovered are shown.

  10. Quantum machine learning for quantum anomaly detection

    NASA Astrophysics Data System (ADS)

    Liu, Nana; Rebentrost, Patrick

    2018-04-01

    Anomaly detection is used for identifying data that deviate from "normal" data patterns. Its usage on classical data finds diverse applications in many important areas such as finance, fraud detection, medical diagnoses, data cleaning, and surveillance. With the advent of quantum technologies, anomaly detection of quantum data, in the form of quantum states, may become an important component of quantum applications. Machine-learning algorithms are playing pivotal roles in anomaly detection using classical data. Two widely used algorithms are the kernel principal component analysis and the one-class support vector machine. We find corresponding quantum algorithms to detect anomalies in quantum states. We show that these two quantum algorithms can be performed using resources that are logarithmic in the dimensionality of quantum states. For pure quantum states, these resources can also be logarithmic in the number of quantum states used for training the machine-learning algorithm. This makes these algorithms potentially applicable to big quantum data applications.

  11. Model selection for anomaly detection

    NASA Astrophysics Data System (ADS)

    Burnaev, E.; Erofeev, P.; Smolyakov, D.

    2015-12-01

    Anomaly detection based on one-class classification algorithms is broadly used in many applied domains like image processing (e.g. detecting whether a patient is "cancerous" or "healthy" from a mammography image), network intrusion detection, etc. The performance of an anomaly detection algorithm crucially depends on the kernel used to measure similarity in a feature space. The standard approaches to kernel selection used in two-class classification problems (e.g. cross-validation) cannot be applied directly due to the specific nature of the data: the absence of data from a second, abnormal class. In this paper we generalize several kernel selection methods from the two-class case to the case of one-class classification and perform an extensive comparison of these approaches using both synthetic and real-world data.
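    One way to see why kernel (bandwidth) selection matters in the one-class setting is a toy kernel mean score: score new points by their average RBF similarity to the training sample, and pick the bandwidth that best separates held-out normal points from artificial uniform outliers. This separation heuristic is an assumed, simplified stand-in for the selection criteria compared in the paper; all data and the gamma grid are made up.

```python
import math, random
random.seed(0)

train = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(80)]
heldout = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(20)]
# artificial outliers drawn uniformly over a wide box around the normal class
fake = [(random.uniform(-6, 6), random.uniform(-6, 6)) for _ in range(20)]

def score(x, gamma):
    """Kernel mean score: average RBF similarity of x to the training sample."""
    return sum(math.exp(-gamma * ((x[0] - t[0]) ** 2 + (x[1] - t[1]) ** 2))
               for t in train) / len(train)

def separation(gamma):
    # how much higher held-out normal points score than artificial outliers
    s_norm = sum(score(x, gamma) for x in heldout) / len(heldout)
    s_fake = sum(score(x, gamma) for x in fake) / len(fake)
    return s_norm - s_fake

best_gamma = max([0.01, 0.1, 0.5, 1.0, 5.0], key=separation)
normal_score = score((0.0, 0.0), best_gamma)
outlier_score = score((6.0, 6.0), best_gamma)   # far from the normal class
```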

  12. A model for anomaly classification in intrusion detection systems

    NASA Astrophysics Data System (ADS)

    Ferreira, V. O.; Galhardi, V. V.; Gonçalves, L. B. L.; Silva, R. C.; Cansian, A. M.

    2015-09-01

    Intrusion Detection Systems (IDS) are traditionally divided into two types according to the detection methods they employ, namely (i) misuse detection and (ii) anomaly detection. Anomaly detection has been widely used, and its main advantage is the ability to detect new attacks. However, the analysis of the anomalies generated can become expensive, since they often carry no clear information about the malicious events they represent. In this context, this paper presents a model for the automated classification of alerts generated by an anomaly-based IDS. The main goal is either to classify the detected anomalies into well-defined attack taxonomies or to identify whether an alert is a false positive misclassified by the IDS. Some common attacks on computer networks were considered, and we achieved important results that can equip security analysts with better resources for their analyses.

  13. An immunity-based anomaly detection system with sensor agents.

    PubMed

    Okamoto, Takeshi; Ishida, Yoshiteru

    2009-01-01

    This paper proposes an immunity-based anomaly detection system with sensor agents based on the specificity and diversity of the immune system. Each agent is specialized to react to the behavior of a specific user. Multiple diverse agents decide whether the behavior is normal or abnormal. Conventional systems have used only a single sensor to detect anomalies, while the immunity-based system makes use of multiple sensors, which leads to improvements in detection accuracy. In addition, we propose an evaluation framework for the anomaly detection system, which is capable of evaluating the differences in detection accuracy between internal and external anomalies. This paper focuses on anomaly detection in users' command sequences on UNIX-like systems. In experiments, the immunity-based system outperformed some of the best conventional systems.

  14. Multiuser signal detection using sequential decoding

    NASA Astrophysics Data System (ADS)

    Xie, Zhenhua; Rushforth, Craig K.; Short, Robert T.

    1990-05-01

    The application of sequential decoding to the detection of data transmitted over the additive white Gaussian noise channel by K asynchronous transmitters using direct-sequence spread-spectrum multiple access is considered. A modification of Fano's (1963) sequential-decoding metric, allowing the messages from a given user to be safely decoded if their Eb/N0 exceeds -1.6 dB, is presented. Computer simulation is used to evaluate the performance of a sequential decoder that uses this metric in conjunction with the stack algorithm. In many circumstances, the sequential decoder achieves results comparable to those obtained using the much more complicated optimal receiver.

  15. Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.

    PubMed

    Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda

    2014-05-01

    We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus, which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of an anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is here applied to the problem of anomaly detection for a video annotation system.

  16. Anomalies in the detection of change: When changes in sample size are mistaken for changes in proportions.

    PubMed

    Fiedler, Klaus; Kareev, Yaakov; Avrahami, Judith; Beier, Susanne; Kutzner, Florian; Hütter, Mandy

    2016-01-01

    Detecting changes in performance, sales, markets, risks, social relations, or public opinion constitutes an important adaptive function. In a sequential paradigm devised to investigate the detection of change, every trial provides a sample of binary outcomes (e.g., correct vs. incorrect student responses). Participants have to decide whether the proportion of a focal feature (e.g., correct responses) in the population from which the sample is drawn has decreased, remained constant, or increased. Strong and persistent anomalies in change detection arise when changes in proportional quantities vary orthogonally to changes in absolute sample size. Proportional increases are readily detected, and nonchanges are erroneously perceived as increases, when absolute sample size increases. Conversely, decreasing sample size facilitates the correct detection of proportional decreases and the erroneous perception of nonchanges as decreases. These anomalies are, however, confined to experienced samples of elementary raw events from which proportions have to be inferred inductively. They disappear when sample proportions are described as percentages in a normalized probability format. To explain these challenging findings, it is essential to understand the inductive-learning constraints imposed on decisions from experience.

  17. A Survey on Anomaly Based Host Intrusion Detection System

    NASA Astrophysics Data System (ADS)

    Jose, Shijoe; Malathi, D.; Reddy, Bharath; Jayaseeli, Dorathi

    2018-04-01

    An intrusion detection system (IDS) is hardware, software, or a combination of the two that monitors network or system activities to detect malicious signs. In computer security, designing a robust intrusion detection system is one of the most fundamental and important problems. The primary function of such a system is to detect intrusions and raise alerts in a timely manner; on detecting an intrusion, the IDS sends an alert message to the system administrator. Anomaly detection is an important problem that has been researched within diverse research areas and application domains. This survey tries to provide a structured and comprehensive overview of the research on anomaly detection. Each existing anomaly detection technique has relative strengths and weaknesses. The current state of experimental practice in the field of anomaly-based intrusion detection is reviewed, and recent studies are surveyed. This survey provides a study of existing anomaly detection techniques and of how the techniques used in one area can be applied in another application domain.

  18. Statistical Traffic Anomaly Detection in Time-Varying Communication Networks

    DTIC Science & Technology

    2015-02-01

    Our methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. The proposed approach is useful not only for anomaly detection but also for understanding the normal traffic in time-varying networks.

  19. Statistical Traffic Anomaly Detection in Time Varying Communication Networks

    DTIC Science & Technology

    2015-02-01

    Our methods perform better than their vanilla counterparts, which assume that normal traffic is stationary. The proposed approach is useful not only for anomaly detection but also for understanding the normal traffic in time-varying networks.

  20. Evaluation of Anomaly Detection Method Based on Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Fontugne, Romain; Himura, Yosuke; Fukuda, Kensuke

    The number of threats on the Internet is rapidly increasing, and anomaly detection has become of increasing importance. High-speed backbone traffic is particularly affected, and its analysis is a complicated task due to the amount of data, the lack of payload data, asymmetric routing, and the use of sampling techniques. Most anomaly detection schemes focus on the statistical properties of network traffic and highlight anomalous traffic through its singularities. In this paper, we concentrate on unusual traffic distributions, which are easily identifiable in temporal-spatial space (e.g., time/address or port). We present an anomaly detection method that uses a pattern recognition technique to identify anomalies in pictures representing traffic. The main advantage of this method is its ability to detect attacks involving mice flows. We evaluate the parameter set and the effectiveness of this approach by analyzing six years of Internet traffic collected from a trans-Pacific link. We show several examples of detected anomalies and compare our results with those of two other methods. The comparison indicates that the anomalies detected only by the pattern-recognition-based method are mainly malicious traffic with a few packets.

  1. An incremental anomaly detection model for virtual machines.

    PubMed

    Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu

    2017-01-01

    The Self-Organizing Map (SOM) algorithm, an unsupervised learning method, has been applied to anomaly detection due to its capabilities of self-organization and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Moreover, cloud platforms with large numbers of virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing characteristics, which leaves the algorithm with low accuracy and poor scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate detection by taking into account the large scale and highly dynamic nature of virtual machines on cloud platforms. To demonstrate its effectiveness, experiments on the common KDD Cup benchmark dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms.
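    Two of the named ingredients — a Weighted Euclidean Distance and a SOM scored by quantization error — can be sketched as follows. The feature weights, data, and class names are illustrative, and the data-driven initialization below is only a crude stand-in for the paper's heuristic-based initialization and neighborhood-based search.

```python
import math, random
random.seed(1)

FEATURE_WEIGHTS = [1.0, 0.5]   # illustrative per-feature WED weights

def wed2(a, b):
    """Squared Weighted Euclidean Distance."""
    return sum(w * (x - y) ** 2 for w, x, y in zip(FEATURE_WEIGHTS, a, b))

class MiniSOM:
    def __init__(self, n_units, data):
        # initialize units from observed data instead of purely at random
        self.units = [list(random.choice(data)) for _ in range(n_units)]

    def bmu(self, x):
        # best-matching unit under the weighted distance
        return min(range(len(self.units)), key=lambda i: wed2(self.units[i], x))

    def train(self, data, epochs=20, lr=0.5):
        for e in range(epochs):
            rate = lr * (1 - e / epochs)            # decaying learning rate
            for x in data:
                b = self.bmu(x)
                for i, u in enumerate(self.units):
                    h = math.exp(-abs(i - b))       # 1-D neighborhood kernel
                    for d in range(len(u)):
                        u[d] += rate * h * (x[d] - u[d])

    def score(self, x):
        # quantization error: large values flag anomalous observations
        return wed2(self.units[self.bmu(x)], x)

# Two "normal" operating modes plus one far-away test point.
data = ([(random.gauss(0, 0.2), random.gauss(0, 0.2)) for _ in range(50)] +
        [(random.gauss(5, 0.2), random.gauss(5, 0.2)) for _ in range(50)])
som = MiniSOM(8, data)
som.train(data)
typical_score = som.score((0.0, 0.0))
anomaly_score = som.score((20.0, 20.0))   # far from both normal modes
```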

  2. Topological anomaly detection performance with multispectral polarimetric imagery

    NASA Astrophysics Data System (ADS)

    Gartley, M. G.; Basener, W.

    2009-05-01

    Polarimetric imaging has demonstrated utility for increasing the contrast of manmade targets above natural background clutter. Manual detection of manmade targets in multispectral polarimetric imagery can be a challenging and subjective process for large datasets. Analyst exploitation may be improved by utilizing conventional anomaly detection algorithms such as RX. In this paper we examine the performance of a relatively new approach to anomaly detection, which leverages topology theory, applied to spectral polarimetric imagery. Detection results for manmade targets embedded in a complex natural background are presented for both the RX and Topological Anomaly Detection (TAD) approaches. We also present detailed results examining detection sensitivities relative to: (1) the number of spectral bands, (2) utilization of Stokes images versus intensity images, and (3) airborne versus spaceborne measurements.

  3. Network anomaly detection system with optimized DS evidence theory.

    PubMed

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have applied fusion methods and DS evidence theory to network anomaly detection, but with low performance, and they did not consider the complicated and varied features of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add weights for each sensor to optimize DS evidence theory according to its previous prediction accuracy. And RBPA employs each sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that the RBPA and ODS optimization methods can improve system performance significantly.
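    The combination step such a system builds on is Dempster's rule. A minimal sketch for a two-hypothesis frame {normal, anomalous} follows; the BPA numbers are made up for illustration, and the paper's per-sensor weighting and RBPA refinements are not shown.

```python
# Frame of discernment: 'N' (normal), 'A' (anomalous), 'NA' (ignorance, {N, A}).
def intersect(a, b):
    if a == 'NA':
        return b
    if b == 'NA':
        return a
    return a if a == b else None      # None = empty set (conflicting evidence)

def dempster(m1, m2):
    """Combine two basic probability assignments with Dempster's rule."""
    combined = {'N': 0.0, 'A': 0.0, 'NA': 0.0}
    conflict = 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            s = intersect(a, b)
            if s is None:
                conflict += pa * pb   # mass assigned to the empty set
            else:
                combined[s] += pa * pb
    # renormalize by the non-conflicting mass
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sensors' beliefs about the same traffic window:
sensor1 = {'N': 0.7, 'A': 0.2, 'NA': 0.1}
sensor2 = {'N': 0.6, 'A': 0.3, 'NA': 0.1}
fused = dempster(sensor1, sensor2)    # agreement on 'N' is reinforced
```

    The ODS idea in the abstract amounts to discounting each sensor's masses by a reliability weight before this combination, so that historically inaccurate sensors contribute more of their mass to ignorance.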

  4. Lidar detection algorithm for time and range anomalies.

    PubMed

    Ben-David, Avishai; Davidson, Charles E; Vanderbeek, Richard G

    2007-10-10

    A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly, where the question "is a target (aerosol cloud) present at range R within time t1 to t2" is addressed, and for range anomaly, where the question "is a target present at time t within ranges R1 and R2" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one-over-range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO2 lidar measurements of the bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed.
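    The Gaussian-mixture / EM step described above can be sketched for the simplest case: a two-component 1-D mixture fitted to detection scores, whose posterior separates the "anomaly absent" and "anomaly present" hypotheses. The scores below are synthetic, and the lidar preprocessing and threshold computation are omitted.

```python
import math

# Synthetic detection scores: a background mode near 0 and a target mode near 10.
scores = [i * 0.1 for i in range(-5, 5)] + [10 + i * 0.1 for i in range(-5, 5)]

def gauss_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# EM for a two-component 1-D Gaussian mixture.
mu, var, pi = [min(scores), max(scores)], [1.0, 1.0], [0.5, 0.5]
for _ in range(50):
    # E-step: responsibility of component 0 for each point
    r0 = []
    for x in scores:
        p0 = pi[0] * gauss_pdf(x, mu[0], var[0])
        p1 = pi[1] * gauss_pdf(x, mu[1], var[1])
        r0.append(p0 / (p0 + p1))
    # M-step: re-estimate mixture weights, means, and variances
    n0 = sum(r0)
    n1 = len(scores) - n0
    pi = [n0 / len(scores), n1 / len(scores)]
    mu = [sum(r * x for r, x in zip(r0, scores)) / n0,
          sum((1 - r) * x for r, x in zip(r0, scores)) / n1]
    var = [max(1e-6, sum(r * (x - mu[0]) ** 2 for r, x in zip(r0, scores)) / n0),
           max(1e-6, sum((1 - r) * (x - mu[1]) ** 2 for r, x in zip(r0, scores)) / n1)]

def p_anomaly(x):
    """Posterior probability that a score belongs to the high (target) mode."""
    p0 = pi[0] * gauss_pdf(x, mu[0], var[0])
    p1 = pi[1] * gauss_pdf(x, mu[1], var[1])
    return p1 / (p0 + p1)
```

    A detection threshold then falls where `p_anomaly` crosses 0.5, and the fitted components give the corresponding probabilities of detection and false alarm.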

  5. Detecting Biosphere anomalies hotspots

    NASA Astrophysics Data System (ADS)

    Guanche-Garcia, Yanira; Mahecha, Miguel; Flach, Milan; Denzler, Joachim

    2017-04-01

    The amount of satellite remote sensing measurements currently available allows for applying data-driven methods to investigate environmental processes. The detection of anomalies or abnormal events is crucial to monitor the Earth system and to analyze their impacts on ecosystems and society. By means of a combination of statistical methods, this study proposes an intuitive and efficient methodology to detect those areas that present hotspots of anomalies, i.e. higher levels of abnormal or extreme events, or more severe phases, during our historical records. Biosphere variables from a preliminary version of the Earth System Data Cube developed within the CAB-LAB project (http://earthsystemdatacube.net/) have been used in this study. This database comprises several atmosphere and biosphere variables spanning 11 years (2001-2011) with 8-day temporal resolution and 0.25° global spatial resolution. In this study, we have used 10 variables that measure the biosphere. The methodology applied to detect abnormal events follows the intuitive idea that anomalies are assumed to be time steps that are not well represented by a previously estimated statistical model [1]. We combine the use of Autoregressive Moving Average (ARMA) models with a distance metric like the Mahalanobis distance to detect abnormal events in multiple biosphere variables. In a first step we pre-treat the variables by removing the seasonality and normalizing them locally (μ=0, σ=1). Additionally, we have regionalized the area of study into subregions of similar climate conditions using the Köppen climate classification. For each climate region and variable we have selected the best ARMA parameters by means of a Bayesian criterion. Then we have obtained the residuals by comparing the fitted models with the original data. To detect the extreme residuals from the 10 variables, we have computed the Mahalanobis distance to the data's mean (Hotelling's T^2), which considers the covariance matrix of the joint
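    The residual-plus-Mahalanobis core of this pipeline can be sketched with AR(1) models (a special case of ARMA) on two simulated series. The seasonality removal, Köppen regionalization, and order selection steps are omitted, and all data below are synthetic.

```python
import random
random.seed(0)

# Two correlated synthetic series (shared shock) with one injected anomaly.
n = 200
x = [[0.0], [0.0]]
for t in range(1, n):
    shock = random.gauss(0, 0.5)
    x[0].append(0.8 * x[0][-1] + shock + random.gauss(0, 0.2))
    x[1].append(0.8 * x[1][-1] + shock + random.gauss(0, 0.2))
x[0][50] += 5.0   # injected abnormal event at time step 50
x[1][50] += 5.0

def ar1_residuals(s):
    # Least-squares AR(1) coefficient, then one-step-ahead residuals.
    phi = (sum(s[t] * s[t - 1] for t in range(1, len(s)))
           / sum(s[t - 1] ** 2 for t in range(1, len(s))))
    return [s[t] - phi * s[t - 1] for t in range(1, len(s))]

e = [ar1_residuals(s) for s in x]
m = len(e[0])
mu = [sum(r) / m for r in e]

# 2x2 covariance of the joint residual vectors and its inverse.
c = [[sum((e[i][t] - mu[i]) * (e[j][t] - mu[j]) for t in range(m)) / m
      for j in range(2)] for i in range(2)]
det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
inv = [[c[1][1] / det, -c[0][1] / det],
       [-c[1][0] / det, c[0][0] / det]]

def mahalanobis2(t):
    # squared Mahalanobis distance of the residual vector at index t
    d = [e[0][t] - mu[0], e[1][t] - mu[1]]
    return sum(d[i] * inv[i][j] * d[j] for i in range(2) for j in range(2))

scores = [mahalanobis2(t) for t in range(m)]   # residual index t is time t+1
peak = scores.index(max(scores))               # expected at the injected event
```

    The distance accounts for the covariance between the variables, so only residuals that are jointly extreme — like the injected spike — stand out against ordinary correlated fluctuations.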

  6. Network Anomaly Detection System with Optimized DS Evidence Theory

    PubMed Central

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have utilized fusion methods and Dempster-Shafer (DS) evidence theory for network anomaly detection, but with low performance, and without considering the complicated and varied nature of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs each sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that both the RBPA and ODS optimization methods improve system performance significantly. PMID:25254258
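
    The core DS fusion step that this record builds on can be sketched as follows. This is plain Dempster combination only, not the paper's ODS weighting or RBPA function, and the sensor masses below are invented for illustration:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments.
    Each BPA maps frozenset focal elements (subsets of the frame of
    discernment) to masses summing to 1; the result is conflict-normalized."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to disjoint hypotheses
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

# Two hypothetical "sensors" reporting over the frame {normal, attack}.
NORMAL, ATTACK = frozenset({"normal"}), frozenset({"attack"})
BOTH = NORMAL | ATTACK  # ignorance: mass on the whole frame
m1 = {ATTACK: 0.7, BOTH: 0.3}
m2 = {ATTACK: 0.6, NORMAL: 0.1, BOTH: 0.3}
fused = dempster_combine(m1, m2)
```

    Fusing the two reports concentrates belief on the attack hypothesis; the paper's contribution is to weight each sensor's BPA by its historical accuracy before this step.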

  7. An incremental anomaly detection model for virtual machines

    PubMed Central

    Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu

    2017-01-01

    The Self-Organizing Map (SOM) algorithm, as an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organization and automatic anomaly prediction. However, because the algorithm is randomly initialized, training a detection model takes a long time. Moreover, cloud platforms with large numbers of virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing characteristics, which leaves the algorithm with low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate detection by taking into account the large scale and highly dynamic features of virtual machines on cloud platforms. To demonstrate the effectiveness, experiments on the common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence speed of anomaly detection for virtual machines on cloud platforms. PMID:29117245

  8. Survey of Anomaly Detection Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, B

    This survey defines the problem of anomaly detection and provides an overview of existing methods. The methods are categorized into two general classes: generative and discriminative. A generative approach involves building a model that represents the joint distribution of the input features and the output labels of system behavior (e.g., normal or anomalous) and then applies the model to formulate a decision rule for detecting anomalies. On the other hand, a discriminative approach aims directly to find the decision rule, with the smallest error rate, that distinguishes between normal and anomalous behavior. For each approach, we give an overview of popular techniques and provide references to state-of-the-art applications.

  9. Anomaly Detection in Power Quality at Data Centers

    NASA Technical Reports Server (NTRS)

    Grichine, Art; Solano, Wanda M.

    2015-01-01

    The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
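
    A minimal sketch of the Gaussian set-point computation described above, assuming readings are approximately normal; the voltage values and the 3-sigma choice are illustrative, not NCCIPS data:

```python
import statistics

def gaussian_setpoints(samples, k=3.0):
    """Alarm set-points at mean +/- k standard deviations, assuming the
    monitored quantity (e.g. voltage, current, THD) is roughly Gaussian.
    A larger k lowers the false-positive count at the cost of sensitivity."""
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return mu - k * sigma, mu + k * sigma

# Hypothetical RMS voltage readings from a power-quality meter.
voltage = [229.9, 230.2, 229.8, 230.1, 230.0, 229.7, 230.3, 230.0]
low, high = gaussian_setpoints(voltage)
anomalous = [v for v in (230.1, 245.0) if not low <= v <= high]
```

    A reading inside the band is treated as normal operation; a reading outside it is flagged for inspection before it develops into equipment failure.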

  10. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field.

    PubMed

    Christiansen, Peter; Nielsen, Lars N; Steen, Kim A; Jørgensen, Rasmus N; Karstoft, Henrik

    2016-11-11

    Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained for detecting and classifying a predefined set of object types. These algorithms have difficulties in detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, like people, animals and other objects, occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks" (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45-90 m) than RCNN. RCNN has similar performance at short range (0-30 m). However, DeepAnomaly has far fewer model parameters and (182 ms/25 ms =) a 7.28-times faster processing time per image. Unlike most CNN-based methods, the high accuracy, the low computation time and the low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit).

  11. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field

    PubMed Central

    Christiansen, Peter; Nielsen, Lars N.; Steen, Kim A.; Jørgensen, Rasmus N.; Karstoft, Henrik

    2016-01-01

    Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained for detecting and classifying a predefined set of object types. These algorithms have difficulties in detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, like people, animals and other objects, occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including “Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks” (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45–90 m) than RCNN. RCNN has similar performance at short range (0–30 m). However, DeepAnomaly has far fewer model parameters and (182 ms/25 ms =) a 7.28-times faster processing time per image. Unlike most CNN-based methods, the high accuracy, the low computation time and the low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit). PMID:27845717

  12. Hyperspectral anomaly detection using Sony PlayStation 3

    NASA Astrophysics Data System (ADS)

    Rosario, Dalton; Romano, João; Sepulveda, Rene

    2009-05-01

    We present a proof-of-principle demonstration using Sony's IBM Cell processor-based PlayStation 3 (PS3) to run, in near real-time, a hyperspectral anomaly detection algorithm (HADA) on real hyperspectral (HS) long-wave infrared imagery. The PS3 console proved to be ideal for doing precisely the kind of heavy computational lifting HS-based algorithms require, and the fact that it is a relatively open platform makes programming scientific applications feasible. The PS3 HADA is a unique parallel, random-sampling-based anomaly detection approach that does not require prior spectra of the clutter background. The PS3 HADA is designed to handle known underlying difficulties (e.g., target shape/scale uncertainties) often ignored in the development of autonomous anomaly detection algorithms. The effort is part of an ongoing cooperative contribution between the Army Research Laboratory and the Army's Armament, Research, Development and Engineering Center, which aims at demonstrating the performance of innovative algorithmic approaches for applications requiring autonomous anomaly detection using passive sensors.

  13. Disparity: scalable anomaly detection for clusters.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desai, N.; Bradshaw, R.; Lusk, E.

    2008-01-01

    In this paper, we describe disparity, a tool that performs parallel, scalable anomaly detection for clusters. Disparity uses basic statistical methods and scalable reduction operations to perform data reduction on client nodes and uses the results to locate node anomalies. We discuss the implementation of disparity and present results of its use on a SiCortex SC5832 system.
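
    The node-level statistical screening described here might look like the following sketch (a median/MAD-based modified z-score over a reduced per-node metric; the metric values and node names are hypothetical, and this is not the disparity implementation):

```python
import statistics

def outlier_nodes(metrics, threshold=3.5):
    """Flag cluster nodes whose metric sits more than `threshold` modified
    z-scores from the fleet median.  Median/MAD statistics are robust to the
    very outliers being sought.  `metrics` maps node name -> scalar metric."""
    values = list(metrics.values())
    med = statistics.median(values)
    mad = statistics.median([abs(v - med) for v in values])
    if mad == 0:  # degenerate fleet: any deviation from the median is suspect
        return [n for n, v in metrics.items() if v != med]
    return [n for n, v in metrics.items()
            if abs(0.6745 * (v - med) / mad) > threshold]

# Hypothetical per-node load averages gathered by a parallel reduction.
load = {"n%03d" % i: 1.0 + 0.01 * (i % 5) for i in range(32)}
load["n017"] = 9.5  # one node far off the fleet profile
```

    In a real deployment the reduction producing `load` would itself be parallel; the anomaly decision then runs on the small reduced vector.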

  14. Post-processing for improving hyperspectral anomaly detection accuracy

    NASA Astrophysics Data System (ADS)

    Wu, Jee-Cheng; Jiang, Chi-Ming; Huang, Chen-Liang

    2015-10-01

    Anomaly detection is an important topic in the exploitation of hyperspectral data. Based on the Reed-Xiaoli (RX) detector and a morphology operator, this research proposes a novel technique for improving the accuracy of hyperspectral anomaly detection. Firstly, the RX-based detector is used to process a given input scene. Then, a post-processing scheme using a morphology operator is employed to detect those pixels around high-scoring anomaly pixels. Tests were conducted using two real hyperspectral images with ground-truth information, and the results, based on receiver operating characteristic curves, illustrated that the proposed method reduced the false alarm rates of the RX-based detector.
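
    A rough sketch of the two-stage idea: a global RX detector followed by a 3x3 binary dilation as the morphological post-processing step. The data are synthetic, the threshold choice is illustrative, and the dilation wraps at image edges for brevity:

```python
import numpy as np

def rx_scores(cube):
    """Global RX detector: Mahalanobis distance of each pixel spectrum to the
    scene mean under the scene covariance.  cube shape: (rows, cols, bands)."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b)
    mu = X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mu
    return np.einsum("ij,jk,ik->i", d, inv_cov, d).reshape(h, w)

def dilate3x3(mask):
    """Binary dilation with a 3x3 structuring element: also flags pixels
    adjacent to high-scoring anomaly pixels (wraps at edges via np.roll)."""
    out = mask.copy()
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out |= np.roll(mask, (dr, dc), axis=(0, 1))
    return out

rng = np.random.default_rng(1)
cube = rng.normal(size=(20, 20, 5))
cube[10, 10] += 12.0  # implanted anomalous spectrum
scores = rx_scores(cube)
mask = scores > np.percentile(scores, 99.5)  # high-scoring anomaly pixels
final = dilate3x3(mask)  # post-processing keeps the surrounding pixels too
```

    The dilation trades a slightly larger detection footprint for recovering anomaly pixels whose RX score falls just under the threshold.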

  15. Identifying Threats Using Graph-based Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Eberle, William; Holder, Lawrence; Cook, Diane

    Much of the data collected during the monitoring of cyber and other infrastructures is structural in nature, consisting of various types of entities and relationships between them. The detection of threatening anomalies in such data is crucial to protecting these infrastructures. We present an approach to detecting anomalies in a graph-based representation of such data that explicitly represents these entities and relationships. The approach consists of first finding normative patterns in the data using graph-based data mining and then searching for small, unexpected deviations from these normative patterns, assuming illicit behavior tries to mimic legitimate, normative behavior. The approach is evaluated using several synthetic and real-world datasets. Results show that the approach has high true-positive rates, low false-positive rates, and is capable of detecting complex structural anomalies in real-world domains including email communications, cellphone calls and network traffic.

  16. Robust and efficient anomaly detection using heterogeneous representations

    NASA Astrophysics Data System (ADS)

    Hu, Xing; Hu, Shiqiang; Xie, Jinhua; Zheng, Shiyou

    2015-05-01

    Various approaches have been proposed for video anomaly detection. Yet these approaches typically suffer from one or more limitations: they often characterize the pattern using its internal information, but ignore its external relationship which is important for local anomaly detection. Moreover, the high-dimensionality and the lack of robustness of pattern representation may lead to problems, including overfitting, increased computational cost and memory requirements, and high false alarm rate. We propose a video anomaly detection framework which relies on a heterogeneous representation to account for both the pattern's internal information and external relationship. The internal information is characterized by slow features learned by slow feature analysis from low-level representations, and the external relationship is characterized by the spatial contextual distances. The heterogeneous representation is compact, robust, efficient, and discriminative for anomaly detection. Moreover, both the pattern's internal information and external relationship can be taken into account in the proposed framework. Extensive experiments demonstrate the robustness and efficiency of our approach by comparison with the state-of-the-art approaches on the widely used benchmark datasets.

  17. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data

    PubMed Central

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied to unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection and fraud detection, as well as in the life science and medical domains. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks. PMID:27093601

  18. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data.

    PubMed

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied to unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection and fraud detection, as well as in the life science and medical domains. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks.

  19. Unsupervised Ensemble Anomaly Detection Using Time-Periodic Packet Sampling

    NASA Astrophysics Data System (ADS)

    Uchida, Masato; Nawata, Shuichi; Gu, Yu; Tsuru, Masato; Oie, Yuji

    We propose an anomaly detection method for finding patterns in network traffic that do not conform to legitimate (i.e., normal) behavior. The proposed method trains a baseline model describing the normal behavior of network traffic without using manually labeled traffic data. The trained baseline model is used as the basis for comparison with the audit network traffic. This anomaly detection works in an unsupervised manner through the use of time-periodic packet sampling, which is used in a manner that differs from its intended purpose — the lossy nature of packet sampling is used to extract normal packets from the unlabeled original traffic data. Evaluation using actual traffic traces showed that the proposed method has false positive and false negative rates in the detection of anomalies regarding TCP SYN packets comparable to those of a conventional method that uses manually labeled traffic data to train the baseline model. Performance variation due to the probabilistic nature of sampled traffic data is mitigated by using ensemble anomaly detection that collectively exploits multiple baseline models in parallel. Alarm sensitivity is adjusted for the intended use by using maximum- and minimum-based anomaly detection that effectively take advantage of the performance variations among the multiple baseline models. Testing using actual traffic traces showed that the proposed anomaly detection method performs as well as one using manually labeled traffic data and better than one using randomly sampled (unlabeled) traffic data.
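
    The maximum-/minimum-based combination of multiple baseline models mentioned above can be sketched as follows (the scores, threshold, and three-model setup are illustrative, not the paper's trained baselines):

```python
def ensemble_alarm(model_scores, threshold, mode="max"):
    """Combine per-baseline-model anomaly scores for one time slot.
    'max' alarms if any baseline model fires (higher sensitivity);
    'min' alarms only if every model fires (fewer false positives)."""
    combined = max(model_scores) if mode == "max" else min(model_scores)
    return combined > threshold

# Three hypothetical baseline models scoring the same traffic slot.
alarm_sensitive = ensemble_alarm([0.2, 0.9, 0.4], 0.5, mode="max")
alarm_conservative = ensemble_alarm([0.2, 0.9, 0.4], 0.5, mode="min")
```

    Because each baseline model is trained on an independent sampled stream, the spread between the max and min combinations also indicates how much the sampling variability affects the decision.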

  20. Modeling And Detecting Anomalies In Scada Systems

    NASA Astrophysics Data System (ADS)

    Svendsen, Nils; Wolthusen, Stephen

    The detection of attacks and intrusions based on anomalies is hampered by the limits of specificity underlying the detection techniques. However, in the case of many critical infrastructure systems, domain-specific knowledge and models can impose constraints that potentially reduce error rates. At the same time, attackers can use their knowledge of system behavior to mask their manipulations, causing adverse effects to be observed only after a significant period of time. This paper describes elementary statistical techniques that can be applied to detect anomalies in critical infrastructure networks. A SCADA system employed in liquefied natural gas (LNG) production is used as a case study.

  1. Real-time anomaly detection for very short-term load forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Jian; Hong, Tao; Yue, Meng

    Although recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components: a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.
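
    A minimal sketch of the two components named above, assuming a regression forecast is already available: residuals are compared against a rolling k-sigma threshold that adapts to recent residual volatility. The window length, k, and the demo series are illustrative, not the paper's parameterization:

```python
import statistics

def detect_anomalies(actual, predicted, window=24, k=3.0):
    """Flag time t when |residual| exceeds k times the standard deviation of
    the residuals in the trailing window (an adaptive anomaly threshold)."""
    residuals = [a - p for a, p in zip(actual, predicted)]
    flags = []
    for t, r in enumerate(residuals):
        past = residuals[max(0, t - window):t]
        if len(past) < 2:  # not enough history to estimate the threshold
            flags.append(False)
        else:
            flags.append(abs(r) > k * statistics.pstdev(past))
    return flags

# Hypothetical half-hourly loads: flat forecast, one bad telemetry spike.
predicted = [100.0] * 48
actual = [100.0 + 0.5 * (-1) ** t for t in range(48)]
actual[30] = 110.0  # corrupted most-recent load value
flags = detect_anomalies(actual, predicted)
```

    A flagged value would be replaced or re-estimated before being fed into the VSTLF model, instead of contaminating the forecast.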

  2. Real-time anomaly detection for very short-term load forecasting

    DOE PAGES

    Luo, Jian; Hong, Tao; Yue, Meng

    2018-01-06

    Although recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components: a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.

  3. Conditional Anomaly Detection with Soft Harmonic Functions.

    PubMed

    Valko, Michal; Kveton, Branislav; Valizadegan, Hamed; Cooper, Gregory F; Hauskrecht, Milos

    2011-01-01

    In this paper, we consider the problem of conditional anomaly detection that aims to identify data instances with an unusual response or a class label. We develop a new non-parametric approach for conditional anomaly detection based on the soft harmonic solution, with which we estimate the confidence of the label to detect anomalous mislabeling. We further regularize the solution to avoid the detection of isolated examples and examples on the boundary of the distribution support. We demonstrate the efficacy of the proposed method on several synthetic and UCI ML datasets in detecting unusual labels when compared to several baseline approaches. We also evaluate the performance of our method on a real-world electronic health record dataset where we seek to identify unusual patient-management decisions.

  4. Multi-Level Anomaly Detection on Time-Varying Graph Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Robert A; Collins, John P; Ferragut, Erik M

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  5. Conditional Anomaly Detection with Soft Harmonic Functions

    PubMed Central

    Valko, Michal; Kveton, Branislav; Valizadegan, Hamed; Cooper, Gregory F.; Hauskrecht, Milos

    2012-01-01

    In this paper, we consider the problem of conditional anomaly detection that aims to identify data instances with an unusual response or a class label. We develop a new non-parametric approach for conditional anomaly detection based on the soft harmonic solution, with which we estimate the confidence of the label to detect anomalous mislabeling. We further regularize the solution to avoid the detection of isolated examples and examples on the boundary of the distribution support. We demonstrate the efficacy of the proposed method on several synthetic and UCI ML datasets in detecting unusual labels when compared to several baseline approaches. We also evaluate the performance of our method on a real-world electronic health record dataset where we seek to identify unusual patient-management decisions. PMID:25309142

  6. Variable Discretisation for Anomaly Detection using Bayesian Networks

    DTIC Science & Technology

    2017-01-01

    Variable Discretisation for Anomaly Detection using Bayesian Networks. Jonathan Legg, National Security and ISR Division, Defence Science and Technology Group, DST-Group-TR-3328. Abstract: Anomaly detection is the process by which low probability events are automatically found against a ... Bayesian network implementations usually require each variable to take on a finite number of mutually ...

  7. A lightweight network anomaly detection technique

    DOE PAGES

    Kim, Jinoh; Yoo, Wucherl; Sim, Alex; ...

    2017-03-13

    While network anomaly detection is essential in network operations and management, performing the first line of detection against the exponentially increasing volume of network traffic is ever more challenging. In this paper, we develop a technique for the first line of online anomaly detection with two important considerations: (i) availability of traffic attributes during the monitoring time, and (ii) computational scalability for streaming data. The presented learning technique is lightweight and highly scalable, owing to an approximation based on grid partitioning of the given dimensional space. With the public traffic traces of KDD Cup 1999 and NSL-KDD, we show that our technique yields 98.5% and 83% detection accuracy, respectively, with only a couple of readily available traffic attributes that can be obtained without post-processing. Finally, the results are at least comparable with classical learning methods, including decision tree and random forest, with approximately two orders of magnitude faster learning performance.
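
    The grid-partitioning idea can be sketched as follows. The cell size, the minimum-count rule, and the flow attributes are assumptions for illustration, not the authors' parameterization:

```python
class GridDetector:
    """Quantize each training point into a fixed-size cell of the attribute
    space and count cell occupancy; a test point falling into a rare or empty
    cell is scored anomalous.  Fitting and scoring are both O(1) per point,
    which is what makes the approach suitable for streaming data."""

    def __init__(self, cell_size=1.0, min_count=2):
        self.cell_size = cell_size
        self.min_count = min_count
        self.counts = {}

    def _cell(self, point):
        return tuple(int(v // self.cell_size) for v in point)

    def fit(self, points):
        for p in points:
            c = self._cell(p)
            self.counts[c] = self.counts.get(c, 0) + 1
        return self

    def is_anomalous(self, point):
        return self.counts.get(self._cell(point), 0) < self.min_count

# Train on hypothetical (duration, bytes/s) pairs from "normal" flows.
normal = [(0.1 + 0.01 * i, 5.0 + 0.02 * i) for i in range(100)]
det = GridDetector(cell_size=2.0).fit(normal)
```

    A flow landing in a well-populated cell looks like known-normal traffic; a flow landing in an empty cell is handed to deeper, slower analysis.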

  8. A lightweight network anomaly detection technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jinoh; Yoo, Wucherl; Sim, Alex

    While network anomaly detection is essential in network operations and management, performing the first line of detection against the exponentially increasing volume of network traffic is ever more challenging. In this paper, we develop a technique for the first line of online anomaly detection with two important considerations: (i) availability of traffic attributes during the monitoring time, and (ii) computational scalability for streaming data. The presented learning technique is lightweight and highly scalable, owing to an approximation based on grid partitioning of the given dimensional space. With the public traffic traces of KDD Cup 1999 and NSL-KDD, we show that our technique yields 98.5% and 83% detection accuracy, respectively, with only a couple of readily available traffic attributes that can be obtained without post-processing. Finally, the results are at least comparable with classical learning methods, including decision tree and random forest, with approximately two orders of magnitude faster learning performance.

  9. Enhanced detection and visualization of anomalies in spectral imagery

    NASA Astrophysics Data System (ADS)

    Basener, William F.; Messinger, David W.

    2009-05-01

    Anomaly detection algorithms applied to hyperspectral imagery are able to reliably identify man-made objects from a natural environment based on statistical/geometric likelihood. The process is more robust than target identification, which requires precise prior knowledge of the object of interest, but has an inherently higher false alarm rate. Standard anomaly detection algorithms measure deviation of pixel spectra from a parametric model (either statistical or linear mixing) estimating the image background. The topological anomaly detector (TAD) creates a fully non-parametric, graph theory-based, topological model of the image background and measures deviation from this background using codensity. In this paper we present a large-scale comparative test of TAD against 80+ targets in four full HYDICE images using the entire canonical target set for generation of ROC curves. TAD is compared against several statistics-based detectors, including local RX and subspace RX. Even a perfect anomaly detection algorithm would have a high practical false alarm rate in most scenes simply because the user/analyst is not interested in every anomalous object. To assist the analyst in identifying and sorting objects of interest, we investigate coloring of the anomalies with principal components projections using statistics computed from the anomalies. This gives a very useful colorization of anomalies in which objects of similar material tend to have the same color, enabling an analyst to quickly sort and identify anomalies of highest interest.

  10. Automated Network Anomaly Detection with Learning, Control and Mitigation

    ERIC Educational Resources Information Center

    Ippoliti, Dennis

    2014-01-01

    Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly-based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…

  11. The use of Compton scattering in detecting anomaly in soil-possible use in pyromaterial detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abedin, Ahmad Firdaus Zainal; Ibrahim, Noorddin; Zabidi, Noriza Ahmad

    Compton scattering can determine a signature for landmine detection based on the dependency between the density of an anomaly and the energy change of the scattered photons. In this study, 4.43 MeV gammas from an Am-Be source were used to produce Compton scattering. Two thallium-doped sodium iodide NaI(Tl) detectors of radius 1.9 cm were placed at a distance of 8 cm from the source to detect the gamma rays. Nine anomalies were used in this simulation, each a cylinder of 10 cm radius and 8.9 cm height, buried 5 cm deep in a soil bed measuring 80 cm in radius and 53.5 cm in height. Monte Carlo methods indicated that the scattering of photons is directly proportional to the density of the anomalies. The difference between the detector response with an anomaly and without an anomaly, namely the contrast ratio, has a linear relationship with the density of the anomaly. Anomalies of air, wood and water give positive contrast ratio values, whereas explosive, sand, concrete, graphite, limestone and polyethylene give negative contrast ratio values. Overall, the contrast ratio magnitudes are greater than 2% for all anomalies. The strong contrast ratios result in good detection capability and distinction between anomalies.
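
    The contrast ratio used in this record reduces to a one-line computation; the detector counts below are invented for illustration, and the sign of the result is what distinguishes anomaly classes:

```python
def contrast_ratio(with_anomaly, without_anomaly):
    """Percentage contrast ratio between the detector response with a buried
    anomaly present and the soil-only baseline response."""
    return 100.0 * (with_anomaly - without_anomaly) / without_anomaly

# Hypothetical scattered-photon counts against a soil-only baseline.
baseline = 1.00e6
ratio_a = contrast_ratio(1.05e6, baseline)  # response above baseline
ratio_b = contrast_ratio(0.97e6, baseline)  # response below baseline
```

    Per the abstract, an absolute ratio above roughly 2% was sufficient to separate all nine simulated anomalies from plain soil.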

  12. Evaluation schemes for video and image anomaly detection algorithms

    NASA Astrophysics Data System (ADS)

    Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael

    2016-05-01

    Video anomaly detection is a critical research area in computer vision. It is a natural first step before applying object recognition algorithms. Many algorithms that detect anomalies (outliers) in videos and images have been introduced in recent years. However, these algorithms behave and perform differently based on differences in the domains and tasks to which they are applied. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability to a particular domain or task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. Many evaluation metrics have been used in the literature, such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is critical, since the choice can introduce positive or negative bias in the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies. We analyze the biases introduced by these schemes by measuring the performance of an existing anomaly detection algorithm.
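As a concrete instance of the metric choices discussed above, the area under the ROC curve can be computed directly from ranked anomaly scores; a minimal sketch with made-up scores and labels, assuming a per-sample evaluation scheme:

```python
def roc_auc(scores, labels):
    """Rank-based AUC: the probability that a randomly chosen anomaly
    receives a higher score than a randomly chosen normal sample."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A perfect detector ranks every anomaly above every normal sample (AUC = 1.0), while a random scorer averages 0.5; the scheme that maps detections to labels is exactly where the biases discussed above enter.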

  13. Network Anomaly Detection Based on Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.
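The wavelet-approximation idea can be illustrated with a single level of the Haar transform, which splits a traffic signal into a smooth approximation and the detail (fluctuation) component; a toy sketch, not the authors' fifteen-feature pipeline:

```python
def haar_step(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and pairwise half-differences (detail)."""
    approx = [(signal[i] + signal[i + 1]) / 2
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail
```

A sudden traffic spike shows up as a large detail coefficient while the approximation stays smooth, which is what makes wavelet features useful for flagging bursts.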

  14. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7798 ● SEP 2016. US Army Research Laboratory. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server, by Christian D Schlesiger, Computational and Information Sciences Directorate, ARL.

  15. Anomaly Monitoring Method for Key Components of Satellite

    PubMed Central

    Fan, Linjun; Xiao, Weidong; Tang, Jun

    2014-01-01

    This paper presents a fault diagnosis method for key components of satellites, called the Anomaly Monitoring Method (AMM), which is made up of state estimation based on Multivariate State Estimation Techniques (MSET) and anomaly detection based on the Sequential Probability Ratio Test (SPRT). On the basis of a failure analysis of lithium-ion batteries (LIBs), we divided LIB failures into internal failure, external failure, and thermal runaway, and selected the electrolyte resistance (Re) and the charge transfer resistance (Rct) as the key parameters for state estimation. Then, from actual in-orbit telemetry data for these key parameters, we obtained the actual residual value (RX) and the healthy residual value (RL) of the LIBs via MSET state estimation, and used these residual values to detect anomalous states with SPRT. Lastly, we applied AMM to LIBs and validated its feasibility and effectiveness by comparing its results with those of the threshold detection method (TDM). PMID:24587703
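The SPRT stage described above can be sketched for Gaussian residuals, with decision thresholds from Wald's approximations. The mean-shift value, noise level, and error rates below are illustrative assumptions, not the paper's LIB parameters:

```python
import math

def sprt(residuals, m0=0.0, m1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Sequential probability ratio test on a residual stream.

    H0: residuals ~ N(m0, sigma^2) (healthy); H1: N(m1, sigma^2) (anomalous).
    Returns (decision, index) as soon as a Wald threshold is crossed.
    """
    upper = math.log((1 - beta) / alpha)   # cross upward  -> accept H1
    lower = math.log(beta / (1 - alpha))   # cross downward -> accept H0
    llr = 0.0
    for t, r in enumerate(residuals):
        # Log-likelihood ratio increment for one Gaussian observation.
        llr += ((r - m0) ** 2 - (r - m1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "anomaly", t
        if llr <= lower:
            return "healthy", t
    return "undecided", len(residuals)
```

Unlike a fixed threshold on a single residual, the test accumulates evidence over time, which is why it tolerates noisy telemetry better than the TDM baseline the paper compares against.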

  16. Using statistical anomaly detection models to find clinical decision support malfunctions.

    PubMed

    Ray, Soumi; McEvoy, Dustin S; Aaron, Skye; Hickman, Thu-Trang; Wright, Adam

    2018-05-11

    Malfunctions in Clinical Decision Support (CDS) systems occur due to a multitude of reasons, and often go unnoticed, leading to potentially poor outcomes. Our goal was to identify malfunctions within CDS systems. We evaluated 6 anomaly detection models: (1) Poisson Changepoint Model, (2) Autoregressive Integrated Moving Average (ARIMA) Model, (3) Hierarchical Divisive Changepoint (HDC) Model, (4) Bayesian Changepoint Model, (5) Seasonal Hybrid Extreme Studentized Deviate (SHESD) Model, and (6) E-Divisive with Median (EDM) Model and characterized their ability to find known anomalies. We analyzed 4 CDS alerts with known malfunctions from the Longitudinal Medical Record (LMR) and Epic® (Epic Systems Corporation, Madison, WI, USA) at Brigham and Women's Hospital, Boston, MA. The 4 rules recommend lead testing in children, aspirin therapy in patients with coronary artery disease, pneumococcal vaccination in immunocompromised adults and thyroid testing in patients taking amiodarone. Poisson changepoint, ARIMA, HDC, Bayesian changepoint and the SHESD model were able to detect anomalies in an alert for lead screening in children and in an alert for pneumococcal conjugate vaccine in immunocompromised adults. EDM was able to detect anomalies in an alert for monitoring thyroid function in patients on amiodarone. Malfunctions/anomalies occur frequently in CDS alert systems. It is important to be able to detect such anomalies promptly. Anomaly detection models are useful tools to aid such detections.
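Of the six models, the Poisson changepoint idea is the simplest to sketch: scan every split of a daily alert-count series and compare the two-rate log-likelihood against a single-rate fit. The counts below are illustrative, not the hospitals' data:

```python
import math

def poisson_changepoint(counts):
    """Best single changepoint in a series of Poisson counts.

    Returns (split_index, log_likelihood_ratio); a large ratio suggests
    the alert firing rate changed at that index.
    """
    def loglik(seg):
        s = sum(seg)
        if s == 0:
            return 0.0
        lam = s / len(seg)
        # log k! terms are constant across splits and cancel in ratios
        return s * math.log(lam) - len(seg) * lam

    base = loglik(counts)
    ratio, split = max((loglik(counts[:k]) + loglik(counts[k:]) - base, k)
                       for k in range(1, len(counts)))
    return split, ratio
```

A CDS alert that silently stops firing (or suddenly fires far more often) produces a sharp rate change, which this scan localizes.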

  17. Numerical study on the sequential Bayesian approach for radioactive materials detection

    NASA Astrophysics Data System (ADS)

    Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng

    2013-01-01

    A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for radioactive material detection research. Compared with commonly adopted detection methods based on classical statistical theory, the sequential Bayesian approach offers shorter verification times when analyzing spectra that contain low total counts, especially for complex radionuclide compositions. In this paper, a simulation platform implementing the sequential Bayesian approach was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator using Monte Carlo sampling to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented respectively by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.

  18. FRaC: a feature-modeling approach for semi-supervised and unsupervised anomaly detection.

    PubMed

    Noto, Keith; Brodley, Carla; Slonim, Donna

    2012-01-01

    Anomaly detection involves identifying rare data instances (anomalies) that come from a different class or distribution than the majority (which are simply called "normal" instances). Given a training set of only normal data, the semi-supervised anomaly detection task is to identify anomalies in the future. Good solutions to this task have applications in fraud and intrusion detection. The unsupervised anomaly detection task is different: Given unlabeled, mostly-normal data, identify the anomalies among them. Many real-world machine learning tasks, including many fraud and intrusion detection tasks, are unsupervised because it is impractical (or impossible) to verify all of the training data. We recently presented FRaC, a new approach for semi-supervised anomaly detection. FRaC is based on using normal instances to build an ensemble of feature models, and then identifying instances that disagree with those models as anomalous. In this paper, we investigate the behavior of FRaC experimentally and explain why FRaC is so successful. We also show that FRaC is a superior approach for the unsupervised as well as the semi-supervised anomaly detection task, compared to well-known state-of-the-art anomaly detection methods, LOF and one-class support vector machines, and to an existing feature-modeling approach.

  19. FRaC: a feature-modeling approach for semi-supervised and unsupervised anomaly detection

    PubMed Central

    Brodley, Carla; Slonim, Donna

    2011-01-01

    Anomaly detection involves identifying rare data instances (anomalies) that come from a different class or distribution than the majority (which are simply called “normal” instances). Given a training set of only normal data, the semi-supervised anomaly detection task is to identify anomalies in the future. Good solutions to this task have applications in fraud and intrusion detection. The unsupervised anomaly detection task is different: Given unlabeled, mostly-normal data, identify the anomalies among them. Many real-world machine learning tasks, including many fraud and intrusion detection tasks, are unsupervised because it is impractical (or impossible) to verify all of the training data. We recently presented FRaC, a new approach for semi-supervised anomaly detection. FRaC is based on using normal instances to build an ensemble of feature models, and then identifying instances that disagree with those models as anomalous. In this paper, we investigate the behavior of FRaC experimentally and explain why FRaC is so successful. We also show that FRaC is a superior approach for the unsupervised as well as the semi-supervised anomaly detection task, compared to well-known state-of-the-art anomaly detection methods, LOF and one-class support vector machines, and to an existing feature-modeling approach. PMID:22639542

  20. System and method for anomaly detection

    DOEpatents

    Scherrer, Chad

    2010-06-15

    A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrence as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.
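A heavily reduced sketch of the scoring idea in this record: frequency tables estimate a factored probability for each observation, and rare combinations receive high scores. The two-field network records and the add-one smoothing constant are assumptions for illustration, not the patent's model:

```python
import math
from collections import Counter

def fit_tables(records):
    """Frequency tables for a factored model P(a, b) = P(a) * P(b | a)."""
    return Counter(a for a, _ in records), Counter(records), len(records)

def score(record, a_counts, pair_counts, n, smoothing=10):
    """Anomaly score = -log of the smoothed factored probability."""
    a = record[0]
    p_a = (a_counts[a] + 1) / (n + smoothing)
    p_b_given_a = (pair_counts[record] + 1) / (a_counts[a] + smoothing)
    return -math.log(p_a * p_b_given_a)

# Hypothetical traffic: routine web and DNS flows.
records = [("tcp", 80)] * 50 + [("tcp", 443)] * 30 + [("udp", 53)] * 20
tables = fit_tables(records)
common = score(("tcp", 80), *tables)
rare = score(("udp", 9999), *tables)
```

The never-seen ("udp", 9999) pair scores far higher than the routine ("tcp", 80) traffic, mirroring how the patent's frequency tables feed the final scoring step.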

  1. Fuzzy Kernel k-Medoids algorithm for anomaly detection problems

    NASA Astrophysics Data System (ADS)

    Rustam, Z.; Talita, A. S.

    2017-07-01

    Intrusion Detection Systems (IDS) are an essential part of security systems for strengthening the security of information systems. An IDS can be used to detect abuse by intruders who try to get into the network system in order to access and utilize the available data sources. There are two approaches to IDS: misuse detection and anomaly detection (behavior-based intrusion detection). Fuzzy clustering-based methods have been widely used to solve anomaly detection problems. Besides using the fuzzy membership concept to assign objects to clusters, approaches that combine fuzzy and possibilistic memberships, or feature-weighted methods, are also used. We propose Fuzzy Kernel k-Medoids, which combines fuzzy and possibilistic memberships, as a powerful method for solving the anomaly detection problem. In numerical experiments it classifies the KDDCup'99 IDS benchmark data set into five different classes simultaneously; the best performance, a clustering accuracy of 90.28%, was achieved using 30% of the data for training.

  2. Firefly Algorithm in detection of TEC seismo-ionospheric anomalies

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, Mehdi

    2015-07-01

    Anomaly detection in time series of different earthquake precursors is an essential first step toward an early warning system with acceptable uncertainty. Since these time series are often nonlinear, complex and massive, the applied prediction method should be able to detect discord patterns in large data sets in a short time. This study presents the Firefly Algorithm (FA) as a simple and robust predictor for detecting TEC (Total Electron Content) seismo-ionospheric anomalies around the time of several powerful earthquakes, including Chile (27 February 2010), Varzeghan (11 August 2012) and Saravan (16 April 2013). Outstanding anomalies were observed 7 and 5 days before the Chile and Varzeghan earthquakes, respectively, and 3 and 8 days prior to the Saravan earthquake.

  3. A robust background regression based score estimation algorithm for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhao, Rui; Du, Bo; Zhang, Liangpei; Zhang, Lefei

    2016-12-01

    Anomaly detection has become a hot topic in hyperspectral image analysis and processing in recent years. The most important issues for hyperspectral anomaly detection are background estimation and suppression. Unreasonable or non-robust background estimation usually leads to unsatisfactory anomaly detection results. Furthermore, the inherent nonlinearity of hyperspectral images may obscure the intrinsic data structure during anomaly detection. In order to implement robust background estimation, as well as to explore the intrinsic data structure of the hyperspectral image, we propose a robust background regression based score estimation algorithm (RBRSE) for hyperspectral anomaly detection. Robust Background Regression (RBR) is a label assignment procedure which segments the hyperspectral data into a robust background dataset and a potential anomaly dataset with an intersection boundary. In RBR, a kernel expansion technique, which explores the nonlinear structure of the hyperspectral data in a reproducing kernel Hilbert space, is utilized to formulate the data as a density feature representation. A minimum squared loss relationship is constructed between the data density feature and the corresponding assigned labels of the hyperspectral data, to form the foundation of the regression. Furthermore, a manifold regularization term, which exploits the manifold smoothness of the hyperspectral data, and a maximization term of the robust background average density, which suppresses the bias caused by potential anomalies, are jointly appended to the RBR procedure. After this, a paired-dataset based k-NN score estimation method is applied to the robust background and potential anomaly datasets to produce the detection output. The experimental results show that RBRSE achieves superior ROC curves, AUC values, and background-anomaly separation compared with other state-of-the-art anomaly detection methods, and is easy to implement.

  4. Locality-constrained anomaly detection for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Liu, Jiabin; Li, Wei; Du, Qian; Liu, Kui

    2015-12-01

    Detecting a target with a low occurrence probability against an unknown background in a hyperspectral image, namely anomaly detection, is of practical significance. The Reed-Xiaoli (RX) algorithm is considered a classic anomaly detector; it calculates the Mahalanobis distance between the local background and the pixel under test. Local RX, an adaptive RX detector, employs a dual-window strategy and treats pixels within the frame between the inner and outer windows as the local background. However, the detector is sensitive if this local region contains anomalous pixels (i.e., outliers). In this paper, a locality-constrained anomaly detector is proposed to remove outliers from the local background region before applying the RX algorithm. Specifically, a local linear representation is designed to exploit the relationship between the linearly correlated pixels in the local background region and the pixel under test and its neighbors. Experimental results demonstrate that the proposed detector improves the original local RX algorithm.
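The global-RX core, the Mahalanobis distance of each pixel from the scene statistics, can be sketched in a few lines; this toy version uses just two spectral bands with a closed-form 2x2 covariance inverse, where a real detector would use the full band covariance:

```python
def rx_scores(pixels):
    """Global RX anomaly score for 2-band pixels: squared Mahalanobis
    distance from the scene mean under the sample covariance."""
    n = len(pixels)
    mx = sum(x for x, _ in pixels) / n
    my = sum(y for _, y in pixels) / n
    # Sample covariance entries.
    sxx = sum((x - mx) ** 2 for x, _ in pixels) / n
    syy = sum((y - my) ** 2 for _, y in pixels) / n
    sxy = sum((x - mx) * (y - my) for x, y in pixels) / n
    det = sxx * syy - sxy * sxy  # 2x2 inverse in closed form
    scores = []
    for x, y in pixels:
        dx, dy = x - mx, y - my
        scores.append((syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det)
    return scores
```

Local RX replaces the scene mean and covariance with statistics from the dual-window neighbourhood, which is exactly where the outlier-contamination problem described above arises.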

  5. A Hybrid Semi-Supervised Anomaly Detection Model for High-Dimensional Data.

    PubMed

    Song, Hongchao; Jiang, Zhuqing; Men, Aidong; Yang, Bo

    2017-01-01

    Anomaly detection, which aims to identify observations that deviate from a nominal sample, is a challenging task for high-dimensional data. Traditional distance-based anomaly detection methods compute the neighborhood distance between each observation and suffer from the curse of dimensionality in high-dimensional space; for example, the distances between any pair of samples become similar and each sample may look like an outlier. In this paper, we propose a hybrid semi-supervised anomaly detection model for high-dimensional data that consists of two parts: a deep autoencoder (DAE) and an ensemble of k-nearest neighbor graph (K-NNG) based anomaly detectors. Benefiting from its ability of nonlinear mapping, the DAE is first trained to learn the intrinsic features of a high-dimensional dataset to represent the high-dimensional data in a more compact subspace. Several nonparametric KNN-based anomaly detectors are then built from different subsets that are randomly sampled from the whole dataset. The final prediction is made by all the anomaly detectors. The performance of the proposed method is evaluated on several real-life datasets, and the results confirm that the proposed hybrid model improves the detection accuracy and reduces the computational complexity.

  6. A Hybrid Semi-Supervised Anomaly Detection Model for High-Dimensional Data

    PubMed Central

    Jiang, Zhuqing; Men, Aidong; Yang, Bo

    2017-01-01

    Anomaly detection, which aims to identify observations that deviate from a nominal sample, is a challenging task for high-dimensional data. Traditional distance-based anomaly detection methods compute the neighborhood distance between each observation and suffer from the curse of dimensionality in high-dimensional space; for example, the distances between any pair of samples are similar and each sample may perform like an outlier. In this paper, we propose a hybrid semi-supervised anomaly detection model for high-dimensional data that consists of two parts: a deep autoencoder (DAE) and an ensemble k-nearest neighbor graphs- (K-NNG-) based anomaly detector. Benefiting from the ability of nonlinear mapping, the DAE is first trained to learn the intrinsic features of a high-dimensional dataset to represent the high-dimensional data in a more compact subspace. Several nonparametric KNN-based anomaly detectors are then built from different subsets that are randomly sampled from the whole dataset. The final prediction is made by all the anomaly detectors. The performance of the proposed method is evaluated on several real-life datasets, and the results confirm that the proposed hybrid model improves the detection accuracy and reduces the computational complexity. PMID:29270197

  7. Deep learning on temporal-spectral data for anomaly detection

    NASA Astrophysics Data System (ADS)

    Ma, King; Leung, Henry; Jalilian, Ehsan; Huang, Daniel

    2017-05-01

    Detecting anomalies is important for continuous monitoring of sensor systems. One significant challenge is to use sensor data and autonomously detect changes that cause different conditions to occur. Using deep learning methods, we are able to monitor and detect changes as a result of some disturbance in the system. We utilize deep neural networks for sequence analysis of time series. We use a multi-step method for anomaly detection. We train the network to learn spectral and temporal features from the acoustic time series. We test our method using fiber-optic acoustic data from a pipeline.

  8. Anomaly-based intrusion detection for SCADA systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D.; Usynin, A.; Hines, J. W.

    2006-07-01

    Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security scrutiny, and there are concerns that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimated that malicious online actions could cause $75 billion in damage in 2007. One interesting countermeasure for enhancing information system security is intrusion detection. This paper will briefly discuss the history of research in intrusion detection techniques and introduce the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly-based intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT), applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)

  9. Novel Hyperspectral Anomaly Detection Methods Based on Unsupervised Nearest Regularized Subspace

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Chen, Y.; Tan, K.; Du, P.

    2018-04-01

    Anomaly detection has been of great interest in hyperspectral imagery analysis. Most conventional anomaly detectors merely take advantage of spectral and spatial information within neighboring pixels. In this paper, two methods, the Unsupervised Nearest Regularized Subspace-based with Outlier Removal Anomaly Detector (UNRSORAD) and Local Summation UNRSORAD (LSUNRSORAD), are proposed. They are based on the concept that each pixel in the background can be approximately represented by its spatial neighborhood, while anomalies cannot. Using a dual window, each test pixel is approximated by a linear combination of the surrounding data. Because outliers in the dual window degrade detection accuracy, the proposed detectors remove outlier pixels that differ significantly from the majority of pixels. To make full use of the local spatial distribution information in the neighborhood of each pixel under test, we adopt a local summation dual-window sliding strategy. The residual image is constructed by subtracting the predicted background from the original hyperspectral imagery, and anomalies can be detected in the residual image. Experimental results show that the proposed methods greatly improve detection accuracy compared with other traditional detection methods.

  10. Anomaly Detection Based on Sensor Data in Petroleum Industry Applications

    PubMed Central

    Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra

    2015-01-01

    Anomaly detection is the problem of finding patterns in data that do not conform to an a priori expected behavior. This is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset, where these anomalous samples are indicated as outliers. Anomaly detection has recently attracted the attention of the research community, because of its relevance in real-world applications, like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation. However, in either case, it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one of the application contexts where these problems are present. The correct detection of such types of unusual information empowers the decision maker with the capacity to act on the system in order to correctly avoid, correct or react to the situations associated with them. In that application context, heavy extraction machines for pumping and generation operations, like turbomachines, are intensively monitored by hundreds of sensors each that send measurements with a high frequency for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel fast and high quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal is meant for dealing with the aforementioned task and to cope with the lack of labeled training data. As a result, we perform a series of empirical studies comparing our approach to other methods applied to benchmark problems and a real-life application related to oil platform turbomachinery anomaly detection. PMID:25633599

  11. OceanXtremes: Scalable Anomaly Detection in Oceanographic Time-Series

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Armstrong, E. M.; Chin, T. M.; Gill, K. M.; Greguska, F. R., III; Huang, T.; Jacob, J. C.; Quach, N.

    2016-12-01

    The oceanographic community must meet the challenge to rapidly identify features and anomalies in complex and voluminous observations to further science and improve decision support. Given this data-intensive reality, we are developing an anomaly detection system, called OceanXtremes, powered by an intelligent, elastic Cloud-based analytic service backend that enables execution of domain-specific, multi-scale anomaly and feature detection algorithms across the entire archive of 15 to 30-year ocean science datasets. Our parallel analytics engine extends the NEXUS system and exploits multiple open-source technologies: Apache Cassandra as a distributed spatial "tile" cache, Apache Spark for in-memory parallel computation, and Apache Solr for spatial search and for storing pre-computed tile statistics and other metadata. OceanXtremes provides these key capabilities: parallel generation (Spark on a compute cluster) of 15 to 30-year ocean climatologies (e.g. sea surface temperature, SST) in hours or overnight, using simple pixel averages or customizable Gaussian-weighted "smoothing" over latitude, longitude, and time; parallel pre-computation, tiling, and caching of anomaly fields (daily variables minus a chosen climatology) with pre-computed tile statistics; parallel detection (over the time series of tiles) of anomalies or phenomena by regional area-averages exceeding a specified threshold (e.g. high SST in El Nino or SST "blob" regions), or more complex, custom data-mining algorithms; shared discovery and exploration of ocean phenomena and anomalies (facet search using Solr), along with unexpected correlations between key measured variables; and scalable execution of all capabilities on a hybrid Cloud, using our on-premise OpenStack cluster or Amazon.
The key idea is that the parallel data-mining operations will be run "near" the ocean data archives (a local "network" hop), so that we can efficiently access the thousands of files making up a three-decade time series.
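The area-average exceedance test at the heart of the pipeline reduces to a few lines once a climatology is available; a sketch with made-up SST values (the real system computes climatologies and tiles in parallel with Spark):

```python
def flag_exceedances(series, climatology, threshold):
    """Indices where the anomaly (value minus climatology) exceeds threshold."""
    return [i for i, (value, base) in enumerate(zip(series, climatology))
            if value - base > threshold]

# Hypothetical daily region-averaged SST (deg C) against a flat climatology:
sst = [20.1, 20.3, 23.9, 20.0, 24.5]
clim = [20.0, 20.0, 20.0, 20.0, 20.0]
hot_days = flag_exceedances(sst, clim, 2.0)
```

The engineering challenge described above is not this test itself but computing the climatology and anomaly fields across decades of tiled satellite data.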

  12. Anomaly detection in hyperspectral imagery: statistics vs. graph-based algorithms

    NASA Astrophysics Data System (ADS)

    Berkson, Emily E.; Messinger, David W.

    2016-05-01

    Anomaly detection (AD) algorithms are frequently applied to hyperspectral imagery, but different algorithms produce different outlier results depending on the image scene content and the assumed background model. This work provides the first comparison of anomaly score distributions between common statistics-based anomaly detection algorithms (RX and subspace-RX) and the graph-based Topological Anomaly Detector (TAD). Anomaly scores in statistical AD algorithms should theoretically approximate a chi-squared distribution; however, this is rarely the case with real hyperspectral imagery. The expected distribution of scores found with graph-based methods remains unclear. We also look for general trends in algorithm performance with varied scene content. Three separate scenes were extracted from the hyperspectral MegaScene image taken over downtown Rochester, NY with the VIS-NIR-SWIR ProSpecTIR instrument. In order of most to least cluttered, we study an urban, suburban, and rural scene. The three AD algorithms were applied to each scene, and the distributions of the most anomalous 5% of pixels were compared. We find that subspace-RX performs better than RX, because the data becomes more normal when the highest variance principal components are removed. We also see that compared to statistical detectors, anomalies detected by TAD are easier to separate from the background. Due to their different underlying assumptions, the statistical and graph-based algorithms highlighted different anomalies within the urban scene. These results will lead to a deeper understanding of these algorithms and their applicability across different types of imagery.

  13. Occurrence and Detectability of Thermal Anomalies on Europa

    NASA Astrophysics Data System (ADS)

    Hayne, Paul O.; Christensen, Philip R.; Spencer, John R.; Abramov, Oleg; Howett, Carly; Mellon, Michael; Nimmo, Francis; Piqueux, Sylvain; Rathbun, Julie A.

    2017-10-01

    Endogenic activity is likely on Europa, given its young surface age and ongoing tidal heating by Jupiter. Temperature is a fundamental signature of activity, as witnessed on Enceladus, where plumes emanate from vents with strongly elevated temperatures. Recent observations suggest the presence of similar water plumes at Europa. Even if plumes are uncommon, resurfacing may produce elevated surface temperatures, perhaps due to near-surface liquid water. Detecting endogenic activity on Europa is one of the primary mission objectives of NASA’s planned Europa Clipper flyby mission. Here, we use a probabilistic model to assess the likelihood of detectable thermal anomalies on the surface of Europa. The Europa Thermal Emission Imaging System (E-THEMIS) investigation is designed to characterize Europa’s thermal behavior and identify any thermal anomalies due to recent or ongoing activity. We define “detectability” on the basis of expected E-THEMIS measurements, which include multi-spectral infrared emission, both day and night. Thermal anomalies on Europa may take a variety of forms, depending on the resurfacing style, frequency, and duration of events: 1) subsurface melting due to hot spots, 2) shear heating on faults, and 3) eruptions of liquid water or warm ice on the surface. We use numerical and analytical models to estimate temperatures for these features. Once activity ceases, lifetimes of thermal anomalies are estimated to be 100 - 1000 yr. On average, Europa’s 10 - 100 Myr surface age implies a resurfacing rate of ~3 - 30 km2/yr. The typical size of resurfacing features determines their frequency of occurrence. For example, if ~100 km2 chaos features dominate recent resurfacing, we expect one event every few years to decades. Smaller features, such as double-ridges, may be active much more frequently. We model each feature type as a statistically independent event, with probabilities weighted by their observed coverage of Europa’s surface. Our results

  14. Attention focusing and anomaly detection in systems monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, Richard J.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. The focus of this paper is a new technique for attention focusing. The technique involves reasoning about the distance between two frequency distributions, and is used to detect both anomalous system parameters and 'broken' causal dependencies. These two forms of information together isolate the locus of anomalous behavior in the system being monitored.
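The core operation described above, scoring the distance between two frequency distributions, can be sketched as follows. The Jensen-Shannon distance is used here as one reasonable choice of distance measure (the abstract does not commit to a particular metric), and the alert threshold is hypothetical:

```python
import math

def js_distance(p, q):
    """Jensen-Shannon distance between two discrete frequency distributions."""
    def kl(a, b):  # Kullback-Leibler divergence in bits
        return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    m = [(x + y) / 2 for x, y in zip(p, q)]
    return math.sqrt((kl(p, m) + kl(q, m)) / 2)

# Compare a sensor's nominal value-frequency profile with its recent one.
nominal = [0.7, 0.2, 0.1]
recent  = [0.1, 0.2, 0.7]
score = js_distance(nominal, recent)
anomalous = score > 0.3          # hypothetical alert threshold
```

Parameters whose recent distribution drifts far from nominal would be flagged; applying the same comparison across causally linked sensor pairs is one way to expose "broken" dependencies.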

  15. Security inspection in ports by anomaly detection using hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Rivera, Javier; Valverde, Fernando; Saldaña, Manuel; Manian, Vidya

    2013-05-01

    Applying hyperspectral imaging technology to port security is crucial for the detection of possible threats or illegal activities. One of the most common problems that cargo suffers is tampering, which represents a danger to society because it creates a channel for smuggling illegal and hazardous products. If a cargo has been altered, security inspections of that cargo should reveal anomalies indicating the nature of the tampering. Hyperspectral images can detect anomalies by gathering information across multiple electromagnetic bands. The spectra extracted from these bands can be used to detect surface anomalies from different materials. Based on this technology, a scenario was built in which a hyperspectral camera was used to inspect the cargo for any surface anomalies and a user interface shows the results. The spectra of items, altered by different materials that can be used to conceal illegal products, are analyzed and classified in order to provide information about the tampered cargo. The image is analyzed with a variety of techniques such as multiple feature extraction algorithms, autonomous anomaly detection, and target spectrum detection. The results are exported to a workstation or mobile device and shown in an easy-to-use interface. This process could enhance the current capabilities of security systems that are already implemented, providing a more complete approach to detecting threats and illegal cargo.

  16. Real-time Bayesian anomaly detection in streaming environmental data

    NASA Astrophysics Data System (ADS)

    Hill, David J.; Minsker, Barbara S.; Amir, Eyal

    2009-04-01

    With large volumes of data arriving in near real time from environmental sensors, there is a need for automated detection of anomalous data caused by sensor or transmission errors or by infrequent system behaviors. This study develops and evaluates three automated anomaly detection methods using dynamic Bayesian networks (DBNs), which perform fast, incremental evaluation of data as they become available, scale to large quantities of data, and require no a priori information regarding process variables or types of anomalies that may be encountered. This study investigates these methods' abilities to identify anomalies in eight meteorological data streams from Corpus Christi, Texas. The results indicate that DBN-based detectors, using either robust Kalman filtering or Rao-Blackwellized particle filtering, outperform a DBN-based detector using Kalman filtering, with the former having false positive/negative rates of less than 2%. These methods were successful at identifying data anomalies caused by two real events: a sensor failure and a large storm.
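A minimal illustration of Kalman-filter-based anomaly flagging on a univariate stream, in the spirit of the detectors above. This is a generic scalar random-walk filter, not the authors' DBN implementation; the noise parameters and 3-sigma threshold are assumptions:

```python
def kalman_anomalies(data, q=1e-3, r=0.5, k=3.0):
    """Flag points whose innovation exceeds k standard deviations.
    q: process noise, r: measurement noise (both assumed)."""
    x, p = data[0], 1.0          # state estimate and its variance
    flags = []
    for z in data:
        p += q                   # predict step (random-walk model)
        s = p + r                # innovation variance
        innov = z - x
        flags.append(abs(innov) > k * s ** 0.5)
        g = p / s                # Kalman gain
        x += g * innov           # update step
        p *= (1 - g)
    return flags

flags = kalman_anomalies([0.0] * 20 + [10.0] + [0.0] * 5)
```

The spike at index 20 is flagged, while the nominal readings before and after it are not.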

  17. Detecting Anomalies in Process Control Networks

    NASA Astrophysics Data System (ADS)

    Rrushi, Julian; Kang, Kyoung-Don

    This paper presents the estimation-inspection algorithm, a statistical algorithm for anomaly detection in process control networks. The algorithm determines if the payload of a network packet that is about to be processed by a control system is normal or abnormal based on the effect that the packet will have on a variable stored in control system memory. The estimation part of the algorithm uses logistic regression integrated with maximum likelihood estimation in an inductive machine learning process to estimate a series of statistical parameters; these parameters are used in conjunction with logistic regression formulas to form a probability mass function for each variable stored in control system memory. The inspection part of the algorithm uses the probability mass functions to estimate the normalcy probability of a specific value that a network packet writes to a variable. Experimental results demonstrate that the algorithm is very effective at detecting anomalies in process control networks.
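The inspection step described above, scoring the normalcy of a value written to a control-system variable, might be sketched as follows. This is a one-feature toy version: the absolute deviation from a hypothetical setpoint serves as the regressor, and a plain gradient-ascent fit stands in for the paper's maximum likelihood estimation:

```python
import math

def fit_logistic(devs, labels, lr=0.1, steps=2000):
    """MLE-style fit of P(normal | deviation) = sigmoid(w*d + b)
    by gradient ascent on the log-likelihood (illustrative only)."""
    w = b = 0.0
    for _ in range(steps):
        for d, y in zip(devs, labels):
            p = 1.0 / (1.0 + math.exp(-(w * d + b)))
            w += lr * (y - p) * d
            b += lr * (y - p)
    return w, b

def normalcy(dev, w, b):
    """Probability that a packet writing a value at this deviation is normal."""
    return 1.0 / (1.0 + math.exp(-(w * dev + b)))

# Deviations 0-2 were observed in normal traffic; 40-45 in attack traffic.
devs   = [0, 1, 2, 1, 0, 40, 42, 45, 44]
labels = [1, 1, 1, 1, 1, 0,  0,  0,  0]
w, b = fit_logistic(devs, labels)
```

A packet writing a value near the setpoint scores as highly normal, while one far from it scores near zero.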

  18. Statistical Techniques For Real-time Anomaly Detection Using Spark Over Multi-source VMware Performance Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solaimani, Mohiuddin; Iftekhar, Mohammed; Khan, Latifur

    Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to achieve higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint with higher accuracy by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.

  19. A hybrid approach for efficient anomaly detection using metaheuristic methods.

    PubMed

    Ghanem, Tamer F; Elkilani, Wail S; Abdul-Kader, Hatem M

    2015-07-01

    Network intrusion detection based on anomaly detection techniques plays a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated with the multi-start metaheuristic method and genetic algorithms. The proposed approach draws inspiration from negative selection-based detector generation. The approach is evaluated on the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors and achieving an accuracy of 96.1%, outperforming competing machine learning algorithms.
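Negative selection-based detector generation, which the approach above takes as its starting point, can be illustrated with a one-dimensional toy sketch. Real detectors live in a high-dimensional feature space and are generated by metaheuristics rather than pure rejection sampling; the interval radius and sample values here are arbitrary:

```python
import random

def generate_detectors(self_set, n, radius, lo=0.0, hi=1.0, seed=0):
    """Negative selection: keep random candidate detectors that do NOT
    match any 'self' (normal) sample within the given radius."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n:
        c = rng.uniform(lo, hi)
        if all(abs(c - s) > radius for s in self_set):
            detectors.append(c)
    return detectors

def is_anomalous(x, detectors, radius):
    """A sample matched by any detector is flagged as non-self (anomalous)."""
    return any(abs(x - d) <= radius for d in detectors)

# Normal traffic occupies a narrow band of the feature space.
detectors = generate_detectors(self_set=[0.45, 0.50, 0.55], n=50, radius=0.05)
```

By construction, no detector falls inside the self region, so normal samples are never flagged.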

  20. Development of anomaly detection models for deep subsurface monitoring

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.

    2017-12-01

    Deep subsurface repositories are used for waste disposal and carbon sequestration. Monitoring deep subsurface repositories for potential anomalies is challenging, not only because the number of sensor networks and the quality of data are often limited, but also because of the lack of labeled data needed to train and validate machine learning (ML) algorithms. Although physical simulation models may be applied to predict anomalies (or the system's nominal state, for that matter), the accuracy of such predictions may be limited by inherent conceptual and parameter uncertainties. The main objective of this study was to demonstrate the potential of data-driven models for leakage detection in carbon sequestration repositories. Monitoring data collected during an artificial CO2 release test at a carbon sequestration repository were used, which include both scalar time series (pressure) and vector time series (distributed temperature sensing). For each type of data, separate online anomaly detection algorithms were developed using the baseline experiment data (no leak) and then tested on the leak experiment data. The performance of a number of different online algorithms was compared. Results show the importance of including contextual information in the dataset to mitigate the impact of reservoir noise and reduce the false positive rate. The developed algorithms were integrated into a generic Web-based platform for real-time anomaly detection.
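The baseline-then-test workflow described above can be illustrated with a deliberately simple online detector: learn mean and variance from no-leak baseline data using Welford's online algorithm, then score incoming points by their deviation. The threshold k and the sample readings are assumptions, and the study itself compared more sophisticated algorithms:

```python
class OnlineZScore:
    """Baseline-trained detector: learn mean/variance from no-leak data
    incrementally (Welford's method), then flag large deviations."""
    def __init__(self, k=4.0):
        self.k, self.n, self.mean, self.m2 = k, 0, 0.0, 0.0

    def train(self, x):
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)   # running sum of squared deviations

    def score(self, x):
        std = (self.m2 / (self.n - 1)) ** 0.5
        return abs(x - self.mean) / std

    def is_anomaly(self, x):
        return self.score(x) > self.k

detector = OnlineZScore(k=4.0)
for p in [10.0, 10.1, 9.9] * 20:   # hypothetical baseline pressure readings
    detector.train(p)
```

A sudden pressure excursion during the leak experiment then scores far above the threshold, while nominal readings do not.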

  1. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
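The normalized LCS similarity at the heart of the clustering step can be computed with the standard dynamic program below; the geometric-mean normalization shown is one common convention and may differ from the paper's exact definition. The paper's hybrid algorithm is faster in practice than this quadratic baseline:

```python
def lcs_len(a, b):
    """Length of the longest common subsequence via the standard
    O(len(a)*len(b)) dynamic program."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i-1][j-1] + 1 if x == y else max(dp[i-1][j], dp[i][j-1])
    return dp[-1][-1]

def nlcs(a, b):
    """Normalized LCS similarity in [0, 1] (geometric-mean normalization)."""
    return lcs_len(a, b) / ((len(a) * len(b)) ** 0.5)
```

Sequences whose normalized similarity to every cluster is low are the outlier candidates passed to the detailed analysis stage.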

  2. Accumulating pyramid spatial-spectral collaborative coding divergence for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Zou, Huanxin; Zhou, Shilin

    2016-03-01

    Detection of anomalous targets of various sizes in hyperspectral data has received a lot of attention in reconnaissance and surveillance applications. Many anomaly detectors have been proposed in the literature. However, current methods are sensitive to the choice of processing window and often make critical assumptions about the distribution of the background data. Motivated by the fact that anomaly pixels are often distinctive from their local background, in this letter we propose a novel hyperspectral anomaly detection framework for real-time remote sensing applications. The proposed framework consists of four major components: sparse feature learning, pyramid grid window selection, joint spatial-spectral collaborative coding, and multi-level divergence fusion. It exploits the collaborative representation difference in the feature space to locate potential anomalies and is totally unsupervised, without any prior assumptions. Experimental results on airborne hyperspectral data demonstrate that the proposed method is adaptive to anomalies across a large range of sizes and is well suited for parallel processing.

  3. A new comparison of hyperspectral anomaly detection algorithms for real-time applications

    NASA Astrophysics Data System (ADS)

    Díaz, María.; López, Sebastián.; Sarmiento, Roberto

    2016-10-01

    Due to the high spectral resolution that remotely sensed hyperspectral images provide, there has been an increasing interest in anomaly detection. The aim of anomaly detection is to single out pixels whose spectral signature differs significantly from the background spectra. Basically, anomaly detectors mark pixels with a certain score, considering as anomalies those whose scores are higher than a threshold. Receiver Operating Characteristic (ROC) curves have been widely used as an assessment measure to compare the performance of different algorithms. ROC curves are graphical plots which illustrate the trade-off between false positive and true positive rates. However, they are of limited use for deeper comparisons because they discard relevant factors required in real-time applications, such as run times, costs of misclassification, and the ability to mark anomalies with high scores. This last factor is fundamental in anomaly detection in order to distinguish anomalies easily from the background without any posterior processing. An extensive set of simulations has been made using different anomaly detection algorithms, comparing their performance and efficiency using several extra metrics to complement ROC curve analysis. Results support our proposal and demonstrate that ROC curves by themselves do not provide a good visualization of detection performance. Moreover, a figure of merit is proposed in this paper which encompasses in a single global metric all the measures yielded by the proposed additional metrics. This figure, named Detection Efficiency (DE), takes into account several crucial types of performance assessment that ROC curves do not consider. Results demonstrate that algorithms with the best detection performance according to ROC curves do not have the highest DE values. Consequently, the recommendation to use extra measures to properly evaluate performance has been supported and justified by
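The point that ROC curves ignore score margins can be made concrete with a small sketch: two hypothetical detectors rank the same pixels identically (hence identical ROC behaviour and AUC) while separating anomalies from background by very different margins:

```python
def roc_auc(scores, labels):
    """AUC via the rank-sum (Mann-Whitney) formulation."""
    pos = [s for s, l in zip(scores, labels) if l]
    neg = [s for s, l in zip(scores, labels) if not l]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def margin(scores, labels):
    """Gap between the weakest anomaly score and the strongest background
    score -- one of the extra factors ROC curves discard."""
    return (min(s for s, l in zip(scores, labels) if l)
            - max(s for s, l in zip(scores, labels) if not l))

labels = [1, 1, 0, 0]                 # 1 = anomaly pixel
det_a  = [0.90, 0.80, 0.20, 0.10]     # anomalies stand far above background
det_b  = [0.51, 0.50, 0.49, 0.48]     # same ranking, tiny separation
```

Both detectors achieve AUC 1.0, yet only det_a marks anomalies with clearly separated scores, which is exactly the distinction a complementary metric such as the proposed DE is meant to capture.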

  4. Symbolic Time-Series Analysis for Anomaly Detection in Mechanical Systems

    DTIC Science & Technology

    2006-08-01

    Amol Khatkhate, Asok Ray (Fellow, IEEE), Eric Keller, Shalabh Gupta, and Shin C. Chin. Abstract: This paper examines the efficacy of a novel method for... recognition. ...anomaly detection has been proposed by Ray [6], where the underlying information on the dynamical behavior of complex systems is derived based on

  5. A hybrid approach for efficient anomaly detection using metaheuristic methods

    PubMed Central

    Ghanem, Tamer F.; Elkilani, Wail S.; Abdul-kader, Hatem M.

    2014-01-01

    Network intrusion detection based on anomaly detection techniques plays a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated with the multi-start metaheuristic method and genetic algorithms. The proposed approach draws inspiration from negative selection-based detector generation. The approach is evaluated on the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors and achieving an accuracy of 96.1%, outperforming competing machine learning algorithms. PMID:26199752

  6. Autonomous detection of crowd anomalies in multiple-camera surveillance feeds

    NASA Astrophysics Data System (ADS)

    Nordlöf, Jonas; Andersson, Maria

    2016-10-01

    A novel approach for autonomous detection of anomalies in crowded environments is presented in this paper. The proposed model uses a Gaussian mixture probability hypothesis density (GM-PHD) filter as a feature extractor in conjunction with different Gaussian mixture hidden Markov models (GM-HMMs). Results, based on both simulated and recorded data, indicate that this method can track and detect anomalies on-line in individual crowds through multiple camera feeds in a crowded environment.

  7. Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity.

    PubMed

    Napoletano, Paolo; Piccoli, Flavio; Schettini, Raimondo

    2018-01-12

    Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM) imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs) and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art.

  8. Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity

    PubMed Central

    Schettini, Raimondo

    2018-01-01

    Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM) imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs) and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art. PMID:29329268

  9. Efficient Mining and Detection of Sequential Intrusion Patterns for Network Intrusion Detection Systems

    NASA Astrophysics Data System (ADS)

    Shyu, Mei-Ling; Huang, Zifang; Luo, Hongli

    In recent years, pervasive computing infrastructures have greatly improved the interaction between humans and systems. As we place more reliance on these computing infrastructures, we also face threats of network intrusion and/or new forms of undesirable IT-based activities. Hence, network security has become an extremely important issue, closely connected with homeland security, business transactions, and people's daily lives. Accurate and efficient intrusion detection technologies are required to safeguard network systems and the critical information transmitted through them. In this chapter, a novel network intrusion detection framework for mining and detecting sequential intrusion patterns is proposed. The proposed framework consists of a Collateral Representative Subspace Projection Modeling (C-RSPM) component for supervised classification, and an inter-transactional association rule mining method based on Layer Divided Modeling (LDM) for temporal pattern analysis. Experiments on the KDD99 data set and the traffic data set generated by a private LAN testbed show promising results, with high detection rates, low processing time, and low false alarm rates in mining and detecting sequential intrusion patterns.

  10. Randomized subspace-based robust principal component analysis for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Weiwei; Yang, Gang; Li, Jialin; Zhang, Dianfa

    2018-01-01

    A randomized subspace-based robust principal component analysis (RSRPCA) method for anomaly detection in hyperspectral imagery (HSI) is proposed. The RSRPCA combines the advantages of randomized column subspaces and robust principal component analysis (RPCA). It assumes that the background has low-rank properties and that the anomalies are sparse and do not lie in the column subspace of the background. First, RSRPCA implements random sampling to sketch the original HSI dataset from columns and to construct a randomized column subspace of the background. Structured random projections are also adopted to sketch the HSI dataset from rows. Sketching from columns and rows greatly reduces the computational requirements of RSRPCA. Second, RSRPCA adopts the columnwise RPCA (CWRPCA) to eliminate the negative effects of sampled anomaly pixels, purifying the previous randomized column subspace by removing sampled anomaly columns. The CWRPCA decomposes the submatrix of the HSI data into a low-rank matrix (i.e., the background component), a noisy matrix (i.e., the noise component), and a sparse anomaly matrix (i.e., the anomaly component) with only a small proportion of nonzero columns. The inexact augmented Lagrange multiplier algorithm is utilized to optimize the CWRPCA problem and estimate the sparse matrix. Nonzero columns of the sparse anomaly matrix point to sampled anomaly columns in the submatrix. Third, all pixels are projected onto the complementary subspace of the purified randomized column subspace of the background, and the anomaly pixels in the original HSI data are finally located exactly. Several experiments on three real hyperspectral images are carefully designed to investigate the detection performance of RSRPCA, and the results are compared with four state-of-the-art methods. Experimental results show that the proposed RSRPCA outperforms the four comparison methods in both detection performance and computational time.

  11. Anomaly Detection for Next-Generation Space Launch Ground Operations

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Iverson, David L.; Hall, David R.; Taylor, William M.; Patterson-Hine, Ann; Brown, Barbara; Ferrell, Bob A.; Waterman, Robert D.

    2010-01-01

    NASA is developing new capabilities that will enable future human exploration missions while reducing mission risk and cost. The Fault Detection, Isolation, and Recovery (FDIR) project aims to demonstrate the utility of integrated vehicle health management (IVHM) tools in the domain of ground support equipment (GSE) to be used for the next generation launch vehicles. In addition to demonstrating the utility of IVHM tools for GSE, FDIR aims to mature promising tools for use on future missions and document the level of effort - and hence cost - required to implement an application with each selected tool. One of the FDIR capabilities is anomaly detection, i.e., detecting off-nominal behavior. The tool we selected for this task uses a data-driven approach. Unlike rule-based and model-based systems that require manual extraction of system knowledge, data-driven systems take a radically different approach to reasoning. At the basic level, they start with data that represent nominal functioning of the system and automatically learn expected system behavior. The behavior is encoded in a knowledge base that represents "in-family" system operations. During real-time system monitoring or during post-flight analysis, incoming data is compared to that nominal system operating behavior knowledge base; a distance representing deviation from nominal is computed, providing a measure of how far "out of family" current behavior is. We describe the selected tool for FDIR anomaly detection - Inductive Monitoring System (IMS), how it fits into the FDIR architecture, the operations concept for the GSE anomaly monitoring, and some preliminary results of applying IMS to a Space Shuttle GSE anomaly.

  12. Fault detection on a sewer network by a combination of a Kalman filter and a binary sequential probability ratio test

    NASA Astrophysics Data System (ADS)

    Piatyszek, E.; Voignier, P.; Graillot, D.

    2000-05-01

    One of the aims of sewer networks is the protection of the population against floods and the reduction of pollution discharged into the receiving water during rainy events. To meet these goals, managers have to equip sewer networks with real-time control systems. Unfortunately, a component fault (leading to intolerable behaviour of the system) or a sensor fault (deteriorating the process view and disturbing the local automatism) makes sewer network supervision delicate. In order to ensure adequate flow management during rainy events, it is essential to set up procedures capable of detecting and diagnosing these anomalies. This article introduces a real-time fault detection method, applicable to sewer networks, for the follow-up of rainy events. The method consists of comparing the sensor response with a forecast of this response. This forecast is provided by a model, and more precisely by a state estimator: a Kalman filter. The Kalman filter provides not only a flow estimate but also an entity called the 'innovation'. In order to detect abnormal operations within the network, this innovation is analysed with Wald's binary sequential probability ratio test. Moreover, by crossing available information from several nodes of the network, a diagnosis of the detected anomalies is carried out. This method provided encouraging results during the analysis of several rain events on the sewer network of Seine-Saint-Denis County, France.
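Wald's binary sequential probability ratio test on Kalman innovations can be sketched as follows: under the no-fault hypothesis the innovations are zero-mean, and a persistent shift drives the log-likelihood ratio to the upper decision boundary. The shift size, noise level, and error rates here are illustrative assumptions, not the article's tuned values:

```python
import math

def sprt(innovations, sigma=1.0, shift=2.0, alpha=0.01, beta=0.01):
    """Wald's binary SPRT: H0 innovations ~ N(0, sigma^2) (no fault)
    vs H1 ~ N(shift, sigma^2) (fault). alpha/beta are target error rates."""
    upper = math.log((1 - beta) / alpha)    # accept H1 (fault) boundary
    lower = math.log(beta / (1 - alpha))    # accept H0 (ok) boundary
    llr = 0.0
    for v in innovations:
        # Log-likelihood ratio increment for a Gaussian mean shift.
        llr += (shift * v - shift ** 2 / 2) / sigma ** 2
        if llr >= upper:
            return "fault"
        if llr <= lower:
            return "ok"
    return "undecided"
```

A run of near-zero innovations is quickly accepted as normal, while a sustained offset between forecast and sensor response triggers a fault decision after only a few samples.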

  13. Radon anomalies: When are they possible to be detected?

    NASA Astrophysics Data System (ADS)

    Passarelli, Luigi; Woith, Heiko; Seyis, Cemil; Nikkhoo, Mehdi; Donner, Reik

    2017-04-01

    Records of the radon noble gas in different environments such as soil, air, groundwater, rock, caves, and tunnels typically display cyclic variations, including diurnal (S1), semidiurnal (S2), and seasonal components, although there are also cases where these cycles are absent. Interestingly, radon emission can also be affected by transient processes, which inhibit or enhance the radon carrying process at the surface. This results in transient changes in the radon emission rate, superimposed on the low- and high-frequency cycles. The complexity of the spectral content of radon time series makes any statistical analysis aimed at understanding the physical driving processes a challenging task. In the past decades there have been several attempts to relate changes in radon emission rate to physical triggering processes such as earthquake occurrence. One of the problems in this type of investigation is to objectively detect anomalies in the radon time series. In the present work, we propose a simple and objective statistical method for detecting changes in radon emission rate time series. The method uses non-parametric statistical tests (e.g., Kolmogorov-Smirnov) to compare empirical distributions of the radon emission rate by sequentially applying various time windows to the time series. The statistical test indicates whether two empirical distributions of data originate from the same distribution at a desired significance level. We test the algorithm on synthetic data in order to explore the sensitivity of the statistical test to the sample size. We then apply the test to six radon emission rate recordings from stations located around the Marmara Sea obtained within the MARsite project (MARsite has received funding from the European Union's Seventh Programme for research, technological development and demonstration under grant agreement No 308417). We conclude that the test performs relatively well at identifying transient changes in the radon emission
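The windowed two-sample comparison described above can be sketched with the Kolmogorov-Smirnov statistic computed directly; the paper works at a chosen significance level, so the fixed threshold here is a simplification:

```python
def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: max distance between
    the two empirical CDFs, evaluated at every observed value."""
    a, b = sorted(a), sorted(b)
    ecdf = lambda s, x: sum(v <= x for v in s) / len(s)
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in a + b)

def scan(series, w, thresh=0.5):
    """Slide two adjacent windows of width w along the series and flag
    indices where the KS statistic between them exceeds a (hypothetical)
    threshold, instead of a formal significance level."""
    return [i for i in range(w, len(series) - w + 1)
            if ks_stat(series[i - w:i], series[i:i + w]) > thresh]
```

A step change in emission rate is flagged where the two windows straddle it, while stationary stretches are not.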

  14. Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)

    2001-01-01

    The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is basically a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes the uncertainty and vagueness typically found in image analysis. ANN provides learning capabilities, and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along slide basket cables utilized by the astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another current, potentially viable application at NASA is in detecting anomalies of the NASA Space Shuttle Orbiter's radiator panels.

  15. Machine intelligence-based decision-making (MIND) for automatic anomaly detection

    NASA Astrophysics Data System (ADS)

    Prasad, Nadipuram R.; King, Jason C.; Lu, Thomas

    2007-04-01

    Any event deemed out-of-the-ordinary may be called an anomaly. Anomalies, by virtue of their definition, are events that occur spontaneously with no prior indication of their existence or appearance. Effects of anomalies are typically unknown until they actually occur, and their effects aggregate in time to show noticeable change from the original behavior. An evolved behavior would in general be very difficult to correct unless the anomalous event that caused it can be detected early and any consequence attributed to the specific anomaly. Substantial time and effort are required to back-track the cause of abnormal behavior and to recreate the event sequence leading to it. There is therefore a critical need to automatically detect anomalous behavior as and when it occurs, and to do so with the operator in the loop. Human-machine interaction results in better machine learning and a better decision-support mechanism. This is the fundamental concept of intelligent control, where machine learning is enhanced by interaction with human operators, and vice versa. The paper discusses a revolutionary framework for the characterization, detection, identification, learning, and modeling of anomalous behavior in observed phenomena arising from a large class of unknown and uncertain dynamical systems.

  16. Residual Error Based Anomaly Detection Using Auto-Encoder in SMD Machine Sound.

    PubMed

    Oh, Dong Yul; Yun, Il Dong

    2018-04-24

    Detecting an anomaly or an abnormal situation from given noise is highly useful in an environment where constantly verifying and monitoring a machine is required. As deep learning algorithms are further developed, current studies have focused on this problem. However, there are too many variables to define anomalies, and the human annotation for a large collection of abnormal data labeled at the class-level is very labor-intensive. In this paper, we propose to detect abnormal operation sounds or outliers in a very complex machine along with reducing the data-driven annotation cost. The architecture of the proposed model is based on an auto-encoder, and it uses the residual error, which stands for its reconstruction quality, to identify the anomaly. We assess our model using Surface-Mounted Device (SMD) machine sound, which is very complex, as experimental data, and state-of-the-art performance is successfully achieved for anomaly detection.

  17. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines

    PubMed Central

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-01-01

    In a complex system, condition monitoring (CM) can collect the system's working status. The condition is mainly sensed by pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict upcoming anomalies, faults, or failures. Some research also focuses on faults or anomalies of the sensing element (i.e., the sensor) to enhance system reliability. However, existing approaches ignore the correlation between the sensor selection strategy and data anomaly detection, which can also improve system reliability. To address this issue, we study a new scheme which includes a sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are most appropriate for system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets used for the evaluation are provided by the National Aeronautics and Space Administration (NASA) Ames Research Center and were used as the Prognostics and Health Management (PHM) challenge data in 2008. By comparing two different sensor selection strategies, the effect of the selection method on data anomaly detection is demonstrated. PMID:27136561
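    The two stages described above can be sketched with simulated signals standing in for engine sensors (the sensor model, the GPR settings, and the residual-based flagging rule are assumptions, not the paper's exact scheme):

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

# Simulated engine sensors: s1 and s2 track the same condition; s3 does not.
t = np.linspace(0, 10, 300)
s1 = np.sin(t) + rng.normal(scale=0.05, size=t.size)
s2 = 0.8 * np.sin(t) + 0.1 + rng.normal(scale=0.05, size=t.size)
s3 = rng.normal(size=t.size)                        # uninformative sensor

# Mutual information weights how informative each candidate sensor is about s1.
mi = mutual_info_regression(np.column_stack([s2, s3]), s1, random_state=0)

# GPR predicts s1 from the informative sensor; a large prediction residual
# flags an anomalous data point.
gpr = GaussianProcessRegressor(alpha=1e-2).fit(s2[:200, None], s1[:200])
s1_test = s1[200:].copy()
s1_test[50] += 1.0                                  # inject a data anomaly
resid = np.abs(s1_test - gpr.predict(s2[200:, None]))
print("MI weights:", mi, "flagged index:", int(np.argmax(resid)))
```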

  19. Spectral anomaly methods for aerial detection using KUT nuisance rejection

    NASA Astrophysics Data System (ADS)

    Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.

    2015-06-01

    This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical, due to higher speeds, a larger field of view, and geographically induced background changes. In addition, large variations in altitude or stand-off distance cause significant steps in the background count rate, as well as spectral changes due to increased gamma-ray scatter when detecting at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems: a NaI(Tl)-detector-based system and a CsI detector array. The optimization adapts the spectral windows for a particular set of target sources to aerial detection and tailors them to the specific detectors. The methodology and results for background rejection methods optimized for aerial gamma-ray detection using Potassium, Uranium and Thorium (KUT) nuisance rejection are also shown. Results indicate that a realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection due to altitude changes and geographically induced steps, such as at land-water interfaces.
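    The spectral-comparison-ratio idea can be illustrated on a toy spectrum (this is not PNNL's NSCRAD implementation; the background shape, the window boundaries, and the Mahalanobis-style metric are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

channels = np.arange(128)
shape = np.exp(-channels / 40.0)                  # assumed background shape

edges = [0, 16, 32, 64, 128]                      # assumed window boundaries

def ratios(spectrum):
    # Counts in each energy window, normalized by the total: the ratios
    # track spectral shape and are insensitive to overall count rate.
    counts = np.array([spectrum[a:b].sum() for a, b in zip(edges[:-1], edges[1:])])
    return counts[:-1] / counts.sum()             # drop one to avoid degeneracy

background = rng.poisson(shape * 200, size=(200, 128))
R = np.array([ratios(s) for s in background])
mu, cov_inv = R.mean(axis=0), np.linalg.inv(np.cov(R.T))

def metric(spectrum):
    d = ratios(spectrum) - mu
    return float(d @ cov_inv @ d)                 # Mahalanobis-style distance

# Tripling the count rate (as at lower altitude) keeps the spectral shape,
# so the metric stays small; a photopeak in channels 32-40 distorts the ratios.
scaled = rng.poisson(shape * 600)
peak = np.where((channels >= 32) & (channels < 40), 80, 0)
source = rng.poisson(shape * 200) + rng.poisson(peak)
print(metric(scaled), metric(source))
```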

  20. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. Previous results on extending traditional anomaly detection techniques are summarized. The focus of this paper is a new technique for attention focusing.

  1. A Semiparametric Model for Hyperspectral Anomaly Detection

    DTIC Science & Technology

    2012-01-01

    treeline ) in the presence of natural background clutter (e.g., trees, dirt roads, grasses). Each target consists of about 7 × 4 pixels, and each pixel...vehicles near the treeline in Cube 1 (Figure 1) constitutes the target set, but, since anomaly detectors are not designed to detect a particular target

  2. Anomaly Detection of Electromyographic Signals.

    PubMed

    Ijaz, Ahsan; Choi, Jongeun

    2018-04-01

    In this paper, we provide a robust framework to detect anomalous electromyographic (EMG) signals and identify contamination types. As a first step for feature selection, an optimally selected Lawton wavelet transform is applied. Robust principal component analysis (rPCA) is then performed on these wavelet coefficients to obtain features in a lower dimension. The rPCA-based features are used for constructing a self-organizing map (SOM). Finally, hierarchical clustering is applied on the SOM, which separates anomalous signals residing in the smaller clusters and breaks them into logical units for contamination identification. The proposed methodology is tested using synthetic and real-world EMG signals. The synthetic EMG signals are generated using a heteroscedastic process mimicking the desired experimental setups. A subset of these synthetic signals is introduced with anomalies. These results are followed by real EMG signals with synthetic anomalies introduced. Finally, a heterogeneous real-world data set with known quality issues is used under an unsupervised setting. The framework provides a recall of 90% (±3.3) and a precision of 99% (±0.4).

  3. Extending TOPS: Ontology-driven Anomaly Detection and Analysis System

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Michaelis, A.

    2010-12-01

    The Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending TOPS to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. We can query the knowledge base and answer questions about dataset compatibilities, similarities and dependencies so that we can, for example, automatically analyze similar datasets in order to verify a given anomaly occurrence in multiple data sources. We are further extending the system to go beyond anomaly detection towards reasoning about possible causes of anomalies that are also encoded in the knowledge base as either learned or implied knowledge. This enables us to scale up the analysis by eliminating a large number of anomalies early on during the processing, either through failure to verify them from other sources or by matching them directly with other observable events, without having to perform an extensive and time-consuming exploration and analysis. The knowledge is captured using the OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. The information is stored using a Sesame server and is accessible through both a Java API and web services using the SeRQL and SPARQL query languages. Inference is provided by the OWLIM component integrated with Sesame.

  4. Steganography anomaly detection using simple one-class classification

    NASA Astrophysics Data System (ADS)

    Rodriguez, Benjamin M.; Peterson, Gilbert L.; Agaian, Sos S.

    2007-04-01

    There are several security issues tied to multimedia when implementing the various applications in the cellular phone and wireless industry. One primary concern is the potential ease of implementing a steganography system. Traditionally, the only mechanism to embed information into a media file has been with a desktop computer. However, as the cellular phone and wireless industry matures, it becomes much simpler for the same techniques to be performed using a cell phone. In this paper, two methods are compared that classify cell phone images as either an anomaly or clean, where a clean image is one in which no alterations have been made and an anomalous image is one in which information has been hidden within the image. An image in which information has been hidden is known as a stego image. The main concern in detecting steganographic content with machine learning using cell phone images is in training specific embedding procedures to determine if the method has been used to generate a stego image. This leads to a possible flaw in the system when the learned model of stego is faced with a new stego method which doesn't match the existing model. The proposed solution to this problem is to develop systems that detect steganography as anomalies, making the embedding method irrelevant in detection. Two applicable classification methods for solving the anomaly detection of steganographic content problem are single class support vector machines (SVM) and Parzen-window. Empirical comparison of the two approaches shows that Parzen-window outperforms the single class SVM most likely due to the fact that Parzen-window generalizes less.
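    The two one-class detectors compared above can be sketched on toy features (the synthetic "clean"/"stego" clusters and the kernel and bandwidth settings are assumptions; a real system would use image-derived features):

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)

# Stand-in features for clean images: a single cluster; stego features drift away.
X_clean = rng.normal(size=(400, 2))
X_test = np.vstack([rng.normal(size=(50, 2)),           # clean-like
                    rng.normal(loc=6.0, size=(50, 2))]) # stego-like
y_true = np.array([1] * 50 + [-1] * 50)

# One-class SVM: learns a boundary around the clean class only.
svm_pred = OneClassSVM(kernel="rbf", nu=0.05).fit(X_clean).predict(X_test)

# Parzen-window: kernel density estimate on the clean class; low density => anomaly.
kde = KernelDensity(bandwidth=0.5).fit(X_clean)
thresh = np.percentile(kde.score_samples(X_clean), 5)   # allow 5% training rejection
kde_pred = np.where(kde.score_samples(X_test) >= thresh, 1, -1)

svm_acc = (svm_pred == y_true).mean()
kde_acc = (kde_pred == y_true).mean()
print(f"one-class SVM: {svm_acc:.2f}, Parzen-window: {kde_acc:.2f}")
```

    Because both detectors are trained on clean data only, the embedding method never enters the model, which is exactly what makes the anomaly formulation attractive here.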

  5. Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering

    NASA Astrophysics Data System (ADS)

    Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech

    2015-03-01

    We present a novel method for automated anomaly detection on autofluorescence data provided by the National Institutes of Health (NIH). This is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, track the progression over time, and test the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures such as wavelet analysis or dimensionality reduction algorithms followed by a classification algorithm, e.g., Support Vector Machine. The method that we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality-preserving properties. By applying LE, we are able to represent the data in the form of eigenimages, some of which accentuate the visibility of anomalies. We pick significant eigenimages and proceed with the VMF algorithm, which classifies anomalies across all of these eigenimages simultaneously. To evaluate our performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection on single images, and a combination of PCA and VMF. LE combined with VMF performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of using a nonlinear approach to represent the data and the effectiveness of VMF, which operates on the images as a data cube rather than on individual images.

  6. Detection of anomalies in radio tomography of asteroids: Source count and forward errors

    NASA Astrophysics Data System (ADS)

    Pursiainen, S.; Kaasalainen, M.

    2014-09-01

    The purpose of this study was to advance numerical methods for radio tomography, in which an asteroid's internal electric permittivity distribution is to be recovered from radio frequency data gathered by an orbiter. The focus was on signal generation via multiple sources (transponders), providing one potential, or even essential, scenario to be implemented in a challenging in situ measurement environment and within tight payload limits. As a novel feature, the effects of forward errors, including noise and a priori uncertainty of the forward (data) simulation, were examined through a combination of the iterative alternating sequential (IAS) inverse algorithm and finite-difference time-domain (FDTD) simulation of time evolution data. Single and multiple source scenarios were compared in two-dimensional localization of permittivity anomalies. Three different anomaly strengths and four levels of total noise were tested. Results suggest, among other things, that multiple sources can be necessary to obtain appropriate results, for example, to distinguish three separate anomalies with permittivity less than or equal to half of the background value, which is relevant to the recovery of internal cavities.

  7. Anomaly Detection Techniques for Ad Hoc Networks

    ERIC Educational Resources Information Center

    Cai, Chaoli

    2009-01-01

    Anomaly detection is an important and indispensable aspect of any computer security mechanism. Ad hoc and mobile networks consist of a number of peer mobile nodes that are capable of communicating with each other absent a fixed infrastructure. Arbitrary node movements and lack of centralized control make them vulnerable to a wide variety of…

  8. Detection of admittivity anomaly on high-contrast heterogeneous backgrounds using frequency difference EIT.

    PubMed

    Jang, J; Seo, J K

    2015-06-01

    This paper describes a multiple background subtraction method in frequency difference electrical impedance tomography (fdEIT) to detect an admittivity anomaly in a high-contrast background conductivity distribution. The proposed method expands the use of the conventional weighted frequency difference EIT method, whose use has so far been limited to detecting admittivity anomalies in a roughly homogeneous background. The proposed method can be viewed as multiple weighted difference imaging in fdEIT. Although the spatial resolution of the output images of fdEIT is very low due to the inherent ill-posedness, numerical simulations and phantom experiments demonstrate the feasibility of the proposed method for detecting anomalies. It has potential application in stroke detection in a head model, which is highly heterogeneous due to the skull.

  9. Inflight and Preflight Detection of Pitot Tube Anomalies

    NASA Technical Reports Server (NTRS)

    Mitchell, Darrell W.

    2014-01-01

    The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates sensors' response time to the dynamic component (noise) found in the signal of these same sensors. This analysis technique has used existing electrical signals of Pitot tube sensors that result from measured processes during inflight conditions and/or induced signals in preflight conditions to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants. The application of this technology for the detection of aircraft anomalies is innovative. Instead of determining the health of process monitoring at a steady-state condition, this technology will be used to quickly inform the pilot when an air-speed indication becomes faulty under any flight condition as well as during preflight preparation.

  10. Classification of SD-OCT volumes for DME detection: an anomaly detection approach

    NASA Astrophysics Data System (ADS)

    Sankar, S.; Sidibé, D.; Cheung, Y.; Wong, T. Y.; Lamoureux, E.; Milea, D.; Meriaudeau, F.

    2016-03-01

    Diabetic Macular Edema (DME) is the leading cause of blindness amongst diabetic patients worldwide. It is characterized by accumulation of water molecules in the macula leading to swelling. Early detection of the disease helps prevent further loss of vision. Naturally, automated detection of DME from Optical Coherence Tomography (OCT) volumes plays a key role. To this end, a pipeline for detecting DME diseases in OCT volumes is proposed in this paper. The method is based on anomaly detection using Gaussian Mixture Model (GMM). It starts with pre-processing the B-scans by resizing, flattening, filtering and extracting features from them. Both intensity and Local Binary Pattern (LBP) features are considered. The dimensionality of the extracted features is reduced using PCA. As the last stage, a GMM is fitted with features from normal volumes. During testing, features extracted from the test volume are evaluated with the fitted model for anomaly and classification is made based on the number of B-scans detected as outliers. The proposed method is tested on two OCT datasets achieving a sensitivity and a specificity of 80% and 93% on the first dataset, and 100% and 80% on the second one. Moreover, experiments show that the proposed method achieves better classification performances than other recently published works.
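    The volume-classification rule described above reduces to thresholding per-B-scan likelihoods under a GMM fitted on normal data. A minimal sketch with random stand-in features (the feature dimension, component count, and 1st-percentile threshold are assumptions, not the paper's pipeline):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Stand-in for per-B-scan feature vectors (e.g. PCA-reduced intensity/LBP).
normal_feats = rng.normal(size=(300, 3))

# Fit the GMM on normal B-scans only; low log-likelihood marks an outlier.
gmm = GaussianMixture(n_components=2, random_state=0).fit(normal_feats)
thresh = np.percentile(gmm.score_samples(normal_feats), 1)

def n_outlier_scans(volume_feats):
    # The volume is classified by how many of its B-scans score below threshold.
    return int((gmm.score_samples(volume_feats) < thresh).sum())

healthy_volume = rng.normal(size=(40, 3))
dme_volume = np.vstack([rng.normal(size=(30, 3)),
                        rng.normal(loc=5.0, size=(10, 3))])  # 10 abnormal scans
print(n_outlier_scans(healthy_volume), n_outlier_scans(dme_volume))
```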

  11. Tactile sensor of hardness recognition based on magnetic anomaly detection

    NASA Astrophysics Data System (ADS)

    Xue, Lingyun; Zhang, Dongfang; Chen, Qingguang; Rao, Huanle; Xu, Ping

    2018-03-01

    Hardness sensing, as one kind of tactile sensing, plays an important role in intelligent robot applications such as gripping, agricultural harvesting, prosthetic hands and so on. Recently, with the rapid development of high-performance magnetic field sensing technology, a number of magnetic sensors have been developed for intelligent applications. A tunnel magnetoresistance (TMR) element, based on the magnetoresistance principle, works as the sensitive element to detect the magnetic field, and it has proven its excellent ability for weak magnetic detection. In this paper, a new method based on magnetic anomaly detection is proposed to detect hardness in a tactile way. The sensor is composed of an elastic body, a ferrous probe, a TMR element and a permanent magnet. When the elastic body embedded with the ferrous probe touches an object under a certain force, the elastic body deforms. Correspondingly, the ferrous probe is displaced and the background magnetic field is distorted. The distorted magnetic field is detected by the TMR element, and the output signal is sampled at different times. The slope of the magnetic signal with sampling time differs for objects of different hardness. The results indicate that the magnetic anomaly sensor can recognize hardness rapidly, within 150 ms of the tactile moment. The hardness sensor based on the magnetic anomaly detection principle proposed in this paper has the advantages of simple structure, low cost and rapid response, and it shows great application potential in the field of intelligent robots.

  12. Anomaly detection for machine learning redshifts applied to SDSS galaxies

    NASA Astrophysics Data System (ADS)

    Hoyle, Ben; Rau, Markus Michael; Paech, Kerstin; Bonnett, Christopher; Seitz, Stella; Weller, Jochen

    2015-10-01

    We present an analysis of anomaly detection for machine learning redshift estimation. Anomaly detection allows the removal of poor training examples, which can adversely influence redshift estimates. Anomalous training examples may be photometric galaxies with incorrect spectroscopic redshifts, or galaxies with one or more poorly measured photometric quantity. We select 2.5 million `clean' SDSS DR12 galaxies with reliable spectroscopic redshifts, and 6730 `anomalous' galaxies with spectroscopic redshift measurements which are flagged as unreliable. We contaminate the clean base galaxy sample with galaxies with unreliable redshifts and attempt to recover the contaminating galaxies using the Elliptical Envelope technique. We then train four machine learning architectures for redshift analysis on both the contaminated sample and on the preprocessed `anomaly-removed' sample and measure redshift statistics on a clean validation sample generated without any preprocessing. We find an improvement on all measured statistics of up to 80 per cent when training on the anomaly removed sample as compared with training on the contaminated sample for each of the machine learning routines explored. We further describe a method to estimate the contamination fraction of a base data sample.
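    The Elliptical Envelope step can be sketched as follows (toy two-feature photometry and an assumed 3% contamination rate; not the paper's pipeline):

```python
import numpy as np
from sklearn.covariance import EllipticEnvelope

rng = np.random.default_rng(0)

# Clean training sample: photometry-like features correlated with each other.
X_clean = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=1000)
# Contaminants: e.g. galaxies with one badly measured photometric quantity.
X_bad = rng.uniform(-6, 6, size=(30, 2))
X = np.vstack([X_clean, X_bad])

# Fit a robust elliptical envelope, assuming ~3% of the sample is contaminated.
env = EllipticEnvelope(contamination=0.03, random_state=0).fit(X)
pred = env.predict(X)                    # +1 inlier, -1 anomaly

removed = np.flatnonzero(pred == -1)
recovered_bad = int((removed >= len(X_clean)).sum())
print(f"flagged {len(removed)} objects, {recovered_bad} of 30 true contaminants")
```

    Training a redshift estimator on `X[pred == 1]` then mirrors the paper's "anomaly-removed" sample.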

  13. Global Anomaly Detection in Two-Dimensional Symmetry-Protected Topological Phases

    NASA Astrophysics Data System (ADS)

    Bultinck, Nick; Vanhove, Robijn; Haegeman, Jutho; Verstraete, Frank

    2018-04-01

    Edge theories of symmetry-protected topological phases are well known to possess global symmetry anomalies. In this Letter we focus on two-dimensional bosonic phases protected by an on-site symmetry and analyze the corresponding edge anomalies in more detail. Physical interpretations of the anomaly in terms of an obstruction to orbifolding and constructing symmetry-preserving boundaries are connected to the cohomology classification of symmetry-protected phases in two dimensions. Using the tensor network and matrix product state formalism we numerically illustrate our arguments and discuss computational detection schemes to identify symmetry-protected order in a ground state wave function.

  14. Min-max hyperellipsoidal clustering for anomaly detection in network security.

    PubMed

    Sarasamma, Suseela T; Zhu, Qiuming A

    2006-08-01

    A novel hyperellipsoidal clustering technique is presented for an intrusion-detection system in network security. Hyperellipsoidal clusters that maximize intracluster similarity and minimize intercluster similarity are generated from training data sets. The novelty of the technique lies in the fact that the parameters needed to construct higher-order data models in general multivariate Gaussian functions are incrementally derived from the data sets using accretive processes. The technique is implemented in a feedforward neural network that uses a Gaussian radial basis function as the model generator. An evaluation based on the inclusiveness and exclusiveness of samples with respect to specific criteria is applied to accretively learn the output clusters of the neural network. One significant advantage of this technique is its ability to detect individual anomaly types that are hard to detect with other anomaly-detection schemes. Applying this technique, several feature subsets of the tcptrace network-connection records were identified that give above 95% detection at false-positive rates below 5%.

  15. Robust and Accurate Anomaly Detection in ECG Artifacts Using Time Series Motif Discovery

    PubMed Central

    Sivaraks, Haemwaan

    2015-01-01

    Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats, which helps identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because these methods are not able to differentiate ECG artifacts from the real ECG signal, especially for artifacts that resemble ECG signals in shape and/or frequency. The problem leads to high vigilance demands on physicians and misinterpretation risk for nonspecialists. Therefore, this work proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts and can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design. In addition, every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experimental results on real ECG datasets are interpreted and evaluated by cardiologists. Our proposed algorithm can mostly achieve 100% accuracy of detection (AoD), sensitivity, specificity, and positive predictive value with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts, compared with competitive anomaly detection methods. PMID:25688284

  16. Model-Based Anomaly Detection for a Transparent Optical Transmission System

    NASA Astrophysics Data System (ADS)

    Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.

    In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.

  17. Detecting an atomic clock frequency anomaly using an adaptive Kalman filter algorithm

    NASA Astrophysics Data System (ADS)

    Song, Huijie; Dong, Shaowu; Wu, Wenjun; Jiang, Meng; Wang, Weixiong

    2018-06-01

    The abnormal frequencies of an atomic clock mainly comprise frequency jumps and frequency drift jumps. Atomic clock frequency anomaly detection is a key technique in time-keeping. The Kalman filter algorithm, as a linear optimal algorithm, has been widely used in real-time detection of abnormal frequency. To obtain an optimal state estimate, the observation model and dynamic model of the Kalman filter algorithm should satisfy Gaussian white noise conditions; the detection performance is degraded if anomalies affect the observation model or dynamic model. The adaptive Kalman filter algorithm, applied here to clock frequency anomaly detection, uses the residuals given by the prediction to build an adaptive factor, and the predicted state covariance matrix is corrected in real time by this factor. The results show that the model error is reduced and the detection performance is improved. The effectiveness of the algorithm is verified on a frequency jump simulation, a frequency drift jump simulation and measured atomic clock data by using the chi-square test.
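    A scalar sketch of the residual-driven adaptation (the random-walk clock model, the adaptive-factor rule, and the 3-sigma residual test are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated daily frequency values of a clock: a slow random walk with white
# measurement noise and an injected frequency jump at epoch 120.
n, q, r = 200, 1e-4, 1e-2
freq = np.cumsum(rng.normal(scale=np.sqrt(q), size=n))
freq[120:] += 1.0                                  # frequency jump
z = freq + rng.normal(scale=np.sqrt(r), size=n)

x, P, alarms = 0.0, 1.0, []
for k in range(n):
    P_pred = P + q                                 # random-walk dynamic model
    v = z[k] - x                                   # prediction residual
    S = P_pred + r
    # Adaptive factor: inflate the predicted covariance when the normalized
    # residual is larger than expected, limiting the anomaly's pull on the state.
    alpha = max(1.0, (v * v / S) / 3.0)
    P_pred *= alpha
    S = P_pred + r
    if v * v / S > 9.0:                            # ~3-sigma chi-square(1) test
        alarms.append(k)
    K = P_pred / S                                 # Kalman gain and update
    x += K * v
    P = (1 - K) * P_pred

print("alarms:", alarms)
```

    The inflated covariance raises the gain just after the jump, so the state re-converges quickly instead of dragging the anomaly through many epochs.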

  18. Visual analytics of anomaly detection in large data streams

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.; Sharma, Ratnesh K.; Mehta, Abhay

    2009-01-01

    Most data streams are multi-dimensional, high-speed, and contain massive volumes of continuous information. They are seen in daily applications, such as telephone calls, retail sales, data center performance, and oil production operations. Many analysts want insight into the behavior of this data. They want to catch the exceptions in flight to reveal the causes of the anomalies and to take immediate action. To guide the user in finding the anomalies in the large data stream quickly, we derive a new automated neighborhood threshold marking technique, called AnomalyMarker. This technique is built on cell-based data streams and user-defined thresholds. We extend the scope of the data points around the threshold to include the surrounding areas. The idea is to define a focus area (marked area) which enables users to (1) visually group the interesting data points related to the anomalies (i.e., problems that occur persistently or occasionally) for observing their behavior; (2) discover the factors related to the anomaly by visualizing the correlations between the problem attribute and the attributes of the nearby data items from the entire multi-dimensional data stream. Mining results are quickly presented in graphical representations (i.e., tooltips) for the user to zoom into the problem regions. Different algorithms are introduced which try to optimize the size and extent of the anomaly markers. We have successfully applied this technique to detect data stream anomalies in large real-world enterprise server performance and data center energy management.

  19. Magnetic Anomaly Detection by Remote Means

    DTIC Science & Technology

    2016-09-21

    [Report documentation page fragment] Grant number: N00014-13-1-0282. Authors: Miles, Richard and Dogariu... Distribution approved for public release; distribution unlimited. Abstract: Research on the possibility of detecting magnetic anomalies remotely using laser excitation of a... Reference cited: 1. W. Happer, "Laser Remote Sensing of Magnetic Fields in the Atmosphere by Two-Photon Optical Pumping of Xe 129," NADC Report N62269-78-M...

  20. A hyperspectral imagery anomaly detection algorithm based on local three-dimensional orthogonal subspace projection

    NASA Astrophysics Data System (ADS)

    Zhang, Xing; Wen, Gongjian

    2015-10-01

    Anomaly detection (AD) is becoming increasingly important in hyperspectral imagery analysis, with many practical applications. The local orthogonal subspace projection (LOSP) detector is a popular anomaly detector which exploits local endmembers/eigenvectors around the pixel under test (PUT) to construct a background subspace. However, this subspace takes advantage only of the spectral information, while the spatial correlation of the background clutter is neglected, which leaves the anomaly detection result sensitive to the accuracy of the estimated subspace. In this paper, a local three-dimensional orthogonal subspace projection (3D-LOSP) algorithm is proposed. Firstly, using spectral and spatial information jointly, three directional background subspaces are created, along the image height direction, the image width direction and the spectral direction, respectively. Then, the three corresponding orthogonal subspaces are calculated. After that, each vector of the local cube along each of the three directions is projected onto the corresponding orthogonal subspace. Finally, a composite score is formed from the three directional operators. In 3D-LOSP, anomalies are redefined as targets that are not only spectrally different from the background but also spatially distinct. Thanks to the addition of spatial information, the proposed 3D-LOSP algorithm greatly improves the robustness of the anomaly detection result. It is noteworthy that the proposed algorithm is an expansion of LOSP, and this idea can inspire many other spectral-based anomaly detection methods. Experiments with real hyperspectral images have proved the stability of the detection result.
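    The core projection step shared by LOSP and 3D-LOSP, sketched along the spectral direction only (the endmember simulation and the subspace rank are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Background spectra live in a low-dimensional subspace (3 endmembers, 50 bands).
endmembers = rng.normal(size=(3, 50))
abundances = rng.uniform(0.0, 1.0, size=(500, 3))
background = abundances @ endmembers + rng.normal(scale=0.01, size=(500, 50))

# Estimate the local background subspace via SVD (top right singular vectors).
mean = background.mean(axis=0)
U = np.linalg.svd(background - mean, full_matrices=False)[2][:3]

def osp_score(pixel):
    # Project onto the orthogonal complement of the background subspace:
    # the residual energy is large only for spectrally anomalous pixels.
    centered = pixel - mean
    residual = centered - U.T @ (U @ centered)
    return float(np.linalg.norm(residual))

bg_pixel = np.array([0.2, 0.5, 0.3]) @ endmembers   # mixture of endmembers
anomaly = rng.normal(size=50)                       # spectrum outside the subspace
print(osp_score(bg_pixel), osp_score(anomaly))
```

    3D-LOSP repeats this projection along the height and width directions of the local cube and combines the three residual scores.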

  1. A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  2. A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization

    DOE PAGES

    Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.; ...

    2016-01-01

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  3. An Optimized Method to Detect BDS Satellites' Orbit Maneuvering and Anomalies in Real-Time.

    PubMed

    Huang, Guanwen; Qin, Zhiwei; Zhang, Qin; Wang, Le; Yan, Xingyuan; Wang, Xiaolei

    2018-02-28

    The orbital maneuvers of Global Navigation Satellite System (GNSS) constellations will decrease the performance and accuracy of positioning, navigation, and timing (PNT). Because satellites in the Chinese BeiDou Navigation Satellite System (BDS) are in Geostationary Orbit (GEO) and Inclined Geosynchronous Orbit (IGSO), maneuvers occur more frequently. Also, the precise start moment of a BDS satellite's orbit maneuvering cannot be obtained by common users. This paper presents an improved real-time detection method for BDS satellites' orbit maneuvering and anomalies with higher timeliness and higher accuracy. The main contributions to this improvement are as follows: (1) instead of the previous two-step method, a new one-step method with higher accuracy is proposed to determine the start moment and the pseudo-random noise code (PRN) of the maneuvering satellite at that moment; (2) BDS Medium Earth Orbit (MEO) orbital maneuvers are detected for the first time, using the proposed station selection strategy; and (3) the classified non-maneuvering anomalies are detected by a new median-robust method using the weak anomaly detection factor and the strong anomaly detection factor. Data from the Multi-GNSS Experiment (MGEX) in 2017 were used for experimental analysis. The experimental results and analysis showed that the start moment of orbital maneuvers and the period of non-maneuver anomalies can be determined more accurately in real-time. When orbital maneuvers and anomalies occurred, the proposed method improved data utilization by 91 and 95 min, respectively, in 2017.
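
    The general shape of a median-robust detector with weak and strong factors can be sketched in stdlib Python as below. This is a hedged illustration of the thresholding idea only: the synthetic residuals, the MAD spread estimate, and the factor values 3 and 6 are assumptions of the example, not the authors' formulas or data.

```python
from statistics import median

def classify_residuals(residuals, weak=3.0, strong=6.0):
    """Label each residual by its distance from the median in MAD units."""
    m = median(residuals)
    mad = median(abs(r - m) for r in residuals) or 1e-12  # robust spread estimate
    labels = []
    for r in residuals:
        score = abs(r - m) / mad
        labels.append("strong" if score >= strong else
                      "weak" if score >= weak else "normal")
    return labels

# Synthetic orbit-fit residuals (metres); the large values mimic a maneuver epoch.
orbit_residuals = [0.10, 0.12, 0.09, 0.11, 0.50, 2.00, 0.10]
print(classify_residuals(orbit_residuals))
```

    Because both the centre (median) and the spread (MAD) are robust statistics, a few maneuver epochs do not inflate the threshold the way a mean/standard-deviation rule would.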

  4. An Optimized Method to Detect BDS Satellites’ Orbit Maneuvering and Anomalies in Real-Time

    PubMed Central

    Huang, Guanwen; Qin, Zhiwei; Zhang, Qin; Wang, Le; Yan, Xingyuan; Wang, Xiaolei

    2018-01-01

    The orbital maneuvers of Global Navigation Satellite System (GNSS) constellations will decrease the performance and accuracy of positioning, navigation, and timing (PNT). Because satellites in the Chinese BeiDou Navigation Satellite System (BDS) are in Geostationary Orbit (GEO) and Inclined Geosynchronous Orbit (IGSO), maneuvers occur more frequently. Also, the precise start moment of a BDS satellite’s orbit maneuvering cannot be obtained by common users. This paper presents an improved real-time detection method for BDS satellites’ orbit maneuvering and anomalies with higher timeliness and higher accuracy. The main contributions to this improvement are as follows: (1) instead of the previous two-step method, a new one-step method with higher accuracy is proposed to determine the start moment and the pseudo-random noise code (PRN) of the maneuvering satellite at that moment; (2) BDS Medium Earth Orbit (MEO) orbital maneuvers are detected for the first time, using the proposed station selection strategy; and (3) the classified non-maneuvering anomalies are detected by a new median-robust method using the weak anomaly detection factor and the strong anomaly detection factor. Data from the Multi-GNSS Experiment (MGEX) in 2017 were used for experimental analysis. The experimental results and analysis showed that the start moment of orbital maneuvers and the period of non-maneuver anomalies can be determined more accurately in real-time. When orbital maneuvers and anomalies occurred, the proposed method improved data utilization by 91 and 95 min, respectively, in 2017. PMID:29495638

  5. Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ondrej Linda; Todd Vollmer; Milos Manic

    The planned large scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks. The implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures multiple communication streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require specification of a sensitivity threshold, which inevitably leads to a tradeoff between false positive and false negative rates. In order to alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams, allowing accurate models of normal network behavior to be learned. Furthermore, the developed system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.

  6. Detection of Low Temperature Volcanogenic Thermal Anomalies with ASTER

    NASA Astrophysics Data System (ADS)

    Pieri, D. C.; Baxter, S.

    2009-12-01

    Predicting volcanic eruptions is a thorny problem, as volcanoes typically exhibit idiosyncratic waxing and/or waning pre-eruption emission, geodetic, and seismic behavior. It is no surprise that increasing our accuracy and precision in eruption prediction depends on assessing the time-progressions of all relevant precursor geophysical, geochemical, and geological phenomena, and on more frequently observing volcanoes when they become restless. The ASTER instrument on the NASA Terra Earth Observing System satellite in low earth orbit provides important capabilities in the area of detection of volcanogenic anomalies such as thermal precursors and increased passive gas emissions. Its unique high spatial resolution multi-spectral thermal IR imaging data (90m/pixel; 5 bands in the 8-12um region), bore-sighted with visible and near-IR imaging data, and combined with off-nadir pointing and stereo-photogrammetric capabilities make ASTER a potentially important volcanic precursor detection tool. We are utilizing the JPL ASTER Volcano Archive (http://ava.jpl.nasa.gov) to systematically examine 80,000+ ASTER volcano images to analyze (a) thermal emission baseline behavior for over 1500 volcanoes worldwide, (b) the form and magnitude of time-dependent thermal emission variability for these volcanoes, and (c) the spatio-temporal limits of detection of pre-eruption temporal changes in thermal emission in the context of eruption precursor behavior. We are creating and analyzing a catalog of the magnitude, frequency, and distribution of volcano thermal signatures worldwide as observed from ASTER since 2000 at 90m/pixel. Of particular interest as eruption precursors are small low contrast thermal anomalies of low apparent absolute temperature (e.g., melt-water lakes, fumaroles, geysers, grossly sub-pixel hotspots), for which the signal-to-noise ratio may be marginal (e.g., scene confusion due to clouds, water and water vapor, fumarolic emissions, variegated ground emissivity, and

  7. Detecting ship targets in spaceborne infrared image based on modeling radiation anomalies

    NASA Astrophysics Data System (ADS)

    Wang, Haibo; Zou, Zhengxia; Shi, Zhenwei; Li, Bo

    2017-09-01

    Using infrared imaging sensors to detect ship targets in the ocean environment has many advantages compared to other sensor modalities, such as better thermal sensitivity and all-weather detection capability. We propose a new ship detection method that models radiation anomalies in spaceborne infrared images. The proposed method can be decomposed into two stages: in the first stage, a test infrared image is densely divided into a set of image patches and the radiation anomaly of each patch is estimated by a Gaussian Mixture Model (GMM), and thereby target candidates are obtained from anomalous image patches. In the second stage, target candidates are further checked by a more discriminative criterion to obtain the final detection result. The main innovation of the proposed method is inspired by the biological mechanism that human eyes are sensitive to unusual and anomalous patches within a complex background. Experimental results on the short wavelength infrared band (1.560 - 2.300 μm) and long wavelength infrared band (10.30 - 12.50 μm) of the Landsat-8 satellite show that the proposed method achieves the desired ship detection accuracy with higher recall than other classical ship detection methods.
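
    The first stage can be illustrated with a stdlib-Python sketch in which a single Gaussian background model stands in for the paper's Gaussian Mixture Model (a deliberate simplification to stay dependency-free); the patch values and the z-score threshold are synthetic assumptions of the example.

```python
from statistics import mean, stdev

def patch_scores(patches):
    """Score each patch by how far its mean intensity sits from the background."""
    mus = [mean(p) for p in patches]
    bg_mu, bg_sigma = mean(mus), stdev(mus)
    return [abs(m - bg_mu) / bg_sigma for m in mus]  # z-score per patch

# Synthetic 1-D "patches": sea clutter around 20, one hot ship-like patch.
patches = [[20, 21, 19], [20, 20, 22], [19, 21, 20], [80, 82, 79], [21, 20, 19]]
scores = patch_scores(patches)
candidates = [i for i, s in enumerate(scores) if s > 1.5]
print(candidates)  # index of the anomalous patch
```

    In the paper, such candidates would then pass to the second, more discriminative stage before a final detection is declared.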

  8. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    NASA Technical Reports Server (NTRS)

    Viswanathan, Arun

    2012-01-01

    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this

  9. Sequential detection of influenza epidemics by the Kolmogorov-Smirnov test

    PubMed Central

    2012-01-01

    Background Influenza is a well known and common human respiratory infection, causing significant morbidity and mortality every year. Despite influenza variability, fast and reliable outbreak detection is required for health resource planning. Clinical health records, as published by the Diagnosticat database in Catalonia, host useful data for probabilistic detection of influenza outbreaks. Methods This paper proposes a statistical method to detect influenza epidemic activity. Non-epidemic incidence rates are modeled against the exponential distribution, and the maximum likelihood estimate for the decaying factor λ is calculated. The sequential detection algorithm updates the parameter as new data become available. Binary epidemic detection of weekly incidence rates is assessed by the Kolmogorov-Smirnov test on the absolute difference between the empirical and the cumulative density function of the estimated exponential distribution with significance level 0 ≤ α ≤ 1. Results The main advantage with respect to other approaches is the adoption of a statistically meaningful test, which provides an indicator of epidemic activity with an associated probability. The detection algorithm was initiated with parameter λ0 = 3.8617 estimated from the training sequence (corresponding to non-epidemic incidence rates of the 2008-2009 influenza season) and sequentially updated. The Kolmogorov-Smirnov test detected the following weeks as epidemic for each influenza season: weeks 50−10 (2008-2009 season), weeks 38−50 (2009-2010 season), weeks 50−9 (2010-2011 season) and weeks 3−12 for the current 2011-2012 season. Conclusions Real medical data was used to assess the validity of the approach, as well as to construct a realistic statistical model of weekly influenza incidence rates in non-epidemic periods. For the tested data, the results confirmed the ability of the algorithm to detect the start and the end of epidemic periods. In general, the proposed test could be applied to other data
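
    The method's two ingredients, a maximum-likelihood fit of the exponential rate on non-epidemic weeks and a one-sample Kolmogorov-Smirnov statistic against the fitted distribution, can be sketched in stdlib Python as below. The incidence numbers and the 5% critical value for n = 3 (about 0.708) are illustrative assumptions, not the paper's data.

```python
import math

def exp_cdf(x, lam):
    return 1.0 - math.exp(-lam * x)

def ks_statistic(sample, lam):
    """One-sample Kolmogorov-Smirnov distance to the Exp(lam) distribution."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = exp_cdf(x, lam)
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

training = [0.21, 0.30, 0.18, 0.25, 0.27, 0.22]  # non-epidemic weekly rates
lam = len(training) / sum(training)              # MLE of the exponential rate
print(ks_statistic([0.20, 0.25, 0.30], lam) > 0.708)  # non-epidemic weeks: False
print(ks_statistic([2.10, 2.40, 2.80], lam) > 0.708)  # epidemic-like weeks: True
```

    In the sequential setting, each new non-epidemic week would also update `lam` before the next test.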

  10. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Urnes, James M., Sr. (Inventor); Smith, Timothy A. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
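
    A minimal stdlib-Python sketch of the claimed pipeline (measurement, modeling error, significance, persistence, anomaly flag) might look as follows; the expected-value model, the noise level, and both thresholds are assumptions of the example, not values from the patent.

```python
def detect(measurements, expected, sigma=1.0, z_thresh=3.0, persist_thresh=3):
    """Return the time index at which a persistent anomaly is declared, else None."""
    persistence = 0
    for t, (m, e) in enumerate(zip(measurements, expected)):
        z = abs(m - e) / sigma                      # significance of modeling error
        persistence = persistence + 1 if z > z_thresh else 0
        if persistence >= persist_thresh:           # sustained, not a one-sample glitch
            return t
    return None

expected = [10.0] * 8
healthy  = [10.1, 9.8, 14.0, 10.2, 9.9, 10.0, 10.1, 9.9]   # single spike: ignored
damaged  = [10.1, 9.8, 14.0, 14.2, 14.1, 14.3, 10.0, 9.9]  # sustained error: flagged
print(detect(healthy, expected))  # None
print(detect(damaged, expected))  # 4
```

    The persistence requirement is what separates a transient sensor glitch from a genuine structural change.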

  11. Bio-Inspired Distributed Decision Algorithms for Anomaly Detection

    DTIC Science & Technology

    2017-03-01

    Keywords: DIAMoND, Local Anomaly Detector, Total Impact Estimation, Threat Level Estimator. Report fragments: "Performance of the DIAMoND Algorithm as a DNS-Server Level Attack Detection and Mitigation..."; "...with 6 Nodes"; "Hierarchical 2-Level Topology".

  12. Apparatus for detecting a magnetic anomaly contiguous to remote location by squid gradiometer and magnetometer systems

    DOEpatents

    Overton, Jr., William C.; Steyert, Jr., William A.

    1984-01-01

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  13. Latent Space Tracking from Heterogeneous Data with an Application for Anomaly Detection

    DTIC Science & Technology

    2015-11-01

    Huang, Jiaji; Ning, Xia. Report fragments: "...if the anomaly behaves as a sudden outlier after which the data stream goes back to the normal state, then the anomalous data point should be..."; "...introduced three types of anomalies, all of them sudden outliers."; "Table 2. Synthetic dataset: AUC and parameters".

  14. Temporal Data-Driven Sleep Scheduling and Spatial Data-Driven Anomaly Detection for Clustered Wireless Sensor Networks

    PubMed Central

    Li, Gang; He, Bin; Huang, Hongwei; Tang, Limin

    2016-01-01

    The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most of the existing works based on the spatial–temporal correlation can be divided into two parts: redundancy reduction and anomaly detection. These two parts are pursued separately in existing works. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS reduces data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on the long, linear cluster structure in the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented in a cooperative way in the cluster. To keep the precision of the sensor data, spatial data-driven anomaly detection based on spatial correlation and the Kriging method is realized to generate an anomaly indicator. The experimental results show that cooperative TDSS can realize non-uniform sensing effectively to reduce energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data. PMID:27690035

  15. Advancements of Data Anomaly Detection Research in Wireless Sensor Networks: A Survey and Open Issues

    PubMed Central

    Rassam, Murad A.; Zainal, Anazida; Maarof, Mohd Aizaini

    2013-01-01

    Wireless Sensor Networks (WSNs) are important and necessary platforms for the future as the concept “Internet of Things” has emerged lately. They are used for monitoring, tracking, or controlling of many applications in industry, health care, habitat, and military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements to design efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches in five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future works. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed. PMID:23966182

  16. Radiation anomaly detection algorithms for field-acquired gamma energy spectra

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen

    2015-08-01

    The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of the threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign radioactive materials in the form of naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources, and can work despite a strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and relative merits of these different algorithms will be discussed and demonstrated.

  17. Rule-based expert system for maritime anomaly detection

    NASA Astrophysics Data System (ADS)

    Roy, Jean

    2010-04-01

    Maritime domain operators/analysts have a mandate to be aware of all that is happening within their areas of responsibility. This mandate derives from the needs to defend sovereignty, protect infrastructures, counter terrorism, detect illegal activities, etc., and it has become more challenging in the past decade, as commercial shipping turned into a potential threat. In particular, a huge portion of the data and information made available to the operators/analysts is mundane, from maritime platforms going about normal, legitimate activities, and it is very challenging for them to detect and identify the non-mundane. To achieve such anomaly detection, they must establish numerous relevant situational facts from a variety of sensor data streams. Unfortunately, many of the facts of interest just cannot be observed; the operators/analysts thus use their knowledge of the maritime domain and their reasoning faculties to infer these facts. As they are often overwhelmed by the large amount of data and information, automated reasoning tools could be used to support them by inferring the necessary facts, ultimately providing indications and warning on a small number of anomalous events worthy of their attention. Along this line of thought, this paper describes a proof-of-concept prototype of a rule-based expert system implementing automated rule-based reasoning in support of maritime anomaly detection.

  18. CHAMP: a locally adaptive unmixing-based hyperspectral anomaly detection algorithm

    NASA Astrophysics Data System (ADS)

    Crist, Eric P.; Thelen, Brian J.; Carrara, David A.

    1998-10-01

    Anomaly detection offers a means by which to identify potentially important objects in a scene without prior knowledge of their spectral signatures. As such, this approach is less sensitive to variations in target class composition, atmospheric and illumination conditions, and sensor gain settings than would be a spectral matched filter or similar algorithm. The best existing anomaly detectors generally fall into one of two categories: those based on local Gaussian statistics, and those based on linear mixing models. Unmixing-based approaches better represent the real distribution of data in a scene, but are typically derived and applied on a global or scene-wide basis. Locally adaptive approaches allow detection of more subtle anomalies by accommodating the spatial non-homogeneity of background classes in a typical scene, but provide a poorer representation of the true underlying background distribution. The CHAMP algorithm combines the best attributes of both approaches, applying a linear-mixing-model approach in a spatially adaptive manner. The algorithm itself, and test results on simulated and actual hyperspectral image data, are presented in this paper.

  19. Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets

    PubMed Central

    Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin

    2017-01-01

    A road traffic anomaly denotes a road segment that is anomalous in terms of the traffic flow of vehicles. Detecting road traffic anomalies from GPS (Global Position System) snippet data is becoming critical in urban computing, since such anomalies often suggest underlying events. However, the noisy and sparse nature of GPS snippet data introduces multiple problems that make the detection of road traffic anomalies very challenging. To address these issues, we propose a two-stage solution which consists of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference, incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large GPS snippet collections via a tensor decomposition technique. The RAT model then calculates the anomalous degree of each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one month of GPS location data from more than eight thousand taxicabs in Beijing. The evaluation results show the advantages of our method over other baseline techniques. PMID:28282948

  20. Unsupervised Anomaly Detection Based on Clustering and Multiple One-Class SVM

    NASA Astrophysics Data System (ADS)

    Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Kwon, Yongjin

    Intrusion detection systems (IDS) have played an important role as devices to defend our networks from cyber attacks. However, since they are unable to detect unknown attacks, i.e., 0-day attacks, the ultimate challenge in the intrusion detection field is how to identify such attacks exactly in an automated manner. Over the past few years, several studies have addressed these problems through anomaly detection using unsupervised learning techniques such as clustering, one-class support vector machines (SVM), etc. Although these enable one to construct intrusion detection models at low cost and effort, and have the capability to detect unforeseen attacks, they still suffer from two main problems in intrusion detection: a low detection rate and a high false positive rate. In this paper, we propose a new anomaly detection method based on clustering and multiple one-class SVMs in order to improve the detection rate while maintaining a low false positive rate. We evaluated our method using the KDD Cup 1999 data set. Evaluation results show that our approach outperforms the existing algorithms reported in the literature, especially in the detection of unknown attacks.
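
    The two-stage idea, clustering normal traffic and then fitting one boundary per cluster, can be sketched in stdlib Python. In this sketch a per-cluster radius threshold stands in for the one-class SVMs (which would need an external library), and the feature vectors, the slack factor, and the seed are assumptions of the example.

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means on tuples; deterministic given the seed."""
    centers = random.Random(seed).sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: math.dist(p, centers[i]))].append(p)
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

def fit_radii(points, centers):
    """Max distance from each center to its assigned points (a crude boundary)."""
    radii = []
    for c in centers:
        assigned = [p for p in points
                    if math.dist(p, c) == min(math.dist(p, cc) for cc in centers)]
        radii.append(max(math.dist(p, c) for p in assigned))
    return radii

def is_anomaly(p, centers, radii, slack=1.5):
    """Anomalous if the point falls outside every cluster's slackened boundary."""
    return all(math.dist(p, c) > slack * r for c, r in zip(centers, radii))

# Two synthetic clusters of normal traffic features.
normal = [(1 + 0.1 * i, 1.0) for i in range(5)] + [(8 + 0.1 * i, 8.0) for i in range(5)]
centers = kmeans(normal, k=2)
radii = fit_radii(normal, centers)
print(is_anomaly((5.0, 5.0), centers, radii))  # between clusters: anomalous
print(is_anomaly((1.2, 1.0), centers, radii))  # inside a cluster: normal
```

    Replacing the radius rule with a one-class SVM per cluster, as the paper does, tightens each boundary to the cluster's actual shape rather than a sphere.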

  1. Feasibility of anomaly detection and characterization using trans-admittance mammography with 60 × 60 electrode array

    NASA Astrophysics Data System (ADS)

    Zhao, Mingkang; Wi, Hun; Lee, Eun Jung; Woo, Eung Je; In Oh, Tong

    2014-10-01

    Electrical impedance imaging has the potential to detect an early stage of breast cancer due to the higher admittivity values of cancerous tissue compared with those of normal breast tissues. The tumor size and extent of axillary lymph node involvement are important parameters for evaluating the breast cancer survival rate. Additionally, anomaly characterization is required to distinguish a malignant tumor from a benign tumor. In order to overcome the limitations of breast cancer detection using impedance measurement probes, we developed a high density trans-admittance mammography (TAM) system with a 60 × 60 electrode array and produced trans-admittance maps obtained at several frequency pairs. We applied the anomaly detection algorithm to the high density TAM system to estimate the volume and position of breast tumors. We tested four different sizes of anomaly with three different conductivity contrasts at four different depths. From multifrequency trans-admittance maps, we can readily observe the transversal position and estimate the volume and depth of an anomaly. In particular, the depth estimates were accurate and independent of the size and conductivity contrast when applying the new formula based on the Laplacian of the trans-admittance map. The volume estimation was dependent on the conductivity contrast between the anomaly and the background in the breast phantom. We characterized two test anomalies using frequency-difference trans-admittance data to eliminate the dependency on anomaly position and size. We confirmed the anomaly detection and characterization algorithm with the high density TAM system on bovine breast tissue. Both results showed the feasibility of detecting the size and position of an anomaly and of tissue characterization for breast cancer screening.
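
    The Laplacian ingredient can be sketched in stdlib Python with a 5-point discrete Laplacian applied to a synthetic trans-admittance map; the Gaussian bump below is an illustrative stand-in for a measured map, and the paper's actual formula relating the Laplacian to anomaly depth is not reproduced here.

```python
import math

def laplacian(grid):
    """5-point discrete Laplacian of a 2-D grid (zero on the border)."""
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            out[i][j] = (grid[i - 1][j] + grid[i + 1][j] + grid[i][j - 1]
                         + grid[i][j + 1] - 4 * grid[i][j])
    return out

# Synthetic trans-admittance map: a Gaussian bump centred at (4, 4).
n = 9
bump = [[math.exp(-((i - 4) ** 2 + (j - 4) ** 2) / 2.0) for j in range(n)]
        for i in range(n)]
lap = laplacian(bump)
peak = min((lap[i][j], (i, j)) for i in range(n) for j in range(n))
print(peak[1])  # most negative Laplacian marks the transversal position: (4, 4)
```

    The Laplacian is strongly negative at the peak of a smooth bump, which is why it localises the anomaly's transversal position independently of the bump's overall amplitude.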

  2. Sequential detection of web defects

    DOEpatents

    Eichel, Paul H.; Sleefe, Gerard E.; Stalker, K. Terry; Yee, Amy A.

    2001-01-01

    A system for detecting defects on a moving web carrying a sequential series of identical frames uses an imaging device to form a real-time camera image of a frame and a comparator to compare elements of the camera image with corresponding elements of an image of an exemplar frame. The comparator provides an acceptable indication if the pair of elements is determined to be statistically identical, and a defective indication if the pair is determined to be statistically not identical. If the pair is neither acceptable nor defective, the comparator recursively compares the element of the exemplar frame with the corresponding elements of other frames on the web until an acceptable or defective indication occurs.
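The patent's three-way decision can be sketched as follows; this is a minimal illustration in which the function names, the z-score test, and both thresholds are hypothetical stand-ins for the patent's unspecified statistical comparison:

```python
import statistics

def compare_element(sample, exemplar, accept_z=1.0, reject_z=3.0):
    """Three-way statistical comparison of one image element:
    'acceptable' if statistically identical to the exemplar element,
    'defective' if clearly different, 'undecided' otherwise.
    Thresholds are illustrative, not from the patent."""
    mu = statistics.mean(exemplar)
    sigma = statistics.pstdev(exemplar) or 1e-9
    z = abs(statistics.mean(sample) - mu) / sigma
    if z <= accept_z:
        return "acceptable"
    if z >= reject_z:
        return "defective"
    return "undecided"

def sequential_decision(element_stream, exemplar):
    """Recursively compare the same element across successive frames
    until a definite indication occurs, as in the sequential scheme."""
    for sample in element_stream:
        verdict = compare_element(sample, exemplar)
        if verdict != "undecided":
            return verdict
    return "undecided"
```

An element that falls between the two thresholds simply defers the decision to the next frame, which is what makes the scheme sequential.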

  3. HPNAIDM: The High-Performance Network Anomaly/Intrusion Detection and Mitigation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yan

    Identifying traffic anomalies and attacks rapidly and accurately is critical for large network operators. With the rapid growth of network bandwidth, such as the next-generation DOE UltraScience Network, and the fast emergence of new attacks, viruses, and worms, existing network intrusion detection systems (IDS) are insufficient because they: • are mostly host-based and not scalable to high-performance networks; • are mostly signature-based and unable to adaptively recognize flow-level unknown attacks; • cannot differentiate malicious events from unintentional anomalies. To address these challenges, we proposed and developed a new paradigm called the high-performance network anomaly/intrusion detection and mitigation (HPNAIDM) system. The new paradigm differs significantly from existing IDSes with the following features (research thrusts): • online traffic recording and analysis on high-speed networks; • online adaptive flow-level anomaly/intrusion detection and mitigation; • an integrated approach for false-positive reduction. Our research prototype and evaluation demonstrate that the HPNAIDM system is highly effective and economically feasible, significantly exceeding the pre-set goals (see the next section for details). Overall, the project produced 23 publications (2 book chapters, 6 journal papers, and 15 peer-reviewed conference/workshop papers). We also built a website for technique dissemination, which hosts two system prototype releases for the research community, filed a patent application, and developed strong international and domestic collaborations spanning academia and industry.

  4. A novel approach for detection of anomalies using measurement data of the Ironton-Russell bridge

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Norouzi, Mehdi; Hunt, Victor; Helmicki, Arthur

    2015-04-01

    Data models have been increasingly used in recent years for documenting the normal behavior of structures and hence for detecting and classifying anomalies. Many machine learning algorithms have been proposed to model operational and functional changes in structures; however, few studies have been applied to actual measurement data, owing to limited access to long-term measurements and to the damaged states of structures. By monitoring the structure during construction and reviewing the effect of construction events on the measurement data, this study introduces a new approach to detect, and eventually classify, anomalies both during and after construction. First, the implementation of the sensor network, which grows while the bridge is being built, and its current status are detailed. Second, the proposed anomaly detection algorithm is applied to the collected data, and finally, detected anomalies are validated against archived construction events.

  5. Hierarchical Kohonenen net for anomaly detection in network security.

    PubMed

    Sarasamma, Suseela T; Zhu, Qiuming A; Huff, Julie

    2005-04-01

    A novel multilevel hierarchical Kohonen Net (K-Map) for an intrusion detection system is presented. Each level of the hierarchical map is modeled as a simple winner-take-all K-Map. One significant advantage of this multilevel hierarchical K-Map is its computational efficiency. Unlike other statistical anomaly detection methods such as nearest neighbor approach, K-means clustering or probabilistic analysis that employ distance computation in the feature space to identify the outliers, our approach does not involve costly point-to-point computation in organizing the data into clusters. Another advantage is the reduced network size. We use the classification capability of the K-Map on selected dimensions of data set in detecting anomalies. Randomly selected subsets that contain both attacks and normal records from the KDD Cup 1999 benchmark data are used to train the hierarchical net. We use a confidence measure to label the clusters. Then we use the test set from the same KDD Cup 1999 benchmark to test the hierarchical net. We show that a hierarchical K-Map in which each layer operates on a small subset of the feature space is superior to a single-layer K-Map operating on the whole feature space in detecting a variety of attacks in terms of detection rate as well as false positive rate.

  6. Anomaly Detection In Additively Manufactured Parts Using Laser Doppler Vibrometery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, Carlos A.

    Additively manufactured parts are susceptible to non-uniform structure caused by the unique manufacturing process, which can lead to structural weakness or catastrophic failure. Using laser Doppler vibrometry and frequency response analysis, non-contact detection of anomalies in additively manufactured parts may be possible. Preliminary tests show promise for small-scale detection, but more work is necessary.

  7. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
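The time-series component of this kind of hybrid detector can be sketched as an RST-style normalized index: each pixel is compared against the mean and spread of its monthly reference series. This is a minimal illustration only; the threshold, function names, and the use of a plain z-score are assumptions, not the authors' exact algorithm:

```python
import statistics

def thermal_anomaly_index(obs, reference_series):
    """RST-style normalized index: how many reference standard deviations
    the observed radiance lies above the per-pixel reference mean."""
    mu = statistics.mean(reference_series)
    sigma = statistics.stdev(reference_series) or 1e-9
    return (obs - mu) / sigma

def detect_hotspots(scene, references, threshold=4.0):
    """Flag pixel indices whose index exceeds a (hypothetical) threshold;
    `scene` is one radiance per pixel, `references` one series per pixel."""
    return [i for i, (obs, ref) in enumerate(zip(scene, references))
            if thermal_anomaly_index(obs, ref) > threshold]
```

The abstract's requirement of roughly 80 MODIS images per calendar month corresponds to having enough samples in each `reference_series` for the mean and standard deviation to be stable.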

  8. Detecting errors and anomalies in computerized materials control and accountability databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteson, R.; Hench, K.; Yarbro, T.

    The Automated MC and A Database Assessment project is aimed at improving anomaly and error detection in materials control and accountability (MC and A) databases and increasing confidence in the data they contain. Anomalous data resulting in poor categorization of nuclear material inventories greatly reduce the value of the database information to users, so it is essential that MC and A data be assessed periodically for anomalies or errors. Anomaly detection can identify errors in databases and thus provide assurance of the integrity of the data. An expert system has been developed at Los Alamos National Laboratory that examines these large databases for anomalous or erroneous data. For several years, MC and A subject matter experts at Los Alamos have been using this automated system to examine the large amounts of accountability data that the Los Alamos Plutonium Facility generates. These data are collected and managed by the Material Accountability and Safeguards System, a near-real-time computerized nuclear material accountability and safeguards system. This year the user base was expanded, customizing the anomaly detector for the varying requirements of different groups of users. This paper describes the progress in customizing the expert system to the needs of the users of the data and reports on the results.

  9. Detection of sinkholes or anomalies using full seismic wave fields.

    DOT National Transportation Integrated Search

    2013-04-01

    This research presents an application of two-dimensional (2-D) time-domain waveform tomography for detection of embedded sinkholes and anomalies. The measured seismic surface wave fields were inverted using a full waveform inversion (FWI) technique, ...

  10. A scalable architecture for online anomaly detection of WLCG batch jobs

    NASA Astrophysics Data System (ADS)

    Kuehn, E.; Fischer, M.; Giffels, M.; Jung, C.; Petzold, A.

    2016-10-01

    For data centres it is increasingly important to monitor network usage and learn from usage patterns; in particular, configuration issues or misbehaving batch jobs that prevent smooth operation need to be detected as early as possible. At the GridKa data and computing centre we therefore operate a tool, BPNetMon, for monitoring traffic data and characteristics of WLCG batch jobs and pilots locally on different worker nodes. On the one hand, local information by itself is not sufficient to detect anomalies, for several reasons: the underlying job distribution on a single worker node might change, or there might be a local misconfiguration. On the other hand, a centralised anomaly detection approach does not scale, in terms of either network communication or computational cost. We therefore propose a scalable architecture based on the concepts of a super-peer network.

  11. SU-G-JeP4-03: Anomaly Detection of Respiratory Motion by Use of Singular Spectrum Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kotoku, J; Kumagai, S; Nakabayashi, S

    Purpose: The implementation of automatic anomaly detection for respiratory motion is an important technique for preventing accidental damage during radiation therapy. Here, we propose an automatic anomaly detection method using singular spectrum analysis. Methods: The anomaly detection procedure consists of four parts: 1) measurement of normal respiratory motion data of a patient; 2) calculation of a trajectory matrix representing the normal time-series features; 3) real-time monitoring and calculation of a trajectory matrix of the real-time data; 4) calculation of an anomaly score from the similarity of the two feature matrices. Patient motion was observed by a marker-less tracking system using a depth camera. Results: Two types of motion, e.g. a cough and a sudden stop of breathing, were successfully detected in our real-time application. Conclusion: Automatic anomaly detection of respiratory motion using singular spectrum analysis succeeded for coughs and sudden stops of breathing, and the algorithm is promising for clinical use. This work was supported by JSPS KAKENHI Grant Number 15K08703.
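The four-step procedure can be sketched with a Hankel (trajectory) matrix and an SVD-based subspace comparison. This is a minimal illustration in which the window length, rank, and residual-energy score are hypothetical choices, not the authors' exact parameters:

```python
import numpy as np

def trajectory_matrix(series, window):
    """Hankel (trajectory) matrix whose columns are lagged windows."""
    n = len(series) - window + 1
    return np.column_stack([series[i:i + window] for i in range(n)])

def ssa_anomaly_score(normal, test, window=10, rank=2):
    """Anomaly score = fraction of the test trajectory's energy lying
    outside the leading singular subspace learned from normal motion.
    Near 0 for motion resembling the normal pattern, larger otherwise."""
    U, _, _ = np.linalg.svd(trajectory_matrix(normal, window),
                            full_matrices=False)
    basis = U[:, :rank]                      # leading left singular vectors
    X = trajectory_matrix(test, window)
    residual = X - basis @ (basis.T @ X)     # component outside the subspace
    return np.linalg.norm(residual) ** 2 / np.linalg.norm(X) ** 2
```

A periodic breathing trace stays almost entirely inside the learned subspace, while a cough or a sudden stop leaves substantial residual energy, raising the score.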

  12. Lithospheric structure of Taiwan from gravity modelling and sequential inversion of seismological and gravity data

    NASA Astrophysics Data System (ADS)

    Masson, F.; Mouyen, M.; Hwang, C.; Wu, Y.-M.; Ponton, F.; Lehujeur, M.; Dorbath, C.

    2012-11-01

    Using a Bouguer anomaly map and a dense seismic data set, we performed two studies to improve our knowledge of the deep structure of Taiwan. First, we model the Bouguer anomaly along a profile crossing the island using simple forward modelling; the modelling is 2D, under the hypothesis of cylindrical symmetry. Second, we present a joint analysis of gravity anomaly and seismic arrival-time data recorded in Taiwan. An initial velocity model was obtained by local earthquake tomography (LET) of the seismological data and used to construct an initial 3D gravity model through a linear velocity-density relationship (Birch's law). The synthetic Bouguer anomaly calculated for this model has the same shape and wavelength as the observed anomaly, although some characteristics of the anomaly map are not retrieved. To derive a crustal velocity/density model that accounts for both types of observations, we performed a sequential inversion of seismological and gravity data. The variance reduction of the arrival-time data for the final sequential model was comparable to that obtained by LET alone, and the sequential model explained about 80% of the observed gravity anomaly. A new 3D model of the Taiwan lithosphere is presented.

  13. Target detection using the background model from the topological anomaly detection algorithm

    NASA Astrophysics Data System (ADS)

    Dorado Munoz, Leidy P.; Messinger, David W.; Ziemann, Amanda K.

    2013-05-01

    The Topological Anomaly Detection (TAD) algorithm has been used as an anomaly detector in hyperspectral and multispectral images. TAD is a graph-theory-based algorithm that constructs a topological model of the background in a scene and computes an anomalousness ranking for all pixels in the image with respect to that background, in order to identify pixels with uncommon or strange spectral signatures. The pixels modeled as background are clustered into groups, or connected components, which can represent spectral signatures of materials present in the background. This paper therefore explores the use of the background components given by TAD in target detection. The connected components are characterized in three different approaches: the mean signature and endmembers of each component are calculated and used as background basis vectors in Orthogonal Subspace Projection (OSP) and the Adaptive Subspace Detector (ASD), and the covariance matrix of the connected components is estimated and used in the Constrained Energy Minimization (CEM) and Adaptive Coherence Estimator (ACE) detectors. The performance of these approaches and detectors is compared with a global approach, in which the background characterization is derived directly from the image. Experiments and results using the self-test data set provided as part of the RIT blind-test target detection project are shown.
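Of the detectors named above, ACE is the most compact to sketch: it is the squared cosine between the pixel and the target signature after whitening by a background covariance. A minimal numpy illustration on synthetic data (the diagonal regularization and all names are assumptions, not details from the paper):

```python
import numpy as np

def ace_statistic(x, target, background):
    """Adaptive Coherence Estimator: squared cosine between pixel x and the
    target signature in the space whitened by the background covariance.
    Values near 1 indicate a close match to the target."""
    mu = background.mean(axis=0)
    cov = np.cov(background, rowvar=False)
    # small diagonal loading keeps the inverse stable (illustrative choice)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))
    s = target - mu
    z = x - mu
    num = (s @ cov_inv @ z) ** 2
    den = (s @ cov_inv @ s) * (z @ cov_inv @ z)
    return num / den
```

In the paper's component-based variant, `background` would be the pixels of one TAD connected component rather than the whole image, which is exactly the difference being evaluated against the global approach.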

  14. Sequential dengue virus infections detected in active and passive surveillance programs in Thailand, 1994-2010.

    PubMed

    Bhoomiboonchoo, Piraya; Nisalak, Ananda; Chansatiporn, Natkamol; Yoon, In-Kyu; Kalayanarooj, Siripen; Thipayamongkolgul, Mathuros; Endy, Timothy; Rothman, Alan L; Green, Sharone; Srikiatkhachorn, Anon; Buddhari, Darunee; Mammen, Mammen P; Gibbons, Robert V

    2015-03-14

    The effect of prior dengue virus (DENV) exposure on subsequent heterologous infection can be beneficial or detrimental depending on many factors including timing of infection. We sought to evaluate this effect by examining a large database of DENV infections captured by both active and passive surveillance encompassing a wide clinical spectrum of disease. We evaluated datasets from 17 years of hospital-based passive surveillance and nine years of cohort studies, including clinical and subclinical DENV infections, to assess the outcomes of sequential heterologous infections. Chi square or Fisher's exact test was used to compare proportions of infection outcomes such as disease severity; ANOVA was used for continuous variables. Multivariate logistic regression was used to assess risk factors for infection outcomes. Of 38,740 DENV infections, two or more infections were detected in 502 individuals; 14 had three infections. The mean ages at the time of the first and second detected infections were 7.6 ± 3.0 and 11.2 ± 3.0 years. The shortest time between sequential infections was 66 days. A longer time interval between sequential infections was associated with dengue hemorrhagic fever (DHF) in the second detected infection (OR 1.3, 95% CI 1.2-1.4). All possible sequential serotype pairs were observed among 201 subjects with DHF at the second detected infection, except DENV-4 followed by DENV-3. Among DENV infections detected in cohort subjects by active study surveillance and subsequent non-study hospital-based passive surveillance, hospitalization at the first detected infection increased the likelihood of hospitalization at the second detected infection. Increasing time between sequential DENV infections was associated with greater severity of the second detected infection, supporting the role of heterotypic immunity in both protection and enhancement. Hospitalization was positively associated between the first and second detected infections, suggesting

  15. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    NASA Astrophysics Data System (ADS)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

    The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as robust sensor management, signal and image processing, and information sharing among users. This study utilizes a multimodal sensor analysis framework that enables sensors to be analyzed jointly across modalities. It also provides plugin-based analysis interfaces for developing sensor- and image-processing-based applications, and connects its users through a centralized application as well as to social networks to facilitate communication and socialization. Using this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard braking, pothole crossing, and speed bump crossing. Upon detection, the video portion containing the anomaly is automatically extracted to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.

  16. Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets.

    PubMed

    Wang, Hongtao; Wen, Hui; Yi, Feng; Zhu, Hongsong; Sun, Limin

    2017-03-09

    A road traffic anomaly denotes a road segment that is anomalous in terms of vehicle traffic flow. Detecting road traffic anomalies from GPS (Global Positioning System) snippet data is becoming critical in urban computing, since such anomalies often suggest underlying events. However, the noisy and sparse nature of GPS snippet data introduces multiple problems that make the detection of road traffic anomalies very challenging. To address these issues, we propose a two-stage solution consisting of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference, incorporating both static and dynamic features into a Conditional Random Field (CRF); the dynamic context features are learned collaboratively from large GPS snippet datasets via a tensor decomposition technique. The RAT model then calculates the anomalous degree of each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one month of GPS location data from more than eight thousand taxi cabs in Beijing. The evaluation results show the advantages of our method over other baseline techniques.

  17. Eyewitness decisions in simultaneous and sequential lineups: a dual-process signal detection theory analysis.

    PubMed

    Meissner, Christian A; Tredoux, Colin G; Parker, Janat F; MacLin, Otto H

    2005-07-01

    Many eyewitness researchers have argued for the application of a sequential alternative to the traditional simultaneous lineup, given its role in decreasing false identifications of innocent suspects (sequential superiority effect). However, Ebbesen and Flowe (2002) have recently noted that sequential lineups may merely bring about a shift in response criterion, having no effect on discrimination accuracy. We explored this claim, using a method that allows signal detection theory measures to be collected from eyewitnesses. In three experiments, lineup type was factorially combined with conditions expected to influence response criterion and/or discrimination accuracy. Results were consistent with signal detection theory predictions, including that of a conservative criterion shift with the sequential presentation of lineups. In a fourth experiment, we explored the phenomenological basis for the criterion shift, using the remember-know-guess procedure. In accord with previous research, the criterion shift in sequential lineups was associated with a reduction in familiarity-based responding. It is proposed that the relative similarity between lineup members may create a context in which fluency-based processing is facilitated to a greater extent when lineup members are presented simultaneously.

  18. Apparatus and method for detecting a magnetic anomaly contiguous to remote location by SQUID gradiometer and magnetometer systems

    DOEpatents

    Overton, W.C. Jr.; Steyert, W.A. Jr.

    1981-05-22

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  19. Sequential capillary electrophoresis analysis using optically gated sample injection and UV/vis detection.

    PubMed

    Liu, Xiaoxia; Tian, Miaomiao; Camara, Mohamed Amara; Guo, Liping; Yang, Li

    2015-10-01

    We present sequential CE analysis of amino acids and an L-asparaginase-catalyzed enzyme reaction, combining on-line derivatization, optically gated (OG) injection, and commercially available UV/Vis detection. Various experimental conditions for sequential OG-UV/Vis CE analysis were investigated and optimized by analyzing a standard mixture of amino acids. High reproducibility of the sequential CE analysis was demonstrated, with RSD values (n = 20) of 2.23, 2.57, and 0.70% for peak heights, peak areas, and migration times, respectively, and LODs of 5.0 μM (asparagine) and 2.0 μM (aspartic acid) were obtained. Applying the OG-UV/Vis CE analysis, a sequential online CE enzyme assay of the L-asparaginase-catalyzed reaction was carried out by automatically and continuously monitoring the substrate consumption and product formation every 12 s from the beginning to the end of the reaction. The Michaelis constants obtained for the reaction were in good agreement with the results of traditional off-line enzyme assays. The study demonstrates the feasibility and reliability of integrating OG injection with UV/Vis detection for sequential online CE analysis, which could be of value for online monitoring of various chemical reactions and bioprocesses. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
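Michaelis constants like those mentioned above are estimated from substrate-concentration/rate pairs. A minimal sketch using the classical Lineweaver-Burk linearization, 1/v = (Km/Vmax)(1/[S]) + 1/Vmax; this is a textbook method chosen for illustration, not necessarily the fitting procedure used in the paper:

```python
import numpy as np

def michaelis_menten_fit(substrate, rates):
    """Estimate (Vmax, Km) from substrate concentrations [S] and initial
    rates v via the Lineweaver-Burk double-reciprocal line:
        1/v = (Km/Vmax) * (1/[S]) + 1/Vmax
    so the fitted intercept gives Vmax and slope*Vmax gives Km."""
    slope, intercept = np.polyfit(1.0 / np.asarray(substrate, dtype=float),
                                  1.0 / np.asarray(rates, dtype=float), 1)
    vmax = 1.0 / intercept
    km = slope * vmax
    return vmax, km
```

With noiseless data the linearization recovers the constants exactly; with real measurements a direct nonlinear fit of v = Vmax[S]/(Km + [S]) is usually preferred because the reciprocal transform amplifies error at low rates.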

  20. Capacitance probe for detection of anomalies in non-metallic plastic pipe

    DOEpatents

    Mathur, Mahendra P.; Spenik, James L.; Condon, Christopher M.; Anderson, Rodney; Driscoll, Daniel J.; Fincham, Jr., William L.; Monazam, Esmail R.

    2010-11-23

    The disclosure relates to analysis of materials using a capacitive sensor to detect anomalies through comparison of measured capacitances. The capacitive sensor is used in conjunction with a capacitance measurement device, a location device, and a processor in order to generate a capacitance versus location output which may be inspected for the detection and localization of anomalies within the material under test. The components may be carried as payload on an inspection vehicle which may traverse through a pipe interior, allowing evaluation of nonmetallic or plastic pipes when the piping exterior is not accessible. In an embodiment, supporting components are solid-state devices powered by a low voltage on-board power supply, providing for use in environments where voltage levels may be restricted.

  1. GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D

    This paper introduces a novel graph-analytic approach for detecting anomalies in network flow data called GraphPrints. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets, small induced subgraphs that describe local topology. By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified and, furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level and 0.05% at the IP level, with 100% true positive rates at both.
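The count-then-detect pipeline can be sketched as follows. This minimal illustration restricts itself to 2- and 3-node graphlets and a simple z-score outlier rule; GraphPrints itself counts richer graphlet families and uses a more refined detector:

```python
import numpy as np

def graphlet_counts(adj):
    """Counts of small graphlets from a symmetric 0/1 adjacency matrix:
    edges, open wedges (paths of length 2), and triangles."""
    a = np.asarray(adj, dtype=float)
    deg = a.sum(axis=1)
    edges = a.sum() / 2
    wedges = (deg * (deg - 1) / 2).sum()      # all length-2 paths
    triangles = np.trace(a @ a @ a) / 6
    # each triangle contributes 3 closed wedges; subtract to keep open ones
    return np.array([edges, wedges - 3 * triangles, triangles])

def flag_outliers(count_history, threshold=3.0):
    """Flag time slices whose graphlet-count vector deviates from the
    historical mean by more than `threshold` standard deviations in any
    component (illustrative outlier rule)."""
    counts = np.array(count_history, dtype=float)
    mu = counts.mean(axis=0)
    sigma = counts.std(axis=0) + 1e-9
    z = np.abs((counts - mu) / sigma).max(axis=1)
    return [i for i, score in enumerate(z) if score > threshold]
```

Each network time slice yields one count vector, so the anomaly problem reduces to multivariate outlier detection on a short sequence of vectors.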

  2. A new prior for bayesian anomaly detection: application to biosurveillance.

    PubMed

    Shen, Y; Cooper, G F

    2010-01-01

    Bayesian anomaly detection computes posterior probabilities of anomalous events by combining prior beliefs and evidence from data. However, the specification of prior probabilities can be challenging. This paper describes a Bayesian prior in the context of disease outbreak detection. The goal is to provide a meaningful, easy-to-use prior that yields a posterior probability of an outbreak that performs at least as well as a standard frequentist approach. If this goal is achieved, the resulting posterior could be usefully incorporated into a decision analysis about how to act in light of a possible disease outbreak. This paper describes a Bayesian method for anomaly detection that combines learning from data with a semi-informative prior probability over patterns of anomalous events. A univariate version of the algorithm is presented here for ease of illustration of the essential ideas. The paper describes the algorithm in the context of disease-outbreak detection, but it is general and can be used in other anomaly detection applications. For this application, the semi-informative prior specifies that an increased count over baseline is expected for the variable being monitored, such as the number of respiratory chief complaints per day at a given emergency department. The semi-informative prior is derived from the baseline prior, which is estimated using historical data. The evaluation reported here used semi-synthetic data to compare the detection performance of the proposed Bayesian method and a control chart method, a standard frequentist algorithm that is closest to the Bayesian method in terms of the type of data it uses. The disease-outbreak detection performance of the Bayesian method was statistically significantly better than that of the control chart method when proper baseline periods were used to estimate the baseline behavior and avoid seasonal effects. When using longer baseline periods, the Bayesian method performed as well as the
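The posterior computation described above can be sketched for a single monitored count. This minimal illustration assumes a Poisson count model and a small set of hypothetical increased-rate patterns, a deliberate simplification of the paper's semi-informative prior:

```python
import math

def poisson_pmf(k, lam):
    """P(K = k) for K ~ Poisson(lam)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def outbreak_posterior(count, baseline_rate, prior_outbreak=0.01,
                       relative_increases=(1.5, 2.0, 3.0)):
    """Posterior probability of an anomalous (outbreak) day, marginalizing
    a uniform prior over several hypothetical increased rates. The prior
    probability and the rate multipliers are illustrative assumptions."""
    like_normal = poisson_pmf(count, baseline_rate)
    like_outbreak = sum(poisson_pmf(count, r * baseline_rate)
                        for r in relative_increases) / len(relative_increases)
    num = prior_outbreak * like_outbreak
    return num / (num + (1 - prior_outbreak) * like_normal)
```

A count near baseline keeps the posterior near the small prior, while a count far above baseline drives it toward 1, which is the behavior the semi-informative "increase over baseline" prior is designed to capture.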

  3. Anomaly detection of turbopump vibration in Space Shuttle Main Engine using statistics and neural networks

    NASA Technical Reports Server (NTRS)

    Lo, C. F.; Wu, K.; Whitehead, B. A.

    1993-01-01

    Statistical and neural network methods were applied to investigate the feasibility of detecting anomalies in turbopump vibration of the SSME. Anomalies are detected from the amplitudes of peaks at the fundamental and harmonic frequencies in the power spectral density; these data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods proved feasible for detecting vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank and is applicable to on-line operation. The neural network method likewise needs enough data to train the networks; the testing procedure can then be used at any time, so long as the characteristics of the components remain unchanged.

  4. A Comparative Study of Unsupervised Anomaly Detection Techniques Using Honeypot Data

    NASA Astrophysics Data System (ADS)

    Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Inoue, Daisuke; Eto, Masashi; Nakao, Koji

    Intrusion Detection Systems (IDS) have received considerable attention among network security researchers as one of the most promising countermeasures for defending crucial computer systems and networks against attackers on the Internet. Over the past few years, many machine learning techniques have been applied to IDSs to improve their performance and to construct them with low cost and effort. In particular, unsupervised anomaly detection techniques have a significant advantage in their capability to identify unforeseen attacks, i.e., 0-day attacks, and to build intrusion detection models without any labeled (i.e., pre-classified) training data in an automated manner. In this paper, we conduct a set of experiments to evaluate and analyze the performance of the major unsupervised anomaly detection techniques using real traffic data obtained at our honeypots deployed inside and outside the campus network of Kyoto University, under various evaluation criteria: performance as a function of similarity measure and training-data size, overall performance, detection ability for unknown attacks, and time complexity. Our experimental results give practical and useful guidelines to IDS researchers and operators, so that they can gain insight into applying these techniques to intrusion detection and devise more effective intrusion detection models.

  5. Effects of Sampling and Spatio/Temporal Granularity in Traffic Monitoring on Anomaly Detectability

    NASA Astrophysics Data System (ADS)

    Ishibashi, Keisuke; Kawahara, Ryoichi; Mori, Tatsuya; Kondoh, Tsuyoshi; Asano, Shoichiro

    We quantitatively evaluate how sampling and the spatio-temporal granularity of traffic monitoring affect the detectability of anomalous traffic. These parameters also affect the monitoring burden, so network operators face a trade-off between burden and detectability and need to know the optimal parameter values. We derive equations to calculate the false positive ratio and false negative ratio for given values of the sampling rate, granularity, statistics of normal traffic, and volume of anomalies to be detected. Specifically, assuming that normal traffic has a Gaussian distribution parameterized by its mean and standard deviation, we analyze how sampling and monitoring granularity change these distribution parameters. This analysis is based on observation of backbone traffic, which exhibits spatially uncorrelated and temporally long-range-dependent behavior. We then derive the equations for detectability. With these equations, we can answer practical questions that arise in actual network operations: what sampling rate to set in order to find a given volume of anomaly, or, if that rate is too high for actual operation, what granularity is optimal for a given lower limit on the sampling rate.
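The flavor of the detectability equations can be sketched under the Gaussian assumption. This minimal illustration assumes Binomial packet-sampling thinning of a Gaussian load and a simple volume threshold; the paper's exact sampling model and formulas may differ:

```python
import math

def gaussian_tail(x, mu, sigma):
    """P(X > x) for X ~ N(mu, sigma^2)."""
    return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2.0)))

def detectability(mu, sigma, anomaly, p, threshold):
    """False-positive and false-negative ratios for a volume threshold on
    packet-sampled counts. Sampling at rate p is modeled as Binomial
    thinning, shifting the mean to p*mu and adding p*(1-p)*mu variance
    (an assumption for illustration)."""
    mu_s = p * mu
    sigma_s = math.sqrt(p * p * sigma ** 2 + p * (1.0 - p) * mu)
    fpr = gaussian_tail(threshold, mu_s, sigma_s)                  # normal day over threshold
    fnr = 1.0 - gaussian_tail(threshold, mu_s + p * anomaly, sigma_s)  # anomalous day missed
    return fpr, fnr
```

Sweeping `p` and the aggregation interval through such a function reproduces the trade-off the paper studies: heavier sampling lowers both error ratios but raises the monitoring burden.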

  6. Statistical characteristics of the sequential detection of signals in correlated noise

    NASA Astrophysics Data System (ADS)

    Averochkin, V. A.; Baranov, P. E.

    1985-10-01

    A solution is given to the problem of determining the distribution of the duration of the sequential two-threshold Wald rule for the time-discrete detection of deterministic and Gaussian correlated signals on a background of Gaussian correlated noise. Expressions are obtained for the joint probability densities of the likelihood ratio logarithms, and an analysis is made of the effect of correlation and SNR on the duration distribution and the detection efficiency. A comparison is made with Neyman-Pearson detection.

  7. Using Physical Models for Anomaly Detection in Control Systems

    NASA Astrophysics Data System (ADS)

    Svendsen, Nils; Wolthusen, Stephen

    Supervisory control and data acquisition (SCADA) systems are increasingly used to operate critical infrastructure assets. However, the inclusion of advanced information technology and communications components and elaborate control strategies in SCADA systems increases the threat surface for external and subversion-type attacks. The problems are exacerbated by site-specific properties of SCADA environments that make subversion detection impractical, and by sensor noise and feedback characteristics that degrade conventional anomaly detection systems. Moreover, potential attack mechanisms are ill-defined and may include both physical and logical aspects.

  8. Fiber Optic Bragg Grating Sensors for Thermographic Detection of Subsurface Anomalies

    NASA Technical Reports Server (NTRS)

    Allison, Sidney G.; Winfree, William P.; Wu, Meng-Chou

    2009-01-01

    Conventional thermography with an infrared imager has been shown to be an extremely viable technique for nondestructively detecting subsurface anomalies such as thickness variations due to corrosion. A recently developed technique using fiber optic sensors to measure temperature holds potential for performing similar inspections without requiring an infrared imager. The structure is heated using a heat source such as a quartz lamp with fiber Bragg grating (FBG) sensors at the surface of the structure to detect temperature. Investigated structures include a stainless steel plate with thickness variations simulated by small platelets attached to the back side using thermal grease. A relationship is shown between the FBG sensor thermal response and variations in material thickness. For comparison, finite element modeling was performed and found to agree closely with the fiber optic thermography results. This technique shows potential for applications where FBG sensors are already bonded to structures for Integrated Vehicle Health Monitoring (IVHM) strain measurements and can serve dual-use by also performing thermographic detection of subsurface anomalies.

  9. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
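    The residual-monitoring step can be illustrated with a minimal sketch. The threshold rule and the `nominal_sigma` parameter below are assumptions for illustration, not the paper's actual detection logic.

```python
def detect_residual_anomalies(sensed, predicted, nominal_sigma, k=3.0):
    """Flag samples where the residual between sensed and model-predicted
    outputs exceeds k standard deviations of the nominal residual spread."""
    residuals = [s - p for s, p in zip(sensed, predicted)]
    return [i for i, r in enumerate(residuals) if abs(r) > k * nominal_sigma]

# Example: the fourth sensed value departs far from the model prediction.
flags = detect_residual_anomalies([1.0, 1.1, 0.9, 5.0], [1.0] * 4, 0.1)
```

In the paper's setting the `predicted` sequence would come from the piecewise linear engine model; here it is a constant placeholder.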

  10. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steadystate information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  11. Data Mining for Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Biswas, Gautam; Mack, Daniel; Mylaraswamy, Dinkar; Bharadwaj, Raj

    2013-01-01

    The Vehicle Integrated Prognostics Reasoner (VIPR) program describes methods for enhanced diagnostics as well as a prognostic extension to the current state-of-the-art Aircraft Diagnostic and Maintenance System (ADMS). VIPR introduced a new anomaly detection function for discovering previously undetected and undocumented situations, where there are clear deviations from nominal behavior. Once a baseline (nominal model of operations) is established, the detection and analysis is split between on-aircraft outlier generation and off-aircraft expert analysis to characterize and classify events that may not have been anticipated by individual system providers. Offline expert analysis is supported by data curation and data mining algorithms that can be applied in both supervised and unsupervised learning contexts. In this report, we discuss efficient methods to implement the Kolmogorov complexity measure using compression algorithms, and run a systematic empirical analysis to determine the best compression measure. Our experiments established that the combination of the DZIP compression algorithm and CiDM distance measure provides the best results for capturing relevant properties of time series data encountered in aircraft operations. This combination was used as the basis for developing an unsupervised learning algorithm to define "nominal" flight segments using historical flight segments.

  12. RS-Forest: A Rapid Density Estimator for Streaming Anomaly Detection.

    PubMed

    Wu, Ke; Zhang, Kun; Fan, Wei; Edwards, Andrea; Yu, Philip S

    Anomaly detection in streaming data is of high interest in numerous application domains. In this paper, we propose a novel one-class semi-supervised algorithm to detect anomalies in streaming data. Underlying the algorithm is a fast and accurate density estimator implemented by multiple fully randomized space trees (RS-Trees), named RS-Forest. The piecewise constant density estimate of each RS-tree is defined on the tree node into which an instance falls. Each incoming instance in a data stream is scored by the density estimates averaged over all trees in the forest. Two strategies, statistical attribute range estimation of high probability guarantee and dual node profiles for rapid model update, are seamlessly integrated into RS-Forest to systematically address the ever-evolving nature of data streams. We derive the theoretical upper bound for the proposed algorithm and analyze its asymptotic properties via bias-variance decomposition. Empirical comparisons to the state-of-the-art methods on multiple benchmark datasets demonstrate that the proposed method features high detection rate, fast response, and insensitivity to most of the parameter settings. Algorithm implementations and datasets are available upon request.
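    A toy one-dimensional version of the forest-of-random-trees density estimate conveys the core idea: each fully randomized tree partitions the space, each leaf carries a piecewise constant density, and instances are scored by the density averaged over trees. Depths, ranges, and scoring below are illustrative simplifications of RS-Forest, which is multi-dimensional and streaming.

```python
import random

def build_tree(lo, hi, depth):
    """One node of a fully randomized space tree over the interval [lo, hi]."""
    node = {"lo": lo, "hi": hi, "count": 0}
    if depth > 0:
        node["split"] = random.uniform(lo, hi)
        node["left"] = build_tree(lo, node["split"], depth - 1)
        node["right"] = build_tree(node["split"], hi, depth - 1)
    return node

def insert(node, x):
    """Route an instance to its leaf, incrementing counts along the path."""
    node["count"] += 1
    if "split" in node:
        insert(node["left"] if x < node["split"] else node["right"], x)

def leaf_density(node, x, n):
    """Piecewise constant density: leaf count over (n * leaf width)."""
    if "split" in node:
        return leaf_density(node["left"] if x < node["split"] else node["right"], x, n)
    return node["count"] / (n * (node["hi"] - node["lo"]))

def forest_score(trees, x, n):
    """Average the density estimates over all trees; low = more anomalous."""
    return sum(leaf_density(t, x, n) for t in trees) / len(trees)

# Example: train on data concentrated near 0.5; a point near the edge
# of the range receives a much lower density score.
random.seed(7)
trees = [build_tree(0.0, 1.0, 6) for _ in range(10)]
data = [random.uniform(0.4, 0.6) for _ in range(200)]
for t in trees:
    for x in data:
        insert(t, x)
```

The real algorithm adds the streaming machinery (attribute range estimation, dual node profiles) that this sketch omits.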

  13. RS-Forest: A Rapid Density Estimator for Streaming Anomaly Detection

    PubMed Central

    Wu, Ke; Zhang, Kun; Fan, Wei; Edwards, Andrea; Yu, Philip S.

    2015-01-01

    Anomaly detection in streaming data is of high interest in numerous application domains. In this paper, we propose a novel one-class semi-supervised algorithm to detect anomalies in streaming data. Underlying the algorithm is a fast and accurate density estimator implemented by multiple fully randomized space trees (RS-Trees), named RS-Forest. The piecewise constant density estimate of each RS-tree is defined on the tree node into which an instance falls. Each incoming instance in a data stream is scored by the density estimates averaged over all trees in the forest. Two strategies, statistical attribute range estimation of high probability guarantee and dual node profiles for rapid model update, are seamlessly integrated into RS-Forest to systematically address the ever-evolving nature of data streams. We derive the theoretical upper bound for the proposed algorithm and analyze its asymptotic properties via bias-variance decomposition. Empirical comparisons to the state-of-the-art methods on multiple benchmark datasets demonstrate that the proposed method features high detection rate, fast response, and insensitivity to most of the parameter settings. Algorithm implementations and datasets are available upon request. PMID:25685112

  14. Caldera unrest detected with seawater temperature anomalies at Deception Island, Antarctic Peninsula

    NASA Astrophysics Data System (ADS)

    Berrocoso, M.; Prates, G.; Fernández-Ros, A.; Peci, L. M.; de Gil, A.; Rosado, B.; Páez, R.; Jigena, B.

    2018-04-01

    Increased thermal activity was detected to coincide with the onset of volcano inflation in the seawater-filled caldera at Deception Island. This thermal activity was manifested in pulses of high water temperature that coincided with ocean tide cycles. The seawater temperature anomalies were detected by a thermometric sensor attached to the tide gauge (bottom pressure sensor). This was installed where the seawater circulation and the locations of known thermal anomalies, fumaroles and thermal springs, together favor the detection of water warmed within the caldera. Detection of the increased thermal activity was also possible because sea ice, which covers the entire caldera during the austral winter months, insulates the water and thus reduces temperature exchange between seawater and atmosphere. In these conditions, the water temperature data has been shown to provide significant information about Deception volcano activity. The detected seawater temperature increase, also observed in soil temperature readings, suggests rapid and near-simultaneous increase in geothermal activity with onset of caldera inflation and an increased number of seismic events observed in the following austral summer.

  15. Anomaly Detection in Test Equipment via Sliding Mode Observers

    NASA Technical Reports Server (NTRS)

    Solano, Wanda M.; Drakunov, Sergey V.

    2012-01-01

    Nonlinear observers were originally developed based on the ideas of variable structure control, and for the purpose of detecting disturbances in complex systems. In this anomaly detection application, these observers were designed for estimating the distributed state of fluid flow in a pipe described by a class of advection equations. The observer algorithm uses collected data in a piping system to estimate the distributed system state (pressure and velocity along a pipe containing liquid gas propellant flow) using only boundary measurements. These estimates are then used to further estimate and localize possible anomalies such as leaks or foreign objects, and instrumentation metering problems such as incorrect flow meter orifice plate size. The observer algorithm has the following parts: a mathematical model of the fluid flow, observer control algorithm, and an anomaly identification algorithm. The main functional operation of the algorithm is in creating the sliding mode in the observer system implemented as software. Once the sliding mode starts in the system, the equivalent value of the discontinuous function in sliding mode can be obtained by filtering out the high-frequency chattering component. In control theory, "observers" are dynamic algorithms for the online estimation of the current state of a dynamic system by measurements of an output of the system. Classical linear observers can provide optimal estimates of a system state in case of uncertainty modeled by white noise. For nonlinear cases, the theory of nonlinear observers has been developed and its success is mainly due to the sliding mode approach. Using the mathematical theory of variable structure systems with sliding modes, the observer algorithm is designed in such a way that it steers the output of the model to the output of the system obtained via a variety of sensors, in spite of possible mismatches between the assumed model and actual system. The unique properties of sliding mode control

  16. Reliable detection of fluence anomalies in EPID-based IMRT pretreatment quality assurance using pixel intensity deviations

    PubMed Central

    Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.

    2012-01-01

    Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ∼1 mm² areas and ≥2% in ∼20 mm² areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified. PMID:22894421
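    The noise-suppression idea (median-filtering the pixel intensity deviations before thresholding) can be sketched in one dimension. The window size, threshold, and pixel values here are illustrative, not the paper's.

```python
from statistics import median

def median_filter(values, window=3):
    """Odd-window 1-D median filter with edge replication."""
    half = window // 2
    padded = [values[0]] * half + list(values) + [values[-1]] * half
    return [median(padded[i:i + window]) for i in range(len(values))]

def detect_pid_anomalies(reference, measured, threshold):
    """Median-filter the pixel intensity deviations (PIDs), then flag
    pixels whose filtered deviation exceeds the threshold."""
    pids = [m - r for r, m in zip(reference, measured)]
    return [i for i, d in enumerate(median_filter(pids)) if abs(d) > threshold]

# Example: a lone noisy pixel (index 1) is suppressed by the filter,
# while a contiguous 3-pixel anomaly (indices 4-6) survives it.
ref = [100] * 10
meas = [100, 105, 100, 100, 105, 105, 105, 100, 100, 100]
```

The single-pixel spike disappears after filtering because the median of its window is dominated by neighbors, whereas the wider anomaly fills the window and passes through.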

  17. Simultaneous capture and sequential detection of two malarial biomarkers on magnetic microparticles.

    PubMed

    Markwalter, Christine F; Ricks, Keersten M; Bitting, Anna L; Mudenda, Lwiindi; Wright, David W

    2016-12-01

    We have developed a rapid magnetic microparticle-based detection strategy for malarial biomarkers Plasmodium lactate dehydrogenase (pLDH) and Plasmodium falciparum histidine-rich protein II (PfHRPII). In this assay, magnetic particles functionalized with antibodies specific for pLDH and PfHRPII as well as detection antibodies with distinct enzymes for each biomarker are added to parasitized lysed blood samples. Sandwich complexes for pLDH and PfHRPII form on the surface of the magnetic beads, which are washed and sequentially re-suspended in detection enzyme substrate for each antigen. The developed simultaneous capture and sequential detection (SCSD) assay detects both biomarkers in samples as low as 2.0 parasites/µl, an order of magnitude below commercially available ELISA kits, has a total incubation time of 35 min, and was found to be reproducible between users over time. This assay provides a simple and efficient alternative to traditional 96-well plate ELISAs, which take 5-8 h to complete and are limited to one analyte. Further, the modularity of the magnetic bead-based SCSD ELISA format could serve as a platform for application to other diseases for which multi-biomarker detection is advantageous. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Anomaly detection of flight routes through optimal waypoint

    NASA Astrophysics Data System (ADS)

    Pusadan, M. Y.; Buliali, J. L.; Ginardi, R. V. H.

    2017-01-01

    One of the deciding factors of a flight is its route. A flight route is defined by a sequence of waypoints, each given by coordinates (latitude and longitude). An anomaly occurs if the aircraft flies outside the specified waypoint area. For flight data, anomalies are identified as problems in the flight route based on ADS-B data. This study aims to determine the optimal waypoints of the flight route. The proposed methods are: i) Agglomerative Hierarchical Clustering (AHC) over several segments based on the range of coordinates (latitude and longitude) at every waypoint; ii) the cophenetic correlation coefficient (c) to determine the correlation between the members of each cluster; iii) cubic spline interpolation as a graphical representation of the connections between the coordinates at every waypoint; and iv) Euclidean distance to measure the distances between waypoints and the two centroids resulting from AHC clustering. The experimental results are: a cophenetic correlation coefficient of 0.691 ≤ c ≤ 0.974, five segments generated from the range of waypoint coordinates, and shortest and longest centroid-to-waypoint distances of 0.46 and 2.18. We conclude that the shortest distance serves as the reference coordinate of the optimal waypoint, while the farthest distance indicates a potentially detectable anomaly.
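    Step iv (centroid-to-waypoint Euclidean distance) can be sketched as follows. The clusters, threshold, and coordinates are hypothetical, and real waypoints would call for great-circle rather than planar distance.

```python
import math

def euclid(p, q):
    """Planar Euclidean distance between two (lat, lon) pairs."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def centroid(points):
    """Mean of a cluster's coordinates."""
    return (sum(p[0] for p in points) / len(points),
            sum(p[1] for p in points) / len(points))

def flag_anomalies(waypoints, clusters, threshold):
    """Flag waypoints farther than `threshold` from every cluster centroid."""
    cents = [centroid(c) for c in clusters]
    return [w for w in waypoints if min(euclid(w, c) for c in cents) > threshold]

# Example: two hypothetical clusters of route coordinates; the second
# test waypoint lies far from both centroids and is flagged.
clusters = [[(0.0, 0.0), (0.0, 1.0)], [(5.0, 5.0), (5.0, 6.0)]]
flags = flag_anomalies([(0.0, 0.5), (10.0, 10.0)], clusters, 2.0)
```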

  19. Physics-based, Bayesian sequential detection method and system for radioactive contraband

    DOEpatents

    Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E

    2014-03-18

    A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy) low-count, radionuclide measurements, i.e. an event mode sequence (EMS), using a statistical approach based on Bayesian inference and physics-model-based signal processing based on the representation of a radionuclide as a monoenergetic decomposition of monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence interval condition-based discriminator for the energy amplitude and interarrival time and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test is then used to determine one of two threshold conditions signifying that the EMS is either identified as the target radionuclide or not, and if not, then repeating the process for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
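    The sequential likelihood ratio test at the heart of the patent is Wald's SPRT. Here is a generic sketch; the exponential likelihoods in the usage example are stand-ins for the patent's radionuclide probability models, and the error rates are placeholders.

```python
import math

def sprt(samples, f0, f1, alpha=0.05, beta=0.05):
    """Wald sequential probability ratio test: accumulate the per-observation
    log-likelihood ratio and stop at thresholds set by the target error
    rates alpha (false alarm) and beta (miss)."""
    upper = math.log((1 - beta) / alpha)   # cross: decide H1 (target present)
    lower = math.log(beta / (1 - alpha))   # cross: decide H0 (target absent)
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += math.log(f1(x)) - math.log(f0(x))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

# Example: hypothetical interarrival-time models. Under H1 events arrive
# four times faster, so short interarrival times drive the test to H1.
f0 = lambda x: math.exp(-x)              # Exp(1) density
f1 = lambda x: 4.0 * math.exp(-4.0 * x)  # Exp(4) density
```

In the patent, each photon event would first be routed to a monoenergy processing channel before feeding a test of this form.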

  20. Speech detection in noise and spatial unmasking in children with simultaneous versus sequential bilateral cochlear implants.

    PubMed

    Chadha, Neil K; Papsin, Blake C; Jiwani, Salima; Gordon, Karen A

    2011-09-01

    To measure speech detection in noise performance for children with bilateral cochlear implants (BiCI), to compare performance between children with simultaneous versus sequential implantation, and to compare performance to normal-hearing children. Prospective cohort study. Tertiary academic pediatric center. Children with early-onset bilateral deafness and 2-year BiCI experience, comprising the "sequential" group (>2 yr interimplantation delay, n = 12) and the "simultaneous" group (no interimplantation delay, n = 10), and normal-hearing controls (n = 8). Thresholds to speech detection (at 0-degree azimuth) were measured with noise at 0-degree azimuth or ± 90-degree azimuth. Spatial unmasking (SU) as the noise condition changed from 0-degree azimuth to ± 90-degree azimuth and binaural summation advantage (BSA) of 2 over 1 CI. Speech detection in noise was significantly poorer than controls for both BiCI groups (p < 0.0001). However, the SU in the simultaneous group approached levels found in normal controls (7.2 ± 0.6 versus 8.6 ± 0.6 dB, p > 0.05) and was significantly better than that in the sequential group (3.9 ± 0.4 dB, p < 0.05). Spatial unmasking was unaffected by the side of noise presentation in the simultaneous group but, in the sequential group, was significantly better when noise was moved to the second rather than the first implanted ear (4.8 ± 0.5 versus 3.0 ± 0.4 dB, p < 0.05). This was consistent with a larger BSA from the sequential group's second rather than first CI. Children with simultaneously implanted BiCI demonstrated an advantage over sequentially implanted children by using spatial cues to improve speech detection in noise.

  1. Anomaly Detection and Life Pattern Estimation for the Elderly Based on Categorization of Accumulated Data

    NASA Astrophysics Data System (ADS)

    Mori, Taketoshi; Ishino, Takahito; Noguchi, Hiroshi; Shimosaka, Masamichi; Sato, Tomomasa

    2011-06-01

    We propose a life pattern estimation method and an anomaly detection method for elderly people living alone. In our observation system, pyroelectric sensors are deployed in the house to measure the person's activities continuously and grasp the person's life pattern. The data are transferred successively to the operation center and displayed precisely to the nurses there, who decide whether the data indicate an anomaly. In the system, people whose life features resemble each other are categorized into the same group. Anomalies that occurred in the past are shared within the group and utilized in the anomaly detection algorithm. The algorithm is based on an "anomaly score," which is computed from the activeness of the person; this activeness is approximately proportional to the frequency of sensor responses per minute. The "anomaly score" is calculated as the difference between the present activeness and its long-term past average. Thus, the score is positive if the present activeness is higher than the past average, and negative if it is lower. If the score exceeds a certain threshold, an anomaly event has occurred. Moreover, we developed an activity estimation algorithm that estimates the residents' basic activities, such as rising and going out. The estimates are shown to the nurses together with the residents' "anomaly scores," so that the nurses can understand the residents' health conditions by combining these two sources of information.
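    The anomaly score described above reduces to a simple difference between present activeness and its long-term average. A minimal sketch, where the threshold is a placeholder and the two-sided (absolute-value) reading of "exceeds a threshold" is an assumption:

```python
from statistics import mean

def anomaly_score(current_activeness, past_activeness):
    """Present activeness minus its long-term average: positive means more
    active than usual, negative means less active."""
    return current_activeness - mean(past_activeness)

def is_anomaly(score, threshold):
    """Flag an event when the score's magnitude exceeds the threshold
    (assumed two-sided: both unusually high and unusually low activeness)."""
    return abs(score) > threshold

# Example: current activeness of 10 against a past average of 5.
score = anomaly_score(10, [4, 5, 6])
```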

  2. Particle Filtering for Model-Based Anomaly Detection in Sensor Networks

    NASA Technical Reports Server (NTRS)

    Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon

    2012-01-01

    A novel technique has been developed for anomaly detection of rocket engine test stand (RETS) data. The objective was to develop a system that post-processes a CSV file containing the sensor readings and activities (time-series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective, and in general met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are iid, i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploit the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time. However, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data is a mixture of multi-variate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation Maximization (EM) algorithm. The

  3. Sequential detection of learning in cognitive diagnosis.

    PubMed

    Ye, Sangbeak; Fellouris, Georgios; Culpepper, Steven; Douglas, Jeff

    2016-05-01

    In order to look more closely at the many particular skills examinees utilize to answer items, cognitive diagnosis models have received much attention, and perhaps are preferable to item response models that ordinarily involve just one or a few broadly defined skills, when the objective is to hasten learning. If these fine-grained skills can be identified, a sharpened focus on learning and remediation can be achieved. The focus here is on how to detect when learning has taken place for a particular attribute and efficiently guide a student through a sequence of items to ultimately attain mastery of all attributes while administering as few items as possible. This can be seen as a problem in sequential change-point detection for which there is a long history and a well-developed literature. Though some ad hoc rules for determining learning may be used, such as stopping after M consecutive items have been successfully answered, more efficient methods that are optimal under various conditions are available. The CUSUM, Shiryaev-Roberts and Shiryaev procedures can dramatically reduce the time required to detect learning while maintaining rigorous Type I error control, and they are studied in this context through simulation. Future directions for modelling and detection of learning are discussed. © 2016 The British Psychological Society.
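    A minimal Bernoulli CUSUM of the kind the abstract studies might look like this. The pre- and post-learning success probabilities and the threshold are illustrative; the paper also covers the Shiryaev-Roberts and Shiryaev procedures, which this sketch does not implement.

```python
import math

def cusum_detect(responses, p0, p1, threshold):
    """One-sided CUSUM on item responses (1 = correct, 0 = incorrect),
    testing for a shift in success probability from p0 to p1 (learning)."""
    s = 0.0
    for n, x in enumerate(responses, 1):
        llr = math.log(p1 / p0) if x == 1 else math.log((1 - p1) / (1 - p0))
        s = max(0.0, s + llr)   # reflect at zero: only upward shifts matter
        if s >= threshold:
            return n            # learning detected after item n
    return None                 # no change detected

# Example: three incorrect responses, then a run of correct ones after
# the attribute is mastered; the statistic crosses the threshold quickly.
detected_at = cusum_detect([0, 0, 0, 1, 1, 1, 1, 1], 0.2, 0.8, 4.0)
```

Unlike the ad hoc "M consecutive correct" rule mentioned in the abstract, the threshold here directly controls the trade-off between detection delay and Type I error.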

  4. WE-H-BRC-06: A Unified Machine-Learning Based Probabilistic Model for Automated Anomaly Detection in the Treatment Plan Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, X; Liu, S; Kalet, A

    Purpose: The purpose of this work was to investigate the ability of a machine-learning based probabilistic approach to detect radiotherapy treatment plan anomalies given initial disease class information. Methods: In total we obtained 1112 unique treatment plans, with five plan parameters and disease information, from a Mosaiq treatment management system database for use in the study. The plan parameters include prescription dose, fractions, fields, modality and techniques. The disease information includes disease site and T, M and N disease stages. A Bayesian network method was employed to model the probabilistic relationships between tumor disease information, plan parameters and an anomaly flag. A Bayesian learning method with a Dirichlet prior was used to learn the joint probabilities between dependent variables in error-free plan data and in data with artificially induced anomalies. In the study, we randomly sampled anomalous data in a specified anomaly space. We tested the approach with three groups of plan anomalies: improper concurrence of the values of all five plan parameters, improper concurrence of the values of any two of the five parameters, and all single plan parameter value anomalies. In total, 16 types of plan anomalies were covered by the study. For each type, we trained an individual Bayesian network. Results: We found that the true positive rate (recall) and positive predictive value (precision) for detecting concurrence anomalies of the five plan parameters in new patient cases were 94.45±0.26% and 93.76±0.39%, respectively. For the other 15 types of plan anomalies, the average recall and precision were 93.61±2.57% and 93.78±3.54%, respectively. The computation time to detect each type of plan anomaly in a new plan is ∼0.08 seconds. Conclusion: The proposed method for treatment plan anomaly detection was found effective in the initial tests. The results suggest that this type of model could be applied to develop plan anomaly detection tools to assist manual and

  5. A Healthcare Utilization Analysis Framework for Hot Spotting and Contextual Anomaly Detection

    PubMed Central

    Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram

    2012-01-01

    Patient medical records today contain a vast amount of information regarding patient conditions along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify in a timely manner heavy users of the systems and their patterns of utilization so that targeted intervention programs can be instituted, and anomaly detection: the ability to identify anomalous utilization cases where the patients incurred levels of utilization that are unexpected given their clinical characteristics which may require corrective actions. Past work on medical utilization pattern analysis has focused on disease-specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map patient’s clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. In future work we plan to incorporate additional sources of observational data including EMRs and disease registries, and develop analytics models to leverage temporal relationships among

  6. A healthcare utilization analysis framework for hot spotting and contextual anomaly detection.

    PubMed

    Hu, Jianying; Wang, Fei; Sun, Jimeng; Sorrentino, Robert; Ebadollahi, Shahram

    2012-01-01

    Patient medical records today contain vast amounts of information regarding patient conditions, along with treatment and procedure records. Systematic healthcare resource utilization analysis leveraging such observational data can provide critical insights to guide resource planning and improve the quality of care delivery while reducing cost. Of particular interest to providers are hot spotting: the ability to identify, in a timely manner, heavy users of the system and their patterns of utilization so that targeted intervention programs can be instituted; and anomaly detection: the ability to identify anomalous utilization cases, where patients incurred levels of utilization that are unexpected given their clinical characteristics and that may require corrective action. Past work on medical utilization pattern analysis has focused on disease-specific studies. We present a framework for utilization analysis that can be easily applied to any patient population. The framework includes two main components: utilization profiling and hot spotting, where we use a vector space model to represent patient utilization profiles, and apply clustering techniques to identify utilization groups within a given population and isolate high utilizers of different types; and contextual anomaly detection for utilization, where models that map a patient's clinical characteristics to the utilization level are built in order to quantify the deviation between the expected and actual utilization levels and identify anomalies. We demonstrate the effectiveness of the framework using claims data collected from a population of 7667 diabetes patients. Our analysis demonstrates the usefulness of the proposed approaches in identifying clinically meaningful instances for both hot spotting and anomaly detection. In future work we plan to incorporate additional sources of observational data, including EMRs and disease registries, and develop analytics models to leverage temporal relationships among
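    The contextual anomaly detection idea in this record can be sketched with a simple regression stand-in: fit a model mapping clinical characteristics to utilization level, then flag patients whose actual utilization deviates strongly from the level expected for their characteristics. All feature names and numbers below are invented for illustration; the paper's own models and data are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # Synthetic cohort: utilization driven by (hypothetical) clinical
    # characteristics, plus one unexpectedly heavy user at index 0.
    n = 500
    age = rng.uniform(40, 80, n)
    comorbidities = rng.poisson(2, n).astype(float)
    utilization = 0.5 * comorbidities + 0.05 * age + rng.normal(0, 1.0, n)
    utilization[0] += 10.0   # anomalous given this patient's characteristics

    # Expected utilization from a linear model of the characteristics.
    X = np.column_stack([age, comorbidities, np.ones(n)])
    w, *_ = np.linalg.lstsq(X, utilization, rcond=None)
    resid = utilization - X @ w

    # Contextual anomalies: deviation beyond 3 robust standard deviations.
    sigma = np.median(np.abs(resid - np.median(resid))) / 0.6745
    anomalies = np.nonzero(np.abs(resid) > 3 * sigma)[0]
    print(anomalies)
    ```

    The same scoring works with any regression model in place of the linear fit.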

  7. Anomaly Detection in Gamma-Ray Vehicle Spectra with Principal Components Analysis and Mahalanobis Distances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tardiff, Mark F.; Runkle, Robert C.; Anderson, K. K.

    2006-01-23

    The goal of primary radiation monitoring in support of routine screening and emergency response is to detect characteristics in vehicle radiation signatures that indicate the presence of potential threats. Two conceptual approaches to analyzing gamma-ray spectra for threat detection are isotope identification and anomaly detection. While isotope identification is the time-honored method, an emerging technique is anomaly detection, which uses benign vehicle gamma-ray signatures to define an expectation of the radiation signature for vehicles that do not pose a threat. Newly acquired spectra are then compared to this expectation using statistical criteria that reflect acceptable false alarm rates and probabilities of detection. The gamma-ray spectra analyzed here were collected at a U.S. land Port of Entry (POE) using a NaI-based radiation portal monitor (RPM). The raw data were analyzed to develop a benign vehicle expectation by decimating the original pulse-height channels to 35 energy bins, extracting composite variables via principal components analysis (PCA), and estimating statistically weighted distances from the mean vehicle spectrum with the Mahalanobis distance (MD) metric. This paper reviews the methods used to establish the anomaly identification criteria and presents a systematic analysis of the response of the combined PCA and MD algorithm to modeled mono-energetic gamma-ray sources.
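    The PCA-plus-Mahalanobis pipeline this record describes can be sketched on synthetic spectra. Everything below (counts, number of components, quantile threshold) is illustrative, not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Benign "vehicle spectra": 200 samples over 35 energy bins (the paper
    # decimates raw pulse-height channels down to 35 bins first).
    benign = rng.poisson(lam=50.0, size=(200, 35)).astype(float)

    # PCA via SVD of the mean-centered data.
    mean = benign.mean(axis=0)
    centered = benign - mean
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    k = 5                        # retained components (illustrative choice)
    components = Vt[:k]
    scores = centered @ components.T
    cov_inv = np.linalg.inv(np.cov(scores, rowvar=False))

    def mahalanobis_distance(spectrum):
        """Squared Mahalanobis distance of a spectrum in PCA score space."""
        z = (spectrum - mean) @ components.T
        return float(z @ cov_inv @ z)

    # An anomalous spectrum: a strong peak injected into one energy bin.
    anomalous = benign[0].copy()
    anomalous[20] += 300.0

    # Threshold set from the benign population's distance distribution.
    threshold = np.quantile([mahalanobis_distance(x) for x in benign], 0.99)
    print(mahalanobis_distance(anomalous) > threshold)
    ```

    The 0.99 quantile plays the role of the statistical criterion trading off false alarm rate against detection probability.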

  8. Structural Anomaly Detection Using Fiber Optic Sensors and Inverse Finite Element Method

    NASA Technical Reports Server (NTRS)

    Quach, Cuong C.; Vazquez, Sixto L.; Tessler, Alex; Moore, Jason P.; Cooper, Eric G.; Spangler, Jan. L.

    2005-01-01

    NASA Langley Research Center is investigating a variety of techniques for mitigating aircraft accidents due to structural component failure. One technique under consideration combines distributed fiber-optic strain sensing with an inverse finite element method for detecting and characterizing structural anomalies, anomalies that may provide early indication of airframe structure degradation. The technique identifies structural anomalies that result in observable changes in localized strain but do not impact the overall surface shape. Surface shape information is provided by an inverse finite element method that computes full-field displacements and internal loads using strain data from in-situ fiber-optic sensors. This paper describes a prototype of such a system and reports results from a series of laboratory tests conducted on a test coupon subjected to increasing levels of damage.

  9. Multi-criteria anomaly detection in urban noise sensor networks.

    PubMed

    Dauwe, Samuel; Oldoni, Damiano; De Baets, Bernard; Van Renterghem, Timothy; Botteldooren, Dick; Dhoedt, Bart

    2014-01-01

    The growing concern of citizens about the quality of their living environment and the emergence of low-cost microphones and data acquisition systems have triggered the deployment of numerous noise monitoring networks spread over large geographical areas. Due to the local character of noise pollution in an urban environment, a dense measurement network is needed to accurately assess spatial and temporal variations. The use of consumer-grade microphones in this context is very cost-efficient compared to measurement microphones. However, the lower reliability of these sensing units requires strong quality control of the measured data. To automatically validate sensor (microphone) data prior to their use in further processing, a multi-criteria measurement quality assessment model was developed to detect anomalies such as microphone breakdowns, drift, and critical outliers. Each criterion yields a quality score between 0 and 1. An ordered weighted average (OWA) operator combines these individual scores into a global quality score. The model is validated on datasets acquired from a real-world, extensive noise monitoring network consisting of more than 50 microphones. Over a period of more than a year, the proposed approach successfully detected several microphone faults and anomalies.
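    The OWA aggregation step mentioned here is compact enough to show directly. The distinguishing point of OWA is that the weights apply to the *sorted* scores, not to fixed criteria; the criterion names and weights below are illustrative only:

    ```python
    def owa(scores, weights):
        """Ordered weighted average: weights are applied to the scores
        after sorting them in descending order, not to fixed criteria."""
        assert abs(sum(weights) - 1.0) < 1e-9 and len(scores) == len(weights)
        return sum(w * s for w, s in zip(weights, sorted(scores, reverse=True)))

    # Quality scores in [0, 1] from hypothetical criteria such as
    # breakdown, drift, and outlier checks.
    scores = [0.9, 0.2, 0.8]

    print(owa(scores, [1 / 3, 1 / 3, 1 / 3]))  # uniform weights: plain mean
    print(owa(scores, [0.0, 0.0, 1.0]))        # all weight on the worst score
    ```

    Putting weight toward the tail of the sorted scores makes the global score conservative: one failing criterion can pull the whole sensor's quality down.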

  10. Data-Driven Anomaly Detection Performance for the Ares I-X Ground Diagnostic Prototype

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Schwabacher, Mark A.; Matthews, Bryan L.

    2010-01-01

    In this paper, we assess the performance of a data-driven anomaly detection algorithm, the Inductive Monitoring System (IMS), which can be used to detect simulated Thrust Vector Control (TVC) system failures. However, the ability of IMS to detect these failures in a true operational setting may depend on how realistically they are simulated. As such, we investigate both a low-fidelity and a high-fidelity approach to simulating such failures, with the latter based upon the underlying physics. Furthermore, the ability of IMS to detect anomalies that were previously unknown and not previously simulated is studied in earnest, as are apparent deficiencies or misapplications that result from using the data-driven paradigm. Our conclusions indicate that the robust detection performance of IMS on simulated failures is not appreciably affected by the use of a high-fidelity simulation. However, we have found that the inclusion of a data-driven algorithm such as IMS in a suite of deployable health management technologies does add significant value.

  11. Identification and detection of anomalies through SSME data analysis

    NASA Technical Reports Server (NTRS)

    Pereira, Lisa; Ali, Moonis

    1990-01-01

    The goal of the ongoing research described in this paper is to analyze real-time ground test data in order to identify patterns associated with anomalous engine behavior, and on the basis of this analysis to develop an expert system which detects anomalous engine behavior in the early stages of fault development. A prototype of the expert system has been developed and tested on the high-frequency data of two SSME tests, namely Test #901-0516 and Test #904-044. The comparison of our results with the post-test analyses indicates that the expert system detected the presence of the anomalies at a significantly early stage of fault development.

  12. Support vector machines for TEC seismo-ionospheric anomalies detection

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-02-01

    Using time series prediction methods, it is possible to pursue the behavior of earthquake precursors in the future and to announce early warnings when the difference between the predicted value and the observed value exceeds a predefined threshold. Support Vector Machines (SVMs) are widely used for classification and regression tasks due to their many advantages. This study investigates Total Electron Content (TEC) time series using an SVM to detect seismo-ionospheric anomalous variations induced by three powerful earthquakes: Tohoku (11 March 2011), Haiti (12 January 2010) and Samoa (29 September 2009). The durations of the TEC time series datasets are 49, 46 and 71 days for the Tohoku, Haiti and Samoa earthquakes, respectively, each with a time resolution of 2 h. In the case of the Tohoku earthquake, the results show that the difference between the predicted value obtained from the SVM method and the observed value reaches its maximum (i.e., 129.31 TECU) at the earthquake time, in a period of high geomagnetic activity. The SVM method detected a considerable number of anomalous occurrences 1 and 2 days prior to the Haiti earthquake and also 1 and 5 days before the Samoa earthquake, in a period of low geomagnetic activity. To show that the method acts sensibly on both non-event and event TEC data, i.e., to perform null-hypothesis tests in which the method is also calibrated, the same period of data from the year before the Samoa earthquake was taken into account. In addition, the TEC anomalies detected using the SVM method were compared to previous results (Akhoondzadeh and Saradjian, 2011; Akhoondzadeh, 2012) obtained from the mean, median, wavelet and Kalman filter methods. The SVM-detected anomalies are similar to those detected using the previous methods. It can be concluded that SVM can be a suitable learning method to detect
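    The predict-then-threshold scheme in this record (anomaly when |predicted − observed| exceeds a threshold) can be sketched on a synthetic TEC-like series. For self-containment, an ordinary least-squares autoregression stands in for the paper's SVM regressor; any one-step-ahead predictor slots into the same scheme. The series, the injected anomaly, and the threshold rule are all illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic series sampled every 2 h: a daily cycle plus noise,
    # with one injected anomalous enhancement.
    t = np.arange(49 * 12)                    # 49 days at 2-h resolution
    tec = 20 + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, t.size)
    tec[400] += 8.0                           # injected anomaly

    # Stand-in predictor: least squares on the previous p samples.
    p = 12
    X = np.column_stack([tec[i:i - p] for i in range(p)])
    y = tec[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = np.abs(y - X @ coef)

    # Declare an anomaly where the residual exceeds k robust standard
    # deviations (MAD-based sigma estimate).
    k = 5.0
    threshold = k * np.median(resid) / 0.6745
    anomalies = np.nonzero(resid > threshold)[0] + p
    print(anomalies)
    ```

    The injected spike at index 400 produces a residual far above the threshold, mimicking how a TEC enhancement exceeds the SVM prediction band.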

  13. Anomaly Detection in the Right Hemisphere: The Influence of Visuospatial Factors

    ERIC Educational Resources Information Center

    Smith, Stephen D.; Dixon, Michael J.; Tays, William J.; Bulman-Fleming, M. Barbara

    2004-01-01

    Previous research with both brain-damaged and neurologically intact populations has demonstrated that the right cerebral hemisphere (RH) is superior to the left cerebral hemisphere (LH) at detecting anomalies (or incongruities) in objects (Ramachandran, 1995; Smith, Tays, Dixon, & Bulman-Fleming, 2002). The current research assesses whether the RH…

  14. A primitive study on unsupervised anomaly detection with an autoencoder in emergency head CT volumes

    NASA Astrophysics Data System (ADS)

    Sato, Daisuke; Hanaoka, Shouhei; Nomura, Yukihiro; Takenaga, Tomomi; Miki, Soichiro; Yoshikawa, Takeharu; Hayashi, Naoto; Abe, Osamu

    2018-02-01

    Purpose: The target disorders of emergency head CT are wide-ranging. Therefore, people working in an emergency department desire a computer-aided detection system for general disorders. In this study, we propose an unsupervised anomaly detection method for emergency head CT using an autoencoder and evaluate its anomaly detection performance. Methods: We used a 3D convolutional autoencoder (3D-CAE), which contains 11 layers in the convolution block and 6 layers in the deconvolution block. In the training phase, we trained the 3D-CAE using 10,000 3D patches extracted from 50 normal cases. In the test phase, we calculated the abnormality of each voxel in 38 emergency head CT volumes (22 abnormal cases and 16 normal cases) and evaluated the likelihood of lesion existence. Results: Our method achieved a sensitivity of 68% and a specificity of 88%, with an area under the receiver operating characteristic curve of 0.87. This shows that the method has moderate accuracy in distinguishing normal CT cases from abnormal ones. Conclusion: Our method shows potential for anomaly detection in emergency head CT.
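    The scoring principle here (train a reconstruction model on normal patches only, score test patches by reconstruction error) can be illustrated without a deep network. Below, a linear autoencoder (PCA reconstruction) on synthetic low-rank "patches" stands in for the paper's 3D-CAE; this is a deliberately simplified analog of the idea, not the paper's architecture:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic "normal" patches: low-rank structure plus small noise.
    latent = rng.normal(size=(500, 8))
    mixing = rng.normal(size=(8, 64))
    normal_patches = latent @ mixing + rng.normal(0, 0.05, size=(500, 64))

    # Linear autoencoder fitted on normal data only (PCA bottleneck).
    mean = normal_patches.mean(axis=0)
    _, _, Vt = np.linalg.svd(normal_patches - mean, full_matrices=False)
    W = Vt[:8]                         # 8-dimensional bottleneck

    def abnormality(patch):
        """Reconstruction error: high for patches unlike the training data."""
        code = (patch - mean) @ W.T    # encode
        recon = code @ W + mean        # decode
        return float(np.sum((patch - recon) ** 2))

    # A patch that does not lie in the normal subspace scores far higher.
    anomalous_patch = rng.normal(0, 1, 64)
    print(abnormality(anomalous_patch) > 10 * abnormality(normal_patches[0]))
    ```

    A trained 3D-CAE replaces the encode/decode lines; the per-voxel abnormality map in the paper aggregates exactly this kind of reconstruction error.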

  15. A modified anomaly detection method for capsule endoscopy images using non-linear color conversion and Higher-order Local Auto-Correlation (HLAC).

    PubMed

    Hu, Erzhong; Nosato, Hirokazu; Sakanashi, Hidenori; Murakawa, Masahiro

    2013-01-01

    Capsule endoscopy is a patient-friendly form of endoscopy widely used in gastrointestinal examination. However, the efficacy of diagnosis is restricted by the large quantity of images. This paper presents a modified anomaly detection method by which both known and unknown anomalies in capsule endoscopy images of the small intestine are expected to be detected. To achieve this goal, the paper introduces feature extraction using a non-linear color conversion and Higher-order Local Auto-Correlation (HLAC) features, and makes use of image partitioning and a subspace method for anomaly detection. Experiments were conducted on several major anomalies with combinations of the proposed techniques. The proposed method achieved 91.7% and 100% detection accuracy for swelling and bleeding, respectively, demonstrating its effectiveness.

  16. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems

    PubMed Central

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. All types of shilling attacks share two characteristics: 1) item abnormality: the rating of target items is always the maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need further improvement. To address these problems, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first dynamically partitions item-rating time series based on important points. Then, the chi-square distribution (χ²) is used to detect abnormal intervals. Experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes. PMID:26267477

  17. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems.

    PubMed

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. All types of shilling attacks share two characteristics: 1) item abnormality: the rating of target items is always the maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need further improvement. To address these problems, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first dynamically partitions item-rating time series based on important points. Then, the chi-square distribution (χ²) is used to detect abnormal intervals. Experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes.
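    The chi-square test on an interval's rating histogram can be shown concretely. The rating proportions, interval counts, and partition below are invented for illustration; the statistic and critical value (df = 4 for five rating levels, α = 0.05) are standard:

    ```python
    import numpy as np

    # An item's long-run proportions of ratings 1..5 (illustrative).
    overall = np.array([0.05, 0.10, 0.30, 0.35, 0.20])

    def chi_square_stat(counts, expected_props):
        """Pearson chi-square statistic of an interval's rating histogram
        against the item's expected rating distribution."""
        counts = np.asarray(counts, dtype=float)
        expected = expected_props * counts.sum()
        return float(np.sum((counts - expected) ** 2 / expected))

    CRITICAL = 9.488   # chi-square critical value, df = 4, alpha = 0.05

    normal_interval = [6, 9, 31, 34, 20]   # tracks the overall distribution
    attack_interval = [0, 0, 2, 3, 95]     # push attack: ratings forced to 5

    print(chi_square_stat(normal_interval, overall) > CRITICAL)
    print(chi_square_stat(attack_interval, overall) > CRITICAL)
    ```

    An interval flooded with maximum ratings blows the statistic far past the critical value, which is how a promptly injected attack shows up as an abnormal interval.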

  18. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.

    2015-10-01

    We suggest an analytical approach to adaptive thresholding in the shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observed shape hindered by strong measurement noise; the distribution depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional approach that does not take variations in the noise level into account.
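    The underlying score is just the cosine similarity between a noisy observed cycle and a reference shape. The sketch below uses a synthetic bump in place of an ECG cycle and a fixed illustrative threshold; in the paper, the threshold is set adaptively from the analytical score distribution for the current noise level:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def cosine_similarity(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Reference shape: one smooth "cycle" (purely illustrative).
    t = np.linspace(0, 1, 200)
    reference = np.exp(-((t - 0.5) ** 2) / 0.005)

    # Observed cycles hindered by strong additive white noise.
    normal_obs = reference + rng.normal(0, 0.3, t.size)
    anomalous_obs = np.roll(reference, 60) + rng.normal(0, 0.3, t.size)

    threshold = 0.5   # fixed stand-in for the paper's adaptive rule
    print(cosine_similarity(reference, normal_obs) > threshold)
    print(cosine_similarity(reference, anomalous_obs) > threshold)
    ```

    Because the score distribution under noise depends only on the noise level, the fixed constant above can be replaced by a per-noise-level quantile, which is the paper's contribution.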

  19. Volcanic activity and satellite-detected thermal anomalies at Central American volcanoes

    NASA Technical Reports Server (NTRS)

    Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.

    1973-01-01

    The author has identified the following significant results. A large nuee ardente eruption occurred at Santiaguito volcano, within the test area, on 16 September 1973. Through a system of local observers, the eruption has been described, reported to the international scientific community, the extent of the affected area mapped, and the new ash sampled. A more extensive report on this event will be prepared. The eruption is an excellent example of the kind of volcanic situation in which satellite thermal imagery might be useful. The Santiaguito dome is a complex mass with a whole series of historically active vents. Its location makes access difficult, yet its activity is of great concern to the large agricultural populations who live downslope. Santiaguito has produced a number of large eruptions with little apparent warning. In the earlier ground survey, large thermal anomalies were identified at Santiaguito. There is no way of knowing whether satellite monitoring could have detected changes in thermal anomaly patterns related to this recent event, but the position of thermal anomalies on Santiaguito and any changes in their character would be relevant information.

  20. Anomaly detection using temporal data mining in a smart home environment.

    PubMed

    Jakkula, V; Cook, D J

    2008-01-01

    To many people, home is a sanctuary. With the maturing of smart home technologies, many people with cognitive and physical disabilities can lead independent lives in their own homes for extended periods of time. In this paper, we investigate the design of machine learning algorithms that support this goal. We hypothesize that machine learning algorithms can be designed to automatically learn models of resident behavior in a smart home, and that the results can be used to perform automated health monitoring and to detect anomalies. Specifically, our algorithms draw upon the temporal nature of sensor data collected in a smart home to build a model of expected activities and to detect unexpected, and possibly health-critical, events in the home. We validate our algorithms using synthetic data and real activity data collected from volunteers in an automated smart environment. The results from our experiments support our hypothesis that a model can be learned from observed smart home data and used to report anomalies, as they occur, in a smart home.

  1. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems.

    PubMed

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often take place in manufacturing systems. Real-time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called the PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for this purpose. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel with the data-logging system to identify operational faults and behavioural anomalies effectively.

  2. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems

    PubMed Central

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often take place in manufacturing systems. Real-time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called the PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for this purpose. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel with the data-logging system to identify operational faults and behavioural anomalies effectively. PMID:27974882
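    The nominal-model idea behind PLAT can be sketched with a hash set of signal transitions: record every transition seen in fault-free log data, then flag any logged transition absent from that set. The event names and log format below are invented; the paper's actual indexing scheme is more elaborate:

    ```python
    # Hypothetical sketch: a Python set acts as the hash table of
    # nominal (previous event -> next event) transitions.
    def build_nominal_model(training_log):
        """training_log: sequence of (signal, value) tuples from fault-free runs."""
        nominal = set()
        prev = None
        for event in training_log:
            if prev is not None:
                nominal.add((prev, event))   # hashed transition
            prev = event
        return nominal

    def detect_anomalies(log, nominal):
        prev = None
        flagged = []
        for i, event in enumerate(log):
            if prev is not None and (prev, event) not in nominal:
                flagged.append(i)            # behavioural anomaly at step i
            prev = event
        return flagged

    ok_run = [("clamp", 1), ("drill", 1), ("drill", 0), ("clamp", 0)]
    nominal = build_nominal_model(ok_run)
    faulty = [("clamp", 1), ("drill", 1), ("clamp", 0)]   # drill never released
    print(detect_anomalies(faulty, nominal))  # → [2]
    ```

    Hash-based lookup keeps each check O(1), which is what makes this kind of monitoring feasible in real time on long logs.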

  3. Temporal Characteristics of Radiologists' and Novices' Lesion Detection in Viewing Medical Images Presented Rapidly and Sequentially.

    PubMed

    Nakashima, Ryoichi; Komori, Yuya; Maeda, Eriko; Yoshikawa, Takeharu; Yokosawa, Kazuhiko

    2016-01-01

    Although viewing multiple stacks of medical images presented on a display is a relatively new but useful medical task, little is known about it. In particular, it is unclear how radiologists search for lesions in this type of image reading. When viewing cluttered and dynamic displays, continuous motion itself does not capture attention. Thus, target detection is aided when observers' attention is captured by the onset signal of a suddenly appearing target among continuously moving distractors (i.e., a passive viewing strategy). This can be applied to stack viewing tasks, because lesions often show up as transient signals in medical images that are presented sequentially, simulating a dynamic, smoothly transforming progression of organ images. However, it is unclear whether observers can detect a target when it appears at the beginning of a sequential presentation, where the global apparent-motion onset signal (i.e., the signal marking the initiation of apparent motion by sequential presentation) occurs. We investigated the ability of radiologists to detect lesions in such tasks by comparing the performance of radiologists and novices. Results show that the overall performance of radiologists is better than that of novices. Furthermore, the temporal location of a lesion in a CT image sequence, i.e., when the lesion appears in the sequence, does not affect the performance of radiologists, whereas it does affect the performance of novices. Novices have greater difficulty detecting a lesion that appears early rather than late in the image sequence. We suggest that radiologists have mechanisms, which novices lack, for detecting lesions in medical images with little attention. This ability is critically important when viewing rapid sequential presentations of multiple CT images, such as stack viewing tasks.

  4. Temporal Characteristics of Radiologists' and Novices' Lesion Detection in Viewing Medical Images Presented Rapidly and Sequentially

    PubMed Central

    Nakashima, Ryoichi; Komori, Yuya; Maeda, Eriko; Yoshikawa, Takeharu; Yokosawa, Kazuhiko

    2016-01-01

    Although viewing multiple stacks of medical images presented on a display is a relatively new but useful medical task, little is known about it. In particular, it is unclear how radiologists search for lesions in this type of image reading. When viewing cluttered and dynamic displays, continuous motion itself does not capture attention. Thus, target detection is aided when observers' attention is captured by the onset signal of a suddenly appearing target among continuously moving distractors (i.e., a passive viewing strategy). This can be applied to stack viewing tasks, because lesions often show up as transient signals in medical images that are presented sequentially, simulating a dynamic, smoothly transforming progression of organ images. However, it is unclear whether observers can detect a target when it appears at the beginning of a sequential presentation, where the global apparent-motion onset signal (i.e., the signal marking the initiation of apparent motion by sequential presentation) occurs. We investigated the ability of radiologists to detect lesions in such tasks by comparing the performance of radiologists and novices. Results show that the overall performance of radiologists is better than that of novices. Furthermore, the temporal location of a lesion in a CT image sequence, i.e., when the lesion appears in the sequence, does not affect the performance of radiologists, whereas it does affect the performance of novices. Novices have greater difficulty detecting a lesion that appears early rather than late in the image sequence. We suggest that radiologists have mechanisms, which novices lack, for detecting lesions in medical images with little attention. This ability is critically important when viewing rapid sequential presentations of multiple CT images, such as stack viewing tasks. PMID:27774080

  5. Anomaly Detection Based on Local Nearest Neighbor Distance Descriptor in Crowded Scenes

    PubMed Central

    Hu, Shiqiang; Zhang, Huanlong; Luo, Lingkun

    2014-01-01

    We propose a novel local nearest neighbor distance (LNND) descriptor for anomaly detection in crowded scenes. Compared with the low-level feature descriptors commonly used in previous work, the LNND descriptor has two major advantages. First, it efficiently incorporates spatial and temporal contextual information around a video event, which is important for detecting anomalous interactions among multiple events, while most existing feature descriptors only contain information about a single event. Second, the LNND descriptor is a compact representation, and its dimensionality is typically much lower than that of low-level feature descriptors. Therefore, using the LNND descriptor in an anomaly detection method with offline training not only saves computation time and storage, but also avoids the negative effects of high-dimensional feature descriptors. We validate the effectiveness of the LNND descriptor by conducting extensive experiments on different benchmark datasets. Experimental results show the promising performance of the LNND-based method against state-of-the-art methods. It is worth noting that the LNND-based approach requires fewer intermediate processing steps, without any subsequent processing such as smoothing, yet achieves comparable or even better performance. PMID:25105164
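    A generic nearest-neighbor-distance score conveys the core intuition: an event whose descriptor sits far from all normal training descriptors is anomalous. This is a simplified stand-in for LNND (which computes such distances within a local spatio-temporal context), on synthetic descriptors:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Descriptors extracted from normal video events (synthetic stand-ins).
    train = rng.normal(0, 1, size=(200, 8))

    def nn_distance_score(x, train, k=5):
        """Mean distance to the k nearest training descriptors; a compact
        anomaly score in the spirit of nearest-neighbor-distance methods."""
        d = np.sort(np.linalg.norm(train - x, axis=1))
        return float(d[:k].mean())

    normal_event = rng.normal(0, 1, 8)
    anomalous_event = rng.normal(6, 1, 8)   # far from the normal population
    print(nn_distance_score(anomalous_event, train) >
          nn_distance_score(normal_event, train))
    ```

    The score itself is a single scalar, which reflects the record's point about compactness relative to high-dimensional low-level descriptors.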

  6. Congenital anomalies of the left brachiocephalic vein detected in adults on computed tomography.

    PubMed

    Yamamuro, Hiroshi; Ichikawa, Tamaki; Hashimoto, Jun; Ono, Shun; Nagata, Yoshimi; Kawada, Shuichi; Kobayashi, Makiko; Koizumi, Jun; Shibata, Takeo; Imai, Yutaka

    2017-10-01

    Anomalous left brachiocephalic vein (BCV) is a rare and little-known systemic venous anomaly. We evaluated congenital anomalies of the left BCV in adults detected during computed tomography (CT) examinations. This retrospective study included 81,425 patients without congenital heart disease who underwent chest CT. We reviewed the recorded reports and CT images for congenital anomalies of the left BCV, including aberrant and supernumerary BCVs. Associated congenital aortic anomalies were also assessed. Among 73,407 cases at a university hospital, 22 (16 males, 6 females; mean age, 59 years) with aberrant left BCVs were found by keyword search of the recorded reports (0.03%). Among 8018 cases at the branch hospital, 5 (4 males, 1 female; mean age, 67 years) with aberrant left BCVs were found by CT image review (0.062%). There was no significant difference in incidence of aberrant left BCV between the two groups. Two cases had double left BCVs. Eleven cases showed high aortic arches. Two cases had a right aortic arch, one case had an incomplete double aortic arch, and one case was associated with coarctation. Aberrant left BCV on CT examination in adults was extremely rare. Some cases were associated with aortic arch anomalies.

  7. A new approach for structural health monitoring by applying anomaly detection on strain sensor data

    NASA Astrophysics Data System (ADS)

    Trichias, Konstantinos; Pijpers, Richard; Meeuwissen, Erik

    2014-03-01

    Structural Health Monitoring (SHM) systems help to monitor critical infrastructures (bridges, tunnels, etc.) remotely and provide up-to-date information about their physical condition. In addition, they help to predict the structure's life and required maintenance in a cost-efficient way. Typically, inspection data give insight into structural health. The global structural behavior, and predominantly the structural loading, is generally measured with vibration and strain sensors. Acoustic emission sensors are increasingly used for measuring global crack activity near critical locations. In this paper, we present a procedure for local structural health monitoring by applying Anomaly Detection (AD) to data from strain sensors applied in the expected crack path. Sensor data are analyzed by automatic anomaly detection in order to find crack activity at an early stage. This approach targets the monitoring of critical structural locations, such as welds, near which strain sensors can be applied during construction, and/or locations with limited inspection possibilities during structural operation. We investigate several anomaly detection techniques to detect changes in statistical properties indicating structural degradation. The most effective one is a novel polynomial fitting technique, which tracks slow changes in sensor data. Our approach has been tested on a representative test structure (bridge deck) in a lab environment, under constant- and variable-amplitude fatigue loading. In both cases, the evolving cracks at the monitored locations were successfully and autonomously detected by our AD monitoring tool.
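    Tracking slow changes by polynomial fitting can be sketched as a sliding-window slope monitor: fit a low-order polynomial to each window of strain data and watch the slope coefficient. The signal, window sizes, and drift magnitude below are illustrative, not the paper's:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic strain signal: stationary noise, then a slow drift after
    # a simulated crack-initiation point.
    n = 1000
    strain = rng.normal(0, 1.0, n)
    strain[600:] += 0.01 * np.arange(400)   # slowly growing trend

    # Fit a first-order polynomial over sliding windows; a sustained
    # non-zero slope indicates slow structural change.
    window = 200
    slopes = []
    for start in range(0, n - window + 1, 50):
        seg = strain[start:start + window]
        slopes.append(np.polyfit(np.arange(window), seg, 1)[0])

    print(max(slopes))   # slope rises once the drift is present
    ```

    Averaging the fit over a long window suppresses measurement noise, which is why a fitted slope can reveal a drift far smaller than the per-sample noise level.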

  8. Sequential structural damage diagnosis algorithm using a change point detection method

    NASA Astrophysics Data System (ADS)

    Noh, H.; Rajagopal, R.; Kiremidjian, A. S.

    2013-11-01

    This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method. The general change point detection method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori, unless we are looking for a known specific type of damage. Therefore, we introduce an additional algorithm that estimates and updates this distribution as data are collected using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using a set of experimental data collected from a four-story steel special moment-resisting frame and multiple sets of simulated data. Various features of different dimensions have been explored, and the algorithm was able to identify damage, particularly when it uses multidimensional damage sensitive features and lower false alarm rates, with a known post-damage feature distribution. For unknown feature distribution cases, the post-damage distribution was consistently estimated and the detection delays were only a few time steps longer than the delays from the general method that assumes we know the post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
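
    The sequential hypothesis test with known pre- and post-damage feature distributions can be sketched as a CUSUM-style cumulative log-likelihood-ratio test (Gaussian features and the threshold value are illustrative assumptions, not the paper's settings):

```python
import math

def gauss_logpdf(x, mu, sigma):
    """Log-density of N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def cusum_detect(samples, mu0, sigma0, mu1, sigma1, threshold=5.0):
    """Index at which the cumulative log-likelihood ratio first crosses
    the threshold, or None if no change is declared."""
    s = 0.0
    for i, x in enumerate(samples):
        llr = gauss_logpdf(x, mu1, sigma1) - gauss_logpdf(x, mu0, sigma0)
        s = max(0.0, s + llr)  # reset at zero keeps the test sequential
        if s > threshold:
            return i
    return None

# Damage-sensitive feature shifts mean from 0 to 2 at time step 50:
feature = [0.0] * 50 + [2.0] * 50
print(cusum_detect(feature, 0.0, 1.0, 2.0, 1.0))  # → 52
```

    In the paper's unknown post-damage case, mu1 and sigma1 would themselves be re-estimated (by maximum likelihood or Bayesian updating) as data arrive, rather than fixed in advance.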

  9. Anomaly Detection Using an Ensemble of Feature Models

    PubMed Central

    Noto, Keith; Brodley, Carla; Slonim, Donna

    2011-01-01

    We present a new approach to semi-supervised anomaly detection. Given a set of training examples believed to come from the same distribution or class, the task is to learn a model that will be able to distinguish examples in the future that do not belong to the same class. Traditional approaches typically compare the position of a new data point to the set of “normal” training data points in a chosen representation of the feature space. For some data sets, the normal data may not have discernible positions in feature space, but do have consistent relationships among some features that fail to appear in the anomalous examples. Our approach learns to predict the values of training set features from the values of other features. After we have formed an ensemble of predictors, we apply this ensemble to new data points. To combine the contribution of each predictor in our ensemble, we have developed a novel, information-theoretic anomaly measure that our experimental results show selects against noisy and irrelevant features. Our results on 47 data sets show that for most data sets, this approach significantly improves performance over current state-of-the-art feature space distance and density-based approaches. PMID:22020249
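
    The idea of predicting each feature from the others and scoring disagreement can be sketched as follows (pairwise linear predictors and a mean-absolute-error score are simplifications; the paper's ensemble and information-theoretic anomaly measure are richer):

```python
def fit_line(x, y):
    """Least-squares fit of y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def train_ensemble(rows):
    """One linear predictor per ordered feature pair (j predicted from k)."""
    d = len(rows[0])
    cols = [[r[i] for r in rows] for i in range(d)]
    return {(j, k): fit_line(cols[k], cols[j])
            for j in range(d) for k in range(d) if j != k}

def score(models, point):
    """Anomaly score: mean absolute error of the cross-feature predictions."""
    errs = [abs(point[j] - (a + b * point[k]))
            for (j, k), (a, b) in models.items()]
    return sum(errs) / len(errs)

# "Normal" training data obeys the relationship y = 2x:
train = [(x, 2.0 * x) for x in [1.0, 2.0, 3.0, 4.0, 5.0]]
models = train_ensemble(train)
normal = score(models, (3.0, 6.0))  # respects the relationship
odd = score(models, (3.0, 3.0))     # in-range values, relationship broken
print(normal < odd)  # → True
```

    The point (3.0, 3.0) lies inside the marginal range of both features, so a purely positional detector might miss it; only the broken y = 2x relationship exposes it.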

  10. Microarray-based comparative genomic hybridization analysis in neonates with congenital anomalies: detection of chromosomal imbalances.

    PubMed

    Emy Dorfman, Luiza; Leite, Júlio César L; Giugliani, Roberto; Riegel, Mariluce

    2015-01-01

    To identify chromosomal imbalances by whole-genome microarray-based comparative genomic hybridization (array-CGH) in DNA samples of neonates with congenital anomalies of unknown cause from a birth defects monitoring program at a public maternity hospital. A blind genomic analysis was performed retrospectively in 35 stored DNA samples of neonates born between July of 2011 and December of 2012. All potential DNA copy number variations (CNVs) detected were matched with those reported in public genomic databases, and their clinical significance was evaluated. Out of a total of 35 samples tested, 13 genomic imbalances were detected in 12/35 cases (34.3%). In 4/35 cases (11.4%), chromosomal imbalances could be defined as pathogenic; in 5/35 (14.3%) cases, DNA CNVs of uncertain clinical significance were identified; and in 4/35 cases (11.4%), normal variants were detected. Among the four cases with results considered causally related to the clinical findings, two of the four (50%) showed causative alterations already associated with well-defined microdeletion syndromes. In two of the four samples (50%), the chromosomal imbalances found, although predicted as pathogenic, had not been previously associated with recognized clinical entities. Array-CGH analysis allowed for a higher rate of detection of chromosomal anomalies, and this determination is especially valuable in neonates with congenital anomalies of unknown etiology, or in cases in which karyotype results cannot be obtained. Moreover, although the interpretation of the results must be refined, this method is a robust and precise tool that can be used in the first-line investigation of congenital anomalies, and should be considered for prospective/retrospective analyses of DNA samples by birth defect monitoring programs. Copyright © 2014 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  11. Performances of Machine Learning Algorithms for Binary Classification of Network Anomaly Detection System

    NASA Astrophysics Data System (ADS)

    Nawir, Mukrimah; Amir, Amiza; Lynn, Ong Bi; Yaakob, Naimah; Badlishah Ahmad, R.

    2018-05-01

    The rapid growth of network technologies exposes them to various attacks, owing to the large volumes of data that are continually exchanged over the Internet and must be handled. Moreover, network anomaly detection using machine learning is hampered by the scarcity of publicly available labelled network datasets, which has led many researchers to keep using the most common dataset (KDDCup99) even though it is no longer well suited to evaluating machine learning (ML) algorithms for classification. Several issues regarding the available labelled network datasets are discussed in this paper. The aim of this paper is to build a network anomaly detection system using machine learning algorithms that is efficient, effective, and fast. The findings show that the AODE algorithm performs well in terms of accuracy and processing time for binary classification on the UNSW-NB15 dataset.

  12. Radiation detection method and system using the sequential probability ratio test

    DOEpatents

    Nelson, Karl E [Livermore, CA; Valentine, John D [Redwood City, CA; Beauchamp, Brock R [San Ramon, CA

    2007-07-17

    A method and system using the Sequential Probability Ratio Test to enhance the detection of an elevated level of radiation, by determining whether a set of observations are consistent with a specified model within a given bounds of statistical significance. In particular, the SPRT is used in the present invention to maximize the range of detection, by providing processing mechanisms for estimating the dynamic background radiation, adjusting the models to reflect the amount of background knowledge at the current point in time, analyzing the current sample using the models to determine statistical significance, and determining when the sample has returned to the expected background conditions.
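
    The core of the SPRT on count data can be sketched as follows: accumulate a per-interval log-likelihood ratio between the background rate and an elevated rate, and compare against Wald's two decision bounds (the rates and error levels below are invented for the sketch; the patent's dynamic background estimation and model adjustment are not reproduced):

```python
import math

def sprt_poisson(counts, lam0, lam1, alpha=0.01, beta=0.01):
    """Wald SPRT on Poisson counts: H0 rate lam0 vs H1 (elevated) rate lam1.
    Returns ('H1', i) or ('H0', i) at decision time i, else ('continue', n)."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    s = 0.0
    for i, k in enumerate(counts):
        # log-likelihood ratio of one count (factorial terms cancel)
        s += k * math.log(lam1 / lam0) - (lam1 - lam0)
        if s >= upper:
            return ("H1", i)
        if s <= lower:
            return ("H0", i)
    return ("continue", len(counts))

# Background ~2 counts/interval; a source would push this toward 6:
print(sprt_poisson([5, 6, 7, 8, 9], lam0=2.0, lam1=6.0))  # → ('H1', 2)
```

    A deployed monitor would restart the test after each decision while continually re-estimating the background rate, which is where the range-of-detection gains described above come from.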

  13. Paternal psychological response after ultrasonographic detection of structural fetal anomalies with a comparison to maternal response: a cohort study.

    PubMed

    Kaasen, Anne; Helbig, Anne; Malt, Ulrik Fredrik; Naes, Tormod; Skari, Hans; Haugen, Guttorm Nils

    2013-07-12

    In Norway almost all pregnant women attend one routine ultrasound examination. Detection of fetal structural anomalies triggers psychological stress responses in the women affected. Despite the frequent use of ultrasound examination in pregnancy, little attention has been devoted to the psychological response of the expectant father following the detection of fetal anomalies. This is important for later fatherhood and the psychological interaction within the couple. We aimed to describe paternal psychological responses shortly after detection of structural fetal anomalies by ultrasonography, and to compare paternal and maternal responses within the same couple. A prospective observational study was performed at a tertiary referral centre for fetal medicine. Pregnant women with a structural fetal anomaly detected by ultrasound and their partners (study group, n=155) and 100 women with normal ultrasound findings (comparison group) were included shortly after sonographic examination (inclusion period: May 2006-February 2009). Gestational age was >12 weeks. We used psychometric questionnaires to assess self-reported social dysfunction, health perception, and psychological distress (intrusion, avoidance, arousal, anxiety, and depression): Impact of Event Scale, General Health Questionnaire, and Edinburgh Postnatal Depression Scale. Fetal anomalies were classified according to severity and diagnostic or prognostic ambiguity at the time of assessment. Median (range) gestational age at inclusion in the study and comparison group was 19 (12-38) and 19 (13-22) weeks, respectively. Men and women in the study group had significantly higher levels of psychological distress than men and women in the comparison group on all psychometric endpoints. The lowest level of distress in the study group was associated with the least severe anomalies with no diagnostic or prognostic ambiguity (p < 0.033). Men had lower scores than women on all psychometric outcome variables. The correlation in

  14. ISHM Anomaly Lexicon for Rocket Test

    NASA Technical Reports Server (NTRS)

    Schmalzel, John L.; Buchanan, Aubri; Hensarling, Paula L.; Morris, Jonathan; Turowski, Mark; Figueroa, Jorge F.

    2007-01-01

    Integrated Systems Health Management (ISHM) is a comprehensive capability. An ISHM system must detect anomalies, identify causes of such anomalies, predict future anomalies, and help identify consequences of anomalies (for example, suggested mitigation steps). The system should also provide users with appropriate navigation tools to facilitate the flow of information into and out of the ISHM system. Central to the ability of the ISHM to detect anomalies is a clearly defined catalog of anomalies. Further, this lexicon of anomalies must be organized in ways that make it accessible to a suite of tools used to manage the data, information and knowledge (DIaK) associated with a system. In particular, it is critical to ensure that there is optimal mapping between target anomalies and the algorithms associated with their detection. During the early development of our ISHM architecture and approach, it became clear that a lexicon of anomalies would be important to the development of critical anomaly detection algorithms. In our work in the rocket engine test environment at John C. Stennis Space Center, we have access to a repository of discrepancy reports (DRs) that are generated in response to squawks identified during post-test data analysis. The DR is the tool used to document anomalies and the methods used to resolve the issue. These DRs have been generated for many different tests and for all test stands. The result is that they represent a comprehensive summary of the anomalies associated with rocket engine testing. Fig. 1 illustrates some of the data that can be extracted from a DR. Such information includes affected transducer channels, narrative description of the observed anomaly, and the steps used to correct the problem. The primary goal of the anomaly lexicon development efforts we have undertaken is to create a lexicon that could be used in support of an associated health assessment database system (HADS) co-development effort. There are a number of significant

  15. Optimize the Coverage Probability of Prediction Interval for Anomaly Detection of Sensor-Based Monitoring Series

    PubMed Central

    Liu, Datong; Peng, Yu; Peng, Xiyuan

    2018-01-01

    Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide a representation of uncertainty, probability prediction methods (e.g., Gaussian process regression (GPR) and relevance vector machine (RVM)) are especially adaptable to perform anomaly detection for sensing series. Generally, one key parameter of prediction models is the coverage probability (CP), which controls the judging threshold for testing samples and is generally set to a default value (e.g., 90% or 95%). There are few criteria for determining the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator of the receiver operating characteristic curve of prediction interval (ROC-PI) based on the definition of the ROC curve, which can depict the trade-off between the PI width and PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs; the optimal CP is then derived by minimizing this index with the simulated annealing (SA) algorithm. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. In particular, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application. PMID:29587372
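
    The role of Youden's index in threshold selection can be illustrated directly. This sketch maximizes the standard J = sensitivity + specificity − 1 over a fixed grid of cut-offs; the paper instead uses a modified index and searches with simulated annealing, so the data and grid below are invented:

```python
def youden_scan(scores_pos, scores_neg, thresholds):
    """Return (threshold, J) maximizing Youden's J = TPR + TNR - 1."""
    best = None
    for t in thresholds:
        tpr = sum(s > t for s in scores_pos) / len(scores_pos)
        tnr = sum(s <= t for s in scores_neg) / len(scores_neg)
        j = tpr + tnr - 1
        if best is None or j > best[1]:
            best = (t, j)
    return best

anomalous = [0.9, 0.8, 0.85, 0.6]   # scores for true anomalies
normal = [0.1, 0.2, 0.3, 0.55]      # scores for normal samples
print(youden_scan(anomalous, normal, [0.2, 0.4, 0.7]))  # → (0.4, 0.75)
```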

  16. Ellipsoids for anomaly detection in remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Grosklos, Guenchik; Theiler, James

    2015-05-01

    For many target and anomaly detection algorithms, a key step is the estimation of a centroid (relatively easy) and a covariance matrix (somewhat harder) that characterize the background clutter. For a background that can be modeled as a multivariate Gaussian, the centroid and covariance lead to an explicit probability density function that can be used in likelihood ratio tests for optimal detection statistics. But ellipsoidal contours can characterize a much larger class of multivariate density functions, and the ellipsoids that characterize the outer periphery of the distribution are most appropriate for detection in the low false alarm rate regime. Traditionally the sample mean and sample covariance are used to estimate ellipsoid location and shape, but these quantities are confounded both by large lever-arm outliers and by non-Gaussian distributions within the ellipsoid of interest. This paper compares a variety of centroid and covariance estimation schemes with the aim of characterizing the periphery of the background distribution. In particular, we will consider a robust variant of the Khachiyan algorithm for the minimum-volume enclosing ellipsoid. The performance of these different approaches is evaluated on multispectral and hyperspectral remote sensing imagery using coverage plots of ellipsoid volume versus false alarm rate.
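
    The baseline that the robust estimators improve on is the sample mean/covariance ellipsoid, i.e. Mahalanobis distance. A 2-D pure-Python sketch with toy data (real pixels are multi- or hyperspectral vectors):

```python
def mean_cov2(points):
    """Sample centroid and 2x2 covariance of 2-D points."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    return (mx, my), (sxx, sxy, syy)

def mahalanobis2(p, centre, cov):
    """Squared Mahalanobis distance of p from the ellipsoid centre."""
    sxx, sxy, syy = cov
    det = sxx * syy - sxy * sxy
    dx, dy = p[0] - centre[0], p[1] - centre[1]
    # (dx, dy) times the inverse covariance times (dx, dy)^T
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

background = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.5)]
centre, cov = mean_cov2(background)
print(mahalanobis2((5.0, 5.0), centre, cov) >
      mahalanobis2((0.5, 0.6), centre, cov))  # → True
```

    Large lever-arm outliers in the training pixels would inflate `cov` and shrink these distances, which is precisely why robust and minimum-volume estimators are considered.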

  17. A function approximation approach to anomaly detection in propulsion system test data

    NASA Technical Reports Server (NTRS)

    Whitehead, Bruce A.; Hoyt, W. A.

    1993-01-01

    Ground test data from propulsion systems such as the Space Shuttle Main Engine (SSME) can be automatically screened for anomalies by a neural network. The neural network screens data after being trained with nominal data only. Given the values of 14 measurements reflecting external influences on the SSME at a given time, the neural network predicts the expected nominal value of a desired engine parameter at that time. We compared the ability of three different function-approximation techniques to perform this nominal value prediction: a novel neural network architecture based on Gaussian bar basis functions, a conventional back propagation neural network, and linear regression. These three techniques were tested with real data from six SSME ground tests containing two anomalies. The basis function network trained more rapidly than back propagation. It yielded nominal predictions with a tight enough confidence interval to distinguish anomalous deviations from the nominal fluctuations in an engine parameter. Since the function-approximation approach requires nominal training data only, it is capable of detecting unknown classes of anomalies for which training data is not available.
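
    The train-on-nominal-only screening logic can be sketched with a deliberately crude function approximator (1-nearest-neighbour stands in for the Gaussian-bar network, a single influence measurement stands in for the 14, and the fixed tolerance stands in for a confidence interval on the nominal prediction):

```python
def nn_predict(train_x, train_y, x):
    """1-NN function approximation: nominal output of the closest input."""
    i = min(range(len(train_x)), key=lambda k: abs(train_x[k] - x))
    return train_y[i]

def screen(train_x, train_y, test, tol):
    """Flag (x, y) pairs whose measured y strays from the nominal prediction."""
    return [(x, y) for x, y in test
            if abs(y - nn_predict(train_x, train_y, x)) > tol]

# Nominal behavior: engine parameter ≈ 3 × influence measurement
tx = [0.0, 1.0, 2.0, 3.0, 4.0]
ty = [0.0, 3.0, 6.0, 9.0, 12.0]
print(screen(tx, ty, [(2.0, 6.1), (3.0, 14.0)], tol=1.0))  # → [(3.0, 14.0)]
```

    Because only nominal data is used for training, any deviation is flagged, including classes of anomaly never seen before, which is the key property noted in the abstract.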

  18. Detection of Anomalies in Hydrometric Data Using Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Lauzon, N.; Lence, B. J.

    2002-12-01

    This work focuses on the detection of anomalies in hydrometric data sequences, such as 1) outliers, which are individual data having statistical properties that differ from those of the overall population; 2) shifts, which are sudden changes over time in the statistical properties of the historical records of data; and 3) trends, which are systematic changes over time in the statistical properties. For the purpose of the design and management of water resources systems, it is important to be aware of these anomalies in hydrometric data, for they can induce a bias in the estimation of water quantity and quality parameters. These anomalies may be viewed as specific patterns affecting the data, and therefore pattern recognition techniques can be used for identifying them. However, the number of possible patterns is very large for each type of anomaly and consequently large computing capacities are required to account for all possibilities using the standard statistical techniques, such as cluster analysis. Artificial intelligence techniques, such as the Kohonen neural network and fuzzy c-means, are clustering techniques commonly used for pattern recognition in several areas of engineering and have recently begun to be used for the analysis of natural systems. They require much less computing capacity than the standard statistical techniques, and therefore are well suited for the identification of outliers, shifts and trends in hydrometric data. This work constitutes a preliminary study, using synthetic data representing hydrometric data that can be found in Canada. The analysis of the results obtained shows that the Kohonen neural network and fuzzy c-means are reasonably successful in identifying anomalies. This work also addresses the problem of uncertainties inherent to the calibration procedures that fit the clusters to the possible patterns for both the Kohonen neural network and fuzzy c-means. Indeed, for the same database, different sets of clusters can be
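
    A toy version of the clustering approach described above (plain two-centre 1-D k-means stands in for the Kohonen network or fuzzy c-means, and the cut-off `factor` is an invented heuristic): fit clusters to historical data, then flag new observations far from every cluster centre.

```python
def kmeans1d(data, iters=25):
    """Two-centre 1-D k-means, initialised at the data extremes."""
    c = [min(data), max(data)]
    for _ in range(iters):
        groups = ([], [])
        for x in data:
            groups[abs(x - c[1]) < abs(x - c[0])].append(x)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return c

def flag_anomalies(history, new_points, factor=3.0):
    """Points far from every cluster centre learned on historical data."""
    c = kmeans1d(history)
    dists = [min(abs(x - ci) for ci in c) for x in history]
    cut = factor * (sum(dists) / len(dists))
    return [x for x in new_points if min(abs(x - ci) for ci in c) > cut]

history = [10.0, 10.5, 9.8, 10.2, 30.0, 30.4, 29.7]   # two normal regimes
print(flag_anomalies(history, [10.1, 29.9, 80.0]))    # → [80.0]
```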

  19. Simultaneous detection of creatine and creatinine using a sequential injection analysis/biosensor system.

    PubMed

    Stefan-van Staden, Raluca-Ioana; Bokretsion, Rahel Girmai; van Staden, Jacobus F; Aboul-Enein, Hassan Y

    2006-01-01

    Carbon paste based biosensors for the determination of creatine and creatinine have been integrated into a sequential injection system. Applying the multi-enzyme sequence of creatininase (CA), and/or creatinase (CI) and sarcosine oxidase (SO), hydrogen peroxide has been detected amperometrically. The linear concentration ranges are of pmol/L to nmol/L magnitude, with very low limits of detection. The proposed SIA system can be utilized reliably for the on-line simultaneous detection of creatine and creatinine in pharmaceutical products, as well as in serum samples, with a rate of 34 samples per hour and RSD values better than 0.16% (n=10).

  20. MODVOLC2: A Hybrid Time Series Analysis for Detecting Thermal Anomalies Applied to Thermal Infrared Satellite Data

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Wright, R.; Pilger, E.

    2009-12-01

    We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT) to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, and it detects pixels that fall outside the envelope of normal thermal behavior, (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than using MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for the MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. 
We also found
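
    The per-calendar-month reference/variability test at the heart of steps (2)-(3) can be sketched per pixel (the 3-sigma cut and the toy radiances are illustrative; MODVOLC2's actual envelope, outlier removal, and 12-µm correction are not reproduced):

```python
import math

def monthly_stats(history):
    """history: (calendar_month, radiance) samples for one pixel."""
    by_month = {}
    for m, v in history:
        by_month.setdefault(m, []).append(v)
    stats = {}
    for m, vals in by_month.items():
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        stats[m] = (mean, math.sqrt(var))   # reference and variability
    return stats

def is_thermal_anomaly(stats, month, value, k=3.0):
    """Does a new pixel value fall outside the monthly envelope?"""
    mean, std = stats[month]
    return value > mean + k * std

january = [(1, 290.0), (1, 291.0), (1, 289.0), (1, 290.5)]
s = monthly_stats(january)
print(is_thermal_anomaly(s, 1, 320.0), is_thermal_anomaly(s, 1, 290.8))  # → True False
```

    The abstract's point about needing at least 80 images per calendar month is visible here: with only a handful of samples, the variability estimate (and hence the envelope) is unreliable.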

  1. Detecting primary precursors of January surface air temperature anomalies in China

    NASA Astrophysics Data System (ADS)

    Tan, Guirong; Ren, Hong-Li; Chen, Haishan; You, Qinglong

    2017-12-01

    This study aims to detect the primary precursors and impact mechanisms for January surface temperature anomaly (JSTA) events in China against the background of global warming, by comparing the causes of two extreme JSTA events occurring in 2008 and 2011 with the common mechanisms inferred from all typical episodes during 1979-2008. The results show that these two extreme events exhibit atmospheric circulation patterns in the mid-high latitudes of Eurasia, with a positive anomaly center over the Ural Mountains and a negative one to the south of Lake Baikal (UMLB), which is a pattern quite similar to that for all the typical events. However, the Eurasian teleconnection patterns in the 2011 event, which are accompanied by a negative phase of the North Atlantic Oscillation, are different from those of the typical events and the 2008 event. We further find that a common anomalous signal appearing in early summer over the tropical Indian Ocean may be responsible for the following late-winter Eurasian teleconnections and the associated JSTA events in China. We show that sea surface temperature anomalies (SSTAs) in the preceding summer over the western Indian Ocean (WIO) are intimately related to the UMLB-like circulation pattern in the following January. Positive WIO SSTAs in early summer tend to induce strong UMLB-like circulation anomalies in January, which may result in anomalously or extremely cold events in China; these events can also be successfully reproduced in model experiments. Our results suggest that the WIO SSTAs may be a useful precursor for predicting JSTA events in China.

  2. Identifying High-Risk Patients without Labeled Training Data: Anomaly Detection Methodologies to Predict Adverse Outcomes

    PubMed Central

    Syed, Zeeshan; Saeed, Mohammed; Rubinfeld, Ilan

    2010-01-01

    For many clinical conditions, only a small number of patients experience adverse outcomes. Developing risk stratification algorithms for these conditions typically requires collecting large volumes of data to capture enough positive and negative examples for training. This process is slow, expensive, and may not be appropriate for new phenomena. In this paper, we explore different anomaly detection approaches to identify high-risk patients as cases that lie in sparse regions of the feature space. We study three broad categories of anomaly detection methods: classification-based, nearest neighbor-based, and clustering-based techniques. When evaluated on data from the National Surgical Quality Improvement Program (NSQIP), these methods were able to successfully identify patients at an elevated risk of mortality and rare morbidities following inpatient surgical procedures. PMID:21347083
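
    Of the three categories, the nearest-neighbour one is the simplest to sketch: score each case by its mean distance to its k nearest training cases, so that cases in sparse regions of the feature space score high (1-D toy data below; NSQIP features are of course multivariate, and the paper's exact scoring may differ):

```python
def knn_score(train, x, k=3):
    """Anomaly score: mean distance to the k nearest training cases."""
    d = sorted(abs(x - t) for t in train)
    return sum(d[:k]) / k

# Cases in dense regions of the (here 1-D) feature space score low;
# a case in a sparse region scores high and is flagged as high risk.
cohort = [1.0, 1.1, 0.9, 1.05, 2.0, 2.1, 1.95]
print(knn_score(cohort, 1.0) < knn_score(cohort, 5.0))  # → True
```

    No labelled adverse outcomes are needed, which is the point of the paper: the score is computed from the unlabelled cohort alone.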

  3. Detecting anomalies in CMB maps: a new method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neelakanta, Jayanth T., E-mail: jayanthtn@gmail.com

    2015-10-01

    Ever since WMAP announced its first results, different analyses have shown that there is weak evidence for several large-scale anomalies in the CMB data. While the evidence for each anomaly appears to be weak, the fact that there are multiple seemingly unrelated anomalies makes it difficult to account for them via a single statistical fluke. So, one is led to considering a combination of these anomalies. But, if we "hand-pick" the anomalies (test statistics) to consider, we are making an a posteriori choice. In this article, we propose two statistics that do not suffer from this problem. The statistics are linear and quadratic combinations of the a_{ℓm}'s with random coefficients, and they test the null hypothesis that the a_{ℓm}'s are independent, normally-distributed, zero-mean random variables with an m-independent variance. The motivation for considering multiple modes is this: because most physical models that lead to large-scale anomalies result in coupling multiple ℓ and m modes, the "coherence" of this coupling should get enhanced if a combination of different modes is considered. In this sense, the statistics are thus much more generic than those that have been hitherto considered in literature. Using fiducial data, we demonstrate that the method works and discuss how it can be used with actual CMB data to make quite general statements about the incompatibility of the data with the null hypothesis.

  4. Semi-supervised anomaly detection - towards model-independent searches of new physics

    NASA Astrophysics Data System (ADS)

    Kuusela, Mikael; Vatanen, Tommi; Malmi, Eric; Raiko, Tapani; Aaltonen, Timo; Nagai, Yoshikazu

    2012-06-01

    Most classification algorithms used in high energy physics fall under the category of supervised machine learning. Such methods require a training set containing both signal and background events and are prone to classification errors should this training data be systematically inaccurate for example due to the assumed MC model. To complement such model-dependent searches, we propose an algorithm based on semi-supervised anomaly detection techniques, which does not require a MC training sample for the signal data. We first model the background using a multivariate Gaussian mixture model. We then search for deviations from this model by fitting to the observations a mixture of the background model and a number of additional Gaussians. This allows us to perform pattern recognition of any anomalous excess over the background. We show by a comparison to neural network classifiers that such an approach is a lot more robust against misspecification of the signal MC than supervised classification. In cases where there is an unexpected signal, a neural network might fail to correctly identify it, while anomaly detection does not suffer from such a limitation. On the other hand, when there are no systematic errors in the training data, both methods perform comparably.
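
    A stripped-down version of the search for an anomalous excess: fix a background Gaussian, then grid-search the location and weight of one extra component and keep whichever most improves the log-likelihood. The paper fits a full multivariate Gaussian mixture by EM; the 1-D data, grid, and weights here are invented for illustration:

```python
import math

def logpdf(x, mu, s):
    return -0.5 * math.log(2 * math.pi * s * s) - (x - mu) ** 2 / (2 * s * s)

def loglik(data, comps):
    """Log-likelihood under a mixture; comps = [(weight, mu, sigma), ...]."""
    return sum(math.log(sum(w * math.exp(logpdf(x, m, s)) for w, m, s in comps))
               for x in data)

def excess_scan(data, bg_mu, bg_sigma, grid, weights=(0.05, 0.1, 0.2)):
    """Best log-likelihood gain from adding one extra Gaussian to the
    fixed background model, over a grid of locations and weights."""
    base = loglik(data, [(1.0, bg_mu, bg_sigma)])
    best = (0.0, None)
    for mu in grid:
        for w in weights:
            mix = [(1 - w, bg_mu, bg_sigma), (w, mu, bg_sigma)]
            gain = loglik(data, mix) - base
            if gain > best[0]:
                best = (gain, (w, mu))
    return best

# Background ~ N(0, 1) plus a small "signal" excess near x = 3:
events = [-1.0, -0.5, 0.0, 0.2, 0.5, 1.0, 2.9, 3.0, 3.1]
gain, comp = excess_scan(events, 0.0, 1.0, grid=[1.0, 2.0, 3.0])
print(comp[1])  # → 3.0
```

    Because no signal template enters the search, an unexpected excess anywhere on the grid is found, which is the robustness-to-signal-misspecification argument made above.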

  5. Anomaly Detection in Moving-Camera Video Sequences Using Principal Subspace Analysis

    DOE PAGES

    Thomaz, Lucas A.; Jardim, Eric; da Silva, Allan F.; ...

    2017-10-16

    This study presents a family of algorithms based on sparse decompositions that detect anomalies in video sequences obtained from slow moving cameras. These algorithms start by computing the union of subspaces that best represents all the frames from a reference (anomaly free) video as a low-rank projection plus a sparse residue. Then, they perform a low-rank representation of a target (possibly anomalous) video by taking advantage of both the union of subspaces and the sparse residue computed from the reference video. Such algorithms provide good detection results while at the same time obviating the need for previous video synchronization. However, this is obtained at the cost of a large computational complexity, which hinders their applicability. Another contribution of this paper approaches this problem by using intrinsic properties of the obtained data representation in order to restrict the search space to the most relevant subspaces, providing computational complexity gains of up to two orders of magnitude. The developed algorithms are shown to cope well with videos acquired in challenging scenarios, as verified by the analysis of 59 videos from the VDAO database that comprises videos with abandoned objects in a cluttered industrial scenario.
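
    The reference-subspace idea can be sketched with plain orthogonal projection (frames shrink to 4-pixel vectors, and a single least-squares subspace stands in for the paper's union of subspaces and sparse decomposition):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Orthonormal basis for the span of the reference frames."""
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            c = dot(w, b)
            w = [wi - c * bi for wi, bi in zip(w, b)]
        n = dot(w, w) ** 0.5
        if n > 1e-9:
            basis.append([wi / n for wi in w])
    return basis

def residue(basis, frame):
    """Part of a frame that the reference subspace cannot explain."""
    r = list(frame)
    for b in basis:
        c = dot(r, b)
        r = [ri - c * bi for ri, bi in zip(r, b)]
    return r

refs = [[1.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 1.0]]  # reference "frames"
B = gram_schmidt(refs)
clean = residue(B, [3.0, 3.0, 2.0, 2.0])   # lies in the reference subspace
odd = residue(B, [2.0, 2.0, 1.0, 6.0])     # contains an abandoned "object"
print(max(abs(v) for v in clean) < 1e-9, max(abs(v) for v in odd) > 1.0)  # → True True
```

    A target frame explained by the reference video leaves a near-zero residue, while an anomaly shows up as residual energy; the real algorithms additionally force the residue to be sparse, which localizes the object.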

  7. Sequential feature selection for detecting buried objects using forward looking ground penetrating radar

    NASA Astrophysics Data System (ADS)

    Shaw, Darren; Stone, Kevin; Ho, K. C.; Keller, James M.; Luke, Robert H.; Burns, Brian P.

    2016-05-01

    Forward looking ground penetrating radar (FLGPR) has the benefit of detecting objects at a significant standoff distance. The FLGPR signal is radiated over a large surface area and the radar signal return is often weak. Improving detection, especially for targets buried in roads, while maintaining an acceptable false alarm rate remains a challenging task. Various kinds of features have been developed over the years to increase the FLGPR detection performance. This paper focuses on investigating the use of as many features as possible for detecting buried targets and uses the sequential feature selection technique to automatically choose the features that contribute most to improving performance. Experimental results using data collected at a government test site are presented.
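
    Sequential forward selection itself is simple to sketch: greedily add whichever remaining feature most improves a wrapper score (here a nearest-class-centroid classifier on invented toy data; the paper's detectors, features, and scoring differ):

```python
def accuracy(feats, X, y):
    """Wrapper score: nearest-class-centroid accuracy on a feature subset."""
    proj = lambda row: [row[i] for i in feats]
    cents = {}
    for label in set(y):
        rows = [proj(x) for x, lab in zip(X, y) if lab == label]
        cents[label] = [sum(c) / len(c) for c in zip(*rows)]
    hits = sum(
        min(cents, key=lambda L: sum((a - b) ** 2
                                     for a, b in zip(proj(x), cents[L]))) == lab
        for x, lab in zip(X, y))
    return hits / len(y)

def forward_select(X, y, max_feats=2):
    """Greedily add the feature that most improves the wrapper score."""
    chosen = []
    while len(chosen) < max_feats:
        cand = [i for i in range(len(X[0])) if i not in chosen]
        chosen.append(max(cand, key=lambda i: accuracy(chosen + [i], X, y)))
    return chosen

# Feature 0 is noise; feature 1 separates target from clutter:
X = [[0.9, 0.0], [0.1, 0.1], [0.5, 1.0], [0.2, 0.9]]
y = [0, 0, 1, 1]
print(forward_select(X, y)[0])  # → 1
```

    With many candidate features, this greedy wrapper search is what keeps the "use as many features as possible" strategy tractable.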

  8. The sequentially discounting autoregressive (SDAR) method for on-line automatic seismic event detecting on long term observation

    NASA Astrophysics Data System (ADS)

    Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.

    2017-12-01

    In recent years, more and more Carbon Capture and Storage (CCS) studies have focused on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic measures (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term, sustained time series record. Because the time-varying signal-to-noise ratio (SNR) of a long-term record and the uneven energy distribution of seismic event waveforms make automatic seismic detection difficult, this work applies an improved probabilistic autoregressive (AR) method for automatic seismic event detection. This algorithm, called sequentially discounting AR learning (SDAR), identifies effective seismic events in the time series through change point detection (CPD) on the seismic record. In this method, an anomalous signal (a seismic event) is modeled as a change point in the time series (the seismic record): the statistical model of the signal in the neighborhood of the event point changes because the seismic event occurs. In other words, SDAR aims to find the statistical irregularities of the record through CPD. SDAR has three advantages. 1. Anti-noise ability: SDAR does not use waveform attributes (such as amplitude, energy, or polarization) for signal detection, so it is an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models can be automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter that downweights older statistics as new data arrive, which makes SDAR a robust algorithm for non-stationary signal processing. With these three advantages, the SDAR method can handle the non-stationary, time-varying long
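The discounting idea can be illustrated with a toy order-1 AR model (a simplified sketch, not the authors' SDAR implementation): sufficient statistics are updated online with a discounting factor r, and the change-point score is the squared one-step prediction error, which spikes when the statistics of the signal change.

```python
# Toy discounted AR(1) change-point scorer in the spirit of SDAR.
class DiscountedAR1:
    def __init__(self, r=0.1):
        self.r = r          # discounting factor: weight given to the newest sample
        self.mean = 0.0
        self.cov = 1e-6     # discounted lag-1 cross moment (regularised)
        self.var = 1e-6     # discounted lag-1 variance (regularised)
        self.prev = 0.0

    def update(self, x):
        a = max(-1.0, min(1.0, self.cov / self.var))   # clipped AR(1) coefficient
        pred = self.mean + a * (self.prev - self.mean)
        score = (x - pred) ** 2                        # one-step prediction error
        r = self.r
        self.mean = (1 - r) * self.mean + r * x
        self.cov = (1 - r) * self.cov + r * (self.prev - self.mean) * (x - self.mean)
        self.var = (1 - r) * self.var + r * (self.prev - self.mean) ** 2
        self.prev = x
        return score

model = DiscountedAR1()
signal = [0.0] * 50 + [5.0] * 10      # a level shift plays the role of the event
scores = [model.update(x) for x in signal]
onset = next(i for i, s in enumerate(scores) if s > 1.0)   # first alarm
```

Because the statistics are discounted, the model re-adapts after the shift instead of flagging the new level forever.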

  9. Detection of Pelger-Huet anomaly based on augmented fast marching method and speeded up robust features.

    PubMed

    Sun, Minglei; Yang, Shaobao; Jiang, Jinling; Wang, Qiwei

    2015-01-01

    Pelger-Huet anomaly (PHA) and pseudo Pelger-Huet anomaly (PPHA) are neutrophils with abnormal morphology: they have a bilobed or unilobed nucleus and excessively clumped chromatin. Currently, detection of this kind of cell mainly depends on manual microscopic examination by a clinician, so the quality of detection is limited by the clinician's efficiency and a degree of subjectivity. In this paper, a detection method for PHA and PPHA is proposed based on karyomorphism and chromatin distribution features. First, the skeleton of the nucleus is extracted using an augmented Fast Marching Method (AFMM) and the width distribution is obtained through a distance transform. Then, caryoplastin in the nucleus is extracted based on Speeded Up Robust Features (SURF), and a K-nearest-neighbor (KNN) classifier is constructed to analyze the features. Experiments show that the sensitivity and specificity of this method reached 87.5% and 83.33%, respectively, which indicates that the detection accuracy for PHA is acceptable. The detection method should also be helpful for the automatic morphological classification of blood cells.
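The KNN stage is a standard majority vote over the nearest feature vectors. A generic sketch follows; the toy 2-D features and labels are purely illustrative, whereas the paper's classifier operates on SURF and karyomorphism features:

```python
# Minimal k-nearest-neighbour classifier: majority label of the k closest
# training points under squared Euclidean distance.
def knn_predict(train, query, k=3):
    """train: list of ((x, y), label) pairs; returns the majority label."""
    nearest = sorted(train, key=lambda t: (t[0][0] - query[0]) ** 2 +
                                          (t[0][1] - query[1]) ** 2)[:k]
    labels = [lab for _, lab in nearest]
    return max(set(labels), key=labels.count)

# Hypothetical 2-D feature space: normal cells cluster low, PHA cells high.
train = [((0.10, 0.20), "normal"), ((0.20, 0.10), "normal"),
         ((0.15, 0.15), "normal"), ((0.90, 0.80), "PHA"),
         ((0.80, 0.90), "PHA"),   ((0.85, 0.85), "PHA")]
label = knn_predict(train, (0.88, 0.84))
```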

  10. Freezing of Gait Detection in Parkinson's Disease: A Subject-Independent Detector Using Anomaly Scores.

    PubMed

    Pham, Thuy T; Moore, Steven T; Lewis, Simon John Geoffrey; Nguyen, Diep N; Dutkiewicz, Eryk; Fuglevand, Andrew J; McEwan, Alistair L; Leong, Philip H W

    2017-11-01

    Freezing of gait (FoG) is common in Parkinsonian gait and strongly relates to falls. Current clinical FoG assessments are patients' self-report diaries and experts' manual video analysis. Both are subjective and yield moderate reliability. Existing detection algorithms have been predominantly designed in subject-dependent settings. In this paper, we aim to develop an automated FoG detector for the subject-independent setting. After extracting highly relevant features, we apply anomaly detection techniques to detect FoG events. Specifically, feature selection is performed using correlation and clusterability metrics. From a list of 244 feature candidates, 36 candidates were selected using saliency and robustness criteria. We develop an anomaly score detector with adaptive thresholding to identify FoG events. Then, using accuracy metrics, we reduce the feature list to seven candidates. Our novel multichannel freezing index was the most selective across all window sizes, achieving sensitivity (specificity) of (). On the other hand, the freezing index from the vertical axis was the best choice for a single input, achieving sensitivity (specificity) of () for ankle and () for back sensors. Our subject-independent method is not only significantly more accurate than those previously reported, but also uses a much smaller window (e.g., versus ) and/or lower tolerance (e.g., versus ).
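An anomaly-score detector with adaptive thresholding can be sketched as follows; the mean + k·std rule and the warm-up length are illustrative assumptions, not the paper's calibration:

```python
# Adaptive thresholding: flag a score when it exceeds mean + k * std of the
# scores observed so far (after a short warm-up period).
def adaptive_flags(scores, k=3.0, warmup=10):
    flags, history = [], []
    for s in scores:
        if len(history) >= warmup:
            m = sum(history) / len(history)
            sd = (sum((h - m) ** 2 for h in history) / len(history)) ** 0.5
            flags.append(s > m + k * sd)
        else:
            flags.append(False)      # not enough history yet
        history.append(s)
    return flags

# A planted spike (6.0) among ordinary scores around 1.0.
flags = adaptive_flags([1.0, 1.1, 0.9, 1.0, 1.2, 0.8,
                        1.0, 1.1, 0.9, 1.0, 6.0, 1.0])
```

Because the threshold is recomputed from the running history, it adapts to slow drifts in the baseline score level.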

  11. An exact computational method for performance analysis of sequential test algorithms for detecting network intrusions

    NASA Astrophysics Data System (ADS)

    Chen, Xinjia; Lacy, Fred; Carriere, Patrick

    2015-05-01

    Sequential test algorithms are playing increasingly important roles in quickly detecting network intrusions such as portscans. Since such algorithms are usually analyzed through intuitive approximation or asymptotic analysis, we develop an exact computational method for their performance analysis. Our method can be used to calculate the probability of false alarm and the average detection time to arbitrary pre-specified accuracy.
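For intuition, a sequential probability ratio test of the kind analyzed, in the spirit of threshold-random-walk portscan detectors, can be written as below; the outcome probabilities and error rates are illustrative choices, not the paper's parameters:

```python
import math

# SPRT: accumulate a log-likelihood ratio over connection outcomes until it
# crosses one of two boundaries derived from the target error rates.
def sprt(outcomes, p_benign=0.8, p_scanner=0.2, alpha=0.01, beta=0.01):
    """outcomes: list of booleans, True = connection attempt succeeded."""
    upper = math.log((1 - beta) / alpha)    # crossing declares "scanner"
    lower = math.log(beta / (1 - alpha))    # crossing declares "benign"
    llr = 0.0
    for n, ok in enumerate(outcomes, 1):
        p1 = p_scanner if ok else 1 - p_scanner   # likelihood under "scanner"
        p0 = p_benign if ok else 1 - p_benign     # likelihood under "benign"
        llr += math.log(p1 / p0)
        if llr >= upper:
            return "scanner", n
        if llr <= lower:
            return "benign", n
    return "undecided", len(outcomes)

verdict, n_obs = sprt([False] * 10)   # a run of failed connection attempts
```

The average detection time is the expected number of outcomes consumed before a boundary is crossed; the exact method in the paper computes such quantities without the usual Wald approximations.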

  12. System for Anomaly and Failure Detection (SAFD) system development

    NASA Technical Reports Server (NTRS)

    Oreilly, D.

    1992-01-01

    This task specified developing the hardware and software necessary to implement the System for Anomaly and Failure Detection (SAFD) algorithm, developed under Technology Test Bed (TTB) Task 21, on the TTB engine stand. This effort involved building two units: one to be installed in the Block II Space Shuttle Main Engine (SSME) Hardware Simulation Lab (HSL) at Marshall Space Flight Center (MSFC), and one to be installed at the TTB engine stand. Rocketdyne personnel from the HSL performed the task. The SAFD algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect an engine failure as much as tens of seconds before the redline system recognizes it. Although the current algorithm only operates during steady-state conditions (engine not throttling), work is underway to extend the algorithm to transient conditions.

  13. Acute maternal social dysfunction, health perception and psychological distress after ultrasonographic detection of a fetal structural anomaly.

    PubMed

    Kaasen, A; Helbig, A; Malt, U F; Naes, T; Skari, H; Haugen, G

    2010-08-01

    To predict acute psychological distress in pregnant women following detection of a fetal structural anomaly by ultrasonography, and to relate these findings to a comparison group. A prospective, observational study. Tertiary referral centre for fetal medicine. One hundred and eighty pregnant women with a fetal structural anomaly detected by ultrasound (study group) and 111 with normal ultrasound findings (comparison group) were included within a week following sonographic examination after gestational age 12 weeks (inclusion period: May 2006 to February 2009). Social dysfunction and health perception were assessed by the corresponding subscales of the General Health Questionnaire (GHQ-28). Psychological distress was assessed using the Impact of Events Scale (IES-22), Edinburgh Postnatal Depression Scale (EPDS) and the anxiety and depression subscales of the GHQ-28. Fetal anomalies were classified according to severity and diagnostic or prognostic ambiguity at the time of assessment. Social dysfunction, health perception and psychological distress (intrusion, avoidance, arousal, anxiety, depression). The least severe anomalies with no diagnostic or prognostic ambiguity induced the lowest levels of IES intrusive distress (P = 0.025). Women included after 22 weeks of gestation (24%) reported significantly higher GHQ distress than women included earlier in pregnancy (P = 0.003). The study group had significantly higher levels of psychosocial distress than the comparison group on all psychometric endpoints. Psychological distress was predicted by gestational age at the time of assessment, severity of the fetal anomaly, and ambiguity concerning diagnosis or prognosis.

  14. The Maximum Cross-Correlation approach to detecting translational motions from sequential remote-sensing images

    NASA Astrophysics Data System (ADS)

    Gao, J.; Lythe, M. B.

    1996-06-01

    This paper presents the principle of the Maximum Cross-Correlation (MCC) approach in detecting translational motions within dynamic fields from time-sequential remotely sensed images. A C program implementing the approach is presented and illustrated in a flowchart. The program is tested with a pair of sea-surface temperature images derived from Advanced Very High Resolution Radiometer (AVHRR) images near East Cape, New Zealand. Results show that the mean currents in the region have been detected satisfactorily with the approach.
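The core of the MCC approach can be sketched in plain Python as a toy single-band example; the window and search sizes are hypothetical, and the original is a C program operating on AVHRR sea-surface temperature imagery:

```python
import random

# Normalised cross-correlation between two equal-length value lists.
def ncc(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

# Slide a template window from image 1 over image 2 and keep the offset
# with the maximum cross-correlation -- the estimated translational motion.
def mcc_shift(img1, img2, win, max_shift):
    patch = [img1[r][c] for r in range(win) for c in range(win)]
    best = (-2.0, (0, 0))
    for dy in range(max_shift + 1):
        for dx in range(max_shift + 1):
            cand = [img2[r + dy][c + dx] for r in range(win) for c in range(win)]
            best = max(best, (ncc(patch, cand), (dy, dx)))
    return best[1]

# Toy field: img2 is img1 translated by (1 row, 2 columns).
random.seed(0)
img1 = [[random.random() for _ in range(8)] for _ in range(8)]
img2 = [[0.0] * 8 for _ in range(8)]
for r in range(7):
    for c in range(6):
        img2[r + 1][c + 2] = img1[r][c]
shift = mcc_shift(img1, img2, win=4, max_shift=3)
```

Applied over a grid of windows in a pair of time-sequential images, the per-window shifts divided by the time interval give a velocity field, which is how the approach recovers surface currents.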

  15. Confabulation Based Real-time Anomaly Detection for Wide-area Surveillance Using Heterogeneous High Performance Computing Architecture

    DTIC Science & Technology

    2015-06-01

    system accuracy. The AnRAD system was also generalized for the additional application of network intrusion detection. A self-structuring technique... to Host-based Intrusion Detection Systems using Contiguous and Discontiguous System Call Patterns," IEEE Transactions on Computers, 63(4), pp. 807... square kilometer areas. The anomaly recognition and detection (AnRAD) system was built as a cogent confabulation network. It represented road

  16. Detection of geothermal anomalies in Tengchong, Yunnan Province, China from MODIS multi-temporal night LST imagery

    NASA Astrophysics Data System (ADS)

    Li, H.; Kusky, T. M.; Peng, S.; Zhu, M.

    2012-12-01

    Thermal infrared (TIR) remote sensing is an important technique in the exploration of geothermal resources. In this study, a geothermal survey is conducted in the Tengchong area of Yunnan Province, China, using multi-temporal MODIS LST (Land Surface Temperature) data. The monthly night MODIS LST data from Mar. 2000 to Mar. 2011 for the study area were collected and analyzed. The 132-month average LST map was derived, and three geothermal anomalies were identified. The findings of this study agree well with results from relative geothermal gradient measurements. We conclude that TIR remote sensing is a cost-effective technique for detecting geothermal anomalies, and that combining TIR remote sensing with geological analysis and an understanding of the geothermal mechanism is an accurate and efficient approach to geothermal area detection.

  17. Dark sequential Z ' portal: Collider and direct detection experiments

    NASA Astrophysics Data System (ADS)

    Arcadi, Giorgio; Campos, Miguel D.; Lindner, Manfred; Masiero, Antonio; Queiroz, Farinaldo S.

    2018-02-01

    We revisit the status of a Majorana fermion as a dark matter candidate when a sequential Z' gauge boson dictates the dark matter phenomenology. Direct dark matter detection signatures arise from dark matter-nucleus scatterings at bubble chamber and liquid xenon detectors, and from the flux of neutrinos from the Sun measured by the IceCube experiment, which is governed by the spin-dependent dark matter-nucleus scattering. On the collider side, LHC searches for dilepton and monojet + missing energy signals play an important role. The relic density and perturbativity requirements are also addressed. By exploiting this dark matter complementarity, we outline the region of parameter space where one can successfully have a Majorana dark matter particle in light of current and planned experimental sensitivities.

  18. The sequential injection system with adsorptive stripping voltammetric detection.

    PubMed

    Kubiak, W W; Latonen, R M; Ivaska, A

    2001-03-16

    Two sequential injection systems have been developed for adsorptive stripping voltammetric measurements. One is for substances that adsorb on mercury, e.g. riboflavin; in this case, a simple arrangement with only sample aspiration is needed. Reproducibility was 3% and the detection limit 0.07 μM. The measuring system was applied to the determination of riboflavin in vitamin pills and to studying the photodegradation of riboflavin in aqueous solutions. In the second case, metal ions were determined. They have to be complexed before deposition on the mercury surface, so both the sample and the ligand have to be aspirated into the system. Here, the reproducibility was approximately 6% and the detection limit <0.1 ppm for cadmium, lead and copper when complexation with oxine was used. Dimethylglyoxime was used in the determination of nickel and cobalt, and nioxime complexes were used in the determination of nickel and copper. With these complexing agents, the reproducibility was the same as with oxine, but the metals could be determined at concentrations below 0.01 ppm. The application of two ligands in a SIA system with AdSV detection was also studied: simultaneous determination of copper, lead, cadmium and cobalt was possible using oxine and dimethylglyoxime, and copper and nickel were determined simultaneously using dimethylglyoxime and nioxime.

  19. Anomaly detection applied to a materials control and accounting database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteson, R.; Spanks, L.; Yarbro, T.

    An important component of the national mission of reducing the nuclear danger includes accurate recording of the processing and transportation of nuclear materials. Nuclear material storage facilities, nuclear chemical processing plants, and nuclear fuel fabrication facilities collect and store large amounts of data describing transactions that involve nuclear materials. To maintain confidence in the integrity of these data, it is essential to identify anomalies in the databases. Anomalous data could indicate error, theft, or diversion of material. Yet, because of the complex and diverse nature of the data, analysis and evaluation are extremely tedious. This paper describes the authors' work in the development of analysis tools to automate the anomaly detection process for the Material Accountability and Safeguards System (MASS), which tracks and records the activities associated with accountable quantities of nuclear material at Los Alamos National Laboratory. Using existing guidelines that describe valid transactions, the authors have created an expert system that identifies transactions that do not conform to the guidelines. Thus, this expert system can be used to focus the attention of the expert or inspector directly on significant phenomena.

  20. On-line Flagging of Anomalies and Adaptive Sequential Hypothesis Testing for Fine-feature Characterization of Geosynchronous Satellites

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Payne, T.; Kinateder, K.; Dao, P.; Beecher, E.; Boone, D.; Elliott, B.

    The objective of on-line flagging in this paper is to perform interactive assessment of geosynchronous satellite anomalies, such as cross-tagging of satellites in a cluster, solar panel offset changes, etc. This assessment utilizes a Bayesian belief propagation procedure and includes automated updating of the satellite's baseline signature data while accounting for seasonal changes. Its purpose is to enable an ongoing, automated assessment of satellite behavior through its life cycle using the photometry data collected during the synoptic search performed by a ground- or space-based sensor as part of its metrics mission. Changes in the satellite's features are reported along with the probabilities of Type I and Type II errors. The objective of adaptive sequential hypothesis testing in this paper is to define future sensor tasking for characterizing fine features of the satellite. The tasking is designed to maximize new information with the fewest photometry data points collected during the synoptic search by a ground- or space-based sensor; its calculation is based on information entropy techniques. The tasking is defined by considering a sequence of hypotheses regarding the fine features of the satellite; the optimal observation conditions are then ordered so as to maximize new information about a chosen fine feature. The combined objective of on-line flagging and adaptive sequential hypothesis testing is to progressively discover new information about the features of a geosynchronous satellite by leveraging the regular but sparse cadence of data collection during the synoptic search performed by a ground- or space-based sensor. Automated Algorithm to Detect Changes in Geostationary Satellite's Configuration and Cross-Tagging (Phan Dao, Air Force Research Laboratory/RVB): By characterizing geostationary satellites based on photometry and color photometry, analysts can

  1. Dual Use Corrosion Inhibitor and Penetrant for Anomaly Detection in Neutron/X Radiography

    NASA Technical Reports Server (NTRS)

    Hall, Phillip B. (Inventor); Novak, Howard L. (Inventor)

    2004-01-01

    A dual purpose corrosion inhibitor and penetrant composition sensitive to radiography interrogation is provided. The corrosion inhibitor mitigates or eliminates corrosion on the surface of a substrate upon which the corrosion inhibitor is applied. In addition, the corrosion inhibitor provides for the attenuation of a signal used during radiography interrogation thereby providing for detection of anomalies on the surface of the substrate.

  2. Musical experts recruit action-related neural structures in harmonic anomaly detection: Evidence for embodied cognition in expertise

    PubMed Central

    Sherwin, Jason; Sajda, Paul

    2013-01-01

    Humans are extremely good at detecting anomalies in sensory input. For example, while listening to a piece of Western-style music, an anomalous key change or an out-of-key pitch is readily apparent, even to the non-musician. In this paper we investigate differences between musical experts and non-experts during musical anomaly detection. Specifically, we analyzed the electroencephalograms (EEG) of five expert cello players and five non-musicians while they listened to excerpts of J.S. Bach’s Prelude from Cello Suite No.1. All subjects were familiar with the piece, though experts also had extensive experience playing the piece. Subjects were told that anomalous musical events (AMEs) could occur at random within the excerpts of the piece and were told to report the number of AMEs after each excerpt. Furthermore, subjects were instructed to remain still while listening to the excerpts and their lack of movement was verified via visual and EEG monitoring. Experts had significantly better behavioral performance (i.e. correctly reporting AME counts) than non-experts, though both groups had mean accuracies greater than 80%. These group differences were also reflected in the EEG correlates of key-change detection post-stimulus, with experts showing more significant, greater magnitude, longer periods of and earlier peaks in condition-discriminating EEG activity than novices. Using the timing of the maximum discriminating neural correlates, we performed source reconstruction and compared significant differences between cellists and non-musicians. We found significant differences that included a slightly right lateralized motor and frontal source distribution. The right lateralized motor activation is consistent with the cortical representation of the left hand – i.e. the hand a cellist would use, while playing, to generate the anomalous key-changes. In general, these results suggest that sensory anomalies detected by experts may in fact be partially a result of an

  3. Small-scale anomaly detection in panoramic imaging using neural models of low-level vision

    NASA Astrophysics Data System (ADS)

    Casey, Matthew C.; Hickman, Duncan L.; Pavlou, Athanasios; Sadler, James R. E.

    2011-06-01

    Our understanding of sensory processing in animals has reached the stage where we can exploit neurobiological principles in commercial systems. In human vision, one brain structure that offers insight into how we might detect anomalies in real-time imaging is the superior colliculus (SC). The SC is a small structure that rapidly orients our eyes to a movement, sound or touch that it detects, even when the stimulus is on a small scale: think of a camouflaged movement or the rustle of leaves. This automatic orientation allows us to prioritize the use of our eyes to raise awareness of a potential threat, such as a predator approaching stealthily. In this paper we describe the application of a neural network model of the SC to the detection of anomalies in panoramic imaging. The neural approach consists of a mosaic of topographic maps that are each trained using competitive Hebbian learning to rapidly detect image features of a pre-defined shape and scale. What makes this approach interesting is the ability of the competition between neurons to automatically filter noise, yet with the capability of generalizing the desired shape and scale. We present the results of this technique applied to the real-time detection of obscured targets in visible-band panoramic CCTV images. Using background subtraction to highlight potential movement, the technique is able to correctly identify targets as little as 3 pixels wide while filtering small-scale noise.

  4. Development of references of anomalies detection on P91 material using Self-Magnetic Leakage Field (SMLF) technique

    NASA Astrophysics Data System (ADS)

    Husin, Shuib; Afiq Pauzi, Ahmad; Yunus, Salmi Mohd; Ghafar, Mohd Hafiz Abdul; Adilin Sekari, Saiful

    2017-10-01

    This technical paper demonstrates the successful application of the self-magnetic leakage field (SMLF) technique to detecting anomalies in the weldment of a thick (1 inch) P91 material joint. Boiler components such as boiler tubes and penthouse stubs, and energy piping such as the hot reheat pipe (HRP) and the H-balance energy piping to the turbine, are made of P91 material. P91 is a ferromagnetic material, so the SMLF technique is applicable for detecting anomalies within it (internal defects). The technique is categorized as a non-destructive technique (NDT). It is the second passive method after acoustic emission (AE), in which information radiated by the structure (magnetic fields and energy waves) is used. The measured magnetic leakage field of a product or component is the field occurring on the component's surface in zones of stable dislocation slip bands under the influence of operational (in-service) or residual stresses, or in zones of maximum inhomogeneity of the metal structure in new products or components. Inter-granular and trans-granular cracks, inclusions, voids, cavities and corrosion are types of inhomogeneity and discontinuity in the material that show up in the measured magnetic leakage field when using this technique. The technique does not require surface preparation of the component to be inspected. It is a contact-type inspection, meaning the sensor has to touch the component's surface during inspection. The results of applying the SMLF technique to the developed P91 reference blocks demonstrate that the technique is practical for anomaly inspection and detection, as well as for identifying anomalies' locations. The evaluation of this passive self-magnetic leakage field (SMLF) technique has been verified by other conventional non-destructive tests (NDTs) on the reference blocks where simulated

  5. A sequential sampling account of response bias and speed-accuracy tradeoffs in a conflict detection task.

    PubMed

    Vuckovic, Anita; Kwantes, Peter J; Humphreys, Michael; Neal, Andrew

    2014-03-01

    Signal Detection Theory (SDT; Green & Swets, 1966) is a popular tool for understanding decision making. However, it does not account for the time taken to make a decision, nor why response bias might change over time. Sequential sampling models provide a way of accounting for speed-accuracy trade-offs and response bias shifts. In this study, we test the validity of a sequential sampling model of conflict detection in a simulated air traffic control task by assessing whether two of its key parameters respond to experimental manipulations in a theoretically consistent way. Through experimental instructions, we manipulated participants' response bias and the relative speed or accuracy of their responses. The sequential sampling model was able to replicate the trends in the conflict responses as well as response time across all conditions. Consistent with our predictions, manipulating response bias was associated primarily with changes in the model's Criterion parameter, whereas manipulating speed-accuracy instructions was associated with changes in the Threshold parameter. The success of the model in replicating the human data suggests we can use the parameters of the model to gain an insight into the underlying response bias and speed-accuracy preferences common to dynamic decision-making tasks. © 2013 American Psychological Association
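The roles of the two parameters can be illustrated with a minimal random-walk accumulator (an assumed generic form, not the authors' fitted model): the start point of the walk plays the role of the Criterion parameter (response bias), and the boundary plays the role of the Threshold parameter (speed-accuracy trade-off).

```python
import random

# Simulate n trials of a driftless random walk from `criterion` toward
# +threshold ("conflict") or -threshold ("no conflict"); return the
# proportion of "conflict" responses and the mean decision time.
def simulate(criterion, threshold, n=500, noise=1.0, seed=7):
    rng = random.Random(seed)
    hits = steps = 0
    for _ in range(n):
        x, t = criterion, 0
        while abs(x) < threshold:
            x += rng.gauss(0, noise)   # one evidence sample per time step
            t += 1
        hits += x >= threshold
        steps += t
    return hits / n, steps / n

# A positive criterion biases responses toward "conflict"; a higher
# threshold trades speed for accuracy (longer mean decision time).
p_biased, rt_low = simulate(criterion=1.0, threshold=3.0)
p_neutral, rt_high = simulate(criterion=0.0, threshold=6.0)
```

This mirrors the experimental result: bias instructions should move the start point, while speed-accuracy instructions should move the boundary.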

  6. Detecting Anomalies from End-to-End Internet Performance Measurements (PingER) Using Cluster Based Local Outlier Factor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, Saqib; Wang, Guojun; Cottrell, Roger Leslie

    PingER (Ping End-to-End Reporting) is a worldwide end-to-end Internet performance measurement framework. It was developed by the SLAC National Accelerator Laboratory, Stanford, USA, and has been running for the last 20 years. It has more than 700 monitoring agents and remote sites that monitor the performance of Internet links in around 170 countries. At present, the size of the compressed PingER data set is about 60 GB, comprising 100,000 flat files. The data is publicly available for valuable Internet performance analyses. However, the data sets suffer from missing values and anomalies due to congestion, bottleneck links, queuing overflow, network software misconfiguration, hardware failure, cable cuts, and social upheavals. Therefore, the objective of this paper is to detect such performance drops or spikes, labeled as anomalies or outliers, in the PingER data set. In the proposed approach, the raw text files of the data set are transformed into a PingER dimensional model. The missing values are imputed using the k-NN algorithm. The data is partitioned into similar instances using the k-means clustering algorithm. Afterward, clustering is integrated with the Local Outlier Factor (LOF) using the Cluster Based Local Outlier Factor (CBLOF) algorithm to detect the anomalies or outliers in the PingER data. Lastly, anomalies are further analyzed to identify the time frame and location of the hosts generating the major percentage of the anomalies in the PingER data set, which ranges from 1998 to 2016.
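The cluster-based outlier stage can be approximated in miniature. The following is a 1-D, pure-Python caricature of the CBLOF idea for illustration only (the paper uses k-NN imputation and the full CBLOF algorithm on multidimensional data): points in small clusters are scored against the nearest large cluster's centroid, so isolated points cannot hide in their own tiny cluster.

```python
# A few iterations of 1-D k-means: assign to nearest centroid, recompute.
def kmeans(points, k, iters=20):
    centroids = points[:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: (p - centroids[j]) ** 2)
            groups[i].append(p)
        centroids = [sum(g) / len(g) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return centroids, groups

# CBLOF-style score: distance to the own (large) or nearest large cluster's
# centroid, weighted by cluster size.
def cblof_scores(points, k=2, small_frac=0.3):
    centroids, groups = kmeans(points, k)
    n = len(points)
    large = [i for i, g in enumerate(groups) if len(g) >= small_frac * n]
    scores = []
    for p in points:
        i = min(range(k), key=lambda j: (p - centroids[j]) ** 2)
        if i in large:
            d = abs(p - centroids[i])
        else:
            d = min(abs(p - centroids[j]) for j in large)
        scores.append(len(groups[i]) * d)
    return scores

data = [1.0, 1.1, 0.9, 1.05, 0.95, 9.0]   # 9.0 is the planted outlier
scores = cblof_scores(data, k=2)
```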

  7. Detecting Anomalies from End-to-End Internet Performance Measurements (PingER) Using Cluster Based Local Outlier Factor

    DOE PAGES

    Ali, Saqib; Wang, Guojun; Cottrell, Roger Leslie; ...

    2018-05-28

    PingER (Ping End-to-End Reporting) is a worldwide end-to-end Internet performance measurement framework. It was developed by the SLAC National Accelerator Laboratory, Stanford, USA, and has been running for the last 20 years. It has more than 700 monitoring agents and remote sites that monitor the performance of Internet links in around 170 countries. At present, the size of the compressed PingER data set is about 60 GB, comprising 100,000 flat files. The data is publicly available for valuable Internet performance analyses. However, the data sets suffer from missing values and anomalies due to congestion, bottleneck links, queuing overflow, network software misconfiguration, hardware failure, cable cuts, and social upheavals. Therefore, the objective of this paper is to detect such performance drops or spikes, labeled as anomalies or outliers, in the PingER data set. In the proposed approach, the raw text files of the data set are transformed into a PingER dimensional model. The missing values are imputed using the k-NN algorithm. The data is partitioned into similar instances using the k-means clustering algorithm. Afterward, clustering is integrated with the Local Outlier Factor (LOF) using the Cluster Based Local Outlier Factor (CBLOF) algorithm to detect the anomalies or outliers in the PingER data. Lastly, anomalies are further analyzed to identify the time frame and location of the hosts generating the major percentage of the anomalies in the PingER data set, which ranges from 1998 to 2016.

  8. Multiple Kernel Learning for Heterogeneous Anomaly Detection: Algorithm and Aviation Safety Case Study

    NASA Technical Reports Server (NTRS)

    Das, Santanu; Srivastava, Ashok N.; Matthews, Bryan L.; Oza, Nikunj C.

    2010-01-01

    The world-wide aviation system is one of the most complex dynamical systems ever developed and is generating data at an extremely rapid rate. Most modern commercial aircraft record several hundred flight parameters, including information from the guidance, navigation, and control systems, the avionics and propulsion systems, and the pilot inputs into the aircraft. These parameters may be continuous measurements or binary or categorical measurements recorded in one-second intervals for the duration of the flight. Currently, most approaches to aviation safety are reactive, meaning that they are designed to react to an aviation safety incident or accident. In this paper, we discuss a novel approach based on the theory of multiple kernel learning to detect potential safety anomalies in very large databases of discrete and continuous data from world-wide operations of commercial fleets. We pose a general anomaly detection problem which includes both discrete and continuous data streams, where we assume that the discrete streams have a causal influence on the continuous streams. We also assume that an atypical sequence of events in the discrete streams can lead to off-nominal system performance. We discuss the application domain and novel algorithms, and present results on real-world data sets. Our algorithm uncovers operationally significant events in high-dimensional data streams in the aviation industry which are not detectable using state-of-the-art methods.

  9. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    Known anomalies such as the Russian heatwave are detected, as well as anomalies that are not detectable with univariate methods (see http://earthsystemdatacube.net/).

  10. Genetic algorithm for TEC seismo-ionospheric anomalies detection around the time of the Solomon (Mw = 8.0) earthquake of 06 February 2013

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-08-01

    On 6 February 2013, at 12:12:27 local time (01:12:27 UTC) a seismic event registering Mw 8.0 struck the Solomon Islands, located at the boundaries of the Australian and Pacific tectonic plates. Time series prediction is an important and widely studied topic in the research of earthquake precursors. This paper describes a new computational intelligence approach to detect the unusual variations of the total electron content (TEC), seismo-ionospheric anomalies induced by the powerful Solomon earthquake, using a genetic algorithm (GA). The GA detected a considerable number of anomalous occurrences on the day of the earthquake and also 7 and 8 days prior to the earthquake, in a period of high geomagnetic activity. In this study, the TEC anomalies detected by the proposed method are also compared with those identified by the mean, median, wavelet, Kalman filter, ARIMA, neural network and support vector machine methods. The agreement among the final results of all eight methods is convincing evidence of the efficiency of the GA method, and indicates that the GA can be an appropriate non-parametric tool for anomaly detection in nonlinear time series exhibiting seismo-ionospheric precursor variations.
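One family of baseline detectors the GA is compared against, median-based bound tests, is easy to sketch. The synthetic TEC values and the interquartile-range rule below are illustrative assumptions, not the paper's exact thresholds or data:

```python
from statistics import median

def iqr_bounds(series, k=1.5):
    """Lower/upper bounds: median +/- k * interquartile range."""
    s = sorted(series)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]
    m = median(s)
    return m - k * (q3 - q1), m + k * (q3 - q1)

def detect_anomalies(series, k=1.5):
    """Indices of values outside the median +/- k*IQR band."""
    lo, hi = iqr_bounds(series, k)
    return [i for i, x in enumerate(series) if x < lo or x > hi]

# Synthetic TEC-like series: a quiet baseline with one anomalous spike.
tec = [20.0, 20.8, 19.5, 20.5, 20.0, 35.0, 20.5, 19.8, 20.2, 20.1]
flagged = detect_anomalies(tec)
```

Only the spike at index 5 falls outside the median-based band; the quiet-day fluctuations do not.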

  11. Automated detection of changes in sequential color ocular fundus images

    NASA Astrophysics Data System (ADS)

    Sakuma, Satoshi; Nakanishi, Tadashi; Takahashi, Yasuko; Fujino, Yuichi; Tsubouchi, Tetsuro; Nakanishi, Norimasa

    1998-06-01

    A recent trend is the automatic screening of color ocular fundus images. The examination of such images is used in the early detection of several adult diseases such as hypertension and diabetes. Since this type of examination is easier than CT, costs less, and has no harmful side effects, it will become a routine medical examination. Normal ocular fundus images are found in more than 90% of all people. To deal with the increasing number of such images, this paper proposes a new approach to process them automatically and accurately. Our approach, based on individual comparison, identifies changes in sequential images: a previously diagnosed normal reference image is compared to a non-diagnosed image.
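The individual-comparison step can be sketched as a per-pixel difference against the diagnosed-normal reference. This ignores the image registration and color processing a real fundus screening system needs; the grey-level images and the threshold are illustrative assumptions:

```python
def change_map(reference, current, threshold=30):
    """Flag pixels whose grey-level difference from the reference image
    exceeds a threshold (illustrative; real screening needs registration)."""
    return [
        [abs(c - r) > threshold for r, c in zip(rrow, crow)]
        for rrow, crow in zip(reference, current)
    ]

def changed_fraction(reference, current, threshold=30):
    """Fraction of pixels flagged as changed between the two images."""
    flags = change_map(reference, current, threshold)
    total = sum(len(row) for row in flags)
    return sum(sum(row) for row in flags) / total

# Tiny 4x4 grey-level images: identical except one lesion-like bright pixel.
ref = [[100] * 4 for _ in range(4)]
new = [row[:] for row in ref]
new[1][2] = 200
```

A follow-up stage would then decide whether the changed region is clinically meaningful.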

  12. Remote detection of geobotanical anomalies associated with hydrocarbon microseepage

    NASA Technical Reports Server (NTRS)

    Rock, B. N.

    1985-01-01

    As part of the continuing study of the Lost River, West Virginia NASA/Geosat Test Case Site, an extensive soil gas survey of the site was conducted during the summer of 1983. This soil gas survey has identified an order-of-magnitude methane, ethane, propane, and butane anomaly that is precisely coincident with the linear maple anomaly reported previously. This and other maple anomalies were previously suggested to be indicative of anaerobic soil conditions associated with hydrocarbon microseepage. In vitro studies support the view that anomalous distributions of native tree species tolerant of anaerobic soil conditions may be useful indicators of methane microseepage in heavily vegetated areas of the United States characterized by deciduous forest cover. Remote sensing systems which allow discrimination and mapping of native tree species and/or species associations will provide the exploration community with a means of identifying vegetation distributional anomalies indicative of microseepage.

  13. [Ebstein's "like" anomaly ventricular double inlet. A rare association].

    PubMed

    Muñoz Castellanos, Luis; Kuri Nivon, Magdalena

    The association of univentricular heart with double inlet and Ebstein's "like" anomaly of the common atrioventricular valve is extremely rare. Two hearts with this association are described with the segmental sequential system, which determines the atrial situs, the types of atrioventricular and ventriculoarterial connections, and associated anomalies. Both hearts had atrial situs solitus and a univentricular heart with common atrioventricular valve, a foramen primum, and double outlet ventricle with normally crossed great arteries. In the first heart the four leaflets of the atrioventricular valve were displaced and fused to the ventricular walls, from the atrioventricular union toward the apex, with atrialization of the inlet and trabecular zones, and there was stenosis in the infundibulum and in the pulmonary valve. In the second heart the proximal segment of the atrioventricular valve was displaced and fused to the ventricular wall with short atrialization, and the distal segment was dysplastic, with fibromyxoid nodules and short, thick tendinous cords; the pulmonary artery was dilated. Both hearts are grouped in the atrioventricular univentricular connection in the segmental sequential system. The application of this method in the diagnosis of congenital heart disease demonstrates its usefulness. The associations of complex anomalies in these hearts show us the broad spectrum of presentation of congenital heart disease, which expands our knowledge of pediatric cardiology. Copyright © 2016 Instituto Nacional de Cardiología Ignacio Chávez. Publicado por Masson Doyma México S.A. All rights reserved.

  14. Sensor Anomaly Detection in Wireless Sensor Networks for Healthcare

    PubMed Central

    Haque, Shah Ahsanul; Rahman, Mustafizur; Aziz, Syed Mahfuzul

    2015-01-01

    Wireless Sensor Networks (WSN) are vulnerable to various sensor faults and faulty measurements. This vulnerability hinders efficient and timely response in various WSN applications, such as healthcare. For example, faulty measurements can create false alarms which may require unnecessary intervention from healthcare personnel. Therefore, an approach to differentiate between real medical conditions and false alarms will improve remote patient monitoring systems and the quality of healthcare service afforded by WSN. In this paper, a novel approach is proposed to detect sensor anomalies by analyzing collected physiological data from medical sensors. The objective of this method is to effectively distinguish false alarms from true alarms. It predicts a sensor value from historical values and compares it with the actual sensed value for a particular instance. The difference is compared against a threshold value, which is dynamically adjusted, to ascertain whether the sensor value is anomalous. The proposed approach has been applied to real healthcare datasets and compared with existing approaches. Experimental results demonstrate the effectiveness of the proposed system, providing high Detection Rate (DR) and low False Positive Rate (FPR). PMID:25884786
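The predict-compare-threshold loop described above can be sketched as follows. The moving-average predictor and the error-based dynamic threshold are simplifying assumptions for illustration, not the paper's exact prediction model:

```python
from collections import deque

def detect_sensor_anomalies(readings, window=5, k=3.0):
    """Predict each reading as the mean of the previous `window` values and
    flag it when the absolute prediction error exceeds k times the recent
    mean error (a dynamically adjusted threshold; simplified sketch)."""
    history = deque(maxlen=window)
    errors = deque(maxlen=window)
    flags = []
    for x in readings:
        if len(history) == window:
            pred = sum(history) / window
            err = abs(x - pred)
            base = sum(errors) / len(errors) if errors else err
            flags.append(err > k * max(base, 1e-9))
            errors.append(err)
        else:
            flags.append(False)  # not enough history yet
        history.append(x)
    return flags

# Heart-rate-like stream with one faulty spike at index 7.
hr = [72, 73, 71, 72, 74, 73, 72, 140, 73, 72]
flags = detect_sensor_anomalies(hr)
```

The spike is flagged, while the small beat-to-beat fluctuations stay under the adaptive threshold.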

  15. Hypergraph-based anomaly detection of high-dimensional co-occurrences.

    PubMed

    Silva, Jorge; Willett, Rebecca

    2009-03-01

    This paper addresses the problem of detecting anomalous multivariate co-occurrences using a limited number of unlabeled training observations. A novel method based on a hypergraph representation of the data is proposed to deal with this very high-dimensional problem. Hypergraphs constitute an important extension of graphs which allows edges to connect more than two vertices simultaneously. A variational Expectation-Maximization algorithm for detecting anomalies directly on the hypergraph domain without any feature selection or dimensionality reduction is presented. The resulting estimate can be used to calculate a measure of anomalousness based on the False Discovery Rate. The algorithm has O(np) computational complexity, where n is the number of training observations and p is the number of potential participants in each co-occurrence event. This efficiency makes the method ideally suited for very high-dimensional settings, and the method requires no tuning, bandwidth, or regularization parameters. The proposed approach is validated on both high-dimensional synthetic data and the Enron email database, where p > 75,000, and it is shown to outperform other state-of-the-art methods.

  16. A Load-Based Temperature Prediction Model for Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Sobhani, Masoud

    Electric load forecasting, as a basic requirement for decision-making in power utilities, has been improved in various aspects in the past decades. Many factors may affect the accuracy of the load forecasts, such as data quality, goodness of the underlying model and load composition. Due to the strong correlation between the input variables (e.g., weather and calendar variables) and the load, the quality of input data plays a vital role in forecasting practices. Even if the forecasting model were able to capture most of the salient features of the load, low-quality input data may result in inaccurate forecasts. Most of the data cleansing efforts in the load forecasting literature have been devoted to the load data. Few studies have focused on weather data cleansing for load forecasting. This research proposes an anomaly detection method for the temperature data. The method consists of two components: a load-based temperature prediction model and a detection technique. The effectiveness of the proposed method is demonstrated through two case studies: one based on the data from the Global Energy Forecasting Competition 2014, and the other based on the data published by ISO New England. The results show that by removing the detected observations from the original input data, the final load forecast accuracy is enhanced.
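The two components, a load-based temperature prediction model plus a detection rule, can be sketched with a single-regressor linear model. The linear form, the residual threshold, and the synthetic load/temperature data are illustrative assumptions, not the paper's full model:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def flag_temperature_anomalies(loads, temps, k=2.0):
    """Predict temperature from load with a fitted line and flag observations
    whose residual exceeds k residual standard deviations (simplified sketch)."""
    a, b = fit_line(loads, temps)
    residuals = [t - (a + b * l) for l, t in zip(loads, temps)]
    sd = (sum(r ** 2 for r in residuals) / len(residuals)) ** 0.5
    return [abs(r) > k * sd for r in residuals]

# Load rises roughly linearly with temperature; one temp reading is corrupted.
temps = [10, 12, 14, 16, 18, 20, 22, 24, 26, 28]
loads = [2 * t + 100 for t in temps]
temps[4] = 45  # simulated sensor glitch
flags = flag_temperature_anomalies(loads, temps)
```

Removing the flagged temperature observation before refitting is the cleansing step the abstract describes.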

  17. Adaptive hidden Markov model with anomaly states for price manipulation detection.

    PubMed

    Cao, Yi; Li, Yuhua; Coleman, Sonya; Belatreche, Ammar; McGinnity, Thomas Martin

    2015-02-01

    Price manipulation refers to the activities of those traders who use carefully designed trading behaviors to manually push up or down the underlying equity prices for making profits. With increasing volumes and frequency of trading, price manipulation can be extremely damaging to the proper functioning and integrity of capital markets. The existing literature focuses on either empirical studies of market abuse cases or analysis of particular manipulation types based on certain assumptions. Effective approaches for analyzing and detecting price manipulation in real time are yet to be developed. This paper proposes a novel approach, called adaptive hidden Markov model with anomaly states (AHMMAS), for modeling and detecting price manipulation activities. Together with wavelet transformations and gradients as the feature extraction methods, the AHMMAS model caters to price manipulation detection and basic manipulation type recognition. The evaluation experiments, conducted on tick data for seven stocks from NASDAQ and the London Stock Exchange and on 10 stock price series simulated by stochastic differential equations, show that the proposed AHMMAS model can effectively detect price manipulation patterns and outperforms the selected benchmark models.

  18. System and method for the detection of anomalies in an image

    DOEpatents

    Prasad, Lakshman; Swaminarayan, Sriram

    2013-09-03

    Preferred aspects of the present invention can include receiving a digital image at a processor; segmenting the digital image into a hierarchy of feature layers comprising one or more fine-scale features defining a foreground object embedded in one or more coarser-scale features defining a background to the one or more fine-scale features in the segmentation hierarchy; detecting a first fine-scale foreground feature as an anomaly with respect to a first background feature within which it is embedded; and constructing an anomalous feature layer by synthesizing spatially contiguous anomalous fine-scale features. Additional preferred aspects of the present invention can include detecting non-pervasive changes between sets of images in response at least in part to one or more difference images between the sets of images.

  19. Anomaly detection of microstructural defects in continuous fiber reinforced composites

    NASA Astrophysics Data System (ADS)

    Bricker, Stephen; Simmons, J. P.; Przybyla, Craig; Hardie, Russell

    2015-03-01

    Ceramic matrix composites (CMC) with continuous fiber reinforcements have the potential to enable the next generation of high speed hypersonic vehicles and/or significant improvements in gas turbine engine performance due to their exhibited toughness when subjected to high mechanical loads at extreme temperatures (2200 °F+). Reinforced fiber composites (RFC) provide increased fracture toughness, crack growth resistance, and strength, though little is known about how stochastic variation and imperfections in the material affect material properties. In this work, tools are developed for quantifying anomalies within the microstructure at several scales. The detection and characterization of anomalous microstructure is a critical step in linking production techniques to properties, as well as in accurate material simulation and property prediction for the integrated computational materials engineering (ICME) of RFC based components. It is desired to find statistical outliers for any number of material characteristics such as fibers, fiber coatings, and pores. Here, fiber orientation, or `velocity', and `velocity' gradient are developed and examined for anomalous behavior. Categorizing anomalous behavior in the CMC is approached by multivariate Gaussian mixture modeling. A Gaussian mixture is employed to estimate the probability density function (PDF) of the features in question, and anomalies are classified by their likelihood of belonging to the statistical normal behavior for that feature.
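The density-based classification step can be sketched in its simplest form. A one-component Gaussian stands in here for the paper's multivariate Gaussian mixture, and the three-sigma cutoff and the synthetic orientation feature are illustrative choices:

```python
import math

def mahalanobis_flags(xs, k=3.0):
    """Flag values more than k standard deviations from the mean; a
    one-component stand-in for a Gaussian-mixture density test: points with
    low likelihood under the fitted density are declared anomalous."""
    n = len(xs)
    mu = sum(xs) / n
    sd = math.sqrt(sum((x - mu) ** 2 for x in xs) / n)
    return [abs(x - mu) / sd > k for x in xs]

# Fiber-orientation-like feature: a smooth field plus one anomalous fiber.
orientations = [0.1 * i for i in range(20)] + [5.0]
flags = mahalanobis_flags(orientations)
```

With a full mixture model the same idea applies per component: the anomaly score is the (low) likelihood of the sample under the estimated PDF.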

  20. Damage diagnosis algorithm using a sequential change point detection method with an unknown distribution for damage

    NASA Astrophysics Data System (ADS)

    Noh, Hae Young; Rajagopal, Ram; Kiremidjian, Anne S.

    2012-04-01

    This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method for the cases where the post-damage feature distribution is unknown a priori. This algorithm extracts features from structural vibration data using time-series analysis and then declares damage using the change point detection method. The change point detection method asymptotically minimizes detection delay for a given false alarm rate. The conventional method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori. Therefore, our algorithm estimates and updates this distribution as data are collected using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using multiple sets of simulated data and a set of experimental data collected from a four-story steel special moment-resisting frame. Our algorithm was able to estimate the post-damage distribution consistently and resulted in detection delays only a few seconds longer than the delays from the conventional method that assumes we know the post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
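The conventional known-distribution detector referred to above is essentially a CUSUM-type sequential test, which can be sketched as follows. The Gaussian feature model and the fixed post-damage mean are precisely the assumptions the paper relaxes by estimating the post-damage distribution online; the feature values below are synthetic:

```python
def cusum(samples, mu0, mu1, var, threshold=5.0):
    """One-sided CUSUM change detector: accumulate the log-likelihood ratio
    of N(mu1, var) vs N(mu0, var), clip at zero, and return the first index
    at which the statistic crosses `threshold` (or None if it never does)."""
    s = 0.0
    for t, x in enumerate(samples):
        llr = ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * var)
        s = max(0.0, s + llr)
        if s > threshold:
            return t
    return None

pre = [0.0, 0.1, -0.2, 0.05, -0.1, 0.15]   # undamaged feature values
post = [1.0, 1.2, 0.9, 1.1, 1.05]          # shifted after damage
alarm = cusum(pre + post, mu0=0.0, mu1=1.0, var=0.1)
```

The detector raises an alarm one sample after the shift, illustrating the detection-delay vs false-alarm trade-off the paper optimizes.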

  1. Pre-seismic anomalies from optical satellite observations: a review

    NASA Astrophysics Data System (ADS)

    Jiao, Zhong-Hu; Zhao, Jing; Shan, Xinjian

    2018-04-01

    Detecting various anomalies using optical satellite data prior to strong earthquakes is key to understanding and forecasting earthquake activities because of its recognition of thermal-radiation-related phenomena in seismic preparation phases. Data from satellite observations serve as a powerful tool in monitoring earthquake preparation areas at a global scale and in a nearly real-time manner. Over the past several decades, many new different data sources have been utilized in this field, and progressive anomaly detection approaches have been developed. This paper reviews the progress and development of pre-seismic anomaly detection technology in this decade. First, precursor parameters, including parameters from the top of the atmosphere, in the atmosphere, and on the Earth's surface, are stated and discussed. Second, different anomaly detection methods, which are used to extract anomalous signals that probably indicate future seismic events, are presented. Finally, certain critical problems with the current research are highlighted, and new developing trends and perspectives for future work are discussed. The development of Earth observation satellites and anomaly detection algorithms can enrich available information sources, provide advanced tools for multilevel earthquake monitoring, and improve short- and medium-term forecasting, which play a large and growing role in pre-seismic anomaly detection research.

  2. The role of noninvasive and invasive diagnostic imaging techniques for detection of extra-cranial venous system anomalies and developmental variants

    PubMed Central

    2013-01-01

    The extra-cranial venous system is complex and not well studied in comparison to the peripheral venous system. A newly proposed vascular condition, named chronic cerebrospinal venous insufficiency (CCSVI), described initially in patients with multiple sclerosis (MS), has triggered intense interest in better understanding of the role of extra-cranial venous anomalies and developmental variants. So far, there is no established diagnostic imaging modality, non-invasive or invasive, that can serve as the “gold standard” for detection of these venous anomalies. However, consensus guidelines and standardized imaging protocols are emerging. Most likely, a multimodal imaging approach will ultimately be the most comprehensive means for screening, diagnostic and monitoring purposes. Further research is needed to determine the spectrum of extra-cranial venous pathology and to compare the imaging findings with pathological examinations. The ability to define these anomalies and detect them reliably and noninvasively is an essential step toward establishing their incidence and prevalence. The role for these anomalies in causing significant hemodynamic consequences for the intra-cranial venous drainage in MS patients and other neurologic disorders, and in aging, remains unproven. PMID:23806142

  3. Sequential analysis as a tool for detection of amikacin ototoxicity in the treatment of multidrug-resistant tuberculosis.

    PubMed

    Vasconcelos, Karla Anacleto de; Frota, Silvana Maria Monte Coelho; Ruffino-Netto, Antonio; Kritski, Afrânio Lineu

    2018-04-01

    To investigate early detection of amikacin-induced ototoxicity in a population treated for multidrug-resistant tuberculosis (MDR-TB), by means of three different tests: pure-tone audiometry (PTA); high-frequency audiometry (HFA); and distortion-product otoacoustic emission (DPOAE) testing. This was a longitudinal prospective cohort study involving patients aged 18-69 years with a diagnosis of MDR-TB who had to receive amikacin for six months as part of their antituberculosis drug regimen for the first time. Hearing was assessed before treatment initiation and at two and six months after treatment initiation. Sequential statistics were used to analyze the results. We included 61 patients, but the final population consisted of 10 patients (7 men and 3 women) because of sequential analysis. Comparison of the test results obtained at two and six months after treatment initiation with those obtained at baseline revealed that HFA at two months and PTA at six months detected hearing threshold shifts consistent with ototoxicity. However, DPOAE testing did not detect such shifts. The statistical method used in this study makes it possible to conclude that, over the six-month period, amikacin-associated hearing threshold shifts were detected by HFA and PTA, and that DPOAE testing was not efficient in detecting such shifts.

  4. Thermal and TEC anomalies detection using an intelligent hybrid system around the time of the Saravan, Iran, (Mw = 7.7) earthquake of 16 April 2013

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2014-02-01

    A powerful earthquake of Mw = 7.7 struck the Saravan region (28.107° N, 62.053° E) in Iran on 16 April 2013. The development of an automated anomaly detection method for the nonlinear time series of earthquake precursors remains an attractive and challenging task. Artificial Neural Networks (ANN) and Particle Swarm Optimization (PSO) have shown strong potential for accurate time series prediction. This paper presents the first study integrating the ANN and PSO methods in the research of earthquake precursors, to detect the unusual variations of the thermal and total electron content (TEC) seismo-ionospheric anomalies induced by the strong Saravan earthquake. In this study, to overcome stagnation in local minima during ANN training, PSO is used as the optimization method instead of traditional algorithms for training the ANN. The proposed hybrid method detected a considerable number of anomalies 4 and 8 days preceding the earthquake. Since, in this case study, ionospheric TEC anomalies induced by seismic activity are easily confounded with background fluctuations due to solar activity, a multi-resolution time series processing technique based on the wavelet transform was applied to the TEC signal variations. Because agreement among the final results of several robust methods is convincing evidence of a method's efficiency, the thermal and TEC anomalies detected using the ANN + PSO method were compared with those identified by the mean, median, wavelet, Kalman filter, Auto-Regressive Integrated Moving Average (ARIMA), Support Vector Machine (SVM) and Genetic Algorithm (GA) methods. The results indicate that the ANN + PSO method is quite promising and deserves serious attention as a new tool for thermal and TEC seismo-ionospheric anomaly detection.

  5. Recombinant Temporal Aberration Detection Algorithms for Enhanced Biosurveillance

    PubMed Central

    Murphy, Sean Patrick; Burkom, Howard

    2008-01-01

    Objective Broadly, this research aims to improve the outbreak detection performance and, therefore, the cost effectiveness of automated syndromic surveillance systems by building novel, recombinant temporal aberration detection algorithms from components of previously developed detectors. Methods This study decomposes existing temporal aberration detection algorithms into two sequential stages and investigates the individual impact of each stage on outbreak detection performance. The data forecasting stage (Stage 1) generates predictions of time series values a certain number of time steps in the future based on historical data. The anomaly measure stage (Stage 2) compares features of this prediction to corresponding features of the actual time series to compute a statistical anomaly measure. A Monte Carlo simulation procedure is then used to examine the recombinant algorithms’ ability to detect synthetic aberrations injected into authentic syndromic time series. Results New methods obtained with procedural components of published, sometimes widely used, algorithms were compared to the known methods using authentic datasets with plausible stochastic injected signals. Performance improvements were found for some of the recombinant methods, and these improvements were consistent over a range of data types, outbreak types, and outbreak sizes. For gradual outbreaks, the WEWD MovAvg7+WEWD Z-Score recombinant algorithm performed best; for sudden outbreaks, the HW+WEWD Z-Score performed best. Conclusion This decomposition was found not only to yield valuable insight into the effects of the aberration detection algorithms but also to produce novel combinations of data forecasters and anomaly measures with enhanced detection performance. PMID:17947614
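The two-stage decomposition can be sketched directly: a Stage 1 data forecaster feeding a Stage 2 anomaly measure. The seven-day moving average and Z-score below mirror the MovAvg7 + Z-Score recombination in spirit only; the counts, window, and threshold are illustrative assumptions:

```python
from statistics import mean, pstdev

def moving_average_forecast(series, window=7):
    """Stage 1: predict each value as the mean of the previous `window` values."""
    return [mean(series[t - window:t]) for t in range(window, len(series))]

def z_score_alarms(series, window=7, threshold=3.0):
    """Stage 2: compare each actual count to its forecast with a Z-score
    computed from the same history window, and alarm above `threshold`."""
    alarms = []
    for t in range(window, len(series)):
        hist = series[t - window:t]
        sd = pstdev(hist) or 1.0  # guard against a constant history
        z = (series[t] - mean(hist)) / sd
        alarms.append(z > threshold)
    return alarms

# Daily syndromic counts with an injected outbreak on the last day.
counts = [10, 12, 11, 9, 10, 11, 12, 10, 11, 40]
alarms = z_score_alarms(counts, window=7)
```

Recombination then amounts to swapping either stage independently, e.g. replacing the moving average with a Holt-Winters forecaster while keeping the Z-score measure.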

  6. Improvement of statistical methods for detecting anomalies in climate and environmental monitoring systems

    NASA Astrophysics Data System (ADS)

    Yakunin, A. G.; Hussein, H. M.

    2018-01-01

    The article shows how known statistical methods, widely used in solving financial problems and in a number of other fields of science and technology, can, after minor modification, be applied effectively to problems in climate and environmental monitoring systems such as the detection of anomalies in the form of abrupt changes in signal level, the occurrence of positive and negative outliers, and violations of the cycle form in periodic processes.

  7. Maternal psychological responses during pregnancy after ultrasonographic detection of structural fetal anomalies: A prospective longitudinal observational study

    PubMed Central

    Kaasen, Anne; Helbig, Anne; Malt, Ulrik F.; Næs, Tormod; Skari, Hans; Haugen, Guttorm

    2017-01-01

    In this longitudinal prospective observational study performed at a tertiary perinatal referral centre, we aimed to assess maternal distress in pregnancy in women with ultrasound findings of fetal anomaly and compare this with distress in pregnant women with normal ultrasound findings. Pregnant women with a structural fetal anomaly (n = 48) and normal ultrasound (n = 105) were included. We administered self-report questionnaires (General Health Questionnaire-28, Impact of Event Scale-22 [IES], and Edinburgh Postnatal Depression Scale) a few days following ultrasound detection of a fetal anomaly or a normal ultrasound (T1), 3 weeks post-ultrasound (T2), and at 30 (T3) and 36 weeks gestation (T4). Social dysfunction, health perception, and psychological distress (intrusion, avoidance, arousal, anxiety, and depression) were the main outcome measures. The median gestational age at T1 was 20 and 19 weeks in the group with and without fetal anomaly, respectively. In the fetal anomaly group, all psychological distress scores were highest at T1. In the group with a normal scan, distress scores were stable throughout pregnancy. At all assessments, the fetal anomaly group scored significantly higher (especially on depression-related questions) compared to the normal scan group, except on the IES Intrusion and Arousal subscales at T4, although with large individual differences. In conclusion, women with a known fetal anomaly initially had high stress scores, which gradually decreased, resembling those in women with a normal pregnancy. Psychological stress levels were stable and low during the latter half of gestation in women with a normal pregnancy. PMID:28350879

  8. Spatially-Aware Temporal Anomaly Mapping of Gamma Spectra

    NASA Astrophysics Data System (ADS)

    Reinhart, Alex; Athey, Alex; Biegalski, Steven

    2014-06-01

    For security, environmental, and regulatory purposes it is useful to continuously monitor wide areas for unexpected changes in radioactivity. We report on a temporal anomaly detection algorithm which uses mobile detectors to build a spatial map of background spectra, allowing sensitive detection of any anomalies through many days or months of monitoring. We adapt previously-developed anomaly detection methods, which compare spectral shape rather than count rate, to function with limited background data, allowing sensitive detection of small changes in spectral shape from day to day. To demonstrate this technique we collected daily observations over the period of six weeks on a 0.33 square mile research campus and performed source injection simulations.
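Comparing spectral shape rather than count rate can be sketched with a chi-square-style distance between count-normalized spectra. The four-bin spectra and the specific distance are illustrative assumptions, not the authors' detector:

```python
def normalize(spectrum):
    """Scale a count spectrum to unit total, isolating its shape."""
    total = sum(spectrum)
    return [c / total for c in spectrum]

def spectral_distance(background, observed):
    """Chi-square-like distance between normalized spectra: zero for a pure
    count-rate change, positive when the spectral shape itself changes."""
    p, q = normalize(background), normalize(observed)
    return sum((a - b) ** 2 / (a + b) for a, b in zip(p, q) if a + b > 0)

# Same shape at double the rate vs a genuine shape change (a new peak).
background = [100, 100, 100, 100]
same_shape = [200, 200, 200, 200]   # scaled only
peak = [100, 100, 300, 100]         # anomalous energy bin
d_same = spectral_distance(background, same_shape)
d_peak = spectral_distance(background, peak)
```

A mobile system would maintain one background spectrum per map cell and apply such a shape test against the locally learned background.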

  9. Methods for computational disease surveillance in infection prevention and control: Statistical process control versus Twitter's anomaly and breakout detection algorithms.

    PubMed

    Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A

    2018-02-01

    Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
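A minimal individuals-type SPC chart, one of the two approaches compared, can be sketched as follows. The three-sigma limits and the infection-count data are illustrative; Twitter's anomaly/breakout detection algorithms are not reproduced here:

```python
from statistics import mean, pstdev

def shewhart_limits(baseline, k=3.0):
    """Control limits from an in-control baseline period (individuals-chart
    style: center line +/- k standard deviations)."""
    m, s = mean(baseline), pstdev(baseline)
    return m - k * s, m + k * s

def out_of_control(baseline, monitored, k=3.0):
    """Indices of monitored points falling outside the control limits."""
    lo, hi = shewhart_limits(baseline, k)
    return [i for i, x in enumerate(monitored) if x < lo or x > hi]

# Monthly infection counts: stable baseline, then a cluster in month 2.
baseline = [4, 5, 3, 4, 6, 5, 4, 5, 4, 5]
monitored = [5, 4, 14, 5]
signals = out_of_control(baseline, monitored)
```

Seasonal variation and autocorrelation violate this chart's independence assumptions, which is why the ABD algorithms performed better on such data in the study.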

  10. Detection of Anomalies in Citrus Leaves Using Laser-Induced Breakdown Spectroscopy (LIBS).

    PubMed

    Sankaran, Sindhuja; Ehsani, Reza; Morgan, Kelly T

    2015-08-01

    Nutrient assessment and management are important to maintain productivity in citrus orchards. In this study, laser-induced breakdown spectroscopy (LIBS) was applied for rapid and real-time detection of citrus anomalies. Laser-induced breakdown spectroscopy spectra were collected from citrus leaves with anomalies such as diseases (Huanglongbing, citrus canker) and nutrient deficiencies (iron, manganese, magnesium, zinc), and compared with those of healthy leaves. Baseline correction, wavelet multivariate denoising, and normalization techniques were applied to the LIBS spectra before analysis. After spectral pre-processing, features were extracted using principal component analysis and classified using two models, quadratic discriminant analysis and support vector machine (SVM). The SVM resulted in a high average classification accuracy of 97.5%, with high average canker classification accuracy (96.5%). LIBS peak analysis indicated that, of the 11 peaks found in all the samples, high intensities were observed at 229.7, 247.9, 280.3, 393.5, 397.0, and 769.8 nm. Future studies using controlled experiments with variable nutrient applications are required for quantification of foliar nutrients by LIBS-based sensing.

  11. Euclidean commute time distance embedding and its application to spectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Albano, James A.; Messinger, David W.

    2012-06-01

    Spectral image analysis problems often begin by performing a preprocessing step composed of applying a transformation that generates an alternative representation of the spectral data. In this paper, a transformation based on a Markov-chain model of a random walk on a graph is introduced. More precisely, we quantify the random walk using a quantity known as the average commute time distance and find a nonlinear transformation that embeds the nodes of a graph in a Euclidean space where the separation between them is equal to the square root of this quantity. This has been referred to as the Commute Time Distance (CTD) transformation and it has the important characteristic of increasing when the number of paths between two nodes decreases and/or the lengths of those paths increase. Remarkably, a closed form solution exists for computing the average commute time distance that avoids running an iterative process and is found by simply performing an eigendecomposition on the graph Laplacian matrix. Contained in this paper is a discussion of the particular graph constructed on the spectral data from which the commute time distance is then calculated, an introduction of some important properties of the graph Laplacian matrix, and a subspace projection that approximately preserves the maximal variance of the square root commute time distance. Finally, RX anomaly detection and Topological Anomaly Detection (TAD) algorithms will be applied to the CTD subspace followed by a discussion of their results.
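    The closed-form route described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: it assumes an undirected graph given by its adjacency matrix, uses the standard identity CTD(i,j) = vol(G)·(L⁺ᵢᵢ + L⁺ⱼⱼ − 2 L⁺ᵢⱼ) with L⁺ the pseudoinverse of the graph Laplacian, and the function names and toy path graph are mine:

```python
import numpy as np

def commute_time_distances(A):
    """Average commute time distances from an adjacency matrix A."""
    A = np.asarray(A, dtype=float)
    deg = A.sum(axis=1)
    L = np.diag(deg) - A            # graph Laplacian
    Lp = np.linalg.pinv(L)          # pseudoinverse via one eigendecomposition
    vol = deg.sum()                 # volume of the graph (sum of degrees)
    d = np.diag(Lp)
    return vol * (d[:, None] + d[None, :] - 2 * Lp)

def ctd_embedding(A):
    """Embed nodes so Euclidean separations equal sqrt(commute time)."""
    A = np.asarray(A, dtype=float)
    deg = A.sum(axis=1)
    L = np.diag(deg) - A
    w, V = np.linalg.eigh(L)
    nz = w > 1e-10                  # drop the zero eigenvalue (constant vector)
    return V[:, nz] * np.sqrt(deg.sum() / w[nz])

A = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]  # toy path graph: 0-1-2
C = commute_time_distances(A)
X = ctd_embedding(A)
```

    On the 3-node path, the commute time between the end nodes is twice that between adjacent nodes, matching the doubled effective resistance, and the squared Euclidean distance between embedded nodes reproduces the CTD exactly.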

  12. Routine screening for fetal anomalies: expectations.

    PubMed

    Goldberg, James D

    2004-03-01

    Ultrasound has become a routine part of prenatal care. Despite this, the sensitivity and specificity of the procedure are unclear to many patients and healthcare providers. In a small study from Canada, 54.9% of women reported that they had received no information about ultrasound before their examination. In addition, 37.2% of women indicated that they were unaware of any fetal problems that ultrasound could not detect. Most centers that perform ultrasound do not have their own statistics regarding sensitivity and specificity; it is necessary to rely on large collaborative studies. Unfortunately, wide variations exist in these studies, with detection rates for fetal anomalies between 13.3% and 82.4%. The Eurofetus study is the largest prospective study performed to date and, because of the time and expense involved in this type of study, a similar study is not likely to be repeated. The overall fetal detection rate for anomalous fetuses was 64.1%. It is important to note that in this study, ultrasounds were performed in tertiary centers with significant experience in detecting fetal malformations. The RADIUS study also demonstrated a significantly improved detection rate of anomalies before 24 weeks in tertiary versus community centers (35% versus 13%). Two concepts seem to emerge from reviewing these data. First, patients must be made aware of the limitations of ultrasound in detecting fetal anomalies. This information is critical to allow them to make informed decisions about whether to undergo ultrasound examination and to prepare them for potential outcomes. Second, to achieve the detection rates reported in the Eurofetus study, ultrasound examination must be performed in centers that have extensive experience in the detection of fetal anomalies.

  13. Adaptive x-ray threat detection using sequential hypotheses testing with fan-beam experimental data (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Thamvichai, Ratchaneekorn; Huang, Liang-Chih; Ashok, Amit; Gong, Qian; Coccarelli, David; Greenberg, Joel A.; Gehm, Michael E.; Neifeld, Mark A.

    2017-05-01

    We employ an adaptive measurement system, based on a sequential hypotheses testing (SHT) framework, for detecting material-based threats using experimental data acquired on an X-ray experimental testbed system. This testbed employs 45-degree fan-beam geometry and 15 views over a 180-degree span to generate energy sensitive X-ray projection data. Using this testbed system, we acquire multiple view projection data for 200 bags. We consider an adaptive measurement design where the X-ray projection measurements are acquired in a sequential manner and the adaptation occurs through the choice of the optimal "next" source/view system parameter. Our analysis of such an adaptive measurement design using the experimental data demonstrates a 3x-7x reduction in the probability of error relative to a static measurement design. Here the static measurement design refers to the operational system baseline that corresponds to a sequential measurement using all the available sources/views. We also show that by using adaptive measurements it is possible to reduce the number of sources/views by nearly 50% compared to a system that relies on static measurements.

  14. In situ laser-induced photochemical silver substrate synthesis and sequential SERS detection in a flow cell.

    PubMed

    Herman, Krisztian; Szabó, László; Leopold, Loredana F; Chiş, Vasile; Leopold, Nicolae

    2011-05-01

    A new, simple, and effective approach for multianalyte sequential surface-enhanced Raman scattering (SERS) detection in a flow cell is reported. The silver substrate was prepared in situ by laser-induced photochemical synthesis. By focusing the laser on the 320 μm inner diameter glass capillary at 0.5 ml/min continuous flow of 1 mM silver nitrate and 10 mM sodium citrate mixture, a SERS active silver spot on the inner wall of the glass capillary was prepared in a few seconds. The test analytes, dacarbazine, 4-(2-pyridylazo)resorcinol (PAR) complex with Cu(II), and amoxicillin, were sequentially injected into the flow cell. Each analyte was adsorbed to the silver surface, enabling the recording of high intensity SERS spectra even at 2 s integration times, followed by desorption from the silver surface and being washed away from the capillary. Before and after each analyte passed the detection window, citrate background spectra were recorded, and thus, no "memory effects" perturbed the SERS detection. A good reproducibility of the SERS spectra obtained under flow conditions was observed. The laser-induced photochemically synthesized silver substrate enables high Raman enhancement, is characterized by fast preparation with a high success rate, and represents a valuable alternative for silver colloids as SERS substrate in flow approaches.

  15. Anomaly Detection for Beam Loss Maps in the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Valentino, Gianluca; Bruce, Roderik; Redaelli, Stefano; Rossi, Roberto; Theodoropoulos, Panagiotis; Jaster-Merz, Sonja

    2017-07-01

    In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss map campaigns, as well as in standard operation. Due to the complexity of the system, consisting of more than 50 collimators per beam, it is difficult with such methods to identify small changes in the collimation hierarchy, which may be due to setting errors or beam orbit drifts. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy.
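    The PCA-plus-Local-Outlier-Factor combination can be sketched generically. This is a minimal illustration on synthetic feature vectors under the usual definitions of both techniques; how real loss maps are turned into features is not described in the abstract, so the data below are invented:

```python
import numpy as np

def pca_project(X, m=2):
    """Project feature vectors onto the top-m principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:m].T

def lof_scores(X, k=3):
    """Local Outlier Factor: scores well above 1 indicate outliers."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    knn = np.argsort(D, axis=1)[:, :k]            # k nearest neighbours
    kdist = D[np.arange(len(X)), knn[:, -1]]      # distance to k-th neighbour
    # Reachability distance: max(k-distance of neighbour, actual distance)
    reach = np.maximum(kdist[knn], D[np.arange(len(X))[:, None], knn])
    lrd = 1.0 / reach.mean(axis=1)                # local reachability density
    return lrd[knn].mean(axis=1) / lrd

# Hypothetical "loss map" feature vectors: normal maps cluster, one drifts.
maps = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
                 [0.1, 0.1], [0.05, 0.05], [5.0, 5.0]])
scores = lof_scores(pca_project(maps, m=2))       # last map scores highest
```

    Normal loss maps sit in a dense cluster and score near 1; a map from a drifted hierarchy has low local density relative to its neighbours and scores far above 1.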

  16. Advanced Unsupervised Classification Methods to Detect Anomalies on Earthen Levees Using Polarimetric SAR Imagery

    PubMed Central

    Marapareddy, Ramakalavathi; Aanstoos, James V.; Younan, Nicolas H.

    2016-01-01

    Fully polarimetric Synthetic Aperture Radar (polSAR) data analysis has wide applications for terrain and ground cover classification. The dynamics of surface and subsurface water events can lead to slope instability resulting in slough slides on earthen levees. Early detection of these anomalies by a remote sensing approach could save time versus direct assessment. We used L-band Synthetic Aperture Radar (SAR) to screen levees for anomalies. SAR technology, due to its high spatial resolution and soil penetration capability, is a good choice for identifying problematic areas on earthen levees. Using the parameters entropy (H), anisotropy (A), alpha (α), and eigenvalues (λ, λ1, λ2, and λ3), we implemented several unsupervised classification algorithms for the identification of anomalies on the levee. The classification techniques applied are H/α, H/A, A/α, Wishart H/α, Wishart H/A/α, and H/α/λ classification algorithms. In this work, the effectiveness of the algorithms was demonstrated using quad-polarimetric L-band SAR imagery from the NASA Jet Propulsion Laboratory’s (JPL’s) Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR). The study area is a section of the lower Mississippi River valley in the Southern USA, where earthen flood control levees are maintained by the US Army Corps of Engineers. PMID:27322270

  17. Characterization of normality of chaotic systems including prediction and detection of anomalies

    NASA Astrophysics Data System (ADS)

    Engler, Joseph John

    Accurate prediction and control pervade domains such as engineering, physics, chemistry, and biology. Often, it is discovered that the systems under consideration cannot be well represented by linear, periodic nor random data. It has been shown that these systems exhibit deterministic chaos behavior. Deterministic chaos describes systems which are governed by deterministic rules but whose data appear to be random or quasi-periodic distributions. Deterministically chaotic systems characteristically exhibit sensitive dependence upon initial conditions manifested through rapid divergence of states initially close to one another. Due to this characterization, it has been deemed impossible to accurately predict future states of these systems for longer time scales. Fortunately, the deterministic nature of these systems allows for accurate short term predictions, given the dynamics of the system are well understood. This fact has been exploited in the research community and has resulted in various algorithms for short term predictions. Detection of normality in deterministically chaotic systems is critical in understanding the system sufficiently to be able to predict future states. Due to the sensitivity to initial conditions, the detection of normal operational states for a deterministically chaotic system can be challenging. The addition of small perturbations to the system, which may result in bifurcation of the normal states, further complicates the problem. The detection of anomalies and prediction of future states of the chaotic system allows for greater understanding of these systems. The goal of this research is to produce methodologies for determining states of normality for deterministically chaotic systems, detection of anomalous behavior, and the more accurate prediction of future states of the system. Additionally, the ability to detect subtle system state changes is discussed. The dissertation addresses these goals by proposing new representational

  18. Detection of submicron scale cracks and other surface anomalies using positron emission tomography

    DOEpatents

    Cowan, Thomas E.; Howell, Richard H.; Colmenares, Carlos A.

    2004-02-17

    Detection of submicron scale cracks and other mechanical and chemical surface anomalies using PET. This surface technique has sufficient sensitivity to detect single voids or pits of sub-millimeter size, and single cracks or fissures of millimeter-scale length, micrometer-scale depth, and nanometer-scale width. This technique can also be applied to detect surface regions of differing chemical reactivity. It may be utilized in a scanning or survey mode to simultaneously detect such mechanical or chemical features over large interior or exterior surface areas of parts as large as about 50 cm in diameter. The technique involves exposing a surface to short-lived radioactive gas for a time period, removing the excess gas to leave a partial monolayer, determining the location and shape of the cracks, voids, porous regions, etc., and calculating the width, depth, and length thereof. Detection of 0.01 mm deep cracks using a 3 mm detector resolution has been accomplished using this technique.

  19. Deep-cascade: Cascading 3D Deep Neural Networks for Fast Anomaly Detection and Localization in Crowded Scenes.

    PubMed

    Sabokrou, Mohammad; Fayyaz, Mohsen; Fathy, Mahmood; Klette, Reinhard

    2017-02-17

    This paper proposes a fast and reliable method for anomaly detection and localization in video data showing crowded scenes. Time-efficient anomaly localization is an ongoing challenge and subject of this paper. We propose a cubic-patch-based method, characterised by a cascade of classifiers, which makes use of an advanced feature-learning approach. Our cascade of classifiers has two main stages. First, a light but deep 3D auto-encoder is used for early identification of "many" normal cubic patches. This deep network, as the first stage, operates on small cubic patches; remaining candidates of interest are then carefully resized and evaluated at the second stage using a more complex and deeper 3D convolutional neural network (CNN). We divide the deep auto-encoder and the CNN into multiple sub-stages which operate as cascaded classifiers. Shallow layers of the cascaded deep networks (designed as Gaussian classifiers, acting as weak single-class classifiers) detect "simple" normal patches such as background patches, and more complex normal patches are detected at deeper layers. It is shown that the proposed novel technique (a cascade of two cascaded classifiers) performs comparable to current top-performing detection and localization methods on standard benchmarks, but outperforms those in general with respect to required computation time.
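    The early-exit structure of such a cascade can be sketched generically. The score functions and thresholds below are toy stand-ins, not the paper's learned networks; the sketch only shows the control flow by which cheap stages reject obviously-normal inputs before an expensive final classifier runs:

```python
def cascade_detect(patch, stages, final_stage):
    """Early-exit cascade: cheap stages reject obviously-normal patches;
    only surviving candidates reach the expensive final classifier.

    Each stage is a (score_fn, threshold) pair; a score below the threshold
    means "confidently normal, stop here".
    """
    for score_fn, threshold in stages:
        if score_fn(patch) < threshold:
            return False              # normal: rejected by a cheap stage
    return final_stage(patch)         # expensive decision on survivors

# Toy stand-ins for the learned stages (hypothetical score functions):
cheap = (lambda p: abs(p), 0.5)       # e.g. auto-encoder reconstruction error
medium = (lambda p: abs(p) * 2, 2.0)
deep_cnn = lambda p: abs(p) > 3.0     # e.g. the deeper 3D CNN verdict

result = cascade_detect(5.0, [cheap, medium], deep_cnn)
```

    The time saving comes from most patches exiting at the first comparison; only the rare survivors pay the cost of the final stage.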

  20. Developing a new, passive diffusion sampling array to detect helium anomalies associated with volcanic unrest

    USGS Publications Warehouse

    Dame, Brittany E; Solomon, D Kip; Evans, William C.; Ingebritsen, Steven E.

    2015-01-01

    Helium (He) concentration and 3He/4He anomalies in soil gas and spring water are potentially powerful tools for investigating hydrothermal circulation associated with volcanism and could perhaps serve as part of a hazards warning system. However, in operational practice, He and other gases are often sampled only after volcanic unrest is detected by other means. A new passive diffusion sampler suite, intended to be collected after the onset of unrest, has been developed and tested as a relatively low-cost method of determining He-isotope composition pre- and post-unrest. The samplers, each with a distinct equilibration time, passively record He concentration and isotope ratio in springs and soil gas. Once collected and analyzed, the He concentrations in the samplers are used to deconvolve the time history of the He concentration and the 3He/4He ratio at the collection site. The current suite consisting of three samplers is sufficient to deconvolve both the magnitude and the timing of a step change in in situ concentration if the suite is collected within 100 h of the change. The effects of temperature and prolonged deployment on the suite's capability of recording He anomalies have also been evaluated. The suite has captured a significant 3He/4He soil gas anomaly at Horseshoe Lake near Mammoth Lakes, California. The passive diffusion sampler suite appears to be an accurate and affordable alternative for determining He anomalies associated with volcanic unrest.

  1. First and second trimester screening for fetal structural anomalies.

    PubMed

    Edwards, Lindsay; Hui, Lisa

    2018-04-01

    Fetal structural anomalies are found in up to 3% of all pregnancies and ultrasound-based screening has been an integral part of routine prenatal care for decades. The prenatal detection of fetal anomalies allows for optimal perinatal management, providing expectant parents with opportunities for additional imaging, genetic testing, and the provision of information regarding prognosis and management options. Approximately one-half of all major structural anomalies can now be detected in the first trimester, including acrania/anencephaly, abdominal wall defects, holoprosencephaly and cystic hygromata. Due to the ongoing development of some organ systems however, some anomalies will not be evident until later in the pregnancy. To this extent, the second trimester anatomy scan is recommended by professional societies as the standard investigation for the detection of fetal structural anomalies. The reported detection rates of structural anomalies vary according to the organ system being examined, and are also dependent upon factors such as the equipment settings and sonographer experience. Technological advances over the past two decades continue to support the role of ultrasound as the primary imaging modality in pregnancy, and the safety of ultrasound for the developing fetus is well established. With increasing capabilities and experience, detailed examination of the central nervous system and cardiovascular system is possible, with dedicated examinations such as the fetal neurosonogram and the fetal echocardiogram now widely performed in tertiary centers. Magnetic resonance imaging (MRI) is well recognized for its role in the assessment of fetal brain anomalies; other potential indications for fetal MRI include lung volume measurement (in cases of congenital diaphragmatic hernia), and pre-surgical planning prior to fetal spina bifida repair. When a major structural abnormality is detected prenatally, genetic testing with chromosomal microarray is recommended over

  2. Delay test generation for synchronous sequential circuits

    NASA Astrophysics Data System (ADS)

    Devadas, Srinivas

    1989-05-01

    We address the problem of generating tests for delay faults in non-scan synchronous sequential circuits. Delay test generation for sequential circuits is a considerably more difficult problem than delay testing of combinational circuits and has received much less attention. In this paper, we present a method for generating test sequences to detect delay faults in sequential circuits using the stuck-at fault sequential test generator STALLION. The method is complete in that it will generate a delay test sequence for a targeted fault given sufficient CPU time, if such a sequence exists. We term faults for which no delay test sequence exists, under our test methodology, sequentially delay redundant. We describe means of eliminating sequential delay redundancies in logic circuits. We present a partial-scan methodology for enhancing the testability of difficult-to-test or untestable sequential circuits, wherein a small number of flip-flops are selected and made controllable/observable. The selection process guarantees the elimination of all sequential delay redundancies. We show that an intimate relationship exists between state assignment and delay testability of a sequential machine. We describe a state assignment algorithm for the synthesis of sequential machines with maximal delay fault testability. Preliminary experimental results using the test generation, partial-scan and synthesis algorithms are presented.

  3. An Unsupervised Deep Hyperspectral Anomaly Detector

    PubMed Central

    Ma, Ning; Peng, Yu; Wang, Shaojun

    2018-01-01

    Hyperspectral image (HSI) based detection has attracted considerable attention recently in agriculture, environmental protection and military applications as different wavelengths of light can be advantageously used to discriminate different types of objects. Unfortunately, estimating the background distribution and the detection of interesting local objects is not straightforward, and anomaly detectors may give false alarms. In this paper, a Deep Belief Network (DBN) based anomaly detector is proposed. The high-level features and reconstruction errors are learned through the network in a manner which is not affected by previous background distribution assumption. To reduce contamination by local anomalies, adaptive weights are constructed from reconstruction errors and statistical information. By using the code image which is generated during the inference of DBN and modified by adaptively updated weights, a local Euclidean distance between pixels under test and their neighboring pixels is used to determine the anomaly targets. Experimental results on synthetic and recorded HSI datasets show that the proposed method outperforms the classic global Reed-Xiaoli detector (RXD), local RX detector (LRXD) and the state-of-the-art Collaborative Representation detector (CRD). PMID:29495410

  4. A Stochastic-entropic Approach to Detect Persistent Low-temperature Volcanogenic Thermal Anomalies

    NASA Astrophysics Data System (ADS)

    Pieri, D. C.; Baxter, S.

    2011-12-01

    Eruption prediction is a chancy idiosyncratic affair, as volcanoes often manifest waxing and/or waning pre-eruption emission, geodetic, and seismic behavior that is unsystematic. Thus, fundamental to increased prediction accuracy and precision are good and frequent assessments of the time-series behavior of relevant precursor geophysical, geochemical, and geological phenomena, especially when volcanoes become restless. The Advanced Spaceborne Thermal Emission and Reflection radiometer (ASTER), in orbit since 1999 on the NASA Terra Earth Observing System satellite, is an important capability for detection of thermal eruption precursors (even subtle ones) and increased passive gas emissions. The unique combination of ASTER high spatial resolution multi-spectral thermal IR imaging data (90 m/pixel; 5 bands in the 8-12 um region) with simultaneous visible and near-IR imaging data and stereo-photogrammetric capabilities makes it a useful precursor detection tool, especially for thermal precursors. The JPL ASTER Volcano Archive, consisting of 80,000+ ASTER volcano images, allows systematic analysis of (a) baseline thermal emissions for 1550+ volcanoes, (b) important aspects of the time-dependent thermal variability, and (c) the limits of detection of temporal dynamics of eruption precursors. We are analyzing a catalog of the magnitude, frequency, and distribution of ASTER-documented volcano thermal signatures, compiled from 2000 onward, at 90 m/pixel. Low contrast thermal anomalies of relatively low apparent absolute temperature (e.g., summit lakes, fumarolically altered areas, geysers, very small sub-pixel hotspots), for which the signal-to-noise ratio may be marginal (e.g., scene confusion due to clouds, water and water vapor, fumarolic emissions, variegated ground emissivity, and their combinations), are particularly important to discern and monitor. We have developed a technique to detect persistent hotspots that takes into account in-scene observed pixel joint frequency

  5. Sequentially Executed Model Evaluation Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.
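    A driver API of the kind described might look roughly like the following sketch. Every class and method name here is hypothetical, invented only to illustrate the input/model/output separation and a batch controller stepping through a discrete time domain; it is not the SeMe API:

```python
class InputDriver:
    """Hypothetical input driver: yields one sample per time step."""
    def __init__(self, series):
        self.series = iter(series)
    def step(self):
        return next(self.series, None)

class RunningMeanModel:
    """Hypothetical model: flags values far from the running mean,
    i.e. evaluation based on prior results plus the new sample."""
    def __init__(self, tolerance=3.0):
        self.history, self.tolerance = [], tolerance
    def step(self, value):
        anomalous = (len(self.history) > 2 and
                     abs(value - sum(self.history) / len(self.history))
                     > self.tolerance)
        self.history.append(value)
        return anomalous

class OutputDriver:
    """Hypothetical output driver: collects flagged time steps."""
    def __init__(self):
        self.flags = []
    def step(self, t, anomalous):
        if anomalous:
            self.flags.append(t)

def batch_controller(inp, model, out):
    """Step input, model and output through the discrete time domain."""
    t = 0
    while (value := inp.step()) is not None:
        out.step(t, model.step(value))
        t += 1

inp = InputDriver([1.0, 1.1, 0.9, 1.0, 9.0, 1.0])
out = OutputDriver()
batch_controller(inp, RunningMeanModel(), out)   # out.flags → [4]
```

    The controller knows nothing about the model or I/O internals, which is the point of the message-passing separation: drivers can be swapped without touching the stepping logic.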

  6. An Adaptive Network-based Fuzzy Inference System for the detection of thermal and TEC anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake of 11 August 2012

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-09-01

    Anomaly detection is extremely important for forecasting the date, location and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS) has been proposed to detect the thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake that jolted NW Iran on 11 August 2012. ANFIS is a well-known hybrid neuro-fuzzy network for modeling non-linear complex systems. In this study, the thermal and TEC anomalies detected using the proposed method are also compared to those observed by applying classical and intelligent methods, including Interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN) and Support Vector Machine (SVM) methods. The dataset, which comprises Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), spans 62 days. It can be shown that, if the difference between the predicted value using the ANFIS method and the observed value exceeds the pre-defined threshold value, then the observed precursor value, in the absence of non-seismic effective parameters, could be regarded as a precursory anomaly. For the two precursors of LST and TEC, the ANFIS method shows very good agreement with the other implemented classical and intelligent methods, and this indicates that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. This paper indicates that the detection of the thermal and TEC anomalies derives its credibility from the overall efficiencies and potentialities of the five integrated methods.

  7. Prevalence and distribution of dental anomalies in orthodontic patients.

    PubMed

    Montasser, Mona A; Taha, Mahasen

    2012-01-01

    To study the prevalence and distribution of dental anomalies in a sample of orthodontic patients. The dental casts, intraoral photographs, and lateral panoramic and cephalometric radiographs of 509 Egyptian orthodontic patients were studied. Patients were examined for dental anomalies in number, size, shape, position, and structure. The prevalence of each dental anomaly was calculated and compared between sexes. Of the total study sample, 32.6% of the patients had at least one dental anomaly other than agenesis of third molars; 32.1% of females and 33.5% of males had at least one dental anomaly other than agenesis of third molars. The most commonly detected dental anomalies were impaction (12.8%) and ectopic eruption (10.8%). The total prevalence of hypodontia (excluding third molars) and hyperdontia was 2.4% and 2.8%, respectively, with similar distributions in females and males. Gemination and accessory roots were reported in this study; each of these anomalies was detected in 0.2% of patients. In addition to genetic and racial factors, environmental factors could have more important influence on the prevalence of dental anomalies in every population. Impaction, ectopic eruption, hyperdontia, hypodontia, and microdontia were the most common dental anomalies, while fusion and dentinogenesis imperfecta were absent.

  8. Application of data cubes for improving detection of water cycle extreme events

    NASA Astrophysics Data System (ADS)

    Teng, W. L.; Albayrak, A.

    2015-12-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are data, as archived, rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection, requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVM) for anomaly classification. We show an example of detection of WCE events, using the Global Land Data Assimilation System (GLDAS) data set.
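    The SPRT mentioned above is Wald's classic sequential test, which can be illustrated concretely. This is a generic sketch for detecting a shift in a Gaussian mean, not the project's configuration; the hypothesized means, sigma, and error rates below are illustrative:

```python
import math

def sprt(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for a Gaussian mean.

    Accumulates the log-likelihood ratio of H1 (mean mu1) vs H0 (mean mu0)
    and stops as soon as either error-bounded threshold is crossed.
    """
    upper = math.log((1 - beta) / alpha)   # cross upward: accept H1 (anomaly)
    lower = math.log(beta / (1 - alpha))   # cross downward: accept H0 (normal)
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Per-sample log-likelihood ratio increment for Gaussian densities
        llr += (x * (mu1 - mu0) - (mu1**2 - mu0**2) / 2) / sigma**2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

decision, n = sprt([1.0] * 10, mu0=0.0, mu1=1.0, sigma=1.0)  # → ("H1", 6)
```

    The appeal for streaming data is that the test decides as soon as the evidence suffices, so clear anomalies are flagged after only a handful of time steps rather than a fixed-length window.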

  9. Systematic Screening for Subtelomeric Anomalies in a Clinical Sample of Autism

    ERIC Educational Resources Information Center

    Wassink, Thomas H.; Losh, Molly; Piven, Joseph; Sheffield, Val C.; Ashley, Elizabeth; Westin, Erik R.; Patil, Shivanand R.

    2007-01-01

    High-resolution karyotyping detects cytogenetic anomalies in 5-10% of cases of autism. Karyotyping, however, may fail to detect abnormalities of chromosome subtelomeres, which are gene rich regions prone to anomalies. We assessed whether panels of FISH probes targeted for subtelomeres could detect abnormalities beyond those identified by…

  10. [The advantages of early midtrimester targeted fetal systematic organ screening for the detection of fetal anomalies--will a global change start in Israel?].

    PubMed

    Bronshtein, Moshe; Solt, Ido; Blumenfeld, Zeev

    2014-06-01

    Despite more than three decades of universal popularity of fetal sonography as an integral part of pregnancy evaluation, there is still no unequivocal agreement regarding the optimal dating of fetal sonographic screening and the type of ultrasound (transvaginal vs abdominal). Transvaginal systematic sonography at 14-17 weeks for fetal organ screening. The evaluation of over 72,000 early (14-17 weeks) and late (18-24 weeks) fetal ultrasonographic systematic organ screenings revealed that 96% of the malformations are detectable in the early screening, with an incidence of 1:50 gestations. Only 4% of the fetal anomalies are diagnosed later in pregnancy. Over 99% of the fetal cardiac anomalies are detectable in the early screening, and most of them appear in low-risk gestations. Therefore, we suggest a new platform of fetal sonographic evaluation and follow-up: the extensive systematic fetal organ screening should be performed at 14-17 weeks gestation by an expert sonographer who has been trained in the detection of fetal malformations. This examination should also include fetal cardiac echography. Three additional ultrasound examinations are suggested during pregnancy: the first, performed by the patient's obstetrician at 6-7 weeks for the exclusion of ectopic pregnancy, confirmation of fetal viability, dating, assessment of chorionicity in multiple gestations, and visualization of maternal adnexae. The other two, at 22-26 and 32-34 weeks, require less training and should be performed by an obstetrician who has been qualified in the sonographic detection of fetal anomalies. The advantages of early midtrimester targeted fetal systematic organ screening for the detection of fetal anomalies may dictate a global change.

  11. Solving the muon g-2 anomaly in deflected anomaly mediated SUSY breaking with messenger-matter interactions

    NASA Astrophysics Data System (ADS)

    Wang, Fei; Wang, Wenyu; Yang, Jin Min

    2017-10-01

    We propose to introduce general messenger-matter interactions in the deflected anomaly mediated supersymmetry (SUSY) breaking (AMSB) scenario to explain the gμ-2 anomaly. Scenarios with complete or incomplete grand unified theory (GUT) multiplet messengers are discussed, respectively. The introduction of incomplete GUT multiplets can be advantageous in various aspects. We found that the gμ-2 anomaly can be solved in both scenarios under current constraints including the gluino mass bounds, while the scenarios with incomplete GUT representation messengers are more favored by the gμ-2 data. We also found that the gluino is upper bounded by about 2.5 TeV (2.0 TeV) in scenario A and 3.0 TeV (2.7 TeV) in scenario B if the generalized deflected AMSB scenarios are used to fully account for the gμ-2 anomaly at the 3σ (2σ) level. Such a gluino should be accessible in the future LHC searches. Dark matter (DM) constraints, including DM relic density and direct detection bounds, favor scenario B with incomplete GUT multiplets. Much of the allowed parameter space for scenario B could be covered by the future DM direct detection experiments.

  12. Integrating age in the detection and mapping of incongruous patches in coffee (Coffea arabica) plantations using multi-temporal Landsat 8 NDVI anomalies

    NASA Astrophysics Data System (ADS)

    Chemura, Abel; Mutanga, Onisimo; Dube, Timothy

    2017-05-01

    The development of cost-effective, reliable and easy to implement crop condition monitoring methods is urgently required for perennial tree crops such as coffee (Coffea arabica), as they are grown over large areas and represent long-term, higher levels of investment. These monitoring methods are useful in identifying farm areas that experience poor crop growth, pest infestation, or disease outbreaks, and/or in monitoring response to management interventions. This study compares field-level coffee mean NDVI and LSWI anomalies and age-adjusted coffee mean NDVI and LSWI anomalies in identifying and mapping incongruous patches across perennial coffee plantations. To achieve this objective, we first derived the deviation of coffee pixels from the global coffee mean NDVI and LSWI values of nine sequential Landsat 8 OLI image scenes. We then evaluated the influence of coffee age class (young, mature and old) on Landsat-scale NDVI and LSWI values using a one-way ANOVA and, since results showed significant differences, we adjusted NDVI and LSWI anomalies for age class. We then used the cumulative inverse distribution function (α ≤ 0.05) to identify fields and within-field areas with excessive deviation of NDVI and LSWI from the global and the age-expected mean for each of the Landsat 8 OLI scene dates spanning three seasons. Results from accuracy assessment indicated that it was possible to separate incongruous and healthy patches using these anomalies and that NDVI performed better than LSWI for both global and age-adjusted mean anomalies. Using the age-adjusted anomalies performed better in separating incongruous and healthy patches than using the global mean for both NDVI (overall accuracy = 80.9% and 68.1%, respectively) and LSWI (overall accuracy = 68.1% and 48.9%, respectively). When applied to other Landsat 8 OLI scenes, the results showed that the proportions of coffee fields that were modelled incongruent decreased with time for the young age category and
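    The global versus age-adjusted anomaly computation described above can be sketched with numpy. This is an illustrative simplification, not the authors' code: the study works per Landsat scene and uses a cumulative inverse distribution function at α ≤ 0.05, which is approximated here by an empirical lower-tail quantile.

```python
import numpy as np

def ndvi_anomalies(ndvi, age_class, alpha=0.05):
    """Flag incongruous pixels whose NDVI deviates excessively from the
    mean, either globally or within their age class (hypothetical helper,
    not the authors' implementation)."""
    # global anomaly: deviation from the plantation-wide mean NDVI
    global_anom = ndvi - ndvi.mean()
    # age-adjusted anomaly: deviation from the pixel's own age-class mean
    age_anom = np.empty_like(ndvi)
    for cls in np.unique(age_class):
        mask = age_class == cls
        age_anom[mask] = ndvi[mask] - ndvi[mask].mean()
    # flag pixels in the lower alpha tail of each anomaly distribution
    flag_global = global_anom < np.quantile(global_anom, alpha)
    flag_age = age_anom < np.quantile(age_anom, alpha)
    return flag_global, flag_age
```

Comparing the two flag maps against ground-truthed incongruous patches is then a standard accuracy assessment.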

  13. Eyewitness confidence in simultaneous and sequential lineups: a criterion shift account for sequential mistaken identification overconfidence.

    PubMed

    Dobolyi, David G; Dodson, Chad S

    2013-12-01

    Confidence judgments for eyewitness identifications play an integral role in determining guilt during legal proceedings. Past research has shown that confidence in positive identifications is strongly associated with accuracy. Using a standard lineup recognition paradigm, we investigated accuracy using signal detection and ROC analyses, along with the tendency to choose a face with both simultaneous and sequential lineups. We replicated past findings of reduced rates of choosing with sequential as compared to simultaneous lineups, but notably found an accuracy advantage in favor of simultaneous lineups. Moreover, our analysis of the confidence-accuracy relationship revealed two key findings. First, we observed a sequential mistaken identification overconfidence effect: despite an overall reduction in false alarms, confidence for false alarms that did occur was higher with sequential lineups than with simultaneous lineups, with no differences in confidence for correct identifications. This sequential mistaken identification overconfidence effect is an expected byproduct of the use of a more conservative identification criterion with sequential than with simultaneous lineups. Second, we found a steady drop in confidence for mistaken identifications (i.e., foil identifications and false alarms) from the first to the last face in sequential lineups, whereas confidence in and accuracy of correct identifications remained relatively stable. Overall, we observed that sequential lineups are both less accurate and produce higher confidence false identifications than do simultaneous lineups. Given the increasing prominence of sequential lineups in our legal system, our data argue for increased scrutiny and possibly a wholesale reevaluation of this lineup format. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  14. Assessing the impact of background spectral graph construction techniques on the topological anomaly detection algorithm

    NASA Astrophysics Data System (ADS)

    Ziemann, Amanda K.; Messinger, David W.; Albano, James A.; Basener, William F.

    2012-06-01

    Anomaly detection algorithms have historically been applied to hyperspectral imagery in order to identify pixels whose material content is incongruous with the background material in the scene. Typically, the application involves extracting man-made objects from natural and agricultural surroundings. A large challenge in designing these algorithms is determining which pixels initially constitute the background material within an image. The topological anomaly detection (TAD) algorithm constructs a graph theory-based, fully non-parametric topological model of the background in the image scene, and uses codensity to measure deviation from this background. In TAD, the initial graph theory structure of the image data is created by connecting an edge between any two pixel vertices x and y if the Euclidean distance between them is less than some resolution r. While this type of proximity graph is among the most well-known approaches to building a geometric graph based on a given set of data, there is a wide variety of different geometrically-based techniques. In this paper, we present a comparative test of the performance of TAD across four different constructs of the initial graph: mutual k-nearest neighbor graph, sigma-local graph for two different values of σ > 1, and the proximity graph originally implemented in TAD.
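    Two of the graph constructions compared here, the proximity (ε-ball) graph originally used in TAD and the mutual k-nearest-neighbor graph, can be sketched in a few lines. This is an illustrative numpy sketch of the graph-building step only, not the TAD code, and the dense distance matrix would not scale to full hyperspectral scenes.

```python
import numpy as np

def proximity_graph(X, r):
    """Adjacency of the proximity graph used in TAD: connect pixel
    vertices x and y when ||x - y|| < r."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return (d < r) & ~np.eye(len(X), dtype=bool)

def mutual_knn_graph(X, k):
    """Mutual k-NN graph: an edge (x, y) is kept only if each vertex is
    among the other's k nearest neighbours."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]          # k nearest neighbours per node
    knn = np.zeros_like(d, dtype=bool)
    rows = np.repeat(np.arange(len(X)), k)
    knn[rows, nn.ravel()] = True
    return knn & knn.T                          # keep mutual edges only
```

The mutual k-NN construction is adaptive to local density, whereas the ε-ball graph uses one global resolution r; that difference is exactly what the comparative test probes.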

  15. Mining Building Energy Management System Data Using Fuzzy Anomaly Detection and Linguistic Descriptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumidu Wijayasekara; Ondrej Linda; Milos Manic

    Building Energy Management Systems (BEMSs) are essential components of modern buildings that utilize digital control technologies to minimize energy consumption while maintaining high levels of occupant comfort. However, BEMSs can only achieve these energy savings when properly tuned and controlled. Since the indoor environment depends on uncertain criteria such as weather, occupancy, and thermal state, the performance of a BEMS can be sub-optimal at times. Unfortunately, the complexity of the BEMS control mechanism, the large amount of available data, and the inter-relations between the data can make identifying these sub-optimal behaviors difficult. This paper proposes a novel Fuzzy Anomaly Detection and Linguistic Description (Fuzzy-ADLD)-based method for improving the understandability of BEMS behavior for improved state-awareness. The presented method is composed of two main parts: 1) detection of anomalous BEMS behavior and 2) linguistic representation of BEMS behavior. The first part utilizes a modified nearest neighbor clustering algorithm and a fuzzy logic rule extraction technique to build a model of normal BEMS behavior. The second part of the presented method computes the most relevant linguistic description of the identified anomalies. The presented Fuzzy-ADLD method was applied to a real-world BEMS and compared against a traditional alarm-based BEMS. In six different scenarios, the Fuzzy-ADLD method identified anomalous behavior either as fast as, or faster than (by an hour or more), the alarm-based BEMS. In addition, the Fuzzy-ADLD method identified cases that were missed by the alarm-based system, demonstrating potential for increased state-awareness of abnormal building behavior.
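    The first stage, building a model of normal behavior with nearest-neighbor clustering and flagging departures from it, might be sketched as follows. This is a minimal illustration only: the radius parameter and Euclidean distance are assumptions, and the fuzzy rule extraction and linguistic description stages are omitted.

```python
import numpy as np

def nn_cluster_model(X, radius):
    """Simple nearest-neighbour clustering: each observation starts a new
    cluster prototype unless it falls within `radius` of an existing one.
    A stand-in for the modified algorithm in the paper."""
    prototypes = [X[0]]
    for x in X[1:]:
        if min(np.linalg.norm(x - p) for p in prototypes) > radius:
            prototypes.append(x)
    return np.array(prototypes)

def is_anomalous(x, prototypes, radius):
    """Flag behaviour that is far from every prototype of normal operation."""
    return min(np.linalg.norm(x - p) for p in prototypes) > radius
```

In operation, the prototypes would be learned from historical BEMS sensor vectors, and new readings would be tested against them as they stream in.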

  16. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics when combined with Big Data technologies and predictive techniques have been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.

  17. Lunar magnetic anomalies detected by the Apollo subsatellite magnetometers

    USGS Publications Warehouse

    Hood, L.L.; Coleman, P.J.; Russell, C.T.; Wilhelms, D.E.

    1979-01-01

    Properties of lunar crustal magnetization thus far deduced from Apollo subsatellite magnetometer data are reviewed using two of the most accurate presently available magnetic anomaly maps - one covering a portion of the lunar near side and the other a part of the far side. The largest single anomaly found within the region of coverage on the near-side map correlates exactly with a conspicuous, light-colored marking in western Oceanus Procellarum called Reiner Gamma. This feature is interpreted as an unusual deposit of ejecta from secondary craters of the large nearby primary impact crater Cavalerius. An age for Cavalerius (and, by implication, for Reiner Gamma) of 3.2 ± 0.2 × 10⁹ y is estimated. The main (30 × 60 km) Reiner Gamma deposit is nearly uniformly magnetized in a single direction, with a minimum mean magnetization intensity of ≈7 × 10⁻² G cm³/g (assuming a density of 3 g/cm³), or about 700 times the stable magnetization component of the most magnetic returned samples. Additional medium-amplitude anomalies exist over the Fra Mauro Formation (Imbrium basin ejecta emplaced ≈3.9 × 10⁹ y ago) where it has not been flooded by mare basalt flows, but are nearly absent over the maria and over the craters Copernicus, Kepler, and Reiner and their encircling ejecta mantles. The mean altitude of the far-side anomaly map is much higher than that of the near-side map and the surface geology is more complex, so individual anomaly sources have not yet been identified. However, it is clear that a concentration of especially strong sources exists in the vicinity of the craters Van de Graaff and Aitken. Numerical modeling of the associated fields reveals that the source locations do not correspond with the larger primary impact craters of the region and, by analogy with Reiner Gamma, may be less conspicuous secondary crater ejecta deposits.
The reason for a special concentration of strong sources in the Van de Graaff-Aitken region is unknown, but may be indirectly

  18. Sequential change detection and monitoring of temporal trends in random-effects meta-analysis.

    PubMed

    Dogo, Samson Henry; Clark, Allan; Kulinskaya, Elena

    2017-06-01

    Temporal changes in the magnitude of effect sizes reported in many areas of research are a threat to the credibility of the results and conclusions of meta-analysis. Numerous sequential methods for meta-analysis have been proposed to detect changes and monitor trends in effect sizes so that a meta-analysis can be updated when necessary and interpreted based on the time it was conducted. The difficulties of sequential meta-analysis under the random-effects model are caused by dependencies in increments introduced by the estimation of the heterogeneity parameter τ². In this paper, we propose the use of a retrospective cumulative sum (CUSUM)-type test with bootstrap critical values. This method allows retrospective analysis of the past trajectory of cumulative effects in random-effects meta-analysis and its visualization on a chart similar to a CUSUM chart. Simulation results show that the new method demonstrates good control of Type I error regardless of the number or size of the studies and the amount of heterogeneity. Application of the new method is illustrated on two examples of medical meta-analyses. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
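    A generic retrospective CUSUM-type test with bootstrap critical values can be sketched as follows. This is a simplified illustration of the idea on a plain sequence of effect estimates, not the authors' method, which additionally handles the dependence induced by estimating the heterogeneity parameter τ².

```python
import numpy as np

def cusum_stat(y):
    """Max absolute standardized cumulative sum of deviations from the mean."""
    y = np.asarray(y, float)
    s = np.cumsum(y - y.mean())
    return np.max(np.abs(s)) / (y.std(ddof=1) * np.sqrt(len(y)))

def cusum_change_test(y, n_boot=2000, alpha=0.05, seed=0):
    """Retrospective CUSUM-type change test with bootstrap critical values.
    The no-change null is approximated by resampling with replacement,
    which destroys any ordering in the sequence."""
    rng = np.random.default_rng(seed)
    obs = cusum_stat(y)
    boot = [cusum_stat(rng.choice(y, size=len(y), replace=True))
            for _ in range(n_boot)]
    crit = np.quantile(boot, 1 - alpha)
    return obs, crit, obs > crit
```

Plotting the cumulative sums behind `cusum_stat` over study index gives the CUSUM-style chart the paper uses for visualizing the trajectory of cumulative effects.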

  19. Anomaly Detection in Host Signaling Pathways for the Early Prognosis of Acute Infection.

    PubMed

    Wang, Kun; Langevin, Stanley; O'Hern, Corey S; Shattuck, Mark D; Ogle, Serenity; Forero, Adriana; Morrison, Juliet; Slayden, Richard; Katze, Michael G; Kirby, Michael

    2016-01-01

    Clinical diagnosis of acute infectious diseases during the early stages of infection is critical to administering the appropriate treatment to improve the disease outcome. We present a data-driven analysis of the human cellular response to respiratory viruses including influenza, respiratory syncytial virus, and human rhinovirus, and compared this with the response to the bacterial endotoxin, lipopolysaccharide (LPS). Using an anomaly detection framework we identified pathways that clearly distinguish between asymptomatic and symptomatic patients infected with the four different respiratory viruses and that accurately diagnosed patients exposed to a bacterial infection. Connectivity pathway analysis comparing the viral and bacterial diagnostic signatures identified host cellular pathways that were unique to patients exposed to LPS endotoxin, indicating this type of analysis could be used to identify host biomarkers that can differentiate clinical etiologies of acute infection. We applied the Multivariate State Estimation Technique (MSET) on two human influenza (H1N1 and H3N2) gene expression data sets to define host networks perturbed in the asymptomatic phase of infection. Our analysis identified pathways in the respiratory virus diagnostic signature as prognostic biomarkers that triggered prior to clinical presentation of acute symptoms. These early warning pathways correctly predicted that almost half of the subjects would become symptomatic in less than forty hours post-infection and that three of the 18 subjects would become symptomatic after only 8 hours. These results provide a proof-of-concept for the utility of anomaly detection algorithms to classify host pathway signatures that can identify presymptomatic signatures of acute diseases and differentiate between etiologies of infection. On a global scale, acute respiratory infections cause a significant proportion of human co-morbidities and account for 4.25 million deaths annually. 
The development of clinical

  20. Anomaly Detection in Host Signaling Pathways for the Early Prognosis of Acute Infection

    PubMed Central

    O’Hern, Corey S.; Shattuck, Mark D.; Ogle, Serenity; Forero, Adriana; Morrison, Juliet; Slayden, Richard; Katze, Michael G.

    2016-01-01

    Clinical diagnosis of acute infectious diseases during the early stages of infection is critical to administering the appropriate treatment to improve the disease outcome. We present a data-driven analysis of the human cellular response to respiratory viruses including influenza, respiratory syncytial virus, and human rhinovirus, and compared this with the response to the bacterial endotoxin, lipopolysaccharide (LPS). Using an anomaly detection framework we identified pathways that clearly distinguish between asymptomatic and symptomatic patients infected with the four different respiratory viruses and that accurately diagnosed patients exposed to a bacterial infection. Connectivity pathway analysis comparing the viral and bacterial diagnostic signatures identified host cellular pathways that were unique to patients exposed to LPS endotoxin, indicating this type of analysis could be used to identify host biomarkers that can differentiate clinical etiologies of acute infection. We applied the Multivariate State Estimation Technique (MSET) on two human influenza (H1N1 and H3N2) gene expression data sets to define host networks perturbed in the asymptomatic phase of infection. Our analysis identified pathways in the respiratory virus diagnostic signature as prognostic biomarkers that triggered prior to clinical presentation of acute symptoms. These early warning pathways correctly predicted that almost half of the subjects would become symptomatic in less than forty hours post-infection and that three of the 18 subjects would become symptomatic after only 8 hours. These results provide a proof-of-concept for the utility of anomaly detection algorithms to classify host pathway signatures that can identify presymptomatic signatures of acute diseases and differentiate between etiologies of infection. On a global scale, acute respiratory infections cause a significant proportion of human co-morbidities and account for 4.25 million deaths annually. 
The development of clinical

  1. Congenital basis of posterior fossa anomalies

    PubMed Central

    Cotes, Claudia; Bonfante, Eliana; Lazor, Jillian; Jadhav, Siddharth; Caldas, Maria; Swischuk, Leonard

    2015-01-01

    The classification of posterior fossa congenital anomalies has been a controversial topic. Advances in genetics and imaging have allowed a better understanding of the embryologic development of these abnormalities. A new classification schema correlates the embryologic, morphologic, and genetic bases of these anomalies in order to better distinguish and describe them. Although they provide a better understanding of the clinical aspects and genetics of these disorders, it is crucial for the radiologist to be able to diagnose the congenital posterior fossa anomalies based on their morphology, since neuroimaging is usually the initial step when these disorders are suspected. We divide the most common posterior fossa congenital anomalies into two groups: 1) hindbrain malformations, including diseases with cerebellar or vermian agenesis, aplasia or hypoplasia and cystic posterior fossa anomalies; and 2) cranial vault malformations. In addition, we will review the embryologic development of the posterior fossa and, from the perspective of embryonic development, will describe the imaging appearance of congenital posterior fossa anomalies. Knowledge of the developmental bases of these malformations facilitates detection of the morphological changes identified on imaging, allowing accurate differentiation and diagnosis of congenital posterior fossa anomalies. PMID:26246090

  2. Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data

    NASA Technical Reports Server (NTRS)

    Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney

    2012-01-01

    This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, the multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and failures, biases, and trends in flight actuators. These anomalies were missed by the existing airline monitoring of FOQA data exceedances.
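    The core idea, fit a fleet-wide performance regression and flag flights whose residuals are extreme, can be illustrated with a simplified single-level sketch. The robust scale estimate and threshold below are assumptions for illustration, not the DFM algorithm's multi-level model.

```python
import numpy as np

def flag_anomalous_flights(X, y, flight_id, z_thresh=4.0):
    """Fit one global linear performance model and flag flights whose mean
    residual is extreme relative to a robust scale (illustrative stand-in
    for DFM's multi-level regression)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    # robust scale (MAD) so anomalous flights don't inflate the threshold
    mad = 1.4826 * np.median(np.abs(resid - np.median(resid)))
    return {f: abs(resid[flight_id == f].mean()) > z_thresh * mad
            for f in np.unique(flight_id)}
```

In the real system the regression columns would be physically meaningful quantities (e.g., terms of an aerodynamic model), so the fitted coefficients themselves are interpretable performance parameters.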

  3. Pediatric tinnitus: Incidence of imaging anomalies and the impact of hearing loss.

    PubMed

    Kerr, Rhorie; Kang, Elise; Hopkins, Brandon; Anne, Samantha

    2017-12-01

    Guidelines exist for the evaluation and management of tinnitus in adults; however, a lack of evidence in children limits the applicability of these guidelines to pediatric patients. The objective of this study is to determine the incidence of inner ear anomalies detected on imaging studies within the pediatric population with tinnitus and to evaluate whether the presence of hearing loss increases the rate of detection of anomalies in comparison to normal hearing patients. Retrospective review of all children with a diagnosis of tinnitus from 2010 to 2015 at a tertiary care academic center. 102 pediatric patients with tinnitus were identified. Overall, 53 patients had imaging studies, with 6 abnormal findings (11.3%). 51/102 patients had hearing loss, of whom 33 had imaging studies, in which 6 inner ear anomalies were detected. This is an incidence of 18.2% for inner ear anomalies identified in patients with hearing loss (95% confidence interval (CI) 7.0-35.5%). 4 of these 6 inner ear anomalies were vestibular aqueduct abnormalities; the other two were cochlear hypoplasia and bilateral semicircular canal dysmorphism. 51 patients had no hearing loss and of these patients, 20 had imaging studies, with no inner ear abnormalities detected. There was no statistical difference in the incidence of abnormal imaging findings between patients with and without hearing loss (Fisher's exact test, p = 0.072). CONCLUSION: There is a high incidence of anomalies detected in imaging studies performed in pediatric patients with tinnitus, especially in the presence of hearing loss. Copyright © 2017 Elsevier B.V. All rights reserved.
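    The reported Fisher's exact test can be reproduced from the counts in the abstract (6 abnormal of 33 imaged patients with hearing loss vs 0 abnormal of 20 imaged patients without). A stdlib-only sketch of the two-sided test:

```python
from math import comb

def fisher_exact_two_sided(table):
    """Two-sided Fisher's exact test: sum the probabilities of all 2x2
    tables with the same margins whose hypergeometric probability does
    not exceed that of the observed table."""
    (a, b), (c, d) = table
    row1, col1, n = a + b, a + c, a + b + c + d
    def p(x):  # hypergeometric probability of x in the top-left cell
        return comb(row1, x) * comb(n - row1, col1 - x) / comb(n, col1)
    p_obs = p(a)
    lo, hi = max(0, col1 - (n - row1)), min(row1, col1)
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs * (1 + 1e-9))

# The study's 2x2 table: abnormal vs normal imaging, with vs without hearing loss
p_value = fisher_exact_two_sided([[6, 27], [0, 20]])   # ≈ 0.072
```

The small tolerance factor guards against floating-point ties when comparing table probabilities.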

  4. Visual detection and sequential injection determination of aluminium using a cinnamoyl derivative.

    PubMed

    Elečková, Lenka; Alexovič, Michal; Kuchár, Juraj; Balogh, Ioseph S; Andruch, Vasil

    2015-02-01

    A cinnamoyl derivative, 3-[4-(dimethylamino)cinnamoyl]-4-hydroxy-6-methyl-3,4-2H-pyran-2-one, was used as a ligand for the determination of aluminium. Upon the addition of an acetonitrile solution of the ligand to an aqueous solution containing Al(III) and a buffer solution at pH 8, a marked change in colour from yellow to orange is observed. The colour intensity is proportional to the concentration of Al(III); thus, the 'naked-eye' detection of aluminium is possible. The reaction is also applied for sequential injection determination of aluminium. Beer's law is obeyed in the range from 0.055 to 0.66 mg L⁻¹ of Al(III). The limit of detection, calculated as three times the standard deviation of the blank test (n=10), was found to be 4 μg L⁻¹ for Al(III). The method was applied for the determination of aluminium in spiked water samples and pharmaceutical preparations. Copyright © 2014 Elsevier B.V. All rights reserved.
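    The quoted detection limit follows the standard 3×SD-of-blank definition. A minimal sketch of that calculation; the blank absorbance readings and calibration slope below are invented for illustration, not taken from the paper:

```python
import statistics

def limit_of_detection(blank_signals, slope):
    """LOD as 3x the standard deviation of replicate blank measurements,
    converted to concentration units via the calibration slope."""
    return 3 * statistics.stdev(blank_signals) / slope

# Hypothetical n=10 blank absorbances and a slope in absorbance per mg/L
blanks = [0.010, 0.012, 0.011, 0.009, 0.010,
          0.011, 0.010, 0.012, 0.009, 0.011]
lod_mg_per_L = limit_of_detection(blanks, slope=0.8)
```

Dividing by the calibration slope converts the signal-domain threshold into a concentration, here on the order of a few μg L⁻¹.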

  5. Time series of GNSS-derived ionospheric maps to detect anomalies as possible precursors of high magnitude earthquakes

    NASA Astrophysics Data System (ADS)

    Barbarella, M.; De Giglio, M.; Galeandro, A.; Mancini, F.

    2012-04-01

    The modification of some atmospheric physical properties prior to a high-magnitude earthquake has recently been debated within the Lithosphere-Atmosphere-Ionosphere (LAI) Coupling model. Among this variety of phenomena, the ionization of air at the higher levels of the atmosphere, called the ionosphere, is investigated in this work. Such ionization events could be caused by possible leaking of gases from the earth's crust, and their presence has been detected around the time of high-magnitude earthquakes by several authors. However, the spatial scale and temporal domain over which such disturbances come into evidence are still a controversial item. Even though ionospheric activity could be investigated by different methodologies (satellite or terrestrial measurements), we selected the production of ionospheric maps from the analysis of GNSS (Global Navigation Satellite System) data as a possible way to detect anomalies prior to a seismic event over a wide area around the epicentre. It is well known that, in the GNSS sciences, ionospheric activity can be probed by the analysis of refraction phenomena occurring on the dual-frequency signals along the satellite-to-receiver path. The analysis of refraction phenomena affecting data acquired by GNSS permanent trackers can produce daily to hourly maps representing the spatial distribution of the ionospheric Total Electron Content (TEC) as an index of the ionization degree in the upper atmosphere. The presence of large ionospheric anomalies could therefore be interpreted in the LAI Coupling model as a precursor signal of a strong earthquake, especially when other precursors (thermal anomalies and/or gas fluxes) can also be detected. 
In this work, a six-month long series of ionospheric maps produced from GNSS data collected by a network of 49 GPS permanent stations distributed within an area around the city of L'Aquila (Abruzzi, Italy), where an earthquake (M = 6.3) occurred on April 6, 2009

  6. Active Learning with Rationales for Identifying Operationally Significant Anomalies in Aviation

    NASA Technical Reports Server (NTRS)

    Sharma, Manali; Das, Kamalika; Bilgic, Mustafa; Matthews, Bryan; Nielsen, David Lynn; Oza, Nikunj C.

    2016-01-01

    A major focus of the commercial aviation community is discovery of unknown safety events in flight operations data. Data-driven unsupervised anomaly detection methods are better at capturing unknown safety events compared to rule-based methods which only look for known violations. However, not all statistical anomalies that are discovered by these unsupervised anomaly detection methods are operationally significant (e.g., represent a safety concern). Subject Matter Experts (SMEs) have to spend significant time reviewing these statistical anomalies individually to identify a few operationally significant ones. In this paper we propose an active learning algorithm that incorporates SME feedback in the form of rationales to build a classifier that can distinguish between uninteresting and operationally significant anomalies. Experimental evaluation on real aviation data shows that our approach improves detection of operationally significant events by as much as 75% compared to the state-of-the-art. The learnt classifier also generalizes well to additional validation data sets.

  7. Aeromagnetic anomalies over faulted strata

    USGS Publications Warehouse

    Grauch, V.J.S.; Hudson, Mark R.

    2011-01-01

    High-resolution aeromagnetic surveys are now an industry standard and they commonly detect anomalies that are attributed to faults within sedimentary basins. However, detailed studies identifying geologic sources of magnetic anomalies in sedimentary environments are rare in the literature. Opportunities to study these sources have come from well-exposed sedimentary basins of the Rio Grande rift in New Mexico and Colorado. High-resolution aeromagnetic data from these areas reveal numerous, curvilinear, low-amplitude (2–15 nT at 100-m terrain clearance) anomalies that consistently correspond to intrasedimentary normal faults (Figure 1). Detailed geophysical and rock-property studies provide evidence for the magnetic sources at several exposures of these faults in the central Rio Grande rift (summarized in Grauch and Hudson, 2007, and Hudson et al., 2008). A key result is that the aeromagnetic anomalies arise from the juxtaposition of magnetically differing strata at the faults as opposed to chemical processes acting at the fault zone. The studies also provide (1) guidelines for understanding and estimating the geophysical parameters controlling aeromagnetic anomalies at faulted strata (Grauch and Hudson), and (2) observations on key geologic factors that are favorable for developing similar sedimentary sources of aeromagnetic anomalies elsewhere (Hudson et al.).

  8. Radioactive anomaly discrimination from spectral ratios

    DOEpatents

    Maniscalco, James; Sjoden, Glenn; Chapman, Mac Clements

    2013-08-20

    A method for discriminating a radioactive anomaly from naturally occurring radioactive materials includes detecting a first number of gamma photons having energies in a first range of energy values within a predetermined period of time, and detecting a second number of gamma photons having energies in a second range of energy values within the same period. The method further includes determining, in a controller, the ratio of the first number of gamma photons to the second number, and determining that a radioactive anomaly is present when the ratio exceeds a threshold value.
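    The discrimination logic described in the patent abstract reduces to a count-ratio threshold over two energy windows. A minimal sketch; the threshold value and the zero-count handling are assumptions, not specified by the abstract:

```python
def anomaly_present(counts_low, counts_high, threshold):
    """Spectral-ratio discrimination: compare the ratio of gamma-photon
    counts in two energy windows, collected over the same period,
    against a threshold."""
    if counts_high == 0:
        return False          # no basis for a ratio (assumed behaviour)
    return counts_low / counts_high > threshold
```

In practice the two windows would be chosen so that natural background produces a characteristic ratio, which the anomalous source distorts.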

  9. Gravity anomalies on Venus

    NASA Technical Reports Server (NTRS)

    Sjogren, W. L.; Phillips, R. J.; Birkeland, P. W.; Wimberly, R. N.

    1980-01-01

    Doppler radio tracking of the Pioneer Venus orbiter has provided gravity measures over a significant portion of Venus. Feature resolution is approximately 300-1000 km within an area extending from 10 deg S to 40 deg N latitude and from 70 deg W to 130 deg E longitude (approximately equal to 200 deg). Many anomalies were detected, and there is considerable correlation with radar altimetry topography (Pettengill et al., 1980). The amplitudes of the anomalies are relatively mild and similar to those on earth at this resolution. Calculations for isostatic adjustment reveal that significant compensation has occurred.

  10. Anomaly and signature filtering improve classifier performance for detection of suspicious access to EHRs.

    PubMed

    Kim, Jihoon; Grillo, Janice M; Boxwala, Aziz A; Jiang, Xiaoqian; Mandelbaum, Rose B; Patel, Bhakti A; Mikels, Debra; Vinterbo, Staal A; Ohno-Machado, Lucila

    2011-01-01

    Our objective is to facilitate semi-automated detection of suspicious access to EHRs. Previously we have shown that a machine learning method can play a role in identifying potentially inappropriate access to EHRs. However, the problem of sampling informative instances to build a classifier still remained. We developed an integrated filtering method leveraging both anomaly detection based on symbolic clustering and signature detection, a rule-based technique. We applied the integrated filtering to 25.5 million access records in an intervention arm, and compared this with 8.6 million access records in a control arm where no filtering was applied. On the training set with cross-validation, the AUC was 0.960 in the control arm and 0.998 in the intervention arm. The difference in false negative rates on the independent test set was significant, P = 1.6×10⁻⁶. Our study suggests that utilization of integrated filtering strategies to facilitate the construction of classifiers can be helpful.

  11. Anomaly and Signature Filtering Improve Classifier Performance For Detection Of Suspicious Access To EHRs

    PubMed Central

    Kim, Jihoon; Grillo, Janice M; Boxwala, Aziz A; Jiang, Xiaoqian; Mandelbaum, Rose B; Patel, Bhakti A; Mikels, Debra; Vinterbo, Staal A; Ohno-Machado, Lucila

    2011-01-01

    Our objective is to facilitate semi-automated detection of suspicious access to EHRs. Previously we have shown that a machine learning method can play a role in identifying potentially inappropriate access to EHRs. However, the problem of sampling informative instances to build a classifier still remained. We developed an integrated filtering method leveraging both anomaly detection based on symbolic clustering and signature detection, a rule-based technique. We applied the integrated filtering to 25.5 million access records in an intervention arm, and compared this with 8.6 million access records in a control arm where no filtering was applied. On the training set with cross-validation, the AUC was 0.960 in the control arm and 0.998 in the intervention arm. The difference in false negative rates on the independent test set was significant, P=1.6×10−6. Our study suggests that utilization of integrated filtering strategies to facilitate the construction of classifiers can be helpful. PMID:22195129

  12. The Compact Environmental Anomaly Sensor (CEASE) III

    NASA Astrophysics Data System (ADS)

    Roddy, P.; Hilmer, R. V.; Ballenthin, J.; Lindstrom, C. D.; Barton, D. A.; Ignazio, J. M.; Coombs, J. M.; Johnston, W. R.; Wheelock, A. T.; Quigley, S.

    2016-12-01

    The Air Force Research Laboratory's Energetic Charged Particle (ECP) sensor project is a comprehensive effort to measure the charged particle environment that causes satellite anomalies. The project includes the Compact Environmental Anomaly Sensor (CEASE) III, building on the flight heritage of prior CEASE designs. CEASE III consists of multiple sensor modules. High-energy particles are observed using independent, unique silicon detector stacks. In addition, CEASE III includes an electrostatic analyzer (ESA) assembly which uses charge multiplication for particle detection. The sensors cover a wide range of proton and electron energies that contribute to satellite anomalies.

  13. A reversible fluorescence "off-on-off" sensor for sequential detection of aluminum and acetate/fluoride ions.

    PubMed

    Gupta, Vinod Kumar; Mergu, Naveen; Kumawat, Lokesh Kumar; Singh, Ashok Kumar

    2015-11-01

    A new rhodamine-functionalized fluorogenic Schiff base CS was synthesized and its colorimetric and fluorescence responses toward various metal ions were explored. The sensor exhibited a highly selective and sensitive colorimetric and "off-on" fluorescence response towards Al³⁺ in the presence of other competing metal ions. These spectral changes are large enough in the visible region of the spectrum to enable naked-eye detection. Studies proved that the formation of the CS–Al³⁺ complex is fully reversible and that the complex can sense AcO⁻/F⁻ via dissociation. The results revealed that the sensor provides a fluorescence "off-on-off" strategy for the sequential detection of Al³⁺ and AcO⁻/F⁻. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Prevalence of dental anomalies in Saudi orthodontic patients.

    PubMed

    Al-Jabaa, Aljazi H; Aldrees, Abdullah M

    2013-07-01

    This study aimed to investigate the prevalence of dental anomalies and study the association of these anomalies with different types of malocclusion in a random sample of Saudi orthodontic patients. Six hundred and two randomly selected pretreatment records including orthopantomographs (OPG), and study models were evaluated. The molar relationship was determined using pretreatment study models, and OPG were examined to investigate the prevalence of dental anomalies among the sample. The most common types of the investigated anomalies were impaction, followed by hypodontia, microdontia, macrodontia, ectopic eruption, and supernumerary teeth. No statistically significant correlations were observed between sex and dental anomalies. Dental anomalies were most commonly found in class I, followed by asymmetric molar relation, then class II and finally class III molar relation. No malocclusion group had a statistically significant relation with any individual dental anomaly. The prevalence of dental anomalies among Saudi orthodontic patients was higher than in the general population. Although orthodontic patients have been reported to have high rates of dental anomalies, orthodontists often fail to consider this. If not detected, dental anomalies can complicate dental and orthodontic treatment; therefore, their presence should be carefully investigated during orthodontic diagnosis and considered during treatment planning.

  15. Millimeter Wave Detection of Localized Anomalies in the Space Shuttle External Fuel Tank Insulating Foam and Acreage Heat Tiles

    NASA Technical Reports Server (NTRS)

    Kharkovsky, S.; Case, J. T.; Zoughi, R.; Hepburn, F.

    2005-01-01

    The Space Shuttle Columbia's catastrophic accident emphasizes the growing need for developing and applying effective, robust and life-cycle oriented nondestructive testing (NDT) methods for inspecting the shuttle external fuel tank spray-on foam insulation (SOFI) and its protective acreage heat tiles. Millimeter wave NDT techniques were among the methods chosen to evaluate their potential for inspecting these structures. Several panels with embedded anomalies (mainly voids) were produced and tested for this purpose. Near-field and far-field millimeter wave NDT methods were used for producing millimeter wave images of the anomalies in the SOFI panels and heat tiles. This paper presents the results of an investigation for the purpose of detecting localized anomalies in two SOFI panels and a set of heat tiles. To this end, reflectometers at a relatively wide range of frequencies (Ka-band (26.5 - 40 GHz) to W-band (75 - 110 GHz)) and utilizing different types of radiators were employed. The results clearly illustrate the utility of these methods for this purpose.

  16. Online anomaly detection in wireless body area networks for reliable healthcare monitoring.

    PubMed

    Salem, Osman; Liu, Yaning; Mehaoua, Ahmed; Boutaba, Raouf

    2014-09-01

    In this paper, we propose a lightweight approach for online detection of faulty measurements by analyzing the data collected from medical wireless body area networks. The proposed framework performs sequential data analysis using a smart phone as a base station, and takes into account the constrained resources of the smart phone, such as processing power and storage capacity. The main objective is to raise alarms only when patients enter an emergency situation, and to discard false alarms triggered by faulty measurements or ill-behaved sensors. The proposed approach combines the Haar wavelet decomposition, nonseasonal Holt-Winters forecasting, and the Hampel filter for spatial and temporal analysis. Our objective is to reduce false alarms resulting from unreliable measurements and to reduce unnecessary healthcare intervention. We apply our proposed approach to a real physiological dataset. Our experimental results prove the effectiveness of our approach in achieving good detection accuracy with a low false alarm rate. The simplicity and the processing speed of our proposed framework make it useful and efficient for real-time diagnosis.
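
    One of the named building blocks, the Hampel filter, can be sketched as a sliding-window median test. This is a generic illustration only, not the authors' implementation: the window size, threshold k, and the sample heart-rate stream below are assumed values.

    ```python
    from statistics import median

    def hampel_flags(values, window=5, k=3.0):
        """Flag samples that deviate from the local median by more than
        k scaled median absolute deviations (a Hampel-type outlier test)."""
        half = window // 2
        flags = []
        for i, x in enumerate(values):
            lo, hi = max(0, i - half), min(len(values), i + half + 1)
            win = values[lo:hi]
            med = median(win)
            mad = median([abs(v - med) for v in win])
            flags.append(abs(x - med) > k * 1.4826 * mad)
        return flags

    # hypothetical heart-rate stream with one faulty spike at index 4
    readings = [72, 73, 71, 72, 180, 73, 72, 71, 74, 72]
    flags = hampel_flags(readings)
    ```

    Only the spike is flagged; the surrounding normal readings pass, which is the behavior needed to suppress false alarms from isolated sensor faults.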

  17. Automatic detection of multiple UXO-like targets using magnetic anomaly inversion and self-adaptive fuzzy c-means clustering

    NASA Astrophysics Data System (ADS)

    Yin, Gang; Zhang, Yingtang; Fan, Hongbo; Ren, Guoquan; Li, Zhining

    2017-12-01

    We have developed a method for automatically detecting UXO-like targets based on magnetic anomaly inversion and self-adaptive fuzzy c-means clustering. Magnetic anomaly inversion methods are used to estimate the initial locations of multiple UXO-like sources. Although these initial locations have some errors with respect to the real positions, they form dense clouds around the actual positions of the magnetic sources. Then we use the self-adaptive fuzzy c-means clustering algorithm to cluster these initial locations. The estimated number of cluster centroids represents the number of targets and the cluster centroids are regarded as the locations of magnetic targets. Effectiveness of the method has been demonstrated using synthetic datasets. Computational results show that the proposed method can be applied to the case of several UXO-like targets that are randomly scattered within a confined, shallow subsurface volume. A field test was carried out to test the validity of the proposed method and the experimental results show that the prearranged magnets can be detected unambiguously and located precisely.
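
    The clustering stage can be illustrated with a plain fuzzy c-means implementation over synthetic clouds of location estimates. The self-adaptive selection of the number of clusters described in the record is not shown here (c is fixed), and the two-target geometry is invented for illustration.

    ```python
    import numpy as np

    def fuzzy_c_means(points, c, m=2.0, iters=50):
        """Minimal fuzzy c-means (fixed number of clusters c).
        Returns (centroids, membership matrix u)."""
        points = np.asarray(points, dtype=float)
        # deterministic init: c points spread across the data set
        centroids = points[np.linspace(0, len(points) - 1, c).astype(int)].copy()
        for _ in range(iters):
            # distances of every point to every centroid
            d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
            d = np.maximum(d, 1e-12)
            # fuzzy membership update, then weighted centroid update
            inv = d ** (-2.0 / (m - 1.0))
            u = inv / inv.sum(axis=1, keepdims=True)
            um = u ** m
            centroids = (um.T @ points) / um.sum(axis=0)[:, None]
        return centroids, u

    # synthetic clouds of initial source-location estimates around two targets
    rng = np.random.default_rng(1)
    cloud_a = rng.normal([0.0, 0.0], 0.1, size=(40, 2))
    cloud_b = rng.normal([5.0, 5.0], 0.1, size=(40, 2))
    estimates = np.vstack([cloud_a, cloud_b])
    centers, u = fuzzy_c_means(estimates, c=2)
    ```

    The recovered centroids fall near the centers of the two clouds, mirroring how the dense clouds of inversion estimates are condensed into per-target locations.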

  18. Anomaly-specified virtual dimensionality

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Yu; Paylor, Drew; Chang, Chein-I.

    2013-09-01

    Virtual dimensionality (VD) has received considerable interest as a means of estimating the number of spectrally distinct signatures, denoted by p. Unfortunately, VD provides no specific definition of what a spectrally distinct signature is. As a result, different types of spectrally distinct signatures determine different values of VD; there is no one-size-fits-all value. To address this issue, this paper presents a new concept, referred to as anomaly-specified VD (AS-VD), which determines the number of anomalies of interest present in the data. Specifically, two types of anomaly detection algorithms are of particular interest: the sample covariance matrix K-based anomaly detector developed by Reed and Yu, referred to as K-RXD, and the sample correlation matrix R-based RXD, referred to as R-RXD. Since K-RXD is determined only by second-order statistics, whereas R-RXD is specified by statistics of the first two orders (including the sample mean as the first-order statistic), the values determined by K-RXD and R-RXD will differ. Experiments are conducted in comparison with widely used eigen-based approaches.
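
    The K-RXD/R-RXD distinction can be illustrated with a minimal sketch of the RX score (squared Mahalanobis distance of each pixel spectrum): centering with the sample mean gives the covariance-based variant, skipping the centering gives the correlation-based one. The function name and the synthetic scene below are ours, not from the paper.

    ```python
    import numpy as np

    def rxd_scores(pixels, center=True):
        """RX anomaly scores: squared Mahalanobis distance of each pixel
        spectrum, using the sample covariance (K-RXD, center=True) or the
        uncentered second-moment/correlation matrix (R-RXD, center=False)."""
        X = np.asarray(pixels, dtype=float)
        mu = X.mean(axis=0) if center else np.zeros(X.shape[1])
        Z = X - mu
        M = Z.T @ Z / len(X)
        Minv = np.linalg.inv(M)
        # diag(Z M^-1 Z^T), one score per pixel
        return np.einsum('ij,jk,ik->i', Z, Minv, Z)

    # synthetic 4-band scene: 99 background pixels plus one implanted anomaly
    rng = np.random.default_rng(0)
    background = rng.normal(0.5, 0.05, size=(99, 4))
    scene = np.vstack([background, [[0.9, 0.1, 0.9, 0.1]]])
    k_scores = rxd_scores(scene, center=True)
    r_scores = rxd_scores(scene, center=False)
    ```

    Both variants rank the implanted pixel highest here, but because R-RXD folds the sample mean into its matrix, its scores generally differ from K-RXD's, which is the source of the differing AS-VD values discussed above.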

  19. Sequential extractions of selenium soils from Stewart Lake: total selenium and speciation measurements with ICP-MS detection.

    PubMed

    Ponce de León, Claudia A; DeNicola, Katie; Montes Bayón, Maria; Caruso, Joseph A

    2003-06-01

    Different techniques were evaluated to determine the most efficient procedure for extracting selenium from soil, as required for speciation. Selenium-contaminated sediments from Stewart Lake Wetland, California were used. A strong acid mineralization of the samples gives quantitative total selenium, which is then used to estimate recoveries for the milder extraction methods. The different extraction methodologies involve the sequential use of water, buffer (phosphate, pH 7) and either acid solutions (e.g. HNO3 or HCl) or basic solutions (e.g. ammonium acetate, NaOH or TMAH). Pyrophosphate extraction was also evaluated and showed that selenium was not associated with humic acids. The extractants were subsequently analyzed by size exclusion chromatography (SEC) with UV (254 and 400 nm) and on-line ICP-MS detection, by anion exchange chromatography, and by ion-pair reversed-phase chromatography with ICP-MS detection. For sequential extractions, the extraction efficiencies showed that the basic extractions were more efficient than the acidic ones. The difference between the acidic and the basic extraction efficiency is carried over to the sulfite extraction, suggesting that whatever is not extracted by the acid is subsequently extracted by the sulfite. The species identified with the different chromatographies were selenate, selenite, elemental selenium and some organic selenium.

  20. A Bayesian sequential processor approach to spectroscopic portal system decisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sale, K; Candy, J; Breitfeller, E

    The development of faster, more reliable techniques to detect radioactive contraband in a portal-type scenario is an extremely important problem, especially in this era of constant terrorist threats. Towards this goal, the development of a model-based, Bayesian sequential data processor for the detection problem is discussed. In the sequential processor each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The nature of the sequential processor approach is that a detection is produced as soon as it is statistically justified by the data, rather than waiting for a fixed counting interval before any analysis is performed. In this paper the Bayesian model-based approach, physics and signal processing models, and decision functions are discussed along with the first results of our research.
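
    The "detect as soon as statistically justified" idea can be illustrated with a drastically simplified two-hypothesis version: background-only versus background-plus-source Poisson counting, with the posterior updated after each interval's count. The rates, threshold, and count stream below are invented for illustration; the paper's full model also uses energy deposits and pulse arrival times, which this sketch omits.

    ```python
    import math

    def sequential_detector(counts, bg_rate, src_rate, prior=0.5, threshold=0.99):
        """Update P(source present) after each detector count per interval and
        stop as soon as the posterior crosses the decision threshold."""
        def pois(k, lam):
            return math.exp(-lam) * lam ** k / math.factorial(k)
        p = prior
        for t, k in enumerate(counts, start=1):
            like_src = pois(k, bg_rate + src_rate)  # background + source
            like_bg = pois(k, bg_rate)              # background only
            p = p * like_src / (p * like_src + (1 - p) * like_bg)
            if p >= threshold:
                return t, p  # detection as soon as statistically justified
        return None, p

    # hypothetical rates: background 2 counts/interval, source adds 4
    counts = [6, 7, 5, 8, 6, 7]
    decided_at, posterior = sequential_detector(counts, bg_rate=2.0, src_rate=4.0)
    ```

    With these (assumed) rates the detector declares after the second interval rather than waiting out a fixed counting window, which is the operational advantage the abstract describes.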

  1. Detection of meteorological extreme effect on historical crop yield anomaly

    NASA Astrophysics Data System (ADS)

    Kim, W.; Iizumi, T.; Nishimori, M.

    2017-12-01

    Meteorological extremes of temperature and precipitation are a critical issue in global climate change, and studies investigating how these extremes shift with the changing climate continue to be reported. However, how the extremes affect crop yields worldwide, through heatwaves, cold waves, droughts, and floods, remains poorly understood, although some local or national reports are available. Therefore, we globally investigated the effects of extremes on the variability of historical yields of maize, rice, soy, and wheat using a standardized index and historical yield anomalies. For the regression analysis, the standardized index is aggregated annually in consideration of the crop calendar, and the historical yield is detrended with a 5-year moving average. Through this investigation, we found that the relationship between the aggregated standardized index and the historical yield anomaly shows not only positive but also negative correlations for all crops across the globe. That is, the extremes reduce crop yield in many regions, as expected, but contrastingly increase it in others. These results help quantify the effect of extremes on historical crop yield anomalies.
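
    The detrending step, deriving a yield anomaly as the departure from a centered 5-year moving average, can be sketched as follows. The yield series is invented for illustration.

    ```python
    def yield_anomaly(yields, window=5):
        """Detrend an annual yield series with a centered moving average and
        return anomalies (observed minus trend); edge years where the full
        window does not fit are returned as None."""
        half = window // 2
        out = []
        for i in range(len(yields)):
            if i < half or i >= len(yields) - half:
                out.append(None)
                continue
            trend = sum(yields[i - half:i + half + 1]) / window
            out.append(yields[i] - trend)
        return out

    # hypothetical yields in t/ha, with one poor harvest in year 5
    series = [3.0, 3.1, 3.3, 3.2, 2.6, 3.4, 3.5]
    anoms = yield_anomaly(series)
    ```

    The poor harvest shows up as the largest negative anomaly; these anomalies are what would then be regressed against the aggregated standardized extreme index.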

  2. Linking entanglement and discrete anomaly

    NASA Astrophysics Data System (ADS)

    Hung, Ling-Yan; Wu, Yong-Shi; Zhou, Yang

    2018-05-01

    In 3d Chern-Simons theory, there is a discrete one-form symmetry whose symmetry group is isomorphic to the center of the gauge group. We study the 't Hooft anomaly associated to this discrete one-form symmetry in theories with generic gauge groups of A, B, C, and D type. We propose to detect the discrete anomaly by computing the Hopf state entanglement in the subspace spanned by the symmetry generators, and develop a systematic method based on the truncated modular S matrix. We check our proposal for many examples.

  3. Simultaneous Surgical Treatment of Congenital Spinal Deformity Associated with Intraspinal Anomalies.

    PubMed

    Singrakhia, Manoj; Malewar, Nikhil; Deshmukh, Sonal; Deshmukh, Shivaji

    2018-06-01

    Prospective case series. To study the safety, efficacy, and long-term outcomes of single-stage surgical intervention for congenital spinal deformity and intraspinal anomalies. Congenital spinal deformities associated with intraspinal anomalies are usually treated sequentially, first by treating the intraspinal anomalies followed by deformity correction after a period of 3-6 months. Recently, a single-stage approach has been reported to show better postoperative results and reduced complication rates. Thirty patients (23 females and seven males) were prospectively evaluated for the simultaneous surgical treatment of congenital spinal deformity with concurrent intraspinal anomalies from May 2006 to October 2016. The average age at presentation was 9.8±3.7 years, with the average follow-up duration being 49.06±8.6 months. Clinical records were evaluated for clinical, radiological, perioperative, and postoperative data. The average angle of deformity was 56.53°±25.22° preoperatively, 21.13°±14.34° postoperatively, and 23.93°±14.99° at the final follow-up. The average surgical time was 232.58±53.56 minutes (range, 100-330 minutes), with a mean blood loss of 1,587.09±439.09 mL (range, 100-2,300 mL). Single stage surgical intervention for intraspinal anomalies with congenital spinal deformity correction, including adequate intra-operative wake-up test, is a viable option in appropriately selected patients and has minimum complication rates.

  4. Ultrahigh Responsivity and Detectivity Graphene-Perovskite Hybrid Phototransistors by Sequential Vapor Deposition

    NASA Astrophysics Data System (ADS)

    Chang, Po-Han; Liu, Shang-Yi; Lan, Yu-Bing; Tsai, Yi-Chen; You, Xue-Qian; Li, Chia-Shuo; Huang, Kuo-You; Chou, Ang-Sheng; Cheng, Tsung-Chin; Wang, Juen-Kai; Wu, Chih-I.

    2017-04-01

    In this work, graphene-methylammonium lead iodide (MAPbI3) perovskite hybrid phototransistors fabricated by sequential vapor deposition are demonstrated. Ultrahigh responsivity of 1.73 × 10⁷ A W⁻¹ and detectivity of 2 × 10¹⁵ Jones are achieved, with extremely high effective quantum efficiencies of about 10⁸% in the visible range (450-700 nm). This excellent performance is attributed to the ultra-flat perovskite films grown by vapor deposition on the graphene sheets. The hybrid structure of graphene covered with uniform perovskite has high exciton separation ability under light exposure, and thus efficiently generates photocurrents. This paper presents photoluminescence (PL) images along with statistical analysis used to study the photo-induced exciton behavior. Both uniform and dramatic PL intensity quenching has been observed over entire measured regions, consistently demonstrating excellent exciton separation in the devices.

  5. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Kumar, Sricharan; Srivastava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. The validity of the bootstrap-based prediction intervals is then illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
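
    A percentile-bootstrap prediction interval of this kind can be sketched with a simple k-NN smoother standing in for the nonparametric regressor. The smoother choice, all names, and the synthetic data below are ours, not the authors'; the sketch refits on resampled training sets and adds a resampled residual so the interval reflects the noise as well as the fit.

    ```python
    import numpy as np

    def knn_predict(x_train, y_train, x, k=5):
        """Simple nonparametric regressor: mean of the k nearest neighbours."""
        idx = np.argsort(np.abs(x_train - x))[:k]
        return y_train[idx].mean()

    def bootstrap_pi(x_train, y_train, x, n_boot=200, alpha=0.05, seed=0):
        """Percentile-bootstrap prediction interval at input x."""
        rng = np.random.default_rng(seed)
        n = len(x_train)
        # in-sample residuals supply the noise component of the interval
        fitted = np.array([knn_predict(x_train, y_train, xi) for xi in x_train])
        residuals = y_train - fitted
        draws = []
        for _ in range(n_boot):
            take = rng.integers(0, n, size=n)  # resample with replacement
            pred = knn_predict(x_train[take], y_train[take], x)
            draws.append(pred + rng.choice(residuals))
        return np.quantile(draws, [alpha / 2, 1 - alpha / 2])

    # synthetic data: y = sin(x) + noise; query the interval at x = 5
    rng = np.random.default_rng(42)
    x_train = rng.uniform(0, 10, 200)
    y_train = np.sin(x_train) + rng.normal(0, 0.1, 200)
    lo, hi = bootstrap_pi(x_train, y_train, x=5.0)
    flagged = not (lo <= 0.8 <= hi)  # an observed output of 0.8 here is anomalous
    ```

    An observed output far outside the interval at that input is flagged as anomalous, which is exactly how the paper turns prediction intervals into a conditional anomaly detector.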

  6. ANOMALY STRUCTURE OF SUPERGRAVITY AND ANOMALY CANCELLATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butter, Daniel; Gaillard, Mary K.

    2009-06-10

    We display the full anomaly structure of supergravity, including new D-term contributions to the conformal anomaly. This expression has the super-Weyl and chiral U(1)_K transformation properties that are required for implementation of the Green-Schwarz mechanism for anomaly cancellation. We outline the procedure for full anomaly cancellation. Our results have implications for effective supergravity theories from the weakly coupled heterotic string theory.

  7. Detection of anomalies in ocean acoustic velocity structure and their effect in sea-bottom crustal deformation measurement: synthetic test and future suggestion

    NASA Astrophysics Data System (ADS)

    Nagai, S.; Eto, S.; Tadokoro, K.; Watanabe, T.

    2011-12-01

    On-land geodetic observations are not sufficient to monitor crustal activity in and around subduction zones, so seafloor geodetic observations are required. However, the present accuracy of seafloor geodetic observation is on the order of 1 cm or larger, making it difficult to detect deviations from plate motion over short time intervals, i.e., the plate coupling rate and its spatio-temporal variation. Our group has developed an observation system and methodology for seafloor geodesy that combines kinematic GPS and ocean acoustic ranging. One influential factor is acoustic velocity change in the ocean, due to changes in temperature, ocean currents at different scales, and so on. A typical perturbation of acoustic velocity makes a difference on the order of 1 ms in travel time, which corresponds to a 1 m difference in ray length. We have investigated this effect in seafloor geodesy using both observed and synthetic data, to reduce the estimation error of benchmarker (transponder) positions and to develop our strategy for observation and analysis. In this paper, we focus on forward modeling of travel times of acoustic ranging data and on recovery tests using synthetic data, comparing with observed results [Eto et al., 2011; in this meeting]. The estimation procedure for benchmarker positions is similar to those used in earthquake location and seismic tomography, so we have applied methods from seismic studies, especially tomographic inversion. First, we use the one-dimensional velocity inversion with station corrections proposed by Kissling et al. [1994] to detect spatio-temporal change in ocean acoustic velocity from data observed in the Suruga-Nankai Trough, Japan. From these analyses, some important information has been clarified in the travel time data [Eto et al., 2011]. Most of it can be explained by a small velocity anomaly at depths of 300 m or shallower, through forward modeling of travel time data using a simple velocity structure with a velocity anomaly. 
However, due to

  8. Continental and oceanic magnetic anomalies: Enhancement through GRM

    NASA Technical Reports Server (NTRS)

    Vonfrese, R. R. B.; Hinze, W. J.

    1985-01-01

    In contrast to the POGO and MAGSAT satellites, the Geopotential Research Mission (GRM) satellite system will orbit at a minimum elevation to provide significantly better resolved lithospheric magnetic anomalies for more detailed and improved geologic analysis. In addition, GRM will measure corresponding gravity anomalies to enhance our understanding of the gravity field for vast regions of the Earth which are largely inaccessible to more conventional surface mapping. Crustal studies will greatly benefit from the dual data sets as modeling has shown that lithospheric sources of long wavelength magnetic anomalies frequently involve density variations which may produce detectable gravity anomalies at satellite elevations. Furthermore, GRM will provide an important replication of lithospheric magnetic anomalies as an aid to identifying and extracting these anomalies from satellite magnetic measurements. The potential benefits to the study of the origin and characterization of the continents and oceans, that may result from the increased GRM resolution are examined.

  9. Evaluation of Anomaly Detection Capability for Ground-Based Pre-Launch Shuttle Operations. Chapter 8

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2010-01-01

    This chapter will provide a thorough end-to-end description of the process for evaluating three different data-driven algorithms for anomaly detection, to select the best candidate for deployment as part of a suite of IVHM (Integrated Vehicle Health Management) technologies. These algorithms were deemed sufficiently mature to be considered viable candidates for deployment in support of the maiden launch of Ares I-X, the successor to the Space Shuttle for NASA's Constellation program. Data-driven algorithms are just one of three different types being deployed. The other two types of algorithms being deployed include a "rule-based" expert system and a "model-based" system. Within these two categories, the deployable candidates have already been selected based upon qualitative factors such as flight heritage. For the rule-based system, SHINE (Spacecraft High-speed Inference Engine) has been selected for deployment; it is a component of BEAM (Beacon-based Exception Analysis for Multimissions), a patented technology developed at NASA's JPL (Jet Propulsion Laboratory), and serves to aid in the management and identification of operational modes. For the "model-based" system, a commercially available package developed by QSI (Qualtech Systems, Inc.), TEAMS (Testability Engineering and Maintenance System), has been selected for deployment to aid in diagnosis. In the context of this particular deployment, distinctions among the use of the terms "data-driven," "rule-based," and "model-based" can be found in. Although there are three different categories of algorithms that have been selected for deployment, our main focus in this chapter will be on the evaluation of three candidates for data-driven anomaly detection. These algorithms will be evaluated upon their capability for robustly detecting incipient faults or failures in the ground-based phase of pre-launch space shuttle operations, rather than based on heritage as performed in previous studies. 
Robust

  10. Anomaly Detection Techniques with Real Test Data from a Spinning Turbine Engine-Like Rotor

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Woike, Mark R.; Oza, Nikunj C.; Matthews, Bryan L.

    2012-01-01

    Online detection techniques to monitor the health of rotating engine components are becoming increasingly attractive to aircraft engine manufacturers in order to increase safety of operation and lower maintenance costs. Health monitoring remains challenging to implement, especially in the presence of scattered loading conditions, crack size, component geometry, and material properties. The current trend, however, is to utilize noninvasive types of health monitoring or nondestructive techniques to detect hidden flaws and mini-cracks before any catastrophic event occurs. These techniques go further to evaluate material discontinuities and other anomalies that have grown to the level of critical defects that can lead to failure. Generally, health monitoring is highly dependent on sensor systems capable of performing in various engine environmental conditions and able to transmit a signal upon a predetermined crack length, while having a neutral effect on the overall performance of the engine system.

  11. Prevalence of dental developmental anomalies: a radiographic study.

    PubMed

    Ezoddini, Ardakani F; Sheikhha, M H; Ahmadi, H

    2007-09-01

    To determine the prevalence of developmental dental anomalies in patients attending the Dental Faculty of Medical University of Yazd, Iran, and the gender differences of these anomalies. A retrospective study based on the panoramic radiographs of 480 patients. Patients referred for panoramic radiographs were clinically examined, a detailed family history of any dental anomalies in their first- and second-degree relatives was obtained, and finally their radiographs were studied in detail for the presence of dental anomalies. 40.8% of the patients had dental anomalies. The more common anomalies were dilaceration (15%), impacted teeth (8.3%), taurodontism (7.5%) and supernumerary teeth (3.5%). Macrodontia and fusion were detected in a few radiographs (0.2%). 49.1% of male patients had dental anomalies compared to 33.8% of females. Dilaceration, taurodontism and supernumerary teeth were found to be more prevalent in men than women, whereas impacted teeth, microdontia and gemination were more frequent in women. Family history of dental anomalies was positive in 34% of the cases. Taurodontism, gemination, dens in dente and talon cusp were limited to patients under 20 years old, while the prevalence of other anomalies was almost the same in all age groups. Dilaceration, impaction and taurodontism were relatively common in the studied population. A family history of dental anomalies was positive in a third of cases.

  12. Ultrahigh Responsivity and Detectivity Graphene–Perovskite Hybrid Phototransistors by Sequential Vapor Deposition

    PubMed Central

    Chang, Po-Han; Liu, Shang-Yi; Lan, Yu-Bing; Tsai, Yi-Chen; You, Xue-Qian; Li, Chia-Shuo; Huang, Kuo-You; Chou, Ang-Sheng; Cheng, Tsung-Chin; Wang, Juen-Kai; Wu, Chih-I

    2017-01-01

    In this work, graphene-methylammonium lead iodide (MAPbI3) perovskite hybrid phototransistors fabricated by sequential vapor deposition are demonstrated. Ultrahigh responsivity of 1.73 × 10⁷ A W⁻¹ and detectivity of 2 × 10¹⁵ Jones are achieved, with extremely high effective quantum efficiencies of about 10⁸% in the visible range (450–700 nm). This excellent performance is attributed to the ultra-flat perovskite films grown by vapor deposition on the graphene sheets. The hybrid structure of graphene covered with uniform perovskite has high exciton separation ability under light exposure, and thus efficiently generates photocurrents. This paper presents photoluminescence (PL) images along with statistical analysis used to study the photo-induced exciton behavior. Both uniform and dramatic PL intensity quenching has been observed over entire measured regions, consistently demonstrating excellent exciton separation in the devices. PMID:28422117

  13. Anomaly clustering in hyperspectral images

    NASA Astrophysics Data System (ADS)

    Doster, Timothy J.; Ross, David S.; Messinger, David W.; Basener, William F.

    2009-05-01

    The topological anomaly detection algorithm (TAD) differs from other anomaly detection algorithms in that it uses a topological/graph-theoretic model for the image background instead of modeling the image with a Gaussian normal distribution. In the construction of the model, TAD produces a hard threshold separating anomalous pixels from background in the image. We build on this feature of TAD by extending the algorithm so that it gives a measure of the number of anomalous objects, rather than the number of anomalous pixels, in a hyperspectral image. This is done by identifying, and integrating, clusters of anomalous pixels via a graph theoretical method combining spatial and spectral information. The method is applied to a cluttered HyMap image and combines small groups of pixels containing like materials, such as those corresponding to rooftops and cars, into individual clusters. This improves visualization and interpretation of objects.
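
    The pixel-to-object integration step can be illustrated with plain connected-component clustering over a boolean anomaly mask. TAD's actual method also uses spectral similarity when merging pixels; this spatial-only sketch, with an invented mask, omits that.

    ```python
    from collections import deque

    def cluster_anomalies(mask):
        """Group 4-connected anomalous pixels (truthy entries of a 2-D mask)
        into objects, mimicking pixel-to-object integration."""
        rows, cols = len(mask), len(mask[0])
        seen = [[False] * cols for _ in range(rows)]
        clusters = []
        for r in range(rows):
            for c in range(cols):
                if mask[r][c] and not seen[r][c]:
                    comp, q = [], deque([(r, c)])
                    seen[r][c] = True
                    while q:  # breadth-first flood fill of one object
                        i, j = q.popleft()
                        comp.append((i, j))
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ni, nj = i + di, j + dj
                            if (0 <= ni < rows and 0 <= nj < cols
                                    and mask[ni][nj] and not seen[ni][nj]):
                                seen[ni][nj] = True
                                q.append((ni, nj))
                    clusters.append(comp)
        return clusters

    # hypothetical thresholded anomaly mask: 6 anomalous pixels, 2 objects
    mask = [
        [1, 1, 0, 0, 0],
        [1, 0, 0, 0, 1],
        [0, 0, 0, 1, 1],
    ]
    objects = cluster_anomalies(mask)
    ```

    Six anomalous pixels collapse into two objects, which is the count-of-objects (rather than count-of-pixels) measure the extended algorithm reports.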

  14. Anomalies.

    ERIC Educational Resources Information Center

    Online-Offline, 1999

    1999-01-01

    This theme issue on anomalies includes Web sites, CD-ROMs and software, videos, books, and additional resources for elementary and junior high school students. Pertinent activities are suggested, and sidebars discuss UFOs, animal anomalies, and anomalies from nature; and resources covering unexplained phenonmenas like crop circles, Easter Island,…

  15. Simultaneous Surgical Treatment of Congenital Spinal Deformity Associated with Intraspinal Anomalies

    PubMed Central

    Singrakhia, Manoj; Malewar, Nikhil; Deshmukh, Sonal; Deshmukh, Shivaji

    2018-01-01

    Study Design Prospective case series. Purpose To study the safety, efficacy, and long-term outcomes of single-stage surgical intervention for congenital spinal deformity and intraspinal anomalies. Overview of literature Congenital spinal deformities associated with intraspinal anomalies are usually treated sequentially, first by treating the intraspinal anomalies followed by deformity correction after a period of 3–6 months. Recently, a single-stage approach has been reported to show better postoperative results and reduced complication rates. Methods Thirty patients (23 females and seven males) were prospectively evaluated for the simultaneous surgical treatment of congenital spinal deformity with concurrent intraspinal anomalies from May 2006 to October 2016. The average age at presentation was 9.8±3.7 years, with the average follow-up duration being 49.06±8.6 months. Clinical records were evaluated for clinical, radiological, perioperative, and postoperative data. Results The average angle of deformity was 56.53°±25.22° preoperatively, 21.13°±14.34° postoperatively, and 23.93°±14.99° at the final follow-up. The average surgical time was 232.58±53.56 minutes (range, 100–330 minutes), with a mean blood loss of 1,587.09±439.09 mL (range, 100–2,300 mL). Conclusions Single stage surgical intervention for intraspinal anomalies with congenital spinal deformity correction, including adequate intra-operative wake-up test, is a viable option in appropriately selected patients and has minimum complication rates. PMID:29879774

  16. Unsupervised, low latency anomaly detection of algorithmically generated domain names by generative probabilistic modeling.

    PubMed

    Raghuram, Jayaram; Miller, David J; Kesidis, George

    2014-07-01

    We propose a method for detecting anomalous domain names, with a focus on algorithmically generated domain names, which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (botnets), malware, and phishing. Our method learns a (null hypothesis) probability model from a large set of domain names that have been whitelisted by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, tend to have a distribution of characters, words, word lengths, and number of words typical of some language (mostly English), and often consist of words drawn from a known lexicon. Algorithmically generated domain names, by contrast, typically have distributions quite different from those of human-created domain names. We propose a fully generative model for the probability distribution of benign (whitelisted) domain names, which can be used in an anomaly detection setting to identify putative algorithmically generated domain names. Unlike other methods, our approach can make detections without consulting any additional (latency-producing) information sources often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results relative to several baseline methods, with higher detection rates and low false positive rates.
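    The whitelist-trained, null-hypothesis scoring idea in this record can be illustrated with a minimal character-bigram model. This is an assumption-laden sketch, not the authors' model (which is far richer): the tiny training whitelist, the add-alpha smoothing, and the start/end markers are all invented for illustration.

```python
import math
from collections import defaultdict

def train_bigram_model(names, alpha=1.0):
    """Fit a character-bigram model with add-alpha smoothing on whitelisted names.
    Returns a scorer giving the length-normalised log-likelihood of a name."""
    counts = defaultdict(lambda: defaultdict(float))
    alphabet = set()
    for name in names:
        padded = "^" + name + "$"          # start/end markers
        alphabet.update(padded)
        for a, b in zip(padded, padded[1:]):
            counts[a][b] += 1.0
    vocab = len(alphabet)

    def log_prob(name):
        padded = "^" + name + "$"
        total_lp = 0.0
        for a, b in zip(padded, padded[1:]):
            row = counts[a]
            row_total = sum(row.values())
            total_lp += math.log((row[b] + alpha) / (row_total + alpha * vocab))
        return total_lp / max(len(name), 1)

    return log_prob

# Hypothetical whitelist; a real deployment would train on a large trusted feed.
score = train_bigram_model(["google", "amazon", "wikipedia", "reddit", "github"])
```

    Names whose length-normalised log-likelihood falls well below that of whitelisted names would be flagged as candidate algorithmically generated domains, with the threshold chosen for an acceptable false-positive rate.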

  17. Unsupervised, low latency anomaly detection of algorithmically generated domain names by generative probabilistic modeling

    PubMed Central

    Raghuram, Jayaram; Miller, David J.; Kesidis, George

    2014-01-01

    We propose a method for detecting anomalous domain names, with a focus on algorithmically generated domain names, which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (botnets), malware, and phishing. Our method learns a (null hypothesis) probability model from a large set of domain names that have been whitelisted by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, tend to have a distribution of characters, words, word lengths, and number of words typical of some language (mostly English), and often consist of words drawn from a known lexicon. Algorithmically generated domain names, by contrast, typically have distributions quite different from those of human-created domain names. We propose a fully generative model for the probability distribution of benign (whitelisted) domain names, which can be used in an anomaly detection setting to identify putative algorithmically generated domain names. Unlike other methods, our approach can make detections without consulting any additional (latency-producing) information sources often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results relative to several baseline methods, with higher detection rates and low false positive rates. PMID:25685511

  18. Discrimination between preseismic electromagnetic anomalies and solar activity effects

    NASA Astrophysics Data System (ADS)

    Koulouras, Gr; Balasis, G.; Kontakos, K.; Ruzhin, Y.; Avgoustis, G.; Kavouras, D.; Nomicos, C.

    2009-04-01

    Laboratory studies suggest that electromagnetic emissions in a wide frequency spectrum, ranging from kHz to very high MHz frequencies, are produced by the opening of microcracks, with the MHz radiation appearing earlier than the kHz radiation. Earthquakes are large-scale fracture phenomena in the Earth's heterogeneous crust, so the radiated kHz-MHz electromagnetic emissions are detectable not only at the laboratory scale but also at the geological scale. Clear MHz-to-kHz electromagnetic anomalies have been systematically detected over periods ranging from a few days to a few hours prior to recent destructive earthquakes in Greece. Whether electromagnetic precursors to earthquakes exist is an important question, not only for earthquake prediction but mainly for understanding the physical processes of earthquake generation. An open question in this field of research is the classification of a detected electromagnetic anomaly as a pre-seismic signal associated with earthquake occurrence. Indeed, electromagnetic fluctuations in the MHz frequency range are known to arise from several sources: atmospheric noise (due to lightning), man-made composite noise, solar-terrestrial noise (resulting from the Sun-solar wind-magnetosphere-ionosphere-Earth's surface chain), cosmic noise, and finally lithospheric effects, namely pre-seismic activity. We focus on this point. We suggest that if a combination of detected kHz and MHz electromagnetic anomalies satisfies the set of criteria presented here, these anomalies can be considered candidate precursory phenomena of an impending earthquake.

  19. GBAS Ionospheric Anomaly Monitoring Based on a Two-Step Approach

    PubMed Central

    Zhao, Lin; Yang, Fuxin; Li, Liang; Ding, Jicheng; Zhao, Yuxin

    2016-01-01

    As one significant component of space weather, the ionosphere has to be monitored using Global Positioning System (GPS) receivers for the Ground-Based Augmentation System (GBAS), because an ionospheric anomaly can pose a potential threat to GBAS support of safety-critical services. The traditional code-carrier divergence (CCD) methods, widely used to detect variations of the ionospheric gradient for GBAS, adopt a linear time-invariant low-pass filter to suppress the effect of high-frequency noise on the detection of the ionospheric anomaly. However, the fixed time constants impose a trade-off between response time and estimation accuracy. To relax this limitation, a two-step approach (TSA) is proposed that integrates cascaded linear time-invariant low-pass filters with an adaptive Kalman filter to detect ionospheric gradient anomalies. The performance of the proposed method is tested using simulated and real-world data. The simulation results show that the TSA can detect ionospheric gradient anomalies quickly, even when noise is severe. Compared to the traditional CCD methods, experiments on real-world GPS data indicate that the average estimation accuracy of the ionospheric gradient improves by more than 31.3%, and the average response time to an ionospheric gradient at a rate of 0.018 m/s improves by more than 59.3%, demonstrating the ability of the TSA to detect a small ionospheric gradient more rapidly. PMID:27240367
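    The fixed-time-constant limitation discussed in this record is easy to see in a toy CCD-style monitor. The sketch below is not the paper's TSA: the sampling interval, time constant, and alarm threshold are illustrative assumptions, and a single first-order filter stands in for the cascaded filters.

```python
def ccd_monitor(divergence, dt=0.5, tau=25.0, threshold=0.015):
    """Flag epochs where a low-pass estimate of the code-carrier divergence
    rate (m/s) exceeds a threshold. dt, tau and threshold are illustrative."""
    alpha = dt / tau                    # smoothing gain of the low-pass filter
    rate_est = 0.0
    prev = divergence[0]
    alarms = []
    for d in divergence[1:]:
        raw_rate = (d - prev) / dt      # noisy instantaneous divergence rate
        prev = d
        rate_est += alpha * (raw_rate - rate_est)
        alarms.append(rate_est > threshold)
    return alarms
```

    With a large tau the estimate is smooth but slow to cross the threshold; shrinking tau speeds detection at the cost of noise-driven false alarms. That is exactly the trade-off the TSA's adaptive Kalman step is meant to ease.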

  20. Unipedal Diagnostic Lymphangiography Followed by Sequential CT Examinations in Patients With Idiopathic Chyluria: A Retrospective Study.

    PubMed

    Dong, Jian; Xin, Jianfeng; Shen, Wenbin; Chen, Xiaobai; Wen, Tingguo; Zhang, Chunyan; Wang, Rengui

    2018-04-01

    The objective of our study was to investigate the clinical value of diagnostic lymphangiography followed by sequential CT examinations in patients with idiopathic chyluria. Thirty-six patients with idiopathic chyluria underwent unipedal diagnostic lymphangiography followed by sequential CT examinations. The examinations were reviewed separately by two radiologists. Abnormal distribution of contrast medium, lymphourinary leakages, and retrograde flow were noted, and the range and distribution of lymphatic vessel lesions were recorded. The stage of idiopathic chyluria based on CT findings was compared with the stage based on clinical findings. Therapeutic management and follow-up were recorded, and statistical analyses were performed. Compared with CT studies performed after lymphangiography, diagnostic lymphangiography showed a unique capability to depict lymphourinary leakages in three patients. Lymphourinary fistulas and abnormally dilated lymphatic vessels were found in and around the kidney in all patients. CT depicted retrograde flow of lymph fluid in 47.2% of patients. The consistency between staging chyluria based on CT findings and on clinical findings was fair (κ = 0.455). Twenty-nine patients underwent conservative therapy, and seven underwent surgery. Surgical therapy was superior to conservative management (no recurrence in 85.7% of patients who underwent surgery vs 62.1% of patients who underwent conservative therapy; p = 0.025). By assessing the drainage of contrast medium on unipedal diagnostic lymphangiography and the redistribution of contrast medium on sequential CT examinations, it is possible to detect the existence of lymphourinary fistulas, the precise location of lymphatic anomalies, the distribution of collateral lymphatic vessels, and hydrodynamic pressure abnormalities in the lymph circulation in patients with idiopathic chyluria. CT staging of chyluria provides additional information that can be used to guide therapeutic management.

  1. Test pattern generation for ILA sequential circuits

    NASA Technical Reports Server (NTRS)

    Feng, YU; Frenzel, James F.; Maki, Gary K.

    1993-01-01

    An efficient method of generating test patterns for sequential machines implemented using one-dimensional, unilateral, iterative logic arrays (ILAs) of BTS pass-transistor networks is presented. Based on a transistor-level fault model, the method affords a unique opportunity for real-time fault detection with improved fault coverage. The resulting test sets are shown to be equivalent to those obtained using conventional gate-level models, thus eliminating the need for additional test patterns. The proposed method simplifies and eases test pattern generation for a special class of sequential circuits.

  2. Frequency of developmental dental anomalies in the Indian population.

    PubMed

    Guttal, Kruthika S; Naikmasur, Venkatesh G; Bhargava, Puneet; Bathi, Renuka J

    2010-07-01

    To evaluate the frequency of developmental dental anomalies in the Indian population, this prospective study was conducted over a period of 1 year and comprised both clinical and radiographic examinations in an oral medicine and radiology outpatient department. Adult patients were screened for the presence of dental anomalies with appropriate radiographs. A comprehensive clinical examination was performed to detect hyperdontia, talon cusp, fused teeth, gemination, concrescence, hypodontia, dens invaginatus, dens evaginatus, macro- and microdontia, and taurodontism. Patients with syndromes were not included in the study. Of the 20,182 patients screened, 350 had dental anomalies; 57.43% of the anomalies occurred in male patients and 42.57% in females. Hyperdontia, root dilaceration, peg-shaped laterals (microdontia), and hypodontia were more frequent than other dental anomalies of size and shape. Dental anomalies are clinically evident abnormalities that may be the cause of various dental problems. Careful observation and appropriate investigations are required to diagnose the condition and institute treatment.

  3. Relationships between Rwandan seasonal rainfall anomalies and ENSO events

    NASA Astrophysics Data System (ADS)

    Muhire, I.; Ahmed, F.; Abutaleb, K.

    2015-10-01

    This study aims primarily at investigating the relationships between Rwandan seasonal rainfall anomalies and El Niño-Southern Oscillation (ENSO) events. The study is useful for early warning of the negative effects associated with extreme rainfall anomalies across the country. It covers the period 1935-1992, using long- and short-rains data from 28 weather stations in Rwanda and ENSO events sourced from Glantz (2001). Mean standardized anomaly indices were calculated to investigate their associations with ENSO events. One-way analysis of variance was applied to the mean standardized anomaly index values per ENSO event to explore the spatial correlation of rainfall anomalies per event. A geographical information system was used to present spatially the variations in mean standardized anomaly indices per ENSO event. The results showed approximately three climatic periods: a dry period (1935-1960), a semi-humid period (1961-1976), and a wet period (1977-1992). Though both positive and negative correlations were detected between extreme short-rains anomalies and El Niño events, La Niña events were mostly linked to negative rainfall anomalies while El Niño events were associated with positive rainfall anomalies. The occurrence of El Niño and La Niña in the same year does not show any clear association with rainfall anomalies; however, the phenomenon was more linked with positive long-rains anomalies and negative short-rains anomalies. Normal years were largely linked with negative long-rains anomalies and positive short-rains anomalies, which points to the influence of factors other than ENSO events. This makes it difficult to project seasonal rainfall anomalies in the country merely by predicting ENSO events.
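    The standardized anomaly index used in this study is a simple z-score of each seasonal rainfall total against the station record; a minimal sketch (the rainfall values below are invented):

```python
def standardized_anomaly_index(rainfall):
    """Standardized anomaly of each seasonal total: (x - mean) / std."""
    n = len(rainfall)
    mean = sum(rainfall) / n
    std = (sum((x - mean) ** 2 for x in rainfall) / n) ** 0.5
    return [(x - mean) / std for x in rainfall]

# Hypothetical seasonal totals (mm) for one station:
indices = standardized_anomaly_index([100.0, 120.0, 80.0, 140.0, 60.0])
```

    Averaging these per-station indices over the stations reporting in each ENSO-event year gives the mean standardized anomaly index the study analyzes.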

  4. Magnetic anomalies in the Cosmonauts Sea, off East Antarctica

    NASA Astrophysics Data System (ADS)

    Nogi, Y.; Hanyu, T.; Fujii, M.

    2017-12-01

    Identification of magnetic anomaly lineations and fracture zone trends in the Southern Indian Ocean is vital to understanding the breakup of Gondwana; however, the magnetic spreading anomalies and fracture zones there are not clear. Magnetic anomaly lineations in the Cosmonauts Sea, off East Antarctica, are key to elucidating the separation between Sri Lanka/India and Antarctica. No obvious magnetic anomaly lineations were observed in a Japanese/German aerogeophysical survey of the Cosmonauts Sea, and this area is considered to have been created by seafloor spreading during the Cretaceous Normal Superchron. Vector magnetic anomaly measurements have been conducted on board the icebreaker Shirase, mainly to understand the process of Gondwana fragmentation in the Indian Ocean. Magnetic boundary strikes are derived from the vector magnetic anomalies obtained in the Cosmonauts Sea. NE-SW trending magnetic boundary strikes are mainly observed along several NW-SE oriented observation lines, with magnetic anomaly amplitudes of about 200 nT. These NE-SW trending magnetic boundary strikes possibly indicate M-series magnetic anomalies that cannot be detected from the aerogeophysical survey with its nearly N-S observation lines. We will discuss the magnetic spreading anomalies and the breakup process between Sri Lanka/India and Antarctica in the Cosmonauts Sea.

  5. Neutrino scattering and the reactor antineutrino anomaly

    NASA Astrophysics Data System (ADS)

    Garcés, Estela; Cañas, Blanca; Miranda, Omar; Parada, Alexander

    2017-12-01

    Low-energy-threshold reactor experiments have the potential to give insight into the light sterile neutrino signal suggested by the reactor antineutrino anomaly and the gallium anomaly. In this work we analyze short-baseline reactor experiments that detect antineutrinos via elastic neutrino-electron scattering in the context of a light sterile neutrino signal. We also analyze the sensitivity of experimental proposals for coherent elastic neutrino-nucleus scattering (CENNS) detectors to exclude or confirm the sterile neutrino signal with reactor antineutrinos.

  6. Bangui Anomaly

    NASA Technical Reports Server (NTRS)

    Taylor, Patrick T.

    2004-01-01

    Bangui anomaly is the name given to one of the Earth's largest crustal magnetic anomalies and the largest over the African continent. It covers two-thirds of the Central African Republic, and the name therefore derives from the capital city, Bangui, which is near the center of the feature. From surface magnetic survey data, Godivier and Le Donche (1962) were the first to describe this anomaly. Subsequently, high-altitude world magnetic surveying by the U.S. Naval Oceanographic Office (Project Magnet) recorded a greater than 1000 nT dipolar, peak-to-trough anomaly with the major portion being negative (figure 1). Satellite observations (Cosmos 49) were first reported in 1964; these revealed a 40 nT anomaly at 350 km altitude. Subsequently, the higher-altitude (417-499 km) POGO (Polar Orbiting Geomagnetic Observatory) satellite data recorded peak-to-trough anomalies of 20 nT; these data were added to the Cosmos 49 measurements by Regan et al. (1975) for a regional satellite-altitude map. In October 1979, with the launch of Magsat, a satellite designed to measure crustal magnetic anomalies, a more uniform satellite-altitude magnetic map was obtained. These data, computed at 375 km altitude, recorded a -22 nT anomaly (figure 2). The elliptically shaped anomaly is approximately 760 by 1000 km and is centered at 6°, 18°. The Bangui anomaly is composed of three segments: two positive anomaly lobes north and south of a large central negative field. This displays the classic pattern of an anomalous magnetic body magnetized by induction in a zero-inclination field, which is not surprising since the magnetic equator passes near the center of the body.

  7. Extending TOPS: A Prototype MODIS Anomaly Detection Architecture

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Srivastava, A. N.

    2008-12-01

    The management and processing of Earth science data have been gaining importance over the last decade due to higher data volumes generated by a larger number of instruments and due to the increased complexity of the Earth science models that use these data. The volume of data itself is often a limiting factor in obtaining the information needed by scientists; without more sophisticated data-volume reduction technologies, possible key information may not be discovered. We are especially interested in the automatic identification of disturbances within ecosystems (e.g., wildfires, droughts, floods, insect/pest damage, wind damage, logging), and in focusing our analysis efforts on the identified areas. There are dozens of variables that define the health of an ecosystem, and both long-term and short-term changes in these variables can serve as early indicators of natural disasters and of shifts in climate and ecosystem health. These changes can have profound socio-economic impacts, and we need to develop capabilities for identification, analysis, and response to them in a timely manner. Because the ecosystem consists of a large number of variables, a disturbance may be apparent only when we examine relationships among multiple variables, even though none of them is alarming by itself. We therefore have to extract information from multiple sensors and observations and discover these underlying relationships. As data volumes increase, there is also the potential for a large number of anomalies to "flood" the system, so we need to provide the ability to automatically select the most likely and most important anomalies, and the ability to analyze an anomaly with minimal involvement of scientists. We describe a prototype architecture for anomaly-driven data reduction for both near-real-time and archived surface reflectance data from the MODIS instrument collected over Central California and test it using Orca and One-Class Support Vector Machines.

  8. Limb anomalies in DiGeorge and CHARGE syndromes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prasad, C.; Quackenbush, E.J.; Whiteman, D.

    1997-01-20

    Limb anomalies are not common in the DiGeorge or CHARGE syndromes. We describe limb anomalies in two children, one with DiGeorge syndrome and the other with CHARGE syndrome. Our first patient had a bifid left thumb, tetralogy of Fallot, absent thymus, right facial palsy, and a reduced number of T-cells. A deletion of 22q11 was detected by fluorescence in situ hybridization (FISH). The second patient, with CHARGE syndrome, had asymmetric findings that included right fifth-finger clinodactyly, camptodactyly, tibial hemimelia and dimpling, and severe clubfoot. The expanded spectrum of the DiGeorge and CHARGE syndromes includes limb anomalies. 14 refs., 4 figs.

  9. SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon Rueff; Lyle Roybal; Denis Vollmer

    2013-01-01

    There is a significant need to protect the nation’s energy infrastructures from malicious actors using cyber methods. Supervisory, Control, and Data Acquisition (SCADA) systems may be vulnerable due to insufficient security implemented during the design and deployment of these control systems. This is particularly true of older legacy SCADA systems that are still commonly in use. The purpose of INL’s research on the SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) project was to determine if and how data compression techniques could be used to identify and protect SCADA systems from cyber attacks. Initially, the concept centered on how to train a compression algorithm to recognize normal control system traffic versus hostile network traffic. Because large portions of the TCP/IP message traffic (called packets) are repetitive, the concept of using compression techniques to differentiate “non-normal” traffic was proposed. In this manner, malicious SCADA traffic could be identified at the packet level prior to completing its payload. Previous research has shown that SCADA network traffic has traits desirable for compression analysis. This work investigated three different approaches to identifying malicious SCADA network traffic using compression techniques. The preliminary analyses and results presented herein clearly differentiate normal from malicious network traffic at the packet level, at a very high confidence level, for the conditions tested. Additionally, the master dictionary approach used in this research appears to provide a meaningful way to categorize and compare packets within a communication channel.
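    The master-dictionary idea can be sketched with zlib preset dictionaries: build a dictionary from normal packets and flag packets that compress poorly against it. This is a loose illustration only; the packet contents and the ratio threshold are invented, and the record does not describe INL's actual three approaches in detail.

```python
import zlib

def build_detector(normal_packets, threshold=1.5):
    """Train a preset zlib dictionary on normal traffic; flag packets whose
    compression ratio against it is much worse than the normal baseline."""
    zdict = b"".join(normal_packets)[-32768:]   # zlib caps dictionaries at 32 KiB

    def ratio(packet):
        c = zlib.compressobj(9, zlib.DEFLATED, 15, 9,
                             zlib.Z_DEFAULT_STRATEGY, zdict)
        compressed = c.compress(packet) + c.flush()
        return len(compressed) / len(packet)

    baseline = sum(ratio(p) for p in normal_packets) / len(normal_packets)

    def is_anomalous(packet):
        return ratio(packet) > threshold * baseline

    return is_anomalous

# Hypothetical repetitive control-system payloads:
normal = [b"READ COILS unit=%02d addr=0010 count=0008" % i for i in range(40)]
detector = build_detector(normal)
```

    Normal packets share long matches with the dictionary and compress well; unfamiliar traffic does not, so its ratio stands out against the baseline.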

  10. Sequential strand displacement beacon for detection of DNA coverage on functionalized gold nanoparticles.

    PubMed

    Paliwoda, Rebecca E; Li, Feng; Reid, Michael S; Lin, Yanwen; Le, X Chris

    2014-06-17

    Functionalizing nanomaterials for diverse analytical, biomedical, and therapeutic applications requires determination of surface coverage (or density) of DNA on nanomaterials. We describe a sequential strand displacement beacon assay that is able to quantify specific DNA sequences conjugated or coconjugated onto gold nanoparticles (AuNPs). Unlike the conventional fluorescence assay that requires the target DNA to be fluorescently labeled, the sequential strand displacement beacon method is able to quantify multiple unlabeled DNA oligonucleotides using a single (universal) strand displacement beacon. This unique feature is achieved by introducing two short unlabeled DNA probes for each specific DNA sequence and by performing sequential DNA strand displacement reactions. Varying the relative amounts of the specific DNA sequences and spacing DNA sequences during their coconjugation onto AuNPs results in different densities of the specific DNA on AuNP, ranging from 90 to 230 DNA molecules per AuNP. Results obtained from our sequential strand displacement beacon assay are consistent with those obtained from the conventional fluorescence assays. However, labeling of DNA with some fluorescent dyes, e.g., tetramethylrhodamine, alters DNA density on AuNP. The strand displacement strategy overcomes this problem by obviating direct labeling of the target DNA. This method has broad potential to facilitate more efficient design and characterization of novel multifunctional materials for diverse applications.

  11. Brain anomalies in velo-cardio-facial syndrome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitnick, R.J.; Bello, J.A.; Shprintzen, R.J.

    Magnetic resonance imaging of the brain in 11 consecutively referred patients with velo-cardio-facial syndrome (VCF) showed anomalies in nine cases, including a small vermis, cysts adjacent to the frontal horns, and a small posterior fossa. Focal signal hyperintensities in the white matter on long-TR images were also noted. The nine patients showed a variety of behavioral abnormalities, including mild developmental delay, learning disabilities, and characteristic personality traits typical of this common multiple-anomaly syndrome, which has been related to a microdeletion at 22q11. Analysis of the behavioral findings showed no specific pattern related to the brain anomalies, and the patients with VCF who did not have detectable brain lesions also had behavioral abnormalities consistent with VCF. The significance of the lesions is not yet known, but the high prevalence of anomalies in this sample suggests that structural brain abnormalities are probably common in VCF. 25 refs.

  12. Three-body dissociation of OCS3+: Separating sequential and concerted pathways

    NASA Astrophysics Data System (ADS)

    Kumar, Herendra; Bhatt, Pragya; Safvan, C. P.; Rajput, Jyoti

    2018-02-01

    Events from the sequential and concerted modes of the fragmentation of OCS3+ that result in coincident detection of fragments C+, O+, and S+ have been separated using a newly proposed representation. An ion beam of 1.8 MeV Xe9+ is used to make the triply charged molecular ion, with the fragments being detected by a recoil ion momentum spectrometer. By separating events belonging exclusively to the sequential mode of breakup, the electronic states of the intermediate molecular ion (CO2+ or CS2+) involved are determined, and from the kinetic energy release spectra, it is shown that the low lying excited states of the parent OCS3+ are responsible for this mechanism. An estimate of branching ratios of events coming from sequential versus concerted mode is presented.

  13. An investigation of thermal anomalies in the Central American volcanic chain and evaluation of the utility of thermal anomaly monitoring in the prediction of volcanic eruptions. [Central America

    NASA Technical Reports Server (NTRS)

    Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.

    1975-01-01

    The author has identified the following significant results. Ground truth data collection proves that significant anomalies exist at 13 volcanoes within the test site of Central America. The dimensions and temperature contrast of these ten anomalies are large enough to be detected by the Skylab 192 instrument. The dimensions and intensity of thermal anomalies have changed at most of these volcanoes during the Skylab mission.

  14. CSAX: Characterizing Systematic Anomalies in eXpression Data.

    PubMed

    Noto, Keith; Majidi, Saeed; Edlow, Andrea G; Wick, Heather C; Bianchi, Diana W; Slonim, Donna K

    2015-05-01

    Methods for translating gene expression signatures into clinically relevant information have typically relied upon having many samples from patients with similar molecular phenotypes. Here, we address the question of what can be done when it is relatively easy to obtain healthy patient samples, but when abnormalities corresponding to disease states may be rare and one-of-a-kind. The associated computational challenge, anomaly detection, is a well-studied machine-learning problem. However, due to the dimensionality and variability of expression data, existing methods based on feature space analysis or individual anomalously expressed genes are insufficient. We present a novel approach, CSAX, that identifies pathways in an individual sample in which the normal expression relationships are disrupted. To evaluate our approach, we have compiled and released a compendium of public expression data sets, reformulated to create a test bed for anomaly detection. We demonstrate the accuracy of CSAX on the data sets in our compendium, compare it to other leading methods, and show that CSAX aids in both identifying anomalies and explaining their underlying biology. We describe an approach to characterizing the difficulty of specific expression anomaly detection tasks. We then illustrate CSAX's value in two developmental case studies. Confirming prior hypotheses, CSAX highlights disruption of platelet activation pathways in a neonate with retinopathy of prematurity and identifies, for the first time, dysregulated oxidative stress response in second trimester amniotic fluid of fetuses with obese mothers. Our approach provides an important step toward identification of individual disease patterns in the era of precision medicine.
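    A toy version of pathway-level anomaly scoring, loosely in the spirit of CSAX: the real method learns expression relationships within pathways, whereas this sketch just averages per-gene z-scores against a normal cohort. The gene names, pathway memberships, and expression values are all invented.

```python
import statistics

def pathway_anomaly_scores(sample, normal_cohort, pathways):
    """Score each pathway of a single sample by the mean |z-score| of its
    genes relative to a cohort of normal samples."""
    z = {}
    for gene, value in sample.items():
        ref = [cohort_sample[gene] for cohort_sample in normal_cohort]
        spread = statistics.pstdev(ref) or 1.0   # guard against zero variance
        z[gene] = (value - statistics.mean(ref)) / spread
    return {name: sum(abs(z[g]) for g in genes) / len(genes)
            for name, genes in pathways.items()}

# Hypothetical cohort and single test sample:
cohort = [{"A": 1.0, "B": 1.0, "C": 1.0},
          {"A": 1.2, "B": 0.8, "C": 1.1},
          {"A": 0.8, "B": 1.2, "C": 0.9}]
scores = pathway_anomaly_scores({"A": 5.0, "B": 1.0, "C": 1.0}, cohort,
                                {"P1": ["A", "B"], "P2": ["B", "C"]})
```

    Ranking pathways by score points to where the sample's normal expression relationships are most disrupted, which is the kind of per-sample explanation the record describes.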

  15. CSAX: Characterizing Systematic Anomalies in eXpression Data

    PubMed Central

    Noto, Keith; Majidi, Saeed; Edlow, Andrea G.; Wick, Heather C.; Bianchi, Diana W.

    2015-01-01

    Methods for translating gene expression signatures into clinically relevant information have typically relied upon having many samples from patients with similar molecular phenotypes. Here, we address the question of what can be done when it is relatively easy to obtain healthy patient samples, but when abnormalities corresponding to disease states may be rare and one-of-a-kind. The associated computational challenge, anomaly detection, is a well-studied machine-learning problem. However, due to the dimensionality and variability of expression data, existing methods based on feature space analysis or individual anomalously expressed genes are insufficient. We present a novel approach, CSAX, that identifies pathways in an individual sample in which the normal expression relationships are disrupted. To evaluate our approach, we have compiled and released a compendium of public expression data sets, reformulated to create a test bed for anomaly detection. We demonstrate the accuracy of CSAX on the data sets in our compendium, compare it to other leading methods, and show that CSAX aids in both identifying anomalies and explaining their underlying biology. We describe an approach to characterizing the difficulty of specific expression anomaly detection tasks. We then illustrate CSAX's value in two developmental case studies. Confirming prior hypotheses, CSAX highlights disruption of platelet activation pathways in a neonate with retinopathy of prematurity and identifies, for the first time, dysregulated oxidative stress response in second trimester amniotic fluid of fetuses with obese mothers. Our approach provides an important step toward identification of individual disease patterns in the era of precision medicine. PMID:25651392

  16. Transferring embryos with genetic anomalies detected in preimplantation testing: an Ethics Committee Opinion.

    PubMed

    2017-05-01

    Patient requests for the transfer of embryos with genetic anomalies linked to serious health-affecting disorders detected in preimplantation testing are rare but do exist. This Opinion sets out the possible rationales for a provider's decision to assist or decline to assist in such transfers. The Committee concludes that in most clinical cases it is ethically permissible to assist or decline to assist in transferring such embryos. In circumstances in which a child is highly likely to be born with a life-threatening condition that causes severe and early debility with no possibility of reasonable function, provider transfer of such embryos is ethically problematic and highly discouraged. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  17. Method for locating underground anomalies by diffraction of electromagnetic waves passing between spaced boreholes

    DOEpatents

    Lytle, R. Jeffrey; Lager, Darrel L.; Laine, Edwin F.; Davis, Donald T.

    1979-01-01

Underground anomalies or discontinuities, such as holes, tunnels, and caverns, are located by lowering an electromagnetic signal transmitting antenna down one borehole and a receiving antenna down another, the ground to be surveyed for anomalies being situated between the boreholes. Electronic transmitting and receiving equipment associated with the antennas is activated and the antennas are lowered in unison at the same rate down their respective boreholes a plurality of times, each time with the receiving antenna at a different level with respect to the transmitting antenna. The transmitted electromagnetic waves diffract at each edge of an anomaly. This causes minimal signal reception at the receiving antenna. Triangulation of the straight lines between the antennas for the depths at which the signal minima are detected precisely locates the anomaly. Alternatively, phase shifts of the transmitted waves may be detected to locate an anomaly, the phase shift being distinctive for the waves directed at the anomaly.
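The triangulation step described above can be sketched numerically: each antenna run whose signal minimum is detected defines a straight ray between the two boreholes, and intersecting two such rays locates the anomaly edge. A minimal sketch with hypothetical borehole spacing and depths (all numbers illustrative):

```python
def ray_depth(zt, zr, d, x):
    """Depth of the straight transmitter-receiver ray at horizontal offset x."""
    return zt + (zr - zt) * x / d

def locate_anomaly(d, ray1, ray2):
    """Intersect two signal-minimum rays, each given as (tx_depth, rx_depth)."""
    (zt1, zr1), (zt2, zr2) = ray1, ray2
    s1, s2 = (zr1 - zt1) / d, (zr2 - zt2) / d   # ray slopes
    if s1 == s2:
        raise ValueError("parallel rays: no unique intersection")
    x = (zt2 - zt1) / (s1 - s2)                 # offset where the depths agree
    return x, ray_depth(zt1, zr1, d, x)

# Boreholes 50 m apart; two runs whose signal minima define the rays.
x, z = locate_anomaly(50.0, ray1=(30.0, 50.0), ray2=(50.0, 30.0))
print(x, z)   # prints: 25.0 40.0
```

The same intersection generalizes to any pair of non-parallel minimum-signal rays.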

  18. Anomaly detection for medical images based on a one-class classification

    NASA Astrophysics Data System (ADS)

    Wei, Qi; Ren, Yinhao; Hou, Rui; Shi, Bibo; Lo, Joseph Y.; Carin, Lawrence

    2018-02-01

Detecting an anomaly such as a malignant tumor or a nodule from medical images including mammogram, CT or PET images is still an ongoing research problem drawing a lot of attention with applications in medical diagnosis. A conventional way to address this is to learn a discriminative model using training datasets of negative and positive samples. The learned model can be used to classify a testing sample into a positive or negative class. However, in medical applications, the high imbalance between negative and positive samples poses a difficulty for learning algorithms, as they will be biased towards the majority group, i.e., the negative one. To address this imbalanced data issue as well as leverage the huge amount of negative samples, i.e., normal medical images, we propose to learn an unsupervised model to characterize the negative class. To make the learned model more flexible and extendable for medical images of different scales, we have designed an autoencoder based on a deep neural network to characterize the negative patches decomposed from large medical images. A testing image is decomposed into patches and then fed into the learned autoencoder to reconstruct these patches. The reconstruction error of one patch is used to classify this patch into a binary class, i.e., a positive or a negative one, leading to a one-class classifier. The positive patches highlight the suspicious areas containing anomalies in a large medical image. The proposed method has been tested on the INbreast dataset and achieves an AUC of 0.84. The main contribution of our work can be summarized as follows. 1) The proposed one-class learning requires only data from one class, i.e., the negative data; 2) The patch-based learning makes the proposed method scalable to images of different sizes and helps avoid the large scale problem for medical images; 3) The training of the proposed deep convolutional neural network (DCNN) based auto-encoder is fast and stable.

  19. Ranking Causal Anomalies via Temporal and Dynamical Analysis on Vanishing Correlations.

    PubMed

    Cheng, Wei; Zhang, Kai; Chen, Haifeng; Jiang, Guofei; Chen, Zhengzhang; Wang, Wei

    2016-08-01

The modern world has witnessed a dramatic increase in our ability to collect, transmit and distribute real-time monitoring and surveillance data from large-scale information systems and cyber-physical systems. Detecting system anomalies thus attracts a significant amount of interest in many fields such as security, fault management, and industrial optimization. Recently, invariant networks have been shown to be a powerful way of characterizing complex system behaviours. In an invariant network, a node represents a system component and an edge indicates a stable, significant interaction between two components. Structures and evolutions of the invariant network, in particular the vanishing correlations, can shed important light on locating causal anomalies and performing diagnosis. However, existing approaches to detecting causal anomalies with the invariant network often use the percentage of vanishing correlations to rank possible causal components, which has several limitations: 1) fault propagation in the network is ignored; 2) the root causal anomalies may not always be the nodes with a high percentage of vanishing correlations; 3) temporal patterns of vanishing correlations are not exploited for robust detection. To address these limitations, in this paper we propose a network-diffusion-based framework to identify significant causal anomalies and rank them. Our approach can effectively model fault propagation over the entire invariant network, and can perform joint inference on both the structural and the time-evolving broken invariance patterns. As a result, it can locate high-confidence anomalies that are truly responsible for the vanishing correlations, and can compensate for unstructured measurement noise in the system. Extensive experiments on synthetic datasets, bank information system datasets, and coal plant cyber-physical system datasets demonstrate the effectiveness of our approach.

  20. Detection of occult infection following total joint arthroplasty using sequential technetium-99m HDP bone scintigraphy and indium-111 WBC imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, J.A.; Christie, M.J.; Sandler, M.P.

    1988-08-01

Preoperative exclusion or confirmation of periprosthetic infection is essential for correct surgical management of patients with suspected infected joint prostheses. The sensitivity and specificity of 111In-WBC imaging in the diagnosis of infected total joint prostheses was examined in 28 patients and compared with sequential 99mTc-HDP/111In-WBC scintigraphy and aspiration arthrography. The sensitivity of preoperative aspiration cultures was 12%, with a specificity of 81% and an accuracy of 58%. The sensitivity of 111In-WBC imaging alone was 100%, with a specificity of 50% and an accuracy of 65%. When correlated with the bone scintigraphy and read as sequential 99mTc-HDP/111In-WBC imaging, the sensitivity was 88%, specificity 95%, and accuracy 93%. This study demonstrates that 111In-WBC imaging is an extremely sensitive imaging modality for the detection of occult infection of joint prostheses. It also demonstrates the necessity of correlating 111In-WBC images with 99mTc-HDP skeletal scintigraphy in the detection of occult periprosthetic infection.

  1. An anomaly detection approach for the identification of DME patients using spectral domain optical coherence tomography images.

    PubMed

    Sidibé, Désiré; Sankar, Shrinivasan; Lemaître, Guillaume; Rastgoo, Mojdeh; Massich, Joan; Cheung, Carol Y; Tan, Gavin S W; Milea, Dan; Lamoureux, Ecosse; Wong, Tien Y; Mériaudeau, Fabrice

    2017-02-01

    This paper proposes a method for automatic classification of spectral domain OCT data for the identification of patients with retinal diseases such as Diabetic Macular Edema (DME). We address this issue as an anomaly detection problem and propose a method that not only allows the classification of the OCT volume, but also allows the identification of the individual diseased B-scans inside the volume. Our approach is based on modeling the appearance of normal OCT images with a Gaussian Mixture Model (GMM) and detecting abnormal OCT images as outliers. The classification of an OCT volume is based on the number of detected outliers. Experimental results with two different datasets show that the proposed method achieves a sensitivity and a specificity of 80% and 93% on the first dataset, and 100% and 80% on the second one. Moreover, the experiments show that the proposed method achieves better classification performance than other recently published works. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
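The GMM-plus-outlier-count scheme described here can be sketched with a small numpy-only EM on synthetic B-scan features. The feature dimension, two-component mixture, and 1% outlier threshold are illustrative assumptions, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_gmm(X, iters=50):
    """Two-component diagonal-covariance GMM fitted by EM (numpy-only sketch)."""
    n, d = X.shape
    mu = X[[int(X[:, 0].argmin()), int(X[:, 0].argmax())]]  # spread-out init
    var = np.ones((2, d))
    w = np.full(2, 0.5)
    for _ in range(iters):
        logp = (-0.5 * (((X[:, None] - mu) ** 2 / var)
                        + np.log(2 * np.pi * var)).sum(-1) + np.log(w))
        r = np.exp(logp - logp.max(1, keepdims=True))
        r /= r.sum(1, keepdims=True)           # responsibilities (E-step)
        nk = r.sum(0)                          # M-step updates follow
        w = nk / n
        mu = (r.T @ X) / nk[:, None]
        var = (r.T @ X ** 2) / nk[:, None] - mu ** 2 + 1e-6
    return w, mu, var

def log_density(X, w, mu, var):
    """Per-sample GMM log-likelihood (log-sum-exp over components)."""
    logp = (-0.5 * (((X[:, None] - mu) ** 2 / var)
                    + np.log(2 * np.pi * var)).sum(-1) + np.log(w))
    m = logp.max(1, keepdims=True)
    return (m + np.log(np.exp(logp - m).sum(1, keepdims=True))).ravel()

# Features of "normal" B-scans form two clusters; diseased scans fall elsewhere.
normal = np.r_[rng.normal(0, 1, (200, 3)), rng.normal(5, 1, (200, 3))]
params = fit_gmm(normal)
tau = np.quantile(log_density(normal, *params), 0.01)   # outlier threshold

# A test "volume": 20 normal B-scans plus 5 clearly diseased ones.
volume = np.r_[rng.normal(0, 1, (20, 3)), rng.normal(10, 1, (5, 3))]
n_outliers = int((log_density(volume, *params) < tau).sum())
```

The volume would then be classified by comparing `n_outliers` against a count threshold, mirroring the paper's volume-level decision.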

  2. Change and Anomaly Detection in Real-Time GPS Data

    NASA Astrophysics Data System (ADS)

    Granat, R.; Pierce, M.; Gao, X.; Bock, Y.

    2008-12-01

The California Real-Time Network (CRTN) is currently generating real-time GPS position data at a rate of 1-2 Hz at over 80 locations. The CRTN data presents the possibility of studying dynamical solid earth processes in a way that complements existing seismic networks. To realize this possibility we have developed a prototype system for detecting changes and anomalies in the real-time data. Through this system, we can correlate changes in multiple stations in order to detect signals with geographical extent. Our approach involves developing a statistical model for each GPS station in the network, and then using those models to segment the time series into a number of discrete states described by the model. We use a hidden Markov model (HMM) to describe the behavior of each station; fitting the model to the data requires neither labeled training examples nor a priori information about the system. As such, HMMs are well suited to this problem domain, in which the data remains largely uncharacterized. There are two main components to our approach. The first is the model fitting algorithm, regularized deterministic annealing expectation-maximization (RDAEM), which provides robust, high-quality results. The second is a web service infrastructure that connects the data to the statistical modeling analysis and allows us to easily present the results of that analysis through a web portal interface. This web service approach facilitates the automatic updating of station models to keep pace with dynamical changes in the data. Our web portal interface is critical to the process of interpreting the data. A Google Maps interface allows users to visually interpret state changes not only on individual stations but across the entire network. Users can drill down from the map interface to inspect detailed results for individual stations, download the time series data, and inspect fitted models. Alternatively, users can use the web portal to look at the evolution of changes on the network.
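The segmentation idea, reducing a position time series to discrete states, can be illustrated with a minimal Gaussian-emission HMM decoded by the Viterbi algorithm. Here the model parameters are fixed by hand rather than fitted with RDAEM as in the actual system, and the data are synthetic.

```python
import numpy as np

def viterbi_gaussian(y, means, var, trans, pi):
    """Most likely state sequence for a 1-D Gaussian-emission HMM (log domain)."""
    K, T = len(means), len(y)
    loge = (-0.5 * np.subtract.outer(y, means) ** 2 / var
            - 0.5 * np.log(2 * np.pi * var))          # T x K emission log-probs
    logd = np.log(pi) + loge[0]
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        scores = logd[:, None] + np.log(trans)        # scores[i, j]: state i -> j
        back[t] = scores.argmax(axis=0)
        logd = scores.max(axis=0) + loge[t]
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Synthetic 1-D position series: quiescent near 0 mm, then a step to 5 mm.
y = np.r_[np.random.default_rng(1).normal(0, 0.5, 50),
          np.random.default_rng(2).normal(5, 0.5, 50)]
states = viterbi_gaussian(y, means=np.array([0.0, 5.0]), var=0.25,
                          trans=np.array([[0.99, 0.01], [0.01, 0.99]]),
                          pi=np.array([0.5, 0.5]))
```

The decoded state sequence switches once, at the injected offset, which is the kind of discrete change the network-wide correlation step would then look for across stations.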

  3. Sequential projection pursuit for optimised vibration-based damage detection in an experimental wind turbine blade

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2018-02-01

To advance the concept of smart structures in large systems, such as wind turbines (WTs), it is desirable to be able to detect structural damage early while using minimal instrumentation. Data-driven vibration-based damage detection methods can be competitive in that respect because global vibrational responses encompass the entire structure. Multivariate damage sensitive features (DSFs) extracted from acceleration responses enable changes in a structure to be detected via statistical methods. However, even though such DSFs contain information about the structural state, they may not be optimised for the damage detection task. This paper addresses this shortcoming by exploring a DSF projection technique specialised for statistical structural damage detection. High dimensional initial DSFs are projected onto a low-dimensional space for improved damage detection performance and simultaneous computational burden reduction. The technique is based on sequential projection pursuit where the projection vectors are optimised one by one using an advanced evolutionary strategy. The approach is applied to laboratory experiments with a small-scale WT blade under wind-like excitations. Autocorrelation function coefficients calculated from acceleration signals are employed as DSFs. The optimal numbers of projection vectors are identified with the help of a fast forward selection procedure. To benchmark the proposed method, selections of original DSFs as well as principal component analysis scores from these features are additionally investigated. The optimised DSFs are tested for damage detection on previously unseen data from the healthy state and a wide range of damage scenarios. It is demonstrated that using selected subsets of the initial and transformed DSFs improves damage detectability compared to the full set of features. Furthermore, superior results can be achieved by projecting autocorrelation coefficients onto just a single optimised projection vector.
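The core optimisation, tuning one projection vector at a time with an evolutionary strategy, can be sketched with a toy (1+1)-ES on synthetic DSFs. The Fisher-style separation objective and all data below are illustrative assumptions; the paper uses a more advanced strategy and real autocorrelation features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multivariate DSFs: healthy vs damaged differ only along one hidden direction.
d = 10
hidden = np.zeros(d); hidden[3] = 1.0
healthy = rng.normal(size=(200, d))
damaged = rng.normal(size=(200, d)) + 1.5 * hidden

def detectability(w):
    """Fisher-style separation of the two classes along projection w (scale-free)."""
    w = w / np.linalg.norm(w)
    a, b = healthy @ w, damaged @ w
    return (a.mean() - b.mean()) ** 2 / (a.var() + b.var())

# (1+1) evolution strategy: mutate the vector, keep the mutant if it separates better.
w = rng.normal(size=d)
for _ in range(500):
    cand = w + 0.3 * rng.normal(size=d)
    if detectability(cand) > detectability(w):
        w = cand
w = w / np.linalg.norm(w)
```

Because selection is elitist, the detectability score improves monotonically; subsequent projection vectors would be optimised one by one on the residual feature space.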

  4. Discrimination between pre-seismic electromagnetic anomalies and solar activity effects

    NASA Astrophysics Data System (ADS)

    Koulouras, G.; Balasis, G.; Kiourktsidis, I.; Nannos, E.; Kontakos, K.; Stonham, J.; Ruzhin, Y.; Eftaxias, K.; Cavouras, D.; Nomicos, C.

    2009-04-01

    Laboratory studies suggest that electromagnetic emissions in a wide frequency spectrum ranging from kilohertz (kHz) to very high megahertz (MHz) frequencies are produced by the opening of microcracks, with the MHz radiation appearing earlier than the kHz radiation. Earthquakes are large-scale fracture phenomena in the Earth's heterogeneous crust. Thus, the radiated kHz-MHz electromagnetic emissions are detectable not only in the laboratory but also at a geological scale. Clear MHz-to-kHz electromagnetic anomalies have been systematically detected over periods ranging from a few days to a few hours prior to recent destructive earthquakes in Greece. We should bear in mind that whether electromagnetic precursors to earthquakes exist is an important question not only for earthquake prediction but mainly for understanding the physical processes of earthquake generation. An open question in this field of research is the classification of a detected electromagnetic anomaly as a pre-seismic signal associated with earthquake occurrence. Indeed, electromagnetic fluctuations in the frequency range of MHz are known to be related to a few sources, including atmospheric noise (due to lightning), man-made composite noise, solar-terrestrial noise (resulting from the Sun-solar wind-magnetosphere-ionosphere-Earth's surface chain) or cosmic noise, and finally, the lithospheric effect, namely pre-seismic activity. We focus on this point in this paper. We suggest that if a combination of detected kHz and MHz electromagnetic anomalies satisfies the set of criteria presented herein, these anomalies could be considered as candidate precursory phenomena of an impending earthquake.

  5. Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.

    PubMed

    Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen

    2015-11-01

Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented a variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is embedded in the EUROCAT Central Database for annual application to centrally held registry data. The methodology was incrementally adapted to improve its utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating the date of conception as the unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting of multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies.
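A case-count-based scan window of the kind EUROCAT describes can be sketched as follows: the statistic is the shortest time span containing k consecutive cases, with significance assessed by Monte Carlo simulation under a uniform-rate null (the lookup-table refinement is omitted, and all counts and dates below are synthetic).

```python
import numpy as np

rng = np.random.default_rng(0)

def min_span(times, k):
    """Shortest time interval containing k consecutive cases (case-count window)."""
    t = np.sort(times)
    return (t[k - 1:] - t[:len(t) - k + 1]).min()

def scan_p_value(times, k, n_sim=2000):
    """Monte Carlo p-value: is the tightest k-case window shorter than chance?"""
    obs = min_span(times, k)
    total = times.max() - times.min()
    sims = [min_span(rng.uniform(0, total, len(times)), k) for _ in range(n_sim)]
    return (1 + sum(s <= obs for s in sims)) / (n_sim + 1)

# 50 background cases over ~5 years plus 10 cases packed into one month near t = 2.
times = np.r_[rng.uniform(0, 5, 50), rng.uniform(2.0, 2.08, 10)]
p = scan_p_value(times, k=8)
```

Because the window is defined by case count rather than calendar time, the same machinery detects both short intense clusters and longer diffuse ones.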

  6. Ionospheric anomalies detected by ionosonde and possibly related to crustal earthquakes in Greece

    NASA Astrophysics Data System (ADS)

    Perrone, Loredana; De Santis, Angelo; Abbattista, Cristoforo; Alfonsi, Lucilla; Amoruso, Leonardo; Carbone, Marianna; Cesaroni, Claudio; Cianchini, Gianfranco; De Franceschi, Giorgiana; De Santis, Anna; Di Giovambattista, Rita; Marchetti, Dedalo; Pavòn-Carrasco, Francisco J.; Piscini, Alessandro; Spogli, Luca; Santoro, Francesca

    2018-03-01

Ionosonde data and crustal earthquakes with magnitude M ≥ 6.0 observed in Greece during the 2003-2015 period were examined to check if the relationships obtained earlier between precursory ionospheric anomalies and earthquakes in Japan and central Italy are also valid for Greek earthquakes. The ionospheric anomalies are identified on the observed variations of the sporadic E-layer parameters (h'Es, foEs) and foF2 at the ionospheric station of Athens. The corresponding empirical relationships between the seismo-ionospheric disturbances and the earthquake magnitude and the epicentral distance are obtained and found to be similar to those previously published for other case studies. The large lead times found for the occurrence of the ionospheric anomalies may confirm a rather long earthquake preparation period. The possibility of using the relationships obtained for earthquake prediction is finally discussed.

  7. Practical method to identify orbital anomaly as spacecraft breakup in the geostationary region

    NASA Astrophysics Data System (ADS)

    Hanada, Toshiya; Uetsuhara, Masahiko; Nakaniwa, Yoshitaka

    2012-07-01

    Identifying a spacecraft breakup is an essential issue to define the current orbital debris environment. This paper proposes a practical method to identify an orbital anomaly, which appears as a significant discontinuity in the observation data, as a spacecraft breakup. The proposed method is applicable to orbital anomalies in the geostationary region. Long-term orbital evolutions of breakup fragments may conclude that their orbital planes will converge into several corresponding regions in inertial space even if the breakup epoch is not specified. This empirical method combines the aforementioned conclusion with the search strategy developed at Kyushu University, which can identify origins of observed objects as fragments released from a specified spacecraft. This practical method starts with selecting a spacecraft that experienced an orbital anomaly, and formulates a hypothesis to generate fragments from the anomaly. Then, the search strategy is applied to predict the behavior of groups of fragments hypothetically generated. Outcome of this predictive analysis specifies effectively when, where and how we should conduct optical measurements using ground-based telescopes. Objects detected based on the outcome are supposed to be from the anomaly, so that we can confirm the anomaly as a spacecraft breakup to release the detected objects. This paper also demonstrates observation planning for a spacecraft anomaly in the geostationary region.

  8. Automated determinations of selenium in thermal power plant wastewater by sequential hydride generation and chemiluminescence detection.

    PubMed

    Ezoe, Kentaro; Ohyama, Seiichi; Hashem, Md Abul; Ohira, Shin-Ichi; Toda, Kei

    2016-02-01

After the Fukushima disaster, power generation from nuclear power plants in Japan was completely stopped and old coal-based power plants were re-commissioned to compensate for the decrease in power generation capacity. Although coal is a relatively inexpensive fuel for power generation, it contains high levels (mg kg(-1)) of selenium, which could contaminate the wastewater from thermal power plants. In this work, an automated selenium monitoring system was developed based on sequential hydride generation and chemiluminescence detection. This method could be applied to the control of wastewater contamination. In this method, selenium is vaporized as H2Se, which reacts with ozone to produce chemiluminescence. However, interference from arsenic is of concern because the ozone-induced chemiluminescence intensity of H2Se is much lower than that of AsH3. This problem was successfully addressed by vaporizing arsenic and selenium individually in a sequential procedure using a syringe pump equipped with an eight-port selection valve and hot and cold reactors. Oxidative decomposition of organoselenium compounds and pre-reduction of the selenium were performed in the hot reactor, and vapor generation of arsenic and selenium were performed separately in the cold reactor. Sample transfers between the reactors were carried out by a pneumatic air operation by switching with three-way solenoid valves. The detection limit for selenium was 0.008 mg L(-1) and the calibration curve was linear up to 1.0 mg L(-1), which provided suitable performance for controlling selenium in wastewater to around the allowable limit (0.1 mg L(-1)). This system consumes few chemicals and is stable for more than a month without any maintenance. Wastewater samples from thermal power plants were collected, and data obtained by the proposed method were compared with those from batchwise water treatment followed by hydride generation-atomic fluorescence spectrometry. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Anomaly-free models for flavour anomalies

    NASA Astrophysics Data System (ADS)

    Ellis, John; Fairbairn, Malcolm; Tunney, Patrick

    2018-03-01

We explore the constraints imposed by the cancellation of triangle anomalies on models in which the flavour anomalies reported by LHCb and other experiments are due to an extra U(1)′ gauge boson Z′. We assume universal and rational U(1)′ charges for the first two generations of left-handed quarks and of right-handed up-type quarks but allow different charges for their third-generation counterparts. If the right-handed charges vanish, cancellation of the triangle anomalies requires all the quark U(1)′ charges to vanish, if there are either no exotic fermions or there is only one Standard Model singlet dark matter (DM) fermion. There are non-trivial anomaly-free models with more than one such `dark' fermion, or with a single DM fermion if right-handed up-type quarks have non-zero U(1)′ charges. In some of the latter models the U(1)′ couplings of the first- and second-generation quarks all vanish, weakening the LHC Z′ constraint, and in some other models the DM particle has purely axial couplings, weakening the direct DM scattering constraint. We also consider models in which anomalies are cancelled via extra vector-like leptons, showing how the prospective LHC Z′ constraint may be weakened because the Z′ → μ⁺μ⁻ branching ratio is suppressed relative to other decay modes.
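The triangle-anomaly cancellation conditions constraining such charge assignments can be checked with exact rational arithmetic. As an illustration, using the known Standard Model hypercharges for one generation (not the paper's U(1)′ charges), the linear (gravitational-U(1)) and cubic ([U(1)]³) conditions both vanish:

```python
from fractions import Fraction as F

# One SM generation as left-handed Weyl fermions: (multiplicity, hypercharge Y).
# Q (3 colours x 2 isospin), u^c and d^c (3 colours each), L (2 isospin), e^c.
fermions = [(6, F(1, 6)), (3, F(-2, 3)), (3, F(1, 3)), (2, F(-1, 2)), (1, F(1, 1))]

# Two of the triangle conditions: sum of Y and sum of Y^3 must both vanish.
linear = sum(n * y for n, y in fermions)
cubic = sum(n * y ** 3 for n, y in fermions)
print(linear, cubic)   # prints: 0 0
```

The analogous conditions on the U(1)′ charges, plus the mixed SU(3)²-U(1)′, SU(2)²-U(1)′ and U(1)²-U(1)′ ones, are what force the quark charges to vanish in the restricted cases the abstract describes.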

  10. Observed TEC Anomalies by GNSS Sites Preceding the Aegean Sea Earthquake of 2014

    NASA Astrophysics Data System (ADS)

Ulukavak, Mustafa; Yalçınkaya, Mualla

    2016-11-01

In recent years, Total Electron Content (TEC) data obtained from Global Navigation Satellite Systems (GNSS) receivers have been widely used to detect seismo-ionospheric anomalies. In this study, Global Positioning System - Total Electron Content (GPS-TEC) data were used to investigate abnormal ionospheric behavior prior to the 2014 Aegean Sea earthquake (40.305°N 25.453°E, 24 May 2014, 09:25:03 UT, Mw:6.9). The data obtained from three Continuously Operating Reference Stations in Turkey (CORS-TR) and two International GNSS Service (IGS) sites near the epicenter of the earthquake were used to detect ionospheric anomalies before the earthquake. The solar activity index (F10.7) and geomagnetic activity index (Dst), which are both related to space weather conditions, were used to analyze these pre-earthquake ionospheric anomalies. An examination of these indices indicated high solar activity between May 8 and 15, 2014. The first significant increase (positive anomaly) in Vertical Total Electron Content (VTEC) was detected on May 14, 2014, 10 days before the earthquake. This positive anomaly can be attributed to the high solar activity. The indices do not imply high solar or geomagnetic activity after May 15, 2014. Abnormal ionospheric TEC changes (a negative anomaly) were observed at all stations one day before the earthquake. These changes were lower than the lower bound by approximately 10-20 TEC units (TECU), and may be considered the ionospheric precursor of the 2014 Aegean Sea earthquake.
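The upper/lower-bound flagging typical of such TEC studies can be sketched with a trailing median ± k·IQR envelope over a sliding window. The window length, the factor k, and the synthetic series below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def tec_anomalies(vtec, window=15, k=1.5):
    """Flag days whose VTEC leaves the median +/- k*IQR of the trailing window."""
    flags = []
    for i in range(window, len(vtec)):
        q1, med, q3 = np.percentile(vtec[i - window:i], [25, 50, 75])
        lo, hi = med - k * (q3 - q1), med + k * (q3 - q1)
        if vtec[i] > hi:
            flags.append((i, "positive"))
        elif vtec[i] < lo:
            flags.append((i, "negative"))
    return flags

rng = np.random.default_rng(3)
vtec = 20 + rng.normal(0, 1, 40)   # quiet background around 20 TECU
vtec[30] += 15                     # injected positive excursion
flags = tec_anomalies(vtec)
```

In practice the flagged days would then be cross-checked against F10.7 and Dst, as in the study, to exclude solar and geomagnetic explanations before calling a day precursory.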

  11. Vascular Anomalies (Part I): Classification and Diagnostics of Vascular Anomalies.

    PubMed

    Sadick, Maliha; Müller-Wille, René; Wildgruber, Moritz; Wohlgemuth, Walter A

    2018-06-06

Vascular anomalies are a diagnostic and therapeutic challenge. They require dedicated interdisciplinary management. Optimal patient care relies on integral medical evaluation and a classification system established by experts in the field, to provide a better understanding of these complex vascular entities. A dedicated classification system according to the International Society for the Study of Vascular Anomalies (ISSVA) and the German Interdisciplinary Society of Vascular Anomalies (DiGGefA) is presented. The vast spectrum of diagnostic modalities, ranging from ultrasound with color Doppler, conventional X-ray, CT with 4D imaging and MRI, as well as catheter angiography for appropriate assessment, is discussed. Congenital vascular anomalies comprise vascular tumors, based on endothelial cell proliferation, and vascular malformations with an underlying mesenchymal and angiogenetic disorder. Vascular tumors tend to regress with the patient's age, whereas vascular malformations increase in size and are subdivided into capillary, venous, lymphatic, arterio-venous and combined malformations, depending on their dominant vasculature. According to their appearance, venous malformations are the most common representative of vascular anomalies (70 %), followed by lymphatic malformations (12 %), arterio-venous malformations (8 %), combined malformation syndromes (6 %) and capillary malformations (4 %). The aim is to provide an overview of the current classification system and diagnostic characterization of vascular anomalies in order to facilitate interdisciplinary management of vascular anomalies. · Vascular anomalies comprise vascular tumors and vascular malformations, both considered to be rare diseases. · Appropriate treatment depends on correct classification and diagnosis of vascular anomalies, which is based on established national and international classification systems, recommendations and guidelines. · In the classification

  12. Hot spots of multivariate extreme anomalies in Earth observations

    NASA Astrophysics Data System (ADS)

    Flach, M.; Sippel, S.; Bodesheim, P.; Brenning, A.; Denzler, J.; Gans, F.; Guanche, Y.; Reichstein, M.; Rodner, E.; Mahecha, M. D.

    2016-12-01

Anomalies in Earth observations might indicate data quality issues, extremes or the change of underlying processes within a highly multivariate system. Thus, considering the multivariate constellation of variables for extreme detection yields crucial additional information over conventional univariate approaches. We highlight areas in which multivariate extreme anomalies are more likely to occur, i.e. hot spots of extremes in global atmospheric Earth observations that impact the Biosphere. In addition, we present the year of the most unusual multivariate extreme between 2001 and 2013 and show that these coincide with well known high impact extremes. Technically speaking, we account for multivariate extremes by using three sophisticated algorithms adapted from computer science applications: an ensemble of the k-nearest-neighbours mean distance, kernel density estimation, and an approach based on recurrences. However, the impact of atmosphere extremes on the Biosphere might largely depend on what is considered to be normal, i.e. the shape of the mean seasonal cycle and its inter-annual variability. We identify regions with similar mean seasonality by means of dimensionality reduction in order to estimate in each region both the `normal' variance and robust thresholds for detecting the extremes. In addition, we account for challenges like heteroscedasticity in Northern latitudes. Apart from hot spot areas, those anomalies in the atmosphere time series are of particular interest which can only be detected by a multivariate approach but not by a simple univariate approach. Such an anomalous constellation of atmosphere variables is of interest if it impacts the Biosphere. The multivariate constellation of such an anomalous part of a time series is shown in one case study, indicating that multivariate anomaly detection can provide novel insights into Earth observations.
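One of the three detectors named above, the k-nearest-neighbours mean distance, is straightforward to sketch on synthetic multivariate observations (the data and k below are illustrative):

```python
import numpy as np

def knn_mean_distance(X, k=5):
    """Anomaly score per sample: mean Euclidean distance to its k nearest neighbours."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)          # a point is not its own neighbour
    return np.sort(D, axis=1)[:, :k].mean(axis=1)

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))            # "normal" multivariate observations
X[0] = 8.0                               # one multivariate extreme
scores = knn_mean_distance(X)
```

A point far from the bulk in the joint space receives a much larger score even if each of its coordinates alone would not look extreme, which is exactly the advantage over univariate thresholds.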

  13. Discovering System Health Anomalies Using Data Mining Techniques

    NASA Technical Reports Server (NTRS)

Srivastava, Ashok N.

    2005-01-01

We present a data mining framework for the analysis and discovery of anomalies in high-dimensional time series of sensor measurements that would be found in an Integrated System Health Monitoring system. We specifically treat the problem of discovering anomalous features in the time series that may be indicative of a system anomaly, or in the case of a manned system, an anomaly due to the human. Identification of these anomalies is crucial to building stable, reusable, and cost-efficient systems. The framework consists of an analysis platform and new algorithms that can scale to thousands of sensor streams to discover temporal anomalies. We discuss the mathematical framework that underlies the system and also describe in detail how this framework is general enough to encompass both discrete and continuous sensor measurements. We also describe a new set of data mining algorithms based on kernel methods and hidden Markov models that allow for the rapid assimilation, analysis, and discovery of system anomalies. We then describe the performance of the system on a real-world problem in the aircraft domain where we analyze the cockpit data from aircraft as well as data from the aircraft propulsion, control, and guidance systems. These data are discrete and continuous sensor measurements and are dealt with seamlessly in order to discover anomalous flights. We conclude with recommendations that describe the tradeoffs in building an integrated scalable platform for robust anomaly detection in ISHM applications.

  14. The incidence of coronary anomalies on routine coronary computed tomography scans

    PubMed Central

    Karabay, Kanber Ocal; Yildiz, Abdulmelik; Bagirtan, Bayram; Geceer, Gurkan; Uysal, Ender

    2013-01-01

    Summary Objective This study aimed to assess the incidence of coronary anomalies using 64-multi-slice coronary computed tomography (MSCT). Methods The diagnostic MSCT scans of 745 consecutive patients were reviewed. Results The incidence of coronary anomalies was 4.96%. The detected coronary anomalies included the conus artery originating separately from the right coronary sinus (RCS) (n = 8, 1.07%), absence of the left main artery (n = 7, 0.93%), a superior right coronary artery (RCA) (n = 7, 0.93%), the circumflex artery (CFX) arising from the RCS (n = 4, 0.53%), the CFX originating from the RCA (n = 2, 0.26%), a posterior RCA (n = 1, 0.13%), a coronary fistula from the left anterior descending artery and RCA to the pulmonary artery (n = 1, 0.13%), and a coronary aneurysm (n = 1, 0.13%). Conclusions This study indicated that MSCT can be used to detect common coronary anomalies, and shows it has the potential to aid cardiologists and cardiac surgeons by revealing the origin and course of the coronary vessels. PMID:24042853

  15. A comparison of classical and intelligent methods to detect potential thermal anomalies before the 11 August 2012 Varzeghan, Iran, earthquake (Mw = 6.4)

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-04-01

    In this paper, a number of classical and intelligent methods, including interquartile, autoregressive integrated moving average (ARIMA), artificial neural network (ANN) and support vector machine (SVM), have been proposed to quantify potential thermal anomalies around the time of the 11 August 2012 Varzeghan, Iran, earthquake (Mw = 6.4). The duration of the data set, which is comprised of Aqua-MODIS land surface temperature (LST) night-time snapshot images, is 62 days. In order to quantify variations of LST data obtained from satellite images, the air temperature (AT) data derived from the meteorological station close to the earthquake epicenter have been taken into account. For the models examined here, results indicate the following: (i) ARIMA models, which are the most widely used in the time series community for short-term forecasting, are quickly and easily implemented, and can efficiently act through linear solutions. (ii) A multilayer perceptron (MLP) feed-forward neural network can be a suitable non-parametric method to detect the anomalous changes of a non-linear time series such as variations of LST. (iii) SVMs are widely used for classification and regression tasks; if the difference between the value predicted by the SVM method and the observed value exceeds a pre-defined threshold, the observed value can be regarded as an anomaly. (iv) ANN and SVM methods can be powerful tools for modeling complex phenomena such as earthquake precursor time series, where the underlying data-generating process may be unknown. There is good agreement in the results obtained from the different methods for quantifying potential anomalies in a given LST time series. The detection of the potential thermal anomalies thus derives credibility from the combined efficiency of the four integrated methods.
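
    The threshold rule in (iii) can be sketched generically: predict each LST value from recent history and flag observations whose residual exceeds k standard deviations of the residuals seen so far. A moving-average predictor stands in here for the ARIMA/ANN/SVM predictors of the paper; `window`, `k` and the sample series are illustrative choices.

    ```python
    import statistics

    def detect_anomalies(series, window=5, k=2.0):
        """Flag indices whose residual from a moving-average prediction
        exceeds k standard deviations of the residuals seen so far."""
        flagged, residuals = [], []
        for t in range(window, len(series)):
            pred = sum(series[t - window:t]) / window   # stand-in predictor
            r = series[t] - pred
            if len(residuals) >= 2:
                sigma = statistics.pstdev(residuals) or 1e-9
                if abs(r) > k * sigma:
                    flagged.append(t)
            residuals.append(r)
        return flagged

    # Synthetic LST-like series with one injected thermal spike at index 7.
    lst = [20.0, 20.5, 20.0, 20.3, 20.1, 20.4, 20.2, 30.0, 20.1, 20.3]
    ```

    Here `detect_anomalies(lst)` flags only the spike; note the flagged residual also inflates sigma afterwards, which makes the detector self-dampening after large excursions.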

  16. Contribution of ground surface altitude difference to thermal anomaly detection using satellite images: Application to volcanic/geothermal complexes in the Andes of Central Chile

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Francisco J.; Lemus, Martín; Parada, Miguel A.; Benavente, Oscar M.; Aguilera, Felipe A.

    2012-09-01

    Detection of thermal anomalies in volcanic-geothermal areas using remote sensing methodologies requires the subtraction of temperatures not produced by geothermal manifestations (e.g. hot springs, fumaroles, active craters) from the satellite image kinetic temperature, which is assumed to correspond to the ground surface temperature. Temperatures that have been subtracted in current models include those derived from the atmospheric transmittance, reflectance of the Earth's surface (albedo), topography effect, thermal inertia and geographic position effect. We propose a model that includes a new parameter (K) accounting for the variation of temperature with ground surface altitude in areas of steep relief. The proposed model was developed and applied, using ASTER satellite images, in two Andean volcanic/geothermal complexes (Descabezado Grande-Cerro Azul Volcanic Complex and Planchón-Peteroa-Azufre Volcanic Complex), where field data on atmosphere and ground surface temperature, as well as radiation for albedo calibration, were obtained at 10 selected sites. The study area was divided into three zones (Northern, Central and Southern), for which the thermal anomalies were obtained independently. The K values calculated for night images of the three zones are better constrained and turn out to be very similar to the Environmental Lapse Rate (ELR) determined for a stable atmosphere (ELR > 7 °C/km). Using the proposed model, numerous thermal anomalies in areas of ≥ 90 m × 90 m were identified and successfully cross-checked in the field. Night images provide more reliable information for thermal anomaly detection than day images because they record a higher temperature contrast between geothermal areas and their surroundings and correspond to more stable atmospheric conditions at the time of image acquisition.
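
    The role of the altitude term can be illustrated with a minimal sketch (an assumed reduced form; the paper's full model also subtracts transmittance, albedo, topography and thermal-inertia contributions): reduce each pixel's temperature to a common reference altitude with a lapse rate K before differencing against the background.

    ```python
    def altitude_corrected_anomaly(t_pixel, z_pixel_m, t_background, z_ref_m, K=7.0):
        """Correct a pixel temperature (degC) to a reference altitude using a
        lapse rate K in degC/km (K ~ 7 degC/km, the stable-atmosphere ELR
        quoted in the text), then return the residual over the background."""
        t_corrected = t_pixel + K * (z_pixel_m - z_ref_m) / 1000.0
        return t_corrected - t_background

    # A 5 degC pixel at 3500 m is actually +2 degC anomalous relative to a
    # 10 degC background defined at 2500 m, once the ~7 degC of expected
    # altitude cooling is accounted for.
    residual = altitude_corrected_anomaly(5.0, 3500.0, 10.0, 2500.0)
    ```

    Without the K term the same pixel would look 5 degC colder than background and the geothermal signal would be masked by relief.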

  17. Prenatal detection of structural cardiac defects and presence of associated anomalies: a retrospective observational study of 1262 fetal echocardiograms.

    PubMed

    Mone, Fionnuala; Walsh, Colin; Mulcahy, Cecelia; McMahon, Colin J; Farrell, Sinead; MacTiernan, Aoife; Segurado, Ricardo; Mahony, Rhona; Higgins, Shane; Carroll, Stephen; McParland, Peter; McAuliffe, Fionnuala M

    2015-06-01

    The aim of this study is to document the detection of fetal congenital heart defect (CHD) in relation to the following: (1) indication for referral, (2) chromosomal and (3) extracardiac abnormalities. All fetal echocardiograms performed in our institution from 2007 to 2011 were reviewed retrospectively. Indication for referral, cardiac diagnosis based on the World Health Organization International Classification of Diseases tenth revision criteria and the presence of chromosomal and extracardiac defects were recorded. Of 1262 echocardiograms, 287 (22.7%) had CHD. Abnormal anatomy scan in pregnancies originally considered to be at low risk of CHD was the best indicator for detecting CHD (91.2% of positive cardiac diagnoses), compared with other indications of family history (5.6%) or maternal medical disorder (3.1%). Congenital anomalies of the cardiac septa comprised the largest category (n = 89), within which atrioventricular septal defects were the most common anomaly (n = 36). Invasive prenatal testing was performed for 126 of 287 cases, of which 44% (n = 55) had a chromosomal abnormality. Of 232 fetuses without chromosomal abnormalities, 31% had an extracardiac defect (n = 76). Most CHDs occur in pregnancies regarded to be at low risk, highlighting the importance of a routine midtrimester fetal anatomy scan. Frequent association of fetal CHD and chromosomal and extracardiac pathology emphasises the importance of thorough evaluation of any fetus with CHD. © 2015 John Wiley & Sons, Ltd.

  18. Wolffian duct derivative anomalies: technical considerations when encountered during robot-assisted radical prostatectomy.

    PubMed

    Acharya, Sujeet S; Gundeti, Mohan S; Zagaja, Gregory P; Shalhav, Arieh L; Zorn, Kevin C

    2009-04-01

    Although malformations of the genitourinary tract are typically identified during childhood, they can remain silent until incidental detection during the evaluation and treatment of other pathologies in adulthood. The advent of the minimally invasive era in urologic surgery has given rise to unique challenges in the surgical management of anomalies of the genitourinary tract. This article reviews the embryology of anomalies of Wolffian duct (WD) derivatives with specific attention to the seminal vesicles, vas deferens, ureter, and kidneys. This is followed by a discussion of the history of the laparoscopic approach to WD derivative anomalies. Finally, we present two cases to describe technical considerations when managing these anomalies when encountered during robot-assisted radical prostatectomy. The University of Chicago Robotic Laparoscopic Radical Prostatectomy (RLRP) database was reviewed for cases where anomalies of WD derivatives were encountered. We describe how modifications in technique allowed for completion of the procedure without difficulty. Of the 1230 RLRP procedures performed at our institution by three surgeons, only two cases (0.16%) were noted to have a WD anomaly. Both cases were completed without difficulty by making simple modifications in technique. Although uncommon, it is important for the urologist to be familiar with the origin and surgical management of WD anomalies, particularly when detected incidentally during surgery. Simple modifications in technique allow for completion of RLRP without difficulty.

  19. Archean Isotope Anomalies as a Window into the Differentiation History of the Earth

    NASA Astrophysics Data System (ADS)

    Wainwright, A. N.; Debaille, V.; Zincone, S. A.

    2018-05-01

    No resolvable µ142Nd anomaly was detected in Paleo- to Mesoarchean rocks of the São Francisco and West African cratons. The lack of µ142Nd anomalies outside of North America and Greenland implies that the Earth differentiated into at least two distinct domains.

  20. Investigation of the prevalence of dental anomalies by using digital panoramic radiographs.

    PubMed

    Bilge, Nebiha Hilal; Yeşiltepe, Selin; Törenek Ağırman, Kübra; Çağlayan, Fatma; Bilge, Osman Murat

    2017-09-21

    This study was performed to evaluate the prevalence of all types and subtypes of dental anomalies among 6 to 40-year-old patients by using panoramic radiographs. This cross-sectional study was conducted by analyzing digital panoramic radiographs of 1200 patients admitted to our clinic in 2014. Dental anomalies were examined under five types and 16 subtypes: (a) number (including hypodontia, oligodontia and hyperdontia); (b) size (including microdontia and macrodontia); (c) structure (including amelogenesis imperfecta, dentinogenesis imperfecta and dentin dysplasia); (d) position (including transposition, ectopia, displacement, impaction and inversion); and (e) shape (including fusion-gemination, dilaceration and taurodontism). The prevalence of dental anomalies diagnosed by panoramic radiographs was 39.2% (men 46%, women 54%). Anomalies of position (60.8%) and shape (27.8%) were the most common types of abnormalities, while anomalies of number (17%), size (8.2%) and structure (0.2%) were the least common in both genders. Impaction (45.5%), dilaceration (16.3%), hypodontia (13.8%) and taurodontism (11.2%) were the most common subtypes of dental anomalies. Taurodontism was more common in the 13-19-year age group; all other anomalies were most frequent at ages 20-29. Anomalies of tooth position were the most common type of dental anomaly and structure anomalies the least common in this Turkish dental population. The frequency and type of dental anomalies vary within and between populations, confirming the role of racial factors in the prevalence of dental anomalies. Digital panoramic radiography is a very useful method for the detection of dental anomalies.

  1. Stochastic Control of Multi-Scale Networks: Modeling, Analysis and Algorithms

    DTIC Science & Technology

    2014-10-20

    Theory, (02 2012): 0. doi: B. T. Swapna, Atilla Eryilmaz, Ness B. Shroff. Throughput-Delay Analysis of Random Linear Network Coding for Wireless ... Wireless Sensor Networks and Effects of Long-Range Dependent Data, Sequential Analysis, (10 2012): 0. doi: 10.1080/07474946.2012.719435 Stefano...Sequential Analysis, (10 2012): 0. doi: John S. Baras, Shanshan Zheng. Sequential Anomaly Detection in Wireless Sensor Networks and Effects of Long-Range Dependent Data

  2. Risk of developing palatally displaced canines in patients with early detectable dental anomalies: a retrospective cohort study.

    PubMed

    Garib, Daniela Gamba; Lancia, Melissa; Kato, Renata Mayumi; Oliveira, Thais Marchini; Neves, Lucimara Teixeira das

    2016-01-01

    To estimate the risk of PDC occurrence in children with dental anomalies identified early during mixed dentition. The sample comprised 730 longitudinal orthodontic records from children (448 females and 282 males) with an initial mean age of 8.3 years (SD=1.36). The dental anomaly group (DA) included 263 records of patients with at least one dental anomaly identified in the initial or middle mixed dentition. The non-dental anomaly group (NDA) was composed of 467 records of patients with no dental anomalies. The occurrence of PDC in both groups was diagnosed using panoramic and periapical radiographs taken in the late mixed dentition or early permanent dentition. The prevalence of PDC in patients with and without early diagnosed dental anomalies was compared using the chi-square test (p<0.01), relative risk assessments (RR), and positive and negative predictive values (PPV and NPV). PDC frequency was 16.35% and 6.2% in DA and NDA groups, respectively. A statistically significant difference was observed between groups (p<0.01), with greater risk of PDC development in the DA group (RR=2.63). The PPV and NPV was 16% and 93%, respectively. Small maxillary lateral incisors, deciduous molar infraocclusion, and mandibular second premolar distoangulation were associated with PDC. Children with dental anomalies diagnosed during early mixed dentition have an approximately two and a half fold increased risk of developing PDC during late mixed dentition compared with children without dental anomalies.
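
    The reported statistics can be reproduced from a 2×2 cohort table; the counts below are reconstructed from the percentages in the abstract (43/263 ≈ 16.35% and 29/467 ≈ 6.2%), so small rounding differences from the published values are expected.

    ```python
    def cohort_stats(a, b, c, d):
        """2x2 cohort table: a/b = exposed with/without outcome,
        c/d = unexposed with/without outcome."""
        risk_exposed = a / (a + b)
        risk_unexposed = c / (c + d)
        rr = risk_exposed / risk_unexposed   # relative risk
        ppv = risk_exposed                   # P(PDC | dental anomaly present)
        npv = d / (c + d)                    # P(no PDC | no dental anomaly)
        return rr, ppv, npv

    # DA group: 43 of 263 developed PDC; NDA group: 29 of 467.
    rr, ppv, npv = cohort_stats(43, 263 - 43, 29, 467 - 29)
    ```

    This yields RR ≈ 2.63, PPV ≈ 16% and NPV ≈ 94%, matching the abstract up to rounding: children with an early dental anomaly carry roughly two-and-a-half times the PDC risk, while the absence of anomalies is strongly reassuring.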

  3. Systematic review and meta-analysis of isolated posterior fossa malformations on prenatal ultrasound imaging (part 1): nomenclature, diagnostic accuracy and associated anomalies.

    PubMed

    D'Antonio, F; Khalil, A; Garel, C; Pilu, G; Rizzo, G; Lerman-Sagie, T; Bhide, A; Thilaganathan, B; Manzoli, L; Papageorghiou, A T

    2016-06-01

    To explore the outcome in fetuses with prenatal diagnosis of posterior fossa anomalies apparently isolated on ultrasound imaging. MEDLINE and EMBASE were searched electronically utilizing combinations of relevant medical subject headings for 'posterior fossa' and 'outcome'. The posterior fossa anomalies analyzed were Dandy-Walker malformation (DWM), mega cisterna magna (MCM), Blake's pouch cyst (BPC) and vermian hypoplasia (VH). The outcomes observed were rate of chromosomal abnormalities, additional anomalies detected at prenatal magnetic resonance imaging (MRI), additional anomalies detected at postnatal imaging and concordance between prenatal and postnatal diagnoses. Only isolated cases of posterior fossa anomalies - defined as having no cerebral or extracerebral additional anomalies detected on ultrasound examination - were included in the analysis. Quality assessment of the included studies was performed using the Newcastle-Ottawa Scale for cohort studies. We used meta-analyses of proportions to combine data and fixed- or random-effects models according to the heterogeneity of the results. Twenty-two studies including 531 fetuses with posterior fossa anomalies were included in this systematic review. The prevalence of chromosomal abnormalities in fetuses with isolated DWM was 16.3% (95% CI, 8.7-25.7%). The prevalence of additional central nervous system (CNS) abnormalities that were missed at ultrasound examination and detected only at prenatal MRI was 13.7% (95% CI, 0.2-42.6%), and the prevalence of additional CNS anomalies that were missed at prenatal imaging and detected only after birth was 18.2% (95% CI, 6.2-34.6%). Prenatal diagnosis was not confirmed after birth in 28.2% (95% CI, 8.5-53.9%) of cases. MCM was not significantly associated with additional anomalies detected at prenatal MRI or detected after birth. Prenatal diagnosis was not confirmed postnatally in 7.1% (95% CI, 2.3-14.5%) of cases. The rate of chromosomal anomalies in fetuses with
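
    A fixed-effect meta-analysis of proportions of the kind used here can be sketched as inverse-variance pooling on the logit scale (one standard approach; the review does not spell out its exact transform, so this is illustrative, and the three studies below are invented):

    ```python
    import math

    def pooled_proportion(events_totals):
        """Fixed-effect inverse-variance pooling of proportions on the
        logit scale, returned on the original proportion scale."""
        num = den = 0.0
        for k, n in events_totals:
            k = min(max(k, 0.5), n - 0.5)    # continuity correction at 0%/100%
            p = k / n
            logit = math.log(p / (1.0 - p))
            weight = n * p * (1.0 - p)       # = 1 / Var(logit p)
            num += weight * logit
            den += weight
        return 1.0 / (1.0 + math.exp(-num / den))

    # Hypothetical studies of chromosomal abnormality counts in isolated DWM.
    pooled = pooled_proportion([(16, 100), (20, 100), (12, 100)])
    ```

    Random-effects variants add a between-study variance term to each weight; with heterogeneous results, as in this review, that choice widens the confidence interval around the pooled rate.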

  4. The impact of eyewitness identifications from simultaneous and sequential lineups.

    PubMed

    Wright, Daniel B

    2007-10-01

    Recent guidelines in the US allow either simultaneous or sequential lineups to be used for eyewitness identification. This paper investigates how potential jurors weight the probative value of the different outcomes from both of these types of lineups. Participants (n=340) were given a description of a case that included some exonerating and some incriminating evidence. There was either a simultaneous or a sequential lineup. Depending on the condition, an eyewitness chose the suspect, chose a filler, or made no identification. The participant had to judge the guilt of the suspect and decide whether to render a guilty verdict. For both simultaneous and sequential lineups an identification had a large effect, increasing the probability of a guilty verdict. There were no reliable effects detected between making no identification and identifying a filler. The effect sizes were similar for simultaneous and sequential lineups. These findings are important for judges and other legal professionals to know for trials involving lineup identifications.

  5. Risk of developing palatally displaced canines in patients with early detectable dental anomalies: a retrospective cohort study

    PubMed Central

    GARIB, Daniela Gamba; LANCIA, Melissa; KATO, Renata Mayumi; OLIVEIRA, Thais Marchini; NEVES, Lucimara Teixeira das

    2016-01-01

    The early recognition of risk factors for the occurrence of palatally displaced canines (PDC) can increase the possibility of impaction prevention. Objective: To estimate the risk of PDC occurrence in children with dental anomalies identified early during mixed dentition. Material and Methods: The sample comprised 730 longitudinal orthodontic records from children (448 females and 282 males) with an initial mean age of 8.3 years (SD=1.36). The dental anomaly group (DA) included 263 records of patients with at least one dental anomaly identified in the initial or middle mixed dentition. The non-dental anomaly group (NDA) was composed of 467 records of patients with no dental anomalies. The occurrence of PDC in both groups was diagnosed using panoramic and periapical radiographs taken in the late mixed dentition or early permanent dentition. The prevalence of PDC in patients with and without early diagnosed dental anomalies was compared using the chi-square test (p<0.01), relative risk assessments (RR), and positive and negative predictive values (PPV and NPV). Results: PDC frequency was 16.35% and 6.2% in the DA and NDA groups, respectively. A statistically significant difference was observed between groups (p<0.01), with greater risk of PDC development in the DA group (RR=2.63). The PPV and NPV were 16% and 93%, respectively. Small maxillary lateral incisors, deciduous molar infraocclusion, and mandibular second premolar distoangulation were associated with PDC. Conclusion: Children with dental anomalies diagnosed during early mixed dentition have an approximately two and a half fold increased risk of developing PDC during late mixed dentition compared with children without dental anomalies. PMID:28076458

  6. Using a combination of MLPA kits to detect chromosomal imbalances in patients with multiple congenital anomalies and mental retardation is a valuable choice for developing countries.

    PubMed

    Jehee, Fernanda Sarquis; Takamori, Jean Tetsuo; Medeiros, Paula F Vasconcelos; Pordeus, Ana Carolina B; Latini, Flavia Roche M; Bertola, Débora Romeo; Kim, Chong Ae; Passos-Bueno, Maria Rita

    2011-01-01

    Conventional karyotyping detects anomalies in 3-15% of patients with multiple congenital anomalies and mental retardation (MCA/MR). Whole-genome array screening (WGAS) has been consistently suggested as the first-choice diagnostic test for this group of patients, but it is very costly for large-scale use in developing countries. We evaluated the use of a combination of Multiplex Ligation-dependent Probe Amplification (MLPA) kits to increase the detection rate of chromosomal abnormalities in MCA/MR patients. We screened 261 MCA/MR patients with two subtelomeric kits and one microdeletion kit, which should theoretically detect up to 70% of all submicroscopic abnormalities. Additionally, we calculated the de Vries score for 209 patients in an effort to find a suitable cut-off for MLPA screening. Our results reveal that chromosomal abnormalities were present in 87 (33.3%) patients, but only 57 (21.8%) were considered causative. Karyotyping detected 15 abnormalities (6.9%), while MLPA identified 54 (20.7%). Our combined MLPA screening thus more than tripled the number of pathogenic imbalances detected relative to conventional karyotyping. We also show that using the de Vries score as a cut-off for this screening would only be suitable under financial restrictions. A decision analytic model was constructed with three possible strategies: karyotype, karyotype + MLPA and karyotype + WGAS. The karyotype + MLPA strategy detected anomalies in 19.8% of cases, which accounts for 76.45% of the expected yield for karyotype + WGAS. The Incremental Cost Effectiveness Ratio (ICER) of MLPA is three times lower than that of WGAS, which means that, for the same costs, we obtain three additional diagnoses with MLPA but only one with WGAS. We list all causative alterations found, including rare findings such as reciprocal duplications of regions deleted in Sotos and Williams-Beuren syndromes. We also describe imbalances that were considered polymorphisms or rare variants, such as the new SNP
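
    The ICER comparison works by dividing a strategy's extra cost by the extra diagnostic yield it buys over the reference. The costs below are invented placeholders chosen so that the abstract's "three times lower" relationship holds; only the yields (karyotype 6.9%, karyotype+MLPA 19.8%, karyotype+WGAS ≈ 25.9%, implied by 19.8/0.7645) come from the text.

    ```python
    def icer(cost_new, cost_ref, yield_new, yield_ref):
        """Incremental cost-effectiveness ratio: extra cost per additional
        percentage point of diagnostic yield over the reference strategy."""
        return (cost_new - cost_ref) / (yield_new - yield_ref)

    # Hypothetical add-on costs for a fixed cohort; karyotype is the reference.
    icer_mlpa = icer(15000.0, 0.0, 19.8, 6.9)   # MLPA added to karyotype
    icer_wgas = icer(66000.0, 0.0, 25.9, 6.9)   # WGAS added to karyotype
    ```

    With these placeholder costs the WGAS ICER comes out roughly three times the MLPA ICER, mirroring the trade-off the authors describe: each additional diagnosis costs about three times more via WGAS than via MLPA.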

  7. Analysis and interpretation of MAGSAT anomalies over north Africa

    NASA Technical Reports Server (NTRS)

    Phillips, R. J.

    1985-01-01

    Crustal anomaly detection with MAGSAT data is frustrated by inherent resolving power of the data and by contamination from external and core fields. Quality of the data might be tested by modeling specific tectonic features which produce anomalies that fall within proposed resolution and crustal amplitude capabilities of MAGSAT fields. To test this hypothesis, north African hotspots associated with Ahaggar, Tibesti and Darfur were modeled as magnetic induction anomalies. MAGSAT data were reduced by subtracting external and core fields to isolate scalar and vertical component crustal signals. Of the three volcanic areas, only the Ahaggar region had an associated anomaly of magnitude above error limits of the data. Hotspot hypothesis was tested for Ahaggar by seeing if predicted magnetic signal matched MAGSAT anomaly. Predicted model magnetic signal arising from surface topography of the uplift and the Curie isothermal surface was calculated at MAGSAT altitudes by Fourier transform technique modified to allow for variable magnetization. Curie isotherm surface was calculated using a method for temperature distribution in a moving plate above a fixed hotspot. Magnetic signal was calculated for a fixed plate as well as a number of plate velocities and directions.

  8. Turtle Carapace Anomalies: The Roles of Genetic Diversity and Environment

    PubMed Central

    Velo-Antón, Guillermo; Becker, C. Guilherme; Cordero-Rivera, Adolfo

    2011-01-01

    Background: Phenotypic anomalies are common in wild populations and multiple genetic, biotic and abiotic factors might contribute to their formation. Turtles are excellent models for the study of developmental instability because anomalies are easily detected in the form of malformations, additions, or reductions in the number of scutes or scales. Methodology/Principal Findings: In this study, we integrated field observations, manipulative experiments, and climatic and genetic approaches to investigate the origin of carapace scute anomalies across Iberian populations of the European pond turtle, Emys orbicularis. The proportion of anomalous individuals varied from 3% to 69% in local populations, with increasing frequency of anomalies in northern regions. We found no significant effect of climatic and soil moisture, or climatic temperature on the occurrence of anomalies. However, lower genetic diversity and inbreeding were good predictors of the prevalence of scute anomalies among populations. Both decreasing genetic diversity and increasing proportion of anomalous individuals in northern parts of the Iberian distribution may be linked to recolonization events from the Southern Pleistocene refugium. Conclusions/Significance: Overall, our results suggest that developmental instability in turtle carapace formation might be caused, at least in part, by genetic factors, although the influence of environmental factors affecting the developmental stability of turtle carapace cannot be ruled out. Further studies of the effects of environmental factors, pollutants and heritability of anomalies would be useful to better understand the complex origin of anomalies in natural populations. PMID:21533278

  9. The parallel-sequential field subtraction techniques for nonlinear ultrasonic imaging

    NASA Astrophysics Data System (ADS)

    Cheng, Jingwei; Potter, Jack N.; Drinkwater, Bruce W.

    2018-04-01

    Nonlinear imaging techniques have recently emerged which have the potential to detect cracks at a much earlier stage and are particularly sensitive to closed defects. This study utilizes two modes of focusing: parallel, in which the elements are fired together with a delay law, and sequential, in which elements are fired independently. In parallel focusing, a high intensity ultrasonic beam is formed in the specimen at the focal point, whereas in sequential focusing only low intensity signals from individual elements enter the sample and the full matrix of transmit-receive signals is recorded; under linear elastic assumptions, the parallel and sequential images are expected to be identical. Here we measure the difference between these images formed from the coherent component of the field and use this to characterize the nonlinearity of closed fatigue cracks. In particular, we monitor the reduction in amplitude at the fundamental frequency at each focal point and use this metric to form images of the spatial distribution of nonlinearity. The results suggest the subtracted image can suppress linear features (e.g., back wall or large scatterers) and allow damage to be detected at an early stage.
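
    The subtraction step reduces to a pixel-wise difference of the two coherent amplitude images: in a linear medium they coincide, so any residual localizes nonlinearity. A minimal sketch (the grids and amplitude values are invented for illustration):

    ```python
    def nonlinearity_image(parallel_img, sequential_img):
        """Pixel-wise (sequential - parallel) amplitude difference at the
        fundamental frequency; nonzero residuals mark nonlinear scatterers."""
        return [[s - p for s, p in zip(row_s, row_p)]
                for row_s, row_p in zip(sequential_img, parallel_img)]

    # The high-intensity parallel beam loses fundamental amplitude at one
    # focal point (energy diverted into harmonics by a closed crack); the
    # low-intensity sequential measurement does not.
    par = [[1.00, 1.00], [0.90, 1.00]]
    seq = [[1.00, 1.00], [1.00, 1.00]]
    residual_img = nonlinearity_image(par, seq)
    ```

    Linear features such as the back wall contribute equally to both images and cancel, which is why the subtracted image isolates the damage.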

  10. Dataset of anomalies and malicious acts in a cyber-physical subsystem.

    PubMed

    Laso, Pedro Merino; Brosset, David; Puentes, John

    2017-10-01

    This article presents a dataset produced to investigate how data and information quality estimations enable the detection of anomalies and malicious acts in cyber-physical systems. Data were acquired making use of a cyber-physical subsystem consisting of liquid containers for fuel or water, along with its automated control and data acquisition infrastructure. The described data consist of temporal series representing five operational scenarios - normal, anomalies, breakdown, sabotages, and cyber-attacks - corresponding to 15 different real situations. The dataset is publicly available in the .zip file published with the article, to investigate and compare faulty operation detection and characterization methods for cyber-physical systems.

  11. Continuity of the sequential product of sequential quantum effect algebras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, Qiang, E-mail: leiqiang@hit.edu.cn; Su, Xiaochao, E-mail: hitswh@163.com; Wu, Junde, E-mail: wjd@zju.edu.cn

    In order to study quantum measurement theory, the sequential product, defined by A∘B = A^{1/2}BA^{1/2} for any two quantum effects A and B, has been introduced. Physically motivated conditions require the sequential product to be continuous with respect to the strong operator topology. In this paper, we study the continuity of the sequential product A∘B = A^{1/2}BA^{1/2} with respect to other important topologies, such as the norm topology, weak operator topology, order topology, and interval topology.
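
    For diagonal effects the sequential product can be computed entry-wise, which makes the definition easy to check numerically (a pure-Python special case invented for illustration; a general effect A needs a genuine matrix square root):

    ```python
    import math

    def seq_product_diag(a, B):
        """A o B = A^(1/2) B A^(1/2) for A = diag(a): the (i, j) entry is
        sqrt(a_i) * B[i][j] * sqrt(a_j)."""
        s = [math.sqrt(x) for x in a]
        n = len(a)
        return [[s[i] * B[i][j] * s[j] for j in range(n)] for i in range(n)]

    def eig2(M):
        """Eigenvalues of a symmetric 2x2 matrix [[p, q], [q, r]]."""
        p, q, r = M[0][0], M[0][1], M[1][1]
        d = math.sqrt((p - r) ** 2 + 4.0 * q * q)
        return (p + r - d) / 2.0, (p + r + d) / 2.0

    # Two effects (symmetric, eigenvalues in [0, 1]); their sequential
    # product should again be an effect.
    A_diag = [0.5, 0.8]
    B = [[0.6, 0.2], [0.2, 0.4]]
    C = seq_product_diag(A_diag, B)
    ```

    Checking that the eigenvalues of C stay in [0, 1] confirms the defining closure property of the sequential product on this example.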

  12. Estimation of anomaly location and size using electrical impedance tomography.

    PubMed

    Kwon, Ohin; Yoon, Jeong Rock; Seo, Jin Keun; Woo, Eung Je; Cho, Young Gu

    2003-01-01

    We developed a new algorithm that estimates the locations and sizes of anomalies in an electrically conducting medium based on the electrical impedance tomography (EIT) technique. When only boundary current and voltage measurements are available, it is not practically feasible to reconstruct accurate high-resolution cross-sectional conductivity or resistivity images of a subject. In this paper, we therefore focus on the estimation of the locations and sizes of anomalies with conductivity values that differ from the background tissues. We demonstrated the performance of the algorithm with experimental results using a 32-channel EIT system and a saline phantom. With about 1.73% measurement error in boundary current-voltage data, we found that the minimal size (area) of a detectable anomaly is about 0.72% of the size (area) of the phantom. Potential applications include the monitoring of impedance-related physiological events and bubble detection in two-phase flow. Since this new algorithm requires neither a forward solver nor a time-consuming minimization process, it is fast enough for various real-time applications in medicine and nondestructive testing.

  13. Sequential lineup presentation promotes less-biased criterion setting but does not improve discriminability.

    PubMed

    Palmer, Matthew A; Brewer, Neil

    2012-06-01

    When compared with simultaneous lineup presentation, sequential presentation has been shown to reduce false identifications to a greater extent than it reduces correct identifications. However, there has been much debate about whether this difference in identification performance represents improved discriminability or more conservative responding. In this research, data from 22 experiments that compared sequential and simultaneous lineups were analyzed using a compound signal-detection model, which is specifically designed to describe decision-making performance on tasks such as eyewitness identification tests. Sequential (cf. simultaneous) presentation did not influence discriminability, but produced a conservative shift in response bias that resulted in less-biased choosing for sequential than simultaneous lineups. These results inform understanding of the effects of lineup presentation mode on eyewitness identification decisions.
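
    The discriminability-versus-bias distinction can be illustrated with the basic equal-variance signal-detection model (a simpler relative of the compound model used in the paper; the hit and false-alarm rates below are invented to mirror the reported pattern):

    ```python
    from statistics import NormalDist

    def sdt_summary(hit_rate, fa_rate):
        """Equal-variance SDT: d' measures discriminability, c measures
        criterion placement (c > 0 means conservative responding)."""
        z = NormalDist().inv_cdf
        d_prime = z(hit_rate) - z(fa_rate)
        criterion = -0.5 * (z(hit_rate) + z(fa_rate))
        return d_prime, criterion

    d_sim, c_sim = sdt_summary(0.8, 0.3)   # simultaneous: more choosing overall
    d_seq, c_seq = sdt_summary(0.7, 0.2)   # sequential: fewer picks overall
    ```

    These rates give identical d' but a higher (more conservative) criterion c for the sequential lineup, exactly the signature reported above: presentation mode shifts response bias without changing discriminability.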

  14. Dental anomalies associated with cleft lip and palate in Northern Finland.

    PubMed

    Lehtonen, V; Anttonen, V; Ylikontiola, L P; Koskinen, S; Pesonen, P; Sándor, G K

    2015-12-01

    Despite the reported occurrence of dental anomalies in cleft lip and palate, little is known about their prevalence in children from Northern Finland with cleft lip and palate. The aim was to investigate the prevalence of dental anomalies among patients with different types of clefts in Northern Finland. Design and Statistics: patient records of 139 subjects aged three years and older, with clefts treated at Oulu University Hospital, Finland, during the period 1996-2010 (total n = 183), were analysed for dental anomalies, including the number of teeth, morphological and developmental anomalies, and their association with the cleft type. The analyses were carried out using the Chi-square test and Fisher's exact test. Differences between the groups were considered statistically significant at p values < 0.05. More than half of the patients had clefts of the hard palate, 18% of the lip and palate, and 13% of the lip. At least one dental anomaly was detected in 47% of the study population. Almost one in three (26.6%) subjects had one anomaly and 17.9% had two or three anomalies. The most common anomalies in permanent teeth were missing teeth, followed by supernumerary teeth. Supernumerary teeth were significantly more apparent when the lip was involved in the cleft compared with palatal clefts. Missing teeth were less prevalent among those 5 years or younger. The prevalence of different anomalies was significantly associated with the cleft type in both age groups. Dental anomalies are more prevalent among children with clefts than in the general population in Finland. The most prevalent anomalies associated with clefts were missing and supernumerary teeth.

  15. Dental Anomalies in Permanent Teeth after Trauma in Primary Dentition.

    PubMed

    Bardellini, Elena; Amadori, Francesca; Pasini, Stefania; Majorana, Alessandra

    This retrospective study aims to evaluate the prevalence of dental anomalies in permanent teeth resulting from trauma to the predecessor primary teeth. A total of 241 records of children (118 males and 123 females, mean age 3.62 ± 1.40 years) affected by trauma to primary teeth were analyzed. All patients were recalled to evaluate the status of the permanent successor teeth by clinical and radiographic investigations. Of the 241 patients, 106 (with a total of 179 traumatized primary teeth) presented at the recall. Dental anomalies of the successor permanent teeth were detected in 21 patients (19.8%), for a total of 26 teeth (14.5%) and 28 anomalies. Anomalies of the eruptive process were the most commonly observed disturbances (60.7%), followed by enamel hypoplasia (25%) and white spots (14.3%). A higher percentage of anomalies in permanent teeth was observed when trauma occurred at an age of less than 36 months (38.5% of cases). Intrusive and extrusive luxation were associated with most cases of clinical disturbances in the successor permanent teeth. The results of this study highlight the risk of dental anomalies after trauma in the primary dentition, especially in young children and in cases of intrusive luxation.

  16. Branchial anomalies in children.

    PubMed

    Bajaj, Y; Ifeacho, S; Tweedie, D; Jephson, C G; Albert, D M; Cochrane, L A; Wyatt, M E; Jonas, N; Hartley, B E J

    2011-08-01

    Branchial cleft anomalies are the second most common congenital head and neck lesions seen in children. Amongst the branchial cleft malformations, second cleft lesions account for 95% of branchial anomalies. This article analyzes all the cases of branchial cleft anomalies operated on at Great Ormond Street Hospital over the past 10 years. All children who underwent surgery for branchial cleft sinus or fistula from January 2000 to December 2010 were included in this study. In this series, we had 80 patients (38 female and 42 male). The age at the time of operation varied from 1 year to 14 years. Amongst this group, 15 patients had a first branchial cleft anomaly, 62 had a second branchial cleft anomaly and 3 had a fourth branchial pouch anomaly. All the first cleft cases were operated on by a superficial parotidectomy approach with facial nerve identification. Complete excision was achieved in all these first cleft cases. In this series of first cleft anomalies, we had one complication (temporary marginal mandibular nerve weakness). In the 62 children with second branchial cleft anomalies, 50 were unilateral and 12 were bilateral. In the vast majority, the tract extended through the carotid bifurcation and up to the pharyngeal constrictor muscles. The majority of these cases were operated on through an elliptical incision around the external opening. Complete excision was achieved in all second cleft cases except one, which required a repeat excision. In this subgroup, we had two complications: one patient developed a seroma and one had an incomplete excision. The three patients with a fourth pouch anomaly were treated with endoscopically assisted monopolar diathermy to the sinus opening, with good outcome. Branchial anomalies are relatively common in children. There are three distinct types: first cleft, second cleft and fourth pouch anomaly. Correct diagnosis is essential to avoid inadequate surgery and multiple procedures. The surgical approach needs to be tailored to the type

  17. MUSIC algorithm for location searching of dielectric anomalies from S-parameters using microwave imaging

    NASA Astrophysics Data System (ADS)

    Park, Won-Kwang; Kim, Hwa Pyung; Lee, Kwang-Jae; Son, Seong-Ho

    2017-11-01

    Motivated by the biomedical engineering used in early-stage breast cancer detection, we investigated the use of the MUltiple SIgnal Classification (MUSIC) algorithm for the location search of small anomalies using S-parameters. We considered the application of MUSIC to functional imaging where a small number of dipole antennas are used. Our approach is based on the application of the Born approximation or physical factorization. We analyzed cases in which the anomaly is small or large relative to the wavelength, and examined how the structure of the left-singular vectors is linked to the nonzero singular values of the Multi-Static Response (MSR) matrix, whose elements are the S-parameters. Using simulations, we demonstrated the strengths and weaknesses of the MUSIC algorithm in detecting both small and extended anomalies.
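    The MUSIC location search described above can be sketched as follows. The antenna layout, wavenumber, and search grid below are illustrative assumptions, not the authors' experimental configuration; the MSR matrix is taken as given.

```python
import numpy as np

def music_spectrum(msr, antennas, grid, k, n_signal):
    """Evaluate the MUSIC imaging function at each candidate point.

    msr: Multi-Static Response matrix (e.g. of S-parameters).
    antennas: (N, 3) antenna positions; grid: (M, 3) candidate points.
    """
    u, _, _ = np.linalg.svd(msr)
    noise = u[:, n_signal:]                  # noise subspace (small singular values)
    spectrum = np.empty(len(grid))
    for i, r in enumerate(grid):
        d = np.linalg.norm(antennas - r, axis=1)
        g = np.exp(1j * k * d) / d           # free-space steering vector (assumed model)
        g /= np.linalg.norm(g)
        # The spectrum peaks where g is (nearly) orthogonal to the noise subspace.
        spectrum[i] = 1.0 / (np.linalg.norm(noise.conj().T @ g) + 1e-12)
    return spectrum
```

    On a rank-one MSR built from a single point scatterer, the spectrum blows up at the true location and stays bounded elsewhere, which is the detection mechanism the abstract analyzes through the left-singular vectors.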

  18. Posterior fossa anomalies diagnosed with fetal MRI: associated anomalies and neurodevelopmental outcomes.

    PubMed

    Patek, Kyla J; Kline-Fath, Beth M; Hopkin, Robert J; Pilipenko, Valentina V; Crombleholme, Timothy M; Spaeth, Christine G

    2012-01-01

    The purpose of this study was to describe the relationship between intracranial and extracranial anomalies and neurodevelopmental outcome for fetuses diagnosed with a posterior fossa anomaly (PFA) on fetal MRI. Cases of Dandy-Walker malformation, vermian hypogenesis/hypoplasia, and mega cisterna magna (MCM) were identified through the Fetal Care Center of Cincinnati between January 2004 and December 2010. Parental interview and retrospective chart review were used to assess neurodevelopmental outcome. Posterior fossa anomalies were identified in 59 fetuses; 9 with Dandy-Walker malformation, 36 with vermian hypogenesis/hypoplasia, and 14 with MCM. Cases with isolated PFAs (14/59) had better outcomes than those with additional anomalies (p = 0.00016), with isolated cases of MCM all being neurodevelopmentally normal. Cases with additional intracranial anomalies had a worse outcome than those without intracranial anomalies (p = 0.00017). The presence of extracranial anomalies increased the likelihood of having a poor outcome (p = 0.00014) as did the identification of an abnormal brainstem (p = 0.00018). Intracranial and extracranial anomalies were good predictors of neurodevelopmental outcome in this study. The prognosis was poor for individuals with an abnormal brainstem, whereas those with isolated MCM had normal neurodevelopmental outcome. © 2012 John Wiley & Sons, Ltd.

  19. Fully automated analytical procedure for propofol determination by sequential injection technique with spectrophotometric and fluorimetric detections.

    PubMed

    Šrámková, Ivana; Amorim, Célia G; Sklenářová, Hana; Montenegro, Maria C B M; Horstkotte, Burkhard; Araújo, Alberto N; Solich, Petr

    2014-01-01

    In this work, an application of an enzymatic reaction for the determination of the highly hydrophobic drug propofol in emulsion dosage form is presented. Emulsions represent a complex and therefore challenging matrix for analysis. Ethanol was used for breakage of the lipid emulsion, which enabled optical detection. A fully automated method based on Sequential Injection Analysis was developed, allowing propofol determination without the requirement of tedious sample pre-treatment. The method was based on spectrophotometric detection after the enzymatic oxidation catalysed by horseradish peroxidase and subsequent coupling with 4-aminoantipyrine, leading to a coloured product with an absorbance maximum at 485 nm. This procedure was compared with a simple fluorimetric method, which was based on the direct selective fluorescence emission of propofol in ethanol at 347 nm. Both methods provide comparable validation parameters, with linear working ranges of 0.005-0.100 mg mL(-1) and 0.004-0.243 mg mL(-1) for the spectrophotometric and fluorimetric methods, respectively. The detection and quantitation limits achieved with the spectrophotometric method were 0.0016 and 0.0053 mg mL(-1), respectively. The fluorimetric method provided a detection limit of 0.0013 mg mL(-1) and a limit of quantitation of 0.0043 mg mL(-1). The RSD did not exceed 5% and 2% (n=10), respectively. A sample throughput of approx. 14 h(-1) for the spectrophotometric and 68 h(-1) for the fluorimetric detection was achieved. Both methods proved to be suitable for the determination of propofol in pharmaceutical formulation, with average recovery values of 98.1 and 98.5%. © 2013 Elsevier B.V. All rights reserved.

  20. Cervical vertebral anomalies in patients with anomalies of the head and neck.

    PubMed

    Manaligod, J M; Bauman, N M; Menezes, A H; Smith, R J

    1999-10-01

    Congenital head and neck anomalies can occur in association with vertebral anomalies, particularly of the cervical vertebrae. While the former are easily recognized, especially when part of a syndrome, the latter are often occult, thereby delaying their diagnosis. The presence of vertebral anomalies must be considered in pediatric patients with head and neck abnormalities to expedite management of select cases and to prevent neurologic injury. We present our experience with 5 pediatric patients who were referred to the Department of Otolaryngology-Head and Neck Surgery at the University of Iowa with a variety of syndromic anomalies of the head and neck. Each patient was subsequently also found to have a vertebral anomaly. The relevant embryogenesis of the anomalous structures is discussed, with highlighting of potential causes such as teratogenic agents and events and germ-line mutations. A review of syndromes having both head and neck and vertebral anomalies is presented to heighten awareness of otolaryngologists evaluating children with syndromic disorders. Finally, the findings on radiographic imaging studies, particularly computed tomography, are discussed to facilitate the prompt diagnosis of vertebral anomalies.

  1. Analysis of spacecraft anomalies

    NASA Technical Reports Server (NTRS)

    Bloomquist, C. E.; Graham, W. C.

    1976-01-01

    The anomalies from 316 spacecraft covering the entire U.S. space program were analyzed to determine if there were any experimental or technological programs which could be implemented to remove the anomalies from future space activity. Thirty specific categories of anomalies were found to cover nearly 85 percent of all observed anomalies. Thirteen experiments were defined to deal with 17 of these categories; nine additional experiments were identified to deal with other classes of observed and anticipated anomalies. Preliminary analyses indicate that all 22 experimental programs are both technically feasible and economically viable.

  2. BEARS: a multi-mission anomaly response system

    NASA Astrophysics Data System (ADS)

    Roberts, Bryce A.

    2009-05-01

    The Mission Operations Group at UC Berkeley's Space Sciences Laboratory operates a highly automated ground station and presently a fleet of seven satellites, each with its own associated command and control console. However, the requirement for prompt anomaly detection and resolution is shared by the ground segment and all spacecraft. The efficient, low-cost operation and "lights-out" staffing of the Mission Operations Group require that controllers and engineers be notified of spacecraft and ground system problems around the clock. The Berkeley Emergency Anomaly and Response System (BEARS) is an in-house developed web- and paging-based software system that meets this need. BEARS was developed as a replacement for an existing emergency reporting software system that was too closed-source, platform-specific, expensive, and antiquated to expand or maintain. To avoid these limitations, the new system design leverages cross-platform, open-source software products such as MySQL, PHP, and Qt. Anomaly notifications and responses make use of the two-way paging capabilities of modern smart phones.

  3. Prevalence and distribution of selected dental anomalies among saudi children in Abha, Saudi Arabia.

    PubMed

    Yassin, Syed M

    2016-12-01

    Dental anomalies are not an unusual finding in routine dental examination. The effects of dental anomalies can lead to functional, esthetic and occlusal problems. The purpose of the study was to determine the prevalence and distribution of selected developmental dental anomalies in Saudi children. The study was based on clinical examination and panoramic radiographs of children who visited the pediatric dentistry clinics at King Khalid University College of Dentistry, Saudi Arabia. These patients were examined for dental anomalies in size, shape, number, structure and position. Data collected were entered and analyzed using the Statistical Package for the Social Sciences (SPSS). Of the 1252 children (638 boys, 614 girls) examined, 318 subjects (25.39%) presented with selected dental anomalies. The distribution by gender was 175 boys (27.42%) and 143 girls (23.28%). On intergroup comparison, number anomalies were the most common, with hypodontia (9.7%) being the most common anomaly in Saudi children, followed by hyperdontia (3.5%). The prevalence of size anomalies was microdontia (2.6%) and macrodontia (1.8%). The prevalence of shape anomalies was talon cusp (1.4%), taurodontism (1.4%) and fusion (0.8%). The prevalence of positional anomalies was ectopic eruption (2.3%) and rotation (0.4%). The prevalence of structural anomalies was amelogenesis imperfecta (0.3%) and dentinogenesis imperfecta (0.1%). A significant number of children had a dental anomaly, with hypodontia being the most common and dentinogenesis imperfecta the rarest anomaly in the study. Early detection and management of these anomalies can avoid potential orthodontic and esthetic problems in a child. Key words: dental anomalies, children, Saudi Arabia.

  4. Detection of sinkholes or anomalies using full seismic wave fields : phase II.

    DOT National Transportation Integrated Search

    2016-08-01

    A new 2-D Full Waveform Inversion (FWI) software code was developed to characterize layering and anomalies beneath the ground surface using seismic testing. The software is capable of assessing the shear and compression wave velocities (Vs and Vp) fo...

  5. Edge detection of magnetic anomalies using analytic signal of tilt angle (ASTA)

    NASA Astrophysics Data System (ADS)

    Alamdar, K.; Ansari, A. H.; Ghorbani, A.

    2009-04-01

    Magnetics is a commonly used geophysical technique to identify and image potential subsurface targets. Interpretation of magnetic anomalies is a complex process due to the superposition of multiple magnetic sources, the presence of geologic and cultural noise, and acquisition and positioning errors. Both the vertical and horizontal derivatives of potential field data are useful: the horizontal derivative enhances edges, whereas the vertical derivative narrows the width of an anomaly and so locates source bodies more accurately. The vertical and horizontal derivatives of the magnetic field can be combined into the analytic signal, which is independent of the body's magnetization direction and whose maximum value lies directly over the edges of the body. The tilt angle filter is a phase-based filter, defined as the angle between the vertical derivative and the total horizontal derivative. The tilt angle ranges from +90 degrees to -90 degrees, and its zero value lies over the body edge. One disadvantage of this filter is that, for deep sources, the detected edge is blurred. To overcome this problem, many authors have introduced new filters, such as the total horizontal derivative of the tilt angle or the vertical derivative of the tilt angle; because these filters use high-order derivatives, their results may be too noisy. Combining the analytic signal and the tilt angle produces a new filter, termed ASTA, whose maximum value lies directly over the body edge; it delineates the body edge more easily than the tilt angle and without its complications. In this work the new filter has been demonstrated on magnetic data from an area in the Sar-Cheshme region of Iran. This area is located at 55 degrees longitude and 32 degrees latitude and is a copper-potential region. The main formations in this area are andesite and trachyandesite. Magnetic surveying was employed to separate the boundaries of the andesite and trachyandesite from the adjacent area. 
    In this regard a variety of filters such as analytic signal, tilt angle and ASTA filter have been applied which
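    The tilt angle and a simplified ASTA-style combination can be sketched as follows. The three field derivatives are assumed to be precomputed grids (e.g. from FFT-based differentiation), and the analytic signal of the tilt is approximated using only horizontal gradients of the tilt, a simplification of the filter described in the abstract.

```python
import numpy as np

def tilt_angle(dtdx, dtdy, dtdz):
    # Tilt = arctan(vertical derivative / total horizontal derivative);
    # it is bounded by +/- pi/2 and crosses zero over source edges.
    return np.arctan2(dtdz, np.hypot(dtdx, dtdy))

def asta_like(dtdx, dtdy, dtdz, dx=1.0, dy=1.0):
    # Simplified ASTA: amplitude of the horizontal gradient of the tilt
    # angle. The full filter also uses the vertical derivative of the
    # tilt, omitted here for simplicity (an assumption of this sketch).
    tilt = tilt_angle(dtdx, dtdy, dtdz)
    gy, gx = np.gradient(tilt, dy, dx)
    return np.hypot(gx, gy)
```

    Because the tilt angle is bounded, its gradient-based amplitude does not grow with source depth the way raw derivatives do, which is why phase-based filters equalize shallow and deep edges.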

  6. Competing Orders and Anomalies

    PubMed Central

    Moon, Eun-Gook

    2016-01-01

    A conservation law is one of the most fundamental properties in nature, but a certain class of conservation “laws” could be spoiled by intrinsic quantum mechanical effects, so-called quantum anomalies. Profound properties of the anomalies have deepened our understanding in quantum many body systems. Here, we investigate quantum anomaly effects in quantum phase transitions between competing orders and striking consequences of their presence. We explicitly calculate topological nature of anomalies of non-linear sigma models (NLSMs) with the Wess-Zumino-Witten (WZW) terms. The non-perturbative nature is directly related with the ’t Hooft anomaly matching condition: anomalies are conserved in renormalization group flow. By applying the matching condition, we show massless excitations are enforced by the anomalies in a whole phase diagram in sharp contrast to the case of the Landau-Ginzburg-Wilson theory which only has massive excitations in symmetric phases. Furthermore, we find non-perturbative criteria to characterize quantum phase transitions between competing orders. For example, in 4D, we show the two competing order parameter theories, CP(1) and the NLSM with WZW, describe different universality class. Physical realizations and experimental implication of the anomalies are also discussed. PMID:27499184

  7. Spacecraft Environmental Anomalies Handbook

    DTIC Science & Technology

    1989-08-01

    Spacecraft Environmental Anomalies Handbook. Author: Paul A. Robinson, Jr. Among the causes of spacecraft anomalies are surface... have been discovered after years of investigation, and engineering solutions for mitigating the effects of environmental anomalies have been developed.

  8. NTilt as an improved enhanced tilt derivative filter for edge detection of potential field anomalies

    NASA Astrophysics Data System (ADS)

    Nasuti, Yasin; Nasuti, Aziz

    2018-07-01

    We develop a new phase-based filter, called NTilt, to enhance the edges of geological sources from potential-field data; it introduces vertical derivatives of the analytic signal of different orders into the tilt derivative equation, which equalizes signals from sources buried at different depths. To evaluate the designed filter, we compared its results with those of recently applied methods, testing against both synthetic data and measured data from the Finnmark region of northern Norway. The results demonstrate that the new filter permits better definition of the edges of causative anomalies and better highlights several anomalies that are either not shown by the tilt derivative and other methods or not well defined. The proposed technique also improves the delineation of the actual edges of deep-seated anomalies compared with the tilt derivative and other methods. The NTilt filter provides more accurate and sharper edges, makes nearby anomalies more distinguishable, and avoids introducing false edges, reducing the ambiguity of potential field interpretations. This filter thus appears promising for providing a better qualitative interpretation of gravity and magnetic data in comparison with the more commonly used filters.

  9. ORP and pH measurements to detect redox and acid-base anomalies from hydrothermal activity

    NASA Astrophysics Data System (ADS)

    Santana-Casiano, J. M.; González-Dávila, M.; Fraile-Nuez, E.

    2017-12-01

    The Tagoro submarine volcano is located 1.8 km south of the island of El Hierro at 350 m depth and rises to 88 m below sea level. It erupted molten material for five months, from October 2011 to March 2012, drastically changing the physical-chemical properties of the water column in the area. After this eruption, the system evolved into a hydrothermal system. The reduced and acidic character of the hydrothermal emissions at the Tagoro submarine volcano allowed us to detect anomalies related to changes in the chemical potential and the proton concentration using ORP and pH sensors, respectively. Tow-yos using a CTD-rosette with these two sensors provided the locations of the emissions by plotting δ(ORP)/δt and ΔpH versus latitude or longitude. The ORP sensor responds very quickly to the presence of reduced chemicals in the water column. Changes in potential are proportional to the amount of reduced chemical species present in the water; the magnitude of these changes is examined via the time derivative of ORP, δ(ORP)/δt. To detect changes in pH, the mean pH for each depth at a reference station in an area not affected by the vent emissions is subtracted from each point measured near the volcanic edifice, defining ΔpH. Detailed surveys of the volcanic edifice were carried out between 2014 and 2016 using several CTD-pH-ORP tow-yo studies, localizing the ORP and pH changes, which were used to obtain surface maps of anomalies. Moreover, meridional tow-yos were used to calculate the amount of volcanic CO2 added to the water column. The inputs of CO2 along multiple sections, combined with measurements of oceanic currents, produced an estimated volcanic CO2 flux of 6.0 × 10⁵ ± 1.1 × 10⁵ kg d⁻¹, which increases the acidity above the volcano by 20%. Sites like the Tagoro submarine volcano, in its degasification stage, provide an excellent opportunity to study the carbonate system in a high-CO2 world, the volcanic contribution to the global
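    The two anomaly indicators used above, the time derivative of ORP and ΔpH against a reference profile, can be sketched as below; the variable names and the reference-profile format are assumptions of this sketch.

```python
import numpy as np

def orp_rate(orp, t):
    # d(ORP)/dt along the tow-yo track; sharp negative excursions flag
    # freshly emitted reduced species.
    return np.gradient(orp, t)

def delta_ph(ph, depth, ref_depths, ref_mean_ph):
    # Subtract the reference-station mean pH at the matching depth
    # (linear interpolation between reference levels) to define delta-pH.
    return ph - np.interp(depth, ref_depths, ref_mean_ph)
```

    Plotting these two quantities against latitude or longitude, as in the tow-yo surveys, localizes the vent emissions.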

  10. How much does the MSW effect contribute to the reactor antineutrino anomaly?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdiviesso, G. A.

    2015-05-15

    It has been pointed out that there is a 5.7 ± 2.3% discrepancy between the predicted and the observed reactor antineutrino flux in very-short-baseline experiments. Several causes for this anomaly have been discussed, including a possible non-standard fourth, sterile neutrino. In order to quantify how non-standard this anomaly really is, the standard MSW effect is reviewed. Since reactor antineutrinos are produced in a dense medium (the nuclear fuel) and are usually detected in a less dense one (water or scintillator), non-adiabatic effects are expected to occur, creating a difference between the creation and detection mixing angles.

  11. Sequential Multiplex Analyte Capturing for Phosphoprotein Profiling*

    PubMed Central

    Poetz, Oliver; Henzler, Tanja; Hartmann, Michael; Kazmaier, Cornelia; Templin, Markus F.; Herget, Thomas; Joos, Thomas O.

    2010-01-01

    Microarray-based sandwich immunoassays can simultaneously detect dozens of proteins. However, their use in quantifying large numbers of proteins is hampered by cross-reactivity and incompatibilities caused by the immunoassays themselves. Sequential multiplex analyte capturing addresses these problems by repeatedly probing the same sample with different sets of antibody-coated, magnetic suspension bead arrays. As a miniaturized immunoassay format, suspension bead array-based assays fulfill the criteria of the ambient analyte theory, and our experiments reveal that the analyte concentrations are not significantly changed. The value of sequential multiplex analyte capturing was demonstrated by probing tumor cell line lysates for the abundance of seven different receptor tyrosine kinases and their degree of phosphorylation and by measuring the complex phosphorylation pattern of the epidermal growth factor receptor in the same sample from the same cavity. PMID:20682761

  12. Toward Continuous GPS Carrier-Phase Time Transfer: Eliminating the Time Discontinuity at an Anomaly

    PubMed Central

    Yao, Jian; Levine, Judah; Weiss, Marc

    2015-01-01

    The wide application of Global Positioning System (GPS) carrier-phase (CP) time transfer is limited by the problem of boundary discontinuity (BD). The discontinuity has two categories. One is “day boundary discontinuity,” which has been studied extensively and can be solved by multiple methods [1–8]. The other category, called “anomaly boundary discontinuity (anomaly-BD),” comes from a GPS data anomaly. The anomaly can be a data gap (i.e., missing data), a GPS measurement error (i.e., bad data), or a cycle slip. An initial study of the anomaly-BD showed that we can fix the discontinuity if the anomaly lasts no more than 20 min, using a polynomial curve-fitting strategy to repair the anomaly [9]. However, sometimes the data anomaly lasts longer than 20 min, so a better curve-fitting strategy is needed. In addition, a cycle slip, as another type of data anomaly, can occur and lead to an anomaly-BD. To solve these problems, this paper proposes a new strategy: satellite-clock-aided curve fitting with cycle slip detection. Basically, this new strategy applies the satellite clock correction to the GPS data. After that, we do the polynomial curve fitting for the code and phase data, as before. Our study shows that the phase-data residual is only ~3 mm for all GPS satellites. The new strategy also detects and counts cycle slips by searching for the minimum curve-fitting residual. Extensive examples show that this new strategy enables us to repair up to a 40-min GPS data anomaly, regardless of whether the anomaly is due to a data gap, a cycle slip, or a combination of the two. We also find that interference with the GPS signal, known as “jamming,” can possibly lead to a time-transfer error, and that this new strategy can compensate for jamming outages. Thus, the new strategy can eliminate the impact of jamming on time transfer. As a whole, we greatly improve the robustness of the GPS CP time transfer
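    The combined gap-repair and cycle-slip search can be sketched as follows. The polynomial degree, slip search range, and units are illustrative assumptions, and the satellite-clock correction step described in the abstract is omitted.

```python
import numpy as np

def repair_anomaly(t, phase, gap_mask, deg=2, slips=range(-5, 6), cycle=1.0):
    """Bridge a phase-data anomaly: hypothesize an integer cycle slip on the
    data after the gap, keep the slip whose polynomial fit over the good
    samples has the minimum residual, and fill the gap from that fit."""
    good = ~gap_mask
    after = t > t[gap_mask].max()            # samples following the anomaly
    best = None
    for n in slips:
        trial = phase.copy()
        trial[after] += n * cycle            # candidate slip correction
        coef = np.polyfit(t[good], trial[good], deg)
        resid = np.sqrt(np.mean((np.polyval(coef, t[good]) - trial[good]) ** 2))
        if best is None or resid < best[0]:
            best = (resid, n, coef)
    _, n, coef = best
    repaired = phase.copy()
    repaired[after] += n * cycle
    repaired[gap_mask] = np.polyval(coef, t[gap_mask])  # fill the gap
    return repaired, n
```

    Searching for the minimum fitting residual over candidate integer slips is the same selection principle the paper describes; a smooth phase history makes the wrong slip hypotheses stand out as large residuals.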

  13. Identification of inorganic improvised explosive devices using sequential injection capillary electrophoresis and contactless conductivity detection.

    PubMed

    Blanco, Gustavo A; Nai, Yi H; Hilder, Emily F; Shellie, Robert A; Dicinoski, Greg W; Haddad, Paul R; Breadmore, Michael C

    2011-12-01

    A simple sequential injection capillary electrophoresis (SI-CE) instrument with capacitively coupled contactless conductivity detection (C(4)D) has been developed for the rapid separation of anions relevant to the identification of inorganic improvised explosive devices (IEDs). Four of the most common explosive tracer ions, nitrate, perchlorate, chlorate, and azide, and the most common background ions, chloride, sulfate, thiocyanate, fluoride, phosphate, and carbonate, were chosen for investigation. Using a separation electrolyte comprising 50 mM tris(hydroxymethyl)aminomethane, 50 mM cyclohexyl-2-aminoethanesulfonic acid, pH 8.9 and 0.05% poly(ethyleneimine) (PEI) in a hexadimethrine bromide (HDMB)-coated capillary it was possible to partially separate all 10 ions within 90 s. The combination of two cationic polymer additives (PEI and HDMB) was necessary to achieve adequate selectivity with a sufficiently stable electroosmotic flow (EOF), which was not possible with only one polymer. Careful optimization of variables affecting the speed of separation and injection timing allowed a further reduction of separation time to 55 s while maintaining adequate efficiency and resolution. Software control makes high sample throughput possible (60 samples/h), with very high repeatability of migration times [0.63-2.07% relative standard deviation (RSD) for 240 injections]. The separation speed does not compromise sensitivity, with limits of detection ranging from 23 to 50 μg·L(-1) for all the explosive residues considered, which is 10× lower than those achieved by indirect absorbance detection and 2× lower than those achieved by C(4)D using portable benchtop instrumentation. The combination of automation, high sample throughput, high confidence of peak identification, and low limits of detection makes this methodology ideal for the rapid identification of inorganic IED residues.

  14. Operator based integration of information in multimodal radiological search mission with applications to anomaly detection

    NASA Astrophysics Data System (ADS)

    Benedetto, J.; Cloninger, A.; Czaja, W.; Doster, T.; Kochersberger, K.; Manning, B.; McCullough, T.; McLane, M.

    2014-05-01

    Successful performance of a radiological search mission depends on effective utilization of a mixture of signals. Examples of modalities include, e.g., EO imagery and gamma radiation data, or radiation data collected during multiple events. In addition, elevation data or spatial proximity can be used to enhance the performance of acquisition systems. State-of-the-art techniques for processing and exploiting complex information manifolds rely on diffusion operators. Our approach involves machine learning techniques based on the analysis of joint data-dependent graphs and their associated diffusion kernels. The significant eigenvectors of the derived fused graph Laplace and Schroedinger operators then form the new representation, which provides integrated features from the heterogeneous input data. The families of data-dependent Laplace and Schroedinger operators on joint data graphs are integrated by means of appropriately designed fusion metrics. These fused representations are used for target and anomaly detection.
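    A minimal sketch of the fusion idea, assuming a Gaussian-kernel graph per modality and summation of Laplacians as the fusion metric; the Schroedinger-operator variant and the specific fusion metrics of the paper are not reproduced here.

```python
import numpy as np

def graph_laplacian(x, sigma):
    # Unnormalized graph Laplacian L = D - W with a Gaussian kernel
    # on pairwise squared distances.
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / sigma ** 2)
    return np.diag(w.sum(1)) - w

def fused_embedding(modalities, sigmas, n_vec=2):
    # Sum the per-modality Laplacians (one simple fusion choice) and use
    # the smallest nontrivial eigenvectors as joint features.
    L = sum(graph_laplacian(x, s) for x, s in zip(modalities, sigmas))
    _, vecs = np.linalg.eigh(L)           # eigenvalues in ascending order
    return vecs[:, 1:1 + n_vec]           # skip the constant eigenvector
```

    The low-order eigenvectors of the fused operator encode cluster structure that is consistent across modalities, which is what makes them useful integrated features for anomaly detection.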

  15. Prevalence and distribution of selected dental anomalies among saudi children in Abha, Saudi Arabia

    PubMed Central

    2016-01-01

    Background Dental anomalies are not an unusual finding in routine dental examination. The effects of dental anomalies can lead to functional, esthetic and occlusal problems. The purpose of the study was to determine the prevalence and distribution of selected developmental dental anomalies in Saudi children. Material and Methods The study was based on clinical examination and panoramic radiographs of children who visited the pediatric dentistry clinics at King Khalid University College of Dentistry, Saudi Arabia. These patients were examined for dental anomalies in size, shape, number, structure and position. Data collected were entered and analyzed using the Statistical Package for the Social Sciences (SPSS). Results Of the 1252 children (638 boys, 614 girls) examined, 318 subjects (25.39%) presented with selected dental anomalies. The distribution by gender was 175 boys (27.42%) and 143 girls (23.28%). On intergroup comparison, number anomalies were the most common, with hypodontia (9.7%) being the most common anomaly in Saudi children, followed by hyperdontia (3.5%). The prevalence of size anomalies was microdontia (2.6%) and macrodontia (1.8%). The prevalence of shape anomalies was talon cusp (1.4%), taurodontism (1.4%) and fusion (0.8%). The prevalence of positional anomalies was ectopic eruption (2.3%) and rotation (0.4%). The prevalence of structural anomalies was amelogenesis imperfecta (0.3%) and dentinogenesis imperfecta (0.1%). Conclusions A significant number of children had a dental anomaly, with hypodontia being the most common and dentinogenesis imperfecta the rarest anomaly in the study. Early detection and management of these anomalies can avoid potential orthodontic and esthetic problems in a child. Key words: dental anomalies, children, Saudi Arabia. PMID:27957258

  16. Rate based failure detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Brett Emery Trabun; Gamage, Thoshitha Thanushka; Bakken, David Edward

    This disclosure describes, in part, a system management component and a failure detection component for use in a power grid data network to identify anomalies within the network and systematically adjust the quality of service of data published by publishers and subscribed to by subscribers within the network. In one implementation, subscribers may identify a desired data rate, a minimum acceptable data rate, a desired latency, a minimum acceptable latency and a priority for each subscription. The failure detection component may identify an anomaly within the network and a source of the anomaly. Based on the identified anomaly, data rates and/or data paths may be adjusted in real time to ensure that the power grid data network does not become overloaded and/or fail.
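    A toy sketch of the rate-adjustment idea, assuming a greedy policy that degrades lower-priority subscriptions toward their minimum acceptable rate first; the disclosure does not specify the actual policy, and the subscription names are hypothetical.

```python
def adjust_rates(subscriptions, capacity):
    """Reduce granted data rates until the total fits the available capacity.
    Each subscription dict carries the desired rate, the minimum acceptable
    rate, and a priority (higher number = more important)."""
    granted = {s["name"]: s["desired"] for s in subscriptions}
    total = sum(granted.values())
    # Degrade the least important subscriptions first, never below minimum.
    for s in sorted(subscriptions, key=lambda s: s["priority"]):
        if total <= capacity:
            break
        cut = min(s["desired"] - s["minimum"], total - capacity)
        granted[s["name"]] -= cut
        total -= cut
    return granted
```

    Keeping every subscription at or above its stated minimum is what distinguishes graceful degradation from dropping subscribers outright.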

  17. Sequential Testing: Basics and Benefits

    DTIC Science & Technology

    1978-03-01

    TARADCOM Technical Report No. 12325, Sequential Testing: Basics and Benefits. Contents: I. Introduction and Summary; II. Sequential Analysis; III. Mathematics of Sequential Testing; IV. ... The added benefit of reduced energy needs is inherent in this testing method. The text was originally released by the authors in 1972.
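    The mathematics of sequential testing centers on Wald's sequential probability ratio test (SPRT); a minimal sketch for Bernoulli pass/fail data follows, with the rates and error levels as illustrative assumptions.

```python
import math

def sprt(observations, p0, p1, alpha=0.05, beta=0.05):
    """Wald's SPRT for Bernoulli failures: stop early by accepting H0 (p = p0)
    or H1 (p = p1) as soon as the log-likelihood ratio crosses a boundary."""
    a = math.log(beta / (1 - alpha))        # lower boundary: accept H0
    b = math.log((1 - beta) / alpha)        # upper boundary: accept H1
    llr = 0.0
    for n, x in enumerate(observations, 1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr <= a:
            return "accept H0", n
        if llr >= b:
            return "accept H1", n
    return "continue", len(observations)
```

    Because the test stops as soon as the evidence is decisive, it typically needs far fewer samples than a fixed-size test at the same error rates, which is the basic benefit the report describes.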

  18. Pre-seismic anomalies in remotely sensed land surface temperature measurements: The case study of 2003 Boumerdes earthquake

    NASA Astrophysics Data System (ADS)

    Bellaoui, Mebrouk; Hassini, Abdelatif; Bouchouicha, Kada

    2017-05-01

    Detection of thermal anomalies prior to earthquake events has been widely confirmed by researchers over the past decade. One popular approach for anomaly detection is the Robust Satellite Technique (RST). In this paper, we apply this method to a collection of six years of MODIS satellite data, representing land surface temperature (LST) images, to predict the 21 May 2003 Boumerdes, Algeria earthquake. The thermal anomaly results were compared with the ambient temperature variation measured at three meteorological stations of the Algerian National Office of Meteorology (ONM): DELLYS-AFIR, TIZI-OUZOU, and DAR-EL-BEIDA. The results confirm the importance of RST as a highly effective approach for monitoring earthquakes.

  19. Value of Ultrasound in Detecting Urinary Tract Anomalies After First Febrile Urinary Tract Infection in Children.

    PubMed

    Ghobrial, Emad E; Abdelaziz, Doaa M; Sheba, Maha F; Abdel-Azeem, Yasser S

    2016-05-01

    Background Urinary tract infection (UTI) is an infection that affects part of the urinary tract. Ultrasound is a noninvasive test that can demonstrate the size and shape of the kidneys, dilatation of the ureters, and the existence of anatomic abnormalities. The aim of the study was to estimate the value of ultrasound in detecting urinary tract anomalies after a first attack of UTI. Methods This study was conducted at the Nephrology Clinic, New Children's Hospital, Faculty of Medicine, Cairo University, from August 2012 to March 2013, and included 30 children who presented with a first attack of acute febrile UTI. All patients were subjected to urine analysis, urine culture and sensitivity, serum creatinine, complete blood count, and imaging in the form of renal ultrasound, voiding cysto-urethrography, and renal scan. Results All the patients had fever (mean 38.96°C ± 0.44°C) and the mean duration of illness was 6.23 ± 5.64 days. Nineteen patients (63.3%) had an ultrasound abnormality; the commonest was kidney stones (15.8%). Only 2 patients with an abnormal ultrasound also had vesicoureteric reflux on cystourethrography. Sensitivity of ultrasound was 66.7%, specificity was 37.5%, positive predictive value was 21.1%, negative predictive value was 81.8%, and total accuracy was 43.33%. Conclusion We concluded that ultrasound alone was not of much value in diagnosing, or planning the management of, a first attack of febrile UTI. Combined investigations are recommended as the best way to confirm the diagnosis of urinary tract anomalies. © The Author(s) 2015.
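The reported accuracy figures can be reproduced from a standard 2×2 contingency table. The counts used below (TP = 4, FP = 15, FN = 2, TN = 9; N = 30) are a hypothetical reconstruction chosen to be consistent with the reported percentages, not data taken from the paper (whose reference standard may combine several investigations):

```python
# Standard diagnostic-accuracy metrics from a 2x2 contingency table.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts chosen to match the reported percentages (N = 30).
m = diagnostic_metrics(tp=4, fp=15, fn=2, tn=9)
print({k: round(v * 100, 1) for k, v in m.items()})
# → {'sensitivity': 66.7, 'specificity': 37.5, 'ppv': 21.1, 'npv': 81.8, 'accuracy': 43.3}
```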

  20. Surveying the South Pole-Aitken basin magnetic anomaly for remnant impactor metallic iron

    USGS Publications Warehouse

    Cahill, Joshua T.S.; Hagerty, Justin J.; Lawrence, David M.; Klima, Rachel L.; Blewett, David T.

    2014-01-01

    The Moon has areas of magnetized crust ("magnetic anomalies"), the origins of which are poorly constrained. A magnetic anomaly near the northern rim of South Pole-Aitken (SPA) basin was recently postulated to originate from remnant metallic iron emplaced by the SPA basin-forming impactor. Here, we remotely examine the regolith of this SPA magnetic anomaly with a combination of Clementine and Lunar Prospector derived iron maps for any evidence of enhanced metallic iron content. We find that these data sets do not definitively detect the hypothesized remnant metallic iron within the upper tens of centimeters of the lunar regolith.

  1. Lymphatic Anomalies Registry

    ClinicalTrials.gov

    2018-01-23

    Lymphatic Malformation; Generalized Lymphatic Anomaly (GLA); Central Conducting Lymphatic Anomaly; CLOVES Syndrome; Gorham-Stout Disease ("Disappearing Bone Disease"); Blue Rubber Bleb Nevus Syndrome; Kaposiform Lymphangiomatosis; Kaposiform Hemangioendothelioma/Tufted Angioma; Klippel-Trenaunay Syndrome; Lymphangiomatosis

  2. Analysis of genitourinary anomalies in patients with VACTERL (Vertebral anomalies, Anal atresia, Cardiac malformations, Tracheo-Esophageal fistula, Renal anomalies, Limb abnormalities) association.

    PubMed

    Solomon, Benjamin D; Raam, Manu S; Pineda-Alvarez, Daniel E

    2011-06-01

    The goal of this study was to describe a novel pattern of genitourinary (GU) anomalies in VACTERL association, which involves congenital anomalies affecting the vertebrae, anus, heart, trachea and esophagus, kidneys, and limbs. We collected clinical data on 105 patients diagnosed with VACTERL association and analyzed a subset of 89 patients who met more stringent inclusion criteria. Twenty-one percent of patients have GU anomalies, which are more severe (but not more frequent) in females. Anomalies were noted in patients without malformations affecting the renal, lower vertebral, or lower gastrointestinal systems. There should be a high index of suspicion for the presence of GU anomalies even in patients who do not have spatially similar malformations.

  3. Cross-linguistic variation in the neurophysiological response to semantic processing: Evidence from anomalies at the borderline of awareness

    PubMed Central

    Tune, Sarah; Schlesewsky, Matthias; Small, Steven L.; Sanford, Anthony J.; Bohan, Jason; Sassenhagen, Jona; Bornkessel-Schlesewsky, Ina

    2014-01-01

    The N400 event-related brain potential (ERP) has played a major role in the examination of how the human brain processes meaning. For current theories of the N400, classes of semantic inconsistencies which do not elicit N400 effects have proven particularly influential. Semantic anomalies that are difficult to detect are a case in point (“borderline anomalies”, e.g. “After an air crash, where should the survivors be buried?”), engendering a late positive ERP response but no N400 effect in English (Sanford, Leuthold, Bohan, & Sanford, 2011). In three auditory ERP experiments, we demonstrate that this result is subject to cross-linguistic variation. In a German version of Sanford and colleagues' experiment (Experiment 1), detected borderline anomalies elicited both N400 and late positivity effects compared to control stimuli or to missed borderline anomalies. Classic easy-to-detect semantic (non-borderline) anomalies showed the same pattern as in English (N400 plus late positivity). The cross-linguistic difference in the response to borderline anomalies was replicated in two additional studies with a slightly modified task (Experiment 2a: German; Experiment 2b: English), with a reliable LANGUAGE × ANOMALY interaction for the borderline anomalies confirming that the N400 effect is subject to systematic cross-linguistic variation. We argue that this variation results from differences in the language-specific default weighting of top-down and bottom-up information, concluding that N400 amplitude reflects the interaction between the two information sources in the form-to-meaning mapping. PMID:24447768

  4. Magnetic Resonance Imaging of Developmental Anomalies of the Uterus and the Vagina in Pediatric Patients.

    PubMed

    Gould, Sharon W; Epelman, Monica

    2015-08-01

    Developmental anomalies of the uterus and the vagina are associated with infertility and miscarriage and are most commonly detected in the postpubertal age-group. These conditions may also present in younger patients as a mass or pain owing to obstruction of the uterus or the vagina. Associated urinary tract anomalies are common, as well. Accurate diagnosis and thorough description of these anomalies are essential for appropriate management; however, evaluation may be difficult in an immature reproductive tract. Magnetic resonance imaging technique pertinent to imaging of the pediatric female reproductive tract is presented and illustrated along with the findings associated with various anomalies. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. In vivo detection of 13C isotopomer turnover in the human brain by sequential infusion of 13C labeled substrates

    NASA Astrophysics Data System (ADS)

    Li, Shizhe; Zhang, Yan; Ferraris Araneta, Maria; Xiang, Yun; Johnson, Christopher; Innis, Robert B.; Shen, Jun

    2012-05-01

    This study demonstrates the feasibility of simultaneously detecting human brain metabolites labeled by two substrates infused in a sequential order. In vivo 13C spectra of carboxylic/amide carbons were acquired only during the infusion of the second substrate. This approach allowed dynamic detection of 13C labeling from two substrates with considerably different labeling patterns. [2-13C]glucose and [U-13C6]glucose were used to generate singlet and doublet signals of the same carboxylic/amide carbon atom, respectively. Because of the large one-bond 13C-13C homonuclear J coupling between a carboxylic/amide carbon and an aliphatic carbon (˜50 Hz), the singlet and doublet signals of the same carboxylic/amide carbon were well distinguished. The results demonstrated that different 13C isotopomer patterns could be simultaneously and distinctly measured in vivo in a clinical setting at 3 T.

  6. Application of Array Comparative Genomic Hybridization in Newborns with Multiple Congenital Anomalies.

    PubMed

    Szczałuba, Krzysztof; Nowakowska, Beata; Sobecka, Katarzyna; Smyk, Marta; Castaneda, Jennifer; Klapecki, Jakub; Kutkowska-Kaźmierczak, Anna; Śmigiel, Robert; Bocian, Ewa; Radkowski, Marek; Demkow, Urszula

    2016-01-01

    Major congenital anomalies are detectable in 2-3 % of the newborn population. Some of their genetic causes are attributable to copy number variations identified by array comparative genomic hybridization (aCGH). The value of aCGH screening as a first-tier test in children with multiple congenital anomalies has been studied and consensus adopted. However, array resolution has not been agreed upon, specifically in the newborn or infant population. Moreover, most array studies have been focused on mixed populations of intellectual disability/developmental delay with or without multiple congenital anomalies, making it difficult to assess the value of microarrays in newborns. The aim of the study was to determine the optimal quality and clinical sensitivity of high-resolution array comparative genomic hybridization in neonates with multiple congenital anomalies. We investigated a group of 54 newborns with multiple congenital anomalies defined as two or more birth defects from more than one organ system. Cytogenetic studies were performed using the OGT CytoSure 8 × 60 K microarray. We found ten rearrangements in ten newborns. Of these, one recurrent syndromic microduplication was observed, whereas all other changes were unique. Six rearrangements were definitely pathogenic, including one submicroscopic and five that could be seen on routine karyotype analysis. Four other copy number variants were likely pathogenic. The candidate genes that may explain the phenotype were discussed. In conclusion, high-resolution array comparative hybridization can be applied successfully in newborns with multiple congenital anomalies, as the method detects a significant number of pathogenic changes, resulting in early diagnoses. We hypothesize that small changes previously considered benign, or even inherited rearrangements, should be classified as potentially pathogenic, at least until a subsequent clinical assessment excludes developmental delay or dysmorphism.

  7. Fetal Urinary Tract Anomalies: Review of Pathophysiology, Imaging, and Management.

    PubMed

    Mileto, Achille; Itani, Malak; Katz, Douglas S; Siebert, Joseph R; Dighe, Manjiri K; Dubinsky, Theodore J; Moshiri, Mariam

    2018-05-01

    Common fetal anomalies of the kidneys and urinary tract encompass a complex spectrum of abnormalities that can be detected prenatally by ultrasound. Common fetal anomalies of the kidneys and urinary tract can affect amniotic fluid volume production with the development of oligohydramnios or anhydramnios, resulting in fetal pulmonary hypoplasia and, potentially, abnormal development of other fetal structures. We provide an overview of common fetal anomalies of the kidneys and urinary tract with an emphasis on sonographic patterns as well as pathologic and postnatal correlation, along with brief recommendations for postnatal management. Of note, we present an updated classification of fetal abnormalities of the kidneys and urinary tract based on the presence or absence of associated urinary tract dilation. In addition, we review the 2014 classification of urinary tract dilation based on the Linthicum multidisciplinary consensus panel.

  8. Robustness of the sequential lineup advantage.

    PubMed

    Gronlund, Scott D; Carlson, Curt A; Dailey, Sarah B; Goodsell, Charles A

    2009-06-01

    A growing movement in the United States and around the world involves promoting the advantages of conducting an eyewitness lineup in a sequential manner. We conducted a large study (N = 2,529) that included 24 comparisons of sequential versus simultaneous lineups. A liberal statistical criterion revealed only 2 significant sequential lineup advantages and 3 significant simultaneous advantages. Both sequential advantages occurred when the good photograph of the guilty suspect or either innocent suspect was in the fifth position in the sequential lineup; all 3 simultaneous advantages occurred when the poorer quality photograph of the guilty suspect or either innocent suspect was in the second position. Adjusting the statistical criterion to control for the multiple tests (.05/24) revealed no significant sequential advantages. Moreover, despite finding more conservative overall choosing for the sequential lineup, no support was found for the proposal that a sequential advantage was due to that conservative criterion shift. Unless lineups with particular characteristics predominate in the real world, there appears to be no strong preference for conducting lineups in either a sequential or a simultaneous manner. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  9. Visualizing the chiral anomaly in Dirac and Weyl semimetals with photoemission spectroscopy

    NASA Astrophysics Data System (ADS)

    Behrends, Jan; Grushin, Adolfo G.; Ojanen, Teemu; Bardarson, Jens H.

    2016-02-01

    Quantum anomalies are the breaking of a classical symmetry by quantum fluctuations. They dictate how physical systems of diverse nature, ranging from fundamental particles to crystalline materials, respond topologically to external perturbations, insensitive to local details. The anomaly paradigm was triggered by the discovery of the chiral anomaly that contributes to the decay of pions into photons and influences the motion of superfluid vortices in 3He-A. In the solid state, it also fundamentally affects the properties of topological Weyl and Dirac semimetals, recently realized experimentally. In this work we propose that the most identifying consequence of the chiral anomaly, the charge density imbalance between fermions of different chirality induced by nonorthogonal electric and magnetic fields, can be directly observed in these materials with the existing technology of photoemission spectroscopy. With angle resolution, the chiral anomaly is identified by a characteristic note-shaped pattern of the emission spectra, originating from the imbalanced occupation of the bulk states and a previously unreported momentum dependent energy shift of the surface state Fermi arcs. We further demonstrate that the chiral anomaly likewise leaves an imprint in angle averaged emission spectra, facilitating its experimental detection. Thereby, our work provides essential theoretical input to foster the direct visualization of the chiral anomaly in condensed matter, in contrast to transport properties, such as negative magnetoresistance, which can also be obtained in the absence of a chiral anomaly.

  10. Thermal wake/vessel detection technique

    DOEpatents

    Roskovensky, John K [Albuquerque, NM; Nandy, Prabal [Albuquerque, NM; Post, Brian N [Albuquerque, NM

    2012-01-10

    A computer-automated method for detecting a vessel in water based on an image of a portion of Earth includes generating a thermal anomaly mask. The thermal anomaly mask flags each pixel of the image initially deemed to be a wake pixel based on a comparison of a thermal value of each pixel against other thermal values of other pixels localized about each pixel. Contiguous pixels flagged by the thermal anomaly mask are grouped into pixel clusters. A shape of each of the pixel clusters is analyzed to determine whether each of the pixel clusters represents a possible vessel detection event. The possible vessel detection events are represented visually within the image.
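The two-step idea described in the patent record, flagging pixels that deviate thermally from their local neighborhood and then grouping contiguous flagged pixels into clusters for shape analysis, can be sketched as follows. The function names, window size, and k-sigma threshold are our own illustrative choices, not the patented implementation:

```python
import numpy as np

def thermal_anomaly_mask(img, win=3, k=2.0):
    """Flag pixels more than k local standard deviations above the mean
    of the surrounding window (the pixel itself is excluded)."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    for i in range(h):
        for j in range(w):
            i0, i1 = max(0, i - win), min(h, i + win + 1)
            j0, j1 = max(0, j - win), min(w, j + win + 1)
            patch = img[i0:i1, j0:j1].astype(float).ravel()
            others = np.delete(patch, (i - i0) * (j1 - j0) + (j - j0))
            mask[i, j] = img[i, j] > others.mean() + k * (others.std() + 1e-9)
    return mask

def cluster_mask(mask):
    """Group 4-connected flagged pixels into clusters (lists of (row, col))."""
    seen = np.zeros_like(mask)
    clusters = []
    for i, j in zip(*np.nonzero(mask)):
        if seen[i, j]:
            continue
        stack, comp = [(i, j)], []
        seen[i, j] = True
        while stack:
            y, x = stack.pop()
            comp.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        clusters.append(comp)
    return clusters

sea = np.zeros((20, 20))
sea[10, 5:12] = 5.0  # a warm, elongated wake-like streak of 7 pixels
clusters = cluster_mask(thermal_anomaly_mask(sea))
print(len(clusters), len(clusters[0]))  # → 1 7
```

A shape test on each cluster (e.g. elongation of its bounding box) would then decide whether it represents a possible vessel-wake detection event.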

  11. GNSS reflectometry aboard the International Space Station: phase-altimetry simulation to detect ocean topography anomalies

    NASA Astrophysics Data System (ADS)

    Semmling, Maximilian; Leister, Vera; Saynisch, Jan; Zus, Florian; Wickert, Jens

    2016-04-01

    An ocean altimetry experiment using Earth-reflected GNSS signals has been proposed to the European Space Agency (ESA). It is part of the GNSS Reflectometry Radio Occultation Scatterometry (GEROS) mission that is planned aboard the International Space Station (ISS). Altimetric simulations are presented that examine the detection of ocean topography anomalies assuming GNSS phase delay observations. Such delay measurements are well established for positioning and are possible due to a sufficient synchronization of GNSS receiver and transmitter. For altimetric purposes, delays of Earth-reflected GNSS signals can be observed similarly to radar altimeter signals. The advantage of GNSS is the synchronized separation of transmitter and receiver, which allows a significantly increased number of observations per receiver, with more than 70 GNSS transmitters currently in orbit. The altimetric concept has already been applied successfully to flight data recorded over the Mediterranean Sea. The presented altimetric simulation considers anomalies in the Agulhas current region which are obtained from the Regional Ocean Modeling System (ROMS). Suitable reflection events in an elevation range between 3° and 30° last about 10 min, with ground-track lengths >3000 km. Typical along-track footprints (1 s signal integration time) have a length of about 5 km. The reflection's Fresnel zone limits the footprint of coherent observations to a major-axis extension between 1 and 6 km, depending on the elevation. The altimetric performance depends on the signal-to-noise ratio (SNR) of the reflection. Simulation results show that precision is better than 10 cm for an SNR of 30 dB, whereas it is worse than 0.5 m if the SNR goes down to 10 dB. Precision, in general, improves towards higher elevation angles. Critical biases are introduced by atmospheric and ionospheric refraction. Corresponding correction strategies are still under investigation.

  12. Analysis of genitourinary anomalies in patients with VACTERL (Vertebral anomalies, Anal atresia, Cardiac malformations, Tracheo-Esophageal fistula, Renal anomalies, Limb abnormalities) association

    PubMed Central

    Solomon, Benjamin D.; Raam, Manu S.; Pineda-Alvarez, Daniel E.

    2010-01-01

    Purpose The goal of this study was to describe a novel pattern of genitourinary (GU) anomalies in VACTERL association, which involves congenital anomalies affecting the vertebrae, anus, heart, trachea and esophagus, kidneys, and limbs. Procedures We collected clinical data on 105 patients diagnosed with VACTERL association and analyzed a subset of 89 patients who met more stringent inclusion criteria. Findings Twenty-one percent of patients have GU anomalies, which are more severe (but not more frequent) in females. Anomalies were noted in patients without malformations affecting the renal, lower vertebral, or lower gastrointestinal systems. Conclusions There should be a high index of suspicion for the presence of GU anomalies even in patients who do not have spatially similar malformations. PMID:21235632

  13. A detailed description of the sequential probability ratio test for 2-IMU FDI

    NASA Technical Reports Server (NTRS)

    Rich, T. M.

    1976-01-01

    The sequential probability ratio test (SPRT) for 2-IMU FDI (inertial measuring unit failure detection/isolation) is described. The SPRT is a statistical technique for detecting and isolating soft IMU failures originally developed for the strapdown inertial reference unit. The flowchart of a subroutine incorporating the 2-IMU SPRT is included.
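A generic Wald SPRT of the kind such a subroutine implements can be sketched as follows. The Gaussian measurement model, the parameter values, and the error rates are illustrative assumptions, not the flight algorithm from the report:

```python
import math

# Wald sequential probability ratio test (SPRT) deciding between
# H0: mean mu0 ("healthy") and H1: mean mu1 ("failed") for a stream of
# Gaussian measurements with known standard deviation sigma.
def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    upper = math.log((1 - beta) / alpha)   # cross upward: accept H1 (failure)
    lower = math.log(beta / (1 - alpha))   # cross downward: accept H0 (healthy)
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # incremental log-likelihood ratio of one Gaussian observation
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

print(sprt([1.1, 0.9, 1.2, 1.0, 0.8, 1.1, 0.9, 1.05]))  # → ('H1', 6)
```

The attraction for soft-failure detection is that the test accumulates weak evidence over time and stops as soon as either threshold is crossed, with the thresholds set directly from the tolerated false-alarm and missed-detection rates.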

  14. MAGSAT anomaly map and continental drift

    NASA Technical Reports Server (NTRS)

    Lemouel, J. L. (Principal Investigator); Galdeano, A.; Ducruix, J.

    1981-01-01

    Anomaly maps of high quality are needed to display unambiguously the so-called long-wavelength anomalies. The anomalies were analyzed in terms of continental drift, and the nature of their sources is discussed. The map presented confirms the thinness of the oceanic magnetized layer. Continental magnetic anomalies are characterized by elongated structures, generally of east-west trend. Paleomagnetic reconstruction shows that the anomalies found in India, Australia, and Antarctica exhibit a fair consistency with the African anomalies. It is also shown that anomalies are locked under the continents and have a fixed geometry.

  15. Using principal component analysis for selecting network behavioral anomaly metrics

    NASA Astrophysics Data System (ADS)

    Gregorio-de Souza, Ian; Berk, Vincent; Barsamian, Alex

    2010-04-01

    This work addresses new approaches to behavioral analysis of networks and hosts for the purposes of security monitoring and anomaly detection. Most commonly used approaches simply implement anomaly detectors for one, or a few, simple metrics, and those metrics can exhibit unacceptable false alarm rates. For instance, if the anomaly score of network communication is defined as the reciprocal of the likelihood that a given host uses a particular protocol (or destination), this definition may result in an unrealistically high threshold for alerting to avoid being flooded by false positives. We demonstrate that selecting and adapting the metrics and thresholds, on a host-by-host or protocol-by-protocol basis, can be done by established multivariate analyses such as PCA. We show how to determine one or more metrics, for each network host, that record the highest available amount of information regarding the baseline behavior and show relevant deviations reliably. We describe the methodology used to pick from a large selection of available metrics, and illustrate a method for comparing the resulting classifiers. Using our approach we are able to reduce the resources required to properly identify misbehaving hosts, protocols, or networks, by dedicating system resources to only those metrics that actually matter in detecting network deviations.
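As an illustration of the general approach (not the authors' tool), the following sketch standardizes synthetic per-host traffic metrics, runs PCA via the eigendecomposition of the correlation matrix, and ranks each metric by its loading on the leading component. The metric names and data are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
bytes_out = rng.normal(1000, 200, n)              # informative metric
pkts_out = bytes_out / 10 + rng.normal(0, 5, n)   # largely redundant with it
noise = rng.normal(0, 1, n)                       # uninformative metric
X = np.column_stack([bytes_out, pkts_out, noise])
names = ["bytes_out", "pkts_out", "noise"]

Z = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize each metric
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]                 # sort by variance, descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
explained = eigvals / eigvals.sum()

# score each metric by its |loading| on PC1, weighted by variance explained
scores = np.abs(eigvecs[:, 0]) * explained[0]
ranking = [names[i] for i in np.argsort(scores)[::-1]]
print(ranking)  # 'noise' should rank last
```

The two correlated traffic metrics dominate the first component while the uninformative metric contributes almost nothing, which is the basis for dedicating monitoring resources only to the metrics that carry baseline information.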

  16. Development anomaly and non-vitality: Two case reports

    PubMed Central

    Kailasam, Sivakumar; Thangavel, Boopathi; Mathew, Sebeena; Das, Arjun Kesavan Purushotaman; Jayakodi, Harikaran; Kumaravadivel, Karthick

    2012-01-01

    Anatomic aberrations are seen in the human dentition. The maxillary incisor region of the permanent dentition, where these anatomical aberrations are commonly seen, is considered an area of embryonic hazard. Aberrations affecting the internal and external morphology can at times be the cause of complex pathological conditions involving the pulpal and periodontal tissues, and can pose a challenge to the clinician in diagnosis and clinical management. Detecting and treating these anomalies at an early phase is essential, as they pose a threat to the vitality of the affected teeth. The aim of this paper is to highlight how two different developmental anomalies of maxillary incisors, namely palatoradicular groove and Turner's hypoplasia, led to the loss of vitality of the affected teeth. PMID:23066269

  17. A Comparative Evaluation of Anomaly Detection Algorithms for Maritime Video Surveillance

    DTIC Science & Technology

    2011-01-01

    ... of k-means clustering and the k-NN Localized p-value Estimator (KNN-LPE). K-means is a popular distance-based clustering algorithm while KNN-LPE ... implemented the sparse cluster identification rule we described in Section 3.1. 2. k-NN Localized p-value Estimator (KNN-LPE): We implemented this using ... Average Density (KNN-NAD): This was implemented as described in Section 3.4. Algorithm Parameter Settings: The global and local density-based anomaly ...

  18. Poland syndrome a rare congenital anomaly.

    PubMed

    Ibrahim, Aliyu; Ramatu, Abdallah; Helen, Akhiwu

    2013-07-01

    Poland syndrome is a rare congenital anomaly classically consisting of unilateral hypoplasia of the sternocostal head of the pectoralis major muscle and ipsilateral brachysyndactyly. It was first described by Alfred Poland in 1840 and may occur with varying severity. Our patient is an eight-year-old Nigerian girl with a left-sided anterior chest wall defect and no detectable structural heart abnormality, who presented with repeated episodes of syncopal attacks following minor trauma to the anterior chest wall.

  19. Sequential Syndrome Decoding of Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Reed, I. S.; Truong, T. K.

    1984-01-01

    The algebraic structure of convolutional codes is reviewed and sequential syndrome decoding is applied to these codes. These concepts are then used to demonstrate, by example, actual sequential decoding using the stack algorithm. The Fano metric for use in sequential decoding is modified so that it can be utilized to sequentially find the minimum-weight error sequence.

  20. SSME propellant path leak detection real-time

    NASA Technical Reports Server (NTRS)

    Crawford, R. A.; Smith, L. M.

    1994-01-01

    Included are four documents that outline the technical aspects of the research performed on NASA Grant NAG8-140: 'A System for Sequential Step Detection with Application to Video Image Processing'; 'Leak Detection from the SSME Using Sequential Image Processing'; 'Digital Image Processor Specifications for Real-Time SSME Leak Detection'; and 'A Color Change Detection System for Video Signals with Applications to Spectral Analysis of Rocket Engine Plumes'.

  1. Adaptive sequential Bayesian classification using Page's test

    NASA Astrophysics Data System (ADS)

    Lynch, Robert S., Jr.; Willett, Peter K.

    2002-03-01

    In this paper, the previously introduced Mean-Field Bayesian Data Reduction Algorithm is extended for adaptive sequential hypothesis testing utilizing Page's test. In general, Page's test is well understood as a method of detecting a permanent change in distribution associated with a sequence of observations. However, the relationship between detecting a change in distribution utilizing Page's test with that of classification and feature fusion is not well understood. Thus, the contribution of this work is based on developing a method of classifying an unlabeled vector of fused features (i.e., detect a change to an active statistical state) as quickly as possible given an acceptable mean time between false alerts. In this case, the developed classification test can be thought of as equivalent to performing a sequential probability ratio test repeatedly until a class is decided, with the lower log-threshold of each test being set to zero and the upper log-threshold being determined by the expected distance between false alerts. It is of interest to estimate the delay (or, related stopping time) to a classification decision (the number of time samples it takes to classify the target), and the mean time between false alerts, as a function of feature selection and fusion by the Mean-Field Bayesian Data Reduction Algorithm. Results are demonstrated by plotting the delay to declaring the target class versus the mean time between false alerts, and are shown using both different numbers of simulated training data and different numbers of relevant features for each class.
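Page's test itself is the classical CUSUM recursion: accumulate the per-sample log-likelihood ratio, reflect at the zero lower threshold, and alarm when an upper threshold h (set by the acceptable mean time between false alerts) is crossed. Below is a minimal sketch with an assumed Gaussian mean-shift model; it illustrates Page's test only, not the Mean-Field Bayesian Data Reduction Algorithm or its feature fusion:

```python
# Page's test (CUSUM) for detecting a change from a quiescent state
# (mean mu0) to an "active" state (mean mu1) in a Gaussian data stream.
def page_test(samples, mu0=0.0, mu1=1.0, sigma=1.0, h=4.0):
    s = 0.0
    for n, x in enumerate(samples, start=1):
        llr = (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
        s = max(0.0, s + llr)   # reflect at the zero lower threshold
        if s >= h:
            return n            # alarm: change to the active state declared
    return None                 # no alarm within the stream

# quiet period for 8 samples, then a change in mean at sample 9
stream = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 0.0, -0.3] + [1.0] * 10
print(page_test(stream))  # → 16
```

Resetting at zero is what makes the procedure equivalent to repeated SPRTs with a zero lower log-threshold, as described in the abstract; raising h trades a longer detection delay for a longer mean time between false alerts.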

  2. Diagnostic accuracy of ultrasonography and magnetic resonance imaging for the detection of fetal anomalies: a blinded case–control study

    PubMed Central

    Gonçalves, L. F.; Lee, W.; Mody, S.; Shetty, A.; Sangi-Haghpeykar, H.; Romero, R.

    2018-01-01

    Objectives To compare the accuracy of two-dimensional ultrasound (2D-US), three-dimensional ultrasound (3D-US) and magnetic resonance imaging (MRI) for the diagnosis of congenital anomalies without prior knowledge of indications and previous imaging findings. Methods This was a prospective, blinded case–control study comprising women with a singleton pregnancy with fetal congenital abnormalities identified on clinical ultrasound and those with an uncomplicated pregnancy. All women volunteered to undergo 2D-US, 3D-US and MRI, which were performed at one institution. Different examiners at a collaborating institution performed image interpretation. Sensitivity and specificity of the three imaging methods were calculated for individual anomalies, based on postnatal imaging and/or autopsy as the definitive diagnosis. Diagnostic confidence was graded on a four-point Likert scale. Results A total of 157 singleton pregnancies were enrolled; however, nine cases were excluded owing to incomplete outcome data, resulting in 148 fetuses (58 cases and 90 controls) included in the final analysis. Among cases, 13 (22.4%) had central nervous system (CNS) anomalies, 40 (69.0%) had non-CNS anomalies and five (8.6%) had both CNS and non-CNS anomalies. The main findings were: (1) MRI was more sensitive than 3D-US for diagnosing CNS anomalies (MRI, 88.9% (16/18) vs 3D-US, 66.7% (12/18) vs 2D-US, 72.2% (13/18); McNemar’s test for MRI vs 3D-US: P=0.046); (2) MRI provided additional information affecting prognosis and/or counseling in 22.2% (4/18) of fetuses with CNS anomalies; (3) 2D-US, 3D-US and MRI had similar sensitivity for diagnosing non-CNS anomalies; (4) specificity for all anomalies was highest for 3D-US (MRI, 85.6% (77/90) vs 3D-US, 94.4% (85/90) vs 2D-US, 92.2% (83/90); McNemar’s test for MRI vs 3D-US: P=0.03); and (5) the confidence of MRI for ruling out certain CNS abnormalities (usually questionable for cortical dysplasias or hemorrhage) that were not confirmed after

  3. [Congenital generalized lipodystrophy in a patient with Dandy Walker anomaly].

    PubMed

    Luna, Cecilia Inés; Fernández Cordero, Marisa; Escruela, Romina; Sierra, Valeria; Córdoba, Antonela; Goñi, Ignacio María; Berridi, Ricardo

    2014-10-01

    The objective of this study is to describe the unexpected association between congenital generalized lipodystrophy (CGL) and Dandy Walker anomaly. We report the case of a 1-year-old infant who was hospitalized at her fourth month of life with a diagnosis of Dandy Walker anomaly and an increased social risk. During her hospitalization, she progressively developed an acromegaloid aspect, triangular facies, hirsutism, lipoatrophy, muscle hypertrophy, clitoromegaly, abdominal distention, progressive hepatomegaly, and hypertriglyceridemia. This led to the clinical diagnosis of congenital generalized lipodystrophy. Careful clinical examination and interdisciplinary follow-up are important for the proper detection of insulin resistance and diabetes, early puberty, and cardiomyopathy, among other complications. In cases of Dandy Walker anomaly, the clinical course should be monitored for signs of intracranial hypertension. Due to its autosomal recessive nature, it is important to provide genetic counseling to the parents.

  4. De novo pericentric inversion of chromosome 9 in congenital anomaly.

    PubMed

    Jeong, Seon-Yong; Kim, Bo-Young; Yu, Jae Eun

    2010-09-01

    The pericentric inversion of chromosome 9 is one of the most common structural balanced chromosomal variations and has been found in both normal populations and patients with various abnormal phenotypes and diseases. The aim of this study was to re-evaluate the clinical impact of inv(9)(p11q13). We studied the karyotypes of 431 neonates with congenital anomalies at the Pediatric Clinic in Ajou University Hospital between 2004 and 2008 and retrospectively reviewed their clinical data. Chromosomal aberrations were detected in 60 patients (13.9%). The most common type of structural abnormality was inv(9)(p11q13), found in eight patients. Clinical investigation revealed that all eight cases with inv(9)(p11q13) had various congenital anomalies including: polydactyly, club foot, microtia, deafness, asymmetric face, giant Meckel's diverticulum, duodenal diaphragm, small bowel malrotation, pulmonary stenosis, cardiomyopathy, arrhythmia, and intrauterine growth restriction. The cytogenetic analysis of parents showed that all of the cases were de novo heterozygous inv(9)(p11q13). Since our results indicate that the incidence of inv(9)(p11q13) in patients with congenital anomalies was not significantly different from the normal population, inv(9)(p11q13) does not appear to be pathogenic with regard to the congenital anomalies. Some other, to date unknown, causes of the anomalies remain to be identified.

  5. Sequential biases in accumulating evidence

    PubMed Central

    Huggins, Richard; Dogo, Samson Henry

    2015-01-01

    Whilst it is common in clinical trials to use the results of tests at one phase to decide whether to continue to the next phase and to subsequently design the next phase, we show that this can lead to biased results in evidence synthesis. Two new kinds of bias associated with accumulating evidence, termed ‘sequential decision bias’ and ‘sequential design bias’, are identified. Both kinds of bias are the result of making decisions on the usefulness of a new study, or its design, based on the previous studies. Sequential decision bias is determined by the correlation between the value of the current estimated effect and the probability of conducting an additional study. Sequential design bias arises from using the estimated value instead of the clinically relevant value of an effect in sample size calculations. We considered both the fixed‐effect and the random‐effects models of meta‐analysis and demonstrated analytically and by simulations that in both settings the problems due to sequential biases are apparent. According to our simulations, the sequential biases increase with increased heterogeneity. Minimisation of sequential biases arises as a new and important research area necessary for successful evidence‐based approaches to the development of science. © 2015 The Authors. Research Synthesis Methods Published by John Wiley & Sons Ltd. PMID:26626562
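    The continuation rule described above can be made concrete with a toy simulation. The stopping rule, effect sizes and study counts below are illustrative assumptions, not taken from the paper; the sketch only shows how letting the current pooled estimate drive the decision to run another study biases a simple fixed-effect meta-analysis even when the true effect is zero.

```python
import random

def simulate_sequential_decision_bias(true_effect=0.0, se=1.0,
                                      max_studies=10, n_sims=2000, seed=1):
    """Toy fixed-effect meta-analysis in which the decision to conduct
    another study depends on the current pooled estimate (a crude stand-in
    for 'sequential decision bias')."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_sims):
        effects = [rng.gauss(true_effect, se)]
        while len(effects) < max_studies:
            pooled = sum(effects) / len(effects)
            if pooled <= 0:  # stop once the estimate looks unpromising
                break
            effects.append(rng.gauss(true_effect, se))
        finals.append(sum(effects) / len(effects))
    # Average final pooled estimate over many simulated meta-analyses.
    return sum(finals) / n_sims
```

With a true effect of zero, the returned average pooled estimate is systematically displaced from zero: the accumulation process is stopped selectively, so the synthesised evidence is biased even though every individual study is unbiased.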

  6. A Novel Ship-Tracking Method for GF-4 Satellite Sequential Images.

    PubMed

    Yao, Libo; Liu, Yong; He, You

    2018-06-22

    The geostationary remote sensing satellite has the capability of wide scanning, persistent observation and operational response, and has tremendous potential for maritime target surveillance. The GF-4 satellite is the first geostationary orbit (GEO) optical remote sensing satellite with medium resolution in China. In this paper, a novel ship-tracking method in GF-4 satellite sequential imagery is proposed. The algorithm has three stages. First, a local visual saliency map based on local peak signal-to-noise ratio (PSNR) is used to detect ships in a single frame of GF-4 satellite sequential images. Second, each potential target is accurately positioned by a dynamic correction using the rational polynomial coefficients (RPCs) and automatic identification system (AIS) data of ships. Finally, an improved multiple hypotheses tracking (MHT) algorithm with amplitude information is used to track ships by further removing the false targets, and to estimate ships’ motion parameters. The algorithm has been tested using GF-4 sequential images and AIS data. The results of the experiment demonstrate that the algorithm achieves good tracking performance in GF-4 satellite sequential images and estimates the motion information of ships accurately.

  7. First branchial groove anomaly.

    PubMed

    Kumar, M; Hickey, S; Joseph, G

    2000-06-01

    First branchial groove anomalies are very rare. We report a case of a first branchial groove anomaly presenting as an infected cyst in an 11-month-old child. Management of such lesions is complicated by their close association with the facial nerve. Surgical management must include identification and protection of the facial nerve. The embryology and the disposition of the facial nerve in relation to the anomaly are reviewed.

  8. Detecting Signals of Disproportionate Reporting from Singapore's Spontaneous Adverse Event Reporting System: An Application of the Sequential Probability Ratio Test.

    PubMed

    Chan, Cheng Leng; Rudrappa, Sowmya; Ang, Pei San; Li, Shu Chuen; Evans, Stephen J W

    2017-08-01

    The ability to detect safety concerns from spontaneous adverse drug reaction reports in a timely and efficient manner remains important in public health. This paper explores the behaviour of the Sequential Probability Ratio Test (SPRT) and its ability to detect signals of disproportionate reporting (SDRs) in the Singapore context. We used the SPRT with a combination of two hypothesised relative risks (hRRs) of 2 and 4.1 to detect signals of both common and rare adverse events in our small database. We compared the SPRT with other methods in terms of the number of signals detected and whether labelled adverse drug reactions were detected or the reaction terms were considered serious. The other methods used were the reporting odds ratio (ROR), the Bayesian Confidence Propagation Neural Network (BCPNN) and the Gamma Poisson Shrinker (GPS). The SPRT produced 2187 signals in common with all methods, 268 unique signals, and 70 signals in common with at least one other method; it did not produce signals in 178 cases where two other methods detected them, and 403 signals were unique to one of the other methods. In terms of sensitivity, the ROR performed better than the other methods, but the SPRT found more new signals. The performances of the methods were similar for negative predictive value and specificity. Using a combination of hRRs for the SPRT could provide a useful screening tool for regulatory agencies, and more detailed investigation of the medical utility of the system is merited.
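    A minimal sketch of the Wald-style SPRT screening described above, for a Poisson-distributed report count whose expected value comes from the database background rate. The hypothesised relative risk, alpha and beta values below are illustrative assumptions, not the paper's settings.

```python
import math

def sprt_llr(observed, expected, h_rr):
    """Poisson log-likelihood ratio for H1: rate = h_rr * expected
    versus H0: rate = expected."""
    return observed * math.log(h_rr) - expected * (h_rr - 1.0)

def sprt_decision(observed, expected, h_rr, alpha=0.05, beta=0.2):
    """Wald SPRT: compare the LLR against the two stopping boundaries."""
    upper = math.log((1.0 - beta) / alpha)   # cross -> signal (accept H1)
    lower = math.log(beta / (1.0 - alpha))   # cross -> no signal (accept H0)
    llr = sprt_llr(observed, expected, h_rr)
    if llr >= upper:
        return "signal"
    if llr <= lower:
        return "no signal"
    return "continue"
```

In sequential use the test is re-run as reports accumulate; a drug-event pair keeps returning "continue" until one boundary is crossed, which is what makes the procedure suitable for routine screening.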

  9. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks.

    PubMed

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-06-13

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens' quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and a rate 26% higher in a scenario with a false positive rate of 15%.
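    The headline comparison metric, true positive rate at a capped false positive rate, can be computed from any detector's anomaly scores. The sketch below is a generic illustration (the scores, labels and thresholding scheme are assumptions, not the authors' evaluation code); higher score means more anomalous, and label 1 marks an attack.

```python
def tpr_at_max_fpr(scores, labels, max_fpr):
    """True positive rate at the most permissive threshold whose false
    positive rate stays within max_fpr."""
    neg = sorted((s for s, y in zip(scores, labels) if y == 0), reverse=True)
    pos = [s for s, y in zip(scores, labels) if y == 1]
    k = int(max_fpr * len(neg))              # false positives we may admit
    threshold = neg[k] if k < len(neg) else float("-inf")
    return sum(s > threshold for s in pos) / len(pos)
```

Evaluating every candidate technique with the same fixed-FPR budget is what makes statements like "a true positive rate at least 56% higher at 5% FPR" comparable across detectors.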

  10. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks

    PubMed Central

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-01-01

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens’ quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and a rate 26% higher in a scenario with a false positive rate of 15%. PMID:27304957

  11. A sonographic approach to prenatal classification of congenital spine anomalies

    PubMed Central

    Robertson, Meiri; Sia, Sock Bee

    2015-01-01

    Abstract Objective: To develop a classification system for congenital spine anomalies detected by prenatal ultrasound. Methods: Data were collected from fetuses with spine abnormalities diagnosed in our institution over a five‐year period between June 2005 and June 2010. The ultrasound images were analysed to determine which features were associated with different congenital spine anomalies. Findings on the prenatal ultrasound images were correlated with other prenatal imaging, post mortem findings, post mortem imaging, neonatal imaging, karyotype, and other genetic workup. Data from published case reports of prenatal diagnosis of rare congenital spine anomalies were analysed to provide a comprehensive overview. Results: During the study period, eighteen cases of spine abnormalities were diagnosed in 7819 women. The mean gestational age at diagnosis was 18.8 ± 2.2 (SD) weeks. While most cases represented open neural tube defects, a spectrum of vertebral abnormalities was diagnosed prenatally. These included hemivertebrae, block vertebrae, cleft or butterfly vertebrae, sacral agenesis, and a lipomeningocele. The most sensitive features for diagnosis of a spine abnormality were flaring of the vertebral arch ossification centres, abnormal spine curvature, and short spine length. While reported findings at the time of diagnosis were often conservative, retrospective analysis revealed good correlation with radiographic imaging. 3D imaging was found to be a valuable tool in many settings. Conclusions: Analysis of the study findings showed that prenatal ultrasound allowed detection of disruption to the normal appearances of the fetal spine. Using the three features of flaring of the vertebral arch ossification centres, abnormal spine curvature, and short spine length, an algorithm was devised to aid the diagnosis of spine anomalies by those who perform and report prenatal ultrasound. PMID:28191204

  12. Sequential estimation of surface water mass changes from daily satellite gravimetry data

    NASA Astrophysics Data System (ADS)

    Ramillien, G. L.; Frappart, F.; Gratton, S.; Vasseur, X.

    2015-03-01

    We propose a recursive Kalman filtering approach to map regional spatio-temporal variations of terrestrial water mass over large continental areas, such as South America. Instead of correcting hydrology model outputs by the GRACE observations using a Kalman filter estimation strategy, regional 2-by-2 degree water mass solutions are constructed by integration of daily potential differences deduced from GRACE K-band range rate (KBRR) measurements. Recovery of regional water mass anomaly averages obtained by accumulation of information of daily noise-free simulated GRACE data shows that convergence is relatively fast and yields accurate solutions. In the case of cumulating real GRACE KBRR data contaminated by observational noise, the sequential method of step-by-step integration provides estimates of water mass variation for the period 2004-2011 by considering a set of suitable a priori error uncertainty parameters to stabilize the inversion. Spatial and temporal averages of the Kalman filter solutions over river basin surfaces are consistent with the ones computed using global monthly/10-day GRACE solutions from official providers CSR, GFZ and JPL. They are also highly correlated to in situ records of river discharges (70-95 %), especially for the Obidos station where the total outflow of the Amazon River is measured. The sparse daily coverage of the GRACE satellite tracks limits the time resolution of the regional Kalman filter solutions, and thus the detection of short-term hydrological events.
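    A scalar random-walk Kalman filter illustrates the sequential (step-by-step) update idea behind the regional solutions. The state, noise model and parameter values below are deliberately simplified assumptions for illustration, not the GRACE processing chain.

```python
def kalman_1d(observations, obs_var, process_var, x0=0.0, p0=1e6):
    """Sequentially estimate a slowly varying quantity (e.g. a regional
    water-mass anomaly) from a stream of noisy daily observations,
    using a random-walk state model."""
    x, p = x0, p0
    estimates = []
    for z in observations:
        p = p + process_var            # predict: random walk inflates variance
        k = p / (p + obs_var)          # Kalman gain
        x = x + k * (z - x)            # update with the innovation z - x
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates
```

The large prior variance `p0` plays the role of a diffuse starting solution: the first observations dominate, and convergence is fast, mirroring the "relatively fast" accumulation behaviour the abstract reports for noise-free daily data.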

  13. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO).

    PubMed

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-07-13

    The ability to identify hazardous traffic events is already considered one of the most effective solutions for reducing the occurrence of crashes. Previous studies have examined only certain particular hazardous traffic events, mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a naturalistic driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm ranked best in terms of prediction accuracy. These conclusions can provide reference evidence for the development of dangerous-situation warning products and the design of intelligent vehicles.

  14. Sequential (step-by-step) detection, identification and quantitation of extra virgin olive oil adulteration by chemometric treatment of chromatographic profiles.

    PubMed

    Capote, F Priego; Jiménez, J Ruiz; de Castro, M D Luque

    2007-08-01

    An analytical method for the sequential detection, identification and quantitation of extra virgin olive oil adulteration with four edible vegetable oils--sunflower, corn, peanut and coconut oils--is proposed. The only data required for this method are the results obtained from an analysis of the lipid fraction by gas chromatography-mass spectrometry. A total number of 566 samples (pure oils and samples of adulterated olive oil) were used to develop the chemometric models, which were designed to accomplish, step-by-step, the three aims of the method: to detect whether an olive oil sample is adulterated, to identify the type of adulterant used in the fraud, and to determine how much adulterant is in the sample. Qualitative analysis was carried out via two chemometric approaches--soft independent modelling of class analogy (SIMCA) and K nearest neighbours (KNN)--and both approaches exhibited prediction abilities that were always higher than 91% for adulterant detection and 88% for identification of the type of adulterant. Quantitative analysis was based on partial least squares regression (PLSR), which yielded R2 values of >0.90 for calibration and validation sets and thus made it possible to determine adulteration with excellent precision according to the Shenk criteria.

  15. Hybrid anomaly and gravity mediation for electroweak supersymmetry

    NASA Astrophysics Data System (ADS)

    Zhu, Bin; Ding, Ran; Li, Tianjun

    2018-03-01

    In this paper, we propose a hybrid mediation and hybrid supersymmetry breaking scenario. In particular, the RG-invariant anomaly mediation is considered. Together with additional gravity mediation, the slepton tachyon problem of anomaly mediation is solved automatically. A special property is that all colored sparticle masses fall into the several-TeV region due to the large m0 and m3/2, well beyond the scope of current LHC Run II limits. Unlike in gauge mediation, the dark matter candidate is still the lightest neutralino, and the correct dark matter relic density can be realized within the framework of mixed axion-Wino dark matter. Due to the existence of multi-component axion-Wino dark matter, the direct-detection cross section is suppressed enough to evade the tightest LUX and PandaX bounds.

  16. Trial Sequential Analysis in systematic reviews with meta-analysis.

    PubMed

    Wetterslev, Jørn; Jakobsen, Janus Christian; Gluud, Christian

    2017-03-06

    Most meta-analyses in systematic reviews, including Cochrane ones, do not have sufficient statistical power to detect or refute even large intervention effects. This is why a meta-analysis ought to be regarded as an interim analysis on its way towards a required information size. The results of the meta-analyses should relate the total number of randomised participants to the estimated required meta-analytic information size, accounting for statistical diversity. When the number of participants and the corresponding number of trials in a meta-analysis are insufficient, the use of the traditional 95% confidence interval or the 5% statistical significance threshold will lead to too many false positive conclusions (type I errors) and too many false negative conclusions (type II errors). We developed a methodology for interpreting meta-analysis results, using generally accepted, valid evidence on how to adjust thresholds for significance in randomised clinical trials when the required sample size has not been reached. The Lan-DeMets trial sequential monitoring boundaries in Trial Sequential Analysis offer adjusted confidence intervals and restricted thresholds for statistical significance when the diversity-adjusted required information size and the corresponding number of required trials for the meta-analysis have not been reached. Trial Sequential Analysis provides a frequentist approach to control both type I and type II errors. We define the required information size and the corresponding number of required trials in a meta-analysis, and the diversity (D²) measure of heterogeneity. We explain the reasons for using Trial Sequential Analysis of meta-analysis when the actual information size fails to reach the required information size. We present examples drawn from traditional meta-analyses using unadjusted naïve 95% confidence intervals and 5% thresholds for statistical significance. Spurious conclusions in systematic reviews with traditional meta-analyses can
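    The diversity-adjusted required information size can be sketched for the simple case of a 1:1 comparison of two means. The expression below is the standard two-sample size formula inflated by 1/(1 − D²); the function and parameter names are mine, and this is an illustrative sketch rather than the authors' TSA software.

```python
from statistics import NormalDist

def required_information_size(delta, sigma, alpha=0.05, beta=0.10,
                              d_squared=0.0):
    """Total number of participants required to detect a mean difference
    `delta` (SD `sigma`) in a 1:1 randomised comparison, inflated for
    between-trial diversity D^2 (0 <= d_squared < 1)."""
    z = NormalDist()
    z_a = z.inv_cdf(1.0 - alpha / 2.0)   # two-sided significance threshold
    z_b = z.inv_cdf(1.0 - beta)          # desired power
    ris = 4.0 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2
    return ris / (1.0 - d_squared)       # diversity adjustment
```

A meta-analysis whose cumulative number of randomised participants falls short of this figure is, in the TSA view, an interim analysis and should not be interpreted against naïve 95% confidence intervals.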

  17. Framework for behavioral analytics in anomaly identification

    NASA Astrophysics Data System (ADS)

    Touma, Maroun; Bertino, Elisa; Rivera, Brian; Verma, Dinesh; Calo, Seraphin

    2017-05-01

    Behavioral Analytics (BA) relies on digital breadcrumbs to build user profiles and create clusters of entities that exhibit a large degree of similarity. The prevailing assumption is that an entity will assimilate the group behavior of the cluster it belongs to. Our understanding of BA and its application in different domains continues to evolve and is a direct result of the growing interest in Machine Learning research. When trying to detect security threats, we use BA techniques to identify anomalies, defined in this paper as deviation from the group behavior. Early research papers in this field reveal a high number of false positives, where a security alert is triggered based on deviation from the cluster's learned behavior but still within the norm of what the system defines as acceptable behavior. Further, domain-specific security policies tend to be narrow and inadequately represent what an entity can do. Hence, they (a) limit the amount of useful data during the learning phase and (b) lead to violations of policy during the execution phase. In this paper, we propose a framework for future research on the role of policies and behavior security in a coalition setting, with emphasis on anomaly detection and an individual's deviation from group activities.

  18. Machine Learning in Intrusion Detection

    DTIC Science & Technology

    2005-07-01

    machine learning tasks. Anomaly detection provides the core technology for a broad spectrum of security-centric applications. In this dissertation, we examine various aspects of anomaly based intrusion detection in computer security. First, we present a new approach to learn program behavior for intrusion detection. Text categorization techniques are adopted to convert each process to a vector and calculate the similarity between two program activities. Then the k-nearest neighbor classifier is employed to classify program behavior as normal or intrusive. We demonstrate
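    The text-categorization analogy described above can be sketched in a few lines: each process trace becomes a call-frequency vector, and a k-nearest-neighbour vote over cosine similarity labels a new trace as normal or intrusive. The traces and labels below are invented for illustration; this is not the dissertation's code.

```python
from collections import Counter
from math import sqrt

def cosine(a, b):
    """Cosine similarity of two sparse frequency vectors (Counters)."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_classify(trace, training, k=3):
    """Label a process trace (list of system calls) by majority vote of
    its k most similar training traces."""
    vec = Counter(trace)
    sims = sorted(((cosine(vec, Counter(t)), label) for t, label in training),
                  reverse=True)
    votes = Counter(label for _, label in sims[:k])
    return votes.most_common(1)[0][0]
```

Treating system-call traces like documents is what lets standard text-categorization machinery (frequency vectors, cosine similarity, kNN) carry over unchanged to program-behavior modeling.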

  19. Interplay between the b → sℓℓ anomalies and dark matter physics

    NASA Astrophysics Data System (ADS)

    Kawamura, Junichiro; Okawa, Shohei; Omura, Yuji

    2017-10-01

    Recently, the LHCb Collaboration has reported excesses in the b → sℓℓ processes. One of the promising new-physics candidates to explain the anomalies is the extended Standard Model (SM) with vectorlike quarks and leptons. In that model, Yukawa couplings between the extra fermions and the SM fermions are introduced, together with extra scalars. The box diagrams involving the extra fields then account for the b → sℓℓ anomalies. It is known that the excesses require large Yukawa couplings of the leptons, so this kind of model can be tested by studying correlations with other observables. In this paper, we consider the extra scalar to be a dark matter (DM) candidate, and investigate DM physics as well as flavor physics and LHC physics. The DM relic density and the direct-detection cross section are also dominantly governed by the Yukawa couplings, so we find explicit correlations between DM physics and flavor physics. In particular, we confront the predictions of the b → sℓℓ anomalies with the direct detection of DM.

  20. The parallel-sequential field subtraction technique for coherent nonlinear ultrasonic imaging

    NASA Astrophysics Data System (ADS)

    Cheng, Jingwei; Potter, Jack N.; Drinkwater, Bruce W.

    2018-06-01

    Nonlinear imaging techniques have recently emerged which have the potential to detect cracks at a much earlier stage than was previously possible and have sensitivity to partially closed defects. This study explores a coherent imaging technique based on the subtraction of two modes of focusing: parallel, in which the elements are fired together with a delay law, and sequential, in which elements are fired independently. In parallel focusing a high-intensity ultrasonic beam is formed in the specimen at the focal point. In sequential focusing, however, only low-intensity signals from individual elements enter the sample, and the full matrix of transmit-receive signals is recorded and post-processed to form an image. Under linear elastic assumptions, both parallel and sequential images are expected to be identical. Here we measure the difference between these images and use this to characterise the nonlinearity of small closed fatigue cracks. In particular we monitor the change in relative phase and amplitude at the fundamental frequencies for each focal point and use this nonlinear coherent imaging metric to form images of the spatial distribution of nonlinearity. The results suggest the subtracted image can suppress linear features (e.g. back wall or large scatterers) effectively when instrumentation noise compensation is applied, thereby allowing damage to be detected at an early stage (c. 15% of fatigue life) and reliably quantified in later fatigue life.

  1. Glycerol production by Oenococcus oeni during sequential and simultaneous cultures with wine yeast strains.

    PubMed

    Ale, Cesar E; Farías, Marta E; Strasser de Saad, Ana M; Pasteris, Sergio E

    2014-07-01

    Growth and fermentation patterns of Saccharomyces cerevisiae, Kloeckera apiculata, and Oenococcus oeni strains cultured in grape juice medium were studied. In pure, sequential and simultaneous cultures, the strains reached the stationary growth phase between 2 and 3 days. Pure and mixed K. apiculata and S. cerevisiae cultures used mainly glucose, producing ethanol, organic acids, and 4.0 and 0.1 mM glycerol, respectively. In sequential cultures, O. oeni achieved about 1 log unit at 3 days, using mainly fructose and L-malic acid. The highest sugar consumption was detected in K. apiculata supernatants, with lactic acid being the major end-product. Glycerol (8.0 mM) was found in 6-day culture supernatants. In simultaneous cultures, total sugars and L-malic acid were used at 3 days and 98% of ethanol and glycerol were detected. This study represents the first report of the population dynamics and metabolic behavior of yeasts and O. oeni in sequential and simultaneous cultures and contributes to the selection of indigenous strains to design starter cultures for winemaking, also considering the inclusion of K. apiculata. The sequential inoculation of yeasts and O. oeni would enhance glycerol production, which confers desirable organoleptic characteristics to wines, while organic acid levels would not affect their sensory profile. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Postnatal outcome of congenital anomalies in low resource setting.

    PubMed

    Kumar, Manisha; Sharma, Sumedha; Bhagat, Manisha; Gupta, Usha; Anand, Rama; Puri, Archana; Singh, Anuradha; Singh, Abha

    2013-10-01

    This study aimed to determine the postnatal outcome of congenital malformations in a tertiary care hospital of India. This was a prospective study of all women with prenatally detected major congenital malformations. Postnatal follow-up of live born babies was carried out for 1 year. There were 574 cases with major congenital anomalies, 523 of which were fully followed. Only 69 women (13.6%) had the initial scan before 20 weeks of gestation. Craniospinal defects were the most common (42.7%), followed by genitourinary anomalies (28%). There was no live birth in cases such as anencephaly, iniencephaly, bilateral renal agenesis, gastroschisis, and cystic hygroma. Survival at 1 year was less than 25% in spina bifida, bilateral cystic kidneys, complex cardiac disease, and non-immune hydrops fetalis. In cases with mild hydrocephalus or unilateral and mild renal disease, the survival was over 75%. In India, the majority of congenital anomalies present late in gestation. Although fetal outcome is invariably poor for severe defects, existing legislation in the country leaves pregnancy continuation as the only option. © 2013 John Wiley & Sons, Ltd.

  3. Reliability of CHAMP Anomaly Continuations

    NASA Technical Reports Server (NTRS)

    vonFrese, Ralph R. B.; Kim, Hyung Rae; Taylor, Patrick T.; Asgharzadeh, Mohammad F.

    2003-01-01

    CHAMP is recording state-of-the-art magnetic and gravity field observations at altitudes ranging over roughly 300 - 550 km. However, anomaly continuation is severely limited by the non-uniqueness of the process and satellite anomaly errors. Indeed, our numerical anomaly simulations from satellite to airborne altitudes show that effective downward continuations of the CHAMP data are restricted to within approximately 50 km of the observation altitudes while upward continuations can be effective over a somewhat larger altitude range. The great unreliability of downward continuation requires that the satellite geopotential observations must be analyzed at satellite altitudes if the anomaly details are to be exploited most fully. Given current anomaly error levels, joint inversion of satellite and near-surface anomalies is the best approach for implementing satellite geopotential observations for subsurface studies. We demonstrate the power of this approach using a crustal model constrained by joint inversions of near-surface and satellite magnetic and gravity observations for Maude Rise, Antarctica, in the southwestern Indian Ocean. Our modeling suggests that the dominant satellite altitude magnetic anomalies are produced by crustal thickness variations and remanent magnetization of the normal polarity Cretaceous Quiet Zone.

  4. Theory and experiments in model-based space system anomaly management

    NASA Astrophysics Data System (ADS)

    Kitts, Christopher Adam

    This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms were developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.

  5. Enzyme leaching of surficial geochemical samples for detecting hydromorphic trace-element anomalies associated with precious-metal mineralized bedrock buried beneath glacial overburden in northern Minnesota

    USGS Publications Warehouse

    Clark, Robert J.; Meier, A.L.; Riddle, G.; ,

    1990-01-01

    One objective of the International Falls and Roseau, Minnesota, CUSMAP projects was to develop a means of conducting regional-scale geochemical surveys in areas where bedrock is buried beneath complex glacially derived overburden. Partial analysis of B-horizon soils offered hope for detecting subtle hydromorphic trace-element dispersion patterns. An enzyme-based partial leach selectively removes metals from oxide coatings on the surfaces of soil materials without attacking their matrix. Most trace-element concentrations in the resulting solutions are in the part-per-trillion to low part-per-billion range, necessitating determinations by inductively coupled plasma/mass spectrometry. The resulting data show greater contrasts for many trace elements than with other techniques tested. Spatially, many trace metal anomalies are locally discontinuous, but anomalous trends within larger areas are apparent. In many instances, the source for an anomaly seems to be either basal till or bedrock. Ground water flow is probably the most important mechanism for transporting metals toward the surface, although ionic diffusion, electrochemical gradients, and capillary action may play a role in anomaly dispersal. Sample sites near the Rainy Lake-Seine River fault zone, a regional shear zone, often have anomalous concentrations of a variety of metals, commonly including Zn and/or one or more metals which substitute for Zn in sphalerite (Cd, Ge, Ga, and Sn). Shifts in background concentrations of Bi, Sb, and As show a trend across the area indicating a possible regional zoning of lode-Au mineralization. Soil anomalies of Ag, Co, and Tl parallel basement structures, suggesting areas that may have potential for Cobalt/Thunder Bay-type silver veins. An area around Baudette, Minnesota, which is underlain by quartz-chlorite-carbonate-altered shear zones, is anomalous in Ag, As, Bi, Co, Mo, Te, Tl, and W.
Anomalies of Ag, As, Bi, Te, and W tend to follow the fault zones, suggesting potential

  6. Hyperspectral target detection using heavy-tailed distributions

    NASA Astrophysics Data System (ADS)

    Willis, Chris J.

    2009-09-01

    One promising approach to target detection in hyperspectral imagery exploits a statistical mixture model to represent scene content at a pixel level. The process then goes on to look for pixels which are rare, when judged against the model, and marks them as anomalies. It is assumed that military targets will themselves be rare and therefore likely to be detected amongst these anomalies. For the typical assumption of multivariate Gaussianity for the mixture components, the presence of the anomalous pixels within the training data will have a deleterious effect on the quality of the model. In particular, the derivation process itself is adversely affected by the attempt to accommodate the anomalies within the mixture components. This will bias the statistics of at least some of the components away from their true values and towards the anomalies. In many cases this will result in a reduction in the detection performance and an increased false alarm rate. This paper considers the use of heavy-tailed statistical distributions within the mixture model. Such distributions are better able to account for anomalies in the training data within the tails of their distributions, and the balance of the pixels within their central masses. This means that an improved model of the majority of the pixels in the scene may be produced, ultimately leading to a better anomaly detection result. The anomaly detection techniques are examined using both synthetic data and hyperspectral imagery with injected anomalous pixels. A range of results is presented for the baseline Gaussian mixture model and for models accommodating heavy-tailed distributions, for different parameterizations of the algorithms. These include scene understanding results, anomalous pixel maps at given significance levels and Receiver Operating Characteristic curves.
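
    As a toy illustration of the baseline this abstract argues against, the sketch below fits a Gaussian mixture to a synthetic two-material "scene" with a few injected anomalous pixels and flags the lowest-likelihood pixels. All data, component counts, and thresholds are invented for illustration; a heavy-tailed variant would replace the Gaussian component densities with Student-t densities (scikit-learn has no t-mixture, so only the Gaussian baseline is shown):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic "scene": two background materials plus a few injected anomalous pixels.
bg1 = rng.normal([0, 0], 0.3, size=(500, 2))
bg2 = rng.normal([3, 3], 0.3, size=(500, 2))
anoms = rng.normal([8, -4], 0.2, size=(5, 2))
pixels = np.vstack([bg1, bg2, anoms])

# Baseline: Gaussian mixture model of the scene; rare pixels get low log-likelihood.
gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
scores = gmm.score_samples(pixels)

# Flag the lowest-likelihood pixels as anomalies (threshold = 1st percentile here).
thresh = np.percentile(scores, 1)
flagged = np.where(scores < thresh)[0]
print(sorted(flagged))
```

    Because the anomalies sit in the training data, a Gaussian fit is pulled toward them, which is exactly the bias the heavy-tailed mixtures in this paper are meant to reduce.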

  7. Retrieving Temperature Anomaly in the Global Subsurface and Deeper Ocean From Satellite Observations

    NASA Astrophysics Data System (ADS)

    Su, Hua; Li, Wene; Yan, Xiao-Hai

    2018-01-01

    Retrieving the subsurface and deeper ocean (SDO) dynamic parameters from satellite observations is crucial for effectively understanding ocean interior anomalies and dynamic processes, but it is challenging to accurately estimate the subsurface thermal structure over the global scale from sea surface parameters. This study proposes a new approach based on Random Forest (RF) machine learning to retrieve the subsurface temperature anomaly (STA) in the global ocean from multisource satellite observations, including sea surface height anomaly (SSHA), sea surface temperature anomaly (SSTA), sea surface salinity anomaly (SSSA), and sea surface wind anomaly (SSWA), using in situ Argo data for RF training and testing. The RF approach can accurately retrieve the STA in the global ocean from these satellite-observed sea surface parameters. The Argo STA data were used to validate the accuracy and reliability of the results from the RF model. The results indicated that SSHA, SSTA, SSSA, and SSWA together are useful parameters for detecting SDO thermal information and obtaining accurate STA estimations. The proposed method also outperformed support vector regression (SVR) in global STA estimation. It will be a useful technique for studying SDO thermal variability and its role in the global climate system from global-scale satellite observations.
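
    A minimal sketch of this style of retrieval, with synthetic stand-ins for the four sea-surface predictors and an invented nonlinear "subsurface temperature anomaly" target (not the paper's data or tuning), including the RF/SVR comparison the abstract mentions:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 2000
# Hypothetical sea-surface predictors: SSHA, SSTA, SSSA, SSWA (standardised units).
X = rng.normal(size=(n, 4))
# Toy "subsurface temperature anomaly": a nonlinear function of the surface fields plus noise.
sta = 0.6 * X[:, 0] + 0.3 * X[:, 1] ** 2 - 0.2 * X[:, 2] * X[:, 3] + 0.1 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, sta, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
r2_rf = r2_score(y_te, rf.predict(X_te))
svr = SVR().fit(X_tr, y_tr)
r2_svr = r2_score(y_te, svr.predict(X_te))
print(f"RF R2={r2_rf:.2f}  SVR R2={r2_svr:.2f}")
```

    In the actual study the target values come from Argo profiles at depth rather than a synthetic function, and the relative RF/SVR ranking was established on that data, not on a toy benchmark like this.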

  8. Mining of high utility-probability sequential patterns from uncertain databases

    PubMed Central

    Zhang, Binbin; Fournier-Viger, Philippe; Li, Ting

    2017-01-01

    High-utility sequential pattern mining (HUSPM) has become an important issue in the field of data mining. Several HUSPM algorithms have been designed to mine high-utility sequential patterns (HUSPs). They have been applied in several real-life situations, such as consumer behavior analysis and event detection in sensor networks. Nonetheless, most studies on HUSPM have focused on mining HUSPs in precise data. But in real life, uncertainty is an important factor, as data is collected using various types of sensors that are more or less accurate. Hence, data collected in a real-life database can be annotated with existence probabilities. This paper presents a novel pattern mining framework called high utility-probability sequential pattern mining (HUPSPM) for mining high utility-probability sequential patterns (HUPSPs) in uncertain sequence databases. A baseline algorithm with three optional pruning strategies is presented to mine HUPSPs. Moreover, to speed up the mining process, a projection mechanism is designed to create a database projection for each processed sequence, which is smaller than the original database. Thus, the number of unpromising candidates can be greatly reduced, as well as the execution time for mining HUPSPs. Substantial experiments on both real-life and synthetic datasets show that the designed algorithm performs well in terms of runtime, number of candidates, memory usage, and scalability for different minimum utility and minimum probability thresholds. PMID:28742847
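
    The two-threshold idea (minimum utility and minimum probability) can be illustrated with a small invented uncertain database. The containment test and the way utility and probability are accumulated below are deliberate simplifications for illustration, not HUPSPM's formal definitions:

```python
# Each sequence: (existence probability, [(item, utility), ...]); utilities in, e.g., dollars.
db = [
    (0.9, [("a", 5), ("b", 2), ("c", 1)]),
    (0.6, [("a", 4), ("c", 3)]),
    (0.8, [("b", 6), ("c", 2)]),
]

def utility_and_probability(pattern, database):
    """Sum the pattern's utility over supporting sequences and accumulate its
    probability as the sum of supporting sequences' probabilities (one simple
    illustrative definition; HUPSPM defines these measures formally)."""
    total_u, total_p = 0, 0.0
    for prob, seq in database:
        items = dict(seq)
        if all(it in items for it in pattern):
            total_u += sum(items[it] for it in pattern)
            total_p += prob
    return total_u, total_p

min_util, min_prob = 8, 1.0
u, p = utility_and_probability(("a", "c"), db)
is_hupsp = u >= min_util and p >= min_prob
print(u, p, is_hupsp)
```

    A real miner enumerates candidate patterns and prunes them against both thresholds; the projection mechanism in the paper shrinks the database examined for each candidate.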

  9. 6d, Coulomb branch anomaly matching

    NASA Astrophysics Data System (ADS)

    Intriligator, Kenneth

    2014-10-01

    6d QFTs are constrained by the analog of 't Hooft anomaly matching: all anomalies for global symmetries and metric backgrounds are constants of RG flows, and for all vacua in moduli spaces. We discuss an anomaly matching mechanism for 6d theories on their Coulomb branch. It is a global symmetry analog of Green-Schwarz-West-Sagnotti anomaly cancellation, and requires the apparent anomaly mismatch ΔI_8 to be a perfect square, ΔI_8 = X_4 ∧ X_4. Then ΔI_8 is cancelled by making X_4 an electric/magnetic source for the tensor multiplet, so background gauge field instantons yield charged strings. This requires the coefficients in X_4 to be integrally quantized. We illustrate this mechanism with examples. We also consider the SCFTs from N small E8 instantons, verifying that the recent result for their anomaly polynomial fits with the anomaly matching mechanism.
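
    Written out schematically in the abstract's own symbols (a sketch, with normalisations omitted), the matching condition and the Green-Schwarz-type cancellation read:

```latex
\Delta I_8 \;\equiv\; I_8^{\mathrm{UV}} - I_8^{\mathrm{IR}} \;=\; X_4 \wedge X_4,
\qquad dH = X_4,
\qquad \delta S \;\supset\; \int B \wedge X_4,
```

    so the tensor multiplet's 2-form B, sourced electrically and magnetically by X_4, supplies the inflow that cancels the apparent mismatch.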

  10. Mosaic tetraploidy in a liveborn infant with features of the DiGeorge anomaly.

    PubMed

    Wullich, B; Henn, W; Groterath, E; Ermis, A; Fuchs, S; Zankl, M

    1991-11-01

    We report on a liveborn male infant with mosaic tetraploidy who presented with multiple congenital anomalies including features of the DiGeorge anomaly (type I truncus arteriosus with other cardiovascular malformations, thymic hypoplasia, hypocalcemia). No structural chromosome aberrations, notably of chromosome 22, were detected. These findings contribute to the variability of symptoms of the polyploid phenotype. Additionally, the cytogenetic studies in our case emphasize the necessity of investigating fibroblasts in order to evaluate the relevant proportion of aberrant cells in mosaicism.

  11. Sequential lineup laps and eyewitness accuracy.

    PubMed

    Steblay, Nancy K; Dietrich, Hannah L; Ryan, Shannon L; Raczynski, Jeanette L; James, Kali A

    2011-08-01

    Police practice of double-blind sequential lineups prompts a question about the efficacy of repeated viewings (laps) of the sequential lineup. Two laboratory experiments confirmed the presence of a sequential lap effect: an increase in witness lineup picks from the first to the second lap when the culprit was a stranger. The second lap produced more errors than correct identifications. In Experiment 2, lineup diagnosticity was significantly higher for sequential lineup procedures that employed a single lap rather than double laps. Witnesses who elected to view a second lap made significantly more errors than witnesses who chose to stop after one lap or those who were required to view two laps. Witnesses with prior exposure to the culprit did not exhibit a sequential lap effect.

  12. A Locally Optimal Algorithm for Estimating a Generating Partition from an Observed Time Series and Its Application to Anomaly Detection.

    PubMed

    Ghalyan, Najah F; Miller, David J; Ray, Asok

    2018-06-12

    Estimation of a generating partition is critical for symbolization of measurements from discrete-time dynamical systems, where a sequence of symbols from a (finite-cardinality) alphabet may uniquely specify the underlying time series. Such symbolization is useful for computing measures (e.g., Kolmogorov-Sinai entropy) to identify or characterize the (possibly unknown) dynamical system. It is also useful for time series classification and anomaly detection. The seminal work of Hirata, Judd, and Kilminster (2004) derives a novel objective function, akin to a clustering objective, that measures the discrepancy between a set of reconstruction values and the points from the time series. They cast estimation of a generating partition as minimization of their objective function. Unfortunately, their proposed algorithm is nonconvergent, with no guarantee of finding even locally optimal solutions with respect to their objective. The difficulty is a heuristic nearest-neighbor symbol assignment step. Alternatively, we develop a novel, locally optimal algorithm for their objective. We apply iterative nearest-neighbor symbol assignments with guaranteed discrepancy descent, by which joint, locally optimal symbolization of the entire time series is achieved. While most previous approaches frame generating partition estimation as a state-space partitioning problem, we recognize that minimizing the Hirata et al. (2004) objective function does not induce an explicit partitioning of the state space, but rather of the space consisting of the entire time series (effectively, clustering in a (countably) infinite-dimensional space). Our approach also amounts to a novel type of sliding-block lossy source coding. Improvement, with respect to several measures, is demonstrated over popular methods for symbolizing chaotic maps. We also apply our approach to time-series anomaly detection, considering both chaotic maps and a failure application in a polycrystalline alloy material.
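
    The flavour of "iterative nearest-neighbor symbol assignment with guaranteed discrepancy descent" can be caricatured by Lloyd-style alternating minimization over delay windows of a chaotic series. This is a hedged stand-in, not the authors' algorithm (their objective and assignment step differ), but it shows the same monotone-descent structure:

```python
import numpy as np

rng = np.random.default_rng(2)
# Logistic-map time series (chaotic regime).
x = np.empty(1000); x[0] = 0.37
for t in range(999):
    x[t + 1] = 3.9 * x[t] * (1 - x[t])

# Delay-embed: represent each time index by a short window of the series.
d = 3
windows = np.lib.stride_tricks.sliding_window_view(x, d)

# Alternating minimisation: assign each window to its nearest reconstruction
# value (symbol), then refit the reconstruction values; the squared-discrepancy
# objective is non-increasing at every step (Lloyd-style descent).
k = 4
centers = windows[rng.choice(len(windows), k, replace=False)].copy()
objective = []
for _ in range(20):
    dist = ((windows[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    symbols = dist.argmin(1)
    objective.append(dist[np.arange(len(windows)), symbols].sum())
    for j in range(k):
        if (symbols == j).any():
            centers[j] = windows[symbols == j].mean(0)
print(objective[0], objective[-1])
```

    The descent guarantee here, like the one the paper proves for its own objective, comes from the fact that each of the two alternating steps can only lower (or preserve) the discrepancy.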

  13. Bayesian anomaly detection in monitoring data applying relevance vector machine

    NASA Astrophysics Data System (ADS)

    Saito, Tomoo

    2011-04-01

    A method is developed for automatically classifying monitoring data into two categories, normal and anomalous, in order to remove anomalous data from the enormous amount of monitoring data. The relevance vector machine (RVM) is applied to a probabilistic discriminative model with basis functions and weight parameters whose posterior PDF (probability density function), conditional on the learning data set, is given by Bayes' theorem. The proposed framework is applied to actual monitoring data sets containing some anomalous data collected at two buildings in Tokyo, Japan. The trained models discriminate anomalous data from normal data very clearly, assigning high probabilities of being normal to normal data and low probabilities of being normal to anomalous data.
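
    scikit-learn ships no RVM, so the sketch below substitutes an ordinary logistic model on RBF basis functions centred at the training points to show the discriminative set-up: normal data should receive a high probability of being normal, anomalies a low one. All data are synthetic:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(3)
# Labelled monitoring data: normal readings near 0, anomalous spikes near 6.
normal = rng.normal(0.0, 1.0, size=(200, 1))
anom = rng.normal(6.0, 1.0, size=(20, 1))
X = np.vstack([normal, anom])
y = np.r_[np.ones(200), np.zeros(20)]  # 1 = normal

# RBF basis functions centred on the training points; logistic weights on top.
Phi = rbf_kernel(X, X, gamma=0.5)
clf = LogisticRegression(max_iter=1000).fit(Phi, y)
p_normal = clf.predict_proba(Phi)[:, 1]
print(p_normal[:200].mean(), p_normal[200:].mean())
```

    An actual RVM would additionally place sparsity-inducing priors on the basis-function weights and infer their posterior by Bayes' theorem, pruning most basis functions away.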

  14. Satellite GN and C Anomaly Trends

    NASA Technical Reports Server (NTRS)

    Robertson, Brent; Stoneking, Eric

    2003-01-01

    On-orbit anomaly records for satellites launched from 1990 through 2001 are reviewed to determine recent trends of unmanned space mission critical failures. Anomalies categorized by subsystems show that Guidance, Navigation and Control (GN&C) subsystems have a high number of anomalies that result in a mission critical failure when compared to other subsystems. A mission critical failure is defined as a premature loss of a satellite or loss of its ability to perform its primary mission during its design life. The majority of anomalies are shown to occur early in the mission, usually within one year from launch. GN&C anomalies are categorized by cause and equipment type involved. A statistical analysis of the data is presented for all anomalies compared with the GN&C anomalies for various mission types, orbits and time periods. Conclusions and recommendations are presented for improving mission success and reliability.

  15. Robustness of the Sequential Lineup Advantage

    ERIC Educational Resources Information Center

    Gronlund, Scott D.; Carlson, Curt A.; Dailey, Sarah B.; Goodsell, Charles A.

    2009-01-01

    A growing movement in the United States and around the world involves promoting the advantages of conducting an eyewitness lineup in a sequential manner. We conducted a large study (N = 2,529) that included 24 comparisons of sequential versus simultaneous lineups. A liberal statistical criterion revealed only 2 significant sequential lineup…

  16. Multi-Attribute Sequential Search

    ERIC Educational Resources Information Center

    Bearden, J. Neil; Connolly, Terry

    2007-01-01

    This article describes empirical and theoretical results from two multi-attribute sequential search tasks. In both tasks, the DM sequentially encounters options described by two attributes and must pay to learn the values of the attributes. In the "continuous" version of the task the DM learns the precise numerical value of an attribute when she…

  17. Behavioral economics without anomalies.

    PubMed Central

    Rachlin, H

    1995-01-01

    Behavioral economics is often conceived as the study of anomalies superimposed on a rational system. As research has progressed, anomalies have multiplied until little is left of rationality. Another conception of behavioral economics is based on the axiom that value is always maximized. It incorporates so-called anomalies either as conflicts between temporal patterns of behavior and the individual acts comprising those patterns or as outcomes of nonexponential time discounting. This second conception of behavioral economics is both empirically based and internally consistent. PMID:8551195
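
    The "nonexponential time discounting" point can be made concrete: hyperbolic discounting produces preference reversals that exponential discounting cannot. The payoff values and discount rates below are arbitrary illustrations:

```python
def exponential(v, t, k=0.3):
    # Exponential: a constant discount factor per unit delay.
    return v * (1 - k) ** t

def hyperbolic(v, t, k=1.0):
    # Hyperbolic: discounting is steepest near t = 0.
    return v / (1 + k * t)

# Smaller-sooner vs larger-later reward, viewed from two vantage points.
ss = (6.0, 1)   # 6 units after 1 step
ll = (10.0, 3)  # 10 units after 3 steps
for delay in (0, 10):  # add a common front-end delay to both options
    prefers_ss = hyperbolic(ss[0], ss[1] + delay) > hyperbolic(ll[0], ll[1] + delay)
    print(delay, prefers_ss)
```

    Up close the hyperbolic discounter takes the smaller-sooner reward; pushed behind a common delay, the same discounter switches to the larger-later one. Under exponential discounting the ranking never flips, which is why such reversals count as "anomalies" in the rational framework the abstract contrasts.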

  18. 10 CFR 74.4 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... sequential performances of a material control test which is designed to detect anomalies potentially... capability required by § 74.53. Material control test means a comparison of a pre-established alarm threshold... into practical application for experimental and demonstration purposes, including the experimental...

  19. 10 CFR 74.4 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... sequential performances of a material control test which is designed to detect anomalies potentially... capability required by § 74.53. Material control test means a comparison of a pre-established alarm threshold... into practical application for experimental and demonstration purposes, including the experimental...

  20. 10 CFR 74.4 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... sequential performances of a material control test which is designed to detect anomalies potentially... capability required by § 74.53. Material control test means a comparison of a pre-established alarm threshold... into practical application for experimental and demonstration purposes, including the experimental...

  1. 10 CFR 74.4 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... sequential performances of a material control test which is designed to detect anomalies potentially... capability required by § 74.53. Material control test means a comparison of a pre-established alarm threshold... into practical application for experimental and demonstration purposes, including the experimental...

  2. 10 CFR 74.4 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... sequential performances of a material control test which is designed to detect anomalies potentially... capability required by § 74.53. Material control test means a comparison of a pre-established alarm threshold... into practical application for experimental and demonstration purposes, including the experimental...

  3. [Mass anomalies of the extremities in anurans].

    PubMed

    Kovalenko, E E

    2000-01-01

    The author analyses literature data on anomalies of limbs in Anura. It is shown that published data are usually insufficient to discuss either the conditions of appearance or the causes of anomalies. Traditional statistical methods do not adequately characterise the frequency of anomalies. The author suggests a new criterion for ascertaining the appearance of mass anomalies. A number of experimental data do not correspond to current theoretical ideas about the nature of anomalies. It is proposed to distinguish "background" and "mass" anomalies. "Background" anomalies cannot be a good indicator of unfavourable conditions of development.

  4. Dental anomalies: prevalence and associations between them in a large sample of non-orthodontic subjects, a cross-sectional study.

    PubMed

    Laganà, G; Venza, N; Borzabadi-Farahani, A; Fabi, F; Danesi, C; Cozza, P

    2017-03-11

    To analyze the prevalence and associations between dental anomalies detectable on panoramic radiographs in a sample of non-orthodontic growing subjects. For this cross-sectional study, digital panoramic radiographs of 5005 subjects were initially screened from a single radiographic center in Rome. Inclusion criteria were: subjects who were aged 8-12 years, Caucasian, and had good diagnostic quality radiographs. Syndromic subjects, those with craniofacial malformation, or orthodontic patients were excluded, and this led to a sample of 4706 subjects [mean (SD) age = 9.6 (1.2) years, 2366 males and 2340 females]. The sample was subsequently divided into four subgroups (8, 9, 10, and 11-12 year-old groups). Two operators examined panoramic radiographs to observe the presence of common dental anomalies. The prevalence and associations between dental anomalies were also investigated. The overall prevalence of dental anomalies was 20.9%. Approximately 17.9% showed only one anomaly, 2.7% two anomalies, while only 0.3% had more than two anomalies. The most frequent anomalies were the displacement of maxillary canine (7.5%), hypodontia (7.1%), impacted teeth (3.9%), tooth ankylosis (2.8%), and tooth transposition (1.4%). The lower right second premolar was the most frequently missing tooth; 3.7% had only one tooth agenesis, and 0.08% had six or more missing teeth (oligodontia). Mesiodens was the most common type of supernumerary tooth (0.66%). Two subjects had taurodontic teeth (0.04%). Tooth transpositions and displacement of maxillary canine were seen in 1.4 and 7.5%, respectively (approximately 69 and 58% were in the 8 and 9 year-old groups, respectively). Significant associations were detected between the different dental anomalies (P < .05). The results of our study revealed significant associations among different dental anomalies and provide further evidence to support common etiological factors.

  5. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO)

    PubMed Central

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-01-01

    The ability to identify hazardous traffic events is already considered one of the most effective solutions for reducing the occurrence of crashes. Previous studies have examined only certain particular hazardous traffic events, mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle’s speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of prediction accuracy. The conclusions can provide reference evidence for the development of dangerous situation warning products and the design of intelligent vehicles. PMID:27420073
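
    A rough two-stage analogue of this pipeline: mutual-information feature selection stands in for the Markov blanket step, and scikit-learn's SVC (trained by libsvm's SMO-type solver) stands in for the identification model. Features and labels are synthetic, not the study's driving data:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 600
# Toy driving features: say, speed s.d., brake-pressure s.d., skin-conductance s.d.,
# plus irrelevant noise columns. Hazardous events depend only on the first three.
X = rng.normal(size=(n, 8))
hazard = (0.9 * X[:, 0] + 0.8 * X[:, 1] + 0.7 * X[:, 2] + 0.5 * rng.normal(size=n)) > 1.0
y = hazard.astype(int)

# Stage 1 (stand-in for the Markov blanket): keep the k most informative features.
# Stage 2: SVM classifier (libsvm trains it with an SMO-type solver).
model = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=3),
    SVC(),
)
acc = cross_val_score(model, X, y, cv=5).mean()
print(round(acc, 3))
```

    The design point is the same as the paper's: pruning to a small set of causally relevant features before SMO training both speeds up learning and reduces overfitting to irrelevant sensors.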

  6. The intermediate wavelength magnetic anomaly field of the north Pacific and possible source distributions

    NASA Technical Reports Server (NTRS)

    Labrecque, J. L.; Cande, S. C.; Jarrard, R. D. (Principal Investigator)

    1983-01-01

    A technique that eliminates external field sources and the effects of strike aliasing was used to extract from marine survey data the intermediate wavelength magnetic anomaly field for (B) in the North Pacific. A strong correlation exists between this field and the MAGSAT field, although a directional sensitivity in the MAGSAT field can be detected. The intermediate wavelength field is correlated to tectonic features. Island arcs appear as positive anomalies of induced origin, likely due to variations in crustal thickness. Seamount chains and oceanic plateaus are also manifested by strong anomalies. The primary contribution to many of these anomalies appears to be due to a remanent magnetization. The source parameters for the remainder of these features are presently ambiguous. Results indicate that the sea surface field is a valuable source of information for secular variation analysis and the resolution of intermediate wavelength source parameters.

  7. Inhomogeneities detection in annual precipitation time series in Portugal using direct sequential simulation

    NASA Astrophysics Data System (ADS)

    Caineta, Júlio; Ribeiro, Sara; Costa, Ana Cristina; Henriques, Roberto; Soares, Amílcar

    2014-05-01

    Climate data homogenisation is of major importance in monitoring climate change, the validation of weather forecasting, general circulation and regional atmospheric models, modelling of erosion, and drought monitoring, among other studies of hydrological and environmental impacts. This is because non-climatic factors can cause time series discontinuities which may hide the true climatic signal and patterns, thus potentially biasing the conclusions of those studies. In the last two decades, many methods have been developed to identify and remove these inhomogeneities. One of these is based on geostatistical simulation (DSS - direct sequential simulation), where local probability density functions (pdf) are calculated at candidate monitoring stations, using spatial and temporal neighbouring observations, and then used for detection of inhomogeneities. This approach has been previously applied to detect inhomogeneities in four precipitation series (wet day count) from a network with 66 monitoring stations located in the southern region of Portugal (1980-2001). This study revealed promising results and the potential advantages of geostatistical techniques for inhomogeneity detection in climate time series. This work extends the case study presented before and investigates the application of the geostatistical stochastic approach to ten precipitation series that were previously classified as inhomogeneous by one of six absolute homogeneity tests (Mann-Kendall test, Wald-Wolfowitz runs test, Von Neumann ratio test, Standard normal homogeneity test (SNHT) for a single break, Pettit test, and Buishand range test). Moreover, a sensitivity analysis is implemented to investigate the number of simulated realisations that should be used to accurately infer the local pdfs. Accordingly, the number of simulations per iteration is increased from 50 to 500, which resulted in a more representative local pdf. A set of default and recommended settings is provided, which will help
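
    The detection idea, a local pdf at a candidate station built from simulated values, with the observation flagged when it falls outside that pdf's central range, can be sketched with bootstrap resampling standing in for the DSS realisations (synthetic data, invented break):

```python
import numpy as np

rng = np.random.default_rng(5)
# Annual wet-day counts at 8 neighbouring stations over 20 years (synthetic).
neighbours = rng.normal(100, 8, size=(20, 8))
candidate = neighbours.mean(1) + rng.normal(0, 3, size=20)
candidate[12:] += 25  # introduce an artificial break (inhomogeneity)

# For each year, build a local pdf at the candidate site by resampling the
# neighbouring observations (a cheap stand-in for DSS realisations), then flag
# years whose observed value falls outside the central 95% of that pdf.
n_sims = 500
flags = []
for year in range(20):
    sims = rng.choice(neighbours[year], size=(n_sims, 4)).mean(1)  # simulated local values
    lo, hi = np.percentile(sims, [2.5, 97.5])
    flags.append(not (lo <= candidate[year] <= hi))
print(flags)
```

    The paper's sensitivity analysis asks exactly how large n_sims must be for the local pdf to be stable; the same question applies to the bootstrap stand-in here.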

  8. Apollo experience report: Flight anomaly resolution

    NASA Technical Reports Server (NTRS)

    Lobb, J. D.

    1975-01-01

    The identification of flight anomalies, the determination of their causes, and the approaches taken for corrective action are described. Interrelationships of the broad range of disciplines involved with the complex systems and the team concept employed to ensure timely and accurate resolution of anomalies are discussed. The documentation techniques and the techniques for management of anomaly resolution are included. Examples of specific anomalies are presented in the original form of their progressive documentation. Flight anomaly resolution functioned as a part of the real-time mission support and postflight testing, and results were included in the postflight documentation.

  9. Presentation and Treatment of Poland Anomaly.

    PubMed

    Buckwalter V, Joseph A; Shah, Apurva S

    2016-12-01

    Background: Poland anomaly is a sporadic, phenotypically variable congenital condition usually characterized by unilateral pectoral muscle agenesis and ipsilateral hand deformity. Methods: A comprehensive review of the medical literature on Poland anomaly was performed using a Medline search. Results: Poland anomaly is a sporadic, phenotypically variable congenital condition usually characterized by unilateral, simple syndactyly with ipsilateral limb hypoplasia and pectoralis muscle agenesis. Operative management of syndactyly in Poland anomaly is determined by the severity of hand involvement and the resulting anatomical dysfunction. Syndactyly reconstruction is recommended in all but the mildest cases because most patients with Poland anomaly have notable brachydactyly, and digital separation can improve functional length. Conclusions: Improved understanding of the etiology and presentation of Poland anomaly can improve clinician recognition and management of this rare congenital condition.

  10. Presentation and Treatment of Poland Anomaly

    PubMed Central

    Buckwalter V, Joseph A.; Shah, Apurva S.

    2016-01-01

    Background: Poland anomaly is a sporadic, phenotypically variable congenital condition usually characterized by unilateral pectoral muscle agenesis and ipsilateral hand deformity. Methods: A comprehensive review of the medical literature on Poland anomaly was performed using a Medline search. Results: Poland anomaly is a sporadic, phenotypically variable congenital condition usually characterized by unilateral, simple syndactyly with ipsilateral limb hypoplasia and pectoralis muscle agenesis. Operative management of syndactyly in Poland anomaly is determined by the severity of hand involvement and the resulting anatomical dysfunction. Syndactyly reconstruction is recommended in all but the mildest cases because most patients with Poland anomaly have notable brachydactyly, and digital separation can improve functional length. Conclusions: Improved understanding of the etiology and presentation of Poland anomaly can improve clinician recognition and management of this rare congenital condition. PMID:28149203

  11. System for closure of a physical anomaly

    DOEpatents

    Bearinger, Jane P; Maitland, Duncan J; Schumann, Daniel L; Wilson, Thomas S

    2014-11-11

    Systems for closure of a physical anomaly. Closure is accomplished by a closure body with an exterior surface. The exterior surface contacts the opening of the anomaly and closes the anomaly. The closure body has a primary shape for closing the anomaly and a secondary shape for being positioned in the physical anomaly. The closure body preferably comprises a shape memory polymer.

  12. Inverse sequential detection of parameter changes in developing time series

    NASA Technical Reports Server (NTRS)

    Radok, Uwe; Brown, Timothy J.

    1992-01-01

    Progressive values of two probabilities are obtained for parameter estimates derived from an existing set of values and from the same set enlarged by one or more new values, respectively. One probability is that of erroneously preferring the second of these estimates for the existing data ('type 1 error'), while the second probability is that of erroneously accepting their estimates for the enlarged set ('type 2 error'). A more stable combined 'no change' probability, which always falls between 0.5 and 0, is derived from the (logarithmic) width of the uncertainty region of an equivalent 'inverted' sequential probability ratio test (SPRT, Wald 1945) in which the error probabilities are calculated rather than prescribed. A parameter change is indicated when the combined probability undergoes a progressive decrease. The test is explicitly formulated and exemplified for Gaussian samples.
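
    For contrast with the 'inverted' test described here, a minimal classic Wald SPRT for a Gaussian mean with known variance (the construction being inverted) might look like the sketch below; the error probabilities are prescribed up front and converted into decision boundaries, and a change detector would restart the test after each decision:

```python
import math
import random

random.seed(0)

def sprt_gaussian(xs, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Wald's SPRT for H0: mean=mu0 vs H1: mean=mu1 (known sigma).
    Returns the running log-likelihood ratio, a decision, and samples used."""
    a = math.log(beta / (1 - alpha))        # lower (accept H0) boundary
    b = math.log((1 - beta) / alpha)        # upper (accept H1) boundary
    llr = 0.0
    for i, x in enumerate(xs, 1):
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
        if llr >= b:
            return llr, "accept H1", i
        if llr <= a:
            return llr, "accept H0", i
    return llr, "continue", len(xs)

# A stream drawn from the changed regime: the LLR drifts upward and the test
# typically stops after only a handful of samples.
stream = [random.gauss(2.0, 1.0) for _ in range(60)]
llr, decision, n_used = sprt_gaussian(stream, mu0=0.0, mu1=2.0, sigma=1.0)
print(decision, n_used)
```

    In the paper's inverted scheme, by contrast, the type 1 and type 2 error probabilities are computed from the data as the set grows, and their combination is monitored for a progressive decrease.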

  13. Exploring the sequential lineup advantage using WITNESS.

    PubMed

    Goodsell, Charles A; Gronlund, Scott D; Carlson, Curt A

    2010-12-01

    Advocates claim that the sequential lineup is an improvement over simultaneous lineup procedures, but no formal (quantitatively specified) explanation exists for why it is better. The computational model WITNESS (Clark, Appl Cogn Psychol 17:629-654, 2003) was used to develop theoretical explanations for the sequential lineup advantage. In its current form, WITNESS produced a sequential advantage only by pairing conservative sequential choosing with liberal simultaneous choosing. However, this combination failed to approximate four extant experiments that exhibited large sequential advantages. Two of these experiments became the focus of our efforts because the data were uncontaminated by likely suspect position effects. Decision-based and memory-based modifications to WITNESS approximated the data and produced a sequential advantage. The next step is to evaluate the proposed explanations and modify public policy recommendations accordingly.

  14. Associated congenital anomalies among cases with Down syndrome.

    PubMed

    Stoll, Claude; Dott, Beatrice; Alembik, Yves; Roth, Marie-Paule

    2015-12-01

    Down syndrome (DS) is the most common congenital anomaly, widely studied for at least 150 years. However, the type and the frequency of congenital anomalies associated with DS are still controversial. Despite prenatal diagnosis and elective termination of pregnancy for fetal anomalies, in Europe, from 2008 to 2012 the live birth prevalence of DS per 10,000 was 10.2. The objectives of this study were to examine the major congenital anomalies occurring in infants and fetuses with Down syndrome. The material for this study came from 402,532 consecutive pregnancies of known outcome registered by our registry of congenital anomalies between 1979 and 2008. Four hundred sixty-seven (64%) out of the 728 cases with DS registered had at least one major associated congenital anomaly. The most common associated anomalies were cardiac anomalies, 323 cases (44%), followed by digestive system anomalies, 42 cases (6%), musculoskeletal system anomalies, 35 cases (5%), urinary system anomalies, 28 cases (4%), respiratory system anomalies, 13 cases (2%), and other system anomalies, 26 cases (3.6%). Among the cases with DS with congenital heart defects, the most common cardiac anomaly was atrioventricular septal defect (30%) followed by atrial septum defect (25%), ventricular septal defect (22%), patent ductus arteriosus (5%), coarctation of aorta (5%), and tetralogy of Fallot (3%). Among the cases with DS with a digestive system anomaly recorded, duodenal atresia (67%), Hirschsprung disease (14%), and tracheo-esophageal atresia (10%) were the most common. Fourteen (2%) of the cases with DS had an obstructive anomaly of the renal pelvis, including hydronephrosis. The other most common anomalies associated with cases with DS were syndactyly, club foot, polydactyly, limb reduction, cataract, hydrocephaly, cleft palate, hypospadias and diaphragmatic hernia. Many studies to assess the anomalies associated with DS have reported various results.
There is no agreement in the literature as to

  15. Gravity Anomalies

    NASA Image and Video Library

    2015-04-15

Analysis of radio tracking data has enabled maps of the gravity field of Mercury to be derived. In this image, overlain on a mosaic obtained by MESSENGER's Mercury Dual Imaging System and illuminated with a shape model determined from stereo-photoclinometry, Mercury's gravity anomalies are depicted in colors. Red tones indicate mass concentrations, centered on the Caloris basin (center) and the Sobkou region (right limb). Such large-scale gravitational anomalies are signatures of subsurface structure and evolution. The north pole is near the top of the sunlit area in this view. http://photojournal.jpl.nasa.gov/catalog/PIA19285

  16. Sequential and simultaneous choices: testing the diet selection and sequential choice models.

    PubMed

    Freidin, Esteban; Aw, Justine; Kacelnik, Alex

    2009-03-01

We investigate simultaneous and sequential choices in starlings, using Charnov's Diet Choice Model (DCM) and Shapiro, Siller and Kacelnik's Sequential Choice Model (SCM) to integrate function and mechanism. During a training phase, starlings encountered one food-related option per trial (A, B or R) in random sequence and with equal probability. A and B delivered food rewards after programmed delays (shorter for A), while R ('rejection') moved directly to the next trial without reward. In this phase we measured latencies to respond. In a later choice phase, birds encountered the pairs A-B, A-R and B-R; the first implements a simultaneous choice, and the second and third sequential choices. The DCM predicts when R should be chosen to maximize intake rate, and the SCM uses latencies from the training phase to predict choices between any pair of options in the choice phase. The predictions of both models coincided, and both successfully predicted the birds' preferences. The DCM does not deal with partial preferences, while the SCM does, and the experimental results were strongly correlated with this model's predictions. We believe that the SCM may expose a very general mechanism of animal choice, and that its wider domain of success reflects the greater ecological significance of sequential over simultaneous choices.
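Charnov's DCM reduces to a rate-maximization rule: reject the poorer option whenever specializing on the better one yields a higher long-term intake rate. A minimal numerical sketch of that rule follows; all rewards, delays and encounter probabilities are hypothetical and not taken from the experiment.

```python
# Sketch of Charnov's Diet Choice Model rate rule; all numbers below are
# hypothetical and not taken from the starling experiment.

def intake_rate(options, search_time):
    """Long-term reward rate when every listed option is accepted.

    options: list of (encounter_probability, reward, handling_delay) tuples.
    Rate = expected reward per encounter cycle / expected time per cycle.
    """
    reward = sum(p * r for p, r, _ in options)
    time = search_time + sum(p * d for p, _, d in options)
    return reward / time

A = (0.5, 1.0, 2.0)    # good option: 1 reward after a 2 s delay
B = (0.5, 1.0, 12.0)   # poor option: same reward, 12 s delay

rate_A_only = intake_rate([A], search_time=4.0)   # specialize on A
rate_both = intake_rate([A, B], search_time=4.0)  # accept both options

# DCM prediction: reject B (i.e. prefer R in B-R trials) iff specializing
# on A alone yields the higher long-term rate.
reject_B = rate_A_only > rate_both
print(reject_B)  # -> True for these toy numbers
```

With these toy numbers, specializing on A gives 0.1 rewards/s versus roughly 0.09 rewards/s when B is also accepted, so the model predicts rejecting B; shortening B's delay flips the prediction.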

  17. Hamiltonian Anomalies from Extended Field Theories

    NASA Astrophysics Data System (ADS)

    Monnier, Samuel

    2015-09-01

We develop a proposal by Freed to view anomalous field theories as relative field theories, namely field theories taking values in a field theory in one dimension higher, the anomaly field theory. We show that when the anomaly field theory is extended down to codimension 2, familiar facts about Hamiltonian anomalies can be naturally recovered, such as the fact that the anomalous symmetry group admits only a projective representation on the Hilbert space, or that the latter is really an abelian bundle gerbe over the moduli space. We include in the discussion the case of non-invertible anomaly field theories, which is relevant to six-dimensional (2, 0) superconformal theories. In this case, we show that the Hamiltonian anomaly is characterized by a degree 2 non-abelian group cohomology class, associated to the non-abelian gerbe playing the role of the state space of the anomalous theory. We construct Dai-Freed theories, governing the anomalies of chiral fermionic theories, and Wess-Zumino theories, governing the anomalies of Wess-Zumino terms and self-dual field theories, as extended field theories down to codimension 2.
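The statement that an anomalous symmetry group acts only projectively on the Hilbert space has a familiar finite-dimensional analogue, which the sketch below illustrates. The qubit example is a standard textbook illustration, not Monnier's construction.

```python
# Standard toy example of a projective representation (not the paper's
# construction): the Pauli matrices X and Z represent the two commuting
# generators of Z2 x Z2 on a qubit, but only up to phases.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # matrix for generator g
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # matrix for generator h

# g and h commute in Z2 x Z2, so a genuine linear representation would
# satisfy U(g) U(h) = U(h) U(g).  Instead the matrices anticommute:
assert np.allclose(X @ Z, -(Z @ X))

# The mismatch phase U(g)U(h)U(hg)^{-1} is the 2-cocycle omega(g, h);
# here it is -1, and its nontrivial class in H^2(Z2 x Z2, U(1)) is the
# obstruction to lifting the projective action to a linear one.
phase = (X @ Z @ np.linalg.inv(Z @ X))[0, 0].real
print(phase)  # -> -1.0
```

The same pattern is what the abstract describes in infinite dimensions: commutation relations of the symmetry hold only up to phases classified by a degree 2 cohomology class.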

  18. Congenital hand anomalies in Upper Egypt

    PubMed Central

    Abulezz, Tarek; Talaat, Mohamed; Elsani, Asem; Allam, Karam

    2016-01-01

Background: Congenital hand anomalies are numerous and markedly variant. Their significance is attributed to their frequent occurrence and their serious social, psychological and functional impacts on the patient's life. Patients and Methods: This is a follow-up study of 64 patients with hand anomalies of variable severity. All patients presented to the Plastic Surgery Department of Sohag University Hospital over a period of 24 months. Results: This study revealed that failure of differentiation and duplication deformities were the most frequent, with polydactyly being the most common anomaly encountered. The mean age at presentation was 6 years and the female to male ratio was 1.46:1. Hand anomalies were either isolated, associated with other anomalies, or part of a syndrome. Conclusion: The incidence of congenital hand anomalies in Upper Egypt is difficult to estimate due to social and cultural concepts, lack of education, poor registration and deficient medical surveys. Management of hand anomalies should be individualised, carefully planned and started as early as possible to achieve the best outcome. PMID:27833283

  19. A review on remotely sensed land surface temperature anomaly as an earthquake precursor

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Anshuman; Singh, Shaktiman; Sam, Lydia; Joshi, P. K.; Bhardwaj, Akanksha; Martín-Torres, F. Javier; Kumar, Rajesh

    2017-12-01

    The low predictability of earthquakes and the high uncertainty associated with their forecasts make earthquakes one of the worst natural calamities, capable of causing instant loss of life and property. Here, we discuss the studies reporting the observed anomalies in the satellite-derived Land Surface Temperature (LST) before an earthquake. We compile the conclusions of these studies and evaluate the use of remotely sensed LST anomalies as precursors of earthquakes. The arrival times and the amplitudes of the anomalies vary widely, thus making it difficult to consider them as universal markers to issue earthquake warnings. Based on the randomness in the observations of these precursors, we support employing a global-scale monitoring system to detect statistically robust anomalous geophysical signals prior to earthquakes before considering them as definite precursors.
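The screening step common to most of the reviewed studies is a departure test against a multi-year climatological baseline: a day is flagged when its LST deviates from the reference mean by more than a chosen number of standard deviations. A minimal sketch of such a test follows; the threshold k=2 and all data are hypothetical.

```python
# Hypothetical sketch of the common screening step in LST-precursor studies:
# flag days whose land surface temperature departs from the multi-year mean
# for that calendar day by more than k standard deviations.
import numpy as np

def lst_anomalies(lst, baseline_mean, baseline_std, k=2.0):
    """Return indices of days whose standardized LST departure exceeds k.

    lst, baseline_mean, baseline_std: 1-D arrays over the same day-of-year
    window; the baseline is built from earthquake-free reference years.
    """
    z = (lst - baseline_mean) / baseline_std
    return np.flatnonzero(np.abs(z) > k)

# Toy data: a +3 sigma warm excursion on day 5 of a 10-day window.
mean = np.full(10, 300.0)   # climatological LST in kelvin
std = np.full(10, 1.5)
obs = mean.copy()
obs[5] += 4.5               # 4.5 K above the mean, i.e. z = 3

print(lst_anomalies(obs, mean, std))  # -> [5]
```

As the abstract notes, such single-site thresholding is exactly what varies between studies; the authors argue for statistically robust, global-scale versions of this test before treating the flagged signals as precursors.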

  20. Flux-fusion anomaly test and bosonic topological crystalline insulators

    DOE PAGES

    Hermele, Michael; Chen, Xie

    2016-10-13

Here, we introduce a method, dubbed the flux-fusion anomaly test, to detect certain anomalous symmetry fractionalization patterns in two-dimensional symmetry-enriched topological (SET) phases. We focus on bosonic systems with Z2 topological order and a symmetry group of the form G = U(1) × G', where G' is an arbitrary group that may include spatial symmetries and/or time reversal. The anomalous fractionalization patterns we identify cannot occur in strictly d=2 systems but can occur at surfaces of d=3 symmetry-protected topological (SPT) phases. This observation leads to examples of d=3 bosonic topological crystalline insulators (TCIs) that, to our knowledge, have not previously been identified. In some cases, these d=3 bosonic TCIs can have an anomalous superfluid at the surface, which is characterized by nontrivial projective transformations of the superfluid vortices under symmetry. The basic idea of our anomaly test is to introduce fluxes of the U(1) symmetry and to show that some fractionalization patterns cannot be extended to a consistent action of the G' symmetry on the fluxes. For some anomalies, this can be described in terms of dimensional reduction to d=1 SPT phases. We apply our method to several different symmetry groups with nontrivial anomalies, including G = U(1) × Z2^T and G = U(1) × Z2^P, where Z2^T and Z2^P are time-reversal and d=2 reflection symmetry, respectively.