Science.gov

Sample records for anomaly detection system

  1. Modeling And Detecting Anomalies In Scada Systems

    NASA Astrophysics Data System (ADS)

    Svendsen, Nils; Wolthusen, Stephen

    The detection of attacks and intrusions based on anomalies is hampered by the limits of specificity underlying the detection techniques. However, in the case of many critical infrastructure systems, domain-specific knowledge and models can impose constraints that potentially reduce error rates. At the same time, attackers can use their knowledge of system behavior to mask their manipulations, causing adverse effects to be observed only after a significant period of time. This paper describes elementary statistical techniques that can be applied to detect anomalies in critical infrastructure networks. A SCADA system employed in liquefied natural gas (LNG) production is used as a case study.
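
    The abstract does not spell out which elementary statistical techniques are used; the sketch below is a minimal example of one such technique, a rolling z-score over a single SCADA sensor channel, with the window size, threshold, and variable names as illustrative assumptions rather than the paper's method.

        import numpy as np

        def rolling_zscore_alarms(readings, window=50, threshold=4.0):
            """Flag samples whose z-score against a trailing window exceeds a threshold.

            readings: 1-D sequence of sensor values (e.g., samples from an LNG train).
            Returns the indices of flagged samples.
            """
            readings = np.asarray(readings, dtype=float)
            alarms = []
            for i in range(window, len(readings)):
                hist = readings[i - window:i]
                mu, sigma = hist.mean(), hist.std()
                if sigma == 0:
                    continue
                if abs(readings[i] - mu) / sigma > threshold:
                    alarms.append(i)
            return alarms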

  2. Detecting data anomalies methods in distributed systems

    NASA Astrophysics Data System (ADS)

    Mosiej, Lukasz

    2009-06-01

    Distributed systems have become the most popular systems in large companies. Many telecommunications companies now want to hold large volumes of data about all of their customers. These data cannot be stored in a single database because of technical difficulties such as data access efficiency and security. On the other hand, there is no need to hold all data in one place, because companies already have dedicated systems that perform specific tasks. In distributed systems data are redundant, and each system holds only the data it needs, in an appropriate form. Data updated in one system should also be updated in every other system that holds that data, yet updating those data across all systems transactionally poses technical problems. This article is about data anomalies in distributed systems. Available data anomaly detection methods are surveyed, and an initial concept for new detection methods is described in the last section.

  3. System and method for anomaly detection

    DOEpatents

    Scherrer, Chad

    2010-06-15

    A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrence as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.
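
    The patent abstract names a conditional independence model scored from frequency tables without giving details; the sketch below is one plausible reading under stated assumptions: per-(child, parent) frequency tables are fit from formatted records, and each observation is scored by its negative log-likelihood under the assumed conditional independencies. The field names, pair list, and smoothing constants are hypothetical.

        from collections import Counter
        import math

        def fit_tables(records, pairs):
            """Build frequency tables for (child, parent) variable pairs from formatted records.

            records: list of dicts, e.g. {"proto": "tcp", "port": 80, "flag": "SYN"}.
            pairs:   list of (child, parent) names encoding the assumed conditional independencies.
            """
            joint = {p: Counter() for p in pairs}
            marg = {p: Counter() for p in pairs}
            for r in records:
                for child, parent in pairs:
                    joint[(child, parent)][(r[child], r[parent])] += 1
                    marg[(child, parent)][r[parent]] += 1
            return joint, marg

        def score(record, joint, marg, pairs, alpha=1.0):
            """Negative log-likelihood under the conditional-independence model; high = anomalous."""
            s = 0.0
            for child, parent in pairs:
                num = joint[(child, parent)][(record[child], record[parent])] + alpha
                # crude Laplace-style smoothing; 100 stands in for the unknown number of child values
                den = marg[(child, parent)][record[parent]] + alpha * 100
                s += -math.log(num / den)
            return s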

  4. A model for anomaly classification in intrusion detection systems

    NASA Astrophysics Data System (ADS)

    Ferreira, V. O.; Galhardi, V. V.; Gonçalves, L. B. L.; Silva, R. C.; Cansian, A. M.

    2015-09-01

    Intrusion Detection Systems (IDS) are traditionally divided into two types according to the detection methods they employ, namely (i) misuse detection and (ii) anomaly detection. Anomaly detection has been widely used, and its main advantage is the ability to detect new attacks. However, the analysis of the anomalies generated can become expensive, since they often carry no clear information about the malicious events they represent. In this context, this paper presents a model for automated classification of alerts generated by an anomaly-based IDS. The main goal is either to classify the detected anomalies into well-defined taxonomies of attacks or to identify whether an alert is a false positive misclassified by the IDS. Some common attacks on computer networks were considered, and we achieved important results that can equip security analysts with better resources for their analyses.

  5. Clustering and Recurring Anomaly Identification: Recurring Anomaly Detection System (ReADS)

    NASA Technical Reports Server (NTRS)

    McIntosh, Dawn

    2006-01-01

    This viewgraph presentation reviews the Recurring Anomaly Detection System (ReADS), a tool for analyzing text reports such as aviation reports and maintenance records: (1) text clustering algorithms group large quantities of reports and documents, reducing human error and fatigue; (2) the system identifies interconnected reports, automating the discovery of possible recurring anomalies; and (3) it provides a visualization of the clusters and recurring anomalies. The techniques are illustrated on data from Shuttle and ISS discrepancy reports, as well as ASRS data. ReADS has been integrated with a secure online search
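
    ReADS's actual clustering algorithms are not specified in this summary; the sketch below shows the generic text-clustering step it describes, using TF-IDF features and k-means from scikit-learn. The feature settings and cluster count are assumptions for illustration.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.cluster import KMeans

        def cluster_reports(reports, n_clusters=20):
            """Group free-text anomaly reports so that recurring issues land in the same cluster."""
            tfidf = TfidfVectorizer(stop_words="english", max_features=5000)
            X = tfidf.fit_transform(reports)  # one row per report
            labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)
            return labels  # reports sharing a label are candidate recurring anomalies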

  6. Network Anomaly Detection System with Optimized DS Evidence Theory

    PubMed Central

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has received growing attention with the rapid development of computer networks. Some researchers have applied fusion methods and Dempster-Shafer (DS) evidence theory to network anomaly detection, but with low performance and without accounting for the complicated and varied nature of network traffic. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, each sensor is weighted according to its previous prediction accuracy in order to optimize DS evidence theory, and RBPA exploits each sensor's regression ability to cope with complex networks. Four kinds of experiments show that the proposed model achieves a better detection rate and that both the RBPA and ODS optimizations improve system performance significantly. PMID:25254258
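
    The optimized weighting (ODS) and regression BPA are not reproduced here; the sketch below shows only the standard Dempster rule of combination that underlies the fusion step, for two sensors reporting basic probability assignments over a hypothetical {normal, attack} frame.

        def dempster_combine(m1, m2):
            """Combine two basic probability assignments over the same frame of discernment.

            m1, m2: dicts mapping frozenset hypotheses to masses, e.g.
                    {frozenset({"attack"}): 0.6, frozenset({"normal", "attack"}): 0.4}
            """
            combined, conflict = {}, 0.0
            for a, wa in m1.items():
                for b, wb in m2.items():
                    inter = a & b
                    if inter:
                        combined[inter] = combined.get(inter, 0.0) + wa * wb
                    else:
                        conflict += wa * wb
            if conflict >= 1.0:
                raise ValueError("total conflict; evidence cannot be combined")
            return {h: w / (1.0 - conflict) for h, w in combined.items()}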

  7. Network anomaly detection system with optimized DS evidence theory.

    PubMed

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has received growing attention with the rapid development of computer networks. Some researchers have applied fusion methods and Dempster-Shafer (DS) evidence theory to network anomaly detection, but with low performance and without accounting for the complicated and varied nature of network traffic. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, each sensor is weighted according to its previous prediction accuracy in order to optimize DS evidence theory, and RBPA exploits each sensor's regression ability to cope with complex networks. Four kinds of experiments show that the proposed model achieves a better detection rate and that both the RBPA and ODS optimizations improve system performance significantly. PMID:25254258

  8. System for Anomaly and Failure Detection (SAFD) system development

    NASA Technical Reports Server (NTRS)

    Oreilly, D.

    1993-01-01

    The System for Anomaly and Failure Detection (SAFD) algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failures as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions. This task assignment originally specified developing a platform for executing the algorithm during hot fire tests at Technology Test Bed (TTB) and installing the SAFD algorithm on that platform. Two units were built and installed in the Hardware Simulation Lab and at the TTB in December 1991. Since that time, the task primarily entailed improvement and maintenance of the systems, additional testing to prove the feasibility of the algorithm, and support of hot fire testing. This document addresses the work done since the last report of June 1992. The work on the System for Anomaly and Failure Detection during this period included improving the platform and the algorithm, testing the algorithm against previous test data and in the Hardware Simulation Lab, installing other algorithms on the system, providing support for operations at the Technology Test Bed, and providing routine maintenance.

  9. System for Anomaly and Failure Detection (SAFD) system development

    NASA Astrophysics Data System (ADS)

    Oreilly, D.

    1992-07-01

    This task specified developing the hardware and software necessary to implement the System for Anomaly and Failure Detection (SAFD) algorithm, developed under Technology Test Bed (TTB) Task 21, on the TTB engine stand. This effort involved building two units: one unit to be installed in the Block II Space Shuttle Main Engine (SSME) Hardware Simulation Lab (HSL) at Marshall Space Flight Center (MSFC), and one unit to be installed at the TTB engine stand. Rocketdyne personnel from the HSL performed the task. The SAFD algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failure as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions.

  10. System for Anomaly and Failure Detection (SAFD) system development

    NASA Technical Reports Server (NTRS)

    Oreilly, D.

    1992-01-01

    This task specified developing the hardware and software necessary to implement the System for Anomaly and Failure Detection (SAFD) algorithm, developed under Technology Test Bed (TTB) Task 21, on the TTB engine stand. This effort involved building two units: one unit to be installed in the Block II Space Shuttle Main Engine (SSME) Hardware Simulation Lab (HSL) at Marshall Space Flight Center (MSFC), and one unit to be installed at the TTB engine stand. Rocketdyne personnel from the HSL performed the task. The SAFD algorithm was developed as an improvement over the current redline system used in the Space Shuttle Main Engine Controller (SSMEC). Simulation tests and execution against previous hot fire tests demonstrated that the SAFD algorithm can detect engine failure as much as tens of seconds before the redline system recognized the failure. Although the current algorithm only operates during steady state conditions (engine not throttling), work is underway to expand the algorithm to work during transient conditions.

  11. Attention focusing and anomaly detection in systems monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, Richard J.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. The focus of this paper is a new technique for attention focusing. The technique involves reasoning about the distance between two frequency distributions, and is used to detect both anomalous system parameters and 'broken' causal dependencies. These two forms of information together isolate the locus of anomalous behavior in the system being monitored.
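
    The abstract does not name the specific distance measure; as a minimal sketch of the idea, the code below compares a nominal and an observed frequency distribution per sensor with the Hellinger distance and reports the sensors that drift past a threshold. The threshold and data layout are assumptions.

        import numpy as np

        def hellinger(p, q):
            """Hellinger distance between two discrete frequency distributions (histograms)."""
            p = np.asarray(p, float); p = p / p.sum()
            q = np.asarray(q, float); q = q / q.sum()
            return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

        def anomalous_sensors(nominal_hists, observed_hists, threshold=0.3):
            """Return names of sensors whose observed value distribution drifts from nominal."""
            return [name for name, p in nominal_hists.items()
                    if hellinger(p, observed_hists[name]) > threshold]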

  12. Extending TOPS: Knowledge Management System for Anomaly Detection and Analysis

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Michaelis, A.

    2009-12-01

    Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending the Terrestrial Observation and Prediction System (TOPS) to include capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. While there are large numbers of anomaly detection algorithms for multivariate datasets, we are extending this capability beyond the anomaly detection itself and towards an automated analysis that would discover the possible causes of the anomalies. There are often indirect connections between datasets that manifest themselves during occurrence of external events, and rather than searching exhaustively throughout all the datasets, our goal is to capture this knowledge and provide it to the system during automated analysis. This results in more efficient processing, since we do not need to run the original, often compute-intensive anomaly detection algorithms over all the datasets, and in data reduction, since we do not need to store all the datasets in order to search for possible connections but can instead download selected data on demand based on our analysis. For example, an anomaly observed in vegetation Net Primary Production (NPP) can relate to an anomaly in vegetation Leaf Area Index (LAI), which is a fairly direct connection, as LAI is one of the inputs for NPP; the change in LAI, however, could be caused by a fire event, which is not directly connected with NPP. Because we capture this knowledge, we can analyze fire datasets, and if there is a match with the NPP anomaly, we can infer that a fire is a likely cause. The knowledge is captured using OWL ontology language, where connections are defined in a schema

  13. Extending TOPS: Ontology-driven Anomaly Detection and Analysis System

    NASA Astrophysics Data System (ADS)

    Votava, P.; Nemani, R. R.; Michaelis, A.

    2010-12-01

    Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending the Terrestrial Observation and Prediction System (TOPS) to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. We can query the knowledge base and answer questions about dataset compatibilities, similarities and dependencies so that we can, for example, automatically analyze similar datasets in order to verify a given anomaly occurrence in multiple data sources. We are further extending the system to go beyond anomaly detection towards reasoning about possible causes of anomalies that are also encoded in the knowledge base as either learned or implied knowledge. This enables us to scale up the analysis by eliminating a large number of anomalies early on during the processing by either failure to verify them from other sources, or matching them directly with other observable events without having to perform an extensive and time-consuming exploration and analysis. The knowledge is captured using OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. The information is stored using Sesame server and is accessible through both Java API and web services using SeRQL and SPARQL query languages. Inference is provided using OWLIM component integrated with Sesame.

  14. Anomaly-based intrusion detection for SCADA systems

    SciTech Connect

    Yang, D.; Usynin, A.; Hines, J. W.

    2006-07-01

    Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security attention, and there are concerns that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimate that malicious online actions may have caused roughly $75 billion in damage as of 2007. One interesting countermeasure for enhancing information system security is intrusion detection. This paper briefly discusses the history of research in intrusion detection techniques and introduces the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT), applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)
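
    AAKR residual generation followed by an SPRT on the residuals is a standard pairing in the process-monitoring literature; the sketch below shows both stages in simplified form, with the kernel bandwidth, hypothesized fault magnitude m1, and error rates as illustrative assumptions rather than the paper's settings.

        import numpy as np

        def aakr_predict(memory, query, bandwidth=1.0):
            """Auto-associative kernel regression: predict the nominal value of each signal
            as a kernel-weighted average of historical nominal observations."""
            d2 = np.sum((memory - query) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * bandwidth ** 2))
            w /= w.sum()
            return w @ memory

        def sprt_alarms(residuals, sigma, m1, alpha=0.01, beta=0.01):
            """Wald sequential probability ratio test on one residual channel.
            Flags the sample index whenever the log-likelihood ratio crosses the upper bound."""
            upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
            llr, alarms = 0.0, []
            for i, r in enumerate(residuals):
                llr += (m1 / sigma ** 2) * (r - m1 / 2.0)
                if llr >= upper:
                    alarms.append(i)
                    llr = 0.0  # restart the test after an alarm
                elif llr <= lower:
                    llr = 0.0  # accept the nominal hypothesis and restart
            return alarms

    Residuals are formed as query - aakr_predict(memory, query) for each new observation, and each residual channel is monitored by its own SPRT.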

  15. Using Physical Models for Anomaly Detection in Control Systems

    NASA Astrophysics Data System (ADS)

    Svendsen, Nils; Wolthusen, Stephen

    Supervisory control and data acquisition (SCADA) systems are increasingly used to operate critical infrastructure assets. However, the inclusion of advanced information technology and communications components and elaborate control strategies in SCADA systems increases the threat surface for external and subversion-type attacks. The problems are exacerbated by site-specific properties of SCADA environments that make subversion detection impractical, and by sensor noise and feedback characteristics that degrade conventional anomaly detection systems. Moreover, potential attack mechanisms are ill-defined and may include both physical and logical aspects.

  16. Rule-based expert system for maritime anomaly detection

    NASA Astrophysics Data System (ADS)

    Roy, Jean

    2010-04-01

    Maritime domain operators/analysts have a mandate to be aware of all that is happening within their areas of responsibility. This mandate derives from the needs to defend sovereignty, protect infrastructures, counter terrorism, detect illegal activities, etc., and it has become more challenging in the past decade, as commercial shipping turned into a potential threat. In particular, a huge portion of the data and information made available to the operators/analysts is mundane, from maritime platforms going about normal, legitimate activities, and it is very challenging for them to detect and identify the non-mundane. To achieve such anomaly detection, they must establish numerous relevant situational facts from a variety of sensor data streams. Unfortunately, many of the facts of interest just cannot be observed; the operators/analysts thus use their knowledge of the maritime domain and their reasoning faculties to infer these facts. As they are often overwhelmed by the large amount of data and information, automated reasoning tools could be used to support them by inferring the necessary facts, ultimately providing indications and warning on a small number of anomalous events worthy of their attention. Along this line of thought, this paper describes a proof-of-concept prototype of a rule-based expert system implementing automated rule-based reasoning in support of maritime anomaly detection.
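
    The prototype's rule base is not given in the abstract; the sketch below shows the shape of such a rule-based layer, with two invented rules over hypothetical track attributes, purely to illustrate how inferred facts can be mapped to indications and warnings.

        from dataclasses import dataclass

        @dataclass
        class Track:
            vessel_id: str
            speed_knots: float
            in_shipping_lane: bool
            ais_active: bool
            distance_to_port_nm: float

        # Each rule maps an inferred situational fact to a predicate over a track.
        RULES = {
            "AIS silent inside a shipping lane":
                lambda t: t.in_shipping_lane and not t.ais_active,
            "loitering close to port":
                lambda t: t.speed_knots < 1.0 and t.distance_to_port_nm < 5.0,
        }

        def detect_anomalies(tracks):
            """Return (vessel_id, fired_rule) indications worthy of an operator's attention."""
            return [(t.vessel_id, name)
                    for t in tracks for name, rule in RULES.items() if rule(t)]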

  17. Log Summarization and Anomaly Detection for Troubleshooting Distributed Systems

    SciTech Connect

    Gunter, Dan; Tierney, Brian L.; Brown, Aaron; Swany, Martin; Bresnahan, John; Schopf, Jennifer M.

    2007-08-01

    Today's system monitoring tools are capable of detecting system failures such as host failures, OS errors, and network partitions in near-real time. Unfortunately, the same cannot yet be said of the end-to-end distributed software stack. Any given action, for example, reliably transferring a directory of files, can involve a wide range of complex and interrelated actions across multiple pieces of software: checking user certificates and permissions, getting details for all files, performing third-party transfers, understanding re-try policy decisions, etc. We present an infrastructure for troubleshooting complex middleware, a general purpose technique for configurable log summarization, and an anomaly detection technique that works in near-real time on running Grid middleware. We present results gathered using this infrastructure from instrumented Grid middleware and applications running on the Emulab testbed. From these results, we analyze the effectiveness of several algorithms at accurately detecting a variety of performance anomalies.

  18. Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)

    2001-01-01

    The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is basically a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes uncertainty and vagueness typically found in image analysis. ANN provides learning capabilities, and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along slide basket cables utilized by the astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another current, potentially viable application at NASA is in detecting anomalies of the NASA Space Shuttle Orbiter's radiator panels.

  19. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    SciTech Connect

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.

  20. Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge

    SciTech Connect

    Ondrej Linda; Todd Vollmer; Milos Manic

    2012-08-01

    The planned large scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks. The implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures multiple communications streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require specification of a sensitivity threshold, which inevitably leads to a tradeoff between false positive and false negative rates. In order to alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams, allowing accurate models of normal network behavior to be learned. Furthermore, the developed system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.
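
    The paper's Interval Type-2 fuzzy rules are not reproduced here; as a simplified type-1 stand-in, the sketch below scales a detector's sensitivity threshold with two hand-written linguistic rules over triangular memberships. The rule wording, membership ranges, and weights are invented for illustration.

        def triangular(x, a, b, c):
            """Triangular membership function with support [a, c] and peak at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def adjust_threshold(base_threshold, traffic_rate, hour):
            """Scale an anomaly-score threshold with two linguistic rules:
            IF traffic is heavy THEN raise the threshold (tolerate more variation);
            IF it is night on the control network THEN lower the threshold (be stricter)."""
            heavy = triangular(traffic_rate, 500.0, 1000.0, 1500.0)   # packets/s, hypothetical
            night = max(triangular(hour, 20.0, 24.0, 28.0), triangular(hour, -4.0, 0.0, 6.0))
            scale = 1.0 + 0.5 * heavy - 0.3 * night   # crude weighted defuzzification
            return base_threshold * scale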

  1. Addressing the Challenges of Anomaly Detection for Cyber Physical Energy Grid Systems

    SciTech Connect

    Ferragut, Erik M; Laska, Jason A; Melin, Alexander M; Czejdo, Bogdan

    2013-01-01

    The consolidation of cyber communications networks and physical control systems within the energy smart grid introduces a number of new risks. Unfortunately, these risks are largely unknown and poorly understood, yet include very high impact losses from attack and component failures. One important aspect of risk management is the detection of anomalies and changes. However, anomaly detection within cyber security remains a difficult, open problem, with special challenges in dealing with false alert rates and heterogeneous data. Furthermore, the integration of cyber and physical dynamics is often intractable. Moreover, because of their broad scope, energy grid cyber-physical systems must be analyzed at multiple scales, from individual components up to network-level dynamics. We describe an improved approach to anomaly detection that combines three important aspects. First, system dynamics are modeled using a reduced order model for greater computational tractability. Second, a probabilistic and principled approach to anomaly detection is adopted that allows for regulation of false alerts and comparison of anomalies across heterogeneous data sources. Third, a hierarchy of aggregations is constructed to support interactive and automated analyses of anomalies at multiple scales.

  2. Analyzing Global Climate System Using Graph Based Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Das, K.; Agrawal, S.; Atluri, G.; Liess, S.; Steinbach, M.; Kumar, V.

    2014-12-01

    Climate networks have been studied for understanding complex relationships between different spatial locations, such as community structures and teleconnections. Analysis of time-evolving climate networks reveals changes that occur in those relationships over time and can provide insights for discovering new and complex climate phenomena. We have recently developed a novel data mining technique to discover anomalous relationships from dynamic climate networks. The algorithm efficiently identifies anomalous changes in relationships that cause significant structural changes in the climate network from one time instance to the next. Using this technique we investigated the presence of anomalies in precipitation networks that were constructed based on monthly averages of precipitation recorded at 0.5 degree resolution during the time period 1982 to 2002. The precipitation network consisted of 10-nearest neighbor graphs for every month's data. Preliminary results on this data set indicate that we were able to discover several anomalies that have been verified to be related to or as the outcome of well known climate phenomena. For instance, one such set of anomalies corresponds to the transition from January 1994 (normal conditions) to January 1995 (El Nino conditions) and includes events like the worst droughts of the 20th century in the Australian plains, very high rainfall in southeast Asian islands, and drought-like conditions in Peru, Chile, and eastern equatorial Africa during that time period. We plan to further apply our technique to networks constructed from different climate variables such as sea-level pressure, surface air temperature, wind velocity, 500 hPa geopotential height, etc., at different resolutions. Using this method we hope to develop deeper insights regarding the interactions of multiple climate variables globally over time, which might lead to discovery of previously unknown climate phenomena involving heterogeneous data sources.
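
    The paper's graph-based anomaly algorithm is not given in the abstract; the sketch below shows the two ingredients it describes, under stated assumptions: a k-nearest-neighbor graph built from each time step's data and a simple measure of structural change between consecutive graphs. A full 0.5-degree grid would need a less naive neighbor search than this toy version.

        import numpy as np

        def knn_graph(features, k=10):
            """Edge set of a k-nearest-neighbor graph over locations.

            features: (n_locations, n_features) array, e.g. each row holding a location's
                      precipitation values for one time window.
            """
            d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=2)
            np.fill_diagonal(d, np.inf)
            nbrs = np.argsort(d, axis=1)[:, :k]
            return {(i, j) for i in range(len(features)) for j in nbrs[i]}

        def structural_change(graph_t, graph_t1):
            """Fraction of edges that appear or disappear between consecutive snapshots;
            unusually large values point at anomalous changes in relationships."""
            return len(graph_t ^ graph_t1) / max(len(graph_t | graph_t1), 1)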

  3. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. Previous results on extending traditional anomaly detection techniques are summarized. The focus of this paper is a new technique for attention focusing.

  4. A comparison of algorithms for anomaly detection in safeguards and computer security systems using neural networks

    SciTech Connect

    Howell, J.A.; Whiteson, R.

    1992-08-01

    Detection of anomalies in nuclear safeguards and computer security systems is a tedious and time-consuming task. It typically requires the examination of large amounts of data for unusual patterns of activity. Neural networks provide a flexible pattern-recognition capability that can easily be adapted for these purposes. In this paper, we discuss architectures for accomplishing this task.

  5. A comparison of algorithms for anomaly detection in safeguards and computer security systems using neural networks

    SciTech Connect

    Howell, J.A.; Whiteson, R.

    1992-01-01

    Detection of anomalies in nuclear safeguards and computer security systems is a tedious and time-consuming task. It typically requires the examination of large amounts of data for unusual patterns of activity. Neural networks provide a flexible pattern-recognition capability that can easily be adapted for these purposes. In this paper, we discuss architectures for accomplishing this task.

  6. HPNAIDM: The High-Performance Network Anomaly/Intrusion Detection and Mitigation System

    SciTech Connect

    Chen, Yan

    2013-12-05

    Identifying traffic anomalies and attacks rapidly and accurately is critical for large network operators. With the rapid growth of network bandwidth, such as the next generation DOE UltraScience Network, and the fast emergence of new attacks/viruses/worms, existing network intrusion detection systems (IDS) are insufficient because they: • Are mostly host-based and not scalable to high-performance networks; • Are mostly signature-based and unable to adaptively recognize flow-level unknown attacks; • Cannot differentiate malicious events from unintentional anomalies. To address these challenges, we proposed and developed a new paradigm called the high-performance network anomaly/intrusion detection and mitigation (HPNAIDM) system. The new paradigm is significantly different from existing IDSes with the following features (research thrusts): • Online traffic recording and analysis on high-speed networks; • Online adaptive flow-level anomaly/intrusion detection and mitigation; • Integrated approach for false positive reduction. Our research prototype and evaluation demonstrate that the HPNAIDM system is highly effective and economically feasible. Beyond satisfying the pre-set goals, we even exceeded them significantly (see more details in the next section). Overall, our project harvested 23 publications (2 book chapters, 6 journal papers and 15 peer-reviewed conference/workshop papers). Besides, we built a website for technique dissemination, which hosts two system prototype releases for the research community. We also filed a patent application and developed strong international and domestic collaborations which span both academia and industry.

  7. Automated anomaly detection processor

    NASA Astrophysics Data System (ADS)

    Kraiman, James B.; Arouh, Scott L.; Webb, Michael L.

    2002-07-01

    Robust exploitation of tracking and surveillance data will provide an early warning and cueing capability for military and civilian Law Enforcement Agency operations. This will improve dynamic tasking of limited resources and hence operational efficiency. The challenge is to rapidly identify threat activity within a huge background of noncombatant traffic. We discuss development of an Automated Anomaly Detection Processor (AADP) that exploits multi-INT, multi-sensor tracking and surveillance data to rapidly identify and characterize events and/or objects of military interest, without requiring operators to specify threat behaviors or templates. The AADP has successfully detected an anomaly in traffic patterns in Los Angeles, analyzed ship track data collected during a Fleet Battle Experiment to detect simulated mine laying behavior amongst maritime noncombatants, and is currently under development for surface vessel tracking within the Coast Guard's Vessel Traffic Service to support port security, ship inspection, and harbor traffic control missions, and to monitor medical surveillance databases for early alert of a bioterrorist attack. The AADP can also be integrated into combat simulations to enhance model fidelity of multi-sensor fusion effects in military operations.

  8. Can we detect regional methane anomalies? A comparison between three observing systems

    NASA Astrophysics Data System (ADS)

    Cressot, Cindy; Pison, Isabelle; Rayner, Peter J.; Bousquet, Philippe; Fortems-Cheiney, Audrey; Chevallier, Frédéric

    2016-07-01

    A Bayesian inversion system is used to evaluate the capability of the current global surface network and of the space-borne GOSAT/TANSO-FTS and IASI instruments to quantify surface flux anomalies of methane at various spatial (global, semi-hemispheric and regional) and time (seasonal, yearly, 3-yearly) scales. The evaluation is based on a signal-to-noise ratio analysis, the signal being the methane fluxes inferred from the surface-based inversion from 2000 to 2011 and the noise (i.e., precision) of each of the three observing systems being computed from the Bayesian equation. At the global and semi-hemispheric scales, all observing systems detect flux anomalies at most of the tested timescales. At the regional scale, some seasonal flux anomalies are detected by the three observing systems, but year-to-year anomalies and longer-term trends are only poorly detected. Moreover, reliably detected regions depend on the reference surface-based inversion used as the signal. Indeed, tropical flux inter-annual variability, for instance, can be attributed mostly to Africa in the reference inversion or spread between tropical regions in Africa and America. Our results show that inter-annual analyses of methane emissions inferred by atmospheric inversions should always include an uncertainty assessment and that the attribution of current trends in atmospheric methane to particular regions needs increased effort, for instance, gathering more observations (in the future) and improving transport models. At all scales, GOSAT generally shows the best performance of the three observing systems.

  9. Apparatus for detecting a magnetic anomaly contiguous to remote location by squid gradiometer and magnetometer systems

    DOEpatents

    Overton, Jr., William C.; Steyert, Jr., William A.

    1984-01-01

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  10. The Frog-Boiling Attack: Limitations of Anomaly Detection for Secure Network Coordinate Systems

    NASA Astrophysics Data System (ADS)

    Chan-Tin, Eric; Feldman, Daniel; Hopper, Nicholas; Kim, Yongdae

    A network coordinate system assigns Euclidean “virtual” coordinates to every node in a network to allow easy estimation of network latency between pairs of nodes that have never contacted each other. These systems have been implemented in a variety of applications, most notably the popular Azureus/Vuze BitTorrent client. Zage and Nita-Rotaru (CCS 2007) and independently, Kaafar et al. (SIGCOMM 2007), demonstrated that several widely-cited network coordinate systems are prone to simple attacks, and proposed mechanisms to defeat these attacks using outlier detection to filter out adversarial inputs. We propose a new attack, Frog-Boiling, that defeats anomaly-detection based defenses in the context of network coordinate systems, and demonstrate empirically that Frog-Boiling is more disruptive than the previously known attacks. Our results suggest that a new approach is needed to solve this problem: outlier detection alone cannot be used to secure network coordinate systems.

  11. Survey of Anomaly Detection Methods

    SciTech Connect

    Ng, B

    2006-10-12

    This survey defines the problem of anomaly detection and provides an overview of existing methods. The methods are categorized into two general classes: generative and discriminative. A generative approach involves building a model that represents the joint distribution of the input features and the output labels of system behavior (e.g., normal or anomalous) then applies the model to formulate a decision rule for detecting anomalies. On the other hand, a discriminative approach aims directly to find the decision rule, with the smallest error rate, that distinguishes between normal and anomalous behavior. For each approach, we will give an overview of popular techniques and provide references to state-of-the-art applications.
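
    As an illustration of the survey's two classes (not an implementation of any surveyed method), the sketch below contrasts a generative detector, which fits a Gaussian to normal behavior and thresholds the likelihood, with a discriminative one-class SVM that learns the decision rule directly. The data, threshold, and kernel parameters are invented for the example.

        import numpy as np
        from sklearn.svm import OneClassSVM

        rng = np.random.default_rng(0)
        normal = rng.normal(0.0, 1.0, size=(500, 2))        # training data: normal behavior only
        test = np.vstack([rng.normal(0.0, 1.0, size=(10, 2)),
                          rng.normal(6.0, 1.0, size=(10, 2))])  # last 10 rows are anomalous

        # Generative: fit a Gaussian to normal behavior, flag low-likelihood points.
        mu, cov = normal.mean(axis=0), np.cov(normal, rowvar=False)
        inv, logdet = np.linalg.inv(cov), np.linalg.slogdet(cov)[1]
        def loglik(x):
            d = x - mu
            return -0.5 * (d @ inv @ d + logdet + 2 * np.log(2 * np.pi))
        gen_flags = np.array([loglik(x) < -10.0 for x in test])   # threshold chosen by hand

        # Discriminative: learn the decision rule directly with a one-class SVM.
        disc = OneClassSVM(kernel="rbf", gamma=0.1, nu=0.05).fit(normal)
        disc_flags = disc.predict(test) == -1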

  12. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems

    PubMed Central

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. All types of shilling attacks share two characteristics: 1) item abnormality: the rating of target items is always the maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject the attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need to be further improved. To address these problems, this paper proposes an item anomaly detection method based on dynamic partitioning for time series. The method first dynamically partitions item-rating time series based on important points. Then, the chi-square distribution (χ²) is used to detect abnormal intervals. The experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes. PMID:26267477
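
    The abstract does not give the exact statistic applied inside each interval; the following sketch is one plausible reading: within each interval between partition points, the count of extreme (maximum/minimum) ratings is compared to a baseline rate with a chi-square goodness-of-fit test from SciPy. The baseline rate, minimum interval size, and significance level are assumptions.

        import numpy as np
        from scipy.stats import chisquare

        def abnormal_intervals(ratings, cut_points, p_extreme=0.1, alpha=0.01):
            """Flag time intervals where extreme (max/min) ratings occur far more often than expected.

            ratings:    list of (timestamp, is_extreme) pairs for one item, sorted by time.
            cut_points: partition timestamps (e.g., the "important points" of the series).
            """
            flagged = []
            for lo, hi in zip(cut_points[:-1], cut_points[1:]):
                in_window = [e for t, e in ratings if lo <= t < hi]
                n = len(in_window)
                if n < 5:
                    continue
                extreme = sum(in_window)
                observed = [extreme, n - extreme]
                expected = [n * p_extreme, n * (1 - p_extreme)]
                stat, pval = chisquare(observed, f_exp=expected)
                if pval < alpha and extreme > expected[0]:
                    flagged.append((lo, hi))
            return flagged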

  13. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems.

    PubMed

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to process information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. All types of shilling attacks share two characteristics: 1) item abnormality: the rating of target items is always the maximum or minimum; and 2) attack promptness: it takes only a very short period of time to inject the attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need to be further improved. To address these problems, this paper proposes an item anomaly detection method based on dynamic partitioning for time series. The method first dynamically partitions item-rating time series based on important points. Then, the chi-square distribution (χ²) is used to detect abnormal intervals. The experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes. PMID:26267477

  14. Characterization of normality of chaotic systems including prediction and detection of anomalies

    NASA Astrophysics Data System (ADS)

    Engler, Joseph John

    Accurate prediction and control pervade domains such as engineering, physics, chemistry, and biology. Often, it is discovered that the systems under consideration cannot be well represented by linear, periodic, or random data. It has been shown that these systems exhibit deterministic chaos. Deterministic chaos describes systems which are governed by deterministic rules but whose data appear to be random or quasi-periodic. Deterministically chaotic systems characteristically exhibit sensitive dependence upon initial conditions, manifested through rapid divergence of states initially close to one another. Due to this characterization, it has been deemed impossible to accurately predict future states of these systems over longer time scales. Fortunately, the deterministic nature of these systems allows for accurate short term predictions, given that the dynamics of the system are well understood. This fact has been exploited in the research community and has resulted in various algorithms for short term predictions. Detection of normality in deterministically chaotic systems is critical to understanding the system sufficiently to be able to predict future states. Due to the sensitivity to initial conditions, the detection of normal operational states for a deterministically chaotic system can be challenging. The addition of small perturbations to the system, which may result in bifurcation of the normal states, further complicates the problem. The detection of anomalies and prediction of future states of the chaotic system allows for greater understanding of these systems. The goal of this research is to produce methodologies for determining states of normality for deterministically chaotic systems, detection of anomalous behavior, and the more accurate prediction of future states of the system. Additionally, the ability to detect subtle system state changes is discussed. The dissertation addresses these goals by proposing new representational

  15. A function approximation approach to anomaly detection in propulsion system test data

    NASA Astrophysics Data System (ADS)

    Whitehead, Bruce A.; Hoyt, W. A.

    1993-06-01

    Ground test data from propulsion systems such as the Space Shuttle Main Engine (SSME) can be automatically screened for anomalies by a neural network. The neural network screens data after being trained with nominal data only. Given the values of 14 measurements reflecting external influences on the SSME at a given time, the neural network predicts the expected nominal value of a desired engine parameter at that time. We compared the ability of three different function-approximation techniques to perform this nominal value prediction: a novel neural network architecture based on Gaussian bar basis functions, a conventional back propagation neural network, and linear regression. These three techniques were tested with real data from six SSME ground tests containing two anomalies. The basis function network trained more rapidly than back propagation. It yielded nominal predictions with a tight enough confidence interval to distinguish anomalous deviations from the nominal fluctuations in an engine parameter. Since the function-approximation approach requires nominal training data only, it is capable of detecting unknown classes of anomalies for which training data is not available.
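
    The paper's Gaussian-bar network is not reproduced here; the sketch below captures the general function-approximation idea under stated assumptions: fit a Gaussian radial-basis expansion to nominal data by least squares, derive a confidence band from the training residuals, and flag engine-parameter values that fall outside it. Center count, width, and the 3-sigma band are illustrative choices.

        import numpy as np

        def gaussian_design(X, centers, width):
            """Design matrix of Gaussian basis functions evaluated at the inputs."""
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            return np.exp(-d2 / (2.0 * width ** 2))

        def fit_nominal_model(X_nominal, y_nominal, n_centers=30, width=1.0):
            """Least-squares fit of the expected engine parameter from external measurements."""
            idx = np.linspace(0, len(X_nominal) - 1, n_centers).astype(int)
            centers = X_nominal[idx]
            Phi = gaussian_design(X_nominal, centers, width)
            w, *_ = np.linalg.lstsq(Phi, y_nominal, rcond=None)
            band = 3.0 * (y_nominal - Phi @ w).std()   # crude confidence band from training residuals
            return centers, width, w, band

        def is_anomalous(x, y, model):
            """Flag an observed engine-parameter value that leaves the nominal confidence band."""
            centers, width, w, band = model
            y_hat = gaussian_design(np.atleast_2d(x), centers, width) @ w
            return abs(y - y_hat[0]) > band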

  16. A function approximation approach to anomaly detection in propulsion system test data

    NASA Technical Reports Server (NTRS)

    Whitehead, Bruce A.; Hoyt, W. A.

    1993-01-01

    Ground test data from propulsion systems such as the Space Shuttle Main Engine (SSME) can be automatically screened for anomalies by a neural network. The neural network screens data after being trained with nominal data only. Given the values of 14 measurements reflecting external influences on the SSME at a given time, the neural network predicts the expected nominal value of a desired engine parameter at that time. We compared the ability of three different function-approximation techniques to perform this nominal value prediction: a novel neural network architecture based on Gaussian bar basis functions, a conventional back propagation neural network, and linear regression. These three techniques were tested with real data from six SSME ground tests containing two anomalies. The basis function network trained more rapidly than back propagation. It yielded nominal predictions with a tight enough confidence interval to distinguish anomalous deviations from the nominal fluctuations in an engine parameter. Since the function-approximation approach requires nominal training data only, it is capable of detecting unknown classes of anomalies for which training data is not available.

  17. Seismic data fusion anomaly detection

    NASA Astrophysics Data System (ADS)

    Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David

    2014-06-01

    Detecting anomalies in non-stationary signals has valuable applications in many fields including medicine and meteorology. These include uses such as identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes via seismographic data. Given the many available anomaly detection algorithms, it is important to compare possible methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach involves using an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other method uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives" or transformations of the observed signal for anomalies. Possible perspectives may include wavelet de-noising, the Fourier transform, peak filtering, etc. In order to evaluate these techniques via signal fusion metrics, we must apply signal preprocessing techniques such as de-noising methods to the original signal and then use a neural network to find anomalies in the generated signal. From this secondary result it is possible to use data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The result will show which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.

  18. Mining Building Energy Management System Data Using Fuzzy Anomaly Detection and Linguistic Descriptions

    SciTech Connect

    Dumidu Wijayasekara; Ondrej Linda; Milos Manic; Craig Rieger

    2014-08-01

    Building Energy Management Systems (BEMSs) are essential components of modern buildings that utilize digital control technologies to minimize energy consumption while maintaining high levels of occupant comfort. However, BEMSs can only achieve these energy savings when properly tuned and controlled. Since the indoor environment depends on uncertain criteria such as weather, occupancy, and thermal state, the performance of a BEMS can be sub-optimal at times. Unfortunately, the complexity of the BEMS control mechanism, the large amount of data available, and the inter-relations between the data can make identifying these sub-optimal behaviors difficult. This paper proposes a novel Fuzzy Anomaly Detection and Linguistic Description (Fuzzy-ADLD) based method for improving the understandability of BEMS behavior for improved state-awareness. The presented method is composed of two main parts: 1) detection of anomalous BEMS behavior and 2) linguistic representation of BEMS behavior. The first part utilizes a modified nearest neighbor clustering algorithm and a fuzzy logic rule extraction technique to build a model of normal BEMS behavior. The second part computes the most relevant linguistic description of the identified anomalies. The presented Fuzzy-ADLD method was applied to a real-world BEMS and compared against a traditional alarm-based BEMS. In six different scenarios, the Fuzzy-ADLD method identified anomalous behavior either as fast as or faster (by an hour or more) than the alarm-based BEMS. In addition, the Fuzzy-ADLD method identified cases that were missed by the alarm-based system, demonstrating potential for increased state-awareness of abnormal building behavior.

  19. Realization and detection of Weyl semimetals and the chiral anomaly in cold atomic systems

    NASA Astrophysics Data System (ADS)

    He, Wen-Yu; Zhang, Shizhong; Law, K. T.

    2016-07-01

    In this work, we describe a method to realize a three-dimensional Weyl semimetal by coupling multilayers of a honeycomb optical lattice in the presence of a pair of Raman lasers. The Raman lasers render each isolated honeycomb layer a Chern insulator. With finite interlayer coupling, the bulk gap of the system closes at certain out-of-plane momenta due to Raman assisted tunneling and results in the Weyl semimetal phase. Using experimentally relevant parameters, we show that both one pair and two pairs of Weyl points can be realized by tuning the interlayer coupling strength. We suggest that Landau-Zener tunneling can be used to detect Weyl points and show that the transition probability increases dramatically when the Weyl point emerges. The realization of chiral anomaly by using a magnetic-field gradient is also discussed.

  20. Data Mining for Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Biswas, Gautam; Mack, Daniel; Mylaraswamy, Dinkar; Bharadwaj, Raj

    2013-01-01

    The Vehicle Integrated Prognostics Reasoner (VIPR) program describes methods for enhanced diagnostics as well as a prognostic extension to the current state-of-the-art Aircraft Diagnostic and Maintenance System (ADMS). VIPR introduced a new anomaly detection function for discovering previously undetected and undocumented situations, where there are clear deviations from nominal behavior. Once a baseline (nominal model of operations) is established, the detection and analysis is split between on-aircraft outlier generation and off-aircraft expert analysis to characterize and classify events that may not have been anticipated by individual system providers. Offline expert analysis is supported by data curation and data mining algorithms that can be applied in the contexts of supervised and unsupervised learning. In this report, we discuss efficient methods to implement the Kolmogorov complexity measure using compression algorithms, and run a systematic empirical analysis to determine the best compression measure. Our experiments established that the combination of the DZIP compression algorithm and the CiDM distance measure provides the best results for capturing relevant properties of time series data encountered in aircraft operations. This combination was used as the basis for developing an unsupervised learning algorithm to define "nominal" flight segments using historical flight segments.
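
    The report's DZIP/CiDM combination is not reproduced here; the sketch below shows the underlying idea of approximating a Kolmogorov-complexity-based distance with a general-purpose compressor (zlib stands in for the compressors compared in the report), via the normalized compression distance.

        import zlib

        def ncd(x: bytes, y: bytes) -> float:
            """Normalized compression distance: a practical approximation of the
            (uncomputable) Kolmogorov-complexity-based distance between two sequences."""
            cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
            cxy = len(zlib.compress(x + y))
            return (cxy - min(cx, cy)) / max(cx, cy)

        # Example use: quantize two flight-segment time series into byte strings and compare;
        # segments whose NCD to a library of "nominal" segments stays high are outlier candidates.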

  1. Model selection for anomaly detection

    NASA Astrophysics Data System (ADS)

    Burnaev, E.; Erofeev, P.; Smolyakov, D.

    2015-12-01

    Anomaly detection based on one-class classification algorithms is broadly used in many applied domains like image processing (e.g., detecting whether a patient is "cancerous" or "healthy" from a mammography image), network intrusion detection, etc. The performance of an anomaly detection algorithm crucially depends on the kernel used to measure similarity in the feature space. The standard approaches to kernel selection used in two-class classification problems (e.g., cross-validation) cannot be applied directly due to the specific nature of the data (the absence of data for a second, abnormal class). In this paper we generalize several kernel selection methods from the binary-class case to the case of one-class classification and perform an extensive comparison of these approaches using both synthetic and real-world data.
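
    The kernel selection methods generalized in the paper are not reproduced here; the sketch below shows one common heuristic for the one-class setting under stated assumptions: synthetic uniform points stand in for the missing abnormal class, and the RBF bandwidth that best separates held-out normal data from those points is chosen.

        import numpy as np
        from sklearn.svm import OneClassSVM

        def select_gamma(X_normal, gammas=(0.01, 0.1, 1.0, 10.0), nu=0.1, seed=0):
            """Pick an RBF bandwidth for one-class classification without labeled anomalies.
            Assumes the rows of X_normal are already shuffled."""
            rng = np.random.default_rng(seed)
            n = len(X_normal)
            train, held = X_normal[: n // 2], X_normal[n // 2:]
            fake = rng.uniform(X_normal.min(0), X_normal.max(0), size=held.shape)
            best, best_gap = None, -np.inf
            for g in gammas:
                m = OneClassSVM(kernel="rbf", gamma=g, nu=nu).fit(train)
                gap = m.decision_function(held).mean() - m.decision_function(fake).mean()
                if gap > best_gap:
                    best, best_gap = g, gap
            return best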

  2. Astrometric solar system anomalies

    SciTech Connect

    Nieto, Michael Martin; Anderson, John D

    2009-01-01

    There are at least four unexplained anomalies connected with astrometric data. Perhaps the most disturbing is the fact that when a spacecraft on a flyby trajectory approaches the Earth within 2000 km or less, it often experiences a change in total orbital energy per unit mass. Next, a secular change in the astronomical unit (AU) is definitely a concern; it is increasing by about 15 cm per year. The other two anomalies are perhaps less disturbing because of known sources of nongravitational acceleration. The first is an apparent slowing of the two Pioneer spacecraft as they exit the solar system in opposite directions. Some astronomers and physicists are convinced this effect is of concern, but many others are convinced it is produced by a nearly identical thermal emission from both spacecraft, in a direction away from the Sun, thereby producing acceleration toward the Sun. The fourth anomaly is a measured increase in the eccentricity of the Moon's orbit. Here again, an increase is expected from tidal friction in both the Earth and Moon. However, there is a reported unexplained increase that is significant at the three-sigma level. It is prudent to suspect that all four anomalies have mundane explanations, or that one or more anomalies are a result of systematic error. Yet they might eventually be explained by new physics. For example, a slightly modified theory of gravitation is not ruled out, perhaps analogous to Einstein's 1916 explanation for the excess precession of Mercury's perihelion.

  3. Apparatus and method for detecting a magnetic anomaly contiguous to remote location by SQUID gradiometer and magnetometer systems

    DOEpatents

    Overton, W.C. Jr.; Steyert, W.A. Jr.

    1981-05-22

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  4. Network Anomaly Detection Based on Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have recently been applied to analyzing and detecting network anomalies because of their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies that combines wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.

  5. System and method for the detection of anomalies in an image

    DOEpatents

    Prasad, Lakshman; Swaminarayan, Sriram

    2013-09-03

    Preferred aspects of the present invention can include receiving a digital image at a processor; segmenting the digital image into a hierarchy of feature layers comprising one or more fine-scale features defining a foreground object embedded in one or more coarser-scale features defining a background to the one or more fine-scale features in the segmentation hierarchy; detecting a first fine-scale foreground feature as an anomaly with respect to a first background feature within which it is embedded; and constructing an anomalous feature layer by synthesizing spatially contiguous anomalous fine-scale features. Additional preferred aspects of the present invention can include detecting non-pervasive changes between sets of images in response at least in part to one or more difference images between the sets of images.

  6. Network Event Recording Device: An automated system for Network anomaly detection, and notification. Draft

    SciTech Connect

    Simmons, D.G.; Wilkins, R.

    1994-09-01

    The goal of the Network Event Recording Device (NERD) is to provide a flexible autonomous system for network logging and notification when significant network anomalies occur. The NERD is also charged with increasing the efficiency and effectiveness of currently implemented network security procedures. While it has always been possible for network and security managers to review log files for evidence of network irregularities, the NERD provides real-time display of network activity, as well as constant monitoring and notification services for managers. Similarly, real-time display and notification of possible security breaches will provide improved effectiveness in combating resource infiltration from both inside and outside the immediate network environment.

  7. Anomaly detection for internet surveillance

    NASA Astrophysics Data System (ADS)

    Bouma, Henri; Raaijmakers, Stephan; Halma, Arvid; Wedemeijer, Harry

    2012-06-01

    Many threats in the real world can be related to activity of persons on the internet. Internet surveillance aims to predict and prevent attacks and to assist in finding suspects based on information from the web. However, the amount of data on the internet rapidly increases and it is time consuming to monitor many websites. In this paper, we present a novel method to automatically monitor trends and find anomalies on the internet. The system was tested on Twitter data. The results showed that it can successfully recognize abnormal changes in activity or emotion.

  8. Anomaly Detection in Dynamic Networks

    SciTech Connect

    Turcotte, Melissa

    2014-10-14

    Anomaly detection in dynamic communication networks has many important security applications. These networks can be extremely large and so detecting any changes in their structure can be computationally challenging; hence, computationally fast, parallelisable methods for monitoring the network are paramount. For this reason the methods presented here use independent node and edge based models to detect locally anomalous substructures within communication networks. As a first stage, the aim is to detect changes in the data streams arising from node or edge communications. Throughout the thesis simple, conjugate Bayesian models for counting processes are used to model these data streams. A second stage of analysis can then be performed on a much reduced subset of the network comprising nodes and edges which have been identified as potentially anomalous in the first stage. The first method assumes communications in a network arise from an inhomogeneous Poisson process with piecewise constant intensity. Anomaly detection is then treated as a changepoint problem on the intensities. The changepoint model is extended to incorporate seasonal behavior inherent in communication networks. This seasonal behavior is also viewed as a changepoint problem acting on a piecewise constant Poisson process. In a static time frame, inference is made on this extended model via a Gibbs sampling strategy. In a sequential time frame, where the data arrive as a stream, a novel, fast Sequential Monte Carlo (SMC) algorithm is introduced to sample from the sequence of posterior distributions of the change points over time. A second method is considered for monitoring communications in a large scale computer network. The usage patterns in these types of networks are very bursty in nature and don’t fit a Poisson process model. For tractable inference, discrete time models are considered, where the data are aggregated into discrete time periods and probability models are fitted to the
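
    A minimal sketch of the first-stage idea, assuming a conjugate Gamma-Poisson model on per-edge counts: the posterior-predictive tail probability of a new count serves as the anomaly score for that edge. The prior parameters and flagging threshold are illustrative, not the thesis' settings.

    ```python
    # Conjugate Gamma-Poisson monitoring of a single edge's message counts:
    # the posterior-predictive (negative binomial) tail probability of a new
    # count is the anomaly score. Priors and the threshold are illustrative.
    import numpy as np
    from scipy.stats import nbinom

    def edge_surprise(history: np.ndarray, new_count: int,
                      a: float = 1.0, b: float = 1.0) -> float:
        """P(count >= new_count) under the Gamma(a, b)-Poisson posterior predictive."""
        s, n = history.sum(), len(history)
        r = a + s                          # posterior shape
        p = (b + n) / (b + n + 1.0)        # predictive success probability
        return nbinom.sf(new_count - 1, r, p)

    history = np.array([3, 5, 4, 2, 6, 4, 3, 5])     # counts in past time windows
    for count in (5, 15):
        pval = edge_surprise(history, count)
        print(f"count={count:2d}  tail prob={pval:.4f}  ->",
              "anomalous" if pval < 0.01 else "nominal")
    ```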

  9. A Clustering Method for Improving Performance of Anomaly-Based Intrusion Detection System

    NASA Astrophysics Data System (ADS)

    Song, Jungsuk; Ohira, Kenji; Takakura, Hiroki; Okabe, Yasuo; Kwon, Yongjin

    Intrusion detection systems (IDS) have played a central role as appliances that effectively defend crucial computer systems and networks against attackers on the Internet. The most widely deployed and commercially available methods for intrusion detection employ signature-based detection. However, they intrinsically cannot detect unknown intrusions that do not match the signatures, and acquiring the signatures requires a great deal of cost and time. In order to cope with these problems, many researchers have proposed various methods based on unsupervised learning techniques. Although these methods make it possible to construct an intrusion detection model with low cost and effort, and are capable of detecting unforeseen attacks, they still suffer from two main problems in intrusion detection: a low detection rate and a high false positive rate. In this paper, we present a new clustering method to improve the detection rate while maintaining a low false positive rate. We evaluated our method using the KDD Cup 1999 data set. Evaluation results show the superiority of our approach over other existing algorithms reported in the literature.

  10. Anomaly detection in the maritime domain

    NASA Astrophysics Data System (ADS)

    Roy, Jean

    2008-04-01

    Defence R&D Canada is developing a Collaborative Knowledge Exploitation Framework (CKEF) to support analysts in efficiently managing and exploiting relevant knowledge assets to achieve maritime domain awareness in the joint operations centres of the Canadian Forces. While developing the CKEF, anomaly detection has been clearly recognized as an important aspect requiring R&D. An activity has thus been undertaken to implement, within the CKEF, a proof-of-concept prototype of a rule-based expert system to support the analysts regarding this aspect. This expert system has to perform automated reasoning and output recommendations (or alerts) about maritime anomalies, thereby supporting the identification of vessels of interest and threat analysis. The system must contribute to a lower false alarm rate and a better probability of detection by drawing operators' attention to vessels worthy of further scrutiny. It must provide explanations as to why the vessels may be of interest, with links to resources that help the operators dig deeper. Mechanisms are necessary for the analysts to fine-tune the system, and for the knowledge engineer to maintain the knowledge base as the expertise of the operators evolves. This paper portrays the anomaly detection prototype and describes: the knowledge acquisition and elicitation sessions conducted to capture the know-how of the experts; the formal knowledge representation enablers and the ontology required for the aspects of the maritime domain that are relevant to anomaly detection, vessels of interest, and threat analysis; the prototype's high-level design and implementation on the service-oriented architecture of the CKEF; and other findings and results of this ongoing activity.

  11. Conscious and unconscious detection of semantic anomalies.

    PubMed

    Hannon, Brenda

    2015-01-01

    When asked What superhero is associated with bats, Robin, the Penguin, Metropolis, Catwoman, the Riddler, the Joker, and Mr. Freeze? people frequently fail to notice the anomalous word Metropolis. The goals of this study were to determine whether detection of semantic anomalies, like Metropolis, is conscious or unconscious and whether this detection is immediate or delayed. To achieve these goals, participants answered anomalous and nonanomalous questions as their reading times for words were recorded. Comparisons between detected versus undetected anomalies revealed slower reading times for detected anomalies-a finding that suggests that people immediately and consciously detected anomalies. Further, comparisons between first and second words following undetected anomalies versus nonanomalous controls revealed some slower reading times for first and second words-a finding that suggests that people may have unconsciously detected anomalies but this detection was delayed. Taken together, these findings support the idea that when we are immediately aware of a semantic anomaly (i.e., immediate conscious detection) our language processes make immediate adjustments in order to reconcile contradictory information of anomalies with surrounding text; however, even when we are not consciously aware of semantic anomalies, our language processes still make these adjustments, although these adjustments are delayed (i.e., delayed unconscious detection). PMID:25624136

  12. Recent Results on "Approximations to Optimal Alarm Systems for Anomaly Detection"

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2009-01-01

    An optimal alarm system and its approximations may use Kalman filtering for univariate linear dynamic systems driven by Gaussian noise to provide a layer of predictive capability. Predicted Kalman filter future process values and a fixed critical threshold can be used to construct a candidate level-crossing event over a predetermined prediction window. An optimal alarm system can be designed to elicit the fewest false alarms for a fixed detection probability in this particular scenario.
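
    A minimal sketch of the level-crossing idea for a scalar linear-Gaussian system, under assumed illustrative parameters: a Kalman filter tracks the process, the state is propagated over a short prediction window, and an alarm is raised when the probability of exceeding the fixed critical threshold becomes large. This is not the optimal alarm design itself, only the prediction-plus-threshold mechanism it builds on.

    ```python
    # Scalar Kalman filter plus a k-step-ahead exceedance probability: an alarm
    # is raised when the predicted chance of crossing the critical threshold
    # within the prediction window is high. All parameters are illustrative.
    import numpy as np
    from scipy.stats import norm

    A, C, Q, R = 0.95, 1.0, 0.1, 0.5          # dynamics, observation, noise variances
    THRESHOLD, HORIZON, ALARM_LEVEL = 3.0, 3, 0.2

    def kalman_alarm(measurements):
        x, P = 0.0, 1.0                        # state estimate and its variance
        for z in measurements:
            x_pred, P_pred = A * x, A * P * A + Q
            K = P_pred * C / (C * P_pred * C + R)
            x, P = x_pred + K * (z - C * x_pred), (1.0 - K * C) * P_pred
            xk, Pk = x, P                      # propagate over the prediction window
            for _ in range(HORIZON):
                xk, Pk = A * xk, A * Pk * A + Q
            p_exceed = norm.sf(THRESHOLD, loc=xk, scale=np.sqrt(Pk))
            yield p_exceed, p_exceed > ALARM_LEVEL

    rng = np.random.default_rng(1)
    z = np.concatenate([rng.normal(0.0, 0.7, 40), rng.normal(3.5, 0.7, 10)])
    for t, (p, alarm) in enumerate(kalman_alarm(z)):
        if alarm:
            print(f"t={t}: predicted exceedance probability {p:.2f} -> alarm")
    ```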

  13. Geomagnetic anomaly detected at hydromagnetic wave frequencies

    NASA Astrophysics Data System (ADS)

    Meloni, A.; Medford, L. V.; Lanzerotti, L. J.

    1985-04-01

    We report the discovery, in northwestern Illinois, of a geomagnetic anomaly, using hydromagnetic wave frequencies as the source spectrum. Three portable magnetometer stations with computer-compatible digital data acquisition systems were operated in a longitude array at Plano and Ashton, Illinois, and Cascade, Iowa (total separation ˜200 km), in 1981-1982. Analysis of the natural geomagnetic field fluctuations in the hydromagnetic wave regime reveals that the vertical components of the detected fluctuations are essentially 180° out of phase between Plano/Ashton and Cascade for variations with periods ˜30-120 s. The observations can be modeled in terms of a shallow (˜10-20 km) north-south oriented geomagnetic anomaly of enhanced conductivity located between Ashton and Cascade, approximately parallel to the Mississippi River valley.

  14. A New, Principled Approach to Anomaly Detection

    SciTech Connect

    Ferragut, Erik M; Laska, Jason A; Bridges, Robert A

    2012-01-01

    Intrusion detection is often described as having two main approaches: signature-based and anomaly-based. We argue that only unsupervised methods are suitable for detecting anomalies. However, there has been a tendency in the literature to conflate the notion of an anomaly with the notion of a malicious event. As a result, the methods used to discover anomalies have typically been ad hoc, making it nearly impossible to systematically compare between models or regulate the number of alerts. We propose a new, principled approach to anomaly detection that addresses the main shortcomings of ad hoc approaches. We provide both theoretical and cyber-specific examples to demonstrate the benefits of our more principled approach.

  15. Efficient Computer Network Anomaly Detection by Changepoint Detection Methods

    NASA Astrophysics Data System (ADS)

    Tartakovsky, Alexander G.; Polunchenko, Aleksey S.; Sokolov, Grigory

    2013-02-01

    We consider the problem of efficient on-line anomaly detection in computer network traffic. The problem is approached statistically, as that of sequential (quickest) changepoint detection. A multi-cyclic setting of quickest change detection is a natural fit for this problem. We propose a novel score-based multi-cyclic detection algorithm. The algorithm is based on the so-called Shiryaev-Roberts procedure. This procedure is as easy to employ in practice and as computationally inexpensive as the popular Cumulative Sum chart and the Exponentially Weighted Moving Average scheme. The likelihood ratio based Shiryaev-Roberts procedure has appealing optimality properties, particularly it is exactly optimal in a multi-cyclic setting geared to detect a change occurring at a far time horizon. It is therefore expected that an intrusion detection algorithm based on the Shiryaev-Roberts procedure will perform better than other detection schemes. This is confirmed experimentally for real traces. We also discuss the possibility of complementing our anomaly detection algorithm with a spectral-signature intrusion detection system with false alarm filtering and true attack confirmation capability, so as to obtain a synergistic system.
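
    A minimal sketch of the Shiryaev-Roberts recursion for a known Gaussian mean shift (pre-change N(0,1), post-change N(mu,1)); the paper's score-based, multi-cyclic variant and its tuning are not reproduced, and the shift size and threshold below are illustrative.

    ```python
    # Shiryaev-Roberts statistic for a known Gaussian mean shift:
    # R_t = (1 + R_{t-1}) * LR_t, alarm when R_t crosses the threshold.
    import numpy as np

    def shiryaev_roberts(x, mu=1.0, threshold=1e4):
        R = 0.0
        for t, xt in enumerate(x):
            lr = np.exp(mu * xt - 0.5 * mu * mu)   # N(mu,1) vs N(0,1) likelihood ratio
            R = (1.0 + R) * lr
            if R >= threshold:
                return t, R
        return None, R

    rng = np.random.default_rng(2)
    data = np.concatenate([rng.normal(0.0, 1.0, 200),   # pre-change traffic statistic
                           rng.normal(1.0, 1.0, 50)])   # post-change
    t_alarm, R = shiryaev_roberts(data)
    print(f"change declared at t={t_alarm} (true changepoint at t=200), R={R:.1f}")
    ```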

  16. Anomaly Detection for Resilient Control Systems Using Fuzzy-Neural Data Fusion Engine

    SciTech Connect

    Ondrej Linda; Milos Manic; Timothy R. McJunkin

    2011-08-01

    Resilient control systems in critical infrastructures require increased cyber-security and state-awareness. One of the necessary conditions for achieving the desired high level of resiliency is timely reporting and understanding of the status and behavioral trends of the control system. This paper describes the design and development of a neural-network based data-fusion system for increased state-awareness of resilient control systems. The proposed system consists of a dedicated data-fusion engine for each component of the control system. Each data-fusion engine implements a three-layered alarm system consisting of: (1) conventional threshold-based alarms, (2) an anomalous behavior detector using self-organizing maps, and (3) prediction-error based alarms using neural-network based signal forecasting. The proposed system was integrated with a model of the Idaho National Laboratory Hytest facility, which is a testing facility for hybrid energy systems. Experimental results demonstrate that the implemented data fusion system provides timely plant performance monitoring and cyber-state reporting.

  17. Spacecraft environmental anomalies expert system

    NASA Technical Reports Server (NTRS)

    Koons, H. C.; Gorney, D. J.

    1988-01-01

    A microcomputer-based expert system is being developed at the Aerospace Corporation Space Sciences Laboratory to assist in the diagnosis of satellite anomalies caused by the space environment. The expert system is designed to address anomalies caused by surface charging, bulk charging, single event effects and total radiation dose. These effects depend on the orbit of the satellite, the local environment (which is highly variable), the satellite exposure time and the hardness of the circuits and components of the satellite. The expert system is a rule-based system that uses the Texas Instruments Personal Consultant Plus expert system shell. The completed expert system knowledge base will include 150 to 200 rules, as well as a spacecraft attributes database, an historical spacecraft anomalies database, and a space environment database which is updated in near real-time. Currently, the expert system is undergoing development and testing within the Aerospace Corporation Space Sciences Laboratory.

  18. Artificial immune system via Euclidean Distance Minimization for anomaly detection in bearings

    NASA Astrophysics Data System (ADS)

    Montechiesi, L.; Cocconcelli, M.; Rubini, R.

    2016-08-01

    In recent years new diagnostic methodologies have emerged, with particular interest in machinery operating in non-stationary conditions. Continuous speed changes and variable loads make spectrum analysis non-trivial: a variable speed means a variable characteristic fault frequency for the damage, which is no longer recognizable in the spectrum. To overcome this problem the scientific community has proposed different approaches, grouped into two main categories: model-based approaches and expert systems. In this context the paper presents a simple expert system derived from the mechanisms of the immune system, called Euclidean Distance Minimization, and its application to a real case of bearing fault recognition. The proposed method is a simplification of the original process, adapted from the class of Artificial Immune Systems, which has proved useful and promising in different application fields. Comparative results are provided, with a complete explanation of the algorithm and its functioning.

  19. Spectral anomaly detection in deep shadows.

    PubMed

    Kanaev, Andrey V; Murray-Krezan, Jeremy

    2010-03-20

    Although several hyperspectral anomaly detection algorithms have proven useful when illumination conditions provide for enough light, many of these same detection algorithms fail to perform well when shadows are also present. To date, no general approach to the problem has been demonstrated. In this paper, a novel hyperspectral anomaly detection algorithm that adapts the dimensionality of the spectral detection subspace to multiple illumination levels is described. The novel detection algorithm is applied to reflectance domain hyperspectral data that represents a variety of illumination conditions: well illuminated and poorly illuminated (i.e., shadowed). Detection results obtained for objects located in deep shadows and light-shadow transition areas suggest superiority of the novel algorithm over standard subspace RX detection. PMID:20300158

  20. System for closure of a physical anomaly

    DOEpatents

    Bearinger, Jane P; Maitland, Duncan J; Schumann, Daniel L; Wilson, Thomas S

    2014-11-11

    Systems for closure of a physical anomaly. Closure is accomplished by a closure body with an exterior surface. The exterior surface contacts the opening of the anomaly and closes the anomaly. The closure body has a primary shape for closing the anomaly and a secondary shape for being positioned in the physical anomaly. The closure body preferably comprises a shape memory polymer.

  1. Attention focussing and anomaly detection in real-time systems monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, Richard J.; Chien, Steve A.; Fayyad, Usama M.; Porta, Harry J.

    1993-01-01

    In real-time monitoring situations, more information is not necessarily better. When faced with complex emergency situations, operators can experience information overload and a compromising of their ability to react quickly and correctly. We describe an approach to focusing operator attention in real-time systems monitoring based on a set of empirical and model-based measures for determining the relative importance of sensor data.

  2. Predictability in space launch vehicle anomaly detection using intelligent neuro-fuzzy systems

    NASA Technical Reports Server (NTRS)

    Gulati, Sandeep; Toomarian, Nikzad; Barhen, Jacob; Maccalla, Ayanna; Tawel, Raoul; Thakoor, Anil; Daud, Taher

    1994-01-01

    Included in this viewgraph presentation on intelligent neuroprocessors for launch vehicle health management systems (HMS) are the following: where the flight failures have been in launch vehicles; cumulative delay time; breakdown of operations hours; failure of Mars Probe; vehicle health management (VHM) cost optimizing curve; target HMS-STS auxiliary power unit location; APU monitoring and diagnosis; and integration of neural networks and fuzzy logic.

  3. Anomaly Detection Using Behavioral Approaches

    NASA Astrophysics Data System (ADS)

    Benferhat, Salem; Tabia, Karim

    Behavioral approaches, which model normal/abnormal activities, have been widely used in recent years in intrusion detection and computer security. Nevertheless, most works have shown that they are ineffective for detecting novel attacks involving new behaviors. In this paper, we first study this recurring problem, which is due on the one hand to the inadequate handling of anomalous and unusual audit events and on the other hand to insufficient decision rules that do not meet the objectives of behavioral approaches. We then propose to enhance the standard decision rules in order to fit the requirements of behavioral approaches and better detect novel attacks. Experimental studies carried out on real and simulated HTTP traffic show that these enhanced decision rules improve the detection of most novel attacks without triggering higher false alarm rates.

  4. Anomaly Detection for Discrete Sequences: A Survey

    SciTech Connect

    Chandola, Varun; Banerjee, Arindam; Kumar, Vipin

    2012-01-01

    This survey attempts to provide a comprehensive and structured overview of the existing research for the problem of detecting anomalies in discrete/symbolic sequences. The objective is to provide a global understanding of the sequence anomaly detection problem and how existing techniques relate to each other. The key contribution of this survey is the classification of the existing research into three distinct categories, based on the problem formulation that they are trying to solve. These problem formulations are: 1) identifying anomalous sequences with respect to a database of normal sequences; 2) identifying an anomalous subsequence within a long sequence; and 3) identifying a pattern in a sequence whose frequency of occurrence is anomalous. We show how each of these problem formulations is characteristically distinct from each other and discuss their relevance in various application domains. We review techniques from many disparate and disconnected application domains that address each of these formulations. Within each problem formulation, we group techniques into categories based on the nature of the underlying algorithm. For each category, we provide a basic anomaly detection technique, and show how the existing techniques are variants of the basic technique. This approach shows how different techniques within a category are related or different from each other. Our categorization reveals new variants and combinations that have not been investigated before for anomaly detection. We also provide a discussion of relative strengths and weaknesses of different techniques. We show how techniques developed for one problem formulation can be adapted to solve a different formulation, thereby providing several novel adaptations to solve the different problem formulations. We also highlight the applicability of the techniques that handle discrete sequences to other related areas such as online anomaly detection and time series anomaly detection.

  5. Development of a Computer Architecture to Support the Optical Plume Anomaly Detection (OPAD) System

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1996-01-01

    The NASA OPAD spectrometer system relies heavily on extensive software which repetitively extracts spectral information from the engine plume and reports the amounts of metals which are present in the plume. The development of this software is at a sufficiently advanced stage where it can be used in actual engine tests to provide valuable data on engine operation and health. This activity will continue and, in addition, the OPAD system is planned to be used in flight aboard space vehicles. The two implementations, test-stand and in-flight, may have some differing requirements. For example, the data stored during a test-stand experiment are much more extensive than in the in-flight case. In both cases though, the majority of the requirements are similar. New data from the spectrograph is generated at a rate of once every 0.5 sec or faster. All processing must be completed within this period of time to maintain real-time performance. Every 0.5 sec, the OPAD system must report the amounts of specific metals within the engine plume, given the spectral data. At present, the software in the OPAD system performs this function by solving the inverse problem. It uses powerful physics-based computational models (the SPECTRA code), which receive amounts of metals as inputs to produce the spectral data that would have been observed, had the same metal amounts been present in the engine plume. During the experiment, for every spectrum that is observed, an initial approximation is performed using neural networks to establish an initial metal composition which approximates as accurately as possible the real one. Then, using optimization techniques, the SPECTRA code is repetitively used to produce a fit to the data, by adjusting the metal input amounts until the produced spectrum matches the observed one to within a given level of tolerance. This iterative solution to the original problem of determining the metal composition in the plume requires a relatively long period of time

  6. Hyperspectral Anomaly Detection in Urban Scenarios

    NASA Astrophysics Data System (ADS)

    Rejas Ayuga, J. G.; Martínez Marín, R.; Marchamalo Sacristán, M.; Bonatti, J.; Ojeda, J. C.

    2016-06-01

    We have studied the spectral features of reflectance and emissivity for the pattern recognition of urban materials in several single hyperspectral scenes, through a comparative analysis of anomaly detection methods and their relationship with city surfaces, with the aim of improving information extraction processes. Spectral ranges of the visible-near infrared (VNIR), shortwave infrared (SWIR) and thermal infrared (TIR) from hyperspectral data cubes of the AHS sensor and of HyMAP and MASTER for two cities, Alcalá de Henares (Spain) and San José (Costa Rica) respectively, have been used. In this research no prior knowledge of the targets is assumed; thus, the pixels are automatically separated according to their spectral information, significantly differentiated with respect to a background, either globally for the full scene or locally by image segmentation. Several experiments on urban and semi-urban scenarios have been designed, analyzing the behaviour of the standard RX anomaly detector and of different subspace-based, image-projection-based and segmentation-based anomaly detection methods. A new technique for anomaly detection in hyperspectral data, called DATB (Detector of Anomalies from Thermal Background), based on dimensionality reduction by projecting targets with unknown spectral signatures onto a background calculated from thermal spectrum wavelengths, is presented. First results and their consequences for non-supervised classification and information extraction processes are discussed.
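
    For reference, the standard global RX detector used as the baseline above is the Mahalanobis distance of each pixel spectrum from the scene-wide background statistics; the sketch below is that baseline only, not the DATB technique introduced in the paper, and the synthetic cube, injected test pixel and chi-square threshold are illustrative.

    ```python
    # Baseline global RX detector: Mahalanobis distance of each pixel spectrum
    # from the scene-wide mean and covariance.
    import numpy as np
    from scipy.stats import chi2

    def rx_detector(cube: np.ndarray) -> np.ndarray:
        """cube: (rows, cols, bands) -> (rows, cols) RX scores."""
        rows, cols, bands = cube.shape
        X = cube.reshape(-1, bands).astype(np.float64)
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(bands)   # regularized
        centered = X - mu
        scores = np.einsum("ij,jk,ik->i", centered, np.linalg.inv(cov), centered)
        return scores.reshape(rows, cols)

    cube = np.random.default_rng(3).normal(size=(64, 64, 20))
    cube[5, 7] += 5.0                          # inject one spectrally anomalous pixel
    mask = rx_detector(cube) > chi2.ppf(0.999, df=cube.shape[-1])
    print("anomalous pixels flagged:", int(mask.sum()))
    ```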

  7. Anomaly detection using classified eigenblocks in GPR image

    NASA Astrophysics Data System (ADS)

    Kim, Min Ju; Kim, Seong Dae; Lee, Seung-eui

    2016-05-01

    Automatic landmine detection systems using ground penetrating radar have been widely researched. For an automatic mine detection system, speed is an important factor. Many techniques for mine detection have been developed on a statistical basis. Among them, a detection technique employing Principal Component Analysis (PCA) has been used for clutter reduction and anomaly detection. However, the PCA technique can slow the entire process because of the large basis dimension and the large number of inner-product operations. In order to overcome this problem, we propose a fast anomaly detection system using the 2D DCT and PCA. Our experiments use a set of data obtained from a test site where anti-tank and anti-personnel mines are buried. We evaluate the proposed system in terms of the ROC curve. The results show that the proposed system performs much better than conventional PCA systems from the viewpoint of speed and false alarm rate.
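
    A sketch of the 2D DCT plus PCA pipeline described above, under illustrative assumptions: each image block is reduced to its low-frequency DCT coefficients, a PCA basis is learned from clutter-only blocks, and the energy outside that subspace is the anomaly score. Block size, coefficient count and component count are placeholders, not the paper's settings.

    ```python
    # 2D DCT + PCA anomaly scoring for image blocks: keep low-frequency DCT
    # coefficients, learn a clutter subspace with PCA, and score blocks by the
    # energy left outside that subspace.
    import numpy as np
    from scipy.fftpack import dct

    def block_features(block: np.ndarray, keep: int = 8) -> np.ndarray:
        """Flattened low-frequency 2D DCT coefficients of one image block."""
        c = dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")
        return c[:keep, :keep].ravel()

    def fit_clutter_subspace(F: np.ndarray, n_components: int = 10):
        mean = F.mean(axis=0)
        _, _, Vt = np.linalg.svd(F - mean, full_matrices=False)
        return mean, Vt[:n_components]

    def anomaly_score(f, mean, basis) -> float:
        r = f - mean
        return float(np.sum((r - basis.T @ (basis @ r)) ** 2))

    rng = np.random.default_rng(4)
    clutter = rng.normal(size=(200, 32, 32))                  # clutter-only training blocks
    F = np.stack([block_features(b) for b in clutter])
    mean, basis = fit_clutter_subspace(F)

    test = rng.normal(size=(32, 32))
    test[10:20, 10:20] += 3.0                                 # simulated buried target
    print("clutter score:", round(anomaly_score(block_features(clutter[0]), mean, basis), 1))
    print("target score :", round(anomaly_score(block_features(test), mean, basis), 1))
    ```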

  8. Fusion and normalization to enhance anomaly detection

    NASA Astrophysics Data System (ADS)

    Mayer, R.; Atkinson, G.; Antoniades, J.; Baumback, M.; Chester, D.; Edwards, J.; Goldstein, A.; Haas, D.; Henderson, S.; Liu, L.

    2009-05-01

    This study examines normalizing the imagery and the optimization metrics to enhance anomaly and change detection, respectively. The RX algorithm, the standard anomaly detector for hyperspectral imagery, more successfully extracts bright rather than dark man-made objects when applied to visible hyperspectral imagery. Normalizing the imagery prior to applying the anomaly detector can help detect some of the problematic dark objects, but can also miss some bright objects. This study therefore fuses the outputs of RX applied to normalized and unnormalized imagery into a single decision surface. The technique was tested using imagery of commercial vehicles in an urban environment gathered by a hyperspectral visible/near-IR sensor mounted on an airborne platform. Combining detections first requires converting the detector output to a target probability: the observed anomaly detections were fitted with a linear combination of chi-square distributions, and these weights were used to compute the target probability. Receiver Operating Characteristic (ROC) analysis quantitatively assessed the target detection performance. The target detection performance is highly variable, depending on the relative numbers of candidate bright and dark targets and false alarms, and is controlled in this study by using vegetation and street-line masks. The joint Boolean OR and AND operations also give variable performance depending on the scene. The joint SUM operation provides a reasonable compromise between the OR and AND operations and has good target detection performance. In addition, transforms based on the normalized correlation coefficient and least squares lead to methods related to canonical correlation analysis (CCA) and normalized image regression (NIR). Transforms based on CCA and NIR performed better than the standard approaches. For change detection, only RX applied to the unnormalized difference imagery provides adequate performance.

  9. Anomaly Detection Techniques for Ad Hoc Networks

    ERIC Educational Resources Information Center

    Cai, Chaoli

    2009-01-01

    Anomaly detection is an important and indispensable aspect of any computer security mechanism. Ad hoc and mobile networks consist of a number of peer mobile nodes that are capable of communicating with each other absent a fixed infrastructure. Arbitrary node movements and lack of centralized control make them vulnerable to a wide variety of…

  10. OPAD data analysis. [Optical Plumes Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.; Kraft, Richard; Whitaker, Kevin; Cooper, Anita E.; Powers, W. T.; Wallace, Tim L.

    1993-01-01

    Data obtained in the framework of an Optical Plume Anomaly Detection (OPAD) program intended to create a rocket engine health monitor based on spectrometric detections of anomalous atomic and molecular species in the exhaust plume are analyzed. The major results include techniques for handling data noise, methods for registration of spectra to wavelength, and a simple automatic process for estimating the metallic component of a spectrum.

  11. Multiple-Instance Learning for Anomaly Detection in Digital Mammography.

    PubMed

    Quellec, Gwenole; Lamard, Mathieu; Cozic, Michel; Coatrieux, Gouenou; Cazuguel, Guy

    2016-07-01

    This paper describes a computer-aided detection and diagnosis system for breast cancer, the most common form of cancer among women, using mammography. The system relies on the Multiple-Instance Learning (MIL) paradigm, which has proven useful for medical decision support in previous works from our team. In the proposed framework, breasts are first partitioned adaptively into regions. Then, features derived from the detection of lesions (masses and microcalcifications) as well as textural features, are extracted from each region and combined in order to classify mammography examinations as "normal" or "abnormal". Whenever an abnormal examination record is detected, the regions that induced that automated diagnosis can be highlighted. Two strategies are evaluated to define this anomaly detector. In a first scenario, manual segmentations of lesions are used to train an SVM that assigns an anomaly index to each region; local anomaly indices are then combined into a global anomaly index. In a second scenario, the local and global anomaly detectors are trained simultaneously, without manual segmentations, using various MIL algorithms (DD, APR, mi-SVM, MI-SVM and MILBoost). Experiments on the DDSM dataset show that the second approach, which is only weakly-supervised, surprisingly outperforms the first approach, even though it is strongly-supervised. This suggests that anomaly detectors can be advantageously trained on large medical image archives, without the need for manual segmentation. PMID:26829783

  12. The role of noninvasive and invasive diagnostic imaging techniques for detection of extra-cranial venous system anomalies and developmental variants

    PubMed Central

    2013-01-01

    The extra-cranial venous system is complex and not well studied in comparison to the peripheral venous system. A newly proposed vascular condition, named chronic cerebrospinal venous insufficiency (CCSVI), described initially in patients with multiple sclerosis (MS) has triggered intense interest in better understanding of the role of extra-cranial venous anomalies and developmental variants. So far, there is no established diagnostic imaging modality, non-invasive or invasive, that can serve as the “gold standard” for detection of these venous anomalies. However, consensus guidelines and standardized imaging protocols are emerging. Most likely, a multimodal imaging approach will ultimately be the most comprehensive means for screening, diagnostic and monitoring purposes. Further research is needed to determine the spectrum of extra-cranial venous pathology and to compare the imaging findings with pathological examinations. The ability to define and reliably detect noninvasively these anomalies is an essential step toward establishing their incidence and prevalence. The role for these anomalies in causing significant hemodynamic consequences for the intra-cranial venous drainage in MS patients and other neurologic disorders, and in aging, remains unproven. PMID:23806142

  13. Gravity anomaly detection: Apollo/Soyuz

    NASA Technical Reports Server (NTRS)

    Vonbun, F. O.; Kahn, W. D.; Bryan, J. W.; Schmid, P. E.; Wells, W. T.; Conrad, D. T.

    1976-01-01

    The Goddard Apollo-Soyuz Geodynamics Experiment is described. It was performed to demonstrate the feasibility of tracking and recovering high-frequency components of the Earth's gravity field by utilizing a synchronously orbiting tracking station such as ATS-6. Gravity anomalies of 5 mgals or larger having wavelengths of 300 to 1000 kilometers on the Earth's surface are important for geologic studies of the upper layers of the Earth's crust. Short-wavelength gravity anomalies of the Earth were detected from space. Two prime areas of data collection were selected for the experiment: (1) the center of the African continent and (2) the Indian Ocean Depression centered at 5° north latitude and 75° east longitude. Preliminary results show that the detectability objective of the experiment was met in both areas as well as at several additional anomalous areas around the globe. Gravity anomalies of the Karakoram and Himalayan mountain ranges, ocean trenches, as well as the Diamantina Depth, can be seen. Maps outlining the anomalies discovered are shown.

  14. Anomaly Detection in Power Quality at Data Centers

    NASA Technical Reports Server (NTRS)

    Grichine, Art; Solano, Wanda M.

    2015-01-01

    The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
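
    A minimal sketch of the Gaussian set-point computation described above: alarm limits are placed at mean ± k·σ, with k derived from a target two-sided false-positive rate. The channel, nominal values and target rate are illustrative assumptions.

    ```python
    # Gaussian set-points for a power-quality channel: alarm limits at
    # mean +/- k*sigma, with k set by the desired two-sided false-positive rate.
    import numpy as np
    from scipy.stats import norm

    def gaussian_setpoints(samples: np.ndarray, false_positive_rate: float = 1e-4):
        mu, sigma = samples.mean(), samples.std(ddof=1)
        k = norm.ppf(1.0 - false_positive_rate / 2.0)
        return mu - k * sigma, mu + k * sigma

    rng = np.random.default_rng(5)
    voltage = rng.normal(480.0, 2.0, size=10_000)      # e.g. a nominal 480 V feed
    low, high = gaussian_setpoints(voltage)
    print(f"set-points: [{low:.1f} V, {high:.1f} V]")
    print("anomaly" if not low <= 492.0 <= high else "nominal")
    ```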

  15. Firefly Algorithm in detection of TEC seismo-ionospheric anomalies

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, Mehdi

    2015-07-01

    Anomaly detection in time series of different earthquake precursors is an essential step toward creating an early warning system with an allowable uncertainty. Since these time series are often nonlinear, complex and massive, the applied predictor method should be able to detect discord patterns in a large amount of data in a short time. This study presents the Firefly Algorithm (FA) as a simple and robust predictor for detecting TEC (Total Electron Content) seismo-ionospheric anomalies around the time of some powerful earthquakes, including Chile (27 February 2010), Varzeghan (11 August 2012) and Saravan (16 April 2013). Outstanding anomalies were observed 7 and 5 days before the Chile and Varzeghan earthquakes, respectively, and 3 and 8 days prior to the Saravan earthquake.

  16. A hybrid approach for efficient anomaly detection using metaheuristic methods.

    PubMed

    Ghanem, Tamer F; Elkilani, Wail S; Abdul-Kader, Hatem M

    2015-07-01

    Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated by a multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative-selection-based detector generation. The evaluation of this approach is performed on the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1%, compared to competing machine learning algorithms. PMID:26199752

  17. Anomaly Detection Based on Sensor Data in Petroleum Industry Applications

    PubMed Central

    Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra

    2015-01-01

    Anomaly detection is the problem of finding patterns in data that do not conform to an a priori expected behavior. This is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset, where these anomalous samples are indicated as outliers. Anomaly detection has recently attracted the attention of the research community, because of its relevance in real-world applications, like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation. However, in either case, it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one of the application contexts where these problems are present. The correct detection of such types of unusual information empowers the decision maker with the capacity to act on the system in order to correctly avoid, correct or react to the situations associated with them. In that application context, heavy extraction machines for pumping and generation operations, like turbomachines, are intensively monitored by hundreds of sensors each that send measurements with a high frequency for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel fast and high quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal is meant for dealing with the aforementioned task and to cope with the lack of labeled training data. As a result, we perform a series of empirical studies comparing our approach to other methods applied to benchmark problems and a real-life application related to oil platform turbomachinery anomaly detection. PMID:25633599

  18. Anomaly detection based on sensor data in petroleum industry applications.

    PubMed

    Martí, Luis; Sanchez-Pi, Nayat; Molina, José Manuel; Garcia, Ana Cristina Bicharra

    2015-01-01

    Anomaly detection is the problem of finding patterns in data that do not conform to an a priori expected behavior. This is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset, where these anomalous samples are indicated as outliers. Anomaly detection has recently attracted the attention of the research community, because of its relevance in real-world applications, like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation. However, in either case, it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one of the application contexts where these problems are present. The correct detection of such types of unusual information empowers the decision maker with the capacity to act on the system in order to correctly avoid, correct or react to the situations associated with them. In that application context, heavy extraction machines for pumping and generation operations, like turbomachines, are intensively monitored by hundreds of sensors each that send measurements with a high frequency for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel fast and high quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal is meant for dealing with the aforementioned task and to cope with the lack of labeled training data. As a result, we perform a series of empirical studies comparing our approach to other methods applied to benchmark problems and a real-life application related to oil platform turbomachinery anomaly detection. PMID:25633599

  19. Profile-based adaptive anomaly detection for network security.

    SciTech Connect

    Zhang, Pengchu C. (Sandia National Laboratories, Albuquerque, NM); Durgin, Nancy Ann

    2005-11-01

    As information systems become increasingly complex and pervasive, they become inextricably intertwined with the critical infrastructure of national, public, and private organizations. The problem of recognizing and evaluating threats against these complex, heterogeneous networks of cyber and physical components is a difficult one, yet a solution is vital to ensuring security. In this paper we investigate profile-based anomaly detection techniques that can be used to address this problem. We focus primarily on the area of network anomaly detection, but the approach could be extended to other problem domains. We investigate using several data analysis techniques to create profiles of network hosts and perform anomaly detection using those profiles. The ''profiles'' reduce multi-dimensional vectors representing ''normal behavior'' into fewer dimensions, thus allowing pattern and cluster discovery. New events are compared against the profiles, producing a quantitative measure of how ''anomalous'' the event is. Most network intrusion detection systems (IDSs) detect malicious behavior by searching for known patterns in the network traffic. This approach suffers from several weaknesses, including a lack of generalizability, an inability to detect stealthy or novel attacks, and lack of flexibility regarding alarm thresholds. Our research focuses on enhancing current IDS capabilities by addressing some of these shortcomings. We identify and evaluate promising techniques for data mining and machine-learning. The algorithms are ''trained'' by providing them with a series of data-points from ''normal'' network traffic. A successful algorithm can be trained automatically and efficiently, will have a low error rate (low false alarm and miss rates), and will be able to identify anomalies in ''pseudo real-time'' (i.e., while the intrusion is still in progress, rather than after the fact). We also build a prototype anomaly detection tool that demonstrates how the techniques might

  20. Investigation of the collision line broadening problem as applicable to the NASA Optical Plume Anomaly Detection (OPAD) system, phase 1

    NASA Astrophysics Data System (ADS)

    Dean, Timothy C.; Ventrice, Carl A.

    1995-05-01

    As a final report for phase 1 of the project, the researchers are submitting to the Tennessee Tech Office of Research the following two papers (reprinted in this report): 'Collision Line Broadening Effects on Spectrometric Data from the Optical Plume Anomaly System (OPAD),' presented at the 30th AIAA/ASME/SAE/ASEE Joint Propulsion Conference, 27-29 June 1994, and 'Calculation of Collision Cross Sections for Atomic Line Broadening in the Plume of the Space Shuttle Main Engine (SSME),' presented at the IEEE Southeastcon '95, 26-29 March 1995. These papers fully state the problem and the progress made up to the end of NASA Fiscal Year 1994. The NASA OPAD system was devised to predict concentrations of anomalous species in the plume of the Space Shuttle Main Engine (SSME) through analysis of spectrometric data. The self absorption of the radiation of these plume anomalies is highly dependent on the line shape of the atomic transition of interest. The Collision Line Broadening paper discusses the methods used to predict line shapes of atomic transitions in the environment of a rocket plume. The Voigt profile is used as the line shape factor since both Doppler and collisional line broadening are significant. Methods used to determine the collisional cross sections are discussed and the results are given and compared with experimental data. These collisional cross sections are then incorporated into the current self absorbing radiative model and the predicted spectrum is compared to actual spectral data collected from the Stennis Space Center Diagnostic Test Facility rocket engine. The second paper included in this report investigates an analytical method for determining the cross sections for collision line broadening by molecular perturbers, using effective central force interaction potentials. These cross sections are determined for several atomic species with H2, one of the principal constituents of the SSME plume environment, and compared with experimental data.
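
    Since the abstract identifies the Voigt profile as the line-shape factor, the following sketch evaluates it numerically via the real part of the Faddeeva function, a standard route; the line centre and the Doppler and collisional widths are illustrative values, not SSME plume parameters.

    ```python
    # Voigt line shape (Gaussian/Doppler convolved with Lorentzian/collisional
    # broadening) evaluated through the Faddeeva function w(z).
    import numpy as np
    from scipy.special import wofz

    def voigt_profile(x, x0, sigma, gamma):
        """Area-normalized Voigt profile; sigma = Gaussian std, gamma = Lorentzian HWHM."""
        z = ((x - x0) + 1j * gamma) / (sigma * np.sqrt(2.0))
        return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

    wavelength = np.linspace(588.0, 590.0, 2000)        # nm, illustrative window
    line = voigt_profile(wavelength, x0=589.0, sigma=0.01, gamma=0.02)
    print("peak height:", line.max())
    ```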

  1. Investigation of the collision line broadening problem as applicable to the NASA Optical Plume Anomaly Detection (OPAD) system, phase 1

    NASA Technical Reports Server (NTRS)

    Dean, Timothy C.; Ventrice, Carl A.

    1995-01-01

    As a final report for phase 1 of the project, the researchers are submitting to the Tennessee Tech Office of Research the following two papers (reprinted in this report): 'Collision Line Broadening Effects on Spectrometric Data from the Optical Plume Anomaly System (OPAD),' presented at the 30th AIAA/ASME/SAE/ASEE Joint Propulsion Conference, 27-29 June 1994, and 'Calculation of Collision Cross Sections for Atomic Line Broadening in the Plume of the Space Shuttle Main Engine (SSME),' presented at the IEEE Southeastcon '95, 26-29 March 1995. These papers fully state the problem and the progress made up to the end of NASA Fiscal Year 1994. The NASA OPAD system was devised to predict concentrations of anomalous species in the plume of the Space Shuttle Main Engine (SSME) through analysis of spectrometric data. The self absorption of the radiation of these plume anomalies is highly dependent on the line shape of the atomic transition of interest. The Collision Line Broadening paper discusses the methods used to predict line shapes of atomic transitions in the environment of a rocket plume. The Voigt profile is used as the line shape factor since both Doppler and collisional line broadening are significant. Methods used to determine the collisional cross sections are discussed and the results are given and compared with experimental data. These collisional cross sections are then incorporated into the current self absorbing radiative model and the predicted spectrum is compared to actual spectral data collected from the Stennis Space Center Diagnostic Test Facility rocket engine. The second paper included in this report investigates an analytical method for determining the cross sections for collision line broadening by molecular perturbers, using effective central force interaction potentials. These cross sections are determined for several atomic species with H2, one of the principal constituents of the SSME plume environment, and compared with experimental data.

  2. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time, model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared to expected operation data for that location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
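
    A minimal sketch of the residual-significance-plus-persistence logic claimed above, under assumed illustrative thresholds: the modeling error is converted to a z-score against the expected measurement noise, and a structural anomaly is indicated only when the significance persists for several consecutive samples.

    ```python
    # Residual significance plus persistence: convert the modeling error to a
    # z-score and indicate a structural anomaly only when the significance
    # persists for several consecutive samples. Thresholds are illustrative.
    import numpy as np

    def detect_structural_anomaly(measured, expected, noise_std,
                                  z_limit=3.0, persistence_limit=5):
        run = 0
        for t, (m, e) in enumerate(zip(measured, expected)):
            z = abs(m - e) / noise_std           # significance of the modeling error
            run = run + 1 if z > z_limit else 0  # persistence of that significance
            if run >= persistence_limit:
                return t                         # structural anomaly indicated here
        return None

    rng = np.random.default_rng(6)
    expected = np.zeros(200)
    measured = rng.normal(0.0, 1.0, 200)
    measured[150:] += 5.0                        # sustained off-model deviation
    print("anomaly indicated at sample:", detect_structural_anomaly(measured, expected, 1.0))
    ```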

  3. Anomaly Detection for Next-Generation Space Launch Ground Operations

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Iverson, David L.; Hall, David R.; Taylor, William M.; Patterson-Hine, Ann; Brown, Barbara; Ferrell, Bob A.; Waterman, Robert D.

    2010-01-01

    NASA is developing new capabilities that will enable future human exploration missions while reducing mission risk and cost. The Fault Detection, Isolation, and Recovery (FDIR) project aims to demonstrate the utility of integrated vehicle health management (IVHM) tools in the domain of ground support equipment (GSE) to be used for the next generation launch vehicles. In addition to demonstrating the utility of IVHM tools for GSE, FDIR aims to mature promising tools for use on future missions and document the level of effort - and hence cost - required to implement an application with each selected tool. One of the FDIR capabilities is anomaly detection, i.e., detecting off-nominal behavior. The tool we selected for this task uses a data-driven approach. Unlike rule-based and model-based systems that require manual extraction of system knowledge, data-driven systems take a radically different approach to reasoning. At the basic level, they start with data that represent nominal functioning of the system and automatically learn expected system behavior. The behavior is encoded in a knowledge base that represents "in-family" system operations. During real-time system monitoring or during post-flight analysis, incoming data is compared to that nominal system operating behavior knowledge base; a distance representing deviation from nominal is computed, providing a measure of how far "out of family" current behavior is. We describe the selected tool for FDIR anomaly detection - Inductive Monitoring System (IMS), how it fits into the FDIR architecture, the operations concept for the GSE anomaly monitoring, and some preliminary results of applying IMS to a Space Shuttle GSE anomaly.
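
    A simplified sketch of the data-driven monitoring idea behind IMS, with k-means standing in for the actual IMS knowledge-base construction (an assumption, not the tool's algorithm): nominal archived data define clusters of in-family behavior, and incoming vectors are scored by their distance to the nearest cluster.

    ```python
    # Data-driven "in-family" monitoring in the spirit of IMS, with k-means as a
    # stand-in for the IMS knowledge base: cluster archived nominal data, then
    # score new vectors by distance to the nearest cluster center.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(7)
    nominal = rng.normal(size=(2000, 8))                 # archived nominal sensor vectors
    km = KMeans(n_clusters=20, n_init=10, random_state=0).fit(nominal)

    def out_of_family_distance(x: np.ndarray) -> float:
        return float(np.min(np.linalg.norm(km.cluster_centers_ - x, axis=1)))

    threshold = np.quantile([out_of_family_distance(v) for v in nominal], 0.999)

    reading = rng.normal(size=8)
    reading[0] += 6.0                                    # off-nominal sensor reading
    d = out_of_family_distance(reading)
    print(f"distance {d:.2f} vs threshold {threshold:.2f} ->",
          "out of family" if d > threshold else "in family")
    ```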

  4. Automatic detection of anomalies in Space Shuttle Main Engine turbopumps

    NASA Astrophysics Data System (ADS)

    Lo, Ching F.; Whitehead, B. A.; Wu, Kewei

    1992-07-01

    A prototype expert system (developed on both PC and Symbolics 3670 lisp machine) for detecting anomalies in turbopump vibration data has been tested with data from ground tests 902-473, 902-501, 902-519, and 904-097 of the Space Shuttle Main Engine (SSME). The expert system has been utilized to analyze vibration data from each of the following SSME components: high-pressure oxidizer turbopump, high-pressure fuel turbopump, low-pressure fuel turbopump, and preburner boost pump. The expert system locates and classifies peaks in the power spectral density of each 0.4-sec window of steady-state data. Peaks representing the fundamental and harmonic frequencies of both shaft rotation and bearing cage rotation are identified by the expert system. Anomalies are then detected on the basis of sequential criteria and two threshold criteria set individually for the amplitude of each of these peaks: a prior threshold used during the first few windows of data in a test, and a posterior threshold used thereafter. In most cases the anomalies detected by the expert system agree with those reported by NASA. The two cases where there is significant disagreement will be further studied and the system design refined accordingly.
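
    A sketch of the core signal-processing step described above, on synthetic data: estimate the power spectral density of a 0.4 s vibration window, locate its peaks, and flag any tracked peak whose amplitude exceeds its threshold. The sampling rate, tracked frequencies and thresholds are illustrative assumptions, and the expert system's sequential and prior/posterior threshold logic is not reproduced.

    ```python
    # Locate peaks in the PSD of a 0.4 s vibration window and flag any tracked
    # peak whose amplitude exceeds its threshold.
    import numpy as np
    from scipy.signal import welch, find_peaks

    fs = 10_240.0                                   # Hz, illustrative
    t = np.arange(0.0, 0.4, 1.0 / fs)               # one 0.4 s steady-state window
    rng = np.random.default_rng(8)
    vib = (np.sin(2 * np.pi * 600 * t)              # synchronous (shaft) component
           + 2.5 * np.sin(2 * np.pi * 1200 * t)     # abnormally large 2N harmonic
           + 0.5 * rng.normal(size=t.size))

    freqs, psd = welch(vib, fs=fs, nperseg=1024)
    peaks, _ = find_peaks(psd, height=0.01)

    tracked = {"1N (shaft)": (600.0, 0.1), "2N": (1200.0, 0.1)}   # Hz, PSD limit
    for name, (f0, limit) in tracked.items():
        near = peaks[np.abs(freqs[peaks] - f0) < 20.0]
        if near.size and psd[near].max() > limit:
            print(f"anomaly: {name} peak amplitude {psd[near].max():.2f} exceeds {limit}")
    ```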

  5. Automatic detection of anomalies in Space Shuttle Main Engine turbopumps

    NASA Technical Reports Server (NTRS)

    Lo, Ching F. (Principal Investigator); Whitehead, Bruce; Wu, Kewei; Rogers, George

    1992-01-01

    A prototype expert system for detecting anomalies in turbopump vibration data has been tested with data from ground tests 902-473, 902-501, 902-519, and 904-097 of the Space Shuttle Main Engine (SSME). The expert system has been utilized to analyze vibration data from each of the following SSME components: high-pressure oxidizer turbopump, high-pressure fuel turbopump, low-pressure fuel turbopump, and preburner boost pump. The expert system locates and classifies peaks in the power spectral density of each 0.4 s window of steady-state data. Peaks representing the fundamental and harmonic frequencies of both shaft rotation and bearing cage rotation are identified by the expert system. Anomalies are then detected on the basis of two thresholds set individually for the amplitude of each of these peaks: a prior threshold used during the first few windows of data in a test, and a posterior threshold used thereafter. In most cases the anomalies detected by the expert system agree with those reported by NASA. The two cases where there is significant disagreement will be further studied and the system design refined accordingly.

  6. Automatic detection of anomalies in Space Shuttle Main Engine turbopumps

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.; Whitehead, B. A.; Wu, Kewei

    1992-01-01

    A prototype expert system (developed on both PC and Symbolics 3670 lisp machine) for detecting anomalies in turbopump vibration data has been tested with data from ground tests 902-473, 902-501, 902-519, and 904-097 of the Space Shuttle Main Engine (SSME). The expert system has been utilized to analyze vibration data from each of the following SSME components: high-pressure oxidizer turbopump, high-pressure fuel turbopump, low-pressure fuel turbopump, and preburner boost pump. The expert system locates and classifies peaks in the power spectral density of each 0.4-sec window of steady-state data. Peaks representing the fundamental and harmonic frequencies of both shaft rotation and bearing cage rotation are identified by the expert system. Anomalies are then detected on the basis of sequential criteria and two threshold criteria set individually for the amplitude of each of these peaks: a prior threshold used during the first few windows of data in a test, and a posterior threshold used thereafter. In most cases the anomalies detected by the expert system agree with those reported by NASA. The two cases where there is significant disagreement will be further studied and the system design refined accordingly.

  7. Detecting syntactic and semantic anomalies in schizophrenia.

    PubMed

    Moro, Andrea; Bambini, Valentina; Bosia, Marta; Anselmetti, Simona; Riccaboni, Roberta; Cappa, Stefano F; Smeraldi, Enrico; Cavallaro, Roberto

    2015-12-01

    One of the major challenges in the study of language in schizophrenia is to identify specific levels of the linguistic structure that might be selectively impaired. While historically a mainly semantic deficit has been widely claimed, results are mixed, with some evidence of syntactic impairment as well. This might be due to heterogeneity in materials and paradigms across studies, which often do not allow single linguistic components to be isolated. Moreover, the interaction between linguistic and neurocognitive deficits is still unclear. In this study, we concentrated on syntactic and semantic knowledge. We employed an anomaly detection task including short and long sentences with either syntactic errors violating the principles of Universal Grammar, or a novel form of semantic errors, resulting from a contradiction in the computation of the whole sentence meaning. Fifty-eight patients with a diagnosis of schizophrenia were compared to 30 healthy subjects. Results showed that, in patients, only the ability to identify syntactic anomalies, both in short and long sentences, was impaired. This result cannot be explained by working memory abilities or psychopathological features. These findings suggest the presence of an impairment of syntactic knowledge in schizophrenia, at least partially independent of the cognitive and psychopathological profile. By contrast, we cannot conclude that there is a semantic impairment, at least in terms of compositional semantics abilities. PMID:26519554

  8. Automated anomaly detection for Orbiter High Temperature Reusable Surface Insulation

    NASA Astrophysics Data System (ADS)

    Cooper, Eric G.; Jones, Sharon M.; Goode, Plesent W.; Vazquez, Sixto L.

    1992-11-01

    The description, analysis, and experimental results of a method for identifying possible defects on High Temperature Reusable Surface Insulation (HRSI) of the Orbiter Thermal Protection System (TPS) are presented. Currently, a visual postflight inspection of Orbiter TPS is conducted to detect and classify defects as part of the Orbiter maintenance flow. The objective of the method is to automate the detection of defects by identifying anomalies between preflight and postflight images of TPS components. The initial version is intended to detect and label gross (greater than 0.1 inches in the smallest dimension) anomalies on HRSI components for subsequent classification by a human inspector. The approach is a modified Golden Template technique where the preflight image of a tile serves as the template against which the postflight image of the tile is compared. Candidate anomalies are selected as a result of the comparison and processed to identify true anomalies. The processing methods are developed and discussed, and the results of testing on actual and simulated tile images are presented. Solutions to the problems of brightness and spatial normalization, timely execution, and minimization of false positives are also discussed.
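
    A toy version of the template-comparison step (not the flight inspection system): difference a postflight image against the preflight "golden" image after brightness normalization, threshold the difference, and label candidate anomaly regions for later review. Image sizes, noise levels, and thresholds are illustrative.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
pre = rng.normal(0.5, 0.02, size=(128, 128))     # preflight "golden" tile image
post = pre + rng.normal(0.0, 0.01, size=pre.shape)
post[40:48, 60:70] -= 0.3                        # simulated defect in the coating

# Brightness normalization: match the postflight mean/std to the template.
post_n = (post - post.mean()) / post.std() * pre.std() + pre.mean()

diff = np.abs(post_n - pre)
candidates = diff > 0.15                         # candidate anomaly mask
labels, n = ndimage.label(candidates)            # group candidates into regions
sizes = ndimage.sum(candidates, labels, index=range(1, n + 1))
print(n, sizes.max())                            # number and size of regions
```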

  9. Anomaly detection enhanced classification in computer intrusion detection

    SciTech Connect

    Fugate, M. L.; Gattiker, J. R.

    2002-01-01

    This report describes work with the goal of enhancing capabilities in computer intrusion detection. The work builds upon a study of classification performance that compared various methods of classifying information derived from computer network packets into attack versus normal categories, based on a labeled training dataset. This previous work validates our classification methods, and clears the ground for studying whether and how anomaly detection can be used to enhance this performance. The DARPA project that initiated the dataset used here concluded that anomaly detection should be examined to boost the performance of machine learning in the computer intrusion detection task. This report investigates the data set for aspects that will be valuable for anomaly detection applications, and supports these results with models constructed from the data. In this report, the term anomaly detection means learning a model from unlabeled data, and using this to make some inference about future data. Our data is a feature vector derived from network packets: an 'example' or 'sample'. On the other hand, classification means building a model from labeled data, and using that model to classify unlabeled (future) examples. There is some precedent in the literature for combining these methods. One approach is to stage the two techniques, using anomaly detection to segment data into two sets for classification. An interpretation of this is a method to combat nonstationarity in the data. In our previous work, we demonstrated that the data has substantial temporal nonstationarity. With classification methods that can be thought of as learning a decision surface between two statistical distributions, performance is expected to degrade significantly when classifying examples that are from regions not well represented in the training set. Anomaly detection can be seen as a problem of learning the density (landscape) or the support (boundary) of a statistical distribution so that

  10. Spectral anomaly methods for aerial detection using KUT nuisance rejection

    NASA Astrophysics Data System (ADS)

    Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.

    2015-06-01

    This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical due to higher speeds, larger field of view, and geographically induced background changes. As well, the possible large altitude or stand-off distance variations cause significant steps in background count rate as well as spectral changes due to increased gamma-ray scatter with detection at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems: a NaI(Tl)-detector-based system and a CsI detector array. The optimization involves adapting the spectral windows for a particular set of target sources to aerial detection and tailoring them to the specific detectors. As well, the methodology and results for background rejection methods optimized for aerial gamma-ray detection using Potassium, Uranium and Thorium (KUT) nuisance rejection are shown. Results indicate that use of a realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection due to altitude changes and geographically induced steps such as at land-water interfaces.

  11. Statistical Anomaly Detection for Monitoring of Human Dynamics

    NASA Astrophysics Data System (ADS)

    Kamiya, K.; Fuse, T.

    2015-05-01

    Understanding of human dynamics has drawn attention in various research areas. Due to the widespread availability of positioning technologies that use GPS or public Wi-Fi, location information can be obtained with high spatial-temporal resolution as well as at low cost. By collecting sets of individual location information in real time, monitoring of human dynamics has recently become feasible and is expected to lead to dynamic traffic control in the future. Although this monitoring focuses on detecting anomalous states of human dynamics, anomaly detection methods are developed ad hoc and not fully systematized. This research aims to define an anomaly detection problem for human dynamics monitoring with gridded population data and to develop an anomaly detection method based on that definition. Based on the results of a review we have comprehensively conducted, we discussed the characteristics of anomaly detection for human dynamics monitoring and categorized our problem as a semi-supervised anomaly detection problem that detects contextual anomalies in time-series data. We developed an anomaly detection method based on a sticky HDP-HMM, which is able to estimate the number of hidden states according to input data. Results of the experiment with synthetic data showed that our proposed method has good fundamental performance with respect to the detection rate. Through the experiment with real gridded population data, an anomaly was detected when and where an actual social event had occurred.

  12. Anomaly detection in clutter using spectrally enhanced LADAR

    NASA Astrophysics Data System (ADS)

    Chhabra, Puneet S.; Wallace, Andrew M.; Hopgood, James R.

    2015-05-01

    Discrete return (DR) Laser Detection and Ranging (Ladar) systems provide a series of echoes that reflect from objects in a scene. These can be first, last or multi-echo returns. In contrast, Full-Waveform (FW)-Ladar systems measure the intensity of light reflected from objects continuously over a period of time. In a camouflaged scenario, e.g., objects hidden behind dense foliage, a FW-Ladar penetrates such foliage and returns a sequence of echoes including buried faint echoes. The aim of this paper is to learn local patterns of co-occurring echoes characterised by their measured spectra. A deviation from such patterns defines an abnormal event in a forest/tree depth profile. As far as the authors know, neither DR nor FW-Ladar, combined with multiple spectral measurements, has been applied to anomaly detection. This work presents an algorithm that allows detection of spectral and temporal anomalies in FW-Multi Spectral Ladar (FW-MSL) data samples. An anomaly is defined as a full waveform temporal and spectral signature that does not conform to a prior expectation, represented using a learnt subspace (dictionary) and set of coefficients that capture co-occurring local patterns using an overlapping temporal window. A modified optimization scheme is proposed for subspace learning based on stochastic approximations. The objective function is augmented with a discriminative term that represents the subspace's separability properties and supports anomaly characterisation. The algorithm detects several man-made objects and anomalous spectra hidden in a dense clutter of vegetation and also allows tree species classification.

  13. Thermal and TEC anomalies detection using an intelligent hybrid system around the time of the Saravan, Iran, (Mw = 7.7) earthquake of 16 April 2013

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2014-02-01

    A powerful earthquake of Mw = 7.7 struck the Saravan region (28.107° N, 62.053° E) in Iran on 16 April 2013. Devising an automated anomaly detection method for the nonlinear time series of earthquake precursors remains an attractive and challenging task. Artificial Neural Network (ANN) and Particle Swarm Optimization (PSO) have revealed strong potential for accurate time series prediction. This paper presents the first study integrating the ANN and PSO methods in earthquake precursor research to detect unusual variations of the thermal and total electron content (TEC) seismo-ionospheric anomalies induced by the strong Saravan earthquake. In this study, to overcome stagnation in local minima during ANN training, PSO is used as the optimization method instead of traditional algorithms for training the ANN. The proposed hybrid method detected a considerable number of anomalies 4 and 8 days preceding the earthquake. Since, in this case study, ionospheric TEC anomalies induced by seismic activity are easily confused with background fluctuations due to solar activity, a multi-resolution time series processing technique based on the wavelet transform has been applied to the TEC signal variations. Because agreement among the final results of several robust methods is a convincing indication of a method's efficiency, the thermal and TEC anomalies detected using the ANN + PSO method were compared with the anomalies observed by implementing the mean, median, wavelet, Kalman filter, Auto-Regressive Integrated Moving Average (ARIMA), Support Vector Machine (SVM) and Genetic Algorithm (GA) methods. The results indicate that the ANN + PSO method is quite promising and deserves serious attention as a new tool for detecting thermal and TEC seismo-ionospheric anomalies.

  14. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    PubMed

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibits anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets. PMID:26336154
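
    The Pareto-optimality idea can be sketched as repeated peeling of non-dominated points in the space of dissimilarity criteria: points on shallow fronts are close to Pareto-optimal, while a test sample dominated only at deep fronts is scored as more anomalous. The following is a simplified illustration of the depth computation, not the full PDA algorithm; the dyad values and the scoring proxy at the end are assumptions.

```python
import numpy as np

def pareto_depths(points):
    """Assign each row a depth: 1 for the first non-dominated (Pareto) front,
    2 for the front left after removing the first, and so on. Smaller
    criteria values are treated as better (smaller dissimilarity)."""
    pts = np.asarray(points, dtype=float)
    depths = np.zeros(len(pts), dtype=int)
    remaining = np.arange(len(pts))
    depth = 0
    while remaining.size:
        depth += 1
        sub = pts[remaining]
        # A point is dominated if some other point is <= in every criterion
        # and strictly < in at least one.
        dominated = np.array([
            np.any(np.all(sub <= p, axis=1) & np.any(sub < p, axis=1))
            for p in sub
        ])
        depths[remaining[~dominated]] = depth
        remaining = remaining[dominated]
    return depths

# Two dissimilarity criteria per training dyad (illustrative values).
rng = np.random.default_rng(3)
dyads = rng.gamma(2.0, 1.0, size=(200, 2))
depths = pareto_depths(dyads)

# Simple proxy for the test dyad's depth: the deepest front among training
# dyads that dominate it (deeper = more anomalous).
test_dyad = np.array([6.0, 7.0])
dominating = np.all(dyads <= test_dyad, axis=1) & np.any(dyads < test_dyad, axis=1)
score = depths[dominating].max() if dominating.any() else 0
print("anomaly score (front depth):", score)
```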

  15. Automated Network Anomaly Detection with Learning, Control and Mitigation

    ERIC Educational Resources Information Center

    Ippoliti, Dennis

    2014-01-01

    Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…

  16. Claycap anomaly detection using hyperspectral remote sensing and lidargrammetric techniques

    NASA Astrophysics Data System (ADS)

    Garcia Quijano, Maria Jose

    Clay capped waste sites are a common method to dispose of the more than 40 million tons of hazardous waste produced in the United States every year (EPA, 2003). Due to the potential threat that hazardous waste poses, it is essential to monitor closely the performance of these facilities. Development of a monitoring system that exploits spectral and topographic changes over hazardous waste sites is presented. Spectral anomaly detection is based upon the observed changes in absolute reflectance and spectral derivatives in centipede grass (Eremochloa ophiuroides) under different irrigation levels. The spectral features that provide the best separability among irrigation levels were identified using Stepwise Discriminant Analyses. The Red Edge Position was selected as a suitable discriminant variable to compare the performance of a global and a local anomaly detection algorithm using a DAIS 3715 hyperspectral image. Topographical anomaly detection is assessed by evaluating the vertical accuracy of two LIDAR datasets acquired from two different altitudes (700 m and 1,200 m AGL) over a clay-capped hazardous site at the Savannah River National Laboratory, SC using the same Optech ALTM 2050 and Cessna 337 platform. Additionally, a quantitative comparison is performed to determine the effect that decreasing platform altitude and increasing posting density have on the vertical accuracy of the LIDAR data collected.

  17. Discovering System Health Anomalies Using Data Mining Techniques

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.

    2005-01-01

    We present a data mining framework for the analysis and discovery of anomalies in high-dimensional time series of sensor measurements that would be found in an Integrated System Health Monitoring system. We specifically treat the problem of discovering anomalous features in the time series that may be indicative of a system anomaly, or in the case of a manned system, an anomaly due to the human. Identification of these anomalies is crucial to building stable, reusable, and cost-efficient systems. The framework consists of an analysis platform and new algorithms that can scale to thousands of sensor streams to discover temporal anomalies. We discuss the mathematical framework that underlies the system and also describe in detail how this framework is general enough to encompass both discrete and continuous sensor measurements. We also describe a new set of data mining algorithms based on kernel methods and hidden Markov models that allow for the rapid assimilation, analysis, and discovery of system anomalies. We then describe the performance of the system on a real-world problem in the aircraft domain where we analyze the cockpit data from aircraft as well as data from the aircraft propulsion, control, and guidance systems. These data are discrete and continuous sensor measurements and are dealt with seamlessly in order to discover anomalous flights. We conclude with recommendations that describe the tradeoffs in building an integrated scalable platform for robust anomaly detection in ISHM applications.

  18. Detection of Low Temperature Volcanogenic Thermal Anomalies with ASTER

    NASA Astrophysics Data System (ADS)

    Pieri, D. C.; Baxter, S.

    2009-12-01

    Predicting volcanic eruptions is a thorny problem, as volcanoes typically exhibit idiosyncratic waxing and/or waning pre-eruption emission, geodetic, and seismic behavior. It is no surprise that increasing our accuracy and precision in eruption prediction depends on assessing the time-progressions of all relevant precursor geophysical, geochemical, and geological phenomena, and on more frequently observing volcanoes when they become restless. The ASTER instrument on the NASA Terra Earth Observing System satellite in low earth orbit provides important capabilities in the area of detection of volcanogenic anomalies such as thermal precursors and increased passive gas emissions. Its unique high spatial resolution multi-spectral thermal IR imaging data (90m/pixel; 5 bands in the 8-12um region), bore-sighted with visible and near-IR imaging data, and combined with off-nadir pointing and stereo-photogrammetric capabilities make ASTER a potentially important volcanic precursor detection tool. We are utilizing the JPL ASTER Volcano Archive (http://ava.jpl.nasa.gov) to systematically examine 80,000+ ASTER volcano images to analyze (a) thermal emission baseline behavior for over 1500 volcanoes worldwide, (b) the form and magnitude of time-dependent thermal emission variability for these volcanoes, and (c) the spatio-temporal limits of detection of pre-eruption temporal changes in thermal emission in the context of eruption precursor behavior. We are creating and analyzing a catalog of the magnitude, frequency, and distribution of volcano thermal signatures worldwide as observed from ASTER since 2000 at 90m/pixel. Of particular interest as eruption precursors are small low contrast thermal anomalies of low apparent absolute temperature (e.g., melt-water lakes, fumaroles, geysers, grossly sub-pixel hotspots), for which the signal-to-noise ratio may be marginal (e.g., scene confusion due to clouds, water and water vapor, fumarolic emissions, variegated ground emissivity, and

  19. An Adaptive Network-based Fuzzy Inference System for the detection of thermal and TEC anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake of 11 August 2012

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-09-01

    Anomaly detection is extremely important for forecasting the date, location and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS) has been proposed to detect the thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake that struck NW Iran on 11 August 2012. ANFIS is a well-known hybrid neuro-fuzzy network for modeling nonlinear complex systems. In this study, the thermal and TEC anomalies detected using the proposed method are also compared with the anomalies observed by applying classical and intelligent methods including Interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN) and Support Vector Machine (SVM) methods. The dataset, comprising Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), spans 62 days. If the difference between the value predicted by the ANFIS method and the observed value exceeds a pre-defined threshold, then the observed precursor value, in the absence of non-seismic effective parameters, can be regarded as a precursory anomaly. For the two precursors, LST and TEC, the ANFIS method shows very good agreement with the other implemented classical and intelligent methods, indicating that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. This paper indicates that the detection of the thermal and TEC anomalies derives its credibility from the overall efficiencies and potentialities of the five integrated methods.
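
    The precursor-detection criterion stated above (a prediction residual exceeding a pre-defined bound) can be sketched as follows; a running median with an interquartile-style bound stands in for the ANFIS prediction and its threshold, and all values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
days = 62
tec = 20.0 + 2.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, days))  # quiet-time TEC
tec += rng.normal(0.0, 0.5, size=days)
tec[55] += 6.0                                 # injected precursor-like anomaly

# Predicted (background) value: here simply the running median of prior days;
# in the papers above an ANFIS/ANN model supplies this prediction instead.
predicted = np.array([np.median(tec[max(0, i - 10):i]) if i > 0 else tec[0]
                      for i in range(days)])
residual = tec - predicted

# Interquartile-style bound on the residuals.
q1, q3 = np.percentile(residual, [25, 75])
k = 1.5
upper, lower = q3 + k * (q3 - q1), q1 - k * (q3 - q1)
anomalous_days = np.where((residual > upper) | (residual < lower))[0]
print(anomalous_days)                          # the injected day should appear
```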

  20. Hierarchical Kohonen net for anomaly detection in network security.

    PubMed

    Sarasamma, Suseela T; Zhu, Qiuming A; Huff, Julie

    2005-04-01

    A novel multilevel hierarchical Kohonen Net (K-Map) for an intrusion detection system is presented. Each level of the hierarchical map is modeled as a simple winner-take-all K-Map. One significant advantage of this multilevel hierarchical K-Map is its computational efficiency. Unlike other statistical anomaly detection methods such as nearest neighbor approach, K-means clustering or probabilistic analysis that employ distance computation in the feature space to identify the outliers, our approach does not involve costly point-to-point computation in organizing the data into clusters. Another advantage is the reduced network size. We use the classification capability of the K-Map on selected dimensions of data set in detecting anomalies. Randomly selected subsets that contain both attacks and normal records from the KDD Cup 1999 benchmark data are used to train the hierarchical net. We use a confidence measure to label the clusters. Then we use the test set from the same KDD Cup 1999 benchmark to test the hierarchical net. We show that a hierarchical K-Map in which each layer operates on a small subset of the feature space is superior to a single-layer K-Map operating on the whole feature space in detecting a variety of attacks in terms of detection rate as well as false positive rate. PMID:15828658
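
    A toy, single-level winner-take-all map in the spirit of the K-Map described above (not the multilevel hierarchical system from the paper): units are trained on feature vectors, each record is scored by its distance to the winning unit, and records far from every unit are flagged. Feature dimensions, unit counts, and thresholds are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
normal = rng.normal(0.0, 1.0, size=(2000, 4))     # "normal" feature vectors

n_units, lr, epochs = 16, 0.1, 20
units = normal[rng.choice(len(normal), n_units, replace=False)].copy()

# Winner-take-all training: move only the closest unit toward each sample.
for _ in range(epochs):
    for x in normal[rng.permutation(len(normal))]:
        w = np.argmin(np.linalg.norm(units - x, axis=1))
        units[w] += lr * (x - units[w])

def quantization_error(x):
    """Distance to the winning unit; large values suggest an anomaly."""
    return np.linalg.norm(units - x, axis=1).min()

threshold = np.percentile([quantization_error(x) for x in normal], 99.5)
attack = np.array([5.0, -4.0, 6.0, 0.0])
print(quantization_error(attack) > threshold)
```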

  1. Anomaly Detection in Test Equipment via Sliding Mode Observers

    NASA Technical Reports Server (NTRS)

    Solano, Wanda M.; Drakunov, Sergey V.

    2012-01-01

    Nonlinear observers were originally developed based on the ideas of variable structure control, and for the purpose of detecting disturbances in complex systems. In this anomaly detection application, these observers were designed for estimating the distributed state of fluid flow in a pipe described by a class of advection equations. The observer algorithm uses collected data in a piping system to estimate the distributed system state (pressure and velocity along a pipe containing liquid gas propellant flow) using only boundary measurements. These estimates are then used to further estimate and localize possible anomalies such as leaks or foreign objects, and instrumentation metering problems such as incorrect flow meter orifice plate size. The observer algorithm has the following parts: a mathematical model of the fluid flow, observer control algorithm, and an anomaly identification algorithm. The main functional operation of the algorithm is in creating the sliding mode in the observer system implemented as software. Once the sliding mode starts in the system, the equivalent value of the discontinuous function in sliding mode can be obtained by filtering out the high-frequency chattering component. In control theory, "observers" are dynamic algorithms for the online estimation of the current state of a dynamic system by measurements of an output of the system. Classical linear observers can provide optimal estimates of a system state in case of uncertainty modeled by white noise. For nonlinear cases, the theory of nonlinear observers has been developed and its success is mainly due to the sliding mode approach. Using the mathematical theory of variable structure systems with sliding modes, the observer algorithm is designed in such a way that it steers the output of the model to the output of the system obtained via a variety of sensors, in spite of possible mismatches between the assumed model and actual system. The unique properties of sliding mode control
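
    A minimal lumped-parameter illustration of the sliding-mode mechanism described above (not the distributed advection-equation observer): the observer injects a discontinuous term driven by the output error, and the low-pass-filtered "equivalent" value of that term estimates an unknown disturbance such as a leak. The plant model and gains are assumptions.

```python
import numpy as np

# Plant: first-order flow dynamics with an unknown disturbance d (e.g., a leak)
# x_dot = -a*x + u + d,  measured output y = x.
a, u, d_true = 1.0, 2.0, -0.5
dt, T = 1e-3, 10.0
n = int(T / dt)

x = 0.0          # plant state
xh = 0.3         # observer state (deliberately wrong initial condition)
L = 5.0          # discontinuous injection gain (must exceed |d|)
tau = 0.05       # low-pass filter time constant for the equivalent injection
d_hat = 0.0

for _ in range(n):
    y = x
    v = L * np.sign(y - xh)                 # sliding-mode injection
    x += dt * (-a * x + u + d_true)         # plant (true dynamics)
    xh += dt * (-a * xh + u + v)            # observer (no knowledge of d)
    d_hat += dt / tau * (v - d_hat)         # filter out the chattering

print(round(d_hat, 2), "vs true disturbance", d_true)
```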

  2. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    NASA Technical Reports Server (NTRS)

    Viswanathan, Arun

    2012-01-01

    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/ Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a significantly poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream perturbing detector performance. Prior research has focused on understanding the impact of this

  3. Post-processing for improving hyperspectral anomaly detection accuracy

    NASA Astrophysics Data System (ADS)

    Wu, Jee-Cheng; Jiang, Chi-Ming; Huang, Chen-Liang

    2015-10-01

    Anomaly detection is an important topic in the exploitation of hyperspectral data. Based on the Reed-Xiaoli (RX) detector and a morphology operator, this research proposes a novel technique for improving the accuracy of hyperspectral anomaly detection. Firstly, the RX-based detector is used to process a given input scene. Then, a post-processing scheme using a morphology operator is employed to detect those pixels around high-scoring anomaly pixels. Tests were conducted using two real hyperspectral images with ground truth information, and the results, based on receiver operating characteristic curves, illustrated that the proposed method reduced the false alarm rates of the RX-based detector.
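
    The post-processing step described above can be sketched as a morphological dilation of the high-scoring RX pixels, so that pixels adjacent to strong anomalies are also marked for inspection; the hyperspectral cube and threshold below are synthetic.

```python
import numpy as np
from scipy.ndimage import binary_dilation

rng = np.random.default_rng(6)
h, w, bands = 64, 64, 30
cube = rng.normal(0.0, 1.0, size=(h, w, bands))
cube[30:33, 40:43, :] += 4.0                    # small synthetic anomaly

# Global RX score: Mahalanobis distance of each pixel to the scene background.
flat = cube.reshape(-1, bands)
centered = flat - flat.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(flat, rowvar=False))
rx = np.einsum('ij,jk,ik->i', centered, cov_inv, centered).reshape(h, w)

# Post-processing: dilate the high-score mask to pick up neighboring pixels.
seed = rx > np.percentile(rx, 99.9)
detection = binary_dilation(seed, structure=np.ones((3, 3), dtype=bool))
print(seed.sum(), "->", detection.sum(), "detected pixels")
```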

  4. Computationally efficient strategies to perform anomaly detection in hyperspectral images

    NASA Astrophysics Data System (ADS)

    Rossi, Alessandro; Acito, Nicola; Diani, Marco; Corsini, Giovanni

    2012-11-01

    In remote sensing, hyperspectral sensors are effectively used for target detection and recognition because of their high spectral resolution that allows discrimination of different materials in the sensed scene. When a priori information about the spectrum of the targets of interest is not available, target detection turns into anomaly detection (AD), i.e. searching for objects that are anomalous with respect to the scene background. In the field of AD, anomalies can generally be associated to observations that statistically move away from background clutter, the latter being either a local neighborhood surrounding the observed pixel or a large part of the image. In this context, much effort has been devoted to reducing the computational load of AD algorithms so as to furnish information for real-time decision making. In this work, a sub-class of AD methods is considered that aims at detecting small rare objects that are anomalous with respect to their local background. Such techniques are not only characterized by mathematical tractability but also allow the design of real-time strategies for AD. Within these methods, one of the most established anomaly detectors is the RX algorithm, which is based on a local Gaussian model of the background. In the literature, the RX decision rule has been employed to develop computationally efficient algorithms implemented in real-time systems. In this work, a survey of computationally efficient methods to implement the RX detector is presented, in which advanced algebraic strategies are exploited to speed up the estimation of the covariance matrix and of its inverse. The comparison of the overall number of operations required by the different implementations of the RX algorithm is given and discussed while varying the RX parameters, in order to show the computational improvements achieved with the introduced algebraic strategies.
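
    One algebraic shortcut in the spirit of the strategies discussed above is to avoid forming an explicit covariance inverse: a single Cholesky factorization followed by triangular solves yields the same Mahalanobis scores with better numerical behavior. A minimal global-RX sketch under that assumption, on synthetic data:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def rx_scores(cube):
    """Global RX anomaly scores via Cholesky factorization of the
    background covariance (no explicit matrix inverse)."""
    h, w, bands = cube.shape
    flat = cube.reshape(-1, bands)
    centered = flat - flat.mean(axis=0)
    cov = np.cov(flat, rowvar=False)
    factor = cho_factor(cov)                    # one O(bands^3) factorization
    solved = cho_solve(factor, centered.T)      # triangular solves, no inverse
    return np.einsum('ij,ji->i', centered, solved).reshape(h, w)

rng = np.random.default_rng(7)
cube = rng.normal(size=(64, 64, 30))
cube[10, 10, :] += 5.0                          # single anomalous pixel
scores = rx_scores(cube)
print(np.unravel_index(scores.argmax(), scores.shape))
```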

  5. Detection of Anomalies in Hydrometric Data Using Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Lauzon, N.; Lence, B. J.

    2002-12-01

    This work focuses on the detection of anomalies in hydrometric data sequences, such as 1) outliers, which are individual data having statistical properties that differ from those of the overall population; 2) shifts, which are sudden changes over time in the statistical properties of the historical records of data; and 3) trends, which are systematic changes over time in the statistical properties. For the purpose of the design and management of water resources systems, it is important to be aware of these anomalies in hydrometric data, for they can induce a bias in the estimation of water quantity and quality parameters. These anomalies may be viewed as specific patterns affecting the data, and therefore pattern recognition techniques can be used for identifying them. However, the number of possible patterns is very large for each type of anomaly and consequently large computing capacities are required to account for all possibilities using the standard statistical techniques, such as cluster analysis. Artificial intelligence techniques, such as the Kohonen neural network and fuzzy c-means, are clustering techniques commonly used for pattern recognition in several areas of engineering and have recently begun to be used for the analysis of natural systems. They require much less computing capacity than the standard statistical techniques, and therefore are well suited for the identification of outliers, shifts and trends in hydrometric data. This work constitutes a preliminary study, using synthetic data representing hydrometric data that can be found in Canada. The analysis of the results obtained shows that the Kohonen neural network and fuzzy c-means are reasonably successful in identifying anomalies. This work also addresses the problem of uncertainties inherent to the calibration procedures that fit the clusters to the possible patterns for both the Kohonen neural network and fuzzy c-means. Indeed, for the same database, different sets of clusters can be

  6. Identification and detection of anomalies through SSME data analysis

    NASA Technical Reports Server (NTRS)

    Pereira, Lisa; Ali, Moonis

    1990-01-01

    The goal of the ongoing research described in this paper is to analyze real-time ground test data in order to identify patterns associated with the anomalous engine behavior, and on the basis of this analysis to develop an expert system which detects anomalous engine behavior in the early stages of fault development. A prototype of the expert system has been developed and tested on the high frequency data of two SSME tests, namely Test #901-0516 and Test #904-044. The comparison of our results with the post-test analyses indicates that the expert system detected the presence of the anomalies in a significantly early stage of fault development.

  7. Hyperspectral anomaly detection method based on auto-encoder

    NASA Astrophysics Data System (ADS)

    Bati, Emrecan; Çalışkan, Akın; Koz, Alper; Alatan, A. A.

    2015-10-01

    A major drawback of most of the existing hyperspectral anomaly detection methods is the lack of an efficient background representation, which can successfully adapt to the varying complexity of hyperspectral images. In this paper, we propose a novel anomaly detection method which represents the hyperspectral scenes of different complexity with the state-of-the-art representation learning method, namely the auto-encoder. The proposed method first encodes the spectral image into a sparse code, then decodes the coded image, and finally, assesses the coding error at each pixel as a measure of anomaly. A Predictive Sparse Decomposition Auto-encoder is utilized in the proposed anomaly detection method due to its efficient joint learning of the encoding and decoding functions. The performance of the proposed anomaly detection method is tested on both visible-near infrared (VNIR) and long wave infrared (LWIR) hyperspectral images and compared with the conventional anomaly detection method, namely the Reed-Xiaoli (RX) detector. The experiments have verified the superiority of the proposed anomaly detection method in terms of receiver operating characteristics (ROC) performance.
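
    A compact reconstruction-error sketch of the auto-encoder idea (a plain auto-encoder, not the Predictive Sparse Decomposition model used in the paper): a network trained on background spectra reconstructs them well, so pixels with large coding error are scored as anomalous. The network size, training schedule, and data are illustrative; PyTorch is assumed to be available.

```python
import numpy as np
import torch
from torch import nn

rng = np.random.default_rng(8)
bands = 50
background = rng.normal(0.0, 1.0, size=(4000, bands)).astype(np.float32)

model = nn.Sequential(
    nn.Linear(bands, 16), nn.ReLU(),   # encoder (bottleneck of 16 units)
    nn.Linear(16, bands),              # decoder
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.from_numpy(background)
for _ in range(200):                   # train on background spectra only
    opt.zero_grad()
    loss = loss_fn(model(x), x)
    loss.backward()
    opt.step()

def coding_error(spectrum):
    """Per-pixel anomaly score: reconstruction error of the auto-encoder."""
    with torch.no_grad():
        s = torch.from_numpy(spectrum.astype(np.float32))
        return float(((model(s) - s) ** 2).mean())

threshold = np.percentile([coding_error(b) for b in background[:500]], 99)
anomalous_pixel = rng.normal(3.0, 1.0, size=bands)
print(coding_error(anomalous_pixel) > threshold)
```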

  8. Load characterization and anomaly detection for voice over IP traffic.

    PubMed

    Mandjes, Michel; Saniee, Iraj; Stolyar, Alexander L

    2005-09-01

    We consider the problem of traffic anomaly detection in IP networks. Traffic anomalies typically arise when there is focused overload or when a network element fails and it is desired to infer these purely from the measured traffic. We derive new general formulae for the variance of the cumulative traffic over a fixed time interval and show how the derived analytical expression simplifies for the case of voice over IP traffic, the focus of this paper. To detect load anomalies, we show it is sufficient to consider cumulative traffic over relatively long intervals such as 5 min. We also propose simple anomaly detection tests including detection of over/underload. This approach substantially extends the current practice in IP network management where only the first-order statistics and fixed thresholds are used to identify abnormal behavior. We conclude with the application of the scheme to field data from an operational network. PMID:16252813
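
    In its simplest form, the load test described above reduces to aggregating traffic over long intervals and flagging intervals whose cumulative count falls far outside the band implied by the mean and variance. A toy sketch with synthetic call-arrival counts (all rates and limits are assumptions, not the paper's formulae):

```python
import numpy as np

rng = np.random.default_rng(9)
seconds_per_bin = 300                       # 5-minute aggregation intervals
rate_per_sec = 12.0                         # nominal VoIP call-arrival rate

# Synthetic per-second arrivals for 24 hours, with one focused overload.
arrivals = rng.poisson(rate_per_sec, size=24 * 3600)
arrivals[30000:30300] += 40                 # 5-minute overload event

bins = arrivals.reshape(-1, seconds_per_bin).sum(axis=1)   # cumulative traffic
mean = rate_per_sec * seconds_per_bin
var = rate_per_sec * seconds_per_bin        # Poisson: variance equals mean

k = 4.0                                     # detection test: |X - m| > k*sigma
flags = np.abs(bins - mean) > k * np.sqrt(var)
print(np.where(flags)[0])                   # indices of anomalous 5-min bins
```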

  9. SCARES: A Spacecraft Control Anomaly Resolution Expert System

    NASA Technical Reports Server (NTRS)

    Hamilton, Marc

    1988-01-01

    The current pace of technological development is reflected in the increased mission lifetime of each new generation of satellite. Coupled with this has come a reduced availability of experts to provide technical assistance in satellite operation on a day to day basis. Given such an environment, an expert system based on an architecture for spacecraft anomaly resolution is discussed. By capturing deep knowledge about a spacecraft, the system is able to detect and diagnose faults better than previous conventional approaches. A prototype expert system named SCARES, applied only to the spacecraft attitude control system, is discussed. Extension of the prototype to handle anomalies in other systems of the satellite is also discussed.

  10. Lidar detection algorithm for time and range anomalies

    NASA Astrophysics Data System (ADS)

    Ben-David, Avishai; Davidson, Charles E.; Vanderbeek, Richard G.

    2007-10-01

    A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly where the question "is a target (aerosol cloud) present at range R within time t1 to t2" is addressed, and for range anomaly where the question "is a target present at time t within ranges R1 and R2" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one over range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO2 lidar measurements of bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed.

  11. Lidar detection algorithm for time and range anomalies.

    PubMed

    Ben-David, Avishai; Davidson, Charles E; Vanderbeek, Richard G

    2007-10-10

    A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly where the question "is a target (aerosol cloud) present at range R within time t(1) to t(2)" is addressed, and for range anomaly where the question "is a target present at time t within ranges R(1) and R(2)" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one over range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO(2) lidar measurements of bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed. PMID:17932542
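
    The two-hypothesis step at the end of the abstract can be sketched with a two-component Gaussian mixture fitted to detection scores by expectation-maximization; the crossing point of the weighted component densities gives a threshold, and the component parameters give rough probabilities of detection and false alarm. The scores below are synthetic.

```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(10)
scores = np.concatenate([rng.normal(0.0, 1.0, 950),     # background windows
                         rng.normal(5.0, 1.0, 50)])     # target-present windows

gmm = GaussianMixture(n_components=2, random_state=0).fit(scores.reshape(-1, 1))
means = gmm.means_.ravel()
stds = np.sqrt(gmm.covariances_.ravel())
bg, tg = np.argsort(means)                  # lower-mean = background component

# Threshold: the score at which the two weighted densities are equal.
grid = np.linspace(means[bg], means[tg], 2000)
dens = [gmm.weights_[c] * norm.pdf(grid, means[c], stds[c]) for c in (bg, tg)]
threshold = grid[np.argmin(np.abs(dens[0] - dens[1]))]

p_detect = 1.0 - norm.cdf(threshold, means[tg], stds[tg])
p_false_alarm = 1.0 - norm.cdf(threshold, means[bg], stds[bg])
print(round(threshold, 2), round(p_detect, 3), round(p_false_alarm, 4))
```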

  12. A New Methodology for Early Anomaly Detection of BWR Instabilities

    SciTech Connect

    Ivanov, K. N.

    2005-11-27

    The objective of the performed research is to develop an early anomaly detection methodology so as to enhance safety, availability, and operational flexibility of Boiling Water Reactor (BWR) nuclear power plants. The technical approach relies on suppression of potential power oscillations in BWRs by detecting small anomalies at an early stage and taking appropriate prognostic actions based on an anticipated operation schedule. The research utilizes a model of coupled (two-phase) thermal-hydraulic and neutron flux dynamics, which is used as a generator of time series data for anomaly detection at an early stage. The model captures critical nonlinear features of coupled thermal-hydraulic and nuclear reactor dynamics and (slow time-scale) evolution of the anomalies as non-stationary parameters. The time series data derived from this nonlinear non-stationary model serves as the source of information for generating the symbolic dynamics for characterization of model parameter changes that quantitatively represent small anomalies. The major focus of the presented research activity was on developing and qualifying algorithms of pattern recognition for power instability based on anomaly detection from time series data, which later can be used to formulate real-time decision and control algorithms for suppression of power oscillations for a variety of anticipated operating conditions. The research being performed in the framework of this project is essential to make significant improvement in the capability of thermal instability analyses for enhancing safety, availability, and operational flexibility of currently operating and next generation BWRs.

  13. Anomaly detection applied to a materials control and accounting database

    SciTech Connect

    Whiteson, R.; Spanks, L.; Yarbro, T.

    1995-09-01

    An important component of the national mission of reducing the nuclear danger includes accurate recording of the processing and transportation of nuclear materials. Nuclear material storage facilities, nuclear chemical processing plants, and nuclear fuel fabrication facilities collect and store large amounts of data describing transactions that involve nuclear materials. To maintain confidence in the integrity of these data, it is essential to identify anomalies in the databases. Anomalous data could indicate error, theft, or diversion of material. Yet, because of the complex and diverse nature of the data, analysis and evaluation are extremely tedious. This paper describes the authors' work in the development of analysis tools to automate the anomaly detection process for the Material Accountability and Safeguards System (MASS) that tracks and records the activities associated with accountable quantities of nuclear material at Los Alamos National Laboratory. Using existing guidelines that describe valid transactions, the authors have created an expert system that identifies transactions that do not conform to the guidelines. Thus, this expert system can be used to focus the attention of the expert or inspector directly on significant phenomena.

  14. Evaluation schemes for video and image anomaly detection algorithms

    NASA Astrophysics Data System (ADS)

    Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael

    2016-05-01

    Video anomaly detection is a critical research area in computer vision. It is a natural first step before applying object recognition algorithms. Many algorithms that detect anomalies (outliers) in videos and images have been introduced in recent years. However, these algorithms behave and perform differently based on differences in domains and tasks to which they are subjected. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability in a particular domain/task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. There are many evaluation metrics that have been used in the literature such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these different metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is very critical since the choice can introduce positive or negative bias in the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies. We analyze the biases introduced by these choices by measuring the performance of an existing anomaly detection algorithm.
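
    Whichever evaluation scheme is used to map detections to true and false decisions, the resulting labels and scores feed standard metrics. A short sketch computing ROC and precision-recall summaries with scikit-learn (synthetic labels and scores, illustrative only):

```python
import numpy as np
from sklearn.metrics import (roc_curve, auc, precision_recall_curve,
                             average_precision_score)

rng = np.random.default_rng(11)
# Ground-truth frame labels (1 = anomaly) and detector scores per frame,
# as produced by whichever evaluation scheme maps detections to frames.
y_true = np.concatenate([np.zeros(900, dtype=int), np.ones(100, dtype=int)])
y_score = np.concatenate([rng.normal(0.0, 1.0, 900), rng.normal(2.0, 1.0, 100)])

fpr, tpr, _ = roc_curve(y_true, y_score)
prec, rec, _ = precision_recall_curve(y_true, y_score)
print("ROC AUC:", round(auc(fpr, tpr), 3))
print("Average precision:", round(average_precision_score(y_true, y_score), 3))
```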

  15. Cross correlation anomaly detection system

    NASA Technical Reports Server (NTRS)

    Micka, E. Z. (Inventor)

    1975-01-01

    This invention provides a method for automatically inspecting the surface of an object, such as an integrated circuit chip, whereby the data obtained from the light reflected from the surface by a scanning light beam are automatically compared with data representing acceptable values for each unique surface. A signal output is provided indicating acceptance or rejection of the chip. Acceptance is based on predetermined statistical confidence intervals calculated from known good regions of the object being tested, or their representative values. The method can utilize a known good chip, a photographic mask from which the I.C. was fabricated, or a computer-stored replica of each pattern being tested.

  16. Anomaly Detection In Additively Manufactured Parts Using Laser Doppler Vibrometry

    SciTech Connect

    Hernandez, Carlos A.

    2015-09-29

    Additively manufactured parts are susceptible to non-uniform structure caused by the unique manufacturing process. This can lead to structural weakness or catastrophic failure. Using laser Doppler vibrometry and frequency response analysis, non-contact detection of anomalies in additively manufactured parts may be possible. Preliminary tests show promise for small scale detection, but more future work is necessary.

  17. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines.

    PubMed

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-01-01

    In a complex system, condition monitoring (CM) can collect the system working status. The condition is mainly sensed by the pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict the upcoming anomalies, faults, or failures. There is also some research which focuses on the faults or anomalies of the sensing element (i.e., sensor) to enhance the system reliability. However, existing approaches ignore the correlation between sensor selecting strategy and data anomaly detection, which can also improve the system reliability. To address this issue, we study a new scheme which includes sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets that are utilized to carry out the evaluation are provided by National Aeronautics and Space Administration (NASA) Ames Research Center and have been used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing the two different sensor selection strategies, the effectiveness of selection method on data anomaly detection is proved. PMID:27136561

  18. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines

    PubMed Central

    Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu

    2016-01-01

    In a complex system, condition monitoring (CM) can collect the system working status. The condition is mainly sensed by the pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict the upcoming anomalies, faults, or failures. There is also some research which focuses on the faults or anomalies of the sensing element (i.e., sensor) to enhance the system reliability. However, existing approaches ignore the correlation between sensor selecting strategy and data anomaly detection, which can also improve the system reliability. To address this issue, we study a new scheme which includes sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets that are utilized to carry out the evaluation are provided by National Aeronautics and Space Administration (NASA) Ames Research Center and have been used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing the two different sensor selection strategies, the effectiveness of selection method on data anomaly detection is proved. PMID:27136561
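
    A rough sketch of the two ingredients named above, on synthetic data: mutual information ranks candidate sensors against a condition-related target, and a Gaussian Process Regression model of one selected sensor from another flags samples whose residual leaves the predicted uncertainty band. Sensor models, noise levels, and thresholds are assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(12)
n = 400
health = np.linspace(1.0, 0.0, n)                     # latent degradation trend
sensors = np.column_stack([
    health + rng.normal(0, 0.03, n),                  # informative sensor
    0.5 * health + rng.normal(0, 0.03, n),            # informative sensor
    rng.normal(0, 1.0, n),                            # uninformative sensor
])

# 1) Sensor selection: keep the sensors most informative about the condition.
mi = mutual_info_regression(sensors, health, random_state=0)
selected = np.argsort(mi)[::-1][:2]

# 2) Anomaly detection: model one selected sensor from the other with GPR and
#    flag samples whose residual leaves the predicted uncertainty band.
noise_var = 0.01                                      # assumed sensor noise variance
X = sensors[:, [selected[1]]]
y = sensors[:, selected[0]].copy()
y[350] += 0.6                                         # injected sensor fault
gpr = GaussianProcessRegressor(alpha=noise_var).fit(X, y)
pred, std = gpr.predict(X, return_std=True)
flags = np.abs(y - pred) > 3.0 * np.sqrt(std ** 2 + noise_var)
print(np.where(flags)[0])                             # the injected fault index
```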

  19. Visual analytics of anomaly detection in large data streams

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.; Sharma, Ratnesh K.; Mehta, Abhay

    2009-01-01

    Most data streams usually are multi-dimensional, high-speed, and contain massive volumes of continuous information. They are seen in daily applications, such as telephone calls, retail sales, data center performance, and oil production operations. Many analysts want insight into the behavior of this data. They want to catch the exceptions in flight to reveal the causes of the anomalies and to take immediate action. To guide the user in finding the anomalies in the large data stream quickly, we derive a new automated neighborhood threshold marking technique, called AnomalyMarker. This technique is built on cell-based data streams and user-defined thresholds. We extend the scope of the data points around the threshold to include the surrounding areas. The idea is to define a focus area (marked area) which enables users to (1) visually group the interesting data points related to the anomalies (i.e., problems that occur persistently or occasionally) for observing their behavior; (2) discover the factors related to the anomaly by visualizing the correlations between the problem attribute with the attributes of the nearby data items from the entire multi-dimensional data stream. Mining results are quickly presented in graphical representations (i.e., tooltip) for the user to zoom into the problem regions. Different algorithms are introduced which try to optimize the size and extent of the anomaly markers. We have successfully applied this technique to detect data stream anomalies in large real-world enterprise server performance and data center energy management.

  20. [Anomaly Detection of Multivariate Time Series Based on Riemannian Manifolds].

    PubMed

    Xu, Yonghong; Hou, Xiaoying; Li, Shuting; Cui, Jie

    2015-06-01

    Multivariate time series problems widely exist in production and daily life. Anomaly detection has provided people with a lot of valuable information in the financial, hydrological and meteorological fields, and in research areas such as earthquakes, video surveillance and medicine. In order to find exceptions in a time sequence quickly and efficiently and present them in an intuitive way, in this study we combined Riemannian manifolds with statistical process control charts, using a sliding window and describing each window of the time sequence by its covariance matrix, to achieve anomaly detection in multivariate time series together with its visualization. We took simulated MA data streams and abnormal electrocardiogram data from MIT-BIH as experimental objects and verified the anomaly detection method. The results showed that the method was reasonable and effective. PMID:26485975
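
    A small sketch of the pipeline described above: describe each sliding window by its covariance matrix, measure how far each window is from a reference covariance with a Riemannian (here log-Euclidean) distance, and monitor that distance against a control-chart limit. Window sizes, the injected anomaly, and the limit are illustrative.

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_distance(A, B):
    """Log-Euclidean Riemannian distance between SPD matrices A and B."""
    return np.linalg.norm(logm(A) - logm(B), ord='fro')

rng = np.random.default_rng(13)
n, d, win = 2000, 3, 100
series = rng.multivariate_normal(np.zeros(d), np.eye(d), size=n)
series[1500:1600, 0] *= 3.0                     # variance anomaly in channel 0

ref = np.cov(series[:500].T)                    # reference (in-control) covariance
starts = range(0, n - win + 1, win // 2)        # half-overlapping sliding windows
dists = np.array([log_euclidean_distance(np.cov(series[s:s + win].T), ref)
                  for s in starts])

# Control-chart limit from the early, in-control windows.
baseline = dists[:8]
limit = baseline.mean() + 3.0 * baseline.std()
print([s for s, dd in zip(starts, dists) if dd > limit])
```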

  1. Near-Real Time Anomaly Detection for Scientific Sensor Data

    NASA Astrophysics Data System (ADS)

    Gallegos, I.; Gates, A.; Tweedie, C. E.; Goswami, S.; Jaimes, A.; Gamon, J. A.

    2011-12-01

    Environmental scientists use advanced sensor technology such as meteorological towers, wireless sensor networks and robotic trams equipped with sensors to perform data collection at remote research sites. Because the amount of environmental sensor data acquired in real time by such instruments is increasing, the ability to evaluate the accuracy of the data in near-real time and to check that the instrumentation is operating correctly is critical in order not to lose valuable time and information. The goal of the research is to define a software engineering-based solution that provides the foundation to define reusable templates for formally specifying data properties and to automatically generate programming code that can monitor data streams to identify anomalies in near-real time. The research effort has resulted in a data property categorization that is based on a literature survey of 15 projects that collected environmental data from sensors and a case study conducted in the Arctic. More than 500 published data properties were manually extracted and analyzed from the surveyed projects. The data property categorization revealed recurrent data patterns. Using these patterns and the Specification and Pattern System (SPS) from the software-engineering community as a model, we developed the Data Specification and Pattern System (D-SPS) to capture data properties. D-SPS is the foundation for the Data Property Specification (DaProS) prototype tool that assists scientists in the specification of sensor data properties. A series of experiments has been conducted in collaboration with experts working with Eddy covariance (EC) data from the Jornada Basin Experimental Range (JER) and with hyper-spectral data collected using robotic tram systems from the Arctic. The goal of the experiments was to determine whether the approach is effective for specifying data properties and identifying anomalies in sensor data. A complementary Sensor Data

  2. Anomalies.

    ERIC Educational Resources Information Center

    Online-Offline, 1999

    1999-01-01

    This theme issue on anomalies includes Web sites, CD-ROMs and software, videos, books, and additional resources for elementary and junior high school students. Pertinent activities are suggested, and sidebars discuss UFOs, animal anomalies, and anomalies from nature; and resources covering unexplained phenomena like crop circles, Easter Island,…

  3. Gravitational anomalies in the solar system?

    NASA Astrophysics Data System (ADS)

    Iorio, Lorenzo

    2015-02-01

    Mindful of the anomalous perihelion precession of Mercury discovered by Le Verrier in the second half of the nineteenth century and its successful explanation by Einstein with his General Theory of Relativity in the early years of the twentieth century, discrepancies between observed effects in our Solar system and their theoretical predictions on the basis of the currently accepted laws of gravitation applied to known matter-energy distributions have the potential of paving the way for remarkable advances in fundamental physics. This is particularly important now more than ever, given that most of the universe seems to be made of unknown substances dubbed Dark Matter and Dark Energy. Should this not directly be the case, Solar system anomalies could anyhow lead to advancements either in cumulative science, as shown by the discovery of Neptune in the first half of the nineteenth century, or in technology itself. Moreover, investigations in one of these directions can serendipitously enrich the other as well. The current status of some alleged gravitational anomalies in the Solar system is critically reviewed. They are: (a) Possible anomalous advances of planetary perihelia. (b) Unexplained orbital residuals of a recently discovered moon of Uranus (Mab). (c) The lingering unexplained secular increase of the eccentricity of the orbit of the Moon. (d) The so-called Faint Young Sun Paradox. (e) The secular decrease of the mass parameter of the Sun. (f) The Flyby Anomaly. (g) The Pioneer Anomaly. (h) The anomalous secular increase of the astronomical unit.

  4. Locality-constrained anomaly detection for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Liu, Jiabin; Li, Wei; Du, Qian; Liu, Kui

    2015-12-01

    Detecting a target with low occurrence probability from an unknown background in a hyperspectral image, namely anomaly detection, is of practical significance. The Reed-Xiaoli (RX) algorithm is considered a classic anomaly detector, which calculates the Mahalanobis distance between the local background and the pixel under test. Local RX, as an adaptive RX detector, employs a dual-window strategy to consider pixels within the frame between the inner and outer windows as the local background. However, the detector is sensitive if such a local region contains anomalous pixels (i.e., outliers). In this paper, a locality-constrained anomaly detector is proposed to remove outliers in the local background region before employing the RX algorithm. Specifically, a local linear representation is designed to exploit the internal relationship between linearly correlated pixels in the local background region and the pixel under test and its neighbors. Experimental results demonstrate that the proposed detector improves the original local RX algorithm.
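
    A minimal sketch of the baseline RX test (global background statistics and a Mahalanobis distance per pixel) is shown below on a synthetic cube; the dual-window local background and the locality-constrained outlier removal proposed in the record are not implemented here.

        import numpy as np

        def rx_detector(cube):
            """cube: (rows, cols, bands). Returns an RX score per pixel."""
            rows, cols, bands = cube.shape
            pixels = cube.reshape(-1, bands)
            mu = pixels.mean(axis=0)
            cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
            centered = pixels - mu
            # Mahalanobis distance of every pixel from the background statistics
            scores = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
            return scores.reshape(rows, cols)

        rng = np.random.default_rng(2)
        cube = rng.normal(size=(40, 40, 10))
        cube[20, 20] += 8.0                      # implant an anomalous pixel
        scores = rx_detector(cube)
        print("most anomalous pixel:", np.unravel_index(scores.argmax(), scores.shape))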

  5. The use of Compton scattering in detecting anomaly in soil-possible use in pyromaterial detection

    NASA Astrophysics Data System (ADS)

    Abedin, Ahmad Firdaus Zainal; Ibrahim, Noorddin; Zabidi, Noriza Ahmad; Demon, Siti Zulaikha Ngah

    2016-01-01

    Compton scattering is able to determine the signature of a land mine based on the dependency between density anomalies and the energy change of scattered photons. In this study, the 4.43 MeV gamma rays of an Am-Be source were used to perform Compton scattering. Two detectors of radius 1.9 cm were placed at a distance of 8 cm from the source. Thallium-doped sodium iodide NaI(Tl) detectors were used for detecting the gamma rays. Nine anomalies were used in this simulation. Each anomaly is a cylinder with a radius of 10 cm and a height of 8.9 cm, buried 5 cm deep in a soil bed measuring 80 cm in radius and 53.5 cm in height. Monte Carlo methods indicated that the scattering of photons is directly proportional to the density of the anomalies. The difference between the detector response with and without an anomaly, namely the contrast ratio, has a linear relationship with the density of the anomaly. Anomalies of air, wood and water give positive contrast ratio values, whereas explosive, sand, concrete, graphite, limestone and polyethylene give negative contrast ratio values. Overall, the contrast ratio values are greater than 2% for all anomalies. The strong contrast ratios result in a good detection capability and distinction between anomalies.

  6. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

    Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process at near real-time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system’s source code; however, anomaly detection properties, which often are embedded or hard-coded in the source code, are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly detection property or a modification to an existing property is needed. This poster describes the tool support that has been developed, based on software engineering techniques, to address these challenges. The overall tool support allows scientists to specify and reuse anomaly detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection at near-real time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. With guidance provided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that otherwise would have limited availability to the scientific community.

  7. Remote detection of geobotanical anomalies associated with hydrocarbon microseepage

    NASA Technical Reports Server (NTRS)

    Rock, B. N.

    1985-01-01

    As part of the continuing study of the Lost River, West Virginia NASA/Geosat Test Case Site, an extensive soil gas survey of the site was conducted during the summer of 1983. This soil gas survey has identified an order of magnitude methane, ethane, propane, and butane anomaly that is precisely coincident with the linear maple anomaly reported previously. This and other maple anomalies were previously suggested to be indicative of anaerobic soil conditions associated with hydrocarbon microseepage. In vitro studies support the view that anomalous distributions of native tree species tolerant of anaerobic soil conditions may be useful indicators of methane microseepage in heavily vegetated areas of the United States characterized by deciduous forest cover. Remote sensing systems which allow discrimination and mapping of native tree species and/or species associations will provide the exploration community with a means of identifying vegetation distributional anomalies indicative of microseepage.

  8. Robust and efficient anomaly detection using heterogeneous representations

    NASA Astrophysics Data System (ADS)

    Hu, Xing; Hu, Shiqiang; Xie, Jinhua; Zheng, Shiyou

    2015-05-01

    Various approaches have been proposed for video anomaly detection. Yet these approaches typically suffer from one or more limitations: they often characterize the pattern using its internal information, but ignore its external relationship which is important for local anomaly detection. Moreover, the high-dimensionality and the lack of robustness of pattern representation may lead to problems, including overfitting, increased computational cost and memory requirements, and high false alarm rate. We propose a video anomaly detection framework which relies on a heterogeneous representation to account for both the pattern's internal information and external relationship. The internal information is characterized by slow features learned by slow feature analysis from low-level representations, and the external relationship is characterized by the spatial contextual distances. The heterogeneous representation is compact, robust, efficient, and discriminative for anomaly detection. Moreover, both the pattern's internal information and external relationship can be taken into account in the proposed framework. Extensive experiments demonstrate the robustness and efficiency of our approach by comparison with the state-of-the-art approaches on the widely used benchmark datasets.

  9. SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) 2013

    SciTech Connect

    Gordon Rueff; Lyle Roybal; Denis Vollmer

    2013-01-01

    There is a significant need to protect the nation’s energy infrastructures from malicious actors using cyber methods. Supervisory Control and Data Acquisition (SCADA) systems may be vulnerable due to the insufficient security implemented during the design and deployment of these control systems. This is particularly true in older legacy SCADA systems that are still commonly in use. The purpose of INL’s research on the SCADA Protocol Anomaly Detection Utilizing Compression (SPADUC) project was to determine if and how data compression techniques could be used to identify and protect SCADA systems from cyber attacks. Initially, the concept was centered on how to train a compression algorithm to recognize normal control system traffic versus hostile network traffic. Because large portions of the TCP/IP message traffic (called packets) are repetitive, the concept of using compression techniques to differentiate “non-normal” traffic was proposed. In this manner, malicious SCADA traffic could be identified at the packet level before its payload completes. Previous research has shown that SCADA network traffic has traits desirable for compression analysis. This work investigated three different approaches to identify malicious SCADA network traffic using compression techniques. The preliminary analyses and results presented herein are clearly able to differentiate normal from malicious network traffic at the packet level at a very high confidence level for the conditions tested. Additionally, the master dictionary approach used in this research appears to initially provide a meaningful way to categorize and compare packets within a communication channel.
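
    To illustrate the general dictionary-compression idea (not INL's master-dictionary implementation), the hedged sketch below compresses packets against a dictionary built from normal traffic and treats packets that compress poorly as suspicious; the payload bytes and the 0.5 ratio threshold are invented for the example.

        import zlib

        def compressed_ratio(packet: bytes, zdict: bytes) -> float:
            """Ratio of compressed size to raw size, using a preset dictionary."""
            comp = zlib.compressobj(level=9, zdict=zdict)
            body = comp.compress(packet) + comp.flush()
            return len(body) / max(len(packet), 1)

        # Hypothetical "normal" SCADA-like payload pattern used to build the dictionary
        normal_pkt = b"\x01\x03\x00\x00\x00\x0a\x00\x00" * 16
        dictionary = normal_pkt * 4

        suspect_pkt = bytes(range(128))          # high-entropy payload unlike normal traffic

        for name, pkt in (("normal", normal_pkt), ("suspect", suspect_pkt)):
            r = compressed_ratio(pkt, dictionary)
            print(f"{name}: ratio={r:.2f}", "-> ANOMALY" if r > 0.5 else "-> ok")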

  10. Solar cell anomaly detection method and apparatus

    NASA Technical Reports Server (NTRS)

    Miller, Emmett L. (Inventor); Shumka, Alex (Inventor); Gauthier, Michael K. (Inventor)

    1981-01-01

    A method is provided for detecting cracks and other imperfections in a solar cell, which includes scanning a narrow light beam back and forth across the cell in a raster pattern, while monitoring the electrical output of the cell to find locations where the electrical output varies significantly. The electrical output can be monitored on a television type screen containing a raster pattern with each point on the screen corresponding to a point on the solar cell surface, and with the brightness of each point on the screen corresponding to the electrical output from the cell which was produced when the light beam was at the corresponding point on the cell. The technique can be utilized to scan a large array of interconnected solar cells, to determine which ones are defective.

  11. A spring window for geobotanical anomaly detection

    NASA Technical Reports Server (NTRS)

    Bell, R.; Labovitz, M. L.; Masuoka, E. J.

    1985-01-01

    The observation of senescence of deciduous vegetation to detect soil heavy metal mineralization is discussed. A gridded sampling of two sites of Quercus alba L. in south-central Virginia in 1982 is studied. The data reveal that smaller leaf blade lengths are observed in the soil site with copper, lead, and zinc concentrations. A random study in 1983 of red and white Q. rubra L., Q. prinus L., and Acer rubrum L., to confirm previous results is described. The observations of blade length and bud breaks show a 7-10 day lag in growth in the mineral site for the oak trees; however, the maple trees are not influenced by the minerals.

  12. Sensor Anomaly Detection in Wireless Sensor Networks for Healthcare

    PubMed Central

    Haque, Shah Ahsanul; Rahman, Mustafizur; Aziz, Syed Mahfuzul

    2015-01-01

    Wireless Sensor Networks (WSN) are vulnerable to various sensor faults and faulty measurements. This vulnerability hinders efficient and timely response in various WSN applications, such as healthcare. For example, faulty measurements can create false alarms which may require unnecessary intervention from healthcare personnel. Therefore, an approach to differentiate between real medical conditions and false alarms will improve remote patient monitoring systems and quality of healthcare service afforded by WSN. In this paper, a novel approach is proposed to detect sensor anomaly by analyzing collected physiological data from medical sensors. The objective of this method is to effectively distinguish false alarms from true alarms. It predicts a sensor value from historic values and compares it with the actual sensed value for a particular instance. The difference is compared against a threshold value, which is dynamically adjusted, to ascertain whether the sensor value is anomalous. The proposed approach has been applied to real healthcare datasets and compared with existing approaches. Experimental results demonstrate the effectiveness of the proposed system, providing high Detection Rate (DR) and low False Positive Rate (FPR). PMID:25884786
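
    The predict-compare-threshold loop can be sketched as below, assuming a simple moving-average predictor and a dynamically adjusted threshold of k times the recent residual spread; the function, its parameters, and the heart-rate values are illustrative and not the paper's actual predictor or tuning.

        import numpy as np

        def detect_sensor_anomalies(values, history=10, k=3.0, floor=1.0):
            """Flag readings whose deviation from a moving-average prediction exceeds a
            dynamically adjusted threshold (k times the recent residual spread)."""
            values = np.asarray(values, dtype=float)
            flags, residuals = [], []
            for t in range(history, values.size):
                predicted = values[t - history:t].mean()
                residual = abs(values[t] - predicted)
                spread = np.std(residuals[-history:]) if residuals else floor
                flags.append(residual > k * max(spread, floor))
                residuals.append(residual)
            return np.array(flags)

        heart_rate = [72, 71, 73, 72, 74, 73, 72, 75, 74, 73, 72, 74, 73, 140, 72, 73]
        print(detect_sensor_anomalies(heart_rate, history=5))   # only the 140 reading flags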

  13. Gaussian Process for Activity Modeling and Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Liao, W.; Rosenhahn, B.; Yang, M. Ying

    2015-08-01

    Complex activity modeling and identification of anomalies is one of the most interesting and desired capabilities for automated video behavior analysis. A number of different approaches have been proposed in the past to tackle this problem. There are two main challenges for activity modeling and anomaly detection: 1) most existing approaches require sufficient data and supervision for learning; 2) the most interesting abnormal activities arise rarely and are ambiguous among typical activities, i.e., hard to define precisely. In this paper, we propose a novel approach to model complex activities and detect anomalies by using non-parametric Gaussian Process (GP) models in a crowded and complicated traffic scene. In comparison with parametric models such as HMMs, GP models are non-parametric and have their advantages. Our GP models exploit implicit spatial-temporal dependence among local activity patterns. The learned GP regression models give a probabilistic prediction of regional activities at the next time interval based on observations at present. An anomaly is detected by comparing the actual observations with the prediction in real time. We verify the effectiveness and robustness of the proposed model on the QMUL Junction Dataset. Furthermore, we provide a publicly available manually labeled ground truth of this data set.
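
    A minimal sketch of the predict-and-compare step, assuming scikit-learn's GaussianProcessRegressor on a synthetic one-dimensional activity signal; the paper's regional activity features, scene, and thresholding are not reproduced, and the 3-sigma rule below is only an illustrative choice.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(3)
        t_train = np.linspace(0, 10, 80).reshape(-1, 1)
        y_train = np.sin(t_train).ravel() + rng.normal(0, 0.1, 80)   # "normal" activity level

        gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
        gp.fit(t_train, y_train)

        # Two candidate observations at the same held-out time point: one consistent
        # with the learned pattern, one not.
        t_test = np.array([[5.25], [5.25]])
        observed = np.array([np.sin(5.25), 3.0])
        mean, std = gp.predict(t_test, return_std=True)
        for obs, m, s in zip(observed, mean, std):
            print("anomaly" if abs(obs - m) > 3 * s else "normal")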

  14. Limitations of Aneuploidy and Anomaly Detection in the Obese Patient.

    PubMed

    Zozzaro-Smith, Paula; Gray, Lisa M; Bacak, Stephen J; Thornburg, Loralei L

    2014-01-01

    Obesity is a worldwide epidemic and can have a profound effect on pregnancy risks. Obese patients tend to be older and are at increased risk for structural fetal anomalies and aneuploidy, making screening options critically important for these women. Failure rates for first-trimester nuchal translucency (NT) screening increase with obesity, while the ability to detect soft-markers declines, limiting ultrasound-based screening options. Obesity also decreases the chances of completing the anatomy survey and increases the residual risk of undetected anomalies. Additionally, non-invasive prenatal testing (NIPT) is less likely to provide an informative result in obese patients. Understanding the limitations and diagnostic accuracy of aneuploidy and anomaly screening in obese patients can help guide clinicians in counseling patients on the screening options. PMID:26237478

  15. Anomaly detection for machine learning redshifts applied to SDSS galaxies

    NASA Astrophysics Data System (ADS)

    Hoyle, Ben; Rau, Markus Michael; Paech, Kerstin; Bonnett, Christopher; Seitz, Stella; Weller, Jochen

    2015-10-01

    We present an analysis of anomaly detection for machine learning redshift estimation. Anomaly detection allows the removal of poor training examples, which can adversely influence redshift estimates. Anomalous training examples may be photometric galaxies with incorrect spectroscopic redshifts, or galaxies with one or more poorly measured photometric quantities. We select 2.5 million `clean' SDSS DR12 galaxies with reliable spectroscopic redshifts, and 6730 `anomalous' galaxies with spectroscopic redshift measurements which are flagged as unreliable. We contaminate the clean base galaxy sample with galaxies with unreliable redshifts and attempt to recover the contaminating galaxies using the Elliptical Envelope technique. We then train four machine learning architectures for redshift analysis on both the contaminated sample and on the preprocessed `anomaly-removed' sample and measure redshift statistics on a clean validation sample generated without any preprocessing. We find an improvement on all measured statistics of up to 80 per cent when training on the anomaly removed sample as compared with training on the contaminated sample for each of the machine learning routines explored. We further describe a method to estimate the contamination fraction of a base data sample.
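
    A minimal sketch of the preprocessing step using scikit-learn's EllipticEnvelope, with synthetic stand-ins for the photometric features and redshifts; the SDSS data and the four machine learning architectures of the record are not reproduced here.

        import numpy as np
        from sklearn.covariance import EllipticEnvelope
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(4)
        colours = rng.normal(size=(2000, 4))                       # clean photometry
        redshift = colours @ np.array([0.3, 0.2, -0.1, 0.4]) + 0.5
        bad = rng.uniform(-8, 8, size=(100, 4))                    # "anomalous" galaxies
        bad_z = rng.uniform(0, 2, size=100)

        X = np.vstack([colours, bad])
        y = np.concatenate([redshift, bad_z])

        envelope = EllipticEnvelope(contamination=0.05, random_state=0).fit(X)
        keep = envelope.predict(X) == 1                            # +1 = inlier, -1 = outlier

        model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X[keep], y[keep])
        print("kept", keep.sum(), "of", len(X), "training galaxies")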

  16. Security inspection in ports by anomaly detection using hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Rivera, Javier; Valverde, Fernando; Saldaña, Manuel; Manian, Vidya

    2013-05-01

    Applying hyperspectral imaging technology in port security is crucial for the detection of possible threats or illegal activities. One of the most common problems that cargo suffers is tampering. This represents a danger to society because it creates a channel to smuggle illegal and hazardous products. If a cargo is altered, security inspections on that cargo should contain anomalies that reveal the nature of the tampering. Hyperspectral images can detect anomalies by gathering information through multiple electromagnetic bands. The spectra extracted from these bands can be used to detect surface anomalies from different materials. Based on this technology, a scenario was built in which a hyperspectral camera was used to inspect the cargo for any surface anomalies and a user interface shows the results. The spectrum of items, altered by different materials that can be used to conceal illegal products, is analyzed and classified in order to provide information about the tampered cargo. The image is analyzed with a variety of techniques such as multiple feature extraction algorithms, autonomous anomaly detection, and target spectrum detection. The results will be exported to a workstation or mobile device in order to show them in an easy-to-use interface. This process could enhance the current capabilities of security systems that are already implemented, providing a more complete approach to detect threats and illegal cargo.

  17. Sparsity-driven anomaly detection for ship detection and tracking in maritime video

    NASA Astrophysics Data System (ADS)

    Shafer, Scott; Harguess, Josh; Forero, Pedro A.

    2015-05-01

    This work examines joint anomaly detection and dictionary learning approaches for identifying anomalies in persistent surveillance applications that require data compression. We have developed a sparsity-driven anomaly detector that can be used for learning dictionaries to address these challenges. In our approach, each training datum is modeled as a sparse linear combination of dictionary atoms in the presence of noise. The noise term is modeled as additive Gaussian noise and a deterministic term models the anomalies. However, no model for the statistical distribution of the anomalies is made. An estimator is postulated for a dictionary that exploits the fact that since anomalies by definition are rare, only a few anomalies will be present when considering the entire dataset. From this vantage point, we endow the deterministic noise term (anomaly-related) with a group-sparsity property. A robust dictionary learning problem is postulated where a group-lasso penalty is used to encourage most anomaly-related noise components to be zero. The proposed estimator achieves robustness by both identifying the anomalies and removing their effect from the dictionary estimate. Our approach is applied to the problem of ship detection and tracking from full-motion video with promising results.

  18. Energy Detection Based on Undecimated Discrete Wavelet Transform and Its Application in Magnetic Anomaly Detection

    PubMed Central

    Nie, Xinhua; Pan, Zhongming; Zhang, Dasha; Zhou, Han; Chen, Min; Zhang, Wenna

    2014-01-01

    Magnetic anomaly detection (MAD) is a passive approach for detection of a ferromagnetic target, and its performance is often limited by external noises. Considering that one major noise source is fractal noise (also called 1/f noise) with a power spectral density of 1/f^α (0 < α < 2, where α is the spectral parameter), an energy detection method based on the undecimated discrete wavelet transform (UDWT) is proposed in this paper. Firstly, the foundations of magnetic anomaly detection and the UDWT are introduced in brief, and a possible detection system based on a giant magneto-impedance (GMI) magnetic sensor is also given. Then our proposed energy detection based on the UDWT is described in detail, and the theoretical probabilities of false alarm and detection for a given detection threshold are presented. It is noticeable that no a priori assumptions regarding the ferromagnetic target or the magnetic noise probability are necessary for our method, and, different from the discrete wavelet transform (DWT), the UDWT is shift invariant. Finally, some simulations are performed and the results show that the detection performance of our proposed detector is better than that of the conventional energy detector even in Gaussian white noise, especially when the spectral parameter α is less than 1.0. In addition, a real-world experiment was done to demonstrate the advantages of the proposed method. PMID:25343484
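
    A hedged sketch of energy detection on undecimated (stationary) wavelet coefficients using PyWavelets, with a synthetic coloured-noise background and an invented target pulse; the GMI sensor model and the theoretical false-alarm threshold of the record are replaced here by a simple empirical threshold taken from a target-free calibration segment.

        import numpy as np
        import pywt

        rng = np.random.default_rng(5)
        n = 1024
        signal = np.cumsum(rng.normal(size=n)) * 0.05                       # crude coloured noise
        signal[500:530] += 0.5 * np.sin(2 * np.pi * 0.3 * np.arange(30))    # target signature

        # Undecimated (stationary) wavelet transform is shift-invariant;
        # pywt.swt needs len(signal) divisible by 2**level.
        coeffs = pywt.swt(signal, "db4", level=3)
        detail = coeffs[-1][1]                                   # finest-scale detail coefficients
        energy = np.convolve(detail ** 2, np.ones(32), mode="same")

        # Empirical threshold from a segment known (in this synthetic case) to be target-free
        threshold = energy[:400].mean() + 10 * energy[:400].std()
        hits = np.flatnonzero(energy > threshold)
        print("samples over threshold:", hits.min(), "to", hits.max())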

  19. Anomaly detection based on the statistics of hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Catterall, Stephen P.

    2004-10-01

    The purpose of this paper is to introduce a new anomaly detection algorithm for application to hyperspectral imaging (HSI) data. The algorithm uses characterisations of the joint (among wavebands) probability density function (pdf) of HSI data. Traditionally, the pdf has been assumed to be multivariate Gaussian or a mixture of multivariate Gaussians. Other distributions have been considered by previous authors, in particular Elliptically Contoured Distributions (ECDs). In this paper we focus on another distribution, which has only recently been defined and studied. This distribution has a more flexible and extensive set of parameters than the multivariate Gaussian does, yet the pdf takes on a relatively simple mathematical form. The result of all this is a model for the pdf of a hyperspectral image, consisting of a mixture of these distributions. Once a model for the pdf of a hyperspectral image has been obtained, it can be incorporated into an anomaly detector. The new anomaly detector is implemented and applied to some medium wave infra-red (MWIR) hyperspectral imagery. Comparison is made with a well-known anomaly detector, and it will be seen that the results are promising.

  20. GPR anomaly detection with robust principal component analysis

    NASA Astrophysics Data System (ADS)

    Masarik, Matthew P.; Burns, Joseph; Thelen, Brian T.; Kelly, Jack; Havens, Timothy C.

    2015-05-01

    This paper investigates the application of Robust Principal Component Analysis (RPCA) to ground penetrating radar as a means to improve GPR anomaly detection. The method consists of a preprocessing routine to smoothly align the ground and remove the ground response (haircut), followed by mapping to the frequency domain, applying RPCA, and then mapping the sparse component of the RPCA decomposition back to the time domain. A prescreener is then applied to the time-domain sparse component to perform anomaly detection. The emphasis of the RPCA algorithm on sparsity has the effect of significantly increasing the apparent signal-to-clutter ratio (SCR) as compared to the original data, thereby enabling improved anomaly detection. This method is compared to detrending (spatial-mean removal) and classical principal component analysis (PCA), and the RPCA-based processing is seen to provide substantial improvements in the apparent SCR over both of these alternative processing schemes. In particular, the algorithm has been applied to field-collected impulse GPR data and has shown significant improvement in terms of the ROC curve relative to detrending and PCA.
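
    For illustration, a simplified Robust PCA (principal component pursuit) solved by an inexact augmented Lagrangian scheme is sketched below on a synthetic low-rank "B-scan" with two point responses; the paper's preprocessing, frequency-domain mapping, and prescreener are omitted, and the parameter choices follow common defaults rather than the authors' settings.

        import numpy as np

        def robust_pca(M, tol=1e-7, max_iter=500):
            """Split M into a low-rank part L and a sparse part S (M ~ L + S)."""
            m, n = M.shape
            lam = 1.0 / np.sqrt(max(m, n))
            Y = np.zeros_like(M)
            S = np.zeros_like(M)
            mu = 0.25 * m * n / (np.abs(M).sum() + 1e-12)
            for _ in range(max_iter):
                # singular value thresholding -> low-rank clutter component L
                U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
                L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
                # soft thresholding -> sparse (anomaly) component S
                R = M - L + Y / mu
                S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
                Y += mu * (M - L - S)
                if np.linalg.norm(M - L - S) <= tol * np.linalg.norm(M):
                    break
            return L, S

        # Synthetic "B-scan": smooth low-rank background plus two point responses
        background = np.outer(np.sin(np.linspace(0, 3, 100)), np.cos(np.linspace(0, 3, 80)))
        data = background.copy()
        data[40, 20] += 2.0
        data[70, 60] += 2.0
        L, S = robust_pca(data)
        print("largest sparse-component entries:",
              np.unravel_index(np.argsort(np.abs(S).ravel())[-2:], S.shape))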

  1. An expert system for diagnosing anomalies of spacecraft

    NASA Technical Reports Server (NTRS)

    Lauriente, Michael; Durand, Rick; Vampola, AL; Koons, Harry C.; Gorney, David

    1994-01-01

    Although the analysis of anomalous behavior of satellites is difficult because it is a very complex process, it is important to be able to make an accurate assessment in a timely manner when an anomaly is observed. Spacecraft operators may have to take corrective action or 'safe' the spacecraft; space-environment forecasters may have to assess the environmental situation and issue warnings and alerts regarding hazardous conditions; and scientists and engineers may want to gain knowledge for future designs to mitigate the problems. Anomalies can be hardware problems, software errors, environmentally induced effects, or even the result of poor workmanship. Spacecraft anomalies attributable to electrostatic discharges have been known to cause command errors. A goal is to develop an automated system based on this concept to reduce the number of personnel required to operate large programs or missions such as the Hubble Space Telescope (HST) and Mission to Planet Earth (MTPE). Although expert systems to detect anomalous behavior of satellites during operations are established, diagnosis of the anomaly is a complex procedure and is a new development.

  2. Inflight and Preflight Detection of Pitot Tube Anomalies

    NASA Technical Reports Server (NTRS)

    Mitchell, Darrell W.

    2014-01-01

    The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates sensors' response time to the dynamic component (noise) found in the signal of these same sensors. This analysis technique has used existing electrical signals of Pitot tube sensors that result from measured processes during inflight conditions and/or induced signals in preflight conditions to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants. The application of this technology for the detection of aircraft anomalies is innovative. Instead of determining the health of process monitoring at a steady-state condition, this technology will be used to quickly inform the pilot when an air-speed indication becomes faulty under any flight condition as well as during preflight preparation.

  3. BEARS: a multi-mission anomaly response system

    NASA Astrophysics Data System (ADS)

    Roberts, Bryce A.

    2009-05-01

    The Mission Operations Group at UC Berkeley's Space Sciences Laboratory operates a highly automated ground station and presently a fleet of seven satellites, each with its own associated command and control console. However, the requirement for prompt anomaly detection and resolution is shared commonly between the ground segment and all spacecraft. The efficient, low-cost operation and "lights-out" staffing of the Mission Operations Group requires that controllers and engineers be notified of spacecraft and ground system problems around the clock. The Berkeley Emergency Anomaly and Response System (BEARS) is an in-house developed web- and paging-based software system that meets this need. BEARS was developed as a replacement for an existing emergency reporting software system that was too closed-source, platform-specific, expensive, and antiquated to expand or maintain. To avoid these limitations, the new system design leverages cross-platform, open-source software products such as MySQL, PHP, and Qt. Anomaly notifications and responses make use of the two-way paging capabilities of modern smart phones.

  4. Segmentation of laser range image for pipe anomaly detection

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Krys, Dennis

    2010-04-01

    Laser-based scanning can provide a precise surface profile. It has been widely applied to the inspection of pipe inner walls and is often used along with other types of sensors, like sonar and closed-circuit television (CCTV). These measurements can be used for pipe deterioration modeling and condition assessment. Geometric information needs to be extracted to characterize anomalies in the pipe profile. Since laser scanning measures distance, segmentation with a threshold is a straightforward way to isolate the anomalies. However, a threshold with a fixed distance value does not work well for the laser range image due to intensity inhomogeneity, which is caused by uncontrollable factors during the inspection. Thus, a local binary fitting (LBF) active contour model is employed in this work to process the laser range image, and an image phase congruency algorithm is adopted to provide the initial contour as required by the LBF method. The combination of these two approaches can successfully detect the anomalies in a laser range image.

  5. Detection of chiral anomaly and valley transport in Dirac semimetals

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng; Zhang, Enze; Liu, Yanwen; Chen, Zhigang; Liang, Sihang; Cao, Junzhi; Yuan, Xiang; Tang, Lei; Li, Qian; Gu, Teng; Wu, Yizheng; Zou, Jin; Xiu, Faxian

    Chiral anomaly is a non-conservation of chiral charge pumped by a topologically nontrivial gauge field, which has been predicted to exist in the emergent quasiparticle excitations in Dirac and Weyl semimetals. However, so far, such a pumping process has not been clearly demonstrated and lacks a convincing experimental identification. Here, we report the detection of the charge pumping effect and the related valley transport in Cd3As2 driven by external electric and magnetic fields (EB). We find that the chiral imbalance leads to a non-zero gyrotropic coefficient, which can be confirmed by the EB-generated Kerr effect. By applying B along the current direction, we observe a negative magnetoresistance despite the giant positive one in other directions, a clear indication of the chiral anomaly. Remarkably, a robust nonlocal response in valley diffusion originating from the chiral anomaly persists up to room temperature when B is parallel to E. The ability to manipulate the valley polarization in Dirac semimetals opens up a brand-new route to understand their fundamental properties through external fields and to utilize the chiral fermions in valleytronic applications.

  6. New models for hyperspectral anomaly detection and un-mixing

    NASA Astrophysics Data System (ADS)

    Bernhardt, M.; Heather, J. P.; Smith, M. I.

    2005-06-01

    It is now established that hyperspectral images of many natural backgrounds have statistics with fat-tails. In spite of this, many of the algorithms that are used to process them appeal to the multivariate Gaussian model. In this paper we consider biologically motivated generative models that might explain observed mixtures of vegetation in natural backgrounds. The degree to which these models match the observed fat-tailed distributions is investigated. Having shown how fat-tailed statistics arise naturally from the generative process, the models are put to work in new anomaly detection and un-mixing algorithms. The performance of these algorithms is compared with more traditional approaches.

  7. Inductive inference model of anomaly and misuse detection

    SciTech Connect

    Helman, P.

    1997-01-01

    Further consequences of the inductive inference model of anomaly and misuse detection are presented. The results apply to the design of both probability models for the inductive inference framework and to the design of W&S rule bases. The issues considered include: the role of misuse models M{sub A}, the selection of relevant sets of attributes and the aggregation of their values, the effect on a rule base of nonmaximal rules, and the partitioning of a set of attributes into a left hand and right hand side.

  8. Anomaly depth detection in trans-admittance mammography: a formula independent of anomaly size or admittivity contrast

    NASA Astrophysics Data System (ADS)

    Zhang, Tingting; Lee, Eunjung; Seo, Jin Keun

    2014-04-01

    Trans-admittance mammography (TAM) is a bioimpedance technique for breast cancer detection. It is based on the comparison of tissue conductivity: cancerous tissue is identified by its higher conductivity in comparison with the surrounding normal tissue. In TAM, the breast is compressed between two electrical plates (in a similar architecture to x-ray mammography). The bottom plate has many sensing point electrodes that provide two-dimensional images (trans-admittance maps) that are induced by voltage differences between the two plates. Multi-frequency admittance data (Neumann data) are measured over the range 50 Hz-500 kHz. TAM aims to determine the location and size of any anomaly from the multi-frequency admittance data. Various anomaly detection algorithms can be used to process TAM data to determine the transverse positions of anomalies. However, existing methods cannot reliably determine the depth or size of an anomaly. Breast cancer detection using TAM would be improved if the depth or size of an anomaly could also be estimated, properties that are independent of the admittivity contrast. A formula is proposed here that can estimate the depth of an anomaly independent of its size and the admittivity contrast. This depth estimation can also be used to derive an estimation of the size of the anomaly. The proposed estimations are verified rigorously under a simplified model. Numerical simulation shows that the proposed method also works well in general settings.

  9. Detecting errors and anomalies in computerized materials control and accountability databases

    SciTech Connect

    Whiteson, R.; Hench, K.; Yarbro, T.; Baumgart, C.

    1998-12-31

    The Automated MC and A Database Assessment project is aimed at improving anomaly and error detection in materials control and accountability (MC and A) databases and increasing confidence in the data that they contain. Anomalous data resulting in poor categorization of nuclear material inventories greatly reduces the value of the database information to users. Therefore it is essential that MC and A data be assessed periodically for anomalies or errors. Anomaly detection can identify errors in databases and thus provide assurance of the integrity of data. An expert system has been developed at Los Alamos National Laboratory that examines these large databases for anomalous or erroneous data. For several years, MC and A subject matter experts at Los Alamos have been using this automated system to examine the large amounts of accountability data that the Los Alamos Plutonium Facility generates. These data are collected and managed by the Material Accountability and Safeguards System, a near-real-time computerized nuclear material accountability and safeguards system. This year they have expanded the user base, customizing the anomaly detector for the varying requirements of different groups of users. This paper describes the progress in customizing the expert systems to the needs of the users of the data and reports on their results.

  10. Anomaly Detection in Multiple Scale for Insider Threat Analysis

    SciTech Connect

    Kim, Yoohwan; Sheldon, Frederick T; Hively, Lee M

    2012-01-01

    We propose a method to quantify malicious insider activity with statistical and graph-based analysis aided by semantic scoring rules. Different types of personal activities or interactions are monitored to form a set of directed weighted graphs. The semantic scoring rules assign higher scores to more significant and suspicious events. Then we build personal activity profiles in the form of score tables. Profiles are created in multiple scales, where the low-level profiles are aggregated toward more stable higher-level profiles within the subject or object hierarchy. Further, the profiles are created on different time scales such as day, week, or month. During operation, the insider's current activity profile is compared to the historical profiles to produce an anomaly score. For each subject with a high anomaly score, a subgraph of connected subjects is extracted to look for any related score movement. Finally the subjects are ranked by their anomaly scores to help the analysts focus on high-scored subjects. The threat-ranking component supports the interaction between the User Dashboard and the Insider Threat Knowledge Base portal. The portal includes a repository for historical results, i.e., adjudicated cases containing all of the information first presented to the user and including any additional insights to help the analysts. In this paper we show the framework of the proposed system and the operational algorithms.

  11. Structural Anomaly Detection Using Fiber Optic Sensors and Inverse Finite Element Method

    NASA Technical Reports Server (NTRS)

    Quach, Cuong C.; Vazquez, Sixto L.; Tessler, Alex; Moore, Jason P.; Cooper, Eric G.; Spangler, Jan. L.

    2005-01-01

    NASA Langley Research Center is investigating a variety of techniques for mitigating aircraft accidents due to structural component failure. One technique under consideration combines distributed fiber optic strain sensing with an inverse finite element method for detecting and characterizing structural anomalies that may provide early indication of airframe structure degradation. The technique identifies structural anomalies that result in observable changes in localized strain but do not impact the overall surface shape. Surface shape information is provided by an Inverse Finite Element Method that computes full-field displacements and internal loads using strain data from in-situ fiber optic sensors. This paper describes a prototype of such a system and reports results from a series of laboratory tests conducted on a test coupon subjected to increasing levels of damage.

  12. Anomaly detection of microstructural defects in continuous fiber reinforced composites

    NASA Astrophysics Data System (ADS)

    Bricker, Stephen; Simmons, J. P.; Przybyla, Craig; Hardie, Russell

    2015-03-01

    Ceramic matrix composites (CMC) with continuous fiber reinforcements have the potential to enable the next generation of high speed hypersonic vehicles and/or significant improvements in gas turbine engine performance due to their exhibited toughness when subjected to high mechanical loads at extreme temperatures (2200F+). Reinforced fiber composites (RFC) provide increased fracture toughness, crack growth resistance, and strength, though little is known about how stochastic variation and imperfections in the material affect material properties. In this work, tools are developed for quantifying anomalies within the microstructure at several scales. The detection and characterization of anomalous microstructure is a critical step in linking production techniques to properties, as well as in accurate material simulation and property prediction for the integrated computational materials engineering (ICME) of RFC based components. It is desired to find statistical outliers for any number of material characteristics such as fibers, fiber coatings, and pores. Here, fiber orientation, or `velocity', and `velocity' gradient are developed and examined for anomalous behavior. Categorizing anomalous behavior in the CMC is approached by multivariate Gaussian mixture modeling. A Gaussian mixture is employed to estimate the probability density function (PDF) of the features in question, and anomalies are classified by their likelihood of belonging to the statistical normal behavior for that feature.
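
    A minimal sketch of the mixture-model scoring step using scikit-learn's GaussianMixture, with synthetic two-dimensional stand-ins for the fiber 'velocity' and gradient features; the likelihood cut-off below is an illustrative choice, not the authors' criterion.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(7)
        normal_feats = np.vstack([rng.normal([0, 0], 0.2, size=(500, 2)),
                                  rng.normal([1, 1], 0.2, size=(500, 2))])
        gmm = GaussianMixture(n_components=2, random_state=0).fit(normal_feats)

        test = np.array([[0.05, -0.1], [1.1, 0.9], [3.0, -2.0]])    # last one is an outlier
        log_density = gmm.score_samples(test)
        cutoff = np.percentile(gmm.score_samples(normal_feats), 1)  # 1st-percentile likelihood
        print(log_density < cutoff)                                 # expected: [False False  True]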

  13. Unsupervised Anomaly Detection Based on Clustering and Multiple One-Class SVM

    NASA Astrophysics Data System (ADS)

    Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Kwon, Yongjin

    An intrusion detection system (IDS) has played an important role as a device to defend our networks from cyber attacks. However, since it is unable to detect unknown attacks, i.e., 0-day attacks, the ultimate challenge in the intrusion detection field is how we can exactly identify such an attack in an automated manner. Over the past few years, several studies on solving these problems have been made on anomaly detection using unsupervised learning techniques such as clustering, one-class support vector machines (SVM), etc. Although they enable one to construct intrusion detection models at low cost and effort, and have the capability to detect unforeseen attacks, they still have two main problems in intrusion detection: a low detection rate and a high false positive rate. In this paper, we propose a new anomaly detection method based on clustering and multiple one-class SVMs in order to improve the detection rate while maintaining a low false positive rate. We evaluated our method using the KDD Cup 1999 data set. Evaluation results show that our approach outperforms the existing algorithms reported in the literature, especially in the detection of unknown attacks.
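
    A hedged sketch of the cluster-then-one-class-SVM idea using scikit-learn: training traffic is partitioned with k-means, a separate one-class SVM is fitted per cluster, and a test point is anomalous if its own cluster's model rejects it. The two-dimensional synthetic features and the nu and gamma values below are illustrative, not the paper's KDD Cup 1999 setup.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.svm import OneClassSVM

        rng = np.random.default_rng(8)
        normal = np.vstack([rng.normal([0, 0], 0.3, (300, 2)),
                            rng.normal([5, 5], 0.3, (300, 2))])

        kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(normal)
        models = {c: OneClassSVM(nu=0.05, gamma="scale").fit(normal[kmeans.labels_ == c])
                  for c in range(kmeans.n_clusters)}

        def is_anomalous(x):
            """Route a sample to its nearest cluster and ask that cluster's model."""
            cluster = kmeans.predict(x.reshape(1, -1))[0]
            return models[cluster].predict(x.reshape(1, -1))[0] == -1

        print(is_anomalous(np.array([0.1, -0.2])))   # expected: False (normal traffic)
        print(is_anomalous(np.array([2.5, 2.5])))    # expected: True (far from both clusters)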

  14. A high-order statistical tensor based algorithm for anomaly detection in hyperspectral imagery.

    PubMed

    Geng, Xiurui; Sun, Kang; Ji, Luyan; Zhao, Yongchao

    2014-01-01

    Recently, high-order statistics have received more and more interest in the field of hyperspectral anomaly detection. However, most of the existing high-order statistics based anomaly detection methods require stepwise iterations since they are direct applications of blind source separation. Moreover, these methods usually produce multiple detection maps rather than a single anomaly distribution image. In this study, we exploit the concept of the coskewness tensor and propose a new anomaly detection method, which is called COSD (coskewness detector). COSD does not need iteration and can produce a single detection map. The experiments based on both simulated and real hyperspectral data sets verify the effectiveness of our algorithm. PMID:25366706

  15. Recursive SAM-based band selection for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    He, Yuanlei; Liu, Daizhi; Yi, Shihua

    2010-10-01

    Band selection has been widely used in hyperspectral image processing for dimension reduction. In this paper, a recursive SAM-based band selection (RSAM-BBS) method is proposed. Once two initial bands are given, RSAM-BBS is performed in a sequential manner, and at each step the band that can best describe the spectral separation of two hyperspectral signatures is added to the bands already selected, until the spectral angle reaches its maximum. In order to demonstrate the utility of the proposed band selection method, an anomaly detection algorithm is developed, which first extracts the anomalous target spectrum from the original image using the automatic target detection and classification algorithm (ATDCA), followed by maximum spectral screening (MSS) to estimate the background average spectrum, and then implements RSAM-BBS to select the bands that participate in the subsequent adaptive cosine estimator (ACE) target detection. As shown in the experimental results on the AVIRIS dataset, fewer than five bands selected by the RSAM-BBS can achieve detection performance comparable to that of the full bands.
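
    As a hedged illustration of the spectral-angle measure that drives the selection, the sketch below computes spectral angles on subsets of bands and greedily adds the band that most separates a target spectrum from a background spectrum; this simplified greedy loop stands in for the full RSAM-BBS recursion, and the ATDCA, MSS, and ACE stages are omitted.

        import numpy as np

        def spectral_angle(x, y):
            """Spectral angle (radians) between two spectra restricted to the same bands."""
            cos = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12)
            return float(np.arccos(np.clip(cos, -1.0, 1.0)))

        def greedy_band_selection(target, background, init=(0, 1), n_bands=5):
            """Greedily add the band that most increases the angle between the target
            and background spectra (a simplified stand-in for the recursive selection)."""
            selected = list(init)
            while len(selected) < n_bands:
                remaining = [b for b in range(len(target)) if b not in selected]
                best = max(remaining,
                           key=lambda b: spectral_angle(target[selected + [b]],
                                                        background[selected + [b]]))
                selected.append(best)
            return sorted(selected)

        rng = np.random.default_rng(11)
        background = np.abs(rng.normal(1.0, 0.1, 50))
        target = background.copy()
        target[20:25] += 1.5                     # anomalous target differs in a few bands
        print(greedy_band_selection(target, background))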

  16. A new approach for structural health monitoring by applying anomaly detection on strain sensor data

    NASA Astrophysics Data System (ADS)

    Trichias, Konstantinos; Pijpers, Richard; Meeuwissen, Erik

    2014-03-01

    Structural Health Monitoring (SHM) systems help to monitor critical infrastructures (bridges, tunnels, etc.) remotely and provide up-to-date information about their physical condition. In addition, they help to predict the structure's life and required maintenance in a cost-efficient way. Typically, inspection data gives insight into the structural health. The global structural behavior, and predominantly the structural loading, is generally measured with vibration and strain sensors. Acoustic emission sensors are more and more used for measuring global crack activity near critical locations. In this paper, we present a procedure for local structural health monitoring by applying Anomaly Detection (AD) on strain sensor data for sensors that are applied along the expected crack path. Sensor data is analyzed by automatic anomaly detection in order to find crack activity at an early stage. This approach targets the monitoring of critical structural locations, such as welds, near which strain sensors can be applied during construction, and/or locations with limited inspection possibilities during structural operation. We investigate several anomaly detection techniques to detect changes in statistical properties, indicating structural degradation. The most effective one is a novel polynomial fitting technique, which tracks slow changes in sensor data. Our approach has been tested on a representative test structure (bridge deck) in a lab environment, under constant and variable amplitude fatigue loading. In both cases, the evolving cracks at the monitored locations were successfully detected, autonomously, by our AD monitoring tool.
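
    The slow-change tracking idea can be sketched as follows, assuming a low-order polynomial fitted to successive windows of strain data and a simple control limit on the fitted trend; the window size, limit, and the synthetic "crack growth" drift are illustrative and not the tool's actual algorithm.

        import numpy as np

        def window_slopes(strain, window=200, deg=1):
            """Fit a low-order polynomial per window and return its leading coefficient,
            i.e., the slow trend in that window."""
            t = np.arange(window)
            slopes = []
            for start in range(0, len(strain) - window + 1, window):
                coeffs = np.polyfit(t, strain[start:start + window], deg)
                slopes.append(coeffs[0])
            return np.array(slopes)

        rng = np.random.default_rng(9)
        healthy = rng.normal(0, 1, 2000)                                # stationary strain cycles
        degrading = rng.normal(0, 1, 1000) + np.linspace(0, 8, 1000)    # slow drift (crack growth)
        slopes = window_slopes(np.concatenate([healthy, degrading]))

        baseline = slopes[:8]                                           # early, healthy windows
        limit = np.abs(baseline).mean() + 4 * np.abs(baseline).std()
        print("windows flagged as degrading:", np.flatnonzero(np.abs(slopes) > limit))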

  17. [Multi-DSP parallel processing technique of hyperspectral RX anomaly detection].

    PubMed

    Guo, Wen-Ji; Zeng, Xiao-Ru; Zhao, Bao-Wei; Ming, Xing; Zhang, Gui-Feng; Lü, Qun-Bo

    2014-05-01

    To satisfy the requirements of high speed, real-time operation, mass data storage, etc. for RX anomaly detection on hyperspectral image data, the present paper proposes a multi-DSP parallel processing solution for hyperspectral images based on the CPCI Express standard bus architecture. The hardware topology of the system combines the tight coupling of four DSPs sharing a data bus and memory unit with the interconnection of Link ports. On this hardware platform, by assigning a parallel processing task to each DSP in consideration of the spectral RX anomaly detection algorithm and the 3D structure of the spectral image data, a four-DSP parallel processing technique is proposed which computes the mean and covariance matrix of the whole image by spatially partitioning the image. The experimental results show that, for an equivalent detection effect, the proposed four-DSP parallel implementation of the RX anomaly detection algorithm achieves a time efficiency four times higher than a single-DSP process, overcoming the constraint that the DSP's internal storage capacity places on processing huge image data while meeting the real-time processing demands of the spectral data. PMID:25095443

  18. FRaC: a feature-modeling approach for semi-supervised and unsupervised anomaly detection

    PubMed Central

    Brodley, Carla; Slonim, Donna

    2011-01-01

    Anomaly detection involves identifying rare data instances (anomalies) that come from a different class or distribution than the majority (which are simply called “normal” instances). Given a training set of only normal data, the semi-supervised anomaly detection task is to identify anomalies in the future. Good solutions to this task have applications in fraud and intrusion detection. The unsupervised anomaly detection task is different: Given unlabeled, mostly-normal data, identify the anomalies among them. Many real-world machine learning tasks, including many fraud and intrusion detection tasks, are unsupervised because it is impractical (or impossible) to verify all of the training data. We recently presented FRaC, a new approach for semi-supervised anomaly detection. FRaC is based on using normal instances to build an ensemble of feature models, and then identifying instances that disagree with those models as anomalous. In this paper, we investigate the behavior of FRaC experimentally and explain why FRaC is so successful. We also show that FRaC is a superior approach for the unsupervised as well as the semi-supervised anomaly detection task, compared to well-known state-of-the-art anomaly detection methods, LOF and one-class support vector machines, and to an existing feature-modeling approach. PMID:22639542
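
    A hedged sketch of the feature-modelling idea: for each feature, a model trained on normal data predicts that feature from the others, and a record that strongly disagrees with its predictions receives a high anomaly score. FRaC itself uses ensembles of learners and an information-theoretic (surprisal) score; the single linear model per feature and the normalized squared residual below are simplifications, and all data are synthetic.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(10)
        a = rng.normal(size=500)
        # Normal data: feature 1 is (almost) exactly twice feature 0; feature 2 is independent
        normal = np.column_stack([a, 2 * a + rng.normal(0, 0.1, 500), rng.normal(size=500)])

        models, sigmas = [], []
        for j in range(normal.shape[1]):
            X = np.delete(normal, j, axis=1)
            model = LinearRegression().fit(X, normal[:, j])
            residual_std = np.std(normal[:, j] - model.predict(X)) + 1e-9
            models.append(model)
            sigmas.append(residual_std)

        def anomaly_score(record):
            """Sum of squared standardized disagreements with the per-feature models."""
            score = 0.0
            for j, (model, sigma) in enumerate(zip(models, sigmas)):
                pred = model.predict(np.delete(record, j).reshape(1, -1))[0]
                score += ((record[j] - pred) / sigma) ** 2
            return score

        print(anomaly_score(np.array([1.0, 2.0, 0.0])))     # consistent record: low score
        print(anomaly_score(np.array([1.0, -2.0, 0.0])))    # breaks the learned relation: high score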

  19. Feasibility of anomaly detection and characterization using trans-admittance mammography with 60 × 60 electrode array

    NASA Astrophysics Data System (ADS)

    Zhao, Mingkang; Wi, Hun; Lee, Eun Jung; Woo, Eung Je; In Oh, Tong

    2014-10-01

    Electrical impedance imaging has the potential to detect an early stage of breast cancer due to the higher admittivity values of tumors compared with those of normal breast tissues. The tumor size and extent of axillary lymph node involvement are important parameters to evaluate the breast cancer survival rate. Additionally, anomaly characterization is required to distinguish a malignant tumor from a benign tumor. In order to overcome the limitation of breast cancer detection using impedance measurement probes, we developed a high density trans-admittance mammography (TAM) system with a 60 × 60 electrode array and produced trans-admittance maps obtained at several frequency pairs. We applied the anomaly detection algorithm to the high density TAM system for estimating the volume and position of breast tumors. We tested four different sizes of anomaly with three different conductivity contrasts at four different depths. From multifrequency trans-admittance maps, we can readily observe the transversal position and estimate the volume and depth. In particular, the estimated depth values were accurate and independent of the size and conductivity contrast when applying the new formula using the Laplacian of the trans-admittance map. The volume estimation was dependent on the conductivity contrast between the anomaly and the background in the breast phantom. We characterized two test anomalies using frequency-difference trans-admittance data to eliminate the dependency on anomaly position and size. We confirmed the anomaly detection and characterization algorithm with the high density TAM system on bovine breast tissue. Both results show the feasibility of detecting the size and position of an anomaly and of tissue characterization for screening breast cancer.

  20. Online anomaly detection in crowd scenes via structure analysis.

    PubMed

    Yuan, Yuan; Fang, Jianwu; Wang, Qi

    2015-03-01

    Abnormal behavior detection in crowd scenes is a continuing challenge in the field of computer vision. To tackle this problem, this paper starts from a novel structure modeling of crowd behavior. We first propose an informative structural context descriptor (SCD) for describing the crowd individual, which originally introduces the potential energy function of particles' interforce in solid-state physics to intuitively conduct vision contextual cueing. To compute the crowd SCD variation effectively, we then design a robust multi-object tracker to associate the targets in different frames, which employs the incremental analytical ability of the 3-D discrete cosine transform (DCT). By online spatial-temporal analysis of the SCD variation of the crowd, the abnormality is finally localized. Our contribution mainly lies in three aspects: 1) the new exploration of abnormality detection from structure modeling, where the motion difference between individuals is computed by a novel selective histogram of optical flow, which enables the proposed method to deal with more kinds of anomalies; 2) the SCD description that can effectively represent the relationship among the individuals; and 3) the 3-D DCT multi-object tracker that can robustly associate a limited number of (instead of all) targets, which makes tracking analysis feasible in high-density crowd situations. Experimental results on several publicly available crowd video datasets verify the effectiveness of the proposed method. PMID:24988603

  1. A Comparative Study of Unsupervised Anomaly Detection Techniques Using Honeypot Data

    NASA Astrophysics Data System (ADS)

    Song, Jungsuk; Takakura, Hiroki; Okabe, Yasuo; Inoue, Daisuke; Eto, Masashi; Nakao, Koji

    Intrusion Detection Systems (IDS) have received considerable attention among network security researchers as one of the most promising countermeasures to defend our crucial computer systems or networks against attackers on the Internet. Over the past few years, many machine learning techniques have been applied to IDSs so as to improve their performance and to construct them with low cost and effort. In particular, unsupervised anomaly detection techniques have a significant advantage in their capability to identify unforeseen attacks, i.e., 0-day attacks, and to build intrusion detection models without any labeled (i.e., pre-classified) training data in an automated manner. In this paper, we conduct a set of experiments to evaluate and analyze the performance of the major unsupervised anomaly detection techniques using real traffic data obtained at our honeypots deployed inside and outside of the campus network of Kyoto University, and using various evaluation criteria, i.e., performance evaluation by similarity measurements and the size of training data, overall performance, detection ability for unknown attacks, and time complexity. Our experimental results give some practical and useful guidelines to IDS researchers and operators, so that they can acquire insight to apply these techniques to the area of intrusion detection, and devise more effective intrusion detection models.

  2. Developing a new, passive diffusion sampling array to detect helium anomalies associated with volcanic unrest

    USGS Publications Warehouse

    Dame, Brittany E; Solomon, D Kip; Evans, William C.; Ingebritsen, Steven E.

    2015-01-01

    Helium (He) concentration and 3He/4He anomalies in soil gas and spring water are potentially powerful tools for investigating hydrothermal circulation associated with volcanism and could perhaps serve as part of a hazards warning system. However, in operational practice, He and other gases are often sampled only after volcanic unrest is detected by other means. A new passive diffusion sampler suite, intended to be collected after the onset of unrest, has been developed and tested as a relatively low-cost method of determining He-isotope composition pre- and post-unrest. The samplers, each with a distinct equilibration time, passively record He concentration and isotope ratio in springs and soil gas. Once collected and analyzed, the He concentrations in the samplers are used to deconvolve the time history of the He concentration and the 3He/4He ratio at the collection site. The current suite consisting of three samplers is sufficient to deconvolve both the magnitude and the timing of a step change in in situ concentration if the suite is collected within 100 h of the change. The effects of temperature and prolonged deployment on the suite's capability of recording He anomalies have also been evaluated. The suite has captured a significant 3He/4He soil gas anomaly at Horseshoe Lake near Mammoth Lakes, California. The passive diffusion sampler suite appears to be an accurate and affordable alternative for determining He anomalies associated with volcanic unrest.

  3. Clusters versus GPUs for Parallel Target and Anomaly Detection in Hyperspectral Images

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2010-12-01

    Remotely sensed hyperspectral sensors provide image data containing rich information in both the spatial and the spectral domain, and this information can be used to address detection tasks in many applications. In many surveillance applications, the size of the objects (targets) searched for constitutes a very small fraction of the total search area and the spectral signatures associated with the targets are generally different from those of the background, hence the targets can be seen as anomalies. In hyperspectral imaging, many algorithms have been proposed for automatic target and anomaly detection. Given the dimensionality of hyperspectral scenes, these techniques can be time-consuming and difficult to apply in applications requiring real-time performance. In this paper, we develop several new parallel implementations of automatic target and anomaly detection algorithms. The proposed parallel algorithms are quantitatively evaluated using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) system over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex.

  4. Parallel implementation of RX anomaly detection on multi-core processors: impact of data partitioning strategies

    NASA Astrophysics Data System (ADS)

    Molero, Jose M.; Garzón, Ester M.; García, Inmaculada; Plaza, Antonio

    2011-11-01

    Anomaly detection is an important task for remotely sensed hyperspectral data exploitation. One of the most widely used and successful algorithms for anomaly detection in hyperspectral images is the Reed-Xiaoli (RX) algorithm. Despite its wide acceptance, and despite its high computational complexity when applied to real hyperspectral scenes, few documented parallel implementations of this algorithm exist, in particular for multi-core processors. The advantage of multi-core platforms over other specialized parallel architectures is that they are a low-power, inexpensive, widely available and well-known technology. A critical issue in the parallel implementation of RX is the sample covariance matrix calculation, which can be approached in global or local fashion. This aspect is crucial for the RX implementation since the consideration of a local or global strategy for the computation of the sample covariance matrix is expected to affect both the scalability of the parallel solution and the anomaly detection results. In this paper, we develop new parallel implementations of RX on multi-core processors and specifically investigate the impact of different data partitioning strategies when parallelizing its computations. For this purpose, we consider both global and local data partitioning strategies in the spatial domain of the scene, and further analyze their scalability in different multi-core platforms. The numerical effectiveness of the considered solutions is evaluated using receiver operating characteristic (ROC) curves, analyzing their capacity to detect thermal hot spots (anomalies) in hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer system over the World Trade Center in New York, five days after the terrorist attacks of September 11th, 2001.
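
    The RX score itself is the Mahalanobis distance of each pixel from the background, (x - m)^T C^{-1} (x - m), with mean m and covariance C estimated either globally over the whole scene or locally in a window around the pixel. A minimal NumPy sketch of the global variant on synthetic data is given below; the local variant would recompute m and C per sliding window, which is exactly where the data-partitioning question arises.

    ```python
    # Global RX anomaly detector for a hyperspectral cube (rows x cols x bands).
    import numpy as np

    def rx_global(cube):
        h, w, b = cube.shape
        X = cube.reshape(-1, b).astype(np.float64)
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(b)   # small ridge for stability
        cov_inv = np.linalg.inv(cov)
        d = X - mu
        scores = np.einsum("ij,jk,ik->i", d, cov_inv, d)   # per-pixel Mahalanobis distance
        return scores.reshape(h, w)

    # Toy usage: white-noise background plus one spectrally distinct pixel.
    cube = np.random.default_rng(0).normal(size=(64, 64, 50))
    cube[10, 10] += 5.0
    print(np.unravel_index(np.argmax(rx_global(cube)), (64, 64)))   # -> (10, 10)
    ```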

  5. A new morphological anomaly detection algorithm for hyperspectral images and its GPU implementation

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2011-10-01

    Anomaly detection is considered a very important task for hyperspectral data exploitation. It is now routinely applied in many application domains, including defence and intelligence, public safety, precision agriculture, geology, or forestry. Many of these applications require timely responses for swift decisions which depend upon high computing performance of algorithm analysis. However, with the recent explosion in the amount and dimensionality of hyperspectral imagery, this problem calls for the incorporation of parallel computing techniques. In the past, clusters of computers have offered an attractive solution for fast anomaly detection in hyperspectral data sets already transmitted to Earth. However, these systems are expensive and difficult to adapt to on-board data processing scenarios, in which low-weight and low-power integrated components are essential to reduce mission payload and obtain analysis results in (near) real-time, i.e., at the same time as the data is collected by the sensor. An exciting new development in the field of commodity computing is the emergence of commodity graphics processing units (GPUs), which can now bridge the gap towards on-board processing of remotely sensed hyperspectral data. In this paper, we develop a new morphological algorithm for anomaly detection in hyperspectral images along with an efficient GPU implementation of the algorithm. The algorithm is implemented on latest-generation GPU architectures, and evaluated with regard to other anomaly detection algorithms using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex. The proposed GPU implementation achieves real-time performance in the considered case study.

  6. An anomaly detection and isolation scheme with instance-based learning and sequential analysis

    SciTech Connect

    Yoo, T. S.; Garcia, H. E.

    2006-07-01

    This paper presents an online anomaly detection and isolation (FDI) technique using an instance-based learning method combined with a sequential change detection and isolation algorithm. The proposed method uses kernel density estimation techniques to build statistical models of the given empirical data (null hypothesis). The null hypothesis is associated with a set of alternative hypotheses modeling the abnormalities of the systems. The decision procedure involves a sequential change detection and isolation algorithm. Notably, the proposed method enjoys asymptotic optimality, as the applied change detection and isolation algorithm is optimal in minimizing the worst mean detection/isolation delay for a given mean time before a false alarm or a false isolation. The applicability of this methodology and its performance are illustrated with a redundant sensor data set. (authors)
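
    As a toy sketch of the two ingredients, the snippet below builds kernel density estimates from empirical nominal and fault-mode data and runs a simplified CUSUM-style sequential statistic against a single alternative; the actual method handles multiple alternative hypotheses and isolates which one occurred, which is omitted here, and all data are synthetic.

    ```python
    # Toy sketch: KDE models of nominal and one fault mode, plus a CUSUM-like
    # sequential statistic (the paper's scheme detects *and isolates* among
    # several alternatives; only detection against one alternative is shown).
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(1)
    f0 = gaussian_kde(rng.normal(0.0, 1.0, 2000))   # null: empirical nominal data
    f1 = gaussian_kde(rng.normal(2.0, 1.0, 2000))   # alternative: one fault mode

    def sequential_detect(stream, threshold=10.0):
        s = 0.0
        for t, x in enumerate(stream):
            llr = np.log(f1(x)[0] + 1e-12) - np.log(f0(x)[0] + 1e-12)
            s = max(0.0, s + llr)        # accumulate evidence, reset at zero
            if s > threshold:
                return t                 # alarm time
        return None

    stream = np.concatenate([rng.normal(0, 1, 300), rng.normal(2, 1, 50)])
    print("alarm at sample", sequential_detect(stream))
    ```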

  7. Data-Driven Anomaly Detection Performance for the Ares I-X Ground Diagnostic Prototype

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Schwabacher, Mark A.; Matthews, Bryan L.

    2010-01-01

    In this paper, we will assess the performance of a data-driven anomaly detection algorithm, the Inductive Monitoring System (IMS), which can be used to detect simulated Thrust Vector Control (TVC) system failures. However, the ability of IMS to detect these failures in a true operational setting may be related to the realistic nature of how they are simulated. As such, we will investigate both a low fidelity and high fidelity approach to simulating such failures, with the latter based upon the underlying physics. Furthermore, the ability of IMS to detect anomalies that were previously unknown and not previously simulated will be studied in earnest, as well as apparent deficiencies or misapplications that result from using the data-driven paradigm. Our conclusions indicate that robust detection performance of simulated failures using IMS is not appreciably affected by the use of a high fidelity simulation. However, we have found that the inclusion of a data-driven algorithm such as IMS into a suite of deployable health management technologies does add significant value.

  8. Fuzzy neural networks for classification and detection of anomalies.

    PubMed

    Meneganti, M; Saviello, F S; Tagliaferri, R

    1998-01-01

    In this paper, a new learning algorithm for Simpson's fuzzy min-max neural network is presented. It overcomes some undesired properties of Simpson's model: specifically, it has neither thresholds that bound the dimension of the hyperboxes nor sensitivity parameters. Our new algorithm improves the network performance: in fact, the classification result does not depend on the presentation order of the patterns in the training set, and at each step, the classification error on the training set cannot increase. The new neural model is particularly useful in classification problems, as shown by comparison with some fuzzy neural nets cited in the literature (Simpson's min-max model, fuzzy ARTMAP proposed by Carpenter, Grossberg et al. in 1992, adaptive fuzzy systems as introduced by Wang in his book) and the classical multilayer perceptron neural network with the backpropagation learning algorithm. The tests were executed on three different classification problems: the first one with two-dimensional synthetic data, the second one with realistic data generated by a simulator to find anomalies in the cooling system of a blast furnace, and the third one with real data for industrial diagnosis. The experiments were made following some recent evaluation criteria known in the literature and by using the Microsoft Visual C++ development environment on personal computers. PMID:18255771

  9. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data.

    PubMed

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied on unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection, as well as in the life science and medical domain. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to provide a new, well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks. PMID:27093601

  10. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data

    PubMed Central

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied on unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection, fraud detection, as well as in the life science and medical domain. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to provide a new, well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides the anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. As a conclusion, we give advice on algorithm selection for typical real-world tasks. PMID:27093601

  11. Multiple Kernel Learning for Heterogeneous Anomaly Detection: Algorithm and Aviation Safety Case Study

    NASA Technical Reports Server (NTRS)

    Das, Santanu; Srivastava, Ashok N.; Matthews, Bryan L.; Oza, Nikunj C.

    2010-01-01

    The world-wide aviation system is one of the most complex dynamical systems ever developed and is generating data at an extremely rapid rate. Most modern commercial aircraft record several hundred flight parameters, including information from the guidance, navigation, and control systems, the avionics and propulsion systems, and the pilot inputs into the aircraft. These parameters may be continuous measurements or binary or categorical measurements recorded in one-second intervals for the duration of the flight. Currently, most approaches to aviation safety are reactive, meaning that they are designed to react to an aviation safety incident or accident. In this paper, we discuss a novel approach based on the theory of multiple kernel learning to detect potential safety anomalies in very large databases of discrete and continuous data from world-wide operations of commercial fleets. We pose a general anomaly detection problem which includes both discrete and continuous data streams, where we assume that the discrete streams have a causal influence on the continuous streams. We also assume that atypical sequences of events in the discrete streams can lead to off-nominal system performance. We discuss the application domain, the novel algorithms, and results on real-world data sets. Our algorithm uncovers operationally significant events in high-dimensional data streams in the aviation industry which are not detectable using state-of-the-art methods.
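
    A toy sketch of the multiple-kernel idea is shown below: one kernel is built from continuous flight parameters, another from discrete switch states, and a weighted combination is fed to a one-class SVM. The kernel weights are fixed by hand here, whereas the paper's algorithm learns them and additionally models the causal discrete-to-continuous structure; all names, features, and values are assumptions.

    ```python
    # Toy sketch of combining heterogeneous kernels for one-class anomaly detection.
    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(10)
    n = 300
    continuous = rng.normal(size=(n, 5))          # e.g., airspeed, altitude, ... (hypothetical)
    discrete = rng.integers(0, 2, size=(n, 8))    # e.g., per-flight switch states (hypothetical)

    K_cont = rbf_kernel(continuous, gamma=0.2)
    # Matching kernel for binary sequences: fraction of positions that agree.
    K_disc = (discrete @ discrete.T + (1 - discrete) @ (1 - discrete).T) / discrete.shape[1]
    K = 0.6 * K_cont + 0.4 * K_disc               # hand-picked weights (learned in the paper)

    ocsvm = OneClassSVM(kernel="precomputed", nu=0.05).fit(K)
    flags = ocsvm.predict(K)                      # -1 marks potentially anomalous flights
    print("flagged flights:", int((flags == -1).sum()), "of", n)
    ```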

  12. Accumulating pyramid spatial-spectral collaborative coding divergence for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Zou, Huanxin; Zhou, Shilin

    2016-03-01

    Detection of anomalous targets of various sizes in hyperspectral data has received a lot of attention in reconnaissance and surveillance applications. Many anomaly detectors have been proposed in the literature. However, current methods are susceptible to anomalies in the processing window range and often make critical assumptions about the distribution of the background data. Motivated by the fact that anomaly pixels are often distinctive from their local background, in this letter we propose a novel hyperspectral anomaly detection framework for real-time remote sensing applications. The proposed framework consists of four major components: sparse feature learning, pyramid grid window selection, joint spatial-spectral collaborative coding, and multi-level divergence fusion. It exploits the collaborative representation difference in the feature space to locate potential anomalies and is totally unsupervised without any prior assumptions. Experimental results on airborne recorded hyperspectral data demonstrate that the proposed method adapts to anomalies over a large range of sizes and is well suited for parallel processing.

  13. Detecting Distributed Network Traffic Anomaly with Network-Wide Correlation Analysis

    NASA Astrophysics Data System (ADS)

    Zonglin, Li; Guangmin, Hu; Xingmiao, Yao; Dan, Yang

    2008-12-01

    Distributed network traffic anomaly refers to a traffic abnormal behavior involving many links of a network and caused by the same source (e.g., DDoS attack, worm propagation). The anomaly transiting a single link might be unnoticeable and hard to detect, while the anomalous aggregation from many links can be prevailing and does more harm to the networks. Exploiting the similar features of a distributed traffic anomaly across many links, this paper proposes a network-wide detection method by performing anomalous correlation analysis of traffic signals' instantaneous parameters. In our method, traffic signals' instantaneous parameters are first computed, and their network-wide anomalous space is then extracted via traffic prediction. Finally, an anomaly is detected by a global correlation coefficient of the anomalous space. Our evaluation using Abilene traffic traces demonstrates the excellent performance of this approach for distributed traffic anomaly detection.
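
    A much simplified sketch of the network-wide idea: per-link residuals from a traffic prediction form the "anomalous space", and a global correlation coefficient across links rises when the same disturbance hits many links at once. The moving-average predictor, the mean pairwise correlation, and the synthetic traffic below are simplifications and assumptions, not the instantaneous-parameter analysis of the paper.

    ```python
    # Simplified sketch: flag a distributed anomaly when prediction residuals of
    # many links become strongly correlated in the same time window.
    import numpy as np

    def residuals(link_traffic, window=10):
        kernel = np.ones(window) / window
        pred = np.vstack([np.convolve(x, kernel, mode="same") for x in link_traffic])
        return link_traffic - pred                      # "anomalous space" per link

    def global_correlation(res):
        c = np.corrcoef(res)                            # link-by-link correlation matrix
        iu = np.triu_indices_from(c, k=1)
        return float(np.abs(c[iu]).mean())              # mean pairwise correlation

    rng = np.random.default_rng(2)
    links = rng.normal(100.0, 5.0, size=(8, 600))       # 8 links, 600 time bins
    links[:, 400:450] += 40.0                           # the same surge hits every link
    res = residuals(links)
    print("quiet window:", round(global_correlation(res[:, 100:200]), 2),
          "anomalous window:", round(global_correlation(res[:, 390:460]), 2))
    ```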

  14. Incremental classification learning for anomaly detection in medical images

    NASA Astrophysics Data System (ADS)

    Giritharan, Balathasan; Yuan, Xiaohui; Liu, Jianguo

    2009-02-01

    Computer-aided diagnosis usually screens thousands of instances to find only a few positive cases that indicate probable presence of disease. The amount of patient data increases consistently over time. In the diagnosis of new instances, disagreement occurs between a CAD system and physicians, which suggests inaccurate classifiers. Intuitively, misclassified instances and the previously acquired data should be used to retrain the classifier. This, however, is very time consuming and, in some cases where the dataset is too large, becomes infeasible. In addition, among the patient data, only a small percentage shows positive signs, which is known as imbalanced data. We present an incremental Support Vector Machine (SVM) as a solution for the class imbalance problem in the classification of anomalies in medical images. The support vectors provide a concise representation of the distribution of the training data. Here we use bootstrapping to identify potential candidate support vectors for future iterations. Experiments were conducted using images from endoscopy videos, and the sensitivity and specificity were close to those of an SVM trained using all samples available at a given incremental step, with significantly improved efficiency in training the classifier.
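
    The sketch below illustrates only the core incremental step of such a scheme: each retraining uses the previous model's support vectors plus the new batch, instead of all accumulated data. The bootstrapping of candidate support vectors and the specific imbalance handling described in the paper are not reproduced; class_weight="balanced" and the synthetic batches stand in as assumptions.

    ```python
    # Sketch of incremental SVM updating: retrain on previous support vectors
    # plus each new batch rather than on all accumulated data.
    import numpy as np
    from sklearn.svm import SVC

    def incremental_svm(batches, labels, C=1.0):
        clf, X_keep, y_keep = None, None, None
        for X_new, y_new in zip(batches, labels):
            X_train = X_new if X_keep is None else np.vstack([X_keep, X_new])
            y_train = y_new if y_keep is None else np.concatenate([y_keep, y_new])
            clf = SVC(kernel="rbf", C=C, class_weight="balanced").fit(X_train, y_train)
            X_keep = clf.support_vectors_            # concise summary of the data seen so far
            y_keep = y_train[clf.support_]
        return clf

    rng = np.random.default_rng(3)
    batches, labels = [], []
    for _ in range(5):                               # five incremental batches
        neg = rng.normal(0.0, 1.0, size=(200, 2))    # normal findings
        pos = rng.normal(3.0, 1.0, size=(10, 2))     # rare anomalies (imbalanced)
        batches.append(np.vstack([neg, pos]))
        labels.append(np.concatenate([np.zeros(200), np.ones(10)]))
    clf = incremental_svm(batches, labels)
    print("support vectors kept:", len(clf.support_))
    ```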

  15. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    NASA Astrophysics Data System (ADS)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

    The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a multimodal sensor analysis framework which enables the analysis of sensors in multimodal aspect. It also provides plugin-based analyzing interfaces to develop sensor and image processing based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. With the usage of this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard brake, pothole crossing, and speed bump crossing. Upon such detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.

  16. A Mobile Device System for Early Warning of ECG Anomalies

    PubMed Central

    Szczepański, Adam; Saeed, Khalid

    2014-01-01

    With the rapid increase in computational power of mobile devices the amount of ambient intelligence-based smart environment systems has increased greatly in recent years. A proposition of such a solution is described in this paper, namely real time monitoring of an electrocardiogram (ECG) signal during everyday activities for identification of life threatening situations. The paper, being both research and review, describes previous work of the authors, current state of the art in the context of the authors' work and the proposed aforementioned system. Although parts of the solution were described in earlier publications of the authors, the whole concept is presented completely for the first time along with the prototype implementation on mobile device—a Windows 8 tablet with Modern UI. The system has three main purposes. The first goal is the detection of sudden rapid cardiac malfunctions and informing the people in the patient's surroundings, family and friends and the nearest emergency station about the deteriorating health of the monitored person. The second goal is a monitoring of ECG signals under non-clinical conditions to detect anomalies that are typically not found during diagnostic tests. The third goal is to register and analyze repeatable, long-term disturbances in the regular signal and finding their patterns. PMID:24955946

  17. A mobile device system for early warning of ECG anomalies.

    PubMed

    Szczepański, Adam; Saeed, Khalid

    2014-01-01

    With the rapid increase in computational power of mobile devices the amount of ambient intelligence-based smart environment systems has increased greatly in recent years. A proposition of such a solution is described in this paper, namely real time monitoring of an electrocardiogram (ECG) signal during everyday activities for identification of life threatening situations. The paper, being both research and review, describes previous work of the authors, current state of the art in the context of the authors' work and the proposed aforementioned system. Although parts of the solution were described in earlier publications of the authors, the whole concept is presented completely for the first time along with the prototype implementation on mobile device-a Windows 8 tablet with Modern UI. The system has three main purposes. The first goal is the detection of sudden rapid cardiac malfunctions and informing the people in the patient's surroundings, family and friends and the nearest emergency station about the deteriorating health of the monitored person. The second goal is a monitoring of ECG signals under non-clinical conditions to detect anomalies that are typically not found during diagnostic tests. The third goal is to register and analyze repeatable, long-term disturbances in the regular signal and finding their patterns. PMID:24955946

  18. Gaussian Process Regression-Based Video Anomaly Detection and Localization With Hierarchical Feature Representation.

    PubMed

    Cheng, Kai-Wen; Chen, Yie-Tarng; Fang, Wen-Hsien

    2015-12-01

    This paper presents a hierarchical framework for detecting local and global anomalies via hierarchical feature representation and Gaussian process regression (GPR) which is fully non-parametric and robust to the noisy training data, and supports sparse features. While most research on anomaly detection has focused more on detecting local anomalies, we are more interested in global anomalies that involve multiple normal events interacting in an unusual manner, such as car accidents. To simultaneously detect local and global anomalies, we cast the extraction of normal interactions from the training videos as a problem of finding the frequent geometric relations of the nearby sparse spatio-temporal interest points (STIPs). A codebook of interaction templates is then constructed and modeled using the GPR, based on which a novel inference method for computing the likelihood of an observed interaction is also developed. Thereafter, these local likelihood scores are integrated into globally consistent anomaly masks, from which anomalies can be succinctly identified. To the best of our knowledge, it is the first time GPR is employed to model the relationship of the nearby STIPs for anomaly detection. Simulations based on four widespread datasets show that the new method outperforms the main state-of-the-art methods with lower computational burden. PMID:26394423
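
    A toy illustration of the scoring idea only (not the STIP interaction templates or the inference method of the paper): fit a GPR to features of normal events and score a new observation by how unlikely it is under the GP predictive distribution; the one-dimensional x and y below are generic stand-ins.

    ```python
    # Toy sketch: score observations by their negative log-likelihood under the
    # GP predictive distribution learned from "normal" data.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(4)
    X_train = rng.uniform(0.0, 10.0, size=(200, 1))
    y_train = np.sin(X_train[:, 0]) + rng.normal(0.0, 0.1, 200)   # normal behavior

    gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gpr.fit(X_train, y_train)

    def anomaly_score(x, y):
        mean, std = gpr.predict(np.atleast_2d(x), return_std=True)
        # Negative log of the Gaussian predictive density (up to a constant).
        return 0.5 * ((y - mean[0]) / std[0]) ** 2 + np.log(std[0])

    print("normal observation:   ", round(float(anomaly_score([2.0], np.sin(2.0))), 2))
    print("anomalous observation:", round(float(anomaly_score([2.0], 3.0)), 2))
    ```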

  19. Jamming anomaly in PT-symmetric systems

    NASA Astrophysics Data System (ADS)

    Barashenkov, I. V.; Zezyulin, D. A.; Konotop, V. V.

    2016-07-01

    The Schrödinger equation with a PT-symmetric potential is used to model an optical structure consisting of an element with gain coupled to an element with loss. At low gain–loss amplitudes γ, raising the amplitude results in the energy flux from the active to the leaky element being boosted. We study the anomalous behaviour occurring for larger γ, where the increase of the amplitude produces a drop of the flux across the gain–loss interface. We show that this jamming anomaly is either a precursor of the exceptional point, where two real eigenvalues coalesce and acquire imaginary parts, or precedes the eigenvalue's immersion in the continuous spectrum.

  20. Small-scale anomaly detection in panoramic imaging using neural models of low-level vision

    NASA Astrophysics Data System (ADS)

    Casey, Matthew C.; Hickman, Duncan L.; Pavlou, Athanasios; Sadler, James R. E.

    2011-06-01

    Our understanding of sensory processing in animals has reached the stage where we can exploit neurobiological principles in commercial systems. In human vision, one brain structure that offers insight into how we might detect anomalies in real-time imaging is the superior colliculus (SC). The SC is a small structure that rapidly orients our eyes to a movement, sound or touch that it detects, even when the stimulus may be on a small-scale; think of a camouflaged movement or the rustle of leaves. This automatic orientation allows us to prioritize the use of our eyes to raise awareness of a potential threat, such as a predator approaching stealthily. In this paper we describe the application of a neural network model of the SC to the detection of anomalies in panoramic imaging. The neural approach consists of a mosaic of topographic maps that are each trained using competitive Hebbian learning to rapidly detect image features of a pre-defined shape and scale. What makes this approach interesting is the ability of the competition between neurons to automatically filter noise, yet with the capability of generalizing the desired shape and scale. We will present the results of this technique applied to the real-time detection of obscured targets in visible-band panoramic CCTV images. Using background subtraction to highlight potential movement, the technique is able to correctly identify targets which span as little as 3 pixels wide while filtering small-scale noise.

  1. Prenatal Diagnosis of Central Nervous System Anomalies by High-Resolution Chromosomal Microarray Analysis

    PubMed Central

    Sun, Lijuan; Wu, Qingqing; Jiang, Shi-Wen; Yan, Yani; Wang, Xin; Zhang, Juan; Liu, Yan; Yao, Ling; Ma, Yuqing; Wang, Li

    2015-01-01

    The aims of this study were to evaluate the contribution of chromosomal microarray analysis (CMA) to the prenatal diagnosis of fetuses with central nervous system (CNS) anomalies but a normal chromosomal karyotype. A total of 46 fetuses with CNS anomalies, with or without other ultrasound anomalies, but normal karyotypes were evaluated by array-based comparative genomic hybridisation (aCGH) or single-nucleotide polymorphism (SNP) array. The results showed that CNVs were detected in 17 (37.0%) fetuses. Of these, CNVs identified in 5 (5/46, 10.9%) fetuses were considered likely pathogenic, and CNVs detected in 3 (3/46, 6.5%) fetuses were defined as being of uncertain clinical significance. Fetuses with CNS malformations plus other ultrasound anomalies had a higher rate of pathogenic CNVs than those with isolated CNS anomalies (13.6% versus 8.3%), but the difference was not significant (Fisher's exact test, P > 0.05). Pathogenic CNVs were detected most frequently in fetuses with Dandy-Walker syndrome (2/6, 33.3%) compared with other types of neural malformations, and holoprosencephaly (2/7, 28.6%) ranked second. CMA is valuable in the prenatal genetic diagnosis of fetuses with CNS anomalies. It should be considered as part of prenatal diagnosis in fetuses with CNS malformations and normal karyotypes. PMID:26064910

  2. Radiation anomaly detection algorithms for field-acquired gamma energy spectra

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen

    2015-08-01

    The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of the threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign radioactive materials in the form of naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport software package capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work despite a strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and relative merits of these different algorithms will be discussed and demonstrated.

  3. Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data

    NASA Technical Reports Server (NTRS)

    Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney

    2012-01-01

    This paper describes an application of a data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account fixed effects: flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, the multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and failures, biases, and trends in flight actuators. These anomalies were missed by the existing airline monitoring of FOQA data exceedances.
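
    A heavily simplified, flat version of the idea is sketched below: fit a fleet-wide least-squares performance model and flag flights whose residuals sit far outside the fleet's normal scatter. DFM itself fits a multi-level model with aircraft- and flight-level effects over terabytes of FOQA data; the variables, units, and numbers here are hypothetical.

    ```python
    # Simplified sketch: flag flights whose residual from a fleet-wide
    # performance regression is an outlier (DFM uses a multi-level model).
    import numpy as np

    rng = np.random.default_rng(5)
    n = 5000                                          # hypothetical flights
    airspeed = rng.uniform(200.0, 280.0, n)           # knots (hypothetical)
    altitude = rng.uniform(28.0, 40.0, n)             # kft (hypothetical)
    fuel_flow = 2.0 + 0.01 * airspeed - 0.03 * altitude + rng.normal(0.0, 0.05, n)
    fuel_flow[:10] += 0.6                             # a few flights with a biased sensor

    # Least-squares fit of fuel_flow ~ 1 + airspeed + altitude.
    A = np.column_stack([np.ones(n), airspeed, altitude])
    coef, *_ = np.linalg.lstsq(A, fuel_flow, rcond=None)
    resid = fuel_flow - A @ coef
    z = (resid - resid.mean()) / resid.std()
    print("flagged flights:", np.flatnonzero(np.abs(z) > 5.0))
    ```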

  4. Detecting Anomaly Regions in Satellite Image Time Series Based on Seasonal Autocorrelation Analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Anomaly regions in satellite images can reflect unexpected changes of land cover caused by floods, fires, landslides, etc. Detecting anomaly regions in satellite image time series is important for studying the dynamic processes of land cover changes as well as for disaster monitoring. Although several methods have been developed to detect land cover changes using satellite image time series, they are generally designed for detecting inter-annual or abrupt land cover changes and do not focus on detecting spatial-temporal changes in continuous images. In order to identify the spatial-temporal dynamic processes of unexpected land cover changes, this study proposes a method for detecting anomaly regions in each image of a satellite image time series based on seasonal autocorrelation analysis. The method was validated with a case study detecting the spatial-temporal process of a severe flood using a Terra/MODIS image time series. Experiments demonstrated the advantages of the method: (1) it can effectively detect anomaly regions in each image of the time series, showing the spatial-temporal evolution of the anomaly regions; (2) it is flexible enough to meet requirements (e.g., z-value or significance level) on detection accuracy, with overall accuracy up to 89% and precision above 90%; and (3) it does not need time series smoothing and can detect anomaly regions in noisy satellite images with high reliability.
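
    A minimal per-pixel sketch of the seasonal idea is given below: each observation is compared against the mean and standard deviation of the same seasonal position in the other years, and large z-scores are flagged. The paper works with the seasonal autocorrelation structure of MODIS time series and spatially contiguous regions; the threshold and the synthetic NDVI-like series here are assumptions.

    ```python
    # Minimal per-pixel sketch: flag observations that deviate strongly from the
    # same seasonal position in the other years (leave-one-year-out baseline).
    import numpy as np

    def seasonal_anomalies(series, periods_per_year, z_thresh=4.0):
        x = series.reshape(-1, periods_per_year)        # rows = years, cols = season index
        flags = np.zeros_like(x, dtype=bool)
        for yr in range(x.shape[0]):
            others = np.delete(x, yr, axis=0)           # baseline from the other years
            z = (x[yr] - others.mean(axis=0)) / (others.std(axis=0) + 1e-9)
            flags[yr] = np.abs(z) > z_thresh
        return flags.ravel()

    rng = np.random.default_rng(6)
    season = 0.3 + 0.3 * np.sin(np.linspace(0.0, 2.0 * np.pi, 23))   # 23 composites/year
    ndvi = np.tile(season, 10) + rng.normal(0.0, 0.02, 230)          # 10 years of data
    ndvi[5 * 23 + 12] = 0.05                                         # flood-like drop in year 5
    print("anomalous observations:", np.flatnonzero(seasonal_anomalies(ndvi, 23)))
    ```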

  5. Lunar magnetic anomalies detected by the Apollo subsatellite magnetometers

    NASA Technical Reports Server (NTRS)

    Hood, L. L.; Coleman, P. J., Jr.; Russell, C. T.; Wilhelms, D. E.

    1979-01-01

    Properties of lunar crustal magnetization thus far deduced from Apollo subsatellite magnetometer data are reviewed using two of the most accurate available magnetic anomaly maps, one covering a portion of the lunar near side and the other a part of the far side. The largest single anomaly found within the region of coverage on the near-side map correlates exactly with a conspicuous light-colored marking in western Oceanus Procellarum called Reiner Gamma. This feature is interpreted as an unusual deposit of ejecta from secondary craters of the large nearby primary impact crater Cavalerius. The mean altitude of the far-side anomaly gap is much higher than that of the near-side map and the surface geology is more complex; individual anomaly sources have therefore not yet been identified. The mechanism of magnetization and the origin of the magnetizing field remain unresolved, but the uniformity with which the Reiner Gamma deposit is apparently magnetized, and the north-south depletion of magnetization intensity across a substantial portion of the far side, seem to require the existence of an ambient field, perhaps of global or larger extent.

  6. GPU implementation of target and anomaly detection algorithms for remotely sensed hyperspectral image analysis

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio

    2010-08-01

    Automatic target and anomaly detection are considered very important tasks for hyperspectral data exploitation. These techniques are now routinely applied in many application domains, including defence and intelligence, public safety, precision agriculture, geology, or forestry. Many of these applications require timely responses for swift decisions which depend upon high computing performance of algorithm analysis. However, with the recent explosion in the amount and dimensionality of hyperspectral imagery, this problem calls for the incorporation of parallel computing techniques. In the past, clusters of computers have offered an attractive solution for fast anomaly and target detection in hyperspectral data sets already transmitted to Earth. However, these systems are expensive and difficult to adapt to on-board data processing scenarios, in which low-weight and low-power integrated components are essential to reduce mission payload and obtain analysis results in (near) real-time, i.e., at the same time as the data is collected by the sensor. An exciting new development in the field of commodity computing is the emergence of commodity graphics processing units (GPUs), which can now bridge the gap towards on-board processing of remotely sensed hyperspectral data. In this paper, we describe several new GPU-based implementations of target and anomaly detection algorithms for hyperspectral data exploitation. The parallel algorithms are implemented on latest-generation Tesla C1060 GPU architectures, and quantitatively evaluated using hyperspectral data collected by NASA's AVIRIS system over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex.

  7. Volcanic activity and satellite-detected thermal anomalies at Central American volcanoes

    NASA Technical Reports Server (NTRS)

    Stoiber, R. E. (Principal Investigator); Rose, W. I., Jr.

    1973-01-01

    The author has identified the following significant results. A large nuee ardente eruption occurred at Santiaguito volcano, within the test area, on 16 September 1973. Through a system of local observers, the eruption has been described and reported to the international scientific community, the extent of the affected area mapped, and the new ash sampled. A more extensive report on this event will be prepared. The eruption is an excellent example of the kind of volcanic situation in which satellite thermal imagery might be useful. The Santiaguito dome is a complex mass with a whole series of historically active vents. Its location makes access difficult, yet its activity is of great concern to large agricultural populations who live downslope. Santiaguito has produced a number of large eruptions with little apparent warning. In the earlier ground survey, large thermal anomalies were identified at Santiaguito. There is no way of knowing whether satellite monitoring could have detected changes in thermal anomaly patterns related to this recent event, but the position of thermal anomalies on Santiaguito and any changes in their character would be relevant information.

  8. A novel approach for detection of anomalies using measurement data of the Ironton-Russell bridge

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Norouzi, Mehdi; Hunt, Victor; Helmicki, Arthur

    2015-04-01

    Data models have been increasingly used in recent years for documenting the normal behavior of structures and hence for detecting and classifying anomalies. A large number of machine learning algorithms have been proposed by various researchers to model operational and functional changes in structures; however, only a limited number of studies have been applied to actual measurement data, due to limited access to long-term measurement data of structures and lack of access to the damaged states of structures. By monitoring the structure during construction and reviewing the effect of construction events on the measurement data, this study introduces a new approach to detect and eventually classify anomalies during and after construction. First, the implementation procedure of the sensor network, which develops while the bridge is being built, and its current status are detailed. Second, the proposed anomaly detection algorithm is applied to the collected data and, finally, the detected anomalies are validated against the archived construction events.

  9. Discovering Recurring Anomalies in Text Reports Regarding Complex Space Systems

    NASA Technical Reports Server (NTRS)

    Zane-Ulman, Brett; Srivastava, Ashok N.

    2005-01-01

    Many existing complex space systems have a significant amount of historical maintenance and problem databases that are stored in unstructured text form. For some platforms, these reports may be encoded as scanned images rather than even searchable text. The problem that we address in this paper is the discovery of recurring anomalies and relationships between different problem reports that may indicate larger systemic problems. We will illustrate our techniques on data from discrepancy reports regarding software anomalies in the Space Shuttle. These free-text reports are written by a number of different people, so the emphasis and wording vary considerably.

  10. Anomaly Detection Techniques with Real Test Data from a Spinning Turbine Engine-Like Rotor

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Woike, Mark R.; Oza, Nikunj C.; Matthews, Bryan L.

    2012-01-01

    Online detection techniques to monitor the health of rotating engine components are becoming increasingly attractive to aircraft engine manufacturers in order to increase safety of operation and lower maintenance costs. Health monitoring remains challenging to implement, especially in the presence of scattered loading conditions, crack sizes, component geometries, and material properties. The current trend, however, is to utilize noninvasive types of health monitoring or nondestructive techniques to detect hidden flaws and mini-cracks before any catastrophic event occurs. These techniques go further to evaluate material discontinuities and other anomalies that have grown to the level of critical defects that can lead to failure. Generally, health monitoring is highly dependent on sensor systems capable of performing in various engine environmental conditions and able to transmit a signal upon a predetermined crack length, while acting in a neutral form upon the overall performance of the engine system.

  11. Enabling the Discovery of Recurring Anomalies in Aerospace System Problem Reports using High-Dimensional Clustering Techniques

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok, N.; Akella, Ram; Diev, Vesselin; Kumaresan, Sakthi Preethi; McIntosh, Dawn M.; Pontikakis, Emmanuel D.; Xu, Zuobing; Zhang, Yi

    2006-01-01

    This paper describes the results of a significant research and development effort conducted at NASA Ames Research Center to develop new text mining techniques to discover anomalies in free-text reports regarding system health and safety of two aerospace systems. We discuss two problems of significant importance in the aviation industry. The first problem is that of automatic anomaly discovery about an aerospace system through the analysis of tens of thousands of free-text problem reports that are written about the system. The second problem that we address is that of automatic discovery of recurring anomalies, i.e., anomalies that may be described in different ways by different authors, at varying times and under varying conditions, but that are truly about the same part of the system. The intent of recurring anomaly identification is to determine project or system weaknesses or high-risk issues. The discovery of recurring anomalies is a key goal in building safe, reliable, and cost-effective aerospace systems. We address the anomaly discovery problem on thousands of free-text reports using two strategies: (1) as an unsupervised learning problem where an algorithm takes free-text reports as input and automatically groups them into different bins, where each bin corresponds to a different unknown anomaly category; and (2) as a supervised learning problem where the algorithm classifies the free-text reports into one of a number of known anomaly categories. We then discuss the application of these methods to the problem of discovering recurring anomalies. In fact, the special nature of recurring anomalies (very small cluster sizes) requires incorporating new methods and measures to enhance the original approach for anomaly detection.

  12. Particle Filtering for Model-Based Anomaly Detection in Sensor Networks

    NASA Technical Reports Server (NTRS)

    Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon

    2012-01-01

    A novel technique has been developed for anomaly detection of rocket engine test stand (RETS) data. The objective was to develop a system that postprocesses a CSV file containing the sensor readings and activities (time series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective and, in general, met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are i.i.d., i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploit the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time. However, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data are a mixture of multi-variate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation Maximization (EM) algorithm.
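
    Only the GMM ingredient is sketched below: fit a mixture to nominal multi-sensor snapshots and flag test samples whose log-likelihood falls below a threshold calibrated on the nominal data. The approach above embeds the mixture in a linear dynamic system to exploit correlations across time, which this static sketch ignores; the data and threshold are assumptions.

    ```python
    # Sketch of the GMM ingredient only: low likelihood under a mixture fitted
    # to nominal sensor snapshots is treated as anomalous (temporal modeling,
    # i.e., the linear dynamic system part, is omitted here).
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(7)
    nominal = np.vstack([rng.normal([0.0, 0.0, 0.0], 1.0, size=(1000, 3)),
                         rng.normal([5.0, 5.0, 0.0], 1.0, size=(1000, 3))])  # two operating modes
    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(nominal)

    threshold = np.percentile(gmm.score_samples(nominal), 0.5)   # calibrate on nominal data
    test = np.vstack([rng.normal([0.0, 0.0, 0.0], 1.0, size=(50, 3)),
                      rng.normal([0.0, 8.0, 8.0], 1.0, size=(5, 3))])        # last 5 rows off-nominal
    flags = gmm.score_samples(test) < threshold
    print("anomalous time steps:", np.flatnonzero(flags))
    ```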

  13. Lunar magnetic anomalies detected by the Apollo subsatellite magnetometers

    USGS Publications Warehouse

    Hood, L.L.; Coleman, P.J., Jr.; Russell, C.T.; Wilhelms, D.E.

    1979-01-01

    Properties of lunar crustal magnetization thus far deduced from Apollo subsatellite magnetometer data are reviewed using two of the most accurate presently available magnetic anomaly maps - one covering a portion of the lunar near side and the other a part of the far side. The largest single anomaly found within the region of coverage on the near-side map correlates exactly with a conspicuous, light-colored marking in western Oceanus Procellarum called Reiner Gamma. This feature is interpreted as an unusual deposit of ejecta from secondary craters of the large nearby primary impact crater Cavalerius. An age for Cavalerius (and, by implication, for Reiner Gamma) of 3.2 ± 0.2 × 10^9 y is estimated. The main (30 × 60 km) Reiner Gamma deposit is nearly uniformly magnetized in a single direction, with a minimum mean magnetization intensity of ~7 × 10^-2 G cm^3/g (assuming a density of 3 g/cm^3), or about 700 times the stable magnetization component of the most magnetic returned samples. Additional medium-amplitude anomalies exist over the Fra Mauro Formation (Imbrium basin ejecta emplaced ~3.9 × 10^9 y ago) where it has not been flooded by mare basalt flows, but are nearly absent over the maria and over the craters Copernicus, Kepler, and Reiner and their encircling ejecta mantles. The mean altitude of the far-side anomaly gap is much higher than that of the near-side map and the surface geology is more complex, so individual anomaly sources have not yet been identified. However, it is clear that a concentration of especially strong sources exists in the vicinity of the craters Van de Graaff and Aitken. Numerical modeling of the associated fields reveals that the source locations do not correspond with the larger primary impact craters of the region and, by analogy with Reiner Gamma, may be less conspicuous secondary crater ejecta deposits. The reason for a special concentration of strong sources in the Van de Graaff-Aitken region is unknown, but may be indirectly

  14. A hyperspectral imagery anomaly detection algorithm based on local three-dimensional orthogonal subspace projection

    NASA Astrophysics Data System (ADS)

    Zhang, Xing; Wen, Gongjian

    2015-10-01

    Anomaly detection (AD) is becoming increasingly important in hyperspectral imagery analysis, with many practical applications. The local orthogonal subspace projection (LOSP) detector is a popular anomaly detector which exploits local endmembers/eigenvectors around the pixel under test (PUT) to construct a background subspace. However, this subspace only takes advantage of the spectral information, while the spatial correlation of the background clutter is neglected, which makes the anomaly detection result sensitive to the accuracy of the estimated subspace. In this paper, a local three-dimensional orthogonal subspace projection (3D-LOSP) algorithm is proposed. First, under the joint use of both spectral and spatial information, three directional background subspaces are created along the image height direction, the image width direction, and the spectral direction, respectively. Then, the three corresponding orthogonal subspaces are calculated. After that, each vector of the local cube along the three directions is projected onto the corresponding orthogonal subspace. Finally, a composite score is given through the three directional operators. In 3D-LOSP, anomalies are redefined as targets that are not only spectrally different from the background, but also spatially distinct. Thanks to the addition of the spatial information, the robustness of the anomaly detection result has been improved greatly by the proposed 3D-LOSP algorithm. It is noteworthy that the proposed algorithm is an extension of LOSP, and this idea can inspire many other spectral-based anomaly detection methods. Experiments with real hyperspectral images have proved the stability of the detection result.
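
    The spectral-only core of an orthogonal-subspace-projection score can be sketched as follows: the background subspace is spanned by the leading right singular vectors of the centered local neighborhood, and the anomaly score is the energy of the pixel under test left after that subspace is projected out. The two additional spatial directions and the score fusion of 3D-LOSP are omitted, and the synthetic data and subspace dimension are assumptions.

    ```python
    # Spectral-only sketch of an orthogonal subspace projection (OSP) score.
    import numpy as np

    def osp_score(put, neighborhood, k=5):
        """put: (bands,) pixel under test; neighborhood: (n_pixels, bands) local background."""
        mu = neighborhood.mean(axis=0)
        X = neighborhood - mu
        d = put - mu
        _, _, vt = np.linalg.svd(X, full_matrices=False)
        B = vt[:k].T                           # (bands, k) background subspace basis
        residual = d - B @ (B.T @ d)           # project the background subspace out
        return float(residual @ residual)      # energy orthogonal to the background

    rng = np.random.default_rng(8)
    bands = 60
    background = rng.normal(size=(200, bands))                    # local background pixels
    normal_pixel = background.mean(axis=0) + rng.normal(scale=0.1, size=bands)
    anomaly_pixel = normal_pixel + 4.0 * rng.normal(size=bands)   # spectrally distinct target
    print(osp_score(anomaly_pixel, background) > osp_score(normal_pixel, background))  # True
    ```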

  15. Multi-Level Anomaly Detection on Time-Varying Graph Data

    SciTech Connect

    Bridges, Robert A; Collins, John P; Ferragut, Erik M; Laska, Jason A; Sullivan, Blair D

    2015-01-01

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  16. Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering

    NASA Astrophysics Data System (ADS)

    Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech

    2015-03-01

    We present a novel method for automated anomaly detection on autofluorescence data provided by the National Institutes of Health (NIH). This is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, track the progression over time, and test the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures such as wavelet analysis or dimensionality reduction algorithms followed by a classification algorithm, e.g., Support Vector Machine. The method that we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality-preserving properties. By applying LE, we are able to represent the data in the form of eigenimages, some of which accentuate the visibility of anomalies. We pick significant eigenimages and proceed with the VMF algorithm that classifies anomalies across all of these eigenimages simultaneously. To evaluate our performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection on single images and a combination of PCA and VMF. LE combined with VMF performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of using a nonlinear approach to represent the data and the effectiveness of VMF, which operates on the images as a data cube rather than individual images.

  17. MedMon: securing medical devices through wireless monitoring and anomaly detection.

    PubMed

    Zhang, Meng; Raghunathan, Anand; Jha, Niraj K

    2013-12-01

    Rapid advances in personal healthcare systems based on implantable and wearable medical devices promise to greatly improve the quality of diagnosis and treatment for a range of medical conditions. However, the increasing programmability and wireless connectivity of medical devices also open up opportunities for malicious attackers. Unfortunately, implantable/wearable medical devices come with extreme size and power constraints, and unique usage models, making it infeasible to simply borrow conventional security solutions such as cryptography. We propose a general framework for securing medical devices based on wireless channel monitoring and anomaly detection. Our proposal is based on a medical security monitor (MedMon) that snoops on all the radio-frequency wireless communications to/from medical devices and uses multi-layered anomaly detection to identify potentially malicious transactions. Upon detection of a malicious transaction, MedMon takes appropriate response actions, which could range from passive (notifying the user) to active (jamming the packets so that they do not reach the medical device). A key benefit of MedMon is that it is applicable to existing medical devices that are in use by patients, with no hardware or software modifications to them. Consequently, it also leads to zero power overheads on these devices. We demonstrate the feasibility of our proposal by developing a prototype implementation for an insulin delivery system using off-the-shelf components (USRP software-defined radio). We evaluate its effectiveness under several attack scenarios. Our results show that MedMon can detect virtually all naive attacks and a large fraction of more sophisticated attacks, suggesting that it is an effective approach to enhancing the security of medical devices. PMID:24473551

  18. Advancements of data anomaly detection research in wireless sensor networks: a survey and open issues.

    PubMed

    Rassam, Murad A; Zainal, Anazida; Maarof, Mohd Aizaini

    2013-01-01

    Wireless Sensor Networks (WSNs) are important and necessary platforms for the future as the concept "Internet of Things" has emerged lately. They are used for monitoring, tracking, or controlling of many applications in industry, health care, habitat, and military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements to design efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches in five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future works. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed. PMID:23966182

  19. Advancements of Data Anomaly Detection Research in Wireless Sensor Networks: A Survey and Open Issues

    PubMed Central

    Rassam, Murad A.; Zainal, Anazida; Maarof, Mohd Aizaini

    2013-01-01

    Wireless Sensor Networks (WSNs) are important and necessary platforms for the future as the concept of the “Internet of Things” has emerged lately. They are used for monitoring, tracking, or control in many applications in industry, health care, habitat, and the military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements for designing efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches into five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future work. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed. PMID:23966182

  20. A Diagnoser Algorithm for Anomaly Detection in DEDS under Partial Unreliable Observations: Characterization and Inclusion in Sensor Configuration Optimization

    SciTech Connect

    Wen-Chiao Lin; Humberto Garcia; Tae-Sic Yoo

    2013-03-01

    Complex engineering systems have to be carefully monitored to meet demanding performance requirements, including detecting anomalies in their operations. There are two major monitoring challenges for these systems. The first challenge is that information collected from the monitored system is often partial and/or unreliable, in the sense that some events that have occurred may not be reported and/or may be reported incorrectly (e.g., reported as another event). The second is that anomalies often consist of sequences of event patterns separated in space and time. This paper introduces and analyzes a diagnoser algorithm that meets these challenges for detecting and counting occurrences of anomalies in engineering systems. The proposed diagnoser algorithm assumes that models are available for characterizing plant operations (via stochastic automata) and sensors (via probabilistic mappings) used for reporting partial and unreliable information. Methods for analyzing the effects of model uncertainties on the diagnoser performance are also discussed. In order to select configurations that reduce sensor costs, while satisfying diagnoser performance requirements, a sensor configuration selection algorithm developed in previous work is then extended for the proposed diagnoser algorithm. The proposed algorithms and methods are then applied to a multi-unit-operation system, which is derived from an actual facility application. Results show that the proposed diagnoser algorithm is able to detect and count occurrences of anomalies accurately and that its performance is robust to model uncertainties. Furthermore, the sensor configuration selection algorithm is able to suggest optimal sensor configurations with significantly reduced costs, while still yielding acceptable performance for counting the occurrences of anomalies.

  1. Electronic systems failures and anomalies attributed to electromagnetic interference

    NASA Technical Reports Server (NTRS)

    Leach, R. D. (Editor); Alexander, M. B. (Editor)

    1995-01-01

    The effects of electromagnetic interference can be very detrimental to electronic systems utilized in space missions. Assuring that subsystems and systems are electrically compatible is an important engineering function necessary to assure mission success. This reference publication will acquaint the reader with spacecraft electronic systems failures and anomalies caused by electromagnetic interference and will show the importance of electromagnetic compatibility activities in conjunction with space flight programs. It is also hoped that the report will illustrate that evolving electronic systems are increasingly sensitive to electromagnetic interference and that NASA personnel must continue to diligently pursue electromagnetic compatibility on space flight systems.

  2. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    The number of available Earth observations (EOs) is currently substantially increasing. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues might result in anomalous multivariate data constellations and have to be identified before corrupting subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to some statistical process control (SPC) algorithm. The basic idea is to raise an alarm when these sensor data depict some anomalous pattern according to the SPC, i.e. the production chain is considered 'out of control'. In fact, the industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of different algorithms typically applied by SPC systems and evaluate their capability to detect e.g. known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event and (2) analyzing the limits of the individual approaches with respect to the magnitude, spatio-temporal size of the event as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatiotemporal events and directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g. http://baci-h2020.eu
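
    As an illustration of the kind of SPC scheme compared in this study, the following sketch implements a basic Hotelling T-squared control chart in Python: a mean and covariance are estimated from an in-control reference period, and new multivariate observations whose T-squared statistic exceeds an F-distribution-based control limit are flagged as 'out of control'. The variable dimensions, sample sizes, and significance level are illustrative assumptions, not the study's configuration.

        # Minimal Hotelling T^2 control chart for multivariate observations.
        # Rows of X_ref are assumed to be "in control" reference samples; X_new is monitored data.
        import numpy as np
        from scipy import stats

        def hotelling_t2_limits(X_ref, alpha=0.01):
            mu = X_ref.mean(axis=0)
            S_inv = np.linalg.pinv(np.cov(X_ref, rowvar=False))
            n, p = X_ref.shape
            # F-based upper control limit for a future observation (standard SPC result)
            ucl = p * (n + 1) * (n - 1) / (n * (n - p)) * stats.f.ppf(1 - alpha, p, n - p)
            return mu, S_inv, ucl

        def t2_scores(X_new, mu, S_inv):
            d = X_new - mu
            return np.einsum('ij,jk,ik->i', d, S_inv, d)   # quadratic form per row

        rng = np.random.default_rng(0)
        X_ref = rng.normal(size=(500, 4))                  # e.g. 4 EO variables, in-control period
        X_new = np.vstack([rng.normal(size=(50, 4)),
                           rng.normal(loc=3.0, size=(5, 4))])   # injected anomalous block
        mu, S_inv, ucl = hotelling_t2_limits(X_ref)
        alarms = t2_scores(X_new, mu, S_inv) > ucl         # True where 'out of control'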

  3. Anomaly detection of turbopump vibration in Space Shuttle Main Engine using statistics and neural networks

    NASA Astrophysics Data System (ADS)

    Lo, C. F.; Wu, K.; Whitehead, B. A.

    1993-06-01

    Statistical and neural network methods have been applied to investigate the feasibility of detecting anomalies in the turbopump vibration of the SSME. The anomalies are detected based on the amplitude of peaks of fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods are feasible for detecting the vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank; this method is applicable for on-line operation. The neural network method likewise needs a sufficient data basis to train the networks. The testing procedure can be utilized at any time so long as the characteristics of components remain unchanged.

  4. Anomaly detection of turbopump vibration in Space Shuttle Main Engine using statistics and neural networks

    NASA Technical Reports Server (NTRS)

    Lo, C. F.; Wu, K.; Whitehead, B. A.

    1993-01-01

    Statistical and neural network methods have been applied to investigate the feasibility of detecting anomalies in the turbopump vibration of the SSME. The anomalies are detected based on the amplitude of peaks of fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods are feasible for detecting the vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank; this method is applicable for on-line operation. The neural network method likewise needs a sufficient data basis to train the networks. The testing procedure can be utilized at any time so long as the characteristics of components remain unchanged.
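
    The statistical route described above can be sketched as follows: estimate the power spectral density of a vibration record, read off the peak amplitudes at the fundamental and harmonic frequencies, and compare them against a bank of amplitudes from known-nominal runs. The sampling rate, frequencies, and z-score rule below are illustrative assumptions, not the SSME configuration.

        # Sketch: flag a vibration record whose PSD peak amplitudes deviate from a
        # baseline bank of nominal records (simple z-score rule, illustrative only).
        import numpy as np
        from scipy.signal import welch

        def peak_amplitudes(signal, fs, freqs_of_interest, tol=5.0):
            f, pxx = welch(signal, fs=fs, nperseg=1024)
            return np.array([pxx[np.abs(f - f0) <= tol].max() for f0 in freqs_of_interest])

        def is_anomalous(signal, fs, freqs, baseline_peaks, z_max=3.0):
            mu = baseline_peaks.mean(axis=0)
            sd = baseline_peaks.std(axis=0) + 1e-12
            z = (peak_amplitudes(signal, fs, freqs) - mu) / sd
            return np.any(np.abs(z) > z_max)

        fs = 5000.0                       # Hz, assumed sampling rate
        freqs = [600.0, 1200.0, 1800.0]   # assumed fundamental and harmonics
        rng = np.random.default_rng(1)
        t = np.arange(0, 2.0, 1.0 / fs)
        nominal = [np.sin(2 * np.pi * 600 * t) + 0.1 * rng.normal(size=t.size) for _ in range(20)]
        baseline = np.array([peak_amplitudes(s, fs, freqs) for s in nominal])
        test = 3.0 * np.sin(2 * np.pi * 600 * t) + 0.1 * rng.normal(size=t.size)
        print(is_anomalous(test, fs, freqs, baseline))   # larger peak -> flagged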

  5. Scalable Algorithms for Unsupervised Classification and Anomaly Detection in Large Geospatiotemporal Data Sets

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Hoffman, F. M.; Kumar, J.

    2015-12-01

    The increasing availability of high-resolution geospatiotemporal datasets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery and mining of ecological data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe some unsupervised knowledge discovery and anomaly detection approaches based on highly scalable parallel algorithms for k-means clustering and singular value decomposition, consider a few practical applications thereof to the analysis of climatic and remotely-sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
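
    A toy, single-node version of the clustering side of such an analysis is sketched below: pixel-level time series are clustered with k-means and unusually large distances to the nearest centroid are reported as candidate anomalies. The array sizes, cluster count, and percentile cutoff are illustrative; the parallel, HPC-scale implementation described in the abstract is not reproduced.

        # Toy sketch: k-means clustering of pixel time series; points far from their
        # nearest centroid are reported as candidate anomalies.
        import numpy as np
        from sklearn.cluster import MiniBatchKMeans

        rng = np.random.default_rng(2)
        X = rng.normal(size=(10000, 24))          # e.g. 10k pixels x 24 monthly values
        X[:5] += 8.0                              # a few injected anomalous pixels

        km = MiniBatchKMeans(n_clusters=16, random_state=0).fit(X)
        dist = np.linalg.norm(X - km.cluster_centers_[km.labels_], axis=1)
        cutoff = np.percentile(dist, 99.5)        # illustrative threshold
        anomalous_pixels = np.where(dist > cutoff)[0]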

  6. Operator based integration of information in multimodal radiological search mission with applications to anomaly detection

    NASA Astrophysics Data System (ADS)

    Benedetto, J.; Cloninger, A.; Czaja, W.; Doster, T.; Kochersberger, K.; Manning, B.; McCullough, T.; McLane, M.

    2014-05-01

    Successful performance of a radiological search mission depends on effective utilization of a mixture of signals. Example modalities include EO imagery and gamma radiation data, or radiation data collected during multiple events. In addition, elevation data or spatial proximity can be used to enhance the performance of acquisition systems. State-of-the-art techniques in processing and exploitation of complex information manifolds rely on diffusion operators. Our approach involves machine learning techniques based on analysis of joint data-dependent graphs and their associated diffusion kernels. The significant eigenvectors of the derived fused graph Laplace and Schroedinger operators then form the new representation, which provides integrated features from the heterogeneous input data. The families of data-dependent Laplace and Schroedinger operators on joint data graphs are integrated by means of appropriately designed fusion metrics. These fused representations are used for target and anomaly detection.

  7. Stochastic anomaly detection in eye-tracking data for quantification of motor symptoms in Parkinson's disease

    NASA Astrophysics Data System (ADS)

    Jansson, Daniel; Medvedev, Alexander; Axelson, Hans; Nyholm, Dag

    2013-10-01

    Two methods for distinguishing between healthy controls and patients diagnosed with Parkinson's disease by means of recorded smooth pursuit eye movements are presented and evaluated. Both methods are based on the principles of stochastic anomaly detection and make use of orthogonal series approximation for probability distribution estimation. The first method relies on the identification of a Wiener-type model of the smooth pursuit system and attempts to find statistically significant differences between the estimated parameters in healthy controls and patients with Parkinson's disease. The second method applies the same statistical method to distinguish between the gaze trajectories of healthy and Parkinson subjects attempting to track visual stimuli. Both methods show promising results, where healthy controls and patients with Parkinson's disease are effectively separated in terms of the considered metric. The results are preliminary because of the small number of participating test subjects, but they are indicative of the potential of the presented methods as diagnosing or staging tools for Parkinson's disease.

  8. A novel anomaly detection approach based on clustering and decision-level fusion

    NASA Astrophysics Data System (ADS)

    Zhong, Shengwei; Zhang, Ye

    2015-09-01

    In hyperspectral image processing, anomaly detection is a valuable way of searching for targets whose spectral characteristics are not known, and the estimation of background signals is the key procedure. On account of the high dimensionality and complexity of hyperspectral images, dimensionality reduction and background suppression are necessary. In addition, the complementarity of different anomaly detection algorithms can be utilized to improve the effectiveness of anomaly detection. In this paper, we propose a novel method of anomaly detection, which is based on optimized k-means clustering and decision-level fusion. In our proposed method, pixels with similar features are first clustered using an optimized k-means method. Secondly, dimensionality reduction is conducted using principal component analysis to reduce the amount of calculation. Then, to increase the accuracy of detection and decrease the false-alarm ratio, both the Reed-Xiaoli (RX) and kernel RX algorithms are applied to the processed image. Lastly, decision-level fusion is performed on the detection results. A simulated hyperspectral image and a real hyperspectral one are both used to evaluate the performance of our proposed method. Visual analysis and quantitative analysis of receiver operating characteristic (ROC) curves show that our algorithm can achieve better performance when compared with other classic approaches and state-of-the-art approaches.
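
    The RX stage of such a pipeline is the classical Reed-Xiaoli detector: the Mahalanobis distance of each pixel spectrum from the background mean and covariance. A minimal global-RX sketch in NumPy is given below; the clustering, kernel-RX, and decision-level fusion stages of the proposed method are omitted, and the cube dimensions and threshold are illustrative.

        # Minimal global RX (Reed-Xiaoli) anomaly detector for a hyperspectral cube.
        import numpy as np

        def rx_scores(cube):
            """cube: (rows, cols, bands) array; returns per-pixel Mahalanobis scores."""
            h, w, b = cube.shape
            X = cube.reshape(-1, b).astype(float)
            mu = X.mean(axis=0)
            cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
            d = X - mu
            return np.einsum('ij,jk,ik->i', d, cov_inv, d).reshape(h, w)

        rng = np.random.default_rng(3)
        cube = rng.normal(size=(64, 64, 50))
        cube[10, 20] += 6.0                                 # implanted anomalous spectrum
        scores = rx_scores(cube)
        detections = scores > np.percentile(scores, 99.9)   # illustrative threshold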

  9. Anomaly Detection using Multi-channel FLAC for Supporting Diagnosis of ECG

    NASA Astrophysics Data System (ADS)

    Ye, Jiaxing; Kobayashi, Takumi; Murakawa, Masahiro; Higuchi, Tetsuya; Otsu, Nobuyuki

    In this paper, we propose an approach for abnormality detection in multi-channel ECG signals. This system serves as a front end to detect the irregular sections in ECG signals where symptoms may be observed. Thereby, the doctor can focus on only the detected suspected symptom sections, ignoring the disease-free parts. Hence the workload of the inspection by the doctors is significantly reduced and the diagnosis efficiency can be sharply improved. For extracting the predominant characteristics of multi-channel ECG signals, we propose multi-channel Fourier local auto-correlations (m-FLAC) features on multi-channel complex spectrograms. The method characterizes the amplitude and phase information as well as the temporal dynamics of the multi-channel ECG signal. At the anomaly detection stage, we employ the complex subspace method for statistically modeling the normal (healthy) ECG patterns, as in one-class learning. Then, we investigate the input ECG signals by measuring their deviation distance from the trained subspace. The ECG sections with disordered spectral distributions can be effectively discerned based on such a distance metric. To validate the proposed approach, we conducted experiments on an ECG dataset. The experimental results demonstrated the effectiveness of the proposed approach, including promising performance and high efficiency compared to conventional methods.
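
    The subspace-modelling step can be pictured as follows: learn a low-dimensional subspace from feature vectors of healthy sections and score new sections by how far they fall outside it. The sketch below uses plain PCA on generic feature vectors; the m-FLAC features and the complex subspace method of the paper are not reproduced, and the dimensions and threshold are illustrative.

        # Sketch: one-class scoring by distance to a subspace learned from normal data.
        import numpy as np

        def fit_subspace(F_normal, k=10):
            mu = F_normal.mean(axis=0)
            _, _, Vt = np.linalg.svd(F_normal - mu, full_matrices=False)
            return mu, Vt[:k]                         # top-k principal directions

        def residual_score(F, mu, basis):
            d = F - mu
            proj = d @ basis.T @ basis                # component inside the normal subspace
            return np.linalg.norm(d - proj, axis=1)   # distance outside the subspace

        rng = np.random.default_rng(4)
        F_normal = rng.normal(size=(300, 40))         # feature vectors from healthy sections
        mu, basis = fit_subspace(F_normal, k=10)
        F_test = rng.normal(size=(20, 40))
        F_test[0] += 5.0                              # one disordered section
        flags = residual_score(F_test, mu, basis) > np.percentile(
            residual_score(F_normal, mu, basis), 99)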

  10. CTS TEP thermal anomalies: Heat pipe system performance

    NASA Technical Reports Server (NTRS)

    Marcus, B. D.

    1977-01-01

    Part of the investigation of the thermal anomalies of the transmitter experiment package (TEP) on the Communications Technology Satellite (CTS), which were observed on four occasions in 1977, is summarized. Specifically, the possible failure modes of the variable conductance heat pipe system (VCHPS) used for principal thermal control of the high-power traveling wave tube in the TEP are considered. Further, the investigation examines how those malfunctions may have given rise to the TEP thermal anomalies. Using CTS flight data information, ground test results, analysis conclusions, and other relevant information, the investigation concentrated on artery depriming as the most likely VCHPS failure mode. Included in the study as possible depriming mechanisms were freezing of the working fluid, Marangoni flow, and gas evolution within the arteries. The report concludes that while depriming of the heat pipe arteries is consistent with the bulk of the observed data, the factors which cause the arteries to deprime have yet to be identified.

  11. Anomaly Detection in Host Signaling Pathways for the Early Prognosis of Acute Infection.

    PubMed

    Wang, Kun; Langevin, Stanley; O'Hern, Corey S; Shattuck, Mark D; Ogle, Serenity; Forero, Adriana; Morrison, Juliet; Slayden, Richard; Katze, Michael G; Kirby, Michael

    2016-01-01

    The development of diagnostic tools to distinguish between acute viral and bacterial respiratory infections is critical to improve patient care and limit the overuse of antibiotics in the medical community. The identification of prognostic respiratory virus biomarkers provides an early warning system that is capable of predicting which subjects will become symptomatic, expanding our medical diagnostic capabilities and treatment options for acute infectious diseases. The host response to acute infection may be viewed as a deterministic signaling network responsible for maintaining the health of the host organism. We identify pathway signatures that reflect the very earliest perturbations in the host response to acute infection. These pathways provide a means to monitor the health state of the host, using anomaly detection to quantify and predict health outcomes in response to pathogens. PMID:27532264

  12. Anomaly Detection in Host Signaling Pathways for the Early Prognosis of Acute Infection

    PubMed Central

    O’Hern, Corey S.; Shattuck, Mark D.; Ogle, Serenity; Forero, Adriana; Morrison, Juliet; Slayden, Richard; Katze, Michael G.

    2016-01-01

    The development of diagnostic tools to distinguish between acute viral and bacterial respiratory infections is critical to improve patient care and limit the overuse of antibiotics in the medical community. The identification of prognostic respiratory virus biomarkers provides an early warning system that is capable of predicting which subjects will become symptomatic, expanding our medical diagnostic capabilities and treatment options for acute infectious diseases. The host response to acute infection may be viewed as a deterministic signaling network responsible for maintaining the health of the host organism. We identify pathway signatures that reflect the very earliest perturbations in the host response to acute infection. These pathways provide a means to monitor the health state of the host, using anomaly detection to quantify and predict health outcomes in response to pathogens. PMID:27532264

  13. Reasoning about anomalies: a study of the analytical process of detecting and identifying anomalous behavior in maritime traffic data

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran; Ziemke, Tom; Kronhamn, Thomas

    2009-05-01

    The goal of visual analytical tools is to support the analytical reasoning process, maximizing human perceptual, understanding and reasoning capabilities in complex and dynamic situations. Visual analytics software must be built upon an understanding of the reasoning process, since it must provide appropriate interactions that allow a true discourse with the information. In order to deepen our understanding of the human analytical process and guide developers in the creation of more efficient anomaly detection systems, this paper investigates the human analytical process of detecting and identifying anomalous behavior in maritime traffic data. The main focus of this work is to capture the entire analysis process that an analyst goes through, from the raw data to the detection and identification of anomalous behavior. Three different sources are used in this study: a literature survey of the science of analytical reasoning, requirements specified by experts from organizations with an interest in port security, and user field studies conducted in different marine surveillance control centers. Furthermore, this study elaborates on how to support the human analytical process using data mining, visualization and interaction methods. The contribution of this paper is twofold: (1) within visual analytics, to contribute to the science of analytical reasoning with a practical understanding of users' tasks in order to develop a taxonomy of interactions that support the analytical reasoning process and (2) within anomaly detection, to facilitate the design of future anomaly detection systems when fully automatic approaches are not viable and human participation is needed.

  14. Low frequency of Y anomaly detected in Australian Brahman cow-herds.

    PubMed

    de Camargo, Gregório M F; Porto-Neto, Laercio R; Fortes, Marina R S; Bunch, Rowan J; Tonhati, Humberto; Reverter, Antonio; Moore, Stephen S; Lehnert, Sigrid A

    2015-02-01

    Indicine cattle have lower reproductive performance in comparison to taurine cattle. A chromosomal anomaly characterized by the presence of Y markers in females was reported and associated with infertility in cattle. The aim of this study was to investigate the occurrence of the anomaly in Brahman cows. Brahman cows (n = 929) were genotyped for a Y chromosome specific region using real-time PCR. Only six out of 929 cows had the anomaly (0.6%). The anomaly frequency was much lower in Brahman cows than in the crossbred population in which it was first detected. The anomaly also does not appear to affect pregnancy in the population. Due to the low frequency, association analyses could not be performed. Further, the SNP signal of the pseudoautosomal boundary region of the Y chromosome was investigated using an HD SNP chip. Pooled DNA of "non-pregnant" and "pregnant" cows was compared and no difference in SNP allele frequency was observed. Results suggest that the anomaly had a very low frequency in this Australian Brahman population and had no effect on reproduction. Further studies comparing pregnant cows and cows that failed to conceive should be performed after better assembly and annotation of the Y chromosome in cattle. PMID:25750859

  15. Low frequency of Y anomaly detected in Australian Brahman cow-herds

    PubMed Central

    de Camargo, Gregório M.F.; Porto-Neto, Laercio R.; Fortes, Marina R.S.; Bunch, Rowan J.; Tonhati, Humberto; Reverter, Antonio; Moore, Stephen S.; Lehnert, Sigrid A.

    2015-01-01

    Indicine cattle have lower reproductive performance in comparison to taurine cattle. A chromosomal anomaly characterized by the presence of Y markers in females was reported and associated with infertility in cattle. The aim of this study was to investigate the occurrence of the anomaly in Brahman cows. Brahman cows (n = 929) were genotyped for a Y chromosome specific region using real-time PCR. Only six out of 929 cows had the anomaly (0.6%). The anomaly frequency was much lower in Brahman cows than in the crossbred population in which it was first detected. The anomaly also does not appear to affect pregnancy in the population. Due to the low frequency, association analyses could not be performed. Further, the SNP signal of the pseudoautosomal boundary region of the Y chromosome was investigated using an HD SNP chip. Pooled DNA of “non-pregnant” and “pregnant” cows was compared and no difference in SNP allele frequency was observed. Results suggest that the anomaly had a very low frequency in this Australian Brahman population and had no effect on reproduction. Further studies comparing pregnant cows and cows that failed to conceive should be performed after better assembly and annotation of the Y chromosome in cattle. PMID:25750859

  16. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
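
    The time-series component of such hybrid schemes amounts to standardizing each pixel of a new scene against a per-pixel reference built from many images of the same calendar month, and flagging pixels that sit several standard deviations above the reference. A compact sketch follows; the array shapes and the threshold are illustrative assumptions, not the MODVOLC/RST parameters.

        # Sketch of an RST-like per-pixel index: compare a new thermal image with the
        # pixel-wise mean and standard deviation of a stack of reference images.
        import numpy as np

        def thermal_anomaly_index(new_image, reference_stack):
            mu = np.nanmean(reference_stack, axis=0)
            sd = np.nanstd(reference_stack, axis=0) + 1e-6
            return (new_image - mu) / sd              # standardized anomaly index

        rng = np.random.default_rng(5)
        reference = 290 + rng.normal(scale=2.0, size=(80, 128, 128))   # e.g. 80 scenes per month
        scene = 290 + rng.normal(scale=2.0, size=(128, 128))
        scene[60:62, 60:62] += 25.0                   # small synthetic hotspot
        index = thermal_anomaly_index(scene, reference)
        hotspots = index > 4.0                        # illustrative threshold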

  17. [A Hyperspectral Imagery Anomaly Detection Algorithm Based on Gauss-Markov Model].

    PubMed

    Gao, Kun; Liu, Ying; Wang, Li-jing; Zhu, Zhen-yu; Cheng, Hao-bo

    2015-10-01

    With the development of spectral imaging technology, hyperspectral anomaly detection is becoming more and more widely used in remote sensing imagery processing. The traditional RX anomaly detection algorithm neglects the spatial correlation of images. Besides, it does not validly reduce the data dimension, which costs too much processing time and shows low validity on hyperspectral data. Hyperspectral images follow a Gauss-Markov Random Field (GMRF) model in the spatial and spectral dimensions. The inverse of the covariance matrix can be calculated directly from the Gauss-Markov parameters, which avoids a huge amount of computation on the hyperspectral data. This paper proposes an improved RX anomaly detection algorithm based on a three-dimensional GMRF. The hyperspectral imagery data are simulated with the GMRF model, and the GMRF parameters are estimated with the approximated maximum likelihood method. The detection operator is constructed with the GMRF estimation parameters. The pixel under test is taken as the centre of a local optimization window, called the GMRF detection window. The degree of abnormality is calculated from the mean vector and inverse covariance matrix, both estimated within the window. The image is processed pixel by pixel as the GMRF window moves. The traditional RX detection algorithm, the regional hypothesis detection algorithm based on GMRF, and the algorithm proposed in this paper are simulated with AVIRIS hyperspectral data. Simulation results show that the proposed anomaly detection method is able to improve detection efficiency and reduce the false alarm rate. Operation-time statistics for the three algorithms were collected in the same computing environment; the proposed algorithm reduces the operation time by 45.2%, showing good computational efficiency. PMID:26904830

  18. Using new edges for anomaly detection in computer networks

    DOEpatents

    Neil, Joshua Charles

    2015-05-19

    Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data of a frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.
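
    A minimal sketch of the idea: keep per-node historical rates of initiating new edges, estimate how surprising each new edge is, and combine the edge-level probabilities along a path into a single score. The smoothed rate model, node names, and scoring rule below are simplified assumptions for illustration, not the patented method.

        # Sketch: score new edges against per-node historical new-edge rates and
        # combine edge probabilities along a path into one anomaly score.
        import math
        from collections import defaultdict

        history_new_edges = defaultdict(int)      # node -> count of new edges initiated
        history_windows = 1000                    # number of historical time windows

        def edge_probability(src):
            # Smoothed estimate of "src creates a new edge in a window".
            return (history_new_edges[src] + 1) / (history_windows + 2)

        def path_score(path_edges):
            # Higher score (lower log-probability) => more surprising chain of new edges.
            return -sum(math.log(edge_probability(src)) for src, _dst in path_edges)

        history_new_edges.update({'workstation17': 40, 'server02': 2})
        suspicious_path = [('workstation17', 'server02'), ('server02', 'db01')]
        print(path_score(suspicious_path))        # compare against an empirical threshold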

  19. Approaches for detecting behavioural anomalies in public areas using video surveillance data

    NASA Astrophysics Data System (ADS)

    Brax, Christoffer; Laxhammar, Rikard; Niklasson, Lars

    2008-10-01

    In many surveillance missions information from a large number of interconnected sensors must be analysed in real time. When using visual sensors like CCTV cameras, it is not uncommon that an operator simultaneously has to survey the information from as many as fifty to a hundred cameras. It is obvious that the probability that the operator finds interesting observations is quite low when surveying information from that many cameras. In this paper we evaluate two different approaches for automatically detecting anomalies in data from visual surveillance sensors. Using the approaches suggested here the system can automatically direct the operator to the cameras where some possibly interesting activities take place. The approaches include creating structures for representing data, building "normal models" by filling the structures with data for the situation at hand, and finally detecting deviations in new data. One approach allows detections based on the incorporation of a priori knowledge about the situation combined with data-driven analysis. The other approach makes as few assumptions as possible about the situation at hand and builds almost entirely on data-driven analysis. The proposed approaches are evaluated off-line using real-world data and the results show that the approaches can be used in real-time applications to support operators in civil and military surveillance applications.

  20. Dual Use Corrosion Inhibitor and Penetrant for Anomaly Detection in Neutron/X Radiography

    NASA Technical Reports Server (NTRS)

    Hall, Phillip B. (Inventor); Novak, Howard L. (Inventor)

    2004-01-01

    A dual purpose corrosion inhibitor and penetrant composition sensitive to radiography interrogation is provided. The corrosion inhibitor mitigates or eliminates corrosion on the surface of a substrate upon which the corrosion inhibitor is applied. In addition, the corrosion inhibitor provides for the attenuation of a signal used during radiography interrogation thereby providing for detection of anomalies on the surface of the substrate.

  1. Anomaly Detection in the Right Hemisphere: The Influence of Visuospatial Factors

    ERIC Educational Resources Information Center

    Smith, Stephen D.; Dixon, Michael J.; Tays, William J.; Bulman-Fleming, M. Barbara

    2004-01-01

    Previous research with both brain-damaged and neurologically intact populations has demonstrated that the right cerebral hemisphere (RH) is superior to the left cerebral hemisphere (LH) at detecting anomalies (or incongruities) in objects (Ramachandran, 1995; Smith, Tays, Dixon, & Bulman-Fleming, 2002). The current research assesses whether the RH…

  2. Underwater magnetic gradiometer for magnetic anomaly detection, localization, and tracking

    NASA Astrophysics Data System (ADS)

    Kumar, S.; Sulzberger, G.; Bono, J.; Skvoretz, D.; Allen, G. I.; Clem, T. R.; Ebbert, M.; Bennett, S. L.; Ostrom, R. K.; Tzouris, A.

    2007-04-01

    GE Security and the Naval Surface Warfare Center, Panama City (NSWC-PC) have collaborated to develop a magnetic gradiometer, called the Real-time Tracking Gradiometer or RTG, that is mounted inside an unmanned underwater vehicle (UUV). The RTG is part of a buried mine hunting platform being developed by the United States Navy. The RTG has been successfully used to make test runs on mine-like targets buried off the coast of Florida. We will present a general description of the system and the latest results describing system performance. This system can also potentially be used for other applications, including those in the area of Homeland Security.

  3. 3D Reconstruction For The Detection Of Cranial Anomalies

    NASA Astrophysics Data System (ADS)

    Kettner, B.; Shalev, S.; Lavelle, C.

    1986-01-01

    There is a growing interest in the use of three-dimensional (3D) cranial reconstruction from CT scans for surgical planning. A low-cost imaging system has been developed, which provides pseudo-3D images which may be manipulated to reveal the craniofacial skeleton as a whole or any particular component region. The contrast between congenital (hydrocephalic), normocephalic and acquired (carcinoma of the maxillary sinus) anomalous cranial forms demonstrates the potential of this system.

  4. Magnetic anomaly detection (MAD) of ferromagnetic pipelines using principal component analysis (PCA)

    NASA Astrophysics Data System (ADS)

    Sheinker, Arie; Moldwin, Mark B.

    2016-04-01

    The magnetic anomaly detection (MAD) method is used for detection of visually obscured ferromagnetic objects. The method exploits the magnetic field originating from the ferromagnetic object, which constitutes an anomaly in the ambient earth’s magnetic field. Traditionally, MAD is used to detect objects with a magnetic field of a dipole structure, where far from the object it can be considered as a point source. In the present work, we expand MAD to the case of a non-dipole source, i.e. a ferromagnetic pipeline. We use principal component analysis (PCA) to calculate the principal components, which are then employed to construct an effective detector. Experiments conducted in our lab with real-world data validate the above analysis. The simplicity, low computational complexity, and the high detection rate make the proposed detector attractive for real-time, low power applications.
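
    A toy version of a PCA-based detector for three-axis magnetometer data is sketched below: samples are whitened using principal axes estimated from a target-free background record, and a sliding mean of the whitened energy is thresholded. The window length, noise level, and threshold are illustrative assumptions, not the authors' detector.

        # Toy PCA-based detector for 3-axis magnetometer data: samples are whitened with
        # principal axes learned from target-free background, and a sliding mean energy
        # is thresholded.
        import numpy as np

        rng = np.random.default_rng(6)
        background = rng.normal(scale=5.0, size=(20000, 3))       # nT-level sensor noise
        mu = background.mean(axis=0)
        cov = np.cov(background - mu, rowvar=False)
        eigval, eigvec = np.linalg.eigh(cov)
        whiten = eigvec / np.sqrt(eigval)                          # PCA whitening matrix

        def energy(record, win=200):
            z = (record - mu) @ whiten                             # decorrelated, unit-variance components
            e = np.sum(z ** 2, axis=1)
            kernel = np.ones(win) / win
            return np.convolve(e, kernel, mode='valid')            # sliding mean energy

        threshold = np.percentile(energy(background), 99.9)
        signal = background.copy()
        signal[10000:10400, 0] += 40.0                             # ferromagnetic anomaly signature
        print(energy(signal).max() > threshold)                    # expected: True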

  5. Detection of Local Anomalies in High Resolution Hyperspectral Imagery Using Geostatistical Filtering and Local Spatial Statistics

    NASA Astrophysics Data System (ADS)

    Goovaerts, P.; Jacquez, G. M.; Marcus, A. W.

    2004-12-01

    Spatial data are periodically collected and processed to monitor, analyze and interpret developments in our changing environment. Remote sensing is a modern way of data collecting and has seen an enormous growth since launching of modern satellites and development of airborne sensors. In particular, the recent availability of high spatial resolution hyperspectral imagery (spatial resolution of less than 5 meters, and including data collected over 64 or more bands of electromagnetic radiation for each pixel) offers great potential to significantly enhance environmental mapping and our ability to model spatial systems. High spatial resolution imagery contains a remarkable quantity of information that could be used to analyze spatial breaks (boundaries), areas of similarity (clusters), and spatial autocorrelation (associations) across the landscape. This paper addresses the specific issue of soil disturbance detection, which could indicate the presence of land mines or recent movements of troops and heavy equipment. A challenge presented by soil detection is to retain the measurement of fine-scale features (i.e. mineral soil changes, organic content changes, vegetation disturbance related changes, aspect changes) while still covering proportionally large spatial areas. An additional difficulty is that no ground data might be available for the calibration of spectral signatures, and little might be known about the size of patches of disturbed soils to be detected. This paper describes a new technique for automatic target detection which capitalizes on both spatial and across spectral bands correlation, does not require any a priori information on the target spectral signature but does not allow discrimination between targets. This approach involves successively a multivariate statistical analysis (principal component analysis) of all spectral bands, a geostatistical filtering of noise and regional background in the first principal components using factorial kriging, and

  6. Towards spatial localisation of harmful algal blooms; statistics-based spatial anomaly detection

    NASA Astrophysics Data System (ADS)

    Shutler, J. D.; Grant, M. G.; Miller, P. I.

    2005-10-01

    Harmful algal blooms are believed to be increasing in occurrence and their toxins can be concentrated by filter-feeding shellfish and cause amnesia or paralysis when ingested. As a result fisheries and beaches in the vicinity of blooms may need to be closed and the local population informed. For this avoidance planning, timely information on the existence of a bloom, its species, and an accurate map of its extent is needed. Current research to detect these blooms from space has mainly concentrated on spectral approaches towards determining species. We present a novel statistics-based background-subtraction technique that produces improved descriptions of an anomaly's extent from remotely-sensed ocean colour data. This is achieved by extracting bulk information from a background model; this is complemented by a computer vision ramp filtering technique to specifically detect the perimeter of the anomaly. The complete extraction technique uses temporal-variance estimates which control the subtraction of the scene of interest from the time-weighted background estimate, producing confidence maps of anomaly extent. Through the variance estimates the method learns the associated noise present in the data sequence, providing robustness, and allowing generic application. Further, the use of the median for the background model reduces the effects of anomalies that appear within the time sequence used to generate it, allowing seasonal variations in the background levels to be closely followed. To illustrate the detection algorithm's application, it has been applied to two spectrally different oceanic regions.
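
    The core of such a background-subtraction scheme can be sketched as a robust temporal background with a variance-based normalization of the difference image. The half-life weighting, scene sizes, and threshold below are illustrative simplifications of the technique described above.

        # Sketch: statistics-based background subtraction for a time sequence of
        # ocean-colour scenes; the anomaly map is a variance-normalized difference
        # from a median background.
        import numpy as np

        def anomaly_map(scene, history, half_life=5.0):
            ages = np.arange(len(history), 0, -1)                 # oldest scene first
            weights = 0.5 ** (ages / half_life)                   # time weighting
            background = np.median(history, axis=0)               # robust background model
            spread = np.sqrt(np.average((history - background) ** 2,
                                        axis=0, weights=weights)) + 1e-6
            return (scene - background) / spread                  # confidence-style map

        rng = np.random.default_rng(7)
        history = rng.normal(loc=1.0, scale=0.1, size=(20, 100, 100))   # chlorophyll-like scenes
        scene = rng.normal(loc=1.0, scale=0.1, size=(100, 100))
        scene[40:55, 40:55] += 1.5                                # bloom-like anomaly
        extent = anomaly_map(scene, history) > 4.0                # illustrative threshold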

  7. Anomaly Detection in Large Sets of High-Dimensional Symbol Sequences

    NASA Technical Reports Server (NTRS)

    Budalakoti, Suratna; Srivastava, Ashok N.; Akella, Ram; Turkov, Eugene

    2006-01-01

    This paper addresses the problem of detecting and describing anomalies in large sets of high-dimensional symbol sequences. The approach taken uses unsupervised clustering of sequences using the normalized longest common subsequence (LCS) as a similarity measure, followed by detailed analysis of outliers to detect anomalies. As the LCS measure is expensive to compute, the first part of the paper discusses existing algorithms, such as the Hunt-Szymanski algorithm, that have low time-complexity. We then discuss why these algorithms often do not work well in practice and present a new hybrid algorithm for computing the LCS that, in our tests, outperforms the Hunt-Szymanski algorithm by a factor of five. The second part of the paper presents new algorithms for outlier analysis that provide comprehensible indicators as to why a particular sequence was deemed to be an outlier. The algorithms provide a coherent description to an analyst of the anomalies in the sequence, compared to more normal sequences. The algorithms we present are general and domain-independent, so we discuss applications in related areas such as anomaly detection.
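
    The similarity measure itself is easy to state: the normalized LCS is the length of the longest common subsequence divided by, for example, the geometric mean of the two sequence lengths. A plain quadratic-time dynamic-programming sketch is shown below; it is not the faster hybrid algorithm described in the paper, and the example sequences are illustrative.

        # Plain O(len(a)*len(b)) dynamic-programming LCS and a normalized similarity.
        import numpy as np

        def lcs_length(a, b):
            dp = np.zeros((len(a) + 1, len(b) + 1), dtype=int)
            for i, x in enumerate(a, 1):
                for j, y in enumerate(b, 1):
                    dp[i, j] = dp[i - 1, j - 1] + 1 if x == y else max(dp[i - 1, j], dp[i, j - 1])
            return int(dp[len(a), len(b)])

        def normalized_lcs(a, b):
            return lcs_length(a, b) / np.sqrt(len(a) * len(b))    # 1.0 for identical sequences

        normal_flight = list("ABCDEFG")            # symbolic switch-action sequences (illustrative)
        odd_flight = list("ABXXEFG")
        print(normalized_lcs(normal_flight, odd_flight))          # ~0.71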

  8. A Stochastic-entropic Approach to Detect Persistent Low-temperature Volcanogenic Thermal Anomalies

    NASA Astrophysics Data System (ADS)

    Pieri, D. C.; Baxter, S.

    2011-12-01

    Eruption prediction is a chancy idiosyncratic affair, as volcanoes often manifest waxing and/or waning pre-eruption emission, geodetic, and seismic behavior that is unsystematic. Thus, fundamental to increased prediction accuracy and precision are good and frequent assessments of the time-series behavior of relevant precursor geophysical, geochemical, and geological phenomena, especially when volcanoes become restless. The Advanced Spaceborne Thermal Emission and Reflection radiometer (ASTER), in orbit since 1999 on the NASA Terra Earth Observing System satellite is an important capability for detection of thermal eruption precursors (even subtle ones) and increased passive gas emissions. The unique combination of ASTER high spatial resolution multi-spectral thermal IR imaging data (90m/pixel; 5 bands in the 8-12um region), combined with simultaneous visible and near-IR imaging data, and stereo-photogrammetric capabilities make it a useful, especially thermal, precursor detection tool. The JPL ASTER Volcano Archive consisting of 80,000+ASTER volcano images allows systematic analysis of (a) baseline thermal emissions for 1550+ volcanoes, (b) important aspects of the time-dependent thermal variability, and (c) the limits of detection of temporal dynamics of eruption precursors. We are analyzing a catalog of the magnitude, frequency, and distribution of ASTER-documented volcano thermal signatures, compiled from 2000 onward, at 90m/pixel. Low contrast thermal anomalies of relatively low apparent absolute temperature (e.g., summit lakes, fumarolically altered areas, geysers, very small sub-pixel hotspots), for which the signal-to-noise ratio may be marginal (e.g., scene confusion due to clouds, water and water vapor, fumarolic emissions, variegated ground emissivity, and their combinations), are particularly important to discern and monitor. We have developed a technique to detect persistent hotspots that takes into account in-scene observed pixel joint frequency

  9. Detection of Surface Temperature Anomalies in the Coso Geothermal Field Using Thermal Infrared Remote Sensing

    NASA Astrophysics Data System (ADS)

    Coolbaugh, M.; Eneva, M.; Bjornstad, S.; Combs, J.

    2007-12-01

    We use thermal infrared (TIR) data from the spaceborne ASTER instrument to detect surface temperature anomalies in the Coso geothermal field in eastern California. The identification of such anomalies in a known geothermal area serves as an incentive to search for similar markers to areas of unknown geothermal potential. We carried out field measurements concurrently with the collection of ASTER images. The field data included reflectance, subsurface and surface temperatures, and radiosonde atmospheric profiles. We apply techniques specifically targeted to correct for thermal artifacts caused by topography, albedo, and thermal inertia. This approach has the potential to reduce data noise and to reveal thermal anomalies which are not distinguishable in the uncorrected imagery. The combination of remote sensing and field data can be used to evaluate the performance of TIR remote sensing as a cost-effective geothermal exploration tool.

  10. GNSS reflectometry aboard the International Space Station: phase-altimetry simulation to detect ocean topography anomalies

    NASA Astrophysics Data System (ADS)

    Semmling, Maximilian; Leister, Vera; Saynisch, Jan; Zus, Florian; Wickert, Jens

    2016-04-01

    An ocean altimetry experiment using Earth-reflected GNSS signals has been proposed to the European Space Agency (ESA). It is part of the GNSS Reflectometry Radio Occultation Scatterometry (GEROS) mission that is planned aboard the International Space Station (ISS). Altimetric simulations are presented that examine the detection of ocean topography anomalies assuming GNSS phase delay observations. Such delay measurements are well established for positioning and are possible due to a sufficient synchronization of GNSS receiver and transmitter. For altimetric purposes, delays of Earth-reflected GNSS signals can be observed similarly to radar altimeter signals. The advantage of GNSS is the synchronized separation of transmitter and receiver, which allows a significantly increased number of observations per receiver due to the more than 70 GNSS transmitters currently in orbit. The altimetric concept has already been applied successfully to flight data recorded over the Mediterranean Sea. The presented altimetric simulation considers anomalies in the Agulhas current region which are obtained from the Regional Ocean Modeling System (ROMS). Suitable reflection events in an elevation range between 3° and 30° last about 10 min, with ground-track lengths >3000 km. Typical along-track footprints (1 s signal integration time) have a length of about 5 km. The reflection's Fresnel zone limits the footprint of coherent observations to a major-axis extension of between 1 and 6 km, depending on the elevation. The altimetric performance depends on the signal-to-noise ratio (SNR) of the reflection. Simulation results show that precision is better than 10 cm for an SNR of 30 dB, whereas it is worse than 0.5 m if the SNR goes down to 10 dB. Precision, in general, improves towards higher elevation angles. Critical biases are introduced by atmospheric and ionospheric refraction. Corresponding correction strategies are still under investigation.

  11. Application of Artificial Bee Colony algorithm in TEC seismo-ionospheric anomalies detection

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2015-09-01

    In this study, the efficiency of the Artificial Bee Colony (ABC) algorithm is investigated for detecting TEC (Total Electron Content) seismo-ionospheric anomalies around the time of several strong earthquakes, including Chile (27 February 2010; 01 April 2014), Varzeghan (11 August 2012), Saravan (16 April 2013) and Papua New Guinea (29 March 2015). In comparison with other anomaly detection algorithms, ABC has a number of advantages, which can be enumerated as (1) detection of discord patterns in large nonlinear data during a short time, (2) simplicity, (3) fewer control parameters and (4) efficiency in solving multimodal and multidimensional optimization problems. The results of this study also support the TEC time series as a robust earthquake precursor.

  12. System for detection of hazardous events

    DOEpatents

    Kulesz, James J.; Worley, Brian A.

    2006-05-23

    A system for detecting the occurrence of anomalies includes a plurality of spaced-apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.

  13. System For Detection Of Hazardous Events

    DOEpatents

    Kulesz, James J [Oak Ridge, TN; Worley, Brian A [Knoxville, TN

    2005-08-16

    A system for detecting the occurrence of anomalies includes a plurality of spaced-apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.

  14. Millimeter Wave Detection of Localized Anomalies in the Space Shuttle External Fuel Tank Insulating Foam

    NASA Technical Reports Server (NTRS)

    Kharkovsky, S.; Case, J. T.; Abou-Khousa, M. A.; Zoughi, R.; Hepburn, F.

    2006-01-01

    The Space Shuttle Columbia's catastrophic accident emphasizes the growing need for developing and applying effective, robust and life-cycle oriented nondestructive testing (NDT) methods for inspecting the shuttle external fuel tank spray-on foam insulation (SOFI). Millimeter wave NDT techniques were among the methods chosen for evaluating their potential for inspecting these structures. Several panels with embedded anomalies (mainly voids) were produced and tested for this purpose. Near-field and far-field millimeter wave NDT methods were used for producing images of the anomalies in these panels. This paper presents the results of an investigation for the purpose of detecting localized anomalies in several SOFI panels. To this end, reflectometers at a relatively wide range of frequencies (Ka-band (26.5 - 40 GHz) to W-band (75 - 110 GHz)) and utilizing different types of radiators were employed. The resulting raw images revealed a significant amount of information about the interior of these panels. However, using simple image processing techniques, the results were improved, in particular as relates to detecting the smaller anomalies. This paper presents the results of this investigation and a discussion of these results.

  15. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R. J.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by.

  16. Processing forward-looking data for anomaly detection: single-look, multi-look, and spatial classification

    NASA Astrophysics Data System (ADS)

    Malof, Jordan M.; Morton, Kenneth D., Jr.; Collins, Leslie M.; Torrione, Peter A.

    2012-06-01

    Many effective buried threat detection systems rely on close proximity and near vertical deployment over subsurface objects before reasonable performance can be obtained. A forward-looking sensor configuration, where an object can be detected from much greater distances, allows for safer detection of buried explosive threats, and increased rates of advance. Forward-looking configurations also provide an additional advantage of yielding multiple perspectives and looks at each subsurface area, and data from these multiple pose angles can be potentially exploited for improved detection. This work investigates several aspects of detection algorithms that can be applied to forward-looking imagery. Previous forward-looking detection algorithms have employed several anomaly detection algorithms, such as the RX algorithm. In this work the performance of the RX algorithm is compared to a scale-space approach based on Laplacian of Gaussian filtering. This work also investigates methods to combine the detection output from successive frames to aid detection performance. This is done by exploiting the spatial colocation of detection alarms after they are mapped from image coordinates into world coordinates. The performance of the resulting algorithms is measured on data from a forward-looking vehicle-mounted optical sensor system collected over several lanes at a western U.S. test facility. Results indicate that exploiting the spatial colocation of detections made in successive frames can yield improved performance.
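
    The scale-space alternative to RX mentioned above can be sketched as a Laplacian-of-Gaussian filter bank applied to a forward-looking frame, keeping strong scale-normalized responses as candidate alarms; mapping alarms into world coordinates and clustering co-located detections across frames is omitted here. The scales, threshold rule, and synthetic frame are illustrative assumptions.

        # Sketch: Laplacian-of-Gaussian responses at several scales; strong responses
        # are kept as candidate alarms.
        import numpy as np
        from scipy.ndimage import gaussian_laplace

        def log_alarms(image, sigmas=(2, 4, 8), k=5.0):
            responses = np.stack([-sigma ** 2 * gaussian_laplace(image, sigma)
                                  for sigma in sigmas])           # negated, scale-normalized LoG
            best = responses.max(axis=0)                          # strongest response over scales
            thresh = best.mean() + k * best.std()
            return np.argwhere(best > thresh)                     # pixel coordinates of alarms

        rng = np.random.default_rng(8)
        frame = rng.normal(size=(200, 300))
        yy, xx = np.mgrid[0:200, 0:300]
        frame += 2.0 * np.exp(-(((yy - 120) ** 2 + (xx - 150) ** 2) / (2 * 5.0 ** 2)))   # blob target
        print(log_alarms(frame)[:5])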

  17. Data requirements for an anomaly detector in an automated safeguards system using neural networks

    SciTech Connect

    Whiteson, R.; Britschgi, J.J.

    1993-08-01

    An automated safeguards system must be able to detect and identify anomalous events in a near-real-time manner. Our approach to anomaly detection is based on the demonstrated ability of neural networks to model complex, nonlinear, real-time processes. By modeling the normal behavior of processes, we can predict how a system should behave and, thereby, detect when an abnormal state or event occurs. In this paper, we explore the computational intensity of training neural networks, and we discuss the issues involved in gathering and preprocessing the safeguards data necessary to train a neural network for anomaly detection. We explore data requirements for training neural networks and evaluate how different features of the training data affect the training and operation of the networks. We use actual process data to train our previous 3-tank model and compare the results to those achieved using simulated safeguards data. Comparisons are made on the basis of required training times in addition to correctness of prediction.

  18. An Approach to Detecting Crowd Anomalies for Entrance and Checkpoint Security

    NASA Astrophysics Data System (ADS)

    Zelnio, Holly

    This thesis develops an approach for detecting behavioral anomalies using tracks of pedestrians, including specified threat tracks. The application area is installation security, with a focus on monitoring the entrances of these installations. The approach specifically allows operator interaction to specify threats and to interactively adjust the system parameters depending on the context of the situation. This research has identified physically meaningful features that are developed and organized in a manner such that features can be systematically added or deleted depending on the situation and operator preference. The features can be used with standard classifiers such as the one-class support vector machine that is used in this research. The one-class support vector machine is very stable for this application and provides significant insight into the nature of its decision boundary. Its stability and ease of use stem from a unique automatic tuning approach that is computationally efficient and compares favorably with competing approaches. This automatic tuning approach is believed to be novel and was developed as part of this research. Results are provided using both measured and synthetic data.
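
    The classification stage can be illustrated with scikit-learn's one-class SVM applied to simple track features such as average speed, mean heading change, and dwell time. The features, kernel parameters, and fixed nu below are illustrative stand-ins; the thesis's automatic tuning procedure is not reproduced.

        # Sketch: one-class SVM trained on features of normal pedestrian tracks;
        # new tracks outside the learned boundary are flagged as anomalous.
        import numpy as np
        from sklearn.svm import OneClassSVM
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(9)
        # columns: average speed (m/s), mean |heading change| (rad), dwell time (s)
        normal_tracks = np.column_stack([
            rng.normal(1.4, 0.2, 500),
            rng.normal(0.1, 0.05, 500),
            rng.normal(5.0, 2.0, 500),
        ])
        scaler = StandardScaler().fit(normal_tracks)
        model = OneClassSVM(kernel='rbf', nu=0.05, gamma='scale').fit(
            scaler.transform(normal_tracks))

        loiterer = np.array([[0.2, 0.8, 120.0]])            # slow, erratic, long dwell
        print(model.predict(scaler.transform(loiterer)))    # -1 indicates an anomaly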

  19. Conformal prediction for anomaly detection and collision alert in space surveillance

    NASA Astrophysics Data System (ADS)

    Chen, Huimin; Chen, Genshe; Blasch, Erik; Pham, Khanh

    2013-05-01

    Anomaly detection has been considered as an important technique for detecting critical events in a wide range of data rich applications where a majority of the data is inconsequential and/or uninteresting. We study the detection of anomalous behaviors among space objects using the theory of conformal prediction for distribution-independent on-line learning to provide collision alerts with a desirable confidence level. We exploit the fact that conformal predictors provide valid forecasted sets at specified confidence levels under the relatively weak assumption that the normal training data, together with the normal testing data, are generated from the same distribution. If the actual observation is not included in the conformal prediction set, it is classified as anomalous at the corresponding significance level. Interpreting the significance level as an upper bound of the probability that a normal observation is mistakenly classified as anomalous, we can conveniently adjust the sensitivity to anomalies while controlling the false alarm rate without having to find the application specific threshold. The proposed conformal prediction method was evaluated for a space surveillance application using the open source North American Aerospace Defense Command (NORAD) catalog data. The validity of the prediction sets is justified by the empirical error rate that matches the significance level. In addition, experiments with simulated anomalous data indicate that anomaly detection sensitivity with conformal prediction is superior to that of the existing methods in declaring potential collision events.
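
    A minimal sketch of conformal anomaly detection with a k-nearest-neighbour nonconformity measure: the p-value of a new observation is the fraction of calibration scores at least as extreme, and observations with p-value below the chosen significance level are declared anomalous. The nonconformity measure and the two-dimensional toy data are illustrative assumptions, not the NORAD catalog experiment.

        # Sketch: conformal anomaly detection with a kNN nonconformity score.
        import numpy as np

        def knn_score(x, reference, k=5):
            d = np.sort(np.linalg.norm(reference - x, axis=1))
            return d[:k].mean()                           # average distance to k nearest neighbours

        def conformal_p_value(x, train, calibration_scores, k=5):
            alpha = knn_score(x, train, k)
            return (np.sum(calibration_scores >= alpha) + 1) / (len(calibration_scores) + 1)

        rng = np.random.default_rng(10)
        normal = rng.normal(size=(400, 2))                # e.g. residual orbit features
        train, calib = normal[:300], normal[300:]
        calib_scores = np.array([knn_score(x, train) for x in calib])

        epsilon = 0.05                                    # significance level
        new_obs = np.array([4.0, 4.0])                    # far from normal behaviour
        print(conformal_p_value(new_obs, train, calib_scores) < epsilon)   # True => anomalous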

  20. An earthquake from space: detection of precursory magnetic anomalies from Swarm satellites before the 2015 M8 Nepal Earthquake

    NASA Astrophysics Data System (ADS)

    De Santis, A.; Balasis, G.; Pavón-Carrasco, F. J.; Cianchini, G.; Mandea, M.

    2015-12-01

    A large earthquake of around magnitude 8 occurred on 25 April 2015, 06:26 UTC, with its epicenter in Nepal, causing more than 9000 fatalities and devastating destruction. The contemporaneous orbiting of ESA's three Swarm satellites in the topside ionosphere makes it possible to look for possible pre-earthquake magnetic anomaly signals, likely due to some lithosphere-atmosphere-ionosphere (LAI) coupling. First, a wavelet analysis was performed for the day of the earthquake (from the external magnetic point of view, an exceptionally quiet day), with the result that an anomalous and persistent ULF signal (from around 3 to 6 UTC) is clearly detected before the earthquake. After this single-spot analysis, we performed a more extensive analysis for two months around the earthquake occurrence, to confirm or refute the cause-effect relationship. From the series of magnetic anomalies detected (during night-time and magnetically quiet times) from the Swarm satellites, we show that the cumulative number of anomalies follows the same typical power-law behavior of a critical system approaching its critical time, in our case the large seismic event of 25 April 2015, and then exhibits the typical recovery phase after a large earthquake. The impressive similarity of this behavior to the analogous behavior seen in seismic data analysis provides strong support for the lithospheric origin of the satellite magnetic anomalies, as due to LAI coupling during the preparation phase of the Nepal earthquake.

  1. GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection

    SciTech Connect

    Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D; Reed, Joel W; Goodall, John R

    2016-01-01

    This paper introduces a novel graph-analytic approach for detecting anomalies in network flow data called GraphPrints. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets, small induced subgraphs that describe local topology. By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified, and furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level and 0.05% at the IP level, with 100% true positive rates at both.
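
    A rough Python sketch of the graphlet-count idea (toy data, assuming networkx is available; not the GraphPrints code): each time window of flows becomes a graph, counts of connected 3-node graphlets form a feature vector, and windows whose counts deviate strongly from the mean are scored as anomalous.

      import itertools
      import networkx as nx
      import numpy as np

      def graphlet_counts(edges):
          # Count connected 3-node graphlets: 2-paths (wedges) and triangles.
          g = nx.Graph(edges)
          wedges = triangles = 0
          for a, b, c in itertools.combinations(g.nodes, 3):
              k = sum((g.has_edge(a, b), g.has_edge(b, c), g.has_edge(a, c)))
              if k == 2:
                  wedges += 1
              elif k == 3:
                  triangles += 1
          return np.array([wedges, triangles], dtype=float)

      # One feature vector per time slice of network flows (toy edge lists).
      windows = [[(1, 2), (2, 3), (3, 4)],
                 [(1, 2), (2, 3), (3, 1)],
                 [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]]  # unusually dense slice
      counts = np.vstack([graphlet_counts(w) for w in windows])
      z = (counts - counts.mean(axis=0)) / (counts.std(axis=0) + 1e-9)
      print(np.abs(z).max(axis=1))   # large values indicate anomalous traffic windows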

  2. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.

    2015-10-01

    We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is also confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.
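
    A minimal Python illustration of adaptive thresholding for shape anomalies (synthetic shapes; an empirical threshold rather than the paper's analytical expression): the cosine similarity between a noisy observed shape and the reference is compared against a threshold estimated from noisy copies of the reference at the current noise level.

      import numpy as np

      def cosine_similarity(a, b):
          return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

      rng = np.random.default_rng(1)
      t = np.linspace(0, 2 * np.pi, 200)
      reference = np.sin(t)                      # reference shape (e.g., a normal ECG-like cycle)
      sigma = 0.8                                # strong measurement noise level

      # Estimate an adaptive threshold from the similarity distribution under noise alone.
      sims = [cosine_similarity(reference, reference + rng.normal(0, sigma, t.size))
              for _ in range(2000)]
      threshold = np.quantile(sims, 0.01)        # 1% false-alarm rate on normal shapes

      observed_normal = reference + rng.normal(0, sigma, t.size)
      observed_anomaly = np.sin(2 * t) + rng.normal(0, sigma, t.size)   # different shape
      for name, obs in [("normal", observed_normal), ("anomaly", observed_anomaly)]:
          s = cosine_similarity(reference, obs)
          print(name, "flagged" if s < threshold else "accepted", round(s, 3))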

  3. Capacitance probe for detection of anomalies in non-metallic plastic pipe

    DOEpatents

    Mathur, Mahendra P.; Spenik, James L.; Condon, Christopher M.; Anderson, Rodney; Driscoll, Daniel J.; Fincham, Jr., William L.; Monazam, Esmail R.

    2010-11-23

    The disclosure relates to analysis of materials using a capacitive sensor to detect anomalies through comparison of measured capacitances. The capacitive sensor is used in conjunction with a capacitance measurement device, a location device, and a processor in order to generate a capacitance versus location output which may be inspected for the detection and localization of anomalies within the material under test. The components may be carried as payload on an inspection vehicle which may traverse through a pipe interior, allowing evaluation of nonmetallic or plastic pipes when the piping exterior is not accessible. In an embodiment, supporting components are solid-state devices powered by a low voltage on-board power supply, providing for use in environments where voltage levels may be restricted.

  4. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
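
    A simplified Python sketch of the residual-monitoring idea (toy data, not the NASA architecture): a nominal piecewise-linear model predicts the sensed output, and the residual between measurement and prediction is flagged when it leaves a band derived from fault-free data. The model shape and thresholds are assumptions.

      import numpy as np

      def nominal_model(fuel_flow):
          # Stand-in piecewise-linear trim model: predicted spool speed vs. fuel flow.
          return np.interp(fuel_flow, [0.2, 0.5, 1.0], [55.0, 78.0, 100.0])

      rng = np.random.default_rng(2)
      fuel = np.clip(0.2 + 0.8 * rng.random(300), 0.2, 1.0)
      measured = nominal_model(fuel) + rng.normal(0, 0.5, fuel.size)
      measured[200:] += 3.0                      # seeded fault: performance shift

      residual = measured - nominal_model(fuel)
      sigma = residual[:100].std()               # noise level estimated from fault-free data
      alarms = np.where(np.abs(residual) > 4 * sigma)[0]
      print("first alarm at sample", alarms[0] if alarms.size else None)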

  5. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  6. An expert system for diagnosing environmentally induced spacecraft anomalies

    NASA Technical Reports Server (NTRS)

    Rolincik, Mark; Lauriente, Michael; Koons, Harry C.; Gorney, David

    1992-01-01

    A new rule-based, machine independent analytical tool was designed for diagnosing spacecraft anomalies using an expert system. Expert systems provide an effective method for saving knowledge, allow computers to sift through large amounts of data pinpointing significant parts, and most importantly, use heuristics in addition to algorithms, which allow approximate reasoning and inference and the ability to attack problems not rigidly defined. The knowledge base consists of over two-hundred (200) rules and provides links to historical and environmental databases. The environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward chaining rules into a backward chaining sequence, prompting the user for information pertinent to the causes considered. The use of heuristics frees the user from searching through large amounts of irrelevant information and allows the user to input partial information (varying degrees of confidence in an answer) or 'unknown' to any question. The modularity of the expert system allows for easy updates and modifications. It not only provides scientists with needed risk analysis and confidence not found in algorithmic programs, but is also an effective learning tool, and the window implementation makes it very easy to use. The system currently runs on a Micro VAX II at Goddard Space Flight Center (GSFC). The inference engine used is NASA's C Language Integrated Production System (CLIPS).

  7. Detection and Origin of Hydrocarbon Seepage Anomalies in the Barents Sea

    NASA Astrophysics Data System (ADS)

    Polteau, Stephane; Planke, Sverre; Stolze, Lina; Kjølhamar, Bent E.; Myklebust, Reidun

    2016-04-01

    We have collected more than 450 gravity cores in the Barents Sea to detect hydrocarbon seepage anomalies and for seismic-stratigraphic ties. The cores are from the Hoop Area (125 samples) and from the Barents Sea SE (293 samples). In addition, we have collected cores near seven exploration wells. The samples were analyzed using three different analytical methods: (1) the standard organic geochemical analyses of Applied Petroleum Technologies (APT), (2) the Amplified Geochemical Imaging (AGI) method, and (3) the Microbial Prospecting for Oil and Gas (MPOG) method. These analytical approaches can detect trace amounts of thermogenic hydrocarbons in the sediment samples, and may provide additional information about the fluid phases and the depositional environment, maturation, and age of the source rocks. However, hydrocarbon anomalies in seabed sediments may also be related to shallow sources, such as biogenic gas or reworked source rocks in the sediments. To better understand the origin of the hydrocarbon anomalies in the Barents Sea we have studied 35 samples collected approximately 200 m away from seven exploration wells. The wells included three boreholes associated with oil discoveries, two with gas discoveries, one dry well with gas shows, and one dry well. In general, the results of this case study reveal that the oil wells have an oil signature, gas wells show a gas signature, and dry wells have a background signature. However, differences in the results from the three methods may occur and have largely been explained in terms of analytical measurement ranges, method sensitivities, and bio-geochemical processes in the seabed sediments. The standard geochemical method applied by APT relies on measuring the abundance of compounds between C1 to C5 in the headspace gas and between C11 to C36 in the sediment extracts. The anomalies detected in the sediment samples from this study were in the C16 to C30 range. Since the organic matter yields were mostly very low, the

  8. Gaussian mixture model based approach to anomaly detection in multi/hyperspectral images

    NASA Astrophysics Data System (ADS)

    Acito, N.; Diani, M.; Corsini, G.

    2005-10-01

    Anomaly detectors reveal the presence of objects/materials in a multi/hyperspectral image by simply searching for those pixels whose spectrum differs from the background one (anomalies). This procedure can be applied directly to the radiance at the sensor level and has the great advantage of avoiding the difficult step of atmospheric correction. The most popular anomaly detector is the RX algorithm derived by Yu and Reed. It is based on the assumption that the pixels, in a region around the one under test, follow a single multivariate Gaussian distribution. Unfortunately, such a hypothesis is generally not met in actual scenarios and a large number of false alarms is usually experienced when the RX algorithm is applied in practice. In this paper, a more general approach to anomaly detection is considered based on the assumption that the background contains different terrain types (clusters), each of which is Gaussian distributed. In this approach the parameters of each cluster are estimated and used in the detection process. Two detectors are considered: the SEM-RX and the K-means RX. Both algorithms follow two steps: 1) the parameters of the background clusters are estimated; then 2) a detection rule based on the RX test is applied. The SEM-RX stems from the GMM and employs the SEM algorithm to estimate the clusters' parameters; instead, the K-means RX resorts to the well known K-means algorithm to obtain the background clusters. An automatic procedure is defined, for both detectors, to select the number of clusters, and a novel criterion is proposed to set the test threshold. The performance of the two detectors is also evaluated on an experimental data set and compared to that of the RX algorithm. The comparative analysis is carried out in terms of experimental Receiver Operating Characteristics.
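
    A hedged Python sketch of the cluster-based idea (toy spectra, scikit-learn's GaussianMixture; not the SEM-RX implementation): the background is modeled with a Gaussian mixture, and a pixel is scored by its Mahalanobis distance to the most likely background cluster before thresholding.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(3)
      # Toy background made of two terrain classes in a 5-band spectral space.
      bg = np.vstack([rng.normal(0.2, 0.02, (500, 5)), rng.normal(0.6, 0.03, (500, 5))])
      anomaly = np.array([[0.95, 0.1, 0.9, 0.05, 0.9]])

      gmm = GaussianMixture(n_components=2, covariance_type='full', random_state=0).fit(bg)

      def rx_score(pixels):
          # Mahalanobis distance to the most responsible mixture component per pixel.
          labels = gmm.predict(pixels)
          scores = []
          for x, k in zip(pixels, labels):
              diff = x - gmm.means_[k]
              inv_cov = np.linalg.inv(gmm.covariances_[k])
              scores.append(float(diff @ inv_cov @ diff))
          return np.array(scores)

      threshold = np.quantile(rx_score(bg), 0.999)   # test threshold from background scores
      print(rx_score(anomaly) > threshold)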

  9. Evaluating the anomaly resolution capability of an MC&A system

    SciTech Connect

    Saleh, R.; Smith, G.

    1991-07-01

    Discrepancies in accounting for Special Nuclear Material (SNM) require quick and accurate resolution. The ability to make a definitive resolution often depends on the types of measurement data available and on the way records are maintained in the Material Control and Accounting (MC&A) system. A new method is presented for systematically evaluating the overall anomaly detection and resolution capability of an MC&A system. The method begins with a detailed specification of the material process cycle including all authorized material locations, possible unauthorized locations, and the procedures for measuring and recording movement between locations. The analysis proceeds by identifying the types of errors that could logically occur in the measurement and recording system and estimating their frequency. A method is described for quantifying the detection capability and resolution effectiveness for each possible error. A new metric is also proposed for quantifying the overall effectiveness of the MC&A system.

  10. Interpretation of Magnetic Anomalies in Salihli (Turkey) Geothermal Area Using 3-D Inversion and Edge Detection Techniques

    NASA Astrophysics Data System (ADS)

    Timur, Emre

    2016-04-01

    There are numerous geophysical methods used to investigate geothermal areas. The major purpose of this magnetic survey is to locate the boundaries of the active hydrothermal system in the south of the Gediz Graben in Salihli (Manisa/Turkey). The presence of the hydrothermal system had already been inferred from surface evidence of hydrothermal activity and from drillings. Firstly, 3-D prismatic models were theoretically investigated and edge detection methods were utilized with an iterative inversion method to define the boundaries and the parameters of the structure. In the first step of the application, it was necessary to convert the total field anomaly into a pseudo-gravity anomaly map. Then the geometric boundaries of the structures were determined by applying MATLAB-based software with 3 different edge detection algorithms. The exact locations of the structures were obtained by using these boundary coordinates as initial geometric parameters in the inversion process. In addition to these methods, reduction to the pole and horizontal gradient methods were applied to the data to achieve more information about the location and shape of the possible reservoir. As a result, the edge detection methods were found to be successful, both on field and on theoretical data sets, for delineating the boundaries of the possible geothermal reservoir structure. The depth of the geothermal reservoir was determined as 2.4 km from 3-D inversion and 2.1 km from power spectrum methods.

  11. Small sample training and test selection method for optimized anomaly detection algorithms in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Mindrup, Frank M.; Friend, Mark A.; Bauer, Kenneth W.

    2012-01-01

    There are numerous anomaly detection algorithms proposed for hyperspectral imagery. Robust parameter design (RPD) techniques provide an avenue to select robust settings capable of operating consistently across a large variety of image scenes. Many researchers in this area are faced with a paucity of data. Unfortunately, there are no data splitting methods for model validation of datasets with small sample sizes. Typically, training and test sets of hyperspectral images are chosen randomly. Previous research has developed a framework for optimizing anomaly detection in HSI by considering specific image characteristics as noise variables within the context of RPD; these characteristics include the Fisher score, the ratio of target pixels, and the number of clusters. We have developed a method for selecting hyperspectral image training and test subsets that yields consistent RPD results based on these noise features. These subsets are not necessarily orthogonal, but still provide improvements over random training and test subset assignments by maximizing the volume and average distance between image noise characteristics. The small sample training and test selection method is contrasted with randomly selected training sets as well as training sets chosen from the CADEX and DUPLEX algorithms for the well known Reed-Xiaoli anomaly detector.

  12. Wavelet-RX anomaly detection for dual-band forward-looking infrared imagery.

    PubMed

    Mehmood, Asif; Nasrabadi, Nasser M

    2010-08-20

    This paper describes a new wavelet-based anomaly detection technique for a dual-band forward-looking infrared (FLIR) sensor consisting of a coregistered longwave (LW) with a midwave (MW) sensor. The proposed approach, called the wavelet-RX (Reed-Xiaoli) algorithm, consists of a combination of a two-dimensional (2D) wavelet transform and a well-known multivariate anomaly detector called the RX algorithm. In our wavelet-RX algorithm, a 2D wavelet transform is first applied to decompose the input image into uniform subbands. A subband-image cube is formed by concatenating together a number of significant subbands (high-energy subbands). The RX algorithm is then applied to the subband-image cube obtained from a wavelet decomposition of the LW or MW sensor data. In the case of the dual band, the RX algorithm is applied to a subband-image cube constructed by concatenating together the high-energy subbands of the LW and MW subband-image cubes. Experimental results are presented for the proposed wavelet-RX and the classical constant false alarm rate (CFAR) algorithm for detecting anomalies (targets) in a single broadband FLIR (LW or MW) or in a coregistered dual-band FLIR sensor. The results show that the proposed wavelet-RX algorithm outperforms the classical CFAR detector for both single-band and dual-band FLIR sensors. PMID:20733634
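
    An illustrative Python sketch of the wavelet-RX pipeline (synthetic single-band image, assuming PyWavelets is available; not the authors' code): one level of 2D wavelet decomposition produces same-sized detail subbands that are stacked into a subband-image cube, and the standard RX (Mahalanobis) statistic is evaluated per pixel.

      import numpy as np
      import pywt

      rng = np.random.default_rng(4)
      image = rng.normal(0, 1, (128, 128))
      image[60:64, 60:64] += 6.0                         # implanted small target

      # One-level 2D wavelet decomposition; keep the detail subbands (same spatial size).
      _, (cH, cV, cD) = pywt.dwt2(image, 'db2')
      cube = np.stack([cH, cV, cD], axis=-1)             # subband-image cube

      pixels = cube.reshape(-1, cube.shape[-1])
      mu = pixels.mean(axis=0)
      inv_cov = np.linalg.inv(np.cov(pixels, rowvar=False))
      diff = pixels - mu
      rx = np.einsum('ij,jk,ik->i', diff, inv_cov, diff).reshape(cube.shape[:2])

      print(np.unravel_index(np.argmax(rx), rx.shape))   # location of the strongest anomaly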

  13. Anomaly detection in hyperspectral imagery based on low-rank and sparse decomposition

    NASA Astrophysics Data System (ADS)

    Cui, Xiaoguang; Tian, Yuan; Weng, Lubin; Yang, Yiping

    2014-01-01

    This paper presents a novel low-rank and sparse decomposition (LSD) based model for anomaly detection in hyperspectral images. In our model, a local image region is represented as a low-rank matrix plus sparse noise in the spectral space, where the background can be explained by the low-rank matrix and the anomalies are indicated by the sparse noise. The detection of anomalies in local image regions is formulated as a constrained LSD problem, which can be solved efficiently and robustly with a modified "Go Decomposition" (GoDec) method. To enhance the validity of this model, we adapt a "simple linear iterative clustering" (SLIC) superpixel algorithm to efficiently generate homogeneous local image regions, i.e., superpixels, in hyperspectral imagery, thus ensuring that the background in local image regions satisfies the low-rank condition. Experimental results on real hyperspectral data demonstrate that, compared with several known local detectors including the RX detector, the kernel RX detector, and the SVDD detector, the proposed model achieves better performance with satisfactory computation time.
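
    A compact Python sketch of a GoDec-style alternation (simplified, synthetic data; not the paper's modified GoDec): a matrix of local spectra is split into a low-rank background part via truncated SVD and a sparse part via hard thresholding, and large entries of the sparse part indicate anomalies.

      import numpy as np

      def low_rank_sparse(X, rank=2, card=20, iters=30):
          # Alternate: L = best rank-r approximation of X - S; S = largest-|.| entries of X - L.
          L = np.zeros_like(X)
          S = np.zeros_like(X)
          for _ in range(iters):
              U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
              L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
              R = X - L
              S = np.zeros_like(X)
              idx = np.unravel_index(np.argsort(np.abs(R), axis=None)[-card:], X.shape)
              S[idx] = R[idx]
          return L, S

      rng = np.random.default_rng(5)
      background = np.outer(rng.random(100), rng.random(30))     # low-rank background region
      X = background + rng.normal(0, 0.01, background.shape)
      X[7, 10:15] += 1.5                                          # anomalous pixel spectrum
      L, S = low_rank_sparse(X, rank=2, card=10)
      print(np.unique(np.nonzero(S)[0]))                          # rows flagged as anomalous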

  14. Motivating Complex Dependence Structures in Data Mining: A Case Study with Anomaly Detection in Climate

    SciTech Connect

    Kao, Shih-Chieh; Ganguly, Auroop R; Steinhaeuser, Karsten J K

    2009-01-01

    While data mining aims to identify hidden knowledge from massive and high-dimensional datasets, the importance of the dependence structure among time, space, and different variables is less emphasized. Analogous to the use of probability density functions in modeling individual variables, it is now possible to characterize the complete dependence space mathematically through the application of copulas. By adopting copulas, the multivariate joint probability distribution can be constructed without constraint to specific types of marginal distributions. Some common assumptions, like normality and independence between variables, can also be relaxed. This study provides a fundamental introduction to and illustration of dependence structure, aimed at the potential applicability of copulas in general data mining. The case study in hydro-climatic anomaly detection shows that the frequency of multivariate anomalies is affected by the level of dependence between variables. The appropriate multivariate thresholds can be determined through a copula-based approach.

  15. Process fault detection and nonlinear time series analysis for anomaly detection in safeguards

    SciTech Connect

    Burr, T.L.; Mullen, M.F.; Wangen, L.E.

    1994-02-01

    In this paper we discuss two advanced techniques, process fault detection and nonlinear time series analysis, and apply them to the analysis of vector-valued and single-valued time-series data. We investigate model-based process fault detection methods for analyzing simulated, multivariate, time-series data from a three-tank system. The model predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions, in safeguards terminology). We evaluate two methods, testing all individual residuals with a univariate z-score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material in two different leak scenarios from the three-tank system: a leak without replacement and a leak with replacement of the lost volume. Nonlinear time-series analysis tools were compared with the linear methods popularized by Box and Jenkins. We compare prediction results using three nonlinear and two linear modeling methods on each of six simulated time series: two nonlinear and four linear. The nonlinear methods performed better at predicting the nonlinear time series and did as well as the linear methods at predicting the linear ones.
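
    A small Python illustration of the two residual tests mentioned above (toy residual vectors, not the three-tank simulation): a univariate z-score test on each residual versus a joint Mahalanobis-distance test over the residual vector, with thresholds that are assumptions.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      nominal = rng.multivariate_normal([0, 0, 0], np.diag([1.0, 1.0, 1.0]), 500)
      faulty = rng.multivariate_normal([1.5, -1.5, 1.5], np.diag([1.0, 1.0, 1.0]), 50)

      mu, cov = nominal.mean(axis=0), np.cov(nominal, rowvar=False)
      inv_cov = np.linalg.inv(cov)

      def univariate_alarm(r, z_crit=3.0):
          # Alarm if any individual residual exceeds the z-score threshold.
          return np.any(np.abs((r - mu) / np.sqrt(np.diag(cov))) > z_crit)

      def mahalanobis_alarm(r, alpha=0.01):
          # Alarm if the joint Mahalanobis distance exceeds the chi-square critical value.
          d2 = (r - mu) @ inv_cov @ (r - mu)
          return d2 > stats.chi2.ppf(1 - alpha, df=len(r))

      uni = np.mean([univariate_alarm(r) for r in faulty])
      mah = np.mean([mahalanobis_alarm(r) for r in faulty])
      print(f"detection rate: z-score {uni:.2f}, Mahalanobis {mah:.2f}")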

  16. Anomaly detection in radiographic images of composite materials via crosshatch regression

    NASA Astrophysics Data System (ADS)

    Lockard, Colin D.

    The development and testing of new composite materials is an important area of research supporting advances in aerospace engineering. Understanding the properties of these materials requires the analysis of material samples to identify damage. Given the significant time and effort required from human experts to analyze computed tomography (CT) scans related to the non-destructive evaluation of carbon fiber materials, it is advantageous to develop an automated system for identifying anomalies in these images. This thesis introduces a regression-based algorithm for identifying anomalies in grayscale images, with a particular focus on its application for the analysis of CT scan images of carbon fiber. The algorithm centers around a "crosshatch regression" approach in which each two-dimensional image is divided into a series of one-dimensional signals, each representing a single line of pixels. A robust multiple linear regression model is fitted to each signal and outliers are identified. Smoothing and quality control techniques help better define anomaly boundaries and remove noise, and multiple crosshatch regression runs are combined to generate the final result. A ground truth set was created and the algorithm was run against these images for testing. The experimental results support the efficacy of the technique, locating 92% of anomalies with an average recall of 88%, precision of 78%, and root mean square deviation of 11.2 pixels.
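
    A simplified Python sketch of the row-wise idea (scikit-learn's HuberRegressor as the robust fit, which is not necessarily the thesis's estimator): each image row is treated as a 1-D signal, a robust linear model is fitted, and pixels with large robust residuals are marked as anomaly candidates.

      import numpy as np
      from sklearn.linear_model import HuberRegressor

      rng = np.random.default_rng(7)
      image = np.tile(np.linspace(0.3, 0.7, 256), (64, 1)) + rng.normal(0, 0.01, (64, 256))
      image[30, 100:110] += 0.4                      # anomalous bright streak in one row

      x = np.arange(image.shape[1]).reshape(-1, 1)
      mask = np.zeros_like(image, dtype=bool)
      for i, row in enumerate(image):
          fit = HuberRegressor().fit(x, row)         # robust linear fit of the 1-D signal
          resid = row - fit.predict(x)
          sigma = np.median(np.abs(resid)) / 0.6745  # robust noise estimate (MAD)
          mask[i] = np.abs(resid) > 5 * sigma
      print(np.unique(np.nonzero(mask)[0]))          # rows containing flagged pixels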

  17. Optimal Index Policies for Anomaly Localization in Resource-Constrained Cyber Systems

    NASA Astrophysics Data System (ADS)

    Cohen, Kobi; Zhao, Qing; Swami, Ananthram

    2014-08-01

    The problem of anomaly localization in a resource-constrained cyber system is considered. Each anomalous component of the system incurs a cost per unit time until its anomaly is identified and fixed. Different anomalous components may incur different costs depending on their criticality to the system. Due to resource constraints, only one component can be probed at each given time. The observations from a probed component are realizations drawn from two different distributions depending on whether the component is normal or anomalous. The objective is a probing strategy that minimizes the total expected cost, incurred by all the components during the detection process, under reliability constraints. We consider both independent and exclusive models. In the former, each component can be abnormal with a certain probability independent of other components. In the latter, one and only one component is abnormal. We develop optimal simple index policies under both models. The proposed index policies apply to a more general case where a subset (more than one) of the components can be probed simultaneously and have strong performance as demonstrated by simulation examples. The problem under study also finds applications in spectrum scanning in cognitive radio networks and event detection in sensor networks.

  18. Least Square Support Vector Machine for Detection of - Ionospheric Anomalies Associated with the Powerful Nepal Earthquake (Mw = 7.5) of 25 April 2015

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2016-06-01

    Due to the irreparable devastation caused by strong earthquakes, accurate anomaly detection in time series of different precursors for creating a trustworthy early warning system has brought new challenges. In this paper the predictability of the Least Square Support Vector Machine (LSSVM) has been investigated by forecasting the GPS-TEC (Total Electron Content) variations around the time and location of the Nepal earthquake. A powerful earthquake of Mw = 7.8 took place 77 km NW of Kathmandu, Nepal (28.147° N, 84.708° E, depth = 15.0 km) at 06:11:26 UTC on April 25, 2015. For comparison purposes, two other methods, the Median method and an ANN (Artificial Neural Network), have been implemented. All implemented algorithms indicate striking TEC anomalies 2 days prior to the main shock. The results reveal that the LSSVM method is promising for the detection of TEC seismo-ionospheric anomalies.

  19. Development of an expert system for analysis of Shuttle atmospheric revitalization and pressure control subsystem anomalies

    NASA Technical Reports Server (NTRS)

    Lafuse, Sharon A.

    1991-01-01

    The paper describes the Shuttle Leak Management Expert System (SLMES), a preprototype expert system developed to enable the ECLSS subsystem manager to analyze subsystem anomalies and to formulate flight procedures based on flight data. The SLMES combines rule-based expert system technology with traditional FORTRAN-based software into an integrated system. SLMES analyzes the data using rules and, when it detects a problem that requires simulation, it sets up the input for the FORTRAN-based simulation program ARPCS2AT2, which predicts the cabin total pressure and composition as a function of time. The program simulates the pressure control system, the crew oxygen masks, the airlock repress/depress valves, and the leakage. When the simulation has completed, other SLMES rules are triggered to examine the results of the simulation against the flight data and to suggest methods for correcting the problem. Results are then presented in the form of graphs and tables.

  20. Advanced Unsupervised Classification Methods to Detect Anomalies on Earthen Levees Using Polarimetric SAR Imagery.

    PubMed

    Marapareddy, Ramakalavathi; Aanstoos, James V; Younan, Nicolas H

    2016-01-01

    Fully polarimetric Synthetic Aperture Radar (polSAR) data analysis has wide applications for terrain and ground cover classification. The dynamics of surface and subsurface water events can lead to slope instability resulting in slough slides on earthen levees. Early detection of these anomalies by a remote sensing approach could save time versus direct assessment. We used L-band Synthetic Aperture Radar (SAR) to screen levees for anomalies. SAR technology, due to its high spatial resolution and soil penetration capability, is a good choice for identifying problematic areas on earthen levees. Using the parameters entropy (H), anisotropy (A), alpha (α), and eigenvalues (λ, λ₁, λ₂, and λ₃), we implemented several unsupervised classification algorithms for the identification of anomalies on the levee. The classification techniques applied are H/α, H/A, A/α, Wishart H/α, Wishart H/A/α, and H/α/λ classification algorithms. In this work, the effectiveness of the algorithms was demonstrated using quad-polarimetric L-band SAR imagery from the NASA Jet Propulsion Laboratory's (JPL's) Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR). The study area is a section of the lower Mississippi River valley in the Southern USA, where earthen flood control levees are maintained by the US Army Corps of Engineers. PMID:27322270
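
    A brief Python sketch (standard textbook formulas applied to a toy matrix, not the paper's processing chain) of how entropy (H), anisotropy (A), and the mean alpha angle are obtained from the eigen-decomposition of a 3x3 polarimetric coherency matrix, before feeding an unsupervised classifier.

      import numpy as np

      def h_a_alpha(T):
          # Eigen-decomposition of the Hermitian coherency matrix T (3x3).
          eigvals, eigvecs = np.linalg.eigh(T)
          eigvals = np.clip(eigvals[::-1], 1e-12, None)       # sort descending, keep positive
          eigvecs = eigvecs[:, ::-1]
          p = eigvals / eigvals.sum()                          # pseudo-probabilities
          H = float(-np.sum(p * np.log(p) / np.log(3)))        # entropy, log base 3
          A = float((eigvals[1] - eigvals[2]) / (eigvals[1] + eigvals[2]))   # anisotropy
          alpha = float(np.degrees(np.sum(p * np.arccos(np.abs(eigvecs[0, :])))))  # mean alpha
          return H, A, alpha

      # Toy coherency matrix with one dominant scattering mechanism.
      T = np.array([[2.0, 0.1, 0.0],
                    [0.1, 0.5, 0.0],
                    [0.0, 0.0, 0.2]], dtype=complex)
      print(h_a_alpha(T))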

  1. Automatic, Real-Time Algorithms for Anomaly Detection in High Resolution Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Srivastava, A. N.; Nemani, R. R.; Votava, P.

    2008-12-01

    Earth observing satellites are generating data at an unprecedented rate, surpassing almost all other data intensive applications. However, most of the data that arrives from the satellites is not analyzed directly. Rather, multiple scientific teams analyze only a small fraction of the total data available in the data stream. Although there are many reasons for this situation, one paramount concern is developing algorithms and methods that can analyze the vast, high dimensional, streaming satellite images. This paper describes a new set of methods that are among the fastest available algorithms for real-time anomaly detection. These algorithms were built to maximize accuracy and speed for a variety of applications in fields outside of the earth sciences. However, our studies indicate that with appropriate modifications, these algorithms can be extremely valuable for identifying anomalies rapidly using only modest computational power. We review two algorithms which are used as benchmarks in the field, Orca and One-Class Support Vector Machines, and discuss the anomalies that are discovered in MODIS data taken over the Central California region. We are especially interested in automatic identification of disturbances within the ecosystems (e.g., wildfires, droughts, floods, insect/pest damage, wind damage, logging). We show the scalability of the algorithms and demonstrate that with appropriately adapted technology, the dream of real-time analysis can be made a reality.
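
    A short Python illustration of one of the benchmark methods named above (toy features, not the MODIS pipeline): a one-class SVM trained on nominal pixel features flags observations that fall outside the learned support. The feature choice is an assumption.

      import numpy as np
      from sklearn.svm import OneClassSVM

      rng = np.random.default_rng(8)
      # Toy per-pixel features, e.g., [vegetation index, surface temperature] for a nominal period.
      nominal = np.column_stack([rng.normal(0.6, 0.05, 1000), rng.normal(295, 2.0, 1000)])
      new_obs = np.array([[0.61, 296.0],      # typical conditions
                          [0.15, 310.0]])     # drought/fire-like disturbance

      # Standardize, then learn the support of the nominal distribution.
      mu, sd = nominal.mean(axis=0), nominal.std(axis=0)
      clf = OneClassSVM(kernel='rbf', nu=0.01, gamma='scale').fit((nominal - mu) / sd)
      print(clf.predict((new_obs - mu) / sd))   # +1 = normal, -1 = anomaly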

  2. Advanced Unsupervised Classification Methods to Detect Anomalies on Earthen Levees Using Polarimetric SAR Imagery

    PubMed Central

    Marapareddy, Ramakalavathi; Aanstoos, James V.; Younan, Nicolas H.

    2016-01-01

    Fully polarimetric Synthetic Aperture Radar (polSAR) data analysis has wide applications for terrain and ground cover classification. The dynamics of surface and subsurface water events can lead to slope instability resulting in slough slides on earthen levees. Early detection of these anomalies by a remote sensing approach could save time versus direct assessment. We used L-band Synthetic Aperture Radar (SAR) to screen levees for anomalies. SAR technology, due to its high spatial resolution and soil penetration capability, is a good choice for identifying problematic areas on earthen levees. Using the parameters entropy (H), anisotropy (A), alpha (α), and eigenvalues (λ, λ1, λ2, and λ3), we implemented several unsupervised classification algorithms for the identification of anomalies on the levee. The classification techniques applied are H/α, H/A, A/α, Wishart H/α, Wishart H/A/α, and H/α/λ classification algorithms. In this work, the effectiveness of the algorithms was demonstrated using quad-polarimetric L-band SAR imagery from the NASA Jet Propulsion Laboratory’s (JPL’s) Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR). The study area is a section of the lower Mississippi River valley in the Southern USA, where earthen flood control levees are maintained by the US Army Corps of Engineers. PMID:27322270

  3. Ferromagnetic eddy current probe having eccentric magnetization for detecting anomalies in a tube

    SciTech Connect

    Cecco, V.S.; Carter, J.R.

    1993-08-17

    An eddy current probe is described for detecting anomalies in a tube made of a ferromagnetic material, comprising: a probe housing made of a non-ferromagnetic material and shaped to be introduced into the tube for inspection, said housing having a central axis substantially coinciding with the axis of the tube to be inspected when the probe is in use; at least two eddy current measuring assemblies provided in said housing, each said assembly including magnetization means for generating a magnetic field in the tube under inspection to magnetize said tube, said magnetization means producing a maximum magnetization at an area of said tube and a minimum magnetization at a diametrically opposite area of said tube and at least one eddy current measuring coil associated with said magnetization means to measure the eddy current generated in the said tube and which has a relatively high sensitivity to an anomaly at said maximum magnetization area; and said eddy current measuring assemblies being spaced apart axially within said housing and rotated about said central axis from each other by a predetermined angle so that each assembly is sensitive to anomalies differently depending upon their location in said housing.

  4. Portable modular detection system

    DOEpatents

    Brennan, James S.; Singh, Anup; Throckmorton, Daniel J.; Stamps, James F.

    2009-10-13

    Disclosed herein are portable and modular detection devices and systems for detecting electromagnetic radiation, such as fluorescence, from an analyte which comprises at least one optical element removably attached to at least one alignment rail. Also disclosed are modular detection devices and systems having an integrated lock-in amplifier and spatial filter and assay methods using the portable and modular detection devices.

  5. Fiber Optic Bragg Grating Sensors for Thermographic Detection of Subsurface Anomalies

    NASA Technical Reports Server (NTRS)

    Allison, Sidney G.; Winfree, William P.; Wu, Meng-Chou

    2009-01-01

    Conventional thermography with an infrared imager has been shown to be an extremely viable technique for nondestructively detecting subsurface anomalies such as thickness variations due to corrosion. A recently developed technique using fiber optic sensors to measure temperature holds potential for performing similar inspections without requiring an infrared imager. The structure is heated using a heat source such as a quartz lamp with fiber Bragg grating (FBG) sensors at the surface of the structure to detect temperature. Investigated structures include a stainless steel plate with thickness variations simulated by small platelets attached to the back side using thermal grease. A relationship is shown between the FBG sensor thermal response and variations in material thickness. For comparison, finite element modeling was performed and found to agree closely with the fiber optic thermography results. This technique shows potential for applications where FBG sensors are already bonded to structures for Integrated Vehicle Health Monitoring (IVHM) strain measurements and can serve dual-use by also performing thermographic detection of subsurface anomalies.

  6. Molecular Detection of Human Cytomegalovirus (HCMV) Among Infants with Congenital Anomalies in Khartoum State, Sudan

    PubMed Central

    Ebrahim, Maha G.; Ali, Aisha S.; Mustafa, Mohamed O.; Musa, Dalal F.; El Hussein, Abdel Rahim M.; Elkhidir, Isam M.; Enan, Khalid A.

    2015-01-01

    Human Cytomegalovirus (HCMV) infection still represents the most common potentially serious viral complication in humans and is a major cause of congenital anomalies in infants. This study aimed to detect HCMV in infants with congenital anomalies. Study subjects consisted of infants born with neural tube defects, hydrocephalus and microcephaly. Fifty serum specimens (20 males, 30 females) were collected from different hospitals in Khartoum State. The sera were investigated for cytomegalovirus-specific immunoglobulin M (IgM) antibodies using enzyme-linked immunosorbent assay (ELISA), and for cytomegalovirus DNA using polymerase chain reaction (PCR). Out of the 50 sera tested, one patient's sample (2%) showed HCMV IgM but no detectable DNA, while another 4 (8.2%) sera were positive for HCMV DNA but had no detectable IgM. Various diagnostic techniques should be considered to evaluate HCMV disease, and routine screening for HCMV should be introduced for pregnant women in this setting. It is vital to initiate further research with more samples from different areas to assess the prevalence of HCMV, characterize it, and evaluate its maternal health implications. PMID:26862356

  7. Application of the LMC algorithm to anomaly detection using the Wichmann/NIITEK ground-penetrating radar

    NASA Astrophysics Data System (ADS)

    Torrione, Peter A.; Collins, Leslie M.; Clodfelter, Fred; Frasier, Shane; Starnes, Ian

    2003-09-01

    This paper describes the application of a 2-dimensional (2-D) lattice LMS algorithm for anomaly detection using the Wichmann/Niitek ground penetrating radar (GPR) system. Sets of 3-dimensional (3-D) data are collected from the GPR system and these are processed in separate 2-D slices. Those 2-D slices that are spatially correlated in depth are combined into separate "depth segments" and these are processed independently. When target/no target declarations need to be made, the individual depth segments are combined to yield a 2-D confidence map. The 2-D confidence map is then thresholded and alarms are placed at the centroids of the remaining 8-connected data points. Calibration lane results are presented for data collected over several soil types under several weather conditions. Results show a false alarm rate improvement of at least an order of magnitude over other GPR systems, as well as significant improvement over other adaptive algorithms operating on the same data.

  8. Sparsity divergence index based on locally linear embedding for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhang, Lili; Zhao, Chunhui

    2016-04-01

    Hyperspectral imagery (HSI) has high spectral and spatial resolutions, which are essential for anomaly detection (AD). Many anomaly detectors assume that the spectrum signature of HSI pixels can be modeled with a Gaussian distribution, which is actually not accurate and often leads to many false alarms. Therefore, a sparsity model without any distribution hypothesis is usually employed. Dimensionality reduction (DR) as a preprocessing step for HSI is important. Principal component analysis as a conventional DR method is a linear projection and cannot exploit the nonlinear properties in hyperspectral data, whereas locally linear embedding (LLE) as a local, nonlinear manifold learning algorithm works well for DR of HSI. A modified algorithm of sparsity divergence index based on locally linear embedding (SDI-LLE) is thus proposed. First, kernel collaborative representation detection is adopted to calculate the sparse dictionary matrix of local reconstruction weights in LLE. Then, SDI is obtained both in the spectral and spatial domains, where spatial SDI is computed after DR by LLE. Finally, joint SDI, combining spectral SDI and spatial SDI, is computed, and the optimal SDI is performed for AD. Experimental results demonstrate that the proposed algorithm significantly improves the performance, when compared with its counterparts.

  9. A MLP neural network as an investigator of TEC time series to detect seismo-ionospheric anomalies

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-06-01

    Anomaly detection is extremely important for earthquake parameter estimation. In this paper, an application of Artificial Neural Networks (ANNs) in the earthquake precursor domain has been developed. This study is concerned with investigating the Total Electron Content (TEC) time series by using a Multi-Layer Perceptron (MLP) neural network to detect seismo-ionospheric anomalous variations induced by the powerful Tohoku earthquake of March 11, 2011. The duration of the TEC time series dataset is 120 days at a time resolution of 2 h. The results show that the MLP presents anomalies better than the referenced and conventional methods such as the Auto-Regressive Integrated Moving Average (ARIMA) technique. In this study, the TEC anomalies detected using the proposed method are also compared to previous results (Akhoondzadeh, 2012) dealing with the TEC anomalies observed by applying the mean, median, wavelet and Kalman filter methods. The MLP-detected anomalies are similar to those detected using the previous methods applied to the same case study. The results indicate that an MLP feed-forward neural network can be a suitable non-parametric method to detect changes in a nonlinear time series such as variations of earthquake precursors.
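
    A hedged Python sketch of the prediction-residual idea (scikit-learn's MLPRegressor on synthetic TEC-like data, not the paper's network): the network forecasts the next TEC value from recent samples, and large prediction residuals are marked as anomalous.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(9)
      n = 120 * 12                                           # 120 days at 2-hour resolution
      t = np.arange(n)
      tec = 20 + 8 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.8, n)
      tec[1000:1006] += 10.0                                 # injected anomalous enhancement

      lag = 12
      X = np.array([tec[i - lag:i] for i in range(lag, n)])
      y = tec[lag:]
      split = 800                                            # train only on the earlier, normal part
      model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                           random_state=0).fit(X[:split], y[:split])

      resid = y[split:] - model.predict(X[split:])
      sigma = np.std(y[:split] - model.predict(X[:split]))   # residual scale on training part
      print(np.where(np.abs(resid) > 4 * sigma)[0] + split + lag)   # anomalous time indices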

  10. Using anomaly detection method and multi-temporal Radarsat images for short-term land use/land cover change detection

    NASA Astrophysics Data System (ADS)

    Qian, JunPing; Chen, XiaoYue; Li, Xia; Yeh, Anthony Gar-On; Ai, Bin

    2008-10-01

    Multi-temporal Radarsat images and object features, including the mean value of the backscattering coefficient (Mean), the minimal value of backscattering (Min), the homogeneity of the gray level co-occurrence matrix (GLCMhomo) and the dissimilarity of the gray level co-occurrence matrix (GLCMdis), were extracted based on segmented image objects. After that, a change vector was constructed for each land object. In the third step, the DBAD algorithm was applied to the change-vector dataset to detect anomalous change in the 3 scenes of images. Finally, field survey data plus manual interpretation were used for validation. Compared with an object-based image regression method, DBAD results in better accuracy. Data validation also shows that DBAD has better accuracy in both under-construction areas and newly built-up areas (error lower than 12%), while for built-up areas and some mixed-use areas it attains relatively lower accuracy than for other land types (from 10% to 28.57%). To conclude, short-term land use change in time-series images can be treated as a spatial and temporal anomaly in remote sensing images. By extending traditional anomaly detection to spatial-temporal anomaly detection, land use change caused by human activity can be effectively detected over short time intervals. The DBAD algorithm focuses only on the density of change vectors in feature space, which is independent of the amplitude and direction of the change vectors. This enables DBAD to effectively discriminate temporal image variation caused by the observation system, the environment or seasonal land cover change, especially in vegetation and cultivated areas that changed remarkably during the observation period, from land use change caused by human activities. This helps to decrease false alarms in short-term change detection.

  11. Interior intrusion detection systems

    SciTech Connect

    Rodriguez, J.R.; Matter, J.C. ); Dry, B. )

    1991-10-01

    The purpose of this NUREG is to present technical information that should be useful to NRC licensees in designing interior intrusion detection systems. Interior intrusion sensors are discussed according to their primary application: boundary-penetration detection, volumetric detection, and point protection. Information necessary for implementation of an effective interior intrusion detection system is presented, including principles of operation, performance characteristics and guidelines for design, procurement, installation, testing, and maintenance. A glossary of sensor data terms is included. 36 figs., 6 tabs.

  12. Unsupervised, low latency anomaly detection of algorithmically generated domain names by generative probabilistic modeling.

    PubMed

    Raghuram, Jayaram; Miller, David J; Kesidis, George

    2014-07-01

    We propose a method for detecting anomalous domain names, with focus on algorithmically generated domain names which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (or botnets), malware, and phishing. Our method is based on learning a (null hypothesis) probability model based on a large set of domain names that have been white listed by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, and tend to have a distribution of characters, words, word lengths, and number of words that are typical of some language (mostly English), and often consist of words drawn from a known lexicon. On the other hand, in the present day scenario, algorithmically generated domain names typically have distributions that are quite different from that of human-created domain names. We propose a fully generative model for the probability distribution of benign (white listed) domain names which can be used in an anomaly detection setting for identifying putative algorithmically generated domain names. Unlike other methods, our approach can make detections without considering any additional (latency producing) information sources, often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results, relative to several baseline methods, with higher detection rates and low false positive rates. PMID:25685511
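
    A toy Python version of the generative idea (a character bigram model, much simpler than the paper's full model): the model is trained on white-listed names, and names with unusually low per-character log-likelihood are candidates for being algorithmically generated. The tiny white list here is an illustrative stand-in.

      import math
      from collections import defaultdict

      def train_bigram(names):
          counts = defaultdict(lambda: defaultdict(int))
          for name in names:
              for a, b in zip('^' + name, name + '$'):   # '^'/'$' mark name boundaries
                  counts[a][b] += 1
          return counts

      def avg_log_likelihood(name, counts, vocab=40):
          ll = 0.0
          for a, b in zip('^' + name, name + '$'):
              total = sum(counts[a].values())
              ll += math.log((counts[a][b] + 1) / (total + vocab))   # add-one smoothing
          return ll / (len(name) + 1)

      whitelist = ['google', 'wikipedia', 'weather', 'example', 'openstreetmap',
                   'university', 'library', 'news', 'shopping', 'travel']
      model = train_bigram(whitelist)
      for name in ['travelnews', 'xkqjzt4f9w2h']:
          print(name, round(avg_log_likelihood(name, model), 2))   # lower = more suspicious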

  13. Unsupervised, low latency anomaly detection of algorithmically generated domain names by generative probabilistic modeling

    PubMed Central

    Raghuram, Jayaram; Miller, David J.; Kesidis, George

    2014-01-01

    We propose a method for detecting anomalous domain names, with focus on algorithmically generated domain names which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (or botnets), malware, and phishing. Our method is based on learning a (null hypothesis) probability model based on a large set of domain names that have been white listed by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, and tend to have a distribution of characters, words, word lengths, and number of words that are typical of some language (mostly English), and often consist of words drawn from a known lexicon. On the other hand, in the present day scenario, algorithmically generated domain names typically have distributions that are quite different from that of human-created domain names. We propose a fully generative model for the probability distribution of benign (white listed) domain names which can be used in an anomaly detection setting for identifying putative algorithmically generated domain names. Unlike other methods, our approach can make detections without considering any additional (latency producing) information sources, often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results, relative to several baseline methods, with higher detection rates and low false positive rates. PMID:25685511

  14. Detection of submicron scale cracks and other surface anomalies using positron emission tomography

    DOEpatents

    Cowan, Thomas E.; Howell, Richard H.; Colmenares, Carlos A.

    2004-02-17

    Detection of submicron scale cracks and other mechanical and chemical surface anomalies using PET. This surface technique has sufficient sensitivity to detect single voids or pits of sub-millimeter size and single cracks or fissures of millimeter-scale length, micrometer-scale depth, and nanometer-scale width. This technique can also be applied to detect surface regions of differing chemical reactivity. It may be utilized in a scanning or survey mode to simultaneously detect such mechanical or chemical features over large interior or exterior surface areas of parts as large as about 50 cm in diameter. The technique involves exposing a surface to a short-lived radioactive gas for a time period, removing the excess gas to leave a partial monolayer, determining the location and shape of the cracks, voids, porous regions, etc., and calculating the width, depth, and length thereof. Detection of 0.01 mm deep cracks using a 3 mm detector resolution has been accomplished using this technique.

  15. Detection of subpixel anomalies in multispectral infrared imagery using an adaptive Bayesian classifier

    SciTech Connect

    Ashton, E.A.

    1998-03-01

    The detection of subpixel targets with unknown spectral signatures and cluttered backgrounds in multispectral imagery is a topic of great interest for remote surveillance applications. Because no knowledge of the target is assumed, the only way to accomplish such a detection is through a search for anomalous pixels. Two approaches to this problem are examined in this paper. The first is to separate the image into a number of statistical clusters by using an extension of the well-known k-means algorithm. Each bin of resultant residual vectors is then decorrelated, and the results are thresholded to provide detection. The second approach requires the formation of a probabilistic background model by using an adaptive Bayesian classification algorithm. This allows the calculation of a probability for each pixel, with respect to the model. These probabilities are then thresholded to provide detection. Both algorithms are shown to provide significant improvement over current filtering techniques for anomaly detection in experiments using multispectral IR imagery with both simulated and actual subpixel targets.
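
    A hedged Python sketch of the first (clustering) approach (toy spectra, scikit-learn's standard KMeans rather than the paper's extension): pixels are clustered, residuals to the assigned cluster are decorrelated with the cluster covariance, and the resulting distances are thresholded.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(10)
      # Toy multispectral pixels from two background classes plus one implanted target pixel.
      bg = np.vstack([rng.normal([1, 2, 3, 4], 0.05, (400, 4)),
                      rng.normal([2, 1, 4, 3], 0.05, (400, 4))])
      pixels = np.vstack([bg, [[1.5, 1.5, 3.5, 3.8]]])

      km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
      scores = np.zeros(len(pixels))
      for k in range(2):
          members = pixels[km.labels_ == k]
          inv_cov = np.linalg.inv(np.cov(members, rowvar=False))   # decorrelate residuals
          diff = members - members.mean(axis=0)
          scores[km.labels_ == k] = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)

      threshold = np.quantile(scores[:-1], 0.999)
      print(scores[-1] > threshold)                 # the implanted pixel is flagged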

  16. Anomaly detection driven active learning for identifying suspicious tracks and events in WAMI video

    NASA Astrophysics Data System (ADS)

    Miller, David J.; Natraj, Aditya; Hockenbury, Ryler; Dunn, Katherine; Sheffler, Michael; Sullivan, Kevin

    2012-06-01

    We describe a comprehensive system for learning to identify suspicious vehicle tracks from wide-area motion (WAMI) video. First, since the road network for the scene of interest is assumed unknown, agglomerative hierarchical clustering is applied to all spatial vehicle measurements, resulting in spatial cells that largely capture individual road segments. Next, for each track, both at the cell (speed, acceleration, azimuth) and track (range, total distance, duration) levels, extreme value feature statistics are both computed and aggregated, to form summary (p-value based) anomaly statistics for each track. Here, to fairly evaluate tracks that travel across different numbers of spatial cells, for each cell-level feature type, a single (most extreme) statistic is chosen, over all cells traveled. Finally, a novel active learning paradigm, applied to a (logistic regression) track classifier, is invoked to learn to distinguish suspicious from merely anomalous tracks, starting from anomaly-ranked track prioritization, with ground-truth labeling by a human operator. This system has been applied to WAMI video data (ARGUS), with the tracks automatically extracted by a system developed in-house at Toyon Research Corporation. Our system gives promising preliminary results in highly ranking as suspicious aerial vehicles, dismounts, and traffic violators, and in learning which features are most indicative of suspicious tracks.

  17. Classification of radar data by detecting and identifying spatial and temporal anomalies

    NASA Astrophysics Data System (ADS)

    Väilä, Minna; Venäläinen, Ilkka; Jylhä, Juha; Ruotsalainen, Marja; Perälä, Henna; Visa, Ari

    2010-04-01

    For some time, applying the theory of pattern recognition and classification to radar signal processing has been a topic of interest in the field of remote sensing. Efficient operation and target indication are often hindered by the signal background, which can have properties similar to those of the interesting signal. Because noise and clutter may constitute most of the response of a surveillance radar, aircraft and other interesting targets can be seen as anomalies in the data. We propose an algorithm for detecting these anomalies on a heterogeneous clutter background in each range-Doppler cell, the basic unit in the radar data defined by the resolution in range, angle and Doppler. The analysis is based on the time history of the response in a cell and its correlation to the spatial surroundings. If the newest time window of the response in a resolution cell differs statistically from the time history of the cell, the cell is determined to be anomalous. Normal cells are classified as noise or different types of clutter based on their strength on each Doppler band. Anomalous cells are analyzed using a longer time window, which emulates a longer coherent illumination. Based on the decorrelation behavior of the response in the long time window, the anomalous cells are classified as clutter, an airplane or a helicopter. The algorithm is tested with both experimental and simulated radar data. The experimental radar data have been recorded in a forested landscape.

  18. Detection of Anomalies in Citrus Leaves Using Laser-Induced Breakdown Spectroscopy (LIBS).

    PubMed

    Sankaran, Sindhuja; Ehsani, Reza; Morgan, Kelly T

    2015-08-01

    Nutrient assessment and management are important to maintain productivity in citrus orchards. In this study, laser-induced breakdown spectroscopy (LIBS) was applied for rapid and real-time detection of citrus anomalies. Laser-induced breakdown spectroscopy spectra were collected from citrus leaves with anomalies such as diseases (Huanglongbing, citrus canker) and nutrient deficiencies (iron, manganese, magnesium, zinc), and compared with those of healthy leaves. Baseline correction, wavelet multivariate denoising, and normalization techniques were applied to the LIBS spectra before analysis. After spectral pre-processing, features were extracted using principal component analysis and classified using two models, quadratic discriminant analysis and support vector machine (SVM). The SVM resulted in a high average classification accuracy of 97.5%, with a high average canker classification accuracy (96.5%). LIBS peak analysis indicated that, of the 11 peaks found in all the samples, high intensities were observed at 229.7, 247.9, 280.3, 393.5, 397.0, and 769.8 nm. Future studies using controlled experiments with variable nutrient applications are required for quantification of foliar nutrients using LIBS-based sensing. PMID:26163130
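
    A schematic Python sketch of the processing chain described above (synthetic spectra, scikit-learn's PCA and SVC rather than the authors' exact models): spectra are standardized, reduced with principal component analysis, and classified with a support vector machine.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(11)
      n_per_class, n_channels = 60, 300
      classes = {0: 1.0, 1: 1.6, 2: 0.5}          # healthy, disease-like, deficiency-like
      X, y = [], []
      for label, peak_scale in classes.items():
          base = np.exp(-0.5 * ((np.arange(n_channels) - 120) / 5.0) ** 2)   # emission peak
          spectra = peak_scale * base + rng.normal(0, 0.05, (n_per_class, n_channels))
          X.append(spectra)
          y += [label] * n_per_class
      X, y = np.vstack(X), np.array(y)

      model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel='rbf', C=10))
      print(cross_val_score(model, X, y, cv=5).mean())   # cross-validated accuracy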

  19. Regional and residual anomaly separation in microgravity maps for cave detection: The case study of Gruta de las Maravillas (SW Spain)

    NASA Astrophysics Data System (ADS)

    Martínez-Moreno, F. J.; Galindo-Zaldívar, J.; Pedrera, A.; Teixidó, T.; Peña, J. A.; González-Castillo, L.

    2015-03-01

    Gravity can be considered an optimal geophysical method for cave detection, given the high density contrast between an empty cavity and the surrounding materials. A number of methods can be used for regional and residual gravity anomaly separation, although they have not been tested in natural scenarios. With the purpose of comparing the different methods, we calculate the residual anomalies associated with the karst system of Gruta de las Maravillas whose cave morphology and dimensions are well-known. A total of 1857 field measurements, mostly distributed in a regular grid of 10 × 10 m, cover the studied area. The microgravity data were acquired using a Scintrex CG5 gravimeter and topography control was carried out with a differential GPS. Regional anomaly maps were calculated by means of several algorithms to generate the corresponding residual gravimetric maps: polynomial first-order fitting, fast Fourier transformation with an upward continuation filter, moving average, minimum curvature and kriging methods. Results are analysed and discussed in terms of resolution, implying the capacity to detect shallow voids. We propose that polynomial fitting is the best technique when microgravity data are used to obtain the residual anomaly maps for cave detection.
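
    A small numerical sketch of the first-order polynomial separation (toy grid, not the survey data): a plane is least-squares fitted to the microgravity values to represent the regional field, and the residual anomaly is the difference, whose lows indicate possible voids.

      import numpy as np

      rng = np.random.default_rng(12)
      x, y = np.meshgrid(np.linspace(0, 300, 31), np.linspace(0, 300, 31))
      regional = 0.002 * x - 0.001 * y + 5.0                        # smooth regional trend (mGal)
      cave = -0.05 * np.exp(-((x - 150) ** 2 + (y - 150) ** 2) / (2 * 30 ** 2))  # void effect
      observed = regional + cave + rng.normal(0, 0.003, x.shape)

      # First-order polynomial (plane) fit: g ~ a*x + b*y + c.
      A = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
      coeffs, *_ = np.linalg.lstsq(A, observed.ravel(), rcond=None)
      residual = observed - (A @ coeffs).reshape(x.shape)

      i, j = np.unravel_index(np.argmin(residual), residual.shape)
      print("residual low centred near", x[i, j], y[i, j])          # ~ the cave location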

  20. Distribution water quality anomaly detection from UV optical sensor monitoring data by integrating principal component analysis with chi-square distribution.

    PubMed

    Hou, Dibo; Zhang, Jian; Yang, Zheling; Liu, Shu; Huang, Pingjie; Zhang, Guangxin

    2015-06-29

    Ensuring the security of distribution water quality has recently attracted global attention because of the potential threat from harmful contaminants. Real-time monitoring based on ultraviolet optical sensors is a promising technique: it is reagent-free, with low maintenance cost, rapid analysis and wide coverage. However, ultraviolet absorption spectra are high-dimensional and easily subject to interference. In on-site applications there is almost no prior knowledge, such as the spectral characteristics of potential contaminants, before they are determined, and the notion of normal water quality itself varies with operating conditions. In this paper, a procedure based on multivariate statistical analysis is proposed to detect distribution water quality anomalies from ultraviolet optical sensor data. First, principal component analysis is employed to capture the main features of variation in the spectral matrix and reduce its dimensionality. A new statistical variable is then constructed and used to evaluate the local outlying degree according to the chi-square distribution in the principal component subspace. The possibility that the latest observation is anomalous is calculated by accumulating the outlying degrees of the adjacent previous observations. Several key parameters are discussed to develop a more reliable anomaly detection procedure. Using the proposed methods, distribution water quality anomalies and abnormal optical changes can be detected. A contaminant intrusion experiment was conducted in a pilot-scale distribution system by injecting phenol solution, and the effectiveness of the proposed procedure is verified using the experimental spectral data. PMID:26191757
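
    A hedged sketch of the PCA-plus-chi-square idea: project each spectrum onto the principal-component subspace of normal spectra, score its distance against the chi-square distribution, and accumulate the outlying degrees over adjacent observations. All parameter choices are illustrative, not those of the paper.

    ```python
    # Score UV spectra against a PCA model of "normal" water using a chi-square statistic.
    import numpy as np
    from scipy.stats import chi2

    def fit_pca(normal_spectra, n_components=5):
        mean = normal_spectra.mean(axis=0)
        centered = normal_spectra - mean
        _, s, vt = np.linalg.svd(centered, full_matrices=False)
        var = (s[:n_components] ** 2) / (len(normal_spectra) - 1)
        return mean, vt[:n_components], var

    def outlying_degree(spectrum, mean, components, var):
        scores = components @ (spectrum - mean)
        t2 = np.sum(scores ** 2 / var)             # approximately chi-square under normal conditions
        return chi2.cdf(t2, df=len(var))           # probability of being this extreme

    def anomaly_probability(recent_degrees):
        return float(np.mean(recent_degrees))      # accumulate over adjacent observations

    rng = np.random.default_rng(2)
    normal = rng.normal(size=(500, 200))           # baseline UV absorption spectra (synthetic)
    mean, comps, var = fit_pca(normal)
    window = [outlying_degree(normal[i], mean, comps, var) for i in range(-5, 0)]
    print("anomaly probability of latest window:", anomaly_probability(window))
    ```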

  1. Sirenomelia with associated systemic anomalies: an autopsy pathologic illustration of a series of four cases.

    PubMed

    Chikkannaiah, Panduranga; Mahadevan, Anita; Gosavi, Manasi; Kangle, Ranjit; Anuradha; Shankar, S K

    2014-07-01

    Sirenomelia, a developmental defect involving the caudal region of the body, is associated with several internal visceral anomalies. We report a detailed spectrum of anomalies in an autopsy study of four fetuses with sirenomelia (gestational ages 20, 21, 22.4, and 22.5 weeks). Three of the fetuses had a single umbilical artery, with genitourinary and gastrointestinal anomalies. Central nervous system anomalies were evident in two of the fetuses, with alobar holoprosencephaly in one and lumbar meningomyelocele in another. The most common gastrointestinal anomaly was blind-ended gut (imperforate anus), while esophageal atresia and omphalocele were noted in one case each. Renal hypoplasia was seen in two fetuses, renal agenesis in one, and cystic renal dysplasia in one case. The literature regarding the pathogenesis of this condition is briefly discussed. PMID:24656289

  2. Fuzzy Logic Based Anomaly Detection for Embedded Network Security Cyber Sensor

    SciTech Connect

    Ondrej Linda; Todd Vollmer; Jason Wright; Milos Manic

    2011-04-01

    Resiliency and security in critical infrastructure control systems in the modern world of cyber terrorism constitute a relevant concern. Developing a network security system specifically tailored to the requirements of such critical assets is of primary importance. This paper proposes a novel learning algorithm for an anomaly-based network security cyber sensor together with its hardware implementation. The presented learning algorithm constructs a fuzzy logic rule-based model of normal network behavior. Individual fuzzy rules are extracted directly from the stream of incoming packets using an online clustering algorithm. This learning algorithm was specifically developed to comply with the constrained computational requirements of low-cost embedded network security cyber sensors. The performance of the system was evaluated on a set of network data recorded from an experimental test-bed mimicking the environment of a critical infrastructure control system.
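
    An illustrative sketch (not the published algorithm) of extracting Gaussian fuzzy membership functions from streaming packet features by online clustering, then scoring new packets by their maximum membership in any learned cluster.

    ```python
    # Online clustering of packet features into simple Gaussian fuzzy "rules".
    import numpy as np

    class OnlineFuzzyModel:
        def __init__(self, radius=5.0):
            self.radius = radius
            self.centers, self.counts = [], []

        def learn(self, x):
            if self.centers:
                d = np.linalg.norm(np.asarray(self.centers) - x, axis=1)
                i = int(np.argmin(d))
                if d[i] < self.radius:                      # update the nearest cluster
                    self.counts[i] += 1
                    self.centers[i] += (x - self.centers[i]) / self.counts[i]
                    return
            self.centers.append(np.array(x, dtype=float))   # spawn a new fuzzy rule
            self.counts.append(1)

        def normality(self, x):
            d = np.linalg.norm(np.asarray(self.centers) - x, axis=1)
            return float(np.max(np.exp(-(d ** 2) / (2 * self.radius ** 2))))  # fuzzy membership

    rng = np.random.default_rng(3)
    model = OnlineFuzzyModel()
    for packet in rng.normal(loc=[500.0, 80.0], scale=5.0, size=(1000, 2)):  # e.g. size, rate
        model.learn(packet)
    print("normal packet score:", model.normality(np.array([502.0, 79.0])))
    print("suspicious packet score:", model.normality(np.array([1500.0, 10.0])))
    ```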

  3. Development of newly designed VHF interferometer system for observing earthquake-related atmospheric anomalies

    PubMed Central

    Yamamoto, Isao; Fujiwara, Hironobu; Kamogawa, Masashi; Iyono, Atsushi; Kroumov, Valeri; Azakami, Takashi

    2009-01-01

    Temporal correlation between atmospheric anomalies and earthquakes has recently been verified statistically through measuring VHF FM radio waves transmitted beyond the line-of-sight. In order to locate the sources of such atmospheric anomalies, we developed a VHF interferometer system (bistatic-radar type) capable of finding the arrival direction of FM radio waves scattered possibly by earthquake-related atmospheric anomalies. In general, frequency modulation of FM radio waves produces ambiguity of arrival direction. However, our system, employing high-sampling rates of the order of kHz, can precisely measure the arrival direction of FM radio waves by stacking received signals. PMID:20009381

  4. Structure and dynamics of decadal anomalies in the wintertime midlatitude North Pacific ocean-atmosphere system

    NASA Astrophysics Data System (ADS)

    Fang, Jiabei; Yang, Xiu-Qun

    2015-12-01

    The structure and dynamics of decadal anomalies in the wintertime midlatitude North Pacific ocean-atmosphere system are examined in this study, using the NCEP/NCAR atmospheric reanalysis, HadISST SST and Simple Ocean Data Assimilation data for 1960-2010. The midlatitude decadal anomalies associated with the Pacific Decadal Oscillation (PDO) are identified, characterized by an equivalent barotropic atmospheric low (high) pressure over a cold (warm) oceanic surface. Such a unique configuration of decadal anomalies can be maintained by an unstable ocean-atmosphere interaction mechanism in the midlatitudes, hypothesized as follows. Associated with a warm PDO phase, an initial midlatitude surface westerly anomaly accompanied by an intensified Aleutian low tends to force a negative SST anomaly by increasing upward surface heat fluxes and driving a southward Ekman current anomaly. The SST cooling tends to increase the meridional SST gradient, thus enhancing the subtropical oceanic front. As the atmospheric boundary layer adjusts to the enhanced oceanic front, the low-level atmospheric meridional temperature gradient, and thus the low-level atmospheric baroclinicity, tends to be strengthened, inducing more active transient eddies that increase the transient eddy vorticity forcing. The vorticity forcing, which dominates the total atmospheric forcing, tends to produce an equivalent barotropic atmospheric low pressure north of the initial westerly anomaly, intensifying the initial anomalies of the midlatitude surface westerly and the Aleutian low. It is therefore suggested that midlatitude ocean-atmosphere interaction can provide a positive feedback mechanism for the development of the initial anomaly, in which the oceanic front and atmospheric transient eddies are indispensable ingredients. Such a positive ocean-atmosphere feedback mechanism is fundamentally responsible for the observed decadal anomalies in the midlatitude North Pacific ocean-atmosphere system.

  5. Recursive spectral similarity measure-based band selection for anomaly detection in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    He, Yuanlei; Liu, Daizhi; Yi, Shihua

    2011-01-01

    Band selection has been widely used in hyperspectral image processing for dimension reduction. In this paper, a recursive spectral similarity measure-based band selection (RSSM-BBS) method is presented. Unlike most existing image-based band selection techniques, it operates on two hyperspectral signatures, with its main focus on their spectral separability. Furthermore, it is unsupervised and based on recursive calculation of the spectral similarity measure as each additional band is included. To demonstrate the utility of the proposed method, an anomaly detection algorithm is developed that first extracts the anomalous target spectrum from the image using the automatic target detection and classification algorithm (ATDCA), followed by maximum spectral screening (MSS) to obtain a good estimate of the background, and then implements RSSM-BBS to select the bands that participate in the subsequent adaptive cosine/coherence estimator (ACE) target detection. As shown in the experimental results on the AVIRIS dataset, the detection performance of the ACE with the bands selected by RSSM-BBS is greatly improved over that obtained using the full band set.
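
    A minimal sketch of the final ACE detection stage named above; the RSSM-BBS band selection itself is not reproduced, and the cube, target signature, and background estimate below are placeholders.

    ```python
    # Adaptive cosine/coherence estimator (ACE) scores over a band-reduced image cube.
    import numpy as np

    def ace_scores(cube, target, background_pixels):
        """cube: (rows, cols, bands) image restricted to the selected bands.
        target: (bands,) anomalous target spectrum.
        background_pixels: (n, bands) background estimate (e.g. from MSS screening)."""
        mu = background_pixels.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(background_pixels, rowvar=False))
        x = cube.reshape(-1, cube.shape[-1]) - mu
        s = target - mu
        num = (x @ cov_inv @ s) ** 2
        den = (s @ cov_inv @ s) * np.einsum("ij,jk,ik->i", x, cov_inv, x)
        return (num / den).reshape(cube.shape[:2])

    rng = np.random.default_rng(4)
    cube = rng.normal(size=(50, 50, 12))                 # toy image with 12 selected bands
    target = np.ones(12)                                 # toy target signature
    scores = ace_scores(cube, target, cube.reshape(-1, 12)[:500])
    print("max ACE score:", scores.max())
    ```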

  6. Adaptive hidden Markov model with anomaly states for price manipulation detection.

    PubMed

    Cao, Yi; Li, Yuhua; Coleman, Sonya; Belatreche, Ammar; McGinnity, Thomas Martin

    2015-02-01

    Price manipulation refers to the activities of those traders who use carefully designed trading behaviors to manually push the underlying equity prices up or down for profit. With increasing volumes and frequency of trading, price manipulation can be extremely damaging to the proper functioning and integrity of capital markets. The existing literature focuses on either empirical studies of market abuse cases or analysis of particular manipulation types based on certain assumptions. Effective approaches for analyzing and detecting price manipulation in real time are yet to be developed. This paper proposes a novel approach, called the adaptive hidden Markov model with anomaly states (AHMMAS), for modeling and detecting price manipulation activities. Together with wavelet transformations and gradients as the feature extraction methods, the AHMMAS model caters to price manipulation detection and basic manipulation type recognition. The evaluation experiments, conducted on tick data for seven stocks from NASDAQ and the London Stock Exchange and on 10 stock price series simulated by stochastic differential equations, show that the proposed AHMMAS model can effectively detect price manipulation patterns and outperforms the selected benchmark models. PMID:25608293
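
    A hedged sketch of wavelet-plus-gradient feature extraction from a price series, assuming the PyWavelets package is available; the AHMMAS model itself is not reproduced here.

    ```python
    # Per-scale wavelet detail energies plus gradient statistics as tick-data features.
    import numpy as np
    import pywt

    def extract_features(prices, wavelet="db4", level=3):
        """Return per-scale detail energies plus simple gradient statistics."""
        coeffs = pywt.wavedec(prices, wavelet, level=level)
        detail_energy = [float(np.sum(c ** 2)) for c in coeffs[1:]]   # skip the approximation
        grad = np.gradient(prices)
        return np.array(detail_energy + [grad.mean(), grad.std(), np.abs(grad).max()])

    rng = np.random.default_rng(5)
    ticks = 100.0 + np.cumsum(rng.normal(scale=0.05, size=1024))      # toy tick prices
    print(extract_features(ticks))
    ```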

  7. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Kumar, Sricharan; Srivistava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
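
    One common residual-bootstrap recipe for prediction intervals around a non-parametric regressor (here k-nearest neighbours), shown as a sketch of the general idea rather than the exact procedure proposed in the paper.

    ```python
    # Bootstrap a prediction interval; an observed output far outside it is flagged as anomalous.
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    def bootstrap_prediction_interval(X, y, x_new, n_boot=500, alpha=0.05, k=15):
        rng = np.random.default_rng(0)
        base = KNeighborsRegressor(n_neighbors=k).fit(X, y)
        residuals = y - base.predict(X)
        preds = np.empty(n_boot)
        for b in range(n_boot):
            idx = rng.integers(0, len(y), size=len(y))            # resample training pairs
            model = KNeighborsRegressor(n_neighbors=k).fit(X[idx], y[idx])
            noise = rng.choice(residuals)                         # add a resampled residual
            preds[b] = model.predict(x_new.reshape(1, -1))[0] + noise
        return np.quantile(preds, [alpha / 2, 1 - alpha / 2])

    rng = np.random.default_rng(6)
    X = rng.uniform(0, 10, size=(300, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=300)
    lo, hi = bootstrap_prediction_interval(X, y, np.array([5.0]))
    print("95% prediction interval at x=5:", lo, hi)
    ```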

  8. Bayesian anomaly detection in heterogeneous media with applications to geophysical tomography

    NASA Astrophysics Data System (ADS)

    Simon, Martin

    2014-11-01

    In this paper, we consider the problem of detecting a parameterized anomaly in an isotropic, stationary and ergodic conductivity random field via electrical impedance tomography. A homogenization result for a stochastic forward problem built on the complete electrode model is derived, which serves as the basis for a two-stage numerical method in the framework of Bayesian inverse problems. The novelty of this method lies in the introduction of an enhanced error model accounting for the approximation errors that result from reducing the full forward model to a homogenized one. In the first stage, a MAP estimate for the reduced forward model equipped with the enhanced error model is computed. Then, in the second stage, a bootstrap prior based on the first stage results is defined and the resulting posterior distribution is sampled via Markov chain Monte Carlo. We provide the theoretical foundation of the proposed method, discuss different aspects of a numerical implementation and present numerical experiments to support our findings.

  9. Detection, identification and mapping of iron anomalies in brain tissue using X-ray absorption spectroscopy

    SciTech Connect

    Mikhaylova, A.; Davidson, M.; Toastmann, H.; Channell, J.E.T.; Guyodo, Y.; Batich, C.; Dobson, J.

    2008-06-16

    This work describes a novel method for the detection, identification and mapping of anomalous iron compounds in mammalian brain tissue using X-ray absorption spectroscopy. We have located and identified individual iron anomalies in an avian tissue model associated with ferritin, biogenic magnetite and haemoglobin with a pixel resolution of less than 5 µm. This technique represents a breakthrough in the study of both intra- and extra-cellular iron compounds in brain tissue. The potential for high-resolution iron mapping using microfocused X-ray beams has direct application to investigations of the location and structural form of iron compounds associated with human neurodegenerative disorders - a problem which has vexed researchers for 50 years.

  10. Detection, identification and mapping of iron anomalies in brain tissue using X-ray absorption spectroscopy

    PubMed Central

    Mikhaylova, A; Davidson, M; Toastmann, H; Channell, J.E.T; Guyodo, Y; Batich, C; Dobson, J

    2005-01-01

    This work describes a novel method for the detection, identification and mapping of anomalous iron compounds in mammalian brain tissue using X-ray absorption spectroscopy. We have located and identified individual iron anomalies in an avian tissue model associated with ferritin, biogenic magnetite and haemoglobin with a pixel resolution of less than 5 μm. This technique represents a breakthrough in the study of both intra- and extra-cellular iron compounds in brain tissue. The potential for high-resolution iron mapping using microfocused X-ray beams has direct application to investigations of the location and structural form of iron compounds associated with human neurodegenerative disorders—a problem which has vexed researchers for 50 years. PMID:16849161

  11. Seismological detection of low-velocity anomalies surrounding the mantle transition zone in Japan subduction zone

    NASA Astrophysics Data System (ADS)

    Liu, Zhen; Park, Jeffrey; Karato, Shun-ichiro

    2016-03-01

    In the Japan subduction zone, a locally depressed 660 discontinuity has been observed beneath northeast Asia, suggesting downwelling of materials from the mantle transition zone (MTZ). Vertical transport of water-rich MTZ materials across the major mineral phase changes could lead to water release and to partial melting in surrounding mantle regions, causing seismic low-velocity anomalies. Melt layers implied by low-velocity zones (LVZs) above the 410 discontinuity have been detected in many regions, but seismic evidence for partial melting below the 660 discontinuity has been limited. High-frequency migrated Ps receiver functions indicate LVZs below the depressed 660 discontinuity and above the 410 discontinuity in the deep Japan subduction zone, suggesting dehydration melting induced by water transport out of the MTZ. Our results provide insights into water circulation associated with dynamic interactions between the subducted slab and surrounding mantle.

  12. Anomaly and Signature Filtering Improve Classifier Performance For Detection Of Suspicious Access To EHRs

    PubMed Central

    Kim, Jihoon; Grillo, Janice M; Boxwala, Aziz A; Jiang, Xiaoqian; Mandelbaum, Rose B; Patel, Bhakti A; Mikels, Debra; Vinterbo, Staal A; Ohno-Machado, Lucila

    2011-01-01

    Our objective is to facilitate semi-automated detection of suspicious access to EHRs. Previously we have shown that a machine learning method can play a role in identifying potentially inappropriate access to EHRs. However, the problem of sampling informative instances to build a classifier still remained. We developed an integrated filtering method leveraging both anomaly detection based on symbolic clustering and signature detection, a rule-based technique. We applied the integrated filtering to 25.5 million access records in an intervention arm, and compared this with 8.6 million access records in a control arm where no filtering was applied. On the training set with cross-validation, the AUC was 0.960 in the control arm and 0.998 in the intervention arm. The difference in false negative rates on the independent test set was significant, P = 1.6×10⁻⁶. Our study suggests that utilization of integrated filtering strategies to facilitate the construction of classifiers can be helpful. PMID:22195129

  13. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, M. Clinton; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flow fields/plumes; the Optical Plume Anomaly Detection (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and an analytical package designed to utilize the information collected by OPAD is known as the Engine Diagnostic Filtering System (EDIFIS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Additionally, efforts are under way to encode components of the EDIFIS in hardware in order to address real-time operational requirements for health monitoring and management. This paper addresses OPAD with its tool suite and discusses what is considered a natural progression: a concept for migrating OPAD toward the detection of high-energy particles, including neutrons and gamma rays. The integration of these tools and capabilities will provide NASA with a systematic approach to monitoring the internal and external environments of space vehicles.

  14. Reliable detection of fluence anomalies in EPID-based IMRT pretreatment quality assurance using pixel intensity deviations

    SciTech Connect

    Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.

    2012-08-15

    Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ~1 mm² areas and ≥2% in ~20 mm² areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified.
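
    An illustrative sketch of pixel-intensity-deviation (PID) screening with gradient scaling and median filtering; thresholds, image sizes, and the injected anomaly are arbitrary.

    ```python
    # Flag pixels whose relative deviation from a reference fluence image exceeds a threshold.
    import numpy as np
    from scipy.ndimage import median_filter, sobel

    def pid_anomaly_map(test_img, ref_img, threshold=0.02):
        pid = (test_img - ref_img) / np.maximum(ref_img, 1e-6)       # relative deviation
        grad = np.hypot(sobel(ref_img, axis=0), sobel(ref_img, axis=1))
        pid_scaled = pid / (1.0 + grad / grad.mean())                # suppress edge effects
        pid_smooth = median_filter(pid_scaled, size=3)               # suppress pixel noise
        return np.abs(pid_smooth) > threshold

    rng = np.random.default_rng(7)
    ref = np.outer(np.hanning(256), np.hanning(256)) + 0.1           # smooth reference fluence
    test = ref * (1 + rng.normal(scale=0.005, size=ref.shape))       # noisy repeat image
    test[100:110, 100:110] *= 1.05                                   # injected 5% anomaly
    print("anomalous pixels flagged:", int(pid_anomaly_map(test, ref).sum()))
    ```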

  15. The 2014-2015 warming anomaly in the Southern California Current System observed by underwater gliders

    NASA Astrophysics Data System (ADS)

    Zaba, Katherine D.; Rudnick, Daniel L.

    2016-02-01

    Large-scale patterns of positive temperature anomalies persisted throughout the surface waters of the North Pacific Ocean during 2014-2015. In the Southern California Current System, measurements by our sustained network of underwater gliders reveal the coastal effects of the recent warming. Regional upper ocean temperature anomalies were greatest since the initiation of the glider network in 2006. Additional observed physical anomalies included a depressed thermocline, high stratification, and freshening; induced biological consequences included changes in the vertical distribution of chlorophyll fluorescence. Contemporaneous surface heat flux and wind strength perturbations suggest that local anomalous atmospheric forcing caused the unusual oceanic conditions.

  16. DEVELOPMENT AND TESTING OF PROCEDURES FOR CARRYING OUT EMERGENCY PHYSICAL INVENTORY TAKING AFTER DETECTING ANOMALY EVENTS CONCERNING NM SECURITY.

    SciTech Connect

    VALENTE,J.FISHBONE,L.ET AL.

    2003-07-13

    In the State Scientific Center of the Russian Federation - Institute of Physics and Power Engineering (SSC RF-IPPE, Obninsk), which is under Minatom jurisdiction, procedures for carrying out emergency physical inventory taking (EPIT) were developed and tested in cooperation with Brookhaven National Laboratory (USA). Here, emergency physical inventory taking means a PIT that is carried out when symptoms indicate a possible loss (theft) of nuclear material (NM). Such a PIT often requires verification of the attributes and quantitative characteristics of all the NM items located in a specific Material Balance Area (MBA). In order to carry out the exercise, an MBA was selected where many thousands of NM items containing highly enriched uranium are used. Three clients of the computerized material accounting system (CMAS) are installed in this MBA. Labels with identification numbers that are unique within the IPPE site, in the form of digit combinations and a corresponding bar code, have been applied to the NM items, containers and authorized locations. All the data to be checked during the EPIT are stored in the CMAS database. Five variants of anomalies initiating an EPIT, each requiring a different type of EPIT organization, are considered. Automatic working places (AWP) were created on the basis of the client computers in order to carry out a large number of measurements within a reasonable time. In addition to a CMAS client computer, the main components of an AWP include a bar-code reader, an electronic scale and an enrichment meter with a NaI detector, the MCA Inspector (manufactured by the Canberra Company). All these devices work together with a client computer in on-line mode. Special computer code (Emergency Inventory Software, EIS) was developed. All the algorithms for interaction between the operator and the system, as well as the algorithms for data exchange during the measurements and for data comparison, are implemented in this software. Registration of detected

  17. The architecture of a network level intrusion detection system

    SciTech Connect

    Heady, R.; Luger, G.; Maccabe, A.; Servilla, M.

    1990-08-15

    This paper presents the preliminary architecture of a network level intrusion detection system. The proposed system will monitor base level information in network packets (source, destination, packet size, and time), learning the normal patterns and announcing anomalies as they occur. The goal of this research is to determine the applicability of current intrusion detection technology to the detection of network level intrusions. In particular, the authors are investigating the possibility of using this technology to detect and react to worm programs.

  18. Value of Ultrasound in Detecting Urinary Tract Anomalies After First Febrile Urinary Tract Infection in Children.

    PubMed

    Ghobrial, Emad E; Abdelaziz, Doaa M; Sheba, Maha F; Abdel-Azeem, Yasser S

    2016-05-01

    Background Urinary tract infection (UTI) is an infection that affects part of the urinary tract. Ultrasound is a noninvasive test that can demonstrate the size and shape of the kidneys, the presence of dilatation of the ureters, and the existence of anatomic abnormalities. The aim of the study is to estimate the value of ultrasound in detecting urinary tract anomalies after a first attack of UTI. Methods This study was conducted at the Nephrology Clinic, New Children's Hospital, Faculty of Medicine, Cairo University, from August 2012 to March 2013, and included 30 children who presented with a first attack of acute febrile UTI. All patients were subjected to urine analysis, urine culture and sensitivity, serum creatinine, complete blood count, and imaging in the form of renal ultrasound, voiding cysto-urethrography, and renal scan. Results All the patients had fever (mean 38.96°C ± 0.44°C) and the mean duration of illness was 6.23 ± 5.64 days. Nineteen patients (63.3%) had an ultrasound abnormality, the commonest being kidney stones (15.8%). Only 2 patients with an abnormal ultrasound also had vesicoureteric reflux on cystourethrography. Sensitivity of ultrasound was 66.7%, specificity was 37.5%, positive predictive value was 21.1%, negative predictive value was 81.8%, and total accuracy was 43.33%. Conclusion We concluded that ultrasound alone was of limited value in diagnosing and planning management of a first attack of febrile UTI. Combined investigations are recommended as the best way to confirm the diagnosis of urinary tract anomalies. PMID:26084536

  19. Airborne detection of magnetic anomalies associated with soils on the Oak Ridge Reservation, Tennessee

    SciTech Connect

    Doll, W.E.; Beard, L.P.; Helm, J.M.

    1995-04-01

    Reconnaissance airborne geophysical data acquired over the 35,000-acre Oak Ridge Reservation (ORR), TN, show several magnetic anomalies over undisturbed areas mapped as Copper Ridge Dolomite (CRD). The anomalies of interest are most apparent in magnetic gradient maps, where they exceed 0.06 nT/m and in some cases exceed 0.5 nT/m. Anomalies as large as 25 nT are seen on the maps. Some of the anomalies correlate with known or suspected karst, or with apparent conductivity anomalies calculated from electromagnetic data acquired contemporaneously with the magnetic data. Some of the anomalies correlate strongly with topographic lows or closed depressions. Surface magnetic data acquired over some of these sites have confirmed the existence of the anomalies. Ground inspections in the vicinity of several of the anomalies have not led to any discoveries of man-made surface materials of sufficient size to generate the observed anomalies. One would expect an anomaly of approximately 1 nT for a pickup truck at 200 ft altitude, whereas typical residual magnetic anomalies here have magnitudes of 5-10 nT and some are as large as 25 nT. The absence of roads or other indications of culture (past or present) near the anomalies, together with modeling of anomalies in data acquired with surface instruments, indicates that man-made metallic objects are unlikely to be responsible. The authors show that the observed anomalies in the CRD can reasonably be associated with thickening of the soil layer. The occurrence of the anomalies in areas showing evidence of karstification follows because sediment deposition would occur in topographic lows. Linear groups of anomalies on the maps may be associated with fracture zones that were eroded more than adjacent rocks and were subsequently covered with a thicker blanket of sediment. This study indicates that airborne magnetic data may be of use at other sites where fracture zones or buried collapse structures are of interest.

  20. Hypergraph-based anomaly detection of high-dimensional co-occurrences.

    PubMed

    Silva, Jorge; Willett, Rebecca

    2009-03-01

    This paper addresses the problem of detecting anomalous multivariate co-occurrences using a limited number of unlabeled training observations. A novel method based on using a hypergraph representation of the data is proposed to deal with this very high-dimensional problem. Hypergraphs constitute an important extension of graphs which allow edges to connect more than two vertices simultaneously. A variational Expectation-Maximization algorithm for detecting anomalies directly on the hypergraph domain without any feature selection or dimensionality reduction is presented. The resulting estimate can be used to calculate a measure of anomalousness based on the False Discovery Rate. The algorithm has O(np) computational complexity, where n is the number of training observations and p is the number of potential participants in each co-occurrence event. This efficiency makes the method ideally suited for very high-dimensional settings, and requires no tuning, bandwidth or regularization parameters. The proposed approach is validated on both high-dimensional synthetic data and the Enron email database, where p > 75,000, and it is shown that it can outperform other state-of-the-art methods. PMID:19147882

  1. Insider threat detection enabled by converting user applications into fractal fingerprints and autonomously detecting anomalies

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger M.; Handley, James

    2012-06-01

    We demonstrate insider threat detection for determining when the behavior of a computer user is suspicious or different from his or her normal behavior. This is accomplished by combining features extracted from text, emails, and blogs that are associated with the user. These sources can be characterized using QUEST, DANCER, and MenTat to extract features; however, some of these features are still in text form. We show how to convert these features into numerical form and characterize them using parametric and non-parametric statistics. These features are then used as input into a Random Forest classifier that is trained to recognize whenever the user's behavior is suspicious or different from normal (off-nominal). Active authentication (user identification) is also demonstrated using the features and classifiers derived in this work. We also introduce a novel concept for remotely monitoring user behavior indicator patterns displayed as an infrared overlay on the computer monitor, which the user is unaware of, but a narrow pass-band filtered webcam can clearly distinguish. The results of our analysis are presented.

  2. Idaho Explosive Detection System

    SciTech Connect

    Klinger, Jeff

    2011-01-01

    Learn how INL researchers are making the world safer by developing an explosives detection system that can inspect cargo. For more information about INL security research, visit http://www.facebook.com/idahonationallaboratory

  3. Idaho Explosive Detection System

    ScienceCinema

    Klinger, Jeff

    2013-05-28

    Learn how INL researchers are making the world safer by developing an explosives detection system that can inspect cargo. For more information about INL security research, visit http://www.facebook.com/idahonationallaboratory

  4. Underwater laser detection system

    NASA Astrophysics Data System (ADS)

    Gomaa, Walid; El-Sherif, Ashraf F.; El-Sharkawy, Yasser H.

    2015-02-01

    The conventional method used to detect an underwater target is to send and receive some form of acoustic energy, but acoustic systems have limitations in range resolution and accuracy, whereas the potential benefits of laser-based underwater target detection include high directionality, high response, and high range accuracy. Lasers operating in the blue-green region of the light spectrum (420-570 nm) have several applications in the detection and ranging of submersible targets, owing to minimal attenuation through water (less than 0.1 m⁻¹) and maximal laser reflection from the target of interest (such as a mine or submarine), which provide a long detection range. In this paper, laser attenuation in water was measured experimentally by a new, simple method using a high-resolution spectrometer. The laser echoes from different targets (metal, plastic, wood, and rubber) were detected using a high-resolution CCD camera; the position of the detection camera was optimized to capture strong laser reflections from the target with low backscattering noise from the water medium, and digital image processing techniques were applied to detect and discriminate the echoes from the metal target and to subtract the echoes from other objects. Extraction of the target image from the scattering noise is performed by background subtraction and edge detection techniques. In conclusion, we present a high-response laser imaging system to detect and discriminate small, mine-like underwater targets.
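
    A generic background-subtraction and edge-detection sketch of the kind described above, using SciPy; it is not the authors' processing chain, and all sizes and thresholds are placeholders.

    ```python
    # Subtract a background frame, then extract the target outline with a Sobel edge map.
    import numpy as np
    from scipy.ndimage import gaussian_filter, sobel

    def extract_target(frame, background, diff_threshold=0.1, edge_threshold=0.05):
        diff = np.abs(frame.astype(float) - background.astype(float))
        mask = diff > diff_threshold * diff.max()                 # background subtraction
        smoothed = gaussian_filter(frame.astype(float) * mask, sigma=1.5)
        edges = np.hypot(sobel(smoothed, axis=0), sobel(smoothed, axis=1))
        return edges > edge_threshold * edges.max()               # edge map of the target

    rng = np.random.default_rng(8)
    background = rng.normal(0.2, 0.02, size=(200, 200))           # scattering from the water
    frame = background.copy()
    frame[80:120, 90:130] += 0.5                                  # strong echo from a target
    print("edge pixels found:", int(extract_target(frame, background).sum()))
    ```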

  5. Bro Intrusion Detection System

    SciTech Connect

    Paxson, Vern; Campbell, Scott; leres, Craig; Lee, Jason

    2006-01-25

    Bro is a Unix-based Network Intrusion Detection System (IDS). Bro monitors network traffic and detects intrusion attempts based on the traffic characteristics and content. Bro detects intrusions by comparing network traffic against rules describing events that are deemed troublesome. These rules might describe activities (e.g., certain hosts connecting to certain services), what activities are worth alerting (e.g., attempts to a given number of different hosts constitutes a "scan"), or signatures describing known attacks or access to known vulnerabilities. If Bro detects something of interest, it can be instructed to either issue a log entry or initiate the execution of an operating system command. Bro targets high-speed (Gbps), high-volume intrusion detection. By judiciously leveraging packet filtering techniques, Bro is able to achieve the performance necessary to do so while running on commercially available PC hardware, and thus can serve as a cost effective means of monitoring a site’s Internet connection.

  6. Dysplasia of the atrioventricular valves associated with conduction system anomalies.

    PubMed Central

    Daliento, L; Nava, A; Fasoli, G; Mazzucco, A; Thiene, G

    1984-01-01

    Clinical, vectorcardiographic, and echocardiographic data from two siblings with atrial septal defects and dysplasia of the mitral and tricuspid valves are reported. Vectorcardiograms showed that both siblings had abnormal ventricular activation with initial electrical forces directed posteriorly. One sibling died after surgery, and necropsy showed incomplete differentiation of the leaflets and tensor apparatus producing anomalies resembling "mitral arcade." Serial histological examination of the conducting tissue showed that the atrioventricular node was located on the left side of the atrial septum, that the central fibrous body and the membranous septum were hypoplastic, and that an accessory nodoventricular pathway originating in the compact node joined the left side of the ventricular septum. This accessory pathway was probably the cause of the unusual ventricular activation. Dysplasia of the mitral and tricuspid valves together with hypoplasia of the central fibrous body and the presence of accessory pathways are probably part of a malformative complex caused by incomplete differentiation of both the cardiac atrioventricular valves and the junctional area. PMID:6696801

  7. Experiments to Detect Clandestine Graves from Interpreted High Resolution Geophysical Anomalies

    NASA Astrophysics Data System (ADS)

    Molina, C. M.; Hernandez, O.; Pringle, J.

    2013-05-01

    This project concerns the search for clandestine sites where missing people may have been buried, based on interpreted near-surface high-resolution geophysical anomalies. There are thousands of missing people around the world who may have been tortured, killed and buried in clandestine graves. This is a serious problem for their families and for the governments responsible for guaranteeing human rights. These people need to be found and the related criminal cases resolved. This work proposes to construct a series of graves in which all conditions of the grave, human remains and related objects are known. The aim is to detect contrasting physical properties of the soil in order to identify the known human remains and objects. The proposed geophysical methods will include electrical tomography, magnetics and ground-penetrating radar, among others. Two geographical sites with contrasting weather, soil, vegetation, geographic and geologic conditions will be selected in which to locate and build the standard graves. Forward and inverse modeling will be applied to locate and enhance the geophysical response of the known graves and to validate the methodology. As a result, an integrated geophysical program will be provided to support the search for clandestine graves, helping to find missing people who have been illegally buried. Optionally, the methodology will be tested in the search for real clandestine graves.

  8. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks

    PubMed Central

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-01-01

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens’ quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5% and 26% higher in a scenario with a false positive rate of 15%. PMID:27304957

  9. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks.

    PubMed

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-01-01

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens' quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5% and 26% higher in a scenario with a false positive rate of 15%. PMID:27304957
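
    A minimal one-class SVM sketch in the spirit of the comparison above; the sensor features, injected attack, and parameters are placeholders rather than the Barcelona data.

    ```python
    # Train a one-class SVM on normal sensor readings and flag injected attack readings.
    import numpy as np
    from sklearn.svm import OneClassSVM
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(9)
    normal_readings = rng.normal(loc=[21.0, 45.0], scale=[1.0, 3.0], size=(2000, 2))  # temp, humidity
    attacked_readings = rng.normal(loc=[35.0, 10.0], scale=[1.0, 3.0], size=(50, 2))  # injected values

    model = make_pipeline(StandardScaler(), OneClassSVM(kernel="rbf", nu=0.05, gamma="scale"))
    model.fit(normal_readings)

    pred_attack = model.predict(attacked_readings)    # -1 means anomaly, +1 means normal
    print("true positive rate on injected attacks:", float(np.mean(pred_attack == -1)))
    ```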

  10. Anomaly Identification from Super-Low Frequency Electromagnetic Data for the Coalbed Methane Detection

    NASA Astrophysics Data System (ADS)

    Zhao, S. S.; Wang, N.; Hui, J.; Ye, X.; Qin, Q.

    2016-06-01

    Natural-source super low frequency (SLF) electromagnetic prospecting methods have become an increasingly promising tool in resource detection. Estimating the capacity of reservoirs is of great importance for evaluating their exploitation potential. In this paper, we built a signal-estimation model for the SLF electromagnetic signal and processed the monitored data with an adaptive filter. A non-normality test showed that the distribution of the signal differs clearly from a Gaussian probability distribution, and that the Class B instantaneous amplitude probability model describes the statistical properties of SLF electromagnetic data well. Parameter estimation for the Class B model is complicated because its kernel function is a confluent hypergeometric function. The parameters of the model were therefore estimated from the property spectral function using the Least Square Gradient Method (LSGM). Simulations of this estimation method demonstrated that the LSGM can recover the important components of the Class B signal model, in which the Gaussian component is interpreted as systematic and random noise and the intermediate event component as background and human-activity noise. The observed data were then processed with an adaptive noise cancellation filter. With the noise components subtracted adaptively, the remaining part is the signal of interest, i.e., the anomaly information, which is considered relevant to the reservoir position in the coalbed methane stratum.
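
    A sketch of adaptive noise cancellation with a basic LMS filter: a reference channel correlated with the noise is filtered to estimate and subtract the noise, leaving the anomaly component of interest. The signals are synthetic placeholders, not SLF field data.

    ```python
    # Least-mean-squares (LMS) adaptive noise cancellation on a synthetic primary channel.
    import numpy as np

    def lms_cancel(primary, reference, n_taps=16, mu=0.01):
        w = np.zeros(n_taps)
        output = np.zeros_like(primary)
        for n in range(n_taps, len(primary)):
            x = reference[n - n_taps:n][::-1]          # most recent reference samples
            noise_estimate = w @ x
            e = primary[n] - noise_estimate            # error = cleaned signal sample
            w += 2 * mu * e * x                        # LMS weight update
            output[n] = e
        return output

    rng = np.random.default_rng(10)
    t = np.arange(4000)
    noise_source = rng.normal(size=t.size)
    noise_in_primary = np.convolve(noise_source, [0.6, 0.3, 0.1], mode="same")
    signal = 0.2 * np.sin(2 * np.pi * t / 400)         # weak "anomaly" component
    cleaned = lms_cancel(signal + noise_in_primary, noise_source)
    print("residual noise power:", np.var(cleaned[2000:] - signal[2000:]))
    ```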

  11. Extraction of oil slicks on the sea surface from optical satellite images by using an anomaly detection technique

    NASA Astrophysics Data System (ADS)

    Chen, Chi-Farn; Chang, Li-Yu

    2010-12-01

    Many methods for the detection of oil pollution on the sea surface from remotely sensed images have been developed in recent years. However, because of the diverse physical properties of oil on the sea surface in the visible wavelengths, such images are easily affected by the surrounding environment. This is a common difficulty encountered when optical satellite images are used as data sources for observing oil slicks on the sea surface. However, provided the spectral interference generated by the surrounding environment can be regarded as noise and properly modeled, the spectral anomalies caused by an oil slick on normal sea water may be observed after the suppression of this noise. In this study, sea surface oil slicks are extracted by detecting spectral anomalies in multispectral optical satellite images. First, assuming that the sea water and oil slick comprise the dominant background and target anomaly, respectively, an RX algorithm is used to enhance the oil slick anomaly. The oil slick can be distinguished from the sea water background after modeling and suppression of inherent noise. Next, a Gaussian mixture model is used to characterize the statistical distributions of the background and anomaly, respectively. The expectation maximization (EM) algorithm is used to obtain the parameters needed for the Gaussian mixture model. Finally, according to the Bayesian decision rule of minimum error, an optimized threshold can be obtained to extract the oil slick areas from the source image. Furthermore, with the obtained Gaussian distributions and optimized threshold, a theoretical false alarm level can be established to evaluate the quality of the extracted oil slicks. Experimental results show that the proposed method can not only successfully detect oil slicks from multispectral optical satellite images, but also provide a quantitative accuracy evaluation of the detected image.
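
    A minimal global RX detector sketch with a two-component Gaussian mixture used to pick a threshold, loosely following the processing chain described above; the image, the injected slick, and the threshold rule are illustrative only.

    ```python
    # RX anomaly scores (Mahalanobis distance from global background statistics) on a toy cube.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def rx_scores(image):
        """image: (rows, cols, bands). Mahalanobis distance of each pixel from the background."""
        x = image.reshape(-1, image.shape[-1])
        mu = x.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(x, rowvar=False))
        d = x - mu
        return np.einsum("ij,jk,ik->i", d, cov_inv, d).reshape(image.shape[:2])

    rng = np.random.default_rng(11)
    sea = rng.normal(loc=0.3, scale=0.02, size=(100, 100, 4))     # background sea water
    sea[40:45, 50:60, :] += 0.1                                   # spectral anomaly (oil slick)
    scores = rx_scores(sea)

    # crude two-component mixture on the scores: use the high-mean component as "anomaly"
    gmm = GaussianMixture(n_components=2, random_state=0).fit(scores.reshape(-1, 1))
    labels = gmm.predict(scores.reshape(-1, 1))
    threshold = scores.reshape(-1)[labels == int(np.argmax(gmm.means_))].min()
    print("pixels above threshold:", int((scores >= threshold).sum()))
    ```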

  12. A machine independent expert system for diagnosing environmentally induced spacecraft anomalies

    NASA Technical Reports Server (NTRS)

    Rolincik, Mark J.

    1991-01-01

    A new rule-based, machine-independent analytical tool for diagnosing spacecraft anomalies, the EnviroNET expert system, was developed. Expert systems provide an effective method for storing knowledge, allow computers to sift through large amounts of data to pinpoint the significant parts, and, most importantly, use heuristics in addition to algorithms, which permits approximate reasoning and inference and the ability to attack problems that are not rigidly defined. The EnviroNET expert system knowledge base currently contains over two hundred rules and links to databases that include past environmental data, satellite data, and previously known anomalies. The environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose.

  13. Portable pathogen detection system

    DOEpatents

    Colston, Billy W.; Everett, Matthew; Milanovich, Fred P.; Brown, Steve B.; Vendateswaran, Kodumudi; Simon, Jonathan N.

    2005-06-14

    A portable pathogen detection system that accomplishes on-site multiplex detection of targets in biological samples. The system includes: microbead specific reagents, incubation/mixing chambers, a disposable microbead capture substrate, and an optical measurement and decoding arrangement. The basis of this system is a highly flexible Liquid Array that utilizes optically encoded microbeads as the templates for biological assays. Target biological samples are optically labeled and captured on the microbeads, which are in turn captured on an ordered array or disordered array disposable capture substrate and then optically read.

  14. Solar system fault detection

    DOEpatents

    Farrington, R.B.; Pruett, J.C. Jr.

    1984-05-14

    A fault detecting apparatus and method are provided for use with an active solar system. The apparatus provides an indication as to whether one or more predetermined faults have occurred in the solar system. The apparatus includes a plurality of sensors, each sensor being used in determining whether a predetermined condition is present. The outputs of the sensors are combined in a pre-established manner in accordance with the kind of predetermined faults to be detected. Indicators communicate with the outputs generated by combining the sensor outputs to give the user of the solar system and the apparatus an indication as to whether a predetermined fault has occurred. Upon detection and indication of any predetermined fault, the user can take appropriate corrective action so that the overall reliability and efficiency of the active solar system are increased.

  15. Solar system fault detection

    DOEpatents

    Farrington, Robert B.; Pruett, Jr., James C.

    1986-01-01

    A fault detecting apparatus and method are provided for use with an active solar system. The apparatus provides an indication as to whether one or more predetermined faults have occurred in the solar system. The apparatus includes a plurality of sensors, each sensor being used in determining whether a predetermined condition is present. The outputs of the sensors are combined in a pre-established manner in accordance with the kind of predetermined faults to be detected. Indicators communicate with the outputs generated by combining the sensor outputs to give the user of the solar system and the apparatus an indication as to whether a predetermined fault has occurred. Upon detection and indication of any predetermined fault, the user can take appropriate corrective action so that the overall reliability and efficiency of the active solar system are increased.

  16. Idaho Explosives Detection System

    SciTech Connect

    Edward L. Reber; Larry G. Blackwood; Andrew J. Edwards; J. Keith Jewell; Kenneth W. Rohde; Edward H. Seabury; Jeffery B. Klinger

    2005-12-01

    The Idaho Explosives Detection System was developed at the Idaho National Laboratory (INL) to respond to threats imposed by delivery trucks potentially carrying explosives into military bases. A full-scale prototype system has been built and is currently undergoing testing. The system consists of two racks, one on each side of a subject vehicle. Each rack includes a neutron generator and an array of NaI detectors. The two neutron generators are pulsed and synchronized. A laptop computer controls the entire system. The control software is easily operable by minimally trained staff. The system was developed to detect explosives in a medium size truck within a 5-min measurement time. System performance was successfully demonstrated with explosives at the INL in June 2004 and at Andrews Air Force Base in July 2004.

  17. Damage detection in initially nonlinear systems

    SciTech Connect

    Bornn, Luke; Farrar, Charles; Park, Gyuhae

    2009-01-01

    The primary goal of Structural Health Monitoring (SHM) is to detect structural anomalies before they reach a critical level. Because of the potential life-safety and economic benefits, SHM has been widely studied over the past decade. In recent years there has been an effort to provide solid mathematical and physical underpinnings for these methods; however, most focus on systems that behave linearly in their undamaged state - a condition that often does not hold in complex 'real world' systems and systems for which monitoring begins mid-lifecycle. In this work, we highlight the inadequacy of linear-based methodology in handling initially nonlinear systems. We then show how the recently developed autoregressive support vector machine (AR-SVM) approach to time series modeling can be used for detecting damage in a system that exhibits initially nonlinear response. This process is applied to data acquired from a structure with induced nonlinearity tested in a laboratory environment.
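
    A sketch of the AR-SVM idea: fit a support-vector autoregressive model to the baseline (possibly nonlinear but undamaged) response and monitor the growth of one-step-ahead prediction errors on new data; the signals and hyperparameters are placeholders.

    ```python
    # Support-vector autoregression: residual growth on new data serves as a damage indicator.
    import numpy as np
    from sklearn.svm import SVR

    def lagged_matrix(x, order):
        X = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
        return X, x[order:]

    def damage_indicator(baseline, test, order=10):
        Xb, yb = lagged_matrix(baseline, order)
        model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(Xb, yb)
        Xt, yt = lagged_matrix(test, order)
        resid_base = yb - model.predict(Xb)
        resid_test = yt - model.predict(Xt)
        return float(np.std(resid_test) / np.std(resid_base))   # >> 1 suggests a change

    rng = np.random.default_rng(12)
    t = np.arange(2000)
    baseline = np.sin(0.1 * t) + 0.3 * np.sin(0.1 * t) ** 3 + rng.normal(scale=0.05, size=t.size)
    damaged = np.sin(0.12 * t) + 0.3 * np.sin(0.12 * t) ** 3 + rng.normal(scale=0.05, size=t.size)
    print("damage indicator:", damage_indicator(baseline, damaged))
    ```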

  18. Detecting Anomalous Insiders in Collaborative Information Systems

    PubMed Central

    Chen, You; Nyemba, Steve; Malin, Bradley

    2012-01-01

    Collaborative information systems (CISs) are deployed within a diverse array of environments that manage sensitive information. Current security mechanisms detect insider threats, but they are ill-suited to monitor systems in which users function in dynamic teams. In this paper, we introduce the community anomaly detection system (CADS), an unsupervised learning framework to detect insider threats based on the access logs of collaborative environments. The framework is based on the observation that typical CIS users tend to form community structures based on the subjects accessed (e.g., patients’ records viewed by healthcare providers). CADS consists of two components: 1) relational pattern extraction, which derives community structures and 2) anomaly prediction, which leverages a statistical model to determine when users have sufficiently deviated from communities. We further extend CADS into MetaCADS to account for the semantics of subjects (e.g., patients’ diagnoses). To empirically evaluate the framework, we perform an assessment with three months of access logs from a real electronic health record (EHR) system in a large medical center. The results illustrate our models exhibit significant performance gains over state-of-the-art competitors. When the number of illicit users is low, MetaCADS is the best model, but as the number grows, commonly accessed semantics lead to hiding in a crowd, such that CADS is more prudent. PMID:24489520

  19. Detection of aeromagnetic anomaly change associated with volcanic activity: An application of the generalized mis-tie control method

    NASA Astrophysics Data System (ADS)

    Nakatsuka, Tadashi; Utsugi, Mitsuru; Okuma, Shigeo; Tanaka, Yoshikazu; Hashimoto, Takeshi

    2009-12-01

    Repeat aeromagnetic surveys may assist in mapping and monitoring long-term changes associated with volcanic activity. However, when dealing with repeat aeromagnetic survey data, the problem of how to extract the real change of magnetic anomalies from a limited set of observations arises, i.e. the problem of spatial aliasing. Recent development of the generalized mis-tie control method for aeromagnetic surveys flown at variable elevations enables us to statistically extract the errors from ambiguous noise sources. This technique can be applied to overcome the spatial alias effect when detecting magnetic anomaly changes between aeromagnetic surveys flown at different times. We successfully apply this technique to Asama Volcano, one of the active volcanoes in Japan, which erupted in 2004. Following the volcanic activity in 2005, we conducted a helicopter-borne aeromagnetic survey, which we compare here to the result from a previous survey flown in 1992. To discuss small changes in magnetic anomalies induced by volcanic activity, it is essential to estimate the accuracy of the reference and the repeat aeromagnetic measurements and the probable errors induced by data processing. In our case, the positioning inaccuracy of the 1992 reference survey was the most serious factor affecting the estimation of the magnetic anomaly change because GPS was still in an early stage at that time. However, our analysis revealed that the magnetic anomaly change over the Asama Volcano area from 1992 to 2005 exceeded the estimated error at three locations, one of which is interpreted as a loss of magnetization induced by volcanic activity. In this study, we suffered from the problem of positioning inaccuracy in the 1992 survey data, and it was important to evaluate its effect when deriving the magnetic anomaly change.

  20. MODVOLC2: A Hybrid Time Series Analysis for Detecting Thermal Anomalies Applied to Thermal Infrared Satellite Data

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Wright, R.; Pilger, E.

    2009-12-01

    We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT) to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, and it detects pixels that fall outside the envelope of normal thermal behavior, (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than using MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for the MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. We also found
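
    A hedged sketch of the reference/variability step: build per-pixel reference and variability images for a calendar month from a stack of scenes, then flag pixels in a new scene that fall outside the normal envelope. Sizes and thresholds are illustrative.

    ```python
    # Per-pixel monthly reference/variability images and envelope-based anomaly flagging.
    import numpy as np

    def monthly_reference(stack):
        """stack: (n_scenes, rows, cols) radiance images for one calendar month."""
        reference = np.nanmedian(stack, axis=0)        # robust per-pixel reference
        variability = np.nanstd(stack, axis=0)
        return reference, variability

    def detect_thermal_anomalies(scene, reference, variability, k=4.0):
        z = (scene - reference) / np.maximum(variability, 1e-6)
        return z > k                                   # pixels hotter than the normal envelope

    rng = np.random.default_rng(13)
    stack = rng.normal(loc=270.0, scale=2.0, size=(80, 64, 64))   # ~80 scenes, as suggested above
    ref, var = monthly_reference(stack)
    new_scene = rng.normal(loc=270.0, scale=2.0, size=(64, 64))
    new_scene[30:32, 30:32] += 40.0                               # an active lava/flare pixel
    print("anomalous pixels:", int(detect_thermal_anomalies(new_scene, ref, var).sum()))
    ```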

  1. Water system virus detection

    NASA Technical Reports Server (NTRS)

    Fraser, A. S.; Wells, A. F.; Tenoso, H. J.

    1975-01-01

    A monitoring system developed to test the capability of a water recovery system to reject the passage of viruses into the recovered water is described. A nonpathogenic marker virus, bacteriophage F2, is fed into the process stream before the recovery unit and the reclaimed water is assayed for its presence. Detection of the marker virus consists of two major components, concentration and isolation of the marker virus, and detection of the marker virus. The concentration system involves adsorption of virus to cellulose acetate filters in the presence of trivalent cations and low pH with subsequent desorption of the virus using volumes of high pH buffer. The detection of the virus is performed by a passive immune agglutination test utilizing specially prepared polystyrene particles. An engineering preliminary design was performed as a parallel effort to the laboratory development of the marker virus test system. Engineering schematics and drawings of a fully functional laboratory prototype capable of zero-G operation are presented. The instrument consists of reagent pump/metering system, reagent storage containers, a filter concentrator, an incubation/detector system, and an electronic readout and control system.

  2. Subsurface faults detection based on magnetic anomalies investigation: A field example at Taba protectorate, South Sinai

    NASA Astrophysics Data System (ADS)

    Khalil, Mohamed H.

    2016-08-01

    Quantitative interpretation of magnetic data, particularly over a complex dissected structure, necessitates the use of filtering techniques. In the Taba protectorate, Sinai, a synthesis of different filtering algorithms was carried out to distinguish and verify the subsurface structure and to estimate the depth of the causative magnetic sources. In order to separate the shallow-seated structure, filters of the vertical derivatives (VDR), Butterworth high-pass (BWHP), analytic signal (AS) amplitude, and total horizontal derivative of the tilt derivative (TDR_THDR) were applied, while filters of the apparent susceptibility and Butterworth low-pass (BWLP) were applied to identify the deep-seated structure. The depths of the geological contacts and faults were calculated by 3D Euler deconvolution. Notably, TDR_THDR was independent of geomagnetic inclination, significantly less susceptible to noise, and more sensitive to the details of the shallow superimposed structures, whereas the BWLP proved highly capable of attenuating the shorter wavelengths of the near-surface anomalies and emphasizing the longer wavelengths derived from the deeper causative structure. 3D Euler deconvolution (SI = 0) was quite amenable to estimating the depths of the superimposed subsurface structure. The pattern, location, and trend of the deduced shallow and deep faults conformed remarkably well to the established fault system.
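    As a rough illustration of the TDR_THDR filter named above: the tilt derivative is the arctangent of the vertical derivative over the total horizontal derivative, and its own total horizontal derivative peaks over source edges. A minimal numpy sketch, assuming a regularly gridded anomaly and an FFT-based vertical derivative (names are illustrative, not from the paper):

    ```python
    import numpy as np

    def tilt_and_thdr(T, dx, dy):
        """Hedged sketch: tilt derivative (TDR) and the total horizontal
        derivative of the tilt derivative (TDR_THDR) of a gridded anomaly T."""
        ny, nx = T.shape
        # horizontal derivatives in the space domain
        dTdy, dTdx = np.gradient(T, dy, dx)
        # vertical derivative via the wavenumber domain: F[dT/dz] = |k| F[T]
        kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
        ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
        KX, KY = np.meshgrid(kx, ky)
        k = np.sqrt(KX ** 2 + KY ** 2)
        dTdz = np.real(np.fft.ifft2(k * np.fft.fft2(T)))

        thdr = np.hypot(dTdx, dTdy)
        tdr = np.arctan2(dTdz, thdr)            # tilt derivative, bounded +/- pi/2
        # total horizontal derivative of the tilt derivative: edges peak here
        dtdy, dtdx = np.gradient(tdr, dy, dx)
        return tdr, np.hypot(dtdx, dtdy)
    ```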

  3. Discrete shearlet transform on GPU with applications in anomaly detection and denoising

    NASA Astrophysics Data System (ADS)

    Gibert, Xavier; Patel, Vishal M.; Labate, Demetrio; Chellappa, Rama

    2014-12-01

    Shearlets have emerged in recent years as one of the most successful methods for the multiscale analysis of multidimensional signals. Unlike wavelets, shearlets form a pyramid of well-localized functions defined not only over a range of scales and locations, but also over a range of orientations and with highly anisotropic supports. As a result, shearlets are much more effective than traditional wavelets in handling the geometry of multidimensional data, and this was exploited in a wide range of applications from image and signal processing. However, despite their desirable properties, the wider applicability of shearlets is limited by the computational complexity of current software implementations. For example, denoising a single 512 × 512 image using a current implementation of the shearlet-based shrinkage algorithm can take between 10 s and 2 min, depending on the number of CPU cores, and much longer processing times are required for video denoising. On the other hand, due to the parallel nature of the shearlet transform, it is possible to use graphics processing units (GPU) to accelerate its implementation. In this paper, we present an open source stand-alone implementation of the 2D discrete shearlet transform using CUDA C++ as well as GPU-accelerated MATLAB implementations of the 2D and 3D shearlet transforms. We have instrumented the code so that we can analyze the running time of each kernel under different GPU hardware. In addition to denoising, we describe a novel application of shearlets for detecting anomalies in textured images. In this application, computation times can be reduced by a factor of 50 or more, compared to multicore CPU implementations.
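    Shearlet toolboxes vary between platforms, so the following sketch uses an undecimated wavelet transform (the PyWavelets package) as a simpler stand-in to illustrate the general idea of transform-domain anomaly detection in textured images; it is not the paper's shearlet or GPU implementation, and the energy-threshold rule is an assumption.

    ```python
    import numpy as np
    import pywt

    def texture_anomaly_map(image, wavelet="db4", level=2, n_sigma=4.0):
        """Hedged stand-in for a shearlet-based detector: an undecimated
        wavelet transform builds a per-pixel detail-energy map; pixels whose
        energy deviates strongly from the image statistics are flagged as
        texture anomalies.  (Image sides must be divisible by 2**level.)"""
        coeffs = pywt.swt2(image.astype(float), wavelet, level=level)
        energy = np.zeros(image.shape, dtype=float)
        for _approx, (ch, cv, cd) in coeffs:
            energy += ch ** 2 + cv ** 2 + cd ** 2   # directional detail energy
        mu, sigma = energy.mean(), energy.std()
        return energy, energy > mu + n_sigma * sigma
    ```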

  4. Bro Intrusion Detection System

    Energy Science and Technology Software Center (ESTSC)

    2006-01-25

    Bro is a Unix-based Network Intrusion Detection System (IDS). Bro monitors network traffic and detects intrusion attempts based on the traffic characteristics and content. Bro detects intrusions by comparing network traffic against rules describing events that are deemed troublesome. These rules might describe activities (e.g., certain hosts connecting to certain services), which activities are worth alerting on (e.g., attempts to connect to a given number of different hosts constitute a "scan"), or signatures describing known attacks or access to known vulnerabilities. If Bro detects something of interest, it can be instructed to either issue a log entry or initiate the execution of an operating system command. Bro targets high-speed (Gbps), high-volume intrusion detection. By judiciously leveraging packet filtering techniques, Bro is able to achieve the performance necessary to do so while running on commercially available PC hardware, and thus can serve as a cost-effective means of monitoring a site's Internet connection.

  5. Finding Cardinality Heavy-Hitters in Massive Traffic Data and Its Application to Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Ishibashi, Keisuke; Mori, Tatsuya; Kawahara, Ryoichi; Hirokawa, Yutaka; Kobayashi, Atsushi; Yamamoto, Kimihiro; Sakamoto, Hitoaki; Asano, Shoichiro

    introduce an application of our algorithm to anomaly detection. With actual traffic data, our method could successfully detect a sudden network scan.
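    The abstract of this record is truncated, but the title's notion of a cardinality heavy-hitter — a source contacting an unusually large number of distinct destinations, the typical signature of a network scan — can be illustrated with a naive exact baseline. The paper itself concerns efficient streaming/sketch-based estimation, which is not reproduced here; names and the threshold are illustrative.

    ```python
    from collections import defaultdict

    def cardinality_heavy_hitters(flows, threshold=1000):
        """Naive exact baseline (hedged sketch): a source is a 'cardinality
        heavy-hitter' if it contacts at least `threshold` *distinct*
        destinations.  A plain dict of sets is used for illustration only.

        flows : iterable of (src_ip, dst_ip) pairs
        """
        distinct = defaultdict(set)
        for src, dst in flows:
            distinct[src].add(dst)
        return {src: len(dsts) for src, dsts in distinct.items()
                if len(dsts) >= threshold}
    ```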

  6. Radiation detection system

    DOEpatents

    Nelson, Melvin A.; Davies, Terence J.; Morton, III, John R.

    1976-01-01

    A radiation detection system which utilizes the generation of Cerenkov light in and the transmission of that light longitudinally through fiber optic wave guides in order to transmit intelligence relating to the radiation to a remote location. The wave guides are aligned with respect to charged particle radiation so that the Cerenkov light, which is generated at an angle to the radiation, is accepted by the fiber for transmission therethrough. The Cerenkov radiation is detected, recorded, and analyzed at the other end of the fiber.

  7. Evolutionary neural networks for anomaly detection based on the behavior of a program.

    PubMed

    Han, Sang-Jun; Cho, Sung-Bae

    2006-06-01

    The process of learning the behavior of a given program by using machine-learning techniques (based on system-call audit data) is effective for detecting intrusions. Rule learning, neural networks, statistics, and hidden Markov models (HMMs) are representative methods for intrusion detection. Among them, neural networks are known for good performance in learning system-call sequences. In order to apply this knowledge to real-world problems successfully, it is important to determine the structures and weights of these neural networks. However, finding the appropriate structures requires very long time periods because there are no suitable analytical solutions. In this paper, a novel intrusion-detection technique based on evolutionary neural networks (ENNs) is proposed. One advantage of using ENNs is that it takes less time to obtain superior neural networks than when using conventional approaches, because they discover the structures and weights of the neural networks simultaneously. Experimental results with the 1999 Defense Advanced Research Projects Agency (DARPA) Intrusion Detection Evaluation (IDEVAL) data confirm that ENNs are promising tools for intrusion detection. PMID:16761810
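    A toy sketch of the evolutionary idea only (not the paper's method, which also evolves the network structure): a population of weight vectors for a small fixed-structure network is mutated and selected on classification accuracy over system-call-derived feature vectors. All names and hyperparameters are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def mlp_predict(w, X, hidden=8):
        """Tiny fixed-structure MLP; w is a flat weight vector."""
        n_in = X.shape[1]
        W1 = w[:n_in * hidden].reshape(n_in, hidden)
        b1 = w[n_in * hidden:n_in * hidden + hidden]
        W2 = w[n_in * hidden + hidden:-1]
        b2 = w[-1]
        h = np.tanh(X @ W1 + b1)
        return 1 / (1 + np.exp(-(h @ W2 + b2)))       # P(intrusion)

    def evolve_weights(X, y, hidden=8, pop=40, gens=100, sigma=0.1):
        """Hedged sketch of evolutionary training: mutate and select weight
        vectors on accuracy.  X: system-call feature vectors (e.g., n-gram
        counts); y: 0/1 labels (1 = intrusion)."""
        n_w = X.shape[1] * hidden + hidden + hidden + 1
        population = rng.normal(0, 1, size=(pop, n_w))
        for _ in range(gens):
            fitness = np.array([np.mean((mlp_predict(w, X, hidden) > 0.5) == y)
                                for w in population])
            parents = population[np.argsort(fitness)[-pop // 2:]]    # elitism
            children = parents + rng.normal(0, sigma, size=parents.shape)
            population = np.vstack([parents, children])
        final_fitness = [np.mean((mlp_predict(w, X, hidden) > 0.5) == y)
                         for w in population]
        return population[int(np.argmax(final_fitness))]
    ```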

  8. Decision Tree, Bagging and Random Forest methods detect TEC seismo-ionospheric anomalies around the time of the Chile, (Mw = 8.8) earthquake of 27 February 2010

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, Mehdi

    2016-06-01

    In this paper, for the first time, ensemble methods including Decision Tree, Bagging and Random Forest have been proposed in the field of earthquake precursors to detect GPS-TEC (Total Electron Content) seismo-ionospheric anomalies around the time and location of the Chile earthquake of 27 February 2010. All of the implemented ensemble methods detected a striking anomaly in the TEC time series 1 day after the earthquake at 14:00 UTC. The results indicate that the proposed methods, owing to their performance, speed and simplicity, are quite promising and deserve serious attention as new predictive tools for the detection of seismo-ionospheric anomalies.
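    A hedged sketch of how such ensemble regressors can be turned into a TEC anomaly detector: predict each sample from its recent history and flag large residuals. The lag length, threshold and model settings below are assumptions, not values from the paper.

    ```python
    import numpy as np
    from sklearn.ensemble import BaggingRegressor, RandomForestRegressor
    from sklearn.tree import DecisionTreeRegressor

    def detect_tec_anomalies(tec, n_lags=12, k=2.0):
        """Predict each TEC value from the preceding n_lags samples with a
        Decision Tree, Bagging and a Random Forest, and flag times where the
        observed value departs from the prediction by more than k standard
        deviations of the residuals (hedged sketch)."""
        X = np.column_stack([tec[i:len(tec) - n_lags + i] for i in range(n_lags)])
        y = tec[n_lags:]
        models = {
            "tree": DecisionTreeRegressor(max_depth=6),
            "bagging": BaggingRegressor(n_estimators=50),
            "forest": RandomForestRegressor(n_estimators=200),
        }
        flags = {}
        for name, model in models.items():
            model.fit(X, y)
            resid = y - model.predict(X)
            flags[name] = np.abs(resid) > k * resid.std()
        return flags   # boolean masks aligned with tec[n_lags:]
    ```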

  9. Digital speckle pattern interferometry based anomaly detection in breast mimicking phantoms: a pilot study

    NASA Astrophysics Data System (ADS)

    Udayakumar, K.; Sujatha, N.; Ganesan, A. R.

    2015-03-01

    Early screening of subsurface anomalies in the breast can improve the patient survival rate. Clinically approved breast screening modalities may involve ionizing radiation, cause discomfort, require body contact, or incur increased cost. In this paper, a non-invasive, whole-field Digital Speckle Pattern Interferometry (DSPI) technique is used to study normal and abnormal breast-mimicking tissue phantoms. While uniform fringes were obtained for a normal phantom in the out-of-plane speckle pattern interferometry configuration, the non-uniformity in the observed fringes clearly showed the anomaly location in the abnormal phantom. The results are compared with deformation profiles from finite element analysis of the sample under similar loading conditions.

  10. Ultrasonic Leak Detection System

    NASA Technical Reports Server (NTRS)

    Youngquist, Robert C. (Inventor); Moerk, J. Steven (Inventor)

    1998-01-01

    A system for detecting ultrasonic vibrations, such as those generated by a small leak in a pressurized container, vessel, pipe, or the like, comprises an ultrasonic transducer assembly and a processing circuit for converting transducer signals into an audio frequency range signal. The audio frequency range signal can be used to drive a pair of headphones worn by an operator. A diode rectifier based mixing circuit provides a simple, inexpensive way to mix the transducer signal with a square wave signal generated by an oscillator, and thereby generate the audio frequency signal. The sensitivity of the system is greatly increased through proper selection and matching of the system components, and the use of noise rejection filters and elements. In addition, a parabolic collecting horn is preferably employed which is mounted on the transducer assembly housing. The collecting horn increases sensitivity of the system by amplifying the received signals, and provides directionality which facilitates easier location of an ultrasonic vibration source.
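    The mixing principle described in the patent — translating a roughly 40 kHz leak signature down to the audio band by multiplying with a local oscillator and low-pass filtering — can be sketched digitally as follows; the oscillator frequency and filter settings are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.signal import butter, lfilter, square

    def heterodyne_to_audio(x, fs, f_lo=38_000.0, f_cut=5_000.0):
        """Hedged sketch of the mixing principle: an ultrasonic signal x
        (sampled at fs) is multiplied by a local-oscillator square wave and
        low-pass filtered, leaving an audible difference frequency suitable
        for driving headphones."""
        t = np.arange(len(x)) / fs
        mixed = x * square(2 * np.pi * f_lo * t)       # diode-mixer stand-in
        b, a = butter(4, f_cut / (fs / 2))             # keep only the audio band
        return lfilter(b, a, mixed)
    ```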

  11. Water system virus detection

    NASA Technical Reports Server (NTRS)

    Fraser, A. S.; Wells, A. F.; Tenoso, H. J. (Inventor)

    1978-01-01

    The performance of a waste water reclamation system is monitored by introducing a non-pathogenic marker virus, bacteriophage F2, into the waste-water prior to treatment and, thereafter, testing the reclaimed water for the presence of the marker virus. A test sample is first concentrated by absorbing any marker virus onto a cellulose acetate filter in the presence of a trivalent cation at low pH and then flushing the filter with a limited quantity of a glycine buffer solution to desorb any marker virus present on the filter. Photo-optical detection of indirect passive immune agglutination by polystyrene beads indicates the performance of the water reclamation system in removing the marker virus. A closed system provides for concentrating any marker virus, initiating and monitoring the passive immune agglutination reaction, and then flushing the system to prepare for another sample.

  12. Early India-Australia Spreading History Revealed by Newly Detected Magnetic Anomalies

    NASA Astrophysics Data System (ADS)

    Williams, S.; Whittaker, J. M.; Granot, R.; Müller, D.

    2013-12-01

    The seafloor within the Perth Abyssal Plain (PAP), offshore Western Australia, is the only section of crust that directly records the early spreading history between India and Australia during the Mesozoic breakup of Gondwana. However, this early spreading has been poorly constrained due to an absence of data, including marine magnetic anomalies and data constraining the crustal nature of key tectonic features. Here, we present new magnetic anomaly data from the PAP that shows that the crust in the western part of the basin was part of the Indian Plate - the conjugate flank to the oceanic crust immediately offshore the Perth margin, Australia. We identify a sequence of M2 and older anomalies in the west PAP within crust that initially moved with the Indian Plate, formed at intermediate half-spreading rates (35 mm/yr) consistent with the conjugate sequence on the Australian Plate. More speculatively, we reinterpret the youngest anomalies in the east PAP, finding that the M0-age crust initially formed on the Indian Plate was transferred to the Australian Plate by a westward jump or propagation of the spreading ridge shortly after M0 time. Samples dredged from the Gulden Draak and Batavia Knolls (at the western edge of the PAP) reveal that these bathymetric features are continental fragments rather than igneous plateaus related to Broken Ridge. These microcontinents rifted away from Australia with Greater India during initial breakup at ~130 Ma, then rifted from India following the cessation of spreading in the PAP (~101-103 Ma).

  13. Anomalies in the detection of change: When changes in sample size are mistaken for changes in proportions.

    PubMed

    Fiedler, Klaus; Kareev, Yaakov; Avrahami, Judith; Beier, Susanne; Kutzner, Florian; Hütter, Mandy

    2016-01-01

    Detecting changes, in performance, sales, markets, risks, social relations, or public opinions, constitutes an important adaptive function. In a sequential paradigm devised to investigate detection of change, every trial provides a sample of binary outcomes (e.g., correct vs. incorrect student responses). Participants have to decide whether the proportion of a focal feature (e.g., correct responses) in the population from which the sample is drawn has decreased, remained constant, or increased. Strong and persistent anomalies in change detection arise when changes in proportional quantities vary orthogonally to changes in absolute sample size. Proportional increases are readily detected and nonchanges are erroneously perceived as increases when absolute sample size increases. Conversely, decreasing sample size facilitates the correct detection of proportional decreases and the erroneous perception of nonchanges as decreases. These anomalies are however confined to experienced samples of elementary raw events from which proportions have to be inferred inductively. They disappear when sample proportions are described as percentages in a normalized probability format. To explain these challenging findings, it is essential to understand the inductive-learning constraints imposed on decisions from experience. PMID:26179055

  14. Gas Flow Detection System

    NASA Technical Reports Server (NTRS)

    Moss, Thomas; Ihlefeld, Curtis; Slack, Barry

    2010-01-01

    This system provides a portable means to detect gas flow through a thin-walled tube without breaking into the tubing system. The flow detection system was specifically designed to detect flow through two parallel branches of a manifold with only one inlet and outlet, and is a means for verifying a space shuttle program requirement that saves time and reduces the risk of flight hardware damage compared to the current means of requirement verification. The prototype Purge Vent and Drain Window Cavity Conditioning System (PVD WCCS) Flow Detection System consists of a heater and a temperature-sensing thermistor attached to a piece of Velcro to be attached to each branch of a WCCS manifold for the duration of the requirement verification test. The heaters and thermistors are connected to a shielded cable and then to an electronics enclosure, which contains the power supplies, relays, and circuit board to provide power, signal conditioning, and control. The electronics enclosure is then connected to a commercial data acquisition box to provide analog to digital conversion as well as digital control. This data acquisition box is then connected to a commercial laptop running a custom application created using National Instruments LabVIEW. The operation of the PVD WCCS Flow Detection System consists of first attaching a heater/thermistor assembly to each of the two branches of one manifold while there is no flow through the manifold. Next, the software application running on the laptop is used to turn on the heaters and to monitor the manifold branch temperatures. When the system has reached thermal equilibrium, the software application's graphical user interface (GUI) will indicate that the branch temperatures are stable. The operator can then physically open the flow control valve to initiate the test flow of gaseous nitrogen (GN2) through the manifold. Next, the software user interface will be monitored for stable temperature indications when the system is again at

  15. Cyber-Critical Infrastructure Protection Using Real-Time Payload-Based Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Düssel, Patrick; Gehl, Christian; Laskov, Pavel; Bußer, Jens-Uwe; Störmann, Christof; Kästner, Jan

    With an increasing demand for inter-connectivity and protocol standardization, modern cyber-critical infrastructures are exposed to a multitude of serious threats that may give rise to severe damage to life and assets without the implementation of proper safeguards. Thus, we propose a method that is capable of reliably detecting unknown, exploit-based attacks on cyber-critical infrastructures carried out over the network. We illustrate the effectiveness of the proposed method by conducting experiments on network traffic that can be found in modern industrial control systems. Moreover, we provide results of a throughput measurement which demonstrate the real-time capabilities of our system.
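    The detector operates on network payloads. As a hedged sketch of the general payload-based approach (the exact feature map and learning machine used by the authors may differ), byte n-grams plus a one-class model trained on normal traffic only look like this:

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.svm import OneClassSVM

    def train_payload_detector(normal_payloads, n=3):
        """Hedged sketch of payload-based anomaly detection: payloads are
        mapped to byte n-gram count vectors and a one-class model is fitted
        to normal traffic; unseen payloads scoring as outliers are flagged."""
        vec = CountVectorizer(analyzer="char", ngram_range=(n, n))
        X = vec.fit_transform(p.decode("latin-1") for p in normal_payloads)
        clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X)
        return vec, clf

    def is_anomalous(vec, clf, payload):
        """Return True if a single payload (bytes) is flagged as an outlier."""
        X = vec.transform([payload.decode("latin-1")])
        return clf.predict(X)[0] == -1                # -1 => outlier
    ```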

  16. Gasometric anomalies in bottom sediments of the Barents Sea as instrument of Modern Petroleum System study

    NASA Astrophysics Data System (ADS)

    Fokina, A.; Akhmanov, G.; Andreassen, K.; Yurchenko, A.

    2014-12-01

    In the southern Barents Sea, no gas anomalies were detected: gas concentrations are low and the gas is of biogenic origin. Geochemical surveys within the North-Kildinsk field and the Fedynskii high were unsuccessful. The petroleum system practically does not manifest itself in the surface geochemical field due to the low permeability of the dense clay silts.

  17. Arc fault detection system

    DOEpatents

    Jha, K.N.

    1999-05-18

    An arc fault detection system for use on ungrounded or high-resistance-grounded power distribution systems is provided which can be retrofitted outside electrical switchboard circuits having limited space constraints. The system includes a differential current relay that senses a current differential between current flowing from secondary windings located in a current transformer coupled to a power supply side of a switchboard, and a total current induced in secondary windings coupled to a load side of the switchboard. When such a current differential is experienced, a current travels through an operating coil of the differential current relay, which in turn opens an upstream circuit breaker located between the switchboard and a power supply to remove the supply of power to the switchboard. 1 fig.

  18. Arc fault detection system

    DOEpatents

    Jha, Kamal N.

    1999-01-01

    An arc fault detection system for use on ungrounded or high-resistance-grounded power distribution systems is provided which can be retrofitted outside electrical switchboard circuits having limited space constraints. The system includes a differential current relay that senses a current differential between current flowing from secondary windings located in a current transformer coupled to a power supply side of a switchboard, and a total current induced in secondary windings coupled to a load side of the switchboard. When such a current differential is experienced, a current travels through an operating coil of the differential current relay, which in turn opens an upstream circuit breaker located between the switchboard and a power supply to remove the supply of power to the switchboard.

  19. Detection of anomalies in radio tomography of asteroids: Source count and forward errors

    NASA Astrophysics Data System (ADS)

    Pursiainen, S.; Kaasalainen, M.

    2014-09-01

    The purpose of this study was to advance numerical methods for radio tomography, in which an asteroid's internal electric permittivity distribution is to be recovered from radio frequency data gathered by an orbiter. The focus was on signal generation via multiple sources (transponders), providing one potential, or even essential, scenario to be implemented in a challenging in situ measurement environment and within tight payload limits. As a novel feature, the effects of forward errors, including noise and a priori uncertainty of the forward (data) simulation, were examined through a combination of the iterative alternating sequential (IAS) inverse algorithm and finite-difference time-domain (FDTD) simulation of time evolution data. Single and multiple source scenarios were compared in two-dimensional localization of permittivity anomalies. Three different anomaly strengths and four levels of total noise were tested. Results suggest, among other things, that multiple sources can be necessary to obtain appropriate results, for example, to distinguish three separate anomalies with permittivity less than or equal to half of the background value, which is relevant to the recovery of internal cavities.

  20. Mesoscale convective system surface pressure anomalies responsible for meteotsunamis along the U.S. East Coast on June 13th, 2013.

    PubMed

    Wertman, Christina A; Yablonsky, Richard M; Shen, Yang; Merrill, John; Kincaid, Christopher R; Pockalny, Robert A

    2014-01-01

    Two destructive high-frequency sea level oscillation events occurred on June 13th, 2013 along the U.S. East Coast. Seafloor processes can be dismissed as the sources, as no concurrent offshore earthquakes or landslides were detected. Here, we present evidence that these tsunami-like events were generated by atmospheric mesoscale convective systems (MCSs) propagating from inland to offshore. The USArray Transportable Array inland and NOAA tide gauges along the coast recorded the pressure anomalies associated with the MCSs. Once offshore, the pressure anomalies generated shallow water waves, which were amplified by the resonance between the water column and atmospheric forcing. Analysis of the tidal data reveals that these waves reflected off the continental shelf break and reached the coast, where bathymetry and coastal geometry contributed to their hazard potential. This study demonstrates that monitoring MCS pressure anomalies in the interior of the U.S. provides important observations for early warnings of MCS-generated tsunamis. PMID:25420958

  1. Mesoscale convective system surface pressure anomalies responsible for meteotsunamis along the U.S. East Coast on June 13th, 2013

    PubMed Central

    Wertman, Christina A.; Yablonsky, Richard M.; Shen, Yang; Merrill, John; Kincaid, Christopher R.; Pockalny, Robert A.

    2014-01-01

    Two destructive high-frequency sea level oscillation events occurred on June 13th, 2013 along the U.S. East Coast. Seafloor processes can be dismissed as the sources, as no concurrent offshore earthquakes or landslides were detected. Here, we present evidence that these tsunami-like events were generated by atmospheric mesoscale convective systems (MCSs) propagating from inland to offshore. The USArray Transportable Array inland and NOAA tide gauges along the coast recorded the pressure anomalies associated with the MCSs. Once offshore, the pressure anomalies generated shallow water waves, which were amplified by the resonance between the water column and atmospheric forcing. Analysis of the tidal data reveals that these waves reflected off the continental shelf break and reached the coast, where bathymetry and coastal geometry contributed to their hazard potential. This study demonstrates that monitoring MCS pressure anomalies in the interior of the U.S. provides important observations for early warnings of MCS-generated tsunamis. PMID:25420958

  2. Millimeter Wave Detection of Localized Anomalies in the Space Shuttle External Fuel Tank Insulating Foam and Acreage Heat Tiles

    NASA Technical Reports Server (NTRS)

    Kharkovsky, S.; Case, J. T.; Zoughi, R.; Hepburn, F.

    2005-01-01

    The Space Shuttle Columbia's catastrophic accident emphasizes the growing need for developing and applying effective, robust and life-cycle oriented nondestructive testing (NDT) methods for inspecting the shuttle external fuel tank spray on foam insulation (SOFI) and its protective acreage heat tiles. Millimeter wave NDT techniques were one of the methods chosen for evaluating their potential for inspecting these structures. Several panels with embedded anomalies (mainly voids) were produced and tested for this purpose. Near-field and far-field millimeter wave NDT methods were used for producing millimeter wave images of the anomalies in SOFI panel and heat tiles. This paper presents the results of an investigation for the purpose of detecting localized anomalies in two SOFI panels and a set of heat tiles. To this end, reflectometers at a relatively wide range of frequencies (Ka-band (26.5 - 40 GHz) to W-band (75 - 110 GHz)) and utilizing different types of radiators were employed. The results clearly illustrate the utility of these methods for this purpose.

  3. Unified Mars detection system. [life detection

    NASA Technical Reports Server (NTRS)

    Martin, J. P.; Kok, B.; Radmer, R.; Johnson, R. D.

    1976-01-01

    A life-detection system is described which is designed to detect and characterize possible Martian biota and to gather information about the chemical environment of Mars, especially the water and amino acid contents of the soil. The system is organized around a central mass spectrometer that can sensitively analyze trace gases from a variety of different experiments. Some biological assays and soil-chemistry tests that have been performed in the laboratory as typical experiment candidates for the system are discussed, including tests for soil-organism metabolism, measurements of soil carbon contents, and determinations of primary aliphatic amines (amino acids and protein) in soils. Two possible test strategies are outlined, and the operational concept of the detection system is illustrated. Detailed descriptions are given for the mass spectrometer, gas inlet, incubation box, test cell modules, seal drive mechanism, soil distribution assembly, and electronic control system.

  4. Threshold anomaly for the 7Be +58Ni system at near-Coulomb-barrier energies

    NASA Astrophysics Data System (ADS)

    Gómez Camacho, A.; Aguilera, E. F.

    2014-12-01

    By using recent fusion cross section measurements for the weakly bound system 7Be+58Ni around the Coulomb barrier, a simultaneous χ2 analysis of elastic scattering and fusion cross section data is performed. The analysis is carried out with optical polarization potentials for the fusion and direct reaction processes. That is, the nuclear polarization potential UN is split into a volume part UF which accounts for fusion reactions and a surface part UDR that is responsible for direct reactions. The parameters of fusion and direct reaction Woods-Saxon polarization potentials are determined by the analysis of the data. The presence of the threshold anomaly is investigated from the energy dependence of these polarization potentials. It is found that, contrary to other weakly bound systems, the 7Be+58Ni reaction presents the usual threshold anomaly.

  5. Detection of Characteristic Precipitation Anomaly Patterns of El Nino / La Nina in Time- variable Gravity Fields by GRACE

    NASA Astrophysics Data System (ADS)

    Heki, K.; Morishita, Y.

    2007-12-01

    GRACE (Gravity Recovery and Climate Experiment) satellites, launched in March 2002, have been mapping monthly gravity fields of the Earth, allowing us to infer changes in surface mass, e.g. water and ice. Past findings include the ice mass loss in southern Greenland (Luthcke et al., 2006) and its acceleration in 2004 (Velicogna and Wahr, 2006), crustal dilatation by the 2004 Sumatra Earthquake (Han et al., 2006) and the postseismic movement of water in the mantle (Ogawa and Heki, 2007). ENSO (El Nino and Southern Oscillation) brings about global climate impacts, together with its opposite phenomenon, La Nina. Ropelewski and Halpert (1987) showed typical precipitation patterns in ENSO years; characteristic regional-scale precipitation anomalies occur in India, tropical and southern Africa and South America. Nearly opposite precipitation anomalies are shown to occur in La Nina years (Ropelewski and Halpert, 1988). Here we report the detection of such precipitation anomaly patterns in the GRACE monthly gravity data 2002 - 2007, which includes both La Nina (2005 fall - 2006 spring) and El Nino (2006 fall - 2007 spring) periods. We modeled the worldwide gravity time series with constant trends and seasonal changes, and extracted deviations of gravity values at two time epochs, i.e. February 2006 and 2007, and converted them into the changes in equivalent surface water mass. East Africa showed negative gravity deviation (-20.5 cm in water) in 2006 February (La Nina), which reversed to positive (18.7 cm) in 2007 February (El Nino). Northern and southern parts of South America also showed similar see-saw patterns. Such patterns closely resemble those found meteorologically (Ropelewski and Halpert, 1987; 1988), suggesting the potential of GRACE as a sensor of inter-annual precipitation anomalies through changes in continental water storage. We performed numerical simulations of soil moisture changes at grid points in land area incorporating the CMAP precipitation data, NCEP
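    The modeling step described above — removing a constant, a trend and seasonal terms from each grid-point time series before examining the deviations — can be sketched with ordinary least squares. The design matrix below (annual plus semi-annual harmonics) is an assumption consistent with the text, not the authors' exact parameterization.

    ```python
    import numpy as np

    def seasonal_trend_residual(t_years, gravity):
        """Fit bias + linear trend + annual and semi-annual sinusoids to one
        grid-point time series and return the residuals, i.e. the inter-annual
        deviations examined for El Nino / La Nina signatures (hedged sketch)."""
        w = 2 * np.pi
        G = np.column_stack([
            np.ones_like(t_years), t_years,
            np.cos(w * t_years), np.sin(w * t_years),            # annual
            np.cos(2 * w * t_years), np.sin(2 * w * t_years),    # semi-annual
        ])
        coeffs, *_ = np.linalg.lstsq(G, gravity, rcond=None)
        return gravity - G @ coeffs
    ```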

  6. Measuring anomaly with algorithmic entropy

    NASA Astrophysics Data System (ADS)

    Solano, Wanda M.

    Anomaly detection refers to the identification of observations that are considered outside of normal. Since anomalies are unknown to the system prior to training and are rare, the anomaly detection problem is particularly challenging. Model-based techniques require large quantities of existing data to build the model. Statistically based techniques rely on statistical metrics or thresholds for determining whether a particular observation is anomalous. I propose a novel approach to anomaly detection using wavelet-based algorithmic entropy that does not require modeling or large amounts of data. My method embodies the concept of information distance, which rests on the fact that data encodes information. This distance is large when little information is shared, and small when there is greater information sharing. I compare my approach with several techniques in the literature using data obtained from testing of NASA's Space Shuttle Main Engines (SSME).
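    Algorithmic (Kolmogorov) complexity is uncomputable, so practical information-distance methods substitute a real compressor. The sketch below uses the normalized compression distance with zlib as a stand-in for the dissertation's wavelet-based algorithmic entropy; it illustrates the information-distance idea only.

    ```python
    import zlib

    def ncd(x: bytes, y: bytes) -> float:
        """Normalized compression distance: a practical proxy for algorithmic
        information distance.  Small when x and y share much information,
        close to 1 when they share little."""
        cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
        cxy = len(zlib.compress(x + y))
        return (cxy - min(cx, cy)) / max(cx, cy)

    def anomaly_scores(reference: bytes, observations):
        """Hedged sketch: observations far (in NCD) from a nominal reference
        record score as anomalous; zlib stands in for the compressor."""
        return [ncd(reference, obs) for obs in observations]
    ```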

  7. A Hybrid Positive-and-Negative Curvature Approach for Detection of the Edges of Magnetic Anomalies, and Its Application in the South China Sea

    NASA Astrophysics Data System (ADS)

    Guo, Lianghui; Gao, Rui; Meng, Xiaohong; Zhang, Guoli

    2015-10-01

    In the work discussed in this paper, the characteristics of both the most positive and most negative curvatures of a magnetic anomaly were analyzed, and a new approach for detection of the edges of magnetic anomalies is proposed. The new approach, called the hybrid positive-and-negative curvature approach, combines the most positive and most negative curvatures into one curvature by formula adjustments and weighted summation, combining the advantages of the two curvatures to improve edge detection. The approach is suitable for vertically magnetized or reduced-to-pole anomalies, which avoids the complexity of magnetic anomalies caused by oblique magnetization. Testing on synthetic vertically magnetized magnetic anomaly data demonstrated that the hybrid approach traces the edges of magnetic source bodies effectively, discriminates between high and low magnetism intuitively, and is better than approaches based solely on use of the most positive or most negative curvature. Testing on reduced-to-pole magnetic anomaly data around the ocean basin of the South China Sea showed that the hybrid approach enables better edge detection than the most positive or most negative curvature alone. On the basis of the features of the reduced-to-pole magnetic anomalies and their hybrid curvature, we suggest that the tectonic boundary between the southwestern subbasin and the eastern subbasin of the South China Sea extends southeastward from the northeastern edge of the Zhongsha Islands to the northeastern edge of the Reed Bank.
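    A hedged sketch of the curvature ingredients: the most positive and most negative curvatures can be computed as the eigenvalues of the 2-D Hessian of the gridded anomaly. The paper's specific formula adjustments and weights are not given in the abstract, so a plain mean of min-max normalized curvatures stands in for the hybrid combination below; all names are illustrative.

    ```python
    import numpy as np

    def hybrid_curvature(T, dx, dy):
        """Most positive / most negative curvatures of a (reduced-to-pole)
        anomaly grid T, combined into a single edge-enhancing map.  The
        combination rule here is a simple stand-in, not the paper's formula."""
        dTdy, dTdx = np.gradient(T, dy, dx)
        d2y, dxy = np.gradient(dTdy, dy, dx)
        dyx, d2x = np.gradient(dTdx, dy, dx)
        trace = d2x + d2y
        root = np.sqrt((d2x - d2y) ** 2 + 4 * dxy ** 2)
        k_pos, k_neg = (trace + root) / 2, (trace - root) / 2   # Hessian eigenvalues

        def norm(a):
            return (a - a.min()) / (a.max() - a.min() + 1e-12)

        return (norm(k_pos) + norm(-k_neg)) / 2
    ```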

  8. Lessons Learned from the Space Shuttle Engine Cutoff System (ECO) Anomalies

    NASA Technical Reports Server (NTRS)

    Martinez, Hugo E.; Welzyn, Ken

    2011-01-01

    The Space Shuttle Orbiter's main engine cutoff (ECO) system first failed ground checkout in April, 2005 during a first tanking test prior to Return-to-Flight. Despite significant troubleshooting and investigative efforts that followed, the root cause could not be found and intermittent anomalies continued to plague the Program. By implementing hardware upgrades, enhancing monitoring capability, and relaxing the launch rules, the Shuttle fleet was allowed to continue flying in spite of these unexplained failures. Root cause was finally determined following the launch attempts of STS-122 in December, 2007 when the anomalies repeated, which allowed drag-on instrumentation to pinpoint the fault (the ET feedthrough connector). The suspect hardware was removed and provided additional evidence towards root cause determination. Corrective action was implemented and the system has performed successfully since then. This white paper presents the lessons learned from the entire experience, beginning with the anomalies since Return-to-Flight through discovery and correction of the problem. To put these lessons in better perspective for the reader, an overview of the ECO system is presented first. Next, a chronological account of the failures and associated investigation activities is discussed. Root cause and corrective action are summarized, followed by the lessons learned.

  9. Neonatal Jaundice Detection System.

    PubMed

    Aydın, Mustafa; Hardalaç, Fırat; Ural, Berkan; Karap, Serhat

    2016-07-01

    Neonatal jaundice is a common condition that occurs in newborn infants in the first week of life. Today, the techniques used for detection require blood samples and other clinical testing with special equipment. The aim of this study is to create a non-invasive system to monitor and detect jaundice periodically and to help doctors with early diagnosis. In this work, first, a patient group consisting of jaundiced babies and a control group consisting of healthy babies were prepared; then, between 24 and 48 h after birth, 40 jaundiced and 40 healthy newborns were chosen. Second, advanced image processing techniques were applied to images taken with a standard smartphone and a color calibration card. Segmentation, pixel similarity and white balancing were used as image processing techniques, and RGB values and the relevant pixel information were obtained. Third, during the feature extraction stage, using colormap transformations and feature calculation, comparisons were made in the RGB plane between color-change values and a specially designed 8-color calibration card. Finally, in the bilirubin level estimation stage, kNN and SVR machine learning regressions were applied to the dataset obtained from feature extraction. At the end of the process, with the control group as the basis for comparison, jaundice was successfully detected in the 40 jaundiced infants with a success rate of 85%. The obtained bilirubin estimates are consistent with the bilirubin results obtained from the standard blood test, with an agreement rate of 85%. PMID:27229489
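    The final regression stage maps calibrated skin-colour features to blood-test bilirubin. A minimal scikit-learn sketch, assuming a feature matrix has already been extracted; the feature definition and hyperparameters are illustrative, not the paper's.

    ```python
    from sklearn.model_selection import cross_val_predict
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.svm import SVR

    def estimate_bilirubin(rgb_features, bilirubin):
        """Hedged sketch of the estimation stage: calibrated colour features
        (e.g., mean RGB differences against the 8-colour card) are regressed
        onto blood-test bilirubin with kNN and SVR; feature extraction itself
        is not reproduced here."""
        models = {"knn": KNeighborsRegressor(n_neighbors=5),
                  "svr": SVR(kernel="rbf", C=10.0)}
        return {name: cross_val_predict(model, rgb_features, bilirubin, cv=5)
                for name, model in models.items()}
    ```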

  10. Mesosiderite clasts with the most extreme positive europium anomalies among solar system rocks

    NASA Technical Reports Server (NTRS)

    Mittlefehldt, David W.; Rubin, Alan E.; Davis, Andrew M.

    1992-01-01

    Pigeonite-plagioclase gabbros that occur as clasts in mesosiderites (brecciated stony-iron meteorites) show extreme fractionations of the rare-earth elements (REEs) with larger positive europium anomalies than any previously known for igneous rocks from the earth, moon, or meteorite parent bodies and greater depletions of light REEs relative to heavy REEs than known for comparable cumulate gabbros. The REE pattern for merrillite in one of these clasts is depleted in light REEs and has a large positive europium anomaly as a result of metamorphic equilibration with the silicates. The extreme REE ratios exhibited by the mesosiderite clasts demonstrate that multistage igneous processes must have occurred on some asteroids in the early solar system. Melting of the crust by large-scale impacts or electrical induction from an early T-Tauri-phase sun may be responsible for these processes.

  11. Neonatal head ultrasound: systematic approach to congenital Central Nervous System anomalies. A pictorial essay.

    PubMed

    Yoon, Hye-Kyung; Cho, Seong Whi

    2016-09-01

    Brain ultrasound is widely used for the screening of prematurely born babies. Although the best imaging modality for central nervous system anomalies is brain MRI, the first imaging study in the post-natal period is brain ultrasonography in most cases. Anomalies may be found incidentally on screening ultrasound, or in cases already suspected on prenatal ultrasound. In order not to miss congenital structural abnormalities of the brain on screening ultrasound, systematic approaches are very helpful. The ventricles and sylvian fissures are key structures for suspecting central nervous system anomalies: they are symmetric structures, so we should look for any asymmetry or maldevelopment. Then, on sagittal images, the midline structures, including the corpus callosum and cerebellar vermis, should be observed carefully. Finally, we should look for any abnormality in gyration or cortical development. A skull defect with herniation of intracranial contents, the spectrum of encephalo-meningocele, can also be identified on ultrasound. Congenital infections such as cytomegalovirus infection may show ventriculomegaly and malformation of cortical development on imaging studies. PMID:27622417

  12. Selecting training and test images for optimized anomaly detection algorithms in hyperspectral imagery through robust parameter design

    NASA Astrophysics Data System (ADS)

    Mindrup, Frank M.; Friend, Mark A.; Bauer, Kenneth W.

    2011-06-01

    There are numerous anomaly detection algorithms proposed for hyperspectral imagery. Robust parameter design (RPD) techniques have been applied to some of these algorithms in an attempt to choose robust settings capable of operating consistently across a large variety of image scenes. Typically, training and test sets of hyperspectral images are chosen randomly. Previous research developed a framework for optimizing anomaly detection in HSI by considering specific image characteristics as noise variables within the context of RPD; these characteristics include the Fisher's score, ratio of target pixels and number of clusters. This paper describes a method for selecting hyperspectral image training and test subsets yielding consistent RPD results based on these noise features. These subsets are not necessarily orthogonal, but still provide improvements over random training and test subset assignments by maximizing the volume and average distance between image noise characteristics. Several different mathematical models representing the value of a training and test set based on such measures as the D-optimal score and various distance norms are tested in a simulation experiment.

  13. Hydrocarbon anomaly in soil gas as near-surface expressions of upflows and outflows in geothermal systems

    SciTech Connect

    Ong, H.L.; Higashihara, M.; Klusman, R.W.; Voorhees, K.J.; Pudjianto, R.; Ong, J

    1996-01-24

    A variety of hydrocarbons, C1-C12, have been found in volcanic (fumarolic) gases and in geothermal waters and gases. The hydrocarbons are thought to come from products of pyrolysis of kerogen in sedimentary rocks, or they could be fed into the geothermal system by recharging waters that contain dissolved hydrocarbons or hydrocarbons extracted by the waters from the rocks. In the hot geothermal zone, above 300 °C, many of these hydrocarbons are in their critical state. It is thought that they move upwards due to buoyancy and flux up with the upflowing geothermal fluids in the upflow zones together with the magmatic gases. Faults, fissures, and mini- and micro-fractures are thought to provide the permeability pathways for this upward flux. A sensitive technique (Petrex), utilizing passive integrative adsorption of the hydrocarbons in soil gas onto activated charcoal followed by desorption and analysis of the hydrocarbons by direct-introduction mass spectrometry, allows mapping of the anomalous areas. Surveys for geothermal resources conducted in Japan and in Indonesia show that hydrocarbon anomalies occur over known fields and over areas strongly suspected of geothermal potential. The hydrocarbons found and identified were n-paraffins (C7-C9) and aromatics (C7-C8). Detection of permeable (i.e., active or open) faults, of parts of older faults that have been reactivated (e.g., by younger intersecting faults), and of the areas surrounding these faulted and permeable regions is possible. A mechanism leading to the appearance of the hydrocarbons in the soil gas over upflow zones of the geothermal reservoir is proposed. The paraffins seem to be better pathfinders for the location of upflows than the aromatics. However, the aromatics may, under certain circumstances, give better indications of the direction of the outflow of the geothermal system. It is thought that an upflow zone can be

  14. Anomaly discrimination in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Yu; Paylor, Drew; Chang, Chein-I.

    2014-05-01

    Anomaly detection finds data samples whose signatures are spectrally distinct from their surrounding data samples. Unfortunately, it cannot discriminate the anomalies it detects from one another. In order to accomplish this task, it requires a way of measuring spectral similarity, such as the spectral angle mapper (SAM) or spectral information divergence (SID), to determine whether a detected anomaly is different from another. However, this raises the challenging issue of how to find an appropriate threshold value for this purpose. Interestingly, this issue has not received much attention in the past. This paper investigates the issue of anomaly discrimination, which can differentiate detected anomalies without using any spectral measure. The idea is to make use of unsupervised target detection algorithms, specifically the Automatic Target Generation Process (ATGP) coupled with an anomaly detector, to distinguish detected anomalies. Experimental results show that the proposed methods are indeed very effective in anomaly discrimination.
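    For reference, the two spectral similarity measures mentioned above can be written compactly; these are the standard textbook definitions of SAM and SID, not code from the paper.

    ```python
    import numpy as np

    def sam(x, y):
        """Spectral angle mapper between two spectra (radians)."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        cos = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
        return float(np.arccos(np.clip(cos, -1.0, 1.0)))

    def sid(x, y, eps=1e-12):
        """Spectral information divergence: symmetric relative entropy between
        the spectra viewed as probability distributions."""
        p = np.asarray(x, float) + eps
        q = np.asarray(y, float) + eps
        p, q = p / p.sum(), q / q.sum()
        return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))
    ```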

  15. Intelligent Leak Detection System

    Energy Science and Technology Software Center (ESTSC)

    2014-10-27

    Capability of underground carbon dioxide storage to confine and sustain injected CO2 for a very long time is the main concern for geologic CO2 sequestration. If a leakage from a geological CO2 sequestration site occurs, it is crucial to find the approximate amount and the location of the leak in order to implement proper remediation activity. An overwhelming majority of research and development for storage site monitoring has been concentrated on atmospheric, surface or near-surface monitoring of the sequestered CO2. This study aims to monitor the integrity of CO2 storage at the reservoir level. This work proposes developing in-situ CO2 Monitoring and Verification technology based on the implementation of Permanent Down-hole Gauges (PDG) or “Smart Wells” along with Artificial Intelligence and Data Mining (AI&DM). The technology attempts to identify the characteristics of the CO2 leakage by de-convolving the pressure signals collected from Permanent Down-hole Gauges (PDG). Citronelle field, a saline aquifer reservoir, located in the U.S. was considered for this study. A reservoir simulation model for CO2 sequestration in the Citronelle field was developed and history matched. The presence of the PDGs was considered in the reservoir model at the injection well and an observation well. High frequency pressure data from sensors were collected based on different synthetic CO2 leakage scenarios in the model. Due to the complexity of the pressure signal behaviors, a Machine Learning-based technology was introduced to build an Intelligent Leakage Detection System (ILDS). The ILDS was able to detect leakage characteristics in a short period of time (less than a day), demonstrating the capability of the system in quantifying leakage characteristics subject to complex rate behaviors. The performance of ILDS was examined under different conditions such as multiple well leakages, cap rock leakage, availability of an additional monitoring well, presence of pressure drift

  16. Intelligent Leak Detection System

    SciTech Connect

    Mohaghegh, Shahab D.

    2014-10-27

    Capability of underground carbon dioxide storage to confine and sustain injected CO2 for a very long time is the main concern for geologic CO2 sequestration. If a leakage from a geological CO2 sequestration site occurs, it is crucial to find the approximate amount and the location of the leak in order to implement proper remediation activity. An overwhelming majority of research and development for storage site monitoring has been concentrated on atmospheric, surface or near-surface monitoring of the sequestered CO2. This study aims to monitor the integrity of CO2 storage at the reservoir level. This work proposes developing in-situ CO2 Monitoring and Verification technology based on the implementation of Permanent Down-hole Gauges (PDG) or “Smart Wells” along with Artificial Intelligence and Data Mining (AI&DM). The technology attempts to identify the characteristics of the CO2 leakage by de-convolving the pressure signals collected from Permanent Down-hole Gauges (PDG). Citronelle field, a saline aquifer reservoir, located in the U.S. was considered for this study. A reservoir simulation model for CO2 sequestration in the Citronelle field was developed and history matched. The presence of the PDGs was considered in the reservoir model at the injection well and an observation well. High frequency pressure data from sensors were collected based on different synthetic CO2 leakage scenarios in the model. Due to the complexity of the pressure signal behaviors, a Machine Learning-based technology was introduced to build an Intelligent Leakage Detection System (ILDS). The ILDS was able to detect leakage characteristics in a short period of time (less than a day), demonstrating the capability of the system in quantifying leakage characteristics subject to complex rate behaviors. The performance of ILDS was examined under different conditions such as multiple well leakages, cap rock leakage, availability of an additional monitoring well, presence of pressure drift and noise

  17. First integrals of motion in a gauge covariant framework, Killing-Maxwell system and quantum anomalies

    SciTech Connect

    Visinescu, M.

    2012-10-15

    Hidden symmetries in a covariant Hamiltonian framework are investigated. The special role of the Stackel-Killing and Killing-Yano tensors is pointed out. The covariant phase-space is extended to include external gauge fields and scalar potentials. We investigate the possibility for a higher-order symmetry to survive when the electromagnetic interactions are taken into account. A concrete realization of this possibility is given by the Killing-Maxwell system. The classical conserved quantities do not generally transfer to the quantized systems, producing quantum gravitational anomalies. As a rule, the conformal extension of the Killing vectors and tensors does not produce symmetry operators for the Klein-Gordon operator.

  18. Incipient fire detection system

    DOEpatents

    Brooks, Jr., William K.

    1999-01-01

    A method and apparatus for an incipient fire detection system that receives gaseous samples and measures the light absorption spectrum of the mixture of gases evolving from heated combustibles includes a detector for receiving gaseous samples and subjecting the samples to spectroscopy and determining wavelengths of absorption of the gaseous samples. The wavelengths of absorption of the gaseous samples are compared to predetermined absorption wavelengths. A warning signal is generated whenever the wavelengths of absorption of the gaseous samples correspond to the predetermined absorption wavelengths. The method includes receiving gaseous samples, subjecting the samples to light spectroscopy, determining wavelengths of absorption of the gaseous samples, comparing the wavelengths of absorption of the gaseous samples to predetermined absorption wavelengths and generating a warning signal whenever the wavelengths of absorption of the gaseous samples correspond to the predetermined absorption wavelengths. In an alternate embodiment, the apparatus includes a series of channels fluidically connected to a plurality of remote locations. A pump is connected to the channels for drawing gaseous samples into the channels. A detector is connected to the channels for receiving the drawn gaseous samples and subjecting the samples to spectroscopy. The wavelengths of absorption are determined and compared to predetermined absorption wavelengths. A warning signal is generated whenever the wavelengths correspond.

  19. Algorithms for Spectral Decomposition with Applications to Optical Plume Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Srivastava, Askok N.; Matthews, Bryan; Das, Santanu

    2008-01-01

    The analysis of spectral signals for features that represent physical phenomenon is ubiquitous in the science and engineering communities. There are two main approaches that can be taken to extract relevant features from these high-dimensional data streams. The first set of approaches relies on extracting features using a physics-based paradigm where the underlying physical mechanism that generates the spectra is used to infer the most important features in the data stream. We focus on a complementary methodology that uses a data-driven technique that is informed by the underlying physics but also has the ability to adapt to unmodeled system attributes and dynamics. We discuss the following four algorithms: Spectral Decomposition Algorithm (SDA), Non-Negative Matrix Factorization (NMF), Independent Component Analysis (ICA) and Principal Components Analysis (PCA) and compare their performance on a spectral emulator which we use to generate artificial data with known statistical properties. This spectral emulator mimics the real-world phenomena arising from the plume of the space shuttle main engine and can be used to validate the results that arise from various spectral decomposition algorithms and is very useful for situations where real-world systems have very low probabilities of fault or failure. Our results indicate that methods like SDA and NMF provide a straightforward way of incorporating prior physical knowledge while NMF with a tuning mechanism can give superior performance on some tests. We demonstrate these algorithms to detect potential system-health issues on data from a spectral emulator with tunable health parameters.
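    A hedged sketch of how the compared decompositions can be applied to a matrix of spectra with scikit-learn; the spectral emulator, component count and any downstream anomaly scoring are not reproduced, and the names are illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF, PCA, FastICA

    def decompose_spectra(spectra, n_components=5):
        """Apply NMF, PCA and ICA to a matrix of emission spectra
        (rows = time samples, columns = wavelength bins); each returns
        per-sample weights whose drift over time can be screened for plume
        anomalies (hedged sketch)."""
        spectra = np.asarray(spectra, float)
        nmf = NMF(n_components=n_components, init="nndsvda", max_iter=500)
        w_nmf = nmf.fit_transform(np.clip(spectra, 0, None))   # NMF needs >= 0
        w_pca = PCA(n_components=n_components).fit_transform(spectra)
        w_ica = FastICA(n_components=n_components).fit_transform(spectra)
        return {"nmf": w_nmf, "pca": w_pca, "ica": w_ica}
    ```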

  20. Principle of indirect comparison (PIC): simulation and analysis of PIC-based anomaly detection in multispectral data

    NASA Astrophysics Data System (ADS)

    Rosario, Dalton

    2006-05-01

    The Army has gained a renewed interest in hyperspectral (HS) imagery for military surveillance. As a result, a HS research team has been established at the Army Research Lab (ARL) to focus exclusively on the design of innovative algorithms for target detection in natural clutter. In 2005 at this symposium, we presented comparison performances between a proposed anomaly detector and existing ones testing real HS data. Herein, we present some insightful results on our general approach using analyses of statistical performances of an additional ARL anomaly detector testing 1500 simulated realizations of model-specific data to shed some light on its effectiveness. Simulated data of increasing background complexity will be used for the analysis, where highly correlated multivariate Gaussian random samples will model homogeneous backgrounds and mixtures of Gaussian will model non-homogeneous backgrounds. Distinct multivariate random samples will model targets, and targets will be added to backgrounds. The principle that led to the design of our detectors employs an indirect sample comparison to test the likelihood that local HS random samples belong to the same population. Let X and Y denote two random samples, and let Z = X U Y, where U denotes the union. We showed that X can be indirectly compared to Y by comparing, instead, Z to Y (or to X). Mathematical implementations of this simple idea have shown a remarkable ability to preserve performance of meaningful detections (e.g., full-pixel targets), while significantly reducing the number of meaningless detections (e.g., transitions of background regions in the scene).
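    The indirect-comparison principle can be sketched directly from the definition Z = X ∪ Y: instead of testing X against Y, test Z against Y with any two-sample statistic. The Mahalanobis-type mean-shift statistic below is an illustrative stand-in, not the ARL detectors' actual likelihood test.

    ```python
    import numpy as np

    def mean_shift_stat(a, b):
        """Simple two-sample statistic: Mahalanobis distance between the sample
        means under a pooled covariance (a stand-in for any likelihood test)."""
        cov = np.cov(np.vstack([a, b]).T) + 1e-6 * np.eye(a.shape[1])
        d = a.mean(axis=0) - b.mean(axis=0)
        return float(d @ np.linalg.solve(cov, d))

    def pic_score(X, Y):
        """Hedged sketch of the Principle of Indirect Comparison: form
        Z = X u Y and compare Z with Y; a large score suggests X and Y do not
        come from the same population (e.g., a full-pixel target inside
        background).  X, Y are (n_pixels, n_bands) arrays."""
        Z = np.vstack([X, Y])
        return mean_shift_stat(Z, Y)
    ```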

  1. Magnetic and gravity anomalies of the slow-spreading system in the Gulf of Aden

    NASA Astrophysics Data System (ADS)

    Nakanishi, M.; Fujimoto, H.; Tamaki, K.; Okino, K.

    2002-12-01

    The spreading system in the Gulf of Aden between Somalia, NE Africa, and Arabia has an ENE-WSW trend and its half spreading rate is about 1.0 cm/yr (e.g., Jestin et al., 1994). Previous studies (e.g., Tamsett and Searle, 1988) provided the general morphology of the spreading system. To reveal detailed morphology and tectonics of the spreading system in the Gulf of Aden, a geophysical investigation was conducted along the spreading system between 45°30′E and 50°20′E by the R/V Hakuho-maru from December 2000 to January 2001. Bathymetric data were collected using an echo sounder SEA BEAM 2120 aboard R/V Hakuho-maru. Magnetic and gravity data were collected by towed proton magnetometer and shipboard gravimeter, respectively. The strike of the spreading centers east of 46°30′E is N65°W. The topographic expression of the spreading centers east of 46°30′E is an axial rift valley offset by transform faults, similar to that observed at slow spreading centers in other areas. The bathymetric feature of the spreading centers between 45°50′E and 46°30′E, with a strike of N80°E, is N65°W-trending en-echelon basins. The spreading center west of 45°50′E, with an E-W strike, is bounded by linear ridges and its bathymetric expression is N65°W-trending en-echelon ridges. The axial rift valley west of 46°30′E is not offset by any prominent transform faults. A negative magnetic anomaly is dominant over the axial valleys. Its amplitude is about 500 nT and the wavelength is about 30 km. A prominent linear negative magnetic anomaly, of more than 1000 nT, exists west of 46°30′E. The strike of the linear magnetic anomaly correlates with that of the axial valleys west of 46°30′E. The mantle Bouguer gravity anomaly of the spreading centers increases eastward. This trend correlates with the eastward deepening of the spreading centers.

  2. Elastic anomalies and phonon damping in a metallic high spin-low spin system

    NASA Astrophysics Data System (ADS)

    Ihlemann, J.; Bärner, K.

    1984-12-01

    The elastic constants and the sound attenuation in single crystals of the metallic high spin (hs)-low spin (ls) system MnAs1-xPx have been measured for temperatures between 10 and 500 K. Elastic anomalies and damping maxima have been found for the second-order displacive (B81 ⇌ B31) phase transition, the hs-ls transition and the magnetic order-disorder transition. The phenomena near the hs-ls transition, in particular, are interpreted in terms of the condensation of a soft static phonon at the ls (hs) site in a hs (ls) matrix.

  3. Detection of oxygen isotopic anomaly in terrestrial atmospheric carbonates and its implications to Mars.

    PubMed

    Shaheen, R; Abramian, A; Horn, J; Dominguez, G; Sullivan, R; Thiemens, Mark H

    2010-11-23

    The debate over life on Mars centers on the source of the globular, micrometer-sized mineral carbonates in the ALH84001 meteorite; consequently, the identification of Martian processes that form carbonates is critical. This paper reports a previously undescribed carbonate formation process that occurs on Earth and, likely, on Mars. We identified micrometer-sized carbonates in terrestrial aerosols that possess excess (17)O (0.4-3.9‰). The unique O-isotopic composition mechanistically describes the atmospheric heterogeneous chemical reaction on aerosol surfaces. Concomitant laboratory experiments define the transfer of ozone isotopic anomaly to carbonates via hydrogen peroxide formation when O(3) reacts with surface adsorbed water. This previously unidentified chemical reaction scenario provides an explanation for production of the isotopically anomalous carbonates found in the SNC (shergottites, nakhlaites, chassignites) Martian meteorites and terrestrial atmospheric carbonates. The anomalous hydrogen peroxide formed on the aerosol surfaces may transfer its O-isotopic signature to the water reservoir, thus producing mass independently fractionated secondary mineral evaporites. The formation of peroxide via heterogeneous chemistry on aerosol surfaces also reveals a previously undescribed oxidative process of utility in understanding ozone and oxygen chemistry, both on Mars and Earth. PMID:21059939

  4. Detection of oxygen isotopic anomaly in terrestrial atmospheric carbonates and its implications to Mars

    PubMed Central

    Shaheen, R.; Abramian, A.; Horn, J.; Dominguez, G.; Sullivan, R.; Thiemens, Mark H.

    2010-01-01

    The debate over life on Mars centers on the source of the globular, micrometer-sized mineral carbonates in the ALH84001 meteorite; consequently, the identification of Martian processes that form carbonates is critical. This paper reports a previously undescribed carbonate formation process that occurs on Earth and, likely, on Mars. We identified micrometer-sized carbonates in terrestrial aerosols that possess excess 17O (0.4–3.9‰). The unique O-isotopic composition mechanistically describes the atmospheric heterogeneous chemical reaction on aerosol surfaces. Concomitant laboratory experiments define the transfer of ozone isotopic anomaly to carbonates via hydrogen peroxide formation when O3 reacts with surface adsorbed water. This previously unidentified chemical reaction scenario provides an explanation for production of the isotopically anomalous carbonates found in the SNC (shergottites, nakhlaites, chassignites) Martian meteorites and terrestrial atmospheric carbonates. The anomalous hydrogen peroxide formed on the aerosol surfaces may transfer its O-isotopic signature to the water reservoir, thus producing mass independently fractionated secondary mineral evaporites. The formation of peroxide via heterogeneous chemistry on aerosol surfaces also reveals a previously undescribed oxidative process of utility in understanding ozone and oxygen chemistry, both on Mars and Earth. PMID:21059939

  5. An anomaly detector applied to a materials control and accounting system

    SciTech Connect

    Whiteson, R.; Kelso, F.; Baumgart, C.; Tunnell, T.W.

    1994-08-01

    Large amounts of safeguards data are automatically gathered and stored by monitoring instruments used in nuclear chemical processing plants, nuclear material storage facilities, and nuclear fuel fabrication facilities. An integrated safeguards approach requires the ability to identify anomalous activities or states in these data. Anomalies in the data could be indications of error, theft, or diversion of material. The large volume of the data makes analysis and evaluation by human experts very tedious, and the complex and diverse nature of the data makes these tasks difficult to automate. This paper describes the early work in the development of analysis tools to automate the anomaly detection process. Using data from accounting databases, the authors are modeling the normal behavior of processes. From these models they hope to be able to identify activities or data that deviate from that norm. Such tools would be used to reveal trends, identify errors, and recognize unusual data. Thus the expert's attention can be focused directly on significant phenomena.
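
    As a minimal illustration of the idea of modeling normal behavior and flagging deviations, the sketch below applies a robust z-score to a series of hypothetical inventory differences; the quantity being monitored, the threshold, and the statistic are assumptions for illustration, not the Los Alamos tool.

    ```python
    import numpy as np

    def flag_anomalous_balances(inventory_differences, threshold=3.5):
        """Flag materials-balance values that deviate from learned normal behavior.

        'Normal' here is simply the robust centre and spread of historical
        inventory differences; a real safeguards system models process
        behavior in far more detail.
        """
        x = np.asarray(inventory_differences, dtype=float)
        median = np.median(x)
        mad = np.median(np.abs(x - median)) or 1e-12   # median absolute deviation
        robust_z = 0.6745 * (x - median) / mad         # ~N(0, 1) for normal data
        return np.abs(robust_z) > threshold

    # Hypothetical daily inventory differences (kg), with one suspicious value
    history = [0.10, -0.20, 0.05, 0.00, -0.10, 0.15, 2.70, -0.05]
    print(flag_anomalous_balances(history))            # only the 2.70 entry is True
    ```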

  6. Characterization and reduction of stochastic and periodic anomalies in a hyperspectral imaging sensor system

    NASA Astrophysics Data System (ADS)

    Shetler, Bruce V.; Kieffer, Hugh H.

    1996-11-01

    HYDICE, the HYperspectral Digital Imagery Collection Experiment, is an airborne hyperspectral imaging sensor operating in a pushbroom mode. HYDICE collects data simultaneously in 210 wavelength bands from 0.4 to 2.5 micrometers using a prism as the dispersing element. While the overall quality of HYDICE data is excellent, certain data stream anomalies have been identified, among which are a periodic offset in DN level related to the operation of the system cryocooler and a quasi-random variation in the spectral alignment between the dispersed image and the focal plane. In this paper we report on an investigation into the above two effects and the development of algorithms and software to correct or minimize their impact in a production data processing system. We find the periodic variation to have unexpected time- and band-dependent characteristics, which argue against correction in post-processing, but to be relatively insensitive to signal level and consequently of low impact on the operation of the system. We investigate spectral jitter through an algorithm which performs a least squares fit to several atmospheric spectral features to characterize both the time-dependent jitter motion and systematic spectral mis-registration. This method is also implemented to correct the anomalies in the production data stream. A comprehensive set of hyperspectral sensor calibration and correction algorithms is also presented.
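
    A hedged sketch of the least-squares shift estimation idea: a reference spectrum containing an atmospheric absorption feature is moved over a grid of candidate misalignments, and the shift minimizing the squared residual is reported. The affine gain/offset and the scanning grid are assumptions of this sketch; the HYDICE processing fits several features simultaneously.

    ```python
    import numpy as np

    def estimate_spectral_shift(measured, reference, max_shift=2.0, step=0.01):
        """Estimate spectral misalignment (in band units) by least squares.

        Scans candidate shifts of a reference spectrum against the measured
        spectrum and returns the shift with the smallest squared residual.
        """
        bands = np.arange(len(reference), dtype=float)
        shifts = np.arange(-max_shift, max_shift + step, step)
        errors = []
        for s in shifts:
            shifted = np.interp(bands, bands + s, reference)   # reference moved by s
            # allow an affine gain/offset between the two spectra
            A = np.column_stack([shifted, np.ones_like(shifted)])
            coef, *_ = np.linalg.lstsq(A, measured, rcond=None)
            errors.append(np.sum((measured - A @ coef) ** 2))
        return shifts[int(np.argmin(errors))]

    # Toy usage with a synthetic absorption feature shifted by 0.3 bands
    bands = np.arange(50, dtype=float)
    reference = 1.0 - 0.5 * np.exp(-0.5 * ((bands - 25.0) / 2.0) ** 2)
    measured = np.interp(bands, bands + 0.3, reference) + 0.002 * np.random.randn(50)
    print(estimate_spectral_shift(measured, reference))        # close to 0.3
    ```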

  7. Hand held explosives detection system

    DOEpatents

    Conrad, Frank J.

    1992-01-01

    The present invention is directed to a sensitive hand-held explosives detection device capable of detecting the presence of extremely low quantities of high explosives molecules, and which is applicable to sampling vapors from personnel, baggage, cargo, etc., as part of an explosives detection system.

  8. On the possibility of detecting large-scale crustal remanent magnetization with Magsat vector magnetic anomaly data

    NASA Technical Reports Server (NTRS)

    Galliher, S. C.; Mayhew, M. A.

    1982-01-01

    Magnetic anomaly component data measured by Magsat is compared with synthetic anomaly component fields arising from an equivalent source dipole array at the earth's surface generated from total field anomaly data alone. It is found that the synthetic components fit the component data regardless of the dipole orientation assigned to the equivalent sources and of the dipole spacing. Tentative conclusions are: (1) over the U.S., vector anomaly fields can be determined to the accuracy of the measurements from the total field anomaly data alone; and (2) the equivalent source technique is not useful for determining the direction of large-scale crustal magnetization.

  9. Antigen detection systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Infectious agents or their constituent parts (antigens or nucleic acids) can be detected in fresh, frozen, or fixed tissue using a variety of direct or indirect assays. The assays can be modified to yield the greatest sensitivity and specificity but in most cases a particular methodology is chosen ...

  10. Antigen detection systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Infectious agents or their constituent parts (antigens or nucleic acids) can be detected in fresh, frozen, or fixed tissues or other specimens, using a variety of direct or indirect assays. The assays can be modified to yield the greatest sensitivity and specificity but in most cases a particular m...

  11. Protein detection system

    DOEpatents

    Fruetel, Julie A.; Fiechtner, Gregory J.; Kliner, Dahv A. V.; McIlroy, Andrew

    2009-05-05

    The present embodiment describes a miniature, microfluidic, absorption-based sensor to detect proteins at sensitivities comparable to LIF but without the need for tagging. This instrument utilizes fiber-based evanescent-field cavity-ringdown spectroscopy, in combination with faceted prism microchannels. The combination of these techniques will increase the effective absorption path length by a factor of 10.sup.3 to 10.sup.4 (to .about.1-m), thereby providing unprecedented sensitivity using direct absorption. The coupling of high-sensitivity absorption with high-performance microfluidic separation will enable real-time sensing of biological agents in aqueous samples (including aerosol collector fluids) and will provide a general method with spectral fingerprint capability for detecting specific bio-agents.

  12. Analyzing the drainage system anomaly of Zagros basins: Implications for active tectonics

    NASA Astrophysics Data System (ADS)

    Bahrami, Shahram

    2013-11-01

    Morphometric analysis of the hierarchical arrangement of drainage networks allows the effects of external controls, especially tectonics, on basin development to be evaluated. In this study, a quantitative method for calculating a stream's hierarchical anomaly number is introduced. Morphometric parameters such as the hierarchical anomaly index (∆a), percent asymmetry factor (PAF), basin shape (Bs), basin length to mean width ratio (Bl/Bmw), stream bifurcation ratio (Rb), bifurcation index (R), drainage density (Dd), drainage frequency (Df) and anticline hinge spacing (Hs) of 15 basins in the Zagros Mountains were examined. Results show that strong correlations exist between the pairs ∆a-PAF (r = 0.844), ∆a-Bs (r = 0.732), ∆a-Bl/Bmw (r = 0.775), ∆a-R (r = 0.517), PAF-Bl/Bmw (r = 0.519), Bs-R (r = 0.659), Bl/Bmw-R (r = 0.703), Hs-∆a (r = -0.708), Hs-PAF (r = -0.529) and Hs-Bs (r = -0.516). Variations in the trend of the anticlines control the shape of the basins: where anticline hinges become closer to each other in the downstream direction, the basin becomes narrower downstream and hence ∆a increases. The more uplifted northeastern anticlines cause the trunk river of the basins to migrate toward the younger anticlines in the southwest, and hence ∆a increases because the trunk river receives many first-order streams. The data reveal that ∆a is higher in elongated synclinal basins. Owing to the decrease in the intensity of deformation from northeast to southwest across the Zagros, the hinge spacing of the anticlines increases southwestward. The data reveal that the variation in the hinge spacing of anticlines strongly controls basin shape and tilting as well as the hierarchical anomaly of the drainage system. Since the elongation and tilting of the basins are associated with variations in the rates of folding, uplift and hinge spacing of the anticlines, it can be concluded that the hierarchical anomaly of drainages in the studied basins is controlled by the intensity of Zagros deformation.

  13. Spatial scanning for anomaly detection in acoustic emission testing of an aerospace structure

    NASA Astrophysics Data System (ADS)

    Hensman, James; Worden, Keith; Eaton, Mark; Pullin, Rhys; Holford, Karen; Evans, Sam

    2011-10-01

    Acoustic emission (AE) monitoring of engineering structures potentially provides a convenient, cost-effective means of performing structural health monitoring. Networks of AE sensors can be easily and unobtrusively installed upon structures, giving the ability to detect and locate damage-related strain releases ('events') in the structure. Use of the technique is not widespread due to the lack of a simple and effective method for detecting abnormal activity levels: the sensitivity of AE sensor networks is such that events unrelated to damage are prevalent in most applications. In this publication, we propose to monitor AE activity in a structure using a spatial scanning statistic, developed and used effectively in the field of epidemiology. The technique is demonstrated on an aerospace structure - an Airbus A320 main landing gear fitting - undergoing fatigue loading, and the method is compared to existing techniques. Despite its simplicity, the scanning statistic proves to be an extremely effective tool in detecting the onset of damage in the structure: it requires little to no user intervention or expertise, is inexpensive to compute and has an easily interpretable output. Furthermore, the generic nature of the method allows the technique to be used in a variety of monitoring scenarios, to detect damage in a wide range of structures.
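
    The sketch below illustrates a Kulldorff-style Poisson scan statistic over circular windows of located AE events, returning the window whose event count is most surprising under a spatially uniform rate. The choice of candidate centres, radii and the bounding-box study area are assumptions of this illustration, not the authors' exact formulation.

    ```python
    import numpy as np

    def spatial_scan_statistic(points, centres, radii):
        """Kulldorff-style Poisson scan statistic over circular windows.

        points  : (n, 2) array of located AE event coordinates.
        centres : iterable of candidate window centres (e.g. sensor positions).
        radii   : iterable of candidate window radii.
        Returns (log likelihood ratio, (centre, radius)) for the window whose
        event count is most surprising under a spatially uniform rate.
        """
        points = np.asarray(points, dtype=float)
        n = len(points)
        span = points.max(axis=0) - points.min(axis=0)
        total_area = float(span[0] * span[1])          # bounding-box study area
        best = (0.0, None)
        for c in centres:
            d = np.linalg.norm(points - np.asarray(c, dtype=float), axis=1)
            for r in radii:
                inside = int(np.sum(d <= r))
                expected = n * min(np.pi * r * r / total_area, 1.0)
                if expected <= 0 or inside <= expected:
                    continue                           # only excesses are of interest
                outside, exp_out = n - inside, n - expected
                llr = inside * np.log(inside / expected)
                if outside > 0 and exp_out > 0:
                    llr += outside * np.log(outside / exp_out)
                if llr > best[0]:
                    best = (float(llr), (tuple(c), float(r)))
        return best
    ```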

  14. Behavioral Anomaly Detection: A Socio-Technical Study of Trustworthiness in Virtual Organizations

    ERIC Educational Resources Information Center

    Ho, Shuyuan Mary

    2009-01-01

    This study examines perceptions of human "trustworthiness" as a key component in countering insider threats. The term "insider threat" refers to situations where a critical member of an organization behaves against the interests of the organization, in an illegal and/or unethical manner. Identifying and detecting how an individual's behavior…

  15. Can residuals of the solar system foreground explain low multipole anomalies of the CMB?

    SciTech Connect

    Hansen, M.; Kim, J.; Frejsel, A.M.; Ramazanov, S.; Naselsky, P.; Zhao, W.; Burigana, C. E-mail: jkim@nbi.dk E-mail: sabir_ra@nbi.dk E-mail: wzhao7@nbi.ku.dk

    2012-10-01

    The low multipole anomalies of the Cosmic Microwave Background have received much attention during the last few years. It is still not ascertained whether these anomalies are indeed primordial or the result of systematics or foregrounds. An example of a foreground, which could generate some non-Gaussian and statistically anisotropic features at low multipole range, is the very symmetric Kuiper Belt in the outer solar system. In this paper, expanding upon the methods presented in [1], we investigate the contributions from the Kuiper Belt objects (KBO) to the WMAP ILC 7 map, whereby we can minimize the contrast in power between even and odd multipoles in the CMB, discussed in [2,3]. We submit our KBO de-correlated CMB signal to several tests, to analyze its validity, and find that incorporation of the KBO emission can decrease the quadrupole-octupole alignment and parity asymmetry problems, provided that the KBO signal has a non-cosmological dipole modulation, associated with the statistical anisotropy of the ILC 7 map. Additionally, we show that the amplitude of the dipole modulation, within a 2σ interval, is in agreement with the corresponding amplitudes, discussed in [4].

  16. Particle detection systems and methods

    DOEpatents

    Morris, Christopher L.; Makela, Mark F.

    2010-05-11

    Techniques, apparatus and systems for detecting particles such as muons and neutrons. In one implementation, a particle detection system employs a plurality of drift cells, which can be for example sealed gas-filled drift tubes, arranged on sides of a volume to be scanned to track incoming and outgoing charged particles, such as cosmic ray-produced muons. The drift cells can include a neutron sensitive medium to enable concurrent counting of neutrons. The system can selectively detect devices or materials, such as iron, lead, gold, uranium, plutonium, and/or tungsten, occupying the volume from multiple scattering of the charged particles passing through the volume and can concurrently detect any unshielded neutron sources occupying the volume from neutrons emitted therefrom. If necessary, the drift cells can be used to also detect gamma rays. The system can be employed to inspect occupied vehicles at border crossings for nuclear threat objects.

  17. Breakup threshold anomaly for the 8B + 58Ni system at near-Coulomb barrier energies

    NASA Astrophysics Data System (ADS)

    Gómez Camacho, A.; Aguilera, E. F.; Gomes, P. R. S.; Lubian, J.

    2011-09-01

    By using recent fusion cross section measurements, a simultaneous analysis of elastic scattering, fusion, and total reaction cross sections is performed for the weakly bound system 8B + 58Ni at energies close to the Coulomb barrier. The analysis is carried out with an optical potential with fusion and direct reaction parts (i.e., the nuclear polarization potential U is split into a volume part UF, which accounts for fusion reactions, and a surface part UDR, responsible for direct reactions). The parameters of the Woods-Saxon potentials are determined by a χ2 analysis of the data. The presence of the threshold anomaly is investigated from the energy dependence of both the fusion and direct reaction parts of the polarization potential.
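
    For orientation, the Woods-Saxon forms conventionally used for a volume (fusion) part and a surface, derivative-type (direct-reaction) part of the polarization potential are written below; this is the standard textbook parametrization assumed here, and the depth, radius and diffuseness values fitted in the paper are not reproduced.

    ```latex
    % Conventional Woods-Saxon volume and surface (derivative) forms assumed
    % for U = U_F + U_DR; V_i, W_i, r_i, a_i are fitted depths, radii and
    % diffuseness parameters, with R_i = r_i (A_p^{1/3} + A_t^{1/3}).
    \begin{align}
      U_F(r)    &= -\frac{V_F + i\,W_F}{1 + \exp\!\left[(r - R_F)/a_F\right]},\\
      U_{DR}(r) &= -(V_{DR} + i\,W_{DR})
                   \left(-4\,a_{DR}\,\frac{d}{dr}
                   \frac{1}{1 + \exp\!\left[(r - R_{DR})/a_{DR}\right]}\right).
    \end{align}
    ```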

  18. Spacecraft System Failures and Anomalies Attributed to the Natural Space Environment

    NASA Technical Reports Server (NTRS)

    Bedingfield, Keith, L.; Leach, Richard D.; Alexander, Margaret B. (Editor)

    1996-01-01

    The natural space environment is characterized by many complex and subtle phenomena hostile to spacecraft. The effects of these phenomena impact spacecraft design, development, and operations. Space systems become increasingly susceptible to the space environment as use of composite materials and smaller, faster electronics increases. This trend makes an understanding of the natural space environment essential to accomplish overall mission objectives, especially in the current climate of better/cheaper/faster. This primer provides a brief overview of the natural space environment - definition, related programmatic issues, and effects on various spacecraft subsystems. The primary focus, however, is to catalog, through representative case histories, spacecraft failures and anomalies attributed to the natural space environment. This primer is one in a series of NASA Reference Publications currently being developed by the Electromagnetics and Aerospace Environments Branch, Systems Analysis and Integration Laboratory, Marshall Space Flight Center (MSFC), National Aeronautics and Space Administration (NASA).

  19. An automated computer misuse detection system for UNICOS

    SciTech Connect

    Jackson, K.A.; Neuman, M.C.; Simmonds, D.D.; Stallings, C.A.; Thompson, J.L.; Christoph, G.G.

    1994-09-27

    An effective method for detecting computer misuse is the automatic monitoring and analysis of on-line user activity. This activity is reflected in the system audit record, in the system vulnerability posture, and in other evidence found through active testing of the system. During the last several years we have implemented an automatic misuse detection system at Los Alamos. This is the Network Anomaly Detection and Intrusion Reporter (NADIR). We are currently expanding NADIR to include processing of the Cray UNICOS operating system. This new component is called the UNICOS Realtime NADIR, or UNICORN. UNICORN summarizes user activity and system configuration in statistical profiles. It compares these profiles to expert rules that define security policy and improper or suspicious behavior. It reports suspicious behavior to security auditors and provides tools to aid in follow-up investigations. The first phase of UNICORN development is nearing completion, and will be operational in late 1994.

  20. Genetic algorithm for TEC seismo-ionospheric anomalies detection around the time of the Solomon (Mw = 8.0) earthquake of 06 February 2013

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-08-01

    On 6 February 2013, at 12:12:27 local time (01:12:27 UTC) a seismic event registering Mw 8.0 struck the Solomon Islands, located at the boundaries of the Australian and Pacific tectonic plates. Time series prediction is an important and widely interesting topic in the research of earthquake precursors. This paper describes a new computational intelligence approach to detect the unusual variations of the total electron content (TEC) seismo-ionospheric anomalies induced by the powerful Solomon earthquake using a genetic algorithm (GA). The GA detected a considerable number of anomalous occurrences on the earthquake day and also 7 and 8 days prior to the earthquake in a period of high geomagnetic activity. In this study, the TEC anomalies detected using the proposed method are also compared to the TEC anomalies observed by applying the mean, median, wavelet, Kalman filter, ARIMA, neural network and support vector machine methods. The agreement among the final results of all eight methods is a convincing indication of the efficiency of the GA method. It indicates that GA can be an appropriate non-parametric tool for anomaly detection in nonlinear time series showing seismo-ionospheric precursor variations.
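
    As a point of reference for the comparison methods listed above, the sketch below flags TEC values falling outside median ± k·IQR bounds computed over a window of days; the window, the factor k and the sample values are illustrative assumptions, and this is not the genetic-algorithm detector itself.

    ```python
    import numpy as np

    def tec_anomaly_flags(tec_series, k=1.5):
        """Flag TEC values outside median +/- k * IQR over the given window.

        A sketch of the interquartile-bound approach listed among the
        comparison methods above; it is not the genetic-algorithm detector.
        """
        x = np.asarray(tec_series, dtype=float)
        q1, med, q3 = np.percentile(x, [25, 50, 75])
        iqr = q3 - q1
        lower, upper = med - k * iqr, med + k * iqr
        return (x < lower) | (x > upper), (lower, upper)

    # Hypothetical 15-day TEC window (TECU) with an enhancement on day index 12
    tec = [22, 22.5, 23, 22, 24, 23, 22, 22.5, 23, 22, 24, 23, 35, 23, 22]
    flags, bounds = tec_anomaly_flags(tec)
    print(bounds, np.where(flags)[0])   # only the enhanced day is flagged
    ```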

  1. Thermal neutron detection system

    DOEpatents

    Peurrung, Anthony J.; Stromswold, David C.

    2000-01-01

    According to the present invention, a system for measuring a thermal neutron emission from a neutron source, has a reflector/moderator proximate the neutron source that reflects and moderates neutrons from the neutron source. The reflector/moderator further directs thermal neutrons toward an unmoderated thermal neutron detector.

  2. Heat capacity anomaly in a self-aggregating system: Triblock copolymer 17R4 in water

    NASA Astrophysics Data System (ADS)

    Dumancas, Lorenzo V.; Simpson, David E.; Jacobs, D. T.

    2015-05-01

    The reverse Pluronic triblock copolymer 17R4 is formed from poly(propylene oxide) (PPO) and poly(ethylene oxide) (PEO): PPO14 - PEO24 - PPO14, where the number of monomers in each block is denoted by the subscripts. In water, 17R4 has a micellization line marking the transition from a unimer network to self-aggregated spherical micelles, which lies quite near a cloud point curve above which the system separates into copolymer-rich and copolymer-poor liquid phases. The phase separation has an Ising-like, lower consolute critical point with a well-determined critical temperature and composition. We have measured the heat capacity as a function of temperature using an adiabatic calorimeter for three compositions: (1) the critical composition, where the anomaly at the critical point is analyzed, (2) a composition much less than the critical composition, with a much smaller spike when the cloud point curve is crossed, and (3) a composition near where the micellization line intersects the cloud point curve, which only shows micellization. For the critical composition, the heat capacity anomaly very near the critical point is observed for the first time in a Pluronic/water system and is described well as a second-order phase transition resulting from the copolymer-water interaction. For all compositions, the onset of micellization is clear, but the formation of micelles occurs over a broad range of temperatures and never becomes complete because micelles form differently in each phase above the cloud point curve. The integrated heat capacity gives an enthalpy that is smaller than the standard state enthalpy of micellization given by a van't Hoff plot, a typical result for Pluronic systems.
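
    The closing comparison relies on the standard van't Hoff analysis; the relations below are the textbook forms assumed here, with K the micellization equilibrium constant, R the gas constant, and ΔCp(T) the excess heat capacity integrated over the micellization region.

    ```latex
    % Standard van't Hoff relation and calorimetric enthalpy assumed for the
    % comparison described in the abstract (textbook forms, not taken from
    % the paper itself).
    \begin{align}
      \frac{d\,\ln K}{d\,(1/T)} &= -\frac{\Delta H^{\circ}_{\mathrm{mic}}}{R},\\
      \Delta H_{\mathrm{cal}}   &= \int_{T_1}^{T_2} \Delta C_p(T)\, dT .
    \end{align}
    ```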

  3. Methods and Systems for Characterization of an Anomaly Using Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M. (Inventor)

    2013-01-01

    A method for characterizing an anomaly in a material comprises (a) extracting contrast data; (b) measuring a contrast evolution; (c) filtering the contrast evolution; (d) measuring a peak amplitude of the contrast evolution; (e) determining a diameter and a depth of the anomaly; and (f) repeating the step of determining the diameter and the depth of the anomaly until the change in the estimate of the depth is less than a set value. The step of determining the diameter and the depth of the anomaly comprises estimating the depth using a diameter constant C_D equal to one for the first iteration of determining the diameter and the depth; estimating the diameter; and comparing the estimate of the depth of the anomaly after each iteration to the prior estimate of the depth to calculate the change in the estimate of the depth of the anomaly.
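
    The iteration structure described above can be sketched as follows; the three helper functions are hypothetical placeholders standing in for the flash-thermography inversion formulas, which are not reproduced in the abstract.

    ```python
    def estimate_depth(peak_contrast, c_d):
        """Hypothetical stand-in for the depth inversion formula."""
        return c_d * 0.8 / peak_contrast

    def estimate_diameter(peak_contrast, depth):
        """Hypothetical stand-in for the diameter inversion formula."""
        return 2.0 * depth * peak_contrast

    def diameter_constant(diameter, depth):
        """Hypothetical stand-in relating the current estimates to C_D."""
        return 1.0 + 0.1 * depth / diameter

    def characterize_anomaly(peak_contrast, tolerance=1e-3, max_iter=50):
        """Iterate the depth/diameter estimates until the depth converges.

        Only the loop structure of steps (e)-(f) above is illustrated:
        C_D = 1 on the first pass, then both estimates are refined until the
        change in the estimated depth falls below the set tolerance.
        """
        c_d = 1.0
        depth = estimate_depth(peak_contrast, c_d)
        diameter = estimate_diameter(peak_contrast, depth)
        for _ in range(max_iter):
            c_d = diameter_constant(diameter, depth)
            new_depth = estimate_depth(peak_contrast, c_d)
            diameter = estimate_diameter(peak_contrast, new_depth)
            if abs(new_depth - depth) < tolerance:
                depth = new_depth
                break
            depth = new_depth
        return diameter, depth

    print(characterize_anomaly(0.5))
    ```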

  4. Power line detection system

    DOEpatents

    Latorre, Victor R.; Watwood, Donald B.

    1994-01-01

    A short-range, radio frequency (RF) transmitting-receiving system that provides both visual and audio warnings to the pilot of a helicopter or light aircraft of an up-coming power transmission line complex. Small, milliwatt-level narrowband transmitters, powered by the transmission line itself, are installed on top of selected transmission line support towers or within existing warning balls, and provide a continuous RF signal to approaching aircraft. The on-board receiver can be either a separate unit or a portion of the existing avionics, and can also share an existing antenna with another airborne system. Upon receipt of a warning signal, the receiver will trigger a visual and an audio alarm to alert the pilot to the potential power line hazard.

  5. Power line detection system

    DOEpatents

    Latorre, V.R.; Watwood, D.B.

    1994-09-27

    A short-range, radio frequency (RF) transmitting-receiving system that provides both visual and audio warnings to the pilot of a helicopter or light aircraft of an up-coming power transmission line complex. Small, milliwatt-level narrowband transmitters, powered by the transmission line itself, are installed on top of selected transmission line support towers or within existing warning balls, and provide a continuous RF signal to approaching aircraft. The on-board receiver can be either a separate unit or a portion of the existing avionics, and can also share an existing antenna with another airborne system. Upon receipt of a warning signal, the receiver will trigger a visual and an audio alarm to alert the pilot to the potential power line hazard. 4 figs.

  6. Centrifugal unbalance detection system

    DOEpatents

    Cordaro, Joseph V.; Reeves, George; Mets, Michael

    2002-01-01

    A system consisting of an accelerometer sensor attached to a centrifuge enclosure for sensing vibrations and outputting a signal in the form of a sine wave with an amplitude and frequency. The signal is passed through a pre-amp to convert it to a voltage signal, a low-pass filter for removing extraneous noise, an A/D converter, and a processor running an algorithm that operates on the signal. The algorithm interprets the amplitude and frequency associated with the signal; once an amplitude threshold has been exceeded, the algorithm begins to count cycles during a predetermined time period, and if the number of complete cycles exceeds the frequency threshold during that period, the system shuts down the centrifuge.
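
    A minimal sketch of the trip logic described above: gate on an amplitude threshold, then count complete cycles over a fixed window and compare the implied frequency with a frequency threshold. The sampling rate, thresholds, and window length are illustrative assumptions.

    ```python
    import numpy as np

    def unbalance_trip(signal, fs, amp_threshold, freq_threshold, window_s=1.0):
        """Decide whether to shut down the centrifuge, per the logic above.

        Once the vibration amplitude exceeds amp_threshold, count complete
        cycles (zero crossings / 2) over a window_s-second window; if the
        implied frequency exceeds freq_threshold, return True (shut down).
        """
        x = np.asarray(signal, dtype=float)
        if np.max(np.abs(x)) < amp_threshold:
            return False                          # amplitude gate not exceeded
        n = int(window_s * fs)
        w = x[:n] - np.mean(x[:n])
        zero_crossings = np.sum(np.signbit(w[:-1]) != np.signbit(w[1:]))
        cycles = zero_crossings / 2.0             # two crossings per cycle
        return (cycles / window_s) > freq_threshold

    # Toy usage: a 1 kHz-sampled, 120 Hz vibration of amplitude 2 g
    fs = 1000.0
    t = np.arange(0, 1.0, 1.0 / fs)
    vib = 2.0 * np.sin(2 * np.pi * 120.0 * t)
    print(unbalance_trip(vib, fs, amp_threshold=1.0, freq_threshold=100.0))  # True
    ```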

  7. Radiation detection system

    DOEpatents

    Whited, R.C.

    A system for obtaining improved resolution in relatively thick semiconductor radiation detectors, such as HgI2, which exhibit significant hole trapping. Two amplifiers are used: the first measures the charge collected and the second the contribution of the electrons to the charge collected. The outputs of the two amplifiers are utilized to unfold the total charge generated within the detector in response to a radiation event.

  8. Surface deformation and geoid anomalies over single and double-layered convective systems

    NASA Technical Reports Server (NTRS)

    Koch, M.; Yuen, D. A.

    1985-01-01

    Using a primitive variable formulation of the finite-element method, the differences in the surface observables, such as topography and geoid, produced by single- and double-layered thermal convection, were compared. Both constant and depth-dependent viscosities have been considered. For the same Rayleigh number, larger surface perturbations are produced by single-cell convection. For the same Nusselt number, the magnitudes of the surface observables are greater for double-layered convection. For the same surface heat-flux, surface topographies have similar magnitudes, but the relative amplitudes of geoid anomalies depend greatly on the style of viscosity stratification. This difference in the geoid between the two systems increases with greater surface heat-flow, regardless of viscosity structure.

  9. System and Method for Outlier Detection via Estimating Clusters

    NASA Technical Reports Server (NTRS)

    Iverson, David J. (Inventor)

    2016-01-01

    An efficient method and system for real-time or offline analysis of multivariate sensor data for use in anomaly detection, fault detection, and system health monitoring is provided. Models automatically derived from training data, typically nominal system data acquired from sensors in normally operating conditions or from detailed simulations, are used to identify unusual, out of family data samples (outliers) that indicate possible system failure or degradation. Outliers are determined through analyzing a degree of deviation of current system behavior from the models formed from the nominal system data. The deviation of current system behavior is presented as an easy to interpret numerical score along with a measure of the relative contribution of each system parameter to any off-nominal deviation. The techniques described herein may also be used to "clean" the training data.
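
    A minimal sketch of the cluster-based scoring idea, assuming k-means as a stand-in for the system's own model derivation from nominal training data: the deviation score is the distance to the nearest nominal cluster, and per-parameter contributions come from the componentwise differences.

    ```python
    import numpy as np

    def fit_clusters(nominal_data, n_clusters=8, iters=50, seed=0):
        """k-means on nominal training data (a stand-in for the learned model)."""
        nominal_data = np.asarray(nominal_data, dtype=float)
        rng = np.random.default_rng(seed)
        centres = nominal_data[rng.choice(len(nominal_data), n_clusters, replace=False)]
        for _ in range(iters):
            dists = np.linalg.norm(nominal_data[:, None] - centres, axis=2)
            labels = np.argmin(dists, axis=1)
            for k in range(n_clusters):
                if np.any(labels == k):
                    centres[k] = nominal_data[labels == k].mean(axis=0)
        return centres

    def deviation_score(sample, centres):
        """Distance from the nearest nominal cluster: the off-nominal score."""
        return float(np.min(np.linalg.norm(centres - sample, axis=1)))

    def parameter_contributions(sample, centres):
        """Relative contribution of each parameter to the deviation."""
        nearest = centres[np.argmin(np.linalg.norm(centres - sample, axis=1))]
        diff = np.abs(np.asarray(sample, dtype=float) - nearest)
        return diff / (diff.sum() + 1e-12)
    ```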

  10. APDS: Autonomous Pathogen Detection System

    SciTech Connect

    Langlois, R G; Brown, S; Burris, L; Colston, B; Jones, L; Makarewicz, T; Mariella, R; Masquelier, D; McBride, M; Milanovich, F; Masarabadi, S; Venkateswaran, K; Marshall, G; Olson, D; Wolcott, D

    2002-02-14

    An early warning system to counter bioterrorism, the Autonomous Pathogen Detection System (APDS) continuously monitors the environment for the presence of biological pathogens (e.g., anthrax) and once detected, it sounds an alarm much like a smoke detector warns of a fire. Long before September 11, 2001, this system was being developed to protect domestic venues and events including performing arts centers, mass transit systems, major sporting and entertainment events, and other high profile situations in which the public is at risk of becoming a target of bioterrorist attacks. Customizing off-the-shelf components and developing new components, a multidisciplinary team developed APDS, a stand-alone system for rapid, continuous monitoring of multiple airborne biological threat agents in the environment. The completely automated APDS samples the air, prepares fluid samples in-line, and performs two orthogonal tests: immunoassay and nucleic acid detection. When compared to competing technologies, APDS is unprecedented in terms of flexibility and system performance.

  11. Diversified transmission multichannel detection system

    SciTech Connect

    Tournois, P.; Engelhard, P.

    1984-07-03

    A detection system for imaging by sonar or radar signals. The system associates diversified transmissions with an interferometric base. This base provides an angular channel formation means and each signal formed in this way is processed by matched filtering in a circuit containing copy signals characterizing the space coloring obtained by the diversified transmission means. The invention is particularly applicable to side or front looking detection sonars.

  12. An on-line expert system for diagnosing environmentally induced spacecraft anomalies using CLIPS

    NASA Technical Reports Server (NTRS)

    Lauriente, Michael; Rolincik, Mark; Koons, Harry C; Gorney, David

    1993-01-01

    A new rule-based, expert system for diagnosing spacecraft anomalies is under development. The knowledge base consists of over two-hundred rules and provides links to historical and environmental databases. Environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward chaining rules into a backward chaining sequence, prompting the user for information pertinent to the causes considered. The use of heuristics frees the user from searching through large amounts of irrelevant information and allows the user to input partial information (varying degrees of confidence in an answer) or 'unknown' to any question. The expert system not only provides scientists with needed risk analysis and confidence estimates not available in standard numerical models or databases, but it is also an effective learning tool. In addition, the architecture of the expert system allows easy additions to the knowledge base and the database. For example, new frames concerning orbital debris and ionospheric scintillation are being considered. The system currently runs on a MicroVAX and uses the C Language Integrated Production System (CLIPS).

  13. On the origin of the flux ratio anomaly in quadruple lens systems

    NASA Astrophysics Data System (ADS)

    Inoue, Kaiki Taro

    2016-09-01

    We explore the origin of the flux ratio anomaly in quadruple lens systems. Using a semi-analytic method based on N-body simulations, we estimate the effect of a possible magnification perturbation caused by subhaloes with a mass scale of ≲ 10^9 h^-1 M⊙ in lensing galaxy haloes. Taking into account astrometric shifts and assuming that the primary lens is described by a singular isothermal ellipsoid, the expected change to the flux ratios for a multiply lensed image is just a few per cent, and the mean of the expected convergence perturbation at the effective Einstein radius of the lensing galaxy halo is <δκsub> = 0.003, corresponding to a mean projected dark matter mass fraction in subhaloes at the effective Einstein radius of 0.006. In contrast, the expected change to the flux ratio caused by line-of-sight structures is typically ~10 per cent and the mean of the convergence perturbation is <|δκlos|> = 0.008, corresponding to a fraction of 0.017. The contribution of the magnification perturbation caused by subhaloes is ~40 per cent of the total at a source redshift zS = 0.7 and decreases monotonically in zS to ~20 per cent at zS = 3.6. Assuming statistical isotropy, the convergence perturbation estimated from 11 observed quadruple lens systems has a positive correlation with the source redshift zS, which is much stronger than that with the lens redshift zL. This feature also supports the idea that the flux ratio anomaly is caused mainly by line-of-sight structures rather than subhaloes. We also discuss a possible imprint of line-of-sight structures in the demagnification of minimum images due to locally underdense structures in the line of sight.

  14. On the Origin of Flux Ratio Anomaly in Quadruple Lens Systems

    NASA Astrophysics Data System (ADS)

    Inoue, Kaiki Taro

    2016-05-01

    We explore the origin of the flux ratio anomaly in quadruple lens systems. Using a semi-analytic method based on N-body simulations, we estimate the effect of a possible magnification perturbation caused by subhaloes with a mass scale of ≲ 10^9 h^-1 M⊙ in lensing galaxy haloes. Taking into account astrometric shifts and assuming that the primary lens is described by a singular isothermal ellipsoid, the expected change to the flux ratios per multiply lensed image is just a few percent, and the mean of the expected convergence perturbation at the effective Einstein radius of the lensing galaxy halo is <δκsub> = 0.003, corresponding to a mean projected dark matter mass fraction in subhaloes at the effective Einstein radius of 0.006. In contrast, the expected change to the flux ratio caused by line-of-sight structures is typically ~10 percent and the mean of the convergence perturbation is <|δκlos|> = 0.008, corresponding to a fraction of 0.017. The contribution of the magnification perturbation caused by subhaloes is ~40 percent of the total at a source redshift zS = 0.7 and decreases monotonically in zS to ~20 percent at zS = 3.6. Assuming statistical isotropy, the convergence perturbation estimated from 11 observed quadruple lens systems has a positive correlation with the source redshift zS, which is much stronger than that with the lens redshift zL. This feature also supports the idea that the flux ratio anomaly is caused mainly by line-of-sight structures rather than subhaloes. We also discuss a possible imprint of line-of-sight structures in the demagnification of minimum images due to locally underdense structures in the line of sight.

  15. Anomaly detection using simulated MTI data cubes derived from HYDICE data

    SciTech Connect

    Moya, M.M.; Taylor, J.G.; Stallard, B.R.; Motomatsu, S.E.

    1998-07-01

    The US Department of Energy is funding the development of the Multi-spectral Thermal Imager (MTI), a satellite-based multi-spectral (MS) thermal imaging sensor scheduled for launch in October 1999. MTI is a research and development (R and D) platform to test the applicability of multispectral and thermal imaging technology for detecting and monitoring signs of proliferation of weapons of mass destruction. During its three-year mission, MTI will periodically record images of participating government, industrial and natural sites in fifteen visible and infrared spectral bands to provide a variety of image data associated with weapons production activities. The MTI satellite will have spatial resolution in the visible bands that is five times better than LANDSAT TM in each dimension and will have five thermal bands. In this work, the authors quantify the separability between specific materials and the natural background by applying receiver operating characteristic (ROC) analysis to the residual errors from a linear unmixing, and use this ROC analysis to quantify the performance of the MTI. They describe the MTI imager and simulate its data by filtering HYDICE hyperspectral imagery both spatially and spectrally and by introducing atmospheric effects corresponding to the MTI satellite altitude. They compare and contrast the individual effects on performance of spectral resolution, spatial resolution, atmospheric corrections, and varying atmospheric conditions.
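
    The residual-based separability score and its ROC analysis can be sketched as follows, assuming an unconstrained least-squares unmixing against a set of background endmembers; the endmember selection and the atmospheric handling used in the actual study are not reproduced here.

    ```python
    import numpy as np

    def unmixing_residuals(pixels, endmembers):
        """Residual norm of an unconstrained linear unmixing for each pixel.

        pixels     : (n, bands) spectra.
        endmembers : (m, bands) background endmember spectra.
        Larger residuals mean a pixel is poorly explained by the background
        endmembers; this is the score to which ROC analysis is applied.
        """
        E = endmembers.T                                   # (bands, m)
        abundances, *_ = np.linalg.lstsq(E, pixels.T, rcond=None)
        residual = pixels.T - E @ abundances
        return np.linalg.norm(residual, axis=0)

    def roc_curve(scores, labels):
        """Probability of detection vs. false-alarm rate by sweeping a threshold."""
        order = np.argsort(-np.asarray(scores))
        labels = np.asarray(labels, dtype=bool)[order]
        tp = np.cumsum(labels)
        fp = np.cumsum(~labels)
        return fp / max(fp[-1], 1), tp / max(tp[-1], 1)    # (Pfa, Pd)
    ```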

  16. A computational tool to highlight anomalies on shearographic images in optical flaw detection

    NASA Astrophysics Data System (ADS)

    Fantin, A. V.; Willemann, D. P.; Viotti, M.; Albertazzi, A.

    2013-04-01

    Shearography is an optical, nondestructive technique that has been widely used for damage detection in layered composite materials, where delaminations and debondings are among the most common flaws. Shearography is a relative measurement in which two images are recorded for different loading conditions of the sample. The applied loading induces deformations in the sample, generating a displacement field on its surface. The absolute difference between two phase maps recorded at two different loading states produces an interference fringe pattern that is directly correlated with the displacements produced on the material surface. In some cases, depending on the loading level and mainly on the sample geometry, the interference patterns contain fringes resulting from geometry changes. These can mask the fringes correlated with flaws present in the material, leading to image misinterpretation. This phenomenon occurs mainly when the sample has curved geometries, for example pipe or vessel surfaces. This paper presents an algorithm that uses mathematical processing to improve the visualization of flaws in shearographic images. The processing is based on the calculation of the divergence. The algorithm highlights defective regions and suppresses fringes caused by geometry changes, providing an easier interpretation of complex shearographic images. The paper also describes the principle and the algorithm used for the processing. Results, advantages and difficulties of the method are presented and discussed using simulated fringe maps as well as real ones.
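
    A minimal sketch of the divergence idea, assuming an unwrapped phase-difference map as input: the divergence of the phase-gradient field is computed with finite differences and its magnitude is used to highlight defect-like regions. This is only an illustration of the principle, not the authors' full processing chain.

    ```python
    import numpy as np

    def divergence_highlight(phase_map):
        """Highlight defect-like regions in an unwrapped shearographic phase map.

        Computes the divergence of the phase-gradient field, div(grad(phi)),
        which responds to the closed, locally concentric fringe patterns
        typical of flaws while suppressing the smoother fringes caused by
        global geometry changes.
        """
        gy, gx = np.gradient(np.asarray(phase_map, dtype=float))
        div = np.gradient(gx, axis=1) + np.gradient(gy, axis=0)
        return np.abs(div)
    ```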

  17. Multi-scale structure and topological anomaly detection via a new network statistic: The onion decomposition

    PubMed Central

    Hébert-Dufresne, Laurent; Grochow, Joshua A.; Allard, Antoine

    2016-01-01

    We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks. PMID:27535466
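
    A compact sketch of the onion decomposition as described above: vertices are peeled with the standard k-core algorithm and each vertex records the pass (layer) at which it is removed, alongside its core number. The data-structure choices are assumptions of this sketch.

    ```python
    def onion_decomposition(edges):
        """Core number and onion layer for each vertex of an undirected graph.

        The onion layer is the pass at which a vertex is removed by the
        standard k-core peeling algorithm; edges is an iterable of (u, v) pairs.
        """
        adj = {}
        for u, v in edges:
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
        degree = {v: len(ns) for v, ns in adj.items()}
        core, layer = {}, {}
        k, stage = 0, 0
        while degree:
            k = max(k, min(degree.values()))      # core value currently being peeled
            stage += 1                            # one onion layer per peeling pass
            shell = {v for v, d in degree.items() if d <= k}
            for v in shell:
                core[v], layer[v] = k, stage
                for u in adj[v]:
                    if u not in shell and u in degree:
                        degree[u] -= 1            # neighbour loses one edge
                del degree[v]
            for v in shell:
                del adj[v]
        return core, layer

    # Toy usage: a triangle with one pendant vertex
    edges = [("A", "B"), ("B", "C"), ("C", "A"), ("A", "D")]
    core, layer = onion_decomposition(edges)
    print(core, layer)   # D is in core/layer 1; A, B, C are in core/layer 2
    ```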

  18. Multi-scale structure and topological anomaly detection via a new network statistic: The onion decomposition.

    PubMed

    Hébert-Dufresne, Laurent; Grochow, Joshua A; Allard, Antoine

    2016-01-01

    We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks. PMID:27535466

  19. Anomaly Detection and Comparative Analysis of Hydrothermal Alteration Materials Trough Hyperspectral Multisensor Data in the Turrialba Volcano

    NASA Astrophysics Data System (ADS)

    Rejas, J. G.; Martínez-Frías, J.; Bonatti, J.; Martínez, R.; Marchamalo, M.

    2012-07-01

    The aim of this work is the comparative study of the presence of hydrothermal alteration materials in the Turrialba volcano (Costa Rica) in relation to spectral anomalies computed from multitemporal and multisensor data acquired in the visible (VIS), short wave infrared (SWIR) and thermal infrared (TIR) spectral ranges. For this purpose we used hyperspectral and multispectral images from the HyMAP and MASTER airborne sensors, and ASTER and Hyperion scenes, acquired between 2002 and 2010. Field radiometry was applied to remove the atmospheric contribution using an empirical line method. HyMAP and MASTER images were georeferenced directly using positioning and orientation data measured during the acquisition campaign with a GPS/IMU inertial system. These two steps allowed the identification of diagnostic spectral bands of hydrothermal alteration minerals and accurate spatial correlation. The environmental impact of the volcanic activity has been studied through different vegetation indexes and soil patterns. Hydrothermal materials were mapped in the currently active crater of the volcano and its surroundings by carrying out a principal component analysis, differentiated for high- and low-absorption bands, to characterize accumulations of kaolinite, illite, alunite and kaolinite+smectite and to delimit zones where these minerals are present. Spectral anomalies were calculated in a comparative study of pixel and subpixel methods focused on thermal bands fused with high-resolution images. Results are presented as an expert-based approach whose main interest lies in the automated identification of patterns of hydrothermally altered materials with little or no prior information about the area.

  20. Relationships between Rwandan seasonal rainfall anomalies and ENSO events

    NASA Astrophysics Data System (ADS)

    Muhire, I.; Ahmed, F.; Abutaleb, K.

    2015-10-01

    This study aims primarily at investigating the relationships between Rwandan seasonal rainfall anomalies and El Niño-Southern Oscillation (ENSO) events. The study is useful for early warning of negative effects associated with extreme rainfall anomalies across the country. It covers the period 1935-1992, using long and short rains data from 28 weather stations in Rwanda and ENSO events sourced from Glantz (2001). The mean standardized anomaly indices were calculated to investigate their associations with ENSO events. One-way analysis of variance was applied to the mean standardized anomaly index values per ENSO event to explore the spatial correlation of rainfall anomalies per ENSO event. A geographical information system was used to present spatially the variations in mean standardized anomaly indices per ENSO event. The results showed approximately three climatic periods, namely, a dry period (1935-1960), a semi-humid period (1961-1976) and a wet period (1977-1992). Though positive and negative correlations were detected between extreme short rains anomalies and El Niño events, La Niña events were mostly linked to negative rainfall anomalies while El Niño events were associated with positive rainfall anomalies. The occurrence of El Niño and La Niña in the same year does not show any clear association with rainfall anomalies. However, the phenomenon was more linked with positive long rains anomalies and negative short rains anomalies. The normal years were largely linked with negative long rains anomalies and positive short rains anomalies, which points to the influence of factors other than ENSO events. This makes it difficult to project seasonal rainfall anomalies in the country merely by predicting ENSO events.

  1. Beware of Venous Anomalies in Young Patients with Sick Sinus Syndrome: A Report of Two Cases of Sick Sinus Syndrome with Systemic Venous Anomalies

    PubMed Central

    Rathakrishnan, Shanmuga Sundaram; Kaliappan, Tamilarasu; Gopalan, Rajendiran

    2015-01-01

    We report two young patients with symptomatic sick sinus syndrome admitted for permanent pacemaker implantation (PPI). On evaluation with echocardiography, one of them was found to have a persistent left superior vena cava, and venography showed an absent right superior vena cava as well. He underwent PPI with leads inserted via the left superior vena cava, coronary sinus, right atrium and right ventricle. The other patient was incidentally found to have an interrupted inferior vena cava with azygos continuation while being planned for temporary pacemaker implantation. She underwent successful PPI. We would like to stress the importance of maintaining a high index of suspicion for these systemic venous anomalies in patients presenting with sick sinus syndrome, especially at a young age. If they are diagnosed preoperatively, on-table surprises can be avoided. PMID:27326354

  2. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, Marshall Clint; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flowfields/plumes. The Optical Plume Anomaly Detector (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and an analytical package designed to utilize information collected by OPAD is known as the Engine Diagnostic Filtering System (EDiFiS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Capabilities for real-time processing are being advanced on several fronts, including an effort to implement components of the EDiFiS in hardware for health monitoring and management. This paper addresses the OPAD with its tool suites, and discusses what is considered a natural progression: a concept for taking OPAD to the next logical level of high energy physics, incorporating fermion and boson particle analyses in the measurement of neutron flux.

  3. An Intelligent computer-aided tutoring system for diagnosing anomalies of spacecraft in operation

    NASA Technical Reports Server (NTRS)

    Rolincik, Mark; Lauriente, Michael; Koons, Harry C.; Gorney, David

    1993-01-01

    A new rule-based, expert system for diagnosing spacecraft anomalies is under development. The knowledge base consists of over two-hundred (200) rules and provides links to historical and environmental databases. Environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward chaining rules into a backward chaining sequence, prompting the user for information pertinent to the causes considered. When the user selects the novice mode, the system automatically gives detailed explanations and descriptions of terms and reasoning as the session progresses, in a sense teaching the user. As such it is an effective tutoring tool. The use of heuristics frees the user from searching through large amounts of irrelevant information and allows the user to input partial information (varying degrees of confidence in an answer) or 'unknown' to any question. The system is available on-line and uses C Language Integrated Production System (CLIPS), an expert shell developed by the NASA Johnson Space Center AI Laboratory in Houston.

  4. Knowledge-based verification of clinical guidelines by detection of anomalies.

    PubMed

    Duftschmid, G; Miksch, S

    2001-04-01

    As shown in numerous studies, a significant part of published clinical guidelines is tainted with different types of semantic errors that interfere with their practical application. The adaptation of generic guidelines, necessitated by circumstances such as resource limitations within the applying organization or unexpected events arising in the course of patient care, further promotes the introduction of defects. Still, most current approaches for the automation of clinical guidelines lack mechanisms that check the overall correctness of their output. In the domain of software engineering in general, and in the domain of knowledge-based systems (KBS) in particular, a common strategy to examine a system for potential defects consists in its verification. The focus of this work is to present an approach which helps to ensure the semantic correctness of clinical guidelines in a three-step process. We use a particular guideline specification language called Asbru to demonstrate our verification mechanism. A scenario-based evaluation of our method is provided based on a guideline for the artificial ventilation of newborn infants. The described approach is kept sufficiently general in order to allow its application to several other guideline representation formats. PMID:11259882

  5. Selecting Observation Platforms for Optimized Anomaly Detectability under Unreliable Partial Observations

    SciTech Connect

    Wen-Chiao Lin; Humberto E. Garcia; Tae-Sic Yoo

    2011-06-01

    Diagnosers for keeping track of the occurrences of special events in the framework of unreliable, partially observed discrete-event dynamical systems were developed in previous work. This paper considers observation platforms consisting of sensors that provide partial and unreliable observations and of diagnosers that analyze them. Diagnosers in observation platforms typically perform better as the sensors providing the observations become more costly or increase in number. This paper proposes a methodology for finding an observation platform that achieves an optimal balance between cost and performance, while satisfying given observability requirements and constraints. Since this problem is generally computationally hard in the framework considered, an observation platform optimization algorithm is utilized that uses two greedy heuristics, one myopic and another based on projected performances. These heuristics are sequentially executed in order to find the best observation platforms. The developed algorithm is then applied to an observation platform optimization problem for a multi-unit-operation system. Results show that improved observation platforms can be found that may significantly reduce the observation platform cost but still yield acceptable performance for correctly inferring the occurrences of special events.
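
    The myopic greedy heuristic mentioned above can be sketched as a cost-aware sensor selection loop; the budget constraint, the performance callback, and the toy coverage model in the usage example are assumptions of this illustration and do not capture the observability constraints of the actual framework.

    ```python
    def greedy_platform_selection(candidates, budget, performance):
        """Myopic greedy selection of sensors for an observation platform.

        candidates  : dict sensor_name -> cost.
        performance : function(set_of_sensors) -> diagnoser performance estimate.
        At each step, add the sensor with the best marginal performance gain
        per unit cost that still fits in the budget.
        """
        selected, spent = set(), 0.0
        base = performance(selected)
        while True:
            best, best_ratio = None, 0.0
            for s, cost in candidates.items():
                if s in selected or spent + cost > budget:
                    continue
                gain = performance(selected | {s}) - base
                ratio = gain / cost if cost > 0 else float("inf")
                if ratio > best_ratio:
                    best, best_ratio = s, ratio
            if best is None:
                return selected
            selected.add(best)
            spent += candidates[best]
            base = performance(selected)

    # Toy usage with a hypothetical coverage-based performance function
    costs = {"flow": 3.0, "level": 2.0, "camera": 5.0}
    coverage = {"flow": {1, 2}, "level": {2, 3}, "camera": {1, 2, 3, 4}}

    def perf(sensors):
        """Hypothetical performance: number of special events covered."""
        return len(set().union(*(coverage[s] for s in sensors))) if sensors else 0

    print(greedy_platform_selection(costs, budget=7.0, performance=perf))
    ```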

  6. 3D magnetotelluric characterization of the geothermal anomaly in the Llucmajor aquifer system (Majorca, Spain)

    NASA Astrophysics Data System (ADS)

    Arango, C.; Marcuello, A.; Ledo, J.; Queralt, P.

    2009-08-01

    In the Llucmajor aquifer system (Majorca Island, Spain), geothermal evidence has been found. This phenomenon is not unique to Majorca and is present in other areas, where it can be associated with structural conditions, especially the extensional event experienced by the island after the Alpine Orogeny. However, the origin of this anomaly in Llucmajor is not well known, and there is no surface geological evidence of these structural conditions. With the aim of delineating the geoelectrical structure of the zone and identifying the geological structure that allows the presence of this anomaly, an audiomagnetotelluric (AMT) survey was carried out. The AMT data were processed using a wavelet-transform-based scheme. Dimensionality analysis indicates that the geoelectrical structure is mainly 3D. The 3D model was obtained by trial-and-error forward modeling, taking into account the responses of the determinant of the impedance tensor. The model shows a vertical resistivity distribution with three horizons associated with different units: at the top, a shallow resistive medium related to an unconfined shallow aquifer; in the middle, a conductive layer related to the aquitard; and below it, another resistive medium related to the deeper confined aquifer. The intermediate horizon shows a sudden thinning beneath the thermally anomalous zone that can be identified as a weakness zone (fault or fracture) connecting both aquifers. An exploratory well was drilled after the AMT survey and reached almost 700 m depth. This allowed the resistivity distribution of the 3D model to be correlated with logging data and lithology obtained from the well, showing good agreement between them.

  7. Effects of soil moisture anomalies on the North American Monsoon System

    NASA Astrophysics Data System (ADS)

    Xu, J.; Small, E. E.; Lakshmi, V.

    2001-05-01

    A positive soil moisture-rainfall feedback exists when above-normal soil moisture increases the likelihood of future precipitation, and vice versa for dry soil. Observations and modeling studies suggest that this feedback may be important in magnifying and prolonging hydroclimatic anomalies in a variety of regions. Our preliminary modeling experiments show that the soil moisture-rainfall feedback is strong in the North American Monsoon System (NAMS) region and may contribute to variability of summertime precipitation in this area. However, this result is based on sensitivity experiments using extreme forcing - soil moisture was held at field capacity or wilting point throughout season long simulations. Here we use the MM5 model linked to the OSU land surface scheme to assess the strength of soil moisture-rainfall feedbacks in the NAMS region that result from realistic soil moisture forcing. Simulations are driven by NCEP reanalysis. The horizontal resolution of the finest grid is 30 km. All experiments begin on May 1 and end on October 1. First, we use the coupled MM5-OSU model to simulate NAMS climate and soil moisture in wet (1999) and dry (2000) monsoon seasons. Second, we repeat these two experiments but constrain the soil moisture field so that it approximates the mean state. This is accomplished by scaling the simulated precipitation at each point so that it is equal to mean observed precipitation at that location. This modification preserves the temporal variability of soil moisture that is characteristic of the NAMS region. We compare the atmosphere and land surface state in the control and sensitivity experiments. This isolates the effects of soil moisture anomalies on NAMS climate in both a dry and wet year. We check that the simulated soil moisture and surface energy balance state in the control and sensitivity simulations is reasonable via comparisons to both field observations and remotely sensed data.

  8. Structural Anomalies Detected in Ceramic Matrix Composites Using Combined Nondestructive Evaluation and Finite Element Analysis (NDE and FEA)

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Baaklini, George Y.; Bhatt, Ramakrishna T.

    2003-01-01

    and the experimental data. Furthermore, modeling of the voids collected via NDE offered an analytical advantage that resulted in more accurate assessments of the material's structural strength. The top figure shows a CT scan image of the specimen test section, illustrating various hidden structural entities in the material, and an optical image of the test specimen considered in this study. The bottom figure represents the stress response predicted from the finite element analyses (ref. 3) for a selected CT slice, where it clearly illustrates the correspondence of the high stress risers due to voids in the material with those predicted by the NDE. This study is continuing, and efforts are concentrated on improving the modeling capabilities to imitate the structural anomalies as detected.

  9. Holonomy anomalies

    SciTech Connect

    Bagger, J.; Nemeschansky, D.; Yankielowicz, S.

    1985-05-01

    A new type of anomaly is discussed that afflicts certain non-linear sigma models with fermions. This anomaly is similar to the ordinary gauge and gravitational anomalies since it reflects a topological obstruction to the reparametrization invariance of the quantum effective action. Nonlinear sigma models are constructed based on homogeneous spaces G/H. Anomalies arising when the fermions are chiral are shown to be cancelled sometimes by Chern-Simons terms. Nonlinear sigma models are considered based on general Riemannian manifolds. 9 refs. (LEW)

  10. Integrated multisensor perimeter detection systems

    NASA Astrophysics Data System (ADS)

    Kent, P. J.; Fretwell, P.; Barrett, D. J.; Faulkner, D. A.

    2007-10-01

    The report describes the results of a multi-year programme of research aimed at the development of an integrated multi-sensor perimeter detection system capable of being deployed at an operational site. The research was driven by end user requirements in protective security, particularly in threat detection and assessment, where effective capability was either not available or prohibitively expensive. Novel video analytics have been designed to provide robust detection of pedestrians in clutter while new radar detection and tracking algorithms provide wide area day/night surveillance. A modular integrated architecture based on commercially available components has been developed. A graphical user interface allows intuitive interaction and visualisation with the sensors. The fusion of video, radar and other sensor data provides the basis of a threat detection capability for real life conditions. The system was designed to be modular and extendable in order to accommodate future and legacy surveillance sensors. The current sensor mix includes stereoscopic video cameras, mmWave ground movement radar, CCTV and a commercially available perimeter detection cable. The paper outlines the development of the system and describes the lessons learnt after deployment in a pilot trial.

  11. The Association between Autism Spectrum Disorders and Congenital Anomalies by Organ Systems in a Finnish National Birth Cohort

    ERIC Educational Resources Information Center

    Timonen-Soivio, Laura; Sourander, Andre; Malm, Heli; Hinkka-Yli-Salomäki, Susanna; Gissler, Mika; Brown, Alan; Vanhala, Raija

    2015-01-01

    The aim of this study was to evaluate the association between autism spectrum disorders (ASD) with and without intellectual disability (ID) and congenital anomalies (CAs) by organ system. The sample included all children diagnosed with ASD (n = 4441) from the Finnish Hospital Discharge Register during 1987-2000 and a total of four controls per…

  12. An airborne real-time hyperspectral target detection system

    NASA Astrophysics Data System (ADS)

    Skauli, Torbjorn; Haavardsholm, Trym V.; Kåsen, Ingebjørg; Arisholm, Gunnar; Kavara, Amela; Opsahl, Thomas Olsvik; Skaugen, Atle

    2010-04-01

    An airborne system for hyperspectral target detection is described. The main sensor is a HySpex pushbroom hyperspectral imager for the visible and near-infrared spectral range with 1600 pixels across track, supplemented by a panchromatic line imager. An optional third sensor can be added, either a SWIR hyperspectral camera or a thermal camera. In real time, the system performs radiometric calibration and georeferencing of the images, followed by image processing for target detection and visualization. The current version of the system implements only spectral anomaly detection, based on normal mixture models. Image processing runs on a PC with a multicore Intel processor and an Nvidia graphics processing unit (GPU). The processing runs in a software framework optimized for large sustained data rates. The platform is a Cessna 172 aircraft based close to FFI, modified with a camera port in the floor.
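    The record above states that the current system implements spectral anomaly detection based on normal mixture models. The sketch below is a minimal, generic illustration of that class of detector using scikit-learn's GaussianMixture; it is not the FFI system's implementation, and the cube shape, component count, and quantile threshold are assumed values.

        # Spectral anomaly detection with a normal (Gaussian) mixture model:
        # fit the mixture to all pixel spectra as a background model and flag
        # pixels whose likelihood is very low. `cube` is a hypothetical
        # (rows, cols, bands) hyperspectral array; parameters are illustrative.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        def spectral_anomalies(cube, n_components=5, quantile=0.001):
            rows, cols, bands = cube.shape
            pixels = cube.reshape(-1, bands).astype(float)
            gmm = GaussianMixture(n_components=n_components, covariance_type="full")
            gmm.fit(pixels)                      # model the background clutter
            log_lik = gmm.score_samples(pixels)  # per-pixel log-likelihood
            threshold = np.quantile(log_lik, quantile)
            return (log_lik < threshold).reshape(rows, cols)  # boolean anomaly mask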

  13. Bangui Anomaly

    NASA Technical Reports Server (NTRS)

    Taylor, Patrick T.

    2004-01-01

    Bangui anomaly is the name given to one of the Earth's largest crustal magnetic anomalies and the largest over the African continent. It covers two-thirds of the Central African Republic, and the name therefore derives from the capital city, Bangui, which is also near the center of this feature. From surface magnetic survey data, Godivier and Le Donche (1962) were the first to describe this anomaly. Subsequently, high-altitude world magnetic surveying by the U.S. Naval Oceanographic Office (Project Magnet) recorded a greater than 1000 nT dipolar, peak-to-trough anomaly with the major portion being negative (figure 1). Satellite observations (Cosmos 49) were first reported in 1964; these revealed a 40 nT anomaly at 350 km altitude. Subsequently, the higher altitude (417-499 km) POGO (Polar Orbiting Geomagnetic Observatory) satellite data recorded peak-to-trough anomalies of 20 nT; these data were added to the Cosmos 49 measurements by Regan et al. (1975) for a regional satellite-altitude map. In October 1979, with the launch of Magsat, a satellite designed to measure crustal magnetic anomalies, a more uniform satellite-altitude magnetic map was obtained. These data, computed at 375 km altitude, recorded a -22 nT anomaly (figure 2). This elliptically shaped anomaly is approximately 760 by 1000 km and is centered at 6° N, 18° E. The Bangui anomaly is composed of three segments: there are two positive anomaly lobes north and south of a large central negative field. This displays the classic pattern of a magnetically anomalous body being magnetized by induction in a zero-inclination field. This is not surprising since the magnetic equator passes near the center of this body.

  14. Incipient-signature identification of mechanical anomalies in a ship-borne satellite antenna system using an ensemble multiwavelet

    NASA Astrophysics Data System (ADS)

    He, Shuilong; Zi, Yanyang; Chen, Jinglong; Zhao, Chenlu; Chen, Binqiang; Yuan, Jing; He, Zhengjia

    2014-10-01

    The instrumented tracking and telemetry ship with a ship-borne satellite antenna (SSA) is a critical device for ensuring high-quality space exploration work. To effectively detect mechanical anomalies that can lead to unexpected downtime of the SSA, an ensemble multiwavelet (EM) is presented for identifying the anomaly-related incipient signatures within the measured dynamic signals. Rather than using a predetermined basis as in a conventional multiwavelet, an EM optimizes the matching basis, which satisfactorily adapts to the anomaly-related incipient signatures. The construction technique of an EM is based on the conjunction of a two-scale similarity transform (TST) and a lifting scheme (LS). In this technique, the TST improves the regularity by increasing the approximation order of the multiscaling functions, while the LS subsequently enhances the smoothness and localizability by utilizing the vanishing moments of the multiwavelet functions. Moreover, by combining the Hilbert transform with EM decomposition, we identify the incipient signatures induced by mechanical anomalies in the measured dynamic signals. A numerical simulation and two successful diagnosis cases (a planetary gearbox and a roller bearing) demonstrate that the proposed technique is capable of dealing with the challenging incipient-signature identification task even when spectral complexity, as well as strong amplitude/frequency modulation effects, is present in the dynamic signals.
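    The ensemble multiwavelet construction itself (TST plus lifting) is specific to the cited paper and is not reproduced here. As a rough stand-in, the sketch below shows the generic Hilbert-transform envelope-spectrum step often used for incipient fault signatures in gearboxes and bearings, with a synthetic amplitude-modulated test signal; sampling rate and modulation frequency are assumed values.

        # Generic envelope analysis via the Hilbert transform, a simplified
        # stand-in for the ensemble-multiwavelet + Hilbert scheme above.
        import numpy as np
        from scipy.signal import hilbert

        def envelope_spectrum(x, fs):
            """Return frequencies and envelope-spectrum amplitudes of signal x."""
            analytic = hilbert(x)                    # analytic signal
            envelope = np.abs(analytic)              # amplitude envelope
            envelope -= envelope.mean()              # drop the DC component
            spectrum = np.abs(np.fft.rfft(envelope)) / len(x)
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
            return freqs, spectrum

        # Example: a 30 Hz amplitude modulation on a 1 kHz carrier shows up
        # as a peak near 30 Hz in the envelope spectrum.
        fs = 12_000
        t = np.arange(0, 1.0, 1.0 / fs)
        x = (1 + 0.5 * np.sin(2 * np.pi * 30 * t)) * np.sin(2 * np.pi * 1000 * t)
        freqs, spec = envelope_spectrum(x, fs)
        print(freqs[np.argmax(spec[1:]) + 1])        # ~30.0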

  15. Heavy metal anomalies in the Tinto and Odiel River and estuary system, Spain

    USGS Publications Warehouse

    Nelson, C.H.; Lamothe, P.J.

    1993-01-01

    The Tinto and Odiel rivers drain 100 km from the Rio Tinto sulphide mining district and join at a 20-km-long estuary entering the Atlantic Ocean. A reconnaissance study of heavy metal anomalies in channel sand and overbank mud of the river and estuary, by semi-quantitative emission dc-arc spectrographic analysis, shows the following upstream-to-downstream ranges in ppm (µg g-1): As 3,000 to <200, Cd 30 to <0.1, Cu 1,500 to 10, Pb 2,000 to <10, Sb 3,000 to <150, and Zn 3,000 to <200. Organic-rich (1.3-2.6% total organic carbon, TOC), sandy-silty overbank clay has been analyzed to represent suspended-load materials. The high content of heavy metals in the overbank clay throughout the river and estuary systems indicates the importance of suspended sediment transport for dispersing heavy metals from natural erosion and anthropogenic mining activities of the sulfide deposit. The organic-poor (0.21-0.37% TOC) river bed sand has been analyzed to represent bedload transport of naturally occurring sulfide minerals. The sand has high concentrations of metals upstream, but these decrease an order of magnitude in the lower estuary. Although heavy metal contamination of estuary-mouth beach sand has been diluted to background levels, estuary mud exhibits increased contamination apparently related to finer grain size, higher organic carbon content, precipitation of river-borne dissolved solids, and input of anthropogenic heavy metals from industrial sources. The contaminated estuary mud disperses to the inner shelf mud belt and offshore suspended sediment, which exhibit metal anomalies from natural erosion and mining of upstream Rio Tinto sulphide lode sources (Pb, Cu, Zn) and industrial activities within the estuary (Fe, Cr, Ti). Because heavy metal contamination of Tinto-Odiel river sediment reaches or exceeds the highest levels encountered in other river sediments of Spain and Europe, a detailed analysis of metals in water and suspended sediment throughout the system, and

  16. Continental and oceanic magnetic anomalies: Enhancement through GRM

    NASA Technical Reports Server (NTRS)

    Vonfrese, R. R. B.; Hinze, W. J.

    1985-01-01

    In contrast to the POGO and MAGSAT satellites, the Geopotential Research Mission (GRM) satellite system will orbit at a minimum elevation to provide significantly better resolved lithospheric magnetic anomalies for more detailed and improved geologic analysis. In addition, GRM will measure corresponding gravity anomalies to enhance our understanding of the gravity field for vast regions of the Earth which are largely inaccessible to more conventional surface mapping. Crustal studies will greatly benefit from the dual data sets as modeling has shown that lithospheric sources of long wavelength magnetic anomalies frequently involve density variations which may produce detectable gravity anomalies at satellite elevations. Furthermore, GRM will provide an important replication of lithospheric magnetic anomalies as an aid to identifying and extracting these anomalies from satellite magnetic measurements. The potential benefits to the study of the origin and characterization of the continents and oceans, that may result from the increased GRM resolution are examined.

  17. Assessing two operational systems for monthly and seasonal climatic anomalies forecast in Italy.

    NASA Astrophysics Data System (ADS)

    Pasqui, M.; Pavan, V.; Quaresima, S.; Primicerio, J.; Cacciamani, C.; Gozzini, B.; Perini, L.

    2010-09-01

    The multi-model ensemble system for long-term predictions over Italy organised by the National Civil Protection Agency is described. The system has been designed to provide technical support for decision making at the national level on issues related to water management and, more generally, to the mitigation of the impacts of intense climate anomalies over Italy on the population and on production activities. Two separate systems have been developed: a multi-model monthly prediction system and a seasonal (three-month) prediction system. The final products of these two systems consist of the probability of occurrence of events for specific indices obtained from two surface climate fields: mean temperature and cumulated precipitation. The indices are obtained by averaging the values of these fields over the Northern, the Central (plus Sardinia) and the Southern (plus Sicily) Italian regions. The events considered are defined on the basis of the three terciles (lower, medium and upper) of the probability distribution of the values of these indices over a long period (namely from 1987 to 2009), or using the 15th or the 85th percentile of the same indices over the long period, so as to identify the probability of occurrence of possible intense events. Several different institutes collaborate on the system, but only the skill scores of a subset of these products are analysed in the present work: two separate products of CNR-IBIMET for the monthly and the seasonal time scale, and the product of ARPA-SIMC for the seasonal time scale. The two products produced by CNR-IBIMET, namely the monthly and the seasonal predictions, are obtained using a statistical model in which the probabilistic predictions are the predictands of a multi-regressive statistical scheme using as predictors the observed values of selected large-scale atmospheric indices and sea surface anomalies. The seasonal predictions of ARPA-SIMC are obtained by applying a Model Output Statistics (MOS) calibration scheme, also based on
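    The tercile-based event definition described above reduces to a simple percentile computation over the reference period. The sketch below is a minimal illustration of that step, assuming hypothetical arrays for the 1987-2009 reference index values and for an ensemble of forecast members; it is not the operational system's code.

        # Tercile-based event definition: lower/middle/upper categories are
        # bounded by the 33.3rd and 66.7th percentiles of the index over a
        # reference period (hypothetical arrays used for illustration).
        import numpy as np

        reference = np.random.default_rng(0).normal(size=23)   # e.g. 1987-2009 values
        lower, upper = np.percentile(reference, [100 / 3, 200 / 3])

        def tercile_category(value):
            """Return 'lower', 'middle' or 'upper' tercile for a new index value."""
            if value <= lower:
                return "lower"
            if value <= upper:
                return "middle"
            return "upper"

        # The forecast probability of each category is the fraction of
        # ensemble members falling in it.
        members = np.random.default_rng(1).normal(size=51)
        probs = {c: np.mean([tercile_category(m) == c for m in members])
                 for c in ("lower", "middle", "upper")}
        print(probs)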

  18. Hearing aid malfunction detection system

    NASA Technical Reports Server (NTRS)

    Kessinger, R. L. (Inventor)

    1977-01-01

    A malfunction detection system for detecting malfunctions in electrical signal processing circuits is disclosed. Malfunctions of a hearing aid in the form of frequency distortion and/or inadequate amplification by the hearing aid amplifier, as well as weakening of the hearing aid power supply are detectable. A test signal is generated and a timed switching circuit periodically applies the test signal to the input of the hearing aid amplifier in place of the input signal from the microphone. The resulting amplifier output is compared with the input test signal used as a reference signal. The hearing aid battery voltage is also periodically compared to a reference voltage. Deviations from the references beyond preset limits cause a warning system to operate.
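    The comparison logic described in the patent abstract above can be illustrated with a few lines of arithmetic. The sketch below is a minimal numeric illustration only; the gain, tolerance, and battery limits are assumed values, not taken from the patent, and frequency-distortion checking is omitted.

        # Compare the amplified test signal and the battery voltage against
        # references; deviations beyond preset limits raise warnings.
        # All limits are assumed values for illustration.
        def check_hearing_aid(test_in, test_out, battery_v,
                              expected_gain=100.0, gain_tol=0.2,
                              min_battery_v=1.25):
            warnings = []
            gain = test_out / test_in
            if abs(gain - expected_gain) > gain_tol * expected_gain:
                warnings.append("amplifier gain out of limits")
            if battery_v < min_battery_v:
                warnings.append("battery voltage low")
            return warnings

        print(check_hearing_aid(test_in=0.01, test_out=0.7, battery_v=1.2))
        # ['amplifier gain out of limits', 'battery voltage low']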

  19. A vehicle threat detection system using correlation analysis and synthesized x-ray images

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng; Elmaghraby, Adel

    2013-06-01

    The goal of the proposed research is to automate vehicle threat detection with X-ray images when a vehicle crosses the country border or the gateway of a secured facility (military base). The proposed detection system requires two inputs: probe images (from the X-ray machine) and gallery images (from a database). For each vehicle, the gallery images include the X-ray images of the fully loaded (with typical cargo) and unloaded (empty) vehicle. The proposed system produces two types of outputs for threat detection: the detected anomalies and the synthesized images (e.g., grayscale fusion, color fusion, and differential images). The anomalies are automatically detected with block-wise correlation analysis between two temporally aligned images (probe versus gallery). The locations of detected anomalies can be marked with small rectangles on the probe X-ray images. Several side-view images can be combined into one fused image, in gray scale or in colors (color fusion), which provides more comprehensive information to the operator. The fused images are suitable for human analysis and decision making. We analyzed a set of vehicle X-ray images consisting of 4 images generated from an AS&E OmniView Gantry™. The preliminary results of detected anomalies and synthesized images are very promising, and the processing speed is very fast.
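    The block-wise correlation step described above can be sketched directly with numpy. The code below is a minimal illustration, assuming the probe and gallery images are already registered, of equal size, and stored as float arrays; the block size and correlation threshold are illustrative choices, not the paper's values.

        # Block-wise correlation between a probe X-ray image and its gallery
        # reference: blocks whose correlation coefficient falls below a
        # threshold are flagged as potential anomalies.
        import numpy as np

        def blockwise_anomalies(probe, gallery, block=32, min_corr=0.8):
            flags = []
            rows, cols = probe.shape
            for r in range(0, rows - block + 1, block):
                for c in range(0, cols - block + 1, block):
                    p = probe[r:r + block, c:c + block].ravel()
                    g = gallery[r:r + block, c:c + block].ravel()
                    if p.std() == 0 or g.std() == 0:
                        continue                  # flat block, nothing to compare
                    corr = np.corrcoef(p, g)[0, 1]
                    if corr < min_corr:
                        flags.append((r, c))      # top-left corner of suspect block
            return flags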

  20. Semi autonomous mine detection system

    SciTech Connect

    Douglas Few; Roelof Versteeg; Herman Herman

    2010-04-01

    CMMAD is a risk reduction effort for the AMDS program. As part of CMMAD, multiple instances of semi-autonomous robotic mine detection systems were created. Each instance consists of a robotic vehicle equipped with the sensors required for navigation and marking, a countermine sensor, and a number of integrated software packages that provide real-time processing of the countermine sensor data as well as integrated control of the robotic vehicle, the sensor actuator and the sensor. These systems were used to investigate critical interest functions (CIF) related to countermine robotic systems. To address the autonomy CIF, the INL-developed RIK was extended to allow for interaction with a mine sensor processing code (MSPC). In limited field testing this system performed well in detecting, marking and avoiding both AT and AP mines. Based on the results of the CMMAD investigation, we conclude that autonomous robotic mine detection is feasible. In addition, CMMAD contributed critical technical advances with regard to sensing, data processing and sensor manipulation, which will advance the performance of future fieldable systems. As a result, no substantial technical barriers exist that, from an autonomous robotic perspective, preclude the rapid development and deployment of fieldable systems.

  1. Portable Microleak-Detection System

    NASA Technical Reports Server (NTRS)

    Rivers, H. Kevin; Sikora, Joseph G.; Sankaran, Sankara N.

    2007-01-01

    The figure schematically depicts a portable microleak-detection system that has been built especially for use in testing hydrogen tanks made of polymer-matrix composite materials. (As used here, microleak signifies a leak that is too small to be detectable by the simple soap-bubble technique.) The system can also be used to test for microleaks in tanks that are made of other materials and that contain gases other than hydrogen. Results of calibration tests have shown that measurement errors are less than 10 percent for leak rates ranging from 0.3 to 200 cm3/min. Like some other microleak-detection systems, this system includes a vacuum pump and associated plumbing for sampling the leaking gas, and a mass spectrometer for analyzing the molecular constituents of the gas. The system includes a flexible vacuum chamber that can be attached to the outer surface of a tank or other object of interest that is to be tested for leakage (hereafter denoted, simply, the test object). The gas used in a test can be the gas or vapor (e.g., hydrogen in the original application) to be contained by the test object. Alternatively, following common practice in leak testing, helium can be used as a test gas. In either case, the mass spectrometer can be used to verify that the gas measured by the system is the test gas rather than a different gas and, hence, that the leak is indeed from the test object.

  2. Competing Orders and Anomalies.

    PubMed

    Moon, Eun-Gook

    2016-01-01

    A conservation law is one of the most fundamental properties in nature, but a certain class of conservation "laws" could be spoiled by intrinsic quantum mechanical effects, so-called quantum anomalies. Profound properties of the anomalies have deepened our understanding of quantum many-body systems. Here, we investigate quantum anomaly effects in quantum phase transitions between competing orders and striking consequences of their presence. We explicitly calculate the topological nature of anomalies of non-linear sigma models (NLSMs) with the Wess-Zumino-Witten (WZW) terms. The non-perturbative nature is directly related to the 't Hooft anomaly matching condition: anomalies are conserved under renormalization group flow. By applying the matching condition, we show that massless excitations are enforced by the anomalies throughout the phase diagram, in sharp contrast to the case of the Landau-Ginzburg-Wilson theory, which only has massive excitations in symmetric phases. Furthermore, we find non-perturbative criteria to characterize quantum phase transitions between competing orders. For example, in 4D, we show that the two competing order parameter theories, CP(1) and the NLSM with WZW, describe different universality classes. Physical realizations and experimental implications of the anomalies are also discussed. PMID:27499184

  3. Competing Orders and Anomalies

    NASA Astrophysics Data System (ADS)

    Moon, Eun-Gook

    2016-08-01

    A conservation law is one of the most fundamental properties in nature, but a certain class of conservation “laws” could be spoiled by intrinsic quantum mechanical effects, so-called quantum anomalies. Profound properties of the anomalies have deepened our understanding of quantum many-body systems. Here, we investigate quantum anomaly effects in quantum phase transitions between competing orders and striking consequences of their presence. We explicitly calculate the topological nature of anomalies of non-linear sigma models (NLSMs) with the Wess-Zumino-Witten (WZW) terms. The non-perturbative nature is directly related to the ’t Hooft anomaly matching condition: anomalies are conserved under renormalization group flow. By applying the matching condition, we show that massless excitations are enforced by the anomalies throughout the phase diagram, in sharp contrast to the case of the Landau-Ginzburg-Wilson theory, which only has massive excitations in symmetric phases. Furthermore, we find non-perturbative criteria to characterize quantum phase transitions between competing orders. For example, in 4D, we show that the two competing order parameter theories, CP(1) and the NLSM with WZW, describe different universality classes. Physical realizations and experimental implications of the anomalies are also discussed.

  4. Competing Orders and Anomalies

    PubMed Central

    Moon, Eun-Gook

    2016-01-01

    A conservation law is one of the most fundamental properties in nature, but a certain class of conservation “laws” could be spoiled by intrinsic quantum mechanical effects, so-called quantum anomalies. Profound properties of the anomalies have deepened our understanding of quantum many-body systems. Here, we investigate quantum anomaly effects in quantum phase transitions between competing orders and striking consequences of their presence. We explicitly calculate the topological nature of anomalies of non-linear sigma models (NLSMs) with the Wess-Zumino-Witten (WZW) terms. The non-perturbative nature is directly related to the ’t Hooft anomaly matching condition: anomalies are conserved under renormalization group flow. By applying the matching condition, we show that massless excitations are enforced by the anomalies throughout the phase diagram, in sharp contrast to the case of the Landau-Ginzburg-Wilson theory, which only has massive excitations in symmetric phases. Furthermore, we find non-perturbative criteria to characterize quantum phase transitions between competing orders. For example, in 4D, we show that the two competing order parameter theories, CP(1) and the NLSM with WZW, describe different universality classes. Physical realizations and experimental implications of the anomalies are also discussed. PMID:27499184

  5. Variations of Cloud and Radiative Properties of Boundary-layer and Deep Convective Systems with Sea Surface Temperature Anomalies

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2010-01-01

    Gridded monthly-mean satellite data contain compositing information from different cloud system types and clear-sky environments. To isolate the variations of the cloud physical properties of an individual cloud system type with its environment, orbital data are needed. In this study, we will analyze the variations of the cloud and radiative properties of boundary-layer clouds and deep convective cloud systems with sea surface temperature (SST) anomalies. We use Terra-CERES (Clouds and the Earth's Radiant Energy System) Level 2 data to classify distinct cloud objects defined by cloud-system type (deep convection, boundary-layer cumulus, stratocumulus and overcast clouds), size, geographic location, and matched large-scale environment. This analysis method identifies a cloud object as a contiguous region of the Earth with a single dominant cloud-system type. It determines the shape and size of the cloud object from the satellite data and the cloud-system selection criteria. The statistical properties of the identified cloud objects are analyzed in terms of probability density functions (PDFs) of a single property or joint PDFs between two properties. The SST anomalies are defined as the differences from five-year annual-cycle means. Individual cloud objects are sorted into one of five equal-size subsets, with the matched SST anomalies ranging from the most negative to the most positive values, for a given size category of deep convective, boundary-layer cumulus, stratocumulus and overcast cloud objects. The PDFs of cloud and radiative properties for deep convective cloud objects (between 30 S and 30 N) are found to be largely similar among the five SST anomaly subsets, except for the lowest SST anomaly subset. The different characteristics of this SST anomaly subset may be related to some cloud objects resulting from the equatorward movement of extratropical cloud systems. This result holds true for all three different size categories (measured by equivalent
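    The compositing step described above, sorting cloud objects into five equal-size subsets by their matched SST anomaly and building a PDF for each subset, is easy to sketch. The arrays below are hypothetical placeholders (random SST anomalies and a single cloud property), not CERES data.

        # Sort cloud objects by matched SST anomaly, split into five equal-size
        # subsets, and build a property PDF per subset (hypothetical arrays).
        import numpy as np

        rng = np.random.default_rng(0)
        sst_anomaly = rng.normal(0.0, 0.5, size=1000)   # one value per cloud object
        albedo = rng.uniform(0.2, 0.8, size=1000)       # a cloud property per object

        order = np.argsort(sst_anomaly)                 # most negative ... most positive
        subsets = np.array_split(order, 5)              # five equal-size groups

        bins = np.linspace(0.2, 0.8, 25)
        pdfs = [np.histogram(albedo[idx], bins=bins, density=True)[0]
                for idx in subsets]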

  6. Tape Cassette Bacteria Detection System

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The design, fabrication, and testing of an automatic bacteria detection system with a zero-g capability and based on the filter-capsule approach is described. This system is intended for monitoring the sterility of regenerated water in a spacecraft. The principle of detection is based on measuring the increase in chemiluminescence produced by the action of bacterial porphyrins (i.e., catalase, cytochromes, etc.) on a luminol-hydrogen peroxide mixture. Since viable as well as nonviable organisms initiate this luminescence, viable organisms are detected by comparing the signal of an incubated water sample with an unincubated control. Higher signals for the former indicate the presence of viable organisms. System features include disposable sealed sterile capsules, each containing a filter membrane, for processing discrete water samples and a tape transport for moving these capsules through a processing sequence which involves sample concentration, nutrient addition, incubation, a 4 Molar Urea wash and reaction with luminol-hydrogen peroxide in front of a photomultiplier tube. Liquids are introduced by means of a syringe needle which pierces a rubber septum contained in the wall of the capsule. Detection thresholds obtained with this unit towards E. coli and S. marcescens assuming a 400 ml water sample are indicated.

  7. Space-borne detection of volcanic carbon dioxide anomalies: The importance of ground-based validation networks

    NASA Astrophysics Data System (ADS)

    Schwandner, F. M.; Carn, S. A.; Corradini, S.; Merucci, L.; Salerno, G.; La Spina, A.

    2012-04-01

    We have investigated the feasibility of space-borne detection of volcanic carbon dioxide (CO2) anomalies and their integration with ground-based observations. Three goals motivate this integration: (a) development of new volcano monitoring techniques with better spatial and temporal coverage, because pre-eruptive volcanic CO2 emissions are potentially the earliest available indicators of volcanic unrest; (b) improvement of the currently very poor global CO2 source-strength inventory for volcanoes; and (c) use of volcanic CO2 emissions for high-altitude strong point-source emission and dispersion studies. (1) Feasibility of space-borne detection of volcanic CO2 anomalies. Volcanoes are highly variable but continuous CO2 emitters, distributed globally, and emissions often occur at high altitudes. To detect strong point sources of CO2 from space, several hurdles have to be overcome: orographic clouds, unknown dispersion behavior, a high CO2 background in the troposphere, and sparse data coverage from existing satellite sensors. These obstacles can be overcome by a small field of view, enhanced spectral resolving power, and repeat target-mode observation strategies. The Japanese GOSAT instrument has been operational since January 2009, producing CO2 total-column measurements with a repeat cycle of 3 days and a field of view of 10 km. GOSAT thus has the potential to provide spatially integrated data for entire volcanic edifices, especially in target mode. Since summer 2010 we have conducted repeated target-mode observations of over 20 persistently active volcanoes worldwide, including Etna (Italy), Erta Ale (Ethiopia), and Ambrym (Vanuatu), using L2 GOSAT FTS SWIR data. One of our best-studied test cases is Mt. Etna on Sicily (Italy), which reawakened in 2011 after a period of quiescence and produced a sequence of eruptive activities including lava fountaining events, coinciding with target-mode GOSAT observations conducted there since 2010. For the

  8. Congenital anomalies of the genitourinary system can help in diagnosis of the primary site of metastatic cancer: a case report and a review of the literature

    PubMed Central

    Deptala, Andrzej; Romanowicz, Agnieszka; Czerw, Aleksandra; Walecki, Jerzy; Rogowski, Wojciech; Nasierowska-Guttmejer, Anna

    2016-01-01

    Objective To analyze whether the presence of congenital anomalies of the genitourinary system that are accompanied by specific types of cancer and predispose patients to many complications, including infection, obstruction, stasis, calculus formation, and impaired renal function, could help in the diagnosis of the primary site of a metastatic tumor. Case presentation We report a case of a 58-year-old man with metastatic adenocarcinoma, in whom congenital anomalies of the genitourinary system proved helpful for the diagnosis of the primary site of cancer originating in the seminal vesicles. Conclusion We report an extremely rare case of primary adenocarcinoma arising probably from the left seminal vesicle associated with ipsilateral renal agenesis. The lesion was detected on ultrasound and contrast-enhanced computed tomography and confirmed histologically with ultrasound-guided biopsy. Serum markers, ie, CA19-9 and CA125, were elevated, while prostate-specific antigen and carcinoembryonic antigen were within normal limits. Such a constellation of markers strengthened the diagnosis. Our patient unfortunately presented very late in the course of the disease. Hence, we decided to initiate antiandrogen therapy and best supportive care in a hospice setting. Only early detection seems to be the key factor that may result in improved cure rates for cancer of the seminal vesicles. We also performed a literature search for current concepts related to the diagnosis and clinical management of primary adenocarcinoma of seminal vesicles. PMID:27499637

  9. Fiber optic hydrogen detection system

    NASA Astrophysics Data System (ADS)

    Kazemi, Alex A.; Larson, David B.; Wuestling, Mark D.

    1999-12-01

    Commercial and military launch vehicles are designed to use hydrogen as the main propellant, which is very volatile, extremely flammable, and highly explosive. Current detection systems use Teflon transfer tubes at a large number of vehicle locations, through which gas samples are drawn and the stream is analyzed by a mass spectrometer. A concern with this approach is the high cost of the system. Also, the current system does not provide leak location and does not operate in real time. The system is very complex and cumbersome for production and ground-support measurement personnel. The fiber optic micromirror sensor under development for cryogenic environments relies on a reversible chemical interaction causing a change in reflectivity of a thin coated film of palladium. The magnitude of the reflectivity change is correlated with hydrogen concentration. The sensor uses only a tiny light beam, with no electricity whatsoever at the sensor, leading to a device that is intrinsically safe from explosive ignition. The sensor, extremely small in size and weight, detects hydrogen concentration using a passive element consisting of chemically reactive microcoatings deposited on the surface of a glass microlens, which is then bonded to an optical fiber. The system uses a multiplexing technique with a fiber optic driver-receiver consisting of a modulated LED source that is launched into the sensor and a photodiode detector that synchronously measures the reflected signal. The system incorporates a microprocessor (or PC) to perform data analysis and storage, as well as trending and alarm-setting functions. As it is a low-cost system with a fast response, many more detection sensors can be used, which will be extremely helpful in determining leak locations for the safety of crew and vehicles during launch operations.

  10. A pattern recognition system for JPEG steganography detection

    NASA Astrophysics Data System (ADS)

    Chen, C. L. Philip; Chen, Mei-Ching; Agaian, Sos; Zhou, Yicong; Roy, Anuradha; Rodriguez, Benjamin M.

    2012-10-01

    This paper builds a pattern recognition system to detect anomalies in JPEG images, especially steganographic content. The system consists of feature generation, feature ranking and selection, feature extraction, and pattern classification. These processes capture image characteristics, reduce the problem dimensionality, eliminate noise interference between features, and further improve classification accuracies on clean and steganographic JPEG images. Based on the discussion and analysis of six popular JPEG steganography methods, the entire recognition system achieves higher classification accuracies between the clean and steganography classes than merely using an individual feature subset for JPEG steganography detection. The strength of feature combination and preprocessing is retained even when only a small amount of information is embedded. The work demonstrated in this paper is extensible and can be improved by integrating various new and current techniques.
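    The staged pipeline described above (feature ranking/selection followed by classification) can be sketched with scikit-learn. The code below is only a generic illustration of that structure; the actual steganalysis features and classifier of the cited paper are not reproduced, and X and y are hypothetical feature vectors and labels.

        # Generic feature-selection + classification pipeline, standing in for
        # the staged recognition system described above (hypothetical data).
        import numpy as np
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 300))      # 200 images x 300 candidate features
        y = rng.integers(0, 2, size=200)     # 0 = clean, 1 = steganographic

        clf = make_pipeline(StandardScaler(),
                            SelectKBest(f_classif, k=50),   # rank and keep 50 features
                            SVC(kernel="rbf"))
        print(cross_val_score(clf, X, y, cv=5).mean())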

  11. Networked gamma radiation detection system for tactical deployment

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ronald; Smith, Ethan; Guss, Paul; Mitchell, Stephen

    2015-08-01

    A networked gamma radiation detection system with directional sensitivity and energy spectral data acquisition capability is being developed by the National Security Technologies, LLC, Remote Sensing Laboratory to support the close and intense tactical engagement of law enforcement who carry out counterterrorism missions. In the proposed design, three clusters of 2″ × 4″ × 16″ sodium iodide crystals (4 each) with digiBASE-E (for list-mode data collection) would be placed on the passenger side of a minivan. To enhance localization and facilitate rapid identification of isotopes, advanced smart real-time localization and radioisotope identification algorithms like WAVRAD (wavelet-assisted variance reduction for anomaly detection) and NSCRAD (nuisance-rejection spectral comparison ratio anomaly detection) will be incorporated. We will test a collection of algorithms and analyses that center on the problem of radiation detection with a distributed sensor network. We will study the basic characteristics of a radiation sensor network and focus on the trade-offs between false positive alarm rates, true positive alarm rates, and the time to detect multiple radiation sources in a large area. Empirical and simulation analyses of critical system parameters, such as the number of sensors, sensor placement, and sensor response functions, will be examined. This networked system will provide an integrated radiation detection architecture and framework with (i) a large, nationally recognized search database equivalent that would help generate a common operational picture in a major radiological crisis; (ii) robust reach-back connectivity so that search data can be evaluated by home teams; and, finally, (iii) the possibility of integrating search data from multi-agency responders.

  12. The Autonomous Pathogen Detection System

    SciTech Connect

    Dzenitis, J M; Makarewicz, A J

    2009-01-13

    We developed, tested, and now operate a civilian biological defense capability that continuously monitors the air for biological threat agents. The Autonomous Pathogen Detection System (APDS) collects, prepares, reads, analyzes, and reports results of multiplexed immunoassays and multiplexed PCR assays using Luminex xMAP technology and a flow cytometer. The mission we conduct is particularly demanding: continuous monitoring, multiple threat agents, high sensitivity, challenging environments, and ultimately extremely low false positive rates. Here, we introduce the mission requirements and metrics, show the system engineering and analysis framework, and describe the progress to date, including early development and current status.

  13. Behavioral economics without anomalies.

    PubMed Central

    Rachlin, H

    1995-01-01

    Behavioral economics is often conceived as the study of anomalies superimposed on a rational system. As research has progressed, anomalies have multiplied until little is left of rationality. Another conception of behavioral economics is based on the axiom that value is always maximized. It incorporates so-called anomalies either as conflicts between temporal patterns of behavior and the individual acts comprising those patterns or as outcomes of nonexponential time discounting. This second conception of behavioral economics is both empirically based and internally consistent. PMID:8551195

  14. Early India-Australia spreading history revealed by newly detected Mesozoic magnetic anomalies in the Perth Abyssal Plain

    NASA Astrophysics Data System (ADS)

    Williams, Simon E.; Whittaker, Joanne M.; Granot, Roi; Müller, Dietmar R.

    2013-07-01

    Seafloor within the Perth Abyssal Plain (PAP), offshore Western Australia, is the only section of crust that directly records the early spreading history between India and Australia during the Mesozoic breakup of Gondwana. However, this early spreading has been poorly constrained due to an absence of data, including marine magnetic anomalies and data constraining the crustal nature of key tectonic features. Here, we present new magnetic anomaly data from the PAP that show that the crust in the western part of the basin was part of the Indian Plate, the conjugate flank to the oceanic crust immediately offshore the Perth margin, Australia. We identify a sequence of M2 and older anomalies in the west PAP within crust that initially moved with the Indian Plate, formed at intermediate half-spreading rates (35 mm/yr), consistent with the conjugate sequence on the Australian Plate. More speculatively, we reinterpret the youngest anomalies in the east PAP, finding that the M0-age crust initially formed on the Indian Plate was transferred to the Australian Plate by a westward jump or propagation of the spreading ridge shortly after M0 time. Samples dredged from the Gulden Draak and Batavia Knolls (at the western edge of the PAP) reveal that these bathymetric features are continental fragments rather than igneous plateaus related to Broken Ridge. These microcontinents rifted away from Australia with Greater India during initial breakup at ~130 Ma, then rifted from India following the cessation of spreading in the PAP (~101-103 Ma).

  15. Does the Neptunian system of satellites challenge a gravitational origin for the Pioneer anomaly?

    NASA Astrophysics Data System (ADS)

    Iorio, L.

    2010-07-01

    If the Pioneer anomaly (PA) were a genuine dynamical effect of gravitational origin, it should also affect the orbital motions of the Solar System bodies moving in the space regions in which the PA manifested itself in its presently known form, i.e. as a constant and uniform acceleration approximately directed towards the Sun with a non-zero magnitude beyond 20 au from the Sun. In this paper we preliminarily investigate its effects on the orbital motions of the Neptunian satellites Triton, Nereid and Proteus, located at about 30 au from the Sun, both analytically and numerically. Extensive observational records covering several orbital revolutions have recently been analysed for them, notably improving the knowledge of their orbits. Both analytical and numerical calculations, limited to the direct Neptune-satellite interaction, show that the peak-to-peak amplitudes of the PA-induced radial, transverse and out-of-plane perturbations over one century are up to 300 km, 600 km and 8 m for Triton; 17,500 km, 35,000 km and 800 km for Nereid; and 60 km, 120 km and 30 m for Proteus. The corresponding orbital uncertainties obtained from a recent analysis of all the data available for the satellites considered are, in general, smaller by one to two orders of magnitude, although they were obtained without modelling a Pioneer-like extra force. Further investigations based on a reprocessing of the satellites' real or simulated data with modified equations of motion, including an additional Pioneer-type force, are worth implementing and may shed further light on this important issue.

  16. Detecting transition in agricultural systems

    NASA Technical Reports Server (NTRS)

    Neary, P. J.; Coiner, J. C.

    1979-01-01

    Remote sensing of agricultural phenomena has been largely concentrated on analysis of agriculture at the field level. Concern has been to identify crop status, crop condition, and crop distribution, all of which are spatially analyzed on a field-by-field basis. A more general level of abstraction is the agricultural system, or the complex of crops and other land cover that differentiate various agricultural economies. The paper reports on a methodology to assist in the analysis of the landscape elements of agricultural systems with Landsat digital data. The methodology involves tracing periods of photosynthetic activity for a fixed area. Change from one agricultural system to another is detected through shifts in the intensity and periodicity of photosynthetic activity as recorded in the radiometric return to Landsat. The Landsat-derived radiometric indicator of photosynthetic activity appears to provide the ability to differentiate agricultural systems from each other as well as from conterminous natural vegetation.

  17. Nucleic acid detection system and method for detecting influenza

    SciTech Connect

    Cai, Hong; Song, Jian

    2015-03-17

    The invention provides a rapid, sensitive and specific nucleic acid detection system which utilizes isothermal nucleic acid amplification in combination with a lateral flow chromatographic device, or DNA dipstick, for DNA-hybridization detection. The system of the invention requires no complex instrumentation or electronic hardware, and provides a low cost nucleic acid detection system suitable for highly sensitive pathogen detection. Hybridization to single-stranded DNA amplification products using the system of the invention provides a sensitive and specific means by which assays can be multiplexed for the detection of multiple target sequences.

  18. A comparison of classical and intelligent methods to detect potential thermal anomalies before the 11 August 2012 Varzeghan, Iran, earthquake (Mw = 6.4)

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-04-01

    In this paper, a number of classical and intelligent methods, including the interquartile range, the autoregressive integrated moving average (ARIMA), artificial neural networks (ANN) and support vector machines (SVM), have been proposed to quantify potential thermal anomalies around the time of the 11 August 2012 Varzeghan, Iran, earthquake (Mw = 6.4). The data set, comprising Aqua-MODIS land surface temperature (LST) night-time snapshot images, spans 62 days. In order to quantify variations of the LST data obtained from satellite images, the air temperature (AT) data derived from the meteorological station close to the earthquake epicenter have been taken into account. For the models examined here, the results indicate the following: (i) ARIMA models, which are the most widely used in the time series community for short-term forecasting, are quickly and easily implemented and can act efficiently through linear solutions. (ii) A multilayer perceptron (MLP) feed-forward neural network can be a suitable non-parametric method to detect the anomalous changes of a non-linear time series such as variations of LST. (iii) Since SVMs are often used due to their many advantages for classification and regression tasks, it can be shown that, if the difference between the value predicted using the SVM method and the observed value exceeds a pre-defined threshold, then the observed value can be regarded as an anomaly. (iv) ANN and SVM methods can be powerful tools for modeling complex phenomena such as earthquake-precursor time series, where we may not know what the underlying data-generating process is. There is good agreement in the results obtained from the different methods for quantifying potential anomalies in a given LST time series. This paper indicates that the detection of potential thermal anomalies derives credibility from the overall efficiencies and potentialities of the four integrated methods.
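    The prediction-residual rule in item (iii) above is generic enough to sketch: fit a regressor to lagged values of the series and flag observations whose residual exceeds a threshold. The code below is an illustration only, not the paper's setup; the lag count and the two-sigma threshold are assumed choices, and `series` is a hypothetical 1-D numpy array (e.g., an LST time series).

        # Flag points whose absolute prediction residual exceeds n_sigma times
        # the standard deviation of the training residuals (illustrative rule).
        import numpy as np
        from sklearn.svm import SVR

        def residual_anomalies(series, lags=3, n_sigma=2.0):
            # Build lagged design matrix: row j holds series[j..j+lags-1],
            # target y[j] is series[j+lags].
            X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
            y = series[lags:]
            model = SVR(kernel="rbf").fit(X, y)
            residuals = y - model.predict(X)
            threshold = n_sigma * residuals.std()
            return np.where(np.abs(residuals) > threshold)[0] + lags  # anomalous indices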

  19. A prototype implementation of a network-level intrusion detection system. Technical report number CS91-11

    SciTech Connect

    Heady, R.; Luger, G.F.; Maccabe, A.B.; Servilla, M.; Sturtevant, J.

    1991-05-15

    This paper presents the implementation of a prototype network-level intrusion detection system. The prototype system monitors base-level information in network packets (source, destination, packet size, time, and network protocol), learning the normal patterns and announcing anomalies as they occur. The goal of this research is to determine the applicability of current intrusion detection technology to the detection of network-level intrusions. In particular, the authors are investigating the possibility of using this technology to detect and react to worm programs.
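    The "learn the normal pattern, announce deviations" idea above can be illustrated with a simple per-feature baseline. The sketch below uses a mean/standard-deviation z-score rule over numeric packet features; it stands in for whatever model the prototype actually used, and the feature set, training data, and threshold are assumed for illustration.

        # Per-feature baseline of "normal" packet statistics with a z-score
        # anomaly rule; features, data, and threshold are illustrative only.
        import numpy as np

        class PacketBaseline:
            def fit(self, features):              # features: (n_packets, n_features)
                self.mean = features.mean(axis=0)
                self.std = features.std(axis=0) + 1e-9   # avoid division by zero
                return self

            def is_anomalous(self, packet, z_max=4.0):
                z = np.abs((packet - self.mean) / self.std)
                return bool((z > z_max).any())

        training = np.random.default_rng(0).normal(500, 50, (10_000, 2))
        baseline = PacketBaseline().fit(training)
        print(baseline.is_anomalous(np.array([900.0, 480.0])))  # True: far from normal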

  20. Compensated intruder-detection systems

    DOEpatents

    McNeilly, David R.; Miller, William R.

    1984-01-01

    Intruder-detection systems in which intruder-induced signals are transmitted through a medium also receive spurious signals induced by changes in a climatic condition affecting the medium. To combat this, signals received from the detection medium are converted to a first signal. The system also provides a reference signal proportional to climate-induced changes in the medium. The first signal and the reference signal are combined for generating therefrom an output signal which is insensitive to the climatic changes in the medium. An alarm is energized if the output signal exceeds a preselected value. In one embodiment, an acoustic cable is coupled to a fence to generate a first electrical signal proportional to movements thereof. False alarms resulting from wind-induced movements of the fence (detection medium) are eliminated by providing an anemometer-driven voltage generator to provide a reference voltage proportional to the velocity of wind incident on the fence. An analog divider receives the first electrical signal and the reference signal as its numerator and denominator inputs, respectively, and generates therefrom an output signal which is insensitive to the wind-induced movements in the fence.
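    The compensation described in the patent above amounts to dividing the cable signal by a wind-derived reference before thresholding. The sketch below is a minimal numeric illustration; the scaling constant and threshold are assumed values, not taken from the patent.

        # Divide the fence-cable signal by a reference proportional to wind
        # speed so wind-induced motion does not trip the alarm while a larger
        # intruder-induced signal still does. Constants are illustrative.
        def compensated_alarm(cable_signal, wind_speed, k_wind=0.5, threshold=3.0):
            reference = 1.0 + k_wind * wind_speed      # never zero, grows with wind
            return (cable_signal / reference) > threshold

        print(compensated_alarm(cable_signal=4.0, wind_speed=10.0))   # windy, no intruder -> False
        print(compensated_alarm(cable_signal=25.0, wind_speed=10.0))  # climbing intruder -> True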

  1. Capillary Electrophoresis - Optical Detection Systems

    SciTech Connect

    Sepaniak, M. J.

    2001-08-06

    Molecular recognition systems are developed via molecular modeling and synthesis to enhance separation performance in capillary electrophoresis and optical detection methods for capillary electrophoresis. The underpinning theme of our work is the rational design and development of molecular recognition systems for chemical separations and analysis. There have been, however, some subtle and exciting shifts in our research paradigm during this period. Specifically, we have moved from mostly separations research to a good balance between separations and spectroscopic detection for separations. This shift is based on our perception that the pressing research challenges and needs in capillary electrophoresis and electrokinetic chromatography relate to the persistent detection and flow-rate reproducibility limitations of these techniques (see page 1 of the accompanying Renewal Application for further discussion). In most of our work, molecular recognition reagents are employed to provide selectivity and enhance performance. Also, an emerging trend is the use of these reagents with specially prepared nano-scale materials. Although not part of our DOE BES-supported work, the modeling and synthesis of new receptors have indirectly supported the development of novel microcantilever-based MEMS for the sensing of vapor- and liquid-phase analytes. This fortuitous overlap is briefly covered in this report. Several of the more significant publications that have resulted from our work are appended. To facilitate brevity, we refer to these publications liberally in this progress report. Reference is also made to very recent work in the Background and Preliminary Studies section of the Renewal Application.

  2. Müllerian anomalies.

    PubMed

    Gell, Jennifer S

    2003-11-01

    The reproductive organs in both males and females consist of gonads, internal ductal structures, and external genitalia. Normal sexual differentiation is dependent on the genetic sex determined by the presence or absence of the Y chromosome at fertilization. Testes develop under the influence of the Y chromosome and ovaries develop when no Y chromosome is present. In the absence of testes and their normal hormonal products, sexual differentiation proceeds along the female pathway, resulting in a normal female phenotype. Anatomic gynecologic anomalies occur when there is failure of normal embryologic ductal development. These anomalies include congenital absence of the vagina as well as defects in lateral and vertical fusion of the Müllerian ducts. Treatment of müllerian anomalies begins with the correct identification of the anomaly and an understanding of the embryologic origin. This includes evaluation for other associated anomalies such as renal or skeletal abnormalities. After correct identification, treatment options include nonsurgical as well as surgical intervention. This chapter serves to review the embryology and development of the reproductive system and to describe common genital tract anomalies. Details of surgical or nonsurgical correction of these anomalies are presented. PMID:14724770

  3. Fish detection and classification system

    NASA Astrophysics Data System (ADS)

    Tidd, Richard A.; Wilder, Joseph

    2001-01-01

    Marine biologists traditionally determine the presence and quantities of different types of fish by dragging nets across the bottom and examining their contents. This method, although accurate, kills the collected fish, damages their habitat, and consumes large quantities of resources. This paper presents an alternative: a machine vision system capable of determining the presence of fish species. Illumination presents a unique problem in this environment, and the design of an effective illumination system is discussed. The related issues of object orientation and measurement are also discussed and resolved. Capturing images of fish in murky water also presents challenges. An adaptive thresholding technique is required to appropriately segment the fish from the background in these images. Mode detection and histogram analysis are useful tools in determining these localized thresholds. It is anticipated that this system, created in conjunction with the Rutgers Institute for Marine and Coastal Science, will effectively classify fish in the estuarine environment.
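    The record above calls for localized thresholds rather than a single global one. The sketch below shows a generic local-mean adaptive threshold with scipy; it is not the cited system's method, and the window size, offset, and the assumption that fish appear darker than the background are illustrative choices.

        # Adaptive (local-mean) thresholding: each pixel is compared to the
        # mean of its neighbourhood rather than to one global threshold.
        import numpy as np
        from scipy.ndimage import uniform_filter

        def adaptive_segment(image, window=51, offset=5.0):
            """Return a boolean mask of pixels darker than their local mean
            minus an offset (fish assumed darker than the murky background)."""
            local_mean = uniform_filter(image.astype(float), size=window)
            return image < (local_mean - offset)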

  4. Ionization detection system for aerosols

    DOEpatents

    Jacobs, Martin E.

    1977-01-01

    This invention relates to an improved smoke-detection system of the ionization-chamber type. In the preferred embodiment, the system utilizes a conventional detector head comprising a measuring ionization chamber, a reference ionization chamber, and a normally non-conductive gas triode for discharging when a threshold concentration of airborne particulates is present in the measuring chamber. The improved system utilizes a measuring ionization chamber which is modified to minimize false alarms and reductions in sensitivity resulting from changes in ambient temperature. In the preferred form of the modification, an annular radiation shield is mounted about the usual radiation source provided to effect ionization in the measuring chamber. The shield is supported by a bimetallic strip which flexes in response to changes in ambient temperature, moving the shield relative to the source so as to vary the radiative area of the source in a manner offsetting temperature-induced variations in the sensitivity of the chamber.

  5. Pulsed helium ionization detection system

    DOEpatents

    Ramsey, Roswitha S.; Todd, Richard A.

    1987-01-01

    A helium ionization detection system is provided which produces stable operation of a conventional helium ionization detector while providing improved sensitivity and linearity. Stability is improved by applying a pulsed dc supply voltage across the ionization detector, thereby modifying the sampling of the detector's output current. A unique pulse generator is used to supply pulsed dc to the detector; it has variable width and interval adjustment features that allow up to 500 V to be applied in pulse widths ranging from about 150 nsec to essentially dc conditions.

  6. Pulsed helium ionization detection system

    DOEpatents

    Ramsey, R.S.; Todd, R.A.

    1985-04-09

    A helium ionization detection system is provided which produces stable operation of a conventional helium ionization detector while providing improved sensitivity and linearity. Stability is improved by applying a pulsed dc supply voltage across the ionization detector, thereby modifying the sampling of the detector's output current. A unique pulse generator is used to supply pulsed dc to the detector; it has variable width and interval adjustment features that allow up to 500 V to be applied in pulse widths ranging from about 150 nsec up to essentially dc conditions.

  7. Infrared trace element detection system

    DOEpatents

    Bien, Fritz; Bernstein, Lawrence S.; Matthew, Michael W.

    1988-01-01

    An infrared trace element detection system including an optical cell into which the sample fluid to be examined is introduced and removed. Also introduced into the optical cell is a sample beam of infrared radiation in a first wavelength band which is significantly absorbed by the trace element and a second wavelength band which is not significantly absorbed by the trace element for passage through the optical cell through the sample fluid. The output intensities of the sample beam of radiation are selectively detected in the first and second wavelength bands. The intensities of a reference beam of the radiation are similarly detected in the first and second wavelength bands. The sensed output intensity of the sample beam in one of the first and second wavelength bands is normalized with respect to the other and similarly, the intensity of the reference beam of radiation in one of the first and second wavelength bands is normalized with respect to the other. The normalized sample beam intensity and normalized reference beam intensity are then compared to provide a signal from which the amount of trace element in the sample fluid can be determined.
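
    The two-band normalization scheme described above reduces to a simple ratio comparison. The sketch below is a hedged illustration only: converting the compared ratios into a concentration via the Beer-Lambert law, and the epsilon and path_length constants, are my assumptions and are not specified in the patent.

    ```python
    # Hedged sketch of the two-band ratio scheme: normalize the absorbed band by
    # the non-absorbed band for both sample and reference beams, then compare the
    # two normalized intensities. Deriving a concentration via Beer-Lambert is an
    # added assumption; epsilon and path_length are hypothetical calibration values.
    import math

    def trace_concentration(sample_abs, sample_ref_band, ref_abs, ref_ref_band,
                            epsilon=1.0, path_length=1.0):
        sample_ratio = sample_abs / sample_ref_band     # normalized sample beam
        reference_ratio = ref_abs / ref_ref_band        # normalized reference beam
        transmittance = sample_ratio / reference_ratio  # comparison of the two
        # Beer-Lambert: absorbance = -log10(T) = epsilon * concentration * path_length
        return -math.log10(transmittance) / (epsilon * path_length)

    print(trace_concentration(0.80, 1.00, 0.95, 1.00))  # example intensities only
    ```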

  8. Infrared trace element detection system

    DOEpatents

    Bien, F.; Bernstein, L.S.; Matthew, M.W.

    1988-11-15

    An infrared trace element detection system includes an optical cell into which the sample fluid to be examined is introduced and removed. Also introduced into the optical cell is a sample beam of infrared radiation in a first wavelength band which is significantly absorbed by the trace element and a second wavelength band which is not significantly absorbed by the trace element for passage through the optical cell through the sample fluid. The output intensities of the sample beam of radiation are selectively detected in the first and second wavelength bands. The intensities of a reference beam of the radiation are similarly detected in the first and second wavelength bands. The sensed output intensity of the sample beam in one of the first and second wavelength bands is normalized with respect to the other and similarly, the intensity of the reference beam of radiation in one of the first and second wavelength bands is normalized with respect to the other. The normalized sample beam intensity and normalized reference beam intensity are then compared to provide a signal from which the amount of trace element in the sample fluid can be determined. 11 figs.

  9. Enzyme leaching of surficial geochemical samples for detecting hydromorphic trace-element anomalies associated with precious-metal mineralized bedrock buried beneath glacial overburden in northern Minnesota

    USGS Publications Warehouse

    Clark, Robert J.; Meier, A.L.; Riddle, G.

    1990-01-01

    One objective of the International Falls and Roseau, Minnesota, CUSMAP projects was to develop a means of conducting regional-scale geochemical surveys in areas where bedrock is buried beneath complex glacially derived overburden. Partial analysis of B-horizon soils offered hope for detecting subtle hydromorphic trace-element dispersion patterns. An enzyme-based partial leach selectively removes metals from oxide coatings on the surfaces of soil materials without attacking their matrix. Most trace-element concentrations in the resulting solutions are in the part-per-trillion to low part-per-billion range, necessitating determinations by inductively coupled plasma/mass spectrometry. The resulting data show greater contrasts for many trace elements than with other techniques tested. Spatially, many trace metal anomalies are locally discontinuous, but anomalous trends within larger areas are apparent. In many instances, the source for an anomaly seems to be either basal till or bedrock. Ground water flow is probably the most important mechanism for transporting metals toward the surface, although ionic diffusion, electrochemical gradients, and capillary action may play a role in anomaly dispersal. Sample sites near the Rainy Lake-Seine River fault zone, a regional shear zone, often have anomalous concentrations of a variety of metals, commonly including Zn and/or one or more metals which substitute for Zn in sphalerite (Cd, Ge, Ga, and Sn). Shifts in background concentrations of Bi, Sb, and As show a trend across the area indicating a possible regional zoning of lode-Au mineralization. Soil anomalies of Ag, Co, and Tl parallel basement structures, suggesting areas that may have potential for Cobalt/Thunder Bay-type silver veins. An area around Baudette, Minnesota, which is underlain by quartz-chlorite-carbonate-altered shear zones, is anomalous in Ag, As, Bi, Co, Mo, Te, Tl, and W. Anomalies of Ag, As, Bi, Te, and W tend to follow the fault zones, suggesting potential

  10. Explosives detection system and method

    DOEpatents

    Reber, Edward L.; Jewell, James K.; Rohde, Kenneth W.; Seabury, Edward H.; Blackwood, Larry G.; Edwards, Andrew J.; Derr, Kurt W.

    2007-12-11

    A method of detecting explosives in a vehicle includes providing a first rack on one side of the vehicle, the rack including a neutron generator and a plurality of gamma ray detectors; providing a second rack on another side of the vehicle, the second rack including a neutron generator and a plurality of gamma ray detectors; providing a control system, remote from the first and second racks, coupled to the neutron generators and gamma ray detectors; using the control system, causing the neutron generators to generate neutrons; and performing gamma ray spectroscopy on spectra read by the gamma ray detectors to look for a signature indicative of presence of an explosive. Various apparatus and other methods are also provided.

  11. Evaluation of Intrusion Detection Systems

    PubMed Central

    Ulvila, Jacob W.; Gaffney, John E.

    2003-01-01

    This paper presents a comprehensive method for evaluating intrusion detection systems (IDSs). It integrates and extends ROC (receiver operating characteristic) and cost analysis methods to provide an expected cost metric. Results are given for determining the optimal operation of an IDS based on this expected cost metric. Results are given for the operation of a single IDS and for a combination of two IDSs. The method is illustrated for: 1) determining the best operating point for a single and double IDS based on the costs of mistakes and the hostility of the operating environment as represented in the prior probability of intrusion and 2) evaluating single and double IDSs on the basis of expected cost. A method is also described for representing a compound IDS as an equivalent single IDS. Results are presented from the point of view of a system administrator, but they apply equally to designers of IDSs.
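
    The expected-cost idea summarized above can be made concrete with a short sketch: for each candidate ROC operating point, combine the false-alarm and missed-detection rates with the prior probability of intrusion and the costs of the two kinds of mistakes, then choose the cheapest point. The cost values, prior, and ROC points below are invented for illustration and are not taken from the paper.

    ```python
    # Illustrative expected-cost selection of an IDS operating point from its ROC
    # curve. All numbers are made up; only the structure of the calculation
    # follows the expected-cost idea summarized above.
    def expected_cost(fpr, tpr, p_intrusion, cost_missed, cost_false_alarm):
        # Missed intrusions occur with rate (1 - TPR); false alarms with rate FPR.
        return (p_intrusion * (1 - tpr) * cost_missed
                + (1 - p_intrusion) * fpr * cost_false_alarm)

    roc_points = [(0.01, 0.60), (0.05, 0.85), (0.20, 0.97)]   # (FPR, TPR) examples
    best = min(roc_points, key=lambda pt: expected_cost(pt[0], pt[1],
                                                        p_intrusion=0.001,
                                                        cost_missed=1000.0,
                                                        cost_false_alarm=1.0))
    print("lowest expected-cost operating point (FPR, TPR):", best)
    ```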

  12. On strontium isotopic anomalies and odd-A p-process abundances. [in solar system

    NASA Technical Reports Server (NTRS)

    Clayton, D. D.

    1978-01-01

    Several aspects of the nucleosynthesis of Sr isotopes are considered in an attempt to shed light on the problem of the Sr isotopic anomalies discovered in an inclusion of the Allende meteorite. Decomposition of the Sr isotopes into average r-, s-, and p-process nucleosynthetic classes is performed. It is suggested that the Allende inclusion most likely has an excess of s-process Sr and that the initial Sr-87/Sr-86 isotopic ratio is probably slightly more primitive than basaltic achondrites. The results also show that Sn-115 is mostly due to the r-process and that odd-A yields are very small. It is concluded that if the Sr anomaly in the inclusion is an average s enhancement, it argues somewhat in favor of a model of gas/dust fractionation of s and r isotopes during accumulation of the inclusion parent in the protosolar cloud.

  13. Recent advances in microfluidic detection systems

    PubMed Central

    Baker, Christopher A; Duong, Cindy T; Grimley, Alix; Roper, Michael G

    2009-01-01

    There are numerous detection methods available for microfluidic systems, and these methods are being put to use for detection on miniaturized systems, with the analyte of interest driving the choice of detection method. In this article, we summarize microfluidic detection methods reported over the past 2 years. More focus is given to unconventional approaches to detection and to novel strategies for performing high-sensitivity detection. PMID:20414455

  14. Prevalence of minor musculoskeletal anomalies in children with congenital hypothyroidism.

    PubMed

    El Kholy, Mohamed; Fahmi, Marwa E; Nassar, Ayman E; Selim, Samia; Elsedfy, Heba H

    2007-01-01

    In the last decade a high frequency of extrathyroidal congenital anomalies has been reported in infants with congenital hypothyroidism (CH) detected by neonatal screening. In the present study the occurrence of additional congenital malformations (CM) in a cohort of children with confirmed primary CH due to thyroid dysgenesis was investigated. A high prevalence of extrathyroidal major congenital anomalies (15.9%), more than 5-fold higher than that reported in the Egyptian population (2.7%), was found. The cardiac and musculoskeletal systems were the most commonly involved, comprising 9.09 and 47.72% of all anomalies, respectively. The high prevalence of musculoskeletal anomalies in this study was mostly due to minor anomalies such as brachydactyly and digitalization of thumbs. The type of dysgenesis (i.e. aplastic, ectopic or hypoplastic) as well as the severity of hypothyroidism, as assessed by TSH and T(4) levels at diagnosis, had no relation to the occurrence of extrathyroidal abnormalities. PMID:17587855

  15. Large negative Ti-50 anomalies in refractory inclusions from the Murchison carbonaceous chondrite - Evidence for incomplete mixing of neutron-rich supernova ejecta into the solar system

    NASA Technical Reports Server (NTRS)

    Hinton, Richard W.; Davis, Andrew M.; Scatena-Wachel, Debra E.

    1987-01-01

    An ion microprobe was used to measure Ti-50 variations in hibonite-rich inclusions from the Murchison chondrite. Both deficits and excesses of the isotope were found, depending on the inclusion being scanned. The anomalies were not correlated with the mineralogy, chemical composition, other isotopic anomalies of Ti, etc. The lack of correlations indicates that the cosmic chemical memory model (Clayton, 1981) cannot explain the observed variations. The Ti-50 concentrations may have originated when a supernova explosion triggered the collapse of a molecular cloud that formed the solar system. The solar system Ti-50 anomalies were from the cloud, not the progenitor star.

  16. Geoelectrical Characterization of the Punta Banda System: A Possible Structural Control for the Geothermal Anomalies

    NASA Astrophysics Data System (ADS)

    Arango-Galvan, C.; Flores-Marquez, E.; Prol-Ledesma, R.; Working Group, I.

    2007-05-01

    The lack of sufficient drinking water in México has become a very serious problem, especially in the northern desert regions of the country. In order to provide a real solution to this problem, the IMPULSA research program was created to develop novel technologies based on desalination of sea and brackish water using renewable sources of energy. The Punta Banda geothermal anomaly is located in the northern part of the Baja California Peninsula (Mexico). High water temperatures in some wells along the coast revealed a geothermal anomaly. An audiomagnetotelluric survey was carried out in the area as a preliminary study, both to understand the process generating these anomalous temperatures and to assess its potential exploitation to supply hot water to desalination plants. Among the electromagnetic methods, the audiomagnetotelluric (AMT) method is appropriate for deep groundwater and geothermal studies. The survey consisted of 27 AMT stations covering a 5 km profile along the Agua Blanca Fault. The employed array allowed us to characterize the geoelectrical properties of the main structures up to 500 m depth. Two main geoelectrical zones were identified: 1) a shallow low-resistivity medium located at the central portion of the profile, coinciding with the Maneadero valley, and 2) two high-resistivity structures bordering the conductive zone, possibly related to NS faulting already identified by previous geophysical studies. These results suggest that the main geothermal anomalies are controlled by the dominant structural regime in the zone.

  17. Toward Baseline Software Anomalies in NASA Missions

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Zelkowitz, Marvin; Basili, Victor; Nikora, Allen P.

    2012-01-01

    In this fast abstract, we provide preliminary findings from an analysis of 14,500 spacecraft anomalies from unmanned NASA missions. We provide some baselines for the distributions of software vs. non-software anomalies in spaceflight systems, the risk ratings of software anomalies, and the corrective actions associated with software anomalies.

  18. Electrochemical anomalies of protic ionic liquid - Water systems: A case study using ethylammonium nitrate - Water system

    NASA Astrophysics Data System (ADS)

    Abe, Hiroshi; Nakama, Kazuya; Hayashi, Ryotaro; Aono, Masami; Takekiyo, Takahiro; Yoshimura, Yukihiro; Saihara, Koji; Shimizu, Akio

    2016-08-01

    Electrochemical impedance spectroscopy was used to evaluate protic ionic liquid (pIL)-water mixtures in the temperature range of −35 to 25 °C. The pIL used in this study was ethylammonium nitrate (EAN). At room temperature, a resonant mode of conductivity was observed in the high-frequency region. The anomalous conductivity disappeared once solidification occurred at low temperatures. The kinetic pH of the EAN-water system was investigated at a fixed temperature. Rhythmic pH oscillations in the EAN-H2O mixtures were induced at 70 < x < 90 mol% H2O. The electrochemical instabilities in an EAN-water mixture arise in an intermediate state between the pIL and bulk water. From ab initio calculations, it was observed that the dipole moment of the EAN-water complex shows a discrete jump at around 85 mol% H2O. The water-mediated hydrogen-bonding network changes drastically at the crossover concentration.

  19. DOWN'S ANOMALY.

    ERIC Educational Resources Information Center

    PENROSE, L.S.; SMITH, G.F.

    BOTH CLINICAL AND PATHOLOGICAL ASPECTS AND MATHEMATICAL ELABORATIONS OF DOWN'S ANOMALY, KNOWN ALSO AS MONGOLISM, ARE PRESENTED IN THIS REFERENCE MANUAL FOR PROFESSIONAL PERSONNEL. INFORMATION PROVIDED CONCERNS (1) HISTORICAL STUDIES, (2) PHYSICAL SIGNS, (3) BONES AND MUSCLES, (4) MENTAL DEVELOPMENT, (5) DERMATOGLYPHS, (6) HEMATOLOGY, (7)…

  20. A Portable Infrasonic Detection System

    NASA Technical Reports Server (NTRS)

    Shams, Qamar A.; Burkett, Cecil G.; Zuckerwar, Allan J.; Lawrenson, Christopher C.; Masterman, Michael

    2008-01-01

    During the last couple of years, NASA Langley has designed and developed a portable infrasonic detection system which can be used to make useful infrasound measurements at locations where such measurements were not previously possible. The system comprises an electret condenser microphone, having a 3-inch membrane diameter, and a small, compact windscreen. Electret-based technology offers the lowest possible background noise, because Johnson noise generated in the supporting electronics (preamplifier) is minimized. The microphone features a high membrane compliance with a large backchamber volume, a prepolarized backplane, and a high-impedance preamplifier located inside the backchamber. The windscreen, based on the high transmission coefficient of infrasound through matter, is made of a material having a low acoustic impedance and a sufficiently thick wall to ensure structural stability. Closed-cell polyurethane foam has been found to serve the purpose well. In the proposed test, the test parameters will be sensitivity, background noise, signal fidelity (harmonic distortion), and temporal stability. The design and results of the compact system, based upon laboratory and field experiments, will be presented.

  1. Photoelectric detection system. [manufacturing automation

    NASA Technical Reports Server (NTRS)

    Currie, J. R.; Schansman, R. R. (Inventor)

    1982-01-01

    A photoelectric beam system for the detection of the arrival of an object at a discrete station wherein artificial light, natural light, or no light may be present is described. A signal generator turns on and off a signal light at a selected frequency. When the object in question arrives on station, ambient light is blocked by the object, and the light from the signal light is reflected onto a photoelectric sensor which has a delayed electrical output but is of the frequency of the signal light. Outputs from both the signal source and the photoelectric sensor are fed to inputs of an exclusive-OR detector which provides as an output the difference between them. The difference signal is a small-width pulse occurring at the frequency of the signal source. By filter means, this signal is distinguished from those responsive to sunlight, darkness, or 120 Hz artificial light. In this fashion, the presence of an object is positively established.
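
    The exclusive-OR scheme lends itself to a small simulation. The sketch below is only an illustration of the principle: the sensor is modeled as a delayed copy of the modulated source when an object is present, and as a constant level under steady ambient light or darkness; all timing values are arbitrary and not taken from the patent.

    ```python
    # Illustrative simulation of the exclusive-OR detection scheme. When an
    # object reflects the modulated signal light, the sensor output is a slightly
    # delayed copy of the source wave, so XORing the two yields narrow pulses
    # (width = sensor delay) at the source frequency. Steady ambient light or
    # darkness instead produces wide pulses that a filter can reject.
    def xor_signal(source, sensor):
        return [s ^ r for s, r in zip(source, sensor)]

    def widest_pulse(bits):
        best = run = 0
        for b in bits:
            run = run + 1 if b else 0
            best = max(best, run)
        return best

    n, half_period, delay = 200, 10, 2
    source = [(t // half_period) % 2 for t in range(n)]      # modulated signal light
    cases = {
        "object present": [0] * delay + source[:-delay],     # delayed reflection
        "steady ambient light": [1] * n,
        "darkness": [0] * n,
    }
    for name, sensor in cases.items():
        print(f"{name}: widest XOR pulse = {widest_pulse(xor_signal(source, sensor))} samples")
    # Only the "object present" case produces narrow (2-sample) pulses.
    ```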

  2. Planetary system detection by POINTS

    NASA Technical Reports Server (NTRS)

    Reasenberg, Robert D.

    1993-01-01

    The final report and semiannual reports 1, 2, and 3 in response to the study of 'Planetary System Detection by POINTS' are presented. The grant covered the period from 15 Jun. 1988 through 31 Dec. 1989. The work during that period comprised the further development and refinement of the POINTS concept. The status of the POINTS development at the end of the grant period was described by Reasenberg in a paper given at the JPL Workshop on Space Interferometry, 12-13 Mar. 1990, and distributed as CfA Preprint 3138. That paper, 'POINTS: a Small Astrometric Interferometer,' follows as Appendix A. Our proposal P2276-7-09, dated July 1990, included a more detailed description of the state of the development of POINTS at the end of the tenure of Grant NAGW-1355. That proposal, which resulted in Grant NAGW-2497, is included by reference.

  3. Using a combination of MLPA kits to detect chromosomal imbalances in patients with multiple congenital anomalies and mental retardation is a valuable choice for developing countries.

    PubMed

    Jehee, Fernanda Sarquis; Takamori, Jean Tetsuo; Medeiros, Paula F Vasconcelos; Pordeus, Ana Carolina B; Latini, Flavia Roche M; Bertola, Débora Romeo; Kim, Chong Ae; Passos-Bueno, Maria Rita

    2011-01-01

    Conventional karyotyping detects anomalies in 3-15% of patients with multiple congenital anomalies and mental retardation (MCA/MR). Whole-genome array screening (WGAS) has been consistently suggested as the first-choice diagnostic test for this group of patients, but it is very costly for large-scale use in developing countries. We evaluated the use of a combination of Multiplex Ligation-dependent Probe Amplification (MLPA) kits to increase the detection rate of chromosomal abnormalities in MCA/MR patients. We screened 261 MCA/MR patients with two subtelomeric and one microdeletion kits. This would theoretically detect up to 70% of all submicroscopic abnormalities. Additionally, we scored the de Vries score for 209 patients in an effort to find a suitable cut-off for MLPA screening. Our results reveal that chromosomal abnormalities were present in 87 (33.3%) patients, but only 57 (21.8%) were considered causative. Karyotyping detected 15 abnormalities (6.9%), while MLPA identified 54 (20.7%). Our combined MLPA screening raised the total number of detected pathogenic imbalances more than three times when compared to conventional karyotyping. We also show that using the de Vries score as a cut-off for this screening would only be suitable under financial restrictions. A decision analytic model was constructed with three possible strategies: karyotype, karyotype + MLPA, and karyotype + WGAS. The karyotype + MLPA strategy detected anomalies in 19.8% of cases, which accounts for 76.45% of the expected yield of karyotype + WGAS. The incremental cost-effectiveness ratio (ICER) of MLPA is three times lower than that of WGAS, which means that, for the same costs, we have three additional diagnoses with MLPA but only one with WGAS. We list all causative alterations found, including rare findings, such as reciprocal duplications of regions deleted in Sotos and Williams-Beuren syndromes. We also describe imbalances that were considered polymorphisms or rare variants, such as the new SNP

  4. The isotopic homogeneity in the early solar system: Revisiting the CAI oxygen isotopic anomaly

    NASA Astrophysics Data System (ADS)

    Ozima, M.; Yamada, A.

    2009-12-01

    Since the first discovery of mass-independently fractionated oxygen isotopes in anhydrous, high-temperature Ca-Al-rich inclusion minerals in carbonaceous meteorites (CAIs) by Clayton et al. (1), their common occurrence in primitive meteorites has generally been regarded to reflect some fundamental process prevalent in the early solar nebula. The CAI oxygen isotopic composition is uniquely characterized by (i) large mass-independent isotopic fractionation and (ii) isotopic data that, in an oxygen three-isotope plot (δ17O vs. δ18O, where δ17O ≡ [(17O/16O)/(17O/16O)SMOW − 1] × 1000), yield nearly a straight line with a slope of 1.0. In establishing these characteristics, ion microprobe analyses have played a central role; an isotopic mapping technique (isotopography) was especially crucial (e.g., 2). The extraordinary oxygen isotopic ratio in CAIs is widely attributed to the self-shielding absorption of UV radiation in CO, one of the dominant chemical compounds in the early solar nebula (3). However, the self-shielding scenario necessarily leads to the unusual prediction that the mean solar oxygen isotopic composition differs from that of most planetary bodies, including Earth, the Moon, and Mars. If the self-shielding process were indeed responsible for the CAI oxygen isotopic anomaly, this would require a fundamental revision of the current theory of the origin of the solar system, which generally assumes the initial total vaporization of nebula material to give rise to isotopic homogenization. The GENESIS mission launched in 2001 (4), which collected oxygen in the solar wind, was hoped to resolve the isotopic composition of the Sun. However, because of difficulties in correcting for instrumental and, more importantly, for intrinsic isotopic fractionation between the SW and the Sun, a final answer is yet to be seen (5). Here, we show on the basis of oxygen isotopic fractionation systematics that the self-shielding hypothesis cannot explain the key characteristics of the CAI oxygen

  5. New tool to detect operation anomalies on automatic voltage regulator equipment of large power units; Generator simulator (GS)

    SciTech Connect

    Blanchet, P. )

    1990-01-01

    When large generating plants are installed on sites remote from the consumer areas, the operation of the network with correct margins of stability is conditioned by adjustment of the automatic voltage regulator (AVR). Any abnormal deviation in normal operation, and especially in abnormal runs, must be detected at the first overhaul or first shutdown. This new tool, the generator simulator (GS), then contributes, without delay, to minimizing the time necessary for failure investigation and to requalifying AVR equipment after repair. The two main objectives of this paper are: to qualify the AVR performance of a power unit during the scheduled overhaul; and to streamline failure investigation in the AVR system, avoiding unnecessary dismantling during a fortuitous unit shutdown.

  6. NADIR: A prototype system for detecting network and file system abuse

    SciTech Connect

    Hochberg, J.G.; Jackson, K.A.; Stallings, C.A.; McClary, J.F.; DuBois, D.H.; Ford, J.R.

    1992-10-01

    This paper describes the design of a prototype computer misuse detection system for the Los Alamos National Laboratory's Integrated Computing Network (ICN). This automated expert system, the Network Anomaly Detection and Intrusion Reporter (NADIR), streamlines and supplements the manual audit record review traditionally performed by security auditors. NADIR compares network activity, as summarized in weekly profiles of individual users and the ICN as a whole, against expert rules that define security policy, improper or suspicious behavior, and normal user activity. NADIR reports suspicious behavior to security auditors and provides tools to aid in follow-up investigations. This paper describes analysis by NADIR of two types of ICN activity: user authentication and access control, and mass file storage. It highlights system design issues of data handling, exploiting existing auditing systems, and performing audit analysis at the network level.
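
    The profile-versus-rules comparison that NADIR is described as performing can be sketched as follows. The profile fields, thresholds, and rules below are invented for illustration; they are not the actual NADIR rule base, which encodes site security policy and expert knowledge.

    ```python
    # Hedged sketch of rule-based review of weekly per-user activity profiles:
    # each profile is checked against expert rules and violations are flagged
    # for a security auditor. All fields, values, and thresholds are illustrative.
    weekly_profiles = {
        "alice": {"failed_logins": 2,  "off_hours_logins": 1, "mass_storage_mb": 120},
        "bob":   {"failed_logins": 37, "off_hours_logins": 9, "mass_storage_mb": 8000},
    }

    rules = [
        ("excessive failed logins",   lambda p: p["failed_logins"] > 20),
        ("unusual off-hours access",  lambda p: p["off_hours_logins"] > 5),
        ("abnormal mass-storage use", lambda p: p["mass_storage_mb"] > 5000),
    ]

    for user, profile in weekly_profiles.items():
        hits = [name for name, rule in rules if rule(profile)]
        if hits:
            print(f"flag {user} for review: {', '.join(hits)}")
    ```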

  7. NADIR: A prototype system for detecting network and file system abuse

    SciTech Connect

    Hochberg, J.G.; Jackson, K.A.; Stallings, C.A.; McClary, J.F.; DuBois, D.H.; Ford, J.R.

    1992-01-01

    This paper describes the design of a prototype computer misuse detection system for the Los Alamos National Laboratory's Integrated Computing Network (ICN). This automated expert system, the Network Anomaly Detection and Intrusion Reporter (NADIR), streamlines and supplements the manual audit record review traditionally performed by security auditors. NADIR compares network activity, as summarized in weekly profiles of individual users and the ICN as a whole, against expert rules that define security policy, improper or suspicious behavior, and normal user activity. NADIR reports suspicious behavior to security auditors and provides tools to aid in follow-up investigations. This paper describes analysis by NADIR of two types of ICN activity: user authentication and access control, and mass file storage. It highlights system design issues of data handling, exploiting existing auditing systems, and performing audit analysis at the network level.

  8. Temperature dependence of the zero-bias anomaly in the Anderson-Hubbard model: insights from an ensemble of two-site systems

    NASA Astrophysics Data System (ADS)

    Wortis, R.; Atkinson, W. A.

    2011-03-01

    Motivated by experiments on doped transition metal oxides, this paper considers the interplay of interactions, disorder, kinetic energy and temperature in a simple system. An ensemble of two-site Anderson-Hubbard model systems has already been shown to display a zero-bias anomaly (Wortis and Atkinson 2010 Phys. Rev. B 82 073107) which shares features with that found in the two-dimensional Anderson-Hubbard model (Chiesa et al 2008 Phys. Rev. Lett. 101 086401). Here the temperature dependence of the density of states of this ensemble is examined. In the atomic limit, there is no zero-bias anomaly at zero temperature, but one develops at small nonzero temperatures. With hopping, small temperatures augment the zero-temperature kinetic-energy-driven zero-bias anomaly, while at larger temperatures the anomaly is filled in.

  9. Detection and characterization of transient forcing episodes affecting earthquake activity in the Aleutian Arc system

    NASA Astrophysics Data System (ADS)

    Reverso, T.; Marsan, D.; Helmstetter, A.

    2015-02-01

    Crustal, slow deformation transients can be caused by fluid or magmatic intrusions, and by slow slip on faults. They can affect earthquake dynamics, if they occur close to or within seismically active zones. We here further develop, and test, a statistical method for detecting and characterizing seismicity anomalies that is only based on earthquake occurrence times and locations. We make use of this method to analyze the 2004-2013 seismicity at mc = 3.5 in the Aleutian subduction system, to find six statistically significant anomalies, with typical 1 day duration and 30 to 50 km size, that are likely related to slow deformation transients. They tend to be located in zones characterized by intermediate seismic coupling, and to mark the termination of past large to mega-thrust earthquakes. These anomalies account for a non-negligible (9%) part of the total activity, proving that non-stationary aseismic loading plays an important role in the dynamics of crustal deformation.
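
    As a much-simplified illustration of flagging rate anomalies from occurrence times alone, the sketch below compares daily event counts against a stationary Poisson background rate. The published method is more elaborate (it accounts for aftershock triggering before declaring a transient); the background rate, window length, and significance threshold here are arbitrary, and the catalog is synthetic.

    ```python
    # Simplified seismicity-rate anomaly flagging: compare daily counts against a
    # stationary Poisson background rate. This is a toy stand-in for the authors'
    # statistical method, shown only to illustrate the general idea.
    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(1)
    background_rate = 2.0                       # assumed known events per day
    daily_counts = rng.poisson(background_rate, size=365)
    daily_counts[200] = 15                      # inject a one-day swarm

    # Flag days whose count is improbably high under the background rate.
    p_values = poisson.sf(daily_counts - 1, background_rate)   # P(N >= observed count)
    threshold = 0.001 / len(daily_counts)                      # crude Bonferroni correction
    anomalous_days = np.where(p_values < threshold)[0]
    print("anomalous days:", anomalous_days.tolist())
    ```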

  10. Trouble Brewing: Using Observations of Invariant Behavior to Detect Malicious Agency in Distributed Control Systems

    NASA Astrophysics Data System (ADS)

    McEvoy, Thomas Richard; Wolthusen, Stephen D.

    Recent research on intrusion detection in supervisory control and data acquisition (SCADA) and DCS systems has focused on anomaly detection at the protocol level, based on the well-defined nature of traffic on such networks. Here, we consider attacks which compromise sensors or actuators (including physical manipulation), where intrusion may not be readily apparent, as data and computational states can be controlled to give an appearance of normality, and sensor and control systems have limited accuracy. To counter these, we propose to consider indirect relations between sensor readings to detect such attacks through concurrent observations as determined by control laws and constraints.
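
    The invariant-checking idea can be illustrated with a toy process model: even if an attacker forges individual sensor values to look plausible, concurrent readings must still satisfy known physical relations within the sensors' accuracy. The mass-balance invariant and tolerance below are invented for illustration and are not taken from the paper.

    ```python
    # Toy invariant check over concurrent sensor readings: inflow should equal
    # outflow plus the rate of change of tank level, within sensor accuracy.
    # The process model and tolerance are illustrative assumptions.
    def check_flow_invariant(inflow, outflow, level_change_rate, tolerance=0.5):
        """Return True if the concurrent readings are mutually consistent."""
        residual = inflow - outflow - level_change_rate
        return abs(residual) <= tolerance

    # Consistent readings vs. a forged outflow value.
    print(check_flow_invariant(inflow=10.0, outflow=7.0, level_change_rate=3.1))  # True
    print(check_flow_invariant(inflow=10.0, outflow=2.0, level_change_rate=3.1))  # False
    ```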

  11. Controls on Martian Hydrothermal Systems: Application to Valley Network and Magnetic Anomaly Formation

    NASA Technical Reports Server (NTRS)

    Harrison, Keith P.; Grimm, Robert E.

    2002-01-01

    Models of hydrothermal groundwater circulation can quantify limits to the role of hydrothermal activity in Martian crustal processes. We present here the results of numerical simulations of convection in a porous medium due to the presence of a hot intruded magma chamber. The parameter space includes magma chamber depth, volume, aspect ratio, and host rock permeability and porosity. A primary goal of the models is the computation of surface discharge. Discharge increases approximately linearly with chamber volume, decreases weakly with depth (at low geothermal gradients), and is maximized for equant-shaped chambers. Discharge increases linearly with permeability until limited by the energy available from the intrusion. Changes in the average porosity are balanced by changes in flow velocity and therefore have little effect. Water/rock ratios of approximately 0.1, obtained by other workers from models based on the mineralogy of the Shergotty meteorite, imply minimum permeabilities of 10(exp -16) sq m2 during hydrothermal alteration. If substantial vapor volumes are required for soil alteration, the permeability must exceed 10(exp -15) sq m. The principal application of our model is to test the viability of hydrothermal circulation as the primary process responsible for the broad spatial correlation of Martian valley networks with magnetic anomalies. For host rock permeabilities as low as 10(exp -17) sq m and intrusion volumes as low as 50 cu km, the total discharge due to intrusions building that part of the southern highlands crust associated with magnetic anomalies spans a comparable range as the inferred discharge from the overlying valley networks.

  12. Analysis of the Nuevo Leon Magnetic Anomaly and its possible relation to the Cerro Prieto magmatic-hydrothermal system

    SciTech Connect

    Goldstein, N.E.; Corrigan, D.J.; Wilt, M.J.

    1984-01-01

    The broad dipolar magnetic anomaly whose positive peak is centered near Ejido Nuevo Leon, some 5 km east of the Cerro Prieto I power plant, has long been suspected to have a genetic relationship to the thermal source of the Cerro Prieto geothermal system. This suspicion was reinforced after several deep geothermal wells, drilled to depths of 3-3.5 km over the anomaly, intersected an apparent dike-sill complex consisting mainly of diabase but with minor rhyodacite. A detailed fit of the observed magnetic field to a computer model indicates that the source may be approximated by a tabular block 4 x 6 km in area, 3.7 km in depth, 2.3 km thick, and dipping slightly to the north. Mafic dike chips from one well, NL-1, were analysed by electron microprobe, which showed them to contain a titanomagnetite that is paramagnetic at in situ temperature conditions. As the dike mineralogy does not account for the magnetic anomaly, the magnetic source is believed to be a deeper, magnetite-rich assemblage of peridotite-gabbro plutons. The suite of igneous rocks was probably emplaced at a shallow depth in response to crustal extension and thinning brought on by en echelon strike-slip faulting. The bottom of the magnetic source body, at an estimated depth of 6 km, is presumed to be at or near that of the Curie isotherm (575°C) for magnetite, the principal ferromagnetic mineral in peridotitic-gabbroic rocks. The geological model derived from the magnetic study is generally supported by other geophysical data. In particular, earthquake data suggest dike injection is occurring at depths of 6-11 km in an area beneath the magnetic source. Thus, it is possible that heat for the geothermal field is being maintained by continuing crustal extension and magmatic activity.

  13. Analysis of the Nuevo Leon magnetic anomaly and its possible relation to the Cerro Prieto magmatic-hydrothermal system

    SciTech Connect

    Goldstein, N.E.; Wilt, M.J.; Corrigan, D.J.

    1982-10-01

    The broad dipolar magnetic anomaly whose positive peak is centered near Ejido Nuevo Leon, some 5 km east of the Cerro Prieto I Power Plant, has long been suspected to have a genetic relationship to the thermal source of the Cerro Prieto geothermal system. This suspicion was reinforced after several deep geothermal wells, drilled to depths of 3 to 3.5 km over the anomaly, intersected an apparent dike-sill complex consisting mainly of diabase but with minor rhyodacite. A detailed fit of the observed magnetic field to a computer model indicates that the source may be approximated by a tabular block 4 by 6 km in area, 3.7 km in depth, 2.3 km thick, and dipping slightly to the north. Mafic dike chips from one well, NL-1, were analyzed by electron microprobe, which showed them to contain a titanomagnetite that is paramagnetic at in-situ temperature conditions. As the dike mineralogy does not account for the magnetic anomaly, the magnetic source is believed to be a deeper, magnetite-rich assemblage of peridotite-gabbro plutons. The suite of igneous rocks was probably passively emplaced at a shallow depth in response to crustal extension and thinning brought on by strike-slip faulting. The bottom of the magnetic source body, at an estimated depth of 6 km, is presumed to be at or near that of the Curie isotherm (575°C) for magnetite, the principal ferromagnetic mineral in peridotitic-gabbroic rocks. The geological model derived from the magnetic study is generally supported by other geophysical data. In particular, earthquake data suggest dike injection is occurring at depths of 6 to 11 km in an area beneath the magnetic source. Thus, it is possible that heat for the geothermal field is being maintained by continuing crustal extension and magmatic activity.

  14. Detection of High-Potential Oil and Gas Fields Using Normalized Full Gradient of Gravity Anomalies: A Case Study in the Tabas Basin, Eastern Iran

    NASA Astrophysics Data System (ADS)

    Aghajani, Hamid; Moradzadeh, Ali; Zeng, Hualin

    2011-10-01

    The normalized full gradient (NFG) is the full gradient of the gravity anomaly at a point divided by the average of the full gradient over all points of the profile (or map) at the same level. An NFG minimum between two maxima in an NFG section, or a closed minimum surrounded by closed maxima on an NFG map, may indicate density-deficient anomalies closely related to possible oil-gas reservoirs. On a cross-section, closed minima can be used to estimate the depth to the centers of possible hydrocarbon reservoirs. The NFG map can also be used to locate oil-gas exploratory wells for estimation of the depth of possible reservoirs. The objective of this paper is to apply two- and three-dimensional (2D and 3D) NFG analysis to gravity data of the Tabas basin in Yazd province, eastern Iran. A hypothetical model is first considered to explore the NFG characteristics and their relationship with the geometry of the model. The physical properties of the model are then studied to simplify the interpretation of real data. Finally, 2D and 3D NFG models are developed for real gravity data to predict the location of any possible high-potential oil-gas reservoirs. The results obtained indicate two zones in the northern and central parts of the Tabas basin suitable for hydrocarbon prospecting. However, the favorable zone located in the middle of the basin, in which anticline E is detected at a depth of 5-7 km, is more important for the purpose of hydrocarbon exploration.
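
    The normalization step itself is straightforward once the full gradient has been evaluated on a grid of profile points and downward-continuation levels; the sketch below shows only that step, with a synthetic input array. Computing the full gradient by downward continuation (e.g., via Fourier series) is a separate stage not shown here.

    ```python
    # Minimal sketch of the NFG normalization: divide the full-gradient magnitude
    # at each point by the mean full gradient over the profile at the same level.
    # The input values are synthetic and only illustrate the calculation.
    import numpy as np

    def normalized_full_gradient(G):
        """G: array of shape (n_levels, n_points) of full-gradient magnitudes."""
        level_means = G.mean(axis=1, keepdims=True)
        return G / level_means

    G = np.array([[1.0, 2.0, 4.0, 2.0, 1.0],     # shallow continuation level
                  [0.5, 1.5, 3.0, 1.5, 0.5]])    # deeper continuation level
    print(normalized_full_gradient(G))
    # A local NFG minimum between two maxima on the section can indicate a
    # density-deficient (possible reservoir) zone, as described above.
    ```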

  15. Neutron Interrogation System For Underwater Threat Detection And Identification

    SciTech Connect

    Barzilov, Alexander P.; Novikov, Ivan S.; Womble, Phil C.

    2009-03-10

    Wartime and terrorist activities, training and munitions testing, dumping and accidents have generated significant munitions contamination in the coastal and inland waters in the United States and abroad. Although current methods provide information about the existence of the anomaly (for instance, metal objects) in the sea bottom, they fail to identify the nature of the found objects. Field experience indicates that often in excess of 90% of objects excavated during the course of munitions clean up are found to be non-hazardous items (false alarm). The technology to detect and identify waterborne or underwater threats is also vital for protection of critical infrastructures (ports, dams, locks, refineries, and LNG/LPG). We are proposing a compact neutron interrogation system, which will be used to confirm possible threats by determining the chemical composition of the suspicious underwater object. The system consists of an electronic d-T 14-MeV neutron generator, a gamma detector to detect the gamma signal from the irradiated object and a data acquisition system. The detected signal then is analyzed to quantify the chemical elements of interest and to identify explosives or chemical warfare agents.

  16. Neutron Interrogation System For Underwater Threat Detection And Identification

    NASA Astrophysics Data System (ADS)

    Barzilov, Alexander P.; Novikov, Ivan S.; Womble, Phil C.

    2009-03-01

    Wartime and terrorist activities, training and munitions testing, dumping and accidents have generated significant munitions contamination in the coastal and inland waters in the United States and abroad. Although current methods provide information about the existence of the anomaly (for instance, metal objects) in the sea bottom, they fail to identify the nature of the found objects. Field experience indicates that often in excess of 90% of objects excavated during the course of munitions clean up are found to be non-hazardous items (false alarm). The technology to detect and identify waterborne or underwater threats is also vital for protection of critical infrastructures (ports, dams, locks, refineries, and LNG/LPG). We are proposing a compact neutron interrogation system, which will be used to confirm possible threats by determining the chemical composition of the suspicious underwater object. The system consists of an electronic d-T 14-MeV neutron generator, a gamma detector to detect the gamma signal from the irradiated object and a data acquisition system. The detected signal then is analyzed to quantify the chemical elements of interest and to identify explosives or chemical warfare agents.

  17. CAXSS: an intelligent threat-detection system

    NASA Astrophysics Data System (ADS)

    Feather, Thomas; Guan, Ling; Lee-Kwen, Adrian; Paranjape, Raman B.

    1993-04-01

    Array Systems Computing Inc. (ASC) is developing a prototype Computer Assisted X-ray Screening System (CAXSS) which uses state-of-the-art image processing and computer vision technology to detect threats seen in x-ray images of passenger carry-on luggage at national and international airports. This system is successful in detecting weapons including guns, knives, grenades, aerosol cans, etc. Currently, bomb detection is also being implemented; preliminary results using this bomb detector are promising.

  18. Detection of a tropospheric ozone anomaly using a newly developed ozone retrieval algorithm for an up-looking infrared interferometer

    NASA Astrophysics Data System (ADS)

    Lightner, K. J.; McMillan, W. W.; McCann, K. J.; Hoff, R. M.; Newchurch, M. J.; Hintsa, E. J.; Barnet, C. D.

    2009-03-01

    On 2 June 2003, the Baltimore Bomem Atmospheric Emitted Radiance Interferometer (BBAERI) recorded an infrared spectral time series indicating the presence of a tropospheric ozone anomaly. The measurements were collected during an Atmospheric Infrared Sounder (AIRS) validation campaign called the 2003 AIRS BBAERI Ocean Validation Experiment (ABOVE03) conducted at the United States Coast Guard Chesapeake Light station located 14 miles due east of Virginia Beach, Virginia (36.91°N, 75.71°W). Ozone retrievals were performed with the Kurt Lightner Ozone BBAERI Retrieval (KLOBBER) algorithm, which retrieves tropospheric column ozone, surface to 300 mbar, from zenith-viewing atmospheric thermal emission spectra. KLOBBER is modeled after the AIRS retrieval algorithm consisting of a synthetic statistical regression followed by a physical retrieval. The physical retrieval is implemented using the k-Compressed Atmospheric Radiative Transfer Algorithm (kCARTA) to compute spectra. The time series of retrieved integrated ozone column on 2 June 2003 displays spikes of about 10 Dobson units, well above the error of the KLOBBER algorithm. Using instrumentation at Chesapeake Light, satellite imaging, trace gas retrievals from satellites, and Potential Vorticity (PV) computations, it was determined that these sudden increases in column ozone likely were caused by a combination of midtropospheric biomass burning products from forest fires in Siberia, Russia, and stratospheric intrusion by a tropopause fold occurring over central Canada and the midwestern United States.
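
    As a hedged sketch of the "statistical regression first guess" stage of a retrieval of this type (the KLOBBER physical-retrieval step using kCARTA is not reproduced here), the example below fits a linear regression from synthetic radiance spectra to column ozone and applies it to a new spectrum. All data, channel counts, and noise levels are invented for illustration.

    ```python
    # Toy regression-first-guess retrieval: learn a linear map from radiance
    # spectra to column ozone on synthetic training pairs, then predict ozone
    # for a new spectrum. Not the KLOBBER implementation.
    import numpy as np

    rng = np.random.default_rng(0)
    n_train, n_channels = 500, 40
    true_weights = rng.normal(0, 1, n_channels)

    # Synthetic training set: radiances loosely linear in column ozone + noise.
    ozone_train = rng.uniform(20, 60, n_train)                  # Dobson units
    radiance_train = np.outer(ozone_train, true_weights) + rng.normal(0, 0.5, (n_train, n_channels))

    # Fit regression coefficients (with intercept) by least squares.
    X = np.hstack([radiance_train, np.ones((n_train, 1))])
    coef, *_ = np.linalg.lstsq(X, ozone_train, rcond=None)

    # First-guess retrieval for a new observed spectrum.
    ozone_new = 45.0
    radiance_new = ozone_new * true_weights + rng.normal(0, 0.5, n_channels)
    first_guess = np.append(radiance_new, 1.0) @ coef
    print(f"first-guess column ozone: {first_guess:.1f} DU (truth {ozone_new} DU)")
    ```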

  19. Isotopic Anomalies in Primitive Solar System Matter: Spin-State-Dependent Fractionation of Nitrogen and Deuterium in Interstellar Clouds

    NASA Technical Reports Server (NTRS)

    Wirstrom, Eva S.; Charnley, Steven B.; Cordiner, Martin A.; Milam, Stefanie N.

    2012-01-01

    Organic material found in meteorites and interplanetary dust particles is enriched in D and N-15. This is consistent with the idea that the functional groups carrying these isotopic anomalies, nitriles and amines, were formed by ion-molecule chemistry in the protosolar nebula. Theoretical models of interstellar fractionation at low temperatures predict large enrichments in both D and N-15 and can account for the largest isotopic enrichments measured in carbonaceous meteorites. However, more recent measurements have shown that, in some primitive samples, a large N-15 enrichment does not correlate with one in D, and that some D-enriched primitive material displays little, if any, N-15 enrichment. By considering the spin-state dependence in ion-molecule reactions involving the ortho and para forms of H2, we show that ammonia and related molecules can exhibit such a wide range of fractionation for both N-15 and D in dense cloud cores. We also show that while the nitriles, HCN and HNC, contain the greatest N-15 enrichment, this is not expected to correlate with extreme D enrichment. These calculations therefore support the view that solar system N-15 and D isotopic anomalies have an interstellar heritage. We also compare our results to existing astronomical observations and briefly discuss future tests of this model.

  20. Isotopic Anomalies in Primitive Solar System Matter: Spin-State Dependent Fractionation of Nitrogen and Deuterium in Interstellar Clouds

    NASA Technical Reports Server (NTRS)

    Wirstrom, Eva S.; Charnley, Steven B.; Cordiner, Martin A.; Milam, Stefanie N.

    2012-01-01

    Organic material found in meteorites and interplanetary dust particles is enriched in D and N-15. This is consistent with the idea that the functional groups carrying these isotopic anomalies, nitriles and amines, were formed by ion-molecule chemistry in the protosolar nebula. Theoretical models of interstellar fractionation at low temperatures predict large enrichments in both D and N-15 and can account for the largest isotopic enrichments measured in carbonaceous meteorites. However, more recent measurements have shown that, in some primitive samples, a large N-15 enrichment does not correlate with one in D, and that some D-enriched primitive material displays little, if any, N-15 enrichment. By considering the spin-state dependence in ion-molecule reactions involving the ortho and para forms of H2, we show that ammonia and related molecules can exhibit such a wide range of fractionation for both N-15 and D in dense cloud cores. We also show that while the nitriles, HCN and HNC, contain the greatest N-15 enrichment, this is not expected to correlate with extreme D enrichment. These calculations therefore support the view that solar system N-15 and D isotopic anomalies have an interstellar heritage. We also compare our results to existing astronomical observations and briefly discuss future tests of this model.

  1. Fontaine-Farriaux syndrome: a recognizable craniosynostosis syndrome with nail, skeletal, abdominal, and central nervous system anomalies.

    PubMed

    Castori, Marco; Silvestri, Evelina; Pedace, Lucia; Marseglia, Giuseppina; Tempera, Alessia; Antigoni, Ivana; Torricelli, Francesca; Majore, Silvia; Grammatico, Paola

    2009-10-01

    Craniosynostosis is an etiologically heterogeneous malformation, which may present as an isolated finding or in association with other anomalies. The concurrence of craniosynostosis together with specific central nervous system, abdominal, genital, and limb malformations defines the Fontaine-Farriaux syndrome, described so far in only two patients. We report on a stillborn who presented mainly with severe intrauterine growth retardation, bilateral coronal synostosis, generalized nail hypo/aplasia more evident on the posterior side, tapered digits, mild cutaneous syndactyly, abdominal muscle hypoplasia, micropenis, and bilateral cryptorchidism. Skeletal radiographs revealed universal platyspondyly, and necropsy findings comprised intestinal malrotation, abnormal cortical gyral formation, periventricular heterotopia, and cerebellar hypoplasia. Comparison between the present and the two previously described patients demonstrates that our case shows a combination of features strikingly resembling the original description. Conversely, the second reported patient shows a very atypical phenotype and is, most probably, affected by a distinct clinical entity. The triad of craniosynostosis, anonychia, and abdominal muscle hypo/aplasia emerges as the most consistent core phenotype, although skeletal and brain anomalies are relevant ancillary findings. An in-depth differential diagnosis with other partially overlapping conditions is carried out. PMID:19731360

  2. ISOTOPIC ANOMALIES IN PRIMITIVE SOLAR SYSTEM MATTER: SPIN-STATE-DEPENDENT FRACTIONATION OF NITROGEN AND DEUTERIUM IN INTERSTELLAR CLOUDS

    SciTech Connect

    Wirstroem, Eva S.; Cordiner, Martin A.; Charnley, Steven B.; Milam, Stefanie N.

    2012-09-20

    Organic material found in meteorites and interplanetary dust particles is enriched in D and {sup 15}N. This is consistent with the idea that the functional groups carrying these isotopic anomalies, nitriles and amines, were formed by ion-molecule chemistry in the protosolar nebula. Theoretical models of interstellar fractionation at low temperatures predict large enrichments in both D and {sup 15}N and can account for the largest isotopic enrichments measured in carbonaceous meteorites. However, more recent measurements have shown that, in some primitive samples, a large {sup 15}N enrichment does not correlate with one in D, and that some D-enriched primitive material displays little, if any, {sup 15}N enrichment. By considering the spin-state dependence in ion-molecule reactions involving the ortho and para forms of H{sub 2}, we show that ammonia and related molecules can exhibit such a wide range of fractionation for both {sup 15}N and D in dense cloud cores. We also show that while the nitriles, HCN and HNC, contain the greatest {sup 15}N enrichment, this is not expected to correlate with extreme D enrichment. These calculations therefore support the view that solar system {sup 15}N and D isotopic anomalies have an interstellar heritage. We also compare our results to existing astronomical observations and briefly discuss future tests of this model.

  3. A Rare Anomaly of Biliary System: MRCP Evidence of a Cystic Duct Cyst.

    PubMed

    Goya, Cemil; Arslan, Mehmet Serif; Yavuz, Alpaslan; Hamidi, Cihad; Kuday, Suzan; Okur, Mehmet Hanifi; Aydogdu, Bahattin

    2014-01-01

    Cystic duct cysts are a rare congenital anomaly. While the other bile duct cysts (of the choledochus and the intrahepatic bile ducts) are classified according to the classification described by Todani, there is no classification method described for cystic duct cysts, although it has been claimed that cystic duct cysts may constitute a new "Type 6" category. Only a limited number of patients with cystic duct cysts have been reported in the literature. The diagnosis is usually made in the neonatal period or during childhood. The clinical symptoms are nonspecific and usually include pain in the right upper quadrant and jaundice. The condition may also present with biliary colic, cholangitis, cholelithiasis, or pancreatitis. In our case, abdominal ultrasonography (US) performed on a 6-year-old female patient who presented with pain in the right upper quadrant pointed out an anechoic cyst at the neck of the gall bladder. Based on the magnetic resonance cholangiopancreatography (MRCP) results, a cystic dilatation was diagnosed in the cystic duct. The aim of this case-report presentation was to discuss the US and MRCP findings of cystic dilatation of the cystic duct, which is an extremely rare condition, in the light of the literature. PMID:24987540

  4. Nondestructive Crack Detection in a Fuel System Component

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay; Wincheski, Russell; Prosser, William; Russell, Richard; Devries, Robert; Engel, James; Ruffino, Norman

    2010-01-01

    The paper discusses development of various NDE techniques to detect cracks in A40 steel poppets used in a valve of the fuel system of the Space Shuttle Orbiter. The valve assembly experiences a severe high cycle fatigue environment during its operation. Cracks were discovered at the radius of the poppet flange. Experience shows that very small cracks or material anomalies do not cause failure in a single operation event. While the design is being modified to eliminate the issue, NDE has been used to screen the poppets for cracks before every use. Several surface flaw detection techniques were considered and a few NDE techniques were developed to provide NDE screening for the flaw detection. The primary method used was the eddy current testing. In the eddy current technique, the X-Y channel test data from the eddy current instrument was recorded as computer files. A Matlab data review and plotting application was developed to analyze the data files. The Matlab application provides much higher resolution than the eddy current instrument that was used to acquire the data. Other techniques that were used included ultrasonic surface wave and magnetic particle testing. A probability of detection (POD) study was undertaken to determine the 90/95 size for the eddy current technique. This study used specimens with same geometry and material as the poppet. Fatigue cracks were grown in these specimens. Information on results of the NDE techniques and results of the POD study are provided.

  5. A population-based case-control study of drinking-water nitrate and congenital anomalies using Geographic Information Systems (GIS) to develop individual-level exposure estimates.

    PubMed

    Holtby, Caitlin E; Guernsey, Judith R; Allen, Alexander C; Vanleeuwen, John A; Allen, Victoria M; Gordon, Robert J

    2014-02-01

    Animal studies and epidemiological evidence suggest an association between prenatal exposure to drinking water with elevated nitrate (NO3-N) concentrations and incidence of congenital anomalies. This study used Geographic Information Systems (GIS) to derive individual-level prenatal drinking-water nitrate exposure estimates from measured nitrate concentrations from 140 temporally monitored private wells and 6 municipal water supplies. Cases of major congenital anomalies in Kings County, Nova Scotia, Canada, between 1988 and 2006 were selected from province-wide population-based perinatal surveillance databases and matched to controls from the same databases. Unconditional multivariable logistic regression was performed to test for an association between drinking-water nitrate exposure and congenital anomalies after adjusting for clinically relevant risk factors. Employing all nitrate data, there was a trend toward increased risk of congenital anomalies for increased nitrate exposure levels, though this was not statistically significant. After stratification of the data by conception before or after folic acid supplementation, an increased risk of congenital anomalies was found for nitrate exposure of 1.5-5.56 mg/L (odds ratio 2.44; 95% CI 1.05-5.66), with a trend toward increased risk for >5.56 mg/L (2.25; 0.92-5.52). Though the study is likely underpowered, these results suggest that drinking-water nitrate exposure may contribute to increased risk of congenital anomalies at levels below the current Canadian maximum allowable concentration. PMID:24503976

  6. Discriminating ultrasonic proximity detection system

    DOEpatents

    Annala, Wayne C.

    1989-01-01

    This invention uses an ultrasonic transmitter and receiver and a microprocessor to detect the presence of an object. In the reset mode the invention uses a plurality of echoes from each ultrasonic burst to create a reference table of the echo-burst-signature of the empty monitored environment. The invention then processes the reference table so that it only uses the most reliable data. In the detection mode the invention compares the echo-burst-signature of the present environment with the reference table, detecting an object if there is a consistent difference between the echo-burst-signature of the empty monitored environment recorded in the reference table and the echo-burst-signature of the present environment.
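
    A minimal sketch of the echo-burst-signature comparison follows: average several bursts from the empty environment into a reference table, then flag an object when the current bursts differ from that reference consistently rather than randomly. The reliability weighting and the exact comparison rule of the invention are not reproduced; the threshold and synthetic echoes are illustrative.

    ```python
    # Hedged sketch of echo-burst-signature comparison for object detection.
    # Reference = mean echo signature of the empty environment; an object is
    # flagged when some echo samples differ from the reference in every recent
    # burst (a consistent, non-random difference). Threshold values are invented.
    import numpy as np

    def build_reference(empty_bursts):
        """empty_bursts: (n_bursts, n_samples) echo amplitudes of the empty room."""
        return empty_bursts.mean(axis=0)

    def object_detected(reference, current_bursts, diff_threshold=0.2):
        diffs = np.abs(current_bursts - reference) > diff_threshold
        return bool(diffs.all(axis=0).any())     # some sample differs in every burst

    rng = np.random.default_rng(2)
    empty = rng.normal(1.0, 0.02, (20, 64))
    reference = build_reference(empty)

    still_empty = rng.normal(1.0, 0.02, (5, 64))
    with_object = still_empty.copy()
    with_object[:, 30:40] += 0.6                 # extra echo from a new object

    print("empty scene flagged:", object_detected(reference, still_empty))
    print("object scene flagged:", object_detected(reference, with_object))
    ```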

  7. Airborne change detection system for the detection of route mines

    NASA Astrophysics Data System (ADS)

    Donzelli, Thomas P.; Jackson, Larry; Yeshnik, Mark; Petty, Thomas E.

    2003-09-01

    The US Army is interested in technologies that will enable it to maintain the free flow of traffic along routes such as Main Supply Routes (MSRs). Mines emplaced in the road by enemy forces under cover of darkness represent a major threat to maintaining a rapid Operational Tempo (OPTEMPO) along such routes. One technique that shows promise for detecting enemy mining activity is Airborne Change Detection, which allows an operator to detect suspicious day-to-day changes in and around the road that may be indicative of enemy mining. This paper presents an Airborne Change Detection system that is currently under development at the US Army Night Vision and Electronic Sensors Directorate (NVESD). The system has been tested using a longwave infrared (LWIR) sensor on a vertical take-off and landing unmanned aerial vehicle (VTOL UAV) and a midwave infrared (MWIR) sensor on a fixed-wing aircraft. The system is described and results of the various tests conducted to date are presented.

  8. Idaho National Laboratory Supervisory Control and Data Acquisition Intrusion Detection System (SCADA IDS)

    SciTech Connect

    Jared Verba; Michael Milvich

    2008-05-01

    Current Intrusion Detection System (IDS) technology is not suited to be widely deployed inside a Supervisory Control and Data Acquisition (SCADA) environment. Anomaly- and signature-based IDS technologies have developed methods to cover information technology-based network activity and protocols effectively. However, these IDS technologies do not provide the fine protocol granularity required to ensure network security inside an environment with weak protocols lacking authentication and encryption. By implementing a more specific and more intelligent packet inspection mechanism, tailored traffic flow analysis, and unique packet tampering detection, IDS technology developed specifically for SCADA environments can be deployed with confidence in detecting malicious activity.
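
    The key idea is inspection that understands the control protocol itself rather than generic IT traffic alone. As a toy illustration (not the INL SCADA IDS), the sketch below checks observed Modbus function codes against a per-(master, slave) whitelist; the addresses, allowed codes and policy are entirely hypothetical.

```python
# Toy protocol-aware inspection: whitelist of Modbus function codes per
# (master, slave) pair. All values here are hypothetical examples.
ALLOWED_FUNCTIONS = {
    ("10.0.0.5", "10.0.0.20"): {3, 4},        # read holding/input registers only
    ("10.0.0.5", "10.0.0.21"): {3, 4, 6},     # this slave may also accept single writes
}

def inspect(master_ip: str, slave_ip: str, function_code: int) -> str:
    allowed = ALLOWED_FUNCTIONS.get((master_ip, slave_ip))
    if allowed is None:
        return "ALERT: unknown master/slave pair"
    if function_code not in allowed:
        return f"ALERT: unexpected function code {function_code}"
    return "ok"
```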

  9. Thermal systems for landmine detection

    NASA Astrophysics Data System (ADS)

    D'Angelo, Marco; Del Vecchio, Luca; Esposito, Salvatore; Balsi, Marco; Jankowski, Stanislaw

    2009-06-01

    This paper presents new techniques for landmine detection and localization using thermal methods. The described methods use both dynamic and static analysis. The work is based on datasets obtained from the Humanitarian Demining Laboratory of Università La Sapienza di Roma, Italy.

  10. Forward Obstacle Detection System by Stereo Vision

    NASA Astrophysics Data System (ADS)

    Iwata, Hiroaki; Saneyoshi, Keiji

    Forward obstacle detection is needed to prevent car accidents. We have developed a forward obstacle detection system that achieves good detectability and distance accuracy using only stereo vision. The system runs in real time by using a stereo processing system based on a Field-Programmable Gate Array (FPGA). Road surfaces are detected so that the search space for obstacles can be limited. A smoothing filter is also used. Owing to these techniques, the distance accuracy is improved. In the experiments, the system could detect forward obstacles 100 m away. Its distance error up to 80 m was less than 1.5 m. It could immediately detect cutting-in objects.
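
    In stereo vision, range follows directly from the disparity between matched points in the left and right images: Z = f * B / d, where f is the focal length in pixels, B the baseline, and d the disparity. The helper below is a minimal sketch of that relation; the calibration values are placeholders, not parameters of the described FPGA system.

```python
# Minimal depth-from-disparity helper; calibration values are placeholders.
def distance_from_disparity(disparity_px: float,
                            focal_length_px: float = 1200.0,
                            baseline_m: float = 0.35) -> float:
    """Return the distance in metres to a matched point given its disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: a 5.25 px disparity corresponds to 1200 * 0.35 / 5.25 = 80 m.
```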

  11. Toward detecting deception in intelligent systems

    NASA Astrophysics Data System (ADS)

    Santos, Eugene, Jr.; Johnson, Gregory, Jr.

    2004-08-01

    Contemporary decision makers often must choose a course of action using knowledge from several sources. Knowledge may be provided from many diverse sources, including electronic sources such as knowledge-based diagnostic or decision support systems, or through data mining techniques. As the decision maker becomes more dependent on these electronic information sources, detecting deceptive information from these sources becomes vital to making a correct, or at least more informed, decision. This applies to unintentional misinformation as well as intentional disinformation. Our ongoing research focuses on applying models of deception and deception detection from the fields of psychology and cognitive science to these systems, as well as implementing deception detection algorithms for probabilistic intelligent systems. The deception detection algorithms are used to detect, classify and correct attempts at deception. Algorithms for detecting unexpected information rely upon a prediction algorithm from the collaborative filtering domain to predict agent responses in a multi-agent system.
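
    The final sentence suggests a simple collaborative-filtering scheme: predict what an agent "should" report from the responses of similar agents and flag large deviations as unexpected. The sketch below is one plausible, simplified realization of that idea; the correlation-based similarity, weighting and tolerance are illustrative assumptions rather than the authors' algorithm.

```python
# Sketch: predict an agent's response from similar agents and flag outliers.
import numpy as np

def predict_response(history: np.ndarray, target: int, query: int) -> float:
    """history: (n_agents, n_queries) matrix of past numeric responses.
    Predict agent `target`'s answer to column `query` from the other agents."""
    mask = np.ones(history.shape[1], dtype=bool)
    mask[query] = False                              # similarity from other queries only
    others = [a for a in range(history.shape[0]) if a != target]
    sims = np.array([np.corrcoef(history[target, mask], history[a, mask])[0, 1]
                     for a in others])
    weights = np.clip(np.nan_to_num(sims), 0.0, None)
    if weights.sum() == 0:
        return float(history[others, query].mean())
    return float(np.average(history[others, query], weights=weights))

def is_unexpected(observed: float, predicted: float, tol: float = 2.0) -> bool:
    """Flag a response as potentially deceptive/erroneous if it deviates strongly."""
    return abs(observed - predicted) > tol
```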

  12. Hydrogen Fire Detection System Features Sharp Discrimination

    NASA Technical Reports Server (NTRS)

    Bright, C. S.

    1966-01-01

    The hydrogen fire detection system detects fires by sensing the flickering ultraviolet radiation emitted by the OH molecule, a short-lived intermediate combustion product found in hydrogen-air flames. In a space application, the system discriminates against false signals from sunlight and rocket engine exhaust plume radiation.
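
    Discrimination of this kind typically exploits the fact that flame radiation flickers at low frequencies while sunlight and steady background radiation do not. The sketch below is a hedged illustration of such a check, comparing spectral energy in an assumed 1-20 Hz flicker band against the steady (DC) level of a UV sensor signal; the band limits and ratio threshold are assumptions, not values from the NASA system.

```python
# Hedged sketch of flicker-based discrimination on a sampled UV sensor signal.
import numpy as np

def looks_like_flame(signal: np.ndarray, fs: float,
                     band=(1.0, 20.0), ratio_thresh: float = 0.1) -> bool:
    """Compare flicker-band energy against the steady (DC) component."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    flicker = spectrum[(freqs >= band[0]) & (freqs <= band[1])].sum()
    steady = abs(np.fft.rfft(signal)[0]) + 1e-9       # DC component (mean level)
    return flicker / steady > ratio_thresh
```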

  13. Doses due to the South Atlantic Anomaly during the Euromir'95 mission measured by an on-board TLD system.

    PubMed

    Deme, S; Reitz, G; Apathy, I; Hejja, I; Lang, E; Feher, I

    1999-01-01

    During the Euromir'95 mission, a specially designed microprocessor-controlled thermoluminescent detector (TLD) system, called 'Pille '95', was used by ESA astronaut Thomas Reiter to measure the cosmic radiation dose inside the Mir space station. One of the experiment's objectives was to determine the dose fraction on Mir due to the South Atlantic Anomaly (SAA) on an orbit inclined at 51.6 degrees and at an altitude of about 400 km. Using an hourly measuring period for 170 h in automatic mode, dose components of both galactic (SAA-independent) and SAA origin were determined. It was found that the maximum dose due to crossing the SAA was 55 microGy. Averaging all the measurements, it was calculated that the mean dose rate inside Mir was 12-14 microGy h-1 and that half of this value was caused by the SAA. PMID:11542232
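
    The partition of the measured dose into galactic and SAA components reduces to simple arithmetic once each hourly reading is flagged as crossing or not crossing the SAA (flags that would come from orbit data). The sketch below illustrates that bookkeeping only; the flagging logic is assumed and no real dose values are implied.

```python
# Illustrative split of hourly TLD doses into a galactic baseline and SAA excess.
import numpy as np

def split_dose(hourly_dose_uGy: np.ndarray, crossed_saa: np.ndarray) -> dict:
    """hourly_dose_uGy: hourly readings; crossed_saa: boolean SAA-crossing flags."""
    galactic_rate = hourly_dose_uGy[~crossed_saa].mean()      # µGy per hour
    total_rate = hourly_dose_uGy.mean()
    return {
        "galactic_uGy_per_h": float(galactic_rate),
        "saa_uGy_per_h": float(total_rate - galactic_rate),
        "saa_fraction": float(1.0 - galactic_rate / total_rate),
    }
```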

  14. Tectonic history of the north portion of the San Andreas fault system, California, inferred from gravity and magnetic anomalies

    USGS Publications Warehouse

    Griscom, A.; Jachens, R.C.

    1989-01-01

    Geologic and geophysical data for the San Andreas fault system north of San Francisco suggest that the eastern boundary of the Pacific plate migrated eastward from its presumed original position at the base of the continental slope to its present position along the San Andreas transform fault by means of a series of eastward jumps of the Mendocino triple junction. These eastward jumps total a distance of about 150 km since 29 Ma. Correlation of right-laterally displaced gravity and magnetic anomalies that now have components at San Francisco and on the shelf north of Point Arena indicates that the presently active strand of the San Andreas fault north of the San Francisco peninsula formed recently at about 5 Ma when the triple junction jumped eastward a minimum of 100 km to its present location at the north end of the San Andreas fault. -from Authors

  15. Real time prediction of sea level anomaly data with the Prognocean system - comparison of results obtained using different prediction techniques

    NASA Astrophysics Data System (ADS)

    Mizinski, Bartlomiej; Niedzielski, Tomasz; Kosek, Wieslaw

    2013-04-01

    Prognocean is a near-real-time modeling and prediction system developed and operated at the University of Wroclaw, Poland. It operates on gridded Sea Level Anomaly (SLA) data obtained from the Archiving, Validation and Interpretation of Satellite Oceanographic data (AVISO) service, France. The data acquisition flow from AVISO to Prognocean is entirely automatic and is implemented in Python. The core of the system - including data pre-processing, modeling, prediction, validation and visualization procedures - is composed of a series of interrelated R scripts that work at three levels of generalization. The objective of the work presented here is to show the results of our numerical experiments that have been carried out since early 2012. Four prediction models have been implemented to date: (1) extrapolation of a polynomial-harmonic model, and extrapolation of a polynomial-harmonic model combined with (2) an autoregressive model, (3) a threshold autoregressive model and (4) an autocovariance procedure. Although the presentation is limited to four models and their predictive skills, Prognocean is modular and hence new techniques may be plugged in at any time. In this paper, a comparison of the resulting sea level anomaly forecast maps is presented. Along with sample predictions, with various lead times of up to two weeks, we present and discuss a set of root mean square prediction error maps computed in real time after the observations have become available. We identified areas where linear prediction models reveal considerable errors, which may indicate a non-linear mode of sea level change. In addition, we have identified an agreement between the spatial pattern of large prediction errors and the spatial occurrence of key mesoscale ocean eddies.
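
    As an illustration of the polynomial-harmonic-plus-autoregressive idea, the single-grid-cell sketch below fits a linear trend plus an annual harmonic by least squares, models the residual as AR(1), and extrapolates both components. The polynomial degree, harmonic period, AR order and the assumption of uniformly sampled epochs are illustrative simplifications, not Prognocean's actual configuration.

```python
# Simplified polynomial-harmonic + AR(1) forecast for one SLA time series.
import numpy as np

def forecast_sla(t: np.ndarray, y: np.ndarray, t_future: np.ndarray,
                 period: float = 365.25) -> np.ndarray:
    """t, t_future in days; future epochs are assumed to follow one step apart."""
    def design(tt):
        # Deterministic part: degree-1 polynomial + annual harmonic.
        return np.column_stack([np.ones_like(tt), tt,
                                np.sin(2 * np.pi * tt / period),
                                np.cos(2 * np.pi * tt / period)])
    coef, *_ = np.linalg.lstsq(design(t), y, rcond=None)
    resid = y - design(t) @ coef
    # Stochastic part: AR(1) fitted to the residuals, extrapolated forward.
    phi = np.dot(resid[1:], resid[:-1]) / np.dot(resid[:-1], resid[:-1])
    steps = np.arange(1, t_future.size + 1)
    ar_part = resid[-1] * phi ** steps                 # decays toward zero
    return design(t_future) @ coef + ar_part
```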

  16. The comprehensiveness of the ESHRE/ESGE classification of female genital tract congenital anomalies: a systematic review of cases not classified by the AFS system

    PubMed Central

    Di Spiezio Sardo, A.; Campo, R.; Gordts, S.; Spinelli, M.; Cosimato, C.; Tanos, V.; Brucker, S.; Li, T. C.; Gergolet, M.; De Angelis, C.; Gianaroli, L.; Grimbizis, G.

    2015-01-01

    STUDY QUESTION How comprehensive is the recently published European Society of Human Reproduction and Embryology (ESHRE)/European Society for Gynaecological Endoscopy (ESGE) classification system of female genital anomalies? SUMMARY ANSWER The ESHRE/ESGE classification provides a comprehensive description and categorization of almost all of the currently known anomalies that could not be classified properly with the American Fertility Society (AFS) system. WHAT IS KNOWN ALREADY Until now, the more accepted classification system, namely that of the AFS, is associated with serious limitations in effective categorization of female genital anomalies. Many cases published in the literature could not be properly classified using the AFS system, yet a clear and accurate classification is a prerequisite for treatment. STUDY DESIGN, SIZE AND DURATION The CONUTA (CONgenital UTerine Anomalies) ESHRE/ESGE group conducted a systematic review of the literature to examine if those types of anomalies that could not be properly classified with the AFS system could be effectively classified with the use of the new ESHRE/ESGE system. An electronic literature search through Medline, Embase and Cochrane library was carried out from January 1988 to January 2014. Three participants independently screened, selected articles of potential interest and finally extracted data from all the included studies. Any disagreement was discussed and resolved after consultation with a fourth reviewer and the results were assessed independently and approved by all members of the CONUTA group. PARTICIPANTS/MATERIALS, SETTING, METHODS Among the 143 articles assessed in detail, 120 were finally selected reporting 140 cases that could not properly fit into a specific class of the AFS system. Those 140 cases were clustered in 39 different types of anomalies. MAIN RESULTS AND THE ROLE OF CHANCE The congenital anomaly involved a single organ in 12 (30.8%) out of the 39 types of anomalies, while multiple organs

  17. Development of a Global Agricultural Hotspot Detection and Early Warning System

    NASA Astrophysics Data System (ADS)

    Lemoine, G.; Rembold, F.; Urbano, F.; Csak, G.

    2015-12-01

    The number of web-based platforms for crop monitoring has grown rapidly in recent years, and anomaly maps and time profiles of remote sensing derived indicators can be accessed online through a number of web-based portals. However, while these systems make a large amount of crop monitoring data available to agriculture and food security analysts, there is no global platform that provides agricultural production hotspot warnings in a highly automatic and timely manner. Therefore a web-based system providing timely warning evidence as maps and short narratives is currently under development by the Joint Research Centre. The system (called "HotSpot Detection System of Agriculture Production Anomalies", HSDS) will focus on water-limited agricultural systems worldwide. The automatic analysis of relevant meteorological and vegetation indicators at selected administrative units (GAUL 1 level) will trigger warning messages for the areas where anomalous conditions are observed. The level of warning (ranging from "watch" to "alert") will depend on the nature and number of indicators for which an anomaly is detected. Information regarding the extent of the agricultural areas affected by the anomaly and the progress of the agricultural season will complement the warning label. In addition, we are testing supplementary detailed information from other sources for the areas triggering a warning. This includes the automatic, web-based and food-security-tailored analysis of media (using the JRC Media Monitor semantic search engine) and the automatic detection of active crop area using Sentinel-1, upcoming Sentinel-2 and Landsat 8 imagery processed in Google Earth Engine. The basic processing will be fully automated and updated every 10 days, exploiting low resolution rainfall estimates and satellite vegetation indices. Maps, trend graphs and statistics accompanied by short narratives edited by a team of crop monitoring experts will be made available on the website on a
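
    The triggering logic described above, with warning levels driven by how many indicators are anomalous for a given administrative unit, can be illustrated in a few lines. The sketch below is a hypothetical mapping from standardized indicator anomalies to a warning level; the indicator names, z-score threshold and the intermediate "warning" level are assumptions (the abstract only names "watch" and "alert").

```python
# Hypothetical mapping from indicator anomalies to a warning level per unit.
def warning_level(indicator_z_scores: dict, z_thresh: float = 1.5) -> str:
    """indicator_z_scores: e.g. {'rainfall': -2.1, 'NDVI': -1.8, 'soil_moisture': -0.4}."""
    n_anomalous = sum(1 for z in indicator_z_scores.values() if abs(z) > z_thresh)
    if n_anomalous == 0:
        return "none"
    if n_anomalous == 1:
        return "watch"
    if n_anomalous == 2:
        return "warning"
    return "alert"
```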

  18. TV system for detection of latent fingerprints

    NASA Astrophysics Data System (ADS)

    Li, Ping; Ban, Xianfu; Liu, Shaowu; Ding, Zhenfang

    1993-04-01

    A fingerprint is reliable evidence for identifying criminals during case investigations. There are many conventional chemical and physical methods for detecting fingerprints. In this paper, a newly developed portable TV system for detecting latent fingerprints is described. The system is suited to field investigation of cases as well as laboratory testing. It can display latent fingerprints that are hard to identify and cannot even be visualized by conventional methods, and it can detect prints or stamps that have been faded, altered, or falsified.

  19. Online Monitoring System for Performance Fault Detection

    SciTech Connect

    Gioiosa, Roberto; Kestor, Gokcen; Kerbyson, Darren J.

    2014-12-31

    To achieve exaFLOPS performance within a contained power budget, next-generation supercomputers will feature hundreds of millions of components operating at low- and near-threshold voltage. As the probability that at least one of these components fails during the execution of an application approaches certainty, it seems unrealistic to expect that any run of a scientific application will be free of performance faults. We believe there is a need for a new generation of lightweight performance and debugging tools that can be used online, even during production runs of parallel applications, and that can identify performance anomalies during application execution. In this work we propose the design and implementation of such a monitoring system.
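
    One lightweight way to flag performance anomalies online is to keep running statistics of a per-iteration metric (for example, wall time per timestep) and raise a flag when a sample deviates strongly from the recent norm. The class below is a generic sketch of that approach using Welford's algorithm; the choice of metric, warm-up length and threshold are illustrative assumptions and not the tool proposed in the paper.

```python
# Generic online anomaly check over a streaming performance metric.
class OnlineAnomalyDetector:
    def __init__(self, z_thresh: float = 4.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0
        self.z_thresh = z_thresh

    def update(self, x: float) -> bool:
        """Return True if x looks anomalous relative to samples seen so far."""
        anomalous = False
        if self.n >= 10:                      # wait for a short warm-up window
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(x - self.mean) / std > self.z_thresh
        # Welford's algorithm for the running mean/variance.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous
```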

  20. Online Monitoring System for Performance Fault Detection

    SciTech Connect

    Gioiosa, Roberto; Kestor, Gokcen; Kerbyson, Darren J.

    2014-05-19

    To achieve exaFLOPS performance within a contained power budget, next-generation supercomputers will feature hundreds of millions of components operating at low- and near-threshold voltage. As the probability that at least one of these components fails during the execution of an application approaches certainty, it seems unrealistic to expect that any run of a scientific application will be free of performance faults. We believe there is a need for a new generation of lightweight performance and debugging tools that can be used online, even during production runs of parallel applications, and that can identify performance anomalies during application execution. In this work we propose the design and implementation of a monitoring system that continuously inspects the evolution of run