Science.gov

Sample records for model-based event detection

  1. Identification of new events in Apollo 16 lunar seismic data by Hidden Markov Model-based event detection and classification

    NASA Astrophysics Data System (ADS)

    Knapmeyer-Endrun, Brigitte; Hammer, Conny

    2015-10-01

    Detection and identification of interesting events in single-station seismic data with little prior knowledge and under tight time constraints is a typical scenario in planetary seismology. The Apollo lunar seismic data, comprising the only confirmed events recorded on any extraterrestrial body to date, provide a valuable test case. Here we present the application of a stochastic event detector and classifier to the data of station Apollo 16. Based on a single waveform example for each event class and some hours of background noise, the system is trained to recognize deep moonquakes, impacts, and shallow moonquakes, and performs reliably over 3 years of data. The algorithm's demonstrated ability to detect rare events and flag previously undefined signal classes as new event types is of particular interest in the analysis of the first seismic recordings from a completely new environment. We are able to classify more than 50% of previously unclassified lunar events, and additionally find over 200 new events not listed in the current lunar event catalog. These events include deep moonquakes as well as impacts and could be used to update studies on temporal variations in event rate or the deep moonquake stacks used in phase picking for localization. No unambiguous new shallow moonquake was detected, but application to data from the other Apollo stations has the potential for additional new discoveries 40 years after the data were recorded. In addition, the classification system could be useful for future seismometer missions to other planets, e.g., the InSight mission to Mars.
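
    A minimal sketch of the kind of HMM-based detector and classifier this abstract describes, using the third-party hmmlearn package: one Gaussian HMM per event class (plus a background-noise class) is trained on example feature sequences, and a data window is labeled by the model with the highest log-likelihood. The class names, features, and model sizes are illustrative assumptions, not the authors' actual pipeline.

    ```python
    import numpy as np
    from hmmlearn.hmm import GaussianHMM  # pip install hmmlearn

    def train_class_models(training_features, n_states=4):
        """training_features: dict of class name -> (n_samples, n_dims) feature array."""
        models = {}
        for label, feats in training_features.items():
            m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
            m.fit(feats)                  # one example sequence per class suffices here
            models[label] = m
        return models

    def classify_window(models, window):
        """Label a feature window by the class model with the highest log-likelihood."""
        scores = {label: m.score(window) for label, m in models.items()}
        return max(scores, key=scores.get)

    # Simulated stand-in features for three classes (not real lunar data).
    rng = np.random.default_rng(0)
    training = {"deep_moonquake": rng.standard_normal((200, 2)),
                "impact": rng.standard_normal((200, 2)) + 3.0,
                "noise": rng.standard_normal((400, 2)) - 3.0}
    models = train_class_models(training)
    print(classify_window(models, training["impact"][:50]))  # -> "impact"
    ```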

  2. A Topic Modeling Based Representation to Detect Tweet Locations: Example of the Event "Je Suis Charlie"

    NASA Astrophysics Data System (ADS)

    Morchid, M.; Josselin, D.; Portilla, Y.; Dufour, R.; Altman, E.; Linarès, G.

    2015-09-01

    Social networks have become a major actor in information propagation. Using the popular Twitter platform, mobile users post or relay messages from different locations. The tweet content, meaning, and location show how an event, such as the bursty "JeSuisCharlie" that happened in France in January 2015, is comprehended in different countries. This research aims at clustering tweets according to the co-occurrence of their terms, including the country, and forecasting the probable country of a non-located tweet given its content. First, we present the process of collecting a large quantity of data from the Twitter website, yielding a set of 2,189 located tweets about "Charlie" from the 7th to the 14th of January. We describe an original method adapted from the Author-Topic (AT) model, which is based on Latent Dirichlet Allocation (LDA). We define a homogeneous space containing both lexical content (words) and spatial information (country). During a training process on part of the sample, we derive a set of clusters (topics) based on statistical relations between lexical and spatial terms. During a clustering task, we evaluate the method's effectiveness on the rest of the sample, reaching up to 95% correct assignment. This shows that our model is suitable for predicting tweet location after a learning process.
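
    A hedged sketch of the homogeneous lexical-plus-spatial space idea, substituting scikit-learn's standard LDA for the paper's Author-Topic adaptation: each training tweet is augmented with a pseudo-word encoding its country, and an unlocated tweet is assigned the country whose pseudo-word is most probable under the tweet's inferred topic mixture. The pseudo-word scheme, tiny corpus, and parameters are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    tweets = ["je suis charlie paris", "marche republicaine paris",
              "charlie hebdo solidarity london", "je suis charlie vigil london"]
    countries = ["FR", "FR", "UK", "UK"]
    # Homogeneous space: the country enters the vocabulary as a pseudo-word.
    docs = [t + " __country_" + c.lower() for t, c in zip(tweets, countries)]

    vec = CountVectorizer()
    X = vec.fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=4, random_state=0).fit(X)

    def predict_country(tweet, candidates=("FR", "UK")):
        theta = lda.transform(vec.transform([tweet]))[0]               # P(topic | tweet)
        phi = lda.components_ / lda.components_.sum(axis=1)[:, None]   # P(word | topic)
        scores = {c: theta @ phi[:, vec.vocabulary_["__country_" + c.lower()]]
                  for c in candidates}
        return max(scores, key=scores.get)

    print(predict_country("je suis charlie"))  # country with the most compatible topics
    ```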

  3. Applying a Hidden Markov Model-Based Event Detection and Classification Algorithm to Apollo Lunar Seismic Data

    NASA Astrophysics Data System (ADS)

    Knapmeyer-Endrun, B.; Hammer, C.

    2014-12-01

    The seismometers that the Apollo astronauts deployed on the Moon provide the only recordings of seismic events from any extraterrestrial body so far. These lunar events differ significantly from those recorded on Earth, in terms of both signal shape and source processes, and are thus a valuable test case for any experiment in planetary seismology. In this study, we analyze Apollo 16 data with a single-station event detection and classification algorithm in view of NASA's upcoming InSight mission to Mars. InSight, scheduled for launch in early 2016, aims to investigate Mars' internal structure by deploying a seismometer on its surface. As the mission does not feature an orbiter, continuous data will be relayed to Earth at a reduced rate. Full-range data will only be available by requesting specific time windows within a few days after receipt of the original transmission. We apply a recently introduced algorithm based on hidden Markov models that requires only a single example waveform of each event class for training appropriate models. After constructing the prototypes, we detect and classify impacts as well as deep and shallow moonquakes. Initial results for 1972 (the year of station installation, with 8 months of data) indicate a high detection rate of over 95% for impacts, of which more than 80% are classified correctly. Deep moonquakes, which occur in large numbers but often show only very weak signals, are detected with less certainty (~70%). As the data cover only one weak shallow moonquake, results for this event class are not statistically significant. Daily adjustments of the background noise model help to reduce false alarms, which are mainly erroneous deep moonquake detections, by about 25%. The algorithm enables us to classify events that were previously listed in the catalog without classification, and, through the combined use of long-period and short-period data, identify some unlisted local impacts as well as at least two yet unreported

  4. A Cyber-Attack Detection Model Based on Multivariate Analyses

    NASA Astrophysics Data System (ADS)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and cluster analysis. We quantify the observed qualitative audit event sequences via quantification method IV, and group similar audit event sequences based on the cluster analysis. It is shown in simulation experiments that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.

  5. Detection of anomalous events

    DOEpatents

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm is user-configurable to adjust the number of false alerts. The anomaly detector can be used with a variety of probability density functions, including normal Gaussian distributions and irregular distributions, as well as functions associated with continuous or discrete variables.
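
    A minimal sketch of the probability-based scoring and the "regulatability" idea described in the patent: events are scored by their improbability under a fitted density, and the alert threshold is chosen as a quantile of scores on normal data, so the expected false-alert rate is user-configurable. The Gaussian model and all numbers are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    normal_traffic = rng.normal(loc=100.0, scale=10.0, size=5000)  # e.g., flow sizes

    mu, sigma = normal_traffic.mean(), normal_traffic.std()
    def anomaly_score(event_value):
        # Higher score = lower probability under the fitted Gaussian model.
        return -stats.norm(mu, sigma).logpdf(event_value)

    # Regulatability: the threshold is a quantile of scores on normal data, so the
    # expected false-alert fraction is user-configurable (here 0.1%).
    threshold = np.quantile(anomaly_score(normal_traffic), 0.999)
    print(anomaly_score(150.0) > threshold)  # True -> raise an alert
    ```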

  6. Detection of solar events

    SciTech Connect

    Fischbach, Ephraim; Jenkins, Jere

    2013-08-27

    A flux detection apparatus can include a radioactive sample having a decay rate capable of changing in response to interaction with a first particle or a field, and a detector associated with the radioactive sample. The detector is responsive to a second particle or radiation formed by decay of the radioactive sample. The rate of decay of the radioactive sample can be correlated to flux of the first particle or the field. Detection of the first particle or the field can provide an early warning for an impending solar event.

  7. Model-Based Signal Processing: Correlation Detection With Synthetic Seismograms

    SciTech Connect

    Rodgers, A; Harris, D; Pasyanos, M; Blair, S; Matt, R

    2006-08-30

    Recent applications of correlation methods to seismological problems illustrate the power of coherent signal processing applied to seismic waveforms. Examples of these applications include detection of low-amplitude signals buried in ambient noise and cross-correlation of sets of waveforms to form event clusters and accurately measure delay times for event relocation and/or earth structure. These methods rely on the exploitation of the similarity of individual waveforms and have been successfully applied to large sets of empirical observations. However, in cases with little or no empirical event data, such as aseismic regions or exotic event types, correlation methods with observed seismograms will not be possible due to the lack of previously observed similar waveforms. This study uses model-based signals computed for three-dimensional (3D) Earth models to form the basis for correlation detection. Synthetic seismograms are computed for fully 3D models estimated with the Markov chain Monte Carlo (MCMC) method. MCMC uses stochastic sampling to fit multiple seismological data sets. Rather than estimating a single 'optimal' model, MCMC results in a suite of models that sample the model space and incorporates uncertainty through the variability of the models. The variability reflects our ignorance of Earth structure, due to limited resolution, data and modeling errors, and produces variability in the seismic waveform response. Model-based signals are combined using a subspace method in which the synthetic signals are decomposed into an orthogonal basis by singular value decomposition (SVD) and the observed waveforms are represented with a linear combination of a subset of eigenvectors (signals) associated with the most significant eigenvalues. We have demonstrated the method by modeling long-period (80-10 seconds) regional seismograms for a moderate (M~5) earthquake near the China-North Korea border. Synthetic seismograms are computed with the Spectral Element Method
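
    A minimal sketch of the subspace detection step this abstract outlines: a suite of synthetic seismograms is decomposed by SVD, the dominant left singular vectors form an orthogonal basis, and the detection statistic is the fraction of an observed window's energy captured by that subspace. The synthetic suite here is a simulated stand-in for MCMC-derived 3D-model waveforms.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    base = np.sin(np.linspace(0.0, 40.0 * np.pi, 1024))         # common waveform
    synthetics = base + 0.2 * rng.standard_normal((20, 1024))   # model variability

    def build_subspace(synthetics, rank):
        """SVD of the synthetic suite; keep the dominant orthogonal basis."""
        U, s, Vt = np.linalg.svd(synthetics.T, full_matrices=False)
        return U[:, :rank]

    def detection_statistic(basis, window):
        """Fraction of the window's energy captured by the subspace, in [0, 1]."""
        w = window / np.linalg.norm(window)
        proj = basis @ (basis.T @ w)
        return float(proj @ proj)

    basis = build_subspace(synthetics, rank=5)
    print(detection_statistic(basis, base + 0.2 * rng.standard_normal(1024)))  # high
    print(detection_statistic(basis, rng.standard_normal(1024)))               # low
    ```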

  8. Scintillation event energy measurement via a pulse model based iterative deconvolution method

    NASA Astrophysics Data System (ADS)

    Deng, Zhenzhou; Xie, Qingguo; Duan, Zhiwen; Xiao, Peng

    2013-11-01

    This work focuses on event energy measurement, a crucial task in scintillation detection systems. We modeled the scintillation detector as a linear system and treated the energy measurement as a deconvolution problem. We proposed a pulse model based iterative deconvolution (PMID) method, which can process pileup events without a separate pileup-detection step and is adaptive to different signal pulse shapes. The proposed method was compared with a digital gated integrator (DGI) and digital delay-line clipping (DDLC) using real-world experimental data. For singles data, the energy resolution (ER) produced by PMID matched that of DGI. For pileups, the PMID method outperformed both DGI and DDLC in ER and counts recovery. These encouraging results suggest that the PMID method has great potential in applications such as photon-counting systems and pulse-height spectrometers, in which multiple-event pileups are common.
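
    A hedged sketch of pulse-model-based iterative deconvolution in the spirit of PMID, using a simple Van Cittert-style iteration with a nonnegativity constraint rather than the authors' exact algorithm; the exponential pulse model, step size, and iteration count are illustrative assumptions.

    ```python
    import numpy as np

    def iterative_deconvolve(signal, pulse, n_iter=200, step=0.5):
        """Recover an impulse sequence given a known, normalized pulse shape."""
        estimate = np.zeros_like(signal)
        for _ in range(n_iter):
            residual = signal - np.convolve(estimate, pulse, mode="same")
            estimate = np.maximum(estimate + step * residual, 0.0)  # energies >= 0
        return estimate

    t = np.arange(50)
    pulse = np.exp(-t / 10.0); pulse /= pulse.sum()            # pulse model
    truth = np.zeros(200); truth[60] = 1.0; truth[70] = 0.7    # two piled-up events
    measured = np.convolve(truth, pulse, mode="same")
    recovered = iterative_deconvolve(measured, pulse)
    print(recovered[55:75].round(2))  # partially sharpened peaks near 60 and 70
    ```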

  9. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has a long history, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
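
    A minimal sketch of a sequential model-based detector in the spirit of this chapter: a Kalman filter for a scalar Gauss-Markov state-space model produces innovations, and a Wald sequential probability ratio test accumulates their log-likelihood ratio until a decision threshold is crossed. The model parameters and the hypothesized innovation mean are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    a, q, r = 0.9, 0.1, 0.5                 # Gauss-Markov model: x' = a*x + w, y = x + v
    x_hat, p = 0.0, 1.0                     # Kalman estimate and covariance (H0 model)
    llr, m1 = 0.0, 0.25                     # hypothesized innovation mean under H1
    ln_A, ln_B = np.log(19.0), np.log(1.0 / 19.0)  # Wald thresholds (~5% error rates)

    # Under H1 a constant bias rides on the measurements; the H0 filter tracks part
    # of it, but its innovations keep a positive mean that the SPRT accumulates.
    measurements = 1.0 + rng.normal(0.0, np.sqrt(r), 200)
    for k, y in enumerate(measurements):
        x_pred, p_pred = a * x_hat, a * a * p + q          # predict
        innov, s = y - x_pred, p_pred + r                  # innovation and its variance
        gain = p_pred / s
        x_hat, p = x_pred + gain * innov, (1.0 - gain) * p_pred   # update
        llr += (m1 * innov - 0.5 * m1 * m1) / s            # N(m1, s) vs N(0, s)
        if llr >= ln_A:
            print(f"signal declared at step {k}"); break
        if llr <= ln_B:
            print(f"noise-only declared at step {k}"); break
    ```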

  10. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Candy, J V

    2008-12-08

    Sequential detection theory has a long history, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  11. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. PMID:25363706

  12. Adaptive, Model-Based Monitoring and Threat Detection

    NASA Astrophysics Data System (ADS)

    Valdes, Alfonso; Skinner, Keith

    2002-09-01

    We explore the suitability of model-based probabilistic techniques, such as Bayes networks, to the field of intrusion detection and alert report correlation. We describe a network intrusion detection system (IDS) using Bayes inference, wherein the knowledge base is encoded not as rules but as conditional probability relations between observables and hypotheses of normal and malicious usage. The same high-performance Bayes inference library was employed in a component of the Mission-Based Correlation effort, using an initial knowledge base that adaptively learns the security administrator's preference for alert priority and rank. Another major effort demonstrated probabilistic techniques in heterogeneous sensor correlation. We provide results for simulated attack data, live traffic, and the CyberPanel Grand Challenge Problem. Our results establish that model-based probabilistic techniques are an important complementary capability to signature-based methods in detection and correlation.

  13. Fuzzy model-based observers for fault detection in CSTR.

    PubMed

    Ballesteros-Moncada, Hazael; Herrera-López, Enrique J; Anzurez-Marín, Juan

    2015-11-01

    Given the vast variety of fuzzy model-based observers reported in the literature, which would be the proper one to use for fault detection in a class of chemical reactor? In this study, four fuzzy model-based observers for sensor fault detection of a Continuous Stirred Tank Reactor were designed and compared. The designs include (i) a Luenberger fuzzy observer, (ii) a Luenberger fuzzy observer with sliding modes, (iii) a Walcott-Zak fuzzy observer, and (iv) a Utkin fuzzy observer. A negative fault signal, an oscillating fault signal, and a bounded random noise signal with a maximum value of ±0.4 were used to evaluate and compare the performance of the fuzzy observers. The Utkin fuzzy observer showed the best performance under the tested conditions. PMID:26521723
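
    A minimal sketch of the simplest of the compared designs, a (non-fuzzy) Luenberger observer used for sensor fault detection: the observer tracks the plant from its output, and a residual exceeding a threshold flags a sensor fault. The two-state system, gains, and threshold are illustrative assumptions, not the paper's CSTR model.

    ```python
    import numpy as np

    A = np.array([[0.95, 0.05], [0.0, 0.9]])    # plant dynamics (stable)
    C = np.array([[1.0, 0.0]])                  # sensor measures the first state
    L = np.array([[0.5], [0.2]])                # observer gain; A - L C is stable

    x = np.array([1.0, 0.5])                    # true plant state
    x_hat = x.copy()                            # observer starts converged
    for k in range(60):
        y = C @ x                               # sensor reading
        if k >= 30:
            y = y + 0.8                         # inject an additive sensor fault
        residual = y - C @ x_hat                # output residual
        x_hat = A @ x_hat + (L @ residual).ravel()
        x = A @ x
        if abs(residual[0]) > 0.3:              # fixed threshold for this sketch
            print(f"step {k}: sensor fault suspected, residual = {residual[0]:.2f}")
    ```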

  14. An improved intrusion detection model based on paraconsistent logic

    NASA Astrophysics Data System (ADS)

    Yan, Fei; Zhang, Huanguo; Wang, Lina; Yang, Min

    2005-02-01

    A major difficulty of current intrusion detection models is that the attack set cannot be thoroughly separated from the normal set. On the basis of paraconsistent logic, an improved intrusion detection model is proposed to solve this problem. We give a proof that the detection model is trivial and discuss the reasons for false alerts. A parallel paraconsistent detection algorithm is presented to develop the detection technology based on our model. An experiment using network connection data, which is commonly used to evaluate intrusion detection methods, is given to illustrate the performance of the model. We use a one-class support vector machine (SVM) to train our profiles and a support vector clustering (SVC) algorithm to update the detection profiles. Results of the experiment indicate that a detection system based on our model can deal with uncertain events and reduce false alerts.
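
    A minimal sketch of the profile-training step mentioned here: a one-class SVM is fit on normal connection records, and events it rejects become candidate intrusions. The feature values are illustrative stand-ins for network connection data; the paraconsistent decision layer is not reproduced.

    ```python
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(3)
    normal = rng.normal(0.0, 1.0, size=(500, 4))       # normal connection features
    profile = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(normal)

    test = np.vstack([rng.normal(0.0, 1.0, (3, 4)),    # normal-looking events
                      rng.normal(6.0, 1.0, (3, 4))])   # anomalous events
    print(profile.predict(test))   # +1 = fits the normal profile, -1 = flagged
    ```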

  15. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Smith, Timothy A. (Inventor); Urnes, James M., Sr. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to provide a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
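
    A minimal sketch of the claimed logic: the modeling error is converted to a z-score (error significance), and an anomaly is declared only once the significance persists for a set number of consecutive samples. Thresholds and the persistence window are illustrative assumptions.

    ```python
    import numpy as np

    def detect_anomaly(measured, expected, noise_sigma, z_thresh=3.0, persist=5):
        z = np.abs(measured - expected) / noise_sigma   # error significance per sample
        run = 0
        for i, significant in enumerate(z > z_thresh):
            run = run + 1 if significant else 0
            if run >= persist:                          # persistence exceeds threshold
                return i                                # sample where anomaly is declared
        return None

    rng = np.random.default_rng(4)
    expected = np.zeros(100)
    measured = rng.normal(0.0, 1.0, 100)
    measured[60:] += 5.0                                # structural change at sample 60
    print(detect_anomaly(measured, expected, noise_sigma=1.0))  # ~64
    ```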

  16. Model-based approach to real-time target detection

    NASA Astrophysics Data System (ADS)

    Hackett, Jay K.; Gold, Ed V.; Long, Daniel T.; Cloud, Eugene L.; Duvoisin, Herbert A.

    1992-09-01

    Land mine detection and extraction from infrared (IR) scenes using real-time parallel processing is of significant interest to ground-based infantry. The mine detection algorithm consists of several sub-processes that progress from raw input IR imagery to feature-based mine nominations. Image enhancement is applied first; this consists of noise and sensor artifact removal. Edge grouping is used to determine the boundaries of objects. The generalized Hough transform, tuned to the land mine signature, acts as a model-based matched nomination filter. Once an object is found, the model is used to guide the labeling of each pixel as background, object, or object boundary. Using these labels to identify object regions, feature primitives are extracted in a high-speed parallel processor. A feature-based screener then compares each object's feature primitives to acceptable values and rejects all objects that do not resemble mines. This operation greatly reduces the number of objects that must be passed from the real-time parallel processor to the classifier. We will discuss details of this model-based approach, including results from actual IR field test imagery.

  17. Event rates for WIMP detection

    SciTech Connect

    Vergados, J. D.; Moustakidis, Ch. C.; Oikonomou, V.

    2006-11-28

    The event rates for the direct detection of dark matter for various types of WIMPs are presented. In addition to the neutralino of SUSY models, we consider other candidates (exotic scalars as well as particles in Kaluza-Klein and technicolour theories) with masses in the TeV region. One then finds reasonable branching ratios to excited states. Thus the WIMP can be detected not only by recoil measurements but also by measuring the de-excitation gamma-rays.

  18. A Model Based on Crowdsourcing for Detecting Natural Hazards

    NASA Astrophysics Data System (ADS)

    Duan, J.; Ma, C.; Zhang, J.; Liu, S.; Liu, J.

    2015-12-01

    Remote sensing technology provides a new method for the detection, early warning, mitigation, and relief of natural hazards. Given the suddenness and unpredictability of the location of natural hazards, as well as the actual demands of hazards work, this article proposes an evaluation model for remote sensing detection of natural hazards based on crowdsourcing. First, using the crowdsourcing model and, through the Internet, the power of hundreds of millions of Internet users, the model provides visual interpretation of high-resolution remote sensing images of the hazard area and collects massive amounts of valuable disaster data. Second, it adopts a dynamic voting consistency strategy to evaluate the disaster data provided by the crowdsourcing workers. Third, it pre-estimates the disaster severity with a pre-evaluation model based on regional buffers. Lastly, it triggers the corresponding expert system according to the forecast results. The idea of this model breaks the boundaries between geographic information professionals and the public, enables public participation and citizen science to be realized, and improves the accuracy and timeliness of hazard assessment results.

  19. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    2005-01-01

    A ground-based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength, and polarity. Determination of the location of the lightning strike uses algorithms based on long-used triangulation techniques. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general, the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior, together with the strength of the signal received by detecting sites, permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers, much of it covered by rain forest, so knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.
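
    A hedged sketch of the attenuation reasoning in this report: received strength decays by geometric spreading (1/r) and exponential environmental attenuation, a site detects a strike only if the received strength clears its threshold, and the network's detection efficiency is the fraction of strikes seen by enough sites. All constants are illustrative assumptions.

    ```python
    import numpy as np

    def received_strength(source_strength, r_km, atten_per_km=0.002):
        """Geometric spreading (1/r) times exponential environmental attenuation."""
        return source_strength * (1.0 / r_km) * np.exp(-atten_per_km * r_km)

    def detection_efficiency(source_strengths, site_ranges_km, threshold, min_sites=2):
        """Fraction of strikes detected by at least `min_sites` detectors."""
        detected = 0
        for s in source_strengths:
            hits = sum(received_strength(s, r) >= threshold for r in site_ranges_km)
            detected += hits >= min_sites
        return detected / len(source_strengths)

    rng = np.random.default_rng(5)
    strengths = rng.lognormal(mean=3.0, sigma=0.5, size=10000)  # strike peak strengths
    print(detection_efficiency(strengths, site_ranges_km=[120, 300, 450, 600],
                               threshold=0.05))
    ```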

  20. Probabilistic model-based approach for heart beat detection.

    PubMed

    Chen, Hugh; Erol, Yusuf; Shen, Eric; Russell, Stuart

    2016-09-01

    Nowadays, hospitals are ubiquitous and integral to modern society. Patients flow in and out of a veritable whirlwind of paperwork, consultations, and potential inpatient admissions, through an abstracted system that is not without flaws. One of the biggest flaws in the medical system is perhaps an unexpected one: the patient alarm system. One longitudinal study reported an 88.8% rate of false alarms, with other studies reporting numbers of similar magnitudes. These false alarm rates lead to deleterious effects that manifest in a lower standard of care across clinics. This paper discusses a model-based probabilistic inference approach to estimate physiological variables at a detection level. We design a generative model that complies with a layman's understanding of human physiology and perform approximate Bayesian inference. One primary goal of this paper is to justify a Bayesian modeling approach to increasing robustness in a physiological domain. In order to evaluate our algorithm we look at the application of heart beat detection using four datasets provided by PhysioNet, a research resource for complex physiological signals, in the form of the PhysioNet 2014 Challenge set-p1 and set-p2, the MIT-BIH Polysomnographic Database, and the MGH/MF Waveform Database. On these data sets our algorithm performs on par with the other top six submissions to the PhysioNet 2014 challenge. The overall evaluation scores in terms of sensitivity and positive predictivity values obtained were as follows: set-p1 (99.72%), set-p2 (93.51%), MIT-BIH (99.66%), and MGH/MF (95.53%). These scores are based on the averaging of gross sensitivity, gross positive predictivity, average sensitivity, and average positive predictivity. PMID:27480267

  1. GPU Accelerated Event Detection Algorithm

    Energy Science and Technology Software Center (ESTSC)

    2011-05-25

    Smart grid applications require new algorithmic approaches as well as parallel formulations. One of the critical components is the prediction of changes and detection of anomalies within the power grid. State-of-the-art algorithms are not suited to the demands of streaming data analysis: (i) event detection algorithms must scale with the size of the data; (ii) algorithms must handle not only the multidimensional nature of the data but also model both spatial and temporal dependencies, which, for the most part, are highly nonlinear; (iii) algorithms must operate in an online fashion on streaming data. The GAEDA code is a new online anomaly detection technique that takes into account the spatial, temporal, and multidimensional aspects of the data set. The basic idea behind the proposed approach is (a) to convert a multidimensional sequence into a univariate time series that captures the changes between successive windows extracted from the original sequence using singular value decomposition (SVD), and then (b) to apply known anomaly detection techniques for univariate time series. A key challenge for the proposed approach is to make the algorithm scalable to huge datasets by adopting techniques from perturbation theory and incremental SVD analysis. We used recent advances in tensor decomposition techniques, which reduce computational complexity, to monitor the change between successive windows and detect anomalies in the same manner as described above. We therefore propose to develop parallel solutions on many-core systems such as GPUs, because these algorithms involve many numerical operations and are highly data-parallelizable.
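
    A minimal sketch of the two-step idea described for GAEDA: successive windows of a multidimensional sequence are reduced by SVD, the change between the dominant patterns of consecutive windows forms a univariate series, and standard thresholding flags anomalies. Window size, the change measure, and the threshold are illustrative assumptions.

    ```python
    import numpy as np

    def window_change_series(data, win=50):
        """data: (n_samples, n_channels). One change value per pair of adjacent windows."""
        changes, prev = [], None
        for start in range(0, len(data) - win + 1, win):
            _, _, Vt = np.linalg.svd(data[start:start + win], full_matrices=False)
            basis = Vt[0]                     # dominant spatial pattern of this window
            if prev is not None:
                changes.append(1.0 - abs(basis @ prev))  # 1 - |cos(angle)| between patterns
            prev = basis
        return np.array(changes)

    rng = np.random.default_rng(6)
    grid = rng.normal(0.0, 1.0, (1000, 8))                # simulated multi-channel stream
    grid[:, 0] += 5.0 * np.sin(np.arange(1000) / 5.0)     # stable dominant behavior
    grid[600:, 3] += 10.0                                 # abrupt change on one channel
    series = window_change_series(grid)
    print(np.where(series > series.mean() + 3.0 * series.std())[0])  # anomalous window
    ```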

  2. Detecting Adverse Events Using Information Technology

    PubMed Central

    Bates, David W.; Evans, R. Scott; Murff, Harvey; Stetson, Peter D.; Pizziferri, Lisa; Hripcsak, George

    2003-01-01

    Context: Although patient safety is a major problem, most health care organizations rely on spontaneous reporting, which detects only a small minority of adverse events. As a result, problems with safety have remained hidden. Chart review can detect adverse events in research settings, but it is too expensive for routine use. Information technology techniques can detect some adverse events in a timely and cost-effective way, in some cases early enough to prevent patient harm. Objective: To review methodologies of detecting adverse events using information technology, reports of studies that used these techniques to detect adverse events, and study results for specific types of adverse events. Design: Structured review. Methodology: English-language studies that reported using information technology to detect adverse events were identified using standard techniques. Only studies that contained original data were included. Main Outcome Measures: Adverse events, with specific focus on nosocomial infections, adverse drug events, and injurious falls. Results: Tools such as event monitoring and natural language processing can inexpensively detect certain types of adverse events in clinical databases. These approaches already work well for some types of adverse events, including adverse drug events and nosocomial infections, and are in routine use in a few hospitals. In addition, it appears likely that these techniques will be adaptable in ways that allow detection of a broad array of adverse events, especially as more medical information becomes computerized. Conclusion: Computerized detection of adverse events will soon be practical on a widespread basis. PMID:12595401

  3. Model-Based Detection in a Shallow Water Ocean Environment

    SciTech Connect

    Candy, J V

    2001-07-30

    A model-based detector is developed to process shallow water ocean acoustic data. The function of the detector is to adaptively monitor the environment and decide whether or not a change from normal has occurred. Here we develop a processor incorporating both a normal-mode ocean acoustic model and a vertical hydrophone array. The detector is applied to data acquired from the Hudson Canyon experiments at various ranges and its performance is evaluated.

  4. Crowd Event Detection on Optical Flow Manifolds.

    PubMed

    Rao, Aravinda S; Gubbi, Jayavardhana; Marusic, Slaven; Palaniswami, Marimuthu

    2016-07-01

    Analyzing crowd events in a video is key to understanding the behavioral characteristics of people. Detecting crowd events in videos is challenging because of articulated human movements and occlusions. The aim of this paper is to detect events in a probabilistic framework for automatically interpreting visual crowd behavior. In this paper, crowd event detection and classification in optical flow manifolds (OFMs) are addressed. A new algorithm to detect walking and running events has been proposed, which uses optical flow vector lengths in OFMs. Furthermore, a new algorithm to detect merging and splitting events has been proposed, which uses Riemannian connections in the optical flow bundle (OFB). The longest vector from the OFB provides a key feature for distinguishing walking and running events. Using a Riemannian connection, the optical flow vectors are parallel transported to localize the crowd groups. The geodesic lengths among the groups provide a criterion for merging and splitting events. Dispersion and evacuation events are jointly modeled from the walking/running and merging/splitting events. Our results show that the proposed approach delivers a comparable model to detect crowd events. Using the Performance Evaluation of Tracking and Surveillance (PETS) 2009 dataset, the proposed method is shown to produce the best results in merging, splitting, and dispersion events, and comparable results in walking, running, and evacuation events when compared with other methods. PMID:26219100

  5. Event oriented dictionary learning for complex event detection.

    PubMed

    Yan, Yan; Yang, Yi; Meng, Deyu; Liu, Gaowen; Tong, Wei; Hauptmann, Alexander G; Sebe, Nicu

    2015-06-01

    Complex event detection is a retrieval task with the goal of finding videos of a particular event in a large-scale unconstrained Internet video archive, given example videos and text descriptions. Nowadays, different multimodal fusion schemes of low-level and high-level features are extensively investigated and evaluated for the complex event detection task. However, how to effectively select high-level, semantically meaningful concepts from a large pool to assist complex event detection is rarely studied in the literature. In this paper, we propose a novel strategy to automatically select semantically meaningful concepts for the event detection task based on both the event-kit text descriptions and the concepts' high-level feature descriptions. Moreover, we introduce a novel event-oriented dictionary representation based on the selected semantic concepts. Toward this goal, we leverage training images (frames) of selected concepts from the semantic indexing dataset, with a pool of 346 concepts, in a novel supervised multitask lp-norm dictionary learning framework. Extensive experimental results on the TRECVID multimedia event detection dataset demonstrate the efficacy of our proposed method. PMID:25794390

  6. Sequential Model-Based Detection in a Shallow Ocean Acoustic Environment

    SciTech Connect

    Candy, J V

    2002-03-26

    A model-based detection scheme is developed to passively monitor an ocean acoustic environment along with its associated variations. The technique employs an embedded model-based processor and a reference model in a sequential likelihood detection scheme. The monitor is therefore called a sequential reference detector. The underlying theory for the design is developed and discussed in detail.

  7. Model-Based Design of Tree WSNs for Decentralized Detection.

    PubMed

    Tantawy, Ashraf; Koutsoukos, Xenofon; Biswas, Gautam

    2015-01-01

    The classical decentralized detection problem of finding the optimal decision rules at the sensor and fusion center, as well as variants that introduce physical channel impairments, has been studied extensively in the literature. The deployment of WSNs in decentralized detection applications brings new challenges to the field. Protocols for different communication layers have to be co-designed to optimize the detection performance. In this paper, we consider the communication network design problem for a tree WSN. We pursue a system-level approach where a complete model of the system is developed that captures the interactions between different layers, as well as different sensor quality measures. For network optimization, we propose a hierarchical optimization algorithm that lends itself to the tree structure, requiring only local network information. The proposed design approach shows superior performance over several contentionless and contention-based network design approaches. PMID:26307989

  8. A biological hierarchical model based underwater moving object detection.

    PubMed

    Shen, Jie; Fan, Tanghuai; Tang, Min; Zhang, Qian; Sun, Zhen; Huang, Fengchen

    2014-01-01

    Underwater moving object detection is the key for many underwater computer vision tasks, such as object recognition, locating, and tracking. Considering the superior visual sensing ability of underwater habitats, the visual mechanisms of aquatic animals are generally regarded as cues for establishing bionic models that are more adaptive to underwater environments. However, low accuracy rates and the absence of prior knowledge learning limit their adaptation in underwater applications. Aiming to solve the problems originating from inhomogeneous illumination and unstable backgrounds, the visual information sensing and processing pattern of the frog's eye is imitated to produce a hierarchical background model for detecting underwater objects. First, the image is segmented into several subblocks, and intensity information is extracted to establish a background model that can roughly identify the object and background regions. The texture feature of each pixel in the rough object region is then further analyzed to generate the object contour precisely. Experimental results demonstrate that the proposed method gives a better performance: compared to the traditional Gaussian background model, the completeness of the object detection is 97.92%, with only 0.94% of the background region included in the detection results. PMID:25140194

  9. A novel interacting multiple model based network intrusion detection scheme

    NASA Astrophysics Data System (ADS)

    Xin, Ruichi; Venkatasubramanian, Vijay; Leung, Henry

    2006-04-01

    In today's information age, information and network security are of primary importance to any organization. Network intrusion is a serious threat to the security of computers and data networks. In Internet protocol (IP) based networks, intrusions originate in different kinds of packets/messages contained in open system interconnection (OSI) layer 3 or higher layers. Network intrusion detection and prevention systems observe the layer 3 packets (or layer 4 to 7 messages) to screen for intrusions and security threats. Signature-based methods use a pre-existing database that documents intrusion patterns as perceived in the layer 3 to 7 protocol traffic and match the incoming traffic against documented patterns to flag potential intrusion attacks. Alternately, network traffic data can be modeled and any large anomaly from the established traffic pattern can be detected as network intrusion. The latter method, also known as anomaly-based detection, is gaining popularity for its versatility in learning new patterns and discovering new attacks. It is apparent that for reliable performance, an accurate model of the network data needs to be established. In this paper, we illustrate using collected data that network traffic is seldom stationary. We propose the use of multiple models to accurately represent the traffic data. The improvement in reliability of the proposed model is verified by measuring the detection and false alarm rates on several datasets.

  10. A Biological Hierarchical Model Based Underwater Moving Object Detection

    PubMed Central

    Shen, Jie; Fan, Tanghuai; Tang, Min; Zhang, Qian; Sun, Zhen; Huang, Fengchen

    2014-01-01

    Underwater moving object detection is the key for many underwater computer vision tasks, such as object recognition, locating, and tracking. Considering the superior visual sensing ability of underwater habitats, the visual mechanisms of aquatic animals are generally regarded as cues for establishing bionic models that are more adaptive to underwater environments. However, low accuracy rates and the absence of prior knowledge learning limit their adaptation in underwater applications. Aiming to solve the problems originating from inhomogeneous illumination and unstable backgrounds, the visual information sensing and processing pattern of the frog's eye is imitated to produce a hierarchical background model for detecting underwater objects. First, the image is segmented into several subblocks, and intensity information is extracted to establish a background model that can roughly identify the object and background regions. The texture feature of each pixel in the rough object region is then further analyzed to generate the object contour precisely. Experimental results demonstrate that the proposed method gives a better performance: compared to the traditional Gaussian background model, the completeness of the object detection is 97.92%, with only 0.94% of the background region included in the detection results. PMID:25140194

  11. Monitoring the Ocean Acoustic Environment: A Model-Based Detection Approach

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    2000-03-13

    A model-based approach is applied in the development of a processor designed to passively monitor an ocean acoustic environment along with its associated variations. The technique employs an adaptive, model-based processor embedded in a sequential likelihood detection scheme. The trade-off between state-based and innovations-based monitor designs is discussed, conceptually. The underlying theory for the innovations-based design is briefly developed and applied to a simulated data set.

  12. Automated Detection of Events of Scientific Interest

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A report presents a slightly different perspective on the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the present report, the events to be detected could also include natural phenomena of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.

  13. Model-based failure detection for cylindrical shells from noisy vibration measurements.

    PubMed

    Candy, J V; Fisher, K A; Guidry, B L; Chambers, D H

    2014-12-01

    Model-based processing is a theoretically sound methodology to address difficult objectives in complex physical problems involving multi-channel sensor measurement systems. It involves the incorporation of analytical models of both the physical phenomenology (complex vibrating structures, noisy operating environment, etc.) and the measurement processes (sensor networks, including noise) into the processor to extract the desired information. In this paper, a model-based methodology is developed to accomplish the task of online failure monitoring of a vibrating cylindrical shell externally excited by controlled excitations. A model-based processor is formulated to monitor system performance and detect potential failure conditions. The objective of this paper is to develop a real-time, model-based monitoring scheme for online diagnostics in a representative structural vibration system based on controlled experimental data. PMID:25480059

  14. On event-based optical flow detection

    PubMed Central

    Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko

    2015-01-01

    Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high-dynamic-range, and sparse sensing. This stands in contrast to whole-image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection, ranging from gradient-based methods through plane-fitting to filter-based methods, and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired, efficient motion detector is proposed, analyzed, and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering, this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion-related activations. PMID:25941470

  15. Phenological Event Detection from Multitemporal Image Data

    SciTech Connect

    Vatsavai, Raju

    2009-01-01

    Monitoring biomass over large geographic regions for seasonal changes in vegetation and crop phenology is important for many applications. In this paper we present a novel clustering-based change detection method using MODIS NDVI time series data. We used the well-known EM technique to find GMM parameters and the Bayesian Information Criterion (BIC) to determine the number of clusters. A KL divergence measure is then used to establish the cluster correspondence across two years (2001 and 2006) and determine the changes between them. The identified changes were further analyzed to understand phenological events. This preliminary study shows interesting relationships between key phenological events such as the onset, length, and end of growing seasons.
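
    A minimal sketch of the clustering machinery described: Gaussian mixtures are fit by EM, BIC selects the number of clusters, and a closed-form Gaussian KL divergence compares clusters across two years. The two-feature data are simulated stand-ins for MODIS NDVI pixels.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def best_gmm(X, max_k=6):
        """Fit GMMs by EM for k = 1..max_k; BIC picks the number of clusters."""
        fits = [GaussianMixture(n_components=k, random_state=0).fit(X)
                for k in range(1, max_k + 1)]
        return min(fits, key=lambda g: g.bic(X))

    def gaussian_kl(mu0, cov0, mu1, cov1):
        """KL( N(mu0, cov0) || N(mu1, cov1) ), used for cluster correspondence."""
        d = len(mu0)
        inv1 = np.linalg.inv(cov1)
        diff = mu1 - mu0
        return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff - d
                      + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

    rng = np.random.default_rng(7)
    year1 = rng.normal([0.20, 0.60], 0.05, size=(300, 2))   # toy two-band NDVI features
    year2 = rng.normal([0.25, 0.55], 0.05, size=(300, 2))
    g1, g2 = best_gmm(year1), best_gmm(year2)
    kl = gaussian_kl(g1.means_[0], g1.covariances_[0], g2.means_[0], g2.covariances_[0])
    print(f"divergence between matched clusters: {kl:.2f}")
    ```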

  16. Detection and recognition of indoor smoking events

    NASA Astrophysics Data System (ADS)

    Bien, Tse-Lun; Lin, Chang Hong

    2013-03-01

    Smoking in public indoor spaces has become prohibited in many countries, since it not only affects the health of the people around the smoker but also increases the risk of fire outbreaks. This paper proposes a novel scheme to automatically detect and recognize smoking events using existing surveillance cameras. The main idea of the proposed method is to detect human smoking events by recognizing human actions. In this scheme, human pose estimation is introduced to analyze human actions from poses. The pose estimation method segments the head and both hands from the human body parts by using a skin color detection method. However, skin color methods may fail under insufficient lighting conditions; therefore, lighting compensation is applied to make the skin color detection more accurate. Because body parts may be covered by shadows, which can cause the pose estimation to fail, a Kalman filter is applied to track the missed body parts. After that, we evaluate the probability features of hands approaching the head. A support vector machine (SVM) is applied to learn and recognize smoking events from the probability features. To analyze the performance of the proposed method, datasets recorded from surveillance camera views in indoor environments were tested. The experimental results show the effectiveness of the proposed method, with an accuracy rate of 83.33%.

  17. Implementation of a Fractional Model-Based Fault Detection Algorithm into a PLC Controller

    NASA Astrophysics Data System (ADS)

    Kopka, Ryszard

    2014-12-01

    This paper presents results related to the implementation of model-based fault detection and diagnosis procedures in a typical PLC controller. To construct the mathematical model and to implement the PID regulator, non-integer-order differential/integral calculus was used. Such an approach allows for more exact control of the process and more precise modelling, which is crucial in model-based diagnostic methods. The theoretical results were verified on a real object in the form of a supercapacitor connected to a PLC controller by a dedicated electronic circuit controlled directly from the PLC outputs.
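
    A minimal sketch of the non-integer-order calculus such an implementation rests on: the Grunwald-Letnikov definition approximates a fractional derivative as a weighted sum over the sampled history, which is what makes fractional models implementable on sampled-data hardware such as a PLC. The test function and order are illustrative.

    ```python
    import numpy as np

    def gl_fractional_derivative(x, alpha, dt):
        """Grunwald-Letnikov approximation of the order-`alpha` derivative of x."""
        n = len(x)
        w = np.ones(n)
        for k in range(1, n):                      # recursive binomial weights
            w[k] = w[k - 1] * (k - 1 - alpha) / k
        return np.array([np.dot(w[:i + 1], x[i::-1]) for i in range(n)]) / dt**alpha

    t = np.linspace(0.0, 1.0, 200)
    print(gl_fractional_derivative(t, alpha=0.5, dt=t[1] - t[0])[-1])
    # ~1.13: the exact half-derivative of f(t) = t at t = 1 is 2*sqrt(1/pi) = 1.1284
    ```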

  18. Radioactive Threat Detection with Scattering Physics: A Model-Based Application

    SciTech Connect

    Candy, J V; Chambers, D H; Breitfeller, E F; Guidry, B L; Verbeke, J M; Axelrod, M A; Sale, K E; Meyer, A M

    2010-01-21

    The detection of radioactive contraband is a critical problem in maintaining national security for any country. Emissions from threat materials challenge both detection and measurement technologies, especially when concealed by various types of shielding that complicate the transport physics significantly. The development of a model-based sequential Bayesian processor that captures the underlying transport physics, including scattering, offers a physics-based approach to attack this challenging problem. It is shown that this processor can be used to develop an effective detection technique.

  19. Model-Based Detection of Radioactive Contraband for Harbor Defense Incorporating Compton Scattering Physics

    SciTech Connect

    Candy, J V; Chambers, D H; Breitfeller, E F; Guidry, B L; Verbeke, J M; Axelrod, M A; Sale, K E; Meyer, A M

    2010-03-02

    The detection of radioactive contraband is a critical problem in maintaining national security for any country. Photon emissions from threat materials challenge both detection and measurement technologies, especially when concealed by various types of shielding that complicate the transport physics significantly. This problem becomes especially important when ships are intercepted by U.S. Coast Guard harbor patrols searching for contraband. The development of a sequential model-based processor that captures both the underlying transport physics of gamma-ray emissions, including Compton scattering, and the measurement of photon energies offers a physics-based approach to attack this challenging problem. The inclusion of a basic radionuclide representation of absorbed/scattered photons at a given energy, along with interarrival times, is used to extract the physics information available from the noisy measurements of portable radiation detection systems used to interdict contraband. It is shown that this physics representation can incorporate scattering physics, leading to an 'extended' model-based structure that can be used to develop an effective sequential detection technique. The resulting model-based processor is shown to perform quite well based on data obtained from a controlled experiment.

  20. Phase-Space Detection of Cyber Events

    SciTech Connect

    Hernandez Jimenez, Jarilyn M; Ferber, Aaron E; Prowell, Stacy J; Hively, Lee M

    2015-01-01

    Energy Delivery Systems (EDS) are networks of processes that produce, transfer, and distribute energy. EDS are increasingly dependent on networked computing assets, as are many Industrial Control Systems. Consequently, cyber-attacks pose a real and pertinent threat, as evidenced by Stuxnet, Shamoon, and Dragonfly. Hence, there is a critical need for novel methods to detect, prevent, and mitigate the effects of such attacks. To detect cyber-attacks in EDS, we developed a framework for gathering and analyzing timing data that involves establishing a baseline execution profile and then capturing the effect of perturbations in the state from injecting various malware. The data analysis was based on nonlinear dynamics and graph theory to improve detection of anomalous events in cyber applications. The goal was the extraction of changing dynamics or anomalous activity in the underlying computer system. Takens' theorem in nonlinear dynamics allows reconstruction of topologically invariant, time-delay-embedding states from the computer data in a sufficiently high-dimensional space. The resultant dynamical states were nodes, and the state-to-state transitions were links in a mathematical graph. Alternatively, sequential tabulation of executing instructions provides the nodes with corresponding instruction-to-instruction links. Graph theorems guarantee graph-invariant measures to quantify the dynamical changes in the running applications. Results showed successful detection of cyber events.
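
    A minimal sketch of the phase-space idea: a scalar timing series is time-delay embedded (Takens), embedded states are coarse-grained into discrete node labels, and state-to-state transitions become graph edges whose novelty relative to a baseline flags anomalous behavior. Embedding parameters and the timing data are illustrative assumptions.

    ```python
    import numpy as np
    from collections import Counter

    def embed(series, dim=3, delay=2):
        """Time-delay embedding: each row is (x[i], x[i+delay], ..., x[i+(dim-1)*delay])."""
        n = len(series) - (dim - 1) * delay
        return np.array([series[i:i + dim * delay:delay] for i in range(n)])

    def transition_graph(series, dim=3, delay=2, n_bins=4):
        states = embed(series, dim, delay)
        # Coarse-grain each coordinate into quantile bins to get discrete node labels.
        cuts = np.quantile(series, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
        nodes = [tuple(row) for row in np.digitize(states, cuts)]
        return Counter(zip(nodes[:-1], nodes[1:]))   # edge -> occurrence count

    rng = np.random.default_rng(8)
    baseline = rng.normal(1.0, 0.1, 2000)            # normal execution timings
    infected = np.concatenate([baseline[:1000], rng.normal(1.4, 0.3, 1000)])
    g0, g1 = transition_graph(baseline), transition_graph(infected)
    novel = set(g1) - set(g0)                        # transitions never seen in baseline
    print(f"{len(novel)} novel state transitions under perturbation")
    ```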

  1. LAN attack detection using Discrete Event Systems.

    PubMed

    Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar

    2011-01-01

    Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol, and any IP-MAC pairing sent by a host is accepted without verification. This weakness in ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make IP-MAC pairing static, modify the existing ARP, or patch the operating systems of all hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks that does not require any extra constraint such as static IP-MAC pairing or changes to the ARP. A DES model is built for the LAN under both normal and compromised (i.e., spoofed request/response) situations based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, thereby rendering the same DES models for both cases. To create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing. However, this probing adds extra ARP traffic to the LAN. Following that, a DES detector is built to determine from observed ARP-related events whether the LAN is operating under a normal or compromised situation. The scheme also minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine/spoofed by the detector. Also, spoofed IP-MAC pairs determined by the detector are stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MiTM), denial of service, etc. The scheme is successfully validated in a test bed. PMID:20804980

  2. WCEDS: A waveform correlation event detection system

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Trujillo, J.R.; Withers, M.M.; Aster, R.C.; Astiz, L.; Shearer, P.M.

    1995-08-01

    We have developed a working prototype of a grid-based global event detection system based on waveform correlation. The algorithm comes from a long-period detector, but we have recast it in a full matrix formulation which can reduce the number of multiplications needed by better than two orders of magnitude for realistic monitoring scenarios. The reduction is made possible by eliminating redundant multiplications in the original formulation. All unique correlations for a given origin time are stored in a correlation matrix (C), which is formed by a full matrix product of a Master Image matrix (M) and a data matrix (D). The detector value at each grid point is calculated by following a different summation path through the correlation matrix. Master Images can be derived either empirically or synthetically. Our testing has used synthetic Master Images because their influence on the detector is easier to understand. We tested the system using the matrix formulation with continuous data from the IRIS (Incorporated Research Institutions for Seismology) broadband global network to monitor a 2-degree evenly spaced surface grid with a time discretization of 1 sps; we successfully detected the largest event in a two-hour segment from October 1993. The output at the correct grid point was at least 33% larger than at adjacent grid points, and the output at the correct grid point at the correct origin time was more than 500% larger than the output at the same grid point immediately before or after. Analysis of the C matrix for the origin time of the event demonstrates that there are many significant "false" correlations of observed phases with incorrect predicted phases. These false correlations dull the sensitivity of the detector and so must be dealt with if our system is to attain detection thresholds consistent with a Comprehensive Test Ban Treaty (CTBT).
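
    A minimal sketch of the full matrix formulation described: all unique correlations for one origin time are computed at once as C = M D, and each grid point's detector value is a sum along its own path through C. Dimensions and the path map are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n_templates, n_len, n_stations = 30, 256, 12
    M = rng.standard_normal((n_templates, n_len))  # Master Image: predicted phase waveforms
    D = rng.standard_normal((n_len, n_stations))   # data matrix: observed station segments

    C = M @ D                                      # all unique correlations at once

    # Each grid point sums a different subset of C (its own summation "path"):
    # entry j of a path names which template should appear at station j.
    paths = [rng.integers(0, n_templates, n_stations) for _ in range(100)]
    detector = np.array([C[p, np.arange(n_stations)].sum() for p in paths])
    print(int(detector.argmax()))                  # best grid point at this origin time
    ```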

  3. Implementation of a model based fault detection and diagnosis technique for actuation faults of the SSME

    NASA Technical Reports Server (NTRS)

    Duyar, A.; Guo, T.-H.; Merrill, W.; Musgrave, J.

    1991-01-01

    In a previous study, Guo, Merrill and Duyar, 1990, reported a conceptual development of a fault detection and diagnosis system for actuation faults of the Space Shuttle main engine. This study, which is a continuation of the previous work, implements the developed fault detection and diagnosis scheme for the real time actuation fault diagnosis of the Space Shuttle Main Engine. The scheme will be used as an integral part of an intelligent control system demonstration experiment at NASA Lewis. The diagnosis system utilizes a model based method with real time identification and hypothesis testing for actuation, sensor, and performance degradation faults.

  4. Real-Time Model-Based Leak-Through Detection within Cryogenic Flow Systems

    NASA Technical Reports Server (NTRS)

    Walker, M.; Figueroa, F.

    2015-01-01

    The timely detection of leaks within cryogenic fuel replenishment systems is of significant importance to operators on account of the safety and economic impacts associated with material loss and operational inefficiencies. The associated loss of pressure control also affects the stability and the ability to control the phase of cryogenic fluids during replenishment operations. Current research dedicated to providing Prognostics and Health Management (PHM) coverage of such cryogenic replenishment systems has focused on the detection of leaks to atmosphere involving relatively simple model-based diagnostic approaches that, while effective, are unable to isolate the fault to specific piping system components. The authors have extended this research to focus on the detection of leaks through closed valves that are intended to isolate sections of the piping system from the flow and pressurization of cryogenic fluids. The described approach employs model-based detection of leak-through conditions based on correlations of pressure changes across isolation valves and attempts to isolate the faults to specific valves. Implementation of this capability is enabled by knowledge and information embedded in the domain model of the system. The approach has been used effectively to detect such leak-through faults during cryogenic operational testing at the Cryogenic Testbed at NASA's Kennedy Space Center.

  5. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    DOE PAGES (Beta)

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  6. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    SciTech Connect

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  7. A model-based approach for detection of objects in low resolution passive millimeter wave images

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Tang, Yuan-Liang; Devadiga, Sadashiva

    1993-01-01

    A model-based vision system to assist pilots in landing maneuvers under restricted visibility conditions is described. The system was designed to analyze image sequences obtained from a Passive Millimeter Wave (PMMW) imaging system mounted on the aircraft to delineate runways/taxiways, buildings, and other objects on or near runways. PMMW sensors have good response in a foggy atmosphere, but their spatial resolution is very low. However, additional data such as an airport model and the approximate position and orientation of the aircraft are available. These data are exploited to guide the model-based system in locating objects in the low resolution image and generating warning signals to alert the pilots. Analytical expressions were also derived for the accuracy of the camera position estimate obtained by detecting the positions of known objects in the image.

  8. Model-based approach for detection of objects in low-resolution passive-millimeter images

    NASA Astrophysics Data System (ADS)

    Tang, Yuan-Ling; Devadiga, Sadashiva; Kasturi, Rangachar; Harris, Randall L., Sr.

    1994-03-01

    We describe a model-based vision system to assist the pilots in landing maneuvers under restricted visibility conditions. The system has been designed to analyze image sequences obtained from a passive millimeter wave (PMMW) imaging system mounted on the aircraft to delineate runways/taxiways, buildings, and other objects on or near runways. PMMW sensors have good response in a foggy atmosphere, but their spatial resolution is very low. However, additional data such as airport model and approximate position and orientation of aircraft are available. We exploit these data to guide our model-based system to locate objects in the low resolution image and generate warning signals to alert the pilots. We also derive analytical expressions for the accuracy of the camera position estimate obtained by detecting the position of known objects in the image.

  9. Rare Event Detection Algorithm Of Water Quality

    NASA Astrophysics Data System (ADS)

    Ungs, M. J.

    2011-12-01

    A novel method is presented describing the development and implementation of an on-line water quality event detection algorithm. An algorithm was developed to distinguish between normal variation in water quality parameters and changes in these parameters triggered by the presence of contaminant spikes. Emphasis is placed on simultaneously limiting the number of false alarms (false positives) and the number of misses (false negatives); the problem of excessive false alarms is common to existing change detection algorithms. EPA's standard measure of evaluation for event detection algorithms is a false alarm (false positive) rate of less than 0.5 percent and a false negative rate of less than 2 percent (EPA 817-R-07-002). A detailed description of the algorithm's development is presented. The algorithm is tested using historical water quality data collected by a public water supply agency at multiple locations, with spiking contaminants developed by the USEPA Water Security Division. The water quality parameters of specific conductivity, chlorine residual, total organic carbon, pH, and oxidation-reduction potential are considered. Abnormal data sets are generated by superimposing water quality changes on the historical or baseline data. Eddies-ET has defined reaction expressions which specify how the peak or spike concentration of a particular contaminant affects each water quality parameter. Nine default contaminants (Eddies-ET) were previously derived from pipe-loop tests performed at EPA's National Homeland Security Research Center (NHSRC) Test and Evaluation (T&E) Facility. A contaminant strength value of approximately 1.5 is considered to be a significant threat. The proposed algorithm has been able to achieve a combined rate of less than 0.03 percent for both false positives and false negatives, using contaminant spikes of strength 2 or more.
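
    The false-positive/false-negative trade-off in such detectors usually reduces to a single sensitivity constant. A generic sketch of the idea follows (a robust trailing baseline with a tunable threshold; this is not the paper's algorithm, and all parameter values are illustrative):

```python
import numpy as np

def detect_events(x, window=60, k=5.0):
    """Flag samples deviating k robust-sigmas from a trailing baseline;
    raising k suppresses false positives at the cost of more misses."""
    x = np.asarray(x, float)
    alarms = np.zeros(x.size, dtype=bool)
    for i in range(window, x.size):
        hist = x[i - window:i]
        med = np.median(hist)
        mad = np.median(np.abs(hist - med)) or 1e-9   # robust scale estimate
        alarms[i] = abs(x[i] - med) > k * 1.4826 * mad
    return alarms

rng = np.random.default_rng(1)
cond = rng.normal(500.0, 2.0, 500)   # baseline specific conductivity (uS/cm)
cond[300:310] += 25.0                # superimposed contaminant spike
print(np.flatnonzero(detect_events(cond))[:5])   # alarms start near index 300
```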

  10. Model-based estimation of measures of association for time-to-event outcomes

    PubMed Central

    2014-01-01

    Background Hazard ratios are ubiquitously used in time-to-event applications to quantify adjusted covariate effects. Although hazard ratios are invaluable for hypothesis testing, other adjusted measures of association, both relative and absolute, should be provided to fully appreciate study results. The corrected group prognosis method is generally used to estimate the absolute risk reduction and the number needed to treat for categorical covariates. Methods The goal of this paper is to present transformation models for time-to-event outcomes to obtain, directly from estimated coefficients, the measures of association widely used in biostatistics, together with their confidence intervals. Pseudo-values are used for a practical estimation of transformation models. Results Using the regression model estimated through pseudo-values with suitable link functions, relative risks, risk differences and the number needed to treat are obtained together with their confidence intervals. One example based on literature data and one original application to the study of prognostic factors in primary retroperitoneal soft tissue sarcomas are presented. A simulation study is used to show some properties of the different estimation methods. Conclusions Clinically useful measures of treatment or exposure effect are widely available in epidemiology. When time-to-event outcomes are present, the analysis is generally performed by resorting to predicted values from the Cox regression model. It is now possible to resort to more general regression models, adopting suitable link functions and pseudo-values for estimation, to obtain alternative measures of effect directly from regression coefficients, together with their confidence intervals. This may be especially useful when, in the presence of time-dependent covariate effects, it is not straightforward to specify the correct, if any, time-dependent functional form. The method can easily be implemented with standard software. PMID:25106903

  11. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  12. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
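
    The residual-monitoring core of this architecture is simple to sketch. The matrices below are illustrative placeholders rather than an engine model: outputs are predicted as a steady-state trim value plus a linear perturbation, and an anomaly is declared when the sensed-minus-predicted residual exceeds a threshold.

```python
import numpy as np

# Piecewise linear model near one trim point (illustrative values only):
# x' = A x + B u,  y_pred = y_trim + C x.
A = np.array([[0.95, 0.02], [0.00, 0.90]])
B = np.array([[0.10], [0.05]])
C = np.array([[1.0, 0.0]])
y_trim = np.array([1000.0])          # steady-state trim output

def monitor(u_seq, y_meas, threshold=5.0):
    """Flag samples whose sensed-minus-predicted residual is too large."""
    x = np.zeros(2)
    flags = []
    for u, y in zip(u_seq, y_meas):
        x = A @ x + B @ np.atleast_1d(u)
        residual = y - (y_trim + C @ x)
        flags.append(bool(np.linalg.norm(residual) > threshold))
    return flags

u = np.ones(50)
y = 1000.0 + 0.1 * np.arange(50)     # crude stand-in for nominal data
y[30:] += 8.0                        # seeded sensor bias
print(monitor(u, y)[25:35])          # flips to True once the bias appears
```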

  13. A model-based framework for the detection of spiculated masses on mammography

    SciTech Connect

    Sampat, Mehul P.; Bovik, Alan C.; Whitman, Gary J.; Markey, Mia K.

    2008-05-15

    The detection of lesions on mammography is a repetitive and fatiguing task. Thus, computer-aided detection systems have been developed to aid radiologists. The detection accuracy of current systems is much higher for clusters of microcalcifications than for spiculated masses. In this article, the authors present a new model-based framework for the detection of spiculated masses. The authors have invented a new class of linear filters, spiculated lesion filters, for the detection of converging lines or spiculations. These filters are highly specific narrowband filters, which are designed to match the expected structures of spiculated masses. As a part of this algorithm, the authors have also invented a novel technique to enhance spicules on mammograms. This entails filtering in the Radon domain. They have also developed models to reduce the false positives due to normal linear structures. A key contribution of this work is that the parameters of the detection algorithm are based on measurements of physical properties of spiculated masses. The results of the detection algorithm are presented in the form of free-response receiver operating characteristic curves on images from the Mammographic Image Analysis Society and Digital Database for Screening Mammography databases.

  14. Model-based fault detection and identification with online aerodynamic model structure selection

    NASA Astrophysics Data System (ADS)

    Lombaerts, T.

    2013-12-01

    This publication describes a recursive algorithm for the approximation of time-varying nonlinear aerodynamic models by means of a joint adaptive selection of the model structure and parameter estimation. This procedure is called adaptive recursive orthogonal least squares (AROLS) and is an extension and modification of the previously developed ROLS procedure. The algorithm is particularly useful for model-based fault detection and identification (FDI) of aerospace systems. After a failure, a completely new aerodynamic model can be elaborated recursively with respect to structure as well as parameter values. The performance of the identification algorithm is demonstrated on a simulation data set.

  15. Observer and data-driven-model-based fault detection in power plant coal mills

    SciTech Connect

    Odgaard, P.F.; Lin, B.; Jorgensen, S.B.

    2008-06-15

    This paper presents and compares model-based and data-driven fault detection approaches for coal mill systems. The first approach detects faults with an optimal unknown input observer developed from a simplified energy balance model. Due to the time-consuming effort of developing a first-principles model with motor power as the controlled variable, data-driven methods for fault detection are also investigated. Regression models that represent normal operating conditions (NOCs) are developed with both static and dynamic principal component analysis and partial least squares methods. The residual between process measurement and the NOC model prediction is used for fault detection. A hybrid approach, where a data-driven model is employed to derive an optimal unknown input observer, is also implemented. The three methods are evaluated with case studies on coal mill data, which include a fault caused by a blocked inlet pipe. All three approaches detect the fault as it emerges. The optimal unknown input observer approach is most robust in that it has no false positives. On the other hand, the data-driven approaches are more straightforward to implement, since they only require the selection of an appropriate confidence limit to avoid false detection. The proposed hybrid approach is promising for systems where a first-principles model is cumbersome to obtain.
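
    The data-driven NOC branch can be sketched with static PCA: the residual is the squared prediction error (Q statistic) of a new sample against the principal-component subspace, and the only tuning is the confidence limit. The data below are synthetic stand-ins, not coal mill measurements.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
# Synthetic NOC data: 6 sensors driven by 2 latent process variables.
latent = rng.standard_normal((500, 2))
W = rng.standard_normal((2, 6))
normal = latent @ W + 0.1 * rng.standard_normal((500, 6))
pca = PCA(n_components=2).fit(normal)

def spe(X):
    """Squared prediction error (Q statistic) against the NOC model."""
    recon = pca.inverse_transform(pca.transform(X))
    return ((X - recon) ** 2).sum(axis=1)

limit = np.percentile(spe(normal), 99)   # the chosen confidence limit
faulty = normal[:5].copy()
faulty[:, 2] += 1.0                      # fault breaks the sensor correlations
print(spe(faulty) > limit)               # residuals exceed the NOC limit
```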

  16. Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.

    2015-08-01

    The automated and cost-effective detection of buildings in ultra-high spatial resolution imagery is of major importance for various engineering and smart city applications. To this end, in this paper, a model-based building detection technique has been developed that is able to extract and reconstruct buildings from UAV aerial imagery and low-cost imaging sensors. In particular, the developed approach computes, through advanced structure from motion, bundle adjustment and dense image matching, a DSM and a true orthomosaic from the numerous GoPro images, which are characterised by significant geometric distortions and fish-eye effects. An unsupervised multi-region graph-cut segmentation and a rule-based classification are responsible for delivering the initial multi-class classification map. The DTM is then calculated based on an inpainting and mathematical morphology process. A data fusion process between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based building reconstruction, and scene buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating detection rates at object level of 88% regarding correctness and above 75% regarding detection completeness.

  17. Comparison of chiller models for use in model-based fault detection

    SciTech Connect

    Sreedharan, Priya; Haves, Philip

    2001-06-07

    Selecting the model is an essential step in model-based fault detection and diagnosis (FDD). Factors that are considered in evaluating a model include accuracy, training data requirements, calibration effort, generality, and computational requirements. The objective of this study was to evaluate different modeling approaches for their applicability to model-based FDD of vapor compression chillers. Three different models were studied: the Gordon and Ng Universal Chiller model (2nd generation) and a modified version of the ASHRAE Primary Toolkit model, which are both based on first principles, and the DOE-2 chiller model, as implemented in CoolTools™, which is empirical. The models were compared in terms of their ability to reproduce the observed performance of an older centrifugal chiller operating in a commercial office building and a newer centrifugal chiller in a laboratory. All three models displayed similar levels of accuracy. Of the first-principles models, the Gordon-Ng model has the advantage of being linear in the parameters, which allows more robust parameter estimation methods to be used and facilitates estimation of the uncertainty in the parameter values. The ASHRAE Toolkit Model may have advantages when refrigerant temperature measurements are also available. The DOE-2 model can be expected to have advantages when very limited data are available to calibrate the model, as long as one of the previously identified models in the CoolTools library matches the performance of the chiller in question.

  18. Model-based imputation approach for data analysis in the presence of non-detects.

    PubMed

    Krishnamoorthy, K; Mallick, Avishek; Mathew, Thomas

    2009-04-01

    A model-based multiple imputation approach for analyzing sample data with non-detects is proposed. The imputation approach involves randomly generating observations below the detection limit using the detected sample values and then analyzing the data using complete-sample techniques, along with suitable adjustments to account for the imputation. The method is described for the normal case and is illustrated for constructing prediction limits and tolerance limits, for setting an upper bound on an exceedance probability, and for interval estimation of a log-normal mean. Two imputation approaches are investigated in the paper: one uses approximate maximum likelihood estimates (MLEs) of the parameters, and a second uses simple ad hoc estimates that were developed for the specific purpose of imputation. The accuracy of the approaches is verified using Monte Carlo simulation. Simulation studies show that both approaches are very satisfactory for small to moderately large sample sizes, but only the MLE-based approach is satisfactory for large sample sizes. The MLE-based approach can be calibrated to perform very well for large samples. Applicability of the method to the log-normal distribution and the gamma distribution (via a cube root transformation) is outlined. Simulation studies also show that the imputation approach works well for constructing tolerance limits and prediction limits for a gamma distribution. The approach is illustrated using a few practical examples. PMID:19181626
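
    The imputation step itself is short. A sketch for the normal case, using simple ad hoc parameter estimates computed from the detected values (stand-ins for the paper's estimators): non-detects are drawn from the fitted normal truncated above at the detection limit, and complete-sample statistics are then applied.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
dl = 1.0                                  # detection limit
sample = rng.normal(1.5, 0.8, 200)        # true underlying measurements
detected = sample[sample >= dl]           # reported values
n_nondetect = int((sample < dl).sum())    # count of non-detects

# Ad hoc parameter estimates from the detected portion (illustrative only;
# an MLE-based variant would replace these).
mu, sigma = detected.mean(), detected.std(ddof=1)

# Draw imputations from N(mu, sigma) truncated above at the detection limit.
b = (dl - mu) / sigma
imputed = stats.truncnorm.rvs(-np.inf, b, loc=mu, scale=sigma,
                              size=n_nondetect, random_state=42)
complete = np.concatenate([detected, imputed])
print(round(complete.mean(), 3), round(complete.std(ddof=1), 3))
```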

  19. 3D model-based detection and tracking for space autonomous and uncooperative rendezvous

    NASA Astrophysics Data System (ADS)

    Shang, Yang; Zhang, Yueqiang; Liu, Haibo

    2015-10-01

    In order to navigate fully using a vision sensor, a 3D edge-model-based detection and tracking technique was developed. Firstly, we proposed a target detection strategy over a sequence of several images from the 3D model to initialize the tracking. The overall purpose of such an approach is to robustly match each image with the model views of the target. Thus we designed a line segment detection and matching method based on multi-scale space technology. Experiments on real images showed that our method is highly robust under various image changes. Secondly, we proposed a method based on a 3D particle filter (PF) coupled with M-estimation to track and estimate the pose of the target efficiently. In the proposed approach, a similarity observation model was designed according to a new distance function of line segments. Then, based on the tracking results of the PF, the pose was optimized using M-estimation. Experiments indicated that the proposed method can effectively track and accurately estimate the pose of a freely moving target in unconstrained environments.

  20. Detecting seismic events using Benford's Law

    NASA Astrophysics Data System (ADS)

    Diaz, Jordi; Gallart, Josep; Ruiz, Mario

    2015-04-01

    Benford's Law (BL) states that the distribution of first significant digits is not uniform but follows a logarithmic frequency distribution. Even though a remarkably wide range of natural and socioeconomic data sets, from stock market values to quantum phase transitions, fit this peculiar law, conformity to it has seen few scientific applications, being used mainly as a test to pinpoint anomalous or fraudulent data. We developed a procedure to detect the arrival of seismic waves based on the degree of conformity of the amplitude values in the raw seismic trace to the BL. The signal is divided into time windows of appropriate length, and the fit of the first-digit distribution to the BL is checked in each time window using a conformity estimator. We document that both teleseismic and local earthquakes can be clearly identified by this procedure, and we compare its performance with the classical STA/LTA approach. Moreover, we show that the conformity of the seismic record to the BL does not depend on the amplitude of the incoming series, as the occurrence of events with very different amplitudes results in quite similar degrees of BL fit. On the other hand, we show that natural or man-made quasi-monochromatic seismic signals, surface wave trains or engine-generated vibrations can be identified through their very low BL estimator values, when appropriate interval lengths are used. Therefore, we conclude that the degree of conformity of a seismic signal with the BL is primarily dependent on the frequency content of that signal.
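
    A window-wise conformity check is easy to prototype. In the sketch below the conformity estimator is the root-mean-square misfit between the observed first-digit histogram and the BL frequencies; the abstract does not specify its exact estimator, so treat this as one plausible choice.

```python
import numpy as np

BENFORD = np.log10(1.0 + 1.0 / np.arange(1, 10))   # P(first digit = d)

def first_digits(x):
    """Leading significant digit of each nonzero amplitude."""
    x = np.abs(np.asarray(x, float))
    x = x[x > 0]
    return (x / 10 ** np.floor(np.log10(x))).astype(int)

def bl_misfit(window):
    """RMS misfit to Benford's Law; smaller means more Benford-like."""
    d = first_digits(window)
    freq = np.bincount(d, minlength=10)[1:] / max(d.size, 1)
    return float(np.sqrt(((freq - BENFORD) ** 2).mean()))

rng = np.random.default_rng(4)
noise = rng.normal(0.0, 1.0, 2000)                   # background-like window
signal = noise * np.exp(rng.normal(0.0, 2.0, 2000))  # heavy-tailed, spans decades
print(bl_misfit(noise), bl_misfit(signal))           # signal fits BL far better
```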

  1. AESOP: Adaptive Event detection SOftware using Programming by example

    NASA Astrophysics Data System (ADS)

    Thangali, Ashwin; Prasad, Harsha; Kethamakka, Sai; Demirdjian, David; Checka, Neal

    2015-05-01

    This paper presents AESOP, a software tool for automatic event detection in video. AESOP employs a supervised learning approach for constructing event models, given training examples from different event classes. A trajectory-based formulation is used for modeling events, with an aim towards incorporating invariance to changes in the camera location and orientation parameters. The proposed formulation is designed to accommodate events that involve interactions between two or more entities over an extended period of time. AESOP's event models are formulated as HMMs to improve the event detection algorithm's robustness to noise in input data and to achieve computationally efficient algorithms for event model training and event detection. AESOP's performance is demonstrated on a wide range of different scenarios, including stationary camera surveillance and aerial video footage captured in land and maritime environments.

  2. Model-based approach to the detection and classification of mines in sidescan sonar.

    PubMed

    Reed, Scott; Petillot, Yvan; Bell, Judith

    2004-01-10

    This paper presents a model-based approach to mine detection and classification by use of sidescan sonar. Advances in autonomous underwater vehicle technology have increased the interest in automatic target recognition systems in an effort to automate a process that is currently carried out by a human operator. Current automated systems generally require training and thus produce poor results when the test data set is different from the training set. This has led to research into unsupervised systems, which are able to cope with the large variability in conditions and terrains seen in sidescan imagery. The system presented in this paper first detects possible minelike objects using a Markov random field model, which operates well on noisy images, such as sidescan, and allows a priori information to be included through the use of priors. The highlight and shadow regions of the object are then extracted with a cooperating statistical snake, which assumes these regions are statistically separate from the background. Finally, a classification decision is made using Dempster-Shafer theory, where the extracted features are compared with synthetic realizations generated with a sidescan sonar simulator model. Results for the entire process are shown on real sidescan sonar data. Similarities between the sidescan sonar and synthetic aperture radar (SAR) imaging processes ensure that the approach outlined here could be applied to SAR image analysis. PMID:14735943

  3. Model-based approach to the detection and classification of mines in sidescan sonar

    NASA Astrophysics Data System (ADS)

    Reed, Scott; Petillot, Yvan; Bell, Judith

    2004-01-01

    This paper presents a model-based approach to mine detection and classification by use of sidescan sonar. Advances in autonomous underwater vehicle technology have increased the interest in automatic target recognition systems in an effort to automate a process that is currently carried out by a human operator. Current automated systems generally require training and thus produce poor results when the test data set is different from the training set. This has led to research into unsupervised systems, which are able to cope with the large variability in conditions and terrains seen in sidescan imagery. The system presented in this paper first detects possible minelike objects using a Markov random field model, which operates well on noisy images, such as sidescan, and allows a priori information to be included through the use of priors. The highlight and shadow regions of the object are then extracted with a cooperating statistical snake, which assumes these regions are statistically separate from the background. Finally, a classification decision is made using Dempster-Shafer theory, where the extracted features are compared with synthetic realizations generated with a sidescan sonar simulator model. Results for the entire process are shown on real sidescan sonar data. Similarities between the sidescan sonar and synthetic aperture radar (SAR) imaging processes ensure that the approach outlined here could be applied to SAR image analysis.

  4. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

    Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" where students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the “wobble” and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by the four-part prompts consisting of multiple-choice claim, open-ended explanation, Likert-scale uncertainty rating, and open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of uncertainty. We found that (1) only 18% of the students' uncertainty rationale involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationale reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of task such as the use of noisy data or the framing of

  5. Asynchronous event-based corner detection and matching.

    PubMed

    Clady, Xavier; Ieng, Sio-Hoi; Benosman, Ryad

    2015-06-01

    This paper introduces an event-based luminance-free method to detect and match corner events from the output of asynchronous event-based neuromorphic retinas. The method relies on the use of space-time properties of moving edges. Asynchronous event-based neuromorphic retinas are composed of autonomous pixels, each of them asynchronously generating "spiking" events that encode relative changes in pixels' illumination at high temporal resolutions. Corner events are defined as the spatiotemporal locations where the aperture problem can be solved using the intersection of several geometric constraints in events' spatiotemporal spaces. A regularization process provides the required constraints, i.e. the motion attributes of the edges with respect to their spatiotemporal locations using local geometric properties of visual events. Experimental results are presented on several real scenes showing the stability and robustness of the detection and matching. PMID:25828960

  6. Particle Filtering for Model-Based Anomaly Detection in Sensor Networks

    NASA Technical Reports Server (NTRS)

    Solano, Wanda; Banerjee, Bikramjit; Kraemer, Landon

    2012-01-01

    A novel technique has been developed for anomaly detection of rocket engine test stand (RETS) data. The objective was to develop a system that post-processes a CSV file containing the sensor readings and activities (time-series) from a rocket engine test, and detects any anomalies that might have occurred during the test. The output consists of the names of the sensors that show anomalous behavior, and the start and end time of each anomaly. In order to reduce the involvement of domain experts significantly, several data-driven approaches have been proposed where models are automatically acquired from the data, thus bypassing the cost and effort of building system models. Many supervised learning methods can efficiently learn operational and fault models, given large amounts of both nominal and fault data. However, for domains such as RETS data, the amount of anomalous data that is actually available is relatively small, making most supervised learning methods rather ineffective, and in general met with limited success in anomaly detection. The fundamental problem with existing approaches is that they assume that the data are iid, i.e., independent and identically distributed, which is violated in typical RETS data. None of these techniques naturally exploit the temporal information inherent in time series data from the sensor networks. There are correlations among the sensor readings, not only at the same time, but also across time. However, these approaches have not explicitly identified and exploited such correlations. Given these limitations of model-free methods, there has been renewed interest in model-based methods, specifically graphical methods that explicitly reason temporally. The Gaussian Mixture Model (GMM) in a Linear Dynamic System approach assumes that the multi-dimensional test data is a mixture of multi-variate Gaussians, and fits a given number of Gaussian clusters with the help of the well-known Expectation Maximization (EM) algorithm. The

  7. System for detection of hazardous events

    DOEpatents

    Kulesz, James J.; Worley, Brian A.

    2006-05-23

    A system for detecting the occurrence of anomalies, includes a plurality of spaced apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.

  8. System For Detection Of Hazardous Events

    DOEpatents

    Kulesz, James J. [Oak Ridge, TN]; Worley, Brian A. [Knoxville, TN]

    2005-08-16

    A system for detecting the occurrence of anomalies, includes a plurality of spaced apart nodes, with each node having adjacent nodes, each of the nodes having one or more sensors associated with the node and capable of detecting anomalies, and each of the nodes having a controller connected to the sensors associated with the node. The system also includes communication links between adjacent nodes, whereby the nodes form a network. Each controller is programmed to query its adjacent nodes to assess the status of the adjacent nodes and the communication links.

  9. Subsurface event detection and classification using Wireless Signal Networks.

    PubMed

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T

    2012-01-01

    Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how radio propagation is affected by them in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events over the event-occurring regions, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events. PMID:23202191

  10. Subsurface Event Detection and Classification Using Wireless Signal Networks

    PubMed Central

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T.

    2012-01-01

    Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how radio propagation is affected by them in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events over the event-occurring regions, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events. PMID:23202191
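
    The two-step structure (detect a window, then classify it) reduces, in its simplest form, to a minimum-distance rule over averaged signal-strength features. The class names and feature values below are made-up illustrations, and the Bayesian weighting of the full method is omitted.

```python
import numpy as np

def train_centroids(X, y):
    """Mean calibrated signal-strength feature vector per geo-event class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def classify_window(window, centroids):
    """Average the feature vectors inside one classification window and
    assign the class of the nearest centroid (minimum distance rule)."""
    v = np.asarray(window, float).mean(axis=0)
    return min(centroids, key=lambda c: np.linalg.norm(v - centroids[c]))

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(-70, 2, (50, 3)),    # RSS features: water intrusion
               rng.normal(-60, 2, (50, 3))])   # RSS features: relative motion
y = np.array(["water intrusion"] * 50 + ["relative motion"] * 50)
cents = train_centroids(X, y)
print(classify_window(rng.normal(-70, 2, (8, 3)), cents))  # -> water intrusion
```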

  11. Global Seismic Event Detection Using Surface Waves: 15 Possible Antarctic Glacial Sliding Events

    NASA Astrophysics Data System (ADS)

    Chen, X.; Shearer, P. M.; Walker, K. T.; Fricker, H. A.

    2008-12-01

    To identify overlooked or anomalous seismic events not listed in standard catalogs, we have developed an algorithm to detect and locate global seismic events using intermediate-period (35-70s) surface waves. We apply our method to continuous vertical-component seismograms from the global seismic networks as archived in the IRIS UV FARM database from 1997 to 2007. We first bandpass filter the seismograms, apply automatic gain control, and compute envelope functions. We then examine 1654 target event locations defined at 5 degree intervals and stack the seismogram envelopes along the predicted Rayleigh-wave travel times. The resulting function has spatial and temporal peaks that indicate possible seismic events. We visually check these peaks using a graphical user interface to eliminate artifacts and assign an overall reliability grade (A, B or C) to the new events. We detect 78% of events in the Global Centroid Moment Tensor (CMT) catalog. However, we also find 840 new events not listed in the PDE, ISC and REB catalogs. Many of these new events were previously identified by Ekstrom (2006) using a different Rayleigh-wave detection scheme. Most of these new events are located along oceanic ridges and transform faults. Some new events can be associated with volcanic eruptions such as the 2000 Miyakejima sequence near Japan and others with apparent glacial sliding events in Greenland (Ekstrom et al., 2003). We focus our attention on 15 events detected from near the Antarctic coastline and relocate them using a cross-correlation approach. The events occur in 3 groups which are well-separated from areas of cataloged earthquake activity. We speculate that these are iceberg calving and/or glacial sliding events, and hope to test this by inverting for their source mechanisms and examining remote sensing data from their source regions.
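
    The processing chain (bandpass to the 35-70 s band, envelope, stack along predicted Rayleigh-wave travel times) can be sketched as follows; the automatic gain control step is omitted, and the travel-time lags for a single target location are assumed to be given.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def envelope(trace, fs, band=(1 / 70.0, 1 / 35.0)):
    """Band-pass to 35-70 s periods, then take the analytic-signal envelope."""
    b, a = butter(4, band, btype="band", fs=fs)
    return np.abs(hilbert(filtfilt(b, a, trace)))

def stack_detector(traces, fs, lags_s):
    """Shift each station's envelope by its predicted travel time to one
    target grid point and stack; peaks mark candidate origin times."""
    n = min(len(t) for t in traces)
    out = np.zeros(n)
    for tr, lag in zip(traces, lags_s):
        env = envelope(np.asarray(tr, float), fs)
        k = int(round(lag * fs))
        out[: n - k] += env[k:n]
    return out / len(traces)

rng = np.random.default_rng(6)
traces = [rng.standard_normal(7200) for _ in range(4)]   # 2 h at 1 sps
print(int(stack_detector(traces, 1.0, [100, 140, 220, 310]).argmax()))
```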

  12. Event Detection using Twitter: A Spatio-Temporal Approach

    PubMed Central

    Cheng, Tao; Wicks, Thomas

    2014-01-01

    Background Every day, around 400 million tweets are sent worldwide, which has become a rich source for detecting, monitoring and analysing news stories and special (disaster) events. Existing research within this field follows key words attributed to an event, monitoring temporal changes in word usage. However, this method requires prior knowledge of the event in order to know which words to follow, and does not guarantee that the words chosen will be the most appropriate to monitor. Methods This paper suggests an alternative methodology for event detection using space-time scan statistics (STSS). This technique looks for clusters within the dataset across both space and time, regardless of tweet content. It is expected that clusters of tweets will emerge during spatio-temporally relevant events, as people will tweet more than expected in order to describe the event and spread information. The special event used as a case study is the 2013 London helicopter crash. Results and Conclusion A spatio-temporally significant cluster is found relating to the London helicopter crash. Although the cluster only remains significant for a relatively short time, it is rich in information, such as important key words and photographs. The method also detects other special events such as football matches, as well as train and flight delays from Twitter data. These findings demonstrate that STSS is an effective approach to analysing Twitter data for event detection. PMID:24893168

  13. Video Event Detection Framework on Large-Scale Video Data

    ERIC Educational Resources Information Center

    Park, Dong-Jun

    2011-01-01

    Detection of events and actions in video entails substantial processing of very large, even open-ended, video streams. Video data present a unique challenge for the information retrieval community because properly representing video events is challenging. We propose a novel approach to analyze temporal aspects of video data. We consider video data…

  14. Automatic detection of iceberg calving events using seismic observations

    NASA Astrophysics Data System (ADS)

    Andersen, M. L.; Larsen, T.; Hamilton, G. S.; Nettles, M.

    2014-12-01

    Iceberg calving at large, marine-terminating glaciers has been shown to be seismogenic. Seismic energy from these events is released slowly, resulting in characteristic low-frequency signals. The events therefore typically escape detection by traditional systematic methods. Here we show the results of a detection algorithm applied to data observed at two stations, both ~100 km from Helheim Glacier, South East Greenland, in 2007 and 2008 for the purpose of detecting calving-related seismic signals. The detector entails sliding a 150 s wide window over the observed vertical displacement seismograms at steps of one second. Relative power in the 1.1-3.3 s band is monitored, and the detector is activated when a pre-defined threshold is exceeded. We determine the threshold by calibrating the detector with a record of known events observed by time-lapse cameras at Helheim Glacier and automatic detections of glacial earthquakes from the GSN (Global Seismic Network) stations. The resulting list of detections is then filtered for events overlapping with tectonic events, both local and global. We observe a clear periodicity in the detections, with most events occurring during the late summer and early fall, roughly coinciding with the end of the melt season. This apparent offset from peak melt intensity leads us to speculate that the pattern in calving is the result of a combination of the seasonal development of multiple physical properties of the glacier, i.e., surface crevassing, subglacial melt and crevassing, and the subglacial drainage system.
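
    The windowed band-power test is easy to reproduce in outline. This sketch flags 150 s windows whose relative power in the 1.1-3.3 s period band exceeds a threshold; the threshold value, sampling rate, and Welch spectral estimate are illustrative choices, the actual detector being calibrated against camera-observed calving events.

```python
import numpy as np
from scipy.signal import welch

def band_power_detections(z, fs, win_s=150, step_s=1,
                          band_hz=(1 / 3.3, 1 / 1.1), thresh=0.6):
    """Start times (s) of windows where the 1.1-3.3 s period band holds
    more than `thresh` of the total spectral power."""
    n, step = int(win_s * fs), int(step_s * fs)
    hits = []
    for i0 in range(0, len(z) - n, step):
        f, p = welch(z[i0:i0 + n], fs=fs, nperseg=min(n, 512))
        inband = p[(f >= band_hz[0]) & (f <= band_hz[1])].sum()
        if inband / p.sum() > thresh:
            hits.append(i0 / fs)
    return hits

rng = np.random.default_rng(8)
fs = 10.0                                     # assumed sampling rate (Hz)
t = np.arange(0.0, 600.0, 1 / fs)
z = 0.1 * rng.standard_normal(t.size)
z[3000:4500] += np.sin(2 * np.pi * t[3000:4500] / 2.0)  # 2 s-period transient
print(band_power_detections(z, fs)[:3])       # detections near t = 300 s
```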

  15. An Organic Model for Detecting Cyber Events

    SciTech Connect

    Oehmen, Christopher S.; Peterson, Elena S.; Dowson, Scott T.

    2010-04-21

    Cyber entities in many ways mimic the behavior of organic systems. Individuals or groups compete for limited resources using a variety of strategies and effective strategies are re-used and refined in later ‘generations’. Traditionally this drift has made detection of malicious entities very difficult because 1) recognition systems are often built on exact matching to a pattern that can only be ‘learned’ after a malicious entity reveals itself and 2) the enormous volume and variation in benign entities is an overwhelming source of previously unseen entities that often confound detectors. To turn the tables of complexity on the would-be attackers, we have developed a method for mapping the sequence of behaviors in which cyber entities engage to strings of text and analyze these strings using modified bioinformatics algorithms. Bioinformatics algorithms optimize the alignment between text strings even in the presence of mismatches, insertions or deletions and do not require an a priori definition of the patterns one is seeking. Nor does it require any type of exact matching. This allows the data itself to suggest meaningful patterns that are conserved between cyber entities. We demonstrate this method on data generated from network traffic. The impact of this approach is that it can rapidly calculate similarity measures of previously unseen cyber entities in terms of well-characterized entities. These measures may also be used to organize large collections of data into families, making it possible to identify motifs indicative of each family.
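
    At its core the analogy calls for local alignment of behavior strings. A self-contained Smith-Waterman scorer is sketched below; the letter encoding of behaviors is hypothetical (a real system would first map observed network events to an alphabet).

```python
import numpy as np

def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Local alignment score tolerating mismatches, insertions and deletions;
    no a priori pattern definition is required, only the two strings."""
    H = np.zeros((len(a) + 1, len(b) + 1))
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i, j] = max(0, H[i - 1, j - 1] + s,
                          H[i - 1, j] + gap, H[i, j - 1] + gap)
    return H.max()

# Hypothetical encoding: C=connect, S=scan, T=transfer, D=disconnect.
known_malicious = "CSSSTD"
unseen_entity = "CSASSTD"    # the same motif with one inserted behavior
print(smith_waterman(known_malicious, unseen_entity))   # high score: similar
```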

  16. Detection of flood events in hydrological discharge time series

    NASA Astrophysics Data System (ADS)

    Seibert, S. P.; Ehret, U.

    2012-04-01

    The shortcomings of mean-squared-error (MSE) based distance metrics are well known (Beran 1999, Schaeffli & Gupta 2007), and the development of novel distance metrics (Pappenberger & Beven 2004, Ehret & Zehe 2011) and multi-criteria approaches enjoys increasing popularity (Reusser 2009, Gupta et al. 2009). Nevertheless, the hydrological community still lacks metrics which identify, and thus allow signature-based evaluations of, hydrological discharge time series. Signature-based information/evaluations are required wherever specific time series features, such as flood events, are of special concern. Calculation of event-based runoff coefficients or precise knowledge of flood event characteristics (such as the onset or duration of the rising limb, or the volume of the falling limb) are possible applications. The same applies to flood forecasting/simulation models. Directly comparing simulated and observed flood event features may reveal thorough insights into model dynamics. Compared to continuous space-and-time-aggregated distance metrics, event-based evaluations may provide answers like the distributions of event characteristics or the percentage of events which were actually reproduced by a hydrological model. It also may help to provide information on the simulation accuracy of small, medium, and/or large events in terms of timing and magnitude. However, the number of approaches which expose time series features is small and their usage is limited to very specific questions (Merz & Blöschl 2009, Norbiato et al. 2009). We believe this is due to the following reasons: i) a generally accepted definition of the signature of interest is missing or difficult to obtain (in our case: what makes a flood event a flood event?) and/or ii) it is difficult to translate such a definition into an equation or (graphical) procedure which exposes the feature of interest in the discharge time series. We reviewed approaches which detect event starts and/or ends in hydrological discharge time

  17. Stable algorithm for event detection in event-driven particle dynamics: logical states

    NASA Astrophysics Data System (ADS)

    Strobl, Severin; Bannerman, Marcus N.; Pöschel, Thorsten

    2016-07-01

    Following the recent development of a stable event-detection algorithm for hard-sphere systems, the implications of more complex interaction models are examined. The relative location of particles leads to ambiguity when it is used to determine the interaction state of a particle in stepped potentials, such as the square-well model. To correctly predict the next event in these systems, the concept of an additional state that is tracked separately from the particle position is introduced and integrated into the stable algorithm for event detection.

  18. Structuring an event ontology for disease outbreak detection

    PubMed Central

    Kawazoe, Ai; Chanlekha, Hutchatai; Shigematsu, Mika; Collier, Nigel

    2008-01-01

    Background This paper describes the design of an event ontology being developed for application in the machine understanding of infectious disease-related events reported in natural language text. This event ontology is designed to support timely detection of disease outbreaks and rapid judgment of their alerting status by 1) bridging a gap between layman's language used in disease outbreak reports and public health experts' deep knowledge, and 2) making multi-lingual information available. Construction and content This event ontology integrates a model of experts' knowledge for disease surveillance, and at the same time sets of linguistic expressions which denote disease-related events, and formal definitions of events. In this ontology, rather general event classes, which are suitable for application to language-oriented tasks such as recognition of event expressions, are placed on the upper-level, and more specific events of the experts' interest are in the lower level. Each class is related to other classes which represent participants of events, and linked with multi-lingual synonym sets and axioms. Conclusions We consider that the design of the event ontology and the methodology introduced in this paper are applicable to other domains which require integration of natural language information and machine support for experts to assess them. The first version of the ontology, with about 40 concepts, will be available in March 2008. PMID:18426553

  19. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks

    PubMed Central

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-01-01

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady. PMID:26694394

  20. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks.

    PubMed

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-01-01

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady. PMID:26694394
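
    Once the coverage graph is built, the source-determination step maps directly onto the graph-theoretic barycenter. In the sketch below (node names and edge weights are toy values; in the article the weights come from how far sensory data deviate from the normal range), networkx's barycenter function returns the vertices minimizing the summed weighted shortest-path distance to all other covered nodes.

```python
import networkx as nx

# Event coverage as a weighted graph: vertices are sensor nodes, and edge
# weights encode the deviation of sensory data from the normal range.
G = nx.Graph()
G.add_weighted_edges_from([
    ("n1", "n2", 0.2), ("n2", "n3", 0.4), ("n3", "n4", 0.9),
    ("n2", "n5", 0.3), ("n5", "n6", 0.8),
])

# Candidate event sources: nodes minimizing the summed weighted
# shortest-path distance to every other node in the coverage graph.
print(nx.barycenter(G, weight="weight"))
```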

  1. Secure Access Control and Large Scale Robust Representation for Online Multimedia Event Detection

    PubMed Central

    Liu, Changyu; Li, Huiling

    2014-01-01

    We developed an online multimedia event detection (MED) system. However, a secure access control issue and a large-scale robust representation issue arise when traditional event detection algorithms are integrated into an online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from the response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors, bridging the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms state-of-the-art approaches. PMID:25147840

  2. Adaptive noise estimation and suppression for improving microseismic event detection

    NASA Astrophysics Data System (ADS)

    Mousavi, S. Mostafa; Langston, Charles A.

    2016-09-01

    Microseismic data recorded by surface arrays are often strongly contaminated by unwanted noise, and this background noise makes the detection of small-magnitude events difficult. A noise-level estimation and noise-reduction algorithm for microseismic data analysis is presented, based upon minimally controlled recursive averaging and neighborhood shrinkage estimators. The method may not match more sophisticated and computationally expensive denoising algorithms in preserving detailed features of the seismic signal. However, it is fast and data-driven and can be applied in real-time processing of continuous data for event detection purposes. Results from applying this algorithm to synthetic and real seismic data show that it holds great promise for improving microseismic event detection.
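
    A minimal numpy sketch of the two ingredients named above, a recursive (exponentially weighted) noise-power estimate and a simple shrinkage of small-amplitude samples; the smoothing factor, threshold rule, and synthetic trace are illustrative, not the authors' exact estimators:

    import numpy as np

    def denoise(x, alpha=0.95, k=3.0):
        noise_power = x[0] ** 2
        y = np.empty_like(x)
        for i, v in enumerate(x):
            # Update the noise-power estimate recursively from samples
            # that look noise-like (below the current detection level).
            if v ** 2 <= k * noise_power:
                noise_power = alpha * noise_power + (1 - alpha) * v ** 2
            sigma = noise_power ** 0.5
            # Soft-shrink the sample toward zero by the noise level.
            y[i] = np.sign(v) * max(abs(v) - k * sigma, 0.0)
        return y

    rng = np.random.default_rng(0)
    trace = rng.normal(0.0, 0.1, 2000)
    trace[1000:1040] += 1.0                  # small injected "event"
    print(np.abs(denoise(trace)).argmax())   # index near the event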

  3. On Identifiability of Bias-Type Actuator-Sensor Faults in Multiple-Model-Based Fault Detection and Identification

    NASA Technical Reports Server (NTRS)

    Joshi, Suresh M.

    2012-01-01

    This paper explores a class of multiple-model-based fault detection and identification (FDI) methods for bias-type faults in actuators and sensors. These methods employ banks of Kalman-Bucy filters to detect the faults, determine the fault pattern, and estimate the fault values, wherein each Kalman-Bucy filter is tuned to a different failure pattern. Necessary and sufficient conditions are presented for identifiability of actuator faults, sensor faults, and simultaneous actuator and sensor faults. It is shown that FDI of simultaneous actuator and sensor faults is not possible using these methods when all sensors have biases.

  4. Implementation of a model based fault detection and diagnosis for actuation faults of the Space Shuttle main engine

    NASA Technical Reports Server (NTRS)

    Duyar, A.; Guo, T.-H.; Merrill, W.; Musgrave, J.

    1992-01-01

    In a previous study, Guo, Merrill and Duyar, 1990, reported a conceptual development of a fault detection and diagnosis system for actuation faults of the space shuttle main engine. This study, which is a continuation of the previous work, implements the developed fault detection and diagnosis scheme for the real time actuation fault diagnosis of the space shuttle main engine. The scheme will be used as an integral part of an intelligent control system demonstration experiment at NASA Lewis. The diagnosis system utilizes a model based method with real time identification and hypothesis testing for actuation, sensor, and performance degradation faults.

  5. Data mining for signal detection of adverse event safety data.

    PubMed

    Chen, Hung-Chia; Tsong, Yi; Chen, James J

    2013-01-01

    The Adverse Event Reporting System (AERS) is the primary database designed to support the Food and Drug Administration (FDA) postmarketing safety surveillance program for all approved drugs and therapeutic biologic products. Most current disproportionality analysis focuses on the detection of potential adverse events (AEs) involving a single drug and a single AE only. In this paper, we present a data mining biclustering technique based on the singular value decomposition to extract local regions of association for a safety study. The analysis produces a collection of biclusters, each representing an association between a set of drugs and the corresponding set of adverse events. The significance of each bicluster can be tested using disproportionality analysis, and individual drug-event combinations can be tested further. A safety data set consisting of 193 drugs with 8453 adverse events is analyzed as an illustration. PMID:23331228
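
    A toy sketch of the biclustering step, using a plain SVD in numpy; the drug-by-adverse-event count matrix, the planted block, and the loading cutoffs are all illustrative:

    import numpy as np

    rng = np.random.default_rng(1)
    counts = rng.poisson(1.0, size=(20, 30)).astype(float)  # drugs x AEs
    counts[3:7, 10:15] += 8.0      # planted drug/event association block

    u, s, vt = np.linalg.svd(counts - counts.mean(), full_matrices=False)
    # Threshold the leading singular vectors to extract one bicluster.
    drugs = np.where(np.abs(u[:, 0]) > 2 / np.sqrt(counts.shape[0]))[0]
    events = np.where(np.abs(vt[0]) > 2 / np.sqrt(counts.shape[1]))[0]
    print("bicluster drugs:", drugs, "adverse events:", events)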

  6. Method for early detection of cooling-loss events

    SciTech Connect

    Bermudez, Sergio A.; Hamann, Hendrik; Marianno, Fernando J.

    2015-06-30

    A method for the early detection of cooling-loss events is provided. The method includes defining a relative humidity limit and a change threshold for a given space; measuring relative humidity in the given space; determining, with a processing unit, whether the measured relative humidity is within the defined limit and generating a warning in the event that it is not; and determining whether the change in measured relative humidity exceeds the defined change threshold and generating an alarm in the event that it does.
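
    The claimed method reduces to a small amount of monitoring logic; a direct sketch, with illustrative limits, threshold, and readings:

    def check_humidity(readings, lo=30.0, hi=60.0, change_threshold=5.0):
        prev = None
        for rh in readings:
            # Warn when relative humidity leaves its defined limit.
            if not (lo <= rh <= hi):
                print(f"WARNING: RH {rh:.1f}% outside [{lo}, {hi}]%")
            # Alarm when the change between samples exceeds the threshold.
            if prev is not None and abs(rh - prev) > change_threshold:
                print(f"ALARM: RH jumped {abs(rh - prev):.1f}% between samples")
            prev = rh

    check_humidity([45.0, 46.2, 52.9, 61.5, 70.3])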

  7. Method for early detection of cooling-loss events

    SciTech Connect

    Bermudez, Sergio A.; Hamann, Hendrik F.; Marianno, Fernando J.

    2015-12-22

    A method for the early detection of cooling-loss events is provided. The method includes defining a relative humidity limit and a change threshold for a given space; measuring relative humidity in the given space; determining, with a processing unit, whether the measured relative humidity is within the defined limit and generating a warning in the event that it is not; and determining whether the change in measured relative humidity exceeds the defined change threshold and generating an alarm in the event that it does.

  8. Development of the IDC Infrasound Event Detection Pipeline

    NASA Astrophysics Data System (ADS)

    Mialle, P.; Bittner, P.; Brown, D.; Given, J. W.

    2012-12-01

    The first atmospheric event built only from infrasound arrivals was reported in the Reviewed Event Bulletin (REB) of the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in 2003. In the subsequent decade, 45 infrasound stations from the International Monitoring System (IMS) have been installed and are transmitting continuous data to the IDC. The growing amount of infrasound data and detections produced by the automatic system challenges the station and network processing at the IDC and requires the Organization to improve its infrasound data processing. In 2010, the IDC began full-time operational automatic processing of infrasound data followed by interactive analysis. The detected and located events are systematically included in the analyst-reviewed Late Event Bulletin (LEB) and REB. Approximately 16% of SEL3 (Selected Event List 3, automatically produced 6 hours after real time) events with associated infrasound signals pass interactive analysis and make it into the IDC bulletins, and 41% of the SEL3 events rejected after review have only two associated infrasound phases (possibly alongside other seismic and hydroacoustic detections). Therefore, the process whereby infrasound detections are associated with events needs further investigation. The objective of this study is to reduce the number of infrasound arrivals that are falsely associated during the creation of the SEL3. There are two parts to the study. First, the detection accuracy at the infrasound arrays is improved by improving the infrasound signal detector, which is based on the PMCC (Progressive Multi-Channel Correlation) algorithm. The second part focuses on improving the reliability of the association algorithm, which is enhanced to better characterize the variable atmospheric phenomena that profoundly affect the detection patterns of infrasound signals. The algorithm is then further tuned to reduce the number of false infrasound associations.

  9. Detection of Abnormal Events via Optical Flow Feature Analysis

    PubMed Central

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on a histogram of optical flow orientation descriptor and a classification method. The histogram of optical flow orientation descriptor is detailed for describing the movement information of the global video frame or the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
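
    A sketch of the detection stage described above, assuming scikit-learn: train a one-class SVM on histogram-of-optical-flow-orientation descriptors of normal frames after kernel PCA, then flag outliers. The random descriptors here stand in for histograms computed from real optical flow:

    import numpy as np
    from sklearn.decomposition import KernelPCA
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(2)
    normal_hof = rng.dirichlet(np.ones(16), size=200)  # 16-bin histograms
    test_hof = np.vstack([rng.dirichlet(np.ones(16), size=10),
                          rng.dirichlet(np.full(16, 0.1), size=5)])

    kpca = KernelPCA(n_components=8, kernel="rbf").fit(normal_hof)
    clf = OneClassSVM(nu=0.05, kernel="rbf").fit(kpca.transform(normal_hof))

    # -1 marks frames whose flow-orientation statistics look abnormal.
    print(clf.predict(kpca.transform(test_hof)))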

  10. A model-based approach for detection of objects in low resolution passive-millimeter wave images

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Devadiga, Sadashiva; Kasturi, Rangachar; Harris, Randall L., Sr.

    1993-01-01

    We describe a model-based vision system to assist pilots in landing maneuvers under restricted visibility conditions. The system was designed to analyze image sequences obtained from a Passive Millimeter Wave (PMMW) imaging system mounted on the aircraft to delineate runways/taxiways, buildings, and other objects on or near runways. PMMW sensors have good response in a foggy atmosphere; but, their spatial resolution is very low. However, additional data such as airport model and approximate position and orientation of aircraft are available. We exploit these data to guide our model-based system to locate objects in the low resolution image and generate warning signals to alert the pilots. We also derive analytical expressions for the accuracy of the camera position estimate obtained by detecting the position of known objects in the image.

  11. Context-aware event detection smartphone application for first responders

    NASA Astrophysics Data System (ADS)

    Boddhu, Sanjay K.; Dave, Rakesh P.; McCartney, Matt; West, James A.; Williams, Robert L.

    2013-05-01

    The rise of social networking platforms like Twitter and Facebook has provided seamless sharing of information (as chat, video, and other media) among their user communities on a global scale. Further, the proliferation of smartphones and their connectivity networks has empowered ordinary individuals to share and acquire information about events happening in their immediate vicinity in real time. This human-centric sensed data, generated in a "human-as-sensor" approach, is tremendously valuable, as it is delivered mostly with apt annotations and ground truth that would be missing from traditional machine-centric sensors, besides a high redundancy factor (the same data through multiple users). When appropriately employed, this real-time data can support the detection of localized events like fires, accidents, and shootings as they unfold, and can pinpoint individuals affected by those events. This spatiotemporal information, when made available to first responders in the event vicinity (or approaching it), can greatly assist them in making effective decisions to protect property and life in a timely fashion. In this vein, under the SATE and YATE programs, the research team at the AFRL Tec^Edge Discovery labs demonstrated the feasibility of developing smartphone applications that provide an augmented-reality view of detected events in a given geographical location and an event search capability over a large geographic extent. In its current state, the application, through its backend connectivity, utilizes a data (text and image) processing framework that addresses challenges such as identifying and aggregating important events, analyzing and correlating events temporally and spatially, and building a search-enabled event database. The smartphone application with its backend data processing workflow has been successfully field tested with live user-generated feeds.

  12. Summary of gas release events detected by hydrogen monitoring

    SciTech Connect

    MCCAIN, D.J.

    1999-05-18

    This paper summarizes the results of monitoring tank headspace for flammable gas release events. In over 40 tank-years of monitoring, the largest release detected in a single-shell tank is 2.4 cubic meters of hydrogen. In the double-shell tanks, the largest release is 19.3 cubic meters, excluding tank SY-101 in its condition prior to mixer pump installation.

  13. Context and quality estimation in video for enhanced event detection

    NASA Astrophysics Data System (ADS)

    Irvine, John M.; Wood, Richard J.

    2015-05-01

    Numerous practical applications for automated event recognition in video rely on analysis of the objects and their associated motion, i.e., the kinematics of the scene. The ability to recognize events in practice depends on accurately tracking objects of interest in the video data and accurately recognizing changes relative to the background. Numerous factors can degrade the performance of automated algorithms. Our object detection and tracking algorithms estimate object position and attributes within the context of a dynamic assessment of video quality, to provide more reliable event recognition under challenging conditions. We present an approach to robustly modeling image quality that informs the tuning parameters to use for a given video stream. The video quality model rests on a suite of image metrics computed in real time from the video. We describe the formulation of the image quality model, and results from a recent experiment quantify the empirical performance for recognition of events of interest.

  14. Adaptive Model-Based Mine Detection/Localization using Noisy Laser Doppler Vibration Measurements

    SciTech Connect

    Sullivan, E J; Xiang, N; Candy, J V

    2009-04-06

    The acoustic detection of buried mines is hampered by the fact that, at the frequencies required for useful penetration, the energy is quickly absorbed by the ground. A recent approach that avoids this problem is to excite the ground with high-level, low-frequency sound, which excites low-frequency resonances in the mine. These resonances cause a low-level vibration on the surface which can be detected by a laser Doppler vibrometer. This paper presents a method of quickly and efficiently detecting these vibrations by sensing a change in the statistics of the signal when the mine is present. Results based on real data are shown.

  15. Human visual system-based smoking event detection

    NASA Astrophysics Data System (ADS)

    Odetallah, Amjad D.; Agaian, Sos S.

    2012-06-01

    Human action (e.g., smoking, eating, and phoning) analysis is an important task in various application domains such as video surveillance, video retrieval, and human-computer interaction systems. Smoke detection is a crucial task in many video surveillance applications and could have a great impact in raising the level of safety in urban areas, public parks, airplanes, hospitals, schools, and elsewhere. The detection task is challenging since there is no prior knowledge about the object's shape, texture, and color; in addition, its visual features change under different lighting and weather conditions. This paper presents a new scheme for a system that detects human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. Moreover, the developed method is capable of detecting small smoking events of uncertain actions with various cigarette sizes, colors, and shapes.

  16. Detection of dominant flow and abnormal events in surveillance video

    NASA Astrophysics Data System (ADS)

    Kwak, Sooyeong; Byun, Hyeran

    2011-02-01

    We propose an algorithm for abnormal event detection in surveillance video. The proposed algorithm is based on a semi-unsupervised learning method, a feature-based approach that does not detect moving objects individually. It identifies dominant flow without individual object tracking, using a latent Dirichlet allocation model in crowded environments, and can automatically detect and localize an abnormally moving object in real-life video. Performance tests on several real-life databases show that the proposed algorithm can efficiently detect abnormally moving objects in real time. The algorithm can be applied to any situation in which abnormal directions or abnormal speeds are to be detected.

  17. Automatic event detection based on artificial neural networks

    NASA Astrophysics Data System (ADS)

    Doubravová, Jana; Wiszniowski, Jan; Horálek, Josef

    2015-04-01

    The proposed algorithm was developed for Webnet, a local seismic network in West Bohemia built to monitor the West Bohemia/Vogtland swarm area. During earthquake swarms there is a large number of events which must be evaluated automatically to get a quick estimate of the current earthquake activity. Our focus is to get good automatic results prior to precise manual processing; with automatic data processing we may also reach a lower completeness magnitude. The first step of automatic seismic data processing is the detection of events. Good detection performance requires a low number of false detections as well as a high number of correctly detected events. We used a single-layer recurrent neural network (SLRNN) trained by manual detections from swarms in West Bohemia in past years. As inputs to the SLRNN we use the STA/LTA of a half-octave filter bank fed by the vertical and horizontal components of seismograms. All stations were trained together to obtain the same network with the same neuron weights. We tried several architectures, with different numbers of neurons, and different starting points for training. Networks giving the best results on the training set need not be optimal for unknown waveforms, so we test each network on a test set from a different swarm (still with similar characteristics, i.e., location, focal mechanisms, and magnitude range). We also apply coincidence verification for each event: the number of false detections can be lowered by rejecting events declared at one station only and requiring coincidence at two or more stations in the network. In further work we would like to retrain the network for each station individually, so that each station has its own set of coefficients (neural weights), and to apply the method to data from the Reykjanet network located on the Reykjanes peninsula, Iceland. As soon as we have reliable detection, we can proceed to the subsequent steps of automatic processing.
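
    A sketch of the detector inputs described above, STA/LTA ratios of a band-filtered trace, one per half-octave band, using scipy; the window lengths, band edges, and synthetic trace are illustrative:

    import numpy as np
    from scipy.signal import butter, sosfilt

    def sta_lta(x, fs, sta=0.5, lta=10.0):
        """Ratio of short-term to long-term average signal energy."""
        e = np.concatenate(([0.0], np.cumsum(x ** 2)))
        ns, nl = int(sta * fs), int(lta * fs)
        sta_v = (e[ns:] - e[:-ns]) / ns
        lta_v = (e[nl:] - e[:-nl]) / nl
        n = min(sta_v.size, lta_v.size)
        return sta_v[-n:] / (lta_v[-n:] + 1e-12)

    fs = 100.0
    rng = np.random.default_rng(3)
    trace = rng.normal(0.0, 1.0, int(120 * fs))
    trace[6000:6200] += 8.0 * rng.normal(0.0, 1.0, 200)  # synthetic event

    f = 1.0
    while f * 2 ** 0.5 < fs / 2:                  # half-octave filter bank
        sos = butter(4, [f, f * 2 ** 0.5], btype="band", fs=fs, output="sos")
        ratio = sta_lta(sosfilt(sos, trace), fs)
        print(f"{f:6.2f}-{f * 2 ** 0.5:6.2f} Hz  max STA/LTA {ratio.max():.1f}")
        f *= 2 ** 0.5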

  18. Discriminative boundary detection for model-based heart segmentation in CT images

    NASA Astrophysics Data System (ADS)

    Peters, Jochen; Ecabert, Olivier; Schramm, Hauke; Weese, Jürgen

    2007-03-01

    Segmentation of organs in medical images can be successfully performed with deformable models. Most approaches combine a boundary detection step with some smoothness or shape constraint. An objective function for the model deformation is thus established from two terms: the first one attracts the surface model to the detected boundaries while the second one keeps the surface smooth or close to expected shapes. In this work, we assign locally varying boundary detection functions to all parts of the surface model. These functions combine an edge detector with local image analysis in order to accept or reject possible edge candidates. The goal is to optimize the discrimination between the wanted and misleading boundaries. We present a method to automatically learn from a representative set of 3D training images which features are optimal at each position of the surface model. The basic idea is to simulate the boundary detection for the given 3D images and to select those features that minimize the distance between the detected position and the desired object boundary. The approach is experimentally evaluated for the complex task of full-heart segmentation in CT images. A cyclic cross-evaluation on 25 cardiac CT images shows that the optimized feature training and selection enables robust, fully automatic heart segmentation with a mean error well below 1 mm. Comparing this approach to simpler training schemes that use the same basic formalism to accept or reject edges shows the importance of the discriminative optimization.

  19. ARX model-based gearbox fault detection and localization under varying load conditions

    NASA Astrophysics Data System (ADS)

    Yang, Ming; Makis, Viliam

    2010-11-01

    The development of fault detection schemes for gearbox systems has received considerable attention in recent years. Both time series modeling and feature extraction based on wavelet methods have been considered, mostly under constant load. The constant load assumption implies that changes in vibration data are caused only by deterioration of the gearbox. However, most real gearbox systems operate under varying load and speed, which affect the vibration signature of the system and in general make it difficult to recognize the occurrence of an impending fault. This paper presents a novel approach to detecting and localizing gear failure in a gearbox operating under varying load conditions. First, a residual signal is calculated using an autoregressive model with exogenous variables (ARX) fitted to the time-synchronously averaged (TSA) vibration data and filtered TSA envelopes recorded while the gearbox operated under various load conditions in the healthy state. The gear of interest is divided into several sections, each containing the same number of adjacent teeth. Then, the fault detection and localization indicator is calculated by applying an F-test to the residual signal of the ARX model. The proposed scheme indicates not only when a gear fault occurs but also in which section of the gear. Finally, the performance of the fault detection scheme is verified using full-lifetime vibration data obtained from a gearbox operating under varying load from a new condition to breakdown.
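
    A sketch of the residual-plus-F-test idea, assuming numpy and scipy; the ARX orders, section count, and synthetic vibration and load signals are illustrative:

    import numpy as np
    from scipy.stats import f as f_dist

    def arx_rows(y, u, na=4, nb=4):
        n = max(na, nb)
        rows = [np.concatenate((-y[t - na:t][::-1], u[t - nb:t][::-1]))
                for t in range(n, len(y))]
        return np.array(rows), y[n:]

    rng = np.random.default_rng(4)
    u = np.sin(np.linspace(0.0, 40.0 * np.pi, 4000))   # load-related input
    healthy = 0.8 * np.roll(u, 1) + rng.normal(0.0, 0.05, u.size)
    faulty = healthy.copy()
    faulty[2000:2250] += rng.normal(0.0, 0.3, 250)     # damaged teeth

    X, y = arx_rows(healthy, u)
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)      # fit on healthy data
    r0 = y - X @ theta
    X1, y1 = arx_rows(faulty, u)
    r1 = y1 - X1 @ theta

    # F-test the residual variance of each gear section against healthy.
    for i, idx in enumerate(np.array_split(np.arange(r1.size), 8)):
        F = np.var(r1[idx]) / np.var(r0[idx])
        p = 1.0 - f_dist.cdf(F, idx.size - 1, idx.size - 1)
        print(f"section {i}: F={F:5.2f} p={p:.4f}"
              + (" <- fault" if p < 0.01 else ""))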

  20. Model-Based Design of Tree WSNs for Decentralized Detection

    PubMed Central

    Tantawy, Ashraf; Koutsoukos, Xenofon; Biswas, Gautam

    2015-01-01

    The classical decentralized detection problem of finding the optimal decision rules at the sensor and fusion center, as well as variants that introduce physical channel impairments have been studied extensively in the literature. The deployment of WSNs in decentralized detection applications brings new challenges to the field. Protocols for different communication layers have to be co-designed to optimize the detection performance. In this paper, we consider the communication network design problem for a tree WSN. We pursue a system-level approach where a complete model for the system is developed that captures the interactions between different layers, as well as different sensor quality measures. For network optimization, we propose a hierarchical optimization algorithm that lends itself to the tree structure, requiring only local network information. The proposed design approach shows superior performance over several contentionless and contention-based network design approaches. PMID:26307989

  1. FraudMiner: a novel credit card fraud detection model based on frequent itemset mining.

    PubMed

    Seeja, K R; Zareapoor, Masoumeh

    2014-01-01

    This paper proposes an intelligent credit card fraud detection model for detecting fraud in highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding both legal and fraudulent transaction patterns for each customer using frequent itemset mining. A matching algorithm is also proposed to find which pattern (legal or fraud) an incoming transaction of a particular customer is closer to, and a decision is made accordingly. To handle the anonymous nature of the data, no preference is given to any attribute; each attribute is considered equally in finding the patterns. The performance of the proposed model is evaluated on the UCSD Data Mining Contest 2009 dataset (anonymous and imbalanced), and the model shows a very high fraud detection rate, balanced classification rate, and Matthews correlation coefficient, and a much lower false alarm rate than other state-of-the-art classifiers. PMID:25302317
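
    A toy sketch of the matching step: per-customer legal and fraud attribute patterns (hand-written stand-ins for mined frequent itemsets), with an incoming transaction labeled by whichever side it overlaps most:

    def best_overlap(txn, patterns):
        # Fraction of a pattern's items present in the transaction.
        return max((len(txn & p) / len(p) for p in patterns), default=0.0)

    legal = [frozenset({"amount:low", "hour:day", "country:home"})]
    fraud = [frozenset({"amount:high", "hour:night", "country:foreign"})]

    txn = {"amount:high", "hour:night", "country:home", "merchant:online"}
    label = ("fraud" if best_overlap(txn, fraud) > best_overlap(txn, legal)
             else "legal")
    print(label)  # -> fraud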

  2. FraudMiner: A Novel Credit Card Fraud Detection Model Based on Frequent Itemset Mining

    PubMed Central

    Seeja, K. R.; Zareapoor, Masoumeh

    2014-01-01

    This paper proposes an intelligent credit card fraud detection model for detecting fraud in highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding both legal and fraudulent transaction patterns for each customer using frequent itemset mining. A matching algorithm is also proposed to find which pattern (legal or fraud) an incoming transaction of a particular customer is closer to, and a decision is made accordingly. To handle the anonymous nature of the data, no preference is given to any attribute; each attribute is considered equally in finding the patterns. The performance of the proposed model is evaluated on the UCSD Data Mining Contest 2009 dataset (anonymous and imbalanced), and the model shows a very high fraud detection rate, balanced classification rate, and Matthews correlation coefficient, and a much lower false alarm rate than other state-of-the-art classifiers. PMID:25302317

  3. Model-based assessment of the role of human-induced climate change in the 2005 Caribbean coral bleaching event.

    PubMed

    Donner, Simon D; Knutson, Thomas R; Oppenheimer, Michael

    2007-03-27

    Episodes of mass coral bleaching around the world in recent decades have been attributed to periods of anomalously warm ocean temperatures. In 2005, the sea surface temperature (SST) anomaly in the tropical North Atlantic that may have contributed to the strong hurricane season caused widespread coral bleaching in the Eastern Caribbean. Here, we use two global climate models to evaluate the contribution of natural climate variability and anthropogenic forcing to the thermal stress that caused the 2005 coral bleaching event. Historical temperature data and simulations for the 1870-2000 period show that the observed warming in the region is unlikely to be due to unforced climate variability alone. Simulation of background climate variability suggests that anthropogenic warming may have increased the probability of occurrence of significant thermal stress events for corals in this region by an order of magnitude. Under scenarios of future greenhouse gas emissions, mass coral bleaching in the Eastern Caribbean may become a biannual event in 20-30 years. However, if corals and their symbionts can adapt by 1-1.5 degrees C, such mass bleaching events may not begin to recur at potentially harmful intervals until the latter half of the century. The delay could enable more time to alter the path of greenhouse gas emissions, although long-term "committed warming" even after stabilization of atmospheric CO(2) levels may still represent an additional long-term threat to corals. PMID:17360373

  4. Model-based assessment of the role of human-induced climate change in the 2005 Caribbean coral bleaching event

    SciTech Connect

    Donner, S.D.; Knutson, T.R.; Oppenheimer, M.

    2007-03-27

    Episodes of mass coral bleaching around the world in recent decades have been attributed to periods of anomalously warm ocean temperatures. In 2005, the sea surface temperature (SST) anomaly in the tropical North Atlantic that may have contributed to the strong hurricane season caused widespread coral bleaching in the Eastern Caribbean. Here, the authors use two global climate models to evaluate the contribution of natural climate variability and anthropogenic forcing to the thermal stress that caused the 2005 coral bleaching event. Historical temperature data and simulations for the 1870-2000 period show that the observed warming in the region is unlikely to be due to unforced climate variability alone. Simulation of background climate variability suggests that anthropogenic warming may have increased the probability of occurrence of significant thermal stress events for corals in this region by an order of magnitude. Under scenarios of future greenhouse gas emissions, mass coral bleaching in the Eastern Caribbean may become a biannual event in 20-30 years. However, if corals and their symbionts can adapt by 1-1.5 °C, such mass bleaching events may not begin to recur at potentially harmful intervals until the latter half of the century. The delay could enable more time to alter the path of greenhouse gas emissions, although long-term "committed warming" even after stabilization of atmospheric CO2 levels may still represent an additional long-term threat to corals.

  5. Detecting rare gene transfer events in bacterial populations.

    PubMed

    Nielsen, Kaare M; Bøhn, Thomas; Townsend, Jeffrey P

    2014-01-01

    Horizontal gene transfer (HGT) enables bacteria to access, share, and recombine genetic variation, resulting in genetic diversity that cannot be obtained through mutational processes alone. In most cases, the observation of evolutionary successful HGT events relies on the outcome of initially rare events that lead to novel functions in the new host, and that exhibit a positive effect on host fitness. Conversely, the large majority of HGT events occurring in bacterial populations will go undetected due to lack of replication success of transformants. Moreover, other HGT events that would be highly beneficial to new hosts can fail to ensue due to lack of physical proximity to the donor organism, lack of a suitable gene transfer mechanism, genetic compatibility, and stochasticity in tempo-spatial occurrence. Experimental attempts to detect HGT events in bacterial populations have typically focused on the transformed cells or their immediate offspring. However, rare HGT events occurring in large and structured populations are unlikely to reach relative population sizes that will allow their immediate identification; the exception being the unusually strong positive selection conferred by antibiotics. Most HGT events are not expected to alter the likelihood of host survival to such an extreme extent, and will confer only minor changes in host fitness. Due to the large population sizes of bacteria and the time scales involved, the process and outcome of HGT are often not amenable to experimental investigation. Population genetic modeling of the growth dynamics of bacteria with differing HGT rates and resulting fitness changes is therefore necessary to guide sampling design and predict realistic time frames for detection of HGT, as it occurs in laboratory or natural settings. Here we review the key population genetic parameters, consider their complexity and highlight knowledge gaps for further research. PMID:24432015

  6. Detecting rare gene transfer events in bacterial populations

    PubMed Central

    Nielsen, Kaare M.; Bøhn, Thomas; Townsend, Jeffrey P.

    2014-01-01

    Horizontal gene transfer (HGT) enables bacteria to access, share, and recombine genetic variation, resulting in genetic diversity that cannot be obtained through mutational processes alone. In most cases, the observation of evolutionary successful HGT events relies on the outcome of initially rare events that lead to novel functions in the new host, and that exhibit a positive effect on host fitness. Conversely, the large majority of HGT events occurring in bacterial populations will go undetected due to lack of replication success of transformants. Moreover, other HGT events that would be highly beneficial to new hosts can fail to ensue due to lack of physical proximity to the donor organism, lack of a suitable gene transfer mechanism, genetic compatibility, and stochasticity in tempo-spatial occurrence. Experimental attempts to detect HGT events in bacterial populations have typically focused on the transformed cells or their immediate offspring. However, rare HGT events occurring in large and structured populations are unlikely to reach relative population sizes that will allow their immediate identification; the exception being the unusually strong positive selection conferred by antibiotics. Most HGT events are not expected to alter the likelihood of host survival to such an extreme extent, and will confer only minor changes in host fitness. Due to the large population sizes of bacteria and the time scales involved, the process and outcome of HGT are often not amenable to experimental investigation. Population genetic modeling of the growth dynamics of bacteria with differing HGT rates and resulting fitness changes is therefore necessary to guide sampling design and predict realistic time frames for detection of HGT, as it occurs in laboratory or natural settings. Here we review the key population genetic parameters, consider their complexity and highlight knowledge gaps for further research. PMID:24432015

  7. Comparison of Event Detection Methods for Centralized Sensor Networks

    NASA Technical Reports Server (NTRS)

    Sauvageon, Julien; Agogino, Alice M.; Farhang, Ali; Tumer, Irem Y.

    2006-01-01

    The development of Integrated Vehicle Health Management (IVHM) for space vehicles has become a great concern, and smart sensor networks are one of the promising technologies attracting attention. In this paper, we propose a qualitative comparison of several local event (hot spot) detection algorithms in centralized redundant sensor networks. The algorithms are compared regarding their ability to locate and evaluate an event under noise and sensor failures. The purpose of this study is to check whether the performance-to-computational-power ratio of the Mote Fuzzy Validation and Fusion algorithm is favorable compared to simpler methods.

  8. Detection and interpretation of seismoacoustic events at German infrasound stations

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Koch, Karl; Ceranna, Lars

    2016-04-01

    Three infrasound arrays with collocated or nearby installed seismometers are operated by the Federal Institute for Geosciences and Natural Resources (BGR) as the German National Data Center (NDC) for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Infrasound generated by seismoacoustic events is routinely detected at these infrasound arrays, but air-to-ground coupled acoustic waves occasionally show up in seismometer recordings as well. Different natural and artificial sources like meteoroids as well as industrial and mining activity generate infrasonic signatures that are simultaneously detected at microbarometers and seismometers. Furthermore, many near-surface sources like earthquakes and explosions generate both seismic and infrasonic waves that can be detected successively with both technologies. The combined interpretation of seismic and acoustic signatures provides additional information about the origin time and location of remote infrasound events or about the characterization of seismic events distinguishing man-made and natural origins. Furthermore, seismoacoustic studies help to improve the modelling of infrasound propagation and ducting in the atmosphere and allow quantifying the portion of energy coupled into ground and into air by seismoacoustic sources. An overview of different seismoacoustic sources and their detection by German infrasound stations as well as some conclusions on the benefit of a combined seismoacoustic analysis are presented within this study.

  9. Automatic Detection of Student Mental Models Based on Natural Language Student Input during Metacognitive Skill Training

    ERIC Educational Resources Information Center

    Lintean, Mihai; Rus, Vasile; Azevedo, Roger

    2012-01-01

    This article describes the problem of detecting the student mental models, i.e. students' knowledge states, during the self-regulatory activity of prior knowledge activation in MetaTutor, an intelligent tutoring system that teaches students self-regulation skills while learning complex science topics. The article presents several approaches to…

  10. Model Based Determination of Detection Limits for Proton Transfer Reaction Mass Spectrometer

    NASA Astrophysics Data System (ADS)

    Amann, Anton; Schwarz, Konrad; Wimmer, Gejza; Witkovský, Viktor

    2010-01-01

    Proton Transfer Reaction Mass Spectrometry (PTR-MS) is a chemical ionization mass spectrometric technique that allows trace gases to be measured, for example, in exhaled human breath. The quantification of compounds at low concentrations is desirable for medical diagnostics. Typically, measurement accuracy increases as the duration of the measuring process is extended, but for real-time measurements the time windows are relatively short in order to achieve good time resolution (e.g., breath-to-breath resolution during exercise on a stationary bicycle). Determination of statistical detection limits is typically based on calibration measurements, but this approach is limited, especially at very low concentrations. To overcome this problem, the calculation of the limit of quantification (LOQ) and the limit of detection (LOD) based on a theoretical model of the measurement process is outlined.

  11. PMU Data Event Detection: A User Guide for Power Engineers

    SciTech Connect

    Allen, A.; Singh, M.; Muljadi, E.; Santoso, S.

    2014-10-01

    This user guide is intended to accompany a software package containing a Matrix Laboratory (MATLAB) script and related functions for processing phasor measurement unit (PMU) data. This package and guide have been developed by the National Renewable Energy Laboratory and the University of Texas at Austin. The objective of this data processing exercise is to discover events in the vast quantities of data collected by PMUs. This document attempts to cover some of the theory behind processing the data to isolate events as well as the functioning of the MATLAB scripts. The report describes (1) the algorithms and mathematical background that the accompanying MATLAB codes use to detect events in PMU data and (2) the inputs required from the user and the outputs generated by the scripts.

  12. Gait Event Detection during Stair Walking Using a Rate Gyroscope

    PubMed Central

    Formento, Paola Catalfamo; Acevedo, Ruben; Ghoussayni, Salim; Ewins, David

    2014-01-01

    Gyroscopes have been proposed as sensors for ambulatory gait analysis and functional electrical stimulation systems. These applications often require detection of the initial contact (IC) of the foot with the floor and/or final contact or foot off (FO) from the floor during outdoor walking. Previous investigations have reported the use of a single gyroscope placed on the shank for detection of IC and FO on level ground and incline walking. This paper describes the evaluation of a gyroscope placed on the shank for determination of IC and FO in subjects ascending and descending a set of stairs. Performance was compared with a reference pressure measurement system. The absolute mean difference between the gyroscope and the reference was less than 45 ms for IC and better than 135 ms for FO for both activities. Detection success was over 93%. These results provide preliminary evidence supporting the use of a gyroscope for gait event detection when walking up and down stairs. PMID:24651724
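
    A sketch of a peak-based reading of a shank gyroscope signal: mid-swing appears as a large positive peak in sagittal angular velocity, with initial contact (IC) and foot off (FO) near the troughs after and before it. The synthetic waveform and thresholds are illustrative, not the paper's detection rules:

    import numpy as np
    from scipy.signal import find_peaks

    fs = 100.0
    t = np.arange(0.0, 5.0, 1.0 / fs)
    # Crude periodic shank angular velocity, one gait cycle per second.
    omega = (1.5 * np.sin(2 * np.pi * t) ** 3
             - 0.8 * np.sin(4 * np.pi * t + 0.6))

    swing, _ = find_peaks(omega, height=1.0, distance=int(0.6 * fs))
    troughs, _ = find_peaks(-omega, height=0.3)

    for ms in swing:
        after = troughs[troughs > ms]    # first trough after mid-swing: IC
        before = troughs[troughs < ms]   # last trough before mid-swing: FO
        if after.size and before.size:
            print(f"mid-swing {t[ms]:.2f}s  "
                  f"FO {t[before[-1]]:.2f}s  IC {t[after[0]]:.2f}s")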

  13. Integrated Seismic Event Detection and Location by Advanced Array Processing

    SciTech Connect

    Kvaerna, T; Gibbons, S J; Ringdal, F; Harris, D B

    2007-02-09

    The principal objective of this two-year study is to develop and test a new advanced, automatic approach to seismic detection/location using array processing. We address a strategy to obtain significantly improved precision in the location of low-magnitude events compared with current fully automatic approaches, combined with a low false alarm rate. We have developed and evaluated a prototype automatic system which uses as a basis regional array processing with fixed, carefully calibrated, site-specific parameters in conjunction with improved automatic phase onset time estimation. In parallel, we have developed tools for Matched Field Processing for optimized detection and source-region identification of seismic signals. This narrow-band procedure aims to mitigate some of the causes of difficulty encountered with the standard array processing system, specifically complicated source-time histories of seismic events and shortcomings in the plane-wave approximation for seismic phase arrivals at regional arrays.

  14. Gaussian mixture model based approach to anomaly detection in multi/hyperspectral images

    NASA Astrophysics Data System (ADS)

    Acito, N.; Diani, M.; Corsini, G.

    2005-10-01

    Anomaly detectors reveal the presence of objects/materials in a multi/hyperspectral image simply by searching for those pixels whose spectrum differs from the background (anomalies). This procedure can be applied directly to the radiance at the sensor level and has the great advantage of avoiding the difficult step of atmospheric correction. The most popular anomaly detector is the RX algorithm derived by Yu and Reed. It is based on the assumption that the pixels in a region around the one under test follow a single multivariate Gaussian distribution. Unfortunately, such a hypothesis is generally not met in actual scenarios, and a large number of false alarms is usually experienced when the RX algorithm is applied in practice. In this paper, a more general approach to anomaly detection is considered, based on the assumption that the background contains different terrain types (clusters), each of them Gaussian distributed. In this approach the parameters of each cluster are estimated and used in the detection process. Two detectors are considered: the SEM-RX and the K-means RX. Both algorithms follow two steps: first, the parameters of the background clusters are estimated; then, a detection rule based on the RX test is applied. The SEM-RX stems from the Gaussian mixture model (GMM) and employs the SEM algorithm to estimate the clusters' parameters, while the K-means RX resorts to the well-known K-means algorithm to obtain the background clusters. An automatic procedure is defined, for both detectors, to select the number of clusters, and a novel criterion is proposed to set the test threshold. The performance of the two detectors is evaluated on an experimental data set and compared to that of the RX algorithm, with the comparative analysis carried out in terms of experimental receiver operating characteristics.
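
    A sketch of the clustered-background test, assuming scikit-learn: fit a Gaussian mixture to background spectra and score each pixel by its minimum Mahalanobis distance over the clusters, an RX-style test per cluster. The data dimensions and threshold are illustrative:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(5)
    background = np.vstack([rng.normal(0.0, 1.0, (500, 6)),   # terrain 1
                            rng.normal(4.0, 1.0, (500, 6))])  # terrain 2
    pixels = np.vstack([rng.normal(0.0, 1.0, (5, 6)),
                        rng.normal(10.0, 1.0, (2, 6))])       # 2 anomalies

    gmm = GaussianMixture(n_components=2,
                          covariance_type="full").fit(background)

    d2 = np.empty((pixels.shape[0], gmm.n_components))
    for k in range(gmm.n_components):
        diff = pixels - gmm.means_[k]
        inv = np.linalg.inv(gmm.covariances_[k])
        d2[:, k] = np.einsum("ij,jk,ik->i", diff, inv, diff)

    print(d2.min(axis=1) > 30.0)  # True where no cluster explains the pixel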

  15. Hybrid light transport model based bioluminescence tomography reconstruction for early gastric cancer detection

    NASA Astrophysics Data System (ADS)

    Chen, Xueli; Liang, Jimin; Hu, Hao; Qu, Xiaochao; Yang, Defu; Chen, Duofang; Zhu, Shouping; Tian, Jie

    2012-03-01

    Gastric cancer is the second leading cause of cancer-related death in the world, and it remains difficult to cure because it is usually at a late stage when found; early detection thus becomes an effective approach to decreasing gastric cancer mortality. Bioluminescence tomography (BLT) has been applied to detect early liver cancer and prostate cancer metastasis. However, gastric cancer commonly originates from the gastric mucosa and grows outwards, so the bioluminescent light passes through a non-scattering region formed by the gastric pouch as it propagates through tissue. Thus, current BLT reconstruction algorithms based on approximations to the radiative transfer equation are not well suited to this problem. To address this gastric-cancer-specific problem, this paper presents a novel reconstruction algorithm that uses a hybrid light transport model to describe bioluminescent light propagation in tissue. Radiosity theory is integrated with the diffusion equation to form the hybrid light transport model, describing light propagation in the non-scattering region. After finite element discretization, the hybrid light transport model is converted into a minimization problem that incorporates an l1-norm regularization term to reflect the sparsity of the bioluminescent source distribution. The performance of the reconstruction algorithm is first demonstrated in a digital-mouse simulation, with a reconstruction error of less than 1 mm. An experiment on a nude mouse bearing an in situ gastric cancer is then conducted, and the preliminary result demonstrates the ability of the novel BLT reconstruction algorithm for early gastric cancer detection.

  16. Model-based estimation of cardiovascular repolarization features: ischaemia detection and PTCA monitoring.

    PubMed

    Laguna, P; García, J; Roncal, I; Wagner, G; Lander, P; Mark, R

    1998-01-01

    The ST-T segment of the surface ECG reflects cardiac repolarization and is quite sensitive to a number of pathological conditions, particularly ischaemia. ST-T changes generally affect the entire waveshape and are inadequately characterized by single features such as depression of the ST segment at one particular point; metrics representing overall waveshape should provide more sensitive indicators of ST-T wave abnormalities, particularly when they are subtle, intermittent, or periodic. This study discusses a Karhunen-Loève transform (KLT) technique for the analysis of the ST-T waveform. The KL technique was used to analyse the ST-T complexes in the ESC ST-T database. KL coefficients were plotted as a function of time and were effective in the detection of transient ischaemic episodes. Twenty per cent of the records showed bursts of periodic ischaemia, suggesting local vascular instability. A comparison between KL and ST-depression series has shown the KL technique to be more appropriate for the study of ST-T complex variations. Using the KL series, an ischaemia detector has been developed based on a resampled, filtered, and differentiated KL series. This technique demonstrates a sensitivity of 65% and a specificity of 54%. These low values can be due to shifts of the electrical axis which are detected as ischaemic changes, real ischaemic episodes that were not annotated under the protocol used for the European ST-T database, or erroneous detections. An increase in sensitivity can be obtained at the expense of a decrease in the positive predictive value, making this a useful technique for a preliminary scan of the ECG record with subsequent review by an expert. The technique has also been used to monitor patients during a PTCA procedure, demonstrating that it allows PTCA-induced ischaemia to be monitored. A detailed analysis has shown that in some cases a repetitive oscillatory behaviour appears, lasting for a period of around 20 s, and highly related to the

  17. Real-time detection of traffic events using smart cameras

    NASA Astrophysics Data System (ADS)

    Macesic, M.; Jelaca, V.; Niño-Castaneda, J. O.; Prodanovic, N.; Panic, M.; Pizurica, A.; Crnojevic, V.; Philips, W.

    2012-01-01

    With the rapid increase in the number of vehicles on roads, it is necessary to maintain close monitoring of traffic. For this purpose many surveillance cameras are placed along roads and at crossroads, creating a huge communication load between the cameras and the monitoring center. The data therefore need to be processed on site and transferred to the monitoring centers in the form of metadata or as a set of selected images, which requires detecting events of interest on the camera side, using smart cameras as visual sensors. In this paper we propose a method for tracking vehicles and analyzing vehicle trajectories to detect different traffic events. Kalman filtering is used for tracking, combining foreground and optical flow measurements. The obtained vehicle trajectories are used to detect different traffic events: every new trajectory is compared with a collection of normal routes and clustered accordingly, and if the observed trajectory differs from all normal routes by more than a predefined threshold, it is marked as abnormal and an alarm is raised. The system was developed and tested on the Texas Instruments OMAP platform. Testing was done at four different locations, two in the city and two on the open road.

  18. Multi-resolution model-based traffic sign detection and tracking

    NASA Astrophysics Data System (ADS)

    Marinas, Javier; Salgado, Luis; Camplani, Massimo

    2012-06-01

    In this paper we propose an innovative approach to tackle the problem of traffic sign detection using a computer vision algorithm and taking into account real-time operation constraints, trying to establish intelligent strategies to simplify as much as possible the algorithm complexity and to speed up the process. Firstly, a set of candidates is generated according to a color segmentation stage, followed by a region analysis strategy, where spatial characteristic of previously detected objects are taken into account. Finally, temporal coherence is introduced by means of a tracking scheme, performed using a Kalman filter for each potential candidate. Taking into consideration time constraints, efficiency is achieved two-fold: on the one side, a multi-resolution strategy is adopted for segmentation, where global operation will be applied only to low-resolution images, increasing the resolution to the maximum only when a potential road sign is being tracked. On the other side, we take advantage of the expected spacing between traffic signs. Namely, the tracking of objects of interest allows to generate inhibition areas, which are those ones where no new traffic signs are expected to appear due to the existence of a TS in the neighborhood. The proposed solution has been tested with real sequences in both urban areas and highways, and proved to achieve higher computational efficiency, especially as a result of the multi-resolution approach.

  19. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. The approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic, or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
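
    A sketch of the LDA idea, assuming scikit-learn, with network log lines tokenized as short "documents"; the tokens, the topic count, and the use of a low maximum topic weight as the anomaly signal are illustrative choices, not the paper's feature set:

    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.feature_extraction.text import CountVectorizer

    normal_logs = ["GET 200 internal small", "GET 200 internal small",
                   "POST 200 internal small", "GET 304 internal small"] * 25
    suspect = ["POST 200 external huge nightly"]

    vec = CountVectorizer()
    X = vec.fit_transform(normal_logs)
    lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)

    # A flow whose token mix no learned topic explains well gets a flat,
    # low-peaked topic distribution and is flagged for review.
    weights = lda.transform(vec.transform(suspect + normal_logs[:1]))
    print(weights.max(axis=1))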

  20. Hidden Markov Models for Detecting Aseismic Events in Southern California

    NASA Astrophysics Data System (ADS)

    Granat, R.

    2004-12-01

    We employ a hidden Markov model (HMM) to segment surface displacement time series collected by the Southern California Integrated Geodetic Network (SCIGN). These segmented time series are then used to detect regional events by observing the number of simultaneous mode changes across the network: if a large number of stations change at the same time, that indicates an event. The HMM approach assumes that the observed data have been generated by an unobservable dynamical statistical process of a particular form, such that each observation coincides with the system being in a particular discrete state, interpreted as a behavioral mode. The dynamics of the model are constructed so that the next state depends directly only on the current state, i.e., a first-order Markov process. The model is completely described by a set of parameters: the initial state probabilities, the first-order Markov chain state-to-state transition probabilities, and the probability distribution of observable outputs associated with each state. As a result, our segmentation decisions are based entirely on statistical changes in the behavior of the observed daily displacements. In general, finding the optimal model parameters to fit the data is a difficult problem. We present an innovative model fitting method that is unsupervised (i.e., it requires no labeled training data) and uses a regularized version of the expectation-maximization (EM) algorithm to ensure that model solutions are both robust with respect to initial conditions and of high quality. We demonstrate the reliability of the method compared to standard model fitting methods and show that it results in lower noise in the mode-change correlation signal used to detect regional events. We compare candidate events detected by this method to the seismic record and observe that most are not correlated with a significant seismic event. Our analysis
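
    A sketch of the pipeline described above, assuming the hmmlearn package: fit a Gaussian HMM per station, take the Viterbi state sequence as behavioral modes, and count simultaneous mode changes across the network. The synthetic displacement series and the event rule are illustrative:

    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(6)
    n_days, n_stations = 300, 12
    series = rng.normal(0.0, 1.0, (n_stations, n_days))
    series[:, 150:] += 5.0                   # network-wide offset on day 150

    changes = np.zeros(n_days)
    for s in series:
        hmm = GaussianHMM(n_components=2, n_iter=50, random_state=0)
        hmm.fit(s.reshape(-1, 1))
        states = hmm.predict(s.reshape(-1, 1))
        changes[1:] += states[1:] != states[:-1]   # per-station mode changes

    # Days where many stations switch modes together suggest an event.
    print(np.nonzero(changes >= n_stations // 2)[0])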

  1. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.

  2. [Establishment and Improvement of Portable X-Ray Fluorescence Spectrometer Detection Model Based on Wavelet Transform].

    PubMed

    Li, Fang; Wang, Ji-hua; Lu, An-xiang; Han, Ping

    2015-04-01

    The concentrations of Cr, Cu, Zn, As, and Pb in soil were tested with a portable X-ray fluorescence spectrometer. Each sample was tested 3 times; then, after denoising and smoothing the spectra with a wavelet threshold noise filtering method, a standard curve for each heavy metal was established from the standard values of the heavy metals in soil and the corresponding counts, taken as the average of the 3 processed spectra. The signal-to-noise ratio (SNR), mean square error (MSE), and information entropy (H) were used to assess the denoising performance of the wavelet threshold noise filtering method and to determine the best wavelet basis and decomposition level. Some samples with different concentrations, together with H3BO3 (blank), were chosen for retesting to verify the instrument's stability. The results show that the best denoising was obtained with the coif3 wavelet basis at decomposition level 3. The determination coefficient (R2) of the instrument ranges from 0.990 to 0.996, indicating a high degree of linearity between the heavy metal contents in soil and the corresponding X-ray fluorescence characteristic peak intensities within the measurement range (0-1,500 mg·kg(-1)). Retesting and calculation indicate that all detection limits of the instrument are below the national soil standards. The accuracy of the model is effectively improved, and the instrument shows good precision, with the practical application of the wavelet transform to the establishment and improvement of the X-ray fluorescence spectrometer detection model. Thus the instrument can be applied to on-site rapid screening of heavy metals in contaminated soil. PMID:26197612
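
    A sketch of the denoising step with the PyWavelets package, using the coif3 basis at decomposition level 3 as reported; the universal soft threshold is a common default, not necessarily the paper's rule, and the spectrum is synthetic:

    import numpy as np
    import pywt

    rng = np.random.default_rng(7)
    clean = np.exp(-0.5 * ((np.arange(1024) - 500) / 20.0) ** 2)  # XRF-like peak
    noisy = clean + rng.normal(0.0, 0.05, clean.size)

    coeffs = pywt.wavedec(noisy, "coif3", level=3)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise scale estimate
    thr = sigma * np.sqrt(2.0 * np.log(noisy.size))    # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                            for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "coif3")[: noisy.size]

    print(f"MSE before {np.mean((noisy - clean) ** 2):.5f}, "
          f"after {np.mean((denoised - clean) ** 2):.5f}")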

  3. Application of Kalman Filtering Techniques for Microseismic Event Detection

    NASA Astrophysics Data System (ADS)

    Baziw, E.; Weir-Jones, I.

    - Microseismic monitoring systems are generally installed in areas of induced seismicity caused by human activity. Induced seismicity results from changes in the state of stress which may occur as a result of excavation within the rock mass in mining (i.e., rockbursts), and changes in hydrostatic pressures and rock temperatures (e.g., during fluid injection or extraction) in oil exploitation, dam construction or fluid disposal. Microseismic monitoring systems determine event locations and important source parameters such as attenuation, seismic moment, source radius, static stress drop, peak particle velocity and seismic energy. An essential part of the operation of a microseismic monitoring system is the reliable detection of microseismic events. In the absence of reliable, automated picking techniques, operators rely upon manual picking. This is time-consuming, costly and, in the presence of background noise, very prone to error. The techniques described in this paper not only permit the reliable identification of events in cluttered signal environments; they have also enabled the authors to develop reliable automated event picking procedures. This opens the way to use microseismic monitoring as a cost-effective production/operations procedure. It has been the experience of the authors that in certain noisy environments, the seismic monitoring system may trigger on and subsequently acquire substantial quantities of erroneous data, due to the high energy content of the ambient noise. Digital filtering techniques need to be applied to the microseismic data so that the ambient noise is removed and event detection simplified. The monitoring of seismic acoustic emissions is a continuous, real-time process, so it is desirable to implement digital filters that can be designed in the time domain and applied in real time, such as the Kalman filter. This paper presents a real-time Kalman filter which removes the statistically describable background noise from the recorded
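
    A minimal time-domain sketch of the idea: a scalar Kalman filter with a random-walk signal model smoothing a noisy trace sample by sample, as real-time use requires. The process and measurement variances are assumptions, not the paper's tuned values.

    ```python
    # Scalar Kalman filter stripping statistically describable noise
    # from a trace, one sample at a time.
    import numpy as np

    def kalman_denoise(z, q=1e-4, r=0.5):
        """z: noisy samples; q: process variance; r: measurement variance."""
        x, p = 0.0, 1.0                  # state estimate and its variance
        out = np.empty_like(z)
        for k, zk in enumerate(z):
            p = p + q                    # predict (random-walk signal model)
            kgain = p / (p + r)          # Kalman gain
            x = x + kgain * (zk - x)     # update with the new sample
            p = (1.0 - kgain) * p
            out[k] = x
        return out

    rng = np.random.default_rng(2)
    trace = np.sin(np.linspace(0, 6, 2000)) + rng.normal(0, 0.7, 2000)
    clean = kalman_denoise(trace)
    ```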

  4. Swarm intelligence for detecting interesting events in crowded environments.

    PubMed

    Kaltsa, Vagia; Briassouli, Alexia; Kompatsiaris, Ioannis; Hadjileontiadis, Leontios J; Strintzis, Michael Gerasimos

    2015-07-01

    This paper focuses on detecting and localizing anomalous events in videos of crowded scenes, i.e., divergences from a dominant pattern. Both motion and appearance information are considered, so as to robustly distinguish different kinds of anomalies, for a wide range of scenarios. A newly introduced concept based on swarm theory, histograms of oriented swarms (HOS), is applied to capture the dynamics of crowded environments. HOS, together with the well-known histograms of oriented gradients, are combined to build a descriptor that effectively characterizes each scene. These appearance and motion features are only extracted within spatiotemporal volumes of moving pixels to ensure robustness to local noise, increase accuracy in the detection of local, nondominant anomalies, and achieve a lower computational cost. Experiments on benchmark data sets containing various situations with human crowds, as well as on traffic data, led to results that surpassed the current state of the art (SoA), confirming the method's efficacy and generality. Finally, the experiments show that our approach achieves significantly higher accuracy, especially for pixel-level event detection compared to SoA methods, at a low computational cost. PMID:25769154

  5. Endmember detection in marine environment with oil spill event

    NASA Astrophysics Data System (ADS)

    Andreou, Charoula; Karathanassi, Vassilia

    2011-11-01

    Oil spill events are a crucial environmental issue. Detection of oil spills is important for both oil exploration and environmental protection. In this paper, hyperspectral remote sensing is investigated for the detection of oil spills and the discrimination of different oil types. Spectral signatures of different oil types are very useful, since they may serve as endmembers in unmixing and classification models. Towards this direction, an oil spectral library was compiled from spectral measurements of artificial oil spills as well as of look-alikes in the marine environment. Samples of four different oil types were used: two crude oils, one marine residual fuel oil, and one light petroleum product. Look-alikes comprise sea water, river discharges, shallow water and water with algae. Spectral measurements were acquired with the GER1500 spectro-radiometer. Moreover, the oil and look-alike spectral signatures were examined to determine whether they can serve as endmembers, which was accomplished by testing their linear independence. After that, synthetic hyperspectral images based on the oil spectral library were created. Several simplex-based endmember algorithms, such as sequential maximum angle convex cone (SMACC), vertex component analysis (VCA), the n-finder algorithm (N-FINDR), and the automatic target generation process (ATGP), were applied to the synthetic images in order to evaluate their effectiveness in detecting oil spill events arising from different oil types. Results showed that different types of oil spills with various thicknesses can be extracted as endmembers.
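
    The linear-independence test can be as simple as a matrix-rank check on the stacked candidate spectra; the random library below is only a stand-in for the GER1500 measurements.

    ```python
    # Rank check: candidate endmember spectra must be linearly independent.
    import numpy as np

    rng = np.random.default_rng(3)
    library = rng.random((8, 512))       # 8 candidate endmembers, 512 bands

    rank = np.linalg.matrix_rank(library)
    print("linearly independent" if rank == library.shape[0]
          else f"only {rank} independent spectra")
    ```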

  6. A Pulse-type Hardware Level Difference Detection Model Based on Sound Source Localization Mechanism in Barn Owl

    NASA Astrophysics Data System (ADS)

    Sakurai, Tsubasa; Sekine, Yoshifumi

    Auditory information processing is very important in darkness, where visual information is extremely limited. Barn owls have excellent auditory information processing abilities: they can localize a sound source with an accuracy of better than two degrees in both the vertical and horizontal directions. When performing sound source localization, barn owls use the interaural time difference for localization in the horizontal plane, and the interaural level difference for localization in the vertical plane. We are constructing a two-dimensional sound source localization model using pulse-type hardware neuron models based on the sound source localization mechanism of the barn owl, with engineering applications in mind. In this paper, we propose a pulse-type hardware model for level difference detection based on the sound source localization mechanism of the barn owl. First, we discuss the response characteristics of the mathematical model for level difference detection. Next, we discuss the response characteristics of the hardware model. As a result, we show that the proposed model can be used as a sound source localization model for the vertical direction.

  7. An Automated Visual Event Detection System for Cabled Observatory Video

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Cline, D. E.; Mariette, J.

    2007-12-01

    The permanent presence of underwater cameras on oceanic cabled observatories, such as the Victoria Experimental Network Under the Sea (VENUS) and Eye-In-The-Sea (EITS) on the Monterey Accelerated Research System (MARS), will generate valuable data that can advance our understanding of the underwater world. However, sightings of underwater animal activities are rare, resulting in the recording of many hours of video with relatively few events of interest. The burden of video management and analysis often requires reducing the amount of video recorded and later analyzed. Often there are not enough human resources to analyze the video, and the strain on human attention needed to analyze it demands an automated way to assist in the task. Towards this end, an Automated Visual Event Detection System (AVED) is in development at the Monterey Bay Aquarium Research Institute (MBARI) to address the problem of analyzing cabled observatory video. Here we describe the overall design of the system to process video data and enable science users to analyze the results. We present our results analyzing video from the VENUS observatory and test data from EITS deployments. This automated system for detecting visual events includes a collection of custom and open source software that can be run three ways: through a Web Service, through a Condor-managed pool of AVED-enabled compute servers, or locally on a single computer. The collection of software also includes a graphical user interface to preview or edit detected results and to set up processing options. To optimize the compute-intensive AVED algorithms, a parallel program has been implemented for high-data-rate applications like the EITS instrument on MARS.

  8. The waveform correlation event detection system global prototype software design

    SciTech Connect

    Beiriger, J.I.; Moore, S.G.; Trujillo, J.R.; Young, C.J.

    1997-12-01

    The WCEDS prototype software system was developed to investigate the usefulness of waveform correlation methods for CTBT monitoring. The WCEDS prototype performs global seismic event detection and has been used in numerous experiments. This report documents the software system design, presenting an overview of the system operation, describing the system functions, tracing the information flow through the system, discussing the software structures, and describing the subsystem services and interactions. The effectiveness of the software design in meeting project objectives is considered, as well as opportunities for code reuse and lessons learned from the development process. The report concludes with recommendations for modifications and additions envisioned for a regional waveform-correlation-based detector.

  9. Use of sonification in the detection of anomalous events

    NASA Astrophysics Data System (ADS)

    Ballora, Mark; Cole, Robert J.; Kruesi, Heidi; Greene, Herbert; Monahan, Ganesh; Hall, David L.

    2012-06-01

    In this paper, we describe the construction of a soundtrack that fuses stock market data with information taken from tweets. This soundtrack, or auditory display, presents the numerical and text data in such a way that anomalous events may be readily detected, even by untrained listeners. The soundtrack generation is flexible, allowing an individual listener to create a unique audio mix from the available information sources. Properly constructed, the display exploits the auditory system's sensitivities to periodicities, to dynamic changes, and to patterns. This type of display could be valuable in environments that demand high levels of situational awareness based on multiple sources of incoming information.

  10. Automatic adverse drug events detection using letters to the editor.

    PubMed

    Yang, Chao; Srinivasan, Padmini; Polgreen, Philip M

    2012-01-01

    We present and test the intuition that letters to the editor in journals carry early signals of adverse drug events (ADEs). Surprisingly, these letters have not yet been exploited for automatic ADE detection, unlike, for example, clinical records and PubMed. Part of the challenge is that it is not easy to access the full text of letters (for the most part these do not appear in PubMed). Also, letters are likely underrated in comparison with full articles. Besides demonstrating that this intuition holds, we contribute techniques for post-market drug surveillance. Specifically, we test an automatic approach for ADE detection from letters using off-the-shelf machine learning tools. We also employ natural language processing for feature definitions. Overall we achieve high accuracy in our experiments and our method also works well on a second new test set. Our results encourage us to further pursue this line of research. PMID:23304379

  11. Automatic Adverse Drug Events Detection Using Letters to the Editor

    PubMed Central

    Yang, Chao; Srinivasan, Padmini; Polgreen, Philip M.

    2012-01-01

    We present and test the intuition that letters to the editor in journals carry early signals of adverse drug events (ADEs). Surprisingly, these letters have not yet been exploited for automatic ADE detection, unlike, for example, clinical records and PubMed. Part of the challenge is that it is not easy to access the full text of letters (for the most part these do not appear in PubMed). Also, letters are likely underrated in comparison with full articles. Besides demonstrating that this intuition holds, we contribute techniques for post-market drug surveillance. Specifically, we test an automatic approach for ADE detection from letters using off-the-shelf machine learning tools. We also employ natural language processing for feature definitions. Overall we achieve high accuracy in our experiments and our method also works well on a second new test set. Our results encourage us to further pursue this line of research. PMID:23304379

  12. Increased SERS detection efficiency for characterizing rare events in flow.

    PubMed

    Jacobs, Kevin T; Schultz, Zachary D

    2015-08-18

    Improved surface-enhanced Raman scattering (SERS) measurements of a flowing aqueous sample are accomplished by combining line focus optics with sheath-flow SERS detection. The straightforward introduction of a cylindrical lens into the optical path of the Raman excitation laser increases the efficiency of SERS detection and the reproducibility of SERS signals at low concentrations. The width of the line focus is matched to the width of the sample capillary from which the analyte elutes under hydrodynamic focusing conditions, allowing for increased collection across the SERS substrate while maintaining the power density below the damage threshold at any specific point. We show that a 4× increase in power spread across the line increases the signal-to-noise ratio by a factor of 2 for a variety of analytes, such as rhodamine 6G, amino acids, and lipid vesicles, without any detectable photodamage. COMSOL simulations and Raman maps elucidate the hydrodynamic focusing properties of the flow cell, providing a clearer picture of the confinement effects at the surface where the sample exits the capillary. The lipid vesicle results suggest that the combination of hydrodynamic focusing and increased optical collection enables the reproducible detection of rare events, in this case individual lipid vesicles. PMID:26168151

  13. Local Seismic Event Detection Using Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    West, J. D.; Fouch, M. J.

    2013-12-01

    The large footprint of regularly-spaced broadband seismometers afforded by EarthScope's USArray Transportable Array (TA) [www.usarray.org] presents an unprecedented opportunity to develop novel seismic array processing methods. Here we report preliminary results from a new automated method for detecting small local seismic events within the footprint of the TA using image processing techniques. The overarching goal is to develop a new methodology for automated searches of large seismic datasets for signals that are difficult to detect by traditional means, such as STA/LTA triggering algorithms. We first process the raw broadband data for each station by bandpass filtering at 7-19 Hz and integrating the absolute value of the velocity waveform over a sequence of 5-second intervals. We further combine the integrated values of all three orthogonal channels into a single new time series with a 5-second sampling rate. This new time series is analogous to a measurement of the total seismic energy recorded at the station in each 5-second interval; we call this time series Integrated Ground Motion (IGM). Each sample is compared to a sliding longer-term average to remove diurnal and long-term noise effects. We create an image file by mapping each station location to an equivalent position in a blank image array, and use a modified Voronoi tessellation algorithm to assign each pixel in the image to the IGM value of the nearest station. We assign a value of zero if the pixel is more than a maximum distance from the nearest station. We apply 2-dimensional spatial image filtering techniques to remove large-scale features affecting much of the image, as we assume these likely result from teleseismic events. We also filter the time series to remove very small-scale features from noise spikes affecting a single seismic station. The resulting image contains only features of regional scale affecting 2 or more stations. For each of the remaining image features, we find the center
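
    A sketch of the Integrated Ground Motion (IGM) computation as described: bandpass at 7-19 Hz, integrate the absolute velocity over 5-second windows, and sum the three components. The sample rate and the synthetic traces are assumptions.

    ```python
    # Integrated Ground Motion (IGM) per 5-second window for one station.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 40.0                                    # samples per second (assumed)
    win = int(5 * fs)                            # 5-second integration window
    b, a = butter(4, [7.0, 19.0], btype="bandpass", fs=fs)

    rng = np.random.default_rng(4)
    channels = rng.normal(0.0, 1.0, (3, 12000))  # Z, N, E velocity traces

    igm = 0.0
    for ch in channels:
        filt = filtfilt(b, a, ch)
        n = filt.size // win
        igm = igm + np.abs(filt[: n * win]).reshape(n, win).sum(axis=1)

    # Compare each 5 s sample to a sliding longer-term average to remove
    # diurnal and long-term noise effects, as described above.
    long_avg = np.convolve(igm, np.ones(120) / 120.0, mode="same")
    ratio = igm / long_avg
    ```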

  14. A robustness study of parametric and non-parametric tests in model-based multifactor dimensionality reduction for epistasis detection

    PubMed Central

    2013-01-01

    Background Applying a statistical method implies identifying underlying (model) assumptions and checking their validity in the particular context. One of these contexts is association modeling for epistasis detection. Here, depending on the technique used, violation of model assumptions may result in increased type I error, power loss, or biased parameter estimates. Remedial measures for violated underlying conditions or assumptions include data transformation or selecting a more relaxed modeling or testing strategy. Model-Based Multifactor Dimensionality Reduction (MB-MDR) for epistasis detection relies on association testing between a trait and a factor consisting of multilocus genotype information. For quantitative traits, the framework is essentially Analysis of Variance (ANOVA), which decomposes the variability in the trait amongst the different factors. In this study, we assess through simulations the cumulative effect of deviations from normality and homoscedasticity on the overall performance of quantitative MB-MDR to detect 2-locus epistasis signals in the absence of main effects. Methodology Our simulation study focuses on pure epistasis models with varying degrees of genetic influence on a quantitative trait. Conditional on a multilocus genotype, we consider quantitative trait distributions that are normal, chi-square or Student's t with constant or non-constant phenotypic variances. All data are analyzed with MB-MDR using the built-in Student's t-test for association, as well as a novel MB-MDR implementation based on Welch's t-test. Traits are either left untransformed or are transformed into new traits via logarithmic, standardization or rank-based transformations, prior to MB-MDR modeling. Results Our simulation results show that MB-MDR controls type I error and false positive rates irrespective of the association test considered. Empirically-based MB-MDR power estimates for MB-MDR with Welch
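
    The Student-versus-Welch distinction at the heart of the two MB-MDR variants reduces to one argument in SciPy; the heteroscedastic genotype-group traits below are synthetic.

    ```python
    # Student's t-test (equal variances assumed) vs. Welch's t-test.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(5)
    g1 = rng.normal(0.0, 1.0, 60)       # trait values, genotype group 1
    g2 = rng.normal(0.4, 3.0, 40)       # group 2 with unequal variance

    t_student, p_student = ttest_ind(g1, g2)               # assumes equal var
    t_welch, p_welch = ttest_ind(g1, g2, equal_var=False)  # Welch's t-test
    print(p_student, p_welch)
    ```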

  15. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
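
    A toy version of the storage idea, with SQLite standing in for MySQL: wavelet detail coefficients of a parameter trend are kept in a relational table so abrupt changes can be queried by scale and position without touching the raw trend. The schema and threshold are assumptions.

    ```python
    # Store wavelet detail coefficients of a trend in a relational table.
    import sqlite3
    import numpy as np
    import pywt

    rng = np.random.default_rng(6)
    trend = np.cumsum(rng.normal(0, 1, 1024))        # e.g., heart-rate trend

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE coeff (level INTEGER, pos INTEGER, value REAL)")
    coeffs = pywt.wavedec(trend, "db4", level=4)
    # level 1 below is the coarsest detail band in this numbering.
    for lvl, band in enumerate(coeffs[1:], start=1):
        db.executemany("INSERT INTO coeff VALUES (?, ?, ?)",
                       [(lvl, i, float(v)) for i, v in enumerate(band)])

    # Query: large coarse-scale coefficients mark abrupt changes.
    rows = db.execute("SELECT pos, value FROM coeff "
                      "WHERE level = 1 AND ABS(value) > 10").fetchall()
    print(rows)
    ```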

  16. Detecting Rare Events in the Time-Domain

    SciTech Connect

    Rest, A; Garg, A

    2008-10-31

    One of the biggest challenges in current and future time-domain surveys is to extract the objects of interest from the immense data stream. There are two aspects to achieving this goal: detecting variable sources and classifying them. Difference imaging provides an elegant technique for identifying new transients or changes in source brightness. Much progress has been made in recent years toward refining the process. We discuss a selection of pitfalls that can afflict an automated difference imaging pipeline and describe some solutions. After identifying true astrophysical variables, we are faced with the challenge of classifying them. For rare events, such as supernovae and microlensing, this challenge is magnified because we must balance selection criteria that admit the largest number of objects of interest against a high contamination rate. We discuss considerations and techniques for developing classification schemes.

  17. Detecting and characterising ramp events in wind power time series

    NASA Astrophysics Data System (ADS)

    Gallego, Cristóbal; Cuerva, Álvaro; Costa, Alexandre

    2014-12-01

    In order to implement accurate models for wind power ramp forecasting, ramps need to be characterised beforehand. This issue has typically been addressed by performing binary ramp/non-ramp classifications based on ad hoc assessed thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, which is obtained by considering large power output gradients evaluated over different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks of the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of ramp-up and ramp-down intensities are obtained for a wind farm located in Spain.
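
    In the spirit of the ramp function (the exact wavelet construction is not reproduced here), the sketch below assigns each time step the signed power gradient of largest magnitude over a set of assumed time scales.

    ```python
    # Multi-scale ramp intensity index for a wind power series.
    import numpy as np

    def ramp_index(power, scales=(1, 2, 4, 6)):
        """power: wind power series; scales: gradient horizons in steps."""
        idx = np.zeros_like(power, dtype=float)
        for s in scales:
            grad = np.zeros_like(idx)
            grad[s:] = (power[s:] - power[:-s]) / s     # gradient at scale s
            keep = np.abs(grad) > np.abs(idx)
            idx[keep] = grad[keep]                      # signed max magnitude
        return idx   # positive: ramp-up intensity, negative: ramp-down

    rng = np.random.default_rng(7)
    p = np.clip(np.cumsum(rng.normal(0, 0.05, 500)), 0, 1)  # toy power output
    print(ramp_index(p)[:10])
    ```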

  18. Event Detection and Spatial Analysis for Characterizing Extreme Precipitation

    NASA Astrophysics Data System (ADS)

    Jeon, S.; Prabhat, M.; Byna, S.; Collins, W.; Wehner, M. F.

    2013-12-01

    Atmospheric Rivers (ARs) are large, spatially coherent weather systems with high concentrations of elevated water vapor that often cause severe downpours and flooding over the western coastal United States. With more atmospheric moisture available in the future under global warming, we expect ARs to play an important role as a potential cause of extreme precipitation. We have recently developed the TECA software for automatically identifying and tracking features in climate datasets. In particular, we are able to identify ARs that make landfall on the western coast of North America. This detection tool examines the integrated water vapor field above a certain threshold and performs geometric analysis. Based on the detection procedure, we investigate the impacts of ARs by exploring the spatial extent of AR precipitation for CMIP5 simulations, and characterize spatial patterns of dependence for future projections under climate change within the framework of extreme value theory. The results show that AR events in the RCP8.5 scenario (2076-2100) tend to produce heavier rainfall with higher frequency and longer duration than the events from the historical run (1981-2005). The range of spatial dependence between extreme precipitation events is concentrated on a smaller localized area in California under the highest emission scenario than at present. Preliminary results are illustrated in Figures 1 and 2. Fig 1: Boxplot of annual max precipitation (left two) and max AR precipitation (right two) from GFDL-ESM2M during a 25-year time period, by station in California, US. Fig 2: Spatial dependence of max AR precipitation calculated from Station 4 (triangle) for the historical run (left) and for future projections under RCP8.5 (right) from GFDL-ESM2M. Green and orange colors represent complete dependence and independence between two stations, respectively.

  19. Barometric pressure and triaxial accelerometry-based falls event detection.

    PubMed

    Bianchi, Federico; Redmond, Stephen J; Narayanan, Michael R; Cerutti, Sergio; Lovell, Nigel H

    2010-12-01

    Falls and fall related injuries are a significant cause of morbidity, disability, and health care utilization, particularly among the age group of 65 years and over. The ability to detect falls events in an unsupervised manner would lead to improved prognoses for falls victims. Several wearable accelerometry and gyroscope-based falls detection devices have been described in the literature; however, they all suffer from unacceptable false positive rates. This paper investigates the augmentation of such systems with a barometric pressure sensor, as a surrogate measure of altitude, to assist in discriminating real fall events from normal activities of daily living. The acceleration and air pressure data are recorded using a wearable device attached to the subject's waist and analyzed offline. The study incorporates several protocols including simulated falls onto a mattress and simulated activities of daily living, in a cohort of 20 young healthy volunteers (12 male and 8 female; age: 23.7 ±3.0 years). A heuristically trained decision tree classifier is used to label suspected falls. The proposed system demonstrated considerable improvements in comparison to an existing accelerometry-based technique; showing an accuracy, sensitivity and specificity of 96.9%, 97.5%, and 96.5%, respectively, in the indoor environment, with no false positives generated during extended testing during activities of daily living. This is compared to 85.3%, 75%, and 91.5% for the same measures, respectively, when using accelerometry alone. The increased specificity of this system may enhance the usage of falls detectors among the elderly population. PMID:20805056
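
    A minimal stand-in for the heuristically trained classifier: a depth-2 decision tree over two features, peak acceleration and pressure-derived altitude change. The synthetic feature values are assumptions, not the study's recordings.

    ```python
    # Two-feature fall/ADL classifier on synthetic data.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(8)
    # Columns: peak acceleration (g), pressure-derived altitude change (m).
    falls = np.column_stack([rng.normal(3.5, 0.6, 40),
                             rng.normal(-0.9, 0.2, 40)])
    adl = np.column_stack([rng.normal(1.8, 0.5, 40),
                           rng.normal(0.0, 0.2, 40)])
    X = np.vstack([falls, adl])
    y = np.array([1] * 40 + [0] * 40)       # 1 = fall, 0 = daily activity

    clf = DecisionTreeClassifier(max_depth=2).fit(X, y)
    print(clf.predict([[3.2, -0.8], [1.5, 0.1]]))   # expect [1 0]
    ```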

  20. Model-based iterative reconstruction and adaptive statistical iterative reconstruction: dose-reduced CT for detecting pancreatic calcification

    PubMed Central

    Katsura, Masaki; Akahane, Masaaki; Sato, Jiro; Matsuda, Izuru; Ohtomo, Kuni

    2016-01-01

    Background Iterative reconstruction methods have attracted attention for reducing radiation doses in computed tomography (CT). Purpose To investigate the detectability of pancreatic calcification using dose-reduced CT reconstructed with model-based iterative reconstruction (MBIR) and adaptive statistical iterative reconstruction (ASIR). Material and Methods This prospective study, approved by the Institutional Review Board, included 85 patients (57 men, 28 women; mean age, 69.9 years; mean body weight, 61.2 kg). Unenhanced CT was performed three times with different radiation doses (reference-dose CT [RDCT], low-dose CT [LDCT], ultralow-dose CT [ULDCT]). From RDCT, LDCT, and ULDCT, images were reconstructed with filtered-back projection (R-FBP, used for establishing the reference standard), ASIR (L-ASIR), and MBIR and ASIR (UL-MBIR and UL-ASIR), respectively. A lesion (pancreatic calcification) detection test was performed by two blinded radiologists with a five-point certainty level scale. Results Dose-length products of RDCT, LDCT, and ULDCT were 410, 97, and 36 mGy·cm, respectively. Nine patients had pancreatic calcification. The sensitivity for detecting pancreatic calcification with UL-MBIR was high (0.67–0.89) compared to L-ASIR or UL-ASIR (0.11–0.44), and a significant difference was seen between UL-MBIR and UL-ASIR for one reader (P = 0.014). The area under the receiver-operating characteristic curve for UL-MBIR (0.818–0.860) was comparable to that for L-ASIR (0.696–0.844). The specificity was lower with UL-MBIR (0.79–0.92) than with L-ASIR or UL-ASIR (0.96–0.99), and a significant difference was seen for one reader (P < 0.01). Conclusion With UL-MBIR, pancreatic calcification can be detected with high sensitivity; however, attention should be paid to the slightly lower specificity. PMID:27110389

  1. Detecting Tidal Disruption Events (TDEs) with the Einstein Probe

    NASA Astrophysics Data System (ADS)

    Yuan, W.; Komossa, S.; Zhang, C.; Feng, H.; Zhang, S.; Osborne, J.; O'Brien, P.; Watson, M.; Fraser, G.

    2014-07-01

    Stars are tidally disrupted and accreted when they approach supermassive black holes (SMBHs) closely, producing a flare of electromagnetic radiation. The majority of the (approximately two dozen) tidal disruption events (TDEs) identified so far have been discovered by their luminous, transient X-ray emission. Once TDEs are detected in much larger numbers, in future dedicated transient surveys, a wealth of new applications will become possible, including (1) measurements of TDE rates as a function of host galaxy type, (2) an assessment of the population of intermediate-mass black holes (IMBHs), and (3) new probes of general relativity and accretion processes. Here, we present the proposed X-ray mission Einstein Probe, which aims at detecting TDEs in large numbers. The mission consists of a wide-field micro-pore Lobster-eye imager (60 deg x 60 deg, or ~1 sr), and is designed to carry out an all-sky transient survey at energies of 0.5-4 keV. It will also carry an X-ray telescope with the same micro-pore optics and a smaller field of view for follow-ups. It will be capable of issuing public transient alerts rapidly.

  2. Communication of ALS Patients by Detecting Event-Related Potential

    NASA Astrophysics Data System (ADS)

    Kanou, Naoyuki; Sakuma, Kenji; Nakashima, Kenji

    Amyotrophic Lateral Sclerosis (ALS) patients are unable to successfully communicate their desires, although their mental capacity is the same as that of non-affected persons. Therefore, the authors focus on the Event-Related Potential (ERP), which shows the strongest response to target visual and auditory stimuli. P300 is one component of the ERP: a positive potential that is elicited when the subject focuses attention on stimuli that appear infrequently. In this paper, the authors focused on the P200 and N200 components, in addition to P300, because of the great improvement they yield in the rate of correct judgment in the target word-specific experiment. Hence the authors propose an algorithm that identifies target words by detecting these three components. Ten healthy subjects and an ALS patient underwent an experiment in which a target word out of five was identified by this algorithm. The rates of correct judgment in nine of ten healthy subjects were more than 90.0%. The highest rate was 99.7%. The highest rate for the ALS patient was 100.0%. Through these results, the authors found that ALS patients may be able to communicate their desires to surrounding persons through detection of ERP components (P200, N200 and P300).

  3. Apparatus and method for detecting full-capture radiation events

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    An apparatus and method for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events.

  4. Detection of Local/Regional Events in Kuwait Using Next-Generation Detection Algorithms

    SciTech Connect

    Gok, M. Rengin; Al-Jerri, Farra; Dodge, Douglas; Al-Enezi, Abdullah; Hauk, Terri; Mellors, R.

    2014-12-10

    Seismic networks around the world use conventional triggering algorithms to detect seismic signals in order to locate local/regional seismic events. The Kuwait National Seismological Network (KNSN) of the Kuwait Institute of Scientific Research (KISR) operates seven broad-band and short-period three-component stations in Kuwait. The network is equipped with Nanometrics digitizers and uses Antelope and Guralp acquisition software for processing and archiving the data. In this study, we selected 10 days of archived hourly-segmented continuous data from five stations (Figure 1) and 250 days of continuous recording at MIB. For the temporary deployment, our selection criterion was based on KNSN catalog intensity during the test period. An autonomous event detection and clustering framework is employed to build a more complete catalog for this short period of time. The goal is to illustrate the effectiveness of the technique and to pursue the framework over a longer period of time.

  5. Automatic detection of volcano-seismic events by modeling state and event duration in hidden Markov models

    NASA Astrophysics Data System (ADS)

    Bhatti, Sohail Masood; Khan, Muhammad Salman; Wuth, Jorge; Huenupan, Fernando; Curilem, Millaray; Franco, Luis; Yoma, Nestor Becerra

    2016-09-01

    In this paper we propose an automatic volcano event detection system based on a Hidden Markov Model (HMM) with state and event duration models. Since different volcanic events have different durations, the state and whole-event durations learned from the training data are enforced on the corresponding state and event duration models within the HMM. Seismic signals from the Llaima volcano are used to train the system. Two types of events are employed in this study, Long Period (LP) and Volcano-Tectonic (VT). Experiments show that standard HMMs can detect the volcano events with high accuracy but generate false positives. The results presented in this paper show that the incorporation of duration modeling can lead to reductions in the false positive rate in event detection as high as 31%, with a true positive accuracy equal to 94%. Further evaluation of the false positives indicates that the false alarms generated by the system were mostly potential events, based on the signal-to-noise ratio criteria recommended by a volcano expert.
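
    One way to see why duration information suppresses false alarms is a post-hoc filter that removes decoded event runs shorter than a learned minimum duration; this is a simplification of the in-model duration constraints used in the paper.

    ```python
    # Remove decoded state runs shorter than a minimum event duration.
    import numpy as np

    def enforce_min_duration(states, min_len, background=0):
        states = states.copy()
        start = 0
        for i in range(1, len(states) + 1):
            if i == len(states) or states[i] != states[start]:
                if states[start] != background and i - start < min_len:
                    states[start:i] = background   # too short: not an event
                start = i
        return states

    decoded = np.array([0, 0, 1, 0, 0, 1, 1, 1, 1, 0, 2, 0])
    print(enforce_min_duration(decoded, min_len=3))
    # The isolated 1 and the single-sample 2 are removed; the long run stays.
    ```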

  6. Direct phosphorescent detection of primary event of photodynamic action

    NASA Astrophysics Data System (ADS)

    Losev, Anatoly P.; Knukshto, Valentin N.; Zhuravkin, Ivan N.

    1994-07-01

    The highly phosphorescent photosensitizer Pd-tetra (o-methoxy-p-sulfo) phenyl porphyrin (Pd-MSPP) was used to follow the primary event of photodynamic action - the quenching of triplet states by free oxygen - in different systems: water solutions of proteins, and cells and tissues in vivo and in vitro. The photosensitizer forms complexes with proteins in solutions and biosystems, showing remarkable hypsochromic band shifts and an increase in the quantum yield and lifetime of phosphorescence upon binding to proteins. In the absence of oxygen the phosphorescence decay is almost single-exponential, with a lifetime that depends on the energy of the lowest triplet state of the sensitizer. The photochemical quenching of the triplets by cell components is negligible. In the presence of free oxygen, quenching of the sensitizer triplets takes place. The emission spectrum of singlet oxygen, with a maximum at 1271 nm, was recorded in aqueous protein solutions, and the quantum yield of sensitized luminescence was measured. In the systems studied, oxygen consumption was detected and the oxygen concentration was estimated over the course of photodynamic action from the increase in photosensitizer phosphorescence lifetime, using a laser flash photolysis technique. The at least bi-exponential kinetics of the phosphorescence decay show that the distribution of free oxygen in tissues is not uniform.

  7. Visual traffic surveillance framework: classification to event detection

    NASA Astrophysics Data System (ADS)

    Ambardekar, Amol; Nicolescu, Mircea; Bebis, George; Nicolescu, Monica

    2013-10-01

    Visual traffic surveillance using computer vision techniques can be noninvasive, automated, and cost effective. Traffic surveillance systems with the ability to detect, count, and classify vehicles can be employed in gathering traffic statistics and achieving better traffic control in intelligent transportation systems. However, vehicle classification poses a difficult problem, as vehicles have high intraclass variation and relatively low interclass variation. Five different object recognition techniques are investigated for vehicle classification: principal component analysis (PCA)+difference from vehicle space, PCA+difference in vehicle space, PCA+support vector machine, linear discriminant analysis, and constellation-based modeling. Three of the techniques that performed well were incorporated into a unified traffic surveillance system for online classification of vehicles, which uses tracking results to improve the classification accuracy. To evaluate the accuracy of the system, 31 min of traffic video containing a multilane traffic intersection was processed. It was possible to achieve a classification accuracy as high as 90.49% while classifying correctly tracked vehicles into four classes: cars, SUVs/vans, pickup trucks, and buses/semis. While processing a video, our system also records important traffic parameters such as the appearance, speed, and trajectory of each vehicle. This information was later used in a search assistant tool to find interesting traffic events.

  8. Large Time Projection Chambers for Rare Event Detection

    SciTech Connect

    Heffner, M

    2009-11-03

    The Time Projection Chamber (TPC) concept has been applied to many projects outside of particle physics and the accelerator-based experiments where it was initially developed. TPCs in non-accelerator particle physics experiments are principally focused on rare event detection (e.g., neutrino and dark matter experiments), and the physics of these experiments can place dramatically different constraints on the TPC design (only extensions to traditional TPCs are discussed here). The drift gas, or liquid, is usually the target or matter under observation, and due to very low signal rates a TPC with the largest active mass is desired. The large mass complicates particle tracking of short and sometimes very low energy particles. Other special design issues include efficient light collection, background rejection, internal triggering and optimal energy resolution. Backgrounds from gamma-rays and neutrons are significant design issues in the construction of these TPCs. They are generally placed deep underground to shield from cosmogenic particles and surrounded with shielding to reduce radiation from the local surroundings. The construction materials have to be carefully screened for radiopurity, as they are in close contact with the active mass and can be a significant source of background events. The TPC excels at reducing this internal background because the mass inside the field cage forms one monolithic volume from which fiducial cuts can be made ex post facto to isolate quiet drift mass, and the medium can be circulated and purified to a very high level. Self-shielding in these large mass systems can be significant, and the effect improves with density. The liquid phase TPC can obtain a high density at low pressure, which results in very good self-shielding and compact installation with a lightweight containment. The downsides are the need for cryogenics, slower charge drift, tracks shorter than the typical electron diffusion, lower energy resolution (e

  9. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the current most challenging topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has typically been limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulics insights and, as a result, may be significantly exposed to false positive alarms. This work aims to reduce this limitation by integrating an understanding of local and spatial hydraulic data into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploring the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. PMID:25996752

  10. The waveform correlation event detection system project: Issues in system refinement, tuning, and operation

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Harris, J.M.; Moore, S.G.; Trujillo, J.R.; Withers, M.M.; Aster, R.C.

    1996-08-01

    The goal of the Waveform Correlation Event Detection System (WCEDS) Project at Sandia Labs has been to develop a prototype of a full-waveform correlation-based seismic event detection system which could be used to assess its potential usefulness for CTBT monitoring. The current seismic event detection system in use at the IDC is very sophisticated and provides good results, but there is still significant room for improvement, particularly in reducing the number of false events (currently nearly equal to the number of real events). Our first prototype was developed last year and since then we have used it for extensive testing, from which we have gained considerable insight. The original prototype was based on a long-period detector designed by Shearer (1994), but it has been heavily modified to address problems encountered in application to a data set from the Incorporated Research Institutions for Seismology (IRIS) broadband global network. Important modifications include capabilities for event masking and iterative event detection, continuous near-real-time execution, improved Master Image creation, and individualized station pre-processing. All have been shown to improve bulletin quality. In some cases the system has detected marginal events which may not be detectable by traditional detection systems, but definitive conclusions cannot be made without direct comparisons. For this reason future work will focus on using the system to process GSETT3 data for comparison with current event detection systems at the IDC.

  11. Method and apparatus for detecting and determining event characteristics with reduced data collection

    NASA Technical Reports Server (NTRS)

    Totman, Peter D. (Inventor); Everton, Randy L. (Inventor); Egget, Mark R. (Inventor); Macon, David J. (Inventor)

    2007-01-01

    A method and apparatus for detecting and determining event characteristics such as, for example, the material failure of a component, in a manner which significantly reduces the amount of data collected. A sensor array, including a plurality of individual sensor elements, is coupled to a programmable logic device (PLD) configured to operate in a passive state and an active state. A triggering event is established such that the PLD records information only upon detection of the occurrence of the triggering event which causes a change in state within one or more of the plurality of sensor elements. Upon the occurrence of the triggering event, the change in state of the one or more sensor elements causes the PLD to record in memory which sensor element detected the event and at what time the event was detected. The PLD may be coupled with a computer for subsequent downloading and analysis of the acquired data.

  12. Reading Times and the Detection of Event Shift Processing

    ERIC Educational Resources Information Center

    Radvansky, Gabriel A.; Copeland, David E.

    2010-01-01

    When people read narratives, they often need to update their situation models as the described events change. Previous research has shown little to no increases in reading times for spatial shifts but consistent increases for temporal shifts. On this basis, researchers have suggested that spatial updating does not regularly occur, whereas temporal…

  13. Unsupervised Event Characterization and Detection in Multichannel Signals: An EEG application.

    PubMed

    Mur, Angel; Dormido, Raquel; Vega, Jesús; Duro, Natividad; Dormido-Canto, Sebastian

    2016-01-01

    In this paper, we propose a new unsupervised method to automatically characterize and detect events in multichannel signals. This method is used to identify artifacts in electroencephalogram (EEG) recordings of brain activity. The proposed algorithm has been evaluated and compared with a supervised method. To this end, an example of the algorithm's performance in detecting artifacts is shown. The results show that although both methods obtain similar classifications, the proposed method detects events without training data and can also be applied to signals whose events are unknown a priori. Furthermore, the proposed method provides an optimal window in which an optimal detection and characterization of events is found. The detection of events can be applied in real time. PMID:27120605

  14. Unsupervised Event Characterization and Detection in Multichannel Signals: An EEG application

    PubMed Central

    Mur, Angel; Dormido, Raquel; Vega, Jesús; Duro, Natividad; Dormido-Canto, Sebastian

    2016-01-01

    In this paper, we propose a new unsupervised method to automatically characterize and detect events in multichannel signals. This method is used to identify artifacts in electroencephalogram (EEG) recordings of brain activity. The proposed algorithm has been evaluated and compared with a supervised method. To this end, an example of the algorithm's performance in detecting artifacts is shown. The results show that although both methods obtain similar classifications, the proposed method detects events without training data and can also be applied to signals whose events are unknown a priori. Furthermore, the proposed method provides an optimal window in which an optimal detection and characterization of events is found. The detection of events can be applied in real time. PMID:27120605

  15. Setting objective thresholds for rare event detection in flow cytometry.

    PubMed

    Richards, Adam J; Staats, Janet; Enzor, Jennifer; McKinnon, Katherine; Frelinger, Jacob; Denny, Thomas N; Weinhold, Kent J; Chan, Cliburn

    2014-07-01

    The accurate identification of rare antigen-specific cytokine-positive cells from peripheral blood mononuclear cells (PBMC) after antigenic stimulation in an intracellular staining (ICS) flow cytometry assay is challenging, as cytokine-positive events may be fairly diffusely distributed and lack an obvious separation from the negative population. Traditionally, the approach by flow operators has been to manually set a positivity threshold to partition events into cytokine-positive and cytokine-negative. This approach suffers from subjectivity and inconsistency across different flow operators. The use of statistical clustering methods does not remove the need to find an objective threshold between positive and negative events, since consistent identification of rare event subsets is highly challenging for automated algorithms, especially when there is distributional overlap between the positive and negative events ("smear"). We present a new approach, based on the Fβ measure, that is similar to manual thresholding in providing a hard cutoff, but has the advantage of being determined objectively. The performance of this algorithm is compared with results obtained by expert visual gating. Several ICS data sets from the External Quality Assurance Program Oversight Laboratory (EQAPOL) proficiency program were used to make the comparisons. We first show that visually determined thresholds are difficult to reproduce and pose a problem when comparing results across operators or laboratories, and we illustrate problems that occur with the use of commonly employed clustering algorithms. In contrast, a single parameterization for the Fβ method performs consistently across different centers, samples, and instruments because it optimizes the precision/recall tradeoff by using both negative and positive controls. PMID:24727143
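
    A sketch of the objective threshold choice, assuming synthetic control data: sweep cutoffs on the intensity axis and keep the one maximizing the F-beta score against positive/negative control labels.

    ```python
    # Objective threshold selection by maximizing the F-beta measure.
    import numpy as np
    from sklearn.metrics import fbeta_score

    rng = np.random.default_rng(9)
    neg = rng.normal(0.0, 1.0, 5000)        # negative-control intensities
    pos = rng.normal(2.5, 1.2, 50)          # rare cytokine-positive events
    x = np.concatenate([neg, pos])
    y = np.concatenate([np.zeros(5000), np.ones(50)])

    cuts = np.linspace(x.min(), x.max(), 200)
    scores = [fbeta_score(y, (x > c).astype(int), beta=1.0) for c in cuts]
    best = cuts[int(np.argmax(scores))]
    print(f"objective threshold: {best:.2f}")
    ```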

  16. Motor task event detection using Subthalamic Nucleus Local Field Potentials.

    PubMed

    Niketeghad, Soroush; Hebb, Adam O; Nedrud, Joshua; Hanrahan, Sara J; Mahoor, Mohammad H

    2015-08-01

    Deep Brain Stimulation (DBS) provides significant therapeutic benefit for movement disorders such as Parkinson's disease. Current DBS devices lack real-time feedback (and thus are open loop), and stimulation parameters are adjusted during scheduled visits with a clinician. A closed-loop DBS system may reduce power consumption and DBS side effects. In such systems, DBS parameters are adjusted based on the patient's behavior, which means that behavior detection is a major step in designing such systems. Various physiological signals can be used to recognize the behaviors. The Subthalamic Nucleus (STN) Local Field Potential (LFP) is a strong candidate signal for the neural feedback, because it can be recorded from the stimulation lead and does not require additional sensors. A practical behavior detection method should be able to detect behaviors asynchronously, meaning that it should not use any prior knowledge of behavior onsets. In this paper, we introduce a behavior detection method that is able to asynchronously detect the finger movements of Parkinson's patients. As a result of this study, we learned that there is a motor-modulated inter-hemispheric connectivity between LFP signals recorded bilaterally from the STN. We used a non-linear regression method to measure this connectivity and used it to detect the finger movements. The performance of this method is evaluated using Receiver Operating Characteristic (ROC) analysis. PMID:26737550

  17. The GRACE satellites detect recent extreme climate events in China

    NASA Astrophysics Data System (ADS)

    Tang, Jingshi; Liu, Lin

    2012-07-01

    As the climate changes, extreme climate events are occurring more frequently over the globe. In China, drought or flood has struck almost every year recently, and there have been several disastrous events in these years. We show that some of the disastrous events are so strong that the corresponding gravity change can be observed by geodetic satellites. We use the Gravity Recovery and Climate Experiment (GRACE), a joint mission between NASA and DLR. One primary job of GRACE is to map the Earth's temporal gravity field with high resolution. Over the years the twin satellites have observed the loss of mass in Antarctica and Greenland, strong earthquakes, severe climate change in South America and so on, providing a unique way to study geophysical and climatological processes. In this report, the Level-2 product from the Center for Space Research for recent years is used, and specific areas in China are focused on. It is shown that after decorrelation, filtering and other processing, the gravity anomalies observed by GRACE match the extreme climate events and the hydrological data from the Global Land Data Assimilation System (GLDAS).

  18. Nonthreshold-based event detection for 3d environment monitoring in sensor networks

    SciTech Connect

    Li, M.; Liu, Y.H.; Chen, L.

    2008-12-15

    Event detection is a crucial task for wireless sensor network applications, especially environment monitoring. Existing approaches for event detection are mainly based on predefined threshold values and, thus, are often inaccurate and incapable of capturing complex events. For example, in coal mine monitoring scenarios, gas leakage or water osmosis can hardly be described by the overrun of specified attribute thresholds, but rather by complex patterns in the full-scale view of the environmental data. To address this issue, we propose a nonthreshold-based approach for the real 3D sensor monitoring environment. We employ energy-efficient methods to collect a time series of data maps from the sensor network and detect complex events by matching the gathered data to spatiotemporal data patterns. Finally, we conduct trace-driven simulations to prove the efficacy and efficiency of this approach in detecting events of complex phenomena from real-life records.

  19. Minimal elastographic modeling of breast cancer for model based tumor detection in a digital image elasto tomography (DIET) system

    NASA Astrophysics Data System (ADS)

    Lotz, Thomas F.; Muller, Natalie; Hann, Christopher E.; Chase, J. Geoffrey

    2011-03-01

    Digital Image Elasto Tomography (DIET) is a non-invasive breast cancer screening technology that images the surface motion of a breast under harmonic mechanical actuation. A new approach capturing the dynamics and characteristics of tumor behavior is presented. A simple mechanical model of the breast is used to identify a transfer function relating the input harmonic actuation to the output surface displacements using imaging data of a silicone phantom. Areas of higher stiffness cause significant changes of damping and resonant frequencies as seen in the resulting Bode plots. A case study on a healthy and tumor silicone breast phantom shows the potential for this model-based method to clearly distinguish cancerous and healthy tissue as well as correctly predicting the tumor position.
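
    A sketch of the identification step under assumed signals: estimate the transfer function from actuation input to surface displacement as H(f) = Pxy/Pxx, whose magnitude and phase over frequency would form the Bode plots discussed above.

    ```python
    # Transfer function estimate from harmonic actuation to displacement.
    import numpy as np
    from scipy.signal import csd, welch

    fs = 500.0
    t = np.arange(0, 20, 1 / fs)
    rng = np.random.default_rng(10)
    actuation = np.sin(2 * np.pi * 50 * t)                  # harmonic input
    displacement = 0.3 * np.sin(2 * np.pi * 50 * t - 0.8) \
                   + rng.normal(0, 0.02, t.size)            # measured output

    f, pxx = welch(actuation, fs=fs, nperseg=1024)
    _, pxy = csd(actuation, displacement, fs=fs, nperseg=1024)
    h = pxy / pxx                                           # H(f) estimate
    k = np.argmin(np.abs(f - 50))
    print(f"|H| at 50 Hz: {np.abs(h[k]):.2f}, phase: {np.angle(h[k]):.2f} rad")
    ```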

  20. Neuro-evolutionary event detection technique for downhole microseismic surveys

    NASA Astrophysics Data System (ADS)

    Maity, Debotyam; Salehi, Iraj

    2016-01-01

    Recent years have seen a significant increase in borehole microseismic data acquisition programs associated with unconventional reservoir developments, such as hydraulic fracturing programs for shale oil and gas. The data so acquired are used for hydraulic fracture monitoring and diagnostics, and therefore the quality of the data in terms of resolution and accuracy has a significant impact on its value to the industry. Borehole microseismic data acquired in such environments typically suffer from propagation effects due to the presence of thin interbedded shale layers, as well as noise and interference effects. Moreover, acquisition geometry has a significant impact on detectability across portions of the sensor array. Our work focuses on developing a robust first-arrival detection and pick selection workflow for both P and S waves, specifically designed for such environments. We introduce a novel workflow for the refinement of picks with immunity to significant noise artifacts and applicability to data with very low signal-to-noise ratio, provided some accurate picks have already been made. This workflow utilizes a multi-step hybrid detection and classification routine which makes use of a neural network based autopicker for initial picking and an evolutionary algorithm for pick refinement. We highlight the results from an actual field case study, including multiple examples demonstrating immunity to noise, and compare the effectiveness of the workflow with two contemporary autopicking routines without the application of the shared detection/refinement procedure. Finally, we use a windowed waveform cross-correlation based uncertainty estimation method for potential quality control purposes. While the workflow was developed to work with the neural network based autopicker, it can be used with any other traditional autopicker and provides significant improvements in pick detection across seismic gathers.

  1. Spatial-temporal event detection in climate parameter imagery.

    SciTech Connect

    McKenna, Sean Andrew; Gutierrez, Karen A.

    2011-10-01

    Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely-sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the Earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and in Southern India.
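
    A minimal per-pixel version of regression-based anomaly detection, assuming a synthetic NDVI-like cube: fit a linear trend to each pixel's history and flag new values whose residual exceeds several standard deviations.

    ```python
    # Per-pixel regression-based anomaly detection on an image time series.
    import numpy as np

    rng = np.random.default_rng(11)
    cube = rng.normal(0.5, 0.05, (52, 64, 64))     # weeks x rows x cols
    cube[-1, 30:34, 30:34] -= 0.4                  # implanted anomaly

    weeks = np.arange(51)
    hist = cube[:-1].reshape(51, -1)
    # Least-squares trend per pixel: slope and intercept via polyfit.
    slope, intercept = np.polyfit(weeks, hist, 1)
    pred = slope * 51 + intercept                  # prediction for new week
    resid = cube[-1].ravel() - pred
    sigma = hist.std(axis=0)
    anomalous = (np.abs(resid) > 4 * sigma).reshape(64, 64)
    print(anomalous.sum(), "anomalous pixels")
    ```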

  2. Detection of intermittent events in atmospheric time series

    NASA Astrophysics Data System (ADS)

    Paradisi, P.; Cesari, R.; Palatella, L.; Contini, D.; Donateo, A.

    2009-04-01

    The modeling approach in atmospheric sciences is based on the assumption that local fluxes of mass, momentum, heat, etc. can be described as linear functions of the local gradient of some intensive property (concentration, flow strain, temperature, ...). This is essentially associated with Gaussian statistics and short range (exponential) correlations. However, the atmosphere is a complex dynamical system displaying a wide range of spatial and temporal scales. A global description of the atmospheric dynamics should include a great number of degrees of freedom, strongly interacting on several temporal and spatial scales, thus generating long range (power-law) correlations and non-Gaussian distributions of fluctuations (Lévy flights, Lévy walks, Continuous Time Random Walks) [1]. This is typically associated with anomalous diffusion and scaling, non-trivial memory features and correlation decays and, especially, with the emergence of flux-gradient relationships that are non-linear and/or non-local in time and/or space. Nonetheless, the local flux-gradient relationship is greatly preferred due to its clearer physical meaning, which allows direct comparisons with experimental data, and, especially, due to its smaller computational cost in numerical models. In particular, the linearity of this relationship allows one to define a transport coefficient (e.g., turbulent diffusivity), and the modeling effort is usually focused on this coefficient. However, the validity of the local (and linear) flux-gradient model is strongly dependent on the range of spatial and temporal scales represented by the model and, consequently, on the sub-grid processes included in the flux-gradient relationship. In this work, in order to check the validity of local and linear flux-gradient relationships, an approach based on the concept of renewal critical events [2] is introduced. In fact, in renewal theory [2], the dynamical origin of anomalous behaviour and non-local flux-gradient relation is

  3. A Fuzzy-Decision Based Approach for Composite Event Detection in Wireless Sensor Networks

    PubMed Central

    Zhang, Shukui; Chen, Hao; Zhu, Qiaoming

    2014-01-01

    Event detection is one of the fundamental research problems in wireless sensor networks (WSNs). Because it considers the various properties that reflect an event's status, the composite event is more consistent with the objective world, and its study is therefore more realistic. In this paper, we analyze the characteristics of the composite event; we then propose a criterion to determine the area of the composite event and put forward a dominating-set-based network topology construction algorithm under random deployment. To address the unreliability of partial data in the detection process and the inherent fuzziness of event definitions, we propose a cluster-based two-dimensional τ-GAS algorithm and a fuzzy-decision based composite event decision mechanism. In the case that the sensory data of most nodes are normal, the two-dimensional τ-GAS algorithm can filter faulty node data effectively and reduce the influence of erroneous data on the event determination. The fuzzy-decision based composite event judgment mechanism retains the advantages of fuzzy-logic based algorithms; moreover, it does not need the support of a huge rule base and its computational complexity is small. Compared to the CollECT and CDS algorithms, this algorithm improves the detection accuracy and reduces the traffic.
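
    The sketch below illustrates generic fuzzy-decision fusion for a composite event (high temperature AND high smoke), not the paper's two-dimensional τ-GAS algorithm; the membership shapes and thresholds are assumptions.

        import numpy as np

        def ramp_membership(x, low, high):
            """Degree in [0, 1] rising linearly from `low` to `high`."""
            return float(np.clip((x - low) / (high - low), 0.0, 1.0))

        def composite_event_degree(temperature_c, smoke_ppm):
            mu_hot = ramp_membership(temperature_c, 40.0, 70.0)
            mu_smoky = ramp_membership(smoke_ppm, 50.0, 200.0)
            return min(mu_hot, mu_smoky)     # fuzzy AND of the two properties

        degree = composite_event_degree(temperature_c=65.0, smoke_ppm=180.0)
        print(f"composite event degree: {degree:.2f}",
              "-> event declared" if degree > 0.5 else "-> no event")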

  4. Object-Oriented Query Language For Events Detection From Images Sequences

    NASA Astrophysics Data System (ADS)

    Ganea, Ion Eugen

    2015-09-01

    This paper presents a method to represent events extracted from image sequences and the query language used for event detection. Using an object-oriented model, the spatial and temporal relationships between salient objects, and also between events, are stored and queried. This work aims to unify the storing and querying phases of video event processing. The object-oriented language syntax used for event processing allows the instantiation of index classes in order to improve the accuracy of query results. The experiments were performed on image sequences from the sport domain and show the reliability and robustness of the proposed language. To extend the language, a specific syntax will be added for constructing templates for abnormal events and for detecting incidents, the final goal of this research.

  5. Detection of Upper Airway Status and Respiratory Events by a Current Generation Positive Airway Pressure Device

    PubMed Central

    Li, Qing Yun; Berry, Richard B.; Goetting, Mark G.; Staley, Bethany; Soto-Calderon, Haideliza; Tsai, Sheila C.; Jasko, Jeffrey G.; Pack, Allan I.; Kuna, Samuel T.

    2015-01-01

    Study Objectives: To compare a positive airway pressure (PAP) device's detection of respiratory events and airway status during device-detected apneas with events scored on simultaneous polysomnography (PSG). Design: Prospective PSGs of patients with sleep apnea using a new-generation PAP device. Settings: Four clinical and academic sleep centers. Patients: Forty-five patients with obstructive sleep apnea (OSA) and complex sleep apnea (Comp SA) performed a PSG on PAP levels adjusted to induce respiratory events. Interventions: None. Measurements and Results: PAP device data identifying the type of respiratory event and whether the airway during a device-detected apnea was open or obstructed were compared to time-synced, manually scored respiratory events on simultaneous PSG recording. Intraclass correlation coefficients between device-detected and PSG scored events were 0.854 for apnea-hypopnea index (AHI), 0.783 for apnea index, 0.252 for hypopnea index, and 0.098 for respiratory event-related arousals index. At a device AHI (AHIFlow) of 10 events/h, area under the receiver operating characteristic curve was 0.98, with sensitivity 0.92 and specificity 0.84. AHIFlow tended to overestimate AHI on PSG at values less than 10 events/h. The device detected that the airway was obstructed in 87.4% of manually scored obstructive apneas. Of the device-detected apneas with clear airway, a minority (15.8%) were manually scored as obstructive apneas. Conclusions: A device-detected apnea-hypopnea index (AHIFlow) < 10 events/h on a positive airway pressure device is strong evidence of good treatment efficacy. Device-detected airway status agrees closely with the presumed airway status during polysomnography scored events, but should not be equated with a specific type of respiratory event. Citation: Li QY, Berry RB, Goetting MG, Staley B, Soto-Calderon H, Tsai SC, Jasko JG, Pack AI, Kuna ST. Detection of upper airway status and respiratory events by a current generation positive

  6. Unsupervised Spatial Event Detection in Targeted Domains with Applications to Civil Unrest Modeling

    PubMed Central

    Zhao, Liang; Chen, Feng; Dai, Jing; Hua, Ting; Lu, Chang-Tien; Ramakrishnan, Naren

    2014-01-01

    Twitter has become a popular data source as a surrogate for monitoring and detecting events. Targeted domains such as crime, election, and social unrest require the creation of algorithms capable of detecting events pertinent to these domains. Due to the unstructured language, short-length messages, dynamics, and heterogeneity typical of Twitter data streams, it is technically difficult and labor-intensive to develop and maintain supervised learning systems. We present a novel unsupervised approach for detecting spatial events in targeted domains and illustrate this approach using one specific domain, viz. civil unrest modeling. Given a targeted domain, we propose a dynamic query expansion algorithm to iteratively expand domain-related terms, and generate a tweet homogeneous graph. An anomaly identification method is utilized to detect spatial events over this graph by jointly maximizing local modularity and spatial scan statistics. Extensive experiments conducted in 10 Latin American countries demonstrate the effectiveness of the proposed approach. PMID:25350136

  7. Seismic network detection probability assessment using waveforms and accounting to event association logic

    NASA Astrophysics Data System (ADS)

    Pinsky, Vladimir; Shapira, Avi

    2016-05-01

    The geographical area within which a seismic event of magnitude M ≥ Mt is detected by a seismic station network, at a defined probability, is derived from station probabilities of detection estimated as functions of epicentral distance. The latter are determined from both the bulletin data and the waveforms recorded by the station during the occurrence of the event, with and without band-pass filtering. To simulate the real detection process, the waveforms are processed using the conventional Carl Johnson detection and association algorithm. We present an attempt to account for the association time criterion in addition to the conventional approach adopted by the known PMC method.

  8. A model-based information sharing protocol for profile Hidden Markov Models used for HIV-1 recombination detection

    PubMed Central

    2014-01-01

    Background In many applications, a family of nucleotide or protein sequences classified into several subfamilies has to be modeled. Profile Hidden Markov Models (pHMMs) are widely used for this task, modeling each subfamily separately by one pHMM. However, a major drawback of this approach is the difficulty of dealing with subfamilies composed of very few sequences. One of the most crucial bioinformatics tasks affected by the problem of small-size subfamilies is the subtyping of human immunodeficiency virus type 1 (HIV-1) sequences, i.e., HIV-1 subtypes for which only a small number of sequences is known. Results To deal with small samples for particular subfamilies of HIV-1, we introduce a novel model-based information sharing protocol. It estimates the emission probabilities of the pHMM modeling a particular subfamily not only based on the nucleotide frequencies of the respective subfamily but also by incorporating the nucleotide frequencies of all available subfamilies. To this end, the underlying probabilistic model mimics the pattern of commonality and variation between the subtypes with regard to the biological characteristics of HI viruses. In order to implement the proposed protocol, we make use of an existing HMM architecture and its associated inference engine. Conclusions We apply the modified algorithm to classify HIV-1 sequence data in the form of partial HIV-1 sequences and semi-artificial recombinants. Thereby, we demonstrate that the performance of pHMMs can be significantly improved by the proposed technique. Moreover, we show that our algorithm performs significantly better than Simplot and Bootscanning. PMID:24946781
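
    A toy sketch of the information-sharing idea for one pHMM match column: blend the small subfamily's own (smoothed) nucleotide frequencies with frequencies pooled over all subfamilies. The fixed blending weight is an assumption; the paper's probabilistic model of commonality and variation is considerably richer.

        import numpy as np

        ALPHABET = "ACGT"
        subfamily_counts = np.array([2.0, 0.0, 1.0, 0.0])      # tiny subtype sample
        pooled_counts = np.array([300.0, 120.0, 150.0, 80.0])  # all subfamilies

        def shared_emissions(counts, pooled, weight=0.3):
            """Weighted mixture of subfamily and pooled frequency estimates."""
            own = (counts + 1.0) / (counts + 1.0).sum()        # add-one smoothing
            shared = pooled / pooled.sum()
            return (1.0 - weight) * own + weight * shared

        probs = shared_emissions(subfamily_counts, pooled_counts)
        print(dict(zip(ALPHABET, probs.round(3))))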

  9. Find Your Manners: How Do Infants Detect the Invariant Manner of Motion in Dynamic Events?

    ERIC Educational Resources Information Center

    Pruden, Shannon M.; Goksun, Tilbe; Roseberry, Sarah; Hirsh-Pasek, Kathy; Golinkoff, Roberta M.

    2012-01-01

    To learn motion verbs, infants must be sensitive to the specific event features lexicalized in their language. One event feature important for the acquisition of English motion verbs is the manner of motion. This article examines when and how infants detect manners of motion across variations in the figure's path. Experiment 1 shows that 13- to…

  10. MCD for detection of event-based landslides

    NASA Astrophysics Data System (ADS)

    Mondini, A. C.; Chang, K.; Guzzetti, F.

    2011-12-01

    Landslides play an important role in the landscape evolution of mountainous terrain. They also present a socioeconomic problem in terms of risk to people and property. Landslide inventory maps are not available for many areas affected by slope instabilities, resulting in a lack of primary information for the comprehension of the phenomenon, the evaluation of landslide statistics, and civil protection operations on large scales. Traditional methods for the preparation of landslide inventory maps are based on the geomorphological interpretation of stereoscopic aerial photography and on field surveys. These methods are expensive and time consuming. The exploitation of new remote sensing data, in particular very high resolution (VHR) satellite images, and new dedicated methods presents an alternative to the traditional methods and is at the forefront of modern landslide research. Recent studies have shown the possibility of producing accurate landslide maps, reducing the time and resources required for their compilation and systematic update. This paper presents the Multiple Change Detection (MCD) technique, a new method that has shown promising results in landslide mapping. Through supervised or unsupervised classifiers, MCD combines different change detection metrics, such as change in Normalized Differential Vegetation Index, spectral angle, principal component analysis, and independent component analysis, and applies them to a multi-temporal set of VHR satellite images to distinguish new landslides from stable areas. MCD has been applied with success in different geographical areas and with different satellite images, suggesting it is a reliable and robust technique. The technique can distinguish old from new landslides and capture runout features. Results of these case studies will be presented at the conference. Also to be presented are new developments of MCD involving the introduction of a priori information on landslide susceptibility within

  11. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

    The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, for integrating the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection system framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which is left out of focus in many existing event detection system models, is confirmed to be a crucial part of the system and can be modelled with a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. PMID:25770443
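
    A minimal sketch of the fusion step: treat the single-indicator alarm probabilities as covariates of a logistic (discrete choice) model that outputs one fused event probability. The data here are synthetic, and the paper calibrates this component jointly with the rest of the system using genetic algorithms.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 500
        # Columns: per-indicator alarm probabilities (e.g., chlorine, pH, turbidity).
        single_alarms = rng.uniform(0, 1, size=(n, 3))
        # Synthetic ground truth: events tend to raise all indicator probabilities.
        is_event = (single_alarms.mean(axis=1) + 0.1 * rng.standard_normal(n)) > 0.6

        model = LogisticRegression().fit(single_alarms, is_event)
        new_obs = np.array([[0.9, 0.7, 0.8]])
        print("fused event probability:", model.predict_proba(new_obs)[0, 1].round(3))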

  12. Comparison of the STA/LTA and power spectral density methods for microseismic event detection

    NASA Astrophysics Data System (ADS)

    Vaezi, Yoones; Van der Baan, Mirko

    2015-12-01

    Robust event detection and picking is a prerequisite for reliable (micro-) seismic interpretations. Detection of weak events is a common challenge among the various available event detection algorithms. In this paper we compare the performance of two event detection methods: the short-term average/long-term average (STA/LTA) method, which is the most commonly used technique in industry, and a newly introduced method based on power spectral density (PSD) measurements. We have applied both techniques to a 1-hr long segment of the vertical component of raw continuous data recorded at a borehole geophone in a hydraulic fracturing experiment. The PSD technique outperforms the STA/LTA technique by detecting a higher number of weak events while keeping the number of false alarms at a reasonable level. The time-frequency representations obtained through the PSD method can also help define a more suitable bandpass filter, which is usually required for the STA/LTA method. The method thus offers much promise for automated event detection in industrial, local, regional and global seismological data sets.
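
    For reference, a minimal STA/LTA trigger of the kind used as the baseline in this comparison; the window lengths and threshold are typical values assumed here, not the paper's settings.

        import numpy as np

        def sta_lta(trace, fs, sta_sec=0.5, lta_sec=10.0):
            """Ratio of short-term to long-term average of signal energy."""
            sta_n, lta_n = int(sta_sec * fs), int(lta_sec * fs)
            energy = trace ** 2
            csum = np.concatenate(([0.0], np.cumsum(energy)))
            sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n
            lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n
            # Align both averages so each ratio ends at the same sample.
            m = min(sta.size, lta.size)
            return sta[-m:] / (lta[-m:] + 1e-12)

        fs = 500.0
        rng = np.random.default_rng(2)
        trace = rng.standard_normal(int(60 * fs))
        trace[15000:15250] += 5.0 * rng.standard_normal(250)   # synthetic event

        ratio = sta_lta(trace, fs)
        print("triggered samples:", np.flatnonzero(ratio > 4.0).size)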

  13. Field testing of component-level model-based fault detection methods for mixing boxes and VAV fan systems

    SciTech Connect

    Xu, Peng; Haves, Philip

    2002-05-16

    An automated fault detection and diagnosis tool for HVAC systems is being developed, based on an integrated, life-cycle approach to commissioning and performance monitoring. The tool uses component-level HVAC equipment models implemented in the SPARK equation-based simulation environment. The models are configured using design information and component manufacturers' data and then fine-tuned to match the actual performance of the equipment by using data measured during functional tests of the sort used in commissioning. This paper presents the results of field tests of mixing box and VAV fan system models in an experimental facility and a commercial office building. The models were found to be capable of representing the performance of correctly operating mixing box and VAV fan systems and of detecting several types of incorrect operation.

  14. Probabilistic approaches to fault detection in networked discrete event systems.

    PubMed

    Athanasopoulou, Eleftheria; Hadjicostis, Christoforos N

    2005-09-01

    In this paper, we consider distributed systems that can be modeled as finite state machines with known behavior under fault-free conditions, and we study the detection of a general class of faults that manifest themselves as permanent changes in the next-state transition functionality of the system. This scenario could arise in a variety of situations encountered in communication networks, including faults that occur due to design or implementation errors during the execution of communication protocols. In our approach, fault diagnosis is performed by an external observer/diagnoser that functions as a finite state machine and which has access to the input sequence applied to the system but has only limited access to the system state or output. In particular, we assume that the observer/diagnoser is only able to obtain partial information regarding the state of the given system at intermittent time intervals that are determined by certain synchronizing conditions between the system and the observer/diagnoser. By adopting a probabilistic framework, we analyze ways to optimally choose these synchronizing conditions and develop adaptive strategies that achieve a low probability of aliasing, i.e., a low probability that the external observer/diagnoser incorrectly declares the system as fault-free. An application of these ideas in the context of protocol testing/classification is provided as an example. PMID:16252815

  15. The waveform correlation event detection system project, Phase I: Issues in prototype development and testing

    SciTech Connect

    Young, C.; Harris, M.; Beiriger, J.; Moore, S.; Trujillo, J.; Withers, M.; Aster, R.

    1996-08-01

    A study using long-period seismic data showed that seismic events can be detected and located based on correlations of processed waveform profiles with the profile expected for an event. In this technique both time and space are discretized, and events are found by forming profiles and calculating correlations for all time-distance points. Events are declared at points with large correlations. In the first phase of the Waveform Correlation Event Detection System (WCEDS) Project at Sandia Labs we have developed a prototype automatic event detection system based on Shearer's work which shows promise for treaty monitoring applications. Many modifications have been made to meet the requirements of the monitoring environment. A new full matrix multiplication has been developed which can reduce the number of computations needed for the data correlation by as much as two orders of magnitude for large grids. New methodology has also been developed to deal with the problems caused by false correlations (sidelobes) generated during the correlation process. When an event has been detected, masking matrices are set up which mask all correlation sidelobes due to the event, allowing other events with intermingled phases to be found. This process is repeated until a detection threshold is reached. The system was tested on one hour of Incorporated Research Institutions for Seismology (IRIS) broadband data and built all 4 of the events listed in the National Earthquake Information Center (NEIC) Preliminary Determination of Epicenters (PDE) which were observable by the IRIS network. A continuous execution scheme has been developed for the system but has not yet been implemented. Improvements to the efficiency of the code are in various stages of development. Many refinements would have to be made to the system before it could be used as part of an actual monitoring system, but at this stage we know of no clear barriers that would prevent an eventual implementation of the system.

  16. Qualitative and event-specific real-time PCR detection methods for Bt brinjal event EE-1.

    PubMed

    Randhawa, Gurinder Jit; Sharma, Ruchi; Singh, Monika

    2012-01-01

    Bt brinjal event EE-1 with the cry1Ac gene, expressing an insecticidal protein against fruit and shoot borer, is the first genetically modified food crop in the pipeline for commercialization in India. Qualitative polymerase chain reaction (PCR) along with event-specific conventional as well as real-time PCR methods to characterize the event EE-1 is reported. A multiplex (pentaplex) PCR system simultaneously amplifying the cry1Ac transgene, the Cauliflower Mosaic Virus (CaMV) 35S promoter, the nopaline synthase (nos) terminator, the aminoglycoside adenyltransferase (aadA) marker gene, and a taxon-specific beta-fructosidase gene in event EE-1 has been developed. Furthermore, a construct-specific PCR, targeting the approximately 1.8 kb region of the inserted gene construct comprising the region of the CaMV 35S promoter and the cry1Ac gene, has also been developed. The LOD of the developed EE-1-specific conventional PCR assay is 0.01%. The method performance of the reported real-time PCR assay was consistent with the acceptance criteria of Codex Alimentarius Commission ALINORM 10/33/23, with LOD and LOQ values of 0.05%. The developed detection methods would not only facilitate effective regulatory compliance for identification of genetic traits, risk assessment, management, and postrelease monitoring, but also address consumer concerns and the resolution of legal disputes. PMID:23451391

  17. Real-time detection and classification of anomalous events in streaming data

    DOEpatents

    Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.; Laska, Jason A.; Harrison, Lane T.

    2016-04-19

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.

  18. Investigation of EMIC Waves During Balloon Detected Relativistic Electron Precipitation Events

    NASA Astrophysics Data System (ADS)

    Woodger, L. A.; Millan, R. M.

    2009-12-01

    Multiple relativistic electron precipitation (REP) events were detected by balloon-borne instrumentation during the MAXIS 2000 and MINIS 2005 campaigns. It has been suggested that resonance with EMIC waves caused these precipitation events (Lorentzen et al., 2000; Millan et al., 2002) due to their location in the dusk sector. We present observations of dusk-side relativistic electron precipitation events, and use supporting satellite and theoretical data to investigate the relationship between EMIC waves and the detected REP. Satellite data can provide direct measurements not only of the waves themselves but also of important resonance condition parameters. The data will be presented collectively with each event to showcase similarities and differences between events and the challenges that arise in trying to understand the relationship between dusk-side relativistic electron precipitation and EMIC waves.

  19. Detection of stick-slip events within the Whillans Ice Stream using an artificial neural network

    NASA Astrophysics Data System (ADS)

    Bernsen, S. P.

    2014-12-01

    Temporal changes in the periodic stick-slip events on the Whillans Ice Stream (WIS) help to understand the hydrosphere-cryosphere coupling in West Antarctica. Previous studies have shown that the periodic behavior has been ongoing for a number of years, but the record of slip events is incomplete. Rayleigh waves from WIS grounding line events exhibit different patterns than events from the interior of the glacier. An algorithm using a backpropagation neural network is proposed to efficiently extract surface waves that result from stick-slip events. A neural network approach offers the advantages of machine learning and simplified mathematics, and eliminates the need for an analyst to correctly pick first arrivals. Training data have been assembled using 107 events occurring during the 2010 austral summer that were previously identified as stick-slip events at the grounding line as well as in the interior of the WIS. A 0.1 s moving window over 3 s of each of the preprocessed attributes is input into the neural network for automated surface wave detection. Following surface wave detection, a much longer 30 minute sliding window is used to classify surface wave detections as grounding line, interior, or non-stick-slip events. Similar to automatic detection algorithms for body waves, preprocessed attributes such as the STA/LTA ratio, degree of polarization, variance, and skewness exhibit obvious patterns during the onset of surface waves. The automated event detection could lead to more cost-effective data collection in future seismic experiments, especially with an increase in array density in cold weather regions.

  20. Early snowmelt events: detection, distribution, and significance in a major sub-arctic watershed

    NASA Astrophysics Data System (ADS)

    Alese Semmens, Kathryn; Ramage, Joan; Bartsch, Annett; Liston, Glen E.

    2013-03-01

    High-latitude drainage basins are experiencing higher average temperatures, earlier snowmelt onset in spring, and an increase in rain on snow (ROS) events in winter, trends that climate models project into the future. Snowmelt-dominated basins are most sensitive to winter temperature increases that influence the frequency of ROS events and the timing and duration of snowmelt, resulting in changes to spring runoff. Of specific interest in this study are early melt events that occur in late winter preceding melt onset in the spring. The study focuses on satellite determination and characterization of these early melt events using the Yukon River Basin (Canada/USA) as a test domain. The timing of these events was estimated using data from passive (Advanced Microwave Scanning Radiometer—EOS (AMSR-E)) and active (SeaWinds on Quick Scatterometer (QuikSCAT)) microwave remote sensors, employing detection algorithms for brightness temperature (AMSR-E) and radar backscatter (QuikSCAT). The satellite-detected events were validated with ground station meteorological and hydrological data, and the spatial and temporal variability of the events across the entire river basin was characterized. Possible causative factors for the detected events, including ROS, fog, and positive air temperatures, were determined by comparing the timing of the events to parameters from SnowModel and National Centers for Environmental Prediction North American Regional Reanalysis (NARR) outputs, and to weather station data. All melt events coincided with above-freezing temperatures, while a limited number corresponded to ROS (determined from SnowModel and ground data) and a majority to fog occurrence (determined from NARR). The results underscore the significant influence that warm air intrusions have on melt in some areas and demonstrate the large temporal and spatial variability over years and regions. The study provides a method for melt detection and a baseline from which to assess future change.

  1. Event Detection and Visualization of Ocean Eddies based on SSH and Velocity Field

    NASA Astrophysics Data System (ADS)

    Matsuoka, Daisuke; Araki, Fumiaki; Inoue, Yumi; Sasaki, Hideharu

    2016-04-01

    Numerical studies of ocean eddies have progressed using high-resolution ocean general circulation models. To understand ocean eddies from simulation results containing a large amount of information, it is necessary to visualize not only the distribution of eddies at each time step, but also events or phenomena involving eddies. However, previous methods cannot precisely detect eddies, especially during events such as amalgamation and bifurcation. In the present study, we propose a new approach to eddy detection, tracking, and event visualization based on sea surface height (SSH) and the velocity field. The proposed method detects eddy regions as well as stream and current regions, and classifies detected eddies into several types. By tracking the time-varying changes of classified eddies, it is possible to detect not only eddy events such as amalgamation and bifurcation but also interactions between eddies and ocean currents. By visualizing the detected eddies and events, we created a movie that enables intuitive understanding of the region of interest.
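
    A simple sketch of the SSH part of such a detector: candidate eddy centers are strong local extrema of the SSH field (highs anticyclonic, lows cyclonic). The grid, window size, and amplitude cutoff are assumptions, and the paper's method additionally uses the velocity field and tracks eddies through time.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(3)
        x, y = np.meshgrid(np.linspace(-5, 5, 200), np.linspace(-5, 5, 200))
        ssh = np.exp(-((x - 1) ** 2 + y ** 2)) \
              - 0.8 * np.exp(-((x + 2) ** 2 + (y - 1) ** 2))
        ssh += 0.02 * rng.standard_normal(ssh.shape)   # measurement noise

        size = 15  # neighborhood (grid cells) for the extremum test
        is_max = ssh == ndimage.maximum_filter(ssh, size=size)
        is_min = ssh == ndimage.minimum_filter(ssh, size=size)
        strong = np.abs(ssh) > 0.3                     # ignore weak noise extrema

        anticyclonic = np.argwhere(is_max & strong)    # SSH highs
        cyclonic = np.argwhere(is_min & strong)        # SSH lows
        print(f"{len(anticyclonic)} high(s), {len(cyclonic)} low(s) detected")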

  2. The Cognitive Processes Underlying Event-Based Prospective Memory in School-Age Children and Young Adults: A Formal Model-Based Study

    ERIC Educational Resources Information Center

    Smith, Rebekah E.; Bayen, Ute J.; Martin, Claudia

    2010-01-01

    Fifty children 7 years of age (29 girls, 21 boys), 53 children 10 years of age (29 girls, 24 boys), and 36 young adults (19 women, 17 men) performed a computerized event-based prospective memory task. All 3 groups differed significantly in prospective memory performance, with adults showing the best performance and with 7-year-olds showing the…

  3. Event Detection in Aerospace Systems using Centralized Sensor Networks: A Comparative Study of Several Methodologies

    NASA Technical Reports Server (NTRS)

    Mehr, Ali Farhang; Sauvageon, Julien; Agogino, Alice M.; Tumer, Irem Y.

    2006-01-01

    Recent advances in micro electromechanical systems technology, digital electronics, and wireless communications have enabled development of low-cost, low-power, multifunctional miniature smart sensors. These sensors can be deployed throughout a region in an aerospace vehicle to build a network for measurement, detection and surveillance applications. Event detection using such centralized sensor networks is often regarded as one of the most promising health management technologies in aerospace applications where timely detection of local anomalies has a great impact on the safety of the mission. In this paper, we propose to conduct a qualitative comparison of several local event detection algorithms for centralized redundant sensor networks. The algorithms are compared with respect to their ability to locate and evaluate an event in the presence of noise and sensor failures for various node geometries and densities.

  4. Detection of gait events using an F-Scan in-shoe pressure measurement system.

    PubMed

    Catalfamo, Paola; Moser, David; Ghoussayni, Salim; Ewins, David

    2008-10-01

    A portable system capable of accurate detection of initial contact (IC) and foot off (FO) without adding encumbrance to the subject would be extremely useful in many gait analysis applications. Force platforms represent the gold standard method for determining these events, and other methods including foot switches and kinematic data have also been proposed. These approaches, however, present limitations in terms of the number of steps that can be analysed per trial, the portability for outdoor measurements, or the information needed beforehand. The purpose of this study was to evaluate the F-Scan® Mobile pressure measurement system when detecting IC and FO. Two methods were used: one was the force detection (FD) in-built algorithm used by the F-Scan software, and the other a new area detection (AD) method using the loaded area during the gait cycle. Both methods were tested in ten healthy adults and compared with the detection provided by a kinetic detection (KT) algorithm. The absolute mean differences between KT and FD were (mean ± standard deviation) 42 ± 11 ms for IC and 37 ± 11 ms for FO. The absolute mean differences between KT and AD were 22 ± 9 ms for IC and 10 ± 4 ms for FO. The AD method remained closer to the KT detection for all subjects, providing sufficiently accurate detection of both events and presenting advantages in terms of portability, number of steps analysed per trial, and practicality, so as to make it a system of choice for gait event detection. PMID:18468441

  5. Model-based analysis supports interglacial refugia over long-dispersal events in the diversification of two South American cactus species.

    PubMed

    Perez, M F; Bonatelli, I A S; Moraes, E M; Carstens, B C

    2016-06-01

    Pilosocereus machrisii and P. aurisetus are cactus species within the P. aurisetus complex, a group of eight cacti that are restricted to rocky habitats within the Neotropical savannas of eastern South America. Previous studies have suggested that diversification within this complex was driven by distributional fragmentation, isolation leading to allopatric differentiation, and secondary contact among divergent lineages. These events have been associated with Quaternary climatic cycles, leading to the hypothesis that the xerophytic vegetation patches which presently harbor these populations operate as refugia during the current interglacial. However, owing to limitations of the standard phylogeographic approaches used in these studies, this hypothesis was not explicitly tested. Here we use Approximate Bayesian Computation to refine the previous inferences and test the role of different events in the diversification of two species within P. aurisetus group. We used molecular data from chloroplast DNA and simple sequence repeats loci of P. machrisii and P. aurisetus, the two species with broadest distribution in the complex, in order to test if the diversification in each species was driven mostly by vicariance or by long-dispersal events. We found that both species were affected primarily by vicariance, with a refuge model as the most likely scenario for P. aurisetus and a soft vicariance scenario most probable for P. machrisii. These results emphasize the importance of distributional fragmentation in these species, and add support to the hypothesis of long-term isolation in interglacial refugia previously proposed for the P. aurisetus species complex diversification. PMID:27071846

  6. A canonical correlation analysis based method for contamination event detection in water sources.

    PubMed

    Li, Ruonan; Liu, Shuming; Smith, Kate; Che, Han

    2016-06-15

    In this study, a general framework integrating a data-driven estimation model is employed for contamination event detection in water sources. Sequential canonical correlation coefficients are updated in the model using multivariate water quality time series. The proposed method utilizes canonical correlation analysis to study the interplay between two sets of water quality parameters. The model is assessed by precision, recall and F-measure. The proposed method is tested using data from a laboratory contaminant injection experiment. The proposed method could detect a contamination event 1 minute after the introduction of a 1.600 mg l⁻¹ acrylamide solution. With optimized parameter values, the proposed method can correctly detect 97.50% of all contamination events with no false alarms. The robustness of the proposed method can be explained using the Bauer-Fike theorem. PMID:27264637
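
    The sketch below illustrates the underlying idea: monitor the leading canonical correlation between two sets of water-quality parameters over sliding windows; a sharp drop marks a break in their usual interplay. The synthetic data, window length, and batch (rather than sequential) coefficient updates are assumptions.

        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(4)
        n = 400
        common = rng.standard_normal(n)                # shared water-quality driver
        X = np.column_stack([common, rng.standard_normal(n)])
        Y = np.column_stack([common, rng.standard_normal(n)])
        X += 0.1 * rng.standard_normal((n, 2))
        Y += 0.1 * rng.standard_normal((n, 2))
        X[300:, 0] = rng.standard_normal(100)          # contamination breaks the link

        def leading_canonical_corr(Xw, Yw):
            u, v = CCA(n_components=1).fit_transform(Xw, Yw)
            return np.corrcoef(u[:, 0], v[:, 0])[0, 1]

        window = 100
        for start in (0, 100, 200, 300):
            r = leading_canonical_corr(X[start:start + window], Y[start:start + window])
            print(f"samples {start}-{start + window}: r1 = {r:.2f}")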

  7. On-line Machine Learning and Event Detection in Petascale Data Streams

    NASA Astrophysics Data System (ADS)

    Thompson, David R.; Wagstaff, K. L.

    2012-01-01

    Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and followup decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real time data mining, with representative cases of astronomical applications and event detection in high volume datastreams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive followup measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled followup decisions from incomplete data. These examples trace a broad range of algorithm possibilities available for online astronomical data

  8. Comparison of pointwise and regional statistical approaches to detect non stationarity in extreme rainfall events. Application to the Sahelian region

    NASA Astrophysics Data System (ADS)

    Panthou, G.; Vischel, T.; Lebel, T.; Quantin, G.; Favre, A.; Blanchet, J.; Ali, A.

    2012-12-01

    Studying trends in rainfall extremes at the regional scale is required to provide a reference climatology for evaluating General Circulation Model global predictions as well as to help manage and design hydraulic works. The present study compares three methods to detect trends (linear and change-point) in series of daily rainfall annual maxima: (i) the first approach is widely used and consists of applying statistical stationarity tests (linear trend and change-point) to the point-wise maxima series; (ii) the second approach compares the performances of a constant and a time-dependent Generalized Extreme Value (GEV) distribution fitted to the point-wise maxima series; (iii) the last method uses an original regional statistical model based on a space-time GEV distribution which is used to detect changes in rainfall extremes directly at the regional scale. The three methods are applied to detect trends in extreme daily rainfall over the Sahel during the period 1950-1990, for which a network of 128 daily rain gages is available. This region has experienced an intense drought since the end of the 1960s; it is thus an interesting case study to illustrate how a regional climate change can affect the extreme rainfall distributions. One major result is that the statistical stationarity tests rarely detect non-stationarities in the series, while the two GEV-based models converge to show that the extreme rainfall series have a negative break point around 1970. The study points out the limits of the widely used classical stationarity tests in detecting trends in noisy series affected by sampling errors. The use of a parametric time-dependent GEV seems to reduce this effect, especially when a regional approach is used. From a climatological point of view, the results show that the great Sahelian drought has been accompanied by a decrease in extreme rainfall events, both in magnitude and occurrence.
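
    A compact sketch of the GEV-based change-point logic: compare the log-likelihood of one stationary GEV fitted to the whole series of annual maxima against separate fits before and after a candidate break year, using a likelihood-ratio test. Synthetic data stand in for the Sahel series, and this split-sample variant is a simplification of the time-dependent GEV models used in the study.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        pre = stats.genextreme.rvs(c=-0.1, loc=60, scale=15, size=20, random_state=rng)
        post = stats.genextreme.rvs(c=-0.1, loc=45, scale=15, size=21, random_state=rng)
        maxima = np.concatenate([pre, post])      # e.g., 1950-1990 annual maxima

        def gev_loglik(x):
            params = stats.genextreme.fit(x)
            return stats.genextreme.logpdf(x, *params).sum()

        ll_stationary = gev_loglik(maxima)
        ll_break = gev_loglik(maxima[:20]) + gev_loglik(maxima[20:])

        # Likelihood-ratio statistic; 3 extra parameters => chi-square(3).
        lr = 2.0 * (ll_break - ll_stationary)
        p_value = stats.chi2.sf(lr, df=3)
        print(f"LR = {lr:.1f}, p = {p_value:.3f}")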

  9. Model-Based Fault Tolerant Control

    NASA Technical Reports Server (NTRS)

    Kumar, Aditya; Viassolo, Daniel

    2008-01-01

    The Model Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing, and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm to obtain an overall estimate of the identified fault type and magnitude. The identification of the fault type and magnitude enabled the use of an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in their presence. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.

  10. Method for detecting binding events using micro-X-ray fluorescence spectrometry

    DOEpatents

    Warner, Benjamin P.; Havrilla, George J.; Mann, Grace

    2010-12-28

    Method for detecting binding events using micro-X-ray fluorescence spectrometry. Receptors are exposed to at least one potential binder and arrayed on a substrate support. Each member of the array is exposed to X-ray radiation. The magnitude of a detectable X-ray fluorescence signal for at least one element can be used to determine whether a binding event between a binder and a receptor has occurred, and can provide information related to the extent of binding between the binder and receptor.

  11. Detecting, Monitoring, and Reporting Possible Adverse Drug Events Using an Arden-Syntax-based Rule Engine.

    PubMed

    Fehre, Karsten; Plössnig, Manuela; Schuler, Jochen; Hofer-Dückelmann, Christina; Rappelsberger, Andrea; Adlassnig, Klaus-Peter

    2015-01-01

    The detection of adverse drug events (ADEs) is an important aspect of improving patient safety. The iMedication system employs predefined triggers associated with significant events in a patient's clinical data to automatically detect possible ADEs. We defined four clinically relevant conditions: hyperkalemia, hyponatremia, renal failure, and over-anticoagulation. These are among the most relevant ADEs in internal medicine and geriatric wards. For each patient, ADE risk scores for all four situations are calculated and compared against a threshold to decide whether the patient should be monitored or reported. A ward-based cockpit view summarizes the results. PMID:26262252

  12. Model-based fault detection and isolation for intermittently active faults with application to motion-based thruster fault detection and isolation for spacecraft

    NASA Technical Reports Server (NTRS)

    Wilson, Edward (Inventor)

    2008-01-01

    The present invention is a method for detecting and isolating fault modes in a system having a model describing its behavior and regularly sampled measurements. The models are used to calculate past and present deviations from measurements that would result with no faults present, as well as with one or more potential fault modes present. Algorithms that calculate and store these deviations, along with memory of when said faults, if present, would have an effect on the said actual measurements, are used to detect when a fault is present. Related algorithms are used to exonerate false fault modes and finally to isolate the true fault mode. This invention is presented with application to detection and isolation of thruster faults for a thruster-controlled spacecraft. As a supporting aspect of the invention, a novel, effective, and efficient filtering method for estimating the derivative of a noisy signal is presented.
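
    A toy illustration of the residual-matching idea behind such model-based isolation: predict the measurement under a no-fault model and under each single-fault hypothesis, then isolate the hypothesis whose prediction deviates least from the actual measurement. The single-axis dynamics and numbers are assumptions, not the patent's algorithm.

        import numpy as np

        torque_per_thruster = np.array([1.0, -1.0, 0.5, -0.5])  # N·m per thruster
        inertia = 10.0                                          # kg·m^2
        command = np.array([1, 1, 1, 0])                        # thrusters fired

        def predicted_accel(cmd, failed=None):
            """Angular acceleration predicted under a fault hypothesis."""
            effective = cmd.copy().astype(float)
            if failed is not None:
                effective[failed] = 0.0     # stuck-off thruster produces nothing
            return (torque_per_thruster * effective).sum() / inertia

        # Simulated measurement: thruster 0 has actually failed off.
        measured_accel = predicted_accel(command, failed=0) + 0.001

        hypotheses = {"no fault": None, **{f"thruster {k} off": k for k in range(4)}}
        residuals = {name: abs(measured_accel - predicted_accel(command, failed=k))
                     for name, k in hypotheses.items()}
        print("isolated:", min(residuals, key=residuals.get))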

  13. Detection of invisible and crucial events: from seismic fluctuations to the war against terrorism

    NASA Astrophysics Data System (ADS)

    Allegrini, Paolo; Fronzoni, Leone; Grigolini, Paolo; Latora, Vito; Mega, Mirko S.; Palatella, Luigi; Rapisarda, Andrea; Vinciguerra, Sergio

    2004-04-01

    We prove the efficiency of a new method for the detection of crucial events that might have useful applications to the war against terrorism. This has to do with the search for rare but significant events, a theme of research that has been made extremely important by the tragedy of September 11. This method is applied here to defining the statistics of seismic main-shocks, as done in cond-mat/0212529. The emphasis here is more on the conceptual issues behind the results obtained in cond-mat/0212529 than on geophysics. This discussion suggests that the method has a wider range of validity. We support this general discussion with a dynamic model originally proposed in cond-mat/0107597 for purposes different from geophysical applications. However, it is a case where the crucial events to detect are under our control, thereby making it possible for us to check the accuracy of the method of detection of invisible and crucial events that we propose here for a general purpose, including the war against terrorism. For this model an analytical treatment has recently been found [cond-mat/0209038], supporting the claims that we make in this paper for the accuracy of the method of detection. For the reader's convenience, the results on the seismic fluctuations are suitably reviewed and discussed in the light of the more general perspective of this paper. We also review the model for seismic fluctuations proposed in the earlier work of cond-mat/0212529. This model shares with the model of cond-mat/0107597 the property that the crucial events are embedded in a sea of secondary events, but it allows us to reveal with accuracy the statistics of the crucial events for different mathematical reasons.

  14. Why conventional detection methods fail in identifying the existence of contamination events.

    PubMed

    Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han

    2016-04-15

    Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and the linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variations. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. PMID:26905801

  15. Development of an algorithm for automatic detection and rating of squeak and rattle events

    NASA Astrophysics Data System (ADS)

    Chandrika, Unnikrishnan Kuttan; Kim, Jay H.

    2010-10-01

    A new algorithm for automatic detection and rating of squeak and rattle (S&R) events was developed. The algorithm utilizes the perceived transient loudness (PTL), which approximates the human perception of a transient noise. First, instantaneous specific loudness time histories are calculated over the 1-24 bark range by applying the analytic wavelet transform and the Zwicker loudness transform to the recorded noise. Transient specific loudness time histories are then obtained by removing estimated contributions of the background noise from the instantaneous specific loudness time histories. These transient specific loudness time histories are summed to obtain the transient loudness time history. Finally, the PTL time history is obtained by applying Glasberg and Moore temporal integration to the transient loudness time history. Detection of S&R events utilizes the PTL time history obtained by summing only the 18-24 bark components, to take advantage of the high signal-to-noise ratio in the high frequency range. An S&R event is identified when the value of the PTL time history exceeds a detection threshold pre-determined by a jury test. The maximum value of the PTL time history is used for rating of S&R events. Another jury test showed that the method performs much better if the PTL time history obtained by summing all frequency components is used. Therefore, rating of S&R events utilizes this modified PTL time history. Two additional jury tests were conducted to validate the developed detection and rating methods. The algorithm developed in this work will enable automatic detection and rating of S&R events with good accuracy and minimal possibility of false alarm.

  16. Fast and robust microseismic event detection using very fast simulated annealing

    NASA Astrophysics Data System (ADS)

    Velis, Danilo R.; Sabbione, Juan I.; Sacchi, Mauricio D.

    2013-04-01

    The study of microseismic data has become an essential tool in many geoscience fields, including oil reservoir geophysics, mining and CO2 sequestration. In hydraulic fracturing, microseismicity studies permit the characterization and monitoring of the reservoir dynamics in order to optimize production and the fluid injection process itself. As the number of events is usually large and the signal-to-noise ratio is in general very low, fast, automated, and robust detection algorithms are required for most applications. Real-time functionality is also commonly needed to control the fluid injection in the field. Generally, events are located by means of grid search algorithms that rely on some approximate velocity model. These techniques are very effective and accurate, but computationally intensive when dealing with large three- or four-dimensional grids. Here, we present a fast and robust method for automatically detecting and picking an event in 3C microseismic data without any input information about the velocity model. The detection is carried out by means of a very fast simulated annealing (VFSA) algorithm. To this end, we define an objective function that measures the energy of a potential microseismic event along the multichannel signal. This objective function is based on the stacked energy of the envelope of the signals, calculated within a predefined narrow time window that depends on the source position, receiver geometry and velocity. Once an event has been detected, the source location can be estimated, in a second stage, by inverting the corresponding traveltimes using a standard technique, which would naturally require some knowledge of the velocity model. Since the proposed technique focuses on the detection of the microseismic events only, the velocity model is not required, leading to a fast algorithm that carries out the detection in real-time. Besides, the strategy is applicable to data with very low signal-to-noise ratios, for it relies
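
    A sketch of the detection objective and search just described: stack envelope energy along the moveout curve implied by a candidate source position and origin time, and maximize it with a simulated-annealing loop (plain annealing here; the paper uses very fast simulated annealing). The geometry, velocity, and cooling schedule are toy assumptions.

        import numpy as np
        from scipy.signal import hilbert

        fs, v = 1000.0, 2000.0                        # Hz, m/s (assumed)
        rx = np.array([0.0, 500.0, 1000.0, 1500.0])   # receiver positions [m]
        true_x, true_t0 = 700.0, 1.0                  # hidden event (for testing)

        rng = np.random.default_rng(6)
        nt = int(3 * fs)
        data = 0.1 * rng.standard_normal((rx.size, nt))
        for i, r in enumerate(rx):                    # plant the arrivals
            onset = int((true_t0 + abs(r - true_x) / v) * fs)
            data[i, onset:onset + 50] += np.hanning(50)
        envelope = np.abs(hilbert(data, axis=1))

        def stacked_energy(x, t0, half=50):
            """Envelope energy stacked in windows along the moveout curve."""
            total = 0.0
            for i, r in enumerate(rx):
                c = int((t0 + abs(r - x) / v) * fs)
                total += (envelope[i, max(c - half, 0):c + half] ** 2).sum()
            return total

        state = (0.0, 0.5)
        best_state, best_val = state, stacked_energy(*state)
        temp = 1.0
        for _ in range(3000):                         # simulated-annealing loop
            cand = (state[0] + temp * 200 * rng.standard_normal(),
                    float(np.clip(state[1] + temp * 0.2 * rng.standard_normal(),
                                  0.0, 2.5)))
            val = stacked_energy(*cand)
            delta = val - stacked_energy(*state)
            if delta > 0 or rng.uniform() < np.exp(delta / (temp + 1e-9)):
                state = cand
                if val > best_val:
                    best_state, best_val = cand, val
            temp *= 0.998                             # geometric cooling
        print(f"detected event near x = {best_state[0]:.0f} m, "
              f"t0 = {best_state[1]:.2f} s")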

  17. Wenchuan Event Detection And Localization Using Waveform Correlation Coupled With Double Difference

    NASA Astrophysics Data System (ADS)

    Slinkard, M.; Heck, S.; Schaff, D. P.; Young, C. J.; Richards, P. G.

    2014-12-01

    The well-studied Wenchuan aftershock sequence triggered by the May 12, 2008, Ms 8.0 mainshock offers an ideal test case for evaluating the effectiveness of using waveform correlation coupled with double difference relocation to detect and locate events in a large aftershock sequence. We use Sandia's SeisCorr detector to process 3 months of data recorded by permanent IRIS and temporary ASCENT stations, using templates from events listed in a global catalog to find similar events in the raw data stream. We then take the detections and relocate them using the double difference method. We explore both the performance that can be expected using just a small number of stations and the benefits of reprocessing a well-studied sequence such as this one with waveform correlation to find even more events. We benchmark our results against previously published relocations of regional catalog data. Before starting this project, we had examples where, with just a few stations at far-regional distances, waveform correlation combined with double difference did an impressive job of detecting and locating events with precision at the few-hundred-meter and even tens-of-meters level.
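
    A minimal single-channel stand-in for a correlation detector of this kind: slide a template through continuous data and declare a detection wherever the normalized cross-correlation coefficient exceeds a threshold. The template, data, and threshold are synthetic assumptions; SeisCorr itself operates on multi-station data.

        import numpy as np

        def normalized_xcorr(data, template):
            """Correlation coefficient of `template` at each lag in `data`."""
            n = template.size
            tpl = (template - template.mean()) / (template.std() * n)
            out = np.empty(data.size - n + 1)
            for i in range(out.size):
                win = data[i:i + n]
                s = win.std()
                out[i] = 0.0 if s == 0 else np.dot(tpl, win - win.mean()) / s
            return out

        rng = np.random.default_rng(7)
        template = np.sin(2 * np.pi * 5 * np.arange(200) / 500) * np.hanning(200)
        data = 0.3 * rng.standard_normal(10000)
        data[4000:4200] += 1.5 * template        # buried repeat of the template

        cc = normalized_xcorr(data, template)
        detections = np.flatnonzero(cc > 0.7)
        print("detection lags:", detections)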

  18. Testing the ability of different seismic detections approaches to monitor aftershocks following a moderate magnitude event.

    NASA Astrophysics Data System (ADS)

    Romero, Paula; Díaz, Jordi; Ruiz, Mario; Cantavella, Juan Vicente; Gomez-García, Clara

    2016-04-01

    The detection and picking of seismic events is a permanent concern in seismic surveying, in particular when dealing with aftershocks of moderate magnitude events. Many efforts have been made to find the balance between computational efficiency and the robustness of detection methods. In this work, data recorded by a high-density seismic network deployed following a magnitude 5.2 event located close to Albacete, SE Spain, are used to test the ability of classical and recently proposed detection methodologies. Two days after the main shock, which occurred on 23 February, a network formed by 11 stations from ICTJA-CSIC and 2 stations from IGN was deployed over the region, with inter-station distances ranging between 5 and 10 km. The network remained in operation until April 6th, 2015 and allowed the manual identification of up to 552 events with magnitudes from 0.2 to 3.5, located in an area of just 25 km2 inside the network limits. The detection methods studied here are the classical STA/LTA, a power spectral method, a detector based on Benford's law, and a waveform similarity method. The STA/LTA method, based on the comparison of background noise and seismic signal amplitudes, is taken as the reference to evaluate the results arising from the other approaches. The power spectral density method is based on the inspection of the characteristic frequency pattern associated with seismic events. The Benford's law detector analyses the distribution of the first digit of displacement counts in the histogram of a seismic waveform, considering that only the windows containing seismic wave arrivals will match the logarithmic law. Finally, the waveform similarity method is based on the analysis of the normalized waveform amplitude, detecting those events with waveforms similar to a previously defined master event. The aim of this contribution is to inspect the ability of the different approaches to accurately detect the aftershock events for this kind of seismic crisis and to
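
    The Benford's-law idea lends itself to a compact sketch: compute the first-digit distribution of absolute amplitudes in a window and measure its misfit to Benford's law; windows containing arrivals, whose amplitudes span several orders of magnitude, fit markedly better than stationary background noise. The misfit measure and synthetic windows below are assumptions.

        import numpy as np

        BENFORD = np.log10(1 + 1 / np.arange(1, 10))    # P(first digit = d)

        def first_digits(x):
            x = np.abs(x[x != 0]).astype(float)
            exp = np.floor(np.log10(x))
            return (x / 10.0 ** exp).astype(int)        # leading digit 1..9

        def benford_misfit(window):
            digits = first_digits(window)
            freq = np.bincount(digits, minlength=10)[1:10] / digits.size
            return np.abs(freq - BENFORD).sum()         # L1 misfit to Benford

        rng = np.random.default_rng(8)
        noise = rng.normal(0, 1, 2000)                           # background window
        event = rng.standard_normal(2000) * 10 ** rng.uniform(0, 4, 2000)

        # Smaller misfit = more Benford-like = more likely to contain an arrival.
        print(f"noise misfit: {benford_misfit(noise):.2f}")
        print(f"event misfit: {benford_misfit(event):.2f}")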

  19. Detecting Continuity Violations in Infancy: A New Account and New Evidence from Covering and Tube Events

    ERIC Educational Resources Information Center

    Wang, S.h.; Baillargeon, R.; Paterson, S.

    2005-01-01

    Recent research on infants' responses to occlusion and containment events indicates that, although some violations of the continuity principle are detected at an early age (e.g., Aguiar, A., & Baillargeon, R. (1999). 2.5-month-old infants' reasoning about when objects should and should not be occluded. Cognitive Psychology, 39, 116-157; Hespos, S.…

  20. A novel seizure detection algorithm informed by hidden Markov model event states

    NASA Astrophysics Data System (ADS)

    Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian

    2016-06-01

    Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h-1 versus 0.058 h-1). All seizures were detected an average of 12.1 ± 6.9 s before the unequivocal epileptic onset (UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce the false positive rate relative to current industry standards.
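
    Once the event states have been learned, the predictive state-assignment step described above reduces to choosing the Gaussian under which a new iEEG feature vector is most likely. A minimal sketch of that step only (the feature dimension and state parameters are hypothetical; the Bayesian nonparametric parsing itself is not shown):

```python
import numpy as np
from scipy.stats import multivariate_normal

def assign_state(features, state_means, state_covs):
    """Assign an iEEG feature vector to the most likely event state,
    each state being a fitted multivariate Gaussian."""
    logp = [multivariate_normal.logpdf(features, mean=m, cov=c)
            for m, c in zip(state_means, state_covs)]
    return int(np.argmax(logp))

# Hypothetical two-state example in a 3-D feature space.
means = [np.zeros(3), np.full(3, 2.0)]
covs = [np.eye(3), 0.5 * np.eye(3)]
x = np.array([1.8, 2.1, 1.9])
print(assign_state(x, means, covs))   # -> 1 (the "seizure-like" state)
```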

  1. Efficient, Decentralized Detection of Qualitative Spatial Events in a Dynamic Scalar Field.

    PubMed

    Jeong, Myeong-Hun; Duckham, Matt

    2015-01-01

    This paper describes an efficient, decentralized algorithm to monitor qualitative spatial events in a dynamic scalar field. The events of interest involve changes to the critical points (i.e., peaks, pits and passes) and edges of the surface network derived from the field. Four fundamental types of event (appearance, disappearance, movement and switch) are defined. Our algorithm is designed to rely purely on qualitative information about the neighborhoods of nodes in the sensor network and does not require information about nodes' coordinate positions. Experimental investigations confirm that our algorithm is efficient, with O(n) overall communication complexity (where n is the number of nodes in the sensor network), an even load balance and low operational latency. The accuracy of event detection is comparable to established centralized algorithms for the identification of critical points of a surface network. Our algorithm is relevant to a broad range of environmental monitoring applications of sensor networks. PMID:26343672
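
    For intuition, a node of a surface network can be classified from the cyclic ring of its neighbors' values using the standard sign-change rule: zero sign changes of (neighbor minus node) around the ring indicates a peak or pit, four or more indicate a pass. The sketch below is an illustrative aid under that rule, not the paper's decentralized protocol.

```python
def classify_node(value, ring):
    """Classify a node from its value and its neighbors' values in
    cyclic order; ties between equal values are ignored for brevity."""
    signs = [1 if v > value else -1 for v in ring]
    changes = sum(signs[i] != signs[i - 1] for i in range(len(signs)))
    if changes == 0:
        return "peak" if signs[0] < 0 else "pit"
    return "pass" if changes >= 4 else "regular"

print(classify_node(5.0, [4.1, 3.9, 4.5, 4.8]))   # peak
print(classify_node(5.0, [6.0, 4.0, 6.2, 3.8]))   # pass
```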

  2. Efficient, Decentralized Detection of Qualitative Spatial Events in a Dynamic Scalar Field

    PubMed Central

    Jeong, Myeong-Hun; Duckham, Matt

    2015-01-01

    This paper describes an efficient, decentralized algorithm to monitor qualitative spatial events in a dynamic scalar field. The events of interest involve changes to the critical points (i.e., peaks, pits and passes) and edges of the surface network derived from the field. Four fundamental types of event (appearance, disappearance, movement and switch) are defined. Our algorithm is designed to rely purely on qualitative information about the neighborhoods of nodes in the sensor network and does not require information about nodes’ coordinate positions. Experimental investigations confirm that our algorithm is efficient, with O(n) overall communication complexity (where n is the number of nodes in the sensor network), an even load balance and low operational latency. The accuracy of event detection is comparable to established centralized algorithms for the identification of critical points of a surface network. Our algorithm is relevant to a broad range of environmental monitoring applications of sensor networks. PMID:26343672

  3. Feature selection of seismic waveforms for long period event detection at Cotopaxi Volcano

    NASA Astrophysics Data System (ADS)

    Lara-Cueva, R. A.; Benítez, D. S.; Carrera, E. V.; Ruiz, M.; Rojo-Álvarez, J. L.

    2016-04-01

    Volcano Early Warning Systems (VEWS) have become a research topic in order to preserve human lives and reduce material losses. In this setting, event detection criteria based on classification using machine learning techniques have proven useful, and a number of systems have been proposed in the literature. However, to the best of our knowledge, no comprehensive and principled study has been conducted to compare the influence of the many different sets of possible features that have been used as input spaces in previous works. We present an automatic recognition system for volcano seismicity that considers feature extraction, event classification, and subsequent event detection, in order to reduce the processing time as a first step towards a high-reliability real-time automatic detection system. We compiled and extracted a comprehensive set of temporal, moving average, spectral, and scale-domain features for separating long period seismic events from background noise. We benchmarked two common kinds of feature selection techniques, namely filter (mutual information and statistical dependence) and embedded (cross-validation and pruning), each using suitable classification algorithms such as k Nearest Neighbors (k-NN) and Decision Trees (DT). We applied this approach to the seismicity recorded at Cotopaxi Volcano in Ecuador during 2009 and 2010. The best results were obtained by using a 15 s segmentation window, a feature matrix in the frequency domain, and a DT classifier, yielding 99% detection accuracy and sensitivity. Selected features and their interpretation were consistent among different input spaces, in simple terms of amplitude and spectral content. Our study provides the framework for an event detection system with high accuracy and reduced computational requirements.
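
    As a sketch of the winning configuration (a frequency-domain feature matrix over 15 s windows feeding a Decision Tree), the snippet below computes a few illustrative spectral features and fits a DT. The specific features, the synthetic windows, and the labels are placeholders, not the paper's exact input space.

```python
import numpy as np
from scipy.signal import welch
from sklearn.tree import DecisionTreeClassifier

def spectral_features(window, fs):
    """Frequency-domain features for one 15 s seismic window."""
    f, pxx = welch(window, fs=fs, nperseg=min(1024, len(window)))
    pxx /= pxx.sum()
    centroid = np.sum(f * pxx)                      # spectral centroid
    peak = f[np.argmax(pxx)]                        # dominant frequency
    entropy = -np.sum(pxx * np.log(pxx + 1e-12))    # spectral entropy
    return [centroid, peak, entropy]

# X: one feature row per labelled window; y: 1 = long period event, 0 = noise.
# (Windows and labels are random placeholders for real Cotopaxi data.)
rng = np.random.default_rng(0)
X = [spectral_features(rng.standard_normal(1500), fs=100.0) for _ in range(20)]
y = rng.integers(0, 2, size=20)
clf = DecisionTreeClassifier(max_depth=4).fit(X, y)
```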

  4. Detection, tracking and event localization of interesting features in 4-D atmospheric data

    NASA Astrophysics Data System (ADS)

    Limbach, S.; Schömer, E.; Wernli, H.

    2011-11-01

    We introduce a novel algorithm for the efficient detection and tracking of interesting features in spatiotemporal atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. The algorithm is based on the well-known region growing segmentation method. We extended the basic idea towards the analysis of the complete 4-D dataset, identifying segments representing the spatial features and their development over time. Each segment consists of one set of distinct 3-D features per time step. The algorithm keeps track of the successors of each 3-D feature, constructing the so-called event graph of each segment. The precise localization of the splitting events is based on a search for all grid points inside the initial 3-D feature which have a similar distance to all successive 3-D features of the next time step. The merging event is localized analogously, operating backward in time. We tested the implementation on a four-dimensional field of wind speed data from European Centre for Medium-Range Weather Forecasts (ECMWF) analyses and computed a climatology of upper-tropospheric jet streams and their events. We compare our results with a previous climatology, investigate the statistical distribution of the merging and splitting events, and illustrate the meteorological significance of the jet splitting events with a case study. A brief outlook is given on additional potential applications of the 4-D data segmentation technique.

  5. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation
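
    The detection core of the MFA is a stacked, multichannel normalized cross-correlation of a parent template against the continuous record, with child events appearing as peaks in the stacked trace. A minimal sketch of that stacking step only; the rotation, envelope stacking, and relative-magnitude steps are omitted, and the array shapes are assumptions.

```python
import numpy as np
from scipy.signal import correlate

def stacked_cc(data, template):
    """Average the per-channel normalized cross-correlations of a parent
    template (channels x n) against continuous data (channels x N)."""
    nch, n = template.shape
    stack = None
    for ch in range(nch):
        t = template[ch] - template[ch].mean()
        t /= np.linalg.norm(t) + 1e-20
        x = data[ch]
        num = correlate(x, t, mode="valid")
        # Per-window demeaned energy via running sums.
        c1 = np.cumsum(np.concatenate(([0.0], x)))
        c2 = np.cumsum(np.concatenate(([0.0], x ** 2)))
        win_sum, win_sq = c1[n:] - c1[:-n], c2[n:] - c2[:-n]
        cc = num / (np.sqrt(win_sq - win_sum ** 2 / n) + 1e-20)
        stack = cc if stack is None else stack + cc
    return stack / nch   # peaks mark candidate child events
```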

  6. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-05-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analyzing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multi-channel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multi-component waveforms into the ray-centered co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, i.e. microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavorable S/N conditions. A real-data example using microseismic monitoring data from 4 stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than four-fold) increase in the number of located events compared with the original catalog. Moreover, analysis of the new MFA catalog suggests that this approach leads to more robust interpretation of the

  7. Drivers of Emerging Infectious Disease Events as a Framework for Digital Detection

    PubMed Central

    Olson, Sarah H.; Benedum, Corey M.; Mekaru, Sumiko R.; Preston, Nicholas D.; Mazet, Jonna A.K.; Joly, Damien O.

    2015-01-01

    The growing field of digital disease detection, or epidemic intelligence, attempts to improve timely detection and awareness of infectious disease (ID) events. Early detection remains an important priority; thus, the next frontier for ID surveillance is to improve the recognition and monitoring of drivers (antecedent conditions) of ID emergence for signals that precede disease events. These data could help alert public health officials to indicators of elevated ID risk, thereby triggering targeted active surveillance and interventions. We believe that ID emergence risks can be anticipated through surveillance of their drivers, just as successful warning systems of climate-based, meteorologically sensitive diseases are supported by improved temperature and precipitation data. We present approaches to driver surveillance, gaps in the current literature, and a scientific framework for the creation of a digital warning system. Fulfilling the promise of driver surveillance will require concerted action to expand the collection of appropriate digital driver data. PMID:26196106

  8. Drivers of Emerging Infectious Disease Events as a Framework for Digital Detection.

    PubMed

    Olson, Sarah H; Benedum, Corey M; Mekaru, Sumiko R; Preston, Nicholas D; Mazet, Jonna A K; Joly, Damien O; Brownstein, John S

    2015-08-01

    The growing field of digital disease detection, or epidemic intelligence, attempts to improve timely detection and awareness of infectious disease (ID) events. Early detection remains an important priority; thus, the next frontier for ID surveillance is to improve the recognition and monitoring of drivers (antecedent conditions) of ID emergence for signals that precede disease events. These data could help alert public health officials to indicators of elevated ID risk, thereby triggering targeted active surveillance and interventions. We believe that ID emergence risks can be anticipated through surveillance of their drivers, just as successful warning systems of climate-based, meteorologically sensitive diseases are supported by improved temperature and precipitation data. We present approaches to driver surveillance, gaps in the current literature, and a scientific framework for the creation of a digital warning system. Fulfilling the promise of driver surveillance will require concerted action to expand the collection of appropriate digital driver data. PMID:26196106

  9. Covert Network Analysis for Key Player Detection and Event Prediction Using a Hybrid Classifier

    PubMed Central

    Akram, M. Usman; Khan, Shoab A.; Javed, Muhammad Younus

    2014-01-01

    National security has gained vital importance due to the increasing number of suspicious and terrorist events across the globe. The use of different subfields of information technology to design systems that can detect the main members actually responsible for such events has also attracted much attention from researchers and practitioners. In this paper, we present a novel method to predict key players from a covert network by applying a hybrid framework. The proposed system calculates certain centrality measures for each node in the network and then applies a novel hybrid classifier for detection of key players. Our system also applies anomaly detection to predict any terrorist activity in order to help law enforcement agencies destabilize the involved network. As a proof of concept, the proposed framework has been implemented and tested using different case studies including two publicly available datasets and one local network. PMID:25136674
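
    A minimal sketch of the first stage: computing standard per-node centrality measures with networkx on a stock toy graph. Ranking by a single measure here is a naive stand-in for the paper's hybrid classifier.

```python
import networkx as nx

# Toy network; the paper computes centrality measures per node and
# feeds them to a hybrid classifier that flags probable key players.
G = nx.karate_club_graph()
deg = nx.degree_centrality(G)
btw = nx.betweenness_centrality(G)
clo = nx.closeness_centrality(G)
eig = nx.eigenvector_centrality(G, max_iter=500)
features = {n: (deg[n], btw[n], clo[n], eig[n]) for n in G.nodes}
# Rank by betweenness as a naive stand-in for the hybrid classifier.
print(sorted(features, key=lambda n: features[n][1], reverse=True)[:3])
```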

  10. A novel adaptive, real-time algorithm to detect gait events from wearable sensors.

    PubMed

    Chia Bejarano, Noelia; Ambrosini, Emilia; Pedrocchi, Alessandra; Ferrigno, Giancarlo; Monticone, Marco; Ferrante, Simona

    2015-05-01

    A real-time, adaptive algorithm based on two inertial and magnetic sensors placed on the shanks was developed for gait-event detection. For each leg, the algorithm detected the Initial Contact (IC), as the minimum of the flexion/extension angle, and the End Contact (EC) and the Mid-Swing (MS), as minimum and maximum of the angular velocity, respectively. The algorithm consisted of calibration, real-time detection, and step-by-step update. Data collected from 22 healthy subjects (21 to 85 years) walking at three self-selected speeds were used to validate the algorithm against the GaitRite system. Comparable levels of accuracy and significantly lower detection delays were achieved with respect to other published methods. The algorithm robustness was tested on ten healthy subjects performing sudden speed changes and on ten stroke subjects (43 to 89 years). For healthy subjects, F1-scores of 1 and mean detection delays lower than 14 ms were obtained. For stroke subjects, F1-scores of 0.998 and 0.944 were obtained for IC and EC, respectively, with mean detection delays always below 31 ms. The algorithm accurately detected gait events in real time from a heterogeneous dataset of gait patterns and paves the way for the design of closed-loop controllers for customized gait trainings and/or assistive devices. PMID:25069118
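
    The detection rules themselves are simple extrema searches: IC at minima of the shank flexion/extension angle, EC and MS at minima and maxima of the angular velocity. An offline sketch using scipy peak picking (the 0.5 s minimum event spacing is an illustrative assumption; the real algorithm runs in real time with calibration and step-by-step updates):

```python
import numpy as np
from scipy.signal import find_peaks

def gait_events(angle, gyro, fs):
    """Detect gait events per the rules in the abstract: IC at minima of
    the flexion/extension angle, EC at minima and MS at maxima of the
    angular velocity. Offline stand-in for the real-time algorithm."""
    ic, _ = find_peaks(-np.asarray(angle), distance=int(0.5 * fs))  # angle minima
    ec, _ = find_peaks(-np.asarray(gyro), distance=int(0.5 * fs))   # velocity minima
    ms, _ = find_peaks(np.asarray(gyro), distance=int(0.5 * fs))    # velocity maxima
    return ic / fs, ec / fs, ms / fs
```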

  11. Sparse conditional mixture model: late fusion with missing scores for multimedia event detection

    NASA Astrophysics Data System (ADS)

    Nallapati, Ramesh; Yeh, Eric; Myers, Gregory

    2013-03-01

    The problem of event detection in multimedia clips is typically handled by modeling each of the component modalities independently, then combining their detection scores in a late fusion approach. One of the problems of a late fusion model in the multimedia setting is that the detection scores may be missing from one or more components for a given clip; e.g., when there is no speech in the clip, or when there is no overlay text. Standard fusion techniques typically address this problem by assuming a default backoff score for a component when its detection score is missing for a clip. This may potentially bias the fusion model, especially if there are many missing detections from a given component. In this work, we present the Sparse Conditional Mixture Model (SCMM), which models only the observed detection scores for each example, thereby avoiding any assumptions about the distributions of the scores that are made by backoff models. Our experiments in multimedia event detection using the TRECVID-2011 corpus demonstrate that SCMM achieves statistically significant performance gains over standard late fusion techniques. The SCMM model is very general and is applicable to fusion problems with missing data in any domain.

  12. The development of a temporal-BRDF model-based approach to change detection, an application to the identification and delineation of fire affected areas

    NASA Astrophysics Data System (ADS)

    Rebelo, Lisa-Maria

    Although large areas of southern Africa burn every year, minimal information is available relating to the fire regimes of this area. This study develops a new, generic approach to change detection, applicable to the identification of land cover change from high temporal and moderate spatial resolution satellite data. Traditional change detection techniques have several key limitations which are identified and addressed in this work. In particular these approaches fail to account for directional effects in the remote sensing signal introduced by variations in the solar and sensing geometry, and are sensitive to underlying phenological changes in the surface as well as noise in the data due to cloud or atmospheric contamination. This research develops a bi-directional, model-based change detection algorithm. An empirical temporal component is incorporated into a semi-empirical linear BRDF model. This may be fitted to a long time series of reflectance with less sensitivity to the presence of underlying phenological change. Outliers are identified based on an estimation of noise in the data and the calculation of uncertainty in the model parameters and are removed from the sequence. A "step function kernel" is incorporated into the formulation in order to detect explicitly sudden step-like changes in the surface reflectance induced by burning. The change detection model is applied to the problem of locating and mapping fire affected areas from daily moderate spatial resolution satellite data, and an indicator of burn severity is introduced. Monthly burned area datasets for a 2400 km by 1200 km area of southern Africa detailing the day and severity of burning are created for a five-year period (2000-2004). These data are analysed and the fire regimes of southern African ecosystems during this time are characterised. The results highlight the extent of the burning which is taking place within southern Africa, with 27-32% of the study area burning during each
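
    The key formal ingredient is the step-function kernel added to the temporal model, so that an abrupt, fire-induced change in reflectance is fitted explicitly rather than absorbed into noise. A heavily simplified sketch under stated assumptions: a plain linear trend stands in for the semi-empirical BRDF kernels, and candidate change days are scanned exhaustively.

```python
import numpy as np

def fit_step_change(t, reflectance):
    """Scan candidate change days, fitting a linear trend plus a step
    kernel by least squares; return the day whose step term best
    reduces the residual. (A stand-in for the full kernel-driven
    BRDF model, which also carries angular kernels.)"""
    t = np.asarray(t, dtype=float)
    best = (np.inf, None, 0.0)
    for t0 in t[1:-1]:
        step = (t >= t0).astype(float)            # step-function kernel
        A = np.column_stack([np.ones_like(t), t, step])
        coef, res, *_ = np.linalg.lstsq(A, reflectance, rcond=None)
        rss = res[0] if res.size else np.inf
        if rss < best[0]:
            best = (rss, t0, coef[2])             # coef[2]: step magnitude
    return best[1], best[2]
```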

  13. Event Detection and Location of Earthquakes Using the Cascadia Initiative Dataset

    NASA Astrophysics Data System (ADS)

    Morton, E.; Bilek, S. L.; Rowe, C. A.

    2015-12-01

    The Cascadia subduction zone (CSZ) produces a range of slip behavior along the plate boundary megathrust, from great earthquakes to episodic slow slip and tremor (ETS). Unlike other subduction zones that produce great earthquakes and ETS, the CSZ is notable for the lack of small and moderate magnitude earthquakes recorded. The seismogenic zone extent is currently estimated to be primarily offshore, thus the lack of observed small, interplate earthquakes may be partially due to the use of only land seismometers. The Cascadia Initiative (CI) community seismic experiment seeks to address this issue by including ocean bottom seismometers (OBS) deployed directly over the locked seismogenic zone, in addition to land seismometers. We use these seismic data to explore whether small magnitude earthquakes are occurring on the plate interface, but have gone undetected by the land-based seismic networks. We select a subset of small magnitude (M0.1-3.7) earthquakes from existing earthquake catalogs, based on land seismic data, whose preliminary hypocentral locations suggest they may have occurred on the plate interface. We window the waveforms on CI OBS and land seismometers around the phase arrival times for these earthquakes to generate templates for subspace detection, which allows for additional flexibility over traditional matched filter detection methods. Here we present event detections from the first year of CI deployment and preliminary locations for the detected events. Initial results of scanning the first year of the CI deployment using one cluster of template events, located near a previously identified subducted seamount, include 473 detections on OBS station M08A (~61.6 km offshore) and 710 detections on OBS station J25A (~44.8 km northeast of M08A). Ongoing efforts include detection using additional OBS stations along the margin, as well as determining locations of clusters detected in the first year of deployment.
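
    Subspace detection generalizes matched filtering by projecting each data window onto a low-dimensional orthonormal basis built from several aligned templates, which is what provides the extra flexibility mentioned above. A minimal single-channel sketch; the basis dimension and energy-fraction threshold are illustrative assumptions, not the study's settings.

```python
import numpy as np

def subspace_detector(templates, data, ndim=2, threshold=0.6):
    """Build an orthonormal signal subspace from aligned template
    waveforms (rows) via SVD, then slide over the data and flag windows
    whose energy is largely captured by the subspace."""
    u, s, vt = np.linalg.svd(np.asarray(templates), full_matrices=False)
    basis = vt[:ndim]                       # ndim orthonormal basis waveforms
    n = basis.shape[1]
    detections = []
    for i in range(len(data) - n + 1):
        w = data[i:i + n]
        proj = basis @ w                    # subspace coefficients
        frac = np.sum(proj ** 2) / (np.sum(w ** 2) + 1e-20)
        if frac > threshold:
            detections.append(i)
    return detections
```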

  14. Event-specific quantitative detection of nine genetically modified maizes using one novel standard reference molecule.

    PubMed

    Yang, Litao; Guo, Jinchao; Pan, Aihu; Zhang, Haibo; Zhang, Kewei; Wang, Zhengming; Zhang, Dabing

    2007-01-10

    With the development of genetically modified organism (GMO) detection techniques, the Polymerase Chain Reaction (PCR) technique has become the mainstay for GMO detection, and real-time PCR is the most effective and important method for GMO quantification. An event-specific detection strategy based on the unique and specific integration junction sequences between the host plant genome DNA and the integrated gene is being developed for its high specificity. This study establishes event-specific detection methods for TC1507 and CBH351 maizes. In addition, the event-specific TaqMan real-time PCR detection methods for another seven GM maize events (Bt11, Bt176, GA21, MON810, MON863, NK603, and T25) were systematically optimized and developed. In these PCR assays, the fluorescent quencher, TAMRA, was attached to a T base of the probe at an internal position to improve the intensity of the fluorescent signal. To overcome the difficulties in obtaining certified reference materials for these GM maizes, one novel standard reference molecule containing all nine specific integration junction sequences of these GM maizes and the maize endogenous reference gene, zSSIIb, was constructed and used for quantitative analysis. The limits of detection of these methods were 20 copies for the different GM maizes, the limits of quantitation were about 20 copies, and the dynamic ranges for quantification were from 0.05 to 100% in 100 ng of DNA template. Furthermore, nine groups of mixed maize samples of these nine GM maize events were quantitatively analyzed to evaluate accuracy and precision. The accuracy expressed as bias varied from 0.67 to 28.00% for the nine tested groups of GM maize samples, and the precision expressed as relative standard deviation was from 0.83 to 26.20%. All of these indicated that the established event-specific real-time PCR detection systems and the reference molecule in this study are suitable for the identification and quantification of these GM

  15. Detection, tracking and event localization of jet stream features in 4-D atmospheric data

    NASA Astrophysics Data System (ADS)

    Limbach, S.; Schömer, E.; Wernli, H.

    2012-04-01

    We introduce a novel algorithm for the efficient detection and tracking of features in spatiotemporal atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. The algorithm works on data given on a four-dimensional structured grid. Feature selection and clustering are based on adjustable local and global criteria, feature tracking is predominantly based on spatial overlaps of the feature's full volumes. The resulting 3-D features and the identified correspondences between features of consecutive time steps are represented as the nodes and edges of a directed acyclic graph, the event graph. Merging and splitting events appear in the event graph as nodes with multiple incoming or outgoing edges, respectively. The precise localization of the splitting events is based on a search for all grid points inside the initial 3-D feature that have a similar distance to two successive 3-D features of the next time step. The merging event is localized analogously, operating backward in time. As a first application of our method we present a climatology of upper-tropospheric jet streams and their events, based on four-dimensional wind speed data from European Centre for Medium-Range Weather Forecasts (ECMWF) analyses. We compare our results with a climatology from a previous study, investigate the statistical distribution of the merging and splitting events, and illustrate the meteorological significance of the jet splitting events with a case study. A brief outlook is given on additional potential applications of the 4-D data segmentation technique.

  16. Signal Detection of Adverse Drug Reaction of Amoxicillin Using the Korea Adverse Event Reporting System Database.

    PubMed

    Soukavong, Mick; Kim, Jungmee; Park, Kyounghoon; Yang, Bo Ram; Lee, Joongyub; Jin, Xue Mei; Park, Byung Joo

    2016-09-01

    We conducted pharmacovigilance data mining for a β-lactam antibiotic, amoxicillin, and compared the adverse events (AEs) with the drug labels of 9 countries: Korea, the USA, the UK, Japan, Germany, Switzerland, Italy, France, and Laos. We used the Korea Adverse Event Reporting System (KAERS) database, a nationwide database of AE reports, between December 1988 and June 2014. Frequentist and Bayesian methods were used to calculate the disproportionality distribution of drug-AE pairs. An AE detected by all three indices, the proportional reporting ratio (PRR), reporting odds ratio (ROR), and information component (IC), was defined as a signal. The KAERS database contained a total of 807,582 AE reports, among which 1,722 reports were attributed to amoxicillin. Among the 192,510 antibiotics-AE pairs, the number of amoxicillin-AE pairs was 2,913. Among 241 AEs, 52 adverse events were detected as amoxicillin signals. Comparing the drug labels of the 9 countries, 12 adverse events including ineffective medicine, bronchitis, rhinitis, sinusitis, dry mouth, gastroesophageal reflux, hypercholesterolemia, gastric carcinoma, abnormal crying, induration, pulmonary carcinoma, and influenza-like symptoms were not listed on any of the labels. In conclusion, we detected 12 new signals of amoxicillin that were not listed on the labels of the 9 countries. These signals should therefore be evaluated for causal association, clinical significance, and preventability. PMID:27510377
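
    For reference, the three disproportionality indices can be computed from the standard 2x2 report table. The sketch below uses common textbook point-estimate forms and hypothetical counts; the study's signal criteria additionally involve confidence bounds, which are omitted here.

```python
import numpy as np

def disproportionality(a, b, c, d):
    """Point estimates of the three indices used in the study, from a
    2x2 table: a = drug-of-interest reports with the AE, b = with other
    AEs; c and d = the same counts for all other drugs."""
    prr = (a / (a + b)) / (c / (c + d))          # proportional reporting ratio
    ror = (a * d) / (b * c)                      # reporting odds ratio
    n = a + b + c + d
    ic = np.log2(a * n / ((a + b) * (a + c)))    # information component
    return prr, ror, ic

# Hypothetical counts for one amoxicillin-AE pair.
print(disproportionality(a=15, b=2898, c=450, d=189147))
```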

  17. Signal Detection of Adverse Drug Reaction of Amoxicillin Using the Korea Adverse Event Reporting System Database

    PubMed Central

    2016-01-01

    We conducted pharmacovigilance data mining for a β-lactam antibiotic, amoxicillin, and compared the adverse events (AEs) with the drug labels of 9 countries: Korea, the USA, the UK, Japan, Germany, Switzerland, Italy, France, and Laos. We used the Korea Adverse Event Reporting System (KAERS) database, a nationwide database of AE reports, between December 1988 and June 2014. Frequentist and Bayesian methods were used to calculate the disproportionality distribution of drug-AE pairs. An AE detected by all three indices, the proportional reporting ratio (PRR), reporting odds ratio (ROR), and information component (IC), was defined as a signal. The KAERS database contained a total of 807,582 AE reports, among which 1,722 reports were attributed to amoxicillin. Among the 192,510 antibiotics-AE pairs, the number of amoxicillin-AE pairs was 2,913. Among 241 AEs, 52 adverse events were detected as amoxicillin signals. Comparing the drug labels of the 9 countries, 12 adverse events including ineffective medicine, bronchitis, rhinitis, sinusitis, dry mouth, gastroesophageal reflux, hypercholesterolemia, gastric carcinoma, abnormal crying, induration, pulmonary carcinoma, and influenza-like symptoms were not listed on any of the labels. In conclusion, we detected 12 new signals of amoxicillin that were not listed on the labels of the 9 countries. These signals should therefore be evaluated for causal association, clinical significance, and preventability. PMID:27510377

  18. A method for detecting and locating geophysical events using groups of arrays

    NASA Astrophysics Data System (ADS)

    de Groot-Hedlin, Catherine D.; Hedlin, Michael A. H.

    2015-11-01

    We have developed a novel method to detect and locate geophysical events that makes use of any sufficiently dense sensor network. This method is demonstrated using acoustic sensor data collected in 2013 at the USArray Transportable Array (TA). The algorithm applies Delaunay triangulation to divide the sensor network into a mesh of three-element arrays, called triads. Because infrasound waveforms are incoherent between the sensors within each triad, the data are transformed into envelopes, which are cross-correlated to find signals that satisfy a consistency criterion. The propagation azimuth, phase velocity and signal arrival time are computed for each signal. Triads with signals that are consistent with a single source are bundled as an event group. The ensemble of arrival times and azimuths of detected signals within each group are used to locate a common source in space and time. A total of 513 infrasonic stations that were active for part or all of 2013 were divided into over 2000 triads. Low (0.5-2 Hz) and high (2-8 Hz) catalogues of infrasonic events were created for the eastern USA. The low-frequency catalogue includes over 900 events and reveals several highly active source areas on land that correspond with coal mining regions. The high-frequency catalogue includes over 2000 events, with most occurring offshore. Although their cause is not certain, most events are clearly anthropogenic as almost all occur during regular working hours each week. The regions to which the TA is most sensitive vary seasonally, with the direction of reception dependent on the direction of zonal winds. The catalogue has also revealed large acoustic events that may provide useful insight into the nature of long-range infrasound propagation in the atmosphere.
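
    The meshing step is ordinary Delaunay triangulation of the station coordinates, with each resulting triangle treated as a three-element "triad". A minimal sketch with scipy; the coordinates are hypothetical, and the envelope cross-correlation and consistency screening are not shown.

```python
import numpy as np
from scipy.spatial import Delaunay

# Station coordinates (lon, lat); a hypothetical five-station subnetwork.
stations = np.array([[-100.1, 35.2], [-99.4, 35.9], [-100.8, 36.1],
                     [-99.0, 35.1], [-100.3, 36.8]])
tri = Delaunay(stations)
# Each simplex is one three-element triad; within a triad, signal
# envelopes are cross-correlated and screened for azimuth/velocity
# consistency before triads are bundled into event groups.
for triad in tri.simplices:
    print("triad:", triad)
```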

  19. A Heuristic Indication and Warning Staging Model for Detection and Assessment of Biological Events

    PubMed Central

    Wilson, James M.; Polyak, Marat G.; Blake, Jane W.; Collmann, Jeff

    2008-01-01

    Objective This paper presents a model designed to enable rapid detection and assessment of biological threats that may require swift intervention by the international public health community. Design We utilized Strauss’ grounded theory to develop an expanded model of social disruption due to biological events based on retrospective and prospective case studies. We then applied this model to the temporal domain and propose a heuristic staging model, the Wilson–Collmann Scale for assessing biological event evolution. Measurements We retrospectively and manually examined hard copy archival local media reports in the native vernacular for three biological events associated with substantial social disruption. The model was then tested prospectively through media harvesting based on keywords corresponding to the model parameters. Results Our heuristic staging model provides valuable information about the features of a biological event that can be used to determine the level of concern warranted, such as whether the pathogen in question is responding to established public health disease control measures, including the use of antimicrobials or vaccines; whether the public health and medical infrastructure of the country involved is adequate to mount the necessary response; whether the country’s officials are providing an appropriate level of information to international public health authorities; and whether the event poses an international threat. The approach is applicable for monitoring open-source (public-domain) media for indications and warnings of such events, and specifically for markers of the social disruption that commonly occurs as these events unfold. These indications and warnings can then be used as the basis for staging the biological threat in the same manner that the United States National Weather Service currently uses storm warning models (such as the Saffir-Simpson Hurricane Scale) to detect and assess threatening weather conditions. Conclusion

  20. Characterization and Analysis of Networked Array of Sensors for Event Detection (CANARY-EDS)

    Energy Science and Technology Software Center (ESTSC)

    2011-05-27

    CANARY-EDS provides probabilistic event detection based on analysis of time-series data from water quality or other sensors. CANARY also can compare patterns against a library of previously seen data to indicate that a certain pattern has reoccurred, suppressing what would otherwise be considered an event. CANARY can be configured to analyze previously recorded data from files or databases, or it can be configured to run in real-time mode directly from a database or through the US EPA EDDIES software.

  1. CTBT infrasound network performance to detect the 2013 Russian fireball event

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garcés, Milton A.

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  2. Collaborative-Comparison Learning for Complex Event Detection Using Distributed Hierarchical Graph Neuron (DHGN) Approach in Wireless Sensor Network

    NASA Astrophysics Data System (ADS)

    Muhamad Amin, Anang Hudaya; Khan, Asad I.

    Research trends in existing event detection schemes using Wireless Sensor Network (WSN) have mainly focused on routing and localisation of nodes for optimum coordination when retrieving sensory information. Efforts have also been put in place to create schemes that are able to provide learning mechanisms for event detection using classification or clustering approaches. These schemes entail substantial communication and computational overheads owing to the event-oblivious nature of data transmissions. In this paper, we present an event detection scheme that has the ability to distribute detection processes over the resource-constrained wireless sensor nodes and is suitable for events with spatio-temporal characteristics. We adopt a pattern recognition algorithm known as Distributed Hierarchical Graph Neuron (DHGN) with collaborative-comparison learning for detecting critical events in WSN. The scheme demonstrates good accuracy for binary classification and offers low-complexity and high-scalability in terms of its processing requirements.

  3. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO).

    PubMed

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-01-01

    The ability to identify hazardous traffic events is already considered one of the most effective solutions for reducing the occurrence of crashes. Previous studies have examined only certain particular hazardous traffic events, mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm ranked best in terms of prediction accuracy. The conclusions can provide reference evidence for the development of dangerous-situation warning products and the design of intelligent vehicles. PMID:27420073
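
    A sketch of the second stage under stated assumptions: sklearn's SVC, whose underlying libsvm solver is SMO-based, trained on placeholder rows standing in for the MB-selected driving features.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# X: rows of MB-selected driving features (speed, std of speed, std of
# skin conductance, ...); y: 1 = hazardous event, 0 = normal driving.
# Both are synthetic placeholders for the naturalistic driving data.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 8))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.standard_normal(200) > 0.8).astype(int)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print(cross_val_score(model, X, y, cv=5).mean())
```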

  4. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO)

    PubMed Central

    Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing

    2016-01-01

    The ability to identify hazardous traffic events is already considered one of the most effective solutions for reducing the occurrence of crashes. Previous studies have examined only certain particular hazardous traffic events, mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle’s speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm ranked best in terms of prediction accuracy. The conclusions can provide reference evidence for the development of dangerous-situation warning products and the design of intelligent vehicles. PMID:27420073

  5. Support Vector Machine Model for Automatic Detection and Classification of Seismic Events

    NASA Astrophysics Data System (ADS)

    Barros, Vesna; Barros, Lucas

    2016-04-01

    The automated processing of multiple seismic signals to detect, localize and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue, and events are often unclassified or poorly classified. Machine learning techniques can therefore be used in automatic processing to classify the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique called the Support Vector Machine (SVM). According to Kortström et al. (2015), the advantages of using SVM are its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and classify them as earthquakes or quarry blasts. The aim is to create a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taking this a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions, such as infrasound and hydroacoustic waveforms. As an authorized user, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g. earthquakes, quarry blasts) and noise is being analysed to train the model and learn the typical pattern of the signal from these events. Moreover, comparing the performance of the support

  6. Surface-Wave Multiple-Event Relocation and Detection of Earthquakes along the Romanche Fracture Zone

    NASA Astrophysics Data System (ADS)

    Cleveland, M.; Ammon, C. J.; VanDeMark, T. F.

    2011-12-01

    The Romanche Transform system, located along the equatorial Mid-Atlantic Ridge, is approximately 900 km in length and separates plates moving with a relative plate speed of 3 cm/yr. We use cross-correlation of globally recorded Rayleigh waves to estimate precise relative epicentroids of moderate-size earthquakes along the Romanche Fracture Zone system. The Romanche transform has an even distribution of large events along its entire length that provide a good base of events with excellent signal-to-noise observations. Two distinct moderate-magnitude event clusters occur along the eastern half of the transform, and the region between the clusters hosted a large event in the last decade. Based on initial results (Van DeMark, 2006), and unlike those of shorter transform systems, the events along the Romanche do not follow narrow features; the event clusters seem to spread perpendicular to, as well as along, the transform trend. These patterns are consistent with parallel, en echelon and/or braided fault systems, which have previously been observed on the Romanche through the use of side-scanning sonar (Parson and Searle, 1986). We also explore the character and potential of seismic body waves to extend the method and help improve relative event depth estimates. Relying on a good base of larger and moderate-magnitude seismicity, we attempt to extend the analysis by processing continuous data streams with measures of waveform similarity (e.g. cross-correlation) in an attempt to detect smaller events using a subset of the nearest seismic stations.

  7. Detection of Unusual Events and Trends in Complex Non-Stationary Data Streams

    SciTech Connect

    Perez, Rafael B; Protopopescu, Vladimir A; Worley, Brian Addison; Perez, Cristina

    2006-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for a host of different applications, ranging from nuclear power plant and electric grid operation to internet traffic and implementation of non-proliferation protocols. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden intermittent events inside non-stationary signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method.
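
    A minimal sketch of this kind of scan: empirical mode decomposition (here via the third-party PyEMD package, an implementation assumption) followed by a Hilbert envelope of the fastest mode and a robust amplitude threshold to flag local, intermittent disturbances.

```python
import numpy as np
from PyEMD import EMD            # pip install EMD-signal
from scipy.signal import hilbert

def intermittent_event_scan(signal, fs, k=4.0):
    """Hilbert-Huang style scan: decompose into intrinsic mode functions,
    take the instantaneous amplitude of the highest-frequency mode, and
    flag samples whose amplitude exceeds k robust standard deviations."""
    imfs = EMD().emd(np.asarray(signal, dtype=float))
    amp = np.abs(hilbert(imfs[0]))          # envelope of the fastest IMF
    med = np.median(amp)
    mad = np.median(np.abs(amp - med)) + 1e-20
    flags = np.flatnonzero(amp > med + k * 1.4826 * mad)
    return flags / fs                       # times of candidate events
```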

  8. Use of a clinical event monitor to prevent and detect medication errors.

    PubMed Central

    Payne, T. H.; Savarino, J.; Marshall, R.; Hoey, C. T.

    2000-01-01

    Errors in health care facilities are common and often unrecognized. We have used our clinical event monitor to prevent and detect medication errors by scrutinizing electronic messages sent to it when any medication order is written in our facility. A growing collection of medication safety rules covering dose limit errors, laboratory monitoring, and other topics may be applied to each medication order message to provide an additional layer of protection beyond existing order checks, reminders, and alerts available within our computer-based record system. During a typical day the event monitor receives 4802 messages, of which 4719 pertain to medication orders. We have found the clinical event monitor to be a valuable tool for clinicians and quality management groups charged with improving medication safety. PMID:11079962

  9. Detection of Severe Rain on Snow events using passive microwave remote sensing

    NASA Astrophysics Data System (ADS)

    Grenfell, T. C.; Putkonen, J.

    2007-12-01

    Severe wintertime rain-on-snow (ROS) events create a strong ice layer or layers in the snow on arctic tundra that act as a barrier to ungulate grazing. These events are linked with large-scale ungulate herd declines via starvation and reduced calf production rates when the animals are unable to penetrate the resulting ice layer. ROS events also produce considerable perturbation in the mean wintertime soil temperature beneath the snow pack. ROS is a sporadic but well-known and significant phenomenon that is currently very poorly documented. Characterization of the distribution and occurrence of severe rain-on-snow events is based only on anecdotal evidence, indirect observations of carcasses found adjacent to iced snow packs, and irregular detection by a sparse observational weather network. We have analyzed in detail a particular well-identified ROS event that took place on Banks Island in early October 2003 and resulted in the death of 20,000 musk oxen. We make use of multifrequency passive microwave imagery from the Special Sensor Microwave/Imager (SSM/I) in conjunction with a strong-fluctuation-theory (SFT) emissivity model. We show that a combination of time series analysis and cluster analysis based on microwave spectral gradients and polarization ratios provides a means to detect the stages of the ROS event resulting from the modification of the vertical structure of the snow pack, specifically the wetting of the snow, the accumulation of liquid water at the base of the snow during the rain event, and the subsequent modification of the snowpack after refreezing. SFT model analysis provides quantitative confirmation of our interpretation of the evolution of the microwave properties of the snowpack as a result of the ROS event. In particular, in addition to the grain coarsening due to destructive metamorphism, we detect the presence of the internal water and ice layers, directly identifying the physical properties producing the
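
    The clustering inputs named above, polarization ratios and spectral gradients, are simple brightness-temperature combinations. The definitions below follow common passive-microwave practice and are assumptions, not necessarily the authors' exact forms.

```python
def microwave_indices(tb_19v, tb_19h, tb_37v, tb_37h):
    """Polarization ratio and spectral gradient from SSM/I brightness
    temperatures (K) at 19 and 37 GHz, vertical/horizontal polarization.
    Common textbook forms, used here as illustrative assumptions."""
    pr37 = (tb_37v - tb_37h) / (tb_37v + tb_37h)    # polarization ratio at 37 GHz
    grad = (tb_37v - tb_19v) / (37.0 - 19.0)        # spectral gradient, K/GHz
    return pr37, grad

# Hypothetical brightness temperatures for one pixel and time step.
print(microwave_indices(tb_19v=252.0, tb_19h=240.0, tb_37v=248.0, tb_37h=238.0))
```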

  10. Detection and analysis of high-temperature events in the BIRD mission

    NASA Astrophysics Data System (ADS)

    Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang

    2005-01-01

    The primary mission objective of the new small Bi-spectral InfraRed Detection (BIRD) satellite is the detection and quantitative analysis of high-temperature events like fires and volcanoes. The absence of saturation in the BIRD infrared channels makes it possible to improve false alarm rejection as well as to retrieve quantitative characteristics of hot targets, including their effective fire temperature, area, and radiative energy release. Examples are given of the detection and analysis of wildfires and coal seam fires, of volcanic activity, as well as of oil fires in Iraq. The smallest fires detected by BIRD, which were verified on the ground, had an area of 12 m2 in daytime and 4 m2 at night.

  11. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    NASA Astrophysics Data System (ADS)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that the further development of such systems is necessary to continue to improve society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. Only regions where movement is occurring are analyzed to segment moving objects, ignoring the influence of pixels from regions where there is no movement. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, persons and vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.
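
    A minimal sketch of the moving-object segmentation front end, using OpenCV's adaptive MOG2 background subtractor as a stand-in for the paper's illumination-robust model and sparse-frame flow detection (the video filename is hypothetical).

```python
import cv2

# Adaptive background subtraction; MOG2 updates its model over time,
# so slow illumination changes are absorbed into the background.
backsub = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)
cap = cv2.VideoCapture("parking_lot_ir.avi")     # hypothetical IR clip
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fgmask = backsub.apply(frame)                # foreground (movement) mask
    fgmask = cv2.medianBlur(fgmask, 5)           # suppress speckle noise
    contours, _ = cv2.findContours(fgmask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Contours above a size threshold become tracked objects, later
    # classified by size and velocity (person versus vehicle).
cap.release()
```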

  12. Assessing Reliability of Medical Record Reviews for the Detection of Hospital Adverse Events

    PubMed Central

    Ock, Minsu; Lee, Sang-il; Jo, Min-Woo; Lee, Jin Yong; Kim, Seon-Ha

    2015-01-01

    Objectives: The purpose of this study was to assess the inter-rater and intra-rater reliability of medical record review for the detection of hospital adverse events. Methods: We conducted a two-stage retrospective review of the medical records of a random sample of 96 patients from one acute-care general hospital. The first stage was an explicit patient record review by two nurses to detect the presence of 41 screening criteria (SC). The second stage was an implicit structured review by two physicians to identify the occurrence of adverse events in the cases positive on the SC. The inter-rater reliability of the two nurses and that of the two physicians were assessed. The intra-rater reliability was also evaluated using the test-retest method approximately two weeks later. Results: In 84.2% of the patient medical records, the nurses agreed as to the necessity for the second stage review (kappa, 0.68; 95% confidence interval [CI], 0.54 to 0.83). In 93.0% of the patient medical records screened by nurses, the physicians agreed about the absence or presence of adverse events (kappa, 0.71; 95% CI, 0.44 to 0.97). When assessing intra-rater reliability, the kappa indices of the two nurses were 0.54 (95% CI, 0.31 to 0.77) and 0.67 (95% CI, 0.47 to 0.87), whereas those of the two physicians were 0.87 (95% CI, 0.62 to 1.00) and 0.37 (95% CI, -0.16 to 0.89). Conclusions: In this study, the medical record review for detecting adverse events showed an intermediate to good level of inter-rater and intra-rater reliability. A well-organized training program for reviewers and clearly defined SC are required to obtain more reliable results in hospital adverse event studies. PMID:26429290
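
    The reliability statistic used throughout is Cohen's kappa, which rescales raw agreement by the agreement expected under chance. A minimal sketch with hypothetical reviewer decisions:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical reviewer decisions (1 = adverse event present) for the
# same ten screened records, mirroring the physician stage of the study.
physician_a = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
physician_b = [1, 0, 1, 1, 1, 0, 0, 0, 1, 0]
# Raw agreement is 0.9; chance agreement is 0.5, so kappa = 0.8.
print(cohen_kappa_score(physician_a, physician_b))
```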

  13. How unusual are the "unusual events" detected by control chart techniques in healthcare settings?

    PubMed

    Borckardt, Jeffrey J; Nash, Michael R; Hardesty, Susan; Herbert, Joan; Cooney, Harriet; Pelic, Christopher

    2006-01-01

    Statistical process control (SPC) charts have become widely implemented tools for quality monitoring and assurance in healthcare settings across the United States. SPC methods have been successfully used in industrial settings to track the quality of products manufactured by machines and to detect deviations from acceptable levels of product quality. However, problems may arise when SPC methods are used to evaluate human behavior. Specifically, when human behavior is tracked over time, the data stream generated usually exhibits periodicity and gradualism with respect to behavioral changes over time. These tendencies can be quantified and are recognized in the statistical field as autocorrelation. When autocorrelation is present, conventional SPC methods too often identify events as "unusual" when they really should be understood as products of random fluctuation. This article discusses the concept of autocorrelation and demonstrates the negative impact of autocorrelation on traditional SPC methods, with a specific focus on the use of SPC charts to detect unusual events. PMID:16944647
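
    The effect is easy to reproduce: on positively autocorrelated data with no real shifts, individuals-chart limits derived from the average moving range are far too tight, so a conventional SPC chart flags many "unusual events". A minimal sketch with an AR(1) series (the parameters are illustrative):

```python
import numpy as np

# AR(1) series: positively autocorrelated "behavioral" data, no real shifts.
rng = np.random.default_rng(42)
n, phi = 500, 0.8
x = np.zeros(n)
for i in range(1, n):
    x[i] = phi * x[i - 1] + rng.standard_normal()

# Individuals-chart limits estimate sigma from the average moving range,
# which reflects only short-term variation and is too tight here.
sigma_hat = np.mean(np.abs(np.diff(x))) / 1.128
ucl, lcl = x.mean() + 3 * sigma_hat, x.mean() - 3 * sigma_hat
print(np.sum((x > ucl) | (x < lcl)))   # far more "unusual events" than the
                                       # ~1.35 expected from independent data
```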

  14. Energy Reconstruction for Events Detected in TES X-ray Detectors

    NASA Astrophysics Data System (ADS)

    Ceballos, M. T.; Cardiel, N.; Cobo, B.

    2015-09-01

    The processing of the X-ray events detected by a TES (Transition Edge Sensor) device, such as the X-IFU high-spectral-resolution instrument (Barret et al. 2013) to be proposed in the ESA AO call for instruments for the Athena mission (Nandra et al. 2013), is a several-step procedure that starts with the detection of current pulses in a noisy signal and ends with their energy reconstruction. For this last stage, an energy calibration process is required to convert the pseudo energies measured in the detector into the real energies of the incoming photons, accounting for possible nonlinearity effects in the detector. We present the details of the energy calibration algorithm we implemented as the last part of the Event Processing software that we are developing for the X-IFU instrument, which permits the calculation of the calibration constants in an analytical way.
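
    The abstract does not give the analytical form of the calibration, but the gain-scale step it describes, mapping detector pseudo-energies to true photon energies while absorbing nonlinearity, can be sketched as a low-order polynomial fitted to known line energies; every value below is an illustrative assumption:

```python
import numpy as np

# Pseudo-energies measured at known calibration lines (illustrative)
pseudo = np.array([1.02, 2.08, 4.25, 6.55])   # detector units
true_kev = np.array([1.0, 2.0, 4.0, 6.0])     # known line energies, keV

# A low-order polynomial absorbs the detector's nonlinearity
coeffs = np.polyfit(pseudo, true_kev, deg=2)

def reconstruct_energy(pseudo_energy):
    """Convert a measured pseudo-energy to a calibrated energy in keV."""
    return np.polyval(coeffs, pseudo_energy)

print(reconstruct_energy(3.1))
```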

  15. Automatic Detection and Classification of Unsafe Events During Power Wheelchair Use

    PubMed Central

    Moghaddam, Athena K.; Yuen, Hiu Kim; Archambault, Philippe S.; Routhier, François; Michaud, François; Boissy, Patrick

    2014-01-01

    Using a powered wheelchair (PW) is a complex task requiring advanced perceptual and motor control skills. Unfortunately, PW incidents and accidents are not uncommon and their consequences can be serious. The objective of this paper is to develop technological tools that can be used to characterize a wheelchair user’s driving behavior under various settings. In the experiments conducted, PWs are outfitted with a datalogging platform that records, in real time, the 3-D acceleration of the PW. Data collection was conducted over 35 different activities, designed to capture a spectrum of PW driving events performed at different speeds (collisions with fixed or moving objects, rolling on an inclined plane, and rolling across multiple types of obstacles). The data was processed using time-series analysis and data mining techniques, to automatically detect and identify the different events. We compared the classification accuracy using four different types of time-series features: 1) time-delay embeddings; 2) time-domain characterization; 3) frequency-domain features; and 4) wavelet transforms. In the analysis, we compared the classification accuracy obtained when distinguishing between safe and unsafe events during each of the 35 different activities. For the purposes of this study, unsafe events were defined as activities containing collisions against objects at different speeds, and the remainder were defined as safe events. We were able to accurately detect 98% of unsafe events, with a low (12%) false positive rate, using only five examples of each activity. This proof-of-concept study shows that the proposed approach has the potential of capturing, based on limited input from embedded sensors, contextual information on PW use, and of automatically characterizing a user’s PW driving behavior. PMID:27170879
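
    As an illustration of the time- and frequency-domain feature types compared above, a window of 3-D accelerometer samples might be summarized as follows (the specific features are assumptions, not the paper's exact set):

```python
import numpy as np

def window_features(acc):
    """Simple time- and frequency-domain features for one window of
    3-D accelerometer samples, shape (n_samples, 3). The feature
    choices here are illustrative, not the paper's feature set."""
    feats = []
    for axis in range(3):
        x = acc[:, axis]
        feats += [x.mean(), x.std(), np.abs(np.diff(x)).mean()]  # time domain
        spectrum = np.abs(np.fft.rfft(x - x.mean()))
        feats.append(spectrum.argmax())   # dominant frequency bin
        feats.append(spectrum.sum())      # total spectral energy
    return np.array(feats)

# Windows labeled safe/unsafe could then feed any standard classifier
# trained on a handful of labeled examples per activity.
```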

  16. A multivariate based event detection method and performance comparison with two baseline methods.

    PubMed

    Liu, Shuming; Smith, Kate; Che, Han

    2015-09-01

    Early warning systems have been widely deployed to protect water systems from accidental and intentional contamination events. Conventional detection algorithms are often criticized for having high false positive rates and low true positive rates. This mainly stems from the inability of these methods to determine whether variation in sensor measurements is caused by equipment noise or by the presence of contamination. This paper presents a new detection method that identifies the existence of contamination by comparing Euclidean distances of correlation indicators, which are derived from the correlation coefficients of multiple water quality sensors. The performance of the proposed method was evaluated using data from a contaminant injection experiment and compared with two baseline detection methods. The results show that the proposed method can differentiate between fluctuations caused by equipment noise and those due to the presence of contamination. It yielded a higher probability of detection and a lower false alarm rate than the two baseline methods. With optimized parameter values, the proposed method can correctly detect 95% of all contamination events with a 2% false alarm rate. PMID:25996758
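
    The core comparison the abstract describes, Euclidean distances between correlation indicators derived from multiple sensors, might be sketched as below; the windowing and threshold handling are assumptions:

```python
import numpy as np

def correlation_indicator(window):
    """Correlation indicator for one sliding window of multi-sensor
    data, shape (n_samples, n_sensors): the vector of pairwise
    correlation coefficients (upper triangle)."""
    corr = np.corrcoef(window, rowvar=False)
    iu = np.triu_indices_from(corr, k=1)
    return corr[iu]

def detect(window, baseline_indicator, threshold):
    """Flag contamination when the Euclidean distance between the
    current and baseline correlation indicators exceeds a threshold
    (threshold selection is assumed, e.g. tuned on clean data)."""
    d = np.linalg.norm(correlation_indicator(window) - baseline_indicator)
    return d > threshold
```

    The intuition is that equipment noise perturbs sensors independently, while real contamination shifts several correlated water-quality parameters together, changing the correlation structure.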

  17. Temporal and Spatial Predictability of an Irrelevant Event Differently Affect Detection and Memory of Items in a Visual Sequence

    PubMed Central

    Ohyama, Junji; Watanabe, Katsumi

    2016-01-01

    We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of spatial location and temporal position of the event were controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while, in the temporally unpredictable conditions, it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time (RT) task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection RTs were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising) events. Evidently, temporal predictability has differential effects on detection and memory of a visual item embedded in a sequence of images. PMID:26869966

  18. Testing the waveform correlation event detection system: Teleseismic, regional, and local distances

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Harris, J.M.

    1997-08-01

    Waveform Correlation Event Detection System (WCEDS) prototypes have now been developed for both global and regional networks and the authors have extensively tested them to assess the potential usefulness of this technology for CTBT (Comprehensive Test Ban Treaty) monitoring. In this paper they present the results of tests on data sets from the IDC (International Data Center) Primary Network and the New Mexico Tech Seismic Network. The data sets span a variety of event types and noise conditions. The results are encouraging at both scales but show particular promise for regional networks. The global system was developed at Sandia Labs and has been tested on data from the IDC Primary Network. The authors have found that for this network the system does not perform at acceptable levels for either detection or location unless directional information (azimuth and slowness) is used. By incorporating directional information, however, both areas can be improved substantially, suggesting that WCEDS may be able to offer a global detection capability which could complement that provided by the GA (Global Association) system in use at the IDC and USNDC (United States National Data Center). The local version of WCEDS (LWCEDS) has been developed and tested at New Mexico Tech using data from the New Mexico Tech Seismic Network (NMTSN). Results indicate that the WCEDS technology works well at this scale, despite the fact that the present implementation of LWCEDS does not use directional information. The NMTSN data set is a good test bed for the development of LWCEDS because of a typically large number of observed local phases and near network-wide recording of most local and regional events. Detection levels approach those of trained analysts, and locations are within 3 km of manually determined locations for local events.
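
    While WCEDS itself stacks correlation-derived detections across a network, the underlying idea of waveform-correlation detection can be illustrated with generic single-channel template matching; this is a stand-in sketch, not the WCEDS algorithm:

```python
import numpy as np

def xcorr_detect(trace, template, threshold=0.7):
    """Slide a waveform template along a continuous trace and return
    (sample index, correlation) pairs where the normalized
    cross-correlation exceeds the threshold."""
    m = len(template)
    t = (template - template.mean()) / (template.std() * m)
    hits = []
    for i in range(len(trace) - m):
        w = trace[i:i + m]
        denom = w.std()
        if denom == 0:
            continue
        cc = np.sum(t * (w - w.mean())) / denom  # Pearson correlation
        if cc > threshold:
            hits.append((i, cc))
    return hits
```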

  1. Event Detection for Hydrothermal Plumes: A case study at Grotto Vent

    NASA Astrophysics Data System (ADS)

    Bemis, K. G.; Ozer, S.; Xu, G.; Rona, P. A.; Silver, D.

    2012-12-01

    Evidence is mounting that geologic events such as volcanic eruptions (and intrusions) and earthquakes (near and far) influence the flow rates and temperatures of hydrothermal systems. Connecting such suppositions to observations of hydrothermal output is challenging, but new ongoing time series have the potential to capture such events. This study explores using activity detection, a technique modified from computer vision, to identify pre-defined events within an extended time series recorded by COVIS (Cabled Observatory Vent Imaging Sonar) and applies it to a time series, with gaps, from Sept 2010 to the present; available measurements include plume orientation, plume rise rate, and diffuse flow area at the NEPTUNE Canada Observatory at Grotto Vent, Main Endeavour Field, Juan de Fuca Ridge. Activity detection is the process of finding a pattern (activity) in a data set containing many different types of patterns. Among many approaches proposed to model and detect activities, we have chosen a graph-based technique, Petri Nets, as they do not require training data to model the activity. They use the domain expert's knowledge to build the activity as a combination of feature states and their transitions (actions). Starting from a conceptual model of how hydrothermal plumes respond to daily tides, we have developed a Petri Net based detection algorithm that identifies deviations from the specified response. Initially we assumed that the orientation of the plume would change smoothly and symmetrically in a consistent daily pattern. However, results indicate that the rate of directional changes varies. The present Petri Net detects unusually large and rapid changes in direction or amount of bending; however, inspection of Figure 1 suggests that many of the events detected may be artifacts resulting from gaps in the data or from the large temporal spacing. Still, considerable complexity overlies the "normal" tidal response pattern (the data has a dominant frequency of
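
    At its simplest, a Petri Net detector of the kind described holds tokens in places representing feature states and fires transitions when observed feature changes meet expert-defined conditions. The toy sketch below, with assumed state names and threshold, conveys the mechanism only, not the COVIS algorithm:

```python
# Minimal Petri-net-flavored event detector: places hold tokens and
# transitions fire on feature-state conditions. All names, states,
# and thresholds are illustrative assumptions.
class PetriNetDetector:
    def __init__(self, max_rate=5.0):
        self.tokens = {"idle": 1, "bending_fast": 0}
        self.max_rate = max_rate  # deg/hour, assumed anomaly threshold

    def step(self, orientation_rate):
        """Fire transitions based on the observed rate of change of
        plume orientation; an event is declared when the token moves
        to the 'bending_fast' place."""
        if self.tokens["idle"] and abs(orientation_rate) > self.max_rate:
            self.tokens = {"idle": 0, "bending_fast": 1}
            return True   # deviation from the specified tidal response
        if self.tokens["bending_fast"] and abs(orientation_rate) <= self.max_rate:
            self.tokens = {"idle": 1, "bending_fast": 0}
        return False
```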

  2. Real-time gait event detection for transfemoral amputees during ramp ascending and descending.

    PubMed

    Maqbool, H F; Husman, M A B; Awad, M I; Abouhossein, A; Dehghani-Sanij, A A

    2015-01-01

    Detection of the events and phases of human gait is vital for controlling prostheses, orthoses and functional electrical stimulation (FES) systems. Wearable sensors are inexpensive, portable and have fast processing capability. They are frequently used to assess spatio-temporal, kinematic and kinetic parameters of the human gait, which in turn provide more details about human voluntary control and amputee-prosthesis interaction. This paper presents a reliable real-time gait event detection algorithm based on a simple heuristic approach, applicable to signals from a tri-axial gyroscope, for lower-limb amputees during ramp ascending and descending. Experimental validation is done by comparing the results of the gyroscope signal with footswitches. For healthy subjects, the mean difference between events detected by the gyroscope and footswitches is 14 ms and 10.5 ms for initial contact (IC), whereas for toe off (TO) it is -5 ms and -25 ms for ramp up and down, respectively. For the transfemoral amputee, the error is slightly higher, either due to the placement of footswitches underneath the foot or the lack of proper knee flexion and ankle plantarflexion/dorsiflexion during ramp up and down. Finally, repeatability tests showed promising results. PMID:26737364
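
    A heuristic gyroscope-based gait event detector of the general kind described can be sketched as threshold and trough rules on shank angular velocity; the thresholds, axis convention, and trough heuristics below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def detect_gait_events(gyro, fs, swing_thresh=50.0):
    """Heuristic gait-event detection from shank sagittal-plane angular
    velocity in deg/s (one gyroscope axis), sampled at fs Hz. Mid-swing
    shows as a positive peak; toe off (TO) and initial contact (IC) as
    the troughs just before and after it."""
    events = {"IC": [], "TO": []}
    above = gyro > swing_thresh
    starts = np.flatnonzero(~above[:-1] & above[1:])  # swing onsets
    ends = np.flatnonzero(above[:-1] & ~above[1:])    # swing offsets
    if len(starts) and len(ends) and ends[0] < starts[0]:
        ends = ends[1:]  # assume the trace starts below threshold
    half = fs // 2
    for s, e in zip(starts, ends):
        pre, post = gyro[max(0, s - half):s], gyro[e:e + half]
        if len(pre):
            events["TO"].append(max(0, s - half) + int(np.argmin(pre)))
        if len(post):
            events["IC"].append(e + int(np.argmin(post)))
    return events
```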

  3. Method for the depth corrected detection of ionizing events from a co-planar grids sensor

    DOEpatents

    De Geronimo, Gianluigi; Bolotnikov, Aleksey E.; Carini, Gabriella

    2009-05-12

    A method for the detection of ionizing events utilizing a co-planar grids sensor comprising a semiconductor substrate, a cathode electrode, a collecting grid and a non-collecting grid. The semiconductor substrate is sensitive to ionizing radiation. A voltage less than 0 volts is applied to the cathode electrode; a voltage greater than the cathode voltage is applied to the non-collecting grid; and a voltage greater than the non-collecting grid voltage is applied to the collecting grid. The collecting-grid and non-collecting-grid signals are summed and subtracted, creating a sum and a difference, respectively, and the difference is divided by the sum, creating a ratio. A gain coefficient is determined for each depth (the distance between the ionizing event and the collecting grid), so that the energy of each ionizing event is the difference between the collecting-grid and non-collecting-grid signals multiplied by the corresponding gain coefficient, corrected for depth. The depth of the ionizing event can also be determined from the ratio.
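
    The arithmetic of the method is compact: the grid difference carries the event energy, the difference-to-sum ratio encodes depth, and a depth-dependent gain corrects the energy. A sketch with an assumed calibration mapping (the lookup is illustrative, not the patent's):

```python
def depth_corrected_energy(collecting, noncollecting, gain_lookup):
    """Depth-corrected event energy for a co-planar grids sensor.
    The ratio (C - N) / (C + N) indexes the interaction depth; the
    depth-dependent gain then rescales the grid difference. The
    gain_lookup calibration is an assumed stand-in."""
    diff = collecting - noncollecting
    ratio = diff / (collecting + noncollecting)
    gain = gain_lookup(ratio)  # calibration: depth proxy -> gain
    return gain * diff

# e.g. a linear toy calibration (purely illustrative):
energy = depth_corrected_energy(1000.0, 200.0, lambda r: 1.0 + 0.1 * (1 - r))
print(energy)
```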

  4. Advanced Geospatial Hydrodynamic Signals Analysis for Tsunami Event Detection and Warning

    NASA Astrophysics Data System (ADS)

    Arbab-Zavar, Banafshe; Sabeur, Zoheir

    2013-04-01

    Current early tsunami warnings can be issued upon the detection of a seismic event occurring at a given location offshore. This also provides an opportunity to predict the tsunami wave propagation and run-ups at potentially affected coastal zones by selecting the best-matching seismic event from a database of pre-computed tsunami scenarios. Nevertheless, it remains challenging to obtain the rupture parameters of tsunamigenic earthquakes in real time and to simulate the tsunami propagation with high accuracy. In this study, we propose a supporting approach in which the hydrodynamic signal is systematically analysed for traces of a tsunamigenic signal. The combination of the relatively low amplitude of a tsunami signal in deep water and the frequent occurrence of background signals and noise results in a generally low signal-to-noise ratio for the tsunami signal, which in turn makes detection difficult. In order to improve the accuracy and confidence of detection, a re-identification framework is used in which a tsunamigenic signal is detected by scanning a network of hydrodynamic stations with water-level sensing. The aim is to attempt the re-identification of the same signatures as the tsunami wave spatially propagates through the network of hydrodynamic stations. Re-identification of the tsunamigenic signal is technically possible because the tsunami signal in the open ocean conserves the birthmarks relating it to the source event. As well as supporting the initial detection and improving the confidence of detection, a re-identified signal is indicative of the spatial range of the signal, and can thereby be used to help identify certain background signals, such as wind waves, which do not have as large a spatial reach as tsunamis. In this paper, the proposed methodology for the automatic detection of tsunamigenic signals has been achieved using open data from NOAA with a recorded
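
    The re-identification idea, checking whether the same signature reappears as the wave reaches successive stations, can be sketched with normalized cross-correlation across stations; the station bookkeeping and threshold are assumptions:

```python
import numpy as np

def reidentify(candidate, downstream_traces, min_cc=0.6):
    """Count how many downstream water-level stations show a waveform
    segment resembling a candidate tsunamigenic signature, using peak
    normalized cross-correlation (traces are assumed longer than the
    candidate). Confidence grows with the number of re-identifications."""
    hits = 0
    c = (candidate - candidate.mean()) / candidate.std()
    for trace in downstream_traces:
        t = (trace - trace.mean()) / trace.std()
        cc = np.correlate(t, c, mode="valid") / len(c)
        if cc.max() > min_cc:
            hits += 1
    return hits
```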

  5. BioSense: implementation of a National Early Event Detection and Situational Awareness System.

    PubMed

    Bradley, Colleen A; Rolka, H; Walker, D; Loonsk, J

    2005-08-26

    BioSense is a CDC initiative to support enhanced early detection, quantification, and localization of possible biologic terrorism attacks and other events of public health concern on a national level. The goals of the BioSense initiative are to advance early detection by providing the standards, infrastructure, and data acquisition for near real-time reporting, analytic evaluation and implementation, and early event detection support for state and local public health officials. BioSense collects and analyzes Department of Defense and Department of Veterans Affairs ambulatory clinical diagnoses and procedures and Laboratory Corporation of America laboratory-test orders. The application summarizes and presents analytical results and data visualizations by source, day, and syndrome for each ZIP code, state, and metropolitan area through maps, graphs, and tables. An initial proof-of-concept evaluation project was conducted before the system was made available to state and local users in April 2004. User recruitment involved identifying and training BioSense administrators and users from state and local health departments. User support has been an essential component of the implementation and enhancement process. CDC initiated the BioIntelligence Center (BIC) in June 2004 to conduct internal monitoring of BioSense national data daily. BIC staff have supported state and local system monitoring, conducted data anomaly inquiries, and communicated with state and local public health officials. Substantial investments will be made in providing regional, state, and local data for early event detection and situational awareness, test beds for data and algorithm evaluation, detection algorithm development, and data management technologies, while maintaining the focus on state and local public health needs. PMID:16177687

  6. Group localisation and unsupervised detection and classification of basic crowd behaviour events for surveillance applications

    NASA Astrophysics Data System (ADS)

    Roubtsova, Nadejda S.; de With, Peter H. N.

    2013-02-01

    Technology for monitoring crowd behaviour is in demand for surveillance and security applications. The trend in research is to tackle detection of complex crowd behaviour events (panic, fight, evacuation, etc.) directly using machine learning techniques. In this paper, we present a contrary, bottom-up approach seeking basic group information: (1) instantaneous location and (2) the merge, split and lateral slide-by events - the three basic motion patterns comprising any crowd behaviour. The focus on such generic group information makes our algorithm suitable as a building block in a variety of surveillance systems, possibly integrated with static content analysis solutions. Our feature extraction framework has optical flow at its core. The framework is universal, being motion-based rather than object-detection-based, and generates a large variety of motion-blob-characterising features useful for an array of classification problems. Motion-based characterisation is performed on a group as an atomic whole and not by means of superposition of individual human motions. Within that feature space, our classification system makes decisions based on heuristic rules and thresholds, without machine learning. Our system performs well on group localisation, consistently generating contours around both moving and halted groups. The visual output of our periodical group localisation is equivalent to tracking and the group contour accuracy ranges from adequate to exceptionally good. The system successfully detects and classifies within our merge/split/slide-by event space in surveillance-type video sequences, differing in resolution, scale, quality and motion content. Quantitatively, its performance is characterised by a good recall: 83% on detection and 71% on combined detection and classification.
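
    A minimal version of the motion-based group localisation described, dense optical flow thresholded into motion blobs, could look like the following; all parameters are assumed:

```python
import cv2
import numpy as np

def motion_blobs(prev_gray, gray, mag_thresh=1.0):
    """Group localisation sketch: dense optical flow, a threshold on
    flow magnitude, then connected components as group blobs. Returns
    the number of moving groups and their label map."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)
    mask = (mag > mag_thresh).astype(np.uint8)
    n, labels = cv2.connectedComponents(mask)
    return n - 1, labels

# Merge/split events can then be flagged heuristically when the blob
# count changes between frames while total moving area stays similar.
```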

  7. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    SciTech Connect

    Kvaerna, T.; Gibbons. S.J.; Ringdal, F; Harris, D.B.

    2007-01-30

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are

  8. Application of data cubes for improving detection of water cycle extreme events

    NASA Astrophysics Data System (ADS)

    Teng, W. L.; Albayrak, A.

    2015-12-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for the hydrology and other point-time-series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on the fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are archived data rearranged into spatio-temporal matrices, which allow easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVM) for anomaly classification. We show an example of detection of WCE events using the Global Land Data Assimilation System (GLDAS) data set.
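
    The SPRT component mentioned above accumulates log-likelihood ratios until one of Wald's decision thresholds is crossed; the Gaussian hypotheses and parameter values in this sketch are assumptions for illustration:

```python
import numpy as np
from scipy.stats import norm

def sprt(samples, mu0=0.0, mu1=2.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test between H0: N(mu0, sigma)
    and H1: N(mu1, sigma). Returns the decision and its sample index."""
    upper = np.log((1 - beta) / alpha)
    lower = np.log(beta / (1 - alpha))
    llr = 0.0
    for i, x in enumerate(samples):
        llr += norm.logpdf(x, mu1, sigma) - norm.logpdf(x, mu0, sigma)
        if llr >= upper:
            return "H1", i   # anomaly (e.g. water cycle extreme) declared
        if llr <= lower:
            return "H0", i   # normal conditions accepted
    return "undecided", len(samples)
```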

  9. Non Conventional Seismic Events Along the Himalayan Arc Detected in the Hi-Climb Dataset

    NASA Astrophysics Data System (ADS)

    Vergne, J.; Nàbĕlek, J. L.; Rivera, L.; Bollinger, L.; Burtin, A.

    2008-12-01

    From September 2002 to August 2005, more than 200 broadband seismic stations were operated across the Himalayan arc and the southern Tibetan plateau in the framework of the Hi-Climb project. Here, we take advantage of the high density of stations along the main profile to look for coherent seismic wave arrivals that cannot be attributed to ordinary tectonic events. An automatic detection algorithm is applied to the continuous data streams filtered between 1 and 10 Hz, followed by a visual inspection of all detections. We discovered about one hundred coherent signals that cannot be attributed to local, regional or teleseismic earthquakes and which are characterized by emergent arrivals and long durations ranging from one minute to several hours. Most of these non conventional seismic events have a low signal-to-noise ratio and are thus only observed above 1 Hz, in the frequency band where the seismic noise is the lowest. However, a small subset of them are strong enough to be observed in a larger frequency band and show an enhancement of long periods compared to standard earthquakes. Based on the analysis of the relative amplitude measured at each station or, when possible, on the correlation of the low-frequency part of the signals, most of these events appear to be located along the High Himalayan range. But, because of their emergent character and the main orientation of the seismic profile, their longitude and depth remain poorly constrained. The origin of these non conventional seismic events is still unsettled, but their seismic signature shares several characteristics with non-volcanic tremors, glacial earthquakes and/or debris avalanches. All these phenomena may occur along the Himalayan range but were not seismically detected before. Here we discuss the pros and cons for each of these postulated candidates based on the analysis of the recorded waveforms and slip models.
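
    The abstract does not name its detection algorithm; a standard STA/LTA energy-ratio trigger, widely used on continuous band-passed streams, is sketched here as a plausible stand-in, with assumed window lengths and trigger level:

```python
import numpy as np

def sta_lta(trace, fs, sta_win=2.0, lta_win=60.0, on=3.0):
    """Classic STA/LTA trigger on a band-passed continuous trace:
    the ratio of short-term to long-term average energy. Returns the
    sample indices above the trigger level."""
    e = trace.astype(float) ** 2
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    sta = np.convolve(e, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(e, np.ones(lta_n) / lta_n, mode="same")
    ratio = np.where(lta > 0, sta / lta, 0.0)
    return np.flatnonzero(ratio > on)
```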

  10. Global Detection of Protein Kinase D-dependent Phosphorylation Events in Nocodazole-treated Human Cells*

    PubMed Central

    Franz-Wachtel, Mirita; Eisler, Stephan A.; Krug, Karsten; Wahl, Silke; Carpy, Alejandro; Nordheim, Alfred; Pfizenmaier, Klaus; Hausser, Angelika; Macek, Boris

    2012-01-01

    Protein kinase D (PKD) is a cytosolic serine/threonine kinase implicated in regulation of several cellular processes such as response to oxidative stress, directed cell migration, invasion, differentiation, and fission of the vesicles at the trans-Golgi network. Its variety of functions must be mediated by numerous substrates; however, only a couple of PKD substrates have been identified so far. Here we perform stable isotope labeling of amino acids in cell culture-based quantitative phosphoproteomic analysis to detect phosphorylation events dependent on PKD1 activity in human cells. We compare relative phosphorylation levels between constitutively active and kinase dead PKD1 strains of HEK293 cells, both treated with nocodazole, a microtubule-depolymerizing reagent that disrupts the Golgi complex and activates PKD1. We identify 124 phosphorylation sites that are significantly down-regulated upon decrease of PKD1 activity and show that the PKD target motif is significantly enriched among down-regulated phosphorylation events, pointing to the presence of direct PKD1 substrates. We further perform PKD1 target motif analysis, showing that a proline residue at position +1 relative to the phosphorylation site serves as an inhibitory cue for PKD1 activity. Among PKD1-dependent phosphorylation events, we detect predominantly proteins with localization at Golgi membranes and function in protein sorting, among them several sorting nexins and members of the insulin-like growth factor 2 receptor pathway. This study presents the first global detection of PKD1-dependent phosphorylation events and provides a wealth of information for functional follow-up of PKD1 activity upon disruption of the Golgi network in human cells. PMID:22496350
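
    The motif-enrichment step described, testing whether the PKD target motif is over-represented among down-regulated phosphosites, can be illustrated with a Fisher's exact test; the motif predicate and site representation in this sketch are assumptions:

```python
from scipy.stats import fisher_exact

def motif_enrichment(down_sites, all_sites, has_motif):
    """Test whether a kinase target motif is enriched among
    down-regulated phosphosites versus the remaining detected sites.
    has_motif is an assumed predicate, e.g. a regex match on the
    sequence window around each phosphosite."""
    down = set(down_sites)
    a = sum(1 for s in down if has_motif(s))
    b = len(down) - a
    background = [s for s in all_sites if s not in down]
    c = sum(1 for s in background if has_motif(s))
    d = len(background) - c
    odds, p = fisher_exact([[a, b], [c, d]], alternative="greater")
    return odds, p
```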